WorldWideScience

Sample records for improve distribution models

  1. How can model comparison help improving species distribution models?

    Directory of Open Access Journals (Sweden)

    Emmanuel Stephan Gritti

    Full Text Available Today, more than ever, robust projections of potential species range shifts are needed to anticipate and mitigate the impacts of climate change on biodiversity and ecosystem services. Such projections are so far provided almost exclusively by correlative species distribution models (correlative SDMs). However, concerns regarding the reliability of their predictive power are growing and several authors call for the development of process-based SDMs. Still, each of these methods presents strengths and weaknesses which have to be estimated if they are to be reliably used by decision makers. In this study we compare projections of three different SDMs (STASH, LPJ and PHENOFIT) that lie on the continuum between correlative models and process-based models for the current distribution of three major European tree species, Fagus sylvatica L., Quercus robur L. and Pinus sylvestris L. We compare the consistency of the model simulations using an innovative comparison map profile method, integrating local and multi-scale comparisons. The three models simulate the current distribution of the three species relatively accurately. The process-based model performs almost as well as the correlative model, although the parameters of the former are not fitted to the observed species distributions. According to our simulations, species range limits are set, at the European scale, by establishment and survival through processes primarily related to phenology and resistance to abiotic stress rather than to growth efficiency. The accuracy of projections of the hybrid and process-based models could however be improved by integrating a more realistic representation of species resistance to water stress, for instance, arguing for continued efforts to understand and formulate explicitly the impact of climatic conditions and variations on these processes.

  2. Improvement for Amelioration Inventory Model with Weibull Distribution

    Directory of Open Access Journals (Sweden)

    Han-Wen Tuan

    2017-01-01

    Full Text Available Most inventory models deal with deteriorating items; by contrast, only a few papers have considered inventory systems in an amelioration environment. We study an amelioration inventory model with a Weibull distribution. There are, however, some questionable results in the earlier amelioration paper, which did not derive the optimal solution. We first point out those questionable results and then provide some improvements. We give a rigorous analytical treatment of the different cases, which depend on the size of the shape parameter, and present a detailed numerical example for different ranges of the shape parameter to illustrate that our solution method attains the optimal solution. We develop a new amelioration model and provide a detailed analytical procedure for finding the optimal solution. Our findings will help researchers develop new inventory models.
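
    The amelioration dynamics described above can be sketched numerically. The block below is a minimal illustration, not the paper's model: it assumes a Weibull-type instantaneous amelioration rate theta(t) = alpha*beta*t**(beta-1) and integrates dI/dt = theta(t)*I(t) - D with Euler steps; all parameter values are hypothetical.

```python
def simulate_inventory(i0, alpha, beta, demand, t_end, dt=1e-3):
    """Euler integration of dI/dt = theta(t)*I(t) - D, where
    theta(t) = alpha*beta*t**(beta-1) is a Weibull-type
    instantaneous amelioration rate (illustrative values only)."""
    t, inv = dt, i0  # start at t=dt: t**(beta-1) diverges at t=0 when beta < 1
    while t < t_end:
        theta = alpha * beta * t ** (beta - 1.0)
        inv += (theta * inv - demand) * dt
        t += dt
    return inv

# Hypothetical parameters: amelioration slows depletion relative to demand alone.
no_amelioration = 100.0 - 2.0 * 10.0                       # closed form, no growth
with_amelioration = simulate_inventory(100.0, 0.01, 0.8, 2.0, 10.0)
```

    The shape parameter beta controls whether the amelioration rate rises or falls over time, which is exactly why the analysis above splits into cases on its size.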

  3. Improving permafrost distribution modelling using feature selection algorithms

    Science.gov (United States)

    Deluigi, Nicola; Lambiel, Christophe; Kanevski, Mikhail

    2016-04-01

    The availability of an increasing number of spatial data on the occurrence of mountain permafrost allows the employment of machine learning (ML) classification algorithms for modelling the distribution of the phenomenon. One of the major problems when dealing with high-dimensional datasets is the number of input features (variables) involved. Applying ML classification algorithms to such a large number of variables carries a risk of overfitting, with the consequence of poor generalization/prediction. For this reason, applying feature selection (FS) techniques helps reduce the number of factors required and improves understanding of the adopted features and their relation to the studied phenomenon. Moreover, removing irrelevant or redundant variables from the dataset effectively improves the quality of the ML prediction. This research presents a comparative analysis of permafrost distribution models supported by FS variable-importance assessment. The input dataset (dimension = 20-25, 10 m spatial resolution) was constructed using landcover maps, climate data and DEM-derived variables (altitude, aspect, slope, terrain curvature, solar radiation, etc.). It was completed with permafrost evidence (geophysical and thermal data and rock glacier inventories) serving as training permafrost data. The FS algorithms identified variables that appeared less statistically important for permafrost presence/absence. Three different algorithms were compared: Information Gain (IG), Correlation-based Feature Selection (CFS) and Random Forest (RF). IG is a filter technique that evaluates the worth of a predictor by measuring the information gain with respect to permafrost presence/absence. Conversely, CFS is a wrapper technique that evaluates the worth of a subset of predictors by considering the individual predictive ability of each variable along with the degree of redundancy between them. Finally, RF is a ML algorithm that performs FS as part of its
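
    Of the three FS techniques compared, Information Gain is the simplest to state: IG(X) = H(Y) - H(Y|X), the reduction in label entropy once a predictor is known. A minimal sketch on toy presence/absence data (all variable names and values are hypothetical):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label list, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(feature, labels):
    """IG(feature) = H(labels) - sum_v p(v) * H(labels | feature = v)."""
    n = len(labels)
    gain = entropy(labels)
    for value in set(feature):
        subset = [l for f, l in zip(feature, labels) if f == value]
        gain -= (len(subset) / n) * entropy(subset)
    return gain

# Toy data: permafrost presence (1) / absence (0) against two binary predictors.
presence   = [1, 1, 1, 0, 0, 0]
north_face = [1, 1, 1, 0, 0, 0]   # perfectly informative -> IG = 1 bit
rocky      = [1, 1, 0, 1, 1, 0]   # uninformative -> IG = 0 bits
```

    A filter method like IG scores each predictor independently, which is what distinguishes it from the subset-based CFS and the embedded importance ranking of RF described above.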

  4. Improved Mathematical Models for Particle-Size Distribution Data

    African Journals Online (AJOL)

    BirukEdimon

    School of Civil & Environmental Engineering, Addis Ababa Institute of Technology; Murray Rix ... two improved mathematical models to describe ... demand further improvement to handle the PSD ... statistics and the range of the optimized.

  5. Improved Testing of Distributed Lag Model in Presence of ...

    African Journals Online (AJOL)

    The finite distributed lag models (DLM) are often used in econometrics and statistics. Application of the ordinary least square (OLS) directly on the DLM for estimation may have serious problems. To overcome these problems, some alternative estimation procedures are available in the literature. One popular method to ...

  6. Enhanced Vehicle Beddown Approximations for the Improved Theater Distribution Model

    Science.gov (United States)

    2014-03-27

    processed utilizing a heuristic routing and scheduling procedure the authors called the Airlift Planning Algorithm (APA). The linear programming model...LINGO 13 environment. The model is then solved by LINGO 13 and solution data is passed back to the Excel environment in a readable format. All original...DSS is relatively unchanged when solutions to the ITDM are referenced for comparison testing. Readers are encouraged to see Appendix I for ITDM VBA

  7. Projecting future expansion of invasive species: comparing and improving methodologies for species distribution modeling.

    Science.gov (United States)

    Mainali, Kumar P; Warren, Dan L; Dhileepan, Kunjithapatham; McConnachie, Andrew; Strathie, Lorraine; Hassan, Gul; Karki, Debendra; Shrestha, Bharat B; Parmesan, Camille

    2015-12-01

    Modeling the distributions of species, especially of invasive species in non-native ranges, involves multiple challenges. Here, we developed some novel approaches to species distribution modeling aimed at reducing the influences of such challenges and improving the realism of projections. We estimated species-environment relationships for Parthenium hysterophorus L. (Asteraceae) with four modeling methods run with multiple scenarios of (i) sources of occurrences and geographically isolated background ranges for absences, (ii) approaches to drawing background (absence) points, and (iii) alternate sets of predictor variables. We further tested various quantitative metrics of model evaluation against biological insight. Model projections were very sensitive to the choice of training dataset. Model accuracy was much improved using a global dataset for model training, rather than restricting data input to the species' native range. AUC score was a poor metric for model evaluation and, if used alone, was not a useful criterion for assessing model performance. Projections away from the sampled space (i.e., into areas of potential future invasion) were very different depending on the modeling methods used, raising questions about the reliability of ensemble projections. Generalized linear models gave very unrealistic projections far away from the training region. Models that efficiently fit the dominant pattern, but exclude highly local patterns in the dataset and capture interactions as they appear in data (e.g., boosted regression trees), improved generalization of the models. Biological knowledge of the species and its distribution was important in refining choices about the best set of projections. A post hoc test conducted on a new Parthenium dataset from Nepal validated excellent predictive performance of our 'best' model. We showed that vast stretches of currently uninvaded geographic areas on multiple continents harbor highly suitable habitats for parthenium
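
    The AUC criticized above has a simple interpretation: the probability that a randomly chosen presence point outscores a randomly chosen background point (the normalized Mann-Whitney U statistic). A minimal sketch with hypothetical suitability scores:

```python
def auc(scores_presence, scores_background):
    """AUC as the probability that a random presence point scores
    higher than a random background point; ties count half."""
    wins = 0.0
    for p in scores_presence:
        for b in scores_background:
            if p > b:
                wins += 1.0
            elif p == b:
                wins += 0.5
    return wins / (len(scores_presence) * len(scores_background))

# Hypothetical suitability scores from an SDM.
presence   = [0.9, 0.8, 0.75, 0.6]
background = [0.7, 0.4, 0.3, 0.2]
```

    Because AUC only compares ranks of presence versus background scores, it says nothing about where projections fall outside the sampled space, which is one reason it can mislead when used alone.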

  8. A GIS Tool for evaluating and improving NEXRAD and its application in distributed hydrologic modeling

    Science.gov (United States)

    Zhang, X.; Srinivasan, R.

    2008-12-01

    In this study, a user-friendly GIS tool was developed for evaluating and improving NEXRAD using raingauge data. This GIS tool can automatically read in raingauge and NEXRAD data, evaluate the accuracy of NEXRAD for each time unit, implement several geostatistical methods to improve the accuracy of NEXRAD using raingauge data, and output spatial precipitation maps for distributed hydrologic models. The geostatistical methods incorporated in this tool include Simple Kriging with varying local means, Kriging with External Drift, Regression Kriging, Co-Kriging, and a new geostatistical method developed by Li et al. (2008). The tool was applied in two test watersheds at hourly and daily temporal scales. The preliminary cross-validation results show that incorporating raingauge data to calibrate NEXRAD can markedly change the spatial pattern of NEXRAD and improve its accuracy. Using different geostatistical methods, the GIS tool was applied to produce long-term precipitation input for a distributed hydrologic model, the Soil and Water Assessment Tool (SWAT). Animated video was generated to vividly illustrate the effect of using different precipitation input data on distributed hydrologic modeling. Currently, this GIS tool is implemented as an extension of SWAT, which is used as a water quantity and quality modeling tool by USDA and EPA. The flexible, module-based design of this tool also makes it easy to adapt to other hydrologic models for hydrological modeling and water resources management.
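
    The kriging variants listed are too involved for a short sketch, but the core idea of spatially weighting raingauge information can be illustrated with inverse-distance weighting (IDW), used here as a deliberately simpler stand-in; coordinates and rainfall values are made up.

```python
def idw(x, y, gauges, power=2.0):
    """Inverse-distance-weighted estimate at (x, y) from gauges
    given as (gx, gy, value) triples. A simpler stand-in for the
    kriging variants used to merge gauge and radar data."""
    num = den = 0.0
    for gx, gy, value in gauges:
        d2 = (x - gx) ** 2 + (y - gy) ** 2
        if d2 == 0.0:
            return value            # exact interpolation at a gauge location
        w = 1.0 / d2 ** (power / 2.0)
        num += w * value
        den += w
    return num / den

# Hypothetical hourly gauge rainfall (mm) at three stations.
gauges = [(0.0, 0.0, 4.0), (10.0, 0.0, 8.0), (0.0, 10.0, 6.0)]
```

    Kriging replaces these fixed distance weights with weights derived from a fitted covariance model, which is what lets methods like Kriging with External Drift also exploit the NEXRAD field itself.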

  9. Improved Predictions of the Geographic Distribution of Invasive Plants Using Climatic Niche Models

    Science.gov (United States)

    Ramírez-Albores, Jorge E.; Bustamante, Ramiro O.

    2016-01-01

    Climatic niche models for invasive plants are usually constructed with occurrence records taken from the literature and from collections. Because these data discriminate neither among the life-cycle stages of plants (adult or juvenile) nor among the origins of individuals (naturally established or planted by humans), the resulting models may mispredict the distribution ranges of these species. We propose that more accurate predictions could be obtained by modelling climatic niches with data on naturally established individuals, particularly with occurrence records of juvenile plants, because this would restrict the predictions of the models to those sites where climatic conditions allow recruitment of the species. To test this proposal, we focused on the Peruvian peppertree (Schinus molle), a South American species that has extensively invaded Mexico. Three climatic niche models were constructed for this species using a high-resolution dataset gathered in the field. The first model included all occurrence records, irrespective of the life-cycle stage or origin of the peppertrees (generalized niche model). The second model included only occurrence records of naturally established mature individuals (adult niche model), while the third model was constructed with occurrence records of naturally established juvenile plants (regeneration niche model). When the models were compared, the generalized climatic niche model predicted the presence of peppertrees at sites located far beyond the climatic thresholds that naturally established individuals can tolerate, suggesting that human activities influence the distribution of this invasive species. The adult and regeneration climatic niche models concurred in their predictions of the distribution of peppertrees, suggesting that naturally established adult trees only occur at sites where climatic conditions allow the recruitment of juvenile stages. These results support the proposal that climatic niches of invasive plants should be modelled with data of

  10. Improved CFD Model to Predict Flow and Temperature Distributions in a Blast Furnace Hearth

    Science.gov (United States)

    Komiyama, Keisuke M.; Guo, Bao-Yu; Zughbi, Habib; Zulli, Paul; Yu, Ai-Bing

    2014-10-01

    The campaign life of a blast furnace is limited by the erosion of hearth refractories. Flow and temperature distributions of the liquid iron have a significant influence on the erosion mechanism. In this work, an improved three-dimensional computational fluid dynamics model is developed to simulate the flow and heat transfer phenomena in the hearth of BlueScope's Port Kembla No. 5 Blast Furnace. Model improvements feature more justified input parameters in turbulence modeling, buoyancy modeling, wall boundary conditions, material properties, and modeling of the solidification of iron. The model is validated by comparing the calculated temperatures with the thermocouple data available, where agreements are established within ±3 pct. The flow distribution in the hearth is discussed for intact and eroded hearth profiles, for sitting and floating coke bed states. It is shown that natural convection affects the flow in several ways: for example, the formation of (a) stagnant zones preventing hearth bottom from eroding or (b) the downward jetting of molten liquid promoting side wall erosion, or (c) at times, a vortex-like peripheral flow, promoting the "elephant foot" type erosion. A significant influence of coke bed permeability on the macroscopic flow pattern and the refractory temperature is observed.

  11. Regional climate model downscaling may improve the prediction of alien plant species distributions

    Science.gov (United States)

    Liu, Shuyan; Liang, Xin-Zhong; Gao, Wei; Stohlgren, Thomas J.

    2014-12-01

    Distributions of invasive species are commonly predicted with species distribution models that build upon the statistical relationships between observed species presence data and climate data. We used field observations, climate station data, and Maximum Entropy species distribution models for 13 invasive plant species in the United States, and then compared the models with inputs from a General Circulation Model (hereafter GCM-based models) and a downscaled Regional Climate Model (hereafter RCM-based models). We also compared species distributions based on either GCM-based or RCM-based models for the present (1990-1999) and the future (2046-2055). RCM-based species distribution models replicated observed distributions remarkably better than GCM-based models for all invasive species under the current climate. This was shown for the presence locations of the species, and by using four common statistical metrics to compare modeled distributions. For two widespread invasive taxa (Bromus tectorum or cheatgrass, and Tamarix spp. or tamarisk), GCM-based models failed miserably to reproduce observed species distributions. In contrast, RCM-based species distribution models closely matched observations. Future species distributions may be significantly affected by using GCM-based inputs. Because invasive plant species often show high resilience and low rates of local extinction, RCM-based species distribution models may perform better than GCM-based species distribution models for planning containment programs for invasive species.

  12. A supply chain model to improve the beef quality distribution using investment analysis: A case study

    Science.gov (United States)

    Lupita, Alessandra; Rangkuti, Sabrina Heriza; Sutopo, Wahyudi; Hisjam, Muh.

    2017-11-01

    There are significant differences in the quality and price of the beef commodity between traditional and modern markets in Indonesia, caused by very different treatments of the commodity: in the slaughter lines, in transportation from the abattoir to the outlet, in the display system, and in the control system. If the problem is not solved by the government, the gap will result in great losses for consumers with respect to quality, and will threaten the sustainability of traditional traders' businesses because of declining interest in purchasing beef in traditional markets. This article aims to improve the quality of beef in traditional markets. The study proposes a supply chain model that involves investment and government incentive schemes for improving the distribution system. The supply chain model can be formulated as a mixed integer linear program (MILP) and solved using the IBM® ILOG® CPLEX software. The results show that the proposed model can be used to determine the priority of programs for improving the quality and sustainability of traditional beef merchants' businesses. Using the model, the government can decide on incentives for improving these conditions.
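
    The paper's MILP formulation is not reproduced here, but the flavor of the decision (which improvement programs to fund under a budget) can be sketched as a toy 0/1 program, solved by exhaustive enumeration instead of a solver like CPLEX; program names, costs and gains are invented.

```python
from itertools import product

# Hypothetical improvement programs: (name, cost, quality_gain).
programs = [("cold-chain transport",    40, 9),
            ("hygienic display cases",  30, 6),
            ("abattoir upgrade",        50, 8),
            ("merchant training",       20, 5)]
BUDGET = 90

def solve(programs, budget):
    """Enumerate all 0/1 selections (a tiny stand-in for a MILP
    solver) and return the best feasible funding plan."""
    best, best_gain = (), -1
    for choice in product((0, 1), repeat=len(programs)):
        cost = sum(c for (name, c, g), x in zip(programs, choice) if x)
        gain = sum(g for (name, c, g), x in zip(programs, choice) if x)
        if cost <= budget and gain > best_gain:
            best, best_gain = choice, gain
    return best, best_gain
```

    A real MILP solver handles the same maximize-gain-subject-to-budget structure at scales where enumeration is infeasible, which is why the study turns to CPLEX.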

  13. Landscape and flow metrics affecting the distribution of a federally-threatened fish: Improving management, model fit, and model transferability

    Science.gov (United States)

    Worthington, Thomas A.; Zhang, T.; Logue, Daniel R.; Mittelstet, Aaron R.; Brewer, Shannon K.

    2016-01-01

    Truncated distributions of pelagophilic fishes have been observed across the Great Plains of North America, with water use and landscape fragmentation implicated as contributing factors. Developing conservation strategies for these species is hindered by the existence of multiple competing flow regime hypotheses related to species persistence. Our primary study objective was to compare the predicted distributions of one pelagophil, the Arkansas River Shiner Notropis girardi, constructed using different flow regime metrics. Further, we investigated different approaches for improving temporal transferability of the species distribution model (SDM). We compared four hypotheses: mean annual flow (a baseline), the 75th percentile of daily flow, the number of zero-flow days, and the number of days above 55th percentile flows, to examine the relative importance of flows during the spawning period. Building on an earlier SDM, we added covariates that quantified wells in each catchment, point source discharges, and non-native species presence to a structured variable framework. We assessed the effects on model transferability and fit by reducing multicollinearity using Spearman’s rank correlations, variance inflation factors, and principal component analysis, as well as altering the regularization coefficient (β) within MaxEnt. The 75th percentile of daily flow was the most important flow metric related to structuring the species distribution. The number of wells and point source discharges were also highly ranked. At the default level of β, model transferability was improved using all methods to reduce collinearity; however, at higher levels of β, the correlation method performed best. Using β = 5 provided the best model transferability, while retaining the majority of variables that contributed 95% to the model. This study provides a workflow for improving model transferability and also presents water-management options that may be considered to improve the
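
    The correlation-based collinearity screening mentioned above can be sketched as a greedy filter: keep a predictor only if its absolute Spearman correlation with every predictor already kept stays below a threshold. A minimal version with hypothetical covariates (no tie handling in the ranking):

```python
def ranks(xs):
    """Positions of each value in sort order (ties not handled;
    adequate for this illustration)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    for rank, i in enumerate(order):
        r[i] = float(rank)
    return r

def spearman(a, b):
    """Spearman rho: Pearson correlation of the rank vectors."""
    ra, rb = ranks(a), ranks(b)
    n = len(ra)
    ma, mb = sum(ra) / n, sum(rb) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(ra, rb))
    sa = sum((x - ma) ** 2 for x in ra) ** 0.5
    sb = sum((y - mb) ** 2 for y in rb) ** 0.5
    return cov / (sa * sb)

def prune(predictors, threshold=0.7):
    """Keep a predictor only if |rho| with every kept predictor
    is below the threshold."""
    kept = []
    for name, values in predictors:
        if all(abs(spearman(values, v)) < threshold for _, v in kept):
            kept.append((name, values))
    return [name for name, _ in kept]

# Hypothetical catchment covariates: flow75 and mean_flow are monotone copies.
predictors = [("flow75",    [1, 2, 3, 4, 5, 6]),
              ("mean_flow", [2, 4, 6, 8, 10, 12]),   # rho = 1 with flow75
              ("n_wells",   [5, 1, 4, 2, 6, 3])]
```

    Dropping one of each highly correlated pair before fitting is one of the collinearity-reduction routes the study compares against VIF and PCA.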

  14. Improvement, calibration and validation of a distributed hydrological model over France

    Directory of Open Access Journals (Sweden)

    P. Quintana Seguí

    2009-02-01

    Full Text Available The hydrometeorological model SAFRAN-ISBA-MODCOU (SIM) computes water and energy budgets at the land surface, river flows and the levels of several aquifers at the scale of France. SIM is composed of a meteorological analysis system (SAFRAN), a land surface model (ISBA) and a hydrogeological model (MODCOU). In this study, an exponential profile of hydraulic conductivity at saturation is introduced into the model and its impact analysed. We also study how calibration modifies the performance of the model: a very simple calibration method is implemented and applied to the parameters of hydraulic conductivity and subgrid runoff. The study shows that a better description of the hydraulic conductivity of the soil is important for simulating more realistic discharges. It also shows that the calibrated model is more robust than the original SIM. In fact, the calibration mainly affects the processes related to the dynamics of the flow (drainage and runoff), while the other relevant processes (such as evaporation) remain stable. It is also shown that it is only worth introducing the new empirical parameterization of hydraulic conductivity if it is accompanied by a calibration of its parameters; otherwise the simulations can be degraded. In conclusion, the new parameterization is necessary to obtain good simulations, and calibration is a tool that must be used to improve the performance of distributed models like SIM that have empirical parameters.
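
    An exponential profile of saturated hydraulic conductivity is commonly written K(z) = K_surface * exp(-f * z), with a decay factor f controlling how quickly conductivity drops with depth. A one-function sketch with illustrative values (the paper's actual parameter values are not given here):

```python
import math

def k_sat(depth_m, k_surface, f):
    """Exponential profile of saturated hydraulic conductivity:
    K(z) = K_surface * exp(-f * z). Values are illustrative, not
    the calibrated SIM parameters."""
    return k_surface * math.exp(-f * depth_m)

surface = k_sat(0.0, 1e-5, 2.0)   # m/s at the surface
deep    = k_sat(2.0, 1e-5, 2.0)   # conductivity decays with depth
```

    Because both K_surface and f are empirical, the abstract's conclusion follows naturally: introducing this profile only helps once those two parameters are calibrated.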

  15. Using Environmental DNA to Improve Species Distribution Models for Freshwater Invaders

    Directory of Open Access Journals (Sweden)

    Teja P. Muha

    2017-12-01

    Full Text Available Species Distribution Models (SDMs) have been reported as a useful tool for risk assessment and for modeling the pathways of dispersal of freshwater invasive alien species (IAS). Environmental DNA (eDNA) is a novel tool that can help detect IAS at an early stage of introduction and additionally improve the data available for more efficient management. SDMs rely on the presence and absence of the species in the study area to infer the predictors affecting species distributions. Presence is verified once a species is detected, but confirmation of absence can be problematic because this depends both on the detectability of the species and on the sampling strategy. eDNA is a technique with higher detectability and accuracy than conventional sampling techniques, and it can effectively differentiate between the presence or absence of specific species or entire communities by using a barcoding or metabarcoding approach. However, a number of potential biases can be introduced during (i) sampling, (ii) amplification, (iii) sequencing, or (iv) the bioinformatics pipelines used. Therefore, it is important to report and conduct the field and laboratory procedures in a consistent way, by (i) introducing eDNA-independent observations, (ii) amplifying and sequencing control samples, (iii) achieving quality sequence reads by appropriate clean-up steps, (iv) controlling primer amplification preferences, (v) introducing PCR-free sequence capture, (vi) estimating primer detection capabilities through controlled experiments and/or (vii) post hoc introduction of “site occupancy-detection models.” With eDNA methodology becoming increasingly routine, its use is strongly recommended for retrieving species distributional data for SDMs.

  16. Improving Modeling of Extreme Events using Generalized Extreme Value Distribution or Generalized Pareto Distribution with Mixing Unconditional Disturbances

    OpenAIRE

    Suarez, R

    2001-01-01

    In this paper an alternative non-parametric historical simulation approach, the Mixing Unconditional Disturbances model with constant volatility, in which price paths are generated by reshuffling disturbances of S&P 500 Index returns over the period 1950-1998, is used to estimate a Generalized Extreme Value Distribution and a Generalized Pareto Distribution. An ordinary back-test for the period 1999-2008 was performed to verify this technique, providing higher accuracy returns level under upper ...
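
    The reshuffling step of such a historical simulation can be sketched as follows; this toy version only reads off an empirical quantile of the reshuffled path sums, whereas the paper goes on to fit GEV/GPD tails to the simulated extremes. Returns and parameters below are invented.

```python
import random

def reshuffled_return_level(disturbances, horizon, quantile,
                            n_paths=2000, seed=1):
    """Non-parametric historical simulation: generate multi-day
    paths by reshuffling observed disturbances, then read off an
    upper return level as an empirical quantile of the path sums."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_paths):
        path = rng.sample(disturbances, horizon)  # reshuffle, no replacement
        totals.append(sum(path))
    totals.sort()
    return totals[int(quantile * (n_paths - 1))]

# Hypothetical daily log-returns standing in for S&P 500 disturbances.
returns = [0.01, -0.02, 0.015, -0.005, 0.02, -0.01, 0.0, 0.005, -0.015, 0.01]
level95 = reshuffled_return_level(returns, horizon=5, quantile=0.95)
```

    Fitting a GEV to block maxima (or a GPD to threshold exceedances) of these simulated paths is what turns the empirical quantile into an extrapolable tail estimate.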

  17. A distribution benefits model for improved information on worldwide crop production. Volume 1: Model structure and application to wheat

    Science.gov (United States)

    Andrews, J.

    1976-01-01

    The improved model is suitable for the study of benefits of worldwide information on a variety of crops. Application to the previously studied case of worldwide wheat production shows that about $108 million per year of distribution benefits to the United States would be achieved by a satellite-based wheat information system meeting the goals of LACIE. The model also indicates that improved information alone will not change world stock levels unless production itself is stabilized. The United States benefits mentioned above are associated with the reduction of price fluctuations within the year and the more effective use of international trade to balance supply and demand. Price fluctuations from year to year would be reduced only if production variability were itself reduced.

  18. Better Water Demand and Pipe Description Improve the Distribution Network Modeling Results

    Science.gov (United States)

    Distribution system modeling simplifies pipe network in skeletonization and simulates the flow and water quality by using generalized water demand patterns. While widely used, the approach has not been examined fully on how it impacts the modeling fidelity. This study intends to ...

  19. Modeling the distribution of colonial species to improve estimation of plankton concentration in ballast water

    Science.gov (United States)

    Rajakaruna, Harshana; VandenByllaardt, Julie; Kydd, Jocelyn; Bailey, Sarah

    2018-03-01

    The International Maritime Organization (IMO) has set limits on allowable plankton concentrations in ballast water discharge to minimize aquatic invasions globally. Previous guidance on ballast water sampling and compliance decision thresholds was based on the assumption that probability distributions of plankton are Poisson when spatially homogenous, or negative binomial when heterogeneous. We propose a hierarchical probability model, which incorporates distributions at the level of particles (i.e., discrete individuals plus colonies per unit volume) and also within particles (i.e., individuals per particle) to estimate the average plankton concentration in ballast water. We examined the performance of the models using data for plankton in the size class ≥ 10 μm and test ballast water compliance using the above models.
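
    A hierarchical particle/within-particle model of this kind can be sketched by simulating particle counts as Poisson and colony sizes with an illustrative stand-in distribution (1 + geometric); nothing below reproduces the paper's actual fitted model, and all rates are hypothetical.

```python
import math
import random

def poisson(mean, rng):
    """Poisson draw via Knuth's algorithm; adequate for small means."""
    L = math.exp(-mean)
    k, p = 0, 1.0
    while p > L:
        k += 1
        p *= rng.random()
    return k - 1

def sample_concentration(volume_ml, particle_rate, p_extra, rng):
    """Hierarchical model: particles per sample ~ Poisson(rate * volume);
    individuals per particle ~ 1 + Geometric(p_extra), an illustrative
    stand-in for the within-particle distribution of colonial taxa."""
    individuals = 0
    for _ in range(poisson(particle_rate * volume_ml, rng)):
        individuals += 1
        while rng.random() < p_extra:
            individuals += 1
    return individuals / volume_ml

rng = random.Random(42)
# Average many 10 mL grabs; expected concentration = rate / (1 - p_extra).
estimates = [sample_concentration(10.0, 2.0, 0.5, rng) for _ in range(400)]
mean_estimate = sum(estimates) / len(estimates)
```

    Ignoring the colony level and treating each particle as one organism would underestimate the concentration here by a factor of two, which is the kind of bias the hierarchical model is meant to remove.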

  20. Modeling a Distributed Power Flow Controller with a PEM Fuel Cell for Power Quality Improvement

    Directory of Open Access Journals (Sweden)

    J. Chakravorty

    2018-02-01

    Full Text Available Electrical power demand has been increasing at a relatively fast rate over recent years, and because of this increasing demand the power system is becoming very complex. Both electric utilities and end users of electric power are becoming increasingly concerned about power quality. This paper presents a new concept of a distributed power flow controller (DPFC), implemented here with a proton exchange membrane (PEM) fuel cell. A PEM fuel cell has been simulated in Simulink/MATLAB and then used in the proposed DPFC model. The new DPFC model has been tested on an IEEE 30-bus system.

  1. An Equivalent cross-section Framework for improving computational efficiency in Distributed Hydrologic Modelling

    Science.gov (United States)

    Khan, Urooj; Tuteja, Narendra; Ajami, Hoori; Sharma, Ashish

    2014-05-01

    While the potential uses and benefits of distributed catchment simulation models are undeniable, their practical usage is often hindered by the computational resources they demand. To reduce the computational time/effort in distributed hydrological modelling, a new approach of modelling over an equivalent cross-section is investigated, where topographic and physiographic properties of first-order sub-basins are aggregated to constitute modelling elements. To formulate an equivalent cross-section, a homogenization test is conducted to assess the loss in accuracy when averaging topographic and physiographic variables, i.e. length, slope, soil depth and soil type. The homogenization test indicates that the accuracy lost in weighting the soil type is greatest; it therefore needs to be weighted in a systematic manner when formulating equivalent cross-sections. If the soil type remains the same within the sub-basin, a single equivalent cross-section is formulated for the entire sub-basin. If the soil type follows a specific pattern, i.e. different soil types near the centre of the river, the middle of the hillslope and the ridge line, three equivalent cross-sections (left bank, right bank and head water) are required. If the soil types are complex and do not follow any specific pattern, multiple equivalent cross-sections are required based on the number of soil types. The equivalent cross-sections are formulated for a series of first-order sub-basins by implementing different weighting methods for the topographic and physiographic variables of landforms within the entire or part of a hillslope. The formulated equivalent cross-sections are then simulated using a two-dimensional, Richards' equation based distributed hydrological model. The simulated fluxes are multiplied by the weighted area of each equivalent cross-section to calculate the total fluxes from the sub-basins. The simulated fluxes include horizontal flow, transpiration, soil evaporation, deep drainage and soil moisture. To assess
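
    The area-weighting step (simulated fluxes multiplied by the weighted area of each equivalent cross-section) can be sketched directly; all names and numbers below are hypothetical.

```python
# Hypothetical fluxes (mm/day) simulated on three equivalent cross-sections
# of a first-order sub-basin, with the fraction of sub-basin area each represents.
cross_sections = [
    {"name": "left bank",  "area_frac": 0.40, "drainage": 1.2, "transpiration": 2.1},
    {"name": "right bank", "area_frac": 0.35, "drainage": 0.9, "transpiration": 2.4},
    {"name": "head water", "area_frac": 0.25, "drainage": 1.6, "transpiration": 1.8},
]

def total_flux(sections, flux):
    """Area-weighted sub-basin total: each cross-section's simulated
    flux is scaled by the fraction of the sub-basin it stands for."""
    return sum(s["area_frac"] * s[flux] for s in sections)

drainage = total_flux(cross_sections, "drainage")   # mm/day over the sub-basin
```

    Running the 2-D model on three representative cross-sections and re-weighting in this way is what replaces a full distributed simulation of every hillslope, which is the source of the computational saving.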

  2. Wrong, but useful: regional species distribution models may not be improved by range-wide data under biased sampling.

    Science.gov (United States)

    El-Gabbas, Ahmed; Dormann, Carsten F

    2018-02-01

    Species distribution modeling (SDM) is an essential method in ecology and conservation. SDMs are often calibrated within one country's borders, typically along a limited environmental gradient and with biased and incomplete data, making the quality of these models questionable. In this study, we evaluated how adequate national presence-only data are for calibrating regional SDMs. We trained SDMs for Egyptian bat species at two different scales: only within Egypt, and at a species-specific global extent. We used two modeling algorithms: Maxent and elastic net, both under the point-process modeling framework. For each modeling algorithm, we measured the congruence of the predictions of global and regional models for Egypt, assuming that the lower the congruence, the less appropriate the Egyptian dataset is for describing the species' niche. We inspected the effect of incorporating predictions from global models as an additional predictor ("prior") in regional models, and quantified the improvement in terms of AUC and the congruence between regional models run with and without priors. Moreover, we analyzed the improvement in predictive performance after correcting for sampling bias at both scales. On average, predictions from global and regional models in Egypt only weakly concur. Collectively, the use of priors did not lead to much improvement: similar AUC and high congruence between regional models calibrated with and without priors. Correction for sampling bias led to higher model performance, whichever prior was used, making the use of priors less pronounced. Under biased and incomplete sampling, the use of global bat data did not improve regional model performance. Without enough bias-free regional data, we cannot objectively identify the actual improvement of regional models after incorporating information from the global niche. However, we still believe there is great potential for global model predictions to guide future surveys and improve regional sampling in data

  3. Improving simulated spatial distribution of productivity and biomass in Amazon forests using the ACME land model

    Science.gov (United States)

    Yang, X.; Thornton, P. E.; Ricciuto, D. M.; Shi, X.; Xu, M.; Hoffman, F. M.; Norby, R. J.

    2017-12-01

    Tropical forests play a crucial role in the global carbon cycle, accounting for one third of global NPP and containing about 25% of global vegetation biomass and soil carbon. This is particularly true of tropical forests in the Amazon region, which comprises approximately 50% of the world's tropical forests. It is therefore important to understand and represent the processes that determine the fluxes and storage of carbon in these forests. In this study, we show that implementing the phosphorus (P) cycle and P limitation in the ACME Land Model (ALM) improves the simulated spatial pattern of NPP. The P-enabled ALM is able to capture the west-to-east gradient of productivity, consistent with field observations. We also show that by improving the representation of mortality processes, ALM is able to reproduce the observed spatial pattern of aboveground biomass across the Amazon region.

  4. Distributed intelligence improves availability

    International Nuclear Information System (INIS)

    Einholf, C.W.; Ciaramitaro, W.

    1982-01-01

    The new generation of instrumentation being developed to monitor critical variables in nuclear power plants is described. Powerful, compact microprocessors have been built into monitors to simplify data display. Benefits of digital systems include improved plant availability, reduced maintenance costs and manpower, shorter test times, and less frequent inspection and overhaul. (U.K.)

  5. Improved high-frequency equivalent circuit model based on distributed effects for SiGe HBTs with CBE layout

    International Nuclear Information System (INIS)

    Sun Ya-Bin; Li Xiao-Jin; Zhang Jin-Zhong; Shi Yan-Ling

    2017-01-01

    In this paper, we present an improved high-frequency equivalent circuit for SiGe heterojunction bipolar transistors (HBTs) with a CBE layout, in which we consider the distributed effects along the base region. The actual device structure is divided into three parts: a link base region under a spacer oxide, an intrinsic transistor region under the emitter window, and an extrinsic base region. Each region is treated as a two-port network composed of a distributed resistance and capacitance. The admittance parameters are obtained by solving the transmission-line equation, and the small-signal equivalent circuit is then derived under reasonable approximations. Unlike previous compact models, our proposed model introduces an additional internal base node, into which the intrinsic base resistance is shifted; this can theoretically explain the anomalous change in the intrinsic bias-dependent collector resistance in the conventional compact model. (paper)

  6. Improving the modelling of redshift-space distortions - I. A bivariate Gaussian description for the galaxy pairwise velocity distributions

    Science.gov (United States)

    Bianchi, Davide; Chiesa, Matteo; Guzzo, Luigi

    2015-01-01

    As a step towards a more accurate modelling of redshift-space distortions (RSD) in galaxy surveys, we develop a general description of the probability distribution function of galaxy pairwise velocities within the framework of the so-called streaming model. For a given galaxy separation r, such a function can be described as a superposition of virtually infinitely many local distributions. We characterize these in terms of their moments and then consider the specific case in which they are Gaussian functions, each with its own mean μ and dispersion σ. Based on physical considerations, we make the further crucial assumption that these two parameters are in turn distributed according to a bivariate Gaussian, with its own mean and covariance matrix. Tests using numerical simulations explicitly show that with this compact description one can correctly model redshift-space distortions on all scales, fully capturing the overall linear and non-linear dynamics of the galaxy flow at different separations. In particular, we naturally obtain Gaussian/exponential, skewed/unskewed distribution functions, depending on separation, as observed in simulations and data. Also, the recently proposed single-Gaussian description of RSD is included in this model as a limiting case, when the bivariate Gaussian is collapsed to a two-dimensional Dirac delta function. We also show how this description naturally allows for the Taylor expansion of 1 + ξS(s) around 1 + ξR(r), which leads to the Kaiser linear formula when truncated to second order, explicating its connection with the moments of the velocity distribution functions. More work is needed, but these results indicate a very promising path to make definitive progress in our programme to improve RSD estimators.
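    The construction described above can be written compactly (a sketch consistent with the abstract; the notation m(r), C(r) for the mean and covariance of the parameter distribution is assumed):

```latex
% Pairwise-velocity PDF at separation r as a superposition of local Gaussians
% whose parameters (mu, sigma) are themselves bivariate-Gaussian distributed:
\mathcal{P}(v \mid r)
  = \int \mathcal{N}\!\left(v;\, \mu,\, \sigma^{2}\right)\,
    \mathcal{N}_{2}\!\left((\mu,\sigma);\, \mathbf{m}(r),\, \mathbf{C}(r)\right)
    \,\mathrm{d}\mu\,\mathrm{d}\sigma
```

    Collapsing the bivariate Gaussian to a Dirac delta at a single (μ, σ) recovers the single-Gaussian limit mentioned in the abstract.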

  7. Consideration of time-evolving capacity distributions and improved degradation models for seismic fragility assessment of aging highway bridges

    International Nuclear Information System (INIS)

    Ghosh, Jayadipta; Sood, Piyush

    2016-01-01

    This paper presents a methodology to develop seismic fragility curves for deteriorating highway bridges by uniquely accounting for realistic pitting-corrosion deterioration and time-dependent capacity distributions for reinforced concrete columns under chloride attack. The proposed framework offers distinct improvements over state-of-the-art procedures for fragility assessment of degrading bridges, which typically assume a simplified uniform corrosion model and pristine limit-state capacities. Depending on the time in service life and the deterioration mechanism, this study finds that capacity limit states for deteriorating bridge columns follow either a lognormal distribution or a generalized extreme value distribution (particularly for pitting corrosion). The impact of the column degradation mechanism on the seismic response and fragility of bridge components and systems is assessed using nonlinear time-history analysis of three-dimensional finite element bridge models reflecting the uncertainties across structural modeling parameters, deterioration parameters, and ground motion. Comparisons are drawn between the proposed methodology and traditional approaches to developing aging-bridge fragility curves. Results indicate considerable underestimation of system-level fragility across different damage states using the traditional approach compared to the proposed realistic pitting model for chloride-induced corrosion. Time-dependent predictive functions are provided to interpolate logistic regression coefficients for continuous seismic reliability evaluation along the service life with reasonable accuracy. - Highlights: • Realistic modeling of chloride-induced corrosion deterioration in the form of pitting. • Time-evolving capacity distribution for aging bridge columns under chloride attacks. • Time-dependent seismic fragility estimation of highway bridges at component and system level. • Mathematical functions for continuous tracking of seismic fragility along service
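    Fragility curves of this kind are commonly expressed as a lognormal CDF of a ground-motion intensity measure; a minimal sketch using only the standard library (parameter values are illustrative, not taken from the paper):

```python
import math

def lognormal_fragility(im, median, beta):
    """P(damage state exceeded | intensity measure im) as a lognormal CDF.

    median : intensity at which the exceedance probability is 50%
    beta   : lognormal dispersion (larger beta = flatter curve)
    """
    z = (math.log(im) - math.log(median)) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Illustrative comparison: a deteriorated column (lower median capacity)
# is more fragile at the same spectral acceleration.
p_pristine = lognormal_fragility(0.5, median=0.8, beta=0.6)
p_aged = lognormal_fragility(0.5, median=0.5, beta=0.6)
```

    Time-dependent deterioration, as in the study above, would make `median` and `beta` functions of time in service.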

  8. Combining satellite data and appropriate objective functions for improved spatial pattern performance of a distributed hydrologic model

    Directory of Open Access Journals (Sweden)

    M. C. Demirel

    2018-02-01

    Satellite-based earth observations offer great opportunities to improve spatial model predictions by means of spatial-pattern-oriented model evaluations. In this study, observed spatial patterns of actual evapotranspiration (AET) are utilised for spatial model calibration tailored to target the pattern performance of the model. The proposed calibration framework combines temporally aggregated observed spatial patterns with a new spatial performance metric and a flexible spatial parameterisation scheme. The mesoscale hydrologic model (mHM) is used to simulate streamflow and AET and has been selected due to its soil parameter distribution approach based on pedo-transfer functions and its built-in multi-scale parameter regionalisation. In addition, two new spatial parameter distribution options have been incorporated in the model in order to increase the flexibility of the root fraction coefficient and potential evapotranspiration correction parameterisations, based on soil type and vegetation density. These parameterisations are utilised as they are most relevant for simulated AET patterns from the hydrologic model. Due to the fundamental challenges encountered when evaluating spatial pattern performance using standard metrics, we developed a simple but highly discriminative spatial metric, i.e. one comprised of three easily interpretable components measuring co-location, variation and distribution of the spatial data. The study shows that with flexible spatial model parameterisation used in combination with the appropriate objective functions, the simulated spatial patterns of actual evapotranspiration become substantially more similar to the satellite-based estimates. Overall 26 parameters are identified for calibration through a sequential screening approach based on a combination of streamflow and spatial pattern metrics. The robustness of the calibrations is tested using an ensemble of nine calibrations based on different seed numbers using the

  9. Combining satellite data and appropriate objective functions for improved spatial pattern performance of a distributed hydrologic model

    Science.gov (United States)

    Demirel, Mehmet C.; Mai, Juliane; Mendiguren, Gorka; Koch, Julian; Samaniego, Luis; Stisen, Simon

    2018-02-01

    Satellite-based earth observations offer great opportunities to improve spatial model predictions by means of spatial-pattern-oriented model evaluations. In this study, observed spatial patterns of actual evapotranspiration (AET) are utilised for spatial model calibration tailored to target the pattern performance of the model. The proposed calibration framework combines temporally aggregated observed spatial patterns with a new spatial performance metric and a flexible spatial parameterisation scheme. The mesoscale hydrologic model (mHM) is used to simulate streamflow and AET and has been selected due to its soil parameter distribution approach based on pedo-transfer functions and its built-in multi-scale parameter regionalisation. In addition, two new spatial parameter distribution options have been incorporated in the model in order to increase the flexibility of the root fraction coefficient and potential evapotranspiration correction parameterisations, based on soil type and vegetation density. These parameterisations are utilised as they are most relevant for simulated AET patterns from the hydrologic model. Due to the fundamental challenges encountered when evaluating spatial pattern performance using standard metrics, we developed a simple but highly discriminative spatial metric, i.e. one comprised of three easily interpretable components measuring co-location, variation and distribution of the spatial data. The study shows that with flexible spatial model parameterisation used in combination with the appropriate objective functions, the simulated spatial patterns of actual evapotranspiration become substantially more similar to the satellite-based estimates. Overall 26 parameters are identified for calibration through a sequential screening approach based on a combination of streamflow and spatial pattern metrics. The robustness of the calibrations is tested using an ensemble of nine calibrations based on different seed numbers using the shuffled complex
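    The three-component metric described above can be sketched as follows. The component structure (co-location, variation, distribution) follows the abstract, but the specific formulas and their combination into a single score are assumptions for illustration:

```python
import numpy as np

def spatial_pattern_metric(obs, sim, bins=20):
    """Composite spatial-pattern score from three interpretable components:
    co-location (Pearson r), variation (ratio of coefficients of variation),
    and distribution (histogram overlap of z-scored fields).
    A perfect match scores 1; worse patterns score lower."""
    obs = np.asarray(obs, float).ravel()
    sim = np.asarray(sim, float).ravel()
    # 1) co-location: Pearson correlation
    alpha = np.corrcoef(obs, sim)[0, 1]
    # 2) variation: ratio of coefficients of variation
    beta = (np.std(sim) / np.mean(sim)) / (np.std(obs) / np.mean(obs))
    # 3) distribution: intersection of z-score histograms
    zo = (obs - obs.mean()) / obs.std()
    zs = (sim - sim.mean()) / sim.std()
    lo, hi = min(zo.min(), zs.min()), max(zo.max(), zs.max())
    ho, _ = np.histogram(zo, bins=bins, range=(lo, hi))
    hs, _ = np.histogram(zs, bins=bins, range=(lo, hi))
    gamma = np.minimum(ho, hs).sum() / ho.sum()
    # Euclidean distance from the ideal point (1, 1, 1)
    return 1.0 - np.sqrt((alpha - 1)**2 + (beta - 1)**2 + (gamma - 1)**2)

rng = np.random.default_rng(0)
aet_obs = rng.random((40, 40)) + 1.0   # synthetic "satellite" AET pattern
score_perfect = spatial_pattern_metric(aet_obs, aet_obs)
```

    Unlike a cell-wise RMSE, a metric of this form rewards simulations that reproduce the observed pattern's spatial structure rather than its absolute values.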

  10. Understanding Effective Program Improvement Schools through a Distributed Leadership Task Context Model

    Science.gov (United States)

    Gipson, Frances Marie

    2012-01-01

    Federal, state, and local agencies face challenges organizing resources that create the conditions necessary to create, sustain, and replicate effective high performing schools. Knowing that leadership does impact achievement outcomes and that school districts tackle growing numbers of sanctioned Program Improvement schools, a distributed…

  11. Evaluating the impact of improvements to the FLAMBE smoke source model on forecasts of aerosol distribution from NAAPS

    Science.gov (United States)

    Hyer, E. J.; Reid, J. S.

    2006-12-01

    As more forecast models aim to include aerosol and chemical species, there is a need for source functions for biomass burning emissions that are accurate, robust, and operable in real-time. NAAPS is a global aerosol forecast model running every six hours and forecasting distributions of biomass burning, industrial sulfate, dust, and sea salt aerosols. This model is run operationally by the U.S. Navy as an aid to planning. The smoke emissions used as input to the model are calculated from the data collected by the FLAMBE system, driven by near-real-time active fire data from GOES WF_ABBA and MODIS Rapid Response. The smoke source function uses land cover data to predict properties of detected fires based on literature data from experimental burns. This scheme is very sensitive to the choice of land cover data sets. In areas of rapid land cover change, the use of static land cover data can produce artifactual changes in emissions unrelated to real changes in fire patterns. In South America, this change may be as large as 40% over five years. We demonstrate the impact of a modified land cover scheme on FLAMBE emissions and NAAPS forecasts, including a fire size algorithm developed using MODIS burned area data. We also describe the effects of corrections to emissions estimates for cloud and satellite coverage. We outline areas where existing data sources are incomplete and improvements are required to achieve accurate modeling of biomass burning emissions in real time.
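    Smoke source functions of this kind typically follow the classic Seiler–Crutzen bookkeeping, in which emitted mass is the product of burned area, fuel load, combustion completeness, and a per-species emission factor; the sketch below illustrates why the land-cover assignment matters so much (all coefficients are illustrative, not FLAMBE's actual land-cover table):

```python
def smoke_emission_kg(burned_area_m2, fuel_load_kg_m2,
                      combustion_completeness, emission_factor_g_kg):
    """Seiler-Crutzen style estimate:
    emitted mass = area * fuel load * completeness * emission factor."""
    dry_matter_kg = burned_area_m2 * fuel_load_kg_m2 * combustion_completeness
    return dry_matter_kg * emission_factor_g_kg / 1000.0

# Illustrative land-cover sensitivity: the same fire detection mapped to
# grassland vs. forest fuel parameters yields very different emissions.
grass = smoke_emission_kg(1e6, 0.5, 0.9, 6.0)     # grassland-like assumptions
forest = smoke_emission_kg(1e6, 10.0, 0.5, 10.0)  # forest-like assumptions
```

    With static land-cover data, a cell that converts from forest to pasture keeps the forest coefficients, producing exactly the kind of artifactual emission trend described above.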

  12. Global distribution and climate forcing of marine organic aerosol: 1. Model improvements and evaluation

    Directory of Open Access Journals (Sweden)

    N. Meskhidze

    2011-11-01

    Marine organic aerosol emissions have been implemented and evaluated within the National Center for Atmospheric Research (NCAR) Community Atmosphere Model (CAM5) with the Pacific Northwest National Laboratory's 7-mode Modal Aerosol Module (MAM-7). Emissions of marine primary organic aerosols (POA), phytoplankton-produced isoprene- and monoterpene-derived secondary organic aerosols (SOA), and methane sulfonate (MS) are shown to affect surface concentrations of organic aerosols in remote marine regions. Global emissions of submicron marine POA are estimated to be 7.9 and 9.4 Tg yr−1 for the Gantt et al. (2011) and Vignati et al. (2010) emission parameterizations, respectively. Marine sources of SOA and particulate MS (containing both sulfur and carbon atoms) contribute an additional 0.2 and 5.1 Tg yr−1, respectively. Widespread areas over the productive waters of the Northern Atlantic, Northern Pacific, and the Southern Ocean show marine-source submicron organic aerosol surface concentrations of 100 ng m−3, with values up to 400 ng m−3 over biologically productive areas. Comparison of long-term surface observations of water-insoluble organic matter (WIOM) with POA concentrations from the two emission parameterizations shows that, despite revealed discrepancies (often more than a factor of 2), both the Gantt et al. (2011) and Vignati et al. (2010) formulations are able to capture the magnitude of marine organic aerosol concentrations, with the Gantt et al. (2011) parameterization attaining better seasonality. Model simulations show that the mixing state of the marine POA can impact the surface number concentration of cloud condensation nuclei (CCN). The largest increases (up to 20%) in CCN number concentration (at a supersaturation (S) of 0.2%) are obtained over biologically productive ocean waters when marine organic aerosol is assumed to be externally mixed with sea salt. Assuming

  13. Elasto-dynamic analysis of a gear pump-Part IV: Improvement in the pressure distribution modelling

    Science.gov (United States)

    Mucchi, E.; Dalpiaz, G.; Fernàndez del Rincòn, A.

    2015-01-01

    This work concerns external gear pumps for automotive applications, which operate at high speed and low pressure. In previous works by the authors (Part I and II, [1,2]), a non-linear lumped-parameter kineto-elastodynamic model for predicting the dynamic behaviour of external gear pumps was presented. It takes into account the most important phenomena involved in the operation of this kind of machine. The two main sources of noise and vibration are considered: pressure pulsation and gear meshing. The model has been used to foresee the influence of working conditions and design modifications on vibration generation. Experimental validation of the model is a difficult task. Thus, Part III proposes a novel validation methodology carried out by comparing simulations and experimental results concerning forces and moments: it deals with the external and inertial components acting on the gears, estimated by the model, and the reactions and inertial components on the pump casing and the test plate, obtained by measurements. The validation is carried out by comparing the level of the time synchronous average in the time domain and the waterfall maps in the frequency domain, with particular attention to identifying system resonances. The validation results are globally satisfactory, but discrepancies are still present. Moreover, the assessed model has been properly modified for application to a new virtual pump prototype with helical gears in order to foresee gear accelerations and dynamic forces. Part IV focuses on improvements in the modelling and analysis of the phenomena bound to the pressure distribution around the gears in order to achieve results closer to the measured values. As a matter of fact, the simulation results have shown that a variable meshing stiffness makes a notable contribution to the dynamic behaviour of the pump, but it is not as important as the pressure phenomena. As a consequence, the original model was modified with

  14. Hierarchical species distribution models

    Science.gov (United States)

    Hefley, Trevor J.; Hooten, Mevin B.

    2016-01-01

    Determining the distribution pattern of a species is important to increase scientific knowledge, inform management decisions, and conserve biodiversity. To infer spatial and temporal patterns, species distribution models have been developed for use with many sampling designs and types of data. Recently, it has been shown that count, presence-absence, and presence-only data can be conceptualized as arising from a point process distribution. Therefore, it is important to understand properties of the point process distribution. We examine how the hierarchical species distribution modeling framework has been used to incorporate a wide array of regression and theory-based components while accounting for the data collection process and making use of auxiliary information. The hierarchical modeling framework allows us to demonstrate how several commonly used species distribution models can be derived from the point process distribution, highlight areas of potential overlap between different models, and suggest areas where further research is needed.
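    The point-process view can be made concrete with a small simulation: presence-only locations arising from an inhomogeneous Poisson process whose intensity is log-linear in an environmental covariate (a standard assumption for illustration, not a specific model from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_ipp(intensity, lam_max, extent=1.0):
    """Simulate an inhomogeneous Poisson point process on [0, extent]^2
    by thinning a homogeneous process with rate lam_max."""
    n = rng.poisson(lam_max * extent**2)             # candidate count
    pts = rng.random((n, 2)) * extent                # candidate locations
    keep = rng.random(n) < intensity(pts) / lam_max  # thinning step
    return pts[keep]

# Log-linear intensity in a covariate x (e.g. elevation): lambda = exp(b0 + b1*x)
b0, b1 = 4.0, 2.0
intensity = lambda p: np.exp(b0 + b1 * p[:, 0])
lam_max = np.exp(b0 + b1)                            # upper bound on intensity
presences = simulate_ipp(intensity, lam_max)
```

    Count, presence-absence, and presence-only likelihoods can all be derived by conditioning on different observation processes applied to a simulation like this, which is the unification the abstract describes.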

  15. Combining satellite data and appropriate objective functions for improved spatial pattern performance of a distributed hydrologic model

    DEFF Research Database (Denmark)

    Demirel, Mehmet C.; Mai, Juliane; Mendiguren Gonzalez, Gorka

    2018-01-01

    selected due to its soil parameter distribution approach based on pedo-transfer functions and its built-in multi-scale parameter regionalisation. In addition, two new spatial parameter distribution options have been incorporated in the model in order to increase the flexibility of root fraction coefficient...

  16. Productivity improvements in gas distribution

    International Nuclear Information System (INIS)

    Young, M.R.

    1997-01-01

    In 1993, the Hilmer Report resulted in the introduction of the National Competition Policy which, in the case of the gas industry, aims to promote gas-on-gas competition where to date it has been excluded. In response, and to prepare for wide gas industry reform, Gas and Fuel formed three fundamentally different core businesses on 1 July 1996 - Energy Retail, Network, and Contestable Services. In one productivity improvement initiative which is believed to be unique, Gas and Fuel appointed three companies as strategic alliance partners for distribution system maintenance. Gas and Fuel can now concentrate on its core role as asset manager which owns and operates the distribution system while procuring all services from what will become non-regulated businesses. This Paper details this initiative and the benefits which have resulted from overall changes and improvements, and outlines the challenges facing Gas and Fuel in the future. (au)

  17. Using numerical model simulations to improve the understanding of micro-plastic distribution and pathways in the marine environment

    NARCIS (Netherlands)

    Hardesty, Britta D.; Harari, Joseph; Isobe, Atsuhiko; Lebreton, Laurent; Maximenko, Nikolai; Potemra, Jim; van Sebille, Erik; Vethaak, A.Dick; Wilcox, Chris

    2017-01-01

    Numerical modeling is one of the key tools with which we can gain insight into the distribution of marine litter, especially micro-plastics. Over the past decade, a series of numerical simulations have been constructed that specifically target floating marine litter, based on ocean models of various

  18. Distributed Wind Competitiveness Improvement Project

    Energy Technology Data Exchange (ETDEWEB)

    2018-02-27

    The Competitiveness Improvement Project (CIP) is a periodic solicitation through the U.S. Department of Energy and its National Renewable Energy Laboratory. Manufacturers of small and medium wind turbines are awarded cost-shared grants via a competitive process to optimize their designs, develop advanced manufacturing processes, and perform turbine testing. The goals of the CIP are to make wind energy cost competitive with other distributed generation technology and increase the number of wind turbine designs certified to national testing standards. This fact sheet describes the CIP and funding awarded as part of the project.

  19. Combining satellite data and appropriate objective functions for improved spatial pattern performance of a distributed hydrologic model

    DEFF Research Database (Denmark)

    Demirel, Mehmet C.; Mai, Juliane; Mendiguren Gonzalez, Gorka

    2018-01-01

    Satellite-based earth observations offer great opportunities to improve spatial model predictions by means of spatial-pattern-oriented model evaluations. In this study, observed spatial patterns of actual evapotranspiration (AET) are utilised for spatial model calibration tailored to target...... and potential evapotranspiration correction parameterisations, based on soil type and vegetation density. These parameterisations are utilised as they are most relevant for simulated AET patterns from the hydrologic model. Due to the fundamental challenges encountered when evaluating spatial pattern performance...

  20. Distributed Video Coding: Iterative Improvements

    DEFF Research Database (Denmark)

    Luong, Huynh Van

    Nowadays, emerging applications such as wireless visual sensor networks and wireless video surveillance are requiring lightweight video encoding with high coding efficiency and error-resilience. Distributed Video Coding (DVC) is a new coding paradigm which exploits the source statistics...... and noise modeling and also learn from the previous decoded Wyner-Ziv (WZ) frames, side information and noise learning (SING) is proposed. The SING scheme introduces an optical flow technique to compensate the weaknesses of the block based SI generation and also utilizes clustering of DCT blocks to capture...... cross band correlation and increase local adaptivity in noise modeling. During decoding, the updated information is used to iteratively reestimate the motion and reconstruction in the proposed motion and reconstruction reestimation (MORE) scheme. The MORE scheme not only reestimates the motion vectors...

  1. Bounding species distribution models

    Directory of Open Access Journals (Sweden)

    Thomas J. STOHLGREN, Catherine S. JARNEVICH, Wayne E. ESAIAS, Jeffrey T. MORISETTE

    2011-10-01

    Species distribution models are increasing in popularity for mapping suitable habitat for species of management concern. Many investigators now recognize that extrapolations of these models with geographic information systems (GIS) might be sensitive to the environmental bounds of the data used in their development, yet there is no recommended best practice for "clamping" model extrapolations. We relied on two commonly used modeling approaches: classification and regression tree (CART) and maximum entropy (Maxent) models, and we tested a simple alteration of the model extrapolations, bounding extrapolations to the maximum and minimum values of primary environmental predictors, to provide a more realistic map of suitable habitat of hybridized Africanized honey bees in the southwestern United States. Findings suggest that multiple models of bounding, and the most conservative bounding of species distribution models, like those presented here, should probably replace the unbounded or loosely bounded techniques currently used [Current Zoology 57 (5): 642–647, 2011].

  2. Bounding Species Distribution Models

    Science.gov (United States)

    Stohlgren, Thomas J.; Jarnevich, Cahterine S.; Morisette, Jeffrey T.; Esaias, Wayne E.

    2011-01-01

    Species distribution models are increasing in popularity for mapping suitable habitat for species of management concern. Many investigators now recognize that extrapolations of these models with geographic information systems (GIS) might be sensitive to the environmental bounds of the data used in their development, yet there is no recommended best practice for "clamping" model extrapolations. We relied on two commonly used modeling approaches: classification and regression tree (CART) and maximum entropy (Maxent) models, and we tested a simple alteration of the model extrapolations, bounding extrapolations to the maximum and minimum values of primary environmental predictors, to provide a more realistic map of suitable habitat of hybridized Africanized honey bees in the southwestern United States. Findings suggest that multiple models of bounding, and the most conservative bounding of species distribution models, like those presented here, should probably replace the unbounded or loosely bounded techniques currently used [Current Zoology 57 (5): 642-647, 2011].
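    The bounding ("clamping") operation described above amounts to truncating each environmental predictor to the range observed in the training data before projecting the model; a minimal numpy sketch (variable names and values are illustrative):

```python
import numpy as np

def clamp_predictors(projection_grid, training_values):
    """Bound a predictor raster to the min/max observed in training data,
    so the model is never evaluated outside its environmental bounds."""
    lo, hi = np.min(training_values), np.max(training_values)
    return np.clip(projection_grid, lo, hi)

# Training data span 10-30 degC; the projection area reaches 38 degC.
train_temp = np.array([10.0, 18.0, 25.0, 30.0])
proj_temp = np.array([[5.0, 22.0], [31.0, 38.0]])
bounded = clamp_predictors(proj_temp, train_temp)
```

    A more conservative variant could instead mask (rather than clamp) cells outside the training range, flagging them as extrapolations rather than assigning them a suitability value.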

  3. Improvement of distributed snowmelt energy balance modeling with MODIS-based NDSI-derived fractional snow-covered area data

    Science.gov (United States)

    Joel W. Homan; Charles H. Luce; James P. McNamara; Nancy F. Glenn

    2011-01-01

    Describing the spatial variability of heterogeneous snowpacks at a watershed or mountain-front scale is important for improvements in large-scale snowmelt modelling. Snowmelt depletion curves, which relate fractional decreases in snow-covered area (SCA) to normalized decreases in snow water equivalent (SWE), are a common approach for scaling up snowmelt models....

  4. Vaginal drug distribution modeling.

    Science.gov (United States)

    Katz, David F; Yuan, Andrew; Gao, Yajing

    2015-09-15

    This review presents and applies fundamental mass transport theory describing the diffusion- and convection-driven transport of drugs in the vaginal environment. It considers sources of variability in the predictions of the models. It illustrates the use of model predictions of microbicide drug concentration distribution (pharmacokinetics) to gain insights about drug effectiveness in preventing HIV infection (pharmacodynamics). The modeling compares vaginal drug distributions after different gel dosage regimens, and it evaluates the consequences of changes in gel viscosity due to aging. It compares vaginal mucosal concentration distributions of drugs delivered by gels vs. intravaginal rings. Finally, the modeling approach is used to compare vaginal drug distributions across species with differing vaginal dimensions. Deterministic models of drug mass transport into and throughout the vaginal environment can provide critical insights about the mechanisms and determinants of such transport. This knowledge, and the methodology that obtains it, can be applied and translated to multiple applications involving the scientific underpinnings of vaginal drug distribution and the performance evaluation and design of products, and their dosage regimens, that achieve it. Copyright © 2015 Elsevier B.V. All rights reserved.
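    The diffusion-driven part of the transport theory reviewed above reduces, in one dimension, to Fick's second law; a minimal explicit finite-difference sketch (grid, diffusivity, and geometry are illustrative, not taken from the review):

```python
import numpy as np

def diffuse_1d(c0, D, dx, dt, steps):
    """Explicit FTCS scheme for Fick's second law, dc/dt = D d2c/dx2,
    with zero-flux boundaries (drug mass conserved in the compartment)."""
    assert D * dt / dx**2 <= 0.5, "stability limit for the explicit scheme"
    c = np.asarray(c0, float).copy()
    for _ in range(steps):
        lap = np.zeros_like(c)
        lap[1:-1] = c[2:] - 2 * c[1:-1] + c[:-2]
        lap[0] = c[1] - c[0]           # reflecting boundary
        lap[-1] = c[-2] - c[-1]
        c += D * dt / dx**2 * lap
    return c

# A bolus of gel-delivered drug spreading along an (illustrative) 10 cm canal
x = np.linspace(0.0, 10.0, 101)        # cm
c0 = np.where(np.abs(x - 5.0) < 1.0, 1.0, 0.0)
c = diffuse_1d(c0, D=5e-3, dx=0.1, dt=1.0, steps=500)
```

    A convection term, mucosal uptake, and epithelial boundary conditions would be added on top of this core for the pharmacokinetic predictions the review discusses.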

  5. Calibration by Hydrological Response Unit of a National Hydrologic Model to Improve Spatial Representation and Distribution of Parameters

    Science.gov (United States)

    Norton, P. A., II

    2015-12-01

    The U.S. Geological Survey is developing a National Hydrologic Model (NHM) to support consistent hydrologic modeling across the conterminous United States (CONUS). The Precipitation-Runoff Modeling System (PRMS) simulates daily hydrologic and energy processes in watersheds and is used for the NHM application. For PRMS, each watershed is divided into hydrologic response units (HRUs); by default each HRU is assumed to have a uniform hydrologic response. The Geospatial Fabric (GF) is a database containing initial parameter values for input to PRMS and was created for the NHM. The parameter values in the GF were derived from datasets that characterize the physical features of the entire CONUS. The NHM application is composed of more than 100,000 HRUs from the GF. Selected parameter values are commonly adjusted by basin in PRMS using an automated calibration process based on calibration targets, such as streamflow. Providing each HRU with distinct values that capture variability within the CONUS may improve simulation performance of the NHM. During calibration of the NHM by HRU, selected parameter values are adjusted for PRMS based on calibration targets, such as streamflow, snow water equivalent (SWE), and actual evapotranspiration (AET). Simulated SWE, AET, and runoff were compared to value ranges derived from multiple sources (e.g. the Snow Data Assimilation System, the Moderate Resolution Imaging Spectroradiometer (i.e. MODIS) Global Evapotranspiration Project, the Simplified Surface Energy Balance model, and the Monthly Water Balance Model). This provides each HRU with a distinct set of parameter values that captures the variability within the CONUS, leading to improved model performance. We present simulation results from the NHM after preliminary calibration, including the results of basin-level calibration for the NHM using: 1) default initial GF parameter values, and 2) parameter values calibrated by HRU.

  6. Family of Quantum Sources for Improving Near Field Accuracy in Transducer Modeling by the Distributed Point Source Method

    Directory of Open Access Journals (Sweden)

    Dominique Placko

    2016-10-01

    Full Text Available The distributed point source method, or DPSM, developed in the last decade, has been used for solving various engineering problems, such as elastic and electromagnetic wave propagation, electrostatics, and fluid flow. Based on a semi-analytical formulation, the DPSM solution is generally built by superimposing point source solutions, or Green's functions. However, the DPSM solution can also be obtained by superimposing elemental solutions of volume sources having some source density, called the equivalent source density (ESD). In earlier works mostly point sources were used. In this paper the DPSM formulation is modified to introduce a new kind of ESD, replacing the classical single point source with a family of point sources referred to as quantum sources. The proposed formulation with these quantum sources does not change the dimension of the global matrix to be inverted compared with the classical point-source-based DPSM formulation. To assess the performance of this new formulation, the ultrasonic field generated by a circular planar transducer was compared with that of the classical DPSM formulation and the analytical solution. The results show a significant improvement in the near-field computation.
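
The core DPSM superposition can be made concrete with a minimal sketch: the field at an observation point is a weighted sum of point-source Green's functions, and a "quantum source" replaces each single point source with a small cluster of sub-sources sharing one strength coefficient, so the number of unknowns (and hence the matrix dimension) is unchanged. Geometry, wavelength, and strengths below are illustrative, not taken from the paper.

```python
import cmath

def greens(r, k):
    """Free-space Green's function exp(i k r) / (4 pi r)."""
    return cmath.exp(1j * k * r) / (4 * cmath.pi * r)

def field_at(obs, source_groups, strengths, k):
    """Field at obs: each group of sub-sources shares one strength."""
    def dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5
    total = 0j
    for group, c in zip(source_groups, strengths):
        # one "quantum source": several sub-points, one unknown coefficient
        total += c * sum(greens(dist(obs, s), k) for s in group)
    return total

# One classical point source vs. one 3-point quantum source; both carry a
# single unknown coefficient.
k = 2.0 * cmath.pi / 1.5e-3                  # wavenumber, 1.5 mm wavelength
classical = [[(0.0, 0.0, -1e-4)]]
quantum = [[(-5e-5, 0.0, -1e-4), (0.0, 0.0, -1e-4), (5e-5, 0.0, -1e-4)]]
p1 = field_at((0.0, 0.0, 5e-3), classical, [1.0], k)
p2 = field_at((0.0, 0.0, 5e-3), quantum, [1.0 / 3.0], k)
```

Far from the sources the two agree closely; the benefit claimed in the paper appears in the near field, where the distributed cluster smooths the singular behavior of a lone point source.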

  7. Chance-constrained overland flow modeling for improving conceptual distributed hydrologic simulations based on scaling representation of sub-daily rainfall variability

    International Nuclear Information System (INIS)

    Han, Jing-Cheng; Huang, Guohe; Huang, Yuefei; Zhang, Hua; Li, Zhong; Chen, Qiuwen

    2015-01-01

    Lack of hydrologic process representation at short time scales leads to inadequate simulations in distributed hydrological modeling. Especially for complex mountainous watersheds, surface runoff simulations are significantly affected by overland flow generation, which is closely related to rainfall characteristics at the sub-daily time step. In this paper, the sub-daily variability of rainfall intensity was represented with a probability distribution, and a chance-constrained overland flow modeling approach was proposed to capture the generation of overland flow within conceptual distributed hydrologic simulations. The integrated modeling procedures were demonstrated on a watershed of the China Three Gorges Reservoir area, leading to an improved SLURP-TGR hydrologic model based on SLURP. Combined with rainfall thresholds determined to distinguish various magnitudes of daily rainfall totals, three levels of significance were employed to examine the hydrologic-response simulation. Results showed that SLURP-TGR enhanced model performance and effectively controlled the deviation of runoff simulations. Rainfall thresholds proved crucial for reflecting the scaling effect of rainfall intensity; the optimal level of significance and rainfall threshold were 0.05 and 10 mm, respectively. For the Xiangxi River watershed, the main runoff contribution came from interflow of the fast store. Although only slight differences in overland flow simulations between SLURP and SLURP-TGR were found, SLURP-TGR helped improve the simulation of peak flows and improved overall modeling efficiency by adjusting runoff component simulations. Consequently, the developed modeling approach favors efficient representation of hydrological processes and is expected to have potential for wide application. - Highlights: • We develop an improved hydrologic model considering the scaling effect of rainfall.
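
A simplified reading of the chance constraint: sub-daily rainfall intensity is treated as a random variable, and overland flow is triggered only when the probability that intensity exceeds the infiltration capacity reaches the significance level. The sketch below assumes an exponential intensity distribution scaled by the daily total; the 10 mm threshold and alpha = 0.05 echo the paper's reported optima, but the formulation itself is a toy, not the SLURP-TGR scheme.

```python
import math

def p_exceed(mean_intensity_mm_per_h, infiltration_mm_per_h):
    """P(I > f) for exponentially distributed intensity with given mean."""
    return math.exp(-infiltration_mm_per_h / mean_intensity_mm_per_h)

def generates_overland_flow(daily_total_mm, threshold_mm=10.0,
                            infiltration_mm_per_h=5.0, alpha=0.05):
    """Chance-constrained trigger: overland flow occurs only if
    P(intensity > infiltration capacity) >= alpha."""
    if daily_total_mm < threshold_mm:        # below the rainfall threshold
        return False
    mean_intensity = daily_total_mm / 24.0   # mm/h, uniform-in-time mean
    return p_exceed(mean_intensity, infiltration_mm_per_h) >= alpha

print(generates_overland_flow(8.0))    # False: below the 10 mm threshold
print(generates_overland_flow(45.0))   # True: exceedance probability >= 0.05
```

The point of the construction is that two days with the same daily total but different assumed sub-daily variability can produce different overland flow decisions, which a purely daily model cannot represent.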

  8. Chance-constrained overland flow modeling for improving conceptual distributed hydrologic simulations based on scaling representation of sub-daily rainfall variability

    Energy Technology Data Exchange (ETDEWEB)

    Han, Jing-Cheng [State Key Laboratory of Hydroscience & Engineering, Department of Hydraulic Engineering, Tsinghua University, Beijing 100084 (China); Huang, Guohe, E-mail: huang@iseis.org [Institute for Energy, Environment and Sustainable Communities, University of Regina, Regina, Saskatchewan S4S 0A2 (Canada); Huang, Yuefei [State Key Laboratory of Hydroscience & Engineering, Department of Hydraulic Engineering, Tsinghua University, Beijing 100084 (China); Zhang, Hua [College of Science and Engineering, Texas A& M University — Corpus Christi, Corpus Christi, TX 78412-5797 (United States); Li, Zhong [Institute for Energy, Environment and Sustainable Communities, University of Regina, Regina, Saskatchewan S4S 0A2 (Canada); Chen, Qiuwen [Center for Eco-Environmental Research, Nanjing Hydraulics Research Institute, Nanjing 210029 (China)

    2015-08-15

    Lack of hydrologic process representation at short time scales leads to inadequate simulations in distributed hydrological modeling. Especially for complex mountainous watersheds, surface runoff simulations are significantly affected by overland flow generation, which is closely related to rainfall characteristics at the sub-daily time step. In this paper, the sub-daily variability of rainfall intensity was represented with a probability distribution, and a chance-constrained overland flow modeling approach was proposed to capture the generation of overland flow within conceptual distributed hydrologic simulations. The integrated modeling procedures were demonstrated on a watershed of the China Three Gorges Reservoir area, leading to an improved SLURP-TGR hydrologic model based on SLURP. Combined with rainfall thresholds determined to distinguish various magnitudes of daily rainfall totals, three levels of significance were employed to examine the hydrologic-response simulation. Results showed that SLURP-TGR enhanced model performance and effectively controlled the deviation of runoff simulations. Rainfall thresholds proved crucial for reflecting the scaling effect of rainfall intensity; the optimal level of significance and rainfall threshold were 0.05 and 10 mm, respectively. For the Xiangxi River watershed, the main runoff contribution came from interflow of the fast store. Although only slight differences in overland flow simulations between SLURP and SLURP-TGR were found, SLURP-TGR helped improve the simulation of peak flows and improved overall modeling efficiency by adjusting runoff component simulations. Consequently, the developed modeling approach favors efficient representation of hydrological processes and is expected to have potential for wide application. - Highlights: • We develop an improved hydrologic model considering the scaling effect of rainfall.

  9. Integrating Remote Sensing Information Into A Distributed Hydrological Model for Improving Water Budget Predictions in Large-scale Basins through Data Assimilation

    Science.gov (United States)

    Qin, Changbo; Jia, Yangwen; Su, Z.(Bob); Zhou, Zuhao; Qiu, Yaqin; Suhui, Shen

    2008-01-01

    This paper investigates whether remote sensing evapotranspiration estimates can be integrated by means of data assimilation into a distributed hydrological model for improving the predictions of spatial water distribution over a large river basin with an area of 317,800 km2. A series of available MODIS satellite images over the Haihe River basin in China are used for the year 2005. Evapotranspiration is retrieved from these 1×1 km resolution images using the SEBS (Surface Energy Balance System) algorithm. The physically based distributed model WEP-L (Water and Energy transfer Process in Large river basins) is used to compute the water balance of the Haihe River basin in the same year. Comparison between the model-derived and remote-sensing-retrieved basin-averaged evapotranspiration estimates shows a good piecewise linear relationship, but their spatial distributions within the Haihe basin differ, with the remote sensing retrievals showing variability at finer scales. An extended Kalman filter (EKF) data assimilation algorithm, suitable for non-linear problems, is used. Assimilation results indicate that remote sensing observations can play an important role in providing spatial information to the assimilation system for the spatially optimal hydrological parameterization of the model. This is especially important for large basins, such as the Haihe River basin in this study. Combining and integrating the capabilities of and information from model simulation and remote sensing techniques may provide the best spatial and temporal characteristics for hydrological states/fluxes, and would be both appealing and necessary for improving our knowledge of fundamental hydrological processes and for addressing important water resource management problems. PMID:27879946
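
To make the assimilation step concrete, here is a one-variable Kalman update: the model's ET estimate is corrected toward the remote-sensing retrieval in proportion to the relative uncertainties. The numbers are invented; the paper applies an extended Kalman filter over the full WEP-L state, not this scalar form.

```python
def kalman_update(x_model, p_model, y_obs, r_obs):
    """Return the updated estimate and its error variance.

    x_model, p_model: model forecast and its error variance
    y_obs, r_obs: observation (e.g. a SEBS ET retrieval) and its variance
    """
    gain = p_model / (p_model + r_obs)            # Kalman gain
    x_updated = x_model + gain * (y_obs - x_model)
    p_updated = (1.0 - gain) * p_model
    return x_updated, p_updated

# Model says 3.0 mm/day ET (variance 0.5); retrieval says 4.0 (variance 1.0).
x, p = kalman_update(3.0, 0.5, 4.0, 1.0)
# gain = 1/3, so the estimate moves a third of the way toward the retrieval
# and the posterior variance shrinks below both inputs.
```

The same arithmetic, with Jacobians replacing the scalar gain, underlies the EKF used in the study.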

  10. Not to put too fine a point on it - does increasing precision of geographic referencing improve species distribution models for a wide-ranging migratory bat?

    Science.gov (United States)

    Hayes, Mark A.; Ozenberger, Katharine; Cryan, Paul M.; Wunder, Michael B.

    2015-01-01

    Bat specimens held in natural history museum collections can provide insights into the distribution of species. However, there are several important sources of spatial error associated with natural history specimens that may influence the analysis and mapping of bat species distributions. We analyzed the importance of geographic referencing and error correction in species distribution modeling (SDM) using occurrence records of hoary bats (Lasiurus cinereus). This species is known to migrate long distances and is a species of increasing concern due to fatalities documented at wind energy facilities in North America. We used 3,215 museum occurrence records collected from 1950–2000 for hoary bats in North America. We compared SDM performance using five approaches: generalized linear models, multivariate adaptive regression splines, boosted regression trees, random forest, and maximum entropy models. We evaluated results using three SDM performance metrics (AUC, sensitivity, and specificity) and two data sets: one consisting of the original occurrence data, and a second consisting of the same records after the locations were adjusted to correct for identifiable spatial errors. The increase in precision reduced the mean estimated spatial error associated with hoary bat records from 5.11 km to 1.58 km, and this reduction in error resulted in a slight increase in all three SDM performance metrics. These results provide insights into the importance of geographic referencing and the value of correcting spatial errors in modeling the distribution of a wide-ranging bat species. We conclude that the considerable time and effort invested in carefully increasing the precision of the occurrence locations in this data set was not worth the marginal gains in improved SDM performance, and it seems likely that gains would be similar for other bat species that range across large areas of the continent, migrate, and are habitat generalists.
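
The three performance metrics used above can be computed from scratch for a toy set of presence (1) / background (0) labels and model scores; this is a generic illustration of the metrics, not the authors' evaluation code.

```python
def sensitivity_specificity(labels, scores, threshold=0.5):
    """True positive rate and true negative rate at a fixed threshold."""
    tp = sum(1 for y, s in zip(labels, scores) if y == 1 and s >= threshold)
    fn = sum(1 for y, s in zip(labels, scores) if y == 1 and s < threshold)
    tn = sum(1 for y, s in zip(labels, scores) if y == 0 and s < threshold)
    fp = sum(1 for y, s in zip(labels, scores) if y == 0 and s >= threshold)
    return tp / (tp + fn), tn / (tn + fp)

def auc(labels, scores):
    """AUC as the probability that a presence outscores a background point
    (ties count half): the Mann-Whitney formulation."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels = [1, 1, 1, 0, 0, 0]
scores = [0.9, 0.8, 0.4, 0.6, 0.3, 0.2]
sens, spec = sensitivity_specificity(labels, scores)
print(sens, spec, auc(labels, scores))  # 2/3, 2/3, 8/9
```

Because AUC is threshold-free while sensitivity and specificity depend on the cutoff, reporting all three, as the study does, gives a fuller picture of how spatial-error correction changes model behavior.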

  11. Assimilation of ground and satellite snow observations in a distributed hydrologic model to improve water supply forecasts in the Upper Colorado River Basin

    Science.gov (United States)

    Micheletty, P. D.; Day, G. N.; Quebbeman, J.; Carney, S.; Park, G. H.

    2016-12-01

    The Upper Colorado River Basin above Lake Powell is a major source of water supply for 25 million people and provides irrigation water for 3.5 million acres. Approximately 85% of the annual runoff is produced from snowmelt. Water supply forecasts of the April-July runoff produced by the National Weather Service (NWS) Colorado Basin River Forecast Center (CBRFC) are critical to basin water management. This project leverages advanced distributed models, datasets, and snow data assimilation techniques to improve operational water supply forecasts made by CBRFC in the Upper Colorado River Basin. The current work focuses specifically on improving water supply forecasts through the implementation of a snow data assimilation process coupled with the Hydrology Laboratory-Research Distributed Hydrologic Model (HL-RDHM). Three types of observations will be used in the snow data assimilation system: satellite Snow Covered Area (MODSCAG), satellite Dust Radiative Forcing in Snow (MODDRFS), and SNOTEL Snow Water Equivalent (SWE). SNOTEL SWE provides the main source of high-elevation snowpack information during the snow season; however, these point measurement sites are carefully selected to provide consistent indices of snowpack and may not be representative of the surrounding watershed. We address this problem by transforming the SWE observations to standardized deviates and interpolating the standardized deviates using a spatial regression model. The interpolation process will also take advantage of the MODIS Snow Covered Area and Grainsize (MODSCAG) product to inform the model on the spatial distribution of snow. The interpolated standardized deviates are back-transformed and used in an Ensemble Kalman Filter (EnKF) to update the model-simulated SWE. The MODIS Dust Radiative Forcing in Snow (MODDRFS) product will be used more directly through temporary adjustments to model snowmelt parameters, which should improve melt estimates in areas affected by dust on snow.
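
The standardized-deviate step above can be sketched in two lines: each station's SWE is expressed in units of its own climatological variability, interpolated in that standardized space (the spatial regression is omitted here), and back-transformed with the local climatology of the target grid cell. The station and grid statistics below are invented for illustration.

```python
def to_deviate(obs, station_mean, station_std):
    """Express a station observation in standardized units."""
    return (obs - station_mean) / station_std

def from_deviate(z, local_mean, local_std):
    """Back-transform a standardized deviate with local climatology."""
    return local_mean + z * local_std

# Station reports 260 mm SWE; its climatology is mean 200 mm, sd 50 mm.
z = to_deviate(260.0, 200.0, 50.0)        # 1.2 standardized units
# Back-transform at a nearby grid cell with a thinner snowpack climatology.
swe_grid = from_deviate(z, 120.0, 30.0)   # 156.0 mm
```

The transform is what lets a SNOTEL site that is, say, 1.2 standard deviations above its own normal inform a neighboring cell without imposing the station's absolute SWE on terrain that holds much less snow.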

  12. From Logical to Distributional Models

    Directory of Open Access Journals (Sweden)

    Anne Preller

    2014-12-01

    Full Text Available The paper relates two variants of semantic models for natural language, logical functional models and compositional distributional vector space models, by transferring the logic and reasoning from the logical to the distributional models. The geometrical operations of quantum logic are reformulated as algebraic operations on vectors. A map from functional models to vector space models makes it possible to compare the meaning of sentences word by word.

  13. Strategies for improving the surveillance of drinking water quality in distribution networks : application of emerging modeling approaches

    OpenAIRE

    Francisque, Alex

    2009-01-01

    This thesis is devoted to improving the surveillance of drinking water quality in the distribution network (RD) and to its... The main RD of Quebec City (Canada) is studied. The thesis comprises four chapters. The first concerns the microbiological quality of the water; it introduces new statistical approaches for modeling counts of heterotrophic anaerobic and facultative aerobic bacteria (BHAA) used as an indicator of the variability of the quality of the...

  14. Automated Student Model Improvement

    Science.gov (United States)

    Koedinger, Kenneth R.; McLaughlin, Elizabeth A.; Stamper, John C.

    2012-01-01

    Student modeling plays a critical role in developing and improving instruction and instructional technologies. We present a technique for automated improvement of student models that leverages the DataShop repository, crowd sourcing, and a version of the Learning Factors Analysis algorithm. We demonstrate this method on eleven educational…

  15. Distributed Parameter Modelling Applications

    DEFF Research Database (Denmark)

    Sales-Cruz, Mauricio; Cameron, Ian; Gani, Rafiqul

    2011-01-01

    and the development of a short-path evaporator. The oil shale processing problem illustrates the interplay amongst particle flows in rotating drums, heat and mass transfer between solid and gas phases. The industrial application considers the dynamics of an Alberta-Taciuk processor, commonly used in shale oil and oil...... the steady state, distributed behaviour of a short-path evaporator....

  16. Improved work zone design guidelines and enhanced model of travel delays in work zones : Phase I, portability and scalability of interarrival and service time probability distribution functions for different locations in Ohio and the establishment of impr

    Science.gov (United States)

    2006-01-01

    The project focuses on two major issues - the improvement of current work zone design practices and an analysis of : vehicle interarrival time (IAT) and speed distributions for the development of a digital computer simulation model for : queues and t...

  17. DESIGN IMPROVEMENTS IN MODERN DISTRIBUTION TRANSFORMERS

    OpenAIRE

    Ćućić, Branimir; Meško, Nina; Mikulić, Martina; Trstoglavec, Dominik

    2017-01-01

    In the paper, design improvements of distribution transformers related to improved energy efficiency and environmental awareness are discussed. Eco-design of transformers, amorphous transformers, voltage-regulated transformers, and transformers filled with ester liquids are analyzed. As a consequence of the growing importance of energy efficiency, the European Commission has adopted a new regulation which defines maximum permissible levels of load and no-load losses of transformers with rated...

  18. Distribution system modeling and analysis

    CERN Document Server

    Kersting, William H

    2001-01-01

    For decades, distribution engineers did not have the sophisticated tools developed for analyzing transmission systems; often they had only their instincts. Things have changed, and we now have computer programs that allow engineers to simulate, analyze, and optimize distribution systems. Powerful as these programs are, however, without a real understanding of the operating characteristics of a distribution system, engineers using them can easily make serious errors in their designs and operating procedures. Distribution System Modeling and Analysis helps prevent those errors. It gives readers a basic understanding of the modeling and operating characteristics of the major components of a distribution system. One by one, the author develops and analyzes each component as a stand-alone element, then puts them all together to analyze a distribution system comprising the various shunt and series devices for power-flow and short-circuit studies. He includes the derivation of all models and includes many num...

  19. Improved steamflood analytical model

    Energy Technology Data Exchange (ETDEWEB)

    Chandra, S.; Mamora, D.D. [Society of Petroleum Engineers, Richardson, TX (United States)]|[Texas A and M Univ., TX (United States)

    2005-11-01

    Predicting the performance of steam flooding can help in the proper execution of enhanced oil recovery (EOR) processes. The Jones model is often used for analytical steam flooding performance prediction, but it does not accurately predict oil production peaks. In this study, an improved steam flood model was developed by modifying 2 of the 3 components of the capture factor in the Jones model. The modifications were based on simulation results from a Society of Petroleum Engineers (SPE) comparative project case model. The production performance of a 5-spot steamflood pattern unit was simulated and compared with results obtained from the Jones model. Three reservoir types were simulated through the use of 3-D Cartesian black oil models. In order to correlate the simulation and the Jones analytical model results for the start and height of the production peak, the dimensionless steam zone size was modified to account for a decrease in oil viscosity during steam flooding and its dependence on the steam injection rate. In addition, the dimensionless volume of displaced oil produced was modified from its square-root format to an exponential form. The modified model improved results for production performance by up to 20 years of simulated steam flooding, compared to the Jones model. Results agreed with simulation results for 13 different cases, including 3 different sets of reservoir and fluid properties. Reservoir engineers will benefit from the improved accuracy of the model. Oil displacement calculations were based on methods proposed in earlier research, in which the oil displacement rate is a function of cumulative oil steam ratio. The cumulative oil steam ratio is a function of overall thermal efficiency. Capture factor component formulae were presented, as well as charts of oil production rates and cumulative oil-steam ratios for various reservoirs. 13 refs., 4 tabs., 29 figs.

  20. Modeled ground water age distributions

    Science.gov (United States)

    Woolfenden, Linda R.; Ginn, Timothy R.

    2009-01-01

    The age of ground water in any given sample is a distributed quantity representing distributed provenance (in space and time) of the water. Conventional analysis of tracers such as unstable isotopes or anthropogenic chemical species gives discrete or binary measures of the presence of water of a given age. Modeled ground water age distributions provide a continuous measure of contributions from different recharge sources to aquifers. A numerical solution of the ground water age equation of Ginn (1999) was tested both on a hypothetical simplified one-dimensional flow system and under real world conditions. Results from these simulations yield the first continuous distributions of ground water age using this model. Complete age distributions as a function of one and two space dimensions were obtained from both numerical experiments. Simulations in the test problem produced mean ages that were consistent with the expected value at the end of the model domain for all dispersivity values tested, although the mean ages for the two highest dispersivity values deviated slightly from the expected value. Mean ages in the dispersionless case also were consistent with the expected mean ages throughout the physical model domain. Simulations under real world conditions for three dispersivity values resulted in decreasing mean age with increasing dispersivity. This likely is a consequence of an edge effect. However, simulations for all three dispersivity values tested were mass balanced and stable demonstrating that the solution of the ground water age equation can provide estimates of water mass density distributions over age under real world conditions.
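
The mean ages discussed above are the first moment of the age distribution. As a minimal numerical illustration (not the Ginn (1999) solver), the snippet below computes the mean of a discretized age density by trapezoidal integration, using an exponential density whose mean is known analytically so the result can be checked.

```python
import math

def mean_age(ages, density):
    """First moment of a discretized age density via the trapezoid rule."""
    def trapz(ys):
        return sum((ys[i] + ys[i + 1]) / 2.0 * (ages[i + 1] - ages[i])
                   for i in range(len(ages) - 1))
    mass = trapz(density)                                  # total water mass
    first_moment = trapz([a * d for a, d in zip(ages, density)])
    return first_moment / mass

# Exponential age density with mean 25 years, discretized out to 500 years.
ages = [i * 0.5 for i in range(1001)]
dens = [math.exp(-a / 25.0) / 25.0 for a in ages]
print(round(mean_age(ages, dens), 2))  # close to 25
```

In the paper's setting the density comes from solving the ground water age equation over the flow field, but the reduction from a full distribution to a reported mean age is exactly this moment calculation.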

  1. Improving flow distribution in influent channels using computational fluid dynamics.

    Science.gov (United States)

    Park, No-Suk; Yoon, Sukmin; Jeong, Woochang; Lee, Seungjae

    2016-10-01

    Although the flow distribution in an influent channel, where the inflow is split among the treatment processes of a wastewater treatment plant, greatly affects process efficiency, and although a weir is the typical structure used for flow distribution, there is, to the authors' knowledge, a paucity of research on flow distribution in an open channel with a weir. In this study, the influent channel of a full-scale wastewater treatment plant was used, with a suppressed rectangular weir whose horizontal crest spans the full channel width. The flow distribution in the influent channel was analyzed using a validated computational fluid dynamics model to investigate (1) the comparison of single-phase and two-phase simulation, (2) the improvement procedure for the prototype channel, and (3) the effect of the inflow rate on flow distribution. The results show that two-phase simulation is more reliable because it describes the free-surface fluctuations. To improve flow distribution, preventing short-circuit flow should be considered first, and differences in kinetic energy with the inflow rate lead to different flow distribution trends. The authors believe that this case study is helpful for improving flow distribution in an influent channel.
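
For context on the weir hydraulics: a suppressed rectangular weir is conventionally sized with the textbook weir equation Q = Cd · (2/3) · sqrt(2g) · b · H^1.5, with a discharge coefficient Cd around 0.6. This relation frames the mean flow split that the CFD study then resolves in detail; it is standard open-channel hydraulics, not taken from the paper itself.

```python
import math

def weir_discharge(width_m, head_m, cd=0.6, g=9.81):
    """Discharge (m^3/s) over a suppressed rectangular weir:
    Q = Cd * (2/3) * sqrt(2g) * b * H^1.5."""
    return cd * (2.0 / 3.0) * math.sqrt(2.0 * g) * width_m * head_m ** 1.5

# A 2 m wide crest carrying 0.15 m of head passes roughly 0.2 m^3/s.
q = weir_discharge(2.0, 0.15)
```

The 1.5-power dependence on head is why small free-surface fluctuations, captured only by the two-phase simulation, can noticeably perturb the flow split between treatment trains.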

  2. Improved spectral absorption coefficient grouping strategy of wide band k-distribution model used for calculation of infrared remote sensing signal of hot exhaust systems

    Science.gov (United States)

    Hu, Haiyang; Wang, Qiang

    2018-07-01

    A new strategy for grouping spectral absorption coefficients, considering the influences of both temperature and species mole ratio inhomogeneities on correlated-k characteristics of the spectra of gas mixtures, has been deduced to match the calculation method of spectral overlap parameter used in multiscale multigroup wide band k-distribution model. By comparison with current spectral absorption coefficient grouping strategies, for which only the influence of temperature inhomogeneity on the correlated-k characteristics of spectra of single species was considered, the improvements in calculation accuracies resulting from the new grouping strategy were evaluated using a series of 0D cases in which radiance under 3-5-μm wave band emitted by hot combustion gas of hydrocarbon fuel was attenuated by atmosphere with quite different temperature and mole ratios of water vapor and carbon monoxide to carbon dioxide. Finally, evaluations are presented on the calculation of remote sensing thermal images of transonic hot jet exhausted from a chevron ejecting nozzle with solid wall cooling system.

  3. Distributed power quality improvement in residential microgrids

    DEFF Research Database (Denmark)

    Naderi Zarnaghi, Yahya; Hosseini, Seyed Hossein; Ghassem Zadeh, Saeid

    2017-01-01

    The importance of the power quality issue in microgrids, together with the changing nature of power system distortions, will lead future power systems to use distributed power quality improvement (DPQI) devices. One possible choice of DPQI is the multifunctional DG, which can compensate some...... harmonics at the location of generation and prevent the harmonics from entering the main power grid. In this paper, a control method based on virtual harmonic impedance is presented for these multifunctional DGs, and the effect of DG location on the compensation procedure is studied by simulating...

  4. Incorporating uncertainty in predictive species distribution modelling.

    Science.gov (United States)

    Beale, Colin M; Lennon, Jack J

    2012-01-19

    Motivated by the need to solve ecological problems (climate change, habitat fragmentation and biological invasions), there has been increasing interest in species distribution models (SDMs). Predictions from these models inform conservation policy, invasive species management and disease-control measures. However, predictions are subject to uncertainty, the degree and source of which is often unrecognized. Here, we review the SDM literature in the context of uncertainty, focusing on three main classes of SDM: niche-based models, demographic models and process-based models. We identify sources of uncertainty for each class and discuss how uncertainty can be minimized or included in the modelling process to give realistic measures of confidence around predictions. Because this has typically not been performed, we conclude that uncertainty in SDMs has often been underestimated and a false precision assigned to predictions of geographical distribution. We identify areas where development of new statistical tools will improve predictions from distribution models, notably the development of hierarchical models that link different types of distribution model and their attendant uncertainties across spatial scales. Finally, we discuss the need to develop more defensible methods for assessing predictive performance, quantifying model goodness-of-fit and for assessing the significance of model covariates.

  5. Distributed modeling for road authorities

    NARCIS (Netherlands)

    Luiten, G.T.; Bõhms, H.M.; Nederveen, S. van; Bektas, E.

    2013-01-01

    A great challenge for road authorities is to improve the effectiveness and efficiency of their core processes by improving data exchange and sharing using new technologies such as building information modeling (BIM). BIM has already been successfully implemented in other sectors, such as

  6. Assessing safety risk in electricity distribution processes using ET & BA improved technique and its ranking by VIKOR and TOPSIS models in fuzzy environment

    Directory of Open Access Journals (Sweden)

    S. Rahmani

    2016-04-01

      Conclusion: Height and electricity are among the main causes of accidents in the electricity transmission and distribution industry, which led to overhead power networks being ranked as high risk. Application of decision-making models in a fuzzy environment minimizes the subjectivity of assessors' judgment in the risk assessment process.

  7. An Improved Harmony Search Algorithm for Power Distribution Network Planning

    Directory of Open Access Journals (Sweden)

    Wei Sun

    2015-01-01

    Full Text Available Distribution network planning, because it involves many variables and constraints, is a multiobjective, discrete, nonlinear, large-scale optimization problem. Harmony search (HS) is a metaheuristic algorithm inspired by the improvisation process of music players. The HS algorithm has several impressive advantages, such as easy implementation, few adjustable parameters, and quick convergence, but it still has defects such as premature convergence and slow convergence speed. Given the defects of the standard algorithm and the characteristics of distribution network planning, an improved harmony search (IHS) algorithm is proposed in this paper. We set up a mathematical model of distribution network structure planning whose objective function is the minimum annual cost, subject to overload and radial-network constraints. The IHS algorithm is applied to solve this complex optimization model. The empirical results strongly indicate that the IHS algorithm provides better results for the distribution network planning problem than other optimization algorithms.
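
A bare-bones harmony search loop makes the algorithm concrete: each new "harmony" (candidate solution) is built variable by variable, either drawn from the harmony memory (with probability HMCR), possibly pitch-adjusted (probability PAR), or sampled fresh, and it replaces the worst memory member if it improves on it. This is the standard HS skeleton on a toy sphere objective, not the paper's IHS variant or its planning model.

```python
import random

def harmony_search(objective, dim, bounds, hms=10, hmcr=0.9, par=0.3,
                   bandwidth=0.1, iters=2000, seed=42):
    """Minimize objective over [lo, hi]^dim with basic harmony search."""
    rng = random.Random(seed)
    lo, hi = bounds
    memory = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(hms)]
    for _ in range(iters):
        new = []
        for d in range(dim):
            if rng.random() < hmcr:            # memory consideration
                x = rng.choice(memory)[d]
                if rng.random() < par:         # pitch adjustment
                    x += rng.uniform(-bandwidth, bandwidth)
            else:                              # random sampling
                x = rng.uniform(lo, hi)
            new.append(min(hi, max(lo, x)))
        worst = max(memory, key=objective)
        if objective(new) < objective(worst):  # replace worst if better
            memory.remove(worst)
            memory.append(new)
    return min(memory, key=objective)

best = harmony_search(lambda v: sum(x * x for x in v), dim=3, bounds=(-5, 5))
```

Improved variants such as IHS typically adapt PAR and the bandwidth over the run to address exactly the premature-convergence and speed issues the abstract mentions.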

  8. Video distribution system cost model

    Science.gov (United States)

    Gershkoff, I.; Haspert, J. K.; Morgenstern, B.

    1980-01-01

    A cost model that can be used to systematically identify the costs of procuring and operating satellite-linked communications systems is described. The user defines a network configuration by specifying the location of each participating site, the interconnection requirements, and the transmission paths available for the uplink (studio to satellite), downlink (satellite to audience), and voice talkback (between audience and studio) segments of the network. The model uses this information to calculate the least expensive signal distribution path for each participating site. Cost estimates are broken down by capital, installation, lease, operations, and maintenance. The design of the model permits flexibility in specifying network and cost structure.

  9. Community energy storage and distribution SCADA improvements

    International Nuclear Information System (INIS)

    Riggins, M.

    2010-01-01

    The mission of American Electric Power (AEP) is to sustain the real-time balance of energy supply and demand. Approximately 2.5 percent of energy generated in the United States (USA) is stored as pumped hydro, compressed air, or in batteries and other devices. This PowerPoint presentation discussed the use of SCADA for improving community energy storage (CES) and distribution systems. CES is a distributed fleet of small energy units connected to transformers in order to serve houses or small commercial loads. CES is operated as a fleet offering multi-megawatt (MW), multi-hour storage. The benefits of CES include backup power, flicker mitigation, and renewable integration. Benefits to the electricity grid include power factor correction, ancillary services, and load leveling at the substation level. SCADA is being used to determine when emergency load reductions are required or when emergency inspections of fans, oil pumps, or other devices are needed. An outline of AEP's monitoring system installation plan was also included. tabs., figs.

  10. Assessing the value of variational assimilation of streamflow data into distributed hydrologic models for improved streamflow monitoring and prediction at ungauged and gauged locations in the catchment

    Science.gov (United States)

    Lee, Hak Su; Seo, Dong-Jun; Liu, Yuqiong; McKee, Paul; Corby, Robert

    2010-05-01

    State updating of distributed hydrologic models via assimilation of streamflow data is subject to "overfitting" because the large dimensionality of the model's state space may render the assimilation problem seriously underdetermined. To examine the issue in the context of operational hydrology, we carried out a set of real-world experiments in which we assimilate streamflow data at interior and/or outlet locations into the gridded SAC and kinematic-wave routing models of the U.S. National Weather Service (NWS) Research Distributed Hydrologic Model (RDHM). Nine basins in the southern plains of the U.S. were used for the experiments. The experiments consist of selectively assimilating streamflow at different gauge locations, outlet and/or interior, and carrying out both dependent and independent validation. To assess the sensitivity of the quality of assimilation-aided streamflow simulation to the reduced dimensionality of the state space, we carried out data assimilation at spatially semi-distributed or lumped scales, adjusting biases in precipitation and potential evaporation at a 6-hourly or larger scale. In this talk, we present the results and findings.
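The state-updating idea can be illustrated with the simplest possible variational update: a scalar model state blended with a single streamflow observation by minimizing a quadratic cost function. The numbers are illustrative and not from the study.

```python
# Minimal 1-D variational update: blend a model background state xb with one
# observation y by minimizing the quadratic cost function
#   J(x) = (x - xb)^2 / B + (H*x - y)^2 / R
# where B and R are background and observation error variances.
# All values are illustrative, not from the study.

def var_update(xb, B, y, R, H=1.0):
    """Analytic minimizer of J for a scalar state and linear observation operator H."""
    K = B * H / (H * B * H + R)      # Kalman-style gain
    return xb + K * (y - H * xb)

# Background streamflow 80 m3/s, observed 100 m3/s, B = 16, R = 4.
xa = var_update(xb=80.0, B=16.0, y=100.0, R=4.0)
print(xa)  # gain = 16/20 = 0.8, so 80 + 0.8*20 = 96.0
```

The real assimilation problem minimizes the same kind of cost function over a far larger state vector, which is exactly why the dimensionality-reduction strategies described in the abstract matter.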

  11. Turboelectric Distributed Propulsion System Modelling

    OpenAIRE

    Liu, Chengyuan

    2013-01-01

    The Blended-Wing-Body is a conceptual aircraft design with rear-mounted, over wing engines. Turboelectric distributed propulsion system with boundary layer ingestion has been considered for this aircraft. It uses electricity to transmit power from the core turbine to the fans, therefore dramatically increases bypass ratio to reduce fuel consumption and noise. This dissertation presents methods on designing the TeDP system, evaluating effects of boundary layer ingestion, modelling engine perfo...

  12. Water Distribution and Removal Model

    International Nuclear Information System (INIS)

    Y. Deng; N. Chipman; E.L. Hardin

    2005-01-01

    The design of the Yucca Mountain high level radioactive waste repository depends on the performance of the engineered barrier system (EBS). To support the total system performance assessment (TSPA), the Engineered Barrier System Degradation, Flow, and Transport Process Model Report (EBS PMR) is developed to describe the thermal, mechanical, chemical, hydrological, biological, and radionuclide transport processes within the emplacement drifts, which includes the following major analysis/model reports (AMRs): (1) EBS Water Distribution and Removal (WD and R) Model; (2) EBS Physical and Chemical Environment (P and CE) Model; (3) EBS Radionuclide Transport (EBS RNT) Model; and (4) EBS Multiscale Thermohydrologic (TH) Model. Technical information, including data, analyses, models, software, and supporting documents will be provided to defend the applicability of these models for their intended purpose of evaluating the postclosure performance of the Yucca Mountain repository system. The WD and R model AMR is important to the site recommendation. Water distribution and removal represents one component of the overall EBS. Under some conditions, liquid water will seep into emplacement drifts through fractures in the host rock and move generally downward, potentially contacting waste packages. After waste packages are breached by corrosion, some of this seepage water will contact the waste, dissolve or suspend radionuclides, and ultimately carry radionuclides through the EBS to the near-field host rock. Lateral diversion of liquid water within the drift will occur at the inner drift surface, and more significantly from the operation of engineered structures such as drip shields and the outer surface of waste packages. If most of the seepage flux can be diverted laterally and removed from the drifts before contacting the wastes, the release of radionuclides from the EBS can be controlled, resulting in a proportional reduction in dose release at the accessible environment.

  13. Water Distribution and Removal Model

    Energy Technology Data Exchange (ETDEWEB)

    Y. Deng; N. Chipman; E.L. Hardin

    2005-08-26

    The design of the Yucca Mountain high level radioactive waste repository depends on the performance of the engineered barrier system (EBS). To support the total system performance assessment (TSPA), the Engineered Barrier System Degradation, Flow, and Transport Process Model Report (EBS PMR) is developed to describe the thermal, mechanical, chemical, hydrological, biological, and radionuclide transport processes within the emplacement drifts, which includes the following major analysis/model reports (AMRs): (1) EBS Water Distribution and Removal (WD&R) Model; (2) EBS Physical and Chemical Environment (P&CE) Model; (3) EBS Radionuclide Transport (EBS RNT) Model; and (4) EBS Multiscale Thermohydrologic (TH) Model. Technical information, including data, analyses, models, software, and supporting documents will be provided to defend the applicability of these models for their intended purpose of evaluating the postclosure performance of the Yucca Mountain repository system. The WD&R model AMR is important to the site recommendation. Water distribution and removal represents one component of the overall EBS. Under some conditions, liquid water will seep into emplacement drifts through fractures in the host rock and move generally downward, potentially contacting waste packages. After waste packages are breached by corrosion, some of this seepage water will contact the waste, dissolve or suspend radionuclides, and ultimately carry radionuclides through the EBS to the near-field host rock. Lateral diversion of liquid water within the drift will occur at the inner drift surface, and more significantly from the operation of engineered structures such as drip shields and the outer surface of waste packages. If most of the seepage flux can be diverted laterally and removed from the drifts before contacting the wastes, the release of radionuclides from the EBS can be controlled, resulting in a proportional reduction in dose release at the accessible environment.
The purposes

  14. Development of the North Pacific Ocean model for the assessment of the distribution of the radioactive materials. Improvement for formation of the North Pacific intermediate water

    International Nuclear Information System (INIS)

    Tsubono, Takaki; Misumi, Kazuhiro; Tsumune, Daisuke; Bryan, Frank

    2014-01-01

    Radioactive materials such as 137Cs were released to the North Pacific Ocean (NP) through two major pathways, direct release from the accident site and atmospheric deposition, after the accidents at the Fukushima Dai-ichi Nuclear Power Plant following the earthquake and tsunami. The behavior of these materials in the NP has received great attention since the accident. The North Pacific Model for calculating the distribution of radionuclides has been developed using the Regional Ocean Modeling System (ROMS). The model domain is the NP with an eddy-resolving grid. A series of numerical experiments suggests that the computational diffusivity caused by the advection scheme and the topography roughness are critical in representing the separation of the Kuroshio, the Kuroshio Extension, the mixed-water region between the Kuroshio Extension and the Oyashio front, and the formation of the North Pacific Intermediate Water (NPIW). The model requires a fourth-order tracer advection scheme and smoothing of the topography to address these problems. Moreover, the tidal mixing process around the straits in the North Pacific Ocean and the sea ice play important roles in reproducing the formation of low salinity water around the NPIW, as well as the isopycnal mixing process represented by an eddy-resolving model. (author)

  15. Integrating SMOS brightness temperatures with a new conceptual spatially distributed hydrological model for improving flood and drought predictions at large scale.

    Science.gov (United States)

    Hostache, Renaud; Rains, Dominik; Chini, Marco; Lievens, Hans; Verhoest, Niko E. C.; Matgen, Patrick

    2017-04-01

    Motivated by climate change and its impact on the scarcity or excess of water in many parts of the world, several agencies and research institutions have taken initiatives in monitoring and predicting the hydrologic cycle at a global scale. Such a monitoring/prediction effort is important for understanding vulnerability to extreme hydrological events and for providing early warnings. This can be based on an optimal combination of hydro-meteorological models and remote sensing, in which satellite measurements can be used as forcing or calibration data or for regularly updating the model states or parameters. Many advances have been made in these domains, and the near future will bring new opportunities with respect to remote sensing as a result of the increasing number of spaceborne sensors enabling large-scale monitoring of water resources. Alongside these advances, there is currently a tendency to refine and further complicate physically-based hydrologic models to better capture the hydrologic processes at hand. However, this may not necessarily be beneficial for large-scale hydrology, as computational efforts increase significantly. A novel thematic science question to be investigated is therefore whether a flexible conceptual model can match the performance of a complex physically-based model for hydrologic simulations at large scale. In this context, the main objective of this study is to investigate how innovative techniques that allow for the estimation of soil moisture from satellite data can help in reducing errors and uncertainties in large-scale conceptual hydro-meteorological modelling. A spatially distributed conceptual hydrologic model has been set up based on recent developments of the SUPERFLEX modelling framework. As it requires limited computational effort, this model enables early warnings for large areas.
Using as forcings the ERA-Interim public dataset and coupled with the CMEM radiative transfer model

  16. Model improvements to simulate charging in SEM

    Science.gov (United States)

    Arat, K. T.; Klimpel, T.; Hagen, C. W.

    2018-03-01

    Charging of insulators is a complex phenomenon to simulate since the accuracy of the simulations is very sensitive to the interaction of electrons with matter and electric fields. In this study, we report model improvements for a previously developed Monte Carlo simulator to more accurately simulate samples that charge. The improvements include both the modelling of low-energy electron scattering and the charging of insulators. The new first-principle scattering models provide a more realistic charge distribution cloud in the material, and a better match between non-charging simulations and experimental results. Improvements to the charging models mainly focus on redistribution of the charge carriers in the material through an electron-beam-induced conductivity (EBIC) model and a breakdown model, leading to a smoother distribution of the charges. Combined with a more accurate tracing of low-energy electrons in the electric field, we managed to reproduce the dynamically changing charging contrast due to an induced positive surface potential.

  17. Real-time modeling of heat distributions

    Science.gov (United States)

    Hamann, Hendrik F.; Li, Hongfei; Yarlanki, Srinivas

    2018-01-02

    Techniques for real-time modeling of temperature distributions based on streaming sensor data are provided. In one aspect, a method for creating a three-dimensional temperature distribution model for a room having a floor and a ceiling is provided. The method includes the following steps. A ceiling temperature distribution in the room is determined. A floor temperature distribution in the room is determined. An interpolation between the ceiling temperature distribution and the floor temperature distribution is used to obtain the three-dimensional temperature distribution model for the room.
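The interpolation step described above can be sketched as follows; the 2x2 sensor grids, temperatures, and number of height levels are invented for illustration.

```python
# A minimal sketch of the interpolation step described above: given a floor
# and a ceiling temperature distribution (here 2x2 grids of sensor-derived
# values, purely illustrative), linearly interpolate in height to build a
# 3-D temperature field for the room.

def interpolate_room(floor, ceiling, levels):
    """Return a list of 2-D grids, one per height level (index 0 = floor)."""
    rows, cols = len(floor), len(floor[0])
    field = []
    for k in range(levels):
        frac = k / (levels - 1)          # 0 at the floor, 1 at the ceiling
        layer = [[floor[i][j] + frac * (ceiling[i][j] - floor[i][j])
                  for j in range(cols)] for i in range(rows)]
        field.append(layer)
    return field

floor = [[20.0, 21.0], [20.5, 21.5]]      # deg C at floor sensors
ceiling = [[26.0, 27.0], [26.5, 27.5]]    # deg C at ceiling sensors
field = interpolate_room(floor, ceiling, levels=3)
print(field[1][0][0])  # mid-height above the first sensor: 23.0
```

A production system would refresh `floor` and `ceiling` from the streaming sensor feed and rebuild the field on each update.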

  18. Planning of distributed generation in distribution network based on improved particle swarm optimization algorithm

    Science.gov (United States)

    Li, Jinze; Qu, Zhi; He, Xiaoyang; Jin, Xiaoming; Li, Tie; Wang, Mingkai; Han, Qiu; Gao, Ziji; Jiang, Feng

    2018-02-01

    Large-scale access of distributed power can relieve current environmental pressure but, at the same time, increases the complexity and uncertainty of the overall distribution system. Rational planning of distributed power can effectively improve the system voltage level. To this end, the specific impact on distribution network power quality caused by the access of typical distributed power was analyzed, and an improved particle swarm optimization algorithm (IPSO) with a modified learning factor and inertia weight was proposed, which can solve distributed generation planning for distribution networks while improving the local and global search performance of the algorithm. Results show that the proposed method can substantially reduce the system network loss and improve the economic performance of system operation with distributed generation.
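A minimal sketch of the two IPSO ingredients named above, a linearly decreasing inertia weight and time-varying learning factors, applied to a toy quadratic objective standing in for the paper's network-loss objective. All schedule constants are illustrative.

```python
import random

# Sketch of an improved PSO: inertia weight w decreases linearly from 0.9 to
# 0.4, the cognitive factor c1 shrinks while the social factor c2 grows over
# the iterations. The sphere function below stands in for the real
# network-loss objective; all constants are illustrative.

def ipso(objective, dim, iters=200, swarm=20, seed=1):
    rng = random.Random(seed)
    w_max, w_min = 0.9, 0.4          # inertia weight schedule
    c1_i, c1_f = 2.5, 0.5            # cognitive factor: large -> small
    c2_i, c2_f = 0.5, 2.5            # social factor: small -> large
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]
    pval = [objective(p) for p in pos]
    g = min(range(swarm), key=lambda i: pval[i])
    gbest, gval = pbest[g][:], pval[g]
    for t in range(iters):
        w = w_max - (w_max - w_min) * t / iters
        c1 = c1_i + (c1_f - c1_i) * t / iters
        c2 = c2_i + (c2_f - c2_i) * t / iters
        for i in range(swarm):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            v = objective(pos[i])
            if v < pval[i]:
                pbest[i], pval[i] = pos[i][:], v
                if v < gval:
                    gbest, gval = pos[i][:], v
    return gbest, gval

best, val = ipso(lambda x: sum(xi * xi for xi in x), dim=2)
print(val)  # should be near 0
```

In the planning problem itself, each particle would encode candidate DG sizes and locations, and the objective would evaluate network loss via a power-flow calculation.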

  19. Improved Load Shedding Scheme considering Distributed Generation

    DEFF Research Database (Denmark)

    Das, Kaushik; Nitsas, Antonios; Altin, Müfit

    2017-01-01

    With high penetration of distributed generation (DG), the conventional under-frequency load shedding (UFLS) face many challenges and may not perform as expected. This article proposes new UFLS schemes, which are designed to overcome the shortcomings of traditional load shedding scheme...

  20. Supply chain solutions to improve the distribution of antiretroviral ...

    African Journals Online (AJOL)

    Recommendations to address the problems include: Implementing a supply chain planning and design process; improving inventory management and warehousing practices; implementing more effective and reliable distribution and transportation processes; as well as improving supply chain coordination and overall ...

  1. Comparative Distributions of Hazard Modeling Analysis

    Directory of Open Access Journals (Sweden)

    Rana Abdul Wajid

    2006-07-01

    In this paper we present a comparison among the distributions used in hazard analysis. Simulation techniques have been used to study the behavior of hazard distribution models. The fundamentals of hazard analysis are discussed using failure criteria. We present the flexibility of the hazard modelling distribution, which can approximate different distributions.
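The kind of comparison described can be illustrated with closed-form hazard functions: the Weibull hazard h(t) = (k/lam) * (t/lam)^(k-1) reduces to the constant exponential hazard at k = 1. The parameter values below are illustrative.

```python
# Illustrative comparison of hazard functions for two lifetime models, in the
# spirit of the distribution comparison described above. A Weibull(k, lam)
# has hazard h(t) = (k/lam) * (t/lam)**(k-1); the exponential distribution is
# the k = 1 special case with a constant hazard rate.

def weibull_hazard(t, k, lam):
    return (k / lam) * (t / lam) ** (k - 1)

for t in [0.5, 1.0, 2.0]:
    const = weibull_hazard(t, k=1.0, lam=2.0)   # constant at 0.5
    rising = weibull_hazard(t, k=2.0, lam=2.0)  # grows linearly: t/2
    print(t, const, rising)
```

With k < 1 the same formula gives a decreasing hazard, so a single Weibull family can approximate constant, increasing, and decreasing failure behaviour, which is the flexibility the abstract refers to.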

  2. The value of oxygen-isotope data and multiple discharge records in calibrating a fully-distributed, physically-based rainfall-runoff model (CRUM3) to improve predictive capability

    Science.gov (United States)

    Neill, Aaron; Reaney, Sim

    2015-04-01

    Fully-distributed, physically-based rainfall-runoff models attempt to capture some of the complexity of the runoff processes that operate within a catchment, and have been used to address a variety of issues including water quality and the effect of climate change on flood frequency. Two key issues are prevalent, however, which call into question the predictive capability of such models. The first is the issue of parameter equifinality, which can be responsible for large amounts of uncertainty. The second is whether such models make the right predictions for the right reasons - are the processes operating within a catchment correctly represented, or do the predictive abilities of these models result only from the calibration process? The use of additional data sources, such as environmental tracers, has been shown to help address both of these issues, by allowing for multi-criteria model calibration to be undertaken, and by permitting a greater understanding of the processes operating in a catchment and hence a more thorough evaluation of how well catchment processes are represented in a model. Using discharge and oxygen-18 data sets, the ability of the fully-distributed, physically-based CRUM3 model to represent the runoff processes in three sub-catchments in Cumbria, NW England has been evaluated. These catchments (Morland, Dacre and Pow) are part of the River Eden demonstration test catchment project. The oxygen-18 data set was firstly used to derive transit-time distributions and mean residence times of water for each of the catchments to gain an integrated overview of the types of processes that were operating. A generalised likelihood uncertainty estimation procedure was then used to calibrate the CRUM3 model for each catchment based on a single discharge data set from each catchment.
Transit-time distributions and mean residence times of water obtained from the model using the top 100 behavioural parameter sets for each catchment were then compared to
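The generalised likelihood uncertainty estimation (GLUE) procedure mentioned above can be sketched in a few lines: sample parameter sets, score each against observations with a likelihood measure (here Nash-Sutcliffe efficiency), and keep the "behavioural" sets above a threshold. The one-parameter toy model, synthetic data, and threshold are illustrative.

```python
import random

# GLUE-style sketch: Monte Carlo sampling of parameter sets, scored with
# Nash-Sutcliffe efficiency (NSE) against observed discharge; sets scoring
# above a behavioural threshold are retained. The one-parameter runoff
# coefficient model and the data are purely illustrative.

def nse(sim, obs):
    mean_obs = sum(obs) / len(obs)
    num = sum((s - o) ** 2 for s, o in zip(sim, obs))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den

def toy_model(k, rain):
    return [k * r for r in rain]          # stand-in rainfall-runoff model

rng = random.Random(0)
rain = [1.0, 3.0, 2.0, 5.0, 4.0]
obs = toy_model(0.6, rain)                # synthetic "observations", true k = 0.6

behavioural = []
for _ in range(1000):
    k = rng.uniform(0.0, 1.0)
    score = nse(toy_model(k, rain), obs)
    if score > 0.9:                       # behavioural threshold
        behavioural.append((k, score))

print(len(behavioural), "behavioural sets retained")
```

In the study, the retained behavioural sets are the ones whose simulated transit-time distributions can then be compared against the tracer-derived ones.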

  3. Improving the understanding of rainfall distribution and ...

    African Journals Online (AJOL)

    2016-10-04

    Oct 4, 2016 ... facilities and development of robust methods, especially geostatistically-based .... Cathedral Peak historical rainfall dataset, quality control procedures .... used to assess the predictive power of the developed model. The.

  4. Improvement of power quality using distributed generation

    Energy Technology Data Exchange (ETDEWEB)

    Moreno-Munoz, A.; Lopez-Rodriguez, M.A.; Flores-Arias, J.M.; Bellido-Outerino, F.J. [Universidad de Cordoba, Departamento A.C., Electronica y T.E., Escuela Politecnica Superior, Campus de Rabanales, E-14071 Cordoba (Spain); de-la-Rosa, J.J.G. [Universidad de Cadiz, Area de Electronica, Dpto. ISA, TE y Electronica, Escuela Politecnica Superior Avda, Ramon Puyol, S/N, E-11202-Algeciras-Cadiz (Spain); Ruiz-de-Adana, M. [Universidad de Cordoba, Departamento de Quimica Fisica y Termodinamica Aplicada, Campus de Rabanales, E-14071 Cordoba (Spain)

    2010-12-15

    This paper addresses how Distributed Generation (DG), particularly when configured in Combined Heat and Power (CHP) mode, can become a powerful reliability solution in highly automated factories, especially when integrated with complementary Power Quality (PQ) measures. The paper presents results from the PQ audit conducted at a highly automated plant over the last year. It was found that the main problems for the installed equipment were voltage sags. Among all categories of electrical disturbances, the voltage sag (dip) and momentary interruption are the nemeses of the automated industrial process. The paper analyzes the capabilities of modern electronic power supplies and the convenience of embedded solutions. Finally, the role of DG/CHP in the reliability of digital factories is addressed. (author)

  5. An improved AVC strategy applied in distributed wind power system

    Science.gov (United States)

    Zhao, Y. N.; Liu, Q. H.; Song, S. Y.; Mao, W.

    2016-08-01

    The traditional AVC strategy is mainly used in wind farms and concerns only the grid connection point, which makes it unsuitable for distributed wind power systems. Therefore, this paper proposes an improved AVC strategy for distributed wind power systems. The strategy takes all nodes of the distribution network into consideration and chooses the node with the most serious voltage deviation as the control point for calculating the reactive power reference. In addition, the distribution principles can be divided into two conditions: when wind generators access the network at a single node, the reactive power reference is distributed according to reactive power capacity; when wind generators access the network at multiple nodes, the reference is distributed according to sensitivity. Simulation results show the correctness and reliability of the strategy. Compared with the traditional control strategy, the strategy described in this paper makes full use of the generators' reactive power output capability according to the voltage condition of the distribution network and effectively improves the distribution network voltage level.
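The two dispatch rules described, worst-deviation control point selection and capacity-proportional distribution for the single-node case, can be sketched as follows. The bus voltages, generator names, and capacities are illustrative.

```python
# Sketch of the dispatch rules described above (illustrative numbers): pick
# the node with the largest per-unit voltage deviation as the control point,
# then split the required reactive power among wind generators in proportion
# to their available reactive capacity (the single-node access case).

def pick_control_node(voltages, nominal=1.0):
    """Return the bus with the worst per-unit voltage deviation."""
    return max(voltages, key=lambda bus: abs(voltages[bus] - nominal))

def distribute_by_capacity(q_ref, capacities):
    """Split the reactive power reference q_ref in proportion to capacity."""
    total = sum(capacities.values())
    return {gen: q_ref * cap / total for gen, cap in capacities.items()}

voltages = {"bus1": 0.98, "bus2": 1.03, "bus3": 0.94}   # per-unit
control = pick_control_node(voltages)
shares = distribute_by_capacity(q_ref=3.0,             # MVAr required
                                capacities={"wt1": 1.0, "wt2": 2.0})
print(control, shares)  # bus3 has the worst deviation; wt2 takes 2/3 of Q
```

The multi-node case would replace the capacity weights with voltage-sensitivity coefficients for each generator's injection point.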

  6. Improved quasi parton distribution through Wilson line renormalization

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Jiunn-Wei [Department of Physics, Center for Theoretical Sciences, and Leung Center for Cosmology and Particle Astrophysics, National Taiwan University, Taipei, 106, Taiwan (China); Center for Theoretical Physics, Massachusetts Institute of Technology, Cambridge, MA 02139 (United States); Ji, Xiangdong [INPAC, Department of Physics and Astronomy, Shanghai Jiao Tong University, Shanghai, 200240 (China); Maryland Center for Fundamental Physics, Department of Physics, University of Maryland, College Park, MD 20742 (United States); Zhang, Jian-Hui, E-mail: jianhui.zhang@physik.uni-regensburg.de [Institut für Theoretische Physik, Universität Regensburg, D-93040 Regensburg (Germany)

    2017-02-15

    Recent developments showed that hadron light-cone parton distributions could be directly extracted from spacelike correlators, known as quasi parton distributions, in the large hadron momentum limit. Unlike the normal light-cone parton distribution, a quasi parton distribution contains ultraviolet (UV) power divergence associated with the Wilson line self energy. We show that to all orders in the coupling expansion, the power divergence can be removed by a “mass” counterterm in the auxiliary z-field formalism, in the same way as the renormalization of power divergence for an open Wilson line. After adding this counterterm, the quasi quark distribution is improved such that it contains at most logarithmic divergences. Based on a simple version of discretized gauge action, we present the one-loop matching kernel between the improved non-singlet quasi quark distribution with a lattice regulator and the corresponding quark distribution in dimensional regularization.

  7. Improved quasi parton distribution through Wilson line renormalization

    Directory of Open Access Journals (Sweden)

    Jiunn-Wei Chen

    2017-02-01

    Recent developments showed that hadron light-cone parton distributions could be directly extracted from spacelike correlators, known as quasi parton distributions, in the large hadron momentum limit. Unlike the normal light-cone parton distribution, a quasi parton distribution contains ultraviolet (UV) power divergence associated with the Wilson line self energy. We show that to all orders in the coupling expansion, the power divergence can be removed by a “mass” counterterm in the auxiliary z-field formalism, in the same way as the renormalization of power divergence for an open Wilson line. After adding this counterterm, the quasi quark distribution is improved such that it contains at most logarithmic divergences. Based on a simple version of discretized gauge action, we present the one-loop matching kernel between the improved non-singlet quasi quark distribution with a lattice regulator and the corresponding quark distribution in dimensional regularization.

  8. Distributions with given marginals and statistical modelling

    CERN Document Server

    Fortiana, Josep; Rodriguez-Lallena, José

    2002-01-01

    This book contains a selection of the papers presented at the meeting `Distributions with given marginals and statistical modelling', held in Barcelona (Spain), July 17-20, 2000. In 24 chapters, this book covers topics such as the theory of copulas and quasi-copulas, the theory and compatibility of distributions, models for survival distributions and other well-known distributions, time series, categorical models, definition and estimation of measures of dependence, monotonicity and stochastic ordering, shape and separability of distributions, hidden truncation models, diagonal families, orthogonal expansions, tests of independence, and goodness of fit assessment. These topics share the use and properties of distributions with given marginals, this being the fourth specialised text on this theme. The innovative aspect of the book is the inclusion of statistical aspects such as modelling, Bayesian statistics, estimation, and tests.

  9. Photovoltaic subsystem marketing and distribution model: programming manual. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1982-07-01

    Complete documentation of the marketing and distribution (M and D) computer model is provided. The purpose is to estimate the costs of selling and transporting photovoltaic solar energy products from the manufacturer to the final customer. The model adjusts for the inflation and regional differences in marketing and distribution costs. The model consists of three major components: the marketing submodel, the distribution submodel, and the financial submodel. The computer program is explained including the input requirements, output reports, subprograms and operating environment. The program specifications discuss maintaining the validity of the data and potential improvements. An example for a photovoltaic concentrator collector demonstrates the application of the model.

  10. Assessing safety risk in electricity distribution processes using ET & BA improved technique and its ranking by VIKOR and TOPSIS models in fuzzy environment

    OpenAIRE

    S. Rahmani; M. Omidvari

    2016-01-01

    Introduction: Electrical industries are among high-risk industries. The present study aimed to assess safety risk in electricity distribution processes using the ET&BA technique and to rank the risks with the VIKOR and TOPSIS methods in fuzzy environments. Material and Methods: The present research is a descriptive study, and the ET&BA worksheet is the main data collection tool. Both fuzzy TOPSIS and fuzzy VIKOR methods were used for the worksheet analysis. Result: Findi...

  11. Property Improvement in CZT via Modeling and Processing Innovations . Te-particles in vertical gradient freeze CZT: Size and Spatial Distributions and Constitutional Supercooling

    Energy Technology Data Exchange (ETDEWEB)

    Henager, Charles H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Alvine, Kyle J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Bliss, Mary [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Riley, Brian J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Stave, Jean A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-10-01

    A section of a vertical gradient freeze CZT boule, approximately 2100 mm³ in volume with a planar area of 300 mm², was prepared and examined using transmitted IR (TIR) microscopy at various magnifications to determine the three-dimensional spatial and size distributions of Te-particles over large longitudinal and radial length scales. The boule section was approximately 50 mm wide by 60 mm in length by 7 mm thick and was doubly polished for TIR work. Te-particles were imaged through the thickness using extended focal imaging to locate the particles in thickness planes spaced 15 µm apart and then in the plane of the image using xy-coordinates of the particle center of mass, so that a true three-dimensional particle map was assembled for a 1-mm by 45-mm longitudinal strip and for a 1-mm by 50-mm radial strip. Te-particle density distributions were determined as a function of longitudinal and radial position in these strips, and treating the particles as vertices of a network created a 3D image of the particle spatial distribution. Te-particles exhibited a multi-modal log-normal size density distribution that indicated a slight preference for increasing size with longitudinal growth time, while showing a pronounced cellular network structure throughout the boule that can be correlated to dislocation network sizes in CZT. Higher magnification images revealed a typical Rayleigh-instability pearl-string morphology with large and small satellite droplets. This study includes solidification experiments in small crucibles of 30:70 mixtures of Cd:Te to reduce the melting point below 1273 K (1000°C). These solidification experiments were performed over a wide range of cooling rates and clearly demonstrated a growth instability with Te-particle capture that is suggested to be responsible for one of the peaks in the size distribution using size discrimination visualization. The results are discussed with regard to a manifold Te-particle genesis history as 1) Te

  12. A Distributional Representation Model For Collaborative Filtering

    OpenAIRE

    Junlin, Zhang; Heng, Cai; Tongwen, Huang; Huiping, Xue

    2015-01-01

    In this paper, we propose a very concise deep learning approach for collaborative filtering that jointly models distributional representations for users and items. The proposed framework obtains better performance than current state-of-the-art algorithms, making the distributional representation model a promising direction for further research in collaborative filtering.

  13. Rapid Prototyping of Formally Modelled Distributed Systems

    OpenAIRE

    Buchs, Didier; Buffo, Mathieu; Titsworth, Frances M.

    1999-01-01

    This paper presents various kinds of prototypes, used in the prototyping of formally modelled distributed systems. It presents the notions of prototyping techniques and prototype evolution, and shows how to relate them to the software life-cycle. It is illustrated through the use of the formal modelling language for distributed systems CO-OPN/2.

  14. Distributed collaborative team effectiveness: measurement and process improvement

    Science.gov (United States)

    Wheeler, R.; Hihn, J.; Wilkinson, B.

    2002-01-01

    This paper describes a measurement methodology developed for assessing the readiness, and identifying opportunities for improving the effectiveness, of distributed collaborative design teams preparing to conduct a concurrent design session.

  15. Modeling of scroll compressors - Improvements

    Energy Technology Data Exchange (ETDEWEB)

    Duprez, Marie-Eve; Dumont, Eric; Frere, Marc [Thermodynamics Department, Universite de Mons - Faculte Polytechnique, 31 bd Dolez, 7000 Mons (Belgium)

    2010-06-15

    This paper presents an improvement of a previously published scroll compressor model. The improved model allows the calculation of the refrigerant mass flow rate, power consumption, and heat flow rate that would be released at the condenser of a heat pump equipped with the compressor, from knowledge of the operating conditions and parameters. Both the basic and improved models have been tested on scroll compressors using different refrigerants. The study has been limited to compressors with a maximum electrical power of 14 kW, evaporation temperatures ranging from -40 to 15°C, and condensation temperatures from 10 to 75°C. The average discrepancies on mass flow rate, power consumption and heat flow rate are 0.50%, 0.93% and 3.49%, respectively. Using a global parameter determination (based on data for several refrigerants), the model can predict the behavior of a compressor with another fluid for which no manufacturer data are available. (author)
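Compressor-map models of this general kind are often expressed as polynomials in evaporating and condensing temperature; the sketch below assumes that form, and the coefficients are invented for illustration, not the paper's fitted parameters.

```python
# Compressor-map sketch in the spirit of the model above: mass flow rate and
# electrical power as polynomial functions of evaporating temperature Te and
# condensing temperature Tc. The six coefficients per quantity are made up
# for illustration, not fitted parameters from the paper.

def poly_map(te, tc, c):
    """Quadratic map X(Te, Tc) with 6 coefficients."""
    return (c[0] + c[1] * te + c[2] * tc
            + c[3] * te * tc + c[4] * te ** 2 + c[5] * tc ** 2)

mass_flow_coeffs = [0.05, 1.2e-3, -3.0e-4, 1.0e-6, 8.0e-6, 2.0e-6]   # kg/s
power_coeffs = [1.8, 1.5e-2, 3.0e-2, 2.0e-4, 1.0e-4, 3.0e-4]          # kW

te, tc = -10.0, 45.0   # evaporation / condensation temperatures, deg C
m_dot = poly_map(te, tc, mass_flow_coeffs)
p_el = poly_map(te, tc, power_coeffs)
print(round(m_dot, 4), round(p_el, 4))  # kg/s, kW at this operating point
```

Given the mass flow rate and the refrigerant enthalpy change across the condenser, the condenser heat flow rate mentioned in the abstract would follow as their product.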

  16. A phenomenological retention tank model using settling velocity distributions.

    Science.gov (United States)

    Maruejouls, T; Vanrolleghem, P A; Pelletier, G; Lessard, P

    2012-12-15

    Many authors have observed the influence of the settling velocity distribution on the sedimentation process in retention tanks. However, the behaviour of pollutants in such tanks is not well characterized, especially with respect to their settling velocity distribution. This paper presents a phenomenological modelling study dealing with the way the settling velocity distribution of particles in combined sewage changes between entering and leaving an off-line retention tank. The work starts from a previously published model (Lessard and Beck, 1991), which is first implemented in wastewater management modelling software and then tested with full-scale field data for the first time. Next, its performance is improved by integrating the particle settling velocity distribution and adding a description of the resuspension due to pumping during tank emptying. Finally, the potential of the improved model is demonstrated by comparing the results for one more rain event. Copyright © 2011 Elsevier Ltd. All rights reserved.
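One common way to exploit a settling velocity distribution, used here as an illustration rather than the paper's actual model, is the ideal-settling (Hazen) rule: a velocity class v is fully captured when v exceeds the tank overflow rate q = Q/A, and captured with fraction v/q otherwise. The velocity classes and mass fractions below are invented.

```python
# Illustration of using a settling velocity distribution for a tank removal
# estimate via the classical ideal-settling (Hazen) rule: a particle class
# with settling velocity v is fully captured if v >= q = Q/A (the surface
# overflow rate) and captured with fraction v/q otherwise. The classes and
# mass fractions are illustrative, not the paper's calibrated values.

def removal_efficiency(classes, flow, area):
    """classes: list of (settling velocity in m/h, mass fraction) pairs."""
    q = flow / area                      # surface overflow rate, m/h
    return sum(frac * min(1.0, v / q) for v, frac in classes)

classes = [(0.1, 0.3), (0.5, 0.3), (2.0, 0.4)]   # v (m/h), mass fraction
eta = removal_efficiency(classes, flow=100.0, area=100.0)  # q = 1 m/h
print(round(eta, 3))  # 0.3*0.1 + 0.3*0.5 + 0.4*1.0 = 0.58
```

The paper's contribution is precisely to track how this inlet distribution is transformed by sedimentation and pumping-induced resuspension, so the outlet distribution differs from the inlet one.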

  17. Business Models and Regulation | Distributed Generation Interconnection

    Science.gov (United States)

    The growing role of distributed resources in the electricity system is leading to a shift in business models and regulation for electric utilities.

  18. New trends in species distribution modelling

    Science.gov (United States)

    Zimmermann, Niklaus E.; Edwards, Thomas C.; Graham, Catherine H.; Pearman, Peter B.; Svenning, Jens-Christian

    2010-01-01

    Species distribution modelling has its origin in the late 1970s when computing capacity was limited. Early work in the field concentrated mostly on the development of methods to model effectively the shape of a species' response to environmental gradients (Austin 1987, Austin et al. 1990). The methodology and its framework were summarized in reviews 10–15 yr ago (Franklin 1995, Guisan and Zimmermann 2000), and these syntheses are still widely used as reference landmarks in the current distribution modelling literature. However, enormous advancements have occurred over the last decade, with hundreds – if not thousands – of publications on species distribution model (SDM) methodologies and their application to a broad set of conservation, ecological and evolutionary questions. With this special issue, originating from the third of a set of specialized SDM workshops (2008 Riederalp) entitled 'The Utility of Species Distribution Models as Tools for Conservation Ecology', we reflect on current trends and the progress achieved over the last decade.

  19. Improved road traffic emission inventories by adding mean speed distributions

    NARCIS (Netherlands)

    Smit, R.; Poelman, M.; Schrijver, J.

    2008-01-01

    Does consideration of average speed distributions on roads, as compared to a single mean speed, lead to different results in emission modelling of large road networks? To address this question, a post-processing method is developed to predict mean speed distributions using available traffic data from a

  20. Economic Models and Algorithms for Distributed Systems

    CERN Document Server

    Neumann, Dirk; Altmann, Jorn; Rana, Omer F

    2009-01-01

    Distributed computing models for sharing resources, such as Grids, Peer-to-Peer systems, or voluntary computing, are becoming increasingly popular. This book explores fresh avenues of research and refinements to existing technologies, aiming at the successful deployment of commercial distributed systems.

  1. Distributed simulation a model driven engineering approach

    CERN Document Server

    Topçu, Okan; Oğuztüzün, Halit; Yilmaz, Levent

    2016-01-01

    Backed by substantive case studies, the novel approach to software engineering for distributed simulation outlined in this text demonstrates the potent synergies between model-driven techniques, simulation, intelligent agents, and computer systems development.

  2. Distributed Generation Market Demand Model (dGen): Documentation

    Energy Technology Data Exchange (ETDEWEB)

    Sigrin, Benjamin [National Renewable Energy Lab. (NREL), Golden, CO (United States); Gleason, Michael [National Renewable Energy Lab. (NREL), Golden, CO (United States); Preus, Robert [National Renewable Energy Lab. (NREL), Golden, CO (United States); Baring-Gould, Ian [National Renewable Energy Lab. (NREL), Golden, CO (United States); Margolis, Robert [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2016-02-01

    The Distributed Generation Market Demand model (dGen) is a geospatially rich, bottom-up, market-penetration model that simulates the potential adoption of distributed energy resources (DERs) by residential, commercial, and industrial entities in the continental United States through 2050. The National Renewable Energy Laboratory (NREL) developed dGen to analyze the key factors that will affect future market demand for distributed solar, wind, storage, and other DER technologies in the United States. The new model builds on, extends, and replaces NREL's SolarDS model (Denholm et al. 2009a), which simulates the market penetration of distributed PV only. Unlike SolarDS, dGen can model various DER technologies under one platform: it currently can simulate the adoption of distributed solar (the dSolar module) and distributed wind (the dWind module) and can link with the ReEDS capacity expansion model (Appendix C). The underlying algorithms and datasets in dGen, which improve the representation of customer decision making as well as the spatial resolution of analyses (Figure ES-1), are also improvements over SolarDS.
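    Market-penetration models of this kind typically describe cumulative adoption with an S-shaped diffusion curve. The sketch below uses the classic Bass diffusion model with illustrative coefficients; it is a conceptual stand-in, not dGen's actual implementation.

```python
def bass_adoption(p, q, years, m=1.0):
    """Cumulative adoption under a discrete Bass diffusion model.
    p: innovation coefficient, q: imitation coefficient, m: market size."""
    f, path = 0.0, []
    for _ in range(years):
        f += (p + q * f / m) * (m - f)  # new adopters this year
        path.append(f)
    return path

# Illustrative coefficients: slow initial uptake, strong imitation effect
path = bass_adoption(p=0.01, q=0.4, years=30)
```

    The resulting curve rises slowly, accelerates as imitation takes over, and saturates near the market size.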

  3. Improving practical atmospheric dispersion models

    International Nuclear Information System (INIS)

    Hunt, J.C.R.; Hudson, B.; Thomson, D.J.

    1992-01-01

    The new generation of practical atmospheric dispersion models (for short range, ≤ 30 km) is based on dispersion science and boundary layer meteorology which have widespread international acceptance. In addition, recent improvements in computing and the widespread availability of small powerful computers make it possible to have new regulatory models which are more complex than the previous generation, which was based on charts and simple formulae. This paper describes the basis of these models and how they have developed. Such models are needed to satisfy the urgent public demand for sound, justifiable and consistent environmental decisions. For example, it is preferable that the same models are used to simulate dispersion in different industries; in many countries at present, different models are used for emissions from nuclear and fossil fuel power stations. The models should not be so simple as to be suspect, but neither should they be too complex for widespread use; for example, at public inquiries in Germany, where simple models are mandatory, it is becoming usual to cite results from highly complex computational models because the simple models are not credible. This paper is written in a schematic style with an emphasis on tables and diagrams. (au) (22 refs.)
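    Practical short-range regulatory models of this family are commonly built around the Gaussian plume solution. A minimal sketch follows; the dispersion parameters sigma_y and sigma_z are passed in directly here, whereas a real model evaluates them from downwind distance and atmospheric stability, and all numeric values are illustrative.

```python
import math

def plume_concentration(Q, u, y, z, H, sigma_y, sigma_z):
    """Ground-reflected Gaussian plume concentration (g/m^3).
    Q: emission rate (g/s), u: wind speed (m/s), H: effective stack height (m),
    y: crosswind offset (m), z: receptor height (m). sigma_y, sigma_z are the
    dispersion parameters at the downwind distance of interest."""
    lateral = math.exp(-y ** 2 / (2.0 * sigma_y ** 2))
    vertical = (math.exp(-(z - H) ** 2 / (2.0 * sigma_z ** 2))
                + math.exp(-(z + H) ** 2 / (2.0 * sigma_z ** 2)))  # ground reflection
    return Q / (2.0 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Illustrative case: 10 g/s release, 5 m/s wind, 50 m effective stack height
c_centerline = plume_concentration(Q=10.0, u=5.0, y=0.0, z=0.0, H=50.0,
                                   sigma_y=80.0, sigma_z=40.0)
```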

  4. Sample sizes and model comparison metrics for species distribution models

    Science.gov (United States)

    B.B. Hanberry; H.S. He; D.C. Dey

    2012-01-01

    Species distribution models use small samples to produce continuous distribution maps. The question of how small a sample can be while still producing an accurate model has generally been answered based on comparisons to maximum sample sizes of 200 observations or fewer. In addition, model comparisons often are made with the kappa statistic, which has become controversial....
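    The kappa statistic mentioned above measures model-observation agreement corrected for chance agreement. A minimal implementation for a 2x2 presence/absence confusion matrix (the example values are illustrative):

```python
def cohens_kappa(confusion):
    """Cohen's kappa from a 2x2 confusion matrix [[TP, FP], [FN, TN]],
    i.e. observed agreement corrected for chance agreement."""
    (tp, fp), (fn, tn) = confusion
    n = float(tp + fp + fn + tn)
    p_observed = (tp + tn) / n
    # Chance agreement from the marginal totals of predictions and observations
    p_chance = ((tp + fp) / n) * ((tp + fn) / n) + ((fn + tn) / n) * ((fp + tn) / n)
    return (p_observed - p_chance) / (1.0 - p_chance)

# Illustrative presence/absence comparison of predictions vs. observations
kappa = cohens_kappa([[40, 10], [10, 40]])
```

    Part of the controversy is visible in the formula: kappa depends on the marginal totals, so it is sensitive to species prevalence, not only to model skill.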

  5. Scaling precipitation input to spatially distributed hydrological models by measured snow distribution

    Directory of Open Access Journals (Sweden)

    Christian Vögeli

    2016-12-01

    Full Text Available Accurate knowledge of snow distribution in alpine terrain is crucial for various applications such as flood risk assessment, avalanche warning or managing water supply and hydro-power. To simulate the seasonal snow cover development in alpine terrain, the spatially distributed, physics-based model Alpine3D is suitable. The model is typically driven by spatial interpolations of observations from automatic weather stations (AWS), leading to errors in the spatial distribution of atmospheric forcing. With recent advances in remote sensing techniques, maps of snow depth can be acquired with high spatial resolution and accuracy. In this work, maps of the snow depth distribution, calculated from summer and winter digital surface models based on Airborne Digital Sensors (ADS), are used to scale precipitation input data, with the aim to improve the accuracy of simulation of the spatial distribution of snow with Alpine3D. A simple method to scale and redistribute precipitation is presented and its performance is analysed. The scaling method is only applied if it is snowing; for rainfall, the precipitation is distributed by interpolation, with a simple air temperature threshold used for the determination of the precipitation phase. It was found that the accuracy of the spatial snow distribution could be improved significantly for the simulated domain. The standard deviation of the absolute snow depth error is reduced by up to a factor of 3.4, to less than 20 cm. The mean absolute error in snow distribution was reduced when using representative input sources for the simulation domain. For inter-annual scaling, the model performance could also be improved, even when using a remote sensing dataset from a different winter. In conclusion, by using remote sensing data to process precipitation input, complex processes such as preferential snow deposition and snow relocation due to wind or avalanches can be substituted and the modelling performance of the spatial snow distribution is improved.
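    The core of the scaling step can be sketched as follows: when precipitation falls as snow, the interpolated precipitation field is redistributed in proportion to the measured snow-depth map while conserving the domain total. The 1.0 C phase threshold and the field values below are illustrative, not Alpine3D's actual configuration.

```python
SNOW_THRESHOLD_C = 1.0  # illustrative air temperature threshold for snowfall

def scale_precip(precip_cells, snow_depth_cells, air_temp_c):
    """Redistribute a precipitation field by a measured snow-depth map when it
    snows, conserving the domain total; leave the interpolated field for rain."""
    if air_temp_c >= SNOW_THRESHOLD_C:
        return list(precip_cells)  # rain: keep the interpolation as-is
    total = sum(precip_cells)
    weight_sum = sum(snow_depth_cells)
    return [total * depth / weight_sum for depth in snow_depth_cells]

precip = [2.0, 2.0, 2.0, 2.0]        # interpolated precipitation per cell (mm)
snow_depth = [0.5, 1.5, 3.0, 3.0]    # remote-sensing snow depth per cell (m)
scaled = scale_precip(precip, snow_depth, air_temp_c=-3.0)
```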

  6. Modelling refrigerant distribution in minichannel evaporators

    DEFF Research Database (Denmark)

    Brix, Wiebke

    This thesis is concerned with numerical modelling of flow distribution in a minichannel evaporator for air-conditioning. The study investigates the impact of non-uniform airflow and non-uniform distribution of the liquid and vapour phases in the inlet manifold on the refrigerant mass flow. Combining non-uniform airflow and non-uniform liquid and vapour distribution shows that a non-uniform airflow distribution can to some degree be compensated by a suitable liquid and vapour distribution. Controlling the superheat out of the individual channels to be equal results in a cooling capacity very close to the optimum. A sensitivity study considering parameter changes shows that the course of the pressure gradient in the channel is significant for the magnitude of the capacity reductions due to non-uniform liquid and vapour distribution and non-uniform airflow.

  7. Multiplicity distributions in the dual parton model

    International Nuclear Information System (INIS)

    Batunin, A.V.; Tolstenkov, A.N.

    1985-01-01

    Multiplicity distributions are calculated by means of a new mechanism of production of hadrons in a string, which was proposed previously by the authors and takes into account explicitly the valence character of the ends of the string. It is shown that allowance for this greatly improves the description of the low-energy multiplicity distributions. At superhigh energies, the contribution of the ends of the strings becomes negligibly small, but in this case multi-Pomeron contributions must be taken into account

  8. Electricity distribution management Smart Grid system model

    Directory of Open Access Journals (Sweden)

    Wiesław Nowak

    2012-06-01

    Full Text Available This paper presents issues concerning the implementation of Smart Grid solutions in a real distribution network. The main components suitable for quick implementation are presented. Realization of these ideas should bring tangible benefits to both customers and distribution system operators. Moreover, the paper shows selected research results which examine the proposed solutions in the areas of improving supply reliability and reducing energy losses in the analysed network.

  9. Improving Distributed Diagnosis Through Structural Model Decomposition

    Data.gov (United States)

    National Aeronautics and Space Administration — Complex engineering systems require efficient fault diagnosis methodologies, but centralized approaches do not scale well, and this motivates the development of...

  10. Mathematical Models for Room Air Distribution

    DEFF Research Database (Denmark)

    Nielsen, Peter V.

    1982-01-01

    A number of different models of air distribution in rooms are introduced. These include the throw model, a model for the penetration length of a cold wall jet, and a model for maximum velocity used in the dimensioning of an air distribution system in highly loaded rooms, which shows that the amount of heat removed from the room at constant penetration length is proportional to the cube of the velocities in the occupied zone. It is also shown that a large number of diffusers increases the amount of heat which may be removed without affecting the thermal conditions. Control strategies for dual duct and single duct systems are given, and the paper concludes by mentioning a computer-based prediction method which gives the velocity and temperature distribution in the whole room.
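    A worked consequence of the cubic relation reported above: at constant penetration length, doubling the allowable occupied-zone velocity multiplies the removable heat load by 2 cubed, i.e. a factor of 8. The velocity values are illustrative.

```python
def heat_removal_ratio(u_new, u_ref):
    """Ratio of removable heat loads implied by Q proportional to u**3."""
    return (u_new / u_ref) ** 3

# Doubling an occupied-zone velocity limit from 0.15 to 0.30 m/s
ratio = heat_removal_ratio(0.30, 0.15)
```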

  11. Mathematical Models for Room Air Distribution - Addendum

    DEFF Research Database (Denmark)

    Nielsen, Peter V.

    1982-01-01

    A number of different models of air distribution in rooms are introduced. These include the throw model, a model for the penetration length of a cold wall jet, and a model for maximum velocity used in the dimensioning of an air distribution system in highly loaded rooms, which shows that the amount of heat removed from the room at constant penetration length is proportional to the cube of the velocities in the occupied zone. It is also shown that a large number of diffusers increases the amount of heat which may be removed without affecting the thermal conditions. Control strategies for dual duct and single duct systems are given, and the paper concludes by mentioning a computer-based prediction method which gives the velocity and temperature distribution in the whole room.

  12. Distributed modelling of shallow landslides triggered by intense rainfall

    Directory of Open Access Journals (Sweden)

    G. B. Crosta

    2003-01-01

    Full Text Available Hazard assessment of shallow landslides represents an important aspect of land management in mountainous areas. Among all the methods proposed in the literature, physically based methods are the only ones that explicitly include the dynamic factors that control landslide triggering (rainfall pattern, land-use). For this reason, they allow forecasting both the temporal and the spatial distribution of shallow landslides. Physically based methods for shallow landslides are based on the coupling of the infinite slope stability analysis with hydrological models. Three different grid-based distributed hydrological models are presented in this paper: a steady state model, a transient "piston-flow" wetting front model, and a transient diffusive model. A comparative test of these models was performed by simulating the landslides triggered by a rainfall event (27–28 June 1997) that caused hundreds of shallow landslides within Lecco province (central Southern Alps, Italy). In order to test the potential of a completely distributed model for rainfall-triggered landslides, radar-detected rainfall intensity was used. A new procedure for quantitative evaluation of distributed model performance is presented and used in this paper. The diffusive model proves the best for simulating shallow landslide triggering after a rainfall event like the one analysed. Finally, the radar data available for the June 1997 event permitted greatly improving the simulation; in particular, they helped explain the non-uniform distribution of landslides within the study area.
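    The stability side of such a coupled model is the infinite-slope factor of safety evaluated with the saturation predicted by the hydrological model. A minimal sketch with illustrative soil parameters (not those of the Lecco case study):

```python
import math

def factor_of_safety(slope_deg, m, c=5000.0, phi_deg=30.0, z=2.0,
                     gamma_s=18000.0, gamma_w=9810.0):
    """Infinite-slope factor of safety.
    slope_deg: slope angle; m: saturated fraction of the soil column (0..1,
    supplied by the hydrological model); c: cohesion (Pa); phi_deg: friction
    angle; z: soil depth (m); gamma_s, gamma_w: soil/water unit weights (N/m^3)."""
    theta, phi = math.radians(slope_deg), math.radians(phi_deg)
    effective_normal = (gamma_s - m * gamma_w) * z * math.cos(theta) ** 2
    resisting = c + effective_normal * math.tan(phi)
    driving = gamma_s * z * math.sin(theta) * math.cos(theta)
    return resisting / driving

fs_dry = factor_of_safety(slope_deg=35.0, m=0.0)  # unsaturated column
fs_wet = factor_of_safety(slope_deg=35.0, m=1.0)  # fully saturated column
```

    A cell is flagged as potentially unstable when the factor of safety drops below 1 during the rainfall event; with these parameters that happens only at full saturation.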

  13. Framing Feedback for School Improvement around Distributed Leadership

    Science.gov (United States)

    Kelley, Carolyn; Dikkers, Seann

    2016-01-01

    Purpose: The purpose of this article is to examine the utility of framing formative feedback to improve school leadership with a focus on task-based evaluation of distributed leadership rather than on role-based evaluation of an individual leader. Research Methods/Approach: Using data from research on the development of the Comprehensive…

  14. Improved Root Normal Size Distributions for Liquid Atomization

    Science.gov (United States)

    2015-11-01


  15. Improvements of evaporation drag model

    International Nuclear Information System (INIS)

    Li Xiaoyan; Yang Yanhua; Xu Jijun

    2004-01-01

    A special observable experiment facility has been established, and a series of experiments have been carried out on this facility by pouring one or several high-temperature particles into a water pool. The experiments have verified the evaporation drag model, which holds that the non-symmetric profile of the local evaporation rate and the local density of the vapor produces a resultant force on the hot particle that resists its motion. However, in Yang's evaporation drag model, radiation heat transfer is taken as the only way to transfer heat from the hot particle to the vapor-liquid interface, and all of the radiation energy is assumed to be deposited on the vapor-liquid interface, thus contributing to the vaporization rate and mass balance of the vapor film. Heat conduction and heat convection are therefore taken into account in the improved model. In addition, the improved model presented in this paper includes calculations of the effect of hot-particle temperature on the radiation absorption behavior of water.

  16. Improvements in ECN Wake Model

    Energy Technology Data Exchange (ETDEWEB)

    Versteeg, M.C. [University of Twente, Enschede (Netherlands); Ozdemir, H.; Brand, A.J. [ECN Wind Energy, Petten (Netherlands)

    2013-08-15

    Wind turbines extract energy from the flow field, so the flow in the wake of a wind turbine contains less energy and more turbulence than the undisturbed flow, leading to less energy extraction by the downstream turbines. In large wind farms, most turbines are located in the wake of one or more other turbines, causing the flow characteristics felt by these turbines to differ considerably from the free stream conditions. The most important wake effect is generally considered to be the lower wind speed behind the turbine(s), since this decreases the energy production and thus the economic performance of a wind farm. The overall loss of a wind farm depends strongly on the conditions and the layout of the farm, but it can be in the order of 5-10%. Apart from the loss in energy production, an additional wake effect is the increase in turbulence intensity, which leads to higher fatigue loads. It is therefore important to understand the details of wake behavior in order to improve and/or optimize a wind farm layout. This study presents improvements to the existing ECN wake model, which forms the fundamental basis of ECN's FarmFlow wind farm wake simulation tool. The outline of this paper is as follows: first, the governing equations of the ECN wake farm model are presented. Then the near wake modelling is discussed, the results are compared with the original near wake modelling and with EWTW (ECN Wind Turbine Test Site Wieringermeer) data, and the results obtained for various near wake implementation cases are shown. The details of the atmospheric stability model are given and compared with the solution obtained for the original surface layer model and with the available EWTW measurements. Finally, the conclusions are summarized.
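    For orientation, the basic quantity all wake models of this family compute is the velocity deficit behind a rotor and its recovery with downstream distance. The classic Jensen (Park) model, not ECN's FarmFlow formulation, gives the simplest version; all parameter values are illustrative.

```python
def jensen_deficit(ct, rotor_d, x, k=0.05):
    """Fractional wind-speed deficit at distance x downstream of a rotor.
    ct: thrust coefficient, rotor_d: rotor diameter (m),
    k: wake decay constant (illustrative value for onshore conditions)."""
    r0 = rotor_d / 2.0
    rx = r0 + k * x  # linearly expanding wake radius
    return (1.0 - (1.0 - ct) ** 0.5) * (r0 / rx) ** 2

u_free = 10.0  # undisturbed wind speed, m/s
deficit = jensen_deficit(ct=0.8, rotor_d=80.0, x=400.0)
u_wake = u_free * (1.0 - deficit)  # wind speed felt by a downstream turbine
```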

  17. A Hierarchy Model of Income Distribution

    OpenAIRE

    Fix, Blair

    2018-01-01

    Based on worldly experience, most people would agree that firms are hierarchically organized, and that pay tends to increase as one moves up the hierarchy. But how this hierarchical structure affects income distribution has not been widely studied. To remedy this situation, this paper presents a new model of income distribution that explores the effects of social hierarchy. This ‘hierarchy model’ takes the limited available evidence on the structure of firm hierarchies and generalizes it to c...

  18. Modeling a Distribution of Mortgage Credit Losses

    Czech Academy of Sciences Publication Activity Database

    Gapko, Petr; Šmíd, Martin

    2012-01-01

    Roč. 60, č. 10 (2012), s. 1005-1023 ISSN 0013-3035 R&D Projects: GA ČR GD402/09/H045; GA ČR(CZ) GBP402/12/G097 Grant - others:Univerzita Karlova(CZ) 46108 Institutional research plan: CEZ:AV0Z10750506 Institutional support: RVO:67985556 Keywords : credit risk * mortgage * delinquency rate * generalized hyperbolic distribution * normal distribution Subject RIV: AH - Economics Impact factor: 0.194, year: 2012 http://library.utia.cas.cz/separaty/2013/E/smid-modeling a distribution of mortgage credit losses.pdf

  19. Modeling a Distribution of Mortgage Credit Losses

    Czech Academy of Sciences Publication Activity Database

    Gapko, Petr; Šmíd, Martin

    2010-01-01

    Roč. 23, č. 23 (2010), s. 1-23 R&D Projects: GA ČR GA402/09/0965; GA ČR GD402/09/H045 Grant - others:Univerzita Karlova - GAUK(CZ) 46108 Institutional research plan: CEZ:AV0Z10750506 Keywords : Credit Risk * Mortgage * Delinquency Rate * Generalized Hyperbolic Distribution * Normal Distribution Subject RIV: AH - Economics http://library.utia.cas.cz/separaty/2010/E/gapko-modeling a distribution of mortgage credit losses-ies wp.pdf

  20. Advanced Distribution Network Modelling with Distributed Energy Resources

    Science.gov (United States)

    O'Connell, Alison

    The addition of new distributed energy resources, such as electric vehicles, photovoltaics, and storage, to low voltage distribution networks means that these networks will undergo major changes in the future. Traditionally, distribution systems have been a passive part of the wider power system, delivering electricity to the customer and needing little control or management. However, the introduction of these new technologies may cause unforeseen issues for distribution networks, because the networks were not designed with such resources in mind. This thesis examines the different types of technologies that may begin to emerge on distribution systems, as well as the challenges they may impose. Three-phase models of distribution networks are developed and subsequently utilised as test cases. Various management strategies are devised to control distributed resources from a distribution network perspective; their aim is to mitigate the issues that distributed resources may cause while also keeping customers' preferences in mind. A rolling optimisation formulation is proposed as an operational tool which can manage distributed resources while accounting for the uncertainties that these resources may present. Network sensitivities for a particular feeder are extracted from a three-phase load flow methodology and incorporated into an optimisation. Electric vehicles are the focus of the work, although the method could be applied to other types of resources. The aim is to minimise the cost of electric vehicle charging over a 24-hour time horizon by controlling the charge rates and timings of the vehicles. The results demonstrate the advantage that controlled EV charging can have over an uncontrolled case, as well as the benefits provided by the rolling formulation and updated inputs in terms of cost and energy delivered to customers.
Building upon the rolling optimisation, a
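    The scheduling idea can be illustrated with a much simpler price-based heuristic: deliver the required energy in the cheapest hours subject to a charger limit. The tariff and limits below are illustrative; the actual work uses a rolling optimisation with three-phase network sensitivities.

```python
def schedule_charging(prices, energy_needed_kwh, max_rate_kw=7.0):
    """Fill the cheapest hours first until the required energy is delivered.
    prices: 24 hourly tariffs; returns (per-hour energy in kWh, total cost)."""
    plan = [0.0] * len(prices)
    remaining = energy_needed_kwh
    for hour in sorted(range(len(prices)), key=lambda h: prices[h]):
        if remaining <= 0.0:
            break
        plan[hour] = min(max_rate_kw, remaining)  # 1 h slots, so kW -> kWh
        remaining -= plan[hour]
    cost = sum(e * p for e, p in zip(plan, prices))
    return plan, cost

prices = [0.10] * 7 + [0.30] * 12 + [0.15] * 5  # illustrative overnight tariff
plan, cost = schedule_charging(prices, energy_needed_kwh=20.0)
```

    A full formulation adds what this greedy sketch ignores: voltage and thermal limits via network sensitivities, and re-solving each step as forecasts update.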

  1. Dynamic models for distributed generation resources

    Energy Technology Data Exchange (ETDEWEB)

    Morched, A.S. [BPR Energie, Sherbrooke, PQ (Canada)

    2010-07-01

    Distributed resources can impact the performance of host power systems during both normal and abnormal system conditions. This PowerPoint presentation discussed the use of dynamic models for identifying potential interaction problems between interconnected systems. The models were designed to simulate steady state behaviour as well as transient responses to system disturbances. The distributed generators included directly coupled and electronically coupled generators. The directly coupled generator was driven by wind turbines. Simplified models of grid-side inverters, electronically coupled wind generators and doubly-fed induction generators (DFIGs) were presented. The responses of DFIGs to wind variations were evaluated. Synchronous machine and electronically coupled generator responses were compared. The system model components included load models, generators, protection systems, and system equivalents. Frequency responses to islanding events were reviewed. The study demonstrated that accurate simulations are needed to predict the impact of distributed generation resources on the performance of host systems. Advances in distributed generation technology have outpaced the development of models needed for integration studies. tabs., figs.

  2. Improved side information generation for distributed video coding

    DEFF Research Database (Denmark)

    Huang, Xin; Forchhammer, Søren

    2008-01-01

    As a new coding paradigm, distributed video coding (DVC) deals with lossy source coding using side information, exploiting the statistics at the decoder to reduce computational demands at the encoder. The performance of DVC depends strongly on the quality of the side information: with a better side information generation method, fewer bits will be requested from the encoder and more reliable decoded frames will be obtained. In this paper, a side information generation method is introduced to further improve the rate-distortion (RD) performance of transform domain distributed video coding. This algorithm...

  3. Improved modelling of independent parton hadronization

    International Nuclear Information System (INIS)

    Biddulph, P.; Thompson, G.

    1989-01-01

    A modification is proposed to current versions of the Field-Feynman ansatz for the hadronization of a quark in Monte Carlo models of QCD interactions. The faster-running algorithm has no additional parameters and imposes a better degree of energy conservation. It naturally introduces a limitation of the transverse momentum distribution, similar to the experimentally observed ''seagull'' effect. There is now much improved conservation of quantum numbers between the original parton and the resultant hadrons, and the momentum of the emitted parton is better preserved in the summed momentum vectors of the final state particles. (orig.)

  4. Distributed leadership, team working and service improvement in healthcare.

    Science.gov (United States)

    Boak, George; Dickens, Victoria; Newson, Annalisa; Brown, Louise

    2015-01-01

    The purpose of this paper is to analyse the introduction of distributed leadership and team working in a therapy department in a healthcare organisation and to explore the factors that enabled the introduction to be successful. The paper uses a case study methodology: qualitative and quantitative information was gathered from one physiotherapy department over a period of 24 months. Distributed leadership and team working were central to a number of system changes initiated by the department, which led to improvements in patient waiting times for therapy. The paper identifies six factors that appear to have influenced the successful introduction of distributed leadership and team working in this case. This is a single case study; it would be interesting to explore whether these factors are found in other cases where distributed leadership is introduced in healthcare organisations. The paper provides an example of a successful introduction of distributed leadership which has had a positive impact on services to patients. Other therapy teams may consider how the approach may be adopted or adapted to their own circumstances. Although distributed leadership is thought to be important in healthcare, particularly when organisational change is needed, there are very few studies of the practicalities of how it can be introduced.

  5. Distributed model predictive control made easy

    CERN Document Server

    Negenborn, Rudy

    2014-01-01

    The rapid evolution of computer science, communication, and information technology has enabled the application of control techniques to systems beyond the possibilities of control theory just a decade ago. Critical infrastructures such as electricity, water, traffic and intermodal transport networks are now in the scope of control engineers. The sheer size of such large-scale systems requires the adoption of advanced distributed control approaches. Distributed model predictive control (MPC) is one of the promising control methodologies for control of such systems.   This book provides a state-of-the-art overview of distributed MPC approaches, while at the same time making clear directions of research that deserve more attention. The core and rationale of 35 approaches are carefully explained. Moreover, detailed step-by-step algorithmic descriptions of each approach are provided. These features make the book a comprehensive guide both for those seeking an introduction to distributed MPC as well as for those ...

  6. Applications of species distribution modeling to paleobiology

    DEFF Research Database (Denmark)

    Svenning, Jens-Christian; Fløjgaard, Camilla; Marske, Katharine Ann

    2011-01-01

    Species distribution modeling (SDM: statistical and/or mechanistic approaches to the assessment of range determinants and prediction of species occurrence) offers new possibilities for estimating and studying past organism distributions. SDM complements fossil and genetic evidence by providing (i) quantitative and potentially high-resolution predictions of past organism distributions, (ii) statistically formulated, testable ecological hypotheses regarding past distributions and communities, and (iii) statistical assessment of range determinants. In this article, we provide an overview of applications including late-Pleistocene megafaunal extinctions, past community assembly, human paleobiogeography, Holocene paleoecology, and even deep-time biogeography (notably, providing insights into biogeographic dynamics >400 million years ago). We discuss important assumptions and uncertainties that affect the SDM approach to paleobiology.

  7. Modeling Word Burstiness Using the Dirichlet Distribution

    DEFF Research Database (Denmark)

    Madsen, Rasmus Elsborg; Kauchak, David; Elkan, Charles

    2005-01-01

    Multinomial distributions are often used to model text documents. However, they do not capture well the phenomenon that words in a document tend to appear in bursts: if a word appears once, it is more likely to appear again. In this paper, we propose the Dirichlet compound multinomial model (DCM......) as an alternative to the multinomial. The DCM model has one additional degree of freedom, which allows it to capture burstiness. We show experimentally that the DCM is substantially better than the multinomial at modeling text data, measured by perplexity. We also show using three standard document collections...
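    The burstiness mechanism is easy to see generatively: each document first draws its own word distribution from a Dirichlet prior, then draws words from it, so a word drawn once tends to recur within that document. A small sampling sketch (vocabulary size and concentration parameters are illustrative):

```python
import random

random.seed(0)

def dcm_document(alphas, length):
    """Sample one document's word counts from a Dirichlet compound multinomial."""
    gammas = [random.gammavariate(a, 1.0) for a in alphas]
    total = sum(gammas)
    theta = [g / total for g in gammas]  # this document's own word distribution
    counts = [0] * len(alphas)
    for _ in range(length):
        r, acc = random.random(), 0.0
        for w, p in enumerate(theta):
            acc += p
            if r < acc:
                counts[w] += 1
                break
        else:
            counts[-1] += 1  # guard against floating-point round-off
    return counts

# Small alphas -> each document concentrates on a few words (burstiness)
doc = dcm_document(alphas=[0.1] * 10, length=50)
```

    With small concentration parameters, most of a document's mass lands on a handful of words, which a plain multinomial with a shared theta cannot reproduce.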

  8. Overhead distribution line models for harmonics studies

    Energy Technology Data Exchange (ETDEWEB)

    Nagpal, M.; Xu, W.; Dommel, H.W.

    1994-01-01

    Carson's formulae and Maxwell's potential coefficients are used for calculating the per unit length series impedances and shunt capacitances of the overhead lines. The per unit length values are then used to build the line models (nominal pi-circuit and equivalent pi-circuit) at the harmonic frequencies. This paper studies the accuracy of these models for representing overhead distribution lines in steady-state harmonic solutions at frequencies up to 5 kHz. The models are verified with a field test on a 25 kV distribution line, and the sensitivity of the models to ground resistivity, skin effect, and multiple grounding is reported.
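    The frequency scaling behind such studies can be sketched for a nominal pi-circuit: at harmonic order h, the series reactance and shunt susceptance scale with h, while resistance is held constant in this simplification (the paper notes that skin effect in fact makes R frequency-dependent). The per-km values below are illustrative, not from the field test.

```python
def nominal_pi(h, r_ohm, x_ohm, b_siemens):
    """Nominal pi-circuit parameters of a line section at harmonic order h.
    Returns (series impedance Z, each shunt admittance Y/2). Reactance and
    susceptance scale with h; R is held constant in this simplification."""
    z = complex(r_ohm, h * x_ohm)
    y_half = complex(0.0, h * b_siemens / 2.0)
    return z, y_half

# Illustrative per-km values for a short overhead feeder section
z1, y1 = nominal_pi(1, r_ohm=0.3, x_ohm=0.4, b_siemens=3e-6)
z5, y5 = nominal_pi(5, r_ohm=0.3, x_ohm=0.4, b_siemens=3e-6)
```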

  9. A DISTRIBUTED HYPERMAP MODEL FOR INTERNET GIS

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    The rapid development of Internet technology makes it possible to integrate GIS with the Internet, forming Internet GIS. Internet GIS is based on a distributed client/server architecture and TCP/IP & IIOP. When constructing and designing Internet GIS, we face the problem of how to express information units of Internet GIS. In order to solve this problem, this paper presents a distributed hypermap model for Internet GIS. This model provides a solution to organize and manage Internet GIS information units. It also illustrates relations between two information units and within an internal information unit, both on clients and servers. On the basis of this model, the paper contributes to the expressions of hypermap relations and hypermap operations. The usage of this model is shown in the implementation of a prototype system.

  10. Spatial distribution of emissions to air - the SPREAD model

    Energy Technology Data Exchange (ETDEWEB)

    Plejdrup, M S; Gyldenkaerne, S

    2011-04-15

    The National Environmental Research Institute (NERI), Aarhus University, completes the annual national emission inventories for greenhouse gases and air pollutants according to Denmark's obligations under international conventions, e.g. the climate convention, UNFCCC and the convention on long-range transboundary air pollution, CLRTAP. NERI has developed a model to distribute emissions from the national emission inventories on a 1x1 km grid covering the Danish land and sea territory. The new spatial high resolution distribution model for emissions to air (SPREAD) has been developed according to the requirements for reporting of gridded emissions to CLRTAP. Spatial emission data is e.g. used as input for air quality modelling, which again serves as input for assessment and evaluation of health effects. For these purposes distributions with higher spatial resolution have been requested. Previously, a distribution on the 17x17 km EMEP grid has been set up and used in research projects combined with detailed distributions for a few sectors or sub-sectors e.g. a distribution for emissions from road traffic on 1x1 km resolution. SPREAD is developed to generate improved spatial emission data for e.g. air quality modelling in exposure studies. SPREAD includes emission distributions for each sector in the Danish inventory system; stationary combustion, mobile sources, fugitive emissions from fuels, industrial processes, solvents and other product use, agriculture and waste. This model enables generation of distributions for single sectors and for a number of sub-sectors and single sources as well. This report documents the methodologies in this first version of SPREAD and presents selected results. Further, a number of potential improvements for later versions of SPREAD are addressed and discussed. (Author)
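
The core gridding step of a model like SPREAD can be sketched as proxy-weighted allocation of a national sector total to cells. The cell ids, the proxy (road length for road-traffic emissions) and the total below are invented; the real model uses many sector-specific distribution keys.

```python
def spread_emissions(national_total, proxy_weights):
    """Distribute a national sector total onto grid cells in proportion to a
    spatial proxy (e.g. road length per 1x1 km cell for road traffic)."""
    total_weight = sum(proxy_weights.values())
    if total_weight == 0:
        raise ValueError("proxy carries no weight")
    return {cell: national_total * w / total_weight
            for cell, w in proxy_weights.items()}

# Hypothetical proxy values per cell:
road_km_per_cell = {"cell_001": 12.0, "cell_002": 3.0, "cell_003": 0.0}
grid = spread_emissions(1000.0, road_km_per_cell)  # e.g. tonnes of NOx
# The gridded values always sum back to the reported national total.
```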


  12. Improvement of axial power distribution synthesis methodology in CPC

    International Nuclear Information System (INIS)

    Kim, H. H.; Gee, S. G.; Kim, Y. B.; In, W. K.

    2003-01-01

    The capability of axial power distribution synthesis in the CPC plays an important role in determining the DNBR and LPD trips initiated by the CPC. The axial power distribution is synthesized using a cubic spline function based on the three excore detector signals. The axial power distributions are categorized into 8 function sets, and each set is stored as pre-calculated values in the CPC to save calculation time. In this study, additional function sets, real break-point function sets and a polynomial function are suggested to evaluate the possibility of improving the synthesis capability of the CPC. In addition, RMS errors are compared and evaluated for each synthesis method. As a result, it was confirmed that the function sets stored in the CPC were not optimal. The analysis showed that the RMS error could be reduced by selecting the proper function sets suggested in this study
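
A generic natural cubic spline through three detector readings (bottom, middle, top) illustrates the synthesis idea; the knot positions, the normalization and the CPC's pre-calculated function sets are not modelled here.

```python
def natural_cubic_spline_3pt(x, y):
    """Natural cubic spline through three knots (second derivative zero at the
    ends). Returns a callable s(t). Knot positions and readings are
    illustrative stand-ins for the excore detector geometry."""
    x0, x1, x2 = x
    y0, y1, y2 = y
    h0, h1 = x1 - x0, x2 - x1
    # Only the interior second derivative M1 is unknown (M0 = M2 = 0):
    m1 = 6.0 * ((y2 - y1) / h1 - (y1 - y0) / h0) / (2.0 * (h0 + h1))

    def s(t):
        if t <= x1:  # segment [x0, x1], M0 = 0
            a, b, h, ya, yb, ma, mb = x0, x1, h0, y0, y1, 0.0, m1
        else:        # segment [x1, x2], M2 = 0
            a, b, h, ya, yb, ma, mb = x1, x2, h1, y1, y2, m1, 0.0
        return (ma * (b - t) ** 3 / (6 * h) + mb * (t - a) ** 3 / (6 * h)
                + (ya - ma * h * h / 6) * (b - t) / h
                + (yb - mb * h * h / 6) * (t - a) / h)
    return s

# Bottom / middle / top excore signals (normalized, hypothetical):
synth = natural_cubic_spline_3pt([0.0, 0.5, 1.0], [0.9, 1.2, 0.8])
```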

  13. Improving Distribution Resiliency with Microgrids and State and Parameter Estimation

    Energy Technology Data Exchange (ETDEWEB)

    Tuffner, Francis K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Williams, Tess L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Schneider, Kevin P. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Elizondo, Marcelo A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Sun, Yannan [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Liu, Chen-Ching [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Xu, Yin [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Gourisetti, Sri Nikhil Gup [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-09-30

    Modern society relies on low-cost, reliable electrical power, both to maintain industry and to provide basic social services to the populace. When major disturbances occur, such as Hurricane Katrina or Hurricane Sandy, the nation’s electrical infrastructure can experience significant outages. To help prevent the spread of these outages, as well as to facilitate faster restoration afterwards, various approaches to improving the resiliency of the power system are needed. Two such approaches are breaking the system into smaller microgrid sections, and gaining improved insight into operations to detect failures or mis-operations before they become critical. By breaking the system into smaller microgrid islands, power can be maintained in areas where distributed generation and energy storage resources are still available but bulk power generation is no longer connected. Additionally, microgrid systems can maintain service to local pockets of customers when there has been extensive damage to the local distribution system. However, microgrids are grid-connected the majority of the time, and implementing and operating a grid-connected microgrid is much different from operating an islanded one. This report discusses work conducted by the Pacific Northwest National Laboratory that developed improvements to simulation tools to capture the characteristics of microgrids and how they can be used to develop new operational strategies. These operational strategies reduce the cost of microgrid operation and increase the reliability and resilience of the nation’s electricity infrastructure. In addition to the ability to break the system into microgrids, improved observability into the state of the distribution grid can make the power system more resilient. State estimation on the transmission system already provides great insight into grid operations and detects abnormal conditions by leveraging existing measurements. These transmission-level approaches are expanded to using
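
The state-estimation idea reduces to a minimal linear sketch: redundant measurements, a weighted least-squares solve, and residuals that can flag bad data. The two-state "DC" measurement model and all numbers are hypothetical; a distribution-class estimator handles many states, unbalanced phases and nonlinear power-flow equations.

```python
def wls_state_estimate(H, z, weights):
    """Weighted least-squares estimate for a two-state linear measurement model.

    H holds measurement rows (h1, h2), z the measured values, and weights the
    inverse measurement variances 1/sigma^2.
    """
    a11 = a12 = a22 = b1 = b2 = 0.0
    for (h1, h2), zi, wi in zip(H, z, weights):
        a11 += wi * h1 * h1
        a12 += wi * h1 * h2
        a22 += wi * h2 * h2
        b1 += wi * h1 * zi
        b2 += wi * h2 * zi
    det = a11 * a22 - a12 * a12      # invert the 2x2 normal equations
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)

# Three redundant measurements of two states (numbers hypothetical):
H = [(1.0, 0.0), (0.0, 1.0), (1.0, -1.0)]
z = [1.02, 0.97, 0.06]                       # third is slightly inconsistent
x = wls_state_estimate(H, z, weights=[100.0, 100.0, 50.0])
residuals = [zi - (h1 * x[0] + h2 * x[1]) for (h1, h2), zi in zip(H, z)]
# Large normalized residuals would flag bad data or mis-operation.
```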

  14. Programming model for distributed intelligent systems

    Science.gov (United States)

    Sztipanovits, J.; Biegl, C.; Karsai, G.; Bogunovic, N.; Purves, B.; Williams, R.; Christiansen, T.

    1988-01-01

    A programming model and architecture which was developed for the design and implementation of complex, heterogeneous measurement and control systems is described. The Multigraph Architecture integrates artificial intelligence techniques with conventional software technologies, offers a unified framework for distributed and shared memory based parallel computational models and supports multiple programming paradigms. The system can be implemented on different hardware architectures and can be adapted to strongly different applications.

  15. Modeling soil water content for vegetation modeling improvement

    Science.gov (United States)

    Cianfrani, Carmen; Buri, Aline; Zingg, Barbara; Vittoz, Pascal; Verrecchia, Eric; Guisan, Antoine

    2016-04-01

    Soil water content (SWC) is known to be important for plants as it affects the physiological processes regulating plant growth. SWC therefore controls plant distribution over the Earth's surface, from deserts and grassland to rain forests. Unfortunately, few SWC data are available, as its measurement is very time consuming and costly and requires specific laboratory tools. The scarcity of SWC measurements in geographic space makes it difficult to model and spatially project SWC over larger areas. In particular, it prevents its inclusion in plant species distribution models (SDMs) as a predictor. The aims of this study were, first, to test a new methodology allowing the problem of the scarcity of SWC measurements to be overcome and, second, to model and spatially project SWC in order to improve plant SDMs with the inclusion of a SWC parameter. The study was developed in four steps. First, SWC was modeled by measuring it at 10 different pressures (expressed in pF and ranging from pF=0 to pF=4.2). The different pF values represent different degrees of soil water availability for plants. An ensemble of bivariate models was built to overcome the problem of having only a few SWC measurements (n = 24) but several predictors to include in the model. Soil texture (clay, silt, sand), organic matter (OM), topographic variables (elevation, aspect, convexity), climatic variables (precipitation) and hydrological variables (river distance, NDWI) were used as predictors. Weighted ensemble models were built using only bivariate models with adjusted-R2 > 0.5 for each SWC at different pF. The second step consisted of running plant SDMs including modeled SWC jointly with the conventional topo-climatic variables used for plant SDMs. Third, SDMs were run using only the conventional topo-climatic variables. Finally, comparing the models obtained in the second and third steps allowed assessing the additional predictive power of SWC in plant SDMs. 
SWC ensemble models remained very good, with
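
The weighted-ensemble step described above (bivariate models retained when adjusted R² > 0.5, predictions weighted by adjusted R²) can be sketched as follows; the predictor values, SWC readings and the new site are invented.

```python
def fit_bivariate(xs, ys):
    """OLS fit y = a + b*x; returns (a, b) and adjusted R^2 (one predictor)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx
    a = my - b * mx
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    r2 = 1.0 - ss_res / ss_tot
    return (a, b), 1.0 - (1.0 - r2) * (n - 1) / (n - 2)

def weighted_ensemble(predictors, y, new_point, threshold=0.5):
    """One bivariate model per predictor; models with adjusted R^2 above the
    threshold are kept and their predictions averaged, weighted by adjusted R^2."""
    preds, ws = [], []
    for name, xs in predictors.items():
        (a, b), adj_r2 = fit_bivariate(xs, y)
        if adj_r2 > threshold:
            preds.append(a + b * new_point[name])
            ws.append(adj_r2)
    if not ws:
        raise ValueError("no bivariate model passed the threshold")
    return sum(w * p for w, p in zip(ws, preds)) / sum(ws)

# Hypothetical SWC values at one pF and two candidate predictors:
swc = [10.0, 14.0, 19.0, 25.0, 31.0]
predictors = {"clay": [5.0, 10.0, 15.0, 20.0, 25.0],        # informative
              "aspect": [120.0, 80.0, 200.0, 150.0, 40.0]}  # uninformative here
estimate = weighted_ensemble(predictors, swc, {"clay": 12.0, "aspect": 90.0})
```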

  16. Beyond a climate-centric view of plant distribution: edaphic variables add value to distribution models.

    Science.gov (United States)

    Beauregard, Frieda; de Blois, Sylvie

    2014-01-01

    Both climatic and edaphic conditions determine plant distribution; however, many species distribution models do not include edaphic variables, especially over large geographical extents. Using an exceptional database of vegetation plots (n = 4839) covering an extent of ∼55,000 km2, we tested whether the inclusion of fine-scale edaphic variables would improve model predictions of plant distribution compared to models using only climate predictors. We also tested how well these edaphic variables could predict distribution on their own, to evaluate the assumption that at large extents, distribution is governed largely by climate. We also hypothesized that the relative contribution of edaphic and climatic data would vary among species depending on their growth forms and biogeographical attributes within the study area. We modelled 128 native plant species from diverse taxa using four statistical model types and three sets of abiotic predictors: climate, edaphic, and edaphic-climate. Model predictive accuracy and variable importance were compared among these models and for species' characteristics describing growth form, range boundaries within the study area, and prevalence. For many species both the climate-only and edaphic-only models performed well, but the edaphic-climate models generally performed best. The three sets of predictors differed in the spatial information provided about habitat suitability, with climate models able to distinguish range edges, but edaphic models better able to distinguish within-range variation. Model predictive accuracy was generally lower for species without a range boundary within the study area and for common species, but these effects were buffered by including both edaphic and climatic predictors. The relative importance of edaphic and climatic variables varied with growth form, with trees being more related to climate whereas lower growth forms were more related to edaphic conditions. Our study identifies the potential
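
One common way to compare such predictor sets is AUC, the probability that a presence site outranks an absence site under the model's suitability scores. The scores below are invented to mimic the reported pattern (edaphic-climate best); they are not taken from the study.

```python
def auc(presence_scores, absence_scores):
    """Mann-Whitney AUC: probability a random presence outranks a random
    absence (ties count one half)."""
    wins = 0.0
    for sp in presence_scores:
        for sn in absence_scores:
            if sp > sn:
                wins += 1.0
            elif sp == sn:
                wins += 0.5
    return wins / (len(presence_scores) * len(absence_scores))

# Hypothetical suitability scores at the same presence/absence plots:
climate_only = {"presence": [0.60, 0.70, 0.40, 0.55],
                "absence": [0.50, 0.45, 0.65, 0.30]}
edaphic_climate = {"presence": [0.80, 0.75, 0.50, 0.70],
                   "absence": [0.45, 0.40, 0.60, 0.30]}

auc_climate = auc(climate_only["presence"], climate_only["absence"])
auc_combined = auc(edaphic_climate["presence"], edaphic_climate["absence"])
```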


  18. Distributed SLAM Using Improved Particle Filter for Mobile Robot Localization

    Directory of Open Access Journals (Sweden)

    Fujun Pei

    2014-01-01

    Full Text Available The distributed SLAM system has similar estimation performance to, and requires only one-fifth of the computation time of, a centralized particle filter. However, particle impoverishment is inevitable because of the random particle prediction and resampling applied in generic particle filters, especially in the SLAM problem, which involves a large number of dimensions. In this paper, the particle filter used in distributed SLAM was improved in two aspects. First, we improved the importance function of the local filters. Adaptive values were used to replace a set of constants in the computation of the importance function, which improved the robustness of the particle filter. Second, an information fusion method was proposed that mixes the innovation method and the effective-particle-number method, combining the advantages of the two. This paper also extends previously known convergence results for particle filters to prove that the improved particle filter converges to the optimal filter in mean square as the number of particles goes to infinity. The experimental results show that the proposed algorithm improves the ability of the DPF-SLAM system to isolate faults and enables the system to have better tolerance and robustness.
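
The effective-particle-number criterion mentioned above, together with a standard systematic resampling step, can be sketched as follows. This is generic particle-filter machinery, not the paper's adaptive importance function.

```python
import random

def effective_particle_number(weights):
    """N_eff = 1 / sum(w_i^2) for normalized weights; a small value signals
    particle impoverishment."""
    return 1.0 / sum(w * w for w in weights)

def systematic_resample(particles, weights, rng):
    """Systematic resampling: a single uniform offset and evenly spaced pointers."""
    n = len(particles)
    offset = rng.random() / n
    cumulative, c = [], 0.0
    for w in weights:
        c += w
        cumulative.append(c)
    out, j = [], 0
    for i in range(n):
        p = offset + i / n
        while j < n - 1 and cumulative[j] < p:
            j += 1
        out.append(particles[j])
    return out

weights = [0.70, 0.10, 0.10, 0.05, 0.05]
if effective_particle_number(weights) < len(weights) / 2:   # a common trigger
    particles = systematic_resample(list(range(5)), weights, random.Random(7))
```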

  19. Modelling Dynamic Forgetting in Distributed Information Systems

    NARCIS (Netherlands)

    N.F. Höning (Nicolas); M.C. Schut

    2010-01-01

    We describe and model a new aspect in the design of distributed information systems. We build upon a previously described problem on the microlevel, which asks how quickly agents should discount (forget) their experience: If they cherish their memories, they can build their reports on
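
The micro-level forgetting question can be phrased as an exponential discounting update; the forgetting rates and observations below are arbitrary illustrations, not the paper's model.

```python
def discounted_update(estimate, observation, forgetting_rate):
    """Exponentially discount old experience; forgetting_rate in (0, 1].
    A high rate tracks a changing environment quickly but forgets history."""
    return (1.0 - forgetting_rate) * estimate + forgetting_rate * observation

# An agent tracking a quantity that jumps from 0 to 1:
fast = slow = 0.0
for obs in [1.0, 1.0, 1.0, 1.0]:
    fast = discounted_update(fast, obs, 0.5)
    slow = discounted_update(slow, obs, 0.1)
# fast has nearly converged to 1.0; slow still lags well behind.
```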

  20. Comparison of sparse point distribution models

    DEFF Research Database (Denmark)

    Erbou, Søren Gylling Hemmingsen; Vester-Christensen, Martin; Larsen, Rasmus

    2010-01-01

    This paper compares several methods for obtaining sparse and compact point distribution models suited for data sets containing many variables. These are evaluated on a database consisting of 3D surfaces of a section of the pelvic bone obtained from CT scans of 33 porcine carcasses. The superior m...

  1. A Distributive Model of Treatment Acceptability

    Science.gov (United States)

    Carter, Stacy L.

    2008-01-01

    A model of treatment acceptability is proposed that distributes overall treatment acceptability into three separate categories of influence. The categories comprise societal influences, consultant influences, and influences associated with consumers of treatments. Each of these categories is defined and their inter-relationships within…

  2. Finessing atlas data for species distribution models

    NARCIS (Netherlands)

    Niamir, A.; Skidmore, A.K.; Toxopeus, A.G.; Munoz, A.R.; Real, R.

    2011-01-01

    Aim The spatial resolution of species atlases and therefore resulting model predictions are often too coarse for local applications. Collecting distribution data at a finer resolution for large numbers of species requires a comprehensive sampling effort, making it impractical and expensive. This

  3. Value-based distributed generator placements for service quality improvements

    Energy Technology Data Exchange (ETDEWEB)

    Teng, Jen-Hao; Chen, Chi-Fa [Department of Electrical Engineering, I-Shou University, No. 1, Section 1, Syuecheng Road, Dashu Township, Kaohsiung Country 840 (Taiwan); Liu, Yi-Hwa [Department of Electrical Engineering, National Taiwan University of Science and Technology, Taipei (Taiwan); Chen, Chia-Yen [Department of Computer Science, The University of Auckland (New Zealand)

    2007-03-15

    Distributed generator (DG) resources are small, self-contained electric generating plants that can provide power to homes, businesses or industrial facilities in distribution feeders. They can be used to reduce power loss and improve service reliability. However, the value of DGs depends largely on their types, sizes and locations as installed in distribution feeders. A value-based method is proposed in this paper to enhance reliability and quantify the benefits of DG placement. The benefits of DG placement described in this paper include power cost saving, power loss reduction and reliability enhancement. The costs of DG placement include investment, maintenance and operating costs. The proposed value-based method seeks the best tradeoff between the costs and benefits of DG placement and then finds the optimal types of DG and their corresponding locations and sizes in distribution feeders. The derived formulations are solved by a genetic-algorithm-based method. Test results show that with proper selection of types, sizes and installation sites, DG placement can be used to improve system reliability, reduce customer interruption costs and save power cost, enabling electric utilities to obtain the maximal economic benefits. (author)
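
A minimal genetic-algorithm sketch of the cost/benefit tradeoff: candidate buses carry invented annual benefits and costs, fitness is net value, and a small elitist GA searches over placement plans. The real method evaluates losses, reliability and power cost through feeder calculations rather than fixed per-bus numbers.

```python
import random

# Hypothetical per-site annual benefit (power-cost saving + loss reduction +
# reliability gain) and cost (investment + O&M), in arbitrary monetary units:
BENEFIT = {3: 120.0, 7: 95.0, 12: 140.0, 18: 60.0, 25: 110.0}
COST = {3: 80.0, 7: 90.0, 12: 100.0, 18: 70.0, 25: 65.0}
BUSES = sorted(BENEFIT)

def fitness(plan):
    """Net value of a placement plan (tuple of 0/1 flags per candidate bus)."""
    return sum((BENEFIT[b] - COST[b]) * g for b, g in zip(BUSES, plan))

def genetic_search(pop_size=20, generations=60, p_mut=0.1, seed=1):
    rng = random.Random(seed)
    pop = [tuple(rng.randint(0, 1) for _ in BUSES) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]           # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, len(BUSES))   # one-point crossover
            child = list(a[:cut] + b[cut:])
            for i in range(len(child)):          # bit-flip mutation
                if rng.random() < p_mut:
                    child[i] = 1 - child[i]
            children.append(tuple(child))
        pop = parents + children
    return max(pop, key=fitness)

best = genetic_search()   # places DG only where benefit exceeds cost
```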

  4. Estimating Predictive Variance for Statistical Gas Distribution Modelling

    International Nuclear Information System (INIS)

    Lilienthal, Achim J.; Asadi, Sahar; Reggente, Matteo

    2009-01-01

    Recent publications in statistical gas distribution modelling have proposed algorithms that model both the mean and the variance of a distribution. This paper argues that estimating the predictive concentration variance is not merely a gradual improvement but a significant step to advance the field. First, such models much better fit the particular structure of gas distributions, which exhibit strong fluctuations with considerable spatial variations as a result of the intermittent character of gas dispersal. Second, estimating the predictive variance makes it possible to evaluate the model quality in terms of the data likelihood. This offers a solution to the problem of ground-truth evaluation, which has always been a critical issue for gas distribution modelling. It also enables solid comparisons of different modelling approaches, and provides the means to learn meta-parameters of the model, to determine when the model should be updated or re-initialised, or to suggest new measurement locations based on the current model. We also point out directions of related ongoing or potential future research work.
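
The likelihood-based evaluation argument can be made concrete: given a predictive mean and variance at a held-out location, the log-likelihood rewards a model whose variance matches the fluctuations. The Gaussian form and all numbers here are assumed for illustration only.

```python
import math

def gaussian_loglik(y, mean, var):
    """Log-likelihood of an observation under a Gaussian predictive distribution."""
    return -0.5 * (math.log(2.0 * math.pi * var) + (y - mean) ** 2 / var)

# Two models predict the same mean at a held-out location, but differ in
# predictive variance (numbers invented):
y_obs = 4.0
ll_overconfident = gaussian_loglik(y_obs, mean=2.5, var=0.1)
ll_calibrated = gaussian_loglik(y_obs, mean=2.5, var=2.0)
# The model whose variance reflects the concentration fluctuations scores a
# higher likelihood, even though both share the same mean map.
```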

  5. Improved Quantum Artificial Fish Algorithm Application to Distributed Network Considering Distributed Generation.

    Science.gov (United States)

    Du, Tingsong; Hu, Yang; Ke, Xianting

    2015-01-01

    An improved quantum artificial fish swarm algorithm (IQAFSA) for solving distributed network programming considering distributed generation is proposed in this work. The IQAFSA is based on quantum computing, which offers exponential acceleration for heuristic algorithms; it uses quantum bits to encode the artificial fish and applies a quantum rotation gate, together with the preying behavior, following behavior and variation of the quantum artificial fish, to update the fish while searching for the optimal value. We then apply the proposed new algorithm, the quantum artificial fish swarm algorithm (QAFSA), the basic artificial fish swarm algorithm (BAFSA) and the global edition artificial fish swarm algorithm (GAFSA) in simulation experiments on some typical test functions. The simulation results demonstrate that the proposed algorithm can escape from local extrema effectively and has a higher convergence speed and better accuracy. Finally, applying IQAFSA to distributed network problems, the simulation results for a 33-bus radial distribution network system show that IQAFSA achieves the minimum power loss in comparison with BAFSA, GAFSA and QAFSA.

  6. Regulatory Improvements for Effective Integration of Distributed Generation into Electricity Distribution Networks

    International Nuclear Information System (INIS)

    Scheepers, M.J.J.; Jansen, J.C.; De Joode, J.; Bauknecht, D.; Gomez, T.; Pudjianto, D.; Strbac, G.; Ropenus, S.

    2007-11-01

    The growth of distributed electricity supply from renewable energy sources (RES-E) and combined heat and power (CHP) - so-called distributed generation (DG) - can cause technical problems for electricity distribution networks. These integration problems can be overcome by reinforcing the network. Many European Member States apply network regulation that does not account for the impact of DG growth on network costs. Passing network integration costs on to the DG operator who is responsible for these extra costs may result in discrimination between different DG plants and between DG and large power generation. Therefore, in many regulatory systems distribution system operators (DSOs) are not compensated for the DG integration costs. The DG-GRID project analysed technical and economic barriers to the integration of distributed generation into electricity distribution networks. The project looked into the impact of a high DG deployment on electricity distribution system costs and the impact on the financial position of the DSO. Several ways of improving network regulation in order to compensate DSOs for the increasing DG penetration were identified and tested. The DG-GRID project also looked into stimulating network innovations through economic regulation. The project was co-financed by the European Commission and carried out by nine European universities and research institutes. This report summarises the project results and is based on a number of DG-GRID reports that describe the conducted analyses and their results.

  7. Modelling simple helically delivered dose distributions

    International Nuclear Information System (INIS)

    Fenwick, John D; Tome, Wolfgang A; Kissick, Michael W; Mackie, T Rock

    2005-01-01

    In a previous paper, we described quality assurance procedures for Hi-Art helical tomotherapy machines. Here, we develop further some ideas discussed briefly in that paper. Simple helically generated dose distributions are modelled, and relationships between these dose distributions and underlying characteristics of Hi-Art treatment systems are elucidated. In particular, we describe the dependence of dose levels along the central axis of a cylinder aligned coaxially with a Hi-Art machine on fan beam width, couch velocity and helical delivery lengths. The impact on these dose levels of angular variations in gantry speed or output per linear accelerator pulse is also explored
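
The couch-velocity and fan-beam-width dependence can be sketched with a toy overlap model: a point's dose is the dose rate times the time the fan beam covers it as the couch moves, which reproduces a central plateau and ramp regions at the ends of the delivery. Gantry-speed and per-pulse output variations discussed in the paper are ignored, and all numbers are invented.

```python
def axial_dose_profile(z_points, rate, width, velocity, delivery_length):
    """Dose along the central axis of a coaxial cylinder for a helical delivery.

    A point at z receives dose while the fan beam (axial width `width`) sweeps
    past it at couch speed `velocity`: rate / velocity times the overlap of
    [z - width/2, z + width/2] with the swept interval [0, delivery_length].
    """
    doses = []
    for z in z_points:
        overlap = min(z + width / 2.0, delivery_length) - max(z - width / 2.0, 0.0)
        doses.append(rate / velocity * max(overlap, 0.0))
    return doses

# Plateau dose is rate * width / velocity; the ends ramp over one beam width:
profile = axial_dose_profile([0.0, 5.0, 10.0], rate=1.0, width=2.0,
                             velocity=0.5, delivery_length=10.0)
```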

  8. A void distribution model-flashing flow

    International Nuclear Information System (INIS)

    Riznic, J.; Ishii, M.; Afgan, N.

    1987-01-01

    A new model for flashing flow based on wall nucleation is proposed here, and the model predictions are compared with some experimental data. In order to calculate the bubble number density, the bubble number transport equation with a distributed source from the wall nucleation sites was used. It was thus possible to avoid the usual assumption of a constant bubble number density. Comparisons of the model with the data show that the model, based on a nucleation site density correlation, appears acceptable for describing vapor generation in flashing flow. For the limited data examined, the comparisons show rather satisfactory agreement without using a floating parameter to adjust the model. This result indicates that, at least for the experimental conditions considered here, mechanistic prediction of the flashing phenomenon is possible with the present wall-nucleation-based model
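
The bubble number transport idea can be sketched as a one-dimensional balance with a distributed wall source; the constant nucleation flux and the geometry below are placeholders for the model's nucleation-site density correlation.

```python
def bubble_number_density(length, steps, velocity, site_flux, perimeter, area,
                          n_inlet=0.0):
    """Euler integration of u * dn/dz = J_w * P / A along the channel.

    site_flux J_w [bubbles/(m^2 s)] stands in for the wall nucleation-site
    density correlation; holding it constant keeps the sketch simple.
    """
    dz = length / steps
    n = n_inlet
    profile = [n]
    for _ in range(steps):
        n += dz * site_flux * perimeter / (area * velocity)
        profile.append(n)
    return profile

densities = bubble_number_density(length=1.0, steps=100, velocity=2.0,
                                  site_flux=1.0e6, perimeter=0.05, area=2.0e-4)
# n grows along the channel instead of being assumed constant.
```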

  9. Improving the accuracy of livestock distribution estimates through spatial interpolation.

    Science.gov (United States)

    Bryssinckx, Ward; Ducheyne, Els; Muhwezi, Bernard; Godfrey, Sunday; Mintiens, Koen; Leirs, Herwig; Hendrickx, Guy

    2012-11-01

    Animal distribution maps serve many purposes, such as estimating the transmission risk of zoonotic pathogens to both animals and humans. The reliability and usability of such maps is highly dependent on the quality of the input data. However, decisions on how to perform livestock surveys are often based on previous work without considering possible consequences. A better understanding of the impact of using different sample designs and processing steps on the accuracy of livestock distribution estimates was acquired through iterative experiments using detailed survey data. The importance of sample size, sample design and aggregation is demonstrated, and spatial interpolation is presented as a potential way to improve cattle number estimates. As expected, results show that an increasing sample size increased the precision of cattle number estimates, but these improvements were mainly seen when the initial sample size was relatively low (e.g. a median relative error decrease of 0.04% per sampled parish for sample sizes below 500 parishes). For higher sample sizes, the added value of further increasing the number of samples declined rapidly (e.g. a median relative error decrease of 0.01% per sampled parish for sample sizes above 500 parishes). When a two-stage stratified sample design was applied to yield more evenly distributed samples, accuracy levels were higher for low sample densities and stabilised at lower sample sizes compared to one-stage stratified sampling. Aggregating the resulting cattle number estimates yielded significantly more accurate results because under- and over-estimates average out (e.g. when aggregating cattle number estimates from subcounty to district level). When spatial interpolation is used to fill in missing values in non-sampled areas, accuracy is improved remarkably. This holds especially for low sample sizes and spatially evenly distributed samples (e.g. P <0.001 for a sample of 170 parishes using one-stage stratified sampling and aggregation on district level
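
Inverse distance weighting is one simple spatial interpolator of the kind evaluated for filling in non-sampled areas; the coordinates and cattle numbers below are invented.

```python
def idw_estimate(samples, target, power=2.0):
    """Inverse-distance-weighted estimate at an unsampled location.

    samples: list of ((x, y), value); target: (x, y). The quadratic distance
    decay (power=2) is a conventional default, not the study's choice.
    """
    num = den = 0.0
    for (x, y), value in samples:
        d2 = (x - target[0]) ** 2 + (y - target[1]) ** 2
        if d2 == 0.0:
            return value              # exact hit on a sampled location
        w = d2 ** (-power / 2.0)
        num += w * value
        den += w
    return num / den

# Hypothetical sampled parish centroids with cattle counts:
sampled_parishes = [((0.0, 0.0), 120.0), ((10.0, 0.0), 80.0), ((0.0, 10.0), 100.0)]
cattle_estimate = idw_estimate(sampled_parishes, (5.0, 0.0))
```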

  10. Improving the cooling performance of electrical distribution transformer using transformer oil-based MEPCM suspension

    OpenAIRE

    Mushtaq Ismael Hasan

    2017-01-01

    In this paper, the electrical distribution transformer has been studied numerically and the effect of outside temperature on its cooling performance has been investigated. The temperature range studied covers hot climate regions. A 250 kVA distribution transformer is chosen as the study model. A novel cooling fluid is proposed to improve the cooling performance of this transformer: a transformer oil-based microencapsulated phase change material suspension is used with volume concentration (5–25...

  11. Lightning Performance on Overhead Distribution Lines : After Improvement Field Observation

    Directory of Open Access Journals (Sweden)

    Reynaldo Zoro

    2009-11-01

    Full Text Available Two feeders of 20 kV overhead distribution lines located in a high lightning density area were chosen for field observation because of their good lightning performance after improvement of the lightning protection system. These two feeders used a new overhead ground wire and new line arresters equipped with lightning counters on the main lines. A significant reduction in line outages is reported. A study was carried out to assess these improvements by comparison with two other feeders, located in the nearby area, that were not improved and not yet equipped with the ground wire and line arresters. Two cameras were installed to record the trajectories of lightning strikes on the improved lines. Lightning peak currents were measured using a magnetic tape measurement system installed on the grounding lead of the lightning arrester. Lightning overvoltage calculations were carried out using several scenarios based on the observation results and historical lightning data derived from a lightning detection network. Lightning overvoltages caused by indirect or direct strikes were analyzed to obtain the lightning performance of the lines. The best scenario was chosen, and the performance of the lines was improved significantly by installing an overhead ground wire and improving the lightning arrester installation.

  12. Model for the angular distribution of sky radiance

    Energy Technology Data Exchange (ETDEWEB)

    Hooper, F C; Brunger, A P

    1979-08-01

    A flexible mathematical model is introduced which describes the radiance of the dome of the sky under various conditions. This three-component continuous distribution (TCCD) model is compounded by the superposition of three separate terms, the isotropic, circumsolar and horizon brightening terms, each representing the contribution of a particular sky characteristic. In use, a particular sky condition is characterized by the values of the coefficients of each of these three terms, defining the distribution of the total diffuse component. The TCCD model has been demonstrated to fit both the normalized clear sky data and the normalized overcast sky data with an RMS error of about ten percent of the mean overall sky radiance. By extension the model could describe variable or partly clouded sky conditions. The model can aid in improving the prediction of solar collector performance.
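    The three-term superposition can be sketched in code. The functional forms and coefficient values below are illustrative assumptions for the purpose of the sketch, not the published TCCD parametrization; a real application would fit the coefficients to measured radiance data.

```python
import math

def sky_radiance(psi, theta, a=1.0, b=2.0, c=3.0, d=0.5, e=1.5):
    """Toy three-component sky radiance (illustrative forms only):
    an isotropic term, a circumsolar term decaying with angular distance
    from the sun (psi), and a horizon-brightening term decaying with
    elevation above the horizon (theta). All angles in radians."""
    isotropic = a
    circumsolar = b * math.exp(-c * psi)   # peaks toward the sun
    horizon = d * math.exp(-e * theta)     # brightens near the horizon
    return isotropic + circumsolar + horizon

# One "sky condition" is characterized by its coefficients; evaluating
# the sum at two points shows the horizon-brightening contribution
r_zenith = sky_radiance(psi=1.0, theta=math.pi / 2)
r_horizon = sky_radiance(psi=1.0, theta=0.0)
```

    In this spirit, fitting reduces to choosing the coefficients so that the summed terms reproduce the measured radiance distribution with minimal RMS error.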

  13. Modeling error distributions of growth curve models through Bayesian methods.

    Science.gov (United States)

    Zhang, Zhiyong

    2016-06-01

    Growth curve models are widely used in social and behavioral sciences. However, typical growth curve models often assume that the errors are normally distributed, although non-normal data may be even more common than normal data. In order to avoid possible statistical inference problems in blindly assuming normality, a general Bayesian framework is proposed to flexibly model normal and non-normal data through the explicit specification of the error distributions. A simulation study shows that when the distribution of the error is correctly specified, one can avoid the loss in the efficiency of standard error estimates. A real example on the analysis of mathematical ability growth data from the Early Childhood Longitudinal Study, Kindergarten Class of 1998-99 is used to show the application of the proposed methods. Instructions and code on how to conduct growth curve analysis with both normal and non-normal error distributions using the MCMC procedure of SAS are provided.
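    Why an explicitly specified error distribution matters can be illustrated with a toy likelihood comparison; the residuals, unit scale, and degrees of freedom below are invented for illustration and are not from the study's data.

```python
import math

def log_normal_pdf(x, sigma=1.0):
    # log-density of a N(0, sigma^2) error
    return -0.5 * math.log(2 * math.pi * sigma ** 2) - x ** 2 / (2 * sigma ** 2)

def log_t_pdf(x, df=3.0):
    # Student-t log-density, a common heavy-tailed error model
    c = (math.lgamma((df + 1) / 2) - math.lgamma(df / 2)
         - 0.5 * math.log(df * math.pi))
    return c - (df + 1) / 2 * math.log(1 + x ** 2 / df)

# Hypothetical residuals from a growth-curve fit, with one outlier
residuals = [0.1, -0.2, 0.15, 5.0]
ll_normal = sum(log_normal_pdf(r) for r in residuals)
ll_t = sum(log_t_pdf(r) for r in residuals)
# The heavy-tailed error model is far less penalized by the outlier
```

    With the outlier present, the Student-t log-likelihood is markedly higher than the normal one, which is exactly the situation where blindly assuming normality distorts inference.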

  14. Modelling refrigerant distribution in microchannel evaporators

    DEFF Research Database (Denmark)

    Brix, Wiebke; Kærn, Martin Ryhl; Elmegaard, Brian

    2009-01-01

    The effects of refrigerant maldistribution in parallel evaporator channels on the heat exchanger performance are investigated numerically. For this purpose a 1D steady state model of refrigerant R134a evaporating in a microchannel tube is built and validated against other evaporator models. A study of the refrigerant distribution is carried out for two channels in parallel and for two different cases. In the first case maldistribution of the inlet quality into the channels is considered, and in the second case a non-uniform airflow on the secondary side is considered. In both cases the total mixed superheat out of the evaporator is kept constant. It is shown that the cooling capacity of the evaporator is reduced significantly, both in the case of unevenly distributed inlet quality and for the case of non-uniform airflow on the outside of the channels.

  15. Statistical models based on conditional probability distributions

    International Nuclear Information System (INIS)

    Narayanan, R.S.

    1991-10-01

    We present a formulation of statistical mechanics models based on conditional probability distribution rather than a Hamiltonian. We show that it is possible to realize critical phenomena through this procedure. Closely linked with this formulation is a Monte Carlo algorithm, in which a configuration generated is guaranteed to be statistically independent from any other configuration for all values of the parameters, in particular near the critical point. (orig.)
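    The key property claimed above, that each generated configuration is statistically independent of every other, can be sketched for a toy one-dimensional spin chain; the conditional form used here (each spin depends only on its predecessor, with an alignment probability set by a coupling parameter) is an illustrative assumption, not the paper's model.

```python
import random

def sample_chain(n, coupling=0.5, rng=None):
    """Draw one spin configuration by sampling each spin directly from its
    conditional distribution given the previous spin (toy model). Because
    each configuration is built from fresh random draws rather than by
    perturbing a previous state, successive calls are independent."""
    rng = rng or random.Random()
    spins = [rng.choice([-1, 1])]
    for _ in range(n - 1):
        # P(s_i = s_{i-1}) rises with the coupling strength
        p_align = 0.5 + coupling / 2.0
        s = spins[-1] if rng.random() < p_align else -spins[-1]
        spins.append(s)
    return spins

rng = random.Random(42)
configs = [sample_chain(20, coupling=0.8, rng=rng) for _ in range(100)]
```

    Unlike a Markov-chain Monte Carlo sampler, no burn-in or decorrelation time is needed here, which mirrors the independence property highlighted in the abstract.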

  16. A distributed snow-evolution modeling system (SnowModel)

    Science.gov (United States)

    Glen E. Liston; Kelly. Elder

    2006-01-01

    SnowModel is a spatially distributed snow-evolution modeling system designed for application in landscapes, climates, and conditions where snow occurs. It is an aggregation of four submodels: MicroMet defines meteorological forcing conditions, EnBal calculates surface energy exchanges, SnowPack simulates snow depth and water-equivalent evolution, and SnowTran-3D...

  17. Building a generalized distributed system model

    Science.gov (United States)

    Mukkamala, R.

    1993-01-01

    The key elements in the 1992-93 period of the project are the following: (1) extensive use of the simulator to implement and test concurrency control algorithms, an interactive user interface, and replica control algorithms; and (2) investigations into the applicability of data and process replication in real-time systems. In the 1993-94 period of the project, we intend to accomplish the following: (1) concentrate on efforts to investigate the effects of data and process replication on hard and soft real-time systems, especially the impact of semantic-based consistency control schemes on a distributed real-time system in terms of improved reliability, improved availability, better resource utilization, and reduced missed task deadlines; and (2) use the prototype to verify the theoretically predicted performance of locking protocols, etc.

  18. Reliability Estimation of Aero-engine Based on Mixed Weibull Distribution Model

    Science.gov (United States)

    Yuan, Zhongda; Deng, Junxiang; Wang, Dawei

    2018-02-01

    An aero-engine is a complex mechanical-electronic system, and in the reliability analysis of such systems the Weibull distribution model plays an irreplaceable role. To date, only the two-parameter and three-parameter Weibull distribution models are widely used. Owing to the diversity of engine failure modes, a single Weibull distribution model carries a large error. By contrast, a mixed Weibull distribution model can take a variety of engine failure modes into account, making it a good statistical analysis model. In addition to the concept of a dynamic weight coefficient, a three-parameter correlation coefficient optimization method is applied to enhance the Weibull distribution model so that the reliability estimation is more accurate; the precision of the mixed distribution reliability model is thereby greatly improved. All of this is advantageous for popularizing the Weibull distribution model in engineering applications.
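    A mixed Weibull reliability function with weight coefficients can be sketched as follows; the two failure modes and all parameter values are illustrative assumptions, not fitted engine data.

```python
import math

def mixed_weibull_reliability(t, components):
    """R(t) = sum_i w_i * exp(-(t / eta_i) ** beta_i), with the weights
    w_i summing to 1. Each component can represent a distinct failure
    mode with its own shape (beta) and scale (eta) parameters."""
    assert abs(sum(w for w, _, _ in components) - 1.0) < 1e-9
    return sum(w * math.exp(-((t / eta) ** beta))
               for w, beta, eta in components)

# Two hypothetical failure modes: early wear-in (beta < 1) and
# late wear-out (beta > 1), as (weight, beta, eta) triples
modes = [(0.3, 0.8, 500.0),
         (0.7, 3.0, 2000.0)]
r_1000 = mixed_weibull_reliability(1000.0, modes)
```

    A fitting procedure along the lines of the abstract would then optimize the weights and the three parameters of each component against observed failure times.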

  19. Reconsideration of mass-distribution models

    Directory of Open Access Journals (Sweden)

    Ninković S.

    2014-01-01

    Full Text Available The mass-distribution model proposed by Kuzmin and Veltmann (1973) is revisited. It is subdivided into two models which have a common case; only one of them is the subject of the present study. The study focuses on the relation between the density ratio (the central density to that at the core radius) and the total-mass fraction within the core radius. The latter is an increasing function of the former, but it cannot exceed one quarter, which takes place when the density ratio tends to infinity. Therefore, the model is extended by representing the density as a sum of two components. The extension makes it possible to have a correspondence between an infinite density ratio and a 100% total-mass fraction. The number of parameters in the extended model exceeds that of the original model. Due to this, in the extended model, the correspondence between the density ratio and total-mass fraction is no longer one-to-one; several values of the total-mass fraction can correspond to the same value of the density ratio. In this way, the extended model could explain the contingency of having two or more groups of real stellar systems (subsystems) in the diagram of total-mass fraction versus density ratio. [Projekat Ministarstva nauke Republike Srbije, br. 176011: Dynamics and Kinematics of Celestial Bodies and Systems

  20. ATLAS Distributed Computing Operations: Experience and improvements after 2 full years of data-taking

    International Nuclear Information System (INIS)

    Jézéquel, S; Stewart, G

    2012-01-01

    This paper summarizes operational experience and improvements in ATLAS computing infrastructure in 2010 and 2011. ATLAS has had 2 periods of data taking, with many more events recorded in 2011 than in 2010. It ran 3 major reprocessing campaigns. The activity in 2011 was similar to 2010, but scalability issues had to be addressed due to the increase in luminosity and trigger rate. Based on improved monitoring of ATLAS Grid computing, the evolution of computing activities (data/group production, their distribution and grid analysis) over time is presented. The main changes in the implementation of the computing model that will be shown are: the optimization of data distribution over the Grid, according to effective transfer rate and site readiness for analysis; the progressive dismantling of the cloud model, for data distribution and data processing; software installation migration to cvmfs; changing database access to a Frontier/squid infrastructure.

  1. Ballistic model to estimate microsprinkler droplet distribution

    Directory of Open Access Journals (Sweden)

    Conceição Marco Antônio Fonseca

    2003-01-01

    Full Text Available Experimental determination of microsprinkler droplet diameters is difficult and time-consuming. This determination, however, can be achieved using ballistic models. The present study aimed to compare simulated and measured values of microsprinkler droplet diameters. Experimental measurements were made using the flour method, and simulations using the ballistic model adopted by the SIRIAS computational software. Drop diameters quantified in the experiment varied between 0.30 mm and 1.30 mm, while the simulated diameters varied between 0.28 mm and 1.06 mm. The greatest differences between simulated and measured values were registered at the highest radial distance from the emitter. The model's performance was classified as excellent for simulating microsprinkler drop distribution.

  2. Distributed Bayesian Networks for User Modeling

    DEFF Research Database (Denmark)

    Tedesco, Roberto; Dolog, Peter; Nejdl, Wolfgang

    2006-01-01

    The World Wide Web is a popular platform for providing eLearning applications to a wide spectrum of users. However – as users differ in their preferences, background, requirements, and goals – applications should provide personalization mechanisms. In the Web context, user models used by such adaptive applications are often partial fragments of an overall user model. The fragments have then to be collected and merged into a global user profile. In this paper we investigate and present algorithms able to cope with distributed, fragmented user models – based on Bayesian Networks – in the context of Web-based eLearning platforms. The scenario we are tackling assumes learners who use several systems over time, which are able to create partial Bayesian Networks for user models based on the local system context. In particular, we focus on how to merge these partial user models. Our merge mechanism...

  3. Design of improved fuel cell controller for distributed generation systems

    Energy Technology Data Exchange (ETDEWEB)

    Olsen Berenguer, F.A. [Instituto de Energia Electrica, Universidad Nacional de San Juan, Av. Libertador San Martin Oeste, 1109, J5400ARL San Juan (Argentina); Molina, M.G. [CONICET, Instituto de Energia Electrica, Universidad Nacional de San Juan, Av. Libertador San Martin Oeste, 1109, J5400ARL San Juan (Argentina)

    2010-06-15

    The world has been undergoing a deregulation process which allowed competition in the electricity generation sector. This situation is bringing the opportunity for electricity users to generate power by using small-scale generation systems with emerging technologies, allowing the development of distributed generation (DG). A fuel cell power plant (FCPP) is a distributed generation technology with a rapid development because it has promising characteristics, such as low pollutant emissions, silent operation, high efficiency and long lifetime because of its small number of moving parts. The power conditioning system (PCS) is the interface that allows the effective connection to the electric power system. With the appropriate topology of the PCS and its control system design, the FCPP unit is capable of simultaneously performing both instantaneous active and reactive power flow control. This paper describes the design and implementation of a novel high performance PCS of an FCPP and its controller, for applications in distributed generation systems. A full detailed model of the FCPP is derived and a new three-level control scheme is designed. The dynamic performance of the proposed system is validated by digital simulation in SimPowerSystems (SPS) of MATLAB/Simulink. (author)

  4. Real-time modeling and simulation of distribution feeder and distributed resources

    Science.gov (United States)

    Singh, Pawan

    The analysis of electrical systems dates back to the days when analog network analyzers were used. With the advent of digital computers, many programs were written for power-flow and short-circuit analysis for the improvement of the electrical system. Real-time computer simulations can answer many what-if scenarios in an existing or proposed power system. In this thesis, the standard IEEE 13-node distribution feeder is developed and validated on the real-time platform OPAL-RT. The concepts and challenges of real-time simulation are studied and addressed. Distributed energy resources, including commonly used distributed generation and storage devices such as a diesel engine, a solar photovoltaic array, and a battery storage system, are modeled and simulated on the real-time platform. A microgrid encompasses a portion of an electric power distribution system located downstream of the distribution substation. Normally, the microgrid operates in parallel with the grid; however, scheduled or forced isolation can take place. In such conditions, the microgrid must have the ability to operate stably and autonomously. The microgrid can operate in grid-connected and islanded modes; both operating modes are studied in the last chapter. Towards the end, a simple microgrid controller, modeled and simulated on the real-time platform, is developed for energy management and protection of the microgrid.

  5. Improved Quantum Artificial Fish Algorithm Application to Distributed Network Considering Distributed Generation

    Directory of Open Access Journals (Sweden)

    Tingsong Du

    2015-01-01

    Full Text Available An improved quantum artificial fish swarm algorithm (IQAFSA) for solving distributed network programming considering distributed generation is proposed in this work. The IQAFSA is based on quantum computing, which offers exponential acceleration for heuristic algorithms; it uses quantum bits to encode the artificial fish and applies a quantum rotation gate, together with the preying, following, and variation behaviors of the quantum artificial fish, to update the fish in the search for the optimal value. We then apply the proposed new algorithm, the quantum artificial fish swarm algorithm (QAFSA), the basic artificial fish swarm algorithm (BAFSA), and the global edition artificial fish swarm algorithm (GAFSA) in simulation experiments on some typical test functions. The simulation results demonstrate that the proposed algorithm can escape from local extrema effectively and has a higher convergence speed and better accuracy. Finally, applying IQAFSA to distributed network problems, the simulation results for a 33-bus radial distribution network system show that IQAFSA achieves the minimum power loss in comparison with BAFSA, GAFSA, and QAFSA.
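    The quantum-bit encoding and rotation-gate update described above can be sketched in miniature. This is a generic quantum-inspired search on a toy onemax fitness function, with invented rotation step and bounds; it is not the IQAFSA itself, which adds the fish-swarm preying, following, and variation behaviors.

```python
import math
import random

def quantum_search(n_bits=16, pop=10, iters=50, seed=1):
    """Toy quantum-inspired search: each individual is a vector of qubit
    angles; 'measuring' collapses an angle t to bit 1 with probability
    sin(t)^2, and a rotation gate nudges angles toward the best bitstring
    found so far. Fitness is onemax (count of ones)."""
    rng = random.Random(seed)
    angles = [[math.pi / 4] * n_bits for _ in range(pop)]  # equal amplitudes
    best_bits, best_fit = None, -1
    for _ in range(iters):
        for a in angles:
            bits = [1 if rng.random() < math.sin(t) ** 2 else 0 for t in a]
            fit = sum(bits)
            if fit > best_fit:
                best_fit, best_bits = fit, bits
            # rotation gate: rotate each qubit toward the global best bit,
            # clamped away from 0 and pi/2 to keep some exploration
            for i in range(n_bits):
                delta = 0.05 if best_bits[i] == 1 else -0.05
                a[i] = min(max(a[i] + delta, 0.01), math.pi / 2 - 0.01)
    return best_fit

result = quantum_search()
```

    The same skeleton would carry over to the distribution-network problem by replacing onemax with a power-loss evaluation of the network configuration each bitstring encodes.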

  6. Broadband model of the distribution network

    DEFF Research Database (Denmark)

    Jensen, Martin Høgdahl

    for circular conductors involving Bessel series. The two methods show equal values of resistance, but there is a considerable difference in the values of internal inductance. A method for the calculation of the proximity effect is derived for a two-conductor configuration. This method is expanded to the use...... of frequencies up to 200 kHz. The square wave measurements reveal the complete capacitance matrix at a frequency of approximately 12.5 MHz as well as the series inductance between the four conductors. The influence of non-ideal ground could not be measured due to the high impedance of the grounding device...... measurement and simulation, once the Phase model is used. No explanation is found for why the new material properties cause error in the Phase model. At the Kyndby 10 kV test site a non-linear load is inserted on the secondary side of a normal distribution transformer and the phase voltage and current

  7. Evaluation of an improved air distribution system for aircraft cabin

    DEFF Research Database (Denmark)

    Pang, Liping; Xu, Jie; Fang, Lei

    2013-01-01

    An improved air distribution system for aircraft cabins was proposed in this paper. Personalized outlets were introduced and placed at the bottom of the baggage hold. Its ratio of fresh air to recirculation air and the conditioned temperatures of the different types of inlets were also designed carefully...... to meet the goals of high air quality, thermal comfort and energy saving. Experiments were conducted to evaluate its performance and compare it with two other systems. First, the Flow Visualization with Green Laser (FVGL) technology was used to analyze the air flow. The top-in-side-bottom-out pattern...... may have the disadvantages of an indirect path for delivering fresh air to passengers, a low fresh air utilization ratio and the potential to widely spread airborne infectious diseases. The bottom-in-top-out pattern can overcome these disadvantages very well, but it also faces the stratification

  8. Performance studies and improvements of CMS distributed data transfers

    International Nuclear Information System (INIS)

    Bonacorsi, D; Flix, J; Kaselis, R; Magini, N; Letts, J; Sartirana, A

    2012-01-01

    CMS computing needs reliable, stable and fast connections among multi-tiered distributed infrastructures. The CMS experiment relies on File Transfer Services (FTS) for data distribution, a low-level data movement service responsible for moving sets of files from one site to another while allowing participating sites to control the network resource usage. FTS servers are provided by Tier-0 and Tier-1 centres and used by all the computing sites in CMS, subject to established CMS and site setup policies, including all the virtual organizations making use of the Grid resources at the site, and are properly dimensioned to satisfy all their requirements. Managing the service efficiently requires good knowledge of the CMS needs for all kinds of transfer routes, and of the sharing and interference with other VOs using the same FTS transfer managers. This contribution deals with a complete revision of all FTS servers used by CMS, customizing the topologies and improving their setup in order to keep CMS transferring data at the desired levels, as well as performance studies for all kinds of transfer routes, including measurements of the overheads introduced by SRM servers and storage systems, FTS server misconfigurations and identification of congested channels, historical transfer throughputs per stream, file-latency studies,… This information is retrieved directly from the FTS servers through the FTS Monitor webpages and conveniently archived for further analysis. The project provides an interface to all these values to ease the analysis of the data.

  9. Improving CMS data transfers among its distributed computing facilities

    International Nuclear Information System (INIS)

    Flix, J; Magini, N; Sartirana, A

    2011-01-01

    CMS computing needs reliable, stable and fast connections among multi-tiered computing infrastructures. For data distribution, the CMS experiment relies on a data placement and transfer system, PhEDEx, managing replication operations at each site in the distribution network. PhEDEx uses the File Transfer Service (FTS), a low level data movement service responsible for moving sets of files from one site to another, while allowing participating sites to control the network resource usage. FTS servers are provided by Tier-0 and Tier-1 centres and are used by all computing sites in CMS, according to the established policy. FTS needs to be set up according to the Grid site's policies, and properly configured to satisfy the requirements of all Virtual Organizations making use of the Grid resources at the site. Managing the service efficiently requires good knowledge of the CMS needs for all kinds of transfer workflows. This contribution deals with a revision of FTS servers used by CMS, collecting statistics on their usage, customizing the topologies and improving their setup in order to keep CMS transferring data at the desired levels in a reliable and robust way.

  10. Coordinated control of active and reactive power of distribution network with distributed PV cluster via model predictive control

    Science.gov (United States)

    Ji, Yu; Sheng, Wanxing; Jin, Wei; Wu, Ming; Liu, Haitao; Chen, Feng

    2018-02-01

    A coordinated optimal control method for the active and reactive power of a distribution network with distributed PV clusters, based on model predictive control, is proposed in this paper. The method divides the control process into long-time-scale optimal control and short-time-scale optimal control with multi-step optimization. Because the optimization models are non-convex and nonlinear and therefore hard to solve, they are transformed into a second-order cone programming problem. An improved IEEE 33-bus distribution network system is used to analyse the feasibility and effectiveness of the proposed control method.
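    The receding-horizon mechanics behind model predictive control can be illustrated with a toy scalar system; this sketch uses exhaustive grid search over a two-move horizon as an illustrative stand-in for the paper's multi-step second-order cone optimization, and the system, cost weights, and control grid are all invented.

```python
def mpc_step(x):
    """One receding-horizon step for a toy scalar system x' = x + u:
    exhaustively search two-move control sequences over a bounded grid,
    then return only the first move (the MPC principle)."""
    candidates = [u / 10.0 for u in range(-10, 11)]   # controls in [-1, 1]
    best_cost, best_u0 = float("inf"), 0.0
    for u0 in candidates:
        for u1 in candidates:
            x1 = x + u0
            x2 = x1 + u1
            # track zero state, lightly penalize control effort
            cost = x1 * x1 + x2 * x2 + 0.1 * (u0 * u0 + u1 * u1)
            if cost < best_cost:
                best_cost, best_u0 = cost, u0
    return best_u0

x = 5.0
for _ in range(10):
    x = x + mpc_step(x)   # apply the first move, then re-optimize
```

    At each step the controller re-solves the horizon problem from the new state, which is what lets MPC absorb disturbances such as fluctuating PV output.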

  11. Improved Trailing Edge Noise Model

    DEFF Research Database (Denmark)

    Bertagnolio, Franck

    2012-01-01

    The modeling of the surface pressure spectrum under a turbulent boundary layer is investigated in the presence of an adverse pressure gradient along the flow direction. It is shown that discrepancies between measurements and results from a well-known model increase as the pressure gradient increa...

  12. A strategy for improved computational efficiency of the method of anchored distributions

    Science.gov (United States)

    Over, Matthew William; Yang, Yarong; Chen, Xingyuan; Rubin, Yoram

    2013-06-01

    This paper proposes a strategy for improving the computational efficiency of model inversion using the method of anchored distributions (MAD) by "bundling" similar model parametrizations in the likelihood function. Inferring the likelihood function typically requires a large number of forward model (FM) simulations for each possible model parametrization; as a result, the process is quite expensive. To ease this prohibitive cost, we present an approximation for the likelihood function called bundling that relaxes the requirement for high quantities of FM simulations. This approximation redefines the conditional statement of the likelihood function as the probability of a set of similar model parametrizations "bundle" replicating field measurements, which we show is neither a model reduction nor a sampling approach to improving the computational efficiency of model inversion. To evaluate the effectiveness of these modifications, we compare the quality of predictions and computational cost of bundling relative to a baseline MAD inversion of 3-D flow and transport model parameters. Additionally, to aid understanding of the implementation we provide a tutorial for bundling in the form of a sample data set and script for the R statistical computing language. For our synthetic experiment, bundling achieved a 35% reduction in overall computational cost and had a limited negative impact on predicted probability distributions of the model parameters. Strategies for minimizing error in the bundling approximation, for enforcing similarity among the sets of model parametrizations, and for identifying convergence of the likelihood function are also presented.
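    The bundling idea, scoring sets of similar parametrizations together rather than each parametrization individually, can be sketched with a toy forward model; the proximity-based grouping, the tolerance test, and the quadratic forward model below are all invented for illustration and are much cruder than the paper's likelihood construction.

```python
import random

def bundled_likelihood(parametrizations, forward_model, measurement,
                       n_bundles, tol):
    """Toy version of 'bundling': group similar (here: numerically close)
    parametrizations, run the forward model for each member, and score
    each bundle by the fraction of members replicating the measurement
    within tol. One score per bundle replaces one per parametrization."""
    ordered = sorted(parametrizations)          # similarity = proximity
    size = len(ordered) // n_bundles
    bundles = [ordered[i * size:(i + 1) * size] for i in range(n_bundles)]
    scores = []
    for bundle in bundles:
        hits = sum(abs(forward_model(p) - measurement) < tol
                   for p in bundle)
        scores.append(hits / len(bundle))
    return bundles, scores

rng = random.Random(0)
params = [rng.uniform(0.0, 2.0) for _ in range(200)]
fm = lambda p: p ** 2                           # hypothetical forward model
bundles, scores = bundled_likelihood(params, fm, measurement=1.0,
                                     n_bundles=10, tol=0.3)
```

    The computational saving in the real method comes from needing fewer forward-model evaluations per unit of likelihood information once parametrizations are treated in bundles.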

  13. Improved model for statistical alignment

    Energy Technology Data Exchange (ETDEWEB)

    Miklos, I.; Toroczkai, Z. (Zoltan)

    2001-01-01

    The statistical approach to molecular sequence evolution involves the stochastic modeling of the substitution, insertion and deletion processes. Substitution has been modeled in a reliable way for more than three decades by using finite Markov processes. Insertion and deletion, however, seem to be more difficult to model, and the recent approaches cannot acceptably deal with multiple insertions and deletions. A new method based on a generating function approach is introduced to describe the multiple insertion process. The presented algorithm computes the approximate joint probability of two sequences in O(l³) running time, where l is the geometric mean of the sequence lengths.

  14. Model-based software process improvement

    Science.gov (United States)

    Zettervall, Brenda T.

    1994-01-01

    The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.

  15. Taxing energy to improve the environment. Efficiency and distributional effects

    International Nuclear Information System (INIS)

    Heijdra, B.J.; Van der Horst, A.

    1998-02-01

    The effects of environmental tax policy in a dynamic overlapping-generations model of a small open economy with environmental quality incorporated as a durable consumption good have been studied. Raising the energy tax may deliver an efficiency gain if agents care enough about the environment. The benefits are unevenly distributed across generations since capital ownership, and the capital loss induced by a tax increase, rises with age. A suitable egalitarian bond policy can be employed in order to ensure everybody gains to the same extent. With this additional instrument the optimal energy tax can be computed. The authors further considered a tax reform that simultaneously lowers labour taxation and raises the energy tax. This policy delivers qualitatively similar consequences as the first scenario, though all changes are less pronounced. A double dividend may appear soon after the reform but vanishes in the course of the transition. 22 refs

  16. Taxing energy to improve the environment. Efficiency and distributional effects

    Energy Technology Data Exchange (ETDEWEB)

    Heijdra, B.J.; Van der Horst, A. [Faculty of Economics and Econometrics, University of Amsterdam, Amsterdam (Netherlands)

    1998-02-01

    The effects of environmental tax policy in a dynamic overlapping-generations model of a small open economy with environmental quality incorporated as a durable consumption good have been studied. Raising the energy tax may deliver an efficiency gain if agents care enough about the environment. The benefits are unevenly distributed across generations since capital ownership, and the capital loss induced by a tax increase, rises with age. A suitable egalitarian bond policy can be employed in order to ensure everybody gains to the same extent. With this additional instrument the optimal energy tax can be computed. The authors further considered a tax reform that simultaneously lowers labour taxation and raises the energy tax. This policy delivers qualitatively similar consequences as the first scenario, though all changes are less pronounced. A double dividend may appear soon after the reform but vanishes in the course of the transition. 22 refs.

  17. A Distributed Snow Evolution Modeling System (SnowModel)

    Science.gov (United States)

    Liston, G. E.; Elder, K.

    2004-12-01

    A spatially distributed snow-evolution modeling system (SnowModel) has been specifically designed to be applicable over a wide range of snow landscapes, climates, and conditions. To reach this goal, SnowModel is composed of four sub-models: MicroMet defines the meteorological forcing conditions, EnBal calculates surface energy exchanges, SnowMass simulates snow depth and water-equivalent evolution, and SnowTran-3D accounts for snow redistribution by wind. While other distributed snow models exist, SnowModel is unique in that it includes a well-tested blowing-snow sub-model (SnowTran-3D) for application in windy arctic, alpine, and prairie environments where snowdrifts are common. These environments comprise 68% of the seasonally snow-covered Northern Hemisphere land surface. SnowModel also accounts for snow processes occurring in forested environments (e.g., canopy interception related processes). SnowModel is designed to simulate snow-related physical processes occurring at spatial scales of 5-m and greater, and temporal scales of 1-hour and greater. These include: accumulation from precipitation; wind redistribution and sublimation; loading, unloading, and sublimation within forest canopies; snow-density evolution; and snowpack ripening and melt. To enhance its wide applicability, SnowModel includes the physical calculations required to simulate snow evolution within each of the global snow classes defined by Sturm et al. (1995), e.g., tundra, taiga, alpine, prairie, maritime, and ephemeral snow covers. The three, 25-km by 25-km, Cold Land Processes Experiment (CLPX) mesoscale study areas (MSAs: Fraser, North Park, and Rabbit Ears) are used as SnowModel simulation examples to highlight model strengths, weaknesses, and features in forested, semi-forested, alpine, and shrubland environments.

  18. Hierarchical Model Predictive Control for Resource Distribution

    DEFF Research Database (Denmark)

    Bendtsen, Jan Dimon; Trangbæk, K; Stoustrup, Jakob

    2010-01-01

    This paper deals with hierarchical model predictive control (MPC) of distributed systems. A three-level hierarchical approach is proposed, consisting of a high-level MPC controller, a second level of so-called aggregators, controlled by an online MPC-like algorithm, and a lower level of autonomous units. The approach is inspired by smart-grid electric power production and consumption systems, where the flexibility of a large number of power producing and/or power consuming units can be exploited in a smart-grid solution. The objective is to accommodate the load variation on the grid, arising on one hand from varying consumption, on the other hand from natural variations in power production, e.g. from wind turbines. The approach presented is based on quadratic optimization and possesses the properties of low algorithmic complexity and of scalability. In particular, the proposed design methodology...

  19. Combining kernel matrix optimization and regularization to improve particle size distribution retrieval

    Science.gov (United States)

    Ma, Qian; Xia, Houping; Xu, Qiang; Zhao, Lei

    2018-05-01

    A new method combining Tikhonov regularization and kernel matrix optimization by multi-wavelength incidence is proposed for retrieving the particle size distribution (PSD) in an independent model with improved accuracy and stability. In comparison to individual regularization or multi-wavelength least squares, the proposed method exhibited better anti-noise capability and higher accuracy and stability. While standard regularization typically makes use of the unit matrix, this is not universal for different PSDs, particularly for Junge distributions. Thus, a suitable regularization matrix was chosen by numerical simulation, with the second-order differential matrix found to be appropriate for most PSD types.
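    The role of a second-order differential regularization matrix can be sketched in a standalone Tikhonov solve; the identity kernel, the rough data vector, and the regularization weight below are illustrative assumptions, not the paper's multi-wavelength kernel.

```python
def second_diff_matrix(n):
    """(n-2) x n second-order difference matrix, the Tikhonov regularizer
    that favors smooth retrieved distributions."""
    return [[1 if j in (i, i + 2) else -2 if j == i + 1 else 0
             for j in range(n)] for i in range(n - 2)]

def solve(a, b):
    """Gaussian elimination with partial pivoting for small dense systems."""
    n = len(a)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (m[i][n] - sum(m[i][j] * x[j]
                              for j in range(i + 1, n))) / m[i][i]
    return x

def tikhonov(kernel, data, lam):
    """Solve the regularized normal equations
    (K^T K + lam * L^T L) f = K^T d, with L the second-difference matrix."""
    n = len(kernel[0])
    L = second_diff_matrix(n)
    def at_a(mat):  # mat^T * mat as an n x n list of lists
        return [[sum(row[i] * row[j] for row in mat) for j in range(n)]
                for i in range(n)]
    kk, ll = at_a(kernel), at_a(L)
    lhs = [[kk[i][j] + lam * ll[i][j] for j in range(n)] for i in range(n)]
    rhs = [sum(kernel[r][i] * data[r] for r in range(len(data)))
           for i in range(n)]
    return solve(lhs, rhs)

# Identity kernel and rough data: the penalty smooths the retrieval
K = [[1.0 if i == j else 0.0 for j in range(5)] for i in range(5)]
d = [0.0, 1.0, 0.0, 1.0, 0.0]
f = tikhonov(K, d, lam=1.0)
```

    By construction the regularized solution is never rougher (in the second-difference sense) than the data it fits, which is the stabilizing effect exploited in the retrieval.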

  20. Improve SSME power balance model

    Science.gov (United States)

    Karr, Gerald R.

    1992-01-01

    Effort was dedicated to development and testing of a formal strategy for reconciling uncertain test data with physically limited computational prediction. Specific weaknesses in the logical structure of the current Power Balance Model (PBM) version are described with emphasis given to the main routing subroutines BAL and DATRED. Selected results from a variational analysis of PBM predictions are compared to Technology Test Bed (TTB) variational study results to assess PBM predictive capability. The motivation for systematic integration of uncertain test data with computational predictions based on limited physical models is provided. The theoretical foundation for the reconciliation strategy developed in this effort is presented, and results of a reconciliation analysis of the Space Shuttle Main Engine (SSME) high pressure fuel side turbopump subsystem are examined.

  1. APPROXIMATION OF PROBABILITY DISTRIBUTIONS IN QUEUEING MODELS

    Directory of Open Access Journals (Sweden)

    T. I. Aliev

    2013-03-01

    Full Text Available For probability distributions with a coefficient of variation not equal to unity, mathematical dependences for approximating distributions on the basis of the first two moments are derived by making use of multi-exponential distributions. It is proposed to approximate distributions with a coefficient of variation less than unity by using the hypoexponential distribution, which makes it possible to generate random variables with a coefficient of variation taking any value in the range (0; 1), as opposed to the Erlang distribution, which has only discrete values of the coefficient of variation.
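
    The claim about coefficients of variation can be checked in a few lines. The sketch below (illustrative, not from the paper) uses the closed-form mean and variance of a hypoexponential random variable, i.e. a sum of independent exponentials: equal rates (the Erlang case) pin the coefficient of variation at 1/sqrt(k), while unequal rates let it vary continuously below 1.

```python
import math

def hypoexp_mean_var(rates):
    """Mean and variance of a sum of independent Exp(rate_i) variables."""
    mean = sum(1.0 / r for r in rates)
    var = sum(1.0 / r ** 2 for r in rates)
    return mean, var

def cv(rates):
    """Coefficient of variation (std / mean) of the hypoexponential distribution."""
    mean, var = hypoexp_mean_var(rates)
    return math.sqrt(var) / mean

# Erlang-k (equal rates) pins CV at 1/sqrt(k) ...
print(cv([1.0, 1.0]))            # 1/sqrt(2) ≈ 0.707
# ... while unequal rates sweep the CV continuously:
print(cv([1.0, 10.0]))           # close to 1
print(cv([1.0, 1.0, 1.0, 5.0]))  # below 1/sqrt(2) with more phases
```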

  2. Distributed parallel computing in stochastic modeling of groundwater systems.

    Science.gov (United States)

    Dong, Yanhui; Li, Guomin; Xu, Haizhen

    2013-03-01

    Stochastic modeling is a rapidly evolving, popular approach to the study of the uncertainty and heterogeneity of groundwater systems. However, the use of Monte Carlo-type simulations to solve practical groundwater problems often encounters computational bottlenecks that hinder the acquisition of meaningful results. To improve the computational efficiency, a system that combines stochastic model generation with MODFLOW-related programs and distributed parallel processing is investigated. The distributed computing framework, called the Java Parallel Processing Framework, is integrated into the system to allow the batch processing of stochastic models in distributed and parallel systems. As an example, the system is applied to the stochastic delineation of well capture zones in the Pinggu Basin in Beijing. Through the use of 50 processing threads on a cluster with 10 multicore nodes, the execution times of 500 realizations are reduced to 3% compared with those of a serial execution. Through this application, the system demonstrates its potential in solving difficult computational problems in practical stochastic modeling. © 2012, The Author(s). Groundwater © 2012, National Ground Water Association.
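
    The batch-processing pattern described here can be sketched briefly. This is a structural illustration only: it uses Python's standard library rather than the Java Parallel Processing Framework, and a dummy seeded random-walk statistic stands in for a stochastic MODFLOW model run.

```python
import concurrent.futures
import random
import statistics

def run_realization(seed, steps=1000):
    """One independent Monte Carlo realization; in practice this would run one
    stochastic groundwater model. Seeded for reproducibility."""
    rng = random.Random(seed)
    pos = 0.0
    for _ in range(steps):
        pos += rng.gauss(0.0, 1.0)
    return pos

def run_ensemble(n_realizations, workers=8):
    """Dispatch independent realizations across a worker pool, preserving order."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(run_realization, range(n_realizations)))

results = run_ensemble(500)
print(len(results))
```

    Because the realizations are independent, they scale out almost linearly, which is the effect behind the 97% runtime reduction reported in the abstract.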

  3. Improving Agent Based Modeling of Critical Incidents

    Directory of Open Access Journals (Sweden)

    Robert Till

    2010-04-01

    Full Text Available Agent Based Modeling (ABM) is a powerful method that has been used to simulate potential critical incidents in the infrastructure and built environments. This paper will discuss the modeling of some critical incidents currently simulated using ABM and how they may be expanded and improved by using better physiological modeling, psychological modeling, modeling the actions of interveners, introducing Geographic Information Systems (GIS) and open source models.

  4. Model of bidirectional reflectance distribution function for metallic materials

    International Nuclear Information System (INIS)

    Wang Kai; Zhu Jing-Ping; Liu Hong; Hou Xun

    2016-01-01

    Based on the three-component assumption that the reflection is divided into specular reflection, directional diffuse reflection, and ideal diffuse reflection, a bidirectional reflectance distribution function (BRDF) model of metallic materials is presented. Compared with the two-component assumption that the reflection is composed of specular reflection and diffuse reflection, the three-component assumption divides the diffuse reflection into directional diffuse and ideal diffuse reflection. This model effectively resolves the problem that constant diffuse reflection leads to considerable error for metallic materials. Simulation and measurement results validate that this three-component BRDF model can improve the modeling accuracy significantly and describe the reflection properties in the hemisphere space precisely for the metallic materials. (paper)

  5. Model of bidirectional reflectance distribution function for metallic materials

    Science.gov (United States)

    Wang, Kai; Zhu, Jing-Ping; Liu, Hong; Hou, Xun

    2016-09-01

    Based on the three-component assumption that the reflection is divided into specular reflection, directional diffuse reflection, and ideal diffuse reflection, a bidirectional reflectance distribution function (BRDF) model of metallic materials is presented. Compared with the two-component assumption that the reflection is composed of specular reflection and diffuse reflection, the three-component assumption divides the diffuse reflection into directional diffuse and ideal diffuse reflection. This model effectively resolves the problem that constant diffuse reflection leads to considerable error for metallic materials. Simulation and measurement results validate that this three-component BRDF model can improve the modeling accuracy significantly and describe the reflection properties in the hemisphere space precisely for the metallic materials.

  6. Improving Earth/Prediction Models to Improve Network Processing

    Science.gov (United States)

    Wagner, G. S.

    2017-12-01

    The United States Atomic Energy Detection System (USAEDS) primary seismic network consists of a relatively small number of arrays and three-component stations. The relatively small number of stations in the USAEDS primary network makes it both necessary and feasible to optimize both station and network processing. Station processing improvements include detector tuning efforts that use Receiver Operator Characteristic (ROC) curves to help judiciously set acceptable Type 1 (false) vs. Type 2 (miss) error rates. Other station processing improvements include the use of empirical/historical observations and continuous background noise measurements to compute time-varying, maximum likelihood probability of detection thresholds. The USAEDS network processing software makes extensive use of the azimuth and slowness information provided by frequency-wavenumber analysis at array sites, and polarization analysis at three-component sites. Most of the improvements in USAEDS network processing are due to improvements in the models used to predict azimuth, slowness, and probability of detection. Kriged travel-time, azimuth and slowness corrections (and associated uncertainties) are computed using a ground truth database. Improvements in station processing and the use of improved models for azimuth, slowness, and probability of detection have led to significant improvements in USAEDS network processing.

  7. Improved TOPSIS decision model for NPP emergencies

    International Nuclear Information System (INIS)

    Zhang Jin; Liu Feng; Huang Lian

    2011-01-01

    In this paper, an improved decision model is developed for use as a tool to respond to emergencies at nuclear power plants. Given the complexity of multi-attribute emergency decision-making for nuclear accidents, the improved TOPSIS method is used to build a decision-making model that integrates the subjective weight and the objective weight of each evaluation index. A comparison between the results of this new model and two traditional methods, the fuzzy hierarchy analysis method and the weighted analysis method, demonstrates that the improved TOPSIS model has a better evaluation effect. (authors)
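
    A minimal sketch of a TOPSIS ranking with combined weights is given below. It is not the authors' model: the entropy-based objective weights, the 50/50 blending factor, and the decision matrix are illustrative assumptions standing in for the paper's unspecified weighting scheme.

```python
import numpy as np

def entropy_weights(X):
    """Objective weights from the Shannon entropy of the normalized decision matrix."""
    P = X / X.sum(axis=0)
    m = X.shape[0]
    E = -(P * np.log(P)).sum(axis=0) / np.log(m)
    d = 1.0 - E                      # degree of divergence per criterion
    return d / d.sum()

def topsis(X, w_subj, alpha=0.5):
    """Rank alternatives (rows) on benefit criteria (columns); higher score = better."""
    w = alpha * w_subj + (1 - alpha) * entropy_weights(X)   # blended weights
    R = X / np.sqrt((X ** 2).sum(axis=0))                   # vector-normalize columns
    V = R * w
    ideal, anti = V.max(axis=0), V.min(axis=0)
    d_pos = np.sqrt(((V - ideal) ** 2).sum(axis=1))         # distance to ideal
    d_neg = np.sqrt(((V - anti) ** 2).sum(axis=1))          # distance to anti-ideal
    return d_neg / (d_pos + d_neg)

X = np.array([[7.0, 9.0, 8.0],
              [8.0, 7.0, 6.0],
              [9.0, 6.0, 7.0]])
w_subj = np.array([0.5, 0.3, 0.2])
scores = topsis(X, w_subj)
print(scores.argsort()[::-1])        # alternatives, best first
```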

  8. A Model of U.S. Commercial Distributed Generation Adoption

    Energy Technology Data Exchange (ETDEWEB)

    LaCommare, Kristina Hamachi; Ryan Firestone; Zhou, Nan; Maribu,Karl; Marnay, Chris

    2006-01-10

    Small-scale (100 kW-5 MW) on-site distributed generation (DG), economically driven by combined heat and power (CHP) applications and, in some cases, reliability concerns, will likely emerge as a common feature of commercial building energy systems over the next two decades. Forecasts of DG adoption published by the Energy Information Administration (EIA) in the Annual Energy Outlook (AEO) are made using the National Energy Modeling System (NEMS), which has a forecasting module that predicts the penetration of several possible commercial building DG technologies over the period 2005-2025. NEMS is also used for estimating the future benefits of Department of Energy research and development in support of budget requests and management decisionmaking. The NEMS approach to modeling DG has some limitations, including constraints on the amount of DG allowed for retrofits to existing buildings and a small number of possible sizes for each DG technology. An alternative approach called the Commercial Sector Model (ComSeM) is developed to improve the way in which DG adoption is modeled. The approach incorporates load shapes for specific end uses in specific building types in specific regions, e.g., cooling in hospitals in Atlanta or space heating in Chicago offices. The Distributed Energy Resources Customer Adoption Model (DER-CAM) uses these load profiles together with input cost and performance DG technology assumptions to model the potential DG adoption for four selected cities and two sizes of five building types in selected forecast years to 2022. The Distributed Energy Resources Market Diffusion Model (DER-MaDiM) is then used to tailor the DER-CAM results into adoption projections for the entire U.S. commercial sector for all forecast years from 2007-2025. This process is conducted such that the structure of the results is consistent with the structure of NEMS, so they can be re-injected into NEMS, which can then be used to integrate the adoption results into a full forecast.

  9. Dynamical Models For Prices With Distributed Delays

    Directory of Open Access Journals (Sweden)

    Mircea Gabriela

    2015-06-01

    Full Text Available In the present paper we study some models for the price dynamics of a single commodity market. The quantities supplied and demanded are regarded as functions of time. Nonlinearities in both supply and demand functions are considered. The inventory and the level of inventory are taken into consideration. Because consumer behavior affects commodity demand, and this behavior is influenced not only by the instantaneous price but also by weighted past prices, a distributed time delay is introduced. The following kernels are taken into consideration: the demand price weak kernel and the demand price Dirac kernel. Only one positive equilibrium point is found and its stability analysis is presented. When the demand price kernel is weak, under some conditions on the parameters, the equilibrium point is locally asymptotically stable. When the demand price kernel is Dirac, the existence of local oscillations is investigated. A change in the local stability of the equilibrium point, from stable to unstable, implies a Hopf bifurcation. A family of periodic orbits bifurcates from the positive equilibrium point when the time delay passes through a critical value. The last part contains some numerical simulations to illustrate the effectiveness of our results and conclusions.

  10. A Variance Distribution Model of Surface EMG Signals Based on Inverse Gamma Distribution.

    Science.gov (United States)

    Hayashi, Hideaki; Furui, Akira; Kurita, Yuichi; Tsuji, Toshio

    2017-11-01

    Objective: This paper describes the formulation of a surface electromyogram (EMG) model capable of representing the variance distribution of EMG signals. Methods: In the model, EMG signals are handled based on a Gaussian white noise process with a mean of zero for each variance value. EMG signal variance is taken as a random variable that follows inverse gamma distribution, allowing the representation of noise superimposed onto this variance. Variance distribution estimation based on marginal likelihood maximization is also outlined in this paper. The procedure can be approximated using rectified and smoothed EMG signals, thereby allowing the determination of distribution parameters in real time at low computational cost. Results: A simulation experiment was performed to evaluate the accuracy of distribution estimation using artificially generated EMG signals, with results demonstrating that the proposed model's accuracy is higher than that of maximum-likelihood-based estimation. Analysis of variance distribution using real EMG data also suggested a relationship between variance distribution and signal-dependent noise. Conclusion: The study reported here was conducted to examine the performance of a proposed surface EMG model capable of representing variance distribution and a related distribution parameter estimation method. Experiments using artificial and real EMG data demonstrated the validity of the model. Significance: Variance distribution estimated using the proposed model exhibits potential in the estimation of muscle force.
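
    The generative assumption in this model, zero-mean Gaussian signals whose variance is itself inverse-gamma distributed, is easy to simulate. The sketch below is illustrative rather than the authors' estimator (the shape and scale values are made up), and it checks the implied marginal variance E[sigma^2] = b/(a - 1) by simulation.

```python
import numpy as np

rng = np.random.default_rng(42)
a, b = 4.0, 3.0                     # inverse gamma shape / scale (illustrative)
n = 200_000

# variance ~ InvGamma(a, b): the reciprocal of a Gamma(a, 1/b) variable.
variances = 1.0 / rng.gamma(shape=a, scale=1.0 / b, size=n)

# EMG-like signal: x | variance ~ N(0, variance); the marginal of x is then
# a scaled Student-t with 2a degrees of freedom.
x = rng.normal(0.0, np.sqrt(variances))

print(x.var(), b / (a - 1))         # sample variance vs. theoretical E[sigma^2]
```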

  11. Marketing & Distributive Education. Committed to the Improvement of Marketing.

    Science.gov (United States)

    South Carolina State Dept. of Education, Columbia. Office of Vocational Education.

    This package consists of 34 transparency masters outlining the nature and scope of marketing and distributive education. Included in the set are transparency masters addressing the following themes: the interconnectedness of education, labor, and work; objectives of marketing and distributive education at both the secondary and postsecondary…

  12. IMPROVING THE DISTRIBUTION OF BULGARIAN SEASIDE HOLIDAY HOTELS

    Directory of Open Access Journals (Sweden)

    Stoyan Marinov

    2010-06-01

    Full Text Available This report aims at viewing and analysing the trends and changes in the channels for distribution of tourist products. The specific features of modern hotel management together with the tasks in the process of distribution of hotel products have been presented. Applying.

  13. Understanding and Improving the Performance Consistency of Distributed Computing Systems

    NARCIS (Netherlands)

    Yigitbasi, M.N.

    2012-01-01

    With the increasing adoption of distributed systems in both academia and industry, and with the increasing computational and storage requirements of distributed applications, users inevitably demand more from these systems. Moreover, users also depend on these systems for latency and throughput

  14. Improved margin utilization through the use of beacon power distribution surveillance

    International Nuclear Information System (INIS)

    Miller, R. Wade; Boyd, William A.

    2002-01-01

    Core operations, including fuel cycle costs, can be significantly improved when state-of-the-art surveillance techniques are employed for core power distribution monitoring. Core power distribution monitoring and Technical Specification surveillance are major operational issues at PWRs, particularly in plants with movable in-core detectors. Even plants with fixed in-core detectors do not always make use of the continuous data that is available. The BEACON™ system (Best Estimate Analysis of Core Operations - Nuclear) is a core monitoring and operational support package developed by Westinghouse for use in PWR plants with fixed or movable in-core detectors. BEACON is a real-time core monitoring system which uses existing core instrumentation data and an on-line neutronics model to provide continuous monitoring of core power distribution information. With this information available, the BEACON system can be used to continuously monitor core power margin for the plant Tech Spec surveillance requirements and for plant operational guidance.

  15. Correlation Structures of Correlated Binomial Models and Implied Default Distribution

    OpenAIRE

    S. Mori; K. Kitsukawa; M. Hisakado

    2006-01-01

    We show how to analyze and interpret the correlation structures, the conditional expectation values and correlation coefficients of exchangeable Bernoulli random variables. We study implied default distributions for the iTraxx-CJ tranches and some popular probabilistic models, including the Gaussian copula model, Beta binomial distribution model and long-range Ising model. We interpret the differences in their profiles in terms of the correlation structures. The implied default distribution h...

  16. An Improved Distribution Policy with a Maintenance Aspect for an Urban Logistic Problem

    Directory of Open Access Journals (Sweden)

    Nadia Ndhaief

    2017-07-01

    Full Text Available In this paper, we present an improved distribution plan supporting an urban distribution center (UDC) to solve the last-mile problem of urban freight. This is motivated by the need of UDCs to satisfy daily demand on time under a high service level in allocated urban areas. Moreover, these demands may not be satisfiable individually because the delivery rate can be less than the daily demand and/or affected by random failures or maintenance actions on vehicles. The scope of our work is a UDC which needs to satisfy demands over a finite horizon. To that end, we consider a distribution policy based on two sequential plans: a distribution plan correlated with a maintenance plan, using a subcontracting strategy with several potential urban distribution centers (UDCs) and performing preventive maintenance to ensure deliveries for their allocated urban areas. The choice of subcontractor depends on distance, environmental and availability criteria. In doing so, we define a mathematical model for finding the best distribution and maintenance plans under a subcontracting strategy. Moreover, we consider delays to subsequent periods with an expensive penalty. Finally, we present a numerical example illustrating the benefits of our approach.

  17. Modeling the economics and market adoption of distributed power generation

    International Nuclear Information System (INIS)

    Maribu, Karl Magnus

    2006-01-01

    After decades of power generating units increasing in size, there is currently a growing focus on distributed generation, power generation close to energy loads. Investments in large-scale units have been driven by economy of scale, but recent technological improvements on small generating plants have made it possible to exploit the benefits of local power generation to a larger extent than previously. Distributed generation can improve power system efficiency because heat can be recovered from thermal units to supply heat and thermally activated cooling, and because small-scale renewables have a promising end-user market. Further benefits of distributed generation include improved reliability, deferral of often controversial and costly grid investments and reduction of grid losses. The new appeal of small-scale power generation means that there is a need for new tools to analyze distributed generation, both from a system perspective and from the perspective of potential developers. In this thesis, the focus is on the value of power generation for end-users. The thesis identifies how an end-user can find optimal distributed generation systems and investment strategies under a variety of economic and regulatory scenarios. The final part of the thesis extends the analysis with a bottom up model of how the economics of distributed generation for a representative set of building types can transfer to technology diffusion in a market. Four separate research papers make up the thesis. In the first paper, Optimal Investment Strategies in Decentralized Renewable Power Generation under Uncertainty, a method for evaluation of investments in renewable power units under price uncertainty is presented. It is assumed the developer has a building with an electricity load and a renewable power resource. 
    The case study compares a set of wind power systems with different capacities and finds that the optimal capacity depends on the electricity price and that, under uncertain prices, there can be a

  18. Light distributions in a port wine stain model containing multiple cylindrical and curved blood vessels

    NARCIS (Netherlands)

    Lucassen, G. W.; Verkruysse, W.; Keijzer, M.; van Gemert, M. J.

    1996-01-01

    Knowledge of the light distribution in skin tissue is important for the understanding, prediction, and improvement of the clinical results in laser treatment of port wine stains (PWS). The objective of this study is to improve modelling of PWS treated by laser using an improved and more realistic

  19. Testing the efficacy of downscaling in species distribution modelling: a comparison between MaxEnt and Favourability Function models

    Directory of Open Access Journals (Sweden)

    Olivero, J.

    2016-03-01

    Full Text Available Statistical downscaling is used to improve the knowledge of spatial distributions from broad-scale to fine-scale maps with higher potential for conservation planning. We assessed the effectiveness of downscaling in two commonly used species distribution models: Maximum Entropy (MaxEnt) and the Favourability Function (FF). We used atlas data (10 x 10 km) of the fire salamander Salamandra salamandra distribution in southern Spain to derive models at a 1 x 1 km resolution. Downscaled models were assessed using an independent dataset of the species’ distribution at 1 x 1 km. The Favourability model showed better downscaling performance than the MaxEnt model, and the models that were based on linear combinations of environmental variables performed better than models allowing higher flexibility. The Favourability model minimized model overfitting compared to the MaxEnt model.

  20. Testing the efficacy of downscaling in species distribution modelling: a comparison between MaxEnt and Favourability Function models

    Energy Technology Data Exchange (ETDEWEB)

    Olivero, J.; Toxopeus, A.G.; Skidmore, A.K.; Real, R.

    2016-07-01

    Statistical downscaling is used to improve the knowledge of spatial distributions from broad–scale to fine–scale maps with higher potential for conservation planning. We assessed the effectiveness of downscaling in two commonly used species distribution models: Maximum Entropy (MaxEnt) and the Favourability Function (FF). We used atlas data (10 x 10 km) of the fire salamander Salamandra salamandra distribution in southern Spain to derive models at a 1 x 1 km resolution. Downscaled models were assessed using an independent dataset of the species’ distribution at 1 x 1 km. The Favourability model showed better downscaling performance than the MaxEnt model, and the models that were based on linear combinations of environmental variables performed better than models allowing higher flexibility. The Favourability model minimized model overfitting compared to the MaxEnt model. (Author)
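
    For reference, the Favourability Function used above transforms a logistic model's output so that it no longer depends on the species' prevalence in the training sample (after Real et al.); the sketch below uses illustrative presence/absence counts, not the study's data.

```python
import math

def favourability(p, n1, n0):
    """Favourability F = (P/(1-P)) / (n1/n0 + P/(1-P)), where n1 and n0 are the
    numbers of presences and absences in the training data."""
    odds = p / (1.0 - p)
    return odds / (n1 / n0 + odds)

# A site whose predicted probability equals the species' prevalence
# n1 / (n1 + n0) gets favourability 0.5, i.e. environmentally "neutral":
n1, n0 = 100, 300                    # illustrative counts
prev = n1 / (n1 + n0)                # 0.25
print(favourability(prev, n1, n0))   # → 0.5
```

    This prevalence-independence is one reason favourability values from differently sampled models can be compared directly.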

  1. Models for Evaluating and Improving Architecture Competence

    National Research Council Canada - National Science Library

    Bass, Len; Clements, Paul; Kazman, Rick; Klein, Mark

    2008-01-01

    ... producing high-quality architectures. This report lays out the basic concepts of software architecture competence and describes four models for explaining, measuring, and improving the architecture competence of an individual...

  2. Flash flood modeling with the MARINE hydrological distributed model

    Science.gov (United States)

    Estupina-Borrell, V.; Dartus, D.; Ababou, R.

    2006-11-01

    Flash floods are characterized by their violence and the rapidity of their occurrence. Because these events are rare and unpredictable, but also fast and intense, their anticipation with sufficient lead time for warning and broadcasting is a primary subject of research. Because of the heterogeneities of the rain and of the behavior of the surface, spatially distributed hydrological models can lead to a better understanding of the processes, and so they can contribute to better forecasting of flash floods. Our main goal here is to develop an operational and robust methodology for flash flood forecasting. This methodology should provide relevant information about flood evolution on short time scales, and should be applicable even in locations where direct observations are sparse (e.g. absence of historical and modern rainfall and streamflow records in small mountainous watersheds). The flash flood forecast is obtained by the physically based, space-time distributed hydrological model MARINE (Model of Anticipation of Runoff and INondations for Extreme events). This model is presented and tested in this paper for a real flash flood event. The model consists of two components: the first is a "basin" flood module which generates flood runoff in the upstream part of the watershed, and the second is the "stream network" module, which propagates the flood in the main river and its subsidiaries. The basin flash flood generation model is a rainfall-runoff model that can integrate remotely sensed data. Surface hydraulics equations are solved with enough simplifying hypotheses to allow real-time exploitation. The minimum data required by the model are: (i) the Digital Elevation Model, used to calculate the slopes that generate runoff; it can be issued from satellite imagery (SPOT) or from the French Geographical Institute (IGN); (ii) the rainfall data from meteorological radar, observed or anticipated by the French Meteorological Service (M

  3. Deterministic Properties of Serially Connected Distributed Lag Models

    Directory of Open Access Journals (Sweden)

    Piotr Nowak

    2013-01-01

    Full Text Available Distributed lag models are an important tool in modeling dynamic systems in economics. In the analysis of composite forms of such models, the component models are ordered in parallel (with the same independent variable) and/or in series (where the independent variable is also the dependent variable of the preceding model). This paper presents an analysis of certain deterministic properties of composite distributed lag models composed of component distributed lag models arranged in sequence, and of their asymptotic properties in particular. The models considered are in discrete form. Even though the paper focuses on deterministic properties of distributed lag models, the derivations are based on analytical tools commonly used in probability theory, such as probability distributions and the central limit theorem. (original abstract)
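
    The serial-composition property can be illustrated numerically: connecting lag models in series convolves their lag-weight sequences, and repeated composition pushes the composite profile toward a bell shape, which is the central-limit behaviour the abstract alludes to. The component weights below are illustrative.

```python
import numpy as np

# One component model's lag weights (normalized to sum to 1).
w = np.array([0.5, 0.3, 0.2])

# Six identical models connected in series: composite weights are the
# repeated convolution of the component weights.
composite = np.array([1.0])
for _ in range(6):
    composite = np.convolve(composite, w)

print(composite.sum())      # still sums to 1 (normalization is preserved)
print(composite.argmax())   # mode sits near the sum of the component mean lags
```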

  4. Can model weighting improve probabilistic projections of climate change?

    Energy Technology Data Exchange (ETDEWEB)

    Raeisaenen, Jouni; Ylhaeisi, Jussi S. [Department of Physics, P.O. Box 48, University of Helsinki (Finland)

    2012-10-15

    Recently, Raeisaenen and co-authors proposed a weighting scheme in which the relationship between observable climate and climate change within a multi-model ensemble determines to what extent agreement with observations affects model weights in climate change projection. Within the Third Coupled Model Intercomparison Project (CMIP3) dataset, this scheme slightly improved the cross-validated accuracy of deterministic projections of temperature change. Here the same scheme is applied to probabilistic temperature change projection, under the strong limiting assumption that the CMIP3 ensemble spans the actual modeling uncertainty. Cross-validation suggests that probabilistic temperature change projections may also be improved by this weighting scheme. However, the improvement relative to uniform weighting is smaller in the tail-sensitive logarithmic score than in the continuous ranked probability score. The impact of the weighting on projection of real-world twenty-first century temperature change is modest in most parts of the world. However, in some areas mainly over the high-latitude oceans, the mean of the distribution is substantially changed and/or the distribution is considerably narrowed. The weights of individual models vary strongly with location, so that a model that receives nearly zero weight in some area may still get a large weight elsewhere. Although the details of this variation are method-specific, it suggests that the relative strengths of different models may be difficult to harness by weighting schemes that use spatially uniform model weights. (orig.)

  5. The low cost of quality improvements in the electricity distribution sector of Brazil

    International Nuclear Information System (INIS)

    Corton, Maria Luisa; Zimmermann, Aneliese; Phillips, Michelle Andrea

    2016-01-01

    We analyze the impact of introducing output-based incentives in the price-cap regulatory regime of the Brazilian electricity distribution sector. We focus on the trade-off between operating costs and quality improvement, hypothesizing a positive relationship. Operating costs include maintenance and repair expenses. The regulator sets limits for service continuity and non-technical energy losses in each regulatory period. Service continuity refers to the average length of interruptions in electricity distribution. Non-technical losses refer to losses due to factors specific to the distribution segment. Quality incentives include peer-pressure and penalties/rewards for compliance with minimum quality standards. We model operating costs using a GMM framework to acknowledge endogeneity of variables. The model is dynamic given the inclusion of regulatory lags to recognize past cost behavior. Findings reveal a small trade-off between costs and quality. We conclude that quality improvements are not costly relative to the potential savings from complying with quality standards. We also find that the impact on operating costs is larger when energy losses increase compared to the cost effect due to increases in duration of outages. These findings suggest areas of attention in managerial decision making, and serve as valuable information to the regulator in tailoring quality incentives for this sector. - Highlights: • The article focuses on the impact of quality improvements on operating costs. • We find a very small tradeoff between quality improvements and operating costs. • We find the impact of a large share of electricity losses on costs larger compared to the impact of longer outages. • The results serve the regulator to adjust incentives for quality improvement. • The results serve the regulator in tailoring regulatory values for electricity losses and outages.

  6. Affordable non-traditional source data mining for context assessment to improve distributed fusion system robustness

    Science.gov (United States)

    Bowman, Christopher; Haith, Gary; Steinberg, Alan; Morefield, Charles; Morefield, Michael

    2013-05-01

    This paper describes methods to affordably improve the robustness of distributed fusion systems by opportunistically leveraging non-traditional data sources. Adaptive methods help find relevant data, create models, and characterize the model quality. These methods can also measure the conformity of this non-traditional data with fusion system products, including situation modeling and mission impact prediction. Non-traditional data can improve the quantity, quality, availability, timeliness, and diversity of the baseline fusion system sources and can therefore improve prediction and estimation accuracy and robustness at all levels of fusion. Techniques are described that automatically learn to characterize and search non-traditional contextual data to enable operators to integrate the data with high-level fusion systems and ontologies. These techniques apply the extension of the Data Fusion & Resource Management Dual Node Network (DNN) technical architecture at Level 4. The DNN architecture supports effective assessment and management of the expanded portfolio of data sources, entities of interest, models, and algorithms, including data pattern discovery and context conformity. Affordable model-driven and data-driven data mining methods to discover unknown models from non-traditional and 'big data' sources are used to automatically learn entity behaviors and correlations with fusion products [14, 15]. This paper describes our context assessment software development and a demonstration of context assessment of non-traditional data compared to an intelligence, surveillance and reconnaissance fusion product based upon an IED POI workflow.

  7. Improved models of dense anharmonic lattices

    Energy Technology Data Exchange (ETDEWEB)

    Rosenau, P., E-mail: rosenau@post.tau.ac.il; Zilburg, A.

    2017-01-15

    We present two improved quasi-continuous models of dense, strictly anharmonic chains. The direct expansion, which includes the leading effect due to lattice dispersion, results in a Boussinesq-type PDE with a compacton as its basic solitary mode. Without increasing its complexity we improve the model by including additional terms in the expanded interparticle potential, with the resulting compacton having a milder singularity at its edges. Particular care is given to the Hertz potential because of its non-analyticity. Since, however, the PDEs of both the basic and the improved model are ill posed, they are unsuitable for studying chain dynamics. Using the bond length as a state variable we manipulate its dispersion and derive a well-posed fourth order PDE. - Highlights: • An improved PDE model of a Newtonian lattice renders compacton solutions. • Compactons are classical solutions of the improved model and hence amenable to standard analysis. • An alternative well-posed model enables the study of head-on interactions of lattice solitary waves. • Well-posed modeling of the Hertz potential.

  8. A model for the distribution channels planning process

    NARCIS (Netherlands)

    Neves, M.F.; Zuurbier, P.; Campomar, M.C.

    2001-01-01

    Research of existing literature reveals some models (sequence of steps) for companies that want to plan distribution channels. None of these models uses strong contributions from transaction cost economics, bringing a possibility to elaborate on a "distribution channels planning model", with these

  9. Can environmental improvement change the population distribution of walking?

    Science.gov (United States)

    Panter, Jenna; Ogilvie, David

    2017-06-01

    Few studies have explored the impact of environmental change on walking using controlled comparisons. Even fewer have examined whose behaviour changes and how. In a natural experimental study of new walking and cycling infrastructure, we explored changes in walking, identified groups who changed in similar ways and assessed whether exposure to the infrastructure was associated with trajectories of walking. 1257 adults completed annual surveys assessing walking, sociodemographic and health characteristics and use of the infrastructure (2010-2012). Residential proximity to the new routes was assessed objectively. We used latent growth curve models to assess change in total walking, walking for recreation and for transport, used simple descriptive analysis and latent class analysis (LCA) to identify groups who changed in similar ways and examined factors associated with group membership using multinomial regression. LCA identified five trajectories, characterised by consistently low levels; consistently high levels; decreases; short-lived increases; and sustained increases. Those with lower levels of education and lower incomes were more likely to show both short-lived and sustained increases in walking for transport. However, those with lower levels of education were less likely to take up walking. Proximity to the intervention was associated with both uptake of and short-lived increases in walking for transport. Environmental improvement encouraged the less active to take up walking for transport, as well as encouraging those who were already active to walk more. Further research should disentangle the role of socioeconomic characteristics in determining use of new environments and changes in walking. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  10. Asymmetric fan beams (AFB) for improvement of the craniocaudal dose distribution in helical tomotherapy delivery

    International Nuclear Information System (INIS)

    Gladwish, Adam; Kron, Tomas; McNiven, Andrea; Bauman, Glenn; Van Dyk, Jake

    2004-01-01

    Helical tomotherapy (HT) is a novel radiotherapy technique that utilizes intensity modulated fan beams that deliver highly conformal dose distributions in a helical beam trajectory. The most significant limitation in dose delivery with a constant fan beam thickness (FBT) is the penumbra width of the dose distribution in the craniocaudal direction, which is equivalent to the FBT. We propose to employ a half-blocked fan beam at the start and stop locations to reduce the penumbra width by half. By opening the jaw slowly during the helical delivery until the desired FBT is achieved, it is possible to create a sharper dose edge superior and inferior to the target. The technique was studied using a tomotherapy beam model implemented on a commercial treatment planning system (Theraplan Plus V3.0). It was demonstrated that the dose distribution delivered using a 25 mm fan beam can be improved significantly, reducing the dose to normal structures located superior and inferior to the target. Dosimetry for this technique is straightforward down to a FBT of 15 mm and implementation should be simple, as no changes in couch movement are required compared to a standard HT delivery. We conclude that the use of asymmetrically collimated fan beams for the start and stop of the helical tomotherapeutic dose delivery has the potential of significantly improving the dose distribution in helical tomotherapy

  11. A Continuous Improvement Capital Funding Model.

    Science.gov (United States)

    Adams, Matt

    2001-01-01

    Describes a capital funding model that helps assess facility renewal needs in a way that minimizes resources while maximizing results. The article explains the sub-components of a continuous improvement capital funding model, including budgeting processes for finish renewal, building performance renewal, and critical outcome. (GR)

  12. Understanding catchment behaviour through model concept improvement

    NARCIS (Netherlands)

    Fenicia, F.

    2008-01-01

    This thesis describes an approach to model development based on the concept of iterative model improvement, which is a process where by trial and error different hypotheses of catchment behaviour are progressively tested, and the understanding of the system proceeds through a combined process of

  13. Improved ionic model of liquid uranium dioxide

    NARCIS (Netherlands)

    Gryaznov, [No Value; Iosilevski, [No Value; Yakub, E; Fortov, [No Value; Hyland, GJ; Ronchi, C

    The paper presents a model for liquid uranium dioxide, obtained by improving a simplified ionic model, previously adopted to describe the equation of state of this substance [1]. A "chemical picture" is used for liquid UO2 of stoichiometric and non-stoichiometric composition. Several ionic species

  14. Influence of Hardening Model on Weld Residual Stress Distribution

    Energy Technology Data Exchange (ETDEWEB)

    Mullins, Jonathan; Gunnars, Jens (Inspecta Technology AB, Stockholm (Sweden))

    2009-06-15

    This study is the third stage of a project sponsored by the Swedish Radiation Safety Authority (SSM) to improve the weld residual stress modelling procedures currently used in Sweden. The aim of this study was to determine which material hardening model gave the best agreement with experimentally measured weld residual stress distributions. Two girth weld geometries were considered: 19mm and 65mm thick girth welds with Rin/t ratios of 10.5 and 2.8, respectively. The FE solver ABAQUS Standard v6.5 was used for analysis. As a preliminary step some improvements were made to the welding simulation procedure used in part one of the project. First, monotonic stress strain curves and a mixed isotropic/kinematic hardening model were sourced from the literature for 316 stainless steel. Second, more detailed information was obtained regarding the geometry and welding sequence for the Case 1 weld (compared with phase 1 of this project). Following the preliminary step, welding simulations were conducted using isotropic, kinematic and mixed hardening models. The isotropic hardening model gave the best overall agreement with experimental measurements; it is therefore recommended for future use in welding simulations. The mixed hardening model gave good agreement for predictions of the hoop stress but tended to underestimate the magnitude of the axial stress. It must be noted that two different sources of data were used for the isotropic and mixed models in this study and this may have contributed to the discrepancy in predictions. When defining a mixed hardening model it is difficult to delineate the relative contributions of isotropic and kinematic hardening and for the model used it may be that a greater isotropic hardening component should have been specified. The kinematic hardening model consistently underestimated the magnitude of both the axial and hoop stress and is not recommended for use. Two sensitivity studies were also conducted.
In the first the effect of using a
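
    The isotropic/kinematic distinction compared in this study can be sketched with a minimal 1D return-mapping plasticity model. All material parameters below are illustrative assumptions, not the weld data used in the report:

```python
import numpy as np

def cyclic_stress(strains, E=200e3, sy=250.0, H=10e3, kinematic=False):
    """1D rate-independent plasticity with linear hardening (return mapping).
    kinematic=False: the yield surface grows (isotropic hardening);
    kinematic=True:  the yield surface translates (back stress, Bauschinger effect)."""
    ep, back, R = 0.0, 0.0, 0.0      # plastic strain, back stress, isotropic expansion
    out = []
    for eps in strains:
        sig_trial = E * (eps - ep)
        f = abs(sig_trial - back) - (sy + R)
        if f > 0:                    # plastic step: return to the yield surface
            dgamma = f / (E + H)
            n = np.sign(sig_trial - back)
            ep += dgamma * n
            if kinematic:
                back += H * dgamma * n
            else:
                R += H * dgamma
        out.append(E * (eps - ep))
    return np.array(out)

# Strain cycle 0 -> +0.4% -> -0.4%: the kinematic model re-yields earlier on reversal
path = np.concatenate([np.linspace(0.0, 0.004, 50), np.linspace(0.004, -0.004, 100)])
iso = cyclic_stress(path)
kin = cyclic_stress(path, kinematic=True)
print(iso.min(), kin.min())   # reversal stress magnitude differs between the two models
```

    Under load reversal (as occurs in multi-pass welding), the isotropic model keeps hardening in both directions while the kinematic model re-yields at a lower stress magnitude, which is one mechanism behind the differing axial-stress predictions reported above.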

  15. Influence of Hardening Model on Weld Residual Stress Distribution

    International Nuclear Information System (INIS)

    Mullins, Jonathan; Gunnars, Jens

    2009-06-01

    This study is the third stage of a project sponsored by the Swedish Radiation Safety Authority (SSM) to improve the weld residual stress modelling procedures currently used in Sweden. The aim of this study was to determine which material hardening model gave the best agreement with experimentally measured weld residual stress distributions. Two girth weld geometries were considered: 19mm and 65mm thick girth welds with Rin/t ratios of 10.5 and 2.8, respectively. The FE solver ABAQUS Standard v6.5 was used for analysis. As a preliminary step some improvements were made to the welding simulation procedure used in part one of the project. First, monotonic stress strain curves and a mixed isotropic/kinematic hardening model were sourced from the literature for 316 stainless steel. Second, more detailed information was obtained regarding the geometry and welding sequence for the Case 1 weld (compared with phase 1 of this project). Following the preliminary step, welding simulations were conducted using isotropic, kinematic and mixed hardening models. The isotropic hardening model gave the best overall agreement with experimental measurements; it is therefore recommended for future use in welding simulations. The mixed hardening model gave good agreement for predictions of the hoop stress but tended to underestimate the magnitude of the axial stress. It must be noted that two different sources of data were used for the isotropic and mixed models in this study and this may have contributed to the discrepancy in predictions. When defining a mixed hardening model it is difficult to delineate the relative contributions of isotropic and kinematic hardening and for the model used it may be that a greater isotropic hardening component should have been specified. The kinematic hardening model consistently underestimated the magnitude of both the axial and hoop stress and is not recommended for use. Two sensitivity studies were also conducted.
In the first the effect of using a

  16. An improved algorithm for connectivity analysis of distribution networks

    International Nuclear Information System (INIS)

    Kansal, M.L.; Devi, Sunita

    2007-01-01

    In the present paper, an efficient algorithm for connectivity analysis of moderately sized distribution networks is suggested. The algorithm is based on the generation of all possible minimal system cutsets. It is efficient because it identifies only the necessary and sufficient system failure conditions in n-out-of-n type distribution networks. The proposed algorithm is demonstrated with the help of saturated and unsaturated distribution networks. The computational efficiency of the algorithm is justified by comparing the computational effort with the previously suggested appended spanning tree (AST) algorithm. The proposed technique has the added advantage that it can be used to generate system inequalities, which are useful in reliability estimation of capacitated networks
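
    The minimal-cutset idea can be sketched as follows. This brute-force enumeration (not the paper's efficient algorithm, which avoids exhaustive search) finds all minimal source-terminal cutsets of a small bridge network:

```python
from itertools import combinations

def connected(nodes, edges, s, t):
    """Depth-first reachability from s to t over undirected edges."""
    adj = {n: set() for n in nodes}
    for u, v in edges:
        adj[u].add(v); adj[v].add(u)
    seen, stack = {s}, [s]
    while stack:
        for w in adj[stack.pop()]:
            if w not in seen:
                seen.add(w); stack.append(w)
    return t in seen

def minimal_cutsets(nodes, edges, s, t):
    """Enumerate minimal s-t cutsets by increasing cardinality."""
    cuts = []
    for k in range(1, len(edges) + 1):
        for subset in combinations(edges, k):
            sub = set(subset)
            if any(c <= sub for c in cuts):   # skip supersets of known cutsets
                continue
            if not connected(nodes, [e for e in edges if e not in sub], s, t):
                cuts.append(sub)
    return cuts

# Bridge network: s-a, s-b, a-b, a-t, b-t
nodes = "sabt"
edges = [("s","a"), ("s","b"), ("a","b"), ("a","t"), ("b","t")]
for c in minimal_cutsets(nodes, edges, "s", "t"):
    print(sorted(c))
```

    For this bridge network the four minimal cutsets are {sa, sb}, {at, bt}, {sa, ab, bt} and {sb, ab, at}; each corresponds to a necessary and sufficient failure condition of the source-terminal connection.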

  17. The κ-generalized distribution: A new descriptive model for the size distribution of incomes

    Science.gov (United States)

    Clementi, F.; Di Matteo, T.; Gallegati, M.; Kaniadakis, G.

    2008-05-01

    This paper proposes the κ-generalized distribution as a model for describing the distribution and dispersion of income within a population. Formulas for the shape, moments and standard tools for inequality measurement-such as the Lorenz curve and the Gini coefficient-are given. A method for parameter estimation is also discussed. The model is shown to fit extremely well the data on personal income distribution in Australia and in the United States.
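
    A hedged sketch of the κ-generalized survival and density functions, assuming the usual κ-exponential definition from the κ-statistics literature (parameter values are illustrative, not fitted to income data):

```python
import numpy as np

def exp_kappa(u, kappa):
    """κ-exponential: reduces to exp(u) as κ → 0."""
    if kappa == 0:
        return np.exp(u)
    return (np.sqrt(1 + kappa**2 * u**2) + kappa * u) ** (1 / kappa)

def survival(x, alpha, beta, kappa):
    """P(X > x) = exp_κ(-β x^α) for the κ-generalized distribution."""
    return exp_kappa(-beta * x**alpha, kappa)

def pdf(x, alpha, beta, kappa):
    """Density obtained by differentiating the survival function."""
    u = beta * x**alpha
    return (alpha * beta * x**(alpha - 1) * exp_kappa(-u, kappa)
            / np.sqrt(1 + (kappa * u)**2))

print(survival(2.0, 2.0, 0.5, 0.0))   # κ=0 recovers the Weibull tail exp(-0.5*2**2) ≈ 0.135
print(survival(2.0, 2.0, 0.5, 0.75))  # κ>0 gives a heavier, Pareto-like tail
```

    The key feature motivating the model is visible here: for κ → 0 the distribution behaves like a Weibull (good fit for low and middle incomes), while for κ > 0 the upper tail decays as a power law (Pareto behavior of top incomes).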

  18. Analysis and Comparison of Typical Models within Distribution Network Design

    DEFF Research Database (Denmark)

    Jørgensen, Hans Jacob; Larsen, Allan; Madsen, Oli B.G.

    This paper investigates the characteristics of typical optimisation models within Distribution Network Design. During the paper fourteen models known from the literature will be thoroughly analysed. Through this analysis a schematic approach to categorisation of distribution network design models...... for educational purposes. Furthermore, the paper can be seen as a practical introduction to network design modelling as well as being a manual or recipe when constructing such a model....

  19. A Unified Framework for Systematic Model Improvement

    DEFF Research Database (Denmark)

    Kristensen, Niels Rode; Madsen, Henrik; Jørgensen, Sten Bay

    2003-01-01

    A unified framework for improving the quality of continuous time models of dynamic systems based on experimental data is presented. The framework is based on an interplay between stochastic differential equation (SDE) modelling, statistical tests and multivariate nonparametric regression. This co......-batch bioreactor, where it is illustrated how an incorrectly modelled biomass growth rate can be pinpointed and an estimate provided of the functional relation needed to properly describe it....

  20. Can better modelling improve tokamak control?

    International Nuclear Information System (INIS)

    Lister, J.B.; Vyas, P.; Ward, D.J.; Albanese, R.; Ambrosino, G.; Ariola, M.; Villone, F.; Coutlis, A.; Limebeer, D.J.N.; Wainwright, J.P.

    1997-01-01

    The control of present day tokamaks usually relies upon primitive modelling and TCV is used to illustrate this. A counter example is provided by the successful implementation of high order SISO controllers on COMPASS-D. Suitable models of tokamaks are required to exploit the potential of modern control techniques. A physics based MIMO model of TCV is presented and validated with experimental closed loop responses. A system identified open loop model is also presented. An enhanced controller based on these models is designed and the performance improvements discussed. (author) 5 figs., 9 refs

  1. Distributed generation system with PEM fuel cell for electrical power quality improvement

    Energy Technology Data Exchange (ETDEWEB)

    Ramirez, D.; Beites, L.F.; Blazquez, F. [Department of Electrical Engineering, ETSII, Escuela de Ingenieros Industriales, Universidad Politecnica de Madrid, C/ Jose Gutierrez Abascal 2, 28006 Madrid (Spain); Ballesteros, J.C. [Endesa Generacion, S.A. c/ Ribera de Loira 60, 28042 Madrid (Spain)

    2008-08-15

    In this paper, a physical model for a distributed generation (DG) system with power quality improvement capability is presented. The generating system consists of a 5 kW PEM fuel cell, a natural gas reformer, hydrogen storage bottles and a bank of ultra-capacitors. Additional power quality functions are implemented with a vector-controlled electronic converter for regulating the injected power. The capabilities of the system were experimentally tested on a scaled electrical network, composed of different lines built with linear inductances and resistances, and taking into account both linear and non-linear loads. The ability to improve power quality was tested by means of different voltage and frequency perturbations produced on the physical model electrical network. (author)

  2. Improvement of the design model for SMART fuel assembly

    International Nuclear Information System (INIS)

    Zee, Sung Kyun; Yim, Jeong Sik

    2001-04-01

    A study on the design improvement of the TEP, BEP and holddown spring of a fuel assembly for SMART was performed. The Cut Boundary Interpolation Method was applied to obtain more accurate stress and strain distributions from the results of the coarse-model calculation. The improved results were compared with those of the coarse model. The finer model predicted slightly higher stress and strain distributions than the coarse model, which meant the results of the coarse model were not converged. Considering that the test results always showed much less stress than the FEM, and given the location of the peak stress in the refined model, the pressure stress at the loading point seemed to contribute significantly to the stresses. Judging from the fact that the peak stress appeared only in a local area, the results of the refined model were considered a sufficiently conservative prediction of the stress levels. The slot of the guide thimble screw was ignored in order to assess how much the thickness of the flow plate could be reduced if the thickness were optimized, and the cut-off screw dent hole was included to represent the actual geometry. For the BEP, the leg and web were also included in the model and the results with and without the leg alignment support were compared. Finally, the holddown spring, which is important for the in-reactor behavior of the FA, was modeled more realistically and improved to include the effects of friction between the leaves and the loading surface. Using this improved model, the spring characteristics were predicted more accurately with respect to the test results. From the analysis of the spring characteristics, the local plastic area dominantly controlled the spring characteristics, which implied that the leaf design needed to be optimized to improve the plastic behavior of the leaf spring

  3. Improved hidden Markov model for nosocomial infections.

    Science.gov (United States)

    Khader, Karim; Leecaster, Molly; Greene, Tom; Samore, Matthew; Thomas, Alun

    2014-12-01

    We propose a novel hidden Markov model (HMM) for parameter estimation in hospital transmission models, and show that commonly made simplifying assumptions can lead to severe model misspecification and poor parameter estimates. A standard HMM that embodies two commonly made simplifying assumptions, namely a fixed patient count and binomially distributed detections, is compared with a new alternative HMM that does not require these simplifying assumptions. Using simulated data, we demonstrate how each of the simplifying assumptions used by the standard model leads to model misspecification, whereas the alternative model results in accurate parameter estimates. © The Authors 2013. Published by Oxford University Press on behalf of the Institute of Mathematics and its Applications. All rights reserved.

  4. Risk Assessment for Distribution Systems Using an Improved PEM-Based Method Considering Wind and Photovoltaic Power Distribution

    Directory of Open Access Journals (Sweden)

    Qingwu Gong

    2017-03-01

    Full Text Available The intermittency and variability of distributed generators (DGs) with high penetration could cause many critical security and economy risks to distribution systems. This paper applied suitable probability distributions to model the output variability and uncertainty of DGs. Then, four risk indices—EENS (expected energy not supplied), PLC (probability of load curtailment), EFLC (expected frequency of load curtailment), and SI (severity index)—were established to reflect the system risk level of the distribution system. For the given probability distributions of the DGs' output power, an improved PEM (point estimate method)-based method was proposed to calculate these four system risk indices. In this improved PEM-based method, an enumeration method was used to list the states of distribution systems, an improved PEM was developed to deal with the uncertainties of DGs, and the value of load curtailment in distribution systems was calculated by an optimal power flow algorithm. Finally, the effectiveness and advantages of this proposed PEM-based method for distribution system assessment were verified by testing a modified IEEE 30-bus system. Simulation results have shown that this proposed PEM-based method achieves high computational accuracy at greatly reduced computational cost compared with other risk assessment methods, and is very effective for risk assessments.
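
    The point-estimate idea can be illustrated with Hong's 2m-point scheme for symmetric (zero-skewness) inputs; the toy function and parameters below are assumptions for illustration, not the paper's risk indices or its improved PEM:

```python
import numpy as np

def pem_2m(f, mu, sigma):
    """Hong's 2m point estimate method, assuming zero-skewness inputs:
    approximates the mean/std of Y = f(X) from 2m deterministic evaluations,
    instead of thousands of Monte Carlo samples."""
    m = len(mu)
    xi = np.sqrt(m)                  # standard locations for symmetric inputs
    ey, ey2 = 0.0, 0.0
    for k in range(m):
        for s in (+1, -1):
            x = np.array(mu, float)
            x[k] += s * xi * sigma[k]    # perturb one input, others at their means
            y = f(x)
            ey  += y / (2 * m)
            ey2 += y**2 / (2 * m)
    return ey, np.sqrt(max(ey2 - ey**2, 0.0))

# Toy "risk index" as a function of wind and PV output (hypothetical numbers)
f = lambda x: 2 * x[0] + x[1] + 0.5 * x[0] * x[1]
mean, std = pem_2m(f, mu=[1.0, 2.0], sigma=[0.2, 0.5])
print(mean, std)
```

    With two uncertain inputs the scheme needs only four power-flow-style evaluations, which is the source of the computational savings the abstract reports relative to sampling-based risk assessment.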

  5. A Distributed Hydrological model Forced by DMIP2 Data and the WRF Mesoscale model

    Science.gov (United States)

    Wayand, N. E.

    2010-12-01

    Forecasted warming over the next century will drastically reduce the seasonal snowpack that provides 40% of the world’s drinking water. With increased climate warming, droughts may occur more frequently, which will increase society’s reliance on this same summer snowpack as a water supply. This study aims to reduce driving-data errors that lead to poor simulations of snow accumulation and ablation, and of streamflow. Results from the Distributed Model Intercomparison Project Phase 2 (DMIP2) using the Distributed Hydrology Soil and Vegetation Model (DHSVM) highlighted the critical need for the accurate driving data that distributed models require. Currently, the meteorological driving data for distributed hydrological models commonly rely on interpolation techniques between a network of observational stations, as well as historical monthly means. This method is limited by two significant issues: snowpack is stored at high elevations, where interpolation techniques perform poorly due to sparse observations, and historic climatological means may be unsuitable in a changing climate. Mesoscale models may provide a physically-based approach to supplement surface observations over high-elevation terrain. Initial results have shown that while temperature lapse rates are well represented by multiple mesoscale models, significant precipitation biases are dependent on the particular model microphysics. We evaluate multiple methods of downscaling surface variables from the Weather Research and Forecasting (WRF) model that are then used to drive DHSVM over the North Fork American River basin in California. A comparison between each downscaled driving data set and paired DHSVM results to observations will determine how much improvement in simulated streamflow and snowpack is gained at the expense of each additional degree of downscaling. Our results from DMIP2 will be used as a benchmark for the best available DHSVM run using all available observational data. The

  6. Supply Chain Synchronization: Improving Distribution Velocity to the Theatre

    Science.gov (United States)

    2009-06-01


  7. Deterioration and optimal rehabilitation modelling for urban water distribution systems

    NARCIS (Netherlands)

    Zhou, Y.

    2018-01-01

    Pipe failures in water distribution systems can have a serious impact and hence it’s important to maintain the condition and integrity of the distribution system. This book presents a whole-life cost optimisation model for the rehabilitation of water distribution systems. It combines a pipe breakage

  8. Bayesian Nonparametric Model for Estimating Multistate Travel Time Distribution

    Directory of Open Access Journals (Sweden)

    Emmanuel Kidando

    2017-01-01

    Full Text Available Multistate models, that is, models with more than two distributions, are preferred over single-state probability models in modeling the distribution of travel time. Literature review indicated that the finite multistate modeling of travel time using lognormal distribution is superior to other probability functions. In this study, we extend the finite multistate lognormal model of estimating the travel time distribution to unbounded lognormal distribution. In particular, a nonparametric Dirichlet Process Mixture Model (DPMM) with stick-breaking process representation was used. The strength of the DPMM is that it can choose the number of components dynamically as part of the algorithm during parameter estimation. To reduce computational complexity, the modeling process was limited to a maximum of six components. Then, the Markov Chain Monte Carlo (MCMC) sampling technique was employed to estimate the parameters’ posterior distribution. Speed data from nine links of a freeway corridor, aggregated on a 5-minute basis, were used to calculate the corridor travel time. The results demonstrated that this model offers significant flexibility in modeling to account for complex mixture distributions of the travel time without specifying the number of components. The DPMM modeling further revealed that freeway travel time is characterized by multistate or single-state models depending on the inclusion of onset and offset of congestion periods.
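
    The truncated stick-breaking construction described above can be sketched as a generative draw (component parameters are illustrative, not estimates from the paper's speed data; the full method infers them by MCMC):

```python
import numpy as np

rng = np.random.default_rng(42)

def stick_breaking(alpha, max_k=6):
    """Truncated stick-breaking weights: v_k ~ Beta(1, alpha),
    w_k = v_k * prod_{j<k} (1 - v_j); the last stick takes the remainder."""
    v = rng.beta(1.0, alpha, size=max_k)
    v[-1] = 1.0                      # truncate so the weights sum to 1
    w = v * np.concatenate(([1.0], np.cumprod(1 - v[:-1])))
    return w

# One DPMM realisation: lognormal components for link travel time (minutes)
alpha = 1.0                          # concentration: small alpha -> few large weights
w = stick_breaking(alpha)
mu = rng.normal(np.log(5.0), 0.5, size=len(w))   # hypothetical component log-means
z = rng.choice(len(w), size=10_000, p=w)         # component assignments
t = rng.lognormal(mu[z], 0.25)                   # travel-time samples

print(w.round(3))
```

    Because the Beta(1, α) sticks shrink geometrically, most weight typically concentrates on the first few components, which is how the model "chooses" an effective number of states without it being fixed in advance.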

  9. Improving the physiological realism of experimental models.

    Science.gov (United States)

    Vinnakota, Kalyan C; Cha, Chae Y; Rorsman, Patrik; Balaban, Robert S; La Gerche, Andre; Wade-Martins, Richard; Beard, Daniel A; Jeneson, Jeroen A L

    2016-04-06

    The Virtual Physiological Human (VPH) project aims to develop integrative, explanatory and predictive computational models (C-Models) as numerical investigational tools to study disease, identify and design effective therapies and provide an in silico platform for drug screening. Ultimately, these models rely on the analysis and integration of experimental data. As such, the success of VPH depends on the availability of physiologically realistic experimental models (E-Models) of human organ function that can be parametrized to test the numerical models. Here, the current state of suitable E-models, ranging from in vitro non-human cell organelles to in vivo human organ systems, is discussed. Specifically, challenges and recent progress in improving the physiological realism of E-models that may benefit the VPH project are highlighted and discussed using examples from the field of research on cardiovascular disease, musculoskeletal disorders, diabetes and Parkinson's disease.

  10. Radar meteors range distribution model. I. Theory

    Czech Academy of Sciences Publication Activity Database

    Pecinová, Drahomíra; Pecina, Petr

    2007-01-01

    Vol. 37, No. 2 (2007), pp. 83-106 ISSN 1335-1842 R&D Projects: GA ČR GA205/03/1405 Institutional research plan: CEZ:AV0Z10030501 Keywords: physics of meteors * radar meteors * range distribution Subject RIV: BN - Astronomy, Celestial Mechanics, Astrophysics

  11. Information Distribution in Complex Systems to Improve Team Performance

    National Research Council Canada - National Science Library

    Sperling, Brian K; Pritchett, Amy; Estrada, Arthur; Adam, Gina E

    2006-01-01

    .... Specifically, this study hypothesizes that providing task specific information to individual team members will improve coordination and decision-making, and therefore team performance, at time-critical tasks...

  12. A Hierarchical Modeling for Reactive Power Optimization With Joint Transmission and Distribution Networks by Curve Fitting

    DEFF Research Database (Denmark)

    Ding, Tao; Li, Cheng; Huang, Can

    2018-01-01

    In order to solve the reactive power optimization with joint transmission and distribution networks, a hierarchical modeling method is proposed in this paper. It allows the reactive power optimization of transmission and distribution networks to be performed separately, leading to a master–slave structure, and improves traditional centralized modeling methods by alleviating the big data problem in a control center. Specifically, the transmission-distribution-network coordination issue of the hierarchical modeling method is investigated. First, a curve-fitting approach is developed to provide a cost...... optimality. Numerical results on two test systems verify the effectiveness of the proposed hierarchical modeling and curve-fitting methods....

  13. Improvement and Validation of Weld Residual Stress Modelling Procedure

    International Nuclear Information System (INIS)

    Zang, Weilin; Gunnars, Jens; Dong, Pingsha; Hong, Jeong K.

    2009-06-01

    The objective of this work is to identify and evaluate improvements for the residual stress modelling procedure currently used in Sweden. There is a growing demand to eliminate any unnecessary conservatism involved in residual stress assumptions. The study was focused on the development and validation of an improved weld residual stress modelling procedure, by taking advantage of the recent advances in residual stress modelling and stress measurement techniques. The major changes applied in the new weld residual stress modelling procedure are: - Improved procedure for heat source calibration based on use of analytical solutions. - Use of an isotropic hardening model where mixed hardening data is not available. - Use of an annealing model for improved simulation of strain relaxation in re-heated material. The new modelling procedure is demonstrated to capture the main characteristics of the through thickness stress distributions by validation to experimental measurements. Three austenitic stainless steel butt-welds cases are analysed, covering a large range of pipe geometries. From the cases it is evident that there can be large differences between the residual stresses predicted using the new procedure, and the earlier procedure or handbook recommendations. Previously recommended profiles could give misleading fracture assessment results. The stress profiles according to the new procedure agree well with the measured data. If data is available then a mixed hardening model should be used

  14. Improvement and Validation of Weld Residual Stress Modelling Procedure

    Energy Technology Data Exchange (ETDEWEB)

    Zang, Weilin; Gunnars, Jens (Inspecta Technology AB, Stockholm (Sweden)); Dong, Pingsha; Hong, Jeong K. (Center for Welded Structures Research, Battelle, Columbus, OH (United States))

    2009-06-15

    The objective of this work is to identify and evaluate improvements for the residual stress modelling procedure currently used in Sweden. There is a growing demand to eliminate any unnecessary conservatism involved in residual stress assumptions. The study was focused on the development and validation of an improved weld residual stress modelling procedure, by taking advantage of the recent advances in residual stress modelling and stress measurement techniques. The major changes applied in the new weld residual stress modelling procedure are: - Improved procedure for heat source calibration based on use of analytical solutions. - Use of an isotropic hardening model where mixed hardening data is not available. - Use of an annealing model for improved simulation of strain relaxation in re-heated material. The new modelling procedure is demonstrated to capture the main characteristics of the through thickness stress distributions by validation to experimental measurements. Three austenitic stainless steel butt-welds cases are analysed, covering a large range of pipe geometries. From the cases it is evident that there can be large differences between the residual stresses predicted using the new procedure, and the earlier procedure or handbook recommendations. Previously recommended profiles could give misleading fracture assessment results. The stress profiles according to the new procedure agree well with the measured data. If data is available then a mixed hardening model should be used

  15. Research on the control strategy of distributed energy resources inverter based on improved virtual synchronous generator.

    Science.gov (United States)

    Gao, Changwei; Liu, Xiaoming; Chen, Hai

    2017-08-22

    This paper focuses on the power fluctuations of the virtual synchronous generator (VSG) during the transition process. An improved virtual synchronous generator (IVSG) control strategy based on feed-forward compensation is proposed. The adjustable parameter of the compensation section can be modified to reduce the order of the system, which effectively suppresses the power fluctuations of the VSG in the transient process. To verify the effectiveness of the proposed control strategy for distributed energy resources inverters, a simulation model is set up on the MATLAB/SIMULINK platform and a physical experiment platform is established. Simulation and experiment results demonstrate the effectiveness of the proposed IVSG control strategy.
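
    As an illustration of the order-reduction idea, the sketch below simulates a generic second-order VSG swing response to a power-reference step and compares it with a response whose inertial lag is cancelled by a hypothetical feed-forward term. The model structure, all parameter values (J, D, K) and the form of the compensation gain are illustrative assumptions, not taken from the paper.

```python
# Hedged sketch: second-order VSG swing dynamics vs. a reduced-order
# response obtained by a feed-forward compensation term. All parameters
# (J, D, K, the compensation gain) are illustrative, not from the paper.
def step_response(J=0.5, D=2.0, K=50.0, feed_forward=False, T=5.0, dt=1e-3):
    delta, omega = 0.0, 0.0   # rotor angle deviation, speed deviation
    p_ref = 1.0               # step in power reference (pu)
    trace = []
    t = 0.0
    while t < T:
        p_out = K * delta                      # simplified power-angle relation
        if feed_forward:
            # compensation term cancels the inertial lag (order reduction)
            u = p_ref - p_out - (J * K / D) * omega
        else:
            u = p_ref - p_out
        domega = (u - D * omega) / J           # swing equation
        omega += domega * dt
        delta += omega * dt
        trace.append(p_out)
        t += dt
    return trace

plain = step_response(feed_forward=False)
comp = step_response(feed_forward=True)
print(max(plain), max(comp))  # compensation should cut the power overshoot
```

With these illustrative numbers the uncompensated response is underdamped and overshoots, while the compensated response approaches the reference monotonically, mirroring the suppression of transient power fluctuations described above.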

  16. Improving Distribution of Military Programs’ Technical Criteria

    Science.gov (United States)

    1993-08-01

    Vacuum System ETL 1110-3-380 01/29/88 Std Distribution of Military Airfield Pavement Dsg ETL 1110-3-381 01/29/88 Airfield Pavement Design ETL 1110-3...Army Airfield O&M Facilities TM 5-825-2 08/01/78 Flexible Pavement Design for Airfields TM 5-825-2-1 11/01/89 Army Airfields Pavements, Flex (Appendix

  17. Making Improvements to The Army Distributed Learning Program

    Science.gov (United States)

    2012-01-01

    ing focuses on leadership and management as well as technical skills, and involves the creation of global virtual teams. The training often deals...develop and distribute knowledge via a dynamic, global knowledge network called the Battle Command Knowledge System with a purpose of providing...Levels of Interactivity,” paper presented at 2006 dL Workshop, March 14, 2006. Wexler, S., et al., E-Learning 2.0., Santa Rosa, Calif.: The eLearning

  18. Modelling the distribution of pig production and diseases in Thailand

    OpenAIRE

    Thanapongtharm, Weerapong

    2015-01-01

    This thesis, entitled “Modelling the distribution of pig production and diseases in Thailand”, presents many aspects of pig production in Thailand, including the characteristics of the pig farming system, the distribution of the pig population and pig farms, the spatio-temporal distribution and risk of the most important pig diseases at present, and areas suitable for pig farming. The spatial distribution and characteristics of pig farming in Thailand were studied using time-series pig population data to des...

  19. Modeling and optimization of an electric power distribution network ...

    African Journals Online (AJOL)

    Modeling and optimization of an electric power distribution network planning system using ... of the network was modelled with non-linear mathematical expressions. ... given feasible locations, re-conductoring of existing feeders in the network, ...

  20. Bilinear reduced order approximate model of parabolic distributed solar collectors

    KAUST Repository

    Elmetennani, Shahrazed; Laleg-Kirati, Taous-Meriem

    2015-01-01

    This paper proposes a novel, low dimensional and accurate approximate model for the distributed parabolic solar collector, by means of a modified gaussian interpolation along the spatial domain. The proposed reduced model, taking the form of a low

  1. A penalized framework for distributed lag non-linear models.

    Science.gov (United States)

    Gasparrini, Antonio; Scheipl, Fabian; Armstrong, Ben; Kenward, Michael G

    2017-09-01

    Distributed lag non-linear models (DLNMs) are a modelling tool for describing potentially non-linear and delayed dependencies. Here, we illustrate an extension of the DLNM framework through the use of penalized splines within generalized additive models (GAM). This extension offers built-in model selection procedures and the possibility of accommodating assumptions on the shape of the lag structure through specific penalties. In addition, this framework includes, as special cases, simpler models previously proposed for linear relationships (DLMs). Alternative versions of penalized DLNMs are compared with each other and with the standard unpenalized version in a simulation study. Results show that this penalized extension to the DLNM class provides greater flexibility and improved inferential properties. The framework exploits recent theoretical developments of GAMs and is implemented using efficient routines within freely available software. Real-data applications are illustrated through two reproducible examples in time series and survival analysis. © 2017 The Authors Biometrics published by Wiley Periodicals, Inc. on behalf of International Biometric Society.
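
    The flavour of a penalized distributed-lag fit can be sketched with the simpler linear (DLM) special case mentioned above: lag coefficients shrunk toward smoothness by a second-difference ridge-type penalty, standing in for the penalized-spline machinery of the full DLNM framework. The data, the true lag curve and the penalty weight below are synthetic, for illustration only.

```python
# Hedged sketch of a penalized distributed-lag model (linear DLM case):
# lag coefficients are shrunk toward a smooth curve by a second-difference
# penalty, in the spirit of the penalized-spline DLNMs described above.
import numpy as np

rng = np.random.default_rng(0)
n, L = 300, 10                                   # series length, max lag
x = rng.normal(size=n + L)                       # exposure series
true_curve = np.exp(-np.arange(L + 1) / 3.0)     # smoothly decaying lag effect

# lag matrix: column l holds the exposure lagged by l steps
X = np.column_stack([x[L - l : n + L - l] for l in range(L + 1)])
y = X @ true_curve + rng.normal(scale=0.5, size=n)

# second-difference penalty matrix D'D enforces a smooth lag curve
D = np.diff(np.eye(L + 1), n=2, axis=0)
lam = 10.0
beta = np.linalg.solve(X.T @ X + lam * D.T @ D, X.T @ y)
print(np.round(beta, 2))    # estimated lag curve, close to true_curve
```

The penalty weight `lam` plays the role that automatic smoothing-parameter selection plays in the GAM-based framework.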

  2. An Improved Valuation Model for Technology Companies

    Directory of Open Access Journals (Sweden)

    Ako Doffou

    2015-06-01

    This paper estimates some of the parameters of the Schwartz and Moon (2001) model using cross-sectional data. Stochastic costs, future financing, capital expenditures and depreciation are taken into account. Some special conditions are also set: the speed-of-adjustment parameters are equal; the implied half-life of the sales growth process is linked to analyst forecasts; and the risk-adjustment parameter is inferred from the company’s observed stock price beta. The model is illustrated in the valuation of Google, Amazon, eBay, Facebook and Yahoo. The improved model is far superior to the Schwartz and Moon (2001) model.

  3. Designing the Distributed Model Integration Framework – DMIF

    NARCIS (Netherlands)

    Belete, Getachew F.; Voinov, Alexey; Morales, Javier

    2017-01-01

    We describe and discuss the design and prototype of the Distributed Model Integration Framework (DMIF) that links models deployed on different hardware and software platforms. We used distributed computing and service-oriented development approaches to address the different aspects of

  4. A Modeling Framework for Schedulability Analysis of Distributed Avionics Systems

    DEFF Research Database (Denmark)

    Han, Pujie; Zhai, Zhengjun; Nielsen, Brian

    2018-01-01

    This paper presents a modeling framework for schedulability analysis of distributed integrated modular avionics (DIMA) systems that consist of spatially distributed ARINC-653 modules connected by a unified AFDX network. We model a DIMA system as a set of stopwatch automata (SWA) in UPPAAL...

  5. Building predictive models of soil particle-size distribution

    Directory of Open Access Journals (Sweden)

    Alessandro Samuel-Rosa

    2013-04-01

    Is it possible to build predictive models (PMs) of soil particle-size distribution (psd) in a region with complex geology and a young and unstable land surface? The main objective of this study was to answer this question. A set of 339 soil samples from a small slope catchment in Southern Brazil was used to build PMs of psd in the surface soil layer. Multiple linear regression models were constructed using terrain attributes (elevation, slope, catchment area, convergence index, and topographic wetness index). The PMs explained more than half of the data variance. This performance is similar to (or even better than) that of the conventional soil mapping approach. For some size fractions, the PM performance can reach 70%. The largest uncertainties were observed in geologically more complex areas. Therefore, significant improvements in the predictions can only be achieved if accurate geological data are made available. Meanwhile, PMs built on terrain attributes are efficient in predicting the particle-size distribution (psd) of soils in regions of complex geology.

  6. Improvement of MARS code reflood model

    International Nuclear Information System (INIS)

    Hwang, Moonkyu; Chung, Bub-Dong

    2011-01-01

    A specifically designed heat transfer model for the reflood process, which normally occurs at low flow and low pressure, was originally incorporated in the MARS code. The model is essentially identical to that of the RELAP5/MOD3.3 code. The model, however, is known to have under-estimated the peak cladding temperature (PCT) with earlier turn-over. In this study, the original MARS code reflood model is improved. Based on extensive sensitivity studies for both hydraulic and wall heat transfer models, it is found that the dispersed flow film boiling (DFFB) wall heat transfer is the most influential process determining the PCT, whereas the interfacial drag model most affects the quenching time through the liquid carryover phenomenon. The model proposed by Bajorek and Young is incorporated for the DFFB wall heat transfer. Both space grid and droplet enhancement models are incorporated. Inverted annular film boiling (IAFB) is modeled by using the original PSI model of the code. The flow transition between the DFFB and IAFB is modeled using the TRACE code interpolation. A gas velocity threshold is also added to limit the top-down quenching effect. Assessment calculations are performed for the original and modified MARS codes for the Flecht-Seaset test and RBHT test. Improvements are observed in terms of the PCT and quenching time predictions in the Flecht-Seaset assessment. In the case of the RBHT assessment, the improvement over the original MARS code is found to be marginal. A space grid effect, however, is clearly seen with the modified version of the MARS code. (author)

  7. Study on isotopic distribution produced by nucleus-nucleus collisions with modified SAA model

    International Nuclear Information System (INIS)

    Zhong Chen; Fang Deqing; Cai Xiangzhou; Shen Wenqing; Zhang Huyong; Wei Yibin; Ma Yugang

    2003-01-01

    Based on Brohm's Statistical Abrasion-Ablation (SAA) model, a modified SAA model was developed by introducing the isospin dependence of the nucleon distribution in the nucleus and parameterized formulas for the nucleon-nucleon cross section in nuclear matter. It can simulate the isotopic distribution well at both high and intermediate energies. By improving the computational method, the range of the calculated isotopic distribution can be increased from three orders of magnitude to eight orders of magnitude (or even higher). The model can reproduce experimental data and predict the isotopic distribution for nuclei very far from the stability line, which is very important from an experimental viewpoint.

  8. Improved dust representation in the Community Atmosphere Model

    Science.gov (United States)

    Albani, S.; Mahowald, N. M.; Perry, A. T.; Scanza, R. A.; Zender, C. S.; Heavens, N. G.; Maggi, V.; Kok, J. F.; Otto-Bliesner, B. L.

    2014-09-01

    Aerosol-climate interactions constitute one of the major sources of uncertainty in assessing changes in aerosol forcing in the Anthropocene as well as understanding glacial-interglacial cycles. Here we focus on improving the representation of mineral dust in the Community Atmosphere Model and assessing the impacts of the improvements in terms of direct effects on the radiative balance of the atmosphere. We simulated the dust cycle using different parameterization sets for dust emission, size distribution, and optical properties. Comparing the results of these simulations with observations of concentration, deposition, and aerosol optical depth allows us to refine the representation of the dust cycle and its climate impacts. We propose a tuning method for dust parameterizations to allow the dust module to work across the wide variety of parameter settings which can be used within the Community Atmosphere Model. Our results include a better representation of the dust cycle, most notably for the improved size distribution. The estimated net top-of-atmosphere direct dust radiative forcing is -0.23 ± 0.14 W/m2 for the present day and -0.32 ± 0.20 W/m2 at the Last Glacial Maximum. From our study and sensitivity tests, we also derive some general relevant findings, supporting the concept that the magnitude of the modeled dust cycle is sensitive to the observational data sets and size distribution chosen to constrain the model as well as the meteorological forcing data, even within the same modeling framework, and that the direct radiative forcing of dust is strongly sensitive to the optical properties and size distribution used.

  9. Determining the factors affecting the distribution of Muscari latifolium, an endemic plant of Turkey, and a mapping species distribution model.

    Science.gov (United States)

    Yilmaz, Hatice; Yilmaz, Osman Yalçın; Akyüz, Yaşar Feyza

    2017-02-01

    Species distribution modeling was used to determine factors among a large set of candidate predictors that affect the distribution of Muscari latifolium, an endemic bulbous plant species of Turkey, to quantify the relative importance of each factor and to make a potential spatial distribution map of M. latifolium. Models were built using the Boosted Regression Trees method based on 35 presence and 70 absence records obtained through field sampling in the Gönen Dam watershed area of the Kazdağı Mountains in West Anatolia. Large sets of candidate variables of monthly and seasonal climate, fine-scale land surface, and geologic and biotic variables were simplified using a BRT simplification procedure. Analyses performed on these resource, direct and indirect variables showed that there were 14 main factors that influence the species' distribution. Five of the 14 most important variables influencing the distribution of the species are bedrock type, Quercus cerris density, precipitation during the wettest month, Pinus nigra density, and northness. These variables account for approximately 60% of the relative importance for determining the distribution of the species. Prediction performance was assessed by 10 random subsample data sets and gave a maximum area under the receiver operating characteristic curve (AUC) of 0.93 and an average AUC of 0.8. This study provides a significant contribution to knowledge of the habitat requirements and ecological characteristics of this species. The distribution of this species is explained by a combination of biotic and abiotic factors. Hence, using biotic interaction and fine-scale land surface variables in species distribution models improved the accuracy and precision of the model. Knowledge of the relationships between distribution patterns, environmental factors and biotic interactions of M. latifolium can help develop a management and conservation strategy for this species.

  10. A generalized statistical model for the size distribution of wealth

    International Nuclear Information System (INIS)

    Clementi, F; Gallegati, M; Kaniadakis, G

    2012-01-01

    In a recent paper in this journal (Clementi et al 2009 J. Stat. Mech. P02037), we proposed a new, physically motivated, distribution function for modeling individual incomes, having its roots in the framework of the κ-generalized statistical mechanics. The performance of the κ-generalized distribution was checked against real data on personal income for the United States in 2003. In this paper we extend our previous model so as to be able to account for the distribution of wealth. Probabilistic functions and inequality measures of this generalized model for wealth distribution are obtained in closed form. In order to check the validity of the proposed model, we analyze the US household wealth distributions from 1984 to 2009 and conclude an excellent agreement with the data that is superior to any other model already known in the literature. (paper)
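
    The κ-generalized survival function at the heart of this model family can be written down directly: S(x) = exp_κ(-βx^α), where exp_κ(u) = (√(1+κ²u²) + κu)^(1/κ) reduces to the ordinary exponential as κ → 0. A minimal sketch, with illustrative parameter values rather than those fitted to the US data:

```python
# Hedged sketch of the kappa-generalized survival function used in this
# line of work: S(x) = exp_k(-beta * x**alpha). Parameters are illustrative.
import math

def exp_kappa(u, kappa):
    if kappa == 0:
        return math.exp(u)
    return (math.sqrt(1.0 + kappa**2 * u**2) + kappa * u) ** (1.0 / kappa)

def survival(x, alpha=2.0, beta=1.0, kappa=0.7):
    return exp_kappa(-beta * x**alpha, kappa)

# small x: behaves like the stretched exponential exp(-beta * x**alpha);
# large x: Pareto-like power-law tail ~ x**(-alpha/kappa)
for x in (0.1, 1.0, 10.0):
    print(x, survival(x))
```

The interpolation between an exponential-type bulk and a power-law tail is what lets a single functional form cover both the body and the upper tail of income and wealth data.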

  11. A generalized statistical model for the size distribution of wealth

    Science.gov (United States)

    Clementi, F.; Gallegati, M.; Kaniadakis, G.

    2012-12-01

    In a recent paper in this journal (Clementi et al 2009 J. Stat. Mech. P02037), we proposed a new, physically motivated, distribution function for modeling individual incomes, having its roots in the framework of the κ-generalized statistical mechanics. The performance of the κ-generalized distribution was checked against real data on personal income for the United States in 2003. In this paper we extend our previous model so as to be able to account for the distribution of wealth. Probabilistic functions and inequality measures of this generalized model for wealth distribution are obtained in closed form. In order to check the validity of the proposed model, we analyze the US household wealth distributions from 1984 to 2009 and conclude an excellent agreement with the data that is superior to any other model already known in the literature.

  12. Electric Power Distribution System Model Simplification Using Segment Substitution

    Energy Technology Data Exchange (ETDEWEB)

    Reiman, Andrew P.; McDermott, Thomas E.; Akcakaya, Murat; Reed, Gregory F.

    2018-05-01

    Quasi-static time-series (QSTS) simulation is used to simulate the behavior of distribution systems over long periods of time (typically hours to years). The technique involves repeatedly solving the load-flow problem for a distribution system model and is useful for distributed energy resource (DER) planning. When a QSTS simulation has a small time step and a long duration, the computational burden of the simulation can be a barrier to integration into utility workflows. One way to relieve the computational burden is to simplify the system model. The segment substitution method of simplifying distribution system models introduced in this paper offers model bus reduction of up to 98% with a simplification error as low as 0.2% (0.002 pu voltage). In contrast to existing methods of distribution system model simplification, which rely on topological inspection and linearization, the segment substitution method uses black-box segment data and an assumed simplified topology.
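
    The black-box idea can be illustrated with a toy radial feeder: the detailed segment is probed only at its boundary and replaced by an equivalent source and series impedance, with no inspection of the interior topology. The impedances and loads below are invented for illustration; because this toy model is linear (constant-current loads), the equivalent is exact here, whereas for real feeders with voltage-dependent loads the substitution is approximate, in line with the small errors reported above.

```python
# Hedged sketch of the black-box idea behind segment substitution: probe a
# multi-bus feeder segment at its boundary, then replace it with a
# Thevenin-style equivalent. Values are made up for illustration.
z = [0.01 + 0.02j] * 5                 # per-section series impedance (pu)
tap = [0.1 + 0.05j] * 4                # constant-current loads at interior buses

def end_voltage(load_current, v_src=1.0 + 0j):
    """Exact voltage at the segment end through the detailed model."""
    v = v_src
    for i, zi in enumerate(z):
        # current through section i = end load + all taps downstream of it
        downstream = sum(tap[i:])
        v -= zi * (load_current + downstream)
    return v

# black-box probing: two boundary "measurements" identify the equivalent
v_open = end_voltage(0j)               # no end load
v_test = end_voltage(1.0 + 0j)         # unit test load
z_eq = (v_open - v_test) / (1.0 + 0j)  # equivalent series impedance

def end_voltage_simplified(load_current):
    # simplified model: source v_open behind z_eq
    return v_open - z_eq * load_current

err = abs(end_voltage(0.5 + 0.2j) - end_voltage_simplified(0.5 + 0.2j))
print(z_eq, err)
```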

  13. Electric Power Distribution System Model Simplification Using Segment Substitution

    International Nuclear Information System (INIS)

    Reiman, Andrew P.; McDermott, Thomas E.; Akcakaya, Murat; Reed, Gregory F.

    2017-01-01

    Quasi-static time-series (QSTS) simulation is used to simulate the behavior of distribution systems over long periods of time (typically hours to years). The technique involves repeatedly solving the load-flow problem for a distribution system model and is useful for distributed energy resource (DER) planning. When a QSTS simulation has a small time step and a long duration, the computational burden of the simulation can be a barrier to integration into utility workflows. One way to relieve the computational burden is to simplify the system model. The segment substitution method of simplifying distribution system models introduced in this paper offers model bus reduction of up to 98% with a simplification error as low as 0.2% (0.002 pu voltage). Finally, in contrast to existing methods of distribution system model simplification, which rely on topological inspection and linearization, the segment substitution method uses black-box segment data and an assumed simplified topology.

  14. Robustness of a Distributed Knowledge Management Model

    DEFF Research Database (Denmark)

    Pedersen, Mogens Kühn; Larsen, Michael Holm

    1999-01-01

    Knowledge management based on symmetric incentives is rarely found in literature. A knowledge exchange model relies upon a double loop knowledge conversion with symmetric incentives in a network. The model merges specific knowledge with knowledge from other actors into a decision support system...

  15. Improved model for solar heating of buildings

    OpenAIRE

    Lie, Bernt

    2015-01-01

    A considerable future increase in the global energy use is expected, and the effects of energy conversion on the climate are already observed. Future energy conversion should thus be based on resources that have negligible climate effects; solar energy is perhaps the most important such resource. The presented work builds on a previous complete model for solar heating of a house; here the aim is to introduce ventilation heat recovery and to improve the hot water storage model. Ventilation he...

  16. Improving CMS data transfers among its distributed computing facilities

    CERN Document Server

    Flix, J; Sartirana, A

    2001-01-01

    CMS computing needs reliable, stable and fast connections among multi-tiered computing infrastructures. For data distribution, the CMS experiment relies on a data placement and transfer system, PhEDEx, managing replication operations at each site in the distribution network. PhEDEx uses the File Transfer Service (FTS), a low level data movement service responsible for moving sets of files from one site to another, while allowing participating sites to control the network resource usage. FTS servers are provided by Tier-0 and Tier-1 centres and are used by all computing sites in CMS, according to the established policy. FTS needs to be set up according to the Grid site's policies, and properly configured to satisfy the requirements of all Virtual Organizations making use of the Grid resources at the site. Managing the service efficiently requires good knowledge of the CMS needs for all kinds of transfer workflows. This contribution deals with a revision of FTS servers used by CMS, collecting statistics on thei...

  17. Improving CMS data transfers among its distributed computing facilities

    CERN Document Server

    Flix, Jose

    2010-01-01

    CMS computing needs reliable, stable and fast connections among multi-tiered computing infrastructures. For data distribution, the CMS experiment relies on a data placement and transfer system, PhEDEx, managing replication operations at each site in the distribution network. PhEDEx uses the File Transfer Service (FTS), a low level data movement service responsible for moving sets of files from one site to another, while allowing participating sites to control the network resource usage. FTS servers are provided by Tier-0 and Tier-1 centres and are used by all computing sites in CMS, according to the established policy. FTS needs to be set up according to the Grid site's policies, and properly configured to satisfy the requirements of all Virtual Organizations making use of the Grid resources at the site. Managing the service efficiently requires good knowledge of the CMS needs for all kinds of transfer workflows. This contribution deals with a revision of FTS servers used by CMS, collecting statistics on the...

  18. Neuronal model with distributed delay: analysis and simulation study for gamma distribution memory kernel.

    Science.gov (United States)

    Karmeshu; Gupta, Varun; Kadambari, K V

    2011-06-01

    A single neuronal model incorporating distributed delay (memory) is proposed. The stochastic model has been formulated as a Stochastic Integro-Differential Equation (SIDE), which results in the underlying process being non-Markovian. A detailed analysis of the model when the distributed delay kernel has exponential form (weak delay) has been carried out. The selection of an exponential kernel has enabled the transformation of the non-Markovian model to a Markovian model in an extended state space. For the study of the First Passage Time (FPT) with an exponential delay kernel, the model has been transformed into a system of coupled Stochastic Differential Equations (SDEs) in a two-dimensional state space. Simulation studies of the SDEs provide insight into the effect of the weak delay kernel on the Inter-Spike Interval (ISI) distribution. A measure based on the Jensen-Shannon divergence is proposed which can be used to make a choice between two competing models, viz. the distributed delay model vis-à-vis the LIF model. An interesting feature of the model is that the behavior of the coefficient of variation CV_ISI(t) of the ISI distribution with respect to the memory kernel time constant parameter η reveals that the neuron can switch from a bursting state to a non-bursting state as the noise intensity parameter changes. The membrane potential exhibits a decaying auto-correlation structure with or without damped oscillatory behavior, depending on the choice of parameters. This behavior is in agreement with the empirically observed pattern of spike count in a fixed time window. The power spectral density derived from the auto-correlation function is found to exhibit single and double peaks. The model is also examined for the case of strong delay, with the memory kernel having the form of a Gamma distribution. In contrast to the fast decay of damped oscillations of the ISI distribution for the model with the weak delay kernel, the decay of damped oscillations is found to be slower for the model with the strong delay kernel.
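
    The Markovian embedding for the weak (exponential) kernel can be sketched as a pair of coupled SDEs integrated by Euler-Maruyama: V is the membrane potential and y its exponentially weighted past, so the non-Markovian delay reduces to one extra state variable. The equations and all parameter values below are illustrative assumptions, not those of the paper.

```python
# Hedged sketch: exponential (weak-delay) kernel makes the model Markovian
# in the extended state (V, y). Euler-Maruyama simulation of the coupled
# SDEs, with the coefficient of variation (CV) of the ISIs computed at the
# end. Parameters are illustrative, not taken from the paper.
import math, random

random.seed(1)
dt, tau, eta = 1e-3, 1.0, 0.5      # step, membrane and kernel time constants
mu, sigma, k = 1.5, 0.4, -0.3      # drift, noise, delayed-feedback strength
theta = 1.0                        # firing threshold

V, y, t_last, isis = 0.0, 0.0, 0.0, []
t = 0.0
while len(isis) < 500:
    dW = random.gauss(0.0, math.sqrt(dt))
    V += (-V / tau + k * y + mu) * dt + sigma * dW
    y += (V - y) / eta * dt        # memory variable: filtered history of V
    t += dt
    if V >= theta:                 # spike: record ISI and reset potential
        isis.append(t - t_last)
        t_last, V = t, 0.0

mean = sum(isis) / len(isis)
cv = math.sqrt(sum((s - mean) ** 2 for s in isis) / len(isis)) / mean
print(round(mean, 3), round(cv, 3))
```

Sweeping the kernel time constant `eta` and the noise intensity `sigma` in such a simulation is how the bursting/non-bursting switch described above would be explored numerically.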

  19. Distributed hydrological modelling of total dissolved phosphorus transport in an agricultural landscape, part I: distributed runoff generation

    Directory of Open Access Journals (Sweden)

    P. Gérard-Marchant

    2006-01-01

    Successful implementation of best management practices for reducing non-point source (NPS) pollution requires knowledge of the location of saturated areas that produce runoff. A physically-based, fully-distributed, GIS-integrated model, the Soil Moisture Distribution and Routing (SMDR) model, was developed to simulate the hydrologic behavior of small rural upland watersheds with shallow soils and steep to moderate slopes. The model assumes that gravity is the only driving force of water and that most overland flow occurs as saturation excess. The model uses available soil and climatic data, and requires little calibration. The SMDR model was used to simulate runoff production on a 164-ha farm watershed in Delaware County, New York, in the headwaters of the New York City water supply. Apart from land use, distributed input parameters were derived from readily available data. Simulated hydrographs compared reasonably with observed flows at the watershed outlet over an eight-year simulation period, and peak timing and intensities were well reproduced. Using off-site weather input data produced occasional missed event peaks. The simulated soil moisture distribution agreed well with observed hydrological features and followed the same spatial trend as observed soil moisture contents sampled on four transects. Model accuracy improved when input variables were calibrated within the range of SSURGO-available parameters. The model will be a useful planning tool for reducing NPS pollution from farms in landscapes similar to the Northeastern US.

  20. A distributed computing model for telemetry data processing

    Science.gov (United States)

    Barry, Matthew R.; Scott, Kevin L.; Weismuller, Steven P.

    1994-05-01

    We present a new approach to distributing processed telemetry data among spacecraft flight controllers within the control centers at NASA's Johnson Space Center. This approach facilitates the development of application programs which integrate spacecraft-telemetered data and ground-based synthesized data, then distributes this information to flight controllers for analysis and decision-making. The new approach combines various distributed computing models into one hybrid distributed computing model. The model employs both client-server and peer-to-peer distributed computing models cooperating to provide users with information throughout a diverse operations environment. Specifically, it provides an attractive foundation upon which we are building critical real-time monitoring and control applications, while simultaneously lending itself to peripheral applications in playback operations, mission preparations, flight controller training, and program development and verification. We have realized the hybrid distributed computing model through an information sharing protocol. We shall describe the motivations that inspired us to create this protocol, along with a brief conceptual description of the distributed computing models it employs. We describe the protocol design in more detail, discussing many of the program design considerations and techniques we have adopted. Finally, we describe how this model is especially suitable for supporting the implementation of distributed expert system applications.

  1. A distributed computing model for telemetry data processing

    Science.gov (United States)

    Barry, Matthew R.; Scott, Kevin L.; Weismuller, Steven P.

    1994-01-01

    We present a new approach to distributing processed telemetry data among spacecraft flight controllers within the control centers at NASA's Johnson Space Center. This approach facilitates the development of application programs which integrate spacecraft-telemetered data and ground-based synthesized data, then distributes this information to flight controllers for analysis and decision-making. The new approach combines various distributed computing models into one hybrid distributed computing model. The model employs both client-server and peer-to-peer distributed computing models cooperating to provide users with information throughout a diverse operations environment. Specifically, it provides an attractive foundation upon which we are building critical real-time monitoring and control applications, while simultaneously lending itself to peripheral applications in playback operations, mission preparations, flight controller training, and program development and verification. We have realized the hybrid distributed computing model through an information sharing protocol. We shall describe the motivations that inspired us to create this protocol, along with a brief conceptual description of the distributed computing models it employs. We describe the protocol design in more detail, discussing many of the program design considerations and techniques we have adopted. Finally, we describe how this model is especially suitable for supporting the implementation of distributed expert system applications.

  2. Analysis of Jingdong Mall Logistics Distribution Model

    Science.gov (United States)

    Shao, Kang; Cheng, Feng

    In recent years, the development of electronic commerce in China has accelerated. The role of logistics has been highlighted, and more and more electronic commerce enterprises are beginning to realize the importance of logistics to the success or failure of the enterprise. In this paper, the author takes Jingdong Mall as an example, performs a SWOT analysis of the current situation of its self-built logistics system, identifies the problems existing in the current Jingdong Mall logistics distribution, and gives appropriate recommendations.

  3. Tempered stable distributions stochastic models for multiscale processes

    CERN Document Server

    Grabchak, Michael

    2015-01-01

    This brief is concerned with tempered stable distributions and their associated Lévy processes. It is a good text for researchers interested in learning about tempered stable distributions. A tempered stable distribution is one which takes a stable distribution and modifies its tails to make them lighter. The motivation for this class comes from the fact that infinite-variance stable distributions appear to provide a good fit to data in a variety of situations, but the extremely heavy tails of these models are not realistic for most real-world applications. The idea of using distributions that modify the tails of stable models to make them lighter seems to have originated in the influential paper of Mantegna and Stanley (1994). Since then, these distributions have been extended and generalized in a variety of ways. They have been applied to a wide variety of areas including mathematical finance, biostatistics, computer science, and physics.

  4. Improved SPICE electrical model of silicon photomultipliers

    Energy Technology Data Exchange (ETDEWEB)

    Marano, D., E-mail: davide.marano@oact.inaf.it [INAF, Osservatorio Astrofisico di Catania, Via S. Sofia 78, I-95123 Catania (Italy); Bonanno, G.; Belluso, M.; Billotta, S.; Grillo, A.; Garozzo, S.; Romeo, G. [INAF, Osservatorio Astrofisico di Catania, Via S. Sofia 78, I-95123 Catania (Italy); Catalano, O.; La Rosa, G.; Sottile, G.; Impiombato, D.; Giarrusso, S. [INAF, Istituto di Astrofisica Spaziale e Fisica Cosmica di Palermo, Via U. La Malfa 153, I-90146 Palermo (Italy)

    2013-10-21

    The present work introduces an improved SPICE equivalent electrical model of silicon photomultiplier (SiPM) detectors, in order to simulate and predict their transient response to avalanche triggering events. In particular, the developed circuit model provides a careful investigation of the magnitude and timing of the read-out signals and can therefore be exploited to perform reliable circuit-level simulations. The adopted modeling approach is strictly related to the physics of each basic microcell constituting the SiPM device, and allows the avalanche timing as well as the photodiode current and voltage to be accurately simulated. Predictive capabilities of the proposed model are demonstrated by means of experimental measurements on a real SiPM detector. Simulated and measured pulses are found to be in good agreement with the expected results. -- Highlights: • An improved SPICE electrical model of silicon photomultipliers is proposed. • The developed model provides a truthful representation of the physics of the device. • An accurate charge collection as a function of the overvoltage is achieved. • The adopted electrical model allows reliable circuit-level simulations to be performed. • Predictive capabilities of the adopted model are experimentally demonstrated.

  5. Correlation Structures of Correlated Binomial Models and Implied Default Distribution

    Science.gov (United States)

    Mori, Shintaro; Kitsukawa, Kenji; Hisakado, Masato

    2008-11-01

    We show how to analyze and interpret the correlation structures, the conditional expectation values and correlation coefficients of exchangeable Bernoulli random variables. We study implied default distributions for the iTraxx-CJ tranches and some popular probabilistic models, including the Gaussian copula model, Beta binomial distribution model and long-range Ising model. We interpret the differences in their profiles in terms of the correlation structures. The implied default distribution has singular correlation structures, reflecting the credit market implications. We point out two possible origins of the singular behavior.
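    The Beta binomial distribution model mentioned above admits a compact numerical sketch: exchangeable Bernoulli defaults mixed over a Beta-distributed common default probability. The portfolio size n, mean default probability p, and default correlation rho below are hypothetical inputs chosen for illustration, not values implied by the iTraxx-CJ data.

```python
import math

def beta_binomial_pmf(k, n, a, b):
    # P(K = k) when n exchangeable Bernoulli names share a Beta(a, b)
    # mixed default probability; log-gamma keeps it numerically stable.
    log_c = math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
    log_num = math.lgamma(k + a) + math.lgamma(n - k + b) - math.lgamma(n + a + b)
    log_den = math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)
    return math.exp(log_c + log_num - log_den)

def ab_from_p_rho(p, rho):
    # Beta parameters from mean default probability p and pairwise
    # default correlation rho = 1 / (a + b + 1).
    s = 1.0 / rho - 1.0
    return p * s, (1.0 - p) * s

n, p, rho = 100, 0.05, 0.10        # hypothetical portfolio inputs
a, b = ab_from_p_rho(p, rho)
pmf = [beta_binomial_pmf(k, n, a, b) for k in range(n + 1)]
mean = sum(k * q for k, q in enumerate(pmf))
print(sum(pmf), mean)   # sums to ~1.0; mean is n*p = 5.0
```

Raising rho fattens the right tail of the default-count distribution while leaving the mean fixed, which is the qualitative behavior the correlation-structure analysis above dissects.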

  6. Development of a distributed air pollutant dry deposition modeling framework

    International Nuclear Information System (INIS)

    Hirabayashi, Satoshi; Kroll, Charles N.; Nowak, David J.

    2012-01-01

    A distributed air pollutant dry deposition modeling system was developed with a geographic information system (GIS) to enhance the functionality of i-Tree Eco (i-Tree, 2011). With the developed system, temperature, leaf area index (LAI) and air pollutant concentration can be estimated in a spatially distributed form, and based on these and other input variables, dry deposition of carbon monoxide (CO), nitrogen dioxide (NO2), sulfur dioxide (SO2), and particulate matter less than 10 microns (PM10) to trees can be spatially quantified. Employing nationally available road network, traffic volume, air pollutant emission/measurement and meteorological data, the developed system provides a framework for U.S. city managers to identify spatial patterns of urban forest and locate potential areas for future urban forest planting and protection to improve air quality. To exhibit the usability of the framework, a case study was performed for July and August of 2005 in Baltimore, MD. - Highlights: ► A distributed air pollutant dry deposition modeling system was developed. ► The developed system enhances the functionality of i-Tree Eco. ► The developed system employs nationally available input datasets. ► The developed system is transferable to any U.S. city. ► Future planting and protection spots were visually identified in a case study. - Employing nationally available datasets and a GIS, this study will provide urban forest managers in U.S. cities a framework to quantify and visualize urban forest structure and its air pollution removal effect.
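    At its core, a dry deposition estimate of this kind reduces to a flux of the form F = Vd · C per unit canopy area, accumulated over time. A minimal sketch with illustrative numbers (the deposition velocity, concentration, and canopy area are assumptions, not values from the study):

```python
# Dry deposition flux F = Vd * C per unit canopy area, accumulated
# into a pollutant removal amount for a tree-covered patch.
def removal_grams(vd_m_s, conc_ug_m3, canopy_m2, hours):
    flux_ug_m2_s = vd_m_s * conc_ug_m3            # F = Vd * C, ug m^-2 s^-1
    return flux_ug_m2_s * canopy_m2 * hours * 3600.0 / 1e6   # grams

# Illustrative inputs: Vd = 0.006 m/s, NO2 at 40 ug/m3,
# one hectare of canopy, one day:
removed_g = removal_grams(0.006, 40.0, 10_000.0, 24.0)
print(removed_g)   # about 207 g of NO2 removed
```

In the distributed system described above, Vd and C would vary cell by cell with temperature, LAI and the emission/measurement inputs, and the same per-cell arithmetic would be summed over the gridded landscape.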

  7. An optimization model for improving highway safety

    Directory of Open Access Journals (Sweden)

    Promothes Saha

    2016-12-01

    Full Text Available This paper developed a traffic safety management system (TSMS) for improving safety on county paved roads in Wyoming. A TSMS is a strategic and systematic process to improve the safety of a roadway network. When funding is limited, it is important to identify the best combination of safety improvement projects to provide the most benefits to society in terms of crash reduction. The factors included in the proposed optimization model are the annual safety budget, roadway inventory, roadway functional classification, historical crashes, safety improvement countermeasures, costs and crash reduction factors (CRFs) associated with safety improvement countermeasures, and average daily traffic (ADT). This paper demonstrated how the proposed model can identify the best combination of safety improvement projects to maximize the safety benefits in terms of reducing overall crash frequency. Although the proposed methodology was implemented on the county paved road network of Wyoming, it could be easily modified for potential implementation on the Wyoming state highway system. Other states can also benefit by implementing a similar program within their jurisdictions.
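    Selecting the best combination of projects under an annual budget is, in its simplest form, a 0/1 knapsack problem. The sketch below maximizes total expected crash reduction subject to a budget cap; the project names, costs, and crash-reduction benefits are invented for illustration and are not the paper's data.

```python
def best_projects(projects, budget):
    # 0/1 knapsack: choose safety countermeasure projects that maximize
    # expected crash reduction without exceeding the budget.
    # projects: list of (name, cost, expected_crashes_reduced).
    dp = [(0.0, [])] * (budget + 1)      # dp[b] = (benefit, chosen names)
    for name, cost, benefit in projects:
        new_dp = dp[:]
        for b in range(cost, budget + 1):
            cand = dp[b - cost][0] + benefit
            if cand > new_dp[b][0]:
                new_dp[b] = (cand, dp[b - cost][1] + [name])
        dp = new_dp
    return dp[budget]

projects = [
    ("rumble strips, seg A", 30, 4.0),
    ("shoulder widening, seg B", 70, 7.5),
    ("signage upgrade, seg C", 20, 2.5),
    ("curve realignment, seg D", 60, 6.0),
]
benefit, chosen = best_projects(projects, budget=100)
print(benefit, chosen)   # 11.5 crashes avoided: segments A and B
```

A full TSMS would derive each project's benefit from historical crashes, CRFs and ADT rather than hard-coding it, but the budget-constrained selection step has this shape.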

  8. Improving Representational Competence with Concrete Models

    Science.gov (United States)

    Stieff, Mike; Scopelitis, Stephanie; Lira, Matthew E.; DeSutter, Dane

    2016-01-01

    Representational competence is a primary contributor to student learning in science, technology, engineering, and math (STEM) disciplines and an optimal target for instruction at all educational levels. We describe the design and implementation of a learning activity that uses concrete models to improve students' representational competence and…

  9. Improved transition models for cepstral trajectories

    CSIR Research Space (South Africa)

    Badenhorst, J

    2012-11-01

    Full Text Available We improve on a piece-wise linear model of the trajectories of Mel Frequency Cepstral Coefficients, which are commonly used as features in Automatic Speech Recognition. For this purpose, we have created a very clean single-speaker corpus, which...

  10. School Improvement Model to Foster Student Learning

    Science.gov (United States)

    Rulloda, Rudolfo Barcena

    2011-01-01

    Many classroom teachers are still using the traditional teaching methods. The traditional teaching methods are one-way learning process, where teachers would introduce subject contents such as language arts, English, mathematics, science, and reading separately. However, the school improvement model takes into account that all students have…

  11. Species Distribution modeling as a tool to unravel determinants of palm distribution in Thailand

    DEFF Research Database (Denmark)

    Tovaranonte, Jantrararuk; Barfod, Anders S.; Balslev, Henrik

    2011-01-01

    As a consequence of the decimation of the forest cover in Thailand from 50% to ca. 20% since the 1950s, it is difficult to gain insight into the drivers behind past, present and future distribution ranges of plant species. Species distribution modeling allows visualization of potential species...... distribution under specific sets of assumptions. In this study we used maximum entropy to map potential distributions of 103 species of palms for which more than 5 herbarium records exist. Palms constitute a keystone plant group from ecological, economic and conservation perspectives. The models were......) and the Area Under the Curve (AUC). All models performed well with AUC scores above 0.95. The predicted distribution ranges showed high suitability for palms in the southern region of Thailand. It also shows that spatial predictor variables are important in cases where historical processes may explain extant...

  12. Distributed Prognostics Based on Structural Model Decomposition

    Data.gov (United States)

    National Aeronautics and Space Administration — Within systems health management, prognostics focuses on predicting the remaining useful life of a system. In the model-based prognostics paradigm, physics-based...

  13. Improved double Q2 rescaling model

    International Nuclear Information System (INIS)

    Gao Yonghua

    2001-01-01

    The authors present an improved double Q2 rescaling model. Based on the condition of nuclear momentum conservation, the authors have found a Q2 rescaling parameter formula for the model, establishing the connection between the Q2 rescaling parameter ζi (i = v, s, g) and the mean binding energy in the nucleus. By using this model, the authors could explain the experimental data on the EMC effect in the whole x region, the nuclear Drell-Yan process and the J/Ψ photoproduction process

  14. Smart Demand for Improving Short-term Voltage Control on Distribution Networks

    DEFF Research Database (Denmark)

    Garcia-Valle, Rodrigo; P. Da Silva, Luiz C.; Xu, Zhao

    2009-01-01

    customer integration to aid power system performance is almost inevitable. This study introduces a new type of smart demand side technology, denoted demand as voltage controlled reserve (DVR), to improve short-term voltage control, where customers are expected to play a more dynamic role to improve voltage...... control. The technology can be provided by thermostatically controlled loads as well as other types of load. This technology is proven to be effective in case of distribution systems with a large composition of induction motors, where the voltage presents a slow recovery characteristic due to deceleration...... of the motors during faults. This study presents detailed models, discussion and simulation tests to demonstrate the technical viability and effectiveness of the DVR technology for short-term voltage control....

  15. Improving the spectral measurement accuracy based on temperature distribution and spectra-temperature relationship

    Science.gov (United States)

    Li, Zhe; Feng, Jinchao; Liu, Pengyu; Sun, Zhonghua; Li, Gang; Jia, Kebin

    2018-05-01

    Temperature is usually considered a source of fluctuation in near-infrared spectral measurement. Chemometric methods have been extensively studied to correct the effect of temperature variations. However, temperature can also be considered a constructive parameter that provides detailed chemical information when systematically changed during the measurement. Our group has researched the relationship between temperature-induced spectral variation (TSVC) and normalized squared temperature. In this study, we focused on the influence of the temperature distribution in the calibration set. A multi-temperature calibration set selection (MTCS) method was proposed to improve the prediction accuracy by considering the temperature distribution of calibration samples. Furthermore, a double-temperature calibration set selection (DTCS) method was proposed based on the MTCS method and the relationship between TSVC and normalized squared temperature. We compared the prediction performance of PLS models based on the random sampling method and the proposed methods. The results of experimental studies showed that the prediction performance was improved by using the proposed methods. Therefore, the MTCS and DTCS methods are promising alternatives for improving prediction accuracy in near-infrared spectral measurement.

  16. An Improved Voltage Regulation of a Distribution Network Using ...

    African Journals Online (AJOL)

    The Newton-Raphson Load flow equation modeling was a veritable tool applied in this analysis to determine the convergence points for the voltage magnitude, power (load) angle, power losses along the lines, sending end and receiving end power values at the various buses that make up the thirteen bus network.
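    The Newton-Raphson iteration behind such a load flow can be sketched on a two-bus system rather than the study's thirteen-bus network; the line impedance and load values below are illustrative per-unit assumptions.

```python
import cmath

# Two-bus Newton-Raphson load flow: bus 1 is the slack bus
# (V = 1.0 pu, angle 0), bus 2 is a PQ load bus.
z_line = complex(0.02, 0.08)       # line impedance, pu (assumed)
y = 1.0 / z_line
p_load, q_load = 0.8, 0.4          # load drawn at bus 2, pu (assumed)

def mismatch(theta, v):
    # Active/reactive power mismatch at bus 2 for voltage v at angle theta.
    v1 = 1.0 + 0j
    v2 = cmath.rect(v, theta)
    s2 = v2 * (y * (v2 - v1)).conjugate()   # calculated injection at bus 2
    # The specified injection at a pure load bus is -(P + jQ).
    return s2.real + p_load, s2.imag + q_load

theta, v = 0.0, 1.0                 # flat start
for _ in range(20):
    f1, f2 = mismatch(theta, v)
    if abs(f1) < 1e-10 and abs(f2) < 1e-10:
        break                       # converged
    h = 1e-6                        # finite-difference Jacobian
    g1t, g2t = mismatch(theta + h, v)
    g1v, g2v = mismatch(theta, v + h)
    a, b = (g1t - f1) / h, (g1v - f1) / h
    c, d = (g2t - f2) / h, (g2v - f2) / h
    det = a * d - b * c
    theta -= (d * f1 - b * f2) / det        # Newton step: x -= J^-1 f
    v -= (-c * f1 + a * f2) / det

print(v, theta)   # about 0.948 pu and -0.059 rad
```

The thirteen-bus case works the same way, only with a larger Jacobian covering the voltage magnitude and power (load) angle at every PQ bus.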

  17. Taxing energy to improve the environment : Efficiency and distributional effects

    NARCIS (Netherlands)

    Heijdra, BJ; van der Horst, A

    We study the effects of environmental tax policy in a dynamic overlapping generations model of a small open economy with environmental quality incorporated as a durable consumption good. Raising the energy tax may yield an efficiency gain if agents care enough about the environment. The benefits are

  18. Effects of varying the step particle distribution on a probabilistic transport model

    International Nuclear Information System (INIS)

    Bouzat, S.; Farengo, R.

    2005-01-01

    The consequences of varying the step particle distribution on a probabilistic transport model, which captures the basic features of transport in plasmas and was recently introduced in Ref. 1 [B. Ph. van Milligen et al., Phys. Plasmas 11, 2272 (2004)], are studied. Different superdiffusive transport mechanisms generated by a family of distributions with algebraic decays (Tsallis distributions) are considered. It is observed that the possibility of changing the superdiffusive transport mechanism improves the flexibility of the model for describing different situations. The use of the model to describe the low (L) and high (H) confinement modes is also analyzed
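    The effect of an algebraically decaying step distribution can be demonstrated with a toy 1-D random walk: power-law steps with tail index alpha < 2 have infinite variance and produce superdiffusive spreading, while bounded steps stay diffusive. The ensemble size, tail index, and time checkpoints below are arbitrary illustration choices, not parameters from the paper.

```python
import math, random

def walk_spread(step_fn, n_walkers, checkpoints, seed=1):
    # Median absolute displacement of an ensemble of 1-D random walks,
    # recorded at the requested time checkpoints.
    rng = random.Random(seed)
    tmax = max(checkpoints)
    spread = {t: [] for t in checkpoints}
    for _ in range(n_walkers):
        x = 0.0
        for t in range(1, tmax + 1):
            x += step_fn(rng)
            if t in spread:
                spread[t].append(abs(x))
    return {t: sorted(v)[len(v) // 2] for t, v in spread.items()}

def pareto_step(rng, alpha=1.5):
    # Symmetric step with an algebraic (power-law) tail of index alpha < 2:
    # infinite variance, hence superdiffusive transport.
    return rng.choice((-1.0, 1.0)) * (1.0 - rng.random()) ** (-1.0 / alpha)

def bounded_step(rng):
    # Finite-variance control case: ordinary diffusion.
    return rng.uniform(-1.0, 1.0)

cps = (100, 5000)
heavy = walk_spread(pareto_step, 500, cps)
normal = walk_spread(bounded_step, 500, cps)

def growth_exponent(med):
    # Spread ~ t^H: H = 0.5 for diffusion, ~1/alpha = 0.67 here.
    return math.log(med[5000] / med[100]) / math.log(5000 / 100)

print(growth_exponent(heavy), growth_exponent(normal))
```

Changing alpha moves the growth exponent continuously, which is the kind of flexibility the abstract credits to varying the step particle distribution.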

  19. Energy Loss, Velocity Distribution, and Temperature Distribution for a Baffled Cylinder Model, Special Report

    Science.gov (United States)

    Brevoort, Maurice J.

    1937-01-01

    In the design of a cowling, a certain pressure drop across the cylinders of a radial air-cooled engine is made available. Baffles are designed to make use of this available pressure drop for cooling. The problem of cooling an air-cooled engine cylinder has been treated, for the most part, from considerations of a large heat-transfer coefficient. Knowledge of the precise cylinder characteristics that give a maximum heat-transfer coefficient should be the first consideration. The next problem is to distribute this ability to cool so that the cylinder cools uniformly. This report takes up the problem of the design of a baffle for a model cylinder. A study has been made of the important principles involved in the operation of a baffle for an engine cylinder; it shows that the cooling can be improved by 20% by using a correctly designed baffle. Such a gain means that the improved baffle cools the cylinder as effectively as would a 65% increase in pressure drop across the standard baffle and fin tips.

  20. Independent tasks scheduling in cloud computing via improved estimation of distribution algorithm

    Science.gov (United States)

    Sun, Haisheng; Xu, Rui; Chen, Huaping

    2018-04-01

    To minimize makespan for scheduling independent tasks in cloud computing, an improved estimation of distribution algorithm (IEDA) is proposed to tackle the investigated problem in this paper. Considering that the problem is a multi-dimensional discrete problem, an improved population-based incremental learning (PBIL) algorithm is applied, in which the parameter for each component is independent of the other components. In order to improve the performance of PBIL, on the one hand, an integer encoding scheme is used and the probability calculation of PBIL is improved by using the task average processing time; on the other hand, an effective adaptive learning rate function related to the number of iterations is constructed to trade off the exploration and exploitation of IEDA. In addition, enhanced Max-Min and Min-Min algorithms are introduced to form two initial individuals. In the proposed IEDA, an improved genetic algorithm (IGA) is applied to generate part of the initial population by evolving the two initial individuals, and the rest of the initial individuals are generated at random. Finally, the sampling process is divided into two parts, sampling by the probabilistic model and by the IGA respectively. The experimental results show that the proposed IEDA not only obtains better solutions but also converges faster.
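    The PBIL core of such an approach can be sketched as follows. This is a toy version under stated assumptions: the adaptive learning-rate schedule, population size, and task times are invented, and the Max-Min/Min-Min seeding and IGA sampling of the actual IEDA are omitted.

```python
import random

def makespan(assign, times, m):
    # Maximum machine load for an assignment of tasks to m machines.
    load = [0.0] * m
    for task, mach in enumerate(assign):
        load[mach] += times[task]
    return max(load)

def pbil_schedule(times, m, iters=200, pop=30, seed=7):
    # One probability vector per task over the m machines: sample a
    # population from the model, then pull each vector toward the
    # best-so-far assignment with an iteration-dependent learning rate.
    rng = random.Random(seed)
    n = len(times)
    prob = [[1.0 / m] * m for _ in range(n)]
    best, best_ms = None, float("inf")
    for it in range(iters):
        lr = 0.05 + 0.15 * it / iters        # assumed adaptive schedule
        for _ in range(pop):
            a = [rng.choices(range(m), weights=prob[t])[0] for t in range(n)]
            ms = makespan(a, times, m)
            if ms < best_ms:
                best, best_ms = a, ms
        for t in range(n):                   # shift the model toward the best
            for j in range(m):
                target = 1.0 if best[t] == j else 0.0
                prob[t][j] = (1.0 - lr) * prob[t][j] + lr * target
    return best, best_ms

times = [5, 9, 4, 7, 3, 8, 6, 2]     # illustrative task processing times
best, ms = pbil_schedule(times, m=3)
print(ms, best)   # near the lower bound of 15 (44 time units over 3 machines)
```

The growing learning rate mirrors the exploration-to-exploitation trade-off described above: early iterations sample broadly, later ones concentrate probability mass on the incumbent solution.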

  1. Facility optimization to improve activation rate distributions during IVNAA

    International Nuclear Information System (INIS)

    Ebrahimi Khankook, Atiyeh; Rafat Motavalli, Laleh; Miri Hakimabad, Hashem

    2013-01-01

    Currently, determination of body composition is the most useful method for distinguishing between certain diseases. The prompt-gamma in vivo neutron activation analysis (IVNAA) facility for non-destructive elemental analysis of the human body is the gold standard method for this type of analysis. In order to obtain accurate measurements using the IVNAA system, the activation probability in the body must be uniform. This can be difficult to achieve, as body shape and body composition affect the rate of activation. The aim of this study was to determine the optimum pre-moderator material for attaining uniform activation probability, with a CV value of about 10%, and to change the collimator role to increase the activation rate within the body. Such uniformity was obtained with a thick paraffin pre-moderator; however, it was not an appropriate choice because it increased the secondary photon flux received by the detectors. Our final calculations indicated that using two paraffin slabs with a thickness of 3 cm as a pre-moderator, in the presence of 2 cm Bi on the collimator, achieves a satisfactory distribution of activation rate in the body. (author)

  2. Spreadsheet Modeling of Electron Distributions in Solids

    Science.gov (United States)

    Glassy, Wingfield V.

    2006-01-01

    A series of spreadsheet modeling exercises constructed as part of a new upper-level elective course on solid state materials and surface chemistry is described. The spreadsheet exercises are developed to provide students with the opportunity to interact with the conceptual framework where the role of the density of states and the Fermi-Dirac…

  3. Distributed Model Predictive Control via Dual Decomposition

    DEFF Research Database (Denmark)

    Biegel, Benjamin; Stoustrup, Jakob; Andersen, Palle

    2014-01-01

    This chapter presents dual decomposition as a means to coordinate a number of subsystems coupled by state and input constraints. Each subsystem is equipped with a local model predictive controller while a centralized entity manages the subsystems via prices associated with the coupling constraints...

  4. Simultaneous treatment of unspecified heteroskedastic model error distribution and mismeasured covariates for restricted moment models.

    Science.gov (United States)

    Garcia, Tanya P; Ma, Yanyuan

    2017-10-01

    We develop consistent and efficient estimation of parameters in general regression models with mismeasured covariates. We assume the model error and covariate distributions are unspecified, and the measurement error distribution is a general parametric distribution with unknown variance-covariance. We construct root-n consistent, asymptotically normal and locally efficient estimators using the semiparametric efficient score. We do not estimate any unknown distribution or model error heteroskedasticity. Instead, we form the estimator under possibly incorrect working distribution models for the model error, error-prone covariate, or both. Empirical results demonstrate robustness to different incorrect working models in homoscedastic and heteroskedastic models with error-prone covariates.

  5. Species distribution modelling for conservation of an endangered endemic orchid.

    Science.gov (United States)

    Wang, Hsiao-Hsuan; Wonkka, Carissa L; Treglia, Michael L; Grant, William E; Smeins, Fred E; Rogers, William E

    2015-04-21

    Concerns regarding the long-term viability of threatened and endangered plant species are increasingly warranted given the potential impacts of climate change and habitat fragmentation on unstable and isolated populations. Orchidaceae is the largest and most diverse family of flowering plants, but it is currently facing unprecedented risks of extinction. Despite substantial conservation emphasis on rare orchids, populations continue to decline. Spiranthes parksii (Navasota ladies' tresses) is a federally and state-listed endangered terrestrial orchid endemic to central Texas. Hence, we aimed to identify potential factors influencing the distribution of the species, quantify the relative importance of each factor and determine suitable habitat for future surveys and targeted conservation efforts. We analysed several geo-referenced variables describing climatic conditions and landscape features to identify potential factors influencing the likelihood of occurrence of S. parksii using boosted regression trees. Our model classified 97% of the cells correctly with regard to species presence and absence, and indicated that the probability of occurrence was correlated with climatic conditions and landscape features. The most influential variables were mean annual precipitation, mean elevation, mean annual minimum temperature and mean annual maximum temperature. The most likely suitable range for S. parksii was the eastern portions of Leon and Madison Counties, the southern portion of Brazos County, a portion of northern Grimes County and along the borders between Burleson and Washington Counties. Our model can assist in the development of an integrated conservation strategy through: (i) focussing future survey and research efforts on areas with a high likelihood of occurrence, (ii) aiding in selection of areas for conservation and restoration and (iii) framing future research questions including those necessary for predicting responses to climate change.

  6. Improvements on Semi-Classical Distorted-Wave model

    Energy Technology Data Exchange (ETDEWEB)

    Sun Weili; Watanabe, Y.; Kuwata, R. [Kyushu Univ., Fukuoka (Japan); Kohno, M.; Ogata, K.; Kawai, M.

    1998-03-01

    A method of improving the Semi-Classical Distorted Wave (SCDW) model in terms of the Wigner transform of the one-body density matrix is presented. The finite size effect of atomic nuclei can be taken into account by using the single particle wave functions for a harmonic oscillator or Woods-Saxon potential, instead of those based on the local Fermi-gas model which were incorporated into the previous SCDW model. We carried out a preliminary SCDW calculation of the 160 MeV (p,p'x) reaction on {sup 90}Zr with the Wigner transform of harmonic oscillator wave functions. It is shown that the calculated angular distributions increase markedly at backward angles compared with the previous ones, and the agreement with the experimental data is improved. (author)

  7. Improved Inference of Heteroscedastic Fixed Effects Models

    Directory of Open Access Journals (Sweden)

    Afshan Saeed

    2016-12-01

    Full Text Available Heteroscedasticity is a severe problem that distorts estimation and testing in panel data models (PDM). Arellano (1987) proposed the White (1980) estimator for PDM with heteroscedastic errors, but it provides erroneous inference for data sets that include high leverage points. In this paper, our attempt is to improve the heteroscedasticity consistent covariance matrix estimator (HCCME) for panel data sets with high leverage points. To draw robust inference for the PDM, our focus is on improving the kernel bootstrap estimators proposed by Racine and MacKinnon (2007). A Monte Carlo scheme is used to assess the results.
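    The contrast between the classical and the White-type (HC0) variance estimate can be sketched in the simplest setting, a no-intercept regression with a single regressor; the data-generating numbers below are arbitrary illustration choices, not the paper's design.

```python
import math, random

# White/HC0 vs classical variance for the slope of a no-intercept
# regression y = beta*x + e, with error variance growing in |x|
# (heteroskedastic by construction).
rng = random.Random(42)
n = 5000
x = [rng.uniform(1.0, 5.0) for _ in range(n)]
y = [2.0 * xi + rng.gauss(0.0, 0.5 * xi) for xi in x]   # true beta = 2

sxx = sum(xi * xi for xi in x)
beta_hat = sum(xi * yi for xi, yi in zip(x, y)) / sxx
resid = [yi - beta_hat * xi for xi, yi in zip(x, y)]

# Classical (homoskedastic) variance: sigma^2 / sum(x^2).
sigma2 = sum(r * r for r in resid) / (n - 1)
var_classical = sigma2 / sxx
# HC0: sum(x_i^2 e_i^2) / (sum x^2)^2, robust to heteroskedasticity.
var_hc0 = sum((xi * ri) ** 2 for xi, ri in zip(x, resid)) / sxx ** 2

print(beta_hat, math.sqrt(var_classical), math.sqrt(var_hc0))
```

Because the error variance rises with x here, the classical formula understates the slope's sampling variance while HC0 remains consistent; the leverage-point and bootstrap refinements discussed above address cases where even HC0-style estimators misbehave.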

  8. Fast Performance Computing Model for Smart Distributed Power Systems

    Directory of Open Access Journals (Sweden)

    Umair Younas

    2017-06-01

    Full Text Available Plug-in Electric Vehicles (PEVs) are becoming a more prominent solution than fossil-fuel car technology due to their significant role in Greenhouse Gas (GHG) reduction, flexible storage, and ancillary service provision as a Distributed Generation (DG) resource in Vehicle-to-Grid (V2G) regulation mode. However, large-scale penetration of PEVs and the growing demand of energy-intensive Data Centers (DCs) bring undesirable peaks in electricity demand and hence impose supply-demand imbalance and threaten the reliability of the wholesale and retail power markets. In order to overcome these challenges, the proposed research considers a smart Distributed Power System (DPS) comprising conventional sources, renewable energy, V2G regulation, and flexible storage energy resources. Moreover, price- and incentive-based Demand Response (DR) programs are implemented to sustain the balance between net demand and available generating resources in the DPS. In addition, we adopted a novel strategy to implement the computationally intensive jobs of the proposed DPS model, including incoming load profiles, V2G regulation, battery State of Charge (SOC) indication, and fast computation in a decision-based automated DR algorithm, using the fast performance computing resources of DCs. In response, the DPS provides economical and stable power to DCs under strict power-quality constraints. Finally, the improved results are verified using a case study of ISO California integrated with hybrid generation.

  9. Calculations of dose distributions using a neural network model

    International Nuclear Information System (INIS)

    Mathieu, R; Martin, E; Gschwind, R; Makovicka, L; Contassot-Vivier, S; Bahi, J

    2005-01-01

    The main goal of external beam radiotherapy is the treatment of tumours, while sparing, as much as possible, surrounding healthy tissues. In order to master and optimize the dose distribution within the patient, dosimetric planning has to be carried out. Thus, for determining the most accurate dose distribution during treatment planning, a compromise must be found between the precision and the speed of calculation. Current techniques, using analytic methods, models and databases, are rapid but lack precision. Enhanced precision can be achieved by using calculation codes based, for example, on Monte Carlo methods. However, in spite of all efforts to optimize speed (methods and computer improvements), Monte Carlo based methods remain painfully slow. A newer way to handle these problems is to approach dosimetric calculation with neural networks. Neural networks (Wu and Zhu 2000 Phys. Med. Biol. 45 913-22) provide the advantages of the various approaches above while avoiding their main inconvenience, i.e., time-consuming calculations. This permits us to obtain quick and accurate results during clinical treatment planning. Currently, results obtained for a single depth-dose calculation using a Monte Carlo based code (such as BEAM (Rogers et al 2003 NRCC Report PIRS-0509(A) rev G)) require hours of computing. By contrast, the practical use of neural networks (Mathieu et al 2003 Proceedings Journees Scientifiques Francophones, SFRP) provides almost instant results with quite low errors (less than 2%) for a two-dimensional dosimetric map

  10. Modelling and analysis of solar cell efficiency distributions

    Science.gov (United States)

    Wasmer, Sven; Greulich, Johannes

    2017-08-01

    We present an approach to model the distribution of solar cell efficiencies achieved in production lines based on numerical simulations, metamodeling and Monte Carlo simulations. We validate our methodology using the example of an industrial feasible p-type multicrystalline silicon “passivated emitter and rear cell” process. Applying the metamodel, we investigate the impact of each input parameter on the distribution of cell efficiencies in a variance-based sensitivity analysis, identifying the parameters and processes that need to be improved and controlled most accurately. We show that if these could be optimized, the mean cell efficiencies of our examined cell process would increase from 17.62% ± 0.41% to 18.48% ± 0.09%. As the method relies on advanced characterization and simulation techniques, we furthermore introduce a simplification that enhances applicability by only requiring two common measurements of finished cells. The presented approaches can be especially helpful for ramping-up production, but can also be applied to enhance established manufacturing.
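    The Monte Carlo step of such a methodology can be sketched with a linearized stand-in for the metamodel: sample the process parameters from their scatter, push each sample through the model, and read off the efficiency distribution. The base efficiency, sensitivities, and parameter spreads below are invented placeholders, not the paper's fitted values.

```python
import random, statistics

# Monte Carlo propagation of process-parameter scatter through a
# linearized (first-order) stand-in for a cell-efficiency metamodel.
def efficiency(base, sensitivities, deltas):
    # eff = base + sum(sensitivity * parameter deviation)
    return base + sum(s * d for s, d in zip(sensitivities, deltas))

rng = random.Random(0)
base_eff = 17.6                 # percent, assumed nominal efficiency
sens = [0.8, -0.5, 0.3]         # d(eff)/d(param), assumed
sigma = [0.30, 0.25, 0.40]      # parameter standard deviations, assumed

samples = []
for _ in range(50_000):
    deltas = [rng.gauss(0.0, s) for s in sigma]
    samples.append(efficiency(base_eff, sens, deltas))

mu, sd = statistics.fmean(samples), statistics.stdev(samples)
print(mu, sd)   # mean stays near 17.6; spread combines the three variances
```

A variance-based sensitivity analysis like the one described above then asks how much of sd^2 each parameter contributes, i.e. which process step to tighten first.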

  11. Charge distribution in a two-chain dual model

    International Nuclear Information System (INIS)

    Fialkowski, K.; Kotanski, A.

    1983-01-01

    Charge distributions in multiple production processes are analysed using the dual chain model. A parametrisation of charge distributions for single dual chains, based on the νp and anti-νp data, is proposed. The rapidity charge distributions are then calculated for pp and anti-pp collisions and compared with previous calculations based on the recursive cascade model of single chains. The results differ at SPS collider energies and in the energy dependence of the net forward charge, providing useful tests of the dual chain model. (orig.)

  12. An improved interfacial bonding model for material interface modeling

    Science.gov (United States)

    Lin, Liqiang; Wang, Xiaodu; Zeng, Xiaowei

    2016-01-01

    An improved interfacial bonding model was proposed from a potential-function point of view to investigate interfacial interactions in polycrystalline materials. It characterizes both attractive and repulsive interfacial interactions and can be applied to model different material interfaces. The work-of-separation path-dependence study indicates that the separation work varies smoothly in the normal and tangential directions and that the proposed model guarantees the consistency of the cohesive constitutive model. The improved interfacial bonding model was verified through a simple compression test in a standard hexagonal structure. The error between analytical solutions and numerical results from the proposed model is reasonable in the linear elastic region. Ultimately, we investigated the mechanical behavior of the extrafibrillar matrix in bone, and the simulation results agreed well with experimental observations of bone fracture. PMID:28584343

  13. Modeling and improving Ethiopian pasture systems

    Science.gov (United States)

    Parisi, S. G.; Cola, G.; Gilioli, G.; Mariani, L.

    2018-05-01

    The production of pasture in Ethiopia was simulated by means of a dynamic model. Most of the country is characterized by a tropical monsoon climate with mild temperatures and precipitation mainly concentrated in the June-September period (main rainy season). The production model is driven by solar radiation and takes into account limitations due to relocation, maintenance respiration, conversion to final dry matter, temperature, water stress, and nutrient availability. The model also considers the senescence of grassland, which strongly limits the nutritional value of grasses for livestock. The simulation for the 1982-2009 period, performed on gridded daily time series of rainfall and maximum and minimum temperature with a resolution of 0.5°, provided results comparable with values reported in the literature. Yearly mean yield in Ethiopia ranged between 1.8 metric tons per hectare (t ha-1) (2002) and 2.6 t ha-1 (1989) of dry matter, with values above 2.5 t ha-1 attained in 1983, 1985, 1989, and 2008. The Ethiopian territory has been subdivided into 1494 cells and a frequency distribution of the per-cell yearly mean pasture production has been obtained. This distribution ranges from 0 to 7 t ha-1 and shows a right-skewed shape with a modal class between 1.5-2 t ha-1. Simulations carried out on long time series for this peculiar tropical environment yield many results relevant from the agro-ecological point of view on the spatial variability of pasture production, the main limiting factors (solar radiation, precipitation, temperature), and relevant meteo-climatic cycles affecting pasture production (seasonal and inter-annual variability, ENSO). These results are useful to establish an agro-ecological zoning of the Ethiopian territory.
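    The radiation-driven production term of such a model can be sketched in a Monteith-style light-use-efficiency form with multiplicative stress limitations; all coefficients below are illustrative assumptions, not the model's calibrated values.

```python
# Daily dry-matter production in a light-use-efficiency form:
# DM = LUE * PAR * temperature factor * water-stress factor.
def daily_dry_matter(rad_mj_m2, f_par=0.5, lue_g_per_mj=1.5,
                     temp_factor=1.0, water_factor=1.0):
    """Grams of dry matter per m2 per day."""
    par = f_par * rad_mj_m2      # photosynthetically active radiation, MJ m-2
    return lue_g_per_mj * par * temp_factor * water_factor

# A 120-day growing season at 18 MJ m-2 day-1 of global radiation,
# with moderate temperature limitation and strong water stress:
season_g_m2 = sum(daily_dry_matter(18.0, temp_factor=0.8, water_factor=0.4)
                  for _ in range(120))
season_t_ha = season_g_m2 / 100.0    # 1 g m-2 = 0.01 t ha-1
print(round(season_t_ha, 2))         # -> 5.18
```

With stress factors varying day by day from the gridded rainfall and temperature series, summing such daily increments per 0.5° cell yields exactly the kind of per-cell yearly production whose frequency distribution is reported above.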

  15. Quantifying Distributional Model Risk via Optimal Transport

    OpenAIRE

    Blanchet, Jose; Murthy, Karthyek R. A.

    2016-01-01

This paper deals with the problem of quantifying the impact of model misspecification when computing general expected values of interest. The methodology that we propose is applicable in great generality; in particular, we provide examples involving path-dependent expectations of stochastic processes. Our approach consists in computing bounds for the expectation of interest regardless of the probability measure used, as long as the measure lies within a prescribed tolerance measured in terms ...
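The flavor of such distributional bounds can be illustrated with the 1-Wasserstein distance: by Kantorovich duality, the expectation of any 1-Lipschitz functional can shift by at most the transport distance between the baseline model and any alternative. This sketch (empirical samples, a 1-Lipschitz test functional; all choices are mine, not the paper's) checks the bound numerically.

```python
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
# Baseline model P and a perturbed alternative Q (both empirical here).
p = rng.normal(0.0, 1.0, 5000)
q = rng.normal(0.3, 1.2, 5000)

# 1-Wasserstein distance between the two empirical distributions.
w1 = wasserstein_distance(p, q)

# For any 1-Lipschitz functional f, Kantorovich duality guarantees
# |E_P f - E_Q f| <= W1(P, Q); here f(x) = |x|.
gap = abs(np.abs(p).mean() - np.abs(q).mean())
print(gap, w1)
```

The worst-case bound over an entire Wasserstein ball, as developed in the paper, generalizes exactly this kind of inequality.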

  16. Improved Collaborative Filtering Algorithm using Topic Model

    Directory of Open Access Journals (Sweden)

    Liu Na

    2016-01-01

Full Text Available Collaborative filtering algorithms make use of interaction ratings between users and items to generate recommendations. Similarity among users or items is mostly calculated based on ratings, without considering explicit properties of the users or items involved. In this paper, we propose a collaborative filtering algorithm using a topic model. We describe the user-item matrix as a document-word matrix; users are represented as random mixtures over items, and each item is characterized by a distribution over users. The experiments showed that the proposed algorithm achieved better performance compared with other state-of-the-art algorithms on MovieLens data sets.
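The document-word analogy above (users as documents, items as words) can be sketched with scikit-learn's LDA on a synthetic interaction matrix. The toy data, number of topics, and scoring rule are all my assumptions for illustration, not the paper's setup.

```python
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation

rng = np.random.default_rng(1)
# Toy user-item interaction-count matrix: rows = users ("documents"),
# columns = items ("words"); two hidden taste groups, disjoint item pools.
n_users, n_items = 60, 40
true_groups = rng.integers(0, 2, n_users)
counts = np.zeros((n_users, n_items), dtype=int)
for u in range(n_users):
    liked = np.arange(0, 20) if true_groups[u] == 0 else np.arange(20, 40)
    np.add.at(counts[u], rng.choice(liked, 15), 1)

lda = LatentDirichletAllocation(n_components=2, random_state=0)
user_topics = lda.fit_transform(counts)   # users as mixtures over topics
item_topics = lda.components_ / lda.components_.sum(axis=1, keepdims=True)

# Score all items for user 0 by mixing the per-topic item distributions.
scores = user_topics[0] @ item_topics
top_item = int(np.argmax(scores))
print(top_item, scores.shape)
```

Ranking unseen items by these mixed topic scores is one simple way to turn the fitted topic model into recommendations.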

  17. Cost allocation model for distribution networks considering high penetration of distributed energy resources

    DEFF Research Database (Denmark)

    Soares, Tiago; Pereira, Fábio; Morais, Hugo

    2015-01-01

The high penetration of distributed energy resources (DER) in distribution networks and the competitive environment of electricity markets impose the use of new approaches in several domains. The network cost allocation, traditionally used in transmission networks, should be adapted and used in the distribution networks considering the specifications of the connected resources. The main goal is to develop a fairer methodology trying to distribute the distribution network use costs to all players which are using the network in each period. In this paper, a model considering different types of costs (fixed, losses, and congestion costs) is proposed, comprising the use of a large set of DER, namely distributed generation (DG), demand response (DR) of direct load control type, energy storage systems (ESS), and electric vehicles with capability of discharging energy to the network, which is known as vehicle-to-grid (V2G).

  18. Rationalisation of distribution functions for models of nanoparticle magnetism

    International Nuclear Information System (INIS)

    El-Hilo, M.; Chantrell, R.W.

    2012-01-01

A formalism is presented which reconciles the use of different distribution functions of particle diameter in analytical models of the magnetic properties of nanoparticle systems. For the lognormal distribution a transformation is derived which shows that a distribution of volume fraction transforms into a lognormal distribution of particle number, albeit with a modified median diameter. This transformation resolves an apparent discrepancy reported in Tournus and Tamion [Journal of Magnetism and Magnetic Materials 323 (2011) 1118]. - Highlights: ► We resolve a problem resulting from a misunderstanding of the nature of dispersion functions in models of nanoparticle magnetism. ► The derived transformation between distributions will be of benefit in comparing models and experimental results.
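The key lognormal property behind such transformations is standard: weighting a lognormal number distribution of diameters by particle volume (D³) yields another lognormal with the same log-standard deviation and the median multiplied by exp(3σ²). A quick Monte Carlo check (illustrative parameter values):

```python
import numpy as np

rng = np.random.default_rng(42)
sigma = 0.35          # log-standard deviation of the diameter distribution
d_median = 10.0       # number-weighted median particle diameter (nm)

# Number-weighted lognormal sample of diameters.
d = rng.lognormal(np.log(d_median), sigma, 1_000_000)

# Volume (D^3) weighting: the weighted median should land at
# d_median * exp(3 * sigma^2), with sigma unchanged.
w = d**3
order = np.argsort(d)
cum = np.cumsum(w[order]) / w.sum()
d_median_vol = d[order][np.searchsorted(cum, 0.5)]

predicted = d_median * np.exp(3 * sigma**2)
print(d_median_vol, predicted)
```

This is exactly the kind of modified-median relation that reconciles number-weighted and volume-weighted fits of the same sample.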

  19. Linear Power-Flow Models in Multiphase Distribution Networks: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Bernstein, Andrey; Dall' Anese, Emiliano

    2017-05-26

    This paper considers multiphase unbalanced distribution systems and develops approximate power-flow models where bus-voltages, line-currents, and powers at the point of common coupling are linearly related to the nodal net power injections. The linearization approach is grounded on a fixed-point interpretation of the AC power-flow equations, and it is applicable to distribution systems featuring (i) wye connections; (ii) ungrounded delta connections; (iii) a combination of wye-connected and delta-connected sources/loads; and, (iv) a combination of line-to-line and line-to-grounded-neutral devices at the secondary of distribution transformers. The proposed linear models can facilitate the development of computationally-affordable optimization and control applications -- from advanced distribution management systems settings to online and distributed optimization routines. Performance of the proposed models is evaluated on different test feeders.
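The fixed-point interpretation mentioned above can be shown on the simplest possible case: a single-phase, two-bus feeder. Evaluating the fixed-point map once at the no-load voltage makes the voltage affine in the injection, which is the essence of the linearization; the network values below are illustrative, not from the paper.

```python
import numpy as np

# Two-bus toy feeder: slack bus at v0 = 1.0 pu, line impedance z, complex
# power injection s at the load bus (negative = consumption).
v0 = 1.0 + 0.0j
z = 0.01 + 0.05j
s = -(0.5 + 0.2j)          # 0.5 pu active + 0.2 pu reactive load

# "Exact" solution via fixed-point iteration of v = v0 + z * conj(s / v).
v = v0
for _ in range(50):
    v = v0 + z * np.conj(s / v)

# Linear model: evaluate the fixed-point map once at the no-load voltage,
# so the bus voltage becomes an affine function of the injection s.
v_lin = v0 + z * np.conj(s / v0)
print(abs(v), abs(v_lin), abs(v - v_lin))
```

The multiphase models in the paper generalize this idea to wye/delta connections and full three-phase networks, but the approximation error behaves similarly: small at normal loading levels.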

  20. Distributed MAP in the SpinJa Model Checker

    Directory of Open Access Journals (Sweden)

    Stefan Vijzelaar

    2011-10-01

    Full Text Available Spin in Java (SpinJa is an explicit state model checker for the Promela modelling language also used by the SPIN model checker. Designed to be extensible and reusable, the implementation of SpinJa follows a layered approach in which each new layer extends the functionality of the previous one. While SpinJa has preliminary support for shared-memory model checking, it did not yet support distributed-memory model checking. This tool paper presents a distributed implementation of a maximal accepting predecessors (MAP search algorithm on top of SpinJa.

  1. Modeling and Control for Islanding Operation of Active Distribution Systems

    DEFF Research Database (Denmark)

    Cha, Seung-Tae; Wu, Qiuwei; Saleem, Arshad

    2011-01-01

Along with the increasing penetration of distributed generation (DG) in distribution systems, there are more resources for system operators to improve the operation and control of the whole system and enhance the reliability of electricity supply to customers. The distribution systems with DG are able to operate in islanding operation mode intentionally or unintentionally. In order to smooth the transition from grid-connected operation to islanding operation for distribution systems with DG, a multi-agent based controller is proposed to utilize different resources in the distribution systems to stabilize the frequency. Different agents are defined to represent different resources in the distribution systems. A test platform with a real time digital simulator (RTDS), an OPen Connectivity (OPC) protocol server and the multi-agent based intelligent controller is established to test the proposed multi-agent based controller.

  2. Modeling and analysis of solar distributed generation

    Science.gov (United States)

    Ortiz Rivera, Eduardo Ivan

Recent changes in the global economy are creating a big impact on our daily life. The price of oil is increasing and the number of reserves decreases every day. Also, dramatic demographic changes are impacting the viability of the electric infrastructure and ultimately the economic future of the industry. These are some of the reasons that many countries are looking for alternative energy to produce electric energy. The most common form of green energy in our daily life is solar energy. Converting solar energy into electrical energy requires solar panels, dc-dc converters, power control, sensors, and inverters. In this work, a photovoltaic module (PVM) model using the electrical characteristics provided by the manufacturer data sheet is presented for power system applications. Experimental results from testing are shown, verifying the proposed PVM model. Also in this work, three maximum power point tracker (MPPT) algorithms are presented to obtain the maximum power from a PVM. The first MPPT algorithm is a method based on Rolle's and Lagrange's theorems and can provide at least an approximate answer to a family of transcendental functions that cannot be solved using differential calculus. The second MPPT algorithm is based on the approximation of the proposed PVM model using fractional polynomials, where the shape, boundary conditions and performance of the proposed PVM model are satisfied. The third MPPT algorithm is based on the determination of the optimal duty cycle for a dc-dc converter and previous knowledge of the load or load matching conditions. Also, four algorithms to calculate the effective irradiance level and temperature over a photovoltaic module are presented in this work. The main reasons to develop these algorithms are the monitoring of climate conditions, the elimination of temperature and solar irradiance sensors, cost reductions for a photovoltaic inverter system, and the development of new algorithms to be integrated with maximum power point tracking.
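For context, the most common textbook MPPT scheme (perturb and observe, not necessarily one of the three algorithms proposed in this dissertation) can be sketched against a synthetic diode-style module curve. The module parameters below are invented for illustration, not taken from any datasheet.

```python
import numpy as np

# Synthetic single-diode-style module curve: I(V) = Isc - I0*(exp(V/Vt) - 1).
# Parameter values are illustrative, not a real datasheet.
ISC, I0, VT = 8.0, 5e-9, 1.4

def power(v):
    return v * (ISC - I0 * (np.exp(v / VT) - 1.0))

# Perturb & Observe: keep stepping the operating voltage in the direction
# that increased power; reverse when power drops.
v, step = 10.0, 0.2
p_prev = power(v)
for _ in range(500):
    v += step
    p = power(v)
    if p < p_prev:
        step = -step          # overshot the peak: reverse direction
    p_prev = p

# Brute-force reference maximum for comparison.
grid = np.linspace(0.0, 35.0, 20000)
v_ref = grid[np.argmax(power(grid))]
print(v, v_ref)
```

P&O settles into a small oscillation around the maximum power point, which is why the more analytical trackers described in the abstract (Rolle/Lagrange-based, fractional-polynomial, optimal duty cycle) can be attractive alternatives.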

  3. An Improved Network Security Situation Awareness Model

    Directory of Open Access Journals (Sweden)

    Li Fangwei

    2015-08-01

Full Text Available In order to reflect the performance of network security assessment fully and accurately, a new network security situation awareness model based on information fusion was proposed. The network security situation is the result of fusing three aspects of evaluation. In terms of attack, to improve the accuracy of evaluation, a situation assessment method for DDoS attacks based on packet-level information was proposed. In terms of vulnerability, an improved Common Vulnerability Scoring System (CVSS) was proposed, making the assessment more comprehensive. In terms of node weights, a method of calculating combined weights and optimizing the result by a Sequential Quadratic Programming (SQP) algorithm, which reduces the uncertainty of the fusion, was proposed. To verify the validity and necessity of the method, a testing platform was built and used for evaluation on the DARPA 2000 data sets. Experiments show that the method can improve the accuracy of evaluation results.
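The combined-weight step can be sketched as a small constrained optimization solved with SciPy's SLSQP (a stock sequential quadratic programming implementation). The two weight vectors and the least-squares objective below are invented for illustration; the paper's actual combination rule may differ.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical subjective (expert) and objective (data-driven) node weights.
w_subj = np.array([0.40, 0.35, 0.25])
w_obj  = np.array([0.20, 0.50, 0.30])

# Combined weights: stay close to both sources, subject to the simplex
# constraints (non-negative, sum to one), solved with SLSQP.
def loss(w):
    return np.sum((w - w_subj) ** 2) + np.sum((w - w_obj) ** 2)

res = minimize(
    loss,
    x0=np.full(3, 1 / 3),
    method="SLSQP",
    bounds=[(0.0, 1.0)] * 3,
    constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}],
)
w_comb = res.x
print(w_comb)
```

Here the unconstrained optimum (the average of the two vectors) already satisfies the simplex constraints, so SLSQP recovers it exactly.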

  4. Idealized models of the joint probability distribution of wind speeds

    Science.gov (United States)

    Monahan, Adam H.

    2018-05-01

    The joint probability distribution of wind speeds at two separate locations in space or points in time completely characterizes the statistical dependence of these two quantities, providing more information than linear measures such as correlation. In this study, we consider two models of the joint distribution of wind speeds obtained from idealized models of the dependence structure of the horizontal wind velocity components. The bivariate Rice distribution follows from assuming that the wind components have Gaussian and isotropic fluctuations. The bivariate Weibull distribution arises from power law transformations of wind speeds corresponding to vector components with Gaussian, isotropic, mean-zero variability. Maximum likelihood estimates of these distributions are compared using wind speed data from the mid-troposphere, from different altitudes at the Cabauw tower in the Netherlands, and from scatterometer observations over the sea surface. While the bivariate Rice distribution is more flexible and can represent a broader class of dependence structures, the bivariate Weibull distribution is mathematically simpler and may be more convenient in many applications. The complexity of the mathematical expressions obtained for the joint distributions suggests that the development of explicit functional forms for multivariate speed distributions from distributions of the components will not be practical for more complicated dependence structure or more than two speed variables.
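The construction behind both models can be demonstrated in its simplest (zero-mean, univariate-speed) form: isotropic mean-zero Gaussian components give Rayleigh-distributed speeds, which are Weibull with shape 2, so power-law transforms of the speed reach other Weibull shapes. The component standard deviation below is illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)
sigma = 3.0                       # std dev of each wind component (m/s)

# Mean-zero, isotropic Gaussian components give Rayleigh-distributed
# speeds (the zero-mean special case of the Rice construction).
u = rng.normal(0.0, sigma, 500_000)
v = rng.normal(0.0, sigma, 500_000)
speed = np.hypot(u, v)

# Rayleigh facts: mean = sigma*sqrt(pi/2), variance = (2 - pi/2)*sigma^2.
# A Rayleigh variable is Weibull with shape 2, so speed**(2/k) is Weibull
# with shape k (the power-law transformation used for the Weibull model).
print(speed.mean(), sigma * np.sqrt(np.pi / 2))
```

The bivariate versions in the paper arise by letting the component pairs at the two locations be correlated Gaussians before taking the moduli.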

  5. Numerical modeling of regional stress distributions for geothermal exploration

    Science.gov (United States)

    Guillon, Theophile; Peter-Borie, Mariane; Gentier, Sylvie; Blaisonneau, Arnold

    2017-04-01

Any high-enthalpy unconventional geothermal project can be jeopardized by the uncertainty on the presence of the geothermal resource at depth. Indeed, for the majority of such projects the geothermal resource is deeply seated and, with drilling costs increasing accordingly, must be located as precisely as possible to increase the chance of economic viability. In order to reduce the "geological risk", i.e., the chance of poorly locating the geothermal resource, a maximum amount of information must be gathered prior to any drilling of exploration and/or operational wells. Cross-interpretation from multiple disciplines (e.g., geophysics, hydrology, geomechanics …) should improve the locating of the geothermal resource and thus the positioning of exploration wells; this is the objective of the European project IMAGE (grant agreement No. 608553), under which the work presented here was carried out. As far as geomechanics is concerned, in situ stresses can have a great impact on the presence of a geothermal resource since they condition both the regime within the rock mass and the state of the major fault zones (and hence, the possible flow paths). In this work, we propose a geomechanical model to assess the stress distribution at the regional scale (characteristic length of 100 kilometers). Since they have a substantial impact on the stress distributions and on the possible creation of regional flow paths, the major fault zones are explicitly taken into account. The Distinct Element Method is used, where the medium is modeled as fully deformable blocks representing the rock mass, interacting through mechanically active joints depicting the fault zones. The first step of the study is to build the model geometry based on geological and geophysical evidence. Geophysical and structural geology results help position the major fault zones in the first place. Then, outcrop observations, structural models and site-specific geological knowledge give information on the fault

  6. Model Checking Geographically Distributed Interlocking Systems Using UMC

    DEFF Research Database (Denmark)

    Fantechi, Alessandro; Haxthausen, Anne Elisabeth; Nielsen, Michel Bøje Randahl

    2017-01-01

    the relevant distributed protocols. By doing that we obey the safety guidelines of the railway signalling domain, that require formal methods to support the certification of such products. We also show how formal modelling can help designing alternative distributed solutions, while maintaining adherence...

  7. Modelling aspects of distributed processing in telecommunication networks

    NARCIS (Netherlands)

    Tomasgard, A; Audestad, JA; Dye, S; Stougie, L; van der Vlerk, MH; Wallace, SW

    1998-01-01

The purpose of this paper is to formally describe new optimization models for telecommunication networks with distributed processing. Modern distributed networks put more focus on the processing of information and less on the actual transportation of data than we are traditionally used to in

  8. Modified Normal Demand Distributions in (R,S)-Inventory Models

    NARCIS (Netherlands)

    Strijbosch, L.W.G.; Moors, J.J.A.

    2003-01-01

To model demand, the normal distribution is by far the most popular; the disadvantage that it takes negative values is taken for granted. This paper proposes two modifications of the normal distribution, both taking non-negative values only. Safety factors and order-up-to-levels for the familiar (R, S)
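The issue the paper targets can be made concrete with the textbook (R, S) order-up-to formula, computed once with plain normal demand and once with demand truncated at zero (one simple non-negative modification, chosen here for illustration; the paper's own modifications may differ). All numeric values are illustrative.

```python
import numpy as np
from scipy.stats import norm, truncnorm

# Demand per period: nominally normal with a large coefficient of
# variation, so the plain normal puts non-trivial mass below zero.
mu, sd = 20.0, 12.0
R, L = 1, 2                      # review period and lead time (periods)
alpha = 0.95                     # target cycle service level
periods = R + L

# Classic normal-demand order-up-to level S = mean + safety stock.
S_normal = periods * mu + norm.ppf(alpha) * sd * np.sqrt(periods)

# Same formula, but with per-period moments of the left-truncated normal
# (demand cannot be negative), illustrating how the levels shift.
a = (0.0 - mu) / sd
mu_t = truncnorm.mean(a, np.inf, loc=mu, scale=sd)
sd_t = truncnorm.std(a, np.inf, loc=mu, scale=sd)
S_trunc = periods * mu_t + norm.ppf(alpha) * sd_t * np.sqrt(periods)
print(S_normal, S_trunc)
```

Truncation raises the mean and shrinks the standard deviation, so ignoring non-negativity distorts both the cycle stock and the safety stock.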

  9. Income Distribution Over Educational Levels: A Simple Model.

    Science.gov (United States)

    Tinbergen, Jan

An econometric model is formulated that explains income per person in various compartments of the labor market defined by three main levels of education and by education required. The model enables an estimation of the effect of increased access to education on that distribution. The model is based on a production function for the economy as a whole; a…

  10. Development of vortex model with realistic axial velocity distribution

    International Nuclear Information System (INIS)

    Ito, Kei; Ezure, Toshiki; Ohshima, Hiroyuki

    2014-01-01

A vortex is considered as one of the significant phenomena which may cause gas entrainment (GE) and/or vortex cavitation in sodium-cooled fast reactors. In our past studies, the vortex was assumed to be approximated by the well-known Burgers vortex model. However, the Burgers vortex model has a simple but unrealistic assumption that the axial velocity component is horizontally constant, while in reality the free-surface vortex has an axial velocity distribution with a large gradient in the radial direction near the vortex center. In this study, a new vortex model with a realistic axial velocity distribution is proposed. This model is derived from the steady axisymmetric Navier-Stokes equation, as is the Burgers vortex model, but a realistic radial distribution of the axial velocity is considered, which is defined to be zero at the vortex center and to approach asymptotically zero at infinity. As verification, the new vortex model is applied to the evaluation of a simple vortex experiment and shows good agreement with the experimental data in terms of the circumferential velocity distribution and the free-surface shape. In addition, it is confirmed that the Burgers vortex model fails to calculate an accurate velocity distribution under the assumption of uniform axial velocity. However, the calculation accuracy of the Burgers vortex model can be brought close to that of the new vortex model by considering the effective axial velocity, which is calculated as the average value only in the vicinity of the vortex center. (author)
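For reference, the circumferential velocity profile of the classical Burgers-type vortex (the quantity on which the abstract compares models against experiment) can be evaluated directly: solid-body-like rotation inside the core and an irrotational 1/r decay outside. Circulation and core radius below are illustrative values.

```python
import numpy as np

GAMMA = 1.0        # total circulation (m^2/s), illustrative
RC = 0.05          # core radius (m), illustrative

def v_theta(r):
    """Circumferential velocity of a Burgers-type vortex: near-solid-body
    rotation inside the core, free-vortex (1/r) decay far outside."""
    r = np.asarray(r, dtype=float)
    return GAMMA / (2 * np.pi * r) * (1.0 - np.exp(-(r / RC) ** 2))

r = np.linspace(1e-4, 1.0, 10_000)
v = v_theta(r)
r_peak = r[np.argmax(v)]           # peak sits near 1.12 * RC
far = GAMMA / (2 * np.pi * 0.9)    # free-vortex value at r = 0.9 m
print(r_peak, float(v_theta(0.9)), far)
```

The new model in the paper keeps this kind of circumferential structure while replacing the horizontally constant axial velocity with a radially varying one.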

  11. Application of Improved Radiation Modeling to General Circulation Models

    Energy Technology Data Exchange (ETDEWEB)

    Michael J Iacono

    2011-04-07

    This research has accomplished its primary objectives of developing accurate and efficient radiation codes, validating them with measurements and higher resolution models, and providing these advancements to the global modeling community to enhance the treatment of cloud and radiative processes in weather and climate prediction models. A critical component of this research has been the development of the longwave and shortwave broadband radiative transfer code for general circulation model (GCM) applications, RRTMG, which is based on the single-column reference code, RRTM, also developed at AER. RRTMG is a rigorously tested radiation model that retains a considerable level of accuracy relative to higher resolution models and measurements despite the performance enhancements that have made it possible to apply this radiation code successfully to global dynamical models. This model includes the radiative effects of all significant atmospheric gases, and it treats the absorption and scattering from liquid and ice clouds and aerosols. RRTMG also includes a statistical technique for representing small-scale cloud variability, such as cloud fraction and the vertical overlap of clouds, which has been shown to improve cloud radiative forcing in global models. This development approach has provided a direct link from observations to the enhanced radiative transfer provided by RRTMG for application to GCMs. Recent comparison of existing climate model radiation codes with high resolution models has documented the improved radiative forcing capability provided by RRTMG, especially at the surface, relative to other GCM radiation models. Due to its high accuracy, its connection to observations, and its computational efficiency, RRTMG has been implemented operationally in many national and international dynamical models to provide validated radiative transfer for improving weather forecasts and enhancing the prediction of global climate change.

  12. Regularized multivariate regression models with skew-t error distributions

    KAUST Repository

    Chen, Lianfu; Pourahmadi, Mohsen; Maadooliat, Mehdi

    2014-01-01

    We consider regularization of the parameters in multivariate linear regression models with the errors having a multivariate skew-t distribution. An iterative penalized likelihood procedure is proposed for constructing sparse estimators of both

  13. Modelling distributed energy resources in energy service networks

    CERN Document Server

    Acha, Salvador

    2013-01-01

    Focuses on modelling two key infrastructures (natural gas and electrical) in urban energy systems with embedded technologies (cogeneration and electric vehicles) to optimise the operation of natural gas and electrical infrastructures under the presence of distributed energy resources

  14. Modeling of Drift Effects on Solar Tower Concentrated Flux Distributions

    Directory of Open Access Journals (Sweden)

    Luis O. Lara-Cerecedo

    2016-01-01

    Full Text Available A novel modeling tool for calculation of central receiver concentrated flux distributions is presented, which takes into account drift effects. This tool is based on a drift model that includes different geometrical error sources in a rigorous manner and on a simple analytic approximation for the individual flux distribution of a heliostat. The model is applied to a group of heliostats of a real field to obtain the resulting flux distribution and its variation along the day. The distributions differ strongly from those obtained assuming the ideal case without drift or a case with a Gaussian tracking error function. The time evolution of peak flux is also calculated to demonstrate the capabilities of the model. The evolution of this parameter also shows strong differences in comparison to the case without drift.

  15. A Framework for Modeling and Analyzing Complex Distributed Systems

    National Research Council Canada - National Science Library

    Lynch, Nancy A; Shvartsman, Alex Allister

    2005-01-01

    Report developed under STTR contract for topic AF04-T023. This Phase I project developed a modeling language and laid a foundation for computational support tools for specifying, analyzing, and verifying complex distributed system designs...

  16. An Improved QTM Subdivision Model with Approximate Equal-area

    Directory of Open Access Journals (Sweden)

    ZHAO Xuesheng

    2016-01-01

Full Text Available To overcome the defect of large area deformation in the traditional QTM subdivision model, an improved subdivision model is proposed, based on the "parallel method" and the idea of equal-area subdivision with changed longitude-latitude. By adjusting the position of the parallel, this model ensures that the grid areas between two adjacent parallels combine with no variation, so as to control the area variation and variation accumulation of the QTM grid. The experimental results show that this improved model not only retains some advantages of the traditional QTM model (such as simple calculation and a clear correspondence with the longitude/latitude grid, etc.), but also has the following advantages: ① this improved model has better convergence than the traditional one; the ratio of area_max/min finally converges to 1.38, far less than the 1.73 of the "parallel method"; ② the grid units in middle and low latitude regions have small area variations and successive distributions; meanwhile, with the increase of subdivision level, the grid units with large variations gradually concentrate toward the poles; ③ the area variation of a grid unit does not accumulate as the subdivision level increases.

  17. An improved gravity model for Mars: Goddard Mars Model 1

    Science.gov (United States)

    Smith, D. E.; Lerch, F. J.; Nerem, R. S.; Zuber, M. T.; Patel, G. B.; Fricke, S. K.; Lemoine, F. G.

    1993-01-01

Doppler tracking data of three orbiting spacecraft have been reanalyzed to develop a new gravitational field model for the planet Mars, Goddard Mars Model 1 (GMM-1). This model employs nearly all available data, consisting of approximately 1100 days of S band tracking data collected by NASA's Deep Space Network from the Mariner 9 and Viking 1 and Viking 2 spacecraft, in seven different orbits, between 1971 and 1979. GMM-1 is complete to spherical harmonic degree and order 50, which corresponds to a half-wavelength spatial resolution of 200-300 km where the data permit. GMM-1 represents satellite orbits with considerably better accuracy than previous Mars gravity models and shows greater resolution of identifiable geological structures. The notable improvement in GMM-1 over previous models is a consequence of several factors: improved computational capabilities, the use of optimum weighting and least squares collocation solution techniques which stabilized the behavior of the solution at high degree and order, and the use of longer satellite arcs than employed in previous solutions, made possible by improved force and measurement models. The inclusion of X band tracking data from the 379-km altitude, near-polar orbiting Mars Observer spacecraft should provide a significant improvement over GMM-1, particularly at high latitudes where current data poorly resolve the gravitational signature of the planet.
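The quoted 200-300 km resolution follows from the standard rule of thumb for a spherical harmonic expansion truncated at degree l: half-wavelength resolution ≈ πR/l. A quick check with the mean radius of Mars:

```python
import numpy as np

R_MARS = 3389.5          # mean radius of Mars, km

def half_wavelength_km(degree, radius_km=R_MARS):
    """Half-wavelength spatial resolution of a spherical harmonic
    expansion truncated at the given degree: pi * R / l."""
    return np.pi * radius_km / degree

res = half_wavelength_km(50)
print(res)               # ~213 km, consistent with the quoted 200-300 km
```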

  18. Canonical Probability Distributions for Model Building, Learning, and Inference

    National Research Council Canada - National Science Library

    Druzdzel, Marek J

    2006-01-01

    ...) improvements of stochastic sampling algorithms based on importance sampling, and (3) practical applications of our general purpose decision modeling environment to diagnosis of complex systems...

  19. Improving the cooling performance of electrical distribution transformer using transformer oil – Based MEPCM suspension

    Directory of Open Access Journals (Sweden)

    Mushtaq Ismael Hasan

    2017-04-01

Full Text Available In this paper the electrical distribution transformer has been studied numerically and the effect of outside temperature on its cooling performance has been investigated. The temperature range studied covers hot climate regions. A 250 kVA distribution transformer is chosen as the study model. A novel cooling fluid is proposed to improve the cooling performance of this transformer: transformer oil-based microencapsulated phase change material (MEPCM) suspension is used with volume concentrations of 5-25% as the cooling fluid instead of pure transformer oil. Paraffin wax is used as the phase change material to make the suspension; in addition to its ability to absorb heat through melting, paraffin wax is a good electrical insulator. The results obtained show that using the MEPCM suspension instead of pure transformer oil improves the cooling performance of the transformer by reducing its temperature and, as a consequence, increases its protection against breakdown. The melting fraction increased with increasing outside temperature up to a certain temperature, after which the melting fraction reaches a maximum constant value (MF = 1), which indicates that the choice of PCM depends on the environment in which the transformer is used.

  20. The probability distribution model of air pollution index and its dominants in Kuala Lumpur

    Science.gov (United States)

    AL-Dhurafi, Nasr Ahmed; Razali, Ahmad Mahir; Masseran, Nurulkamal; Zamzuri, Zamira Hasanah

    2016-11-01

This paper focuses on statistical modeling of the distributions of the air pollution index (API) and its sub-index data observed at Kuala Lumpur in Malaysia. Five pollutants or sub-indexes are measured, including carbon monoxide (CO), sulphur dioxide (SO2), nitrogen dioxide (NO2), and particulate matter (PM10). Four probability distributions are considered, namely log-normal, exponential, Gamma and Weibull, in search of the best-fit distribution to the Malaysian air pollutant data. In order to determine the best distribution for describing the air pollutant data, five goodness-of-fit criteria are applied. This will help in minimizing the uncertainty in pollution resource estimates and improving the assessment phase of planning. The conflict in criterion results for selecting the best distribution was overcome by using the weight-of-ranks method. We found that the Gamma distribution is the best distribution for the majority of air pollutant data in Kuala Lumpur.
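The fit-and-compare workflow described above can be sketched with SciPy: fit each candidate family by maximum likelihood and score it with a goodness-of-fit statistic (here only the Kolmogorov-Smirnov statistic; the paper uses five criteria combined by the weight-of-ranks method). The synthetic data stand in for a pollutant series and are my assumption, not the Kuala Lumpur records.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Stand-in for a pollutant concentration series (true law: Gamma).
data = rng.gamma(shape=2.0, scale=15.0, size=20_000)

# Fit each candidate family by maximum likelihood and score it with the
# Kolmogorov-Smirnov statistic (smaller = better fit).
candidates = {
    "lognorm": stats.lognorm,
    "expon": stats.expon,
    "gamma": stats.gamma,
    "weibull_min": stats.weibull_min,
}
ks = {}
for name, dist in candidates.items():
    params = dist.fit(data)
    ks[name] = stats.kstest(data, name, args=params).statistic

best = min(ks, key=ks.get)
print(sorted(ks.items(), key=lambda kv: kv[1]))
```

With data actually drawn from a Gamma law, the Gamma fit achieves a markedly smaller KS statistic than, e.g., the exponential, mirroring the paper's conclusion for the Kuala Lumpur data.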

  1. A Complex Network Approach to Distributional Semantic Models.

    Directory of Open Access Journals (Sweden)

    Akira Utsumi

    Full Text Available A number of studies on network analysis have focused on language networks based on free word association, which reflects human lexical knowledge, and have demonstrated the small-world and scale-free properties in the word association network. Nevertheless, there have been very few attempts at applying network analysis to distributional semantic models, despite the fact that these models have been studied extensively as computational or cognitive models of human lexical knowledge. In this paper, we analyze three network properties, namely, small-world, scale-free, and hierarchical properties, of semantic networks created by distributional semantic models. We demonstrate that the created networks generally exhibit the same properties as word association networks. In particular, we show that the distribution of the number of connections in these networks follows the truncated power law, which is also observed in an association network. This indicates that distributional semantic models can provide a plausible model of lexical knowledge. Additionally, the observed differences in the network properties of various implementations of distributional semantic models are consistently explained or predicted by considering the intrinsic semantic features of a word-context matrix and the functions of matrix weighting and smoothing. Furthermore, to simulate a semantic network with the observed network properties, we propose a new growing network model based on the model of Steyvers and Tenenbaum. The idea underlying the proposed model is that both preferential and random attachments are required to reflect different types of semantic relations in network growth process. We demonstrate that this model provides a better explanation of network behaviors generated by distributional semantic models.

  2. Design, modeling, simulation and evaluation of a distributed energy system

    Science.gov (United States)

    Cultura, Ambrosio B., II

    This dissertation presents the design, modeling, simulation and evaluation of distributed energy resources (DER) consisting of photovoltaics (PV), wind turbines, batteries, a PEM fuel cell and supercapacitors. The distributed energy resources installed at UMass Lowell consist of the following: 2.5 kW PV, 44 kWh lead-acid batteries and 1500 W, 500 W & 300 W wind turbines, which were installed before the year 2000. Recently added are the following: a 10.56 kW PV array, a 2.4 kW wind turbine, 29 kWh lead-acid batteries, a 1.2 kW PEM fuel cell and four 140 F supercapacitors. Each newly added energy resource was designed, modeled, simulated and evaluated before its integration into the existing PV/wind grid-connected system. The mathematical and Simulink models of each system were derived and validated by comparing simulated and experimental results. The simulated results for energy generated by the 10.56 kW PV system are in good agreement with the experimental results. A detailed electrical model of the 2.4 kW wind turbine system equipped with a permanent magnet generator, diode rectifier, boost converter and inverter is presented. The analysis of the results demonstrates the effectiveness of the constructed Simulink model, which can be used to predict the performance of the wind turbine. It was observed that the PEM fuel cell has a very fast response to load changes. Moreover, the model was validated against the actual operation of the PEM fuel cell, showing that the results simulated in MATLAB Simulink are consistent with the experimental results. The equivalent mathematical equation, derived from an electrical model of the supercapacitor, is used to simulate its voltage response. The model is fully capable of simulating the supercapacitor's voltage behavior and can predict its charge and discharge times. A bidirectional dc-dc converter was designed to connect the 48 V battery bank to the 24 V battery bank. This connection was

  3. Software reliability growth models with normal failure time distributions

    International Nuclear Information System (INIS)

    Okamura, Hiroyuki; Dohi, Tadashi; Osaki, Shunji

    2013-01-01

    This paper proposes software reliability growth models (SRGMs) in which the software failure time follows a normal distribution. The proposed model is mathematically tractable and fits software failure data well. In particular, we consider the parameter estimation algorithm for the SRGM with a normal distribution. The developed algorithm is based on an EM (expectation-maximization) algorithm and is quite simple to implement as a software application. Numerical experiments investigate the fitting ability of the SRGMs with normal distributions on 16 sets of failure time data collected from real software projects
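For intuition, a minimal sketch of the normal failure-time fit under a strong simplifying assumption: if the failure times are treated as a fixed, fully observed i.i.d. sample, the normal MLE is closed-form (sample mean and variance). The paper's EM algorithm is needed for the harder NHPP setting, where the total number of faults is itself unknown; the failure times below are hypothetical.

```python
import math

def normal_mle(times):
    """Closed-form MLE of a normal failure-time distribution from a
    fully observed sample (a simplification of the paper's setting)."""
    n = len(times)
    mu = sum(times) / n
    var = sum((t - mu) ** 2 for t in times) / n
    return mu, math.sqrt(var)

# Hypothetical failure times (hours) from a test campaign.
times = [12.0, 15.5, 19.0, 22.5, 26.0]
mu, sigma = normal_mle(times)
```

With censored or grouped failure data the likelihood no longer has this closed form, which is exactly where an EM iteration earns its keep.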

  4. Maxent modelling for predicting the potential distribution of Thai Palms

    DEFF Research Database (Denmark)

    Tovaranonte, Jantrararuk; Barfod, Anders S.; Overgaard, Anne Blach

    2011-01-01

    on presence data. The aim was to identify potential hot spot areas, assess the determinants of palm distribution ranges, and provide a firmer knowledge base for future conservation actions. We focused on a relatively small number of climatic, environmental and spatial variables in order to avoid...... overprediction of species distribution ranges. The models with the best predictive power were found by calculating the area under the curve (AUC) of receiver-operating characteristic (ROC). Here, we provide examples of contrasting predicted species distribution ranges as well as a map of modeled palm diversity...

  5. Improved Denoising via Poisson Mixture Modeling of Image Sensor Noise.

    Science.gov (United States)

    Zhang, Jiachao; Hirakawa, Keigo

    2017-04-01

    This paper describes a study aimed at comparing the real image sensor noise distribution to the noise models often assumed in image denoising designs. A quantile analysis in the pixel, wavelet transform, and variance stabilization domains reveals that the tails of the Poisson, signal-dependent Gaussian, and Poisson-Gaussian models are too short to capture real sensor noise behavior. A new Poisson mixture noise model is proposed to correct the mismatch in tail behavior. Based on the fact that noise model mismatch results in image denoising that undersmooths real sensor data, we propose a mixture-of-Poisson denoising method to remove the denoising artifacts without affecting image details, such as edges and textures. Experiments with real sensor data verify that denoising of real image sensor data is indeed improved by this new technique.
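The key property the abstract relies on can be demonstrated in a few lines: a mixture of Poissons is overdispersed (variance exceeds the mean), giving heavier tails than any single Poisson, whose variance equals its mean. The rates and weights below are invented for illustration, not taken from the paper.

```python
import math
import random

random.seed(0)

def poisson_mixture_sample(n, rates, weights):
    """Draw n samples from a two-component Poisson mixture using
    Knuth's multiplication method for each Poisson draw."""
    samples = []
    for _ in range(n):
        lam = random.choices(rates, weights=weights)[0]
        L, k, p = math.exp(-lam), 0, 1.0
        while True:
            p *= random.random()
            if p <= L:
                break
            k += 1
        samples.append(k)
    return samples

xs = poisson_mixture_sample(20000, rates=[5.0, 40.0], weights=[0.7, 0.3])
mean = sum(xs) / len(xs)
var = sum((x - mean) ** 2 for x in xs) / len(xs)
# Overdispersion: var >> mean, unlike a single Poisson where var == mean.
```

Analytically this mixture has mean 15.5 but variance 272.75, which is the longer-tail behavior a single Poisson cannot reproduce.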

  6. A Stochastic After-Taxes Optimisation Model to Support Distribution Network Strategies

    DEFF Research Database (Denmark)

    Fernandes, Rui; Hvolby, Hans-Henrik; Gouveia, Borges

    2012-01-01

    The paper proposes a stochastic model to integrate tax issues into strategic distribution network decisions. Specifically, this study explores the role of distribution models in business profitability, and how to use the network design to deliver additional bottom-line results, using...... distribution centres located in different countries. The challenge is also to reveal how financial and tax knowledge can help logistics leaders improve the value of their companies under global solutions and sources of business net profitability in a dynamic environment. In particular, based on inventory...

  7. Improving Bioenergy Crops through Dynamic Metabolic Modeling

    Directory of Open Access Journals (Sweden)

    Mojdeh Faraji

    2017-10-01

    Full Text Available Enormous advances in genetics and metabolic engineering have made it possible, in principle, to create new plants and crops with improved yield through targeted molecular alterations. However, while the potential is beyond doubt, the actual implementation of envisioned new strains is often difficult, due to the diverse and complex nature of plants. Indeed, the intrinsic complexity of plants makes intuitive predictions difficult and often unreliable. The hope for overcoming this challenge is that methods of data mining and computational systems biology may become powerful enough that they could serve as beneficial tools for guiding future experimentation. In the first part of this article, we review the complexities of plants, as well as some of the mathematical and computational methods that have been used in the recent past to deepen our understanding of crops and their potential yield improvements. In the second part, we present a specific case study that indicates how robust models may be employed for crop improvements. This case study focuses on the biosynthesis of lignin in switchgrass (Panicum virgatum. Switchgrass is considered one of the most promising candidates for the second generation of bioenergy production, which does not use edible plant parts. Lignin is important in this context, because it impedes the use of cellulose in such inedible plant materials. The dynamic model offers a platform for investigating the pathway behavior in transgenic lines. In particular, it allows predictions of lignin content and composition in numerous genetic perturbation scenarios.

  8. Improving PSA quality of KSNP PSA model

    International Nuclear Information System (INIS)

    Yang, Joon Eon; Ha, Jae Joo

    2004-01-01

    In the RIR (Risk-informed Regulation), PSA (Probabilistic Safety Assessment) plays a major role because it provides overall risk insights for the regulatory body and utility. Therefore, the scope, the level of details and the technical adequacy of PSA, i.e. the quality of PSA is to be ensured for the successful RIR. To improve the quality of Korean PSA, we evaluate the quality of the KSNP (Korean Standard Nuclear Power Plant) internal full-power PSA model based on the 'ASME PRA Standard' and the 'NEI PRA Peer Review Process Guidance.' As a working group, PSA experts of the regulatory body and industry also participated in the evaluation process. It is finally judged that the overall quality of the KSNP PSA is between the ASME Standard Capability Category I and II. We also derive some items to be improved for upgrading the quality of the PSA up to the ASME Standard Capability Category II. In this paper, we show the result of quality evaluation, and the activities to improve the quality of the KSNP PSA model

  9. Validation of model predictions of pore-scale fluid distributions during two-phase flow

    Science.gov (United States)

    Bultreys, Tom; Lin, Qingyang; Gao, Ying; Raeini, Ali Q.; AlRatrout, Ahmed; Bijeljic, Branko; Blunt, Martin J.

    2018-05-01

    Pore-scale two-phase flow modeling is an important technology to study a rock's relative permeability behavior. To investigate if these models are predictive, the calculated pore-scale fluid distributions which determine the relative permeability need to be validated. In this work, we introduce a methodology to quantitatively compare models to experimental fluid distributions in flow experiments visualized with microcomputed tomography. First, we analyzed five repeated drainage-imbibition experiments on a single sample. In these experiments, the exact fluid distributions were not fully repeatable on a pore-by-pore basis, while the global properties of the fluid distribution were. Then two fractional flow experiments were used to validate a quasistatic pore network model. The model correctly predicted the fluid present in more than 75% of pores and throats in drainage and imbibition. To quantify what this means for the relevant global properties of the fluid distribution, we compare the main flow paths and the connectivity across the different pore sizes in the modeled and experimental fluid distributions. These essential topology characteristics matched well for drainage simulations, but not for imbibition. This suggests that the pore-filling rules in the network model we used need to be improved to make reliable predictions of imbibition. The presented analysis illustrates the potential of our methodology to systematically and robustly test two-phase flow models to aid in model development and calibration.
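The paper's headline validation metric (the model predicted the fluid present in more than 75% of pores and throats) is a simple pore-by-pore agreement fraction, which can be sketched as below; the occupancy maps are hypothetical stand-ins for segmented micro-CT data.

```python
def pore_agreement(model, experiment):
    """Fraction of pores/throats where the model predicts the same
    occupying fluid as observed in the imaging experiment."""
    assert model.keys() == experiment.keys()
    matched = sum(model[p] == experiment[p] for p in model)
    return matched / len(model)

# Hypothetical fluid occupancy maps (pore id -> occupying phase).
model      = {1: "oil", 2: "oil",   3: "brine", 4: "brine", 5: "oil"}
experiment = {1: "oil", 2: "brine", 3: "brine", 4: "brine", 5: "oil"}
score = pore_agreement(model, experiment)  # 4 of 5 pores match
```

As the abstract notes, a high pore-by-pore score is necessary but not sufficient: the global topology of the fluid distribution (main flow paths, connectivity across pore sizes) must be compared as well.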

  10. Renewable Distributed Generation Models in Three-Phase Load Flow Analysis for Smart Grid

    Directory of Open Access Journals (Sweden)

    K. M. Nor

    2013-11-01

    Full Text Available The paper presents renewable distributed generation (RDG) models as three-phase resources in load-flow computation and analyzes their effect when they are connected in composite networks. The RDG models considered comprise photovoltaic (PV) and wind turbine generation (WTG). The voltage-controlled node and complex power injection node are used in the models. These improved models are suitable for smart-grid power system analysis. A combination of IEEE transmission data and IEEE test feeders is used to test the algorithm on balanced and unbalanced multi-phase distribution system problems. The simulation results show that increasing the number and size of RDG units improves the voltage profile and reduces system losses.

  11. Modeling, robust and distributed model predictive control for freeway networks

    NARCIS (Netherlands)

    Liu, S.

    2016-01-01

    In Model Predictive Control (MPC) for traffic networks, traffic models are crucial since they are used as prediction models for determining the optimal control actions. In order to reduce the computational complexity of MPC for traffic networks, macroscopic traffic models are often used instead of

  12. The Distributed Geothermal Market Demand Model (dGeo): Documentation

    Energy Technology Data Exchange (ETDEWEB)

    McCabe, Kevin [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Mooney, Meghan E [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Sigrin, Benjamin O [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Gleason, Michael [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Liu, Xiaobing [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-11-06

    The National Renewable Energy Laboratory (NREL) developed the Distributed Geothermal Market Demand Model (dGeo) as a tool to explore the potential role of geothermal distributed energy resources (DERs) in meeting thermal energy demands in the United States. The dGeo model simulates the potential for deployment of geothermal DERs in the residential and commercial sectors of the continental United States for two specific technologies: ground-source heat pumps (GHP) and geothermal direct use (DU) for district heating. To quantify the opportunity space for these technologies, dGeo leverages a highly resolved geospatial database and robust bottom-up, agent-based modeling framework. This design is consistent with others in the family of Distributed Generation Market Demand models (dGen; Sigrin et al. 2016), including the Distributed Solar Market Demand (dSolar) and Distributed Wind Market Demand (dWind) models. dGeo is intended to serve as a long-term scenario-modeling tool. It has the capability to simulate the technical potential, economic potential, market potential, and technology deployment of GHP and DU through the year 2050 under a variety of user-defined input scenarios. Through these capabilities, dGeo can provide substantial analytical value to various stakeholders interested in exploring the effects of various techno-economic, macroeconomic, financial, and policy factors related to the opportunity for GHP and DU in the United States. This report documents the dGeo modeling design, methodology, assumptions, and capabilities.

  13. Working toward integrated models of alpine plant distribution.

    Science.gov (United States)

    Carlson, Bradley Z; Randin, Christophe F; Boulangeat, Isabelle; Lavergne, Sébastien; Thuiller, Wilfried; Choler, Philippe

    2013-10-01

    Species distribution models (SDMs) have been frequently employed to forecast the response of alpine plants to global changes. Efforts to model alpine plant distribution have thus far been primarily based on a correlative approach, in which ecological processes are implicitly addressed through a statistical relationship between observed species occurrences and environmental predictors. Recent evidence, however, highlights the shortcomings of correlative SDMs, especially in alpine landscapes where plant species tend to be decoupled from atmospheric conditions in micro-topographic habitats and are particularly exposed to geomorphic disturbances. While alpine plants respond to the same limiting factors as plants found at lower elevations, alpine environments impose a particular set of scale-dependent and hierarchical drivers that shape the realized niche of species and that require explicit consideration in a modelling context. Several recent studies in the European Alps have successfully integrated both correlative and process-based elements into distribution models of alpine plants, but for the time being a single integrative modelling framework that includes all key drivers remains elusive. As a first step in working toward a comprehensive integrated model applicable to alpine plant communities, we propose a conceptual framework that structures the primary mechanisms affecting alpine plant distributions. We group processes into four categories, including multi-scalar abiotic drivers, gradient dependent species interactions, dispersal and spatial-temporal plant responses to disturbance. Finally, we propose a methodological framework aimed at developing an integrated model to better predict alpine plant distribution.

  14. Orbital angular momentum parton distributions in quark models

    International Nuclear Information System (INIS)

    Scopetta, S.; Vento, V.

    2000-01-01

    At the low-energy, hadronic, scale we calculate Orbital Angular Momentum (OAM) twist-two parton distributions for the relativistic MIT bag model and for nonrelativistic quark models. We reach the scale of the data by leading-order evolution in perturbative QCD. We confirm that the contribution of quark and gluon OAM to the nucleon spin grows with Q², and it can be relevant at the experimental scale, even if it is negligible at the hadronic scale, irrespective of the model used. The sign and shape of the quark OAM distribution at high Q² may depend strongly on the relative size of the OAM and spin distributions at the hadronic scale. Sizeable quark OAM distributions at the hadronic scale, as proposed by several authors, can produce the dominant contribution to the nucleon spin at high Q². (author)

  15. Modelling and analysis of distributed simulation protocols with distributed graph transformation

    OpenAIRE

    Lara, Juan de; Taentzer, Gabriele

    2005-01-01

    J. de Lara and G. Taentzer, "Modelling and analysis of distributed simulation protocols with distributed graph transformation...

  16. Improving Clinical Trial Cohort Definition Criteria and Enrollment with Distributional Semantic Matching

    OpenAIRE

    Shao, Jianyin; Gouripeddi, Ramkiran; Facelli, Julio C.

    2016-01-01

    Shao, J., Gouripeddi, R., & Facelli, J.C. (2016). Improving Clinical Trial Cohort Definition Criteria and Enrollment with Distributional Semantic Matching (poster). Research Reproducibility 2016. Salt Lake City, UT, USA

  17. Modeling highway travel time distribution with conditional probability models

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira Neto, Francisco Moraes [ORNL; Chin, Shih-Miao [ORNL; Hwang, Ho-Ling [ORNL; Han, Lee [University of Tennessee, Knoxville (UTK)

    2014-01-01

    Under the sponsorship of the Federal Highway Administration's Office of Freight Management and Operations, the American Transportation Research Institute (ATRI) has developed performance measures through the Freight Performance Measures (FPM) initiative. Under this program, travel speed information is derived from data collected using wireless-based global positioning systems. These telemetric data systems are subscribed to and used by the trucking industry as an operations management tool. More than one telemetric operator submits data dumps to ATRI on a regular basis. Each data transmission contains a truck's location, its travel time, and a clock time/date stamp. Data from the FPM program provide a unique opportunity for studying upstream-downstream speed distributions at different locations, as well as at different times of the day and days of the week. This research focuses on the stochastic nature of successive link travel speed data on the continental United States Interstate network. Specifically, a method to estimate route probability distributions of travel time is proposed. This method uses the concepts of convolution of probability distributions and bivariate, link-to-link, conditional probability to estimate the expected distributions of route travel time. The major contribution of this study is the consideration of speed correlation between upstream and downstream contiguous Interstate segments through conditional probability. The established conditional probability distributions between successive segments can be used to provide travel time reliability measures. This study also suggests an adaptive method for calculating and updating the route travel time distribution as new data or information are added. This methodology can be useful for estimating performance measures as required by the recent Moving Ahead for Progress in the 21st Century Act (MAP-21).
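The study's core idea can be sketched with discrete pmfs: instead of a plain convolution (which assumes link times are independent), the route distribution is built from a link-to-link conditional pmf, so congestion on the upstream link propagates to the downstream one. The travel-time values below are hypothetical.

```python
def route_distribution(p1, cond):
    """Combine an upstream link's travel-time pmf p1 with a conditional
    pmf cond[t1][t2] = P(T2 = t2 | T1 = t1) to obtain the pmf of the
    route time T1 + T2."""
    route = {}
    for t1, pr1 in p1.items():
        for t2, pr2 in cond[t1].items():
            route[t1 + t2] = route.get(t1 + t2, 0.0) + pr1 * pr2
    return route

# Hypothetical pmfs (minutes): a slow upstream link makes slow
# downstream times more likely, which independence would miss.
p1 = {10: 0.6, 20: 0.4}
cond = {10: {5: 0.8, 15: 0.2},
        20: {5: 0.3, 15: 0.7}}
route = route_distribution(p1, cond)
```

Under independence one would instead convolve p1 with a single marginal pmf for the second link; the conditional form is what captures the upstream-downstream speed correlation the abstract emphasizes.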

  18. Modelling the potential distribution of Betula utilis in the Himalaya

    Directory of Open Access Journals (Sweden)

    Maria Bobrowski

    2017-07-01

    Full Text Available Developing sustainable adaptation pathways under climate change conditions in mountain regions requires accurate predictions of treeline shifts and future distribution ranges of treeline species. Here, we model for the first time the potential distribution of Betula utilis, a principal Himalayan treeline species, to provide a basis for the analysis of future range shifts. Our target species is widespread at alpine treelines, with a distribution range extending across the Himalayan mountain system. Our objective is to model the potential distribution of B. utilis in relation to current climate conditions. We generated a dataset of 590 occurrence records and used 24 variables for ecological niche modelling. We calibrated Generalized Linear Models using the Akaike Information Criterion (AIC) and evaluated model performance using threshold-independent (AUC, Area Under the Curve) and threshold-dependent (TSS, True Skill Statistics) characteristics as well as visual assessments of projected distribution maps. We found two temperature-related variables (Mean Temperature of the Wettest Quarter, Temperature Annual Range) and three precipitation-related variables (Precipitation of the Coldest Quarter, Average Precipitation of March, April and May, and Precipitation Seasonality) to be useful for predicting the potential distribution of B. utilis. All models had high predictive power (AUC ≥ 0.98 and TSS ≥ 0.89). The projected suitable area in the Himalayan mountains varies considerably, with the most extensive distribution in the western and central Himalayan region. A substantial difference between the potential and actual distribution in the eastern Himalaya points to decreasing competitiveness of B. utilis under the more oceanic conditions in the eastern part of the mountain system. A comparison between the vegetation map of Schweinfurth (1957) and our current predictions suggests that B. utilis does not reach the upper elevational limit in
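The threshold-dependent evaluation metric used above, the True Skill Statistic, is straightforward to compute from a presence/absence confusion matrix; the counts below are invented for illustration.

```python
def true_skill_statistic(tp, fp, fn, tn):
    """TSS = sensitivity + specificity - 1. Ranges from -1 to +1,
    with 0 indicating no better than random prediction."""
    sensitivity = tp / (tp + fn)   # true positive rate
    specificity = tn / (tn + fp)   # true negative rate
    return sensitivity + specificity - 1.0

# Hypothetical confusion matrix from a presence/absence evaluation.
tss = true_skill_statistic(tp=90, fp=5, fn=10, tn=95)
```

Unlike kappa, TSS is insensitive to prevalence, which is why it is a common companion to AUC in SDM evaluation.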

  19. An Improved Inventory Control Model for the Brazilian Navy Supply System

    Science.gov (United States)

    2001-12-01

    The Centro de Controle de Inventario da Marinha, the Brazilian Navy Inventory Control Point (ICP), developed an empirical model called SPAADA... [Naval Postgraduate School thesis, Monterey, California; author: Moreira; approved for public release, distribution is unlimited]

  20. Siting and sizing of distributed generators based on improved simulated annealing particle swarm optimization.

    Science.gov (United States)

    Su, Hongsheng

    2017-12-18

    Distributed power grids generally contain multiple diverse types of distributed generators (DGs). Traditional particle swarm optimization (PSO) and simulated annealing PSO (SA-PSO) algorithms have some deficiencies in the site selection and capacity determination of DGs, such as slow convergence and a tendency to fall into local optima. In this paper, an improved SA-PSO (ISA-PSO) algorithm is proposed by introducing the crossover and mutation operators of the genetic algorithm (GA) into SA-PSO, strengthening both the algorithm's global search and its local exploration. In addition, diverse DG types are modeled as four types of nodes in the power-flow calculation using the backward/forward sweep method, and reactive power sharing principles and allocation theory are applied to determine initial reactive power values and execute subsequent corrections, giving the algorithm a better starting point and speeding up convergence. Finally, a mathematical model of minimum economic cost is established for the siting and sizing of DGs under the location and capacity uncertainties of each single DG. Its objective function considers the investment and operation cost of DGs, grid loss cost, annual electricity purchase cost, and environmental pollution cost, and the constraints include power flow, bus voltage, conductor current, and DG capacity. In applications to the IEEE 33-node distribution system, the proposed method achieves better economic efficiency and a safer voltage level than traditional PSO and SA-PSO algorithms, and is a more effective planning method for the siting and sizing of DGs in distributed power grids.
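A heavily simplified sketch of the hybridization idea, assuming a standard PSO update with a GA-style random-reset mutation bolted on (the paper's full ISA-PSO also includes crossover and a simulated-annealing acceptance rule, both omitted here). The test function and all parameters are illustrative, not the paper's siting-and-sizing objective.

```python
import random

random.seed(1)

def pso_with_mutation(f, dim=2, n_particles=20, iters=200,
                      w=0.7, c1=1.5, c2=1.5, pm=0.1):
    """Minimal PSO; each coordinate mutates (random reset) with
    probability pm, helping the swarm escape local optima."""
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=f)[:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
                if random.random() < pm:   # GA-style mutation operator
                    pos[i][d] = random.uniform(-5, 5)
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pbest[i]) < f(gbest):
                    gbest = pbest[i][:]
    return gbest, f(gbest)

sphere = lambda x: sum(v * v for v in x)
best, best_val = pso_with_mutation(sphere)
```

In the paper this search would optimize DG sites and capacities against the economic-cost objective subject to power-flow constraints, rather than a toy sphere function.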

  1. Location Model for Distribution Centers for Fulfilling Electronic Orders of Fresh Foods under Uncertain Demand

    Directory of Open Access Journals (Sweden)

    Hao Zhang

    2017-01-01

    Full Text Available The problem of locating distribution centers for delivering fresh food as part of electronic commerce is a strategic decision for enterprises. This paper establishes a model for locating distribution centers that considers the uncertainty of customer demand for fresh goods in terms of time sensitivity and freshness. Based on the methodology of robust optimization for uncertain problems, this paper optimizes the location model over discrete demand probability scenarios. An improved fruit fly optimization algorithm is proposed to solve the distribution center location problem. An example shows that the proposed model and algorithm are robust and can effectively handle the complications caused by uncertain demand. The model proposed in this paper proves valuable both theoretically and practically in selecting locations for distribution centers.

  2. Testing species distribution models across space and time: high latitude butterflies and recent warming

    DEFF Research Database (Denmark)

    Eskildsen, Anne; LeRoux, Peter C.; Heikkinen, Risto K.

    2013-01-01

    Aim. To quantify whether species distribution models (SDMs) can reliably forecast species distributions under observed climate change. In particular, to test whether the predictive ability of SDMs depends on species traits or the inclusion of land cover and soil type, and whether distributional...... changes at expanding range margins can be predicted accurately. Location. Finland. Methods. Using 10-km resolution butterfly atlas data from two periods, 1992–1999 (t1) and 2002–2009 (t2), with a significant between-period temperature increase, we modelled the effects of climatic warming on butterfly...... butterfly distributions under climate change. Model performance was lower with independent compared to non-independent validation and improved when land cover and soil type variables were included, compared to climate-only models. SDMs performed less well for highly mobile species and for species with long......

  3. Improving Flood Damage Assessment Models in Italy

    Science.gov (United States)

    Amadio, M.; Mysiak, J.; Carrera, L.; Koks, E.

    2015-12-01

    The use of Stage-Damage Curve (SDC) models is prevalent in ex-ante assessments of flood risk. To assess the potential damage of a flood event, SDCs describe a relation between water depth and the associated potential economic damage over land use. This relation is normally developed and calibrated through site-specific analysis based on ex-post damage observations. In some cases (e.g. Italy) SDCs are transferred from other countries, undermining the accuracy and reliability of simulation results. Against this background, we developed a refined SDC model for Northern Italy, underpinned by damage compensation records from a recent flood event. Our analysis considers both damage to physical assets and production losses from business interruptions. While the former is calculated from land-use information, production losses are measured through the spatial distribution of Gross Value Added (GVA). An additional component of the model assesses crop-specific agricultural losses as a function of flood seasonality. Our results show that non-calibrated SDC values overestimate asset damage by up to a factor of 4.5 for the tested land-use categories. Furthermore, we estimate that production losses amount to around 6 per cent of the annual GVA. Also, maximum yield losses are less than half the amount predicted by the standard SDC methods.
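Mechanically, applying an SDC is a depth-to-damage-fraction lookup, typically by linear interpolation between calibrated points, multiplied by the exposed asset value. The curve and asset value below are invented for illustration, not the paper's calibrated values.

```python
def stage_damage(depth, curve):
    """Linearly interpolate a stage-damage curve:
    water depth (m) -> damage fraction of the asset's value."""
    pts = sorted(curve.items())
    if depth <= pts[0][0]:
        return pts[0][1]
    if depth >= pts[-1][0]:
        return pts[-1][1]
    for (d0, f0), (d1, f1) in zip(pts, pts[1:]):
        if d0 <= depth <= d1:
            return f0 + (f1 - f0) * (depth - d0) / (d1 - d0)

# Hypothetical curve for one land-use category.
curve = {0.0: 0.0, 0.5: 0.2, 1.0: 0.4, 2.0: 0.7, 4.0: 1.0}
loss = stage_damage(1.5, curve) * 250_000  # hypothetical asset value
```

Calibration against ex-post compensation records, as the paper does, amounts to adjusting the curve's anchor points so that modeled losses match observed ones.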

  4. Assigning probability distributions to input parameters of performance assessment models

    Energy Technology Data Exchange (ETDEWEB)

    Mishra, Srikanta [INTERA Inc., Austin, TX (United States)

    2002-02-01

    This study presents an overview of various approaches for assigning probability distributions to input parameters and/or future states of performance assessment models. Specifically, three broad approaches are discussed for developing input distributions: (a) fitting continuous distributions to data, (b) subjective assessment of probabilities, and (c) Bayesian updating of prior knowledge based on new information. The report begins with a summary of the nature of data and distributions, followed by a discussion of several common theoretical parametric models for characterizing distributions. Next, various techniques are presented for fitting continuous distributions to data. These include probability plotting, the method of moments, maximum likelihood estimation and nonlinear least squares analysis. The techniques are demonstrated using data from a recent performance assessment study for the Yucca Mountain project. Goodness-of-fit techniques are also discussed, followed by an overview of how distribution fitting is accomplished in commercial software packages. The issue of subjective assessment of probabilities is dealt with in terms of the maximum entropy distribution selection approach, as well as some common rules for codifying informal expert judgment. Formal expert elicitation protocols are discussed next, based primarily on the guidance provided by the US NRC. The Bayesian framework for updating prior distributions (beliefs) when new information becomes available is discussed. A simple numerical approach is presented for facilitating practical applications of the Bayes theorem. Finally, a systematic framework for assigning distributions is presented: (a) for the situation where enough data are available to define an empirical CDF or fit a parametric model to the data, and (b) to deal with the situation where only a limited amount of information is available.
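Of the fitting techniques listed, the method of moments is the simplest to sketch: for a gamma distribution, matching the sample mean and variance gives shape k = mean²/variance and scale θ = variance/mean. The data below are invented for illustration.

```python
def gamma_method_of_moments(data):
    """Method-of-moments estimates for a gamma distribution:
    shape k = mean^2 / var, scale theta = var / mean,
    so that k * theta recovers the sample mean."""
    n = len(data)
    mean = sum(data) / n
    var = sum((x - mean) ** 2 for x in data) / n
    return mean * mean / var, var / mean

# Hypothetical input-parameter samples.
data = [2.0, 3.0, 4.0, 5.0, 6.0]
k, theta = gamma_method_of_moments(data)
```

Maximum likelihood estimation for the gamma has no closed form (it requires solving a digamma equation numerically), which is why moment matching is often the first pass before an MLE refinement.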

  5. Assigning probability distributions to input parameters of performance assessment models

    International Nuclear Information System (INIS)

    Mishra, Srikanta

    2002-02-01

    This study presents an overview of various approaches for assigning probability distributions to input parameters and/or future states of performance assessment models. Specifically, three broad approaches are discussed for developing input distributions: (a) fitting continuous distributions to data, (b) subjective assessment of probabilities, and (c) Bayesian updating of prior knowledge based on new information. The report begins with a summary of the nature of data and distributions, followed by a discussion of several common theoretical parametric models for characterizing distributions. Next, various techniques are presented for fitting continuous distributions to data. These include probability plotting, the method of moments, maximum likelihood estimation and nonlinear least squares analysis. The techniques are demonstrated using data from a recent performance assessment study for the Yucca Mountain project. Goodness-of-fit techniques are also discussed, followed by an overview of how distribution fitting is accomplished in commercial software packages. The issue of subjective assessment of probabilities is dealt with in terms of the maximum entropy distribution selection approach, as well as some common rules for codifying informal expert judgment. Formal expert elicitation protocols are discussed next, based primarily on the guidance provided by the US NRC. The Bayesian framework for updating prior distributions (beliefs) when new information becomes available is discussed. A simple numerical approach is presented for facilitating practical applications of the Bayes theorem. Finally, a systematic framework for assigning distributions is presented: (a) for the situation where enough data are available to define an empirical CDF or fit a parametric model to the data, and (b) to deal with the situation where only a limited amount of information is available.

  6. Modeling the probability distribution of peak discharge for infiltrating hillslopes

    Science.gov (United States)

    Baiamonte, Giorgio; Singh, Vijay P.

    2017-07-01

Hillslope response plays a fundamental role in the prediction of peak discharge at the basin outlet. The peak discharge for the critical duration of rainfall and its probability distribution are needed for designing urban infrastructure facilities. This study derives the probability distribution, denoted as the GABS model, by coupling three models: (1) the Green-Ampt model for computing infiltration, (2) the kinematic wave model for computing the discharge hydrograph from the hillslope, and (3) the intensity-duration-frequency (IDF) model for computing design rainfall intensity. The Hortonian mechanism for runoff generation is employed for computing the surface runoff hydrograph. Since the antecedent soil moisture condition (ASMC) significantly affects the rate of infiltration, its effect on the probability distribution of peak discharge is investigated. Application to a watershed in Sicily, Italy, shows that as probability increases, the expected effect of ASMC in increasing the maximum discharge diminishes. Only for low probabilities is the critical duration of rainfall influenced by ASMC, whereas its effect on the peak discharge appears small at any probability. For a given set of parameters, the derived probability distribution of peak discharge is well fitted by the gamma distribution. Finally, two further applications were carried out: one to a small watershed, with the aim of preparing in advance the rational runoff coefficient tables used in the rational method, and a comparison between peak discharges obtained by the GABS model and those measured in an experimental flume for a loamy-sand soil.
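The gamma-fitting step can be illustrated with SciPy; the peak-discharge sample below is synthetic, not output of the actual GABS model:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical annual peak discharges (m^3/s), standing in for GABS-model output.
peaks = rng.gamma(shape=3.0, scale=20.0, size=1000)

# Fit a two-parameter gamma distribution (location fixed at zero).
a_hat, loc, scale_hat = stats.gamma.fit(peaks, floc=0)

# Design quantile for a 100-year event (annual exceedance probability 0.01).
q100 = stats.gamma.ppf(0.99, a_hat, loc=0, scale=scale_hat)
```

The fitted quantiles are the quantities a designer would read off for sizing urban drainage infrastructure.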

  7. A proposed centralised distribution model for the South African automotive component industry

    Directory of Open Access Journals (Sweden)

    Micheline J. Naude

    2009-12-01

Full Text Available Purpose: This article explores the possibility of developing a distribution model, similar to the model developed and implemented by the South African pharmaceutical industry, which could be implemented by automotive component manufacturers for supply to independent retailers. Problem Investigated: The South African automotive components distribution chain is extensive, with a number of players of varying sizes, from the larger spares distribution groups to a number of independent retailers. Distributing to the smaller independent retailers is costly for the automotive component manufacturers. Methodology: This study is based on a preliminary study of an explorative nature. Interviews were conducted with a senior staff member from a leading automotive component manufacturer in KwaZulu-Natal and nine participants at a senior management level at five of their main customers (aftermarket retailers). Findings: The findings from the empirical study suggest that the aftermarket component industry is mature, with the role players well established. The distribution chain to the independent retailer is expensive in terms of transaction and distribution costs for the automotive component manufacturer. A proposed centralised distribution model for supply to independent retailers has been developed which should reduce distribution costs for the automotive component manufacturer in terms of (1) the lowest possible freight rate; (2) timely and controlled delivery; and (3) reduced congestion at the customer's receiving dock. Originality: This research is original in that it explores the possibility of implementing a centralised distribution model for independent retailers in the automotive component industry. Furthermore, there is a dearth of published research on the South African automotive component industry, particularly addressing distribution issues. Conclusion: The distribution model as suggested is a practical one and should deliver added value to automotive component manufacturers.

  8. Stand diameter distribution modelling and prediction based on Richards function.

    Directory of Open Access Journals (Sweden)

    Ai-guo Duan

Full Text Available The objective of this study was to introduce the application of the Richards equation to modelling and prediction of stand diameter distribution. Long-term repeated measurement data sets, consisting of 309 diameter frequency distributions from Chinese fir (Cunninghamia lanceolata) plantations in southern China, were used; 150 stands served as fitting data and the other 159 stands were used for testing. The nonlinear regression method (NRM) and the maximum likelihood estimation method (MLEM) were applied to estimate the parameters of the models, and the parameter prediction method (PPM) and parameter recovery method (PRM) were used to predict the diameter distributions of unknown stands. Four main conclusions were obtained: (1) the R distribution gave a more accurate simulation than the three-parameter Weibull function; (2) the parameters p, q and r of the R distribution proved to be its scale, location and shape parameters, and have a close relationship with stand characteristics, which means the parameters of the R distribution have a good theoretical interpretation; (3) the ordinate of the inflection point of the R distribution is significantly related to its skewness and kurtosis, and the fitted main distribution range for the cumulative diameter distribution of Chinese fir plantations was 0.4∼0.6; (4) the goodness-of-fit test showed that diameter distributions of unknown stands can be well estimated by applying the R distribution based on PRM, or on the combination of PPM and PRM, when only the quadratic mean DBH, or the quadratic mean DBH plus stand age, are known; the non-rejection rates were near 80%, higher than the 72.33% non-rejection rate of the three-parameter Weibull function based on the combination of PPM and PRM.
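A minimal sketch of the NRM route, fitting a Richards-type cumulative distribution to diameter data by nonlinear regression. The parameterization in richards_cdf, with p as scale, q as location and r as shape, is an assumed form for illustration, not necessarily the exact equation used in the paper, and the stand data are synthetic:

```python
import numpy as np
from scipy.optimize import curve_fit

def richards_cdf(d, p, q, r):
    """Assumed Richards-type CDF: p = scale, q = location, r = shape."""
    return (1.0 + np.exp(-(d - q) / p)) ** (-r)

# Synthetic cumulative diameter-frequency data for a hypothetical stand.
d = np.linspace(5.0, 35.0, 30)            # DBH classes (cm)
true_cdf = richards_cdf(d, 2.5, 18.0, 1.3)
rng = np.random.default_rng(1)
obs = np.clip(true_cdf + rng.normal(0.0, 0.01, d.size), 0.0, 1.0)

# Nonlinear regression estimate of (p, q, r) from the observed frequencies.
popt, _ = curve_fit(richards_cdf, d, obs, p0=[2.0, 15.0, 1.0])
p_hat, q_hat, r_hat = popt
```

In a PPM/PRM workflow these fitted parameters would then be regressed on stand characteristics such as quadratic mean DBH and age.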

  9. Estimating the Value of Improved Distributed Photovoltaic Adoption Forecasts for Utility Resource Planning

    Energy Technology Data Exchange (ETDEWEB)

    Gagnon, Pieter [National Renewable Energy Lab. (NREL), Golden, CO (United States); Barbose, Galen L. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Stoll, Brady [National Renewable Energy Lab. (NREL), Golden, CO (United States); Ehlen, Ali [National Renewable Energy Lab. (NREL), Golden, CO (United States); Zuboy, Jarret [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mai, Trieu [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mills, Andrew D. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2018-05-15

Misforecasting the adoption of customer-owned distributed photovoltaics (DPV) can have operational and financial implications for utilities; forecasting capabilities can be improved, but generally at a cost. This paper informs this decision space by using a suite of models to explore the capacity expansion and operation of the Western Interconnection over a 15-year period across a wide range of DPV growth rates and misforecast severities. The system costs under a misforecast are compared against the costs under a perfect forecast, to quantify the costs of misforecasting. Using a simplified probabilistic method applied to these modeling results, an analyst can make a first-order estimate of the financial benefit of improving a utility’s forecasting capabilities, and thus be better informed about whether to make such an investment. For example, under our base assumptions, a utility with 10 TWh per year of retail electric sales that initially estimates that DPV growth could range from 2% to 7.5% of total generation over the next 15 years could expect total present-value savings of approximately $4 million if it could reduce the severity of misforecasting to within ±25%. Utility resource planners can compare those savings against the costs needed to achieve that level of precision, to guide their decision on whether to make an investment in tools or resources.

  10. A Discrete Model for HIV Infection with Distributed Delay

    Directory of Open Access Journals (Sweden)

    Brahim EL Boukari

    2014-01-01

    Full Text Available We give a consistent discretization of a continuous model of HIV infection, with distributed time delays to express the lag between the times when the virus enters a cell and when the cell becomes infected. The global stability of the steady states of the model is determined and numerical simulations are presented to illustrate our theoretical results.

  11. Optimal dimensioning model of water distribution systems | Gomes ...

    African Journals Online (AJOL)

    This study is aimed at developing a pipe-sizing model for a water distribution system. The optimal solution minimises the system's total cost, which comprises the hydraulic network capital cost, plus the capitalised cost of pumping energy. The developed model, called Lenhsnet, may also be used for economical design when ...

  12. Five (or so) challenges for species distribution modelling

    DEFF Research Database (Denmark)

    Bastos Araujo, Miguel; Guisan, Antoine

    2006-01-01

    Species distribution modelling is central to both fundamental and applied research in biogeography. Despite widespread use of models, there are still important conceptual ambiguities as well as biotic and algorithmic uncertainties that need to be investigated in order to increase confidence in mo...

  13. Degree distribution of a new model for evolving networks

    Indian Academy of Sciences (India)

Motivated by the features of network evolution, we introduce a new model of evolving networks, incorporating the intuitive but realistic consideration that nodes are added to the network with both preferential and random attachments. The degree distribution of the model is between a power-law and an exponential decay.
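A mixed-attachment rule of this kind can be simulated directly. The growth rule below (each new node attaches preferentially with probability p and uniformly at random otherwise) is a minimal assumed version of such a model, not the paper's exact formulation:

```python
import numpy as np

def grow_network(n, p, seed=0):
    """Grow a network node by node; each new node links to one existing
    node, chosen proportionally to degree with probability p and
    uniformly at random with probability 1 - p."""
    rng = np.random.default_rng(seed)
    degree = np.zeros(n, dtype=int)
    degree[0] = degree[1] = 1          # seed graph: a single edge
    stubs = [0, 1]                     # node list weighted by degree
    for new in range(2, n):
        if rng.random() < p:
            target = stubs[rng.integers(len(stubs))]   # preferential
        else:
            target = int(rng.integers(new))            # uniform random
        degree[target] += 1
        degree[new] = 1
        stubs.extend([target, new])
    return degree

deg = grow_network(20000, p=0.5)
```

Plotting a histogram of `deg` on semi-log and log-log axes shows a tail intermediate between exponential and power-law, consistent with the abstract's claim.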

  14. Diffusion approximation for modeling of 3-D radiation distributions

    International Nuclear Information System (INIS)

    Zardecki, A.; Gerstl, S.A.W.; De Kinder, R.E. Jr.

    1985-01-01

A three-dimensional transport code, DIF3D, based on the diffusion approximation, is used to model the spatial distribution of radiation energy arising from volumetric isotropic sources. Future work will be concerned with the determination of irradiances and the modeling of realistic scenarios relevant to battlefield conditions. 8 refs., 4 figs

  15. Occam factors and model independent Bayesian learning of continuous distributions

    International Nuclear Information System (INIS)

    Nemenman, Ilya; Bialek, William

    2002-01-01

    Learning of a smooth but nonparametric probability density can be regularized using methods of quantum field theory. We implement a field theoretic prior numerically, test its efficacy, and show that the data and the phase space factors arising from the integration over the model space determine the free parameter of the theory ('smoothness scale') self-consistently. This persists even for distributions that are atypical in the prior and is a step towards a model independent theory for learning continuous distributions. Finally, we point out that a wrong parametrization of a model family may sometimes be advantageous for small data sets

  16. Probability Distribution and Deviation Information Fusion Driven Support Vector Regression Model and Its Application

    Directory of Open Access Journals (Sweden)

    Changhao Fan

    2017-01-01

Full Text Available In modeling, only information from the deviation between the output of the support vector regression (SVR) model and the training sample is considered, whereas other prior information about the training sample, such as probability distribution information, is ignored. Probability distribution information describes the overall distribution of sample data in a training sample that contains different degrees of noise and potential outliers, and it can help develop a high-accuracy model. To mine and use the probability distribution information of a training sample, a new support vector regression model that incorporates probability distribution information, weighted SVR (PDISVR), is proposed. In the PDISVR model, the probability distribution of each sample is considered as a weight and is then introduced into the error coefficient and slack variables of SVR. Thus, both the deviation and the probability distribution information of the training sample are used in the PDISVR model to eliminate the influence of noise and outliers in the training sample and to improve predictive performance. Furthermore, examples with different degrees of noise were employed to demonstrate the performance of PDISVR, which was then compared with three SVR-based methods. The results showed that PDISVR performs better than the three other methods.
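The idea of down-weighting atypical samples in SVR can be approximated with scikit-learn's `sample_weight` argument. The inverse-residual weights below are a hypothetical stand-in for the paper's probability-distribution weights, not the PDISVR formulation itself:

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(2)
X = np.linspace(0.0, 4.0, 200).reshape(-1, 1)
y = np.sin(X).ravel() + rng.normal(0.0, 0.05, 200)
y[::25] += 2.0                       # inject gross outliers

# Hypothetical weights: samples far from the bulk of the data get small weight.
resid = np.abs(y - np.median(y))
weights = 1.0 / (1.0 + resid ** 2)

svr = SVR(kernel="rbf", C=10.0, epsilon=0.05)
svr.fit(X, y, sample_weight=weights)
pred = svr.predict(X)

mask = np.ones(200, dtype=bool)
mask[::25] = False                   # evaluate on the clean samples only
rmse_clean = float(np.sqrt(np.mean((pred[mask] - np.sin(X.ravel()[mask])) ** 2)))
```

The weighting effectively scales the error penalty per sample, which is the same lever PDISVR pulls via the error coefficient and slack variables.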

  17. Smoluchowski coagulation models of sea ice thickness distribution dynamics

    Science.gov (United States)

    Godlovitch, D.; Illner, R.; Monahan, A.

    2011-12-01

    Sea ice thickness distributions display a ubiquitous exponential decrease with thickness. This tail characterizes the range of ice thickness produced by mechanical redistribution of ice through the process of ridging, rafting, and shearing. We investigate how well the thickness distribution can be simulated by representing mechanical redistribution as a generalized stacking process. Such processes are naturally described by a well-studied class of models known as Smoluchowski Coagulation Models (SCMs), which describe the dynamics of a population of fixed-mass "particles" which combine in pairs to form a "particle" with the combined mass of the constituent pair at a rate which depends on the mass of the interacting particles. Like observed sea ice thickness distributions, the mass distribution of the populations generated by SCMs has an exponential or quasi-exponential form. We use SCMs to model sea ice, identifying mass-increasing particle combinations with thickness-increasing ice redistribution processes. Our model couples an SCM component with a thermodynamic component and generates qualitatively accurate thickness distributions with a variety of rate kernels. Our results suggest that the exponential tail of the sea ice thickness distribution arises from the nature of the ridging process, rather than specific physical properties of sea ice or the spatial arrangement of floes, and that the relative strengths of the dynamic and thermodynamic processes are key in accurately simulating the rate at which the sea ice thickness tail drops off with thickness.
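A stacking (coagulation) process with a constant kernel can be simulated directly to watch the quasi-exponential thickness tail emerge. This is a minimal sketch without the thermodynamic component of the paper's coupled model, and the particle counts are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(3)
sizes = [1.0] * 50000                 # monodisperse initial floes (unit thickness)

for _ in range(30000):
    # Constant kernel: pick two distinct floes uniformly at random.
    i = int(rng.integers(len(sizes)))
    j = int(rng.integers(len(sizes) - 1))
    if j >= i:
        j += 1
    sizes[i] += sizes[j]              # "ridging": thicknesses add on contact
    sizes[j] = sizes[-1]              # swap-remove the consumed floe
    sizes.pop()

# Thickness histogram; for the constant kernel the counts decay geometrically,
# i.e. the tail is exponential in thickness.
counts = np.bincount(np.asarray(sizes, dtype=int))
```

Plotting `counts` on a semi-log axis gives a near-straight line, illustrating the abstract's point that the exponential tail follows from the stacking process itself.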

  18. Improved modeling techniques for turbomachinery flow fields

    Energy Technology Data Exchange (ETDEWEB)

    Lakshminarayana, B. [Pennsylvania State Univ., University Park, PA (United States); Fagan, J.R. Jr. [Allison Engine Company, Indianapolis, IN (United States)

    1995-10-01

    This program has the objective of developing an improved methodology for modeling turbomachinery flow fields, including the prediction of losses and efficiency. Specifically, the program addresses the treatment of the mixing stress tensor terms attributed to deterministic flow field mechanisms required in steady-state Computational Fluid Dynamic (CFD) models for turbo-machinery flow fields. These mixing stress tensors arise due to spatial and temporal fluctuations (in an absolute frame of reference) caused by rotor-stator interaction due to various blade rows and by blade-to-blade variation of flow properties. These tasks include the acquisition of previously unavailable experimental data in a high-speed turbomachinery environment, the use of advanced techniques to analyze the data, and the development of a methodology to treat the deterministic component of the mixing stress tensor. Penn State will lead the effort to make direct measurements of the momentum and thermal mixing stress tensors in high-speed multistage compressor flow field in the turbomachinery laboratory at Penn State. They will also process the data by both conventional and conditional spectrum analysis to derive momentum and thermal mixing stress tensors due to blade-to-blade periodic and aperiodic components, revolution periodic and aperiodic components arising from various blade rows and non-deterministic (which includes random components) correlations. The modeling results from this program will be publicly available and generally applicable to steady-state Navier-Stokes solvers used for turbomachinery component (compressor or turbine) flow field predictions. These models will lead to improved methodology, including loss and efficiency prediction, for the design of high-efficiency turbomachinery and drastically reduce the time required for the design and development cycle of turbomachinery.

  19. Improving carbon model phenology using data assimilation

    Science.gov (United States)

Exbrayat, Jean-François; Smallman, T. Luke; Bloom, A. Anthony; Williams, Mathew

    2015-04-01

    drivers. DALEC2-GSI showed a more realistic response to climate variability and fire disturbance than DALEC2. DALEC2-GSI more accurately reproduced the assimilated global LAI time series, particularly in areas with high levels of disturbance. This result is supported by more ecologically consistent trait combinations generated by the DALEC2-GSI calibration. In addition, using DALEC2-GSI we are able to map global information on ecosystem traits such as drought tolerance and adaptation to repeated fire disturbance. This demonstrates that utilizing data assimilation provides a useful means of improving the representation of processes within models.

  20. Improved water density feedback model for pressurized water reactors

    International Nuclear Information System (INIS)

    Casadei, A.L.

    1976-01-01

An improved water density feedback model has been developed for neutron diffusion calculations of PWR cores. This work addresses spectral effects on few-group cross sections due to water density changes, and water density predictions considering open channel and subcooled boiling effects. A homogenized spectral model was also derived using the unit assembly diffusion method for employment in a coarse-mesh 3D diffusion computer program. The spectral and water density evaluation models described were incorporated in a 3D diffusion code, and neutronic calculations for a typical PWR were completed for both nominal and accident conditions. Comparison of neutronic calculations employing the open versus the closed channel model for accident conditions indicates that significant safety margin increases can be obtained if subcooled boiling and open channel effects are considered in accident calculations. This is attributed to effects on both core reactivity and power distribution, which result in increased margin to fuel degradation limits. For nominal operating conditions, negligible differences in core reactivity and power distribution exist, since flow redistribution and subcooled voids are not significant at such conditions. The results serve to confirm the conservatism of currently employed closed channel feedback methods in accident analysis, and indicate that the model developed in this work can contribute to show increased safety margins for certain accidents

  1. Improving Marine Ecosystem Models with Biochemical Tracers

    Science.gov (United States)

    Pethybridge, Heidi R.; Choy, C. Anela; Polovina, Jeffrey J.; Fulton, Elizabeth A.

    2018-01-01

    Empirical data on food web dynamics and predator-prey interactions underpin ecosystem models, which are increasingly used to support strategic management of marine resources. These data have traditionally derived from stomach content analysis, but new and complementary forms of ecological data are increasingly available from biochemical tracer techniques. Extensive opportunities exist to improve the empirical robustness of ecosystem models through the incorporation of biochemical tracer data and derived indices, an area that is rapidly expanding because of advances in analytical developments and sophisticated statistical techniques. Here, we explore the trophic information required by ecosystem model frameworks (species, individual, and size based) and match them to the most commonly used biochemical tracers (bulk tissue and compound-specific stable isotopes, fatty acids, and trace elements). Key quantitative parameters derived from biochemical tracers include estimates of diet composition, niche width, and trophic position. Biochemical tracers also provide powerful insight into the spatial and temporal variability of food web structure and the characterization of dominant basal and microbial food web groups. A major challenge in incorporating biochemical tracer data into ecosystem models is scale and data type mismatches, which can be overcome with greater knowledge exchange and numerical approaches that transform, integrate, and visualize data.

  2. Distributing Correlation Coefficients of Linear Structure-Activity/Property Models

    Directory of Open Access Journals (Sweden)

    Sorana D. BOLBOACA

    2011-12-01

Full Text Available Quantitative structure-activity/property relationships are mathematical relationships linking chemical structure and activity/property in a quantitative manner. These in silico approaches are frequently used to reduce animal testing in risk assessment, as well as to increase time- and cost-effectiveness in the characterization and identification of active compounds. The aim of our study was to investigate the pattern of the distribution of correlation coefficients associated with simple linear relationships linking the compounds' structure with their activities. A set of the most common ordnance compounds found at naval facilities, with a limited data set covering a range of toxicities on the aquatic ecosystem and a set of seven properties, was studied. Statistically significant models were selected and investigated. The probability density function of the correlation coefficients was investigated using a series of possible continuous distribution laws. Almost 48% of the correlation coefficients proved to fit the Beta distribution, 40% the Generalized Pareto distribution, and 12% the Pert distribution.

  3. Kaon quark distribution functions in the chiral constituent quark model

    Science.gov (United States)

    Watanabe, Akira; Sawada, Takahiro; Kao, Chung Wen

    2018-04-01

We investigate the valence u and s̄ quark distribution functions of the K+ meson, v_K^(u)(x, Q^2) and v_K^(s̄)(x, Q^2), in the framework of the chiral constituent quark model. We judiciously choose the bare distributions at the initial scale to generate the dressed distributions at the higher scale, considering the meson cloud effects and the QCD evolution, which agree with the phenomenologically satisfactory valence quark distribution of the pion and the experimental data for the ratio v_K^(u)(x, Q^2)/v_π^(u)(x, Q^2). We show in detail how the meson cloud effects affect the bare distribution functions. We find a smaller SU(3) flavor symmetry breaking effect compared with the results of preceding studies based on other approaches.

  4. Bilinear reduced order approximate model of parabolic distributed solar collectors

    KAUST Repository

    Elmetennani, Shahrazed

    2015-07-01

This paper proposes a novel, low-dimensional and accurate approximate model for the distributed parabolic solar collector, by means of a modified Gaussian interpolation along the spatial domain. The proposed reduced model, taking the form of a low-dimensional bilinear state representation, enables the reproduction of the heat transfer dynamics along the collector tube for system analysis. Moreover, because it is presented as a reduced-order bilinear state space model, the well-established control theory for this class of systems can be applied. The approximation efficiency has been proven by several simulation tests, performed with parameters of the Acurex field under real external working conditions. Model accuracy has been evaluated by comparison with the analytical solution of the hyperbolic distributed model and its semi-discretized approximation, highlighting the benefits of the proposed numerical scheme. Furthermore, model sensitivity to the different parameters of the Gaussian interpolation has been studied.

  5. On Improving 4-km Mesoscale Model Simulations

    Science.gov (United States)

    Deng, Aijun; Stauffer, David R.

    2006-03-01

A previous study showed that use of analysis-nudging four-dimensional data assimilation (FDDA) and improved physics in the fifth-generation Pennsylvania State University National Center for Atmospheric Research Mesoscale Model (MM5) produced the best overall performance on a 12-km-domain simulation, based on the 18-19 September 1983 Cross-Appalachian Tracer Experiment (CAPTEX) case. However, reducing the simulated grid length to 4 km had detrimental effects. The primary cause was likely the explicit representation of convection accompanying a cold-frontal system. Because no convective parameterization scheme (CPS) was used, the convective updrafts were forced on coarser-than-realistic scales, and the rainfall and the atmospheric response to the convection were too strong. The evaporative cooling and downdrafts were too vigorous, causing widespread disruption of the low-level winds and spurious advection of the simulated tracer. In this study, a series of experiments was designed to address this general problem involving 4-km model precipitation and gridpoint storms and associated model sensitivities to the use of FDDA, planetary boundary layer (PBL) turbulence physics, grid-explicit microphysics, a CPS, and enhanced horizontal diffusion. Some of the conclusions include the following: 1) Enhanced parameterized vertical mixing in the turbulent kinetic energy (TKE) turbulence scheme has shown marked improvements in the simulated fields. 2) Use of a CPS on the 4-km grid improved the precipitation and low-level wind results. 3) Use of the Hong and Pan Medium-Range Forecast PBL scheme showed larger model errors within the PBL and a clear tendency to predict much deeper PBL heights than the TKE scheme. 4) Combining observation-nudging FDDA with a CPS produced the best overall simulations. 5) Finer horizontal resolution does not always produce better simulations, especially in convectively unstable environments, and a new CPS suitable for 4-km resolution is needed. 6

  6. Actors: A Model of Concurrent Computation in Distributed Systems.

    Science.gov (United States)

    1985-06-01

AD-A157 917: Actors: A Model of Concurrent Computation in Distributed Systems. Gul A. Agha, MIT Artificial Intelligence Laboratory. Support for the laboratory's artificial intelligence research is ... This document has been approved for public release and sale; its distribution is unlimited.

  7. Modelling Framework and the Quantitative Analysis of Distributed Energy Resources in Future Distribution Networks

    DEFF Research Database (Denmark)

    Han, Xue; Sandels, Claes; Zhu, Kun

    2013-01-01

There has been a large body of statements claiming that the large-scale deployment of Distributed Energy Resources (DERs) could eventually reshape the future distribution grid operation in numerous ways. Thus, it is necessary to introduce a framework to measure to what extent the power system is affected by DERs, comprising distributed generation, active demand and electric vehicles. Subsequently, quantitative analysis was made on the basis of the current and envisioned DER deployment scenarios proposed for Sweden. Simulations are performed in two typical distribution network models for four seasons. The simulation results show that, in general, the DER deployment brings in the possibility of reducing power losses and voltage drops by compensating power from the local generation and optimizing the local load profiles.

  8. Improvement in the distribution of services in multi-agent systems with SCODA

    Directory of Open Access Journals (Sweden)

    Jesús Ángel ROMÁN GALLEGO

    2016-06-01

Full Text Available The distribution of services in multi-agent systems reduces the computational load on the agents. The functionality of the system does not reside in the agents themselves; rather, it is ubiquitously distributed, which allows tasks to be performed in parallel without imposing additional computational cost on the elements of the system. The service distribution offered by SCODA (Distributed and Specialized Agent Communities) enables intelligent management of the services provided by the system's agents and the parallel execution of threads that respond to requests asynchronously, which improves system performance both at the computational level and in the quality of service achieved in controlling these services. The comparison carried out in the case study presented in this paper demonstrates the improvement that SCODA-based systems bring to the distribution of services.

  9. Reservoir theory, groundwater transit time distributions, and lumped parameter models

    International Nuclear Information System (INIS)

    Etcheverry, D.; Perrochet, P.

    1999-01-01

The relation between groundwater residence times and transit times is given by the reservoir theory. It allows theoretical transit time distributions to be calculated in a deterministic way, analytically or on numerical models. Two analytical solutions validate the piston-flow and exponential models for simple conceptual flow systems. A numerical solution of a hypothetical regional groundwater flow shows that lumped parameter models could be applied in some cases to large-scale, heterogeneous aquifers. (author)
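The exponential lumped-parameter model mentioned above can be written down in a few lines; T is an assumed mean transit time, and the pulse-response check illustrates that the outlet signal is the convolution of the input with the transit time distribution:

```python
import numpy as np

T = 10.0                             # assumed mean transit time (years)
dt = 0.1
t = np.arange(0.0, 100.0, dt)
g = np.exp(-t / T) / T               # exponential-model transit time distribution

area = float(g.sum() * dt)           # ~1: g is a probability density
mean_tt = float((t * g).sum() * dt)  # recovers the turnover time T

# Outlet response to a unit tracer pulse at t = 0 is the distribution itself.
c_in = np.zeros_like(t)
c_in[0] = 1.0 / dt                   # discrete delta pulse
c_out = np.convolve(c_in, g)[: t.size] * dt
```

For a real tracer record, `c_in` would be the measured input time series and T would be estimated by matching `c_out` to observed outlet concentrations.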

  10. Quantification Model for Estimating Temperature Field Distributions of Apple Fruit

    OpenAIRE

    Zhang , Min; Yang , Le; Zhao , Huizhong; Zhang , Leijie; Zhong , Zhiyou; Liu , Yanling; Chen , Jianhua

    2009-01-01

A quantification model of transient heat conduction was provided to simulate apple fruit temperature distribution in the cooling process. The model was based on the energy variation of apple fruit at different points. It took into account heat exchange of the representative elemental volume, metabolic heat and external heat. The following conclusions could be obtained: first, the quantification model can satisfactorily describe the tendency of the apple fruit temperature distribution...

  11. Distributional Language Learning: Mechanisms and Models of Category Formation.

    Science.gov (United States)

    Aslin, Richard N; Newport, Elissa L

    2014-09-01

    In the past 15 years, a substantial body of evidence has confirmed that a powerful distributional learning mechanism is present in infants, children, adults and (at least to some degree) in nonhuman animals as well. The present article briefly reviews this literature and then examines some of the fundamental questions that must be addressed for any distributional learning mechanism to operate effectively within the linguistic domain. In particular, how does a naive learner determine the number of categories that are present in a corpus of linguistic input and what distributional cues enable the learner to assign individual lexical items to those categories? Contrary to the hypothesis that distributional learning and category (or rule) learning are separate mechanisms, the present article argues that these two seemingly different processes---acquiring specific structure from linguistic input and generalizing beyond that input to novel exemplars---actually represent a single mechanism. Evidence in support of this single-mechanism hypothesis comes from a series of artificial grammar-learning studies that not only demonstrate that adults can learn grammatical categories from distributional information alone, but that the specific patterning of distributional information among attested utterances in the learning corpus enables adults to generalize to novel utterances or to restrict generalization when unattested utterances are consistently absent from the learning corpus. Finally, a computational model of distributional learning that accounts for the presence or absence of generalization is reviewed and the implications of this model for linguistic-category learning are summarized.

  12. Using the Weibull distribution: reliability, modeling and inference

    CERN Document Server

    McCool, John I

    2012-01-01

    Understand and utilize the latest developments in Weibull inferential methods While the Weibull distribution is widely used in science and engineering, most engineers do not have the necessary statistical training to implement the methodology effectively. Using the Weibull Distribution: Reliability, Modeling, and Inference fills a gap in the current literature on the topic, introducing a self-contained presentation of the probabilistic basis for the methodology while providing powerful techniques for extracting information from data. The author explains the use of the Weibull distribution
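    The inferential machinery the book covers rests on maximum-likelihood fitting of the Weibull shape and scale parameters. A minimal sketch of that fit, assuming the standard profile-likelihood equation for the shape parameter; the synthetic data, bracketing interval and tolerance below are illustrative, not from the record:

```python
import math
import random

def weibull_mle(x, k_lo=0.05, k_hi=50.0, tol=1e-8):
    """Maximum-likelihood estimates (shape k, scale lam) for a Weibull sample.

    The shape parameter is the root of the standard profile-likelihood
    equation, found here by bisection; the scale then follows in closed form.
    """
    logs = [math.log(v) for v in x]
    mean_log = sum(logs) / len(x)

    def g(k):
        xk = [v ** k for v in x]
        return (sum(v * l for v, l in zip(xk, logs)) / sum(xk)
                - 1.0 / k - mean_log)

    lo, hi = k_lo, k_hi
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if g(lo) * g(mid) <= 0.0:
            hi = mid
        else:
            lo = mid
    k = 0.5 * (lo + hi)
    lam = (sum(v ** k for v in x) / len(x)) ** (1.0 / k)
    return k, lam

# Synthetic lifetimes: random.weibullvariate(alpha, beta) takes the
# scale (alpha) first and the shape (beta) second.
random.seed(1)
sample = [random.weibullvariate(2.0, 1.5) for _ in range(5000)]
k_hat, lam_hat = weibull_mle(sample)
```

    With 5000 draws the estimates land close to the true shape 1.5 and scale 2.0.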

  13. Shell model test of the Porter-Thomas distribution

    International Nuclear Information System (INIS)

    Grimes, S.M.; Bloom, S.D.

    1981-01-01

    Eigenvectors have been calculated for the A=18, 19, 20, 21, and 26 nuclei in an sd shell basis. The decomposition of these states into their shell model components shows, in agreement with other recent work, that this distribution is not a single Gaussian. We find that the largest amplitudes are distributed approximately in a Gaussian fashion. Thus, many experimental measurements should be consistent with the Porter-Thomas predictions. We argue that the non-Gaussian form of the complete distribution can be simply related to the structure of the Hamiltonian
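    The Porter-Thomas prediction discussed here is that Gaussian-distributed amplitudes imply normalized widths (squared amplitudes) following a chi-square distribution with one degree of freedom, which has mean 1 and variance 2. A quick numerical check of that relationship, using synthetic Gaussian amplitudes rather than shell-model eigenvectors:

```python
import random

random.seed(0)

# If amplitudes a are Gaussian (mean 0, variance 1), the normalized
# widths y = a**2 follow the Porter-Thomas distribution, i.e. chi-square
# with one degree of freedom: mean 1, variance 2.
amplitudes = [random.gauss(0.0, 1.0) for _ in range(200_000)]
widths = [a * a for a in amplitudes]

mean_w = sum(widths) / len(widths)
var_w = sum((w - mean_w) ** 2 for w in widths) / len(widths)
```

    The sample mean and variance of the widths converge to 1 and 2 respectively, which is the signature an experiment consistent with Porter-Thomas would show.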

  14. Distributed model based control of multi unit evaporation systems

    International Nuclear Information System (INIS)

    Yudi Samyudia

    2006-01-01

    In this paper, we present a new approach to the analysis and design of distributed control systems for multi-unit plants. The approach is established after treating the effect of recycled dynamics as a gap metric uncertainty from which a distributed controller can be designed sequentially for each unit to tackle the uncertainty. We then use a single effect multi-unit evaporation system to illustrate how the proposed method is used to analyze different control strategies and to systematically achieve a better closed-loop performance using a distributed model-based controller

  15. Transversity quark distributions in a covariant quark-diquark model

    Energy Technology Data Exchange (ETDEWEB)

    Cloet, I.C. [Physics Division, Argonne National Laboratory, Argonne, IL 60439-4843 (United States)], E-mail: icloet@anl.gov; Bentz, W. [Department of Physics, School of Science, Tokai University, Hiratsuka-shi, Kanagawa 259-1292 (Japan)], E-mail: bentz@keyaki.cc.u-tokai.ac.jp; Thomas, A.W. [Jefferson Lab, 12000 Jefferson Avenue, Newport News, VA 23606 (United States); College of William and Mary, Williamsburg, VA 23187 (United States)], E-mail: awthomas@jlab.org

    2008-01-17

    Transversity quark light-cone momentum distributions are calculated for the nucleon. We utilize a modified Nambu-Jona-Lasinio model in which confinement is simulated by eliminating unphysical thresholds for nucleon decay into quarks. The nucleon bound state is obtained by solving the relativistic Faddeev equation in the quark-diquark approximation, where both scalar and axial-vector diquark channels are included. Particular attention is paid to comparing our results with the recent experimental extraction of the transversity distributions by Anselmino et al. We also compare our transversity results with earlier spin-independent and helicity quark distributions calculated in the same approach.

  16. Analysis and Comparison of Typical Models within Distribution Network Design

    DEFF Research Database (Denmark)

    Jørgensen, Hans Jacob; Larsen, Allan; Madsen, Oli B.G.

    Efficient and cost effective transportation and logistics plays a vital role in the supply chains of the modern world’s manufacturers. Global distribution of goods is a very complicated matter as it involves many different distinct planning problems. The focus of this presentation is to demonstrate a number of important issues which have been identified when addressing the Distribution Network Design problem from a modelling angle. More specifically, we present an analysis of the research which has been performed in utilizing operational research in developing and optimising distribution systems.

  17. Milk distribution and feeding practice data for the PATHWAY model

    International Nuclear Information System (INIS)

    Ward, G.M.; Whicker, F.W.

    1990-01-01

    Milk is a major source for ingestion of several radionuclides, particularly when it is produced from fresh forage. Estimation of radionuclide ingestion via milk from Nevada Test Site fallout events in the 1950s required information on the sources of milk, feeding practices for cows, and elapsed time between milking and consumption for various geographic areas. These data were essential input to the food-chain model, PATHWAY. A data base was compiled from personal interviews. Milk sources included private cows, local dairies, and regional plants that collected from and distributed to wide geographic areas. Estimates of the contribution of each source were made for communities in a nine-state area. Pasture seasons varied from 3 to 6 mo. Pasture use varied from zero to 80% of the cows' diet. Pasture use declined during the 1950s, as did the number of family cows and local dairy plants. Regional distributors captured a larger portion of the market, and improved technologies increased the shelf life of milk. These factors tended to reduce the human intake of fallout radionuclides from milk in the latter part of the 1950s

  18. Confronting species distribution model predictions with species functional traits.

    Science.gov (United States)

    Wittmann, Marion E; Barnes, Matthew A; Jerde, Christopher L; Jones, Lisa A; Lodge, David M

    2016-02-01

    Species distribution models are valuable tools in studies of biogeography, ecology, and climate change and have been used to inform conservation and ecosystem management. However, species distribution models typically incorporate only climatic variables and species presence data. Model development or validation rarely considers functional components of species traits or other types of biological data. We implemented a species distribution model (Maxent) to predict global climate habitat suitability for Grass Carp (Ctenopharyngodon idella). We then tested the relationship between the degree of climate habitat suitability predicted by Maxent and the individual growth rates of both wild (N = 17) and stocked (N = 51) Grass Carp populations using correlation analysis. The Grass Carp Maxent model accurately reflected the global occurrence data (AUC = 0.904). Observations of Grass Carp growth rate covered six continents and ranged from 0.19 to 20.1 g day⁻¹. Species distribution model predictions were correlated (r = 0.5, 95% CI (0.03, 0.79)) with observed growth rates for wild Grass Carp populations but were not correlated (r = -0.26, 95% CI (-0.5, 0.012)) with stocked populations. Further, a review of the literature indicates that the few studies for other species that have previously assessed the relationship between the degree of predicted climate habitat suitability and species functional traits have also discovered significant relationships. Thus, species distribution models may provide inferences beyond just where a species may occur, providing a useful tool to understand the linkage between species distributions and underlying biological mechanisms.
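    The correlation analysis reported above (a Pearson r with a 95% confidence interval) is conventionally computed via the Fisher z-transform. A sketch of that calculation; the suitability scores and growth rates below are hypothetical stand-ins, not the study's data:

```python
import math

def pearson_ci(x, y, z_crit=1.959964):
    """Pearson correlation with an approximate 95% CI (Fisher z-transform)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    r = sxy / math.sqrt(sxx * syy)
    z = math.atanh(r)                  # Fisher transform stabilizes variance
    se = 1.0 / math.sqrt(n - 3)
    lo, hi = math.tanh(z - z_crit * se), math.tanh(z + z_crit * se)
    return r, lo, hi

# Hypothetical suitability scores and growth rates (not the study's data).
suit = [0.10, 0.20, 0.30, 0.35, 0.40, 0.50, 0.55, 0.60, 0.70, 0.80, 0.85, 0.90]
growth = [0.5, 1.0, 2.0, 1.5, 3.0, 2.5, 4.0, 3.5, 5.0, 4.5, 6.0, 5.5]
r, lo, hi = pearson_ci(suit, growth)
```

    An interval that excludes zero, as for the wild populations in the study, is what licenses the claim of a significant relationship.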

  19. Automatic generation of 3D statistical shape models with optimal landmark distributions.

    Science.gov (United States)

    Heimann, T; Wolf, I; Meinzer, H-P

    2007-01-01

    The purpose of this work is to point out the problem of non-uniform landmark placement in statistical shape modeling, to present an improved method for generating landmarks in the 3D case, and to propose an unbiased evaluation metric for determining model quality. Our approach minimizes a cost function based on the minimum description length (MDL) of the shape model to optimize landmark correspondences over the training set. In addition to the standard technique, we employ an extended remeshing method to change the landmark distribution without losing correspondences, thus ensuring a uniform distribution over all training samples. To break the dependency of the established evaluation measures (generalization and specificity) on the landmark distribution, we change the internal metric from landmark distance to volumetric overlap. Redistributing landmarks to an equally spaced distribution during the model construction phase improves the quality of the resulting models significantly if the shapes feature prominent bulges or other complex geometry. The distribution of landmarks on the training shapes is, beyond the correspondence issue, a crucial point in model construction.
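    A volumetric-overlap metric of the kind proposed here is commonly computed as a Dice coefficient over binary voxel sets, which depends only on the segmented volumes and not on where landmarks sit. A small sketch with two synthetic voxel cubes (the shapes are illustrative):

```python
def dice_overlap(vox_a, vox_b):
    """Dice volumetric overlap between two binary voxel sets (1 = identical)."""
    a, b = set(vox_a), set(vox_b)
    return 2.0 * len(a & b) / (len(a) + len(b))

# Two synthetic 10x10x10 voxel cubes shifted by 5 voxels along x:
# each has 1000 voxels and they share 500, so the Dice overlap is 0.5.
shape_a = {(x, y, z) for x in range(10) for y in range(10) for z in range(10)}
shape_b = {(x, y, z) for x in range(5, 15) for y in range(10) for z in range(10)}
d = dice_overlap(shape_a, shape_b)
```
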

  20. A distributed dynamic model of a monolith hydrogen membrane reactor

    International Nuclear Information System (INIS)

    Michelsen, Finn Are; Wilhelmsen, Øivind; Zhao, Lei; Aasen, Knut Ingvar

    2013-01-01

    Highlights: ► A rigorous distributed dynamic model of an HMR unit is developed. ► The model includes enough complexity for steady-state and dynamic analysis. ► Simulations show that the model is non-linear within the normal operating range. ► The model is useful for studying and handling disturbances such as inlet changes and membrane leakage. - Abstract: This paper describes a distributed mechanistic dynamic model of a hydrogen membrane reformer (HMR) unit used for methane steam reforming. The model is based on a square-channel monolith structure concept, in which air flows adjacent to a mix of natural gas and water distributed in a chessboard pattern of channels. Combustion of hydrogen supplies energy to the endothermic steam reforming reactions. The model is used for both steady-state and dynamic analyses. It therefore needs to be computationally attractive, yet still include enough complexity to capture the important steady-state and dynamic features of the process. Steady-state analysis of the model yields optima for the steam-to-carbon and steam-to-oxygen ratios; at the nominal optimum, the conversion of methane is 92% and the fraction of hydrogen used as energy for the endothermic reactions is 28%. The dynamic analysis shows that non-linear control schemes may be necessary for satisfactory control performance.

  1. Joint Distributed Surf Zone Environmental Model: FY96 Modeling Procedure

    National Research Council Canada - National Science Library

    Allard, Richard

    1997-01-01

    ... to the modeling and simulation community. To test this proof of concept, a suite of models was identified and tested for Camp Pendleton, CA, during two 7-day periods in January and August 1995, in which data from the Coupled Ocean...

  2. Uncertainty assessment of integrated distributed hydrological models using GLUE with Markov chain Monte Carlo sampling

    DEFF Research Database (Denmark)

    Blasone, Roberta-Serena; Madsen, Henrik; Rosbjerg, Dan

    2008-01-01

    In recent years, there has been an increase in the application of distributed, physically-based and integrated hydrological models. Many questions remain regarding how to properly calibrate and validate distributed models and assess the uncertainty of the estimated parameters and the spatially distributed outputs; multi-site validation must complement the usual time validation. In this study, we develop, through an application, a comprehensive framework for multi-criteria calibration and uncertainty assessment of distributed, physically-based, integrated hydrological models. A revised version of the generalized likelihood uncertainty estimation (GLUE) procedure based on Markov chain Monte Carlo sampling is applied in order to improve the performance of the methodology in estimating parameters and posterior output distributions. The description of the spatial variations of the hydrological processes is accounted for by defining...
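    The GLUE idea can be sketched without the Markov chain machinery: sample parameter sets, retain the "behavioural" runs whose informal likelihood (here the Nash-Sutcliffe efficiency) exceeds a threshold, and weight them by likelihood to form posterior estimates. The one-parameter runoff model, the synthetic data and the 0.5 threshold below are illustrative assumptions, not the study's setup:

```python
import random

def nse(sim, obs):
    """Nash-Sutcliffe efficiency: 1 is perfect, <= 0 is no better than the mean."""
    mean_obs = sum(obs) / len(obs)
    num = sum((s - o) ** 2 for s, o in zip(sim, obs))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den

def toy_model(a, rain):
    # Hypothetical one-parameter model: a single linear runoff coefficient.
    return [a * r for r in rain]

random.seed(42)
rain = [random.uniform(0.0, 10.0) for _ in range(100)]
obs = [0.6 * r + random.gauss(0.0, 0.3) for r in rain]   # synthetic observations

# GLUE: Monte Carlo sampling, behavioural threshold, likelihood weighting.
samples = [(a, nse(toy_model(a, rain), obs))
           for a in (random.uniform(0.0, 1.0) for _ in range(5000))]
behavioural = [(a, l) for a, l in samples if l > 0.5]    # behavioural runs only
total = sum(l for _, l in behavioural)
posterior_mean = sum(a * l for a, l in behavioural) / total
```

    The likelihood-weighted posterior mean recovers the true coefficient (0.6 here); the revised procedure in the record replaces the uniform sampling with Markov chain Monte Carlo to concentrate samples in the behavioural region.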

  3. Model documentation: Natural gas transmission and distribution model of the National Energy Modeling System. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-02-17

    The Natural Gas Transmission and Distribution Model (NGTDM) is the component of the National Energy Modeling System (NEMS) that is used to represent the domestic natural gas transmission and distribution system. NEMS was developed in the Office of Integrated Analysis and Forecasting of the Energy Information Administration (EIA). NEMS is the third in a series of computer-based, midterm energy modeling systems used since 1974 by the EIA and its predecessor, the Federal Energy Administration, to analyze domestic energy-economy markets and develop projections. The NGTDM is the model within the NEMS that represents the transmission, distribution, and pricing of natural gas. The model also includes representations of the end-use demand for natural gas, the production of domestic natural gas, and the availability of natural gas traded on the international market based on information received from other NEMS models. The NGTDM determines the flow of natural gas in an aggregate, domestic pipeline network, connecting domestic and foreign supply regions with 12 demand regions. The methodology employed allows the analysis of impacts of regional capacity constraints in the interstate natural gas pipeline network and the identification of pipeline capacity expansion requirements. There is an explicit representation of core and noncore markets for natural gas transmission and distribution services, and the key components of pipeline tariffs are represented in a pricing algorithm. Natural gas pricing and flow patterns are derived by obtaining a market equilibrium across the three main elements of the natural gas market: the supply element, the demand element, and the transmission and distribution network that links them. The NGTDM consists of four modules: the Annual Flow Module, the Capacity Expansion Module, the Pipeline Tariff Module, and the Distributor Tariff Module. A model abstract is provided in Appendix A.

  4. Model documentation: Natural gas transmission and distribution model of the National Energy Modeling System. Volume 1

    International Nuclear Information System (INIS)

    1995-01-01

    The Natural Gas Transmission and Distribution Model (NGTDM) is the component of the National Energy Modeling System (NEMS) that is used to represent the domestic natural gas transmission and distribution system. NEMS was developed in the Office of Integrated Analysis and Forecasting of the Energy Information Administration (EIA). NEMS is the third in a series of computer-based, midterm energy modeling systems used since 1974 by the EIA and its predecessor, the Federal Energy Administration, to analyze domestic energy-economy markets and develop projections. The NGTDM is the model within the NEMS that represents the transmission, distribution, and pricing of natural gas. The model also includes representations of the end-use demand for natural gas, the production of domestic natural gas, and the availability of natural gas traded on the international market based on information received from other NEMS models. The NGTDM determines the flow of natural gas in an aggregate, domestic pipeline network, connecting domestic and foreign supply regions with 12 demand regions. The methodology employed allows the analysis of impacts of regional capacity constraints in the interstate natural gas pipeline network and the identification of pipeline capacity expansion requirements. There is an explicit representation of core and noncore markets for natural gas transmission and distribution services, and the key components of pipeline tariffs are represented in a pricing algorithm. Natural gas pricing and flow patterns are derived by obtaining a market equilibrium across the three main elements of the natural gas market: the supply element, the demand element, and the transmission and distribution network that links them. The NGTDM consists of four modules: the Annual Flow Module, the Capacity Expansion Module, the Pipeline Tariff Module, and the Distributor Tariff Module. A model abstract is provided in Appendix A.

  5. Improved choked flow model for MARS code

    International Nuclear Information System (INIS)

    Chung, Moon Sun; Lee, Won Jae; Ha, Kwi Seok; Hwang, Moon Kyu

    2002-01-01

    Choked flow calculation is improved by using a new sound speed criterion for bubbly flow, derived from a characteristic analysis of the hyperbolic two-fluid model. This model is based on the notion of surface tension for the interfacial pressure jump terms in the momentum equations. The real eigenvalues obtained as the closed-form solution of the characteristic polynomial represent the sound speed in the bubbly flow regime, which agrees well with the existing experimental data. The present sound speed gives a more reasonable result in the extreme case than Nguyen's does. The choked flow criterion derived from the present sound speed is implemented in the MARS code and assessed using the Marviken choked flow tests. The assessment results, obtained without any adjustment by discharge coefficients, demonstrate more accurate predictions of choked flow rate in the bubbly flow regime than earlier choked flow calculations. By calculating a typical PWR small-break LOCA (SBLOCA) problem, we confirm that the present model can reproduce reasonable transients of an integral reactor system.

  6. Modeling the distribution of Culex tritaeniorhynchus to predict Japanese encephalitis distribution in the Republic of Korea

    Directory of Open Access Journals (Sweden)

    Penny Masuoka

    2010-11-01

    Over 35,000 cases of Japanese encephalitis (JE) are reported worldwide each year. Culex tritaeniorhynchus is the primary vector of the JE virus, while wading birds are natural reservoirs and swine are amplifying hosts. As part of a JE risk analysis, the ecological niche modeling programme Maxent was used to develop a predictive model for the distribution of Cx. tritaeniorhynchus in the Republic of Korea, using mosquito collection data, temperature, precipitation, elevation, land cover and the normalized difference vegetation index (NDVI). The resulting probability maps from the model were consistent with the known environmental limitations of the mosquito, with low probabilities predicted for forest-covered mountains. July minimum temperature and land cover were the most important variables in the model. Elevation, summer NDVI (July-September), precipitation in July, summer minimum temperature (May-August) and maximum temperature for the fall and winter months also contributed to the model. Comparison of the Cx. tritaeniorhynchus model to the distribution of JE cases in the Republic of Korea from 2001 to 2009 showed that cases among a highly vaccinated Korean population were located in high-probability areas for Cx. tritaeniorhynchus. No recent JE cases were reported from the eastern coastline, where higher probabilities of mosquitoes were predicted but only small numbers of pigs are raised. The geographical distribution of reported JE cases corresponded closely with the predicted high-probability areas for Cx. tritaeniorhynchus, making the map a useful tool for health risk analysis that could be used for planning preventive public health measures.

  7. From spatially variable streamflow to distributed hydrological models: Analysis of key modeling decisions

    Science.gov (United States)

    Fenicia, Fabrizio; Kavetski, Dmitri; Savenije, Hubert H. G.; Pfister, Laurent

    2016-02-01

    This paper explores the development and application of distributed hydrological models, focusing on the key decisions of how to discretize the landscape, which model structures to use in each landscape element, and how to link model parameters across multiple landscape elements. The case study considers the Attert catchment in Luxembourg—a 300 km² mesoscale catchment with 10 nested subcatchments that exhibit clearly different streamflow dynamics. The research questions are investigated using conceptual models applied at hydrologic response unit (HRU) scales (1-4 HRUs) at 6-hourly time steps. Multiple model structures are hypothesized and implemented using the SUPERFLEX framework. Following calibration, space/time model transferability is tested using a split-sample approach, with evaluation criteria including streamflow prediction error metrics and hydrological signatures. Our results suggest that: (1) models using geology-based HRUs are more robust and capture the spatial variability of streamflow time series and signatures better than models using topography-based HRUs; this finding supports the hypothesis that, in the Attert, geology exerts a stronger control than topography on streamflow generation, (2) streamflow dynamics of different HRUs can be represented using distinct and remarkably simple model structures, which can be interpreted in terms of the perceived dominant hydrologic processes in each geology type, and (3) the same maximum root zone storage can be used across the three dominant geological units with no loss in model transferability; this finding suggests that the partitioning of water between streamflow and evaporation in the study area is largely independent of geology and can be used to improve model parsimony. The modeling methodology introduced in this study is general and can be used to advance our broader understanding and prediction of hydrological behavior, including the landscape characteristics that control hydrologic response, the...
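    The split-sample transferability test mentioned above calibrates parameters on one period and evaluates prediction error on another. A toy version, assuming a hypothetical one-parameter runoff model and synthetic data (none of this is the study's Attert setup):

```python
import random

def rmse(sim, obs):
    """Root-mean-square error between simulated and observed series."""
    return (sum((s - o) ** 2 for s, o in zip(sim, obs)) / len(obs)) ** 0.5

random.seed(7)
rain = [random.uniform(0.0, 10.0) for _ in range(200)]
# Synthetic "observed" streamflow from a hypothetical runoff coefficient 0.45.
flow = [0.45 * r + random.gauss(0.0, 0.2) for r in rain]

half = len(rain) // 2
cal_r, cal_q = rain[:half], flow[:half]       # calibration period
val_r, val_q = rain[half:], flow[half:]       # validation period

# Least-squares runoff coefficient fitted on the calibration period only.
a_hat = sum(r * q for r, q in zip(cal_r, cal_q)) / sum(r * r for r in cal_r)

rmse_cal = rmse([a_hat * r for r in cal_r], cal_q)
rmse_val = rmse([a_hat * r for r in val_r], val_q)
```

    Comparable calibration and validation errors indicate that the fitted parameter transfers in time; a large gap between them would signal overfitting to the calibration period.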

  8. A Review of Distributed Control Techniques for Power Quality Improvement in Micro-grids

    Science.gov (United States)

    Zeeshan, Hafiz Muhammad Ali; Nisar, Fatima; Hassan, Ahmad

    2017-05-01

    A micro-grid is typically visualized as a small-scale local power supply network dependent on distributed energy resources (DERs) that can operate in parallel with the grid as well as in a standalone manner. The distributed generator of a micro-grid system usually has a converter-inverter topology acting as a non-linear load and injecting harmonics into the distribution feeder. Hence, negative effects on power quality from the use of distributed generation sources and components are clearly witnessed. In this paper, a review of distributed control approaches for power quality improvement is presented, encompassing harmonic compensation, loss mitigation and optimum power sharing in a multi-source, multi-load distributed power network. The decentralized subsystems for harmonic compensation and active-reactive power sharing accuracy have been analysed in detail. Results have been validated to be consistent with IEEE standards.

  9. Dynamical Analysis of SIR Epidemic Models with Distributed Delay

    Directory of Open Access Journals (Sweden)

    Wencai Zhao

    2013-01-01

    SIR epidemic models with distributed delay are proposed. Firstly, the dynamical behaviors of the model without vaccination are studied. Using the Jacobian matrix, the stability of the equilibrium points of the system without vaccination is analyzed, and the basic reproduction number R is obtained. To study the important role of vaccination in preventing disease, the model with distributed delay under impulsive vaccination is formulated. Sufficient conditions for the globally asymptotic stability of the “infection-free” periodic solution and for the permanence of the model are obtained by using Floquet’s theorem, small-amplitude perturbation skills, and the comparison theorem. Lastly, numerical simulation is presented to illustrate the main conclusion that vaccination has significant effects on the dynamical behaviors of the model. The results can provide an effective tactical basis for practical infectious disease prevention.
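    A distributed (gamma/Erlang) delay in the infectious period can be simulated with the standard "linear chain trick": the delay kernel is replaced by a chain of sequential infectious stages with exponential transitions. The toy integration below (illustrative parameters, forward Euler, and no vaccination pulse) shows how the basic reproduction number R = beta/gamma decides the outcome:

```python
def simulate(beta, gamma, stages=3, dt=0.01, t_end=200.0):
    """SIR with an Erlang(stages)-distributed infectious period."""
    s = 0.99
    i = [0.01 / stages] * stages      # infectious stages (linear chain trick)
    r = 0.0
    rate = stages * gamma             # per-stage exit rate keeps mean 1/gamma
    for _ in range(int(t_end / dt)):
        infectious = sum(i)
        new_inf = beta * s * infectious
        flows = [rate * x for x in i]
        s -= dt * new_inf
        i[0] += dt * (new_inf - flows[0])
        for k in range(1, stages):
            i[k] += dt * (flows[k - 1] - flows[k])
        r += dt * flows[-1]
    return s, sum(i), r

# R = beta/gamma: above 1 the epidemic takes off, below 1 it dies out.
s_epi, i_epi, r_epi = simulate(beta=0.5, gamma=0.2)   # R = 2.5
s_sub, i_sub, r_sub = simulate(beta=0.1, gamma=0.2)   # R = 0.5
```

    In the supercritical run most of the susceptible pool is depleted; in the subcritical run it is barely touched, and s + i + r stays at 1 because the stage flows conserve population.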

  10. Distributed models coupling soakaways, urban drainage and groundwater

    DEFF Research Database (Denmark)

    Roldin, Maria Kerstin

    Alternative methods for stormwater management in urban areas, also called Water Sensitive Urban Design (WSUD) methods, have become increasingly important for the mitigation of urban stormwater management problems such as high runoff volumes, combined sewer overflows, poor water quality in receiving waters, urban flooding etc. WSUD structures are generally small, decentralized systems intended to manage stormwater near the source. Many of these alternative techniques are based on infiltration, which can affect both the urban sewer system and urban groundwater levels if widely implemented. This thesis examines soakaways and how these can be modeled in an integrated environment with distributed urban drainage and groundwater flow models. The thesis: 1. Identifies appropriate models of soakaways for use in an integrated and distributed urban water and groundwater modeling system; 2. Develops a modeling concept that is able...

  11. UV Stellar Distribution Model for the Derivation of Payload

    Directory of Open Access Journals (Sweden)

    Young-Jun Choi

    1999-12-01

    We present the results of a model calculation of the stellar distribution in a UV band centered at 2175 Å, corresponding to the well-known bump in the interstellar extinction curve. The stellar distribution model used here is based on the Bahcall-Soneira galaxy model (1980). The source code for the model calculation was designed by Brosch (1991) and modified to investigate various design factors for a UV satellite payload. The model predicts UV stellar densities in different sky directions, and its results are compared with the TD-1 star counts for a number of sky regions. From this study, we can determine the field of view, size of optics, angular resolution, and number of stars observed in one orbit. These will provide the basic constraints in designing a satellite payload for UV observations.

  12. Using regional bird density distribution models to evaluate protected area networks and inform conservation planning

    Science.gov (United States)

    John D. Alexander; Jaime L. Stephens; Sam Veloz; Leo Salas; Josée S. Rousseau; C. John Ralph; Daniel A. Sarr

    2017-01-01

    As data about populations of indicator species become available, proactive strategies that improve representation of biological diversity within protected area networks should consider finer-scaled evaluations, especially in regions identified as important through coarse-scale analyses. We use density distribution models derived from a robust regional bird...

  13. Robust Improvement in Estimation of a Covariance Matrix in an Elliptically Contoured Distribution Respect to Quadratic Loss Function

    Directory of Open Access Journals (Sweden)

    Z. Khodadadi

    2008-03-01

    Let S be the matrix of the residual sum of squares in the linear model Y = Aβ + e, where the matrix e has an elliptically contoured distribution with unknown scale matrix Σ. In the present work, we consider the problem of estimating Σ with respect to the squared loss function L(Σ̂, Σ) = tr[(Σ̂Σ⁻¹ − I)²]. It is shown that the improvement of the estimators obtained by James and Stein [7] and by Dey and Srinivasan [1] under the normality assumption remains robust under an elliptically contoured distribution with respect to the squared loss function.

  14. Species distribution models of tropical deep-sea snappers.

    Directory of Open Access Journals (Sweden)

    Céline Gomez

    Deep-sea fisheries provide an important source of protein to Pacific Island countries and territories that are highly dependent on fish for food security. However, spatial management of these deep-sea habitats is hindered by insufficient data. We developed species distribution models using spatially limited presence data for the main harvested species in the Western Central Pacific Ocean. We used bathymetric and water temperature data to develop presence-only species distribution models for the commercially exploited deep-sea snappers Etelis Cuvier 1828, Pristipomoides Valenciennes 1830, and Aphareus Cuvier 1830. We evaluated the performance of four different algorithms (CTA, GLM, MARS, and MAXENT) within the BIOMOD framework to obtain an ensemble of predicted distributions. We projected these predictions across the Western Central Pacific Ocean to produce maps of potential deep-sea snapper distributions in 32 countries and territories. Depth was consistently the best predictor of presence for all species groups across all models, while bathymetric slope was consistently the poorest. Temperature at depth was a good predictor of presence for GLM only. Model precision was highest for MAXENT and CTA. There were strong regional patterns in the predicted distribution of suitable habitat, with the largest areas of suitable habitat (> 35% of the Exclusive Economic Zone) predicted in seven South Pacific countries and territories (Fiji, Matthew & Hunter, Nauru, New Caledonia, Tonga, Vanuatu and Wallis & Futuna). Predicted habitat also varied among species, with the proportion of predicted habitat highest for Aphareus and lowest for Etelis. Despite the paucity of data, the relationship between deep-sea snapper presence and their environments was sufficiently strong to predict their distribution across a large area of the Pacific Ocean. Our results therefore provide a strong baseline for designing monitoring programs that balance resource exploitation and...

  15. CORCON-MOD1 modelling improvements

    International Nuclear Information System (INIS)

    Corradini, M.L.; Gonzales, F.G.; Vandervort, C.L.

    1986-01-01

    In the unlikely event of a severe accident in a light water reactor (LWR), the core may melt and slump into the reactor cavity below the reactor vessel. The interaction of the molten core with the exposed concrete (a molten-core-concrete interaction, MCCI) causes copious gas production, which influences further heat transfer and concrete attack and may threaten containment integrity. In this paper the authors focus on the low-temperature phase of the MCCI, in which the molten pool is partially solidified but is still capable of attacking concrete. The authors have developed improved phenomenological models for pool freezing and molten core-coolant heat transfer and have incorporated them into the CORCON-MOD1 computer program. In the paper the authors compare the UW-CORCON/MOD1 calculations to CORCON/MOD2 and WECHSL results, as well as to the BETA experiments being conducted in Germany.

  16. Modeling and Simulation of Power Distribution System in More Electric Aircraft

    Directory of Open Access Journals (Sweden)

    Zhangang Yang

    2015-01-01

    The More Electric Aircraft concept is a fast-developing trend in the modern aircraft industry. With this new concept, the performance of the aircraft can be further optimized and the operating and maintenance cost decreased effectively. In order to optimize the power system integrity and to be able to investigate the performance of the overall system in any possible situation, an accurate simulation model of the aircraft power system is very helpful and necessary. This paper mainly introduces a method to build a simulation model for the power distribution system, which is based on detailed component models. The power distribution system model consists of a power generation unit, transformer rectifier unit, DC-DC converter unit, and DC-AC inverter unit. In order to optimize the performance of the power distribution system and improve the quality of the distributed power, a feedback control network is designed based on the characteristics of the power distribution system. The simulation results indicate that this new simulation model is well designed and works accurately. Moreover, the steady-state and transient-state performance of the model can fulfill the requirements of an aircraft power distribution system in realistic applications.

  17. Large-watershed flood simulation and forecasting based on different-resolution distributed hydrological model

    Science.gov (United States)

    Li, J.

    2017-12-01

    Large-watershed flood simulation and forecasting is an important application of distributed hydrological models, with challenges that include the effect of the model's spatial resolution as well as model performance and accuracy. To study the resolution effect, the distributed hydrological Liuxihe model was built at resolutions of 1000 m, 600 m, 500 m, 400 m, and 200 m, in order to find the best resolution for large-watershed flood simulation and forecasting. This study sets up a physically based distributed hydrological model for flood forecasting of the Liujiang River basin in south China. The terrain data (digital elevation model, DEM), soil type, and land use type are freely downloaded from the web. The model parameters are optimized using an improved Particle Swarm Optimization (PSO) algorithm; parameter optimization reduces the uncertainty that exists in physically derived model parameters. Resolutions from 200 m to 1000 m are tested for modeling Liujiang River basin floods with the Liuxihe model. The best spatial resolution for flood simulation and forecasting is 200 m x 200 m, and as the resolution coarsens, model performance and accuracy worsen. At a resolution of 1000 m x 1000 m the simulation and forecasting results are the worst, and the river channel network derived at this resolution differs from the actual one. To keep the model at an acceptable performance, a minimum spatial resolution is needed. The suggested threshold resolution for modeling Liujiang River basin floods is a 500 m x 500 m grid cell, but a 200 m x 200 m grid cell is recommended in this study to keep the model at its best performance.
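
    The abstract does not describe the improvements made to PSO for calibrating the Liuxihe model, but the plain global-best PSO it builds on can be sketched as follows; the two-parameter least-squares objective is a hypothetical stand-in for a real hydrological calibration target:

```python
import random

def pso(objective, bounds, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Plain (global-best) PSO; the paper's improved variant is not reproduced."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gpos, gval = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gpos[d] - pos[i][d]))
                # Keep each parameter inside its physical bounds.
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gval:
                    gpos, gval = pos[i][:], val
    return gpos, gval

# Hypothetical two-parameter calibration target (true optimum at (1, -2)).
best, err = pso(lambda p: (p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2,
                bounds=[(-5.0, 5.0), (-5.0, 5.0)])
```

    In a real calibration the objective would run the hydrological model and score simulated against observed hydrographs, which is far more expensive per evaluation than this toy function.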

  18. Fitting the Probability Distribution Functions to Model Particulate Matter Concentrations

    International Nuclear Information System (INIS)

    El-Shanshoury, Gh.I.

    2017-01-01

    The main objective of this study is to identify the best probability distribution and plotting position formula for modeling the concentrations of Total Suspended Particles (TSP) as well as Particulate Matter with an aerodynamic diameter < 10 μm (PM10). The best distribution provides the estimated probabilities of exceeding the threshold limit given by the Egyptian Air Quality Limit Value (EAQLV), and the number of exceedance days is also estimated. The EAQLV standard limits for TSP and PM10 concentrations are 24-h averages of 230 μg/m³ and 70 μg/m³, respectively. Five frequency distribution functions combined with seven plotting position formulas (empirical cumulative distribution functions) are compared in fitting the daily average TSP and PM10 concentrations for Ain Sokhna city in 2014. The Quantile-Quantile (Q-Q) plot is used to assess how closely a data set fits a particular distribution. A probability distribution representing TSP and PM10 is chosen based on statistical performance indicator values. The results show that the Hosking and Wallis plotting position combined with the Frechet distribution gave the best fit for TSP and PM10 concentrations, followed by the Burr distribution with the same plotting position. The exceedance probability and the number of days over the EAQLV are predicted using the Frechet distribution. In 2014, the exceedance probability and number of days for TSP concentrations are 0.052 and 19 days, respectively. Furthermore, the PM10 concentration is found to exceed the threshold limit on 174 days.
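
    Once a Frechet distribution has been fitted, the exceedance probability and expected exceedance days follow directly from its closed-form CDF. The shape and scale values below are assumptions for illustration, not the study's fitted parameters:

```python
import math

def frechet_cdf(x, shape, loc=0.0, scale=1.0):
    """Frechet (inverse Weibull) CDF: F(x) = exp(-((x - loc)/scale)**-shape)."""
    if x <= loc:
        return 0.0
    return math.exp(-((x - loc) / scale) ** (-shape))

def exceedance(threshold, shape, loc, scale, days=365):
    """Probability of exceeding a limit value and expected days per year above it."""
    p = 1.0 - frechet_cdf(threshold, shape, loc, scale)
    return p, p * days

# Assumed shape/scale (not the study's fit) against the 230 ug/m3 TSP limit.
p, n_days = exceedance(230.0, shape=2.5, loc=0.0, scale=120.0)
```

    The heavy upper tail of the Frechet distribution is what makes it a common choice for pollutant concentration maxima.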

  19. STOCHASTIC MODEL OF THE SPIN DISTRIBUTION OF DARK MATTER HALOS

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Juhan [Center for Advanced Computation, Korea Institute for Advanced Study, Heogiro 85, Seoul 130-722 (Korea, Republic of); Choi, Yun-Young [Department of Astronomy and Space Science, Kyung Hee University, Gyeonggi 446-701 (Korea, Republic of); Kim, Sungsoo S.; Lee, Jeong-Eun [School of Space Research, Kyung Hee University, Gyeonggi 446-701 (Korea, Republic of)

    2015-09-15

    We employ a stochastic approach to probing the origin of the log-normal distributions of halo spin in N-body simulations. After analyzing spin evolution in halo merging trees, it was found that a spin change can be characterized by a stochastic random walk of angular momentum. Also, spin distributions generated by random walks are fairly consistent with those directly obtained from N-body simulations. We derived a stochastic differential equation from a widely used spin definition and measured the probability distributions of the derived angular momentum change from a massive set of halo merging trees. The roles of major merging and accretion are also statistically analyzed in evolving spin distributions. Several factors (local environment, halo mass, merging mass ratio, and redshift) are found to influence the angular momentum change. The spin distributions generated in the mean-field or void regions tend to shift slightly to a higher spin value compared with simulated spin distributions, which seems to be caused by the correlated random walks. We verified the assumption of randomness in the angular momentum change observed in the N-body simulation and detected several degrees of correlation between walks, which may provide a clue for the discrepancies between the simulated and generated spin distributions in the voids. However, the generated spin distributions in the group and cluster regions successfully match the simulated spin distribution. We also demonstrated that the log-normality of the spin distribution is a natural consequence of the stochastic differential equation of the halo spin, which is well described by the Geometric Brownian Motion model.
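
    The log-normality claim can be checked numerically: a geometric Brownian motion has a Gaussian log, so its endpoint values are log-normally distributed. A minimal simulation sketch (illustrative parameters, not fitted to any halo catalogue):

```python
import math
import random
import statistics

def simulate_gbm(j0=1.0, mu=0.05, sigma=0.3, t=1.0, steps=100, n_walks=2000, seed=1):
    """Geometric Brownian motion: a multiplicative random walk whose log is
    Gaussian, so endpoint values are log-normally distributed."""
    rng = random.Random(seed)
    dt = t / steps
    finals = []
    for _ in range(n_walks):
        log_j = math.log(j0)
        for _ in range(steps):
            # Exact log-space update: d(log J) = (mu - sigma^2/2) dt + sigma dW
            log_j += (mu - 0.5 * sigma ** 2) * dt \
                     + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        finals.append(math.exp(log_j))
    return finals

finals = simulate_gbm()
logs = [math.log(x) for x in finals]
log_mean = statistics.mean(logs)  # theory: (mu - sigma^2/2) * t = 0.005
log_sd = statistics.stdev(logs)   # theory: sigma * sqrt(t) = 0.3
```

    Correlated rather than independent increments, as detected in the void regions, would break the assumptions of this sketch and shift the resulting distribution.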

  20. Model documentation Natural Gas Transmission and Distribution Model of the National Energy Modeling System. Volume 1

    International Nuclear Information System (INIS)

    1996-01-01

    The Natural Gas Transmission and Distribution Model (NGTDM) of the National Energy Modeling System is developed and maintained by the Energy Information Administration (EIA), Office of Integrated Analysis and Forecasting. This report documents the archived version of the NGTDM that was used to produce the natural gas forecasts presented in the Annual Energy Outlook 1996, (DOE/EIA-0383(96)). The purpose of this report is to provide a reference document for model analysts, users, and the public that defines the objectives of the model, describes its basic approach, and provides detail on the methodology employed. Previously this report represented Volume I of a two-volume set. Volume II reported on model performance, detailing convergence criteria and properties, results of sensitivity testing, comparison of model outputs with the literature and/or other model results, and major unresolved issues

  1. Model documentation Natural Gas Transmission and Distribution Model of the National Energy Modeling System. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-02-26

    The Natural Gas Transmission and Distribution Model (NGTDM) of the National Energy Modeling System is developed and maintained by the Energy Information Administration (EIA), Office of Integrated Analysis and Forecasting. This report documents the archived version of the NGTDM that was used to produce the natural gas forecasts presented in the Annual Energy Outlook 1996, (DOE/EIA-0383(96)). The purpose of this report is to provide a reference document for model analysts, users, and the public that defines the objectives of the model, describes its basic approach, and provides detail on the methodology employed. Previously this report represented Volume I of a two-volume set. Volume II reported on model performance, detailing convergence criteria and properties, results of sensitivity testing, comparison of model outputs with the literature and/or other model results, and major unresolved issues.

  2. Simultaneous allocation of distributed resources using improved teaching learning based optimization

    International Nuclear Information System (INIS)

    Kanwar, Neeraj; Gupta, Nikhil; Niazi, K.R.; Swarnkar, Anil

    2015-01-01

    Highlights: • Simultaneous allocation of distributed energy resources in distribution networks. • Annual energy loss reduction is optimized using a multi-level load profile. • A new penalty factor approach is suggested to check node voltage deviations. • An improved TLBO is proposed through several modifications to standard TLBO. • An intelligent search is proposed to enhance the performance of the solution technique. - Abstract: Active and reactive power flow in distribution networks can be effectively controlled by optimally placing distributed resources such as shunt capacitors and distributed generators. This paper presents an improved variant of Teaching Learning Based Optimization (TLBO) to deal efficiently and effectively with the problem of simultaneously allocating these distributed resources in radial distribution networks under a multi-level load scenario. Several algorithm-specific modifications to standard TLBO are suggested to cope with the intrinsic flaws of the technique. In addition, an intelligent search approach is proposed to restrict the problem search space without loss of diversity, which enhances the overall performance of the proposed method. The method is investigated on the IEEE 33-bus, 69-bus and 83-bus test distribution systems, showing promising results.
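
    The paper's specific modifications are not reproduced here, but the standard TLBO it extends consists of a teacher phase (learners move toward the current best) and a learner phase (learners learn pairwise from peers). A minimal sketch, with a sphere function as a hypothetical stand-in for the network loss objective:

```python
import random

def tlbo(objective, bounds, pop_size=20, iters=100, seed=0):
    """Standard TLBO: a teacher phase pulls learners toward the best solution,
    a learner phase lets pairs of learners teach each other. The paper's
    improvements are not reproduced here."""
    rng = random.Random(seed)
    dim = len(bounds)
    clip = lambda x, d: min(max(x, bounds[d][0]), bounds[d][1])
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    fit = [objective(p) for p in pop]
    for _ in range(iters):
        teacher = pop[min(range(pop_size), key=lambda i: fit[i])]
        mean = [sum(p[d] for p in pop) / pop_size for d in range(dim)]
        for i in range(pop_size):
            tf = rng.choice((1, 2))  # teaching factor
            cand = [clip(pop[i][d] + rng.random() * (teacher[d] - tf * mean[d]), d)
                    for d in range(dim)]
            f = objective(cand)
            if f < fit[i]:
                pop[i], fit[i] = cand, f
            j = rng.randrange(pop_size)
            if j != i:
                sign = 1.0 if fit[i] < fit[j] else -1.0
                cand = [clip(pop[i][d] + sign * rng.random() * (pop[i][d] - pop[j][d]), d)
                        for d in range(dim)]
                f = objective(cand)
                if f < fit[i]:
                    pop[i], fit[i] = cand, f
    best = min(range(pop_size), key=lambda i: fit[i])
    return pop[best], fit[best]

# Hypothetical stand-in objective (sphere), not a feeder power-loss model.
best, val = tlbo(lambda p: sum(x * x for x in p), bounds=[(-10.0, 10.0)] * 3)
```

    A notable property of TLBO, and part of its appeal for allocation problems like this one, is that it has no algorithm-specific tuning parameters beyond population size and iteration count.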

  3. A simplified model of saltcake moisture distribution. Letter report

    International Nuclear Information System (INIS)

    Simmons, C.S.

    1995-09-01

    This letter report describes the formulation of a simplified model for finding the moisture distribution in a saltcake waste profile that has been stabilized by pumping out the drainable interstitial liquid. The model assumes that capillarity mainly governs the distribution of moisture in the porous saltcake waste. A steady upward flow of moisture, driven by evaporation from the waste surface, is conceptualized for isothermal conditions. To obtain hydraulic parameters for unsaturated conditions, the model is calibrated to the relative saturation distribution measured by neutron probe scans. The model is demonstrated on Tanks 104-BY and 105-TX as examples. One value of the model is that it identifies the key physical parameters controlling the surface moisture content in a waste profile. Moreover, the model can be used to estimate the brine application rate at the waste surface that would raise the moisture content there to a safe level. Thus, the model can be applied to help design a strategy for correcting the moisture conditions in a saltcake waste tank.

  4. Nucleon parton distributions in a light-front quark model

    International Nuclear Information System (INIS)

    Gutsche, Thomas; Lyubovitskij, Valery E.; Schmidt, Ivan

    2017-01-01

    Continuing our analysis of parton distributions in the nucleon, we extend our light-front quark model in order to obtain both the helicity-independent and the helicity-dependent parton distributions, analytically matching the results of global fits at the initial scale μ ∼ 1 GeV; they also contain the correct Dokshitzer-Gribov-Lipatov-Altarelli-Parisi evolution. We also calculate the transverse parton, Wigner and Husimi distributions from a unified point of view, using our light-front wave functions and expressing them in terms of the parton distributions q_v(x) and δq_v(x). Our results are very relevant for the current and future program of the COMPASS experiment at SPS (CERN). (orig.)

  5. Nucleon parton distributions in a light-front quark model

    Energy Technology Data Exchange (ETDEWEB)

    Gutsche, Thomas [Universitaet Tuebingen, Institut fuer Theoretische Physik, Kepler Center for Astro and Particle Physics, Tuebingen (Germany); Lyubovitskij, Valery E. [Universitaet Tuebingen, Institut fuer Theoretische Physik, Kepler Center for Astro and Particle Physics, Tuebingen (Germany); Tomsk State University, Department of Physics, Tomsk (Russian Federation); Tomsk Polytechnic University, Laboratory of Particle Physics, Mathematical Physics Department, Tomsk (Russian Federation); Universidad Tecnica Federico Santa Maria, Departamento de Fisica y Centro Cientifico Tecnologico de Valparaiso (CCTVal), Valparaiso (Chile); Schmidt, Ivan [Universidad Tecnica Federico Santa Maria, Departamento de Fisica y Centro Cientifico Tecnologico de Valparaiso (CCTVal), Valparaiso (Chile)

    2017-02-15

    Continuing our analysis of parton distributions in the nucleon, we extend our light-front quark model in order to obtain both the helicity-independent and the helicity-dependent parton distributions, analytically matching the results of global fits at the initial scale μ ∼ 1 GeV; they also contain the correct Dokshitzer-Gribov-Lipatov-Altarelli-Parisi evolution. We also calculate the transverse parton, Wigner and Husimi distributions from a unified point of view, using our light-front wave functions and expressing them in terms of the parton distributions q_v(x) and δq_v(x). Our results are very relevant for the current and future program of the COMPASS experiment at SPS (CERN). (orig.)

  6. Effects of naloxone distribution to likely bystanders: Results of an agent-based model.

    Science.gov (United States)

    Keane, Christopher; Egan, James E; Hawk, Mary

    2018-05-01

    Opioid overdose deaths in the US rose dramatically in the past 16 years, creating an urgent national health crisis with no signs of immediate relief. In 2017, the President of the US officially declared the opioid epidemic to be a national emergency and called for additional resources to respond to the crisis. Distributing naloxone to community laypersons and people at high risk for opioid overdose can prevent overdose death, but optimal distribution methods have not yet been pinpointed. We conducted a sequential exploratory mixed methods design using qualitative data to inform an agent-based model to improve understanding of effective community-based naloxone distribution to laypersons to reverse opioid overdose. The individuals in the model were endowed with cognitive and behavioral variables and accessed naloxone via community sites such as pharmacies, hospitals, and urgent-care centers. We compared overdose deaths over a simulated 6-month period while varying the number of distribution sites (0, 1, and 10) and number of kits given to individuals per visit (1 versus 10). Specifically, we ran thirty simulations for each of thirteen distribution models and report average overdose deaths for each. The baseline comparator was no naloxone distribution. Our simulations explored the effects of distribution through syringe exchange sites with and without secondary distribution, which refers to distribution of naloxone kits by laypersons within their social networks and enables ten additional laypersons to administer naloxone to reverse opioid overdose. Our baseline model with no naloxone distribution predicted there would be 167.9 deaths in a six month period. A single distribution site, even with 10 kits picked up per visit, decreased overdose deaths by only 8.3% relative to baseline. However, adding secondary distribution through social networks to a single site resulted in 42.5% fewer overdose deaths relative to baseline. That is slightly higher than the 39

  7. Spatio-temporal modeling of nonlinear distributed parameter systems

    CERN Document Server

    Li, Han-Xiong

    2011-01-01

    The purpose of this volume is to provide a brief review of the previous work on model reduction and identification of distributed parameter systems (DPS), and develop new spatio-temporal models and their relevant identification approaches. In this book, a systematic overview and classification on the modeling of DPS is presented first, which includes model reduction, parameter estimation and system identification. Next, a class of block-oriented nonlinear systems in traditional lumped parameter systems (LPS) is extended to DPS, which results in the spatio-temporal Wiener and Hammerstein s

  8. Improvements to the RADIOM non-LTE model

    Science.gov (United States)

    Busquet, M.; Colombant, D.; Klapisch, M.; Fyfe, D.; Gardner, J.

    2009-12-01

    In 1993, we proposed the RADIOM model [M. Busquet, Phys. Fluids 85 (1993) 4191] where an ionization temperature T_z is used to derive non-LTE properties from LTE data. T_z is obtained from an "extended Saha equation" where unbalanced transitions, like radiative decay, give the non-LTE behavior. Since then, major improvements have been made. T_z has been shown to be more than a heuristic value: it describes the actual distribution of excited and ionized states and can be understood as an "effective temperature". Therefore we complement the extended Saha equation by introducing explicitly the auto-ionization/dielectronic capture. We also use the SCROLL model to benchmark the computed values of T_z.

  9. Distributionally Robust Return-Risk Optimization Models and Their Applications

    Directory of Open Access Journals (Sweden)

    Li Yang

    2014-01-01

    Full Text Available Based on the risk control of conditional value-at-risk, distributionally robust return-risk optimization models with box constraints on the random vector are proposed. They describe uncertainty in both the distribution form and the moments (mean and covariance matrix) of the random vector. Such models are difficult to solve directly. Using conic duality theory and the minimax theorem, the models are reformulated as semidefinite programming problems, which can be solved by interior point algorithms in polynomial time. This provides an important theoretical basis for applications of the models. Moreover, an application of the models to a practical portfolio selection example is considered, and the example is evaluated using a historical data set of four stocks. Numerical results show that the proposed methods are robust and the resulting investment strategy is safe.
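
    For context, the conditional value-at-risk that these models control can be computed empirically from a loss sample as the mean of the worst (1 − α) tail. The loss sample below is invented for illustration; the paper's distributionally robust formulation instead bounds CVaR over a whole family of distributions:

```python
def var_cvar(losses, alpha=0.9):
    """Empirical value-at-risk and conditional value-at-risk: VaR is the
    alpha-quantile of losses, CVaR the mean of the worst (1 - alpha) tail."""
    s = sorted(losses)
    k = int(alpha * len(s))
    tail = s[k:]
    return s[k], sum(tail) / len(tail)

# Hypothetical percentage-loss sample for a single portfolio.
losses = [-1.2, 0.5, 2.0, -0.3, 3.1, 0.8, 4.5, -2.0, 1.1, 6.0,
          0.2, -0.7, 2.4, 1.9, 5.2, -1.5, 0.9, 3.8, 0.1, 2.7]
var, cvar = var_cvar(losses, alpha=0.9)
```

    CVaR is always at least as large as VaR at the same level, which is why controlling it gives the more conservative risk guarantee.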

  10. Regional drought assessment using a distributed hydrological model coupled with Standardized Runoff Index

    Directory of Open Access Journals (Sweden)

    H. Shen

    2015-05-01

    Full Text Available Drought assessment is essential for coping with the frequent droughts of recent years. Owing to the large spatio-temporal variations in hydrometeorology in most regions of China, a physically based hydrological model is needed to produce rational spatial and temporal distributions of the hydro-meteorological variables used in drought assessment. In this study, the large-scale distributed hydrological model Variable Infiltration Capacity (VIC) was coupled with a modified Standardized Runoff Index (SRI) for drought assessment in the Weihe River basin, northwest China. The results indicate that the coupled model is capable of reasonably reproducing the spatial distribution of drought occurrence. It reflects the spatial heterogeneity of regional drought and improves the physical basis of the SRI. The model also has potential for drought forecasting, early warning and mitigation, provided that accurate meteorological forcing data are available.
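
    A simple Gaussian version of the standardized runoff index (the paper uses a modified SRI whose details are not given here; the full index fits a probability distribution before standardizing) scores each month's runoff against the climatology of the same calendar month:

```python
import statistics

def sri(monthly_runoff):
    """Gaussian standardized runoff index: z-score of each month's runoff
    against the climatology of the same calendar month. (The full SRI fits a
    probability distribution first; this is the simplified Gaussian shortcut.)"""
    by_month = [[] for _ in range(12)]
    for i, q in enumerate(monthly_runoff):
        by_month[i % 12].append(q)
    clim = [(statistics.mean(m), statistics.stdev(m)) for m in by_month]
    return [(q - clim[i % 12][0]) / clim[i % 12][1]
            for i, q in enumerate(monthly_runoff)]

# Hypothetical 3 years of monthly runoff (mm); drier-than-normal months get SRI < 0.
runoff = [30, 40, 80, 120, 150, 90, 60, 50, 70, 100, 60, 35,
          25, 35, 70, 100, 140, 85, 55, 45, 65, 95, 55, 30,
          35, 45, 90, 140, 160, 95, 65, 55, 75, 105, 65, 40]
index = sri(runoff)
```

    Standardizing per calendar month removes the seasonal cycle, so index values are comparable across the year; negative values flag months drier than their own climatological norm.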

  11. The STIRPAT Analysis on Carbon Emission in Chinese Cities: An Asymmetric Laplace Distribution Mixture Model

    Directory of Open Access Journals (Sweden)

    Shanshan Wang

    2017-12-01

    Full Text Available In cities’ policy-making, grasping the determinants of carbon dioxide emission in Chinese cities is a pressing issue. A common approach is the STIRPAT model, whose coefficients represent the influence intensity of each determinant of carbon emission. However, little work discusses estimation accuracy, especially in the presence of non-normal distributions and heterogeneity among cities’ emissions. To improve estimation accuracy, this paper employs a new method to estimate the STIRPAT model. The method uses a mixture of Asymmetric Laplace Distributions (ALDs) to approximate the true distribution of the error term, and a purpose-designed two-layer EM algorithm is used to obtain the estimators. We test robustness by comparing the results of five different models and find that the ALD mixture model is more reliable than the others. Further, a significant Kuznets curve relationship is identified in China.

  12. Linear Model for Optimal Distributed Generation Size Predication

    Directory of Open Access Journals (Sweden)

    Ahmed Al Ameri

    2017-01-01

    Full Text Available This article presents a linear model for predicting the optimal size of Distributed Generation (DG) that minimizes power loss. The method rests on the strong coupling between active power and voltage angle, as well as between reactive power and voltage magnitude. The paper proposes a simplified method to calculate the total power losses in an electrical grid for different distributed generation sizes and locations. The method has been implemented and tested on several IEEE bus test systems. The results show that the proposed method is capable of predicting the approximate optimal size of DG when compared with precise calculations. Linearizing a complex model gave good results and can substantially reduce the required processing time. Acceptable accuracy with less time and memory required can help the grid operator assess power systems integrating large-scale distributed generation.
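
    The loss-versus-DG-size relationship that such models approximate can be illustrated on a toy radial feeder: branch losses are quadratic in net flow, so sweeping the DG size traces a convex curve with a clear minimum. The feeder data below are invented, and the single-voltage, unity-power-factor physics is a deliberate simplification:

```python
def feeder_losses(loads_kw, r_ohm, dg_bus, dg_kw, v_kv=11.0):
    """Total I^2*R loss (W) of a toy radial feeder at a single voltage level,
    unity power factor. Branch b feeds buses b..end, so its flow is the net
    downstream demand minus any downstream DG injection."""
    loss_w = 0.0
    for b in range(len(loads_kw)):
        flow_kw = sum(loads_kw[b:]) - (dg_kw if dg_bus >= b else 0.0)
        i_amp = flow_kw / v_kv  # kW / kV = A (single-phase simplification)
        loss_w += i_amp ** 2 * r_ohm
    return loss_w

# Hypothetical 5-bus feeder: sweep DG size at the end bus for minimum loss.
loads = [100.0, 150.0, 200.0, 120.0, 80.0]
best_kw = min(range(0, 651, 10), key=lambda kw: feeder_losses(loads, 0.5, 4, kw))
```

    Because each branch loss is quadratic in the net flow, the total loss is a convex function of DG size; the linear model in the paper exploits exactly this structure to predict the minimizer without a full sweep.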

  13. CHARACTERIZING AND PROPAGATING MODELING UNCERTAINTIES IN PHOTOMETRICALLY DERIVED REDSHIFT DISTRIBUTIONS

    International Nuclear Information System (INIS)

    Abrahamse, Augusta; Knox, Lloyd; Schmidt, Samuel; Thorman, Paul; Anthony Tyson, J.; Zhan Hu

    2011-01-01

    The uncertainty in the redshift distributions of galaxies has a significant potential impact on the cosmological parameter values inferred from multi-band imaging surveys. The accuracy of the photometric redshifts measured in these surveys depends not only on the quality of the flux data, but also on a number of modeling assumptions that enter into both the training set and spectral energy distribution (SED) fitting methods of photometric redshift estimation. In this work we focus on the latter, considering two types of modeling uncertainties: uncertainties in the SED template set and uncertainties in the magnitude and type priors used in a Bayesian photometric redshift estimation method. We find that SED template selection effects dominate over magnitude prior errors. We introduce a method for parameterizing the resulting ignorance of the redshift distributions, and for propagating these uncertainties to uncertainties in cosmological parameters.

  14. From microscopic taxation and redistribution models to macroscopic income distributions

    Science.gov (United States)

    Bertotti, Maria Letizia; Modanese, Giovanni

    2011-10-01

    We present here a general framework, expressed by a system of nonlinear differential equations, suitable for modeling taxation and redistribution in a closed society. This framework allows one to describe the evolution of the income distribution over the population and to explain the emergence of collective features based on knowledge of the individual interactions. By making different choices of the framework parameters, we construct different models, whose long-time behavior is then investigated. Asymptotic stationary distributions are found which enjoy properties similar to those observed in empirical distributions. In particular, they exhibit power-law tails of Pareto type, and their Lorenz curves and Gini indices are consistent with some real-world ones.
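
    A minimal agent-based analogue of such taxation-redistribution dynamics (a kinetic-exchange sketch, not the authors' differential-equation framework) shows how flat-tax redistribution compresses the emergent wealth distribution:

```python
import random

def gini(wealth):
    """Gini coefficient via the sorted (Lorenz) formula; 0 = perfect equality."""
    s = sorted(wealth)
    n = len(s)
    cum = sum((i + 1) * x for i, x in enumerate(s))
    return 2.0 * cum / (n * sum(s)) - (n + 1) / n

def exchange_economy(n=300, rounds=6000, tax_rate=0.2, seed=3):
    """Pairwise kinetic exchange with a flat tax on each trade that is
    redistributed equally; total wealth is conserved by construction."""
    rng = random.Random(seed)
    w = [1.0] * n
    for _ in range(rounds):
        i, j = rng.randrange(n), rng.randrange(n)
        if i == j:
            continue
        pot = tax_rate * (w[i] + w[j])   # tax collected on this trade
        rest = (w[i] + w[j]) - pot
        w[i] = rng.random() * rest       # random re-split of after-tax wealth
        w[j] = rest - w[i]
        dividend = pot / n               # equal redistribution of the tax
        for k in range(n):
            w[k] += dividend
    return w

with_tax = exchange_economy(tax_rate=0.2)
no_tax = exchange_economy(tax_rate=0.0)
```

    With the tax switched off, the stationary distribution of this toy economy is markedly more unequal; redistribution pulls the Gini index down, qualitatively matching the paper's long-time results.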

  15. Spatial distribution of emissions to air – the SPREAD model

    DEFF Research Database (Denmark)

    Plejdrup, Marlene Schmidt; Gyldenkærne, Steen

    The National Environmental Research Institute (NERI), Aarhus University, completes the annual national emission inventories for greenhouse gases and air pollutants according to Denmark’s obligations under international conventions, e.g. the climate convention, UNFCCC and the convention on long...... quality modelling in exposure studies. SPREAD includes emission distributions for each sector in the Danish inventory system; stationary combustion, mobile sources, fugitive emissions from fuels, industrial processes, solvents and other product use, agriculture and waste. This model enables generation...

  16. Nucleon quark distributions in a covariant quark-diquark model

    Energy Technology Data Exchange (ETDEWEB)

    Cloet, I.C. [Special Research Centre for the Subatomic Structure of Matter and Department of Physics and Mathematical Physics, University of Adelaide, SA 5005 (Australia) and Jefferson Lab, 12000 Jefferson Avenue, Newport News, VA 23606 (United States)]. E-mail: icloet@physics.adelaide.edu.au; Bentz, W. [Department of Physics, School of Science, Tokai University, Hiratsuka-shi, Kanagawa 259-1292 (Japan)]. E-mail: bentz@keyaki.cc.u-tokai.ac.jp; Thomas, A.W. [Jefferson Lab, 12000 Jefferson Avenue, Newport News, VA 23606 (United States)]. E-mail: awthomas@jlab.org

    2005-08-18

    Spin-dependent and spin-independent quark light-cone momentum distributions and structure functions are calculated for the nucleon. We utilize a modified Nambu-Jona-Lasinio model in which confinement is simulated by eliminating unphysical thresholds for nucleon decay into quarks. The nucleon bound state is obtained by solving the Faddeev equation in the quark-diquark approximation, where both scalar and axial-vector diquark channels are included. We find excellent agreement between our model results and empirical data.

  17. A Species Distribution Modeling Informed Conservation Assessment of Bog Spicebush

    Science.gov (United States)

    2016-09-14

    Adhikari, Barik, and Upadhaya 2012). These results suggest there are many locations potentially suitable for (re)introducing L. subcoriacea across its...References Adhikari, D., S. K. Barik, K. Upadhaya. 2012. Habitat distribution modelling for reintroduction of Ilex khasiana Purk., a critically

  18. Using WNTR to Model Water Distribution System Resilience

    Science.gov (United States)

    The Water Network Tool for Resilience (WNTR) is a new open source Python package developed by the U.S. Environmental Protection Agency and Sandia National Laboratories to model and evaluate resilience of water distribution systems. WNTR can be used to simulate a wide range of di...

  19. Modelling flow dynamics in water distribution networks using ...

    African Journals Online (AJOL)

    One such approach is the Artificial Neural Networks (ANNs) technique. The advantage of ANNs is that they are robust and can be used to model complex linear and non-linear systems without making implicit assumptions. ANNs can be trained to forecast flow dynamics in a water distribution network. Such flow dynamics ...

  20. A Game-Theoretic Model for Distributed Programming by Contract

    DEFF Research Database (Denmark)

    Henriksen, Anders Starcke; Hvitved, Tom; Filinski, Andrzej

    2009-01-01

    We present an extension of the programming-by-contract (PBC) paradigm to a concurrent and distributed environment.  Classical PBC is characterized by absolute conformance of code to its specification, assigning blame in case of failures, and a hierarchical, cooperative decomposition model – none...

  1. Airport acoustics: Aircraft noise distribution and modelling of some ...

    African Journals Online (AJOL)

    Airport acoustics: Aircraft noise distribution and modelling of some aircraft parameters. MU Onuu, EO Obisung. Abstract: No Abstract. Nigerian Journal of Physics Vol. 17 (Supplement) 2005: pp. 177-186.

  2. Noise Residual Learning for Noise Modeling in Distributed Video Coding

    DEFF Research Database (Denmark)

    Luong, Huynh Van; Forchhammer, Søren

    2012-01-01

    Distributed video coding (DVC) is a coding paradigm which exploits the source statistics at the decoder side to reduce the complexity at the encoder. The noise model is one of the inherently difficult challenges in DVC. This paper considers Transform Domain Wyner-Ziv (TDWZ) coding and proposes...

  3. Business models for distributed generation in a liberalized market environment

    International Nuclear Information System (INIS)

    Gordijn, Jaap; Akkermans, Hans

    2007-01-01

    The analysis of the potential of emerging innovative technologies calls for a systems-theoretic approach that takes into account technical as well as socio-economic factors. This paper reports the main findings of several business case studies of different future applications in various countries of distributed power generation technologies, all based on a common methodology for networked business modeling and analysis. (author)

  4. Performance prediction model for distributed applications on multicore clusters

    CSIR Research Space (South Africa)

    Khanyile, NP

    2012-07-01

    Full Text Available discusses some of the shortcomings of this law in the current age. We propose a theoretical model for predicting the behavior of a distributed algorithm given the network restrictions of the cluster used. The paper focuses on the impact of latency...
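
    An Amdahl-style speedup model extended with a per-node communication cost captures the qualitative point: once latency is accounted for, speedup peaks at a finite cluster size rather than saturating. The constants below are illustrative, not the paper's model:

```python
def speedup(p_parallel, n_nodes, comm_cost):
    """Amdahl-style speedup with a linear per-node communication term:
    T(n) = (1 - p) + p/n + c*n, normalized so T(1) with c = 0 equals 1."""
    t_n = (1.0 - p_parallel) + p_parallel / n_nodes + comm_cost * n_nodes
    return 1.0 / t_n

# With latency, speedup peaks at a finite node count (~sqrt(p/c)) instead of
# saturating at 1/(1 - p) as in the classical law.
best_n = max(range(1, 129), key=lambda n: speedup(0.95, n, 0.001))
```

    Minimizing p/n + c·n gives the optimum near sqrt(p/c) nodes; beyond that point, adding nodes costs more in communication than it saves in computation.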

  5. Modeling the distribution of extreme share return in Malaysia using Generalized Extreme Value (GEV) distribution

    Science.gov (United States)

    Hasan, Husna; Radi, Noor Fadhilah Ahmad; Kassim, Suraiya

    2012-05-01

    Extreme share returns in Malaysia are studied. The monthly, quarterly, half-yearly and yearly maximum returns are fitted to the Generalized Extreme Value (GEV) distribution. The Augmented Dickey-Fuller (ADF) and Phillips-Perron (PP) tests are performed to test for stationarity, while the Mann-Kendall (MK) test checks for the presence of a monotonic trend. Maximum Likelihood Estimation (MLE) is used to estimate the parameters, while L-moments estimates (LMOM) are used to initialize the MLE optimization routine for the stationary model. A likelihood ratio test is performed to determine the best model. Sherman's goodness-of-fit test is used to assess the quality of convergence of the monthly, quarterly, half-yearly and yearly maxima to the GEV distribution. Return levels are then estimated for prediction and planning purposes. The results show that the maximum returns for all selection periods are stationary. The Mann-Kendall test indicates the existence of a trend, so non-stationary models are fitted as well. Model 2, in which the location parameter increases with time, is the best for all selection intervals. Sherman's goodness-of-fit test shows that the monthly, quarterly, half-yearly and yearly maxima converge to the GEV distribution. From the results, it seems reasonable to conclude that the yearly maximum is better for convergence to the GEV distribution, especially if longer records are available. The return level estimate, the level (in this study, the return amount) that is expected to be exceeded on average once every T time periods, starts to appear within the confidence interval at T = 50 for the quarterly, half-yearly and yearly maxima.
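
    Given fitted GEV parameters, the T-period return level follows in closed form. The parameter values below are illustrative, not the study's estimates:

```python
import math

def gev_return_level(mu, sigma, xi, T):
    """Level exceeded on average once every T periods for a GEV distribution
    with location mu, scale sigma and shape xi (Gumbel limit as xi -> 0)."""
    y = -math.log(1.0 - 1.0 / T)
    if abs(xi) < 1e-9:
        return mu - sigma * math.log(y)
    return mu + (sigma / xi) * (y ** (-xi) - 1.0)

# Illustrative parameters (not the study's fitted values) for yearly maxima.
z10 = gev_return_level(mu=0.1, sigma=0.05, xi=0.1, T=10)
z50 = gev_return_level(mu=0.1, sigma=0.05, xi=0.1, T=50)
```

    Return levels grow with T, and a positive shape parameter (Frechet-type tail) makes long-horizon levels grow without bound, which is why the shape estimate dominates long-period extrapolation.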

  6. Modelling Reliability of Supply and Infrastructural Dependency in Energy Distribution Systems

    OpenAIRE

    Helseth, Arild

    2008-01-01

    This thesis presents methods and models for assessing reliability of supply and infrastructural dependency in energy distribution systems with multiple energy carriers. The three energy carriers of electric power, natural gas and district heating are considered. Models and methods for assessing reliability of supply in electric power systems are well documented, frequently applied in the industry and continuously being subject to research and improvement. On the contrary, there are compar...

  7. A Traction Control Strategy with an Efficiency Model in a Distributed Driving Electric Vehicle

    OpenAIRE

    Lin, Cheng; Cheng, Xingqun

    2014-01-01

    Both active safety and fuel economy are important issues for vehicles. This paper focuses on a traction control strategy with an efficiency model in a distributed driving electric vehicle. In emergency situations, a sliding mode control algorithm was employed to achieve antislip control by keeping the wheels' slip ratios below 20%. For general longitudinal driving cases, an efficiency model aiming at improving the fuel economy was built through an offline optimization stream within the tw...
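
    The slip-ratio threshold idea can be sketched as follows; the linear torque cut below is a crude stand-in for the paper's sliding mode controller, and the gain and threshold values are illustrative assumptions only.

```python
def slip_ratio(wheel_omega, wheel_radius, vehicle_speed):
    """Longitudinal slip during traction: (w*r - v) / (w*r)."""
    wr = wheel_omega * wheel_radius
    return (wr - vehicle_speed) / wr if wr > 1e-6 else 0.0

def antislip_torque(torque_cmd, slip, slip_max=0.2, gain=400.0):
    """Cut motor torque in proportion to the slip excess over the 20%
    threshold; a simplification, not the paper's sliding-mode law."""
    excess = max(0.0, slip - slip_max)
    return max(0.0, torque_cmd - gain * excess)
```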

  8. Model documentation: Natural Gas Transmission and Distribution Model of the National Energy Modeling System; Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-02-24

    The Natural Gas Transmission and Distribution Model (NGTDM) is a component of the National Energy Modeling System (NEMS) used to represent the domestic natural gas transmission and distribution system. NEMS is the third in a series of computer-based, midterm energy modeling systems used since 1974 by the Energy Information Administration (EIA) and its predecessor, the Federal Energy Administration, to analyze domestic energy-economy markets and develop projections. This report documents the archived version of NGTDM that was used to produce the natural gas forecasts used in support of the Annual Energy Outlook 1994, DOE/EIA-0383(94). The purpose of this report is to provide a reference document for model analysts, users, and the public that defines the objectives of the model, describes its basic design, provides detail on the methodology employed, and describes the model inputs, outputs, and key assumptions. It is intended to fulfill the legal obligation of the EIA to provide adequate documentation in support of its models (Public Law 94-385, Section 57.b.2). This report represents Volume 1 of a two-volume set. (Volume 2 will report on model performance, detailing convergence criteria and properties, results of sensitivity testing, comparison of model outputs with the literature and/or other model results, and major unresolved issues.) 
Subsequent chapters of this report provide: (1) an overview of the NGTDM (Chapter 2); (2) a description of the interface between the National Energy Modeling System (NEMS) and the NGTDM (Chapter 3); (3) an overview of the solution methodology of the NGTDM (Chapter 4); (4) the solution methodology for the Annual Flow Module (Chapter 5); (5) the solution methodology for the Distributor Tariff Module (Chapter 6); (6) the solution methodology for the Capacity Expansion Module (Chapter 7); (7) the solution methodology for the Pipeline Tariff Module (Chapter 8); and (8) a description of model assumptions, inputs, and outputs (Chapter 9).

  9. An Improved Method for Reconfiguring and Optimizing Electrical Active Distribution Network Using Evolutionary Particle Swarm Optimization

    Directory of Open Access Journals (Sweden)

    Nur Faziera Napis

    2018-05-01

    Full Text Available The presence of optimized distributed generation (DG) with suitable distribution network reconfiguration (DNR) in the electrical distribution network has advantages for voltage support, power loss reduction, deferment of new transmission lines and distribution structures, and system stability improvement. However, installing a DG unit of non-optimal size with a non-optimal DNR may lead to higher power losses, power quality problems, voltage instability and higher operational cost. Appropriate DG and DNR planning is therefore essential and is the objective of this research. An effective heuristic optimization technique, named improved evolutionary particle swarm optimization (IEPSO), is proposed. The objective function is formulated to minimize the total power losses (TPL) and to improve the voltage stability index (VSI). The voltage stability index is determined for three load demand levels, namely light load, nominal load, and heavy load, with proper optimal DNR and DG sizing. The performance of the proposed technique is compared with other optimization techniques, namely particle swarm optimization (PSO) and iteration particle swarm optimization (IPSO). Four case studies on the IEEE 33-bus and IEEE 69-bus distribution systems have been conducted to validate the effectiveness of the proposed IEPSO. The optimization results show that the best performance is achieved by the IEPSO technique, with a power loss reduction of up to 79.26% and a 58.41% improvement in the voltage stability index. Moreover, IEPSO has the fastest computational time for all load conditions as compared to the other algorithms.
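
    For reference, a plain particle swarm optimizer (without the evolutionary improvements of the paper's IEPSO) can be sketched as below; the quadratic `loss` function is a toy stand-in for the actual TPL/VSI objective evaluated via a power-flow solver.

```python
import random

def pso_minimize(f, bounds, n_particles=30, iters=200,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Plain PSO: inertia w, cognitive c1, social c2 (textbook values)."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d],
                                    bounds[d][0]), bounds[d][1])
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# toy stand-in for total power loss as a function of DG size and a tap setting
loss = lambda x: (x[0] - 1.2) ** 2 + (x[1] - 0.4) ** 2 + 0.5
best, best_val = pso_minimize(loss, [(0.0, 3.0), (0.0, 1.0)])
```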

  10. Modelling the distribution of domestic ducks in Monsoon Asia

    Science.gov (United States)

    Van Boeckel, Thomas P.; Prosser, Diann; Franceschini, Gianluca; Biradar, Chandra; Wint, William; Robinson, Tim; Gilbert, Marius

    2011-01-01

    Domestic ducks are considered to be an important reservoir of highly pathogenic avian influenza (HPAI), as shown by a number of geospatial studies in which they have been identified as a significant risk factor associated with disease presence. Despite their importance in HPAI epidemiology, their large-scale distribution in Monsoon Asia is poorly understood. In this study, we created a spatial database of domestic duck census data in Asia and used it to train statistical distribution models for domestic duck distributions at a spatial resolution of 1 km. The method was based on a modelling framework used by the Food and Agriculture Organisation to produce the Gridded Livestock of the World (GLW) database, and relies on stratified regression models between domestic duck densities and a set of agro-ecological explanatory variables. We evaluated different ways of stratifying the analysis and of combining the predictions to optimize their goodness of fit. We found that domestic duck density could be predicted with reasonable accuracy (the mean RMSE and correlation coefficient between log-transformed observed and predicted densities being 0.58 and 0.80, respectively), using a stratification based on livestock production systems. We tested the use of artificially degraded data on duck distributions in Thailand and Vietnam as training data, and compared the modelled outputs with the original high-resolution data. This showed, for these two countries at least, that these approaches could be used to accurately disaggregate provincial-level (administrative level 1) statistical data into high-resolution modelled distributions.
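
    The goodness-of-fit figures quoted above (RMSE 0.58 and r 0.80 on log-transformed densities) can be computed with a small helper like this; the abstract does not state the exact transform, so the `log(x + 1)` offset is an assumption to guard against zero densities.

```python
import math

def log_density_fit(observed, predicted, offset=1.0):
    """RMSE and Pearson r between log-transformed density values."""
    lo = [math.log(o + offset) for o in observed]
    lp = [math.log(p + offset) for p in predicted]
    n = len(lo)
    rmse = math.sqrt(sum((a - b) ** 2 for a, b in zip(lo, lp)) / n)
    mo, mp = sum(lo) / n, sum(lp) / n
    cov = sum((a - mo) * (b - mp) for a, b in zip(lo, lp))
    so = math.sqrt(sum((a - mo) ** 2 for a in lo))
    sp = math.sqrt(sum((b - mp) ** 2 for b in lp))
    return rmse, cov / (so * sp)
```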

  11. Some important results from the air pollution distribution model STACKS (1988-1992)

    International Nuclear Information System (INIS)

    Erbrink, J.J.

    1993-01-01

    Attention is paid to the results of the study on the distribution of air pollutants from the high chimney-stacks of electric power plants. An important product of the study is the integrated distribution model STACKS (Short Term Air-pollutant Concentrations Kema modelling System). The improvements and extensions of STACKS are described in relation to the National Model, which has been used to estimate the environmental effects of individual chimney-stacks. The National Model shows unacceptable variations for high pollutant sources. Based on the results of STACKS, a revision of the National Model has been taken into consideration. By means of the revised National Model, a more realistic estimation of the environmental effects of electric power plants can be carried out.

  12. Regulation of electricity distribution: Issues for implementing a norm model

    International Nuclear Information System (INIS)

    Bjoerndal, Endre; Bjoerndal, Mette; Bjoernenak, Trond; Johnsen, Thore

    2005-01-01

    The Norwegian regulation of transmission and distribution of electricity is currently under revision, and several proposals, including price caps, various norm models and adjustments to the present revenue cap model, have been considered by the Norwegian regulator, NVE. Our starting point is that a successful and sustainable income regulation model for electricity distribution should be in accordance with the way of thinking, and the managerial tools, of modern businesses. In the regulation it is assumed that decisions regarding operations and investments are made by independent, business-oriented entities. The ambition of a dynamically efficient industry therefore requires that the regulatory model and its implementation support best-practice business performance. This will influence how the cost base is determined and the way investments are dealt with. We investigate a possible implementation of a regulatory model based on cost norms, distinguishing between customer-driven costs on the one hand and costs related to the network itself on the other. The network-related costs, which account for approximately 80% of the total cost of electricity distribution, include the costs of operating and maintaining the network, as well as capital costs. These are the ''difficult'' costs, as their levels depend on structural and climatic factors, as well as the number of customers and the load that is served. Additionally, the costs are not separable, since, for instance, maintenance and investments can be substitutable activities. The work concentrates on verifying the cost model and evaluating implications for the use of the present efficiency model (DEA) in the regulation. Moreover, we consider how network-related costs can be managed in a norm model. Finally, it is highlighted that an important part of a regulatory model based on cost norms is to devise quality measures and ways to use them in the economic regulation. (Author)

  13. A distributed approach for parameters estimation in System Biology models

    International Nuclear Information System (INIS)

    Mosca, E.; Merelli, I.; Alfieri, R.; Milanesi, L.

    2009-01-01

    Due to the lack of experimental measurements, biological variability and experimental errors, the values of many parameters of systems biology mathematical models are still unknown or uncertain. A possible computational solution is parameter estimation, that is, the identification of the parameter values that give the best model fit with respect to experimental data. We have developed an environment to distribute each run of the parameter estimation algorithm to a different computational resource. The key feature of the implementation is a relational database that allows the user to swap the candidate solutions among the working nodes during the computations. The comparison of the distributed implementation with the parallel one showed that the presented approach enables a faster and better parameter estimation of systems biology models.
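
    Distributing independent estimation runs can be sketched with a process pool, as below; this uses local multiprocessing instead of the paper's grid/relational-database setup, and the exponential-decay model and its data are synthetic stand-ins.

```python
import numpy as np
from multiprocessing import Pool
from scipy.optimize import minimize

# synthetic "experimental" data from a toy exponential-decay model y = a*exp(-k*t)
T = np.linspace(0.0, 5.0, 20)
DATA = 2.0 * np.exp(-0.8 * T)

def sse(params):
    """Sum of squared errors between model and data (the fit objective)."""
    a, k = params
    return float(np.sum((a * np.exp(-k * T) - DATA) ** 2))

def one_run(seed):
    """One local search from a random start; runs are independent, so they
    can be farmed out to separate computational resources."""
    rng = np.random.default_rng(seed)
    x0 = rng.uniform(0.1, 3.0, size=2)
    res = minimize(sse, x0, method="Nelder-Mead")
    return res.fun, res.x

if __name__ == "__main__":
    with Pool(4) as pool:
        results = pool.map(one_run, range(8))
    best_fun, best_x = min(results, key=lambda r: r[0])
    print(best_x)  # close to the true parameters (2.0, 0.8)
```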

  14. Parton distribution functions with QED corrections in the valon model

    Science.gov (United States)

    Mottaghizadeh, Marzieh; Taghavi Shahri, Fatemeh; Eslami, Parvin

    2017-10-01

    The parton distribution functions (PDFs) with QED corrections are obtained by solving the QCD ⊗ QED DGLAP evolution equations in the framework of the "valon" model at the next-to-leading-order QCD and the leading-order QED approximations. Our results for the PDFs with QED corrections in this phenomenological model are in good agreement with the newly released CT14QED global fit code [Phys. Rev. D 93, 114015 (2016), 10.1103/PhysRevD.93.114015] and the APFEL (NNPDF2.3QED) program [Comput. Phys. Commun. 185, 1647 (2014), 10.1016/j.cpc.2014.03.007] over a wide range of x = [10^-5, 1] and Q^2 = [0.283, 10^8] GeV^2. We also propose a new method for studying the symmetry breaking of the sea quark distribution functions inside the proton.

  15. NHPP-Based Software Reliability Models Using Equilibrium Distribution

    Science.gov (United States)

    Xiao, Xiao; Okamura, Hiroyuki; Dohi, Tadashi

    Non-homogeneous Poisson processes (NHPPs) have gained much popularity in actual software testing phases to estimate the software reliability, the number of remaining faults in software and the software release timing. In this paper, we propose a new modeling approach for the NHPP-based software reliability models (SRMs) to describe the stochastic behavior of software fault-detection processes. The fundamental idea is to apply the equilibrium distribution to the fault-detection time distribution in NHPP-based modeling. We also develop efficient parameter estimation procedures for the proposed NHPP-based SRMs. Through numerical experiments, it can be concluded that the proposed NHPP-based SRMs outperform the existing ones in many data sets from the perspective of goodness-of-fit and prediction performance.
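
    The equilibrium distribution applied to a fault-detection time distribution F is Fe(t) = (1/mu) * ∫_0^t (1 - F(s)) ds, with mu the mean of F. A numeric sketch (assuming the grid covers essentially all of the distribution's mass):

```python
import numpy as np

def equilibrium_cdf(F, t_grid):
    """Fe(t) = (1/mu) * int_0^t (1 - F(s)) ds, mu = int_0^inf (1 - F(s)) ds,
    approximated by trapezoidal sums on the supplied grid."""
    surv = 1.0 - F(t_grid)
    steps = np.diff(t_grid) * (surv[1:] + surv[:-1]) / 2.0
    cum = np.concatenate(([0.0], np.cumsum(steps)))
    return cum / cum[-1]

t = np.linspace(0.0, 20.0, 2001)
F_exp = lambda s: 1.0 - np.exp(-s)
Fe = equilibrium_cdf(F_exp, t)
```

    A standard sanity check: the exponential distribution is its own equilibrium distribution, so `Fe` should reproduce `F_exp` on this grid.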

  16. A model for fission product distribution in CANDU fuel

    International Nuclear Information System (INIS)

    Muzumdar, A.P.

    1983-01-01

    This paper describes a model to estimate the distribution of active fission products among the UO2 grains, grain boundaries, and the free void spaces in CANDU fuel elements during normal operation. This distribution is required for the calculation of the potential release of activity from failed fuel sheaths during a loss-of-coolant accident. The activity residing in the free spaces (the ''free'' inventory) is available for release upon sheath rupture, whereas relatively high fuel temperatures and/or thermal shock are required to release the activity in the grain boundaries or grains. A preliminary comparison of the model with data from in-reactor sweep-gas experiments performed in Canada yields generally good agreement, with overprediction rather than underprediction of radiologically important isotopes such as 131I. The model also appears to generally agree with the ''free'' inventory release calculated using ANS-5.4. (author)

  17. A simple model for skewed species-lifetime distributions

    KAUST Repository

    Murase, Yohsuke

    2010-06-11

    A simple model of a biological community assembly is studied. Communities are assembled by successive migrations and extinctions of species. In the model, species are interacting with each other. The intensity of the interaction between each pair of species is denoted by an interaction coefficient. At each time step, a new species is introduced to the system with randomly assigned interaction coefficients. If the sum of the coefficients, which we call the fitness of a species, is negative, the species goes extinct. The species-lifetime distribution is found to be well characterized by a stretched exponential function with an exponent close to 1/2. This profile agrees not only with more realistic population dynamics models but also with fossil records. We also find that an age-independent and inversely diversity-dependent mortality, which is confirmed in the simulation, is a key mechanism accounting for the distribution. © IOP Publishing Ltd and Deutsche Physikalische Gesellschaft.
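
    The assembly rule described above can be simulated directly. The sketch below is one simple reading of the abstract (pairwise coefficients drawn uniformly in [-1, 1]; a species goes extinct as soon as its summed coefficients turn negative); removal ordering and the coefficient distribution are assumptions not fixed by the abstract.

```python
import random

def simulate(steps=300, seed=2):
    """Toy community assembly: migration of one species per step,
    extinction of any resident whose fitness (sum of its interaction
    coefficients over current residents) is negative."""
    rng = random.Random(seed)
    coeff = {}      # (i, j) -> coefficient of species i's interaction with j
    fitness = {}    # species id -> current fitness
    arrival = {}
    community = set()
    lifetimes = []
    for t in range(steps):
        s = t  # new species id
        fitness[s] = 0.0
        for r in community:
            coeff[(s, r)] = rng.uniform(-1.0, 1.0)
            coeff[(r, s)] = rng.uniform(-1.0, 1.0)
            fitness[s] += coeff[(s, r)]
            fitness[r] += coeff[(r, s)]
        community.add(s)
        arrival[s] = t
        # extinction cascade: removals can push other species negative
        neg = [sp for sp in community if fitness[sp] < 0.0]
        while neg:
            x = neg[0]
            community.remove(x)
            lifetimes.append(t - arrival[x])
            for r in community:
                fitness[r] -= coeff[(r, x)]
            neg = [sp for sp in community if fitness[sp] < 0.0]
    return lifetimes

lifetimes = simulate()
```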

  18. Using Model Checking for Analyzing Distributed Power Control Problems

    Directory of Open Access Journals (Sweden)

    Thomas Brihaye

    2010-01-01

    Full Text Available Model checking (MC) is a formal verification technique that is well established and continues to enjoy great success in the computer science community. Realizing that the distributed power control (PC) problem can be modeled as a timed game between a given transmitter and its environment, the authors wanted to know whether this approach could be applied to distributed PC. It turns out that it can be applied successfully, and that it allows one to analyze realistic scenarios, including the case of discrete transmit powers and games with incomplete information. The proposed methodology is as follows. We state some objectives a transmitter-receiver pair would like to reach. The network is modeled by a game in which transmitters are considered as timed automata interacting with each other. The objectives are then translated into timed alternating-time temporal logic formulae, and MC is exploited to know whether the desired properties are verified and to determine a winning strategy.

  19. Voxel inversion of airborne electromagnetic data for improved model integration

    Science.gov (United States)

    Fiandaca, Gianluca; Auken, Esben; Kirkegaard, Casper; Vest Christiansen, Anders

    2014-05-01

    Inversion of electromagnetic data has migrated from single-site interpretations to inversions covering entire surveys, using spatial constraints to obtain geologically reasonable results. However, the model space is usually linked to the actual observation points. For airborne electromagnetic (AEM) surveys, the spatial discretization of the model space reflects the flight lines. In contrast, geological and groundwater models most often refer to a regular voxel grid, not correlated to the geophysical model space, and the geophysical information has to be relocated for integration in (hydro)geological models. We have developed a new geophysical inversion algorithm working directly in a voxel grid disconnected from the actual measuring points, which then allows for directly informing geological/hydrogeological models. The new voxel model space defines the soil properties (like resistivity) on a set of nodes, and the distribution of the soil properties is computed everywhere by means of an interpolation function (e.g. inverse distance or kriging). Given this definition of the voxel model space, the 1D forward responses of the AEM data are computed as follows: 1) a 1D model subdivision, in terms of model thicknesses, is defined for each 1D data set, creating "virtual" layers; 2) the "virtual" 1D models at the sounding positions are finalized by interpolating the soil properties (the resistivity) at the centers of the "virtual" layers; 3) the forward response is computed in 1D for each "virtual" model. We tested the new inversion scheme on an AEM survey carried out with the SkyTEM system close to Odder, in Denmark. The survey comprises 106054 dual mode AEM soundings, and covers an area of approximately 13 km X 16 km. The voxel inversion was carried out on a structured grid of 260 X 325 X 29 xyz nodes (50 m xy spacing), for a total of 2450500 inversion parameters. A classical spatially constrained inversion (SCI) was carried out on the same data set, using 106054
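
    The inverse-distance option for step 2 (mapping node resistivities to the centres of the "virtual" layers) can be sketched as below; the power and smoothing constants are illustrative, and kriging would be the alternative interpolant mentioned in the abstract.

```python
import numpy as np

def idw(node_xyz, node_vals, target_xyz, power=2.0, eps=1e-12):
    """Inverse-distance-weighted interpolation from voxel-grid nodes to
    arbitrary target points (e.g. 'virtual' layer centres)."""
    d = np.linalg.norm(target_xyz[:, None, :] - node_xyz[None, :, :], axis=2)
    w = 1.0 / (d ** power + eps)                 # eps avoids division by zero
    return (w @ node_vals) / w.sum(axis=1)       # weighted average per target
```

    Because the weights form a convex combination, interpolated values always stay within the range of the node values, and a target coinciding with a node recovers that node's value.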

  20. Application distribution model and related security attacks in VANET

    Science.gov (United States)

    Nikaein, Navid; Kanti Datta, Soumya; Marecar, Irshad; Bonnet, Christian

    2013-03-01

    In this paper, we present a model for application distribution and related security attacks in dense vehicular ad hoc networks (VANET) and sparse VANET which forms a delay tolerant network (DTN). We study the vulnerabilities of VANET to evaluate the attack scenarios and introduce a new attacker's model as an extension to the work done in [6]. Then a VANET model has been proposed that supports the application distribution through proxy app stores on top of mobile platforms installed in vehicles. The steps of application distribution have been studied in detail. We have identified key attacks (e.g. malware, spamming and phishing, software attack and threat to location privacy) for dense VANET and two attack scenarios for sparse VANET. It has been shown that attacks can be launched by distributing malicious applications and injecting malicious codes to On Board Unit (OBU) by exploiting OBU software security holes. Consequences of such security attacks have been described. Finally, countermeasures including the concepts of sandbox have also been presented in depth.

  1. Assessment and Reduction of Model Parametric Uncertainties: A Case Study with A Distributed Hydrological Model

    Science.gov (United States)

    Gan, Y.; Liang, X. Z.; Duan, Q.; Xu, J.; Zhao, P.; Hong, Y.

    2017-12-01

    The uncertainties associated with the parameters of a hydrological model need to be quantified and reduced for it to be useful for operational hydrological forecasting and decision support. An uncertainty quantification framework is presented to facilitate practical assessment and reduction of model parametric uncertainties. A case study, using the distributed hydrological model CREST for daily streamflow simulation during the period 2008-2010 over ten watersheds, was used to demonstrate the performance of this new framework. Model behaviors across watersheds were analyzed by a two-stage stepwise sensitivity analysis procedure, using the LH-OAT method for screening out insensitive parameters, followed by MARS-based Sobol' sensitivity indices for quantifying each parameter's contribution to the response variance due to its first-order and higher-order effects. Pareto optimal sets of the influential parameters were then found by the adaptive surrogate-based multi-objective optimization procedure, using a MARS model for approximating the parameter-response relationship and the SCE-UA algorithm for searching the optimal parameter sets of the adaptively updated surrogate model. The final optimal parameter sets were validated against the daily streamflow simulation of the same watersheds during the period 2011-2012. The stepwise sensitivity analysis procedure efficiently reduced the number of parameters that need to be calibrated from twelve to seven, which helps to limit the dimensionality of the calibration problem and serves to enhance the efficiency of parameter calibration. The adaptive MARS-based multi-objective calibration exercise provided satisfactory solutions to the reproduction of the observed streamflow for all watersheds. The final optimal solutions showed significant improvement when compared to the default solutions, with about 65-90% reduction in 1-NSE and 60-95% reduction in |RB|.
The validation exercise indicated a large improvement in model performance with about 40
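
    The two skill metrics quoted above are standard and can be computed as follows (NSE is the Nash-Sutcliffe efficiency, so 1-NSE is the misfit being reduced; RB is the relative bias of simulated versus observed volume):

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 means no better
    than the observed mean."""
    obs = np.asarray(obs, float)
    sim = np.asarray(sim, float)
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def relative_bias(obs, sim):
    """RB = (sum(sim) - sum(obs)) / sum(obs); |RB| is the reported metric."""
    obs = np.asarray(obs, float)
    sim = np.asarray(sim, float)
    return (sim.sum() - obs.sum()) / obs.sum()
```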

  2. Studying the Impact of Distributed Solar PV on Power Systems using Integrated Transmission and Distribution Models: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Jain, Himanshu [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Palmintier, Bryan S [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Krad, Ibrahim [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Krishnamurthy, Dheepak [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-08-24

    This paper presents the results of a distributed solar PV impact assessment study that was performed using a synthetic integrated transmission (T) and distribution (D) model. The primary objective of the study was to present a new approach for distributed solar PV impact assessment, in which, along with detailed models of transmission and distribution networks, consumer loads were modeled using the physics of end-use equipment, and distributed solar PV was geographically dispersed and connected to the secondary distribution networks. The highlights of the study results were (i) an increase in the Area Control Error (ACE) at high penetration levels of distributed solar PV; and (ii) differences in distribution voltage profiles and voltage regulator operations between integrated T&D and distribution-only simulations.
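
    For context, the ACE metric combines the tie-line interchange deviation with a frequency bias term. One common sign convention (frequency bias B given as a negative number in MW per 0.1 Hz) is sketched below; the paper may use a different convention.

```python
def area_control_error(tie_actual_mw, tie_sched_mw, freq_hz,
                       bias_mw_per_01hz, f_sched_hz=60.0):
    """ACE = (NIa - NIs) - 10*B*(Fa - Fs).
    Negative ACE indicates the area is under-generating."""
    return ((tie_actual_mw - tie_sched_mw)
            - 10.0 * bias_mw_per_01hz * (freq_hz - f_sched_hz))
```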

  3. Modeling and optimization of parallel and distributed embedded systems

    CERN Document Server

    Munir, Arslan; Ranka, Sanjay

    2016-01-01

    This book introduces the state-of-the-art in research in parallel and distributed embedded systems, which have been enabled by developments in silicon technology, micro-electro-mechanical systems (MEMS), wireless communications, computer networking, and digital electronics. These systems have diverse applications in domains including military and defense, medical, automotive, and unmanned autonomous vehicles. The emphasis of the book is on the modeling and optimization of emerging parallel and distributed embedded systems in relation to the three key design metrics of performance, power and dependability.

  4. Models, Languages and Logics for Concurrent Distributed Systems

    DEFF Research Database (Denmark)

    The EEC Esprit Basic Research Action No 3011, Models, Languages and Logics for Concurrent Distributed Systems, CEDISYS, held its second workshop at Aarhus University in May, 1991, following the successful workshop in San Miniato in 1990. The Aarhus Workshop was centered around CEDISYS research...... activities, and the selected themes of Applications and Automated Tools in the area of Distributed Systems. The 24 participants were CEDISYS partners, and invited guests with expertise on the selected themes. This booklet contains the program of the workshop, short abstracts for the talks presented...

  5. Mechanistic model for void distribution in flashing flow

    International Nuclear Information System (INIS)

    Riznic, J.; Ishii, M.; Afgan, N.

    1987-01-01

    The problem of discharging an initially subcooled liquid from a high-pressure condition into a low-pressure environment is quite important in several industrial systems, such as nuclear and chemical reactors. A new model for the flashing process is proposed here based on wall nucleation theory, a bubble growth model and a drift-flux bubble transport model. In order to calculate the bubble number density, the bubble number transport equation with a distributed source from the wall nucleation sites is used. The model predictions in terms of the void fraction are compared to the Moby Dick and BNL experimental data. The comparison shows that satisfactory agreement can be obtained from the present model without any floating parameter to be adjusted to the data. This result indicates that, at least for the experimental conditions considered here, a mechanistic prediction of the flashing phenomenon is possible based on the present wall-nucleation-based model. 43 refs., 4 figs

  6. A Meteorological Distribution System for High Resolution Terrestrial Modeling (MicroMet)

    Science.gov (United States)

    Liston, G. E.; Elder, K.

    2004-12-01

    Spatially distributed terrestrial models generally require atmospheric forcing data on horizontal grids that are of higher resolution than available meteorological data. Furthermore, the meteorological data collected may not necessarily represent the area of interest's meteorological variability. To address these deficiencies, computationally efficient and physically realistic methods must be developed to take available meteorological data sets (e.g., meteorological tower observations) and generate high-resolution atmospheric-forcing distributions. This poster describes MicroMet, a quasi-physically-based, but simple meteorological distribution model designed to produce high-resolution (e.g., 5-m to 1-km horizontal grid increments) meteorological data distributions required to run spatially distributed terrestrial models over a wide variety of landscapes. The model produces distributions of the seven fundamental atmospheric forcing variables required to run most terrestrial models: air temperature, relative humidity, wind speed, wind direction, incoming solar radiation, incoming longwave radiation, and precipitation. MicroMet includes a preprocessor that analyzes meteorological station data and identifies and repairs potential data deficiencies. The model uses known relationships between meteorological variables and the surrounding area (primarily topography) to distribute those variables over any given landscape. MicroMet performs two kinds of adjustments to available meteorological data: 1) when there are data at more than one location, at a given time, the data are spatially interpolated over the domain using a Barnes objective analysis scheme, and 2) physical sub-models are applied to each MicroMet variable to improve its realism at a given point in space and time with respect to the terrain. The three, 25-km by 25-km, Cold Land Processes Experiment (CLPX) mesoscale study areas (MSAs: Fraser, North Park, and Rabbit Ears) will be used as example Micro
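
    The Barnes objective analysis step mentioned above weights station observations by a Gaussian function of distance. A single-pass sketch is shown below; operational Barnes schemes add successive correction passes, and the smoothing parameter `kappa` is an illustrative assumption.

```python
import numpy as np

def barnes_pass(station_xy, station_vals, grid_xy, kappa):
    """Single-pass Barnes analysis: weights w = exp(-d^2 / kappa)
    applied to all stations for every grid point."""
    d2 = ((grid_xy[:, None, :] - station_xy[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / kappa)
    return (w @ station_vals) / w.sum(axis=1)
```

    Like inverse-distance weighting, the result is a convex combination of the observations, so the analysis honours the observed range while smoothing toward nearby stations.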

  7. Using Model Checking for Analyzing Distributed Power Control Problems

    DEFF Research Database (Denmark)

    Brihaye, Thomas; Jungers, Marc; Lasaulce, Samson

    2010-01-01

    Model checking (MC) is a formal verification technique that is well established and continues to enjoy great success in the computer science community. Realizing that the distributed power control (PC) problem can be modeled as a timed game between a given transmitter and its environment, the authors...... objectives a transmitter-receiver pair would like to reach. The network is modeled by a game where transmitters are considered as timed automata interacting with each other. The objectives are then translated into timed alternating-time temporal logic formulae and MC is exploited to know whether the desired...

  8. Measured PET Data Characterization with the Negative Binomial Distribution Model.

    Science.gov (United States)

    Santarelli, Maria Filomena; Positano, Vincenzo; Landini, Luigi

    2017-01-01

    An accurate statistical model of PET measurements is a prerequisite for correct image reconstruction when using statistical image reconstruction algorithms, or when pre-filtering operations must be performed. Although radioactive decay follows a Poisson distribution, deviations from Poisson statistics occur in projection data prior to reconstruction due to physical effects, measurement errors, and the correction of scatter and random coincidences. Modelling projection data can aid in understanding the statistical nature of the data in order to develop efficient processing methods and to reduce noise. This paper outlines the statistical behaviour of measured emission data, evaluating the goodness of fit of the negative binomial (NB) distribution model to PET data for a wide range of emission activity values. An NB distribution model is characterized by the mean of the data and the dispersion parameter α that describes the deviation from Poisson statistics. Monte Carlo simulations were performed to evaluate: (a) the performance of the dispersion parameter α estimator, and (b) the goodness of fit of the NB model for a wide range of activity values. We focused on the effect produced by correction for random and scatter events in the projection (sinogram) domain, due to their importance in the quantitative analysis of PET data. The analysis developed herein allowed us to assess the accuracy of the NB distribution model in fitting corrected sinogram data, and to evaluate the sensitivity of the dispersion parameter α in quantifying the deviation from Poisson statistics. The sinogram ROI-based analysis demonstrated that deviations of the measured data from Poisson statistics can be quantitatively characterized by the dispersion parameter α, under any noise conditions and corrections.
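
    A simple moment-based estimate of the dispersion parameter α (not the paper's estimator, whose exact form is not given in the abstract) follows from the NB variance relation var = μ + α·μ²:

```python
import numpy as np

def nb_dispersion(counts):
    """Moment estimate of NB dispersion: alpha = (var - mu) / mu^2.
    alpha -> 0 recovers Poisson statistics; alpha > 0 means overdispersion."""
    x = np.asarray(counts, dtype=float)
    mu = x.mean()
    var = x.var(ddof=1)
    return (var - mu) / mu ** 2
```

    On Poisson-distributed counts this estimate hovers around zero, while corrected sinogram data that deviate from Poisson statistics yield a positive α.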

  9. Comparison of different reliability improving investment strategies of Finnish medium-voltage distribution systems

    Energy Technology Data Exchange (ETDEWEB)

    Laagland, H.

    2012-07-01

    The electricity distribution sector in Finland is highly regulated and the return on investments in distribution networks is low. Low profits do not make the electricity distribution sector attractive to outside investors. During the second regulatory period of 2008-2011, incentives are included in the Finnish regulation model which allow higher profits for the network owners for correctly allocated network investments leading to lower operation and interruption costs. The goal of the thesis is to find cost-effective medium-voltage distribution system investment strategies for the Finnish power distribution companies with respect to the incentives of the second regulatory period. In this work the sectionalisation concept is further developed by deriving equations for the economic and reliability indices of a homogeneous electricity distribution system as a function of the number of sectionalisation zones. The cost-effective medium-voltage distribution system investment strategies are found by studying the technical and economic interaction of feeder automation on different network structures. Ten feeder automation schemes have been applied to six urban/rural area generic feeders and two real rural area feeders of a distribution company in western Finland. The analytical approach includes modelling of the feeders and feeder functions and calculation of the economic and reliability indices. The following investment areas are included: different electricity distribution systems, new substation, new switching station, central earth-fault current compensation, cabling and feeder automation. The results of this work reveal the influence that feeder automation has on the reliability and economy of different distribution structures. The transparency thus created enables a national and/or distribution company network investment strategy to optimise the economic benefits of investments. (orig.)

  10. Modelling the distribution of chickens, ducks, and geese in China

    Science.gov (United States)

Prosser, Diann J.; Wu, Junxi; Ellis, Erle C.; Gale, Fred; Van Boeckel, Thomas P.; Wint, William; Robinson, Tim; Xiao, Xiangming; Gilbert, Marius

    2011-01-01

Global concerns over the emergence of zoonotic pandemics emphasize the need for high-resolution population distribution mapping and spatial modelling. Ongoing efforts to model disease risk in China have been hindered by a lack of available species-level distribution maps for poultry. The goal of this study was to develop 1 km resolution population density models for China's chickens, ducks, and geese. We used an information theoretic approach to predict poultry densities based on statistical relationships between poultry census data and high-resolution agro-ecological predictor variables. Model predictions were validated by comparing goodness-of-fit measures (root mean square error and correlation coefficient) for observed and predicted values on the quarter of the sample data withheld from model training. Final output included mean and coefficient of variation maps for each species. We tested the quality of models produced using three predictor datasets and four regional stratification methods. For predictor variables, a combination of traditional predictors for livestock mapping and land use predictors produced the best goodness-of-fit scores. Comparison of regional stratifications indicated that for chickens and ducks, a stratification based on livestock production systems produced the best results; for geese, an agro-ecological stratification produced the best results. However, for all species, each method of regional stratification produced significantly better goodness-of-fit scores than the global model. Here we provide descriptive methods, analytical comparisons, and model output for China's first high-resolution, species-level poultry distribution maps. Output will be made available to the scientific and public community for use in a wide range of applications from epidemiological studies to livestock policy and management initiatives.
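The hold-out validation described above (RMSE and correlation on the quarter of records withheld from training) can be sketched as follows; the slope-only linear "model" and the synthetic data are stand-ins for the study's density models and census records.

```python
import math, random

random.seed(0)

# Synthetic (predictor, observed-density) pairs; the true relation is
# y = 2.5 * x plus noise. Everything here is illustrative.
data = [(x, 2.5 * x + random.gauss(0, 5))
        for x in [random.uniform(0, 100) for _ in range(200)]]
random.shuffle(data)
cut = len(data) * 3 // 4                  # train on 3/4, validate on 1/4
train, holdout = data[:cut], data[cut:]

# Fit a slope-only least-squares model on the training portion.
slope = sum(x * y for x, y in train) / sum(x * x for x, _ in train)

obs = [y for _, y in holdout]
pred = [slope * x for x, _ in holdout]
n = len(obs)
rmse = math.sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / n)
mo, mp = sum(obs) / n, sum(pred) / n
r = (sum((o - mo) * (p - mp) for o, p in zip(obs, pred))
     / math.sqrt(sum((o - mo) ** 2 for o in obs)
                 * sum((p - mp) ** 2 for p in pred)))
```

RMSE recovers the noise scale and r the strength of the fitted relationship; the study applies the same two scores to its predicted poultry densities.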

  11. Optimization model for the design of distributed wastewater treatment networks

    Directory of Open Access Journals (Sweden)

    Ibrić Nidret

    2012-01-01

In this paper we address the synthesis problem of distributed wastewater networks using a mathematical programming approach based on superstructure optimization. We present a generalized superstructure and optimization model for the design of distributed wastewater treatment networks. The superstructure includes splitters, treatment units and mixers, with all feasible interconnections including water recirculation. The optimization model is given as a nonlinear programming (NLP) problem whose objective function can be defined to minimize the total amount of wastewater treated in treatment operations or to minimize the total treatment costs. The NLP model is extended to a mixed integer nonlinear programming (MINLP) problem in which binary variables are used for the selection of wastewater treatment technologies. The bounds for all flowrates and concentrations in the wastewater network are specified as general equations. The proposed models are solved using the global optimization solvers BARON and LINDOGlobal. Their application is illustrated on two wastewater network problems of different complexity: the first is formulated as an NLP and the second as an MINLP. For the second, parametric and structural optimization is performed simultaneously, so that optimal flowrates and concentrations as well as optimal wastewater treatment technologies are selected. Using the proposed model, both problems are solved to global optimality.
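A minimal sketch of the underlying trade-off: with a single contaminant, a fixed-removal treatment unit and a discharge limit, minimizing the treated flow reduces to treating the most concentrated water first. The stream data, removal ratio and limit are invented, and a brute-force grid search stands in for the NLP solvers (BARON, LINDOGlobal) used in the paper.

```python
# Two wastewater streams as (flow t/h, contaminant concentration ppm);
# one treatment unit with a fixed removal ratio. Illustrative numbers.
streams = [(20.0, 100.0), (15.0, 250.0)]
REMOVAL = 0.9       # fraction of contaminant load removed by the unit
LIMIT = 30.0        # ppm allowed at the final discharge

total_flow = sum(f for f, _ in streams)

def outlet_conc(treated_flow):
    # Treat the most concentrated water first (max load removed per unit
    # of treated flow), then mix everything at the discharge.
    remaining, load_out = treated_flow, 0.0
    for f, c in sorted(streams, key=lambda s: -s[1]):
        t = min(f, remaining)
        remaining -= t
        load_out += t * c * (1 - REMOVAL) + (f - t) * c
    return load_out / total_flow

# Smallest treated flow meeting the limit, found on a fine grid; an NLP
# solver would locate the same point continuously.
step = 0.01
treated = next(x * step for x in range(int(total_flow / step) + 1)
               if outlet_conc(x * step) <= LIMIT)
```

The grid search finds the same bypass/treat split an NLP formulation would return for this toy case; the paper's superstructure additionally decides interconnections and, in the MINLP variant, the technology choice.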

  12. iSEDfit: Bayesian spectral energy distribution modeling of galaxies

    Science.gov (United States)

    Moustakas, John

    2017-08-01

    iSEDfit uses Bayesian inference to extract the physical properties of galaxies from their observed broadband photometric spectral energy distribution (SED). In its default mode, the inputs to iSEDfit are the measured photometry (fluxes and corresponding inverse variances) and a measurement of the galaxy redshift. Alternatively, iSEDfit can be used to estimate photometric redshifts from the input photometry alone. After the priors have been specified, iSEDfit calculates the marginalized posterior probability distributions for the physical parameters of interest, including the stellar mass, star-formation rate, dust content, star formation history, and stellar metallicity. iSEDfit also optionally computes K-corrections and produces multiple "quality assurance" (QA) plots at each stage of the modeling procedure to aid in the interpretation of the prior parameter choices and subsequent fitting results. The software is distributed as part of the impro IDL suite.
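The grid-based Bayesian machinery can be illustrated in miniature: a chi-squared likelihood built from fluxes and inverse variances, flat priors over a small model grid, and marginalization over nuisance dimensions. The two "templates", the photometry and the grids below are all made up for illustration and are not iSEDfit's actual model grids.

```python
import math, itertools

# Per-unit-mass model fluxes in three bands for two hypothetical ages.
templates = {
    "young": [5.0, 3.0, 1.0],
    "old":   [1.0, 2.0, 3.0],
}
obs_flux = [2.2, 4.1, 5.9]
obs_ivar = [25.0, 25.0, 25.0]            # inverse variances (1 / sigma^2)

masses = [m * 0.1 for m in range(1, 51)]  # stellar-mass grid, arbitrary units
post = {}                                 # unnormalized posterior on the grid
for mass, age in itertools.product(masses, templates):
    chi2 = sum(iv * (f - mass * t) ** 2
               for f, iv, t in zip(obs_flux, obs_ivar, templates[age]))
    post[(mass, age)] = math.exp(-0.5 * chi2)  # flat prior x likelihood

norm = sum(post.values())
# Marginalize over age to get the posterior on stellar mass alone.
mass_post = {m: sum(post[(m, a)] for a in templates) / norm for m in masses}
best_mass = max(mass_post, key=mass_post.get)
```

iSEDfit does the same thing over far larger grids of star-formation history, dust and metallicity, reporting the marginalized posteriors rather than a single best fit.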

  13. Nonparametric Estimation of Distributions in Random Effects Models

    KAUST Repository

    Hart, Jeffrey D.

    2011-01-01

    We propose using minimum distance to obtain nonparametric estimates of the distributions of components in random effects models. A main setting considered is equivalent to having a large number of small datasets whose locations, and perhaps scales, vary randomly, but which otherwise have a common distribution. Interest focuses on estimating the distribution that is common to all datasets, knowledge of which is crucial in multiple testing problems where a location/scale invariant test is applied to every small dataset. A detailed algorithm for computing minimum distance estimates is proposed, and the usefulness of our methodology is illustrated by a simulation study and an analysis of microarray data. Supplemental materials for the article, including R-code and a dataset, are available online. © 2011 American Statistical Association.
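The minimum-distance idea can be illustrated with location-shifted small datasets sharing a common error distribution. For simplicity the sketch fits a scale within an assumed normal family by minimizing a Cramér-von Mises-type distance, whereas the paper's estimator is nonparametric; all settings are illustrative.

```python
import math, random

random.seed(3)

# 200 small datasets of size 5 with random locations but a common shape.
TRUE_SIGMA = 2.0
datasets = []
for _ in range(200):
    mu = random.uniform(-10, 10)
    datasets.append([random.gauss(mu, TRUE_SIGMA) for _ in range(5)])

# Centre each dataset and pool the residuals.
residuals = sorted(x - sum(d) / len(d) for d in datasets for x in d)
n = len(residuals)

def norm_cdf(x, sigma):
    return 0.5 * (1 + math.erf(x / (sigma * math.sqrt(2))))

def cvm_distance(sigma):
    # Cramer-von Mises-type distance to the pooled empirical CDF.
    return sum((norm_cdf(r, sigma) - (i + 0.5) / n) ** 2
               for i, r in enumerate(residuals))

grid = [s * 0.05 for s in range(20, 81)]   # sigma in [1.0, 4.0]
sigma_hat = min(grid, key=cvm_distance)
# Centring a sample of k = 5 shrinks the residual sd by sqrt(1 - 1/k);
# multiplying by sqrt(5/4) undoes that shrinkage.
sigma_hat_corrected = sigma_hat * math.sqrt(5 / 4)
```

The corrected estimate recovers the common scale despite every dataset having its own unknown location, which is the situation the paper's multiple-testing application faces.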

  14. Football fever: self-affirmation model for goal distributions

    Directory of Open Access Journals (Sweden)

    W. Janke

    2009-01-01

The outcome of football games, as well as matches of most other popular team sports, depends on a combination of the skills of players and coaches and a number of external factors which, due to their complex nature, are presumably best viewed as random. Such parameters include the unpredictabilities of playing the ball, the players' shape of the day or environmental conditions such as the weather and the behavior of the audience. Under such circumstances, it appears worthwhile to analyze football score data with the toolbox of mathematical statistics in order to separate deterministic from stochastic effects and to see what impact the cooperative and social nature of the "agents" of the system has on the resulting stochastic observables. Considering the probability distributions of goals scored by the home and away teams, it turns out that especially the tails of the distributions are not well described by the Poissonian or binomial model resulting from the assumption of uncorrelated random events. On the contrary, some more specific probability densities such as those discussed in the context of extreme-value statistics or the so-called negative binomial distribution fit these data rather well. Until now, however, there has been no good argument why the simplest Poissonian model fails and these latter distributions should be observed instead. To fill this gap, we introduced a number of microscopic models for the scoring behavior, resulting in a Bernoulli random process with a simple component of self-affirmation. These models represent the observed probability distributions surprisingly well, and the phenomenological distributions used earlier can be understood as special cases within this framework.
We analyzed historical football score data from many leagues in Europe as well as from international tournaments, including data from all past tournaments of the "FIFA World Cup" series, and found the proposed models to be applicable in

  15. Predicting Spatial Distribution of Key Honeybee Pests in Kenya Using Remotely Sensed and Bioclimatic Variables: Key Honeybee Pests Distribution Models

    Directory of Open Access Journals (Sweden)

    David M. Makori

    2017-02-01

Beekeeping is indispensable to global food production. It is an alternative income source, especially in rural underdeveloped African settlements, and an important forest conservation incentive. However, dwindling honeybee colonies around the world are attributed to pests and diseases whose spatial distribution and influences are not well established. In this study, we used remotely sensed data to improve the reliability of pest ecological niche (EN) models and attain reliable pest distribution maps. Occurrence data on four pests (Aethina tumida, Galleria mellonella, Oplostomus haroldi and Varroa destructor) were collected from apiaries within four main agro-ecological regions responsible for over 80% of Kenya’s beekeeping. Africlim bioclimatic and derived normalized difference vegetation index (NDVI) variables were used to model their ecological niches using Maximum Entropy (MaxEnt). Combined precipitation variables had a high positive logit influence on the performance of all remotely sensed and biotic models. Remotely sensed vegetation variables had a substantial effect on the model, contributing up to 40.8% for G. mellonella, and regions with high rainfall seasonality were predicted to be high-risk areas. Projections (to 2055) indicated that, under the current climate change trend, these regions will experience increased honeybee pest risk. We conclude that honeybee pests can be modelled using bioclimatic data and remotely sensed variables in MaxEnt. Although the bioclimatic data were most relevant in all model results, incorporating vegetation seasonality variables to improve mapping of the ‘actual’ habitat of key honeybee pests and to identify risk and containment zones needs to be further investigated.

  16. Advances in a Distributed Approach for Ocean Model Data Interoperability

    Directory of Open Access Journals (Sweden)

    Richard P. Signell

    2014-03-01

An infrastructure for earth science data is emerging across the globe based on common data models and web services. As we evolve from custom file formats and web sites to standards-based web services and tools, data is becoming easier to distribute, find and retrieve, leaving more time for science. We describe recent advances that make it easier for ocean model providers to share their data, and for users to search, access, analyze and visualize ocean data using MATLAB® and Python®. These include a technique for modelers to create aggregated, Climate and Forecast (CF) metadata convention datasets from collections of non-standard Network Common Data Form (NetCDF) output files, the capability to remotely access data from CF-1.6-compliant NetCDF files using the Open Geospatial Consortium (OGC) Sensor Observation Service (SOS), a metadata standard for unstructured grid model output (UGRID), and tools that utilize both CF and UGRID standards to allow interoperable data search, browse and access. We use examples from the U.S. Integrated Ocean Observing System (IOOS®) Coastal and Ocean Modeling Testbed, a project in which modelers using both structured and unstructured grid model output needed to share their results, to compare their results with other models, and to compare models with observed data. The same techniques used here for ocean modeling output can be applied to atmospheric and climate model output, remote sensing data, digital terrain and bathymetric data.

  17. Advances in a distributed approach for ocean model data interoperability

    Science.gov (United States)

    Signell, Richard P.; Snowden, Derrick P.

    2014-01-01

    An infrastructure for earth science data is emerging across the globe based on common data models and web services. As we evolve from custom file formats and web sites to standards-based web services and tools, data is becoming easier to distribute, find and retrieve, leaving more time for science. We describe recent advances that make it easier for ocean model providers to share their data, and for users to search, access, analyze and visualize ocean data using MATLAB® and Python®. These include a technique for modelers to create aggregated, Climate and Forecast (CF) metadata convention datasets from collections of non-standard Network Common Data Form (NetCDF) output files, the capability to remotely access data from CF-1.6-compliant NetCDF files using the Open Geospatial Consortium (OGC) Sensor Observation Service (SOS), a metadata standard for unstructured grid model output (UGRID), and tools that utilize both CF and UGRID standards to allow interoperable data search, browse and access. We use examples from the U.S. Integrated Ocean Observing System (IOOS®) Coastal and Ocean Modeling Testbed, a project in which modelers using both structured and unstructured grid model output needed to share their results, to compare their results with other models, and to compare models with observed data. The same techniques used here for ocean modeling output can be applied to atmospheric and climate model output, remote sensing data, digital terrain and bathymetric data.

  18. Gravity model improvement investigation. [improved gravity model for determination of ocean geoid

    Science.gov (United States)

    Siry, J. W.; Kahn, W. D.; Bryan, J. W.; Vonbun, F. F.

    1973-01-01

    This investigation was undertaken to improve the gravity model and hence the ocean geoid. A specific objective is the determination of the gravity field and geoid with a space resolution of approximately 5 deg and a height resolution of the order of five meters. The concept of the investigation is to utilize both GEOS-C altimeter and satellite-to-satellite tracking data to achieve the gravity model improvement. It is also planned to determine the geoid in selected regions with a space resolution of about a degree and a height resolution of the order of a meter or two. The short term objectives include the study of the gravity field in the GEOS-C calibration area outlined by Goddard, Bermuda, Antigua, and Cape Kennedy, and also in the eastern Pacific area which is viewed by ATS-F.

  19. Species distribution model transferability and model grain size - finer may not always be better.

    Science.gov (United States)

    Manzoor, Syed Amir; Griffiths, Geoffrey; Lukac, Martin

    2018-05-08

Species distribution models have been used to predict the distribution of invasive species for conservation planning. Understanding the spatial transferability of niche predictions is critical to promote species-habitat conservation and to forecast areas vulnerable to invasion. The grain size of predictor variables is an important factor affecting the accuracy and transferability of species distribution models. The choice of grain size often depends on the type of predictor variables used, and the selection of predictors sometimes relies on data availability. This study employed the MAXENT species distribution model to investigate the effect of grain size on model transferability for an invasive plant species. We modelled the distribution of Rhododendron ponticum in Wales, U.K., and tested model performance and transferability by varying grain size (50 m, 300 m, and 1 km). MAXENT-based models are sensitive to grain size and the selection of variables. We found that over-reliance on the commonly used bioclimatic variables may lead to less accurate models, as it often compromises the finer grain size of biophysical variables which may be more important determinants of species distribution at small spatial scales. Model accuracy is likely to increase with decreasing grain size. However, successful model transferability may require optimization of model grain size.

  20. Two-fluid model with droplet size distribution for condensing steam flows

    International Nuclear Information System (INIS)

    Wróblewski, Włodzimierz; Dykas, Sławomir

    2016-01-01

The process of energy conversion in the low pressure part of steam turbines may be improved using new and more accurate numerical models. The paper describes a model for condensing steam flows that builds on a standard condensation model. A physical and a numerical model of the mono- and polydispersed wet-steam flow are presented. The proposed two-fluid model solves separate flow governing equations for the compressible, inviscid vapour and the liquid phase. The method of moments with a prescribed function is used for the reconstruction of the water droplet size distribution. The model is demonstrated on the liquid-phase evolution in the flow through a de Laval nozzle. - Highlights: • Computational Fluid Dynamics. • Steam condensation in transonic flows through Laval nozzles. • In-house CFD code – two-phase flow, two-fluid monodispersed and polydispersed model.
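The "method of moments with a prescribed function" can be sketched as follows: the flow solver transports only low-order moments of the droplet spectrum, and the full size distribution is reconstructed by assuming a functional form, here a gamma distribution. The choice of family and all numbers are illustrative, not the paper's actual closure.

```python
import math

# Transported moments of the droplet radius spectrum (illustrative values):
mu0 = 1.0e14      # number of droplets per kg of vapour
mu1 = 5.0e6       # sum of radii  (mu1/mu0 = mean radius, here 50 nm)
mu2 = 3.0e-1      # sum of squared radii

mean = mu1 / mu0                 # mean droplet radius, m
var = mu2 / mu0 - mean ** 2      # radius variance
k = mean ** 2 / var              # gamma shape parameter
theta = var / mean               # gamma scale parameter, m

def n_r(r):
    """Reconstructed droplet number density at radius r (per unit radius)."""
    return (mu0 * r ** (k - 1) * math.exp(-r / theta)
            / (math.gamma(k) * theta ** k))
```

Matching the first moments fixes the gamma parameters uniquely, so the solver never has to transport the full spectrum; the reconstructed n(r) then feeds the interphase heat- and mass-transfer terms.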

  1. Dynamic modeling method of the bolted joint with uneven distribution of joint surface pressure

    Science.gov (United States)

    Li, Shichao; Gao, Hongli; Liu, Qi; Liu, Bokai

    2018-03-01

The dynamic characteristics of bolted joints have a significant influence on the dynamic characteristics of a machine tool. Establishing a reasonable bolted joint dynamics model therefore helps improve the accuracy of the machine tool dynamics model. Because the pressure distribution on the joint surface is uneven under the concentrated force of the bolts, a dynamic modeling method based on the uneven pressure distribution of the joint surface is presented in this paper to improve the dynamic modeling accuracy of the machine tool. Analytic formulas relating the normal and tangential stiffness per unit area to the surface pressure on the joint surface can be deduced from Hertz contact theory, and the pressure distribution on the joint surface can be obtained with finite element software. Furthermore, the normal and tangential stiffness distributions on the joint surface can be obtained from the analytic formulas and the pressure distribution, and assigned to the finite element model of the joint. The theoretical and experimental mode shapes were compared qualitatively, and the theoretical and experimental modal frequencies quantitatively. The comparison shows that the relative error between the first four theoretical and experimental modal frequencies is 0.2% to 4.2%. In addition, the first four theoretical mode shapes and the first four experimental mode shapes are similar and in one-to-one correspondence, verifying the validity of the theoretical model. The dynamic modeling method proposed in this paper can provide a theoretical basis for accurate dynamic modeling of bolted joints in machine tools.
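A hedged sketch of the stiffness-assignment step: rough-contact Hertzian theory commonly yields a power-law relation between local contact pressure and stiffness per unit area, k = c·p^m. The constants, the pressure field and the power-law form itself are assumptions here, not the paper's derived formulas.

```python
# Assumed power-law contact stiffness: k_n = C_N * p**M_N (per unit area).
C_N, M_N = 3.0e9, 0.55      # illustrative material/surface constants
C_T_RATIO = 0.4             # tangential stiffness as a fraction of normal

# Contact pressure on patches around a bolt, e.g. exported from an FE
# contact analysis (Pa); it decays away from the bolt, hence "uneven".
pressure = [80e6, 55e6, 30e6, 12e6, 4e6]
area = 2.0e-5               # patch area, m^2

def patch_stiffness(p):
    """Return (normal, tangential) spring stiffness for one patch."""
    k_n = C_N * p ** M_N            # per-area normal stiffness
    return k_n * area, C_T_RATIO * k_n * area

springs = [patch_stiffness(p) for p in pressure]
# Each (k_normal, k_tangential) pair would be assigned to a spring element
# at the corresponding node of the joint surface in the FE model.
```

Because the stiffness law is nonlinear in p, a uniform-pressure model and this distributed model give different total joint stiffness even for the same bolt preload, which is the effect the paper exploits.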

  2. Using internal discharge data in a distributed conceptual model to reduce uncertainty in streamflow simulations

    Science.gov (United States)

    Guerrero, J.; Halldin, S.; Xu, C.; Lundin, L.

    2011-12-01

Distributed hydrological models are important tools in water management as they account for the spatial variability of hydrological data and can produce spatially distributed outputs. They can directly incorporate and assess potential changes in the characteristics of our basins. A recognized problem for models in general is equifinality, which is only exacerbated for distributed models, which tend to have a large number of parameters. We need to deal with the fundamentally ill-posed nature of the problem that such models force us to face: a large number of parameters and very few variables that can be used to constrain them, often only the catchment discharge. There is a growing but still limited literature showing how the internal states of a distributed model can be used to calibrate/validate its predictions. In this paper, a distributed version of WASMOD, a conceptual rainfall-runoff model with only three parameters, combined with a routing algorithm based on the high-resolution HydroSHEDS data, was used to simulate the discharge in the Paso La Ceiba basin in Honduras. The parameter space was explored using Monte-Carlo simulations, and the region containing the parameter sets considered behavioural according to two different criteria was delimited using the geometric concept of alpha-shapes. The discharge data from five internal sub-basins was used to aid in the calibration of the model and to answer the following questions: Can this information improve the simulations at the outlet of the catchment, or decrease their uncertainty? Also, after reducing the number of model parameters needing calibration through sensitivity analysis: Is it possible to relate them to basin characteristics? The analysis revealed that in most cases the internal discharge data can be used to reduce the uncertainty in the discharge at the outlet, albeit with little improvement in the overall simulation results.
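The Monte-Carlo delimitation of behavioural parameter sets can be sketched GLUE-style: sample parameters at random, run the model, and keep the sets whose Nash-Sutcliffe efficiency (NSE) exceeds a threshold. The toy two-parameter bucket model and synthetic data below stand in for WASMOD and the Paso La Ceiba records.

```python
import random

random.seed(7)

rain = [0, 12, 30, 5, 0, 0, 18, 40, 10, 2, 0, 0]   # synthetic forcing

def simulate(a, b):
    """Toy bucket model: storage s, runoff a*s, other losses b*s."""
    s, q = 0.0, []
    for r in rain:
        s += r
        out = a * s
        s -= out + b * s
        q.append(out)
    return q

obs = simulate(0.3, 0.1)                 # synthetic "observed" discharge

def nse(sim):
    mean_obs = sum(obs) / len(obs)
    denom = sum((o - mean_obs) ** 2 for o in obs)
    return 1 - sum((s - o) ** 2 for s, o in zip(sim, obs)) / denom

behavioural = []
for _ in range(5000):
    a, b = random.uniform(0.05, 0.6), random.uniform(0.0, 0.3)
    if nse(simulate(a, b)) > 0.8:
        behavioural.append((a, b))
```

The spread of `behavioural` is the equifinality the abstract describes: many parameter sets reproduce the outlet discharge almost equally well. Extra constraints, such as the interior gauges used in the paper, shrink this set.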

  3. Distributed mode filtering rod fiber amplifier delivering 292W with improved mode stability

    DEFF Research Database (Denmark)

    Laurila, Marko; Jørgensen, Mette Marie; Hansen, Kristian Rymann

    2012-01-01

We demonstrate a high power fiber (85μm core) amplifier delivering up to 292 Watts of average output power using a mode-locked 30 ps source at 1032 nm. Utilizing a single mode distributed mode filter bandgap rod fiber, we demonstrate a 44% power improvement before the threshold-like onset of mode instability.

  4. Improved Extreme-Scenario Extraction Method For The Economic Dispatch Of Active Distribution Networks

    DEFF Research Database (Denmark)

    Zhang, Yipu; Ai, Xiaomeng; Fang, Jiakun

    2017-01-01

This paper presents an improved extreme-scenario extraction method for the economic dispatch of active distribution networks with renewables. The extreme scenarios are selected from the historical data using the improved minimum volume enclosing ellipsoid (MVEE) algorithm to guarantee the security of system operation while avoiding frequent switching of the transformer tap. It is theoretically proved…

  5. Distributed Leadership an Instrument for School Improvement: The Study of Public Senior High Schools in Ghana

    Science.gov (United States)

    Dampson, Dandy George; Havor, Felicia Mensah; Laryea, Prince

    2018-01-01

The purpose of the study was to investigate the influence of distributed leadership in Public Senior High Schools (SHS) with regard to school improvement. Using the Explanatory Sequential Mixed-Method design, 92 teachers, four headmasters and four assistant headmasters were sampled randomly and by census. Three research questions were formulated and…

  6. Evolution of a Family Nurse Practitioner Program to Improve Primary Care Distribution

    Science.gov (United States)

    Andrus, Len Hughes; Fenley, Mary D.

    1976-01-01

    Describes a Family Nurse Practitioner Program that has effectively improved the distribution of primary health care manpower in rural areas. Program characteristics include selection of personnel from areas of need, decentralization of clinical and didactic training sites, competency-based portable curriculum, and circuit-riding institutionally…

  7. Improved size distribution control of silicon nanocrystals in a spatially confined remote plasma

    NARCIS (Netherlands)

Dogan, I.; Westerman, R. H. J.; van de Sanden, M. C. M.

    2015-01-01

    This work demonstrates how to improve the size distribution of silicon nanocrystals (Si-NCs) synthesized in a remote plasma, in which the flow dynamics and the particular chemistry initially resulted in the formation of small (2-10 nm) and large (50-120 nm) Si-NCs. Plasma consists of two regions: an

  8. The Watts-Strogatz network model developed by including degree distribution: theory and computer simulation

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Y W [Surface Physics Laboratory and Department of Physics, Fudan University, Shanghai 200433 (China); Zhang, L F [Surface Physics Laboratory and Department of Physics, Fudan University, Shanghai 200433 (China); Huang, J P [Surface Physics Laboratory and Department of Physics, Fudan University, Shanghai 200433 (China)

    2007-07-20

    By using theoretical analysis and computer simulations, we develop the Watts-Strogatz network model by including degree distribution, in an attempt to improve the comparison between characteristic path lengths and clustering coefficients predicted by the original Watts-Strogatz network model and those of the real networks with the small-world property. Good agreement between the predictions of the theoretical analysis and those of the computer simulations has been shown. It is found that the developed Watts-Strogatz network model can fit the real small-world networks more satisfactorily. Some other interesting results are also reported by adjusting the parameters in a model degree-distribution function. The developed Watts-Strogatz network model is expected to help in the future analysis of various social problems as well as financial markets with the small-world property.
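For reference, the original Watts-Strogatz construction and its two characteristic statistics (clustering coefficient and average path length) can be sketched as below; the degree-distribution extension described in the paper would replace the fixed even degree with draws from a prescribed distribution, which is not shown here.

```python
import random
from collections import deque

random.seed(1)

def watts_strogatz(n=200, k=4, p=0.1):
    """Ring lattice of degree k with each lattice edge rewired with prob p."""
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(1, k // 2 + 1):
            adj[i].add((i + j) % n); adj[(i + j) % n].add(i)
    for i in range(n):
        for j in list(adj[i]):                       # "forward" lattice edges
            if (j - i) % n <= k // 2 and random.random() < p:
                new = random.randrange(n)
                if new != i and new not in adj[i]:
                    adj[i].discard(j); adj[j].discard(i)
                    adj[i].add(new); adj[new].add(i)
    return adj

def clustering(adj):
    cs = []
    for i, nb in adj.items():
        nb = list(nb)
        if len(nb) < 2:
            continue
        links = sum(1 for a in range(len(nb)) for b in range(a + 1, len(nb))
                    if nb[b] in adj[nb[a]])
        cs.append(2 * links / (len(nb) * (len(nb) - 1)))
    return sum(cs) / len(cs)

def avg_path_length(adj):
    total, pairs = 0, 0
    for s in adj:                                    # BFS from every node
        dist = {s: 0}; q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1; q.append(v)
        total += sum(dist.values()); pairs += len(dist) - 1
    return total / pairs
```

Small rewiring probabilities shorten path lengths drastically while barely reducing clustering, which is the small-world property the paper's extended model tries to match more closely for real networks.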

  9. The Watts-Strogatz network model developed by including degree distribution: theory and computer simulation

    International Nuclear Information System (INIS)

    Chen, Y W; Zhang, L F; Huang, J P

    2007-01-01

    By using theoretical analysis and computer simulations, we develop the Watts-Strogatz network model by including degree distribution, in an attempt to improve the comparison between characteristic path lengths and clustering coefficients predicted by the original Watts-Strogatz network model and those of the real networks with the small-world property. Good agreement between the predictions of the theoretical analysis and those of the computer simulations has been shown. It is found that the developed Watts-Strogatz network model can fit the real small-world networks more satisfactorily. Some other interesting results are also reported by adjusting the parameters in a model degree-distribution function. The developed Watts-Strogatz network model is expected to help in the future analysis of various social problems as well as financial markets with the small-world property

  10. Cost/worth assessment of reliability improvement in distribution networks by means of artificial intelligence

    Energy Technology Data Exchange (ETDEWEB)

    Bouhouras, Aggelos S.; Labridis, Dimitris P.; Bakirtzis, Anastasios G. [Power Systems Laboratory, Aristotle University of Thessaloniki, Dept. of Electrical and Computer Engineering, 54124 Thessaloniki (Greece)

    2010-06-15

A major challenge for power utilities today is to ensure a high level of reliability of supply to customers. Two main factors determine the feasibility of a project that improves the reliability of supply: the project cost (investment and operational) and the benefits that result from the implementation of the project. This paper examines the implementation of an Artificial Intelligence System in an urban distribution network, capable of locating and isolating short circuit faults in the feeder, thus accomplishing immediate restoration of electric supply to the customers. The paper describes the benefits of the project, which are supply reliability improvement and distribution network loss reduction through network reconfigurations. By comparing the project benefits and costs, the economic feasibility of such a project for an underground distribution feeder in Greece is demonstrated. (author)

  11. Modeling the brain morphology distribution in the general aging population

    Science.gov (United States)

    Huizinga, W.; Poot, D. H. J.; Roshchupkin, G.; Bron, E. E.; Ikram, M. A.; Vernooij, M. W.; Rueckert, D.; Niessen, W. J.; Klein, S.

    2016-03-01

    Both normal aging and neurodegenerative diseases such as Alzheimer's disease cause morphological changes of the brain. To better distinguish between normal and abnormal cases, it is necessary to model changes in brain morphology owing to normal aging. To this end, we developed a method for analyzing and visualizing these changes for the entire brain morphology distribution in the general aging population. The method is applied to 1000 subjects from a large population imaging study in the elderly, from which 900 were used to train the model and 100 were used for testing. The results of the 100 test subjects show that the model generalizes to subjects outside the model population. Smooth percentile curves showing the brain morphology changes as a function of age and spatiotemporal atlases derived from the model population are publicly available via an interactive web application at agingbrain.bigr.nl.

  12. Model for cadmium transport and distribution in CHO cells

    Energy Technology Data Exchange (ETDEWEB)

    Hayden, T.L.; Turner, J.E.; Williams, M.W.; Cook, J.S.; Hsie, A.W.

    1982-01-01

A compartmental model is developed to study the transport and distribution of cadmium in Chinese hamster ovary (CHO) cells. Of central importance to the model is the role played by sequestering components which bind free Cd²⁺ ions. The most important of these is a low-molecular-weight protein, metallothionein, which is produced by the cells in response to an increase in the cellular concentration of Cd²⁺. Monte Carlo techniques are used to generate a stochastic model based on existing experimental data describing the intracellular transport of cadmium between different compartments. This approach provides an alternative to the usual numerical solution of differential-delay equations that arise in deterministic models. Our model suggests subcellular structures which may be responsible for the accumulation of cadmium and, hence, could account for cadmium detoxification. 4 figures, 1 table.
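The stochastic alternative to differential-delay equations can be illustrated with a Gillespie-type simulation of a two-compartment caricature of such a model: free Cd²⁺ is either bound by metallothionein (whose pool is induced by free Cd²⁺) or exported from the cell. All rate constants and molecule counts below are invented for illustration.

```python
import random

random.seed(5)

free_cd, bound, mt = 500, 0, 50        # ions / bound ions / binding sites
K_BIND, K_EXPORT, K_INDUCE = 0.002, 0.001, 0.0005
t, t_end = 0.0, 2000.0

while t < t_end and free_cd > 0:
    rates = [K_BIND * free_cd * mt,    # binding to metallothionein
             K_EXPORT * free_cd,       # export from the cell
             K_INDUCE * free_cd]       # induction of new metallothionein
    total = sum(rates)
    if total == 0:
        break
    t += random.expovariate(total)     # time to the next reaction event
    r = random.uniform(0, total)       # which reaction fired?
    if r < rates[0]:
        free_cd -= 1; mt -= 1; bound += 1
    elif r < rates[0] + rates[1]:
        free_cd -= 1
    else:
        mt += 1
```

Running many such trajectories yields the distribution of cadmium over compartments directly, event by event, instead of integrating rate equations; this is the sense in which the paper's Monte Carlo model sidesteps the deterministic formulation.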

  13. Modeling stock return distributions with a quantum harmonic oscillator

    Science.gov (United States)

    Ahn, K.; Choi, M. Y.; Dai, B.; Sohn, S.; Yang, B.

    2017-11-01

    We propose a quantum harmonic oscillator as a model for the market force which draws a stock return from short-run fluctuations to the long-run equilibrium. The stochastic equation governing our model is transformed into a Schrödinger equation, the solution of which features “quantized” eigenfunctions. Consequently, stock returns follow a mixed χ distribution, which describes Gaussian and non-Gaussian features. Analyzing the Financial Times Stock Exchange (FTSE) All Share Index, we demonstrate that our model outperforms traditional stochastic process models, e.g., the geometric Brownian motion and the Heston model, with smaller fitting errors and better goodness-of-fit statistics. In addition, making use of analogy, we provide an economic rationale of the physics concepts such as the eigenstate, eigenenergy, and angular frequency, which sheds light on the relationship between finance and econophysics literature.
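The mixture structure of the model's return distribution can be sketched directly: a Boltzmann-weighted sum of squared harmonic-oscillator eigenfunctions, where the ground state alone is Gaussian and the excited states contribute the non-Gaussian structure. The weights and truncation below are illustrative choices, not the paper's fitted values.

```python
import math

def hermite(n, x):
    """Physicists' Hermite polynomial H_n(x) via the standard recurrence."""
    h0, h1 = 1.0, 2.0 * x
    if n == 0:
        return h0
    for k in range(1, n):
        h0, h1 = h1, 2.0 * x * h1 - 2.0 * k * h0   # H_{k+1} = 2x H_k - 2k H_{k-1}
    return h1

def psi_sq(n, x):
    """Squared n-th harmonic-oscillator eigenfunction (dimensionless units)."""
    norm = 1.0 / (2.0 ** n * math.factorial(n) * math.sqrt(math.pi))
    return norm * hermite(n, x) ** 2 * math.exp(-x * x)

def mixture_pdf(x, beta=1.0, n_max=6):
    """Boltzmann-weighted mixture over the lowest n_max eigenstates."""
    weights = [math.exp(-beta * n) for n in range(n_max)]
    z = sum(weights)
    return sum(w * psi_sq(n, x) for n, w in zip(range(n_max), weights)) / z
```

Each |ψₙ|² integrates to one, so the weighted mixture is automatically a normalized density; fitting β and the energy scale to return data is what the paper's estimation step does.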

  14. Efficient Vaccine Distribution Based on a Hybrid Compartmental Model.

    Directory of Open Access Journals (Sweden)

    Zhiwen Yu

To effectively and efficiently reduce the morbidity and mortality that may be caused by outbreaks of emerging infectious diseases, it is very important for public health agencies to make informed decisions for controlling the spread of the disease. Such decisions must incorporate various kinds of intervention strategies, such as vaccinations, school closures and border restrictions. Recently, researchers have paid increased attention to searching for effective vaccine distribution strategies for reducing the effects of pandemic outbreaks when resources are limited. Most of the existing research work has been focused on how to design an effective age-structured epidemic model and to select a suitable vaccine distribution strategy to prevent the propagation of an infectious virus. Models that evaluate age structure effects are common, but models that additionally evaluate geographical effects are less common. In this paper, we propose a new SEIR (susceptible-exposed-infectious-recovered) model, named the hybrid SEIR-V model (HSEIR-V), which considers not only the dynamics of infection prevalence in several age-specific host populations, but also seeks to characterize the dynamics by which a virus spreads in various geographic districts. Several vaccination strategies, such as different kinds of vaccine coverage, different vaccine release times and different vaccine deployment methods, are incorporated into the HSEIR-V compartmental model. We also design four hybrid vaccination distribution strategies (based on population size, contact pattern matrix, infection rate and infectious risk) for controlling the spread of viral infections. Based on data from the 2009-2010 H1N1 influenza epidemic, we evaluate the effectiveness of our proposed HSEIR-V model and study the effects of different types of human behaviour in responding to epidemics.
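The compartment dynamics the HSEIR-V model builds on can be sketched in the single-district, single-age-group limit, with vaccination moving susceptibles directly to the recovered class; the parameters below are illustrative, not fitted to the 2009-2010 H1N1 data.

```python
def seir_v(vacc_rate, days=300, dt=0.1):
    """Deterministic SEIR with vaccination; returns the final attack rate."""
    s, e, i, r = 0.999, 0.0, 0.001, 0.0
    beta, sigma, gamma = 0.35, 0.5, 0.2   # infection / incubation / recovery
    cum_inf = 0.001                       # cumulative infections (incl. seed)
    for _ in range(int(days / dt)):
        new_inf = beta * s * i
        # Vaccinate a fixed fraction per day while susceptibles remain.
        vacc = vacc_rate if s > vacc_rate * dt else 0.0
        s += dt * (-new_inf - vacc)
        e += dt * (new_inf - sigma * e)
        i += dt * (sigma * e - gamma * i)
        r += dt * (gamma * i + vacc)
        cum_inf += dt * new_inf
    return cum_inf

no_vax = seir_v(0.0)
with_vax = seir_v(0.005)   # vaccinate 0.5% of the population per day
```

The paper's HSEIR-V model replicates this structure across age groups and geographic districts coupled by contact and travel patterns, which is where the choice among the four distribution strategies matters.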

  15. Comparison of probabilistic models of the distribution of counts

    International Nuclear Information System (INIS)

    Salma, I.; Zemplen-Papp, E.

    1992-01-01

The binomial, Poisson and modified Poisson models for describing the statistical nature of the distribution of counts are compared theoretically, and conclusions for application are proposed. The validity of the Poisson and the modified Poisson distribution for observing k events in a short time interval is investigated experimentally for various measuring times. The experiments to measure the influence of the significant radioactive decay were performed with 89mY (T1/2 = 16.06 s), using a multichannel analyser (4096 channels) in the multiscaling mode. According to the results, the Poisson distribution describes the counting experiment for short measuring times (up to T = 0.5 T1/2) and its application is recommended. However, the analysis of the data demonstrated that for long measurements (T ≥ 1 T1/2) the Poisson distribution is not valid and the modified Poisson distribution is preferable. The practical implications in calculating uncertainties and in optimizing the measuring time are discussed. (author) 20 refs.; 7 figs.; 1 tab
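The reason the Poisson model breaks down for long measurements can be sketched numerically: with a finite number of atoms, the detected counts follow a binomial distribution whose variance-to-mean ratio falls below the Poisson value of one as the detection probability grows. The function names and the efficiency value below are illustrative assumptions, not taken from the paper.

```python
# Illustrative check of why Poisson statistics fail for long counting times.
# For a source with a fixed number of atoms, each atom is detected with
# probability p = eff * (1 - 2**(-T/T_half)); counts are Binomial, not Poisson.
def detection_prob(T, T_half=16.06, eff=0.3):
    """Probability that a given 89mY atom decays within T and is detected."""
    return eff * (1.0 - 2.0 ** (-T / T_half))

def var_to_mean_binomial(T, T_half=16.06, eff=0.3):
    """Binomial counts have Var/mean = 1 - p; Poisson assumes exactly 1."""
    return 1.0 - detection_prob(T, T_half, eff)
```

For T = 0.5 T1/2 the ratio stays close to one (Poisson adequate), while for T = 2 T1/2 it drops noticeably, matching the paper's recommendation to switch models for long runs.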

  16. Infusing considerations of trophic dependencies into species distribution modelling.

    Science.gov (United States)

    Trainor, Anne M; Schmitz, Oswald J

    2014-12-01

    Community ecology involves studying the interdependence of species with each other and their environment to predict their geographical distribution and abundance. Modern species distribution analyses characterise species-environment dependency well, but offer only crude approximations of species interdependency. Typically, the dependency between focal species and other species is characterised using other species' point occurrences as spatial covariates to constrain the focal species' predicted range. This implicitly assumes that the strength of interdependency is homogeneous across space, which is not generally supported by analyses of species interactions. This discrepancy has an important bearing on the accuracy of inferences about habitat suitability for species. We introduce a framework that integrates principles from consumer-resource analyses, resource selection theory and species distribution modelling to enhance quantitative prediction of species geographical distributions. We show how to apply the framework using a case study of lynx and snowshoe hare interactions with each other and their environment. The analysis shows how the framework offers a spatially refined understanding of species distribution that is sensitive to nuances in biophysical attributes of the environment that determine the location and strength of species interactions. © 2014 John Wiley & Sons Ltd/CNRS.

  17. Experimental investigation of statistical models describing distribution of counts

    International Nuclear Information System (INIS)

    Salma, I.; Zemplen-Papp, E.

    1992-01-01

The binomial, Poisson and modified Poisson models which are used for describing the statistical nature of the distribution of counts are compared theoretically, and conclusions for application are considered. The validity of the Poisson and the modified Poisson statistical distribution for observing k events in a short time interval is investigated experimentally for various measuring times. The experiments to measure the influence of the significant radioactive decay were performed with 89mY (T1/2 = 16.06 s), using a multichannel analyser (4096 channels) in the multiscaling mode. According to the results, Poisson statistics describe the counting experiment for short measuring times (up to T = 0.5 T1/2) and its application is recommended. However, analysis of the data demonstrated, with confidence, that for long measurements (T ≥ T1/2) the Poisson distribution is not valid and the modified Poisson function is preferable. The practical implications in calculating uncertainties and in optimizing the measuring time are discussed. Differences between the standard deviations evaluated on the basis of the Poisson and binomial models are especially significant for experiments with long measuring time (T/T1/2 ≥ 2) and/or large detection efficiency (ε > 0.30). Optimization of the measuring time for paired observations yields the same solution for either the binomial or the Poisson distribution. (orig.)

  18. Wealth distribution of simple exchange models coupled with extremal dynamics

    Science.gov (United States)

    Bagatella-Flores, N.; Rodríguez-Achach, M.; Coronel-Brizio, H. F.; Hernández-Montoya, A. R.

    2015-01-01

Punctuated Equilibrium (PE) states that after long periods of evolutionary quiescence, species evolution can take place in short time intervals, where sudden differentiation makes new species emerge and others go extinct. In this paper, we introduce and study the effect of punctuated equilibrium on two different asset exchange models: the yard sale model (YS, the winner gets a random fraction of the poorer player's wealth) and the theft and fraud model (TF, the winner gets a random fraction of the loser's wealth). The resulting wealth distribution is characterized using the Gini index. In order to do this, we consider PE as a perturbation applied with probability ρ. We compare the resulting values of the Gini index at increasing values of ρ in both models. We found that in the case of the TF model, the Gini index decreases as the perturbation ρ increases, showing no dependence on the number of agents. For YS, in contrast, we observe a phase transition which happens around ρc = 0.79. For perturbations ρ < ρc the Gini index approaches one as time increases (an extreme wealth condensation state), whereas for perturbations greater than or equal to ρc the Gini index stays below one, preventing the system from reaching this extreme state. We show that both simple exchange models coupled with PE dynamics give more realistic results. In particular for YS, we observe a power-law decay of the wealth distribution.
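The two exchange rules and the Gini characterization can be sketched as below (without the PE perturbation). The function names are our own; the demo run illustrates the wealth condensation tendency of the unperturbed YS rule described in the abstract.

```python
import random

def gini(wealth):
    """Gini index of a wealth list (0 = perfect equality, 1 = condensation)."""
    w = sorted(wealth)
    n, total = len(w), sum(w)
    weighted_cum = sum((rank + 1) * x for rank, x in enumerate(w))
    return 2.0 * weighted_cum / (n * total) - (n + 1.0) / n

def exchange_step(wealth, rule="TF"):
    """One pairwise exchange. The winner receives a random fraction of the
    loser's wealth (TF) or of the poorer player's wealth (YS)."""
    a, b = random.sample(range(len(wealth)), 2)
    winner, loser = (a, b) if random.random() < 0.5 else (b, a)
    base = wealth[loser] if rule == "TF" else min(wealth[a], wealth[b])
    stake = random.random() * base
    wealth[winner] += stake
    wealth[loser] -= stake

# Demo: the YS rule drives inequality upward from an equal start.
random.seed(7)
wealth = [100.0] * 200
for _ in range(20000):
    exchange_step(wealth, rule="YS")
```

In the paper, the PE perturbation would additionally reset the poorest agents' wealth with probability ρ at each step, which is what counteracts the condensation for ρ ≥ ρc.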

  19. A modal approach to modeling spatially distributed vibration energy dissipation.

    Energy Technology Data Exchange (ETDEWEB)

    Segalman, Daniel Joseph

    2010-08-01

    The nonlinear behavior of mechanical joints is a confounding element in modeling the dynamic response of structures. Though there has been some progress in recent years in modeling individual joints, modeling the full structure with myriad frictional interfaces has remained an obstinate challenge. A strategy is suggested for structural dynamics modeling that can account for the combined effect of interface friction distributed spatially about the structure. This approach accommodates the following observations: (1) At small to modest amplitudes, the nonlinearity of jointed structures is manifest primarily in the energy dissipation - visible as vibration damping; (2) Correspondingly, measured vibration modes do not change significantly with amplitude; and (3) Significant coupling among the modes does not appear to result at modest amplitudes. The mathematical approach presented here postulates the preservation of linear modes and invests all the nonlinearity in the evolution of the modal coordinates. The constitutive form selected is one that works well in modeling spatially discrete joints. When compared against a mathematical truth model, the distributed dissipation approximation performs well.

  20. Klaim-DB: A Modeling Language for Distributed Database Applications

    DEFF Research Database (Denmark)

    Wu, Xi; Li, Ximeng; Lluch Lafuente, Alberto

    2015-01-01

We present the modelling language Klaim-DB for distributed database applications. Klaim-DB borrows the distributed nets of the coordination language Klaim but essentially re-incarnates the tuple spaces of Klaim as databases, and provides high-level language abstractions for the access and manipulation of structured data, with integrity and atomicity considerations. We present the formal semantics of Klaim-DB and illustrate the use of the language in a scenario where the sales from different branches of a chain of department stores are aggregated from their local databases. It can be seen that raising the abstraction level and encapsulating integrity checks (concerning the schema of tables, etc.) in the language primitives for database operations benefit the modelling task considerably.

  1. Estimating the cost of improving quality in electricity distribution: A parametric distance function approach

    International Nuclear Information System (INIS)

    Coelli, Tim J.; Gautier, Axel; Perelman, Sergio; Saplacan-Pop, Roxana

    2013-01-01

The quality of electricity distribution is being more and more scrutinized by regulatory authorities, with explicit reward and penalty schemes based on quality targets having been introduced in many countries. It is then of prime importance to know the cost of improving quality for a distribution system operator. In this paper, we focus on one dimension of quality, the continuity of supply, and we estimate the cost of preventing power outages. To do so, we make use of the parametric distance function approach, assuming that outages enter the firm's production set as an input, an imperfect substitute for maintenance activities and capital investment. This allows us to identify the sources of technical inefficiency and the underlying trade-off faced by operators between quality and other inputs and costs. For this purpose, we use panel data on 92 electricity distribution units operated by ERDF (Electricité de France - Réseau Distribution) in the 2003–2005 financial years. Assuming a multi-output multi-input translog technology, we estimate that the cost of preventing one interruption is equal to 10.7€ for an average DSO. Furthermore, as one would expect, marginal quality improvements tend to be more expensive as quality itself improves. - Highlights: ► We estimate the implicit cost of outages for the main distribution company in France. ► For this purpose, we make use of a parametric distance function approach. ► Marginal quality improvements tend to be more expensive as quality itself improves. ► The cost of preventing one interruption varies from 1.8 € to 69.2 € (2005 prices). ► We estimate that, on average, it lies 33% above the regulated price of quality.

  2. Gravitational lensing by eigenvalue distributions of random matrix models

    Science.gov (United States)

    Martínez Alonso, Luis; Medina, Elena

    2018-05-01

    We propose to use eigenvalue densities of unitary random matrix ensembles as mass distributions in gravitational lensing. The corresponding lens equations reduce to algebraic equations in the complex plane which can be treated analytically. We prove that these models can be applied to describe lensing by systems of edge-on galaxies. We illustrate our analysis with the Gaussian and the quartic unitary matrix ensembles.

  3. The redshift distribution of cosmological samples: a forward modeling approach

    Energy Technology Data Exchange (ETDEWEB)

    Herbel, Jörg; Kacprzak, Tomasz; Amara, Adam; Refregier, Alexandre; Bruderer, Claudio; Nicola, Andrina, E-mail: joerg.herbel@phys.ethz.ch, E-mail: tomasz.kacprzak@phys.ethz.ch, E-mail: adam.amara@phys.ethz.ch, E-mail: alexandre.refregier@phys.ethz.ch, E-mail: claudio.bruderer@phys.ethz.ch, E-mail: andrina.nicola@phys.ethz.ch [Institute for Astronomy, Department of Physics, ETH Zürich, Wolfgang-Pauli-Strasse 27, 8093 Zürich (Switzerland)

    2017-08-01

Determining the redshift distribution n(z) of galaxy samples is essential for several cosmological probes including weak lensing. For imaging surveys, this is usually done using photometric redshifts estimated on an object-by-object basis. We present a new approach for directly measuring the global n(z) of cosmological galaxy samples, including uncertainties, using forward modeling. Our method relies on image simulations produced using UFig (Ultra Fast Image Generator) and on ABC (Approximate Bayesian Computation) within the MCCL (Monte-Carlo Control Loops) framework. The galaxy population is modeled using parametric forms for the luminosity functions, spectral energy distributions, sizes and radial profiles of both blue and red galaxies. We apply exactly the same analysis to the real data and to the simulated images, which also include instrumental and observational effects. By adjusting the parameters of the simulations, we derive a set of acceptable models that are statistically consistent with the data. We then apply the same cuts to the simulations that were used to construct the target galaxy sample in the real data. The redshifts of the galaxies in the resulting simulated samples yield a set of n(z) distributions for the acceptable models. We demonstrate the method by determining n(z) for a cosmic shear-like galaxy sample from the 4-band Subaru Suprime-Cam data in the COSMOS field. We also complement this imaging data with a spectroscopic calibration sample from the VVDS survey. We compare our resulting posterior n(z) distributions to the one derived from photometric redshifts estimated using 36 photometric bands in COSMOS and find good agreement. This offers good prospects for applying our approach to current and future large imaging surveys.

  4. The redshift distribution of cosmological samples: a forward modeling approach

    Science.gov (United States)

    Herbel, Jörg; Kacprzak, Tomasz; Amara, Adam; Refregier, Alexandre; Bruderer, Claudio; Nicola, Andrina

    2017-08-01

Determining the redshift distribution n(z) of galaxy samples is essential for several cosmological probes including weak lensing. For imaging surveys, this is usually done using photometric redshifts estimated on an object-by-object basis. We present a new approach for directly measuring the global n(z) of cosmological galaxy samples, including uncertainties, using forward modeling. Our method relies on image simulations produced using UFig (Ultra Fast Image Generator) and on ABC (Approximate Bayesian Computation) within the MCCL (Monte-Carlo Control Loops) framework. The galaxy population is modeled using parametric forms for the luminosity functions, spectral energy distributions, sizes and radial profiles of both blue and red galaxies. We apply exactly the same analysis to the real data and to the simulated images, which also include instrumental and observational effects. By adjusting the parameters of the simulations, we derive a set of acceptable models that are statistically consistent with the data. We then apply the same cuts to the simulations that were used to construct the target galaxy sample in the real data. The redshifts of the galaxies in the resulting simulated samples yield a set of n(z) distributions for the acceptable models. We demonstrate the method by determining n(z) for a cosmic shear-like galaxy sample from the 4-band Subaru Suprime-Cam data in the COSMOS field. We also complement this imaging data with a spectroscopic calibration sample from the VVDS survey. We compare our resulting posterior n(z) distributions to the one derived from photometric redshifts estimated using 36 photometric bands in COSMOS and find good agreement. This offers good prospects for applying our approach to current and future large imaging surveys.
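The ABC step at the heart of this forward-modeling approach can be illustrated with a toy rejection sampler: draw parameters from a prior, run the simulator, and keep only draws whose summary statistic matches the observation. Everything below (the one-dimensional parameter, the Gaussian stand-in for the image simulator, the tolerance) is a simplified assumption, not the paper's pipeline.

```python
import random
import statistics

def abc_rejection(observed_stat, simulate, prior_sample,
                  n_draws=2000, tol=0.1):
    """Plain rejection ABC: keep parameter draws whose simulated summary
    statistic lies within `tol` of the observed one."""
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample()
        if abs(simulate(theta) - observed_stat) < tol:
            accepted.append(theta)
    return accepted

# Toy stand-in for the image-simulation pipeline: the "summary statistic"
# is the mean of a noisy simulated sample centred on the parameter.
random.seed(1)
posterior = abc_rejection(
    observed_stat=0.6,
    simulate=lambda t: statistics.mean(random.gauss(t, 0.2) for _ in range(50)),
    prior_sample=lambda: random.uniform(0.0, 1.0),
)
```

In the paper this loop runs over full image simulations and multi-dimensional galaxy-population parameters, with the accepted models then supplying the ensemble of n(z) curves.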

  5. Environmental radionuclide concentrations: statistical model to determine uniformity of distribution

    International Nuclear Information System (INIS)

    Cawley, C.N.; Fenyves, E.J.; Spitzberg, D.B.; Wiorkowski, J.; Chehroudi, M.T.

    1980-01-01

    In the evaluation of data from environmental sampling and measurement, a basic question is whether the radionuclide (or pollutant) is distributed uniformly. Since physical measurements have associated errors, it is inappropriate to consider the measurements alone in this determination. Hence, a statistical model has been developed. It consists of a weighted analysis of variance with subsequent t-tests between weighted and independent means. A computer program to perform the calculations is included
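The core of such a uniformity test is that each measurement must be weighted by its inverse variance before comparing means. The sketch below shows a simpler inverse-variance version of that idea (a chi-square homogeneity statistic rather than the paper's full weighted ANOVA with t-tests); the function name and interface are our own.

```python
def weighted_uniformity_stat(values, sigmas):
    """Chi-square statistic for H0: all measurements share one true value.
    Weights are inverse variances; compare the returned statistic against
    a chi-square distribution with len(values) - 1 degrees of freedom."""
    weights = [1.0 / s ** 2 for s in sigmas]
    wmean = sum(w * v for w, v in zip(weights, values)) / sum(weights)
    chi2 = sum(w * (v - wmean) ** 2 for w, v in zip(weights, values))
    return wmean, chi2
```

A large statistic relative to the chi-square quantile indicates that the radionuclide concentrations differ by more than their measurement errors allow, i.e. the distribution is not uniform.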

  6. Model-Driven Test Generation of Distributed Systems

    Science.gov (United States)

    Easwaran, Arvind; Hall, Brendan; Schweiker, Kevin

    2012-01-01

This report describes a novel test generation technique for distributed systems. Utilizing formal models and formal verification tools, specifically the Symbolic Analysis Laboratory (SAL) tool-suite from SRI, we present techniques to generate concurrent test vectors for distributed systems. These are initially explored within an informal test validation context and later extended to achieve full MC/DC coverage of the TTEthernet protocol operating within a system-centric context.

  7. Transverse Momentum Distributions of Electron in Simulated QED Model

    Science.gov (United States)

    Kaur, Navdeep; Dahiya, Harleen

    2018-05-01

    In the present work, we have studied the transverse momentum distributions (TMDs) for the electron in simulated QED model. We have used the overlap representation of light-front wave functions where the spin-1/2 relativistic composite system consists of spin-1/2 fermion and spin-1 vector boson. The results have been obtained for T-even TMDs in transverse momentum plane for fixed value of longitudinal momentum fraction x.

  8. The redshift distribution of cosmological samples: a forward modeling approach

    International Nuclear Information System (INIS)

    Herbel, Jörg; Kacprzak, Tomasz; Amara, Adam; Refregier, Alexandre; Bruderer, Claudio; Nicola, Andrina

    2017-01-01

Determining the redshift distribution n(z) of galaxy samples is essential for several cosmological probes including weak lensing. For imaging surveys, this is usually done using photometric redshifts estimated on an object-by-object basis. We present a new approach for directly measuring the global n(z) of cosmological galaxy samples, including uncertainties, using forward modeling. Our method relies on image simulations produced using UFig (Ultra Fast Image Generator) and on ABC (Approximate Bayesian Computation) within the MCCL (Monte-Carlo Control Loops) framework. The galaxy population is modeled using parametric forms for the luminosity functions, spectral energy distributions, sizes and radial profiles of both blue and red galaxies. We apply exactly the same analysis to the real data and to the simulated images, which also include instrumental and observational effects. By adjusting the parameters of the simulations, we derive a set of acceptable models that are statistically consistent with the data. We then apply the same cuts to the simulations that were used to construct the target galaxy sample in the real data. The redshifts of the galaxies in the resulting simulated samples yield a set of n(z) distributions for the acceptable models. We demonstrate the method by determining n(z) for a cosmic shear-like galaxy sample from the 4-band Subaru Suprime-Cam data in the COSMOS field. We also complement this imaging data with a spectroscopic calibration sample from the VVDS survey. We compare our resulting posterior n(z) distributions to the one derived from photometric redshifts estimated using 36 photometric bands in COSMOS and find good agreement. This offers good prospects for applying our approach to current and future large imaging surveys.

  9. Likelihood Inference of Nonlinear Models Based on a Class of Flexible Skewed Distributions

    Directory of Open Access Journals (Sweden)

    Xuedong Chen

    2014-01-01

Full Text Available This paper deals with the issue of likelihood inference for nonlinear models with a flexible skew-t-normal (FSTN) distribution, which is proposed within a general framework of flexible skew-symmetric (FSS) distributions by combining with the skew-t-normal (STN) distribution. In comparison with common skewed distributions such as the skew normal (SN) and skew-t (ST), as well as scale mixtures of skew normal (SMSN), the FSTN distribution can accommodate more flexibility and robustness in the presence of skewed, heavy-tailed, and especially multimodal outcomes. However, for this distribution, the usual approach of maximum likelihood estimation based on the EM algorithm becomes unavailable, and an alternative is to return to the original Newton-Raphson type method. In order to improve parameter estimation, as well as confidence estimation and hypothesis testing for the parameters of interest, a modified Newton-Raphson iterative algorithm is presented in this paper, based on the profile likelihood for nonlinear regression models with the FSTN distribution, and the confidence interval and hypothesis test are then also developed. Furthermore, a real example and a simulation are conducted to demonstrate the usefulness and superiority of our approach.
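The Newton-Raphson iteration the paper builds on has a simple generic form: repeatedly update the parameter by the score divided by the Hessian of the log-likelihood. The sketch below shows the scalar version on a textbook example (normal-mean MLE), not the paper's modified profile-likelihood algorithm; all names are our own.

```python
def newton_raphson(score, hessian, theta0=0.0, tol=1e-10, max_iter=100):
    """Scalar Newton-Raphson for maximum likelihood:
    theta <- theta - score(theta) / hessian(theta)."""
    theta = theta0
    for _ in range(max_iter):
        step = score(theta) / hessian(theta)
        theta -= step
        if abs(step) < tol:
            break
    return theta

# Worked example: MLE of a normal mean mu from data x (unit variance).
# score(mu) = sum(x_i - mu), hessian(mu) = -n; converges to the sample mean.
x = [1.2, 0.8, 1.5, 1.1]
mu_hat = newton_raphson(score=lambda mu: sum(xi - mu for xi in x),
                        hessian=lambda mu: -float(len(x)))
```

The paper's modification concerns how the score and Hessian are obtained for the FSTN profile likelihood, where the EM machinery is unavailable; the outer iteration is of this form.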

  10. Evaluation for the models of neutron diffusion theory in terms of power density distributions of the HTTR

    International Nuclear Information System (INIS)

    Takamatsu, Kuniyoshi; Shimakawa, Satoshi; Nojiri, Naoki; Fujimoto, Nozomu

    2003-10-01

In the case of evaluations of the highest fuel temperature in the HTTR, it is very important to predict the power density distributions accurately; therefore, it is necessary to improve the analytical model based on neutron diffusion and burn-up theory. The power density distributions are analyzed in terms of two models, one mixing the fuels and the burnable poisons homogeneously and the other modeling them heterogeneously. Moreover, these analytical power density distributions are compared with the ones derived from gross gamma-ray measurements and from a continuous-energy Monte Carlo calculational code. As a result, the homogeneous mixed model is not sufficient to predict the power density distributions of the core in the axial direction; on the other hand, the heterogeneous model improves the accuracy. (author)

  11. Distributed Hydrologic Modeling Apps for Decision Support in the Cloud

    Science.gov (United States)

    Swain, N. R.; Latu, K.; Christiensen, S.; Jones, N.; Nelson, J.

    2013-12-01

Advances in computation resources and greater availability of water resources data represent an untapped resource for addressing hydrologic uncertainties in water resources decision-making. The current practice of water authorities relies on empirical, lumped hydrologic models to estimate watershed response. These models are not capable of taking advantage of many of the spatial datasets that are now available. Physically-based, distributed hydrologic models are capable of using these data resources and providing better predictions through stochastic analysis. However, there exists a digital divide that discourages many science-minded decision makers from using distributed models. This divide can be spanned using a combination of existing web technologies. The purpose of this presentation is to present a cloud-based environment that will offer hydrologic modeling tools or 'apps' for decision support and the web technologies that have been selected to aid in its implementation. Compared to the more commonly used lumped-parameter models, distributed models, while being more intuitive, are still data intensive, computationally expensive, and difficult to modify for scenario exploration. However, web technologies such as web GIS, web services, and cloud computing have made the data more accessible, provided an inexpensive means of high-performance computing, and created an environment for developing user-friendly apps for distributed modeling. Since many water authorities are primarily interested in the scenario exploration exercises with hydrologic models, we are creating a toolkit that facilitates the development of a series of apps for manipulating existing distributed models. There are a number of hurdles that cloud-based hydrologic modeling developers face. One of these is how to work with the geospatial data inherent with this class of models in a web environment: supporting geospatial data in a website is beyond the capabilities of standard web frameworks.

  12. Multivariate Birnbaum-Saunders Distributions: Modelling and Applications

    Directory of Open Access Journals (Sweden)

    Robert G. Aykroyd

    2018-03-01

    Full Text Available Since its origins and numerous applications in material science, the Birnbaum–Saunders family of distributions has now found widespread uses in some areas of the applied sciences such as agriculture, environment and medicine, as well as in quality control, among others. It is able to model varied data behaviour and hence provides a flexible alternative to the most usual distributions. The family includes Birnbaum–Saunders and log-Birnbaum–Saunders distributions in univariate and multivariate versions. There are now well-developed methods for estimation and diagnostics that allow in-depth analyses. This paper gives a detailed review of existing methods and of relevant literature, introducing properties and theoretical results in a systematic way. To emphasise the range of suitable applications, full analyses are included of examples based on regression and diagnostics in material science, spatial data modelling in agricultural engineering and control charts for environmental monitoring. However, potential future uses in new areas such as business, economics, finance and insurance are also discussed. This work is presented to provide a full tool-kit of novel statistical models and methods to encourage other researchers to implement them in these new areas. It is expected that the methods will have the same positive impact in the new areas as they have had elsewhere.

  13. Robust Hydrological Forecasting for High-resolution Distributed Models Using a Unified Data Assimilation Approach

    Science.gov (United States)

    Hernandez, F.; Liang, X.

    2017-12-01

Reliable real-time hydrological forecasting, to predict important phenomena such as floods, is invaluable to society. However, modern high-resolution distributed models have faced challenges when dealing with uncertainties that are caused by the large number of parameters and initial state estimations involved. Therefore, to rely on these high-resolution models for critical real-time forecast applications, considerable improvements in the parameter and initial state estimation techniques must be made. In this work we present a unified data assimilation algorithm called Optimized PareTo Inverse Modeling through Inverse STochastic Search (OPTIMISTS) to address the challenge of robust flood forecasting with high-resolution distributed models. This new algorithm combines the advantages of particle filters and variational methods in a unique way to overcome their individual weaknesses. The analysis of candidate particles compares model results with observations in a flexible time frame, and a multi-objective approach is proposed which attempts to simultaneously minimize differences with the observations and departures from the background states by using both Bayesian sampling and non-convex evolutionary optimization. Moreover, the resulting Pareto front is given a probabilistic interpretation through kernel density estimation to create a non-Gaussian distribution of the states. OPTIMISTS was tested on a low-resolution distributed land surface model using VIC (Variable Infiltration Capacity) and on a high-resolution distributed hydrological model using the DHSVM (Distributed Hydrology Soil Vegetation Model). In the tests, streamflow observations are assimilated. OPTIMISTS was also compared with a traditional particle filter and a variational method. Results show that our method can reliably produce adequate forecasts and that it is able to outperform those resulting from assimilating the observations using a particle filter or an evolutionary 4D variational method.

  14. Fuzzy Approximate Model for Distributed Thermal Solar Collectors Control

    KAUST Repository

    Elmetennani, Shahrazed

    2014-07-01

This paper deals with the problem of controlling concentrated solar collectors, where the objective consists of making the outlet temperature of the collector track a desired reference. The performance of the novel approximate model based on fuzzy theory, which was introduced by the authors in [1], is evaluated in comparison with other methods in the literature. The proposed approximation is a low-order state representation derived from the physical distributed model. It reproduces the temperature transfer dynamics through the collectors accurately and allows the simplification of the control design. Simulation results show interesting performance of the proposed controller.

  15. Information Modeling for Direct Control of Distributed Energy Resources

    DEFF Research Database (Denmark)

    Biegel, Benjamin; Andersen, Palle; Stoustrup, Jakob

    2013-01-01

    We present an architecture for an unbundled liberalized electricity market system where a virtual power plant (VPP) is able to control a number of distributed energy resources (DERs) directly through a two-way communication link. The aggregator who operates the VPP utilizes the accumulated...... a desired accumulated response. In this paper, we design such an information model based on the markets that the aggregator participates in and based on the flexibility characteristics of the remote controlled DERs. The information model is constructed in a modular manner making the interface suitable...

  16. Predicted and measured velocity distribution in a model heat exchanger

    International Nuclear Information System (INIS)

    Rhodes, D.B.; Carlucci, L.N.

    1984-01-01

    This paper presents a comparison between numerical predictions, using the porous media concept, and measurements of the two-dimensional isothermal shell-side velocity distributions in a model heat exchanger. Computations and measurements were done with and without tubes present in the model. The effect of tube-to-baffle leakage was also investigated. The comparison was made to validate certain porous media concepts used in a computer code being developed to predict the detailed shell-side flow in a wide range of shell-and-tube heat exchanger geometries

  17. Modeling complexity in engineered infrastructure system: Water distribution network as an example

    Science.gov (United States)

    Zeng, Fang; Li, Xiang; Li, Ke

    2017-02-01

The complex topology and adaptive behavior of infrastructure systems are driven by both self-organization of the demand and rigid engineering solutions. Therefore, engineering complex systems requires a method balancing holism and reductionism. To model the growth of water distribution networks, a complex network model was developed following a combination of local optimization rules and engineering considerations. The demand node generation is dynamic and follows the scaling law of urban growth. The proposed model can generate a water distribution network (WDN) similar to reported real-world WDNs on some structural properties. Comparison with different modeling approaches indicates that a realistic demand node distribution and co-evolution of demand nodes and network are important for the simulation of real complex networks. The simulation results indicate that the efficiency of water distribution networks is exponentially affected by the urban growth pattern. In contrast, the improvement of efficiency by engineering optimization is limited and relatively insignificant. The redundancy and robustness, on the other hand, can be significantly improved through engineering methods.

  18. Evaluating Domestic Hot Water Distribution System Options With Validated Analysis Models

    Energy Technology Data Exchange (ETDEWEB)

    Weitzel, E.; Hoeschele, M.

    2014-09-01

    A developing body of work is forming that collects data on domestic hot water consumption, water use behaviors, and the energy efficiency of various distribution systems. A full distribution system developed in TRNSYS has been validated using field monitoring data and then exercised in a number of climates to understand climate impact on performance. This study builds upon previous analysis modeling work to evaluate differing distribution systems and the sensitivities of water heating energy and water use efficiency to variations in climate, load, distribution type, insulation, and compact plumbing practices. Overall, 124 different TRNSYS models were simulated. Of the configurations evaluated, distribution losses account for 13-29% of total water heating energy use, and water use efficiency ranges from 11-22%. The base case, an uninsulated trunk-and-branch system, sees the most improvement in energy consumption from insulating and locating the water heater central to all fixtures. Demand recirculation systems are not projected to provide significant energy savings and in some cases increase energy consumption. Water use is most efficient with demand recirculation systems, followed by the insulated trunk-and-branch system with a central water heater. Compact plumbing practices and insulation have the most impact on energy consumption (2-6% for insulation and 3-4% per 10 gallons of enclosed volume reduced). The results of this work are useful in informing future development of water heating best practices guides as well as more accurate (and simulation-time-efficient) distribution models for annual whole-house simulation programs.
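The loss and efficiency figures quoted above come from bookkeeping of this kind: each draw must first purge the cooled water sitting in the distribution piping, wasting both the water and the energy that heated it. A minimal sketch follows; the pipe volume and draw sizes are illustrative assumptions, not values from the TRNSYS study.

```python
# Hedged sketch of distribution-waste bookkeeping: every draw wastes
# (up to) one pipe volume of cooled hot water before hot water arrives.
# Draw sizes and pipe volume below are illustrative placeholders.

def water_waste_fraction(draws_gal, pipe_volume_gal):
    """Fraction of hot water drawn that is purged cold, never used."""
    useful = sum(draws_gal)                     # gallons actually used hot
    wasted = pipe_volume_gal * len(draws_gal)   # one pipe volume per draw
    return wasted / (useful + wasted)

# Four draws through a 0.3-gallon trunk-and-branch run:
frac = water_waste_fraction([2.0, 1.5, 0.5, 3.0], pipe_volume_gal=0.3)
print(round(frac, 3))
```

This is why compact plumbing (less enclosed volume) and central water heater placement help: they shrink the per-draw purge, which scales with pipe volume rather than with the amount of hot water actually used.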

  19. A Sustainability-Oriented Multiobjective Optimization Model for Siting and Sizing Distributed Generation Plants in Distribution Systems

    Directory of Open Access Journals (Sweden)

    Guang Chen

    2013-01-01

    This paper proposes a sustainability-oriented multiobjective optimization model for siting and sizing DG plants in distribution systems. Life cycle exergy (LCE) is used as a unified indicator of the entire system's environmental sustainability, and it is optimized as an objective function in the model. The other two objective functions are economic cost and expected power loss. Chance constraints are used to control the operational risks caused by uncertain power loads and renewable energies. A semilinearized simulation method is proposed and combined with the Latin hypercube sampling (LHS) method to improve the efficiency of the probabilistic load flow (PLF) analysis, which is performed repeatedly to verify the chance constraints. A numerical study based on the modified IEEE 33-node system is performed to verify the proposed method. Numerical results show that the proposed semilinearized simulation method reduces the calculation time of the PLF analysis by about 93.3% while maintaining satisfactory accuracy. The results also indicate that the environmental sustainability benefits of using DG plants are effectively reflected by the proposed model, which helps the planner make rational decisions toward sustainable development of the distribution system.
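Latin hypercube sampling, used above to make the repeated PLF runs affordable, can be sketched compactly: each uncertain input (load, renewable output) is stratified into n equal-probability bins, one sample is drawn per bin, and the bins are shuffled independently per dimension so the scenarios cover the joint input space evenly. This is a generic LHS sketch, not the paper's implementation.

```python
import random

# Hedged sketch of Latin hypercube sampling over the unit hypercube.
# Each column (dimension) gets exactly one sample per stratum; rows are
# the scenarios fed to the probabilistic load flow.

def latin_hypercube(n, d, seed=0):
    columns = []
    rng = random.Random(seed)
    for _ in range(d):
        # one point inside each of the n equal-probability strata of [0, 1)
        column = [(k + rng.random()) / n for k in range(n)]
        rng.shuffle(column)    # decorrelate strata across dimensions
        columns.append(column)
    # transpose: one row = one PLF scenario of d input values
    return list(zip(*columns))

scenarios = latin_hypercube(n=10, d=3)
print(len(scenarios))  # 10 scenarios, each a 3-tuple in [0, 1)
```

In practice each uniform value would then be mapped through the inverse CDF of the corresponding load or renewable-output distribution before running the load flow.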

  20. Geomechanical Modeling for Improved CO2 Storage Security

    Science.gov (United States)

    Rutqvist, J.; Rinaldi, A. P.; Cappa, F.; Jeanne, P.; Mazzoldi, A.; Urpi, L.; Vilarrasa, V.; Guglielmi, Y.

    2017-12-01

    This presentation summarizes recent modeling studies on geomechanical aspects of Geologic Carbon Sequestration (GCS), including modeling of potential fault reactivation, seismicity, and CO2 leakage. The model simulations demonstrate that the potential for fault reactivation and the resulting seismic magnitude, as well as the potential for creating a leakage path through overburden sealing layers (caprock), depend on a number of parameters such as fault orientation, stress field, and rock properties. The simulations further demonstrate that seismic events large enough to be felt by humans require brittle fault properties as well as continuous fault permeability, which allows the pressure to be distributed over a large fault patch that ruptures at once. Heterogeneous fault properties, which are commonly encountered in faults intersecting multilayered shale/sandstone sequences, effectively reduce the likelihood of inducing felt seismicity and also effectively impede upward CO2 leakage. Site-specific model simulations of the In Salah CO2 storage site showed that deep fractured-zone responses and associated seismicity occurred in the brittle fractured sandstone reservoir, but only at a very substantial reservoir overpressure close to the magnitude of the least principal stress. It is suggested that coupled geomechanical modeling be used to guide site selection, assist in identifying locations most prone to unwanted and damaging geomechanical changes, and evaluate the potential consequences of such changes. Geomechanical modeling can be used to better estimate the maximum sustainable injection rate or reservoir pressure and thereby provide for improved CO2 storage security. Whether damaging geomechanical changes could actually occur very much depends on the local stress field and local reservoir properties, such as the presence of ductile rock and faults (which can aseismically accommodate the stress and strain induced by
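The screening logic behind such fault-reactivation assessments can be sketched with the Mohr-Coulomb slip criterion: a fault slips when the shear stress on the plane exceeds its frictional resistance, and injection overpressure promotes slip by reducing the effective normal stress. The stress values and friction coefficient below are illustrative assumptions, not site data; the studies above solve this coupled to flow in full hydromechanical simulators.

```python
# Hedged sketch of a Mohr-Coulomb fault-reactivation check. All stresses
# in MPa; sigma_n and tau are the resolved normal and shear stress on the
# fault plane, and overpressure is the injection-induced pore pressure rise.

def fault_reactivates(sigma_n, tau, overpressure, mu=0.6, cohesion=0.0):
    """True if the Coulomb slip criterion is met on the fault plane."""
    effective_normal = sigma_n - overpressure   # pore pressure unclamps the fault
    return tau >= cohesion + mu * effective_normal

# A fault stable under ambient conditions ...
print(fault_reactivates(sigma_n=40.0, tau=20.0, overpressure=0.0))
# ... can be brought to failure by reservoir overpressure:
print(fault_reactivates(sigma_n=40.0, tau=20.0, overpressure=10.0))
```

This also illustrates the In Salah observation quoted above: with a stress state well inside the failure envelope, reactivation requires an overpressure that is a substantial fraction of the confining stress.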