WorldWideScience

Sample records for model simulated values

  1. Value stream mapping in a computational simulation model

    Directory of Open Access Journals (Sweden)

    Ricardo Becker Mendes de Oliveira

    2014-08-01

The decision-making process has been extensively studied by researchers and executives. This paper applies the Value Stream Mapping (VSM) methodology in an integrated manner with a computer simulation model, in order to broaden managers' decision-making vision. The object of study is a production system involving automatic packaging of products, where changes were needed to accommodate new products, so that detecting bottlenecks and visualizing the impacts of future modifications became necessary. The simulation aims to support the manager's decisions, considering that the system involves several variables whose behaviors define the complexity of the process. The main results were a significant reduction in project costs by anticipating system behavior, together with the Value Stream Mapping results identifying which activities add value to the process and which do not. The simulation model is validated against the current-state map of the system, with Kaizen events included so that waste in future-state maps can be found in a practical and reliable way to support decision-making.

  2. Simulating the Value of Concentrating Solar Power with Thermal Energy Storage in a Production Cost Model

    Energy Technology Data Exchange (ETDEWEB)

    Denholm, P.; Hummon, M.

    2012-11-01

Concentrating solar power (CSP) deployed with thermal energy storage (TES) provides a dispatchable source of renewable energy. The value of CSP with TES, as with other potential generation resources, needs to be established using traditional utility planning tools. Production cost models, which simulate the operation of the grid, are often used to estimate the operational value of different generation mixes. CSP with TES has historically had limited analysis in commercial production simulations. This document describes the implementation of CSP with TES in a commercial production cost model. It also describes the simulation of grid operations with CSP in a test system consisting of two balancing areas located primarily in Colorado.

  3. The Impact of Different Absolute Solar Irradiance Values on Current Climate Model Simulations

    Science.gov (United States)

    Rind, David H.; Lean, Judith L.; Jonas, Jeffrey

    2014-01-01

Simulations of the preindustrial and doubled CO2 climates are made with the GISS Global Climate Middle Atmosphere Model 3 using two different estimates of the absolute solar irradiance value: a higher value measured by solar radiometers in the 1990s and a lower value measured recently by the Solar Radiation and Climate Experiment. Each of the model simulations is adjusted to achieve global energy balance; without this adjustment the difference in irradiance produces a global temperature change of 0.48°C, comparable to the cooling estimated for the Maunder Minimum. The results indicate that by altering cloud cover the model properly compensates for the different absolute solar irradiance values on a global level when simulating both preindustrial and doubled CO2 climates. On a regional level, the preindustrial climate simulations and the patterns of change with doubled CO2 concentrations are again remarkably similar, but there are some differences. Using a higher absolute solar irradiance value and the requisite cloud cover affects the model's depictions of high-latitude surface air temperature, sea level pressure, and stratospheric ozone, as well as tropical precipitation. In the climate change experiments it leads to an underestimation of North Atlantic warming, reduced precipitation in the tropical western Pacific, and smaller total ozone growth at high northern latitudes. Although significant, these differences are typically modest compared with the magnitude of the regional changes expected for doubled greenhouse gas concentrations. Nevertheless, the model simulations demonstrate that achieving the highest possible fidelity when simulating regional climate change requires that climate models use as input the most accurate (lower) solar irradiance value.

  4. ARM Cloud Radar Simulator Package for Global Climate Models Value-Added Product

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Yuying [North Carolina State Univ., Raleigh, NC (United States); Xie, Shaocheng [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-05-01

It has been challenging to directly compare U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility ground-based cloud radar measurements with climate model output because of limitations or features of the observing processes and the spatial gap between model grids and the single-point measurements. To facilitate the use of ARM radar data in numerical models, an ARM cloud radar simulator was developed to convert model data into pseudo-ARM cloud radar observations that mimic the instrument view of a narrow atmospheric column (as compared to a large global climate model [GCM] grid cell), thus allowing meaningful comparison between model output and ARM cloud observations. The ARM cloud radar simulator value-added product (VAP) was developed based on the CloudSat simulator contained in the community satellite simulator package, the Cloud Feedback Model Intercomparison Project (CFMIP) Observation Simulator Package (COSP) (Bodas-Salcedo et al., 2011), which has been widely used in climate model evaluation with satellite data (Klein et al., 2013; Zhang et al., 2010). The essential part of the CloudSat simulator is the QuickBeam radar simulator, which is used to produce CloudSat-like radar reflectivity but is capable of simulating reflectivity for other radars (Marchand et al., 2009; Haynes et al., 2007). Adapting QuickBeam to the ARM cloud radar simulator within COSP required two primary changes: the first was to set the frequency to 35 GHz for the ARM Ka-band cloud radar, as opposed to the 94 GHz used for the CloudSat W-band radar, and the second was to invert the viewing geometry so that the beam looks from the ground up to space and is attenuated correctly along that path. In addition, the ARM cloud radar simulator uses a finer vertical resolution (100 m compared to 500 m for CloudSat) to resolve the more detailed structure of clouds captured by the ARM radars. The ARM simulator has been developed following the COSP workflow (Figure 1) and using the capabilities available in COSP.
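The key geometric change described above, that a ground-based radar accumulates attenuation from the surface upward while a spaceborne radar accumulates it downward, can be sketched with a toy loop. This is an illustration of the viewing-direction effect only, not QuickBeam's actual gas and hydrometeor attenuation physics; the layer values below are invented.

```python
import numpy as np

def attenuated_reflectivity(z_true_db, atten_db_per_layer, ground_based=True):
    """Apply cumulative two-way attenuation along the beam path.

    z_true_db: unattenuated reflectivity per vertical layer (dBZ), index 0 = surface.
    atten_db_per_layer: two-way attenuation contributed by each layer (dB).
    ground_based: if True, the beam travels surface -> space (ARM-like view);
                  if False, space -> surface (CloudSat-like view).
    """
    n = len(z_true_db)
    order = range(n) if ground_based else range(n - 1, -1, -1)
    out = np.empty(n)
    path_loss = 0.0
    for i in order:
        out[i] = z_true_db[i] - path_loss   # attenuation accumulated before this layer
        path_loss += atten_db_per_layer[i]  # this layer then adds to the path loss
    return out

# Hypothetical 5-layer column with an attenuating cloud in layers 1-3.
z = np.array([10.0, 20.0, 25.0, 20.0, 5.0])
a = np.array([0.0, 2.0, 3.0, 2.0, 0.0])
print(attenuated_reflectivity(z, a, ground_based=True))   # top layer most attenuated
print(attenuated_reflectivity(z, a, ground_based=False))  # bottom layer most attenuated
```

The same true column yields different apparent profiles depending on the viewing direction, which is why the view had to be inverted when adapting the CloudSat simulator.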

  5. Simulation modeling to derive the value-of-information for risky animal disease-import decisions.

    Science.gov (United States)

    Disney, W Terry; Peters, Mark A

    2003-11-12

Simulation modeling can aid decision-makers in deciding when to invest in additional research and whether a risky animal disease-import decision should go forward. Simulation modeling to evaluate value-of-information (VOI) techniques provides a robust, objective and transparent framework for assisting decision-makers in making risky animal and animal-product decisions. In this analysis, the hypothetical risk from poultry disease in chicken-meat imports was modeled. Economic criteria were used to quantify alternative confidence-increasing decisions regarding potential import testing and additional research requirements. In our hypothetical example, additional information about poultry disease in the exporting country (either by requiring additional export-flock surveillance that results in no sign of disease, or by conducting additional research into the lack of disease transmittal through chicken-meat ingestion) captured >75% of the value-of-information attainable regarding the chicken-meat-import decision.

  6. Dynamic Value at Risk: A Comparative Study Between Heteroscedastic Models and Monte Carlo Simulation

    Directory of Open Access Journals (Sweden)

    José Lamartine Távora Junior

    2006-12-01

The objective of this paper was to analyze the risk management of a portfolio composed of Petrobras PN, Telemar PN and Vale do Rio Doce PNA stocks. It was verified whether modeling Value-at-Risk (VaR) through Monte Carlo simulation with GARCH-family volatility is supported by the efficient-market hypothesis. The results showed that the static evaluation is inferior to the dynamic one, evidencing that the dynamic analysis supports the efficient-market hypothesis for the Brazilian equity market, in opposition to some empirical evidence. It was also verified that GARCH volatility models are sufficient to accommodate the variations of the Brazilian equity market, since they are capable of capturing its considerable dynamics.
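The combination the paper tests, Monte Carlo simulation of portfolio returns with GARCH-family conditional volatility, can be sketched as follows. The GARCH(1,1) parameters and normal innovations are illustrative assumptions, not estimates for the Petrobras/Telemar/Vale portfolio.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative GARCH(1,1) parameters (assumed, not estimated from market data).
omega, alpha, beta = 1e-6, 0.08, 0.90

def mc_var(h0, horizon=1, n_paths=100_000, level=0.95):
    """Value-at-Risk by Monte Carlo under GARCH(1,1) conditional volatility."""
    h = np.full(n_paths, h0)      # conditional variance per simulated path
    pnl = np.zeros(n_paths)
    for _ in range(horizon):
        eps = rng.standard_normal(n_paths)
        r = np.sqrt(h) * eps      # daily return with time-varying volatility
        pnl += r
        h = omega + alpha * r**2 + beta * h   # GARCH(1,1) variance recursion
    return -np.quantile(pnl, 1 - level)       # loss at the chosen confidence level

# With a 2% daily volatility, the 1-day 95% VaR should sit near 1.645 * 2%.
print(round(mc_var(h0=0.02**2), 4))
```

A dynamic VaR in the paper's sense would re-run this each day with `h0` updated from the latest GARCH filter state, rather than holding volatility fixed.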

  7. Very high resolution regional climate model simulations over Greenland: Identifying added value

    DEFF Research Database (Denmark)

    Lucas-Picher, P.; Wulff-Nielsen, M.; Christensen, J.H.

    2012-01-01

This study presents two simulations of the climate over Greenland with the regional climate model (RCM) HIRHAM5 at 0.05° and 0.25° resolution, driven at the lateral boundaries by the ERA-Interim reanalysis for the period 1989–2009. These simulations are validated against observations from meteorological stations (Danish Meteorological Institute) at the coast and automatic weather stations on the ice sheet (Greenland Climate Network). Generally, the temperature and precipitation biases are small, indicating a realistic simulation of the climate over Greenland that is suitable to drive ice sheet models. However, the bias between the simulations and the few available observations does not reduce with higher resolution. This is partly explained by the lack of observations in regions where the higher resolution is expected to improve the simulated climate. The RCM simulations show…

  9. Using Discrete Event Simulation to Model the Economic Value of Shorter Procedure Times on EP Lab Efficiency in the VALUE PVI Study.

    Science.gov (United States)

    Kowalski, Marcin; DeVille, J Brian; Svinarich, J Thomas; Dan, Dan; Wickliffe, Andrew; Kantipudi, Charan; Foell, Jason D; Filardo, Giovanni; Holbrook, Reece; Baker, James; Baydoun, Hassan; Jenkins, Mark; Chang-Sing, Peter

    2016-05-01

The VALUE PVI study demonstrated that atrial fibrillation (AF) ablation procedure and electrophysiology laboratory (EP lab) occupancy times were reduced for the cryoballoon compared with focal radiofrequency (RF) ablation. However, the economic impact of the cryoballoon procedure for hospitals has not been determined. To assess the economic value associated with shorter AF ablation procedure times, a model was formulated from VALUE PVI study data. This model used a discrete event simulation to translate procedural efficiencies into metrics utilized by hospital administrators. A 1000-day period was simulated to determine the accrued impact of procedure time on an institution's EP lab when considering staff and hospital resources. The simulation demonstrated that procedures performed with the cryoballoon catheter resulted in several efficiencies, including: (1) a reduction of 36.2% in days with overtime (422 days RF vs 60 days cryoballoon); (2) 92.7% less cumulative overtime hours (370 hours RF vs 27 hours cryoballoon); and (3) an increase of 46.7% in days with time for additional EP lab use (186 days RF vs 653 days cryoballoon). Importantly, the added EP lab utilization could not support the time required for an additional AF ablation procedure. The discrete event simulation of the VALUE PVI data demonstrates the potential positive economic value of AF ablation procedures using the cryoballoon. These benefits include more days where overtime is avoided, fewer cumulative overtime hours, and more days with time left for additional use of EP lab resources.
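The translation of procedure times into overtime and spare-capacity metrics can be sketched with a simple day-by-day simulation. The lognormal durations, case counts, and 8-hour day below are invented placeholders, not VALUE PVI parameters.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_lab(mean_proc_hours, n_days=1000, cases_per_day=3, day_hours=8.0):
    """Toy simulation of EP lab occupancy over n_days (illustrative numbers only)."""
    # Procedure durations drawn from a lognormal centred on the given typical time.
    draws = rng.lognormal(mean=np.log(mean_proc_hours), sigma=0.2,
                          size=(n_days, cases_per_day))
    daily_total = draws.sum(axis=1)
    overtime_days = int((daily_total > day_hours).sum())
    overtime_hours = float(np.clip(daily_total - day_hours, 0, None).sum())
    spare_days = int((day_hours - daily_total >= 2.0).sum())  # room for extra lab use
    return overtime_days, overtime_hours, spare_days

print("longer procedures: ", simulate_lab(2.8))  # more overtime days and hours
print("shorter procedures:", simulate_lab(2.2))  # more days with spare capacity
```

Even a modest per-procedure time saving compounds over 1000 days into large differences in overtime and spare capacity, which is the mechanism the study's discrete event simulation quantifies.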

  10. [Value of simulation in pediatrics].

    Science.gov (United States)

    Oriot, D; Boureau-Voultoury, A; Ghazali, A; Brèque, C; Scépi, M

    2013-06-01

The authors present the concepts of simulation and its use in pediatrics. Simulation in medicine is a teaching method that is not yet well developed in Europe and has not spread in pediatrics in France. Motivations for simulation are first and foremost ethical: "Never the first time on patients!" Simulation also provides benefits in teaching communication skills and theoretical concepts. It is an essential means of maintaining patient safety by limiting the risk of errors. It covers the teaching of procedures requiring realistic models, as well as communication and crisis resource management. Simulation can also be used to teach the disclosure of bad news, using actors. Simulation skills are acquired during debriefing, when the supervisor acts as a facilitator. Evaluation is mandatory in simulation and depends on how realistic the models are and on the performance of a procedure or of multidisciplinary team management. Performance can be objectively assessed only with validated tools. Simulation will become a mandatory teaching method in medicine.

  11. The impact of MCS models and EFAC values on the dose simulation for a proton pencil beam

    Science.gov (United States)

    Chen, Shih-Kuan; Chiang, Bing-Hao; Lee, Chung-Chi; Tung, Chuan-Jong; Hong, Ji-Hong; Chao, Tsi-Chian

    2017-08-01

The Multiple Coulomb Scattering (MCS) model plays an important role in accurate MC simulation, especially for small-field applications. The Rossi model is used in MCNPX 2.7.0, and the Lewis model in Geant4.9.6.p02. These two models may generate very different angular and spatial distributions in small-field proton dosimetry. Besides angular and spatial distributions, step size is also an important issue that causes path length effects. The Energy Fraction (EFAC) value can be used in MCNPX 2.7.0 to control the step sizes of MCS. In this study, we use MCNPX 2.7.0, Geant4.9.6.p02, and a pencil beam algorithm to evaluate the effect on dose deposition of different MCS models and different EFAC values in proton disequilibrium situations. The different MCS models agree well with each other under a proton equilibrium situation. Under proton disequilibrium situations, however, the MCNPX and Geant4 results show a significant deviation (up to 43%). In addition, the path length effects are more significant when EFAC is equal to 0.917 or 0.94 in small-field proton dosimetry, and a 0.97 EFAC value is the best for both accuracy and efficiency.

  12. Measuring the Value-added of Oil Palm Products with Integrating SCOR Model and Discrete Event Simulation

    Directory of Open Access Journals (Sweden)

    Fitra Lestari

    2014-09-01

The oil palm processing industry in Malaysia can directly export finished products without considering the transformation value of the product to the end customer. Nevertheless, this influences the configuration of the supply chain strategy. The purpose of this study is to measure the performance of supply chain configurations in the oil palm business. The model measures supply chain configuration by integrating the SCOR model and discrete event simulation. The findings of this study revealed that the highest value-added oil palm derivative product comes from scenario 5, which proposes delivering 100% of CPO and CPKO to the local refinery, rather than exporting them directly, and distributing the resulting finished products for export through the port. Finally, this gives stakeholders a basis for controlling the system and making sure the business process stays on track.

  13. Singular value decomposition with self-modeling applied to determine bacteriorhodopsin intermediate spectra: analysis of simulated data.

    Science.gov (United States)

    Zimányi, L; Kulcsár, A; Lanyi, J K; Sears, D F; Saltiel, J

    1999-04-13

    An a priori model-independent method for the determination of accurate spectra of photocycle intermediates is developed. The method, singular value decomposition with self-modeling (SVD-SM), is tested on simulated difference spectra designed to mimic the photocycle of the Asp-96 --> Asn mutant of bacteriorhodopsin. Stoichiometric constraints, valid until the onset of the recovery of bleached bacteriorhodopsin at the end of the photocycle, guide the self-modeling procedure. The difference spectra of the intermediates are determined in eigenvector space by confining the search for their coordinates to a stoichiometric plane. In the absence of random noise, SVD-SM recovers the intermediate spectra and their time evolution nearly exactly. The recovery of input spectra and kinetics is excellent although somewhat less exact when realistic random noise is included in the input spectra. The difference between recovered and input kinetics is now visually discernible, but the same reaction scheme with nearly identical rate constants to those assumed in the simulation fits the output kinetics well. SVD-SM relegates the selection of a photocycle model to the late stage of the analysis. It thus avoids derivation of erroneous model-specific spectra that result from global model-fitting approaches that assume a model at the outset.
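The first step of SVD-SM, using the singular values of a noisy data matrix to count the spectrally distinct intermediates, can be illustrated on synthetic spectra. The Gaussian band shapes and kinetics below are invented stand-ins for the bacteriorhodopsin intermediates, and the self-modeling search in eigenvector space is not shown.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated data: 3 "intermediate" spectra mixed with smooth time-dependent weights.
wavelengths = np.linspace(350, 650, 200)
basis = np.stack([np.exp(-((wavelengths - c) / 40.0) ** 2) for c in (410, 490, 570)])
t = np.linspace(0, 1, 50)[:, None]
conc = np.hstack([np.exp(-5 * t), 5 * t * np.exp(-5 * t), 1 - np.exp(-4 * t)])
data = conc @ basis + rng.normal(0, 1e-3, (50, 200))  # 50 times x 200 wavelengths

# SVD: the number of singular values above the noise floor estimates the
# number of independent spectral components present in the data.
u, s, vt = np.linalg.svd(data, full_matrices=False)
print(np.round(s[:5], 3))  # three values stand well above the noise floor

# A rank-3 reconstruction recovers the data to within the added noise.
rank3 = (u[:, :3] * s[:3]) @ vt[:3]
print(float(np.abs(data - rank3).max()) < 1e-2)
```

In SVD-SM proper, the search for the intermediate spectra then proceeds inside this 3-dimensional eigenvector space, constrained to the stoichiometric plane described in the abstract.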

  14. Essays in energy policy and planning modeling under uncertainty: Value of information, optimistic biases, and simulation of capacity markets

    Science.gov (United States)

    Hu, Ming-Che

Optimization and simulation are popular operations research and systems analysis tools for energy policy modeling. This dissertation addresses three important questions concerning the use of these tools for energy market (and electricity market) modeling and planning under uncertainty. (1) What is the value of information and cost of disregarding different sources of uncertainty for the U.S. energy economy? (2) Could model-based calculations of the performance (social welfare) of competitive and oligopolistic market equilibria be optimistically biased due to uncertainties in objective function coefficients? (3) How do alternative sloped demand curves perform in the PJM capacity market under economic and weather uncertainty, and how do curve adjustment and cost dynamics affect the capacity market outcomes? To address the first question, two-stage stochastic optimization is utilized in the U.S. national MARKAL energy model; then the value of information and cost of ignoring uncertainty are estimated for three uncertainties: carbon cap policy, load growth and natural gas prices. When an uncertainty is important, explicitly considering those risks when making investments will result in better performance in expectation (a positive expected cost of ignoring uncertainty). Furthermore, eliminating the uncertainty would improve strategies even further, meaning that improved forecasts of future conditions are valuable (i.e., a positive expected value of information). Also, the value of policy coordination shows the difference between a strategy developed under the incorrect assumption of no carbon cap and a strategy correctly anticipating imposition of such a cap. For the second question, game theory models are formulated and the existence of optimistic (positive) biases in market equilibria (both competitive and oligopoly markets) is proved, in that calculated social welfare and producer profits will, in expectation, exceed the values that will actually be received.
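The "expected value of perfect information" logic used in the first question can be shown on a two-scenario toy problem: compare the best single here-and-now decision against the average of the best decision per scenario. The decisions, costs, and probabilities below are invented for illustration, not taken from the MARKAL study.

```python
# Two-stage toy problem: choose capacity now; a carbon cap may (p = 0.5) bind later.
scenarios = {"cap": 0.5, "no_cap": 0.5}
cost = {                      # cost[decision][scenario], illustrative units
    "build_clean": {"cap": 90, "no_cap": 80},
    "build_cheap": {"cap": 120, "no_cap": 60},
}

# Here-and-now: pick the single decision with the lowest expected cost.
exp_cost = {d: sum(p * cost[d][s] for s, p in scenarios.items()) for d in cost}
here_and_now = min(exp_cost.values())

# Wait-and-see: with perfect foresight, pick the best decision in each scenario.
wait_and_see = sum(p * min(cost[d][s] for d in cost) for s, p in scenarios.items())

# Expected value of perfect information: what a perfect forecast is worth.
evpi = here_and_now - wait_and_see
print(exp_cost, here_and_now, wait_and_see, evpi)  # EVPI = 85 - 75 = 10
```

A positive EVPI, as here, is exactly the dissertation's "positive expected value of information": better forecasts of the carbon cap would be worth 10 cost units in expectation.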

  15. Potential for added value in precipitation simulated by high-resolution nested Regional Climate Models and observations

    Energy Technology Data Exchange (ETDEWEB)

    Di Luca, Alejandro; Laprise, Rene [Universite du Quebec a Montreal (UQAM), Centre ESCER (Etude et Simulation du Climat a l' Echelle Regionale), Departement des Sciences de la Terre et de l' Atmosphere, PK-6530, Succ. Centre-ville, B.P. 8888, Montreal, QC (Canada); De Elia, Ramon [Universite du Quebec a Montreal, Ouranos Consortium, Centre ESCER (Etude et Simulation du Climat a l' Echelle Regionale), Montreal (Canada)

    2012-03-15

    Regional Climate Models (RCMs) constitute the most often used method to perform affordable high-resolution regional climate simulations. The key issue in the evaluation of nested regional models is to determine whether RCM simulations improve the representation of climatic statistics compared to the driving data, that is, whether RCMs add value. In this study we examine a necessary condition that some climate statistics derived from the precipitation field must satisfy in order that the RCM technique can generate some added value: we focus on whether the climate statistics of interest contain some fine spatial-scale variability that would be absent on a coarser grid. The presence and magnitude of fine-scale precipitation variance required to adequately describe a given climate statistics will then be used to quantify the potential added value (PAV) of RCMs. Our results show that the PAV of RCMs is much higher for short temporal scales (e.g., 3-hourly data) than for long temporal scales (16-day average data) due to the filtering resulting from the time-averaging process. PAV is higher in warm season compared to cold season due to the higher proportion of precipitation falling from small-scale weather systems in the warm season. In regions of complex topography, the orographic forcing induces an extra component of PAV, no matter the season or the temporal scale considered. The PAV is also estimated using high-resolution datasets based on observations allowing the evaluation of the sensitivity of changing resolution in the real climate system. The results show that RCMs tend to reproduce relatively well the PAV compared to observations although showing an overestimation of the PAV in warm season and mountainous regions. (orig.)
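The necessary condition described above, that a climate statistic must contain fine-scale spatial variance absent from a coarser grid, can be illustrated by block-averaging a field and measuring the residual variance. This is a simplified stand-in for the paper's potential-added-value diagnostic; the fields and block size are invented.

```python
import numpy as np

def fine_scale_variance_ratio(field, block=4):
    """Fraction of spatial variance carried by scales finer than the coarse grid.

    Coarse-grains the field by block averaging (mimicking a coarser model grid)
    and returns var(fine residual) / var(field). If the ratio is near zero, a
    coarser grid already captures the statistic and there is little potential
    added value from higher resolution.
    """
    ny, nx = field.shape
    coarse = field.reshape(ny // block, block, nx // block, block).mean(axis=(1, 3))
    upsampled = np.repeat(np.repeat(coarse, block, axis=0), block, axis=1)
    resid = field - upsampled
    return float(resid.var() / field.var())

rng = np.random.default_rng(3)
x = np.linspace(0, 2 * np.pi, 64)
smooth = np.add.outer(np.sin(x), np.cos(x))           # large-scale pattern only
noisy = smooth + 0.5 * rng.standard_normal((64, 64))  # plus small-scale variability
print(round(fine_scale_variance_ratio(smooth), 3))    # near zero
print(round(fine_scale_variance_ratio(noisy), 3))     # substantially larger
```

Time averaging plays the analogous role in the paper's temporal results: averaging smooths away exactly the small-scale variance this ratio measures, which is why 16-day means show much less potential added value than 3-hourly data.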

  16. Comparison of radiance and polarization values observed in the Mediterranean Sea and simulated in a Monte Carlo model

    DEFF Research Database (Denmark)

    Adams, J.T.; Aas, E.; Højerslev, N.K.;

    2002-01-01

    Measurements of the radiance and degree of polarization made in 1971 in the Mediterranean Sea are presented along with the simulation of all observed quantities by a Monte Carlo technique. It is shown that our independent scattering treatment utilizing a Stokes vector formalism to describe...... the polarization state of the light field produces remarkably good agreement with those values measured in situ. (C) 2002 Optical Society of America...

  17. A comparison of model-based imputation methods for handling missing predictor values in a linear regression model: A simulation study

    Science.gov (United States)

    Hasan, Haliza; Ahmad, Sanizah; Osman, Balkish Mohd; Sapri, Shamsiah; Othman, Nadirah

    2017-08-01

In regression analysis, missing covariate data has been a common problem. Many researchers use ad hoc methods to overcome this problem because of their ease of implementation. However, these methods require assumptions about the data that rarely hold in practice. Model-based methods such as Maximum Likelihood (ML) using the expectation maximization (EM) algorithm and Multiple Imputation (MI) are more promising when dealing with difficulties caused by missing data. Then again, inappropriate methods of missing value imputation can lead to serious bias that severely affects the parameter estimates. The main objective of this study is to provide a better understanding of the missing data concept so as to assist researchers in selecting appropriate missing data imputation methods. A simulation study was performed to assess the effects of different missing data techniques on the performance of a regression model. The covariate data were generated using an underlying multivariate normal distribution, and the dependent variable was generated as a combination of explanatory variables. Missing values in a covariate were simulated using a mechanism called missing at random (MAR). Four levels of missingness (10%, 20%, 30% and 40%) were imposed. The ML and MI techniques available within SAS software were investigated. A linear regression analysis was fitted, and the model performance measures, MSE and R-squared, were obtained. Results of the analysis showed that MI is superior in handling missing data, with the highest R-squared and lowest MSE, when the percentage of missingness is less than 30%. Both methods are unable to handle levels of missingness larger than 30%.
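The study's setup, correlated covariates, a linear response, and MAR missingness, can be reproduced in a few lines. As a contrast to the paper's SAS-based ML and MI procedures (which are not shown here), the sketch compares complete-case analysis with naive mean imputation, a deliberately poor ad hoc baseline; all coefficients and missingness rates are invented.

```python
import numpy as np

rng = np.random.default_rng(4)

# Correlated covariates and a linear response, as in the study's simulation setup.
n = 2000
x1 = rng.normal(size=n)
x2 = 0.6 * x1 + 0.8 * rng.normal(size=n)
y = 1.0 + 2.0 * x1 + 3.0 * x2 + rng.normal(size=n)

# MAR: the probability that x2 is missing depends on the observed x1, not on x2.
p_miss = 0.1 + 0.3 * (x1 > 0)
x2_obs = np.where(rng.random(n) < p_miss, np.nan, x2)

def fit(y, X):
    """Ordinary least squares with an intercept column."""
    X1 = np.column_stack([np.ones(len(X)), X])
    return np.linalg.lstsq(X1, y, rcond=None)[0]

# Complete-case analysis: drop rows where x2 is missing.
keep = ~np.isnan(x2_obs)
beta_cc = fit(y[keep], np.column_stack([x1[keep], x2_obs[keep]]))

# Naive mean imputation: fill gaps with the observed mean (biases the estimates).
x2_mean = np.where(np.isnan(x2_obs), np.nanmean(x2_obs), x2_obs)
beta_mi = fit(y, np.column_stack([x1, x2_mean]))

print(np.round(beta_cc, 2))  # close to the true (1, 2, 3) under this MAR mechanism
print(np.round(beta_mi, 2))  # slope on x2 attenuated by mean imputation
```

The attenuated x2 coefficient under mean imputation is the "serious bias" the abstract warns about, and the motivation for the model-based ML and MI methods the study evaluates.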

  18. Value of bias-corrected satellite rainfall products in SWAT simulations and comparison with other models in the Mara basin

    Science.gov (United States)

    Serrat-Capdevila, A.; Abitew, T. A.; Roy, T.; van Griensven, A.; Valdes, J. B.; Bauwens, W.

    2015-12-01

Hydrometeorological monitoring networks are often limited for basins located in the developing world, such as the transboundary Mara Basin. The advent of earth observing systems has brought satellite rainfall and evapotranspiration products, which can be used to force hydrological models in data-scarce basins. The objective of this study is to develop improved hydrologic simulations using distributed satellite rainfall products (CMORPH and TMPA) with a bias correction, and to compare the performance with different input data and models. The bias correction approach for the satellite products involves the use of a distributed reference dataset (CHIRPS) and historical ground gauge records. We have applied the bias-corrected satellite products to force the Soil and Water Assessment Tool (SWAT) model for the Mara Basin. First, we calibrate the SWAT parameters related to ET simulation using ET from remote sensing. Then, the SWAT parameters that control surface processes are calibrated using the available limited flow data. From the analysis, we noted that both the bias-corrected satellite rainfall and the augmentation of limited flow data with monthly remote sensing ET improve the model's simulation skill and reduce the parameter uncertainty to some extent. We plan to compare these results with those from a lumped model forced by the same satellite rainfall input. This will shed light on the potential of satellite rainfall and remote sensing ET, along with in situ data, for modeling hydrological processes and the inherent uncertainty in a data-scarce basin.
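One common way to bias-correct satellite rainfall against a reference dataset is empirical quantile mapping: each satellite value is replaced by the reference value at the same quantile of a calibration period. The study's actual correction scheme may differ; the gamma-distributed rainfall and the 30% overestimation below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)

def quantile_map(sat, sat_ref, gauge_ref):
    """Empirical quantile mapping: map each satellite value to the reference
    (gauge) value at the same quantile of the calibration period."""
    q = np.searchsorted(np.sort(sat_ref), sat, side="right") / len(sat_ref)
    q = np.clip(q, 0.001, 0.999)
    return np.quantile(gauge_ref, q)

# Synthetic calibration period: the satellite overestimates rainfall by ~30%.
gauge_ref = rng.gamma(shape=2.0, scale=5.0, size=5000)       # "true" rainfall, mean 10
sat_ref = 1.3 * gauge_ref * rng.lognormal(0.0, 0.1, size=5000)

sat_new = 1.3 * rng.gamma(2.0, 5.0, size=2000)               # new satellite retrievals
corrected = quantile_map(sat_new, sat_ref, gauge_ref)

# The corrected mean is pulled back toward the gauge climatology (mean = 10).
print(round(float(sat_new.mean()), 2), round(float(corrected.mean()), 2))
```

Quantile mapping corrects the whole distribution, not just the mean, which matters for hydrological models whose runoff response is sensitive to heavy-rainfall quantiles.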

  19. Multifractal Value at Risk model

    Science.gov (United States)

    Lee, Hojin; Song, Jae Wook; Chang, Woojin

    2016-06-01

In this paper a new Value at Risk (VaR) model is proposed and investigated. We consider the multifractal property of financial time series and develop a multifractal Value at Risk (MFVaR). The MFVaR introduced in this paper is analytically tractable and not based on simulation. An empirical study showed that MFVaR can provide more stable and accurate forecasting performance in volatile financial markets where large losses can be incurred. This implies that our multifractal VaR works well for the risk measurement of extreme credit events.

  20. Simulation modeling of carcinogenesis.

    Science.gov (United States)

    Ellwein, L B; Cohen, S M

    1992-03-01

    A discrete-time simulation model of carcinogenesis is described mathematically using recursive relationships between time-varying model variables. The dynamics of cellular behavior is represented within a biological framework that encompasses two irreversible and heritable genetic changes. Empirical data and biological supposition dealing with both control and experimental animal groups are used together to establish values for model input variables. The estimation of these variables is integral to the simulation process as described in step-by-step detail. Hepatocarcinogenesis in male F344 rats provides the basis for seven modeling scenarios which illustrate the complexity of relationships among cell proliferation, genotoxicity, and tumor risk.
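The recursive structure described above, two irreversible heritable changes with cell proliferation between them, can be sketched as a deterministic discrete-time update of expected cell counts. The rates and growth factors below are invented placeholders, not values fitted to the F344 rat hepatocarcinogenesis data.

```python
import numpy as np

def two_stage_model(n_steps=100, n0=1e6, growth=0.001,
                    mu1=1e-5, mu2=1e-5, init_growth=0.02):
    """Discrete-time two-stage model: normal -> initiated -> malignant cells.

    Per time step (expected values, illustrative rates): the normal pool grows
    slowly and a fraction mu1 becomes initiated; initiated cells proliferate
    faster and a fraction mu2 converts to malignant, which accumulates.
    """
    normal, initiated, malignant = n0, 0.0, 0.0
    history = []
    for _ in range(n_steps):
        new_init = mu1 * normal            # first heritable change
        new_mal = mu2 * initiated          # second heritable change
        normal = normal * (1 + growth) - new_init
        initiated = initiated * (1 + init_growth) + new_init - new_mal
        malignant += new_mal
        history.append(malignant)
    return np.array(history)

m = two_stage_model()
print(m[-1] > 0, bool(np.all(np.diff(m) >= 0)))  # malignant pool grows monotonically
```

Raising `init_growth` (the proliferation advantage of initiated cells) accelerates malignant accumulation far more than raising `mu1` alone, which mirrors the interplay of cell proliferation and genotoxicity the abstract highlights.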

  1. Cross-sector diversification in financial conglomerates: simulations with a fair-value assets and liabilities model

    Directory of Open Access Journals (Sweden)

    Jacob A. Bikker

    2002-12-01

Risk diversification is one of the many reasons for cross-sector mergers of financial institutions. This paper presents a fair-value type asset and liability model in order to identify diversification effects for financial conglomerates (FCs) under various shocks. My analysis for the Netherlands reveals that the diversification effects on FCs of interest rate shocks in particular are very strong. In principle, substantial diversification effects argue for lower capital requirements for FCs. However, there are other non-negligible risks run by FCs to consider, namely contagion risk, regulatory arbitrage, and cross-sector and TBTF moral hazard risks, which have not yet been quantified.

  2. Simulation model estimates of test accuracy and predictive values for the Danish Salmonella surveillance program in dairy herds

    DEFF Research Database (Denmark)

    Warnick, L.D.; Nielsen, L.R.; Nielsen, Jens

    2006-01-01

The Danish government and cattle industry instituted a Salmonella surveillance program in October 2002 to help reduce Salmonella enterica subsp. enterica serotype Dublin (S. Dublin) infections. All dairy herds are tested by measuring antibodies in bulk tank milk at 3-month intervals. The program is based on a well-established ELISA, but the overall test program accuracy and misclassification were not previously investigated. We developed a model to simulate repeated bulk tank milk antibody measurements for dairy herds conditional on true infection status. The distributions of bulk tank milk antibody measurements for infected and noninfected herds were determined from field study data. Herd infection was defined as having either ≥1 Salmonella culture-positive fecal sample or ≥5% within-herd prevalence based on antibody measurements in serum or milk from individual animals. No distinction…

  3. Delay modeling in logic simulation

    Energy Technology Data Exchange (ETDEWEB)

    Acken, J. M.; Goldstein, L. H.

    1980-01-01

    As digital integrated circuit size and complexity increases, the need for accurate and efficient computer simulation increases. Logic simulators such as SALOGS (SAndia LOGic Simulator), which utilize transition states in addition to the normal stable states, provide more accurate analysis than is possible with traditional logic simulators. Furthermore, the computational complexity of this analysis is far lower than that of circuit simulation such as SPICE. An eight-value logic simulation environment allows the use of accurate delay models that incorporate both element response and transition times. Thus, timing simulation with an accuracy approaching that of circuit simulation can be accomplished with an efficiency comparable to that of logic simulation. 4 figures.
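The use of transition states alongside the stable 0 and 1 can be sketched with a multi-valued gate evaluation. The real SALOGS alphabet has eight values; to keep the truth table small, this sketch uses a reduced five-value set (0, 1, X = unknown, R = rising, F = falling) with pessimistic rules of my own choosing.

```python
# Multi-valued logic AND, in the spirit of transition-aware logic simulation.
# Values: "0", "1", "X" (unknown), "R" (rising), "F" (falling).

def mv_and(a, b):
    if a == "0" or b == "0":
        return "0"      # a controlling 0 dominates any other input
    if a == "1":
        return b        # 1 is the identity for AND
    if b == "1":
        return a
    if a == b:
        return a        # R&R stays rising, F&F stays falling, X&X stays unknown
    return "X"          # mixed transitions: output value is unknown

for a in "01XRF":
    print(a, [mv_and(a, b) for b in "01XRF"])
```

Propagating R and F values through gates like this is what lets a logic simulator attach element response and transition times to edges, approaching circuit-simulation timing accuracy at logic-simulation cost.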

  4. Using Short-term Hindcast Skill to Add Confidence to the Choice of Uncertain Model Parameter Values in CESM Climate Change Simulations

    Science.gov (United States)

    Hannay, C.; Neale, R. B.; Rothstein, M.

    2016-12-01

Projections of future climate change are inherently uncertain, and regional details are heavily dependent on coupled climate model formulations. Bernstein and Neelin (2016) show that projections of future climate using the Community Earth System Model (CESM) can vary significantly depending on the (reasonable) values used for important but uncertain model parameters. This includes a wide variation in the tropical precipitation response due to perturbations of parameters inherent to the formulation of the deep convection parameterization. Which model formulation, then, should be trusted most? Since true validation of future projections is of course not possible at present, guidance has to be provided by other proxies. A simple metric whereby the climate model that performs best in a standard present-day (AMIP-type) configuration is trusted most for future climate projections is unsatisfactory here, as only a small tuning effort is required to produce simulations equally skillful to the unperturbed model configuration. Here we employ an alternative approach for "trusting" the future climate projections, based on using CESM for a series of CAPT-type hindcast simulations, mirroring the limited perturbed-parameter ensemble approach of Bernstein and Neelin (2016). Simulation sets are run for the YOTC period of 2009-2010 using CAM5 at 1-degree resolution. In this talk we will show the regional variations of climate change signals in the hydrological cycle in response to deep-convection-dependent parameter sets (e.g., entrainment, timescale) and contrast them with the equivalent hindcast experiments using the same parameter sets. With this analysis we are able to provide guidance as to which parameter value selections result in the highest skill in the hindcasts and how that corresponds with the equivalent CESM future climate change signals.

  5. A simulation model to quantify the value of implementing whole-herd Bovine viral diarrhea virus testing strategies in beef cow-calf herds.

    Science.gov (United States)

    Nickell, Jason S; White, Brad J; Larson, Robert L; Renter, David G; Sanderson, Mike W

    2011-03-01

    Although numerous diagnostic tests are available to identify cattle persistently infected (PI) with Bovine viral diarrhea virus (BVDV) in cow-calf herds, data are sparse when evaluating the economic viability of individual tests or diagnostic strategies. Multiple factors influence BVDV testing in determining if testing should be performed and which strategy to use. A stochastic model was constructed to estimate the value of implementing various whole-herd BVDV cow-calf testing protocols. Three common BVDV tests (immunohistochemistry, antigen-capture enzyme-linked immunosorbent assay, and polymerase chain reaction) performed on skin tissue were evaluated as single- or two-test strategies. The estimated testing value was calculated for each strategy at 3 herd sizes that reflect typical farm sizes in the United States (50, 100, and 500 cows) and 3 probabilities of BVDV-positive herd status (0.077, 0.19, 0.47) based upon the literature. The economic value of testing was the difference in estimated gross revenue between simulated cow-calf herds that either did or did not apply the specific testing strategy. Beneficial economic outcomes were more frequently observed when the probability of a herd being BVDV positive was 0.47. Although the relative value ranking of many testing strategies varied by each scenario, the two-test strategy composed of immunohistochemistry had the highest estimated value in all but one herd size-herd prevalence permutation. These data indicate that the estimated value of applying BVDV whole-herd testing strategies is influenced by the selected strategy, herd size, and the probability of herd BVDV-positive status; therefore, these factors should be considered when designing optimum testing strategies for cow-calf herds.
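The abstract's core computation, the value of testing as the expected revenue difference between tested and untested herds, can be sketched with a toy Monte Carlo model. This is not the authors' stochastic model; all parameters below (test cost, loss per PI animal, sensitivity, PI prevalence) are illustrative assumptions.

```python
import random

def value_of_testing(herd_size, p_positive, pi_prevalence,
                     test_cost, loss_per_pi, se, n_sims=5000, seed=7):
    """Monte Carlo estimate of the value of a whole-herd BVDV test:
    mean revenue difference between testing and not testing."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_sims):
        herd_positive = rng.random() < p_positive
        n_pi = sum(rng.random() < pi_prevalence
                   for _ in range(herd_size)) if herd_positive else 0
        n_detected = sum(rng.random() < se for _ in range(n_pi))
        # Every animal is tested; each detected PI animal avoids the
        # production loss it would otherwise cause.
        total += n_detected * loss_per_pi - herd_size * test_cost
    return total / n_sims

# Testing tends to pay off when the herd is likely infected (p = 0.47)
# but not when infection is improbable (p = 0.077), echoing the abstract:
v_high = value_of_testing(100, 0.47, 0.03, 4.0, 500.0, 0.95)
v_low = value_of_testing(100, 0.077, 0.03, 4.0, 500.0, 0.95)
```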

  6. Coal value chain - simulation model

    CSIR Research Space (South Africa)

    Fourie, M

    2005-08-01

Presentation slides: "Coal Value Chain Simulation Model," Melanie Fourie (Sasol Technology) and Johan Janse van Rensburg (CSIR), copyright reserved 2005, Sasol Technology & Sasol Mining; 19th SAIIE and 35th ORSSA Conference, 2005. Outline: background, simulation objectives, simulation model. [Remainder of the slide text is not recoverable from the scanned full text.]

  7. Simulation Tool for Inventory Models: SIMIN

    OpenAIRE

    Pratiksha Saxen; Tulsi Kushwaha

    2014-01-01

    In this paper, an integrated simulation optimization model for the inventory system is developed. An effective algorithm is developed to evaluate and analyze the back-end stored simulation results. This paper proposes simulation tool SIMIN (Inventory Simulation) to simulate inventory models. SIMIN is a tool which simulates and compares the results of different inventory models. To overcome various practical restrictive assumptions, SIMIN provides values for a number of performance measurement...

  8. Simulating future value in intertemporal choice

    Science.gov (United States)

    Solway, Alec; Lohrenz, Terry; Montague, P. Read

    2017-01-01

The laboratory study of how humans and other animals trade off value and time has a long and storied history, and is the subject of a vast literature. However, despite a long history of study, there is no agreed-upon mechanistic explanation of how intertemporal choice preferences arise. Several theorists have recently proposed model-based reinforcement learning as a candidate framework. This framework describes a suite of algorithms by which a model of the environment, in the form of a state transition function and reward function, can be converted on-line into a decision. The state transition function allows the model-based system to make decisions based on projected future states, while the reward function assigns value to each state, together capturing the necessary components for successful intertemporal choice. Empirical work has also pointed to a possible relationship between increased prospection and reduced discounting. In the current paper, we look for direct evidence of a relationship between temporal discounting and model-based control in a large new data set (n = 168). However, testing the relationship under several different modeling formulations revealed no indication that the two quantities are related. PMID:28225034
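The discounting half of this question is conventionally modeled with hyperbolic discounting, V = A / (1 + kD), where k is the individual's discount rate. A minimal sketch (the reward amounts and k values are illustrative, not from the study):

```python
def hyperbolic_value(amount, delay, k):
    """Subjective present value of a delayed reward under hyperbolic
    discounting: V = A / (1 + k * D)."""
    return amount / (1.0 + k * delay)

def choose(immediate, delayed, delay, k):
    """Pick between a smaller-sooner and a larger-later reward."""
    return "later" if hyperbolic_value(delayed, delay, k) > immediate else "now"

# A shallow discounter (small k) waits; a steep discounter (large k)
# takes the immediate reward:
patient = choose(50, 100, 30, k=0.01)    # 100/(1+0.3) = 76.9 > 50
impulsive = choose(50, 100, 30, k=0.10)  # 100/(1+3.0) = 25.0 < 50
```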

  9. Simulation modeling and arena

    CERN Document Server

    Rossetti, Manuel D

    2015-01-01

    Emphasizes a hands-on approach to learning statistical analysis and model building through the use of comprehensive examples, problems sets, and software applications With a unique blend of theory and applications, Simulation Modeling and Arena®, Second Edition integrates coverage of statistical analysis and model building to emphasize the importance of both topics in simulation. Featuring introductory coverage on how simulation works and why it matters, the Second Edition expands coverage on static simulation and the applications of spreadsheets to perform simulation. The new edition als

  10. The perceived value of using BIM for energy simulation

    Science.gov (United States)

    Lewis, Anderson M.

Building Information Modeling (BIM) is becoming an increasingly important tool in the Architectural, Engineering & Construction (AEC) industries. Some of the benefits associated with BIM include, but are not limited to, cost and time savings through greater trade and design coordination, and more accurate estimating take-offs. BIM is virtual 3D, parametric design software that allows users to store information within a model and can serve as a communication platform between project stakeholders. Likewise, energy simulation is an integral tool for predicting and optimizing a building's performance during design. Creating energy models and running energy simulations can be a time-consuming activity due to the large number of parameters and assumptions that must be addressed to achieve reasonably accurate results. However, leveraging information embedded within Building Information Models (BIMs) has the potential to increase accuracy and reduce the amount of time required to run energy simulations, and can facilitate continuous energy simulations throughout the design process, thus optimizing building performance. Although some literature exists on how design stakeholders perceive the benefits associated with leveraging BIM for energy simulation, little is known about how perceptions associated with leveraging BIM for energy simulation differ between various green design stakeholder user groups. Through an e-survey instrument, this study seeks to determine how perceptions of using BIMs to inform energy simulation differ among distinct design stakeholder groups, which include BIM-only users, energy simulation-only users, and BIM and energy simulation users. Additionally, this study seeks to determine what design stakeholders perceive as the main barriers and benefits of implementing BIM-based energy simulation. Results from this study suggest that little to no correlation exists between green design stakeholders' perceptions of the value associated with using

  11. Use of simulation modeling to estimate herd-level sensitivity, specificity, and predictive values of diagnostic tests for detection of tuberculosis in cattle.

    Science.gov (United States)

    Norby, Bo; Bartlett, Paul C; Grooms, Daniel L; Kaneene, John B; Bruning-Fann, Colleen S

    2005-07-01

The objective was to estimate herd-level sensitivity (HSe), specificity (HSp), and predictive values for a positive (HPVP) and negative (HPVN) test result for several testing scenarios for detection of tuberculosis in cattle by use of simulation modeling. The samples were empirical distributions of all herds (15,468) and herds in a 10-county area (1,016) in Michigan. Five test scenarios were simulated: scenario 1, serial interpretation of the caudal fold tuberculin (CFT) test and comparative cervical test (CCT); scenario 2, serial interpretation of the CFT test and CCT, microbial culture for mycobacteria, and polymerase chain reaction assay; scenario 3, same as scenario 2 but specificity was fixed at 1.0; and scenario 4, sensitivity was 0.9 (scenario 4a) or 0.95 (scenario 4b), and specificity was fixed at 1.0. Estimates for HSe were reasonably high, ranging between 0.712 and 0.840. Estimates for HSp were low when specificity was not fixed at 1.0. Estimates of HPVP were low for scenarios 1 and 2 (0.042 and 0.143, respectively) but increased to 1.0 when specificity was fixed at 1.0. The HPVN remained high for all 5 scenarios, ranging between 0.995 and 0.997. As herd size increased, HSe increased and HSp and HPVP decreased. However, fixing specificity at 1.0 had only minor effects on HSp and HPVN, but HSe was low when the herd size was small. Tests used for detecting cattle herds infected with tuberculosis work well on a herd basis. Herds with fewer than approximately 100 cattle should be tested more frequently or for a longer duration than larger herds to ensure that these small herds are free of tuberculosis.
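Under the simplifying assumption of independent tests, herd-level sensitivity and specificity follow directly from the animal-level values, which makes the reported herd-size effects easy to reproduce. A minimal sketch (not the authors' simulation model; the test characteristics below are illustrative):

```python
def herd_sensitivity(n, prevalence, se, sp):
    """P(at least one test-positive animal | infected herd), assuming
    independent tests and round(n * prevalence) truly infected animals."""
    n_inf = round(n * prevalence)
    return 1.0 - (1.0 - se) ** n_inf * sp ** (n - n_inf)

def herd_specificity(n, sp):
    """P(no test-positive animals | uninfected herd)."""
    return sp ** n

# As herd size grows, herd-level sensitivity rises while herd-level
# specificity erodes -- unless animal-level specificity is fixed at 1.0:
hse_small = herd_sensitivity(50, 0.05, 0.8, 0.995)
hse_large = herd_sensitivity(500, 0.05, 0.8, 0.995)
hsp_small = herd_specificity(50, 0.995)
hsp_large = herd_specificity(500, 0.995)
```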

  12. Participatory Systems Modeling to Explore Sustainable Solutions: Triple-Value Simulation Modeling Cases Tackle Nutrient and Watershed Management from a Socio-Ecological Systems (ses) Perspective

    Science.gov (United States)

    Buchholtz ten Brink, M. R.; Heineman, K.; Foley, G. J.; Ruder, E.; Tanners, N.; Bassi, A.; Fiksel, J.

    2016-12-01

    Decision makers often need assistance in understanding dynamic interactions and linkages among economic, environmental and social systems in coastal watersheds. They also need scientific input to better evaluate potential costs and benefits of alternative policy interventions. The US EPA is applying sustainability science to address these needs. Triple Value (3V) Scoping and Modeling projects bring a systems approach to understand complex environmental problems, incorporate local knowledge, and allow decision-makers to explore policy scenarios. This leads to better understanding of feedbacks and outcomes to both human and environmental systems.The Suffolk County, NY (eastern Long Island) 3V Case uses SES interconnections to explore possible policy options and scenarios for intervention to mitigate the effects of excess nitrogen (N) loading to ground, surface, and estuarine waters. Many of the environmental impacts of N pollution have adverse effects on social and economic well-being and productivity. Key are loss of enjoyment and recreational use of local beach environments and loss of income and revenues from tourism and local fisheries. Stakeholders generated this Problem Statement: Suffolk County is experiencing widespread degradation to groundwater and the coastal marine environment caused by excess nitrogen. How can local stakeholders and decision makers in Suffolk County arrest and reverse this degradation, restore conditions to support a healthy thriving ecosystem, strengthen the County's resilience to emerging and expected environmental threats from global climate change, support and promote economic growth, attract a vibrant and sustainable workforce, and maintain and enhance quality of life and affordability for all County residents? They then built a Causal Loop Diagram of indicators and relationships that reflect these issues and identified a set of alternative policy interventions to address them. The project team conducted an extensive review of

  13. Brand Value - Proposed Model Danrise

    Directory of Open Access Journals (Sweden)

    Daniel Nascimento Pereira da Silva

    2011-12-01

Brands have taken dominance in the strategies of enterprises because they are able to generate feelings, sensations and emotions in their clients. These values, value for the enterprises and for the brands themselves, are not measurable. A strong brand configures itself as the highest representative of an enterprise, and the brand is regarded as an asset of the enterprise. The evolution of a brand, as an intangible and strategic asset, becomes vitally important for enterprises as a way of maximizing results. This need, whether of the market or of the enterprises, justifies directing the research toward this vector: the value of the brand. The main objective of the research is to present a new model of brand evaluation. The model comprises tangible and intangible aspects; the intangible aspect evaluates the knowledge and capacity of managers and workers to build a brand with value through the correct ordering of the priorities of the dimensions of the proposed model. The model was tested on the brand 'Blue Rise'.

  14. Simulating cyber warfare and cyber defenses: information value considerations

    Science.gov (United States)

    Stytz, Martin R.; Banks, Sheila B.

    2011-06-01

Simulating cyber warfare is critical to the preparation of decision-makers for the challenges posed by cyber attacks. Simulation is the only means we have to prepare decision-makers for the inevitable cyber attacks upon the information they will need for decision-making and to develop cyber warfare strategies and tactics. Currently, there is no theory regarding the strategies that should be used to achieve objectives in offensive or defensive cyber warfare, and cyber warfare occurs too rarely to use real-world experience to develop effective strategies. To simulate cyber warfare by affecting the information used for decision-making, we modify the information content of the rings that are compromised in a decision-making context. The number of rings affected and the value of the information that is altered (i.e., the closeness of the ring to the center) are determined by the expertise of the decision-maker and the learning outcome(s) for the simulation exercise. We determine which information rings are compromised using the probability that the simulated cyber defenses that protect each ring can be compromised. These probabilities are based upon prior cyber attack activity in the simulation exercise as well as similar real-world cyber attacks. To determine which information in a compromised "ring" to alter, the simulation environment maintains a record of the cyber attacks that have succeeded in the simulation environment as well as the decision-making context. These two pieces of information are used to compute an estimate of the likelihood that the cyber attack can alter, destroy, or falsify each piece of information in a compromised ring. The unpredictability of information alteration in our approach adds greater realism to the cyber event. This paper suggests a new technique that can be used for cyber warfare simulation, the ring approach for modeling context-dependent information value, and our means for considering information value when assigning cyber

  15. Computer Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Pronskikh, V. S. [Fermilab

    2014-05-09

Verification and validation of computer codes and models used in simulation are two aspects of scientific practice of high importance that have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model's relation to the real world and its intended use. It has been argued that, because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction needs to be made between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them). Holding to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement, I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviate the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.

  16. Mathematical modeling and simulation in animal health - Part II: principles, methods, applications, and value of physiologically based pharmacokinetic modeling in veterinary medicine and food safety assessment.

    Science.gov (United States)

    Lin, Z; Gehring, R; Mochel, J P; Lavé, T; Riviere, J E

    2016-10-01

    This review provides a tutorial for individuals interested in quantitative veterinary pharmacology and toxicology and offers a basis for establishing guidelines for physiologically based pharmacokinetic (PBPK) model development and application in veterinary medicine. This is important as the application of PBPK modeling in veterinary medicine has evolved over the past two decades. PBPK models can be used to predict drug tissue residues and withdrawal times in food-producing animals, to estimate chemical concentrations at the site of action and target organ toxicity to aid risk assessment of environmental contaminants and/or drugs in both domestic animals and wildlife, as well as to help design therapeutic regimens for veterinary drugs. This review provides a comprehensive summary of PBPK modeling principles, model development methodology, and the current applications in veterinary medicine, with a focus on predictions of drug tissue residues and withdrawal times in food-producing animals. The advantages and disadvantages of PBPK modeling compared to other pharmacokinetic modeling approaches (i.e., classical compartmental/noncompartmental modeling, nonlinear mixed-effects modeling, and interspecies allometric scaling) are further presented. The review finally discusses contemporary challenges and our perspectives on model documentation, evaluation criteria, quality improvement, and offers solutions to increase model acceptance and applications in veterinary pharmacology and toxicology.
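One of the named applications, predicting tissue residues and withdrawal times, can be illustrated with a far simpler model than PBPK: mono-exponential (first-order) residue depletion. All values below are hypothetical; a real PBPK model tracks multiple organ compartments linked by physiological blood flows.

```python
import math

def withdrawal_days(c0, half_life_days, tolerance):
    """Days until tissue concentration decays below the regulatory
    tolerance, assuming first-order depletion C(t) = C0 * exp(-k t)."""
    k = math.log(2.0) / half_life_days
    return math.log(c0 / tolerance) / k

# Hypothetical example: 10 ug/kg initial residue, 2-day tissue
# half-life, 0.1 ug/kg tolerance -> roughly 13.3 days.
wt = withdrawal_days(c0=10.0, half_life_days=2.0, tolerance=0.1)
```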

  17. Theory Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Shlachter, Jack [Los Alamos National Laboratory

    2012-08-23

    Los Alamos has a long history in theory, modeling and simulation. We focus on multidisciplinary teams that tackle complex problems. Theory, modeling and simulation are tools to solve problems just like an NMR spectrometer, a gas chromatograph or an electron microscope. Problems should be used to define the theoretical tools needed and not the other way around. Best results occur when theory and experiments are working together in a team.

  18. Achieving Value in Primary Care: The Primary Care Value Model.

    Science.gov (United States)

    Rollow, William; Cucchiara, Peter

    2016-03-01

    The patient-centered medical home (PCMH) model provides a compelling vision for primary care transformation, but studies of its impact have used insufficiently patient-centered metrics with inconsistent results. We propose a framework for defining patient-centered value and a new model for value-based primary care transformation: the primary care value model (PCVM). We advocate for use of patient-centered value when measuring the impact of primary care transformation, recognition, and performance-based payment; for financial support and research and development to better define primary care value-creating activities and their implementation; and for use of the model to support primary care organizations in transformation.

  19. Analysis of Macro-micro Simulation Models for Service-Oriented Public Platform: Coordination of Networked Services and Measurement of Public Values

    Science.gov (United States)

    Kinoshita, Yumiko

As service sectors are a major driver of the growth of the world economy, we are challenged to implement service-oriented infrastructure as an e-Gov platform to achieve further growth and innovation in both developed and developing countries. Recent trends in the service industry clarify that the main factors in the growth of service sectors are investment in knowledge, trade, and the enhanced capacity of micro, small, and medium-sized enterprises (MSMEs). In addition, the design and deployment of a public service platform require an appropriate evaluation methodology. Reflecting these observations, this paper proposes a macro-micro simulation approach to assess public values (PV), focusing on MSMEs. Linkage aggregate variables (LAVs) are defined to show the connection between macro and micro impacts of public services. As a result, the relationships among demography, business environment, macro economy, and socio-economic impact are clarified and their values quantified from the behavioral perspectives of citizens and firms.

  20. Modelling and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Casetti, E.; Vogt, W.G.; Mickle, M.H.

    1984-01-01

    This conference includes papers on the uses of supercomputers, multiprocessors, artificial intelligence and expert systems in various energy applications. Topics considered include knowledge-based expert systems for power engineering, a solar air conditioning laboratory computer system, multivariable control systems, the impact of power system disturbances on computer systems, simulating shared-memory parallel computers, real-time image processing with multiprocessors, and network modeling and simulation of greenhouse solar systems.

  1. Validation of simulation models

    DEFF Research Database (Denmark)

    Rehman, Muniza; Pedersen, Stig Andur

    2012-01-01

In philosophy of science, interest in computational models and simulations has increased heavily during the past decades. Different positions regarding the validity of models have emerged, but these views have not succeeded in capturing the diversity of validation methods; the discussion has been somewhat narrow-minded, reducing the notion of validation to the establishment of truth. The wide variety of models with regard to their purpose, character, field of application and time dimension inherently calls for a similar diversity in validation approaches. A classification of models in terms of the mentioned elements is presented and used to shed light on possible types of validation. This article puts forward the diversity in applications of simulation models that demands a corresponding diversity in the notion of validation.

  2. Can broader diffusion of value-based insurance design increase benefits from US health care without increasing costs? Evidence from a computer simulation model.

    Directory of Open Access Journals (Sweden)

    R Scott Braithwaite

    2010-02-01

BACKGROUND: Evidence suggests that cost sharing (i.e., copayments and deductibles) decreases health expenditures but also reduces essential care. Value-based insurance design (VBID) has been proposed to encourage essential care while controlling health expenditures. Our objective was to estimate the impact of broader diffusion of VBID on US health care benefits and costs. METHODS AND FINDINGS: We used a published computer simulation of costs and life expectancy gains from US health care to estimate the impact of broader diffusion of VBID. Two scenarios were analyzed: (1) applying VBID solely to pharmacy benefits and (2) applying VBID to both pharmacy benefits and other health care services (e.g., devices). We assumed that cost sharing would be eliminated for high-value services ($300,000 per life-year). All costs are provided in 2003 US dollars. Our simulation estimated that approximately 60% of health expenditures in the US are spent on low-value services, 20% are spent on intermediate-value services, and 20% are spent on high-value services. Correspondingly, the vast majority (80%) of health expenditures would have cost sharing that is impacted by VBID. With prevailing patterns of cost sharing, health care conferred 4.70 life-years at a per-capita annual expenditure of US$5,688. Broader diffusion of VBID to pharmaceuticals increased the benefit conferred by health care by 0.03 to 0.05 additional life-years, without increasing costs and without increasing out-of-pocket payments. Broader diffusion of VBID to other health care services could increase the benefit conferred by health care by 0.24 to 0.44 additional life-years, also without increasing costs and without increasing overall out-of-pocket payments. Among those without health insurance, using cost savings from VBID to subsidize insurance coverage would increase the benefit conferred by health care by 1.21 life-years, a 31% increase.
CONCLUSION: Broader diffusion of VBID may amplify benefits from
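The tiering logic behind VBID can be sketched with the abstract's own 60/20/20 expenditure split of the US$5,688 per-capita spend. The cost-per-life-year figures assigned to each tier below are hypothetical, as is the size of the cost-neutral shift:

```python
def life_years(spending, cost_per_life_year):
    """Life-years purchased: sum of dollars / ($ per life-year) by tier."""
    return sum(s / c for s, c in zip(spending, cost_per_life_year))

# Per-capita spending split into low / intermediate / high value tiers,
# approximating 60/20/20% of US$5,688:
spend = [3413.0, 1138.0, 1137.0]
cost_per_ly = [600_000.0, 150_000.0, 30_000.0]  # hypothetical $/life-year

base = life_years(spend, cost_per_ly)

# A cost-neutral VBID reform: steer $200 from the low- to the
# high-value tier (total expenditure unchanged).
shifted = [3213.0, 1138.0, 1337.0]
vbid = life_years(shifted, cost_per_ly)
```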

  3. The models of phase transformations definition in the software Deform and their effect on the output values from the numerical simulation of gear thermal processing.

    Directory of Open Access Journals (Sweden)

    Sona Benesova

    2014-11-01

With the aid of DEFORM® software it is possible to conduct numerical simulation of workpiece phase composition during and after heat treatment. The computation can be based either on the graphical representation of the TTT diagram of the steel in question or on one of the mathematical models integrated in the software, the latter being applicable if the required constants are known. The present paper evaluates the differences between results of numerical simulations with various definitions of phase transformation for the heat treatment of a gearwheel and a specially prepared specimen of simple shape. It was found that thorough preparation of the input data, i.e., careful mapping of the material's characteristics, is essential.

  4. Wake modeling and simulation

    DEFF Research Database (Denmark)

    Larsen, Gunner Chr.; Madsen Aagaard, Helge; Larsen, Torben J.;

… have the potential to include also mutual wake interaction phenomena. The basic conjecture behind the dynamic wake meandering (DWM) model is that wake transportation in the atmospheric boundary layer is driven by the large-scale lateral and vertical turbulence components. Based on this conjecture, … and trailed vorticity has been approached by a simple semi-empirical model essentially based on an eddy viscosity philosophy. Contrary to previous attempts to model wake loading, the DWM approach opens for a unifying description in the sense that turbine power and load aspects can be treated simultaneously. … The methodology has been implemented in the aeroelastic code HAWC2, and example simulations of wake situations from the small Tjæreborg wind farm have been performed, showing satisfactory agreement between predictions and measurements.

  5. What's the Value of VAM (Value-Added Modeling)?

    Science.gov (United States)

    Scherrer, Jimmy

    2012-01-01

    The use of value-added modeling (VAM) in school accountability is expanding, but deciding how to embrace VAM is difficult. Various experts say it's too unreliable, causes more harm than good, and has a big margin for error. Others assert VAM is imperfect but useful, and provides valuable feedback. A closer look at the models, and their use,…

  6. PDOP values for simulated GPS/Galileo positioning

    DEFF Research Database (Denmark)

    Cederholm, Jens Peter

    2005-01-01

PDOP values for a combined GPS/Galileo constellation are estimated at 4,140 points around the earth. The simulation, carried out with a 15-degree cut-off angle, is repeated every 10 minutes for 72 hours. The simulation shows that mean PDOP values improve significantly when using a combined system compared to using only GPS.
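PDOP itself is computed from the satellite-receiver geometry: the rows of the design matrix are unit line-of-sight vectors plus a receiver-clock column, and PDOP is the root of the trace of the position block of (AᵀA)⁻¹. A minimal sketch with made-up satellite positions (not the study's constellation or grid):

```python
import numpy as np

def pdop(sat_positions, receiver=(0.0, 0.0, 0.0)):
    """Position dilution of precision from satellite geometry."""
    los = np.asarray(sat_positions, float) - np.asarray(receiver, float)
    unit = los / np.linalg.norm(los, axis=1, keepdims=True)
    A = np.hstack([unit, np.ones((len(los), 1))])  # add clock column
    Q = np.linalg.inv(A.T @ A)                     # cofactor matrix
    return float(np.sqrt(np.trace(Q[:3, :3])))     # position block only

# Well-spread satellites give a smaller (better) PDOP than satellites
# bunched in one part of the sky:
spread = [(1e7, 0, 2e7), (-1e7, 0, 2e7), (0, 1e7, 2e7),
          (0, -1e7, 2e7), (0, 0, 2.5e7)]
bunched = [(1e7, 0, 2e7), (1.1e7, 1e6, 2e7), (9e6, -1e6, 2e7),
           (1e7, 2e6, 2.1e7), (1e7, 0, 2.2e7)]
```

More visible satellites, as with a combined GPS/Galileo constellation, add rows to A and can only improve the geometry.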

  7. Simulation Study on Heat Value Control System of Natural Gas Used for Color TV Tubes Production

    Institute of Scientific and Technical Information of China (English)

    ZHENG Bin

    2006-01-01

In order to characterize the heat value control system, determine the influence of natural gas quality and flow on the heat value, and learn how to adjust the parameters of the control system, a model of the whole system is established and simulated in Matlab/Simulink. The simulation result shows that the feedback system with a feed-forward block controls the heat value very well; the simulation result can effectively guide the engineering design of the heat value control system and improve engineering efficiency.
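The control structure described, feedback on the measured heat value plus a feed-forward term on the gas-quality disturbance, can be sketched as a discrete-time loop. This is a toy first-order plant with illustrative gains, not the paper's Simulink model:

```python
def simulate(feedforward, setpoint=36.0, steps=300, kp=0.4, ki=0.05):
    """PI feedback on a first-order mixing plant whose output heating
    value (MJ/m^3) is hit by a feed-gas quality disturbance at t = 100;
    the optional feed-forward term cancels the measured disturbance.
    Returns the worst deviation from setpoint after the disturbance."""
    h, integral, worst = 34.0, 0.0, 0.0
    for t in range(steps):
        disturbance = 2.0 if t > 100 else 0.0
        error = setpoint - h
        integral += error
        u = kp * error + ki * integral
        if feedforward:
            u += disturbance                     # feed-forward cancellation
        h += 0.2 * (34.0 + u - disturbance - h)  # first-order plant update
        if t > 100:
            worst = max(worst, abs(setpoint - h))
    return worst
```

With feed-forward the measured disturbance is cancelled at the input, so the output barely deviates; pure feedback must wait for the integral term to catch up.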

  8. Proving the ecosystem value through hydrological modelling

    Science.gov (United States)

    Dorner, W.; Spachinger, K.; Porter, M.; Metzka, R.

    2008-11-01

Ecosystems provide valuable functions. Natural floodplains and river structures offer several types of ecosystem functions, such as habitat, recreational area and natural detention. From an economic standpoint, the loss (or rehabilitation) of these natural systems and the services they provide can be valued as a damage (or benefit). Consequently, these natural goods and services must be economically valued in project assessments, e.g., cost-benefit analysis or cost comparison. Especially in smaller catchments and river systems, significant evidence exists that natural flood detention reduces flood risk and contributes to flood protection. Several research projects have evaluated the mitigating effect of land use, river training and the loss of natural floodplains on the development, peak and volume of floods. The presented project analyses the hypothesis that ignoring natural detention and hydrological ecosystem services could result in economically inefficient solutions for flood protection and mitigation. In test areas, subcatchments of the Danube in Germany, a combination of hydrological and hydrodynamic models with economic evaluation techniques was applied. Different forms of land use, river structure and flood protection measures were assessed and compared from a hydrological and economic point of view. A hydrodynamic model was used to simulate flows, to assess the extent of flood-affected areas and damages to buildings and infrastructure, and to investigate the impacts of levees and river structure on a local scale. These model results provided the basis for an economic assessment. Different economic valuation techniques, such as flood damage functions, the cost comparison method and the substitution approach, were used to compare the outcomes of different hydrological scenarios from an economic point of view and to value the ecosystem service. The results give significant evidence that natural detention must be evaluated as part of flood mitigation projects.
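A common way to monetize the detention service in such studies is to integrate flood damage functions over annual exceedance probability, giving the expected annual damage (EAD); the benefit of detention is the EAD difference between scenarios. The damage figures below are illustrative, not from the project:

```python
def expected_annual_damage(events):
    """Trapezoidal integral of damage over annual exceedance
    probability; `events` is a list of (probability, damage) pairs."""
    pts = sorted(events)
    return sum(0.5 * (d1 + d2) * (p2 - p1)
               for (p1, d1), (p2, d2) in zip(pts, pts[1:]))

# Hypothetical damage estimates with and without natural detention,
# as (annual exceedance probability, damage in EUR):
without_detention = [(0.01, 5_000_000.0), (0.1, 1_000_000.0), (0.5, 100_000.0)]
with_detention = [(0.01, 3_500_000.0), (0.1, 400_000.0), (0.5, 0.0)]

annual_benefit = (expected_annual_damage(without_detention)
                  - expected_annual_damage(with_detention))
```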

  9. Mean Value Modelling of Turbocharged SI Engines

    DEFF Research Database (Denmark)

    Müller, Martin; Hendricks, Elbert; Sorenson, Spencer C.

    1998-01-01

This paper describes the development of a computer simulation to predict the performance of a turbocharged spark ignition engine during transient operation. New models have been developed for the turbocharger and the intercooling system. An adiabatic model for the intake manifold is presented.
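A typical building block of such mean value engine models is the manifold filling dynamic, dp/dt = (RT/V)(ṁ_in − ṁ_out). The sketch below uses an isothermal simplification (the paper presents an adiabatic formulation) with illustrative parameter values:

```python
def manifold_pressure(p0, throttle_flow, pumping_gain,
                      V=0.003, T=300.0, R=287.0, dt=1e-3, steps=2000):
    """Isothermal manifold filling: dp/dt = (R*T/V)*(mdot_in - mdot_out),
    with port (engine) mass flow taken roughly proportional to manifold
    pressure. Forward-Euler integration; p in Pa, flows in kg/s."""
    p = p0
    for _ in range(steps):
        mdot_out = pumping_gain * p / 1e5   # kg/s per bar of pressure
        p += dt * (R * T / V) * (throttle_flow - mdot_out)
    return p

# Pressure settles where throttle inflow balances engine pumping:
# 0.01 kg/s in, gain 0.02 kg/s per bar  ->  p_ss = 0.5 bar = 5e4 Pa.
p_ss = manifold_pressure(p0=3e4, throttle_flow=0.01, pumping_gain=0.02)
```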

  10. Gap Model for Dual Customer Values

    Institute of Scientific and Technical Information of China (English)

    HOU Lun; TANG Xiaowo

    2008-01-01

    The customer value, the key problem in customer relationship management (CRM), was studied to construct a gap model for dual customer values. A basic description of customer values is given, and then the gaps between products and services in different periods for the customers and companies are analyzed based on the product or service life-cycle. The main factors that influence the perceived customer value were analyzed to define the "recognized value gap" and a gap model for the dual customer values was constructed to supply companies with a tool to analyze existing customer value gaps and improve customer relationship management.

  11. Preparations, models, and simulations.

    Science.gov (United States)

    Rheinberger, Hans-Jörg

    2015-01-01

    This paper proposes an outline for a typology of the different forms that scientific objects can take in the life sciences. The first section discusses preparations (or specimens)--a form of scientific object that accompanied the development of modern biology in different guises from the seventeenth century to the present: as anatomical-morphological specimens, as microscopic cuts, and as biochemical preparations. In the second section, the characteristics of models in biology are discussed. They became prominent from the end of the nineteenth century onwards. Some remarks on the role of simulations--characterising the life sciences of the turn from the twentieth to the twenty-first century--conclude the paper.

  12. Simulating positive-operator-valued measures with projective measurements

    OpenAIRE

    Oszmaniec, Michał; Guerini, Leonardo; Wittek, Peter; Acín, Antonio

    2016-01-01

    Standard projective measurements represent a subset of all possible measurements in quantum physics, defined by positive-operator-valued measures. We study what quantum measurements are projective simulable, that is, can be simulated by using projective measurements and classical randomness. We first prove that every measurement on a given quantum system can be realised by classical processing of projective measurements on the system plus an ancilla of the same dimension. Then, given a genera...

  13. Value Modeling for Enterprise Resilience

    Energy Technology Data Exchange (ETDEWEB)

    Henderson, Dale L.; Lancaster, Mary J.

    2015-10-20

    The idea that resilience is a tangible, measurable, and desirable system attribute has grown rapidly over the last decade beyond its origins in explaining ecological, physiological, psychological, and social systems. Operational enterprise resilience requires two types of measurement. First, the system must monitor various operational conditions in order to respond to disruptions. These measurements are part of one or more observation, orientation, decision, and action (OODA) loops. The OODA control processes that implement a resilience strategy use these measurements to provide robustness, rapid recovery and reconstitution. In order to assess the effectiveness of the resilience strategy, a different class of measurements is necessary. This second type consists of measurements of how well the OODA processes cover critical enterprise functions and the hazards to which the enterprise is exposed. They allow assessment of how well enterprise management processes anticipate, mitigate, and adapt to a changing environment and the degree to which the system is fault tolerant. This paper nominates a theoretical framework, in the form of definitions, a model, and a syntax, that accounts for this important distinction, and in so doing provides a mechanism for bridging resilience management process models and the many proposed cyber-defense metric enumerations.

  14. Appropriate model selection methods for nonstationary generalized extreme value models

    Science.gov (United States)

    Kim, Hanbeen; Kim, Sooyoung; Shin, Hongjoon; Heo, Jun-Haeng

    2017-04-01

    Considerable evidence has been found to date that hydrologic data series are nonstationary in nature. This has motivated many studies in the area of nonstationary frequency analysis. Nonstationary probability distribution models involve parameters that vary over time. Therefore, it is not a straightforward process to apply conventional goodness-of-fit tests to the selection of an appropriate nonstationary probability distribution model. Tests that are generally recommended for such a selection include the Akaike information criterion (AIC), the corrected Akaike information criterion (AICc), the Bayesian information criterion (BIC), and the likelihood ratio test (LRT). In this study, a Monte Carlo simulation was performed to compare the performances of these four tests with regard to nonstationary as well as stationary generalized extreme value (GEV) distributions. Proper model selection ratios and sample sizes were taken into account to evaluate the performances of all four tests. The BIC demonstrated the best performance with regard to stationary GEV models. In the case of nonstationary GEV models, the AIC proved to be better than the other three methods when relatively small sample sizes were considered. With larger sample sizes, the AIC, BIC, and LRT presented the best performances for GEV models that have nonstationary location and/or scale parameters. Simulation results were then evaluated by applying all four tests to annual maximum rainfall data of selected sites, as observed by the Korea Meteorological Administration.
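
The four selection criteria compared above are simple functions of the maximized log-likelihood. A minimal sketch of the formulas, using invented log-likelihoods and sample size (a real analysis would obtain these from stationary and nonstationary GEV fits):

```python
import math

def aic(loglik, k):
    """Akaike information criterion: 2k - 2*loglik."""
    return 2 * k - 2 * loglik

def aicc(loglik, k, n):
    """Small-sample corrected AIC."""
    return aic(loglik, k) + 2 * k * (k + 1) / (n - k - 1)

def bic(loglik, k, n):
    """Bayesian information criterion: k*ln(n) - 2*loglik."""
    return k * math.log(n) - 2 * loglik

def lrt_statistic(loglik_restricted, loglik_full):
    """Likelihood ratio test statistic for nested models."""
    return 2 * (loglik_full - loglik_restricted)

# Hypothetical fits to n = 50 annual maxima: a stationary GEV (3 parameters)
# vs. a nonstationary GEV with a linear trend in location (4 parameters).
n = 50
ll_stat, ll_nonstat = -180.2, -179.0
aic_prefers_trend = aic(ll_nonstat, 4) < aic(ll_stat, 3)        # True
bic_prefers_trend = bic(ll_nonstat, 4, n) < bic(ll_stat, 3, n)  # False
```

On this toy example AIC and BIC disagree, which mirrors the paper's finding that the best-performing criterion depends on sample size and on which GEV parameters are nonstationary.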

  15. MODELING A VALUE CHAIN IN PUBLIC SECTOR

    Directory of Open Access Journals (Sweden)

    Daiva Rapcevičienė

    2014-08-01

    Full Text Available Purpose – Over the past three decades comprehensive insights were made in order to design and manage the value chain. Many scholars discuss the differences between the private sector value chain – creating profit for the business – and the public sector value chain, the approach that the public sector creates value through the services that it provides. However, there is a lack of a common understanding of what a public sector value chain is in general. This paper reviews the literature on how the private value chain was transformed into the public value chain, and reviews the determination and architecture of a value chain in the public sector, which gives a structural approach to the greater picture of how the whole structure works. It reviews the approach that the value chain for the public sector shows how the public sector organizes itself to ensure it is of value to the citizens. Design/methodology/approach – descriptive method, analysis of scientific literature. Findings – The public sector value chain is an adaptation of the private sector value chain. The difference between the two is that the customer is the focus in the public sector context, versus the profit focus in the private sector context. There are significant similarities between the two chain models. Each of the chain models is founded on a series of core components. For the public sector context, the core components are people, service and trust. Research limitations/implications – this paper is based on presenting the value chain for both private and public sectors and giving deeper knowledge of the public sector value chain model. Practical implications – comprehension of the general value chain model concept and the public sector value chain model helps to see multiple connections throughout the entire process: from the beginning to the end. The paper presents the theoretical framework for further study of the value chain model for waste management creation. Originality/Value – The paper reveals the systematic

  16. Uterine Contraction Modeling and Simulation

    Science.gov (United States)

    Liu, Miao; Belfore, Lee A.; Shen, Yuzhong; Scerbo, Mark W.

    2010-01-01

    Building a training system for medical personnel to properly interpret fetal heart rate tracing requires developing accurate models that can relate various signal patterns to certain pathologies. In addition to modeling the fetal heart rate signal itself, the change of uterine pressure that bears strong relation to fetal heart rate and provides indications of maternal and fetal status should also be considered. In this work, we have developed a group of parametric models to simulate uterine contractions during labor and delivery. Through analysis of real patient records, we propose to model uterine contraction signals by three major components: regular contractions, impulsive noise caused by fetal movements, and low amplitude noise invoked by maternal breathing and measuring apparatus. The regular contractions are modeled by an asymmetric generalized Gaussian function and least squares estimation is used to compute the parameter values of the asymmetric generalized Gaussian function based on uterine contractions of real patients. Regular contractions are detected based on thresholding and derivative analysis of uterine contractions. Impulsive noise caused by fetal movements and low amplitude noise by maternal breathing and measuring apparatus are modeled by rational polynomial functions and Perlin noise, respectively. Experiment results show the synthesized uterine contractions can mimic the real uterine contractions realistically, demonstrating the effectiveness of the proposed algorithm.
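
An asymmetric generalized Gaussian of the kind described can be written with separate width and shape parameters on each side of the peak. The sketch below assumes one plausible parameterisation; the paper's exact functional form and its least-squares-fitted values may differ, and all numbers here are invented:

```python
import math

def asym_gen_gaussian(t, amp, t_peak, s_left, s_right, p_left, p_right):
    """Asymmetric generalized Gaussian: amp * exp(-(|t - t_peak| / s)**p),
    with width s and shape p chosen per side of the peak."""
    if t < t_peak:
        s, p = s_left, p_left
    else:
        s, p = s_right, p_right
    return amp * math.exp(-(abs(t - t_peak) / s) ** p)

# A synthetic contraction peaking at t = 60 s (amplitude 80 mmHg) that
# rises faster than it decays -- the typical asymmetric shape.
curve = [asym_gen_gaussian(t, 80.0, 60.0, 15.0, 25.0, 2.0, 1.5)
         for t in range(0, 181)]
```

In the paper's pipeline, curves like this would be summed over detected contractions and then overlaid with impulsive noise (fetal movements) and low-amplitude Perlin-style noise (breathing, apparatus).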

  17. Notes on modeling and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Redondo, Antonio [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-03-10

    These notes present a high-level overview of how modeling and simulation are carried out by practitioners. The discussion is of a general nature; no specific techniques are examined but the activities associated with all modeling and simulation approaches are briefly addressed. There is also a discussion of validation and verification and, at the end, a section on why modeling and simulation are useful.

  18. Incorporating Customer Lifetime Value into Marketing Simulation Games

    Science.gov (United States)

    Cannon, Hugh M.; Cannon, James N.; Schwaiger, Manfred

    2010-01-01

    Notwithstanding the emerging prominence of customer lifetime value (CLV) and customer equity (CE) in the marketing literature during the past decade, virtually nothing has been done to address these concepts in the literature on simulation and gaming. This article addresses this failing, discussing the nature of CLV and CE and demonstrating how…

  19. Predictability of extreme values in geophysical models

    NARCIS (Netherlands)

    Sterk, A.E.; Holland, M.P.; Rabassa, P.; Broer, H.W.; Vitolo, R.

    2012-01-01

    Extreme value theory in deterministic systems is concerned with unlikely large (or small) values of an observable evaluated along evolutions of the system. In this paper we study the finite-time predictability of extreme values, such as convection, energy, and wind speeds, in three geophysical models.

  20. Simulation of the Present Value of Perpetuity in the Vasicek Interest Model

    Institute of Scientific and Technical Information of China (English)

    段白鸽; 张连增

    2012-01-01

    The perpetuity whose interest accumulation function is a Brownian motion with positive drift has been investigated thoroughly, and its present value follows an inverse gamma distribution. When the force of interest is a Vasicek process, an analytical expression for the distribution of the present value of the perpetuity is difficult to obtain. We first consider the perpetuity when the force of interest is a Vasicek process, and provide an explicit expression for the mean of the present value of the perpetuity using the hypergeometric function 1F1. In addition, using the software Mathematica, some numerical results are given. As a benchmark, we then simulate the present value of the perpetuity in the Vasicek model, implemented with the R software. For the mean of the present value of the perpetuity, the simulated results fit the theoretical solution very well.
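
The benchmark simulation described can be reproduced in outline with a simple Euler scheme: simulate the Vasicek force of interest, accumulate the discount factor, and integrate the discounted payments up to a horizon where discounting has made the tail negligible. The paper uses R; this Python sketch uses invented parameter values and a truncated horizon, so it is an illustration of the method, not the paper's computation:

```python
import math
import random

def perpetuity_pv_vasicek(r0=0.05, a=0.2, b=0.05, sigma=0.01,
                          dt=0.05, horizon=100.0, rng=None):
    """One Monte Carlo draw of the present value of a continuous perpetuity
    int_0^inf exp(-int_0^t r_s ds) dt, with the force of interest following
    the Vasicek SDE dr = a*(b - r) dt + sigma dW (Euler discretisation;
    the finite horizon truncates the integral once discounting is heavy)."""
    rng = rng or random.Random()
    r, integral_r, pv = r0, 0.0, 0.0
    for _ in range(int(horizon / dt)):
        pv += math.exp(-integral_r) * dt   # discounted payment over [t, t+dt)
        integral_r += r * dt               # accumulated force of interest
        r += a * (b - r) * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
    return pv

rng = random.Random(42)
draws = [perpetuity_pv_vasicek(rng=rng) for _ in range(200)]
mean_pv = sum(draws) / len(draws)   # near 1/b = 20 when sigma is small
```

A sample mean like this is what would be checked against the closed-form mean expressed via the hypergeometric function 1F1.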

  1. An Interval-Valued Approach to Business Process Simulation Based on Genetic Algorithms and the BPMN

    Directory of Open Access Journals (Sweden)

    Mario G.C.A. Cimino

    2014-05-01

    Full Text Available Simulating organizational processes characterized by interacting human activities, resources, business rules and constraints is a challenging task, because of the inherent uncertainty, inaccuracy, variability and dynamicity. With regard to this problem, currently available business process simulation (BPS) methods and tools are unable to efficiently capture the process behavior along its lifecycle. In this paper, a novel approach to BPS is presented. To build and manage simulation models according to the proposed approach, a simulation system is designed, developed and tested on pilot scenarios, as well as on real-world processes. The proposed approach exploits interval-valued data to represent model parameters, in place of conventional single-valued or probability-valued parameters. Indeed, an interval-valued parameter is comprehensive; it is the easiest to understand and express and the simplest to process among multi-valued representations. In order to compute the interval-valued output of the system, a genetic algorithm is used. The resulting process model allows forming mappings at different levels of detail and, therefore, at different model resolutions. The system has been developed as an extension of a publicly available simulation engine, based on the Business Process Model and Notation (BPMN) standard.
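
The interval-valued idea can be illustrated with plain interval arithmetic over BPMN-style sequence and exclusive-choice constructs. The activity names and durations below are invented, and the real system derives output intervals with a genetic algorithm rather than by exact propagation:

```python
# Each activity duration is an interval (lo, hi) in minutes rather than a
# single number; composition propagates the bounds.
def seq(a, b):
    """Activities in sequence: interval bounds add."""
    return (a[0] + b[0], a[1] + b[1])

def alt(a, b):
    """Exclusive choice (XOR gateway): envelope of both branches."""
    return (min(a[0], b[0]), max(a[1], b[1]))

review  = (5.0, 15.0)     # hypothetical activity durations
approve = (2.0, 4.0)
rework  = (10.0, 30.0)

# review, then either approve or rework
total = seq(review, alt(approve, rework))
```

A single interval output such as `total` is easier to elicit from and explain to process owners than a full probability distribution, which is the motivation the abstract gives for interval-valued parameters.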

  2. Modeling Business Strategy: A Consumer Value Perspective

    OpenAIRE

    Svee, Eric-Oluf; Giannoulis, Constantinos; Zdravkovic, Jelena

    2011-01-01

    Part 3: Business Modeling; International audience; Business strategy lays out the plan of an enterprise to achieve its vision by providing value to its customers. Typically, business strategy focuses on economic value and its relevant exchanges with customers and does not directly address consumer values. However, consumer values drive customers’ choices and decisions to use a product or service, and therefore should have a direct impact on business strategy. This paper explores whether and h...

  3. The Mixed Instrumental Controller: Using Value of Information to Combine Habitual Choice and Mental Simulation

    Directory of Open Access Journals (Sweden)

    Giovanni ePezzulo

    2013-03-01

    Full Text Available Instrumental behavior depends on both goal-directed and habitual mechanisms of choice. Normative views cast these mechanisms in terms of model-free and model-based methods of reinforcement learning, respectively. An influential proposal hypothesizes that model-free and model-based mechanisms coexist and compete in the brain according to their relative uncertainty. In this paper we propose a novel view in which a single Mixed Instrumental Controller produces both goal-directed and habitual behavior by flexibly balancing and combining model-based and model-free computations. The Mixed Instrumental Controller performs a cost-benefit analysis to decide whether to choose an action immediately based on the available "cached" value of actions (linked to model-free mechanisms) or to improve value estimation by mentally simulating the expected outcome values (linked to model-based mechanisms). Since mental simulation entails cognitive effort and increases the reward delay, it is activated only when the associated "Value of Information" exceeds its costs. The model proposes a method to compute the Value of Information, based on the uncertainty of action values and on the distance of alternative cached action values. Overall, the model by default chooses on the basis of lighter model-free estimates, and integrates them with costly model-based predictions only when useful. Mental simulation uses a sampling method to produce reward expectancies, which are used to update the cached value of one or more actions; in turn, this updated value is used for the choice. The key predictions of the model are tested in different settings of a double T-maze scenario. Results are discussed in relation to neurobiological evidence on the hippocampus - ventral striatum circuit in rodents, which has been linked to goal-directed spatial navigation.

  4. The mixed instrumental controller: using value of information to combine habitual choice and mental simulation.

    Science.gov (United States)

    Pezzulo, Giovanni; Rigoli, Francesco; Chersi, Fabian

    2013-01-01

    Instrumental behavior depends on both goal-directed and habitual mechanisms of choice. Normative views cast these mechanisms in terms of model-free and model-based methods of reinforcement learning, respectively. An influential proposal hypothesizes that model-free and model-based mechanisms coexist and compete in the brain according to their relative uncertainty. In this paper we propose a novel view in which a single Mixed Instrumental Controller produces both goal-directed and habitual behavior by flexibly balancing and combining model-based and model-free computations. The Mixed Instrumental Controller performs a cost-benefit analysis to decide whether to choose an action immediately based on the available "cached" value of actions (linked to model-free mechanisms) or to improve value estimation by mentally simulating the expected outcome values (linked to model-based mechanisms). Since mental simulation entails cognitive effort and increases the reward delay, it is activated only when the associated "Value of Information" exceeds its costs. The model proposes a method to compute the Value of Information, based on the uncertainty of action values and on the distance of alternative cached action values. Overall, the model by default chooses on the basis of lighter model-free estimates, and integrates them with costly model-based predictions only when useful. Mental simulation uses a sampling method to produce reward expectancies, which are used to update the cached value of one or more actions; in turn, this updated value is used for the choice. The key predictions of the model are tested in different settings of a double T-maze scenario. Results are discussed in relation to neurobiological evidence on the hippocampus - ventral striatum circuit in rodents, which has been linked to goal-directed spatial navigation.
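
A toy rendering of the controller's gating rule: act on cached (model-free) values when the best action is clear, and pay for mental simulation only when the Value of Information, driven by value uncertainty and closely matched alternatives, exceeds its cost. The VoI proxy and all numbers below are our own illustration, not the paper's exact computation:

```python
import statistics

def mixed_controller_choice(cached, simulate_cost=0.5):
    """Gate between model-free choice and model-based simulation.
    cached maps each action to (mean_value, std_of_value)."""
    means = sorted((m for m, _ in cached.values()), reverse=True)
    gap = means[0] - means[1]                        # distance between top cached values
    uncertainty = statistics.mean(s for _, s in cached.values())
    voi = uncertainty / (gap + 1e-9)                 # crude Value-of-Information proxy
    if voi > simulate_cost:
        return "simulate"                            # model-based: refine estimates first
    return max(cached, key=lambda a: cached[a][0])   # model-free: act on the cache

# Clear winner, low uncertainty -> habitual choice from the cache.
choice_easy = mixed_controller_choice({"left": (1.0, 0.1), "right": (5.0, 0.1)})
# Near-tied, uncertain values -> mental simulation is worth its cost.
choice_hard = mixed_controller_choice({"left": (2.0, 1.0), "right": (2.1, 1.0)})
```

In the full model, "simulate" would trigger sampling of outcome expectancies that update the cached values before the final choice is made.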

  5. The value of multivariate model sophistication

    DEFF Research Database (Denmark)

    Rombouts, Jeroen; Stentoft, Lars; Violante, Francesco

    2014-01-01

    We assess the predictive accuracies of a large number of multivariate volatility models in terms of pricing options on the Dow Jones Industrial Average. We measure the value of model sophistication in terms of dollar losses by considering a set of 444 multivariate models that differ in their spec....... In addition to investigating the value of model sophistication in terms of dollar losses directly, we also use the model confidence set approach to statistically infer the set of models that delivers the best pricing performances....

  6. A Binomial Integer-Valued ARCH Model.

    Science.gov (United States)

    Ristić, Miroslav M; Weiß, Christian H; Janjić, Ana D

    2016-11-01

    We present an integer-valued ARCH model which can be used for modeling time series of counts with under-, equi-, or overdispersion. The introduced model has a conditional binomial distribution, and it is shown to be strictly stationary and ergodic. The unknown parameters are estimated by three methods: conditional maximum likelihood, conditional least squares and maximum likelihood type penalty function estimation. The asymptotic distributions of the estimators are derived. A real application of the novel model to epidemic surveillance is briefly discussed. Finally, a generalization of the introduced model is considered by introducing an integer-valued GARCH model.
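
A generic sketch of simulating from a binomial INARCH(1)-type process, where the conditional success probability is a linear function of the previous count. This is an illustration of the model class; the paper's exact parameterisation, constraints and estimators may differ:

```python
import random

def simulate_binomial_inarch(n=20, a0=0.1, a1=0.5, steps=300, seed=1):
    """Simulate X_t | past ~ Bin(n, p_t) with p_t = a0 + a1 * X_{t-1} / n
    (clipped to [0, 1]); a0, a1 are illustrative coefficients."""
    rng = random.Random(seed)
    x, series = 0, []
    for _ in range(steps):
        p = min(1.0, a0 + a1 * x / n)                # conditional probability
        x = sum(rng.random() < p for _ in range(n))  # a Bin(n, p) draw
        series.append(x)
    return series

series = simulate_binomial_inarch()
```

Because the counts are bounded by n, a series like this can show under-, equi-, or overdispersion depending on the coefficients, which is the flexibility the abstract highlights.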

  7. Evaluating uncertainty in simulation models

    Energy Technology Data Exchange (ETDEWEB)

    McKay, M.D.; Beckman, R.J.; Morrison, J.D.; Upton, S.C.

    1998-12-01

    The authors discussed some directions for research and development of methods for assessing simulation variability, input uncertainty, and structural model uncertainty. Variance-based measures of importance for input and simulation variables arise naturally when using the quadratic loss function of the difference between the full model prediction y and the restricted prediction ỹ. They concluded that generic methods for assessing structural model uncertainty do not now exist. However, methods to analyze structural uncertainty for particular classes of models, like discrete event simulation models, may be attainable.
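
The quadratic-loss, variance-based importance measure mentioned can be sketched by freezing one input at its mean and scoring E[(y − ỹ)²]. The model and inputs below are invented for illustration and simplify the measure discussed in the abstract:

```python
import random

def variance_importance(f, sampler, var_index, n=2000, seed=0):
    """Crude importance of input `var_index`: quadratic loss between the
    full prediction y = f(x) and a restricted prediction that freezes
    that input at its sample mean."""
    rng = random.Random(seed)
    xs = [sampler(rng) for _ in range(n)]
    mean_i = sum(x[var_index] for x in xs) / n
    loss = 0.0
    for x in xs:
        frozen = list(x)
        frozen[var_index] = mean_i        # restricted prediction input
        loss += (f(x) - f(frozen)) ** 2
    return loss / n

f = lambda x: 3.0 * x[0] + 0.1 * x[1]     # x0 dominates by construction
sampler = lambda rng: (rng.gauss(0, 1), rng.gauss(0, 1))
imp0 = variance_importance(f, sampler, 0)
imp1 = variance_importance(f, sampler, 1)
```

For this linear toy model the measure recovers the obvious ranking: the loss for freezing x0 is roughly the coefficient squared times its variance, far larger than for x1.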

  8. Valuing condition specific health states using simulation contact lenses

    OpenAIRE

    Czoski-Murray, C.; Carlton, J; Brazier, J; Kang, H. K.; Young, T A; Papo, N.L.

    2009-01-01

    OBJECTIVE: This paper reports on a study that used contact lenses to simulate the effects of a visual impairment caused by Age-Related Macular Degeneration (ARMD). The aim was to examine the feasibility of using this method of simulation and to compare the results from this experiment with those obtained from ARMD patients (n=209) using generic preference-based measures (HUI3 and EQ-5D) and patient time trade-off (TTO). METHODS: Utility values were elicited from healthy participants (n=1...

  9. Simulation Model of a Transient

    DEFF Research Database (Denmark)

    Jauch, Clemens; Sørensen, Poul; Bak-Jensen, Birgitte

    2005-01-01

    This paper describes the simulation model of a controller that enables an active-stall wind turbine to ride through transient faults. The simulated wind turbine is connected to a simple model of a power system. Certain fault scenarios are specified and the turbine shall be able to sustain operation...... in case of such faults. The design of the controller is described and its performance assessed by simulations. The control strategies are explained and the behaviour of the turbine discussed....

  10. Simulation Model of a Transient

    DEFF Research Database (Denmark)

    Jauch, Clemens; Sørensen, Poul; Bak-Jensen, Birgitte

    2005-01-01

    This paper describes the simulation model of a controller that enables an active-stall wind turbine to ride through transient faults. The simulated wind turbine is connected to a simple model of a power system. Certain fault scenarios are specified and the turbine shall be able to sustain operation...

  11. Multiple Valued Logic for Synthesis and Simulation of Digital Circuits

    Directory of Open Access Journals (Sweden)

    Bharathi.S.L

    2015-04-01

    Full Text Available Multiple valued logic (MVL) has received increased attention in the last decades because of the possibility to represent information with more than two discrete levels. Advancing from two-valued to four-valued logic provides a progressive approach. In new technologies, most of the delay and power occurs in the connections between gates. When designing a function using MVL, we need fewer gates, which implies fewer connections and hence less delay. In the existing system, a 4:1 multiplexer is designed using MVL logic and various parameters are analysed. In the proposed system, a barrel shifter is designed using multiple valued logic and its parameters are analysed. All these designs are verified using the ModelSim simulator.

  12. Predictability of extreme values in geophysical models

    Directory of Open Access Journals (Sweden)

    A. E. Sterk

    2012-09-01

    Full Text Available Extreme value theory in deterministic systems is concerned with unlikely large (or small values of an observable evaluated along evolutions of the system. In this paper we study the finite-time predictability of extreme values, such as convection, energy, and wind speeds, in three geophysical models. We study whether finite-time Lyapunov exponents are larger or smaller for initial conditions leading to extremes. General statements on whether extreme values are better or less predictable are not possible: the predictability of extreme values depends on the observable, the attractor of the system, and the prediction lead time.
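
The finite-time Lyapunov comparison can be illustrated on a toy chaotic map rather than the paper's geophysical models: compute short-horizon FTLEs and stratify initial conditions by whether their orbit reaches an extreme of the observable (here the state itself). Everything below is an illustrative stand-in, and, as the abstract cautions, no general sign of the difference should be expected:

```python
import math
import random

def ftle_logistic(x0, steps=8, r=4.0):
    """Finite-time Lyapunov exponent of the logistic map x -> r*x*(1-x):
    mean log derivative |r*(1 - 2x)| along a short orbit (epsilon guards log 0)."""
    x, acc = x0, 0.0
    for _ in range(steps):
        acc += math.log(abs(r * (1.0 - 2.0 * x)) + 1e-12)
        x = r * x * (1.0 - x)
    return acc / steps

def orbit_max(x0, steps=8, r=4.0):
    """Largest value of the observable x along the same short orbit."""
    x, m = x0, x0
    for _ in range(steps):
        x = r * x * (1.0 - x)
        m = max(m, x)
    return m

rng = random.Random(0)
pts = [rng.random() for _ in range(2000)]

# Stratify FTLEs by whether the orbit visits an extreme (x > 0.999).
extreme = [ftle_logistic(x) for x in pts if orbit_max(x) > 0.999]
typical = [ftle_logistic(x) for x in pts if orbit_max(x) <= 0.999]
```

Comparing the two FTLE populations (e.g. their means and spreads) is the kind of diagnostic the paper applies, with the outcome depending on the observable and the attractor.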

  13. Simulation - modeling - experiment; Simulation - modelisation - experience

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-07-01

    After two workshops held in 2001 on the same topics, and in order to make a status of the advances in the domain of simulation and measurements, the main goals proposed for this workshop are: the presentation of the state-of-the-art of tools, methods and experiments in the domains of interest of the Gedepeon research group, and the exchange of information about the possibilities of use of computer codes and facilities, about the understanding of physical and chemical phenomena, and about development and experiment needs. This document gathers 18 presentations (slides) among the 19 given at this workshop, dealing with: the deterministic and stochastic codes in reactor physics (Rimpault G.); MURE: an evolution code coupled with MCNP (Meplan O.); neutronic calculation of future reactors at EdF (Lecarpentier D.); advanced status of the MCNP/TRIO-U neutronic/thermal-hydraulics coupling (Nuttin A.); the FLICA4/TRIPOLI4 thermal-hydraulics/neutronics coupling (Aniel S.); perturbation methods and sensitivity analysis of nuclear data in reactor physics, application to the VENUS-2 experimental reactor (Bidaud A.); modeling for the reliability improvement of an ADS accelerator (Biarotte J.L.); residual gas compensation of the space charge of intense beams (Ben Ismail A.); experimental determination and numerical modeling of phase equilibrium diagrams of interest in nuclear applications (Gachon J.C.); modeling of irradiation effects (Barbu A.); elastic limit and irradiation damage in Fe-Cr alloys: simulation and experiment (Pontikis V.); experimental measurements of spallation residues, comparison with Monte Carlo simulation codes (Fallot M.); the spallation target-reactor coupling (Rimpault G.); tools and data (Grouiller J.P.); models in high energy transport codes: status and perspective (Leray S.); other ways of investigation for spallation (Audoin L.); neutrons and light particles production at intermediate energies (20-200 MeV) with iron, lead and uranium targets (Le Colley F.).

  14. IVOA Recommendation: Simulation Data Model

    CERN Document Server

    Lemson, Gerard; Cervino, Miguel; Gheller, Claudio; Gray, Norman; LePetit, Franck; Louys, Mireille; Ooghe, Benjamin; Wagner, Rick; Wozniak, Herve

    2014-01-01

    In this document and the accompanying documents we describe a data model (Simulation Data Model) describing numerical computer simulations of astrophysical systems. The primary goal of this standard is to support discovery of simulations by describing those aspects of them that scientists might wish to query on, i.e. it is a model for meta-data describing simulations. This document does not propose a protocol for using this model. IVOA protocols are being developed and are supposed to use the model, either in its original form or in a form derived from the model proposed here, but more suited to the particular protocol. The SimDM has been developed in the IVOA Theory Interest Group with assistance of representatives of relevant working groups, in particular DM and Semantics.

  15. Modeling and Simulation with INS.

    Science.gov (United States)

    Roberts, Stephen D.; And Others

    INS, the Integrated Network Simulation language, puts simulation modeling into a network framework and automatically performs such programming activities as placing the problem into a next event structure, coding events, collecting statistics, monitoring status, and formatting reports. To do this, INS provides a set of symbols (nodes and branches)…

  16. Simulation modeling of estuarine ecosystems

    Science.gov (United States)

    Johnson, R. W.

    1980-01-01

    A simulation model has been developed of Galveston Bay, Texas ecosystem. Secondary productivity measured by harvestable species (such as shrimp and fish) is evaluated in terms of man-related and controllable factors, such as quantity and quality of inlet fresh-water and pollutants. This simulation model used information from an existing physical parameters model as well as pertinent biological measurements obtained by conventional sampling techniques. Predicted results from the model compared favorably with those from comparable investigations. In addition, this paper will discuss remotely sensed and conventional measurements in the framework of prospective models that may be used to study estuarine processes and ecosystem productivity.

  17. Modeling and Simulating Environmental Effects

    OpenAIRE

    Guest, Peter S.; Murphree, Tom; Frederickson, Paul A.; Guest, Arlene A.

    2012-01-01

    MOVES Research & Education Systems Seminar: Presentation; Session 4: Collaborative NWDC/NPS M&S Research; Moderator: Curtis Blais; Modeling and Simulating Environmental Effects; speakers: Peter Guest, Paul Frederickson & Tom Murphree Environmental Effects Group

  18. Simulation of the Information Superiority Value Chain Model of a C4ISR System Based on System Dynamics

    Institute of Scientific and Technical Information of China (English)

    李小全; 蓝鹏飞; 程懿

    2011-01-01

    Obtaining information superiority is the core of Network-Centric Warfare (NCW); the C4ISR system is the basis for obtaining information superiority, decision-making superiority and action superiority in NCW. Firstly, the information superiority value chain of the C4ISR system is analyzed in this paper based on the system's work process. Secondly, following the build-up of the battlefield situation awareness information superiority value chain, a System Dynamics (SD) confrontation model of the C4ISR information superiority value chain is established using SD theory. Finally, the simulation model is studied with a combat example; the results show that this method is feasible and efficient for studying complex military information systems.

  19. TREAT Modeling and Simulation Strategy

    Energy Technology Data Exchange (ETDEWEB)

    DeHart, Mark David [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    This report summarizes a four-phase process used to describe the strategy in developing modeling and simulation software for the Transient Reactor Test Facility. The four phases of this research and development task are identified as (1) full core transient calculations with feedback, (2) experiment modeling, (3) full core plus experiment simulation and (4) quality assurance. The document describes the four phases, the relationship between these research phases, and anticipated needs within each phase.

  20. Stochastic modeling analysis and simulation

    CERN Document Server

    Nelson, Barry L

    1995-01-01

    A coherent introduction to the techniques for modeling dynamic stochastic systems, this volume also offers a guide to the mathematical, numerical, and simulation tools of systems analysis. Suitable for advanced undergraduates and graduate-level industrial engineers and management science majors, it proposes modeling systems in terms of their simulation, regardless of whether simulation is employed for analysis. Beginning with a view of the conditions that permit a mathematical-numerical analysis, the text explores Poisson and renewal processes, Markov chains in discrete and continuous time, se

  1. Model reduction for circuit simulation

    CERN Document Server

    Hinze, Michael; Maten, E Jan W Ter

    2011-01-01

    Simulation based on mathematical models plays a major role in computer aided design of integrated circuits (ICs). Decreasing structure sizes, increasing packing densities and driving frequencies require the use of refined mathematical models, and to take into account secondary, parasitic effects. This leads to very high dimensional problems which nowadays require simulation times too large for the short time-to-market demands in industry. Modern Model Order Reduction (MOR) techniques present a way out of this dilemma in providing surrogate models which keep the main characteristics of the devi

  2. A Simulation Model Articulation of the REA Ontology

    Science.gov (United States)

    Laurier, Wim; Poels, Geert

    This paper demonstrates how the REA enterprise ontology can be used to construct simulation models for business processes, value chains and collaboration spaces in supply chains. These models support various high-level and operational management simulation applications, e.g. the analysis of enterprise sustainability and day-to-day planning. First, the basic constructs of the REA ontology and the ExSpect modelling language for simulation are introduced. Second, collaboration space, value chain and business process models and their conceptual dependencies are shown, using the ExSpect language. Third, an exhibit demonstrates the use of value chain models in predicting the financial performance of an enterprise.

  3. A VRLA battery simulation model

    Energy Technology Data Exchange (ETDEWEB)

    Pascoe, P.E.; Anbuky, A.H. [Invensys Energy Systems NZ Limited, Christchurch (New Zealand)

    2004-05-01

    A valve regulated lead acid (VRLA) battery simulation model is an invaluable tool for the standby power system engineer. The obvious use for such a model is to allow the assessment of battery performance. This may involve determining the influence of cells suffering from state of health (SOH) degradation on the performance of the entire string, or the running of test scenarios to ascertain the most suitable battery size for the application. In addition, it enables the engineer to assess the performance of the overall power system. This includes, for example, running test scenarios to determine the benefits of various load shedding schemes. It also allows the assessment of other power system components, either for determining their requirements and/or vulnerabilities. Finally, a VRLA battery simulation model is vital as a stand alone tool for educational purposes. Despite the fundamentals of the VRLA battery having been established for over 100 years, its operating behaviour is often poorly understood. An accurate simulation model enables the engineer to gain a better understanding of VRLA battery behaviour. A system level multipurpose VRLA battery simulation model is presented. It allows an arbitrary battery (capacity, SOH, number of cells and number of strings) to be simulated under arbitrary operating conditions (discharge rate, ambient temperature, end voltage, charge rate and initial state of charge). The model accurately reflects the VRLA battery discharge and recharge behaviour. This includes the complex start of discharge region known as the coup de fouet. (author)
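The discharge-time sizing question such a model answers can be illustrated with a one-line Peukert's-law sketch; this is a generic textbook relation, not the paper's system-level model, and the capacity, exponent, and rate values below are assumptions.

```python
def discharge_time(capacity_ah=100.0, current_a=20.0, k=1.15, rated_hours=20.0):
    """Peukert's-law estimate of discharge time in hours:
    t = H * (C / (I * H)) ** k, with C the rated capacity at the H-hour rate.
    All parameter values here are illustrative, not from the paper."""
    return rated_hours * (capacity_ah / (current_a * rated_hours)) ** k

t_c5 = discharge_time(current_a=5.0)    # at the rated 20-hour rate: 20 h
t_c20 = discharge_time(current_a=20.0)  # a heavier load shortens runtime
```

A full VRLA model would additionally track state of health, temperature, recharge, and the coup de fouet region the abstract mentions.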

  4. Anybody can do Value at Risk: A Teaching Study using Parametric Computation and Monte Carlo Simulation

    Directory of Open Access Journals (Sweden)

    Yun Hsing Cheung

    2012-12-01

Full Text Available The three main Value at Risk (VaR) methodologies are historical, parametric and Monte Carlo Simulation. Cheung & Powell (2012), using a step-by-step teaching study, showed how a nonparametric historical VaR model could be constructed using Excel, thus benefitting teachers and researchers by providing them with a readily useable teaching study and an inexpensive and flexible VaR modelling option. This article extends that work by demonstrating how parametric and Monte Carlo Simulation VaR models can also be constructed in Excel, thus providing a total Excel modelling package encompassing all three VaR methods.
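The two methods the article builds in Excel can be sketched in Python as well; the returns below are simulated, the 99% confidence level and normality assumption are illustrative, and neither function reproduces the article's spreadsheets.

```python
import random
import statistics

def parametric_var(returns, confidence=0.99):
    """Variance-covariance VaR: assumes returns are normally distributed."""
    mu = statistics.mean(returns)
    sigma = statistics.stdev(returns)
    z = statistics.NormalDist().inv_cdf(1 - confidence)  # left-tail quantile
    return -(mu + z * sigma)  # loss reported as a positive number

def monte_carlo_var(returns, confidence=0.99, n_sims=100_000, seed=42):
    """Monte Carlo VaR: draw simulated returns from a fitted normal."""
    mu = statistics.mean(returns)
    sigma = statistics.stdev(returns)
    rng = random.Random(seed)
    sims = sorted(rng.gauss(mu, sigma) for _ in range(n_sims))
    return -sims[int((1 - confidence) * n_sims)]

# Hypothetical daily returns standing in for real price data
random.seed(0)
rets = [random.gauss(0.0005, 0.01) for _ in range(250)]
p_var = parametric_var(rets)
mc_var = monte_carlo_var(rets)
```

Because both functions fit the same normal distribution here, the two estimates agree closely; they would diverge once the Monte Carlo step sampled a heavier-tailed law.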

  5. Modelling and Simulation of Crude Oil Dispersion

    Directory of Open Access Journals (Sweden)

    Abdulfatai JIMOH

    2006-01-01

Full Text Available This research work was carried out to develop a model equation for the dispersion of crude oil in water. Seven different crude oils (Bonny Light, Antan Terminal, Bonny Medium, Qua Iboe Light, Brass Light Mbede, Forcados Blend and Heavy H) were used as the subject crude oils. The model equation developed in this project, which is given as..., was developed starting from the equation for the oil dispersion rate in water, which is given as.... The developed equation was then simulated with the aid of MathCAD 2000 Professional software. The experimental and model results obtained from the simulation of the model equation were plotted on the same axis against time of dispersion. The model results revealed close fittings between the experimental and the model results, because the correlation coefficients and the r-square values calculated using a spreadsheet program were both found to be unity (1.00).

  6. PV panel model based on datasheet values

    DEFF Research Database (Denmark)

    Sera, Dezso; Teodorescu, Remus; Rodriguez, Pedro

    2007-01-01

This work presents the construction of a model for a PV panel using the single-diode five-parameters model, based exclusively on data-sheet parameters. The model takes into account the series and parallel (shunt) resistance of the panel. The equivalent circuit and the basic equations of the PV cell/panel in Standard Test Conditions (STC) are shown, as well as the parameters extraction from the data-sheet values. The temperature dependence of the cell dark saturation current is expressed with an alternative formula, which gives better correlation with the datasheet values of the power temperature dependence. Based on these equations, a PV panel model, which is able to predict the panel behavior in different temperature and irradiance conditions, is built and tested.
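A minimal sketch of the single-diode model's implicit I-V equation, solved by damped fixed-point iteration; all parameter values are invented for illustration and do not come from any datasheet or from the paper's extraction procedure.

```python
import math

def pv_current(v, i_ph=8.0, i_0=1e-9, n=1.3, n_s=60, r_s=0.2, r_sh=300.0, t=298.15):
    """Solve the implicit single-diode equation
        I = Iph - I0*(exp((V + I*Rs)/(n*Ns*Vt)) - 1) - (V + I*Rs)/Rsh
    for the panel current I by damped fixed-point iteration.
    All parameter values are illustrative assumptions."""
    vt = 1.380649e-23 * t / 1.602176634e-19  # thermal voltage kT/q (volts)
    i = i_ph  # the photo-current is a good starting guess
    for _ in range(200):
        i_new = (i_ph
                 - i_0 * (math.exp((v + i * r_s) / (n * n_s * vt)) - 1.0)
                 - (v + i * r_s) / r_sh)
        i = 0.5 * i + 0.5 * i_new  # damping keeps the iteration stable
    return i

i_sc = pv_current(0.0)    # short-circuit current, approximately i_ph
i_mid = pv_current(30.0)  # lower, as the diode term grows with voltage
```

Sweeping `v` from zero toward open circuit traces the familiar I-V curve; the paper's contribution is extracting the five parameters from datasheet values, which this sketch takes as given.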

  7. Modelling and Simulation: An Overview

    NARCIS (Netherlands)

    M.J. McAleer (Michael); F. Chan (Felix); L. Oxley (Les)

    2013-01-01

    textabstractThe papers in this special issue of Mathematics and Computers in Simulation cover the following topics: improving judgmental adjustment of model-based forecasts, whether forecast updates are progressive, on a constrained mixture vector autoregressive model, whether all estimators are bor

  8. General introduction to simulation models

    DEFF Research Database (Denmark)

    Hisham Beshara Halasa, Tariq; Boklund, Anette

    2012-01-01

    Monte Carlo simulation can be defined as a representation of real life systems to gain insight into their functions and to investigate the effects of alternative conditions or actions on the modeled system. Models are a simplification of a system. Most often, it is best to use experiments and field...

  9. Modelling, simulating and optimizing Boilers

    DEFF Research Database (Denmark)

    Sørensen, Kim; Condra, Thomas Joseph; Houbak, Niels

    2003-01-01

    of the boiler has been developed and simulations carried out by means of the Matlab integration routines. The model is prepared as a dynamic model consisting of both ordinary differential equations and algebraic equations, together formulated as a Differential-Algebraic- Equation system. Being able to operate...

  10. The Values of College Students in Business Simulation Game: A Means-End Chain Approach

    Science.gov (United States)

    Lin, Yu-Ling; Tu, Yu-Zu

    2012-01-01

    Business simulation games (BSGs) enable students to practice making decisions in a virtual environment, accumulate experience in application of strategies, and train themselves in modes of decision-making. This study examines the value sought by players of BSG. In this study, a means-end chain (MEC) model was adopted as the basis, and ladder…

  12. An analytical high value target acquisition model

    OpenAIRE

    Becker, Kevin J.

    1986-01-01

Approved for public release; distribution is unlimited. An Analytical High Value Target (HVT) acquisition model is developed for a generic anti-ship cruise missile system. The target set is represented as a single HVT within a field of escorts. The HVT's location is described by a bivariate normal probability distribution. The escorts are represented by a spatially homogeneous Poisson random field surrounding the HVT. Model output consists of the probability that at least one missile of...

  13. Value-Added Modeling in Physical Education

    Science.gov (United States)

    Hushman, Glenn; Hushman, Carolyn

    2015-01-01

    The educational reform movement in the United States has resulted in a variety of states moving toward a system of value-added modeling (VAM) to measure a teacher's contribution to student achievement. Recently, many states have begun using VAM scores as part of a larger system to evaluate teacher performance. In the past decade, only "core…

  14. A Model for Valuing Military Talents

    Institute of Scientific and Technical Information of China (English)

    LIU Hong-sheng

    2002-01-01

The method of collocating military talents is a difficult problem; it differs from allocating other kinds of talent because of the particular characteristics of military personnel. This paper presents a model for valuing military talents, which can assist military leaders in collocating military talents properly.

  15. Vehicle dynamics modeling and simulation

    CERN Document Server

    Schramm, Dieter; Bardini, Roberto

    2014-01-01

    The authors examine in detail the fundamentals and mathematical descriptions of the dynamics of automobiles. In this context different levels of complexity will be presented, starting with basic single-track models up to complex three-dimensional multi-body models. A particular focus is on the process of establishing mathematical models on the basis of real cars and the validation of simulation results. The methods presented are explained in detail by means of selected application scenarios.
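The simplest member of the model hierarchy described, the kinematic single-track ("bicycle") model, can be sketched as one Euler integration step; the wheelbase and time-step values are assumptions, not taken from the book.

```python
import math

def bicycle_step(x, y, psi, v, delta, wheelbase=2.7, dt=0.01):
    """One Euler step of the kinematic single-track model: state (x, y, yaw psi),
    inputs speed v (m/s) and steering angle delta (rad); wheelbase in metres."""
    x += v * math.cos(psi) * dt
    y += v * math.sin(psi) * dt
    psi += v / wheelbase * math.tan(delta) * dt
    return x, y, psi

# Drive straight ahead for one second at 10 m/s
x = y = psi = 0.0
for _ in range(100):
    x, y, psi = bicycle_step(x, y, psi, v=10.0, delta=0.0)
```

The book's more complex three-dimensional multi-body models add tire forces, suspension, and load transfer on top of this kinematic core.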

  16. Stochastic models: theory and simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Field, Richard V., Jr.

    2008-03-01

    Many problems in applied science and engineering involve physical phenomena that behave randomly in time and/or space. Examples are diverse and include turbulent flow over an aircraft wing, Earth climatology, material microstructure, and the financial markets. Mathematical models for these random phenomena are referred to as stochastic processes and/or random fields, and Monte Carlo simulation is the only general-purpose tool for solving problems of this type. The use of Monte Carlo simulation requires methods and algorithms to generate samples of the appropriate stochastic model; these samples then become inputs and/or boundary conditions to established deterministic simulation codes. While numerous algorithms and tools currently exist to generate samples of simple random variables and vectors, no cohesive simulation tool yet exists for generating samples of stochastic processes and/or random fields. There are two objectives of this report. First, we provide some theoretical background on stochastic processes and random fields that can be used to model phenomena that are random in space and/or time. Second, we provide simple algorithms that can be used to generate independent samples of general stochastic models. The theory and simulation of random variables and vectors is also reviewed for completeness.
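A simple example of generating sample paths of a stochastic process, in the spirit of the sample-generation algorithms the report describes: an Euler-Maruyama discretisation of an Ornstein-Uhlenbeck process. The model and all parameter values are illustrative, not taken from the report.

```python
import random

def ou_path(x0=0.0, theta=1.0, mu=0.0, sigma=0.3, dt=0.01, n_steps=1000, seed=1):
    """Euler-Maruyama sample path of the Ornstein-Uhlenbeck process
    dX = theta*(mu - X) dt + sigma dW (illustrative stochastic model)."""
    rng = random.Random(seed)
    x = x0
    path = [x]
    for _ in range(n_steps):
        x += theta * (mu - x) * dt + sigma * (dt ** 0.5) * rng.gauss(0.0, 1.0)
        path.append(x)
    return path

path = ou_path(x0=2.0)  # starts away from the mean and reverts toward it
```

Paths generated this way can serve as random inputs or boundary conditions for a deterministic simulation code, as the report suggests.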

  17. Computational Modeling of Simulation Tests.

    Science.gov (United States)

    1980-06-01

Computational Modeling of Simulation Tests. G. Leigh, W. Chown, B. Harrison. Eric H. Wang Civil Engineering Research Facility, University of New Mexico, Albuquerque, June 1980. (Abstract not recoverable from the scanned record; only citation fragments, e.g. Kinney, G. F., and Courant and Friedrichs, survive.)

  18. SIMULATION OF COLLECTIVE RISK MODEL

    Directory of Open Access Journals (Sweden)

    Viera Pacáková

    2007-12-01

Full Text Available The article focuses on providing brief theoretical definitions of the basic terms and methods of modeling and simulation of insurance risks in non-life insurance by means of mathematical and statistical methods using statistical software. While risk assessment of an insurance company in connection with its solvency is a rather complex and comprehensive problem, its solution starts with statistical modeling of the number and amount of individual claims. Successful solution of these fundamental problems enables solving of crucial problems of insurance such as modeling and simulation of collective risk, premium and reinsurance premium calculation, estimation of the probability of ruin, etc. The article also presents some essential ideas underlying Monte Carlo methods and their applications to modeling of insurance risk. The problem to be solved is to find the probability distribution of the collective risk in a non-life insurance portfolio. Simulation of the compound distribution function of the aggregate claim amount can be carried out if the distribution functions of the claim number process and the claim size are assumed given. Monte Carlo simulation is a suitable method to confirm the results of other methods and for the treatment of catastrophic claims, when small collectives are studied. Analysis of insurance risks using risk theory is an important part of the Solvency II project. Risk theory is the analysis of the stochastic features of the non-life insurance process. The field of application of risk theory has grown rapidly. There is a need to develop the theory into a form suitable for practical purposes and to demonstrate its application. Modern computer simulation techniques open up a wide field of practical applications for risk theory concepts, without requiring restrictive assumptions and sophisticated mathematics. This article presents some comparisons of the traditional actuarial methods and of simulation methods of the collective risk model.
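The simulation of the compound distribution described above can be sketched directly: draw a Poisson claim count, then sum that many claim amounts. The lognormal claim-size law and all parameter values below are illustrative assumptions, not the article's data.

```python
import math
import random

def poisson_draw(rng, lam):
    """Knuth's multiplication method; adequate for moderate lam."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while p > limit:
        p *= rng.random()
        k += 1
    return k - 1

def aggregate_claims(lam=50.0, n_sims=20_000, seed=7):
    """Monte Carlo simulation of the collective risk model: claim count
    N ~ Poisson(lam), claim sizes lognormal(0, 1) (illustrative choices).
    Returns simulated aggregate claim amounts S = X1 + ... + XN."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_sims):
        n = poisson_draw(rng, lam)
        totals.append(sum(rng.lognormvariate(0.0, 1.0) for _ in range(n)))
    return totals

s = aggregate_claims()
```

The empirical distribution of `s` approximates the compound distribution; quantiles of it feed directly into premium and ruin-probability calculations of the kind the article discusses.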

  19. Empirical likelihood-based evaluations of Value at Risk models

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

Value at Risk (VaR) is a basic and very useful tool in measuring market risks. Numerous VaR models have been proposed in the literature. Therefore, it is of great interest to evaluate the efficiency of these models, and to select the most appropriate one. In this paper, we propose to use the empirical likelihood approach to evaluate these models. Simulation results and real life examples show that the empirical likelihood method is more powerful and more robust than some of the asymptotic methods available in the literature.

  20. Modeling and Simulation. III. Simulation of a Model for Development of Visual Cortical Specificity.

    Science.gov (United States)

    1986-12-15

of parameter values. Experiment, model, and simulation: the simulations we consider mimic, in form, classic deprivation experiments, in which kittens are restricted to a very limited range of oriented contours, raised, for example, viewing only horizontal or only vertical contours (see citations in ref. 8). The second paper of the series (ref. 8) reviews the results of numerous experiments on the neuronal development of kitten visual cortex.

  1. Efficient Smoothing for Boundary Value Models

    Science.gov (United States)

    1989-12-29

References recoverable from the scanned record include: A. Bagchi and H. Westdijk, "Smoothing and likelihood ratio for Gaussian boundary value processes," IEEE Transactions on Automatic Control, vol. 34, pp. 954-962, 1989; R. Nikoukhah et al.; and H. L. Weinert and U. B. Desai, "On complementary models and fixed-interval smoothing," IEEE Transactions on Automatic Control.

  2. Value Concept and Economic Growth Model

    Directory of Open Access Journals (Sweden)

    Truong Hong Trinh

    2014-12-01

Full Text Available This paper approaches the value added method for Gross Domestic Product (GDP) measurement, which explains the interrelationship between the expenditure approach and the income approach. An economic growth model is also proposed, with three key elements: capital accumulation, technological innovation, and institutional reform. Although capital accumulation and technological innovation are two integrated elements in driving economic growth, institutional reform plays a key role in creating incentives that affect the transitional and steady-state growth rates in the real-world economy. The paper provides theoretical insight into economic growth, to help understand the incentives and driving forces in the economic growth model.
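The interplay of capital accumulation and technological innovation can be illustrated with a textbook Solow-style recursion; this is a generic sketch, not the growth model proposed in the paper, and every parameter value is an assumption.

```python
def solow_path(k0=1.0, s=0.25, alpha=0.3, delta=0.05, g=0.02, a0=1.0, years=200):
    """Capital per worker under the accumulation rule
    k' = k + s*A*k**alpha - delta*k, with technology A growing at rate g.
    Illustrative Solow-style dynamics; all parameters are assumptions."""
    k, a, path = k0, a0, []
    for _ in range(years):
        path.append(k)
        k = k + s * a * k ** alpha - delta * k  # saving builds capital, depreciation erodes it
        a *= 1 + g                              # exogenous technological progress
    return path

path = solow_path()
```

In such a sketch, capital accumulation alone converges to a steady state, while sustained growth comes from the technology term; institutional factors, central to the paper, would enter through parameters like the saving rate.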

  3. MODELLING, SIMULATING AND OPTIMIZING BOILERS

    DEFF Research Database (Denmark)

    Sørensen, K.; Condra, T.; Houbak, Niels

    2003-01-01

This paper describes the modelling, simulating and optimizing including experimental verification as being carried out as part of a Ph.D. project being written resp. supervised by the authors. The work covers dynamic performance of both water-tube boilers and fire tube boilers. A detailed dynamic model of the boiler has been developed and simulations carried out by means of the Matlab integration routines. The model is prepared as a dynamic model consisting of both ordinary differential equations and algebraic equations, together formulated as a Differential-Algebraic-Equation system. Being able to operate a boiler plant dynamically means that the boiler designs must be able to absorb any fluctuations in water level and temperature gradients resulting from the pressure change in the boiler. On the one hand a large water-/steam space may be required, i.e. to build the boiler as big as possible. Due...

  4. Intelligent Mobility Modeling and Simulation

    Science.gov (United States)

    2015-03-04

U.S. Army Tank Automotive Research, Development and Engineering Center (TARDEC). Intelligent Mobility Modeling and Simulation, Dr. P. Jayakumar, S. Arepally. Contents: 1. Mobility-Autonomy-Latency Relationship; 2. Machine-Human Partnership; 3. Development of Shared Control. The briefing also describes an ACT-R-based model (cog.cs.drexel.edu/act-r/index.html) of the sensory/motor performance of a human driver or teleoperator.

  5. Simulating secondary succession of elk forage values in a managed forest landscape, western Washington

    Science.gov (United States)

    Jenkins, Kurt J.; Starkey, Edward E.

    1996-01-01

    Modern timber management practices often influence forage production for elk (Cervus elaphus) on broad temporal and spatial scales in forested landscapes. We incorporated site-specific information on postharvesting forest succession and forage characteristics in a simulation model to evaluate past and future influences of forest management practices on forage values for elk in a commercially managed Douglas fir (Pseudotsuga menziesii, PSME)-western hemlock (Tsuga heterophylla, TSHE) forest in western Washington. We evaluated future effects of: (1) clear-cut logging 0, 20, and 40% of harvestable stands every five years; (2) thinning 20-year-old Douglas fir forests; and (3) reducing the harvesting cycle from 60 to 45 years. Reconstruction of historical patterns of vegetation succession indicated that forage values peaked in the 1960s and declined from the 1970s to the present, but recent values still were higher than may have existed in the unmanaged landscape in 1945. Increased forest harvesting rates had little short-term influence on forage trends because harvestable stands were scarce. Simulations of forest thinning also produced negligible benefits because thinning did not improve forage productivity appreciably at the stand level. Simulations of reduced harvesting cycles shortened the duration of declining forage values from approximately 30 to 15 years. We concluded that simulation models are useful tools for examining landscape responses of forage production to forest management strategies, but the options examined provided little potential for improving elk forages in the immediate future.

  6. Genomic breeding value estimation using nonparametric additive regression models

    Directory of Open Access Journals (Sweden)

    Solberg Trygve

    2009-01-01

Full Text Available Abstract Genomic selection refers to the use of genomewide dense markers for breeding value estimation and subsequently for selection. The main challenge of genomic breeding value estimation is the estimation of many effects from a limited number of observations. Bayesian methods have been proposed to cope successfully with these challenges. As an alternative class of models, non- and semiparametric models were recently introduced. The present study investigated the ability of nonparametric additive regression models to predict genomic breeding values. The genotypes were modelled for each marker or pair of flanking markers (i.e. the predictors) separately. The nonparametric functions for the predictors were estimated simultaneously using additive model theory, applying a binomial kernel. The optimal degree of smoothing was determined by bootstrapping. A mutation-drift-balance simulation was carried out. The breeding values of the last generation (genotyped) were predicted using data from the next-to-last generation (genotyped and phenotyped). The results show moderate to high accuracies of the predicted breeding values. A determination of a predictor-specific degree of smoothing increased the accuracy.
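The flavour of such predictor-wise kernel smoothing can be conveyed with a toy Nadaraya-Watson estimator over 0/1/2 genotype codes; the geometric weight used here is a hypothetical stand-in for the paper's binomial kernel, and the data are toy values invented for illustration.

```python
def kernel_smooth(genos, phenos, g_new, lam=0.3):
    """Nadaraya-Watson estimate of the genotypic value at genotype g_new,
    with a geometric weight lam ** |g - g_new| over 0/1/2 genotype codes.
    The weight is a hypothetical stand-in for the paper's binomial kernel."""
    weights = [lam ** abs(g - g_new) for g in genos]
    return sum(w * y for w, y in zip(weights, phenos)) / sum(weights)

# Toy data: 50 animals with genotype 0 (phenotype 1.0), 50 with genotype 2 (3.0)
genos = [0] * 50 + [2] * 50
phenos = [1.0] * 50 + [3.0] * 50
effect_at_0 = kernel_smooth(genos, phenos, 0, lam=0.1)
```

In the paper's setting one such smoother is fitted per marker (or marker pair), combined additively across the genome, and the smoothing parameter is chosen by bootstrapping rather than fixed as here.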

  7. Energy risk management and value at risk modeling

    Energy Technology Data Exchange (ETDEWEB)

    Mehdi Sadeghi; Saeed Shavvalpour [Imam Sadiq University, Tehran (Iran). Economics Dept.

    2006-12-15

The value of energy trades can change over time with market conditions and underlying price variables. The rise of competition and deregulation in energy markets has led to relatively free energy markets that are characterized by high price shifts. Within oil markets, the volatile oil price environment after the OPEC agreements in the 1970s requires risk quantification. "Value-at-risk" has become an essential tool for this end when quantifying market risk. There are various methods for calculating value-at-risk. The methods introduced in this paper are Historical Simulation with ARMA Forecasting (HSAF) and Variance-Covariance based on GARCH modeling. The results show that among the various approaches the HSAF methodology yields more efficient results: at a 99% confidence level, the value-at-risk calculated through the HSAF methodology exceeds actual price changes in almost 97.6 percent of the forecasting period. (author)
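Stripped of the ARMA forecasting step, the historical-simulation core of such a VaR calculation is just an empirical loss quantile; a minimal sketch with illustrative returns, not the paper's oil-price data.

```python
def historical_var(returns, confidence=0.99):
    """Historical-simulation VaR: the empirical quantile of observed losses.
    (The paper's HSAF method additionally filters returns through an ARMA
    forecast; that step is omitted in this sketch.)"""
    losses = sorted(-r for r in returns)
    idx = min(int(confidence * len(losses)), len(losses) - 1)
    return losses[idx]

# 100 hypothetical daily returns with a single large loss
rets = [0.01] * 99 + [-0.05]
```

Backtesting then amounts to counting how often realized losses exceed this figure, which is how the 97.6 percent coverage result above is obtained.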

  8. Energy risk management and value at risk modeling

    Energy Technology Data Exchange (ETDEWEB)

    Sadeghi, Mehdi [Economics department, Imam Sadiq University, P.B. 14655-159, Tehran (Iran, Islamic Republic of)]. E-mail: sadeghi@isu.ac.ir; Shavvalpour, Saeed [Economics department, Imam Sadiq University, P.B. 14655-159, Tehran (Iran, Islamic Republic of)]. E-mail: shavalpoor@isu.ac.ir

    2006-12-15

The value of energy trades can change over time with market conditions and underlying price variables. The rise of competition and deregulation in energy markets has led to relatively free energy markets that are characterized by high price shifts. Within oil markets, the volatile oil price environment after the OPEC agreements in the 1970s requires risk quantification. "Value-at-risk" has become an essential tool for this end when quantifying market risk. There are various methods for calculating value-at-risk. The methods introduced in this paper are Historical Simulation with ARMA Forecasting (HSAF) and Variance-Covariance based on GARCH modeling. The results show that among the various approaches the HSAF methodology yields more efficient results: at a 99% confidence level, the value-at-risk calculated through the HSAF methodology exceeds actual price changes in almost 97.6 percent of the forecasting period.

  9. MODELLING, SIMULATING AND OPTIMIZING BOILERS

    DEFF Research Database (Denmark)

    Sørensen, Kim; Condra, Thomas Joseph; Houbak, Niels

    2004-01-01

on the boiler) have been defined. Furthermore a number of constraints related to: minimum and maximum boiler load gradient, minimum boiler size, Shrinking and Swelling and Steam Space Load have been defined. For defining the constraints related to the required boiler volume a dynamic model for simulating the boiler performance has been developed. Outputs from the simulations are shrinking and swelling of water level in the drum during for example a start-up of the boiler; these figures combined with the requirements with respect to allowable water level fluctuations in the drum define the requirements with respect to drum size. The model has been formulated with a specified building-up of the pressure during the start-up of the plant, i.e. the steam production during start-up of the boiler is output from the model. The steam outputs together with requirements with respect to steam space load have been utilized to define...

  10. Modeling and Simulation of Nanoindentation

    Science.gov (United States)

    Huang, Sixie; Zhou, Caizhi

    2017-08-01

    Nanoindentation is a hardness test method applied to small volumes of material which can provide some unique effects and spark many related research activities. To fully understand the phenomena observed during nanoindentation tests, modeling and simulation methods have been developed to predict the mechanical response of materials during nanoindentation. However, challenges remain with those computational approaches, because of their length scale, predictive capability, and accuracy. This article reviews recent progress and challenges for modeling and simulation of nanoindentation, including an overview of molecular dynamics, the quasicontinuum method, discrete dislocation dynamics, and the crystal plasticity finite element method, and discusses how to integrate multiscale modeling approaches seamlessly with experimental studies to understand the length-scale effects and microstructure evolution during nanoindentation tests, creating a unique opportunity to establish new calibration procedures for the nanoindentation technique.

  11. Multiscale Stochastic Simulation and Modeling

    Energy Technology Data Exchange (ETDEWEB)

    James Glimm; Xiaolin Li

    2006-01-10

Acceleration driven instabilities of fluid mixing layers include the classical cases of Rayleigh-Taylor instability, driven by a steady acceleration, and Richtmyer-Meshkov instability, driven by an impulsive acceleration. Our program starts with high resolution methods of numerical simulation of two (or more) distinct fluids, continues with analytic analysis of these solutions, and concludes with the derivation of averaged equations. A striking achievement has been the systematic agreement we obtained between simulation and experiment by using a high resolution numerical method and improved physical modeling, with surface tension. Our study is accompanied by analysis using stochastic modeling and averaged equations for the multiphase problem. We have quantified the error and uncertainty using statistical modeling methods.

  12. Continuous Spatial Process Models for Spatial Extreme Values

    KAUST Repository

    Sang, Huiyan

    2010-01-28

We propose a hierarchical modeling approach for explaining a collection of point-referenced extreme values. In particular, annual maxima over space and time are assumed to follow generalized extreme value (GEV) distributions, with parameters μ, σ, and ξ specified in the latent stage to reflect underlying spatio-temporal structure. The novelty here is that we relax the conditional independence assumption in the first stage of the hierarchical model, an assumption which has been adopted in previous work. This assumption implies that realizations of the surface of spatial maxima will be everywhere discontinuous. For many phenomena including, e.g., temperature and precipitation, this behavior is inappropriate. Instead, we offer a spatial process model for extreme values that provides mean square continuous realizations, where the behavior of the surface is driven by the spatial dependence which is unexplained under the latent spatio-temporal specification for the GEV parameters. In this sense, the first stage smoothing is viewed as fine scale or short range smoothing, while the larger scale smoothing will be captured in the second stage of the modeling. In addition, as would be desired, we are able to implement spatial interpolation for extreme values based on this model. A simulation study and a study on actual annual maximum rainfall for a region in South Africa are used to illustrate the performance of the model. © 2009 International Biometric Society.
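The GEV building block of the model can be illustrated by sampling from the distribution via inversion of its CDF; the parameter values are illustrative, and this sketch ignores the paper's spatial structure entirely.

```python
import math
import random

def gev_sample(mu=0.0, sigma=1.0, xi=0.1, n=10_000, seed=3):
    """Draw GEV(mu, sigma, xi) samples by inverting the CDF
    F(x) = exp(-(1 + xi*(x - mu)/sigma) ** (-1/xi)); xi = 0 is the Gumbel limit.
    Parameter values are illustrative assumptions."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        u = rng.random()
        if xi == 0.0:
            samples.append(mu - sigma * math.log(-math.log(u)))
        else:
            samples.append(mu + sigma * ((-math.log(u)) ** (-xi) - 1.0) / xi)
    return samples

xs = gev_sample()
```

In the hierarchical model above, μ, σ, and ξ would vary over space and time in the latent stage rather than being fixed constants as here.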

  13. From Business Value Model to Coordination Process Model

    Science.gov (United States)

    Fatemi, Hassan; van Sinderen, Marten; Wieringa, Roel

    The increased complexity of business webs calls for modeling the collaboration of enterprises from different perspectives, in particular the business and process perspectives, and for mutually aligning these perspectives. Business value modeling and coordination process modeling both are necessary for a good e-business design, but these activities have different goals and use different concepts. Nevertheless, the resulting models should be consistent with each other because they refer to the same system from different perspectives. Hence, checking the consistency between these models or producing one based on the other would be of high value. In this paper we discuss the issue of achieving consistency in multi-level e-business design and give guidelines to produce consistent coordination process models from business value models in a stepwise manner.

  14. Assessment of Molecular Modeling & Simulation

    Energy Technology Data Exchange (ETDEWEB)

    None

    2002-01-03

    This report reviews the development and applications of molecular and materials modeling in Europe and Japan in comparison to those in the United States. Topics covered include computational quantum chemistry, molecular simulations by molecular dynamics and Monte Carlo methods, mesoscale modeling of material domains, molecular-structure/macroscale property correlations like QSARs and QSPRs, and related information technologies like informatics and special-purpose molecular-modeling computers. The panel's findings include the following: The United States leads this field in many scientific areas. However, Canada has particular strengths in DFT methods and homogeneous catalysis; Europe in heterogeneous catalysis, mesoscale, and materials modeling; and Japan in materials modeling and special-purpose computing. Major government-industry initiatives are underway in Europe and Japan, notably in multi-scale materials modeling and in development of chemistry-capable ab-initio molecular dynamics codes.

  15. Animal models for simulating weightlessness

    Science.gov (United States)

    Morey-Holton, E.; Wronski, T. J.

    1982-01-01

    NASA has developed a rat model to simulate on earth some aspects of the weightlessness alterations experienced in space, i.e., unloading and fluid shifts. Comparison of data collected from space flight and from the head-down rat suspension model suggests that this model system reproduces many of the physiological alterations induced by space flight. Data from various versions of the rat model are virtually identical for the same parameters; thus, modifications of the model for acute, chronic, or metabolic studies do not alter the results as long as the critical components of the model are maintained, i.e., a cephalad shift of fluids and/or unloading of the rear limbs.

  16. Standard for Models and Simulations

    Science.gov (United States)

    Steele, Martin J.

    2016-01-01

    This NASA Technical Standard establishes uniform practices in modeling and simulation to ensure that essential requirements are applied to the design, development, and use of models and simulations (M&S), while ensuring that acceptance criteria are defined by the program/project and approved by the responsible Technical Authority. It also provides an approved set of requirements, recommendations, and criteria with which M&S may be developed, accepted, and used in support of NASA activities. As the M&S disciplines employed and application areas involved are broad, the common aspects of M&S across all NASA activities are addressed. The discipline-specific details of a given M&S should be obtained from relevant recommended practices. The primary purpose is to reduce the risks associated with M&S-influenced decisions by ensuring the complete communication of the credibility of M&S results.

  17. Mathematical models and numerical simulation in electromagnetism

    CERN Document Server

    Bermúdez, Alfredo; Salgado, Pilar

    2014-01-01

    The book represents a basic support for a master course in electromagnetism oriented to numerical simulation. The main goal of the book is for the reader to know the boundary-value problems of partial differential equations that must be solved in order to perform computer simulation of electromagnetic processes. Moreover, it includes a part devoted to electric circuit theory based on ordinary differential equations. The book is mainly oriented to electrical engineering applications, going from the general to the specific, namely, from the full Maxwell's equations to the particular cases of electrostatics, direct current, magnetostatics and eddy currents models. Apart from standard exercises related to analytical calculus, the book includes some others oriented to real-life applications solved with the MaxFEM free simulation software.

  18. Simulation guided value stream mapping and lean improvement: A case study of a tubular machining facility

    Directory of Open Access Journals (Sweden)

    Wei Xia

    2013-06-01

    Full Text Available Purpose: This paper describes a typical Value Stream Mapping (VSM) application, enhanced by discrete event simulation (DES), to a dedicated tubular manufacturing process. Design/Methodology/Approach: VSM, prescribed as part of the lean production portfolio of tools, not only highlights process inefficiencies and transactional and communication mismatches, but also guides improvement areas. Meanwhile, DES is used to reduce uncertainty and create consensus by visualizing dynamic process views. It serves as a complementary tool to traditional VSM, providing the justification and quantifiable evidence needed to support the lean approaches. A simulation model is developed to replicate the operation of an existing system, and that of a proposed system that modifies the existing design to incorporate lean manufacturing shop floor principles. Findings: A comprehensive model for the tubular manufacturing process is constructed, and distinctive scenarios are derived to uncover an optimal future state of the process. Various simulation scenarios are developed. The simulated results are acquired and investigated, and they match the real production data well. Originality/Value: DES is demonstrated as a guiding tool to assist organizations with the decision to implement lean approaches by quantifying the benefits of applying VSM. A roadmap is provided to illustrate how VSM is used to design a desired future state. The developed simulation scenarios mimic the behavior of the actual manufacturing process in an intuitive manner.
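    The existing-versus-proposed comparison described above can be mimicked with a few lines of discrete-event logic. The sketch below is purely illustrative: the two-station layout, the exponential service times and every parameter value are assumptions, not the paper's data.

```python
import random

def simulate_line(n_parts=1000, proc_time=1.0, pack_time=1.2, seed=42):
    """Two-station tandem line: machining feeds an automatic packer (FIFO).
    Service times are exponential -- a hypothetical stand-in for real
    process data. Returns (makespan, throughput)."""
    rng = random.Random(seed)
    t_machine = 0.0  # time at which the machine next becomes free
    t_packer = 0.0   # time at which the packer next becomes free
    for _ in range(n_parts):
        t_machine += rng.expovariate(1.0 / proc_time)  # part leaves machining
        start_pack = max(t_machine, t_packer)          # waits if packer busy
        t_packer = start_pack + rng.expovariate(1.0 / pack_time)
    makespan = t_packer
    return makespan, n_parts / makespan
```

    Re-running the model with modified parameters (e.g. a slower packer for a proposed product mix) quantifies the bottleneck's impact before any physical change is made, which is the role DES plays alongside the VSM here.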

  19. Scoring performance on computer-based patient simulations: beyond value of information.

    Science.gov (United States)

    Downs, S M; Marasigan, F; Abraham, V; Wildemuth, B; Friedman, C P

    1999-01-01

    As computer-based clinical case simulations become increasingly popular for training and evaluating clinicians, approaches are needed to evaluate a trainee's or examinee's solution of the simulated cases. In 1997 we developed a decision analytic approach to scoring performance on computerized patient case simulations, using expected value of information (VOI) to generate a score each time the user requested clinical information from the simulation. Although this measure has many desirable characteristics, we found that the VOI was zero for the majority of information requests. We enhanced our original algorithm to measure potential decrements in expected utility that could result from using results of information requests that have zero VOI. Like the original algorithm, the new approach uses decision models, represented as influence diagrams, to represent the diagnostic problem. The process of solving computer-based patient simulations involves repeated cycles of requesting and receiving these data from the simulations. Each time the user requests clinical data from the simulation, the influence diagram is evaluated to determine the expected VOI of the requested clinical datum. The VOI is non-zero only if the requested datum has the potential to change the leading diagnosis. The VOI is zero when the data item requested does not map to any node in the influence diagram, or when the item maps to a node but does not change the leading diagnosis regardless of its value. Our new algorithm generates a score for each of these situations by modeling what would happen to the expected utility of the model if the user changed the leading diagnosis based on the results. The resulting algorithm produces a non-zero score for all information requests: the score is the VOI when the VOI is non-zero, and a negative number when the VOI is zero.
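    The VOI idea behind this scoring scheme can be illustrated on a toy two-disease decision problem. The priors, likelihoods and utilities below are hypothetical, and the paper's influence-diagram evaluation is replaced by direct enumeration over one binary finding.

```python
def expected_voi(prior, likelihoods, utilities):
    """Expected value of information for one binary finding.
    prior[d]: P(disease d); likelihoods[d]: P(finding positive | d);
    utilities[a][d]: utility of acting on diagnosis a when truth is d.
    VOI = expected utility of deciding after seeing the finding,
          minus the utility of deciding now."""
    n = len(prior)

    def best_eu(post):
        # utility of the best action under belief `post`
        return max(sum(post[d] * utilities[a][d] for d in range(n))
                   for a in range(len(utilities)))

    eu_now = best_eu(prior)
    eu_after = 0.0
    for positive in (True, False):
        p_result = sum(prior[d] * (likelihoods[d] if positive
                                   else 1 - likelihoods[d])
                       for d in range(n))
        if p_result == 0:
            continue
        post = [prior[d] * (likelihoods[d] if positive
                            else 1 - likelihoods[d]) / p_result
                for d in range(n)]
        eu_after += p_result * best_eu(post)
    return eu_after - eu_now
```

    An uninformative finding (identical likelihoods under every disease) yields VOI = 0, which is exactly the case the enhanced algorithm above is designed to score instead of leaving unscored.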

  20. Model for Simulation Atmospheric Turbulence

    DEFF Research Database (Denmark)

    Lundtang Petersen, Erik

    1976-01-01

    A method that produces realistic simulations of atmospheric turbulence is developed and analyzed. The procedure makes use of a generalized spectral analysis, often called a proper orthogonal decomposition or the Karhunen-Loève expansion. A set of criteria, emphasizing a realistic appearance, a correct spectral shape, and non-Gaussian statistics, is selected in order to evaluate the model turbulence. An actual turbulence record is analyzed in detail, providing both a standard for comparison and input statistics for the generalized spectral analysis, which in turn produces a set of orthonormal… The method is unique in modeling the three velocity components simultaneously, and it is found that important cross-statistical features are reasonably well behaved. It is concluded that the model provides a practical, operational simulator of atmospheric turbulence.

  1. Quantile uncertainty and value-at-risk model risk.

    Science.gov (United States)

    Alexander, Carol; Sarabia, José María

    2012-08-01

    This article develops a methodology for quantifying model risk in quantile risk estimates. The application of quantile estimates to risk assessment has become common practice in many disciplines, including hydrology, climate change, statistical process control, insurance and actuarial science, and the uncertainty surrounding these estimates has long been recognized. Our work is particularly important in finance, where quantile estimates (called Value-at-Risk) have been the cornerstone of banking risk management since the mid 1980s. A recent amendment to the Basel II Accord recommends additional market risk capital to cover all sources of "model risk" in the estimation of these quantiles. We provide a novel and elegant framework whereby quantile estimates are adjusted for model risk, relative to a benchmark which represents the state of knowledge of the authority that is responsible for model risk. A simulation experiment in which the degree of model risk is controlled illustrates how to quantify Value-at-Risk model risk and compute the required regulatory capital add-on for banks. An empirical example based on real data shows how the methodology can be put into practice, using only two time series (daily Value-at-Risk and daily profit and loss) from a large bank. We conclude with a discussion of potential applications to nonfinancial risks.
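    The two daily series the empirical example relies on (Value-at-Risk and profit-and-loss) suggest a simple historical-simulation baseline. The sketch below is a generic illustration of quantile-based VaR, not the paper's methodology, and the flat 10% buffer standing in for a model-risk add-on is an arbitrary assumption.

```python
import numpy as np

def historical_var(pnl, alpha=0.01):
    """Historical-simulation VaR: the alpha-quantile of the P&L series,
    reported as a positive loss figure."""
    return float(-np.quantile(np.asarray(pnl, dtype=float), alpha))

def var_with_addon(pnl, alpha=0.01, buffer=0.10):
    """Point estimate inflated by a flat buffer standing in for a
    model-risk capital add-on (the 10% figure is hypothetical)."""
    return historical_var(pnl, alpha) * (1.0 + buffer)
```

    The paper's contribution is precisely that the add-on should come from comparing the quantile estimate against a benchmark distribution rather than from a flat buffer like the one sketched here.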

  2. Advances in Intelligent Modelling and Simulation: Simulation Tools and Applications

    CERN Document Server

    Oplatková, Zuzana; Carvalho, Marco; Kisiel-Dorohinicki, Marek

    2012-01-01

    The human capacity to abstract complex systems and phenomena into simplified models has played a critical role in the rapid evolution of our modern industrial processes and scientific research. As a science and an art, Modelling and Simulation have been one of the core enablers of this remarkable human trace, and have become a topic of great importance for researchers and practitioners. This book was created to compile some of the most recent concepts, advances, challenges and ideas associated with Intelligent Modelling and Simulation frameworks, tools and applications. The first chapter discusses the important aspects of a human interaction and the correct interpretation of results during simulations. The second chapter gets to the heart of the analysis of entrepreneurship by means of agent-based modelling and simulations. The following three chapters bring together the central theme of simulation frameworks, first describing an agent-based simulation framework, then a simulator for electrical machines, and...

  3. Verifying and Validating Simulation Models

    Energy Technology Data Exchange (ETDEWEB)

    Hemez, Francois M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-02-23

    This presentation is a high-level discussion of the Verification and Validation (V&V) of computational models. Definitions of V&V are given to emphasize that “validation” is never performed in a vacuum; it accounts, instead, for the current state-of-knowledge in the discipline considered. In particular, comparisons between physical measurements and numerical predictions should account for their respective sources of uncertainty. The differences between error (bias), aleatoric uncertainty (randomness) and epistemic uncertainty (ignorance, lack-of-knowledge) are briefly discussed. Four types of uncertainty in physics and engineering are discussed: 1) experimental variability, 2) variability and randomness, 3) numerical uncertainty and 4) model-form uncertainty. Statistical sampling methods are available to propagate, and analyze, variability and randomness. Numerical uncertainty originates from the truncation error introduced by the discretization of partial differential equations in time and space. Model-form uncertainty is introduced by assumptions often formulated to render a complex problem more tractable and amenable to modeling and simulation. The discussion concludes with high-level guidance to assess the “credibility” of numerical simulations, which stems from the level of rigor with which these various sources of uncertainty are assessed and quantified.

  4. The Symmetric Solutions of Affiliated Value Model

    Institute of Scientific and Technical Information of China (English)

    Che Ka-jia; Li Zhi-chen

    2004-01-01

    In a symmetric affiliated value model, this paper analyses High-Technology industrial firms' competitive strategy in research and development (R&D). We obtain the symmetric Bayesian Nash equilibrium functions with or without a government prize: b1(x) = v(x,x)F^{n-1}(x|x) − ∫₀ˣ F^{n-1}(y|y) dv(y,y), b2(x) = ∫₀ˣ [v(y,y) + v₀] dF^{n-1}(y|y), and b3(x) = ∫₀ˣ v(y,y) f^{n-1}(y|y)/(1 − F^{n-1}(y|y)) dy. We find that the firm's investment level increases in the prize; only when the constant prize satisfies v₀ ≥ v(y,y)F^{n-1}(y|y)/(1 − F^{n-1}(y|y)) does the firm invest more aggressively with a constant prize than with a variable prize.

  5. Issues in Value-at-Risk Modeling and Evaluation

    NARCIS (Netherlands)

    J. Danielsson; C.G. de Vries (Casper); B.N. Jorgensen (Bjørn); P.F. Christoffersen (Peter); F.X. Diebold (Francis); T. Schuermann (Til); J.A. Lopez (Jose); B. Hirtle (Beverly)

    1998-01-01

    Discusses the issues in value-at-risk modeling and evaluation: the value of value at risk; horizon problems and extreme events in financial risk management; methods of evaluating value-at-risk estimates.

  6. Creating Value in Marketing and Business Simulations: An Author's Viewpoint

    Science.gov (United States)

    Cadotte, Ernest R.

    2016-01-01

    Simulations are a form of competitive training that can provide transformational learning. Participants are pushed by the competition and their own desire to win as well as the continual feedback, encouragement, and guidance of a Business Coach. Simulations enable students to apply their knowledge and practice their business skills over and over.…

  8. MODELLING, SIMULATING AND OPTIMIZING BOILERS

    DEFF Research Database (Denmark)

    Sørensen, Kim; Condra, Thomas Joseph; Houbak, Niels

    2004-01-01

    on the boiler) have been defined. Furthermore, a number of constraints related to minimum and maximum boiler load gradient, minimum boiler size, shrinking and swelling, and steam space load have been defined. For defining the constraints related to the required boiler volume, a dynamic model for simulating the boiler… size. The model has been formulated with a specified building-up of the pressure during the start-up of the plant, i.e. the steam production during start-up of the boiler is output from the model. The steam outputs, together with requirements with respect to steam space load, have been utilized to define… of the boiler is (with an acceptable accuracy) proportional to the volume of the boiler. For the dynamic operation capability, a cost function penalizing limited dynamic operation capability and vice versa has been defined. The main idea is that, by means of the parameters in this function, it is possible to fit its…

  9. Simulated annealing model of acupuncture

    Science.gov (United States)

    Shang, Charles; Szu, Harold

    2015-05-01

    The growth control singularity model suggests that acupuncture points (acupoints) originate from organizers in embryogenesis. Organizers are singular points in growth control. Acupuncture can cause perturbation of a system with effects similar to simulated annealing. In a clinical trial, the goal of a treatment is to relieve a certain disorder, which corresponds to reaching a certain local optimum in simulated annealing. The self-organizing effect of the system is limited and related to the person's general health and age. Perturbation at acupoints can lead to a stronger local excitation (analogous to a higher annealing temperature) than perturbation at non-singular points (placebo control points). Such difference diminishes as the number of perturbed points increases, due to the wider distribution of the limited self-organizing activity. This model explains the following facts from systematic reviews of acupuncture trials: 1. Properly chosen single-acupoint treatment for certain disorders can lead to highly repeatable efficacy above placebo. 2. When multiple acupoints are used, the result can be highly repeatable if the patients are relatively healthy and young, but is usually mixed if the patients are old, frail and have multiple disorders at the same time, as the number of local optima or comorbidities increases. 3. As the number of acupoints used increases, the efficacy difference between sham and real acupuncture often diminishes. The model predicts that the efficacy of acupuncture is negatively correlated with disease chronicity, severity and patient age. This is the first biological-physical model of acupuncture that can predict and guide clinical acupuncture research.
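    The annealing analogy above maps onto the standard algorithm. A minimal sketch follows, in which the 1-D objective, step size and geometric cooling schedule are all arbitrary choices made for illustration:

```python
import math
import random

def anneal(f, x0, step=0.5, t0=1.0, cooling=0.995, iters=2000, seed=1):
    """Generic simulated annealing on a 1-D objective f (minimisation)."""
    rng = random.Random(seed)
    x, fx, t = x0, f(x0), t0
    best_x, best_f = x, fx
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)
        fc = f(cand)
        # accept downhill moves always; uphill with Boltzmann probability
        if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = x, fx
        t *= cooling  # geometric cooling schedule
    return best_x, best_f
```

    The temperature parameter t plays the role the model assigns to the strength of local excitation: high t permits large uphill excursions early on, while the cooled system settles into a nearby local optimum.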

  10. Advancing Material Models for Automotive Forming Simulations

    Science.gov (United States)

    Vegter, H.; An, Y.; ten Horn, C. H. L. J.; Atzema, E. H.; Roelofsen, M. E.

    2005-08-01

    Simulations in the automotive industry need more advanced material models to achieve highly reliable forming and springback predictions. Conventional material models implemented in FEM simulation models cannot describe plastic material behaviour during monotonic strain paths with sufficient accuracy. Recently, ESI and Corus have co-operated on the implementation of an advanced material model in the FEM code PAMSTAMP 2G. This applies to the strain hardening model, the influence of strain rate, and the description of the yield locus in these models. A subsequent challenge is the description of the material after a change of strain path. The use of advanced high-strength steels in the automotive industry requires a description of the plastic behaviour of multiphase steels. The simplest variant is dual-phase steel, consisting of a ferritic and a martensitic phase. Multiphase materials also contain a bainitic phase in addition to the ferritic and martensitic phases. More physical descriptions of strain hardening than simple fitted Ludwik/Nadai curves are necessary. Methods to predict the plastic behaviour of single-phase materials use a simple dislocation interaction model based only on the formed cell structures. At Corus, a new method is proposed to predict the plastic behaviour of multiphase materials; it has to take into account hard phases, which deform less easily. The resulting deformation gradients create geometrically necessary dislocations. Additional microstructural information, such as the morphology and size of hard-phase particles or grains, is necessary to derive the strain hardening models for this type of material. Measurements available from the Numisheet benchmarks allow these models to be validated. At Corus, additional measured values are available from cross-die tests. This laboratory test can attain critical deformations through large variations in blank size and processing conditions. The tests are a powerful tool in optimising forming simulations.

  11. Measuring Virtual Simulations Value in Training Exercises - USMC Use Case

    Science.gov (United States)

    2015-12-04

    The assessment results provide support for the thesis that both the primary and secondary training audiences are able to realize training value through … assessments for T&R standards. Post-Event Training Impacts: In addition to the data collected during LSE-14, the realized training value of … The realized value of the LVC integrated training capability has resulted in it being a required training event prior to the Integrated Training Exercise.

  12. Vertical eddy heat fluxes from model simulations

    Science.gov (United States)

    Stone, Peter H.; Yao, Mao-Sung

    1991-01-01

    Vertical eddy fluxes of heat are calculated from simulations with a variety of climate models, ranging from three-dimensional GCMs to a one-dimensional radiative-convective model. The models' total eddy flux in the lower troposphere is found to agree well with Hantel's analysis from observations, but in the mid and upper troposphere the models' values are systematically 30 percent to 50 percent smaller than Hantel's. The models nevertheless give very good results for the global temperature profile, and the reason for the discrepancy is unclear. The model results show that the manner in which the vertical eddy flux is carried is very sensitive to the parameterization of moist convection. When a moist adiabatic adjustment scheme with a critical value for the relative humidity of 100 percent is used, the vertical transports by large-scale eddies and small-scale convection on a global basis are equal; but when a penetrative convection scheme is used, the large-scale flux on a global basis is only about one-fifth to one-fourth the small-scale flux. Comparison of the model results with observations indicates that the results with the latter scheme are more realistic. However, even in this case, in mid and high latitudes the large- and small-scale vertical eddy fluxes of heat are comparable in magnitude above the planetary boundary layer.

  13. Applications of Joint Tactical Simulation Modeling

    Science.gov (United States)

    1997-12-01

    NAVAL POSTGRADUATE SCHOOL, Monterey, California. Thesis: Applications of Joint Tactical Simulation Modeling, by Steve VanLandingham, Lieutenant, United States Navy, December 1997. Approved for public release; distribution is unlimited.

  14. Stochastic models to simulate paratuberculosis in dairy herds

    DEFF Research Database (Denmark)

    Nielsen, S.S.; Weber, M.F.; Kudahl, Anne Margrethe Braad

    2011-01-01

    Stochastic simulation models are widely accepted as a means of assessing the impact of changes in daily management and the control of different diseases, such as paratuberculosis, in dairy herds. This paper summarises and discusses the assumptions of four stochastic simulation models and their use… Although the models are somewhat different in their underlying principles and do put slightly different values on the different strategies, their overall findings are similar. Therefore, simulation models may be useful in planning paratuberculosis strategies in dairy herds, although, as with all models, caution…

  15. Petroleum reservoir data for testing simulation models

    Energy Technology Data Exchange (ETDEWEB)

    Lloyd, J.M.; Harrison, W.

    1980-09-01

    This report consists of reservoir pressure and production data for 25 petroleum reservoirs. Included are 5 data sets for single-phase (liquid) reservoirs, 1 data set for a single-phase (liquid) reservoir with pressure maintenance, 13 data sets for two-phase (liquid/gas) reservoirs and 6 for two-phase reservoirs with pressure maintenance. Also given are ancillary data for each reservoir that could be of value in the development and validation of simulation models. A bibliography is included that lists the publications from which the data were obtained.

  16. Schwinger model simulations with dynamical overlap fermions

    CERN Document Server

    Bietenholz, W; Volkholz, J

    2007-01-01

    We present simulation results for the 2-flavour Schwinger model with dynamical overlap fermions. In particular we apply the overlap hypercube operator at seven light fermion masses. In each case we collect sizable statistics in the topological sectors 0 and 1. Since the chiral condensate Sigma vanishes in the chiral limit, we observe densities for the microscopic Dirac spectrum, which have not been addressed yet by Random Matrix Theory (RMT). Nevertheless, by confronting the averages of the lowest eigenvalues in different topological sectors with chiral RMT in the unitary ensemble we obtain -- for the very light fermion masses -- values for Sigma that follow closely the analytical predictions in the continuum.

  17. Schwinger model simulations with dynamical overlap fermions

    Energy Technology Data Exchange (ETDEWEB)

    Bietenholz, W. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Shcheredin, S. [Bielefeld Univ. (Germany). Fakultaet fuer Physik; Volkholz, J. [Humboldt-Universitaet, Berlin (Germany). Inst. fuer Physik

    2007-11-15

    We present simulation results for the 2-flavour Schwinger model with dynamical overlap fermions. In particular we apply the overlap hypercube operator at seven light fermion masses. In each case we collect sizable statistics in the topological sectors 0 and 1. Since the chiral condensate Sigma vanishes in the chiral limit, we observe densities for the microscopic Dirac spectrum, which have not been addressed yet by Random Matrix Theory (RMT). Nevertheless, by confronting the averages of the lowest eigenvalues in different topological sectors with chiral RMT in the unitary ensemble we obtain - for the very light fermion masses - values for Sigma that follow closely the analytical predictions in the continuum. (orig.)

  18. Benchmark simulation models, quo vadis?

    DEFF Research Database (Denmark)

    Jeppsson, U.; Alex, J; Batstone, D. J.

    2013-01-01

    As the work of the IWA Task Group on Benchmarking of Control Strategies for wastewater treatment plants (WWTPs) is coming to an end, it is essential to disseminate the knowledge gained. For this reason, all authors of the IWA Scientific and Technical Report on benchmarking have come together to p… and spatial extension, process modifications within the WWTP, the realism of models, control strategy extensions and the potential for new evaluation tools within the existing benchmark system. We find that there are major opportunities for application within all of these areas, either from existing work already being done within the context of the benchmark simulation models (BSMs) or applicable work in the wider literature. Of key importance is increasing the capability, usability and transparency of the BSM package while avoiding unnecessary complexity.

  19. The Advancement Value Chain: An Exploratory Model

    Science.gov (United States)

    Leonard, Edward F., III

    2005-01-01

    Since the introduction of the value chain concept in 1985, several varying, yet virtually similar, value chains have been developed for the business enterprise. Shifting to higher education, can a value chain be found that links together the various activities of advancement so that an institution's leaders can actually look at the philanthropic…

  1. SWEEPOP: a simulation model for Target Simulation Mode minesweeping

    NARCIS (Netherlands)

    Keus, H.E.; Beckers, A.L.D.; Cleophas, P.L.H.

    2005-01-01

    SWEEPOP is a flexible model that simulates the physical interaction between objects in a maritime underwater environment. The model was built to analyse the deployment and the performance of a Target Simulation Mode (TSM) minesweeping system for the Royal Netherlands Navy (RNLN) and to support its p

  2. Runoff Simulation of Shitoukoumen Reservoir Basin Based on SWAT Model

    Institute of Scientific and Technical Information of China (English)

    XIE; Miao; LI; Hong-yan; LIU; Tie-juan; RU; Shi-rong

    2012-01-01

    [Objective] The study aimed to simulate the runoff of the Shitoukoumen Reservoir basin using the SWAT model. [Method] Based on DEM elevation, land use type, soil type and hydrometeorological data, SWAT, a distributed hydrological model, was established to simulate the monthly runoff of the Shitoukoumen Reservoir basin; the years 2006 and 2010 were chosen as the calibration and validation periods, respectively. [Result] The simulation results indicated that the SWAT model could be used to simulate the runoff of the Shitoukoumen Reservoir basin, and the simulation effect was good. However, the response of the model to local rainstorms was not obvious, so that the actual runoff in June and July of 2010 was abnormally higher than the simulated value. [Conclusion] The research could provide theoretical references for the planning and management of water resources in the Shitoukoumen Reservoir basin in the future.

  3. An infinitesimal model for quantitative trait genomic value prediction.

    Directory of Open Access Journals (Sweden)

    Zhiqiu Hu

    Full Text Available We developed a marker-based infinitesimal model for quantitative trait analysis. In contrast to the classical infinitesimal model, we now have new information about the segregation of every individual locus of the entire genome. Under this new model, we propose that the genetic effect of an individual locus is a function of genome location (a continuous quantity). The overall genetic value of an individual is the weighted integral of the genetic effect function along the genome. Numerical integration is performed to find the integral, which requires partitioning the entire genome into a finite number of bins. Each bin may contain many markers. The integral is approximated by the weighted sum of all the bin effects. We thus turn the problem of marker analysis into bin analysis, so that the model dimension has decreased from a virtual infinity to a finite number of bins. This new approach can efficiently handle a virtually unlimited number of markers without marker selection. The marker-based infinitesimal model requires high linkage disequilibrium of all markers within a bin. For populations with low or no linkage disequilibrium, we develop an adaptive infinitesimal model. Both the original and the adaptive models are tested using simulated data as well as beef cattle data. The simulated data analysis shows that there is always an optimal number of bins at which the predictability of the bin model is much greater than that of the original marker analysis. The beef cattle data analysis indicates that the bin model can increase predictability from 10% (multiple marker analysis) to 33% (multiple bin analysis). The marker-based infinitesimal model paves a way towards the solution of genetic mapping and genomic selection using whole genome sequence data.
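    The central move described above — collapsing many markers into a finite number of bin effects — can be sketched in a few lines. Everything below (marker matrix shape, bin count, ridge penalty) is a made-up illustration, not the paper's estimator:

```python
import numpy as np

def bin_genotypes(genotypes, n_bins):
    """Average adjacent marker columns into n_bins bins, turning an
    (individuals x markers) matrix into (individuals x n_bins)."""
    n, m = genotypes.shape
    edges = np.linspace(0, m, n_bins + 1).astype(int)
    return np.column_stack([
        genotypes[:, edges[i]:edges[i + 1]].mean(axis=1)
        for i in range(n_bins)
    ])

def ridge_fit_predict(bins, y, lam=1.0):
    """Ridge regression on the bin matrix: beta = (B'B + lam*I)^-1 B'y.
    Returns fitted genomic values for the training individuals."""
    beta = np.linalg.solve(bins.T @ bins + lam * np.eye(bins.shape[1]),
                           bins.T @ y)
    return bins @ beta
```

    The model dimension drops from the number of markers (here 1000) to the number of bins (here 50), which is what makes whole-genome data tractable; the optimal bin count the abstract mentions would be chosen by cross-validation.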

  4. Optimisation of a Crossdocking Distribution Centre Simulation Model

    CERN Document Server

    Adewunmi, Adrian

    2010-01-01

    This paper reports on continuing research into the modelling of an order picking process within a Crossdocking distribution centre using Simulation Optimisation. The aim of this project is to optimise a discrete event simulation model and to understand the factors that affect finding its optimal performance. Our initial investigation revealed that the precision of the selected simulation output performance measure, and the number of replications required for the evaluation of the optimisation objective function through simulation, influence the ability of the optimisation technique. We experimented with Common Random Numbers in order to improve the precision of our simulation output performance measure, and intended to use the number of replications utilised for this purpose as the initial number of replications for the optimisation of our Crossdocking distribution centre simulation model. Our results demonstrate that we can improve the precision of our selected simulation output performance measure value using C...
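    Common Random Numbers drive competing configurations with the same random stream in each replication, so their difference is estimated with lower variance than with independent draws. The toy model below (a "picker speed" parameter and exponential job sizes) is an assumed stand-in, not the paper's crossdocking simulation:

```python
import random

def cycle_time(picker_speed, n_jobs=500, seed=None):
    """Toy order-picking model: mean job cycle time for a given speed."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_jobs):
        work = rng.expovariate(1.0)       # job size (same stream if seeded)
        total += work / picker_speed      # service time
    return total / n_jobs

def compare_crn(speed_a, speed_b, n_reps=20):
    """Estimate the performance difference between two configurations
    under Common Random Numbers: replication r uses seed r for BOTH
    runs, so the same job sizes hit both configurations."""
    diffs = [cycle_time(speed_a, seed=r) - cycle_time(speed_b, seed=r)
             for r in range(n_reps)]
    return sum(diffs) / n_reps
```

    With independent seeds per configuration, the same precision on the difference would require many more replications, which is why the authors use the CRN replication count as the starting point for the optimisation runs.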

  5. PDF added value of a high resolution climate simulation for precipitation

    Science.gov (United States)

    Soares, Pedro M. M.; Cardoso, Rita M.

    2015-04-01

    …dynamical downscaling, based on simple PDF skill scores. The measure can assess the full quality of the PDFs and at the same time integrates a flexible manner of weighting the PDF tails differently. In this study we apply the referred method to characterize the PDF added value of a high-resolution simulation with the WRF model. Results come from a WRF climate simulation centred on the Iberian Peninsula with two nested grids, a larger one at 27 km and a smaller one at 9 km; this simulation is forced by ERA-Interim. The observational data used covers rain gauge precipitation records as well as observational regular grids of daily precipitation. Two regular gridded precipitation datasets are used: a Portuguese gridded precipitation dataset developed at 0.2° × 0.2° from observed rain gauge daily precipitation, and the ENSEMBLES observational gridded dataset for Europe, which includes daily precipitation values at 0.25°. The analysis shows an important PDF added value from the higher resolution simulation, regarding the full PDF and the extremes. The method shows high potential to be applied to other simulation exercises and to evaluate other variables.
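    A common way to quantify "PDF added value" is the overlap of two empirical PDFs (a Perkins-type skill score). The sketch below is a generic, unweighted version of such a score; the study's weighting of the PDF tails is not reproduced here.

```python
import numpy as np

def pdf_overlap_score(model, obs, bins=50):
    """Overlap of two empirical PDFs: sum over common bins of the minimum
    of the two normalised histograms. 1.0 means identical distributions,
    0.0 means disjoint support."""
    model = np.asarray(model, dtype=float)
    obs = np.asarray(obs, dtype=float)
    lo = min(model.min(), obs.min())
    hi = max(model.max(), obs.max())
    pm, _ = np.histogram(model, bins=bins, range=(lo, hi))
    po, _ = np.histogram(obs, bins=bins, range=(lo, hi))
    return float(np.minimum(pm / pm.sum(), po / po.sum()).sum())
```

    Added value of the 9 km grid over the 27 km grid would then be the difference between their scores against the same observed precipitation PDF.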

  6. MODELLING, SIMULATING AND OPTIMIZING BOILERS

    DEFF Research Database (Denmark)

    Sørensen, K.; Condra, T.; Houbak, Niels

    2003-01-01

    , and the total stress level (i.e. stresses introduced due to internal pressure plus stresses introduced due to temperature gradients) must always be kept below the allowable stress level. In this way, the increased water-/steam space that should allow for better dynamic performance in the end causes limited freedom with respect to dynamic operation of the plant. By means of an objective function including the price of the plant as well as a quantification of the value of dynamic operation of the plant, an optimization is carried out. The dynamic model of the boiler plant is applied to define parts...

  7. Modelling, simulating and optimizing Boilers

    DEFF Research Database (Denmark)

    Sørensen, Kim; Condra, Thomas Joseph; Houbak, Niels

    2003-01-01

    , and the total stress level (i.e. stresses introduced due to internal pressure plus stresses introduced due to temperature gradients) must always be kept below the allowable stress level. In this way, the increased water-/steam space that should allow for better dynamic performance in the end causes limited freedom with respect to dynamic operation of the plant. By means of an objective function including the price of the plant as well as a quantification of the value of dynamic operation of the plant, an optimization is carried out. The dynamic model of the boiler plant is applied to define parts...

  8. Techniques and Simulation Models in Risk Management

    OpenAIRE

    Mirela GHEORGHE

    2012-01-01

    In the present paper, the scientific approach of the research starts from the theoretical framework of the simulation concept and then continues into the setting of practical reality, providing simulation models for a broad range of inherent risks specific to any organization, and simulating those models using the informatics instrument @Risk (Palisade). The reason behind this research lies in the need for simulation models that will allow the person in charge of decision taking i...

  9. One-Step Dynamic Classifier Ensemble Model for Customer Value Segmentation with Missing Values

    OpenAIRE

    Jin Xiao; Bing Zhu; Geer Teng; Changzheng He; Dunhu Liu

    2014-01-01

    Scientific customer value segmentation (CVS) is the basis of efficient customer relationship management; customer credit scoring, fraud detection, and churn prediction all belong to CVS. In real CVS, the customer data usually include many missing values, which may greatly affect the performance of the CVS model. This study proposes a one-step dynamic classifier ensemble model for missing values (ODCEM). On the one hand, ODCEM integrates the preprocessing of missing values and the classif...

  10. Simulation and Modeling Application in Agricultural Mechanization

    Directory of Open Access Journals (Sweden)

    R. M. Hudzari

    2012-01-01

    Full Text Available This experiment was conducted to determine the equations relating the Hue digital values of the fruit surface of the oil palm to the maturity stage of the fruit in the plantation. The FFB images were zoomed and captured using a Nikon digital camera, and the Hue was calculated using the highest-frequency value of the R, G, and B color components from histogram analysis software. A new procedure for monitoring the image pixel value of the oil palm fruit surface color during real-time growth to maturity was developed. The predicted harvesting day was calculated based on the developed model of the relationship between Hue values and mesocarp oil content. The simulation model is regressed and predicts the day of harvesting, or the number of days before harvest, of the FFB. The results of the mesocarp oil content experiment can be used for real-time oil content determination with the MPOB color meter. A graph to determine the day of harvesting the FFB is presented in this research. The oil was found to start developing in the mesocarp at 65 days before the ripe maturity stage, at which the fruit reaches 75% oil to dry mesocarp.
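    The hue-from-histogram step described in this abstract can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code: the sample patch, the 0-255 channel range, and the use of Python's `colorsys` module are assumptions.

    ```python
    import colorsys

    def dominant_hue(pixels):
        """Estimate the hue of a fruit-surface region from the most
        frequent (modal) value of each R, G, B channel (0-255)."""
        modes = []
        for channel in range(3):
            counts = [0] * 256
            for px in pixels:
                counts[px[channel]] += 1
            modes.append(counts.index(max(counts)))  # highest-frequency value
        r, g, b = (m / 255.0 for m in modes)
        h, _s, _v = colorsys.rgb_to_hsv(r, g, b)
        return h * 360.0  # hue in degrees

    # A purely illustrative patch of reddish-orange "ripe" pixels:
    patch = [(220, 80, 30)] * 50 + [(210, 90, 40)] * 30
    hue = dominant_hue(patch)
    ```

    Mapping such a hue onto the regression against mesocarp oil content would then give the estimated days to harvest.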

  11. Bridging experiments, models and simulations

    DEFF Research Database (Denmark)

    Carusi, Annamaria; Burrage, Kevin; Rodríguez, Blanca

    2012-01-01

    Computational models in physiology often integrate functional and structural information from a large range of spatiotemporal scales from the ionic to the whole organ level. Their sophistication raises both expectations and skepticism concerning how computational methods can improve our understanding of living organisms and also how they can reduce, replace, and refine animal experiments. A fundamental requirement to fulfill these expectations and achieve the full potential of computational physiology is a clear understanding of what models represent and how they can be validated. The present ... of biovariability; 2) testing and developing robust techniques and tools as a prerequisite to conducting physiological investigations; 3) defining and adopting standards to facilitate the interoperability of experiments, models, and simulations; 4) and understanding physiological validation as an iterative process that contributes to defining the specific aspects of cardiac electrophysiology the MSE system targets, rather than being only an external test, and that this is driven by advances in experimental and computational methods and the combination of both.

  12. Psychosocial value of space simulation for extended spaceflight

    Science.gov (United States)

    Kanas, N.

    1997-01-01

    There have been over 60 studies of Earth-bound activities that can be viewed as simulations of manned spaceflight. These analogs have involved Antarctic and Arctic expeditions, submarines and submersible simulators, land-based simulators, and hypodynamia environments. None of these analogs has accounted for all the variables related to extended spaceflight (e.g., microgravity, long duration, heterogeneous crews), and some of the simulation conditions have been found to be more representative of space conditions than others. A number of psychosocial factors have emerged from the simulation literature that correspond to important issues that have been reported from space. Psychological factors include sleep disorders, alterations in time sense, transcendent experiences, demographic issues, career motivation, homesickness, and increased perceptual sensitivities. Psychiatric factors include anxiety, depression, psychosis, psychosomatic symptoms, emotional reactions related to mission stage, asthenia, and postflight personality and marital problems. Finally, interpersonal factors include tension resulting from crew heterogeneity, decreased cohesion over time, need for privacy, and issues involving leadership roles and lines of authority. Since future space missions will usually involve heterogeneous crews working on complicated objectives over long periods of time, these features require further study. Socio-cultural factors affecting confined crews (e.g., language and dialect, cultural differences, gender biases) should be explored in order to minimize tension and sustain performance. Career motivation also needs to be examined for the purpose of improving crew cohesion and preventing subgrouping, scapegoating, and territorial behavior. Periods of monotony and reduced activity should be addressed in order to maintain morale, provide meaningful use of leisure time, and prevent negative consequences of low stimulation, such as asthenia and crew member withdrawal.

  13. The role of non-epistemic values in engineering models.

    Science.gov (United States)

    Diekmann, Sven; Peterson, Martin

    2013-03-01

    We argue that non-epistemic values, including moral ones, play an important role in the construction and choice of models in science and engineering. Our main claim is that non-epistemic values are not merely "secondary values" that become important only in case epistemic values leave some issues open. Our point is, on the contrary, that non-epistemic values are as important as epistemic ones when engineers seek to develop the best model of a process or problem. The upshot is that models are neither value-free, nor do they depend exclusively on epistemic values or use non-epistemic values merely as tie-breakers.

  14. Simulation of Heat Transfer to the Gas Coolant with Low Prandtl Number Value

    Directory of Open Access Journals (Sweden)

    T. N. Kulikova

    2015-01-01

    Full Text Available The work concerns the peculiarities of simulating heat transfer to gas coolants with low values of the Prandtl number, in particular to binary mixtures of inert gases. The paper presents simulation results of heat transfer to the fully established flow of a helium-xenon mixture in a round tube of 6 mm in diameter with a boundary condition of the second kind. It considers the flow of three helium-xenon mixtures with different helium content and molecular Prandtl numbers within the range 0.239–0.322, and with Reynolds numbers ranging from 10000 to 50000. During numerical simulation the temperature factor changed from 1.034 to 1.061. The CFD code STAR-CCM+, designed for solving a wide range of problems in hydrodynamics, heat transfer and stress analysis, was used as the primary software. The applicability of five models for the turbulent Prandtl number is examined. It is shown that the choice of the model has a significant influence on the heat transfer coefficient. The paper presents structural characteristics of the flow in the wall region. It estimates the thermal stabilization section to be approximately 30 tube diameters long. Simulation results are compared with the known data on heat transfer to gas coolants with low values of the Prandtl number. It is shown that the V2F low-Reynolds-number turbulence model, with the approximation for the turbulent Prandtl number used according to Kays, Crawford and Weigand, gives the best agreement with the results predicted by the relationships of Kays W.M. and Petukhov B.S. An approximating correlation summarizes the set of simulation results. Application of the results is reasonable when conducting numerical simulation of heat transfer to binary gas mixtures in channels of different forms. The presented approximating correlation allows a rapid estimate of heat transfer coefficients for gas coolants with a low value of the molecular Prandtl number within the investigated range with a flow through the

  15. Extreme Value Predictions using Monte Carlo Simulations with Artificially Increased Load Spectrum

    DEFF Research Database (Denmark)

    Jensen, Jørgen Juncher

    2011-01-01

    In the analysis of structures subjected to stationary stochastic load processes the mean out-crossing rate plays an important role, as it can be used to determine the extreme value distribution of any response, usually assuming that the sequence of mean out-crossings can be modelled as a Poisson process. ... to be valid in the Monte Carlo simulations, making it possible to increase the out-crossing rates and thus reduce the necessary length of the time domain simulations by applying a larger load spectrum than relevant from a design point of view. The mean out-crossing rate thus obtained can then afterwards be scaled down to its actual value. In the present paper the usefulness of this approach is investigated, considering problems related to wave loads on marine structures. Here the load scale parameter is conveniently taken as the square of the significant wave height.
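    The Poisson out-crossing argument underlying this abstract can be illustrated with a toy stationary process. This is a sketch under stated assumptions, not the paper's marine-load setup: an AR(1) process stands in for the structural response, the time step is unity, and the crossing level is arbitrary.

    ```python
    import math
    import random

    def simulate_ar1(n, phi=0.9, sigma=1.0, seed=1):
        """Toy stationary Gaussian response: AR(1) with lag-1 correlation phi."""
        rng = random.Random(seed)
        x, out = 0.0, []
        innov = sigma * math.sqrt(1.0 - phi * phi)  # keeps variance = sigma^2
        for _ in range(n):
            x = phi * x + innov * rng.gauss(0.0, 1.0)
            out.append(x)
        return out

    def upcrossing_rate(series, level, dt):
        """Mean rate of up-crossings of `level` in one realisation."""
        n = sum(1 for a, b in zip(series, series[1:]) if a < level <= b)
        return n / (dt * (len(series) - 1))

    series = simulate_ar1(200_000)
    nu = upcrossing_rate(series, level=3.0, dt=1.0)

    # Poisson assumption: probability the extreme response over duration T
    # stays below the level is exp(-nu * T).
    T = 1000.0
    p_below = math.exp(-nu * T)
    ```

    Scaling the load spectrum up, as the paper proposes, raises `nu` so that far fewer simulated time steps are needed to observe crossings; the rate is then scaled back down before forming the extreme value distribution.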

  16. Distributed simulation a model driven engineering approach

    CERN Document Server

    Topçu, Okan; Oğuztüzün, Halit; Yilmaz, Levent

    2016-01-01

    Backed by substantive case studies, the novel approach to software engineering for distributed simulation outlined in this text demonstrates the potent synergies between model-driven techniques, simulation, intelligent agents, and computer systems development.

  17. Benchmark simulation models, quo vadis?

    Science.gov (United States)

    Jeppsson, U; Alex, J; Batstone, D J; Benedetti, L; Comas, J; Copp, J B; Corominas, L; Flores-Alsina, X; Gernaey, K V; Nopens, I; Pons, M-N; Rodríguez-Roda, I; Rosen, C; Steyer, J-P; Vanrolleghem, P A; Volcke, E I P; Vrecko, D

    2013-01-01

    As the work of the IWA Task Group on Benchmarking of Control Strategies for wastewater treatment plants (WWTPs) is coming to an end, it is essential to disseminate the knowledge gained. For this reason, all authors of the IWA Scientific and Technical Report on benchmarking have come together to provide their insights, highlighting areas where knowledge may still be deficient and where new opportunities are emerging, and to propose potential avenues for future development and application of the general benchmarking framework and its associated tools. The paper focuses on the topics of temporal and spatial extension, process modifications within the WWTP, the realism of models, control strategy extensions and the potential for new evaluation tools within the existing benchmark system. We find that there are major opportunities for application within all of these areas, either from existing work already being done within the context of the benchmarking simulation models (BSMs) or applicable work in the wider literature. Of key importance is increasing capability, usability and transparency of the BSM package while avoiding unnecessary complexity.

  18. The New Digital Media Value Network: Proposing an Interactive Model of Digital Media Value Activities

    Directory of Open Access Journals (Sweden)

    Sylvia Chan-Olmsted

    2016-07-01

    Full Text Available This study models the dynamic nature of today’s media markets using the framework of value-adding activities in the provision and consumption of media products. The proposed user-centric approach introduces the notion that the actions of external users, social media, and interfaces affect the internal value activities of media firms via a feedback loop, and therefore should themselves be considered value activities. The model also suggests a more comprehensive list of indicators for value assessment.

  19. Modeling and Simulation of Photoelectronic Lambda Bipolar Transistor

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    Based on the region model of the lambda bipolar transistor (LBT), a dividing-region theoretical model of the PLBT is set up, simulated and verified. Firstly, the principal operation of different kinds of photoelectronic lambda bipolar transistor (PLBT) is characterized by a simple circuit model. Through mathematical analysis of the equivalent circuit, the typical characteristics curve is divided into positive resistance, peak, negative resistance and cutoff regions. Secondly, by analyzing and simulating this model, the ratio of MOSFET width to channel length, the threshold voltage and the common emitter gain are identified as the main structural parameters that determine the characteristic curves of the PLBT, and the peak region width, peak current value, negative resistance value and valley voltage value of the PLBT can be changed conveniently according to actual demands by modifying these parameters. Finally, comparisons of the characteristics of the fabricated devices and the simulation results are made, which show that the analytical results are in agreement with the observed device characteristics.

  20. Structured building model reduction toward parallel simulation

    Energy Technology Data Exchange (ETDEWEB)

    Dobbs, Justin R. [Cornell University; Hencey, Brondon M. [Cornell University

    2013-08-26

    Building energy model reduction exchanges accuracy for improved simulation speed by reducing the number of dynamical equations. Parallel computing aims to improve simulation times without loss of accuracy but is poorly utilized by contemporary simulators and is inherently limited by inter-processor communication. This paper bridges these disparate techniques to implement efficient parallel building thermal simulation. We begin with a survey of three structured reduction approaches that compares their performance to a leading unstructured method. We then use structured model reduction to find thermal clusters in the building energy model and allocate processing resources. Experimental results demonstrate faster simulation and low error without any interprocessor communication.

  1. Simulation and Modeling Methodologies, Technologies and Applications

    CERN Document Server

    Filipe, Joaquim; Kacprzyk, Janusz; Pina, Nuno

    2014-01-01

    This book includes extended and revised versions of a set of selected papers from the 2012 International Conference on Simulation and Modeling Methodologies, Technologies and Applications (SIMULTECH 2012) which was sponsored by the Institute for Systems and Technologies of Information, Control and Communication (INSTICC) and held in Rome, Italy. SIMULTECH 2012 was technically co-sponsored by the Society for Modeling & Simulation International (SCS), GDR I3, Lionphant Simulation, Simulation Team and IFIP and held in cooperation with AIS Special Interest Group of Modeling and Simulation (AIS SIGMAS) and the Movimento Italiano Modellazione e Simulazione (MIMOS).

  2. An introduction to enterprise modeling and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Ostic, J.K.; Cannon, C.E. [Los Alamos National Lab., NM (United States). Technology Modeling and Analysis Group

    1996-09-01

    As part of an ongoing effort to continuously improve productivity, quality, and efficiency of both industry and Department of Energy enterprises, Los Alamos National Laboratory is investigating various manufacturing and business enterprise simulation methods. A number of enterprise simulation software models are being developed to enable engineering analysis of enterprise activities. In this document the authors define the scope of enterprise modeling and simulation efforts, and review recent work in enterprise simulation at Los Alamos National Laboratory as well as at other industrial, academic, and research institutions. References of enterprise modeling and simulation methods and a glossary of enterprise-related terms are provided.

  3. Linear regression model selection using p-values when the model dimension grows

    CERN Document Server

    Pokarowski, Piotr; Teisseyre, Paweł

    2012-01-01

    We consider a new criterion-based approach to model selection in linear regression. Properties of selection criteria based on p-values of a likelihood ratio statistic are studied for families of linear regression models. We prove that such procedures are consistent, i.e. the minimal true model is chosen with probability tending to 1, even when the number of models under consideration slowly increases with the sample size. The simulation study indicates that the introduced methods perform promisingly when compared with the Akaike and Bayesian Information Criteria.
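    A minimal sketch of p-value-based selection between two nested linear models follows. The synthetic data, the RSS-based likelihood ratio statistic, and the Wilks chi-square(1) tail probability via `math.erfc` are illustrative assumptions, not the authors' procedure.

    ```python
    import math

    def rss_const(y):
        """Residual sum of squares of the intercept-only model."""
        m = sum(y) / len(y)
        return sum((v - m) ** 2 for v in y)

    def rss_linear(x, y):
        """Residual sum of squares of simple linear regression y = a + b*x."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        sxx = sum((u - mx) ** 2 for u in x)
        sxy = sum((u - mx) * (v - my) for u, v in zip(x, y))
        b = sxy / sxx
        a = my - b * mx
        return sum((v - (a + b * u)) ** 2 for u, v in zip(x, y))

    def lr_pvalue(x, y):
        """p-value of the likelihood ratio test for the slope term (one
        extra parameter), using the asymptotic chi-square(1) law."""
        n = len(y)
        lr = n * math.log(rss_const(y) / rss_linear(x, y))
        return math.erfc(math.sqrt(lr / 2.0))  # chi2(1) tail probability

    # Data with a clear slope: the larger model should be selected.
    x = [i / 10 for i in range(50)]
    y = [1.0 + 0.5 * xi + 0.1 * math.sin(7 * xi) for xi in x]
    p = lr_pvalue(x, y)
    select_larger = p < 0.05
    ```

    A p-value-based criterion of the kind studied in the paper would apply such tests along a nested family of models and stop at the smallest model no test rejects.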

  4. The value of multivariate model sophistication

    DEFF Research Database (Denmark)

    Rombouts, Jeroen; Stentoft, Lars; Violante, Francesco

    2014-01-01

    in their specification of the conditional variance, conditional correlation, innovation distribution, and estimation approach. All of the models belong to the dynamic conditional correlation class, which is particularly suitable because it allows consistent estimation of the risk neutral dynamics with a manageable ... of correlation models, we propose a new model that allows for correlation spillovers without too many parameters. This model performs about 60% better than the existing correlation models we consider. Relaxing a Gaussian innovation for a Laplace innovation assumption improves the pricing in a more minor way ...

  5. A physiological production model for cacao : results of model simulations

    NARCIS (Netherlands)

    Zuidema, P.A.; Leffelaar, P.A.

    2002-01-01

    CASE2 is a physiological model for cocoa (Theobroma cacao L.) growth and yield. This report introduces the CAcao Simulation Engine for water-limited production in a non-technical way and presents simulation results obtained with the model.

  7. On the added value of WUDAPT for Urban Climate Modelling

    Science.gov (United States)

    Brousse, Oscar; Martilli, Alberto; Mills, Gerald; Bechtel, Benjamin; Hammerberg, Kris; Demuzere, Matthias; Wouters, Hendrik; Van Lipzig, Nicole; Ren, Chao; Feddema, Johannes J.; Masson, Valéry; Ching, Jason

    2017-04-01

    Over half of the planet's population now lives in cities, a share expected to grow to 65% by 2050 (United Nations, 2014); most of that growth will occur in emerging cities of the global South. Cities' impact on climate is known to be a key driver of environmental change (IPCC, 2014) and has been studied for decades (Howard, 1875). Still, very little is known about the structure of cities around the world, preventing urban climate simulations from being performed and hence guidance from being provided for mitigation. Assessing the need to bridge this urban knowledge gap for urban climate modelling, the World Urban Database and Access Portal Tool - WUDAPT - project (Ching et al., 2015; Mills et al., 2015) developed an innovative technique to map cities globally, rapidly and freely. The framework established by Bechtel and Daneke (2012) derives Local Climate Zone (Stewart and Oke, 2012) city maps from LANDSAT 8 OLI-TIRS imagery (Bechtel et al., 2015) through a supervised classification with a Random Forest algorithm (Breiman, 2001). The first attempt to implement Local Climate Zones (LCZ) from the WUDAPT product within a major climate model was carried out by Brousse et al. (2016) over Madrid, Spain. This study proved the applicability of LCZs as an enhanced urban parameterization within the WRF model (Chen et al., 2011), employing the urban canopy model BEP-BEM (Martilli, 2002; Salamanca et al., 2010) and using the averaged values of the morphological and physical parameter ranges proposed by Stewart and Oke (2012). Other studies have since used Local Climate Zones for urban climate modelling purposes (Alexander et al., 2016; Wouters et al., 2016; Hammerberg et al., 2017; Brousse et al., 2017) and demonstrated the added value of the WUDAPT dataset. As urban data accessibility is one of the major challenges for simulations in emerging countries, this presentation will show results of simulations using LCZs and the capacity of the WUDAPT framework to be

  8. Characteristics of Hands-On Simulations with Added Value for Innovative Secondary and Higher Vocational Education

    Science.gov (United States)

    Khaled, Anne; Gulikers, Judith; Biemans, Harm; van der Wel, Marjan; Mulder, Martin

    2014-01-01

    The intentions with which hands-on simulations are used in vocational education are not always clear. Also, pedagogical-didactic approaches in hands-on simulations are not well conceptualised from a learning theory perspective. This makes it difficult to pinpoint the added value that hands-on simulations can have in an innovative vocational…

  9. The Added Value of Business Models

    NARCIS (Netherlands)

    Vliet, Harry van

    2014-01-01

    An overview of innovations in a particular area, for example retail developments in the fashion sector (Van Vliet, 2014), and a subsequent discussion about the probability as to whether these innovations will realise a ‘breakthrough’, has to be supplemented with the question of what the added value

  10. Modeling value creation with enterprise architecture

    NARCIS (Netherlands)

    Singh, Prince Mayurank; Jonkers, H.; Iacob, Maria Eugenia; van Sinderen, Marten J.

    2014-01-01

    Firms may not succeed in business if strategies are not properly implemented in practice. Every firm needs to know, represent and master its value creation logic, not only to stay in business but also to keep growing. This paper is about focusing on an important topic in the field of strategic

  11. Simulation modeling and analysis with Arena

    CERN Document Server

    Altiok, Tayfur

    2007-01-01

    Simulation Modeling and Analysis with Arena is a highly readable textbook which treats the essentials of the Monte Carlo discrete-event simulation methodology, and does so in the context of the popular Arena simulation environment. It treats simulation modeling as an in-vitro laboratory that facilitates the understanding of complex systems and experimentation with what-if scenarios in order to estimate their performance metrics. The book contains chapters on the simulation modeling methodology and the underpinnings of discrete-event systems, as well as the relevant underlying probability, statistics, stochastic processes, input analysis, model validation and output analysis. All simulation-related concepts are illustrated in numerous Arena examples, encompassing production lines, manufacturing and inventory systems, transportation systems, and computer information systems in networked settings. It introduces the concept of discrete event Monte Carlo simulation, the most commonly used methodology for modeli...

  12. Nonsmooth Modeling and Simulation for Switched Circuits

    CERN Document Server

    Acary, Vincent; Brogliato, Bernard

    2011-01-01

    "Nonsmooth Modeling and Simulation for Switched Circuits" concerns the modeling and the numerical simulation of switched circuits with the nonsmooth dynamical systems (NSDS) approach, using piecewise-linear and multivalued models of electronic devices like diodes, transistors, switches. Numerous examples (ranging from introductory academic circuits to various types of power converters) are analyzed and many simulation results obtained with the INRIA open-source SICONOS software package are presented. Comparisons with SPICE and hybrid methods demonstrate the power of the NSDS approach

  13. Juno model rheometry and simulation

    Science.gov (United States)

    Sampl, Manfred; Macher, Wolfgang; Oswald, Thomas; Plettemeier, Dirk; Rucker, Helmut O.; Kurth, William S.

    2016-10-01

    The experiment Waves aboard the Juno spacecraft, which will arrive at its target planet Jupiter in 2016, was devised to study the plasma and radio waves of the Jovian magnetosphere. We analyzed the Waves antennas, which consist of two nonparallel monopoles operated as a dipole. For this investigation we applied two independent methods: the experimental technique, rheometry, which is based on a downscaled model of the spacecraft to measure the antenna properties in an electrolytic tank and numerical simulations, based on commercial computer codes, from which the quantities of interest (antenna impedances and effective length vectors) are calculated. In this article we focus on the results for the low-frequency range up to about 4 MHz, where the antenna system is in the quasi-static regime. Our findings show that there is a significant deviation of the effective length vectors from the physical monopole directions, caused by the presence of the conducting spacecraft body. The effective axes of the antenna monopoles are offset from the mechanical axes by more than 30°, and effective lengths show a reduction to about 60% of the antenna rod lengths. The antennas' mutual capacitances are small compared to the self-capacitances, and the latter are almost the same for the two monopoles. The overall performance of the antennas in dipole configuration is very stable throughout the frequency range up to about 4-5 MHz and therefore can be regarded as the upper frequency bound below which the presented quasi-static results are applicable.

  14. Modeling and Computer Simulation of AN Insurance Policy:

    Science.gov (United States)

    Acharyya, Muktish; Acharyya, Ajanta Bhowal

    We have developed a model for a life-insurance policy. In this model, the net gain is calculated by computer simulation for a particular type of lifetime distribution function. We observed that the net gain becomes maximum for a particular value of the upper age for the last premium.
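    The simulation idea can be sketched as follows. This is a hypothetical toy, not the authors' model: the exponential lifetime, premium and benefit values are placeholders, and with this particular lifetime distribution the mean gain simply grows with the upper age, whereas the paper's distribution produces an interior maximum.

    ```python
    import random

    def simulate_net_gain(upper_age, n=20_000, entry_age=30, premium=1.0,
                          benefit=25.0, seed=0):
        """Average insurer net gain: premiums are collected each year the
        holder is alive and not past `upper_age`; a fixed `benefit` is paid
        at death. Lifetimes are drawn from a toy exponential distribution."""
        rng = random.Random(seed)
        total = 0.0
        for _ in range(n):
            death_age = entry_age + rng.expovariate(1 / 40.0)  # mean 40 more years
            years_paying = max(0.0, min(death_age, upper_age) - entry_age)
            total += premium * years_paying - benefit
        return total / n

    # Scan candidate upper ages for the one maximising mean net gain:
    best = max(range(40, 101, 5), key=simulate_net_gain)
    ```

    Swapping in the paper's lifetime distribution function would let the same scan locate the upper age at which the net gain peaks.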

  15. Network Modeling and Simulation A Practical Perspective

    CERN Document Server

    Guizani, Mohsen; Khan, Bilal

    2010-01-01

    Network Modeling and Simulation is a practical guide to using modeling and simulation to solve real-life problems. The authors give a comprehensive exposition of the core concepts in modeling and simulation, and then systematically address the many practical considerations faced by developers in modeling complex large-scale systems. The authors provide examples from computer and telecommunication networks and use these to illustrate the process of mapping generic simulation concepts to domain-specific problems in different industries and disciplines. Key features: Provides the tools and strate

  16. A simulation of water pollution model parameter estimation

    Science.gov (United States)

    Kibler, J. F.

    1976-01-01

    A parameter estimation procedure for a water pollution transport model is elaborated. A two-dimensional instantaneous-release shear-diffusion model serves as representative of a simple transport process. Pollution concentration levels are arrived at via modeling of a remote-sensing system. The remote-sensed data are simulated by adding Gaussian noise to the concentration level values generated via the transport model. Model parameters are estimated from the simulated data using a least-squares batch processor. Resolution, sensor array size, and number and location of sensor readings can be found from the accuracies of the parameter estimates.
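    The estimation loop described here, simulate a transport model, add Gaussian sensor noise, then recover the parameters by least squares, can be sketched with a 1-D stand-in for the paper's 2-D shear-diffusion model. All names, parameter values and the coarse grid-search "batch processor" are illustrative assumptions.

    ```python
    import math
    import random

    def plume(x, mass, x0, diff, t):
        """1-D instantaneous-release diffusion solution (illustrative
        stand-in for the 2-D shear-diffusion transport model)."""
        s2 = 4.0 * diff * t
        return mass / math.sqrt(math.pi * s2) * math.exp(-(x - x0) ** 2 / s2)

    def simulate_sensors(xs, true_x0=2.0, true_diff=0.5, mass=10.0, t=1.0,
                         noise=0.05, seed=3):
        """Remote-sensed data: model output plus Gaussian noise."""
        rng = random.Random(seed)
        return [plume(x, mass, true_x0, true_diff, t) + rng.gauss(0, noise)
                for x in xs]

    def fit(xs, obs, mass=10.0, t=1.0):
        """Least-squares batch estimate of (x0, D) by coarse grid search."""
        best, best_err = None, float("inf")
        for i in range(81):
            x0 = 1.0 + i * 0.025          # candidate release points
            for j in range(81):
                diff = 0.2 + j * 0.01     # candidate diffusivities
                err = sum((plume(x, mass, x0, diff, t) - o) ** 2
                          for x, o in zip(xs, obs))
                if err < best_err:
                    best, best_err = (x0, diff), err
        return best

    xs = [k * 0.25 for k in range(25)]     # sensor array locations, 0..6
    x0_hat, diff_hat = fit(xs, simulate_sensors(xs))
    ```

    Varying the sensor spacing, array size, and noise level in such a loop is exactly how the accuracy of the parameter estimates can be mapped to resolution and sensor-placement requirements.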

  17. Do dynamic regional models add value to the global model projections of Indian monsoon?

    Science.gov (United States)

    Singh, Swati; Ghosh, Subimal; Sahana, A. S.; Vittal, H.; Karmakar, Subhankar

    2017-02-01

    Dynamic Regional Climate Models (RCMs) work at fine resolution over a limited region and hence are presumed to simulate regional climate better than General Circulation Models (GCMs). Simulations by RCMs are used for impact assessments, often without any evaluation. There is a growing debate on the added value contributed by regional models to the projections of GCMs, specifically for regions like the United States and Europe. Evaluation of RCMs for the Indian Summer Monsoon Rainfall (ISMR) has been overlooked in the literature, though there are a few disjoint studies on Indian monsoon extremes and biases. Here we present a comprehensive study of the evaluation of RCMs for the ISMR with all its important characteristics, such as northward and eastward propagation, onset, seasonal rainfall patterns, intra-seasonal oscillations, spatial variability and patterns of extremes. We evaluate nine regional simulations from the Coordinated Regional Climate Downscaling Experiment and compare them with their host Coupled Model Intercomparison Project-5 GCM projections. We do not find any consistent improvement in the RCM simulations with respect to their host GCMs for any of the characteristics of the Indian monsoon except spatial variation. We also find that the simulations of the ISMR characteristics by a good number of RCMs are worse than those of their host GCMs. No consistent added value is observed in the RCM simulations of changes in ISMR characteristics over recent periods compared to the past, though there are a few exceptions. These results highlight the need for proper evaluation before utilizing regional models for impact assessment and subsequent policy making for sustainable climate change adaptation.

  18. SIRS Dynamics on Random Networks: Simulations and Analytical Models

    Science.gov (United States)

    Rozhnova, Ganna; Nunes, Ana

    The standard pair approximation (PA) equations for the Susceptible-Infective-Recovered-Susceptible (SIRS) model of infection spread on a network of homogeneous degree k predict a thin phase of sustained oscillations for parameter values that correspond to diseases conferring long-lasting immunity. Here we present a study of the dependence of this oscillatory phase on the parameter k and of its relevance for understanding the behaviour of simulations on networks. For k = 4, we compare the phase diagram of the PA model with the results of simulations on regular random graphs (RRG) of the same degree. We show that for parameter values in the oscillatory phase, and even for large system sizes, the simulations either die out or exhibit damped oscillations, depending on the initial conditions. This failure of the standard PA model to capture the qualitative behaviour of the simulations on large RRGs is currently being investigated.
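
    As a hedged illustration of the kind of simulation this record compares against the PA equations, the sketch below runs a synchronous discrete-time SIRS process on a 4-regular random graph built by stub matching. The update rule and all rates (beta, gamma, delta) are illustrative assumptions, not the paper's parameterization.

```python
import random

def regular_random_graph(n, k, rng):
    """Stub-matching construction of a k-regular random graph
    (retry until a matching has no self-loops or duplicate edges)."""
    while True:
        stubs = [v for v in range(n) for _ in range(k)]
        rng.shuffle(stubs)
        edges = set()
        ok = True
        for a, b in zip(stubs[::2], stubs[1::2]):
            if a == b or (a, b) in edges or (b, a) in edges:
                ok = False
                break
            edges.add((a, b))
        if ok:
            adj = {v: [] for v in range(n)}
            for a, b in edges:
                adj[a].append(b)
                adj[b].append(a)
            return adj

def sirs_step(adj, state, beta, gamma, delta, rng):
    """One synchronous SIRS update: S->I per infected neighbour with prob
    beta, I->R with prob gamma, R->S with prob delta."""
    nxt = dict(state)
    for v, s in state.items():
        if s == 'S':
            inf = sum(1 for u in adj[v] if state[u] == 'I')
            if rng.random() < 1 - (1 - beta) ** inf:
                nxt[v] = 'I'
        elif s == 'I' and rng.random() < gamma:
            nxt[v] = 'R'
        elif s == 'R' and rng.random() < delta:
            nxt[v] = 'S'
    return nxt

rng = random.Random(1)
adj = regular_random_graph(60, 4, rng)
state = {v: 'S' for v in adj}
for v in rng.sample(list(adj), 6):
    state[v] = 'I'
history = []  # infected count over time: dies out or oscillates (damped)
for _ in range(50):
    history.append(sum(1 for s in state.values() if s == 'I'))
    state = sirs_step(adj, state, beta=0.3, gamma=0.2, delta=0.05, rng=rng)
```

    Tracking `history` over many runs and system sizes is how one would probe whether sustained oscillations survive outside the PA approximation.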

  19. Optimization of Operations Resources via Discrete Event Simulation Modeling

    Science.gov (United States)

    Joshi, B.; Morris, D.; White, N.; Unal, R.

    1996-01-01

    The resource levels required for operation and support of reusable launch vehicles are typically defined through discrete event simulation modeling. Minimizing these resources constitutes an optimization problem involving discrete variables and simulation. Conventional approaches to such optimization problems with integer-valued decision variables are pattern search and statistical methods. However, in a simulation environment characterized by search spaces of unknown topology and stochastic measures, these optimization approaches often prove inadequate. In this paper, we explore the applicability of genetic algorithms to the simulation domain. Genetic algorithms provide a robust search strategy that does not require continuity and differentiability of the problem domain. The genetic algorithm successfully minimized the operation and support activities for a space vehicle through a discrete event simulation model. The practical issues associated with simulation optimization, such as stochastic variables and constraints, were also taken into consideration.
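
    A genetic algorithm over integer resource levels with a noisy, simulation-like objective can be sketched as follows. The cost function, the resource targets, and the GA settings are hypothetical stand-ins for the paper's discrete event simulation, chosen only to show selection, crossover, and mutation on integer decision variables.

```python
import random

def simulated_cost(levels, rng):
    """Stochastic stand-in for one discrete-event simulation run:
    total resource cost plus a noisy shortage penalty (hypothetical)."""
    targets = [5, 3, 7]                       # assumed resource demands
    shortage = sum(max(t - x, 0) for x, t in zip(levels, targets))
    return sum(levels) + 10 * shortage + rng.gauss(0, 0.1)

def genetic_search(rng, pop_size=20, gens=60, lo=0, hi=10):
    pop = [[rng.randint(lo, hi) for _ in range(3)] for _ in range(pop_size)]
    for _ in range(gens):
        scored = sorted(pop, key=lambda ind: simulated_cost(ind, rng))
        elite = scored[:pop_size // 2]        # selection: keep best half
        pop = list(elite)
        while len(pop) < pop_size:
            a, b = rng.sample(elite, 2)
            cut = rng.randint(1, 2)
            child = a[:cut] + b[cut:]         # one-point crossover
            if rng.random() < 0.2:            # +/-1 integer mutation
                i = rng.randrange(3)
                child[i] = min(hi, max(lo, child[i] + rng.choice([-1, 1])))
            pop.append(child)
    return min(pop, key=lambda ind: simulated_cost(ind, rng))

best = genetic_search(random.Random(42))
```

    Because the objective is noisy, production use would re-evaluate candidates over several replications rather than trusting a single draw.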

  20. Teaching Incision and Drainage: Perceived Educational Value of Abscess Models.

    Science.gov (United States)

    Adams, Cynthia M; Nigrovic, Lise E; Hayes, Gavin; Weinstock, Peter H; Nagler, Joshua

    2017-07-17

    Incision and drainage (I&D) of skin abscesses is an important procedural skill for pediatric emergency medicine providers. Practical skills training using simulation provides an opportunity to learn and gain confidence with this invasive procedure. Our objective was to assess the perceived educational value of 2 versions of an abscess model as part of an educational workshop for teaching I&D. A combined didactic and practical skills workshop was developed for use at 2 national conferences. The didactic content was created through an iterative process. To facilitate hands-on training, 2 versions of an abscess model were created: 1 constructed from a negative mold and the other using a 3-dimensional printer. Participants were surveyed regarding prior experience with I&D, procedural confidence, and perceptions of the educational utility of the models. Seventy physicians and 75 nurse practitioners participated in the study. Procedural confidence improved after training using each version of the model, with the greatest improvements noted among novice learners. Ninety-four percent of physicians and 99% of nurse practitioners rated the respective models as either "educational" or "very educational," and 97% and 100%, respectively, would recommend the abscess models to others. A combined didactic and practical skills educational workshop using novel abscess models was effective at improving learners' confidence. Our novel models provide an effective strategy for teaching procedural skills such as I&D and demonstrate a novel use of 3-dimensional printers in medical education. Further study is needed to determine if these educational gains translate into improvement in clinical performance or patient outcomes.

  1. One-Step Dynamic Classifier Ensemble Model for Customer Value Segmentation with Missing Values

    Directory of Open Access Journals (Sweden)

    Jin Xiao

    2014-01-01

    Full Text Available Scientific customer value segmentation (CVS) is the basis of efficient customer relationship management, and customer credit scoring, fraud detection, and churn prediction all belong to CVS. In real CVS, the customer data usually include many missing values, which may greatly affect the performance of the CVS model. This study proposes a one-step dynamic classifier ensemble model for missing values (ODCEM). On the one hand, ODCEM integrates the preprocessing of missing values and the classification modeling into one step; on the other hand, it utilizes multiple-classifier ensemble technology in constructing the classification models. The empirical results on the credit scoring dataset “German” from UCI and the real customer churn prediction dataset “China churn” show that ODCEM outperforms four commonly used “two-step” models and the ensemble-based model LMF and can provide better decision support for market managers.

  2. Characterization of fluvial sedimentology for reservoir simulation modeling

    Energy Technology Data Exchange (ETDEWEB)

    Henriquez, A.; Tyler, K.J.; Hurst, A. (Statoil, Stavanger (NO))

    1990-09-01

    This paper presents a critical study of 3D stochastic simulation of a fluvial reservoir and of the transfer of the geological model to a reservoir simulation grid. The stochastic model is conditioned by sand-body thickness and position in wellbores. Geological input parameters (sand-body orientation and width/thickness ratios) are often difficult to determine, and are invariably subject to interpretation. Net/gross ratio (NGR) and sand-body thickness are more easily estimated. Sand-body connectedness varies, depending on the modeling procedure; however, a sedimentary process-related model gives intermediate values for connectedness between the values for a regular packing model and the stochastic model. The geological model is transferred to a reservoir simulation grid by use of transmissibility multipliers and an NGR value for each block. The transfer of data smooths out much of the detailed geological information, and the calculated recovery factors are insensitive to the continuity measured in the geological model. Hence, the authors propose improvements to the interface between geological and reservoir simulation models.

  3. VHDL simulation with access to transistor models

    Science.gov (United States)

    Gibson, J.

    1991-01-01

    Hardware description languages such as VHDL have evolved to aid in the design of systems with large numbers of elements and a wide range of electronic and logical abstractions. For high performance circuits, behavioral models may not be able to efficiently include enough detail to give designers confidence in a simulation's accuracy. One option is to provide a link between the VHDL environment and a transistor level simulation environment. The coupling of the Vantage Analysis Systems VHDL simulator and the NOVA simulator provides the combination of VHDL modeling and transistor modeling.

  4. Aminoglycoside nephrotoxicity: modeling, simulation, and control.

    Science.gov (United States)

    Rougier, Florent; Claude, Daniel; Maurin, Michel; Sedoglavic, Alexandre; Ducher, Michel; Corvaisier, Stéphane; Jelliffe, Roger; Maire, Pascal

    2003-03-01

    The main constraints on the administration of aminoglycosides are the risks of nephrotoxicity and ototoxicity, which can lead to acute renal, vestibular, and auditory toxicities. In the present study we focused on nephrotoxicity. No reliable predictor of nephrotoxicity has been found to date. We have developed a deterministic model which describes the pharmacokinetic behavior of aminoglycosides (with a two-compartment model), the kinetics of aminoglycoside accumulation in the renal cortex, the effects of aminoglycosides on renal cells, the resulting effects on renal function by tubuloglomerular feedback, and the resulting effects on serum creatinine concentrations. The pharmacokinetic parameter values were estimated by use of the NPEM program. The estimated pharmacodynamic parameter values were obtained after minimization of the least-squares objective function between the measured and the calculated serum creatinine concentrations. A simulation program assessed the influences of the dosage regimens on the occurrence of nephrotoxicity. We have also demonstrated the relevance of modeling the circadian rhythm of the renal function. We have shown the ability of the model to fit 49 observed serum creatinine concentrations for a group of eight patients treated for endocarditis by comparison with 49 calculated serum creatinine concentrations (r(2) = 0.988; P < 0.001). We have found that for the same daily dose, the nephrotoxicity observed with a thrice-daily administration schedule appears more rapidly, induces a greater decrease in renal function, and is more prolonged than that which occurs with less frequent administration schedules (for example, once-daily administration). Moreover, for once-daily administration, we have demonstrated that the time of day of administration can influence the incidence of aminoglycoside nephrotoxicity. The lowest level of nephrotoxicity was observed when aminoglycosides were administered at 1:30 p.m. Clinical application of this
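
    The pharmacokinetic core of such a model, a linear two-compartment system under repeated IV bolus dosing, can be sketched with simple Euler integration. The rate constants and doses below are hypothetical, not the NPEM-fitted values from the study; the sketch only illustrates why peak concentrations differ between once-daily and thrice-daily schedules delivering the same daily dose.

```python
def simulate_two_compartment(doses, k10=0.3, k12=0.1, k21=0.05,
                             dt=0.1, hours=48):
    """Euler integration of a linear two-compartment model.
    `doses` maps dosing time (h) to IV bolus amount (mg); all rate
    constants (1/h) are hypothetical illustration values."""
    c, p = 0.0, 0.0      # amounts in central and peripheral compartments
    series = []
    for i in range(int(hours / dt)):
        t = i * dt
        c += doses.get(round(t, 1), 0.0)      # apply any bolus due now
        dc = -(k10 + k12) * c + k21 * p       # elimination + distribution
        dp = k12 * c - k21 * p
        c += dc * dt
        p += dp * dt
        series.append((t, c))
    return series

# Same total daily dose, two schedules: once daily vs every 8 hours
once = simulate_two_compartment({0.0: 240.0, 24.0: 240.0})
thrice = simulate_two_compartment({float(h): 80.0
                                   for h in (0, 8, 16, 24, 32, 40)})
peak_once = max(c for _, c in once)
peak_thrice = max(c for _, c in thrice)
```

    The once-daily schedule gives higher peaks and deeper troughs, which is the kind of profile difference the paper links to differing nephrotoxicity.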

  5. Modeling and Simulation of Low Voltage Arcs

    NARCIS (Netherlands)

    Ghezzi, L.; Balestrero, A.

    2010-01-01

    Modeling and Simulation of Low Voltage Arcs is an attempt to improve the physical understanding, mathematical modeling and numerical simulation of the electric arcs that are found during current interruptions in low voltage circuit breakers. An empirical description is gained by refined electrical

  7. Whole-building Hygrothermal Simulation Model

    DEFF Research Database (Denmark)

    Rode, Carsten; Grau, Karl

    2003-01-01

    An existing integrated simulation tool for dynamic thermal simulation of building was extended with a transient model for moisture release and uptake in building materials. Validation of the new model was begun with comparison against measurements in an outdoor test cell furnished with single mat...

  8. Dynamic Average-Value Modeling of Doubly-Fed Induction Generator Wind Energy Conversion Systems

    Science.gov (United States)

    Shahab, Azin

    In a Doubly-fed Induction Generator (DFIG) wind energy conversion system, the rotor of a wound rotor induction generator is connected to the grid via a partial-scale ac/ac power electronic converter which controls the rotor frequency and speed. In this research, detailed models of the DFIG wind energy conversion system with Sinusoidal Pulse-Width Modulation (SPWM) and Optimal Pulse-Width Modulation (OPWM) schemes for the power electronic converter are developed in PSCAD/EMTDC. As computer simulation using the detailed models tends to be computationally intensive, time-consuming, and sometimes impractical in terms of speed, two modified approaches (switching-function modeling and average-value modeling) are proposed to reduce the simulation execution time. The results demonstrate that the two proposed approaches reduce the simulation execution time while the simulation results remain close to those obtained using the detailed model simulation.

  9. Modeling churn using customer lifetime value

    OpenAIRE

    Glady, Nicolas; Baesens, Bart; Croux, Christophe

    2009-01-01

    The definition and modeling of customer loyalty have been central issues in customer relationship management for many years. Recent papers propose solutions to detect customers that are becoming less loyal, also called churners. The churner status is then defined as a function of the volume of commercial transactions. In the context of a Belgian retail financial service company, our first contribution is to redefine the notion of customer loyalty by considering it from a customer-centric vi...

  10. Modeling customer loyalty using customer lifetime value.

    OpenAIRE

    Glady, N.; Baesens, Bart; Croux, Christophe

    2006-01-01

    The definition and modeling of customer loyalty have been central issues in customer relationship management for many years. Recent papers propose solutions to detect customers that are becoming less loyal, also called churners. The churner status is then defined as a function of the volume of commercial transactions. In the context of a Belgian retail financial service company, our first contribution will be to redefine the notion of customer loyalty by considering it from a customer-cen...

  11. Modeling the Marginal Value of Rainforest Losses

    OpenAIRE

    Strand, Jon

    2015-01-01

    A rainforest can be modeled as a dynamic asset subject to various risks, including risk of fire. Any small part of the forest can be in one of two states: either untouched by forest fire, or already damaged by fire, in which case there is both a local forest loss and increased dryness over a broader area. In this paper, two Bellman equations are constructed, one for unharmed forest and a s...

  12. How processing digital elevation models can affect simulated water budgets.

    Science.gov (United States)

    Kuniansky, Eve L; Lowery, Mark A; Campbell, Bruce G

    2009-01-01

    For regional models, the shallow water table surface is often used as a source/sink boundary condition, as model grid scale precludes simulation of the water table aquifer. This approach is appropriate when the water table surface is relatively stationary. Since water table surface maps are not readily available, the elevation of the water table used in model cells is estimated via a two-step process. First, a regression equation is developed using existing land and water table elevations from wells in the area. This equation is then used to predict the water table surface for each model cell using land surface elevation available from digital elevation models (DEM). Two methods of processing DEM for estimating the land surface for each cell are commonly used (value nearest the cell centroid or mean value in the cell). This article demonstrates how these two methods of DEM processing can affect the simulated water budget. For the example presented, approximately 20% more total flow through the aquifer system is simulated if the centroid value rather than the mean value is used. This is due to the one-third greater average ground water gradients associated with the centroid value than with the mean value. The results will vary depending on the particular model area topography and cell size. The use of the mean DEM value in each model cell will result in a more conservative water budget and is more appropriate because the model cell water table value should be representative of the entire cell area, not the centroid of the model cell.
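
    The two DEM-processing choices are easy to make concrete. The sketch below aggregates a hypothetical 3 x 6 grid of elevation posts into two adjacent model cells using either the post nearest the cell centroid or the block mean, then compares the implied gradients; with this particular invented data the centroid method yields the steeper gradient, as in the article's example.

```python
def cell_elevation(dem, row0, col0, size, method):
    """Aggregate a size x size block of DEM posts into one model-cell value:
    'centroid' takes the post nearest the cell centre, 'mean' averages all."""
    if method == 'mean':
        block = [dem[r][c] for r in range(row0, row0 + size)
                           for c in range(col0, col0 + size)]
        return sum(block) / len(block)
    mid = size // 2
    return dem[row0 + mid][col0 + mid]

# Hypothetical 3x6 DEM covering two adjacent 3x3 model cells (metres)
dem = [
    [12, 14, 15, 22, 25, 27],
    [11, 13, 16, 24, 30, 33],
    [10, 12, 14, 23, 28, 31],
]
cell_width = 30.0  # assumed distance between cell centroids (metres)
grads = {}
for method in ('centroid', 'mean'):
    left = cell_elevation(dem, 0, 0, 3, method)
    right = cell_elevation(dem, 0, 3, 3, method)
    grads[method] = (right - left) / cell_width
```

    Steeper water table gradients translate directly into more simulated flow, which is the mechanism behind the roughly 20% budget difference reported above.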

  13. Managing health care decisions and improvement through simulation modeling.

    Science.gov (United States)

    Forsberg, Helena Hvitfeldt; Aronsson, Håkan; Keller, Christina; Lindblad, Staffan

    2011-01-01

    Simulation modeling is a way to test changes in a computerized environment to give ideas for improvements before implementation. This article reviews research literature on simulation modeling as support for health care decision making. The aim is to investigate the experience and potential value of such decision support and quality of articles retrieved. A literature search was conducted, and the selection criteria yielded 59 articles derived from diverse applications and methods. Most met the stated research-quality criteria. This review identified how simulation can facilitate decision making and that it may induce learning. Furthermore, simulation offers immediate feedback about proposed changes, allows analysis of scenarios, and promotes communication on building a shared system view and understanding of how a complex system works. However, only 14 of the 59 articles reported on implementation experiences, including how decision making was supported. On the basis of these articles, we proposed steps essential for the success of simulation projects, not just in the computer, but also in clinical reality. We also presented a novel concept combining simulation modeling with the established plan-do-study-act cycle for improvement. Future scientific inquiries concerning implementation, impact, and the value for health care management are needed to realize the full potential of simulation modeling.

  14. Modeling and simulation technology readiness levels.

    Energy Technology Data Exchange (ETDEWEB)

    Clay, Robert L.; Shneider, Max S.; Marburger, S. J.; Trucano, Timothy Guy

    2006-01-01

    This report summarizes the results of an effort to establish a framework for assigning and communicating technology readiness levels (TRLs) for the modeling and simulation (ModSim) capabilities at Sandia National Laboratories. This effort was undertaken as a special assignment for the Weapon Simulation and Computing (WSC) program office led by Art Hale, and lasted from January to September 2006. This report summarizes the results, conclusions, and recommendations, and is intended to help guide the program office in their decisions about the future direction of this work. The work was broken out into several distinct phases, starting with establishing the scope and definition of the assignment. These are characterized in a set of key assertions provided in the body of this report. Fundamentally, the assignment involved establishing an intellectual framework for TRL assignments to Sandia's modeling and simulation capabilities, including the development and testing of a process to conduct the assignments. To that end, we proposed a methodology for both assigning and understanding the TRLs, and outlined some of the restrictions that need to be placed on this process and the expected use of the result. One of the first assumptions we overturned was the notion of a "static" TRL; rather, we concluded that problem context was essential in any TRL assignment, and that leads to dynamic results (i.e., a ModSim tool's readiness level depends on how it is used, and by whom). While we leveraged the classic TRL results from NASA, DoD, and Sandia's NW program, we came up with a substantially revised version of the TRL definitions, maintaining consistency with the classic level definitions and the Predictive Capability Maturity Model (PCMM) approach. In fact, we substantially leveraged the foundation the PCMM team provided, and augmented that as needed. Given the modeling and simulation TRL definitions and our proposed assignment methodology, we

  15. A social diffusion model with an application on election simulation.

    Science.gov (United States)

    Lou, Jing-Kai; Wang, Fu-Min; Tsai, Chin-Hua; Hung, San-Chuan; Kung, Perng-Hwa; Lin, Shou-De; Chen, Kuan-Ta; Lei, Chin-Laung

    2014-01-01

    Issues about opinion diffusion have been studied for decades, yet there has so far been no empirical approach to modeling the interflow and formation of crowd opinion in elections, for two reasons. First, unlike the spread of information or flu, individuals have intrinsic attitudes toward election candidates in advance. Second, opinions are generally assumed to be single values in most diffusion models; in this case, however, an opinion should represent preference toward multiple candidates. Previous models thus may not intuitively capture such a scenario. This work designs a diffusion model capable of managing the aforementioned scenario. To demonstrate the usefulness of our model, we simulate the diffusion on a network built from a publicly available bibliography dataset. We compare the proposed model with other well-known models such as independent cascade, and our model consistently outperforms the others. We additionally investigate electoral issues with our model simulator.
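
    The independent cascade baseline mentioned here can be sketched directly. Note the paper's own model represents opinions as preferences over multiple candidates, whereas this classic cascade only tracks a single binary activation state; the toy graph and activation probability below are hypothetical.

```python
import random

def independent_cascade(adj, seeds, prob, rng):
    """Classic independent-cascade diffusion: each newly activated node
    gets one chance to activate each inactive neighbour with prob `prob`."""
    active = set(seeds)
    frontier = list(seeds)
    while frontier:
        nxt = []
        for v in frontier:
            for u in adj.get(v, []):
                if u not in active and rng.random() < prob:
                    active.add(u)
                    nxt.append(u)
        frontier = nxt
    return active

# Toy co-authorship-style network (hypothetical adjacency lists)
adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2, 4], 4: [3, 5], 5: [4]}
rng = random.Random(7)
spread = independent_cascade(adj, seeds={0}, prob=0.5, rng=rng)
```

    Averaging `len(spread)` over many random draws estimates expected influence, the usual yardstick when comparing cascade-style models.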

  16. Whole-building Hygrothermal Simulation Model

    DEFF Research Database (Denmark)

    Rode, Carsten; Grau, Karl

    2003-01-01

    An existing integrated simulation tool for dynamic thermal simulation of building was extended with a transient model for moisture release and uptake in building materials. Validation of the new model was begun with comparison against measurements in an outdoor test cell furnished with single...... materials. Almost quasi-steady, cyclic experiments were used to compare the indoor humidity variation and the numerical results of the integrated simulation tool with the new moisture model. Except for the case with chipboard as furnishing, the predictions of indoor humidity with the detailed model were...

  17. Simulation model of metallurgical production management

    Directory of Open Access Journals (Sweden)

    P. Šnapka

    2013-07-01

    Full Text Available This article focuses on the problem of intensifying the metallurgical production process. The aim is to explain a simulation model that represents a metallurgical production management system adequate to new requirements. Knowledge of the dynamic behavior and features of the metallurgical production system and its management is needed to create this model. The characteristics that determine the dynamics of the metallurgical production process are described. The simulation model is structured as functional blocks and their linkages with regard to the organizational and temporal hierarchy of their actions. The creation of the presented simulation model is based on theoretical findings on regulation, hierarchical systems and optimization.

  18. When experts are oceans apart: comparing expert performance values for proficiency-based laparoscopic simulator training.

    Science.gov (United States)

    Luursema, Jan-Maarten; Rovers, Maroeska M; Alken, Alexander; Kengen, Bas; van Goor, Harry

    2015-01-01

    Surgical training is moving away from the operating room toward simulation-based skills training facilities. This has led to the development of proficiency-based training courses in which expert performance data are used for feedback and assessment. However, few expert value data sets have been published, and no standard method for generating expert values has been adopted by the field. To investigate the effect of different proficiency value data sets on simulator training courses, we (1) compared 2 published expert performance data sets for the LapSim laparoscopic virtual-reality simulator (by van Dongen et al. and Heinrichs et al.) and (2) assessed the effect of using either set on LapSim training data obtained from 16 local residents in surgery and gynecology. Across all simulator tasks, the experts consulted by van Dongen et al. performed better on motion efficiency, but not on duration or damage control. Applying both proficiency sets to training data collected during a basic skills laparoscopic simulator course, residents would have graduated on average in 1.5 fewer sessions using the Heinrichs expert values compared with the van Dongen expert values. The selection of proficiency values for proficiency-based simulator training courses affects training length, skills level assessment, and training costs. Standardized, well-controlled methods are necessary to create valid and reliable expert values for use in training and research. Copyright © 2015 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  19. The HackensackUMC Value-Based Care Model: Building Essentials for Value-Based Purchasing.

    Science.gov (United States)

    Douglas, Claudia; Aroh, Dianne; Colella, Joan; Quadri, Mohammed

    2016-01-01

    The Affordable Care Act, 2010, and the subsequent shift from a quantity-focus to a value-centric reimbursement model led our organization to create the HackensackUMC Value-Based Care Model to improve our process capability and performance to meet and sustain the triple aims of value-based purchasing: higher quality, lower cost, and consumer perception. This article describes the basics of our model and illustrates how we used it to reduce the costs of our patient sitter program.

  20. Simulation modeling for the health care manager.

    Science.gov (United States)

    Kennedy, Michael H

    2009-01-01

    This article addresses the use of simulation software to solve administrative problems faced by health care managers. Spreadsheet add-ins, process simulation software, and discrete event simulation software are available at a range of costs and complexity. All use the Monte Carlo method to realistically integrate probability distributions into models of the health care environment. Problems typically addressed by health care simulation modeling are facility planning, resource allocation, staffing, patient flow and wait time, routing and transportation, supply chain management, and process improvement.
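
    A minimal example of the Monte Carlo approach these tools share: drawing patient interarrival and service times from exponential distributions and propagating waits through the Lindley recursion for a single-server clinic. All rates and replication counts are illustrative assumptions, not figures from the article.

```python
import random
import statistics

def simulate_clinic(n_patients, arrival_rate, service_rate, rng):
    """Monte Carlo draw of one clinic day using the Lindley recursion:
    wait_i = max(0, wait_{i-1} + service_{i-1} - interarrival_i)."""
    wait, waits, prev_service = 0.0, [], 0.0
    for _ in range(n_patients):
        gap = rng.expovariate(arrival_rate)       # hours between arrivals
        wait = max(0.0, wait + prev_service - gap)
        waits.append(wait)
        prev_service = rng.expovariate(service_rate)
    return statistics.mean(waits)

rng = random.Random(3)
# Replicate many days to integrate the probability distributions
replications = [simulate_clinic(100, arrival_rate=4.0, service_rate=5.0,
                                rng=rng) for _ in range(200)]
mean_wait = statistics.mean(replications)
```

    This is the same pattern whether it lives in a spreadsheet add-in or a discrete event simulation package: sample, replicate, and summarize the distribution of the performance measure.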

  1. Theoretical modeling of iodine value and saponification value of biodiesel fuels from their fatty acid composition

    Energy Technology Data Exchange (ETDEWEB)

    Gopinath, A.; Puhan, Sukumar; Nagarajan, G. [Internal Combustion Engineering Division, Department of Mechanical Engineering, Anna University, Chennai 600 025, Tamil Nadu (India)

    2009-07-15

    Biodiesel is an alternative fuel consisting of alkyl esters of fatty acids from vegetable oils or animal fats. The properties of biodiesel depend on the type of vegetable oil used for the transesterification process. The objective of the present work is to theoretically predict the iodine value and the saponification value of different biodiesels from their fatty acid methyl ester composition. The fatty acid ester compositions and the above values of different biodiesels were taken from the available published data. A multiple linear regression model was developed to predict the iodine value and saponification value of different biodiesels. The predicted results showed that the prediction errors were less than 3.4% compared to the available published data. The predicted values were also verified by substituting in the available published model which was developed to predict the higher heating values of biodiesel fuels from their iodine value and the saponification value. The resulting heating values of biodiesels were then compared with the published heating values and reported. (author)
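
    The multiple linear regression step can be sketched with the normal equations. The fatty-acid compositions and the coefficients used to generate the synthetic iodine values below are invented for illustration; they are not the regression coefficients fitted in the paper.

```python
def solve(a, b):
    """Gauss-Jordan elimination with partial pivoting."""
    n = len(a)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(n):
            if r != col and m[r][col]:
                f = m[r][col] / m[col][col]
                m[r] = [x - f * y for x, y in zip(m[r], m[col])]
    return [m[i][n] / m[i][i] for i in range(n)]

def fit_linear(xs, ys):
    """Ordinary least squares via the normal equations X'X b = X'y."""
    rows = [[1.0] + list(x) for x in xs]
    k = len(rows[0])
    xtx = [[sum(r[i] * r[j] for r in rows) for j in range(k)]
           for i in range(k)]
    xty = [sum(r[i] * y for r, y in zip(rows, ys)) for i in range(k)]
    return solve(xtx, xty)

# Hypothetical data: (% oleic C18:1, % linoleic C18:2) -> iodine value
comps = [(60.0, 20.0), (40.0, 35.0), (25.0, 55.0), (70.0, 10.0),
         (45.0, 30.0)]
iodine = [0.86 * c181 + 1.73 * c182 + 1.0 for c181, c182 in comps]
coef = fit_linear(comps, iodine)   # [intercept, b_oleic, b_linoleic]
```

    With noise-free synthetic data the fit recovers the generating coefficients exactly; on real ester compositions the residuals would give the 3.4%-style prediction errors the paper reports.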

  2. A NOVEL MULTI-VALUED BAM MODEL WITH IMPROVED ERROR-CORRECTING CAPABILITY

    Institute of Scientific and Technical Information of China (English)

    Zhang Daoqiang; Chen Songcan

    2003-01-01

    A Hyperbolic Tangent multi-valued Bi-directional Associative Memory (HTBAM)model is proposed in this letter. Two general energy functions are defined to prove the stabilityof one class of multi-valued Bi-directional Associative Memorys(BAMs), with HTBAM being thespecial case. Simulation results show that HTBAM has a competitive storage capacity and muchmore error-correcting capability than other multi-valued BAMs.

  3. Warehouse Simulation Through Model Configuration

    NARCIS (Netherlands)

    Verriet, J.H.; Hamberg, R.; Caarls, J.; Wijngaarden, B. van

    2013-01-01

    The pre-build development of warehouse systems leads from a specific customer request to a specific customer quotation. This involves a process of configuring a warehouse system using a sequence of steps that contain increasingly more details. Simulation is a helpful tool in analyzing warehouse desi

  4. Software life cycle dynamic simulation model: The organizational performance submodel

    Science.gov (United States)

    Tausworthe, Robert C.

    1985-01-01

    The submodel structure of a software life cycle dynamic simulation model is described. The software process is divided into seven phases, each with product, staff, and funding flows. The model is subdivided into an organizational response submodel, a management submodel, a management influence interface, and a model analyst interface. The concentration here is on the organizational response model, which simulates the performance characteristics of a software development subject to external and internal influences. These influences emanate from two sources: the model analyst interface, which configures the model to simulate the response of an implementing organization subject to its own internal influences, and the management submodel that exerts external dynamic control over the production process. A complete characterization is given of the organizational response submodel in the form of parameterized differential equations governing product, staffing, and funding levels. The parameter values and functions are allocated to the two interfaces.
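
    The parameterized differential equations governing product, staffing, and funding levels can be illustrated with a toy Euler integration. The flow structure and every parameter value here are hypothetical, not the submodel's calibrated forms; the point is only the coupled-flows pattern the abstract describes.

```python
def simulate_org(productivity=0.1, hire_rate=2.0, staff_cap=40.0,
                 funding_rate=5.0, dt=0.25, months=60):
    """Euler integration of toy product/staff/funding flows.
    All rates are hypothetical illustration values."""
    product, staff, funds = 0.0, 5.0, 100.0
    history = []
    for _ in range(int(months / dt)):
        if funds <= 0:                      # stop when funding is exhausted
            break
        burn = 1.0 * staff * dt             # funding consumed by staff
        funds += funding_rate * dt - burn
        staff += hire_rate * dt * (1 - staff / staff_cap)  # capped hiring
        product += productivity * staff * dt               # output flow
        history.append((product, staff, funds))
    return history

hist = simulate_org()
```

    Management influence would enter such a model by changing `hire_rate` or `funding_rate` over time, which is the role the management submodel plays above.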

  5. Transport Simulation Model Calibration with Two-Step Cluster Analysis Procedure

    Directory of Open Access Journals (Sweden)

    Zenina Nadezda

    2015-12-01

    Full Text Available The calibration results of a transport simulation model depend on the selected parameters and their values. The aim of the present paper is to calibrate a transport simulation model with a two-step cluster analysis procedure to improve the reliability of the simulation model results. Two global parameters have been considered: headway and simulation step. Normal, uniform, and exponential headway generation models have been selected for headway. Applying the two-step cluster analysis procedure to calibration reduced the time needed to select the simulation step and headway generation model values.
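
    A crude stand-in for the cluster-based selection: cluster the outputs of candidate (headway model, simulation step) runs and keep the cluster nearest an observed value. Plain 1-D k-means replaces the actual two-step cluster procedure here, and all travel times are invented.

```python
import random
import statistics

def kmeans_1d(values, k, rng, iters=25):
    """Plain 1-D k-means, a crude stand-in for a two-step cluster procedure."""
    centers = rng.sample(values, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for v in values:
            i = min(range(k), key=lambda j: abs(v - centers[j]))
            groups[i].append(v)
        centers = [statistics.mean(g) if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers

# Hypothetical mean travel times (s) from candidate parameter combinations
runs = {
    ('normal', 0.1): 412.0, ('normal', 0.5): 418.0,
    ('uniform', 0.1): 455.0, ('uniform', 0.5): 462.0,
    ('exponential', 0.1): 409.0, ('exponential', 0.5): 415.0,
}
observed = 410.0                      # assumed field measurement
rng = random.Random(2)
centers = kmeans_1d(list(runs.values()), 2, rng)
near = min(centers, key=lambda c: abs(c - observed))
# Keep the parameter combinations whose output falls in the nearest cluster
plausible = [combo for combo, v in runs.items()
             if abs(v - near) == min(abs(v - c) for c in centers)]
```

    Screening candidates by cluster rather than one run at a time is what shortens the parameter-value selection the abstract mentions.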

  6. Modeling and Simulation of Matrix Converter

    DEFF Research Database (Denmark)

    Liu, Fu-rong; Klumpner, Christian; Blaabjerg, Frede

    2005-01-01

    This paper discusses the modeling and simulation of the matrix converter. Two models of the matrix converter are presented: one is based on indirect space vector modulation and the other on the power balance equation. The basis of these two models is given and the modeling process is introduced...

  7. Quantum simulation of the t-J model

    Science.gov (United States)

    Yamaguchi, Fumiko; Yamamoto, Yoshihisa

    2002-12-01

    Computer simulation of a many-particle quantum system is bound to reach the inevitable limits of its ability as the system size increases. The primary reason for this is that the memory size used in a classical simulator grows polynomially whereas the Hilbert space of the quantum system does so exponentially. Replacing the classical simulator by a quantum simulator would be an effective method of surmounting this obstacle. The prevailing techniques for simulating quantum systems on a quantum computer have been developed for purposes of computing numerical algorithms designed to obtain approximate physical quantities of interest. The method suggested here requires no numerical algorithms; it is a direct isomorphic translation between a quantum simulator and the quantum system to be simulated. In the quantum simulator, physical parameters of the system, which are the fixed parameters of the simulated quantum system, are under the control of the experimenter. A method of simulating a model for high-temperature superconducting oxides, the t-J model, by optical control, as an example of such a quantum simulation, is presented.

  8. CAUSA - An Environment For Modeling And Simulation

    Science.gov (United States)

    Dilger, Werner; Moeller, Juergen

    1989-03-01

    CAUSA is an environment for modeling and simulation of dynamic systems on a quantitative level. The environment provides a conceptual framework including primitives like objects, processes and causal dependencies which allow the modeling of a broad class of complex systems. The facility of simulation allows the quantitative and qualitative inspection and empirical investigation of the behavior of the modeled system. CAUSA is implemented in Knowledge-Craft and runs on a Symbolics 3640.

  9. Modeling and Simulation of Matrix Converter

    DEFF Research Database (Denmark)

    Liu, Fu-rong; Klumpner, Christian; Blaabjerg, Frede

    2005-01-01

    This paper discusses the modeling and simulation of matrix converters. Two models of the matrix converter are presented: one is based on indirect space vector modulation and the other on the power balance equation. The basis of these two models is given and the modeling process is introduced in detail. The results of simulations developed for different research purposes reveal that different models may be suitable for different purposes, so the model should be chosen carefully. Some details and tricks in modeling are also introduced, which give a reference for further research.

  10. Simulation-based Manufacturing System Modeling

    Institute of Scientific and Technical Information of China (English)

    卫东; 金烨; 范秀敏; 严隽琪

    2003-01-01

    In recent years, computer simulation has proved to be a very advantageous technique for researching resource-constrained manufacturing systems. This paper presents an object-oriented simulation modeling method which combines the merits of traditional methods such as IDEF0 and Petri nets. A four-layer-one-angle hierarchical modeling framework based on OOP is defined, and the modeling description of these layers is expounded, including hybrid production control modeling and human resource dispatch modeling. To validate the modeling method, a case study of an auto-product line in a motor manufacturing company has been carried out.

  11. A model of relations of value of property sites on the case of Ljubljana

    Directory of Open Access Journals (Sweden)

    Franc J. Zakrajšek

    2004-01-01

    Full Text Available The proposed model is a computer-aided system for evaluating the value of the spatial sites of properties, both under present conditions and in simulations of future conditions following hypothetical directions of development and strategic or concrete spatial decisions. The model is built on classical methods of mass appraisal of property, supported by a geographical information model. It introduces a significant novelty to the field: the use of analytical hierarchical processing to determine the technical coefficients of site advantages. The model was developed in the research project Development and implementation of a regional simulation model for the Ljubljana urban region, financed by the City Municipality of Ljubljana.

  12. Characteristics of hands-on simulations with added value for innovative secondary and higher vocational education

    NARCIS (Netherlands)

    Khaled, A.E.; Gulikers, J.T.M.; Biemans, H.J.A.; Wel, van der M.; Mulder, M.

    2014-01-01

    The intentions with which hands-on simulations are used in vocational education are not always clear. Also, pedagogical-didactic approaches in hands-on simulations are not well conceptualised from a learning theory perspective. This makes it difficult to pinpoint the added value that hands-on simulations...

  13. Multiscale Model Approach for Magnetization Dynamics Simulations

    CERN Document Server

    De Lucia, Andrea; Tretiakov, Oleg A; Kläui, Mathias

    2016-01-01

    Simulations of magnetization dynamics in a multiscale environment enable rapid evaluation of the Landau-Lifshitz-Gilbert equation in a mesoscopic sample with nanoscopic accuracy in areas where such accuracy is required. We have developed a multiscale magnetization dynamics simulation approach that can be applied to large systems with spin structures that vary locally on small length scales. To implement this, the conventional micromagnetic simulation framework has been expanded to include a multiscale solving routine. The software selectively simulates different regions of a ferromagnetic sample according to the spin structures located within them, in order to employ a suitable discretization and use either a micromagnetic or an atomistic model. To demonstrate the validity of the multiscale approach, we simulate the spin wave transmission across the regions simulated with the two different models and different discretizations. We find that the interface between the regions is fully transparent for spin waves with f...

  14. Systematic modelling and simulation of refrigeration systems

    DEFF Research Database (Denmark)

    Rasmussen, Bjarne D.; Jakobsen, Arne

    1998-01-01

    The task of developing a simulation model of a refrigeration system can be very difficult and time consuming. In order for this process to be effective, a systematic method for developing the system model is required. This method should aim at guiding the developer to clarify the purpose of the simulation, to select appropriate component models and to set up the equations in a well-arranged way. In this paper the outline of such a method is proposed and examples showing the use of this method for simulation of refrigeration systems are given....

  15. Software-Engineering Process Simulation (SEPS) model

    Science.gov (United States)

    Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.

    1992-01-01

    The Software Engineering Process Simulation (SEPS) model, developed at JPL, is described. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among various software life cycle development activities and management decision making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and to perform postmortem assessments.

  16. Systematic modelling and simulation of refrigeration systems

    DEFF Research Database (Denmark)

    Rasmussen, Bjarne D.; Jakobsen, Arne

    1998-01-01

    The task of developing a simulation model of a refrigeration system can be very difficult and time consuming. In order for this process to be effective, a systematic method for developing the system model is required. This method should aim at guiding the developer to clarify the purpose of the simulation, to select appropriate component models and to set up the equations in a well-arranged way. In this paper the outline of such a method is proposed and examples showing the use of this method for simulation of refrigeration systems are given....

  17. HVDC System Characteristics and Simulation Models

    Energy Technology Data Exchange (ETDEWEB)

    Moon, S.I.; Han, B.M.; Jang, G.S. [Electric Enginnering and Science Research Institute, Seoul (Korea)

    2001-07-01

    This report deals with the AC-DC power system simulation method by PSS/E and EUROSTAG for the development of a strategy for the reliable operation of the Cheju-Haenam interconnected system. The simulation using both programs is performed to analyze HVDC simulation models. In addition, the control characteristics of the Cheju-Haenam HVDC system as well as Cheju AC system characteristics are described in this work. (author). 104 figs., 8 tabs.

  18. Simulation modeling and analysis with Arena

    Energy Technology Data Exchange (ETDEWEB)

    Tayfur Altiok; Benjamin Melamed [Rutgers University, NJ (United States). Department of Industrial and Systems Engineering

    2007-06-15

    This textbook treats the essentials of the Monte Carlo discrete-event simulation methodology in the context of the popular Arena simulation environment. It treats simulation modeling as an in-vitro laboratory that facilitates the understanding of complex systems and experimentation with what-if scenarios in order to estimate their performance metrics. The book contains chapters on the simulation modeling methodology and the underpinnings of discrete-event systems, as well as the relevant underlying probability, statistics, stochastic processes, input analysis, model validation and output analysis. All simulation-related concepts are illustrated in numerous Arena examples, encompassing production lines, manufacturing and inventory systems, transportation systems, and computer information systems in networked settings. Chapter 13.3.3 is on coal loading operations on barges/tugboats.

  19. Hyperbolic value addition and general models of animal choice.

    Science.gov (United States)

    Mazur, J E

    2001-01-01

    Three mathematical models of choice--the contextual-choice model (R. Grace, 1994), delay-reduction theory (N. Squires & E. Fantino, 1971), and a new model called the hyperbolic value-added model--were compared in their ability to predict the results from a wide variety of experiments with animal subjects. When supplied with 2 or 3 free parameters, all 3 models made fairly accurate predictions for a large set of experiments that used concurrent-chain procedures. One advantage of the hyperbolic value-added model is that it is derived from a simpler model that makes accurate predictions for many experiments using discrete-trial adjusting-delay procedures. Some results favor the hyperbolic value-added model and delay-reduction theory over the contextual-choice model, but more data are needed from choice situations for which the models make distinctly different predictions.
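The hyperbolic value-added model described above builds on Mazur's hyperbolic delay-discounting equation, V = A/(1 + KD), where A is the reward amount, D its delay, and K a sensitivity parameter. A minimal sketch (the K value and reward amounts here are illustrative assumptions, not fitted values) shows the preference reversal this hyperbolic form predicts:

```python
def hyperbolic_value(amount, delay, k=0.5):
    """Mazur's hyperbolic discounting: V = A / (1 + K*D)."""
    return amount / (1.0 + k * delay)

# Smaller-sooner vs. larger-later reward: the nearly immediate one wins...
print(hyperbolic_value(2.0, delay=1.0) > hyperbolic_value(5.0, delay=10.0))   # True

# ...but adding a common 20-unit front delay to both reverses the preference.
print(hyperbolic_value(2.0, delay=21.0) > hyperbolic_value(5.0, delay=30.0))  # False
```

This reversal under a common added delay is exactly the kind of behavior that distinguishes hyperbolic models from exponential discounting.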

  20. Statistical Modeling of Large-Scale Scientific Simulation Data

    Energy Technology Data Exchange (ETDEWEB)

    Eliassi-Rad, T; Baldwin, C; Abdulla, G; Critchlow, T

    2003-11-15

    With the advent of massively parallel computer systems, scientists are now able to simulate complex phenomena (e.g., explosions of stars). Such scientific simulations typically generate large-scale data sets over the spatio-temporal space. Unfortunately, the sheer sizes of the generated data sets make efficient exploration of them impossible. Constructing queriable statistical models is an essential step in helping scientists glean new insight from their computer simulations. We define queriable statistical models to be descriptive statistics that (1) summarize and describe the data within a user-defined modeling error, and (2) are able to answer complex range-based queries over the spatio-temporal dimensions. In this chapter, we describe systems that build queriable statistical models for large-scale scientific simulation data sets. In particular, we present our Ad-hoc Queries for Simulation (AQSim) infrastructure, which reduces the data storage requirements and query access times by (1) creating and storing queriable statistical models of the data at multiple resolutions, and (2) evaluating queries on these models of the data instead of the entire data set. Within AQSim, we focus on three simple but effective statistical modeling techniques. AQSim's first modeling technique (called the univariate mean modeler) computes the "true" (unbiased) mean of systematic partitions of the data. AQSim's second technique (called the univariate goodness-of-fit modeler) uses the Anderson-Darling goodness-of-fit method on systematic partitions of the data. Finally, AQSim's third technique (called the multivariate clusterer) utilizes the cosine similarity measure to cluster the data into similar groups. Our experimental evaluations on several scientific simulation data sets illustrate the value of using these statistical models on large-scale simulation data sets.
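The univariate mean modeler described in this record stores the mean of each systematic (fixed-size) partition and answers range queries from those summaries instead of the raw data. A hedged sketch of the idea follows; the partition size, toy data, and function names are illustrative assumptions, not AQSim's actual API:

```python
def build_mean_model(data, partition_size):
    """Summarize systematic (fixed-size) partitions by (start, count, mean)."""
    model = []
    for start in range(0, len(data), partition_size):
        chunk = data[start:start + partition_size]
        model.append((start, len(chunk), sum(chunk) / len(chunk)))
    return model

def query_mean(model, lo, hi):
    """Approximate mean over index range [lo, hi) using the stored summaries.

    Each partition's mean is assumed to hold uniformly across the partition,
    so it is weighted by its overlap with the query range.
    """
    total, count = 0.0, 0
    for start, n, mean in model:
        overlap = max(0, min(hi, start + n) - max(lo, start))
        total += mean * overlap
        count += overlap
    return total / count if count else None

data = list(range(100))              # toy stand-in for simulation field values
model = build_mean_model(data, 10)   # 10 partitions of 10 values each
print(query_mean(model, 0, 100))     # whole-range query answered from 10 summaries
print(query_mean(model, 5, 15))      # partial-range query: overlap-weighted means
```

The storage saving is the point: the query touches 10 summaries rather than 100 raw values, at the cost of the stated modeling error within each partition.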

  1. The Role of Non-Epistemic Values in Engineering Models

    OpenAIRE

    Diekmann, Sven; Peterson, Martin

    2011-01-01

    We argue that non-epistemic values, including moral ones, play an important role in the construction and choice of models in science and engineering. Our main claim is that non-epistemic values are not only “secondary values” that become important just in case epistemic values leave some issues open. Our point is, on the contrary, that non-epistemic values are as important as epistemic ones when engineers seek to develop the best model of a process or problem. The upshot is that models are ne...

  2. Object Oriented Modelling and Dynamical Simulation

    DEFF Research Database (Denmark)

    Wagner, Falko Jens; Poulsen, Mikael Zebbelin

    1998-01-01

    This report with appendix describes the work done in a master project at DTU. The goal of the project was to develop a concept for simulation of dynamical systems based on object-oriented methods. The result was a library of C++ classes, for use both when building component-based models and when conducting simulation experiments....

  3. Modeling and simulation for RF system design

    CERN Document Server

    Frevert, Ronny; Jancke, Roland; Knöchel, Uwe; Schwarz, Peter; Kakerow, Ralf; Darianian, Mohsen

    2005-01-01

    Focusing on RF-specific modeling and simulation methods, and on system- and circuit-level descriptions, this work contains application-oriented training material. Accompanied by a CD-ROM, it combines the presentation of a mixed-signal design flow, an introduction to VHDL-AMS and Verilog-A, and the application of commercially available simulators.

  4. Modeling and simulation of complex systems a framework for efficient agent-based modeling and simulation

    CERN Document Server

    Siegfried, Robert

    2014-01-01

    Robert Siegfried presents a framework for efficient agent-based modeling and simulation of complex systems. He compares different approaches for describing structure and dynamics of agent-based models in detail. Based on this evaluation the author introduces the "General Reference Model for Agent-based Modeling and Simulation" (GRAMS). Furthermore he presents parallel and distributed simulation approaches for execution of agent-based models -from small scale to very large scale. The author shows how agent-based models may be executed by different simulation engines that utilize underlying hard

  5. Computer simulations for internal dosimetry using voxel models.

    Science.gov (United States)

    Kinase, Sakae; Mohammadi, Akram; Takahashi, Masa; Saito, Kimiaki; Zankl, Maria; Kramer, Richard

    2011-07-01

    In the Japan Atomic Energy Agency, several studies have been conducted on the use of voxel models for internal dosimetry. Absorbed fractions (AFs) and S values have been evaluated for preclinical assessments of radiopharmaceuticals using human voxel models and a mouse voxel model. Computational calibration of an in vivo measurement system has also been performed using Japanese and Caucasian voxel models. In addition, for radiation protection of the environment, AFs have been evaluated using a frog voxel model. Each study was performed using Monte Carlo simulations. Consequently, it was concluded that these Monte Carlo simulations and voxel models could adequately reproduce measurement results. Voxel models were found to be a significant tool for internal dosimetry since the models are anatomically realistic. This fact indicates that several studies on correction of the in vivo measurement efficiency for the variability of human subjects and on interspecies scaling of organ doses will succeed.

  6. Modelling toolkit for simulation of maglev devices

    Science.gov (United States)

    Peña-Roche, J.; Badía-Majós, A.

    2017-01-01

    A stand-alone App has been developed, focused on obtaining information about relevant engineering properties of magnetic levitation systems. Our modelling toolkit provides real time simulations of 2D magneto-mechanical quantities for superconductor (SC)/permanent magnet structures. The source code is open and may be customised for a variety of configurations. Ultimately, it relies on the variational statement of the critical state model for the superconducting component and has been verified against experimental data for YBaCuO/NdFeB assemblies. On a quantitative basis, the values of the arising forces, induced superconducting currents, as well as a plot of the magnetic field lines are displayed upon selection of an arbitrary trajectory of the magnet in the vicinity of the SC. The stability issues related to the cooling process, as well as the maximum attainable forces for a given material and geometry are immediately observed. Due to the complexity of the problem, a strategy based on cluster computing, database compression, and real-time post-processing on the device has been implemented.

  7. Large Scale Simulations of the Kinetic Ising Model

    Science.gov (United States)

    Münkel, Christian

    We present Monte Carlo simulation results for the dynamical critical exponent z of the two- and three-dimensional kinetic Ising model. The z-values were calculated from the relaxation of the magnetization from an ordered state into the equilibrium state at Tc for very large systems with up to 169984² and 3072³ spins. To our knowledge, these are the largest Ising systems simulated to date. We also report the successful simulation of very large lattices on a massively parallel MIMD computer, with high speedups of approximately 1000 and an efficiency of about 0.93.
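Relaxation studies of this kind rest on single-spin-flip Metropolis dynamics. A minimal pure-Python sketch of the 2D case follows; the lattice size, temperature, and sweep count are illustrative and vastly smaller than the lattices in this record:

```python
import math
import random

def metropolis_ising(L=16, T=2.0, sweeps=200, seed=1):
    """Minimal single-spin-flip Metropolis dynamics for the 2D Ising model.

    Starts from the fully ordered state (as in magnetization-relaxation runs)
    and returns the absolute magnetization per spin after `sweeps` lattice
    sweeps. Periodic boundary conditions; units with J = kB = 1.
    """
    rng = random.Random(seed)
    spins = [[1] * L for _ in range(L)]
    for _ in range(sweeps):
        for _ in range(L * L):
            i, j = rng.randrange(L), rng.randrange(L)
            nb = (spins[(i + 1) % L][j] + spins[(i - 1) % L][j]
                  + spins[i][(j + 1) % L] + spins[i][(j - 1) % L])
            dE = 2 * spins[i][j] * nb        # energy change of flipping (i, j)
            if dE <= 0 or rng.random() < math.exp(-dE / T):
                spins[i][j] = -spins[i][j]
    m = sum(sum(row) for row in spins) / (L * L)
    return abs(m)

print(metropolis_ising(T=1.5))  # well below Tc ≈ 2.269: stays strongly ordered
```

Measuring how the magnetization decays sweep by sweep at Tc, rather than only its final value, is what yields an estimate of the dynamical exponent z.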

  8. A sand wave simulation model

    NARCIS (Netherlands)

    Nemeth, A.A.; Hulscher, S.J.M.H.; Damme, van R.M.J.

    2003-01-01

    Sand waves form a prominent regular pattern in the offshore seabeds of sandy shallow seas. A two-dimensional vertical (2DV) flow and morphological numerical model describing the behaviour of these sand waves has been developed. The model contains the 2DV shallow water equations, with a free water surface...

  9. Mean Value Modelling of an SI Engine with EGR

    DEFF Research Database (Denmark)

    Føns, Michael; Muller, Martin; Chevalier, Alain

    1999-01-01

    Mean Value Engine Models (MVEMs) are simplified, dynamic engine models which are physically based. Such models are useful for control studies, for engine control system analysis and for model based control systems. Very few published MVEMs have included the effects of Exhaust Gas Recirculation (EGR)...

  10. Modelling Reactive and Proactive Behaviour in Simulation

    CERN Document Server

    Majid, Mazlina Abdul; Aickelin, Uwe

    2010-01-01

    This research investigated the behaviour of traditional discrete-event and combined discrete-event/agent-based simulation models when modelling human reactive and proactive behaviour in human-centric complex systems. A departmental store was chosen as the human-centric complex case study, where the operation of a fitting room in the WomensWear department was investigated. We have looked at ways to determine the efficiency of new management policies for the fitting-room operation by simulating the reactive and proactive behaviour of staff towards customers. Once the simulation models had been developed and verified, we carried out a validation experiment in the form of a sensitivity analysis. Subsequently, we executed a statistical analysis in which the mixed reactive and proactive behaviour experimental results were compared with reactive experimental results from previously published work. Generally, this case study discovered that simple proactive individual behaviou...

  11. Challenges in SysML Model Simulation

    Directory of Open Access Journals (Sweden)

    Mara Nikolaidou

    2016-07-01

    Full Text Available Systems Modeling Language (SysML) is a standard proposed by the OMG for systems-of-systems (SoS) modeling and engineering. To this end, it provides the means to depict SoS components and their behavior in a hierarchical, multi-layer fashion, facilitating alternative engineering activities such as system design. To explore the performance of SysML models, simulation is one of the preferred methods. There are many efforts targeting simulation code generation from SysML models. Numerous simulation methodologies and tools are employed, and different SysML diagrams are utilized. Nevertheless, this process is not standardized, although most current approaches tend to follow the same steps even when they employ different tools. The scope of this paper is to provide a comprehensive understanding of the similarities and differences of existing approaches and to identify current challenges in fully automating the simulation of SysML models.

  12. SIMULATION MODELING OF SLOW SPATIALLY HETEROGENEOUS COAGULATION

    Directory of Open Access Journals (Sweden)

    P. A. Zdorovtsev

    2013-01-01

    Full Text Available A new model of spatially inhomogeneous coagulation, i.e. formation of larger clusters by joint interaction of smaller ones, is under study. The results of simulation are compared with known analytical and numerical solutions.

  13. Modelling and Simulation of RF Multilayer Inductors in LTCC Technology

    Directory of Open Access Journals (Sweden)

    A. Čelić

    2009-11-01

    Full Text Available This paper is aimed at presenting the models and characteristics of two types of inductors designed in LTCC (Low Temperature Cofired Ceramic technology. We present the physical model of a 3D planar solenoid-type inductor and of a serial planar solenoid-type inductor for the RF (radio frequency range. To verify the results obtained by using these models, we have compared them with the results obtained by employing the Ansoft HFSS electromagnetic simulator. Very good agreement has been recorded for the effective inductance value, whereas the effective Q factor value has shown a somewhat larger deviation than the inductance.

  14. Theory, modeling, and simulation annual report, 1992

    Energy Technology Data Exchange (ETDEWEB)

    1993-05-01

    This report briefly discusses research on the following topics: development of electronic structure methods; modeling molecular processes in clusters; modeling molecular processes in solution; modeling molecular processes in separations chemistry; modeling interfacial molecular processes; modeling molecular processes in the atmosphere; methods for periodic calculations on solids; chemistry and physics of minerals; graphical user interfaces for computational chemistry codes; visualization and analysis of molecular simulations; integrated computational chemistry environment; and benchmark computations.

  15. Application of Chebyshev Polynomial to simulated modeling

    Institute of Scientific and Technical Information of China (English)

    CHI Hai-hong; LI Dian-pu

    2006-01-01

    The Chebyshev polynomial is widely used in many fields, usually for function approximation in numerical calculation. In this paper, a Chebyshev polynomial expression of the propeller properties across four quadrants is given first; then the Chebyshev expression is transformed into an ordinary polynomial as needed for the simulation of propeller dynamics. On this basis, the dynamical models of the propeller across four quadrants are given. The simulation results show the efficiency of the mathematical model.
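The Chebyshev-to-ordinary-polynomial transformation mentioned above can be carried out with the recurrence T_{n+1}(x) = 2x T_n(x) - T_{n-1}(x). A self-contained sketch (the function name and coefficient handling are illustrative, not the paper's code):

```python
def cheb_to_poly(c):
    """Convert Chebyshev coefficients c[k] (for T_k) to power-basis coefficients.

    Builds each T_k as a list of power-basis coefficients (lowest degree first)
    via the recurrence T_{k}(x) = 2x*T_{k-1}(x) - T_{k-2}(x), starting from
    T0 = 1 and T1 = x, and accumulates c[k] * T_k.
    """
    n = len(c)
    result = [0.0] * n
    t_prev, t_curr = [1.0], [0.0, 1.0]       # T0 and T1 in the power basis
    for k in range(n):
        if k == 0:
            tk = t_prev
        elif k == 1:
            tk = t_curr
        else:
            tk = [0.0] + [2.0 * a for a in t_curr]   # 2x * T_{k-1}
            for i, a in enumerate(t_prev):
                tk[i] -= a                           # ... - T_{k-2}
            t_prev, t_curr = t_curr, tk
        for i, a in enumerate(tk):
            result[i] += c[k] * a
    return result

# T2(x) = 2x^2 - 1, so coefficients [0, 0, 1] map to -1 + 0*x + 2*x^2:
print(cheb_to_poly([0.0, 0.0, 1.0]))  # [-1.0, 0.0, 2.0]
```

For numerically demanding cases, the same conversion is available in NumPy as `numpy.polynomial.chebyshev.cheb2poly`.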

  16. Collisionless Electrostatic Shock Modeling and Simulation

    Science.gov (United States)

    2016-10-21

    Briefing charts covering 30 September 2016 – 21 October 2016, Air Force Research Laboratory, presented by Daniel W. Crews. Approved for public release; distribution unlimited (PA#16490). Overview: motivation and background; what is a collisionless shock wave; features of the collisionless shock; the shock simulation.

  17. Determination of interface width value in phase-field simulation of dendritic growth into undercooled melt

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The influence of the interface width value on the simulation results, and its dependence upon thermo-physical parameters, in the phase-field simulation of dendritic growth into an undercooled melt is investigated. After choosing a reasonable interface width value, the tip velocities of dendritic growth in a Ni melt under different undercoolings are calculated and compared with experimental data in order to benchmark our results. It is shown that a reasonable interface width value, which is determined by the undercooling, anisotropy, interface kinetics and thermal diffusivity, has to be taken low enough; the agreement of our results with the experimental data verifies that credible results can be achieved as long as the interface width value is adequately low. This paper provides the basis for determining the interface width value when simulating dendritic growth into an undercooled melt by the phase-field approach.

  18. A comparison between track-structure, condensed-history Monte Carlo simulations and MIRD cellular S-values

    Science.gov (United States)

    Tajik-Mansoury, M. A.; Rajabi, H.; Mazdarani, H.

    2017-03-01

    The S-value is a standard measure in cellular dosimetry. S-values are calculated by applying analytical methods or by Monte Carlo simulation. In Monte Carlo simulation, particles are either tracked individually event-by-event or close events are condensed and processed collectively in different steps. Both of these methods have been employed for estimation of cellular S-values, but there is no consistency between the published results. In the present paper, we used the Geant4-DNA track-structure physics model as the reference to estimate the cellular S-values. We compared the results with the corresponding values obtained from the following three condensed-history physics models of Geant4: Penelope, Livermore and standard. The geometry and source were exactly the same in all the simulations. We utilized mono-energetic electrons with an initial kinetic energy in the range 1–700 keV as the source of radiation. We also compared our results with the MIRD S-values. We first drew an overall comparison between different data series and then compared the dependence of results on the energy of particles and the size of scoring compartments. The overall comparison indicated a very good linear correlation (R² > 91%) and small bias (3%) between the results of the track-structure model and the condensed-history physics model. The bias between MIRD and the results of Monte Carlo track-structure simulation was considerable (-8%). However, the point-by-point comparison revealed differences of up to 28% between the condensed-history and the track-structure MC codes for self-absorption S-values in the 10–50 keV energy range. For the cross-absorption S-values, the difference was up to 34%. In this energy range, the difference between the MIRD S-values and the Geant4-DNA results was up to 68%. Our findings suggest that the consistency/inconsistency of the results obtained with different MC simulations depends on the size of the scoring volumes, the energy of the

  19. A new model to simulate impact breakup

    Science.gov (United States)

    Cordelli, Alessandro; Farinella, Paolo

    1997-12-01

    We have developed a preliminary version of a new type of code to simulate the outcomes of impacts between solid bodies, which we plan to further refine for application to both asteroid science and space debris studies. In the current code, colliding objects are modeled as two-dimensional arrays of finite elements, which can interact with each other in both an elastic and a shock-wave regime. The finite elements are hard spheres with given values for mass and radius. When two of them come into contact, the laws of inelastic scattering are applied, thus giving rise to the propagation of shock waves. Moreover, each spherical element interacts elastically with its nearest neighbours. The interaction force corresponds to that of a spring having an equilibrium length equal to the lattice spacing, and results in the propagation of elastic waves in the lattice. Dissipation effects are modeled by means of a dissipative force term proportional to the relative velocity, with a given characteristic time of decay. The possible occurrence of fractures in the material is modeled by assuming that when the distance between two neighbouring elements exceeds a threshold value, the binding force between them disappears forever. This model requires finding a plausible correspondence between the input parameters appearing in the equations of motion and the physical properties of real solid materials. Some of the required links are quite obvious (e.g., the relationship between the mass of the elements and the elastic constant on one side, and material density and sound velocity on the other), while others are a priori unclear, and additional hypotheses about them must be made (e.g., on the restitution coefficient of inelastic scattering). Despite the preliminary character of the model, we have obtained some interesting results, which appear to mimic in a realistic way the outcomes of actual impacts. For instance, we have observed the formation of craters and fractures, and (for high impact

  20. Modeling of magnetic particle suspensions for simulations

    CERN Document Server

    Satoh, Akira

    2017-01-01

    The main objective of the book is to highlight the modeling of magnetic particles with different shapes and magnetic properties, to provide graduate students and young researchers information on the theoretical aspects and actual techniques for the treatment of magnetic particles in particle-based simulations. In simulation, we focus on the Monte Carlo, molecular dynamics, Brownian dynamics, lattice Boltzmann and stochastic rotation dynamics (multi-particle collision dynamics) methods. The latter two simulation methods can simulate both the particle motion and the ambient flow field simultaneously. In general, specialized knowledge can only be obtained in an effective manner under the supervision of an expert. The present book is written to play such a role for readers who wish to develop the skill of modeling magnetic particles and develop a computer simulation program using their own ability. This book is therefore a self-learning book for graduate students and young researchers. Armed with this knowledge,...

  1. Modelling and Simulation of Wave Loads

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Thoft-Christensen, Palle

    A simple model of the wave load on slender members of offshore structures is described. The wave elevation of the sea state is modelled by a stationary Gaussian process. A new procedure to simulate realizations of the wave loads is developed. The simulation method assumes that the wave particle velocity can be approximated by a Gaussian Markov process. Known approximate results for the first-passage density or, equivalently, the distribution of the extremes of wave loads are presented and compared with rather precise simulation results. It is demonstrated that the approximate results...

  2. Modelling and Simulation of Wave Loads

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Thoft-Christensen, Palle

    1985-01-01

    A simple model of the wave load on slender members of offshore structures is described. The wave elevation of the sea state is modelled by a stationary Gaussian process. A new procedure to simulate realizations of the wave loads is developed. The simulation method assumes that the wave particle...... velocity can be approximated by a Gaussian Markov process. Known approximate results for the first-passage density or equivalently, the distribution of the extremes of wave loads are presented and compared with rather precise simulation results. It is demonstrated that the approximate results...

  3. Modeling and simulation of multiport RF switch

    Energy Technology Data Exchange (ETDEWEB)

    Vijay, J [Student, Department of Instrumentation and Control Engineering, National Institute of Technology, Tiruchirappalli-620015 (India); Saha, Ivan [Scientist, Indian Space Research Organisation (ISRO) (India); Uma, G [Lecturer, Department of Instrumentation and Control Engineering, National Institute of Technology, Tiruchirappalli-620015 (India); Umapathy, M [Assistant Professor, Department of Instrumentation and Control Engineering, National Institute of Technology, Tiruchirappalli-620015 (India)

    2006-04-01

    This paper describes the modeling and simulation of a multiport RF switch in which the latching mechanism is realized with two hot-arm electrothermal actuators and the switching action is realized with electrostatic actuators. It can act as a single-pole single-throw as well as a single-pole multi-throw switch. The proposed structure is modeled analytically and the required parameters are simulated using MATLAB. The analytical simulation results are validated using Finite Element Analysis in the COVENTORWARE software.

  4. Modeling and simulation of discrete event systems

    CERN Document Server

    Choi, Byoung Kyu

    2013-01-01

    Computer modeling and simulation (M&S) allows engineers to study and analyze complex systems. Discrete-event system (DES)-M&S is used in modern management, industrial engineering, computer science, and the military. As computer speeds and memory capacity increase, so DES-M&S tools become more powerful and more widely used in solving real-life problems. Based on over 20 years of evolution within a classroom environment, as well as on decades-long experience in developing simulation-based solutions for high-tech industries, Modeling and Simulation of Discrete-Event Systems is the only book on

  5. Traffic Modeling in WCDMA System Level Simulations

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Traffic modeling is a crucial element in WCDMA system level simulations. A clear understanding of the nature of traffic in the WCDMA system and subsequent selection of an appropriate random traffic model are critical to the success of the modeling enterprise. The resulting performance figures are meaningful only if the design is well adapted to the traffic, channel, and user-mobility models, and if those models are themselves accurate. In this article, our attention is focused on modeling voice traffic with the SBBP model and WWW data traffic with the Victor model.

  6. A value model for evaluating homeland security decisions.

    Science.gov (United States)

    Keeney, Ralph L; von Winterfeldt, Detlof

    2011-09-01

    One of the most challenging tasks of homeland security policymakers is to allocate their limited resources to reduce terrorism risks cost effectively. To accomplish this task, it is useful to develop a comprehensive set of homeland security objectives, metrics to measure each objective, a utility function, and value tradeoffs relevant for making homeland security investments. Together, these elements form a homeland security value model. This article develops a homeland security value model based on literature reviews, a survey, and experience with building value models. The purposes of the article are to motivate the use of a value model for homeland security decision making and to illustrate its use to assess terrorism risks, assess the benefits of countermeasures, and develop a severity index for terrorism attacks. © 2011 Society for Risk Analysis.
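
    The elements listed above (objectives, metrics, and value tradeoffs combined through a utility function) can be sketched as a minimal additive value model; the objectives, weights, and scores below are hypothetical, not the article's actual homeland security objectives.

```python
def additive_value(scores, weights):
    """Additive multi-attribute value function: each alternative is scored
    on every objective (0-1 scale) and the objective values are combined
    with normalized weights. A sketch of the general technique, not the
    article's specific model."""
    total_w = sum(weights.values())
    return sum(weights[obj] * scores[obj] / total_w for obj in weights)

# Hypothetical countermeasure scored on three illustrative objectives
scores = {"lives_protected": 0.8, "economic_impact": 0.5, "public_confidence": 0.6}
weights = {"lives_protected": 0.5, "economic_impact": 0.3, "public_confidence": 0.2}
```

    The same structure supports comparing countermeasures or building a severity index by evaluating each attack scenario against the objective set.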

  7. Model grid and infiltration values for the transient ground-water flow model, Death Valley regional ground-water flow system, Nevada and California

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This digital data set defines the model grid and infiltration values simulated in the transient ground-water flow model of the Death Valley regional ground-water...


  9. Self-Service Banking: Value Creation Models and Information Exchange

    Directory of Open Access Journals (Sweden)

    Ragnvald Sannes

    2001-01-01

    Full Text Available This paper argues that most banks have failed to exploit the potential of self-service banking because they base their service design on an incomplete business model for self-service. A framework for evaluation of self-service banking concepts is developed on the basis of Stabell and Fjeldstad's three value configurations. The value network and the value shop are consistent with self-service banking while the value chain is inappropriate. The impact of the value configurations on information exchange and self-service functionality is discussed, and a framework for design of such services proposed. Current self-service banking practices are compared to the framework, and it is concluded that current practice matches the concept of a value network and not the value shop. However, current practices are only a partial implementation of a value network-based self-service banking concept.

  10. Modeling and simulation of luminescence detection platforms.

    Science.gov (United States)

    Salama, Khaled; Eltoukhy, Helmy; Hassibi, Arjang; El-Gamal, Abbas

    2004-06-15

    Motivated by the design of an integrated CMOS-based detection platform, a simulation model for CCD and CMOS imager-based luminescence detection systems is developed. The model comprises four parts. The first portion models the process of photon flux generation from luminescence probes using ATP-based and luciferase label-based assay kinetics. An optics simulator is then used to compute the incident photon flux on the imaging plane for a given photon flux and system geometry. Subsequently, the output image is computed using a detailed imaging sensor model that accounts for photodetector spectral response, dark current, conversion gain, and various noise sources. Finally, signal processing algorithms are applied to the image to enhance detection reliability and hence increase the overall system throughput. To validate the model, simulation results are compared to experimental results obtained from a CCD-based system that was built to emulate the integrated CMOS-based platform.
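
    The imaging-sensor stage of the model (quantum efficiency, dark current, conversion gain, and noise sources) can be sketched as follows; the quantum efficiency, dark current, and noise figures are illustrative assumptions, not the paper's calibrated values.

```python
import numpy as np

rng = np.random.default_rng(0)

def sensor_output(photon_flux, t_exp, qe=0.3, dark_e_per_s=5.0,
                  conv_gain=1e-3, read_noise_e=10.0):
    """Toy pixel model: incident photon flux (photons/s/pixel) is converted
    to electrons via quantum efficiency, dark current adds thermally
    generated electrons, shot noise is Poisson, read noise is Gaussian, and
    conversion gain maps electrons to output volts."""
    mean_e = photon_flux * qe * t_exp + dark_e_per_s * t_exp
    electrons = rng.poisson(mean_e) + rng.normal(0.0, read_noise_e)
    return conv_gain * electrons
```

    In the full pipeline this stage would be fed by the assay-kinetics and optics simulators and followed by the signal-processing step.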

  11. SOFT MODELLING AND SIMULATION IN STRATEGY

    Directory of Open Access Journals (Sweden)

    Luciano Rossoni

    2006-06-01

    Full Text Available A certain resistance exists on the part of the managers responsible for strategy to using modeling and simulation techniques and tools. Many find them excessively complicated, while others see them as too rigid and mathematical for strategy-making in uncertain and turbulent environments. However, some interpretative approaches exist that meet, in part, the needs of these decision makers. The objective of this work is to present, in a clear and simple form, some of the most powerful interpretative (soft) approaches, methodologies, and tools for modeling and simulation in the area of business strategy. We first define what models and simulation are, and discuss some aspects of modeling and simulation in the strategy area. We then review some soft modeling approaches, which see the modeling process as much more than a simply mechanical one since, as Simon observed, human beings are boundedly rational and their decisions are influenced by a series of subjective factors related to the environment in which they are embedded. Keywords: strategy, modeling and simulation, soft systems methodology, cognitive map, systems dynamics.

  12. Reiteration of Hankel singular value decomposition for modeling of complex-valued signal

    Science.gov (United States)

    Staniszewski, Michał; Skorupa, Agnieszka; Boguszewicz, Łukasz; Wicher, Magdalena; Konopka, Marek; Sokół, Maria; Polański, Andrzej

    2016-06-01

    Modeling a signal that takes complex values is a common scientific problem present in many applications, e.g. medical signals, computer graphics, and vision. One possible solution is the Hankel Singular Value Decomposition (HSVD). In the first step the complex-valued signal is arranged in a special form called a Hankel matrix, which is then decomposed by Singular Value Decomposition. The resulting matrices can be reformulated to obtain the parameters describing the system. The basic method can be applied to fit the whole signal, but it fails when modeling each particular component of the signal. A modification of the basic HSVD method, which relies on reiteration and is applied to the main components, together with the use of prior knowledge, solves the presented problem.
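
    The Hankel-matrix-plus-SVD core of the method can be sketched as follows; this shows only the decomposition and a low-rank reconstruction, not the reiteration scheme or the extraction of model parameters (frequencies and dampings), which needs a further step.

```python
import numpy as np

def hankel_lowrank(signal, rank, rows=None):
    """Arrange a complex-valued signal in a Hankel matrix, keep only the
    leading singular components, and average back along anti-diagonals.
    A sketch of the SVD step of HSVD, with illustrative defaults."""
    N = len(signal)
    L = rows or N // 2
    H = np.array([signal[i:i + N - L + 1] for i in range(L)])  # Hankel form
    U, s, Vh = np.linalg.svd(H, full_matrices=False)
    Hr = (U[:, :rank] * s[:rank]) @ Vh[:rank]                  # rank truncation
    out = np.zeros(N, dtype=complex)
    cnt = np.zeros(N)
    for i in range(L):                                         # anti-diagonal averaging
        for j in range(N - L + 1):
            out[i + j] += Hr[i, j]
            cnt[i + j] += 1
    return out / cnt
```

    For a noiseless single damped exponential the Hankel matrix has exact rank one, so a rank-1 truncation reproduces the signal.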

  13. Modeling and Simulation of Hydraulic Engine Mounts

    Institute of Scientific and Technical Information of China (English)

    DUAN Shanzhong; Marshall McNea

    2012-01-01

    Hydraulic engine mounts are widely used in automotive powertrains for vibration isolation. A lumped mechanical parameter model is a traditional approach to model and simulate such mounts. This paper presents a dynamical model of a passive hydraulic engine mount with a double chamber, an inertia track, a decoupler, and a plunger. The model is developed based on the analogy between electrical systems and mechanical-hydraulic systems. The model is established to capture both low- and high-frequency dynamic behaviors of the hydraulic mount. The model will be further used to find the approximate pulse responses of the mounts in terms of the force transmission and top chamber pressure. The closed-form solution from the simplified linear model may provide some insight into the highly nonlinear behavior of the mounts. Based on the model, computer simulation has been carried out to study the dynamic performance of the hydraulic mount.

  14. Modelling and simulating fire tube boiler performance

    DEFF Research Database (Denmark)

    Sørensen, K.; Condra, T.; Houbak, Niels;

    2003-01-01

    A model for a flue gas boiler covering the flue gas and the water-/steam side has been formulated. The model has been formulated as a number of sub models that are merged into an overall model for the complete boiler. Sub models have been defined for the furnace, the convection zone (split in 2......: a zone submerged in water and a zone covered by steam), a model for the material in the boiler (the steel) and 2 models for resp. the water/steam zone (the boiling) and the steam. The dynamic model has been developed as a number of Differential-Algebraic-Equation (DAE) systems. Subsequently MatLab/Simulink has been applied for carrying out the simulations. To be able to verify the simulated results, experiments have been carried out on a full scale boiler plant....

  15. Modelling and simulating fire tube boiler performance

    DEFF Research Database (Denmark)

    Sørensen, Kim; Karstensen, Claus; Condra, Thomas Joseph;

    2003-01-01

    A model for a flue gas boiler covering the flue gas and the water-/steam side has been formulated. The model has been formulated as a number of sub models that are merged into an overall model for the complete boiler. Sub models have been defined for the furnace, the convection zone (split in 2: a zone...... submerged in water and a zone covered by steam), a model for the material in the boiler (the steel) and 2 models for resp. the water/steam zone (the boiling) and the steam. The dynamic model has been developed as a number of Differential-Algebraic-Equation (DAE) systems. Subsequently MatLab/Simulink has...... been applied for carrying out the simulations. To be able to verify the simulated results an experiment has been carried out on a full scale boiler plant....

  16. Environment Modeling Using Runtime Values for JPF-Android

    Science.gov (United States)

    van der Merwe, Heila; Tkachuk, Oksana; Nel, Seal; van der Merwe, Brink; Visser, Willem

    2015-01-01

    Software applications are developed to be executed in a specific environment. This environment includes external native libraries to add functionality to the application and drivers to fire the application execution. For testing and verification, the environment of an application is simplified and abstracted using models or stubs. Empty stubs, returning default values, are simple to generate automatically, but they do not perform well when the application expects specific return values. Symbolic execution is used to find input parameters for drivers and return values for library stubs, but it struggles to detect the values of complex objects. In this work-in-progress paper, we explore an approach to generate drivers and stubs based on values collected during runtime instead of using default values. Entry-points and methods that need to be modeled are instrumented to log their parameters and return values. The instrumented applications are then executed using a driver and instrumented libraries. The values collected during runtime are used to generate driver and stub values on-the-fly that improve coverage during verification by enabling the execution of code that previously crashed or was missed. We are implementing this approach to improve the environment model of JPF-Android, our model checking and analysis tool for Android applications.
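
    The idea of seeding stubs with runtime values rather than defaults can be sketched generically; this is not JPF-Android's actual API, and the decorator and table names below are hypothetical.

```python
import functools

RECORDED = {}   # method name -> list of (args, return value) seen at runtime

def record(fn):
    """Instrumentation wrapper: log the parameters and return value of a
    real call so they can later seed a stub."""
    @functools.wraps(fn)
    def wrapper(*args):
        out = fn(*args)
        RECORDED.setdefault(fn.__name__, []).append((args, out))
        return out
    return wrapper

def make_stub(name, default=None):
    """Build a stub that replays a recorded return value for known inputs
    and falls back to a default otherwise (the behaviour of an empty stub)."""
    table = dict(RECORDED.get(name, []))
    return lambda *args: table.get(args, default)
```

    A stub built this way returns realistic values along paths exercised at runtime, which is the coverage gain the paper describes.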

  17. Computer simulations of the random barrier model

    DEFF Research Database (Denmark)

    Schrøder, Thomas; Dyre, Jeppe

    2002-01-01

    A brief review of experimental facts regarding ac electronic and ionic conduction in disordered solids is given followed by a discussion of what is perhaps the simplest realistic model, the random barrier model (symmetric hopping model). Results from large scale computer simulations are presented......, focusing on universality of the ac response in the extreme disorder limit. Finally, some important unsolved problems relating to hopping models for ac conduction are listed....

  18. Mean Value Modelling of a Turbocharged SI Engine

    DEFF Research Database (Denmark)

    Müller, Martin; Hendricks, Elbert; Sorenson, Spencer C.

    1998-01-01

    An important paradigm for the modelling of naturallly aspirated (NA) spark ignition (SI) engines for control purposes is the Mean Value Engine Model (MVEM). Such models have a time resolution which is just sufficient to capture the main details of the dynamic performance of NA SI engines but not ...

  19. Modeling and simulating of unloading welding transformer

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    The simulation model of an unloading welding transformer was established on the basis of MATLAB software, and the modeling principle is described in detail in the paper. The model was made up of three sub-models, i.e. the linear inductor sub-model, the non-linear inductor sub-model and a series-connection sub-model controlled by current, and these sub-models were joined together by means of segmented linearization. The simulation results showed that, under the conditions of high converter frequency and a large cross section of the magnet core of a welding transformer, the non-linear inductor sub-model can be substituted by a linear inductor sub-model; and that the leakage reactance in the welding transformer is one of the main causes of over-current and over-voltage in the inverter. The simulation results demonstrate that the over-voltage produced by the leakage reactance is nearly twice the input voltage supplied to the transformer, and the duration of the over-voltage depends on the time constant τ1. As τ1 decreases, the amplitude of the over-current increases and its duration becomes shorter; conversely, as τ1 increases, the amplitude of the over-current decreases and its duration becomes longer. The model has played an important role in the development of the inverter resistance welding machine.
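
    The reported relationship (an over-voltage spike of roughly twice the input voltage decaying with time constant τ1) can be sketched as a first-order decay; this is an idealization for illustration, not the simulated waveform from the model.

```python
import math

def overvoltage(t, v_in, tau1):
    """First-order sketch of the leakage-reactance over-voltage: a spike
    starting near twice the supply voltage that dies away with time
    constant tau1. The factor of two follows the abstract's reported
    magnitude; the exact shape depends on the full inverter circuit."""
    return 2.0 * v_in * math.exp(-t / tau1)
```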

  20. Revolutions in energy through modeling and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Tatro, M.; Woodard, J.

    1998-08-01

    The development and application of energy technologies for all aspects from generation to storage have improved dramatically with the advent of advanced computational tools, particularly modeling and simulation. Modeling and simulation are not new to energy technology development, and have been used extensively ever since the first commercial computers were available. However, recent advances in computing power and access have broadened their extent and use, and, through increased fidelity (i.e., accuracy) of the models due to greatly enhanced computing power, the increased reliance on modeling and simulation has shifted the balance point between modeling and experimentation. The complex nature of energy technologies has motivated researchers to use these tools to better understand performance, reliability, and cost issues related to energy. The tools originated in sciences such as the strength of materials (nuclear reactor containment vessels); physics, heat transfer and fluid flow (oil production); chemistry, physics, and electronics (photovoltaics); and geosciences and fluid flow (oil exploration and reservoir storage). Other tools include mathematics, such as statistics, for assessing project risks. This paper describes a few advancements made possible by these tools and explores the benefits and costs of their use, particularly as they relate to the acceleration of energy technology development. The computational complexity ranges from basic spreadsheets to complex numerical simulations using hardware ranging from personal computers (PCs) to Cray computers. In all cases, the benefits of using modeling and simulation relate to lower risks, accelerated technology development, or lower cost projects.

  1. Diversity modelling for electrical power system simulation

    Science.gov (United States)

    Sharip, R. M.; Abu Zarim, M. A. U. A.

    2013-12-01

    This paper considers diversity of generation and demand profiles against the different future energy scenarios and evaluates these on a technical basis. Compared to previous studies, this research applies a forecasting concept based on possible growth rates taken from publicly available electricity distribution scenarios concerning the UK. These scenarios were created by different bodies considering aspects such as environment, policy, regulation, economics, and technology. In line with these scenarios, forecasting is on a long-term timescale (in ten-year steps from 2020 until 2050) in order to create a possible output of generation mix and demand profiles to be used as an appropriate boundary condition for the network simulation. The network considered is a segment of rural LV populated with a mixture of different housing types. The profiles for the 'future' energy and demand have been successfully modelled by applying a forecasting method. For the cases studied, the network results under these profiles show that, even though the power produced by each micro-generation unit is often in line with the demand of an individual dwelling, no problems arise from high penetration of micro-generation and demand-side management in the dwellings considered. The results obtained highlight the technical issues and changes for energy delivery and management to rural customers under the future energy scenarios.

  2. Inventory Reduction Using Business Process Reengineering and Simulation Modeling.

    Science.gov (United States)

    1996-12-01

    center is analyzed using simulation modeling and business process reengineering (BPR) concepts. The two simulation models were designed and evaluated by...reengineering and simulation modeling offer powerful tools to aid the manager in reducing cycle time and inventory levels.

  3. Trust Model for Social Network using Singular Value Decomposition

    OpenAIRE

    Davis Bundi Ntwiga; Patrick Weke; Michael Kiura Kirumbu

    2016-01-01

    For effective interactions to take place in a social network, trust is important. We model the trust of agents using the peer-to-peer reputation ratings in the network, which form a real-valued matrix. Singular value decomposition discounts the reputation ratings to estimate the trust levels, as trust is the subjective probability of future expectations based on current reputation ratings. Reputation and trust are closely related, and singular value decomposition can estimate trust using the...
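
    The discounting step can be sketched with a truncated SVD of the reputation matrix; the chosen rank and the clipping of trust values to [0, 1] are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def trust_from_reputation(R, rank=2):
    """Discount a peer-to-peer reputation matrix by keeping only its
    leading singular components: small components (noise, one-off ratings)
    are dropped and the reconstructed matrix is read as the trust estimate."""
    U, s, Vh = np.linalg.svd(R, full_matrices=False)
    T = (U[:, :rank] * s[:rank]) @ Vh[:rank]
    return np.clip(T, 0.0, 1.0)    # keep trust readings in [0, 1]
```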

  4. Simulation and modeling of turbulent flows

    CERN Document Server

    Gatski, Thomas B; Lumley, John L

    1996-01-01

    This book provides students and researchers in fluid engineering with an up-to-date overview of turbulent flow research in the areas of simulation and modeling. A key element of the book is the systematic, rational development of turbulence closure models and related aspects of modern turbulent flow theory and prediction. Starting with a review of the spectral dynamics of homogenous and inhomogeneous turbulent flows, succeeding chapters deal with numerical simulation techniques, renormalization group methods and turbulent closure modeling. Each chapter is authored by recognized leaders in their respective fields, and each provides a thorough and cohesive treatment of the subject.

  5. Modeling & Simulation Executive Agent Panel

    Science.gov (United States)

    2007-11-02

    Office of the Oceanographer of the Navy. "...acquisition, and training communities." MSEA role: facilitator in the project startup phase; catalyst during development; certifier in the... Acoustic models: Parabolic Equation 5.0, ASTRAL 5.0, ASPM 4.3, Gaussian Ray Bundle 1.0, High Freq Env Acoustic (HFEVA) 1.0, COLOSSUS II 1.0, Low Freq Bottom LOSS.

  6. Simulation of Daylight in Digital Models

    DEFF Research Database (Denmark)

    Villaume, René Domine; Ørstrup, Finn Rude

    2004-01-01

    Through various daylight simulations, the project investigates the quality of visualizations of complex lighting conditions in digital models used for communicating architecture via the web. In a digital 3D model of Utzon Associates' Paustians hus, natural daylight is simulated with different rendering methods, such as "shaded render", "raytracing", "Final Gather", and "Global Illumination".

  7. Validity of microgravity simulation models on earth

    DEFF Research Database (Denmark)

    Regnard, J; Heer, M; Drummer, C

    2001-01-01

    Many studies have used water immersion and head-down bed rest as experimental models to simulate responses to microgravity. However, some data collected during space missions are at variance or in contrast with observations collected from experimental models. These discrepancies could reflect inc...

  8. Molecular simulation and modeling of complex I.

    Science.gov (United States)

    Hummer, Gerhard; Wikström, Mårten

    2016-07-01

    Molecular modeling and molecular dynamics simulations play an important role in the functional characterization of complex I. With its large size and complicated function, linking quinone reduction to proton pumping across a membrane, complex I poses unique modeling challenges. Nonetheless, simulations have already helped in the identification of possible proton transfer pathways. Simulations have also shed light on the coupling between electron and proton transfer, thus pointing the way in the search for the mechanistic principles underlying the proton pump. In addition to reviewing what has already been achieved in complex I modeling, we aim here to identify pressing issues and to provide guidance for future research to harness the power of modeling in the functional characterization of complex I. This article is part of a Special Issue entitled Respiratory complex I, edited by Volker Zickermann and Ulrich Brandt. Copyright © 2016 Elsevier B.V. All rights reserved.

  9. Hybrid simulation models of production networks

    CERN Document Server

    Kouikoglou, Vassilis S

    2001-01-01

    This book is concerned with a most important area of industrial production, that of analysis and optimization of production lines and networks using discrete-event models and simulation. The book introduces a novel approach that combines analytic models and discrete-event simulation. Unlike conventional piece-by-piece simulation, this method observes a reduced number of events between which the evolution of the system is tracked analytically. Using this hybrid approach, several models are developed for the analysis of production lines and networks. The hybrid approach combines speed and accuracy for exceptional analysis of most practical situations. A number of optimization problems, involving buffer design, workforce planning, and production control, are solved through the use of hybrid models.

  10. Investigating Output Accuracy for a Discrete Event Simulation Model and an Agent Based Simulation Model

    CERN Document Server

    Majid, Mazlina Abdul; Siebers, Peer-Olaf

    2010-01-01

    In this paper, we investigate output accuracy for a Discrete Event Simulation (DES) model and Agent Based Simulation (ABS) model. The purpose of this investigation is to find out which of these simulation techniques is the best one for modelling human reactive behaviour in the retail sector. In order to study the output accuracy in both models, we have carried out a validation experiment in which we compared the results from our simulation models to the performance of a real system. Our experiment was carried out using a large UK department store as a case study. We had to determine an efficient implementation of management policy in the store's fitting room using DES and ABS. Overall, we have found that both simulation models were a good representation of the real system when modelling human reactive behaviour.

  11. Stakeholder Theory and Value Creation Models in Brazilian Firms

    Directory of Open Access Journals (Sweden)

    Natalia Giugni Vidal

    2015-09-01

    Full Text Available Objective – The purpose of this study is to understand how top Brazilian firms think about and communicate value creation to their stakeholders. Design/methodology/approach – We use qualitative content analysis methodology to analyze the sustainability or annual integrated reports of the top 25 Brazilian firms by sales revenue. Findings – Based on our analysis, these firms were classified into three main types of stakeholder value creation models: narrow, broad, or transitioning from narrow to broad. We find that many of the firms in our sample are in a transition state between narrow and broad stakeholder value creation models. We also identify seven areas of concentration discussed by firms in creating value for stakeholders: better stakeholder relationships, better work environment, environmental preservation, increased customer base, local development, reputation, and stakeholder dialogue. Practical implications – This study shows a trend towards broader stakeholder value creation models in Brazilian firms. The findings of this study may inform practitioners interested in broadening their value creation models. Originality/value – This study adds to the discussion of stakeholder theory in the Brazilian context by understanding variations in value creation orientation in Brazil.

  12. Power electronics system modeling and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Lai, Jih-Sheng

    1994-12-31

    This paper introduces control-system-design software packages, SIMNON and MATLAB/SIMULINK, for power electronics system simulation. A complete power electronics system typically consists of a rectifier bridge along with its smoothing capacitor, an inverter, and a motor. The system components, whether discrete or continuous, linear or nonlinear, are modeled by mathematical equations. Inverter control methods, such as pulse-width modulation and hysteresis current control, are expressed in either computer algorithms or digital circuits. After describing the component models and control methods, computer programs are then developed for complete system simulation. Simulation results are mainly used for studying system performance, such as input and output current harmonics, torque ripples, and speed responses. Key computer programs and simulation results are demonstrated for educational purposes.

  13. Simulation of Gravity Currents Using VOF Model

    Institute of Scientific and Technical Information of China (English)

    邹建锋; 黄钰期; 应新亚; 任安禄

    2002-01-01

    In this article, two-dimensional gravity currents with three phases, including air, are numerically simulated with the Volume of Fluid (VOF) multiphase flow model. The necessity of considering turbulence effects at high Reynolds numbers is demonstrated quantitatively with the Large Eddy Simulation (LES) turbulence model. The gravity currents are simulated for h ≠ H as well as h = H, where h is the depth of the gravity current before the release and H is the depth of the intruded fluid. Uprising of a swell occurs when a current flows horizontally into another, lighter one for h ≠ H. The questions of under what conditions the uprising of the swell occurs and how long it takes are considered in this article. All the simulated results are in reasonable agreement with the available experimental results.

  14. The effects of numerical-model complexity and observation type on estimated porosity values

    Science.gov (United States)

    Starn, J. Jeffrey; Bagtzoglou, Amvrossios C.; Green, Christopher T.

    2015-09-01

    The relative merits of model complexity and types of observations employed in model calibration are compared. An existing groundwater flow model coupled with an advective transport simulation of the Salt Lake Valley, Utah (USA), is adapted for advective transport, and effective porosity is adjusted until simulated tritium concentrations match concentrations in samples from wells. Two calibration approaches are used: a "complex" highly parameterized porosity field and a "simple" parsimonious model of porosity distribution. The use of an atmospheric tracer (tritium in this case) and apparent ages (from tritium/helium) in model calibration also are discussed. Of the models tested, the complex model (with tritium concentrations and tritium/helium apparent ages) performs best. Although tritium breakthrough curves simulated by complex and simple models are very generally similar, and there is value in the simple model, the complex model is supported by a more realistic porosity distribution and a greater number of estimable parameters. Culling the best quality data did not lead to better calibration, possibly because of processes and aquifer characteristics that are not simulated. Despite many factors that contribute to shortcomings of both the models and the data, useful information is obtained from all the models evaluated. Although any particular prediction of tritium breakthrough may have large errors, overall, the models mimic observed trends.
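
    For a single advective flow path, the relationship between effective porosity and a tracer-based apparent age can be sketched as follows; this is a one-path simplification of the calibration idea, not the paper's distributed-parameter procedure, and the numbers used are hypothetical.

```python
def calibrated_porosity(darcy_flux, path_length, apparent_age):
    """For purely advective transport the seepage velocity is v = q / n_e,
    so the travel time along a path of length L is t = L * n_e / q.
    Inverting gives the effective porosity that reproduces a tracer-based
    apparent age: n_e = q * t / L. Units must be consistent (here, metres
    and days)."""
    return darcy_flux * apparent_age / path_length
```

    In the paper's setting this adjustment is done over a parameterized porosity field until simulated tritium concentrations match well samples, rather than path by path.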

  15. Value Creation Challenges in Multichannel Retail Business Models

    Directory of Open Access Journals (Sweden)

    Mika Yrjölä

    2014-08-01

    Full Text Available Purpose: The purpose of the paper is to identify and analyze the challenges of value creation in multichannel retail business models. Design/methodology/approach: With the help of semi-structured interviews with top executives from different retailing environments, this study introduces a model of value creation challenges in the context of multichannel retailing. The challenges are analyzed in terms of three retail business model elements, i.e., format, activities, and governance. Findings: Adopting a multichannel retail business model requires critical rethinking of the basic building blocks of value creation. First of all, as customers effortlessly move between multiple channels, multichannel formats can lead to a mismatch between customer and firm value. Secondly, retailers face pressures to use their activities to form integrated total offerings to customers. Thirdly, multiple channels might lead to organizational silos with conflicting goals. A careful orchestration of value creation is needed to determine the roles and incentives of the channel parties involved. Research limitations/implications: In contrast to previous business model literature, this study did not adopt a network-centric view. By embracing the boundary-spanning nature of the business model, other challenges and elements might have been discovered (e.g., challenges in managing relationships with suppliers). Practical implications: As a practical contribution, this paper has analyzed the challenges retailers face in adopting multichannel business models. Customer tendencies for showrooming behavior highlight the need for generating efficient lock-in strategies. Customized, personal offers and information are ways to increase customer value, differentiate from competition, and achieve lock-in. 
Originality/value: As a theoretical contribution, this paper empirically investigates value creation challenges in a specific context, lowering the level of abstraction in the mostly

  16. Integral-Value Models for Outcomes over Continuous Time

    DEFF Research Database (Denmark)

    Harvey, Charles M.; Østerdal, Lars Peter

    Models of preferences between outcomes over continuous time are important for individual, corporate, and social decision making, e.g., medical treatment, infrastructure development, and environmental regulation. This paper presents a foundation for such models. It shows that conditions on preferences between real- or vector-valued outcomes over continuous time are satisfied if and only if the preferences are represented by a value function having an integral form.

  17. Development of NASA's Models and Simulations Standard

    Science.gov (United States)

    Bertch, William J.; Zang, Thomas A.; Steele, Martin J.

    2008-01-01

    From the Space Shuttle Columbia Accident Investigation, several NASA-wide actions were initiated. One of these actions was to develop a standard for the development, documentation, and operation of models and simulations. Over the course of two-and-a-half years, a team of NASA engineers representing nine of the ten NASA Centers developed a Models and Simulations Standard to address this action. The standard consists of two parts. The first is the traditional requirements section addressing programmatics, development, documentation, verification, validation, and the reporting of results from both the M&S analysis and the examination of compliance with this standard. The second part is a scale for evaluating the credibility of model and simulation results using levels of merit associated with eight key factors. This paper provides a historical account of the challenges faced by, and the processes used in, this committee-based development effort. This account provides insights into how other agencies might approach similar developments. Furthermore, we discuss some specific applications of models and simulations used to assess the impact of this standard on future model and simulation activities.

  18. Modeling Dynamics of Leaf Color Based on RGB Value in Rice

    Institute of Scientific and Technical Information of China (English)

    ZHANG Yong-hui; TANG Liang; LIU Xiao-jun; LIU Lei-lei; CAO Wei-xing; ZHU Yan

    2014-01-01

    This paper aimed to develop a model for simulating leaf color changes in rice (Oryza sativa L.) based on RGB (red, green, and blue) values. Based on data from rice experiments with different cultivars and nitrogen (N) rates, the time-course RGB values of each leaf on the main stem were collected during the rice growth period, and a model for simulating the dynamics of leaf color in rice was then developed using quantitative modeling technology. The results showed that the RGB values of leaf color gradually decreased from the initial values (light green) to steady values (green) during the first stage, remained at the steady values (green) during the second stage, and then gradually increased to the final values (from green to yellow) during the third stage. Decreasing linear functions, constant functions, and increasing linear functions were used to simulate the changes in RGB values of leaf color with growing degree days (GDD) at the first, second, and third stages, respectively; two cultivar parameters, MatRGB (leaf color matrix) and AR (a vector composed of the ratio of the cumulative GDD of each stage during the color change process of leaf n to that during leaf n drawn under adequate N status), were introduced to quantify the genetic characters in RGB values of leaf color and in the durations of the different stages of leaf color change, respectively; FN (an N impact factor) was used to quantify the effects of N levels on RGB values of leaf color and on the durations of these stages; linear functions were applied to simulate the changes in leaf color along the leaf midvein direction during leaf development. Validation of the models with an independent experimental dataset showed that the root mean square errors (RMSE) between the observed and simulated RGB values were between 8 and 13, the relative RMSE (RRMSE) was between 8 and 10%, the mean absolute differences (da) were between 3.85 and 6.90, and the ratio of da to the mean observation values (dap
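
The three-stage behavior described in the abstract (linear decrease, plateau, linear increase with growing degree days) can be sketched as a piecewise-linear function of GDD for a single RGB channel. All breakpoints and channel values below are illustrative assumptions, not the paper's calibrated parameters:

```python
import numpy as np

def leaf_rgb(gdd, rgb_init, rgb_steady, rgb_final, t1, t2, t3):
    """Piecewise-linear sketch of one RGB channel versus growing degree days.

    Stage 1 (0..t1): linear decrease from rgb_init to rgb_steady.
    Stage 2 (t1..t2): constant at rgb_steady.
    Stage 3 (t2..t3): linear increase from rgb_steady to rgb_final.
    Breakpoints t1, t2, t3 and channel values are illustrative only.
    """
    gdd = np.asarray(gdd, dtype=float)
    out = np.empty_like(gdd)
    s1 = gdd <= t1
    s2 = (gdd > t1) & (gdd <= t2)
    s3 = gdd > t2
    out[s1] = rgb_init + (rgb_steady - rgb_init) * gdd[s1] / t1
    out[s2] = rgb_steady
    out[s3] = rgb_steady + (rgb_final - rgb_steady) * (gdd[s3] - t2) / (t3 - t2)
    return np.clip(out, 0, 255)

g = leaf_rgb([0, 100, 400, 800], rgb_init=140, rgb_steady=90,
             rgb_final=180, t1=200, t2=600, t3=900)
```

In the paper the breakpoints themselves depend on cultivar parameters and an N impact factor; here they are fixed constants for clarity.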

  19. Determination of Complex-Valued Parametric Model Coefficients Using Artificial Neural Network Technique

    Directory of Open Access Journals (Sweden)

    A. M. Aibinu

    2010-01-01

    Full Text Available A new approach for determining the coefficients of complex-valued autoregressive (CAR) and complex-valued autoregressive moving average (CARMA) models using a complex-valued neural network (CVNN) technique is discussed in this paper. The CAR and complex-valued moving average (CMA) coefficients which constitute a CARMA model are computed simultaneously from the adaptive weights and coefficients of the linear activation functions in a two-layered CVNN. The performance of the proposed technique has been evaluated using simulated complex-valued data (CVD) with three different types of activation functions. The results show that the proposed method can accurately determine the model coefficients provided that the network is properly trained. Furthermore, application of the developed CVNN-based technique to MRI k-space reconstruction results in images with improved resolution.
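
To make the estimation problem concrete, a CAR(1) process with a complex coefficient can be simulated and its coefficient recovered by complex-domain least squares. This is a sketch of the underlying model only; the paper's contribution is to solve this with a complex-valued neural network, which is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a CAR(1) process x[t] = a * x[t-1] + noise with a complex coefficient.
a_true = 0.6 + 0.3j          # |a| < 1 keeps the process stationary
n = 5000
noise = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) * 0.1
x = np.zeros(n, dtype=complex)
for t in range(1, n):
    x[t] = a_true * x[t - 1] + noise[t]

# Complex least squares recovers the coefficient:
# a_hat = (x_prev^H x_next) / (x_prev^H x_prev), where ^H is the conjugate.
a_hat = np.vdot(x[:-1], x[1:]) / np.vdot(x[:-1], x[:-1])
```

`np.vdot` conjugates its first argument, which is exactly the complex inner product the normal equations require.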

  20. Prediction of survival with alternative modeling techniques using pseudo values.

    Directory of Open Access Journals (Sweden)

    Tjeerd van der Ploeg

    Full Text Available BACKGROUND: The use of alternative modeling techniques for predicting patient survival is complicated by the fact that some alternative techniques cannot readily deal with censoring, which is essential for analyzing survival data. In the current study, we aimed to demonstrate that pseudo values enable statistically appropriate analyses of survival outcomes when used in seven alternative modeling techniques. METHODS: In this case study, we analyzed survival of 1282 Dutch patients with newly diagnosed Head and Neck Squamous Cell Carcinoma (HNSCC) with conventional Kaplan-Meier and Cox regression analysis. We subsequently calculated pseudo values to reflect the individual survival patterns. We used these pseudo values to compare recursive partitioning (RPART), neural nets (NNET), logistic regression (LR), general linear models (GLM), and three variants of support vector machines (SVM) with respect to dichotomous 60-month survival, and continuous pseudo values at 60 months or estimated survival time. We used the area under the ROC curve (AUC) and the root of the mean squared error (RMSE) to compare the performance of these models using bootstrap validation. RESULTS: Of a total of 1282 patients, 986 patients died during a median follow-up of 66 months (60-month survival: 52% [95% CI: 50%-55%]). The LR model had the highest optimism-corrected AUC (0.791) to predict 60-month survival, followed by the SVM model with a linear kernel (AUC 0.787). The GLM model had the smallest optimism-corrected RMSE when continuous pseudo values were considered for 60-month survival or the estimated survival time, followed by SVM models with a linear kernel. The estimated importance of predictors varied substantially by the specific aspect of survival studied and modeling technique used. CONCLUSIONS: The use of pseudo values makes it readily possible to apply alternative modeling techniques to survival problems, to compare their performance and to search further for promising
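
The pseudo values used here are jackknife pseudo-observations of the Kaplan-Meier estimate: for subject i, n·S(t) minus (n-1)·S(t) recomputed without subject i. A minimal sketch (no handling of tied censoring/event times, unlike production survival libraries):

```python
import numpy as np

def km_surv(time, event, t):
    """Kaplan-Meier survival probability at time t (simple sketch, no tie handling)."""
    time, event = np.asarray(time, float), np.asarray(event, bool)
    order = np.argsort(time)
    time, event = time[order], event[order]
    s = 1.0
    n = len(time)
    for i in range(n):
        if time[i] > t:
            break
        if event[i]:                      # each death multiplies by (1 - 1/at_risk)
            s *= 1.0 - 1.0 / (n - i)
    return s

def pseudo_values(time, event, t):
    """Jackknife pseudo-observations: n*S(t) - (n-1)*S_minus_i(t)."""
    time, event = np.asarray(time, float), np.asarray(event, bool)
    n = len(time)
    s_full = km_surv(time, event, t)
    ps = np.empty(n)
    for i in range(n):
        mask = np.ones(n, bool)
        mask[i] = False
        ps[i] = n * s_full - (n - 1) * km_surv(time[mask], event[mask], t)
    return ps

# With no censoring, the pseudo value reduces to the indicator I(T_i > t).
ps = pseudo_values([1, 2, 3, 4, 5], [1, 1, 1, 1, 1], t=2.5)
```

Because the pseudo values are ordinary (uncensored) numeric outcomes, they can be fed to any regression or machine-learning technique, which is the point the study exploits.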

  1. Possibilistic Fuzzy Net Present Value Model and Application

    Directory of Open Access Journals (Sweden)

    S. S. Appadoo

    2014-01-01

    Full Text Available The cash flow values and the interest rate in the net present value (NPV) model are usually specified by either crisp numbers or random variables. In this paper, we first discuss some of the recent developments in possibility theory and find closed-form expressions for the fuzzy possibilistic net present value (FNPV). Then, following Carlsson and Fullér (2001), we discuss some of the possibilistic moments related to the FNPV model along with an illustrative numerical example. We also give a unified approach to finding higher-order moments of the FNPV by using the moment generating function introduced by Paseka et al. (2011).
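
The basic idea of a fuzzy NPV can be illustrated with triangular fuzzy cash flows and a crisp discount rate: discounting each vertex of the triangle keeps the result triangular. This sketch is far simpler than the possibilistic-moments machinery of the cited FNPV model, and the function and parameter names are illustrative:

```python
def fuzzy_npv(cash_flows, rate, spread):
    """Sketch of a fuzzy NPV with triangular fuzzy cash flows.

    Each cash flow c is treated as the triangular fuzzy number
    (c - spread, c, c + spread); discounting by a crisp rate is linear,
    so the NPV is again triangular, returned as (low, modal, high).
    """
    lo = mid = hi = 0.0
    for t, c in enumerate(cash_flows):
        d = (1.0 + rate) ** t
        lo += (c - spread) / d
        mid += c / d
        hi += (c + spread) / d
    return lo, mid, hi

# Illustrative project: -100 today, +60 in each of the next two years,
# with each cash flow uncertain by +/- 10.
lo, mid, hi = fuzzy_npv([-100, 60, 60], rate=0.10, spread=10)
```

The modal value equals the ordinary crisp NPV; the (lo, hi) endpoints bound the possibilistic support of the project value.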

  2. Models of consumer value cocreation in health care.

    Science.gov (United States)

    Nambisan, Priya; Nambisan, Satish

    2009-01-01

    In recent years, consumer participation in health care has gained critical importance as health care organizations (HCOs) seek varied avenues to enhance the quality and the value of their offerings. Many large HCOs have established online health communities where health care consumers (patients) can interact with one another to share knowledge and offer emotional support in disease management and care. Importantly, the focus of consumer participation in health care has moved beyond such personal health care management as the potential for consumers to participate in innovation and value creation in varied areas of the health care industry becomes increasingly evident. Realizing such potential, however, will require HCOs to develop a better understanding of the varied types of consumer value cocreation that are enabled by new information and communication technologies such as online health communities and Web 2.0 (social media) technologies. This article seeks to contribute toward such an understanding by offering a concise and coherent theoretical framework to analyze consumer value cocreation in health care. We identify four alternate models of consumer value cocreation-the partnership model, the open-source model, the support-group model, and the diffusion model-and discuss their implications for HCOs. We develop our theoretical framework by drawing on theories and concepts in knowledge creation, innovation management, and online communities. A set of propositions are developed by combining theoretical insights from these areas with real-world examples of consumer value cocreation in health care. The theoretical framework offered here informs on the potential impact of the different models of consumer value cocreation on important organizational variables such as innovation cost and time, service quality, and consumer perceptions of HCO. An understanding of the four models of consumer value cocreation can help HCOs adopt appropriate strategies and practices to

  3. Simulation Modeling of Software Development Processes

    Science.gov (United States)

    Calavaro, G. F.; Basili, V. R.; Iazeolla, G.

    1996-01-01

    A simulation modeling approach is proposed for the prediction of software process productivity indices, such as cost and time-to-market, and the sensitivity analysis of such indices to changes in the organization parameters and user requirements. The approach uses a timed Petri Net and Object Oriented top-down model specification. Results demonstrate the model representativeness, and its usefulness in verifying process conformance to expectations, and in performing continuous process improvement and optimization.

  4. Incorporation of RAM techniques into simulation modeling

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, S.C. Jr.; Haire, M.J.; Schryver, J.C.

    1995-07-01

    This work concludes that reliability, availability, and maintainability (RAM) analytical techniques can be incorporated into computer network simulation modeling to yield an important new analytical tool. This paper describes the incorporation of failure and repair information into network simulation to build a stochastic computer model that represents the RAM performance of two vehicles being developed for the US Army: the Advanced Field Artillery System (AFAS) and the Future Armored Resupply Vehicle (FARV). The AFAS is the US Army`s next-generation self-propelled cannon artillery system. The FARV is a resupply vehicle for the AFAS. Both vehicles utilize automation technologies to improve their operational performance and reduce manpower. The network simulation model used in this work is task based. The model programmed in this application represents a typical battle mission and the failures and repairs that occur during that battle. Each task that the FARV performs--upload, travel to the AFAS, refuel, perform tactical/survivability moves, return to logistic resupply, etc.--is modeled. Such a model reproduces operational phenomena (e.g., failures and repairs) that are likely to occur in actual performance. Simulation tasks are modeled as discrete chronological steps; after the completion of each task, decisions are programmed that determine the next path to be followed. The result is a complex logic diagram or network. The network simulation model is developed within a hierarchy of vehicle systems, subsystems, and equipment and includes failure management subnetworks. RAM information and other performance measures that have an impact on design requirements are collected. Design changes are evaluated through ``what if`` questions, sensitivity studies, and battle scenario changes.

  5. Simulation data for an estimation of the maximum theoretical value and confidence interval for the correlation coefficient.

    Science.gov (United States)

    Rocco, Paolo; Cilurzo, Francesco; Minghetti, Paola; Vistoli, Giulio; Pedretti, Alessandro

    2017-10-01

    The data presented in this article are related to the article titled "Molecular Dynamics as a tool for in silico screening of skin permeability" (Rocco et al., 2017) [1]. Knowledge of the confidence interval and maximum theoretical value of the correlation coefficient r can prove useful to estimate the reliability of developed predictive models, in particular when there is great variability in compiled experimental datasets. In this Data in Brief article, data from purposely designed numerical simulations are presented to show how much the maximum r value is worsened by increasing the data uncertainty. The corresponding confidence interval of r is determined by using the Fisher r→Z transform.
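
The Fisher r → Z transform mentioned above maps r to a quantity that is approximately normal with standard error 1/√(n-3), which yields a confidence interval after back-transformation. A generic sketch (not the authors' code), assuming bivariate normality:

```python
import math

def r_confidence_interval(r, n, z_crit=1.96):
    """Approximate 95% confidence interval for a correlation coefficient
    via the Fisher r -> Z transform (assumes bivariate normality)."""
    z = math.atanh(r)                      # Fisher transform Z = arctanh(r)
    se = 1.0 / math.sqrt(n - 3)            # standard error of Z
    lo_z, hi_z = z - z_crit * se, z + z_crit * se
    return math.tanh(lo_z), math.tanh(hi_z)  # back-transform to the r scale

lo, hi = r_confidence_interval(0.8, n=50)
```

The asymmetry of the resulting interval around r illustrates why the raw r scale is unsuitable for building the interval directly.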

  6. A Bayesian model of context-sensitive value attribution.

    Science.gov (United States)

    Rigoli, Francesco; Friston, Karl J; Martinelli, Cristina; Selaković, Mirjana; Shergill, Sukhwinder S; Dolan, Raymond J

    2016-06-22

    Substantial evidence indicates that incentive value depends on an anticipation of rewards within a given context. However, the computations underlying this context sensitivity remain unknown. To address this question, we introduce a normative (Bayesian) account of how rewards map to incentive values. This assumes that the brain inverts a model of how rewards are generated. Key features of our account include (i) an influence of prior beliefs about the context in which rewards are delivered (weighted by their reliability in a Bayes-optimal fashion), (ii) the notion that incentive values correspond to precision-weighted prediction errors, and (iii) contextual information unfolding at different hierarchical levels. This formulation implies that incentive value is intrinsically context-dependent. We provide empirical support for this model by showing that incentive value is influenced by context variability and by hierarchically nested contexts. The perspective we introduce generates new empirical predictions that might help explain psychopathologies, such as addiction.
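
A minimal sketch of a precision-weighted prediction error, the quantity the account identifies with incentive value. The function and parameter names are illustrative assumptions, not the paper's implementation:

```python
def incentive_value(reward, prior_mean, prior_precision, likelihood_precision):
    """Sketch: a reward is evaluated against a prior belief about the
    context; the prediction error is weighted by the relative reliability
    (precision) of the observation versus the prior."""
    posterior_precision = prior_precision + likelihood_precision
    weight = likelihood_precision / posterior_precision
    prediction_error = reward - prior_mean
    return weight * prediction_error

# A reward of 10 in a context expected to pay ~4: the more reliable the
# observation relative to the prior, the larger the incentive value.
v = incentive_value(10.0, prior_mean=4.0, prior_precision=1.0,
                    likelihood_precision=3.0)
```

The same reward thus carries different incentive value in different contexts, which is the context sensitivity the abstract describes.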

  7. Pricing for Catastrophe Bonds Based on Expected-value Model

    Directory of Open Access Journals (Sweden)

    Junfei Chen

    2013-02-01

    Full Text Available As catastrophes cannot be avoided and result in huge economic losses, compensation for catastrophe losses has become an important research topic. Catastrophe bonds can effectively disperse catastrophe risks, which are currently borne mainly by the government and insurance companies, and can concentrate capital more effectively in the broad capital market, making them an ideal catastrophe securities product. This study adopts expectancy theory to supplement and improve the pricing of catastrophe bonds based on value theory. A model of expected utility is established to determine the conditions for the expected revenue R of catastrophe bonds. The pricing model of the value function is used to obtain the psychological value of R, U(R-R‾), for catastrophe bonds. Finally, the psychological value is improved by the value according to expected utility, which allows catastrophe bonds to be evaluated more accurately at a reasonable price. This research can support decision-making for the pricing of catastrophe bonds.

  8. Testing turbulent closure models with convection simulations

    CERN Document Server

    Snellman, J E; Mantere, M J; Rheinhardt, M; Dintrans, B

    2012-01-01

    Aims: To compare simple analytical closure models of turbulent Boussinesq convection for stellar applications with direct three-dimensional simulations both in homogeneous and inhomogeneous (bounded) setups. Methods: We use simple analytical closure models to compute the fluxes of angular momentum and heat as a function of rotation rate measured by the Taylor number. We also investigate cases with varying angles between the angular velocity and gravity vectors, corresponding to locating the computational domain at different latitudes ranging from the pole to the equator of the star. We perform three-dimensional numerical simulations in the same parameter regimes for comparison. The free parameters appearing in the closure models are calibrated by two fit methods using simulation data. Unique determination of the closure parameters is possible only in the non-rotating case and when the system is placed at the pole. In the other cases the fit procedures yield somewhat differing results. The quality of the closu...

  9. Analyzing Strategic Business Rules through Simulation Modeling

    Science.gov (United States)

    Orta, Elena; Ruiz, Mercedes; Toro, Miguel

    Service Oriented Architecture (SOA) holds promise for business agility since it allows business processes to change to meet new customer demands or market needs without causing a cascade effect of changes in the underlying IT systems. Business rules are the instrument chosen to help business and IT collaborate. In this paper, we propose the use of simulation models to model and simulate strategic business rules that are then disaggregated at different levels of an SOA architecture. Our proposal aims to help find a good configuration for strategic business objectives and IT parameters. The paper includes a case study where a simulation model is built to support business decision-making in a context where finding a good configuration for different business parameters and performance is too complex to analyze by trial and error.

  10. [Healthcare value chain: a model for the Brazilian healthcare system].

    Science.gov (United States)

    Pedroso, Marcelo Caldeira; Malik, Ana Maria

    2012-10-01

    This article presents a model of the healthcare value chain which consists of a schematic representation of the Brazilian healthcare system. The proposed model is adapted for the Brazilian reality and has the scope and flexibility for use in academic activities and analysis of the healthcare sector in Brazil. It places emphasis on three components: the main activities of the value chain, grouped in vertical and horizontal links; the mission of each link and the main value chain flows. The proposed model consists of six vertical and three horizontal links, amounting to nine. These are: knowledge development; supply of products and technologies; healthcare services; financial intermediation; healthcare financing; healthcare consumption; regulation; distribution of healthcare products; and complementary and support services. Four flows can be used to analyze the value chain: knowledge and innovation; products and services; financial; and information.

  11. Integer Valued Autoregressive Models for Tipping Bucket Rainfall Measurements

    DEFF Research Database (Denmark)

    Thyregod, Peter; Carstensen, Niels Jacob; Madsen, Henrik

    1999-01-01

    A new method for modelling the dynamics of rain sampled by a tipping bucket rain gauge is proposed. The considered models belong to the class of integer-valued autoregressive processes. The models take the autocorrelation and discrete nature of the data into account. A first-order, a second-order, and a threshold model are presented, together with methods to estimate the parameters of each model. The models are demonstrated to provide a good description of data from actual rain events, requiring only two to four parameters.
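
The first-order member of this class, INAR(1), replaces the usual scalar multiplication of an AR(1) with binomial thinning so the process stays integer-valued, which suits count data such as bucket tips. A sketch with Poisson innovations (parameter values are illustrative):

```python
import numpy as np

def simulate_inar1(alpha, lam, n, seed=0):
    """Simulate an INAR(1) process X[t] = alpha o X[t-1] + eps[t], where
    'o' denotes binomial thinning (each of the X[t-1] units survives with
    probability alpha) and eps[t] ~ Poisson(lam)."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n, dtype=int)
    x[0] = rng.poisson(lam)
    for t in range(1, n):
        survivors = rng.binomial(x[t - 1], alpha)   # binomial thinning
        x[t] = survivors + rng.poisson(lam)          # new arrivals
    return x

x = simulate_inar1(alpha=0.5, lam=2.0, n=20000)
```

With Poisson innovations the stationary mean is lam/(1-alpha), here 4, which a long simulated path should approach.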

  12. Modeling and simulation with operator scaling

    CERN Document Server

    Cohen, Serge; Rosinski, Jan

    2009-01-01

    Self-similar processes are useful in modeling diverse phenomena that exhibit scaling properties. Operator scaling allows a different scale factor in each coordinate. This paper develops practical methods for modeling and simulating stochastic processes with operator scaling. A simulation method for operator stable Levy processes is developed, based on a series representation, along with a Gaussian approximation of the small jumps. Several examples are given to illustrate practical applications. A classification of operator stable Levy processes in two dimensions is provided according to their exponents and symmetry groups. We conclude with some remarks and extensions to general operator self-similar processes.

  13. Hemispherical sky simulator for daylighting model studies

    Energy Technology Data Exchange (ETDEWEB)

    Selkowitz, S.

    1981-07-01

    The design of a 24-foot-diameter hemispherical sky simulator recently completed at LBL is described. The goal was to produce a facility in which large models could be tested; which was suitable for research, teaching, and design; which could provide a uniform sky, an overcast sky, and several clear-sky luminance distributions, as well as accommodating an artificial sun. Initial operating experience with the facility is described, the sky simulator capabilities are reviewed, and its strengths and weaknesses relative to outdoor modeling tests are discussed.

  14. Wind Shear Target Echo Modeling and Simulation

    Directory of Open Access Journals (Sweden)

    Xiaoyang Liu

    2015-01-01

    Full Text Available Wind shear is a dangerous atmospheric phenomenon in aviation, defined as a sudden change in the speed or direction of the wind. In order to analyze the influence of wind shear on the efficiency of an airplane, this paper proposes a mathematical model of the point-target rain echo and the weather-target signal echo based on the Doppler effect. A wind field model is developed in this paper, and the antenna model is also studied using a Bessel function. The spectrum distribution of symmetric and asymmetric wind fields is investigated using the proposed mathematical model. The simulation results are in accordance with the radial velocity component and also confirm the correctness of the established antenna model.
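
The Doppler relation underlying such echo models can be stated in one line: for a monostatic radar the frequency shift is twice the radial velocity divided by the wavelength. A generic illustration (names are assumptions, not the paper's code):

```python
def doppler_shift(radial_velocity, wavelength):
    """Doppler frequency shift (Hz) of a radar return from a target moving
    at radial_velocity (m/s), for a monostatic radar of given wavelength (m):
    f_d = 2 * v_r / lambda."""
    return 2.0 * radial_velocity / wavelength

# A 15 m/s radial wind component seen by a 10 cm (S-band-like) radar.
fd = doppler_shift(15.0, 0.1)
```

The echo spectrum of a wind field is then the distribution of such shifts over the radial velocity components inside the radar resolution volume.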

  15. Effects of model schematisation, geometry and parameter values on urban flood modelling.

    Science.gov (United States)

    Vojinovic, Z; Seyoum, S D; Mwalwaka, J M; Price, R K

    2011-01-01

    One-dimensional (1D) hydrodynamic models have been used as a standard industry practice for urban flood modelling work for many years. More recently, however, model formulations have included a 1D representation of the main channels and a 2D representation of the floodplains. Since the physical process of describing exchanges of flows with the floodplains can be represented in different ways, the predictive capability of different modelling approaches can also vary. The present paper explores effects of some of the issues that concern urban flood modelling work. Impacts from applying different model schematisation, geometry and parameter values were investigated. The study has mainly focussed on exploring how different Digital Terrain Model (DTM) resolution, presence of different features on DTM such as roads and building structures and different friction coefficients affect the simulation results. Practical implications of these issues are analysed and illustrated in a case study from St Maarten, N.A. The results from this study aim to provide users of numerical models with information that can be used in the analyses of flooding processes in urban areas.

  16. Nuclear reactor core modelling in multifunctional simulators

    Energy Technology Data Exchange (ETDEWEB)

    Puska, E.K. [VTT Energy, Nuclear Energy, Espoo (Finland)

    1999-06-01

    The thesis concentrates on the development of nuclear reactor core models for the APROS multifunctional simulation environment and the use of the core models in various kinds of applications. The work was started in 1986 as a part of the development of the entire APROS simulation system. The aim was to create core models that would serve in a reliable manner in an interactive, modular and multifunctional simulator/plant analyser environment. One-dimensional and three-dimensional core neutronics models have been developed. Both models have two energy groups and six delayed neutron groups. The three-dimensional finite difference type core model is able to describe both BWR- and PWR-type cores with quadratic fuel assemblies and VVER-type cores with hexagonal fuel assemblies. The one- and three-dimensional core neutronics models can be connected with the homogeneous, the five-equation or the six-equation thermal hydraulic models of APROS. The key feature of APROS is that the same physical models can be used in various applications. The nuclear reactor core models of APROS have been built in such a manner that the same models can be used in simulator and plant analyser applications, as well as in safety analysis. In the APROS environment the user can select the number of flow channels in the three-dimensional reactor core and either the homogeneous, the five- or the six-equation thermal hydraulic model for these channels. The thermal hydraulic model and the number of flow channels have a decisive effect on the calculation time of the three-dimensional core model and thus, at present, these particular selections make the major difference between a safety analysis core model and a training simulator core model. The emphasis of this thesis is on the three-dimensional core model and its capability to analyse symmetric and asymmetric events in the core. 
The factors affecting the calculation times of various three-dimensional BWR, PWR and WWER-type APROS core models have been

  17. Creating Value Through the Freemium Business Model: A Consumer Perspective

    NARCIS (Netherlands)

    J. Rietveld (Joost)

    2016-01-01

    textabstractThis paper develops a consumer-centric framework for creating value through the freemium business model. Goods that are commercialized through the freemium business model offer basic functionality for free and monetize users for extended use or complementary features. Compared to premium

  18. International Business Models Developed Through Brokerage Knowledge and Value Creation

    DEFF Research Database (Denmark)

    Petersen, Nicolaj Hannesbo; Rasmussen, Erik Stavnsager

    This paper highlights, theoretically and empirically, international business model decisions in networks with knowledge sharing and value creation. The paper expands the conceptual international business model framework for technology-oriented companies to include the focal firm's network role...

  19. Theory, modeling and simulation of superconducting qubits

    Energy Technology Data Exchange (ETDEWEB)

    Berman, Gennady P [Los Alamos National Laboratory; Kamenev, Dmitry I [Los Alamos National Laboratory; Chumak, Alexander [INSTIT OF PHYSICS, KIEV; Kinion, Carin [LLNL; Tsifrinovich, Vladimir [POLYTECHNIC INSTIT OF NYU

  20. Theory, modeling and simulation of superconducting qubits

    Energy Technology Data Exchange (ETDEWEB)

    Berman, Gennady P [Los Alamos National Laboratory; Kamenev, Dmitry I [Los Alamos National Laboratory; Chumak, Alexander [INSTIT OF PHYSICS, KIEV; Kinion, Carin [LLNL; Tsifrinovich, Vladimir [POLYTECHNIC INSTIT OF NYU

    2011-01-13

    We analyze the dynamics of a qubit-resonator system coupled with a thermal bath and external electromagnetic fields. Using the evolution equations for the set of Heisenberg operators that describe the whole system, we derive an expression for the resonator field that includes the resonator-drive, the resonator-bath, and resonator-qubit interactions. The renormalization of the resonator frequency, caused by the qubit-resonator interaction, is accounted for. Using the solutions for the resonator field, we derive the equation that describes the qubit dynamics. The dependence of the qubit evolution during the measurement time on the fidelity of a single-shot measurement is studied. The relation between the fidelity and measurement time is shown explicitly. We propose a novel adiabatic method for the phase qubit measurement. The method utilizes a low-frequency, quasi-classical resonator inductively coupled to the qubit. The resonator modulates the qubit energy, and the back reaction of the qubit causes a shift in the phase of the resonator. The resonator phase shift can be used to determine the qubit state. We have simulated this measurement taking into account the energy levels outside the phase qubit manifold. We have shown that, for qubit frequencies in the range of 8-12 GHz, a resonator frequency of 500 MHz and a measurement time of 100 ns, the phase difference between the two qubit states is greater than 0.2 rad. This phase difference exceeds the measurement uncertainty, and can be detected using a classical phase-meter. A fidelity of 0.9999 can be achieved for a relaxation time of 0.5 ms. We also model and simulate a microstrip-SQUID amplifier with a frequency of about 500 MHz, which could be used to amplify the resonator oscillations in the phase qubit adiabatic measurement. The voltage gain and the amplifier noise temperature are calculated. We simulate the preparation of a generalized Bell state and compute the relaxation times required for achieving high

  1. Battery thermal models for hybrid vehicle simulations

    Science.gov (United States)

    Pesaran, Ahmad A.

    This paper summarizes battery thermal modeling capabilities for: (1) an advanced vehicle simulator (ADVISOR); and (2) battery module and pack thermal design. The National Renewable Energy Laboratory's (NREL's) ADVISOR is developed in the Matlab/Simulink environment. There are several battery models in ADVISOR for various chemistry types. Each one of these models requires a thermal model to predict the temperature change that could affect battery performance parameters, such as resistance, capacity and state of charge. A lumped capacitance battery thermal model in the Matlab/Simulink environment was developed and integrated with the ADVISOR battery performance models. For thermal evaluation and design of battery modules and packs, NREL has been using various computer-aided engineering tools including commercial finite element analysis software. This paper will discuss the thermal ADVISOR battery model and its results, along with the results of finite element modeling that were presented at the workshop on "Development of Advanced Battery Engineering Models" in August 2001.
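
The lumped-capacitance approach mentioned above treats the whole pack as a single thermal mass exchanging heat with ambient air. The sketch below is a minimal stand-in for the idea, not NREL's ADVISOR code; the mass, heat capacity, heat-transfer coefficient and loss power are all illustrative assumptions:

```python
def simulate_battery_temp(p_loss_w, t_amb_c, mass_kg=10.0, cp_j_per_kg_k=900.0,
                          h_a_w_per_k=5.0, t0_c=25.0, dt_s=1.0, steps=3600):
    """Lumped-capacitance model: m*cp*dT/dt = P_loss - h*A*(T - T_amb),
    integrated with explicit Euler (dt is far below the thermal time constant)."""
    temps = [t0_c]
    t = t0_c
    for _ in range(steps):
        t += (p_loss_w - h_a_w_per_k * (t - t_amb_c)) / (mass_kg * cp_j_per_kg_k) * dt_s
        temps.append(t)
    return temps

temps = simulate_battery_temp(p_loss_w=50.0, t_amb_c=20.0)
```

With these assumed values the thermal time constant is m·cp/(hA) = 1800 s, so after one hour the pack is close to its steady-state temperature T_amb + P/(hA) = 30 °C.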

  2. When experts are oceans apart: comparing expert performance values for proficiency-based laparoscopic simulator training

    NARCIS (Netherlands)

    Luursema, J.M.; Rovers, M.M.; Alken, A.P.; Kengen, B.; Goor, H. van

    2015-01-01

    BACKGROUND: Surgical training is moving away from the operating room toward simulation-based skills training facilities. This has led to the development of proficiency-based training courses in which expert performance data are used for feedback and assessment. However, few expert value data sets

  3. The Value Simulation-Based Learning Added to Machining Technology in Singapore

    Science.gov (United States)

    Fang, Linda; Tan, Hock Soon; Thwin, Mya Mya; Tan, Kim Cheng; Koh, Caroline

    2011-01-01

    This study seeks to understand the value that simulation-based learning (SBL) added to the learning of Machining Technology in a 15-week core subject course offered to university students. The research questions were: (1) How did SBL enhance classroom learning? (2) How did SBL help participants in their test? (3) How did SBL prepare participants for…

  4. Comparison of Geant4-DNA simulation of S-values with other Monte Carlo codes

    Energy Technology Data Exchange (ETDEWEB)

    André, T. [Université Bordeaux 1, CENBG, UMR 5797, F-33170 Gradignan (France); CNRS, IN2P3, CENBG, UMR 5797, F-33170 Gradignan (France); Morini, F. [Research Group of Theoretical Chemistry and Molecular Modelling, Hasselt University, Agoralaan Gebouw D, B-3590 Diepenbeek (Belgium); Karamitros, M. [Université Bordeaux 1, CENBG, UMR 5797, F-33170 Gradignan (France); CNRS, IN2P3, CENBG, UMR 5797, F-33170 Gradignan (France); CNRS, INCIA, UMR 5287, F-33400 Talence (France); Delorme, R. [LPSC, Université Joseph Fourier Grenoble 1, CNRS/IN2P3, Grenoble INP, 38026 Grenoble (France); CEA, LIST, F-91191 Gif-sur-Yvette (France); Le Loirec, C. [CEA, LIST, F-91191 Gif-sur-Yvette (France); Campos, L. [Departamento de Física, Universidade Federal de Sergipe, São Cristóvão (Brazil); Champion, C. [Université Bordeaux 1, CENBG, UMR 5797, F-33170 Gradignan (France); CNRS, IN2P3, CENBG, UMR 5797, F-33170 Gradignan (France); Groetz, J.-E.; Fromm, M. [Université de Franche-Comté, Laboratoire Chrono-Environnement, UMR CNRS 6249, Besançon (France); Bordage, M.-C. [Laboratoire Plasmas et Conversion d’Énergie, UMR 5213 CNRS-INPT-UPS, Université Paul Sabatier, Toulouse (France); Perrot, Y. [Laboratoire de Physique Corpusculaire, UMR 6533, Aubière (France); Barberet, Ph. [Université Bordeaux 1, CENBG, UMR 5797, F-33170 Gradignan (France); CNRS, IN2P3, CENBG, UMR 5797, F-33170 Gradignan (France); and others

    2014-01-15

    Monte Carlo simulations of S-values have been carried out with the Geant4-DNA extension of the Geant4 toolkit. The S-values have been simulated for monoenergetic electrons with energies ranging from 0.1 keV up to 20 keV, in liquid water spheres (for four radii, chosen between 10 nm and 1 μm), and for electrons emitted by five isotopes of iodine (131, 132, 133, 134 and 135), in liquid water spheres of varying radius (from 15 μm up to 250 μm). The results have been compared to those obtained from other Monte Carlo codes and from other published data. A Kolmogorov–Smirnov test confirmed the statistical compatibility of all simulation results.
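
The statistical-compatibility check mentioned above rests on the two-sample Kolmogorov–Smirnov statistic: the maximum vertical distance between two empirical CDFs. A minimal pure-Python sketch of that statistic (not the code used in the paper), together with the standard asymptotic critical value:

```python
import math

def ks_statistic(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum vertical distance
    between the empirical CDFs of samples a and b (tied values advanced jointly)."""
    a, b = sorted(a), sorted(b)
    na, nb = len(a), len(b)
    i = j = 0
    d = 0.0
    while i < na and j < nb:
        x = min(a[i], b[j])
        while i < na and a[i] == x:
            i += 1
        while j < nb and b[j] == x:
            j += 1
        d = max(d, abs(i / na - j / nb))
    return d

def ks_critical(na, nb, c_alpha=1.358):
    """Asymptotic two-sample critical value; c_alpha = 1.358 for alpha = 0.05."""
    return c_alpha * math.sqrt((na + nb) / (na * nb))
```

Two result sets are deemed statistically compatible when `ks_statistic(a, b)` falls below `ks_critical(len(a), len(b))`.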

  5. Kanban simulation model for production process optimization

    Directory of Open Access Journals (Sweden)

    Golchev Riste

    2015-01-01

    Full Text Available A long time has passed since the KANBAN system was established as an efficient method for coping with excessive inventory. Still, the possibilities for its improvement through integration with other approaches should be investigated further. The basic research challenge of this paper is to present the benefits of KANBAN implementation supported with Discrete Event Simulation (DES). In that direction, the basics of the KANBAN system are first presented, with emphasis on the information and material flow, together with a methodology for implementation of the KANBAN system. An analysis of combining simulation with this methodology is then presented. The paper concludes with a practical example which shows that, by understanding the philosophy of the KANBAN implementation methodology and the simulation methodology, a simulation model can be created which serves as a basis for a variety of experiments that can be conducted within a short period of time, resulting in production process optimization.
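
A discrete-event simulation of a kanban loop of the kind the paper advocates fits in a few dozen lines. The model below is a deliberately simplified single-stage loop with Poisson demand and exponential service; it illustrates the DES idea only and is not the authors' model, and all rates and the kanban count are invented:

```python
import heapq
import random

def simulate_kanban(n_kanbans, demand_rate, service_rate, horizon):
    """Single-stage kanban loop: customer demands pull finished parts; each
    withdrawal detaches a kanban card, which authorizes producing a replacement."""
    random.seed(1)
    finished = n_kanbans        # finished-goods buffer starts full
    free_kanbans = 0            # detached cards = outstanding production orders
    backorders = 0
    served = 0
    busy = False
    events = [(random.expovariate(demand_rate), 'demand')]
    while events:
        t, kind = heapq.heappop(events)
        if t > horizon:
            break
        if kind == 'demand':
            if finished > 0:
                finished -= 1
                free_kanbans += 1
                served += 1
            else:
                backorders += 1
            heapq.heappush(events, (t + random.expovariate(demand_rate), 'demand'))
        else:                   # a production run completes
            busy = False
            if backorders > 0:  # part goes straight to a waiting customer
                backorders -= 1
                served += 1
                free_kanbans += 1   # its card is detached again immediately
            else:
                finished += 1
        if free_kanbans > 0 and not busy:
            free_kanbans -= 1
            busy = True
            heapq.heappush(events, (t + random.expovariate(service_rate), 'done'))
    return served, finished, backorders
```

Running `simulate_kanban(5, 1.0, 2.0, 1000.0)` lets one experiment with the kanban count against fill rate, exactly the kind of short-turnaround experiment the abstract describes.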

  6. Simulation of DME synthesis from coal syngas by kinetics model

    Energy Technology Data Exchange (ETDEWEB)

    Shim, H.M.; Lee, S.J.; Yoo, Y.D.; Yun, Y.S.; Kim, H.T. [Ajou University, Suwon (Republic of Korea)

    2009-05-15

    DME (Dimethyl Ether) has emerged as a clean alternative fuel for diesel. In this study, a simulation model is developed with the kinetics model of the ASPEN Plus simulator to investigate the operating characteristics of direct DME synthesis. The overall DME synthesis process is based on experimental data from a 3 ton/day (TPD) coal gasification pilot plant located at IAE in Korea. The supply condition of the DME synthesis model is set equivalently to 80 N/m{sup 3} of syngas derived from a coal gasification plant. In the simulation it is assumed that the overall DME synthesis process proceeds at steady state as a vapor-solid reaction over the DME catalyst. The physical properties of the reactants are governed by the Soave-Redlich-Kwong (SRK) EOS in this model. The DME synthesis reaction is modeled with the LHHW (Langmuir-Hinshelwood Hougen Watson) equation as an adsorption-desorption model on the surface of the DME catalyst. After adjusting the kinetics of the DME synthesis reaction to the experimental data, the kinetics of the governing reactions inside the DME reactor are modified and coupled with the entire DME synthesis reaction. To validate the DME synthesis model, the simulation results are compared with experimental results: conversion ratio, DME yield and DME production rate. A sensitivity analysis is then performed with the modified model on the effects of operating variables such as pressure, reactor temperature, catalyst void fraction and the H{sub 2}/CO ratio of the supplied syngas. According to the simulation results, the optimum operating conditions of the DME reactor lie in the range of 265-275{sup o}C and 60 kg/cm{sup 2}, and the DME production rate reaches its maximum at an H{sub 2}/CO ratio of 1-1.5 in the syngas composition.
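
The LHHW form named above expresses a surface-reaction rate as a kinetic driving term divided by an adsorption (site-coverage) term. The sketch below is the generic two-reactant LHHW template, not the fitted DME kinetics from this study; the rate constant and adsorption equilibrium constants are placeholders:

```python
def lhhw_rate(k, K_a, K_b, p_a, p_b):
    """Generic LHHW rate for A + B -> products with both reactants adsorbed:
    the surface reaction of adsorbed A and B is rate-determining, giving a
    driving term divided by the squared competitive-adsorption term."""
    theta = 1.0 + K_a * p_a + K_b * p_b    # total coverage denominator
    return k * (K_a * p_a) * (K_b * p_b) / theta ** 2

r_mid = lhhw_rate(1.0, 1.0, 1.0, 1.0, 1.0)
r_high = lhhw_rate(1.0, 1.0, 1.0, 1e6, 1.0)   # one reactant dominates the sites
```

The squared denominator produces the characteristic LHHW behaviour in which the rate first rises and then falls as one reactant's partial pressure grows, consistent with the maximum the study reports versus the H{sub 2}/CO ratio.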

  7. The construction of the milling process simulation models

    Science.gov (United States)

    Ślusarczyk, Ł.

    2016-09-01

    The paper presents the possibilities of using computer-based techniques in machine cutting processes, mainly analytical and numerical modeling of the milling process for the austenitic high-alloy chromium-nickel steel X 5 CrNi 18-10, together with experimental verification of the results. The study focused on measuring and assessing the deformations of a given sample under a specific load. The simulations were executed in modern computer simulation software supporting such analyses: NX by Siemens and Simulia Abaqus. The selection of parameters was based on real values measured during the milling process.

  8. The Unfolding of Value Sources During Online Business Model Transformation

    Directory of Open Access Journals (Sweden)

    Nadja Hoßbach

    2016-12-01

    Full Text Available Purpose: In the magazine publishing industry, viable online business models are still rare to absent. To prepare for the ‘digital future’ and safeguard their long-term survival, many publishers are currently in the process of transforming their online business model. Against this backdrop, this study aims to develop a deeper understanding of (1) how the different building blocks of an online business model are transformed over time and (2) how sources of value creation unfold during this transformation process. Methodology: To answer our research question, we conducted a longitudinal case study with a leading German business magazine publisher (called BIZ). Data was triangulated from multiple sources including interviews, internal documents, and direct observations. Findings: Based on our case study, we find that BIZ used the transformation process to differentiate its online business model from its traditional print business model along several dimensions, and that BIZ’s online business model changed from an efficiency- to a complementarity- to a novelty-based model during this process. Research implications: Our findings suggest that different business model transformation phases relate to different value sources, questioning the appropriateness of value source-based approaches for classifying business models. Practical implications: The results of our case study highlight the need for online-offline business model differentiation and point to the important distinction between service and product differentiation. Originality: Our study contributes to the business model literature by applying a dynamic and holistic perspective on the link between online business model changes and unfolding value sources.

  9. The Yellow Brick Road: a values based curriculum model.

    Science.gov (United States)

    McLean, Christopher

    2012-05-01

    Within the United Kingdom, the Nursing and Midwifery Council (NMC) requires that nurses and midwives are of 'good character' at the point of registration. This paper sets out how good character has been conceptualised within one U.K. higher education institution and presents a model of "values based enquiry" which aims to develop the 'character' of students. The paper presents three qualities ("the heart", "the nerve" and "the brain") which represent 'good character' and which are believed to underpin values based Nursing or Midwifery practice. The development of these qualities is argued to be reliant upon helping students to develop intrinsic professional values of care and compassion. The role of these character qualities in nursing practice and education is outlined, as are the ways in which they have led to the development of a model for values based enquiry. This model represents a vision of the nature of professional education which may be shared by staff and students, whilst offering a model for learning and teaching based upon recognised educational principles. An argument is advanced that the adoption of a values based enquiry model may develop and nurture the habits of mind which are necessary for the development of 'good character'. Copyright © 2011 Elsevier Ltd. All rights reserved.

  10. An Evaluation of Value at Risk Models in Chinese Stock Market

    OpenAIRE

    Xiao, Ying

    2013-01-01

    The aim of this article is to examine the predictive performance of VaR model in Chinese stock market and try to find the rational choice of models for China. In order to achieve this goal, Historical simulation approach, Bootstrapped HS, Hull White method, parametric approach with volatility adjustment, Generalized extreme value theory and Peaks-over-threshold approach are applied to the Shanghai Stock Exchange Composite Index (SSECI) and Shenzhen Stock Exchange Composite Index (SZSECI) to e...

  11. Reference values and physiological characterization of a specific isolated pig kidney perfusion model

    OpenAIRE

    Meissler Michael; Fischer Axel; Fehrenberg Claudia; Grosse-Siestrup Christian; Unger Volker; Groneberg David A

    2007-01-01

    Abstract Background Models of isolated and perfused kidneys are used to study the effects of drugs, hazardous or toxic substances on renal functions. Since physiological and morphological parameters of small laboratory animal kidneys are difficult to compare to human renal parameters, porcine kidney perfusion models have been developed to simulate closer conditions to the human situation, but exact values of renal parameters for different collection and perfusion conditions have not been repo...

  12. TRANSFORM - TRANsient Simulation Framework of Reconfigurable Models

    Energy Technology Data Exchange (ETDEWEB)

    2017-09-01

    Existing development tools for early-stage design and scoping of energy systems are often time consuming to use, proprietary, and lack the functionality to model complete systems (i.e., controls, primary, and secondary systems) in a common platform. The Modelica-based TRANSFORM tool (1) provides a standardized, common simulation environment for early design of energy systems (i.e., power plants), (2) provides a library of baseline component modules to be assembled into full plant models using available geometry, design, and thermal-hydraulic data, (3) defines modeling conventions for interconnecting component models, and (4) establishes user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.

  13. Nanoindentation shape effect: experiments, simulations and modelling

    Energy Technology Data Exchange (ETDEWEB)

    Calabri, L [CNR-INFM-National Research Center on nanoStructures and bioSystems at Surfaces (S3), Via Campi 213/a, 41100 Modena (Italy); Pugno, N [Department of Structural Engineering, Politecnico di Torino, Corso Duca degli Abruzzi 24, 10129 Turin (Italy); Rota, A [CNR-INFM-National Research Center on nanoStructures and bioSystems at Surfaces (S3), Via Campi 213/a, 41100 Modena (Italy); Marchetto, D [CNR-INFM-National Research Center on nanoStructures and bioSystems at Surfaces (S3), Via Campi 213/a, 41100 Modena (Italy); Valeri, S [CNR-INFM-National Research Center on nanoStructures and bioSystems at Surfaces (S3), Via Campi 213/a, 41100 Modena (Italy)

    2007-10-03

    AFM nanoindentation is nowadays commonly used for the study of mechanical properties of materials at the nanoscale. Investigating the surface hardness of a material using AFM means that the probe has to be able not only to indent the surface but also to image it. Usually standard indenters are not sharp enough to obtain high-resolution images; on the other hand, measuring the hardness behaviour of a material with a non-standard sharp indenter gives only comparative results, affected by a significant deviation from the commonly used hardness scales. In this paper we try to understand how the shape of the indenter affects the hardness measurement, in order to find a relationship between the measured hardness of a material and the corner angle of a pyramidal indenter. To achieve this we performed a full experimental campaign, indenting the same material with three focused ion beam (FIB) nanofabricated probes with highly altered corner angles. We then compared the results obtained experimentally with those obtained by numerical simulations, using the finite element method (FEM), and by theoretical models, using a general scaling law for nanoindentation available for indenters with variable size and shape. The comparison between these three approaches (experimental, numerical and theoretical) revealed good agreement and allowed us to find a theoretical relationship which links the measured hardness value with the shape of the indenter. The same theoretical approach has also been used to fit the experimental hardness results considering the indentation size effect, in which case we compared the measured data while changing the applied load.

  14. Model Checking Real-Time Value-Passing Systems

    Institute of Scientific and Technical Information of China (English)

    Jing Chen; Zio-Ning Cao

    2004-01-01

    In this paper, to model check real-time value-passing systems, a formal language, Timed Symbolic Transition Graph, and a logic system named Timed Predicate μ-Calculus are proposed. An algorithm is presented which is local in that it generates and investigates the reachable state space in top-down fashion and keeps the partition of time evaluations as coarse as possible while instantiating data variables on the fly. It can deal not only with data variables with finite value domains, but also with the so-called data-independent variables with infinite value domains. To the authors' knowledge, this is the first algorithm for model checking timed systems containing value-passing features.

  15. EXACT SIMULATION OF A BOOLEAN MODEL

    Directory of Open Access Journals (Sweden)

    Christian Lantuéjoul

    2013-06-01

    Full Text Available A Boolean model is a union of independent objects (compact random subsets) located at Poisson points. Two algorithms are proposed for simulating a Boolean model in a bounded domain. The first one applies only to stationary models. It generates the objects prior to their Poisson locations. Two examples illustrate its applicability. The second algorithm applies to stationary and non-stationary models. It generates the Poisson points prior to the objects. Its practical difficulties of implementation are discussed. Both algorithms are based on importance sampling techniques, and the generated objects are weighted.
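
A basic plain Monte Carlo simulation of a stationary Boolean model of discs illustrates the two ingredients the paper's algorithms manipulate: a Poisson number of germs and independent grains. This sketch is not either of the paper's weighted (importance-sampling) algorithms; the uniform radius law and all parameters are illustrative, and edge effects are handled by the usual dilation of the observation window:

```python
import random

def boolean_model_discs(lam, r_max, window=(0.0, 1.0, 0.0, 1.0), seed=0):
    """Simulate a Boolean model of random discs: Poisson germs of intensity lam
    in the window dilated by r_max (so discs centred just outside the window
    are not missed), each carrying an independent random radius."""
    rng = random.Random(seed)
    x0, x1, y0, y1 = window
    gx0, gx1 = x0 - r_max, x1 + r_max
    gy0, gy1 = y0 - r_max, y1 + r_max
    mean = lam * (gx1 - gx0) * (gy1 - gy0)
    # Poisson count via unit-rate exponential gaps: N = #{k : S_k < mean}
    n, acc = 0, rng.expovariate(1.0)
    while acc < mean:
        n += 1
        acc += rng.expovariate(1.0)
    return [(rng.uniform(gx0, gx1), rng.uniform(gy0, gy1),
             rng.uniform(0.0, r_max))            # assumed radius law
            for _ in range(n)]

def covered(point, discs):
    """Is the point inside the union of the discs?"""
    px, py = point
    return any((px - cx) ** 2 + (py - cy) ** 2 <= r * r for cx, cy, r in discs)
```

Querying `covered` over a grid of points estimates the volume fraction of the model, a standard first summary statistic.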

  16. Biological transportation networks: Modeling and simulation

    KAUST Repository

    Albi, Giacomo

    2015-09-15

    We present a model for biological network formation originally introduced by Cai and Hu [Adaptation and optimization of biological transport networks, Phys. Rev. Lett. 111 (2013) 138701]. The modeling of fluid transportation (e.g., leaf venation and angiogenesis) and ion transportation networks (e.g., neural networks) is explained in detail and basic analytical features like the gradient flow structure of the fluid transportation network model and the impact of the model parameters on the geometry and topology of network formation are analyzed. We also present a numerical finite-element based discretization scheme and discuss sample cases of network formation simulations.

  17. Modeling and Simulation of Nuclear Fuel Materials

    Energy Technology Data Exchange (ETDEWEB)

    Devanathan, Ramaswami; Van Brutzel, Laurent; Chartier, Alan; Gueneau, Christine; Mattsson, Ann E.; Tikare, Veena; Bartel, Timothy; Besmann, T. M.; Stan, Marius; Van Uffelen, Paul

    2010-10-01

    We review the state of modeling and simulation of nuclear fuels with emphasis on the most widely used nuclear fuel, UO2. The hierarchical scheme presented represents a science-based approach to modeling nuclear fuels by progressively passing information in several stages from ab initio to continuum levels. Such an approach is essential to overcome the challenges posed by radioactive materials handling, experimental limitations in modeling extreme conditions and accident scenarios, and the small time and distance scales of fundamental defect processes. When used in conjunction with experimental validation, this multiscale modeling scheme can provide valuable guidance to development of fuel for advanced reactors to meet rising global energy demand.

  18. Simulation modeling of health care policy.

    Science.gov (United States)

    Glied, Sherry; Tilipman, Nicholas

    2010-01-01

    Simulation modeling of health reform is a standard part of policy development and, in the United States, a required element in enacting health reform legislation. Modelers use three types of basic structures to build models of the health system: microsimulation, individual choice, and cell-based. These frameworks are filled in with data on baseline characteristics of the system and parameters describing individual behavior. Available data on baseline characteristics are imprecise, and estimates of key empirical parameters vary widely. A comparison of estimated and realized consequences of several health reform proposals suggests that models provided reasonably accurate estimates, with confidence bounds of approximately 30%.

  19. USING COPULAS TO MODEL DEPENDENCE IN SIMULATION RISK ASSESSMENT

    Energy Technology Data Exchange (ETDEWEB)

    Dana L. Kelly

    2007-11-01

    Typical engineering systems in applications with high failure consequences such as nuclear reactor plants often employ redundancy and diversity of equipment in an effort to lower the probability of failure and therefore risk. However, it has long been recognized that dependencies exist in these redundant and diverse systems. Some dependencies, such as common sources of electrical power, are typically captured in the logic structure of the risk model. Others, usually referred to as intercomponent dependencies, are treated implicitly by introducing one or more statistical parameters into the model. Such common-cause failure models have limitations in a simulation environment. In addition, substantial subjectivity is associated with parameter estimation for these models. This paper describes an approach in which system performance is simulated by drawing samples from the joint distributions of dependent variables. The approach relies on the notion of a copula distribution, a notion which has been employed by the actuarial community for ten years or more, but which has seen only limited application in technological risk assessment. The paper also illustrates how equipment failure data can be used in a Bayesian framework to estimate the parameter values in the copula model. This approach avoids much of the subjectivity required to estimate parameters in traditional common-cause failure models. Simulation examples are presented for failures in time. The open-source software package R is used to perform the simulations. The open-source software package WinBUGS is used to perform the Bayesian inference via Markov chain Monte Carlo sampling.
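
The copula idea in the abstract, drawing dependent lifetimes while leaving each marginal distribution intact, can be sketched with a bivariate Gaussian copula over exponential failure-time marginals. This illustrates only the sampling step (the Bayesian parameter estimation via WinBUGS is out of scope here), and the correlation and failure rates are invented:

```python
import math
import random

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def gaussian_copula_failure_times(rho, rate_a, rate_b, n, seed=42):
    """Draw n dependent (t_a, t_b) failure-time pairs: correlated standard
    normals -> uniforms via the normal CDF (the Gaussian copula) -> exponential
    marginals via the inverse exponential CDF."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        z1 = rng.gauss(0.0, 1.0)
        z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * rng.gauss(0.0, 1.0)
        u1, u2 = phi(z1), phi(z2)
        out.append((-math.log(1.0 - u1) / rate_a,
                    -math.log(1.0 - u2) / rate_b))
    return out
```

Each component's lifetime remains exactly exponential, while the joint draws exhibit the positive dependence one would expect of redundant components sharing a common environment.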

  20. Simulation of MILD combustion using Perfectly Stirred Reactor model

    KAUST Repository

    Chen, Z.

    2016-07-06

    A simple model based on a Perfectly Stirred Reactor (PSR) is proposed for moderate or intense low-oxygen dilution (MILD) combustion. The PSR calculation is performed covering the entire flammability range, and the tabulated chemistry approach is used with a presumed joint probability density function (PDF). The jet in a hot and diluted coflow experimental set-up under MILD conditions is simulated using this reactor model for two oxygen dilution levels. The computed results for mean temperature and major and minor species mass fractions are compared with the experimental data and with simulation results obtained recently using a multi-environment transported PDF approach. Overall, good agreement is observed at three different axial locations, despite an over-predicted peak CO value. This suggests that MILD combustion can be effectively modelled by the proposed PSR model at lower computational cost.

  1. A reservoir simulation approach for modeling of naturally fractured reservoirs

    Directory of Open Access Journals (Sweden)

    H. Mohammadi

    2012-12-01

    Full Text Available In this investigation, the Warren and Root model proposed for the simulation of naturally fractured reservoirs was improved. A reservoir simulation approach was used to develop a 2D model of a synthetic oil reservoir. The main rock properties of each gridblock were defined for two different types of gridblocks, called matrix and fracture gridblocks. These two gridblocks differed in porosity and permeability, both of which were higher for fracture gridblocks than for matrix gridblocks. The model was solved using the implicit finite difference method. Results showed an improvement over the Warren and Root model, especially in region 2 of the semilog plot of pressure drop versus time, which indicated a linear transition zone with no inflection point, as predicted by other investigators. The effects of fracture spacing, fracture permeability, fracture porosity, matrix permeability and matrix porosity on the behavior of a typical naturally fractured reservoir were also presented.
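
The implicit finite-difference discretization mentioned above reduces, in one dimension, to a tridiagonal linear system per time step, solvable with the Thomas algorithm. The sketch below solves single-phase pressure diffusion dp/dt = η d²p/dx² with fixed-pressure boundaries; it is a toy 1D homogeneous stand-in for the paper's 2D dual-porosity model, with invented units:

```python
def thomas(a, b, c, d):
    """Solve a tridiagonal system; a = sub-, b = main, c = super-diagonal,
    d = right-hand side (a[0] and c[-1] are unused)."""
    n = len(d)
    cp = [0.0] * n
    dp = [0.0] * n
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def implicit_pressure_step(p, eta, dx, dt):
    """One backward-Euler step of dp/dt = eta * d2p/dx2 with the boundary
    pressures held fixed (Dirichlet conditions)."""
    n = len(p)
    r = eta * dt / (dx * dx)
    a = [-r] * n
    b = [1.0 + 2.0 * r] * n
    c = [-r] * n
    a[0] = a[-1] = 0.0
    c[0] = c[-1] = 0.0
    b[0] = b[-1] = 1.0
    return thomas(a, b, c, list(p))
```

Because the scheme is implicit, it remains stable for any time step, which is why reservoir simulators favour it over explicit time stepping.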

  2. Modeling and simulation of epidemic spread

    DEFF Research Database (Denmark)

    Shatnawi, Maad; Lazarova-Molnar, Sanja; Zaki, Nazar

    2013-01-01

    and control such epidemics. This paper presents an overview of the epidemic spread modeling and simulation, and summarizes the main technical challenges in this field. It further investigates the most relevant recent approaches carried out towards this perspective and provides a comparison and classification...

  3. Object Oriented Modelling and Dynamical Simulation

    DEFF Research Database (Denmark)

    Wagner, Falko Jens; Poulsen, Mikael Zebbelin

    1998-01-01

    This report, with an appendix, describes the work done in a master project at DTU. The goal of the project was to develop a concept for simulation of dynamical systems based on object oriented methods. The result was a library of C++ classes, for use both when building component-based models and when...

  4. Modeling and Simulating Virtual Anatomical Humans

    NARCIS (Netherlands)

    Madehkhaksar, Forough; Luo, Zhiping; Pronost, Nicolas; Egges, Arjan

    2014-01-01

    This chapter presents human musculoskeletal modeling and simulation as a challenging field that lies between biomechanics and computer animation. One of the main goals of computer animation research is to develop algorithms and systems that produce plausible motion. On the other hand, the main chall

  5. Modeling and Simulation in Healthcare Future Directions

    Science.gov (United States)

    2010-07-13

    [Fragmentary slide text; recoverable points include: quantifying performance (competency-based); simulating before practice using digital libraries; classic education and examination versus a coming revolution; indicative costs of about $800,000/yr for a simulation facility and $250,000–$400,000/yr for actor patients; digital libraries (subscription vs. up-front costs) and synthetic tissue models as alternatives.]

  6. Simulation Versus Models: Which One and When?

    Science.gov (United States)

    Dorn, William S.

    1975-01-01

    Describes two types of computer-based experiments: simulation (which assumes no student knowledge of the workings of the computer program) is recommended for experiments aimed at inductive reasoning; and modeling (which assumes student understanding of the computer program) is recommended for deductive processes. (MLH)

  7. Love Kills: Simulations in Penna Ageing Model

    Science.gov (United States)

    Stauffer, Dietrich; Cebrat, Stanisław; Penna, T. J. P.; Sousa, A. O.

    The standard Penna ageing model with sexual reproduction is enlarged by adding additional bit-strings for love: Marriage happens only if the male love strings are sufficiently different from the female ones. We simulate at what level of required difference the population dies out.
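
The bit-string mechanics underlying the model can be illustrated with the plain asexual Penna model (the paper's sexual variant with love strings adds mate choice and genome mixing on top of this). All parameters below (mutation threshold T = 3, reproduction age R = 8, one new mutation per birth, Verhulst capacity) are conventional illustrative choices, not the paper's:

```python
import random

def penna_step(pop, mut_limit=3, repro_age=8, max_age=32,
               capacity=10000, rng=random.Random(0)):
    """One sweep of the asexual Penna bit-string model: a genome is an int
    whose bit a, if set, is a deleterious mutation expressed from age a on."""
    new = []
    n = len(pop)
    for age, genome in pop:
        if rng.random() < n / capacity:      # Verhulst death from crowding
            continue
        age += 1
        if age >= max_age:
            continue
        # count mutations already expressed at this age; die at the threshold
        if bin(genome & ((1 << age) - 1)).count('1') >= mut_limit:
            continue
        new.append((age, genome))
        if age >= repro_age:                 # one offspring per adult per step,
            child = genome | (1 << rng.randrange(max_age))   # one new mutation
            new.append((0, child))
    return new

pop = [(0, 0)] * 2000                        # start from mutation-free babies
for _ in range(100):
    pop = penna_step(pop)
```

With these parameters the population settles into a stable age structure in which mutations accumulate at late-life bit positions, the model's signature of ageing.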

  8. Inverse modeling for Large-Eddy simulation

    NARCIS (Netherlands)

    Geurts, Bernardus J.

    1998-01-01

    Approximate higher order polynomial inversion of the top-hat filter is developed with which the turbulent stress tensor in Large-Eddy Simulation can be consistently represented using the filtered field. Generalized (mixed) similarity models are proposed which improved the agreement with the kinetic

  9. Microdata Simulation Modeling After Twenty Years.

    Science.gov (United States)

    Haveman, Robert H.

    1986-01-01

    This article describes the method and the development of microdata simulation modeling over the past two decades. After tracing a brief history of this evaluation method, its problems and prospects are assessed. The effects of this research method on the development of the social sciences are examined. (JAZ)

  10. Simulation Modeling on the Macintosh using STELLA.

    Science.gov (United States)

    Costanza, Robert

    1987-01-01

    Describes a new software package for the Apple Macintosh computer which can be used to create elaborate simulation models in a fraction of the time usually required without using a programming language. Illustrates the use of the software which relates to water usage. (TW)

  11. Simulation Modeling of Radio Direction Finding Results

    Directory of Open Access Journals (Sweden)

    K. Pelikan

    1994-12-01

    Full Text Available It is sometimes difficult to determine analytically the error probabilities of direction finding results when evaluating algorithms of practical interest. Probabilistic simulation models are described in this paper that can be used to study the error performance of new direction finding systems or of geographical modifications to existing configurations.

  12. A Prison/Parole System Simulation Model,

    Science.gov (United States)

    The effects of changes in a prison/parole system on future prison and parole populations are examined. A simulation model is presented, viewing a prison/parole system as a feedback process for criminal offenders. Transitions among the states in which an offender might be located (imprisoned, paroled, and discharged) are assumed to be in accordance with a discrete-time semi-Markov process. Projected prison and parole populations for sample data and applications of the model are discussed. (Author)
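    A minimal sketch of this kind of population projection, simplified from the paper's semi-Markov process to an ordinary discrete-time Markov chain; the transition probabilities and initial populations below are invented for illustration:

    ```python
    import numpy as np

    # Illustrative (made-up) one-step transition probabilities between the
    # three states mentioned in the abstract: imprisoned, paroled, discharged.
    P = np.array([
        [0.80, 0.15, 0.05],   # imprisoned -> (imprisoned, paroled, discharged)
        [0.10, 0.70, 0.20],   # paroled    -> ...
        [0.00, 0.00, 1.00],   # discharged is absorbing
    ])

    pop = np.array([1000.0, 400.0, 0.0])   # initial state populations
    for _ in range(5):                     # project five periods ahead
        pop = pop @ P
    print(pop.round(1))                    # projected populations per state
    ```

    Because the rows of P sum to one, the total population is conserved; a semi-Markov version would additionally model the distribution of time spent in each state before a transition.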

  13. An interval-valued reliability model with bounded failure rates

    DEFF Research Database (Denmark)

    Kozine, Igor; Krymsky, Victor

    2012-01-01

    The approach to deriving interval-valued reliability measures described in this paper is distinctive from other imprecise reliability models in that it overcomes the issue of having to impose an upper bound on time to failure. It rests on the presupposition that a constant interval-valued failure...... function if only partial failure information is available. An example is provided. © 2012 Copyright Taylor and Francis Group, LLC....

  14. Assessing the potential value for an automated dairy cattle body condition scoring system through stochastic simulation

    NARCIS (Netherlands)

    Bewley, J.M.; Boehlje, M.D.; Gray, A.W.; Hogeveen, H.; Kenyon, S.J.; Eicher, S.D.; Schutz, M.M.

    2010-01-01

    Purpose – The purpose of this paper is to develop a dynamic, stochastic, mechanistic simulation model of a dairy business to evaluate the cost and benefit streams coinciding with technology investments. The model was constructed to embody the biological and economical complexities of a dairy farm system

  16. Eigen values in epidemic and other bio-inspired models

    Science.gov (United States)

    Supriatna, A. K.; Anggriani, N.; Carnia, E.; Raihan, A.

    2017-08-01

    Eigenvalues, and the largest eigenvalue in particular, play special roles in many applications. In this paper we discuss their role in determining the epidemic threshold, from which we can determine whether an epidemic will eventually die out or blow up. Some examples and their consequences for controlling the epidemic are also discussed. Besides the application in epidemic models, the paper also discusses another example of application in a bio-inspired model, namely backcross breeding for two age classes of local and exotic goats. Here we give some elaborative examples on the use of a previous backcross breeding model. Some future directions on the exploration of the relationship between these eigenvalues and different epidemic models and other bio-inspired models are also presented.
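    The threshold role of the largest eigenvalue can be illustrated with the standard next-generation-matrix construction, where R0 is the spectral radius of F V^{-1}; the matrices below are invented for illustration and are not taken from the paper:

    ```python
    import numpy as np

    # F holds new-infection rates between two infected compartments,
    # V holds transition/recovery rates; both are illustrative values.
    F = np.array([[0.0, 1.2],
                  [0.8, 0.0]])
    V = np.array([[0.5, 0.0],
                  [0.0, 0.4]])

    K = F @ np.linalg.inv(V)              # next-generation matrix
    R0 = max(abs(np.linalg.eigvals(K)))   # spectral radius = largest |eigenvalue|
    print(f"R0 = {R0:.3f}")               # epidemic dies out if R0 < 1, grows if R0 > 1
    ```

    The same spectral-radius criterion reappears in the breeding application: the dominant eigenvalue of the stage-transition matrix decides long-run growth or decline.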

  17. [Spirographic reference values. Mathematical models and practical use (author's transl)].

    Science.gov (United States)

    Drouet, D; Kauffmann, F; Brille, D; Lellouch, J

    1980-01-01

    Various models predicting VC and FEV1 from age and height have been compared by both theoretical and practical approaches on several subgroups of a working population examined in 1960 and 1972. The models in which spirographic values are proportional to the cube of the height give a significantly worse fit to the data. All the other models give similar predicted values in practical terms, but cutoff points depend on the distributions of VC and FEV1 given age and height. Results show that these distributions are closer to a normal than to a lognormal distribution. The use of reference values and classical cutoffs is then discussed. Rather than using a single cutoff point, a more quantitative way is proposed to describe the subject's functional status, for example by situating him within the percentiles of the reference population. In screening, cutoff points cannot be chosen without first specifying the decision considered and the population concerned.

  18. Automatic modeling of the linguistic values for database fuzzy querying

    Directory of Open Access Journals (Sweden)

    Diana STEFANESCU

    2007-12-01

    Full Text Available In order to evaluate vague queries, each linguistic term is considered according to its fuzzy model. Usually, the linguistic terms are defined as fuzzy sets during a classical off-line knowledge acquisition process. But they can also be automatically extracted from the actual content of the database by an online process. In at least two situations, automatically modeling the linguistic values would be very useful: first, to simplify the knowledge engineer's task by extracting the definitions from the database content; and second, where mandatory, to dynamically define the linguistic values in the evaluation of complex-criteria queries. Procedures to automatically extract the fuzzy model of the linguistic values from the existing data are presented in this paper.
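    One way such an extraction could work (an assumed approach for illustration, not necessarily the paper's procedure) is to anchor a triangular membership function for a linguistic value like "medium salary" on empirical quantiles of the stored column:

    ```python
    import statistics

    def triangular(a: float, b: float, c: float):
        """Return a triangular membership function with support [a, c], peak b."""
        def mu(x: float) -> float:
            if x <= a or x >= c:
                return 0.0
            if x <= b:
                return (x - a) / (b - a)
            return (c - x) / (c - b)
        return mu

    # Invented column values standing in for actual database content.
    salaries = [1200, 1500, 1800, 2100, 2400, 2700, 3000, 3600, 4200]
    qs = statistics.quantiles(salaries, n=4)    # quartiles Q1, Q2, Q3
    medium = triangular(qs[0], qs[1], qs[2])    # "medium" peaks at the median
    print(medium(qs[1]))                        # 1.0 at the median
    ```

    Because the quantiles are recomputed from the current data, the fuzzy set adapts automatically as the database content changes, which is exactly what the online process described above requires.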

  19. Personal values and crew compatibility: Results from a 105 days simulated space mission

    Science.gov (United States)

    Sandal, Gro M.; Bye, Hege H.; van de Vijver, Fons J. R.

    2011-08-01

    On a mission to Mars the crew will experience high autonomy and inter-dependence. "Groupthink", known as a tendency to strive for consensus at the cost of considering alternative courses of action, represents a potential safety hazard. This paper addresses two aspects of "groupthink": the extent to which confined crewmembers perceive increasing convergence in personal values, and whether they attribute less tension to individual differences over time. It further examines the impact of personal values for interpersonal compatibility. These questions were investigated in a 105-day confinement study in which a multinational crew (N=6) simulated a Mars mission. The Portrait of Crew Values Questionnaire was administered regularly to assess personal values, perceived value homogeneity, and tension attributed to value disparities. Interviews were conducted before and after the confinement. Multiple regression analysis revealed no significant changes in value homogeneity over time; rather the opposite tendency was indicated. More tension was attributed to differences in hedonism, benevolence and tradition in the last 35 days when the crew was allowed greater autonomy. Three subgroups, distinct in terms of personal values, were identified. No evidence for "groupthink" was found. The results suggest that personal values should be considered in composition of crews for long duration missions.

  20. Twitter's tweet method modelling and simulation

    Science.gov (United States)

    Sarlis, Apostolos S.; Sakas, Damianos P.; Vlachos, D. S.

    2015-02-01

    This paper seeks to propose the concept of Twitter marketing methods. The tools that Twitter provides are modelled and simulated using iThink in the context of a Twitter media-marketing agency. The paper has leveraged the system dynamics paradigm to model Twitter marketing tools and methods, using the iThink™ system to implement them. It uses the design science research methodology for the proof of concept of the models and modelling processes. The following models have been developed for a Twitter marketing agent/company and tested in real circumstances and with real numbers. These models were finalized through a number of revisions and iterations of the design, develop, simulate, test and evaluate cycle. The paper also addresses the methods that best suit organized promotion through targeting on the Twitter social media service. The validity and usefulness of these Twitter marketing method models for day-to-day decision making are authenticated by the management of the company organization. The paper implements system dynamics concepts of Twitter marketing methods modelling and produces models of various Twitter marketing situations. The Tweet method that Twitter provides can be adjusted, depending on the situation, in order to maximize the profit of the company/agent.

  1. Value increasing business model for e-hospital.

    Science.gov (United States)

    Null, Robert; Wei, June

    2009-01-01

    This paper developed a business value increasing model for electronic hospital (e-hospital) based on electronic value chain analysis. From this model, 58 hospital electronic business (e-business) solutions were developed. Additionally, this paper investigated the adoption patterns of these 58 e-business solutions within six US leading hospitals. The findings show that only 36 of 58 or 62% of the e-business solutions are fully or partially implemented within the six hospitals. Ultimately, the research results will be beneficial to managers and executives for accelerating e-business adoptions for e-hospital.

  2. Modeling the value of strategic actions in the superior colliculus

    Directory of Open Access Journals (Sweden)

    Dhushan Thevarajah

    2010-02-01

    Full Text Available In learning models of strategic game play, an agent constructs a valuation (action value) over possible future choices as a function of past actions and rewards. Choices are then stochastic functions of these action values. Our goal is to uncover a neural signal that correlates with the action value posited by behavioral learning models. We measured activity from neurons in the superior colliculus (SC), a midbrain region involved in planning saccadic eye movements, in monkeys while they performed two saccade tasks. In the strategic task, monkeys competed against a computer in a saccade version of the mixed-strategy game “matching-pennies”. In the instructed task, stochastic saccades were elicited through explicit instruction rather than free choices. In both tasks, neuronal activity and behavior were shaped by past actions and rewards, with more recent events exerting a larger influence. Further, SC activity predicted upcoming choices during the strategic task and upcoming reaction times during the instructed task. Finally, we found that neuronal activity in both tasks correlated with an established learning model, the Experience Weighted Attraction model of action valuation (Ho, Camerer, and Chong, 2007). Collectively, our results provide evidence that action values hypothesized by learning models are represented in the motor planning regions of the brain in a manner that could be used to select strategic actions.
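    For reference, the Experience Weighted Attraction update can be sketched as follows; the parameter values are invented, and the code follows the commonly cited form of the EWA rule rather than the paper's exact implementation:

    ```python
    # EWA parameters (invented values): phi = attraction decay, delta = weight
    # on forgone payoffs, rho = experience-weight decay.
    phi, delta, rho = 0.9, 0.5, 0.9

    def ewa_update(A, N, chosen, payoffs):
        """One EWA step: payoffs[j] is the payoff action j would have earned."""
        N_new = rho * N + 1.0
        A_new = []
        for j, pay in enumerate(payoffs):
            weight = delta + (1.0 - delta) * (1.0 if j == chosen else 0.0)
            A_new.append((phi * N * A[j] + weight * pay) / N_new)
        return A_new, N_new

    A, N = [0.0, 0.0], 1.0
    A, N = ewa_update(A, N, chosen=0, payoffs=[1.0, 0.0])
    print(A, N)   # the chosen, rewarded action's attraction rises
    ```

    Choice probabilities are then typically a softmax over the attractions A, which is the stochastic link between action values and behavior that the abstract refers to.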

  3. Fault diagnosis based on continuous simulation models

    Science.gov (United States)

    Feyock, Stefan

    1987-01-01

    The results are described of an investigation of techniques for using continuous simulation models as a basis for reasoning about physical systems, with emphasis on the diagnosis of system faults. It is assumed that a continuous simulation model of the properly operating system is available. Malfunctions are diagnosed by posing the question: how can we make the model behave like that? The adjustments that must be made to the model to produce the observed behavior usually provide definitive clues to the nature of the malfunction. A novel application of Dijkstra's weakest precondition predicate transformer is used to derive the preconditions for producing the required model behavior. To minimize the size of the search space, an envisionment generator based on interval mathematics was developed. In addition to its intended application, the ability to generate qualitative state spaces automatically from quantitative simulations proved to be a fruitful avenue of investigation in its own right. Implementations of the Dijkstra transform and the envisionment generator are reproduced in the Appendix.

  4. Modelling and simulation of affinity membrane adsorption.

    Science.gov (United States)

    Boi, Cristiana; Dimartino, Simone; Sarti, Giulio C

    2007-08-24

    A mathematical model for the adsorption of biomolecules on affinity membranes is presented. The model considers convection, diffusion and adsorption kinetics on the membrane module as well as the influence of dead end volumes and lag times; an analysis of flow distribution on the whole system is also included. The parameters used in the simulations were obtained from equilibrium and dynamic experimental data measured for the adsorption of human IgG on A2P-Sartoepoxy affinity membranes. The identification of a bi-Langmuir kinetic mechanisms for the experimental system investigated was paramount for a correct process description and the simulated breakthrough curves were in good agreement with the experimental data. The proposed model provides a new insight into the phenomena involved in the adsorption on affinity membranes and it is a valuable tool to assess the use of membrane adsorbers in large scale processes.

  5. Multiphase reacting flows modelling and simulation

    CERN Document Server

    Marchisio, Daniele L

    2007-01-01

    The papers in this book describe the most widely applicable modeling approaches and are organized in six groups covering from fundamentals to relevant applications. In the first part, some fundamentals of multiphase turbulent reacting flows are covered. In particular the introduction focuses on basic notions of turbulence theory in single-phase and multi-phase systems as well as on the interaction between turbulence and chemistry. In the second part, models for the physical and chemical processes involved are discussed. Among other things, particular emphasis is given to turbulence modeling strategies for multiphase flows based on the kinetic theory for granular flows. Next, the different numerical methods based on Lagrangian and/or Eulerian schemes are presented. In particular the most popular numerical approaches of computational fluid dynamics codes are described (i.e., Direct Numerical Simulation, Large Eddy Simulation, and Reynolds-Averaged Navier-Stokes approach). The book will cover particle-based meth...

  6. Modeling, simulation and optimization of bipedal walking

    CERN Document Server

    Berns, Karsten

    2013-01-01

    The model-based investigation of motions of anthropomorphic systems is an important interdisciplinary research topic involving specialists from many fields such as Robotics, Biomechanics, Physiology, Orthopedics, Psychology, Neurosciences, Sports, Computer Graphics and Applied Mathematics. The study of basic locomotion forms such as walking and running is of particular interest due to the high demands on dynamic coordination, actuator efficiency and balance control. Mathematical models and numerical simulation and optimization techniques are explained, in combination with experimental data, which can help to better understand the basic underlying mechanisms of these motions and to improve them. Example topics treated in this book are: modeling techniques for anthropomorphic bipedal walking systems; optimized walking motions for different objective functions; identification of objective functions from measurements; simulation and optimization approaches for humanoid robots; biologically inspired con...

  7. A Superbubble Feedback Model for Galaxy Simulations

    CERN Document Server

    Keller, B W; Benincasa, S M; Couchman, H M P

    2014-01-01

    We present a new stellar feedback model that reproduces superbubbles. Superbubbles from clustered young stars evolve quite differently to individual supernovae and are substantially more efficient at generating gas motions. The essential new components of the model are thermal conduction, sub-grid evaporation and a sub-grid multi-phase treatment for cases where the simulation mass resolution is insufficient to model the early stages of the superbubble. The multi-phase stage is short compared to superbubble lifetimes. Thermal conduction physically regulates the hot gas mass without requiring a free parameter. Accurately following the hot component naturally avoids overcooling. Prior approaches tend to heat too much mass, leaving the hot ISM below $10^6$ K and susceptible to rapid cooling unless ad-hoc fixes were used. The hot phase also allows feedback energy to correctly accumulate from multiple, clustered sources, including stellar winds and supernovae. We employ high-resolution simulations of a single star ...

  8. Gstat: a program for geostatistical modelling, prediction and simulation

    Science.gov (United States)

    Pebesma, Edzer J.; Wesseling, Cees G.

    1998-01-01

    Gstat is a computer program for variogram modelling, and geostatistical prediction and simulation. It provides a generic implementation of the multivariable linear model with trends modelled as a linear function of coordinate polynomials or of user-defined base functions, and independent or dependent, geostatistically modelled, residuals. Simulation in gstat comprises conditional or unconditional (multi-) Gaussian sequential simulation of point values or block averages, or (multi-) indicator sequential simulation. Besides many of the popular options found in other geostatistical software packages, gstat offers the unique combination of (i) an interactive user interface for modelling variograms and generalized covariances (residual variograms), that uses the device-independent plotting program gnuplot for graphical display, (ii) support for several ASCII and binary data and map file formats for input and output, (iii) a concise, intuitive and flexible command language, (iv) user customization of program defaults, (v) no built-in limits, and (vi) free, portable ANSI-C source code. This paper describes the class of problems gstat can solve, and addresses aspects of efficiency and implementation, managing geostatistical projects, and relevant technical details.
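    As a standalone illustration of the kind of variogram model gstat fits (this is plain Python, not gstat's own code, and the nugget, partial sill and range values are invented):

    ```python
    import numpy as np

    def spherical(h, nugget=0.1, psill=0.9, rng=100.0):
        """Spherical variogram: gamma(0) = 0, rising to nugget + psill at the range."""
        h = np.asarray(h, dtype=float)
        gamma = np.where(
            h < rng,
            nugget + psill * (1.5 * h / rng - 0.5 * (h / rng) ** 3),
            nugget + psill,            # flat at the sill beyond the range
        )
        return np.where(h == 0.0, 0.0, gamma)

    print(spherical([0.0, 50.0, 100.0, 200.0]))
    ```

    In kriging and sequential Gaussian simulation, a fitted model like this supplies the covariances between observation and prediction locations.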

  9. Mean Value Engine Modelling of an SI Engine with EGR

    DEFF Research Database (Denmark)

    Føns, Michael; Müller, Martin; Chevalier, Alain

    1999-01-01

    Mean Value Engine Models (MVEMs) are simplified, dynamic engine models that are physically based. Such models are useful for control studies, for engine control system analysis and for model based engine control systems. Very few published MVEMs have included the effects of Exhaust Gas...... Recirculation (EGR). The purpose of this paper is to present a modified MVEM which includes EGR in a physical way. It has been tested using newly developed, very fast manifold pressure, manifold temperature, port and EGR mass flow sensors. Reasonable agreement has been obtained on an experimental engine...
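    The manifold filling/emptying dynamics at the core of most MVEMs can be sketched as a single pressure state integrated with forward Euler; the ideal-gas balance dp/dt = (R*T/V)(mdot_in - mdot_out) is standard, but all numerical values and the linear port-flow assumption below are invented for illustration:

    ```python
    # Illustrative intake-manifold pressure dynamics (not the paper's equations).
    R_air = 287.0        # gas constant of air, J/(kg K)
    T = 300.0            # manifold temperature, K (held constant here)
    V = 0.003            # manifold volume, m^3
    p = 50_000.0         # initial manifold pressure, Pa
    dt = 0.001           # integration step, s

    for _ in range(100):
        mdot_in = 0.02                 # throttle mass flow, kg/s (constant here)
        mdot_out = 4.0e-10 * p         # port mass flow, assumed proportional to p
        p += dt * (R_air * T / V) * (mdot_in - mdot_out)

    print(round(p), "Pa")              # pressure rises toward its equilibrium
    ```

    An EGR extension of the kind the paper describes would add a second inflow term for the recirculated exhaust mass, plus a temperature state for the mixed gas.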

  10. Modelling and simulation of thermal power plants

    Energy Technology Data Exchange (ETDEWEB)

    Eborn, J.

    1998-02-01

    Mathematical modelling and simulation are important tools when dealing with engineering systems that today are becoming increasingly more complex. Integrated production and recycling of materials are trends that give rise to heterogeneous systems, which are difficult to handle within one area of expertise. Model libraries are an excellent way to package engineering knowledge of systems and units to be reused by those who are not experts in modelling. Many commercial packages provide good model libraries, but they are usually domain-specific and closed. Heterogeneous, multi-domain systems require open model libraries written in general purpose modelling languages. This thesis describes a model database for thermal power plants written in the object-oriented modelling language OMOLA. The models are based on first principles. Subunits describe volumes with pressure and enthalpy dynamics and flows of heat or different media. The subunits are used to build basic units such as pumps, valves and heat exchangers which can be used to build system models. Several applications are described; a heat recovery steam generator, equipment for juice blending, steam generation in a sulphuric acid plant and a condensing steam plate heat exchanger. Model libraries for industrial use must be validated against measured data. The thesis describes how parameter estimation methods can be used for model validation. Results from a case-study on parameter optimization of a non-linear drum boiler model show how the technique can be used. 32 refs, 21 figs

  11. Dynamics modeling and simulation of flexible airships

    Science.gov (United States)

    Li, Yuwen

    The resurgence of airships has created a need for dynamics models and simulation capabilities of these lighter-than-air vehicles. The focus of this thesis is a theoretical framework that integrates the flight dynamics, structural dynamics, aerostatics and aerodynamics of flexible airships. The study begins with a dynamics model based on a rigid-body assumption. A comprehensive computation of aerodynamic effects is presented, where the aerodynamic forces and moments are categorized into various terms based on different physical effects. A series of prediction approaches for different aerodynamic effects are unified and applied to airships. The numerical results of aerodynamic derivatives and the simulated responses to control surface deflection inputs are verified by comparing to existing wind-tunnel and flight test data. With the validated aerodynamics and rigid-body modeling, the equations of motion of an elastic airship are derived by the Lagrangian formulation. The airship is modeled as a free-free Euler-Bernoulli beam and the bending deformations are represented by shape functions chosen as the free-free normal modes. In order to capture the coupling between the aerodynamic forces and the structural elasticity, local velocity on the deformed vehicle is used in the computation of aerodynamic forces. Finally, with the inertial, gravity, aerostatic and control forces incorporated, the dynamics model of a flexible airship is represented by a single set of nonlinear ordinary differential equations. The proposed model is implemented as a dynamics simulation program to analyze the dynamics characteristics of the Skyship-500 airship. Simulation results are presented to demonstrate the influence of structural deformation on the aerodynamic forces and the dynamics behavior of the airship. The nonlinear equations of motion are linearized numerically for the purpose of frequency domain analysis and for aeroelastic stability analysis. The results from the latter for the

  12. Can Participatory Action Research Create Value for Business Model Innovation?

    DEFF Research Database (Denmark)

    Sparre, Mogens; Rasmussen, Ole Horn; Fast, Alf Michael

    Abstract: Participatory Action Research (PAR) has a longer academic history compared with the idea of business models (BMs). This paper indicates how industries gain by using the combined methodology. The research question "Can participatory action research create value for Business Model...... Innovation (BMI)?” – has been investigated from five different perspectives based upon The Business Model Cube and The Where to Look Model. Using both established and newly developed tools the paper presents how. Theory and data from two cases are presented and it is demonstrated how industries increase...... their monetary and/or non-monetary value creation doing BMI based upon PAR. The process is essential and using the methodology of PAR creates meaning. Behind the process, the RAR methodology and its link to BM and BMI may contribute to theory construction and creation of a common language in academia around...

  13. Modeling and simulation of economic processes

    Directory of Open Access Journals (Sweden)

    Bogdan Brumar

    2010-12-01

    Full Text Available In general, any activity unfolds over a longer period and is often characterized by a degree of uncertainty and insecurity about the size of the objective pursued. Because of the complexity of real economic systems and the stochastic dependencies between the different variables and parameters considered, not all systems can be adequately represented by a model that can be solved by analytical methods while covering all the issues relevant to management decision analysis over a real economic horizon. Often in such cases, the simulation technique is considered the only alternative available. Using simulation techniques to study real-world systems often requires laborious work. Carrying out a simulation experiment is a process that takes place in several stages.

  14. A model for measuring value for money in professional sports

    Directory of Open Access Journals (Sweden)

    Vlad ROŞCA

    2013-07-01

    Full Text Available Few sports teams, if any, measure the entertainment value they provide to fans in exchange for the money the latter spend on admission fees. The scientific literature overlooks the issue as well. The aim of this paper is to present a model that can be used for calculating value for money in the context of spectator sports. The research question asks how value for money can be conceptualized and measured for sports marketing purposes. Using financial and sporting variables, the method calculates how much money, on average, a fan had to spend to receive quality entertainment – defined as won matches – from his favorite team during the last season of the Romanian first division football championship. The results only partially confirm the research hypothesis, showing that not just price and sporting performance may influence the value delivered to fans, but other factors as well.
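    The core arithmetic suggested by the abstract (my own minimal formalization, not the paper's exact model) is average spend per won match:

    ```python
    def cost_per_win(ticket_price: float, matches_attended: int, wins: int) -> float:
        """Average money spent per match won by the fan's team; all inputs invented."""
        if wins == 0:
            return float("inf")      # no sporting "value" delivered at all
        return ticket_price * matches_attended / wins

    spend_per_win = cost_per_win(ticket_price=30.0, matches_attended=17, wins=10)
    print(f"{spend_per_win:.2f} currency units per won match")   # 51.00
    ```

    A lower figure means cheaper entertainment per unit of sporting success, which makes teams directly comparable on a value-for-money basis.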

  15. Deep Drawing Simulations With Different Polycrystalline Models

    Science.gov (United States)

    Duchêne, Laurent; de Montleau, Pierre; Bouvier, Salima; Habraken, Anne Marie

    2004-06-01

    The goal of this research is to study the anisotropic material behavior during forming processes, represented by both complex yield loci and kinematic-isotropic hardening models. A first part of this paper describes the main concepts of the `Stress-strain interpolation' model that has been implemented in the non-linear finite element code Lagamine. This model consists of a local description of the yield locus based on the texture of the material through the full constraints Taylor's model. The texture evolution due to plastic deformations is computed throughout the FEM simulations. This `local yield locus' approach was initially linked to the classical isotropic Swift hardening law. Recently, a more complex hardening model was implemented: the physically-based microstructural model of Teodosiu. It takes into account intergranular heterogeneity due to the evolution of dislocation structures, that affects isotropic and kinematic hardening. The influence of the hardening model is compared to the influence of the texture evolution thanks to deep drawing simulations.

  16. Facebook's personal page modelling and simulation

    Science.gov (United States)

    Sarlis, Apostolos S.; Sakas, Damianos P.; Vlachos, D. S.

    2015-02-01

    In this paper we will try to define the utility of Facebook's Personal Page marketing method. This tool that Facebook provides is modelled and simulated using iThink in the context of a Facebook marketing agency. The paper has leveraged the system dynamics paradigm to model Facebook marketing tools and methods, using the iThink™ system to implement them. It uses the design science research methodology for the proof of concept of the models and modelling processes. The following model has been developed for a social media marketing agent/company, oriented to the Facebook platform and tested in real circumstances. This model was finalized through a number of revisions and iterations of the design, development, simulation, testing and evaluation processes. The validity and usefulness of this Facebook marketing model for day-to-day decision making are authenticated by the management of the company organization. Facebook's Personal Page method can be adjusted, depending on the situation, in order to maximize the total profit of the company, which means bringing in new customers, keeping the interest of old customers and delivering traffic to its website.

  17. Values and uncertainties in the predictions of global climate models.

    Science.gov (United States)

    Winsberg, Eric

    2012-06-01

    Over the last several years, there has been an explosion of interest and attention devoted to the problem of Uncertainty Quantification (UQ) in climate science-that is, to giving quantitative estimates of the degree of uncertainty associated with the predictions of global and regional climate models. The technical challenges associated with this project are formidable, and so the statistical community has understandably devoted itself primarily to overcoming them. But even as these technical challenges are being met, a number of persistent conceptual difficulties remain. So why is UQ so important in climate science? UQ, I would like to argue, is first and foremost a tool for communicating knowledge from experts to policy makers in a way that is meant to be free from the influence of social and ethical values. But the standard ways of using probabilities to separate ethical and social values from scientific practice cannot be applied in a great deal of climate modeling, because the roles of values in creating the models cannot be discerned after the fact-the models are too complex and the result of too much distributed epistemic labor. I argue, therefore, that typical approaches for handling ethical/social values in science do not work well here.

  18. IT Business Value Model for Information Intensive Organizations

    Directory of Open Access Journals (Sweden)

    Antonio Carlos Gastaud Maçada

    2012-01-01

    Full Text Available Many studies have highlighted the capacity Information Technology (IT) has for generating value for organizations. Investments in IT made by organizations have increased each year. Therefore, the purpose of the present study is to analyze the IT Business Value for Information Intensive Organizations (IIO) - e.g. banks, insurance companies and securities brokers. The research method consisted of a survey that used and combined the models from Weill and Broadbent (1998) and Gregor, Martin, Fernandez, Stern and Vitale (2006). Data was gathered using an adapted instrument containing 5 dimensions (Strategic, Informational, Transactional, Transformational and Infra-structure) with 27 items. The instrument was refined by employing statistical techniques such as Exploratory and Confirmatory Factorial Analysis through Structural Equations (first and second order Measurement Model). The final model is composed of four factors related to IT Business Value: Strategic, Informational, Transactional and Transformational, arranged in 15 items. The dimension Infra-structure was excluded during the model refinement process because it was discovered during interviews that managers were unable to perceive it as a distinct dimension of IT Business Value.

  19. Value-Added Models: What the Experts Say

    Science.gov (United States)

    Amrein-Beardsley, Audrey; Pivovarova, Margarita; Geiger, Tray J.

    2016-01-01

    Being an expert involves explaining how things are supposed to work, and, perhaps more important, why things might not work as supposed. In this study, researchers surveyed scholars with expertise in value-added models (VAMs) to solicit their opinions about the uses and potential of VAMs for teacher-level accountability purposes (for example, in…

  20. Towards Better Coupling of Hydrological Simulation Models

    Science.gov (United States)

    Penton, D.; Stenson, M.; Leighton, B.; Bridgart, R.

    2012-12-01

    Standards for model interoperability and scientific workflow software provide techniques and tools for coupling hydrological simulation models. However, model builders are yet to realize the benefits of these and continue to write ad hoc implementations and scripts. Three case studies demonstrate different approaches to coupling models, the first using tight interfaces (OpenMI), the second using a scientific workflow system (Trident) and the third using a tailored execution engine (Delft Flood Early Warning System - Delft-FEWS). No approach was objectively better than any other approach. The foremost standard for coupling hydrological models is the Open Modeling Interface (OpenMI), which defines interfaces for models to interact. An implementation of the OpenMI standard involves defining interchange terms and writing a .NET/Java wrapper around the model. An execution wrapper such as OatC.GUI or Pipistrelle executes the models. The team built two OpenMI implementations for eWater Source river system models. Once built, it was easy to swap river system models. The team encountered technical challenges with versions of the .Net framework (3.5 calling 4.0) and with the performance of the execution wrappers when running daily simulations. By design, the OpenMI interfaces are general, leaving significant decisions around the semantics of the interfaces to the implementer. Increasingly, scientific workflow tools such as Kepler, Taverna and Trident are able to replace custom scripts. These tools aim to improve the provenance and reproducibility of processing tasks. In particular, Taverna and the myExperiment website have had success making many bioinformatics workflows reusable and sharable. The team constructed Trident activities for hydrological software including IQQM, REALM and eWater Source. They built an activity generator for model builders to build activities for particular river systems. The models were linked at a simulation level, without any daily time

  1. Modeling and simulation of the human eye

    Science.gov (United States)

    Duran, R.; Ventura, L.; Nonato, L.; Bruno, O.

    2007-02-01

    The computational modeling of the human eye has been widely studied by different sectors of the scientific and technological community. One of the main reasons for this growing interest is the possibility of reproducing the optical properties of the eye through computational simulation, enabling the development of efficient devices to treat and correct vision problems. This work explores a still little-investigated aspect of visual-system modeling, proposing a computational framework that makes it possible to use real data in the modeling and simulation of the human visual system. This new approach enables investigation of the optical system of an individual patient, assisting in the construction of new techniques used to infer vital data in medical investigations. Using corneal topography to collect real data from patients, a computational model of the cornea is constructed and a set of simulations is built to verify the correctness of the system and to investigate the effect of corneal abnormalities on retinal image formation, including Plácido discs, the point spread function, the wavefront, and the projection of a real image and its visualization on the retina.
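The retinal-image quantities mentioned (point spread function, wavefront) can be illustrated with a standard Fourier-optics sketch: for an aberration-free circular pupil, the PSF is the squared magnitude of the Fourier transform of the pupil function. The grid size and pupil radius below are arbitrary illustrative values; real corneal-topography work would add the measured wavefront as a phase term on the pupil.

```python
import numpy as np

# PSF of a circular pupil via Fourier optics: PSF = |FFT(pupil)|^2,
# normalized to unit energy. Aberration-free case; measured wavefront
# aberrations would enter as a complex phase on the pupil function.
N = 128                                   # grid size (illustrative)
y, x = np.mgrid[-N // 2:N // 2, -N // 2:N // 2]
r = np.hypot(x, y)
pupil = (r <= N // 8).astype(float)       # circular aperture, radius N/8

field = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(pupil)))
psf = np.abs(field) ** 2
psf /= psf.sum()                          # normalize to unit energy
```

For the unaberrated pupil the result is the familiar Airy-like pattern with its peak at the image center.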

  2. A superbubble feedback model for galaxy simulations

    Science.gov (United States)

    Keller, B. W.; Wadsley, J.; Benincasa, S. M.; Couchman, H. M. P.

    2014-08-01

    We present a new stellar feedback model that reproduces superbubbles. Superbubbles from clustered young stars evolve quite differently to individual supernovae and are substantially more efficient at generating gas motions. The essential new components of the model are thermal conduction, subgrid evaporation and a subgrid multiphase treatment for cases where the simulation mass resolution is insufficient to model the early stages of the superbubble. The multiphase stage is short compared to superbubble lifetimes. Thermal conduction physically regulates the hot gas mass without requiring a free parameter. Accurately following the hot component naturally avoids overcooling. Prior approaches tend to heat too much mass, leaving the hot interstellar medium (ISM) below 10⁶ K and susceptible to rapid cooling unless ad hoc fixes were used. The hot phase also allows feedback energy to correctly accumulate from multiple, clustered sources, including stellar winds and supernovae. We employ high-resolution simulations of a single star cluster to show the model is insensitive to numerical resolution, unresolved ISM structure and suppression of conduction by magnetic fields. We also simulate a Milky Way analogue and a dwarf galaxy. Both galaxies show regulated star formation and produce strong outflows.

  3. Theory, modeling and simulation: Annual report 1993

    Energy Technology Data Exchange (ETDEWEB)

    Dunning, T.H. Jr.; Garrett, B.C.

    1994-07-01

    Developing the knowledge base needed to address the environmental restoration issues of the US Department of Energy requires a fundamental understanding of molecules and their interactions in isolation and in liquids, on surfaces, and at interfaces. To meet these needs, PNL has established the Environmental and Molecular Sciences Laboratory (EMSL) and will soon begin construction of a new, collaborative research facility devoted to advancing the understanding of environmental molecular science. Research in the Theory, Modeling, and Simulation (TMS) program, which is one of seven research directorates in the EMSL, will play a critical role in understanding molecular processes important in restoring DOE's research, development and production sites, including understanding the migration and reactions of contaminants in soils and groundwater, the development of separation processes for isolation of pollutants, the development of improved materials for waste storage, understanding the enzymatic reactions involved in the biodegradation of contaminants, and understanding the interaction of hazardous chemicals with living organisms. The research objectives of the TMS program are to apply available techniques to study fundamental molecular processes involved in natural and contaminated systems; to extend current techniques to treat molecular systems of future importance and to develop techniques for addressing problems that are computationally intractable at present; to apply molecular modeling techniques to simulate molecular processes occurring in the multispecies, multiphase systems characteristic of natural and polluted environments; and to extend current molecular modeling techniques to treat complex molecular systems and to improve the reliability and accuracy of such simulations. The program contains three research activities: Molecular Theory/Modeling, Solid State Theory, and Biomolecular Modeling/Simulation. Extended abstracts are presented for 89 studies.

  4. Simulation modelling of fynbos ecosystems: Systems analysis and conceptual models

    CSIR Research Space (South Africa)

    Kruger, FJ

    1985-03-01

    Full Text Available This report outlines progress with the development of computer based dynamic simulation models for ecosystems in the fynbos biome. The models are planned to run on a portable desktop computer with 500 kbytes of memory, extended BASIC language...

  5. A Model Management Approach for Co-Simulation Model Evaluation

    NARCIS (Netherlands)

    Zhang, X.C.; Broenink, Johannes F.; Filipe, Joaquim; Kacprzyk, Janusz; Pina, Nuno

    2011-01-01

    Simulating formal models is a common means of validating the correctness of a system design and reducing time-to-market. Most embedded control system designs involve multiple engineering disciplines and various domain-specific models, such as mechanical, control, software

  6. eShopper modeling and simulation

    Science.gov (United States)

    Petrushin, Valery A.

    2001-03-01

    The advent of e-commerce gives an opportunity to shift the paradigm of customer communication into a highly interactive mode. The new generation of commercial Web servers, such as Blue Martini's server, combines the collection of data on customer behavior with real-time processing and dynamic tailoring of a feedback page. New opportunities for direct product marketing and cross-selling are arriving. The key problem is what kind of information we need to achieve these goals, or in other words, how do we model the customer? The paper is devoted to customer modeling and simulation. The focus is on modeling an individual customer. The model is based on the customer's transaction data, click-stream data, and demographics. The model includes the hierarchical profile of a customer's preferences for different types of products and brands; consumption models for the different types of products; the current focus, trends, and stochastic models for time intervals between purchases; product affinity models; and some generalized features, such as purchasing power, sensitivity to advertising, price sensitivity, etc. This type of model is used for predicting the date of the next visit, overall spending, and spending for different types of products and brands. For some types of stores (for example, a supermarket) and stable customers, it is possible to forecast shopping lists rather accurately. The forecasting techniques are discussed. The forecasting results can be used for online direct marketing, customer retention, and inventory management. The customer model can also be used as a generative model for simulating the customer's purchasing behavior in different situations and for estimating customer features.

  7. An Executable Architecture Tool for the Modeling and Simulation of Operational Process Models

    Science.gov (United States)

    2015-03-16

    This paper includes a literature review, background information on process models and architecture, and a brief description of other available tools. Future work involves coordination with Subject Matter Experts (SMEs) and extracting data from experiments to assign more appropriate values.

  8. NUMERICAL MODEL APPLICATION IN ROWING SIMULATOR DESIGN

    Directory of Open Access Journals (Sweden)

    Petr Chmátal

    2016-04-01

    Full Text Available The aim of the research was to carry out a hydraulic design of a rowing/sculling and paddling simulator. Nowadays there are two main approaches to simulator design. The first uses static water with no artificial movement and relies on specially cut oars to provide the same resistance in the water. The second approach, on the other hand, uses pumps or similar devices to force the water to circulate, but both designs share many problems. Such problems affect already-built facilities and can be summarized as an unrealistic feeling, unwanted turbulent flow and a bad velocity profile. Therefore, the goal was to design a new rowing simulator that would provide nature-like conditions for the racers and an unmatched experience. In order to accomplish this challenge, it was decided to use in-depth numerical modeling to solve the hydraulic problems. The general measures for the design were taken in accordance with the space availability of the simulator's housing. The entire research was coordinated with other stages of the construction using BIM. The detailed geometry was designed using a numerical model in Ansys Fluent and parametric auto-optimization tools, which led to minimal negative hydraulic phenomena and decreased investment and operational costs due to the decreased hydraulic losses in the system.

  9. Challenges and needs in fire management: A landscape simulation modeling perspective [chapter 4

    Science.gov (United States)

    Robert E. Keane; Geoffrey J. Cary; Mike D. Flannigan

    2011-01-01

    Fire management will face many challenges in the future from global climate change to protecting people, communities, and values at risk. Simulation modeling will be a vital tool for addressing these challenges but the next generation of simulation models must be spatially explicit to address critical landscape ecology relationships and they must use mechanistic...

  10. Advanced Modeling, Simulation and Analysis (AMSA) Capability Roadmap Progress Review

    Science.gov (United States)

    Antonsson, Erik; Gombosi, Tamas

    2005-01-01

    Contents include the following: NASA capability roadmap activity. Advanced modeling, simulation, and analysis overview. Scientific modeling and simulation. Operations modeling. Multi-spectral sensing (UV-gamma). System integration. M and S environments and infrastructure.

  11. Macro Level Simulation Model Of Space Shuttle Processing

    Science.gov (United States)

    2000-01-01

    The contents include: 1) Space Shuttle Processing Simulation Model; 2) Knowledge Acquisition; 3) Simulation Input Analysis; 4) Model Applications in Current Shuttle Environment; and 5) Model Applications for Future Reusable Launch Vehicles (RLV's). This paper is presented in viewgraph form.

  12. Wang-Landau simulation of Gō model molecules.

    Science.gov (United States)

    Böker, Arne; Paul, Wolfgang

    2016-01-01

    Gō-like models are one of the oldest protein modeling concepts in computational physics and have proven their value over and over for forty years. The essence of a Gō model is to define a native contact matrix for a well-defined low-energy polymer configuration, e.g., the native state in the case of proteins or peptides. Many different potential shapes and many different cut-off distances in the definition of this native contact matrix have been proposed and applied. We investigate here the physical consequences of the choice for this cut-off distance in the Gō models derived for a square-well tangent sphere homopolymer chain. For this purpose we are performing flat-histogram Monte Carlo simulations of Wang-Landau type, obtaining the thermodynamic and structural properties of such models over the complete temperature range. Differences and similarities with Gō models for proteins and peptides are discussed.
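The flat-histogram procedure the authors use can be illustrated on the smallest nontrivial system with an exactly known density of states, a 2×2 periodic Ising model (not the square-well polymer of the paper): Wang-Landau accepts a spin flip with probability min(1, g(E_old)/g(E_new)), adds ln f to ln g of the current energy after every step, and halves ln f whenever the visit histogram is flat.

```python
import numpy as np

# Wang-Landau estimation of the density of states g(E) for a 2x2 periodic
# Ising model, whose exact spectrum is E in {-8, 0, +8} with g = {2, 12, 2}.
# Illustrative sketch only -- not the square-well polymer model of the paper.
rng = np.random.default_rng(1)
L = 2
levels = [-8, 0, 8]
idx = {e: i for i, e in enumerate(levels)}

def energy(s):
    # each bond counted twice by the rolls, matching the {-8, 0, +8} spectrum
    return -int(np.sum(s * np.roll(s, 1, axis=0) + s * np.roll(s, 1, axis=1)))

ln_g = np.zeros(3)              # running estimate of ln g(E)
hist = np.zeros(3)              # visit histogram for the flatness check
s = rng.choice([-1, 1], size=(L, L))
E = energy(s)
ln_f = 1.0                      # modification factor, halved when flat
while ln_f > 1e-4:
    for _ in range(5000):
        i, j = rng.integers(L), rng.integers(L)
        s[i, j] *= -1                        # propose a single spin flip
        E_new = energy(s)
        d = ln_g[idx[E]] - ln_g[idx[E_new]]
        if d >= 0 or rng.random() < np.exp(d):
            E = E_new                        # accept with min(1, g_old/g_new)
        else:
            s[i, j] *= -1                    # reject: undo the flip
        ln_g[idx[E]] += ln_f
        hist[idx[E]] += 1
    if hist.min() > 0.8 * hist.mean():       # flat enough: refine f
        hist[:] = 0
        ln_f /= 2
```

At convergence the differences of `ln_g` recover the exact ratios, e.g. ln g(0) − ln g(−8) ≈ ln 6, covering the full temperature range in a single run — the property the paper exploits.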

  13. Mathematical analysis and numerical simulation of a model of morphogenesis.

    Science.gov (United States)

    Muñoz, Ana I; Tello, José Ignacio

    2011-10-01

    We consider a simple mathematical model of distribution of morphogens (signaling molecules responsible for the differentiation of cells and the creation of tissue patterns). The mathematical model is a particular case of the model proposed by Lander, Nie and Wan in 2006 and similar to the model presented in Lander, Nie, Vargas and Wan 2005. The model consists of a system of three equations: a PDE of parabolic type with dynamical boundary conditions modelling the distribution of free morphogens and two ODEs describing the evolution of bound and free receptors. Three biological processes are taken into account: diffusion, degradation and reversible binding. We study the stationary solutions and the evolution problem. Numerical simulations show the behavior of the solution depending on the values of the parameters.
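The three processes named in the abstract (diffusion, degradation, reversible binding) can be sketched with an explicit finite-difference scheme on a 1D domain. All parameter values are illustrative, and a constant source replaces the paper's dynamical boundary conditions.

```python
import numpy as np

# Minimal 1D sketch: free morphogen a diffuses, degrades, and binds
# reversibly to receptors (b = bound receptors). Parameters illustrative.
nx, dx, dt = 50, 1.0, 0.1
D, k_deg = 1.0, 0.01                    # diffusion, degradation of free morphogen
k_on, k_off, R_tot = 0.1, 0.05, 1.0     # reversible binding to receptors

a = np.zeros(nx)                        # free morphogen concentration
b = np.zeros(nx)                        # bound receptor concentration

for _ in range(2000):
    lap = np.zeros(nx)
    lap[1:-1] = (a[2:] - 2 * a[1:-1] + a[:-2]) / dx**2
    flux = k_on * a * (R_tot - b) - k_off * b     # net binding rate
    a += dt * (D * lap - k_deg * a - flux)
    b += dt * flux
    a[0] = 1.0                          # constant morphogen source at left edge
```

The run produces a monotonically decaying morphogen gradient away from the source, the qualitative behavior the stationary-solution analysis addresses.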

  14. Numerical model for learning concepts of streamflow simulation

    Science.gov (United States)

    DeLong, L.L.; ,

    1993-01-01

    Numerical models are useful for demonstrating principles of open-channel flow. Such models allow experimentation with cause-and-effect relations, testing concepts of physics and numerical techniques. Four PT is a numerical model written primarily as a teaching supplement for a course in one-dimensional streamflow modeling. Four PT options particularly useful in training include selection of governing equations, boundary-value perturbation, and user-programmable constraint equations. The model can simulate non-trivial concepts such as flow in complex interconnected channel networks, meandering channels with variable effective flow lengths, hydraulic structures defined by unique three-parameter relations, and density-driven flow. The model is coded in FORTRAN 77, and data encapsulation is used extensively to simplify maintenance and modification and to enhance the use of Four PT modules by other programs and programmers.

  15. Modeling and simulation of normal and hemiparetic gait

    Science.gov (United States)

    Luengas, Lely A.; Camargo, Esperanza; Sanchez, Giovanni

    2015-09-01

    Gait is the collective term for the two types of bipedal locomotion, walking and running. This paper focuses on walking. The analysis of human gait is of interest to many different disciplines, including biomechanics, human-movement science, rehabilitation and medicine in general. Here we present a new model that is capable of reproducing the properties of walking, both normal and pathological. The aim of this paper is to establish the biomechanical principles that underlie human walking by using the Lagrange method. The constraint forces of the Rayleigh dissipation function, through which the effect on the tissues during gait is considered, are included. Depending on the value of the factor present in the Rayleigh dissipation function, both normal and pathological gait can be simulated. We first apply it to normal gait and then to permanent hemiparetic gait. Anthropometric data of an adult person are used in the simulation; anthropometric data for children can also be used, but existing anthropometric tables must then be consulted. Validation of these models includes simulations of passive dynamic gait on level ground. The dynamic walking approach provides a new perspective on gait analysis, focusing on the kinematics and kinetics of gait. There have been studies and simulations of normal human gait, but few have focused on abnormal, especially hemiparetic, gait. Quantitative comparisons of the model predictions with gait measurements show that the model can reproduce the significant characteristics of normal gait.
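The role of the Rayleigh dissipation factor can be illustrated on the simplest leg-like system, a single damped pendulum: the Euler-Lagrange equation picks up a term c·θ̇ from the dissipation function, and increasing c makes the swing die out faster. This is purely illustrative and not the paper's full gait model; all values are arbitrary.

```python
import math

def swing(c, theta0=0.5, steps=2000, dt=0.005, g=9.81, l=1.0):
    """Damped pendulum via semi-implicit Euler; c is the Rayleigh
    dissipation coefficient. Returns final mechanical energy per unit mass."""
    theta, omega = theta0, 0.0
    for _ in range(steps):
        omega += dt * (-(g / l) * math.sin(theta) - c * omega)
        theta += dt * omega
    return 0.5 * (l * omega) ** 2 + g * l * (1.0 - math.cos(theta))

normal = swing(c=0.5)     # mild dissipation: slow decay of the swing
impaired = swing(c=5.0)   # strong dissipation: motion dies out much faster
```

The larger dissipation coefficient leaves far less mechanical energy after the same simulated time, mirroring how the paper tunes a single factor to move between normal and hemiparetic gait.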

  16. Interactive Modelling and Simulation of Human Motion

    DEFF Research Database (Denmark)

    Engell-Nørregård, Morten Pol

    Danish abstract: This Ph.D. thesis deals with the modelling and simulation of human motion. The topics in this thesis have at least two things in common. First, they deal with human motion. Although the developed models can also be used for other purposes, ... Furthermore, it can be used with any soft-body simulation model, such as finite elements or mass-spring systems. • A control method for deformable bodies based on space-time optimization; the approach can be used to control the contraction of muscles in a muscle simulation...

  17. Computer Modelling and Simulation for Inventory Control

    Directory of Open Access Journals (Sweden)

    G.K. Adegoke

    2012-07-01

    Full Text Available This study concerns the role of computer simulation as a device for conducting scientific experiments on inventory control. The stores function ties up a large share of physical assets and financial resources in a manufacturing outfit, so efficient inventory control is needed: inventory control reduces the cost of production and thereby facilitates the effective and efficient accomplishment of an organization's production objectives. Mathematical and statistical models were used to compute the Economic Order Quantity (EOQ). Test data were obtained from a manufacturing company and simulated. The results generated were used to predict a real-life situation and have been presented and discussed. The language of implementation for the three models is Turbo Pascal, owing to its capability, generality and flexibility as a scientific programming language.
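The EOQ referred to above follows the classic square-root formula, Q* = sqrt(2DS/H) for annual demand D, cost per order S and holding cost per unit per year H. A sketch in Python (rather than the study's Turbo Pascal), with illustrative figures:

```python
import math

def eoq(annual_demand, order_cost, holding_cost):
    """Classic Economic Order Quantity: the order size minimizing the sum
    of annual ordering cost (D/Q * S) and holding cost (Q/2 * H)."""
    return math.sqrt(2 * annual_demand * order_cost / holding_cost)

def total_annual_cost(q, annual_demand, order_cost, holding_cost):
    """Annual ordering cost plus annual holding cost at order size q."""
    return annual_demand / q * order_cost + q / 2 * holding_cost

# Illustrative figures, not the study's data:
q_star = eoq(annual_demand=1000, order_cost=10, holding_cost=0.5)
```

With these figures the optimum is 200 units per order, and the total cost curve is higher at any other order size.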

  18. Model parameters for simulation of physiological lipids

    Science.gov (United States)

    McGlinchey, Nicholas

    2016-01-01

    Coarse grain simulation of proteins in their physiological membrane environment can offer insight across timescales, but requires a comprehensive force field. Parameters are explored for multicomponent bilayers composed of unsaturated lipids DOPC and DOPE, mixed‐chain saturation POPC and POPE, and anionic lipids found in bacteria: POPG and cardiolipin. A nonbond representation obtained from multiscale force matching is adapted for these lipids and combined with an improved bonding description of cholesterol. Equilibrating the area per lipid yields robust bilayer simulations and properties for common lipid mixtures with the exception of pure DOPE, which has a known tendency to form nonlamellar phase. The models maintain consistency with an existing lipid–protein interaction model, making the force field of general utility for studying membrane proteins in physiologically representative bilayers. © 2016 The Authors. Journal of Computational Chemistry Published by Wiley Periodicals, Inc. PMID:26864972

  19. Cultivating a disease management partnership: a value-chain model.

    Science.gov (United States)

    Murray, Carolyn F; Monroe, Wendy; Stalder, Sharon A

    2003-01-01

    Disease management (DM) is one of the health care industry's more innovative value-chain models, whereby multiple relationships are created to bring complex and time-sensitive services to market. The very nature of comprehensive, seamless DM provided through an outsourced arrangement necessitates a level of cooperation, trust, and synergy that may be lacking from more traditional vendor-customer relationships. This discussion highlights the experience of one health plan and its vendor partner and their approach to the development and delivery of an outsourced heart failure (HF) DM program. The program design and rollout are discussed within principles adapted from the theoretical framework of a value-chain model. Within the value-chain model, added value is created by the convergence and synergistic integration of the partners' discrete strengths. Although each partner brings unique attributes to the relationship, those attributes are significantly enhanced by the value-chain model, thus allowing each party to bring the added value of the relationship to their respective customers. This partnership increases innovation, leverages critical capabilities, and improves market responsiveness. Implementing a comprehensive, outsourced DM program is no small task. DM programs incorporate a broad array of services affecting nearly every department in a health plan's organization. When true seamless integration between multiple organizations with multiple stakeholders is the objective, implementation and ongoing operations can become even more complex. To effectively address the complexities presented by an HF DM program, the parties in this case moved beyond a typical purchaser-vendor relationship to one that is more closely akin to a strategic partnership. This discussion highlights the development of this partnership from the perspective of both organizations, as revealed through contracting and implementation activities. 
It is intended to provide insight into the program

  20. Shanghai Stock Prices as Determined by the Present Value Model

    OpenAIRE

    Gregory C. Chow

    2003-01-01

    Derived from the present-value model of stock prices, our model implies that the log stock price is a linear function of expected log dividends and the expected rate of growth of dividends where expectations are formed adaptively. The model explains very well the prices of 47 stocks traded on the Shanghai Stock Exchange observed at the beginning of 1996, 1997, and 1998. The estimated parameters are remarkably similar to those reported for stocks traded on the Hong Kong Stock Exchange and the ...
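The functional form described — log price linear in expected log dividends and expected dividend growth, with expectations formed adaptively — can be sketched as follows. The coefficients and the adaptive-expectations gain are illustrative placeholders, not the paper's estimates.

```python
def adaptive_expectation(series, lam, e0):
    """Adaptive expectations: E_t = E_{t-1} + lam * (x_t - E_{t-1})."""
    e, out = e0, []
    for x in series:
        e = e + lam * (x - e)
        out.append(e)
    return out

def log_price(exp_log_div, exp_growth, alpha=0.0, beta=1.0, gamma=10.0):
    """Log stock price as a linear function of the expected log dividend
    and the expected dividend growth rate (coefficients illustrative)."""
    return alpha + beta * exp_log_div + gamma * exp_growth
```

With a constant observed log dividend, the adaptive expectation converges geometrically toward it, and the fitted log price moves with it.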

  1. Interactive Communication Systems Simulation Model (ICSSM) Extension.

    Science.gov (United States)

    1983-07-01

    ... IERC, SEPOCH, CARRI, CARRO). CHIP (DELTC, KDWVFM, NSAMP, IERC, SMPCI, SMPCQ, SEPOCH): generates samples of the SFSK chip-modulation waveform. MDULAT (IT, TQ, ..., IERC, SOTS, CARRI, CARRO, SMPCI, SMPCQ): modulation reference. RESTOR (IDXP, RP, SO, THQ, PHI, ...): modeling utility for storing working parameter values.

  2. NUMERICAL SIMULATION AND MODELING OF UNSTEADY FLOW ...

    African Journals Online (AJOL)

    2014-06-30

    Jun 30, 2014 ... Modeling is by definition an approximation of reality, so its results are ... The values of lift coefficient were improved after modifications of the .... of static pressure is defined boundary conditions at the origin of the variation of ...

  3. Theory, Modeling and Simulation Annual Report 2000

    Energy Technology Data Exchange (ETDEWEB)

    Dixon, David A.; Garrett, Bruce C.; Straatsma, Tp; Jones, Donald R.; Studham, Ronald S.; Harrison, Robert J.; Nichols, Jeffrey A.

    2001-11-01

    This annual report describes the 2000 research accomplishments for the Theory, Modeling, and Simulation (TM&S) directorate, one of the six research organizations in the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) at Pacific Northwest National Laboratory (PNNL). EMSL is a U.S. Department of Energy (DOE) national scientific user facility and is the centerpiece of the DOE commitment to providing world-class experimental, theoretical, and computational capabilities for solving the nation's environmental problems.

  5. Catalog of Wargaming and Military Simulation Models

    Science.gov (United States)

    1992-02-07

    PROPONENT: USAF ASD, McDonnell Douglas Corp. POINT OF CONTACT: Photon Research Associates (alias): Mr. Jeff Johnson, (619) 455-9741; McDonnell Douglas... POINT OF CONTACT: Dr. R. Johnson, (DSN) 295-1593 or (301) 295-1593. PURPOSE: The model provides simulation of airland activities in a theater of operations... training, and education. PROPONENT: J-8 Political Military Affairs Directorate. POINT OF CONTACT: LTC Steven G. Stainer. PURPOSE: RDSS is a system

  6. Fully Adaptive Radar Modeling and Simulation Development

    Science.gov (United States)

    2017-04-01

    Organization (NATO) Sensors Electronics Technology (SET)-227 Panel on Cognitive Radar. The FAR M&S architecture developed in Phase I allows for... the Air Force's previously developed radar M&S tools. This report is organized as follows: in Chapter 3, we provide an overview of the FAR framework... (AFRL-RY-WP-TR-2017-0074, "Fully Adaptive Radar Modeling and Simulation Development", Kristine L. Bell and Anthony Kellems, Metron, Inc.)

  7. Difficulties with True Interoperability in Modeling & Simulation

    Science.gov (United States)

    2011-12-01

    Standards in M&S cover multiple layers of technical abstraction. There are middleware specifications, such as the High Level Architecture (HLA; IEEE Std 1516-2010, "IEEE Standard for Modeling and Simulation (M&S) High Level Architecture (HLA) – Framework and Rules", IEEE Xplore Digital Library)... using different communication protocols being able to allow data...

  8. Recommendations on Model Fidelity for Wind Turbine Gearbox Simulations: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Y.; Keller, J.; La Cava, W.; Austin, J.; Nejad, A. R.; Halse, C.; Bastard, L.; Helsen, J.

    2015-01-01

    This work investigates the minimum level of fidelity required to accurately simulate wind turbine gearboxes using state-of-the-art design tools. Excessive model fidelity, including drivetrain complexity, gearbox complexity, excitation sources, and imperfections, significantly increases computational time but may not provide a commensurate increase in the value of the results. Essential design parameters are evaluated, including the planetary load-sharing factor, gear tooth load distribution, and sun orbit motion. Based on the sensitivity study results, recommendations for minimum model fidelities are provided.

  9. Modelling interplanetary CMEs using magnetohydrodynamic simulations

    Directory of Open Access Journals (Sweden)

    P. J. Cargill

    Full Text Available The dynamics of Interplanetary Coronal Mass Ejections (ICMEs are discussed from the viewpoint of numerical modelling. Hydrodynamic models are shown to give a good zero-order picture of the plasma properties of ICMEs, but they cannot model the important magnetic field effects. Results from MHD simulations are shown for a number of cases of interest. It is demonstrated that the strong interaction of the ICME with the solar wind leads to the ICME and solar wind velocities being close to each other at 1 AU, despite their having very different speeds near the Sun. It is also pointed out that this interaction leads to a distortion of the ICME geometry, making cylindrical symmetry a dubious assumption for the CME field at 1 AU. In the presence of a significant solar wind magnetic field, the magnetic fields of the ICME and solar wind can reconnect with each other, leading to an ICME that has solar wind-like field lines. This effect is especially important when an ICME with the right sense of rotation propagates down the heliospheric current sheet. It is also noted that a lack of knowledge of the coronal magnetic field makes such simulations of little use in space weather forecasts that require knowledge of the ICME magnetic field strength.

    Key words. Interplanetary physics (interplanetary magnetic fields) – Solar physics, astrophysics, and astronomy (flares and mass ejections) – Space plasma physics (numerical simulation studies)

  10. Interactive Modelling and Simulation of Human Motion

    DEFF Research Database (Denmark)

    Engell-Nørregård, Morten Pol

    Danish abstract: This Ph.D. thesis deals with the modelling and simulation of human motion. The topics in this thesis have at least two things in common. First, they deal with human motion. Although the developed models can also be used for other purposes... human joints, which exhibit both non-convexity and multiple degrees of freedom. • A general and versatile model for the activation of soft bodies. The model can be used as an animation tool, but is equally well suited for the simulation of human muscles, since it satisfies the fundamental physical principles... primary focus on modelling the human body. Second, they all deal with simulation as a tool for synthesizing motion and thereby creating animations. This is an important point, since it means that we are not only creating tools that animators can use to make funny...

  11. WiBro Mobility Simulation Model

    Directory of Open Access Journals (Sweden)

    Junaid Qayyum

    2011-09-01

    Full Text Available WiBro, or Wireless Broadband, is the newest variety of mobile wireless broadband access. WiBro technology is being developed by the Korean telecoms industry. It is based on the IEEE 802.16e (Mobile WiMAX) international standard. Korea-based fixed-line operators KT and SK Telecom were the first to be licensed by the South Korean government to provide WiBro commercially. Samsung demonstrated WiBro mobile phones and systems at the APEC IT Exhibition 2006. WiBro comprises two phases, WiBro Phase I and WiBro Phase II. Samsung Electronics has been contributing extensively to Korea's WiBro initiative as well as to the IEEE 802.16 standards. WiBro is a specific subset of the 802.16 standards, focusing on supporting full mobility of wireless access systems with an OFDMA PHY interface. In this work, we have developed a simulation model of the WiBro system consisting of a set of Base Stations and Mobile Subscriber Stations using the OPNET Modeler. The simulation model has been utilized to evaluate effective MAC-layer throughput, resource usage efficiency, QoS class differentiation, and system capacity and performance under various simulation scenarios.

  12. Progress in Modeling and Simulation of Batteries

    Energy Technology Data Exchange (ETDEWEB)

    Turner, John A [ORNL

    2016-01-01

    Modeling and simulation of batteries, in conjunction with theory and experiment, are important research tools that offer opportunities for advancement of technologies that are critical to electric motors. The development of data from the application of these tools can provide the basis for managerial and technical decision-making. Together, these will continue to transform batteries for electric vehicles. This collection of nine papers presents the modeling and simulation of batteries and the continuing contribution being made to this impressive progress, including topics that cover: * Thermal behavior and characteristics * Battery management system design and analysis * Moderately high-fidelity 3D capabilities * Optimization Techniques and Durability As electric vehicles continue to gain interest from manufacturers and consumers alike, improvements in economy and affordability, as well as adoption of alternative fuel sources to meet government mandates are driving battery research and development. Progress in modeling and simulation will continue to contribute to battery improvements that deliver increased power, energy storage, and durability to further enhance the appeal of electric vehicles.

  13. Simulating an Optimizing Model of Currency Substitution

    Directory of Open Access Journals (Sweden)

    Leonardo Leiderman

    1992-03-01

    Full Text Available This paper reports simulations based on the parameter estimates of an intertemporal model of currency substitution under nonexpected utility obtained by Bufman and Leiderman (1991). We first study the quantitative impact of changes in the degree of dollarization and in the elasticity of currency substitution on government seigniorage. We then examine whether the model can account for the comovement of consumption growth and assets' returns after the 1985 stabilization program, and in particular for the consumption boom of 1986-87. The results are generally encouraging for future applications of optimizing models of currency substitution to policy and practical issues.

  14. Multiple Regression and Mediator Variables can be used to Avoid Double Counting when Economic Values are Derived using Stochastic Herd Simulation

    DEFF Research Database (Denmark)

    Østergaard, Søren; Ettema, Jehan Frans; Hjortø, Line

    Multiple regression and model building with mediator variables were used to avoid double counting when economic values are estimated from data simulated with herd simulation modeling (using the SimHerd model). The simulated incidence of metritis was analyzed statistically as the independent...... variable, while using the traits representing the direct effects of metritis on yield, fertility and occurrence of other diseases as mediator variables. The economic value of metritis was estimated to be €78 per 100 cow-years for each 1% increase of metritis in the period of 1-100 days in milk...... in multiparous cows. The merit of using this approach was demonstrated, since the economic value of metritis was estimated to be 81% higher when no mediator variables were included in the multiple regression analysis...
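    The double-counting issue this record describes can be illustrated with a small regression sketch. All variable names, effect sizes, and the single-mediator structure below are hypothetical, not taken from the SimHerd study: a disease trait lowers profit directly and through a yield-loss mediator, and regressing profit on the disease alone conflates the two pathways.

```python
import random

rng = random.Random(0)
n = 500

# Hypothetical simulated herd data (all effect sizes invented): metritis
# incidence lowers profit directly and through a yield-loss mediator.
metritis = [rng.gauss(10.0, 2.0) for _ in range(n)]
yield_loss = [0.5 * m + rng.gauss(0.0, 1.0) for m in metritis]
profit = [-3.0 * m - 4.0 * yl + rng.gauss(0.0, 2.0)
          for m, yl in zip(metritis, yield_loss)]

def cov(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / (len(a) - 1)

def slope(x, y):
    """Simple OLS slope of y on x (intercept included)."""
    return cov(x, y) / cov(x, x)

def residuals(y, x):
    """Residuals of y after removing its linear dependence on x."""
    b = slope(x, y)
    a = sum(y) / len(y) - b * sum(x) / len(x)
    return [yi - a - b * xi for xi, yi in zip(x, y)]

# Total effect: the metritis coefficient without the mediator. It absorbs
# the yield-loss pathway as well, i.e. the double counting to be avoided.
total_effect = slope(metritis, profit)        # near -3 + (-4)(0.5) = -5

# Direct effect: the metritis coefficient with the mediator held fixed,
# obtained here via the Frisch-Waugh partialling-out identity.
direct_effect = slope(residuals(metritis, yield_loss),
                      residuals(profit, yield_loss))   # near -3
```

    As in the record, the coefficient without the mediator is markedly larger in magnitude than the direct effect alone.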

  15. The heuristic value of redundancy models of aging.

    Science.gov (United States)

    Boonekamp, Jelle J; Briga, Michael; Verhulst, Simon

    2015-11-01

    Molecular studies of aging aim to unravel the cause(s) of aging bottom-up, but linking these mechanisms to organismal-level processes remains a challenge. We propose that complementary top-down, data-directed modelling of organismal-level empirical findings may contribute to developing these links. To this end, we explore the heuristic value of redundancy models of aging to develop a deeper insight into the mechanisms causing variation in senescence and lifespan. We start by showing (i) how different redundancy model parameters affect projected aging and mortality, and (ii) how variation in redundancy model parameters relates to variation in parameters of the Gompertz equation. Lifestyle changes or medical interventions during life can modify mortality rate, and we investigate (iii) how interventions that change specific redundancy parameters within the model affect subsequent mortality and actuarial senescence. Lastly, as an example of data-directed modelling and the insights that can be gained from this, (iv) we fit a redundancy model to mortality patterns observed by Mair et al. (2003; Science 301: 1731-1733) in Drosophila that were subjected to dietary restriction and temperature manipulations. Mair et al. found that dietary restriction instantaneously reduced mortality rate without affecting aging, while temperature manipulations had more transient effects on mortality rate and did affect aging. We show that after adjusting model parameters the redundancy model describes both effects well, and a comparison of the parameter values yields a deeper insight into the mechanisms causing these contrasting effects. We see replacement of the redundancy model parameters by more detailed sub-models of these parameters as a next step in linking demographic patterns to underlying molecular mechanisms.
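    A minimal redundancy model of the kind referred to above can be sketched as a parallel block of identical elements with a constant element failure rate. This is an illustrative toy model, not the authors' fitted model, and the parameter values are arbitrary; it reproduces two signature features of redundancy models: mortality that accelerates with age while redundancy is being exhausted, and a late-life plateau at the single-element failure rate.

```python
import math

def redundancy_hazard(t, n, k):
    """Mortality rate of a parallel block of n redundant elements, each
    failing independently at constant rate k (toy redundancy model)."""
    q = 1.0 - math.exp(-k * t)                        # P(one element failed)
    survival = 1.0 - q ** n                           # block still functional
    density = n * k * math.exp(-k * t) * q ** (n - 1)
    return density / survival

# Mortality accelerates with age while redundancy is being exhausted...
young, middle, old = (redundancy_hazard(t, n=5, k=0.01) for t in (10, 100, 500))

# ...and plateaus at the single-element rate k once redundancy is gone,
# the late-life mortality deceleration these models are known for.
plateau = redundancy_hazard(2000.0, n=5, k=0.01)
```

    Varying n and k shifts the projected hazard curve in the way the review's point (i) describes: more redundancy steepens the apparent aging rate, while the element failure rate sets the plateau.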

  16. Characteristic Finite Difference Methods for Moving Boundary Value Problem of Numerical Simulation of Oil Deposit

    Institute of Scientific and Technical Information of China (English)

    袁益让

    1994-01-01

    The software for oil-gas transport and accumulation describes the history of oil-gas transport and accumulation during basin evolution. It is of great value in the rational evaluation of prospecting and exploiting oil-gas resources. The mathematical model can be described as a coupled system of nonlinear partial differential equations with a moving boundary value problem. This paper puts forward a class of characteristic finite difference schemes and derives from them optimal-order estimates in the l^2 norm for the error in the approximate solutions. The research is important both theoretically and practically for model analysis in the field, for numerical methods and for software development.

  17. Value stream mapping and simulation for implementation of lean manufacturing practices in a footwear company

    Directory of Open Access Journals (Sweden)

    Danilo Felipe Silva de Lima

    2016-03-01

    Full Text Available The development of the Value Stream Mapping (VSM) is generally the first step in the implementation of Lean Manufacturing (LM). The aim of this paper is to present an application of VSM with simulation in order to analyze the impacts of LM adoption on the performance of a footwear plant. A VSM was designed for the current state and, through the implementation of lean elements, a future state could be designed. Different scenarios were simulated for the future-state implementation and the results were compared with each other. The transfer, cutting and assembly sections were chosen to be simulated, because it was considered that it would be possible to establish a one-piece flow between those processes. After the simulation, the scenario that presented the best results provided a 19% productivity increase over the current state, as well as improvement in all other process variables. The application of simulation as an additional element of VSM has helped to identify the advantages of the joint approach, since it enables testing different alternatives and better defining the future state and its implementation strategies.

  18. Computer Models Simulate Fine Particle Dispersion

    Science.gov (United States)

    2010-01-01

    Through a NASA Seed Fund partnership with DEM Solutions Inc., of Lebanon, New Hampshire, scientists at Kennedy Space Center refined existing software to study the electrostatic phenomena of granular and bulk materials as they apply to planetary surfaces. The software, EDEM, allows users to import particles and obtain accurate representations of their shapes for modeling purposes, such as simulating bulk solids behavior, and was enhanced to be able to more accurately model fine, abrasive, cohesive particles. These new EDEM capabilities can be applied in many industries unrelated to space exploration and have been adopted by several prominent U.S. companies, including John Deere, Pfizer, and Procter & Gamble.

  19. Consolidation modelling for thermoplastic composites forming simulation

    Science.gov (United States)

    Xiong, H.; Rusanov, A.; Hamila, N.; Boisse, P.

    2016-10-01

    Pre-impregnated thermoplastic composites are widely used in the aerospace industry for their excellent mechanical properties. Thermoforming of thermoplastic prepregs is a fast manufacturing process, and the automotive industry has shown increasing interest in this manufacturing process, in which reconsolidation is an essential stage. The intimate-contact model is investigated as the consolidation model; compression experiments were carried out to identify the material parameters, and several numerical tests show the influence of the temperature and pressure applied during processing. Finally, a new solid-shell prismatic element is presented for the simulation of the consolidation step in the thermoplastic composite forming process.

  20. Modeling and simulation of reactive flows

    CERN Document Server

    Bortoli, De AL; Pereira, Felipe

    2015-01-01

    Modelling and Simulation of Reactive Flows presents information on modeling and how to numerically solve reactive flows. The book offers a distinctive approach that combines diffusion flames and geochemical flow problems, providing users with a comprehensive resource that bridges the gap for scientists, engineers, and the industry. Specifically, the book looks at the basic concepts related to reaction rates, chemical kinetics, and the development of reduced kinetic mechanisms. It considers the most common methods used in practical situations, along with equations for reactive flows, and va

  1. Solar Electric Bicycle Body Modeling and Simulation

    Directory of Open Access Journals (Sweden)

    Zhikun Wang

    2013-10-01

    Full Text Available A new solar electric bicycle is designed and studied in this paper. CAD technology is applied to establish a three-dimensional geometric model, and kinetic analysis of the frame and other parts is used for numerical simulation and static strength analysis of the vehicle model design. After virtual assembly, complete frame dynamics and vibration analyses are carried out. Taking other factors into account, the frame structure is first improved, safety-oriented design calculations are then analyzed and compared, and finally an ideal body design is obtained.

  2. Viscoelastic flow simulations in model porous media

    Science.gov (United States)

    De, S.; Kuipers, J. A. M.; Peters, E. A. J. F.; Padding, J. T.

    2017-05-01

    We investigate the flow of an unsteady three-dimensional viscoelastic fluid through an array of symmetric and asymmetric sets of cylinders constituting a model porous medium. The simulations are performed using a finite-volume methodology with a staggered grid. The solid-fluid interfaces of the porous structure are modeled using a second-order immersed boundary method [S. De et al., J. Non-Newtonian Fluid Mech. 232, 67 (2016), 10.1016/j.jnnfm.2016.04.002]. A finitely extensible nonlinear elastic constitutive model with Peterlin closure is used to model the viscoelastic part. By means of periodic boundary conditions, we model the flow behavior for a Newtonian as well as a viscoelastic fluid through successive contractions and expansions. We observe the presence of counterrotating vortices in the dead ends of our geometry. The simulations provide detailed insight into how flow structure, viscoelastic stresses, and viscoelastic work change with increasing Deborah number De. We observe completely different flow structures and different distributions of the viscoelastic work at high De in the symmetric and asymmetric configurations, even though they have the exact same porosity. Moreover, we find that even for the symmetric contraction-expansion flow, most energy dissipation is occurring in shear-dominated regions of the flow domain, not in extensional-flow-dominated regions.

  3. A P-value model for theoretical power analysis and its applications in multiple testing procedures

    Directory of Open Access Journals (Sweden)

    Fengqing Zhang

    2016-10-01

    Full Text Available Abstract Background Power analysis is a critical aspect of the design of experiments to detect an effect of a given size. When multiple hypotheses are tested simultaneously, multiplicity adjustments to p-values should be taken into account in power analysis. There are a limited number of studies on power analysis in multiple testing procedures. For some methods, the theoretical analysis is difficult and extensive numerical simulations are often needed, while other methods oversimplify the information under the alternative hypothesis. To this end, this paper aims to develop a new statistical model for power analysis in multiple testing procedures. Methods We propose a step-function-based p-value model under the alternative hypothesis, which is simple enough to perform power analysis without simulations, but not so simple as to lose the information from the alternative hypothesis. The first step is to transform distributions of different test statistics (e.g., t, chi-square or F) to distributions of corresponding p-values. We then use a step function to approximate each of the p-value distributions by matching the mean and variance. Lastly, the step-function-based p-value model can be used for theoretical power analysis. Results The proposed model is applied to problems in multiple testing procedures. We first show how the most powerful critical constants can be chosen using the step-function-based p-value model. Our model is then applied to the field of multiple testing procedures to explain the assumption of monotonicity of the critical constants. Lastly, we apply our model to a behavioral weight loss and maintenance study to select the optimal critical constants. Conclusions The proposed model is easy to implement and preserves the information from the alternative hypothesis.
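    For a concrete, deliberately simplified illustration of the idea, consider a one-sided z-test with standardized effect delta: the p-value's distribution under the alternative has the closed-form CDF F(p) = Phi(delta - Phi^{-1}(1 - p)), and a crude step-function density can be matched to it. The two-piece matching below (matching the probability mass below a cut point, rather than the paper's mean/variance matching) is a hypothetical stand-in for the method:

```python
from statistics import NormalDist

norm = NormalDist()

def pvalue_cdf(p, delta):
    """CDF of the one-sided z-test p-value under an alternative with
    standardized effect delta: F(p) = Phi(delta - Phi^{-1}(1 - p)).
    Evaluated at p = alpha, this is exactly the test's power."""
    return norm.cdf(delta - norm.inv_cdf(1.0 - p))

def two_piece_density(delta, cut):
    """Crude two-level step-function density for the p-value: constant on
    [0, cut) and on [cut, 1], matched so the probability mass below the
    cut equals the exact CDF there (a simplified stand-in for the paper's
    mean/variance matching)."""
    mass = pvalue_cdf(cut, delta)
    return mass / cut, (1.0 - mass) / (1.0 - cut)

delta, alpha = 2.5, 0.05
power = pvalue_cdf(alpha, delta)              # exact power at level alpha
c_low, c_high = two_piece_density(delta, cut=alpha)
# By construction, integrating the step density over [0, alpha]
# recovers the exact power: c_low * alpha equals power.
```

    The appeal of such a model is exactly what the abstract claims: once the step heights are fixed, power calculations reduce to arithmetic on rectangles, with no simulation required.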

  4. LISP based simulation generators for modeling complex space processes

    Science.gov (United States)

    Tseng, Fan T.; Schroer, Bernard J.; Dwan, Wen-Shing

    1987-01-01

    The development of a simulation assistant for modeling discrete event processes is presented. Included are an overview of the system, a description of the simulation generators, and a sample process generated using the simulation assistant.

  5. Biomedical Simulation Models of Human Auditory Processes

    Science.gov (United States)

    Bicak, Mehmet M. A.

    2012-01-01

    Detailed acoustic engineering models are developed that explore noise propagation mechanisms associated with noise attenuation and transmission paths created when using hearing protectors such as earplugs and headsets in high-noise environments. Biomedical finite element (FE) models are developed based on volume Computed Tomography scan data, which provide explicit external ear, ear canal, middle ear ossicular bone and cochlea geometry. Results from these studies have enabled a greater understanding of hearing-protector-to-flesh dynamics as well as prioritizing noise propagation mechanisms. Prioritization of noise mechanisms can form an essential framework for the exploration of new design principles and methods in both earplug and earcup applications. These models are currently being used in the development of a novel hearing protection evaluation system that can provide experimentally correlated psychoacoustic noise attenuation. Moreover, these FE models can be used to simulate the effects of blast-related impulse noise on human auditory mechanisms and brain tissue.

  6. Simulation Model of Brushless Excitation System

    Directory of Open Access Journals (Sweden)

    Ahmed N.A.  Alla

    2007-01-01

    Full Text Available The excitation system is a key element in the dynamic performance of electric power systems; accurate excitation models are of great importance in simulating and investigating power system transient phenomena. Parameter identification of a brushless excitation system is presented. First, a block diagram for the excitation system (EXS) parameters was proposed based on the documents and maps in the power station. To identify the parameters of this model, a test procedure to obtain the step response was presented. Using a genetic algorithm with the Matlab software, it was possible to identify all the necessary parameters of the model. Using the same measured input signals, the response from the standard model showed nearly the same behavior as the excitation system.

  7. Modeling and simulation of direct contact evaporators

    Directory of Open Access Journals (Sweden)

    F.B. Campos

    2001-09-01

    Full Text Available A dynamic model of a direct contact evaporator was developed and coupled to a recently developed superheated bubble model. The latter model takes into account heat and mass transfer during the bubble formation and ascension stages and is able to predict gas holdup in nonisothermal systems. The results of the coupled model, which does not have any adjustable parameter, were compared with experimental data. The transient behavior of the liquid-phase temperature and the vaporization rate under quasi-steady-state conditions were in very good agreement with experimental data. The transient behavior of liquid height was only reasonably simulated. In order to explain this partial disagreement, some possible causes were analyzed.

  8. Efficient Turbulence Modeling for CFD Wake Simulations

    DEFF Research Database (Denmark)

    van der Laan, Paul

    , that can accurately and efficiently simulate wind turbine wakes. The linear k-ε eddy viscosity model (EVM) is a popular turbulence model in RANS; however, it underpredicts the velocity wake deficit and cannot predict the anisotropic Reynolds-stresses in the wake. In the current work, nonlinear eddy...... viscosity models (NLEVM) are applied to wind turbine wakes. NLEVMs can model anisotropic turbulence through a nonlinear stress-strain relation, and they can improve the velocity deficit by the use of a variable eddy viscosity coefficient, that delays the wake recovery. Unfortunately, all tested NLEVMs show...... numerically unstable behavior for fine grids, which inhibits a grid dependency study for numerical verification. Therefore, a simpler EVM is proposed, labeled as the k-ε - fp EVM, that has a linear stress-strain relation, but still has a variable eddy viscosity coefficient. The k-ε - fp EVM is numerically...

  9. Computing Critical Values of Exact Tests by Incorporating Monte Carlo Simulations Combined with Statistical Tables.

    Science.gov (United States)

    Vexler, Albert; Kim, Young Min; Yu, Jihnhee; Lazar, Nicole A; Hutson, Aland

    2014-12-01

    Various exact tests for statistical inference are available for powerful and accurate decision rules provided that corresponding critical values are tabulated or evaluated via Monte Carlo methods. This article introduces a novel hybrid method for computing p-values of exact tests by combining Monte Carlo simulations and statistical tables generated a priori. To use the data from Monte Carlo generations and tabulated critical values jointly, we employ kernel density estimation within Bayesian-type procedures. The p-values are linked to the posterior means of quantiles. In this framework, we present relevant information from the Monte Carlo experiments via likelihood-type functions, whereas tabulated critical values are used to reflect prior distributions. The local maximum likelihood technique is employed to compute functional forms of prior distributions from statistical tables. Empirical likelihood functions are proposed to replace parametric likelihood functions within the structure of the posterior mean calculations to provide a Bayesian-type procedure with a distribution-free set of assumptions. We derive the asymptotic properties of the proposed nonparametric posterior means of quantiles process. Using the theoretical propositions, we calculate the minimum number of needed Monte Carlo resamples for desired level of accuracy on the basis of distances between actual data characteristics (e.g. sample sizes) and characteristics of data used to present corresponding critical values in a table. The proposed approach makes practical applications of exact tests simple and rapid. Implementations of the proposed technique are easily carried out via the recently developed STATA and R statistical packages.
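    The tabulation step that the hybrid method builds on can be sketched as a plain Monte Carlo quantile estimate. This is illustrative only: the statistic, sample size, and simulation count below are hypothetical, and the kernel-smoothing and Bayesian machinery of the paper is omitted.

```python
import random

def mc_critical_value(stat_under_null, alpha=0.05, n_sim=20000, seed=1):
    """Monte Carlo estimate of the upper critical value of a test
    statistic: simulate it under H0 and take the empirical (1 - alpha)
    quantile -- the tabulation step the hybrid method starts from."""
    rng = random.Random(seed)
    sims = sorted(stat_under_null(rng) for _ in range(n_sim))
    return sims[int((1.0 - alpha) * n_sim)]

# Hypothetical statistic: the mean of 25 standard normals. Its exact null
# distribution is N(0, 1/25), so the true 95% critical value is 1.645/5.
def sample_mean_stat(rng):
    return sum(rng.gauss(0.0, 1.0) for _ in range(25)) / 25.0

crit = mc_critical_value(sample_mean_stat)
```

    The paper's contribution is in combining such Monte Carlo draws with previously tabulated critical values, so that far fewer resamples are needed for a given accuracy than this naive quantile estimate requires.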

  10. Integrating Visualizations into Modeling NEST Simulations.

    Science.gov (United States)

    Nowke, Christian; Zielasko, Daniel; Weyers, Benjamin; Peyser, Alexander; Hentschel, Bernd; Kuhlen, Torsten W

    2015-01-01

    Modeling large-scale spiking neural networks showing realistic biological behavior in their dynamics is a complex and tedious task. Since these networks consist of millions of interconnected neurons, their simulation produces an immense amount of data. In recent years it has become possible to simulate even larger networks. However, solutions to assist researchers in understanding the simulation's complex emergent behavior by means of visualization are still lacking. While developing tools to partially fill this gap, we encountered the challenge to integrate these tools easily into the neuroscientists' daily workflow. To understand what makes this so challenging, we looked into the workflows of our collaborators and analyzed how they use the visualizations to solve their daily problems. We identified two major issues: first, the analysis process can rapidly change focus which requires to switch the visualization tool that assists in the current problem domain. Second, because of the heterogeneous data that results from simulations, researchers want to relate data to investigate these effectively. Since a monolithic application model, processing and visualizing all data modalities and reflecting all combinations of possible workflows in a holistic way, is most likely impossible to develop and to maintain, a software architecture that offers specialized visualization tools that run simultaneously and can be linked together to reflect the current workflow, is a more feasible approach. To this end, we have developed a software architecture that allows neuroscientists to integrate visualization tools more closely into the modeling tasks. In addition, it forms the basis for semantic linking of different visualizations to reflect the current workflow. In this paper, we present this architecture and substantiate the usefulness of our approach by common use cases we encountered in our collaborative work.

  11. Integrating Visualizations into Modeling NEST Simulations

    Directory of Open Access Journals (Sweden)

    Christian eNowke

    2015-12-01

    Full Text Available Modeling large-scale spiking neural networks showing realistic biological behavior in their dynamics is a complex and tedious task. Since these networks consist of millions of interconnected neurons, their simulation produces an immense amount of data. In recent years it has become possible to simulate even larger networks. However, solutions to assist researchers in understanding the simulation's complex emergent behavior by means of visualization are still lacking. While developing tools to partially fill this gap, we encountered the challenge to integrate these tools easily into the neuroscientists' daily workflow. To understand what makes this so challenging, we looked into the workflows of our collaborators and analyzed how they use the visualizations to solve their daily problems. We identified two major issues: first, the analysis process can rapidly change focus which requires to switch the visualization tool that assists in the current problem domain. Second, because of the heterogeneous data that results from simulations, researchers want to relate data to investigate these effectively. Since a monolithic application model, processing and visualizing all data modalities and reflecting all combinations of possible workflows in a holistic way, is most likely impossible to develop and to maintain, a software architecture that offers specialized visualization tools that run simultaneously and can be linked together to reflect the current workflow, is a more feasible approach. To this end, we have developed a software architecture that allows neuroscientists to integrate visualization tools more closely into the modeling tasks. In addition, it forms the basis for semantic linking of different visualizations to reflect the current workflow. In this paper, we present this architecture and substantiate the usefulness of our approach by common use cases we encountered in our collaborative work.

  12. Assessing the value of increased model resolution in forecasting fire danger

    Science.gov (United States)

    Jeanne Hoadley; Miriam Rorig; Ken Westrick; Larry Bradshaw; Sue Ferguson; Scott Goodrick; Paul Werth

    2003-01-01

    The fire season of 2000 was used as a case study to assess the value of increasing mesoscale model resolution for fire weather and fire danger forecasting. With a domain centered on Western Montana and Northern Idaho, MM5 simulations were run at 36, 12, and 4-km resolutions for a 30 day period at the height of the fire season. Verification analyses for meteorological...

  13. The Effects of Use of Average Instead of Daily Weather Data in Crop Growth Simulation Models

    NARCIS (Netherlands)

    Nonhebel, Sanderine

    1994-01-01

    Development and use of crop growth simulation models has increased in the last decades. Most crop growth models require daily weather data as input values. These data are not easy to obtain and therefore in many studies daily data are generated, or average values are used as input data for these

  14. Simulation modeling for microbial risk assessment.

    Science.gov (United States)

    Cassin, M H; Paoli, G M; Lammerding, A M

    1998-11-01

    Quantitative microbial risk assessment implies an estimation of the probability and impact of adverse health outcomes due to microbial hazards. In the case of food safety, the probability of human illness is a complex function of the variability of many parameters that influence the microbial environment, from the production to the consumption of a food. The analytical integration required to estimate the probability of foodborne illness is intractable in all but the simplest of models. Monte Carlo simulation is an alternative to computing analytical solutions. In some cases, a risk assessment may be commissioned to serve a larger purpose than simply the estimation of risk. A Monte Carlo simulation can provide insights into complex processes that are invaluable, and otherwise unavailable, to those charged with the task of risk management. Using examples from a farm-to-fork model of the fate of Escherichia coli O157:H7 in ground beef hamburgers, this paper describes specifically how such goals as research prioritization, risk-based characterization of control points, and risk-based comparison of intervention strategies can be objectively achieved using Monte Carlo simulation.
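    The kind of farm-to-fork calculation described above can be sketched as a toy Monte Carlo. Every distribution and parameter value below is hypothetical, not taken from the E. coli O157:H7 model: sample a contamination level and a cooking log-reduction per serving, convert to an ingested dose, and average an exponential dose-response over servings.

```python
import math
import random

def simulate_illness_prob(n_servings=50000, seed=42):
    """Toy farm-to-fork Monte Carlo (all distributions and parameters
    hypothetical): initial contamination -> cooking log-reduction ->
    exponential dose-response, averaged over simulated servings."""
    rng = random.Random(seed)
    r = 0.001                                    # dose-response parameter
    total = 0.0
    for _ in range(n_servings):
        log10_cfu = rng.gauss(1.0, 1.0)          # log10 organisms, raw serving
        log10_kill = rng.uniform(3.0, 6.0)       # log-reduction from cooking
        dose = 10.0 ** (log10_cfu - log10_kill)  # surviving organisms ingested
        total += 1.0 - math.exp(-r * dose)       # P(illness | dose)
    return total / n_servings

risk = simulate_illness_prob()   # mean per-serving probability of illness
```

    Because each stage is an explicit sampling step, intervention strategies can be compared by changing one distribution at a time and re-running the simulation, which is exactly the risk-management use the paper highlights.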

  15. Application of simulation models for the optimization of business processes

    Science.gov (United States)

    Jašek, Roman; Sedláček, Michal; Chramcov, Bronislav; Dvořák, Jiří

    2016-06-01

    The paper deals with the application of modeling and simulation tools to the optimization of business processes, especially to optimizing signal flow in a security company. Simul8 was selected as the modeling tool; it supports process modeling based on discrete-event simulation and enables the creation of visual models of production and distribution processes.

  16. Raytracing simulations of coupled dark energy models

    CERN Document Server

    Pace, Francesco; Moscardini, Lauro; Bacon, David; Crittenden, Robert

    2014-01-01

    Dark matter and dark energy are usually assumed to be independent, coupling only gravitationally. An extension to this simple picture is to model dark energy as a scalar field which is directly coupled to the cold dark matter fluid. Such a non-trivial coupling in the dark sector leads to a fifth force and a time-dependent dark matter particle mass. In this work we examine the impact that dark energy-dark matter couplings have on weak lensing statistics by constructing realistic simulated weak-lensing maps using raytracing techniques through a suite of N-body cosmological simulations. We construct maps for an array of different lensing quantities, covering a range of scales from a few arcminutes to several degrees. The concordance $\\Lambda$CDM model is compared to different coupled dark energy models, described either by an exponential scalar field potential (standard coupled dark energy scenario) or by a SUGRA potential (bouncing model). We analyse several statistical quantities, in particular the power spect...

  17. Closed loop models for analyzing engineering requirements for simulators

    Science.gov (United States)

    Baron, S.; Muralidharan, R.; Kleinman, D.

    1980-01-01

    A closed-loop analytic model, incorporating a model of the human pilot (namely, the optimal control model), was developed to allow certain simulation design tradeoffs to be evaluated quantitatively. This model was applied to a realistic flight control problem. The resulting model is used to analyze both overall simulation effects and the effects of individual elements. The results show that, compared to an ideal continuous simulation, a discrete simulation can result in significant performance and/or workload penalties.

  18. A holistic model for Islamic accountants and its value added

    OpenAIRE

    El-Halaby, Sherif; Hussainey, Khaled

    2015-01-01

    Purpose – The core objective of this study is to introduce a holistic model for Islamic accountants by exploring the perspectives of Muslim scholars, Islamic sharia and AAOIFI ethical standards. The study also contributes to the existing literature by exploring the main added value of the Muslim accountant to stakeholders through investigating the main roles of Islamic accountants. Design/methodology/approach – The paper critically reviews historical debates about Islamic accounting and t...

  19. Induction generator models in dynamic simulation tools

    DEFF Research Database (Denmark)

    Knudsen, Hans; Akhmatov, Vladislav

    1999-01-01

    For AC networks with a large amount of induction generators (windmills), the paper demonstrates a significant discrepancy in the simulated voltage recovery after faults in weak networks when comparing dynamic and transient stability descriptions, and the reasons for the discrepancies are explained. It is found to be possible to include a transient model in dynamic stability tools and then obtain correct results in dynamic tools as well. The representation of the rotating system influences the voltage recovery shape, which is an important observation in the case of windmills, where a heavy mill is connected...

  20. Galaxy alignments: Theory, modelling and simulations

    CERN Document Server

    Kiessling, Alina; Joachimi, Benjamin; Kirk, Donnacha; Kitching, Thomas D; Leonard, Adrienne; Mandelbaum, Rachel; Schäfer, Björn Malte; Sifón, Cristóbal; Brown, Michael L; Rassat, Anais

    2015-01-01

    The shapes of galaxies are not randomly oriented on the sky. During the galaxy formation and evolution process, environment has a strong influence, as tidal gravitational fields in large-scale structure tend to align the shapes and angular momenta of nearby galaxies. Additionally, events such as galaxy mergers affect the relative alignments of galaxies throughout their history. These "intrinsic galaxy alignments" are known to exist, but are still poorly understood. This review will offer a pedagogical introduction to the current theories that describe intrinsic galaxy alignments, including the apparent difference in intrinsic alignment between early- and late-type galaxies and the latest efforts to model them analytically. It will then describe the ongoing efforts to simulate intrinsic alignments using both $N$-body and hydrodynamic simulations. Due to the relative youth of this field, there is still much to be done to understand intrinsic galaxy alignments and this review summarises the current state of the ...

  1. A Simulation Model for Component Commonality

    Institute of Scientific and Technical Information of China (English)

    ZHU Xiao-chi; ZHANG Zi-gang

    2002-01-01

    Component commonality has been cited as a powerful approach for manufacturers to cope with increased component proliferation and to control inventory costs. To fully realize its potential benefits, one needs a clear understanding of its impacts on the system. In this paper, the feasibility of using a simulation model to provide a systematic perspective for manufacturing firms implementing a commonality strategy is demonstrated. Alternative commonality strategies, including the stage at which commonality is employed and the allocation policies, are simulated. Several interesting results on the effects of commonality, allocation policies, and optimal solutions are obtained. We then summarize qualitative insights and managerial implications for component commonality design and implementation, and for inventory management in a general multi-stage assembly system.

  2. Assumed PDF modeling in rocket combustor simulations

    Science.gov (United States)

    Lempke, M.; Gerlinger, P.; Aigner, M.

    2013-03-01

    In order to account for the interaction between turbulence and chemistry, a multivariate assumed PDF (Probability Density Function) approach is used to simulate a model rocket combustor with finite-rate chemistry. The reported test case is the PennState preburner combustor with a single shear coaxial injector. Experimental data for the wall heat flux are available for this configuration. Unsteady RANS (Reynolds-averaged Navier-Stokes) simulation results with and without the assumed PDF approach are analyzed and compared with the experimental data. Both calculations show good agreement with the experimental wall heat flux data. Significant changes due to the utilization of the assumed PDF approach can be observed in the radicals, e.g., the OH mass fraction distribution, while the effect on the wall heat flux is insignificant.

  3. The Deficit Model and the Forgotten Moral Values

    Directory of Open Access Journals (Sweden)

    Marko Ahteensuu

    2011-03-01

    Full Text Available This paper was presented at the first meeting of the NSU study group “Conceptions of ethical and social values in post-secular society: Towards a new ethical imagination in a cosmopolitan world society”, held on January 28-30, 2011 at Copenhagen Business School. The deficit model explains the general public’s negative attitudes towards science and/or certain scientific applications with the public’s scientific ignorance. The deficit model is commonly criticized for oversimplifying the connection between scientific knowledge and attitudes. Other relevant factors – such as ideology, social identity, trust, culture, and worldviews – should be taken into consideration to a greater extent. We argue that explanations based on the proposed factors sometimes implicitly reintroduce the deficit model type of thinking. The strength of the factors is that they broaden the explanations to concern moral issues. We analyse two central argument types of GMO discussion, and show the central role of moral values in them. Thus, as long as arguments are seen to affect the attitudes of the general public, the role of moral values should be made explicit in the explanations concerning their attitudes.

  4. AskIT Service Desk Support Value Model

    Energy Technology Data Exchange (ETDEWEB)

    Ashcraft, Phillip Lynn [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Cummings, Susan M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Fogle, Blythe G. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Valdez, Christopher D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-08-07

    The value model discussed herein provides an accurate and simple calculation of the funding required to adequately staff the AskIT Service Desk (SD).  The model is incremental – only technical labor cost is considered.  All other costs, such as management, equipment, buildings, HVAC, and training are considered common elements of providing any labor related IT Service. Depending on the amount of productivity loss and the number of hours the defect was unresolved, the value of resolving work from the SD is unquestionably an economic winner; the average cost of $16 per SD resolution can commonly translate to cost avoidance exceeding well over $100. Attempting to extract too much from the SD will likely create a significant downside. The analysis used to develop the value model indicates that the utilization of the SD is very high (approximately 90%).  As a benchmark, consider a comment from a manager at Vitalyst (a commercial IT service desk) that their utilization target is approximately 60%.  While high SD utilization is impressive, over the long term it is likely to cause unwanted consequences to staff such as higher turnover, illness, or burnout.  A better solution is to staff the SD so that analysts have time to improve skills through training, develop knowledge, improve processes, collaborate with peers, and improve customer relationship skills.
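
The incremental comparison above reduces to a one-line calculation. In the sketch below only the $16-per-resolution figure comes from the record; the productivity-loss rate and hours-unresolved values are purely illustrative assumptions:

```python
# Hypothetical inputs: only cost_per_resolution comes from the record above.
cost_per_resolution = 16.0          # technical labor cost per SD resolution ($)
loss_per_hour = 35.0                # assumed productivity loss while unresolved ($/h)
hours_unresolved_without_sd = 4.0   # assumed mean time a defect would linger

cost_avoidance = loss_per_hour * hours_unresolved_without_sd
net_value = cost_avoidance - cost_per_resolution
print(f"cost avoidance per ticket: ${cost_avoidance:.0f}")
print(f"net value per resolution:  ${net_value:.0f}")
```

With these assumed inputs the net value per resolution stays comfortably positive, consistent with the record's claim that cost avoidance commonly exceeds $100.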

  5. Simulation modeling of outcomes and cost effectiveness.

    Science.gov (United States)

    Ramsey, S D; McIntosh, M; Etzioni, R; Urban, N

    2000-08-01

    Modeling will continue to be used to address important issues in clinical practice and health policy issues that have not been adequately studied with high-quality clinical trials. The apparent ad hoc nature of models belies the methodologic rigor that is applied to create the best models in cancer prevention and care. Models have progressed from simple decision trees to extremely complex microsimulation analyses, yet all are built using a logical process based on objective evaluation of the path between intervention and outcome. The best modelers take great care to justify both the structure and content of the model and then test their assumptions using a comprehensive process of sensitivity analysis and model validation. Like clinical trials, models sometimes produce results that are later found to be invalid as other data become available. When weighing the value of models in health care decision making, it is reasonable to consider the alternatives. In the absence of data, clinical policy decisions are often based on the recommendations of expert opinion panels or on poorly defined notions of the standard of care or medical necessity. Because such decision making rarely entails the rigorous process of data collection, synthesis, and testing that is the core of well-conducted modeling, it is usually not possible for external audiences to examine the assumptions and data that were used to derive the decisions. One of the modeler's most challenging tasks is to make the structure and content of the model transparent to the intended audience. The purpose of this article is to clarify the process of modeling, so that readers of models are more knowledgeable about their uses, strengths, and limitations.

  6. Conceptual Model of Quantities, Units, Dimensions, and Values

    Science.gov (United States)

    Rouquette, Nicolas F.; DeKoenig, Hans-Peter; Burkhart, Roger; Espinoza, Huascar

    2011-01-01

    JPL collaborated with experts from industry and other organizations to develop a conceptual model of quantities, units, dimensions, and values based on the current work of the ISO 80000 committee revising the International System of Units & Quantities based on the International Vocabulary of Metrology (VIM). By providing support for ISO 80000 in SysML via the International Vocabulary of Metrology (VIM), this conceptual model provides, for the first time, a standard-based approach for addressing issues of unit coherence and dimensional analysis into the practice of systems engineering with SysML-based tools. This conceptual model provides support for two kinds of analyses specified in the International Vocabulary of Metrology (VIM): coherence of units as well as of systems of units, and dimension analysis of systems of quantities. To provide a solid and stable foundation, the model for defining quantities, units, dimensions, and values in SysML is explicitly based on the concepts defined in VIM. At the same time, the model library is designed in such a way that extensions to the ISQ (International System of Quantities) and SI Units (Système International d'Unités) can be represented, as well as any alternative systems of quantities and units. The model library can be used to support SysML user models in various ways. A simple approach is to define and document libraries of reusable systems of units and quantities for reuse across multiple projects, and to link units and quantity kinds from these libraries to Unit and QuantityKind stereotypes defined in SysML user models.

  7. Representation of Solar Capacity Value in the ReEDS Capacity Expansion Model

    Energy Technology Data Exchange (ETDEWEB)

    Sigrin, B.; Sullivan, P.; Ibanez, E.; Margolis, R.

    2014-03-01

    An important issue for electricity system operators is the estimation of renewables' capacity contributions to reliably meeting system demand, or their capacity value. While the capacity value of thermal generation can be estimated easily, assessment of wind and solar requires a more nuanced approach due to resource variability. Reliability-based methods, particularly assessment of the Effective Load-Carrying Capacity (ELCC), are considered the most robust and widely accepted techniques for addressing this resource variability. This report compares estimates of solar PV capacity value by the Regional Energy Deployment System (ReEDS) capacity expansion model against two sources. The first comparison is against values published by utilities or other entities for known electrical systems at existing solar penetration levels. The second comparison is against a time-series ELCC simulation tool for high renewable penetration scenarios in the Western Interconnection. Results from the ReEDS model are found to compare well with both sources, despite being resolved at a super-hourly temporal resolution. Two results are relevant for other capacity-based models that use a super-hourly resolution to model solar capacity value. First, solar capacity value should not be parameterized as a static value, but must decay with increasing penetration. This is because, for an afternoon-peaking system, as solar penetration increases the system's peak net load shifts to later in the day, when solar output is lower. Second, long-term planning models should determine system adequacy requirements in each time period in order to approximate loss-of-load-probability (LOLP) calculations. Within the ReEDS model we resolve these issues by using a capacity value estimate that varies by time-slice. Within each time period, the net load and the shadow price on ReEDS's planning reserve constraint signal the relative importance of additional firm capacity.
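
The first result above, capacity value decaying with penetration as the net-load peak shifts later in the day, can be illustrated with a toy calculation. The hourly load and solar shapes below are synthetic assumptions, not ReEDS data:

```python
import numpy as np

# Illustrative sketch (not the ReEDS formulation): approximate the marginal
# capacity value of solar PV as the reduction in peak *net* load per extra MW
# of installed solar. Load and solar shapes are synthetic assumptions.
hours = np.arange(24)
load = 800.0 + 200.0 * np.exp(-((hours - 15) ** 2) / 8.0)             # MW, afternoon peak
solar_shape = np.clip(np.sin(np.pi * (hours - 6) / 12.0), 0.0, None)  # output per MW

def capacity_value(installed_mw, increment=1.0):
    """Peak-net-load reduction per additional MW of solar (MW/MW)."""
    peak = lambda mw: float(np.max(load - mw * solar_shape))
    return (peak(installed_mw) - peak(installed_mw + increment)) / increment

for mw in (0, 100, 200, 300):
    print(f"{mw:3d} MW installed -> marginal capacity value {capacity_value(mw):.2f}")
```

As penetration grows, the peak net load moves to evening hours where solar output is low, so the marginal capacity value declines toward zero.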

  8. Traffic flow dynamics data, models and simulation

    CERN Document Server

    Treiber, Martin

    2013-01-01

    This textbook provides a comprehensive and instructive coverage of vehicular traffic flow dynamics and modeling. It makes this fascinating interdisciplinary topic, which to date was only documented in parts by specialized monographs, accessible to a broad readership. Numerous figures and problems with solutions help the reader to quickly understand and practice the presented concepts. This book is targeted at students of physics and traffic engineering and, more generally, also at students and professionals in computer science, mathematics, and interdisciplinary topics. It also offers material for project work in programming and simulation at college and university level. The main part, after presenting different categories of traffic data, is devoted to a mathematical description of the dynamics of traffic flow, covering macroscopic models which describe traffic in terms of density, as well as microscopic many-particle models in which each particle corresponds to a vehicle and its driver. Focus chapters on ...

  9. Biomechanics trends in modeling and simulation

    CERN Document Server

    Ogden, Ray

    2017-01-01

    The book presents a state-of-the-art overview of biomechanical and mechanobiological modeling and simulation of soft biological tissues. Seven well-known scientists working in that particular field discuss topics such as biomolecules, networks and cells as well as failure, multi-scale, agent-based, bio-chemo-mechanical and finite element models appropriate for computational analysis. Applications include arteries, the heart, vascular stents and valve implants as well as adipose, brain, collagenous and engineered tissues. The mechanics of the whole cell and sub-cellular components as well as the extracellular matrix structure and mechanotransduction are described. In particular, the formation and remodeling of stress fibers, cytoskeletal contractility, cell adhesion and the mechanical regulation of fibroblast migration in healing myocardial infarcts are discussed. The essential ingredients of continuum mechanics are provided. Constitutive models of fiber-reinforced materials with an emphasis on arterial walls ...

  10. Hierarchical Boltzmann simulations and model error estimation

    Science.gov (United States)

    Torrilhon, Manuel; Sarna, Neeraj

    2017-08-01

    A hierarchical simulation approach for Boltzmann's equation should provide a single numerical framework in which a coarse representation can be used to compute gas flows as accurately and efficiently as in computational fluid dynamics, while a subsequent refinement allows the result to be successively improved toward the full Boltzmann result. We use Hermite discretization, or moment equations, for the steady linearized Boltzmann equation as a proof-of-concept of such a framework. All representations of the hierarchy are rotationally invariant, and the numerical method is formulated on fully unstructured triangular and quadrilateral meshes using an implicit discontinuous Galerkin formulation. We demonstrate the performance of the numerical method on model problems, which in particular highlight the relevance of the stability of boundary conditions on curved domains. The hierarchical nature of the method also makes it possible to provide model error estimates by comparing subsequent representations. We present various model errors for a flow through a curved channel with obstacles.

  11. Modelling and Simulation for Major Incidents

    Directory of Open Access Journals (Sweden)

    Eleonora Pacciani

    2015-11-01

    Full Text Available In recent years, there has been a rise in Major Incidents with a big impact on citizens' health and society. Without the possibility of conducting live experiments when it comes to physical and/or toxic trauma, only an accurate in silico reconstruction allows us to identify organizational solutions with the best possible chance of success, in correlation with the limitations on available resources (e.g. medical teams, first responders, treatments, transport, and hospital availability) and with the variability of the characteristics of the event (e.g. type of incident, severity of the event, and type of lesions). Utilizing modelling and simulation techniques, a simplified mathematical model of the physiological evolution of patients involved in physical and toxic trauma incident scenarios has been developed and implemented. The model formalizes the dynamics, operating standards, and practices of the medical response and the main emergency services in the chain of emergency management during a Major Incident.

  12. Heinrich events modeled in transient glacial simulations

    Science.gov (United States)

    Ziemen, Florian; Kapsch, Marie; Mikolajewicz, Uwe

    2017-04-01

    Heinrich events are among the most prominent events of climate variability recorded in proxies across the northern hemisphere. They are the archetype of ice sheet — climate interactions on millennial time scales. Nevertheless, the exact mechanisms that cause Heinrich events are still under debate, and their climatic consequences are far from being fully understood. We address open questions by studying Heinrich events in a coupled ice sheet model (ISM) atmosphere-ocean-vegetation general circulation model (AOVGCM) framework, where this variability occurs as part of the model generated internal variability. The framework consists of a northern hemisphere setup of the modified Parallel Ice Sheet Model (mPISM) coupled to the global AOVGCM ECHAM5/MPIOM/LPJ. The simulations were performed fully coupled and with transient orbital and greenhouse gas forcing. They span from several millennia before the last glacial maximum into the deglaciation. To make these long simulations feasible, the atmosphere is accelerated by a factor of 10 relative to the other model components using a periodical-synchronous coupling technique. To disentangle effects of the Heinrich events and the deglaciation, we focus on the events occurring before the deglaciation. The modeled Heinrich events show a peak ice discharge of about 0.05 Sv and raise the sea level by 2.3 m on average. The resulting surface water freshening reduces the Atlantic meridional overturning circulation and ocean heat release. The reduction in ocean heat release causes a sub-surface warming and decreases the air temperature and precipitation regionally and downstream into Eurasia. The surface elevation decrease of the ice sheet enhances moisture transport onto the ice sheet and thus increases precipitation over the Hudson Bay area, thereby accelerating the recovery after an event.

  13. An improved multi-value cellular automata model for heterogeneous bicycle traffic flow

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Sheng [College of Civil Engineering and Architecture, Zhejiang University, Hangzhou, 310058 China (China); Qu, Xiaobo [Griffith School of Engineering, Griffith University, Gold Coast, 4222 Australia (Australia); Xu, Cheng [Department of Transportation Management Engineering, Zhejiang Police College, Hangzhou, 310053 China (China); College of Transportation, Jilin University, Changchun, 130022 China (China); Ma, Dongfang, E-mail: mdf2004@zju.edu.cn [Ocean College, Zhejiang University, Hangzhou, 310058 China (China); Wang, Dianhai [College of Civil Engineering and Architecture, Zhejiang University, Hangzhou, 310058 China (China)

    2015-10-16

    This letter develops an improved multi-value cellular automata model for heterogeneous bicycle traffic flow, taking the higher maximum speed of electric bicycles into consideration. The update rules of both regular and electric bicycles are improved, with maximum speeds of two and three cells per second, respectively. Numerical simulation results for deterministic and stochastic cases are obtained. The fundamental diagrams and multiple-states effects under different model parameters are analyzed and discussed. Field observations were made to calibrate the slowdown probabilities. The results imply that the improved extended Burgers cellular automata (IEBCA) model is more consistent with the field observations than previous models and greatly enhances the realism of the bicycle traffic model. Highlights: • We propose an improved multi-value CA model with higher maximum speed. • Update rules are introduced for heterogeneous bicycle traffic with maximum speeds of 2 and 3 cells/s. • Simulation results of the proposed model are consistent with field bicycle data. • Slowdown probabilities of both regular and electric bicycles are calibrated.
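
As a rough illustration of the update-rule idea, the sketch below simulates two bicycle classes with maximum speeds of 2 and 3 cells/s on a ring road. This is a simplified Nagel-Schreckenberg-style toy on single-occupancy cells, not the multi-value IEBCA model itself, and the slowdown probabilities are assumed rather than field-calibrated:

```python
import random

# Simplified single-lane CA sketch of heterogeneous bicycle traffic:
# regular bicycles move at most 2 cells/s, electric bicycles at most 3.
def step(positions, vmax, p_slow, length):
    order = sorted(range(len(positions)), key=lambda i: positions[i])
    new_pos, speeds = positions[:], [0] * len(positions)
    for k, i in enumerate(order):
        j = order[(k + 1) % len(order)]          # bicycle ahead (periodic road)
        gap = (positions[j] - positions[i] - 1) % length
        v = min(vmax[i], gap)                    # accelerate up to the gap
        if v > 0 and random.random() < p_slow[i]:
            v -= 1                               # stochastic slowdown
        speeds[i] = v
        new_pos[i] = (positions[i] + v) % length
    return new_pos, speeds

random.seed(1)
L, n = 100, 30
pos = random.sample(range(L), n)
vmax = [2] * 20 + [3] * 10                       # 2/3 regular, 1/3 electric
p_slow = [0.3] * 20 + [0.2] * 10                 # assumed, not calibrated
for _ in range(200):
    pos, speeds = step(pos, vmax, p_slow, L)
print(f"mean speed after 200 steps: {sum(speeds) / n:.2f} cells/s")
```

Because each bicycle moves at most into the gap behind its predecessor's old position, the parallel update never produces two bicycles on the same cell.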

  14. An Agent-Based Monetary Production Simulation Model

    DEFF Research Database (Denmark)

    Bruun, Charlotte

    2006-01-01

    An Agent-Based Simulation Model Programmed in Objective Borland Pascal. Program and source code are downloadable.

  16. The simulation model of planar electrochemical transducer

    Science.gov (United States)

    Zhevnenko, D. A.; Vergeles, S. S.; Krishtop, T. V.; Tereshonok, D. V.; Gornev, E. S.; Krishtop, V. G.

    2016-12-01

    Planar electrochemical systems are very promising for building modern motion and pressure sensors. Planar microelectronic technology is successfully used for electrochemical transducers of motion parameters. These systems are characterized by exceptionally high sensitivity to mechanical excitation due to the high rate of conversion of the mechanical signal into electric current. In this work, we have developed a mathematical model of this planar electrochemical system, which detects mechanical signals. We simulate the processes of mass and charge transfer in the planar electrochemical transducer and calculate its transfer function for different geometrical parameters of the system.

  17. Agent-based modeling and simulation

    CERN Document Server

    Taylor, Simon

    2014-01-01

    Operational Research (OR) deals with the use of advanced analytical methods to support better decision-making. It is multidisciplinary with strong links to management science, decision science, computer science and many application areas such as engineering, manufacturing, commerce and healthcare. In the study of emergent behaviour in complex adaptive systems, Agent-based Modelling & Simulation (ABMS) is being used in many different domains such as healthcare, energy, evacuation, commerce, manufacturing and defense. This collection of articles presents a convenient introduction to ABMS with pa

  18. Induction generator models in dynamic simulation tools

    DEFF Research Database (Denmark)

    Knudsen, Hans; Akhmatov, Vladislav

    1999-01-01

    For AC networks with a large amount of induction generators (windmills), the paper demonstrates a significant discrepancy in the simulated voltage recovery after faults in weak networks when comparing dynamic and transient stability descriptions, and the reasons for the discrepancies are explained. It is found to be possible to include a transient model in dynamic stability tools and thereby obtain correct results in dynamic tools as well. The representation of the rotating system influences the voltage recovery shape, which is an important observation in the case of windmills, where a heavy mill is connected...

  19. CASTOR detector Model, objectives and simulated performance

    CERN Document Server

    Angelis, Aris L S; Bartke, Jerzy; Bogolyubsky, M Yu; Chileev, K; Erine, S; Gladysz-Dziadus, E; Kharlov, Yu V; Kurepin, A B; Lobanov, M O; Maevskaya, A I; Mavromanolakis, G; Nicolis, N G; Panagiotou, A D; Sadovsky, S A; Wlodarczyk, Z

    2001-01-01

    We present a phenomenological model describing the formation and evolution of a Centauro fireball in the baryon-rich region in nucleus-nucleus interactions in the upper atmosphere and at the LHC. The small particle multiplicity and imbalance of electromagnetic and hadronic content characterizing a Centauro event and also the strongly penetrating particles (assumed to be strangelets) frequently accompanying them can be naturally explained. We describe the CASTOR calorimeter, a subdetector of the ALICE experiment dedicated to the search for Centauro in the very forward, baryon-rich region of central Pb+Pb collisions at the LHC. The basic characteristics and simulated performance of the calorimeter are presented. (22 refs).

  20. CASTOR detector. Model, objectives and simulated performance

    Energy Technology Data Exchange (ETDEWEB)

    Angelis, A. L. S.; Mavromanolakis, G.; Panagiotou, A. D. [University of Athens, Nuclear and Particle Physics Division, Athens (Greece); Aslanoglou, X.; Nicolis, N. [Ioannina Univ., Ioannina (Greece). Dept. of Physics; Bartke, J.; Gladysz-Dziadus, E. [Institute of Nuclear Physics, Cracow (Poland); Lobanov, M.; Erine, S.; Kharlov, Y.V.; Bogolyubsky, M.Y. [Institute for High Energy Physics, Protvino (Russian Federation); Kurepin, A.B.; Chileev, K. [Institute for Nuclear Research, Moscow (Russian Federation); Wlodarczyk, Z. [Pedagogical University, Institute of Physics, Kielce (Poland)

    2001-10-01

    A phenomenological model is presented describing the formation and evolution of a Centauro fireball in the baryon-rich region in nucleus-nucleus interactions in the upper atmosphere and at the LHC. The small particle multiplicity and imbalance of electromagnetic and hadronic content characterizing a Centauro event, and also the strongly penetrating particles (assumed to be strangelets) frequently accompanying them, can be naturally explained. The CASTOR calorimeter is described, a subdetector of the ALICE experiment dedicated to the search for Centauros in the very forward, baryon-rich region of central Pb+Pb collisions at the LHC. The basic characteristics and simulated performance of the calorimeter are presented.

  1. Modelling and simulations of controlled release fertilizer

    Science.gov (United States)

    Irfan, Sayed Ameenuddin; Razali, Radzuan; Shaari, Ku Zilati Ku; Mansor, Nurlidia

    2016-11-01

    Recent advances in controlled release fertilizer have provided an alternative to conventional urea; controlled release fertilizers offer good plant nutrient uptake and are environmentally friendly. To achieve optimum plant uptake of nutrients from a controlled release fertilizer, it is essential to understand its release characteristics. A mathematical model is developed to predict the release characteristics of a polymer-coated granule. Numerical simulations are performed by varying the granule radius, soil water content, and soil porosity to study their effect on fertilizer release. Understanding these parameters helps in the better design and improved efficiency of controlled release fertilizers.
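
The qualitative effect of granule radius can be illustrated with a toy first-order release law. This is not the paper's coupled diffusion model, and the rate constant `k0` is an assumed value chosen only to show the trend:

```python
import math

# Toy sketch: assume the release rate is proportional to coating area over
# granule volume, so the effective rate constant scales as 1/radius.
def release_fraction(t_days, radius_mm, k0=0.06):
    k = k0 / radius_mm                   # smaller granules release faster
    return 1.0 - math.exp(-k * t_days)

for r in (1.0, 2.0, 4.0):
    print(f"radius {r} mm -> 30-day release fraction {release_fraction(30, r):.2f}")
```

Larger granules release their nutrient load more slowly, which is the direction of the radius effect studied in the record.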

  2. Simulation Performance of MMSE Iterative Equalization with Soft Boolean Value Propagation

    CERN Document Server

    Krishnamoorthy, Aravindh; Jandial, Ravi

    2011-01-01

    The performance of MMSE iterative equalization based on the MAP-SBVP and COD-MAP algorithms (for generating extrinsic information) is compared for fading and non-fading communication channels employing serially concatenated convolutional codes. MAP-SBVP is a convolutional decoder using a conventional soft-MAP decoder followed by a soft convolutional encoder using soft-boolean value propagation (SBVP). From the simulations it is observed that, for MMSE iterative equalization, MAP-SBVP performance is comparable to COD-MAP for fading and non-fading channels.

  3. Monte Carlo Simulation of River Meander Modelling

    Science.gov (United States)

    Posner, A. J.; Duan, J. G.

    2010-12-01

    This study first compares the first-order analytical solutions for the flow field by Ikeda et al. (1981) and Johanesson and Parker (1989b). Ikeda et al.'s (1981) linear bank erosion model was implemented to predict the rate of bank erosion, in which the bank erosion coefficient is treated as a stochastic variable that varies with physical properties of the bank (e.g. cohesiveness, stratigraphy, vegetation density). The developed model was used to predict the evolution of meandering planforms. The modeling results were then analyzed and compared to the observed data. Since the migration of a meandering channel consists of downstream translation, lateral expansion, and downstream or upstream rotation, several measures are formulated to determine which of the resulting planforms is closest to the experimentally measured one. Results from the deterministic model depend strongly on the calibrated erosion coefficient. Since field measurements are always limited, the stochastic model yielded more realistic predictions of meandering planform evolution. Due to the random nature of the bank erosion coefficient, the meandering planform evolution is a stochastic process that can only be accurately predicted by a stochastic model. Quasi-2D Ikeda (1989) flow solution with Monte Carlo simulation of the bank erosion coefficient.
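
The stochastic treatment of the erosion coefficient can be sketched as a Monte Carlo experiment over a linear erosion law dn/dt = E * u_b. All parameter values below (the lognormal parameters for E, the near-bank velocity, the horizon) are hypothetical, chosen only to make the idea concrete:

```python
import random
import statistics

# Monte Carlo sketch: the bank erosion coefficient E is a random variable
# reflecting bank properties (cohesiveness, stratigraphy, vegetation
# density), so repeated realizations yield a *distribution* of bank
# migration rather than a single deterministic planform shift.
random.seed(42)
u_b = 0.4                     # near-bank excess velocity (m/s), held constant
years = 50
seconds_per_year = 3.156e7

def one_realization():
    # resample E each year to mimic varying bank properties along the reach
    return sum(random.lognormvariate(-18.0, 0.5) * u_b * seconds_per_year
               for _ in range(years))

migrations = [one_realization() for _ in range(2000)]
print(f"50-year bank migration: mean {statistics.mean(migrations):.1f} m, "
      f"std {statistics.stdev(migrations):.1f} m")
```

The spread of the resulting distribution is what a deterministic model with a single calibrated coefficient cannot capture.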

  4. Axisymmetric Vortex Simulations with Various Turbulence Models

    Directory of Open Access Journals (Sweden)

    Brian Howard Fiedler

    2010-10-01

    Full Text Available The CFD code FLUENT™ has been applied to a vortex within an updraft above a frictional lower boundary. The sensitivity of vortex intensity and structure to the choice of turbulence model is explored. A high Reynolds number of 10^8 is employed to make the investigation relevant to the atmospheric vortex known as a tornado. The simulations are axisymmetric and are integrated forward in time to equilibrium. Of the variety of turbulence models tested, the Reynolds Stress Model allows the greatest intensification of the vortex, with the azimuthal wind speed near the surface being 2.4 times the speed of the updraft, consistent with the destructive nature of tornadoes. The Standard k-ε Model, which is simpler than the Reynolds Stress Model but still more detailed than what is commonly available in numerical weather prediction models, produces an azimuthal wind speed near the surface of at most 0.6 times the updraft speed.

  5. Simulation model for port shunting yards

    Science.gov (United States)

    Rusca, A.; Popa, M.; Rosca, E.; Rosca, M.; Dragu, V.; Rusca, F.

    2016-08-01

    Sea ports are important nodes in the supply chain, joining two high-capacity transport modes: rail and maritime transport. The huge cargo flows transiting a port require high-capacity constructions and installations such as berths, large-capacity cranes, and shunting yards. However, the specificity of port shunting yards raises several problems, such as limited access (since these are terminus stations of the rail network), the input and output of large transit flows of cargo relative to the scarcity of ship departures and arrivals, as well as limited land availability for implementing solutions to serve these flows. It is necessary to identify technological solutions that address these problems. The paper proposes a simulation model developed with the ARENA computer simulation software, suitable for shunting yards that serve sea ports with access to the rail network. The principal aspects of shunting yards and adequate measures to increase their transit capacity are investigated. The operating capacity of the shunting-yard sub-system is assessed taking into consideration the required operating standards, and measures of performance of the railway station (e.g. waiting time for freight wagons, number of railway lines in the station, storage area, etc.) are computed. The conclusions and results drawn from the simulation help transport and logistics specialists to test proposals for improving port management.

  6. The mathematical model of a LUNG simulator

    Directory of Open Access Journals (Sweden)

    František Šolc

    2014-12-01

    Full Text Available The paper discusses the design, modelling, implementation and testing of a specific LUNG simulator. The described research was performed as a part of the project AlveoPic – Advanced Lung Research for Veterinary Medicine of Particles for Inhalation. The simulator was designed to establish a combined study programme comprising Biomedical Engineering Sciences (FEEC BUT) and Healthcare and Rehabilitation Technology (FH Technikum Wien). The simulator is supposed to be an advanced piece of laboratory equipment which should enhance the standard of the existing research activities within the above-mentioned study programmes to the required level. Thus, the proposed paper introduces significant technical equipment for the laboratory education of students at both FH Technikum Wien and the Faculty of Electrical Engineering and Communication, Brno University of Technology. The apparatuses described here will also be used to support cooperative research activities. In the given context, the authors specify certain technical solutions and parameters related to artificial lungs, present the electrical equipment of the system, and point out the results of the PC-based measurement and control.

  7. Modeling Stakeholder/Value Dependency through Mean Failure Cost

    Energy Technology Data Exchange (ETDEWEB)

    Aissa, Anis Ben [University of Tunis, Belvedere, Tunisia; Abercrombie, Robert K [ORNL; Sheldon, Frederick T [ORNL; Mili, Ali [New Jersey Institute of Technology

    2010-01-01

    In an earlier series of works, Boehm et al. discuss the nature of information system dependability and highlight the variability of system dependability according to stakeholders; in a recent paper, the dependency patterns of this model are analyzed. In our recent works, we presented a stakeholder-dependent quantitative security model, in which we quantify security for a given stakeholder as the mean of the loss incurred by that stakeholder as a result of security threats. We show how this mean can be derived from the security threat configuration (represented as a vector of probabilities that reflect the likelihood of occurrence of the various security threats). We refer to our security metric as MFC, for Mean Failure Cost. In this paper, we analyze Boehm's model from the standpoint of the proposed metric, and show whether, to what extent, and how our metric addresses the issues raised by Boehm's Stakeholder/Value definition of system dependability.
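
    In the MFC literature the metric is typically computed as a chain of matrix-vector products linking stakeholder stakes, requirement/component dependencies, component/threat impacts and threat probabilities. A minimal sketch with hypothetical two-stakeholder numbers (none of them from the paper):

    ```python
    def mean_failure_cost(stakes, dependency, impact, threat_prob):
        """Chained matrix-vector products ST*(DP*(IM*PT)): each stakeholder's
        expected loss given a threat probability vector.  Matrix shapes:
        stakeholders x requirements, requirements x components,
        components x threats, threats."""
        def matvec(m, v):
            return [sum(a * b for a, b in zip(row, v)) for row in m]
        return matvec(stakes, matvec(dependency, matvec(impact, threat_prob)))

    # Illustrative numbers (two stakeholders, two requirements, two components,
    # one threat class), not taken from the paper:
    mfc = mean_failure_cost(stakes=[[10.0, 5.0], [2.0, 8.0]],
                            dependency=[[0.5, 0.5], [0.2, 0.8]],
                            impact=[[0.1], [0.3]],
                            threat_prob=[0.2])
    ```

    Each entry of `mfc` is one stakeholder's expected loss per unit time under the assumed threat configuration.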

  8. Value of the distant future: Model-independent results

    Science.gov (United States)

    Katz, Yuri A.

    2017-01-01

    This paper shows that the model-independent account of correlations in an interest rate process or a log-consumption growth process leads to declining long-term tails of discount curves. Under the assumption of an exponentially decaying memory in fluctuations of risk-free real interest rates, I derive the analytical expression for an apt value of the long run discount factor and provide a detailed comparison of the obtained result with the outcome of the benchmark risk-free interest rate models. Utilizing the standard consumption-based model with an isoelastic power utility of the representative economic agent, I derive the non-Markovian generalization of the Ramsey discounting formula. Obtained analytical results allowing simple calibration, may augment the rigorous cost-benefit and regulatory impact analysis of long-term environmental and infrastructure projects.
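
    The declining-tail effect can be reproduced numerically: when rates are mean-reverting with exponentially decaying memory, Jensen's inequality makes the certainty-equivalent rate -log E[exp(-integral of r dt)]/t fall below the mean rate at long horizons. A Monte Carlo sketch with illustrative parameters (not calibrated to the paper):

    ```python
    import math
    import random

    def expected_discount_factor(horizon, r_mean=0.04, sigma=0.02, kappa=0.3,
                                 n_paths=2000, dt=0.25, seed=1):
        """Monte Carlo estimate of E[exp(-integral of r dt)] when the real rate
        follows a mean-reverting (Ornstein-Uhlenbeck) process, i.e. fluctuations
        with exponentially decaying memory.  All parameters are illustrative."""
        rng = random.Random(seed)
        steps = int(horizon / dt)
        total = 0.0
        for _ in range(n_paths):
            r, integral = r_mean, 0.0
            for _ in range(steps):
                integral += r * dt  # accumulate the rate path (Euler scheme)
                r += kappa * (r_mean - r) * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
            total += math.exp(-integral)
        return total / n_paths
    ```

    Because E[exp(-X)] exceeds exp(-E[X]), the effective annualized rate -log D(t)/t drops below r_mean as the horizon grows, which is the declining long-term discount curve the paper derives analytically.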

  9. An efficient simulator of 454 data using configurable statistical models

    Directory of Open Access Journals (Sweden)

    Persson Bengt

    2011-10-01

    Full Text Available Abstract Background Roche 454 is one of the major 2nd generation sequencing platforms. The particular characteristics of 454 sequence data pose new challenges for bioinformatic analyses, e.g. assembly and alignment search algorithms. Simulation of these data is therefore useful, in order to further assess how bioinformatic applications and algorithms handle 454 data. Findings We developed a new application named 454sim for simulation of 454 data at high speed and accuracy. The program is multi-thread capable and is available as C++ source code or pre-compiled binaries. Sequence reads are simulated by 454sim using a set of statistical models for each chemistry. 454sim simulates recorded peak intensities, peak quality deterioration and it calculates quality values. All three generations of the Roche 454 chemistry ('GS20', 'GS FLX' and 'Titanium') are supported and defined in external text files for easy access and tweaking. Conclusions We present a new platform independent application named 454sim. 454sim is generally 200 times faster compared to previous programs and it allows for simple adjustments of the statistical models. These improvements make it possible to carry out more complex and rigorous algorithm evaluations in a reasonable time scale.
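
    454sim works in flow space with per-chemistry peak and quality models; the far simpler sketch below (uniform start positions, per-base substitution errors only, no homopolymer noise) just shows the basic shape of a read simulator:

    ```python
    import random

    def simulate_reads(reference, n_reads=100, read_len=50, error_rate=0.01, seed=7):
        """Toy read simulator: sample fixed-length reads from a reference and
        inject independent per-base substitution errors.  This does NOT
        reproduce 454sim's flow-space or quality models."""
        rng = random.Random(seed)
        reads = []
        for _ in range(n_reads):
            start = rng.randrange(len(reference) - read_len + 1)
            read = list(reference[start:start + read_len])
            for j in range(read_len):
                if rng.random() < error_rate:  # substitute with a different base
                    read[j] = rng.choice([b for b in "ACGT" if b != read[j]])
            reads.append("".join(read))
        return reads
    ```

    A realistic simulator would additionally model homopolymer over/under-calls, quality decay along the read, and the chemistry-specific statistics the paper describes.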

  10. Three-dimensional conceptual model for service-oriented simulation

    Institute of Scientific and Technical Information of China (English)

    Wen-guang WANG; Wei-ping WANG; Justyna ZANDER; Yi-fan ZHU

    2009-01-01

    In this letter, we propose a novel three-dimensional conceptual model for an emerging service-oriented simulation paradigm. The model can be used as a guideline or an analytic means to find the potential and possible future directions of the current simulation frameworks. In particular, the model inspects the crossover between the disciplines of modeling and simulation, service-orientation, and software/systems engineering. Finally, two specific simulation frameworks are studied as examples.

  11. Three-dimensional conceptual model for service-oriented simulation

    CERN Document Server

    Wang, Wenguang; Zander, Justyna; Zhu, Yifan; 10.1631/jzus.A0920258

    2009-01-01

    In this letter, we propose a novel three-dimensional conceptual model for an emerging service-oriented simulation paradigm. The model can be used as a guideline or an analytic means to find the potential and possible future directions of the current simulation frameworks. In particular, the model inspects the crossover between the disciplines of modeling and simulation, service-orientation, and software/systems engineering. Finally, two specific simulation frameworks are studied as examples.

  12. Simulation Model developed for a Small-Scale PV-System in a Distribution Network

    DEFF Research Database (Denmark)

    Koch-Ciobotaru, C.; Mihet-Popa, Lucian; Isleifsson, Fridrik Rafn

    2012-01-01

    This paper presents a PV panel simulation model using the single-diode four-parameter model based on data sheet values. The model was implemented first in MATLAB/Simulink, and the results have been compared with the data sheet values and characteristics of the PV panels in standard test conditions....... Moreover to point out the strong dependency on ambient conditions and its influence on array operation and to validate simulation results with measured data a complex model has also been developed. A PV inverter model, using the same equations and parameters as in MATLAB/Simulink has also been developed...
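
    The single-diode four-parameter model determines panel current implicitly, since the current appears inside the diode exponential. A damped fixed-point sketch (parameter values are illustrative, not from any datasheet used in the paper):

    ```python
    import math

    def pv_current(v, i_ph=8.0, i_0=1e-9, r_s=0.2, a=1.6):
        """Single-diode four-parameter model, solved for I in
        I = Iph - I0*(exp((V + I*Rs)/a) - 1) by damped fixed-point iteration.
        i_ph: photocurrent [A], i_0: saturation current [A],
        r_s: series resistance [ohm], a: modified ideality factor [V]."""
        i = i_ph  # the short-circuit current is a good starting guess
        for _ in range(500):
            i_new = i_ph - i_0 * math.expm1((v + i * r_s) / a)
            if abs(i_new - i) < 1e-12:
                break
            i = 0.5 * (i + i_new)  # damping keeps the iteration stable near Voc
        return i
    ```

    Sweeping `v` from zero to the open-circuit voltage traces the familiar I-V curve; a full array model scales this cell model with irradiance and temperature corrections.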

  13. Markov Chains Used to Determine the Model of Stock Value and Compared with Other New Models of Stock Value (P/E Model and Ohlson Model

    Directory of Open Access Journals (Sweden)

    Abbasali Pouraghajan

    2012-10-01

    Full Text Available The aim of this study is a comparison between three models for the valuation of stocks on the Tehran Stock Exchange: the P/E model, the Ohlson (residual income) model, and a Markov chain model. The researchers first calculated share valuations under the first two models and then calculated the value given by the Markov chain in order to make a comparison. The results show that in almost all cases there is no significant difference between the explanatory power of these models in determining share value, so investors in the Tehran exchange market can use any of the three models for the assessment of shares. In most cases, however, the residual income model, having a smaller standard error of regression, can be said to be the somewhat better model for determining company value; the main reason may be the high explanatory power of its two independent variables, overall profit and book value of shareholders' equity, through the overall accounting relation, in comparison with the two other models.
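
    As a rough sketch of the Markov-chain approach (hypothetical three-state transition matrix and state prices, not the Tehran data), the long-run expected share value is the stationary distribution weighted by the price attached to each state:

    ```python
    def stationary_distribution(p, n_iter=1000):
        """Power-iterate a row-stochastic transition matrix toward its
        stationary distribution (assumes the chain is ergodic)."""
        n = len(p)
        pi = [1.0 / n] * n
        for _ in range(n_iter):
            pi = [sum(pi[i] * p[i][j] for i in range(n)) for j in range(n)]
        return pi

    # Hypothetical three-state chain (down, flat, up) and state prices;
    # none of these numbers come from the study:
    P = [[0.5, 0.3, 0.2],
         [0.2, 0.6, 0.2],
         [0.1, 0.4, 0.5]]
    pi = stationary_distribution(P)
    expected_value = sum(prob * price for prob, price in zip(pi, [90.0, 100.0, 115.0]))
    ```

    In a real application the states and transition probabilities would be estimated from historical price movements before computing the expected value.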

  14. Tank waste remediation system simulation analysis retrieval model

    Energy Technology Data Exchange (ETDEWEB)

    Fordham, R.A.

    1996-09-30

    The goal of simulation was to test the consequences of assumptions. For the TWRS SIMAN Retrieval Model, the specific assumptions are primarily defined with respect to waste processing and transfer timing. The model tracks 73 chemical constituents from underground waste tanks to glass; yet the detailed chemistry and complete set of unit operations of the TWRS process flow sheet are represented only at the level necessary to define the waste processing and transfer logic and to estimate the feed composition for the treatment facilities. Therefore, the model should not be regarded as a substitute for the TWRS process flow sheet. Practically, the model functions as a dynamic extension of the flow sheet model. The following sections present the description, assumptions, architecture, and evaluation of the TWRS SIMAN Retrieval Model. Section 2 describes the model in terms of an overview of the processes represented. Section 3 presents the assumptions for the simulation model. Specific assumptions and parameter values used in the model are provided for waste retrieval, pretreatment, low-level waste (LLW) immobilization, and high-level waste (HLW) immobilization functions. Section 4 describes the model in terms of its functional architecture to define a basis for a systematic evaluation of the model. Finally, Section 5 documents an independent test and evaluation of the model's performance (i.e., the verification and validation). Additionally, Appendix A gives a complete listing of the tank inventory used. Appendix B documents the verification and validation plan that was used for the (Section 5) evaluation work. A description and listing of all the model variables is given in Appendix C along with a complete source listing.

  15. Classification of missing values in spatial data using spin models

    CERN Document Server

    Žukovič, Milan; 10.1103/PhysRevE.80.011116

    2013-01-01

    A problem of current interest is the estimation of spatially distributed processes at locations where measurements are missing. Linear interpolation methods rely on the Gaussian assumption, which is often unrealistic in practice, or normalizing transformations, which are successful only for mild deviations from the Gaussian behavior. We propose to address the problem of missing values estimation on two-dimensional grids by means of spatial classification methods based on spin (Ising, Potts, clock) models. The "spin" variables provide an interval discretization of the process values, and the spatial correlations are captured in terms of interactions between the spins. The spins at the unmeasured locations are classified by means of the "energy matching" principle: the correlation energy of the entire grid (including prediction sites) is estimated from the sample-based correlations. We investigate the performance of the spin classifiers in terms of computational speed, misclassification rate, class histogram an...

  16. Evapotranspiration of tomato simulated with the CRITERIA model

    Directory of Open Access Journals (Sweden)

    Pasquale Campi

    2014-06-01

    Full Text Available The CRITERIA model simulates crop development and water dynamics in agricultural soils at different spatial scales. The objective of this paper was to test CRITERIA in order to evaluate its suitability as a tool for scheduling irrigation at field scale. The first step of the work was to validate this hypothesis by means of calibration and validation of CRITERIA on processing tomato at two experimental sites in Southern Italy (Rutigliano and Foggia) for the years 2007 and 2008 under different irrigation regimes. The irrigation treatments were: (i) absence of plant water stress (the control treatment, set up for both years and sites); (ii) moderately stressed (applied in Rutigliano in 2007); and (iii) severely stressed (applied in Foggia in 2008). The second step consisted in the evaluation of the expected impact of different irrigation regimes on daily actual evapotranspiration. For model calibration, the 2007 data of the control treatment were used, whereas in the validation of actual evapotranspiration the remainder of the dataset was used. The observed data were crop evapotranspiration, agrometeorological data, leaf area index, physical-chemical and hydrological characteristics of soil, phenological stages and irrigation management. In order to evaluate model performance, three statistical indicators were used to compare simulated and measured values of actual evapotranspiration: the normalized differences of seasonal values are less than 10% for all treatments; the model efficiency index over the typical period between two irrigations (4 days) was positive for all treatments, with the best values at the Foggia site for both the irrigated and the severely stressed experiments; and the relative root mean square error (RRMSE) was smaller than 20% in both control treatments, but higher than 30% for the stressed treatments. The increase in RRMSE for the stressed experiments is due to CRITERIA simulating a crop in good soil water

  17. Business Value of Information Technology Service Quality Based on Probabilistic Business-Driven Model

    Directory of Open Access Journals (Sweden)

    Jaka Sembiring

    2015-08-01

    Full Text Available The business value of information technology (IT) services is often difficult to assess, especially from the point of view of a non-IT manager. This condition can severely impact organizational IT strategic decisions. Various approaches have been proposed to quantify the business value, but some are trapped in technical complexity while others misguide managers into directly and subjectively judging technical entities outside their domain of expertise. This paper describes a method to properly capture both perspectives based on a probabilistic business-driven model. The proposed model presents a procedure to calculate the business value of IT services. The model also covers IT security services and their business value, an important aspect of IT services not covered in previously published research. The impact of changes in the quality of IT services on business value is also discussed. A simulation and a case illustration are provided to show the possible application of the proposed model for a simple business process in an enterprise.

  18. Nucleon and pion structure with lattice QCD simulations at physical value of the pion mass

    CERN Document Server

    Abdel-Rehim, A; Constantinou, M; Dimopoulos, P; Frezzotti, R; Hadjiyiannakou, K; Jansen, K; Kallidonis, Ch; Kostrzewa, B; Koutsou, G; Mangin-Brinet, M; Oehm, M; Rossi, G C; Urbach, C; Wenger, U

    2015-01-01

    We present results on the nucleon scalar, axial and tensor charges as well as on the momentum fraction, and the helicity and transversity moments. The pion momentum fraction is also presented. The computation of these key observables is carried out using lattice QCD simulations at a physical value of the pion mass. The evaluation is based on gauge configurations generated with two degenerate sea quarks of twisted mass fermions with a clover term. We investigate excited-state contributions with the nucleon quantum numbers by analyzing three sink-source time separations. We find that excited states contribute significantly to the scalar charge, and to a lesser degree to the nucleon momentum fraction and helicity moment. Our analysis yields a value for the nucleon axial charge that agrees with the experimental value, and we predict a value of 1.027(62) in the $\overline{\text{MS}}$ scheme at 2 GeV for the isovector nucleon tensor charge directly at the physical point. The pion momentum fraction is found to be $\langl...

  19. Establishing the Scientific Value of Multiple GCM-RCM Simulation Programs: The Example of NARCCAP

    Science.gov (United States)

    Mearns, L. O.; Dominguez, F.; Gutowski, W. J., Jr.; Hammerling, D.; Leung, L. R.; Pryor, S. C.; Sain, S. R.

    2015-12-01

    There have been a number of multiple GCM-RCM programs, covering Europe, North America, and now, through CORDEX, most regions of the world. Standard metrics of success for these programs include the number of publications, the number of users of the data, and the number of citations to the program. However, these metrics do not necessarily reflect the scientific value of the program, for example, what new scientific knowledge has been developed. We began to carefully consider how one establishes the scientific value of such programs, and thought that the North American Regional Climate Change Assessment Program (NARCCAP) would be a good case with which to examine this issue. We present in this paper our assessment of the value of the climate science research produced through the program. These studies include articles that evaluate the current climates of the NARCCAP simulations, analyze the future climate projections, explore temperature and precipitation extremes, and apply new statistical techniques to the analyses. A number of articles apply weighting techniques to the ensemble and quantify the uncertainty represented by the ensemble. Of particular interest is determining what we have learned about future climate projections based on the use of higher-resolution, dynamically generated future climate information. We will evaluate all research articles and major reports (aside from those regarding impacts) that used the NARCCAP database, and we will assess the major research advances indicated in this literature.

  20. Modelling and Simulation of Gas Engines Using Aspen HYSYS

    Directory of Open Access Journals (Sweden)

    M. C. Ekwonu

    2013-12-01

    Full Text Available In this paper a gas engine model was developed in Aspen HYSYS V7.3 and validated against the Waukesha 16V275GL+ gas engine. Fuel flexibility, fuel types and part-load performance of the gas engine were investigated. The design variability revealed that the gas engine can operate on poor fuels with low lower heating value (LHV), such as landfill gas, sewage gas and biogas, with biogas offering potential integration with bottoming cycles when compared to natural gas. The gas engine simulation gave an efficiency of 40.7% and a power output of 3592 kW.
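
    The reported figures are consistent with the basic definition of thermal efficiency, shaft power over fuel LHV input. A back-of-envelope check in plain Python; the fuel flow and LHV below are assumptions chosen to reproduce the reported numbers, not values from the paper:

    ```python
    def engine_efficiency(power_kw, fuel_flow_kg_s, lhv_kj_kg):
        """Thermal efficiency = shaft power / chemical energy input (LHV basis)."""
        return power_kw / (fuel_flow_kg_s * lhv_kj_kg)

    # Assumed natural-gas LHV of ~50,000 kJ/kg and a fuel flow chosen so the
    # check reproduces the reported 40.7% at 3592 kW:
    eta = engine_efficiency(power_kw=3592.0, fuel_flow_kg_s=0.1765, lhv_kj_kg=50_000.0)
    ```

    Lower-LHV fuels such as biogas require a proportionally higher mass flow for the same power, which is the fuel-flexibility trade-off the study explores.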

  1. Geomechanical Simulation of Bayou Choctaw Strategic Petroleum Reserve - Model Calibration.

    Energy Technology Data Exchange (ETDEWEB)

    Park, Byoung [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-02-01

    A finite element numerical analysis model has been constructed that consists of a realistic mesh capturing the geometries of the Bayou Choctaw (BC) Strategic Petroleum Reserve (SPR) site and a multi-mechanism deformation (M-D) salt constitutive model using the daily data of actual wellhead pressure and oil-brine interface. The salt creep rate is not uniform in the salt dome, and the creep test data for BC salt is limited. Therefore, model calibration is necessary to simulate the geomechanical behavior of the salt dome. The cavern volumetric closures of SPR caverns calculated from CAVEMAN are used as the field baseline measurement. The structure factor, A2, and transient strain limit factor, K0, in the M-D constitutive model are used for the calibration. The A2 value obtained experimentally from BC salt and the K0 value of Waste Isolation Pilot Plant (WIPP) salt are used as the baseline values. To adjust the magnitudes of A2 and K0, multiplication factors A2F and K0F are defined, respectively. The A2F and K0F values of the salt dome and the salt drawdown skins surrounding each SPR cavern have been determined through a number of back-fitting analyses. The cavern volumetric closures calculated from this model correspond to the predictions from CAVEMAN for six SPR caverns. Therefore, this model is able to predict past and future geomechanical behaviors of the salt dome, caverns, caprock, and interbed layers. The geological concerns raised at the BC site will be explained from this model in a follow-up report.

  2. Global Solar Dynamo Models: Simulations and Predictions

    Indian Academy of Sciences (India)

    Mausumi Dikpati; Peter A. Gilman

    2008-03-01

    Flux-transport type solar dynamos have achieved considerable success in correctly simulating many solar cycle features, and are now being used for prediction of solar cycle timing and amplitude. We first define flux-transport dynamos and demonstrate how they work. The essential added ingredient in this class of models is meridional circulation, which governs the dynamo period and also plays a crucial role in determining the Sun's memory of its past magnetic fields. We show that flux-transport dynamo models can explain many key features of solar cycles. Then we show that a predictive tool can be built from this class of dynamo that can be used to predict mean solar cycle features by assimilating magnetic field data from previous cycles.

  3. Design and Simulation of Toroidal Twister Model

    Institute of Scientific and Technical Information of China (English)

    TIAN Huifang; LIN Xizhen; ZENG Qinqin

    2006-01-01

    Toroidal composite vessel winded with fiber is a new kind of structural pressure vessels, which not only has high structure efficiency of compound materials pressure vessel, good security and so on, but also has special shape and the property of utilizing toroidal space, and the prospect of the application of toroidal composite vessel winded with fiber is extremely broad. By introducing parameters establishment of toroidal vessel and elaborating the principle of filament winding for toroidal vessel, the design model of filament winding machine for toroidal vessel has been introduced, and the design model has been dynamically simulated by the software of ADAMS, which will give more referrence for the design of real toroidal vessel twister.

  4. VISION: Verifiable Fuel Cycle Simulation Model

    Energy Technology Data Exchange (ETDEWEB)

    Jacob J. Jacobson; Abdellatif M. Yacout; Gretchen E. Matthern; Steven J. Piet; David E. Shropshire

    2009-04-01

    The nuclear fuel cycle is a very complex system that includes considerable dynamic complexity as well as detail complexity. In the nuclear power realm, there are experts and considerable research and development in nuclear fuel development, separations technology, reactor physics and waste management. What is lacking is an overall understanding of the entire nuclear fuel cycle and how the deployment of new fuel cycle technologies affects the overall performance of the fuel cycle. The Advanced Fuel Cycle Initiative’s systems analysis group is developing a dynamic simulation model, VISION, to capture the relationships, timing and delays in and among the fuel cycle components to help develop an understanding of how the overall fuel cycle works and can transition as technologies are changed. This paper is an overview of the philosophy and development strategy behind VISION. The paper includes some descriptions of the model and some examples of how to use VISION.

  5. Modeling and visual simulation of Microalgae photobioreactor

    Science.gov (United States)

    Zhao, Ming; Hou, Dapeng; Hu, Dawei

    Microalgae are nutritious, highly photosynthetically efficient autotrophic organisms, widely distributed on land and in the sea. They can be used extensively in medicine, food, aerospace, biotechnology, environmental protection and other fields. Photobioreactors are the main equipment used to cultivate microalgae at large scale and high density. In this paper, based on a mathematical model of microalgae grown under different light intensities, a three-dimensional visualization model was built and implemented in 3ds Max, Virtools and other three-dimensional software. Microalgae are photosynthetic organisms that efficiently produce oxygen and absorb carbon dioxide. The goal of the visual simulation is to display these changes and their impact on oxygen and carbon dioxide intuitively. In this paper, different temperatures and light intensities were selected to control the photobioreactor, and the dynamic change of microalgal biomass, oxygen and carbon dioxide was observed, with the aim of providing visualization support for microalgal and photobioreactor research.

  6. A rainfall simulation model for agricultural development in Bangladesh

    Directory of Open Access Journals (Sweden)

    M. Sayedur Rahman

    2000-01-01

    Full Text Available A rainfall simulation model based on a first-order Markov chain has been developed to simulate the annual variation in rainfall amount that is observed in Bangladesh. The model has been tested in the Barind Tract of Bangladesh. Few significant differences were found between the actual and simulated seasonal, annual and average monthly rainfall. The distribution of the number of successes is asymptotically normal. When actual and simulated daily rainfall data were used to drive a crop simulation model, there was no significant difference in the rice yield response. The results suggest that the rainfall simulation model performs adequately for many applications.
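
    A first-order (two-state wet/dry) chain of the kind described can be sketched in a few lines; the transition probabilities and the exponential wet-day depth below are illustrative, not the Barind Tract estimates:

    ```python
    import random

    def simulate_rainfall(n_days=365, p_wet_after_dry=0.25, p_wet_after_wet=0.6,
                          mean_depth=8.0, seed=3):
        """First-order Markov chain rainfall generator: tomorrow's wet/dry
        state depends only on today's; wet-day depths (mm) are drawn from an
        exponential distribution.  All parameters are illustrative."""
        rng = random.Random(seed)
        wet = False
        series = []
        for _ in range(n_days):
            p = p_wet_after_wet if wet else p_wet_after_dry
            wet = rng.random() < p
            series.append(rng.expovariate(1.0 / mean_depth) if wet else 0.0)
        return series
    ```

    In practice the two transition probabilities and the depth distribution would be fitted separately for each month or season before the generated series drives a crop model.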

  7. Toy Models for Galaxy Formation versus Simulations

    CERN Document Server

    Dekel, A; Tweed, D; Cacciato, M; Ceverino, D; Primack, J R

    2013-01-01

    We describe simple useful toy models for key processes of galaxy formation in its most active phase, at z > 1, and test the approximate expressions against the typical behaviour in a suite of high-resolution hydro-cosmological simulations of massive galaxies at z = 4-1. We address in particular the evolution of (a) the total mass inflow rate from the cosmic web into galactic haloes based on the EPS approximation, (b) the penetration of baryonic streams into the inner galaxy, (c) the disc size, (d) the implied steady-state gas content and star-formation rate (SFR) in the galaxy subject to mass conservation and a universal star-formation law, (e) the inflow rate within the disc to a central bulge and black hole as derived using energy conservation and self-regulated Q ~ 1 violent disc instability (VDI), and (f) the implied steady state in the disc and bulge. The toy models provide useful approximations for the behaviour of the simulated galaxies. We find that (a) the inflow rate is proportional to mass and to (...

  8. Modelling and simulation of multitechnological machine systems

    Energy Technology Data Exchange (ETDEWEB)

    Holopainen, T. (ed.) [VTT Manufacturing Technology, Espoo (Finland)

    2001-07-01

    The Smart Machines and Systems 2010 (SMART) technology programme 1997-2000 aimed at supporting the machine and electromechanical industries in incorporating modern technology into their products and processes. The public research projects in this programme were planned to accumulate the latest research results and transfer them for the benefit of industrial product development. The major research topic in the SMART programme was called Modelling and Simulation of Multitechnological Mechatronic Systems. The behaviour of modern machine systems and subsystems involves many different types of physical phenomena and their mutual interactions: mechanical behaviour of structures, electromagnetic effects, hydraulics, vibrations and acoustics, etc., together with associated control systems and software. The actual research was carried out in three separate projects called Modelling and Simulation of Mechatronic Machine Systems for Product Development and Condition Monitoring Purposes (MASI), Virtual Testing of Hydraulically Driven Machines (HYSI), and Control of Low Frequency Vibration of a Mobile Machine (AKSUS). This publication contains the papers presented at the final seminar of these three research projects, held on November 30th at Otaniemi, Espoo. (orig.)

  9. Requirements for Logical Models for Value-Added Tax Legislation

    DEFF Research Database (Denmark)

    Nielsen, Morten Ib; Simonsen, Jakob Grue; Larsen, Ken Friis

    Enterprise resource planning (ERP) systems are ubiquitous in commercial enterprises of all sizes and invariably need to account for the notion of value-added tax (VAT). The legal and technical difficulties in handling VAT are exacerbated by spanning a broad and chaotic spectrum of intricate country-specific needs. Currently, these difficulties are handled in most major ERP systems by customising and localising the native code of the ERP systems for each specific country and industry. We propose an alternative that uses logical modeling of VAT legislation. The potential benefit is to eventually transform...

  10. Towards Modelling and Simulation of Crowded Environments in Cell Biology

    Science.gov (United States)

    Bittig, Arne T.; Jeschke, Matthias; Uhrmacher, Adelinde M.

    2010-09-01

    In modelling and simulation of cell biological processes, spatial homogeneity in the distribution of components is a common but not always valid assumption. Spatial simulation methods differ in computational effort and accuracy, and usually rely on tool-specific input formats for model specification. A clear separation between modelling and simulation allows a declarative model specification thereby facilitating reuse of models and exploiting different simulators. We outline a modelling formalism covering both stochastic spatial simulation at the population level and simulation of individual entities moving in continuous space as well as the combination thereof. A multi-level spatial simulator is presented that combines populations of small particles simulated according to the Next Subvolume Method with individually represented large particles following Brownian motion. This approach entails several challenges that need to be overcome, but nicely balances between calculation effort and required levels of detail.
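
    The population-level part of such a simulator typically follows Gillespie-style exact stochastic kinetics. A minimal non-spatial sketch (single decay reaction, illustrative rate constant); the Next Subvolume Method extends this by keeping a propensity per subvolume and adding diffusion events between neighbouring subvolumes:

    ```python
    import random

    def gillespie_decay(n0=100, k=0.5, t_end=10.0, seed=11):
        """Gillespie direct method for a single first-order reaction A -> 0
        with rate constant k per molecule.  Returns the (time, count)
        trajectory.  Counts and rate are illustrative."""
        rng = random.Random(seed)
        t, n = 0.0, n0
        trajectory = [(t, n)]
        while n > 0 and t < t_end:
            propensity = k * n
            t += rng.expovariate(propensity)  # exponentially distributed waiting time
            n -= 1                             # fire the (only) reaction
            trajectory.append((t, n))
        return trajectory
    ```

    With several reactions the algorithm also draws which reaction fires, weighted by the propensities; the spatial variant described in the paper does this per subvolume while individually represented large particles follow Brownian motion.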

  11. Model Checking with Edge-Valued Decision Diagrams

    Science.gov (United States)

    Roux, Pierre; Siminiceanu, Radu I.

    2010-01-01

    We describe an algebra of Edge-Valued Decision Diagrams (EVMDDs) to encode arithmetic functions and its implementation in a model checking library. We provide efficient algorithms for manipulating EVMDDs and review the theoretical time complexity of these algorithms for all basic arithmetic and relational operators. We also demonstrate that the time complexity of the generic recursive algorithm for applying a binary operator on EVMDDs is no worse than that of Multi-Terminal Decision Diagrams. We have implemented a new symbolic model checker with the intention to represent in one formalism the best techniques available at the moment across a spectrum of existing tools. Compared to the CUDD package, our tool is several orders of magnitude faster.

  12. Face image modeling by multilinear subspace analysis with missing values.

    Science.gov (United States)

    Geng, Xin; Smith-Miles, Kate; Zhou, Zhi-Hua; Wang, Liang

    2011-06-01

    Multilinear subspace analysis (MSA) is a promising methodology for pattern-recognition problems due to its ability in decomposing the data formed from the interaction of multiple factors. The MSA requires a large training set, which is well organized in a single tensor, which consists of data samples with all possible combinations of the contributory factors. However, such a "complete" training set is difficult (or impossible) to obtain in many real applications. The missing-value problem is therefore crucial to the practicality of the MSA but has hardly been investigated up to the present. To solve the problem, this paper proposes an algorithm named M(2)SA, which is advantageous in real applications due to the following: 1) it inherits the ability of the MSA to decompose the interlaced semantic factors; 2) it does not depend on any assumptions on the data distribution; and 3) it can deal with a high percentage of missing values. M(2)SA is evaluated by face image modeling on two typical multifactorial applications, i.e., face recognition and facial age estimation. Experimental results show the effectiveness of M(2)SA even when the majority of the values in the training tensor are missing.

  13. Modeling Value Chain Analysis of Distance Education using UML

    Science.gov (United States)

    Acharya, Anal; Mukherjee, Soumen

    2010-10-01

    Distance education continues to grow as a methodology for the delivery of course content in higher education in India as well as abroad. To manage this growing demand and to provide certain flexibility, strategic planning about the use of ICT tools is needed. Value chain analysis is a framework for breaking down the sequence of business functions into a set of activities through which utility can be added to a service; it can thus help determine the competitive advantage enjoyed by an institute. To implement these business functions, a visual representation is required, and UML allows for this representation by using a set of structural and behavioral diagrams. In this paper, the first section defines a framework for value chain analysis and highlights its advantages. The second section gives a brief overview of related work in this field. The third section gives a brief discussion on distance education. The fourth section very briefly introduces UML. The fifth section models the value chain of distance education using UML. Finally, we discuss the limitations and the problems posed in this domain.

  14. Modeling and simulation of cascading contingencies

    Science.gov (United States)

    Zhang, Jianfeng

    This dissertation proposes a new approach to model and study cascading contingencies in large power systems. The most important contribution of the work involves the development and validation of a heuristic analytic model to assess the likelihood of cascading contingencies, and the development and validation of a uniform search strategy. We model the probability of cascading contingencies as a function of power flow and power flow changes. Utilizing logistic regression, the proposed model is calibrated using real industry data. This dissertation analyzes random search strategies for Monte Carlo simulations and proposes a new uniform search strategy based on the Metropolis-Hastings Algorithm. The proposed search strategy is capable of selecting the most significant cascading contingencies, and it is capable of constructing an unbiased estimator to provide a measure of system security. This dissertation makes it possible to reasonably quantify system security and justify security operations when economic concerns conflict with reliability concerns in the new competitive power market environment. It can also provide guidance to system operators about actions that may be taken to reduce the risk of major system blackouts. Various applications can be developed to take advantage of the quantitative security measures provided in this dissertation.
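
    The dissertation's core idea of modeling trip probability as a logistic function of power flow can be illustrated with a minimal sketch (synthetic data and coefficients are assumptions, not the calibrated industry model):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical training data: per-line loading (fraction of rating) after an
# initial outage, and whether the line subsequently tripped (1) or not (0).
loading = rng.uniform(0.2, 1.4, size=2000)
true_w = np.array([-6.0, 5.0])                       # assumed "ground truth"
p_true = 1.0 / (1.0 + np.exp(-(true_w[0] + true_w[1] * loading)))
tripped = rng.binomial(1, p_true)

# Logistic regression fitted by plain gradient ascent on the log-likelihood.
X = np.column_stack([np.ones_like(loading), loading])
w = np.zeros(2)
for _ in range(5000):
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w += 0.1 * X.T @ (tripped - p) / len(tripped)

print(w)  # slope is positive: heavier loading -> higher trip probability
```

    In the dissertation the calibration uses real industry outage data rather than this synthetic draw; the sketch only shows the functional form.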

  15. Simulation Model for Foreign Trade During the Crisis in Romania

    Directory of Open Access Journals (Sweden)

    Mirela Diaconescu

    2014-12-01

    Full Text Available The paper proposes to analyze the evolution of foreign trade during the crisis in Romania. The evolution of foreign trade is analyzed using a simulation model. The period of analysis is 2006-2014. The data sources are Eurostat and the National Bank of Romania. Based on these data, we also propose an econometric model from which different scenarios and forecasts of the evolution of foreign trade can be developed. In periods of economic recession, protectionist sentiment against imports competing with domestic products tends to rise. The same phenomenon was manifested in Romania, and our study started from this consideration. Using the econometric model we made scenario predictions, and the results are similar to the real values.

  16. IMPROVEMENT OF BUBBLE MODEL FOR CAVITATING FLOW SIMULATIONS

    Institute of Scientific and Technical Information of China (English)

    TAMURA Y.; MATSUMOTO Y.

    2009-01-01

    In the present research, a bubble dynamics based model for cavitating flow simulations is extended to the higher void fraction region for a wider range of applications. The present bubble model is based on the so-called Rayleigh-Plesset equation, which calculates a temporal bubble radius from the surrounding liquid pressure and is considered valid below a certain void fraction. The solution algorithm is modified so that the Rayleigh-Plesset equation is no longer solved once the bubble radius (or void fraction) reaches a certain value, until the liquid pressure recovers above the vapor pressure, in order to overcome this limitation. This procedure is expected to stabilize the numerical calculation. Results for a simple two-dimensional flow field are presented and compared with the existing bubble model.
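
    A minimal time-stepping sketch of the Rayleigh-Plesset equation the abstract refers to, with viscosity and surface tension neglected for clarity (all numerical values are illustrative assumptions):

```python
import numpy as np

def rayleigh_plesset_step(R, Rdot, p_inf, dt, p_v=2339.0, rho=998.0):
    """One explicit Euler step of a simplified Rayleigh-Plesset equation,
    R*Rddot + 1.5*Rdot**2 = (p_v - p_inf)/rho
    (viscous and surface-tension terms are omitted in this sketch)."""
    Rddot = ((p_v - p_inf) / rho - 1.5 * Rdot**2) / R
    return R + Rdot * dt, Rdot + Rddot * dt

# A bubble in water suddenly exposed to a pressure below vapour pressure grows.
R, Rdot = 1e-5, 0.0          # initial radius [m] and wall velocity [m/s]
for _ in range(2000):
    R, Rdot = rayleigh_plesset_step(R, Rdot, p_inf=1000.0, dt=1e-8)
print(R > 1e-5)  # True: the bubble expands under tension
```

    The paper's modification corresponds to freezing this update once the computed radius (void fraction) reaches a cap, resuming only when `p_inf` rises above `p_v`.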

  17. Observed and simulated trophic index (TRIX) values for the Adriatic Sea basin

    Science.gov (United States)

    Fiori, Emanuela; Zavatarelli, Marco; Pinardi, Nadia; Mazziotti, Cristina; Ferrari, Carla Rita

    2016-09-01

    The main aim of the Marine Strategy Framework Directive is to achieve good environmental status (GES) of the EU's marine waters by 2020, in order to protect the marine environment more effectively. The trophic index (TRIX) was developed by Vollenweider in 1998 for the coastal area of Emilia-Romagna (northern Adriatic Sea) and is used in Italian legislation to characterize the trophic state of coastal waters. We compared the TRIX index calculated from in situ data ("in situ TRIX") with the corresponding index simulated with a coupled physical and biogeochemical numerical model ("model TRIX") implemented for the overall Adriatic Sea. The comparison between in situ and simulated data was carried out for a data time series on the Emilia-Romagna coastal strip. This study shows the compatibility of the model with the in situ TRIX and the importance of the length of the time series in order to obtain robust index estimates. The model TRIX is finally calculated for the whole Adriatic Sea, showing trophic index differences across the Adriatic coastal areas.
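
    One commonly cited form of Vollenweider's (1998) index can be written as a small function; treat the exact constants and units here as a hedged reading of the literature, not as the paper's own implementation:

```python
import math

def trix(chl_a, oxygen_dev, din, tp):
    """One commonly cited form of Vollenweider's (1998) trophic index:
    TRIX = (log10(Chl-a * aD%O * DIN * TP) + 1.5) / 1.2
    chl_a      chlorophyll-a concentration [ug/L]
    oxygen_dev absolute deviation of oxygen saturation from 100% [%]
    din        dissolved inorganic nitrogen [ug/L]
    tp         total phosphorus [ug/L]
    The constants (1.5, 1.2) place typical coastal values on a 0-10 scale.
    """
    return (math.log10(chl_a * oxygen_dev * din * tp) + 1.5) / 1.2

print(round(trix(1.0, 10.0, 50.0, 20.0), 2))  # 4.58 for these example inputs
```

    "In situ TRIX" evaluates this on measured concentrations, while "model TRIX" evaluates it on the coupled model's simulated fields.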

  18. Nonlinear distortion in wireless systems modeling and simulation with Matlab

    CERN Document Server

    Gharaibeh, Khaled M

    2011-01-01

    This book covers the principles of modeling and simulation of nonlinear distortion in wireless communication systems with MATLAB simulations and techniques. In this book, the author describes the principles of modeling and simulation of nonlinear distortion in single and multichannel wireless communication systems using both deterministic and stochastic signals. Models and simulation methods of nonlinear amplifiers explain in detail how to analyze and evaluate the performance of data communication links under nonlinear amplification. The book addresses the analysis of nonlinear systems
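
    A minimal sketch of the kind of nonlinear-amplifier analysis the book covers: a memoryless polynomial model driven by a two-tone signal produces third-order intermodulation products at frequencies absent from the input (coefficients and tone frequencies below are illustrative assumptions, done in Python rather than MATLAB):

```python
import numpy as np

# Memoryless polynomial amplifier model y = a1*x + a3*x**3 with a two-tone
# input: intermodulation products appear at 2*f1 - f2 and 2*f2 - f1.
fs, n = 8000, 8000                    # 1 s of signal -> 1 Hz bin spacing
t = np.arange(n) / fs
f1, f2 = 1000, 1100
x = np.cos(2 * np.pi * f1 * t) + np.cos(2 * np.pi * f2 * t)
y = 1.0 * x - 0.1 * x**3              # mild compressive nonlinearity

spectrum = np.abs(np.fft.rfft(y)) / n
im3 = spectrum[2 * f1 - f2]           # bin at 900 Hz, absent from the input
print(im3 > 0.01)                     # True: distortion created a new tone
```

    For the chosen `a3 = -0.1`, the IM3 amplitude works out to `|a3| * 3/4 = 0.075` per tone, i.e. roughly 0.0375 in this normalized spectrum.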

  19. AISIM (Automated Interactive Simulation Modeling System) VAX Version Training Manual.

    Science.gov (United States)

    1985-02-01

    This document is the training manual for the Automated Interactive Simulation Modeling System (AISIM), VAX version, produced by the Hughes Aircraft Co. Ground Systems Group, Fullerton, CA. An appendix contains the simulation report for the working example.

  20. Tecnomatix Plant Simulation modeling and programming by means of examples

    CERN Document Server

    Bangsow, Steffen

    2015-01-01

    This book systematically introduces the development of simulation models as well as the implementation and evaluation of simulation experiments with Tecnomatix Plant Simulation. It addresses all users of Plant Simulation who have more complex tasks to handle, while also offering an easy entry into the program. Particular attention has been paid to introducing the simulation flow language SimTalk and its use in various areas of simulation. The author demonstrates, with over 200 examples, how to combine the blocks for simulation models and how to use SimTalk for complex control and analysis tasks.

  1. The impact of missing data in a generalized integer-valued autoregression model for count data.

    Science.gov (United States)

    Alosh, Mohamed

    2009-11-01

    The impact of the missing data mechanism on estimates of model parameters for continuous data has been extensively investigated in the literature. In comparison, minimal research has been carried out on the impact of missing count data. The focus of this article is to investigate the impact of missing data on a transition model, termed the generalized autoregressive model of order 1, for longitudinal count data. The model has several features, including modeling dependence and accounting for overdispersion in the data, that make it appealing for the clinical trial setting. Furthermore, the model can be viewed as a natural extension of the commonly used log-linear model. Following introduction of the model and discussion of its estimation, we investigate the impact of different missing data mechanisms on estimates of the model parameters through a simulation experiment. The findings of the simulation experiment show that, as in the case of normally distributed data, estimates under the missing completely at random (MCAR) and missing at random (MAR) mechanisms are close to their analogues for the full dataset, and that the missing not at random (MNAR) mechanism produces the greatest bias. Furthermore, estimates based on imputing the last observed value carried forward (LOCF) for missing data under the MAR assumption are similar to the MAR estimates. This latter finding might be attributed to the Markov property underlying the model and to the high level of dependence among successive observations used in the simulation experiment. Finally, we consider an application of the generalized autoregressive model to a longitudinal epilepsy dataset analyzed in the literature.
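
    The flavour of such a simulation experiment can be sketched with a simpler stand-in for the paper's transition model: a binomial-thinning INAR(1) count series, MCAR deletion, and LOCF imputation (all parameter values are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_inar1(n, alpha, lam):
    """INAR(1) via binomial thinning: X_t = alpha o X_{t-1} + eps_t,
    with eps_t ~ Poisson(lam); stationary mean is lam / (1 - alpha)."""
    x = np.empty(n, dtype=int)
    x[0] = rng.poisson(lam / (1 - alpha))      # start near the stationary mean
    for t in range(1, n):
        x[t] = rng.binomial(x[t - 1], alpha) + rng.poisson(lam)
    return x

x = simulate_inar1(20000, alpha=0.5, lam=2.0)  # stationary mean = 4

missing = rng.random(x.size) < 0.2             # missing completely at random
missing[0] = False                             # keep the first value observed
y = x.astype(float)
y[missing] = np.nan
for t in range(1, y.size):
    if np.isnan(y[t]):
        y[t] = y[t - 1]                        # last observation carried forward

print(x.mean(), y.mean())  # both means stay close to 4 under MCAR + LOCF
```

    The strong serial dependence (`alpha = 0.5`) is what keeps the LOCF-imputed mean close to the complete-data mean, mirroring the paper's explanation via the Markov property.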

  2. Modeling human response errors in synthetic flight simulator domain

    Science.gov (United States)

    Ntuen, Celestine A.

    1992-01-01

    This paper presents a control theoretic approach to modeling human response errors (HRE) in the flight simulation domain. The human pilot is modeled as a supervisor of a highly automated system. The synthesis uses the theory of optimal control pilot modeling for integrating the pilot's observation error and the error due to the simulation model (experimental error). Methods for solving the HRE problem are suggested. The models will be verified experimentally in a handling-qualities flight simulation.

  3. Cognitive Modeling for Agent-Based Simulation of Child Maltreatment

    Science.gov (United States)

    Hu, Xiaolin; Puddy, Richard

    This paper extends previous work to develop cognitive modeling for agent-based simulation of child maltreatment (CM). The developed model is inspired by parental efficacy, parenting stress, and the theory of planned behavior. It provides an explanatory, process-oriented model of CM and incorporates causality relationships and feedback loops from different factors in the social ecology in order to simulate the dynamics of CM. We describe the model and present simulation results to demonstrate the features of this model.

  4. An open source simulation model for soil and sediment bioturbation.

    Science.gov (United States)

    Schiffers, Katja; Teal, Lorna Rachel; Travis, Justin Mark John; Solan, Martin

    2011-01-01

    Bioturbation is one of the most widespread forms of ecological engineering and has significant implications for the structure and functioning of ecosystems, yet our understanding of the processes involved in biotic mixing remains incomplete. One reason is that, despite their value and utility, most mathematical models currently applied to bioturbation data tend to neglect aspects of the natural complexity of bioturbation in favour of mathematical simplicity. At the same time, the abstract nature of these approaches limits the application of such models to a limited range of users. Here, we contend that a movement towards process-based modelling can improve both the representation of the mechanistic basis of bioturbation and the intuitiveness of modelling approaches. In support of this initiative, we present an open source modelling framework that explicitly simulates particle displacement and a worked example to facilitate application and further development. The framework combines the advantages of rule-based lattice models with the application of parameterisable probability density functions to generate mixing on the lattice. Model parameters can be fitted by experimental data and describe particle displacement at the spatial and temporal scales at which bioturbation data is routinely collected. By using the same model structure across species, but generating species-specific parameters, a generic understanding of species-specific bioturbation behaviour can be achieved. An application to a case study and comparison with a commonly used model attest the predictive power of the approach.
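
    A minimal rule-based lattice sketch in the spirit of the abstract (not the authors' open-source framework): tracer particles start at the sediment surface, and at every step each particle is displaced by a draw from a parameterisable probability density, here a discretised normal standing in for a fitted species-specific kernel:

```python
import numpy as np

rng = np.random.default_rng(7)

# Luminophore tracer particles on a 1D depth lattice, 0 = sediment surface.
n_particles, n_steps, depth = 500, 50, 100
pos = np.zeros(n_particles, dtype=int)

for _ in range(n_steps):
    # Displacement kernel: slight downward drift plus random mixing;
    # loc and scale are the fit-to-data parameters in the real framework.
    jump = np.rint(rng.normal(loc=0.3, scale=1.0, size=n_particles)).astype(int)
    pos = np.clip(pos + jump, 0, depth - 1)   # particles stay inside the core

profile, _ = np.histogram(pos, bins=np.arange(depth + 1))
print(pos.mean())  # tracer has mixed downward from the surface
```

    Fitting `loc` and `scale` per species to observed tracer profiles is what gives the approach its species-specific, generic character.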

  5. Bayesian estimation in IRT models with missing values in background variables

    Directory of Open Access Journals (Sweden)

    Christian Aßmann

    2015-12-01

    Full Text Available Large scale assessment studies typically aim at investigating the relationship between persons' competencies and explanatory variables. Individual competencies are often estimated by explicitly including explanatory background variables in the corresponding Item Response Theory models. Since missing values in background variables inevitably occur, strategies to handle the uncertainty related to missing values in parameter estimation are required. We propose to adapt a Bayesian estimation strategy based on Markov Chain Monte Carlo techniques. Sampling from the posterior distribution of parameters is thereby enriched by sampling from the full conditional distribution of the missing values. We consider non-parametric as well as parametric approximations for the full conditional distributions of missing values, thus allowing for a flexible incorporation of metric as well as categorical background variables. We evaluate the validity of our approach with respect to statistical accuracy by a simulation study controlling the missing-values-generating mechanism. We show that the proposed Bayesian strategy allows for effective comparison of nested model specifications via gauging highest posterior density intervals of all involved model parameters. An illustration of the suggested approach uses data from the National Educational Panel Study on mathematical competencies of fifth grade students.

  6. Values of Land and Renewable Resources in a Three-Sector Economic Growth Model

    Directory of Open Access Journals (Sweden)

    Zhang Wei-Bin

    2015-04-01

    Full Text Available This paper studies dynamic interdependence of capital, land and resource values in a three sector growth model with endogenous wealth and renewable resources. The model is based on the neoclassical growth theory, Ricardian theory and growth theory with renewable resources. The household’s decision is modeled with an alternative approach proposed by Zhang two decades ago. The economic system consists of the households, industrial, agricultural, and resource sectors. The model describes a dynamic interdependence between wealth accumulation, resource change, and division of labor under perfect competition. We simulate the model to demonstrate the existence of a unique stable equilibrium point and plot the motion of the dynamic system. The study conducts comparative dynamic analysis with regard to changes in the propensity to consume resources, the propensity to consume housing, the propensity to consume agricultural goods, the propensity to consume industrial goods, the propensity to save, the population, and the output elasticity of capital of the resource sector.

  7. Comparing Productivity Simulated with Inventory Data Using Different Modelling Technologies

    Science.gov (United States)

    Klopf, M.; Pietsch, S. A.; Hasenauer, H.

    2009-04-01

    The Lime Stone National Park in Austria was established in 1997 to protect sensitive limestone soils from degradation due to heavy forest management. Since 1997 management activities have been successively reduced; standing volume and coarse woody debris (CWD) increased, and degraded soils began to recover. One option to study the rehabilitation process towards the natural virgin forest state is the use of modelling technology. In this study we test two different modelling approaches for their applicability to the Lime Stone National Park. We compare standing tree volume simulated with (i) the individual tree growth model MOSES, and (ii) the species- and management-sensitive adaptation of the biogeochemical-mechanistic model Biome-BGC. The results from the two models are compared with field observations from repeated permanent forest inventory plots of the Lime Stone National Park in Austria. The simulated CWD predictions of the BGC model were compared with dead wood measurements (standing and lying dead wood) recorded at the permanent inventory plots. The inventory was established between 1994 and 1996 and remeasured from 2004 to 2005. For this analysis 40 plots of this inventory were selected which comprise the required dead wood components and are dominated by a single tree species. First we used the distance-dependent individual tree growth model MOSES to derive the standing timber and the amount of mortality per hectare. MOSES is initialized with the inventory data at plot establishment and each sampling plot is treated as a forest stand. Biome-BGC is a process-based biogeochemical model with extensions for Austrian tree species, a self-initialization and a forest management tool. The initialization for the actual simulations with the BGC model was done as follows: we first used spin-up runs to derive a balanced forest vegetation, similar to an undisturbed forest. 
Next we considered the management history of the past centuries (heavy clear cuts

  8. LADEE Satellite Modeling and Simulation Development

    Science.gov (United States)

    Adams, Michael; Cannon, Howard; Frost, Chad

    2011-01-01

    As human activity on and around the Moon increases, so does the likelihood that our actions will have an impact on its atmosphere. The Lunar Atmosphere and Dust Environment Explorer (LADEE), a NASA satellite scheduled to launch in 2013, will orbit the Moon collecting composition, density, and time variability data to characterize the current state of the lunar atmosphere. LADEE will also test the concept of the "Modular Common Bus" spacecraft architecture, an effort to reduce both development time and cost by designing reusable, modular components for use in multiple missions with similar requirements. An important aspect of this design strategy is to both simulate the spacecraft and develop the flight code in Simulink, a block diagram-style programming language that allows easy algorithm visualization and performance testing. Before flight code can be tested, however, a realistic simulation of the satellite and its dynamics must be generated and validated. This includes all of the satellite control system components such as actuators used for force and torque generation and sensors used for inertial orientation reference. My primary responsibilities have included designing, integrating, and testing models for the LADEE thrusters, reaction wheels, star trackers, and rate gyroscopes.

  9. Value-at-Risk-Estimation in the Mexican Stock Exchange Using Conditional Heteroscedasticity Models and Theory of Extreme Values

    OpenAIRE

    Alejandro Iván Aguirre Salado; Humberto Vaquera Huerta; Martha Elva Ramírez Guzmán; José René Valdez Lazalde; Carlos Arturo Aguirre Salado

    2013-01-01

    This work proposes an approach for estimating value at risk (VaR) of the Mexican stock exchange index (IPC) by using a combination of the autoregressive moving average models (ARMA); three different models of the arch family, one symmetric (GARCH) and two asymmetric (GJR-GARCH and EGARCH); and the extreme value theory (EVT). The ARMA models were initially used to obtain uncorrelated residuals, which were later used for the analysis of extreme values. The GARCH, EGARCH and GJR-GARCH models, by...
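
    The extreme-value leg of such an approach can be sketched with a peaks-over-threshold estimate of VaR (a simplified stand-in for the paper's full ARMA-GARCH-EVT chain; the data here are synthetic heavy-tailed draws, and the GPD is fitted by moment estimators rather than maximum likelihood):

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic heavy-tailed "losses" standing in for filtered return residuals.
losses = rng.standard_t(df=4, size=10000)
u = np.quantile(losses, 0.95)                 # high threshold
exc = losses[losses > u] - u                  # excesses over the threshold
m, v = exc.mean(), exc.var()

# Generalised Pareto moment estimators for shape xi and scale sigma.
xi = 0.5 * (1.0 - m * m / v)
sigma = 0.5 * m * (1.0 + m * m / v)

# Invert the tail fit for the 99% quantile (the POT VaR formula).
q = 0.99
n, n_u = losses.size, exc.size
var_q = u + (sigma / xi) * (((n / n_u) * (1.0 - q)) ** (-xi) - 1.0)

print(var_q > u)  # True: the 99% VaR lies beyond the 95% threshold
```

    In the paper, `losses` would instead be the uncorrelated residuals produced by the ARMA-GARCH filtering step before the EVT tail fit.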

  10. A new Monte Carlo simulation model for laser transmission in smokescreen based on MATLAB

    Science.gov (United States)

    Lee, Heming; Wang, Qianqian; Shan, Bin; Li, Xiaoyang; Gong, Yong; Zhao, Jing; Peng, Zhong

    2016-11-01

    A new Monte Carlo simulation model of laser transmission in smokescreens is proposed in this paper. In the traditional Monte Carlo simulation model, all particles are assigned the same radius and the initial direction cosine of the photons is also fixed, which yields only an approximate result. The new model is implemented in MATLAB and can simulate laser transmittance in smokescreens with different particle sizes, so that its output is closer to real scenarios. In order to account for laser divergence while traveling in the air, we changed the initial direction cosine of the photons relative to the traditional Monte Carlo model. The mixed-radius particle smoke simulation results agree with the measured transmittance under the same experimental conditions, with a 5.42% error rate.
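
    The core photon-transport loop of such a model can be sketched as follows (isotropic scattering and illustrative parameters, done in Python rather than the paper's MATLAB implementation):

```python
import numpy as np

rng = np.random.default_rng(5)

def transmittance(n_photons, slab_depth, mfp, albedo=0.9, max_events=200):
    """Minimal photon Monte Carlo through a smoke slab: exponential free
    paths, absorption with probability 1-albedo, isotropic re-scattering."""
    passed = 0
    for _ in range(n_photons):
        z, mu = 0.0, 1.0                      # depth and direction cosine
        for _ in range(max_events):
            z += mu * rng.exponential(mfp)    # free path to next interaction
            if z >= slab_depth:
                passed += 1                   # photon exits the far side
                break
            if z < 0 or rng.random() > albedo:
                break                         # escaped backwards or absorbed
            mu = rng.uniform(-1.0, 1.0)       # isotropic re-scattering
    return passed / n_photons

thin = transmittance(2000, slab_depth=1.0, mfp=1.0)
thick = transmittance(2000, slab_depth=5.0, mfp=1.0)
print(thin > thick)  # True: a thicker smoke slab transmits less
```

    The paper's refinements correspond to sampling the mean free path from a particle-size distribution instead of a single `mfp`, and perturbing the initial `mu` to mimic beam divergence.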

  11. Modeling and simulation of axisymmetric stagnation flames

    Science.gov (United States)

    Sone, Kazuo

    Laminar flame modeling is an important element in turbulent combustion research. The accuracy of a turbulent combustion model is highly dependent upon our understanding of laminar flames and their behavior in many situations; how much we understand combustion can only be measured by how well the model describes and predicts combustion phenomena. One of the most commonly used methane combustion models is GRI-Mech 3.0. However, how well the model describes reacting flow phenomena is still uncertain, even after many attempts to validate the model or quantify its uncertainties. In the present study, the behavior of laminar flames under different aerodynamic and thermodynamic conditions is studied numerically in a stagnation-flow configuration. In order to make such a numerical study possible, the spectral element method is reformulated to accommodate the large density variations in methane reacting flows. In addition, a new axisymmetric basis function set for the spectral element method that satisfies the correct behavior near the axis is developed, and efficient integration techniques are developed to accurately model axisymmetric reacting flow within a reasonable amount of computational time. The numerical method is implemented using object-oriented programming techniques, and the resulting computer program is verified with several different verification methods. The present study then identifies discrepancies in the commonly used GRI-Mech 3.0 chemical kinetics model through direct simulation of laboratory flames, which allows direct comparison to experimental data. It is shown that the methane combustion model based on GRI-Mech 3.0 works well for methane-air mixtures near stoichiometry, but that it overpredicts the laminar flame speed for lean mixtures and underpredicts it for rich mixtures. 
This result differs slightly from the conclusions drawn in previous work, in which experimental data are compared with one-dimensional numerical solutions

  12. Modeling and simulation of surface roughness

    Energy Technology Data Exchange (ETDEWEB)

    Patrikar, Rajendra M

    2004-04-30

    With technology advancement, electronic devices are miniaturized at every development node. Physical parameters such as microscopic roughness affect these devices because the surface-to-volume ratio is increasing rapidly. Microscopic roughness appears on all real surfaces and affects many electronic properties of the material, which in turn decide the yield and reliability of the devices. Different types of parameters and simulation methods are used to describe surface roughness. Classically, surface roughness was modeled by methods such as power series and the Fast Fourier Transform (FFT). Limitations of these methods led to the use of self-similar fractals to model rough surfaces through the Mandelbrot-Weierstrass function. It is difficult to express surface roughness as an analytical function of process parameters, so a method based on neural networks has been used to map process parameters to roughness parameters. Finally, changes in electrical parameters such as capacitance, resistance and noise due to surface roughness have been computed by numerical methods.
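
    A fractal roughness profile of the kind the abstract mentions can be generated from a Weierstrass-Mandelbrot-type series (the parameter choices below are illustrative, not the paper's values):

```python
import numpy as np

def wm_profile(x, D=1.5, gamma=1.5, n_terms=30, phases=None):
    """Weierstrass-Mandelbrot-type fractal roughness profile:
    z(x) = sum_n gamma**((D-2)*n) * cos(2*pi*gamma**n*x + phi_n),
    with fractal dimension 1 < D < 2; random phases decorrelate the terms."""
    rng = np.random.default_rng(0)
    if phases is None:
        phases = rng.uniform(0, 2 * np.pi, n_terms)
    z = np.zeros_like(x)
    for n in range(n_terms):
        z += gamma ** ((D - 2) * n) * np.cos(2 * np.pi * gamma**n * x + phases[n])
    return z

x = np.linspace(0.0, 1.0, 2000)
z = wm_profile(x)
print(z.std() > 0.1)  # True: a rough, non-constant surface profile
```

    Raising `D` towards 2 slows the decay of the high-frequency amplitudes, producing a visibly rougher profile.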

  13. Parameter discovery in stochastic biological models using simulated annealing and statistical model checking.

    Science.gov (United States)

    Hussain, Faraz; Jha, Sumit K; Jha, Susmit; Langmead, Christopher J

    2014-01-01

    Stochastic models are increasingly used to study the behaviour of biochemical systems. While the structure of such models is often readily available from first principles, unknown quantitative features of the model are incorporated into the model as parameters. Algorithmic discovery of parameter values from experimentally observed facts remains a challenge for the computational systems biology community. We present a new parameter discovery algorithm that uses simulated annealing, sequential hypothesis testing, and statistical model checking to learn the parameters in a stochastic model. We apply our technique to a model of glucose and insulin metabolism used for in-silico validation of artificial pancreata and demonstrate its effectiveness by developing parallel CUDA-based implementation for parameter synthesis in this model.
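
    The shape of such a search can be sketched with a toy simulated-annealing loop that recovers the rate of a Poisson count model by matching a simulated statistic to an "observed" one (a stand-in for the paper's sequential-hypothesis-testing objective; all settings are illustrative):

```python
import math
import numpy as np

rng = np.random.default_rng(11)

observed_mean = 4.0                            # the "experimentally observed fact"

def discrepancy(lam, n=4000):
    """Distance between a simulated summary statistic and the observed one."""
    return abs(rng.poisson(lam, n).mean() - observed_mean)

lam, cost, temp = 1.0, discrepancy(1.0), 2.0
for _ in range(300):
    cand = max(0.1, lam + rng.normal(0.0, 0.3))    # propose a nearby rate
    c = discrepancy(cand)
    if c < cost or rng.random() < math.exp(-(c - cost) / temp):
        lam, cost = cand, c                        # Metropolis-style acceptance
    temp *= 0.98                                   # geometric cooling schedule

print(lam)  # typically close to the true rate 4.0
```

    In the paper, `discrepancy` is replaced by a statistical-model-checking verdict on a full stochastic biochemical model (e.g. the glucose-insulin model), which is why parallel CUDA evaluation pays off.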

  14. Human Performance Modeling and Simulation for Launch Team Applications

    Science.gov (United States)

    Peaden, Cary J.; Payne, Stephen J.; Hoblitzell, Richard M., Jr.; Chandler, Faith T.; LaVine, Nils D.; Bagnall, Timothy M.

    2006-01-01

    This paper describes ongoing research into modeling and simulation of humans for launch team analysis, training, and evaluation. The initial research is sponsored by the National Aeronautics and Space Administration's (NASA)'s Office of Safety and Mission Assurance (OSMA) and NASA's Exploration Program and is focused on current and future launch team operations at Kennedy Space Center (KSC). The paper begins with a description of existing KSC launch team environments and procedures. It then describes the goals of new Simulation and Analysis of Launch Teams (SALT) research. The majority of this paper describes products from the SALT team's initial proof-of-concept effort. These products include a nominal case task analysis and a discrete event model and simulation of launch team performance during the final phase of a shuttle countdown; and a first proof-of-concept training demonstration of launch team communications in which the computer plays most roles, and the trainee plays a role of the trainee's choice. This paper then describes possible next steps for the research team and provides conclusions. This research is expected to have significant value to NASA's Exploration Program.

  15. The modeling of miniature UAV flight visualization simulation platform

    Science.gov (United States)

    Li, Dong-hui; Li, Xin; Yang, Le-le; Li, Xiong

    2015-12-01

    This paper combines virtual reality technology with visual simulation theory to construct the framework of a visual simulation platform, applying the open-source FlightGear simulator together with GoogleEarth to build a small UAV flight visual simulation platform. AC3D is used to build 3D models of the aircraft, which are loaded based on an XML configuration, and the design and simulation of the visual modeling platform is presented. Using model-driven data transformation in FlightGear, the data transmission module is implemented on the Visual Studio 2010 development platform. Finally, combined with GoogleEarth, tracking and display of the flight are achieved.


  16. Shuttle operations simulation model programmers'/users' manual

    Science.gov (United States)

    Porter, D. G.

    1972-01-01

    The prospective user of the shuttle operations simulation (SOS) model is given sufficient information to enable him to perform simulation studies of the space shuttle launch-to-launch operations cycle. The procedures used for modifying the SOS model to meet user requirements are described. The various control card sequences required to execute the SOS model are given. The report is written for users with varying computer simulation experience. A description of the components of the SOS model is included that presents both an explanation of the logic involved in the simulation of the shuttle operations cycle and a description of the routines used to support the actual simulation.

  17. Discrete Element Simulation of Asphalt Mastics Based on Burgers Model

    Institute of Scientific and Technical Information of China (English)

    LIU Yu; FENG Shi-rong; HU Xia-guang

    2007-01-01

    In order to investigate the viscoelastic performance of asphalt mastics, a micro-mechanical model for asphalt mastics was built by applying the Burgers model to discrete element simulation and constructing a Burgers contact model. A numerical simulation of creep tests was then conducted, and the results were compared with the analytical solution of the Burgers model. The comparison showed that the two results agreed well with each other, suggesting that a discrete element model based on the Burgers model can be employed in numerical simulations of asphalt mastics.
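
    The analytical creep solution used as the benchmark in such a comparison follows from the Burgers (four-parameter) model, a Maxwell element in series with a Kelvin-Voigt element; the parameter values below are illustrative, not the paper's:

```python
import numpy as np

def burgers_creep_strain(t, stress, E1, eta1, E2, eta2):
    """Analytical creep strain of the Burgers model under constant stress:
    instantaneous elasticity (E1), viscous flow (eta1), and delayed
    elasticity from the Kelvin-Voigt element (E2, eta2)."""
    return stress * (1.0 / E1 + t / eta1
                     + (1.0 - np.exp(-E2 * t / eta2)) / E2)

t = np.linspace(0.0, 100.0, 500)
strain = burgers_creep_strain(t, stress=0.1, E1=50.0, eta1=5000.0,
                              E2=20.0, eta2=400.0)
print(np.all(np.diff(strain) > 0))  # True: creep strain grows monotonically
```

    A discrete element run with a matching Burgers contact law should reproduce this curve, which is the agreement the abstract reports.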

  18. A Geostationary Earth Orbit Satellite Model Using Easy Java Simulation

    Science.gov (United States)

    Wee, Loo Kang; Goh, Giam Hwee

    2013-01-01

    We develop an Easy Java Simulation (EJS) model for students to visualize geostationary orbits near Earth, modelled using a Java 3D implementation of the EJS 3D library. The simplified physics model is described and simulated using a simple constant angular velocity equation. We discuss four computer model design ideas: (1) a simple and realistic…
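
    The one physical fact behind such a model, the geostationary radius, follows directly from Kepler's third law with an orbital period of one sidereal day (this Python sketch uses standard constants; the EJS model itself is Java-based):

```python
import math

# Kepler's third law: r = (mu * T**2 / (4*pi**2))**(1/3) for a circular orbit.
MU_EARTH = 3.986004418e14        # Earth's gravitational parameter [m^3/s^2]
T_SIDEREAL = 86164.1             # sidereal day [s]
R_EARTH = 6.371e6                # mean Earth radius [m]

r = (MU_EARTH * T_SIDEREAL**2 / (4.0 * math.pi**2)) ** (1.0 / 3.0)
print(r / 1e3, (r - R_EARTH) / 1e3)  # ~42,164 km radius, ~35,800 km altitude
```

    The constant angular velocity assumed in the simplified EJS model is then just `2*pi / T_SIDEREAL`.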

  19. Knowledge-based modeling of discrete-event simulation systems

    NARCIS (Netherlands)

    H. de Swaan Arons

    1999-01-01

    textabstractModeling a simulation system requires a great deal of customization. At first sight no system seems to resemble exactly another system and every time a new model has to be designed the modeler has to start from scratch. The present simulation languages provide the modeler with powerful
