WorldWideScience

Sample records for model simulated values

  1. Value stream mapping in a computational simulation model

    Directory of Open Access Journals (Sweden)

    Ricardo Becker Mendes de Oliveira

    2014-08-01

    Full Text Available The decision-making process has been extensively studied by researchers and executives. This paper uses the methodology of Value Stream Mapping (VSM) in an integrated manner with a computer simulation model in order to expand managers' decision-making vision. The object of study is a production system involving automatic packaging of products, where changes had to be implemented to accommodate new products, making it necessary to detect bottlenecks and to visualize the impacts of future modifications. The simulation aims to support the manager's decisions, considering that the system involves several variables whose behaviors define the complexity of the process. The main results were a significant reduction in project costs by anticipating system behavior, together with the identification, through Value Stream Mapping, of activities that do or do not add value to the process. The simulation model is validated against the current-state map of the system, with the inclusion of Kaizen events, so that waste in future-state maps can be found in a practical and reliable way that supports decision-making.
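
    A record like this one can be made concrete in a few lines of code: the sketch below pairs a value-stream stage list with a simple flow-line simulation to expose the bottleneck, in the spirit of the VSM-plus-simulation approach the abstract describes. The station names and cycle times are hypothetical, not taken from the paper.

```python
# Minimal flow-line sketch: serial stations, classic flow-shop recurrence.
# Station names and cycle times are invented for illustration.
import numpy as np

stations = ["filling", "sealing", "labelling", "auto-packaging"]
cycle_time = np.array([12.0, 9.0, 15.0, 11.0])  # seconds per unit (assumed)

n_items = 500
# C[i, s] = completion time of item i at station s: an item starts once it
# has left the previous station and the station has finished the previous item.
C = np.zeros((n_items, len(stations)))
for i in range(n_items):
    for s in range(len(stations)):
        ready = C[i, s - 1] if s > 0 else 0.0   # item available
        free = C[i - 1, s] if i > 0 else 0.0    # station available
        C[i, s] = max(ready, free) + cycle_time[s]

makespan = C[-1, -1]
utilization = n_items * cycle_time / makespan   # busy time / total time
for name, u in zip(stations, utilization):
    print(f"{name:15s} utilization: {u:.0%}")
# The station with utilization near 100% is the bottleneck (here: labelling).
```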

  2. Digitized Onondaga Lake Dissolved Oxygen Concentrations and Model Simulated Values using Bayesian Monte Carlo Methods

    Data.gov (United States)

    U.S. Environmental Protection Agency — The dataset is lake dissolved oxygen concentrations obtained from plots published by Gelda et al. (1996) and lake reaeration model simulated values using Bayesian...

  3. ARM Cloud Radar Simulator Package for Global Climate Models Value-Added Product

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Yuying [North Carolina State Univ., Raleigh, NC (United States); Xie, Shaocheng [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-05-01

    It has been challenging to directly compare U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility ground-based cloud radar measurements with climate model output because of limitations or features of the observing processes and the spatial gap between the model grid and the single-point measurements. To facilitate the use of ARM radar data in numerical models, an ARM cloud radar simulator was developed to convert model data into pseudo-ARM cloud radar observations that mimic the instrument's view of a narrow atmospheric column (as compared to a large global climate model [GCM] grid cell), thus allowing meaningful comparison between model output and ARM cloud observations. The ARM cloud radar simulator value-added product (VAP) was developed based on the CloudSat simulator contained in the community satellite simulator package, the Cloud Feedback Model Intercomparison Project (CFMIP) Observation Simulator Package (COSP) (Bodas-Salcedo et al., 2011), which has been widely used in climate model evaluation with satellite data (Klein et al., 2013; Zhang et al., 2010). The essential part of the CloudSat simulator is the QuickBeam radar simulator, which is used to produce CloudSat-like radar reflectivity but is capable of simulating reflectivity for other radars (Marchand et al., 2009; Haynes et al., 2007). Adapting QuickBeam to the ARM cloud radar simulator within COSP required two primary changes: one was to set the frequency to 35 GHz for the ARM Ka-band cloud radar, as opposed to the 94 GHz used for the CloudSat W-band radar, and the second was to invert the view so that the radar looks from the ground to space, so as to attenuate the beam correctly. In addition, the ARM cloud radar simulator uses a finer vertical resolution (100 m compared to 500 m for CloudSat) to resolve the more detailed structure of clouds captured by the ARM radars. The ARM simulator has been developed following the COSP workflow (Figure 1) and using the capabilities available in COSP
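
    The key adaptation described above, inverting the view between space and ground, comes down to the direction in which attenuation is accumulated along the column. A minimal sketch with an invented cloud and attenuation profile (this is not QuickBeam's actual attenuation model):

```python
# Why viewing direction matters: two-way attenuation accumulates from the
# instrument toward each range gate, so a ground-based (ARM) view and a
# space-based (CloudSat) view attenuate the same column differently.
import numpy as np

z = np.arange(0.0, 10.0, 0.1)       # range-gate heights, km
dz = 0.1
in_cloud = (z > 2.0) & (z < 6.0)    # a hypothetical cloud layer
true_dbz = np.where(in_cloud, 10.0, -40.0)
k = np.where(in_cloud, 1.5, 0.0)    # two-way attenuation, dB/km (assumed)

# Ground-based (ARM) view: attenuation accumulates upward from the surface.
dbz_ground = true_dbz - np.cumsum(k * dz)

# Space-based (CloudSat) view: attenuation accumulates downward from the top.
dbz_space = true_dbz - np.cumsum((k * dz)[::-1])[::-1]

i_top = np.argmin(np.abs(z - 5.9))  # gate near cloud top
print(f"near cloud top: ground view {dbz_ground[i_top]:.1f} dBZ, "
      f"space view {dbz_space[i_top]:.1f} dBZ")
```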

  4. The Impact of Different Absolute Solar Irradiance Values on Current Climate Model Simulations

    Science.gov (United States)

    Rind, David H.; Lean, Judith L.; Jonas, Jeffrey

    2014-01-01

    Simulations of the preindustrial and doubled CO2 climates are made with the GISS Global Climate Middle Atmosphere Model 3 using two different estimates of the absolute solar irradiance value: a higher value measured by solar radiometers in the 1990s and a lower value measured recently by the Solar Radiation and Climate Experiment. Each of the model simulations is adjusted to achieve global energy balance; without this adjustment the difference in irradiance produces a global temperature change of 0.4°C, comparable to the cooling estimated for the Maunder Minimum. The results indicate that by altering cloud cover the model properly compensates for the different absolute solar irradiance values on a global level when simulating both preindustrial and doubled CO2 climates. On a regional level, the preindustrial climate simulations and the patterns of change with doubled CO2 concentrations are again remarkably similar, but there are some differences. Using a higher absolute solar irradiance value and the requisite cloud cover affects the model's depictions of high-latitude surface air temperature, sea level pressure, and stratospheric ozone, as well as tropical precipitation. In the climate change experiments it leads to an underestimation of North Atlantic warming, reduced precipitation in the tropical western Pacific, and smaller total ozone growth at high northern latitudes. Although significant, these differences are typically modest compared with the magnitude of the regional changes expected for doubled greenhouse gas concentrations. Nevertheless, the model simulations demonstrate that achieving the highest possible fidelity when simulating regional climate change requires that climate models use as input the most accurate (lower) solar irradiance value.

  5. Dynamic Value at Risk: A Comparative Study Between Heteroscedastic Models and Monte Carlo Simulation

    Directory of Open Access Journals (Sweden)

    José Lamartine Távora Junior

    2006-12-01

    Full Text Available The objective of this paper was to analyze the risk management of a portfolio composed of Petrobras PN, Telemar PN and Vale do Rio Doce PNA stocks. It was verified whether the modeling of Value-at-Risk (VaR) through Monte Carlo simulation with GARCH-family volatility is supported by the efficient-market hypothesis. The results showed that the static evaluation is inferior to the dynamic one, evidencing that the dynamic analysis supports the efficient-market hypothesis for the Brazilian stock market, in opposition to some empirical evidence. It was also verified that GARCH volatility models are sufficient to accommodate the variations of the Brazilian stock market, since they are capable of capturing its highly dynamic behavior.
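
    For readers unfamiliar with the method being evaluated, the following is a minimal sketch of one-day-ahead VaR from a Monte Carlo simulation driven by GARCH(1,1) volatility. The GARCH parameters, today's return and the portfolio value are invented for illustration, not estimated from the Petrobras/Telemar/Vale portfolio.

```python
# One-day 99% Value-at-Risk via Monte Carlo with GARCH(1,1) volatility.
# All parameters below are assumed, not fitted to real data.
import numpy as np

rng = np.random.default_rng(0)
omega, alpha, beta = 1e-6, 0.08, 0.90     # assumed GARCH(1,1) parameters
r_t, sigma2_t = -0.012, 0.018**2          # today's return and conditional variance

# One-step GARCH(1,1) variance forecast, then Monte Carlo for tomorrow's return.
sigma2_next = omega + alpha * r_t**2 + beta * sigma2_t
sim_returns = rng.normal(0.0, np.sqrt(sigma2_next), size=100_000)

portfolio_value = 1_000_000.0             # hypothetical position
var_99 = -np.quantile(sim_returns, 0.01) * portfolio_value
print(f"1-day 99% VaR: {var_99:,.0f}")
```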

  6. Simulating the value of electric-vehicle-grid integration using a behaviourally realistic model

    Science.gov (United States)

    Wolinetz, Michael; Axsen, Jonn; Peters, Jotham; Crawford, Curran

    2018-02-01

    Vehicle-grid integration (VGI) uses the interaction between electric vehicles and the electrical grid to provide benefits that may include reducing the cost of using intermittent renewable electricity or providing a financial incentive for electric vehicle ownership. However, studies that estimate the value of VGI benefits have largely ignored how consumer behaviour will affect the magnitude of the impact. Here, we simulate the long-term impact of VGI using behaviourally realistic and empirically derived models of vehicle adoption and charging combined with an electricity system model. We focus on the case where a central entity manages the charging rate and timing for participating electric vehicles. VGI is found not to increase the adoption of electric vehicles, but it does have a small beneficial impact on electricity prices. By 2050, VGI reduces wholesale electricity prices by 0.6-0.7% ($0.70 MWh-1, 2010 CAD) relative to an equivalent scenario without VGI. Excluding consumer behaviour from the analysis inflates the value of VGI.

  7. Investigating added value of regional climate modeling in North American winter storm track simulations

    Science.gov (United States)

    Poan, E. D.; Gachon, P.; Laprise, R.; Aider, R.; Dueymes, G.

    2018-03-01

    Extratropical Cyclone (EC) characteristics depend on a combination of large-scale factors and regional processes. However, the latter are considered to be poorly represented in global climate models (GCMs), partly because their resolution is too coarse. This paper describes a framework using the possibilities offered by regional climate models (RCMs) to gain insight into storm activity during winter over North America (NA). The recent past climate period (1981-2005) is considered to assess EC activity over NA using the NCEP regional reanalysis (NARR) as a reference, along with the European reanalysis ERA-Interim (ERAI) and two CMIP5 GCMs used to drive the Canadian Regional Climate Model—version 5 (CRCM5) and the corresponding regional-scale simulations. While ERAI and GCM simulations show basic agreement with NARR in terms of climatological storm track patterns, detailed bias analyses show that, on the one hand, ERAI presents statistically significant positive biases in terms of EC genesis and therefore occurrence, while capturing their intensity fairly well. On the other hand, GCMs present large negative intensity biases over the overall NA domain and particularly over the NA eastern coast. In addition, storm occurrence over the northwestern topographic regions is highly overestimated. When the CRCM5 is driven by ERAI, no significant skill deterioration arises and, more importantly, all storm characteristics near areas with marked relief and over regions with large water masses are significantly improved with respect to ERAI. Conversely, in GCM-driven simulations, the added value contributed by CRCM5 is less prominent and systematic, except over western NA areas with high topography and over the western Atlantic coastlines where the most frequent and intense ECs are located. Despite this significant added value on seasonal-mean characteristics, a caveat is raised on the RCM ability to handle storm temporal `seriality', as a measure of their temporal variability at a given

  8. Verification of Fourier phase and amplitude values from simulated heart motion using a hydrodynamic cardiac model

    International Nuclear Information System (INIS)

    Yiannikas, J.; Underwood, D.A.; Takatani, Setsuo; Nose, Yukihiko; MacIntyre, W.J.; Cook, S.A.; Go, R.T.; Golding, L.; Loop, F.D.

    1986-01-01

    Using pusher-plate-type artificial hearts, changes in the degree of synchrony and stroke volume were compared to phase and amplitude calculations from the first Fourier component of individual-pixel time-activity curves generated from gated radionuclide images (RNA) of these hearts. In addition, the ability of Fourier analysis to quantify paradoxical volume shifts was tested using a ventricular aneurysm model by which the Fourier amplitude was correlated to known increments of paradoxical volume. Predetermined phase-angle differences (incremental increases in asynchrony) and the mean phase-angle differences calculated from RNAs showed an agreement of -7° ± 4.4° (mean ± SD). A strong correlation was noted between stroke volume and Fourier amplitude (r=0.98; P<0.0001) as well as between the paradoxical volume accepted by the 'aneurysm' and the Fourier amplitude (r=0.97; P<0.0001). The degree of asynchrony and changes in stroke volume were accurately reflected by the Fourier phase and amplitude values, respectively. In the specific case of ventricular aneurysms, the data demonstrate that using this method, the paradoxically moving areas may be localized, and the expansile volume within these regions can be quantified. (orig.)
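
    The first-Fourier-component analysis used here reduces, per pixel, to taking the first harmonic of the gated time-activity curve: the amplitude tracks stroke volume and the phase tracks timing (synchrony). A small sketch with a synthetic curve (the count level, amplitude and phase constants are invented):

```python
# First-harmonic Fourier phase and amplitude of a gated time-activity curve.
import numpy as np

n_frames = 16
t = np.arange(n_frames) / n_frames                # fraction of cardiac cycle
counts = 100 + 20 * np.cos(2 * np.pi * t - 1.0)   # synthetic pixel curve

F = np.fft.fft(counts)
amplitude = 2.0 * np.abs(F[1]) / n_frames          # first-harmonic amplitude (recovers ~20)
phase = np.degrees(np.angle(F[1]))                 # first-harmonic phase angle

print(f"amplitude = {amplitude:.1f} counts, phase = {phase:.1f} deg")
```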

  9. The impact of MCS models and EFAC values on the dose simulation for a proton pencil beam

    Science.gov (United States)

    Chen, Shih-Kuan; Chiang, Bing-Hao; Lee, Chung-Chi; Tung, Chuan-Jong; Hong, Ji-Hong; Chao, Tsi-Chian

    2017-08-01

    The Multiple Coulomb Scattering (MCS) model plays an important role in accurate MC simulation, especially for small-field applications. The Rossi model is used in MCNPX 2.7.0, and the Lewis model in Geant4.9.6.p02. These two models may generate very different angular and spatial distributions in small-field proton dosimetry. Besides angular and spatial distributions, step size is also an important issue that causes path length effects. The Energy Fraction (EFAC) value can be used in MCNPX 2.7.0 to control the step sizes of MCS. In this study, we use MCNPX 2.7.0, Geant4.9.6.p02, and a pencil beam algorithm to evaluate the effect of different MCS models and different EFAC values on dose deposition in proton disequilibrium situations. Different MCS models agree well with each other under a proton equilibrium situation. Under proton disequilibrium situations, however, the MCNPX and Geant4 results show a significant deviation (up to 43%). In addition, the path length effects are more significant when EFAC is equal to 0.917 or 0.94 in small-field proton dosimetry, and an EFAC value of 0.97 is the best for both accuracy and efficiency.

  10. Very high resolution regional climate model simulations over Greenland: Identifying added value

    DEFF Research Database (Denmark)

    Lucas-Picher, P.; Wulff-Nielsen, M.; Christensen, J.H.

    2012-01-01

    meteorological stations (Danish Meteorological Institute) at the coast and automatic weather stations on the ice sheet (Greenland Climate Network). Generally, the temperature and precipitation biases are small, indicating a realistic simulation of the climate over Greenland that is suitable to drive ice sheet...

  11. Essays in energy policy and planning modeling under uncertainty: Value of information, optimistic biases, and simulation of capacity markets

    Science.gov (United States)

    Hu, Ming-Che

    Optimization and simulation are popular operations research and systems analysis tools for energy policy modeling. This dissertation addresses three important questions concerning the use of these tools for energy market (and electricity market) modeling and planning under uncertainty. (1) What is the value of information and the cost of disregarding different sources of uncertainty for the U.S. energy economy? (2) Could model-based calculations of the performance (social welfare) of competitive and oligopolistic market equilibria be optimistically biased due to uncertainties in objective function coefficients? (3) How do alternative sloped demand curves perform in the PJM capacity market under economic and weather uncertainty, and how do curve adjustment and cost dynamics affect the capacity market outcomes? To address the first question, two-stage stochastic optimization is utilized in the U.S. national MARKAL energy model; then the value of information and the cost of ignoring uncertainty are estimated for three uncertainties: carbon cap policy, load growth and natural gas prices. When an uncertainty is important, explicitly considering those risks when making investments will result in better performance in expectation (a positive expected cost of ignoring uncertainty). Furthermore, eliminating the uncertainty would improve strategies even further, meaning that improved forecasts of future conditions are valuable (i.e., a positive expected value of information). Also, the value of policy coordination shows the difference between a strategy developed under the incorrect assumption of no carbon cap and a strategy correctly anticipating imposition of such a cap. For the second question, game theory models are formulated and the existence of optimistic (positive) biases in market equilibria (both competitive and oligopoly markets) is proved, in that calculated social welfare and producer profits will, in expectation, exceed the values that will actually be received.
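
    The two quantities estimated for the first question have compact definitions: the expected cost of ignoring uncertainty (ECIU) compares the stochastic solution against a plan optimized for mean conditions, and the expected value of perfect information (EVPI) compares it against a wait-and-see ideal. A toy capacity-planning illustration follows; all costs and demand scenarios are made up and the problem is far simpler than MARKAL.

```python
# ECIU and EVPI for a one-decision capacity problem under demand scenarios.
import numpy as np

demand = np.array([80.0, 100.0, 140.0])   # demand scenarios (assumed)
prob = np.array([0.3, 0.4, 0.3])
build_cost, shortage_cost = 1.0, 3.0      # per-unit costs (assumed)

def cost(x, d):
    """Total cost of building capacity x when demand turns out to be d."""
    return build_cost * x + shortage_cost * np.maximum(d - x, 0.0)

x_grid = np.arange(0.0, 201.0)
exp_cost = np.array([(prob * cost(x, demand)).sum() for x in x_grid])

x_sp = x_grid[exp_cost.argmin()]          # here-and-now stochastic solution
mean_d = prob @ demand
x_det = x_grid[np.array([cost(x, mean_d) for x in x_grid]).argmin()]

# ECIU: extra expected cost of using the deterministic (mean-demand) plan.
eciu = (prob * cost(x_det, demand)).sum() - exp_cost.min()
# EVPI: gap between the stochastic solution and the wait-and-see ideal.
ws = (prob * np.array([min(cost(x, d) for x in x_grid) for d in demand])).sum()
evpi = exp_cost.min() - ws
print(f"x_stoch={x_sp:.0f}, x_det={x_det:.0f}, ECIU={eciu:.2f}, EVPI={evpi:.2f}")
```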

  12. Learning from Noisy and Delayed Rewards: The Value of Reinforcement Learning to Defense Modeling and Simulation

    Science.gov (United States)

    2012-09-01

    ...theory of planned behavior, which has been empirically studied as an explanation for behavior adoption (Ajzen, 1991). The theory of planned behavior... human behavioral intention (Ajzen, 1991). The theory of planned behavior states that behavioral intentions are formed by a combination of input from... the outcome evaluation, e, an evaluation of the value of the potential outcome (Ajzen, 1991): $A = \sum_{i}^{n} b_i e_i$ (43). Perceived behavioral control (PBC

  13. Potential for added value in precipitation simulated by high-resolution nested Regional Climate Models and observations

    Energy Technology Data Exchange (ETDEWEB)

    Di Luca, Alejandro; Laprise, Rene [Universite du Quebec a Montreal (UQAM), Centre ESCER (Etude et Simulation du Climat a l' Echelle Regionale), Departement des Sciences de la Terre et de l' Atmosphere, PK-6530, Succ. Centre-ville, B.P. 8888, Montreal, QC (Canada); De Elia, Ramon [Universite du Quebec a Montreal, Ouranos Consortium, Centre ESCER (Etude et Simulation du Climat a l' Echelle Regionale), Montreal (Canada)

    2012-03-15

    Regional Climate Models (RCMs) constitute the most often used method to perform affordable high-resolution regional climate simulations. The key issue in the evaluation of nested regional models is to determine whether RCM simulations improve the representation of climatic statistics compared to the driving data, that is, whether RCMs add value. In this study we examine a necessary condition that some climate statistics derived from the precipitation field must satisfy in order for the RCM technique to generate added value: we focus on whether the climate statistic of interest contains some fine spatial-scale variability that would be absent on a coarser grid. The presence and magnitude of the fine-scale precipitation variance required to adequately describe a given climate statistic is then used to quantify the potential added value (PAV) of RCMs. Our results show that the PAV of RCMs is much higher for short temporal scales (e.g., 3-hourly data) than for long temporal scales (16-day average data) due to the filtering resulting from the time-averaging process. PAV is higher in the warm season than in the cold season due to the higher proportion of precipitation falling from small-scale weather systems in the warm season. In regions of complex topography, the orographic forcing induces an extra component of PAV, no matter the season or the temporal scale considered. The PAV is also estimated using high-resolution datasets based on observations, allowing the evaluation of the sensitivity of changing resolution in the real climate system. The results show that RCMs tend to reproduce the PAV relatively well compared to observations, although they overestimate it in the warm season and in mountainous regions. (orig.)
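
    The PAV diagnostic rests on a variance decomposition: aggregating the fine grid to coarse blocks splits total spatial variance into a large-scale part and a sub-grid (fine-scale) part. A minimal sketch with a synthetic precipitation field (the field size, block size and gamma distribution are illustrative):

```python
# Fraction of spatial variance below the driving model's grid scale.
import numpy as np

rng = np.random.default_rng(1)
precip = rng.gamma(shape=0.8, scale=2.0, size=(128, 128))  # hypothetical RCM field
block = 8                                                  # one coarse cell = 8x8 fine cells

coarse = precip.reshape(128 // block, block, 128 // block, block).mean(axis=(1, 3))
coarse_up = np.kron(coarse, np.ones((block, block)))       # coarse field on the fine grid

total_var = precip.var()
large_scale_var = coarse_up.var()                          # variance of block means
pav = (total_var - large_scale_var) / total_var            # fine-scale variance fraction
print(f"fraction of variance below the coarse grid scale: {pav:.0%}")
```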

  14. Potential for added value in precipitation simulated by high-resolution nested Regional Climate Models and observations

    Science.gov (United States)

    di Luca, Alejandro; de Elía, Ramón; Laprise, René

    2012-03-01

    Regional Climate Models (RCMs) constitute the most often used method to perform affordable high-resolution regional climate simulations. The key issue in the evaluation of nested regional models is to determine whether RCM simulations improve the representation of climatic statistics compared to the driving data, that is, whether RCMs add value. In this study we examine a necessary condition that some climate statistics derived from the precipitation field must satisfy in order for the RCM technique to generate added value: we focus on whether the climate statistic of interest contains some fine spatial-scale variability that would be absent on a coarser grid. The presence and magnitude of the fine-scale precipitation variance required to adequately describe a given climate statistic is then used to quantify the potential added value (PAV) of RCMs. Our results show that the PAV of RCMs is much higher for short temporal scales (e.g., 3-hourly data) than for long temporal scales (16-day average data) due to the filtering resulting from the time-averaging process. PAV is higher in the warm season than in the cold season due to the higher proportion of precipitation falling from small-scale weather systems in the warm season. In regions of complex topography, the orographic forcing induces an extra component of PAV, no matter the season or the temporal scale considered. The PAV is also estimated using high-resolution datasets based on observations, allowing the evaluation of the sensitivity of changing resolution in the real climate system. The results show that RCMs tend to reproduce the PAV relatively well compared to observations, although they overestimate it in the warm season and in mountainous regions.

  15. Clinical value of virtual three-dimensional instrument and cerebral aneurysm models in the interventional preoperative simulation

    International Nuclear Information System (INIS)

    Wei Xin; Xie Xiaodong; Wang Chaohua

    2007-01-01

    Objective: To establish virtual three-dimensional instrument and cerebral aneurysm models using three-dimensional modeling software, and to explore the role of these models in interventional preoperative simulation. Methods: Virtual individual models including cerebral arteries and aneurysms were established using the three-dimensional modeling software 3D Studio MAX R3, based on standard virtual cerebral aneurysm models and individual DSA images. Virtual catheters, guide wires, stents and coils were also established. The interventional preoperative simulation study was run on a personal computer and included 3 clinical cases. Results: The simulated working angle and the moulding angle of the head of the catheter and guide wire in the 3 cases were identical with the operation results. The simulated number and size of coils required in 1 case of anterior communicating aneurysm and 1 case of posterior communicating aneurysm were identical with the operation results. In 1 case of giant internal carotid artery aneurysm, the simulation called for two more three-dimensional coils (3 mm x 3 cm) for the aneurysmal shape than were used in the operation, and the position of the second coil in the aneurysmal neck was adjusted according to the results of real-time simulation. The retrospective simulation of the operation procedures indicated that the simulation method could become routine for regular and small aneurysms, but more simulation experience needs to be accumulated for giant aneurysms. Conclusions: The virtual three-dimensional instrument and cerebral aneurysm models established with general-purpose software provide a new study method for neurointerventional preoperative simulation and play an important guidance role in performing neurointerventional operations. (authors)

  16. [Finite element modeling of material property assignment based on CT gray value and its application in simulation of osteotomy for deformities].

    Science.gov (United States)

    Ouyang, Han-Bin; Xie, Pu-Sheng; Deng, Yu-Ping; Yang, Yang; Chen, Peng-Yu; Huang, Hua-Jun; Huang, Wen-Hua

    2016-06-20

    To explore a new method for finite element modeling that achieves material property assignment based on in situ CT gray values in simulated osteotomies for deformities. A CT scan dataset of the lower limb of a patient with extorsion deformity was obtained for three-dimensional reconstruction in Mimics software and preparation of a solid model. In the CAD software, the parameters for the osteotomy simulation were defined, including the navigation axis, rotation angle and reference plane. The tibia model was imported into the FEA pre-processing software for meshing and then exported back to Mimics. All segments of the meshed tibia model were assigned inhomogeneous material properties based on the relationship between CT gray values and material properties in Mimics. Finally, all segments of the tibia model, the reference axis and the reference plane were assembled in the pre-processing software to form a full finite element model of the corrected tibia, which was submitted to the solver for biomechanical analysis. The tibia model established using our modeling method had inhomogeneous material properties based on CT gray values and was available for finite element analysis of the simulated osteotomy. The proposed finite element modeling method, which retains the accuracy of material property assignment based on CT gray values, can solve the reposition problem commonly seen in modeling via the routine method of property assignment, and provides an efficient, flexible and accurate computational biomechanical analysis method for orthopedic surgery.
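
    The material-assignment step described above is commonly implemented as a per-element mapping from Hounsfield units to apparent density and then to Young's modulus. A sketch of that mapping follows; the linear HU-to-density calibration and the density-modulus power law are standard empirical forms, but the constants are placeholders rather than the paper's values.

```python
# Per-element CT gray value -> density -> elastic modulus mapping.
import numpy as np

hu = np.array([150.0, 400.0, 900.0, 1400.0])  # mean HU per element (hypothetical)

rho = 0.001 * hu + 0.1      # apparent density, g/cm^3 (assumed linear calibration)
E = 6850.0 * rho**1.49      # Young's modulus, MPa (assumed power law)
nu = np.full_like(E, 0.3)   # Poisson's ratio, often taken as constant

for h, e in zip(hu, E):
    print(f"HU = {h:6.0f}  ->  E = {e:7.1f} MPa")
```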

  17. Comparison of radiance and polarization values observed in the Mediterranean Sea and simulated in a Monte Carlo model

    DEFF Research Database (Denmark)

    Adams, J.T.; Aas, E.; Højerslev, N.K.

    2002-01-01

    Measurements of the radiance and degree of polarization made in 1971 in the Mediterranean Sea are presented along with the simulation of all observed quantities by a Monte Carlo technique. It is shown that our independent scattering treatment utilizing a Stokes vector formalism to describe...... the polarization state of the light field produces remarkably good agreement with those values measured in situ. (C) 2002 Optical Society of America...

  18. A comparison of model-based imputation methods for handling missing predictor values in a linear regression model: A simulation study

    Science.gov (United States)

    Hasan, Haliza; Ahmad, Sanizah; Osman, Balkish Mohd; Sapri, Shamsiah; Othman, Nadirah

    2017-08-01

    In regression analysis, missing covariate data has been a common problem. Many researchers use ad hoc methods to overcome this problem due to the ease of implementation. However, these methods require assumptions about the data that rarely hold in practice. Model-based methods such as Maximum Likelihood (ML) using the expectation maximization (EM) algorithm and Multiple Imputation (MI) are more promising when dealing with difficulties caused by missing data. On the other hand, inappropriate methods of missing value imputation can lead to serious bias that severely affects the parameter estimates. The main objective of this study is to provide a better understanding of missing data concepts that can assist the researcher in selecting the appropriate missing data imputation method. A simulation study was performed to assess the effects of different missing data techniques on the performance of a regression model. The covariate data were generated from an underlying multivariate normal distribution, and the dependent variable was generated as a combination of the explanatory variables. Missing values in the covariate were simulated using a mechanism called missing at random (MAR). Four levels of missingness (10%, 20%, 30% and 40%) were imposed. The ML and MI techniques available within SAS software were investigated. A linear regression model was fitted and the model performance measures, MSE and R-squared, were obtained. Results of the analysis showed that MI is superior in handling missing data, with the highest R-squared and lowest MSE, when the percentage of missingness is less than 30%. Both methods are unable to handle levels of missingness larger than 30%.
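
    A scaled-down version of this simulation design can be written in a few lines: draw correlated covariates, impose MAR missingness on one of them through a fully observed one, and compare imputation strategies. The sketch below compares mean imputation with regression (conditional-mean) imputation rather than the paper's ML/MI procedures, and all distribution parameters and the ~30% missingness level are illustrative.

```python
# MAR simulation: missingness in x2 depends only on the observed x1.
import numpy as np

rng = np.random.default_rng(2)
n = 2000
cov = np.array([[1.0, 0.5], [0.5, 1.0]])
X = rng.multivariate_normal([0, 0], cov, size=n)      # x1 observed, x2 may be missing
y = 1.0 + 2.0 * X[:, 0] + 3.0 * X[:, 1] + rng.normal(0, 1, n)

p_miss = 1 / (1 + np.exp(-(X[:, 0] - 0.5)))           # higher x1 -> more likely missing
miss = rng.random(n) < 0.3 * p_miss / p_miss.mean()   # ~30% missing on average

def slope_of_x2(x2_filled):
    """OLS fit of y on (1, x1, filled x2); returns the x2 slope (true value 3.0)."""
    A = np.column_stack([np.ones(n), X[:, 0], x2_filled])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta[2]

x2_mean = np.where(miss, X[~miss, 1].mean(), X[:, 1])  # mean imputation

b = np.polyfit(X[~miss, 0], X[~miss, 1], 1)            # predict x2 from x1 (complete cases)
x2_reg = np.where(miss, np.polyval(b, X[:, 0]), X[:, 1])

print(f"slope, mean imputation:       {slope_of_x2(x2_mean):.2f}")
print(f"slope, regression imputation: {slope_of_x2(x2_reg):.2f}")
```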

  19. Simulation model estimates of test accuracy and predictive values for the Danish Salmonella surveillance program in dairy herds

    DEFF Research Database (Denmark)

    Warnick, L.D.; Nielsen, L.R.; Nielsen, Jens

    2006-01-01

    The Danish government and cattle industry instituted a Salmonella surveillance program in October 2002 to help reduce Salmonella enterica subsp. enterica serotype Dublin (S. Dublin) infections. All dairy herds are tested by measuring antibodies in bulk tank milk at 3-month intervals. The program...... is based on a well-established ELISA, but the overall test program accuracy and misclassification had not previously been investigated. We developed a model to simulate repeated bulk tank milk antibody measurements for dairy herds conditional on true infection status. The distributions of bulk tank milk...

  20. Multifractal Value at Risk model

    Science.gov (United States)

    Lee, Hojin; Song, Jae Wook; Chang, Woojin

    2016-06-01

    In this paper a new Value at Risk (VaR) model is proposed and investigated. We consider the multifractal property of financial time series and develop a multifractal Value at Risk (MFVaR). The MFVaR introduced in this paper is analytically tractable and not based on simulation. An empirical study showed that MFVaR provides more stable and accurate forecasting performance in volatile financial markets where large losses can be incurred. This implies that our multifractal VaR works well for the risk measurement of extreme credit events.

  1. EMC Simulation and Modeling

    Science.gov (United States)

    Takahashi, Takehiro; Schibuya, Noboru

    EMC simulation is now widely used in the design stage of electronic equipment to reduce electromagnetic noise. As the electromagnetic behaviors calculated by an EMC simulator depend on the EMC model of the equipment given as input, the modeling technique is important for obtaining effective results. In this paper, a simple outline of the EMC simulator and the EMC model is given. Some modeling techniques for EMC simulation are also described, with an example of an EMC model of a shield box with an aperture.

  2. A simulation model to quantify the value of implementing whole-herd Bovine viral diarrhea virus testing strategies in beef cow-calf herds.

    Science.gov (United States)

    Nickell, Jason S; White, Brad J; Larson, Robert L; Renter, David G; Sanderson, Mike W

    2011-03-01

    Although numerous diagnostic tests are available to identify cattle persistently infected (PI) with Bovine viral diarrhea virus (BVDV) in cow-calf herds, data are sparse when evaluating the economic viability of individual tests or diagnostic strategies. Multiple factors influence whether BVDV testing should be performed and which strategy to use. A stochastic model was constructed to estimate the value of implementing various whole-herd BVDV cow-calf testing protocols. Three common BVDV tests (immunohistochemistry, antigen-capture enzyme-linked immunosorbent assay, and polymerase chain reaction) performed on skin tissue were evaluated as single- or two-test strategies. The estimated testing value was calculated for each strategy at 3 herd sizes that reflect typical farm sizes in the United States (50, 100, and 500 cows) and 3 probabilities of BVDV-positive herd status (0.077, 0.19, 0.47) based upon the literature. The economic value of testing was the difference in estimated gross revenue between simulated cow-calf herds that either did or did not apply the specific testing strategy. Beneficial economic outcomes were more frequently observed when the probability of a herd being BVDV positive was 0.47. Although the relative value ranking of many testing strategies varied by each scenario, the two-test strategy composed of immunohistochemistry had the highest estimated value in all but one herd size-herd prevalence permutation. These data indicate that the estimated value of applying BVDV whole-herd testing strategies is influenced by the selected strategy, herd size, and the probability of herd BVDV-positive status; therefore, these factors should be considered when designing optimum testing strategies for cow-calf herds.

  3. Simulation modeling and arena

    CERN Document Server

    Rossetti, Manuel D

    2015-01-01

    Emphasizes a hands-on approach to learning statistical analysis and model building through the use of comprehensive examples, problems sets, and software applications With a unique blend of theory and applications, Simulation Modeling and Arena®, Second Edition integrates coverage of statistical analysis and model building to emphasize the importance of both topics in simulation. Featuring introductory coverage on how simulation works and why it matters, the Second Edition expands coverage on static simulation and the applications of spreadsheets to perform simulation. The new edition als

  4. Coal value chain - simulation model

    CSIR Research Space (South Africa)

    Fourie, M

    2005-08-01

    Full Text Available Coal Value Chain Simulation Model. Melanie Fourie (Sasol Technology) and Johan Janse van Rensburg (CSIR). Copyright reserved 2005, Sasol Technology & Sasol Mining. 19th SAIIE and 35th ORSSA Conference 2005. Outline: Background; Simulation objectives; Simulation model; M...

  5. Simulation in Complex Modelling

    DEFF Research Database (Denmark)

    Nicholas, Paul; Ramsgaard Thomsen, Mette; Tamke, Martin

    2017-01-01

    This paper will discuss the role of simulation in extended architectural design modelling. As a framing paper, the aim is to present and discuss the role of integrated design simulation and feedback between design and simulation in a series of projects under the Complex Modelling framework. Complex...... Restraint developed for the research exhibition Complex Modelling, Meldahls Smedie Gallery, Copenhagen in 2016. Where the direct project aims and outcomes have been reported elsewhere, the aim for this paper is to discuss overarching strategies for working with design integrated simulation....

  6. A Monte Carlo Simulation Comparing the Statistical Precision of Two High-Stakes Teacher Evaluation Methods: A Value-Added Model and a Composite Measure

    Science.gov (United States)

    Spencer, Bryden

    2016-01-01

    Value-added models are a class of growth models used in education to assign responsibility for student growth to teachers or schools. For value-added models to be used fairly, sufficient statistical precision is necessary for accurate teacher classification. Previous research indicated precision below practical limits. An alternative approach has…

  7. The perceived value of using BIM for energy simulation

    Science.gov (United States)

    Lewis, Anderson M.

    Building Information Modeling (BIM) is becoming an increasingly important tool in the Architectural, Engineering & Construction (AEC) industries. The benefits associated with BIM include, but are not limited to, cost and time savings through greater trade and design coordination and more accurate estimating take-offs. BIM is virtual 3D, parametric design software that allows users to store information within a model and can be used as a communication platform between project stakeholders. Likewise, energy simulation is an integral tool for predicting and optimizing a building's performance during design. Creating energy models and running energy simulations can be time consuming due to the large number of parameters and assumptions that must be addressed to achieve reasonably accurate results. However, leveraging information embedded within Building Information Models (BIMs) has the potential to increase accuracy and reduce the amount of time required to run energy simulations, and can facilitate continuous energy simulation throughout the design process, thus optimizing building performance. Although some literature exists on how design stakeholders perceive the benefits associated with leveraging BIM for energy simulation, little is known about how these perceptions differ between various green design stakeholder user groups. Through an e-survey instrument, this study seeks to determine how perceptions of using BIMs to inform energy simulation differ among distinct design stakeholder groups, which include BIM-only users, energy simulation-only users, and BIM and energy simulation users. Additionally, this study seeks to determine what design stakeholders perceive as the main barriers to and benefits of implementing BIM-based energy simulation. Results from this study suggest that little to no correlation exists between green design stakeholders' perceptions of the value associated with using

  8. Simulation in Complex Modelling

    DEFF Research Database (Denmark)

    Nicholas, Paul; Ramsgaard Thomsen, Mette; Tamke, Martin

    2017-01-01

    This paper will discuss the role of simulation in extended architectural design modelling. As a framing paper, the aim is to present and discuss the role of integrated design simulation and feedback between design and simulation in a series of projects under the Complex Modelling framework. Complex...... performance, engage with high degrees of interdependency and allow the emergence of design agency and feedback between the multiple scales of architectural construction. This paper presents examples for integrated design simulation from a series of projects including Lace Wall, A Bridge Too Far and Inflated...... Restraint developed for the research exhibition Complex Modelling, Meldahls Smedie Gallery, Copenhagen in 2016. Where the direct project aims and outcomes have been reported elsewhere, the aim for this paper is to discuss overarching strategies for working with design integrated simulation....

  9. Scientific Modeling and simulations

    CERN Document Server

    Diaz de la Rubia, Tomás

    2009-01-01

    Showcases the conceptual advantages of modeling which, coupled with unprecedented computing power through simulations, allows scientists to tackle the formidable problems of our society, such as the search for hydrocarbons, understanding the structure of a virus, or the intersection between simulations and real data in extreme environments

  10. Computer Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Pronskikh, V. S. [Fermilab

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of scientific practice of high importance that have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model's relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction needs to be made between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them). Holding on to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement, I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviating the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.

  11. Automated Simulation Model Generation

    NARCIS (Netherlands)

    Huang, Y.

    2013-01-01

    One of today's challenges in the field of modeling and simulation is to model increasingly larger and more complex systems. Complex models take long to develop and incur high costs. With the advances in data collection technologies and more popular use of computer-aided systems, more data has become

  12. Participatory Systems Modeling to Explore Sustainable Solutions: Triple-Value Simulation Modeling Cases Tackle Nutrient and Watershed Management from a Socio-Ecological Systems (ses) Perspective

    Science.gov (United States)

    Buchholtz ten Brink, M. R.; Heineman, K.; Foley, G. J.; Ruder, E.; Tanners, N.; Bassi, A.; Fiksel, J.

    2016-12-01

    Decision makers often need assistance in understanding dynamic interactions and linkages among economic, environmental and social systems in coastal watersheds. They also need scientific input to better evaluate the potential costs and benefits of alternative policy interventions. The US EPA is applying sustainability science to address these needs. Triple Value (3V) Scoping and Modeling projects bring a systems approach to understanding complex environmental problems, incorporate local knowledge, and allow decision-makers to explore policy scenarios. This leads to better understanding of feedbacks and outcomes for both human and environmental systems. The Suffolk County, NY (eastern Long Island) 3V Case uses SES interconnections to explore possible policy options and scenarios for intervention to mitigate the effects of excess nitrogen (N) loading to ground, surface, and estuarine waters. Many of the environmental impacts of N pollution have adverse effects on social and economic well-being and productivity. Key among these are the loss of enjoyment and recreational use of local beach environments and the loss of income and revenues from tourism and local fisheries. Stakeholders generated this Problem Statement: Suffolk County is experiencing widespread degradation to groundwater and the coastal marine environment caused by excess nitrogen. How can local stakeholders and decision makers in Suffolk County arrest and reverse this degradation, restore conditions to support a healthy thriving ecosystem, strengthen the County's resilience to emerging and expected environmental threats from global climate change, support and promote economic growth, attract a vibrant and sustainable workforce, and maintain and enhance quality of life and affordability for all County residents? They then built a Causal Loop Diagram of indicators and relationships that reflect these issues and identified a set of alternative policy interventions to address them. The project team conducted an extensive review of

  13. AEGIS geologic simulation model

    International Nuclear Information System (INIS)

    Foley, M.G.

    1982-01-01

    The Geologic Simulation Model (GSM) is used by the AEGIS (Assessment of Effectiveness of Geologic Isolation Systems) program at the Pacific Northwest Laboratory to simulate the dynamic geology and hydrology of a geologic nuclear waste repository site over a million-year period following repository closure. The GSM helps to organize geologic/hydrologic data; to focus attention on active natural processes by requiring their simulation; and, through interactive simulation and calibration, to reduce subjective evaluations of the geologic system. During each computer run, the GSM produces a million-year geologic history that is possible for the region and the repository site. In addition, the GSM records in permanent history files everything that occurred during that time span. Statistical analyses of data in the history files of several hundred simulations are used to classify typical evolutionary paths, to establish the probabilities associated with deviations from the typical paths, and to determine which types of perturbations of the geologic/hydrologic system, if any, are most likely to occur. These simulations will be evaluated by geologists familiar with the repository region to determine validity of the results. Perturbed systems that are determined to be the most realistic, within whatever probability limits are established, will be used for the analyses that involve radionuclide transport and dose models. The GSM is designed to be continuously refined and updated. Simulation models are site specific, and, although the submodels may have limited general applicability, the input data requirements necessitate detailed characterization of each site before application

  14. Validation of simulation models

    DEFF Research Database (Denmark)

    Rehman, Muniza; Pedersen, Stig Andur

    2012-01-01

    In philosophy of science, interest in computational models and simulations has increased heavily during the past decades. Different positions regarding the validity of models have emerged, but these views have not succeeded in capturing the diversity of validation methods. The wide variety...

  15. Mathematical modeling and simulation in animal health - Part II: principles, methods, applications, and value of physiologically based pharmacokinetic modeling in veterinary medicine and food safety assessment.

    Science.gov (United States)

    Lin, Z; Gehring, R; Mochel, J P; Lavé, T; Riviere, J E

    2016-10-01

    This review provides a tutorial for individuals interested in quantitative veterinary pharmacology and toxicology and offers a basis for establishing guidelines for physiologically based pharmacokinetic (PBPK) model development and application in veterinary medicine. This is important as the application of PBPK modeling in veterinary medicine has evolved over the past two decades. PBPK models can be used to predict drug tissue residues and withdrawal times in food-producing animals, to estimate chemical concentrations at the site of action and target organ toxicity to aid risk assessment of environmental contaminants and/or drugs in both domestic animals and wildlife, as well as to help design therapeutic regimens for veterinary drugs. This review provides a comprehensive summary of PBPK modeling principles, model development methodology, and the current applications in veterinary medicine, with a focus on predictions of drug tissue residues and withdrawal times in food-producing animals. The advantages and disadvantages of PBPK modeling compared to other pharmacokinetic modeling approaches (i.e., classical compartmental/noncompartmental modeling, nonlinear mixed-effects modeling, and interspecies allometric scaling) are further presented. The review finally discusses contemporary challenges and our perspectives on model documentation, evaluation criteria, quality improvement, and offers solutions to increase model acceptance and applications in veterinary pharmacology and toxicology. © 2016 John Wiley & Sons Ltd.
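
    As a concrete anchor for readers new to PBPK, the sketch below integrates a deliberately minimal flow-limited model: a venous blood pool plus liver (with clearance) and muscle compartments. All flows, volumes, partition coefficients and the clearance value are placeholders, not species-specific parameters from the review.

```python
# Minimal flow-limited PBPK sketch (blood + liver + muscle), Euler-integrated.
import numpy as np

Q_liv, Q_mus = 90.0, 60.0              # blood flows, L/h (assumed)
V_bld, V_liv, V_mus = 5.0, 1.5, 30.0   # compartment volumes, L (assumed)
P_liv, P_mus = 2.0, 1.0                # tissue:blood partition coefficients (assumed)
CL_int = 40.0                          # hepatic intrinsic clearance, L/h (assumed)

dt, t_end = 0.001, 24.0                # h
C_bld, C_liv, C_mus = 10.0, 0.0, 0.0   # mg/L after a hypothetical IV bolus

for _ in range(int(t_end / dt)):
    # Venous concentration leaving each tissue is C_tissue / P_tissue.
    dC_liv = (Q_liv * (C_bld - C_liv / P_liv) - CL_int * C_liv / P_liv) / V_liv
    dC_mus = (Q_mus * (C_bld - C_mus / P_mus)) / V_mus
    dC_bld = (Q_liv * (C_liv / P_liv - C_bld) + Q_mus * (C_mus / P_mus - C_bld)) / V_bld
    C_liv += dC_liv * dt
    C_mus += dC_mus * dt
    C_bld += dC_bld * dt

print(f"blood {C_bld:.3f}, liver {C_liv:.3f}, muscle {C_mus:.3f} mg/L at 24 h")
```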

  16. Simulating cyber warfare and cyber defenses: information value considerations

    Science.gov (United States)

    Stytz, Martin R.; Banks, Sheila B.

    2011-06-01

    Simulating cyber warfare is critical to the preparation of decision-makers for the challenges posed by cyber attacks. Simulation is the only means we have to prepare decision-makers for the inevitable cyber attacks upon the information they will need for decision-making and to develop cyber warfare strategies and tactics. Currently, there is no theory regarding the strategies that should be used to achieve objectives in offensive or defensive cyber warfare, and cyber warfare occurs too rarely to use real-world experience to develop effective strategies. To simulate cyber warfare by affecting the information used for decision-making, we modify the information content of the rings that are compromised in a decision-making context. The number of rings affected and the value of the information that is altered (i.e., the closeness of the ring to the center) is determined by the expertise of the decision-maker and the learning outcome(s) for the simulation exercise. We determine which information rings are compromised using the probability that the simulated cyber defenses that protect each ring can be compromised. These probabilities are based upon prior cyber attack activity in the simulation exercise as well as similar real-world cyber attacks. To determine which information in a compromised "ring" to alter, the simulation environment maintains a record of the cyber attacks that have succeeded in the simulation environment as well as the decision-making context. These two pieces of information are used to compute an estimate of the likelihood that the cyber attack can alter, destroy, or falsify each piece of information in a compromised ring. The unpredictability of information alteration in our approach adds greater realism to the cyber event. This paper suggests a new technique that can be used for cyber warfare simulation, the ring approach for modeling context-dependent information value, and our means for considering information value when assigning cyber

  17. PSH Transient Simulation Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Muljadi, Eduard [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-12-21

    PSH Transient Simulation Modeling presentation from the WPTO FY14 - FY16 Peer Review. Transient effects are an important consideration when designing a PSH system, yet numerical techniques for hydraulic transient analysis still need improvements for adjustable-speed (AS) reversible pump-turbine applications.

  18. Wake modeling and simulation

    DEFF Research Database (Denmark)

    Larsen, Gunner Chr.; Madsen Aagaard, Helge; Larsen, Torben J.

    We present a consistent, physically based theory for the wake meandering phenomenon, which we consider of crucial importance for the overall description of wind turbine loadings in wind farms. In its present version the model is confined to single wake situations. The model philosophy does, howev...... methodology has been implemented in the aeroelastic code HAWC2, and example simulations of wake situations, from the small Tjæreborg wind farm, have been performed showing satisfactory agreement between predictions and measurements...

  19. Analysis of Macro-micro Simulation Models for Service-Oriented Public Platform: Coordination of Networked Services and Measurement of Public Values

    Science.gov (United States)

    Kinoshita, Yumiko

    As service sectors are a major driver for the growth of the world economy, we are challenged to implement service-oriented infrastructure as an e-Gov platform to achieve further growth and innovation in both developed and developing countries. Recent trends in the service industry indicate that the main factors for its growth are investment in knowledge, trade, and the enhanced capacity of micro, small, and medium-sized enterprises (MSMEs). In addition, the design and deployment of a public service platform require an appropriate evaluation methodology. Reflecting these observations, this paper proposes a macro-micro simulation approach to assess public values (PV), focusing on MSMEs. Linkage aggregate variables (LAVs) are defined to show the connection between macro and micro impacts of public services. As a result, the relationships among demography, business environment, macro economy, and socio-economic impact are clarified, and their values are quantified from the behavioral perspectives of citizens and firms.

  20. The models of phase transformations definition in the software Deform and their effect on the output values from the numerical simulation of gear thermal processing.

    Directory of Open Access Journals (Sweden)

    Sona Benesova

    2014-11-01

    Full Text Available With the aid of DEFORM® software it is possible to conduct numerical simulation of workpiece phase composition during and after heat treatment. The computation can be based either on the graphical representation of the TTT diagram of the steel in question or on one of the mathematical models integrated in the software, the latter being applicable if the required constants are known. The present paper evaluates the differences between results of numerical simulations with various definitions of phase transformation for the heat treatment of a gearwheel and a specially prepared specimen of simple shape. It was found that the preparation of input data, in terms of thorough mapping of the characteristics of the material, is essential.

  1. NET PRESENT VALUE SIMULATING WITH A SPREADSHEET

    Directory of Open Access Journals (Sweden)

    Maria CONSTANTINESCU

    2010-01-01

    Full Text Available Decision making has always been a difficult process, based on various combinations of objectivity (when scientific tools are used) and subjectivity (considering that decisions are finally made by people, with their strengths and weaknesses). The IT revolution has also reached the areas of management and decision making, helping managers make better and more informed decisions by providing them with a variety of tools, from personal computers to specialized software. Most simulations are performed in a spreadsheet, because the number of calculations required soon overwhelms human capability.
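
    The spreadsheet experiment the abstract alludes to is easy to restate in code: draw uncertain cash flows, discount them, and inspect the NPV distribution. The cash-flow distribution, discount rate and initial outlay below are invented for illustration.

```python
# Monte Carlo net present value with uncertain annual cash flows.
import numpy as np

rng = np.random.default_rng(3)
n_sims, years, rate = 100_000, 5, 0.10
initial_outlay = 300.0                         # thousands (assumed)

# Uncertain annual cash flows: triangular(min, most likely, max), in thousands.
cash = rng.triangular(60.0, 90.0, 130.0, size=(n_sims, years))
discount = (1.0 + rate) ** -np.arange(1, years + 1)
npv = cash @ discount - initial_outlay

print(f"mean NPV = {npv.mean():.1f}, P(NPV < 0) = {(npv < 0).mean():.1%}")
```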

  2. PDOP values for simulated GPS/Galileo positioning

    DEFF Research Database (Denmark)

    Cederholm, Jens Peter

    2005-01-01

    The paper illustrates satellite coverage and PDOP values for a simulated combined GPS/Galileo system. The designed GPS satellite constellation and the planned Galileo satellite constellation are presented. The combined system is simulated and the number of visible satellites and PDOP values...

  3. PDOP values for simulated GPS/Galileo positioning

    DEFF Research Database (Denmark)

    Cederholm, Jens Peter

    2005-01-01

    The paper illustrates satellite coverage and PDOP values for a simulated combined GPS/Galileo system. The designed GPS satellite constellation and the planned Galileo satellite constellation are presented. The combined system is simulated and the number of visible satellites and PDOP values are e...
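
    For context, PDOP follows directly from satellite geometry: build the design matrix of unit line-of-sight vectors plus a receiver-clock column, and take the position block of the cofactor matrix. A minimal sketch with a made-up azimuth/elevation list (a real GPS/Galileo study would generate these from the two constellations):

```python
# PDOP from a snapshot of satellite azimuths/elevations (hypothetical values).
import numpy as np

az_el = np.radians([(40, 60), (130, 35), (220, 25), (310, 50), (80, 15), (180, 70)])

rows = []
for az, el in az_el:
    # Unit vector from receiver to satellite in a local east-north-up frame.
    e = np.cos(el) * np.sin(az)
    n = np.cos(el) * np.cos(az)
    u = np.sin(el)
    rows.append([-e, -n, -u, 1.0])   # last column: receiver clock bias
G = np.array(rows)

Q = np.linalg.inv(G.T @ G)           # cofactor matrix of the position/clock solution
pdop = np.sqrt(np.trace(Q[:3, :3]))  # position dilution of precision
print(f"PDOP = {pdop:.2f}")
```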

  4. Simulation - modeling - experiment

    International Nuclear Information System (INIS)

    2004-01-01

    After two workshops held in 2001 on the same topics, and in order to make a status of the advances in the domain of simulation and measurements, the main goals proposed for this workshop are: the presentation of the state-of-the-art of tools, methods and experiments in the domains of interest of the Gedepeon research group, the exchange of information about the possibilities of use of computer codes and facilities, about the understanding of physical and chemical phenomena, and about development and experiment needs. This document gathers 18 presentations (slides) among the 19 given at this workshop and dealing with: the deterministic and stochastic codes in reactor physics (Rimpault G.); MURE: an evolution code coupled with MCNP (Meplan O.); neutronic calculation of future reactors at EdF (Lecarpentier D.); advance status of the MCNP/TRIO-U neutronic/thermal-hydraulics coupling (Nuttin A.); the FLICA4/TRIPOLI4 thermal-hydraulics/neutronics coupling (Aniel S.); methods of disturbances and sensitivity analysis of nuclear data in reactor physics, application to VENUS-2 experimental reactor (Bidaud A.); modeling for the reliability improvement of an ADS accelerator (Biarotte J.L.); residual gas compensation of the space charge of intense beams (Ben Ismail A.); experimental determination and numerical modeling of phase equilibrium diagrams of interest in nuclear applications (Gachon J.C.); modeling of irradiation effects (Barbu A.); elastic limit and irradiation damage in Fe-Cr alloys: simulation and experiment (Pontikis V.); experimental measurements of spallation residues, comparison with Monte-Carlo simulation codes (Fallot M.); the spallation target-reactor coupling (Rimpault G.); tools and data (Grouiller J.P.); models in high energy transport codes: status and perspective (Leray S.); other ways of investigation for spallation (Audoin L.); neutrons and light particles production at intermediate energies (20-200 MeV) with iron, lead and uranium targets (Le Colley F

  5. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    leaving students. It is a probabilistic model. In the next part of this article, two more models - an 'input/output model' used for production systems or economic studies and a 'discrete event simulation model' - are introduced. Aircraft Performance Model.

  6. Achieving Value in Primary Care: The Primary Care Value Model.

    Science.gov (United States)

    Rollow, William; Cucchiara, Peter

    2016-03-01

    The patient-centered medical home (PCMH) model provides a compelling vision for primary care transformation, but studies of its impact have used insufficiently patient-centered metrics with inconsistent results. We propose a framework for defining patient-centered value and a new model for value-based primary care transformation: the primary care value model (PCVM). We advocate for use of patient-centered value when measuring the impact of primary care transformation, recognition, and performance-based payment; for financial support and research and development to better define primary care value-creating activities and their implementation; and for use of the model to support primary care organizations in transformation. © 2016 Annals of Family Medicine, Inc.

  7. Modelling and Simulation: An Overview

    OpenAIRE

    McAleer, Michael; Chan, Felix; Oxley, Les

    2013-01-01

    This discussion paper resulted in a publication in 'Selected Papers of the MSSANZ 19th Biennial Conference on Modelling and Simulation', Mathematics and Computers in Simulation, 2013, pp. viii. The papers in this special issue of Mathematics and Computers in Simulation cover the following topics: improving judgmental adjustment of model-based forecasts, whether forecast updates are progressive, on a constrained mixture vector autoregressive model, whether all estimators are born equal: the emp...

  8. Notes on modeling and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Redondo, Antonio [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-03-10

    These notes present a high-level overview of how modeling and simulation are carried out by practitioners. The discussion is of a general nature; no specific techniques are examined but the activities associated with all modeling and simulation approaches are briefly addressed. There is also a discussion of validation and verification and, at the end, a section on why modeling and simulation are useful.

  9. Mean Value Modelling of Turbocharged SI Engines

    DEFF Research Database (Denmark)

    Müller, Martin; Hendricks, Elbert; Sorenson, Spencer C.

    1998-01-01

    This paper describes the development of a computer simulation to predict the performance of a turbocharged spark ignition engine during transient operation. New models have been developed for the turbocharger and the intercooling system. An adiabatic model for the intake manifold is presented.

  10. Proving the ecosystem value through hydrological modelling

    Science.gov (United States)

    Dorner, W.; Spachinger, K.; Porter, M.; Metzka, R.

    2008-11-01

    Ecosystems provide valuable functions. Natural floodplains and river structures also offer different types of ecosystem functions, such as habitat, recreational area and natural detention. From an economic standpoint, the loss (or rehabilitation) of these natural systems and the services they provide can be valued as a damage (or benefit). Consequently, these natural goods and services must be economically valued in project assessments, e.g. cost-benefit analysis or cost comparison. Especially in smaller catchments and river systems, there is significant evidence that natural flood detention reduces flood risk and contributes to flood protection. Several research projects have evaluated the mitigating effect of land use, river training and the loss of natural floodplains on the development, peak and volume of floods. The presented project analyses the hypothesis that ignoring natural detention and hydrological ecosystem services could result in economically inefficient solutions for flood protection and mitigation. In test areas, subcatchments of the Danube in Germany, a combination of hydrological and hydrodynamic models with economic evaluation techniques was applied. Different forms of land use, river structure and flood protection measures were assessed and compared from a hydrological and economic point of view. A hydrodynamic model was used to simulate flows to assess the extent of flood-affected areas and damages to buildings and infrastructure, as well as to investigate the impacts of levees and river structure on a local scale. These model results provided the basis for an economic assessment. Different economic valuation techniques, such as flood damage functions, the cost comparison method and the substitution approach, were used to compare the outcomes of different hydrological scenarios from an economic point of view and to value the ecosystem service. The results give significant evidence that natural detention must be evaluated as part of flood mitigation projects.

  11. Proving the ecosystem value through hydrological modelling

    International Nuclear Information System (INIS)

    Dorner, W; Spachinger, K; Metzka, R; Porter, M

    2008-01-01

    Ecosystems provide valuable functions. Natural floodplains and river structures also offer different types of ecosystem functions, such as habitat, recreational area and natural detention. From an economic standpoint, the loss (or rehabilitation) of these natural systems and the services they provide can be valued as a damage (or benefit). Consequently, these natural goods and services must be economically valued in project assessments, e.g. cost-benefit analysis or cost comparison. Especially in smaller catchments and river systems, there is significant evidence that natural flood detention reduces flood risk and contributes to flood protection. Several research projects have evaluated the mitigating effect of land use, river training and the loss of natural floodplains on the development, peak and volume of floods. The presented project analyses the hypothesis that ignoring natural detention and hydrological ecosystem services could result in economically inefficient solutions for flood protection and mitigation. In test areas, subcatchments of the Danube in Germany, a combination of hydrological and hydrodynamic models with economic evaluation techniques was applied. Different forms of land use, river structure and flood protection measures were assessed and compared from a hydrological and economic point of view. A hydrodynamic model was used to simulate flows to assess the extent of flood-affected areas and damages to buildings and infrastructure, as well as to investigate the impacts of levees and river structure on a local scale. These model results provided the basis for an economic assessment. Different economic valuation techniques, such as flood damage functions, the cost comparison method and the substitution approach, were used to compare the outcomes of different hydrological scenarios from an economic point of view and to value the ecosystem service. The results give significant evidence that natural detention must be evaluated as part of flood mitigation projects.

  12. Simulation Model of a Transient

    DEFF Research Database (Denmark)

    Jauch, Clemens; Sørensen, Poul; Bak-Jensen, Birgitte

    2005-01-01

    This paper describes the simulation model of a controller that enables an active-stall wind turbine to ride through transient faults. The simulated wind turbine is connected to a simple model of a power system. Certain fault scenarios are specified and the turbine shall be able to sustain operati...

  13. Cognitive models embedded in system simulation models

    International Nuclear Information System (INIS)

    Siegel, A.I.; Wolf, J.J.

    1982-01-01

    If we are to discuss and consider cognitive models, we must first come to grips with two questions: (1) What is cognition; (2) What is a model. Presumably, the answers to these questions can provide a basis for defining a cognitive model. Accordingly, this paper first places these two questions into perspective. Then, cognitive models are set within the context of computer simulation models and a number of computer simulations of cognitive processes are described. Finally, pervasive issues are discussed vis-a-vis cognitive modeling in the computer simulation context

  14. Simulation - modeling - experiment; Simulation - modelisation - experience

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-07-01

    After two workshops held in 2001 on the same topics, and in order to take stock of the advances in the domains of simulation and measurement, the main goals proposed for this workshop are: the presentation of the state-of-the-art of tools, methods and experiments in the domains of interest of the Gedepeon research group, and the exchange of information about the possibilities of using computer codes and facilities, about the understanding of physical and chemical phenomena, and about development and experimental needs. This document gathers 18 presentations (slides) among the 19 given at this workshop and dealing with: deterministic and stochastic codes in reactor physics (Rimpault G.); MURE: an evolution code coupled with MCNP (Meplan O.); neutronic calculations of future reactors at EdF (Lecarpentier D.); progress status of the MCNP/TRIO-U neutronics/thermal-hydraulics coupling (Nuttin A.); the FLICA4/TRIPOLI4 thermal-hydraulics/neutronics coupling (Aniel S.); perturbation methods and sensitivity analysis of nuclear data in reactor physics, with application to the VENUS-2 experimental reactor (Bidaud A.); modeling for the reliability improvement of an ADS accelerator (Biarotte J.L.); residual gas compensation of the space charge of intense beams (Ben Ismail A.); experimental determination and numerical modeling of phase equilibrium diagrams of interest in nuclear applications (Gachon J.C.); modeling of irradiation effects (Barbu A.); elastic limit and irradiation damage in Fe-Cr alloys: simulation and experiment (Pontikis V.); experimental measurements of spallation residues, comparison with Monte-Carlo simulation codes (Fallot M.); the spallation target-reactor coupling (Rimpault G.); tools and data (Grouiller J.P.); models in high energy transport codes: status and perspective (Leray S.); other ways of investigation for spallation (Audoin L.); neutron and light particle production at intermediate energies (20-200 MeV) with iron, lead and uranium targets (Le Colley F

  15. Incorporating Customer Lifetime Value into Marketing Simulation Games

    Science.gov (United States)

    Cannon, Hugh M.; Cannon, James N.; Schwaiger, Manfred

    2010-01-01

    Notwithstanding the emerging prominence of customer lifetime value (CLV) and customer equity (CE) in the marketing literature during the past decade, virtually nothing has been done to address these concepts in the literature on simulation and gaming. This article addresses this failing, discussing the nature of CLV and CE and demonstrating how…

  16. E3value to BPMN model transformation

    NARCIS (Netherlands)

    Fatemi, Hassan; van Sinderen, Marten J.; Wieringa, Roelf J.; Wieringa, P.A.; Camarinha-Matos, Luis M.; Pereira Klen, Alexandra; Afsarmanesh, Hamidesh

    2011-01-01

    Business value and coordination process perspectives need to be taken into consideration while modeling business collaborations. The need for these two models stems from the importance of separating the how from the what concerns. A business value model shows what is offered by whom to whom while a

  17. A Discrete Event Simulation Model to Assess the Economic Value of a Hypothetical Pharmacogenomics Test for Statin-Induced Myopathy in Patients Initiating a Statin in Secondary Cardiovascular Prevention.

    Science.gov (United States)

    Mitchell, Dominic; Guertin, Jason R; Dubois, Anick; Dubé, Marie-Pierre; Tardif, Jean-Claude; Iliza, Ange Christelle; Fanton-Aita, Fiorella; Matteau, Alexis; LeLorier, Jacques

    2018-04-12

    Statin (HMG-CoA reductase inhibitor) therapy is the mainstay dyslipidemia treatment and reduces the risk of a cardiovascular (CV) event (CVE) by up to 35%. However, adherence to statin therapy is poor. One reason patients discontinue statin therapy is musculoskeletal pain and the associated risk of rhabdomyolysis. Research is ongoing to develop a pharmacogenomics (PGx) test for statin-induced myopathy as an alternative to the current diagnosis method, which relies on creatine kinase levels. The potential economic value of a PGx test for statin-induced myopathy is unknown. We developed a lifetime discrete event simulation (DES) model for patients 65 years of age initiating a statin after a first CVE consisting of either an acute myocardial infarction (AMI) or a stroke. The model evaluates the potential economic value of a hypothetical PGx test for diagnosing statin-induced myopathy. We have assessed the model over the spectrum of test sensitivity and specificity parameters. Our model showed that a strategy with a perfect PGx test had an incremental cost-utility ratio of 4273 Canadian dollars ($Can) per quality-adjusted life year (QALY). The probabilistic sensitivity analysis shows that when the payer willingness-to-pay per QALY reaches $Can12,000, the PGx strategy is favored in 90% of the model simulations. We found that a strategy favoring patients staying on statin therapy is cost effective even if patients maintained on statin are at risk of rhabdomyolysis. Our results are explained by the fact that statins are highly effective in reducing the CV risk in patients at high CV risk, and this benefit largely outweighs the risk of rhabdomyolysis.
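
    As a hedged illustration of the decision rule used in such cost-utility analyses (the figures below are invented, not the study's results), the incremental cost-utility ratio compares the extra cost of the PGx strategy with the extra QALYs it yields, and the strategy is favored when the ratio falls below the payer's willingness-to-pay:

    ```python
    def icur(cost_new, qaly_new, cost_old, qaly_old):
        """Incremental cost-utility ratio: extra cost per extra QALY gained."""
        return (cost_new - cost_old) / (qaly_new - qaly_old)

    # Invented lifetime discounted values per patient ($Can, QALYs)
    cost_pgx, qaly_pgx = 41_000.0, 9.10     # PGx-guided strategy
    cost_std, qaly_std = 40_000.0, 8.90     # creatine kinase based usual care

    ratio = icur(cost_pgx, qaly_pgx, cost_std, qaly_std)
    wtp = 12_000.0                          # willingness-to-pay per QALY
    print(f"ICUR = {ratio:,.0f} $Can/QALY ->",
          "PGx strategy favored" if ratio <= wtp else "usual care favored")
    ```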

  18. An Interval-Valued Approach to Business Process Simulation Based on Genetic Algorithms and the BPMN

    Directory of Open Access Journals (Sweden)

    Mario G.C.A. Cimino

    2014-05-01

    Full Text Available Simulating organizational processes characterized by interacting human activities, resources, business rules and constraints is a challenging task, because of the inherent uncertainty, inaccuracy, variability and dynamicity. With regard to this problem, currently available business process simulation (BPS) methods and tools are unable to efficiently capture the process behavior along its lifecycle. In this paper, a novel approach to BPS is presented. To build and manage simulation models according to the proposed approach, a simulation system is designed, developed and tested on pilot scenarios, as well as on real-world processes. The proposed approach exploits interval-valued data to represent model parameters, in place of conventional single-valued or probability-valued parameters. Indeed, an interval-valued parameter is comprehensive; it is the easiest to understand and express and the simplest to process, among multi-valued representations. In order to compute the interval-valued output of the system, a genetic algorithm is used. The resulting process model allows forming mappings at different levels of detail and, therefore, at different model resolutions. The system has been developed as an extension of a publicly available simulation engine, based on the Business Process Model and Notation (BPMN) standard.
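
    A minimal sketch of the idea, under simplified assumptions: if model parameters are intervals, the output can be reported as an interval whose endpoints are found by a small genetic algorithm minimizing and then maximizing a toy process metric (the cycle-time function and all bounds below are hypothetical):

    ```python
    import random

    def ga_extremum(f, bounds, maximize, pop=40, gens=60, mut=0.2, p_mut=0.3):
        """Tiny real-coded GA: tournament selection, blend crossover and
        Gaussian mutation, with genes clamped to their parameter intervals."""
        P = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop)]
        key = (lambda x: f(x)) if maximize else (lambda x: -f(x))
        for _ in range(gens):
            Q = []
            while len(Q) < pop:
                a = max(random.sample(P, 3), key=key)   # tournament winner 1
                b = max(random.sample(P, 3), key=key)   # tournament winner 2
                child = []
                for x, y, (lo, hi) in zip(a, b, bounds):
                    g = random.uniform(min(x, y), max(x, y))  # blend crossover
                    if random.random() < p_mut:               # Gaussian mutation
                        g += random.gauss(0, mut * (hi - lo))
                    child.append(min(hi, max(lo, g)))         # clamp to interval
                Q.append(child)
            P = Q
        return f(max(P, key=key))

    # Hypothetical process metric: cycle time of a task with a rework loop
    cycle_time = lambda x: x[0] * (1.0 + 2.0 * x[1])  # service time, rework prob.
    bounds = [(4.0, 6.0), (0.05, 0.20)]               # interval-valued inputs
    lo = ga_extremum(cycle_time, bounds, maximize=False)
    hi = ga_extremum(cycle_time, bounds, maximize=True)
    print(f"interval-valued cycle time: [{lo:.2f}, {hi:.2f}] minutes")
    ```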

  19. Adding Value in Construction Design Management by Using Simulation Approach

    OpenAIRE

    Doloi, Hemanta

    2008-01-01

    Simulation modelling has been introduced as a decision support tool for front end planning and design analysis of projects. An integrated approach has been discussed linking project scope, end product or project facility performance and the strategic project objectives at the early stage of projects. The case study example on tram network demonstrates that application of simulation helps assessing performance of project operation and making appropriate investment decisions over life cycle of ...

  1. The Mixed Instrumental Controller: Using Value of Information to Combine Habitual Choice and Mental Simulation

    Science.gov (United States)

    Pezzulo, Giovanni; Rigoli, Francesco; Chersi, Fabian

    2013-01-01

    Instrumental behavior depends on both goal-directed and habitual mechanisms of choice. Normative views cast these mechanisms in terms of model-free and model-based methods of reinforcement learning, respectively. An influential proposal hypothesizes that model-free and model-based mechanisms coexist and compete in the brain according to their relative uncertainty. In this paper we propose a novel view in which a single Mixed Instrumental Controller produces both goal-directed and habitual behavior by flexibly balancing and combining model-based and model-free computations. The Mixed Instrumental Controller performs a cost-benefit analysis to decide whether to choose an action immediately based on the available “cached” value of actions (linked to model-free mechanisms) or to improve value estimation by mentally simulating the expected outcome values (linked to model-based mechanisms). Since mental simulation entails cognitive effort and increases the reward delay, it is activated only when the associated “Value of Information” exceeds its costs. The model proposes a method to compute the Value of Information, based on the uncertainty of action values and on the distance of alternative cached action values. Overall, the model by default chooses on the basis of lighter model-free estimates, and integrates them with costly model-based predictions only when useful. Mental simulation uses a sampling method to produce reward expectancies, which are used to update the cached value of one or more actions; in turn, this updated value is used for the choice. The key predictions of the model are tested in different settings of a double T-maze scenario. Results are discussed in relation with neurobiological evidence on the hippocampus – ventral striatum circuit in rodents, which has been linked to goal-directed spatial navigation. PMID:23459512
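
    The following toy sketch paraphrases the controller's logic rather than the paper's exact equations: cached (model-free) action values are used directly unless a crude value-of-information score, raised by uncertain and nearly tied cached values, exceeds the cost of mental simulation, in which case sampled (model-based) estimates decide. The function names, VoI formula and all numbers are illustrative assumptions.

    ```python
    import random, statistics

    def value_of_information(cached, sigma):
        """Crude VoI: grows with the uncertainty of the action values (sigma)
        and shrinks with the distance between the cached values."""
        return sigma / (abs(cached[0] - cached[1]) + 1e-6)

    def choose(cached, sigma, simulate, voi_cost=0.5, n_rollouts=30):
        """Mixed controller: act habitually on cached (model-free) values unless
        the VoI exceeds the effort/delay cost of (model-based) mental simulation."""
        if value_of_information(cached, sigma) <= voi_cost:
            return max(range(len(cached)), key=cached.__getitem__), "habitual"
        # mental simulation: sample outcomes to refresh the action values
        updated = [statistics.mean(simulate(a) for _ in range(n_rollouts))
                   for a in range(len(cached))]
        return max(range(len(updated)), key=updated.__getitem__), "simulated"

    # Illustrative two-arm T-maze with noisy rollouts of the true rewards
    true_reward = [1.0, 1.2]
    rollout = lambda a: random.gauss(true_reward[a], 0.3)
    print(choose(cached=[1.05, 1.06], sigma=0.4, simulate=rollout))  # close values
    print(choose(cached=[0.20, 1.40], sigma=0.1, simulate=rollout))  # clear winner
    ```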

  2. TREAT Modeling and Simulation Strategy

    Energy Technology Data Exchange (ETDEWEB)

    DeHart, Mark David [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    This report summarizes a four-phase process used to describe the strategy in developing modeling and simulation software for the Transient Reactor Test Facility. The four phases of this research and development task are identified as (1) full core transient calculations with feedback, (2) experiment modeling, (3) full core plus experiment simulation and (4) quality assurance. The document describes the four phases, the relationship between these research phases, and anticipated needs within each phase.

  3. THE VALUE OF SIMULATION IN ACCOUNTING TO INFORM MANAGEMENT DECISIONS

    Directory of Open Access Journals (Sweden)

    Kovalev A. E.

    2016-12-01

    Full Text Available The article discusses the role and position of accounting in planning, forecasting and decision-making. From the point of view of modeling, three levels of models are distinguished: modeling in the mind based on semi-structured data; modeling in the mind based on organized qualitative information; and tool-based predictive modeling. Accounting information is described as a descriptive model of the economic processes of the organization, and a list of recommendations of practical value for developing the accounting model of economic processes is included.

  4. FASTBUS simulation models in VHDL

    International Nuclear Information System (INIS)

    Appelquist, G.

    1992-11-01

    Four hardware simulation models implementing the FASTBUS protocol are described. The models are written in the VHDL hardware description language to obtain portability, i.e. without relations to any specific simulator. They include two complete FASTBUS devices, a full-duplex segment interconnect and ancillary logic for the segment. In addition, master and slave models using a high level interface to describe FASTBUS operations, are presented. With these models different configurations of FASTBUS systems can be evaluated and the FASTBUS transactions of new devices can be verified. (au)

  5. Predictability of extreme values in geophysical models

    Directory of Open Access Journals (Sweden)

    A. E. Sterk

    2012-09-01

    Full Text Available Extreme value theory in deterministic systems is concerned with unlikely large (or small) values of an observable evaluated along evolutions of the system. In this paper we study the finite-time predictability of extreme values, such as convection, energy, and wind speeds, in three geophysical models. We study whether finite-time Lyapunov exponents are larger or smaller for initial conditions leading to extremes. General statements on whether extreme values are more or less predictable are not possible: the predictability of extreme values depends on the observable, the attractor of the system, and the prediction lead time.
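
    As a rough illustration of a finite-time Lyapunov exponent (using the Lorenz-63 system, not one of the paper's three geophysical models), the sketch below estimates the FTLE from the growth of a tiny initial separation over a fixed horizon:

    ```python
    import numpy as np

    def lorenz(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
        x, y, z = s
        return np.array([sigma * (y - x), x * (rho - y) - z, x * y - beta * z])

    def rk4_run(s, dt, n):
        for _ in range(n):                      # classic 4th-order Runge-Kutta
            k1 = lorenz(s); k2 = lorenz(s + 0.5 * dt * k1)
            k3 = lorenz(s + 0.5 * dt * k2); k4 = lorenz(s + dt * k3)
            s = s + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0
        return s

    def ftle(s0, T=2.0, dt=0.01, d0=1e-8):
        """Finite-time Lyapunov exponent from the growth of a tiny separation."""
        n = int(T / dt)
        a = rk4_run(np.array(s0, float), dt, n)
        b = rk4_run(np.array(s0, float) + np.array([d0, 0.0, 0.0]), dt, n)
        return float(np.log(np.linalg.norm(b - a) / d0) / T)

    # Predictability differs between initial conditions on the attractor
    for s0 in ([1.0, 1.0, 20.0], [-5.0, -5.0, 25.0]):
        print(s0, "FTLE ~", round(ftle(s0), 3))
    ```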

  6. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    Most systems involve parameters and variables, which are random variables due to uncertainties. Probabilistic methods are powerful in modelling such systems. In this second part, we describe probabilistic models and Monte Carlo simulation along with 'classical' matrix methods and differential equations as most real ...

  7. Sensitivity Analysis of Simulation Models

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2009-01-01

    This contribution presents an overview of sensitivity analysis of simulation models, including the estimation of gradients. It covers classic designs and their corresponding (meta)models; namely, resolution-III designs including fractional-factorial two-level designs for first-order polynomial

  8. Modelling and Simulation: An Overview

    NARCIS (Netherlands)

    M.J. McAleer (Michael); F. Chan (Felix); L. Oxley (Les)

    2013-01-01

    The papers in this special issue of Mathematics and Computers in Simulation cover the following topics: improving judgmental adjustment of model-based forecasts, whether forecast updates are progressive, on a constrained mixture vector autoregressive model, whether all estimators are

  9. Calculation for simulation of archery goal value using a web camera and ultrasonic sensor

    Science.gov (United States)

    Rusjdi, Darma; Abdurrasyid; Wulandari, Dewi Arianti

    2017-08-01

    Development of a digital indoor archery simulator device based on embedded systems is a solution to the limited availability of suitable fields or open space, especially in big cities. Development of the device requires simulations that calculate the value of hitting the target, based on a parabolic-motion approach defined by the variables of initial velocity and the direction of motion of the arrow as it reaches the target. The simulator device should be complemented with a device measuring initial velocity using ultrasonic sensors and a device measuring the direction to the target using a digital camera. The methodology uses research and development of application software with a modeling and simulation approach. The research objective is to create a simulation application that calculates the value of arrows hitting the target, as a preliminary stage for the development of the archery simulator device. Implementing the calculation of target values in an application program generates an archery simulation game that can be used as a reference for developing a digital indoor archery simulator with embedded systems using ultrasonic sensors and web cameras. The application developed with the simulation calculation compares the outer radii of the circles detected by a camera from a distance of three meters.
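
    A minimal sketch of the scoring idea under drag-free parabolic motion, with all launch parameters, target geometry and ring spacing assumed for illustration:

    ```python
    import math

    def hit_offset(v0, elev_deg, azim_deg, distance, launch_h=1.5, target_h=1.3):
        """Vertical and lateral offset (m) of a drag-free parabolic arrow
        relative to the target centre at the given distance."""
        g = 9.81
        elev, azim = math.radians(elev_deg), math.radians(azim_deg)
        t = distance / (v0 * math.cos(elev))    # time to reach the target plane
        y = launch_h + v0 * math.sin(elev) * t - 0.5 * g * t * t
        return y - target_h, distance * math.tan(azim)

    def score(dy, dx, ring_step=0.04):
        """Ten concentric rings, 4 cm apart: 10 at the centre, 0 off target."""
        return max(0, 10 - int(math.hypot(dy, dx) / ring_step))

    dy, dx = hit_offset(v0=50.0, elev_deg=1.2, azim_deg=0.3, distance=18.0)
    print(f"offset {dy:+.3f} m vertical, {dx:+.3f} m lateral -> score {score(dy, dx)}")
    ```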

  10. Value-Oriented Coordination Process Modeling

    NARCIS (Netherlands)

    Fatemi, Hassan; van Sinderen, Marten J.; Wieringa, Roelf J.; Hull, Richard; Mendling, Jan; Tai, Stefan

    Business webs are collections of enterprises designed to jointly satisfy a consumer need. Designing business webs calls for modeling the collaboration of enterprises from different perspectives, in particular the business value and coordination process perspectives, and for mutually aligning these

  11. Comparison of perceived value structural models

    Directory of Open Access Journals (Sweden)

    Sunčana Piri Rajh

    2012-07-01

    Full Text Available Perceived value has been considered an important determinant of consumer shopping behavior and studied as such for a long period of time. According to one research stream, perceived value is a variable determined by perceived quality and perceived sacrifice. Another research stream suggests that the perception of value is a result of the consumer risk perception. This implies the presence of two somewhat independent research streams that are integrated by a third research stream – the one suggesting that perceived value is a result of perceived quality and perceived sacrifices while perceived (performance and financial risk mediates the relationship between perceived quality and perceived sacrifices on the one hand, and perceived value on the other. This paper describes the three approaches (models that have been mentioned. The aim of the paper is to determine which of the observed models show the most acceptable level of fit to the empirical data. Using the survey method, research involving three product categories has been conducted on a sample of Croatian consumers. Collected data was analyzed by the structural equation modeling (SEM method. Research has shown an appropriate level of fit of each observed model to the empirical data. However, the model measuring the effect of perceived risk on perceived value indicates the best level of fit, which implies that perceived performance risk and perceived financial risk are the best predictors of perceived value.

  12. Vehicle dynamics modeling and simulation

    CERN Document Server

    Schramm, Dieter; Bardini, Roberto

    2014-01-01

    The authors examine in detail the fundamentals and mathematical descriptions of the dynamics of automobiles. In this context different levels of complexity will be presented, starting with basic single-track models up to complex three-dimensional multi-body models. A particular focus is on the process of establishing mathematical models on the basis of real cars and the validation of simulation results. The methods presented are explained in detail by means of selected application scenarios.

  13. Modeling and Simulation: An Overview

    OpenAIRE

    Michael McAleer; Felix Chan; Les Oxley

    2013-01-01

    The papers in this special issue of Mathematics and Computers in Simulation cover the following topics: improving judgmental adjustment of model-based forecasts, whether forecast updates are progressive, on a constrained mixture vector autoregressive model, whether all estimators are born equal: the empirical properties of some estimators of long memory, characterising trader manipulation in a limit-order driven market, measuring bias in a term-structure model of commodity prices through the c...

  14. Stochastic models: theory and simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Field, Richard V., Jr.

    2008-03-01

    Many problems in applied science and engineering involve physical phenomena that behave randomly in time and/or space. Examples are diverse and include turbulent flow over an aircraft wing, Earth climatology, material microstructure, and the financial markets. Mathematical models for these random phenomena are referred to as stochastic processes and/or random fields, and Monte Carlo simulation is the only general-purpose tool for solving problems of this type. The use of Monte Carlo simulation requires methods and algorithms to generate samples of the appropriate stochastic model; these samples then become inputs and/or boundary conditions to established deterministic simulation codes. While numerous algorithms and tools currently exist to generate samples of simple random variables and vectors, no cohesive simulation tool yet exists for generating samples of stochastic processes and/or random fields. There are two objectives of this report. First, we provide some theoretical background on stochastic processes and random fields that can be used to model phenomena that are random in space and/or time. Second, we provide simple algorithms that can be used to generate independent samples of general stochastic models. The theory and simulation of random variables and vectors is also reviewed for completeness.
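
    As a small example of generating samples of a stochastic model (one of many possible algorithms, not necessarily the report's), the sketch below draws realizations of a one-dimensional Gaussian process with exponential covariance via Cholesky factorization:

    ```python
    import numpy as np

    def sample_gaussian_process(x, corr_len=0.2, n_samples=3, seed=0):
        """Draw realizations of a zero-mean stationary Gaussian process with an
        exponential covariance, via Cholesky factorization of the covariance."""
        rng = np.random.default_rng(seed)
        C = np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
        L = np.linalg.cholesky(C + 1e-10 * np.eye(len(x)))  # jitter for stability
        return L @ rng.standard_normal((len(x), n_samples))

    x = np.linspace(0.0, 1.0, 200)
    samples = sample_gaussian_process(x)                    # shape (200, 3)
    print(samples.shape, np.round(samples.std(axis=0), 2))  # ~unit std each
    ```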

  15. p-values for model evaluation

    International Nuclear Information System (INIS)

    Beaujean, F.; Caldwell, A.; Kollar, D.; Kroeninger, K.

    2011-01-01

    Deciding whether a model provides a good description of data is often based on a goodness-of-fit criterion summarized by a p-value. Although there is considerable confusion concerning the meaning of p-values, leading to their misuse, they are nevertheless of practical importance in common data analysis tasks. We motivate their application using Bayesian argumentation. We then describe commonly and less commonly known discrepancy variables and how they are used to define p-values. The distributions of these are then extracted for examples modeled on typical data analysis tasks, and comments on their usefulness for determining goodness-of-fit are given.

  16. Value for money in particle-mesh plasma simulations

    International Nuclear Information System (INIS)

    Eastwood, J.W.

    1976-01-01

    The established particle-mesh method of simulating a collisionless plasma is discussed. Problems are outlined, and it is stated that given constraints on mesh size and particle number, the only way to adjust the compromise between dispersive forces, collision time and heating time is by altering the force calculation cycle. In 'value for money' schemes, the matching of parts of the force calculation cycle is optimized. Interparticle forces are considered. Optimized combinations of elements of the force calculation cycle are compared. Following sections cover the dispersion relation and comparisons with other schemes. (U.K.)

  17. p-values for model evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Beaujean, Frederik; Caldwell, Allen [Max-Planck-Institut fuer Physik, Muenchen (Germany); Kollar, Daniel [CERN, Genf (Switzerland); Kroeninger, Kevin [Georg-August-Universitaet, Goettingen (Germany)

    2011-07-01

    In the analysis of experimental results it is often necessary to pass a judgment on the validity of a model as a representation of the data. A quantitative procedure to decide whether a model provides a good description of data is often based on a specific test statistic and a p-value summarizing both the data and the statistic's sampling distribution. Although there is considerable confusion concerning the meaning of p-values, leading to their misuse, they are nevertheless of practical importance in common data analysis tasks. We motivate the application of p-values using Bayesian argumentation. We then describe commonly and less commonly known test statistics and how they are used to define p-values. The distributions of these are then extracted for examples modeled on typical new physics searches in high energy physics. We comment on their usefulness for determining goodness-of-fit and highlight some common pitfalls.
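
    A hedged sketch of how a p-value can be extracted for a discrepancy variable without appealing to asymptotic distributions: simulate the test statistic under the model and report the tail fraction at or beyond the observed value. The counts below are invented for illustration.

    ```python
    import numpy as np

    def mc_pvalue(observed, expected, n_mc=100_000, seed=1):
        """Monte Carlo p-value for a chi-square discrepancy between observed
        counts and a multinomial model, avoiding asymptotic approximations."""
        rng = np.random.default_rng(seed)
        n, p = observed.sum(), expected / expected.sum()
        t_obs = (((observed - n * p) ** 2) / (n * p)).sum()
        sims = rng.multinomial(n, p, size=n_mc)
        t_sim = (((sims - n * p) ** 2) / (n * p)).sum(axis=1)
        return float((t_sim >= t_obs).mean())

    observed = np.array([32, 18, 27, 23])   # invented bin counts
    expected = np.array([25, 25, 25, 25])   # model prediction
    print("p-value ~", mc_pvalue(observed, expected))
    ```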

  18. Modeling and simulation of photovoltaic solar panel

    International Nuclear Information System (INIS)

    Belarbi, M.; Haddouche, K.; Midoun, A.

    2006-01-01

    In this article, we present a new approach for estimating the model parameters of a photovoltaic solar panel according to the irradiance and temperature. The parameters of the one-diode model are obtained from the knowledge of three operating points: short circuit, open circuit, and maximum power. In the first step, the adopted approach solves the system of equations formed by the three operating points in order to express all the model parameters as functions of the series resistance. Secondly, we perform an iterative resolution at the optimal operating point using the Newton-Raphson method to calculate the series resistance value as well as the model parameters. Once the panel model is identified, we consider other equations to take into account the irradiance and temperature effect. The simulation results show the convergence speed of the model parameters and the possibility of visualizing the electrical behaviour of the panel according to the irradiance and temperature. Let us note that a sensitivity of the algorithm at the optimal operating point was observed, owing to the fact that a small variation of the optimal voltage value leads to a very great variation of the identified parameter values. With the identified model, we can develop maximum power point tracking algorithms and make simulations of a solar water pumping system. (Author)
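
    The paper's full parameter-extraction procedure is not reproduced here, but the sketch below shows the same kind of Newton-Raphson iteration applied to the implicit one-diode equation, solving for the panel current at a given voltage; the shunt resistance is neglected and all parameter values are illustrative:

    ```python
    import math

    def diode_current(v, i_ph, i_0, r_s, n_vt, tol=1e-10, max_iter=100):
        """Newton-Raphson solution of the implicit one-diode equation
        i = i_ph - i_0 * (exp((v + i*r_s) / n_vt) - 1) for the current i."""
        i = i_ph                          # photo-current as the initial guess
        for _ in range(max_iter):
            e = math.exp((v + i * r_s) / n_vt)
            f = i_ph - i_0 * (e - 1.0) - i           # residual f(i)
            df = -i_0 * e * (r_s / n_vt) - 1.0       # derivative df/di
            step = f / df
            i -= step
            if abs(step) < tol:
                break
        return i

    # Illustrative parameters for a 36-cell panel (n_vt = n * Vt * cells)
    i_ph, i_0, r_s, n_vt = 5.0, 8e-8, 0.02, 1.3 * 0.0257 * 36
    for v in (0.0, 10.0, 18.0, 21.0):
        print(f"V = {v:5.1f} V -> I = {diode_current(v, i_ph, i_0, r_s, n_vt):6.3f} A")
    ```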

  19. The Random Walk Drainage Simulation Model as a Teaching Exercise

    Science.gov (United States)

    High, Colin; Richards, Paul

    1972-01-01

    Practical instructions about using the random walk drainage network simulation model as a teaching exercise are given and the results discussed. A source of directional bias in the resulting simulated drainage patterns is identified and given an interpretation in terms of the model. Three points of educational value concerning the model are…
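
    For readers who want to reproduce the exercise, here is a hedged sketch in the spirit of Leopold-Langbein random-walk drainage generation (the grid size, step choices and the edge-clamping rule, itself one simple source of directional bias, are assumptions, not the article's exact procedure):

    ```python
    import random

    def random_walk_drainage(rows=8, cols=15, seed=11):
        """One walker starts in each column of the top row and takes a random
        lateral step (-1, 0 or +1) per row downslope; walkers whose paths meet
        are treated as merged channels."""
        random.seed(seed)
        paths = []
        for start in range(cols):
            c, path = start, [(0, start)]
            for r in range(1, rows):
                # clamping at the lateral boundaries biases flow back inward
                c = max(0, min(cols - 1, c + random.choice([-1, 0, 1])))
                path.append((r, c))
            paths.append(path)
        return paths

    # Channels surviving at the outlet row approximate the merged network
    outlets = {path[-1][1] for path in random_walk_drainage()}
    print("distinct outlet channels:", len(outlets))
    ```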

  20. The value of multivariate model sophistication

    DEFF Research Database (Denmark)

    Rombouts, Jeroen; Stentoft, Lars; Violante, Francesco

    2014-01-01

    We assess the predictive accuracies of a large number of multivariate volatility models in terms of pricing options on the Dow Jones Industrial Average. We measure the value of model sophistication in terms of dollar losses by considering a set of 444 multivariate models that differ in their specification of the conditional variance, conditional correlation, innovation distribution, and estimation approach. All of the models belong to the dynamic conditional correlation class, which is particularly suitable because it allows consistent estimations of the risk neutral dynamics with a manageable... In addition to investigating the value of model sophistication in terms of dollar losses directly, we also use the model confidence set approach to statistically infer the set of models that delivers the best pricing performances.

  1. Model for Simulation Atmospheric Turbulence

    DEFF Research Database (Denmark)

    Lundtang Petersen, Erik

    1976-01-01

    A method that produces realistic simulations of atmospheric turbulence is developed and analyzed. The procedure makes use of a generalized spectral analysis, often called a proper orthogonal decomposition or the Karhunen-Loève expansion. A set of criteria, emphasizing a realistic appearance...... eigenfunctions and estimates of the distributions of the corresponding expansion coefficients. The simulation method utilizes the eigenfunction expansion procedure to produce preliminary time histories of the three velocity components simultaneously. As a final step, a spectral shaping procedure is then applied....... The method is unique in modeling the three velocity components simultaneously, and it is found that important cross-statistical features are reasonably well-behaved. It is concluded that the model provides a practical, operational simulator of atmospheric turbulence....

  2. Anybody can do Value at Risk: A Teaching Study using Parametric Computation and Monte Carlo Simulation

    Directory of Open Access Journals (Sweden)

    Yun Hsing Cheung

    2012-12-01

    Full Text Available The three main Value at Risk (VaR) methodologies are historical, parametric and Monte Carlo Simulation. Cheung & Powell (2012), using a step-by-step teaching study, showed how a nonparametric historical VaR model could be constructed using Excel, thus benefitting teachers and researchers by providing them with a readily useable teaching study and an inexpensive and flexible VaR modelling option. This article extends that work by demonstrating how parametric and Monte Carlo Simulation VaR models can also be constructed in Excel, thus providing a total Excel modelling package encompassing all three VaR methods.
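
    The article builds its models in Excel; the sketch below restates the same two calculations in Python for readers without the spreadsheet: a parametric (variance-covariance) VaR under a normality assumption and a Monte Carlo VaR from the empirical quantile of simulated returns. The return series and portfolio value are simulated placeholders.

    ```python
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(42)
    returns = rng.normal(0.0005, 0.012, 1000)   # placeholder daily return series
    alpha, value = 0.95, 1_000_000.0            # confidence level, portfolio value

    # Parametric (variance-covariance) VaR: assumes normally distributed returns
    mu, sigma = returns.mean(), returns.std(ddof=1)
    var_param = -(mu + norm.ppf(1 - alpha) * sigma) * value

    # Monte Carlo VaR: simulate returns from the fitted distribution and read
    # off the empirical quantile of the simulated loss distribution
    sims = rng.normal(mu, sigma, 100_000)
    var_mc = -np.quantile(sims, 1 - alpha) * value

    print(f"95% 1-day VaR  parametric: {var_param:,.0f}  Monte Carlo: {var_mc:,.0f}")
    ```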

  3. MODELLING, SIMULATING AND OPTIMIZING BOILERS

    DEFF Research Database (Denmark)

    Sørensen, K.; Condra, T.; Houbak, Niels

    2003-01-01

    This paper describes the modelling, simulating and optimizing, including experimental verification, carried out as part of a Ph.D. project written and supervised, respectively, by the authors. The work covers the dynamic performance of both water-tube boilers and fire-tube boilers. A detailed dynamic model of the boiler has been developed and simulations carried out by means of the Matlab integration routines. The model is prepared as a dynamic model consisting of both ordinary differential equations and algebraic equations, together formulated as a Differential-Algebraic-Equation system. Being able to operate a boiler plant dynamically means that the boiler design must be able to absorb any fluctuations in water level and temperature gradients resulting from the pressure change in the boiler. On the one hand a large water-/steam space may be required, i.e. to build the boiler as big as possible. Due...
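
    The paper's boiler model is a full differential-algebraic system solved with Matlab routines; as a much-reduced illustration of the same dynamic-balance idea, the sketch below integrates a lumped two-state drum model (all coefficients hypothetical) with SciPy:

    ```python
    from scipy.integrate import solve_ivp

    def boiler(t, y, m_fw=2.0, m_steam=2.2, Q=5.0e3, h_fg=2.0e3, C_p=50.0):
        """Lumped two-state drum sketch: water mass [kg] and pressure [bar]."""
        M, p = y
        evap = Q / h_fg                  # evaporation rate [kg/s] from heat input
        dM = m_fw - evap                 # mass balance on the water side
        dp = (evap - m_steam) / C_p      # pressure rises if evaporation > demand
        return [dM, dp]

    sol = solve_ivp(boiler, (0.0, 600.0), y0=[1500.0, 40.0], max_step=1.0)
    print(f"after {sol.t[-1]:.0f} s: mass {sol.y[0, -1]:.1f} kg, "
          f"pressure {sol.y[1, -1]:.2f} bar")
    ```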

  4. MODELLING, SIMULATING AND OPTIMIZING BOILERS

    DEFF Research Database (Denmark)

    Sørensen, K.; Condra, T.; Houbak, Niels

    2003-01-01

    This paper describes the modelling, simulating and optimizing, including experimental verification, carried out as part of a Ph.D. project written and supervised, respectively, by the authors. The work covers the dynamic performance of both water-tube boilers and fire-tube boilers. A detailed dynamic model of the boiler has been developed and simulations carried out by means of the Matlab integration routines. The model is prepared as a dynamic model consisting of both ordinary differential equations and algebraic equations, together formulated as a Differential-Algebraic-Equation system... Due to the internal pressure, the consequence of an increased volume (i.e. water-/steam space) is an increased wall thickness in the pressure part of the boiler. The stresses introduced in the boiler pressure part as a result of the temperature gradients are proportional to the square of the wall thickness... Being able...

  5. Modeling control in manufacturing simulation

    NARCIS (Netherlands)

    Zee, Durk-Jouke van der; Chick, S.; Sánchez, P.J.; Ferrin, D.; Morrice, D.J.

    2003-01-01

    A significant shortcoming of traditional simulation languages is the lack of attention paid to the modeling of control structures, i.e., the humans or systems responsible for manufacturing planning and control, their activities and the mutual tuning of their activities. Mostly they are hard coded

  6. From business value model to coordination process model

    NARCIS (Netherlands)

    Fatemi, Hassan; Wieringa, Roelf J.; Poler, R.; van Sinderen, Marten J.; Sanchis, R.

    2009-01-01

    The increased complexity of business webs calls for modeling the collaboration of enterprises from different perspectives, in particular the business and process perspectives, and for mutually aligning these perspectives. Business value modeling and coordination process modeling both are necessary

  7. The Values of College Students in Business Simulation Game: A Means-End Chain Approach

    Science.gov (United States)

    Lin, Yu-Ling; Tu, Yu-Zu

    2012-01-01

    Business simulation games (BSGs) enable students to practice making decisions in a virtual environment, accumulate experience in application of strategies, and train themselves in modes of decision-making. This study examines the value sought by players of BSG. In this study, a means-end chain (MEC) model was adopted as the basis, and ladder…

  8. A Modeling & Simulation Implementation Framework for Large-Scale Simulation

    Directory of Open Access Journals (Sweden)

    Song Xiao

    2012-10-01

    Full Text Available Classical High Level Architecture (HLA) systems face development problems owing to a lack of support for fine-grained component integration and interoperation in large-scale complex simulation applications. To provide efficient methods for this issue, an extensible, reusable and composable simulation framework is proposed. To promote reusability from coarse-grained federates to fine-grained components, this paper proposes a modelling & simulation framework which consists of a component-based architecture, modelling methods, and simulation services to support and simplify the construction of complex simulation applications. Moreover, a standard process and simulation tools are developed to ensure the rapid and effective development of simulation applications.

  9. Validation process of simulation model

    International Nuclear Information System (INIS)

    San Isidro, M. J.

    1998-01-01

    A methodology for the empirical validation of any detailed simulation model is presented. This kind of validation is always related to an experimental case. Empirical validation has a residual sense, because the conclusions are based on comparisons between simulated outputs and experimental measurements. The methodology will guide us in detecting the failures of the simulation model; furthermore, it can be used as a guide in the design of subsequent experiments. Three steps can be clearly differentiated: sensitivity analysis, which can be performed with DSA (differential sensitivity analysis) and with MCSA (Monte-Carlo sensitivity analysis); finding the optimal domains of the input parameters, for which a procedure based on Monte-Carlo methods and cluster techniques has been developed; and residual analysis, carried out in the time domain and in the frequency domain using correlation analysis and spectral analysis. As an application of this methodology, the validation carried out on a thermal simulation model of buildings in Spain is presented, studying the behavior of building components in a Test Cell of LECE of CIEMAT. (Author) 17 refs
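
    A minimal sketch of the MCSA step (the building-thermal toy model and its input domains below are invented, not the CIEMAT test-cell model): sample the inputs uniformly over their domains and rank them by their correlation with the simulated output.

    ```python
    import numpy as np

    def mcsa(model, bounds, n=5000, seed=3):
        """Monte Carlo sensitivity analysis: sample inputs uniformly over their
        domains and correlate each input with the simulated output."""
        rng = np.random.default_rng(seed)
        X = np.column_stack([rng.uniform(lo, hi, n) for lo, hi in bounds])
        y = np.array([model(x) for x in X])
        return [float(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])]

    # Hypothetical building-thermal toy: indoor temperature vs U-value,
    # solar gain and air changes per hour
    indoor_temp = lambda x: 20.0 + 0.5 * x[1] - 3.0 * x[0] - 0.8 * x[2]
    bounds = [(0.2, 1.2), (0.0, 8.0), (0.1, 2.0)]
    print([round(r, 2) for r in mcsa(indoor_temp, bounds)])
    ```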

  10. Modeling and Simulation for Safeguards

    Energy Technology Data Exchange (ETDEWEB)

    Swinhoe, Martyn T. [Los Alamos National Laboratory

    2012-07-26

    The purpose of this talk is to give an overview of the role of modeling and simulation in Safeguards R&D and introduce you to (some of) the tools used. Some definitions are: (1) Modeling - the representation, often mathematical, of a process, concept, or operation of a system, often implemented by a computer program; (2) Simulation - the representation of the behavior or characteristics of one system through the use of another system, especially a computer program designed for the purpose; and (3) Safeguards - the timely detection of diversion of significant quantities of nuclear material. The roles of modeling and simulation are: (1) calculating amounts of material (plant modeling); (2) calculating signatures of nuclear material etc. (source terms); and (3) assessing detector performance (radiation transport and detection). Plant modeling software (e.g. FACSIM) gives the flows and the amount of material stored in all parts of the process. In safeguards this allows us to calculate the expected uncertainty of the mass and evaluate the expected MUF. We can determine the measurement accuracy required to achieve a certain performance.

  11. Model continuity in discrete event simulation: A framework for model-driven development of simulation models

    NARCIS (Netherlands)

    Cetinkaya, D; Verbraeck, A.; Seck, MD

    2015-01-01

    Most of the well-known modeling and simulation (M&S) methodologies state the importance of conceptual modeling in simulation studies, and they suggest the use of conceptual models during the simulation model development process. However, only a limited number of methodologies refers to how to

  12. Variants of Modeling Dwelling Market Value

    Directory of Open Access Journals (Sweden)

    Barańska Anna

    2014-10-01

    Full Text Available The object of this paper is to determine real estate market value on the basis of a multidimensional function model in different variants: A - directly from the model estimated on the basis of a big database, B - from the same model form, but estimated on the basis of a reduced database consisting of dwellings most similar to the estimated one, and C - based on modeled prices corrected by random correction, calculated from random deviations for dwellings most similar to the assessed one. In the framework of statistical inference procedures, the resulting comparison was carried out by parametric significance tests. They were applied to draw conclusions on the analyzed variants

  13. Assessment of Molecular Modeling & Simulation

    Energy Technology Data Exchange (ETDEWEB)

    None

    2002-01-03

    This report reviews the development and applications of molecular and materials modeling in Europe and Japan in comparison to those in the United States. Topics covered include computational quantum chemistry, molecular simulations by molecular dynamics and Monte Carlo methods, mesoscale modeling of material domains, molecular-structure/macroscale property correlations like QSARs and QSPRs, and related information technologies like informatics and special-purpose molecular-modeling computers. The panel's findings include the following: The United States leads this field in many scientific areas. However, Canada has particular strengths in DFT methods and homogeneous catalysis; Europe in heterogeneous catalysis, mesoscale, and materials modeling; and Japan in materials modeling and special-purpose computing. Major government-industry initiatives are underway in Europe and Japan, notably in multi-scale materials modeling and in development of chemistry-capable ab-initio molecular dynamics codes.

  14. New Trends, News Values, and New Models.

    Science.gov (United States)

    Higgins, Mary Anne

    1996-01-01

    Explores implications of the prediction that in the next millennium the public will experience a scarcity of knowledge and a surplus of information. Reviews research suggesting that journalists focus on these news values: emphasizing how/why, devaluing immediacy, specializing/analyzing, representing a constituency. Examines two new models of…

  15. Simulating spin models on GPU

    Science.gov (United States)

    Weigel, Martin

    2011-09-01

    Over the last couple of years it has been realized that the vast computational power of graphics processing units (GPUs) could be harvested for purposes other than the video game industry. This power, which at least nominally exceeds that of current CPUs by large factors, results from the relative simplicity of the GPU architectures as compared to CPUs, combined with a large number of parallel processing units on a single chip. To benefit from this setup for general computing purposes, the problems at hand need to be prepared in a way to profit from the inherent parallelism and hierarchical structure of memory accesses. In this contribution I discuss the performance potential for simulating spin models, such as the Ising model, on GPU as compared to conventional simulations on CPU.
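
    For reference, a plain CPU Metropolis sweep for the 2D Ising model is sketched below; on a GPU the update would typically be parallelized over a checkerboard decomposition so that non-interacting sites flip concurrently. Lattice size, temperature and sweep count are illustrative.

    ```python
    import numpy as np

    def metropolis_ising(L=32, T=2.27, sweeps=200, seed=7):
        """Single-spin-flip Metropolis simulation of the 2D Ising model; a GPU
        version would parallelise each sweep over a checkerboard of sites."""
        rng = np.random.default_rng(seed)
        s = rng.choice(np.array([-1, 1]), size=(L, L))
        for _ in range(sweeps):
            for _ in range(L * L):
                i, j = rng.integers(0, L, size=2)
                nb = s[(i + 1) % L, j] + s[(i - 1) % L, j] \
                   + s[i, (j + 1) % L] + s[i, (j - 1) % L]
                dE = 2.0 * s[i, j] * nb             # cost of flipping spin (i, j)
                if dE <= 0 or rng.random() < np.exp(-dE / T):
                    s[i, j] = -s[i, j]
        return abs(s.mean())                        # magnetisation per spin

    print("|m| ~", round(metropolis_ising(), 3))
    ```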

  16. The "resident's dilemma"? Values and strategies of medical residents for education interactions: a cellular automata simulation.

    Science.gov (United States)

    Heckerling, P S; Gerber, B S; Weiner, S J

    2006-01-01

    Medical residents engage in formal and informal education interactions with fellow residents during the working day, and can choose whether to spend time and effort on such interactions. Time and effort spent on such interactions can bring learning and personal satisfaction to residents, but may also delay completion of clinical work. Using hypothetical cases, we assessed the values and strategies of internal medicine residents at one hospital for both cooperative and non-cooperative education interactions with fellow residents. We then used these data and cellular automata models of two-person games to simulate repeated interactions between residents, and to determine which strategies resulted in greatest accrued value. We conducted sensitivity analyses on several model parameters, to test the robustness of dominant strategies to model assumptions. Twenty-nine of the 57 residents (50.9%) valued cooperation more than non-cooperation no matter what the other resident did during the current interaction. Similarly, thirty-six residents (63.2%) endorsed an unconditional always-cooperate strategy no matter what the other resident had done during their previous interaction. In simulations, an always-cooperate strategy accrued more value (776.42 value units) than an aggregate of strategies containing non-cooperation components (675.0 value units, p = 0.052). Only when the probability of strategy errors reached 50%, or when values were re-ordered to match those of a Prisoner's Dilemma, did non-cooperation-based strategies accrue the most value. Cooperation-based values and strategies were most frequent among our residents, and dominated in simulations of repeated education interactions between them.
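
    A hedged sketch of the kind of repeated-game machinery the study describes (the payoff numbers below are invented; they make cooperation dominant, unlike a Prisoner's Dilemma, mirroring the residents' elicited values):

    ```python
    import random

    # Hypothetical payoffs (value units): key is (my move, other's move).
    # Cooperation dominates (C beats D whatever the other does), unlike a PD.
    PAYOFF = {("C", "C"): 3.0, ("C", "D"): 2.0, ("D", "C"): 2.5, ("D", "D"): 1.5}

    def play(strat_a, strat_b, rounds=100, error=0.0):
        """Repeated two-person game; each strategy sees the other's last move,
        and with probability `error` a chosen move is executed incorrectly."""
        total_a = total_b = 0.0
        last_a = last_b = "C"
        for _ in range(rounds):
            a, b = strat_a(last_b), strat_b(last_a)
            if random.random() < error:
                a = "D" if a == "C" else "C"
            if random.random() < error:
                b = "D" if b == "C" else "C"
            total_a += PAYOFF[(a, b)]
            total_b += PAYOFF[(b, a)]
            last_a, last_b = a, b
        return total_a, total_b

    always_cooperate = lambda last: "C"
    tit_for_tat = lambda last: last             # copy the other's previous move
    print(play(always_cooperate, tit_for_tat))              # error-free pairing
    print(play(always_cooperate, tit_for_tat, error=0.5))   # 50% strategy errors
    ```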

  17. Mathematical models and numerical simulation in electromagnetism

    CERN Document Server

    Bermúdez, Alfredo; Salgado, Pilar

    2014-01-01

    The book represents a basic support for a master course in electromagnetism oriented to numerical simulation. The main goal of the book is that the reader knows the boundary-value problems of partial differential equations that should be solved in order to perform computer simulation of electromagnetic processes. Moreover it includes a part devoted to electric circuit theory  based on ordinary differential equations. The book is mainly oriented to electric engineering applications, going from the general to the specific, namely, from the full Maxwell’s equations to the particular cases of electrostatics, direct current, magnetostatics and eddy currents models. Apart from standard exercises related to analytical calculus, the book includes some others oriented to real-life applications solved with MaxFEM free simulation software.

  18. An Alignment Model for Collaborative Value Networks

    Science.gov (United States)

    Bremer, Carlos; Azevedo, Rodrigo Cambiaghi; Klen, Alexandra Pereira

    This paper presents parts of the work carried out in several global organizations through the development of strategic projects with high tactical and operational complexity. These projects invested in long-term relationships, operated strongly on transforming the competitive model, and focused on value chain management; their main aim was the alignment of multiple value chains. The projects were led by the Axia Transformation Methodology as well as by its Management Model, following the principles of Project Management. As a concrete result of the efforts made in recent years in the Brazilian market, this work also introduces the Alignment Model, which supports the transformation process that the companies undergo.

  19. Creating Simulated Microgravity Patient Models

    Science.gov (United States)

    Hurst, Victor; Doerr, Harold K.; Bacal, Kira

    2004-01-01

    The Medical Operational Support Team (MOST) has been tasked by the Space and Life Sciences Directorate (SLSD) at the NASA Johnson Space Center (JSC) to integrate medical simulation into 1) medical training for ground and flight crews and into 2) evaluations of medical procedures and equipment for the International Space Station (ISS). To do this, the MOST requires patient models that represent the physiological changes observed during spaceflight. Despite the presence of physiological data collected during spaceflight, there is no defined set of parameters that illustrate or mimic a 'space normal' patient. Methods: The MOST culled space-relevant medical literature and data from clinical studies performed in microgravity environments. The areas of focus for data collection were in the fields of cardiovascular, respiratory and renal physiology. Results: The MOST developed evidence-based patient models that mimic the physiology believed to be induced by human exposure to a microgravity environment. These models have been integrated into space-relevant scenarios using a human patient simulator and ISS medical resources. Discussion: Despite the lack of a set of physiological parameters representing 'space normal,' the MOST developed space-relevant patient models that mimic microgravity-induced changes in terrestrial physiology. These models are used in clinical scenarios that will medically train flight surgeons, biomedical flight controllers (biomedical engineers; BME) and, eventually, astronaut-crew medical officers (CMO).

  20. The value of simulations and games for tertiary education

    NARCIS (Netherlands)

    Overmans, J.F.A.; Bakker, W.E.; van Zeeland, Y.R.A.; van der Ree, G.; Jeuring, J.T.; van Mil, M.H.W.; Glas, M.A.J.; van de Grint, E.J.M.; Bastings, M.A.S.; de Smale, S.; Dictus, W.J.A.G.

    Simulations and games play an important role in how young people learn. Through simulations and games you can practice skills that are relevant for professional practice. Through simulations and games you can learn to deal with complexity and diversity. Simulations and games already play a role in

  1. General introduction to simulation models

    DEFF Research Database (Denmark)

    Hisham Beshara Halasa, Tariq; Boklund, Anette

    2012-01-01

    Monte Carlo simulation can be defined as a representation of real-life systems to gain insight into their functions and to investigate the effects of alternative conditions or actions on the modeled system. Models are a simplification of a system. Most often, it is best to use experiments and fie... as support for decision making. However, several other factors affect decision making, such as ethics, politics and economics. Furthermore, the insight gained when models are built helps point out areas where knowledge is lacking.... of FMD spread that can provide useful and trustworthy advice, there are four important issues which the model should represent: 1) the herd structure of the country in question, 2) the dynamics of animal movements and contacts between herds, 3) the biology of the disease, and 4) the regulations...

  2. Advances in Intelligent Modelling and Simulation Simulation Tools and Applications

    CERN Document Server

    Oplatková, Zuzana; Carvalho, Marco; Kisiel-Dorohinicki, Marek

    2012-01-01

The human capacity to abstract complex systems and phenomena into simplified models has played a critical role in the rapid evolution of our modern industrial processes and scientific research. As a science and an art, Modelling and Simulation have been one of the core enablers of this remarkable human trait, and have become a topic of great importance for researchers and practitioners. This book was created to compile some of the most recent concepts, advances, challenges and ideas associated with Intelligent Modelling and Simulation frameworks, tools and applications. The first chapter discusses the important aspects of human interaction and the correct interpretation of results during simulations. The second chapter gets to the heart of the analysis of entrepreneurship by means of agent-based modelling and simulations. The following three chapters bring together the central theme of simulation frameworks, first describing an agent-based simulation framework, then a simulator for electrical machines, and...

  3. Models for setting ATM parameter values

    DEFF Research Database (Denmark)

    Blaabjerg, Søren; Gravey, A.; Romæuf, L.

    1996-01-01

… presents approximate methods and discusses their applicability. We then discuss the problem of obtaining traffic characteristic values for a connection that has crossed a series of switching nodes. This problem is particularly relevant for the traffic contract components corresponding to ICIs … (CDV) tolerance(s). The values taken by these traffic parameters characterize the so-called "Worst Case Traffic" that is used by CAC procedures for accepting a new connection and allocating resources to it. Conformance to the negotiated traffic characteristics is defined, at the ingress User … essential to set traffic characteristic values that are relevant to the considered cell stream, and that ensure that the amount of non-conforming traffic is small. Using a queueing model representation for the GCRA formalism, several methods are available for choosing the traffic characteristics. This paper …

  4. Verifying and Validating Simulation Models

    Energy Technology Data Exchange (ETDEWEB)

    Hemez, Francois M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-02-23

This presentation is a high-level discussion of the Verification and Validation (V&V) of computational models. Definitions of V&V are given to emphasize that “validation” is never performed in a vacuum; it accounts, instead, for the current state of knowledge in the discipline considered. In particular, comparisons between physical measurements and numerical predictions should account for their respective sources of uncertainty. The differences between error (bias), aleatoric uncertainty (randomness) and epistemic uncertainty (ignorance, lack of knowledge) are briefly discussed. Four types of uncertainty in physics and engineering are discussed: 1) experimental variability, 2) variability and randomness, 3) numerical uncertainty and 4) model-form uncertainty. Statistical sampling methods are available to propagate, and analyze, variability and randomness. Numerical uncertainty originates from the truncation error introduced by the discretization of partial differential equations in time and space. Model-form uncertainty is introduced by assumptions often formulated to render a complex problem more tractable and amenable to modeling and simulation. The discussion concludes with high-level guidance to assess the “credibility” of numerical simulations, which stems from the level of rigor with which these various sources of uncertainty are assessed and quantified.

  5. MODELLING, SIMULATING AND OPTIMIZING BOILERS

    DEFF Research Database (Denmark)

    Sørensen, Kim; Condra, Thomas Joseph; Houbak, Niels

    2004-01-01

In the present work a framework for optimizing the design of boilers for dynamic operation has been developed. A cost function to be minimized during the optimization has been formulated, and for the present, design variables related to the Boiler Volume and the Boiler Load Gradient (i.e., firing rate … on the boiler) have been defined. Furthermore, a number of constraints related to minimum and maximum boiler load gradient, minimum boiler size, shrinking and swelling, and steam space load have been defined. For defining the constraints related to the required boiler volume, a dynamic model for simulating the boiler … performance has been developed. Outputs from the simulations are shrinking and swelling of the water level in the drum during, for example, a start-up of the boiler; these figures, combined with the requirements with respect to allowable water-level fluctuations in the drum, define the requirements with respect to drum …

  6. Simulated annealing model of acupuncture

    Science.gov (United States)

    Shang, Charles; Szu, Harold

    2015-05-01

The growth control singularity model suggests that acupuncture points (acupoints) originate from organizers in embryogenesis. Organizers are singular points in growth control. Acupuncture can cause perturbation of a system with effects similar to simulated annealing. In a clinical trial, the goal of a treatment is to relieve a certain disorder, which corresponds to reaching a certain local optimum in simulated annealing. The self-organizing effect of the system is limited and related to the person's general health and age. Perturbation at acupoints can lead to a stronger local excitation (analogous to a higher annealing temperature) compared to perturbation at non-singular points (placebo control points). Such difference diminishes as the number of perturbed points increases, due to the wider distribution of the limited self-organizing activity. This model explains the following facts from systematic reviews of acupuncture trials: (1) properly chosen single-acupoint treatment for a certain disorder can lead to highly repeatable efficacy above placebo; (2) when multiple acupoints are used, the result can be highly repeatable if the patients are relatively healthy and young, but is usually mixed if the patients are old, frail and have multiple disorders at the same time, as the number of local optima or comorbidities increases; (3) as the number of acupoints used increases, the efficacy difference between sham and real acupuncture often diminishes. The model predicts that the efficacy of acupuncture is negatively correlated to the disease chronicity, severity and patient's age. This is the first biological-physical model of acupuncture which can predict and guide clinical acupuncture research.
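The annealing analogy the abstract draws on can be made concrete with a minimal simulated-annealing sketch. The multimodal objective (standing in for a landscape of many local optima, i.e., comorbidities) and the geometric cooling schedule are illustrative assumptions, not part of the cited model.

```python
import math
import random

random.seed(0)

def simulated_annealing(f, x0, t0=2.0, cooling=0.95, steps=2000):
    """Minimize f, accepting worse moves with probability exp(-delta/T)."""
    x, fx = x0, f(x0)
    t = t0
    for _ in range(steps):
        cand = x + random.uniform(-0.5, 0.5)   # local perturbation
        fc = f(cand)
        if fc < fx or random.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc                    # accept the move
        t *= cooling                            # annealing (cooling) schedule
    return x, fx

# A multimodal objective: many local optima around a global trend.
f = lambda x: 0.1 * x * x + math.sin(3 * x)
print(simulated_annealing(f, x0=4.0))
```

A higher starting temperature t0 corresponds, in the paper's analogy, to the stronger local excitation elicited at a true acupoint; a low temperature (placebo point) leaves the search stuck near its current local optimum.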

  7. MODELING AND SIMULATION OF A HYDROCRACKING UNIT

    Directory of Open Access Journals (Sweden)

    HASSAN A. FARAG

    2016-06-01

Hydrocracking is used in the petroleum industry to convert low-quality feedstocks into high-value transportation fuels such as gasoline, diesel, and jet fuel. The aim of the present work is to develop a rigorous steady-state two-dimensional mathematical model, comprising conservation equations of mass and energy, for simulating the operation of a hydrocracking unit. Both the catalyst bed and the quench zone have been included in this integrated model. The model equations were numerically solved in both the axial and radial directions using Matlab software. The model was tested against real plant data from Egypt. The results indicated very good agreement between the model predictions and the industrial values for temperature profiles, concentration profiles, and conversion in both the radial and axial directions of the hydrocracking unit. Simulation of the conversion and temperature profiles in the quench zone was also included and gave a low deviation from the actual values. For the concentration profiles, the percentage deviation was found to be 9.28% in the first reactor and 9.6% in the second reactor. The effects of several parameters, namely the pellet heat transfer coefficient, effective radial thermal conductivity, wall heat transfer coefficient, effective radial diffusivity, and the cooling medium (quench zone), have been included in this study. Variations of the wall heat transfer coefficient and of the effective radial diffusivity for the near-wall region gave no remarkable changes in the temperature profiles. On the other hand, even small variations of the effective radial thermal conductivity affected the simulated temperature profiles significantly, and this effect could not be compensated by variations of the other parameters of the model.

  8. A simulation model for material accounting systems

    International Nuclear Information System (INIS)

    Coulter, C.A.; Thomas, K.E.

    1987-01-01

A general-purpose model that was developed to simulate the operation of a chemical processing facility for nuclear materials has been extended to describe material measurement and accounting procedures as well. The model now provides descriptors for material balance areas, a large class of measurement instrument types and their associated measurement errors for various classes of materials, the measurement instruments themselves with their individual calibration schedules, and material balance closures. Delayed receipt of measurement results (as for off-line analytical chemistry assay), with interim use of a provisional measurement value, can be accurately represented. The simulation model can be used to estimate inventory difference variances for processing areas that do not operate at steady state, to evaluate the timeliness of measurement information, to determine process impacts of measurement requirements, and to evaluate the effectiveness of diversion-detection algorithms. Such information is usually difficult to obtain by other means. Use of the measurement simulation model is illustrated by applying it to estimate inventory difference variances for two material balance area structures of a fictitious nuclear material processing line.

  9. From Business Value Model to Coordination Process Model

    Science.gov (United States)

    Fatemi, Hassan; van Sinderen, Marten; Wieringa, Roel

    The increased complexity of business webs calls for modeling the collaboration of enterprises from different perspectives, in particular the business and process perspectives, and for mutually aligning these perspectives. Business value modeling and coordination process modeling both are necessary for a good e-business design, but these activities have different goals and use different concepts. Nevertheless, the resulting models should be consistent with each other because they refer to the same system from different perspectives. Hence, checking the consistency between these models or producing one based on the other would be of high value. In this paper we discuss the issue of achieving consistency in multi-level e-business design and give guidelines to produce consistent coordination process models from business value models in a stepwise manner.

  10. Simulation guided value stream mapping and lean improvement: A case study of a tubular machining facility

    Directory of Open Access Journals (Sweden)

    Wei Xia

    2013-06-01

Purpose: This paper describes a typical Value stream mapping (VSM) application enhanced by discrete event simulation (DES) for a dedicated tubular manufacturing process. Design/Methodology/Approach: VSM, prescribed as part of the lean production portfolio of tools, not only highlights process inefficiencies and transactional and communication mismatches, but also guides improvement. Meanwhile, DES is used to reduce uncertainty and create consensus by visualizing dynamic process views. It serves as a complementary tool to traditional VSM, providing the justification and quantifiable evidence needed to make the case for lean approaches. A simulation model is developed to replicate the operation of an existing system, and that of a proposed system that modifies the existing design to incorporate lean manufacturing shop-floor principles. Findings: A comprehensive model of the tubular manufacturing process is constructed, and distinctive scenarios are derived to uncover an optimal future state of the process. Various simulation scenarios are developed. The simulated results are acquired and investigated, and they match the real production data well. Originality/Value: DES is demonstrated as a guiding tool to assist organizations with the decision to implement lean approaches by quantifying the benefits of applying VSM. A roadmap is provided to illustrate how VSM is used to design a desired future state. The developed simulation scenarios mimic the behavior of the actual manufacturing process in an intuitive manner.

  11. Continuous Spatial Process Models for Spatial Extreme Values

    KAUST Repository

    Sang, Huiyan

    2010-01-28

We propose a hierarchical modeling approach for explaining a collection of point-referenced extreme values. In particular, annual maxima over space and time are assumed to follow generalized extreme value (GEV) distributions, with parameters μ, σ, and ξ specified in the latent stage to reflect underlying spatio-temporal structure. The novelty here is that we relax the conditional independence assumption in the first stage of the hierarchical model, an assumption which has been adopted in previous work. This assumption implies that realizations of the surface of spatial maxima will be everywhere discontinuous. For many phenomena including, e.g., temperature and precipitation, this behavior is inappropriate. Instead, we offer a spatial process model for extreme values that provides mean square continuous realizations, where the behavior of the surface is driven by the spatial dependence which is unexplained under the latent spatio-temporal specification for the GEV parameters. In this sense, the first stage smoothing is viewed as fine scale or short range smoothing, while the larger scale smoothing will be captured in the second stage of the modeling. In addition, as would be desired, we are able to implement spatial interpolation for extreme values based on this model. A simulation study and a study on actual annual maximum rainfall for a region in South Africa are used to illustrate the performance of the model. © 2009 International Biometric Society.
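As a rough illustration of the first-stage GEV assumption, one can fit a GEV to annual maxima at a single site and read off a return level. This is a plain i.i.d. fit on synthetic data, not the paper's hierarchical spatial model; the parameter values are invented, and scipy's sign convention for the shape parameter (c = −ξ) is noted in the comments.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Synthetic annual maxima at one site (a stand-in for the rainfall data).
true_c, loc, scale = -0.1, 50.0, 10.0  # scipy uses c = -xi for the GEV shape
maxima = stats.genextreme.rvs(true_c, loc=loc, scale=scale, size=60,
                              random_state=rng)

# Maximum-likelihood fit of the three GEV parameters.
c_hat, loc_hat, scale_hat = stats.genextreme.fit(maxima)

# 100-year return level: the 0.99 quantile of the fitted distribution.
rl100 = stats.genextreme.ppf(0.99, c_hat, loc=loc_hat, scale=scale_hat)
print(c_hat, loc_hat, scale_hat, rl100)
```

In the paper's hierarchy, the triple (μ, σ, ξ) would itself vary smoothly over space and time instead of being fitted independently site by site.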

  12. Impulse pumping modelling and simulation

    International Nuclear Information System (INIS)

    Pierre, B; Gudmundsson, J S

    2010-01-01

    Impulse pumping is a new pumping method based on propagation of pressure waves. Of particular interest is the application of impulse pumping to artificial lift situations, where fluid is transported from wellbore to wellhead using pressure waves generated at wellhead. The motor driven element of an impulse pumping apparatus is therefore located at wellhead and can be separated from the flowline. Thus operation and maintenance of an impulse pump are facilitated. The paper describes the different elements of an impulse pumping apparatus, reviews the physical principles and details the modelling of the novel pumping method. Results from numerical simulations of propagation of pressure waves in water-filled pipelines are then presented for illustrating impulse pumping physical principles, and validating the described modelling with experimental data.

  13. Bridging experiments, models and simulations

    DEFF Research Database (Denmark)

    Carusi, Annamaria; Burrage, Kevin; Rodríguez, Blanca

    2012-01-01

… understanding of living organisms and also how they can reduce, replace, and refine animal experiments. A fundamental requirement to fulfill these expectations and achieve the full potential of computational physiology is a clear understanding of what models represent and how they can be validated. The present … of biovariability; 2) testing and developing robust techniques and tools as a prerequisite to conducting physiological investigations; 3) defining and adopting standards to facilitate the interoperability of experiments, models, and simulations; and 4) understanding physiological validation as an iterative process … that contributes to defining the specific aspects of cardiac electrophysiology the MSE system targets, rather than being only an external test, and that this is driven by advances in experimental and computational methods and the combination of both.

  14. Norms and values in sociohydrological models

    Science.gov (United States)

    Roobavannan, Mahendran; van Emmerik, Tim H. M.; Elshafei, Yasmina; Kandasamy, Jaya; Sanderson, Matthew R.; Vigneswaran, Saravanamuthu; Pande, Saket; Sivapalan, Murugesu

    2018-02-01

    Sustainable water resources management relies on understanding how societies and water systems coevolve. Many place-based sociohydrology (SH) modeling studies use proxies, such as environmental degradation, to capture key elements of the social component of system dynamics. Parameters of assumed relationships between environmental degradation and the human response to it are usually obtained through calibration. Since these relationships are not yet underpinned by social-science theories, confidence in the predictive power of such place-based sociohydrologic models remains low. The generalizability of SH models therefore requires major advances in incorporating more realistic relationships, underpinned by appropriate hydrological and social-science data and theories. The latter is a critical input, since human culture - especially values and norms arising from it - influences behavior and the consequences of behaviors. This paper reviews a key social-science theory that links cultural factors to environmental decision-making, assesses how to better incorporate social-science insights to enhance SH models, and raises important questions to be addressed in moving forward. This is done in the context of recent progress in sociohydrological studies and the gaps that remain to be filled. The paper concludes with a discussion of challenges and opportunities in terms of generalization of SH models and the use of available data to allow future prediction and model transfer to ungauged basins.

  15. Norms and values in sociohydrological models

    Directory of Open Access Journals (Sweden)

    M. Roobavannan

    2018-02-01

Sustainable water resources management relies on understanding how societies and water systems coevolve. Many place-based sociohydrology (SH) modeling studies use proxies, such as environmental degradation, to capture key elements of the social component of system dynamics. Parameters of assumed relationships between environmental degradation and the human response to it are usually obtained through calibration. Since these relationships are not yet underpinned by social-science theories, confidence in the predictive power of such place-based sociohydrologic models remains low. The generalizability of SH models therefore requires major advances in incorporating more realistic relationships, underpinned by appropriate hydrological and social-science data and theories. The latter is a critical input, since human culture – especially values and norms arising from it – influences behavior and the consequences of behaviors. This paper reviews a key social-science theory that links cultural factors to environmental decision-making, assesses how to better incorporate social-science insights to enhance SH models, and raises important questions to be addressed in moving forward. This is done in the context of recent progress in sociohydrological studies and the gaps that remain to be filled. The paper concludes with a discussion of challenges and opportunities in terms of generalization of SH models and the use of available data to allow future prediction and model transfer to ungauged basins.

  16. Petroleum reservoir data for testing simulation models

    Energy Technology Data Exchange (ETDEWEB)

    Lloyd, J.M.; Harrison, W.

    1980-09-01

    This report consists of reservoir pressure and production data for 25 petroleum reservoirs. Included are 5 data sets for single-phase (liquid) reservoirs, 1 data set for a single-phase (liquid) reservoir with pressure maintenance, 13 data sets for two-phase (liquid/gas) reservoirs and 6 for two-phase reservoirs with pressure maintenance. Also given are ancillary data for each reservoir that could be of value in the development and validation of simulation models. A bibliography is included that lists the publications from which the data were obtained.

  17. Stochastic models to simulate paratuberculosis in dairy herds

    DEFF Research Database (Denmark)

    Nielsen, Søren Saxmose; Weber, M.F.; Kudahl, Anne Margrethe Braad

    2011-01-01

Stochastic simulation models are widely accepted as a means of assessing the impact of changes in daily management and the control of different diseases, such as paratuberculosis, in dairy herds. This paper summarises and discusses the assumptions of four stochastic simulation models and their use … While the models are somewhat different in their underlying principles and do put slightly different values on the different strategies, their overall findings are similar. Therefore, simulation models may be useful in planning paratuberculosis strategies in dairy herds, although as with all models caution …

  18. Creating Value in Marketing and Business Simulations: An Author's Viewpoint

    Science.gov (United States)

    Cadotte, Ernest R.

    2016-01-01

    Simulations are a form of competitive training that can provide transformational learning. Participants are pushed by the competition and their own desire to win as well as the continual feedback, encouragement, and guidance of a Business Coach. Simulations enable students to apply their knowledge and practice their business skills over and over.…

  19. Quantile uncertainty and value-at-risk model risk.

    Science.gov (United States)

    Alexander, Carol; Sarabia, José María

    2012-08-01

    This article develops a methodology for quantifying model risk in quantile risk estimates. The application of quantile estimates to risk assessment has become common practice in many disciplines, including hydrology, climate change, statistical process control, insurance and actuarial science, and the uncertainty surrounding these estimates has long been recognized. Our work is particularly important in finance, where quantile estimates (called Value-at-Risk) have been the cornerstone of banking risk management since the mid 1980s. A recent amendment to the Basel II Accord recommends additional market risk capital to cover all sources of "model risk" in the estimation of these quantiles. We provide a novel and elegant framework whereby quantile estimates are adjusted for model risk, relative to a benchmark which represents the state of knowledge of the authority that is responsible for model risk. A simulation experiment in which the degree of model risk is controlled illustrates how to quantify Value-at-Risk model risk and compute the required regulatory capital add-on for banks. An empirical example based on real data shows how the methodology can be put into practice, using only two time series (daily Value-at-Risk and daily profit and loss) from a large bank. We conclude with a discussion of potential applications to nonfinancial risks. © 2012 Society for Risk Analysis.
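A minimal sketch of quantile (Value-at-Risk) estimation and its sampling uncertainty is given below. The heavy-tailed series standing in for daily profit and loss is simulated, and the bootstrap interval is a generic device for exposing estimation uncertainty, not the benchmark-adjustment framework the article develops.

```python
import numpy as np

rng = np.random.default_rng(0)
pnl = rng.standard_t(df=5, size=1000) * 1e4   # assumed daily P&L series (heavy-tailed)

alpha = 0.01                                   # 99% VaR level
var_est = -np.quantile(pnl, alpha)             # VaR is the negated lower quantile

# Bootstrap the quantile to expose the uncertainty surrounding the estimate.
boot = [-np.quantile(rng.choice(pnl, size=pnl.size, replace=True), alpha)
        for _ in range(2000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"VaR99 = {var_est:,.0f}, 95% interval [{lo:,.0f}, {hi:,.0f}]")
```

The width of such an interval is one crude view of the "model risk" in a quantile estimate; the article's contribution is to quantify and capitalize it relative to a regulatory benchmark.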

  20. Advanced empirical estimate of information value for credit scoring models

    Directory of Open Access Journals (Sweden)

    Martin Řezáč

    2011-01-01

Credit scoring is a term for a wide spectrum of predictive models and their underlying techniques that aid financial institutions in granting credit. These methods decide who will get credit, how much credit they should get, and what further strategies will enhance the profitability of the borrowers to the lenders. Many statistical tools are available for measuring the quality, in the sense of predictive power, of credit scoring models. Because it is impossible to use a scoring model effectively without knowing how good it is, quality indexes like the Gini coefficient, the Kolmogorov-Smirnov statistic and the Information value are used to assess the quality of a given credit scoring model. The paper deals primarily with the Information value, sometimes called divergence. Commonly it is computed by discretising the data into bins using deciles, in which case one constraint is required to be met: the number of cases has to be nonzero for all bins. If this constraint is not fulfilled, there are practical procedures for preserving finite results. As an alternative to the empirical estimates, one can use kernel smoothing theory, which allows one to estimate unknown densities and consequently, using some numerical method for integration, to estimate the Information value. The main contribution of this paper is a proposal and description of the empirical estimate with supervised interval selection. This advanced estimate is based on the requirement to have at least k observations of scores of both good and bad clients in each considered interval, where k is a positive integer. A simulation study shows that this estimate outperforms both the empirical estimate using deciles and the kernel estimate. Furthermore, it shows a high dependency on the choice of the parameter k: if we choose too small a value, we get an overestimated Information value, and vice versa. The adjusted square root of the number of bad clients seems to be a reasonable compromise.
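A minimal empirical Information-value computation with decile binning might look as follows. The additive smoothing constant eps, used here to keep bins with zero counts finite, is an illustrative stand-in for the practical procedures the abstract mentions, not the proposed supervised interval selection; the synthetic scores and labels are likewise invented.

```python
import numpy as np
import pandas as pd

def information_value(score, bad, bins=10, eps=0.5):
    """Empirical IV with decile bins; eps smooths empty bins (cf. the nonzero-count constraint)."""
    df = pd.DataFrame({"score": score, "bad": bad})
    df["bin"] = pd.qcut(df["score"], q=bins, duplicates="drop")
    g = df.groupby("bin", observed=True)["bad"].agg(bad="sum", n="size")
    g["good"] = g["n"] - g["bad"]
    # Smoothed shares of goods and bads per bin.
    pg = (g["good"] + eps) / (g["good"].sum() + eps * len(g))
    pb = (g["bad"] + eps) / (g["bad"].sum() + eps * len(g))
    # IV = sum over bins of (share_good - share_bad) * ln(share_good / share_bad)
    return float(np.sum((pg - pb) * np.log(pg / pb)))

rng = np.random.default_rng(7)
bad = rng.integers(0, 2, size=5000)
score = rng.normal(loc=bad * -0.8, scale=1.0)   # lower scores for bad clients
print(information_value(score, bad))
```

The paper's refinement replaces the fixed decile cut points with intervals chosen so that every interval contains at least k goods and k bads.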

  1. Simulation and Modeling Application in Agricultural Mechanization

    Directory of Open Access Journals (Sweden)

    R. M. Hudzari

    2012-01-01

This experiment was conducted to determine the equations relating the Hue digital values of the oil palm fruit surface to the maturity stage of the fruit in the plantation. The FFB images were zoomed and captured using a Nikon digital camera, and the Hue value was calculated using the highest frequency of the R, G, and B color components from histogram analysis software. A new procedure for monitoring the image pixel value of the oil palm fruit surface color during real-time growth to maturity was developed. The estimated harvesting day was calculated based on the developed model of the relationship between Hue values and mesocarp oil content. The simulation model is regressed and predicts the day of harvesting, or the number of days before harvest, of FFB. The results of the mesocarp oil content experiment can be used for real-time oil content determination with the MPOB color meter. The graph for determining the day of harvesting the FFB is presented in this research. The oil was found to start developing in the mesocarp at 65 days before the fruit reaches the ripe maturity stage of 75% oil to dry mesocarp.

  2. Extreme Value Predictions using Monte Carlo Simulations with Artificially Increased Load Spectrum

    DEFF Research Database (Denmark)

    Jensen, Jørgen Juncher

    2011-01-01

In the analysis of structures subjected to stationary stochastic load processes, the mean out-crossing rate plays an important role, as it can be used to determine the extreme value distribution of any response, usually assuming that the sequence of mean out-crossings can be modelled as a Poisson process … to be valid in the Monte Carlo simulations, making it possible to increase the out-crossing rates and thus reduce the necessary length of the time domain simulations by applying a larger load spectrum than relevant from a design point of view. The mean out-crossing rate thus obtained can then afterwards … be scaled down to its actual value. In the present paper the usefulness of this approach is investigated, considering problems related to wave loads on marine structures. Here the load scale parameter is conveniently taken as the square of the significant wave height.
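The scaling idea rests on the standard Poisson out-crossing result: if ν(x) is the mean out-crossing rate of level x, the extreme-value distribution of the response over a duration T is approximately

```latex
P\left\{\max_{0 \le t \le T} X(t) \le x\right\} \simeq \exp\bigl(-\nu(x)\,T\bigr)
```

so a mean out-crossing rate estimated efficiently under an artificially inflated load spectrum can afterwards be scaled back to the design spectrum before being inserted into this expression.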

  3. Model for transient simulation in a PWR steam circuit

    International Nuclear Information System (INIS)

    Mello, L.A. de.

    1982-11-01

A computer code (SURF) was developed and used to simulate pressure losses along the tubes of the main steam circuit of a PWR nuclear power plant, and the steam flow through relief and safety valves when the pressure reaches its threshold values. A thermodynamic model of the turbines (high and low pressure) and their associated components is simulated as well. The SURF computer code was coupled to the GEVAP computer code, complementing the simulation of the main steam circuit of a PWR nuclear power plant. (Author) [pt

  4. Systematic simulations of modified gravity: chameleon models

    Energy Technology Data Exchange (ETDEWEB)

    Brax, Philippe [Institut de Physique Theorique, CEA, IPhT, CNRS, URA 2306, F-91191Gif/Yvette Cedex (France); Davis, Anne-Christine [DAMTP, Centre for Mathematical Sciences, University of Cambridge, Wilberforce Road, Cambridge CB3 0WA (United Kingdom); Li, Baojiu [Institute for Computational Cosmology, Department of Physics, Durham University, Durham DH1 3LE (United Kingdom); Winther, Hans A. [Institute of Theoretical Astrophysics, University of Oslo, 0315 Oslo (Norway); Zhao, Gong-Bo, E-mail: philippe.brax@cea.fr, E-mail: a.c.davis@damtp.cam.ac.uk, E-mail: baojiu.li@durham.ac.uk, E-mail: h.a.winther@astro.uio.no, E-mail: gong-bo.zhao@port.ac.uk [Institute of Cosmology and Gravitation, University of Portsmouth, Portsmouth PO1 3FX (United Kingdom)

    2013-04-01

In this work we systematically study the linear and nonlinear structure formation in chameleon theories of modified gravity, using a generic parameterisation which describes a large class of models using only 4 parameters. For this we have modified the N-body simulation code ecosmog to perform a total of 65 simulations for different models and parameter values, including the default ΛCDM. These simulations enable us to explore a significant portion of the parameter space. We have studied the effects of modified gravity on the matter power spectrum and mass function, and found a rich and interesting phenomenology where the difference with the ΛCDM paradigm cannot be reproduced by a linear analysis even on scales as large as k ∼ 0.05 h Mpc⁻¹, since the latter incorrectly assumes that the modification of gravity depends only on the background matter density. Our results show that the chameleon screening mechanism is significantly more efficient than other mechanisms such as the dilaton and symmetron, especially in high-density regions and at early times, and can serve as a guidance to determine the parts of the chameleon parameter space which are cosmologically interesting and thus merit further studies in the future.

  5. Systematic simulations of modified gravity: chameleon models

    International Nuclear Information System (INIS)

    Brax, Philippe; Davis, Anne-Christine; Li, Baojiu; Winther, Hans A.; Zhao, Gong-Bo

    2013-01-01

In this work we systematically study the linear and nonlinear structure formation in chameleon theories of modified gravity, using a generic parameterisation which describes a large class of models using only 4 parameters. For this we have modified the N-body simulation code ecosmog to perform a total of 65 simulations for different models and parameter values, including the default ΛCDM. These simulations enable us to explore a significant portion of the parameter space. We have studied the effects of modified gravity on the matter power spectrum and mass function, and found a rich and interesting phenomenology where the difference with the ΛCDM paradigm cannot be reproduced by a linear analysis even on scales as large as k ∼ 0.05 h Mpc⁻¹, since the latter incorrectly assumes that the modification of gravity depends only on the background matter density. Our results show that the chameleon screening mechanism is significantly more efficient than other mechanisms such as the dilaton and symmetron, especially in high-density regions and at early times, and can serve as a guidance to determine the parts of the chameleon parameter space which are cosmologically interesting and thus merit further studies in the future.

  6. Distributed simulation a model driven engineering approach

    CERN Document Server

    Topçu, Okan; Oğuztüzün, Halit; Yilmaz, Levent

    2016-01-01

    Backed by substantive case studies, the novel approach to software engineering for distributed simulation outlined in this text demonstrates the potent synergies between model-driven techniques, simulation, intelligent agents, and computer systems development.

  7. Crowd Human Behavior for Modeling and Simulation

    Science.gov (United States)

    2009-08-06

Crowd Human Behavior for Modeling and Simulation. Elizabeth Mezzacappa, Ph.D., and Gordon Cooke, Target Behavioral Response Laboratory, ARDEC. Conference presentation, dates covered 2008 to 2009. The presentation addresses "understanding human behavior" and "model validation and verification", and focuses on modeling and simulation of crowds from a social scientist's perspective.

8. An approach to value-based simulator selection: the creation and evaluation of the simulator value index tool.

    Science.gov (United States)

    Rooney, Deborah M; Hananel, David M; Covington, Benjamin J; Dionise, Patrick L; Nykamp, Michael T; Pederson, Melvin; Sahloul, Jamal M; Vasquez, Rachael; Seagull, F Jacob; Pinsky, Harold M; Sweier, Domenica G; Cooke, James M

    2018-01-18

Currently there is no reliable, standardized mechanism to support health care professionals during the evaluation and procurement of simulators. A tool founded on best practices could facilitate simulator purchase processes. In a 3-phase process, we identified the top factors considered during the simulator purchase process through expert consensus (n = 127), created the Simulator Value Index (SVI) tool, evaluated targeted validity evidence, and evaluated the practical value of the SVI. A web-based survey was sent to simulation professionals. Participants (n = 79) used the SVI and provided feedback. We evaluated the practical value of 4 tool variations by calculating their sensitivity to predict a preferred simulator. Seventeen top factors were identified and ranked. The top 2 were technical stability/reliability of the simulator and customer service, with no practical differences in rank across institution or stakeholder role. Full SVI variations predicted the preferred simulator with good (87%) sensitivity, whereas the sensitivity of variations limited to cost and customer service or to cost and technical stability decreased (≤54%). The majority (73%) of participants agreed that the SVI was helpful at guiding simulator purchase decisions, and 88% agreed the SVI tool would help facilitate discussion with peers and leadership. Our findings indicate the SVI supports the simulator purchase process with a standardized framework. Sensitivity of the tool improved when factors extended beyond those traditionally targeted. We propose that the tool will facilitate discussion among simulation professionals, provide essential information for finance and procurement professionals, and improve the long-term value of simulation solutions. Limitations and application of the tool are discussed. Copyright © 2017 Elsevier Inc. All rights reserved.

  9. Issues in Value-at-Risk Modeling and Evaluation

    NARCIS (Netherlands)

    J. Daníelsson (Jón); C.G. de Vries (Casper); B.N. Jorgensen (Bjørn); P.F. Christoffersen (Peter); F.X. Diebold (Francis); T. Schuermann (Til); J.A. Lopez (Jose); B. Hirtle (Beverly)

    1998-01-01

Discusses the issues in value-at-risk modeling and evaluation: the value of value at risk; horizon problems and extreme events in financial risk management; and methods of evaluating value-at-risk estimates.

  10. Dissemination of Cultural Norms and Values: Agent-Based Modeling

    Directory of Open Access Journals (Sweden)

    Denis Andreevich Degterev

    2016-12-01

This article shows how agent-based modeling allows us to explore the mechanisms of the dissemination of cultural norms and values, both within one country and in the whole world. In recent years, this type of simulation has become particularly prevalent in the analysis of international relations, more popular than system dynamics and discrete event simulation. The use of agent-based modeling in the analysis of international relations is connected with the agent-structure problem in international relations: structure and agents act as interdependent and dynamically changing entities in the process of interaction. Agent-structure interaction can be modeled by means of the theory of complex adaptive systems with the use of agent-based modeling techniques. One of the first examples of the use of agent-based modeling in political science is T. Schelling's model of racial segregation. On the basis of this model, the author shows how changes in behavioral patterns at the micro-level impact the macro-level. Patterns change due to the dynamics of cultural norms and values, formed by mass media and other social institutions. The author shows the main areas of modern application of agent-based modeling in international studies, including the analysis of ethnic conflicts and the formation of international coalitions. Particular attention is paid to Robert Axelrod's approach, based on the use of genetic algorithms, to the spread of cultural norms and values. Agent-based modeling shows how to create conditions such that norms which originally are not shared by a significant part of the population eventually spread everywhere. The practical application of these algorithms is shown by the author on the example of the situation in Ukraine in 2015-2016. The article also reveals the mechanisms of the international spread of cultural norms and values. The main think tanks using agent-based modeling in international studies are …
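One classic Axelrod mechanism for the dissemination of culture (distinct from his genetic-algorithm work on norms) can be sketched in a few lines: neighboring agents interact with probability equal to their cultural similarity and copy one differing trait. The grid size, number of features, and traits per feature below are illustrative choices.

```python
import random

random.seed(2)
N, F, Q = 10, 3, 4   # grid size, cultural features, traits per feature
grid = [[[random.randrange(Q) for _ in range(F)] for _ in range(N)] for _ in range(N)]

def neighbors(i, j):
    """Von Neumann neighbors inside the grid."""
    return [(x, y) for x, y in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
            if 0 <= x < N and 0 <= y < N]

for _ in range(100_000):
    i, j = random.randrange(N), random.randrange(N)
    ni, nj = random.choice(neighbors(i, j))
    a, b = grid[i][j], grid[ni][nj]
    shared = [f for f in range(F) if a[f] == b[f]]
    # Interaction probability equals cultural similarity (Axelrod's rule).
    if 0 < len(shared) < F and random.random() < len(shared) / F:
        f = random.choice([f for f in range(F) if a[f] != b[f]])
        a[f] = b[f]   # adopt the neighbor's trait

regions = len({tuple(grid[i][j]) for i in range(N) for j in range(N)})
print(f"distinct cultural regions: {regions}")
```

The emergent outcome, either global convergence or stable pockets of distinct cultures, depends on F and Q, which is precisely the micro-to-macro linkage the article emphasizes.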

  11. Simulation Model for DMEK Donor Preparation.

    Science.gov (United States)

    Mittal, Vikas; Mittal, Ruchi; Singh, Swati; Narang, Purvasha; Sridhar, Priti

    2018-04-09

    To demonstrate a simulation model for donor preparation in Descemet membrane endothelial keratoplasty (DMEK). The inner transparent membrane of the onion (Allium cepa) was used as a simulation model for human Descemet membrane (DM). Surgical video (see Video, Supplemental Digital Content 1, http://links.lww.com/ICO/A663) demonstrating all the steps was recorded. This model closely simulates human DM and helps DMEK surgeons learn the nuances of DM donor preparation steps with ease. The technique is repeatable, and the model is cost-effective. The described simulation model can assist surgeons and eye bank technicians to learn steps in donor preparation in DMEK.

  12. Simulation and Modeling Methodologies, Technologies and Applications

    CERN Document Server

    Filipe, Joaquim; Kacprzyk, Janusz; Pina, Nuno

    2014-01-01

    This book includes extended and revised versions of a set of selected papers from the 2012 International Conference on Simulation and Modeling Methodologies, Technologies and Applications (SIMULTECH 2012) which was sponsored by the Institute for Systems and Technologies of Information, Control and Communication (INSTICC) and held in Rome, Italy. SIMULTECH 2012 was technically co-sponsored by the Society for Modeling & Simulation International (SCS), GDR I3, Lionphant Simulation, Simulation Team and IFIP and held in cooperation with AIS Special Interest Group of Modeling and Simulation (AIS SIGMAS) and the Movimento Italiano Modellazione e Simulazione (MIMOS).

  13. An introduction to enterprise modeling and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Ostic, J.K.; Cannon, C.E. [Los Alamos National Lab., NM (United States). Technology Modeling and Analysis Group

    1996-09-01

    As part of an ongoing effort to continuously improve productivity, quality, and efficiency of both industry and Department of Energy enterprises, Los Alamos National Laboratory is investigating various manufacturing and business enterprise simulation methods. A number of enterprise simulation software models are being developed to enable engineering analysis of enterprise activities. In this document the authors define the scope of enterprise modeling and simulation efforts, and review recent work in enterprise simulation at Los Alamos National Laboratory as well as at other industrial, academic, and research institutions. References of enterprise modeling and simulation methods and a glossary of enterprise-related terms are provided.

  14. Structured building model reduction toward parallel simulation

    Energy Technology Data Exchange (ETDEWEB)

    Dobbs, Justin R. [Cornell University; Hencey, Brondon M. [Cornell University

    2013-08-26

    Building energy model reduction exchanges accuracy for improved simulation speed by reducing the number of dynamical equations. Parallel computing aims to improve simulation times without loss of accuracy but is poorly utilized by contemporary simulators and is inherently limited by inter-processor communication. This paper bridges these disparate techniques to implement efficient parallel building thermal simulation. We begin with a survey of three structured reduction approaches that compares their performance to a leading unstructured method. We then use structured model reduction to find thermal clusters in the building energy model and allocate processing resources. Experimental results demonstrate faster simulation and low error without any interprocessor communication.

  15. A physiological production model for cacao : results of model simulations

    NARCIS (Netherlands)

    Zuidema, P.A.; Leffelaar, P.A.

    2002-01-01

    CASE2 is a physiological model for cocoa (Theobroma cacao L.) growth and yield. This report introduces the CAcao Simulation Engine for water-limited production in a non-technical way and presents simulation results obtained with the model.

  16. Simulation modeling and analysis with Arena

    CERN Document Server

    Altiok, Tayfur

    2007-01-01

Simulation Modeling and Analysis with Arena is a highly readable textbook which treats the essentials of the Monte Carlo discrete-event simulation methodology, and does so in the context of the popular Arena simulation environment. It treats simulation modeling as an in-vitro laboratory that facilitates the understanding of complex systems and experimentation with what-if scenarios in order to estimate their performance metrics. The book contains chapters on the simulation modeling methodology and the underpinnings of discrete-event systems, as well as the relevant underlying probability, statistics, stochastic processes, input analysis, model validation and output analysis. All simulation-related concepts are illustrated in numerous Arena examples, encompassing production lines, manufacturing and inventory systems, transportation systems, and computer information systems in networked settings. It introduces the concept of discrete-event Monte Carlo simulation, the most commonly used methodology for modeli...

  17. The Advancement Value Chain: An Exploratory Model

    Science.gov (United States)

    Leonard, Edward F., III

    2005-01-01

    Since the introduction of the value chain concept in 1985, several varying, yet virtually similar, value chains have been developed for the business enterprise. Shifting to higher education, can a value chain be found that links together the various activities of advancement so that an institution's leaders can actually look at the philanthropic…

  18. Psychosocial value of space simulation for extended spaceflight

    Science.gov (United States)

    Kanas, N.

    1997-01-01

There have been over 60 studies of Earth-bound activities that can be viewed as simulations of manned spaceflight. These analogs have involved Antarctic and Arctic expeditions, submarines and submersible simulators, land-based simulators, and hypodynamia environments. None of these analogs has accounted for all the variables related to extended spaceflight (e.g., microgravity, long duration, heterogeneous crews), and some of the simulation conditions have been found to be more representative of space conditions than others. A number of psychosocial factors have emerged from the simulation literature that correspond to important issues that have been reported from space. Psychological factors include sleep disorders, alterations in time sense, transcendent experiences, demographic issues, career motivation, homesickness, and increased perceptual sensitivities. Psychiatric factors include anxiety, depression, psychosis, psychosomatic symptoms, emotional reactions related to mission stage, asthenia, and postflight personality and marital problems. Finally, interpersonal factors include tension resulting from crew heterogeneity, decreased cohesion over time, need for privacy, and issues involving leadership roles and lines of authority. Since future space missions will usually involve heterogeneous crews working on complicated objectives over long periods of time, these features require further study. Socio-cultural factors affecting confined crews (e.g., language and dialect, cultural differences, gender biases) should be explored in order to minimize tension and sustain performance. Career motivation also needs to be examined for the purpose of improving crew cohesion and preventing subgrouping, scapegoating, and territorial behavior. Periods of monotony and reduced activity should be addressed in order to maintain morale, provide meaningful use of leisure time, and prevent negative consequences of low stimulation, such as asthenia and crew member withdrawal.

  19. Simulation of Heat Transfer to the Gas Coolant with Low Prandtl Number Value

    Directory of Open Access Journals (Sweden)

    T. N. Kulikova

    2015-01-01

The work concerns the peculiarities of simulating heat transfer to gas coolants with low values of the Prandtl number, in particular, to binary mixtures of inert gases. The paper presents simulation results for heat transfer to the fully established flow of a helium-xenon mixture in a round tube of 6 mm diameter with a boundary condition of the second kind. It considers the flow of three helium-xenon mixtures with different helium content and molecular Prandtl numbers in the range 0.239-0.322, with Reynolds numbers ranging from 10000 to 50000. During the numerical simulation the temperature factor changed from 1.034 to 1.061. The CFD code STAR-CCM+, designed for solving a wide range of problems of hydrodynamics, heat transfer and stress, was used as the primary software. The applicability of five models for the turbulent Prandtl number is examined. It is shown that the choice of model has a significant influence on the heat transfer coefficient. The paper presents structural characteristics of the flow in the wall region. It estimates the thermal stabilization section to be approximately 30 tube diameters long. Simulation results are compared with the known data on heat transfer to gas coolants with low values of the Prandtl number. It is shown that the V2F low-Reynolds-number k-ε turbulence model, with the Kays-Crawford-Weigand approximation for the turbulent Prandtl number, gives the best agreement with the results predicted by the relationships of W.M. Kays and B.S. Petukhov. An approximating correlation summarizes the set of simulation results. Application of these results is reasonable when conducting numerical simulation of heat transfer to binary gas mixtures in channels of different forms. The presented approximating correlation allows rapid estimation of heat transfer coefficients for gas coolants with a low value of the molecular Prandtl number within the investigated range with a flow through the …
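The quantity the study targets is just the molecular Prandtl number, Pr = cp·μ/k. A one-line check with assumed (illustrative, not the paper's) property values for a helium-xenon mixture:

```python
# Molecular Prandtl number Pr = cp * mu / k (property values assumed for illustration).
cp = 520.0    # specific heat, J/(kg*K)
mu = 3.4e-5   # dynamic viscosity, Pa*s
k = 0.055     # thermal conductivity, W/(m*K)
print(cp * mu / k)   # ~0.32, within the 0.239-0.322 range studied
```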

  20. Modelling and simulation of a heat exchanger

    Science.gov (United States)

    Xia, Lei; Deabreu-Garcia, J. Alex; Hartley, Tom T.

    1991-01-01

Two models for two different control systems are developed for a parallel heat exchanger. First, by spatially lumping a heat exchanger model, a good approximate model of high system order is produced. Model reduction techniques are then applied to obtain low-order models that are suitable for dynamic analysis and control design. The simulation method is discussed to ensure a valid simulation result.

  1. A simulation of water pollution model parameter estimation

    Science.gov (United States)

    Kibler, J. F.

    1976-01-01

    A parameter estimation procedure for a water pollution transport model is elaborated. A two-dimensional instantaneous-release shear-diffusion model serves as representative of a simple transport process. Pollution concentration levels are arrived at via modeling of a remote-sensing system. The remote-sensed data are simulated by adding Gaussian noise to the concentration level values generated via the transport model. Model parameters are estimated from the simulated data using a least-squares batch processor. Resolution, sensor array size, and number and location of sensor readings can be found from the accuracies of the parameter estimates.
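A least-squares recovery of transport-model parameters from noisy synthetic "sensor" data can be sketched as follows. The Gaussian concentration form and the parameter names (m, dx, dy) are illustrative assumptions, not the paper's exact shear-diffusion model or batch processor.

```python
import numpy as np
from scipy.optimize import curve_fit

def concentration(xy, m, dx, dy):
    """Instantaneous-release 2-D diffusion profile at a fixed time (illustrative form)."""
    x, y = xy
    return m * np.exp(-(x**2) / (4 * dx) - (y**2) / (4 * dy))

rng = np.random.default_rng(3)
x = rng.uniform(-5, 5, 200)
y = rng.uniform(-5, 5, 200)
true = (10.0, 1.5, 0.7)   # mass factor and two diffusivity terms

# Simulated remote-sensed data: model output plus Gaussian sensor noise.
obs = concentration((x, y), *true) + rng.normal(0, 0.2, x.size)

# Least-squares estimation of the model parameters from the noisy data.
popt, pcov = curve_fit(concentration, (x, y), obs, p0=(5, 1, 1))
print(popt, np.sqrt(np.diag(pcov)))   # estimates and their standard errors
```

Rerunning the fit while varying the number and placement of the sampled points mirrors the paper's question of how sensor-array design drives the achievable parameter accuracy.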

  2. Multi-valued simulation and abstraction using lattice operations

    NARCIS (Netherlands)

    Vijzelaar, Stefan; Fokkink, W.J.

    2017-01-01

    Abstractions can cause spurious results, which need to be verified in the concrete system to gain conclusive results. Verification based on a multi-valued logic can distinguish between conclusive and inconclusive results, provides increased precision, and allows for encoding additional information

  3. Comparison of perceived value structural models

    OpenAIRE

    Sunčana Piri Rajh

    2012-01-01

    Perceived value has been considered an important determinant of consumer shopping behavior and studied as such for a long period of time. According to one research stream, perceived value is a variable determined by perceived quality and perceived sacrifice. Another research stream suggests that the perception of value is a result of the consumer risk perception. This implies the presence of two somewhat independent research streams that are integrated by a third research stream – the one sug...

  4. Coupled Monte Carlo simulation and Copula theory for uncertainty analysis of multiphase flow simulation models.

    Science.gov (United States)

    Jiang, Xue; Na, Jin; Lu, Wenxi; Zhang, Yu

    2017-11-01

Simulation-optimization techniques are effective in identifying an optimal remediation strategy. Simulation models with uncertainty, primarily in the form of parameter uncertainty with different degrees of correlation, influence the reliability of the optimal remediation strategy. In this study, a coupled Monte Carlo simulation and Copula theory is proposed for uncertainty analysis of a simulation model when parameters are correlated. Using the self-adaptive weight particle swarm optimization Kriging method, a surrogate model was constructed to replace the simulation model and reduce the computational burden and time consumption resulting from repeated and multiple Monte Carlo simulations. The Akaike information criterion (AIC) and the Bayesian information criterion (BIC) were employed to identify whether the t Copula function or the Gaussian Copula function is the optimal Copula function to match the dependence structure of the parameters. The results show that both the AIC and BIC values of the t Copula function are less than those of the Gaussian Copula function, indicating that the t Copula function is the optimal function for matching the dependence structure of the parameters. The outputs of the simulation model when parameter correlation was considered and when it was ignored were compared. The results show that the amplitude of the fluctuation interval when parameter correlation was considered is less than the corresponding amplitude when parameter correlation was ignored. Moreover, it was demonstrated that considering the correlation among parameters is essential for uncertainty analysis of a simulation model, and the results of uncertainty analysis should be incorporated into the remediation strategy optimization process.
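Coupling copula theory with Monte Carlo sampling amounts to drawing correlated uniforms and pushing them through each parameter's marginal distribution. A minimal Gaussian-copula sketch is below; the paper found a t copula optimal, which would add a chi-squared scaling step before the CDF mapping, and the marginals and correlation value here are illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)

# Correlated parameters via a Gaussian copula: correlated normals -> uniforms -> marginals.
corr = np.array([[1.0, 0.6],
                 [0.6, 1.0]])
z = rng.multivariate_normal(mean=[0, 0], cov=corr, size=10_000)
u = stats.norm.cdf(z)   # correlated uniform marginals

k = stats.lognorm.ppf(u[:, 0], s=0.5, scale=1e-4)     # e.g., hydraulic conductivity
phi = stats.uniform.ppf(u[:, 1], loc=0.2, scale=0.2)  # e.g., porosity

# Each (k, phi) pair would be fed to the (surrogate) simulation model as one MC run.
print(np.corrcoef(k, phi)[0, 1])
```

Ignoring the correlation corresponds to sampling the two marginals independently, which, as the abstract reports, widens the fluctuation interval of the model outputs.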

  5. One-Step Dynamic Classifier Ensemble Model for Customer Value Segmentation with Missing Values

    OpenAIRE

    Jin Xiao; Bing Zhu; Geer Teng; Changzheng He; Dunhu Liu

    2014-01-01

Scientific customer value segmentation (CVS) is the basis of efficient customer relationship management, and customer credit scoring, fraud detection, and churn prediction all belong to CVS. In real CVS, the customer data usually include many missing values, which may greatly affect the performance of the CVS model. This study proposes a one-step dynamic classifier ensemble model for missing values (ODCEM). On the one hand, ODCEM integrates the preprocessing of missing values and the classif...

  6. VHDL simulation with access to transistor models

    Science.gov (United States)

    Gibson, J.

    1991-01-01

    Hardware description languages such as VHDL have evolved to aid in the design of systems with large numbers of elements and a wide range of electronic and logical abstractions. For high performance circuits, behavioral models may not be able to efficiently include enough detail to give designers confidence in a simulation's accuracy. One option is to provide a link between the VHDL environment and a transistor level simulation environment. The coupling of the Vantage Analysis Systems VHDL simulator and the NOVA simulator provides the combination of VHDL modeling and transistor modeling.

  7. Policy advice derived from simulation models

    NARCIS (Netherlands)

    Brenner, T.; Werker, C.

    2009-01-01

    When advising policy we face the fundamental problem that economic processes are connected with uncertainty and thus policy can err. In this paper we show how the use of simulation models can reduce policy errors. We suggest that policy is best based on socalled abductive simulation models, which

  8. Model Validation for Simulations of Vehicle Systems

    Science.gov (United States)

    2012-08-01

The report's validation case studies involve Sandia National Laboratories and a battery model developed in the Automotive Research Center, a US Army Center of Excellence for modeling and simulation of ground vehicle systems.

  9. Transient Modeling and Simulation of Compact Photobioreactors

    OpenAIRE

    Ribeiro, Robert Luis Lara; Mariano, André Bellin; Souza, Jeferson Avila; Vargas, Jose Viriato Coelho

    2008-01-01

    In this paper, a mathematical model is developed to make possible the simulation of microalgae growth and its dependency on medium temperature and light intensity. The model is utilized to simulate a compact photobioreactor response in time with physicochemical parameters of the microalgae Phaeodactylum tricornutum. The model allows for the prediction of the transient and local evolution of the biomass concentration in the photobioreactor with low computational time. As a result, the model is...

  10. Value Reappraisal as a Conceptual Model for Task-Value Interventions

    Science.gov (United States)

    Acee, Taylor W.; Weinstein, Claire Ellen; Hoang, Theresa V.; Flaggs, Darolyn A.

    2018-01-01

    We discuss task-value interventions as one type of relevance intervention and propose a process model of value reappraisal whereby task-value interventions elicit cognitive-affective responses that lead to attitude change and in turn affect academic outcomes. The model incorporates a metacognitive component showing that students can intentionally…

  11. Whole-building Hygrothermal Simulation Model

    DEFF Research Database (Denmark)

    Rode, Carsten; Grau, Karl

    2003-01-01

An existing integrated simulation tool for dynamic thermal simulation of buildings was extended with a transient model for moisture release and uptake in building materials. Validation of the new model was begun with comparison against measurements in an outdoor test cell furnished with single … materials. Almost quasi-steady, cyclic experiments were used to compare the indoor humidity variation and the numerical results of the integrated simulation tool with the new moisture model. Except for the case with chipboard as furnishing, the predictions of indoor humidity with the detailed model were …

  12. How processing digital elevation models can affect simulated water budgets

    Science.gov (United States)

    Kuniansky, E.L.; Lowery, M.A.; Campbell, B.G.

    2009-01-01

    For regional models, the shallow water table surface is often used as a source/sink boundary condition, as model grid scale precludes simulation of the water table aquifer. This approach is appropriate when the water table surface is relatively stationary. Since water table surface maps are not readily available, the elevation of the water table used in model cells is estimated via a two-step process. First, a regression equation is developed using existing land and water table elevations from wells in the area. This equation is then used to predict the water table surface for each model cell using land surface elevation available from digital elevation models (DEM). Two methods of processing DEM for estimating the land surface for each cell are commonly used (value nearest the cell centroid or mean value in the cell). This article demonstrates how these two methods of DEM processing can affect the simulated water budget. For the example presented, approximately 20% more total flow through the aquifer system is simulated if the centroid value rather than the mean value is used. This is due to the one-third greater average ground water gradients associated with the centroid value than the mean value. The results will vary depending on the particular model area topography and cell size. The use of the mean DEM value in each model cell will result in a more conservative water budget and is more appropriate because the model cell water table value should be representative of the entire cell area, not the centroid of the model cell.
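    For illustration only (not from the article), a minimal NumPy sketch of the two DEM-processing choices, assuming the DEM resolution divides the model cell size evenly:

```python
# Minimal sketch: two common ways of assigning a land-surface elevation
# to each model cell from a finer-resolution DEM.
import numpy as np

def cell_elevations(dem, block):
    """dem: 2D array of DEM elevations; block: DEM pixels per model
    cell side (assumes dem dimensions are multiples of block)."""
    rows, cols = dem.shape[0] // block, dem.shape[1] // block
    centroid = np.empty((rows, cols))
    mean = np.empty((rows, cols))
    for i in range(rows):
        for j in range(cols):
            cell = dem[i*block:(i+1)*block, j*block:(j+1)*block]
            centroid[i, j] = cell[block // 2, block // 2]  # pixel nearest the centroid (approximate)
            mean[i, j] = cell.mean()                       # mean over the whole cell
    return centroid, mean

# Example: a synthetic 8x8 DEM aggregated to 2x2 model cells.
dem = np.arange(64, dtype=float).reshape(8, 8)
c, m = cell_elevations(dem, 4)
```

    Because the centroid estimate samples a single pixel, it inherits local relief, while the mean smooths it; this is consistent with the article's finding that the mean yields gentler gradients and a more conservative water budget.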

  13. Modeling and simulation technology readiness levels.

    Energy Technology Data Exchange (ETDEWEB)

    Clay, Robert L.; Shneider, Max S.; Marburger, S. J.; Trucano, Timothy Guy

    2006-01-01

    This report summarizes the results of an effort to establish a framework for assigning and communicating technology readiness levels (TRLs) for the modeling and simulation (ModSim) capabilities at Sandia National Laboratories. This effort was undertaken as a special assignment for the Weapon Simulation and Computing (WSC) program office led by Art Hale, and lasted from January to September 2006. This report summarizes the results, conclusions, and recommendations, and is intended to help guide the program office in their decisions about the future direction of this work. The work was broken out into several distinct phases, starting with establishing the scope and definition of the assignment. These are characterized in a set of key assertions provided in the body of this report. Fundamentally, the assignment involved establishing an intellectual framework for TRL assignments to Sandia's modeling and simulation capabilities, including the development and testing of a process to conduct the assignments. To that end, we proposed a methodology for both assigning and understanding the TRLs, and outlined some of the restrictions that need to be placed on this process and the expected use of the result. One of the first assumptions we overturned was the notion of a "static" TRL - rather we concluded that problem context was essential in any TRL assignment, and that leads to dynamic results (i.e., a ModSim tool's readiness level depends on how it is used, and by whom). While we leveraged the classic TRL results from NASA, DoD, and Sandia's NW program, we came up with a substantially revised version of the TRL definitions, maintaining consistency with the classic level definitions and the Predictive Capability Maturity Model (PCMM) approach. In fact, we substantially leveraged the foundation the PCMM team provided, and augmented that as needed. Given the modeling and simulation TRL definitions and our proposed assignment methodology, we

  14. Managing health care decisions and improvement through simulation modeling.

    Science.gov (United States)

    Forsberg, Helena Hvitfeldt; Aronsson, Håkan; Keller, Christina; Lindblad, Staffan

    2011-01-01

    Simulation modeling is a way to test changes in a computerized environment to give ideas for improvements before implementation. This article reviews research literature on simulation modeling as support for health care decision making. The aim is to investigate the experience and potential value of such decision support and quality of articles retrieved. A literature search was conducted, and the selection criteria yielded 59 articles derived from diverse applications and methods. Most met the stated research-quality criteria. This review identified how simulation can facilitate decision making and that it may induce learning. Furthermore, simulation offers immediate feedback about proposed changes, allows analysis of scenarios, and promotes communication on building a shared system view and understanding of how a complex system works. However, only 14 of the 59 articles reported on implementation experiences, including how decision making was supported. On the basis of these articles, we proposed steps essential for the success of simulation projects, not just in the computer, but also in clinical reality. We also presented a novel concept combining simulation modeling with the established plan-do-study-act cycle for improvement. Future scientific inquiries concerning implementation, impact, and the value for health care management are needed to realize the full potential of simulation modeling.

  15. Simulation modeling for the health care manager.

    Science.gov (United States)

    Kennedy, Michael H

    2009-01-01

    This article addresses the use of simulation software to solve administrative problems faced by health care managers. Spreadsheet add-ins, process simulation software, and discrete event simulation software are available at a range of costs and complexity. All use the Monte Carlo method to realistically integrate probability distributions into models of the health care environment. Problems typically addressed by health care simulation modeling are facility planning, resource allocation, staffing, patient flow and wait time, routing and transportation, supply chain management, and process improvement.
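    As a toy illustration of the Monte Carlo idea these tools share (the distribution and parameters below are invented, not taken from the article):

```python
# Illustrative sketch only: Monte Carlo sampling of patient service times
# to estimate a clinic's expected daily workload.
import random

def simulate_day(n_patients=50, mean_service_min=12.0, runs=10_000):
    totals = []
    for _ in range(runs):
        # Exponential service times are a common first assumption.
        total = sum(random.expovariate(1.0 / mean_service_min)
                    for _ in range(n_patients))
        totals.append(total)
    return sum(totals) / runs  # expected total service time in minutes

print(f"Expected workload: {simulate_day():.0f} minutes")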

  16. PV panel model based on datasheet values

    DEFF Research Database (Denmark)

    Sera, Dezso; Teodorescu, Remus; Rodriguez, Pedro

    2007-01-01

    This work presents the construction of a model for a PV panel using the single-diode five-parameters model, based exclusively on data-sheet parameters. The model takes into account the series and parallel (shunt) resistance of the panel. The equivalent circuit and the basic equations of the PV cell...
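    The record truncates before the equations; for context, the textbook single-diode five-parameter relation it refers to is commonly written as follows (standard symbols, not quoted from the paper):

```latex
% Single-diode, five-parameter PV model: photocurrent I_ph, diode
% saturation current I_0, modified ideality factor a, series resistance
% R_s and shunt resistance R_sh are the five parameters.
I = I_{ph} - I_0\left[\exp\!\left(\frac{V + I R_s}{a}\right) - 1\right]
    - \frac{V + I R_s}{R_{sh}}
```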

  17. On the added value of WUDAPT for Urban Climate Modelling

    Science.gov (United States)

    Brousse, Oscar; Martilli, Alberto; Mills, Gerald; Bechtel, Benjamin; Hammerberg, Kris; Demuzere, Matthias; Wouters, Hendrik; Van Lipzig, Nicole; Ren, Chao; Feddema, Johannes J.; Masson, Valéry; Ching, Jason

    2017-04-01

    Over half of the planet's population now lives in cities, a share expected to grow to 65% by 2050 (United Nations, 2014), with most of that growth occurring in emerging cities of the global South. Cities' impact on climate is known to be a key driver of environmental change (IPCC, 2014) and has been studied for decades (Howard, 1875). Still, very little is known about the structure of cities around the world, which prevents urban climate simulations from being performed and hence guidance from being provided for mitigation. To bridge this urban knowledge gap for urban climate modelling, the World Urban Database and Access Portal Tool - WUDAPT - project (Ching et al., 2015; Mills et al., 2015) developed an innovative technique to map cities globally, rapidly and freely. The framework established by Bechtel and Daneke (2012) derives Local Climate Zone (Stewart and Oke, 2012) city maps from LANDSAT 8 OLI-TIRS imagery (Bechtel et al., 2015) through supervised classification with a Random Forest algorithm (Breiman, 2001). The first attempt to implement Local Climate Zones (LCZ) from the WUDAPT product within a major climate model was carried out by Brousse et al. (2016) over Madrid, Spain. This study proved the applicability of LCZs as an enhanced urban parameterization within the WRF model (Chen et al., 2011) employing the urban canopy model BEP-BEM (Martilli, 2002; Salamanca et al., 2010), using the averaged values of the morphological and physical parameter ranges proposed by Stewart and Oke (2012). Other studies have since used Local Climate Zones for urban climate modelling purposes (Alexander et al., 2016; Wouters et al., 2016; Hammerberg et al., 2017; Brousse et al., 2017) and demonstrated the added value of the WUDAPT dataset. As urban data accessibility is one of the major challenges for simulations in emerging countries, this presentation will show results of simulations using LCZs and the capacity of the WUDAPT framework to be

  18. Modeling and simulation of blood collection systems.

    Science.gov (United States)

    Alfonso, Edgar; Xie, Xiaolan; Augusto, Vincent; Garraud, Olivier

    2012-03-01

    This paper addresses the modeling and simulation of blood collection systems in France for both fixed-site and mobile blood collection with walk-in whole blood donors and scheduled plasma and platelet donors. Petri net models are first proposed to precisely describe the different blood collection processes, donor behaviors, their material/human resource requirements and relevant regulations. The Petri net models are then enriched with quantitative modeling of donor arrivals, donor behaviors, activity times and resource capacity. Relevant performance indicators are defined. The resulting simulation models can be straightforwardly implemented with any simulation language. Numerical experiments are performed to show how the simulation models can be used to select, for different walk-in donor arrival patterns, appropriate human resource planning and donor appointment strategies.

  19. The New Digital Media Value Network: Proposing an Interactive Model of Digital Media Value Activities

    Directory of Open Access Journals (Sweden)

    Sylvia Chan-Olmsted

    2016-07-01

    Full Text Available This study models the dynamic nature of today’s media markets using the framework of value-adding activities in the provision and consumption of media products. The proposed user-centric approach introduces the notion that the actions of external users, social media, and interfaces affect the internal value activities of media firms via a feedback loop, and therefore should themselves be considered value activities. The model also suggests a more comprehensive list of indicators for value assessment.

  20. TESTING BRAND VALUE MEASUREMENT METHODS IN A RANDOM COEFFICIENT MODELING FRAMEWORK

    OpenAIRE

    Szõcs Attila

    2014-01-01

    Our objective is to provide a framework for measuring brand equity, that is, the added value to the product endowed by the brand. Based on a demand and supply model, we propose a structural model that enables testing the structural effect of brand equity (demand side effect) on brand value (supply side effect), using Monte Carlo simulation. Our main research question is which of the three brand value measurement methods (price premium, revenue premium and profit premium) is more suitable from...

  1. Mathematical and Simulation Model Development of Switched Reluctance Motor

    Directory of Open Access Journals (Sweden)

    S. V. Aleksandrovsky

    2011-01-01

    Full Text Available The switched reluctance motor (SRM) is of great interest in various fields as an alternative to asynchronous motors with a short-circuited rotor. A disadvantage of the SRM is the nonlinearity of its characteristics. For this reason it is desirable to carry out investigations using a developed simulation model. The simulation results (electromagnetic torque and current) are in good agreement with the values reported in the literature.

  2. TESTING BRAND VALUE MEASUREMENT METHODS IN A RANDOM COEFFICIENT MODELING FRAMEWORK

    Directory of Open Access Journals (Sweden)

    Szõcs Attila

    2014-07-01

    Full Text Available Our objective is to provide a framework for measuring brand equity, that is, the added value to the product endowed by the brand. Based on a demand and supply model, we propose a structural model that enables testing the structural effect of brand equity (demand side effect) on brand value (supply side effect), using Monte Carlo simulation. Our main research question is which of the three brand value measurement methods (price premium, revenue premium and profit premium) is most suitable from the perspective of the structural link between brand equity and brand value. Our model is based on recent developments in random coefficients model applications.

  3. Modeling and Simulation of Matrix Converter

    DEFF Research Database (Denmark)

    Liu, Fu-rong; Klumpner, Christian; Blaabjerg, Frede

    2005-01-01

    This paper discusses the modeling and simulation of the matrix converter. Two models of the matrix converter are presented: one is based on indirect space vector modulation and the other is based on the power balance equation. The basis of these two models is given and the process of modeling is introduced...

  4. Modeling value creation with enterprise architecture

    NARCIS (Netherlands)

    Singh, Prince Mayurank; Jonkers, H.; Iacob, Maria Eugenia; van Sinderen, Marten J.

    2014-01-01

    Firms may not succeed in business if strategies are not properly implemented in practice. Every firm needs to know, represent and master its value creation logic, not only to stay in business but also to keep growing. This paper is about focusing on an important topic in the field of strategic

  5. The added value of business models

    NARCIS (Netherlands)

    Vliet, Harry van

    An overview of innovations in a particular area, for example retail developments in the fashion sector (Van Vliet, 2014), and a subsequent discussion about the probability as to whether these innovations will realise a ‘breakthrough’, has to be supplemented with the question of what the added value

  6. Physically realistic modeling of maritime training simulation

    OpenAIRE

    Cieutat, Jean-Marc

    2003-01-01

    Maritime training simulation is an important part of maritime teaching, and it requires a lot of scientific and technical skills. In this framework, where the real-time constraint has to be maintained, not all physical phenomena can be studied; only the most visible physical phenomena, relating to the natural elements and the ship's behaviour, are reproduced. Our swell model, based on a surface wave simulation approach, makes it possible to simulate the shape and the propagation of a regular train of waves f...

  7. Systematic modelling and simulation of refrigeration systems

    DEFF Research Database (Denmark)

    Rasmussen, Bjarne D.; Jakobsen, Arne

    1998-01-01

    The task of developing a simulation model of a refrigeration system can be very difficult and time consuming. In order for this process to be effective, a systematic method for developing the system model is required. This method should aim at guiding the developer to clarify the purpose of the simulation, to select appropriate component models and to set up the equations in a well-arranged way. In this paper the outline of such a method is proposed and examples showing the use of this method for simulation of refrigeration systems are given.

  8. Modeling and simulation of complex systems a framework for efficient agent-based modeling and simulation

    CERN Document Server

    Siegfried, Robert

    2014-01-01

    Robert Siegfried presents a framework for efficient agent-based modeling and simulation of complex systems. He compares different approaches for describing structure and dynamics of agent-based models in detail. Based on this evaluation the author introduces the "General Reference Model for Agent-based Modeling and Simulation" (GRAMS). Furthermore he presents parallel and distributed simulation approaches for execution of agent-based models - from small scale to very large scale. The author shows how agent-based models may be executed by different simulation engines that utilize underlying hard

  9. Magnetosphere Modeling: From Cartoons to Simulations

    Science.gov (United States)

    Gombosi, T. I.

    2017-12-01

    Over the last half-century, physics-based global computer simulations became a bridge between experiment and basic theory, and they now represent the "third pillar" of geospace research. Today, many of our scientific publications utilize large-scale simulations to interpret observations, test new ideas, plan campaigns, or design new instruments. Realistic simulations of the complex Sun-Earth system have been made possible by the dramatically increased power of both computing hardware and numerical algorithms. Early magnetosphere models were based on simple E&M concepts (like the Chapman-Ferraro cavity) and hydrodynamic analogies (bow shock). At the beginning of the space age, current system models were developed, culminating in the sophisticated Tsyganenko-type description of the magnetic configuration. The first 3D MHD simulations of the magnetosphere were published in the early 1980s. A decade later there were several competing global models that were able to reproduce many fundamental properties of the magnetosphere. The leading models included the impact of the ionosphere by using a height-integrated electric potential description. Dynamic coupling of global and regional models started in the early 2000s by integrating a ring current and a global magnetosphere model. It has been recognized for quite some time that plasma kinetic effects play an important role. Presently, global hybrid simulations of the dynamic magnetosphere are expected to be possible on exascale supercomputers, while fully kinetic simulations with realistic mass ratios are still decades away. In the 2010s several groups started to experiment with PIC simulations embedded in large-scale 3D MHD models. Presently this integrated MHD-PIC approach is at the forefront of magnetosphere simulations, and this technique is expected to lead to some important advances in our understanding of magnetospheric physics. This talk will review the evolution of magnetosphere modeling from cartoons to current systems.

  10. Dynamic Average-Value Modeling of Doubly-Fed Induction Generator Wind Energy Conversion Systems

    Science.gov (United States)

    Shahab, Azin

    In a Doubly-fed Induction Generator (DFIG) wind energy conversion system, the rotor of a wound-rotor induction generator is connected to the grid via a partial-scale ac/ac power electronic converter which controls the rotor frequency and speed. In this research, detailed models of the DFIG wind energy conversion system with a Sinusoidal Pulse-Width Modulation (SPWM) scheme and an Optimal Pulse-Width Modulation (OPWM) scheme for the power electronic converter are developed in PSCAD/EMTDC. As computer simulation using the detailed models tends to be computationally intensive, time consuming and sometimes even impractical in terms of speed, two modified approaches (switching-function modeling and average-value modeling) are proposed to reduce the simulation execution time. The results demonstrate that the two proposed approaches reduce the simulation execution time while the simulation results remain close to those obtained using the detailed model simulation.

  11. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    A familiar example of a feedback loop is the business model in which part of the output or profit is fed back as input or additional capital - for instance, a company may choose to reinvest 10% of the profit for expansion of the business. Such simple models, like ..... would help scientists, engineers and managers towards better.
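    A minimal sketch of such a reinvestment feedback loop (the profit margin and horizon are invented for illustration):

```python
# Toy sketch of the feedback loop described above: 10% of each period's
# profit is fed back as additional capital.
capital = 100_000.0
margin = 0.20                      # hypothetical profit per unit of capital
for year in range(1, 6):
    profit = margin * capital      # output of the period
    capital += 0.10 * profit       # feedback: reinvest 10% of the profit
    print(f"Year {year}: capital = {capital:,.0f}")
```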

  12. Complex Simulation Model of Mobile Fading Channel

    Directory of Open Access Journals (Sweden)

    Tomas Marek

    2005-01-01

    Full Text Available In the mobile communication environment the mobile channel is the main obstacle limiting the performance of a wireless system. Modeling of the radio channel consists of two basic fading mechanisms - long-term fading and short-term fading. This contribution deals with the simulation of a complex mobile radio channel, that is, a channel with all fading components. The simulation model is based on the Clarke-Gans theoretical model for fading channels and is developed in the MATLAB environment. Simulation results have shown very good agreement with theory. This model was developed for the hybrid adaptation 3G uplink simulator (described in this issue) during the research project VEGA - 1/0140/03.
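    One common way to realize a Clarke-type channel in code is a sum-of-sinusoids approximation; the sketch below (ours, with arbitrary parameters, not the authors' MATLAB implementation) generates a flat Rayleigh-fading complex envelope:

```python
# Hedged sketch of a Clarke-type flat Rayleigh fading generator using a
# sum-of-sinusoids approximation of the Clarke-Gans model.
import numpy as np

def clarke_fading(f_d=100.0, fs=10_000.0, duration=1.0, n_paths=64, seed=0):
    rng = np.random.default_rng(seed)
    t = np.arange(0.0, duration, 1.0 / fs)
    alpha = rng.uniform(0.0, 2.0 * np.pi, n_paths)   # angles of arrival
    phi = rng.uniform(0.0, 2.0 * np.pi, n_paths)     # initial phases
    g = np.zeros(t.size, dtype=complex)
    for a, p in zip(alpha, phi):
        # Each path contributes a Doppler-shifted sinusoid.
        g += np.exp(1j * (2.0 * np.pi * f_d * np.cos(a) * t + p))
    return t, g / np.sqrt(n_paths)   # complex envelope; |g| is ~Rayleigh

t, g = clarke_fading()
```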

  13. Simulation Model Development for Mail Screening Process

    National Research Council Canada - National Science Library

    Vargo, Trish; Marvin, Freeman; Kooistra, Scott

    2005-01-01

    STUDY OBJECTIVE: Provide decision analysis support to the Homeland Defense Business Unit, Special Projects Team, in developing a simulation model to help determine the most effective way to eliminate backlog...

  14. SEIR model simulation for Hepatitis B

    Science.gov (United States)

    Side, Syafruddin; Irwan, Mulbar, Usman; Sanusi, Wahidah

    2017-09-01

    Mathematical modelling and simulation for Hepatitis B are discussed in this paper. The population is divided into four compartments, namely: Susceptible, Exposed, Infected and Recovered (SEIR). Several factors affect the population in this model: vaccination, immigration and emigration occurring in the population. The SEIR model yields a 4-D system of non-linear ordinary differential equations (ODEs), which is then reduced to 3-D. The SEIR model simulation is undertaken to predict the number of Hepatitis B cases. The results of the simulation indicate that the number of Hepatitis B cases will increase and then decrease over several months. The simulation using the number of cases in Makassar also found a basic reproduction number less than one, which means that Makassar city is not an endemic area of Hepatitis B.
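    A generic SEIR integration can be sketched as follows (the rates are placeholders, not the values fitted to the Makassar data in the record):

```python
# Illustrative SEIR integration with scipy; parameters are invented.
import numpy as np
from scipy.integrate import odeint

def seir(y, t, beta, sigma, gamma):
    s, e, i, r = y
    return [-beta * s * i,                 # susceptibles becoming exposed
            beta * s * i - sigma * e,      # exposed becoming infectious
            sigma * e - gamma * i,         # infectious recovering
            gamma * i]

y0 = [0.99, 0.0, 0.01, 0.0]                # population fractions
t = np.linspace(0.0, 365.0, 366)           # one year, daily steps
# With this scaling the basic reproduction number is R0 = beta / gamma.
sol = odeint(seir, y0, t, args=(0.3, 1.0 / 7.0, 1.0 / 14.0))
```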

  15. Simulation data mapping in virtual cardiac model.

    Science.gov (United States)

    Jiquan, Liu; Jingyi, Feng; Duan, Huilong; Siping, Chen

    2004-01-01

    Although the 3D heart and torso models with realistic geometry are the basis of simulation computation in the LFX virtual cardiac model, the simulation results are mostly output in 2D format. To solve this problem and enhance the virtual reality of the LFX virtual cardiac model, methods of voxel mapping and vertex projection mapping are presented. With these methods, the excitation isochrone map (EIM) was mapped from the heart model with realistic geometry to the real visible-man heart model, and the body surface potential map (BSPM) was mapped from the torso model with realistic geometry to the real visible-man body surface. By visualizing in 4Dview, a real-time 3D medical image visualization platform, the visualization results of the EIM and BSPM simulation data before and after mapping are also provided. According to the visualization results, the output format of the EIM and BSPM simulation data of the LFX virtual cardiac model was extended from 2D to 4D (spatio-temporal) and from the cardiac model with realistic geometry to the real cardiac model, and more realistic and effective simulation was achieved.

  16. Behavioral Influence Modeling and Simulation

    Science.gov (United States)

    2014-06-30

    [Snippets garbled in extraction; recoverable content:] Data were collected using an Android application developed and installed on a Samsung Galaxy tablet, which served as the Phase Two data collection tool. A category survey application recorded responses, and aggregate answers were broken down into their components (for example, "Apple" was answered by 80% of subjects as an aggregate covering four more specific answers).

  17. Modelling toolkit for simulation of maglev devices

    Science.gov (United States)

    Peña-Roche, J.; Badía-Majós, A.

    2017-01-01

    A stand-alone App has been developed, focused on obtaining information about relevant engineering properties of magnetic levitation systems. Our modelling toolkit provides real-time simulations of 2D magneto-mechanical quantities for superconductor (SC)/permanent magnet structures. The source code is open and may be customised for a variety of configurations. Ultimately, it relies on the variational statement of the critical state model for the superconducting component and has been verified against experimental data for YBaCuO/NdFeB assemblies. On a quantitative basis, the values of the arising forces, induced superconducting currents, as well as a plot of the magnetic field lines are displayed upon selection of an arbitrary trajectory of the magnet in the vicinity of the SC. The stability issues related to the cooling process, as well as the maximum attainable forces for a given material and geometry, are immediately observed. Due to the complexity of the problem, a strategy based on cluster computing, database compression, and real-time post-processing on the device has been implemented.

  18. Fully Adaptive Radar Modeling and Simulation Development

    Science.gov (United States)

    2017-04-01

    AFRL-RY-WP-TR-2017-0074, "Fully Adaptive Radar Modeling and Simulation Development", Kristine L. Bell and Anthony Kellems, Metron, Inc., Small Business Innovation Research (SBIR) Phase I report, April 2017, contract FA8650-16-M-1774. Approved for public release; distribution unlimited.

  19. The added value of the replica simulators in the exploitation of nuclear power plants

    International Nuclear Information System (INIS)

    Diaz Giron, P. A.; Ortega, F.; Rivero, N.

    2011-01-01

    Nuclear power plant full-scope replica simulators were in the past designed solely according to training criteria for operations personnel. Nevertheless, these simulators not only feature a high-fidelity replica control room but also provide an accurate process response. Control room replica simulators are presently based on complex technological platforms permitting the highest physical and functional fidelity, allowing them to be used as versatile, value-added tools in diverse plant operation and maintenance activities. In recent years, Tecnatom has extended the use of such simulators to different engineering applications. This article intends to identify the simulators' use in training and other applications beyond training. (Author)

  20. Theory, modeling, and simulation annual report, 1992

    Energy Technology Data Exchange (ETDEWEB)

    1993-05-01

    This report briefly discusses research on the following topics: development of electronic structure methods; modeling molecular processes in clusters; modeling molecular processes in solution; modeling molecular processes in separations chemistry; modeling interfacial molecular processes; modeling molecular processes in the atmosphere; methods for periodic calculations on solids; chemistry and physics of minerals; graphical user interfaces for computational chemistry codes; visualization and analysis of molecular simulations; integrated computational chemistry environment; and benchmark computations.

  1. Modeling of magnetic particle suspensions for simulations

    CERN Document Server

    Satoh, Akira

    2017-01-01

    The main objective of the book is to highlight the modeling of magnetic particles with different shapes and magnetic properties, to provide graduate students and young researchers information on the theoretical aspects and actual techniques for the treatment of magnetic particles in particle-based simulations. In simulation, we focus on the Monte Carlo, molecular dynamics, Brownian dynamics, lattice Boltzmann and stochastic rotation dynamics (multi-particle collision dynamics) methods. The latter two simulation methods can simulate both the particle motion and the ambient flow field simultaneously. In general, specialized knowledge can only be obtained in an effective manner under the supervision of an expert. The present book is written to play such a role for readers who wish to develop the skill of modeling magnetic particles and develop a computer simulation program using their own ability. This book is therefore a self-learning book for graduate students and young researchers. Armed with this knowledge,...

  2. Challenges for Modeling and Simulation

    National Research Council Canada - National Science Library

    Johnson, James

    2002-01-01

    This document deals with modeling and simulation. Its strengths are the ability to study processes that rarely or never occur, evaluate a wide range of alternatives, and generate new ideas, new concepts and innovative solutions...

  3. One-Step Dynamic Classifier Ensemble Model for Customer Value Segmentation with Missing Values

    Directory of Open Access Journals (Sweden)

    Jin Xiao

    2014-01-01

    Full Text Available Scientific customer value segmentation (CVS) is the basis of efficient customer relationship management; customer credit scoring, fraud detection, and churn prediction all belong to CVS. In real CVS, the customer data usually include many missing values, which may greatly affect the performance of the CVS model. This study proposes a one-step dynamic classifier ensemble model for missing values (ODCEM model). On the one hand, ODCEM integrates the preprocessing of missing values and the classification modeling into one step; on the other hand, it utilizes multiple-classifier ensemble technology in constructing the classification models. The empirical results on the credit scoring dataset "German" from UCI and the real customer churn prediction dataset "China churn" show that ODCEM outperforms four commonly used "two-step" models and the ensemble based model LMF, and can provide better decision support for market managers.

  4. Modelling and Simulation of Wave Loads

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Thoft-Christensen, Palle

    1985-01-01

    A simple model of the wave load on slender members of offshore structures is described. The wave elevation of the sea state is modelled by a stationary Gaussian process. A new procedure to simulate realizations of the wave loads is developed. The simulation method assumes that the wave particle velocity can be approximated by a Gaussian Markov process. Known approximate results for the first passage density or, equivalently, the distribution of the extremes of wave loads are presented and compared with rather precise simulation results. It is demonstrated that the approximate results...

  5. Modelling and Simulation of Wave Loads

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Thoft-Christensen, Palle

    A simple model of the wave load on slender members of offshore structures is described. The wave elevation of the sea state is modelled by a stationary Gaussian process. A new procedure to simulate realizations of the wave loads is developed. The simulation method assumes that the wave particle velocity can be approximated by a Gaussian Markov process. Known approximate results for the first-passage density or, equivalently, the distribution of the extremes of wave loads are presented and compared with rather precise simulation results. It is demonstrated that the approximate results...

  6. Modelling and Simulation of Wave Loads

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Thoft-Christensen, Palle

    1985-01-01

    A simple model of the wave load on slender members of offshore structures is described. The wave elevation of the sea state is modelled by a stationary Gaussian process. A new procedure to simulate realizations of the wave loads is developed. The simulation method assumes that the wave particle velocity can be approximated by a Gaussian Markov process. Known approximate results for the first passage density or, equivalently, the distribution of the extremes of wave loads are presented and compared with rather precise simulation results. It is demonstrated that the approximate results...

  7. Modeling and simulation of discrete event systems

    CERN Document Server

    Choi, Byoung Kyu

    2013-01-01

    Computer modeling and simulation (M&S) allows engineers to study and analyze complex systems. Discrete-event system (DES)-M&S is used in modern management, industrial engineering, computer science, and the military. As computer speeds and memory capacity increase, so DES-M&S tools become more powerful and more widely used in solving real-life problems. Based on over 20 years of evolution within a classroom environment, as well as on decades-long experience in developing simulation-based solutions for high-tech industries, Modeling and Simulation of Discrete-Event Systems is the only book on

  8. Minimum-complexity helicopter simulation math model

    Science.gov (United States)

    Heffley, Robert K.; Mnich, Marc A.

    1988-01-01

    An example of a minimal-complexity simulation helicopter math model is presented. Motivating factors are the computational delays, cost, and inflexibility of the very sophisticated math models now in common use. A helicopter model form is given which addresses each of these factors and provides better engineering understanding of the specific handling-qualities features which are apparent to the simulator pilot. The technical approach begins with specification of the features which are to be modeled, followed by a build-up of individual vehicle components and definition of equations. Model matching and estimation procedures are given which enable the modeling of specific helicopters from basic data sources such as flight manuals. Checkout procedures are given which provide for total model validation. A number of possible model extensions and refinements are discussed. Math model computer programs are defined and listed.

  9. A simulation model for football championships

    OpenAIRE

    Koning, Ruud H.; Koolhaas, Michael; Renes, Gusta

    2001-01-01

    In this paper we discuss a simulation/probability model that identifies the team that is most likely to win a tournament. The model can also be used to answer other questions like ‘which team had a lucky draw?’ or ‘what is the probability that two teams meet at some moment in the tournament?’. Input to the simulation/probability model are scoring intensities, that are estimated as a weighted average of goals scored. The model has been used in practice to write articles for the popular press, ...
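    The core idea can be sketched with Poisson-distributed scores (the intensities and tie-break rule below are invented; the published model weights historical goals and handles the tournament structure, which is omitted here):

```python
# Minimal sketch: match scores drawn from Poisson scoring intensities,
# repeated to estimate each team's winning probability.
import numpy as np

rng = np.random.default_rng(42)
intensities = {"A": 1.8, "B": 1.3}     # hypothetical goals per match

def win_probability(runs=100_000):
    wins = {"A": 0, "B": 0}
    for _ in range(runs):
        ga = rng.poisson(intensities["A"])
        gb = rng.poisson(intensities["B"])
        if ga == gb:                    # tie: decided by a coin flip here
            wins["A" if rng.random() < 0.5 else "B"] += 1
        else:
            wins["A" if ga > gb else "B"] += 1
    return {team: n / runs for team, n in wins.items()}

print(win_probability())
```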

  10. Modelling and Simulation of RF Multilayer Inductors in LTCC Technology

    Directory of Open Access Journals (Sweden)

    A. Čelić

    2009-11-01

    Full Text Available This paper is aimed at presenting the models and characteristics of two types of inductors designed in LTCC (Low Temperature Cofired Ceramic) technology. We present the physical model of a 3D planar solenoid-type inductor and of a serial planar solenoid-type inductor for the RF (radio frequency) range. To verify the results obtained by using these models, we have compared them with the results obtained by employing the Ansoft HFSS electromagnetic simulator. Very good agreement has been recorded for the effective inductance value, whereas the effective Q factor value has shown a somewhat larger deviation than the inductance.

  11. Modeling churn using customer lifetime value

    OpenAIRE

    Glady, Nicolas; Baesens, Bart; Croux, Christophe

    2009-01-01

    The definition and modeling of customer loyalty have been central issues in customer relationship management for many years. Recent papers propose solutions to detect customers that are becoming less loyal, also called churners. The churner status is then defined as a function of the volume of commercial transactions. In the context of a Belgian retail financial service company, our first contribution is to redefine the notion of customer loyalty by considering it from a customer-centric vi...

  12. Communicating Value in Simulation: Cost Benefit Analysis and Return on Investment.

    Science.gov (United States)

    Asche, Carl V; Kim, Minchul; Brown, Alisha; Golden, Antoinette; Laack, Torrey A; Rosario, Javier; Strother, Christopher; Totten, Vicken Y; Okuda, Yasuharu

    2017-10-26

    Value-based health care requires a balancing of medical outcomes with economic value. Administrators need to understand both the clinical and economic effects of potentially expensive simulation programs to rationalize the costs. Given the often-disparate priorities of clinical educators relative to health care administrators, justifying the value of simulation requires the use of economic analyses few physicians have been trained to conduct. Clinical educators need to be able to present thorough economic analyses demonstrating returns on investment and cost effectiveness to effectively communicate with administrators. At the 2017 Academic Emergency Medicine Consensus Conference "Catalyzing System Change through Health Care Simulation: Systems, Competency, and Outcomes", our breakout session critically evaluated the cost benefit and return on investment of simulation. In this paper we provide an overview of some of the economic tools that a clinician may use to present the value of simulation training to financial officers and other administrators in the economic terms they understand. We also define three themes as a call to action for research related to cost benefit analysis in simulation as well as four specific research questions that will help guide educators and hospital leadership to make decisions on the value of simulation for their system or program.

  13. Communicating Value in Simulation: Cost-Benefit Analysis and Return on Investment.

    Science.gov (United States)

    Asche, Carl V; Kim, Minchul; Brown, Alisha; Golden, Antoinette; Laack, Torrey A; Rosario, Javier; Strother, Christopher; Totten, Vicken Y; Okuda, Yasuharu

    2018-02-01

    Value-based health care requires a balancing of medical outcomes with economic value. Administrators need to understand both the clinical and the economic effects of potentially expensive simulation programs to rationalize the costs. Given the often-disparate priorities of clinical educators relative to health care administrators, justifying the value of simulation requires the use of economic analyses few physicians have been trained to conduct. Clinical educators need to be able to present thorough economic analyses demonstrating returns on investment and cost-effectiveness to effectively communicate with administrators. At the 2017 Academic Emergency Medicine Consensus Conference "Catalyzing System Change through Health Care Simulation: Systems, Competency, and Outcomes," our breakout session critically evaluated the cost-benefit and return on investment of simulation. In this paper we provide an overview of some of the economic tools that a clinician may use to present the value of simulation training to financial officers and other administrators in the economic terms they understand. We also define three themes as a call to action for research related to cost-benefit analysis in simulation as well as four specific research questions that will help guide educators and hospital leadership to make decisions on the value of simulation for their system or program.

  14. Characteristics of hands-on simulations with added value for innovative secondary and higher vocational education

    NARCIS (Netherlands)

    Khaled, A.E.; Gulikers, J.T.M.; Biemans, H.J.A.; Wel, van der M.; Mulder, M.

    2014-01-01

    The intentions with which hands-on simulations are used in vocational education are not always clear. Also, pedagogical-didactic approaches in hands-on simulations are not well conceptualised from a learning theory perspective. This makes it difficult to pinpoint the added value that hands-on

  15. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    Modelling Deterministic Systems. N K Srinivasan graduated from Indian Institute of Science and obtained his Doctorate from Columbia University, New York. He has taught in several universities, and later did system analysis, wargaming and simulation for defence. His other areas of interest are reliability engineering...

  16. Modelling and simulating fire tube boiler performance

    DEFF Research Database (Denmark)

    Sørensen, K.; Condra, T.; Houbak, Niels

    2003-01-01

    A model for a flue gas boiler covering the flue gas and the water/steam side has been formulated. The model has been formulated as a number of sub-models that are merged into an overall model for the complete boiler. Sub-models have been defined for the furnace, the convection zone (split in two: a zone submerged in water and a zone covered by steam), a model for the material in the boiler (the steel) and two models for, respectively, the water/steam zone (the boiling) and the steam. The dynamic model has been developed as a number of Differential-Algebraic-Equation systems (DAE). Subsequently MatLab/Simulink has been applied for carrying out the simulations. To be able to verify the simulated results, experiments have been carried out on a full-scale boiler plant.

  17. Model calibration for building energy efficiency simulation

    International Nuclear Information System (INIS)

    Mustafaraj, Giorgio; Marini, Dashamir; Costa, Andrea; Keane, Marcus

    2014-01-01

    Highlights: • Developing a 3D model relating to building architecture, occupancy and HVAC operation. • Two calibration stages developed, with the final model providing accurate results. • Using an onsite weather station for generating the weather data file in EnergyPlus. • Predicting the thermal behaviour of underfloor heating, heat pump and natural ventilation. • Monthly energy saving opportunities of 20–27% related to the heat pump were identified. - Abstract: This research work deals with an Environmental Research Institute (ERI) building where an underfloor heating system and natural ventilation are the main systems used to maintain comfort conditions throughout 80% of the building area. Firstly, this work involved developing a 3D model relating to building architecture, occupancy and HVAC operation. Secondly, a two-level calibration methodology was applied in order to ensure accuracy and reduce the likelihood of errors. To further improve the accuracy of calibration, a historical weather data file for the year 2011 was created from the on-site local weather station of the ERI building. After applying the second level of the calibration process, the values of Mean Bias Error (MBE) and Coefficient of Variation of the Root Mean Squared Error (CV(RMSE)) in the hourly analysis of heat pump electricity consumption varied within the following ranges: MBE (hourly) from −5.6% to 7.5% and CV(RMSE) (hourly) from 7.3% to 25.1%. Finally, the building was simulated with EnergyPlus to identify further possibilities of energy savings supplied by a water-to-water heat pump to the underfloor heating system. It was found that electricity consumption savings from the heat pump can vary between 20% and 27% on a monthly basis.
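    For reference, the calibration metrics used above are conventionally defined as follows (standard ASHRAE Guideline 14-style definitions with m_i measured and s_i simulated values; this is our gloss, not quoted from the paper):

```latex
% MBE and CV(RMSE) over n hourly observations, with \bar{m} the mean
% of the measured values:
\mathrm{MBE} = \frac{\sum_{i=1}^{n}(m_i - s_i)}{\sum_{i=1}^{n} m_i}\times 100\%
\qquad
\mathrm{CV(RMSE)} = \frac{\sqrt{\tfrac{1}{n}\sum_{i=1}^{n}(m_i - s_i)^2}}{\bar{m}}\times 100\%
```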

  18. Model Driven Development of Simulation Models : Defining and Transforming Conceptual Models into Simulation Models by Using Metamodels and Model Transformations

    NARCIS (Netherlands)

    Küçükkeçeci Çetinkaya, D.

    2013-01-01

    Modeling and simulation (M&S) is an effective method for analyzing and designing systems and it is of interest to scientists and engineers from all disciplines. This thesis proposes the application of a model driven software development approach throughout the whole set of M&S activities and it

  19. Simulation and modeling of turbulent flows

    CERN Document Server

    Gatski, Thomas B; Lumley, John L

    1996-01-01

    This book provides students and researchers in fluid engineering with an up-to-date overview of turbulent flow research in the areas of simulation and modeling. A key element of the book is the systematic, rational development of turbulence closure models and related aspects of modern turbulent flow theory and prediction. Starting with a review of the spectral dynamics of homogenous and inhomogeneous turbulent flows, succeeding chapters deal with numerical simulation techniques, renormalization group methods and turbulent closure modeling. Each chapter is authored by recognized leaders in their respective fields, and each provides a thorough and cohesive treatment of the subject.

  20. Dynamic modeling and simulation of wind turbines

    International Nuclear Information System (INIS)

    Ghafari Seadat, M.H.; Kheradmand Keysami, M.; Lari, H.R.

    2002-01-01

    Using wind energy to generate electricity in wind turbines is a good way of using renewable energies, and it can also help to protect the environment. The main objective of this paper is dynamic modeling by the energy method and computer-aided simulation of a wind turbine. In this paper, the equations of motion are derived for simulating the wind turbine system, and the behavior of the system is then revealed by solving the equations. For the simulation, the turbine is considered as a three-bladed upwind rotor with an induction generator connected to the network, operating at constant speed. Every part of the wind turbine should be simulated for the simulation of the whole turbine; the main parts are the blades, gearbox, shafts and generator

  1. Hybrid simulation models of production networks

    CERN Document Server

    Kouikoglou, Vassilis S

    2001-01-01

    This book is concerned with a most important area of industrial production, that of analysis and optimization of production lines and networks using discrete-event models and simulation. The book introduces a novel approach that combines analytic models and discrete-event simulation. Unlike conventional piece-by-piece simulation, this method observes a reduced number of events between which the evolution of the system is tracked analytically. Using this hybrid approach, several models are developed for the analysis of production lines and networks. The hybrid approach combines speed and accuracy for exceptional analysis of most practical situations. A number of optimization problems, involving buffer design, workforce planning, and production control, are solved through the use of hybrid models.

  2. The behaviour of adaptive bone remodeling simulation models

    NARCIS (Netherlands)

    Weinans, H.; Huiskes, R.; Grootenboer, H.J.

    1992-01-01

    The process of adaptive bone remodeling can be described mathematically and simulated in a computer model, integrated with the finite element method. In the model discussed here, cortical and trabecular bone are described as continuous materials with variable density. The remodeling rule applied to

  3. Analytical system dynamics modeling and simulation

    CERN Document Server

    Fabien, Brian C

    2008-01-01

    This book offering a modeling technique based on Lagrange's energy method includes 125 worked examples. Using this technique enables one to model and simulate systems as diverse as a six-link, closed-loop mechanism or a transistor power amplifier.
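    For context, Lagrange's energy method rests on the Euler-Lagrange equations (standard form, not quoted from the book):

```latex
% L = T - V is the Lagrangian, q_k the generalized coordinates and
% Q_k the generalized (non-conservative) forces:
\frac{d}{dt}\left(\frac{\partial L}{\partial \dot{q}_k}\right)
 - \frac{\partial L}{\partial q_k} = Q_k
```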

  4. Equivalent drawbead model in finite element simulations

    NARCIS (Netherlands)

    Carleer, Bart D.; Carleer, B.D.; Meinders, Vincent T.; Huetink, Han; Lee, J.K.; Kinzel, G.L.; Wagoner, R.

    1996-01-01

    In 3D simulations of the deep drawing process the drawbead geometries are seldom included. Therefore equivalent drawbeads are used. In order to investigate the drawbead behaviour a 2D plane strain finite element model was used. For verification of this model experiments were performed. The analyses

  5. A simulation model for football championships

    NARCIS (Netherlands)

    Koning, RH; Koolhaas, M; Renes, G; Ridder, G

    2003-01-01

    In this paper we discuss a simulation/probability model that identifies the team that is most likely to win a tournament. The model can also be used to answer other questions like 'which team had a lucky draw?' or 'what is the probability that two teams meet at some moment in the tournament?' Input

  6. A simulation model for football championships

    NARCIS (Netherlands)

    Koning, Ruud H.; Koolhaas, Michael; Renes, Gusta

    2001-01-01

    In this paper we discuss a simulation/probability model that identifies the team that is most likely to win a tournament. The model can also be used to answer other questions like ‘which team had a lucky draw?’ or ‘what is the probability that two teams meet at some moment in the tournament?’. Input

  7. Regularization modeling for large-eddy simulation

    NARCIS (Netherlands)

    Geurts, Bernardus J.; Holm, D.D.

    2003-01-01

    A new modeling approach for large-eddy simulation (LES) is obtained by combining a "regularization principle" with an explicit filter and its inversion. This regularization approach allows a systematic derivation of the implied subgrid model, which resolves the closure problem. The central role of

  8. Validity of microgravity simulation models on earth

    DEFF Research Database (Denmark)

    Regnard, J; Heer, M; Drummer, C

    2001-01-01

    Many studies have used water immersion and head-down bed rest as experimental models to simulate responses to microgravity. However, some data collected during space missions are at variance or in contrast with observations collected from experimental models. These discrepancies could reflect inc...

  9. Diversity modelling for electrical power system simulation

    International Nuclear Information System (INIS)

    Sharip, R M; Abu Zarim, M A U A

    2013-01-01

    This paper considers the diversity of generation and demand profiles against different future energy scenarios and evaluates these on a technical basis. In contrast to previous studies, this research applied a forecasting concept based on possible growth rates from publicly available electrical distribution scenarios concerning the UK. These scenarios were created by different bodies considering aspects such as environment, policy, regulation, economics and technology. In line with these scenarios, forecasting is on a long-term timescale (every ten years from 2020 until 2050) in order to create possible outputs of generation mix and demand profiles to be used as appropriate boundary conditions for the network simulation. The network considered is a segment of rural LV network populated with a mixture of different housing types. The profiles for the 'future' energy and demand have been successfully modelled by applying a forecasting method. The network results under these profiles show, for the cases studied, that even though the value of the power produced by each micro-generation unit is often in line with the demand requirements of an individual dwelling, there will be no problems arising from high penetration of micro-generation and demand-side management for the dwellings considered. The results obtained highlight the technical issues and changes for energy delivery and management to rural customers under the future energy scenarios.

  10. Diversity modelling for electrical power system simulation

    Science.gov (United States)

    Sharip, R. M.; Abu Zarim, M. A. U. A.

    2013-12-01

    This paper considers the diversity of generation and demand profiles against different future energy scenarios and evaluates these on a technical basis. In contrast to previous studies, this research applied a forecasting concept based on possible growth rates from publicly available electrical distribution scenarios concerning the UK. These scenarios were created by different bodies considering aspects such as environment, policy, regulation, economics and technology. In line with these scenarios, forecasting is on a long-term timescale (every ten years from 2020 until 2050) in order to create possible outputs of generation mix and demand profiles to be used as appropriate boundary conditions for the network simulation. The network considered is a segment of rural LV network populated with a mixture of different housing types. The profiles for the 'future' energy and demand have been successfully modelled by applying a forecasting method. The network results under these profiles show, for the cases studied, that even though the value of the power produced by each micro-generation unit is often in line with the demand requirements of an individual dwelling, there will be no problems arising from high penetration of micro-generation and demand-side management for the dwellings considered. The results obtained highlight the technical issues and changes for energy delivery and management to rural customers under the future energy scenarios.

  11. Landscape Modelling and Simulation Using Spatial Data

    Directory of Open Access Journals (Sweden)

    Amjed Naser Mohsin AL-Hameedawi

    2017-08-01

    Full Text Available In this paper a procedure is presented for generating a spatial model of a landscape adapted to realistic simulation. This procedure is based on combining spatial data and field measurements with computer graphics produced using the Blender software. From these it is possible to form a 3D simulation based on VIS ALL packages. The objective was to make a model utilising GIS, including inputs to the feature attribute data. These efforts concentrated on assembling an adequate spatial prototype, setting out a facilitation scheme and outlining the intended framework. The eventual result was utilized in simulation form. The performed procedure contains not only data gathering, fieldwork and paradigm provision, but extends to supplying a new method for producing the respective 3D simulation mapping, which enables decision makers as well as investors to adopt an independent navigation system for geoscience applications.

  12. Algorithms of CT value correction for reconstructing a radiotherapy simulation image through axial CT images

    International Nuclear Information System (INIS)

    Ogino, Takashi; Egawa, Sunao

    1991-01-01

    New algorithms of CT value correction for reconstructing a radiotherapy simulation image from axial CT images were developed. One, designated the plane weighting method, corrects the CT value in proportion to the position of the beam element passing through the voxel. The other, designated the solid weighting method, corrects the CT value in proportion to the length of the beam element passing through the voxel and the volume of the voxel. Phantom experiments showed fair spatial resolution in the transverse direction. In the longitudinal direction, however, spatial resolution finer than the slice thickness could not be obtained. Contrast resolution was equivalent for both methods. In patient studies, the reconstructed radiotherapy simulation image was almost similar in visual perception of density resolution to a simulation film taken with an X-ray simulator. (author)
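    A loose sketch of the length-and-volume weighting idea as we read the "solid weighting" description (the function name and geometry are simplified inventions, not the authors' algorithm):

```python
# Corrected CT value for one beam element: each traversed voxel
# contributes in proportion to the path length through it and its volume.
import numpy as np

def weighted_ct(ct_values, lengths, volumes):
    w = np.asarray(lengths) * np.asarray(volumes)
    return float(np.sum(w * np.asarray(ct_values)) / np.sum(w))

# Three voxels along a ray: CT numbers, intersection lengths (mm), volumes (mm^3)
print(weighted_ct([40.0, 55.0, 30.0], [1.2, 3.0, 0.8], [8.0, 8.0, 8.0]))
```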

  13. Benchmark simulation models, quo vadis?

    DEFF Research Database (Denmark)

    Jeppsson, U.; Alex, J; Batstone, D. J.

    2013-01-01

    As the work of the IWA Task Group on Benchmarking of Control Strategies for wastewater treatment plants (WWTPs) is coming to an end, it is essential to disseminate the knowledge gained. For this reason, all authors of the IWA Scientific and Technical Report on benchmarking have come together to provide their insights, highlighting areas where knowledge may still be deficient and where new opportunities are emerging, and to propose potential avenues for future development and application of the general benchmarking framework and its associated tools. The paper focuses on the topics of temporal and spatial extension, process modifications within the WWTP, the realism of models, control strategy extensions and the potential for new evaluation tools within the existing benchmark system. We find that there are major opportunities for application within all of these areas, either from existing work...

  14. A queuing model for road traffic simulation

    International Nuclear Information System (INIS)

    Guerrouahane, N.; Aissani, D.; Bouallouche-Medjkoune, L.; Farhi, N.

    2015-01-01

    We present in this article a stochastic queuing model for road traffic. The model is based on the M/G/c/c state-dependent queuing model and is inspired by the deterministic Godunov scheme for road traffic simulation. We first propose a variant of the M/G/c/c state-dependent model that works with density-flow fundamental diagrams rather than density-speed relationships. We then extend this model in order to consider upstream traffic demand as well as downstream traffic supply. Finally, we show how to model a whole road by concatenating road sections as in the deterministic Godunov scheme.
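
    A minimal sketch of the state-dependent queuing idea: one road section as an M/M/c/c-style queue in which the per-vehicle departure rate falls with occupancy, mimicking a speed-density relationship. Exponential service times, the linear speed-density law and all parameter values are simplifying assumptions for illustration, not the article's M/G/c/c formulation.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def simulate_road_section(lam=1.0, c=50, v_free=1.0, t_end=10_000.0):
        """One road section as a state-dependent M/M/c/c-style queue.
        Arrivals are Poisson(lam); each of the n vehicles departs at a rate
        proportional to an occupancy-dependent speed. Returns the fraction
        of arrivals that find the section full (blocked)."""
        t, n, arrivals, blocked = 0.0, 0, 0, 0
        while t < t_end:
            v = v_free * (1.0 - n / (c + 1))   # assumed linear speed-density law
            rate_dep = n * v
            total = lam + rate_dep
            t += rng.exponential(1.0 / total)
            if rng.random() < lam / total:     # next event is an arrival
                arrivals += 1
                if n >= c:
                    blocked += 1               # held back by upstream supply
                else:
                    n += 1
            else:                              # next event is a departure
                n -= 1
        return blocked / max(arrivals, 1)

    print(simulate_road_section())
    ```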

  15. Quantitative interface models for simulating microstructure evolution

    International Nuclear Information System (INIS)

    Zhu, J.Z.; Wang, T.; Zhou, S.H.; Liu, Z.K.; Chen, L.Q.

    2004-01-01

    To quantitatively simulate microstructural evolution in real systems, we investigated three different interface models: a sharp-interface model implemented by the software DICTRA and two diffuse-interface models which use either physical order parameters or artificial order parameters. A particular example is considered, the diffusion-controlled growth of a γ′ precipitate in a supersaturated γ matrix in Ni-Al binary alloys. All three models use the thermodynamic and kinetic parameters from the same databases. The temporal evolution profiles of composition from the different models are shown to agree with each other. The focus is on examining the advantages and disadvantages of each model as applied to microstructure evolution in alloys

  16. Potts-model grain growth simulations: Parallel algorithms and applications

    Energy Technology Data Exchange (ETDEWEB)

    Wright, S.A.; Plimpton, S.J.; Swiler, T.P. [and others

    1997-08-01

    Microstructural morphology and grain boundary properties often control the service properties of engineered materials. This report uses the Potts-model to simulate the development of microstructures in realistic materials. Three areas of microstructural morphology simulation were studied. They include the development of massively parallel algorithms for Potts-model grain growth simulations, modeling of mass transport via diffusion in these simulated microstructures, and the development of a gradient-dependent Hamiltonian to simulate columnar grain growth. Potts grain growth models for massively parallel supercomputers were developed for the conventional Potts-model in both two and three dimensions. Simulations using these parallel codes showed self-similar grain growth and no finite size effects for previously unapproachable large-scale problems. In addition, new enhancements to the conventional Metropolis algorithm used in the Potts-model were developed to accelerate the calculations. These techniques enable both the sequential and parallel algorithms to run faster and use an essentially infinite number of grain orientation values to avoid non-physical grain coalescence events. Mass transport phenomena in polycrystalline materials were studied in two dimensions using numerical diffusion techniques on microstructures generated using the Potts-model. The results of the mass transport modeling showed excellent quantitative agreement with one-dimensional diffusion problems; however, the results also suggest that transient multi-dimensional diffusion effects cannot be parameterized as the product of the grain boundary diffusion coefficient and the grain boundary width. Instead, both properties are required. Gradient-dependent grain growth mechanisms were included in the Potts-model by adding an extra term to the Hamiltonian. Under normal grain growth, the primary driving term is the curvature of the grain boundary, which is included in the standard Potts-model Hamiltonian.
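
    For readers unfamiliar with the method, a minimal two-dimensional Potts-model grain-growth kernel is sketched below: the standard Hamiltonian counts unlike nearest-neighbour pairs, and Metropolis flips are accepted when they do not raise the energy. This is the conventional sequential algorithm only; the parallel decomposition, diffusion coupling and gradient term described in the report are not reproduced.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def potts_grain_growth(L=64, q=48, steps=100_000, kT=0.0):
        """Minimal 2-D Potts grain-growth kernel: H counts unlike
        nearest-neighbour pairs; Metropolis flips are accepted when they
        do not raise the energy (kT = 0), or with Boltzmann probability
        otherwise."""
        s = rng.integers(q, size=(L, L))
        for _ in range(steps):
            i, j = rng.integers(L, size=2)
            new = int(rng.integers(q))
            nbrs = (s[(i + 1) % L, j], s[(i - 1) % L, j],
                    s[i, (j + 1) % L], s[i, (j - 1) % L])
            # Energy change of flipping site (i, j) to orientation `new`.
            dE = sum((new != n) - (s[i, j] != n) for n in nbrs)
            if dE <= 0 or (kT > 0 and rng.random() < np.exp(-dE / kT)):
                s[i, j] = new
        return s

    grains = potts_grain_growth()
    ```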

  17. Analyzing Strategic Business Rules through Simulation Modeling

    Science.gov (United States)

    Orta, Elena; Ruiz, Mercedes; Toro, Miguel

    Service Oriented Architecture (SOA) holds promise for business agility since it allows business processes to change to meet new customer demands or market needs without causing a cascade effect of changes in the underlying IT systems. Business rules are the instrument chosen to help business and IT collaborate. In this paper, we propose the use of simulation models to model and simulate strategic business rules that are then disaggregated at different levels of an SOA architecture. Our proposal aims to help find a good configuration for strategic business objectives and IT parameters. The paper includes a case study where a simulation model is built to support business decision-making in a context where finding a good configuration for different business parameters and performance is too complex to analyze by trial and error.

  18. Measuring Teacher Quality with Value-Added Modeling

    Science.gov (United States)

    Marder, Michael

    2012-01-01

    Using computers to evaluate teachers based on student test scores is more difficult than it seems. Value-added modeling is a genuinely serious attempt to grapple with the difficulties. Value-added modeling carries the promise of measuring teacher quality automatically and objectively, and improving school systems at minimal cost. The essence of…

  19. The value of load shifting. An estimate for Norway using the EMPS model

    International Nuclear Information System (INIS)

    Doorman, Gerard; Wolfgang, Ove

    2006-05-01

    An attempt is made to estimate the value of Load Shifting (LS) in the Norwegian system, using the EMPS model. A thorough update of the demand-side model and the cost estimates used in the model was done in preparation for the project, and the report gives a comprehensive description of the demand models used. The LS measure analyzed is moving 600 MW of demand in Norway from peak to lower-demand hours during the day. Its value was estimated both in a simplified manner (based on simulated price differences between these periods) and by simulations with the EMPS model followed by a calculation of the socio-economic surplus. Neither approach showed any significant value. The results do not necessarily mean that the value in reality is zero - there are a number of limitations in the model which make it difficult to estimate the real value, such as the representation of wind generation, demand variability, outages, exchange prices with continental Europe, flexibility of hydro and thermal generation, reserves and elasticity of demand in the short run. It was verified through sensitivity calculations that increasing reserve requirements and increasing the variability of wind generation, in particular, increased price differences and therefore the value of LS. A number of improvements in the EMPS model and data are proposed to obtain a simulation model better suited to this kind of analysis: 1) modeling of reserves, 2) representation of wind variability, 3) thermal generation models, 4) differentiation between long- and short-term price elasticity, 5) review of interconnection capacities, 6) use of quadratic losses and 7) representation of more stochastic factors, e.g. outages, in the simulations. Although the model at present clearly has its limitations with respect to estimating the value of LS, it appears that price differences between spot prices in the actual hours are in reality small. Comparison with Nord Pool spot prices for the years 2003
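
    The report's simplified valuation amounts to a one-line calculation: energy shifted times the simulated peak/off-peak price spread. The toy figures below are purely illustrative (they are not the report's prices or results) and show why a small spread yields an insignificant value.

    ```python
    # Hypothetical figures, not taken from the EMPS study.
    mw_shifted = 600.0          # MW moved from peak to off-peak hours
    hours_per_year = 2 * 365    # e.g. two shifted peak hours per day
    price_spread = 1.5          # EUR/MWh average peak/off-peak difference

    # Simplified annual value: energy shifted times the price spread.
    value_meur = mw_shifted * hours_per_year * price_spread / 1e6
    print(f"approx. {value_meur:.2f} MEUR/year")   # ~0.66 MEUR: negligible
    ```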

  20. Nuclear reactor core modelling in multifunctional simulators

    Energy Technology Data Exchange (ETDEWEB)

    Puska, E.K. [VTT Energy, Nuclear Energy, Espoo (Finland)

    1999-06-01

    The thesis concentrates on the development of nuclear reactor core models for the APROS multifunctional simulation environment and the use of the core models in various kinds of applications. The work was started in 1986 as a part of the development of the entire APROS simulation system. The aim was to create core models that would serve in a reliable manner in an interactive, modular and multifunctional simulator/plant analyser environment. One-dimensional and three-dimensional core neutronics models have been developed. Both models have two energy groups and six delayed neutron groups. The three-dimensional finite difference type core model is able to describe both BWR- and PWR-type cores with quadratic fuel assemblies and VVER-type cores with hexagonal fuel assemblies. The one- and three-dimensional core neutronics models can be connected with the homogeneous, the five-equation or the six-equation thermal hydraulic models of APROS. The key feature of APROS is that the same physical models can be used in various applications. The nuclear reactor core models of APROS have been built in such a manner that the same models can be used in simulator and plant analyser applications, as well as in safety analysis. In the APROS environment the user can select the number of flow channels in the three-dimensional reactor core and either the homogeneous, the five- or the six-equation thermal hydraulic model for these channels. The thermal hydraulic model and the number of flow channels have a decisive effect on the calculation time of the three-dimensional core model and thus, at present, these particular selections make the major difference between a safety analysis core model and a training simulator core model. The emphasis of this thesis is on the three-dimensional core model and its capability to analyse symmetric and asymmetric events in the core. The factors affecting the calculation times of various three-dimensional BWR, PWR and WWER-type APROS core models have been

  1. Mean Value Engine Modelling of an SI Engine with EGR

    DEFF Research Database (Denmark)

    Føns, Michael; Müller, Martin; Chevalier, Alain

    1999-01-01

    Mean Value Engine Models (MVEMs) are simplified, dynamic engine models that are physically based. Such models are useful for control studies, for engine control system analysis and for model-based engine control systems. Very few published MVEMs have included the effects of Exhaust Gas Recirculat...
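
    A typical MVEM state equation is the isothermal intake-manifold filling dynamic; the sketch below integrates it with placeholder flow laws. The throttle and port mass-flow expressions and all constants are invented for illustration and are not the models from this paper (which additionally treats EGR).

    ```python
    import numpy as np

    def manifold_pressure(p0=5.0e4, dt=1e-3, t_end=2.0,
                          V=3e-3, R=287.0, T=300.0):
        """Isothermal intake-manifold filling dynamics, a core MVEM state
        equation: dp/dt = (R*T/V) * (mdot_throttle - mdot_port).
        Both mass-flow laws below are invented placeholders."""
        p = p0
        trace = [p]
        for _ in range(int(t_end / dt)):
            mdot_in = 0.01 * max(1.0 - p / 1.0e5, 0.0)  # throttle flow (assumed)
            mdot_out = 5e-8 * p                         # speed-density port flow (assumed)
            p += dt * (R * T / V) * (mdot_in - mdot_out)
            trace.append(p)
        return np.array(trace)

    p = manifold_pressure()
    print(p[-1])   # settles near the equilibrium of the two flow laws
    ```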

  2. Multiple Regression and Mediator Variables can be used to Avoid Double Counting when Economic Values are Derived using Stochastic Herd Simulation

    DEFF Research Database (Denmark)

    Østergaard, Søren; Ettema, Jehan Frans; Hjortø, Line

    Multiple regression and model building with mediator variables was addressed to avoid double counting when economic values are estimated from data simulated with herd simulation modeling (using the SimHerd model). The simulated incidence of metritis was analyzed statistically as the independent...... variable, while using the traits representing the direct effects of metritis on yield, fertility and occurrence of other diseases as mediator variables. The economic value of metritis was estimated to be €78 per 100 cow-years for each 1% increase of metritis in the period of 1-100 days in milk...

  3. Kanban simulation model for production process optimization

    Directory of Open Access Journals (Sweden)

    Golchev Riste

    2015-01-01

    Full Text Available A long time has passed since the KANBAN system was established as an efficient method for coping with excessive inventory. Still, the possibilities for its improvement through integration with other approaches should be investigated further. The basic research challenge of this paper is to present the benefits of KANBAN implementation supported with Discrete Event Simulation (DES). To that end, the basics of the KANBAN system are first presented, with emphasis on the information and material flow, together with a methodology for implementation of the KANBAN system. An analysis of combining simulation with this methodology is then presented. The paper concludes with a practical example which shows that, through understanding the philosophy of the KANBAN implementation methodology and the simulation methodology, a simulation model can be created which can serve as a basis for a variety of experiments that can be conducted within a short period of time, resulting in production process optimization.

  4. Vermont Yankee simulator BOP model upgrade

    International Nuclear Information System (INIS)

    Alejandro, R.; Udbinac, M.J.

    2006-01-01

    The Vermont Yankee simulator has undergone significant changes in the 20 years since the original order was placed. After the move from the original Unix to MS Windows environment, and upgrade to the latest version of SimPort, now called MASTER, the platform was set for an overhaul and replacement of major plant system models. Over a period of a few months, the VY simulator team, in partnership with WSC engineers, replaced outdated legacy models of the main steam, condenser, condensate, circulating water, feedwater and feedwater heaters, and main turbine and auxiliaries. The timing was ideal, as the plant was undergoing a power up-rate, so the opportunity was taken to replace the legacy models with industry-leading, true on-line object oriented graphical models. Due to the efficiency of design and ease of use of the MASTER tools, VY staff performed the majority of the modeling work themselves with great success, with only occasional assistance from WSC, in a relatively short time-period, despite having to maintain all of their 'regular' simulator maintenance responsibilities. This paper will provide a more detailed view of the VY simulator, including how it is used and how it has benefited from the enhancements and upgrades implemented during the project. (author)

  5. Multi-Valued Modal Fixed Point Logics for Model Checking

    Science.gov (United States)

    Nishizawa, Koki

    In this paper, I will show how multi-valued logics are used for model checking. Model checking is an automatic technique to analyze correctness of hardware and software systems. A model checker is based on a temporal logic or a modal fixed point logic. That is to say, a system to be checked is formalized as a Kripke model, a property to be satisfied by the system is formalized as a temporal formula or a modal formula, and the model checker checks that the Kripke model satisfies the formula. Although most existing model checkers are based on 2-valued logics, recently new attempts have been made to extend the underlying logics of model checkers to multi-valued logics. I will summarize these new results.

  6. TRANSFORM - TRANsient Simulation Framework of Reconfigurable Models

    Energy Technology Data Exchange (ETDEWEB)

    2017-09-01

    Existing development tools for early stage design and scoping of energy systems are often time consuming to use, proprietary, and do not contain the necessary functionality to model complete systems (i.e., controls, primary, and secondary systems) in a common platform. The Modelica-based TRANSFORM tool (1) provides a standardized, common simulation environment for early design of energy systems (i.e., power plants), (2) provides a library of baseline component modules to be assembled into full plant models using available geometry, design, and thermal-hydraulic data, (3) defines modeling conventions for interconnecting component models, and (4) establishes user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.

  7. Biological transportation networks: Modeling and simulation

    KAUST Repository

    Albi, Giacomo

    2015-09-15

    We present a model for biological network formation originally introduced by Cai and Hu [Adaptation and optimization of biological transport networks, Phys. Rev. Lett. 111 (2013) 138701]. The modeling of fluid transportation (e.g., leaf venation and angiogenesis) and ion transportation networks (e.g., neural networks) is explained in detail and basic analytical features like the gradient flow structure of the fluid transportation network model and the impact of the model parameters on the geometry and topology of network formation are analyzed. We also present a numerical finite-element based discretization scheme and discuss sample cases of network formation simulations.

  8. A universal simulator for ecological models

    DEFF Research Database (Denmark)

    Holst, Niels

    2013-01-01

    Software design is an often neglected issue in ecological models, even though bad software design often becomes a hindrance for re-using, sharing and even grasping an ecological model. In this paper, the methodology of agile software design was applied to the domain of ecological models. Thus...... the principles for a universal design of ecological models were arrived at. To exemplify this design, the open-source software Universal Simulator was constructed using C++ and XML and is provided as a resource for inspiration....

  9. Object Oriented Modelling and Dynamical Simulation

    DEFF Research Database (Denmark)

    Wagner, Falko Jens; Poulsen, Mikael Zebbelin

    1998-01-01

    This report with appendix describes the work done in a master's project at DTU. The goal of the project was to develop a concept for simulation of dynamical systems based on object-oriented methods. The result was a library of C++ classes, for use both when building component-based models and when...

  10. preliminary multidomain modelling and simulation study

    African Journals Online (AJOL)

    I. Iliyasu, I. Iliyasu, I. K. Tanimu and D. O. Obada

    Preliminary multidomain modelling and simulation study of a horizontal axis wind turbine (HAWT) tower vibration. Department of Mechanical Engineering, Ahmadu Bello University, Zaria, Kaduna State, Nigeria.

  11. Reproducibility in Computational Neuroscience Models and Simulations

    Science.gov (United States)

    McDougal, Robert A.; Bulanova, Anna S.; Lytton, William W.

    2016-01-01

    Objective Like all scientific research, computational neuroscience research must be reproducible. Big data science, including simulation research, cannot depend exclusively on journal articles as the method to provide the sharing and transparency required for reproducibility. Methods Ensuring model reproducibility requires the use of multiple standard software practices and tools, including version control, strong commenting and documentation, and code modularity. Results Building on these standard practices, model sharing sites and tools have been developed that fit into several categories: 1. standardized neural simulators, 2. shared computational resources, 3. declarative model descriptors, ontologies and standardized annotations; 4. model sharing repositories and sharing standards. Conclusion A number of complementary innovations have been proposed to enhance sharing, transparency and reproducibility. The individual user can be encouraged to make use of version control, commenting, documentation and modularity in development of models. The community can help by requiring model sharing as a condition of publication and funding. Significance Model management will become increasingly important as multiscale models become larger, more detailed and correspondingly more difficult to manage by any single investigator or single laboratory. Additional big data management complexity will come as the models become more useful in interpreting experiments, thus increasing the need to ensure clear alignment between modeling data, both parameters and results, and experiment. PMID:27046845

  12. Thermohydraulic modeling and simulation of breeder reactors

    International Nuclear Information System (INIS)

    Agrawal, A.K.; Khatib-Rahbar, M.; Curtis, R.T.; Hetrick, D.L.; Girijashankar, P.V.

    1982-01-01

    This paper deals with the modeling and simulation of system-wide transients in LMFBRs. Unprotected events (i.e., the presumption of failure of the plant protection system) leading to core-melt are not considered in this paper. The existing computational capabilities in the area of protected transients in the US are noted. Various physical and numerical approximations that are made in these codes are discussed. Finally, the future direction in the area of model verification and improvements is discussed

  13. Modeling, simulation and performance evaluation of parabolic ...

    African Journals Online (AJOL)

    Model of a parabolic trough power plant, taking into consideration the different losses associated with collection of the solar irradiance and thermal losses is presented. MATLAB software is employed to model the power plant at reference state points. The code is then used to find the different reference values which are ...

  14. Self-Service Banking: Value Creation Models and Information Exchange

    Directory of Open Access Journals (Sweden)

    Ragnvald Sannes

    2001-01-01

    Full Text Available This paper argues that most banks have failed to exploit the potential of self-service banking because they base their service design on an incomplete business model for self-service. A framework for evaluation of self-service banking concepts is developed on the basis of Stabell and Fjeldstad's three value configurations. The value network and the value shop are consistent with self-service banking while the value chain is inappropriate. The impact of the value configurations on information exchange and self-service functionality is discussed, and a framework for design of such services proposed. Current self-service banking practices are compared to the framework, and it is concluded that current practice matches the concept of a value network and not the value shop. However, current practices are only a partial implementation of a value network-based self-service banking concept.

  15. USING COPULAS TO MODEL DEPENDENCE IN SIMULATION RISK ASSESSMENT

    Energy Technology Data Exchange (ETDEWEB)

    Dana L. Kelly

    2007-11-01

    Typical engineering systems in applications with high failure consequences such as nuclear reactor plants often employ redundancy and diversity of equipment in an effort to lower the probability of failure and therefore risk. However, it has long been recognized that dependencies exist in these redundant and diverse systems. Some dependencies, such as common sources of electrical power, are typically captured in the logic structure of the risk model. Others, usually referred to as intercomponent dependencies, are treated implicitly by introducing one or more statistical parameters into the model. Such common-cause failure models have limitations in a simulation environment. In addition, substantial subjectivity is associated with parameter estimation for these models. This paper describes an approach in which system performance is simulated by drawing samples from the joint distributions of dependent variables. The approach relies on the notion of a copula distribution, a notion which has been employed by the actuarial community for ten years or more, but which has seen only limited application in technological risk assessment. The paper also illustrates how equipment failure data can be used in a Bayesian framework to estimate the parameter values in the copula model. This approach avoids much of the subjectivity required to estimate parameters in traditional common-cause failure models. Simulation examples are presented for failures in time. The open-source software package R is used to perform the simulations. The open-source software package WinBUGS is used to perform the Bayesian inference via Markov chain Monte Carlo sampling.
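
    A minimal sketch of the copula idea described above, assuming a Gaussian copula and Weibull failure-time marginals: dependence is introduced in normal space and carried to the marginals through CDF transforms. The paper estimates the copula parameter in a Bayesian framework; here rho, shape and scale are illustrative values, not estimates from failure data.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)

    def correlated_failure_times(n=100_000, rho=0.6, shape=1.5, scale=1000.0):
        """Draw dependent failure times for two redundant components with a
        Gaussian copula: correlate in normal space, map to uniforms via the
        normal CDF, then to Weibull times via the inverse CDF."""
        cov = [[1.0, rho], [rho, 1.0]]
        z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
        u = stats.norm.cdf(z)                   # dependent uniforms on (0, 1)
        return stats.weibull_min.ppf(u, c=shape, scale=scale)

    t = correlated_failure_times()
    print(np.corrcoef(t.T)[0, 1])   # dependence induced by the copula
    ```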

  16. Simulation of MILD combustion using Perfectly Stirred Reactor model

    KAUST Repository

    Chen, Z.

    2016-07-06

    A simple model based on a Perfectly Stirred Reactor (PSR) is proposed for moderate or intense low-oxygen dilution (MILD) combustion. The PSR calculation is performed covering the entire flammability range, and the tabulated chemistry approach is used with a presumed joint probability density function (PDF). A jet in hot and diluted coflow experimental set-up under MILD conditions is simulated using this reactor model for two oxygen dilution levels. The computed results for mean temperature and major and minor species mass fractions are compared with the experimental data and with simulation results obtained recently using a multi-environment transported PDF approach. Overall, good agreement is observed at three different axial locations for these comparisons, despite an over-predicted peak value of CO formation. This suggests that MILD combustion can be effectively modelled by the proposed PSR model at lower computational cost.

  17. A reservoir simulation approach for modeling of naturally fractured reservoirs

    Directory of Open Access Journals (Sweden)

    H. Mohammadi

    2012-12-01

    Full Text Available In this investigation, the Warren and Root model proposed for the simulation of naturally fractured reservoirs was improved. A reservoir simulation approach was used to develop a 2D model of a synthetic oil reservoir. The main rock properties of each gridblock were defined for two different types of gridblocks, called matrix and fracture gridblocks. These two gridblock types differed in porosity and permeability values, which were higher for fracture gridblocks than for matrix gridblocks. The model was solved using the implicit finite difference method. Results showed an improvement over the Warren and Root model, especially in region 2 of the semilog plot of pressure drop versus time, which indicated a linear transition zone with no inflection point, as predicted by other investigators. The effects of fracture spacing, fracture permeability, fracture porosity, matrix permeability and matrix porosity on the behavior of a typical naturally fractured reservoir were also presented.
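
    To make the simulation mechanics concrete, the sketch below performs one implicit finite-difference step of 1-D single-phase pressure diffusion on a grid whose cells alternate between low-permeability matrix and high-permeability fracture values. It is a schematic reduction of the paper's 2-D model; the grid, properties and boundary treatment are assumptions.

    ```python
    import numpy as np

    def implicit_pressure_step(p, perm, dx=1.0, dt=1.0, storage=1.0):
        """One implicit step of storage * dp/dt = d/dx(k dp/dx) on a 1-D
        grid with per-cell permeability `perm` (no-flow outer boundaries).
        Solves (I - dt*A) p_new = p_old for unconditional stability."""
        n = len(p)
        A = np.zeros((n, n))
        for i in range(n):
            kl = perm[i - 1] if i > 0 else 0.0      # left transmissibility
            kr = perm[i + 1] if i < n - 1 else 0.0  # right transmissibility
            A[i, i] = -(kl + kr) / (storage * dx**2)
            if i > 0:
                A[i, i - 1] = kl / (storage * dx**2)
            if i < n - 1:
                A[i, i + 1] = kr / (storage * dx**2)
        return np.linalg.solve(np.eye(n) - dt * A, p)

    # Every fifth cell is a fracture, 100x more permeable than the matrix.
    perm = np.where(np.arange(50) % 5 == 0, 1.0, 0.01)
    p = implicit_pressure_step(np.linspace(2.0, 1.0, 50), perm)
    ```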

  18. Transverse Momentum Distributions of Electron in Simulated QED Model

    Science.gov (United States)

    Kaur, Navdeep; Dahiya, Harleen

    2018-05-01

    In the present work, we have studied the transverse momentum distributions (TMDs) of the electron in a simulated QED model. We have used the overlap representation of light-front wave functions, where the spin-1/2 relativistic composite system consists of a spin-1/2 fermion and a spin-1 vector boson. The results have been obtained for T-even TMDs in the transverse momentum plane for a fixed value of the longitudinal momentum fraction x.

  19. Environment Modeling Using Runtime Values for JPF-Android

    Science.gov (United States)

    van der Merwe, Heila; Tkachuk, Oksana; Nel, Seal; van der Merwe, Brink; Visser, Willem

    2015-01-01

    Software applications are developed to be executed in a specific environment. This environment includes external native libraries that add functionality to the application and drivers that fire the application execution. For testing and verification, the environment of an application is abstracted using simplified models or stubs. Empty stubs, returning default values, are simple to generate automatically, but they do not perform well when the application expects specific return values. Symbolic execution is used to find input parameters for drivers and return values for library stubs, but it struggles to detect the values of complex objects. In this work-in-progress paper, we explore an approach to generate drivers and stubs based on values collected during runtime instead of using default values. Entry points and methods that need to be modeled are instrumented to log their parameters and return values. The instrumented applications are then executed using a driver and instrumented libraries. The values collected during runtime are used to generate driver and stub values on-the-fly that improve coverage during verification by enabling the execution of code that previously crashed or was missed. We are implementing this approach to improve the environment model of JPF-Android, our model checking and analysis tool for Android applications.

  20. Twitter's tweet method modelling and simulation

    Science.gov (United States)

    Sarlis, Apostolos S.; Sakas, Damianos P.; Vlachos, D. S.

    2015-02-01

    This paper seeks to propose the concept of Twitter marketing methods. The tools that Twitter provides are modelled and simulated using iThink in the context of a Twitter media-marketing agency. The paper leverages the system dynamics paradigm to model Twitter marketing tools and methods, using the iThink™ system to implement them. It uses the design science research methodology for the proof of concept of the models and modelling processes. The following models have been developed for a Twitter marketing agent/company and tested in real circumstances with real numbers. These models were finalized through a number of revisions and iterations of the design, develop, simulate, test and evaluate cycle. The paper also addresses the methods best suited to organized, targeted promotion on the Twitter social media service. The validity and usefulness of these Twitter marketing method models for day-to-day decision making are authenticated by the management of the company organization. The work implements system dynamics concepts of Twitter marketing method modelling and produces models of various Twitter marketing situations. The Tweet method that Twitter provides can be adjusted, depending on the situation, in order to maximize the profit of the company/agent.

  1. Advances in NLTE modeling for integrated simulations

    Science.gov (United States)

    Scott, H. A.; Hansen, S. B.

    2010-01-01

    The last few years have seen significant progress in constructing the atomic models required for non-local thermodynamic equilibrium (NLTE) simulations. Along with this has come an increased understanding of the requirements for accurately modeling the ionization balance, energy content and radiative properties of different atomic species for a wide range of densities and temperatures. Much of this progress is the result of a series of workshops dedicated to comparing the results from different codes and computational approaches applied to a series of test problems. The results of these workshops emphasized the importance of atomic model completeness, especially in doubly-excited states and autoionization transitions, to calculating ionization balance, and the importance of accurate, detailed atomic data to producing reliable spectra. We describe a simple screened-hydrogenic model that calculates NLTE ionization balance with sufficient accuracy, at a low enough computational cost for routine use in radiation-hydrodynamics codes. The model incorporates term splitting, Δ n = 0 transitions, and approximate UTA widths for spectral calculations, with results comparable to those of much more detailed codes. Simulations done with this model have been increasingly successful at matching experimental data for laser-driven systems and hohlraums. Accurate and efficient atomic models are just one requirement for integrated NLTE simulations. Coupling the atomic kinetics to hydrodynamics and radiation transport constrains both discretizations and algorithms to retain energy conservation, accuracy and stability. In particular, the strong coupling between radiation and populations can require either very short time steps or significantly modified radiation transport algorithms to account for NLTE material response. Considerations such as these continue to provide challenges for NLTE simulations.

  2. Advances in NLTE Modeling for Integrated Simulations

    International Nuclear Information System (INIS)

    Scott, H.A.; Hansen, S.B.

    2009-01-01

    The last few years have seen significant progress in constructing the atomic models required for non-local thermodynamic equilibrium (NLTE) simulations. Along with this has come an increased understanding of the requirements for accurately modeling the ionization balance, energy content and radiative properties of different elements for a wide range of densities and temperatures. Much of this progress is the result of a series of workshops dedicated to comparing the results from different codes and computational approaches applied to a series of test problems. The results of these workshops emphasized the importance of atomic model completeness, especially in doubly excited states and autoionization transitions, to calculating ionization balance, and the importance of accurate, detailed atomic data to producing reliable spectra. We describe a simple screened-hydrogenic model that calculates NLTE ionization balance with surprising accuracy, at a low enough computational cost for routine use in radiation-hydrodynamics codes. The model incorporates term splitting, Δn = 0 transitions, and approximate UTA widths for spectral calculations, with results comparable to those of much more detailed codes. Simulations done with this model have been increasingly successful at matching experimental data for laser-driven systems and hohlraums. Accurate and efficient atomic models are just one requirement for integrated NLTE simulations. Coupling the atomic kinetics to hydrodynamics and radiation transport constrains both discretizations and algorithms to retain energy conservation, accuracy and stability. In particular, the strong coupling between radiation and populations can require either very short timesteps or significantly modified radiation transport algorithms to account for NLTE material response. Considerations such as these continue to provide challenges for NLTE simulations.

  3. SIMULATION MODELING OF IT PROJECTS BASED ON PETRI NETS

    Directory of Open Access Journals (Sweden)

    Александр Михайлович ВОЗНЫЙ

    2015-05-01

    Full Text Available An integrated simulation model of IT projects, based on a modified Petri net that combines the product model with the model of project tasks, has been proposed. A substantive interpretation of the simulation model's components has been presented, and the simulation process has been described. Conclusions are drawn about the integration of the product model and the model of project work.

  4. A parallel computational model for GATE simulations.

    Science.gov (United States)

    Rannou, F R; Vega-Acevedo, N; El Bitar, Z

    2013-12-01

    GATE/Geant4 Monte Carlo simulations are computationally demanding applications, requiring thousands of processor hours to produce realistic results. The classical strategy of distributing the simulation of individual events does not apply efficiently for Positron Emission Tomography (PET) experiments, because it requires a centralized coincidence processing and large communication overheads. We propose a parallel computational model for GATE that handles event generation and coincidence processing in a simple and efficient way by decentralizing event generation and processing but maintaining a centralized event and time coordinator. The model is implemented with the inclusion of a new set of factory classes that can run the same executable in sequential or parallel mode. A Mann-Whitney test shows that the output produced by this parallel model in terms of number of tallies is equivalent (but not equal) to its sequential counterpart. Computational performance evaluation shows that the software is scalable and well balanced. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  5. Modeling, simulation and optimization of bipedal walking

    CERN Document Server

    Berns, Karsten

    2013-01-01

    The model-based investigation of motions of anthropomorphic systems is an important interdisciplinary research topic involving specialists from many fields such as Robotics, Biomechanics, Physiology, Orthopedics, Psychology, Neurosciences, Sports, Computer Graphics and Applied Mathematics. The study of basic locomotion forms such as walking and running presented in this book is of particular interest due to the high demand on dynamic coordination, actuator efficiency and balance control. Mathematical models and numerical simulation and optimization techniques are explained, in combination with experimental data, which can help to better understand the basic underlying mechanisms of these motions and to improve them. Example topics treated in this book are Modeling techniques for anthropomorphic bipedal walking systems Optimized walking motions for different objective functions Identification of objective functions from measurements Simulation and optimization approaches for humanoid robots Biologically inspired con...

  6. Multiphase reacting flows modelling and simulation

    CERN Document Server

    Marchisio, Daniele L

    2007-01-01

    The papers in this book describe the most widely applicable modeling approaches and are organized in six groups covering from fundamentals to relevant applications. In the first part, some fundamentals of multiphase turbulent reacting flows are covered. In particular the introduction focuses on basic notions of turbulence theory in single-phase and multi-phase systems as well as on the interaction between turbulence and chemistry. In the second part, models for the physical and chemical processes involved are discussed. Among other things, particular emphasis is given to turbulence modeling strategies for multiphase flows based on the kinetic theory for granular flows. Next, the different numerical methods based on Lagrangian and/or Eulerian schemes are presented. In particular the most popular numerical approaches of computational fluid dynamics codes are described (i.e., Direct Numerical Simulation, Large Eddy Simulation, and Reynolds-Averaged Navier-Stokes approach). The book will cover particle-based meth...

  7. Fault diagnosis based on continuous simulation models

    Science.gov (United States)

    Feyock, Stefan

    1987-01-01

    The results are described of an investigation of techniques for using continuous simulation models as a basis for reasoning about physical systems, with emphasis on the diagnosis of system faults. It is assumed that a continuous simulation model of the properly operating system is available. Malfunctions are diagnosed by posing the question: how can we make the model behave like that? The adjustments that must be made to the model to produce the observed behavior usually provide definitive clues to the nature of the malfunction. A novel application of Dijkstra's weakest precondition predicate transformer is used to derive the preconditions for producing the required model behavior. To minimize the size of the search space, an envisionment generator based on interval mathematics was developed. In addition to its intended application, the ability to generate qualitative state spaces automatically from quantitative simulations proved to be a fruitful avenue of investigation in its own right. Implementations of the Dijkstra transform and the envisionment generator are reproduced in the Appendix.

  8. Numerical model simulation of atmospheric coolant plumes

    International Nuclear Information System (INIS)

    Gaillard, P.

    1980-01-01

    The effect of humid atmospheric coolants on the atmosphere is simulated by means of a three-dimensional numerical model. The atmosphere is defined by its natural vertical profiles of horizontal velocity, temperature, pressure and relative humidity. Effluent discharge is characterised by its vertical velocity and the temperature of air saturated with water vapour. The subject of investigation is the area in the vicinity of the point of discharge, with due allowance for the wake effect of the tower and buildings and, where applicable, wind veer with altitude. The model equations express the conservation relationships for momentum, energy, total mass and water mass, for an incompressible fluid behaving in accordance with the Boussinesq assumptions. Condensation is represented by a simple thermodynamic model, and turbulent fluxes are simulated by introducing turbulent viscosity and diffusivity data based on in-situ and experimental water model measurements. The three-dimensional problem expressed in terms of the primitive variables (u, v, w, p) is governed by an elliptic equation system which is solved numerically by application of an explicit time-marching algorithm in order to predict the steady-flow velocity distribution, temperature, water vapour concentration and the liquid-water concentration defining the visible plume. Windstill conditions are simulated by a program processing the elliptic equations in an axisymmetric revolution coordinate system. The calculated visible plumes are compared with plumes observed on site with a view to validating the models [fr]

  9. The Residual Value Models: A Framework for Business Administration

    OpenAIRE

    Konstantinos J. Liapis

    2010-01-01

    This article investigates the relationship between a firm’s performance and Residual Value Models (RVM) which serve as decision making tools in corporate management. The main measures are the Economic Value Added (EVA®) and Cash Value Added (CVA®), with key components the Residual Income (RI), Free Cash Flow (FCF) and Weighted Average Cost of Capital (WACC). These measures have attracted considerable interest among scientists, practitioners and organizations in recent years. This work focuses...

  10. A Simulation Model for Extensor Tendon Repair

    Directory of Open Access Journals (Sweden)

    Elizabeth Aronstam

    2017-07-01

    Full Text Available Audience: This simulation model is designed for use by emergency medicine residents. Although we have instituted this at the PGY-2 level of our residency curriculum, it is appropriate for any level of emergency medicine residency training. It might also be adapted for use by a variety of other learners, such as practicing emergency physicians, orthopedic surgery residents, or hand surgery trainees. Introduction: Tendon injuries commonly present to the emergency department, so it is essential that emergency physicians be competent in evaluating such injuries. Indeed, extensor tendon repair is included as an ACGME Emergency Medicine Milestone (Milestone 13, Wound Management, Level 5 – “Performs advanced wound repairs, such as tendon repairs…”).1 However, emergency medicine residents may have limited opportunity to develop these skills due to a lack of patients, competition from other trainees, or preexisting referral patterns. Simulation may provide an alternative means to effectively teach these skills in such settings. Previously described tendon repair simulation models that were designed for surgical trainees have used rubber worms,4 licorice,5 feeding tubes, catheters,6,7 drinking straws,8 microfoam tape,9 sheep forelimbs10 and cadavers.11 These models all suffer a variety of limitations, including high cost, lack of ready availability, or lack of realism. Objectives: We sought to develop an extensor tendon repair simulation model for emergency medicine residents, designed to meet ACGME Emergency Medicine Milestone 13, Level 5. We wished this model to be simple, inexpensive, and realistic. Methods: The learner responsible content/educational handout component of our innovation teaches residents about emergency department extensor tendon repair, and includes: 1) relevant anatomy, 2) indications and contraindications for emergency department extensor tendon repair, 3) physical exam findings, 4) tendon suture techniques, and 5) aftercare. During

  11. Modelling and simulation of thermal power plants

    Energy Technology Data Exchange (ETDEWEB)

    Eborn, J.

    1998-02-01

    Mathematical modelling and simulation are important tools when dealing with engineering systems that today are becoming increasingly more complex. Integrated production and recycling of materials are trends that give rise to heterogeneous systems, which are difficult to handle within one area of expertise. Model libraries are an excellent way to package engineering knowledge of systems and units to be reused by those who are not experts in modelling. Many commercial packages provide good model libraries, but they are usually domain-specific and closed. Heterogeneous, multi-domain systems require open model libraries written in general purpose modelling languages. This thesis describes a model database for thermal power plants written in the object-oriented modelling language OMOLA. The models are based on first principles. Subunits describe volumes with pressure and enthalpy dynamics and flows of heat or different media. The subunits are used to build basic units such as pumps, valves and heat exchangers which can be used to build system models. Several applications are described; a heat recovery steam generator, equipment for juice blending, steam generation in a sulphuric acid plant and a condensing steam plate heat exchanger. Model libraries for industrial use must be validated against measured data. The thesis describes how parameter estimation methods can be used for model validation. Results from a case-study on parameter optimization of a non-linear drum boiler model show how the technique can be used. 32 refs, 21 figs

  12. Brownian gas models for extreme-value laws

    International Nuclear Information System (INIS)

    Eliazar, Iddo

    2013-01-01

    In this paper we establish one-dimensional Brownian gas models for the extreme-value laws of Gumbel, Weibull, and Fréchet. A gas model is a countable collection of independent particles governed by common diffusion dynamics. The extreme-value laws are the universal probability distributions governing the affine scaling limits of the maxima and minima of ensembles of independent and identically distributed one-dimensional random variables. Using the recently introduced concept of stationary Poissonian intensities, we construct two gas models whose global statistical structures are stationary, and yield the extreme-value laws: a linear Brownian motion gas model for the Gumbel law, and a geometric Brownian motion gas model for the Weibull and Fréchet laws. The stochastic dynamics of these gas models are studied in detail, and closed-form analytical descriptions of their temporal correlation structures, their topological phase transitions, and their intrinsic first-passage-time fluxes are presented. (paper)
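
    The gas models themselves are analytical constructions, but the extreme-value limit they target is easy to see numerically: affinely rescaled maxima of i.i.d. samples converge to one of the three laws. The sketch below illustrates the Gumbel case with exponential variables; it demonstrates the classical limit theorem only, not the Brownian gas construction.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)

    # Affinely rescaled maxima of i.i.d. Exp(1) variables converge to the
    # Gumbel law: max(X_1..X_n) - log(n) -> Gumbel as n grows.
    n, trials = 2_000, 2_000
    m = rng.exponential(size=(trials, n)).max(axis=1) - np.log(n)

    # One-sample Kolmogorov-Smirnov test against the standard Gumbel CDF.
    res = stats.kstest(m, stats.gumbel_r.cdf)
    print(res.statistic, res.pvalue)   # small statistic -> good agreement
    ```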

  13. Modeling and simulation of economic processes

    Directory of Open Access Journals (Sweden)

    Bogdan Brumar

    2010-12-01

    Full Text Available In general, any activity extends over a longer period and is often characterized by a degree of uncertainty or insecurity about the size of the objective pursued. Because of the complexity of real economic systems and the stochastic dependencies between the different variables and parameters considered, not all systems can be adequately represented by a model that can be solved by analytical methods while covering all the issues relevant to managerial decision analysis over a realistic economic horizon. Often in such cases, the simulation technique is considered the only available alternative. Using simulation techniques to study real-world systems often requires laborious work. Carrying out a simulation experiment is a process that takes place in several stages.

  14. Simulation as a surgical teaching model.

    Science.gov (United States)

    Ruiz-Gómez, José Luis; Martín-Parra, José Ignacio; González-Noriega, Mónica; Redondo-Figuero, Carlos Godofredo; Manuel-Palazuelos, José Carlos

    2018-01-01

    Teaching of surgery has been affected by many factors over the last years, such as the reduction of working hours, the optimization of the use of the operating room, and patient safety. Traditional teaching methodology fails to reduce the impact of these factors on surgeons' training. Simulation as a teaching model minimizes such impact, and is more effective than traditional teaching methods for integrating knowledge and clinical-surgical skills. Simulation complements clinical assistance with training, creating a safe learning environment where patient safety is not affected and ethical or legal conflicts are avoided. Simulation uses learning methodologies that allow teaching individualization, adapting it to the learning needs of each student. It also allows training of all kinds of technical, cognitive or behavioural skills. Copyright © 2017 AEC. Published by Elsevier España, S.L.U. All rights reserved.

  15. Stakeholder Theory and Value Creation Models in Brazilian Firms

    Directory of Open Access Journals (Sweden)

    Natalia Giugni Vidal

    2015-09-01

    Full Text Available Objective – The purpose of this study is to understand how top Brazilian firms think about and communicate value creation to their stakeholders. Design/methodology/approach – We use qualitative content analysis methodology to analyze the sustainability or annual integrated reports of the top 25 Brazilian firms by sales revenue. Findings – Based on our analysis, these firms were classified into three main types of stakeholder value creation models: narrow, broad, or transitioning from narrow to broad. We find that many of the firms in our sample are in a transition state between narrow and broad stakeholder value creation models. We also identify seven areas of concentration discussed by firms in creating value for stakeholders: better stakeholder relationships, better work environment, environmental preservation, increased customer base, local development, reputation, and stakeholder dialogue. Practical implications – This study shows a trend towards broader stakeholder value creation models in Brazilian firms. The findings of this study may inform practitioners interested in broadening their value creation models. Originality/value – This study adds to the discussion of stakeholder theory in the Brazilian context by understanding variations in value creation orientation in Brazil.

  16. A Lookahead Behavior Model for Multi-Agent Hybrid Simulation

    Directory of Open Access Journals (Sweden)

    Mei Yang

    2017-10-01

    Full Text Available In the military field, multi-agent simulation (MAS) plays an important role in studying wars statistically. For a military simulation system, which involves large-scale entities and generates a very large number of interactions during the runtime, the issue of how to improve the running efficiency is of great concern for researchers. Current solutions mainly use hybrid simulation to gain fewer updates and synchronizations, where some important continuous models are maintained implicitly to keep the system dynamics, and partial resynchronization (PR) is chosen as the preferable state update mechanism. However, problems such as resynchronization interval selection and cyclic dependency remain unsolved in PR, which easily lead to low update efficiency and infinite looping of the state update process. To address these problems, this paper proposes a lookahead behavior model (LBM) to implement a PR-based hybrid simulation. In LBM, a minimal safe time window is used to predict the interactions between implicit models, upon which the resynchronization interval can be efficiently determined. Moreover, the LBM gives an estimated state value in the lookahead process so as to break the state-dependent cycle. The simulation results show that, compared with traditional mechanisms, LBM requires fewer updates and synchronizations.

  17. Determination of Complex-Valued Parametric Model Coefficients Using Artificial Neural Network Technique

    Directory of Open Access Journals (Sweden)

    A. M. Aibinu

    2010-01-01

    Full Text Available A new approach for determining the coefficients of a complex-valued autoregressive (CAR) and a complex-valued autoregressive moving average (CARMA) model using a complex-valued neural network (CVNN) technique is discussed in this paper. The CAR and complex-valued moving average (CMA) coefficients which constitute a CARMA model are computed simultaneously from the adaptive weights and coefficients of the linear activation functions in a two-layered CVNN. The performance of the proposed technique has been evaluated using simulated complex-valued data (CVD) with three different types of activation functions. The results show that the proposed method can accurately determine the model coefficients provided that the network is properly trained. Furthermore, application of the developed CVNN-based technique to MRI k-space reconstruction results in images with improved resolution.
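
    For orientation, the underlying CAR fitting problem can be posed as a complex-valued linear regression; the baseline sketch below recovers AR(2) coefficients by ordinary complex least squares. The paper's contribution is to read the same coefficients off a trained two-layer CVNN instead; the series parameters here are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Generate a stable complex-valued AR(2) series, then recover its
    # coefficients by ordinary least squares on the lagged values.
    a_true = np.array([0.5 + 0.3j, -0.2 + 0.1j])
    N = 5_000
    x = np.zeros(N, dtype=complex)
    for t in range(2, N):
        noise = (rng.standard_normal() + 1j * rng.standard_normal()) * 0.1
        x[t] = a_true[0] * x[t - 1] + a_true[1] * x[t - 2] + noise

    # Regression matrix of lag-1 and lag-2 values; lstsq handles complex dtypes.
    X = np.column_stack([x[1:-1], x[:-2]])
    a_hat, *_ = np.linalg.lstsq(X, x[2:], rcond=None)
    print(a_hat)   # close to a_true
    ```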

  18. Value Creation Challenges in Multichannel Retail Business Models

    Directory of Open Access Journals (Sweden)

    Mika Yrjölä

    2014-08-01

    Full Text Available Purpose: The purpose of the paper is to identify and analyze the challenges of value creation in multichannel retail business models. Design/methodology/approach: With the help of semi-structured interviews with top executives from different retailing environments, this study introduces a model of value creation challenges in the context of multichannel retailing. The challenges are analyzed in terms of three retail business model elements, i.e., format, activities, and governance. Findings: Adopting a multichannel retail business model requires critical rethinking of the basic building blocks of value creation. First of all, as customers effortlessly move between multiple channels, multichannel formats can lead to a mismatch between customer and firm value. Secondly, retailers face pressures to use their activities to form integrated total offerings to customers. Thirdly, multiple channels might lead to organizational silos with conflicting goals. A careful orchestration of value creation is needed to determine the roles and incentives of the channel parties involved. Research limitations/implications: In contrast to previous business model literature, this study did not adopt a network-centric view. By embracing the boundary-spanning nature of the business model, other challenges and elements might have been discovered (e.g., challenges in managing relationships with suppliers). Practical implications: As a practical contribution, this paper has analyzed the challenges retailers face in adopting multichannel business models. Customer tendencies for showrooming behavior highlight the need for generating efficient lock-in strategies. Customized, personal offers and information are ways to increase customer value, differentiate from competition, and achieve lock-in. Originality/value: As a theoretical contribution, this paper empirically investigates value creation challenges in a specific context, lowering the level of abstraction in the mostly

  19. The Value Simulation-Based Learning Added to Machining Technology in Singapore

    Science.gov (United States)

    Fang, Linda; Tan, Hock Soon; Thwin, Mya Mya; Tan, Kim Cheng; Koh, Caroline

    2011-01-01

    This study seeks to understand the value simulation-based learning (SBL) added to the learning of Machining Technology in a 15-week core subject course offered to university students. The research questions were: (1) How did SBL enhance classroom learning? (2) How did SBL help participants in their test? (3) How did SBL prepare participants for…

  20. Integral-Value Models for Outcomes over Continuous Time

    DEFF Research Database (Denmark)

    Harvey, Charles M.; Østerdal, Lars Peter

    Models of preferences between outcomes over continuous time are important for individual, corporate, and social decision making, e.g., medical treatment, infrastructure development, and environmental regulation. This paper presents a foundation for such models. It shows that conditions on preferences between real- or vector-valued outcomes over continuous time are satisfied if and only if the preferences are represented by a value function having an integral form.

  1. Facebook's personal page modelling and simulation

    Science.gov (United States)

    Sarlis, Apostolos S.; Sakas, Damianos P.; Vlachos, D. S.

    2015-02-01

    In this paper we will try to define the utility of Facebook's Personal Page marketing method. This tool that Facebook provides is modelled and simulated using iThink in the context of a Facebook marketing agency. The paper leverages the system dynamics paradigm to model Facebook marketing tools and methods, using the iThink™ system to implement them. It uses the design science research methodology for the proof of concept of the models and modelling processes. The following model has been developed for a social media marketing agent/company, oriented to the Facebook platform and tested in real circumstances. This model was finalized through a number of revisions and iterations of the design, development, simulation, testing and evaluation processes. The validity and usefulness of this Facebook marketing model for day-to-day decision making are authenticated by the management of the company organization. Facebook's Personal Page method can be adjusted, depending on the situation, in order to maximize the total profit of the company, which is to attract new customers, keep the interest of existing customers and deliver traffic to its website.

  2. Comparison of Geant4-DNA simulation of S-values with other Monte Carlo codes

    Energy Technology Data Exchange (ETDEWEB)

    André, T. [Université Bordeaux 1, CENBG, UMR 5797, F-33170 Gradignan (France); CNRS, IN2P3, CENBG, UMR 5797, F-33170 Gradignan (France); Morini, F. [Research Group of Theoretical Chemistry and Molecular Modelling, Hasselt University, Agoralaan Gebouw D, B-3590 Diepenbeek (Belgium); Karamitros, M. [Université Bordeaux 1, CENBG, UMR 5797, F-33170 Gradignan (France); CNRS, IN2P3, CENBG, UMR 5797, F-33170 Gradignan (France); CNRS, INCIA, UMR 5287, F-33400 Talence (France); Delorme, R. [LPSC, Université Joseph Fourier Grenoble 1, CNRS/IN2P3, Grenoble INP, 38026 Grenoble (France); CEA, LIST, F-91191 Gif-sur-Yvette (France); Le Loirec, C. [CEA, LIST, F-91191 Gif-sur-Yvette (France); Campos, L. [Departamento de Física, Universidade Federal de Sergipe, São Cristóvão (Brazil); Champion, C. [Université Bordeaux 1, CENBG, UMR 5797, F-33170 Gradignan (France); CNRS, IN2P3, CENBG, UMR 5797, F-33170 Gradignan (France); Groetz, J.-E.; Fromm, M. [Université de Franche-Comté, Laboratoire Chrono-Environnement, UMR CNRS 6249, Besançon (France); Bordage, M.-C. [Laboratoire Plasmas et Conversion d’Énergie, UMR 5213 CNRS-INPT-UPS, Université Paul Sabatier, Toulouse (France); Perrot, Y. [Laboratoire de Physique Corpusculaire, UMR 6533, Aubière (France); Barberet, Ph. [Université Bordeaux 1, CENBG, UMR 5797, F-33170 Gradignan (France); CNRS, IN2P3, CENBG, UMR 5797, F-33170 Gradignan (France); and others

    2014-01-15

    Monte Carlo simulations of S-values have been carried out with the Geant4-DNA extension of the Geant4 toolkit. The S-values have been simulated for monoenergetic electrons with energies ranging from 0.1 keV up to 20 keV, in liquid water spheres (for four radii, chosen between 10 nm and 1 μm), and for electrons emitted by five isotopes of iodine (131, 132, 133, 134 and 135), in liquid water spheres of varying radius (from 15 μm up to 250 μm). The results have been compared to those obtained from other Monte Carlo codes and from other published data. A Kolmogorov–Smirnov test confirmed the statistical compatibility of all simulation results.
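
    For reference, the simulated quantity is the standard MIRD-style S-value, the mean absorbed dose to a target region per decay in a source region (a textbook definition, not a formula reproduced from this record): for transitions i with energy E_i, yield Y_i per decay, absorbed fraction \phi_i, and target mass m_T,

        S(r_T \leftarrow r_S) = \frac{1}{m_T}\sum_{i} E_i\, Y_i\, \phi_i(r_T \leftarrow r_S)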

  3. Theory, modeling and simulation: Annual report 1993

    International Nuclear Information System (INIS)

    Dunning, T.H. Jr.; Garrett, B.C.

    1994-07-01

    Developing the knowledge base needed to address the environmental restoration issues of the US Department of Energy requires a fundamental understanding of molecules and their interactions in isolation and in liquids, on surfaces, and at interfaces. To meet these needs, the PNL has established the Environmental and Molecular Sciences Laboratory (EMSL) and will soon begin construction of a new, collaborative research facility devoted to advancing the understanding of environmental molecular science. Research in the Theory, Modeling, and Simulation program (TMS), which is one of seven research directorates in the EMSL, will play a critical role in understanding molecular processes important in restoring DOE's research, development and production sites, including understanding the migration and reactions of contaminants in soils and groundwater, the development of separation processes for the isolation of pollutants, the development of improved materials for waste storage, understanding the enzymatic reactions involved in the biodegradation of contaminants, and understanding the interaction of hazardous chemicals with living organisms. The research objectives of the TMS program are to apply available techniques to study fundamental molecular processes involved in natural and contaminated systems; to extend current techniques to treat molecular systems of future importance and to develop techniques for addressing problems that are computationally intractable at present; to apply molecular modeling techniques to simulate molecular processes occurring in the multispecies, multiphase systems characteristic of natural and polluted environments; and to extend current molecular modeling techniques to treat complex molecular systems and to improve the reliability and accuracy of such simulations. The program contains three research activities: Molecular Theory/Modeling, Solid State Theory, and Biomolecular Modeling/Simulation. Extended abstracts are presented for 89 studies.

  4. Theory, modeling and simulation: Annual report 1993

    Energy Technology Data Exchange (ETDEWEB)

    Dunning, T.H. Jr.; Garrett, B.C.

    1994-07-01

    Developing the knowledge base needed to address the environmental restoration issues of the US Department of Energy requires a fundamental understanding of molecules and their interactions in isolation and in liquids, on surfaces, and at interfaces. To meet these needs, the PNL has established the Environmental and Molecular Sciences Laboratory (EMSL) and will soon begin construction of a new, collaborative research facility devoted to advancing the understanding of environmental molecular science. Research in the Theory, Modeling, and Simulation program (TMS), which is one of seven research directorates in the EMSL, will play a critical role in understanding molecular processes important in restoring DOE's research, development and production sites, including understanding the migration and reactions of contaminants in soils and groundwater, the development of separation processes for the isolation of pollutants, the development of improved materials for waste storage, understanding the enzymatic reactions involved in the biodegradation of contaminants, and understanding the interaction of hazardous chemicals with living organisms. The research objectives of the TMS program are to apply available techniques to study fundamental molecular processes involved in natural and contaminated systems; to extend current techniques to treat molecular systems of future importance and to develop techniques for addressing problems that are computationally intractable at present; to apply molecular modeling techniques to simulate molecular processes occurring in the multispecies, multiphase systems characteristic of natural and polluted environments; and to extend current molecular modeling techniques to treat complex molecular systems and to improve the reliability and accuracy of such simulations. The program contains three research activities: Molecular Theory/Modeling, Solid State Theory, and Biomolecular Modeling/Simulation. Extended abstracts are presented for 89 studies.

  5. A Model Management Approach for Co-Simulation Model Evaluation

    NARCIS (Netherlands)

    Zhang, X.C.; Broenink, Johannes F.; Filipe, Joaquim; Kacprzyk, Janusz; Pina, Nuno

    2011-01-01

    Simulating formal models is a common means of validating the correctness of a system design and reducing the time-to-market. In most embedded control system design, multiple engineering disciplines and various domain-specific models are often involved, such as mechanical, control, software…

  6. eShopper modeling and simulation

    Science.gov (United States)

    Petrushin, Valery A.

    2001-03-01

    The advent of e-commerce gives an opportunity to shift the paradigm of customer communication into a highly interactive mode. The new generation of commercial Web servers, such as Blue Martini's server, combines the collection of data on customer behavior with real-time processing and dynamic tailoring of a feedback page. New opportunities for direct product marketing and cross selling are arriving. The key problem is what kind of information we need to achieve these goals, or in other words, how do we model the customer? The paper is devoted to customer modeling and simulation. The focus is on modeling an individual customer. The model is based on the customer's transaction data, click stream data, and demographics. The model includes the hierarchical profile of a customer's preferences for different types of products and brands; consumption models for the different types of products; the current focus, trends, and stochastic models for time intervals between purchases; product affinity models; and some generalized features, such as purchasing power, sensitivity to advertising, price sensitivity, etc. This type of model is used for predicting the date of the next visit, overall spending, and spending for different types of products and brands. For some types of stores (for example, a supermarket) and stable customers, it is possible to forecast the shopping lists rather accurately. The forecasting techniques are discussed. The forecasting results can be used for online direct marketing, customer retention, and inventory management. The customer model can also be used as a generative model for simulating the customer's purchasing behavior in different situations and for estimating customer's features.
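
    As a sketch of one sub-model described above (the stochastic model for time intervals between purchases), the following Python fragment fits an exponential inter-purchase model and forecasts the next visit; the data and variable names are illustrative, not taken from the paper:

        import numpy as np

        # Illustrative purchase history: day numbers of a customer's past visits.
        purchase_days = np.array([3.0, 10.0, 24.0, 31.0, 45.0, 52.0])

        intervals = np.diff(purchase_days)     # observed gaps between purchases
        mean_gap = intervals.mean()            # MLE mean of an exponential model

        expected_next = purchase_days[-1] + mean_gap    # expected next visit day
        p_within_week = 1.0 - np.exp(-7.0 / mean_gap)   # P(return within 7 days)

        print(f"expected next visit: day {expected_next:.1f}")
        print(f"P(visit within a week): {p_within_week:.2f}")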

  7. Simulation modelling in agriculture: General considerations.

    African Journals Online (AJOL)

    The computer does all the necessary arithmetic when the hypothesis is invoked to predict the future behaviour of the simulated system under given conditions. A general … in the advisory service. Keywords: agriculture; botany; computer simulation; modelling; simulation model; simulation modelling; south africa; techniques …

  8. New model for ultracompact coaxial Marx pulse generator simulations

    Science.gov (United States)

    Martin, Benoît; Raymond, Pierre; Wey, Joseph

    2006-04-01

    This article describes a new simulation model developed with PSPICE in order to improve the ultra compact Marx generators designed at the French-German Research Institute of Saint-Louis (ISL). The proposed model is based on a Marx elementary unit and is an equivalent electric circuit that matches the actual configuration of the generator. It consists of a structural description of the elementary stage of a Marx generator including stray components. It also includes a behavioral model of the spark gap switches based on the Vlastos formula determining the arc resistance value. The prebreakdown delay is also taken into account. Experimental data have been used to validate the results of the simulations. An original indirect measurement, allowing the estimation of the spark gap resistance, is also proposed.

  9. Modeling and PSPICE simulation of NBTI effects in VDMOS transistors

    Directory of Open Access Journals (Sweden)

    Marjanović Miloš

    2015-01-01

    Full Text Available In this paper the results of modeling and simulation of NBTI effects in a p-channel power VDMOS transistor are presented. Based on the experimental results, the threshold voltage shifts and changes of transconductance during NBT stress have been modeled and implemented in the PSPICE model of the IRF9520 transistor. By predefining the threshold voltage value before the NBT stress, and by assigning the stress time, transfer characteristics of the transistor are simulated. These characteristics lie within (1.33–11.25)% of the measured ones, which represents good agreement. [Project of the Ministry of Science of the Republic of Serbia, no. OI 171026 and no. TR 32026]
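
    NBTI-induced threshold voltage shifts of this kind are commonly captured with an empirical power law of the following form (a generic NBTI expression given for orientation; the paper's fitted parameters are not reproduced here):

        \Delta V_{th}(t) = A\, e^{-E_a/(kT)}\, |V_{GS}|^{\gamma}\, t^{n}, \qquad n \approx 0.1\text{ to }0.3

    where A, E_a, \gamma and n are fitted constants, T is the stress temperature and V_{GS} the stress voltage.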

  10. Effects of model schematisation, geometry and parameter values on urban flood modelling.

    Science.gov (United States)

    Vojinovic, Z; Seyoum, S D; Mwalwaka, J M; Price, R K

    2011-01-01

    One-dimensional (1D) hydrodynamic models have been used as a standard industry practice for urban flood modelling work for many years. More recently, however, model formulations have included a 1D representation of the main channels and a 2D representation of the floodplains. Since the physical process of describing exchanges of flows with the floodplains can be represented in different ways, the predictive capability of different modelling approaches can also vary. The present paper explores effects of some of the issues that concern urban flood modelling work. Impacts from applying different model schematisation, geometry and parameter values were investigated. The study has mainly focussed on exploring how different Digital Terrain Model (DTM) resolution, presence of different features on DTM such as roads and building structures and different friction coefficients affect the simulation results. Practical implications of these issues are analysed and illustrated in a case study from St Maarten, N.A. The results from this study aim to provide users of numerical models with information that can be used in the analyses of flooding processes in urban areas.
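
    The friction coefficients examined here enter through Manning's resistance law, the standard closure in such flood models (textbook background, not a formula quoted from the record); in SI units, with velocity V, Manning roughness n, hydraulic radius R_h and friction slope S:

        V = \frac{1}{n}\, R_h^{2/3}\, S^{1/2}

    Since V scales with 1/n, doubling the roughness assigned to, say, built-up areas roughly halves the conveyance, which is why simulated flood extents are sensitive to this parameter.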

  11. Models of consumer value cocreation in health care.

    Science.gov (United States)

    Nambisan, Priya; Nambisan, Satish

    2009-01-01

    In recent years, consumer participation in health care has gained critical importance as health care organizations (HCOs) seek varied avenues to enhance the quality and the value of their offerings. Many large HCOs have established online health communities where health care consumers (patients) can interact with one another to share knowledge and offer emotional support in disease management and care. Importantly, the focus of consumer participation in health care has moved beyond such personal health care management as the potential for consumers to participate in innovation and value creation in varied areas of the health care industry becomes increasingly evident. Realizing such potential, however, will require HCOs to develop a better understanding of the varied types of consumer value cocreation that are enabled by new information and communication technologies such as online health communities and Web 2.0 (social media) technologies. This article seeks to contribute toward such an understanding by offering a concise and coherent theoretical framework to analyze consumer value cocreation in health care. We identify four alternate models of consumer value cocreation, the partnership model, the open-source model, the support-group model, and the diffusion model, and discuss their implications for HCOs. We develop our theoretical framework by drawing on theories and concepts in knowledge creation, innovation management, and online communities. A set of propositions is developed by combining theoretical insights from these areas with real-world examples of consumer value cocreation in health care. The theoretical framework offered here informs on the potential impact of the different models of consumer value cocreation on important organizational variables such as innovation cost and time, service quality, and consumer perceptions of HCO. An understanding of the four models of consumer value cocreation can help HCOs adopt appropriate strategies and practices to…

  12. Aqueous Electrolytes: Model Parameters and Process Simulation

    DEFF Research Database (Denmark)

    Thomsen, Kaj

    This thesis deals with aqueous electrolyte mixtures. The Extended UNIQUAC model is being used to describe the excess Gibbs energy of such solutions. Extended UNIQUAC parameters for the twelve ions Na+, K+, NH4+, H+, Cl-, NO3-, SO42-, HSO4-, OH-, CO32-, HCO3-, and S2O82- are estimated. A computer program including a steady state process simulator for the design, simulation, and optimization of fractional crystallization processes is presented.

  13. A Placement Model for Flight Simulators.

    Science.gov (United States)

    1982-09-01

    …simulator basing strategies. Captains David R. VanDenburg and Jon D. Veith developed a mathematical model to assist in the placement analysis of A-7…

  14. Assessing the potential value for an automated dairy cattle body condition scoring system through stochastic simulation

    NARCIS (Netherlands)

    Bewley, J.M.; Boehlje, M.D.; Gray, A.W.; Hogeveen, H.; Kenyon, S.J.; Eicher, S.D.; Schutz, M.M.

    2010-01-01

    Purpose – The purpose of this paper is to develop a dynamic, stochastic, mechanistic simulation model of a dairy business to evaluate the cost and benefit streams coinciding with technology investments. The model was constructed to embody the biological and economical complexities of a dairy farm…

  15. Modeling and simulation of normal and hemiparetic gait

    Science.gov (United States)

    Luengas, Lely A.; Camargo, Esperanza; Sanchez, Giovanni

    2015-09-01

    Gait is the collective term for the two types of bipedal locomotion, walking and running. This paper is focused on walking. The analysis of human gait is of interest to many different disciplines, including biomechanics, human-movement science, rehabilitation and medicine in general. Here we present a new model that is capable of reproducing the properties of walking, normal and pathological. The aim of this paper is to establish the biomechanical principles that underlie human walking by using the Lagrange method. Constraint forces and a Rayleigh dissipation function, through which the effect on the tissues during gait is taken into account, are included. Depending on the value of the factor present in the Rayleigh dissipation function, both normal and pathological gait can be simulated. We first apply the model to normal gait and then to permanent hemiparetic gait. Anthropometric data of an adult person are used in the simulation; anthropometric data for children can also be used, provided they are taken from existing anthropometric tables. Validation of these models includes simulations of passive dynamic gait on level ground. The dynamic walking approach provides a new perspective on gait analysis, focusing on the kinematics and kinetics of gait. There have been studies and simulations of normal human gait, but few have focused on abnormal, especially hemiparetic, gait. Quantitative comparisons of the model predictions with gait measurements show that the model can reproduce the significant characteristics of normal gait.
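
    The method sketched above corresponds to the Lagrange equations extended with a Rayleigh dissipation function R (standard form; the damping factors c_j play the role of the factor whose value switches the model between normal and hemiparetic gait):

        \frac{d}{dt}\!\left(\frac{\partial L}{\partial \dot q_i}\right) - \frac{\partial L}{\partial q_i} + \frac{\partial R}{\partial \dot q_i} = Q_i, \qquad R = \tfrac{1}{2}\sum_j c_j\, \dot q_j^{2}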

  16. [Healthcare value chain: a model for the Brazilian healthcare system].

    Science.gov (United States)

    Pedroso, Marcelo Caldeira; Malik, Ana Maria

    2012-10-01

    This article presents a model of the healthcare value chain which consists of a schematic representation of the Brazilian healthcare system. The proposed model is adapted for the Brazilian reality and has the scope and flexibility for use in academic activities and analysis of the healthcare sector in Brazil. It places emphasis on three components: the main activities of the value chain, grouped in vertical and horizontal links; the mission of each link and the main value chain flows. The proposed model consists of six vertical and three horizontal links, amounting to nine. These are: knowledge development; supply of products and technologies; healthcare services; financial intermediation; healthcare financing; healthcare consumption; regulation; distribution of healthcare products; and complementary and support services. Four flows can be used to analyze the value chain: knowledge and innovation; products and services; financial; and information.

  17. Modelling interplanetary CMEs using magnetohydrodynamic simulations

    Directory of Open Access Journals (Sweden)

    P. J. Cargill

    Full Text Available The dynamics of Interplanetary Coronal Mass Ejections (ICMEs are discussed from the viewpoint of numerical modelling. Hydrodynamic models are shown to give a good zero-order picture of the plasma properties of ICMEs, but they cannot model the important magnetic field effects. Results from MHD simulations are shown for a number of cases of interest. It is demonstrated that the strong interaction of the ICME with the solar wind leads to the ICME and solar wind velocities being close to each other at 1 AU, despite their having very different speeds near the Sun. It is also pointed out that this interaction leads to a distortion of the ICME geometry, making cylindrical symmetry a dubious assumption for the CME field at 1 AU. In the presence of a significant solar wind magnetic field, the magnetic fields of the ICME and solar wind can reconnect with each other, leading to an ICME that has solar wind-like field lines. This effect is especially important when an ICME with the right sense of rotation propagates down the heliospheric current sheet. It is also noted that a lack of knowledge of the coronal magnetic field makes such simulations of little use in space weather forecasts that require knowledge of the ICME magnetic field strength.

    Key words. Interplanetary physics (interplanetary magnetic fields) – Solar physics, astrophysics, and astronomy (flares and mass ejections) – Space plasma physics (numerical simulation studies)

  18. Integer Valued Autoregressive Models for Tipping Bucket Rainfall Measurements

    DEFF Research Database (Denmark)

    Thyregod, Peter; Carstensen, Niels Jacob; Madsen, Henrik

    1999-01-01

    A new method for modelling the dynamics of rain sampled by a tipping bucket rain gauge is proposed. The considered models belong to the class of integer valued autoregressive processes. The models take the autocorrelation and discrete nature of the data into account. A first order, a second order and a threshold model are presented, together with methods to estimate the parameters of each model. The models are demonstrated to provide a good description of data from actual rain events, requiring only two to four parameters.
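
    A minimal simulation of the simplest member of this model class, the INAR(1) process X_t = α ∘ X_{t-1} + ε_t with binomial thinning "∘" and Poisson innovations, might look as follows in Python (parameter values are illustrative, not estimates from rain data):

        import numpy as np

        # INAR(1): each of last period's counts survives with probability alpha
        # (binomial thinning), plus fresh Poisson arrivals with mean lam.
        rng = np.random.default_rng(1)
        alpha, lam, n = 0.6, 1.2, 200

        x = np.empty(n, dtype=int)
        x[0] = rng.poisson(lam / (1.0 - alpha))        # near the stationary mean
        for t in range(1, n):
            survivors = rng.binomial(x[t - 1], alpha)  # thinning of last count
            x[t] = survivors + rng.poisson(lam)        # plus new arrivals

        print(x[:20])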

  19. Challenges and needs in fire management: A landscape simulation modeling perspective [chapter 4

    Science.gov (United States)

    Robert E. Keane; Geoffrey J. Cary; Mike D. Flannigan

    2011-01-01

    Fire management will face many challenges in the future from global climate change to protecting people, communities, and values at risk. Simulation modeling will be a vital tool for addressing these challenges but the next generation of simulation models must be spatially explicit to address critical landscape ecology relationships and they must use mechanistic...

  20. Numerical model for learning concepts of streamflow simulation

    Science.gov (United States)

    DeLong, L.L.; ,

    1993-01-01

    Numerical models are useful for demonstrating principles of open-channel flow. Such models can allow experimentation with cause-and-effect relations, testing concepts of physics and numerical techniques. Four PT is a numerical model written primarily as a teaching supplement for a course in one-dimensional stream-flow modeling. Four PT options particularly useful in training include selection of governing equations, boundary-value perturbation, and user-programmable constraint equations. The model can simulate non-trivial concepts such as flow in complex interconnected channel networks, meandering channels with variable effective flow lengths, hydraulic structures defined by unique three-parameter relations, and density-driven flow. The model is coded in FORTRAN 77, and data encapsulation is used extensively to simplify maintenance and modification and to enhance the use of Four PT modules by other programs and programmers.
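
    For orientation, the governing equations a one-dimensional unsteady streamflow model of this kind typically lets students select from are the Saint-Venant equations (standard form, not quoted from the record): continuity and momentum for cross-sectional area A, discharge Q, stage h, lateral inflow q_l, and friction slope S_f,

        \frac{\partial A}{\partial t} + \frac{\partial Q}{\partial x} = q_l, \qquad
        \frac{\partial Q}{\partial t} + \frac{\partial}{\partial x}\!\left(\frac{Q^2}{A}\right) + gA\,\frac{\partial h}{\partial x} + gA\,S_f = 0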

  1. Reactive transport models and simulation with ALLIANCES

    International Nuclear Information System (INIS)

    Leterrier, N.; Deville, E.; Bary, B.; Trotignon, L.; Hedde, T.; Cochepin, B.; Stora, E.

    2009-01-01

    Many chemical processes influence the evolution of nuclear waste storage. As a result, simulations based only upon transport and hydraulic processes fail to describe adequately some industrial scenarios. We need to take into account complex chemical models (mass action laws, kinetics...) which are highly non-linear. In order to simulate the coupling of these chemical reactions with transport, we use a classical Sequential Iterative Approach (SIA), with a fixed point algorithm, within the mainframe of the ALLIANCES platform. This approach allows us to use the various transport and chemical modules available in ALLIANCES, via an operator-splitting method based upon the structure of the chemical system. We present five different applications of reactive transport simulations in the context of nuclear waste storage: 1. A 2D simulation of the lixiviation by rain water of an underground polluted zone high in uranium oxide; 2. The degradation of the steel envelope of a package in contact with clay. Corrosion of the steel creates corrosion products and the altered package becomes a porous medium. We follow the degradation front through kinetic reactions and the coupling with transport; 3. The degradation of a cement-based material by the injection of an aqueous solution of zinc and sulphate ions. In addition to the reactive transport coupling, we take into account in this case the hydraulic retroaction of the porosity variation on the Darcy velocity; 4. The decalcification of a concrete beam in an underground storage structure. In this case, in addition to the reactive transport simulation, we take into account the interaction between chemical degradation and the mechanical forces (cracks...), and the retroactive influence of the structure changes on transport; 5. The degradation of the steel envelope of a package in contact with a clay material under a temperature gradient. In this case the reactive transport simulation is entirely directed by the temperature changes and…
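
    The Sequential Iterative Approach with operator splitting can be sketched in a few lines: within each time step the transport operator is applied once, and a fixed-point iteration then re-evaluates the chemistry source term at the previous iterate until the coupled field stops changing. The grid, velocity, and rate constant below are illustrative, not ALLIANCES code:

        import numpy as np

        nx, dx, dt = 50, 1.0, 0.4      # cells, cell size (m), time step (s)
        v, k_rate = 1.0, 0.2           # velocity (m/s), decay rate (1/s)
        c = np.zeros(nx); c[0] = 1.0   # initial concentration field

        def transport(c):
            """One explicit upwind advection step."""
            out = c.copy()
            out[1:] -= v * dt / dx * (c[1:] - c[:-1])
            return out

        for _ in range(60):                          # time loop
            c_adv = transport(c)                     # transport operator
            c_iter = c_adv.copy()
            for _ in range(50):                      # SIA fixed-point iterations
                c_next = c_adv + dt * (-k_rate * c_iter)  # chemistry source term
                if np.abs(c_next - c_iter).max() < 1e-12:
                    break
                c_iter = c_next
            c = c_next

        print(c.round(3))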

  2. Simulation models generator. Applications in scheduling

    Directory of Open Access Journals (Sweden)

    Omar Danilo Castrillón

    2013-08-01

    Rev. Mate. Teor. Aplic. (ISSN 1409-2433), Vol. 20(2): 231–241, July 2013. … in order to have an approach to reality and to evaluate decisions so as to take more assertive ones. To test the prototype, a production system with 9 machines and 5 jobs in a job shop configuration was used as the modeling example, testing stochastic processing times and machine stops in order to measure machine utilization rates and the average time of jobs in the system as measures of system performance. This test shows the goodness of the prototype, saving the user the building of the simulation model.

  3. Computer Models Simulate Fine Particle Dispersion

    Science.gov (United States)

    2010-01-01

    Through a NASA Seed Fund partnership with DEM Solutions Inc., of Lebanon, New Hampshire, scientists at Kennedy Space Center refined existing software to study the electrostatic phenomena of granular and bulk materials as they apply to planetary surfaces. The software, EDEM, allows users to import particles and obtain accurate representations of their shapes for modeling purposes, such as simulating bulk solids behavior, and was enhanced to be able to more accurately model fine, abrasive, cohesive particles. These new EDEM capabilities can be applied in many industries unrelated to space exploration and have been adopted by several prominent U.S. companies, including John Deere, Pfizer, and Procter & Gamble.

  4. Modeling and simulation of reactive flows

    CERN Document Server

    Bortoli, De AL; Pereira, Felipe

    2015-01-01

    Modelling and Simulation of Reactive Flows presents information on modeling and how to numerically solve reactive flows. The book offers a distinctive approach that combines diffusion flames and geochemical flow problems, providing users with a comprehensive resource that bridges the gap for scientists, engineers, and the industry. Specifically, the book looks at the basic concepts related to reaction rates, chemical kinetics, and the development of reduced kinetic mechanisms. It considers the most common methods used in practical situations, along with equations for reactive flows, and va…

  5. Creating Value Through the Freemium Business Model: A Consumer Perspective

    NARCIS (Netherlands)

    G.J. Rietveld (Joost)

    2016-01-01

    This paper develops a consumer-centric framework for creating value through the freemium business model. Goods that are commercialized through the freemium business model offer basic functionality for free and monetize users for extended use or complementary features. Compared to premium…

  6. Modelling and simulation of the hydrocracking of heavy oil fractions

    Directory of Open Access Journals (Sweden)

    Matos E.M.

    2000-01-01

    Full Text Available This work presents a model to describe the behavior of the concentration of constituents of heavy fractions from petroleum during the hydrocracking process. An approximation based on pseudocomponents or lumps is adopted due to the complexity of the mixture. The system is modeled as an isothermal tubular reactor with axial dispersion, where the hydrogen flows upward concurrently with the oil while the solid catalyst particles stay inside the reactor in an expanded bed regime. Simulations are carried out for different values of liquid superficial velocity, reactor length and degree of mixing (Peclet number).

  7. TMS modeling toolbox for realistic simulation.

    Science.gov (United States)

    Cho, Young Sun; Suh, Hyun Sang; Lee, Won Hee; Kim, Tae-Seong

    2010-01-01

    Transcranial magnetic stimulation (TMS) is a technique for brain stimulation using rapidly changing magnetic fields generated by coils. It has been established as an effective stimulation technique to treat patients suffering from damaged brain functions. Although TMS is known to be painless and noninvasive, it can also be harmful to the brain through incorrect focusing and excessive stimulation, which might result in seizures. Therefore, there is an ongoing research effort to elucidate and better understand the effect and mechanism of TMS. Lately, the boundary element method (BEM) and the finite element method (FEM) have been used to simulate the electromagnetic phenomena of TMS. However, there is a lack of general tools for generating TMS models, due to difficulties in realistic modeling of the human head and TMS coils. In this study, we have developed a toolbox through which one can generate high-resolution FE TMS models. The toolbox allows the creation of FE models of the head, with isotropic and anisotropic electrical conductivities for five different tissues of the head, and of the coils in 3D. The generated TMS model is importable to FE software packages such as ANSYS for further and efficient electromagnetic analysis. We present a set of demonstrative results of realistic simulation of TMS with our toolbox.

  8. Extreme value modelling of Ghana stock exchange index.

    Science.gov (United States)

    Nortey, Ezekiel N N; Asare, Kwabena; Mettle, Felix Okoe

    2015-01-01

    Modelling of extreme events has always been of interest in fields such as hydrology and meteorology. However, after the recent global financial crises, appropriate models for modelling such rare events leading to these crises have become quite essential in the finance and risk management fields. This paper models the extreme values of the Ghana stock exchange all-shares index (2000-2010) by applying extreme value theory (EVT) to fit a model to the tails of the daily stock returns data. A conditional approach of the EVT was preferred, and hence an ARMA-GARCH model was fitted to the data to correct for the effects of autocorrelation and conditional heteroscedastic terms present in the returns series before the EVT method was applied. The Peaks Over Threshold approach of the EVT, which fits a Generalized Pareto Distribution (GPD) model to excesses above a certain selected threshold, was employed. Maximum likelihood estimates of the model parameters were obtained, and the model's goodness of fit was assessed graphically using Q-Q, P-P and density plots. The findings indicate that the GPD provides an adequate fit to the data of excesses. The size of the extreme daily Ghanaian stock market movements was then computed using the value at risk and expected shortfall risk measures at some high quantiles, based on the fitted GPD model.
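
    The Peaks-Over-Threshold computation described above reduces to a few lines; the sketch below fits a GPD to excesses of synthetic heavy-tailed losses and evaluates value at risk and expected shortfall with the standard POT formulas (the data are stand-ins for the index returns, not the paper's dataset):

        import numpy as np
        from scipy.stats import genpareto

        rng = np.random.default_rng(0)
        losses = rng.standard_t(df=4, size=5000)       # heavy-tailed daily losses

        u = np.quantile(losses, 0.95)                  # threshold choice
        excesses = losses[losses > u] - u
        xi, _, beta = genpareto.fit(excesses, floc=0)  # shape xi, scale beta

        p = 0.99                                       # quantile of interest
        n, n_u = len(losses), len(excesses)
        var_p = u + beta / xi * ((n / n_u * (1 - p)) ** (-xi) - 1)
        es_p = (var_p + beta - xi * u) / (1 - xi)      # valid for xi < 1

        print(f"VaR_{p}: {var_p:.3f}, ES_{p}: {es_p:.3f}")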

  9. Wedge Experiment Modeling and Simulation for Reactive Flow Model Calibration

    Science.gov (United States)

    Maestas, Joseph T.; Dorgan, Robert J.; Sutherland, Gerrit T.

    2017-06-01

    Wedge experiments are a typical method for generating pop-plot data (run-to-detonation distance versus input shock pressure), which are used to assess an explosive material's initiation behavior. Such data can be utilized to calibrate reactive flow models by running hydrocode simulations and successively tweaking model parameters until a match with experiment is achieved. These simulations are typically performed in 1D and use a flyer impact to achieve the prescribed shock loading pressure. In this effort, a wedge experiment performed at the Army Research Lab (ARL) was modeled using CTH (an SNL hydrocode) in 1D, 2D, and 3D space in order to determine whether there was any justification for using simplified models. A simulation was also performed using the BCAT code (a CTH companion tool) that assumes a plate impact shock loading. Results from the simulations were compared to experimental data and show that the shock imparted into an explosive specimen is accurately captured with 2D and 3D simulations, but changes significantly in 1D space and with the BCAT tool. The difference in shock profile is shown to only affect numerical predictions for large run distances. This is attributed to incorrectly capturing the energy fluence for detonation waves versus flat shock loading. Portions of this work were funded through the Joint Insensitive Munitions Technology Program.
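
    For context, pop-plot data of the kind being matched are conventionally summarized by a straight line in log-log space (a common empirical form, not taken from this abstract):

        \log_{10} x^{*} = a + b\,\log_{10} P

    where x^{*} is the run-to-detonation distance, P the input shock pressure, and a, b fitted constants (b is negative, since stronger shocks detonate sooner).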

  10. Integrating Visualizations into Modeling NEST Simulations.

    Science.gov (United States)

    Nowke, Christian; Zielasko, Daniel; Weyers, Benjamin; Peyser, Alexander; Hentschel, Bernd; Kuhlen, Torsten W

    2015-01-01

    Modeling large-scale spiking neural networks showing realistic biological behavior in their dynamics is a complex and tedious task. Since these networks consist of millions of interconnected neurons, their simulation produces an immense amount of data. In recent years it has become possible to simulate even larger networks. However, solutions to assist researchers in understanding the simulation's complex emergent behavior by means of visualization are still lacking. While developing tools to partially fill this gap, we encountered the challenge to integrate these tools easily into the neuroscientists' daily workflow. To understand what makes this so challenging, we looked into the workflows of our collaborators and analyzed how they use the visualizations to solve their daily problems. We identified two major issues: first, the analysis process can rapidly change focus which requires to switch the visualization tool that assists in the current problem domain. Second, because of the heterogeneous data that results from simulations, researchers want to relate data to investigate these effectively. Since a monolithic application model, processing and visualizing all data modalities and reflecting all combinations of possible workflows in a holistic way, is most likely impossible to develop and to maintain, a software architecture that offers specialized visualization tools that run simultaneously and can be linked together to reflect the current workflow, is a more feasible approach. To this end, we have developed a software architecture that allows neuroscientists to integrate visualization tools more closely into the modeling tasks. In addition, it forms the basis for semantic linking of different visualizations to reflect the current workflow. In this paper, we present this architecture and substantiate the usefulness of our approach by common use cases we encountered in our collaborative work.

  11. Integrating Visualizations into Modeling NEST Simulations

    Directory of Open Access Journals (Sweden)

    Christian eNowke

    2015-12-01

    Full Text Available Modeling large-scale spiking neural networks showing realistic biological behavior in their dynamics is a complex and tedious task. Since these networks consist of millions of interconnected neurons, their simulation produces an immense amount of data. In recent years it has become possible to simulate even larger networks. However, solutions to assist researchers in understanding the simulation's complex emergent behavior by means of visualization are still lacking. While developing tools to partially fill this gap, we encountered the challenge to integrate these tools easily into the neuroscientists' daily workflow. To understand what makes this so challenging, we looked into the workflows of our collaborators and analyzed how they use the visualizations to solve their daily problems. We identified two major issues: first, the analysis process can rapidly change focus which requires to switch the visualization tool that assists in the current problem domain. Second, because of the heterogeneous data that results from simulations, researchers want to relate data to investigate these effectively. Since a monolithic application model, processing and visualizing all data modalities and reflecting all combinations of possible workflows in a holistic way, is most likely impossible to develop and to maintain, a software architecture that offers specialized visualization tools that run simultaneously and can be linked together to reflect the current workflow, is a more feasible approach. To this end, we have developed a software architecture that allows neuroscientists to integrate visualization tools more closely into the modeling tasks. In addition, it forms the basis for semantic linking of different visualizations to reflect the current workflow. In this paper, we present this architecture and substantiate the usefulness of our approach by common use cases we encountered in our collaborative work.

  12. Integrating Visualizations into Modeling NEST Simulations

    Science.gov (United States)

    Nowke, Christian; Zielasko, Daniel; Weyers, Benjamin; Peyser, Alexander; Hentschel, Bernd; Kuhlen, Torsten W.

    2015-01-01

    Modeling large-scale spiking neural networks showing realistic biological behavior in their dynamics is a complex and tedious task. Since these networks consist of millions of interconnected neurons, their simulation produces an immense amount of data. In recent years it has become possible to simulate even larger networks. However, solutions to assist researchers in understanding the simulation's complex emergent behavior by means of visualization are still lacking. While developing tools to partially fill this gap, we encountered the challenge to integrate these tools easily into the neuroscientists' daily workflow. To understand what makes this so challenging, we looked into the workflows of our collaborators and analyzed how they use the visualizations to solve their daily problems. We identified two major issues: first, the analysis process can rapidly change focus which requires to switch the visualization tool that assists in the current problem domain. Second, because of the heterogeneous data that results from simulations, researchers want to relate data to investigate these effectively. Since a monolithic application model, processing and visualizing all data modalities and reflecting all combinations of possible workflows in a holistic way, is most likely impossible to develop and to maintain, a software architecture that offers specialized visualization tools that run simultaneously and can be linked together to reflect the current workflow, is a more feasible approach. To this end, we have developed a software architecture that allows neuroscientists to integrate visualization tools more closely into the modeling tasks. In addition, it forms the basis for semantic linking of different visualizations to reflect the current workflow. In this paper, we present this architecture and substantiate the usefulness of our approach by common use cases we encountered in our collaborative work. PMID:26733860

  13. Efficient Turbulence Modeling for CFD Wake Simulations

    DEFF Research Database (Denmark)

    van der Laan, Paul

    … that can accurately and efficiently simulate wind turbine wakes. The linear k-ε eddy viscosity model (EVM) is a popular turbulence model in RANS; however, it underpredicts the velocity wake deficit and cannot predict the anisotropic Reynolds stresses in the wake. In the current work, nonlinear eddy viscosity models (NLEVMs) are applied to wind turbine wakes. NLEVMs can model anisotropic turbulence through a nonlinear stress-strain relation, and they can improve the velocity deficit by the use of a variable eddy viscosity coefficient that delays the wake recovery. Unfortunately, all tested NLEVMs show numerically unstable behavior for fine grids, which inhibits a grid dependency study for numerical verification. Therefore, a simpler EVM is proposed, labeled the k-ε-fp EVM, that has a linear stress-strain relation but still has a variable eddy viscosity coefficient. The k-ε-fp EVM is numerically…
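
    The variable coefficient contrasts with the standard linear EVM closure; as we read the abstract, the relation being modified is the usual eddy viscosity definition, with a scalar function f_P scaling the otherwise constant coefficient (our paraphrase of the thesis's notation, not a quoted formula):

        \nu_t = C_\mu \frac{k^2}{\varepsilon} \quad\longrightarrow\quad \nu_t = C_\mu\, f_P\, \frac{k^2}{\varepsilon}, \qquad C_\mu \approx 0.09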

  14. Modeling and simulation of gamma camera

    International Nuclear Information System (INIS)

    Singh, B.; Kataria, S.K.; Samuel, A.M.

    2002-08-01

    Simulation techniques play a vital role in the design of sophisticated instruments and also in the training of operating and maintenance staff. Gamma camera systems have been used for functional imaging in nuclear medicine. Functional images are derived from the external counting of a gamma-emitting radioactive tracer that, after introduction into the body, mimics the behavior of a native biochemical compound. The position-sensitive detector yields the coordinates of the gamma ray interaction with the detector, which are used to estimate the point of gamma ray emission within the tracer distribution space. This advanced imaging device is thus dependent on the performance of the algorithms for coordinate computation, estimation of the point of emission, generation of the image, and display of the image data. Contemporary systems also have protocols for quality control and clinical evaluation of imaging studies. Simulation of this processing leads to an understanding of the basic camera design problems. This report describes a PC-based package for the design and simulation of a gamma camera, along with options for simulating data acquisition and quality control of imaging studies. Image display and data processing, the other options implemented in SIMCAM, will be described in separate reports (under preparation). Gamma camera modeling and simulation in SIMCAM has preset configurations of the design parameters for various sizes of crystal detector, with the option to pack the PMTs on a hexagonal or square lattice. Different algorithms for computation of coordinates and spatial distortion removal are allowed, in addition to the simulation of the energy correction circuit. The user can simulate different static, dynamic, MUGA and SPECT studies. The acquired/simulated data is processed for quality control and clinical evaluation of the imaging studies. Results show that the program can be used to assess these performances. Also, the variations in performance parameters can be assessed due to the induced…
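
    A concrete instance of the coordinate-computing algorithm such a package must model is classic Anger logic, in which the event position is the signal-weighted centroid of the photomultiplier (PMT) positions; the following Python sketch uses made-up geometry and signals, not SIMCAM code:

        import numpy as np

        # PMT centers (cm) and the signals they report for one gamma event.
        pmt_xy = np.array([[0.0, 0.0], [3.0, 0.0], [1.5, 2.6],
                           [-1.5, 2.6], [-3.0, 0.0]])
        signals = np.array([0.9, 0.4, 0.3, 0.1, 0.05])

        z = signals.sum()                  # energy signal (sum of all PMTs)
        x, y = (signals[:, None] * pmt_xy).sum(axis=0) / z   # Anger centroid

        print(f"event at ({x:.2f}, {y:.2f}) cm, energy signal {z:.2f}")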

  15. Desktop Modeling and Simulation: Parsimonious, yet Effective Discrete-Event Simulation Analysis

    Science.gov (United States)

    Bradley, James R.

    2012-01-01

    This paper evaluates how quickly students can be trained to construct useful discrete-event simulation models using Excel. The typical supply chain used by many large national retailers is described, and an Excel-based simulation model of it is constructed. The set of programming and simulation skills required for development of that model is then determined; we conclude that six hours of training are required to teach the skills to MBA students. The simulation presented here contains all the fundamental functionality of a simulation model, and so our result holds for any discrete-event simulation model. We argue therefore that industry workers with the same technical skill set as students having completed one year in an MBA program can be quickly trained to construct simulation models. This result gives credence to the efficacy of Desktop Modeling and Simulation, whereby simulation analyses can be quickly developed, run, and analyzed with widely available software, namely Excel.
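
    The "fundamental functionality" referred to above is the clock/event-list/state machinery of discrete-event simulation; the following Python sketch of a single-server queue shows that same pattern in compact form (rates and run length are illustrative, and this is our sketch, not the paper's Excel model):

        import heapq, random

        random.seed(1)
        ARRIVE, DEPART = 0, 1
        events = [(random.expovariate(1.0), ARRIVE)]   # (time, kind) event list
        queue_len, busy, served = 0, False, 0

        while events and served < 1000:
            t, kind = heapq.heappop(events)            # advance clock to next event
            if kind == ARRIVE:
                heapq.heappush(events, (t + random.expovariate(1.0), ARRIVE))
                if busy:
                    queue_len += 1
                else:
                    busy = True
                    heapq.heappush(events, (t + random.expovariate(1.25), DEPART))
            else:                                      # departure
                served += 1
                if queue_len > 0:
                    queue_len -= 1
                    heapq.heappush(events, (t + random.expovariate(1.25), DEPART))
                else:
                    busy = False

        print(f"served {served} customers by t = {t:.1f}")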

  16. A matrix model for valuing anesthesia service with the resource-based relative value system.

    Science.gov (United States)

    Sinclair, David R; Lubarsky, David A; Vigoda, Michael M; Birnbach, David J; Harris, Eric A; Behrens, Vicente; Bazan, Richard E; Williams, Steve M; Arheart, Kristopher; Candiotti, Keith A

    2014-01-01

    The purpose of this study was to propose a new crosswalk using the resource-based relative value system (RBRVS) that preserves the time unit component of the anesthesia service and disaggregates anesthesia billing into component parts (preoperative evaluation, intraoperative management, and postoperative evaluation). The study was designed as an observational chart and billing data review of current and proposed payments, in the setting of a preoperative holding area, intraoperative suite, and post anesthesia care unit. In total, 1,195 charts of American Society of Anesthesiology (ASA) physical status 1 through 5 patients were reviewed. No direct patient interventions were undertaken. Spearman correlations between the proposed RBRVS billing matrix payments and the current ASA relative value guide methodology payments were strong (r=0.94–0.96, P<0.001 for training, test, and overall). The proposed RBRVS-based billing matrix yielded payments that were 3.0%±1.34% less than would have been expected from commercial insurers, using standard rates for commercial ASA relative value units and RBRVS relative value units. Compared with current Medicare reimbursement under the ASA relative value guide, reimbursement would almost double when converting to an RBRVS billing model. The greatest increases in Medicare reimbursement between the current system and proposed billing model occurred as anesthetic management complexity increased. The new crosswalk correlates with existing evaluation and management and intensive care medicine codes in an essentially revenue neutral manner when applied to the market-based rates of commercial insurers. The new system more highly values delivery of care to more complex patients undergoing more complex surgery and better represents the true value of anesthetic case management.
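
    A toy comparison of the two payment schemes can clarify the arithmetic; the unit values and conversion factors below are made-up placeholders, not actual ASA, Medicare, or commercial rates:

        # Classic ASA relative value guide: (base + time units) x conversion factor,
        # with one time unit commonly corresponding to 15 minutes of anesthesia.
        def asa_payment(base_units, minutes, cf_asa, minutes_per_unit=15):
            return (base_units + minutes / minutes_per_unit) * cf_asa

        # Proposed matrix style: separately valued pre-, intra- and postoperative
        # components expressed as RBRVS relative value units (RVUs).
        def rbrvs_payment(rvu_pre, rvu_intra, rvu_post, cf_rbrvs):
            return (rvu_pre + rvu_intra + rvu_post) * cf_rbrvs

        print(asa_payment(base_units=5, minutes=90, cf_asa=22.0))   # ASA style
        print(rbrvs_payment(1.9, 8.0, 1.3, cf_rbrvs=36.0))          # RBRVS style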

  17. A matrix model for valuing anesthesia service with the resource-based relative value system

    Science.gov (United States)

    Sinclair, David R; Lubarsky, David A; Vigoda, Michael M; Birnbach, David J; Harris, Eric A; Behrens, Vicente; Bazan, Richard E; Williams, Steve M; Arheart, Kristopher; Candiotti, Keith A

    2014-01-01

    Background The purpose of this study was to propose a new crosswalk using the resource-based relative value system (RBRVS) that preserves the time unit component of the anesthesia service and disaggregates anesthesia billing into component parts (preoperative evaluation, intraoperative management, and postoperative evaluation). The study was designed as an observational chart and billing data review of current and proposed payments, in the setting of a preoperative holding area, intraoperative suite, and post anesthesia care unit. In total, 1,195 charts of American Society of Anesthesiology (ASA) physical status 1 through 5 patients were reviewed. No direct patient interventions were undertaken. Results Spearman correlations between the proposed RBRVS billing matrix payments and the current ASA relative value guide methodology payments were strong (r=0.94–0.96, P<0.001 for training, test, and overall). The proposed RBRVS-based billing matrix yielded payments that were 3.0%±1.34% less than would have been expected from commercial insurers, using standard rates for commercial ASA relative value units and RBRVS relative value units. Compared with current Medicare reimbursement under the ASA relative value guide, reimbursement would almost double when converting to an RBRVS billing model. The greatest increases in Medicare reimbursement between the current system and proposed billing model occurred as anesthetic management complexity increased. Conclusion The new crosswalk correlates with existing evaluation and management and intensive care medicine codes in an essentially revenue neutral manner when applied to the market-based rates of commercial insurers. The new system more highly values delivery of care to more complex patients undergoing more complex surgery and better represents the true value of anesthetic case management. PMID:25336964

  18. Using Active Learning for Speeding up Calibration in Simulation Models.

    Science.gov (United States)

    Cevik, Mucahit; Ergun, Mehmet Ali; Stout, Natasha K; Trentham-Dietz, Amy; Craven, Mark; Alagoz, Oguzhan

    2016-07-01

    Most cancer simulation models include unobservable parameters that determine disease onset and tumor growth. These parameters play an important role in matching key outcomes such as cancer incidence and mortality, and their values are typically estimated via a lengthy calibration procedure, which involves evaluating a large number of combinations of parameter values via simulation. The objective of this study is to demonstrate how machine learning approaches can be used to accelerate the calibration process by reducing the number of parameter combinations that are actually evaluated. Active learning is a popular machine learning method that enables a learning algorithm such as artificial neural networks to interactively choose which parameter combinations to evaluate. We developed an active learning algorithm to expedite the calibration process. Our algorithm determines the parameter combinations that are more likely to produce desired outputs and therefore reduces the number of simulation runs performed during calibration. We demonstrate our method using the previously developed University of Wisconsin breast cancer simulation model (UWBCS). In a recent study, calibration of the UWBCS required the evaluation of 378 000 input parameter combinations to build a race-specific model, and only 69 of these combinations produced results that closely matched observed data. By using the active learning algorithm in conjunction with standard calibration methods, we identify all 69 parameter combinations by evaluating only 5620 of the 378 000 combinations. Machine learning methods hold potential in guiding model developers in the selection of more promising parameter combinations and hence speeding up the calibration process. Applying our machine learning algorithm to one model shows that evaluating only 1.49% of all parameter combinations would be sufficient for the calibration. © The Author(s) 2015.
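
    The selection loop described above can be sketched in a few lines of Python; here a cheap k-nearest-neighbor surrogate stands in for the paper's artificial neural network, and a toy accept/reject test stands in for the expensive UWBCS runs (all numbers are illustrative):

        import numpy as np

        rng = np.random.default_rng(0)
        candidates = rng.uniform(0.0, 1.0, size=(5000, 3))  # parameter combos

        def run_simulation(theta):
            """Expensive model run, reduced to a toy acceptance criterion."""
            return float(np.linalg.norm(theta - 0.5) < 0.15)

        idx = rng.choice(len(candidates), 100, replace=False)  # random seed batch
        X = candidates[idx]
        y = np.array([run_simulation(t) for t in X])
        pool = np.setdiff1d(np.arange(len(candidates)), idx)   # not yet evaluated

        for _ in range(20):                                # active-learning rounds
            d = np.linalg.norm(candidates[pool][:, None, :] - X[None, :, :], axis=2)
            p = y[np.argsort(d, axis=1)[:, :15]].mean(axis=1)  # k-NN accept prob.
            pick = pool[np.argsort(np.abs(p - 0.5))[:50]]      # most uncertain
            X = np.vstack([X, candidates[pick]])
            y = np.concatenate([y, [run_simulation(t) for t in candidates[pick]]])
            pool = np.setdiff1d(pool, pick)

        print(f"evaluated {len(y)} of {len(candidates)}; found {int(y.sum())} matches")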

  19. Establishment of virtual three-dimensional model for intravascular interventional devices and its clinical value

    International Nuclear Information System (INIS)

    Wei Xin; Zhong Liming; Xie Xiaodong; Wang Chaohua; You Jian; Hu Hong; Hu Kongqiong; Zhao Xiaowei

    2012-01-01

    Objective: To explore a virtual three-dimensional (3D) model for intravascular interventional devices, the method of preoperative simulation, and its value in clinical work. Methods: Virtual models including catheter, guide wire, stent and coil were established by using the 3D moulding software 3D Studio MAX R3. Interventional preoperative simulation was performed on a personal computer for 21 patients undergoing cerebral aneurysm embolization (anterior communicating artery 5, posterior communicating artery 10, middle cerebral artery 3, internal carotid artery 2, and vertebral artery 1); during the interventional procedures, the surgeon relied on the simulation results for shaping the micro-guide wire and catheter and for the release of micro-coils and stents. Results: (1) All the virtual instruments and real instruments had similar shapes; the overall time for constructing a virtual model was about 20 hours, and the preoperative simulation took 50 to 80 minutes. (2) The simulation results of catheter insertion in 18 cases had relevant value for guiding the micro-catheter, molding the micro-guide wire tip, and shortening the operating time. For embolization, the simulation results of coil filling and stent release were similar to the surgical results in 76% of the patients (16/21). (3) For teaching and training, 93% (38/41) of doctors in training believed that preoperative simulation facilitated the understanding of the surgery. Conclusions: The method of building virtual models of intravascular interventional devices is reliable. The preoperative simulation results can be used to guide practical clinical operation with a relatively high degree of similarity, and can play a role in promoting research on interventional virtual operations. (authors)

  20. The Unfolding of Value Sources During Online Business Model Transformation

    Directory of Open Access Journals (Sweden)

    Nadja Hoßbach

    2016-12-01

    Full Text Available Purpose: In the magazine publishing industry, viable online business models are still rare to absent. To prepare for the ‘digital future’ and safeguard their long-term survival, many publishers are currently in the process of transforming their online business model. Against this backdrop, this study aims to develop a deeper understanding of (1) how the different building blocks of an online business model are transformed over time and (2) how sources of value creation unfold during this transformation process. Methodology: To answer our research question, we conducted a longitudinal case study with a leading German business magazine publisher (called BIZ). Data was triangulated from multiple sources including interviews, internal documents, and direct observations. Findings: Based on our case study, we find that BIZ used the transformation process to differentiate its online business model from its traditional print business model along several dimensions, and that BIZ’s online business model changed from an efficiency- to a complementarity- to a novelty-based model during this process. Research implications: Our findings suggest that different business model transformation phases relate to different value sources, questioning the appropriateness of value source-based approaches for classifying business models. Practical implications: The results of our case study highlight the need for online-offline business model differentiation and point to the important distinction between service and product differentiation. Originality: Our study contributes to the business model literature by applying a dynamic and holistic perspective on the link between online business model changes and unfolding value sources.

  1. A Novel Mean-Value Model of the Cardiovascular System Including a Left Ventricular Assist Device.

    Science.gov (United States)

    Ochsner, Gregor; Amacher, Raffael; Schmid Daners, Marianne

    2017-06-01

    Time-varying elastance models (TVEMs) are often used for simulation studies of the cardiovascular system with a left ventricular assist device (LVAD). Because these models are computationally expensive, they cannot be used for long-term simulation studies. In addition, their equilibria are periodic solutions, which prevent the extraction of a linear time-invariant model that could be used e.g. for the design of a physiological controller. In the current paper, we present a new type of model to overcome these problems: the mean-value model (MVM). The MVM captures the behavior of the cardiovascular system by representative mean values that do not change within the cardiac cycle. For this purpose, each time-varying element is manually converted to its mean-value counterpart. We compare the derived MVM to a similar TVEM in two simulation experiments. In both cases, the MVM is able to fully capture the inter-cycle dynamics of the TVEM. We hope that the new MVM will become a useful tool for researchers working on physiological control algorithms. This paper provides a plant model that enables for the first time the use of tools from classical control theory in the field of physiological LVAD control.
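
    As an illustration of what "representative mean values" can look like, a cycle-averaged relation of the kind such a model is built from (standard hemodynamics, not an equation quoted from the paper) links mean aortic pressure, mean flow, systemic resistance, and venous pressure:

        \bar{P}_{ao} \approx \bar{Q}\, R_{sys} + \bar{P}_{ven}

    Because none of these quantities varies within the cardiac cycle, a model assembled from such relations has a true equilibrium point rather than a periodic solution, which is what permits linearization for controller design.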

  2. Finite element modeling to analyze TEER values across silicon nanomembranes.

    Science.gov (United States)

    Khire, Tejas S; Nehilla, Barrett J; Getpreecharsawas, Jirachai; Gracheva, Maria E; Waugh, Richard E; McGrath, James L

    2018-01-05

    Silicon nanomembranes are ultrathin, highly permeable, optically transparent and biocompatible substrates for the construction of barrier tissue models. Trans-epithelial/endothelial electrical resistance (TEER) is often used as a non-invasive, sensitive and quantitative technique to assess barrier function. The current study characterizes the electrical behavior of devices featuring silicon nanomembranes to facilitate their application in TEER studies. In conventional practice with commercial systems, raw resistance values are multiplied by the area of the membrane supporting cell growth to normalize TEER measurements. We demonstrate that under most circumstances, this multiplication does not 'normalize' TEER values as is assumed, and that the assumption is worse if applied to nanomembrane chips with a limited active area. To compare the TEER values from nanomembrane devices to those obtained from conventional polymer track-etched (TE) membranes, we develop finite element models (FEM) of the electrical behavior of the two membrane systems. Using FEM and parallel cell-culture experiments on both types of membranes, we successfully model the evolution of resistance values during the growth of endothelial monolayers. Further, by exploring the relationship between the models we develop a 'correction' function, which when applied to nanomembrane TEER, maps to experiments on conventional TE membranes. In summary, our work advances the utility of silicon nanomembranes as substrates for barrier tissue models by developing an interpretation of TEER values compatible with conventional systems.
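    For reference, a minimal sketch of the conventional area normalization the authors question, with hypothetical resistance readings; the paper's point is precisely that this multiplication can mislead for small-area nanomembrane chips where the current density is not uniform.

```python
# Conventional TEER "normalization": subtract the blank (cell-free)
# resistance, then multiply by the membrane area. Values are hypothetical.

def teer_ohm_cm2(r_total_ohm: float, r_blank_ohm: float, area_cm2: float) -> float:
    """Area-normalized TEER in ohm*cm^2. Valid only when the current
    density is uniform over the membrane, which the paper shows can fail
    for nanomembrane devices with a limited active area."""
    return (r_total_ohm - r_blank_ohm) * area_cm2

print(teer_ohm_cm2(r_total_ohm=1850.0, r_blank_ohm=120.0, area_cm2=0.33))
```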

  3. Classification of customer lifetime value models using Markov chain

    Science.gov (United States)

    Permana, Dony; Pasaribu, Udjianna S.; Indratno, Sapto W.; Suprayogi

    2017-10-01

    A firm’s potential future reward from a customer can be determined by the customer lifetime value (CLV). There are several mathematical methods to calculate it; one of them uses a Markov chain stochastic model. Here, a customer is assumed to pass through a number of states, and transitions between the states satisfy the Markov property. Given the states of a customer and the relationships between those states, we can build Markov models that describe the customer's behavior. In these Markov models, CLV is defined as a vector whose components contain the CLV for a customer in each of the states. In this paper we present a classification of Markov models for calculating CLV. Starting from a two-state customer model, we develop models with more states, where each development is based on weaknesses of the previous model. The final models can be expected to describe the real behavior of customers in a firm.
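    A hedged sketch of the standard Markov-chain CLV computation (not necessarily the authors' exact formulation): with transition matrix P, per-state margin vector m, and discount factor d, the CLV vector solves CLV = m + d·P·CLV, giving the closed form below. The two-state model and all numbers are hypothetical.

```python
import numpy as np

# CLV = m + d * P @ CLV  =>  CLV = (I - d*P)^-1 @ m

P = np.array([[0.7, 0.3],    # hypothetical 2-state model:
              [0.4, 0.6]])   # state 0 = active, state 1 = lapsed
m = np.array([100.0, 0.0])   # expected margin per period in each state
d = 0.9                      # one-period discount factor

clv = np.linalg.solve(np.eye(2) - d * P, m)
print(clv)   # clv[0] is the lifetime value of a customer starting active
```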

  4. Best Practices for Crash Modeling and Simulation

    Science.gov (United States)

    Fasanella, Edwin L.; Jackson, Karen E.

    2002-01-01

    Aviation safety can be greatly enhanced by the expeditious use of computer simulations of crash impact. Unlike automotive impact testing, which is now routine, experimental crash tests of even small aircraft are expensive and complex due to the high cost of the aircraft and the myriad of crash impact conditions that must be considered. Ultimately, the goal is to utilize full-scale crash simulations of aircraft for design evaluation and certification. The objective of this publication is to describe "best practices" for modeling aircraft impact using explicit nonlinear dynamic finite element codes such as LS-DYNA, DYNA3D, and MSC.Dytran. Although "best practices" is somewhat relative, it is hoped that the authors' experience will help others to avoid some of the common pitfalls in modeling that are not documented in one single publication. In addition, a discussion of experimental data analysis, digital filtering, and test-analysis correlation is provided. Finally, some examples of aircraft crash simulations are described in several appendices following the main report.

  5. Closed loop models for analyzing engineering requirements for simulators

    Science.gov (United States)

    Baron, S.; Muralidharan, R.; Kleinman, D.

    1980-01-01

    A closed-loop analytic model incorporating a model of the human pilot (namely, the optimal control model) was developed to allow certain simulation design tradeoffs to be evaluated quantitatively. This model was applied to a realistic flight control problem. The resulting model is used to analyze both overall simulation effects and the effects of individual elements. The results show that, as compared to an ideal continuous simulation, the discrete simulation can result in significant performance and/or workload penalties.

  6. An improved finite element model for craniofacial surgery simulation.

    Science.gov (United States)

    Wang, Shengzheng; Yang, Jie

    2009-11-01

    A novel approach is proposed for simulating the deformation of the facial soft tissues in craniofacial surgery simulation. A nonlinear finite mixed-element model (NFM-EM) based on solid-shell elements and the Lagrange principle of virtual work is proposed, which addresses the heterogeneity in geometry and material properties found in the soft tissues of the face. Moreover, after investigation of the strain-potential models, the biomechanical characteristics of skin, muscles and fat are modeled with the most suitable material properties. In addition, an improved contact algorithm is used to compute the boundary conditions of the soft tissue model. The quantitative validation and the comparative results with other models prove the effectiveness of the approach for the simulation of complex soft tissues. The average absolute value of the errors stays below 0.5 mm and the 95th percentile of the distance map is less than 1.5 mm. NFM-EM improves the accuracy and effectiveness of the soft tissue deformation simulation, and the effective contact algorithm bridges the bone-related planning and the prediction of the target face.

  7. Mainstreaming Modeling and Simulation to Accelerate Public Health Innovation

    Science.gov (United States)

    Sepulveda, Martin-J.; Mabry, Patricia L.

    2014-01-01

    Dynamic modeling and simulation are systems science tools that examine behaviors and outcomes resulting from interactions among multiple system components over time. Although there are excellent examples of their application, they have not been adopted as mainstream tools in population health planning and policymaking. Impediments to their use include the legacy and ease of use of statistical approaches that produce estimates with confidence intervals, the difficulty of multidisciplinary collaboration for modeling and simulation, systems scientists’ inability to communicate effectively the added value of the tools, and low funding for population health systems science. Proposed remedies include aggregation of diverse data sets, systems science training for public health and other health professionals, changing research incentives toward collaboration, and increased funding for population health systems science projects. PMID:24832426

  8. The reproductive value in distributed optimal control models.

    Science.gov (United States)

    Wrzaczek, Stefan; Kuhn, Michael; Prskawetz, Alexia; Feichtinger, Gustav

    2010-05-01

    We show that in a large class of distributed optimal control models (DOCM), where population is described by a McKendrick type equation with an endogenous number of newborns, the reproductive value of Fisher shows up as part of the shadow price of the population. Depending on the objective function, the reproductive value may be negative. Moreover, we show results of the reproductive value for changing vital rates. To motivate and demonstrate the general framework, we provide examples in health economics, epidemiology, and population biology.
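    For reference, a sketch in assumed notation (not taken from the paper) of the two standard objects the abstract refers to: a McKendrick-type equation with an endogenous number of newborns, and Fisher's reproductive value.

```latex
\[
\partial_t n(a,t) + \partial_a n(a,t) = -\mu(a)\,n(a,t),
\qquad
n(0,t) = \int_0^{\infty} \beta(a)\,n(a,t)\,\mathrm{d}a,
\]
\[
v(a) = \frac{e^{ra}}{\ell(a)} \int_a^{\infty} e^{-rx}\,\ell(x)\,m(x)\,\mathrm{d}x,
\]
```

where n(a,t) is the age density, μ and β are the mortality and fertility rates, ℓ is survivorship, m is maternity, and r is the intrinsic growth rate.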

  9. A Continuous-Time Model for Valuing Foreign Exchange Options

    Directory of Open Access Journals (Sweden)

    James J. Kung

    2013-01-01

    Full Text Available This paper makes use of stochastic calculus to develop a continuous-time model for valuing European options on foreign exchange (FX) when both domestic and foreign spot rates follow a generalized Wiener process. Using the dollar/euro exchange rate as input for parameter estimation and employing our FX option model as a yardstick, we find that the traditional Garman-Kohlhagen FX option model, which assumes constant spot rates, incorrectly values calls and puts for different values of the ratio of exchange rate to exercise price. Specifically, it undervalues calls when the ratio is between 0.70 and 1.08, and overvalues calls when the ratio is between 1.18 and 1.30, whereas it overvalues puts when the ratio is between 0.70 and 0.82, and undervalues puts when the ratio is between 0.86 and 1.30.
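    The benchmark the paper argues against is the textbook Garman-Kohlhagen formula with constant domestic rate rd and foreign rate rf; a sketch of that classical formula (not the authors' stochastic-rate model) follows, with illustrative inputs.

```python
from math import log, sqrt, exp
from statistics import NormalDist

def garman_kohlhagen(S, K, T, rd, rf, sigma, kind="call"):
    """Classical Garman-Kohlhagen price of a European FX option."""
    N = NormalDist().cdf
    d1 = (log(S / K) + (rd - rf + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    if kind == "call":
        return S * exp(-rf * T) * N(d1) - K * exp(-rd * T) * N(d2)
    return K * exp(-rd * T) * N(-d2) - S * exp(-rf * T) * N(-d1)

# e.g. dollar/euro spot 1.10, strike 1.05, 6 months, 2% vs 1% rates, 10% vol
print(garman_kohlhagen(1.10, 1.05, 0.5, 0.02, 0.01, 0.10))
```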

  10. Biomechanics trends in modeling and simulation

    CERN Document Server

    Ogden, Ray

    2017-01-01

    The book presents a state-of-the-art overview of biomechanical and mechanobiological modeling and simulation of soft biological tissues. Seven well-known scientists working in that particular field discuss topics such as biomolecules, networks and cells as well as failure, multi-scale, agent-based, bio-chemo-mechanical and finite element models appropriate for computational analysis. Applications include arteries, the heart, vascular stents and valve implants as well as adipose, brain, collagenous and engineered tissues. The mechanics of the whole cell and sub-cellular components as well as the extracellular matrix structure and mechanotransduction are described. In particular, the formation and remodeling of stress fibers, cytoskeletal contractility, cell adhesion and the mechanical regulation of fibroblast migration in healing myocardial infarcts are discussed. The essential ingredients of continuum mechanics are provided. Constitutive models of fiber-reinforced materials with an emphasis on arterial walls ...

  11. Simulations, evaluations and models. Vol. 1

    International Nuclear Information System (INIS)

    Brehmer, B.; Leplat, J.

    1992-01-01

    Papers presented at the Fourth MOHAWC (Models of Human Activities in Work Context) workshop. The general theme was simulations, evaluations and models. The emphasis was on time in relation to the modelling of human activities in modern, high-tech work. Such work often requires people to control dynamic systems, and the behaviour and misbehaviour of these systems in time is a principal focus of work in, for example, a modern process plant. The papers report on microworlds and on their innovative uses, both in the form of experiments and in a new form of use: testing a program which performs diagnostic reasoning. They present new perspectives on the problem of time in process control, showing the importance of considering the time scales of dynamic tasks, both in individual and in distributed decision making, and provide new formalisms, both for the representation of time and for reasoning involving time in diagnosis. (AB)

  12. Traffic flow dynamics data, models and simulation

    CERN Document Server

    Treiber, Martin

    2013-01-01

    This textbook provides a comprehensive and instructive coverage of vehicular traffic flow dynamics and modeling. It makes this fascinating interdisciplinary topic, which to date was only documented in parts by specialized monographs, accessible to a broad readership. Numerous figures and problems with solutions help the reader to quickly understand and practice the presented concepts. This book is targeted at students of physics and traffic engineering and, more generally, also at students and professionals in computer science, mathematics, and interdisciplinary topics. It also offers material for project work in programming and simulation at college and university level. The main part, after presenting different categories of traffic data, is devoted to a mathematical description of the dynamics of traffic flow, covering macroscopic models which describe traffic in terms of density, as well as microscopic many-particle models in which each particle corresponds to a vehicle and its driver. Focus chapters on ...

  13. Modelling and Simulation for Major Incidents

    Directory of Open Access Journals (Sweden)

    Eleonora Pacciani

    2015-11-01

    Full Text Available In recent years, there has been a rise in Major Incidents with a large impact on citizens' health and on society. Without the possibility of conducting live experiments when it comes to physical and/or toxic trauma, only an accurate in silico reconstruction allows us to identify organizational solutions with the best possible chance of success, in correlation with the limitations on available resources (e.g. medical teams, first responders, treatments, transport, and hospital availability) and with the variability of the characteristics of the event (e.g. type of incident, severity of the event, and type of lesions). Utilizing modelling and simulation techniques, a simplified mathematical model of the physiological evolution of patients involved in physical and toxic trauma incident scenarios has been developed and implemented. The model formalizes the dynamics, operating standards and practices of the medical response and the main emergency services in the chain of emergency management during a Major Incident.

  14. Qualitative simulation in formal process modelling

    International Nuclear Information System (INIS)

    Sivertsen, Elin R.

    1999-01-01

    In relation to several different research activities at the OECD Halden Reactor Project, the usefulness of formal process models has been identified. Being represented in some appropriate representation language, the purpose of these models is to model process plants and plant automatics in a unified way to allow verification and computer aided design of control strategies. The present report discusses qualitative simulation and the tool QSIM as one approach to formal process models. In particular, the report aims at investigating how recent improvements of the tool facilitate the use of the approach in areas like process system analysis, procedure verification, and control software safety analysis. An important long term goal is to provide a basis for using qualitative reasoning in combination with other techniques to facilitate the treatment of embedded programmable systems in Probabilistic Safety Analysis (PSA). This is motivated from the potential of such a combination in safety analysis based on models comprising both software, hardware, and operator. It is anticipated that the research results from this activity will benefit V and V in a wide variety of applications where formal process models can be utilized. Examples are operator procedures, intelligent decision support systems, and common model repositories (author) (ml)

  15. Heinrich events modeled in transient glacial simulations

    Science.gov (United States)

    Ziemen, Florian; Kapsch, Marie; Mikolajewicz, Uwe

    2017-04-01

    Heinrich events are among the most prominent events of climate variability recorded in proxies across the northern hemisphere. They are the archetype of ice sheet-climate interactions on millennial time scales. Nevertheless, the exact mechanisms that cause Heinrich events are still under debate, and their climatic consequences are far from being fully understood. We address open questions by studying Heinrich events in a coupled ice sheet model (ISM)-atmosphere-ocean-vegetation general circulation model (AOVGCM) framework, where this variability occurs as part of the model-generated internal variability. The framework consists of a northern hemisphere setup of the modified Parallel Ice Sheet Model (mPISM) coupled to the global AOVGCM ECHAM5/MPIOM/LPJ. The simulations were performed fully coupled and with transient orbital and greenhouse gas forcing. They span from several millennia before the last glacial maximum into the deglaciation. To make these long simulations feasible, the atmosphere is accelerated by a factor of 10 relative to the other model components using a periodical-synchronous coupling technique. To disentangle effects of the Heinrich events and the deglaciation, we focus on the events occurring before the deglaciation. The modeled Heinrich events show a peak ice discharge of about 0.05 Sv and raise the sea level by 2.3 m on average. The resulting surface water freshening reduces the Atlantic meridional overturning circulation and ocean heat release. The reduction in ocean heat release causes a sub-surface warming and decreases the air temperature and precipitation regionally and downstream into Eurasia. The surface elevation decrease of the ice sheet enhances moisture transport onto the ice sheet and thus increases precipitation over the Hudson Bay area, thereby accelerating the recovery after an event.

  16. Value increasing business model for e-hospital.

    Science.gov (United States)

    Null, Robert; Wei, June

    2009-01-01

    This paper developed a business value increasing model for the electronic hospital (e-hospital) based on electronic value chain analysis. From this model, 58 hospital electronic business (e-business) solutions were developed. Additionally, this paper investigated the adoption patterns of these 58 e-business solutions within six leading US hospitals. The findings show that only 36 of 58, or 62%, of the e-business solutions are fully or partially implemented within the six hospitals. Ultimately, the research results will be beneficial to managers and executives for accelerating the adoption of e-business in e-hospitals.

  17. Simulation Model of Mobile Detection Systems

    International Nuclear Information System (INIS)

    Edmunds, T.; Faissol, D.; Yao, Y.

    2009-01-01

    In this paper, we consider a mobile source that we attempt to detect with man-portable, vehicle-mounted or boat-mounted radiation detectors. The source is assumed to transit an area populated with these mobile detectors, and the objective is to detect the source before it reaches a perimeter. We describe a simulation model developed to estimate the probability that one of the mobile detectors will come into close proximity of the moving source and detect it. We illustrate with a maritime simulation example. Our simulation takes place in a 10 km by 5 km rectangular bay patrolled by boats equipped with 2-inch x 4-inch x 16-inch NaI detectors. Boats to be inspected enter the bay and randomly proceed to one of seven harbors on the shore. A source-bearing boat enters the mouth of the bay and proceeds to a pier on the opposite side. We wish to determine the probability that the source is detected and its range from the target when detected. Patrol boats select the nearest in-bound boat for inspection and initiate an intercept course. Once within the operational range of the detection system, a detection algorithm is started. If the patrol boat confirms the source is not present, it selects the next nearest boat for inspection. Each run of the simulation ends either when a patrol successfully detects a source or when the source reaches its target. Several statistical detection algorithms have been implemented in the simulation model. First, a simple k-sigma algorithm, which alarms when the counts in a time window exceed the mean background plus k times the standard deviation of the background, is available to the user. The time window used is optimized with respect to the signal-to-background ratio for that range and relative speed. Second, a sequential probability ratio test [Wald 1947] is available, configured in this simulation with a target false positive probability of 0.001 and a false negative probability of 0.1. This test is utilized when the mobile detector maintains
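    A minimal sketch of the k-sigma rule as described: alarm when the windowed gross counts exceed the mean background plus k standard deviations (for Poisson counts the standard deviation is the square root of the mean). Parameter values are illustrative, not from the report.

```python
import math

def k_sigma_alarm(counts: int, bg_rate_cps: float, window_s: float, k: float = 4.0) -> bool:
    """Alarm when gross counts in the window exceed mu + k*sqrt(mu),
    where mu is the expected background count in the window."""
    mu = bg_rate_cps * window_s
    return counts > mu + k * math.sqrt(mu)

print(k_sigma_alarm(counts=260, bg_rate_cps=50.0, window_s=4.0, k=4.0))  # True
```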

  18. Simulation of arc models with the block modelling method

    NARCIS (Netherlands)

    Thomas, R.; Lahaye, D.J.P.; Vuik, C.; Van der Sluis, L.

    2015-01-01

    Simulation of current interruption is currently performed with non-ideal switching devices for large power systems. Nevertheless, for small networks, non-ideal switching devices can be substituted by arc models. However, this substitution has a negative impact on the computation time. At the same

  19. Modeling lignin polymerization. Part 1: simulation model of dehydrogenation polymers.

    NARCIS (Netherlands)

    F.R.D. van Parijs (Frederik); K. Morreel; J. Ralph; W. Boerjan; R.M.H. Merks (Roeland)

    2010-01-01

    Lignin is a heteropolymer that is thought to form in the cell wall by combinatorial radical coupling of monolignols. Here, we present a simulation model of in vitro lignin polymerization, based on the combinatorial coupling theory, which allows us to predict the reaction conditions

  20. Modeling the value of strategic actions in the superior colliculus

    Directory of Open Access Journals (Sweden)

    Dhushan Thevarajah

    2010-02-01

    Full Text Available In learning models of strategic game play, an agent constructs a valuation (action value) over possible future choices as a function of past actions and rewards. Choices are then stochastic functions of these action values. Our goal is to uncover a neural signal that correlates with the action value posited by behavioral learning models. We measured activity from neurons in the superior colliculus (SC), a midbrain region involved in planning saccadic eye movements, in monkeys while they performed two saccade tasks. In the strategic task, monkeys competed against a computer in a saccade version of the mixed-strategy game “matching-pennies”. In the instructed task, stochastic saccades were elicited through explicit instruction rather than free choices. In both tasks, neuronal activity and behavior were shaped by past actions and rewards with more recent events exerting a larger influence. Further, SC activity predicted upcoming choices during the strategic task and upcoming reaction times during the instructed task. Finally, we found that neuronal activity in both tasks correlated with an established learning model, the Experience Weighted Attraction model of action valuation (Ho, Camerer, and Chong, 2007). Collectively, our results provide evidence that action values hypothesized by learning models are represented in the motor planning regions of the brain in a manner that could be used to select strategic actions.
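    As a simplified illustration of the kind of action-value bookkeeping such learning models posit: the paper uses the richer Experience Weighted Attraction (EWA) model, while the sketch below is a generic recency-weighted update in which recent outcomes influence the value estimate more than older ones. All parameters are illustrative.

```python
import random

def update_action_values(values, choice, reward, alpha=0.3):
    """Exponential recency weighting: recent outcomes move the value
    estimate more than older ones."""
    new_values = dict(values)
    new_values[choice] += alpha * (reward - new_values[choice])
    return new_values

rng = random.Random(0)
v = {"left": 0.0, "right": 0.0}
for _ in range(500):
    a = "left" if rng.random() < 0.5 else "right"   # random exploration policy
    rewarded = rng.random() < (0.7 if a == "left" else 0.3)
    v = update_action_values(v, a, 1.0 if rewarded else 0.0)
print(v)   # "left" should carry the higher action value (~0.7 vs ~0.3)
```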

  1. Mean Value Engine Modelling of an SI Engine with EGR

    DEFF Research Database (Denmark)

    Føns, Michael; Müller, Martin; Chevalier, Alain

    1999-01-01

    Mean Value Engine Models (MVEMs) are simplified, dynamic engine models that are physically based. Such models are useful for control studies, for engine control system analysis and for model based engine control systems. Very few published MVEMs have included the effects of Exhaust Gas Recirculation (EGR). The purpose of this paper is to present a modified MVEM which includes EGR in a physical way. It has been tested using newly developed, very fast manifold pressure, manifold temperature, port and EGR mass flow sensors. Reasonable agreement has been obtained on an experimental engine...

  2. An Agent-Based Monetary Production Simulation Model

    DEFF Research Database (Denmark)

    Bruun, Charlotte

    2006-01-01

    An Agent-Based Simulation Model Programmed in Objective Borland Pascal. Program and source code are downloadable.

  3. Software to Enable Modeling & Simulation as a Service

    Data.gov (United States)

    National Aeronautics and Space Administration — Develop a Modeling and Simulation as a Service (M&SaaS) software service infrastructure to enable most modeling and simulation (M&S) activities to be...

  4. Modelling and simulation of railway cable systems

    Energy Technology Data Exchange (ETDEWEB)

    Teichelmann, G.; Schaub, M.; Simeon, B. [Technische Univ. Muenchen, Garching (Germany). Zentrum Mathematik M2

    2005-12-15

    Mathematical models and numerical methods for the computation of both static equilibria and dynamic oscillations of railroad catenaries are derived and analyzed. These cable systems form a complex network of string and beam elements and lead to coupled partial differential equations in space and time where constraints and corresponding Lagrange multipliers express the interaction between carrier, contact wire, and pantograph head. For computing static equilibria, three different algorithms are presented and compared, while the dynamic case is treated by a finite element method in space, combined with stabilized time integration of the resulting differential algebraic system. Simulation examples based on reference data from industry illustrate the potential of such computational tools. (orig.)
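    For orientation, the generic tensioned-beam equation that such catenary models build on, in assumed notation; the paper's actual coupled carrier/contact-wire/pantograph system with constraints and Lagrange multipliers is more elaborate.

```latex
\[
\rho A\,\frac{\partial^2 w}{\partial t^2}
+ EI\,\frac{\partial^4 w}{\partial x^4}
- T\,\frac{\partial^2 w}{\partial x^2}
= f(x,t),
\]
```

where w(x,t) is the transverse displacement of the contact wire, EI its bending stiffness, T its tension, and f collects the dropper and pantograph coupling forces that the paper expresses through constraints and Lagrange multipliers.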

  5. Simulation model for port shunting yards

    Science.gov (United States)

    Rusca, A.; Popa, M.; Rosca, E.; Rosca, M.; Dragu, V.; Rusca, F.

    2016-08-01

    Sea ports are important nodes in the supply chain, joining two high-capacity transport modes: rail and maritime transport. The huge cargo flows transiting a port require high-capacity constructions and installations such as berths, large-capacity cranes and shunting yards. However, the specificity of port shunting yards raises several problems, such as: limited access, since these are terminus stations of the rail network; the input/output of large transit flows of cargo relative to the scarcity of ship departures/arrivals; and limited land availability for implementing solutions to serve these flows. It is necessary to identify technological solutions that lead to an answer to these problems. The paper proposes a simulation model, developed with the ARENA computer simulation software, suitable for shunting yards which serve sea ports with access to the rail network. It investigates the principal aspects of shunting yards and adequate measures to increase their transit capacity. The operating capacity of the shunting yard sub-system is assessed taking into consideration the required operating standards, and measures of performance of the railway station (e.g. waiting time for freight wagons, number of railway lines in the station, storage area, etc.) are computed. The conclusions and results drawn from the simulation help transport and logistics specialists to test proposals for improving port management.

  6. Catalog of Wargaming and Military Simulation Models

    Science.gov (United States)

    1992-02-07

    produces a LOS data file and target summary file. The target summary file records possible component interferences and errors encountered in processing ...links, messages, and monitors. SimMaster simulation models are constructed from these object-oriented building blocks. SimMaster's Radiation Monitor... Required for decisions as plan is built. Time Processing: Plan is static. Treatment of Randomness: Plan is deterministic, using expected value

  7. Modeling VOC transport in simulated waste drums

    International Nuclear Information System (INIS)

    Liekhus, K.J.; Gresham, G.L.; Peterson, E.S.; Rae, C.; Hotz, N.J.; Connolly, M.J.

    1993-06-01

    A volatile organic compound (VOC) transport model has been developed to describe unsteady-state VOC permeation and diffusion within a waste drum. Model equations account for three primary mechanisms for VOC transport from a void volume within the drum. These mechanisms are VOC permeation across a polymer boundary, VOC diffusion across an opening in a volume boundary, and VOC solubilization in a polymer boundary. A series of lab-scale experiments was performed in which the VOC concentration was measured in simulated waste drums under different conditions. A lab-scale simulated waste drum consisted of a sized-down 55-gal metal drum containing a modified rigid polyethylene drum liner. Four polyethylene bags were sealed inside a large polyethylene bag, supported by a wire cage, and placed inside the drum liner. The small bags were filled with a VOC-air gas mixture and the VOC concentration was measured throughout the drum over a period of time. Test variables included the type of VOC-air gas mixtures introduced into the small bags, the small bag closure type, and the presence or absence of a variable external heat source. Model results were calculated for those trials where the VOC permeability had been measured. Permeabilities for five VOCs [methylene chloride, 1,1,2-trichloro-1,2,2-trifluoroethane (Freon-113), 1,1,1-trichloroethane, carbon tetrachloride, and trichloroethylene] were measured across a polyethylene bag. Comparison of model and experimental results of VOC concentration as a function of time indicates that the model accurately accounts for the significant VOC transport mechanisms in a lab-scale waste drum
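    A hedged sketch of the first of the three mechanisms, permeation across a polymer boundary, treated as well-mixed first-order exchange between the bag interior and its surroundings; parameter names and values are assumptions for illustration, not data from the report.

```python
# Flux = Perm * Area * (C_in - C_out) / wall_thickness, with a well-mixed
# bag of volume V_IN and a fixed exterior concentration. All values are
# hypothetical.

PERM = 1.0e-10   # permeability (m^2/s)
AREA = 0.5       # bag surface area (m^2)
THK  = 1.0e-4    # wall thickness (m)
V_IN = 0.01      # bag volume (m^3)

c_in, c_out = 1000.0, 0.0        # ppmv inside bag and in surrounding liner
dt, t_end = 60.0, 3600.0 * 24 * 30
t = 0.0
while t < t_end:
    flux = PERM * AREA * (c_in - c_out) / THK   # ppmv*m^3/s
    c_in -= flux / V_IN * dt                    # forward-Euler update
    t += dt
print(f"bag concentration after 30 days: {c_in:.1f} ppmv")
```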

  8. Can Participatory Action Research Create Value for Business Model Innovation?

    DEFF Research Database (Denmark)

    Sparre, Mogens; Rasmussen, Ole Horn; Fast, Alf Michael

    Abstract: Participatory Action Research (PAR) has a longer academic history than the idea of business models (BMs). This paper indicates how industries gain by using the combined methodology. The research question – "Can participatory action research create value for Business Model Innovation (BMI)?" – has been investigated from five different perspectives based upon The Business Model Cube and The Where to Look Model. Using both established and newly developed tools, the paper presents how. Theory and data from two cases are presented, and it is demonstrated how industries increase their monetary and/or non-monetary value creation by doing BMI based upon PAR. The process is essential, and using the methodology of PAR creates meaning. Behind the process, the PAR methodology and its link to BMs and BMI may contribute to theory construction and the creation of a common language in academia around...

  9. Value stream mapping and simulation for implementation of lean manufacturing practices in a footwear company

    Directory of Open Access Journals (Sweden)

    Danilo Felipe Silva de Lima

    2016-03-01

    Full Text Available The development of Value Stream Mapping (VSM) is generally the first step in the implementation of Lean Manufacturing (LM). The aim of this paper is to present an application of VSM with simulation in order to analyze the impacts of LM adoption on the performance of a footwear plant. Therefore, a VSM was designed for the current state and, through the implementation of lean elements, a future state could be designed. Different scenarios were simulated for the future state implementation and the results were compared with each other. The transfer, cutting and assembly sections were chosen to be simulated, because it was considered that it would be possible to establish a one-piece flow between those processes. After the simulation, the scenario that presented the best results provided a 19% productivity increase over the current state, as well as improvement in all other process variables. The application of simulation as an additional element of VSM helped to identify the advantages of the joint approach, since it enables testing different alternatives and better defining the future state and its implementation strategies.

  10. 3D Core Model for simulation of nuclear power plants: Simulation requirements, model features, and validation

    International Nuclear Information System (INIS)

    Zerbino, H.

    1999-01-01

    In 1994-1996, Thomson Training and Simulation (TT and S) carried out the D50 Project, which involved the design and construction of optimized replica simulators for one Dutch and three German Nuclear Power Plants. It was recognized early on that the faithful reproduction of the Siemens reactor control and protection systems would impose extremely stringent demands on the simulation models, particularly the core physics and the RCS thermohydraulics. The quality of the models, and their thorough validation, were thus essential. The present paper describes the main features of the fully 3D core model implemented by TT and S, and its extensive validation campaign, which was defined in extremely positive collaboration with the customer and the core data suppliers. (author)

  11. Numerical Simulation Modelling for Velocity Measurement of Electromagnetic Flow Meter

    International Nuclear Information System (INIS)

    Wang, J Z; Gong, C L; Tian, G Y; Lucas, G P

    2006-01-01

    Induced-voltage electromagnetic flow meters (EMF) have been used in many industrial areas for measuring single-phase flow rates in pipes. For measuring the continuous-phase velocity profile in multiphase flows where the continuous phase is an electrical conductor, electrical capacitance and resistance tomography have been comprehensively investigated, but not continuous-phase velocity profile measurement itself. This paper designs a numerical simulation model according to the basic law of electromagnetic induction and investigates the relationship between the induced electric potential, or potential drop, and the velocity distribution of the conductive continuous phase in the flow. First, the 3-dimensional simulation module for the EMF is built. Given the simplest velocity profile of the fluid in the pipe, the value of the induced potential difference between electrodes is obtained by simulation and by theoretical computation according to J A Shercliff's weight function. The relative error is 6.066%. This proves that the simulation model is accurate enough to investigate the characteristics of the induced potential difference of the EMF. Finally, the relationship between the induced potential difference and the velocity profile is analysed in detail for a more complicated velocity profile, expressed as vz = 1 m/s for 0.02² < x² + y² ≤ 0.0265² and vz = 5 m/s for x² + y² ≤ 0.02²
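    For context, the standard weight-function relation of EMF theory that the simulation probes, in assumed notation (not the paper's exact formulation): the measured potential difference is a weighted integral of the axial velocity over the pipe cross-section, reducing to the familiar product form for an axisymmetric profile.

```latex
\[
\Delta U = \iint_A W(x,y)\, B\, v_z(x,y)\,\mathrm{d}A,
\qquad
\Delta U = B\, D\, \bar{v} \quad \text{(axisymmetric profile)},
\]
```

where W is Shercliff's weight function, B the magnetic flux density, D the electrode spacing, and v̄ the mean axial velocity.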

  12. The steady state of epidermis: mathematical modeling and numerical simulations.

    Science.gov (United States)

    Gandolfi, Alberto; Iannelli, Mimmo; Marinoschi, Gabriela

    2016-12-01

    We consider a model with age and space structure for the epidermis evolution. The model, previously presented and analyzed with respect to the suprabasal epidermis, includes different types of cells (proliferating cells, differentiated cells, corneous cells, and apoptotic cells) moving with the same velocity, under the constraint that the local volume fraction occupied by the cells is constant in space and time. Here, we complete the model proposing a mechanism regulating the cell production in the basal layer and we focus on the stationary case of the problem, i.e. on the case corresponding to the normal status of the skin. A numerical scheme to compute the solution of the model is proposed and its convergence is studied. Simulations are provided for realistic values of the parameters, showing the possibility of reproducing the structure of both "thin" and "thick" epidermis.

  13. Simulating the market for automotive fuel efficiency: The SHRSIM model

    Energy Technology Data Exchange (ETDEWEB)

    Greene, D.L.

    1987-02-01

    This report describes a computer model for simulating the effects of uncertainty about future fuel prices and competitors' behavior on the market shares of an automobile manufacturer who is considering introducing technology to increase fuel efficiency. Starting with an initial sales distribution, a pivot-point multinomial logit technique is used to adjust market shares based on changes in the present value of the added fuel efficiency. These shifts are random because the model generates random fuel price projections using parameters supplied by the user. The user also controls the timing of introduction and obsolescence of the technology. While the model was designed with automobiles in mind, it has more general applicability to energy-using durable goods. The model is written in IBM BASIC for an IBM PC and compiled using the Microsoft QuickBASIC (trademark of the Microsoft Corporation) compiler.
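    A hedged sketch of a pivot-point (incremental) multinomial logit adjustment of the kind the report describes: each maker's share is pivoted by the change in its utility, here driven by the present value of added fuel efficiency. Names and numbers are illustrative.

```python
import math

def pivot_logit(shares, delta_utility):
    """New share_i is proportional to share_i * exp(delta_V_i)."""
    w = [s * math.exp(dv) for s, dv in zip(shares, delta_utility)]
    total = sum(w)
    return [x / total for x in w]

shares = [0.30, 0.25, 0.45]     # initial market shares of 3 manufacturers
dV     = [0.20, 0.00, -0.05]    # utility change from fuel-efficiency PV
print(pivot_logit(shares, dV))  # shares shift toward the improved maker
```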

  14. A model for measuring value for money in professional sports

    Directory of Open Access Journals (Sweden)

    Vlad ROŞCA

    2013-07-01

    Full Text Available Few, if any, sports teams measure the entertainment value they provide to fans in exchange for the money the latter spend on admission fees. The scientific literature overlooks the issue as well. The aim of this paper is to present a model that can be used for calculating value for money in the context of spectator sports. The research question asks how value for money can be conceptualized and measured for sports marketing purposes. Using financial and sporting variables, the method calculates how much money, on average, a fan had to spend to receive quality entertainment – defined as won matches – from his favorite team, during the last season of the Romanian first division football championship. The results only partially confirm the research hypothesis, showing that not just price and sporting performance may influence the value delivered to fans, but other factors as well.

  15. A matrix model for valuing anesthesia service with the resource-based relative value system

    Directory of Open Access Journals (Sweden)

    Sinclair DR

    2014-10-01

    Full Text Available David R Sinclair,1 David A Lubarsky,1 Michael M Vigoda,1 David J Birnbach,1 Eric A Harris,1 Vicente Behrens,1 Richard E Bazan,1 Steve M Williams,1 Kristopher Arheart,2 Keith A Candiotti1 1Department of Anesthesiology, Perioperative Medicine and Pain Management, 2Department of Public Health Sciences, Division of Biostatistics, University of Miami Miller School of Medicine, Miami, FL, USA Background: The purpose of this study was to propose a new crosswalk using the resource-based relative value system (RBRVS that preserves the time unit component of the anesthesia service and disaggregates anesthesia billing into component parts (preoperative evaluation, intraoperative management, and postoperative evaluation. The study was designed as an observational chart and billing data review of current and proposed payments, in the setting of a preoperative holing area, intraoperative suite, and post anesthesia care unit. In total, 1,195 charts of American Society of Anesthesiology (ASA physical status 1 through 5 patients were reviewed. No direct patient interventions were undertaken. Results: Spearman correlations between the proposed RBRVS billing matrix payments and the current ASA relative value guide methodology payments were strong (r=0.94–0.96, P<0.001 for training, test, and overall. The proposed RBRVS-based billing matrix yielded payments that were 3.0%±1.34% less than would have been expected from commercial insurers, using standard rates for commercial ASA relative value units and RBRVS relative value units. Compared with current Medicare reimbursement under the ASA relative value guide, reimbursement would almost double when converting to an RBRVS billing model. The greatest increases in Medicare reimbursement between the current system and proposed billing model occurred as anesthetic management complexity increased. Conclusion: The new crosswalk correlates with existing evaluation and management and intensive care medicine codes in an

  16. IT Business Value Model for Information Intensive Organizations

    Directory of Open Access Journals (Sweden)

    Antonio Carlos Gastaud Maçada

    2012-01-01

    Full Text Available Many studies have highlighted the capacity Information Technology (IT) has for generating value for organizations. Investments in IT made by organizations have increased each year. Therefore, the purpose of the present study is to analyze the IT Business Value for Information Intensive Organizations (IIO) - e.g. banks, insurance companies and securities brokers. The research method consisted of a survey that used and combined the models from Weill and Broadbent (1998) and Gregor, Martin, Fernandez, Stern and Vitale (2006). Data was gathered using an adapted instrument containing 5 dimensions (Strategic, Informational, Transactional, Transformational and Infra-structure) with 27 items. The instrument was refined by employing statistical techniques such as Exploratory and Confirmatory Factorial Analysis through Structural Equations (first and second order Model Measurement). The final model is composed of four factors related to IT Business Value: Strategic, Informational, Transactional and Transformational, arranged in 15 items. The dimension Infra-structure was excluded during the model refinement process because it was discovered during interviews that managers were unable to perceive it as a distinct dimension of IT Business Value.

  17. Molecular models and simulations of layered materials

    International Nuclear Information System (INIS)

    Kalinichev, Andrey G.; Cygan, Randall Timothy; Heinz, Hendrik; Greathouse, Jeffery A.

    2008-01-01

    The micro- to nano-sized nature of layered materials, particularly characteristic of naturally occurring clay minerals, limits our ability to fully interrogate their atomic dispositions and crystal structures. The low symmetry, multicomponent compositions, defects, and disorder phenomena of clays and related phases necessitate the use of molecular models and modern simulation methods. Computational chemistry tools based on classical force fields and quantum-chemical methods of electronic structure calculations provide a practical approach to evaluate structure and dynamics of the materials on an atomic scale. Combined with classical energy minimization, molecular dynamics, and Monte Carlo techniques, quantum methods provide accurate models of layered materials such as clay minerals, layered double hydroxides, and clay-polymer nanocomposites

  18. VISION: Verifiable Fuel Cycle Simulation Model

    Energy Technology Data Exchange (ETDEWEB)

    Jacob J. Jacobson; Abdellatif M. Yacout; Gretchen E. Matthern; Steven J. Piet; David E. Shropshire

    2009-04-01

    The nuclear fuel cycle is a very complex system that includes considerable dynamic complexity as well as detail complexity. In the nuclear power realm, there are experts and considerable research and development in nuclear fuel development, separations technology, reactor physics and waste management. What is lacking is an overall understanding of the entire nuclear fuel cycle and how the deployment of new fuel cycle technologies affects the overall performance of the fuel cycle. The Advanced Fuel Cycle Initiative’s systems analysis group is developing a dynamic simulation model, VISION, to capture the relationships, timing and delays in and among the fuel cycle components to help develop an understanding of how the overall fuel cycle works and can transition as technologies are changed. This paper is an overview of the philosophy and development strategy behind VISION. The paper includes some descriptions of the model and some examples of how to use VISION.

  19. Modeling and visual simulation of Microalgae photobioreactor

    Science.gov (United States)

    Zhao, Ming; Hou, Dapeng; Hu, Dawei

    Microalgae are nutritious, highly photosynthetically efficient autotrophic organisms, widely distributed on land and in the sea. They can be used extensively in medicine, food, aerospace, biotechnology, environmental protection and other fields. Photobioreactors are important equipment mainly used to cultivate microalgae at large scale and high density. In this paper, based on a mathematical model of microalgae grown under different light intensities, a three-dimensional visualization model was built and implemented in 3ds Max, Virtools and other three-dimensional software. Microalgae are photosynthetic organisms that can efficiently produce oxygen and absorb carbon dioxide. The goal of the visual simulation is to display these changes, and their impact on oxygen and carbon dioxide, intuitively. In this paper, different temperatures and light intensities were selected to control the photobioreactor, and the dynamic changes of microalgal biomass, oxygen and carbon dioxide were observed, with the aim of providing visualization support for microalgal and photobioreactor research.

  20. A rainfall simulation model for agricultural development in Bangladesh

    Directory of Open Access Journals (Sweden)

    M. Sayedur Rahman

    2000-01-01

    Full Text Available A rainfall simulation model based on a first-order Markov chain has been developed to simulate the annual variation in rainfall amount that is observed in Bangladesh. The model has been tested in the Barind Tract of Bangladesh. Few significant differences were found between the actual and simulated seasonal, annual and average monthly rainfall. The distribution of the number of successes is asymptotically normal. When actual and simulated daily rainfall data were used to drive a crop simulation model, there was no significant difference in rice yield response. The results suggest that the rainfall simulation model performs adequately for many applications.
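    For orientation, a hedged sketch of the standard wet/dry-day first-order Markov chain rainfall generator; the paper's exact parameterization is not reproduced, and the transition probabilities and gamma amount distribution below are illustrative.

```python
import random

P_WW = 0.65   # P(wet tomorrow | wet today)
P_DW = 0.25   # P(wet tomorrow | dry today)

def simulate_rainfall(days: int, seed: int = 42):
    """Daily rainfall series: first-order two-state Markov chain for wet/dry
    occurrence, gamma-distributed amounts on wet days (shape/scale assumed)."""
    rng = random.Random(seed)
    wet, series = False, []
    for _ in range(days):
        wet = rng.random() < (P_WW if wet else P_DW)
        series.append(rng.gammavariate(0.8, 12.0) if wet else 0.0)
    return series

rain = simulate_rainfall(365)
print(f"annual total: {sum(rain):.0f} mm, wet days: {sum(r > 0 for r in rain)}")
```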

  1. Modeling lift operations with SAS® Simulation Studio

    Science.gov (United States)

    Kar, Leow Soo

    2016-10-01

    Lifts or elevators are an essential part of multistorey buildings, providing vertical transportation for their occupants. In large high-rise apartment buildings the occupants are permanent, while in buildings like hospitals or office blocks the occupants are temporary users of the buildings. They come in to work or to visit, and thus the population of such buildings is much higher than that of residential apartments. It is common these days for large office blocks or hospitals to have at least 8 to 10 lifts serving their population. In order to optimize the level of service performance, different transportation schemes are devised to control the lift operations. For example, one lift may be assigned to serve solely the even floors and another solely the odd floors, etc. In this paper, a basic lift system is modelled using SAS Simulation Studio to study the effect of factors such as the number of floors, the capacity of the lift car, the arrival and exit rates of passengers at each floor, and peak and off-peak periods on the system performance. The simulation is applied to a real lift operation in Sunway College's North Building to validate the model.

  2. Plasma simulation studies using multilevel physics models

    International Nuclear Information System (INIS)

    Park, W.; Belova, E.V.; Fu, G.Y.

    2000-01-01

    The question of how to proceed toward ever more realistic plasma simulation studies using ever increasing computing power is addressed. The answer presented here is the M3D (Multilevel 3D) project, which has developed a code package with a hierarchy of physics levels that resolve increasingly complete subsets of phase-spaces and are thus increasingly more realistic. The rationale for the multilevel physics models is given. Each physics level is described and examples of its application are given. The existing physics levels are fluid models (3D configuration space), namely magnetohydrodynamic (MHD) and two-fluids; and hybrid models, namely gyrokinetic-energetic-particle/MHD (5D energetic particle phase-space), gyrokinetic-particle-ion/fluid-electron (5D ion phase-space), and full-kinetic-particle-ion/fluid-electron level (6D ion phase-space). Resolving electron phase-space (5D or 6D) remains a future project. Phase-space-fluid models are not used in favor of delta f particle models. A practical and accurate nonlinear fluid closure for noncollisional plasmas seems not likely in the near future

  3. Plasma simulation studies using multilevel physics models

    Energy Technology Data Exchange (ETDEWEB)

    Park, W.; Belova, E.V.; Fu, G.Y. [and others]

    2000-01-19

    The question of how to proceed toward ever more realistic plasma simulation studies using ever increasing computing power is addressed. The answer presented here is the M3D (Multilevel 3D) project, which has developed a code package with a hierarchy of physics levels that resolve increasingly complete subsets of phase-spaces and are thus increasingly more realistic. The rationale for the multilevel physics models is given. Each physics level is described and examples of its application are given. The existing physics levels are fluid models (3D configuration space), namely magnetohydrodynamic (MHD) and two-fluids; and hybrid models, namely gyrokinetic-energetic-particle/MHD (5D energetic particle phase-space), gyrokinetic-particle-ion/fluid-electron (5D ion phase-space), and full-kinetic-particle-ion/fluid-electron level (6D ion phase-space). Resolving electron phase-space (5D or 6D) remains a future project. Phase-space-fluid models are not used in favor of delta f particle models. A practical and accurate nonlinear fluid closure for noncollisional plasmas seems not likely in the near future.

  4. Modeling and numerical simulations of the influenced Sznajd model

    Science.gov (United States)

    Karan, Farshad Salimi Naneh; Srinivasan, Aravinda Ramakrishnan; Chakraborty, Subhadeep

    2017-08-01

    This paper investigates the effects of independent nonconformists, or influencers, on the behavioral dynamics of a population of agents interacting with each other based on the Sznajd model. The system is modeled on a complete graph using the master equation. The resulting equation has been solved numerically. The accuracy of the mathematical model and its corresponding assumptions has been validated by numerical simulations. Regions of initial magnetization have been found from which the system converges to one of two unique steady-state PDFs, depending on the distribution of influencers. The scaling property and entropy of the stationary system in the presence of varying levels of influence are presented and discussed.
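    A hedged Monte Carlo sketch of the basic Sznajd rule on a complete graph, for orientation only: a randomly chosen agreeing pair convinces a third agent. The influencers and the master-equation treatment of the paper are not reproduced here, and all parameters are illustrative.

```python
import random

def sznajd_step(spins, rng):
    """One update: if a random pair agrees, a third random agent adopts
    their opinion; otherwise nothing happens."""
    i, j, k = rng.sample(range(len(spins)), 3)
    if spins[i] == spins[j]:
        spins[k] = spins[i]

rng = random.Random(0)
N = 1000
spins = [1] * 550 + [-1] * 450      # initial magnetization m0 = 0.1
rng.shuffle(spins)
for _ in range(200_000):
    sznajd_step(spins, rng)
# positive initial magnetization typically drifts toward +1 consensus,
# up to finite-size fluctuations
print("final magnetization:", sum(spins) / N)
```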

  5. Atmospheric Model Evaluation Tool for meteorological and air quality simulations

    Science.gov (United States)

    The Atmospheric Model Evaluation Tool compares model predictions to observed data from various meteorological and air quality observation networks to help evaluate meteorological and air quality simulations.

  6. Estimating radiation and temperature data for crop simulation model

    International Nuclear Information System (INIS)

    Ferrer, A.B.; Centeno, H.G.S.; Sheehy, J.E.

    1996-01-01

    Weather (radiation and temperature) and crop characteristics determine the potential production of an irrigated rice crop. Daily weather data are important inputs to ORYZA 1, an eco-physiological crop model. However, in most cases missing values occur, and sometimes daily weather data are not readily available. More than 20 years of historic daily weather data had been collected from six stations in the Philippines -- Albay, Butuan, Munoz, Batac, Aborlan, and Los Banos. Daily weather values were estimated by deriving long-term monthly means and (1) using the same value throughout each month, (2) linearly interpolating between months, and (3) using the SIMMETEO weather generator. A validated ORYZA 1 was run using actual daily weather data. The model was then run using weather data obtained from each estimation procedure, and the predicted yields from the different simulation runs were compared. The yields predicted using the different weather data sets for each site differed by as much as 20 percent. Among the three estimation procedures, the interpolated monthly mean values of weather data gave results comparable with those of model runs using actual weather data
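    A minimal sketch of estimation method (2), linear interpolation between long-term monthly means; the monthly mean values and the 30-day-month simplification are assumptions for illustration, not the paper's data.

```python
# Linear ramp from each month's long-term mean to the next month's mean.
# Hypothetical monthly mean temperatures (deg C), Jan..Dec.
monthly_mean = [18.2, 19.0, 21.5, 24.1, 26.3, 27.0,
                26.8, 26.5, 26.0, 24.2, 21.0, 18.8]

def interpolate_daily(means, days_per_month=30):
    """Return ~360 daily values interpolated between consecutive
    monthly means (December wraps around to January)."""
    daily = []
    for m in range(12):
        nxt = means[(m + 1) % 12]
        for d in range(days_per_month):
            frac = d / days_per_month
            daily.append(means[m] * (1 - frac) + nxt * frac)
    return daily

daily = interpolate_daily(monthly_mean)
print(len(daily), round(daily[45], 2))   # value in mid-February
```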

  7. Modelling and Simulation of Gas Engines Using Aspen HYSYS

    Directory of Open Access Journals (Sweden)

    M. C. Ekwonu

    2013-12-01

    Full Text Available In this paper a gas engine model was developed in Aspen HYSYS V7.3 and validated against the Waukesha 16V275GL+ gas engine. Fuel flexibility, fuel types and part-load performance of the gas engine were investigated. The design variability study revealed that the gas engine can operate on poor fuels with a low lower heating value (LHV), such as landfill gas, sewage gas and biogas, with biogas offering potential integration with bottoming cycles when compared to natural gas. The simulation of the gas engine gave an efficiency of 40.7% and a power output of 3592 kW.

  8. Design, modeling, and simulation of MEMS pressure sensors

    Science.gov (United States)

    Geca, Mateusz; Kociubiński, Andrzej

    2013-10-01

    This paper focuses on the design and analysis of a MEMS piezoresistive pressure sensor. The absolute pressure sensor with a 150 μm wide and 3 μm thick silicon membrane is modeled and simulated using CoventorWare™ software, profiting from a finite element method (FEM) implemented to determine the specific electro-mechanical parameter values characterizing the MEMS structure being designed. Optimization of the piezoresistor parameters has also been performed to determine the optimum dimensions of the piezoresistors and their location relative to the center of the pressure sensor diaphragm. The output voltage measured on a piezoresistive Wheatstone bridge has been obtained and compared for two different resistor materials, along with a linearity error analysis.
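    A hedged sketch of the piezoresistive Wheatstone-bridge readout implied by the abstract: the output voltage is the difference between the two half-bridge divider voltages. The resistance values are hypothetical.

```python
def bridge_output(v_in, r1, r2, r3, r4):
    """Vout for a bridge with arms r1..r4, where r1/r2 and r3/r4 form
    the two voltage dividers."""
    return v_in * (r2 / (r1 + r2) - r4 / (r3 + r4))

R0, dR = 1000.0, 2.5      # nominal resistance and stress-induced change
# full active bridge: opposite arms change in opposite directions,
# so Vout ~ V_in * dR / R0
print(bridge_output(5.0, R0 - dR, R0 + dR, R0 + dR, R0 - dR))
```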

  9. Economic value added model upon conditions of banking company

    Directory of Open Access Journals (Sweden)

    Vlasta Kašparovská

    2008-01-01

    Full Text Available The content of this article is the application of the economic value added (EVA) model under the conditions of a banking company. Due to the character of the banking business, which is reflected in a different structure of the balance sheet, it is not possible to use the standard EVA model for a banking company. The article first outlines the basic principles of the EVA model in a non-banking company. Basic dissimilarities specific to banking activity are then analysed, and a methodological adjustment of the model is derived so that it can be used for a banking company.
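    A minimal sketch of the standard (non-banking) EVA formula the article starts from, EVA = NOPAT - WACC x invested capital; the figures are hypothetical.

```python
def economic_value_added(nopat: float, wacc: float, invested_capital: float) -> float:
    """EVA: net operating profit after taxes minus the capital charge."""
    return nopat - wacc * invested_capital

# e.g. NOPAT of 120M, 9% cost of capital, 1.1B invested capital -> EVA = 21M
print(economic_value_added(nopat=120e6, wacc=0.09, invested_capital=1.1e9))
```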

  10. International Business Models Developed Through Brokerage Knowledge and Value Creation

    DEFF Research Database (Denmark)

    Petersen, Nicolaj Hannesbo; Rasmussen, Erik Stavnsager

    This paper highlights, theoretically and empirically, international business model decisions in networks with knowledge sharing and value creation. The paper expands the conceptual international business model framework for technology-oriented companies to include the focal firm’s network role and strategic fit in a global embeddedness. The brokerage role in the internationalization of a network is discussed from both a theoretical and an empirical point of view. From a business model and social network analysis perspective, this paper will show how firms and networks grow internationally through two

  11. Cultivating a disease management partnership: a value-chain model.

    Science.gov (United States)

    Murray, Carolyn F; Monroe, Wendy; Stalder, Sharon A

    2003-01-01

    Disease management (DM) is one of the health care industry's more innovative value-chain models, whereby multiple relationships are created to bring complex and time-sensitive services to market. The very nature of comprehensive, seamless DM provided through an outsourced arrangement necessitates a level of cooperation, trust, and synergy that may be lacking from more traditional vendor-customer relationships. This discussion highlights the experience of one health plan and its vendor partner and their approach to the development and delivery of an outsourced heart failure (HF) DM program. The program design and rollout are discussed within principles adapted from the theoretical framework of a value-chain model. Within the value-chain model, added value is created by the convergence and synergistic integration of the partners' discrete strengths. Although each partner brings unique attributes to the relationship, those attributes are significantly enhanced by the value-chain model, thus allowing each party to bring the added value of the relationship to their respective customers. This partnership increases innovation, leverages critical capabilities, and improves market responsiveness. Implementing a comprehensive, outsourced DM program is no small task. DM programs incorporate a broad array of services affecting nearly every department in a health plan's organization. When true seamless integration between multiple organizations with multiple stakeholders is the objective, implementation and ongoing operations can become even more complex. To effectively address the complexities presented by an HF DM program, the parties in this case moved beyond a typical purchaser-vendor relationship to one that is more closely akin to a strategic partnership. This discussion highlights the development of this partnership from the perspective of both organizations, as revealed through contracting and implementation activities. It is intended to provide insight into the program

  12. Geomechanical Simulation of Bayou Choctaw Strategic Petroleum Reserve - Model Calibration.

    Energy Technology Data Exchange (ETDEWEB)

    Park, Byoung [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-02-01

    A finite element numerical analysis model has been constructed that consists of a realistic mesh capturing the geometries of the Bayou Choctaw (BC) Strategic Petroleum Reserve (SPR) site and a multi-mechanism deformation (M-D) salt constitutive model, using daily data of actual wellhead pressure and the oil-brine interface. The salt creep rate is not uniform in the salt dome, and the creep test data for BC salt is limited. Therefore, model calibration is necessary to simulate the geomechanical behavior of the salt dome. The cavern volumetric closures of SPR caverns calculated from CAVEMAN are used for the field baseline measurement. The structure factor, A2, and transient strain limit factor, K0, in the M-D constitutive model are used for the calibration. The A2 value obtained experimentally from the BC salt and the K0 value of Waste Isolation Pilot Plant (WIPP) salt are used for the baseline values. To adjust the magnitudes of A2 and K0, multiplication factors A2F and K0F are defined, respectively. The A2F and K0F values of the salt dome and the salt drawdown skins surrounding each SPR cavern have been determined through a number of back-fitting analyses. The cavern volumetric closures calculated from this model correspond to the predictions from CAVEMAN for six SPR caverns. Therefore, this model is able to predict past and future geomechanical behaviors of the salt dome, caverns, caprock, and interbed layers. The geological concerns raised at the BC site will be explained using this model in a follow-up report.

  13. STAPOL: A Simulation of the Impact of Policy, Values, and Technological and Societal Developments upon the Quality of Life.

    Science.gov (United States)

    Little, Dennis; Feller, Richard

    The Institute for the Future has been conducting research in technological and societal forecasting, social indicators, value change, and simulation gaming. This paper describes an effort to bring together parts of that research into a simulation game ("State Policy," or STAPOL) for analysis of the impact of government policy, social values, and…

  14. Comparison of performance of simulation models for floor heating

    DEFF Research Database (Denmark)

    Weitzmann, Peter; Svendsen, Svend

    2005-01-01

    This paper describes the comparison of performance of simulation models for floor heating with different level of detail in the modelling process. The models are compared in an otherwise identical simulation model containing room model, walls, windows, ceiling and ventilation system. By exchanging...

  15. The heuristic value of redundancy models of aging.

    Science.gov (United States)

    Boonekamp, Jelle J; Briga, Michael; Verhulst, Simon

    2015-11-01

    Molecular studies of aging aim to unravel the cause(s) of aging bottom-up, but linking these mechanisms to organismal level processes remains a challenge. We propose that complementary top-down data-directed modelling of organismal level empirical findings may contribute to developing these links. To this end, we explore the heuristic value of redundancy models of aging to develop a deeper insight into the mechanisms causing variation in senescence and lifespan. We start by showing (i) how different redundancy model parameters affect projected aging and mortality, and (ii) how variation in redundancy model parameters relates to variation in parameters of the Gompertz equation. Lifestyle changes or medical interventions during life can modify mortality rate, and we investigate (iii) how interventions that change specific redundancy parameters within the model affect subsequent mortality and actuarial senescence. Lastly, as an example of data-directed modelling and the insights that can be gained from this, (iv) we fit a redundancy model to mortality patterns observed by Mair et al. (2003; Science 301: 1731-1733) in Drosophila that were subjected to dietary restriction and temperature manipulations. Mair et al. found that dietary restriction instantaneously reduced mortality rate without affecting aging, while temperature manipulations had more transient effects on mortality rate and did affect aging. We show that after adjusting model parameters the redundancy model describes both effects well, and a comparison of the parameter values yields a deeper insight in the mechanisms causing these contrasting effects. We see replacement of the redundancy model parameters by more detailed sub-models of these parameters as a next step in linking demographic patterns to underlying molecular mechanisms. Copyright © 2015 Elsevier Inc. All rights reserved.
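    As a rough illustration of the class of model discussed here, the following sketch simulates a generic redundancy model: each individual carries several redundant elements failing at a constant hazard and dies when the last element fails. The parameterization is invented, not the authors' fitted model; the point is that the resulting population mortality rate rises roughly exponentially with age (Gompertz-like) before plateauing.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_lifespans(n=100_000, blocks=5, rate=0.05):
    """Death occurs when the last of `blocks` redundant elements has failed;
    each element fails at constant hazard `rate` (all values illustrative)."""
    element_failures = rng.exponential(1.0 / rate, size=(n, blocks))
    return element_failures.max(axis=1)

ages = simulate_lifespans()
t = np.arange(0, 120)
alive = (ages[None, :] > t[:, None]).sum(axis=1)          # survivors past each age
mortality = (alive[:-1] - alive[1:]) / np.maximum(alive[:-1], 1)
# Mortality rises steeply at first, then plateaus near the element hazard `rate`.
print([round(float(mortality[a]), 4) for a in (10, 40, 80)])
```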

  16. A P-value model for theoretical power analysis and its applications in multiple testing procedures

    Directory of Open Access Journals (Sweden)

    Fengqing Zhang

    2016-10-01

    Full Text Available Abstract Background Power analysis is a critical aspect of the design of experiments to detect an effect of a given size. When multiple hypotheses are tested simultaneously, multiplicity adjustments to p-values should be taken into account in power analysis. There are a limited number of studies on power analysis in multiple testing procedures. For some methods, the theoretical analysis is difficult and extensive numerical simulations are often needed, while other methods oversimplify the information under the alternative hypothesis. To this end, this paper aims to develop a new statistical model for power analysis in multiple testing procedures. Methods We propose a step-function-based p-value model under the alternative hypothesis, which is simple enough to perform power analysis without simulations, but not so simple as to lose the information from the alternative hypothesis. The first step is to transform distributions of different test statistics (e.g., t, chi-square or F) to distributions of corresponding p-values. We then use a step function to approximate each p-value distribution by matching the mean and variance. Lastly, the step-function-based p-value model can be used for theoretical power analysis. Results The proposed model is applied to problems in multiple testing procedures. We first show how the most powerful critical constants can be chosen using the step-function-based p-value model. Our model is then applied to the field of multiple testing procedures to explain the assumption of monotonicity of the critical constants. Lastly, we apply our model to a behavioral weight loss and maintenance study to select the optimal critical constants. Conclusions The proposed model is easy to implement and preserves the information from the alternative hypothesis.
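    The construction can be illustrated for a one-sided z-test, where the p-value under the alternative has a known distribution that a piecewise-constant (step) density can approximate. The sketch below is simplified: it builds the step function from simulated p-values on a fixed grid rather than by the paper's analytical mean-and-variance matching, and all numbers are illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# One-sided z-test with true effect delta: Z ~ N(delta, 1), p = 1 - Phi(Z).
delta = 2.0
z = rng.normal(delta, 1.0, size=200_000)
p = 1.0 - stats.norm.cdf(z)
print("mean/var of p under H1:", round(p.mean(), 4), round(p.var(), 4))

# Crude piecewise-constant (step) surrogate for the p-value density on a
# fixed grid; the paper instead matches the mean and variance analytically.
edges = np.array([0.0, 0.01, 0.05, 0.25, 1.0])
mass, _ = np.histogram(p, bins=edges)
heights = mass / mass.sum() / np.diff(edges)
print("step heights:", np.round(heights, 2))

# Theoretical power at level alpha is P(p <= alpha) under the alternative.
alpha = 0.05
print("power at 0.05:", round(float((p <= alpha).mean()), 3))
```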

  17. An interval-valued reliability model with bounded failure rates

    DEFF Research Database (Denmark)

    Kozine, Igor; Krymsky, Victor

    2012-01-01

    The approach to deriving interval-valued reliability measures described in this paper is distinctive from other imprecise reliability models in that it overcomes the issue of having to impose an upper bound on time to failure. It rests on the presupposition that a constant interval-valued failure...... rate is known possibly along with other reliability measures, precise or imprecise. The Lagrange method is used to solve the constrained optimization problem to derive new reliability measures of interest. The obtained results call for an exponential-wise approximation of failure probability density...... function if only partial failure information is available. An example is provided. © 2012 Copyright Taylor and Francis Group, LLC....

  18. Tecnomatix Plant Simulation modeling and programming by means of examples

    CERN Document Server

    Bangsow, Steffen

    2015-01-01

    This book systematically introduces the development of simulation models as well as the implementation and evaluation of simulation experiments with Tecnomatix Plant Simulation. It addresses all users of Plant Simulation who have to handle more complex tasks, while also offering an easy entry into the program. Particular attention has been paid to introducing the simulation flow language SimTalk and its use in various areas of the simulation. The author demonstrates with over 200 examples how to combine the blocks for simulation models and how to deal with SimTalk for complex control and analysis

  19. [Value of laparoscopic virtual reality simulator in laparoscopic suture ability training of catechumen].

    Science.gov (United States)

    Cai, Jian-liang; Zhang, Yi; Sun, Guo-feng; Li, Ning-chen; Zhang, Xiang-hua; Na, Yan-qun

    2012-12-01

    To investigate the value of a laparoscopic virtual reality simulator in laparoscopic suture ability training of catechumen. After finishing the virtual reality training of basic laparoscopic skills, 26 catechumen were divided randomly into 2 groups: one group undertook advanced laparoscopic skill (suture technique) training with a laparoscopic virtual reality simulator (virtual group), the other used a laparoscopic box trainer (box group). Using our homemade simulations, before grouping and after training, every trainee performed nephropyeloureterostomy under laparoscopy; the running time, anastomosis quality and proficiency were recorded and assessed. For the virtual group, the running time, anastomosis quality and proficiency scores before grouping were (98 ± 11) minutes, 3.20 ± 0.41 and 3.47 ± 0.64, respectively, and after training were (53 ± 8) minutes, 6.87 ± 0.74 and 6.33 ± 0.82, respectively; all the differences were statistically significant (all P < 0.05). For the box group, the corresponding scores after training were (52 ± 9) minutes, 6.08 ± 0.90 and 6.33 ± 0.78, respectively, and all the differences also were statistically significant (all P < 0.05). After training, the running time and proficiency scores of the virtual group were similar to the box group (all P > 0.05); however, anastomosis quality scores in the virtual group were higher than in the box group (P = 0.02). The laparoscopic virtual reality simulator is better than the traditional box trainer in advanced laparoscopic suture ability training of catechumen.

  20. Nonlinear distortion in wireless systems modeling and simulation with Matlab

    CERN Document Server

    Gharaibeh, Khaled M

    2011-01-01

    This book covers the principles of modeling and simulation of nonlinear distortion in wireless communication systems with MATLAB simulations and techniques In this book, the author describes the principles of modeling and simulation of nonlinear distortion in single and multichannel wireless communication systems using both deterministic and stochastic signals. Models and simulation methods of nonlinear amplifiers explain in detail how to analyze and evaluate the performance of data communication links under nonlinear amplification. The book addresses the analysis of nonlinear systems

  1. [Homeostasis model assessment (HOMA) values in Chilean elderly subjects].

    Science.gov (United States)

    Garmendia, María Luisa; Lera, Lydia; Sánchez, Hugo; Uauy, Ricardo; Albala, Cecilia

    2009-11-01

    The homeostasis assessment model for insulin resistance (HOMA-IR) estimates insulin resistance using basal insulin and glucose values and has a good concordance with values obtained with the euglycemic clamp. However, it has a high variability that depends on environmental, genetic and physiologic factors. Therefore it is imperative to establish normal HOMA values in different populations. To report HOMA-IR values in Chilean elderly subjects and to determine the best cutoff point to diagnose insulin resistance. Cross sectional study of 1003 subjects older than 60 years, of whom 803 (71% women) did not have diabetes. In 154 subjects, an oral glucose tolerance test was also performed. Insulin resistance (IR) was defined as the HOMA value corresponding to percentile 75 of subjects without over- or underweight. The behavior of HOMA-IR in metabolic syndrome was studied and receiver operating characteristic (ROC) curves were calculated, using glucose intolerance, defined as a blood glucose over 140 mg/dl, and hyperinsulinemia, defined as a serum insulin over 60 microU/ml, two hours after the glucose load. The median HOMA-IR value was 1.7. Percentile 75 in subjects without obesity or underweight was 2.57. The area under the ROC curve, when comparing HOMA-IR with glucose intolerance and hyperinsulinemia, was 0.8 (95% confidence interval 0.72-0.87), with HOMA-IR values ranging from 2.04 to 2.33. HOMA-IR is a useful method to determine insulin resistance in epidemiological studies. The HOMA-IR cutoff point for insulin resistance defined in this population was 2.6.
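    The underlying index is simple to compute from fasting values; a minimal sketch using the standard HOMA-IR formula and the 2.6 cutoff reported in this study (the subject's values are hypothetical):

```python
def homa_ir(glucose_mg_dl: float, insulin_uU_ml: float) -> float:
    """HOMA-IR from fasting values; 405 converts mg/dL (use 22.5 for mmol/L)."""
    return glucose_mg_dl * insulin_uU_ml / 405.0

# Hypothetical subject; values above the study's cutoff of 2.6 would be
# classified as insulin resistant.
value = homa_ir(glucose_mg_dl=95.0, insulin_uU_ml=12.0)
print(round(value, 2), value > 2.6)
```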

  2. Modeling human response errors in synthetic flight simulator domain

    Science.gov (United States)

    Ntuen, Celestine A.

    1992-01-01

    This paper presents a control theoretic approach to modeling human response errors (HRE) in the flight simulation domain. The human pilot is modeled as a supervisor of a highly automated system. The synthesis uses the theory of optimal control pilot modeling for integrating the pilot's observation error and the error due to the simulation model (experimental error). Methods for solving the HRE problem are suggested. Experimental verification of the models will be tested in a flight quality handling simulation.

  3. Multiple Time Series Ising Model for Financial Market Simulations

    International Nuclear Information System (INIS)

    Takaishi, Tetsuya

    2015-01-01

    In this paper we propose an Ising model which simulates multiple financial time series. Our model introduces the interaction which couples to spins of other systems. Simulations from our model show that time series exhibit the volatility clustering that is often observed in the real financial markets. Furthermore we also find non-zero cross correlations between the volatilities from our model. Thus our model can simulate stock markets where volatilities of stocks are mutually correlated
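    A schematic of the coupling idea, not the paper's exact Hamiltonian or update rule: two spin systems in which each spin feels its own market's magnetization plus a term coupling it to the other market's spins, updated by a mean-field heat bath. All parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)
N, T = 200, 3000          # spins per market, time steps
J, K = 1.0, 0.4           # within-market and cross-market couplings

s = rng.choice([-1, 1], size=(2, N)).astype(float)
mag = np.zeros((2, T))

for t in range(T):
    for m in (0, 1):
        # mean-field heat-bath update: each spin feels its own market's
        # magnetization plus a coupling to the other market's spins
        h = J * s[m].mean() + K * s[1 - m].mean()
        prob_up = 1.0 / (1.0 + np.exp(-2.0 * h))
        s[m] = np.where(rng.random(N) < prob_up, 1.0, -1.0)
        mag[m, t] = s[m].mean()

# cross-correlation between the two markets' "volatilities" |magnetization|
print(np.corrcoef(np.abs(mag[0]), np.abs(mag[1]))[0, 1])
```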

  4. Modeling and Simulation Techniques for Large-Scale Communications Modeling

    National Research Council Canada - National Science Library

    Webb, Steve

    1997-01-01

    .... Tests of random number generators were also developed and applied to CECOM models. It was found that synchronization of random number strings in simulations is easy to implement and can provide significant savings for making comparative studies. If synchronization is in place, then statistical experiment design can be used to provide information on the sensitivity of the output to input parameters. The report concludes with recommendations and an implementation plan.

  5. Simulation Model for Foreign Trade During the Crisis in Romania

    Directory of Open Access Journals (Sweden)

    Mirela Diaconescu

    2014-12-01

    Full Text Available The paper proposes to analyze the evolution of foreign trade during the crisis in Romania. The evolution of foreign trade is analyzed using a simulation model. The period of analysis is 2006-2014. The data sources are Eurostat and the National Bank of Romania. Based on these data, we also propose an econometric model that can be developed using different scenarios and forecasts of the evolution of foreign trade. In periods of economic recession, protectionist sentiment against imports competing with domestic products tends to rise. The same phenomenon was manifested in Romania, and our study started from this consideration. Using the econometric model we made scenario predictions, and the results are similar to the real values.

  6. Modeling of Pathogen Survival during Simulated Gastric Digestion ▿

    Science.gov (United States)

    Koseki, Shige; Mizuno, Yasuko; Sotome, Itaru

    2011-01-01

    The objective of the present study was to develop a mathematical model of pathogenic bacterial inactivation kinetics in a gastric environment in order to further understand a part of the infectious dose-response mechanism. The major bacterial pathogens Listeria monocytogenes, Escherichia coli O157:H7, and Salmonella spp. were examined by using simulated gastric fluid adjusted to various pH values. To correspond to the various pHs in a stomach during digestion, a modified logistic differential equation model and the Weibull differential equation model were examined. The specific inactivation rate for each pathogen was successfully described by a square-root model as a function of pH. The square-root models were combined with the modified logistic differential equation to obtain a complete inactivation curve. Both the modified logistic and Weibull models provided a highly accurate fitting of the static pH conditions for every pathogen. However, while the residuals plots of the modified logistic model indicated no systematic bias and/or regional prediction problems, the residuals plots of the Weibull model showed a systematic bias. The modified logistic model appropriately predicted the pathogen behavior in the simulated gastric digestion process with actual food, including cut lettuce, minced tuna, hamburger, and scrambled egg. Although the developed model enabled us to predict pathogen inactivation during gastric digestion, its results also suggested that the ingested bacteria in the stomach would barely be inactivated in the real digestion process. The results of this study will provide important information on a part of the dose-response mechanism of bacterial pathogens. PMID:21131530
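    The structure of the model can be sketched as a first-order inactivation ODE whose rate constant depends on pH through a square-root secondary model. The coefficients below are illustrative, not the paper's fitted values, and the shoulder term of the modified logistic model is omitted.

```python
import numpy as np

def k_of_ph(ph, b=0.08, ph_crit=4.0):
    # Square-root secondary model: sqrt(k) decreases linearly with pH and is
    # zero above ph_crit (b and ph_crit are illustrative, not fitted values).
    root = b * (ph_crit - ph)
    return max(root, 0.0) ** 2

def log_survivors(ph_profile_per_min, log_n0=6.0):
    # First-order inactivation dN/dt = -k(pH(t)) N, Euler-integrated at 1-min
    # steps; the paper's modified logistic ODE adds a shoulder, omitted here.
    n = 10.0 ** log_n0
    for ph in ph_profile_per_min:
        n *= 1.0 - min(k_of_ph(ph), 0.99)
    return np.log10(n)

# Gastric pH falling from 5.5 to 2.0 over a 120-min simulated digestion:
# the modest log reduction echoes the paper's finding that ingested bacteria
# are barely inactivated during real digestion.
profile = np.linspace(5.5, 2.0, 120)
print(round(float(log_survivors(profile)), 2))
```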

  8. Modelling and Simulation of Search Engine

    Science.gov (United States)

    Nasution, Mahyuddin K. M.

    2017-01-01

    The best tool currently used to access information is a search engine, while the information space has its own behaviour. An information space therefore needs to be described mathematically so that we can easily identify the characteristics associated with it. This paper reveals some characteristics of search engines based on a model of a document collection and estimates their impact on the feasibility of information. We derive characteristics of search engines from a lemma and a theorem about singletons and doubletons, and then compute statistics that simulate the use of a search engine, in this case Google and Yahoo. There are differences in the behaviour of the two search engines, although in theory both are based on the concept of a document collection.

  9. Monte Carlo simulation of Markov unreliability models

    International Nuclear Information System (INIS)

    Lewis, E.E.; Boehm, F.

    1984-01-01

    A Monte Carlo method is formulated for the evaluation of the unreliability of complex systems with known component failure and repair rates. The formulation is in terms of a Markov process, allowing dependences between components to be modeled and computational efficiencies to be achieved in the Monte Carlo simulation. Two variance reduction techniques, forced transition and failure biasing, are employed to increase the computational efficiency of the random walk procedure. For an example problem these result in improved computational efficiency by more than three orders of magnitude over analog Monte Carlo. The method is generalized to treat problems with distributed failure and repair rate data, and a batching technique is introduced and shown to result in substantial increases in computational efficiency for an example problem. A method for separating the variance due to the data uncertainty from that due to the finite number of random walks is presented. (orig.)
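    An analog version of such a random walk, for a two-component parallel system with constant failure and repair rates (the rates and mission time are illustrative; the paper's forced-transition and failure-biasing variance-reduction devices are omitted here):

```python
import numpy as np

rng = np.random.default_rng(7)

LAM, MU, T_MISS = 1e-3, 1e-1, 1000.0   # failure rate, repair rate, mission time

def analog_walk():
    """One random walk of a two-component parallel system; returns True if
    the system (both components down) fails before the mission time."""
    t, up = 0.0, [True, True]
    while t < T_MISS:
        rates = [LAM if u else MU for u in up]   # per-component transition rates
        total = sum(rates)
        t += rng.exponential(1.0 / total)
        if t >= T_MISS:
            return False
        i = rng.choice(2, p=[r / total for r in rates])
        up[i] = not up[i]
        if not any(up):
            return True
    return False

n = 20_000
fails = sum(analog_walk() for _ in range(n))
print("unreliability ~", fails / n)   # analog estimate; variance reduction omitted
```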

  10. Computational Modeling and Simulation of Developmental ...

    Science.gov (United States)

    Standard practice for assessing developmental toxicity is the observation of apical endpoints (intrauterine death, fetal growth retardation, structural malformations) in pregnant rats/rabbits following exposure during organogenesis. EPA’s computational toxicology research program (ToxCast) generated vast in vitro cellular and molecular effects data on >1858 chemicals in >600 high-throughput screening (HTS) assays. The diversity of assays has been increased for developmental toxicity with several HTS platforms, including the devTOX-quickPredict assay from Stemina Biomarker Discovery utilizing the human embryonic stem cell line (H9). Translating these HTS data into higher order-predictions of developmental toxicity is a significant challenge. Here, we address the application of computational systems models that recapitulate the kinematics of dynamical cell signaling networks (e.g., SHH, FGF, BMP, retinoids) in a CompuCell3D.org modeling environment. Examples include angiogenesis (angiodysplasia) and dysmorphogenesis. Being numerically responsive to perturbation, these models are amenable to data integration for systems Toxicology and Adverse Outcome Pathways (AOPs). The AOP simulation outputs predict potential phenotypes based on the in vitro HTS data ToxCast. A heuristic computational intelligence framework that recapitulates the kinematics of dynamical cell signaling networks in the embryo, together with the in vitro profiling data, produce quantitative predic

  11. Comparing Productivity Simulated with Inventory Data Using Different Modelling Technologies

    Science.gov (United States)

    Klopf, M.; Pietsch, S. A.; Hasenauer, H.

    2009-04-01

    The Lime Stone National Park in Austria was established in 1997 to protect sensitive limestone soils from degradation due to heavy forest management. Since 1997 the management activities have been successively reduced; standing volume and coarse woody debris (CWD) increased and degraded soils began to recover. One option to study the rehabilitation process towards a natural virgin forest state is the use of modelling technology. In this study we test two different modelling approaches for their applicability to the Lime Stone National Park. We compare the standing tree volume simulated by (i) the individual tree growth model MOSES, and (ii) the species- and management-sensitive adaptation of the biogeochemical-mechanistic model Biome-BGC. The results from the two models are compared with field observations from repeated permanent forest inventory plots of the Lime Stone National Park in Austria. The simulated CWD predictions of the BGC model were compared with dead wood measurements (standing and lying dead wood) recorded at the permanent inventory plots. The inventory was established between 1994 and 1996 and remeasured from 2004 to 2005. For this analysis 40 plots of this inventory were selected which comprise the required dead wood components and are dominated by a single tree species. First we used the distance-dependent individual tree growth model MOSES to derive the standing timber and the amount of mortality per hectare. MOSES is initialized with the inventory data at plot establishment and each sampling plot is treated as a forest stand. Biome-BGC is a process-based biogeochemical model with extensions for Austrian tree species, a self-initialization and a forest management tool. The initialization for the actual simulations with the BGC model was done as follows: We first used spin-up runs to derive a balanced forest vegetation, similar to an undisturbed forest. Next we considered the management history of the past centuries (heavy clear cuts

  12. Using Computational Simulations to Confront Students' Mental Models

    Science.gov (United States)

    Rodrigues, R.; Carvalho, P. Simeão

    2014-01-01

    In this paper we show an example of how to use a computational simulation to obtain visual feedback for students' mental models, and compare their predictions with the simulated system's behaviour. Additionally, we use the computational simulation to incrementally modify the students' mental models in order to accommodate new data,…

  13. Requirements for Logical Models for Value-Added Tax Legislation

    DEFF Research Database (Denmark)

    Nielsen, Morten Ib; Simonsen, Jakob Grue; Larsen, Ken Friis

    Enterprise resource planning (ERP) systems are ubiquitous in commercial enterprises of all sizes and invariably need to account for the notion of value-added tax (VAT). The legal and technical difficulties in handling VAT are exacerbated by spanning a broad and chaotic spectrum of intricate country...... of the Danish VAT law in Web Ontology Language (OWL) and in Con¿git Product Modeling Language (CPML)....

  14. Representation of Solar Capacity Value in the ReEDS Capacity Expansion Model

    Energy Technology Data Exchange (ETDEWEB)

    Sigrin, B. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Sullivan, P. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Ibanez, E. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Margolis, R. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2014-03-01

    An important issue for electricity system operators is the estimation of renewables' capacity contributions to reliably meeting system demand, or their capacity value. While the capacity value of thermal generation can be estimated easily, assessment of wind and solar requires a more nuanced approach due to the resource variability. Reliability-based methods, particularly assessment of the Effective Load-Carrying Capacity, are considered to be the most robust and widely-accepted techniques for addressing this resource variability. This report compares estimates of solar PV capacity value by the Regional Energy Deployment System (ReEDS) capacity expansion model against two sources. The first comparison is against values published by utilities or other entities for known electrical systems at existing solar penetration levels. The second comparison is against a time-series ELCC simulation tool for high renewable penetration scenarios in the Western Interconnection. Results from the ReEDS model are found to compare well with both comparisons, despite being resolved at a super-hourly temporal resolution. Two results are relevant for other capacity-based models that use a super-hourly resolution to model solar capacity value. First, solar capacity value should not be parameterized as a static value, but must decay with increasing penetration. This is because -- for an afternoon-peaking system -- as solar penetration increases, the system's peak net load shifts to later in the day -- when solar output is lower. Second, long-term planning models should determine system adequacy requirements in each time period in order to approximate LOLP calculations. Within the ReEDS model we resolve these issues by using a capacity value estimate that varies by time-slice. Within each time period the net load and shadow price on ReEDS's planning reserve constraint signals the relative importance of additional firm capacity.
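    The decay of solar capacity value with penetration can be reproduced with a much cruder proxy than ReEDS's reliability-based treatment: score PV by its average output during the top net-load hours of a synthetic afternoon-peaking load. Everything below (load shape, PV shape, sizes) is illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
hours = np.arange(8760)
# Synthetic afternoon-peaking load (peak ~15:00) and a clear-sky-like PV shape.
load = 800 + 200 * np.sin(2 * np.pi * ((hours % 24) - 9) / 24) + rng.normal(0, 30, hours.size)
solar = np.clip(np.sin(2 * np.pi * ((hours % 24) - 6) / 24), 0.0, None)

def capacity_credit(pv_mw, top_hours=100):
    """Mean PV output over the top net-load hours -- a crude stand-in for ELCC."""
    net = load - pv_mw * solar
    top = np.argsort(net)[-top_hours:]
    return solar[top].mean()

# As penetration grows, the net-load peak shifts into the evening and the
# marginal capacity credit of PV falls, as the report describes.
for pv in (50, 200, 500, 1000):        # increasing penetration (MW, illustrative)
    print(pv, round(float(capacity_credit(pv)), 3))
```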

  15. The Deficit Model and the Forgotten Moral Values

    Directory of Open Access Journals (Sweden)

    Marko Ahteensuu

    2011-03-01

    Full Text Available This paper was presented at the first meeting of the NSU study group “Conceptions of ethical and social values in post-secular society: Towards a new ethical imagination in a cosmopolitan world society”, held on January 28-30, 2011 at Copenhagen Business School. The deficit model explains the general public’s negative attitudes towards science and/or certain scientific applications with the public’s scientific ignorance. The deficit model is commonly criticized for oversimplifying the connection between scientific knowledge and attitudes. Other relevant factors – such as ideology, social identity, trust, culture, and worldviews – should be taken into consideration to a greater extent. We argue that explanations based on the proposed factors sometimes implicitly reintroduce the deficit model type of thinking. The strength of the factors is that they broaden the explanations to concern moral issues. We analyse two central argument types of GMO discussion, and show the central role of moral values in them. Thus, as long as arguments are seen to affect the attitudes of the general public, the role of moral values should be made explicit in the explanations concerning their attitudes.

  16. AskIT Service Desk Support Value Model

    Energy Technology Data Exchange (ETDEWEB)

    Ashcraft, Phillip Lynn [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Cummings, Susan M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Fogle, Blythe G. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Valdez, Christopher D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-08-07

    The value model discussed herein provides an accurate and simple calculation of the funding required to adequately staff the AskIT Service Desk (SD).  The model is incremental – only technical labor cost is considered.  All other costs, such as management, equipment, buildings, HVAC, and training are considered common elements of providing any labor related IT Service. Depending on the amount of productivity loss and the number of hours the defect was unresolved, the value of resolving work from the SD is unquestionably an economic winner; the average cost of $16 per SD resolution can commonly translate to cost avoidance exceeding well over $100. Attempting to extract too much from the SD will likely create a significant downside. The analysis used to develop the value model indicates that the utilization of the SD is very high (approximately 90%).  As a benchmark, consider a comment from a manager at Vitalyst (a commercial IT service desk) that their utilization target is approximately 60%.  While high SD utilization is impressive, over the long term it is likely to cause unwanted consequences to staff such as higher turnover, illness, or burnout.  A better solution is to staff the SD so that analysts have time to improve skills through training, develop knowledge, improve processes, collaborate with peers, and improve customer relationship skills.

  17. Conceptual Model of Quantities, Units, Dimensions, and Values

    Science.gov (United States)

    Rouquette, Nicolas F.; DeKoenig, Hans-Peter; Burkhart, Roger; Espinoza, Huascar

    2011-01-01

    JPL collaborated with experts from industry and other organizations to develop a conceptual model of quantities, units, dimensions, and values based on the current work of the ISO 80000 committee revising the International System of Units & Quantities based on the International Vocabulary of Metrology (VIM). By providing support for ISO 80000 in SysML via the International Vocabulary of Metrology (VIM), this conceptual model provides, for the first time, a standards-based approach for bringing unit coherence and dimensional analysis into the practice of systems engineering with SysML-based tools. This conceptual model provides support for two kinds of analyses specified in the International Vocabulary of Metrology (VIM): coherence of units as well as of systems of units, and dimension analysis of systems of quantities. To provide a solid and stable foundation, the model for defining quantities, units, dimensions, and values in SysML is explicitly based on the concepts defined in VIM. At the same time, the model library is designed in such a way that extensions to the ISQ (International System of Quantities) and SI Units (Système International d'Unités) can be represented, as well as any alternative systems of quantities and units. The model library can be used to support SysML user models in various ways. A simple approach is to define and document libraries of reusable systems of units and quantities for reuse across multiple projects, and to link units and quantity kinds from these libraries to Unit and QuantityKind stereotypes defined in SysML user models.

  18. Confirming the Value of Swimming-Performance Models for Adolescents.

    Science.gov (United States)

    Dormehl, Shilo J; Robertson, Samuel J; Barker, Alan R; Williams, Craig A

    2017-10-01

    To evaluate the efficacy of existing performance models to assess the progression of male and female adolescent swimmers through a quantitative and qualitative mixed-methods approach. Fourteen published models were tested using retrospective data from an independent sample of Dutch junior national-level swimmers from when they were 12-18 y of age (n = 13). The degree of association by Pearson correlations was compared between the calculated differences from the models and quadratic functions derived from the Dutch junior national qualifying times. Swimmers were grouped based on their differences from the models and compared with their swimming histories that were extracted from questionnaires and follow-up interviews. Correlations of the deviations from both the models and quadratic functions derived from the Dutch qualifying times were all significant except for the 100-m breaststroke and butterfly and the 200-m freestyle for females (P < .05). Deviations for the 100-m backstroke for males and 200-m freestyle for males and females were almost directly proportional. In general, deviations from the models were accounted for by the swimmers' training histories. Higher levels of retrospective motivation appeared to be synonymous with higher-level career performance. This mixed-methods approach helped confirm the validity of the models that were found to be applicable to adolescent swimmers at all levels, allowing coaches to track performance and set goals. The value of the models in being able to account for the expected performance gains during adolescence enables quantification of peripheral factors that could affect performance.

  19. A Geostationary Earth Orbit Satellite Model Using Easy Java Simulation

    Science.gov (United States)

    Wee, Loo Kang; Goh, Giam Hwee

    2013-01-01

    We develop an Easy Java Simulation (EJS) model for students to visualize geostationary orbits near Earth, modelled using a Java 3D implementation of the EJS 3D library. The simplified physics model is described and simulated using a simple constant angular velocity equation. We discuss four computer model design ideas: (1) a simple and realistic…

  20. An improved multi-value cellular automata model for heterogeneous bicycle traffic flow

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Sheng [College of Civil Engineering and Architecture, Zhejiang University, Hangzhou, 310058 China (China); Qu, Xiaobo [Griffith School of Engineering, Griffith University, Gold Coast, 4222 Australia (Australia); Xu, Cheng [Department of Transportation Management Engineering, Zhejiang Police College, Hangzhou, 310053 China (China); College of Transportation, Jilin University, Changchun, 130022 China (China); Ma, Dongfang, E-mail: mdf2004@zju.edu.cn [Ocean College, Zhejiang University, Hangzhou, 310058 China (China); Wang, Dianhai [College of Civil Engineering and Architecture, Zhejiang University, Hangzhou, 310058 China (China)

    2015-10-16

    This letter develops an improved multi-value cellular automata model for heterogeneous bicycle traffic flow taking the higher maximum speed of electric bicycles into consideration. The update rules of both regular and electric bicycles are improved, with maximum speeds of two and three cells per second respectively. Numerical simulation results for deterministic and stochastic cases are obtained. The fundamental diagrams and multiple states effects under different model parameters are analyzed and discussed. Field observations were made to calibrate the slowdown probabilities. The results imply that the improved extended Burgers cellular automata (IEBCA) model is more consistent with the field observations than previous models and greatly enhances the realism of the bicycle traffic model. - Highlights: • We proposed an improved multi-value CA model with higher maximum speed. • Update rules are introduced for heterogeneous bicycle traffic with maximum speed 2 and 3 cells/s. • Simulation results of the proposed model are consistent with field bicycle data. • Slowdown probabilities of both regular and electric bicycles are calibrated.
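    A single-occupancy, Nagel-Schreckenberg-style simplification of the heterogeneous update rules (the letter's multi-value Burgers CA allows several bicycles per cell; the speeds of 2 and 3 cells/s and the slowdown probability follow the letter, the rest is illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)
L, N, P_SLOW = 200, 60, 0.2                  # ring cells, bicycles, slowdown prob.
vmax = np.where(rng.random(N) < 0.3, 3, 2)   # electric: 3 cells/s, regular: 2

pos = np.sort(rng.choice(L, N, replace=False)).astype(int)  # unwrapped coordinates
v = np.zeros(N, dtype=int)
speeds = []

for _ in range(1000):
    gaps = np.empty(N, dtype=int)
    gaps[:-1] = pos[1:] - pos[:-1] - 1
    gaps[-1] = pos[0] + L - pos[-1] - 1      # last bike's leader is the first, one lap on
    v = np.minimum(v + 1, vmax)              # accelerate toward the type-specific maximum
    v = np.minimum(v, gaps)                  # no passing: cap speed at the gap ahead
    slow = (rng.random(N) < P_SLOW) & (v > 0)
    v[slow] -= 1                             # stochastic slowdown
    pos += v
    speeds.append(v.mean())

print("mean speed (cells/s):", round(float(np.mean(speeds[200:])), 3))
```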

  2. Using a simulation assistant in modeling manufacturing systems

    Science.gov (United States)

    Schroer, Bernard J.; Tseng, Fan T.; Zhang, S. X.; Wolfsberger, John W.

    1988-01-01

    Numerous simulation languages exist for modeling discrete event processes, and are now ported to microcomputers. Graphic and animation capabilities were added to many of these languages to assist users in building models and evaluating the simulation results. With all these languages and added features, the user is still plagued with learning the simulation language. Furthermore, the time to construct and then to validate the simulation model is always greater than originally anticipated. One approach to minimizing the time requirement is to use pre-defined macros that describe various common processes or operations in a system. The development of a simulation assistant for modeling discrete event manufacturing processes is presented. A simulation assistant is defined as an interactive intelligent software tool that assists the modeler in writing a simulation program by translating the modeler's symbolic description of the problem and then automatically generating the corresponding simulation code. The simulation assistant is discussed with emphasis on an overview of the simulation assistant, the elements of the assistant, and the five manufacturing simulation generators. A typical manufacturing system will be modeled using the simulation assistant and the advantages and disadvantages discussed.

  3. Development of a Generic Didactic Model for Simulator Training

    National Research Council Canada - National Science Library

    Emmerik, M

    1997-01-01

    .... The development of such a model is motivated by the need to control training and instruction factors in research on simulator fidelity, the need to assess the benefit of training simulators, e.g...

  4. Modeling and Simulation in Healthcare Future Directions

    Science.gov (United States)

    2010-07-13

    [Presentation slide fragments; the recoverable argument: computers acquire and analyze information, the Internet distributes and communicates it, and simulation — the "third leg" of the information age — predicts, plans, and trains (Satava, 1999). Remaining fragments note that current evidence is inadequate for topics such as event horizons, cognition, the genome, quantum mechanics, and memes.]

  5. Four Models of In Situ Simulation

    DEFF Research Database (Denmark)

    Musaeus, Peter; Krogh, Kristian; Paltved, Charlotte

    2014-01-01

    Introduction: In situ simulation is characterized by being situated in the clinical environment as opposed to the simulation laboratory, but it bears a family resemblance to other types of on-the-job training. We explore a typology of in situ simulation and suggest that there are four fruitful approaches to in situ simulation: (1) in situ simulation informed by reported critical incidents and adverse events from emergency departments (ED) in which team training is about to be conducted, used to write scenarios; (2) in situ simulation through ethnographic studies at the ED; (3) using...... to team intervention and philosophies informing what good situated learning research is. This study generates system knowledge that might inform scenario development for in situ simulation.

  6. Value of debriefing during simulated crisis management: oral versus video-assisted oral feedback.

    Science.gov (United States)

    Savoldelli, Georges L; Naik, Viren N; Park, Jason; Joo, Hwan S; Chow, Roger; Hamstra, Stanley J

    2006-08-01

    The debriefing process during simulation-based education has been poorly studied despite its educational importance. Videotape feedback is an adjunct that may enhance the impact of the debriefing and in turn maximize learning. The purpose of this study was to investigate the value of the debriefing process during simulation and to compare the educational efficacy of two types of feedback, oral feedback and videotape-assisted oral feedback, against control (no debriefing). Forty-two anesthesia residents were enrolled in the study. After completing a pretest scenario, participants were randomly assigned to receive no debriefing, oral feedback, or videotape-assisted oral feedback. The debriefing focused on nontechnical skills performance guided by crisis resource management principles. Participants were then required to manage a posttest scenario. The videotapes of all performances were later reviewed by two blinded independent assessors who rated participants' nontechnical skills using a validated scoring system. Participants' nontechnical skills did not improve in the control group, whereas the provision of oral feedback, either assisted or not assisted with videotape review, resulted in significant improvement (P < 0.005). Exposure to a simulated crisis without constructive debriefing by instructors offers little benefit to trainees. The addition of video review did not offer any advantage over oral feedback alone. Valuable simulation training can therefore be achieved even when video technology is not available.

  7. Identifying added value in high-resolution climate simulations over Scandinavia

    DEFF Research Database (Denmark)

    Mayer, Stephania; Fox Maule, Cathrine; Sobolowski, Stefan

    2015-01-01

    High-resolution data are needed in order to assess potential impacts of extreme events on infrastructure in the mid-latitudes. Dynamical downscaling offers one way to obtain this information. However, prior to implementation in any impacts assessment scheme, model output must be validated......-based station observations. In addition to the canonical variables of daily precipitation and temperature, winds were also investigated. The models exhibit systematic cold and wet biases on seasonal time scales (−1 K and +50–100%, respectively). However, frequency-based skill scores for daily precipitation...... and temperature are high, indicating that the distributions of these variables are generally well captured. Wind speeds over the North and Norwegian Seas were simulated more realistically in the models than in the ERA interim reanalysis. However, most importantly, for impacts assessments, the models should...

  8. Simulation Models for Socioeconomic Inequalities in Health: A Systematic Review

    Directory of Open Access Journals (Sweden)

    Niko Speybroeck

    2013-11-01

    Full Text Available Background: The emergence and evolution of socioeconomic inequalities in health involves multiple factors interacting with each other at different levels. Simulation models are suitable for studying such complex and dynamic systems and have the ability to test the impact of policy interventions in silico. Objective: To explore how simulation models were used in the field of socioeconomic inequalities in health. Methods: An electronic search of studies assessing socioeconomic inequalities in health using a simulation model was conducted. Characteristics of the simulation models were extracted and distinct simulation approaches were identified. As an illustration, a simple agent-based model of the emergence of socioeconomic differences in alcohol abuse was developed. Results: We found 61 studies published between 1989 and 2013. Ten different simulation approaches were identified. The agent-based model illustration showed that multilevel, reciprocal and indirect effects of social determinants on health can be modeled flexibly. Discussion and Conclusions: Based on the review, we discuss the utility of using simulation models for studying health inequalities, and refer to good modeling practices for developing such models. The review and the simulation model example suggest that the use of simulation models may enhance the understanding and debate about existing and new socioeconomic inequalities of health frameworks.
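    To give a flavor of the agent-based illustration described here (our own minimal re-sketch, not the authors' model): agents carry a socioeconomic status that shifts both their baseline risk of alcohol abuse and their chance of recovery, while randomly sampled peers add a reciprocal social influence. All rates below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(11)
N, STEPS = 1000, 50
ses = rng.random(N) < 0.5          # True = high socioeconomic status
abuse = rng.random(N) < 0.10       # initial prevalence, equal across groups

for _ in range(STEPS):
    peers = rng.integers(0, N, N)  # random mixing; richer ABMs use networks
    peer_effect = abuse[peers] * 0.05
    base = np.where(ses, 0.01, 0.03)       # SES-dependent baseline risk
    recover = np.where(ses, 0.10, 0.05)    # SES-dependent recovery chance
    start = rng.random(N) < (base + peer_effect)
    stop = rng.random(N) < recover
    abuse = (abuse & ~stop) | (~abuse & start)

# An SES gradient in prevalence emerges from the interacting mechanisms.
print("prevalence low SES:", abuse[~ses].mean(), " high SES:", abuse[ses].mean())
```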

  9. Simulation Model developed for a Small-Scale PV-System in a Distribution Network

    DEFF Research Database (Denmark)

    Koch-Ciobotaru, C.; Mihet-Popa, Lucian; Isleifsson, Fridrik Rafn

    2012-01-01

    This paper presents a PV panel simulation model using the single-diode four-parameter model based on data sheet values. The model was implemented first in MATLAB/Simulink, and the results have been compared with the data sheet values and characteristics of the PV panels in standard test conditions...... and implemented in PowerFactory to study load flow, steady-state voltage stability and dynamic behavior of a distributed power system....
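    The single-diode four-parameter model referred to here determines current implicitly from I = I_L − I_0(exp((V + I·R_s)/a) − 1); a sketch that solves it by root bracketing (the parameter values are illustrative, not taken from a specific datasheet, and the derivation of the four parameters from datasheet points is omitted):

```python
import numpy as np
from scipy.optimize import brentq

# Illustrative four-parameter values (not from a specific panel's datasheet):
# light current, diode saturation current, series resistance, modified ideality factor
I_L, I_0, R_S, A = 8.2, 1e-9, 0.35, 1.9

def current(v):
    """Single-diode, four-parameter model (no shunt branch):
    solve I = I_L - I_0 * (exp((V + I*R_S)/A) - 1) for I by bracketing."""
    f = lambda i: I_L - I_0 * (np.exp((v + i * R_S) / A) - 1.0) - i
    return brentq(f, -1.0, I_L + 1.0)

# Trace the I-V curve from short circuit toward open circuit.
for v in np.linspace(0.0, 42.0, 8):
    print(f"V = {v:5.1f} V   I = {current(v):6.3f} A")
```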

  10. Modelization and simulation of capillary barriers

    International Nuclear Information System (INIS)

    Lisbona Cortes, F.; Aguilar Villa, G.; Clavero Gracia, C.; Gracia Lozano, J.L.

    1998-01-01

    Among the different underground transport phenomena, that due to water flows is of great relevance. Water flows in infiltration and percolation processes are responsible for the transport of hazardous wastes towards phreatic layers. From the industrial and geological standpoints, there is great interest in the design of natural devices to prevent flows from transporting polluting substances. This interest increases when the devices are used to isolate radioactive waste repositories, whose life is to be longer than several hundred years. The so-called natural devices are those based on the superimposition of materials with different hydraulic properties. In particular, flow retention in this kind of stratified medium, under unsaturated conditions, is basically due to the capillary barrier effect, resulting from placing a low-conductivity material over another with a high hydraulic conductivity. Covers designed from the effect above also have to allow drainage of the upper layer. The lower cost of these covers, with respect to other kinds of protection systems, and the stability in time of their components make them very attractive. However, a previous investigation to determine their effectiveness is required. In this report we present the computer code BCSIM, useful for easy simulations of unsaturated flows in a capillary barrier configuration with drainage, and intended to serve as a tool for designing efficient covers. The model, the numerical algorithm and several implementation aspects are described. Results obtained in several simulations, confirming the effectiveness of capillary barriers as a technique to build safety covers for hazardous waste repositories, are presented. (Author)

  11. Powertrain modeling and simulation for off-road vehicles

    Energy Technology Data Exchange (ETDEWEB)

    Ouellette, S. [McGill Univ., Montreal, PQ (Canada)

    2010-07-01

    Standard forward facing automotive powertrain modeling and simulation methodology did not perform equally for all vehicles in all applications in the 2010 winter Olympics, 2009 world alpine ski championships, summit station in Greenland, the McGill Formula Hybrid, Unicell QuickSider, and lunar mobility. This presentation provided a standard automotive powertrain modeling and simulation flow chart as well as an example. It also provided a flow chart for location based powertrain modeling and simulation and discussed location based powertrain modeling and simulation implementation. It was found that in certain applications, vehicle-environment interactions cannot be neglected in order to have good model fidelity. Powertrain modeling and simulation of off-road vehicles demands a new approach to powertrain modeling and simulation. It was concluded that the proposed location based methodology could improve the results for off-road vehicles. tabs., figs.

  12. Reusable Component Model Development Approach for Parallel and Distributed Simulation

    Science.gov (United States)

    Zhu, Feng; Yao, Yiping; Chen, Huilong; Yao, Feng

    2014-01-01

    Model reuse is a key issue to be resolved in parallel and distributed simulation at present. However, component models built by different domain experts usually have diversiform interfaces, couple tightly, and bind closely with simulation platforms. As a result, they are difficult to reuse across different simulation platforms and applications. To address the problem, this paper first proposes a reusable component model framework. Based on this framework, our reusable model development approach is then elaborated, which contains two phases: (1) domain experts create simulation computational modules observing three principles to achieve their independence; (2) the model developer encapsulates these simulation computational modules with six standard service interfaces to improve their reusability. The case study of a radar model indicates that the model developed using our approach has good reusability and is easy to use in different simulation platforms and applications. PMID:24729751

  13. Aircraft vulnerability analysis by modeling and simulation

    Science.gov (United States)

    Willers, Cornelius J.; Willers, Maria S.; de Waal, Alta

    2014-10-01

    guidance acceleration and seeker sensitivity. For the purpose of this investigation the aircraft is equipped with conventional pyrotechnic decoy flares and the missile has no counter-countermeasure means (security restrictions on open publication). This complete simulation is used to calculate the missile miss distance, when the missile is launched from different locations around the aircraft. The miss distance data is then graphically presented showing miss distance (aircraft vulnerability) as a function of launch direction and range. The aircraft vulnerability graph accounts for aircraft and missile characteristics, but does not account for missile deployment doctrine. A Bayesian network is constructed to fuse the doctrinal rules with the aircraft vulnerability data. The Bayesian network now provides the capability to evaluate the combined risk of missile launch and aircraft vulnerability. It is shown in this paper that it is indeed possible to predict the aircraft vulnerability to missile attack in a comprehensive modelling and a holistic process. By using the appropriate real-world models, this approach is used to evaluate the effectiveness of specific countermeasure techniques against specific missile threats. The use of a Bayesian network provides the means to fuse simulated performance data with more abstract doctrinal rules to provide a realistic assessment of the aircraft vulnerability.

  14. Simulation of parametric model towards the fixed covariate of right censored lung cancer data

    Science.gov (United States)

    Afiqah Muhamad Jamil, Siti; Asrul Affendi Abdullah, M.; Kek, Sie Long; Ridwan Olaniran, Oyebayo; Enera Amran, Syahila

    2017-09-01

    In this study, a simulation procedure was applied to measure the effect of a fixed covariate on right censored data using a parametric survival model. The scale and shape parameters were varied to differentiate the analyses of the parametric regression survival model. Biases, mean biases and coverage probabilities were used as the statistical measures. Different sample sizes of 50, 100, 150 and 200 were employed to distinguish the impact of the parametric regression model on right censored data. The R statistical software was utilised to develop the simulation code for right censored data. Finally, the simulation model was compared with right censored lung cancer data from Malaysia. It was found that different values of the shape and scale parameters under different sample sizes help to improve the simulation strategy for right censored data, and that the Weibull regression survival model is a suitable fit for the simulation of the survival of lung cancer patients in Malaysia.
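    The simulation loop described here can be sketched as: draw Weibull survival times whose scale depends on a fixed binary covariate, impose right censoring, fit by maximum likelihood, and summarize the bias across the study's sample sizes. The parameter values, the censoring scheme and the number of replicates are all illustrative.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)

def simulate_fit(n, beta=0.5, shape=1.5, scale=10.0):
    """Simulate Weibull survival times T = scale*exp(beta*x)*W with right
    censoring, then estimate beta by censored maximum likelihood."""
    x = rng.binomial(1, 0.5, n)                    # fixed binary covariate
    t = scale * np.exp(beta * x) * rng.weibull(shape, n)
    c = rng.uniform(5, 30, n)                      # censoring times
    obs = np.minimum(t, c)
    event = (t <= c).astype(float)

    def negloglik(par):
        k, lam0, b = np.exp(par[0]), np.exp(par[1]), par[2]
        lam = lam0 * np.exp(b * x)                 # covariate acts on the scale
        z = obs / lam
        logf = np.log(k / lam) + (k - 1) * np.log(z) - z**k   # events
        logs = -(z**k)                             # log-survival for censored
        return -np.sum(event * logf + (1 - event) * logs)

    fit = minimize(negloglik, x0=[0.0, np.log(10.0), 0.0], method="Nelder-Mead")
    return fit.x[2]                                # estimated beta

for n in (50, 100, 150, 200):                      # the study's sample sizes
    est = np.mean([simulate_fit(n) for _ in range(50)])
    print(n, "mean beta-hat:", round(est, 3), "bias:", round(est - 0.5, 3))
```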

  15. Genomic value prediction for quantitative traits under the epistatic model

    Directory of Open Access Journals (Sweden)

    Xu Shizhong

    2011-01-01

    Full Text Available Abstract Background Most quantitative traits are controlled by multiple quantitative trait loci (QTL). The contribution of each locus may be negligible but the collective contribution of all loci is usually significant. Genome selection that uses markers of the entire genome to predict the genomic values of individual plants or animals can be more efficient than selection on phenotypic values and pedigree information alone for genetic improvement. When a quantitative trait is contributed by epistatic effects, using all markers (main effects) and marker pairs (epistatic effects) to predict the genomic values of plants can achieve the maximum efficiency for genetic improvement. Results In this study, we created 126 recombinant inbred lines of soybean and genotyped 80 markers across the genome. We applied the genome selection technique to predict the genomic value of somatic embryo number (a quantitative trait) for each line. Cross-validation analysis showed that the squared correlation coefficient between the observed and predicted embryo numbers was 0.33 when only main (additive) effects were used for prediction. When the interaction (epistatic) effects were also included in the model, the squared correlation coefficient reached 0.78. Conclusions This study provided an excellent example for the application of genome selection to plant breeding.
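    The "all markers plus all marker pairs" idea can be sketched with ridge regression (a stand-in for the Bayesian shrinkage typically used in genome selection, not the authors' exact estimator); the genotypes and trait below are simulated, with dimensions matching the study's 126 lines and 80 markers.

```python
import numpy as np

rng = np.random.default_rng(8)
n, m = 126, 80                                  # lines and markers, as in the study
geno = rng.choice([-1.0, 1.0], size=(n, m))     # RIL-style marker codes (illustrative)

# Simulated trait with a few additive and epistatic QTL (hypothetical truth).
y = geno[:, 3] + 0.8 * geno[:, 40] + 1.2 * geno[:, 10] * geno[:, 55] + rng.normal(0, 1, n)

def features(g, epistasis):
    if not epistasis:
        return g
    i, j = np.triu_indices(m, k=1)
    return np.hstack([g, g[:, i] * g[:, j]])    # all marker-pair products

def cv_r2(epistasis, lam=50.0, folds=7):
    X = features(geno, epistasis)
    idx = rng.permutation(n)
    preds = np.empty(n)
    for f in np.array_split(idx, folds):
        tr = np.setdiff1d(idx, f)
        K = X[tr] @ X[tr].T                     # dual (kernel) ridge: only an
        alpha = np.linalg.solve(K + lam * np.eye(len(tr)), y[tr])  # n x n solve
        preds[f] = X[f] @ (X[tr].T @ alpha)
    return np.corrcoef(preds, y)[0, 1] ** 2     # squared correlation, as reported

print("additive only :", round(cv_r2(False), 2))
print("with epistasis:", round(cv_r2(True), 2))
```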

  16. Business Value of Information Technology Service Quality Based on Probabilistic Business-Driven Model

    Directory of Open Access Journals (Sweden)

    Jaka Sembiring

    2015-08-01

    Full Text Available The business value of information technology (IT) services is often difficult to assess, especially from the point of view of a non-IT manager. This condition can severely impact organizational IT strategic decisions. Various approaches have been proposed to quantify the business value, but some are trapped in technical complexity while others misguide managers into directly and subjectively judging technical entities outside their domain of expertise. This paper describes a method for properly capturing both perspectives based on a probabilistic business-driven model. The proposed model presents a procedure to calculate the business value of IT services. The model also covers IT security services and their business value, an important aspect of IT services not covered in previously published research. The impact of changes in the quality of IT services on business value is also discussed. A simulation and a case illustration are provided to show the possible application of the proposed model for a simple business process in an enterprise.

  17. Sunspot Modeling: From Simplified Models to Radiative MHD Simulations

    Directory of Open Access Journals (Sweden)

    Rolf Schlichenmaier

    2011-09-01

    Full Text Available We review our current understanding of sunspots from the scales of their fine structure to their large-scale (global) structure, including the processes of their formation and decay. Recently, sunspot models have undergone a dramatic change. In the past, several aspects of sunspot structure were addressed by static MHD models with parametrized energy transport. Models of sunspot fine structure relied heavily on strong assumptions about flow and field geometry (e.g., flux tubes, "gaps", convective rolls), which were motivated in part by the observed filamentary structure of penumbrae or by the necessity of explaining the substantial energy transport required to maintain the penumbral brightness. However, none of these models could self-consistently explain all aspects of penumbral structure (energy transport, filamentation, Evershed flow). In recent years, 3D radiative MHD simulations have advanced dramatically to the point at which models of complete sunspots, with sufficient resolution to capture sunspot fine structure, are feasible. Here, overturning convection is the central element responsible for energy transport, the filamentation leading to fine structure, and the driving of strong outflows. On larger scales these models are also beginning to address the subsurface structure of sunspots as well as sunspot formation. With this shift in modeling capabilities and the recent advances in high-resolution observations, future research will be guided by comparing observation and theory.

  18. Predictive Capability Maturity Model for computational modeling and simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy; Pilch, Martin M.

    2007-10-01

    The Predictive Capability Maturity Model (PCMM) is a new model that can be used to assess the level of maturity of computational modeling and simulation (M&S) efforts. The development of the model is based on both the authors' experience and their analysis of similar investigations in the past. The perspective taken in this report is one of judging the usefulness of a predictive capability that relies on the numerical solution to partial differential equations to better inform and improve decision making. The review of past investigations, such as the Software Engineering Institute's Capability Maturity Model Integration and the National Aeronautics and Space Administration and Department of Defense Technology Readiness Levels, indicates that a more restricted, more interpretable method is needed to assess the maturity of an M&S effort. The PCMM addresses six contributing elements to M&S: (1) representation and geometric fidelity, (2) physics and material model fidelity, (3) code verification, (4) solution verification, (5) model validation, and (6) uncertainty quantification and sensitivity analysis. For each of these elements, attributes are identified that characterize four increasing levels of maturity. Importantly, the PCMM is a structured method for assessing the maturity of an M&S effort that is directed toward an engineering application of interest. The PCMM does not assess whether the M&S effort, the accuracy of the predictions, or the performance of the engineering system satisfies or does not satisfy specified application requirements.

  19. Mass balances for a biological life support system simulation model

    Science.gov (United States)

    Volk, Tyler; Rummel, John D.

    1987-01-01

    Design decisions to aid the development of future space-based biological life support systems (BLSS) can be made with simulation models. The biochemical stoichiometry was developed for: (1) protein, carbohydrate, fat, fiber, and lignin production in the edible and inedible parts of plants; (2) food consumption and production of organic solids in urine, feces, and wash water by the humans; and (3) operation of the waste processor. Flux values for all components are derived for a steady-state system with wheat as the sole food source. The large-scale dynamics of a materially closed BLSS computer model are described in a companion paper. An extension of this methodology can explore multifood systems and more complex biochemical dynamics while maintaining whole-system closure as a focus.

  20. Simulation of upward flux from shallow water-table using UPFLOW model

    Directory of Open Access Journals (Sweden)

    M. H. Ali

    2013-11-01

    Full Text Available The upward movement of water by capillary rise from a shallow water-table to the root zone is an important incoming flux. For determining the exact irrigation requirement, estimation of the capillary (upward) flux is essential. A simulation model can provide a reliable estimate of upward flux under variable soil and climatic conditions. In this study, the performance of the UPFLOW model in estimating upward flux was evaluated. Model performance was assessed with both graphical display and statistical criteria. When simulated capillary-rise values were plotted against observed field data, most data points lay around the 1:1 line, which means that the model output is reliable and reasonable. The coefficient of determination between observed and simulated values was 0.806 (r = 0.93), which indicates good agreement between observed and simulated values. The relative error, model efficiency, and index of agreement were found to be 27.91%, 85.93% and 0.96, respectively. Considering the graphical display of observed and simulated upward flux and the statistical indicators, it can be concluded that the overall performance of the UPFLOW model in simulating actual upward flux from a crop field under variable water-table conditions is satisfactory. Thus, the model can be used to estimate capillary rise from a shallow water-table for proper estimation of irrigation requirements, saving valuable water by avoiding over-irrigation.
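
    For reference, the four agreement statistics quoted above can be computed from paired observed and simulated values as in the sketch below. These are the standard definitions (squared correlation, mean relative error, Nash–Sutcliffe-type model efficiency, Willmott's index of agreement), which may differ in detail from the authors' implementation; the sample numbers are invented.

```python
# Standard goodness-of-fit statistics for observed vs simulated series.
import numpy as np

def evaluate(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    r = np.corrcoef(obs, sim)[0, 1]                     # correlation (r)
    rel_err = 100 * np.mean(np.abs(sim - obs) / obs)    # relative error, %
    nse = 1 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)
    d = 1 - np.sum((sim - obs) ** 2) / np.sum(          # Willmott's index
        (np.abs(sim - obs.mean()) + np.abs(obs - obs.mean())) ** 2)
    return r ** 2, rel_err, 100 * nse, d

r2, re, eff, d = evaluate([1.2, 2.5, 3.1, 4.0], [1.0, 2.8, 3.0, 4.4])
print(f"R^2={r2:.3f}, RE={re:.1f}%, efficiency={eff:.1f}%, agreement d={d:.2f}")
```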

  1. Beyond Modeling: All-Atom Olfactory Receptor Model Simulations

    Directory of Open Access Journals (Sweden)

    Peter C Lai

    2012-05-01

    Full Text Available Olfactory receptors (ORs) are a type of GTP-binding protein-coupled receptor (GPCR). These receptors are responsible for mediating the sense of smell through their interaction with odor ligands. OR-odorant interactions mark the first step in the process that leads to olfaction. Computational studies on model OR structures can validate experimental functional studies as well as generate focused and novel hypotheses for further bench investigation by providing a view of these interactions at the molecular level. Here we have shown the specific advantages of simulating the dynamic environment that is associated with OR-odorant interactions. We present a rigorous methodology that ranges from the creation of a computationally-derived model of an olfactory receptor to simulating the interactions between an OR and an odorant molecule. Given the ubiquitous occurrence of GPCRs in the membranes of cells, we anticipate that our OR-developed methodology will serve as a model for the computational structural biology of all GPCRs.

  2. A model-based approach for sustainability and value assessment in the aerospace value chain

    Directory of Open Access Journals (Sweden)

    Marco Bertoni

    2015-06-01

    Full Text Available In the aerospace industry, systems engineering practices have been exercised for years as a way to turn high-level design objectives into concrete targets on system functionality (e.g., range, noise, and reliability). More difficult is to decompose and clarify sustainability implications in the same way and to compare them against performance-related capabilities already during preliminary design. This article addresses the problem of bringing the important, yet typically high-level and complex, sustainability aspects into engineering practices. It proposes a novel integrated model-based method that provides a consistent way of addressing the well-known lack of generic and integrated ways of clarifying both cost and value consequences of sustainability in early phases. It further presents the development and implementation of such an approach in two separate case studies conducted in collaboration with a major aero-engine sub-system manufacturer. The first case concerns the assessment of alternative business configurations to maintain scarce materials in closed loops, while the second one concerns the production technology of an aero-engine component. Finally, this article highlights the learning generated by the development and implementation of these approaches and discusses opportunities for further development of model-based support.

  3. 40 CFR 600.207-93 - Calculation of fuel economy values for a model type.

    Science.gov (United States)

    2010-07-01

    ... Values § 600.207-93 Calculation of fuel economy values for a model type. (a) Fuel economy values for a... update sales projections at the time any model type value is calculated for a label value. (iii) The... those intended for sale in other states, he will calculate fuel economy values for each model type for...

  4. 40 CFR 600.207-86 - Calculation of fuel economy values for a model type.

    Science.gov (United States)

    2010-07-01

    ... Values § 600.207-86 Calculation of fuel economy values for a model type. (a) Fuel economy values for a... update sales projections at the time any model type value is calculated for a label value. (iii) The... the projected sales and fuel economy values for each base level within the model type. (1) If the...

  5. Model for Simulating a Spiral Software-Development Process

    Science.gov (United States)

    Mizell, Carolyn; Curley, Charles; Nayak, Umanath

    2010-01-01

    A discrete-event simulation model, and a computer program that implements the model, have been developed as means of analyzing a spiral software-development process. This model can be tailored to specific development environments for use by software project managers in making quantitative cases for deciding among different software-development processes, courses of action, and cost estimates. A spiral process can be contrasted with a waterfall process, which is a traditional process that consists of a sequence of activities that include analysis of requirements, design, coding, testing, and support. A spiral process is an iterative process that can be regarded as a repeating modified waterfall process. Each iteration includes assessment of risk, analysis of requirements, design, coding, testing, delivery, and evaluation. A key difference between a spiral and a waterfall process is that a spiral process can accommodate changes in requirements at each iteration, whereas in a waterfall process, requirements are considered to be fixed from the beginning and, therefore, a waterfall process is not flexible enough for some projects, especially those in which requirements are not known at the beginning or may change during development. For a given project, a spiral process may cost more and take more time than does a waterfall process, but may better satisfy a customer's expectations and needs. Models for simulating various waterfall processes have been developed previously, but until now, there have been no models for simulating spiral processes. The present spiral-process-simulating model and the software that implements it were developed by extending a discrete-event simulation process model of the IEEE 12207 Software Development Process, which was built using commercially available software known as the Process Analysis Tradeoff Tool (PATT). Typical inputs to PATT models include industry-average values of product size (expressed as number of lines of code).

  6. Federated Modelling and Simulation for Critical Infrastructure Protection

    NARCIS (Netherlands)

    Rome, E.; Langeslag, P.J.H.; Usov, A.

    2014-01-01

    Modelling and simulation is an important tool for Critical Infrastructure (CI) dependency analysis, for testing methods for risk reduction, as well as for the evaluation of past failures. Moreover, interaction of such simulations with external threat models, e.g., a river flood model, or economic

  7. IDEF method-based simulation model design and development framework

    Directory of Open Access Journals (Sweden)

    Ki-Young Jeong

    2009-09-01

    Full Text Available The purpose of this study is to provide an IDEF method-based integrated framework for a business process simulation model to reduce the model development time by increasing communication and knowledge reusability during a simulation project. In this framework, simulation requirements are collected by a function modeling method (IDEF0) and a process modeling method (IDEF3). Based on these requirements, a common data model is constructed using the IDEF1X method. From this reusable data model, multiple simulation models are automatically generated using a database-driven simulation model development approach. The framework is claimed to help both requirement collection and experimentation phases during a simulation project by improving system knowledge, model reusability, and maintainability through the systematic use of three descriptive IDEF methods and the features of relational database technologies. A complex semiconductor fabrication case study was used as a testbed to evaluate and illustrate the concepts and the framework. Two different simulation software products were used to develop and control the semiconductor model from the same knowledge base. The case study empirically showed that this framework could help improve the simulation project processes by using IDEF-based descriptive models and the relational database technology. The authors also concluded that this framework could be easily applied to other analytical model generation by separating the logic from the data.

  8. Simulation models in population breast cancer screening : A systematic review

    NARCIS (Netherlands)

    Koleva-Kolarova, Rositsa G; Zhan, Zhuozhao; Greuter, Marcel J W; Feenstra, Talitha L; De Bock, Geertruida H

    The aim of this review was to critically evaluate published simulation models for breast cancer screening of the general population and provide a direction for future modeling. A systematic literature search was performed to identify simulation models with more than one application. A framework for

  9. Simulation Modeling of a Facility Layout in Operations Management Classes

    Science.gov (United States)

    Yazici, Hulya Julie

    2006-01-01

    Teaching quantitative courses can be challenging. Similarly, layout modeling and lean production concepts can be difficult to grasp in an introductory OM (operations management) class. This article describes a simulation model developed in PROMODEL to facilitate the learning of layout modeling and lean manufacturing. Simulation allows for the…

  10. Maneuver simulation model of an experimental hovercraft for the Antarctic

    Science.gov (United States)

    Murao, Rinichi

    Results of an investigation of a hovercraft model designed for Antarctic conditions are presented. The buoyancy characteristics, the propellant control system, and simulation model control are examined. An ACV (air cushion vehicle) model of the hovercraft is used to examine the flexibility and friction of the skirt. Simulation results are presented which show the performance of the hovercraft.

  11. Historical Development of Simulation Models of Recreation Use

    Science.gov (United States)

    Jan W. van Wagtendonk; David N. Cole

    2005-01-01

    The potential utility of modeling as a park and wilderness management tool has been recognized for decades. Romesburg (1974) explored how mathematical decision modeling could be used to improve decisions about regulation of wilderness use. Cesario (1975) described a computer simulation modeling approach that utilized GPSS (General Purpose Systems Simulator), a...

  12. Dynamic wind turbine models in power system simulation tool

    DEFF Research Database (Denmark)

    Hansen, A.; Jauch, Clemens; Soerensen, P.

    The present report describes the dynamic wind turbine models implemented in the power system simulation tool DIgSILENT. The developed models are a part of the results of a national research project, whose overall objective is to create a model database in different simulation tools. The report...

  13. A New Model for Simulating TSS Washoff in Urban Areas

    Directory of Open Access Journals (Sweden)

    E. Crobeddu

    2011-01-01

    Full Text Available This paper presents the formulation and validation of the conceptual Runoff Quality Simulation Model (RQSM) that was developed to simulate the erosion and transport of solid particles in urban areas. The RQSM assumes that solid particle accumulation on pervious and impervious areas is infinite. The RQSM simulates soil erosion using rainfall kinetic energy and solid particle transport with linear system theory. A sensitivity analysis was conducted on the RQSM to show the influence of each parameter on the simulated load. Total suspended solid (TSS) loads monitored at the outlet of the borough of Verdun in Canada and at three catchment outlets of the City of Champaign in the United States were used to validate the RQSM. TSS loads simulated by the RQSM were compared to measured loads and to loads simulated by the Rating Curve model and the Exponential model of the SWMM software. The simulation performance of the RQSM was comparable to the Exponential and Rating Curve models.
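
    The two ingredients named in the abstract, detachment driven by rainfall kinetic energy and transport treated as a linear system, can be sketched in a few lines. The kinetic-energy relation, coefficients and rainfall series below are illustrative assumptions, not the calibrated RQSM formulation.

```python
# Sketch: kinetic-energy detachment routed through a linear unit response.
import numpy as np

rain = np.array([0, 2, 6, 10, 4, 1, 0, 0], dtype=float)   # mm per time step

# Kinetic energy per step (an empirical KE-intensity form, assumed parameters)
ke = np.where(rain > 0,
              rain * (11.9 + 8.7 * np.log10(np.maximum(rain, 0.1))), 0.0)

detached = 0.015 * ke          # solids detached per step (assumed coefficient)

# Linear-system transport: convolve detachment with an exponential unit response
k = 0.4                        # reservoir constant per step (assumed)
h = k * (1 - k) ** np.arange(len(rain))       # discrete unit response
tss_load = np.convolve(detached, h)[:len(rain)]
print(np.round(tss_load, 2))
```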

  14. Propulsion modeling techniques and applications for the NASA Dryden X-30 real-time simulator

    Science.gov (United States)

    Hicks, John W.

    1991-01-01

    An overview is given of the flight planning activities to date in the current National Aero-Space Plane (NASP) program. The government flight-envelope expansion concept and other design flight operational assessments are discussed. The NASA Dryden NASP real-time simulator configuration is examined and hypersonic flight planning simulation propulsion modeling requirements are described. The major propulsion modeling techniques developed by the Edwards flight test team are outlined, and the application value of these techniques for developmental hypersonic vehicles is discussed.

  15. Business Process Simulation: Requirements for Business and Resource Models

    OpenAIRE

    Audrius Rima; Olegas Vasilecas

    2015-01-01

    The purpose of Business Process Model and Notation (BPMN) is to provide an easily understandable graphical representation of business processes. Thus BPMN is widely used and applied in various areas, one of them being business process simulation. This paper addresses some BPMN model based business process simulation problems. The paper formulates requirements for business process and resource models to enable their use for business process simulation.

  16. Evaluation of Marine Corps Manpower Computer Simulation Model

    Science.gov (United States)

    2016-12-01

    overall end strength are maintained. To assist this mission, an agent-based computer simulation model was developed in the Java computer language. This thesis investigates that simulation: software that models business practices to assist the organization in its ability to analyze and make decisions on how to improve.

  17. Ion thruster modeling: Particle simulations and experimental validations

    International Nuclear Information System (INIS)

    Wang, Joseph; Polk, James; Brinza, David

    2003-01-01

    This paper presents results from ion thruster modeling studies performed in support of NASA's Deep Space 1 mission and NSTAR project. Fully 3-dimensional computer particle simulation models are presented for ion optics plasma flow and ion thruster plume. Ion optics simulation results are compared with measurements obtained from ground tests of the NSTAR ion thruster. Plume simulation results are compared with in-flight measurements from the Deep Space 1 spacecraft. Both models show excellent agreement with experimental data

  18. Business Process Simulation: Requirements for Business and Resource Models

    Directory of Open Access Journals (Sweden)

    Audrius Rima

    2015-07-01

    Full Text Available The purpose of Business Process Model and Notation (BPMN) is to provide an easily understandable graphical representation of business processes. Thus BPMN is widely used and applied in various areas, one of them being business process simulation. This paper addresses some BPMN model based business process simulation problems. The paper formulates requirements for business process and resource models to enable their use for business process simulation.

  19. The COD Model: Simulating Workgroup Performance

    Science.gov (United States)

    Biggiero, Lucio; Sevi, Enrico

    Though the question of the determinants of workgroup performance is one of the most central in organization science, precise theoretical frameworks and formal demonstrations are still missing. In order to fill this gap, the COD agent-based simulation model is presented here and used to study the effects of task interdependence and bounded rationality on workgroup performance. The first relevant finding is an algorithmic demonstration of the ordering of interdependencies in terms of complexity, showing that the parallel mode is the simplest, followed by the sequential and then by the reciprocal. This result is far from new in organization science, but what is remarkable is that it now has the strength of an algorithmic demonstration instead of resting on the authoritativeness of some scholar or on episodic empirical findings. The second important result is that the progressive introduction of realistic limits to agents' rationality dramatically reduces workgroup performance and leads to a rather interesting result: when agents' rationality is severely bounded, simple norms work better than complex norms. The third main finding is that when the complexity of interdependence is high, the appropriate coordination mechanism is agents' direct and active collaboration, which means teamwork.

  20. Millimeter waves sensor modeling and simulation

    Science.gov (United States)

    Latger, Jean; Cathala, Thierry

    2015-10-01

    Guidance of weapon systems relies on sensors to analyze target signatures. Defense weapon systems also need to detect and then identify threats, again using sensors. One important class of sensors is millimeter-wave radar systems, which are very efficient for seeing through atmosphere and/or foliage, for example. This type of high-frequency radar can produce high-quality images with very tricky features such as dihedral and trihedral bright points, shadows and layover effects. Moreover, image quality is very dependent on the carrier velocity and trajectory. Such sensor systems are so complex that they must be tested through simulation. This paper presents a state of the art of millimeter-wave sensor models. A short presentation of asymptotic methods shows that physical optics support is mandatory to reach realistic results. The SE-Workbench-RF tool is presented and typical examples of results are shown, both in the frame of Synthetic Aperture Radar sensors and Real Beam Ground Mapping radars. Several technical topics are then discussed, such as the rendering technique (ray tracing vs. rasterization), the implementation (CPU vs. GPGPU) and the tradeoff between physical accuracy and performance of computation. Examples of results using SE-Workbench-RF are shown and discussed.

  1. Modeling and Simulation Fundamentals Theoretical Underpinnings and Practical Domains

    CERN Document Server

    Sokolowski, John A

    2010-01-01

    An insightful presentation of the key concepts, paradigms, and applications of modeling and simulation. Modeling and simulation has become an integral part of research and development across many fields of study, having evolved from a tool to a discipline in less than two decades. Modeling and Simulation Fundamentals offers a comprehensive and authoritative treatment of the topic and includes definitions, paradigms, and applications to equip readers with the skills needed to work successfully as developers and users of modeling and simulation. Featuring contributions written by leading experts

  2. Global Information Enterprise (GIE) Modeling and Simulation (GIESIM)

    National Research Council Canada - National Science Library

    Bell, Paul

    2005-01-01

    ... modeling and simulation (M&S) toolkits into the Global Information Enterprise (GIE) Modeling and Simulation (GIESim) framework to create effective user analysis of candidate communications architectures and technologies...

  3. Modeling, Simulation and Position Control of 3DOF Articulated Manipulator

    Directory of Open Access Journals (Sweden)

    Hossein Sadegh Lafmejani

    2014-08-01

    Full Text Available In this paper, the modeling, simulation and control of a 3-degree-of-freedom articulated robotic manipulator have been studied. First, we derived the kinematics and dynamics equations of the manipulator using the Lagrange method. In order to validate the analytical model of the manipulator, we compared the model simulated in the Matlab simulation environment with a model simulated with the SimMechanics toolbox. A sample path was designed for analyzing the tracking performance. The system was linearized with feedback linearization and then a PID controller was applied to track a reference trajectory. Finally, the control results were compared with those of a nonlinear PID controller.
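
    To make the control idea concrete: after exact feedback linearization each joint reduces to a double integrator driven by a virtual input, to which an ordinary PID law can be applied. The sketch below simulates that loop for one joint; the gains, reference trajectory and integration step are illustrative assumptions, not the paper's values.

```python
# One joint after feedback linearization (q'' = v), tracked with PID.
import numpy as np

dt, T = 0.001, 3.0
kp, ki, kd = 400.0, 100.0, 40.0          # assumed PID gains
q = dq = integ = 0.0

for t in np.arange(0.0, T, dt):
    q_ref = 0.5 * np.sin(np.pi * t)               # sample joint trajectory
    dq_ref = 0.5 * np.pi * np.cos(np.pi * t)
    e, de = q_ref - q, dq_ref - dq
    integ += e * dt
    v = kp * e + ki * integ + kd * de             # virtual input after
    dq += v * dt                                  # linearization: q'' = v
    q += dq * dt                                  # explicit Euler step

print(f"final tracking error: {abs(q_ref - q):.2e} rad")
```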

  4. Modeling and Simulation of U-tube Steam Generator

    Science.gov (United States)

    Zhang, Mingming; Fu, Zhongguang; Li, Jinyao; Wang, Mingfei

    2018-03-01

    The U-tube natural circulation steam generator is the main subject of the modeling and simulation research in this article. The research is based on the simuworks system simulation software platform. By analyzing the structural characteristics and the operating principle of the U-tube steam generator, a model with 14 control volumes was built, including the primary side, secondary side, down channel and steam plenum, etc. The model depends entirely on conservation laws, and it was applied in several simulation tests. The results show that the model is capable of properly simulating the dynamic response of the U-tube steam generator.

  5. Structure simulation of a pre-stressed concrete containment model

    International Nuclear Information System (INIS)

    Grebner, H.; Sievers, J.

    2004-01-01

    An axisymmetric Finite-Element model of the 1:4 pre-stressed containment model tested at SANDIA was developed. The model is loaded by the pre-stressing of the tendons and by increasing internal pressure (up to 1.3 MPa). The analysis results in terms of displacements and strains in the liner, the rebars, the tendons and the concrete of the cylindrical part agree well with measured data up to about 0.6 MPa internal pressure (i.e. 1.5 times the design pressure). First circumferential micro-cracks in the concrete are found at about 0.75 MPa. With increasing pressure, micro-cracks are present through the whole wall. Above about 0.9 MPa the formation of micro-cracks in the radial and meridional directions is calculated. At the maximum load (1.3 MPa) almost all concrete parts of the model have micro-cracks, which may cause leaks. Nevertheless, failure of the containment model is not expected for loads up to 1.3 MPa without consideration of geometric inhomogeneities due to penetrations in the wall. Although the calculated strains in liner, rebars and tendons show some plastification, the maximum values are below the critical ones. The safety margin against failure is smallest in some hoop tendons. At present, parametric studies are being performed to investigate the differences between calculations and measured data. Furthermore, three-dimensional models are being developed for a better simulation of the meridional tendons in the dome region. (orig.)

  6. Nuclear power plant training simulator modeling organization and method

    International Nuclear Information System (INIS)

    Alliston, W.H.

    1975-01-01

    A description is given of a training simulator for the full-scope real-time dynamic operation of a nuclear power plant, which utilizes apparatus that includes control consoles having manual and automatic devices corresponding to simulated plant components and indicating devices for monitoring physical values in the simulated plant. A digital computer configuration is connected to the control consoles to calculate the dynamic real-time simulated operation of the plant in accordance with the simulated plant components, providing output data including data for operating the control console indicating devices. The plant simulation is modularized into various plant components or component systems. Each simulated plant component or component system is described by mathematical equations embodied in a computer program which accepts data from other simulated plant components or systems, calculates output values, including values used as inputs to the simulation calculations of other simulated plant components or systems, and responds in a manner similar to that of its corresponding physical entity in both transient and steady states.

  7. Values of Land and Renewable Resources in a Three-Sector Economic Growth Model

    Directory of Open Access Journals (Sweden)

    Zhang Wei-Bin

    2015-04-01

    Full Text Available This paper studies the dynamic interdependence of capital, land and resource values in a three-sector growth model with endogenous wealth and renewable resources. The model is based on neoclassical growth theory, Ricardian theory and growth theory with renewable resources. The household's decision is modeled with an alternative approach proposed by Zhang two decades ago. The economic system consists of the households and the industrial, agricultural, and resource sectors. The model describes a dynamic interdependence between wealth accumulation, resource change, and division of labor under perfect competition. We simulate the model to demonstrate the existence of a unique stable equilibrium point and plot the motion of the dynamic system. The study conducts comparative dynamic analysis with regard to changes in the propensity to consume resources, the propensity to consume housing, the propensity to consume agricultural goods, the propensity to consume industrial goods, the propensity to save, the population, and the output elasticity of capital of the resource sector.

  8. Modeling Value Chain Analysis of Distance Education using UML

    Science.gov (United States)

    Acharya, Anal; Mukherjee, Soumen

    2010-10-01

    Distance education continues to grow as a methodology for the delivery of course content in higher education in India as well as abroad. To manage this growing demand and to provide flexibility, strategic planning about the use of ICT tools is required. Value chain analysis is a framework for breaking down the sequence of business functions into a set of activities through which utility can be added to a service. Thus it can help to determine the competitive advantage enjoyed by an institute. To implement these business functions, a visual representation is required. UML allows for this representation by using a set of structural and behavioral diagrams. In this paper, the first section defines a framework for value chain analysis and highlights its advantages. The second section gives a brief overview of related work in this field. The third section gives a brief discussion on distance education. The fourth section very briefly introduces UML. The fifth section models the value chain of distance education using UML. Finally, we discuss the limitations and the problems posed in this domain.

  9. Monte Carlo simulations of the cellular S-value, lineal energy and RBE for BNCT

    International Nuclear Information System (INIS)

    Liu Chingsheng; Tung Chuanjong

    2006-01-01

    Due to the non-uniform uptake of boron-containing pharmaceuticals in cells and the short ranges of the alpha and lithium particles, microdosimetry provides useful information on the cellular dose and response of boron neutron capture therapy (BNCT). Radiation dose and quality in BNCT may be expressed in terms of the cellular S-value and the lineal energy spectrum. In the present work, Monte Carlo simulations were performed to calculate these microdosimetric parameters for different source-target configurations and sizes in cells. The effective relative biological effectiveness (RBE) of the Tsing Hua Open-pool Reactor (THOR) epithermal neutron beam was evaluated using biological weighting functions that depend on the lineal energy. RBE changes with source-target configurations and sizes were analyzed. (author)
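
    The sketch below is a deliberately simplified version of such a calculation: decays are sampled uniformly in a spherical cell, each heavy charged particle travels in a straight line with a constant stopping power, and the energy deposited in a concentric nucleus is scored to form S(nucleus ← cell). The radii, LET and range are round illustrative numbers; real BNCT microdosimetry would track both the alpha and lithium fragments with energy-dependent stopping powers.

```python
# Simplified Monte Carlo cellular S-value estimate (illustrative only).
import numpy as np

rng = np.random.default_rng(3)
r_cell, r_nuc = 5e-6, 2.5e-6      # cell / nucleus radii (m), assumed
let = 1.6e-11                     # stopping power, ~100 keV/um in J/m
prange = 40e-6                    # particle range (m), assumed constant
m_nuc = 1000.0 * 4 / 3 * np.pi * r_nuc ** 3   # nucleus mass, unit density (kg)

def chord_in_sphere(p, u, r, smax):
    """Length of the ray p + s*u (0 <= s <= smax) inside the sphere |x| < r."""
    b, c = np.dot(p, u), np.dot(p, p) - r * r
    disc = b * b - c
    if disc <= 0.0:
        return 0.0
    s1, s2 = -b - np.sqrt(disc), -b + np.sqrt(disc)
    return max(0.0, min(s2, smax) - max(s1, 0.0))

n, edep = 50_000, 0.0
for _ in range(n):
    p = rng.normal(size=3)
    p *= r_cell * rng.random() ** (1 / 3) / np.linalg.norm(p)  # uniform in cell
    u = rng.normal(size=3)
    u /= np.linalg.norm(u)                                     # isotropic direction
    edep += let * chord_in_sphere(p, u, r_nuc, prange)

print(f"S(nucleus <- cell) ~ {edep / n / m_nuc:.2e} Gy per decay")
```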

  10. Simulation Models in Economic Higher Education

    OpenAIRE

    Paraschiv Dorel Mihai; Belu Mihaela Gabriela; Popa Ioan

    2013-01-01

    The simulation methods are implemented to develop students' professional skills and competencies in the economic field, making the link between the academic and business environments. The paper presents these methods of simulation in areas such as trade, international business, tourism and banking, applied in the European Program POSDRU/90/2.1/S/63442 project.

  11. Snoopy's hybrid simulator: a tool to construct and simulate hybrid biological models.

    Science.gov (United States)

    Herajy, Mostafa; Liu, Fei; Rohr, Christian; Heiner, Monika

    2017-07-28

    Hybrid simulation of (computational) biochemical reaction networks, which combines stochastic and deterministic dynamics, is an important direction to tackle future challenges due to complex and multi-scale models. Inherently hybrid computational models of biochemical networks entail two time scales: fast and slow. Therefore, it is intricate to efficiently and accurately analyse them using only either deterministic or stochastic simulation. However, there are only a few software tools that support such an approach. These tools are often limited with respect to the number as well as the functionalities of the provided hybrid simulation algorithms. We present Snoopy's hybrid simulator, an efficient hybrid simulation software which builds on Snoopy, a tool to construct and simulate Petri nets. Snoopy's hybrid simulator provides a wide range of state-of-the-art hybrid simulation algorithms. Using this tool, a computational model of biochemical networks can be constructed using a (coloured) hybrid Petri net's graphical notations, or imported from other compatible formats (e.g. SBML), and afterwards executed via dynamic or static hybrid simulation. Snoopy's hybrid simulator is a platform-independent tool providing an accurate and efficient simulation of hybrid (biological) models. It can be downloaded free of charge as part of Snoopy from http://www-dssz.informatik.tu-cottbus.de/DSSZ/Software/Snoopy .

  12. MODEL OF HEAT SIMULATOR FOR DATA CENTERS

    Directory of Open Access Journals (Sweden)

    Jan Novotný

    2016-08-01

    Full Text Available The aim of this paper is to present the design and development of a heat simulator to be used for flow research in data centers. The designed heat simulator is based on a four-processor 1U Supermicro server. It enables control of the flow rate and heat output within the range of 10-100%. The paper also covers the results of test measurements of mass flow rates and heat flow rates in the simulator. The flow field at the outlet of the server was measured by the stereo PIV method. The heat flow rate was determined based on measuring the temperature field at the inlet and outlet of the simulator and the known mass flow rate.

  13. Simulation Tools Model Icing for Aircraft Design

    Science.gov (United States)

    2012-01-01

    LEWICE has evolved over the years from strictly a research tool to one used routinely by industry and other government agencies. Glenn contractor William Wright has been the architect of this development, supported by a team of researchers investigating icing physics, creating validation data, and ensuring development according to standard software engineering practices. The program provides a virtual simulation environment for determining where water droplets strike an airfoil in flight, what kind of ice would result, and what shape that ice would take. Users can enter geometries for specific, two-dimensional cross sections of an airfoil or other airframe surface and then apply a range of inputs - different droplet sizes, temperatures, airspeeds, and more - to model how ice would build up on the surface in various conditions. The program's versatility, ease of use, and speed - LEWICE can run through complex icing simulations in only a few minutes - have contributed to it becoming a popular resource in the aviation industry.

  14. Bonding values of two contemporary ceramic inlay materials to dentin following simulated aging.

    Science.gov (United States)

    Khalil, Ashraf Abdelfattah; Abdelaziz, Khalid Mohamed

    2015-12-01

    To compare the push-out bond strength of feldspar and zirconia-based ceramic inlays bonded to dentin with different resin cements following simulated aging. Occlusal cavities in 80 extracted molars were restored in 2 groups (n=40) with CAD/CAM feldspar (Vitablocs Trilux forte) (FP) and zirconia-based (Ceramill Zi) (ZR) ceramic inlays. The fabricated inlays were luted in 2 subgroups (n=20) with either etch-and-bond (RelyX Ultimate Clicker) (EB) or self-adhesive (RelyX Unicem Aplicap) (SA) resin cement. Ten inlays in each subgroup were subjected to 3,500 thermal cycles and 24,000 loading cycles, while the other 10 served as control. Horizontal 3 mm thick specimens were cut out of the restored teeth for push out bond strength testing. Bond strength data were statistically analyzed using 1-way ANOVA and Tukey's comparisons at α=.05. The mode of ceramic-cement-dentin bond failure for each specimen was also assessed. No statistically significant differences were noticed between FP and ZR bond strength to dentin in all subgroups (ANOVA, P=.05113). No differences were noticed between EB and SA (Tukey's, P>.05) bonded to either type of ceramics. Both adhesive and mixed modes of bond failure were dominant for non-aged inlays. Simulated aging had no significant effect on bond strength values (Tukey's, P>.05) of all ceramic-cement combinations although the adhesive mode of bond failure became more common (60-80%) in aged inlays. The suggested cement-ceramic combinations offer comparable bonding performance to dentin substrate either before or after simulated aging that seems to have no adverse effect on the achieved bond.

  15. Common modelling approaches for training simulators for nuclear power plants

    International Nuclear Information System (INIS)

    1990-02-01

    Training simulators for nuclear power plant operating staff have gained increasing importance over the last twenty years. One of the recommendations of the 1983 IAEA Specialists' Meeting on Nuclear Power Plant Training Simulators in Helsinki was to organize a Co-ordinated Research Programme (CRP) on some aspects of training simulators. The goal statement was: "To establish and maintain a common approach to modelling for nuclear training simulators based on defined training requirements". Before adapting this goal statement, the participants considered many alternatives for defining the common aspects of training simulator models, such as the programming language used, the nature of the simulator computer system, the size of the simulation computers, and the scope of simulation. The participants agreed that it was the training requirements that defined the need for a simulator, the scope of models and hence the type of computer complex that was required, and the criteria for fidelity and verification, and was therefore the most appropriate basis for the commonality of modelling approaches. It should be noted that the Co-ordinated Research Programme was restricted, for a variety of reasons, to consider only a few aspects of training simulators. This report reflects these limitations, and covers only the topics considered within the scope of the programme. The information in this document is intended as an aid for operating organizations to identify possible modelling approaches for training simulators for nuclear power plants. 33 refs

  16. Modelling and simulation of containment on full scope simulator for Qinshan 300 MW Nuclear Power Unit

    International Nuclear Information System (INIS)

    Zou Tingyun

    1996-01-01

    A multi-node containment thermal-hydraulic model has been developed and adopted in the Full Scope Simulator for the Qinshan 300 MW Nuclear Power Unit, with good real-time simulation performance. The containment pressure for a LBLOCA calculated by the model agrees well with that of CONTEMPT-4/MOD3

  17. Medical simulation: Overview, and application to wound modelling and management

    Directory of Open Access Journals (Sweden)

    Dinker R Pai

    2012-01-01

    Full Text Available Simulation in medical education is progressing in leaps and bounds. The need for simulation in medical education and training is increasing because of (a) an overall increase in the number of medical students vis-à-vis the availability of patients; (b) increasing awareness among patients of their rights and a consequent increase in litigation; and (c) tremendous improvement in simulation technology, which makes simulation more and more realistic. Simulation in wound care can be divided into the use of simulation in wound modelling (to test the effect of projectiles on the body) and simulation for training in wound management. Though this science is still in its infancy, more and more researchers are now devising both low-technology and high-technology (virtual reality) simulators in this field. It is believed that simulator training will eventually translate into better wound care in real patients, though this will be the subject of further research.

  18. Evaluation of medical countermeasures against organophosphorus compounds: the value of experimental data and computer simulations.

    Science.gov (United States)

    Worek, Franz; Aurbek, Nadine; Herkert, Nadja M; John, Harald; Eddleston, Michael; Eyer, Peter; Thiermann, Horst

    2010-09-06

    Despite extensive research for more than six decades on medical countermeasures against poisoning by organophosphorus compounds (OP) the treatment options are meagre. The presently established acetylcholinesterase (AChE) reactivators (oximes), e.g. obidoxime and pralidoxime, are insufficient against a number of nerve agents and there is ongoing debate on the benefit of oxime treatment in human OP pesticide poisoning. Up to now, the therapeutic efficacy of oximes was mostly evaluated in animal models but substantial species differences prevent direct extrapolation of animal data to humans. Hence, it was considered essential to establish relevant experimental in vitro models for the investigation of oximes as antidotes and to develop computer models for the simulation of oxime efficacy in different scenarios of OP poisoning. Kinetic studies on the various interactions between erythrocyte AChE from various species, structurally different OP and different oximes provided a basis for the initial assessment of the ability of oximes to reactivate inhibited AChE. In the present study, in vitro enzyme-kinetic and pharmacokinetic data from a minipig model of dimethoate poisoning and oxime treatment were used to calculate dynamic changes of AChE activities. It could be shown that there is a close agreement between calculated and in vivo AChE activities. Moreover, computer simulations provided insight into the potential and limitations of oxime treatment. In the end, such data may be a versatile tool for the ongoing discussion of the pros and cons of oxime treatment in human OP pesticide poisoning. Copyright (c) 2009 Elsevier Ireland Ltd. All rights reserved.

  19. EVT in electricity price modeling : extreme value theory not only on the extreme events

    International Nuclear Information System (INIS)

    Marossy, Z.

    2007-01-01

    The extreme value theory (EVT) is commonly used in electricity and financial risk modeling. In this study, EVT was used to model the distribution of electricity prices. The model builds on price formation in electricity auction markets. This paper reviewed the 3 main modeling approaches used to describe the distribution of electricity prices. The first approach is based on a stochastic model of the electricity price time series and uses this stochastic model to generate the given distribution. The second approach involves electricity supply and demand factors that determine the price distribution. The third approach involves agent-based models, which use simulation techniques to derive the price distribution. A fourth modeling approach was then proposed to describe the distribution of electricity prices. The new approach determines the distribution of electricity prices directly, without knowing anything about the data-generating process or market driving forces. Empirical data confirmed that the distribution of electricity prices follows a generalized extreme value (GEV) distribution. 8 refs., 2 tabs., 5 figs
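
    A minimal sketch of that direct approach: fit a GEV distribution to a price sample and read tail probabilities off the fitted law, with no model of the data-generating process. The synthetic price series and the exceedance level are invented; note that SciPy's genextreme uses a shape-parameter sign convention opposite to some textbooks.

```python
# Direct GEV fit to an electricity price sample.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
prices = 40 + 12 * rng.gumbel(size=5000)    # synthetic spiky price series

shape, loc, scale = stats.genextreme.fit(prices)
print(f"GEV fit: shape={shape:.3f}, loc={loc:.1f}, scale={scale:.1f}")

# e.g. tail probability of the price exceeding 120 under the fitted law
print(f"P(price > 120) = {stats.genextreme.sf(120, shape, loc, scale):.4f}")
```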

  20. On the value of water quality data and informative flow states in karst modelling

    Science.gov (United States)

    Hartmann, Andreas; Barberá, Juan Antonio; Andreo, Bartolomé

    2017-11-01

    If properly applied, karst hydrological models are a valuable tool for karst water resource management. If they are able to reproduce the relevant flow and storage processes of a karst system, they can be used for prediction of water resource availability when climate or land use are expected to change. A common challenge in applying karst simulation models is the limited availability of observations to identify their model parameters. In this study, we quantify the value of information when water quality data (NO3- and SO42-) are used in addition to discharge observations to estimate the parameters of a process-based karst simulation model at a test site in southern Spain. We use a three-step procedure to (1) confine an initial sample of 500 000 model parameter sets by discharge and water quality observations, (2) identify alterations of model parameter distributions through the confinement, and (3) quantify the strength of the confinement for the model parameters. We repeat this procedure for flow states for which the system discharge is controlled by the unsaturated zone, the saturated zone, and for the entire time period including times when the spring is influenced by a nearby river. Our results indicate that NO3- provides the most information to identify the model parameters controlling soil and epikarst dynamics during the unsaturated flow state. During the saturated flow state, SO42- and discharge observations provide the best information to identify the model parameters related to groundwater processes. We found reduced parameter identifiability when the entire time period is used, as the river influence disturbs parameter estimation. We finally show that the most reliable simulations are obtained when a combination of discharge and water quality data is used for the combined unsaturated and saturated flow states.
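
    In spirit, the confinement step works like the sketch below: sample many parameter sets, keep the ones whose simulations reproduce the observations to a stated tolerance, and measure how much each parameter's range shrinks. The toy model, thresholds and sample size here are placeholders (the study used 500 000 sets and filtered with discharge, NO3- and SO42- jointly; only one series is filtered here for brevity).

```python
# Behavioural confinement of Monte Carlo parameter sets (toy example).
import numpy as np

rng = np.random.default_rng(11)
n_sets = 50_000                          # 500 000 in the study; reduced here
# two placeholder parameters: a recession coefficient and a storage constant
params = rng.uniform([0.5, 5.0], [5.0, 50.0], size=(n_sets, 2))

t = np.linspace(0.0, 1.0, 50)
obs = 1.2 * np.exp(-3.0 * t) + 0.4       # synthetic "observed" discharge
sim = params[:, :1] * np.exp(-3.0 * t) + 12.0 / params[:, 1:]   # toy model

nse = 1 - np.sum((sim - obs) ** 2, axis=1) / np.sum((obs - obs.mean()) ** 2)
post = params[nse > 0.7]                 # step 1: keep behavioural sets only

# step 3: how strongly did the observations confine each parameter?
for i, name in enumerate(["recession coeff", "storage constant"]):
    shrink = 1 - np.ptp(post[:, i]) / np.ptp(params[:, i])
    print(f"{name}: prior range reduced by {100 * shrink:.0f}%")
```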

  1. Forecasting Lightning Threat using Cloud-Resolving Model Simulations

    Science.gov (United States)

    McCaul, Eugene W., Jr.; Goodman, Steven J.; LaCasse, Katherine M.; Cecil, Daniel J.

    2008-01-01

    Two new approaches are proposed and developed for making time and space dependent, quantitative short-term forecasts of lightning threat, and a blend of these approaches is devised that capitalizes on the strengths of each. The new methods are distinctive in that they are based entirely on the ice-phase hydrometeor fields generated by regional cloud-resolving numerical simulations, such as those produced by the WRF model. These methods are justified by established observational evidence linking aspects of the precipitating ice hydrometeor fields to total flash rates. The methods are straightforward and easy to implement, and offer an effective near-term alternative to the incorporation of complex and costly cloud electrification schemes into numerical models. One method is based on upward fluxes of precipitating ice hydrometeors in the mixed-phase region at the -15 C level, while the second method is based on the vertically integrated amounts of ice hydrometeors in each model grid column. Each method can be calibrated by comparing domain-wide statistics of the peak values of simulated flash rate proxy fields against domain-wide peak total lightning flash rate density data from observations. Tests show that the first method is able to capture much of the temporal variability of the lightning threat, while the second method does a better job of depicting the areal coverage of the threat. Our blended solution is designed to retain most of the temporal sensitivity of the first method, while adding the improved spatial coverage of the second. Exploratory tests for selected North Alabama cases show that, because WRF can distinguish the general character of most convective events, our methods show promise as a means of generating quantitatively realistic fields of lightning threat. However, because the models tend to have more difficulty in predicting the instantaneous placement of storms, forecasts of the detailed location of the lightning threat based on single
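
    The two proxies can be written down compactly for a single model grid column, as in the sketch below. The idealized profiles, the height assigned to the -15 C level, and the calibration constants are all invented stand-ins for WRF fields and the observationally calibrated factors.

```python
# Two lightning-threat proxies for one model grid column (illustrative).
import numpy as np

z = np.linspace(0.0, 16e3, 40)                      # height grid (m)
rho = 1.2 * np.exp(-z / 8000.0)                     # air density (kg m^-3)
q_ice = 4e-3 * np.exp(-((z - 7e3) / 2500.0) ** 2)   # precip. ice mixing ratio
w = 12.0 * np.exp(-((z - 8e3) / 3000.0) ** 2)       # updraft speed (m s^-1)

k15 = np.argmin(np.abs(z - 7.5e3))      # grid level taken as -15 C (assumed)
proxy_flux = w[k15] * q_ice[k15]        # method 1: upward ice flux

# method 2: vertically integrated ice content (trapezoidal column integral)
f = rho * q_ice
proxy_path = np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(z))

c1, c2 = 5.0, 0.3                       # placeholder calibration factors
threat = 0.5 * c1 * proxy_flux + 0.5 * c2 * proxy_path
print(f"flux proxy={proxy_flux:.4f}, path proxy={proxy_path:.2f} kg/m^2, "
      f"blended threat={threat:.2f} (arbitrary flash-rate units)")
```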

  2. RETHINKING VALUE: A VALUE-CENTRIC MODEL OF PRODUCT, SERVICE AND BUSINESS DEVELOPMENT

    DEFF Research Database (Denmark)

    Randmaa, Merili; Mougaard, Krestine; Howard, Thomas J.

    2011-01-01

    Globalization and information technologies have made the economic landscape more transparent and customers smarter, more demanding and networked. Companies can see these changes as a threat to their business or as an opportunity to differentiate in the market and be a Prime Mover, by re-thinking customer value within the value system. This article shows how the term "value" is understood in different contexts and fields of economy, to see if these definitions can be merged, in order to understand the concept of value in a broader way. The authors argue, through literature review and example cases, that seeing value from a multi-disciplinary viewpoint opens up unused opportunities for companies to overcome barriers within a value system, design integrated products and services, work more effectively, co-create value with customers, make use of word-of-mouth promotion and achieve long-term benefits.

  3. Crashworthiness Simulation of Front Bumper Model of MOROLIPI V2 During Head-on Collision

    Directory of Open Access Journals (Sweden)

    Nugraha Aditya Sukma

    2016-01-01

    Full Text Available A bumper is used as a protective component of a vehicle during a collision, so it is necessary to evaluate its impact behavior. In this paper, a crashworthiness simulation of a front bumper model corresponding to the dimensions of MOROLIPI V2 is conducted. The purpose of this study was to obtain simulation results that can be used as a reference to predict the mechanical behaviour of the bumper due to collision. From the simulation results, the deformation and von Mises stress after a collision with a static dummy load can be predicted. ANSYS Explicit Dynamics is used to simulate the impact on the bumper. Simulations were run at three values of mobile robot speed (5, 10 and 20 m/s). The simulation results also show the contact force due to the collision, and the deformation, stress and internal energy of the bumper beam. The vehicle speed was found to be the dominant parameter determining the results of the crashworthiness simulation.

  4. Value Creation in the Cloud: Understanding Business Model Factors Affecting Value of Cloud Computing

    OpenAIRE

    Morgan, Lorraine; Conboy, Kieran

    2013-01-01

    Despite the rapid emergence of cloud technology, its prevalence and accessibility to all types of organizations and its potential to predominantly shift competitive landscapes by providing a new platform for creating and delivering business value, empirical research on the business value of cloud computing, and in particular on how service providers create value for their customers, is quite limited. Of what little research exists to date, most focuses on technical issues.

  5. Evolution of b-value during the seismic cycle: Insights from laboratory experiments on simulated faults

    Science.gov (United States)

    Rivière, J.; Lv, Z.; Johnson, P. A.; Marone, C.

    2018-01-01

    We investigate the evolution of the frequency-magnitude b-value during stable and unstable frictional sliding experiments. Using a biaxial shear configuration, we record broadband acoustic emissions (AE) while shearing layers of simulated granular fault gouge under normal stresses of 2-8 MPa and shearing velocity of 11 μm/s. AE event amplitude ranges over 3-4 orders of magnitude and we find an inverse correlation between b and shear stress. The reduction of b occurs systematically as shear stress rises prior to stick-slip failure and indicates a greater proportion of large events when faults are more highly stressed. For quasi-periodic stick-slip events, the temporal evolution of b has a characteristic saw-tooth pattern: it slowly drops as shear stress increases and quickly jumps back up at the time of failure. The rate of decrease during the inter-seismic period is independent of normal stress but the average value of b decreases systematically with normal stress. For stable sliding, b is roughly constant during shear, however it exhibits large variability. During irregular stick-slip, we see a mix of both behaviors: b decreases during the interseismic period between events and then remains constant when shear stress stabilizes, until the next event where a co-seismic increase is observed. Our results will help improve seismic hazard assessment and, ultimately, could aid earthquake prediction efforts by providing a process-based understanding of temporal changes in b-value during the seismic cycle.
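
    For reference, the b-value in studies of this kind is typically estimated with Aki's maximum-likelihood formula, b = log10(e) / (mean(M) - Mc), where Mc is the completeness magnitude. The sketch below applies it to a synthetic acoustic-emission catalogue whose true b drops mid-sequence, mimicking the inter-seismic decrease described above; the catalogue, window size and completeness magnitude are illustrative.

```python
# Sliding-window maximum-likelihood b-value estimation (Aki's formula).
import numpy as np

rng = np.random.default_rng(5)
# true b drops from 1.5 to 0.8, as under rising shear stress
b_true = np.concatenate([np.full(500, 1.5), np.full(500, 0.8)])
mags = rng.exponential(1 / (b_true * np.log(10)))   # G-R magnitudes, Mc = 0

def b_value(m, mc=0.0, dm=0.0):
    """Aki (1965) MLE; dm is half the magnitude binning width."""
    m = m[m >= mc]
    return np.log10(np.e) / (m.mean() - (mc - dm))

window = 200
for start in range(0, len(mags) - window + 1, window):
    b = b_value(mags[start:start + window])
    print(f"events {start}-{start + window}: b = {b:.2f}")
```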

  6. Vehicle Modeling for Future Generation Transportation Simulation

    Science.gov (United States)

    2009-05-10

    Recent development of inter-vehicular wireless communication technologies have motivated many innovative applications aiming at significantly increasing traffic throughput and improving highway safety. Powerful traffic simulation is an indispensable ...

  7. Modeling phosphorus in the Lake Allatoona watershed using SWAT: I. Developing phosphorus parameter values.

    Science.gov (United States)

    Radcliffe, D E; Lin, Z; Risse, L M; Romeis, J J; Jackson, C R

    2009-01-01

    Lake Allatoona is a large reservoir north of Atlanta, GA, that drains an area of about 2870 km² scheduled for a phosphorus (P) total maximum daily load (TMDL). The Soil and Water Assessment Tool (SWAT) model has been widely used for watershed-scale modeling of P, but there is little guidance on how to estimate P-related parameters, especially those related to in-stream P processes. In this paper, methods are demonstrated to individually estimate SWAT soil-related P parameters and to collectively estimate P parameters related to stream processes. Stream related parameters were obtained using the nutrient uptake length concept. In a manner similar to experiments conducted by stream ecologists, a small point source is simulated in a headwater sub-basin of the SWAT models, then the in-stream parameter values are adjusted collectively to get an uptake length of P similar to the values measured in the streams in the region. After adjusting the in-stream parameters, the P uptake length estimated in the simulations ranged from 53 to 149 km compared to uptake lengths measured by ecologists in the region of 11 to 85 km. Once the a priori P-related parameter set was developed, the SWAT models of main tributaries to Lake Allatoona were calibrated for daily transport. Models using SWAT P parameters derived from the methods in this paper outperformed models using default parameter values when predicting total P (TP) concentrations in streams during storm events and TP annual loads to Lake Allatoona.
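
    The uptake-length diagnostic used to tune the in-stream parameters can be illustrated compactly: with a steady point source, in-stream concentration decays approximately as C(x) = C0 exp(-x/Sw), so the uptake length Sw is minus the inverse slope of ln C against downstream distance. The sketch below recovers Sw from synthetic data; distances, concentrations and noise level are invented.

```python
# Estimating nutrient uptake length from a downstream decline curve.
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(0.0, 40.0, 9)        # km downstream of the point source
sw_true = 12.0                       # km, assumed uptake length
c = 0.25 * np.exp(-x / sw_true) * np.exp(rng.normal(0, 0.03, x.size))

slope, _ = np.polyfit(x, np.log(c), 1)   # ln C is linear in distance
print(f"estimated uptake length Sw = {-1 / slope:.1f} km")
```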

  8. Extreme value modelling of storm damage in Swedish forests

    Directory of Open Access Journals (Sweden)

    A. Bengtsson

    2007-09-01

    Full Text Available Forests cover about 56% of the land area in Sweden, and forest damage due to strong winds has been a recurring problem. In this paper we analyse recorded storm damage in Swedish forests for the years 1965–2007. During this period, 48 individual storm events with a total damage of 164 Mm³ were reported, with the severe storm of 8 to 9 January 2005 the worst, damaging 70 Mm³ of forest. For the analysis, the storm damage data have been normalised to account for the increase in total forest volume over the period.

    We show that, within the framework of statistical extreme value theory, a Poisson point process model can be used to describe these storm damage events. The damage data support a heavy-tailed distribution with great variability in damage for the worst storm events. According to the model, and in view of the available data, the return period for a storm causing damage on the scale of the severe storm of January 2005 is approximately 80 years, i.e. a storm with damage of this magnitude will happen, on average, once every eighty years.

    To investigate a possible temporal trend, models with time-dependent parameters have been analysed but give no conclusive evidence of an increasing trend in the normalised storm damage data for the period. Using a non-parametric approach with a kernel based local-likelihood method gives the same result.
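    A hedged sketch of the return-period arithmetic used in such peaks-over-threshold analyses: with a Poisson rate of threshold exceedances and a generalized Pareto tail fitted to the excesses, the return period of a given damage level is the reciprocal of the annual exceedance probability. The damage values below are hypothetical stand-ins, not the paper's data (scipy is assumed available):

      import numpy as np
      from scipy import stats

      # Hypothetical normalised damage volumes (Mm^3) for storms above a threshold
      damages = np.array([1.2, 1.5, 2.0, 2.4, 3.1, 4.0, 5.5, 7.0, 9.8, 14.0, 33.0, 70.0])
      threshold = 1.0
      years = 2007 - 1965 + 1          # observation window
      rate = len(damages) / years      # Poisson rate of exceedances per year

      # Fit a generalized Pareto distribution to the excesses over the threshold
      shape, loc, scale = stats.genpareto.fit(damages - threshold, floc=0.0)

      def return_period(z):
          # Mean recurrence interval (years) of a storm with damage exceeding z
          p_exceed = stats.genpareto.sf(z - threshold, shape, loc=loc, scale=scale)
          return 1.0 / (rate * p_exceed)

      print(return_period(70.0))  # return period of a 2005-sized event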

  9. Study of the long-term values and prices of plutonium; a simplified parametrized model

    International Nuclear Information System (INIS)

    Gaussens, J.; Paillot, H.

    1965-01-01

    The authors define the notions of the use value and price of plutonium. They give a 'simplified parametrized model' simulating the equilibrium of supply and demand over time for plutonium, and the price deriving from the relative scarcity of this metal, taking into account the technical and economic operating parameters of the various reactors considered. This model is simple enough to allow direct computations and to establish clear relations between the various parameters. The use of the linear programming method, on the other hand, allows a wide extension of the model. This report comprises three main parts: I - General description of the study (without detailed calculations) II - Mathematical development of the simplified parametrized model and application (the basic data and the results of the calculations are given) III - Appendices (giving the detailed computations of part II). (authors) [fr

  10. What do you do when the binomial cannot value real options? The LSM model

    Directory of Open Access Journals (Sweden)

    S. Alonso

    2014-12-01

    Full Text Available The Least-Squares Monte Carlo model (LSM model has emerged as the derivative valuation technique with the greatest impact in current practice. As with other options valuation models, the LSM algorithm was initially posited in the field of financial derivatives and its extension to the realm of real options requires considering certain questions which might hinder understanding of the algorithm and which the present paper seeks to address. The implementation of the LSM model combines Monte Carlo simulation, dynamic programming and statistical regression in a flexible procedure suitable for application to valuing nearly all types of corporate investments. The goal of this paper is to show how the LSM algorithm is applied in the context of a corporate investment, thus contributing to the understanding of the principles of its operation.
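    A compact sketch of the LSM algorithm the abstract describes, for the textbook case of an American put under geometric Brownian motion: simulate paths, then step backward, regressing discounted continuation values on the current price over in-the-money paths and exercising where the immediate payoff exceeds the fitted continuation value. Parameter values are illustrative only:

      import numpy as np

      def lsm_american_put(s0=100.0, k=100.0, r=0.05, sigma=0.2, t=1.0,
                           steps=50, paths=20000, seed=1):
          # Least-Squares Monte Carlo price of an American put (Longstaff-Schwartz)
          rng = np.random.default_rng(seed)
          dt = t / steps
          disc = np.exp(-r * dt)
          # Simulate geometric Brownian motion paths
          z = rng.standard_normal((paths, steps))
          s = s0 * np.exp(np.cumsum((r - 0.5 * sigma**2) * dt
                                    + sigma * np.sqrt(dt) * z, axis=1))
          cash = np.maximum(k - s[:, -1], 0.0)       # exercise value at maturity
          for i in range(steps - 2, -1, -1):
              cash *= disc
              itm = k - s[:, i] > 0.0                # regress on in-the-money paths only
              if itm.any():
                  coeff = np.polyfit(s[itm, i], cash[itm], 2)
                  continuation = np.polyval(coeff, s[itm, i])
                  exercise = k - s[itm, i]
                  cash[itm] = np.where(exercise > continuation, exercise, cash[itm])
          return disc * cash.mean()

      print(lsm_american_put())  # close to lattice benchmarks (~6.1) for these inputs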

  11. Integrated Biosphere Simulator Model (IBIS), Version 2.5

    Data.gov (United States)

    National Aeronautics and Space Administration — ABSTRACT: The Integrated Biosphere Simulator (or IBIS) is designed to be a comprehensive model of the terrestrial biosphere. The model represents a wide range of...

  12. A simulation model for forecasting downhill ski participation

    Science.gov (United States)

    Daniel J. Stynes; Daniel M. Spotts

    1980-01-01

    The purpose of this paper is to describe progress in the development of a general computer simulation model to forecast future levels of outdoor recreation participation. The model is applied and tested for downhill skiing in Michigan.

  13. Integrated Biosphere Simulator Model (IBIS), Version 2.5

    Data.gov (United States)

    National Aeronautics and Space Administration — The Integrated Biosphere Simulator (or IBIS) is designed to be a comprehensive model of the terrestrial biosphere. The model represents a wide range of processes,...

  14. Six Conductivity Values to Use in the Bidomain Model of Cardiac Tissue.

    Science.gov (United States)

    Johnston, Barbara M

    2016-07-01

    The aim of this work is to produce a consistent set of six conductivity values for use in the bidomain model of cardiac tissue. Studies in 2007 by Hooks et al. and in 2009 by Caldwell et al. have found that, in the directions longitudinal:transverse:normal (l:t:n) to the cardiac fibers, ratios of bulk conductivities and conduction velocities are each approximately in the ratio 4:2:1. These results are used here as the basis for a method that can find sets of six normalized bidomain conductivity values. It is found that the ratios involving transverse and normal conductivities are quite consistent, allowing new light to be shed on conductivity in the normal direction. For example, it is found that the ratio of transverse to normal conductivity is much greater in the intracellular (i) than the extracellular (e) domain. Using parameter values from experimental studies leads to the proposal of a new nominal six conductivity dataset: gil=2.4, gel=2.4, git=0.35, get=2.0, gin=0.08, and gen=1.1 (all in mS/cm). When it is used to model partial thickness ischaemia, this dataset produces epicardial potential distributions in accord with experimental studies in an animal model. It is, therefore, suggested that the dataset is suitable for use in numerical simulations. Since the bidomain approach is the most commonly used method for modeling cardiac electrophysiological phenomena, new information about conductivity in the normal direction, as well as a consistent set of six conductivity values, is valuable for researchers who perform simulation studies.
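    The proposed dataset is concrete enough to encode directly; a small sketch that records the six nominal conductivities quoted above and checks the transverse-to-normal ratios the abstract highlights:

      # Nominal six-conductivity bidomain dataset quoted in the abstract (mS/cm)
      g = {"il": 2.4, "el": 2.4, "it": 0.35, "et": 2.0, "in": 0.08, "en": 1.1}

      # Ratio of transverse to normal conductivity in each domain: much larger
      # intracellularly than extracellularly, as the abstract notes.
      print("intracellular t/n:", g["it"] / g["in"])   # ~4.4
      print("extracellular t/n:", g["et"] / g["en"])   # ~1.8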

  15. Modification of SWAT model for simulation of organic matter in Korean watersheds.

    Science.gov (United States)

    Jang, Jae-Ho; Jung, Kwang-Wook; Gyeong Yoon, Chun

    2012-01-01

    The focus of water quality modeling of Korean streams needs to be shifted from dissolved oxygen to algae or organic matter. In particular, the structure of water quality models should be modified to simulate the biochemical oxygen demand (BOD), which is a key factor in calculating total maximum daily loads (TMDLs) in Korea, using 5-day BOD determined in the laboratory (Bottle BOD(5)). Considering the limitations in simulating organic matter under domestic conditions, we attempted to model total organic carbon (TOC) as well as BOD by using a watershed model. For this purpose, the Soil and Water Assessment Tool (SWAT) model was modified and extended to achieve better correspondence between the measured and simulated BOD and TOC concentrations. For simulated BOD in the period 2004-2008, the Nash-Sutcliffe model efficiency coefficient increased from a value of -2.54 to 0.61. Another indicator of organic matter, namely, the simulated TOC concentration showed that the modified SWAT adequately reflected the observed values. The improved model can be used to predict organic matter and hence, may be a potential decision-making tool for TMDLs. However, it needs further testing for longer simulation periods and other catchments.
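    For reference, the Nash-Sutcliffe efficiency quoted above (from -2.54 to 0.61) is computed from paired observed and simulated series; a minimal sketch with hypothetical BOD values:

      import numpy as np

      def nash_sutcliffe(observed, simulated):
          # Nash-Sutcliffe model efficiency: 1 is perfect, <0 is worse than the mean
          obs = np.asarray(observed, dtype=float)
          sim = np.asarray(simulated, dtype=float)
          return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

      # Hypothetical daily BOD concentrations (mg/L)
      obs = [2.1, 2.4, 3.0, 2.8, 2.2]
      sim = [2.0, 2.5, 2.9, 2.6, 2.3]
      print(nash_sutcliffe(obs, sim))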

  16. Simulation-based modeling of building complexes construction management

    Science.gov (United States)

    Shepelev, Aleksandr; Severova, Galina; Potashova, Irina

    2018-03-01

    The study reported here examines the experience in the development and implementation of business simulation games based on network planning and management of high-rise construction. Appropriate network models of different types and levels of detail have been developed; a simulation model including 51 blocks (11 stages combined in 4 units) is proposed.

  17. Analyzing Interaction Patterns to Verify a Simulation/Game Model

    Science.gov (United States)

    Myers, Rodney Dean

    2012-01-01

    In order for simulations and games to be effective for learning, instructional designers must verify that the underlying computational models being used have an appropriate degree of fidelity to the conceptual models of their real-world counterparts. A simulation/game that provides incorrect feedback is likely to promote misunderstanding and…

  18. Application of computer simulated persons in indoor environmental modeling

    DEFF Research Database (Denmark)

    Topp, C.; Nielsen, P. V.; Sørensen, Dan Nørtoft

    2002-01-01

    Computer simulated persons are often applied when the indoor environment is modeled by computational fluid dynamics. The computer simulated persons differ in size, shape, and level of geometrical complexity, ranging from simple box or cylinder shaped heat sources to more humanlike models. Little...

  19. Software Requirements Specification Verifiable Fuel Cycle Simulation (VISION) Model

    International Nuclear Information System (INIS)

    D. E. Shropshire; W. H. West

    2005-01-01

    The purpose of this Software Requirements Specification (SRS) is to define the top-level requirements for a Verifiable Fuel Cycle Simulation Model (VISION) of the Advanced Fuel Cycle (AFC). This simulation model is intended to serve as a broad systems analysis and study tool applicable to work conducted as part of the AFCI (including cost estimates) and Generation IV reactor development studies.

  20. Active site modeling in copper azurin molecular dynamics simulations

    NARCIS (Netherlands)

    Rizzuti, B; Swart, M; Sportelli, L; Guzzi, R

    Active site modeling in molecular dynamics simulations is investigated for the reduced state of copper azurin. Five simulation runs (5 ns each) were performed at room temperature to study the consequences of a mixed electrostatic/constrained modeling for the coordination between the metal and the

  1. New Simulation Models for Addressing Like X–Aircraft Responses ...

    African Journals Online (AJOL)

    New Simulation Models for Addressing Like X–Aircraft Responses. AS Mohammed, SO Abdulkareem. Abstract. The original Monte Carlo model was previously modified for use in simulating data that conform to certain resource flow constraints. Recent encounters in communication and controls render these data obsolete ...

  2. Experimental Design for Sensitivity Analysis of Simulation Models

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2001-01-01

    This introductory tutorial gives a survey on the use of statistical designs for what-if or sensitivity analysis in simulation. This analysis uses regression analysis to approximate the input/output transformation that is implied by the simulation model; the resulting regression model is also known as

  3. Object Oriented Toolbox for Modelling and Simulation of Dynamic Systems

    DEFF Research Database (Denmark)

    Thomsen, Per Grove; Poulsen, Mikael Zebbelin; Wagner, Falko Jens

    1999-01-01

    Design and implementation of a simulation toolbox based on object-oriented modelling techniques. Experimental implementation in C++ using the Godess ODE-solution platform.

  4. Exploiting Modelling and Simulation in Support of Cyber Defence

    NARCIS (Netherlands)

    Klaver, M.H.A.; Boltjes, B.; Croom-Jonson, S.; Jonat, F.; Çankaya, Y.

    2014-01-01

    The rapidly evolving environment of Cyber threats against the NATO Alliance has necessitated a renewed focus on the development of Cyber Defence policy and capabilities. The NATO Modelling and Simulation Group is looking for ways to leverage Modelling and Simulation experience in research, analysis

  5. Model simulations of rainfall over southern Africa and its eastern ...

    African Journals Online (AJOL)

    Rainfall simulations over southern and tropical Africa in the form of low-resolution Atmospheric Model Intercomparison Project (AMIP) simulations and higher resolution National Centre for Environmental Prediction (NCEP) reanalysis downscalings are presented and evaluated in this paper. The model used is the ...

  6. ASSETS MANAGEMENT - A CONCEPTUAL MODEL DECOMPOSING VALUE FOR THE CUSTOMER AND A QUANTITATIVE MODEL

    Directory of Open Access Journals (Sweden)

    Susana Nicola

    2015-03-01

    Full Text Available In this paper we describe the application of a modeling framework, the so-called Conceptual Model Decomposing Value for the Customer (CMDVC), in a Footwear Industry case study, to ascertain the usefulness of this approach. The value networks were used to identify the participants, both tangible and intangible deliverables/endogenous and exogenous assets, and to analyse their interactions as the indication for an adequate value proposition. The quantitative model of benefits and sacrifices, using the Fuzzy AHP method, enables the discussion of how the CMDVC can be applied and used in the enterprise environment and provides new relevant relations between perceived benefits (PBs).

  7. Modeling and simulation of a wheatstone bridge pressure sensor in high temperature with VHDL-AMS

    OpenAIRE

    Baccar, Sahbi; Levi, Timothée; Dallet, Dominique; Barbara, François

    2013-01-01

    This paper presents a model of a Wheatstone bridge sensor in VHDL-AMS. This model is useful for taking into account the temperature effect on the sensor accuracy. The model is developed on the basis of a resistor model. Simulations are performed for three different combinations of parameter values. They confirm the resistor mismatch effect on the sensor accuracy at high temperature (HT).
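    The underlying bridge behaviour is standard: the output is the difference between two resistive dividers, so any temperature-induced mismatch between nominally equal resistors appears directly as an offset. A small sketch (illustrative values, not the paper's VHDL-AMS model):

      def bridge_output(v_in, r1, r2, r3, r4):
          # Differential output of a Wheatstone bridge excited by v_in;
          # r1-r2 and r3-r4 form the two dividers. With all four resistors
          # equal, the bridge is balanced and the output is zero.
          return v_in * (r3 / (r3 + r4) - r2 / (r1 + r2))

      # Balanced bridge, then a 1% mismatch on one arm (e.g. a drifted resistor at HT)
      print(bridge_output(5.0, 1000.0, 1000.0, 1000.0, 1000.0))  # 0.0
      print(bridge_output(5.0, 1000.0, 1000.0, 1010.0, 1000.0))  # ~12 mV offset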

  8. Dynamic wind turbine models in power system simulation tool

    DEFF Research Database (Denmark)

    Hansen, A.; Jauch, Clemens; Soerensen, P.

    The present report describes the dynamic wind turbine models implemented in the power system simulation tool DIgSILENT. The developed models are a part of the results of a national research project, whose overall objective is to create a model database in different simulation tools. The report provides a description of the wind turbine modelling, both at a component level and at a system level.

  9. An Updated Geophysical Model for AMSR-E and SSMIS Brightness Temperature Simulations over Oceans

    Directory of Open Access Journals (Sweden)

    Elizaveta Zabolotskikh

    2014-03-01

    Full Text Available In this study, we considered the geophysical model for microwave brightness temperature (BT simulation for the Atmosphere-Ocean System under non-precipitating conditions. The model is presented as a combination of atmospheric absorption and ocean emission models. We validated this model for two satellite instruments—for Advanced Microwave Sounding Radiometer-Earth Observing System (AMSR-E onboard Aqua satellite and for Special Sensor Microwave Imager/Sounder (SSMIS onboard F16 satellite of Defense Meteorological Satellite Program (DMSP series. We compared simulated BT values with satellite BT measurements for different combinations of various water vapor and oxygen absorption models and wind induced ocean emission models. A dataset of clear sky atmospheric and oceanic parameters, collocated in time and space with satellite measurements, was used for the comparison. We found the best model combination, providing the least root mean square error between calculations and measurements. A single combination of models ensured the best results for all considered radiometric channels. We also obtained the adjustments to simulated BT values, as averaged differences between the model simulations and satellite measurements. These adjustments can be used in any research based on modeling data for removing model/calibration inconsistencies. We demonstrated the application of the model by means of the development of the new algorithm for sea surface wind speed retrieval from AMSR-E data.

  10. Modeling Source Water Threshold Exceedances with Extreme Value Theory

    Science.gov (United States)

    Rajagopalan, B.; Samson, C.; Summers, R. S.

    2016-12-01

    Variability in surface water quality, influenced by seasonal and long-term climate changes, can impact drinking water quality and treatment. In particular, temperature and precipitation can impact surface water quality directly or through their influence on streamflow and dilution capacity. Furthermore, they also impact land surface factors, such as soil moisture and vegetation, which can in turn affect surface water quality, in particular, levels of organic matter in surface waters which are of concern. All of these will be exacerbated by anthropogenic climate change. While some source water quality parameters, particularly Total Organic Carbon (TOC) and bromide concentrations, are not directly regulated for drinking water, these parameters are precursors to the formation of disinfection byproducts (DBPs), which are regulated in drinking water distribution systems. These DBPs form when a disinfectant, added to the water to protect public health against microbial pathogens, most commonly chlorine, reacts with dissolved organic matter (DOM), measured as TOC or dissolved organic carbon (DOC), and inorganic precursor materials, such as bromide. Therefore, understanding and modeling the extremes of TOC and Bromide concentrations is of critical interest for drinking water utilities. In this study we develop nonstationary extreme value analysis models for threshold exceedances of source water quality parameters, specifically TOC and bromide concentrations. In this, the threshold exceedances are modeled as Generalized Pareto Distribution (GPD) whose parameters vary as a function of climate and land surface variables - thus, enabling to capture the temporal nonstationarity. We apply these to model threshold exceedance of source water TOC and bromide concentrations at two locations with different climate and find very good performance.
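    A sketch of the nonstationary GPD idea: excesses over a threshold are given a GPD likelihood whose scale is log-linear in a covariate, and the parameters are found by maximum likelihood. The data and covariate here are synthetic placeholders (scipy assumed):

      import numpy as np
      from scipy.optimize import minimize

      # Hypothetical: TOC exceedances over a threshold, with a climate covariate
      # (e.g. antecedent precipitation) that modulates the GPD scale parameter.
      rng = np.random.default_rng(2)
      covariate = rng.uniform(0.0, 1.0, 200)
      true_scale = np.exp(0.2 + 1.0 * covariate)
      excess = rng.exponential(true_scale)           # shape ~ 0 for simplicity

      def neg_log_lik(params):
          # GPD negative log-likelihood, log-linear scale, constant shape xi
          b0, b1, xi = params
          scale = np.exp(b0 + b1 * covariate)
          if abs(xi) < 1e-8:                         # exponential limit as xi -> 0
              return np.sum(np.log(scale) + excess / scale)
          z = 1.0 + xi * excess / scale
          if np.any(z <= 0.0):
              return np.inf
          return np.sum(np.log(scale) + (1.0 + 1.0 / xi) * np.log(z))

      fit = minimize(neg_log_lik, x0=[0.0, 0.0, 0.1], method="Nelder-Mead")
      print(fit.x)  # recovered (b0, b1, shape)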

  11. Improving the simulation of convective dust storms in regional-to-global models

    Science.gov (United States)

    Foroutan, Hosein; Pleim, Jonathan E.

    2017-09-01

    Convective dust storms have significant impacts on atmospheric conditions and air quality and are a major source of dust uplift in summertime. However, regional-to-global models generally do not accurately simulate these storms, a limitation that can be attributed to (1) using a single mean value for wind speed per grid box, i.e., not accounting for subgrid wind variability and (2) using convective parametrizations that poorly simulate cold pool outflows. This study aims to improve the simulation of convective dust storms by tackling these two issues. Specifically, we incorporate a probability distribution function for surface wind in each grid box to account for subgrid wind variability due to dry and moist convection. Furthermore, we use lightning assimilation to increase the accuracy of the convective parameterization and simulated cold pool outflows. This updated model framework is used to simulate a massive convective dust storm that hit Phoenix, AZ, on 6 July 2011. The results show that lightning assimilation provides a more realistic simulation of precipitation features, including timing and location, and the resulting cold pool outflows that generated the dust storm. When those results are combined with a dust model that accounts for subgrid wind variability, the prediction of dust uplift and concentrations are considerably improved compared to the default model results. This modeling framework could potentially improve the simulation of convective dust storms in global models, regional climate simulations, and retrospective air quality studies.
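    The subgrid-wind argument can be made concrete with a toy calculation: a threshold flux law applied to a single grid-mean wind emits nothing, while integrating the same law over an assumed Weibull distribution of subgrid winds does, because the tail crosses the threshold. The flux law and parameters below are schematic, not the paper's scheme:

      import numpy as np

      def weibull_pdf(u, k, lam):
          # Weibull wind-speed PDF with shape k and scale lam
          return (k / lam) * (u / lam) ** (k - 1) * np.exp(-((u / lam) ** k))

      def dust_flux(u, u_t=8.0):
          # Schematic threshold flux law: zero below u_t, ~u^2*(u - u_t) above
          return np.where(u > u_t, u**2 * (u - u_t), 0.0)

      u_mean, k = 6.0, 2.0                 # grid-box mean wind below the threshold
      lam = u_mean / 0.8862                # Weibull scale so that E[u] = u_mean (k=2)
      u = np.linspace(0.0, 30.0, 3001)
      du = u[1] - u[0]
      subgrid = np.sum(dust_flux(u) * weibull_pdf(u, k, lam)) * du

      print(dust_flux(u_mean))   # 0.0 -- a single mean wind emits nothing
      print(subgrid)             # > 0 -- the distribution tail crosses the threshold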

  12. A Tower Model for Lightning Overvoltage Studies Based on the Result of an FDTD Simulation

    Science.gov (United States)

    Noda, Taku

    This paper describes a method for deriving a transmission tower model for EMTP lightning overvoltage studies from a numerical electromagnetic simulation result obtained by the FDTD (Finite Difference Time Domain) method. The FDTD simulation carried out in this paper takes into account the following items, which have been ignored or over-simplified in previously presented simulations: (i) resistivity of the ground soil; (ii) arms, major slant elements, and foundations of the tower; (iii) development speed of the lightning return stroke. For validation purposes, a pulse test of a 500-kV transmission tower is simulated, and a comparison with the measured result shows that the present FDTD simulation gives a sufficiently accurate result. Using this validated FDTD-based simulation method, the insulator-string voltages of a tower for a lightning stroke are calculated, and based on the simulation result the parameter values of the proposed tower model for EMTP studies are determined in a systematic way. Since previously presented models involve a trial-and-error process in the parameter determination, the proposed model is more general in this regard. As an illustrative example, the 500-kV transmission tower mentioned above is modeled, and it is shown that the derived model closely reproduces the FDTD simulation result.

  13. A multi-surface plasticity model for ductile fracture simulations

    Science.gov (United States)

    Keralavarma, Shyam M.

    2017-06-01

    The growth and coalescence of micro-voids in a material undergoing ductile fracture depends strongly on the loading path. Void growth occurs by diffuse plasticity in the material and is sensitive to the hydrostatic stress, while void coalescence occurs by the localization of plastic deformation in the inter-void ligaments under a combination of normal and shear stresses on the localization plane. In this paper, a micromechanics-based plasticity model is developed for an isotropic porous material, accounting for both diffuse and localized modes of plasticity at the micro-scale. A multi-surface approach is adopted, and two existing plasticity models that separately account for the two modes of yielding, above, are synthesized to propose an effective isotropic yield criterion and associated state evolution equations. The yield criterion is validated by comparison with quasi-exact numerical yield loci computed using a finite elements based limit analysis procedure. It is shown that the new criterion is in better agreement with the numerical loci than the Gurson model, particularly for large values of the porosity for which the loading path dependence of the yield stress is well predicted by the new model. Even at small porosities, it is shown that the new model predicts marginally lower yield stresses under low triaxiality shear dominated loadings compared to the Gurson model, in agreement with the numerical limit analysis data. Predictions for the strains to the onset of coalescence under proportional loading, obtained by numerically integrating the model, indicate that void coalescence tends to occur at relatively small plastic strain and porosity levels under shear dominated loadings. Implications on the prediction of ductility using the new model in fracture simulations are discussed.
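    For orientation, the Gurson model that the new criterion is compared against has a closed-form yield function; a short sketch evaluating it shows why porosity matters most at high stress triaxiality (q1, q2 are the usual Tvergaard parameters; values illustrative):

      import numpy as np

      def gurson_yield(sigma_eq, sigma_m, sigma_y, f, q1=1.5, q2=1.0):
          # Gurson-Tvergaard yield function; <= 0 inside the elastic domain.
          # sigma_eq: von Mises stress, sigma_m: mean (hydrostatic) stress,
          # sigma_y: matrix yield stress, f: void volume fraction (porosity).
          return ((sigma_eq / sigma_y) ** 2
                  + 2.0 * q1 * f * np.cosh(1.5 * q2 * sigma_m / sigma_y)
                  - 1.0 - (q1 * f) ** 2)

      # Porosity weakens the response most at high stress triaxiality
      for triax in (0.33, 1.0, 3.0):        # sigma_m / sigma_eq
          s_eq = 0.9                        # trial von Mises stress / sigma_y
          print(triax, gurson_yield(s_eq, triax * s_eq, 1.0, f=0.05))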

  14. Battery System Modeling for a Military Electric Propulsion Vehicle with a Fault Simulation

    Directory of Open Access Journals (Sweden)

    Hyeongcheol Lee

    2013-10-01

    Full Text Available This paper describes the development process and results of a battery system model with fault simulation for electric propulsion vehicles. The developed battery system model can be used to verify control and fault diagnosis strategies of the supervisory controller in an electric propulsion vehicle. To develop this battery system model, three sub-models, including a battery model, a relay assembly model, and a battery management system (BMS) model, are connected together as in the real target battery system. Comparison between the real battery system hardware and the battery system model shows similar tendencies and values. Furthermore, the fault injection test of the model shows that the proposed battery system model can simulate a failure situation consistent with a real system. The model can thus emulate the battery characteristics and fault situations when used in the development process of a BMS or of supervisory control strategies for electric propulsion systems.

  15. Value at risk (VaR in uncertainty: Analysis with parametric method and black & scholes simulations

    Directory of Open Access Journals (Sweden)

    Humberto Banda Ortiz

    2014-07-01

    Full Text Available VaR is the most widely accepted risk measure worldwide and the leading reference in any risk management assessment. However, its methodology has important limitations which make it unreliable in contexts of crisis or high uncertainty. For this reason, the aim of this work is to test the accuracy of VaR when employed in contexts of volatility, for which we compare the VaR outcomes in scenarios of both stability and uncertainty, using the parametric method and a historical simulation based on data generated with the Black & Scholes model. VaR's main objective is the prediction of the highest expected loss for any given portfolio, but even though it is considered a useful tool for risk management under conditions of market stability, we found that it is substantially inaccurate in contexts of crisis or high uncertainty. In addition, we found that the Black & Scholes simulations lead to underestimating the expected losses in comparison with the parametric method, and that those disparities increase substantially in times of crisis. In the first section of this work we present a brief context of risk management in finance. In section II we present the existing literature on the VaR concept, its methods and applications. In section III we describe the methodology and assumptions used in this work. Section IV is dedicated to the findings. And finally, in Section V we present our conclusions.
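    A minimal sketch of the parametric (variance-covariance) VaR used in the comparison, assuming normally distributed returns; the portfolio figures are hypothetical:

      import numpy as np
      from scipy import stats

      def parametric_var(value, mu, sigma, alpha=0.99, horizon_days=1):
          # Parametric VaR under normally distributed returns: the loss that is
          # exceeded with probability (1 - alpha) over the horizon.
          z = stats.norm.ppf(1.0 - alpha)
          return -value * (mu * horizon_days + z * sigma * np.sqrt(horizon_days))

      # Hypothetical portfolio: 1 M, zero mean daily return, 2% daily volatility
      print(parametric_var(1_000_000, mu=0.0, sigma=0.02, alpha=0.99))  # ~46,500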

  16. Cooperative Learning Models Simulation: From Abstract to Concrete

    Directory of Open Access Journals (Sweden)

    Agustini Ketut

    2018-01-01

    Full Text Available This study aimed to develop a simulation of cooperative learning models for students who are prospective teachers, to improve the quality of learning and in particular their preparedness for classroom microteaching. The outcomes can also be used more widely by teachers and lecturers to improve their professionalism as educators. The method used is research and development (R&D), following the Dick & Carey development model. The main steps were to (a) conduct an in-depth theoretical study of the simulation software to be generated, based on the cooperative learning models to be developed, (b) formulate the simulation software system design based on the results of the theoretical study, and (c) conduct a formative evaluation, carried out by a content expert, a design expert, and a media expert to establish the validity of the simulation media, followed by one-to-one student evaluation, small-group evaluation, and field-trial evaluation. The results showed that the software can simulate three cooperative learning models well. Student responses to the simulated models were 60% very positive and 40% positive. The implication of this result is that prospective teachers can apply cooperative learning models well when teaching in real classes at training schools, provided they are first given realistic simulated examples of how cooperative learning is implemented in class.

  17. A Java simulator of Rescorla and Wagner's prediction error model and configural cue extensions.

    Science.gov (United States)

    Alonso, Eduardo; Mondragón, Esther; Fernández, Alberto

    2012-10-01

    In this paper we present the "R&W Simulator" (version 3.0), a Java simulator of Rescorla and Wagner's prediction error model of learning. It is able to run whole experimental designs, and to compute and display the associative values of elemental and compound stimuli simultaneously, as well as use extra configural cues in generating compound values; it also permits changing the US parameters across phases. The simulator produces both numerical and graphical outputs, and includes a functionality to export the results to a data processor spreadsheet. It is user-friendly, and built with a graphical interface designed to allow neuroscience researchers to input the data in their own "language". It is a cross-platform simulator, so it does not require any special equipment, operating system or support program, and does not need installation. The "R&W Simulator" (version 3.0) is available free. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
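    The prediction-error rule the simulator implements is compact enough to sketch directly: on each trial, every present cue's associative value moves by its salience times the prediction error. The snippet below reproduces the classic blocking result (parameter values illustrative):

      def rescorla_wagner(trials, alphas, beta=0.1, lam=1.0):
          # Trial-by-trial associative values under the Rescorla-Wagner rule.
          # trials: list of (present_stimuli, reinforced) pairs; the prediction
          # error (lam - total V of present cues) drives learning for each cue.
          v = {cs: 0.0 for cs in alphas}
          history = []
          for present, reinforced in trials:
              v_total = sum(v[cs] for cs in present)
              error = (lam if reinforced else 0.0) - v_total
              for cs in present:
                  v[cs] += alphas[cs] * beta * error
              history.append(dict(v))
          return history

      # Blocking design: A+ training, then compound AB+; B gains little value
      train = [(("A",), True)] * 50 + [(("A", "B"), True)] * 50
      print(rescorla_wagner(train, alphas={"A": 0.3, "B": 0.3})[-1])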

  18. RETHINKING VALUE: A VALUE-CENTRIC MODEL OF PRODUCT, SERVICE AND BUSINESS DEVELOPMENT

    DEFF Research Database (Denmark)

    Randmaa, Merili; Mougaard, Krestine; Howard, Thomas J.

    2011-01-01

    Globalization and information technologies have made the economic landscape more transparent and customers smarter, more demanding and networked. Companies can see these changes as a threat to their business or as an opportunity to differentiate in the market and be a Prime Mover, by re...... that seeing value from a multi-disciplinary viewpoint opens up some unused opportunities for companies to overcome barriers within a value system, design integrated products and services, work more effectively, co-create value with customers, make use of word-of-mouth promotion and achieve long...

  19. Fault diagnostics in power transformer model winding for different alpha values

    Directory of Open Access Journals (Sweden)

    G.H. Kusumadevi

    2015-09-01

    Full Text Available Transient overvoltages appearing at the line terminal of power transformer HV windings can cause failure of the winding insulation. The failure can be from winding to ground or between turns or sections of the winding. In most cases, failure from winding to ground can be detected by changes in the wave shape of the surge voltage appearing at the line terminal. However, detection of insulation failure between turns may be difficult due to the intricacies involved in identifying the faults. In this paper, simulation investigations carried out on a power transformer model winding for identifying faults between turns of the winding are reported. The HV winding has been represented by 8 sections, 16 sections and 24 sections. The neutral current waveform has been analyzed for the same model winding represented by different numbers of sections. The values of α considered for the windings are 5, 10 and 20, where 'α' is the square root of the ratio of total ground capacitance to total series capacitance of the winding. A standard lightning impulse voltage (1.2/50 μs) wave shape has been considered for the analysis. Computer simulations have been carried out using the software PSPICE version 10.0. Neutral current and frequency response analysis methods have been used for identification of faults within sections of the transformer model winding.

  20. MOVES (MOTOR VEHICLE EMISSION SIMULATOR) MODEL ...

    Science.gov (United States)

    A computer model, intended to eventually replace the MOBILE model and to incorporate the NONROAD model, that will provide the ability to estimate criteria and toxic air pollutant emission factors and emission inventories that are specific to the areas and time periods of interest, at scales ranging from local to national. Development of a new emission factor and inventory model for mobile source emissions. The model will be used by air pollution modelers within EPA, and at the State and local levels.

  1. Sensitivity of soil water content simulation to different methods of soil hydraulic parameter characterization as initial input values

    Science.gov (United States)

    Rezaei, Meisam; Seuntjens, Piet; Shahidi, Reihaneh; Joris, Ingeborg; Boënne, Wesley; Cornelis, Wim

    2016-04-01

    Genuchten parameters αvG and n as αG ≈ αvG n. The laboratory measurement of Kls yielded 2-30 times higher values than the field method Kfs from the top to the subsoil layers, while there was a significant correlation between both Ks values (r = 0.75). We found significant differences in the MVG parameters θs, n and α between laboratory and field measurements, but again a significant correlation was observed between laboratory and field MVG parameters Ks, n and θs (r ≥ 0.59). Assessment of parameter relevance in 1-D model simulations showed better simulation performance when using the laboratory data set from middle to deeper depths (30 to 60 cm). In contrast, the field-experiment parameter sets, which were obtained in a fast and simple way (less time-consuming and labor-intensive), resulted in slightly better soil-water content simulation performance in the topsoil (10 and 20 cm), where the plant roots are concentrated. Generally, in view of precision agriculture, field measurements and inverse optimization approaches are preferred for determining soil hydraulic properties. Based on the results, however, it is not possible to judge whether laboratory or field methods should be preferred, or which data set is most appropriate to predict soil water fluctuations over a complete soil profile.
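    The MVG parameters discussed above enter the simulations through the Mualem-van Genuchten retention curve; a short sketch with hypothetical sandy-topsoil parameters (not the study's fitted values):

      import numpy as np

      def van_genuchten(h, theta_r, theta_s, alpha, n):
          # Mualem-van Genuchten water retention: water content theta as a
          # function of suction head h (cm), with m = 1 - 1/n.
          m = 1.0 - 1.0 / n
          se = (1.0 + (alpha * np.abs(h)) ** n) ** (-m)   # effective saturation
          return theta_r + (theta_s - theta_r) * se

      # Hypothetical sandy topsoil parameters
      h = np.array([1.0, 10.0, 100.0, 1000.0, 15000.0])   # near-saturation to wilting point
      print(van_genuchten(h, theta_r=0.05, theta_s=0.40, alpha=0.035, n=1.8))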

  2. Induction generator models in dynamic simulation tools

    DEFF Research Database (Denmark)

    Knudsen, Hans; Akhmatov, Vladislav

    1999-01-01

    For AC networks with a large amount of induction generators (windmills), the paper demonstrates a significant discrepancy in the simulated voltage recovery after faults in weak networks when comparing dynamic and transient stability descriptions, and the reasons for the discrepancies are explained.

  3. Optical modeling and simulation of thin-film photovoltaic devices

    CERN Document Server

    Krc, Janez

    2013-01-01

    In wafer-based and thin-film photovoltaic (PV) devices, the management of light is a crucial aspect of optimization since trapping sunlight in active parts of PV devices is essential for efficient energy conversions. Optical modeling and simulation enable efficient analysis and optimization of the optical situation in optoelectronic and PV devices. Optical Modeling and Simulation of Thin-Film Photovoltaic Devices provides readers with a thorough guide to performing optical modeling and simulations of thin-film solar cells and PV modules. It offers insight on examples of existing optical models

  4. The invaluable benefits of modeling and simulation in our lives

    Energy Technology Data Exchange (ETDEWEB)

    Lorencez, C., E-mail: carlos.lorencez@opg.com [Ontario Power Generation, Nuclear Safety Div., Pickering, Ontario (Canada)

    2015-07-01

    'Full text:' In general terms, we associate the words 'modeling and simulation' with semi-ideal mathematical models reproducing complex Engineering problems. However, the use of modeling and simulation is much more extensive than that: it is applied on a daily basis in almost every front of Science, from sociology and biology to climate change, medicine, robotics, war strategies, etc. It is also being applied by our frontal lobe when we make decisions. The results of these exercises on modeling and simulation have had invaluable benefits on our well being, and we are just at the beginning. (author)

  5. The invaluable benefits of modeling and simulation in our lives

    International Nuclear Information System (INIS)

    Lorencez, C.

    2015-01-01

    'Full text:' In general terms, we associate the words 'modeling and simulation' with semi-ideal mathematical models reproducing complex Engineering problems. However, the use of modeling and simulation is much more extensive than that: it is applied on a daily basis in almost every front of Science, from sociology and biology to climate change, medicine, robotics, war strategies, etc. It is also being applied by our frontal lobe when we make decisions. The results of these exercises on modeling and simulation have had invaluable benefits on our well being, and we are just at the beginning. (author)

  6. Dynamic models of staged gasification processes. Documentation of gasification simulator; Dynamiske modeller a f trinopdelte forgasningsprocesser. Dokumentation til forgasser simulator

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2005-02-15

    In connection with the ERP project 'Dynamic modelling of staged gasification processes' a gasification simulator has been constructed. The simulator consists of: a mathematical model of the gasification process developed at Technical University of Denmark, a user interface programme, IGSS, and a communication interface between the two programmes. (BA)

  7. A study for production simulation model generation system based on data model at a shipyard

    Directory of Open Access Journals (Sweden)

    Myung-Gi Back

    2016-09-01

    Full Text Available Simulation technology is a type of shipbuilding product lifecycle management solution used to support production planning or decision-making. Most shipbuilding processes are organised as job-shop production, and their modeling and simulation require professional skills and experience in shipbuilding. For these reasons, many shipbuilding companies have difficulty adopting simulation systems, regardless of the necessity of the technology. In this paper, the data model for shipyard production simulation model generation was defined by analyzing the iterative simulation modeling procedure. The shipyard production simulation data model defined in this study contains the information necessary for the conventional simulation modeling procedure and can serve as a basis for simulation model generation. The efficacy of the developed system was validated by applying it to the simulation model generation of a panel block production line. By implementing the initial simulation model generation process, which in the past was performed by a simulation modeler, the proposed system substantially reduced the modeling time. In addition, by reducing the difficulties posed by different modeler-dependent generation methods, the proposed system makes the standardization of simulation model quality possible.

  8. Application of Hidden Markov Models in Biomolecular Simulations.

    Science.gov (United States)

    Shukla, Saurabh; Shamsi, Zahra; Moffett, Alexander S; Selvam, Balaji; Shukla, Diwakar

    2017-01-01

    Hidden Markov models (HMMs) provide a framework to analyze large trajectories from biomolecular simulation datasets. HMMs decompose the conformational space of a biological molecule into a finite number of states that interconvert among each other with certain rates. HMMs simplify long-timescale trajectories for human comprehension, and allow comparison of simulations with experimental data. In this chapter, we provide an overview of building HMMs for analyzing biomolecular simulation datasets. We demonstrate the procedure for building a hidden Markov model for a Met-enkephalin peptide simulation dataset and compare the timescales of the process.
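    A minimal sketch of the workflow, assuming the third-party hmmlearn package and a synthetic one-dimensional reaction coordinate standing in for a real trajectory: fit a two-state Gaussian HMM and read off the state centers and interconversion probabilities:

      import numpy as np
      from hmmlearn import hmm   # third-party package, assumed installed

      # Hypothetical 1-D reaction coordinate from a peptide trajectory (two wells)
      rng = np.random.default_rng(3)
      segments = [rng.normal(m, 0.3, 500) for m in (0.0, 2.0, 0.0, 2.0)]
      x = np.concatenate(segments).reshape(-1, 1)

      # Decompose the conformational space into interconverting hidden states
      model = hmm.GaussianHMM(n_components=2, covariance_type="diag", n_iter=100)
      model.fit(x)
      states = model.predict(x)

      print(model.means_.ravel())     # recovered state centers (~0 and ~2)
      print(model.transmat_)          # interconversion probabilities per frame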

  9. A Review of the Wood Pellet Value Chain, Modern Value/Supply Chain Management Approaches, and Value/Supply Chain Models

    Directory of Open Access Journals (Sweden)

    Natalie M. Hughes

    2014-01-01

    Full Text Available We reviewed 153 peer-reviewed sources to identify modern supply chain management techniques and explore supply chain modeling, in order to offer decision support to managers. Ultimately, the review is intended to assist member companies of supply chains, mainly producers, in improving their current management approaches, by directing them to studies that may be suitable for direct application to their supply chains and value chains for improved efficiency and profitability. We found that information on supply chain management and modeling techniques in general is available. However, few Canadian-based published studies exist regarding a demand-driven modeling approach to value/supply chain management for wood pellet production. Only three papers were found specifically on wood pellet value chain analysis. We propose that more studies should be carried out on the value chain of wood pellet manufacturing, as well as on demand-driven management and modeling approaches with improved demand forecasting methods.

  10. A View on Future Building System Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Wetter, Michael

    2011-04-01

    This chapter presents what a future environment for building system modeling and simulation may look like. As buildings continue to require increased performance and better comfort, their energy and control systems are becoming more integrated and complex. We therefore focus in this chapter on the modeling, simulation and analysis of building energy and control systems. Such systems can be classified as heterogeneous systems because they involve multiple domains, such as thermodynamics, fluid dynamics, heat and mass transfer, electrical systems, control systems and communication systems. Also, they typically involve multiple temporal and spatial scales, and their evolution can be described by coupled differential equations, discrete equations and events. Modeling and simulating such systems requires a higher level of abstraction and modularisation to manage the increased complexity compared to what is used in today's building simulation programs. Therefore, the trend towards more integrated building systems is likely to be a driving force for changing the status quo of today's building simulation programs. This chapter discusses evolving modeling requirements and outlines a path toward a future environment for modeling and simulation of heterogeneous building systems. A range of topics that would require many additional pages of discussion has been omitted. Examples include computational fluid dynamics for air and particle flow in and around buildings, people movement, daylight simulation, uncertainty propagation and optimisation methods for building design and controls. For different discussions and perspectives on the future of building modeling and simulation, we refer to Sahlin (2000), Augenbroe (2001) and Malkawi and Augenbroe (2004).

  11. Calibration of the simulation model of the VINCY cyclotron magnet

    Directory of Open Access Journals (Sweden)

    Ćirković Saša

    2002-01-01

    Full Text Available The MERMAID program will be used to isochronise the nominal magnetic field of the VINCY Cyclotron. This program simulates the response, i.e. calculates the magnetic field, of a previously defined model of a magnet. The accuracy of the 3D field calculation depends on the density of the grid points in the simulation model grid. The size of the VINCY Cyclotron and the maximum number of grid points in the XY plane allowed by MERMAID define the maximum obtainable accuracy of the field calculations. Comparison of the field simulated with the maximum obtainable accuracy against the magnetic field measured in the first phase of the VINCY Cyclotron magnetic field measurement campaign has shown that the difference between these two fields is not as small as required. A further decrease of the difference between these fields is obtained by calibration of the simulation model, i.e. by adjusting the current through the main coils in the simulation model.

  12. Methodology of modeling and measuring computer architectures for plasma simulations

    Science.gov (United States)

    Wang, L. P. T.

    1977-01-01

    A brief introduction to plasma simulation using computers and the difficulties on currently available computers is given. Through the use of an analyzing and measuring methodology - SARA, the control flow and data flow of a particle simulation model REM2-1/2D are exemplified. After recursive refinements the total execution time may be greatly shortened and a fully parallel data flow can be obtained. From this data flow, a matched computer architecture or organization could be configured to achieve the computation bound of an application problem. A sequential type simulation model, an array/pipeline type simulation model, and a fully parallel simulation model of a code REM2-1/2D are proposed and analyzed. This methodology can be applied to other application problems which have implicitly parallel nature.

  13. Modeling and simulation of Indus-2 RF feedback control system

    International Nuclear Information System (INIS)

    Sharma, D.; Bagduwal, P.S.; Tiwari, N.; Lad, M.; Hannurkar, P.R.

    2012-01-01

    The Indus-2 synchrotron radiation source has four RF stations along with their feedback control systems. For operation at higher beam energy and current, the amplitude and phase feedback control systems of Indus-2 are being upgraded. To understand the behaviour of the amplitude and phase control loops under different operating conditions, modelling and simulation of the RF feedback control system have been carried out. An RF cavity baseband I/Q model has been created, owing to its close correspondence with the actual implementation and its better computational efficiency, which makes the simulation faster. Correspondence between the cavity baseband and RF models is confirmed by comparing their simulation results. Low Level RF (LLRF) feedback control system simulation is done using the same cavity baseband I/Q model. Error signals are intentionally generated and the response of the closed-loop system is observed. The simulation will help us in optimizing the parameters of the upgraded LLRF system for higher beam energy and current operation. (author)

  14. MODELING AND SIMULATION OF INDUSTRIAL FORMALDEHYDE ABSORBERS

    NARCIS (Netherlands)

    WINKELMAN, JGM; SIJBRING, H; BEENACKERS, AACM; DEVRIES, ET

    1992-01-01

    The industrially important process of formaldehyde absorption in water constitutes a case of multicomponent mass transfer with multiple reactions and considerable heat effects. A stable solution algorithm is developed to simulate the performance of industrial absorbers for this process using a

  15. Modelling and simulation of surface water waves

    NARCIS (Netherlands)

    van Groesen, Embrecht W.C.; Westhuis, J.H.

    2002-01-01

    The evolution of waves on the surface of a layer of fluid is governed by non-linear effects from surface deformations and dispersive effects from the interaction with the interior fluid motion. Several simulation tools are described in this paper and compared with real life experiments in large

  16. Model and simulation of Krause model in dynamic open network

    Science.gov (United States)

    Zhu, Meixia; Xie, Guangqiang

    2017-08-01

    Modeling the evolution of opinions is an effective way to reveal how group consensus forms. This study is based on the modeling paradigm of the HK (Hegselmann-Krause) model. This paper analyzes the evolution of multi-agent opinions in dynamic open networks with member mobility. The simulation results show that when the number of agents is constant, the interval over which initial opinions are distributed affects the number of final opinions: the wider the spread of opinions, the more opinion clusters eventually form. The trust threshold has a decisive effect on the number of opinions, with a negative correlation between the trust threshold and the number of opinion clusters. The higher the connectivity of the initial group, the more easily opinions achieve rapid convergence during the evolution. A more open network is more conducive to unity of opinion; increasing or reducing the number of agents does not affect the consistency of the group opinion, but it is detrimental to stability.
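    For reference, one synchronous step of the bounded-confidence HK update is a one-liner: each agent moves to the mean opinion of all agents within its trust threshold ε. A sketch on a static (non-open) network, with illustrative parameters:

      import numpy as np

      def hk_step(x, eps):
          # One synchronous Hegselmann-Krause update: each agent averages the
          # opinions of all agents within its confidence threshold eps.
          within = np.abs(x[:, None] - x[None, :]) <= eps
          return within @ x / within.sum(axis=1)

      rng = np.random.default_rng(4)
      x = rng.uniform(0.0, 1.0, 100)          # initial opinions
      for _ in range(30):
          x = hk_step(x, eps=0.15)

      # Larger eps -> fewer surviving opinion clusters, as the abstract reports
      print(np.unique(np.round(x, 3)))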

  17. A preclinical simulated dataset of S-values and investigation of the impact of rescaled organ masses using the MOBY phantom

    International Nuclear Information System (INIS)

    Kostou, Theodora; Papadimitroulas, Panagiotis; Kagadis, George C; Loudos, George

    2016-01-01

    Nuclear medicine and radiation therapy, although well established, are still rapidly evolving, exploiting animal models with the aim of defining precise dosimetry in molecular imaging protocols. The purpose of the present study was to create a dataset based on the MOBY phantom for the calculation of organ-to-organ S-values of commonly used radionuclides. S-values of the most crucial organs were calculated using specific biodistributions with a whole-body heterogeneous source. In order to determine the impact of varying organ size on the S-values, and based on the fact that the anatomic properties of the organs are correlated with S-values, dosimetric calculations were performed by simulating the MOBY-version 2 model with different whole-body masses. The GATE Monte Carlo simulation toolkit was used for all simulations. Two mouse models of different body masses were developed to calculate the S-values of eight commonly used radioisotopes in nuclear imaging studies, namely 18F, 68Ga, 131I, 111In, 177Lu, 99mTc, 90Y and 188Re. The impact of modified mass of the source organs on the S-values was investigated with 18F and 90Y in five different scalings of the source organs. Three mouse models, 22, 28 and 34 g, were used as input in the GATE simulator, based on realistic preclinical exams, to calculate the S-values of the six radioisotopes used. Whole-body activity distributions were used as the source organ. The simulation procedure was validated in terms of extracting individual organ-to-organ S-values, and consequently in calculating the new S-values using a heterogeneous activity distribution as a source. The calculation was validated with an 18F source in a 30 g mouse model. For the generation of the new S-values with heterogeneous activity sources, four organs were used for the calculation of a single S-value. The absorbed doses per organ were compared with previously published reports. The validation procedure of 18F

  18. A preclinical simulated dataset of S-values and investigation of the impact of rescaled organ masses using the MOBY phantom

    Science.gov (United States)

    Kostou, Theodora; Papadimitroulas, Panagiotis; Loudos, George; Kagadis, George C.

    2016-03-01

    Nuclear medicine and radiation therapy, although well established, are still rapidly evolving, exploiting animal models with the aim of defining precise dosimetry in molecular imaging protocols. The purpose of the present study was to create a dataset based on the MOBY phantom for the calculation of organ-to-organ S-values of commonly used radionuclides. S-values of the most crucial organs were calculated using specific biodistributions with a whole-body heterogeneous source. In order to determine the impact of varying organ size on the S-values, and based on the fact that the anatomic properties of the organs are correlated with S-values, dosimetric calculations were performed by simulating the MOBY-version 2 model with different whole-body masses. The GATE Monte Carlo simulation toolkit was used for all simulations. Two mouse models of different body masses were developed to calculate the S-values of eight commonly used radioisotopes in nuclear imaging studies, namely 18F, 68Ga, 131I, 111In, 177Lu, 99mTc, 90Y and 188Re. The impact of modified mass of the source organs on the S-values was investigated with 18F and 90Y in five different scalings of the source organs. Three mouse models, 22, 28 and 34 g, were used as input in the GATE simulator, based on realistic preclinical exams, to calculate the S-values of the six radioisotopes used. Whole-body activity distributions were used as the source organ. The simulation procedure was validated in terms of extracting individual organ-to-organ S-values, and consequently in calculating the new S-values using a heterogeneous activity distribution as a source. The calculation was validated with an 18F source in a 30 g mouse model. For the generation of the new S-values with heterogeneous activity sources, four organs were used for the calculation of a single S-value. The absorbed doses per organ were compared with previously published reports. The validation procedure of 18F indicates

  19. MODELING OF HIGH STORAGE SHEET DEPOT WITH PLANT SIMULATION

    Directory of Open Access Journals (Sweden)

    Andrzej Jardzioch

    2013-03-01

    Full Text Available Manufacturing processes are becoming increasingly automated. The introduction of innovative solutions often necessitates processing a very large number of signals from various devices. Testing the correctness of a component configuration becomes a complicated operation requiring a great deal of time and knowledge. Models can be a mathematical reflection of the actual object. Many actions can be computer-assisted to varying degrees; one example is the construction of simulation models, which can be developed in advanced software. The stages of creating a model may be carried out in an arbitrary order. This paper aims at a closer analysis of a simulation model of a high storage sheet depot built using the Plant Simulation software. The results of the analysis can be used for optimization, but this stage is a separate issue.

  20. Simulation of daily rainfall through markov chain modeling

    International Nuclear Information System (INIS)

    Sadiq, N.

    2015-01-01

    In an agricultural country, the inhabitants of dry-land cultivated areas rely mainly on daily rainfall for watering their fields. A stochastic model based on a first-order Markov chain was developed to simulate daily rainfall data for Multan, D. I. Khan, Nawabshah, Chilas and Barkhan for the period 1981-2010. Transition probability matrices of the first-order Markov chain were utilized to generate daily rainfall occurrence, while a gamma distribution was used to generate daily rainfall amounts. To obtain the parameter values for the cities mentioned, the method of moments was used to estimate the shape and scale parameters, leading to synthetic sequence generation according to the gamma distribution. In this study, unconditional and conditional probabilities of wet and dry days, together with means and standard deviations, are considered the essential parameters for the stochastic generation of simulated daily rainfall. It was found that the computer-generated synthetic rainfall series agreed quite well with the actual observed rainfall series. (author)
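    A minimal sketch of the two-part generator described above: a first-order Markov chain decides wet/dry occurrence from the previous day's state, and a gamma distribution draws the amount on wet days. The transition probabilities and gamma parameters below are hypothetical, not the fitted values for the five stations:

      import numpy as np

      def simulate_rainfall(days, p_wd, p_ww, shape, scale, seed=0):
          # First-order Markov chain occurrence + gamma amounts for daily rainfall.
          # p_wd: P(wet | yesterday dry), p_ww: P(wet | yesterday wet);
          # shape/scale parametrize the gamma distribution of wet-day amounts (mm).
          rng = np.random.default_rng(seed)
          rain = np.zeros(days)
          wet = False
          for d in range(days):
              wet = rng.random() < (p_ww if wet else p_wd)
              if wet:
                  rain[d] = rng.gamma(shape, scale)
          return rain

      # Hypothetical parameters for a semi-arid station
      series = simulate_rainfall(365, p_wd=0.15, p_ww=0.55, shape=0.8, scale=9.0)
      print((series > 0).mean(), series.sum())   # wet-day frequency, annual total (mm)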