WorldWideScience

Sample records for model simulated values

  1. Simulating WTP Values from Random-Coefficient Models

    OpenAIRE

    Maurus Rischatsch

    2009-01-01

    Discrete Choice Experiments (DCEs) designed to estimate willingness-to-pay (WTP) values are very popular in health economics. With increased computation power and advanced simulation techniques, random-coefficient models have gained increasing importance in applied work, as they allow for taste heterogeneity. This paper discusses the parametric derivation of WTP values from estimated random-coefficient models and shows how these values can be simulated in cases where they do not have a kn...
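
    In a random-coefficient (mixed logit) model, the WTP for an attribute is the ratio of the attribute coefficient to the negative of the price coefficient, so when the coefficients are random the WTP distribution generally has no closed form and is instead simulated by drawing from the estimated coefficient distributions. A minimal sketch of that idea (all parameter values are illustrative assumptions; the minus-lognormal price coefficient keeps the denominator strictly negative):

```python
import numpy as np

rng = np.random.default_rng(42)
n_draws = 100_000

# Hypothetical mixed-logit estimates: normal attribute coefficient,
# minus-lognormal price coefficient (strictly negative by construction).
beta_attr = rng.normal(0.8, 0.3, n_draws)
beta_price = -rng.lognormal(mean=np.log(0.05), sigma=0.25, size=n_draws)

# WTP = -beta_attr / beta_price (marginal rate of substitution)
wtp = -beta_attr / beta_price

print(f"mean WTP    : {wtp.mean():.2f}")
print(f"median WTP  : {np.median(wtp):.2f}")
print(f"90% interval: [{np.percentile(wtp, 5):.2f}, {np.percentile(wtp, 95):.2f}]")
```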

  2. Digitized Onondaga Lake Dissolved Oxygen Concentrations and Model Simulated Values using Bayesian Monte Carlo Methods

    Data.gov (United States)

    U.S. Environmental Protection Agency — The dataset is lake dissolved oxygen concentrations obtained from plots published by Gelda et al. (1996) and lake reaeration model simulated values using Bayesian...
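
    Bayesian Monte Carlo, as referenced in this record, weights prior parameter draws by the likelihood of the observed data. A toy sketch of the mechanics (the linear "reaeration" stand-in model, error level, and prior are all invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical observed dissolved-oxygen series (mg/L)
t_obs = np.arange(10.0)
do_obs = 8.0 - 0.3 * t_obs + rng.normal(0.0, 0.2, t_obs.size)

def do_model(t, k):
    """Toy stand-in for a lake reaeration model with one uncertain rate k."""
    return 8.0 - k * t

# 1. Sample the uncertain parameter from its prior
k_prior = rng.uniform(0.0, 1.0, 5_000)

# 2. Weight each draw by the Gaussian likelihood of the observations
sigma = 0.2
sim = do_model(t_obs[None, :], k_prior[:, None])   # model output per draw
loglik = -0.5 * np.sum(((do_obs - sim) / sigma) ** 2, axis=1)
w = np.exp(loglik - loglik.max())
w /= w.sum()

# 3. Posterior summaries are likelihood-weighted averages over the draws
k_post = np.sum(w * k_prior)
print(f"posterior mean decay rate k ~ {k_post:.3f} (true value 0.3)")
```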

  3. ARM Cloud Radar Simulator Package for Global Climate Models Value-Added Product

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Yuying [North Carolina State Univ., Raleigh, NC (United States); Xie, Shaocheng [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-05-01

    It has been challenging to directly compare U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility ground-based cloud radar measurements with climate model output because of limitations or features of the observing processes and the spatial gap between the model and the single-point measurements. To facilitate the use of ARM radar data in numerical models, an ARM cloud radar simulator was developed to convert model data into pseudo-ARM cloud radar observations that mimic the instrument view of a narrow atmospheric column (as compared to a large global climate model [GCM] grid cell), thus allowing meaningful comparison between model output and ARM cloud observations. The ARM cloud radar simulator value-added product (VAP) was developed based on the CloudSat simulator contained in the community satellite simulator package, the Cloud Feedback Model Intercomparison Project (CFMIP) Observation Simulator Package (COSP) (Bodas-Salcedo et al., 2011), which has been widely used in climate model evaluation with satellite data (Klein et al., 2013; Zhang et al., 2010). The essential part of the CloudSat simulator is the QuickBeam radar simulator, which is used to produce CloudSat-like radar reflectivity but is capable of simulating reflectivity for other radars (Marchand et al., 2009; Haynes et al., 2007). Adapting QuickBeam to the ARM cloud radar simulator within COSP required two primary changes: the first was to set the frequency to 35 GHz for the ARM Ka-band cloud radar, as opposed to the 94 GHz used for the CloudSat W-band radar, and the second was to invert the view from the ground to space so as to attenuate the beam correctly. In addition, the ARM cloud radar simulator uses a finer vertical resolution (100 m compared to 500 m for CloudSat) to resolve the more detailed structure of clouds captured by the ARM radars. The ARM simulator has been developed following the COSP workflow (Figure 1) and using the capabilities available in COSP
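
    The inversion of the viewing geometry mentioned above matters because attenuation accumulates between the instrument and the target: a ground-based radar loses signal through the layers below the target, a spaceborne one through the layers above it. A schematic sketch of that bookkeeping (invented layer values; a real simulator such as QuickBeam derives attenuation from model microphysics and also handles gas absorption and beam geometry):

```python
import numpy as np

# Hypothetical column: unattenuated reflectivity [dBZ] and one-way
# attenuation [dB] per layer; index 0 = surface, last index = column top.
z_true_dbz = np.array([10.0, 18.0, 25.0, 20.0, 5.0])
atten_db = np.array([0.5, 1.0, 2.0, 1.0, 0.2])

def observed_dbz(z_dbz, layer_atten_db, view):
    """Apply two-way attenuation accumulated between the radar and each layer
    (attenuation inside the target layer itself is neglected here)."""
    obs = np.empty_like(z_dbz)
    for i in range(z_dbz.size):
        if view == "ground":                # ARM Ka-band view: beam goes up
            path = layer_atten_db[:i]       # layers below the target
        else:                               # CloudSat-like view: beam goes down
            path = layer_atten_db[i + 1:]   # layers above the target
        obs[i] = z_dbz[i] - 2.0 * path.sum()
    return obs

print("ground-based view:", observed_dbz(z_true_dbz, atten_db, "ground"))
print("spaceborne view  :", observed_dbz(z_true_dbz, atten_db, "space"))
```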

  4. Dynamic Value at Risk: A Comparative Study Between Heteroscedastic Models and Monte Carlo Simulation

    Directory of Open Access Journals (Sweden)

    José Lamartine Távora Junior

    2006-12-01

    The objective of this paper was to analyze the risk management of a portfolio composed of Petrobras PN, Telemar PN and Vale do Rio Doce PNA stocks. It was verified whether modeling Value-at-Risk (VaR) through Monte Carlo simulation with GARCH-family volatility is supported by the efficient-market hypothesis. The results showed that the static evaluation is inferior to the dynamic one, evidencing that the dynamic analysis supports the efficient-market hypothesis for the Brazilian equity market, in opposition to some empirical evidence. It was also verified that GARCH volatility models are sufficient to accommodate the variations of the Brazilian equity market, since the model is capable of accommodating its great dynamics.
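
    The mechanics of a dynamic VaR estimate of this kind are: forecast tomorrow's variance with the GARCH recursion, then take a loss quantile over Monte Carlo return draws. A one-day sketch (GARCH(1,1) parameters, starting state, and portfolio value are all hypothetical):

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical GARCH(1,1) estimates:
# sigma_t^2 = omega + alpha * r_{t-1}^2 + beta * sigma_{t-1}^2
omega, alpha, beta = 1e-6, 0.08, 0.90
r_last, sigma2_last = -0.012, 2.5e-4       # yesterday's return and variance

sigma2_next = omega + alpha * r_last**2 + beta * sigma2_last

# Monte Carlo draws of tomorrow's portfolio return
n = 100_000
returns = rng.normal(0.0, np.sqrt(sigma2_next), n)

portfolio = 1_000_000.0                    # portfolio value (currency units)
var_99 = -np.percentile(returns, 1) * portfolio   # 1% worst-case loss
print(f"1-day 99% VaR ~ {var_99:,.0f}")
```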

  5. Simulating the value of electric-vehicle-grid integration using a behaviourally realistic model

    Science.gov (United States)

    Wolinetz, Michael; Axsen, Jonn; Peters, Jotham; Crawford, Curran

    2018-02-01

    Vehicle-grid integration (VGI) uses the interaction between electric vehicles and the electrical grid to provide benefits that may include reducing the cost of using intermittent renewable electricity or providing a financial incentive for electric vehicle ownership. However, studies that estimate the value of VGI benefits have largely ignored how consumer behaviour will affect the magnitude of the impact. Here, we simulate the long-term impact of VGI using behaviourally realistic and empirically derived models of vehicle adoption and charging combined with an electricity system model. We focus on the case where a central entity manages the charging rate and timing for participating electric vehicles. VGI is found not to increase the adoption of electric vehicles, but it does have a small beneficial impact on electricity prices. By 2050, VGI reduces wholesale electricity prices by 0.6-0.7% (0.70 CAD per MWh, in 2010 dollars) relative to an equivalent scenario without VGI. Excluding consumer behaviour from the analysis inflates the value of VGI.

  6. Investigating added value of regional climate modeling in North American winter storm track simulations

    Science.gov (United States)

    Poan, E. D.; Gachon, P.; Laprise, R.; Aider, R.; Dueymes, G.

    2018-03-01

    Extratropical cyclone (EC) characteristics depend on a combination of large-scale factors and regional processes. However, the latter are considered to be poorly represented in global climate models (GCMs), partly because their resolution is too coarse. This paper describes a framework using the possibilities offered by regional climate models (RCMs) to gain insight into storm activity during winter over North America (NA). The recent past climate period (1981-2005) is considered to assess EC activity over NA using the NCEP regional reanalysis (NARR) as a reference, along with the European reanalysis ERA-Interim (ERAI) and two CMIP5 GCMs used to drive the Canadian Regional Climate Model version 5 (CRCM5) and the corresponding regional-scale simulations. While ERAI and GCM simulations show basic agreement with NARR in terms of climatological storm track patterns, detailed bias analyses show that, on the one hand, ERAI presents statistically significant positive biases in terms of EC genesis and therefore occurrence, while capturing their intensity fairly well. On the other hand, GCMs present large negative intensity biases over the overall NA domain, particularly over the NA eastern coast. In addition, storm occurrence over the northwestern topographic regions is highly overestimated. When the CRCM5 is driven by ERAI, no significant skill deterioration arises and, more importantly, all storm characteristics near areas with marked relief and over regions with large water masses are significantly improved with respect to ERAI. Conversely, in GCM-driven simulations, the added value contributed by CRCM5 is less prominent and systematic, except over western NA areas with high topography and over the western Atlantic coastlines where the most frequent and intense ECs are located. Despite this significant added value on seasonal-mean characteristics, a caveat is raised on the RCM ability to handle storm temporal `seriality', as a measure of their temporal variability at a given

  7. Very high resolution regional climate model simulations over Greenland: Identifying added value

    DEFF Research Database (Denmark)

    Lucas-Picher, P.; Wulff-Nielsen, M.; Christensen, J.H.

    2012-01-01

    This study presents two simulations of the climate over Greenland with the regional climate model (RCM) HIRHAM5 at 0.05° and 0.25° resolution, driven at the lateral boundaries by the ERA-Interim reanalysis for the period 1989-2009. These simulations are validated against observations from ... The RCM simulations show that the temperature has increased the most in the northern part of Greenland and at lower elevations over the period 1989-2009. Higher resolution increases the relief variability in the model topography and causes the simulated precipitation to be larger on the coast and smaller over the main ice sheet compared ... However, the bias between the simulations and the few available observations does not reduce with higher resolution. This is partly explained by the lack of observations in regions where the higher resolution is expected to improve the simulated climate.

  8. Learning from Noisy and Delayed Rewards: The Value of Reinforcement Learning to Defense Modeling and Simulation

    Science.gov (United States)

    2012-09-01

    ... decision making within simulation models (Lattal, 2010; Thorndike, 1911). Cognitive architectures are simulation-oriented models of individual human ... human behavior (Lattal, 2010; Thorndike, 1911). Of several responses made to the same situation, those which are accompanied or closely followed by ... strengthening or weakening of the bond (Thorndike, 1911). Thorndike's research on the Law of Effect influenced Skinner's research in operant conditioning

  9. The Potential Value of Clostridium difficile Vaccine: An Economic Computer Simulation Model

    Science.gov (United States)

    Lee, Bruce Y.; Popovich, Michael J.; Tian, Ye; Bailey, Rachel R.; Ufberg, Paul J.; Wiringa, Ann E.; Muder, Robert R.

    2010-01-01

    Efforts are currently underway to develop a vaccine against Clostridium difficile infection (CDI). We developed two decision analytic Monte Carlo computer simulation models: (1) an Initial Prevention Model depicting the decision whether to administer C. difficile vaccine to patients at risk for CDI and (2) a Recurrence Prevention Model depicting the decision whether to administer C. difficile vaccine to prevent CDI recurrence. Our results suggest that a C. difficile vaccine could be cost-effective over a wide range of C. difficile risk, vaccine costs, and vaccine efficacies, especially when used after CDI treatment to prevent recurrent disease. PMID:20541582

  10. The impact of MCS models and EFAC values on the dose simulation for a proton pencil beam

    International Nuclear Information System (INIS)

    Chen, Shih-Kuan; Chiang, Bing-Hao; Lee, Chung-Chi; Tung, Chuan-Jong; Hong, Ji-Hong; Chao, Tsi-Chian

    2017-01-01

    The Multiple Coulomb Scattering (MCS) model plays an important role in accurate MC simulation, especially for small-field applications. The Rossi model is used in MCNPX 2.7.0, and the Lewis model in Geant4.9.6.p02. These two models may generate very different angular and spatial distributions in small-field proton dosimetry. Besides angular and spatial distributions, step size is also an important issue that causes path length effects. The Energy Fraction (EFAC) value can be used in MCNPX 2.7.0 to control the step sizes of MCS. In this study, we used MCNPX 2.7.0, Geant4.9.6.p02, and a pencil beam algorithm to evaluate the effect on dose deposition of different MCS models and different EFAC values in proton disequilibrium situations. Different MCS models agree well with each other under a proton equilibrium situation. Under proton disequilibrium situations, however, the MCNPX and Geant4 results show a significant deviation (up to 43%). In addition, the path length effects are more significant when EFAC is equal to 0.917 or 0.94 in small-field proton dosimetry, and a 0.97 EFAC value is the best for both accuracy and efficiency - Highlights: • MCS and EFAC are important in accurate MC simulation for proton pencil beams. • Bragg curves of MCNPX and Geant4 have a dose deviation up to 43%. • Lateral profiles from MCNPX are wider than those from Geant4. • A large EFAC value causes path length effects but has no effect on lateral profiles. • A 0.97 EFAC value is the best for both accuracy and efficiency.

  11. Verification of Fourier phase and amplitude values from simulated heart motion using a hydrodynamic cardiac model

    International Nuclear Information System (INIS)

    Yiannikas, J.; Underwood, D.A.; Takatani, Setsuo; Nose, Yukihiko; MacIntyre, W.J.; Cook, S.A.; Go, R.T.; Golding, L.; Loop, F.D.

    1986-01-01

    Using pusher-plate-type artificial hearts, changes in the degree of synchrony and stroke volume were compared to phase and amplitude calculations from the first Fourier component of individual-pixel time-activity curves generated from gated radionuclide images (RNA) of these hearts. In addition, the ability of Fourier analysis to quantify paradoxical volume shifts was tested using a ventricular aneurysm model by which the Fourier amplitude was correlated to known increments of paradoxical volume. Predetermined phase-angle differences (incremental increases in asynchrony) and the mean phase-angle difference calculated from RNAs showed an agreement of -7° ± 4.4° (mean ± SD). A strong correlation was noted between stroke volume and Fourier amplitude (r=0.98; P<0.0001) as well as between the paradoxical volume accepted by the 'aneurysm' and the Fourier amplitude (r=0.97; P<0.0001). The degree of asynchrony and changes in stroke volume were accurately reflected by the Fourier phase and amplitude values, respectively. In the specific case of ventricular aneurysms, the data demonstrate that using this method, the paradoxically moving areas may be localized, and the expansile volume within these regions can be quantified. (orig.)
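
    The phase and amplitude used here are those of the first Fourier harmonic of each pixel's time-activity curve, which the first FFT coefficient yields directly. A sketch with a synthetic 16-gate curve (all values invented):

```python
import numpy as np

# Synthetic time-activity curve for one pixel over one cardiac cycle (16 gates)
n_frames = 16
t = np.arange(n_frames)
true_phase_deg = 40.0
curve = 100 + 25 * np.cos(2 * np.pi * t / n_frames - np.deg2rad(true_phase_deg))

# First Fourier harmonic: amplitude and phase from the first FFT coefficient
coeff = np.fft.fft(curve)[1]
amplitude = 2 * np.abs(coeff) / n_frames
phase_deg = np.rad2deg(np.arctan2(-coeff.imag, coeff.real))

print(f"amplitude ~ {amplitude:.1f} counts, phase ~ {phase_deg:.1f} deg")
```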

  12. Verification of Fourier phase and amplitude values from simulated heart motion using a hydrodynamic cardiac model

    Energy Technology Data Exchange (ETDEWEB)

    Yiannikas, J; Underwood, D A; Takatani, Setsuo; Nose, Yukihiko; MacIntyre, W J; Cook, S A; Go, R T; Golding, L; Loop, F D

    1986-02-01

    Using pusher-plate-type artificial hearts, changes in the degree of synchrony and stroke volume were compared to phase and amplitude calculations from the first Fourier component of individual-pixel time-activity curves generated from gated radionuclide images (RNA) of these hearts. In addition, the ability of Fourier analysis to quantify paradoxical volume shifts was tested using a ventricular aneurysm model by which the Fourier amplitude was correlated to known increments of paradoxical volume. Predetermined phase-angle differences (incremental increases in asynchrony) and the mean phase-angle difference calculated from RNAs showed an agreement of -7° ± 4.4° (mean ± SD). A strong correlation was noted between stroke volume and Fourier amplitude as well as between the paradoxical volume accepted by the 'aneurysm' and the Fourier amplitude. The degree of asynchrony and changes in stroke volume were accurately reflected by the Fourier phase and amplitude values, respectively. In the specific case of ventricular aneurysms, the data demonstrate that using this method, the paradoxically moving areas may be localized, and the expansile volume within these regions can be quantified. (orig.)

  13. THE VALUE OF NUDGING IN THE METEOROLOGY MODEL FOR RETROSPECTIVE CMAQ SIMULATIONS

    Science.gov (United States)

    Using a nudging-based data assimilation approach throughout a meteorology simulation (i.e., as a "dynamic analysis") is considered valuable because it can provide a better overall representation of the meteorology than a pure forecast. Dynamic analysis is often used in...

  14. Essays in energy policy and planning modeling under uncertainty: Value of information, optimistic biases, and simulation of capacity markets

    Science.gov (United States)

    Hu, Ming-Che

    Optimization and simulation are popular operations research and systems analysis tools for energy policy modeling. This dissertation addresses three important questions concerning the use of these tools for energy market (and electricity market) modeling and planning under uncertainty. (1) What is the value of information, and what is the cost of disregarding different sources of uncertainty, for the U.S. energy economy? (2) Could model-based calculations of the performance (social welfare) of competitive and oligopolistic market equilibria be optimistically biased due to uncertainties in objective function coefficients? (3) How do alternative sloped demand curves perform in the PJM capacity market under economic and weather uncertainty, and how do curve adjustment and cost dynamics affect capacity market outcomes? To address the first question, two-stage stochastic optimization is utilized in the U.S. national MARKAL energy model; the value of information and the cost of ignoring uncertainty are then estimated for three uncertainties: carbon cap policy, load growth, and natural gas prices. When an uncertainty is important, explicitly considering those risks when making investments will result in better performance in expectation (a positive expected cost of ignoring uncertainty). Furthermore, eliminating the uncertainty would improve strategies even further, meaning that improved forecasts of future conditions are valuable (i.e., a positive expected value of information). Also, the value of policy coordination shows the difference between a strategy developed under the incorrect assumption of no carbon cap and a strategy correctly anticipating imposition of such a cap. For the second question, game theory models are formulated, and the existence of optimistic (positive) biases in market equilibria (both competitive and oligopoly markets) is proved, in that calculated social welfare and producer profits will, in expectation, exceed the values that will actually be received.
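
    The quantities estimated in the first essay can be illustrated on a toy two-stage problem: the expected value of perfect information (EVPI) is the gap between the wait-and-see and here-and-now solutions, and the expected cost of ignoring uncertainty is the gap between the here-and-now solution and a plan optimized for the mean scenario. A sketch (a newsvendor-style capacity choice with invented numbers, not the MARKAL model):

```python
import numpy as np

# Toy two-stage problem: choose capacity q before demand is known.
demand = np.array([50.0, 150.0])           # scenarios (e.g., low/high load growth)
prob = np.array([0.5, 0.5])
price, cost = 10.0, 3.0                    # unit revenue and capacity cost

def profit(q, d):
    return price * np.minimum(q, d) - cost * q

grid = np.arange(0.0, 201.0)

# Here-and-now (stochastic) solution: maximize expected profit
expected = np.array([prob @ profit(q, demand) for q in grid])
rp = expected.max()

# Wait-and-see: optimize separately per scenario (perfect information)
ws = prob @ np.array([profit(grid, d).max() for d in demand])

# Plan optimized for the mean scenario, then evaluated under uncertainty
q_ev = grid[profit(grid, prob @ demand).argmax()]
eev = prob @ profit(q_ev, demand)

print(f"EVPI (expected value of perfect information) = {ws - rp:.0f}")
print(f"Expected cost of ignoring uncertainty        = {rp - eev:.0f}")
```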

  15. Potential for added value in precipitation simulated by high-resolution nested Regional Climate Models and observations

    Energy Technology Data Exchange (ETDEWEB)

    Di Luca, Alejandro; Laprise, Rene [Universite du Quebec a Montreal (UQAM), Centre ESCER (Etude et Simulation du Climat a l'Echelle Regionale), Departement des Sciences de la Terre et de l'Atmosphere, PK-6530, Succ. Centre-ville, B.P. 8888, Montreal, QC (Canada); De Elia, Ramon [Universite du Quebec a Montreal, Ouranos Consortium, Centre ESCER (Etude et Simulation du Climat a l'Echelle Regionale), Montreal (Canada)

    2012-03-15

    Regional Climate Models (RCMs) constitute the most often used method to perform affordable high-resolution regional climate simulations. The key issue in the evaluation of nested regional models is to determine whether RCM simulations improve the representation of climatic statistics compared to the driving data, that is, whether RCMs add value. In this study we examine a necessary condition that some climate statistics derived from the precipitation field must satisfy in order that the RCM technique can generate some added value: we focus on whether the climate statistics of interest contain some fine spatial-scale variability that would be absent on a coarser grid. The presence and magnitude of the fine-scale precipitation variance required to adequately describe a given climate statistic is then used to quantify the potential added value (PAV) of RCMs. Our results show that the PAV of RCMs is much higher for short temporal scales (e.g., 3-hourly data) than for long temporal scales (16-day average data) due to the filtering resulting from the time-averaging process. PAV is higher in the warm season compared to the cold season due to the higher proportion of precipitation falling from small-scale weather systems in the warm season. In regions of complex topography, the orographic forcing induces an extra component of PAV, no matter the season or the temporal scale considered. The PAV is also estimated using high-resolution datasets based on observations, allowing the evaluation of the sensitivity of changing resolution in the real climate system. The results show that RCMs tend to reproduce the PAV relatively well compared to observations, although they overestimate the PAV in the warm season and in mountainous regions. (orig.)
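
    The PAV diagnostic asks what share of a climate statistic's variance lives at scales a coarse grid cannot carry. That share can be estimated by block-averaging a fine field to the coarse grid and comparing variances, as in this sketch (a synthetic gamma-distributed "precipitation" field; the block size stands in for the RCM-to-GCM resolution ratio):

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic high-resolution "precipitation" field (e.g., RCM-like fine grid)
n, block = 128, 8                      # one coarse cell = block x block fine cells
field = rng.gamma(shape=0.5, scale=2.0, size=(n, n))

# Aggregate to the coarse (GCM-like) grid by block averaging
coarse = field.reshape(n // block, block, n // block, block).mean(axis=(1, 3))

# Potential added value proxy: share of total variance at fine scales
var_total = field.var()
var_large = np.repeat(np.repeat(coarse, block, 0), block, 1).var()
pav = 1.0 - var_large / var_total
print(f"fine-scale variance fraction (PAV proxy) ~ {pav:.2f}")
```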

  16. Clinical value of virtual three-dimensional instrument and cerebral aneurysm models in the interventional preoperative simulation

    International Nuclear Information System (INIS)

    Wei Xin; Xie Xiaodong; Wang Chaohua

    2007-01-01

    Objective: To establish virtual three-dimensional instrument and cerebral aneurysm models using three-dimensional modelling software, and to explore the role of the models in interventional preoperative simulation. Methods: Virtual individual models including cerebral arteries and aneurysms were established using the three-dimensional modelling software 3D Studio MAX R3, based on standard virtual cerebral aneurysm models and individual DSA images. Virtual catheters, guide wires, stents and coils were also established. The interventional preoperative simulation study was run on a personal computer and included 3 clinical cases. Results: The simulated working angle and the moulding angle of the head of the catheter and guide wire in the 3 cases were identical with the operation results. The simulated requirements for the number and size of coils in 1 case of anterior communicating aneurysm and 1 case of posterior communicating aneurysm were identical with the operation results. For 1 case of giant internal carotid artery aneurysm, the simulation called for 2 more three-dimensional coils (3 mm x 3 cm) than were used in the operation, and the position of the second coil in the aneurysmal neck was adjusted according to the results of real-time simulation. The retrospective simulation of the operation procedure indicated that the simulation method for regular and small aneurysms could become a routine simulation means, but more simulation experience needs to be built up for giant aneurysms. Conclusions: The virtual three-dimensional instrument and cerebral aneurysm models established with general-purpose software provide a new study method for neuro-interventional preoperative simulation and play an important guiding role in neuro-interventional operations. (authors)

  17. Optimal set values of zone modeling in the simulation of a walking beam type reheating furnace on the steady-state operating regime

    International Nuclear Information System (INIS)

    Yang, Zhi; Luo, Xiaochuan

    2016-01-01

    Highlights: • The adjoint equation is introduced into the PDE optimal control problem. • Lipschitz continuity of the gradient of the cost functional is derived. • Simulation time and iteration counts are reduced by a large margin. • Model validation and comparison verify the proposed mathematical model. - Abstract: In this paper, a new method is proposed for solving the PDE optimal control problem by introducing the adjoint problem into the optimization model; it is used to obtain reference values for the optimal furnace zone temperatures and the optimal temperature distribution of steel slabs in a reheating furnace on the steady-state operating regime. It is proved that the gradient of the cost functional can be written via the weak solution of this adjoint problem, and Lipschitz continuity of the gradient is then derived. Model validation and comparison between the mathematical model and experimental results indicate that the present heat transfer model works well for predicting the thermal behaviour of a slab in the reheating furnace. Iterations and simulation time decline significantly in the simulations of a 20MnSi slab, and numerical simulations for 0.4 m thick slabs show that the proposed method is well suited to medium and heavy plate plants, leading to better performance in terms of productivity, energy efficiency and other features of reheating furnaces.
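
    The adjoint idea in this abstract is that the gradient of the cost functional comes from one extra linear solve instead of one perturbed solve per control. A finite-dimensional sketch for J(u) = 0.5*||y - y_target||^2 subject to a linear state equation A y = B u (the tridiagonal A below is a generic stand-in for a discretized heat-transfer operator, not the paper's furnace model):

```python
import numpy as np

rng = np.random.default_rng(1)

# Discretized state equation A y = B u (generic stand-in operator)
n, m = 40, 3                                # state size, number of zone controls
A = 2.0 * np.eye(n) - 0.9 * np.eye(n, k=1) - 0.9 * np.eye(n, k=-1)
B = rng.random((n, m))
y_target = np.linspace(900.0, 1200.0, n)    # desired slab temperature profile

def cost_and_grad(u):
    y = np.linalg.solve(A, B @ u)               # forward (state) solve
    lam = np.linalg.solve(A.T, y - y_target)    # adjoint solve
    return 0.5 * np.sum((y - y_target) ** 2), B.T @ lam  # grad J = B^T lambda

# Plain gradient descent on the zone set values (a real code would line-search)
u = np.zeros(m)
for _ in range(2000):
    J, g = cost_and_grad(u)
    u -= 2e-4 * g
print(f"final cost J = {J:.2f}, zone controls u = {np.round(u, 2)}")
```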

  18. Valuing improvements in comfort from domestic energy-efficiency retrofits using a trade-off simulation model

    International Nuclear Information System (INIS)

    Clinch, J. Peter; Healy, John D.

    2003-01-01

    There are a number of stimuli behind energy efficiency, not least the Kyoto Protocol, and the domestic sector has been highlighted as a key potential area. Improving energy efficiency in this sector also helps alleviate fuel poverty, as research now demonstrates the strong relationship between poor domestic thermal efficiency, high fuel poverty and poor health and comfort status. Previous research has modelled the energy consumption and the technical potential for energy saving resulting from energy-efficiency upgrades in this sector. However, there is virtually no work evaluating the economic benefit of the improvement in households' thermal comfort after a retrofit. This paper does this for Ireland using a computer-simulation program. A dynamic modelling process is employed which projects into the future, predicting the extent to which energy savings are forgone in exchange for improvements in comfort.

  19. Cross-sector diversification in financial conglomerates: simulations with a fair-value assets and liabilities model

    Directory of Open Access Journals (Sweden)

    Jacob A. Bikker

    2002-12-01

    Risk diversification is one of the many reasons for cross-sector mergers of financial institutions. This paper presents a fair-value type asset and liability model in order to identify diversification effects for financial conglomerates (FCs) under various shocks. My analysis for the Netherlands reveals that diversification effects on FCs, especially of interest rate shocks, are very strong. In principle, substantial diversification effects argue for lower capital requirements for FCs. However, there are other non-negligible risks run by FCs to consider, namely contagion risk, regulatory arbitrage and cross-sector and TBTF moral hazard risks, which have not yet been quantified.

  20. Values and behaviour model

    International Nuclear Information System (INIS)

    Anon

    2011-01-01

    Occupational injuries, accidents, equipment trips, emergencies, and idle times represent a loss from each megawatt hour we could have supplied to the network, or other costs related to settlement or compensation for damages. All of this can be caused by a brief lapse of attention while doing a routine job, or by ignoring safety indicators and rules. Such behaviour would not be characteristic of a professional. People working at the nuclear power plants are the first ones to learn about the Values and Behaviour Model. (author)

  1. A simulation model to quantify the value of implementing whole-herd Bovine viral diarrhea virus testing strategies in beef cow-calf herds.

    Science.gov (United States)

    Nickell, Jason S; White, Brad J; Larson, Robert L; Renter, David G; Sanderson, Mike W

    2011-03-01

    Although numerous diagnostic tests are available to identify cattle persistently infected (PI) with Bovine viral diarrhea virus (BVDV) in cow-calf herds, data are sparse for evaluating the economic viability of individual tests or diagnostic strategies. Multiple factors influence whether BVDV testing should be performed and which strategy to use. A stochastic model was constructed to estimate the value of implementing various whole-herd BVDV cow-calf testing protocols. Three common BVDV tests (immunohistochemistry, antigen-capture enzyme-linked immunosorbent assay, and polymerase chain reaction) performed on skin tissue were evaluated as single- or two-test strategies. The estimated testing value was calculated for each strategy at 3 herd sizes that reflect typical farm sizes in the United States (50, 100, and 500 cows) and 3 probabilities of BVDV-positive herd status (0.077, 0.19, 0.47) based upon the literature. The economic value of testing was the difference in estimated gross revenue between simulated cow-calf herds that either did or did not apply the specific testing strategy. Beneficial economic outcomes were more frequently observed when the probability of a herd being BVDV positive was 0.47. Although the relative value ranking of many testing strategies varied by scenario, the two-test strategy based on immunohistochemistry had the highest estimated value in all but one herd size-herd prevalence combination. These data indicate that the estimated value of applying BVDV whole-herd testing strategies is influenced by the selected strategy, herd size, and the probability of herd BVDV-positive status; therefore, these factors should be considered when designing optimum testing strategies for cow-calf herds.
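
    The structure of such a value calculation can be sketched as the revenue difference between simulated herds with and without testing, driven by test sensitivity/specificity, herd size, and the probability of a positive herd. All economic and epidemiological numbers below are invented placeholders, not the paper's estimates:

```python
import numpy as np

rng = np.random.default_rng(11)

def testing_value(n_cows, p_herd_pos, sens=0.98, spec=0.999, test_cost=6.0,
                  pi_loss=800.0, fp_cost=30.0, pi_prev=0.03, n_sims=20_000):
    """Mean per-herd value of a whole-herd PI testing strategy (toy economics)."""
    value = np.empty(n_sims)
    for i in range(n_sims):
        infected = rng.random() < p_herd_pos
        n_pi = rng.binomial(n_cows, pi_prev) if infected else 0
        loss_no_test = n_pi * pi_loss                  # PI animals stay in herd
        tp = rng.binomial(n_pi, sens)                  # PI animals detected
        fp = rng.binomial(n_cows - n_pi, 1.0 - spec)   # healthy animals culled
        loss_test = n_cows * test_cost + (n_pi - tp) * pi_loss + fp * fp_cost
        value[i] = loss_no_test - loss_test            # > 0: testing pays off
    return value.mean()

for herd in (50, 100, 500):
    for p in (0.077, 0.19, 0.47):
        print(f"herd={herd:3d}  P(BVDV+)={p:5.3f}  value ~ {testing_value(herd, p):9.1f}")
```

    With these invented numbers, testing only pays off at the highest herd-level probability, which echoes the direction of the published finding; the real model's rankings depend on its calibrated costs and test characteristics.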

  2. Coal value chain - simulation model

    CSIR Research Space (South Africa)

    Fourie, M

    2005-08-01

    Conference presentation (slides): Coal Value Chain Simulation Model, Melanie Fourie (Sasol Technology) and Johan Janse van Rensburg (CSIR), 19th SAIIE and 35th ORSSA Conference, 2005. Copyright reserved 2005, Sasol Technology & Sasol Mining. Outline: background, simulation objectives, simulation model, ...

  3. Simulation modeling and arena

    CERN Document Server

    Rossetti, Manuel D

    2015-01-01

    Emphasizes a hands-on approach to learning statistical analysis and model building through the use of comprehensive examples, problem sets, and software applications. With a unique blend of theory and applications, Simulation Modeling and Arena®, Second Edition integrates coverage of statistical analysis and model building to emphasize the importance of both topics in simulation. Featuring introductory coverage of how simulation works and why it matters, the Second Edition expands coverage of static simulation and the applications of spreadsheets to perform simulation. The new edition also...

  4. A Monte Carlo Simulation Comparing the Statistical Precision of Two High-Stakes Teacher Evaluation Methods: A Value-Added Model and a Composite Measure

    Science.gov (United States)

    Spencer, Bryden

    2016-01-01

    Value-added models are a class of growth models used in education to assign responsibility for student growth to teachers or schools. For value-added models to be used fairly, sufficient statistical precision is necessary for accurate teacher classification. Previous research indicated precision below practical limits. An alternative approach has…

  5. Aviation Safety Simulation Model

    Science.gov (United States)

    Houser, Scott; Yackovetsky, Robert (Technical Monitor)

    2001-01-01

    The Aviation Safety Simulation Model is a software tool that enables users to configure a terrain, a flight path, and an aircraft and simulate the aircraft's flight along the path. The simulation monitors the aircraft's proximity to terrain obstructions, and reports when the aircraft violates accepted minimum distances from an obstruction. This model design facilitates future enhancements to address other flight safety issues, particularly air and runway traffic scenarios. This report shows the user how to build a simulation scenario and run it. It also explains the model's output.

  6. The perceived value of using BIM for energy simulation

    Science.gov (United States)

    Lewis, Anderson M.

    Building Information Modeling (BIM) is becoming an increasingly important tool in the Architectural, Engineering & Construction (AEC) industries. The benefits associated with BIM include, but are not limited to, cost and time savings through greater trade and design coordination, and more accurate estimating take-offs. BIM is virtual 3D, parametric design software that allows users to store information within a model and that can be used as a communication platform between project stakeholders. Likewise, energy simulation is an integral tool for predicting and optimizing a building's performance during design. Creating energy models and running energy simulations can be time-consuming due to the large number of parameters and assumptions that must be addressed to achieve reasonably accurate results. However, leveraging information embedded within Building Information Models (BIMs) has the potential to increase accuracy and reduce the time required to run energy simulations, and can facilitate continuous energy simulation throughout the design process, thus optimizing building performance. Although some literature exists on how design stakeholders perceive the benefits associated with leveraging BIM for energy simulation, little is known about how these perceptions differ between various green design stakeholder user groups. Through an e-survey instrument, this study seeks to determine how perceptions of using BIMs to inform energy simulation differ among distinct design stakeholder groups, which include BIM-only users, energy simulation-only users, and BIM and energy simulation users. Additionally, this study seeks to determine what design stakeholders perceive as the main barriers to and benefits of implementing BIM-based energy simulation. Results from this study suggest that little to no correlation exists between green design stakeholders' perceptions of the value associated with using

  7. Value encounters - Modeling and analyzing co-creation of value

    NARCIS (Netherlands)

    Weigand, H.; Godart, C.; Gronau, N.; Sharma, S.; Canals, G.

    2009-01-01

    Recent marketing and management literature has introduced the concept of co-creation of value. Current value modeling approaches such as e3-value focus on the exchange of value rather than co-creation. In this paper, an extension to e3-value is proposed in the form of a “value encounter”. Value

  8. Value encounters : Modelling and analyzing co-creation of value

    NARCIS (Netherlands)

    Weigand, H.; Jayasinghe Arachchig, J.

    2009-01-01

    Recent marketing and management literature has introduced the concept of co-creation of value. Current value modeling approaches such as e3-value focus on the exchange of value rather than co-creation. In this paper, an extension to e3-value is proposed in the form of a “value encounter”. Value

  9. Numerical simulation of Higgs models

    International Nuclear Information System (INIS)

    Jaster, A.

    1995-10-01

    The SU(2) Higgs model and the Schwinger model on the lattice were analysed. Numerical simulations of the SU(2) Higgs model were performed to study the finite-temperature electroweak phase transition. With the help of the multicanonical method, the distribution of an order parameter at the phase transition point was measured. This was used to obtain the order of the phase transition and the value of the interface tension with the histogram method. Numerical simulations were also performed at zero temperature to carry out renormalization. The measured values of the Wilson loops were used to determine the static potential and, from this, the renormalized gauge coupling. The Schwinger model was simulated at different gauge couplings to analyse the properties of Kaplan-Shamir fermions. The prediction that the mass parameter gets only multiplicative renormalization was tested and verified. (orig.)

  10. Simulation in Complex Modelling

    DEFF Research Database (Denmark)

    Nicholas, Paul; Ramsgaard Thomsen, Mette; Tamke, Martin

    2017-01-01

    This paper will discuss the role of simulation in extended architectural design modelling. As a framing paper, the aim is to present and discuss the role of integrated design simulation and feedback between design and simulation in a series of projects under the Complex Modelling framework. Complex ... performance, engage with high degrees of interdependency and allow the emergence of design agency and feedback between the multiple scales of architectural construction. This paper presents examples of integrated design simulation from a series of projects including Lace Wall, A Bridge Too Far and Inflated Restraint, developed for the research exhibition Complex Modelling, Meldahls Smedie Gallery, Copenhagen, in 2016. Where the direct project aims and outcomes have been reported elsewhere, the aim of this paper is to discuss overarching strategies for working with design-integrated simulation.

  11. Scientific Modeling and simulations

    CERN Document Server

    Diaz de la Rubia, Tomás

    2009-01-01

    Showcases the conceptual advantages of modeling which, coupled with the unprecedented computing power of simulations, allow scientists to tackle the formidable problems of our society, such as the search for hydrocarbons, understanding the structure of a virus, or the intersection between simulations and real data in extreme environments

  12. Participatory Systems Modeling to Explore Sustainable Solutions: Triple-Value Simulation Modeling Cases Tackle Nutrient and Watershed Management from a Socio-Ecological Systems (ses) Perspective

    Science.gov (United States)

    Buchholtz ten Brink, M. R.; Heineman, K.; Foley, G. J.; Ruder, E.; Tanners, N.; Bassi, A.; Fiksel, J.

    2016-12-01

    Decision makers often need assistance in understanding dynamic interactions and linkages among economic, environmental and social systems in coastal watersheds. They also need scientific input to better evaluate potential costs and benefits of alternative policy interventions. The US EPA is applying sustainability science to address these needs. Triple Value (3V) Scoping and Modeling projects bring a systems approach to understanding complex environmental problems, incorporate local knowledge, and allow decision-makers to explore policy scenarios. This leads to better understanding of feedbacks and outcomes for both human and environmental systems. The Suffolk County, NY (eastern Long Island) 3V Case uses SES interconnections to explore possible policy options and scenarios for intervention to mitigate the effects of excess nitrogen (N) loading to ground, surface, and estuarine waters. Many of the environmental impacts of N pollution have adverse effects on social and economic well-being and productivity. Key among these are the loss of enjoyment and recreational use of local beach environments and the loss of income and revenues from tourism and local fisheries. Stakeholders generated this Problem Statement: Suffolk County is experiencing widespread degradation to groundwater and the coastal marine environment caused by excess nitrogen. How can local stakeholders and decision makers in Suffolk County arrest and reverse this degradation, restore conditions to support a healthy thriving ecosystem, strengthen the County's resilience to emerging and expected environmental threats from global climate change, support and promote economic growth, attract a vibrant and sustainable workforce, and maintain and enhance quality of life and affordability for all County residents? They then built a Causal Loop Diagram of indicators and relationships that reflect these issues and identified a set of alternative policy interventions to address them. The project team conducted an extensive review of

  13. Computer Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Pronskikh, V. S. [Fermilab

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of the scientific practice of high importance that have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model's relation to the real world and its intended use. It has been argued that, because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction needs to be made between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them). Holding on to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement, I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviate the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.

  14. Automated Simulation Model Generation

    NARCIS (Netherlands)

    Huang, Y.

    2013-01-01

    One of today's challenges in the field of modeling and simulation is to model increasingly larger and more complex systems. Complex models take long to develop and incur high costs. With the advances in data collection technologies and more popular use of computer-aided systems, more data has become

  15. AEGIS geologic simulation model

    International Nuclear Information System (INIS)

    Foley, M.G.

    1982-01-01

    The Geologic Simulation Model (GSM) is used by the AEGIS (Assessment of Effectiveness of Geologic Isolation Systems) program at the Pacific Northwest Laboratory to simulate the dynamic geology and hydrology of a geologic nuclear waste repository site over a million-year period following repository closure. The GSM helps to organize geologic/hydrologic data; to focus attention on active natural processes by requiring their simulation; and, through interactive simulation and calibration, to reduce subjective evaluations of the geologic system. During each computer run, the GSM produces a million-year geologic history that is possible for the region and the repository site. In addition, the GSM records in permanent history files everything that occurred during that time span. Statistical analyses of data in the history files of several hundred simulations are used to classify typical evolutionary paths, to establish the probabilities associated with deviations from the typical paths, and to determine which types of perturbations of the geologic/hydrologic system, if any, are most likely to occur. These simulations will be evaluated by geologists familiar with the repository region to determine the validity of the results. Perturbed systems that are determined to be the most realistic, within whatever probability limits are established, will be used for the analyses that involve radionuclide transport and dose models. The GSM is designed to be continuously refined and updated. Simulation models are site specific and, although the submodels may have limited general applicability, the input data requirements necessitate detailed characterization of each site before application.

  16. Mathematical modeling and simulation in animal health - Part II: principles, methods, applications, and value of physiologically based pharmacokinetic modeling in veterinary medicine and food safety assessment.

    Science.gov (United States)

    Lin, Z; Gehring, R; Mochel, J P; Lavé, T; Riviere, J E

    2016-10-01

    This review provides a tutorial for individuals interested in quantitative veterinary pharmacology and toxicology and offers a basis for establishing guidelines for physiologically based pharmacokinetic (PBPK) model development and application in veterinary medicine. This is important as the application of PBPK modeling in veterinary medicine has evolved over the past two decades. PBPK models can be used to predict drug tissue residues and withdrawal times in food-producing animals, to estimate chemical concentrations at the site of action and target organ toxicity to aid risk assessment of environmental contaminants and/or drugs in both domestic animals and wildlife, as well as to help design therapeutic regimens for veterinary drugs. This review provides a comprehensive summary of PBPK modeling principles, model development methodology, and the current applications in veterinary medicine, with a focus on predictions of drug tissue residues and withdrawal times in food-producing animals. The advantages and disadvantages of PBPK modeling compared to other pharmacokinetic modeling approaches (i.e., classical compartmental/noncompartmental modeling, nonlinear mixed-effects modeling, and interspecies allometric scaling) are further presented. The review finally discusses contemporary challenges and our perspectives on model documentation, evaluation criteria, quality improvement, and offers solutions to increase model acceptance and applications in veterinary pharmacology and toxicology. © 2016 John Wiley & Sons Ltd.
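
    As a concrete illustration of the residue/withdrawal-time use case, below is a deliberately minimal flow-limited PBPK sketch: plasma plus muscle plus a lumped rest-of-body compartment, first-order clearance from plasma, and a withdrawal time read off as the time the simulated muscle concentration falls below a residue limit. Every parameter is invented for illustration; a real veterinary PBPK model would be physiologically parameterized and validated as the review describes:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Flow-limited PBPK sketch: plasma, muscle, lumped rest-of-body (toy values).
Q_mus, Q_rest = 1.2, 4.0                 # tissue blood flows (L/h)
V_pl, V_mus, V_rest = 3.0, 20.0, 30.0    # compartment volumes (L)
P_mus, P_rest = 2.5, 1.5                 # tissue:plasma partition coefficients
CL = 0.8                                 # plasma clearance (L/h)

def rhs(t, y):
    c_pl, c_mus, c_rest = y
    d_pl = (Q_mus * (c_mus / P_mus - c_pl)
            + Q_rest * (c_rest / P_rest - c_pl)
            - CL * c_pl) / V_pl
    d_mus = Q_mus * (c_pl - c_mus / P_mus) / V_mus
    d_rest = Q_rest * (c_pl - c_rest / P_rest) / V_rest
    return [d_pl, d_mus, d_rest]

y0 = [10.0, 0.0, 0.0]                    # IV bolus: initial plasma conc (ug/mL)
t = np.linspace(0.0, 600.0, 6001)        # hours
sol = solve_ivp(rhs, (0.0, 600.0), y0, t_eval=t, rtol=1e-8)

mrl = 0.05                               # hypothetical muscle residue limit (ug/mL)
above = np.where(sol.y[1] >= mrl)[0]     # times where muscle residue is too high
wdt = t[above[-1] + 1] if above.size else 0.0
print(f"estimated muscle withdrawal time ~ {wdt:.0f} h")
```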

  17. Validation of simulation models

    DEFF Research Database (Denmark)

    Rehman, Muniza; Pedersen, Stig Andur

    2012-01-01

    In the philosophy of science, interest in computational models and simulations has increased heavily during the past decades. Different positions regarding the validity of models have emerged, but these views have not succeeded in capturing the diversity of validation methods. The wide variety...

  18. Value Encounters - Modeling and Analyzing Co-creation of Value

    Science.gov (United States)

    Weigand, Hans

    Recent marketing and management literature has introduced the concept of co-creation of value. Current value modeling approaches such as e3-value focus on the exchange of value rather than co-creation. In this paper, an extension to e3-value is proposed in the form of a “value encounter”. Value encounters are defined as interaction spaces where a group of actors meet and derive value by each one bringing in some of its own resources. They can be analyzed from multiple strategic perspectives, including knowledge management, social network management and operational management. Value encounter modeling can be instrumental in the context of service analysis and design.

  19. Simulating cyber warfare and cyber defenses: information value considerations

    Science.gov (United States)

    Stytz, Martin R.; Banks, Sheila B.

    2011-06-01

    Simulating cyber warfare is critical to the preparation of decision-makers for the challenges posed by cyber attacks. Simulation is the only means we have to prepare decision-makers for the inevitable cyber attacks upon the information they will need for decision-making and to develop cyber warfare strategies and tactics. Currently, there is no theory regarding the strategies that should be used to achieve objectives in offensive or defensive cyber warfare, and cyber warfare occurs too rarely to use real-world experience to develop effective strategies. To simulate cyber warfare by affecting the information used for decision-making, we modify the information content of the rings that are compromised in a decision-making context. The number of rings affected and the value of the information that is altered (i.e., the closeness of the ring to the center) are determined by the expertise of the decision-maker and the learning outcome(s) for the simulation exercise. We determine which information rings are compromised using the probability that the simulated cyber defenses that protect each ring can be compromised. These probabilities are based upon prior cyber attack activity in the simulation exercise as well as similar real-world cyber attacks. To determine which information in a compromised "ring" to alter, the simulation environment maintains a record of the cyber attacks that have succeeded in the simulation environment as well as the decision-making context. These two pieces of information are used to compute an estimate of the likelihood that the cyber attack can alter, destroy, or falsify each piece of information in a compromised ring. The unpredictability of information alteration in our approach adds greater realism to the cyber event. This paper suggests a new technique that can be used for cyber warfare simulation, the ring approach for modeling context-dependent information value, and our means for considering information value when assigning cyber

  20. Analysis of Macro-micro Simulation Models for Service-Oriented Public Platform: Coordination of Networked Services and Measurement of Public Values

    Science.gov (United States)

    Kinoshita, Yumiko

    With service sectors a major driver of growth in the world economy, we are challenged to implement service-oriented infrastructure as an e-Gov platform to achieve further growth and innovation in both developed and developing countries. Recent trends in the service industry make clear that the main factors in the growth of service sectors are investment in knowledge, trade, and the enhanced capacity of micro, small, and medium-sized enterprises (MSMEs). In addition, the design and deployment of a public service platform require an appropriate evaluation methodology. Reflecting these observations, this paper proposes a macro-micro simulation approach to assess public values (PV), focusing on MSMEs. Linkage aggregate variables (LAVs) are defined to show the connection between macro and micro impacts of public services. As a result, the relationships among demography, the business environment, the macro economy, and socio-economic impact are clarified, and their values are quantified from the behavioral perspectives of citizens and firms.

  1. Can broader diffusion of value-based insurance design increase benefits from US health care without increasing costs? Evidence from a computer simulation model.

    Science.gov (United States)

    Braithwaite, R Scott; Omokaro, Cynthia; Justice, Amy C; Nucifora, Kimberly; Roberts, Mark S

    2010-02-16

    Evidence suggests that cost sharing (i.e., copayments and deductibles) decreases health expenditures but also reduces essential care. Value-based insurance design (VBID) has been proposed to encourage essential care while controlling health expenditures. Our objective was to estimate the impact of broader diffusion of VBID on US health care benefits and costs. We used a published computer simulation of costs and life expectancy gains from US health care to estimate the impact of broader diffusion of VBID. Two scenarios were analyzed: (1) applying VBID solely to pharmacy benefits and (2) applying VBID to both pharmacy benefits and other health care services (e.g., devices). We assumed that cost sharing would be eliminated for high-value services (<$100,000 per life-year), would remain unchanged for intermediate-value services ($100,000-$300,000 per life-year or unknown), and would be increased for low-value services (>$300,000 per life-year). All costs are provided in 2003 US dollars. Our simulation estimated that approximately 60% of health expenditures in the US are spent on low-value services, 20% are spent on intermediate-value services, and 20% are spent on high-value services. Correspondingly, the vast majority (80%) of health expenditures would have cost sharing that is impacted by VBID. With prevailing patterns of cost sharing, health care conferred 4.70 life-years at a per-capita annual expenditure of US$5,688. Broader diffusion of VBID to pharmaceuticals increased the benefit conferred by health care by 0.03 to 0.05 additional life-years, without increasing costs and without increasing out-of-pocket payments. Broader diffusion of VBID to other health care services could increase the benefit conferred by health care by 0.24 to 0.44 additional life-years, also without increasing costs and without increasing overall out-of-pocket payments. Among those without health insurance, using cost savings from VBID to subsidize insurance coverage would increase the benefit conferred by health care by 1.21 life-years, a 31% increase.
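
    The arithmetic driving results of this kind can be sketched in a few lines: spending shifted away from low-value services buys more life-years when re-spent at the high-value tier's cost-effectiveness ratio. The tier ratios and the size of the shift below are assumptions for illustration, not the published model's parameters:

```python
# Back-of-envelope VBID reallocation (all numbers illustrative assumptions)
spend = 5688.0                                     # per-capita expenditure (2003 USD)
share = {"low": 0.60, "mid": 0.20, "high": 0.20}   # tier shares from the abstract
cost_per_ly = {"low": 400_000.0, "high": 50_000.0} # assumed $/life-year by tier

# Suppose increased cost sharing moves 10% of low-value spending to high-value care.
shift = 0.10 * spend * share["low"]
delta_ly = shift / cost_per_ly["high"] - shift / cost_per_ly["low"]
print(f"reallocating ${shift:.0f} per capita adds ~{delta_ly:.4f} life-years")
```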

  2. Models and simulations

    International Nuclear Information System (INIS)

    Lee, M.J.; Sheppard, J.C.; Sullenberger, M.; Woodley, M.D.

    1983-09-01

    On-line mathematical models have been used successfully for computer-controlled operation of SPEAR and PEP. The same model control concept is being implemented for the operation of the LINAC and for the Damping Ring, which will be part of the Stanford Linear Collider (SLC). The purpose of this paper is to describe the general relationships between models, simulations and the control system for any machine at SLAC. The work we have done on the development of the empirical model for the Damping Ring will be presented as an example.

  3. PSH Transient Simulation Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Muljadi, Eduard [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-12-21

    PSH Transient Simulation Modeling presentation from the WPTO FY14 - FY16 Peer Review. Transient effects are an important consideration when designing a PSH system, yet numerical techniques for hydraulic transient analysis still need improvements for adjustable-speed (AS) reversible pump-turbine applications.

  4. Wake modeling and simulation

    DEFF Research Database (Denmark)

    Larsen, Gunner Chr.; Madsen Aagaard, Helge; Larsen, Torben J.

    We present a consistent, physically based theory for the wake meandering phenomenon, which we consider of crucial importance for the overall description of wind turbine loadings in wind farms. In its present version the model is confined to single wake situations. The model philosophy does, however, have the potential to include also mutual wake interaction phenomena. To establish an integrated modeling tool, the DWM methodology has been implemented in the aeroelastic code HAWC2, and example simulations of wake situations, from the small Tjæreborg wind farm, have been performed showing satisfactory agreement between predictions and measurements

  5. Can broader diffusion of value-based insurance design increase benefits from US health care without increasing costs? Evidence from a computer simulation model.

    Directory of Open Access Journals (Sweden)

    R Scott Braithwaite

    2010-02-01

    BACKGROUND: Evidence suggests that cost sharing (i.e., copayments and deductibles) decreases health expenditures but also reduces essential care. Value-based insurance design (VBID) has been proposed to encourage essential care while controlling health expenditures. Our objective was to estimate the impact of broader diffusion of VBID on US health care benefits and costs. METHODS AND FINDINGS: We used a published computer simulation of costs and life expectancy gains from US health care to estimate the impact of broader diffusion of VBID. Two scenarios were analyzed: (1) applying VBID solely to pharmacy benefits and (2) applying VBID to both pharmacy benefits and other health care services (e.g., devices). We assumed that cost sharing would be eliminated for high-value services (<$100,000 per life-year), would remain unchanged for intermediate-value services ($100,000-$300,000 per life-year or unknown), and would be increased for low-value services (>$300,000 per life-year). All costs are provided in 2003 US dollars. Our simulation estimated that approximately 60% of health expenditures in the US are spent on low-value services, 20% are spent on intermediate-value services, and 20% are spent on high-value services. Correspondingly, the vast majority (80%) of health expenditures would have cost sharing that is impacted by VBID. With prevailing patterns of cost sharing, health care conferred 4.70 life-years at a per-capita annual expenditure of US$5,688. Broader diffusion of VBID to pharmaceuticals increased the benefit conferred by health care by 0.03 to 0.05 additional life-years, without increasing costs and without increasing out-of-pocket payments. Broader diffusion of VBID to other health care services could increase the benefit conferred by health care by 0.24 to 0.44 additional life-years, also without increasing costs and without increasing overall out-of-pocket payments. Among those without health insurance, using cost savings from VBID to subsidize insurance coverage would increase the benefit conferred by health care by 1.21 life-years, a 31% increase. CONCLUSION: Broader diffusion of VBID may amplify benefits from

  6. NET PRESENT VALUE SIMULATING WITH A SPREADSHEET

    Directory of Open Access Journals (Sweden)

    Maria CONSTANTINESCU

    2010-01-01

    Decision making has always been a difficult process, based on various combinations of objectivity (when scientific tools are used) and subjectivity (considering that decisions are finally made by people, with their strengths and weaknesses). The IT revolution has also reached the areas of management and decision making, helping managers make better and more informed decisions by providing them with a variety of tools, from personal computers to specialized software. Most simulations are performed in a spreadsheet, because the number of calculations required soon overwhelms human capability.
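
    A spreadsheet-style NPV simulation translates directly into a few lines of code: draw uncertain cash flows, discount them, and summarize the resulting NPV distribution. A sketch with hypothetical triangular cash-flow distributions and discount rate:

```python
import numpy as np

rng = np.random.default_rng(5)

n_sims, years = 50_000, 5
rate = 0.10                       # discount rate
invest = 100_000.0                # initial outlay at t = 0

# Uncertain annual net cash flows (hypothetical triangular distributions)
cash = rng.triangular(20_000, 30_000, 45_000, size=(n_sims, years))

discount = (1 + rate) ** -np.arange(1, years + 1)
npv = cash @ discount - invest

print(f"mean NPV        : {npv.mean():12,.0f}")
print(f"P(NPV < 0)      : {(npv < 0).mean():12.2%}")
print(f"5th-95th pct NPV: {np.percentile(npv, 5):,.0f} .. {np.percentile(npv, 95):,.0f}")
```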

  7. PDOP values for simulated GPS/Galileo positioning

    DEFF Research Database (Denmark)

    Cederholm, Jens Peter

    2005-01-01

    The paper illustrates satellite coverage and PDOP values for a simulated combined GPS/Galileo system. The designed GPS satellite constellation and the planned Galileo satellite constellation are presented. The combined system is simulated and the number of visible satellites and PDOP values...
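
    For reference, PDOP can be computed from the receiver-satellite geometry alone. The Python sketch below uses the standard single-constellation formulation with one clock-bias column (a combined GPS/Galileo solution may carry an extra inter-system bias term); the satellite positions are made up.

      import numpy as np

      def pdop(receiver, sats):
          # Geometry matrix: unit line-of-sight vectors plus a clock column.
          los = sats - receiver
          unit = los / np.linalg.norm(los, axis=1, keepdims=True)
          G = np.hstack([unit, np.ones((len(sats), 1))])
          Q = np.linalg.inv(G.T @ G)           # cofactor matrix
          return np.sqrt(np.trace(Q[:3, :3]))  # position components only

      rx = np.zeros(3)                         # receiver at the origin
      sv = np.array([[20e6, 0, 15e6], [-15e6, 10e6, 18e6],
                     [5e6, -20e6, 16e6], [0, 15e6, 22e6]], dtype=float)
      print(f"PDOP = {pdop(rx, sv):.2f}")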

  8. Simulation - modeling - experiment

    International Nuclear Information System (INIS)

    2004-01-01

    After two workshops held in 2001 on the same topics, and in order to make a status of the advances in the domain of simulation and measurements, the main goals proposed for this workshop are: the presentation of the state-of-the-art of tools, methods and experiments in the domains of interest of the Gedepeon research group, the exchange of information about the possibilities of use of computer codes and facilities, about the understanding of physical and chemical phenomena, and about development and experiment needs. This document gathers 18 presentations (slides) among the 19 given at this workshop and dealing with: the deterministic and stochastic codes in reactor physics (Rimpault G.); MURE: an evolution code coupled with MCNP (Meplan O.); neutronic calculation of future reactors at EdF (Lecarpentier D.); advance status of the MCNP/TRIO-U neutronic/thermal-hydraulics coupling (Nuttin A.); the FLICA4/TRIPOLI4 thermal-hydraulics/neutronics coupling (Aniel S.); methods of disturbances and sensitivity analysis of nuclear data in reactor physics, application to VENUS-2 experimental reactor (Bidaud A.); modeling for the reliability improvement of an ADS accelerator (Biarotte J.L.); residual gas compensation of the space charge of intense beams (Ben Ismail A.); experimental determination and numerical modeling of phase equilibrium diagrams of interest in nuclear applications (Gachon J.C.); modeling of irradiation effects (Barbu A.); elastic limit and irradiation damage in Fe-Cr alloys: simulation and experiment (Pontikis V.); experimental measurements of spallation residues, comparison with Monte-Carlo simulation codes (Fallot M.); the spallation target-reactor coupling (Rimpault G.); tools and data (Grouiller J.P.); models in high energy transport codes: status and perspective (Leray S.); other ways of investigation for spallation (Audoin L.); neutrons and light particles production at intermediate energies (20-200 MeV) with iron, lead and uranium targets (Le Colley F

  9. Wake modeling and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, G.C.; Aagaard Madsen, H.; Larsen, T.J.; Troldborg, N.

    2008-07-15

    We present a consistent, physically based theory for the wake meandering phenomenon, which we consider of crucial importance for the overall description of wind turbine loadings in wind farms. In its present version the model is confined to single wake situations. The model philosophy does, however, have the potential to include also mutual wake interaction phenomena. The basic conjecture behind the dynamic wake meandering (DWM) model is that wake transportation in the atmospheric boundary layer is driven by the large scale lateral- and vertical turbulence components. Based on this conjecture a stochastic model of the downstream wake meandering is formulated. In addition to the kinematic formulation of the dynamics of the 'meandering frame of reference', models characterizing the mean wake deficit as well as the added wake turbulence, described in the meandering frame of reference, are an integrated part of the DWM model complex. For design applications, the computational efficiency of wake deficit prediction is a key issue. A computationally low cost model is developed for this purpose. Likewise, the character of the added wake turbulence, generated by the upstream turbine in the form of shed and trailed vorticity, has been approached by a simple semi-empirical model essentially based on an eddy viscosity philosophy. Contrary to previous attempts to model wake loading, the DWM approach opens for a unifying description in the sense that turbine power- and load aspects can be treated simultaneously. This capability is a direct and attractive consequence of the model being based on the underlying physical process, and it potentially opens for optimization of wind farm topology, of wind farm operation as well as of control strategies for the individual turbine. To establish an integrated modeling tool, the DWM methodology has been implemented in the aeroelastic code HAWC2, and example simulations of wake situations, from the small Tjæreborg wind farm, have been performed showing satisfactory agreement between predictions and measurements.

  10. Achieving Value in Primary Care: The Primary Care Value Model.

    Science.gov (United States)

    Rollow, William; Cucchiara, Peter

    2016-03-01

    The patient-centered medical home (PCMH) model provides a compelling vision for primary care transformation, but studies of its impact have used insufficiently patient-centered metrics with inconsistent results. We propose a framework for defining patient-centered value and a new model for value-based primary care transformation: the primary care value model (PCVM). We advocate for use of patient-centered value when measuring the impact of primary care transformation, recognition, and performance-based payment; for financial support and research and development to better define primary care value-creating activities and their implementation; and for use of the model to support primary care organizations in transformation. © 2016 Annals of Family Medicine, Inc.

  11. Biomolecular modelling and simulations

    CERN Document Server

    Karabencheva-Christova, Tatyana

    2014-01-01

    Published continuously since 1944, the Advances in Protein Chemistry and Structural Biology series is the essential resource for protein chemists. Each volume brings forth new information about protocols and analysis of proteins. Each thematically organized volume is guest edited by leading experts in a broad range of protein-related topics. Describes advances in biomolecular modelling and simulations Chapters are written by authorities in their field Targeted to a wide audience of researchers, specialists, and students The information provided in the volume is well supported by a number of high quality illustrations, figures, and tables.

  12. An approach to value-based simulator selection: The creation and evaluation of the simulator value index tool.

    Science.gov (United States)

    Rooney, Deborah M; Hananel, David M; Covington, Benjamin J; Dionise, Patrick L; Nykamp, Michael T; Pederson, Melvin; Sahloul, Jamal M; Vasquez, Rachael; Seagull, F Jacob; Pinsky, Harold M; Sweier, Domenica G; Cooke, James M

    2018-04-01

    Currently there is no reliable, standardized mechanism to support health care professionals during the evaluation of and procurement processes for simulators. A tool founded on best practices could facilitate simulator purchase processes. In a 3-phase process, we identified top factors considered during the simulator purchase process through expert consensus (n = 127), created the Simulator Value Index (SVI) tool, evaluated targeted validity evidence, and evaluated the practical value of this SVI. A web-based survey was sent to simulation professionals. Participants (n = 79) used the SVI and provided feedback. We evaluated the practical value of 4 tool variations by calculating their sensitivity to predict a preferred simulator. Seventeen top factors were identified and ranked. The top 2 were technical stability/reliability of the simulator and customer service, with no practical differences in rank across institution or stakeholder role. Full SVI variations successfully predicted the preferred simulator with good (87%) sensitivity, whereas the sensitivity of variations reduced to cost plus customer service or cost plus technical stability decreased (≤54%). The majority (73%) of participants agreed that the SVI was helpful at guiding simulator purchase decisions, and 88% agreed the SVI tool would help facilitate discussion with peers and leadership. Our findings indicate the SVI supports the process of simulator purchase using a standardized framework. Sensitivity of the tool improved when factors extended beyond traditionally targeted factors. We propose the tool will facilitate discussion amongst professionals dealing with simulation, provide essential information for finance and procurement professionals, and improve the long-term value of simulation solutions. Limitations and application of the tool are discussed. Copyright © 2017 Elsevier Inc. All rights reserved.

  13. Thresholds of Extinction: Simulation Strategies in Environmental Values Education.

    Science.gov (United States)

    Glew, Frank

    1990-01-01

    Describes a simulation exercise for campers and an accompanying curriculum unit--"Thresholds of Extinction"--that addresses the issues of endangered species. Uses this context to illustrate steps in the process of values development: awareness, gathering data, resolution (decision making), responsibility (acting on values), and…

  14. Applying the Expectancy-Value Model to understand health values.

    Science.gov (United States)

    Zhang, Xu-Hao; Xie, Feng; Wee, Hwee-Lin; Thumboo, Julian; Li, Shu-Chuen

    2008-03-01

    Expectancy-Value Model (EVM) is the most structured model in psychology to predict attitudes by measuring attitudinal attributes (AAs) and relevant external variables. Because health value could be categorized as attitude, we aimed to apply EVM to explore its usefulness in explaining variances in health values and investigate underlying factors. Focus group discussion was carried out to identify the most common and significant AAs toward 5 different health states (coded as 11111, 11121, 21221, 32323, and 33333 in EuroQol Five-Dimension (EQ-5D) descriptive system). AAs were measured in a sum of multiplications of subjective probability (expectancy) and perceived value of attributes with 7-point Likert scales. Health values were measured using visual analog scales (VAS, range 0-1). External variables (age, sex, ethnicity, education, housing, marital status, and concurrent chronic diseases) were also incorporated into survey questionnaire distributed by convenience sampling among eligible respondents. Univariate analyses were used to identify external variables causing significant differences in VAS. Multiple linear regression model (MLR) and hierarchical regression model were used to investigate the explanatory power of AAs and possible significant external variable(s) separately or in combination, for each individual health state and a mixed scenario of five states, respectively. Four AAs were identified, namely, "worsening your quality of life in terms of health" (WQoL), "adding a burden to your family" (BTF), "making you less independent" (MLI) and "unable to work or study" (UWS). Data were analyzed based on 232 respondents (mean [SD] age: 27.7 [15.07] years, 49.1% female). Health values varied significantly across 5 health states, ranging from 0.12 (33333) to 0.97 (11111). With no significant external variables identified, EVM explained up to 62% of the variances in health values across 5 health states. The explanatory power of 4 AAs were found to be between 13
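
    To make the measurement concrete: an EVM attitude score is the sum over attributes of expectancy times value, and such scores are then regressed on the VAS health values. A minimal Python sketch follows; the respondent answers and example scores are invented.

      import numpy as np

      # Four attitudinal attributes from the study: WQoL, BTF, MLI, UWS.
      expectancy = np.array([6, 5, 4, 3], dtype=float)  # 7-point scale
      value      = np.array([7, 6, 5, 6], dtype=float)  # 7-point scale

      # EVM attitude score: sum of expectancy x value over attributes.
      attitude = np.sum(expectancy * value)
      print(f"EVM attitude score: {attitude:.0f}")

      # One-predictor OLS of VAS health values on hypothetical EVM scores.
      scores = np.array([168.0, 120.0, 95.0, 60.0, 30.0])
      vas    = np.array([0.97, 0.80, 0.55, 0.30, 0.12])
      slope, intercept = np.polyfit(scores, vas, 1)
      print(f"fitted: VAS = {intercept:.3f} + {slope:.5f} * score")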

  15. Proving the ecosystem value through hydrological modelling

    International Nuclear Information System (INIS)

    Dorner, W; Spachinger, K; Metzka, R; Porter, M

    2008-01-01

    Ecosystems provide valuable functions. Natural floodplains and river structures also offer different types of ecosystem functions such as habitat function, recreational area and natural detention. From an economic standpoint, the loss (or rehabilitation) of these natural systems and their provided natural services can be valued as a damage (or benefit). Consequently these natural goods and services must be economically valued in project assessments, e.g. cost-benefit analysis or cost comparison. Especially in smaller catchments and river systems, there is significant evidence that natural flood detention reduces flood risk and contributes to flood protection. Several research projects evaluated the mitigating effect of land use, river training and the loss of natural flood plains on the development, peak and volume of floods. The presented project analyses the hypothesis that ignoring natural detention and hydrological ecosystem services could result in economically inefficient solutions for flood protection and mitigation. In test areas, subcatchments of the Danube in Germany, a combination of hydrological and hydrodynamic models with economic evaluation techniques was applied. Different forms of land use, river structure and flood protection measures were assessed and compared from a hydrological and economic point of view. A hydrodynamic model was used to simulate flows to assess the extent of flood affected areas and damages to buildings and infrastructure, as well as to investigate the impacts of levees and river structure on a local scale. These model results provided the basis for an economic assessment. Different economic valuation techniques, such as flood damage functions, the cost comparison method and the substitution approach, were used to compare the outcomes of different hydrological scenarios from an economic point of view and value the ecosystem service. The results give significant evidence that natural detention must be evaluated as part of flood mitigation projects.

  16. Mean Value Modelling of Turbocharged SI Engines

    DEFF Research Database (Denmark)

    Müller, Martin; Hendricks, Elbert; Sorenson, Spencer C.

    1998-01-01

    The development of a computer simulation to predict the performance of a turbocharged spark ignition engine during transient operation is described. New models have been developed for the turbocharger and the intercooling system. An adiabatic model for the intake manifold is presented.
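
    The flavour of such a mean value submodel can be shown in a few lines. The Python sketch below integrates a filling-and-emptying intake manifold with an adiabatic pressure equation and a crude speed-density engine flow; every constant is illustrative, not taken from the paper.

      import numpy as np

      KAPPA, R, V, T_IN = 1.4, 287.0, 0.003, 300.0   # -, J/(kg K), m^3, K

      def mdot_engine(p_man, rpm):
          # Crude speed-density air flow into the cylinders (illustrative).
          eta_vol, V_d = 0.85, 0.002        # volumetric eff., displacement m^3
          rho = p_man / (R * T_IN)
          return eta_vol * rho * V_d * rpm / 120.0   # 4-stroke: rpm/120 /s

      p, dt = 50e3, 1e-3                    # initial pressure [Pa], step [s]
      for step in range(1000):              # 1 s of simulated time
          mdot_thr = 0.02                   # assumed throttle flow [kg/s]
          # Adiabatic manifold filling: dp/dt = kappa*R*T/V * (m_in - m_out).
          dp = KAPPA * R * T_IN / V * (mdot_thr - mdot_engine(p, rpm=2000.0))
          p += dp * dt
      print(f"manifold pressure after 1 s: {p/1e3:.1f} kPa")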

  17. Notes on modeling and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Redondo, Antonio [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-03-10

    These notes present a high-level overview of how modeling and simulation are carried out by practitioners. The discussion is of a general nature; no specific techniques are examined but the activities associated with all modeling and simulation approaches are briefly addressed. There is also a discussion of validation and verification and, at the end, a section on why modeling and simulation are useful.

  18. E3value to BPMN model transformation

    NARCIS (Netherlands)

    Fatemi, Hassan; van Sinderen, Marten J.; Wieringa, Roelf J.; Wieringa, P.A.; Camarinha-Matos, Luis M.; Pereira Klen, Alexandra; Afsarmanesh, Hamidesh

    2011-01-01

    Business value and coordination process perspectives need to be taken into consideration while modeling business collaborations. The need for these two models stems from the importance of separating the how from the what concerns. A business value model shows what is offered by whom to whom, while a coordination process model shows how these offerings are exchanged.

  19. Simulation Model of a Transient

    DEFF Research Database (Denmark)

    Jauch, Clemens; Sørensen, Poul; Bak-Jensen, Birgitte

    2005-01-01

    This paper describes the simulation model of a controller that enables an active-stall wind turbine to ride through transient faults. The simulated wind turbine is connected to a simple model of a power system. Certain fault scenarios are specified and the turbine shall be able to sustain operation...

  20. Cognitive models embedded in system simulation models

    International Nuclear Information System (INIS)

    Siegel, A.I.; Wolf, J.J.

    1982-01-01

    If we are to discuss and consider cognitive models, we must first come to grips with two questions: (1) What is cognition? (2) What is a model? Presumably, the answers to these questions can provide a basis for defining a cognitive model. Accordingly, this paper first places these two questions into perspective. Then, cognitive models are set within the context of computer simulation models and a number of computer simulations of cognitive processes are described. Finally, pervasive issues are discussed vis-a-vis cognitive modeling in the computer simulation context.

  1. Incorporating Customer Lifetime Value into Marketing Simulation Games

    Science.gov (United States)

    Cannon, Hugh M.; Cannon, James N.; Schwaiger, Manfred

    2010-01-01

    Notwithstanding the emerging prominence of customer lifetime value (CLV) and customer equity (CE) in the marketing literature during the past decade, virtually nothing has been done to address these concepts in the literature on simulation and gaming. This article addresses the failing, discussing the nature of CLV and CE and demonstrating how…
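
    For readers unfamiliar with the construct, the standard retention-based CLV calculation is a short, discounted sum; the Python sketch below uses invented margin, retention and discount figures (conventions differ on whether the first margin is discounted).

      def customer_lifetime_value(margin, retention, discount, horizon):
          # CLV = sum_t margin * retention**t / (1 + discount)**t, t = 0..H-1.
          return sum(margin * retention**t / (1 + discount)**t
                     for t in range(horizon))

      clv = customer_lifetime_value(margin=120.0, retention=0.8,
                                    discount=0.1, horizon=10)
      print(f"CLV over 10 periods: {clv:.2f}")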

  2. General introduction to simulation models

    DEFF Research Database (Denmark)

    Hisham Beshara Halasa, Tariq; Boklund, Anette

    2012-01-01

    Monte Carlo simulation can be defined as a representation of real life systems to gain insight into their functions and to investigate the effects of alternative conditions or actions on the modeled system. Models are a simplification of a system. Most often, it is best to use experiments and field trials to investigate the effect of alternative conditions or actions on a specific system. Nonetheless, field trials are expensive and sometimes not possible to conduct, as in the case of foot-and-mouth disease (FMD). Instead, simulation models can be a good and cheap substitute for experiments and field trials. However, if simulation models are to be used, good quality input data must be available. To model FMD, several disease spread models are available. For this project, we chose three simulation models: Davis Animal Disease Spread (DADS), which has been upgraded to DTU-DADS, InterSpread Plus (ISP)...

  3. An Interval-Valued Approach to Business Process Simulation Based on Genetic Algorithms and the BPMN

    Directory of Open Access Journals (Sweden)

    Mario G.C.A. Cimino

    2014-05-01

    Full Text Available Simulating organizational processes characterized by interacting human activities, resources, business rules and constraints, is a challenging task, because of the inherent uncertainty, inaccuracy, variability and dynamicity. With regard to this problem, currently available business process simulation (BPS methods and tools are unable to efficiently capture the process behavior along its lifecycle. In this paper, a novel approach of BPS is presented. To build and manage simulation models according to the proposed approach, a simulation system is designed, developed and tested on pilot scenarios, as well as on real-world processes. The proposed approach exploits interval-valued data to represent model parameters, in place of conventional single-valued or probability-valued parameters. Indeed, an interval-valued parameter is comprehensive; it is the easiest to understand and express and the simplest to process, among multi-valued representations. In order to compute the interval-valued output of the system, a genetic algorithm is used. The resulting process model allows forming mappings at different levels of detail and, therefore, at different model resolutions. The system has been developed as an extension of a publicly available simulation engine, based on the Business Process Model and Notation (BPMN standard.
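
    A toy version of the idea: propagate interval-valued parameters through a process metric and estimate the output interval by pushing the metric to its extremes with a tiny genetic algorithm. This is a simplified stand-in for the paper's BPMN-based system; the process, bounds and GA settings are invented.

      import numpy as np

      rng = np.random.default_rng(0)

      # Interval-valued activity durations [min, max] of a toy process.
      intervals = np.array([[2.0, 5.0], [1.0, 4.0], [3.0, 6.0]])

      def cycle_time(x):
          # Two activities in sequence, one in parallel with them.
          return max(x[0] + x[1], x[2])

      def ga_extreme(f, bounds, maximize, pop=40, gens=60):
          # Tiny GA pushing f toward one end of its range over the bounds.
          lo, hi = bounds[:, 0], bounds[:, 1]
          X = rng.uniform(lo, hi, size=(pop, len(bounds)))
          sign = 1.0 if maximize else -1.0
          for _ in range(gens):
              fit = sign * np.apply_along_axis(f, 1, X)
              parents = X[np.argsort(fit)[-pop // 2:]]      # keep best half
              kids = parents[rng.integers(0, len(parents), pop - len(parents))]
              kids = np.clip(kids + rng.normal(0, 0.1, kids.shape), lo, hi)
              X = np.vstack([parents, kids])
          fit = sign * np.apply_along_axis(f, 1, X)
          return f(X[np.argmax(fit)])

      low = ga_extreme(cycle_time, intervals, maximize=False)
      high = ga_extreme(cycle_time, intervals, maximize=True)
      print(f"estimated cycle-time interval: [{low:.2f}, {high:.2f}]")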

  4. Simulation - modeling - experiment; Simulation - modelisation - experience

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-07-01

    After two workshops held in 2001 on the same topics, and in order to make a status of the advances in the domain of simulation and measurements, the main goals proposed for this workshop are: the presentation of the state-of-the-art of tools, methods and experiments in the domains of interest of the Gedepeon research group, the exchange of information about the possibilities of use of computer codes and facilities, about the understanding of physical and chemical phenomena, and about development and experiment needs. This document gathers 18 presentations (slides) among the 19 given at this workshop and dealing with: the deterministic and stochastic codes in reactor physics (Rimpault G.); MURE: an evolution code coupled with MCNP (Meplan O.); neutronic calculation of future reactors at EdF (Lecarpentier D.); advance status of the MCNP/TRIO-U neutronic/thermal-hydraulics coupling (Nuttin A.); the FLICA4/TRIPOLI4 thermal-hydraulics/neutronics coupling (Aniel S.); methods of disturbances and sensitivity analysis of nuclear data in reactor physics, application to VENUS-2 experimental reactor (Bidaud A.); modeling for the reliability improvement of an ADS accelerator (Biarotte J.L.); residual gas compensation of the space charge of intense beams (Ben Ismail A.); experimental determination and numerical modeling of phase equilibrium diagrams of interest in nuclear applications (Gachon J.C.); modeling of irradiation effects (Barbu A.); elastic limit and irradiation damage in Fe-Cr alloys: simulation and experiment (Pontikis V.); experimental measurements of spallation residues, comparison with Monte-Carlo simulation codes (Fallot M.); the spallation target-reactor coupling (Rimpault G.); tools and data (Grouiller J.P.); models in high energy transport codes: status and perspective (Leray S.); other ways of investigation for spallation (Audoin L.); neutrons and light particles production at intermediate energies (20-200 MeV) with iron, lead and uranium targets (Le Colley F

  5. Value function in economic growth model

    Science.gov (United States)

    Bagno, Alexander; Tarasyev, Alexandr A.; Tarasyev, Alexander M.

    2017-11-01

    Properties of the value function are examined in an infinite horizon optimal control problem with an unbounded integrand appearing in the quality functional with a discount factor. Optimal control problems of this type describe solutions in models of economic growth. Necessary and sufficient conditions are derived to ensure that the value function satisfies the infinitesimal stability properties. It is proved that the value function coincides with the minimax solution of the Hamilton-Jacobi equation. A description of the asymptotic growth behavior of the value function is provided for the logarithmic, power and exponential quality functionals, and an example is given to illustrate construction of the value function in economic growth models.
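
    For orientation, a standard textbook form of the kind of problem the record describes is the discounted infinite-horizon control problem and its stationary Hamilton-Jacobi(-Bellman) equation; the notation below is generic, not the paper's:

      \begin{align*}
        J(x_0) &= \max_{u(\cdot)} \int_0^{\infty} e^{-\rho t}\,
                  g\bigl(x(t),u(t)\bigr)\,dt,
        \qquad \dot{x}(t) = f\bigl(x(t),u(t)\bigr),\quad x(0)=x_0,\\
        \rho V(x) &= \max_{u}\bigl[\, g(x,u) + \nabla V(x)\cdot f(x,u)\,\bigr].
      \end{align*}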

  6. Adding Value in Construction Design Management by Using Simulation Approach

    OpenAIRE

    Doloi, Hemanta

    2008-01-01

    Simulation modelling has been introduced as a decision support tool for front-end planning and design analysis of projects. An integrated approach has been discussed linking project scope, end product or project facility performance and the strategic project objectives at the early stage of projects. The case study example on a tram network demonstrates that application of simulation helps assess the performance of project operation and make appropriate investment decisions over the life cycle of ...

  7. Modeling Business Strategy: A Consumer Value Perspective

    OpenAIRE

    Svee , Eric-Oluf; Giannoulis , Constantinos; Zdravkovic , Jelena

    2011-01-01

    Business strategy lays out the plan of an enterprise to achieve its vision by providing value to its customers. Typically, business strategy focuses on economic value and its relevant exchanges with customers and does not directly address consumer values. However, consumer values drive customers' choices and decisions to use a product or service, and therefore should have a direct impact on business strategy. This paper explores whether and how...

  8. The Mixed Instrumental Controller: Using Value of Information to Combine Habitual Choice and Mental Simulation

    Directory of Open Access Journals (Sweden)

    Giovanni ePezzulo

    2013-03-01

    Full Text Available Instrumental behavior depends on both goal-directed and habitual mechanisms of choice. Normative views cast these mechanisms in terms of model-free and model-based methods of reinforcement learning, respectively. An influential proposal hypothesizes that model-free and model-based mechanisms coexist and compete in the brain according to their relative uncertainty. In this paper we propose a novel view in which a single Mixed Instrumental Controller produces both goal-directed and habitual behavior by flexibly balancing and combining model-based and model-free computations. The Mixed Instrumental Controller performs a cost-benefit analysis to decide whether to choose an action immediately based on the available "cached" value of actions (linked to model-free mechanisms) or to improve value estimation by mentally simulating the expected outcome values (linked to model-based mechanisms). Since mental simulation entails cognitive effort and increases the reward delay, it is activated only when the associated "Value of Information" exceeds its costs. The model proposes a method to compute the Value of Information, based on the uncertainty of action values and on the distance of alternative cached action values. Overall, the model by default chooses on the basis of lighter model-free estimates, and integrates them with costly model-based predictions only when useful. Mental simulation uses a sampling method to produce reward expectancies, which are used to update the cached value of one or more actions; in turn, this updated value is used for the choice. The key predictions of the model are tested in different settings of a double T-maze scenario. Results are discussed in relation to neurobiological evidence on the hippocampus - ventral striatum circuit in rodents, which has been linked to goal-directed spatial navigation.

  9. The mixed instrumental controller: using value of information to combine habitual choice and mental simulation.

    Science.gov (United States)

    Pezzulo, Giovanni; Rigoli, Francesco; Chersi, Fabian

    2013-01-01

    Instrumental behavior depends on both goal-directed and habitual mechanisms of choice. Normative views cast these mechanisms in terms of model-free and model-based methods of reinforcement learning, respectively. An influential proposal hypothesizes that model-free and model-based mechanisms coexist and compete in the brain according to their relative uncertainty. In this paper we propose a novel view in which a single Mixed Instrumental Controller produces both goal-directed and habitual behavior by flexibly balancing and combining model-based and model-free computations. The Mixed Instrumental Controller performs a cost-benefit analysis to decide whether to choose an action immediately based on the available "cached" value of actions (linked to model-free mechanisms) or to improve value estimation by mentally simulating the expected outcome values (linked to model-based mechanisms). Since mental simulation entails cognitive effort and increases the reward delay, it is activated only when the associated "Value of Information" exceeds its costs. The model proposes a method to compute the Value of Information, based on the uncertainty of action values and on the distance of alternative cached action values. Overall, the model by default chooses on the basis of lighter model-free estimates, and integrates them with costly model-based predictions only when useful. Mental simulation uses a sampling method to produce reward expectancies, which are used to update the cached value of one or more actions; in turn, this updated value is used for the choice. The key predictions of the model are tested in different settings of a double T-maze scenario. Results are discussed in relation to neurobiological evidence on the hippocampus - ventral striatum circuit in rodents, which has been linked to goal-directed spatial navigation.
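
    A schematic Python sketch of the arbitration rule the two records above describe: compute a Value of Information proxy from the uncertainty and proximity of cached action values, and trigger mental simulation only when it exceeds a fixed cost. All quantities, and the VoI proxy itself, are illustrative simplifications of the paper's model.

      import numpy as np

      rng = np.random.default_rng(1)

      SIM_COST = 0.05   # assumed cost of mental simulation (effort + delay)

      def value_of_information(q_means, q_stds):
          # Simulation is worth more when cached values are uncertain
          # and close to each other (crude proxy for the paper's VoI).
          gap = abs(q_means[0] - q_means[1])
          return np.mean(q_stds) / (gap + 1e-6)

      def choose(q_means, q_stds, model_rewards, n_rollouts=20):
          if value_of_information(q_means, q_stds) > SIM_COST:
              # Model-based: sample imagined outcomes from an internal
              # model to refine the cached estimates before choosing.
              sims = [rng.normal(r, 0.5, n_rollouts).mean()
                      for r in model_rewards]
              q_means = 0.5 * np.asarray(q_means) + 0.5 * np.asarray(sims)
          return int(np.argmax(q_means))   # greedy choice on final values

      # Two arms of a T-maze: cached values nearly tied and noisy, so the
      # controller should opt to simulate before choosing.
      action = choose(q_means=[1.0, 1.05], q_stds=[0.8, 0.9],
                      model_rewards=[1.0, 2.0])
      print(f"chosen arm: {action}")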

  10. The value of multivariate model sophistication

    DEFF Research Database (Denmark)

    Rombouts, Jeroen; Stentoft, Lars; Violante, Francesco

    2014-01-01

    We assess the predictive accuracies of a large number of multivariate volatility models in terms of pricing options on the Dow Jones Industrial Average. We measure the value of model sophistication in terms of dollar losses by considering a set of 444 multivariate models that differ in their specification... In addition to investigating the value of model sophistication in terms of dollar losses directly, we also use the model confidence set approach to statistically infer the set of models that delivers the best pricing performances.

  11. Rights and Intentions in Value Modeling

    Science.gov (United States)

    Johannesson, Paul; Bergholtz, Maria

    In order to manage increasingly complex business and IT environments, organizations need effective instruments for representing and understanding this complexity. Essential among these instruments are enterprise models, i.e. computational representations of the structure, processes, information, resources, and intentions of organizations. One important class of enterprise models are value models, which focus on the business motivations and intentions behind business processes and describe them in terms of high level notions like actors, resources, and value exchanges. The essence of these value exchanges is often taken to be an ownership transfer. However, some value exchanges cannot be analyzed in this way, e.g. the use of a service does not influence ownership. The goal of this chapter is to offer an analysis of the notion of value exchanges, based on Hohfeld's classification of rights, and to propose notation and practical modeling guidelines that make use of this analysis.

  12. ECONOMIC MODELING STOCKS CONTROL SYSTEM: SIMULATION MODEL

    OpenAIRE

    Климак, М.С.; Войтко, С.В.

    2016-01-01

    Theoretical and applied aspects of the development of simulation models to predict the optimal development of production systems that create tangible products and services are considered. It is argued that the process of inventory control needs economic and mathematical modeling, in view of the complexity of theoretical studies. A simulation model of stock control is presented that allows management decisions to be made in production logistics.

  13. Progress in modeling and simulation.

    Science.gov (United States)

    Kindler, E

    1998-01-01

    For the modeling of systems, computers are used more and more, while the other "media" (including the human intellect) carrying the models are abandoned. For the modeling of knowledge, i.e. of more or less general concepts (possibly used to model systems composed of instances of such concepts), object-oriented programming is nowadays widely used. For the modeling of processes existing and developing in time, computer simulation is used, the results of which are often presented by means of animation (graphical pictures moving and changing in time). Unfortunately, object-oriented programming tools are commonly not designed to be of great use for simulation, while programming tools for simulation do not enable their users to apply the advantages of object-oriented programming. Nevertheless, there are exceptions that enable general concepts represented at a computer to be used for constructing simulation models and for their easy modification. They are described in the present paper, together with precise definitions of modeling, simulation and object-oriented programming (including cases that do not satisfy the definitions but risk introducing misunderstanding), an outline of their applications, and an outline of their further development. Since computing systems are being introduced as control components into a large spectrum of (technological, social and biological) systems, attention is oriented to models of systems containing modeling components.

  14. Stochastic modeling analysis and simulation

    CERN Document Server

    Nelson, Barry L

    1995-01-01

    A coherent introduction to the techniques for modeling dynamic stochastic systems, this volume also offers a guide to the mathematical, numerical, and simulation tools of systems analysis. Suitable for advanced undergraduates and graduate-level industrial engineers and management science majors, it proposes modeling systems in terms of their simulation, regardless of whether simulation is employed for analysis. Beginning with a view of the conditions that permit a mathematical-numerical analysis, the text explores Poisson and renewal processes, Markov chains in discrete and continuous time, se

  15. FASTBUS simulation models in VHDL

    International Nuclear Information System (INIS)

    Appelquist, G.

    1992-11-01

    Four hardware simulation models implementing the FASTBUS protocol are described. The models are written in the VHDL hardware description language to obtain portability, i.e. without relations to any specific simulator. They include two complete FASTBUS devices, a full-duplex segment interconnect and ancillary logic for the segment. In addition, master and slave models using a high level interface to describe FASTBUS operations, are presented. With these models different configurations of FASTBUS systems can be evaluated and the FASTBUS transactions of new devices can be verified. (au)

  16. Model reduction for circuit simulation

    CERN Document Server

    Hinze, Michael; Maten, E Jan W Ter

    2011-01-01

    Simulation based on mathematical models plays a major role in computer aided design of integrated circuits (ICs). Decreasing structure sizes, increasing packing densities and driving frequencies require the use of refined mathematical models that take into account secondary, parasitic effects. This leads to very high dimensional problems which nowadays require simulation times too large for the short time-to-market demands in industry. Modern Model Order Reduction (MOR) techniques present a way out of this dilemma by providing surrogate models which keep the main characteristics of the device.

  17. Greenhouse simulation models.

    NARCIS (Netherlands)

    Bot, G.P.A.

    1989-01-01

    A model is a representation of a real system to describe some properties, i.e. internal factors of that system (outputs), as functions of some external factors (inputs). It is impossible to describe the relation between all internal factors (if even all internal factors could be defined) and all external factors.

  18. Comparison of perceived value structural models

    Directory of Open Access Journals (Sweden)

    Sunčana Piri Rajh

    2012-07-01

    Full Text Available Perceived value has been considered an important determinant of consumer shopping behavior and studied as such for a long period of time. According to one research stream, perceived value is a variable determined by perceived quality and perceived sacrifice. Another research stream suggests that the perception of value is a result of the consumer risk perception. This implies the presence of two somewhat independent research streams that are integrated by a third research stream – the one suggesting that perceived value is a result of perceived quality and perceived sacrifices, while perceived (performance and financial) risk mediates the relationship between perceived quality and perceived sacrifices on the one hand, and perceived value on the other. This paper describes the three approaches (models) that have been mentioned. The aim of the paper is to determine which of the observed models shows the most acceptable level of fit to the empirical data. Using the survey method, research involving three product categories has been conducted on a sample of Croatian consumers. Collected data was analyzed by the structural equation modeling (SEM) method. Research has shown an appropriate level of fit of each observed model to the empirical data. However, the model measuring the effect of perceived risk on perceived value indicates the best level of fit, which implies that perceived performance risk and perceived financial risk are the best predictors of perceived value.

  19. Calculation for simulation of archery goal value using a web camera and ultrasonic sensor

    Science.gov (United States)

    Rusjdi, Darma; Abdurrasyid, Wulandari, Dewi Arianti

    2017-08-01

    The development of a digital indoor archery simulator device based on embedded systems is a solution to the limited availability of adequate fields or open spaces, especially in big cities. Development of the device requires simulations to calculate the score achieved on the target, based on a parabolic-motion approach defined by the initial velocity and the direction of motion of the arrow toward the target. The simulator device should be complemented with an initial-velocity measuring device using ultrasonic sensors and a device measuring the direction to the target using a digital camera. The methodology uses research and development of application software from a modeling and simulation approach. The research objective is to create simulation applications that calculate the score achieved by the arrows, as a preliminary stage for the development of the archery simulator device. Implementation of the score calculation into the application program generates an archery simulation game that can be used as a reference for development of the digital archery simulator in a room with embedded systems using ultrasonic sensors and web cameras. The applications developed with the simulation calculation compare against the outer radius of the target circle captured by a camera from a distance of three meters.
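
    The underlying score calculation is simple projectile geometry. The Python sketch below finds where a drag-free arrow crosses a vertical target plane three meters away and converts the radial offset into a concentric-ring score; the launch parameters and ring width are invented.

      import numpy as np

      G = 9.81  # m/s^2

      def impact_point(v0, elev_deg, azim_deg, distance):
          # Offsets where the arrow crosses the target plane at `distance`
          # metres, under drag-free parabolic motion (a simplification).
          el, az = np.radians(elev_deg), np.radians(azim_deg)
          vx = v0 * np.cos(el) * np.cos(az)       # toward the target
          vy = v0 * np.cos(el) * np.sin(az)       # lateral
          vz = v0 * np.sin(el)                    # vertical
          t = distance / vx                       # time to reach the plane
          return vy * t, vz * t - 0.5 * G * t**2  # lateral, vertical offset

      def score(y, z, ring_width=0.06):
          # Concentric-ring score: 10 at the centre, 0 outside ring 1.
          r = np.hypot(y, z)
          return max(0, 10 - int(r / ring_width))

      y, z = impact_point(v0=50.0, elev_deg=0.5, azim_deg=0.2, distance=3.0)
      print(f"offset: {y:.3f} m lateral, {z:.3f} m vertical"
            f" -> score {score(y, z)}")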

  20. A VRLA battery simulation model

    International Nuclear Information System (INIS)

    Pascoe, Phillip E.; Anbuky, Adnan H.

    2004-01-01

    A valve regulated lead acid (VRLA) battery simulation model is an invaluable tool for the standby power system engineer. The obvious use for such a model is to allow the assessment of battery performance. This may involve determining the influence of cells suffering from state of health (SOH) degradation on the performance of the entire string, or the running of test scenarios to ascertain the most suitable battery size for the application. In addition, it enables the engineer to assess the performance of the overall power system. This includes, for example, running test scenarios to determine the benefits of various load shedding schemes. It also allows the assessment of other power system components, for determining their requirements and/or vulnerabilities. Finally, a VRLA battery simulation model is vital as a stand alone tool for educational purposes. Despite the fundamentals of the VRLA battery having been established for over 100 years, its operating behaviour is often poorly understood. An accurate simulation model enables the engineer to gain a better understanding of VRLA battery behaviour. A system level multipurpose VRLA battery simulation model is presented. It allows an arbitrary battery (capacity, SOH, number of cells and number of strings) to be simulated under arbitrary operating conditions (discharge rate, ambient temperature, end voltage, charge rate and initial state of charge). The model accurately reflects the VRLA battery discharge and recharge behaviour. This includes the complex start of discharge region known as the coup de fouet.
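
    The abstract's model is far richer (coup de fouet, SOH, temperature, recharge), but the basic rate-capacity trade-off it must reproduce can be sketched with Peukert's relation; the constants below are merely typical-looking, not from the paper.

      PEUKERT_K = 1.15                 # assumed exponent for a VRLA cell
      C_RATED, T_RATED = 100.0, 20.0   # Ah at the 20-hour rate

      def runtime_hours(discharge_amps):
          # Peukert: t = T_rated * (C_rated / (I * T_rated)) ** k.
          return T_RATED * (C_RATED / (discharge_amps * T_RATED)) ** PEUKERT_K

      for amps in (5.0, 10.0, 25.0):
          print(f"{amps:5.1f} A -> approx. {runtime_hours(amps):6.2f} h")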

  1. p-values for model evaluation

    International Nuclear Information System (INIS)

    Beaujean, F.; Caldwell, A.; Kollar, D.; Kroeninger, K.

    2011-01-01

    Deciding whether a model provides a good description of data is often based on a goodness-of-fit criterion summarized by a p-value. Although there is considerable confusion concerning the meaning of p-values, leading to their misuse, they are nevertheless of practical importance in common data analysis tasks. We motivate their application using a Bayesian argumentation. We then describe commonly and less commonly known discrepancy variables and how they are used to define p-values. The distributions of these are then extracted for examples modeled on typical data analysis tasks, and comments on their usefulness for determining goodness-of-fit are given.
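
    A concrete instance of the procedure the record describes: pick a discrepancy variable, extract its distribution under the model by simulation, and read off the p-value. The Python sketch below does this for a Pearson chi-square discrepancy and a fair-die model; the observed counts are invented.

      import numpy as np

      rng = np.random.default_rng(7)

      def chi2_discrepancy(counts, expected):
          # Pearson chi-square discrepancy variable.
          return np.sum((counts - expected) ** 2 / expected)

      # Model: a fair six-sided die; data: observed roll counts.
      observed = np.array([18, 22, 16, 25, 9, 30])
      n = observed.sum()
      expected = np.full(6, n / 6)
      d_obs = chi2_discrepancy(observed, expected)

      # Monte Carlo p-value: fraction of replicated data sets whose
      # discrepancy is at least as extreme as the observed one.
      reps = 20_000
      d_rep = np.array([chi2_discrepancy(rng.multinomial(n, np.full(6, 1/6)),
                                         expected) for _ in range(reps)])
      p_value = (d_rep >= d_obs).mean()
      print(f"discrepancy = {d_obs:.1f}, Monte Carlo p-value = {p_value:.4f}")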

  2. A collision model in plasma particle simulations

    International Nuclear Information System (INIS)

    Ma Yanyun; Chang Wenwei; Yin Yan; Yue Zongwu; Cao Lihua; Liu Daqing

    2000-01-01

    In order to offset the collisional effects reduced by using finite-size particles, β particle clouds are used in particle simulation codes (β is the ratio of charge or mass of modeling particles to real ones). The method of impulse approximation (straight-line orbit approximation) is used to analyze the scattering cross section of β particle cloud plasmas. The authors obtain the relation between the values of a and β and the scattering cross section (a is the radius of the β particle cloud). By using this relation the authors can determine the values of a and β so that the collisional effects of the modeling system correspond to those of the real one. The authors can also adjust the values of a and β so that the collisional effects are fictitiously enhanced or reduced. The results of simulation are in good agreement with the theoretical ones.

  3. Sensitivity Analysis of Simulation Models

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2009-01-01

    This contribution presents an overview of sensitivity analysis of simulation models, including the estimation of gradients. It covers classic designs and their corresponding (meta)models; namely, resolution-III designs including fractional-factorial two-level designs for first-order polynomial

  4. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    Computer Based Modelling and Simulation - Modelling Deterministic Systems. N K Srinivasan. General Article, Resonance – Journal of Science Education, Volume 6, Issue 3, March 2001, pp. 46-54.

  5. Dual Value Creation and Business Model Design

    DEFF Research Database (Denmark)

    Turcan, Romeo V.

    This ethnographic research explores the process of business model design in the context of an NGO internationalizing to an emerging market. It contributes to the business model literature by investigating how this NGO - targeting multiple key stakeholders - was experimenting (1) with value...

  6. A Simulation Model Articulation of the REA Ontology

    Science.gov (United States)

    Laurier, Wim; Poels, Geert

    This paper demonstrates how the REA enterprise ontology can be used to construct simulation models for business processes, value chains and collaboration spaces in supply chains. These models support various high-level and operational management simulation applications, e.g. the analysis of enterprise sustainability and day-to-day planning. First, the basic constructs of the REA ontology and the ExSpect modelling language for simulation are introduced. Second, collaboration space, value chain and business process models and their conceptual dependencies are shown, using the ExSpect language. Third, an exhibit demonstrates the use of value chain models in predicting the financial performance of an enterprise.

  7. Value for money in particle-mesh plasma simulations

    International Nuclear Information System (INIS)

    Eastwood, J.W.

    1976-01-01

    The established particle-mesh method of simulating a collisionless plasma is discussed. Problems are outlined, and it is stated that, given constraints on mesh size and particle number, the only way to adjust the compromise between dispersive forces, collision time and heating time is by altering the force calculation cycle. In 'value for money' schemes, matching of parts of the force calculation cycle is optimized. Interparticle forces are considered. Optimized combinations of elements of the force calculation cycle are compared. Following sections cover the dispersion relation, and comparisons with other schemes. (U.K.)

  8. Vehicle dynamics modeling and simulation

    CERN Document Server

    Schramm, Dieter; Bardini, Roberto

    2014-01-01

    The authors examine in detail the fundamentals and mathematical descriptions of the dynamics of automobiles. In this context different levels of complexity will be presented, starting with basic single-track models up to complex three-dimensional multi-body models. A particular focus is on the process of establishing mathematical models on the basis of real cars and the validation of simulation results. The methods presented are explained in detail by means of selected application scenarios.

  9. A virtual laboratory notebook for simulation models.

    Science.gov (United States)

    Winfield, A J

    1998-01-01

    In this paper we describe how we have adopted the laboratory notebook as a metaphor for interacting with computer simulation models. This 'virtual' notebook stores the simulation output and meta-data (which is used to record the scientist's interactions with the simulation). The meta-data stored consists of annotations (equivalent to marginal notes in a laboratory notebook), a history tree and a log of user interactions. The history tree structure records when in 'simulation' time, and from what starting point in the tree changes are made to the parameters by the user. Typically these changes define a new run of the simulation model (which is represented as a new branch of the history tree). The tree shows the structure of the changes made to the simulation and the log is required to keep the order in which the changes occurred. Together they form a record which you would normally find in a laboratory notebook. The history tree is plotted in simulation parameter space. This shows the scientist's interactions with the simulation visually and allows direct manipulation of the parameter information presented, which in turn is used to control directly the state of the simulation. The interactions with the system are graphical and usually involve directly selecting or dragging data markers and other graphical control devices around in parameter space. If the graphical manipulators do not provide precise enough control then textual manipulation is still available which allows numerical values to be entered by hand. The Virtual Laboratory Notebook, by providing interesting interactions with the visual view of the history tree, provides a mechanism for giving the user complex and novel ways of interacting with biological computer simulation models.

  10. Stochastic models: theory and simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Field, Richard V., Jr.

    2008-03-01

    Many problems in applied science and engineering involve physical phenomena that behave randomly in time and/or space. Examples are diverse and include turbulent flow over an aircraft wing, Earth climatology, material microstructure, and the financial markets. Mathematical models for these random phenomena are referred to as stochastic processes and/or random fields, and Monte Carlo simulation is the only general-purpose tool for solving problems of this type. The use of Monte Carlo simulation requires methods and algorithms to generate samples of the appropriate stochastic model; these samples then become inputs and/or boundary conditions to established deterministic simulation codes. While numerous algorithms and tools currently exist to generate samples of simple random variables and vectors, no cohesive simulation tool yet exists for generating samples of stochastic processes and/or random fields. There are two objectives of this report. First, we provide some theoretical background on stochastic processes and random fields that can be used to model phenomena that are random in space and/or time. Second, we provide simple algorithms that can be used to generate independent samples of general stochastic models. The theory and simulation of random variables and vectors is also reviewed for completeness.

  11. Plasma modelling and numerical simulation

    International Nuclear Information System (INIS)

    Van Dijk, J; Kroesen, G M W; Bogaerts, A

    2009-01-01

    Plasma modelling is an exciting subject in which virtually all physical disciplines are represented. Plasma models combine the electromagnetic, statistical and fluid dynamical theories that have their roots in the 19th century with the modern insights concerning the structure of matter that were developed throughout the 20th century. The present cluster issue consists of 20 invited contributions, which are representative of the state of the art in plasma modelling and numerical simulation. These contributions provide an in-depth discussion of the major theories and modelling and simulation strategies, and their applications to contemporary plasma-based technologies. In this editorial review, we introduce and complement those papers by providing a bird's eye perspective on plasma modelling and discussing the historical context in which it has surfaced. (editorial review)

  12. Anybody can do Value at Risk: A Teaching Study using Parametric Computation and Monte Carlo Simulation

    Directory of Open Access Journals (Sweden)

    Yun Hsing Cheung

    2012-12-01

    Full Text Available The three main Value at Risk (VaR) methodologies are historical, parametric and Monte Carlo Simulation. Cheung & Powell (2012), using a step-by-step teaching study, showed how a nonparametric historical VaR model could be constructed using Excel, thus benefitting teachers and researchers by providing them with a readily useable teaching study and an inexpensive and flexible VaR modelling option. This article extends that work by demonstrating how parametric and Monte Carlo Simulation VaR models can also be constructed in Excel, thus providing a total Excel modelling package encompassing all three VaR methods.
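
    The two extensions the article adds to the historical method reduce to a few lines outside Excel as well. The Python sketch below computes parametric (variance-covariance) and Monte Carlo VaR from a fitted normal; the return series, portfolio value and confidence level are invented, and scipy is used for the normal quantile.

      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(3)

      returns = rng.normal(0.0005, 0.02, 1000)  # stand-in for a return series
      value, alpha = 1_000_000.0, 0.99          # portfolio value, confidence

      # Parametric (variance-covariance) VaR: quantile of a fitted normal.
      mu, sigma = returns.mean(), returns.std(ddof=1)
      var_param = -(mu + sigma * norm.ppf(1 - alpha)) * value

      # Monte Carlo VaR: quantile of returns simulated from the fitted model.
      sim = rng.normal(mu, sigma, 100_000)
      var_mc = -np.quantile(sim, 1 - alpha) * value

      print(f"1-day 99% parametric VaR:  {var_param:,.0f}")
      print(f"1-day 99% Monte Carlo VaR: {var_mc:,.0f}")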

  13. A SIMULATION MODEL OF THE GAS COMPLEX

    Directory of Open Access Journals (Sweden)

    Sokolova G. E.

    2016-06-01

    Full Text Available The article considers the dynamics of gas production in Russia, the structure of sales in the different market segments, as well as the comparative dynamics of selling prices in these segments. It approaches the creation of a gas complex using a simulation model that allows the efficiency of the project to be estimated and the stability region of the obtained solutions to be determined. The presented model takes into account the repayment of the loan, making it possible to determine, from the first year of simulation, whether the loan can be repaid. The model object is a group of gas fields, characterized by the minimum flow rate above which the project is cost-effective. In determining the minimum source flow rate, the discount rate is taken as a generalized weighted average percentage on debt and equity, taking into account risk premiums. It also serves as the lower barrier to internal rate of return, below which the project is rejected as ineffective. Analysis of the dynamics and expert evaluation methods allow the intervals of variation of the simulated parameters to be determined, such as the price of gas and the time at which the gas complex reaches projected capacity. The simulated parameter values, calculated using the Monte Carlo method for each random realization of the model, allow a set of optimal values of the minimum well yield to be obtained for each realization, and also allow the stability region of the solution to be determined.

  14. Modeling and simulation of photovoltaic solar panel

    International Nuclear Information System (INIS)

    Belarbi, M.; Haddouche, K.; Midoun, A.

    2006-01-01

    In this article, we present a new approach for estimating the model parameters of a photovoltaic solar panel according to the irradiance and temperature. The parameters of the one-diode model are obtained from the knowledge of three operating points: short circuit, open circuit, and maximum power. In the first step, the system of equations defined by the three operating points is solved to express all the model parameters as functions of the series resistance. Secondly, an iterative resolution at the optimal operating point, using the Newton-Raphson method, calculates the series resistance value as well as the model parameters. Once the panel model is identified, we consider other equations to take the irradiance and temperature effects into account. The simulation results show the convergence speed of the model parameters and the possibility of visualizing the electrical behaviour of the panel according to the irradiance and temperature. Let us note that a sensitivity of the algorithm at the optimal operating point was observed: a small variation of the optimal voltage value leads to a very large variation of the identified parameter values. With the identified model, we can develop maximum power point tracking algorithms, and make simulations of a solar water pumping system. (Author)
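
    To illustrate the Newton-Raphson step on the implicit one-diode equation, the Python sketch below solves for panel current at a given voltage; the five model parameters are invented, and the update uses the analytic derivative of the diode equation.

      import numpy as np

      # Illustrative one-diode parameters (not from any specific panel):
      I_PH, I_0, RS, RSH = 5.0, 1e-9, 0.4, 200.0   # A, A, ohm, ohm
      A = 1.3 * 0.0259 * 36    # ideality x thermal voltage x cells in series

      def current(v, tol=1e-9, max_iter=50):
          # Newton-Raphson on f(i) = Iph - I0*(exp((v+i*Rs)/A) - 1)
          #                          - (v+i*Rs)/Rsh - i = 0
          i = I_PH                                 # initial guess
          for _ in range(max_iter):
              e = np.exp((v + i * RS) / A)
              f = I_PH - I_0 * (e - 1.0) - (v + i * RS) / RSH - i
              df = -I_0 * e * RS / A - RS / RSH - 1.0
              step = f / df
              i -= step
              if abs(step) < tol:
                  break
          return i

      for v in (0.0, 10.0, 20.0, 25.0):
          print(f"V = {v:5.1f} V  ->  I = {current(v):.3f} A")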

  15. The Values of College Students in Business Simulation Game: A Means-End Chain Approach

    Science.gov (United States)

    Lin, Yu-Ling; Tu, Yu-Zu

    2012-01-01

    Business simulation games (BSGs) enable students to practice making decisions in a virtual environment, accumulate experience in application of strategies, and train themselves in modes of decision-making. This study examines the value sought by players of BSG. In this study, a means-end chain (MEC) model was adopted as the basis, and ladder…

  16. Model for Simulation Atmospheric Turbulence

    DEFF Research Database (Denmark)

    Lundtang Petersen, Erik

    1976-01-01

    A method that produces realistic simulations of atmospheric turbulence is developed and analyzed. The procedure makes use of a generalized spectral analysis, often called a proper orthogonal decomposition or the Karhunen-Loève expansion. A set of criteria, emphasizing a realistic appearance...... eigenfunctions and estimates of the distributions of the corresponding expansion coefficients. The simulation method utilizes the eigenfunction expansion procedure to produce preliminary time histories of the three velocity components simultaneously. As a final step, a spectral shaping procedure is then applied....... The method is unique in modeling the three velocity components simultaneously, and it is found that important cross-statistical features are reasonably well-behaved. It is concluded that the model provides a practical, operational simulator of atmospheric turbulence....
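
    The simulation recipe the record outlines can be condensed as follows: eigendecompose a target covariance, then synthesize realizations from the Karhunen-Loève expansion. The Python sketch below uses an exponential correlation as an assumed stand-in for real turbulence statistics, and one velocity component instead of three.

      import numpy as np

      rng = np.random.default_rng(5)

      # Target covariance of a stationary process (assumed exponential form).
      n, ell = 256, 20.0
      t = np.arange(n)
      C = np.exp(-np.abs(t[:, None] - t[None, :]) / ell)

      # Karhunen-Loeve: eigendecomposition of the covariance matrix.
      w, Phi = np.linalg.eigh(C)            # eigenvalues ascending
      w, Phi = w[::-1], Phi[:, ::-1]        # sort descending

      # Simulate: x = sum_k sqrt(w_k) * xi_k * phi_k with xi_k ~ N(0, 1).
      k = int(np.searchsorted(np.cumsum(w) / w.sum(), 0.99)) + 1  # 99% energy
      xi = rng.standard_normal(k)
      x = Phi[:, :k] @ (np.sqrt(w[:k]) * xi)

      print(f"{k} of {n} modes retained; sample variance = {x.var():.2f}")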

  17. New Trends, News Values, and New Models.

    Science.gov (United States)

    Higgins, Mary Anne

    1996-01-01

    Explores implications of the prediction that in the next millennium the public will experience a scarcity of knowledge and a surplus of information. Reviews research suggesting that journalists focus on these news values: emphasizing how/why, devaluing immediacy, specializing/analyzing, representing a constituency. Examines two new models of…

  18. Mathematical models for photovoltaic solar panel simulation

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Jose Airton A. dos; Gnoatto, Estor; Fischborn, Marcos; Kavanagh, Edward [Universidade Tecnologica Federal do Parana (UTFPR), Medianeira, PR (Brazil)], Emails: airton@utfpr.edu.br, gnoatto@utfpr.edu.br, fisch@utfpr.edu.br, kavanagh@utfpr.edu.br

    2008-07-01

    A photovoltaic generator is subject to several variations of solar intensity, ambient temperature and load that change its operating point. Its behavior should therefore be analyzed under such variations, in order to optimize its operation. The present work sought to simulate a photovoltaic generator, of polycrystalline silicon, from the characteristics supplied by the manufacturer, and to compare the results of two mathematical models with values obtained in the field, in the city of Cascavel, over a period of one year. (author)

  19. Compression of magnetohydrodynamic simulation data using singular value decomposition

    International Nuclear Information System (INIS)

    Castillo Negrete, D. del; Hirshman, S.P.; Spong, D.A.; D'Azevedo, E.F.

    2007-01-01

    Numerical calculations of magnetic and flow fields in magnetohydrodynamic (MHD) simulations can result in extensive data sets. Particle-based calculations in these MHD fields, needed to provide closure relations for the MHD equations, will require communication of this data to multiple processors and rapid interpolation at numerous particle orbit positions. To facilitate this analysis it is advantageous to compress the data using singular value decomposition (SVD, or principal orthogonal decomposition, POD) methods. As an example of the compression technique, SVD is applied to magnetic field data arising from a dynamic nonlinear MHD code. The performance of the SVD compression algorithm is analyzed by calculating Poincaré plots for electron orbits in a three-dimensional magnetic field and comparing the results with uncompressed data.
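
    A minimal sketch of the compression idea, applied to an illustrative 2D field rather than actual MHD data: keep only the leading singular triplets and compare the reconstruction error and storage against the full array.

        import numpy as np

        rng = np.random.default_rng(1)

        # Illustrative smooth 2D "field" standing in for MHD simulation data
        x = np.linspace(0, 1, 400)
        field = (np.outer(np.sin(2*np.pi*x), np.cos(4*np.pi*x))
                 + 0.01 * rng.normal(size=(400, 400)))

        U, s, Vt = np.linalg.svd(field, full_matrices=False)

        k = 10                               # retained singular values
        approx = U[:, :k] * s[:k] @ Vt[:k]   # rank-k reconstruction

        rel_err = np.linalg.norm(field - approx) / np.linalg.norm(field)
        ratio = (U[:, :k].size + k + Vt[:k].size) / field.size
        print(f"rank-{k} relative error: {rel_err:.2e}, storage ratio: {ratio:.3f}")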

  20. Validation process of simulation model

    International Nuclear Information System (INIS)

    San Isidro, M. J.

    1998-01-01

    A methodology for the empirical validation of any detailed simulation model is presented. This kind of validation is always tied to an experimental case. Empirical validation has a residual character, because the conclusions are based on comparisons between simulated outputs and experimental measurements. The methodology guides the detection of failures in the simulation model, and it can also be used as a guide in the design of subsequent experiments. Three steps can be clearly differentiated. Sensitivity analysis: it can be performed with DSA, differential sensitivity analysis, and with MCSA, Monte Carlo sensitivity analysis. Search for the optimal domains of the input parameters: a procedure based on Monte Carlo methods and cluster techniques has been developed to find the optimal domains of these parameters. Residual analysis: this analysis has been carried out in the time domain and in the frequency domain, using correlation analysis and spectral analysis. As an application of this methodology, the validation of a thermal simulation model of buildings is presented, studying the behavior of building components in a test cell at LECE of CIEMAT, Spain. (Author) 17 refs
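
    A minimal sketch of the Monte Carlo sensitivity analysis (MCSA) step, under an assumed toy model: input parameters are sampled from their ranges, the model output is computed for each sample, and a simple correlation coefficient flags the influential inputs. The model, parameter names, and ranges are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(2)

        def model(u_wall, g_window, ach):
            """Toy thermal response standing in for the detailed simulation (assumed)."""
            return 10.0 * u_wall + 4.0 * g_window + 1.5 * ach + rng.normal(scale=0.5)

        n = 2000
        samples = {
            "u_wall":   rng.uniform(0.2, 2.0, n),
            "g_window": rng.uniform(0.3, 0.8, n),
            "ach":      rng.uniform(0.1, 3.0, n),
        }
        output = np.array([model(a, b, c) for a, b, c in zip(*samples.values())])

        for name, vals in samples.items():
            r = np.corrcoef(vals, output)[0, 1]
            print(f"{name:9s} correlation with output: {r:+.2f}")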

  1. Modeling and Simulation for Safeguards

    International Nuclear Information System (INIS)

    Swinhoe, Martyn T.

    2012-01-01

    The purpose of this talk is to give an overview of the role of modeling and simulation in Safeguards R&D and to introduce (some of) the tools used. Some definitions are: (1) Modeling - the representation, often mathematical, of a process, concept, or operation of a system, often implemented by a computer program; (2) Simulation - the representation of the behavior or characteristics of one system through the use of another system, especially a computer program designed for the purpose; and (3) Safeguards - the timely detection of the diversion of significant quantities of nuclear material. The roles of modeling and simulation are: (1) to calculate amounts of material (plant modeling); (2) to calculate signatures of nuclear material etc. (source terms); and (3) to model detector performance (radiation transport and detection). Plant modeling software (e.g. FACSIM) gives the flows and the amounts of material stored at all parts of the process. In safeguards this allows us to calculate the expected uncertainty of the mass and evaluate the expected MUF. We can determine the measurement accuracy required to achieve a certain performance.
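
    A minimal sketch of a material balance evaluation of the kind described: MUF is the book inventory minus the measured ending inventory, and its standard deviation follows from the measurement uncertainties by propagation in quadrature. All quantities below are illustrative assumptions.

        import math

        # Illustrative balance-period data: (measured value in kg, 1-sigma uncertainty in kg)
        beginning_inventory = (120.0, 0.30)
        receipts            = (450.0, 0.90)
        removals            = (430.0, 0.85)
        ending_inventory    = (139.6, 0.35)

        muf = (beginning_inventory[0] + receipts[0]
               - removals[0] - ending_inventory[0])
        sigma_muf = math.sqrt(sum(u**2 for _, u in
                                  (beginning_inventory, receipts, removals, ending_inventory)))

        print(f"MUF = {muf:.2f} kg, sigma_MUF = {sigma_muf:.2f} kg")
        print("alarm" if abs(muf) > 3 * sigma_muf else "no alarm at 3-sigma")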

  2. Modeling and Simulation of Nanoindentation

    Science.gov (United States)

    Huang, Sixie; Zhou, Caizhi

    2017-11-01

    Nanoindentation is a hardness test method applied to small volumes of material which can provide some unique effects and spark many related research activities. To fully understand the phenomena observed during nanoindentation tests, modeling and simulation methods have been developed to predict the mechanical response of materials during nanoindentation. However, challenges remain with those computational approaches, because of their length scale, predictive capability, and accuracy. This article reviews recent progress and challenges for modeling and simulation of nanoindentation, including an overview of molecular dynamics, the quasicontinuum method, discrete dislocation dynamics, and the crystal plasticity finite element method, and discusses how to integrate multiscale modeling approaches seamlessly with experimental studies to understand the length-scale effects and microstructure evolution during nanoindentation tests, creating a unique opportunity to establish new calibration procedures for the nanoindentation technique.

  3. An Alignment Model for Collaborative Value Networks

    Science.gov (United States)

    Bremer, Carlos; Azevedo, Rodrigo Cambiaghi; Klen, Alexandra Pereira

    This paper presents parts of the work carried out in several global organizations through the development of strategic projects with high tactical and operational complexity. These projects invested in long-term relationships, operated strongly in the transformation of the competitive model, and focused on value chain management; their main aim was the alignment of multiple value chains. The projects were led by the Axia Transformation Methodology as well as by its Management Model, following the principles of Project Management. As a concrete result of the efforts made in recent years in the Brazilian market, this work also introduces the Alignment Model, which supports the transformation process that the companies undergo.

  4. Assessment of Molecular Modeling & Simulation

    Energy Technology Data Exchange (ETDEWEB)

    None

    2002-01-03

    This report reviews the development and applications of molecular and materials modeling in Europe and Japan in comparison to those in the United States. Topics covered include computational quantum chemistry, molecular simulations by molecular dynamics and Monte Carlo methods, mesoscale modeling of material domains, molecular-structure/macroscale property correlations like QSARs and QSPRs, and related information technologies like informatics and special-purpose molecular-modeling computers. The panel's findings include the following: The United States leads this field in many scientific areas. However, Canada has particular strengths in DFT methods and homogeneous catalysis; Europe in heterogeneous catalysis, mesoscale, and materials modeling; and Japan in materials modeling and special-purpose computing. Major government-industry initiatives are underway in Europe and Japan, notably in multi-scale materials modeling and in development of chemistry-capable ab-initio molecular dynamics codes.

  5. NRTA simulation by modeling PFPF

    International Nuclear Information System (INIS)

    Asano, Takashi; Fujiwara, Shigeo; Takahashi, Saburo; Shibata, Junichi; Totsu, Noriko

    2003-01-01

    In PFPF, an NRTA system has been applied since 1991. Evaluation of the facility material accountancy data provided by the operator at each IIV has confirmed that no significant MUF was generated. At a throughput of PFPF scale, MUF can be evaluated with sufficient detection probability by the present NRTA evaluation manner. As throughput increases, however, the uncertainty of the material accountancy grows and the detection probability declines. The relationship between increasing throughput and declining detection probability, and the maximum throughput at which a sufficient detection probability can still be achieved when the following measures are applied, were evaluated by simulating the NRTA system, based on a model of PFPF. The measures for increasing the detection probability are: shortening of the evaluation interval; segmentation of the evaluation area. This report presents the results of these simulations. (author)

  6. The "resident's dilemma"? Values and strategies of medical residents for education interactions: a cellular automata simulation.

    Science.gov (United States)

    Heckerling, P S; Gerber, B S; Weiner, S J

    2006-01-01

    Medical residents engage in formal and informal education interactions with fellow residents during the working day, and can choose whether to spend time and effort on such interactions. Time and effort spent on such interactions can bring learning and personal satisfaction to residents, but may also delay completion of clinical work. Using hypothetical cases, we assessed the values and strategies of internal medicine residents at one hospital for both cooperative and non-cooperative education interactions with fellow residents. We then used these data and cellular automata models of two-person games to simulate repeated interactions between residents, and to determine which strategies resulted in greatest accrued value. We conducted sensitivity analyses on several model parameters, to test the robustness of dominant strategies to model assumptions. Twenty-nine of the 57 residents (50.9%) valued cooperation more than non-cooperation no matter what the other resident did during the current interaction. Similarly, thirty-six residents (63.2%) endorsed an unconditional always-cooperate strategy no matter what the other resident had done during their previous interaction. In simulations, an always-cooperate strategy accrued more value (776.42 value units) than an aggregate of strategies containing non-cooperation components (675.0 value units, p = 0.052). Only when the probability of strategy errors reached 50%, or when values were re-ordered to match those of a Prisoner's Dilemma, did non-cooperation-based strategies accrue the most value. Cooperation-based values and strategies were most frequent among our residents, and dominated in simulations of repeated education interactions between them.
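
    A minimal sketch of the kind of repeated two-person game used in the study: an always-cooperate strategy plays against a strategy with a non-cooperation component over many rounds, with occasional strategy errors, and the accrued value is compared. The payoff numbers and the error probability are illustrative assumptions, not the residents' elicited values.

        import random

        random.seed(3)

        # Illustrative payoffs: (my action, other's action) -> value accrued to me
        PAYOFF = {("C", "C"): 3.0, ("C", "D"): 1.0, ("D", "C"): 4.0, ("D", "D"): 2.0}
        ERROR = 0.05   # probability that a strategy's intended move is flipped

        def always_cooperate(history):
            return "C"

        def tit_for_tat(history):
            return history[-1] if history else "C"   # copy the opponent's last move

        def play(strat_a, strat_b, rounds=1000):
            hist_a, hist_b = [], []    # each side's record of the *opponent's* moves
            total_a = total_b = 0.0
            for _ in range(rounds):
                a, b = strat_a(hist_a), strat_b(hist_b)
                if random.random() < ERROR: a = "D" if a == "C" else "C"
                if random.random() < ERROR: b = "D" if b == "C" else "C"
                total_a += PAYOFF[(a, b)]
                total_b += PAYOFF[(b, a)]
                hist_a.append(b); hist_b.append(a)
            return total_a, total_b

        print(play(always_cooperate, tit_for_tat))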

  7. Repository simulation model: Final report

    International Nuclear Information System (INIS)

    1988-03-01

    This report documents the application of computer simulation to the design analysis of the nuclear waste repository's waste handling and packaging operations. The Salt Repository Simulation Model was used to evaluate design alternatives during the conceptual design phase of the Salt Repository Project. Code development and verification was performed by the Office of Nuclear Waste Isolation (ONWI). The focus of this report is to relate the experience gained during the development and application of the Salt Repository Simulation Model to future repository design phases. Design of the repository's waste handling and packaging systems will require sophisticated analysis tools to evaluate complex operational and logistical design alternatives. Selection of these design alternatives in the Advanced Conceptual Design (ACD) and License Application Design (LAD) phases must be supported by analysis to demonstrate that the repository design will cost-effectively meet DOE's mandated emplacement schedule and that uncertainties in the performance of the repository's systems have been objectively evaluated. Computer simulation of repository operations will provide future repository designers with data and insights that no other form of analysis can provide. 6 refs., 10 figs

  8. Simulating spin models on GPU

    Science.gov (United States)

    Weigel, Martin

    2011-09-01

    Over the last couple of years it has been realized that the vast computational power of graphics processing units (GPUs) could be harvested for purposes other than the video game industry. This power, which at least nominally exceeds that of current CPUs by large factors, results from the relative simplicity of the GPU architectures as compared to CPUs, combined with a large number of parallel processing units on a single chip. To benefit from this setup for general computing purposes, the problems at hand need to be prepared in a way to profit from the inherent parallelism and hierarchical structure of memory accesses. In this contribution I discuss the performance potential for simulating spin models, such as the Ising model, on GPU as compared to conventional simulations on CPU.
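
    A minimal CPU reference sketch of the Metropolis single-spin-flip simulation of the 2D Ising model mentioned above; a GPU port would assign lattice sites to threads, typically updating them in a checkerboard pattern. The lattice size, temperature, and sweep count are arbitrary choices.

        import numpy as np

        rng = np.random.default_rng(4)

        L, T, sweeps = 32, 2.27, 200          # lattice size, temperature, MC sweeps
        spins = rng.choice([-1, 1], size=(L, L))

        for _ in range(sweeps):
            for _ in range(L * L):            # one sweep = L*L attempted flips
                i, j = rng.integers(L, size=2)
                nn = (spins[(i+1) % L, j] + spins[(i-1) % L, j]
                      + spins[i, (j+1) % L] + spins[i, (j-1) % L])
                dE = 2.0 * spins[i, j] * nn   # energy cost of flipping spin (i, j)
                if dE <= 0 or rng.random() < np.exp(-dE / T):
                    spins[i, j] = -spins[i, j]

        print("magnetization per spin:", spins.mean())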

  9. Standard for Models and Simulations

    Science.gov (United States)

    Steele, Martin J.

    2016-01-01

    This NASA Technical Standard establishes uniform practices in modeling and simulation to ensure essential requirements are applied to the design, development, and use of models and simulations (M&S), while ensuring acceptance criteria are defined by the program/project and approved by the responsible Technical Authority. It also provides an approved set of requirements, recommendations, and criteria with which M&S may be developed, accepted, and used in support of NASA activities. As the M&S disciplines employed and application areas involved are broad, the common aspects of M&S across all NASA activities are addressed. The discipline-specific details of a given M&S should be obtained from relevant recommended practices. The primary purpose is to reduce the risks associated with M&S-influenced decisions by ensuring the complete communication of the credibility of M&S results.

  10. The value of simulations and games for tertiary education

    NARCIS (Netherlands)

    Overmans, J.F.A.|info:eu-repo/dai/nl/375780718; Bakker, W.E.|info:eu-repo/dai/nl/080095291; van Zeeland, Y.R.A.|info:eu-repo/dai/nl/314101160; van der Ree, G.; Jeuring, J.T.|info:eu-repo/dai/nl/075189771; van Mil, M.H.W.; Glas, M.A.J.|info:eu-repo/dai/nl/330981447; van de Grint, E.J.M.; Bastings, M.A.S.|info:eu-repo/dai/nl/133948676; de Smale, S.; Dictus, W.J.A.G.

    Simulations and games play an important role in how young people learn. Through simulations and games you can practice skills that are relevant for professional practice. Through simulations and games you can learn to deal with complexity and diversity. Simulations and games already play a role in

  11. Mathematical models and numerical simulation in electromagnetism

    CERN Document Server

    Bermúdez, Alfredo; Salgado, Pilar

    2014-01-01

    The book represents a basic support for a master course in electromagnetism oriented to numerical simulation. The main goal of the book is for the reader to know the boundary-value problems of partial differential equations that should be solved in order to perform computer simulation of electromagnetic processes. Moreover, it includes a part devoted to electric circuit theory based on ordinary differential equations. The book is mainly oriented to electrical engineering applications, going from the general to the specific, namely, from the full Maxwell's equations to the particular cases of electrostatics, direct current, magnetostatics and eddy currents models. Apart from standard exercises related to analytical calculus, the book includes some others oriented to real-life applications solved with the MaxFEM free simulation software.

  12. Genomic breeding value estimation using nonparametric additive regression models

    Directory of Open Access Journals (Sweden)

    Solberg Trygve

    2009-01-01

    Genomic selection refers to the use of genome-wide dense markers for breeding value estimation and subsequently for selection. The main challenge of genomic breeding value estimation is the estimation of many effects from a limited number of observations. Bayesian methods have been proposed to cope successfully with these challenges. As an alternative class of models, non- and semiparametric models were recently introduced. The present study investigated the ability of nonparametric additive regression models to predict genomic breeding values. The genotypes were modelled separately for each marker or pair of flanking markers (i.e. the predictors). The nonparametric functions for the predictors were estimated simultaneously using additive model theory, applying a binomial kernel. The optimal degree of smoothing was determined by bootstrapping. A mutation-drift-balance simulation was carried out. The breeding values of the last generation (genotyped) were predicted using data from the next-to-last generation (genotyped and phenotyped). The results show moderate to high accuracies of the predicted breeding values. A predictor-specific determination of the degree of smoothing increased the accuracy.
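
    A minimal sketch of the nonparametric idea: a Nadaraya-Watson-style kernel smoother estimates the phenotypic mean at each genotype code of one marker. The discrete kernel and the simulated data are illustrative stand-ins for the binomial kernel and the additive multi-marker fit of the paper.

        import numpy as np

        rng = np.random.default_rng(5)

        # Simulated single-marker data: genotype codes 0/1/2 and phenotypes
        g = rng.integers(0, 3, size=500)
        y = 0.4 * g + rng.normal(scale=1.0, size=500)

        def kernel(d, lam=0.3):
            """Simple discrete kernel: weight decays with genotype distance (assumed form)."""
            return lam ** np.abs(d)

        def smooth(g_new):
            w = kernel(g - g_new)
            return np.sum(w * y) / np.sum(w)   # Nadaraya-Watson estimate

        for code in (0, 1, 2):
            print(f"genotype {code}: predicted value {smooth(code):+.3f}")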

  13. A critical look at the kinetic parameter values used in simulating the thermoluminescence glow-curve

    Energy Technology Data Exchange (ETDEWEB)

    Sadek, A.M., E-mail: dr_amrsadek@hotmail.com [Ionizing Radiation Metrology Department, National Institute for Standards, El-Haram, Giza (Egypt); Kitis, G. [Nuclear Physics and Elementary Particles Physics Section, Physics Department, Aristotle University of Thessaloniki, 54124 Thessaloniki, Makedonia (Greece)

    2017-03-15

    Objections against utilizing the peak fitting method to compute the kinetic parameters of thermoluminescence (TL) glow peaks have been discussed previously in the literature. These objections arose from testing the accuracy of peak fitting by applying it to simulated peaks. The results showed that in some cases the simulated peaks may have unusual geometrical properties and do not reflect the real properties of TL peaks; estimating the accuracy of peak fitting by applying it to such peaks would therefore be misleading. Two main reasons may lead to unrealistic simulated peaks: the improper selection of the simulation inputs, and performing the TL simulation process via the heating stage only. It has been proved that considering the irradiation and the relaxation stages in the simulation process is crucial. However, there are other cases in which the analytical methods were not able to reveal the real values of the simulated peaks. These cases were successfully resolved using analytical expressions derived from the one trap-one recombination (OTOR) level model and the non-interactive multiple trap system (NMTS) model. A general conclusion can be drawn that the accuracy of the peak fitting method depends critically on the TL analytical expressions utilized in the method. The failure of this method in estimating the TL kinetic parameters should be attributed to the TL model equation utilized in the fitting process. - Highlights: • Objections against using the TL peak fitting method are discussed. • Improper selection of simulation inputs may lead to unrealistic TL peaks. • Considering the irradiation and the relaxation stages in simulation is crucial. • TL expressions could not describe TL peaks with unrealistic geometrical properties. • The accuracy of the peak fitting method depends on the model used in the fitting.
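
    A minimal sketch of a simulated first-order TL glow peak via the Randall-Wilkins expression, which underlies such kinetic analyses; the trap parameters (activation energy E, frequency factor s), the heating rate, and the temperature range are illustrative choices, and the irradiation/relaxation stages discussed above are not modeled.

        import numpy as np

        kB = 8.617e-5                          # Boltzmann constant, eV/K
        E, s, beta, n0 = 1.0, 1e12, 1.0, 1.0   # eV, 1/s, K/s, initial trapped charge

        T = np.linspace(300.0, 500.0, 2000)    # linear heating from 300 K to 500 K
        boltz = np.exp(-E / (kB * T))

        # Randall-Wilkins first-order glow peak:
        # I(T) = n0 * s * exp(-E/kT) * exp(-(s/beta) * integral_T0^T exp(-E/kT') dT')
        dT = T[1] - T[0]
        integral = np.cumsum(boltz) * dT
        I = n0 * s * boltz * np.exp(-(s / beta) * integral)

        print(f"peak temperature: {T[np.argmax(I)]:.1f} K")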

  14. Verifying and Validating Simulation Models

    Energy Technology Data Exchange (ETDEWEB)

    Hemez, Francois M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-02-23

    This presentation is a high-level discussion of the Verification and Validation (V&V) of computational models. Definitions of V&V are given to emphasize that “validation” is never performed in a vacuum; it accounts, instead, for the current state of knowledge in the discipline considered. In particular, comparisons between physical measurements and numerical predictions should account for their respective sources of uncertainty. The differences between error (bias), aleatoric uncertainty (randomness) and epistemic uncertainty (ignorance, lack of knowledge) are briefly discussed. Four types of uncertainty in physics and engineering are discussed: 1) experimental variability, 2) variability and randomness, 3) numerical uncertainty and 4) model-form uncertainty. Statistical sampling methods are available to propagate, and analyze, variability and randomness. Numerical uncertainty originates from the truncation error introduced by the discretization of partial differential equations in time and space. Model-form uncertainty is introduced by assumptions often formulated to render a complex problem more tractable and amenable to modeling and simulation. The discussion concludes with high-level guidance to assess the “credibility” of numerical simulations, which stems from the level of rigor with which these various sources of uncertainty are assessed and quantified.

  15. Advances in Intelligent Modelling and Simulation Simulation Tools and Applications

    CERN Document Server

    Oplatková, Zuzana; Carvalho, Marco; Kisiel-Dorohinicki, Marek

    2012-01-01

    The human capacity to abstract complex systems and phenomena into simplified models has played a critical role in the rapid evolution of our modern industrial processes and scientific research. As a science and an art, Modelling and Simulation have been one of the core enablers of this remarkable human trait, and have become a topic of great importance for researchers and practitioners. This book was created to compile some of the most recent concepts, advances, challenges and ideas associated with Intelligent Modelling and Simulation frameworks, tools and applications. The first chapter discusses the important aspects of human interaction and the correct interpretation of results during simulations. The second chapter gets to the heart of the analysis of entrepreneurship by means of agent-based modelling and simulations. The following three chapters bring together the central theme of simulation frameworks, first describing an agent-based simulation framework, then a simulator for electrical machines, and…

  16. Continuous Spatial Process Models for Spatial Extreme Values

    KAUST Repository

    Sang, Huiyan

    2010-01-28

    We propose a hierarchical modeling approach for explaining a collection of point-referenced extreme values. In particular, annual maxima over space and time are assumed to follow generalized extreme value (GEV) distributions, with parameters μ, σ, and ξ specified in the latent stage to reflect underlying spatio-temporal structure. The novelty here is that we relax the conditional independence assumption in the first stage of the hierarchical model, an assumption which has been adopted in previous work. This assumption implies that realizations of the surface of spatial maxima will be everywhere discontinuous. For many phenomena including, e.g., temperature and precipitation, this behavior is inappropriate. Instead, we offer a spatial process model for extreme values that provides mean square continuous realizations, where the behavior of the surface is driven by the spatial dependence which is unexplained under the latent spatio-temporal specification for the GEV parameters. In this sense, the first stage smoothing is viewed as fine scale or short range smoothing, while the larger scale smoothing will be captured in the second stage of the modeling. In addition, as would be desired, we are able to implement spatial interpolation for extreme values based on this model. A simulation study and a study on actual annual maximum rainfall for a region in South Africa are used to illustrate the performance of the model. © 2009 International Biometric Society.
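
    A minimal sketch of the GEV building block in the first stage of such a model: simulate annual maxima at one site and recover the location, scale, and shape parameters with scipy, whose genextreme distribution uses the shape convention c = -ξ. The parameter values and sample size are arbitrary.

        from scipy import stats

        # Simulate 80 years of annual maxima at one site (arbitrary GEV parameters)
        xi, mu, sigma = 0.1, 30.0, 5.0   # shape, location, scale
        maxima = stats.genextreme.rvs(c=-xi, loc=mu, scale=sigma, size=80,
                                      random_state=0)

        # Fit the GEV back to the sample; scipy's shape c is the negative of xi
        c_hat, mu_hat, sigma_hat = stats.genextreme.fit(maxima)
        print(f"xi = {-c_hat:+.3f}, mu = {mu_hat:.2f}, sigma = {sigma_hat:.2f}")

        # 100-year return level: the 0.99 quantile of the fitted distribution
        print("100-year return level:",
              stats.genextreme.ppf(0.99, c_hat, loc=mu_hat, scale=sigma_hat))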

  17. MODELLING, SIMULATING AND OPTIMIZING BOILERS

    DEFF Research Database (Denmark)

    Sørensen, Kim; Condra, Thomas Joseph; Houbak, Niels

    2004-01-01

    In the present work a framework for optimizing the design of boilers for dynamic operation has been developed. A cost function to be minimized during the optimization has been formulated, and for the present, design variables related to the boiler volume and the boiler load gradient (i.e. the firing rate … on the boiler) have been defined. Furthermore, a number of constraints related to minimum and maximum boiler load gradient, minimum boiler size, shrinking and swelling, and steam space load have been defined. For defining the constraints related to the required boiler volume, a dynamic model for simulating the boiler performance has been developed. Outputs from the simulations are the shrinking and swelling of the water level in the drum during, for example, a start-up of the boiler; these figures, combined with the requirements with respect to allowable water level fluctuations in the drum, define the requirements with respect to drum…
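
    A minimal sketch of a constrained design optimization of the kind described above, with an invented toy cost function and one illustrative constraint; the framework's real cost function, design variables, and constraint set are not reproduced here.

        from scipy.optimize import minimize

        def cost(x):
            """Toy cost: capital cost grows with volume, operating penalty shrinks with gradient."""
            volume, gradient = x
            return 50.0 * volume + 200.0 / gradient

        def swell_margin(x):
            """Illustrative constraint: larger drums tolerate the swell of faster load changes."""
            volume, gradient = x
            return volume - 0.8 * gradient   # required to be >= 0

        res = minimize(cost, x0=[10.0, 5.0],
                       bounds=[(5.0, 50.0), (1.0, 10.0)],
                       constraints=[{"type": "ineq", "fun": swell_margin}],
                       method="SLSQP")
        print("optimal (volume, gradient):", res.x, "cost:", res.fun)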

  18. Norms and values in sociohydrological models

    Directory of Open Access Journals (Sweden)

    M. Roobavannan

    2018-02-01

    Sustainable water resources management relies on understanding how societies and water systems coevolve. Many place-based sociohydrology (SH) modeling studies use proxies, such as environmental degradation, to capture key elements of the social component of system dynamics. Parameters of assumed relationships between environmental degradation and the human response to it are usually obtained through calibration. Since these relationships are not yet underpinned by social-science theories, confidence in the predictive power of such place-based sociohydrologic models remains low. The generalizability of SH models therefore requires major advances in incorporating more realistic relationships, underpinned by appropriate hydrological and social-science data and theories. The latter is a critical input, since human culture – especially values and norms arising from it – influences behavior and the consequences of behaviors. This paper reviews a key social-science theory that links cultural factors to environmental decision-making, assesses how to better incorporate social-science insights to enhance SH models, and raises important questions to be addressed in moving forward. This is done in the context of recent progress in sociohydrological studies and the gaps that remain to be filled. The paper concludes with a discussion of challenges and opportunities in terms of generalization of SH models and the use of available data to allow future prediction and model transfer to ungauged basins.

  19. Norms and values in sociohydrological models

    Science.gov (United States)

    Roobavannan, Mahendran; van Emmerik, Tim H. M.; Elshafei, Yasmina; Kandasamy, Jaya; Sanderson, Matthew R.; Vigneswaran, Saravanamuthu; Pande, Saket; Sivapalan, Murugesu

    2018-02-01

    Sustainable water resources management relies on understanding how societies and water systems coevolve. Many place-based sociohydrology (SH) modeling studies use proxies, such as environmental degradation, to capture key elements of the social component of system dynamics. Parameters of assumed relationships between environmental degradation and the human response to it are usually obtained through calibration. Since these relationships are not yet underpinned by social-science theories, confidence in the predictive power of such place-based sociohydrologic models remains low. The generalizability of SH models therefore requires major advances in incorporating more realistic relationships, underpinned by appropriate hydrological and social-science data and theories. The latter is a critical input, since human culture - especially values and norms arising from it - influences behavior and the consequences of behaviors. This paper reviews a key social-science theory that links cultural factors to environmental decision-making, assesses how to better incorporate social-science insights to enhance SH models, and raises important questions to be addressed in moving forward. This is done in the context of recent progress in sociohydrological studies and the gaps that remain to be filled. The paper concludes with a discussion of challenges and opportunities in terms of generalization of SH models and the use of available data to allow future prediction and model transfer to ungauged basins.

  20. SEMI Modeling and Simulation Roadmap

    Energy Technology Data Exchange (ETDEWEB)

    Hermina, W.L.

    2000-10-02

    With the exponential growth in the power of computing hardware and software, modeling and simulation is becoming a key enabler for the rapid design of reliable Microsystems. One vision of the future microsystem design process would include the following primary software capabilities: (1) The development of 3D part design, through standard CAD packages, with automatic design rule checks that guarantee the manufacturability and performance of the microsystem. (2) Automatic mesh generation, for 3D parts as manufactured, that permits computational simulation of the process steps, and the performance and reliability analysis for the final microsystem. (3) Computer generated 2D layouts for process steps that utilize detailed process models to generate the layout and process parameter recipe required to achieve the desired 3D part. (4) Science-based computational tools that can simulate the process physics, and the coupled thermal, fluid, structural, solid mechanics, electromagnetic and material response governing the performance and reliability of the microsystem. (5) Visualization software that permits the rapid visualization of 3D parts including cross-sectional maps, performance and reliability analysis results, and process simulation results. In addition to these desired software capabilities, a desired computing infrastructure would include massively parallel computers that enable rapid high-fidelity analysis, coupled with networked compute servers that permit computing at a distance. We now discuss the individual computational components that are required to achieve this vision. There are three primary areas of focus: design capabilities, science-based capabilities and computing infrastructure. Within each of these areas, there are several key capability requirements.

  1. Models for setting ATM parameter values

    DEFF Research Database (Denmark)

    Blaabjerg, Søren; Gravey, A.; Romæuf, L.

    1996-01-01

    In ATM networks, a user should negotiate at connection set-up a traffic contract which includes traffic characteristics and requested QoS. The traffic characteristics currently considered are the Peak Cell Rate, the Sustainable Cell Rate, the Intrinsic Burst Tolerance and the Cell Delay Variation (CDV) tolerance(s). The values taken by these traffic parameters characterize the so-called "Worst Case Traffic" that is used by CAC procedures for accepting a new connection and allocating resources to it. Conformance to the negotiated traffic characteristics is defined, at the ingress User… It is essential to set traffic characteristic values that are relevant to the considered cell stream, and that ensure that the amount of non-conforming traffic is small. Using a queueing model representation for the GCRA formalism, several methods are available for choosing the traffic characteristics. This paper…
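
    A minimal sketch of the GCRA in its virtual scheduling form, against which conformance is checked: T is the increment (the reciprocal of the negotiated rate) and tau the tolerance. The cell arrival times below are illustrative.

        def gcra(arrivals, T, tau):
            """Virtual scheduling GCRA: returns a conformance flag per cell arrival time."""
            tat = arrivals[0] if arrivals else 0.0   # theoretical arrival time
            flags = []
            for t in arrivals:
                if t < tat - tau:
                    flags.append(False)              # cell too early: non-conforming
                else:
                    tat = max(t, tat) + T            # conforming: schedule next slot
                    flags.append(True)
            return flags

        # Cells at these times, rate of 1 cell per 10 time units, CDV tolerance 2
        print(gcra([0, 10, 12, 15, 40], T=10.0, tau=2.0))
        # -> [True, True, False, False, True]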

  2. Photovoltaic array performance simulation models

    Energy Technology Data Exchange (ETDEWEB)

    Menicucci, D. F.

    1986-09-15

    The experience of the solar industry confirms that, despite recent cost reductions, the profitability of photovoltaic (PV) systems is often marginal and the configuration and sizing of a system is a critical problem for the design engineer. Construction and evaluation of experimental systems are expensive and seldom justifiable. A mathematical model or computer-simulation program is a desirable alternative, provided reliable results can be obtained. Sandia National Laboratories, Albuquerque (SNLA), has been studying PV-system modeling techniques in an effort to develop an effective tool to be used by engineers and architects in the design of cost-effective PV systems. This paper reviews two of the sources of error found in previous PV modeling programs, presents the remedies developed to correct these errors, and describes a new program that incorporates these improvements.

  3. Simulation guided value stream mapping and lean improvement: A case study of a tubular machining facility

    Directory of Open Access Journals (Sweden)

    Wei Xia

    2013-06-01

    Purpose: This paper describes a typical value stream mapping (VSM) application, enhanced by discrete event simulation (DES), to a dedicated tubular manufacturing process. Design/Methodology/Approach: VSM is prescribed as part of the lean production portfolio of tools; it not only highlights process inefficiencies and transactional and communication mismatches, but also guides improvement areas. Meanwhile, DES is used to reduce uncertainty and create consensus by visualizing dynamic process views. It serves as a complementary tool to traditional VSM, providing the justification and quantifiable evidence needed to make the case for the lean approaches. A simulation model is developed to replicate the operation of an existing system, and that of a proposed system that modifies the existing design to incorporate lean manufacturing shop floor principles. Findings: A comprehensive model of the tubular manufacturing process is constructed, and distinctive scenarios are derived to uncover an optimal future state of the process. Various simulation scenarios are developed. The simulated results are acquired and investigated, and they match the real production data well. Originality/Value: DES is demonstrated as a guiding tool to assist organizations with the decision to implement lean approaches by quantifying the benefits of applying the VSM. A roadmap is provided to illustrate how the VSM is used to design a desired future state. The developed simulation scenarios mimic the behavior of the actual manufacturing process in an intuitive manner.
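
    A minimal discrete event simulation sketch of a two-station line of the sort a VSM/DES study would compare between current and future states. It assumes the simpy package is available; the station names, cycle times, buffer size, and part count are all invented.

        import random
        import simpy

        random.seed(7)
        done = []

        def station(env, inbox, outbox, mean_cycle):
            while True:
                part = yield inbox.get()
                yield env.timeout(random.expovariate(1.0 / mean_cycle))  # processing
                yield outbox.put(part)

        def sink(env, inbox):
            while True:
                yield inbox.get()
                done.append(env.now)        # record completion time

        env = simpy.Environment()
        raw = simpy.Store(env)
        buf = simpy.Store(env, capacity=5)  # WIP cap between the two stations
        out = simpy.Store(env)

        for i in range(100):                # release 100 parts at time zero
            raw.put(f"part-{i}")

        env.process(station(env, raw, buf, mean_cycle=4.0))
        env.process(station(env, buf, out, mean_cycle=5.0))
        env.process(sink(env, out))
        env.run(until=1000)
        print(f"completed {len(done)} parts; last at t = {done[-1]:.1f}")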

  4. Simulated annealing model of acupuncture

    Science.gov (United States)

    Shang, Charles; Szu, Harold

    2015-05-01

    The growth control singularity model suggests that acupuncture points (acupoints) originate from organizers in embryogenesis. Organizers are singular points in growth control. Acupuncture can cause perturbation of a system with effects similar to simulated annealing. In a clinical trial, the goal of a treatment is to relieve a certain disorder, which corresponds to reaching a certain local optimum in simulated annealing. The self-organizing effect of the system is limited and related to the person's general health and age. Perturbation at acupoints can lead to a stronger local excitation (analogous to a higher annealing temperature) compared to perturbation at non-singular points (placebo control points). Such differences diminish as the number of perturbed points increases, due to the wider distribution of the limited self-organizing activity. This model explains the following facts from systematic reviews of acupuncture trials: 1. Properly chosen single-acupoint treatment for certain disorders can lead to highly repeatable efficacy above placebo. 2. When multiple acupoints are used, the result can be highly repeatable if the patients are relatively healthy and young, but is usually mixed if the patients are old and frail and have multiple disorders at the same time, as the number of local optima or comorbidities increases. 3. As the number of acupoints used increases, the efficacy difference between sham and real acupuncture often diminishes. The model predicts that the efficacy of acupuncture is negatively correlated with disease chronicity and severity and with the patient's age. This is the first biological-physical model of acupuncture which can predict and guide clinical acupuncture research.
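
    A minimal generic simulated annealing sketch, to make the analogy concrete: local perturbations are accepted with a temperature-dependent probability, and the temperature is gradually lowered toward a local optimum. The objective function and schedule are arbitrary examples, unrelated to the biological model.

        import math
        import random

        random.seed(8)

        def energy(x):
            """Arbitrary multimodal objective standing in for 'disorder severity'."""
            return x**2 + 10.0 * math.sin(3.0 * x)

        x = 4.0                    # initial state
        T = 5.0                    # initial "annealing temperature"
        best = (energy(x), x)

        while T > 1e-3:
            candidate = x + random.gauss(0.0, 0.5)        # local perturbation
            dE = energy(candidate) - energy(x)
            if dE < 0 or random.random() < math.exp(-dE / T):
                x = candidate                             # accept the move
                best = min(best, (energy(x), x))
            T *= 0.999                                    # cooling schedule

        print(f"best energy {best[0]:.3f} at x = {best[1]:.3f}")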

  5. Estimating oil price 'Value at Risk' using the historical simulation approach

    International Nuclear Information System (INIS)

    Cabedo, J.D.; Moya, I.

    2003-01-01

    In this paper we propose using Value at Risk (VaR) for oil price risk quantification. VaR provides an estimate of the maximum oil price change associated with a given likelihood level, and can be used for designing risk management strategies. We analyse three VaR calculation methods: the standard historical simulation approach, the historical simulation with ARMA forecasts (HSAF) approach developed in this paper, and the variance-covariance method based on forecasts from autoregressive conditional heteroskedasticity models. The results obtained indicate that the HSAF methodology provides a flexible VaR quantification, which fits the continuous oil price movements well and provides an efficient risk quantification. (author)

  6. Estimating oil price 'Value at Risk' using the historical simulation approach

    International Nuclear Information System (INIS)

    David Cabedo, J.; Moya, Ismael

    2003-01-01

    In this paper we propose using Value at Risk (VaR) for oil price risk quantification. VaR provides an estimation for the maximum oil price change associated with a likelihood level, and can be used for designing risk management strategies. We analyse three VaR calculation methods: the historical simulation standard approach, the historical simulation with ARMA forecasts (HSAF) approach, developed in this paper, and the variance-covariance method based on autoregressive conditional heteroskedasticity models forecasts. The results obtained indicate that HSAF methodology provides a flexible VaR quantification, which fits the continuous oil price movements well and provides an efficient risk quantification
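
    A minimal sketch of the historical simulation approach to VaR described in the two records above: the empirical quantile of the return distribution gives the maximum loss at a given confidence level. The returns here are synthetic, and the ARMA-forecast refinement (HSAF) is only indicated in a comment.

        import numpy as np

        rng = np.random.default_rng(9)

        # Synthetic daily oil price returns standing in for the historical sample
        returns = rng.standard_t(df=4, size=1500) * 0.02

        confidence = 0.99
        var = -np.percentile(returns, 100 * (1 - confidence))
        print(f"1-day {confidence:.0%} VaR: {var:.2%} of position value")

        # HSAF variant (sketch): fit an ARMA model to the return series, take the
        # quantile of the forecast residuals, and recentre it on the conditional
        # mean forecast instead of using the raw unconditional quantile.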

  7. Clinical value of hemodynamic numerical simulation applied in the treatment of cerebral aneurysm.

    Science.gov (United States)

    Zhang, Hailin; Li, Li; Cheng, Chongjie; Sun, Xiaochuan

    2017-12-01

    Our objective was to evaluate the clinical value of numerical simulation in diagnosing cerebral aneurysm, based on the analysis of a numerical simulation of a hemodynamic model. The experimental method used was a numerical model of cerebral aneurysm hemodynamics, in which the numerical value of blood flow at each point was analyzed. The results showed that the wall shear stress (WSS) value on the top of CA1 was significantly lower than that at the neck (P < 0.05); the WSS value of each point on the CA2 tumor was significantly lower than that at the tumor neck (P < 0.05); the WSS values at the tumor top and tumor neck did not differ significantly between CA1 and CA2 (P > 0.05); the unsteady index of shear (UIS) value at the 20 points changed distinctly, with a range of 0.6-1.5; and the unsteady index of pressure (UIP) value of every point was significantly lower than the UIS value, with a range of 0.25-0.40. In conclusion, the application of cerebral aneurysm hemodynamic research can help doctors diagnose cerebral aneurysm more precisely and grasp the opportunity for treatment when formulating treatment strategies.

  8. MODELING AND SIMULATION OF A HYDROCRACKING UNIT

    Directory of Open Access Journals (Sweden)

    HASSAN A. FARAG

    2016-06-01

    Hydrocracking is used in the petroleum industry to convert low quality feed stocks into high valued transportation fuels such as gasoline, diesel, and jet fuel. The aim of the present work is to develop a rigorous steady state two-dimensional mathematical model, which includes conservation equations of mass and energy, for simulating the operation of a hydrocracking unit. Both the catalyst bed and the quench zone have been included in this integrated model. The model equations were solved numerically in both the axial and radial directions using Matlab software. The model was tested against real plant data from Egypt. The results indicated very good agreement between the model predictions and industrial values for the temperature profiles, concentration profiles, and conversion in both the radial and axial directions of the hydrocracking unit. Simulation of the conversion and temperature profiles in the quench zone was also included and showed low deviation from the actual ones. In the concentration profiles, the percentage deviation was found to be 9.28% for the first reactor and 9.6% for the second reactor. The effects of several parameters, such as the pellet heat transfer coefficient, effective radial thermal conductivity, wall heat transfer coefficient, effective radial diffusivity, and the cooling medium (quench zone), have been included in this study. Variations of the wall heat transfer coefficient and of the effective radial diffusivity in the near-wall region gave no remarkable changes in the temperature profiles. On the other hand, even small variations of the effective radial thermal conductivity affected the simulated temperature profiles significantly, and this effect could not be compensated by variations of the other parameters of the model.
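
    A minimal sketch of the conservation-equation idea in one dimension: a plug-flow mass and energy balance with invented first-order lumped kinetics for the disappearance of the heavy feed, integrated along the reactor axis with scipy. None of the rate constants or operating values come from the paper.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Invented lumped kinetics: heavy feed -> products, first order, Arrhenius
        A, Ea, R = 250.0, 6.0e4, 8.314           # 1/s, J/mol, J/(mol K)
        dH, rho_cp, u = -2.0e5, 2.0e6, 0.01      # J/mol, J/(m^3 K), m/s

        def rhs(z, y):
            C, T = y                             # concentration (mol/m^3), temperature (K)
            r = A * np.exp(-Ea / (R * T)) * C    # reaction rate
            dCdz = -r / u                        # steady-state mass balance along the axis
            dTdz = (-dH) * r / (rho_cp * u)      # energy balance (exothermic heating)
            return [dCdz, dTdz]

        sol = solve_ivp(rhs, (0.0, 5.0), [100.0, 620.0])   # 5 m bed, inlet C and T
        C_out, T_out = sol.y[:, -1]
        print(f"outlet conversion: {1 - C_out / 100.0:.2%}, outlet T: {T_out:.1f} K")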

  9. Operations planning simulation: Model study

    Science.gov (United States)

    1974-01-01

    The use of simulation modeling for the identification of system sensitivities to internal and external forces and variables is discussed. The technique provides a means of exploring alternate system procedures and processes, so that these alternatives may be considered on a mutually comparative basis, permitting the selection of a mode or modes of operation which have potential advantages to the system user and the operator. These measures of system efficiency are: (1) the ability to meet specific schedules for operations, mission or mission readiness requirements, or performance standards, and (2) the ability to accomplish the objectives within cost-effective limits.

  10. Advancing Material Models for Automotive Forming Simulations

    International Nuclear Information System (INIS)

    Vegter, H.; An, Y.; Horn, C.H.L.J. ten; Atzema, E.H.; Roelofsen, M.E.

    2005-01-01

    Simulations in the automotive industry need more advanced material models to achieve highly reliable forming and springback predictions. Conventional material models implemented in FEM simulation models are not capable of describing the plastic material behaviour during monotonic strain paths with sufficient accuracy. Recently, ESI and Corus have co-operated on the implementation of an advanced material model in the FEM code PAMSTAMP 2G. This applies to the strain hardening model, the influence of strain rate, and the description of the yield locus in these models. A subsequent challenge is the description of the material after a change of strain path. The use of advanced high strength steels in the automotive industry requires a description of the plastic material behaviour of multiphase steels. The simplest variant is dual phase steel, consisting of a ferritic and a martensitic phase. Multiphase materials also contain a bainitic phase in addition to the ferritic and martensitic phases. More physical descriptions of strain hardening than simple fitted Ludwik/Nadai curves are necessary. Methods to predict the plastic behaviour of single-phase materials use a simple dislocation interaction model based on the formed cell structures only. A new method proposed at Corus to predict the plastic behaviour of multiphase materials has to take into account hard phases, which deform less easily. The resulting deformation gradients create geometrically necessary dislocations. Additional microstructural information, such as the morphology and size of hard-phase particles or grains, is necessary to derive strain hardening models for this type of material. Measurements available from the Numisheet benchmarks allow these models to be validated. At Corus, additional measured values are available from cross-die tests. This laboratory test can attain critical deformations by large variations in blank size and processing conditions. The tests are a powerful tool in optimising forming simulations prior…

  11. A simulation model for material accounting systems

    International Nuclear Information System (INIS)

    Coulter, C.A.; Thomas, K.E.

    1987-01-01

    A general-purpose model that was developed to simulate the operation of a chemical processing facility for nuclear materials has been extended to describe material measurement and accounting procedures as well. The model now provides descriptors for material balance areas, a large class of measurement instrument types and their associated measurement errors for various classes of materials, the measurement instruments themselves with their individual calibration schedules, and material balance closures. Delayed receipt of measurement results (as for off-line analytical chemistry assay), with interim use of a provisional measurement value, can be accurately represented. The simulation model can be used to estimate inventory difference variances for processing areas that do not operate at steady state, to evaluate the timeliness of measurement information, to determine process impacts of measurement requirements, and to evaluate the effectiveness of diversion-detection algorithms. Such information is usually difficult to obtain by other means. Use of the measurement simulation model is illustrated by applying it to estimate inventory difference variances for two material balance area structures of a fictitious nuclear material processing line

  12. Impulse pumping modelling and simulation

    International Nuclear Information System (INIS)

    Pierre, B; Gudmundsson, J S

    2010-01-01

    Impulse pumping is a new pumping method based on propagation of pressure waves. Of particular interest is the application of impulse pumping to artificial lift situations, where fluid is transported from wellbore to wellhead using pressure waves generated at wellhead. The motor driven element of an impulse pumping apparatus is therefore located at wellhead and can be separated from the flowline. Thus operation and maintenance of an impulse pump are facilitated. The paper describes the different elements of an impulse pumping apparatus, reviews the physical principles and details the modelling of the novel pumping method. Results from numerical simulations of propagation of pressure waves in water-filled pipelines are then presented for illustrating impulse pumping physical principles, and validating the described modelling with experimental data.

  13. Simulation model of a PWR power plant

    International Nuclear Information System (INIS)

    Larsen, N.

    1987-03-01

    A simulation model of a hypothetical PWR power plant is described. A large number of disturbances and failures in plant function can be simulated. The model is written as seven modules to the modular simulation system for continuous processes DYSIM and serves also as a user example of this system. The model runs in Fortran 77 on the IBM-PC-AT. (author)

  14. Quantile uncertainty and value-at-risk model risk.

    Science.gov (United States)

    Alexander, Carol; Sarabia, José María

    2012-08-01

    This article develops a methodology for quantifying model risk in quantile risk estimates. The application of quantile estimates to risk assessment has become common practice in many disciplines, including hydrology, climate change, statistical process control, insurance and actuarial science, and the uncertainty surrounding these estimates has long been recognized. Our work is particularly important in finance, where quantile estimates (called Value-at-Risk) have been the cornerstone of banking risk management since the mid 1980s. A recent amendment to the Basel II Accord recommends additional market risk capital to cover all sources of "model risk" in the estimation of these quantiles. We provide a novel and elegant framework whereby quantile estimates are adjusted for model risk, relative to a benchmark which represents the state of knowledge of the authority that is responsible for model risk. A simulation experiment in which the degree of model risk is controlled illustrates how to quantify Value-at-Risk model risk and compute the required regulatory capital add-on for banks. An empirical example based on real data shows how the methodology can be put into practice, using only two time series (daily Value-at-Risk and daily profit and loss) from a large bank. We conclude with a discussion of potential applications to nonfinancial risks. © 2012 Society for Risk Analysis.

  15. Advanced empirical estimate of information value for credit scoring models

    Directory of Open Access Journals (Sweden)

    Martin Řezáč

    2011-01-01

    Credit scoring is a term for a wide spectrum of predictive models and their underlying techniques that aid financial institutions in granting credit. These methods decide who will get credit, how much credit they should get, and what further strategies will enhance the profitability of the borrowers to the lenders. Many statistical tools are available for measuring the quality, in the sense of predictive power, of credit scoring models. Because it is impossible to use a scoring model effectively without knowing how good it is, quality indexes like the Gini coefficient, the Kolmogorov-Smirnov statistic and the Information value are used to assess the quality of a given credit scoring model. The paper deals primarily with the Information value, sometimes called divergence. Commonly it is computed by discretising the data into bins using deciles, in which case one constraint must be met: the number of cases has to be nonzero for all bins. If this constraint is not fulfilled, there are practical procedures for preserving finite results. As an alternative to the empirical estimates, one can use kernel smoothing theory, which allows unknown densities to be estimated and, consequently, using some numerical method for integration, the Information value to be estimated. The main contribution of this paper is the proposal and description of an empirical estimate with supervised interval selection. This advanced estimate is based on the requirement to have at least k observations of scores of both good and bad clients in each considered interval, where k is a positive integer. A simulation study shows that this estimate outperforms both the empirical estimate using deciles and the kernel estimate. Furthermore, it shows high dependency on the choice of the parameter k: if we choose too small a value, we get an overestimated Information value, and vice versa. The adjusted square root of the number of bad clients seems to be a reasonable compromise.
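
    A minimal sketch of the common decile-based Information value computation the paper starts from; the scores and labels are simulated, and the supervised interval selection refinement is not implemented.

        import numpy as np

        rng = np.random.default_rng(10)

        # Simulated scores: bad clients (label 1) score lower on average
        n_good, n_bad = 5000, 500
        scores = np.concatenate([rng.normal(0.6, 0.15, n_good),
                                 rng.normal(0.4, 0.15, n_bad)])
        bad = np.concatenate([np.zeros(n_good), np.ones(n_bad)])

        # Decile bins on the score distribution
        edges = np.percentile(scores, np.arange(0, 101, 10))
        edges[0], edges[-1] = -np.inf, np.inf

        iv = 0.0
        for lo, hi in zip(edges[:-1], edges[1:]):
            in_bin = (scores > lo) & (scores <= hi)
            p_good = (in_bin & (bad == 0)).sum() / n_good   # share of goods in bin
            p_bad = (in_bin & (bad == 1)).sum() / n_bad     # share of bads in bin
            if p_good > 0 and p_bad > 0:                    # the nonzero-bin constraint
                iv += (p_good - p_bad) * np.log(p_good / p_bad)

        print(f"Information value: {iv:.3f}")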

  16. Creating Value in Marketing and Business Simulations: An Author's Viewpoint

    Science.gov (United States)

    Cadotte, Ernest R.

    2016-01-01

    Simulations are a form of competitive training that can provide transformational learning. Participants are pushed by the competition and their own desire to win as well as the continual feedback, encouragement, and guidance of a Business Coach. Simulations enable students to apply their knowledge and practice their business skills over and over.…

  17. Stochastic models to simulate paratuberculosis in dairy herds

    DEFF Research Database (Denmark)

    Nielsen, Søren Saxmose; Weber, M.F.; Kudahl, Anne Margrethe Braad

    2011-01-01

    Stochastic simulation models are widely accepted as a means of assessing the impact of changes in daily management and the control of different diseases, such as paratuberculosis, in dairy herds. This paper summarises and discusses the assumptions of four stochastic simulation models and their use… Although the models are somewhat different in their underlying principles and do put slightly different values on the different strategies, their overall findings are similar. Therefore, simulation models may be useful in planning paratuberculosis strategies in dairy herds, although as with all models caution…

  18. MODELLING, SIMULATING AND OPTIMIZING BOILERS

    DEFF Research Database (Denmark)

    Sørensen, K.; Condra, T.; Houbak, Niels

    2003-01-01

    …and the total stress level (i.e. stresses introduced due to internal pressure plus stresses introduced due to temperature gradients) must always be kept below the allowable stress level. In this way, the increased water-/steam space that should allow for better dynamic performance in the end causes limited… freedom with respect to dynamic operation of the plant. By means of an objective function including both the price of the plant and a quantification of the value of dynamic operation of the plant, an optimization is carried out. The dynamic model of the boiler plant is applied to define parts…

  19. Galaxy Alignments: Theory, Modelling & Simulations

    Science.gov (United States)

    Kiessling, Alina; Cacciato, Marcello; Joachimi, Benjamin; Kirk, Donnacha; Kitching, Thomas D.; Leonard, Adrienne; Mandelbaum, Rachel; Schäfer, Björn Malte; Sifón, Cristóbal; Brown, Michael L.; Rassat, Anais

    2015-11-01

    The shapes of galaxies are not randomly oriented on the sky. During the galaxy formation and evolution process, environment has a strong influence, as tidal gravitational fields in the large-scale structure tend to align nearby galaxies. Additionally, events such as galaxy mergers affect the relative alignments of both the shapes and angular momenta of galaxies throughout their history. These "intrinsic galaxy alignments" are known to exist, but are still poorly understood. This review will offer a pedagogical introduction to the current theories that describe intrinsic galaxy alignments, including the apparent difference in intrinsic alignment between early- and late-type galaxies and the latest efforts to model them analytically. It will then describe the ongoing efforts to simulate intrinsic alignments using both N-body and hydrodynamic simulations. Due to the relative youth of this field, there is still much to be done to understand intrinsic galaxy alignments and this review summarises the current state of the field, providing a solid basis for future work.

  20. Issues in Value-at-Risk Modeling and Evaluation

    NARCIS (Netherlands)

    J. Daníelsson (Jón); C.G. de Vries (Casper); B.N. Jorgensen (Bjørn); P.F. Christoffersen (Peter); F.X. Diebold (Francis); T. Schuermann (Til); J.A. Lopez (Jose); B. Hirtle (Beverly)

    1998-01-01

    Discusses the issues in value-at-risk modeling and evaluation: the value of value at risk; horizon problems and extreme events in financial risk management; methods of evaluating value-at-risk estimates.

  1. Dissemination of Cultural Norms and Values: Agent-Based Modeling

    Directory of Open Access Journals (Sweden)

    Denis Andreevich Degterev

    2016-12-01

    This article shows how agent-based modeling allows us to explore the mechanisms of the dissemination of cultural norms and values, both within one country and across the world. In recent years, this type of simulation has become particularly prevalent in the analysis of international relations, growing more popular than system dynamics and discrete event simulation. The use of agent-based modeling in the analysis of international relations is connected with the agent-structure problem in international relations: structure and agents act as interdependent and dynamically changing entities in the process of their interaction. Agent-structure interaction can be modeled by means of the theory of complex adaptive systems, using agent-based modeling techniques. One of the first examples of the use of agent-based modeling in political science is T. Schelling's model of racial segregation. On the basis of this model, the author shows how changes in behavioral patterns at the micro level impact the macro level. Patterns change due to the dynamics of cultural norms and values, formed by mass media and other social institutions. The author reviews the main areas of modern application of agent-based modeling in international studies, including the analysis of ethnic conflicts and the formation of international coalitions. Particular attention is paid to Robert Axelrod's approach, based on the use of genetic algorithms, to the spread of cultural norms and values. Agent-based modeling shows how to create conditions under which norms that originally are not shared by a significant part of the population eventually spread everywhere. The practical application of these algorithms is illustrated with the example of the situation in Ukraine in 2015-2016. The article also reveals the mechanisms of the international spread of cultural norms and values. The main think-tanks using agent-based modeling in international studies are…
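
    A minimal sketch of the Schelling segregation model referenced above: two groups on a grid, and any agent with too few like neighbours relocates to a random empty cell, so micro-level preferences produce macro-level patterns. The grid size, tolerance threshold, and round count are arbitrary.

        import numpy as np

        rng = np.random.default_rng(11)

        N, EMPTY, THRESHOLD = 30, 0, 0.4   # grid size, empty marker, required like-share
        grid = rng.choice([0, 1, 2], size=(N, N), p=[0.1, 0.45, 0.45])

        def unhappy_cells():
            """All occupied cells whose share of like neighbours is below THRESHOLD."""
            cells = []
            for i in range(N):
                for j in range(N):
                    if grid[i, j] == EMPTY:
                        continue
                    block = grid[max(i-1, 0):i+2, max(j-1, 0):j+2]
                    same = (block == grid[i, j]).sum() - 1
                    occupied = (block != EMPTY).sum() - 1
                    if occupied > 0 and same / occupied < THRESHOLD:
                        cells.append((i, j))
            return cells

        print("unhappy at start:", len(unhappy_cells()))
        for _ in range(50):                  # relocation rounds
            empties = list(zip(*np.where(grid == EMPTY)))
            for (i, j) in unhappy_cells():
                k = rng.integers(len(empties))
                ei, ej = empties[k]
                grid[ei, ej], grid[i, j] = grid[i, j], EMPTY
                empties[k] = (i, j)          # the vacated cell is now empty
        print("unhappy at end:  ", len(unhappy_cells()))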

  2. Simulation and Modeling Application in Agricultural Mechanization

    Directory of Open Access Journals (Sweden)

    R. M. Hudzari

    2012-01-01

    This experiment was conducted to determine the equations relating the Hue digital values of the surface of oil palm fruits to the maturity stage of the fruit in the plantation. The FFB images were zoomed and captured using a Nikon digital camera, and the Hue was calculated using the highest-frequency values of the R, G, and B color components from histogram analysis software. A new procedure for monitoring the image pixel value of the oil palm fruit surface colour during real-time growth maturity was developed. The predicted harvesting day was calculated from the developed model of the relationship between Hue values and mesocarp oil content. The simulation model is regressed and predicts the day of harvesting, or the number of days before harvest, of FFB. The result from experimenting on mesocarp oil content can be used for real-time oil content determination with the MPOB color meter. The graph for determining the day of harvesting the FFB is presented in this research. The oil was found to start developing in the mesocarp at 65 days before the fruit reaches the ripe maturity stage of 75% oil to dry mesocarp.
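
    A minimal sketch of the colour step: take the modal (highest-frequency) R, G, B values of a fruit-surface image, here a synthetic array, and convert them to a Hue value with the standard library. The camera image and the maturity thresholds of the paper are not reproduced.

        import colorsys
        import numpy as np

        rng = np.random.default_rng(12)

        # Synthetic "image": H x W x 3 array of 8-bit RGB pixels, orange-ish surface
        img = np.clip(rng.normal([200, 90, 40], 15, size=(100, 100, 3)),
                      0, 255).astype(np.uint8)

        # Highest-frequency (modal) value per channel, as in the histogram analysis
        mode = [np.bincount(img[..., c].ravel(), minlength=256).argmax()
                for c in range(3)]

        h, s, v = colorsys.rgb_to_hsv(*(m / 255.0 for m in mode))
        print(f"modal RGB = {tuple(int(m) for m in mode)}, Hue = {h * 360:.1f} degrees")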

  3. THE MARK I BUSINESS SYSTEM SIMULATION MODEL

    Science.gov (United States)

    of a large-scale business simulation model as a vehicle for doing research in management controls. The major results of the program were the development of the Mark I business simulation model and the Simulation Package (SIMPAC). SIMPAC is a method and set of programs facilitating the construction of large simulation models. The object of this document is to describe the Mark I Corporation model, state why parts of the business were modeled as they were, and indicate the research applications of the model. (Author)

  4. Model for transient simulation in a PWR steam circuit

    International Nuclear Information System (INIS)

    Mello, L.A. de.

    1982-11-01

    A computer code (SURF) was developed and used to simulate pressure losses along the tubes of the main steam circuit of a PWR nuclear power plant, and the steam flow through relief and safety valves when pressure reaches its threshold values. A thermodynamic model of the high- and low-pressure turbines and their associated components is simulated as well. The SURF computer code was coupled to the GEVAP computer code, complementing the simulation of the main steam circuit of a PWR nuclear power plant. (Author) [pt

  5. Systematic simulations of modified gravity: chameleon models

    International Nuclear Information System (INIS)

    Brax, Philippe; Davis, Anne-Christine; Li, Baojiu; Winther, Hans A.; Zhao, Gong-Bo

    2013-01-01

    In this work we systematically study the linear and nonlinear structure formation in chameleon theories of modified gravity, using a generic parameterisation which describes a large class of models using only 4 parameters. For this we have modified the N-body simulation code ecosmog to perform a total of 65 simulations for different models and parameter values, including the default ΛCDM. These simulations enable us to explore a significant portion of the parameter space. We have studied the effects of modified gravity on the matter power spectrum and mass function, and found a rich and interesting phenomenology where the difference with the ΛCDM paradigm cannot be reproduced by a linear analysis even on scales as large as k ∼ 0.05 hMpc⁻¹, since the latter incorrectly assumes that the modification of gravity depends only on the background matter density. Our results show that the chameleon screening mechanism is significantly more efficient than other mechanisms such as the dilaton and symmetron, especially in high-density regions and at early times, and can serve as guidance to determine the parts of the chameleon parameter space which are cosmologically interesting and thus merit further studies in the future.

  6. Systematic simulations of modified gravity: chameleon models

    Energy Technology Data Exchange (ETDEWEB)

    Brax, Philippe [Institut de Physique Theorique, CEA, IPhT, CNRS, URA 2306, F-91191Gif/Yvette Cedex (France); Davis, Anne-Christine [DAMTP, Centre for Mathematical Sciences, University of Cambridge, Wilberforce Road, Cambridge CB3 0WA (United Kingdom); Li, Baojiu [Institute for Computational Cosmology, Department of Physics, Durham University, Durham DH1 3LE (United Kingdom); Winther, Hans A. [Institute of Theoretical Astrophysics, University of Oslo, 0315 Oslo (Norway); Zhao, Gong-Bo, E-mail: philippe.brax@cea.fr, E-mail: a.c.davis@damtp.cam.ac.uk, E-mail: baojiu.li@durham.ac.uk, E-mail: h.a.winther@astro.uio.no, E-mail: gong-bo.zhao@port.ac.uk [Institute of Cosmology and Gravitation, University of Portsmouth, Portsmouth PO1 3FX (United Kingdom)

    2013-04-01

    In this work we systematically study the linear and nonlinear structure formation in chameleon theories of modified gravity, using a generic parameterisation which describes a large class of models using only 4 parameters. For this we have modified the N-body simulation code ecosmog to perform a total of 65 simulations for different models and parameter values, including the default ΛCDM. These simulations enable us to explore a significant portion of the parameter space. We have studied the effects of modified gravity on the matter power spectrum and mass function, and found a rich and interesting phenomenology where the difference with the ΛCDM paradigm cannot be reproduced by a linear analysis even on scales as large as k ∼ 0.05 hMpc⁻¹, since the latter incorrectly assumes that the modification of gravity depends only on the background matter density. Our results show that the chameleon screening mechanism is significantly more efficient than other mechanisms such as the dilaton and symmetron, especially in high-density regions and at early times, and can serve as guidance to determine the parts of the chameleon parameter space which are cosmologically interesting and thus merit further studies in the future.

  7. [Modeling and Simulation of Spectral Polarimetric BRDF].

    Science.gov (United States)

    Ling, Jin-jiang; Li, Gang; Zhang, Ren-bin; Tang, Qian; Ye, Qiu

    2016-01-01

    Under polarized-light conditions, the reflection from an object's surface is affected by many factors: the refractive index, the surface roughness, and the angle of incidence. Because rough surfaces exhibit different polarized reflection characteristics at different wavelengths, a spectral polarimetric BRDF based on Kirchhoff theory is proposed. A spectral model of the complex refractive index is built by combining spectral models of the refractive index and the extinction coefficient, which were obtained from known complex refractive index values at different wavelengths. A spectral model of surface roughness is then derived from the classical surface-roughness measuring method combined with the Fresnel reflection function. Substituting the spectral models of refractive index and roughness into the BRDF model yields the spectral polarimetric BRDF model. Comparing simulation results in which only the refractive index varies with wavelength (roughness constant), in which both the refractive index and the roughness vary with wavelength, and from the original model in other papers shows that the spectral polarimetric BRDF model can represent the polarization characteristics of the surface accurately, and can provide a reliable basis for polarization remote sensing and for the classification of substances.

  8. Modeling and simulation of pressurized water reactor power plant

    International Nuclear Information System (INIS)

    Wang, S.J.

    1983-01-01

    Two kinds of balance of plant (BOP) models of a pressurized water reactor (PWR) system are developed in this work: a detailed BOP model and a simple BOP model. The detailed model is used to simulate the normal operational performance of the whole BOP system. The simple model is combined with the NSSS model for whole-plant simulation. The trends of the steady-state values of the detailed model are correct and the dynamic responses are reasonable. The simple BOP model approach starts the modelling work from the overall point of view. The responses of the normalized turbine power and of the feedwater inlet temperature to the steam generator in the simple model are compared with those of the detailed model. Both the steady-state values and the dynamic responses are close to those of the detailed model, so the simple BOP model is found adequate to represent the main performance of the BOP system. The simple BOP model was coupled with an NSSS model for a whole-plant simulation. The NSSS model consists of the reactor core model, the steam generator model, and the coolant temperature control system. A closed-loop whole-plant simulation for an electric load perturbation was performed and the results are plausible. The coupling effect between the NSSS system and the BOP system was analyzed: the feedback of the BOP system has little effect on the steam generator performance, while the performance of the BOP system is strongly affected by the steam flow rate from the NSSS.

  9. Distributed simulation a model driven engineering approach

    CERN Document Server

    Topçu, Okan; Oğuztüzün, Halit; Yilmaz, Levent

    2016-01-01

    Backed by substantive case studies, the novel approach to software engineering for distributed simulation outlined in this text demonstrates the potent synergies between model-driven techniques, simulation, intelligent agents, and computer systems development.

  10. Benchmark simulation models, quo vadis?

    Science.gov (United States)

    Jeppsson, U; Alex, J; Batstone, D J; Benedetti, L; Comas, J; Copp, J B; Corominas, L; Flores-Alsina, X; Gernaey, K V; Nopens, I; Pons, M-N; Rodríguez-Roda, I; Rosen, C; Steyer, J-P; Vanrolleghem, P A; Volcke, E I P; Vrecko, D

    2013-01-01

    As the work of the IWA Task Group on Benchmarking of Control Strategies for wastewater treatment plants (WWTPs) is coming to an end, it is essential to disseminate the knowledge gained. For this reason, all authors of the IWA Scientific and Technical Report on benchmarking have come together to provide their insights, highlighting areas where knowledge may still be deficient and where new opportunities are emerging, and to propose potential avenues for future development and application of the general benchmarking framework and its associated tools. The paper focuses on the topics of temporal and spatial extension, process modifications within the WWTP, the realism of models, control strategy extensions and the potential for new evaluation tools within the existing benchmark system. We find that there are major opportunities for application within all of these areas, either from existing work already being done within the context of the benchmarking simulation models (BSMs) or applicable work in the wider literature. Of key importance is increasing capability, usability and transparency of the BSM package while avoiding unnecessary complexity.

  11. The Advancement Value Chain: An Exploratory Model

    Science.gov (United States)

    Leonard, Edward F., III

    2005-01-01

    Since the introduction of the value chain concept in 1985, several varying, yet virtually similar, value chains have been developed for the business enterprise. Shifting to higher education, can a value chain be found that links together the various activities of advancement so that an institution's leaders can actually look at the philanthropic…

  12. Simulation modelling of fynbos ecosystems: Systems analysis and conceptual models

    CSIR Research Space (South Africa)

    Kruger, FJ

    1985-03-01

    Full Text Available ...-animal interactions. Two additional models, which expand aspects of the FYNBOS model, are described: a model for simulating canopy processes, and a Fire Recovery Simulator. The canopy process model will simulate ecophysiological processes in more detail than FYNBOS...

  13. PDF added value of a high resolution climate simulation for precipitation

    Science.gov (United States)

    Soares, Pedro M. M.; Cardoso, Rita M.

    2015-04-01

    dynamical downscaling, based on simple PDF skill scores. The measure can assess the full quality of the PDFs and at the same time integrates a flexible way to weight the PDF tails differently. In this study we apply this method to characterize the PDF added value of a high-resolution simulation with the WRF model. Results are from a WRF climate simulation centred on the Iberian Peninsula with two nested grids, a larger one at 27 km and a smaller one at 9 km, forced by ERA-Interim. The observational data used cover rain-gauge precipitation records and regular gridded datasets of daily precipitation. Two regular gridded precipitation datasets are used: a Portuguese 0.2° × 0.2° grid developed from observed daily rain-gauge precipitation, and the ENSEMBLES observational gridded dataset for Europe, which includes daily precipitation values at 0.25°. The analysis shows an important PDF added value from the higher-resolution simulation, regarding both the full PDF and the extremes. The method shows strong potential to be applied to other simulation exercises and to evaluate other variables.

  14. An introduction to enterprise modeling and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Ostic, J.K.; Cannon, C.E. [Los Alamos National Lab., NM (United States). Technology Modeling and Analysis Group

    1996-09-01

    As part of an ongoing effort to continuously improve productivity, quality, and efficiency of both industry and Department of Energy enterprises, Los Alamos National Laboratory is investigating various manufacturing and business enterprise simulation methods. A number of enterprise simulation software models are being developed to enable engineering analysis of enterprise activities. In this document the authors define the scope of enterprise modeling and simulation efforts, and review recent work in enterprise simulation at Los Alamos National Laboratory as well as at other industrial, academic, and research institutions. References of enterprise modeling and simulation methods and a glossary of enterprise-related terms are provided.

  15. Simulation and Modeling Methodologies, Technologies and Applications

    CERN Document Server

    Filipe, Joaquim; Kacprzyk, Janusz; Pina, Nuno

    2014-01-01

    This book includes extended and revised versions of a set of selected papers from the 2012 International Conference on Simulation and Modeling Methodologies, Technologies and Applications (SIMULTECH 2012) which was sponsored by the Institute for Systems and Technologies of Information, Control and Communication (INSTICC) and held in Rome, Italy. SIMULTECH 2012 was technically co-sponsored by the Society for Modeling & Simulation International (SCS), GDR I3, Lionphant Simulation, Simulation Team and IFIP and held in cooperation with AIS Special Interest Group of Modeling and Simulation (AIS SIGMAS) and the Movimento Italiano Modellazione e Simulazione (MIMOS).

  16. Structured building model reduction toward parallel simulation

    Energy Technology Data Exchange (ETDEWEB)

    Dobbs, Justin R. [Cornell University; Hencey, Brondon M. [Cornell University

    2013-08-26

    Building energy model reduction exchanges accuracy for improved simulation speed by reducing the number of dynamical equations. Parallel computing aims to improve simulation times without loss of accuracy but is poorly utilized by contemporary simulators and is inherently limited by inter-processor communication. This paper bridges these disparate techniques to implement efficient parallel building thermal simulation. We begin with a survey of three structured reduction approaches that compares their performance to a leading unstructured method. We then use structured model reduction to find thermal clusters in the building energy model and allocate processing resources. Experimental results demonstrate faster simulation and low error without any interprocessor communication.

  17. Psychosocial value of space simulation for extended spaceflight

    Science.gov (United States)

    Kanas, N.

    1997-01-01

    There have been over 60 studies of Earth-bound activities that can be viewed as simulations of manned spaceflight. These analogs have involved Antarctic and Arctic expeditions, submarines and submersible simulators, land-based simulators, and hypodynamia environments. None of these analogs has accounted for all the variables related to extended spaceflight (e.g., microgravity, long duration, heterogeneous crews), and some of the simulation conditions have been found to be more representative of space conditions than others. A number of psychosocial factors have emerged from the simulation literature that correspond to important issues reported from space. Psychological factors include sleep disorders, alterations in time sense, transcendent experiences, demographic issues, career motivation, homesickness, and increased perceptual sensitivities. Psychiatric factors include anxiety, depression, psychosis, psychosomatic symptoms, emotional reactions related to mission stage, asthenia, and postflight personality and marital problems. Finally, interpersonal factors include tension resulting from crew heterogeneity, decreased cohesion over time, need for privacy, and issues involving leadership roles and lines of authority. Since future space missions will usually involve heterogeneous crews working on complicated objectives over long periods of time, these features require further study. Socio-cultural factors affecting confined crews (e.g., language and dialect, cultural differences, gender biases) should be explored in order to minimize tension and sustain performance. Career motivation also needs to be examined for the purpose of improving crew cohesion and preventing subgrouping, scapegoating, and territorial behavior. Periods of monotony and reduced activity should be addressed in order to maintain morale, provide meaningful use of leisure time, and prevent negative consequences of low stimulation, such as asthenia and crew member withdrawal.

  18. Comparison of perceived value structural models

    OpenAIRE

    Sunčana Piri Rajh

    2012-01-01

    Perceived value has been considered an important determinant of consumer shopping behavior and studied as such for a long period of time. According to one research stream, perceived value is a variable determined by perceived quality and perceived sacrifice. Another research stream suggests that the perception of value is a result of the consumer risk perception. This implies the presence of two somewhat independent research streams that are integrated by a third research stream – the one sug...

  19. A physiological production model for cacao : results of model simulations

    NARCIS (Netherlands)

    Zuidema, P.A.; Leffelaar, P.A.

    2002-01-01

    CASE2 is a physiological model for cocoa (Theobroma cacao L.) growth and yield. This report introduces the CAcao Simulation Engine for water-limited production in a non-technical way and presents simulation results obtained with the model.

  20. Triple Value System Dynamics Modeling to Help Stakeholders Engage with Food-Energy-Water Problems

    Science.gov (United States)

    Triple Value (3V) Community scoping projects and Triple Value Simulation (3VS) models help decision makers and stakeholders apply systems-analysis methodology to complex problems related to food production, water quality, and energy use. 3VS models are decision support tools that...

  1. Mammogram synthesis using a 3D simulation. I. Breast tissue model and image acquisition simulation

    International Nuclear Information System (INIS)

    Bakic, Predrag R.; Albert, Michael; Brzakovic, Dragana; Maidment, Andrew D. A.

    2002-01-01

    A method is proposed for generating synthetic mammograms based upon simulations of breast tissue and the mammographic imaging process. A computer breast model has been designed with a realistic distribution of large and medium scale tissue structures. Parameters controlling the size and placement of simulated structures (adipose compartments and ducts) provide a method for consistently modeling images of the same simulated breast with modified position or acquisition parameters. The mammographic imaging process is simulated using a compression model and a model of the x-ray image acquisition process. The compression model estimates breast deformation using tissue elasticity parameters found in the literature and clinical force values. The synthetic mammograms were generated by a mammogram acquisition model using a monoenergetic parallel beam approximation applied to the synthetically compressed breast phantom.

  2. Simulation modeling and analysis with Arena

    CERN Document Server

    Altiok, Tayfur

    2007-01-01

    Simulation Modeling and Analysis with Arena is a highly readable textbook which treats the essentials of the Monte Carlo discrete-event simulation methodology, and does so in the context of the popular Arena simulation environment. It treats simulation modeling as an in-vitro laboratory that facilitates the understanding of complex systems and experimentation with what-if scenarios in order to estimate their performance metrics. The book contains chapters on the simulation modeling methodology and the underpinnings of discrete-event systems, as well as the relevant underlying probability, statistics, stochastic processes, input analysis, model validation and output analysis. All simulation-related concepts are illustrated in numerous Arena examples, encompassing production lines, manufacturing and inventory systems, transportation systems, and computer information systems in networked settings. It introduces the concept of discrete event Monte Carlo simulation, the most commonly used methodology for modeli...

  3. Network Modeling and Simulation A Practical Perspective

    CERN Document Server

    Guizani, Mohsen; Khan, Bilal

    2010-01-01

    Network Modeling and Simulation is a practical guide to using modeling and simulation to solve real-life problems. The authors give a comprehensive exposition of the core concepts in modeling and simulation, and then systematically address the many practical considerations faced by developers in modeling complex large-scale systems. The authors provide examples from computer and telecommunication networks and use these to illustrate the process of mapping generic simulation concepts to domain-specific problems in different industries and disciplines. Key features: Provides the tools and strate

  4. Multi-valued simulation and abstraction using lattice operations

    NARCIS (Netherlands)

    Vijzelaar, Stefan; Fokkink, W.J.

    2017-01-01

    Abstractions can cause spurious results, which need to be verified in the concrete system to gain conclusive results. Verification based on a multi-valued logic can distinguish between conclusive and inconclusive results, provides increased precision, and allows for encoding additional information

  5. Modelling and simulation of a heat exchanger

    Science.gov (United States)

    Xia, Lei; Deabreu-Garcia, J. Alex; Hartley, Tom T.

    1991-01-01

    Two models for two different control systems are developed for a parallel heat exchanger. First, by spatially lumping a heat exchanger model, a good approximate model of high system order is produced. Model reduction techniques are then applied to obtain low-order models suitable for dynamic analysis and control design. The simulation method is discussed to ensure valid simulation results.

  6. Value Reappraisal as a Conceptual Model for Task-Value Interventions

    Science.gov (United States)

    Acee, Taylor W.; Weinstein, Claire Ellen; Hoang, Theresa V.; Flaggs, Darolyn A.

    2018-01-01

    We discuss task-value interventions as one type of relevance intervention and propose a process model of value reappraisal whereby task-value interventions elicit cognitive-affective responses that lead to attitude change and in turn affect academic outcomes. The model incorporates a metacognitive component showing that students can intentionally…

  7. Modeling and simulation of large HVDC systems

    Energy Technology Data Exchange (ETDEWEB)

    Jin, H.; Sood, V.K.

    1993-01-01

    This paper addresses the complexity and the amount of work involved in preparing simulation data and implementing various converter control schemes, as well as the excessive simulation time involved in modelling and simulating large HVDC systems. The Power Electronic Circuit Analysis program (PECAN) is used to address these problems, and a large HVDC system with two dc links is simulated using PECAN. A benchmark HVDC system is studied to compare the simulation results with those from other packages. The simulation times and results are provided in the paper.

  8. A simulation of water pollution model parameter estimation

    Science.gov (United States)

    Kibler, J. F.

    1976-01-01

    A parameter estimation procedure for a water pollution transport model is elaborated. A two-dimensional instantaneous-release shear-diffusion model serves as representative of a simple transport process. Pollution concentration levels are arrived at via modeling of a remote-sensing system. The remote-sensed data are simulated by adding Gaussian noise to the concentration level values generated via the transport model. Model parameters are estimated from the simulated data using a least-squares batch processor. Resolution, sensor array size, and number and location of sensor readings can be found from the accuracies of the parameter estimates.
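
    The workflow the abstract outlines (simulate concentrations, add Gaussian sensor noise, estimate parameters by batch least squares) can be sketched as follows. The plain Gaussian-puff form, noise level, and parameter values are illustrative stand-ins; the paper's shear-diffusion model is more elaborate.

```python
import numpy as np
from scipy.optimize import curve_fit

# Synthetic "remote-sensed" data: a 2-D instantaneous-release diffusion puff
# evaluated on a sensor grid, plus Gaussian noise.
def puff(xy, mass, kx, ky):
    x, y = xy
    t = 1.0                                        # fixed time after release
    return (mass / (4.0 * np.pi * t * np.sqrt(kx * ky))
            * np.exp(-(x**2 / kx + y**2 / ky) / (4.0 * t)))

rng = np.random.default_rng(42)
gx, gy = np.meshgrid(np.linspace(-3, 3, 25), np.linspace(-3, 3, 25))
xy = (gx.ravel(), gy.ravel())
true_params = (10.0, 1.5, 0.5)                     # mass, Kx, Ky
data = puff(xy, *true_params) + rng.normal(0.0, 0.02, gx.size)

# least-squares batch estimation of the model parameters
est, cov = curve_fit(puff, xy, data, p0=(5.0, 1.0, 1.0), bounds=(1e-3, np.inf))
print("estimates :", np.round(est, 3))             # close to (10, 1.5, 0.5)
print("std errors:", np.round(np.sqrt(np.diag(cov)), 3))
```

    The accuracy of the recovered parameters, read off from the covariance matrix, plays the role the abstract assigns to resolution and sensor-array design studies.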

  9. Evaluation of articulation simulation system using artificial maxillectomy models.

    Science.gov (United States)

    Elbashti, M E; Hattori, M; Sumita, Y I; Taniguchi, H

    2015-09-01

    Acoustic evaluation is valuable for guiding the treatment of maxillofacial defects and determining the effectiveness of rehabilitation with an obturator prosthesis. Model simulations are important in terms of pre-surgical planning and pre- and post-operative speech function. This study aimed to evaluate the acoustic characteristics of voice generated by an articulation simulation system using a vocal tract model with or without artificial maxillectomy defects. More specifically, we aimed to establish a speech simulation system for maxillectomy defect models that both surgeons and maxillofacial prosthodontists can use in guiding treatment planning. Artificially simulated maxillectomy defects were prepared according to Aramany's classification (Classes I-VI) in a three-dimensional vocal tract plaster model of a subject uttering the vowel /a/. Formant and nasalance acoustic data were analysed using Computerized Speech Lab and the Nasometer, respectively. Formants and nasalance of simulated /a/ sounds were successfully detected and analysed. Values of Formants 1 and 2 for the non-defect model were 675.43 and 976.64 Hz, respectively. Median values of Formants 1 and 2 for the defect models were 634.36 and 1026.84 Hz, respectively. Nasalance was 11% in the non-defect model, whereas median nasalance was 28% in the defect models. The results suggest that an articulation simulation system can be used to help surgeons and maxillofacial prosthodontists plan post-surgical defects that will facilitate maxillofacial rehabilitation. © 2015 John Wiley & Sons Ltd.

  10. Optimization of Operations Resources via Discrete Event Simulation Modeling

    Science.gov (United States)

    Joshi, B.; Morris, D.; White, N.; Unal, R.

    1996-01-01

    The resource levels required for operation and support of reusable launch vehicles are typically defined through discrete event simulation modeling. Minimizing these resources constitutes an optimization problem involving discrete variables and simulation. Conventional approaches to solve such optimization problems involving integer valued decision variables are the pattern search and statistical methods. However, in a simulation environment that is characterized by search spaces of unknown topology and stochastic measures, these optimization approaches often prove inadequate. In this paper, we have explored the applicability of genetic algorithms to the simulation domain. Genetic algorithms provide a robust search strategy that does not require continuity and differentiability of the problem domain. The genetic algorithm successfully minimized the operation and support activities for a space vehicle, through a discrete event simulation model. The practical issues associated with simulation optimization, such as stochastic variables and constraints, were also taken into consideration.
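
    A minimal genetic-algorithm sketch in the spirit of the abstract: minimize the cost of integer-valued resource levels evaluated through a noisy objective. The stand-in objective and its hypothetical optimum replace the paper's discrete-event simulation of vehicle operations.

```python
import random

# The "simulation" below is a noisy stand-in objective over integer resource
# levels, not the paper's discrete-event model; the optimum TARGET is hypothetical.
random.seed(7)
TARGET = (4, 7, 2, 9)

def simulate(genome):
    # stochastic cost: squared distance from the (unknown) optimum plus noise
    return sum((g - t) ** 2 for g, t in zip(genome, TARGET)) + random.gauss(0, 0.5)

def mutate(genome):
    return tuple(max(0, g + random.choice((-1, 0, 1))) for g in genome)

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

pop = [tuple(random.randint(0, 10) for _ in range(4)) for _ in range(30)]
for generation in range(40):
    pop.sort(key=simulate)                 # rank by simulated cost
    parents = pop[:10]
    pop = parents + [mutate(crossover(random.choice(parents), random.choice(parents)))
                     for _ in range(20)]

best = min(pop, key=simulate)
print("best resource levels:", best)       # converges near TARGET
```

    Because selection only ranks candidates, the search needs neither continuity nor differentiability, which is the property the abstract highlights for stochastic simulation domains.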

  11. PV panel model based on datasheet values

    DEFF Research Database (Denmark)

    Sera, Dezso; Teodorescu, Remus; Rodriguez, Pedro

    2007-01-01

    This work presents the construction of a model for a PV panel using the single-diode five-parameter model, based exclusively on data-sheet parameters. The model takes into account the series and parallel (shunt) resistance of the panel. The equivalent circuit and the basic equations of the PV cell are presented. Based on these equations, a PV panel model, which is able to predict the panel behavior under different temperature and irradiance conditions, is built and tested.
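
    The single-diode five-parameter model named in the abstract leads to an implicit I-V equation that must be solved numerically. A minimal sketch, with parameter values that are typical of a 72-cell panel but assumed for illustration rather than extracted from a data sheet:

```python
import numpy as np
from scipy.optimize import brentq

# Five parameters of the single-diode model; values are assumptions, not the
# paper's extracted data-sheet parameters.
Iph, I0, Rs, Rsh = 5.0, 1e-9, 0.3, 300.0   # A, A, ohm, ohm
a = 1.3 * 72 * 0.0259                      # modified ideality factor n*Ns*Vt

def current(V):
    # implicit equation I = Iph - I0*(exp((V + I*Rs)/a) - 1) - (V + I*Rs)/Rsh
    f = lambda I: Iph - I0 * (np.exp((V + I * Rs) / a) - 1.0) - (V + I * Rs) / Rsh - I
    return brentq(f, -1.0, Iph + 1.0)      # solved numerically for I

for V in (0.0, 15.0, 30.0, 40.0, 45.0):
    print(f"V = {V:5.1f} V   I = {current(V):6.3f} A")
```

    Sweeping V from zero to the open-circuit voltage traces the familiar I-V curve; temperature and irradiance dependence enter the model by scaling Iph, I0, and the thermal voltage in a.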

  12. The New Digital Media Value Network: Proposing an Interactive Model of Digital Media Value Activities

    Directory of Open Access Journals (Sweden)

    Sylvia Chan-Olmsted

    2016-07-01

    Full Text Available This study models the dynamic nature of today’s media markets using the framework of value-adding activities in the provision and consumption of media products. The proposed user-centric approach introduces the notion that the actions of external users, social media, and interfaces affect the internal value activities of media firms via a feedback loop, and therefore should themselves be considered value activities. The model also suggests a more comprehensive list of indicators for value assessment.

  13. Modeling and Simulation of Low Voltage Arcs

    NARCIS (Netherlands)

    Ghezzi, L.; Balestrero, A.

    2010-01-01

    Modeling and Simulation of Low Voltage Arcs is an attempt to improve the physical understanding, mathematical modeling and numerical simulation of the electric arcs that are found during current interruptions in low voltage circuit breakers. An empirical description is gained by refined electrical

  14. Modeling the Pineapple Express phenomenon via Multivariate Extreme Value Theory

    Science.gov (United States)

    Weller, G.; Cooley, D. S.

    2011-12-01

    The pineapple express (PE) phenomenon is responsible for producing extreme winter precipitation events in the coastal and mountainous regions of the western United States. Because the PE phenomenon is also associated with warm temperatures, the heavy precipitation and associated snowmelt can cause destructive flooding. In order to study impacts, it is important that regional climate models from NARCCAP are able to reproduce extreme precipitation events produced by PE. We define a daily precipitation quantity which captures the spatial extent and intensity of precipitation events produced by the PE phenomenon. We then use statistical extreme value theory to model the tail dependence of this quantity as seen in an observational data set and each of the six NARCCAP regional models driven by NCEP reanalysis. We find that most NCEP-driven NARCCAP models do exhibit tail dependence between daily model output and observations. Furthermore, we find that not all extreme precipitation events are pineapple express events, as identified by Dettinger et al. (2011). The synoptic-scale atmospheric processes that drive extreme precipitation events produced by PE have only recently begun to be examined. Much of the current work has focused on pattern recognition, rather than quantitative analysis. We use daily mean sea-level pressure (MSLP) fields from NCEP to develop a "pineapple express index" for extreme precipitation, which exhibits tail dependence with our observed precipitation quantity for pineapple express events. We build a statistical model that connects daily precipitation output from the WRFG model, daily MSLP fields from NCEP, and daily observed precipitation in the western US. Finally, we use this model to simulate future observed precipitation based on WRFG output driven by the CCSM model, and our pineapple express index derived from future CCSM output. Our aim is to use this model to develop a better understanding of the frequency and intensity of extreme
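
    One standard extreme-value building block consistent with the abstract is a peaks-over-threshold fit: model exceedances of daily precipitation over a high threshold with a generalized Pareto distribution and derive a return level. This univariate sketch with synthetic data does not reproduce the paper's bivariate tail-dependence model.

```python
import numpy as np
from scipy.stats import genpareto

# Synthetic 20-year daily precipitation series (mm); the gamma marginals and
# the 95th-percentile threshold are illustrative choices.
rng = np.random.default_rng(3)
precip = rng.gamma(shape=0.8, scale=6.0, size=20 * 365)

u = np.quantile(precip, 0.95)                        # high threshold
excess = precip[precip > u] - u
shape, _, scale = genpareto.fit(excess, floc=0.0)    # GPD fit to the excesses

# 20-year return level: the value exceeded on average once in 20 years
zeta_u = (precip > u).mean()                         # exceedance rate
m = 20 * 365
level = u + genpareto.ppf(1.0 - 1.0 / (m * zeta_u), shape, loc=0.0, scale=scale)
print(f"threshold {u:.1f} mm, GPD shape {shape:.2f}, 20-yr return level {level:.1f} mm")
```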

  15. The added value of business models

    NARCIS (Netherlands)

    Vliet, Harry van

    An overview of innovations in a particular area, for example retail developments in the fashion sector (Van Vliet, 2014), and a subsequent discussion about the probability as to whether these innovations will realise a ‘breakthrough’, has to be supplemented with the question of what the added value

  16. Modeling value creation with enterprise architecture

    NARCIS (Netherlands)

    Singh, Prince Mayurank; Jonkers, H.; Iacob, Maria Eugenia; van Sinderen, Marten J.

    2014-01-01

    Firms may not succeed in business if strategies are not properly implemented in practice. Every firm needs to know, represent and master its value creation logic, not only to stay in business but also to keep growing. This paper is about focusing on an important topic in the field of strategic

  17. Model improvements to simulate charging in SEM

    Science.gov (United States)

    Arat, K. T.; Klimpel, T.; Hagen, C. W.

    2018-03-01

    Charging of insulators is a complex phenomenon to simulate since the accuracy of the simulations is very sensitive to the interaction of electrons with matter and electric fields. In this study, we report model improvements for a previously developed Monte-Carlo simulator to more accurately simulate samples that charge. The improvements include both modelling of low energy electron scattering and charging of insulators. The new first-principle scattering models provide a more realistic charge distribution cloud in the material, and a better match between non-charging simulations and experimental results. Improvements on charging models mainly focus on redistribution of the charge carriers in the material with an induced conductivity (EBIC) and a breakdown model, leading to a smoother distribution of the charges. Combined with a more accurate tracing of low energy electrons in the electric field, we managed to reproduce the dynamically changing charging contrast due to an induced positive surface potential.

  18. Probabilistic Load Models for Simulating the Impact of Load Management

    DEFF Research Database (Denmark)

    Chen, Peiyuan; Bak-Jensen, Birgitte; Chen, Zhe

    2009-01-01

    This paper analyzes a distribution system load time series through its autocorrelation coefficient, power spectral density, probabilistic distribution and quantile values. Two probabilistic load models, i.e. the joint-normal model and the autoregressive model of order 12 (AR(12)), are proposed to simulate the impact of load management. The joint-normal model is superior in modeling the tail region of the hourly load distribution and in implementing the change of hourly standard deviation, whereas the AR(12) model requires far fewer parameters and is superior in modeling the autocorrelation. It is concluded that the AR(12) model is favored with limited measurement data and that the joint-normal model may provide better results with a large data set. Both models can be applied in general to model load time series and used in time-sequential simulation of distribution system planning.
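
    A minimal sketch of simulating an AR(12) load series of the kind proposed here, assuming hypothetical coefficients (a dominant lag-1 term plus a lag-12 seasonal term) rather than the paper's fitted values:

```python
import numpy as np

# Hypothetical AR(12) coefficients; the paper's fitted values are not reproduced.
phi = np.zeros(12)
phi[0], phi[11] = 0.6, 0.3                 # phi_1 and phi_12
sigma, n = 0.1, 1000

rng = np.random.default_rng(0)
x = np.zeros(n)
for t in range(12, n):
    past = x[t - 12:t][::-1]               # x[t-1], x[t-2], ..., x[t-12]
    x[t] = phi @ past + rng.normal(0.0, sigma)

acf12 = np.corrcoef(x[:-12], x[12:])[0, 1] # seasonal structure shows at lag 12
print(f"sample lag-12 autocorrelation: {acf12:.2f}")
```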

  19. Development of the Transport Class Model (TCM) Aircraft Simulation From a Sub-Scale Generic Transport Model (GTM) Simulation

    Science.gov (United States)

    Hueschen, Richard M.

    2011-01-01

    A six degree-of-freedom, flat-earth dynamics, non-linear, and non-proprietary aircraft simulation was developed that is representative of a generic mid-sized twin-jet transport aircraft. The simulation was developed from a non-proprietary, publicly available, subscale twin-jet transport aircraft simulation using scaling relationships and a modified aerodynamic database. The simulation has an extended aerodynamics database with aero data outside the normal transport-operating envelope (large angle-of-attack and sideslip values). The simulation has representative transport aircraft surface actuator models with variable rate-limits and generally fixed position limits. The simulation contains a generic 40,000 lb sea level thrust engine model. The engine model is a first order dynamic model with a variable time constant that changes according to simulation conditions. The simulation provides a means for interfacing a flight control system to use the simulation sensor variables and to command the surface actuators and throttle position of the engine model.
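
    The engine representation described above (a first-order lag whose time constant varies with simulation conditions) can be sketched as follows; the time-constant schedule and all numbers are assumptions for illustration, not the simulation's actual tables.

```python
# First-order lag from commanded to delivered thrust, with a time constant
# that varies with the current thrust level (hypothetical schedule: slower
# spool-up at low thrust).
def tau(thrust_frac):
    return 1.0 + 3.0 * (1.0 - thrust_frac)          # seconds, assumed

dt, t_end = 0.02, 20.0
thrust, cmd = 0.2, 0.9                              # fractions of rated thrust
for step in range(int(t_end / dt)):
    thrust += dt * (cmd - thrust) / tau(thrust)     # Euler step of the lag
    if step % 250 == 0:
        print(f"t = {step * dt:5.1f} s   thrust = {thrust * 40000:8.0f} lb")
```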

  20. Whole-building Hygrothermal Simulation Model

    DEFF Research Database (Denmark)

    Rode, Carsten; Grau, Karl

    2003-01-01

    An existing integrated simulation tool for dynamic thermal simulation of buildings was extended with a transient model for moisture release and uptake in building materials. Validation of the new model was begun with comparison against measurements in an outdoor test cell furnished with single materials. Almost quasi-steady, cyclic experiments were used to compare the indoor humidity variation and the numerical results of the integrated simulation tool with the new moisture model. Except for the case with chipboard as furnishing, the predictions of indoor humidity with the detailed model were...

  1. Value uncaptured perspective for sustainable business model innovation

    OpenAIRE

    Yang, M; Evans, S; Vladimirova, D; Rana, P

    2016-01-01

    Sustainability has become one of the key factors for long-term business success. Recent research and practice show that business model innovation is a promising approach for improving sustainability in manufacturing firms. To date business models have been examined mostly from the perspectives of value proposition, value capture, value creation and delivery. There is a need for a more comprehensive understanding of value in order to promote sustainability. This paper proposes value uncaptured...

  2. How processing digital elevation models can affect simulated water budgets

    Science.gov (United States)

    Kuniansky, E.L.; Lowery, M.A.; Campbell, B.G.

    2009-01-01

    For regional models, the shallow water table surface is often used as a source/sink boundary condition, as model grid scale precludes simulation of the water table aquifer. This approach is appropriate when the water table surface is relatively stationary. Since water table surface maps are not readily available, the elevation of the water table used in model cells is estimated via a two-step process. First, a regression equation is developed using existing land and water table elevations from wells in the area. This equation is then used to predict the water table surface for each model cell using land surface elevation available from digital elevation models (DEM). Two methods of processing DEM for estimating the land surface for each cell are commonly used (value nearest the cell centroid or mean value in the cell). This article demonstrates how these two methods of DEM processing can affect the simulated water budget. For the example presented, approximately 20% more total flow through the aquifer system is simulated if the centroid value rather than the mean value is used. This is due to the one-third greater average ground water gradients associated with the centroid value than the mean value. The results will vary depending on the particular model area topography and cell size. The use of the mean DEM value in each model cell will result in a more conservative water budget and is more appropriate because the model cell water table value should be representative of the entire cell area, not the centroid of the model cell.
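
    The two DEM-processing choices compared in the article can be reproduced in a few lines: aggregate a fine grid to coarse model cells either by the value nearest the cell centroid or by the cell mean, then compare the resulting cell-to-cell gradients. The synthetic terrain below is an illustration; real inputs would be published DEM tiles.

```python
import numpy as np

# Synthetic fine-resolution DEM (random-walk terrain); each coarse model cell
# covers a 10x10 block of fine pixels.
rng = np.random.default_rng(1)
fine = np.cumsum(rng.normal(0.0, 1.0, (120, 120)), axis=0)   # elevations (m)

k = 10
blocks = fine.reshape(12, k, 12, k)
mean_dem = blocks.mean(axis=(1, 3))            # mean value of each cell
centroid_dem = fine[k // 2::k, k // 2::k]      # value nearest each cell centroid

g_mean = np.abs(np.diff(mean_dem, axis=0)).mean()
g_cent = np.abs(np.diff(centroid_dem, axis=0)).mean()
print(f"cell-to-cell gradient: mean-based {g_mean:.2f} m, centroid-based {g_cent:.2f} m")
```

    The centroid sampling keeps local roughness that block averaging smooths away, which is consistent with the steeper gradients and larger simulated flows the article reports for the centroid method.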

  3. Simulation modeling for the health care manager.

    Science.gov (United States)

    Kennedy, Michael H

    2009-01-01

    This article addresses the use of simulation software to solve administrative problems faced by health care managers. Spreadsheet add-ins, process simulation software, and discrete event simulation software are available at a range of costs and complexity. All use the Monte Carlo method to realistically integrate probability distributions into models of the health care environment. Problems typically addressed by health care simulation modeling are facility planning, resource allocation, staffing, patient flow and wait time, routing and transportation, supply chain management, and process improvement.

  4. Protein Simulation Data in the Relational Model.

    Science.gov (United States)

    Simms, Andrew M; Daggett, Valerie

    2012-10-01

    High performance computing is leading to unprecedented volumes of data. Relational databases offer a robust and scalable model for storing and analyzing scientific data. However, these features do not come without a cost: significant design effort is required to build a functional and efficient repository. Modeling protein simulation data in a relational database presents several challenges: the data captured from individual simulations are large, multi-dimensional, and must integrate with both simulation software and external data sites. Here we present the dimensional design and relational implementation of a comprehensive data warehouse for storing and analyzing molecular dynamics simulations using SQL Server.

  5. Virtual milk for modelling and simulation of dairy processes.

    Science.gov (United States)

    Munir, M T; Zhang, Y; Yu, W; Wilson, D I; Young, B R

    2016-05-01

    The modeling of dairy processing using a generic process simulator suffers from shortcomings, given that many simulators do not contain milk components in their component libraries. Recently, pseudo-milk components for a commercial process simulator were proposed for simulation and the current work extends this pseudo-milk concept by studying the effect of both total milk solids and temperature on key physical properties such as thermal conductivity, density, viscosity, and heat capacity. This paper also uses expanded fluid and power law models to predict milk viscosity over the temperature range from 4 to 75°C and develops a succinct regressed model for heat capacity as a function of temperature and fat composition. The pseudo-milk was validated by comparing the simulated and actual values of the physical properties of milk. The milk thermal conductivity, density, viscosity, and heat capacity showed differences of less than 2, 4, 3, and 1.5%, respectively, between the simulated results and actual values. This work extends the capabilities of the previously proposed pseudo-milk and of a process simulator to model dairy processes, processing different types of milk (e.g., whole milk, skim milk, and concentrated milk) with different intrinsic compositions, and to predict correct material and energy balances for dairy processes. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
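
    A hedged sketch of a power-law viscosity model of the kind the abstract describes, with a temperature-dependent consistency index; all coefficients are hypothetical placeholders, not the paper's regressed values for milk.

```python
import numpy as np

# eta = K(T) * (shear rate)**(n - 1); K decreases with temperature. All
# coefficients below are assumed for illustration only.
def viscosity(shear_rate, temp_c, K20=0.0030, n=0.95, b=0.02):
    K = K20 * np.exp(-b * (temp_c - 20.0))   # assumed exponential thinning with T
    return K * shear_rate ** (n - 1.0)       # Pa*s

for T in (4, 20, 50, 75):
    print(f"{T:2d} C   eta = {viscosity(100.0, T) * 1000:5.2f} mPa*s")
```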

  6. Managing health care decisions and improvement through simulation modeling.

    Science.gov (United States)

    Forsberg, Helena Hvitfeldt; Aronsson, Håkan; Keller, Christina; Lindblad, Staffan

    2011-01-01

    Simulation modeling is a way to test changes in a computerized environment to give ideas for improvements before implementation. This article reviews research literature on simulation modeling as support for health care decision making. The aim is to investigate the experience and potential value of such decision support and quality of articles retrieved. A literature search was conducted, and the selection criteria yielded 59 articles derived from diverse applications and methods. Most met the stated research-quality criteria. This review identified how simulation can facilitate decision making and that it may induce learning. Furthermore, simulation offers immediate feedback about proposed changes, allows analysis of scenarios, and promotes communication on building a shared system view and understanding of how a complex system works. However, only 14 of the 59 articles reported on implementation experiences, including how decision making was supported. On the basis of these articles, we proposed steps essential for the success of simulation projects, not just in the computer, but also in clinical reality. We also presented a novel concept combining simulation modeling with the established plan-do-study-act cycle for improvement. Future scientific inquiries concerning implementation, impact, and the value for health care management are needed to realize the full potential of simulation modeling.

  7. Modeling and simulation of blood collection systems.

    Science.gov (United States)

    Alfonso, Edgar; Xie, Xiaolan; Augusto, Vincent; Garraud, Olivier

    2012-03-01

    This paper addresses the modeling and simulation of blood collection systems in France for both fixed site and mobile blood collection with walk in whole blood donors and scheduled plasma and platelet donors. Petri net models are first proposed to precisely describe different blood collection processes, donor behaviors, their material/human resource requirements and relevant regulations. Petri net models are then enriched with quantitative modeling of donor arrivals, donor behaviors, activity times and resource capacity. Relevant performance indicators are defined. The resulting simulation models can be straightforwardly implemented with any simulation language. Numerical experiments are performed to show how the simulation models can be used to select, for different walk in donor arrival patterns, appropriate human resource planning and donor appointment strategies.

  8. Modeling and Simulation of Matrix Converter

    DEFF Research Database (Denmark)

    Liu, Fu-rong; Klumpner, Christian; Blaabjerg, Frede

    2005-01-01

    This paper discusses the modeling and simulation of the matrix converter. Two models of the matrix converter are presented: one is based on indirect space vector modulation and the other is based on the power balance equation. The basis of these two models is given and the modeling process is introduced...

  9. Simulation models for tokamak plasmas

    International Nuclear Information System (INIS)

    Dimits, A.M.; Cohen, B.I.

    1992-01-01

    Two developments in the nonlinear simulation of tokamak plasmas are described: (A) Simulation algorithms that use quasiballooning coordinates have been implemented in a 3D fluid code and a 3D partially linearized (Δf) particle code. In quasiballooning coordinates, one of the coordinate directions is closely aligned with that of the magnetic field, allowing both optimal use of the grid resolution for structures highly elongated along the magnetic field and implementation of the correct periodicity conditions with no discontinuities in the toroidal direction. (B) Progress on the implementation of a like-particle collision operator suitable for use in partially linearized particle codes is reported. The binary collision approach is shown to be unusable for this purpose. The algorithm under development is a complete version of the test-particle plus source-field approach that was suggested and partially implemented by Xu and Rosenbluth.

  10. Software life cycle dynamic simulation model: The organizational performance submodel

    Science.gov (United States)

    Tausworthe, Robert C.

    1985-01-01

    The submodel structure of a software life cycle dynamic simulation model is described. The software process is divided into seven phases, each with product, staff, and funding flows. The model is subdivided into an organizational response submodel, a management submodel, a management influence interface, and a model analyst interface. The concentration here is on the organizational response model, which simulates the performance characteristics of a software development subject to external and internal influences. These influences emanate from two sources: the model analyst interface, which configures the model to simulate the response of an implementing organization subject to its own internal influences, and the management submodel that exerts external dynamic control over the production process. A complete characterization is given of the organizational response submodel in the form of parameterized differential equations governing product, staffing, and funding levels. The parameter values and functions are allocated to the two interfaces.
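
    To make the structure concrete, here is an illustrative sketch of coupled first-order rate equations for product, staffing, and funding stepped with Euler integration; the functional forms and all rates are assumptions, not the report's actual parameterized equations.

```python
# Illustrative rate equations for product completed, staff level, and funds
# remaining; every coefficient here is a hypothetical placeholder.
dt, horizon = 0.1, 24.0                   # months
product, staff, funds = 0.0, 5.0, 300.0   # % complete, persons, $K
target_staff, productivity, cost_rate = 20.0, 0.4, 1.0

t = 0.0
while t < horizon and funds > 0.0 and product < 100.0:
    d_product = productivity * staff                # % per month
    d_staff = 0.3 * (target_staff - staff)          # hiring toward target
    d_funds = -cost_rate * staff                    # monthly burn
    product += dt * d_product
    staff += dt * d_staff
    funds += dt * d_funds
    t += dt

print(f"t = {t:.1f} mo, product {product:.0f}%, staff {staff:.1f}, funds {funds:.0f}K")
```

    In this spirit, management influences would enter as external adjustments to the targets and rates, mirroring the management submodel's dynamic control described above.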

  11. The role of non-epistemic values in engineering models

    NARCIS (Netherlands)

    Diekmann, S.; Peterson, M.B.

    2013-01-01

    We argue that non-epistemic values, including moral ones, play an important role in the construction and choice of models in science and engineering. Our main claim is that non-epistemic values are not only "secondary values" that become important just in case epistemic values leave some issues

  12. One-Step Dynamic Classifier Ensemble Model for Customer Value Segmentation with Missing Values

    Directory of Open Access Journals (Sweden)

    Jin Xiao

    2014-01-01

    Full Text Available Scientific customer value segmentation (CVS) is the basis of efficient customer relationship management; customer credit scoring, fraud detection, and churn prediction all belong to CVS. In real CVS, the customer data usually include many missing values, which may greatly affect the performance of the CVS model. This study proposes a one-step dynamic classifier ensemble model for missing values (ODCEM). On the one hand, ODCEM integrates the preprocessing of missing values and the classification modeling into one step; on the other hand, it utilizes multiple-classifier ensemble technology in constructing the classification models. The empirical results on the credit scoring dataset “German” from UCI and the real customer churn prediction dataset “China churn” show that ODCEM outperforms four commonly used “two-step” models and the ensemble-based model LMF, and can provide better decision support for market managers.

  13. A model management system for combat simulation

    OpenAIRE

    Dolk, Daniel R.

    1986-01-01

    The design and implementation of a model management system to support combat modeling is discussed. Structured modeling is introduced as a formalism for representing mathematical models. A relational information resource dictionary system is developed which can accommodate structured models. An implementation is described. Structured modeling is then compared to Jackson System Development (JSD) as a methodology for facilitating discrete event simulation. JSD is currently better at representin...

  14. HVDC System Characteristics and Simulation Models

    Energy Technology Data Exchange (ETDEWEB)

    Moon, S.I.; Han, B.M.; Jang, G.S. [Electric Enginnering and Science Research Institute, Seoul (Korea)

    2001-07-01

    This report deals with the AC-DC power system simulation method by PSS/E and EUROSTAG for the development of a strategy for the reliable operation of the Cheju-Haenam interconnected system. The simulation using both programs is performed to analyze HVDC simulation models. In addition, the control characteristics of the Cheju-Haenam HVDC system as well as Cheju AC system characteristics are described in this work. (author). 104 figs., 8 tabs.

  15. Physically realistic modeling of maritime training simulation

    OpenAIRE

    Cieutat , Jean-Marc

    2003-01-01

    Maritime training simulation is an important part of maritime teaching and requires considerable scientific and technical skill. In this framework, where the real-time constraint has to be maintained, not all physical phenomena can be studied; only the most visible physical phenomena, relating to the natural elements and ship behaviour, are reproduced. Our swell model, based on a surface wave simulation approach, makes it possible to simulate the shape and the propagation of a regular train of waves f...

  16. Dynamic Average-Value Modeling of Doubly-Fed Induction Generator Wind Energy Conversion Systems

    Science.gov (United States)

    Shahab, Azin

    In a Doubly-fed Induction Generator (DFIG) wind energy conversion system, the rotor of a wound rotor induction generator is connected to the grid via a partial-scale ac/ac power electronic converter which controls the rotor frequency and speed. In this research, detailed models of the DFIG wind energy conversion system with a Sinusoidal Pulse-Width Modulation (SPWM) scheme and an Optimal Pulse-Width Modulation (OPWM) scheme for the power electronic converter are developed in PSCAD/EMTDC. As computer simulation using the detailed models tends to be computationally extensive, time consuming, and sometimes impractical in terms of speed, two modified approaches (switching-function modeling and average-value modeling) are proposed to reduce the simulation execution time. The results demonstrate that the two proposed approaches reduce the simulation execution time while the simulation results remain close to those obtained using the detailed model.

  17. Software-Engineering Process Simulation (SEPS) model

    Science.gov (United States)

    Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.

    1992-01-01

    The Software Engineering Process Simulation (SEPS) model developed at JPL is described. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among various software life cycle development activities and management decision-making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and perform postmortem assessments.

  18. Systematic modelling and simulation of refrigeration systems

    DEFF Research Database (Denmark)

    Rasmussen, Bjarne D.; Jakobsen, Arne

    1998-01-01

    The task of developing a simulation model of a refrigeration system can be very difficult and time consuming. In order for this process to be effective, a systematic method for developing the system model is required. This method should aim at guiding the developer to clarify the purpose...... of the simulation, to select appropriate component models and to set up the equations in a well-arranged way. In this paper the outline of such a method is proposed and examples showing the use of this method for simulation of refrigeration systems are given....

  19. Modelling, simulating and optimizing boiler heating surfaces and evaporator circuits

    DEFF Research Database (Denmark)

    Sørensen, K.; Condra, T.; Houbak, Niels

    2003-01-01

    A model for optimizing the dynamic performance of boilers has been developed. Design variables related to the size of the boiler and its dynamic performance have been defined. The objective function to be optimized takes the weight of the boiler and its dynamic capability into account. As constraints for the optimization, a dynamic model of the boiler is applied, and a function for the value of the dynamic performance is included in the model. The dynamic models for simulating boiler performance consist of a model for the flue gas side, a model for the evaporator circuit and a model for the drum. The dynamic model has been developed for the purpose of determining boiler material temperatures and heat transfer from the flue gas side to the water/steam side, in order to simulate the circulation in the evaporator circuit and thereby the water level fluctuations in the drum. The dynamic model has been...

  20. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    where x increases from zero to N, the saturation value. Box 1. Matrix Methods ... such as Laplace transforms and non-linear differential equations with ... atomic bomb project in the US in the early ... his work on game theory and computers.
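
    The saturating growth from zero to N that the snippet alludes to is the classic logistic model; a minimal sketch with illustrative parameters:

```python
# Logistic growth dx/dt = r*x*(1 - x/N): x rises from near zero and
# saturates at N. Parameter values are illustrative.
r, N, dt = 0.5, 100.0, 0.1
x, t = 1.0, 0.0
while x < 0.99 * N:
    x += dt * r * x * (1.0 - x / N)   # Euler step
    t += dt
print(f"x reaches 99% of N = {N:.0f} at t = {t:.1f}")
```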

  1. Deriving simulators for hybrid Chi models

    NARCIS (Netherlands)

    Beek, van D.A.; Man, K.L.; Reniers, M.A.; Rooda, J.E.; Schiffelers, R.R.H.

    2006-01-01

    The hybrid Chi language is a formalism for modeling, simulation and verification of hybrid systems. The formal semantics of hybrid Chi allows the definition of provably correct implementations for simulation, verification and real-time control. This paper discusses the principles of deriving an

  2. Modeling and simulation for RF system design

    CERN Document Server

    Frevert, Ronny; Jancke, Roland; Knöchel, Uwe; Schwarz, Peter; Kakerow, Ralf; Darianian, Mohsen

    2005-01-01

    Focusing on RF specific modeling and simulation methods, and system and circuit level descriptions, this work contains application-oriented training material. Accompanied by a CD- ROM, it combines the presentation of a mixed-signal design flow, an introduction into VHDL-AMS and Verilog-A, and the application of commercially available simulators.

  3. Mathematical modelling of a farm enterprise value on the ...

    African Journals Online (AJOL)

    Mathematical modelling of a farm enterprise value on the agricultural market with the ... Subsidies in the EU countries reached 45-50% of the value of commodity output ... This financing gap entailed a number of negative consequences.

  4. The added value of the replica simulators in the exploitation of nuclear power plants

    International Nuclear Information System (INIS)

    Diaz Giron, P. a.; Ortega, F.; Rivero, N.

    2011-01-01

    Nuclear power plant full scope replica simulators were in the past designed solely following operational personnel training criteria. Nevertheless, these simulators not only feature a high-replica control room but also provide an accurate process response. Control room replica simulators are presently based on complex technological platforms permitting the highest physical and functional fidelity, allowing them to be used as versatile and value-added tools in diverse plant operation and maintenance activities. In recent years, Tecnatom has extended the use of such simulators to different engineering applications. This article intends to identify the simulators' use in training and other applications beyond training. (Author)

  5. Magnetosphere Modeling: From Cartoons to Simulations

    Science.gov (United States)

    Gombosi, T. I.

    2017-12-01

    Over the last half century, physics-based global computer simulations have become a bridge between experiment and basic theory, and they now represent the "third pillar" of geospace research. Today, many of our scientific publications utilize large-scale simulations to interpret observations, test new ideas, plan campaigns, or design new instruments. Realistic simulations of the complex Sun-Earth system have been made possible by the dramatically increased power of both computing hardware and numerical algorithms. Early magnetosphere models were based on simple E&M concepts (like the Chapman-Ferraro cavity) and hydrodynamic analogies (bow shock). At the beginning of the space age, current-system models were developed, culminating in the sophisticated Tsyganenko-type description of the magnetic configuration. The first 3D MHD simulations of the magnetosphere were published in the early 1980s. A decade later there were several competing global models that were able to reproduce many fundamental properties of the magnetosphere. The leading models included the impact of the ionosphere by using a height-integrated electric potential description. Dynamic coupling of global and regional models started in the early 2000s by integrating a ring current and a global magnetosphere model. It has been recognized for quite some time that plasma kinetic effects play an important role. Presently, global hybrid simulations of the dynamic magnetosphere are expected to be possible on exascale supercomputers, while fully kinetic simulations with realistic mass ratios are still decades away. In the 2010s several groups started to experiment with PIC simulations embedded in large-scale 3D MHD models. Presently this integrated MHD-PIC approach is at the forefront of magnetosphere simulations and this technique is expected to lead to some important advances in our understanding of magnetospheric physics. This talk will review the evolution of magnetosphere modeling from cartoons to current systems.

  6. Modeling and simulation of complex systems a framework for efficient agent-based modeling and simulation

    CERN Document Server

    Siegfried, Robert

    2014-01-01

    Robert Siegfried presents a framework for efficient agent-based modeling and simulation of complex systems. He compares different approaches for describing structure and dynamics of agent-based models in detail. Based on this evaluation the author introduces the "General Reference Model for Agent-based Modeling and Simulation" (GRAMS). Furthermore he presents parallel and distributed simulation approaches for the execution of agent-based models - from small scale to very large scale. The author shows how agent-based models may be executed by different simulation engines that utilize the underlying hardware...

  7. NUMERICAL SIMULATION AND MODELING OF UNSTEADY FLOW ...

    African Journals Online (AJOL)

    2014-06-30

    ... objective of this study is to control the simulation of unsteady flows around structures. ... Aerospace, our results were in good agreement with experimental ... Two-Equation Eddy-Viscosity Turbulence Models for Engineering.

  8. SEIR model simulation for Hepatitis B

    Science.gov (United States)

    Side, Syafruddin; Irwan, Mulbar, Usman; Sanusi, Wahidah

    2017-09-01

    Mathematical modelling and simulation of Hepatitis B are discussed in this paper. The population is divided into four compartments, namely: Susceptible, Exposed, Infected and Recovered (SEIR). Several factors affect the population in this model: vaccination, immigration and emigration occurring in the population. The SEIR model yields a 4-D system of non-linear ordinary differential equations (ODEs), which is then reduced to 3-D. A simulation of the SEIR model was undertaken to predict the number of Hepatitis B cases. The results of the simulation indicate that the number of Hepatitis B cases will increase and then decrease over several months. The simulation using case numbers from Makassar also found a basic reproduction number less than one, which means that Makassar city is not an endemic area for Hepatitis B.
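
    As an illustration of the kind of model this record describes, the sketch below integrates a basic SEIR system in Python. All parameter values are hypothetical, and the paper's Hepatitis B model additionally accounts for vaccination, immigration and emigration.

    from scipy.integrate import solve_ivp

    def seir(t, y, beta, sigma, gamma):
        s, e, i, r = y
        n = s + e + i + r
        ds = -beta * s * i / n               # new infections leave S
        de = beta * s * i / n - sigma * e    # exposed progress to infectious
        di = sigma * e - gamma * i           # infectious recover
        dr = gamma * i
        return [ds, de, di, dr]

    beta, sigma, gamma = 0.3, 1.0 / 30, 1.0 / 60     # hypothetical rates per day
    sol = solve_ivp(seir, (0, 365), [9990.0, 0.0, 10.0, 0.0],
                    args=(beta, sigma, gamma))
    print("R0 =", beta / gamma)   # basic reproduction number of this basic SEIR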

  9. Maintenance Personnel Performance Simulation (MAPPS) model

    International Nuclear Information System (INIS)

    Siegel, A.I.; Bartter, W.D.; Wolf, J.J.; Knee, H.E.; Haas, P.M.

    1984-01-01

    A stochastic computer model for simulating the actions and behavior of nuclear power plant maintenance personnel is described. The model considers personnel, environmental, and motivational variables to yield predictions of maintenance performance quality and time to perform. The model has been fully developed and sensitivity tested. Additional evaluation of the model is now taking place

  10. Computer simulations of the random barrier model

    DEFF Research Database (Denmark)

    Schrøder, Thomas; Dyre, Jeppe

    2002-01-01

    A brief review of experimental facts regarding ac electronic and ionic conduction in disordered solids is given followed by a discussion of what is perhaps the simplest realistic model, the random barrier model (symmetric hopping model). Results from large scale computer simulations are presented...

  11. Turbine modelling for real time simulators

    International Nuclear Information System (INIS)

    Oliveira Barroso, A.C. de; Araujo Filho, F. de

    1992-01-01

    A model for steam turbines and their peripherals has been developed. All the important variables have been included, and emphasis has been given to computational efficiency in order to obtain a model able to simulate all the modeled equipment. (A.C.A.S.)

  12. Modelling toolkit for simulation of maglev devices

    Science.gov (United States)

    Peña-Roche, J.; Badía-Majós, A.

    2017-01-01

    A stand-alone App has been developed, focused on obtaining information about relevant engineering properties of magnetic levitation systems. Our modelling toolkit provides real time simulations of 2D magneto-mechanical quantities for superconductor (SC)/permanent magnet structures. The source code is open and may be customised for a variety of configurations. Ultimately, it relies on the variational statement of the critical state model for the superconducting component and has been verified against experimental data for YBaCuO/NdFeB assemblies. On a quantitative basis, the values of the arising forces, induced superconducting currents, as well as a plot of the magnetic field lines are displayed upon selection of an arbitrary trajectory of the magnet in the vicinity of the SC. The stability issues related to the cooling process, as well as the maximum attainable forces for a given material and geometry are immediately observed. Due to the complexity of the problem, a strategy based on cluster computing, database compression, and real-time post-processing on the device has been implemented.

  13. Theory, modeling, and simulation annual report, 1992

    Energy Technology Data Exchange (ETDEWEB)

    1993-05-01

    This report briefly discusses research on the following topics: development of electronic structure methods; modeling molecular processes in clusters; modeling molecular processes in solution; modeling molecular processes in separations chemistry; modeling interfacial molecular processes; modeling molecular processes in the atmosphere; methods for periodic calculations on solids; chemistry and physics of minerals; graphical user interfaces for computational chemistry codes; visualization and analysis of molecular simulations; integrated computational chemistry environment; and benchmark computations.

  14. Modeling and simulation with operator scaling

    OpenAIRE

    Cohen, Serge; Meerschaert, Mark M.; Rosiński, Jan

    2010-01-01

    Self-similar processes are useful in modeling diverse phenomena that exhibit scaling properties. Operator scaling allows a different scale factor in each coordinate. This paper develops practical methods for modeling and simulating stochastic processes with operator scaling. A simulation method for operator stable Levy processes is developed, based on a series representation, along with a Gaussian approximation of the small jumps. Several examples are given to illustrate practical application...
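
    The series-representation method for operator stable Levy processes is not spelled out in the abstract and is not reproduced here. As a simpler, related building block, the sketch below draws symmetric alpha-stable increments with the well-known Chambers-Mallows-Stuck transform; this is a swapped-in standard technique, not the authors' algorithm.

    import numpy as np

    def symmetric_stable(alpha, size, rng=np.random.default_rng()):
        """Chambers-Mallows-Stuck sampler for symmetric alpha-stable draws."""
        u = rng.uniform(-np.pi / 2, np.pi / 2, size)   # uniform angle
        w = rng.exponential(1.0, size)                 # exponential mixing variable
        return (np.sin(alpha * u) / np.cos(u) ** (1.0 / alpha)
                * (np.cos(u - alpha * u) / w) ** ((1.0 - alpha) / alpha))

    # Cumulative sums of increments approximate a 1-D stable Levy path.
    path = np.cumsum(symmetric_stable(1.5, 1000))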

  15. Value Chain Model for Steel Manufacturing Sector: A Case Study

    OpenAIRE

    S G Acharyulu; K Venkata Subbaiah; K Narayana Rao

    2018-01-01

    Michael E Porter developed a value chain model for the manufacturing sector with five primary activities and four supporting activities. The value chain model developed by Porter is extended to the steel manufacturing sector, since expansion of steel plants has become a continual process for their growth and survival. In this paper a value chain model for the steel manufacturing sector is developed considering five primary activities and six support activities.

  16. Modeling of magnetic particle suspensions for simulations

    CERN Document Server

    Satoh, Akira

    2017-01-01

    The main objective of the book is to highlight the modeling of magnetic particles with different shapes and magnetic properties, to provide graduate students and young researchers information on the theoretical aspects and actual techniques for the treatment of magnetic particles in particle-based simulations. In simulation, we focus on the Monte Carlo, molecular dynamics, Brownian dynamics, lattice Boltzmann and stochastic rotation dynamics (multi-particle collision dynamics) methods. The latter two simulation methods can simulate both the particle motion and the ambient flow field simultaneously. In general, specialized knowledge can only be obtained in an effective manner under the supervision of an expert. The present book is written to play such a role for readers who wish to develop the skill of modeling magnetic particles and develop a computer simulation program using their own ability. This book is therefore a self-learning book for graduate students and young researchers. Armed with this knowledge,...

  17. Communicating Value in Simulation: Cost-Benefit Analysis and Return on Investment.

    Science.gov (United States)

    Asche, Carl V; Kim, Minchul; Brown, Alisha; Golden, Antoinette; Laack, Torrey A; Rosario, Javier; Strother, Christopher; Totten, Vicken Y; Okuda, Yasuharu

    2018-02-01

    Value-based health care requires a balancing of medical outcomes with economic value. Administrators need to understand both the clinical and the economic effects of potentially expensive simulation programs to rationalize the costs. Given the often-disparate priorities of clinical educators relative to health care administrators, justifying the value of simulation requires the use of economic analyses few physicians have been trained to conduct. Clinical educators need to be able to present thorough economic analyses demonstrating returns on investment and cost-effectiveness to effectively communicate with administrators. At the 2017 Academic Emergency Medicine Consensus Conference "Catalyzing System Change through Health Care Simulation: Systems, Competency, and Outcomes," our breakout session critically evaluated the cost-benefit and return on investment of simulation. In this paper we provide an overview of some of the economic tools that a clinician may use to present the value of simulation training to financial officers and other administrators in the economic terms they understand. We also define three themes as a call to action for research related to cost-benefit analysis in simulation as well as four specific research questions that will help guide educators and hospital leadership to make decisions on the value of simulation for their system or program. © 2017 by the Society for Academic Emergency Medicine.

  18. Communicating Value in Simulation: Cost Benefit Analysis and Return on Investment.

    Science.gov (United States)

    Asche, Carl V; Kim, Minchul; Brown, Alisha; Golden, Antoinette; Laack, Torrey A; Rosario, Javier; Strother, Christopher; Totten, Vicken Y; Okuda, Yasuharu

    2017-10-26

    Value-based health care requires a balancing of medical outcomes with economic value. Administrators need to understand both the clinical and economic effects of potentially expensive simulation programs to rationalize the costs. Given the often-disparate priorities of clinical educators relative to health care administrators, justifying the value of simulation requires the use of economic analyses few physicians have been trained to conduct. Clinical educators need to be able to present thorough economic analyses demonstrating returns on investment and cost effectiveness to effectively communicate with administrators. At the 2017 Academic Emergency Medicine Consensus Conference "Catalyzing System Change through Health Care Simulation: Systems, Competency, and Outcomes", our breakout session critically evaluated the cost benefit and return on investment of simulation. In this paper we provide an overview of some of the economic tools that a clinician may use to present the value of simulation training to financial officers and other administrators in the economic terms they understand. We also define three themes as a call to action for research related to cost benefit analysis in simulation as well as four specific research questions that will help guide educators and hospital leadership to make decisions on the value of simulation for their system or program.

  19. Modelling and Simulation of Wave Loads

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Thoft-Christensen, Palle

    A simple model of the wave load on slender members of offshore structures is described. The wave elevation of the sea state is modelled by a stationary Gaussian process. A new procedure to simulate realizations of the wave loads is developed. The simulation method assumes that the wave particle velocity can be approximated by a Gaussian Markov process. Known approximate results for the first-passage density or equivalently, the distribution of the extremes of wave loads are presented and compared with rather precise simulation results. It is demonstrated that the approximate results ...
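
    The new simulation procedure itself is not given in the abstract. As background, the sketch below shows the standard sum-of-cosines (spectral) way to realize a stationary Gaussian process such as the modelled wave elevation; the narrow-band spectrum and all numbers are illustrative assumptions.

    import numpy as np

    def gaussian_realization(freqs, spectrum, t, rng=np.random.default_rng()):
        """Sum-of-cosines realization of a stationary Gaussian process."""
        df = freqs[1] - freqs[0]
        amp = np.sqrt(2.0 * spectrum * df)                  # component amplitudes
        phase = rng.uniform(0.0, 2.0 * np.pi, len(freqs))   # independent random phases
        waves = amp[:, None] * np.cos(2.0 * np.pi * np.outer(freqs, t) + phase[:, None])
        return waves.sum(axis=0)

    t = np.linspace(0.0, 600.0, 6000)        # 10 minutes sampled at 10 Hz
    f = np.linspace(0.02, 0.5, 200)          # wave frequency band [Hz]
    S = np.exp(-((f - 0.1) / 0.03) ** 2)     # hypothetical narrow-band spectrum
    eta = gaussian_realization(f, S, t)      # one simulated wave elevation record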

  20. Modelling and Simulation of Wave Loads

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Thoft-Christensen, Palle

    1985-01-01

    A simple model of the wave load on slender members of offshore structures is described. The wave elevation of the sea state is modelled by a stationary Gaussian process. A new procedure to simulate realizations of the wave loads is developed. The simulation method assumes that the wave particle velocity can be approximated by a Gaussian Markov process. Known approximate results for the first passage density or equivalently, the distribution of the extremes of wave loads are presented and compared with rather precise simulation results. It is demonstrated that the approximate results ...

  1. Modeling and simulation of discrete event systems

    CERN Document Server

    Choi, Byoung Kyu

    2013-01-01

    Computer modeling and simulation (M&S) allows engineers to study and analyze complex systems. Discrete-event system (DES)-M&S is used in modern management, industrial engineering, computer science, and the military. As computer speeds and memory capacity increase, so DES-M&S tools become more powerful and more widely used in solving real-life problems. Based on over 20 years of evolution within a classroom environment, as well as on decades-long experience in developing simulation-based solutions for high-tech industries, Modeling and Simulation of Discrete-Event Systems is the only book on

  2. Minimum-complexity helicopter simulation math model

    Science.gov (United States)

    Heffley, Robert K.; Mnich, Marc A.

    1988-01-01

    An example of a minimal-complexity helicopter simulation math model is presented. Motivating factors are the computational delays, cost, and inflexibility of the very sophisticated math models now in common use. A helicopter model form is given which addresses each of these factors and provides better engineering understanding of the specific handling qualities features which are apparent to the simulator pilot. The technical approach begins with specification of features which are to be modeled, followed by a build up of individual vehicle components and definition of equations. Model matching and estimation procedures are given which enable the modeling of specific helicopters from basic data sources such as flight manuals. Checkout procedures are given which provide for total model validation. A number of possible model extensions and refinements are discussed. Math model computer programs are defined and listed.

  3. Automobile simulation model and its identification. Behavior measuring by image processing; Jidosha simulation model to dotei jikken. Gazo kaiseki ni yoru undo no keisoku

    Energy Technology Data Exchange (ETDEWEB)

    Suzuki, H; Morita, S; Matsuura, Y [Osaka Sangyo University, Osaka (Japan)

    1997-10-01

    Model simulation technology is important for automobile development. Especially for investigations concerning ABS, TRC, VDC, and so on, the model should be one which simulates not only the whole behavior of the automobile, but also such internal information as the torque, acceleration and velocity of each drive shaft. From this point of view, a 4-wheel simulation model which can simulate more than 50 items was made. In addition, a 3-D image processing technique using 2 video cameras was adopted to identify the model. Considerably good agreement was found between the simulated values and the measured ones. 3 refs., 7 figs., 2 tabs.

  4. Farm-specific economic value of automatic lameness detection systems in dairy cattle: From concepts to operational simulations.

    Science.gov (United States)

    Van De Gucht, Tim; Saeys, Wouter; Van Meensel, Jef; Van Nuffel, Annelies; Vangeyte, Jurgen; Lauwers, Ludwig

    2018-01-01

    Although prototypes of automatic lameness detection systems for dairy cattle exist, information about their economic value is lacking. In this paper, a conceptual and operational framework for simulating the farm-specific economic value of automatic lameness detection systems was developed and tested on 4 system types: walkover pressure plates, walkover pressure mats, camera systems, and accelerometers. The conceptual framework maps essential factors that determine economic value (e.g., lameness prevalence, incidence and duration, lameness costs, detection performance, and their relationships). The operational simulation model links treatment costs and avoided losses with detection results and farm-specific information, such as herd size and lameness status. Results show that detection performance, herd size, discount rate, and system lifespan have a large influence on economic value. In addition, lameness prevalence influences the economic value, stressing the importance of an adequate prior estimation of the on-farm prevalence. The simulations provide first estimates for the upper limits for purchase prices of automatic detection systems. The framework allowed for identification of knowledge gaps obstructing more accurate economic value estimation. These include insights in cost reductions due to early detection and treatment, and links between specific lameness causes and their related losses. Because this model provides insight in the trade-offs between automatic detection systems' performance and investment price, it is a valuable tool to guide future research and developments. Copyright © 2018 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
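
    A sketch of the discounted net-benefit calculation implied by the abstract, with every input hypothetical; the paper's operational model derives avoided losses from detection performance, herd size and on-farm lameness prevalence rather than taking them as givens.

    def detection_system_value(annual_avoided_losses, annual_costs, price,
                               lifespan_years, discount_rate):
        """Sketch: discounted net benefits of an automatic lameness detection
        system minus its purchase price (all inputs hypothetical)."""
        npv = -price
        for t in range(1, lifespan_years + 1):
            npv += (annual_avoided_losses - annual_costs) / (1 + discount_rate) ** t
        return npv

    # Hypothetical farm: 6000/yr avoided losses, 1500/yr running costs,
    # 20000 purchase price, 10-year lifespan, 5% discount rate.
    print(detection_system_value(6000.0, 1500.0, 20000.0, 10, 0.05))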

  5. Theoretical modeling of iodine value and saponification value of biodiesel fuels from their fatty acid composition

    Energy Technology Data Exchange (ETDEWEB)

    Gopinath, A.; Puhan, Sukumar; Nagarajan, G. [Internal Combustion Engineering Division, Department of Mechanical Engineering, Anna University, Chennai 600 025, Tamil Nadu (India)

    2009-07-15

    Biodiesel is an alternative fuel consisting of alkyl esters of fatty acids from vegetable oils or animal fats. The properties of biodiesel depend on the type of vegetable oil used for the transesterification process. The objective of the present work is to theoretically predict the iodine value and the saponification value of different biodiesels from their fatty acid methyl ester composition. The fatty acid ester compositions and the above values of different biodiesels were taken from the available published data. A multiple linear regression model was developed to predict the iodine value and saponification value of different biodiesels. The predicted results showed that the prediction errors were less than 3.4% compared to the available published data. The predicted values were also verified by substituting in the available published model which was developed to predict the higher heating values of biodiesel fuels from their iodine value and the saponification value. The resulting heating values of biodiesels were then compared with the published heating values and reported. (author)
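
    A minimal sketch of the multiple linear regression step described above, assuming hypothetical ester compositions and iodine values; the published model uses the full fatty acid methyl ester profile and reports prediction errors below 3.4%.

    import numpy as np

    # Hypothetical data: columns are mass fractions of oleic, linoleic and
    # linolenic methyl esters; y holds the corresponding iodine values.
    X = np.array([[0.62, 0.21, 0.08],
                  [0.41, 0.10, 0.02],
                  [0.23, 0.54, 0.08],
                  [0.44, 0.34, 0.01],
                  [0.25, 0.51, 0.06],
                  [0.61, 0.18, 0.09]])
    y = np.array([98.0, 55.0, 128.0, 105.0, 120.0, 96.0])

    A = np.column_stack([np.ones(len(y)), X])     # add an intercept column
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)  # ordinary least squares fit
    pred = A @ coef
    print("max relative error:", np.max(np.abs(pred - y) / y))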

  6. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    Computer Based ... universities, and later did system analysis, ... personal computers (PC) and low cost software packages and tools. They can serve as useful learning experience through student projects. Models are ... Let us consider a numerical example: to calculate the velocity of a trainer aircraft ...

  7. Thermal unit availability modeling in a regional simulation model

    International Nuclear Information System (INIS)

    Yamayee, Z.A.; Port, J.; Robinett, W.

    1983-01-01

    The System Analysis Model (SAM) developed under the umbrella of PNUCC's System Analysis Committee is capable of simulating the operation of a given load/resource scenario. This model employs a Monte-Carlo simulation to incorporate uncertainties. Among the uncertainties modeled is thermal unit availability, both for energy simulation (seasonal) and capacity simulations (hourly). This paper presents the availability modeling in the capacity and energy models. The use of regional and national data in deriving the two availability models, the interaction between the two, and the modifications made to the capacity model in order to reflect regional practices are presented. A sample problem is presented to show the modification process. Results for modeling a nuclear unit using NERC-GADS are presented.
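
    A minimal sketch of an hourly Monte-Carlo availability draw of the kind used in capacity simulations. The unit size and forced-outage rate (EFOR) are hypothetical, and a GADS-based model would normally sample correlated outage durations rather than independent hours.

    import numpy as np

    def simulate_capacity(hours, unit_mw, efor, rng=np.random.default_rng()):
        """Hourly two-state availability draw: unit is up with prob 1 - EFOR."""
        up = rng.random(hours) >= efor          # forced-outage draw per hour
        return np.where(up, unit_mw, 0.0)

    cap = simulate_capacity(8760, unit_mw=1100.0, efor=0.12)  # hypothetical unit
    print("simulated availability:", cap.mean() / 1100.0)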

  8. Plasma disruption modeling and simulation

    International Nuclear Information System (INIS)

    Hassanein, A.

    1994-01-01

    Disruptions in tokamak reactors are considered a limiting factor to successful operation and reliable design. The behavior of plasma-facing components during a disruption is critical to the overall integrity of the reactor. Erosion of plasma-facing material (PFM) surfaces due to thermal energy dump during the disruption can severely limit the lifetime of these components and thus diminish the economic feasibility of the reactor. A comprehensive understanding of the interplay of various physical processes during a disruption is essential for determining component lifetime and potentially improving the performance of such components. There are three principal stages in modeling the behavior of PFM during a disruption. Initially, the incident plasma particles will deposit their energy directly on the PFM surface, heating it to a very high temperature where ablation occurs. Models for plasma-material interactions have been developed and used to predict material thermal evolution during the disruption. Within a few microseconds after the start of the disruption, enough material is vaporized to intercept most of the incoming plasma particles. Models for plasma-vapor interactions are necessary to predict vapor cloud expansion and hydrodynamics. Continuous heating of the vapor cloud above the material surface by the incident plasma particles will excite, ionize, and cause vapor atoms to emit thermal radiation. Accurate models for radiation transport in the vapor are essential for calculating the net radiated flux to the material surface, which determines the final erosion thickness and consequently component lifetime. A comprehensive model that takes into account the various stages of plasma-material interaction has been developed and used to predict erosion rates during reactor disruptions, as well as during induced disruptions in laboratory experiments.

  9. Modelling and simulating fire tube boiler performance

    DEFF Research Database (Denmark)

    Sørensen, K.; Condra, T.; Houbak, Niels

    2003-01-01

    A model for a flue gas boiler covering the flue gas and the water-/steam side has been formulated. The model has been formulated as a number of sub models that are merged into an overall model for the complete boiler. Sub models have been defined for the furnace, the convection zone (split in 2: a zone submerged in water and a zone covered by steam), a model for the material in the boiler (the steel) and 2 models for resp. the water/steam zone (the boiling) and the steam. The dynamic model has been developed as a number of Differential-Algebraic-Equation systems (DAE). Subsequently MatLab/Simulink has been applied for carrying out the simulations. To be able to verify the simulated results, experiments have been carried out on a full scale boiler plant.

  10. Estimation of value at risk and conditional value at risk using normal mixture distributions model

    Science.gov (United States)

    Kamaruzzaman, Zetty Ain; Isa, Zaidi

    2013-04-01

    Normal mixture distributions model has been successfully applied in financial time series analysis. In this paper, we estimate the return distribution, value at risk (VaR) and conditional value at risk (CVaR) for monthly and weekly rates of returns for FTSE Bursa Malaysia Kuala Lumpur Composite Index (FBMKLCI) from July 1990 until July 2010 using the two component univariate normal mixture distributions model. First, we present the application of normal mixture distributions model in empirical finance where we fit our real data. Second, we present the application of normal mixture distributions model in risk analysis where we apply the normal mixture distributions model to evaluate the value at risk (VaR) and conditional value at risk (CVaR) with model validation for both risk measures. The empirical results provide evidence that using the two components normal mixture distributions model can fit the data well and can perform better in estimating value at risk (VaR) and conditional value at risk (CVaR) where it can capture the stylized facts of non-normality and leptokurtosis in returns distribution.
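
    A sketch of extracting VaR and CVaR from a fitted two-component normal mixture, assuming hypothetical weights, means and standard deviations: VaR solves the mixture CDF equation F(x) = alpha, and CVaR follows from the normal partial-expectation identity E[X·1{X<=q}] = mu*Phi(z) - sigma*phi(z) with z = (q - mu)/sigma.

    import numpy as np
    from scipy.optimize import brentq
    from scipy.stats import norm

    # Hypothetical two-component mixture fitted to monthly returns.
    w  = np.array([0.8, 0.2])       # component weights
    mu = np.array([0.01, -0.03])    # component means
    sd = np.array([0.04, 0.10])     # component standard deviations

    def mix_cdf(x):
        return np.sum(w * norm.cdf((x - mu) / sd))

    alpha = 0.05
    var = brentq(lambda x: mix_cdf(x) - alpha, -1.0, 1.0)  # VaR: F(x) = alpha
    z = (var - mu) / sd
    cvar = np.sum(w * (mu * norm.cdf(z) - sd * norm.pdf(z))) / alpha
    print(f"VaR {var:.4f}, CVaR {cvar:.4f}")   # lower-tail risk measures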

  11. Bridging experiments, models and simulations

    DEFF Research Database (Denmark)

    Carusi, Annamaria; Burrage, Kevin; Rodríguez, Blanca

    2012-01-01

    Computational models in physiology often integrate functional and structural information from a large range of spatiotemporal scales from the ionic to the whole organ level. Their sophistication raises both expectations and skepticism concerning how computational methods can improve our understanding of living organisms and also how they can reduce, replace, and refine animal experiments. A fundamental requirement to fulfill these expectations and achieve the full potential of computational physiology is a clear understanding of what models represent and how they can be validated. The present ... that contributes to defining the specific aspects of cardiac electrophysiology the MSE system targets, rather than being only an external test, and that this is driven by advances in experimental and computational methods and the combination of both.

  12. Model calibration for building energy efficiency simulation

    International Nuclear Information System (INIS)

    Mustafaraj, Giorgio; Marini, Dashamir; Costa, Andrea; Keane, Marcus

    2014-01-01

    Highlights: • Developing a 3D model relating to building architecture, occupancy and HVAC operation. • Two calibration stages developed, final model providing accurate results. • Using an onsite weather station for generating the weather data file in EnergyPlus. • Predicting thermal behaviour of underfloor heating, heat pump and natural ventilation. • Monthly energy saving opportunities related to the heat pump of 20–27% were identified. - Abstract: This research work deals with an Environmental Research Institute (ERI) building where an underfloor heating system and natural ventilation are the main systems used to maintain comfort conditions throughout 80% of the building areas. Firstly, this work involved developing a 3D model relating to building architecture, occupancy and HVAC operation. Secondly, the calibration methodology, which consists of two levels, was then applied in order to ensure accuracy and reduce the likelihood of errors. To further improve the accuracy of calibration, a historical weather data file related to year 2011 was created from the on-site local weather station of the ERI building. After applying the second level of the calibration process, the values of Mean Bias Error (MBE) and Cumulative Variation of Root Mean Squared Error (CV(RMSE)) on an hourly basis for heat pump electricity consumption varied within the following ranges: MBE (hourly) from −5.6% to 7.5% and CV(RMSE) (hourly) from 7.3% to 25.1%. Finally, the building was simulated with EnergyPlus to identify further possibilities of energy savings supplied by a water to water heat pump to the underfloor heating system. It was found that electricity consumption savings from the heat pump can vary between 20% and 27% on a monthly basis.
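
    The two calibration indices above have widely used definitions (e.g., in ASHRAE Guideline 14); the sketch below computes them for hypothetical hourly measured and simulated series, though the paper's exact normalization may differ slightly.

    import numpy as np

    def mbe(measured, simulated):
        """Mean bias error, as a fraction of the measured total."""
        m, s = np.asarray(measured), np.asarray(simulated)
        return (m - s).sum() / m.sum()

    def cv_rmse(measured, simulated):
        """Coefficient of variation of the RMSE, relative to the measured mean."""
        m, s = np.asarray(measured), np.asarray(simulated)
        return np.sqrt(((m - s) ** 2).mean()) / m.mean()

    m = np.array([12.1, 10.4, 9.8, 11.5])   # hypothetical hourly kWh, measured
    s = np.array([11.4, 10.9, 10.2, 11.0])  # hypothetical hourly kWh, simulated
    print(mbe(m, s), cv_rmse(m, s))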

  13. Algorithms of CT value correction for reconstructing a radiotherapy simulation image through axial CT images

    International Nuclear Information System (INIS)

    Ogino, Takashi; Egawa, Sunao

    1991-01-01

    New algorithms of CT value correction for reconstructing a radiotherapy simulation image through axial CT images were developed. One, designated the plane weighting method, corrects the CT value in proportion to the position of the beam element passing through the voxel. The other, designated the solid weighting method, corrects the CT value in proportion to the length of the beam element passing through the voxel and the volume of the voxel. Phantom experiments showed fair spatial resolution in the transverse direction. In the longitudinal direction, however, spatial resolution finer than the slice thickness could not be obtained. Contrast resolution was equivalent for both methods. In patient studies, the reconstructed radiotherapy simulation image was almost similar in visual perception of density resolution to a simulation film taken by an X-ray simulator. (author)
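
    One plausible reading of the solid weighting method, sketched below: each traversed voxel's CT value is weighted by the beam-element path length through it and the voxel volume, then normalized. The function and numbers are illustrative assumptions, not the authors' implementation.

    import numpy as np

    def solid_weighted_ct(ct_values, path_lengths, voxel_volumes):
        """Solid-weighting sketch: weight each voxel's CT value by the length
        of the beam element through it and the voxel volume, then normalize."""
        w = np.asarray(path_lengths) * np.asarray(voxel_volumes)
        return np.sum(w * np.asarray(ct_values)) / np.sum(w)

    # Hypothetical voxels traversed by one beam element of the simulation ray.
    print(solid_weighted_ct([40.0, 55.0, 1000.0], [0.8, 1.2, 0.3], [1.0, 1.0, 1.0]))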

  14. Delivering Value In A Global Aerospace Manufacturer Through The Effective Use Of Numerical Process Simulation

    Science.gov (United States)

    Ward, M. J.; Walløe, S. J.

    2004-06-01

    Numerical models are used extensively in the aerospace sector to identify appropriate manufacturing parameters, and to minimize the risk associated with new product introduction and manufacturing change. This usage is equally prevalent in original equipment manufacturers (OEMs), and in their supply chains. The wide range of manufacturing processes and production environments involved, coupled with the varying degrees of technology maturity associated with numerical models of different processes leads to a situation of significant complexity from the OEM perspective. In addition, the intended use of simulation technology can vary considerably between applications, from simple geometric assessment of die shape at one extreme, to full process design or development at the other. Consequently there is an increasing trend towards multi-scale modelling, i.e. the use of several different model types, with differing attributes in terms of accuracy and speed to support a range of different new product introduction decisions. This makes the allocation of appropriate levels of activity to the research and implementation of new capabilities a difficult problem. This paper uses a number of industrial cases studies to illustrate a framework for making such allocation decisions such that value to the OEM is maximized, and investigates how such a framework is likely to shift over the next few years based on technological developments.

  15. Diversity modelling for electrical power system simulation

    International Nuclear Information System (INIS)

    Sharip, R M; Abu Zarim, M A U A

    2013-01-01

    This paper considers diversity of generation and demand profiles against the different future energy scenarios and evaluates these on a technical basis. Compared to previous studies, this research applied a forecasting concept based on possible growth rates from publicly available electricity distribution scenarios concerning the UK. These scenarios were created by different bodies considering aspects such as environment, policy, regulation, economics and technology. In line with these scenarios, forecasting is on a long-term timescale (up to every ten years from 2020 until 2050) in order to create a possible output of generation mix and demand profiles to be used as an appropriate boundary condition for the network simulation. The network considered is a segment of rural LV network populated with a mixture of different housing types. The profiles for the 'future' energy and demand have been successfully modelled by applying a forecasting method. The network results under these profiles show, for the cases studied, that even though the value of the power produced by each micro-generation unit is often in line with the demand requirements of an individual dwelling, there will be no problems arising from high penetration of micro-generation and demand side management for each of the dwellings considered. The results obtained highlight the technical issues/changes for energy delivery and management to rural customers under the future energy scenarios

  16. Diversity modelling for electrical power system simulation

    Science.gov (United States)

    Sharip, R. M.; Abu Zarim, M. A. U. A.

    2013-12-01

    This paper considers diversity of generation and demand profiles against the different future energy scenarios and evaluates these on a technical basis. Compared to previous studies, this research applied a forecasting concept based on possible growth rates from publicly available electricity distribution scenarios concerning the UK. These scenarios were created by different bodies considering aspects such as environment, policy, regulation, economics and technology. In line with these scenarios, forecasting is on a long-term timescale (up to every ten years from 2020 until 2050) in order to create a possible output of generation mix and demand profiles to be used as an appropriate boundary condition for the network simulation. The network considered is a segment of rural LV network populated with a mixture of different housing types. The profiles for the 'future' energy and demand have been successfully modelled by applying a forecasting method. The network results under these profiles show, for the cases studied, that even though the value of the power produced by each micro-generation unit is often in line with the demand requirements of an individual dwelling, there will be no problems arising from high penetration of micro-generation and demand side management for each of the dwellings considered. The results obtained highlight the technical issues/changes for energy delivery and management to rural customers under the future energy scenarios.

  17. Advanced training simulator models. Implementation and validation

    International Nuclear Information System (INIS)

    Borkowsky, Jeffrey; Judd, Jerry; Belblidia, Lotfi; O'farrell, David; Andersen, Peter

    2008-01-01

    Modern training simulators are required to replicate plant data for both thermal-hydraulic and neutronic response. Replication is required such that reactivity manipulation on the simulator properly trains the operator for reactivity manipulation at the plant. This paper discusses advanced models which perform this function in real-time using the coupled code system THOR/S3R. This code system models all the fluid systems in detail using an advanced two-phase thermal-hydraulic model. The nuclear core is modeled using an advanced three-dimensional nodal method and also by using cycle-specific nuclear data. These models are configured to run interactively from a graphical instructor station or hardware operation panels. The simulator models are theoretically rigorous and are expected to replicate the physics of the plant. However, to verify replication, the models must be independently assessed. Plant data is the preferred validation method, but plant data is often not available for many important training scenarios. In the absence of data, validation may be obtained by slower-than-real-time transient analysis. This analysis can be performed by coupling a safety analysis code and a core design code. Such a coupling exists between the codes RELAP5 and SIMULATE-3K (S3K). RELAP5/S3K is used to validate the real-time model for several postulated plant events. (author)

  18. Regularization modeling for large-eddy simulation

    NARCIS (Netherlands)

    Geurts, Bernardus J.; Holm, D.D.

    2003-01-01

    A new modeling approach for large-eddy simulation (LES) is obtained by combining a "regularization principle" with an explicit filter and its inversion. This regularization approach allows a systematic derivation of the implied subgrid model, which resolves the closure problem. The central role of

  19. Analytical system dynamics modeling and simulation

    CERN Document Server

    Fabien, Brian C

    2008-01-01

    This book, offering a modeling technique based on Lagrange's energy method, includes 125 worked examples. Using this technique enables one to model and simulate systems as diverse as a six-link, closed-loop mechanism or a transistor power amplifier.

  20. Hybrid simulation models of production networks

    CERN Document Server

    Kouikoglou, Vassilis S

    2001-01-01

    This book is concerned with a most important area of industrial production, that of analysis and optimization of production lines and networks using discrete-event models and simulation. The book introduces a novel approach that combines analytic models and discrete-event simulation. Unlike conventional piece-by-piece simulation, this method observes a reduced number of events between which the evolution of the system is tracked analytically. Using this hybrid approach, several models are developed for the analysis of production lines and networks. The hybrid approach combines speed and accuracy for exceptional analysis of most practical situations. A number of optimization problems, involving buffer design, workforce planning, and production control, are solved through the use of hybrid models.

  1. Dynamic modeling and simulation of wind turbines

    International Nuclear Information System (INIS)

    Ghafari Seadat, M.H.; Kheradmand Keysami, M.; Lari, H.R.

    2002-01-01

    Using wind energy to generate electricity in wind turbines is a good way of using renewable energies, and it can also help to protect the environment. The main objective of this paper is dynamic modeling, by the energy method, and computer-aided simulation of a wind turbine. The equations of motion are derived for simulating the wind turbine system, and the behavior of the system is then obtained by solving the equations. For the simulation, the turbine is considered with a three-blade rotor facing the wind direction, an induction generator connected to the network, and constant rotational speed. Every part of the wind turbine must be modeled for the simulation; the main parts are the blades, the gearbox, the shafts and the generator

  2. Regional model simulations of New Zealand climate

    Science.gov (United States)

    Renwick, James A.; Katzfey, Jack J.; Nguyen, Kim C.; McGregor, John L.

    1998-03-01

    Simulation of New Zealand climate is examined through the use of a regional climate model nested within the output of the Commonwealth Scientific and Industrial Research Organisation nine-level general circulation model (GCM). R21 resolution GCM output is used to drive a regional model run at 125 km grid spacing over the Australasian region. The 125 km run is used in turn to drive a simulation at 50 km resolution over New Zealand. Simulations with a full seasonal cycle are performed for 10 model years. The focus is on the quality of the simulation of present-day climate, but results of a doubled-CO2 run are discussed briefly. Spatial patterns of mean simulated precipitation and surface temperatures improve markedly as horizontal resolution is increased, through the better resolution of the country's orography. However, increased horizontal resolution leads to a positive bias in precipitation. At 50 km resolution, simulated frequency distributions of daily maximum/minimum temperatures are statistically similar to those of observations at many stations, while frequency distributions of daily precipitation appear to be statistically different to those of observations at most stations. Modeled daily precipitation variability at 125 km resolution is considerably less than observed, but is comparable to, or exceeds, observed variability at 50 km resolution. The sensitivity of the simulated climate to changes in the specification of the land surface is discussed briefly. Spatial patterns of the frequency of extreme temperatures and precipitation are generally well modeled. Under a doubling of CO2, the frequency of precipitation extremes changes only slightly at most locations, while air frosts become virtually unknown except at high-elevation sites.

  3. Landscape Modelling and Simulation Using Spatial Data

    Directory of Open Access Journals (Sweden)

    Amjed Naser Mohsin AL-Hameedawi

    2017-08-01

    In this paper a procedure is presented for generating a spatial model of a landscape suited to realistic simulation. The procedure is based on combining spatial data and field measurements with computer graphics produced using Blender software, after which a 3D simulation can be formed based on VIS ALL packages. The objective was to build a model utilising GIS, including inputs from the feature attribute data, concentrating on assembling an acceptable spatial prototype, defining a facilitation scheme and outlining the intended framework; the eventual result was used in simulation form. The procedure covers not only data gathering, fieldwork and model provision, but extends to a new method for producing the corresponding 3D simulation mapping, which allows decision makers as well as investors to adopt an independent navigation system for Geoscience applications.

  4. Quantitative interface models for simulating microstructure evolution

    International Nuclear Information System (INIS)

    Zhu, J.Z.; Wang, T.; Zhou, S.H.; Liu, Z.K.; Chen, L.Q.

    2004-01-01

    To quantitatively simulate microstructural evolution in real systems, we investigated three different interface models: a sharp-interface model implemented by the software DICTRA and two diffuse-interface models which use either physical order parameters or artificial order parameters. A particular example is considered, the diffusion-controlled growth of a γ′ precipitate in a supersaturated γ matrix in Ni-Al binary alloys. All three models use the thermodynamic and kinetic parameters from the same databases. The temporal evolution profiles of composition from different models are shown to agree with each other. The focus is on examining the advantages and disadvantages of each model as applied to microstructure evolution in alloys.

  5. A queuing model for road traffic simulation

    International Nuclear Information System (INIS)

    Guerrouahane, N.; Aissani, D.; Bouallouche-Medjkoune, L.; Farhi, N.

    2015-01-01

    We present in this article a stochastic queuing model for road traffic. The model is based on the M/G/c/c state-dependent queuing model, and is inspired by the deterministic Godunov scheme for road traffic simulation. We first propose a variant of the M/G/c/c state-dependent model that works with density-flow fundamental diagrams rather than density-speed relationships. We then extend this model in order to consider upstream traffic demand as well as downstream traffic supply. Finally, we show how to model a whole road by concatenating road sections as in the deterministic Godunov scheme
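
    A sketch of the demand/supply interface flow of the deterministic Godunov scheme that inspires the queuing model, using a density-flow fundamental diagram; the Greenshields diagram and all parameter values are illustrative assumptions.

    RHO_MAX, V_FREE = 180.0, 100.0   # hypothetical veh/km and km/h

    def flow(rho):
        """Greenshields fundamental diagram: q(rho) = v_f*rho*(1 - rho/rho_max)."""
        return V_FREE * rho * (1.0 - rho / RHO_MAX)

    RHO_C = RHO_MAX / 2.0            # critical density of this diagram

    def demand(rho):                 # upstream sending flow
        return flow(min(rho, RHO_C))

    def supply(rho):                 # downstream receiving flow
        return flow(max(rho, RHO_C))

    def godunov_flux(rho_up, rho_down):
        """Interface flow between two road sections: min(demand, supply)."""
        return min(demand(rho_up), supply(rho_down))

    print(godunov_flux(40.0, 150.0))  # congested downstream limits the flow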

  6. Clock error models for simulation and estimation

    International Nuclear Information System (INIS)

    Meditch, J.S.

    1981-10-01

    Mathematical models for the simulation and estimation of errors in precision oscillators used as time references in satellite navigation systems are developed. The results, based on all currently known oscillator error sources, are directly implementable on a digital computer. The simulation formulation is sufficiently flexible to allow for the inclusion or exclusion of individual error sources as desired. The estimation algorithms, following from Kalman filter theory, provide directly for the error analysis of clock errors in both filtering and prediction
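
    A minimal sketch of a common two-state oscillator error model (frequency random walk plus white phase noise), directly implementable as the abstract suggests; the noise densities are hypothetical, and the report's models cover all currently known error sources plus Kalman filter estimation.

    import numpy as np

    def simulate_clock(n, dt, q_phase, q_freq, rng=np.random.default_rng()):
        """Two-state clock error sketch: phase driven by a frequency offset
        plus white phase noise; the frequency offset itself random-walks."""
        phase = np.zeros(n)
        freq = 0.0
        for k in range(1, n):
            freq += np.sqrt(q_freq * dt) * rng.standard_normal()   # frequency walk
            phase[k] = (phase[k - 1] + freq * dt
                        + np.sqrt(q_phase * dt) * rng.standard_normal())
        return phase

    err = simulate_clock(3600, 1.0, q_phase=1e-22, q_freq=1e-26)  # hypothetical noise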

  7. Modeling and simulation goals and accomplishments

    International Nuclear Information System (INIS)

    Turinsky, P.

    2013-01-01

    The CASL (Consortium for Advanced Simulation of Light Water Reactors) mission is to develop and apply the Virtual Reactor simulator (VERA) to optimise nuclear power in terms of capital and operating costs, of nuclear waste production and of nuclear safety. An efficient and reliable virtual reactor simulator relies on 3-dimensional calculations, accurate physics models and code coupling. Advances in computer hardware, along with comparable advances in numerical solvers make the VERA project achievable. This series of slides details the VERA project and presents the specificities and performance of the codes involved in the project and ends by listing the computing needs

  8. The value of load shifting. An estimate for Norway using the EMPS model

    International Nuclear Information System (INIS)

    Doorman, Gerard; Wolfgang, Ove

    2006-05-01

    An attempt is made to estimate the value of Load Shifting (LS) in the Norwegian system, using the EMPS model. A thorough update of the demand side model and the cost estimates used in the model was done as a preparation for the project, and the report gives a comprehensive description of the demand models used. The LS measure that is analyzed is moving 600 MW of demand in Norway from peak to lower-demand hours during the day. The value of this was estimated both in a simplified manner (based on simulated price differences between these periods), and by simulations with the EMPS model and a subsequent calculation of the socio-economic surplus. Neither approach showed any significant value. The results do not necessarily mean that the value in reality is zero - there are a number of limitations in the model which make it difficult to estimate the real value, like the representation of wind generation, demand variability, outages, exchange prices with continental Europe, flexibility of hydro and thermal generation, reserves and elasticity of demand in the short run. It was verified through sensitivity calculations that especially increasing reserve requirements and increasing the variability of wind generation increased price differences and therefore the value of LS. A number of improvements in the EMPS model and data are proposed to obtain a more suitable simulation model for this kind of analysis: 1) modeling of reserves, 2) representation of wind variability, 3) thermal generation models, 4) differentiation between long and short term price elasticity, 5) review of interconnection capacities, 6) use of quadratic losses, and 7) representation of more stochastic factors like e.g. outages in the simulations. Although the model at present clearly has its limitations with respect to estimating the value of LS, it appears that price differences between spot prices in the actual hours in reality are small. Comparison with Nord Pool spot prices for the years 2003
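
    The simplified estimate mentioned above prices the shifted energy at the simulated price difference between peak and lower-demand hours; a sketch with hypothetical numbers:

    def simplified_ls_value(peak_price, offpeak_price, shifted_mw,
                            hours_per_day, days):
        """Simplified first check: value of moving demand from peak to
        off-peak hours, priced at the simulated price difference."""
        return (peak_price - offpeak_price) * shifted_mw * hours_per_day * days

    # Hypothetical: 600 MW shifted 4 h/day for a year, 2 EUR/MWh price spread.
    print(simplified_ls_value(32.0, 30.0, 600.0, 4, 365), "EUR/year")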

  9. The development of the Professional Values Model in Nursing.

    Science.gov (United States)

    Kaya, Ayla; Boz, İlkay

    2017-01-01

    One of the most important criteria for professionalism is accumulation of knowledge that is usable in professional practice. Nursing models and theories are important elements of accumulating nursing knowledge and have a chance to guarantee ethical professional practice. In recent years, there has been an increase in the use of models in nursing research, and newly created terminology has started to be used in nursing. In this study, a new model, termed the Professional Values Model and developed by the authors, is described. Concepts comprising the conceptual framework of the model and relations between the concepts are explained. It is assumed that awareness about the concepts of the model will increase not only the patients' satisfaction with nursing care, but also the nurses' job satisfaction and the quality of nursing care. Contemporary literature has been reviewed and synthesized to develop this theoretical paper on the Professional Values Model in nursing. Having high values in nursing increases job satisfaction, which results in the improvement of patient care and satisfaction. Also, individual characteristics are effective in the determination of individual needs, priorities, and values. This relation, demonstrated through research on the Professional Values Model, has been explained. With development of these concepts, individuals' satisfaction with care and nurses' job satisfaction will be enhanced, which will increase the quality of nursing care. Most importantly, nurses can take proper decisions about ethical dilemmas and take ethical action when they take these values into consideration when giving care. The Professional Values Model seems suitable for nurse managers and it is expected that testing will improve it. Implementation of the Professional Values Model by nurse managers may increase the motivation of the nurses they work with. It is suggested that guidance by the Professional Values Model may help in enhancing the motivation efforts of nurse managers.

  10. Modeling and simulation of different and representative engineering problems using Network Simulation Method.

    Science.gov (United States)

    Sánchez-Pérez, J F; Marín, F; Morales, J L; Cánovas, M; Alhama, F

    2018-01-01

    Mathematical models simulating different and representative engineering problems, atomic dry friction, moving front problems and elastic and solid mechanics, are presented in the form of a set of non-linear, coupled or uncoupled differential equations. For the different parameter values that influence the solution, the problem is numerically solved by the network method, which provides all the variables of the problems. Although the model is extremely sensitive to the above parameters, no assumptions are made as regards the linearization of the variables. The design of the models, which are run on standard electrical circuit simulation software, is explained in detail. The network model results are compared with common numerical methods or experimental data, published in the scientific literature, to show the reliability of the model.
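
    A sketch of the electrical analogy that underlies the network method: a first-order thermal balance maps onto an RC circuit (temperature as voltage, heat flow as current), which a circuit simulator then integrates. Here the analogous ODE is stepped directly with illustrative parameters.

    # First-order thermal model C*dT/dt = (T_env - T)/R, the RC-circuit analogue.
    R, C, T_ENV = 2.0, 5.0, 25.0      # hypothetical thermal resistance/capacitance
    T, dt = 100.0, 0.01
    for _ in range(int(50 / dt)):     # simulate 50 time units
        T += dt * (T_ENV - T) / (R * C)   # explicit Euler step of the RC analogue
    print(f"final temperature: {T:.2f}")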

  11. Modeling and simulation of different and representative engineering problems using Network Simulation Method

    Science.gov (United States)

    2018-01-01

    Mathematical models simulating different and representative engineering problems, atomic dry friction, moving front problems and elastic and solid mechanics, are presented in the form of a set of non-linear, coupled or uncoupled differential equations. For the different parameter values that influence the solution, the problem is numerically solved by the network method, which provides all the variables of the problems. Although the model is extremely sensitive to the above parameters, no assumptions are made as regards the linearization of the variables. The design of the models, which are run on standard electrical circuit simulation software, is explained in detail. The network model results are compared with common numerical methods or experimental data, published in the scientific literature, to show the reliability of the model. PMID:29518121

  12. Simulation Modeling of Software Development Processes

    Science.gov (United States)

    Calavaro, G. F.; Basili, V. R.; Iazeolla, G.

    1996-01-01

    A simulation modeling approach is proposed for the prediction of software process productivity indices, such as cost and time-to-market, and the sensitivity analysis of such indices to changes in the organization parameters and user requirements. The approach uses a timed Petri Net and Object Oriented top-down model specification. Results demonstrate the model representativeness, and its usefulness in verifying process conformance to expectations, and in performing continuous process improvement and optimization.

  13. Multi-Valued Modal Fixed Point Logics for Model Checking

    Science.gov (United States)

    Nishizawa, Koki

    In this paper, I will show how multi-valued logics are used for model checking. Model checking is an automatic technique to analyze correctness of hardware and software systems. A model checker is based on a temporal logic or a modal fixed point logic. That is to say, a system to be checked is formalized as a Kripke model, a property to be satisfied by the system is formalized as a temporal formula or a modal formula, and the model checker checks that the Kripke model satisfies the formula. Although most existing model checkers are based on 2-valued logics, recently new attempts have been made to extend the underlying logics of model checkers to multi-valued logics. I will summarize these new results.
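
    A toy sketch of multi-valued modal evaluation over a Kripke structure, assuming truth values in [0, 1] with min/max semantics; real multi-valued model checkers work over richer lattices and add fixed point operators.

    # Kripke structure: transition relation and a multi-valued labelling of p.
    succ = {0: [1, 2], 1: [2], 2: [2]}
    label = {0: 0.2, 1: 0.9, 2: 0.6}

    def diamond(value):   # EX p: best value over successors
        return {s: max(value[t] for t in succ[s]) for s in succ}

    def box(value):       # AX p: worst value over successors
        return {s: min(value[t] for t in succ[s]) for s in succ}

    print(diamond(label), box(label))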

  14. Validation of the simulator neutronics model

    International Nuclear Information System (INIS)

    Gregory, M.V.

    1984-01-01

    The neutronics model in the SRP reactor training simulator computes the variation with time of the neutron population in the reactor core. The power output of a reactor is directly proportional to the neutron population, thus in a very real sense the neutronics model determines the response of the simulator. The geometrical complexity of the reactor control system in SRP reactors requires the neutronics model to provide a detailed, 3D representation of the reactor core. Existing simulator technology does not allow such a detailed representation to run in real-time in a minicomputer environment, thus an entirely different approach to the problem was required. A prompt jump method has been developed in answer to this need
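
    As background on the prompt jump idea, the sketch below applies the classical prompt jump approximation to one-delayed-group point kinetics: the prompt neutron time derivative is neglected, so the population follows algebraically from the precursors. All kinetics parameters are hypothetical, and this is not the SRP simulator's detailed 3D implementation.

    # Prompt jump sketch (one delayed group, point kinetics): n follows
    # algebraically from the precursor concentration c and reactivity rho.
    BETA, LAM, GEN = 0.0065, 0.08, 1e-4     # hypothetical kinetics parameters

    def step(c, rho, dt):
        n = LAM * c * GEN / (BETA - rho)    # prompt jump: n from c and rho
        c += dt * (BETA * n / GEN - LAM * c)  # precursor balance
        return n, c

    c = BETA * 1.0 / (GEN * LAM)            # equilibrium precursors at n = 1
    for _ in range(1000):
        n, c = step(c, rho=0.001, dt=0.01)  # small positive reactivity step
    print(f"power after 10 s: {n:.3f}")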

  15. Analyzing Strategic Business Rules through Simulation Modeling

    Science.gov (United States)

    Orta, Elena; Ruiz, Mercedes; Toro, Miguel

    Service Oriented Architecture (SOA) holds promise for business agility, since it allows business processes to change to meet new customer demands or market needs without causing a cascade effect of changes in the underlying IT systems. Business rules are the instrument chosen to help business and IT collaborate. In this paper, we propose the use of simulation models to model and simulate strategic business rules that are then disaggregated at different levels of an SOA architecture. Our proposal aims to help find a good configuration for strategic business objectives and IT parameters. The paper includes a case study where a simulation model is built to support business decision-making in a context where finding a good configuration for different business parameters and performance levels is too complex to analyze by trial and error.

  16. Audiovisual preservation strategies, data models and value-chains

    OpenAIRE

    Addis, Matthew; Wright, Richard

    2010-01-01

    This is a report on preservation strategies, models and value-chains for digital file-based audiovisual content. The report includes: (a) current and emerging value-chains and business models for audiovisual preservation; (b) a comparison of preservation strategies for audiovisual content, including their strengths and weaknesses; and (c) a review of current preservation metadata models, and requirements for extension to support audiovisual files.

  17. Modeling and simulation of axisymmetric coating growth on nanofibers

    International Nuclear Information System (INIS)

    Moore, K.; Clemons, C. B.; Kreider, K. L.; Young, G. W.

    2007-01-01

    This work is a modeling and simulation extension of an integrated experimental/modeling investigation of a procedure to coat nanofibers and core-clad nanostructures with thin film materials using plasma enhanced physical vapor deposition. In the experimental effort, electrospun polymer nanofibers are coated with metallic materials under different operating conditions to observe changes in the coating morphology. The modeling effort focuses on linking simple models at the reactor level, nanofiber level, and atomic level to form a comprehensive model. The comprehensive model leads to the definition of an evolution equation for the coating free surface. This equation was previously derived and solved under a single-valued assumption in a polar geometry to determine the coating morphology as a function of operating conditions. The present work considers the axisymmetric geometry and solves the evolution equation without the single-valued assumption and under less restrictive assumptions on the concentration field than the previous work.

  18. New exploration on TMSR: modelling and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Si, S.; Chen, Q.; Bei, H.; Zhao, J., E-mail: ssy@snerdi.com.cn [Shanghai Nuclear Engineering Research & Design Inst., Shanghai (China)

    2015-07-01

    A tightly coupled multi-physics model for the MSR (Molten Salt Reactor) system, involving the reactor core and the rest of the primary loop, has been developed and employed in the in-house computer code TANG-MSR. In this paper, the computer code is used to simulate the behavior of steady-state operation and transients for our redesigned TMSR. The presented simulation results demonstrate that the models employed in TANG-MSR can capture the major physics phenomena in an MSR and that the redesigned TMSR has excellent safety and sustainability performance. (author)

  19. Self-Service Banking: Value Creation Models and Information Exchange

    Directory of Open Access Journals (Sweden)

    Ragnvald Sannes

    2001-01-01

    This paper argues that most banks have failed to exploit the potential of self-service banking because they base their service design on an incomplete business model for self-service. A framework for evaluation of self-service banking concepts is developed on the basis of Stabell and Fjeldstad's three value configurations. The value network and the value shop are consistent with self-service banking while the value chain is inappropriate. The impact of the value configurations on information exchange and self-service functionality is discussed, and a framework for design of such services proposed. Current self-service banking practices are compared to the framework, and it is concluded that current practice matches the concept of a value network and not the value shop. However, current practices are only a partial implementation of a value network-based self-service banking concept.

  20. Nuclear reactor core modelling in multifunctional simulators

    International Nuclear Information System (INIS)

    Puska, E.K.

    1999-01-01

    The thesis concentrates on the development of nuclear reactor core models for the APROS multifunctional simulation environment and the use of the core models in various kinds of applications. The work was started in 1986 as a part of the development of the entire APROS simulation system. The aim was to create core models that would serve in a reliable manner in an interactive, modular and multifunctional simulator/plant analyser environment. One-dimensional and three-dimensional core neutronics models have been developed. Both models have two energy groups and six delayed neutron groups. The three-dimensional finite difference type core model is able to describe both BWR- and PWR-type cores with quadratic fuel assemblies and VVER-type cores with hexagonal fuel assemblies. The one- and three-dimensional core neutronics models can be connected with the homogeneous, the five-equation or the six-equation thermal hydraulic models of APROS. The key feature of APROS is that the same physical models can be used in various applications. The nuclear reactor core models of APROS have been built in such a manner that the same models can be used in simulator and plant analyser applications, as well as in safety analysis. In the APROS environment the user can select the number of flow channels in the three-dimensional reactor core and either the homogeneous, the five- or the six-equation thermal hydraulic model for these channels. The thermal hydraulic model and the number of flow channels have a decisive effect on the calculation time of the three-dimensional core model and thus, at present, these particular selections make the major difference between a safety analysis core model and a training simulator core model. The emphasis on this thesis is on the three-dimensional core model and its capability to analyse symmetric and asymmetric events in the core. The factors affecting the calculation times of various three-dimensional BWR, PWR and WWER-type APROS core models have been

  1. Nuclear reactor core modelling in multifunctional simulators

    Energy Technology Data Exchange (ETDEWEB)

    Puska, E.K. [VTT Energy, Nuclear Energy, Espoo (Finland)

    1999-06-01

    The thesis concentrates on the development of nuclear reactor core models for the APROS multifunctional simulation environment and the use of the core models in various kinds of applications. The work was started in 1986 as a part of the development of the entire APROS simulation system. The aim was to create core models that would serve in a reliable manner in an interactive, modular and multifunctional simulator/plant analyser environment. One-dimensional and three-dimensional core neutronics models have been developed. Both models have two energy groups and six delayed neutron groups. The three-dimensional finite difference type core model is able to describe both BWR- and PWR-type cores with quadratic fuel assemblies and VVER-type cores with hexagonal fuel assemblies. The one- and three-dimensional core neutronics models can be connected with the homogeneous, the five-equation or the six-equation thermal hydraulic models of APROS. The key feature of APROS is that the same physical models can be used in various applications. The nuclear reactor core models of APROS have been built in such a manner that the same models can be used in simulator and plant analyser applications, as well as in safety analysis. In the APROS environment the user can select the number of flow channels in the three-dimensional reactor core and either the homogeneous, the five- or the six-equation thermal hydraulic model for these channels. The thermal hydraulic model and the number of flow channels have a decisive effect on the calculation time of the three-dimensional core model and thus, at present, these particular selections make the major difference between a safety analysis core model and a training simulator core model. The emphasis on this thesis is on the three-dimensional core model and its capability to analyse symmetric and asymmetric events in the core. The factors affecting the calculation times of various three-dimensional BWR, PWR and WWER-type APROS core models have been

  2. Environment Modeling Using Runtime Values for JPF-Android

    Science.gov (United States)

    van der Merwe, Heila; Tkachuk, Oksana; Nel, Seal; van der Merwe, Brink; Visser, Willem

    2015-01-01

    Software applications are developed to be executed in a specific environment. This environment includes external native libraries that add functionality to the application and drivers that fire the application execution. For testing and verification, the environment of an application is simplified/abstracted using models or stubs. Empty stubs, returning default values, are simple to generate automatically, but they do not perform well when the application expects specific return values. Symbolic execution is used to find input parameters for drivers and return values for library stubs, but it struggles to detect the values of complex objects. In this work-in-progress paper, we explore an approach to generate drivers and stubs based on values collected during runtime instead of using default values. Entry points and methods that need to be modeled are instrumented to log their parameters and return values. The instrumented applications are then executed using a driver and instrumented libraries. The values collected during runtime are used to generate driver and stub values on-the-fly that improve coverage during verification by enabling the execution of code that previously crashed or was missed. We are implementing this approach to improve the environment model of JPF-Android, our model checking and analysis tool for Android applications.
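
    JPF-Android itself targets Java/Android bytecode, so the following Python snippet is only a language-neutral sketch of the idea the abstract describes: instrument a method to log its return values at runtime, then generate a stub that replays those values instead of returning empty defaults. All names are hypothetical.

        # Illustrative sketch (not JPF-Android code): record return values at
        # runtime, then replay them from a generated stub.
        import functools
        from collections import defaultdict, deque

        recorded = defaultdict(deque)

        def record(fn):
            """Instrumentation wrapper that logs every return value of fn."""
            @functools.wraps(fn)
            def wrapper(*args, **kwargs):
                result = fn(*args, **kwargs)
                recorded[fn.__name__].append(result)
                return result
            return wrapper

        def make_stub(name, default=None):
            """Stub that replays recorded values in order, then falls back."""
            def stub(*args, **kwargs):
                values = recorded[name]
                return values.popleft() if values else default
            return stub

        @record
        def query_sensor():      # stands in for a native library call
            return 42            # pretend this value came from real hardware

        query_sensor()                        # one instrumented execution
        sensor_stub = make_stub("query_sensor")
        print(sensor_stub())                  # -> 42 instead of an empty default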

  3. Value-based resource management: a model for best value nursing care.

    Science.gov (United States)

    Caspers, Barbara A; Pickard, Beth

    2013-01-01

    With the health care environment shifting to a value-based payment system, Catholic Health Initiatives nursing leadership spearheaded an initiative with 14 hospitals to establish best nursing care at a lower cost. The implementation of technology-enabled business processes at point of care led to a new model for best value nursing care: Value-Based Resource Management. The new model integrates clinical patient data from the electronic medical record and embeds the new information in care team workflows for actionable real-time decision support and predictive forecasting. The participating hospitals reported increased patient satisfaction and cost savings in the reduction of overtime and improvement in length of stay management. New data generated by the initiative on nursing hours and cost by patient and by population (Medicare severity diagnosis-related groups), and patient health status outcomes across the acute care continuum expanded business intelligence for a value-based population health system.

  4. Modeling and Application of Customer Lifetime Value in Online Retail

    OpenAIRE

    Pavel Jasek; Lenka Vrana; Lucie Sperkova; Zdenek Smutny; Marek Kobulsky

    2018-01-01

    This article provides an empirical statistical analysis and discussion of the predictive abilities of selected customer lifetime value (CLV) models that could be used in online shopping within e-commerce business settings. The comparison of CLV predictive abilities, using selected evaluation metrics, is made on selected CLV models: Extended Pareto/NBD model (EP/NBD), Markov chain model and Status Quo model. The article uses six online store datasets with annual revenues in the order of tens o...

  5. Kanban simulation model for production process optimization

    Directory of Open Access Journals (Sweden)

    Golchev Riste

    2015-01-01

    A long time has passed since the KANBAN system was established as an efficient method for coping with excessive inventory. Still, the possibilities for its improvement through integration with other approaches should be investigated further. The basic research challenge of this paper is to present the benefits of KANBAN implementation supported with Discrete Event Simulation (DES). In that direction, the basics of the KANBAN system are first presented, with emphasis on the information and material flow, together with a methodology for implementation of the KANBAN system. An analysis of combining simulation with this methodology is presented. The paper concludes with a practical example which shows that, by understanding the philosophy of the KANBAN implementation methodology and the simulation methodology, a simulation model can be created which can serve as a basis for a variety of experiments conducted within a short period of time, resulting in production process optimization.

  6. Vermont Yankee simulator BOP model upgrade

    International Nuclear Information System (INIS)

    Alejandro, R.; Udbinac, M.J.

    2006-01-01

    The Vermont Yankee simulator has undergone significant changes in the 20 years since the original order was placed. After the move from the original Unix to the MS Windows environment, and an upgrade to the latest version of SimPort, now called MASTER, the platform was set for an overhaul and replacement of major plant system models. Over a period of a few months, the VY simulator team, in partnership with WSC engineers, replaced outdated legacy models of the main steam, condenser, condensate, circulating water, feedwater and feedwater heaters, and main turbine and auxiliaries. The timing was ideal, as the plant was undergoing a power uprate, so the opportunity was taken to replace the legacy models with industry-leading, true on-line, object-oriented graphical models. Due to the efficiency of design and ease of use of the MASTER tools, VY staff performed the majority of the modeling work themselves with great success, with only occasional assistance from WSC, in a relatively short time period, despite having to maintain all of their 'regular' simulator maintenance responsibilities. This paper will provide a more detailed view of the VY simulator, including how it is used and how it has benefited from the enhancements and upgrades implemented during the project. (author)

  7. Simulation modeling and analysis in safety. II

    International Nuclear Information System (INIS)

    Ayoub, M.A.

    1981-01-01

    The paper introduces and illustrates simulation modeling as a viable approach for dealing with complex issues and decisions in safety and health. The author details two studies: evaluation of employee exposure to airborne radioactive materials and effectiveness of the safety organization. The first study seeks to define a policy to manage a facility used in testing employees for radiation contamination. An acceptable policy is one that would permit the testing of all employees as defined under regulatory requirements, while not exceeding available resources. The second study evaluates the relationship between safety performance and the characteristics of the organization, its management, its policy, and communication patterns among various functions and levels. Both studies use models where decisions are reached based on the prevailing conditions and occurrence of key events within the simulation environment. Finally, several problem areas suitable for simulation studies are highlighted. (Auth.)

  8. Modeling salmonella Dublin into the dairy herd simulation model Simherd

    DEFF Research Database (Denmark)

    Kudahl, Anne Braad

    2010-01-01

    Infection with Salmonella Dublin in the dairy herd, and the effects of the infection and relevant control measures, are currently being modeled into the dairy herd simulation model called Simherd. The aim is to compare, by simulation, the effects of different control strategies against Salmonella Dublin on both within-herd prevalence and economy. The project is part of a larger national project, "Salmonella 2007 - 2011", with the main objective of reducing the prevalence of Salmonella Dublin in Danish dairy herds. Results of the simulations will therefore be used for decision support in the national surveillance and eradication program against Salmonella Dublin. Basic structures of the model are programmed and will be presented at the workshop. The model is in a phase of face-validation by a group of Salmonella ...

  9. Brownian gas models for extreme-value laws

    International Nuclear Information System (INIS)

    Eliazar, Iddo

    2013-01-01

    In this paper we establish one-dimensional Brownian gas models for the extreme-value laws of Gumbel, Weibull, and Fréchet. A gas model is a countable collection of independent particles governed by common diffusion dynamics. The extreme-value laws are the universal probability distributions governing the affine scaling limits of the maxima and minima of ensembles of independent and identically distributed one-dimensional random variables. Using the recently introduced concept of stationary Poissonian intensities, we construct two gas models whose global statistical structures are stationary, and yield the extreme-value laws: a linear Brownian motion gas model for the Gumbel law, and a geometric Brownian motion gas model for the Weibull and Fréchet laws. The stochastic dynamics of these gas models are studied in detail, and closed-form analytical descriptions of their temporal correlation structures, their topological phase transitions, and their intrinsic first-passage-time fluxes are presented. (paper)
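
    As a quick numerical illustration of the universality the abstract refers to (a standard fact, not the paper's gas construction), the maxima of i.i.d. exponential variables, shifted by log n, already follow the Gumbel law:

        # Sanity check: max of n i.i.d. Exp(1) samples minus log(n) converges
        # in distribution to the standard Gumbel law.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        n, trials = 1000, 5000
        maxima = rng.exponential(size=(trials, n)).max(axis=1) - np.log(n)

        # Kolmogorov-Smirnov test against the standard Gumbel distribution
        print(stats.kstest(maxima, stats.gumbel_r.cdf))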

  10. A universal simulator for ecological models

    DEFF Research Database (Denmark)

    Holst, Niels

    2013-01-01

    Software design is an often neglected issue in ecological models, even though bad software design often becomes a hindrance for re-using, sharing and even grasping an ecological model. In this paper, the methodology of agile software design was applied to the domain of ecological models. Thus, the principles for a universal design of ecological models were arrived at. To exemplify this design, the open-source software Universal Simulator was constructed using C++ and XML and is provided as a resource for inspiration.

  11. Biological transportation networks: Modeling and simulation

    KAUST Repository

    Albi, Giacomo

    2015-09-15

    We present a model for biological network formation originally introduced by Cai and Hu [Adaptation and optimization of biological transport networks, Phys. Rev. Lett. 111 (2013) 138701]. The modeling of fluid transportation (e.g., leaf venation and angiogenesis) and ion transportation networks (e.g., neural networks) is explained in detail and basic analytical features like the gradient flow structure of the fluid transportation network model and the impact of the model parameters on the geometry and topology of network formation are analyzed. We also present a numerical finite-element based discretization scheme and discuss sample cases of network formation simulations.

  12. Reproducibility in Computational Neuroscience Models and Simulations

    Science.gov (United States)

    McDougal, Robert A.; Bulanova, Anna S.; Lytton, William W.

    2016-01-01

    Objective: Like all scientific research, computational neuroscience research must be reproducible. Big data science, including simulation research, cannot depend exclusively on journal articles as the method to provide the sharing and transparency required for reproducibility. Methods: Ensuring model reproducibility requires the use of multiple standard software practices and tools, including version control, strong commenting and documentation, and code modularity. Results: Building on these standard practices, model sharing sites and tools have been developed that fit into several categories: (1) standardized neural simulators; (2) shared computational resources; (3) declarative model descriptors, ontologies and standardized annotations; (4) model sharing repositories and sharing standards. Conclusion: A number of complementary innovations have been proposed to enhance sharing, transparency and reproducibility. The individual user can be encouraged to make use of version control, commenting, documentation and modularity in the development of models. The community can help by requiring model sharing as a condition of publication and funding. Significance: Model management will become increasingly important as multiscale models become larger, more detailed and correspondingly more difficult to manage by any single investigator or single laboratory. Additional big-data management complexity will come as the models become more useful in interpreting experiments, thus increasing the need to ensure clear alignment between modeling data, both parameters and results, and experiment. PMID:27046845

  13. Development of a Value Inquiry Model in Biology Education.

    Science.gov (United States)

    Jeong, Eun-Young; Kim, Young-Soo

    2000-01-01

    Points out the rapid advances in biology, increasing bioethical issues, and how students need to make rational decisions. Introduces a value inquiry model development that includes identifying and clarifying value problems; understanding biological knowledge related to conflict situations; considering, selecting, and evaluating each alternative;…

  14. Towards the Integration of Value and Coordination Models - Position Paper -

    NARCIS (Netherlands)

    Bodenstaff, L.; Reichert, M.U.; Wieringa, Roelf J.; Pernici, B; Gulla, J.A.

    Cross-organizational collaborations have a high complexity. Modelling these collaborations can be done from different perspectives. For example, the value perspective represents expected value exchanges in a collaboration while the coordination perspective represents the order in which these

  15. Satisfaction with virtual worlds: An integrated model of experiential value

    NARCIS (Netherlands)

    Verhagen, T.; Feldberg, J.F.M.; van den Hooff, B.J.; Meents, S.; Merikivi, J.

    2011-01-01

    Although virtual worlds increasingly attract users today, few studies have addressed what satisfies virtual world users. We therefore defined and tested an integrated model of experiential system value and virtual world satisfaction. Drawing upon expectancy-value and cognitive evaluation theories,

  16. Environment modeling using runtime values for JPF-Android

    CSIR Research Space (South Africa)

    Van der Merwe, H

    2015-11-01

    For testing and verification, the environment of an application is simplified/abstracted using models or stubs. Empty stubs, returning default values, are simple to generate automatically, but they do not perform well when the application expects specific return values. Symbolic execution...

  17. Object Oriented Modelling and Dynamical Simulation

    DEFF Research Database (Denmark)

    Wagner, Falko Jens; Poulsen, Mikael Zebbelin

    1998-01-01

    This report with appendix describes the work done in a master's project at DTU. The goal of the project was to develop a concept for simulation of dynamical systems based on object-oriented methods. The result was a library of C++ classes, for use when both building component-based models and when...

  18. Advanced feeder control using fast simulation models

    NARCIS (Netherlands)

    Verheijen, O.S.; Op den Camp, O.M.G.C.; Beerkens, R.G.C.; Backx, A.C.P.M.; Huisman, L.; Drummond, C.H.

    2005-01-01

    For the automatic control of glass quality in glass production, the relation between process variables and product (glass) quality and process conditions/process input parameters must be known in detail. So far, detailed 3-D glass melting simulation models were used to predict the effect of process

  19. Modeling and Simulating Virtual Anatomical Humans

    NARCIS (Netherlands)

    Madehkhaksar, Forough; Luo, Zhiping; Pronost, Nicolas; Egges, Arjan

    2014-01-01

    This chapter presents human musculoskeletal modeling and simulation as a challenging field that lies between biomechanics and computer animation. One of the main goals of computer animation research is to develop algorithms and systems that produce plausible motion. On the other hand, the main

  20. Agent Based Modelling for Social Simulation

    NARCIS (Netherlands)

    Smit, S.K.; Ubink, E.M.; Vecht, B. van der; Langley, D.J.

    2013-01-01

    This document is the result of an exploratory project looking into the status of, and opportunities for Agent Based Modelling (ABM) at TNO. The project focussed on ABM applications containing social interactions and human factors, which we termed ABM for social simulation (ABM4SS). During the course

  1. Continuous Spatial Process Models for Spatial Extreme Values

    KAUST Repository

    Sang, Huiyan; Gelfand, Alan E.

    2010-01-01

    process model for extreme values that provides mean square continuous realizations, where the behavior of the surface is driven by the spatial dependence which is unexplained under the latent spatio-temporal specification for the GEV parameters

  2. Thermohydraulic modeling and simulation of breeder reactors

    International Nuclear Information System (INIS)

    Agrawal, A.K.; Khatib-Rahbar, M.; Curtis, R.T.; Hetrick, D.L.; Girijashankar, P.V.

    1982-01-01

    This paper deals with the modeling and simulation of system-wide transients in LMFBRs. Unprotected events (i.e., the presumption of failure of the plant protection system) leading to core-melt are not considered in this paper. The existing computational capabilities in the area of protected transients in the US are noted. Various physical and numerical approximations that are made in these codes are discussed. Finally, the future direction in the area of model verification and improvements is discussed

  3. Value Creation Challenges in Multichannel Retail Business Models

    Directory of Open Access Journals (Sweden)

    Mika Yrjölä

    2014-08-01

    Purpose: The purpose of the paper is to identify and analyze the challenges of value creation in multichannel retail business models. Design/methodology/approach: With the help of semi-structured interviews with top executives from different retailing environments, this study introduces a model of value creation challenges in the context of multichannel retailing. The challenges are analyzed in terms of three retail business model elements, i.e., format, activities, and governance. Findings: Adopting a multichannel retail business model requires critical rethinking of the basic building blocks of value creation. First of all, as customers effortlessly move between multiple channels, multichannel formats can lead to a mismatch between customer and firm value. Secondly, retailers face pressures to use their activities to form integrated total offerings to customers. Thirdly, multiple channels might lead to organizational silos with conflicting goals. A careful orchestration of value creation is needed to determine the roles and incentives of the channel parties involved. Research limitations/implications: In contrast to previous business model literature, this study did not adopt a network-centric view. By embracing the boundary-spanning nature of the business model, other challenges and elements might have been discovered (e.g., challenges in managing relationships with suppliers). Practical implications: As a practical contribution, this paper has analyzed the challenges retailers face in adopting multichannel business models. Customer tendencies toward showrooming behavior highlight the need for generating efficient lock-in strategies. Customized, personal offers and information are ways to increase customer value, differentiate from competition, and achieve lock-in. Originality/value: As a theoretical contribution, this paper empirically investigates value creation challenges in a specific context, lowering the level of abstraction in the mostly

  4. Equation-oriented specification of neural models for simulations

    Directory of Open Access Journals (Sweden)

    Marcel Stimberg

    2014-02-01

    Simulating biological neuronal networks is a core method of research in computational neuroscience. A full specification of such a network model includes a description of the dynamics and state changes of neurons and synapses, as well as the synaptic connectivity patterns and the initial values of all parameters. A standard approach in neuronal modelling software is to build models based on a library of pre-defined models and mechanisms; if a model component does not yet exist, it has to be defined in a special-purpose or general low-level language and potentially be compiled and linked with the simulator. Here we propose an alternative approach that allows flexible definition of models by writing textual descriptions based on mathematical notation. We demonstrate that this approach allows the definition of a wide range of models with minimal syntax. Furthermore, such explicit model descriptions allow the generation of executable code for various target languages and devices, since the description is not tied to an implementation. Finally, this approach also has advantages for readability and reproducibility, because the model description is fully explicit, and because it can be automatically parsed and transformed into formatted descriptions. The presented approach has been implemented in the Brian2 simulator.
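
    The equation-oriented style described here is the one implemented in the Brian2 simulator; a minimal example in the spirit of its documented syntax (the model itself is illustrative):

        # A Brian2 model defined by a textual equation rather than library code.
        from brian2 import NeuronGroup, StateMonitor, run, ms

        tau = 10 * ms
        eqs = 'dv/dt = (1 - v) / tau : 1'   # dynamics written as mathematics

        group = NeuronGroup(10, eqs, threshold='v > 0.8', reset='v = 0',
                            method='exact')
        monitor = StateMonitor(group, 'v', record=True)
        run(50 * ms)
        print(monitor.v[0][:5])  # first few voltage samples of neuron 0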

  5. Modified network simulation model with token method of bus access

    Directory of Open Access Journals (Sweden)

    L.V. Stribulevich

    2013-08-01

    Purpose. To study the characteristics of a local network with the token method of bus access, a modified simulation model of the network was developed. Methodology. The network characteristics are determined with the developed simulation model, which is based on a state diagram of the network station with a priority-processing mechanism, covering both the steady state and the control procedures: initiation of the logical ring, and entry and exit of a station to and from the logical ring. Findings. A simulation model was developed from which one can obtain the dependence of the maximum waiting time in the queue for different access classes, of the reaction time, and of the usable bandwidth on the data rate, the number of network stations, the application generation rate, the number of frames transmitted per token holding time, and the frame length. Originality. A network simulation technique was proposed that reflects the network's operation both in the steady state and during the control procedures, together with the priority ranking and handling mechanism. Practical value. The developed simulation model allows network characteristics to be determined for real-time systems in railway transport.

  6. Integral-Value Models for Outcomes over Continuous Time

    DEFF Research Database (Denmark)

    Harvey, Charles M.; Østerdal, Lars Peter

    Models of preferences between outcomes over continuous time are important for individual, corporate, and social decision making, e.g., medical treatment, infrastructure development, and environmental regulation. This paper presents a foundation for such models. It shows that conditions on preferences between real- or vector-valued outcomes over continuous time are satisfied if and only if the preferences are represented by a value function having an integral form.
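
    For concreteness, the integral form referred to can be written (stated here as a generic assumption about such representations, not as the paper's exact theorem) as

        V(x) = \int_0^T v(x(t), t) \, dt

    where x(t) is the outcome path over the horizon [0, T] and v is an instantaneous value function whose time dependence can encode discounting.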

  7. USING COPULAS TO MODEL DEPENDENCE IN SIMULATION RISK ASSESSMENT

    Energy Technology Data Exchange (ETDEWEB)

    Dana L. Kelly

    2007-11-01

    Typical engineering systems in applications with high failure consequences such as nuclear reactor plants often employ redundancy and diversity of equipment in an effort to lower the probability of failure and therefore risk. However, it has long been recognized that dependencies exist in these redundant and diverse systems. Some dependencies, such as common sources of electrical power, are typically captured in the logic structure of the risk model. Others, usually referred to as intercomponent dependencies, are treated implicitly by introducing one or more statistical parameters into the model. Such common-cause failure models have limitations in a simulation environment. In addition, substantial subjectivity is associated with parameter estimation for these models. This paper describes an approach in which system performance is simulated by drawing samples from the joint distributions of dependent variables. The approach relies on the notion of a copula distribution, a notion which has been employed by the actuarial community for ten years or more, but which has seen only limited application in technological risk assessment. The paper also illustrates how equipment failure data can be used in a Bayesian framework to estimate the parameter values in the copula model. This approach avoids much of the subjectivity required to estimate parameters in traditional common-cause failure models. Simulation examples are presented for failures in time. The open-source software package R is used to perform the simulations. The open-source software package WinBUGS is used to perform the Bayesian inference via Markov chain Monte Carlo sampling.
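
    A minimal sketch of the sampling step the paper describes, assuming a Gaussian copula with exponential failure-time margins; the parameter values are illustrative, and the paper itself works in R and WinBUGS:

        # Draw dependent failure times through a Gaussian copula:
        # 1) sample correlated standard normals,
        # 2) map them to uniforms with the normal CDF (dependence preserved),
        # 3) push the uniforms through the inverse CDFs of the margins.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        rho = 0.7                                   # illustrative correlation
        cov = np.array([[1.0, rho], [rho, 1.0]])

        z = rng.multivariate_normal([0.0, 0.0], cov, size=10_000)
        u = stats.norm.cdf(z)
        t1 = stats.expon.ppf(u[:, 0], scale=100.0)  # failure time, component 1
        t2 = stats.expon.ppf(u[:, 1], scale=100.0)  # failure time, component 2

        print(np.corrcoef(t1, t2)[0, 1])            # induced dependence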

  8. Modeling Supermassive Black Holes in Cosmological Simulations

    Science.gov (United States)

    Tremmel, Michael

    My thesis work has focused on improving the implementation of supermassive black hole (SMBH) physics in cosmological hydrodynamic simulations. SMBHs are ubiquitous in massive galaxies, as well as bulge-less galaxies and dwarfs, and are thought to be a critical component of massive galaxy evolution. Still, much is unknown about how SMBHs form, grow, and affect their host galaxies. Cosmological simulations are an invaluable tool for understanding the formation of galaxies, self-consistently tracking their evolution with realistic merger and gas accretion histories. SMBHs are often modeled in these simulations (generally as a necessity to produce realistic massive galaxies), but their implementations are commonly simplified in ways that can limit what can be learned. Current and future observations are opening new windows into the lifecycle of SMBHs and their host galaxies, but require more detailed, physically motivated simulations. Within the novel framework I have developed, SMBHs 1) are seeded at early times without a priori assumptions of galaxy occupation, 2) grow in a way that accounts for the angular momentum of gas, and 3) experience realistic orbital evolution. I show how this model, properly tuned with a novel parameter optimization technique, results in realistic galaxies and SMBHs. Utilizing the unique ability of these simulations to capture the dynamical evolution of SMBHs, I present the first self-consistent prediction for the formation timescales of close SMBH pairs, precursors to SMBH binaries and merger events potentially detected by future gravitational wave experiments.

  9. Simulation data for an estimation of the maximum theoretical value and confidence interval for the correlation coefficient.

    Science.gov (United States)

    Rocco, Paolo; Cilurzo, Francesco; Minghetti, Paola; Vistoli, Giulio; Pedretti, Alessandro

    2017-10-01

    The data presented in this article are related to the article titled "Molecular Dynamics as a tool for in silico screening of skin permeability" (Rocco et al., 2017) [1]. Knowledge of the confidence interval and maximum theoretical value of the correlation coefficient r can prove useful to estimate the reliability of developed predictive models, in particular when there is great variability in compiled experimental datasets. In this Data in Brief article, data from purposely designed numerical simulations are presented to show how much the maximum r value is worsened by increasing the data uncertainty. The corresponding confidence interval of r is determined by using the Fisher r → Z transform.
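
    The Fisher r -> Z transform mentioned here yields the standard confidence interval for a correlation coefficient; the following sketch applies textbook statistics and is not derived from the article's data:

        # Confidence interval for a correlation coefficient via Fisher's
        # transform: Z = atanh(r) is roughly normal with SE = 1/sqrt(n - 3).
        import numpy as np
        from scipy import stats

        def r_confidence_interval(r, n, level=0.95):
            z = np.arctanh(r)
            half = stats.norm.ppf(0.5 + level / 2.0) / np.sqrt(n - 3)
            return np.tanh(z - half), np.tanh(z + half)

        print(r_confidence_interval(r=0.8, n=30))   # roughly (0.62, 0.90)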

  10. Simulation of MILD combustion using Perfectly Stirred Reactor model

    KAUST Repository

    Chen, Z.

    2016-07-06

    A simple model based on a Perfectly Stirred Reactor (PSR) is proposed for moderate or intense low-oxygen dilution (MILD) combustion. The PSR calculation is performed covering the entire flammability range, and the tabulated chemistry approach is used with a presumed joint probability density function (PDF). The jet-in-hot-and-diluted-coflow experimental set-up under MILD conditions is simulated using this reactor model for two oxygen dilution levels. The computed results for mean temperature and major and minor species mass fractions are compared with the experimental data and with simulation results obtained recently using a multi-environment transported PDF approach. Overall, a good agreement is observed at three different axial locations for these comparisons, despite the over-predicted peak value of CO formation. This suggests that MILD combustion can be effectively modelled by the proposed PSR model at lower computational cost.
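
    The tabulated-chemistry lookup the abstract mentions amounts to weighting a PSR table with a presumed PDF; a sketch with a beta PDF over a progress variable (the table and moments below are illustrative, not the paper's):

        # Presumed-PDF tabulated chemistry: the mean of a tabulated PSR quantity
        # is its integral against a presumed beta PDF of the progress variable
        # c, parameterized by the mean and variance of c.
        import numpy as np
        from scipy import stats

        c = np.linspace(1e-6, 1 - 1e-6, 500)    # progress-variable grid
        T_table = 300.0 + 1700.0 * c**2         # illustrative PSR temperatures

        def mean_from_beta_pdf(table, c, c_mean, c_var):
            # Moment matching: mean = a/(a+b), var = mean*(1-mean)/(a+b+1).
            s = c_mean * (1.0 - c_mean) / c_var - 1.0
            a, b = c_mean * s, (1.0 - c_mean) * s
            pdf = stats.beta.pdf(c, a, b)
            return np.trapz(table * pdf, c) / np.trapz(pdf, c)

        print(mean_from_beta_pdf(T_table, c, c_mean=0.6, c_var=0.05))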

  11. Crash simulation: an immersive learning model.

    Science.gov (United States)

    Wenham, John; Bennett, Paul; Gleeson, Wendy

    2017-12-26

    Far West New South Wales Local Emergency Management Committee runs an annual crash simulation exercise to assess the operational readiness of all local emergency services to coordinate and manage a multi-casualty exercise. Since 2009, the Broken Hill University Department of Rural Health (BHUDRH) has collaborated with the committee, enabling the inclusion of health students in this exercise. It is an immersive interprofessional learning experience that evaluates teamwork, communication and safe, effective clinical trauma management outside the hospital setting. After 7 years of modifying and developing the exercise, we set out to evaluate its impact on the students' learning, and sought ethics approval from the University of Sydney for this study. At the start of this year's crash simulation, students were given information sheets and consent forms with regard to the research. Once formal debriefing had finished, the researchers conducted a semi-structured focus-group interview with the health students to gain insight into their experience and their perceived value of the training. Students also completed short-answer questionnaires, and the anonymised responses were analysed. IMPLICATIONS: Participants identified that this multidisciplinary learning opportunity in a pre-hospital mass-casualty situation was of value to them. It has taken them outside of their usually protected hospital or primary care setting and tested their critical thinking and communication skills. We recommend this learning concept to other educational institutions. Further research will assess the learning value of the simulated event to the other agencies involved. © 2017 John Wiley & Sons Ltd and The Association for the Study of Medical Education.

  12. Transverse Momentum Distributions of Electron in Simulated QED Model

    Science.gov (United States)

    Kaur, Navdeep; Dahiya, Harleen

    2018-05-01

    In the present work, we have studied the transverse momentum distributions (TMDs) for the electron in simulated QED model. We have used the overlap representation of light-front wave functions where the spin-1/2 relativistic composite system consists of spin-1/2 fermion and spin-1 vector boson. The results have been obtained for T-even TMDs in transverse momentum plane for fixed value of longitudinal momentum fraction x.

  13. Advances in NLTE Modeling for Integrated Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Scott, H A; Hansen, S B

    2009-07-08

    The last few years have seen significant progress in constructing the atomic models required for non-local thermodynamic equilibrium (NLTE) simulations. Along with this has come an increased understanding of the requirements for accurately modeling the ionization balance, energy content and radiative properties of different elements for a wide range of densities and temperatures. Much of this progress is the result of a series of workshops dedicated to comparing the results from different codes and computational approaches applied to a series of test problems. The results of these workshops emphasized the importance of atomic model completeness, especially in doubly excited states and autoionization transitions, to calculating ionization balance, and the importance of accurate, detailed atomic data to producing reliable spectra. We describe a simple screened-hydrogenic model that calculates NLTE ionization balance with surprising accuracy, at a low enough computational cost for routine use in radiation-hydrodynamics codes. The model incorporates term splitting, Δn = 0 transitions, and approximate UTA widths for spectral calculations, with results comparable to those of much more detailed codes. Simulations done with this model have been increasingly successful at matching experimental data for laser-driven systems and hohlraums. Accurate and efficient atomic models are just one requirement for integrated NLTE simulations. Coupling the atomic kinetics to hydrodynamics and radiation transport constrains both discretizations and algorithms to retain energy conservation, accuracy and stability. In particular, the strong coupling between radiation and populations can require either very short timesteps or significantly modified radiation transport algorithms to account for NLTE material response. Considerations such as these continue to provide challenges for NLTE simulations.

  14. Decomposing the roles of perseveration and expected value representation in models of the Iowa gambling task.

    Science.gov (United States)

    Worthy, Darrell A; Pang, Bo; Byrne, Kaileigh A

    2013-01-01

    Models of human behavior in the Iowa Gambling Task (IGT) have played a pivotal role in accounting for behavioral differences during decision-making. One critical difference between models that have been used to account for behavior in the IGT is the inclusion or exclusion of the assumption that participants tend to persevere, or stay with the same option over consecutive trials. Models that allow for this assumption include win-stay-lose-shift (WSLS) models and reinforcement learning (RL) models that include a decay learning rule where expected values for each option decay as they are chosen less often. One shortcoming of RL models that have included decay rules is that the tendency to persevere by sticking with the same option has been conflated with the tendency to select the option with the highest expected value because a single term is used to represent both of these tendencies. In the current work we isolate the tendencies to perseverate and to select the option with the highest expected value by including them as separate terms in a Value-Plus-Perseveration (VPP) RL model. Overall the VPP model provides a better fit to data from a large group of participants than models that include a single term to account for both perseveration and the representation of expected value. Simulations of each model show that the VPP model's simulated choices most closely resemble the decision-making behavior of human subjects. In addition, we also find that parameter estimates of loss aversion are more strongly correlated with performance when perseverative tendencies and expected value representations are decomposed as separate terms within the model. The results suggest that the tendency to persevere and the tendency to select the option that leads to the best net payoff are central components of decision-making behavior in the IGT. Future work should use this model to better examine decision-making behavior.
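
    A compact sketch of the decomposition described above: expected values and a perseveration trace are kept as separate terms and mixed before a softmax choice. The update rules follow the abstract's description, but the exact functional form and all parameter values are illustrative assumptions, not the authors' fitted specification.

        # Value-Plus-Perseveration (VPP) sketch for a four-deck task.
        import numpy as np

        rng = np.random.default_rng(2)
        ev = np.zeros(4)         # expected value per deck
        persev = np.zeros(4)     # perseveration trace per deck
        decay, k = 0.8, 0.6      # decay rates for EV and perseveration
        w, beta = 0.7, 1.0       # EV-vs-perseveration weight, inverse temperature
        bonus = 1.0              # perseveration boost for the chosen deck

        for trial in range(100):
            v = w * ev + (1.0 - w) * persev
            p = np.exp(beta * v) / np.exp(beta * v).sum()  # softmax choice
            choice = rng.choice(4, p=p)
            payoff = rng.normal([0.5, 0.3, -0.2, 0.1][choice], 1.0)  # toy decks

            ev *= decay                # all expected values decay ...
            ev[choice] += payoff       # ... and the chosen one is reinforced
            persev *= k                # the perseveration trace decays ...
            persev[choice] += bonus    # ... and is boosted for the chosen deck

        print(np.round(ev, 2), np.round(persev, 2))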

  15. Possibilistic Fuzzy Net Present Value Model and Application

    Directory of Open Access Journals (Sweden)

    S. S. Appadoo

    2014-01-01

    The cash flow values and the interest rate in the net present value (NPV) model are usually specified by either crisp numbers or random variables. In this paper, we first discuss some of the recent developments in possibility theory and find closed-form expressions for the fuzzy possibilistic net present value (FNPV). Then, following Carlsson and Fullér (2001), we discuss some of the possibilistic moments related to the FNPV model along with an illustrative numerical example. We also give a unified approach to find higher-order moments of the FNPV by using the moment generating function introduced by Paseka et al. (2011).
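
    For illustration only (this is basic fuzzy interval arithmetic, not the paper's possibilistic moments), a fuzzy NPV with triangular fuzzy cash flows and a crisp discount rate reduces to discounting each of the three vertices:

        # NPV of triangular fuzzy cash flows (low, mode, high) at a crisp rate.
        # With positive discount factors each vertex is discounted separately,
        # so the result is again a triangular fuzzy number.
        def fuzzy_npv(cash_flows, rate):
            lo = sum(cf[0] / (1 + rate) ** t for t, cf in enumerate(cash_flows))
            md = sum(cf[1] / (1 + rate) ** t for t, cf in enumerate(cash_flows))
            hi = sum(cf[2] / (1 + rate) ** t for t, cf in enumerate(cash_flows))
            return lo, md, hi

        # Year-0 outlay followed by two uncertain inflows (illustrative figures).
        flows = [(-100, -100, -100), (40, 50, 60), (55, 70, 85)]
        print(fuzzy_npv(flows, rate=0.10))   # triangular fuzzy NPV (lo, md, hi)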

  16. Mesoscopic modelling and simulation of soft matter.

    Science.gov (United States)

    Schiller, Ulf D; Krüger, Timm; Henrich, Oliver

    2017-12-20

    The deformability of soft condensed matter often requires modelling of hydrodynamical aspects to gain quantitative understanding. This, however, requires specialised methods that can resolve the multiscale nature of soft matter systems. We review a number of the most popular simulation methods that have emerged, such as Langevin dynamics, dissipative particle dynamics, multi-particle collision dynamics, sometimes also referred to as stochastic rotation dynamics, and the lattice-Boltzmann method. We conclude this review with a short glance at current compute architectures for high-performance computing and community codes for soft matter simulation.

  17. Hydrodynamic modeling of petroleum reservoirs using simulator MUFITS

    Science.gov (United States)

    Afanasyev, Andrey

    2015-04-01

    MUFITS is new noncommercial software for numerical modeling of subsurface processes in various applications (www.mufits.imec.msu.ru). To this point, the simulator has been used for modeling nonisothermal flows in geothermal reservoirs and for modeling underground carbon dioxide storage. In this work, we present a recent extension of the code to petroleum reservoirs. The simulator can be applied in conventional black oil modeling, but it also utilizes more complicated models for volatile oil and gas condensate reservoirs as well as for oil rim fields. We give a brief overview of the code by describing the internal representation of reservoir models, which are constructed of grid blocks, interfaces and stock tanks, as well as of pipe segments and pipe junctions for modeling wells and surface networks. For the conventional black oil approach, we present simulation results for the SPE comparative tests. We propose an accelerated compositional modeling method for sub- and supercritical flows subject to various phase equilibria, particularly to three-phase equilibria of the vapour-liquid-liquid type. The method is based on the calculation of the thermodynamic potential of the reservoir fluid as a function of pressure, total enthalpy and total composition, and on storing its values as a spline table, which is used in hydrodynamic simulation for accelerated prediction of PVT properties. We provide descriptions of both the spline calculation procedure and the flashing algorithm. We evaluate the thermodynamic potential for a mixture of two pseudo-components modeling the heavy and light hydrocarbon fractions. We develop a technique for converting black oil PVT tables to the potential, which can be used for in-situ prediction of multiphase hydrocarbon equilibria under sub- and supercritical conditions, particularly in gas condensate and volatile oil reservoirs. We simulate recovery from a reservoir subject to near-critical initial conditions for the hydrocarbon mixture. We acknowledge

  18. Numerical model simulation of atmospheric coolant plumes

    International Nuclear Information System (INIS)

    Gaillard, P.

    1980-01-01

    The effect of humid atmospheric coolants on the atmosphere is simulated by means of a three-dimensional numerical model. The atmosphere is defined by its natural vertical profiles of horizontal velocity, temperature, pressure and relative humidity. Effluent discharge is characterised by its vertical velocity and the temperature of air saturated with water vapour. The subject of investigation is the area in the vicinity of the point of discharge, with due allowance for the wake effect of the tower and buildings and, where applicable, wind veer with altitude. The model equations express the conservation relationships for momentum, energy, total mass and water mass, for an incompressible fluid behaving in accordance with the Boussinesq assumptions. Condensation is represented by a simple thermodynamic model, and turbulent fluxes are simulated by introduction of turbulent viscosity and diffusivity data based on in-situ and experimental water model measurements. The three-dimensional problem expressed in terms of the primitive variables (u, v, w, p) is governed by an elliptic equation system which is solved numerically by application of an explicit time-marching algorithm in order to predict the steady-flow velocity distribution, temperature, water vapour concentration and the liquid-water concentration defining the visible plume. Windstill conditions are simulated by a program processing the elliptic equations in an axisymmetrical revolution coordinate system. The calculated visible plumes are compared with plumes observed on site with a view to validating the models [fr]

  19. Modeling, simulation and optimization of bipedal walking

    CERN Document Server

    Berns, Karsten

    2013-01-01

    The model-based investigation of motions of anthropomorphic systems is an important interdisciplinary research topic involving specialists from many fields such as Robotics, Biomechanics, Physiology, Orthopedics, Psychology, Neurosciences, Sports, Computer Graphics and Applied Mathematics. This book presents a study of basic locomotion forms such as walking and running, which are of particular interest due to the high demand on dynamic coordination, actuator efficiency and balance control. Mathematical models and numerical simulation and optimization techniques are explained, in combination with experimental data, which can help to better understand the basic underlying mechanisms of these motions and to improve them. Example topics treated in this book are: modeling techniques for anthropomorphic bipedal walking systems; optimized walking motions for different objective functions; identification of objective functions from measurements; simulation and optimization approaches for humanoid robots; biologically inspired con...

  20. Multiphase reacting flows modelling and simulation

    CERN Document Server

    Marchisio, Daniele L

    2007-01-01

    The papers in this book describe the most widely applicable modeling approaches and are organized in six groups covering from fundamentals to relevant applications. In the first part, some fundamentals of multiphase turbulent reacting flows are covered. In particular the introduction focuses on basic notions of turbulence theory in single-phase and multi-phase systems as well as on the interaction between turbulence and chemistry. In the second part, models for the physical and chemical processes involved are discussed. Among other things, particular emphasis is given to turbulence modeling strategies for multiphase flows based on the kinetic theory for granular flows. Next, the different numerical methods based on Lagrangian and/or Eulerian schemes are presented. In particular the most popular numerical approaches of computational fluid dynamics codes are described (i.e., Direct Numerical Simulation, Large Eddy Simulation, and Reynolds-Averaged Navier-Stokes approach). The book will cover particle-based meth...

  1. The Value Simulation-Based Learning Added to Machining Technology in Singapore

    Science.gov (United States)

    Fang, Linda; Tan, Hock Soon; Thwin, Mya Mya; Tan, Kim Cheng; Koh, Caroline

    2011-01-01

    This study seeks to understand the value simulation-based learning (SBL) added to the learning of Machining Technology in a 15-week core subject course offered to university students. The research questions were: (1) How did SBL enhance classroom learning? (2) How did SBL help participants in their test? (3) How did SBL prepare participants for…

  2. [Healthcare value chain: a model for the Brazilian healthcare system].

    Science.gov (United States)

    Pedroso, Marcelo Caldeira; Malik, Ana Maria

    2012-10-01

    This article presents a model of the healthcare value chain which consists of a schematic representation of the Brazilian healthcare system. The proposed model is adapted for the Brazilian reality and has the scope and flexibility for use in academic activities and analysis of the healthcare sector in Brazil. It places emphasis on three components: the main activities of the value chain, grouped in vertical and horizontal links; the mission of each link and the main value chain flows. The proposed model consists of six vertical and three horizontal links, amounting to nine. These are: knowledge development; supply of products and technologies; healthcare services; financial intermediation; healthcare financing; healthcare consumption; regulation; distribution of healthcare products; and complementary and support services. Four flows can be used to analyze the value chain: knowledge and innovation; products and services; financial; and information.

  3. Comparison of Geant4-DNA simulation of S-values with other Monte Carlo codes

    International Nuclear Information System (INIS)

    André, T.; Morini, F.; Karamitros, M.; Delorme, R.; Le Loirec, C.; Campos, L.; Champion, C.; Groetz, J.-E.; Fromm, M.; Bordage, M.-C.; Perrot, Y.; Barberet, Ph.

    2014-01-01

    Monte Carlo simulations of S-values have been carried out with the Geant4-DNA extension of the Geant4 toolkit. The S-values have been simulated for monoenergetic electrons with energies ranging from 0.1 keV up to 20 keV, in liquid water spheres (for four radii, chosen between 10 nm and 1 μm), and for electrons emitted by five isotopes of iodine (131, 132, 133, 134 and 135), in liquid water spheres of varying radius (from 15 μm up to 250 μm). The results have been compared to those obtained from other Monte Carlo codes and from other published data. The use of the Kolmogorov–Smirnov test has allowed confirming the statistical compatibility of all simulation results
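
    The compatibility check mentioned at the end can be reproduced with a two-sample Kolmogorov-Smirnov test; in this generic sketch the arrays are placeholders standing in for two codes' S-value samples, not Geant4-DNA output:

        # Two-sample KS test of the kind used to check the statistical
        # compatibility of S-values computed by two Monte Carlo codes.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)
        s_values_code_a = rng.normal(1.00, 0.05, size=200)  # placeholder run A
        s_values_code_b = rng.normal(1.01, 0.05, size=200)  # placeholder run B

        stat, p_value = stats.ks_2samp(s_values_code_a, s_values_code_b)
        print(f"KS statistic = {stat:.3f}, p = {p_value:.3f}")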

  4. Modelling and simulation of thermal power plants

    Energy Technology Data Exchange (ETDEWEB)

    Eborn, J.

    1998-02-01

    Mathematical modelling and simulation are important tools when dealing with engineering systems that today are becoming increasingly more complex. Integrated production and recycling of materials are trends that give rise to heterogeneous systems, which are difficult to handle within one area of expertise. Model libraries are an excellent way to package engineering knowledge of systems and units to be reused by those who are not experts in modelling. Many commercial packages provide good model libraries, but they are usually domain-specific and closed. Heterogeneous, multi-domain systems require open model libraries written in general-purpose modelling languages. This thesis describes a model database for thermal power plants written in the object-oriented modelling language OMOLA. The models are based on first principles. Subunits describe volumes with pressure and enthalpy dynamics and flows of heat or different media. The subunits are used to build basic units such as pumps, valves and heat exchangers which can be used to build system models. Several applications are described: a heat recovery steam generator, equipment for juice blending, steam generation in a sulphuric acid plant and a condensing steam plate heat exchanger. Model libraries for industrial use must be validated against measured data. The thesis describes how parameter estimation methods can be used for model validation. Results from a case study on parameter optimization of a non-linear drum boiler model show how the technique can be used. 32 refs, 21 figs

  5. Validity of microgravity simulation models on earth

    DEFF Research Database (Denmark)

    Regnard, J; Heer, M; Drummer, C

    2001-01-01

    Many studies have used water immersion and head-down bed rest as experimental models to simulate responses to microgravity. However, some data collected during space missions are at variance or in contrast with observations collected from experimental models. These discrepancies could reflect...... incomplete knowledge of the characteristics inherent to each model. During water immersion, the hydrostatic pressure lowers the peripheral vascular capacity and causes increased thoracic blood volume and high vascular perfusion. In turn, these changes lead to high urinary flow, low vasomotor tone, and a high...

  6. Decomposing the Roles of Perseveration and Expected Value Representation in Models of the Iowa Gambling Task

    Directory of Open Access Journals (Sweden)

    Darrell A. Worthy

    2013-09-01

    Models of human behavior in the Iowa Gambling Task (IGT) have played a pivotal role in accounting for behavioral differences during decision-making. One critical difference between models that have been used to account for behavior in the IGT is the inclusion or exclusion of the assumption that participants tend to persevere, or stay with the same option over consecutive trials. Models that allow for this assumption include win-stay-lose-shift (WSLS) models and reinforcement learning (RL) models that include a decay learning rule, where expected values for each option decay as they are chosen less often. One shortcoming of RL models that have included decay rules is that the tendency to persevere by sticking with the same option has been conflated with the tendency to select the option with the highest expected value, because a single term is used to represent both of these tendencies. In the current work we isolate the tendencies to perseverate and to select the option with the highest expected value by including them as separate terms in a Value-Plus-Perseveration (VPP) RL model. Overall the VPP model provides a better fit to data from a large group of participants than models that include a single term to account for both perseveration and the representation of expected value. Simulations of each model show that the VPP model's simulated choices most closely resemble the decision-making behavior of human subjects. In addition, we also find that parameter estimates of loss aversion are more strongly correlated with performance when perseverative tendencies and expected value representations are decomposed as separate terms within the model. The results suggest that the tendency to persevere and the tendency to select the option that leads to the best net payoff are central components of decision-making behavior in the IGT. Future work should use this model to better examine decision-making behavior.
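    As a rough illustration of the decomposition described above, the sketch below implements one plausible VPP-style agent: expected values decay and are updated by obtained payoffs, a separate perseveration trace decays and is boosted for the chosen option, and choice is a softmax over a weighted mix of the two. The parameter names (A, k, w, c) and the linear payoff utility are assumptions for illustration; the published model may differ (e.g., it applies a prospect-utility transform with loss aversion).

```python
import numpy as np

def simulate_vpp_choices(payoffs, A=0.8, k=0.5, w=0.7, c=1.0, rng=None):
    """Simulate deck choices with a Value-Plus-Perseveration rule (illustrative).

    A: decay rate of expected values, k: decay rate of the perseveration
    trace, w: weight of expected value vs. perseveration, c: softmax
    inverse temperature. All names and defaults are hypothetical.
    """
    rng = np.random.default_rng() if rng is None else rng
    n_trials, n_decks = payoffs.shape
    ev = np.zeros(n_decks)      # expected value per deck
    pers = np.zeros(n_decks)    # perseveration strength per deck
    choices = np.empty(n_trials, dtype=int)
    for t in range(n_trials):
        weighted = w * ev + (1 - w) * pers
        p = np.exp(c * weighted - np.max(c * weighted))   # stable softmax
        p /= p.sum()
        j = rng.choice(n_decks, p=p)
        choices[t] = j
        ev *= A                   # all expected values decay ...
        ev[j] += payoffs[t, j]    # ... and the chosen deck gains its payoff
        pers *= k                 # the perseveration trace decays ...
        pers[j] += 1.0            # ... and is boosted for the chosen deck
    return choices
```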

  7. Modeling and simulation of economic processes

    Directory of Open Access Journals (Sweden)

    Bogdan Brumar

    2010-12-01

    In general, any activity requires prolonged action and is often characterized by a degree of uncertainty or insecurity with respect to the size of the objective pursued. Because of the complexity of real economic systems and the stochastic dependencies between the variables and parameters considered, not all systems can be adequately represented by a model that can be solved analytically while still covering all the issues relevant to management decision analysis over a realistic economic horizon. In such cases, simulation is often considered the only available alternative. Using simulation techniques to study real-world systems often requires laborious work. Carrying out a simulation experiment is a process that takes place in several stages.

  8. Simulation as a surgical teaching model.

    Science.gov (United States)

    Ruiz-Gómez, José Luis; Martín-Parra, José Ignacio; González-Noriega, Mónica; Redondo-Figuero, Carlos Godofredo; Manuel-Palazuelos, José Carlos

    2018-01-01

    Teaching of surgery has been affected by many factors over the last years, such as the reduction of working hours, the optimization of the use of the operating room, and patient safety. Traditional teaching methodology fails to reduce the impact of these factors on surgeons' training. Simulation as a teaching model minimizes such impact and is more effective than traditional teaching methods for integrating knowledge and clinical-surgical skills. Simulation complements clinical assistance with training, creating a safe learning environment where patient safety is not affected and ethical or legal conflicts are avoided. Simulation uses learning methodologies that allow teaching individualization, adapting it to the learning needs of each student. It also allows training of all kinds of technical, cognitive or behavioural skills. Copyright © 2017 AEC. Published by Elsevier España, S.L.U. All rights reserved.

  9. Gstat: a program for geostatistical modelling, prediction and simulation

    Science.gov (United States)

    Pebesma, Edzer J.; Wesseling, Cees G.

    1998-01-01

    Gstat is a computer program for variogram modelling, and geostatistical prediction and simulation. It provides a generic implementation of the multivariable linear model with trends modelled as a linear function of coordinate polynomials or of user-defined base functions, and independent or dependent, geostatistically modelled, residuals. Simulation in gstat comprises conditional or unconditional (multi-) Gaussian sequential simulation of point values or block averages, or (multi-) indicator sequential simulation. Besides many of the popular options found in other geostatistical software packages, gstat offers the unique combination of (i) an interactive user interface for modelling variograms and generalized covariances (residual variograms), that uses the device-independent plotting program gnuplot for graphical display, (ii) support for several ascii and binary data and map file formats for input and output, (iii) a concise, intuitive and flexible command language, (iv) user customization of program defaults, (v) no built-in limits, and (vi) free, portable ANSI-C source code. This paper describes the class of problems gstat can solve, and addresses aspects of efficiency and implementation, managing geostatistical projects, and relevant technical details.
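    gstat itself is portable ANSI-C (with a widely used R interface); purely to make the central quantity of variogram modelling concrete, here is a classical (Matheron) empirical semivariogram estimator in Python. This is an illustrative sketch, not gstat's code or API.

```python
import numpy as np

def empirical_semivariogram(coords, values, bin_edges):
    """Classical semivariance estimate per distance bin.

    coords: (n, d) array of point locations; values: (n,) observations;
    bin_edges: monotone array of lag-distance bin boundaries.
    """
    d = np.sqrt(((coords[:, None, :] - coords[None, :, :]) ** 2).sum(-1))
    g = 0.5 * (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)        # count each pair once
    dist, gamma = d[iu], g[iu]
    sv = []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        mask = (dist >= lo) & (dist < hi)
        sv.append(gamma[mask].mean() if mask.any() else np.nan)
    return np.array(sv)
```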

  10. Creating Value Through the Freemium Business Model: A Consumer Perspective

    NARCIS (Netherlands)

    G.J. Rietveld (Joost)

    2016-01-01

    This paper develops a consumer-centric framework for creating value through the freemium business model. Goods that are commercialized through the freemium business model offer basic functionality for free and monetize users for extended use or complementary features. Compared to premium

  11. Generalized algebra-valued models of set theory

    NARCIS (Netherlands)

    Löwe, B.; Tarafder, S.

    2015-01-01

    We generalize the construction of lattice-valued models of set theory due to Takeuti, Titani, Kozawa and Ozawa to a wider class of algebras and show that this yields a model of a paraconsistent logic that validates all axioms of the negation-free fragment of Zermelo-Fraenkel set theory.

  12. Vacuum expectation values for four-fermion operators. Model estimates

    International Nuclear Information System (INIS)

    Zhitnitskij, A.R.

    1985-01-01

    Some simple models (a system with a heavy quark, a rarefied instanton gas) are used to investigate the problem of factorizability. Characteristics of the vacuum fluctuations responsible for saturation of the phenomenologically known four-fermion vacuum expectation values are discussed. A qualitative agreement between the model and phenomenological estimates has been noted

  13. Vacuum expectation values of four-fermion operators. Model estimates

    International Nuclear Information System (INIS)

    Zhitnitskii, A.R.

    1985-01-01

    Simple models (a system with a heavy quark, a rarefied instanton gas) are used to study problems of factorizability. A discussion is given of the characteristics of the vacuum fluctuations responsible for saturation of the phenomenologically known four-fermion vacuum expectation values. Qualitative agreement between the model and phenomenological estimates is observed

  14. A Lookahead Behavior Model for Multi-Agent Hybrid Simulation

    Directory of Open Access Journals (Sweden)

    Mei Yang

    2017-10-01

    In the military field, multi-agent simulation (MAS) plays an important role in studying wars statistically. For a military simulation system, which involves large-scale entities and generates a very large number of interactions during the runtime, the issue of how to improve the running efficiency is of great concern for researchers. Current solutions mainly use hybrid simulation to gain fewer updates and synchronizations, where some important continuous models are maintained implicitly to keep the system dynamics, and partial resynchronization (PR) is chosen as the preferable state update mechanism. However, problems such as resynchronization interval selection and cyclic dependency remain unsolved in PR, which easily lead to low update efficiency and infinite looping of the state update process. To address these problems, this paper proposes a lookahead behavior model (LBM) to implement a PR-based hybrid simulation. In LBM, a minimal safe time window is used to predict the interactions between implicit models, upon which the resynchronization interval can be efficiently determined. Moreover, the LBM gives an estimated state value in the lookahead process so as to break the state-dependent cycle. The simulation results show that, compared with traditional mechanisms, LBM requires fewer updates and synchronizations.

  15. Deep Drawing Simulations With Different Polycrystalline Models

    Science.gov (United States)

    Duchêne, Laurent; de Montleau, Pierre; Bouvier, Salima; Habraken, Anne Marie

    2004-06-01

    The goal of this research is to study the anisotropic material behavior during forming processes, represented by both complex yield loci and kinematic-isotropic hardening models. A first part of this paper describes the main concepts of the 'Stress-strain interpolation' model that has been implemented in the non-linear finite element code Lagamine. This model consists of a local description of the yield locus based on the texture of the material through the full constraints Taylor's model. The texture evolution due to plastic deformations is computed throughout the FEM simulations. This 'local yield locus' approach was initially linked to the classical isotropic Swift hardening law. Recently, a more complex hardening model was implemented: the physically-based microstructural model of Teodosiu. It takes into account intergranular heterogeneity due to the evolution of dislocation structures, that affects isotropic and kinematic hardening. The influence of the hardening model is compared to the influence of the texture evolution thanks to deep drawing simulations.

  16. Facebook's personal page modelling and simulation

    Science.gov (United States)

    Sarlis, Apostolos S.; Sakas, Damianos P.; Vlachos, D. S.

    2015-02-01

    In this paper we will try to define the utility of Facebook's Personal Page marketing method. This tool that Facebook provides is modelled and simulated using iThink in the context of a Facebook marketing agency. The paper leverages the system dynamics paradigm to model Facebook marketing tools and methods, using the iThink™ system to implement them. It uses the design science research methodology for the proof of concept of the models and modelling processes. The model has been developed for a social media marketing agent/company, oriented to the Facebook platform and tested in real circumstances. The model was finalized through a number of revisions and iterations of the design, development, simulation, testing and evaluation processes. The validity and usefulness of this Facebook marketing model for day-to-day decision making are authenticated by the management of the company. Facebook's Personal Page method can be adjusted, depending on the situation, in order to maximize the total profit of the company, which is to bring in new customers, keep the interest of old customers, and deliver traffic to its website.

  17. Extreme value modelling of Ghana stock exchange index.

    Science.gov (United States)

    Nortey, Ezekiel N N; Asare, Kwabena; Mettle, Felix Okoe

    2015-01-01

    Modelling of extreme events has always been of interest in fields such as hydrology and meteorology. However, after the recent global financial crises, appropriate models for modelling of such rare events leading to these crises have become quite essential in the finance and risk management fields. This paper models the extreme values of the Ghana stock exchange all-shares index (2000-2010) by applying the extreme value theory (EVT) to fit a model to the tails of the daily stock returns data. A conditional approach of the EVT was preferred, and hence an ARMA-GARCH model was fitted to the data to correct for the effects of autocorrelation and conditional heteroscedastic terms present in the returns series, before the EVT method was applied. The Peak Over Threshold approach of the EVT, which fits a Generalized Pareto Distribution (GPD) model to excesses above a certain selected threshold, was employed. Maximum likelihood estimates of the model parameters were obtained and the model's goodness of fit was assessed graphically using Q-Q, P-P and density plots. The findings indicate that the GPD provides an adequate fit to the data of excesses. The sizes of the extreme daily Ghanaian stock market movements were then computed using the value at risk and expected shortfall risk measures at some high quantiles, based on the fitted GPD model.
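    The Peak Over Threshold calculation described above is straightforward to sketch with scipy. The formulas below are the standard GPD-based VaR and expected shortfall estimators (valid for shape ξ < 1 and ξ ≠ 0); the threshold choice and the ARMA-GARCH pre-filtering step reported in the paper are omitted.

```python
import numpy as np
from scipy.stats import genpareto

def pot_var_es(losses, u, q=0.99):
    """VaR and ES at quantile q from a GPD fitted to excesses over threshold u."""
    exc = losses[losses > u] - u
    xi, _, beta = genpareto.fit(exc, floc=0.0)    # shape, loc (fixed at 0), scale
    n, n_u = len(losses), len(exc)
    # Standard POT estimators (require 0 < xi < 1 for ES to be finite):
    var = u + beta / xi * (((1 - q) * n / n_u) ** (-xi) - 1.0)
    es = var / (1 - xi) + (beta - xi * u) / (1 - xi)
    return var, es
```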

  18. Theory, modeling and simulation: Annual report 1993

    Energy Technology Data Exchange (ETDEWEB)

    Dunning, T.H. Jr.; Garrett, B.C.

    1994-07-01

    Developing the knowledge base needed to address the environmental restoration issues of the US Department of Energy requires a fundamental understanding of molecules and their interactions in isolation and in liquids, on surfaces, and at interfaces. To meet these needs, the PNL has established the Environmental and Molecular Sciences Laboratory (EMSL) and will soon begin construction of a new, collaborative research facility devoted to advancing the understanding of environmental molecular science. Research in the Theory, Modeling, and Simulation program (TMS), which is one of seven research directorates in the EMSL, will play a critical role in understanding molecular processes important in restoring DOE's research, development and production sites, including understanding the migration and reactions of contaminants in soils and groundwater, the development of separation processes for isolation of pollutants, the development of improved materials for waste storage, understanding the enzymatic reactions involved in the biodegradation of contaminants, and understanding the interaction of hazardous chemicals with living organisms. The research objectives of the TMS program are to apply available techniques to study fundamental molecular processes involved in natural and contaminated systems; to extend current techniques to treat molecular systems of future importance and to develop techniques for addressing problems that are computationally intractable at present; to apply molecular modeling techniques to simulate molecular processes occurring in the multispecies, multiphase systems characteristic of natural and polluted environments; and to extend current molecular modeling techniques to treat complex molecular systems and to improve the reliability and accuracy of such simulations. The program contains three research activities: Molecular Theory/Modeling, Solid State Theory, and Biomolecular Modeling/Simulation. Extended abstracts are presented for 89 studies.

  19. Theory, modeling and simulation: Annual report 1993

    International Nuclear Information System (INIS)

    Dunning, T.H. Jr.; Garrett, B.C.

    1994-07-01

    Developing the knowledge base needed to address the environmental restoration issues of the US Department of Energy requires a fundamental understanding of molecules and their interactions in isolation and in liquids, on surfaces, and at interfaces. To meet these needs, the PNL has established the Environmental and Molecular Sciences Laboratory (EMSL) and will soon begin construction of a new, collaborative research facility devoted to advancing the understanding of environmental molecular science. Research in the Theory, Modeling, and Simulation program (TMS), which is one of seven research directorates in the EMSL, will play a critical role in understanding molecular processes important in restoring DOE's research, development and production sites, including understanding the migration and reactions of contaminants in soils and groundwater, the development of separation processes for isolation of pollutants, the development of improved materials for waste storage, understanding the enzymatic reactions involved in the biodegradation of contaminants, and understanding the interaction of hazardous chemicals with living organisms. The research objectives of the TMS program are to apply available techniques to study fundamental molecular processes involved in natural and contaminated systems; to extend current techniques to treat molecular systems of future importance and to develop techniques for addressing problems that are computationally intractable at present; to apply molecular modeling techniques to simulate molecular processes occurring in the multispecies, multiphase systems characteristic of natural and polluted environments; and to extend current molecular modeling techniques to treat complex molecular systems and to improve the reliability and accuracy of such simulations. The program contains three research activities: Molecular Theory/Modeling, Solid State Theory, and Biomolecular Modeling/Simulation. Extended abstracts are presented for 89 studies.

  20. Multiple Regression and Mediator Variables can be used to Avoid Double Counting when Economic Values are Derived using Stochastic Herd Simulation

    DEFF Research Database (Denmark)

    Østergaard, Søren; Ettema, Jehan Frans; Hjortø, Line

    Multiple regression and model building with mediator variables were addressed to avoid double counting when economic values are estimated from data simulated with herd simulation modeling (using the SimHerd model). The simulated incidence of metritis was analyzed statistically as the independent variable, while using the traits representing the direct effects of metritis on yield, fertility and occurrence of other diseases as mediator variables. The economic value of metritis was estimated to be €78 per 100 cow-years for each 1% increase of metritis in the period of 1-100 days in milk in multiparous cows. The merit of using this approach was demonstrated, since the economic value of metritis was estimated to be 81% higher when no mediator variables were included in the multiple regression analysis.
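    A minimal sketch of the regression idea: with mediator traits held fixed, the metritis coefficient captures only the direct effect, avoiding double counting. The DataFrame layout, column names and the statsmodels workflow are assumptions for illustration, not the authors' SimHerd pipeline.

```python
import statsmodels.api as sm

def economic_value(df, mediators=()):
    """Regress simulated herd profit on metritis incidence.

    df is assumed to hold one row per simulated herd-year with columns
    'profit', 'metritis' and the mediator columns (hypothetical names).
    """
    X = sm.add_constant(df[["metritis", *mediators]])
    fit = sm.OLS(df["profit"], X).fit()
    return fit.params["metritis"]    # economic value per 1% metritis increase

# Without mediators the coefficient also absorbs indirect effects
# (double counting); with mediators it isolates the direct effect:
# total  = economic_value(df)
# direct = economic_value(df, mediators=("yield", "fertility"))
```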

  1. A matrix model for valuing anesthesia service with the resource-based relative value system.

    Science.gov (United States)

    Sinclair, David R; Lubarsky, David A; Vigoda, Michael M; Birnbach, David J; Harris, Eric A; Behrens, Vicente; Bazan, Richard E; Williams, Steve M; Arheart, Kristopher; Candiotti, Keith A

    2014-01-01

    The purpose of this study was to propose a new crosswalk using the resource-based relative value system (RBRVS) that preserves the time unit component of the anesthesia service and disaggregates anesthesia billing into component parts (preoperative evaluation, intraoperative management, and postoperative evaluation). The study was designed as an observational chart and billing data review of current and proposed payments, in the setting of a preoperative holding area, intraoperative suite, and post-anesthesia care unit. In total, 1,195 charts of American Society of Anesthesiology (ASA) physical status 1 through 5 patients were reviewed. No direct patient interventions were undertaken. Spearman correlations between the proposed RBRVS billing matrix payments and the current ASA relative value guide methodology payments were strong (r=0.94-0.96). The proposed billing matrix yielded payments that were 3.0%±1.34% less than would have been expected from commercial insurers, using standard rates for commercial ASA relative value units and RBRVS relative value units. Compared with current Medicare reimbursement under the ASA relative value guide, reimbursement would almost double when converting to an RBRVS billing model. The greatest increases in Medicare reimbursement between the current system and proposed billing model occurred as anesthetic management complexity increased. The new crosswalk correlates with existing evaluation and management and intensive care medicine codes in an essentially revenue-neutral manner when applied to the market-based rates of commercial insurers. The new system more highly values delivery of care to more complex patients undergoing more complex surgery and better represents the true value of anesthetic case management.

  2. Assessing the potential value for an automated dairy cattle body condition scoring system through stochastic simulation

    NARCIS (Netherlands)

    Bewley, J.M.; Boehlje, M.D.; Gray, A.W.; Hogeveen, H.; Kenyon, S.J.; Eicher, S.D.; Schutz, M.M.

    2010-01-01

    Purpose – The purpose of this paper is to develop a dynamic, stochastic, mechanistic simulation model of a dairy business to evaluate the cost and benefit streams coinciding with technology investments. The model was constructed to embody the biological and economical complexities of a dairy farm

  3. NUMERICAL MODEL APPLICATION IN ROWING SIMULATOR DESIGN

    Directory of Open Access Journals (Sweden)

    Petr Chmátal

    2016-04-01

    The aim of the research was to carry out a hydraulic design of a rowing/sculling and paddling simulator. Nowadays there are two main approaches in simulator design. The first one uses static water with no artificial movement and counts on specially cut oars to provide the same resistance in the water. The second approach, on the other hand, uses pumps or similar devices to force the water to circulate, but both designs share many problems. These problems affect already-built facilities and can be summarized as an unrealistic feel, unwanted turbulent flow and a poor velocity profile. Therefore, the goal was to design a new rowing simulator that would provide nature-like conditions for the racers and an unmatched experience. In order to accomplish this challenge, it was decided to use in-depth numerical modeling to solve the hydraulic problems. The general measures for the design were taken in accordance with the space availability of the simulator's housing. The entire research was coordinated with other stages of the construction using BIM. The detailed geometry was designed using a numerical model in Ansys Fluent and parametric auto-optimization tools, which led to minimal negative hydraulic phenomena and decreased investment and operational costs due to the decreased hydraulic losses in the system.

  4. eShopper modeling and simulation

    Science.gov (United States)

    Petrushin, Valery A.

    2001-03-01

    The advent of e-commerce gives an opportunity to shift the paradigm of customer communication into a highly interactive mode. The new generation of commercial Web servers, such as the Blue Martini's server, combines the collection of data on a customer behavior with real-time processing and dynamic tailoring of a feedback page. The new opportunities for direct product marketing and cross selling are arriving. The key problem is what kind of information do we need to achieve these goals, or in other words, how do we model the customer? The paper is devoted to customer modeling and simulation. The focus is on modeling an individual customer. The model is based on the customer's transaction data, click stream data, and demographics. The model includes the hierarchical profile of a customer's preferences to different types of products and brands; consumption models for the different types of products; the current focus, trends, and stochastic models for time intervals between purchases; product affinity models; and some generalized features, such as purchasing power, sensitivity to advertising, price sensitivity, etc. This type of model is used for predicting the date of the next visit, overall spending, and spending for different types of products and brands. For some type of stores (for example, a supermarket) and stable customers, it is possible to forecast the shopping lists rather accurately. The forecasting techniques are discussed. The forecasting results can be used for on-line direct marketing, customer retention, and inventory management. The customer model can also be used as a generative model for simulating the customer's purchasing behavior in different situations and for estimating customer's features.

  5. The Unfolding of Value Sources During Online Business Model Transformation

    Directory of Open Access Journals (Sweden)

    Nadja Hoßbach

    2016-12-01

    Purpose: In the magazine publishing industry, viable online business models are still rare to absent. To prepare for the ‘digital future’ and safeguard their long-term survival, many publishers are currently in the process of transforming their online business model. Against this backdrop, this study aims to develop a deeper understanding of (1) how the different building blocks of an online business model are transformed over time and (2) how sources of value creation unfold during this transformation process. Methodology: To answer our research question, we conducted a longitudinal case study with a leading German business magazine publisher (called BIZ). Data was triangulated from multiple sources including interviews, internal documents, and direct observations. Findings: Based on our case study, we find that BIZ used the transformation process to differentiate its online business model from its traditional print business model along several dimensions, and that BIZ's online business model changed from an efficiency- to a complementarity- to a novelty-based model during this process. Research implications: Our findings suggest that different business model transformation phases relate to different value sources, questioning the appropriateness of value source-based approaches for classifying business models. Practical implications: The results of our case study highlight the need for online-offline business model differentiation and point to the important distinction between service and product differentiation. Originality: Our study contributes to the business model literature by applying a dynamic and holistic perspective on the link between online business model changes and unfolding value sources.

  6. Classification of customer lifetime value models using Markov chain

    Science.gov (United States)

    Permana, Dony; Pasaribu, Udjianna S.; Indratno, Sapto W.; Suprayogi

    2017-10-01

    A firm's potential reward in future time from a customer can be determined by the customer lifetime value (CLV). There are several mathematical methods to calculate it. One method uses a Markov chain stochastic model. Here, a customer is assumed to move through a set of states, and transitions between the states have the Markov property. Given the states of a customer and the relationships between the states, Markov models can be built that describe the properties of the customer. In these Markov models, CLV is defined as a vector that contains the CLV for a customer in the first state. In this paper we present a classification of Markov models for calculating CLV. Starting from a two-state customer model, we develop models with more states, where each development is motivated by weaknesses of the previous model. The final models can be expected to describe the real characteristics of customers in a firm.
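    For a discount factor d < 1, the Markov-chain CLV has a compact closed form, v = (I - dP)^{-1} r, which the sketch below implements alongside a finite-horizon variant. The two-state example numbers are purely illustrative.

```python
import numpy as np

def clv(P, r, d, horizon=None):
    """CLV vector for a customer starting in each state.

    P: state transition matrix, r: expected per-period reward per state,
    d: one-period discount factor. With no horizon, uses the closed form
    of the infinite geometric series (valid for d < 1).
    """
    P, r = np.asarray(P, float), np.asarray(r, float)
    if horizon is None:
        return np.linalg.solve(np.eye(len(r)) - d * P, r)
    v = np.zeros_like(r)
    for _ in range(horizon):      # v <- r + d P v, repeated `horizon` times
        v = r + d * P @ v
    return v

# Two-state example (active / inactive), illustrative numbers only:
P = [[0.8, 0.2], [0.3, 0.7]]
r = [100.0, 0.0]
print(clv(P, r, d=0.9))
```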

  7. Macro Level Simulation Model Of Space Shuttle Processing

    Science.gov (United States)

    2000-01-01

    The contents include: 1) Space Shuttle Processing Simulation Model; 2) Knowledge Acquisition; 3) Simulation Input Analysis; 4) Model Applications in Current Shuttle Environment; and 5) Model Applications for Future Reusable Launch Vehicles (RLV's). This paper is presented in viewgraph form.

  8. High-Fidelity Roadway Modeling and Simulation

    Science.gov (United States)

    Wang, Jie; Papelis, Yiannis; Shen, Yuzhong; Unal, Ozhan; Cetin, Mecit

    2010-01-01

    Roads are an essential feature in our daily lives. With the advances in computing technologies, 2D and 3D road models are employed in many applications, such as computer games and virtual environments. Traditional road models were generated manually by professional artists using modeling software tools such as Maya and 3ds Max. This approach requires both highly specialized and sophisticated skills and massive manual labor. Automatic road generation based on procedural modeling can create road models using specially designed computer algorithms or procedures, dramatically reducing the tedious manual editing needed for road modeling. But most existing procedural modeling methods for road generation put emphasis on the visual effects of the generated roads, not on geometrical and architectural fidelity. This limitation seriously restricts the applicability of the generated road models. To address this problem, this paper proposes a high-fidelity roadway generation method that takes into account road design principles practiced by civil engineering professionals; as a result, the generated roads can support not only general applications such as games and simulations, in which roads are used as 3D assets, but also demanding civil engineering applications, which require accurate geometrical models of roads. The inputs to the proposed method include road specifications, civil engineering road design rules, terrain information, and the surrounding environment. The proposed method then generates, in real time, 3D roads that have both high visual and geometrical fidelity. This paper discusses in detail the procedures that convert 2D roads specified in shape files into 3D roads, and the civil engineering road design principles involved. The proposed method can be used in many applications that have stringent requirements on high-precision 3D models, such as driving simulations and road design prototyping. Preliminary results demonstrate the effectiveness of the proposed method.

  9. Difficulties with True Interoperability in Modeling & Simulation

    Science.gov (United States)

    2011-12-01

    Standards in M&S cover multiple layers of technical abstraction. There are middleware specifications, such as the High Level Architecture (HLA), standardized as IEEE 1516-2010, 'IEEE Standard for Modeling and Simulation (M&S) High Level Architecture (HLA) - Framework and Rules' (IEEE Xplore Digital Library, 2010), which address simulations using different communication protocols being able to allow data...

  10. Agent Based Modelling for Social Simulation

    OpenAIRE

    Smit, S.K.; Ubink, E.M.; Vecht, B. van der; Langley, D.J.

    2013-01-01

    This document is the result of an exploratory project looking into the status of, and opportunities for Agent Based Modelling (ABM) at TNO. The project focussed on ABM applications containing social interactions and human factors, which we termed ABM for social simulation (ABM4SS). During the course of this project two workshops were organized. At these workshops, a wide range of experts, both ABM experts and domain experts, worked on several potential applications of ABM. The results and ins...

  11. A Novel Mean-Value Model of the Cardiovascular System Including a Left Ventricular Assist Device.

    Science.gov (United States)

    Ochsner, Gregor; Amacher, Raffael; Schmid Daners, Marianne

    2017-06-01

    Time-varying elastance models (TVEMs) are often used for simulation studies of the cardiovascular system with a left ventricular assist device (LVAD). Because these models are computationally expensive, they cannot be used for long-term simulation studies. In addition, their equilibria are periodic solutions, which prevent the extraction of a linear time-invariant model that could be used e.g. for the design of a physiological controller. In the current paper, we present a new type of model to overcome these problems: the mean-value model (MVM). The MVM captures the behavior of the cardiovascular system by representative mean values that do not change within the cardiac cycle. For this purpose, each time-varying element is manually converted to its mean-value counterpart. We compare the derived MVM to a similar TVEM in two simulation experiments. In both cases, the MVM is able to fully capture the inter-cycle dynamics of the TVEM. We hope that the new MVM will become a useful tool for researchers working on physiological control algorithms. This paper provides a plant model that enables for the first time the use of tools from classical control theory in the field of physiological LVAD control.

  12. Establishment of virtual three-dimensional model for intravascular interventional devices and its clinical value

    International Nuclear Information System (INIS)

    Wei Xin; Zhong Liming; Xie Xiaodong; Wang Chaohua; You Jian; Hu Hong; Hu Kongqiong; Zhao Xiaowei

    2012-01-01

    Objective: To explore a virtual three-dimensional (3D) model for intravascular interventional devices, the method of preoperative simulation and its value in clinical work. Methods: The virtual models, including catheter, guide wire, stent and coil, were established by using the 3D modelling software 3D Studio MAX R3. The interventional preoperative simulation was performed on a personal computer for 21 patients undergoing cerebral aneurysm embolization (anterior communicating artery 5, posterior communicating artery 10, middle cerebral artery 3, internal carotid artery 2, and vertebral artery 1); during the interventional procedures, the surgeon relied on the simulation results for shaping the micro-guide wire and catheter and for the release of micro-coils and stents. Results: (1) All the virtual instruments and the real instruments had similar shapes; the overall time for constructing the virtual models was about 20 hours. The preoperative simulation took 50 to 80 minutes. (2) The simulation results of catheter insertion in 18 cases had relevant value for guiding the micro-catheter and molding the micro-guide wire tip, and shortened the operating time. For embolization, the simulation results of coil filling and stent release were similar to the surgical results in 76% of the patients (16/21). (3) For teaching and training, 93% (38/41) of doctors in training believed that preoperative simulation facilitated the understanding of surgery. Conclusions: The method of building virtual models of intravascular interventional devices is reliable. The preoperative simulation results could be used to guide practical clinical operations with a relatively high degree of similarity, and could play a role in promoting research on interventional virtual operations. (authors)

  13. Modeling and simulation of normal and hemiparetic gait

    Science.gov (United States)

    Luengas, Lely A.; Camargo, Esperanza; Sanchez, Giovanni

    2015-09-01

    Gait is the collective term for the two types of bipedal locomotion, walking and running. This paper is focused on walking. The analysis of human gait is of interest to many different disciplines, including biomechanics, human-movement science, rehabilitation and medicine in general. Here we present a new model that is capable of reproducing the properties of walking, both normal and pathological. The aim of this paper is to establish the biomechanical principles that underlie human walking by using the Lagrange method. Constraint forces derived from a Rayleigh dissipation function, through which the effect of gait on the tissues is considered, are included. Depending on the value of the factor appearing in the Rayleigh dissipation function, both normal and pathological gait can be simulated. First of all, we apply the model to normal gait and then to permanent hemiparetic gait. Anthropometric data of an adult person are used in the simulation; it is also possible to use anthropometric data for children, but in that case existing tables of anthropometric data must be consulted. Validation of the model includes simulations of passive dynamic gait on level ground. The dynamic walking approach provides a new perspective on gait analysis, focusing on the kinematics and kinetics of gait. There have been studies and simulations of normal human gait, but few of them have focused on abnormal, especially hemiparetic, gait. Quantitative comparisons of the model predictions with gait measurements show that the model can reproduce the significant characteristics of normal gait.
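    The full Lagrangian gait model is not reproduced in the abstract; as a toy illustration of how a Rayleigh dissipation factor can switch between normal and pathological behavior, the sketch below treats the swing leg as a damped pendulum. The dynamics and every parameter value are assumptions for illustration only.

```python
import numpy as np
from scipy.integrate import solve_ivp

def swing_leg(theta0, b, m=8.0, L=0.9, g=9.81, t_end=1.5):
    """Swing phase of one leg modeled as a damped pendulum (toy model).

    The damping coefficient b plays the role of the factor in the
    Rayleigh dissipation function: a larger b mimics increased passive
    resistance of hemiparetic tissue (hypothetical values throughout).
    """
    def rhs(t, y):
        th, om = y
        return [om, -(g / L) * np.sin(th) - (b / (m * L**2)) * om]
    return solve_ivp(rhs, (0.0, t_end), [theta0, 0.0], dense_output=True)

normal = swing_leg(np.deg2rad(25), b=0.5)
paretic = swing_leg(np.deg2rad(25), b=3.0)   # slower, more damped swing
```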

  14. Implementing a Value Creation Model in a Startup

    OpenAIRE

    Guillaume Marceau

    2014-01-01

    In this article, we propose a value creation model based on the principle of the value chain in corporate management. We particularly endeavor to show the impact of an appropriate allocation of a company's resources on its profitability, by distinguishing, on the one hand, the activities that are directly profitable and, on the other hand, those which have a support function. This distinction is applied to the study of a services company in computer engineering, in terms of internal balance and pote...

  15. A Model for Sustainable Value Creation in Supply Chain

    OpenAIRE

    KORDİTABAR, Seyed Behzad

    2015-01-01

    Abstract. In order to survive, every company needs to achieve sustainable profitability, which is impossible unless there is sustainable value creation. Given that sustainability is closely related to concepts of supply chain management, the present paper intends to propose, through a conceptual theorization approach, a new comprehensive model drawing on the concepts of value creation and sustainability from the perspective of the supply chain, specifying the dimensions contributing to s...

  16. The Thomas-Fermi model: momentum expectation values

    International Nuclear Information System (INIS)

    Dmitrieva, I.K.; Plindov, G.I.

    1983-01-01

    Within the Thomas-Fermi model, including the exchange interaction and the contributions of strongly bound electrons, analytical expressions are obtained for all momentum expectation values and for some expectation values of powers of the electron density, for an atom with an arbitrary degree of ionization. It is shown that a correct treatment of strongly bound electrons gives a quantitative estimate of the expectation values; the expansion coefficients are given as explicit functions of the electron number

  17. TRUST MODEL FOR SOCIAL NETWORK USING SINGULAR VALUE DECOMPOSITION

    Directory of Open Access Journals (Sweden)

    Davis Bundi Ntwiga

    2016-06-01

    For effective interactions to take place in a social network, trust is important. We model the trust of agents using the peer-to-peer reputation ratings in the network, which form a real-valued matrix. Singular value decomposition discounts the reputation ratings to estimate trust levels, since trust is the subjective probability of future expectations based on current reputation ratings. Reputation and trust are closely related, and singular value decomposition can estimate trust from the real-valued matrix of the reputation ratings of the agents in the network. Singular value decomposition is an ideal technique for error elimination when estimating trust from reputation ratings. Trust estimation from reputation is optimal at a discounting of 20%.
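    One simple way to realize the SVD-based discounting idea is a rank truncation that keeps the leading share of the singular-value spectrum; keeping 80% corresponds to the 20% discounting mentioned above. The exact procedure in the paper may differ, and ratings are assumed to lie in [0, 1].

```python
import numpy as np

def svd_trust(R, energy=0.8):
    """Rank-truncated reconstruction of a peer-to-peer reputation matrix.

    Keeping the singular values carrying `energy` of the spectrum (here 80%)
    discounts the remaining 20%, smoothing noisy ratings into trust estimates.
    """
    U, s, Vt = np.linalg.svd(R, full_matrices=False)
    k = np.searchsorted(np.cumsum(s) / s.sum(), energy) + 1   # at least rank 1
    T = (U[:, :k] * s[:k]) @ Vt[:k]    # low-rank trust estimate
    return np.clip(T, 0.0, 1.0)       # assumes ratings scaled to [0, 1]
```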

  18. Modelling interplanetary CMEs using magnetohydrodynamic simulations

    Directory of Open Access Journals (Sweden)

    P. J. Cargill

    The dynamics of Interplanetary Coronal Mass Ejections (ICMEs) are discussed from the viewpoint of numerical modelling. Hydrodynamic models are shown to give a good zero-order picture of the plasma properties of ICMEs, but they cannot model the important magnetic field effects. Results from MHD simulations are shown for a number of cases of interest. It is demonstrated that the strong interaction of the ICME with the solar wind leads to the ICME and solar wind velocities being close to each other at 1 AU, despite their having very different speeds near the Sun. It is also pointed out that this interaction leads to a distortion of the ICME geometry, making cylindrical symmetry a dubious assumption for the CME field at 1 AU. In the presence of a significant solar wind magnetic field, the magnetic fields of the ICME and solar wind can reconnect with each other, leading to an ICME that has solar wind-like field lines. This effect is especially important when an ICME with the right sense of rotation propagates down the heliospheric current sheet. It is also noted that a lack of knowledge of the coronal magnetic field makes such simulations of little use in space weather forecasts that require knowledge of the ICME magnetic field strength.

    Key words. Interplanetary physics (interplanetary magnetic fields); Solar physics, astrophysics, and astronomy (flares and mass ejections); Space plasma physics (numerical simulation studies)

  19. Interactive Modelling and Simulation of Human Motion

    DEFF Research Database (Denmark)

    Engell-Nørregård, Morten Pol

    human joints, which exhibit both non-convexity and several degrees of freedom • A general and versatile model for the activation of soft bodies. The model can be used as an animation tool, but is equally well suited to simulating human muscles, since it satisfies the fundamental physical principles...... Danish summary: This Ph.D. thesis deals with the modelling and simulation of human motion. The topics of this thesis have at least two things in common. First, they deal with human motion. Although the models developed can also be used for other purposes, the... primary focus is on modelling the human body. Second, they all deal with simulation as a tool for synthesizing motion and thereby creating animations. This is an important point, since it means that we are not only creating tools for animators, which they can use to make funny...

  20. On Improving 4-km Mesoscale Model Simulations

    Science.gov (United States)

    Deng, Aijun; Stauffer, David R.

    2006-03-01

    A previous study showed that use of analysis-nudging four-dimensional data assimilation (FDDA) and improved physics in the fifth-generation Pennsylvania State University-National Center for Atmospheric Research Mesoscale Model (MM5) produced the best overall performance on a 12-km-domain simulation, based on the 18-19 September 1983 Cross-Appalachian Tracer Experiment (CAPTEX) case. However, reducing the simulated grid length to 4 km had detrimental effects. The primary cause was likely the explicit representation of convection accompanying a cold-frontal system. Because no convective parameterization scheme (CPS) was used, the convective updrafts were forced on coarser-than-realistic scales, and the rainfall and the atmospheric response to the convection were too strong. The evaporative cooling and downdrafts were too vigorous, causing widespread disruption of the low-level winds and spurious advection of the simulated tracer. In this study, a series of experiments was designed to address this general problem involving 4-km model precipitation and gridpoint storms and associated model sensitivities to the use of FDDA, planetary boundary layer (PBL) turbulence physics, grid-explicit microphysics, a CPS, and enhanced horizontal diffusion. Some of the conclusions include the following: 1) Enhanced parameterized vertical mixing in the turbulent kinetic energy (TKE) turbulence scheme has shown marked improvements in the simulated fields. 2) Use of a CPS on the 4-km grid improved the precipitation and low-level wind results. 3) Use of the Hong and Pan Medium-Range Forecast PBL scheme showed larger model errors within the PBL and a clear tendency to predict much deeper PBL heights than the TKE scheme. 4) Combining observation-nudging FDDA with a CPS produced the best overall simulations. 5) Finer horizontal resolution does not always produce better simulations, especially in convectively unstable environments, and a new CPS suitable for 4-km resolution is needed. 6

  1. Reactive transport models and simulation with ALLIANCES

    International Nuclear Information System (INIS)

    Leterrier, N.; Deville, E.; Bary, B.; Trotignon, L.; Hedde, T.; Cochepin, B.; Stora, E.

    2009-01-01

    Many chemical processes influence the evolution of nuclear waste storage. As a result, simulations based only upon transport and hydraulic processes fail to describe adequately some industrial scenarios. We need to take into account complex chemical models (mass action laws, kinetics...) which are highly non-linear. In order to simulate the coupling of these chemical reactions with transport, we use a classical Sequential Iterative Approach (SIA), with a fixed-point algorithm, within the framework of the ALLIANCES platform. This approach allows us to use the various transport and chemical modules available in ALLIANCES, via an operator-splitting method based upon the structure of the chemical system. We present five different applications of reactive transport simulations in the context of nuclear waste storage: 1. A 2D simulation of the lixiviation by rain water of an underground polluted zone high in uranium oxide; 2. The degradation of the steel envelope of a package in contact with clay. Corrosion of the steel creates corrosion products and the altered package becomes a porous medium. We follow the degradation front through kinetic reactions and the coupling with transport; 3. The degradation of a cement-based material by the injection of an aqueous solution of zinc and sulphate ions. In addition to the reactive transport coupling, we take into account in this case the hydraulic feedback of the porosity variation on the Darcy velocity; 4. The decalcification of a concrete beam in an underground storage structure. In this case, in addition to the reactive transport simulation, we take into account the interaction between chemical degradation and the mechanical forces (cracks...), and the retroactive influence of the structural changes on transport; 5. The degradation of the steel envelope of a package in contact with a clay material under a temperature gradient. In this case the reactive transport simulation is entirely directed by the temperature changes and
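    The Sequential Iterative Approach with a fixed-point algorithm mentioned above can be sketched as follows; `transport` and `chemistry` stand in for the ALLIANCES modules and are hypothetical callables mapping a concentration vector to a concentration vector.

```python
import numpy as np

def sia_step(c, transport, chemistry, tol=1e-8, max_iter=50):
    """One time step of a Sequential Iterative Approach (operator splitting).

    The transport and chemistry operators are applied alternately until
    the concentrations reach a fixed point (illustrative signatures only).
    """
    c_old = np.asarray(c, float)
    for _ in range(max_iter):
        c_new = chemistry(transport(c_old))
        if np.max(np.abs(c_new - c_old)) < tol:
            return c_new                       # converged fixed point
        c_old = c_new
    raise RuntimeError("SIA fixed-point iteration did not converge")
```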

  2. Optimization of Value at Risk for Mutual Funds with the Historical Simulation Method and Its Application Using a Matlab GUI

    OpenAIRE

    Monica, Christa; Tarno, Tarno; Yasin, Hasbi

    2016-01-01

    Value at Risk (VaR) is a method used to measure the financial risk of a firm or investment portfolio over a specific time period at a certain confidence level. Historical Simulation is used in this research to compute the VaR of a stock mutual fund at the 5% confidence level, with a one-day time period and a startup investment fund of Rp 100.000.000,00. Historical Simulation is a non-parametric method whose formula does not require any distributional assumption. Portfolio optimization is done by calculating...
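    Historical-simulation VaR requires no distributional assumption: it is simply a quantile of the historical return distribution scaled by the invested amount. A minimal numpy version consistent with the setup described (one-day horizon, 5% level, Rp 100,000,000 invested):

```python
import numpy as np

def historical_var(returns, investment=100_000_000, alpha=0.05):
    """One-day Value at Risk by historical simulation (non-parametric).

    returns: array of historical one-day returns; alpha: tail probability.
    """
    q = np.percentile(returns, 100 * alpha)   # e.g. the 5th percentile return
    return -q * investment                    # reported as a positive loss
```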

  3. A Continuous-Time Model for Valuing Foreign Exchange Options

    Directory of Open Access Journals (Sweden)

    James J. Kung

    2013-01-01

    This paper makes use of stochastic calculus to develop a continuous-time model for valuing European options on foreign exchange (FX) when both domestic and foreign spot rates follow a generalized Wiener process. Using the dollar/euro exchange rate as input for parameter estimation and employing our FX option model as a yardstick, we find that the traditional Garman-Kohlhagen FX option model, which assumes constant spot rates, incorrectly values calls and puts for different values of the ratio of exchange rate to exercise price. Specifically, it undervalues calls when the ratio is between 0.70 and 1.08, and it overvalues calls when the ratio is between 1.18 and 1.30, whereas it overvalues puts when the ratio is between 0.70 and 0.82, and it undervalues puts when the ratio is between 0.86 and 1.30.
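    For reference, the constant-rate Garman-Kohlhagen model that the paper uses as its benchmark has the standard closed form below (domestic rate rd, foreign rate rf); this is the textbook formula, not the paper's generalized Wiener-process model.

```python
import numpy as np
from scipy.stats import norm

def garman_kohlhagen(S, K, T, rd, rf, sigma, call=True):
    """Garman-Kohlhagen value of a European FX option.

    S: spot exchange rate, K: strike, T: maturity in years,
    rd/rf: domestic/foreign interest rates, sigma: FX-rate volatility.
    """
    d1 = (np.log(S / K) + (rd - rf + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    if call:
        return S * np.exp(-rf * T) * norm.cdf(d1) - K * np.exp(-rd * T) * norm.cdf(d2)
    return K * np.exp(-rd * T) * norm.cdf(-d2) - S * np.exp(-rf * T) * norm.cdf(-d1)
```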

  4. Computer Models Simulate Fine Particle Dispersion

    Science.gov (United States)

    2010-01-01

    Through a NASA Seed Fund partnership with DEM Solutions Inc., of Lebanon, New Hampshire, scientists at Kennedy Space Center refined existing software to study the electrostatic phenomena of granular and bulk materials as they apply to planetary surfaces. The software, EDEM, allows users to import particles and obtain accurate representations of their shapes for modeling purposes, such as simulating bulk solids behavior, and was enhanced to be able to more accurately model fine, abrasive, cohesive particles. These new EDEM capabilities can be applied in many industries unrelated to space exploration and have been adopted by several prominent U.S. companies, including John Deere, Pfizer, and Procter & Gamble.

  5. Consolidation modelling for thermoplastic composites forming simulation

    Science.gov (United States)

    Xiong, H.; Rusanov, A.; Hamila, N.; Boisse, P.

    2016-10-01

    Pre-impregnated thermoplastic composites are widely used in the aerospace industry for their excellent mechanical properties. Thermoforming thermoplastic prepregs is a fast manufacturing process, and the automotive industry has shown increasing interest in it; reconsolidation is an essential stage of this process. The model of intimate contact is investigated as the consolidation model, and compression experiments have been carried out to identify the material parameters; several numerical tests show the influence of the temperature and pressure applied during processing. Finally, a new solid-shell prismatic element is presented for the simulation of the consolidation step in the thermoplastic composites forming process.

  6. Quantification of uncertainties of modeling and simulation

    International Nuclear Information System (INIS)

    Ma Zhibo; Yin Jianwei

    2012-01-01

    The principles of Modeling and Simulation (M and S) are interpreted by a functional relation, from which the total uncertainties of M and S are identified and sorted into three parts that are considered to vary along with the conceptual models' parameters. Following the idea of verification and validation, the space of the parameters is partitioned into verified and applied domains; uncertainties in the verified domain are quantified by comparison between numerical and standard results, and those in the applied domain are quantified by a newly developed extrapolation method. Examples are presented to demonstrate and qualify the ideas, which aim to build a framework for quantifying the uncertainties of M and S. (authors)

  7. Simulation models generator. Applications in scheduling

    Directory of Open Access Journals (Sweden)

    Omar Danilo Castrillón

    2013-08-01

    Rev.Mate.Teor.Aplic. (ISSN 1409-2433) Vol. 20(2): 231-241, July 2013. ... in order to have an approach to reality for evaluating decisions and making more assertive ones. To test the prototype, a production system with 9 machines and 5 jobs in a job shop configuration was used as the modeling example; stochastic processing times and machine stoppages were tested in order to measure machine utilization rates and the average time of jobs in the system as measures of system performance. This test shows the goodness of the prototype, which saves the user the work of building the simulation model

  8. Modeling and simulation of reactive flows

    CERN Document Server

    Bortoli, De AL; Pereira, Felipe

    2015-01-01

    Modelling and Simulation of Reactive Flows presents information on modeling and how to numerically solve reactive flows. The book offers a distinctive approach that combines diffusion flames and geochemical flow problems, providing users with a comprehensive resource that bridges the gap for scientists, engineers, and the industry. Specifically, the book looks at the basic concepts related to reaction rates, chemical kinetics, and the development of reduced kinetic mechanisms. It considers the most common methods used in practical situations, along with equations for reactive flows, and va

  9. Nonlinear friction model for servo press simulation

    Science.gov (United States)

    Ma, Ninshu; Sugitomo, Nobuhiko; Kyuno, Takunori; Tamura, Shintaro; Naka, Tetsuo

    2013-12-01

    The friction coefficient was measured under an idealized condition for a pulse servo motion. The measured friction coefficient and its variation with both sliding distance and the pulse motion showed that the friction resistance can be reduced by re-lubrication during the unloading phase of the pulse servo motion. Based on the measured friction coefficient and its changes with sliding distance and oil re-lubrication, a nonlinear friction model was developed. Using the newly developed nonlinear friction model, a deep drawing simulation was performed and the formability was evaluated. The results were compared with experimental ones and the effectiveness was verified.
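    The measured decay of friction with sliding distance, and its recovery when re-lubrication occurs during unloading, can be caricatured as follows. The exponential form and every parameter value are assumptions for illustration, not the authors' fitted model.

```python
import numpy as np

def friction_coefficient(s, mu0=0.15, mu_inf=0.05, s_c=20.0):
    """Illustrative sliding-distance dependence of the friction coefficient.

    mu decays from mu0 toward mu_inf as the lubricant film thins over the
    accumulated sliding distance s (mm); all values are hypothetical.
    """
    return mu_inf + (mu0 - mu_inf) * np.exp(-s / s_c)

# Pulse servo motion: each unloading re-lubricates the interface, which a
# simulation can represent by partially resetting the sliding distance.
s = 0.0
for stroke in range(3):
    s += 15.0                           # sliding during one pulse
    print(stroke, round(friction_coefficient(s), 3))
    s *= 0.3                            # partial reset by re-lubrication
```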

  10. VAR IPP-IPC Model Simulation

    Directory of Open Access Journals (Sweden)

    Juan P. Pérez Monsalve

    2014-12-01

    This work analyzed the relationship between the two main price indices in the Colombian economy, the IPP (producer price index) and the IPC (consumer price index). For this purpose, we identified the theory behind both indices and then developed a vector autoregressive model, which shows the reaction of each variable to shocks both in itself and in the other variable, whose impact continues to propagate in the long term. Additionally, the work presents a simulation of the VAR model through the Monte Carlo method, verifying the agreement of the probability distributions and volatility levels, as well as the existence of correlation over time.
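    Monte Carlo simulation of a VAR proceeds exactly as described: draw Gaussian shocks and propagate them through the autoregression. A sketch for a bivariate VAR(1) with illustrative coefficients (not the estimated IPP-IPC parameters):

```python
import numpy as np

def simulate_var1(A, c, Sigma, n_steps, rng=None):
    """Monte Carlo path of a VAR(1): y_t = c + A y_{t-1} + e_t, e_t ~ N(0, Sigma)."""
    rng = np.random.default_rng() if rng is None else rng
    k = len(c)
    y = np.zeros((n_steps, k))
    for t in range(1, n_steps):
        y[t] = c + A @ y[t - 1] + rng.multivariate_normal(np.zeros(k), Sigma)
    return y

# Illustrative coefficients for a two-variable (IPP, IPC) system:
paths = simulate_var1(A=np.array([[0.5, 0.2], [0.1, 0.6]]),
                      c=np.array([0.1, 0.1]),
                      Sigma=0.01 * np.eye(2), n_steps=200)
```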

  11. Modeling and Application of Customer Lifetime Value in Online Retail

    Directory of Open Access Journals (Sweden)

    Pavel Jasek

    2018-01-01

    This article provides an empirical statistical analysis and discussion of the predictive abilities of selected customer lifetime value (CLV) models that could be used in online shopping within e-commerce business settings. The comparison of CLV predictive abilities, using selected evaluation metrics, is made on selected CLV models: the Extended Pareto/NBD model (EP/NBD), the Markov chain model and the Status Quo model. The article uses six online store datasets with annual revenues in the order of tens of millions of euros for the comparison. The EP/NBD model has outperformed the other selected models in a majority of evaluation metrics and can be considered good and stable for non-contractual relations in online shopping. The implications for the deployment of selected CLV models in practice, as well as suggestions for future research, are also discussed.

  12. Chrystal and Proudman resonances simulated with three numerical models

    Science.gov (United States)

    Bubalo, Maja; Janeković, Ivica; Orlić, Mirko

    2018-05-01

    The aim of this work was to study Chrystal and Proudman resonances in a simple closed basin and to explore and compare how well the two resonant mechanisms are reproduced with different, nowadays widely used, numerical ocean models. The test case was based on air pressure disturbances of two commonly used shapes (a sinusoidal and a boxcar), having various wavelengths and propagating at different speeds. Our test domain was a closed rectangular basin, 300 km long with a uniform depth of 50 m, for which the theoretical analytical solution is available as a benchmark. In total, 2,250 simulations were performed for each of the three numerical models: ADCIRC, SCHISM and ROMS. During each of the simulations, we recorded water level anomalies and computed the integral of the energy density spectrum for a number of points distributed along the basin. We have successfully documented the transition from Proudman to Chrystal resonance that occurs for a sinusoidal air pressure disturbance having a wavelength between one and two basin lengths. An inter-model comparison of the results shows that the different models represent the two resonant phenomena in slightly different ways. For Chrystal resonance, all the models showed similar behavior, with the ADCIRC model providing slightly higher values of the mean resonant period than the other two models. In the case of Proudman resonance, the most consistent results, closest to the analytical solution, were obtained using the ROMS model, which reproduced a mean resonant speed of 22.00 m/s, i.e., close to the theoretical value of 22.15 m/s. The ADCIRC and SCHISM models showed small deviations from that value, with the mean speed being slightly lower: 21.97 m/s (ADCIRC) and 21.93 m/s (SCHISM). The findings may seem small but could play an important role when resonance is a crucial process producing enhancements of up to two orders of magnitude (i.e., meteotsunamis).
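    The theoretical Proudman speed quoted above is just the shallow-water wave speed, c = sqrt(gH), which for the 50 m deep test basin reproduces the 22.15 m/s figure:

```python
import math

# Proudman resonance occurs when the pressure disturbance travels at the
# shallow-water wave speed of the basin.
g, H = 9.81, 50.0
print(math.sqrt(g * H))   # ~22.15 m/s
```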

  13. TMS modeling toolbox for realistic simulation.

    Science.gov (United States)

    Cho, Young Sun; Suh, Hyun Sang; Lee, Won Hee; Kim, Tae-Seong

    2010-01-01

    Transcranial magnetic stimulation (TMS) is a technique for brain stimulation using rapidly changing magnetic fields generated by coils. It has been established as an effective stimulation technique to treat patients suffering from damaged brain functions. Although TMS is known to be painless and noninvasive, it can also be harmful to the brain through incorrect focusing and excessive stimulation, which might result in seizure. Therefore, there is an ongoing research effort to elucidate and better understand the effect and mechanism of TMS. Lately, the boundary element method (BEM) and the finite element method (FEM) have been used to simulate the electromagnetic phenomena of TMS. However, there is a lack of general tools for generating TMS models, owing to the difficulties of realistically modeling the human head and TMS coils. In this study, we have developed a toolbox through which one can generate high-resolution FE TMS models. The toolbox allows creating FE models of the head, with isotropic and anisotropic electrical conductivities in five different tissues of the head, and of the coils in 3D. The generated TMS model can be imported into FE software packages such as ANSYS for further, efficient electromagnetic analysis. We present a set of demonstrative results of realistic simulation of TMS with our toolbox.

  14. Recommendations on Model Fidelity for Wind Turbine Gearbox Simulations: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Y.; Keller, J.; La Cava, W.; Austin, J.; Nejad, A. R.; Halse, C.; Bastard, L.; Helsen, J.

    2015-01-01

    This work investigates the minimum level of fidelity required to accurately simulate wind turbine gearboxes using state-of-the-art design tools. Excessive model fidelity, including drivetrain complexity, gearbox complexity, excitation sources, and imperfections, significantly increases computational time but may not provide a commensurate increase in the value of the results. Essential design parameters are evaluated, including the planetary load-sharing factor, gear tooth load distribution, and sun orbit motion. Based on the sensitivity study results, recommendations for minimum model fidelities are provided.

  15. Effect of different heat transfer models on HCCI engine simulation

    International Nuclear Information System (INIS)

    Neshat, Elaheh; Saray, Rahim Khoshbakhti

    2014-01-01

    Highlights: • A new multi-zone model is developed for HCCI combustion modeling. • A new heat transfer model is used for prediction of heat transfer in HCCI engines. • The model can predict engine combustion, performance and emission characteristics well. • Appropriate mass and heat transfer models lead to accurate prediction of CO, UHC and NOx. - Abstract: Heat transfer from engine walls has an important role in engine combustion, performance and emission characteristics. The main focus of this study is to offer a new relation for calculating convective heat transfer from the in-cylinder charge to the combustion chamber walls of HCCI engines, and to assess the new model against previous ones. To this end, a multi-zone model is developed for homogeneous charge compression ignition engine simulation. The model consists of four different types of zones: a core zone, a boundary layer zone, outer zones between the core and the boundary layer, and a crevice zone. Conductive heat transfer and mass transfer are considered between neighboring zones. For accurate calculation of the initial conditions at inlet valve closing, the multi-zone model is coupled with a single-zone model that simulates the gas exchange process. The Woschni, modified Woschni, Hohenberg and Annand correlations are used as convective heat transfer models, together with the new convection model developed by the authors. Comparative analyses are carried out to identify the most accurate correlation for predicting engine combustion, performance and emission characteristics over a wide range of operating conditions. The results indicate that the various heat transfer models, except for the new convective heat transfer model, lead to significant differences in the prediction of in-cylinder pressure and exhaust emissions. Using the Woschni, Chang and new models, the convective heat transfer coefficient increases sharply near top dead center
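
    For orientation, the classical Woschni correlation named above takes the form h = 3.26 B^-0.2 p^0.8 T^-0.55 w^0.8 (B in m, p in kPa, T in K, w in m/s); a minimal sketch in which the in-cylinder state and the gas-velocity term are illustrative assumptions, not values from the paper:

```python
def woschni_h(bore_m, p_kpa, T_k, w_ms):
    """Convective coefficient in W/(m^2*K), classical Woschni form."""
    return 3.26 * bore_m**-0.2 * p_kpa**0.8 * T_k**-0.55 * w_ms**0.8

# Illustrative in-cylinder state near top dead center (not from the paper)
mean_piston_speed = 8.0          # m/s
w = 2.28 * mean_piston_speed     # compression-stroke gas velocity term
print(f"h = {woschni_h(0.086, 5000.0, 900.0, w):.0f} W/m^2K")
```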

  16. Value increasing business model for e-hospital.

    Science.gov (United States)

    Null, Robert; Wei, June

    2009-01-01

    This paper developed a business value increasing model for the electronic hospital (e-hospital) based on electronic value chain analysis. From this model, 58 hospital electronic business (e-business) solutions were derived. Additionally, this paper investigated the adoption patterns of these 58 e-business solutions within six leading US hospitals. The findings show that only 36 of the 58 e-business solutions (62%) are fully or partially implemented within the six hospitals. Ultimately, the research results will be beneficial to managers and executives in accelerating the adoption of e-business in e-hospitals.

  17. Biomedical Simulation Models of Human Auditory Processes

    Science.gov (United States)

    Bicak, Mehmet M. A.

    2012-01-01

    Detailed acoustic engineering models are used to explore noise propagation mechanisms associated with noise attenuation and transmission paths created when using hearing protectors such as earplugs and headsets in high-noise environments. Biomedical finite element (FE) models are developed based on volume computed tomography scan data, which provide explicit external ear, ear canal, middle ear ossicular bone and cochlea geometry. Results from these studies have enabled a greater understanding of hearing-protector-to-flesh dynamics as well as the prioritization of noise propagation mechanisms. Prioritization of noise mechanisms can form an essential framework for the exploration of new design principles and methods in both earplug and earcup applications. These models are currently being used in the development of a novel hearing protection evaluation system that can provide experimentally correlated psychoacoustic noise attenuation. Moreover, these FE models can be used to simulate the effects of blast-related impulse noise on human auditory mechanisms and brain tissue.

  18. Application of the Value Optimization Model of Key Factors Based on DSEM

    Directory of Open Access Journals (Sweden)

    Chao Su

    2016-01-01

    Full Text Available The key factors of the damping solvent extraction method (DSEM) for the analysis of an unbounded medium are the size of the bounded domain, the artificial damping ratio, and the finite element mesh density. To control the simulation accuracy and computational efficiency of the soil-structure interaction analysis, this study establishes a value optimization model of the key factors, composed of the design variables, the objective function, and the constraint function system. The optimum values of the key factors are then obtained from the optimization model. Comparisons of the results obtained under different initial conditions show that the value optimization model of the key factors is a feasible way to govern simulation accuracy and computational efficiency and to analyze practical unbounded medium-structure interaction.

  19. Modeling and simulation of gamma camera

    International Nuclear Information System (INIS)

    Singh, B.; Kataria, S.K.; Samuel, A.M.

    2002-08-01

    Simulation techniques play a vital role in the design of sophisticated instruments and also in the training of operating and maintenance staff. Gamma camera systems have been used for functional imaging in nuclear medicine. Functional images are derived from the external counting of a gamma-emitting radioactive tracer that, after introduction into the body, mimics the behavior of a native biochemical compound. The position-sensitive detector yields the coordinates of the gamma ray interaction with the detector, which are used to estimate the point of gamma ray emission within the tracer distribution space. This advanced imaging device is thus dependent on the performance of the algorithms for coordinate computation, estimation of the point of emission, generation of the image and display of the image data. Contemporary systems also have protocols for quality control and clinical evaluation of imaging studies. Simulation of this processing leads to an understanding of the basic camera design problems. This report describes a PC-based package for the design and simulation of a gamma camera, along with options for simulating data acquisition and quality control of imaging studies. Image display and data processing, the other options implemented in SIMCAM, will be described in separate reports (under preparation). Gamma camera modeling and simulation in SIMCAM has preset configurations of the design parameters for various sizes of crystal detector, with the option to pack the PMTs on a hexagonal or square lattice. Different algorithms for the computation of coordinates and for spatial distortion removal are allowed, in addition to simulation of the energy correction circuit. The user can simulate different static, dynamic, MUGA and SPECT studies. The acquired/simulated data are processed for quality control and clinical evaluation of the imaging studies. Results show that the program can be used to assess these performances. The variations in performance parameters can also be assessed due to the induced
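
    One classical coordinate-computation algorithm that such a package can simulate is Anger logic, in which the event position is the signal-weighted centroid of the photomultiplier positions. A minimal sketch with an illustrative PMT layout and signal set (not SIMCAM code):

```python
import numpy as np

def anger_position(pmt_xy, signals):
    """Event (x, y) as the signal-weighted centroid of PMT positions."""
    s = np.asarray(signals, dtype=float)
    return (np.asarray(pmt_xy) * s[:, None]).sum(axis=0) / s.sum()

# Illustrative 2x2 patch of PMT centres (cm) and their measured signals
pmt_xy  = [(-2.5, -2.5), (2.5, -2.5), (-2.5, 2.5), (2.5, 2.5)]
signals = [10.0, 30.0, 5.0, 15.0]
print(anger_position(pmt_xy, signals))   # centroid pulled toward x > 0
```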

  20. Using Active Learning for Speeding up Calibration in Simulation Models.

    Science.gov (United States)

    Cevik, Mucahit; Ergun, Mehmet Ali; Stout, Natasha K; Trentham-Dietz, Amy; Craven, Mark; Alagoz, Oguzhan

    2016-07-01

    Most cancer simulation models include unobservable parameters that determine disease onset and tumor growth. These parameters play an important role in matching key outcomes such as cancer incidence and mortality, and their values are typically estimated via a lengthy calibration procedure, which involves evaluating a large number of combinations of parameter values via simulation. The objective of this study is to demonstrate how machine learning approaches can be used to accelerate the calibration process by reducing the number of parameter combinations that are actually evaluated. Active learning is a popular machine learning method that enables a learning algorithm such as artificial neural networks to interactively choose which parameter combinations to evaluate. We developed an active learning algorithm to expedite the calibration process. Our algorithm determines the parameter combinations that are more likely to produce desired outputs and therefore reduces the number of simulation runs performed during calibration. We demonstrate our method using the previously developed University of Wisconsin breast cancer simulation model (UWBCS). In a recent study, calibration of the UWBCS required the evaluation of 378 000 input parameter combinations to build a race-specific model, and only 69 of these combinations produced results that closely matched observed data. By using the active learning algorithm in conjunction with standard calibration methods, we identify all 69 parameter combinations by evaluating only 5620 of the 378 000 combinations. Machine learning methods hold potential in guiding model developers in the selection of more promising parameter combinations and hence speeding up the calibration process. Applying our machine learning algorithm to one model shows that evaluating only 1.49% of all parameter combinations would be sufficient for the calibration. © The Author(s) 2015.
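
    A hedged sketch of the general loop (not the authors' UWBCS code; the stand-in simulation, pool size and batch size are illustrative): a surrogate classifier is trained on already-evaluated parameter combinations, and only the combinations it scores as promising are simulated next.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)

def run_simulation(theta):
    """Stand-in for one expensive simulation run: returns True when the
    parameter combination reproduces the calibration targets."""
    return np.linalg.norm(theta - 0.7) < 0.25

pool = rng.random((20_000, 3))             # candidate parameter combinations
idx = rng.choice(len(pool), 200, replace=False)
X, y = pool[idx], np.array([run_simulation(t) for t in pool[idx]])

for _ in range(10):                        # active-learning rounds
    clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500).fit(X, y)
    scores = clf.predict_proba(pool)[:, 1]
    batch = np.argsort(scores)[-100:]      # most promising per the surrogate
    X = np.vstack([X, pool[batch]])        # (re-evaluating duplicates is
    y = np.append(y, [run_simulation(t) for t in pool[batch]])  # tolerated here)

print(f"evaluated {len(X)} of {len(pool)} combinations, matches: {int(y.sum())}")
```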

  1. Mean Value Modelling of an SI Engine with EGR

    DEFF Research Database (Denmark)

    Føns, Michael; Muller, Martin; Chevalier, Alain

    1999-01-01

    Mean Value Engine Models (MVEMs) are simplified, dynamic engine models which are physically based. Such models are useful for control studies, for engine control system analysis and for model based control systems. Very few published MVEMs have included the effects of Exhaust Gas Recirculation (EGR......). The purpose of this paper is to present a modified MVEM which includes EGR in a physical way. It has been tested using newly developed, very fast manifold pressure, manifold temperature, port and EGR mass flow sensors. Reasonable agreement has been obtained on an experimental engine, mounted on a dynamometer....

  2. Mean Value Engine Modelling of an SI Engine with EGR

    DEFF Research Database (Denmark)

    Føns, Michael; Müller, Martin; Chevalier, Alain

    1999-01-01

    Mean Value Engine Models (MVEMs) are simplified, dynamic engine models which are physically based. Such models are useful for control studies, for engine control system analysis and for model based engine control systems. Very few published MVEMs have included the effects of Exhaust Gas...... Recirculation (EGR). The purpose of this paper is to present a modified MVEM which includes EGR in a physical way. It has been tested using newly developed, very fast manifold pressure, manifold temperature, port and EGR mass flow sensors. Reasonable agreement has been obtained on an experimental engine...

  3. Desktop Modeling and Simulation: Parsimonious, yet Effective Discrete-Event Simulation Analysis

    Science.gov (United States)

    Bradley, James R.

    2012-01-01

    This paper evaluates how quickly students can be trained to construct useful discrete-event simulation models using Excel. The typical supply chain used by many large national retailers is described, and an Excel-based simulation model of it is constructed. The set of programming and simulation skills required for development of that model is then determined; we conclude that six hours of training are required to teach these skills to MBA students. The simulation presented here contains all the fundamental functionality of a simulation model, and so our result holds for any discrete-event simulation model. We argue, therefore, that industry workers with the same technical skill set as students who have completed one year in an MBA program can be quickly trained to construct simulation models. This result gives credence to the efficacy of Desktop Modeling and Simulation, whereby simulation analyses can be quickly developed, run, and analyzed with widely available software, namely Excel.

  4. Mean Value SI Engine Model for Control Studies

    DEFF Research Database (Denmark)

    Hendricks, Elbert; Sorenson, Spencer C

    1990-01-01

    This paper presents a mathematically simple nonlinear three-state (three differential equation) dynamic model of an SI engine which has the same steady-state accuracy as a typical dynamometer measurement of the engine over its entire speed/load operating range (± 2.0%). The model's accuracy...... for large, fast transients is of the same order in the same operating region. Because the model is mathematically compact, it has few adjustable parameters and is thus simple to fit to a given engine, either on the basis of measurements or given the steady-state results of a larger cycle simulation package.... The model can easily be run on a Personal Computer (PC) using an ordinary differential equation (ODE) integrating routine or package. This makes the model useful for control system design and evaluation....
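
    A hedged sketch of running such a compact model with an ODE routine; the three states mirror the usual MVEM choices (manifold pressure, fuel film mass, crankshaft speed), but the right-hand sides below are illustrative placeholders rather than the published equations:

```python
import numpy as np
from scipy.integrate import solve_ivp

def mvem_rhs(t, x, throttle):
    """Illustrative three-state engine model: manifold pressure p (bar),
    fuel-film mass mf (g), crankshaft speed n (krpm). Placeholder dynamics,
    not the published MVEM equations."""
    p, mf, n = x
    dp  = 5.0 * (throttle - 0.4 * p * n)        # filling/emptying balance
    dmf = 0.8 * throttle - 2.0 * mf             # fuel-film evaporation
    dn  = 0.5 * (p * mf * 50.0 - 0.3 * n**2)    # torque minus load
    return [dp, dmf, dn]

sol = solve_ivp(mvem_rhs, (0.0, 5.0), [0.5, 0.05, 1.0],
                args=(0.6,), max_step=0.01)
print("final state:", sol.y[:, -1].round(3))
```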

  5. Can Participatory Action Research Create Value for Business Model Innovation?

    DEFF Research Database (Denmark)

    Sparre, Mogens; Rasmussen, Ole Horn; Fast, Alf Michael

    Participatory Action Research (PAR) has a longer academic history than the idea of business models (BMs). This paper indicates how industries gain by using the combined methodology. The research question – "Can participatory action research create value for Business Model Innovation (BMI)?" – has been investigated from five different perspectives based upon The Business Model Cube and The Where to Look Model. Using both established and newly developed tools, the paper presents how. Theory and data from two cases are presented, and it is demonstrated how industries increase their monetary and/or non-monetary value creation doing BMI based upon PAR. The process is essential, and using the methodology of PAR creates meaning. Behind the process, the PAR methodology and its link to BM and BMI may contribute to theory construction and the creation of a common language in academia around...

  6. Modeling and simulation of chillers with Dymola/Modelica; Modellierung und Simulation von Kaeltemaschinen mit Dymola/Modelica

    Energy Technology Data Exchange (ETDEWEB)

    Rettich, Daniel [Hochschule Biberach (Germany). Inst. fuer Gebaeude- und Energiesysteme (IGE)

    2012-07-01

    Within the contribution under consideration, a chiller was modeled and simulated with the program package Dymola/Modelica using the TIL toolbox. An existing refrigeration test bench at the University of Biberach (Germany) serves as the reference for the chiller represented in the simulation. The aim of the simulation is the future use of the models in a hardware-in-the-loop (HIL) test bench, in order to test different controllers with respect to their function and logic under identical boundary conditions. Furthermore, the determination of energy efficiency according to the VDMA 24247 guideline is a central concern both at the test bench and within the simulation. Following the completion of the test bench, the models will be validated against it, and the chiller model will be connected to a detailed room model. Individual models were taken from the TIL toolbox, adapted for the application and parameterized with the design values of the laboratory chiller. Modifications to the TIL models were necessary in order to reproduce the dynamic effects of the chiller in detail; for this purpose, investigations of the indicators of the various dynamic components were carried out. Following the modeling, each model was tested on the basis of design values and manufacturer documentation. First simulation studies showed that the simulation in Dymola with the developed models provides plausible results. In the course of the modeling and parameterization of these modified models, a component library was developed from which different models for future simulation studies can be extracted.

  7. Creating Value Through the Freemium Business Model: A Consumer Perspective

    OpenAIRE

    Rietveld, Joost

    2016-01-01

    This paper develops a consumer-centric framework for creating value through the freemium business model. Goods that are commercialized through the freemium business model offer basic functionality for free and monetize users for extended use or complementary features. Compared to premium goods, freemium goods have lower barriers to adoption and allow end-users to accurately assess and act on their willingness to pay. On the other hand, convincing users to spend money on freemium goods...

  8. Giving the Expectancy-Value Model a Heart

    OpenAIRE

    Henning, V.; Hennig-Thurau, T.; Feiereisen, S.

    2012-01-01

    Over the past decade, research in consumer behavior has debated the role of emotion in consumer decision making intensively, but has offered few attempts to integrate emotion-related findings with established theoretical frameworks. This manuscript augments the classical expectancy-value model of attitude with a dimensional model of emotion. An experiment involving 308 college students who face actual purchase decisions shows that predictions of attitudes, behavioral intentions and actual behavior...

  9. Tokamak Simulation Code modeling of NSTX

    International Nuclear Information System (INIS)

    Jardin, S.C.; Kaye, S.; Menard, J.; Kessel, C.; Glasser, A.H.

    2000-01-01

    The Tokamak Simulation Code [TSC] is widely used for the design of new axisymmetric toroidal experiments. In particular, TSC was used extensively in the design of the National Spherical Torus eXperiment [NSTX]. The authors have now benchmarked TSC against initial NSTX results and find excellent agreement for plasma and vessel currents and magnetic flux loops when the experimental coil currents are used in the simulations. TSC has also been coupled with a ballooning stability code and with DCON to provide stability predictions for NSTX operation. TSC has further been used to model initial CHI experiments, in which a large poloidal voltage is applied to the NSTX vacuum vessel, causing a force-free current to appear in the plasma. This phenomenon is similar to the plasma halo current that sometimes develops during a plasma disruption.

  10. Tracing the value of data for flood loss modelling

    Directory of Open Access Journals (Sweden)

    Schröter Kai

    2016-01-01

    Full Text Available Flood loss modelling is associated with considerable uncertainty. If the prediction uncertainty of flood loss models is large, the reliability of model outcomes is questionable, which challenges their practical usefulness. A key problem in flood loss estimation is the transfer of models to geographical regions and to flood events that may differ from the ones used for model development. Variations in local characteristics and continuous system changes require regional adjustments and continuous updating with current evidence. However, acquiring data on damage-influencing factors is usually very costly. It is therefore relevant to assess the value of additional data in terms of model performance improvement. We use empirical flood loss data on direct damage to residential buildings, available from computer-aided telephone interviews compiled after major floods in Germany. This unique database allows us to trace the changes in predictive model performance by incrementally extending the data base used to derive flood loss models. Two models are considered: a uni-variable stage-damage function and RF-FLEMO, a multi-variable probabilistic model approach using Random Forests. Additional data are useful for improving model predictive performance and increasing model reliability; however, the gains also seem to depend on the model approach.
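
    A hedged sketch of the multi-variable idea (not the RF-FLEMO implementation; the predictors and synthetic loss data are illustrative): a Random Forest regressor is trained on damage-influencing factors, and its predictive performance is traced as the training base is incrementally extended.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
n = 2_000
# Illustrative predictors: water depth (m), building area (m2), precaution score
X = np.column_stack([rng.gamma(2.0, 0.5, n),
                     rng.normal(150, 40, n),
                     rng.integers(0, 4, n)])
# Synthetic relative loss in [0, 1], driven mostly by water depth
y = np.clip(0.15 * X[:, 0] - 0.02 * X[:, 2] + rng.normal(0, 0.05, n), 0, 1)

X_test, y_test = X[1500:], y[1500:]
for m in (100, 500, 1500):          # incrementally extended training base
    rf = RandomForestRegressor(n_estimators=200, random_state=0)
    rf.fit(X[:m], y[:m])
    mae = mean_absolute_error(y_test, rf.predict(X_test))
    print(f"{m:5d} obs -> MAE: {mae:.4f}")
```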

  11. A matrix model for valuing anesthesia service with the resource-based relative value system

    Directory of Open Access Journals (Sweden)

    Sinclair DR

    2014-10-01

    Full Text Available David R Sinclair,1 David A Lubarsky,1 Michael M Vigoda,1 David J Birnbach,1 Eric A Harris,1 Vicente Behrens,1 Richard E Bazan,1 Steve M Williams,1 Kristopher Arheart,2 Keith A Candiotti1 1Department of Anesthesiology, Perioperative Medicine and Pain Management, 2Department of Public Health Sciences, Division of Biostatistics, University of Miami Miller School of Medicine, Miami, FL, USA Background: The purpose of this study was to propose a new crosswalk using the resource-based relative value system (RBRVS) that preserves the time unit component of the anesthesia service and disaggregates anesthesia billing into component parts (preoperative evaluation, intraoperative management, and postoperative evaluation). The study was designed as an observational chart and billing data review of current and proposed payments, in the setting of a preoperative holding area, intraoperative suite, and post-anesthesia care unit. In total, 1,195 charts of American Society of Anesthesiologists (ASA) physical status 1 through 5 patients were reviewed. No direct patient interventions were undertaken. Results: Spearman correlations between the proposed RBRVS billing matrix payments and the current ASA relative value guide methodology payments were strong (r=0.94–0.96, P<0.001) for training, test, and overall samples. The proposed RBRVS-based billing matrix yielded payments that were 3.0%±1.34% less than would have been expected from commercial insurers, using standard rates for commercial ASA relative value units and RBRVS relative value units. Compared with current Medicare reimbursement under the ASA relative value guide, reimbursement would almost double when converting to an RBRVS billing model. The greatest increases in Medicare reimbursement between the current system and the proposed billing model occurred as anesthetic management complexity increased. Conclusion: The new crosswalk correlates with existing evaluation and management and intensive care medicine codes in an

  12. A model for measuring value for money in professional sports

    Directory of Open Access Journals (Sweden)

    Vlad ROŞCA

    2013-07-01

    Full Text Available Few if any sports teams measure the entertainment value they provide to fans in exchange for the money the latter spend on admission fees, and the scientific literature largely overlooks the issue as well. The aim of this paper is to present a model that can be used for calculating value for money in the context of spectator sports. The research question asks how value for money can be conceptualized and measured for sports marketing purposes. Using financial and sporting variables, the method calculates how much money, on average, a fan had to spend to receive quality entertainment – defined as won matches – from his favorite team during the last season of the Romanian first division football championship. The results only partially confirm the research hypothesis, showing that not just price and sporting performance may influence the value delivered to fans, but other factors as well.

  13. Precision Modeling Of Targets Using The VALUE Computer Program

    Science.gov (United States)

    Hoffman, George A.; Patton, Ronald; Akerman, Alexander

    1989-08-01

    The 1976-vintage LASERX computer code has been augmented to produce realistic electro-optical images of targets. Capabilities lacking in LASERX but recently incorporated into its VALUE successor include: shadows cast onto the ground; shadows cast onto parts of the target; see-through transparencies (e.g., canopies); apparent images due both to atmospheric scattering and turbulence; and surfaces characterized by multiple bidirectional reflectance functions. VALUE provides not only realistic target modeling, through its precise and comprehensive representation of all target attributes, but is also very user friendly. Specifically, setup of runs is accomplished by screen-prompting menus in a sequence of queries that is logical to the user. VALUE also incorporates the Optical Encounter (OPEC) software developed by Tricor Systems, Inc., Elgin, IL.

  14. IT Business Value Model for Information Intensive Organizations

    Directory of Open Access Journals (Sweden)

    Antonio Carlos Gastaud Maçada

    2012-01-01

    Full Text Available Many studies have highlighted the capacity Information Technology (IT) has for generating value for organizations, and investments in IT made by organizations have increased each year. The purpose of the present study is therefore to analyze IT Business Value for Information Intensive Organizations (IIO) - e.g. banks, insurance companies and securities brokers. The research method consisted of a survey that used and combined the models from Weill and Broadbent (1998) and Gregor, Martin, Fernandez, Stern and Vitale (2006). Data were gathered using an adapted instrument containing 5 dimensions (Strategic, Informational, Transactional, Transformational and Infrastructure) with 27 items. The instrument was refined by employing statistical techniques such as Exploratory and Confirmatory Factor Analysis through Structural Equations (first- and second-order measurement models). The final model is composed of four factors related to IT Business Value - Strategic, Informational, Transactional and Transformational - arranged in 15 items. The Infrastructure dimension was excluded during the model refinement process because it was discovered during interviews that managers were unable to perceive it as a distinct dimension of IT Business Value.

  15. Simulations, evaluations and models. Vol. 1

    International Nuclear Information System (INIS)

    Brehmer, B.; Leplat, J.

    1992-01-01

    Papers presented at the Fourth MOHAWC (Models of Human Activities in Work Context) workshop. The general theme was simulations, evaluations and models. The emphasis was on time in relation to the modelling of human activities in modern, high-tech work. Such work often requires people to control dynamic systems, and the behaviour and misbehaviour of these systems in time is a principal focus of work in, for example, a modern process plant. The papers report on microworlds and on their innovative uses, both in the form of experiments and in the form of a new kind of use, that of testing a program which performs diagnostic reasoning. They present new perspectives on the problem of time in process control, showing the importance of considering the time scales of dynamic tasks, both in individual decision making and in distributed decision making, and they provide new formalisms, both for the representation of time and for reasoning involving time in diagnosis. (AB)

  16. Process model simulations of the divergence effect

    Science.gov (United States)

    Anchukaitis, K. J.; Evans, M. N.; D'Arrigo, R. D.; Smerdon, J. E.; Hughes, M. K.; Kaplan, A.; Vaganov, E. A.

    2007-12-01

    We explore the extent to which the Vaganov-Shashkin (VS) model of conifer tree-ring formation can explain evidence for changing relationships between climate and tree growth over recent decades. The VS model is driven by daily environmental forcing (temperature, soil moisture, and solar radiation), and simulates tree-ring growth cell by cell as a function of the most limiting environmental control. This simplified representation of tree physiology allows us to examine, using a selection of case studies, whether instances of divergence may be explained in terms of changes in limiting environmental dependencies or transient climate change. Identification of model-data differences permits further exploration of the effects of tree-ring standardization, atmospheric composition, and additional non-climatic factors.

  17. Radiation Modeling with Direct Simulation Monte Carlo

    Science.gov (United States)

    Carlson, Ann B.; Hassan, H. A.

    1991-01-01

    Improvements in the modeling of radiation in low-density shock waves with direct simulation Monte Carlo (DSMC) are the subject of this study. A new scheme to determine the relaxation collision numbers for excitation of electronic states is proposed. This scheme attempts to move the DSMC programs toward a more detailed modeling of the physics and greater reliance on available rate data. The new method is compared with the current modeling technique, and both techniques are compared with available experimental data. The differences in the results are evaluated. The test case is based on experimental measurements from the AVCO-Everett Research Laboratory electric arc-driven shock tube of a normal shock wave in air at 10 km/s and 0.1 Torr. The new method agrees with the available data as well as the results from the earlier scheme do, and is more easily extrapolated to different flow conditions.

  18. Traffic flow dynamics data, models and simulation

    CERN Document Server

    Treiber, Martin

    2013-01-01

    This textbook provides a comprehensive and instructive coverage of vehicular traffic flow dynamics and modeling. It makes this fascinating interdisciplinary topic, which to date was only documented in parts by specialized monographs, accessible to a broad readership. Numerous figures and problems with solutions help the reader to quickly understand and practice the presented concepts. This book is targeted at students of physics and traffic engineering and, more generally, also at students and professionals in computer science, mathematics, and interdisciplinary topics. It also offers material for project work in programming and simulation at college and university level. The main part, after presenting different categories of traffic data, is devoted to a mathematical description of the dynamics of traffic flow, covering macroscopic models which describe traffic in terms of density, as well as microscopic many-particle models in which each particle corresponds to a vehicle and its driver. Focus chapters on ...

  19. Biomechanics trends in modeling and simulation

    CERN Document Server

    Ogden, Ray

    2017-01-01

    The book presents a state-of-the-art overview of biomechanical and mechanobiological modeling and simulation of soft biological tissues. Seven well-known scientists working in that particular field discuss topics such as biomolecules, networks and cells as well as failure, multi-scale, agent-based, bio-chemo-mechanical and finite element models appropriate for computational analysis. Applications include arteries, the heart, vascular stents and valve implants as well as adipose, brain, collagenous and engineered tissues. The mechanics of the whole cell and sub-cellular components as well as the extracellular matrix structure and mechanotransduction are described. In particular, the formation and remodeling of stress fibers, cytoskeletal contractility, cell adhesion and the mechanical regulation of fibroblast migration in healing myocardial infarcts are discussed. The essential ingredients of continuum mechanics are provided. Constitutive models of fiber-reinforced materials with an emphasis on arterial walls ...

  20. Qualitative simulation in formal process modelling

    International Nuclear Information System (INIS)

    Sivertsen, Elin R.

    1999-01-01

    In relation to several different research activities at the OECD Halden Reactor Project, the usefulness of formal process models has been identified. Represented in an appropriate representation language, the purpose of these models is to model process plants and plant automatics in a unified way, allowing verification and computer-aided design of control strategies. The present report discusses qualitative simulation and the tool QSIM as one approach to formal process models. In particular, the report aims at investigating how recent improvements of the tool facilitate the use of the approach in areas like process system analysis, procedure verification, and control software safety analysis. An important long-term goal is to provide a basis for using qualitative reasoning in combination with other techniques to facilitate the treatment of embedded programmable systems in Probabilistic Safety Analysis (PSA). This is motivated by the potential of such a combination in safety analysis based on models comprising software, hardware, and the operator. It is anticipated that the research results from this activity will benefit V and V in a wide variety of applications where formal process models can be utilized. Examples are operator procedures, intelligent decision support systems, and common model repositories (author) (ml)

  1. Traffic flow dynamics. Data, models and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Treiber, Martin [Technische Univ. Dresden (Germany). Inst. fuer Wirtschaft und Verkehr; Kesting, Arne [TomTom Development Germany GmbH, Berlin (Germany)

    2013-07-01

    First comprehensive textbook on this fascinating interdisciplinary topic, which explains advances in a way that is easily accessible to engineering, physics and math students. Presents practical applications of traffic theory such as driving behavior, stability analysis, stop-and-go waves, and travel time estimation. Presents the topic in a novel and systematic way by addressing both microscopic and macroscopic models with a focus on traffic instabilities. Revised and extended edition of the German textbook ''Verkehrsdynamik und -simulation''. This textbook provides a comprehensive and instructive coverage of vehicular traffic flow dynamics and modeling. It makes this fascinating interdisciplinary topic, which to date was only documented in parts by specialized monographs, accessible to a broad readership. Numerous figures and problems with solutions help the reader to quickly understand and practice the presented concepts. This book is targeted at students of physics and traffic engineering and, more generally, also at students and professionals in computer science, mathematics, and interdisciplinary topics. It also offers material for project work in programming and simulation at college and university level. The main part, after presenting different categories of traffic data, is devoted to a mathematical description of the dynamics of traffic flow, covering macroscopic models which describe traffic in terms of density, as well as microscopic many-particle models in which each particle corresponds to a vehicle and its driver. Focus chapters on traffic instabilities and model calibration/validation present these topics in a novel and systematic way. Finally, the theoretical framework is shown at work in selected applications such as traffic-state and travel-time estimation, intelligent transportation systems, traffic operations management, and a detailed physics-based model for fuel consumption and emissions.

  2. Value stream mapping and simulation for implementation of lean manufacturing practices in a footwear company

    Directory of Open Access Journals (Sweden)

    Danilo Felipe Silva de Lima

    2016-03-01

    Full Text Available The development of a Value Stream Map (VSM) is generally the first step in the implementation of Lean Manufacturing (LM). The aim of this paper is to present an application of VSM combined with simulation in order to analyze the impact of LM adoption on the performance of a footwear plant. A VSM was therefore designed for the current state and, through the implementation of lean elements, a future state was designed. Different scenarios for the future state implementation were simulated and their results compared. The transfer, cutting and assembly sections were chosen for the simulation, because it was considered possible to establish a one-piece flow between those processes. After the simulation, the scenario that presented the best results provided a 19% productivity increase over the current state, as well as improvements in all other process variables. The application of simulation as an additional element of VSM has helped to identify the advantages of the joint approach, since it enables testing different alternatives and better defining the future state and its implementation strategies.

  3. Simulation Model of Mobile Detection Systems

    International Nuclear Information System (INIS)

    Edmunds, T.; Faissol, D.; Yao, Y.

    2009-01-01

    In this paper, we consider a mobile source that we attempt to detect with man-portable, vehicle-mounted or boat-mounted radiation detectors. The source is assumed to transit an area populated with these mobile detectors, and the objective is to detect the source before it reaches a perimeter. We describe a simulation model developed to estimate the probability that one of the mobile detectors will come into close proximity of the moving source and detect it. We illustrate with a maritime simulation example. Our simulation takes place in a 10 km by 5 km rectangular bay patrolled by boats equipped with 2-inch x 4-inch x 16-inch NaI detectors. Boats to be inspected enter the bay and randomly proceed to one of seven harbors on the shore. A source-bearing boat enters the mouth of the bay and proceeds to a pier on the opposite side. We wish to determine the probability that the source is detected, and its range from the target when detected. Patrol boats select the nearest inbound boat for inspection and initiate an intercept course. Once within the operational range of the detection system, a detection algorithm is started. If the patrol boat confirms the source is not present, it selects the next nearest boat for inspection. Each run of the simulation ends either when a patrol successfully detects a source or when the source reaches its target. Several statistical detection algorithms have been implemented in the simulation model. First, a simple k-sigma algorithm, which alarms when the counts in a time window exceed the mean background plus k times the standard deviation of the background, is available to the user. The time window used is optimized with respect to the signal-to-background ratio for a given range and relative speed. Second, a sequential probability ratio test [Wald 1947] is available, configured in this simulation with a target false positive probability of 0.001 and a false negative probability of 0.1. This test is utilized when the mobile detector maintains
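
    The k-sigma rule described above is straightforward to state in code; a minimal sketch with an illustrative background rate, window count and k:

```python
import numpy as np

def k_sigma_alarm(counts, mean_bkg, k=3.0):
    """Alarm if the gross counts in a window exceed the mean background
    plus k standard deviations (Poisson background: sigma = sqrt(mean))."""
    return counts > mean_bkg + k * np.sqrt(mean_bkg)

mean_bkg = 40.0                                # expected background counts per window
rng = np.random.default_rng(2)
window_counts = rng.poisson(mean_bkg + 25.0)   # background plus a weak source
print(k_sigma_alarm(window_counts, mean_bkg))  # likely True for these numbers
```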

  4. International Business Models Developed Through Brokerage Knowledge and Value Creation

    DEFF Research Database (Denmark)

    Petersen, Nicolaj Hannesbo; Rasmussen, Erik Stavnsager

    This paper highlights, theoretically and empirically, international business model decisions in networks with knowledge sharing and value creation. The paper expands the conceptual international business model framework for technology-oriented companies to include the focal firm's network role...... and strategic fit in a global embeddedness. The brokerage role in the internationalization of a network is discussed from both a theoretical and an empirical point of view. From a business model and social network analysis perspective, this paper will show how firms and networks grow internationally through two...

  5. Economic value added model upon conditions of banking company

    Directory of Open Access Journals (Sweden)

    Vlasta Kašparovská

    2008-01-01

    Full Text Available The subject of this article is the application of the economic value added (EVA) model under the conditions of a banking company. Due to the character of the banking business, which is reflected in a different balance sheet structure, it is not possible to use the standard EVA model for a banking company. The article starts by outlining the basic principles of the EVA model in a non-banking company. The basic dissimilarities of banking activity are then analysed, and the methodology of the model is adjusted accordingly, so that it can be used for a banking company.
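
    For reference, the standard (non-banking) starting point is EVA = NOPAT - WACC x invested capital; a small worked example with illustrative figures:

```python
nopat = 120.0          # net operating profit after taxes (illustrative, millions)
capital = 1_000.0      # invested capital (millions)
wacc = 0.09            # weighted average cost of capital

eva = nopat - wacc * capital   # EVA = NOPAT - WACC * invested capital
print(f"EVA = {eva:.1f} million")  # 120 - 90 = 30.0: value created above the cost of capital
```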

  6. Cultivating a disease management partnership: a value-chain model.

    Science.gov (United States)

    Murray, Carolyn F; Monroe, Wendy; Stalder, Sharon A

    2003-01-01

    Disease management (DM) is one of the health care industry's more innovative value-chain models, whereby multiple relationships are created to bring complex and time-sensitive services to market. The very nature of comprehensive, seamless DM provided through an outsourced arrangement necessitates a level of cooperation, trust, and synergy that may be lacking from more traditional vendor-customer relationships. This discussion highlights the experience of one health plan and its vendor partner and their approach to the development and delivery of an outsourced heart failure (HF) DM program. The program design and rollout are discussed within principles adapted from the theoretical framework of a value-chain model. Within the value-chain model, added value is created by the convergence and synergistic integration of the partners' discrete strengths. Although each partner brings unique attributes to the relationship, those attributes are significantly enhanced by the value-chain model, thus allowing each party to bring the added value of the relationship to their respective customers. This partnership increases innovation, leverages critical capabilities, and improves market responsiveness. Implementing a comprehensive, outsourced DM program is no small task. DM programs incorporate a broad array of services affecting nearly every department in a health plan's organization. When true seamless integration between multiple organizations with multiple stakeholders is the objective, implementation and ongoing operations can become even more complex. To effectively address the complexities presented by an HF DM program, the parties in this case moved beyond a typical purchaser-vendor relationship to one that is more closely akin to a strategic partnership. This discussion highlights the development of this partnership from the perspective of both organizations, as revealed through contracting and implementation activities. It is intended to provide insight into the program

  7. Assessing the value relevance of current mandatory business model disclosures

    DEFF Research Database (Denmark)

    Schaper, Stefan; Nielsen, Christian; Simoni, Lorenzo

    Recent regulations have introduced the requirement for large companies to disclose information about their business model (BM) in their annual reports. The objective of these disclosures is to allow external users to better understand how companies create, deliver and capture value. This study aims...... reports. Ad-hoc created disclosure indexes are based on the taxonomy of business model (BM) configurations developed by Taran et al. (2016), complemented by a frame of reference based on the nine BM canvas elements from Osterwalder and Pigneur (2010). After the classification of companies...... the model developed by Ohlson (1995). Our results show no significant association between BM disclosure and share prices. The main reason behind this finding can be associated with the low level of disclosure (i.e. the low number of value drivers disclosed on average) by companies as part of their BM

  8. CASTOR detector. Model, objectives and simulated performance

    International Nuclear Information System (INIS)

    Angelis, A. L. S.; Mavromanolakis, G.; Panagiotou, A. D.; Aslanoglou, X.; Nicolis, N.; Lobanov, M.; Erine, S.; Kharlov, Y. V.; Bogolyubsky, M. Y.; Kurepin, A. B.; Chileev, K.; Wlodarczyk, Z.

    2001-01-01

    A phenomenological model is presented describing the formation and evolution of a Centauro fireball in the baryon-rich region in nucleus-nucleus interactions in the upper atmosphere and at the LHC. The small particle multiplicity and the imbalance of electromagnetic and hadronic content characterizing a Centauro event, as well as the strongly penetrating particles (assumed to be strangelets) that frequently accompany them, can be naturally explained. The CASTOR calorimeter is described, a subdetector of the ALICE experiment dedicated to the search for Centauros in the very forward, baryon-rich region of central Pb+Pb collisions at the LHC. The basic characteristics and simulated performance of the calorimeter are presented.

  9. Modelling and simulation of railway cable systems

    Energy Technology Data Exchange (ETDEWEB)

    Teichelmann, G.; Schaub, M.; Simeon, B. [Technische Univ. Muenchen, Garching (Germany). Zentrum Mathematik M2

    2005-12-15

    Mathematical models and numerical methods for the computation of both static equilibria and dynamic oscillations of railroad catenaries are derived and analyzed. These cable systems form a complex network of string and beam elements and lead to coupled partial differential equations in space and time where constraints and corresponding Lagrange multipliers express the interaction between carrier, contact wire, and pantograph head. For computing static equilibria, three different algorithms are presented and compared, while the dynamic case is treated by a finite element method in space, combined with stabilized time integration of the resulting differential algebraic system. Simulation examples based on reference data from industry illustrate the potential of such computational tools. (orig.)

  10. A Simulation Model Of A Picture Archival And Communication System

    Science.gov (United States)

    D'Silva, Vijay; Perros, Harry; Stockbridge, Chris

    1988-06-01

    A PACS architecture was simulated to quantify its performance. The model consisted of reading stations, acquisition nodes, communication links, a database management system, and a storage system of magnetic and optical disks. Two levels of storage were simulated: a high-speed magnetic disk system for short-term storage, and optical disk jukeboxes for long-term storage. The communications link was a single bus via which image data were requested and delivered. Real input data for the simulation model were obtained from surveys of radiology procedures (Bowman Gray School of Medicine). From these, the following inputs were calculated: the size of short-term storage necessary, the amount of long-term storage required, the frequency of access of each store, and the distribution of the number of films requested per diagnosis. The performance measures obtained were the mean retrieval time for an image, mean queue lengths, and the utilization of each device. Parametric analysis was done for the bus speed, the packet size for the communications link, the record size on the magnetic disk, the compression ratio, the influx of new images, the DBMS time, and the diagnosis think times. Plots give the optimum values of input speed and device performance which are sufficient to achieve subsecond image retrieval times.

  11. An Agent-Based Monetary Production Simulation Model

    DEFF Research Database (Denmark)

    Bruun, Charlotte

    2006-01-01

    An agent-based simulation model programmed in Objective Borland Pascal. Program and source code are downloadable...

  12. Simulation model for port shunting yards

    Science.gov (United States)

    Rusca, A.; Popa, M.; Rosca, E.; Rosca, M.; Dragu, V.; Rusca, F.

    2016-08-01

    Sea ports are important nodes in the supply chain, joining two high-capacity transport modes: rail and maritime transport. The huge cargo flows transiting a port require high-capacity constructions and installations such as berths, large-capacity cranes, and shunting yards. However, the specificity of port shunting yards raises several problems, such as: limited access, since these are terminus stations of the rail network; the input-output of large transit flows of cargo relative to the scarcity of ship departures/arrivals; and limited land availability for implementing solutions to serve these flows. It is necessary to identify technological solutions that answer these problems. The paper proposes a simulation model, developed with the ARENA computer simulation software, suitable for shunting yards which serve sea ports with access to the rail network. The principal aspects of shunting yards are investigated, along with adequate measures to increase their transit capacity. The operating capacity of the shunting yard sub-system is assessed taking into consideration the required operating standards, and measures of performance of the railway station (e.g. waiting time for freight wagons, number of railway lines in the station, storage area, etc.) are computed. The conclusions and results drawn from the simulation help transport and logistics specialists to test proposals for improving port management.
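
    The paper's model is built in ARENA; as a hedged stand-in, the same discrete-event idea can be expressed with the open-source simpy library: trains arrive, queue for a limited number of shunting tracks, and waiting times are recorded. The arrival and service rates and the track count below are illustrative assumptions, not figures from the paper.

```python
import random
import simpy

WAITS = []

def train(env, yard):
    arrive = env.now
    with yard.request() as track:          # queue for a free shunting track
        yield track
        WAITS.append(env.now - arrive)     # waiting time for a track (min)
        yield env.timeout(random.expovariate(1 / 45))  # shunting, ~45 min

def source(env, yard):
    while True:
        yield env.timeout(random.expovariate(1 / 30))  # arrival every ~30 min
        env.process(train(env, yard))

random.seed(4)
env = simpy.Environment()
yard = simpy.Resource(env, capacity=2)     # two shunting tracks (assumed)
env.process(source(env, yard))
env.run(until=7 * 24 * 60)                 # one simulated week, in minutes
print(f"trains served: {len(WAITS)}, mean wait: {sum(WAITS)/len(WAITS):.1f} min")
```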

  13. Traffic simulation based ship collision probability modeling

    Energy Technology Data Exchange (ETDEWEB)

    Goerlandt, Floris, E-mail: floris.goerlandt@tkk.f [Aalto University, School of Science and Technology, Department of Applied Mechanics, Marine Technology, P.O. Box 15300, FI-00076 AALTO, Espoo (Finland); Kujala, Pentti [Aalto University, School of Science and Technology, Department of Applied Mechanics, Marine Technology, P.O. Box 15300, FI-00076 AALTO, Espoo (Finland)

    2011-01-15

    Maritime traffic poses various risks in terms of human, environmental and economic loss. In a risk analysis of ship collisions, it is important to obtain a reasonable estimate of the probability of such accidents and of the consequences they lead to. In this paper, a method is proposed to assess the probability of vessels colliding with each other. The method is capable of determining the expected number of accidents, and the locations and times at which they are most likely to occur, while providing input for models concerned with the expected consequences. At the basis of the collision detection algorithm lies an extensive time-domain micro-simulation of vessel traffic in the given area. The Monte Carlo simulation technique is applied to obtain a meaningful prediction of the relevant factors of the collision events. Data obtained through the Automatic Identification System are analyzed in detail to obtain realistic input data for the traffic simulation: traffic routes, the number of vessels on each route, ship departure times, main dimensions and sailing speed. The results obtained by the proposed method for the studied case of the Gulf of Finland are presented, showing reasonable agreement with registered accident and near-miss data.

  14. Modeling VOC transport in simulated waste drums

    International Nuclear Information System (INIS)

    Liekhus, K.J.; Gresham, G.L.; Peterson, E.S.; Rae, C.; Hotz, N.J.; Connolly, M.J.

    1993-06-01

    A volatile organic compound (VOC) transport model has been developed to describe unsteady-state VOC permeation and diffusion within a waste drum. The model equations account for three primary mechanisms of VOC transport from a void volume within the drum: VOC permeation across a polymer boundary, VOC diffusion across an opening in a volume boundary, and VOC solubilization in a polymer boundary. A series of lab-scale experiments was performed in which the VOC concentration was measured in simulated waste drums under different conditions. A lab-scale simulated waste drum consisted of a scaled-down 55-gal metal drum containing a modified rigid polyethylene drum liner. Four polyethylene bags were sealed inside a large polyethylene bag, supported by a wire cage, and placed inside the drum liner. The small bags were filled with a VOC-air gas mixture, and the VOC concentration was measured throughout the drum over a period of time. Test variables included the type of VOC-air gas mixtures introduced into the small bags, the small bag closure type, and the presence or absence of a variable external heat source. Model results were calculated for those trials where the VOC permeability had been measured. Permeabilities of five VOCs [methylene chloride, 1,1,2-trichloro-1,2,2-trifluoroethane (Freon-113), 1,1,1-trichloroethane, carbon tetrachloride, and trichloroethylene] were measured across a polyethylene bag. Comparison of modeled and experimental VOC concentrations as a function of time indicates that the model accurately accounts for the significant VOC transport mechanisms in a lab-scale waste drum
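
    The first mechanism listed above, permeation across a polymer boundary, is commonly written as a steady flux Q = P·A·Δp/l; a small worked example with illustrative numbers (not the paper's measured permeabilities):

```python
# Steady-state permeation across a polymer film: Q = P * A * dp / l
P  = 2.0e-13   # permeability, mol*m/(m^2*s*Pa)  (illustrative)
A  = 0.50      # bag surface area, m^2
dp = 500.0     # VOC partial-pressure difference across the film, Pa
l  = 1.0e-4    # film thickness, m

Q = P * A * dp / l           # molar permeation rate, mol/s
print(f"Q = {Q:.2e} mol/s")  # -> 5.00e-07 mol/s for these assumed values
```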

  16. The heuristic value of redundancy models of aging.

    Science.gov (United States)

    Boonekamp, Jelle J; Briga, Michael; Verhulst, Simon

    2015-11-01

    Molecular studies of aging aim to unravel the cause(s) of aging bottom-up, but linking these mechanisms to organismal level processes remains a challenge. We propose that complementary top-down data-directed modelling of organismal level empirical findings may contribute to developing these links. To this end, we explore the heuristic value of redundancy models of aging to develop a deeper insight into the mechanisms causing variation in senescence and lifespan. We start by showing (i) how different redundancy model parameters affect projected aging and mortality, and (ii) how variation in redundancy model parameters relates to variation in parameters of the Gompertz equation. Lifestyle changes or medical interventions during life can modify mortality rate, and we investigate (iii) how interventions that change specific redundancy parameters within the model affect subsequent mortality and actuarial senescence. Lastly, as an example of data-directed modelling and the insights that can be gained from this, (iv) we fit a redundancy model to mortality patterns observed by Mair et al. (2003; Science 301: 1731-1733) in Drosophila that were subjected to dietary restriction and temperature manipulations. Mair et al. found that dietary restriction instantaneously reduced mortality rate without affecting aging, while temperature manipulations had more transient effects on mortality rate and did affect aging. We show that after adjusting model parameters the redundancy model describes both effects well, and a comparison of the parameter values yields a deeper insight into the mechanisms causing these contrasting effects. We see replacement of the redundancy model parameters by more detailed sub-models of these parameters as a next step in linking demographic patterns to underlying molecular mechanisms.
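    To make the link between redundancy parameters and Gompertz-type mortality concrete, here is a minimal worked example with assumed parameter values (not those fitted to the Drosophila data): a system of m redundant elements, each failing at a constant rate k, yields a hazard that rises steeply with age and is well approximated by the Gompertz equation mu(t) = A*exp(B*t) over a finite age window.

```python
# Minimal redundancy ("parallel blocks") model of aging: an organism of
# m redundant elements, each failing at constant rate k, dies when all
# elements have failed. Parameter values are illustrative assumptions.
import numpy as np

k, m = 0.05, 10                      # element failure rate (1/yr), redundancy
t = np.linspace(0.1, 80.0, 400)      # age, years
F = 1.0 - np.exp(-k * t)             # failure probability of one element
S = 1.0 - F**m                       # survival of the parallel system
mu = m * k * np.exp(-k * t) * F**(m - 1) / S     # hazard, -S'/S

# At young ages F ~ k*t, so mu rises as a steep power law that is well
# approximated by a Gompertz exponential over the observed age window;
# fitting log(mu) against age recovers an effective Gompertz slope B.
B = np.polyfit(t[50:250], np.log(mu[50:250]), 1)[0]
print(f"approximate Gompertz slope B = {B:.3f} per year")
```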

  17. Modeling and control simulation of the China CLEAR-IB

    International Nuclear Information System (INIS)

    Yan, Shoujun; Wan, Jiashuang; Wang, Pengfei; Fang, Huawei; Sun, Changyi; Zhao, Fuyu

    2014-01-01

    Highlights: • A model for the reactor for CLEAR-IB was developed. • A PI controller was designed to control the power. • A control strategy was adopted to control the water enthalpy of the air cooler. • Dynamic simulation of the whole system was performed. - Abstract: To investigate the dynamic and control characteristics of the plant, a model of the main components of the reactor and the most relevant interactions among them is developed. The system comprises the primary system with lead bismuth eutectic (LBE) as the coolant, the secondary circuit with a steam-water mixture as the coolant, and the associated air cooling system for effective rejection of thermal power to the environment as the final heat sink. A Proportional-Integral (PI) controller is designed to keep the power following the set value as quickly as possible. To keep the outlet coolant of the air coolers and the inlet coolant of the heat exchangers (HXs) at saturated-water conditions, a control strategy based on a simultaneous feed-forward and feedback scheme has been adopted. Based on the developed model and control strategy, dynamic simulation of the whole system in the cases of step changes of external source and load is performed. The simulation results show that the proposed model is accurate enough to describe the dynamic behaviors of the plant in spite of its simplicity. It has also been demonstrated that the developed controllers for the CLEAR-IB can provide superior reactor control capabilities due to the efficiency of the control strategy adopted.
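    The power-control idea can be sketched in a few lines: a PI controller drives the plant output toward its set value. The gains, the first-order plant and the set-point step below are illustrative assumptions, not the CLEAR-IB design parameters.

```python
# Minimal PI power-control sketch against a first-order plant (all
# parameter values assumed for illustration).
kp, ki = 2.0, 0.5        # proportional and integral gains (assumed)
tau = 5.0                # plant time constant, s (assumed)
dt, power, integ = 0.1, 1.0, 0.0
setpoint = 0.8           # step change of the power set value

for step in range(int(60 / dt)):
    err = setpoint - power
    integ += err * dt
    u = kp * err + ki * integ                 # PI control action
    power += dt / tau * (u - power)           # first-order plant response

print(f"power after 60 s: {power:.4f} (set value {setpoint})")
```

    With integral action the steady-state error vanishes, so the simulated power settles on the set value; this is the property exploited by the controller described above.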

  18. A sEMG model with experimentally based simulation parameters.

    Science.gov (United States)

    Wheeler, Katherine A; Shimada, Hiroshima; Kumar, Dinesh K; Arjunan, Sridhar P

    2010-01-01

    A differential, time-invariant, surface electromyogram (sEMG) model has been implemented. While it is based on existing EMG models, the novelty of this implementation is that it assigns more accurate distributions of variables to create realistic motor unit (MU) characteristics. Variables such as muscle fibre conduction velocity, jitter (the change in the interpulse interval between subsequent action potential firings) and motor unit size have been considered to follow normal distributions about an experimentally obtained mean. In addition, motor unit firing frequencies have been considered to have non-linear and type based distributions that are in accordance with experimental results. Motor unit recruitment thresholds have been considered to be related to the MU type. The model has been used to simulate single channel differential sEMG signals from voluntary, isometric contractions of the biceps brachii muscle. The model has been experimentally verified by conducting experiments on three subjects. Comparison between simulated signals and experimental recordings shows that the Root Mean Square (RMS) increases linearly with force in both cases. The simulated signals also show similar values and rates of change of RMS to the experimental signals.

  19. An interval-valued reliability model with bounded failure rates

    DEFF Research Database (Denmark)

    Kozine, Igor; Krymsky, Victor

    2012-01-01

    The approach to deriving interval-valued reliability measures described in this paper is distinctive from other imprecise reliability models in that it overcomes the issue of having to impose an upper bound on time to failure. It rests on the presupposition that a constant interval-valued failure rate is known, possibly along with other reliability measures, precise or imprecise. The Lagrange method is used to solve the constrained optimization problem to derive new reliability measures of interest. The obtained results call for an exponential-wise approximation of failure probability density...

  20. 3D Core Model for simulation of nuclear power plants: Simulation requirements, model features, and validation

    International Nuclear Information System (INIS)

    Zerbino, H.

    1999-01-01

    In 1994-1996, Thomson Training and Simulation (TT and S) carried out the D50 Project, which involved the design and construction of optimized replica simulators for one Dutch and three German Nuclear Power Plants. It was recognized early on that the faithful reproduction of the Siemens reactor control and protection systems would impose extremely stringent demands on the simulation models, particularly the Core physics and the RCS thermohydraulics. The quality of the models, and their thorough validation, were thus essential. The present paper describes the main features of the fully 3D Core model implemented by TT and S, and its extensive validation campaign, which was defined in extremely positive collaboration with the Customer and the Core Data suppliers. (author)

  1. Mean Value Modelling of a Turbocharged SI Engine

    DEFF Research Database (Denmark)

    Müller, Martin; Hendricks, Elbert; Sorenson, Spencer C.

    1998-01-01

    An important paradigm for the modelling of naturally aspirated (NA) spark ignition (SI) engines for control purposes is the Mean Value Engine Model (MVEM). Such models have a time resolution which is just sufficient to capture the main details of the dynamic performance of NA SI engines but not the cycle-by-cycle behavior. In principle such models are also physically based, are very compact in a mathematical sense, but nevertheless can have reasonable prediction accuracy. Presently no MVEMs have been constructed for intercooled turbocharged SI engines because their complexity confounds the simple physical understanding and description of such engines. This paper presents a newly constructed MVEM for a turbocharged SI engine which contains the details of the compressor and turbine characteristics in a compact way. The model has been tested against the responses of an experimental engine and has...

  2. [Homeostasis model assessment (HOMA) values in Chilean elderly subjects].

    Science.gov (United States)

    Garmendia, María Luisa; Lera, Lydia; Sánchez, Hugo; Uauy, Ricardo; Albala, Cecilia

    2009-11-01

    The homeostasis assessment model for insulin resistance (HOMA-IR) estimates insulin resistance using basal insulin and glucose values and has a good concordance with values obtained with the euglycemic clamp. However, it has a high variability that depends on environmental, genetic and physiologic factors. Therefore it is imperative to establish normal HOMA values in different populations. To report HOMA-IR values in Chilean elderly subjects and to determine the best cutoff point to diagnose insulin resistance. Cross-sectional study of 1003 subjects older than 60 years, of whom 803 (71% women) did not have diabetes. In 154 subjects, an oral glucose tolerance test was also performed. Insulin resistance (IR) was defined as the HOMA value corresponding to the 75th percentile among subjects who were neither overweight nor underweight. The behavior of HOMA-IR in metabolic syndrome was studied and receiver operating curves (ROC) were calculated, using glucose intolerance defined as a blood glucose over 140 mg/dl and hyperinsulinemia, defined as a serum insulin over 60 microU/ml, two hours after the glucose load. Median HOMA-IR values were 1.7. The 75th percentile in subjects without obesity or underweight was 2.57. The area under the ROC curve, when comparing HOMA-IR with glucose intolerance and hyperinsulinemia, was 0.8 (95% confidence values 0.72-0.87), with HOMA-IR values ranging from 2.04 to 2.33. HOMA-IR is a useful method to determine insulin resistance in epidemiological studies. The HOMA-IR cutoff point for insulin resistance defined in this population was 2.6.
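    For reference, the HOMA-IR index discussed above is conventionally computed from fasting measurements as insulin [µU/mL] × glucose [mmol/L] / 22.5 (the abstract itself does not restate the formula); a small sketch using the 2.6 cutoff reported here:

```python
# Standard HOMA-IR formula (the usual definition, not restated in the
# abstract), applied with the study's 2.6 insulin-resistance cutoff.
def homa_ir(insulin_uU_mL: float, glucose_mg_dL: float) -> float:
    glucose_mmol_L = glucose_mg_dL / 18.0   # mg/dL -> mmol/L conversion
    return insulin_uU_mL * glucose_mmol_L / 22.5

value = homa_ir(insulin_uU_mL=9.0, glucose_mg_dL=95.0)
print(f"HOMA-IR = {value:.2f}, insulin resistant: {value > 2.6}")
```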

  3. A P-value model for theoretical power analysis and its applications in multiple testing procedures

    Directory of Open Access Journals (Sweden)

    Fengqing Zhang

    2016-10-01

    Background: Power analysis is a critical aspect of the design of experiments to detect an effect of a given size. When multiple hypotheses are tested simultaneously, multiplicity adjustments to p-values should be taken into account in power analysis. There are a limited number of studies on power analysis in multiple testing procedures. For some methods, the theoretical analysis is difficult and extensive numerical simulations are often needed, while other methods oversimplify the information under the alternative hypothesis. To this end, this paper aims to develop a new statistical model for power analysis in multiple testing procedures. Methods: We propose a step-function-based p-value model under the alternative hypothesis, which is simple enough to perform power analysis without simulations, but not so simple as to lose the information from the alternative hypothesis. The first step is to transform the distributions of different test statistics (e.g., t, chi-square or F) to the distributions of the corresponding p-values. We then use a step function to approximate each p-value distribution by matching the mean and variance. Lastly, the step-function-based p-value model can be used for theoretical power analysis. Results: The proposed model is applied to problems in multiple testing procedures. We first show how the most powerful critical constants can be chosen using the step-function-based p-value model. Our model is then applied to the field of multiple testing procedures to explain the assumption of monotonicity of the critical constants. Lastly, we apply our model to a behavioral weight loss and maintenance study to select the optimal critical constants. Conclusions: The proposed model is easy to implement and preserves the information from the alternative hypothesis.
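    The first step of the approach, transforming a test statistic's distribution under the alternative into a p-value distribution whose mean and variance the step function then matches, can be illustrated for a one-sided z-test with an assumed effect size:

```python
# Sketch of the transform step: distribution of p-values under the
# alternative for a one-sided z-test with (assumed) effect size delta;
# the mean and variance reported here are what the paper's step
# function is chosen to match.
import numpy as np
from scipy.stats import norm

delta = 2.0                      # standardized effect size (assumed)
z = norm.rvs(loc=delta, size=200_000, random_state=1)
p = norm.sf(z)                   # p-value of each simulated statistic

print(f"p-value mean {p.mean():.4f}, variance {p.var():.4f}")
# A step-function density on [0, 1] matching these two moments then
# stands in for the exact p-value distribution in power calculations.
```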

  4. Moisture performance of building materials: From material characterization to building simulation using the Moisture Buffer Value concept

    Energy Technology Data Exchange (ETDEWEB)

    Abadie, Marc Olivier [Mechanical Engineering Graduate Program, Pontifical Catholic University of Parana, PUC-PR/CCET, Curitiba, PR 80215-901 (Brazil); LEPTAB, University of La Rochelle, La Rochelle, 17042 Cedex 1 (France); Mendonca, Katia Cordeiro [Mechanical Engineering Graduate Program, Pontifical Catholic University of Parana, PUC-PR/CCET, Curitiba, PR 80215-901 (Brazil)

    2009-02-15

    Predicting the indoor air relative humidity evolution is of great importance to evaluate people's thermal comfort, perceived air quality and energy consumption. In building environments, porous materials of the envelope and furniture act on the indoor air humidity by reducing its variations. Solving the physical processes involved inside the porous materials requires knowledge of the material hygrothermal properties, which in turn requires multiple experimental procedures, some of them time-consuming. Recently, both the NORDTEST Project and the Japanese Industrial Standard described a new Moisture Buffer Capacity index that accounts for the variation of the surrounding air vapor concentration. The Moisture Buffer Value (MBV) indicates the amount of water vapor that is transported in or out of a material, during a certain period of time, when the vapor concentration of the surrounding air varies. The MBV evaluation requires only one experimental procedure, and its value permits a direct comparison of the moisture performance of building materials. However, two limitations can be distinguished: first, no relation between the MBV and the usual material hygrothermal properties has been clearly identified; and second, no model has been proposed to actually use the MBV in building simulation. The present study aims to solve these two problems. First, the MBV fundamentals are introduced and discussed, followed by their relation to the usual material properties. Then, a lumped model for building simulation, whose parameters can be determined from the MBV experimental procedure, is described. Finally, examples of the use of this MBV-based lumped model for moisture prediction in buildings are presented. (author)
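    As a worked example of the definition (the numbers are illustrative, not from the paper): the MBV is the moisture mass exchanged per unit exposed area and per unit change of surrounding relative humidity over the cyclic test.

```python
# Worked MBV example with assumed test data; the NORDTEST scheme cycles
# relative humidity between 33% and 75% (8 h high / 16 h low).
delta_m_g = 12.0      # moisture mass exchanged over one RH cycle, g (assumed)
area_m2 = 0.10        # exposed sample area, m2 (assumed)
d_rh_pct = 75 - 33    # RH swing of the test cycle, %RH

mbv = delta_m_g / (area_m2 * d_rh_pct)    # g/(m2 * %RH)
print(f"MBV = {mbv:.2f} g/(m2 %RH)")      # ~2.86, a high buffering class
```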

  5. Modeling and visual simulation of Microalgae photobioreactor

    Science.gov (United States)

    Zhao, Ming; Hou, Dapeng; Hu, Dawei

    Microalgae are nutritious, highly photosynthetically efficient autotrophic organisms that are widely distributed on land and in the sea. They can be used extensively in medicine, food, aerospace, biotechnology, environmental protection and other fields. Photobioreactors are the key equipment used to cultivate microalgae in large quantities and at high density. In this paper, based on a mathematical model of microalgae grown under different light intensities, a three-dimensional visualization model was built and implemented in 3ds Max, Virtools and other three-dimensional software. As photosynthetic organisms, microalgae efficiently produce oxygen and absorb carbon dioxide. The goal of the visual simulation is to display these changes and their impact on oxygen and carbon dioxide intuitively. In this paper, different temperatures and light intensities were selected to control the photobioreactor, and the dynamic changes of microalgal biomass, oxygen and carbon dioxide were observed, with the aim of providing visualization support for microalgal and photobioreactor research.

  6. Efficient Turbulence Modeling for CFD Wake Simulations

    DEFF Research Database (Denmark)

    van der Laan, Paul

    Wind turbine wakes can cause 10-20% annual energy losses in wind farms, and wake turbulence can decrease the lifetime of wind turbine blades. One way of estimating these effects is the use of computational fluid dynamics (CFD) to simulate wind turbine wakes in the atmospheric boundary layer. Since this flow is in the high Reynolds number regime, it is mainly dictated by turbulence. As a result, the turbulence modeling in CFD dominates the wake characteristics, especially in Reynolds-averaged Navier-Stokes (RANS). The present work is dedicated to studying and developing RANS-based turbulence models... verified with a grid dependency study. With respect to the standard k-ε EVM, the k-ε-fp EVM compares better with measurements of the velocity deficit, especially in the near wake, which translates to improved power deficits of the first wind turbines in a row. When the CFD methodology is applied to a large...

  7. Molecular models and simulations of layered materials

    International Nuclear Information System (INIS)

    Kalinichev, Andrey G.; Cygan, Randall Timothy; Heinz, Hendrik; Greathouse, Jeffery A.

    2008-01-01

    The micro- to nano-sized nature of layered materials, particularly characteristic of naturally occurring clay minerals, limits our ability to fully interrogate their atomic dispositions and crystal structures. The low symmetry, multicomponent compositions, defects, and disorder phenomena of clays and related phases necessitate the use of molecular models and modern simulation methods. Computational chemistry tools based on classical force fields and quantum-chemical methods of electronic structure calculations provide a practical approach to evaluate structure and dynamics of the materials on an atomic scale. Combined with classical energy minimization, molecular dynamics, and Monte Carlo techniques, quantum methods provide accurate models of layered materials such as clay minerals, layered double hydroxides, and clay-polymer nanocomposites

  8. Dynamical Downscaling of NASA/GISS ModelE: Continuous, Multi-Year WRF Simulations

    Science.gov (United States)

    Otte, T.; Bowden, J. H.; Nolte, C. G.; Otte, M. J.; Herwehe, J. A.; Faluvegi, G.; Shindell, D. T.

    2010-12-01

    The WRF Model is being used at the U.S. EPA for dynamical downscaling of the NASA/GISS ModelE fields to assess regional impacts of climate change in the United States. The WRF model has been successfully linked to the ModelE fields in their raw hybrid vertical coordinate, and continuous, multi-year WRF downscaling simulations have been performed. WRF will be used to downscale decadal time slices of ModelE for recent past, current, and future climate as the simulations being conducted for the IPCC Fifth Assessment Report become available. This presentation will focus on the sensitivity to interior nudging within the RCM. The use of interior nudging for downscaled regional climate simulations has been somewhat controversial over the past several years but has been recently attracting attention. Several recent studies that have used reanalysis (i.e., verifiable) fields as a proxy for GCM input have shown that interior nudging can be beneficial toward achieving the desired downscaled fields. In this study, the value of nudging will be shown using fields from ModelE that are downscaled using WRF. Several different methods of nudging are explored, and it will be shown that the method of nudging and the choices made with respect to how nudging is used in WRF are critical to balance the constraint of ModelE against the freedom of WRF to develop its own fields.

  9. Assessing the value of increased model resolution in forecasting fire danger

    Science.gov (United States)

    Jeanne Hoadley; Miriam Rorig; Ken Westrick; Larry Bradshaw; Sue Ferguson; Scott Goodrick; Paul Werth

    2003-01-01

    The fire season of 2000 was used as a case study to assess the value of increasing mesoscale model resolution for fire weather and fire danger forecasting. With a domain centered on Western Montana and Northern Idaho, MM5 simulations were run at 36, 12, and 4-km resolutions for a 30 day period at the height of the fire season. Verification analyses for meteorological...

  10. At the biological modeling and simulation frontier.

    Science.gov (United States)

    Hunt, C Anthony; Ropella, Glen E P; Lam, Tai Ning; Tang, Jonathan; Kim, Sean H J; Engelberg, Jesse A; Sheikh-Bahaei, Shahab

    2009-11-01

    We provide a rationale for and describe examples of synthetic modeling and simulation (M&S) of biological systems. We explain how synthetic methods are distinct from familiar inductive methods. Synthetic M&S is a means to better understand the mechanisms that generate normal and disease-related phenomena observed in research, and how compounds of interest interact with them to alter phenomena. An objective is to build better, working hypotheses of plausible mechanisms. A synthetic model is an extant hypothesis: execution produces an observable mechanism and phenomena. Mobile objects representing compounds carry information enabling components to distinguish between them and react accordingly when different compounds are studied simultaneously. We argue that the familiar inductive approaches contribute to the general inefficiencies being experienced by pharmaceutical R&D, and that use of synthetic approaches accelerates and improves R&D decision-making and thus the drug development process. A reason is that synthetic models encourage and facilitate abductive scientific reasoning, a primary means of knowledge creation and creative cognition. When synthetic models are executed, we observe different aspects of knowledge in action from different perspectives. These models can be tuned to reflect differences in experimental conditions and individuals, making translational research more concrete while moving us closer to personalized medicine.

  11. Plasma simulation studies using multilevel physics models

    International Nuclear Information System (INIS)

    Park, W.; Belova, E.V.; Fu, G.Y.; Tang, X.Z.; Strauss, H.R.; Sugiyama, L.E.

    1999-01-01

    The question of how to proceed toward ever more realistic plasma simulation studies using ever increasing computing power is addressed. The answer presented here is the M3D (Multilevel 3D) project, which has developed a code package with a hierarchy of physics levels that resolve increasingly complete subsets of phase-spaces and are thus increasingly more realistic. The rationale for the multilevel physics models is given. Each physics level is described and examples of its application are given. The existing physics levels are fluid models (3D configuration space), namely magnetohydrodynamic (MHD) and two-fluids; and hybrid models, namely gyrokinetic-energetic-particle/MHD (5D energetic particle phase-space), gyrokinetic-particle-ion/fluid-electron (5D ion phase-space), and full-kinetic-particle-ion/fluid-electron level (6D ion phase-space). Resolving electron phase-space (5D or 6D) remains a future project. Phase-space-fluid models are not used in favor of δf particle models. A practical and accurate nonlinear fluid closure for noncollisional plasmas seems not likely in the near future.

  12. Plasma simulation studies using multilevel physics models

    International Nuclear Information System (INIS)

    Park, W.; Belova, E.V.; Fu, G.Y.

    2000-01-01

    The question of how to proceed toward ever more realistic plasma simulation studies using ever increasing computing power is addressed. The answer presented here is the M3D (Multilevel 3D) project, which has developed a code package with a hierarchy of physics levels that resolve increasingly complete subsets of phase-spaces and are thus increasingly more realistic. The rationale for the multilevel physics models is given. Each physics level is described and examples of its application are given. The existing physics levels are fluid models (3D configuration space), namely magnetohydrodynamic (MHD) and two-fluids; and hybrid models, namely gyrokinetic-energetic-particle/MHD (5D energetic particle phase-space), gyrokinetic-particle-ion/fluid-electron (5D ion phase-space), and full-kinetic-particle-ion/fluid-electron level (6D ion phase-space). Resolving electron phase-space (5D or 6D) remains a future project. Phase-space-fluid models are not used in favor of delta f particle models. A practical and accurate nonlinear fluid closure for noncollisional plasmas seems not likely in the near future

  13. Stabilising the global greenhouse. A simulation model

    International Nuclear Information System (INIS)

    Michaelis, P.

    1993-01-01

    This paper investigates the economic implications of a comprehensive approach to greenhouse policies that strives to stabilise the atmospheric concentration of greenhouse gases at an ecologically determined threshold level. In a theoretical optimisation model, conditions for an efficient allocation of abatement effort among pollutants and over time are derived. The model is empirically specified and adapted to a dynamic GAMS algorithm. By various simulation runs for the period 1990 to 2110, the economics of greenhouse gas accumulation are explored. In particular, the long-run costs associated with the above stabilisation target are evaluated for three different policy scenarios: i) a comprehensive approach that covers all major greenhouse gases simultaneously, ii) a piecemeal approach that is limited to reducing CO2 emissions, and iii) a ten-year moratorium that postpones abatement effort until new scientific evidence on the greenhouse effect becomes available. Comparing the simulation results suggests that a piecemeal approach would considerably increase total cost, whereas a ten-year moratorium might be reasonable even if the probability of 'good news' is comparatively small. (orig.)

  14. Modeling lift operations with SAS® Simulation Studio

    Science.gov (United States)

    Kar, Leow Soo

    2016-10-01

    Lifts or elevators are an essential part of multistorey buildings, providing vertical transportation for their occupants. In large high-rise apartment buildings the occupants are permanent, while in buildings like hospitals or office blocks the occupants are temporary users or visitors who come in to work or to visit, so the populations of such buildings are much higher than those of residential apartments. It is common these days for large office blocks or hospitals to have at least 8 to 10 lifts serving their population. In order to optimize the level of service performance, different transportation schemes are devised to control the lift operations. For example, one lift may be assigned to serve only the even floors and another only the odd floors, etc. In this paper, a basic lift system is modelled using SAS Simulation Studio to study the effect of factors such as the number of floors, the capacity of the lift car, the arrival and exit rates of passengers at each floor, and peak and off-peak periods on system performance. The simulation is applied to a real lift operation in Sunway College's North Building to validate the model.
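    The same kind of model can be sketched as a discrete-event simulation in Python with the simpy library, as an analogue of the graphical Simulation Studio model; the arrival rate, ride time per floor and car capacity below are assumptions.

```python
# Minimal discrete-event lift sketch (simpy): passengers arrive in a
# Poisson stream, wait for space in the car, then ride to a random
# destination floor. All parameter values are assumed.
import random
import simpy

FLOORS, CAPACITY, TRAVEL_PER_FLOOR = 10, 13, 2.0   # assumed values
waits = []

def passenger(env, lift):
    arrive = env.now
    with lift.request() as req:          # wait for space in the car
        yield req
        waits.append(env.now - arrive)
        trip = random.randint(1, FLOORS - 1) * TRAVEL_PER_FLOOR
        yield env.timeout(trip)          # ride to the destination floor

def source(env, lift, rate_per_min=3.0):
    while True:
        yield env.timeout(random.expovariate(rate_per_min / 60.0))
        env.process(passenger(env, lift))

env = simpy.Environment()
lift = simpy.Resource(env, capacity=CAPACITY)
env.process(source(env, lift))
env.run(until=3600)                      # one peak hour, in seconds
print(f"{len(waits)} passengers served, "
      f"mean wait {sum(waits) / len(waits):.1f} s")
```

    Here each passenger simply holds one unit of car capacity for the ride, a coarse stand-in for the dispatching schemes (even/odd floor assignment, etc.) studied in the paper.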

  15. Simulation as a vehicle for enhancing collaborative practice models.

    Science.gov (United States)

    Jeffries, Pamela R; McNelis, Angela M; Wheeler, Corinne A

    2008-12-01

    Clinical simulation used in a collaborative practice approach is a powerful tool to prepare health care providers for shared responsibility for patient care. Clinical simulations are being used increasingly in professional curricula to prepare providers for quality practice. Little is known, however, about how these simulations can be used to foster collaborative practice across disciplines. This article provides an overview of what simulation is, what collaborative practice models are, and how to set up a model using simulations. An example of a collaborative practice model is presented, and nursing implications of using a collaborative practice model in simulations are discussed.

  16. Modeling and numerical simulations of the influenced Sznajd model

    Science.gov (United States)

    Karan, Farshad Salimi Naneh; Srinivasan, Aravinda Ramakrishnan; Chakraborty, Subhadeep

    2017-08-01

    This paper investigates the effects of independent nonconformists or influencers on the behavioral dynamics of a population of agents interacting with each other based on the Sznajd model. The system is modeled on a complete graph using the master equation. The acquired equation has been numerically solved. The accuracy of the mathematical model and its corresponding assumptions has been validated by numerical simulations. Regions of initial magnetization have been found from which the system converges to one of two unique steady-state PDFs, depending on the distribution of influencers. The scaling property and entropy of the stationary system in the presence of varying levels of influence have been presented and discussed.

  17. Modelling and Simulation of Gas Engines Using Aspen HYSYS

    Directory of Open Access Journals (Sweden)

    M. C. Ekwonu

    2013-12-01

    In this paper a gas engine model was developed in Aspen HYSYS V7.3 and validated against the Waukesha 16V275GL+ gas engine. Fuel flexibility, fuel types and part-load performance of the gas engine were investigated. The design variability analysis revealed that the gas engine can operate on poor fuels with a low lower heating value (LHV), such as landfill gas, sewage gas and biogas, with biogas offering potential integration with bottoming cycles when compared to natural gas. The gas engine simulation gave an efficiency of 40.7% and a power output of 3592 kW.

  18. Geomechanical Simulation of Bayou Choctaw Strategic Petroleum Reserve - Model Calibration.

    Energy Technology Data Exchange (ETDEWEB)

    Park, Byoung [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-02-01

    A finite element numerical analysis model has been constructed that consists of a realistic mesh capturing the geometries of the Bayou Choctaw (BC) Strategic Petroleum Reserve (SPR) site and a multi-mechanism deformation (M-D) salt constitutive model, using the daily data of actual wellhead pressure and oil-brine interface. The salt creep rate is not uniform in the salt dome, and the creep test data for BC salt are limited. Therefore, model calibration is necessary to simulate the geomechanical behavior of the salt dome. The cavern volumetric closures of SPR caverns calculated from CAVEMAN are used as the field baseline measurement. The structure factor, A2, and transient strain limit factor, K0, in the M-D constitutive model are used for the calibration. The A2 value obtained experimentally from BC salt and the K0 value of Waste Isolation Pilot Plant (WIPP) salt are used as the baseline values. To adjust the magnitudes of A2 and K0, multiplication factors A2F and K0F are defined, respectively. The A2F and K0F values of the salt dome and the salt drawdown skins surrounding each SPR cavern have been determined through a number of back-fitting analyses. The cavern volumetric closures calculated from this model correspond to the predictions from CAVEMAN for six SPR caverns. Therefore, this model is able to predict past and future geomechanical behaviors of the salt dome, caverns, caprock, and interbed layers. The geological concerns raised at the BC site will be explained from this model in a follow-up report.

  19. Cultural ecosystem services of mountain regions: Modelling the aesthetic value

    OpenAIRE

    Schirpke, Uta; Timmermann, Florian; Tappeiner, Ulrike; Tasser, Erich

    2016-01-01

    Mountain regions meet an increasing demand for pleasant landscapes, offering many cultural ecosystem services to both their residents and tourists. As a result of global change, land managers and policy makers are faced with changes to this landscape and need efficient evaluation techniques to assess cultural ecosystem services. This study provides a spatially explicit modelling approach to estimating aesthetic landscape values by relating spatial landscape patterns to human perceptions via a...

  20. Dispersion modeling by kinematic simulation: Cloud dispersion model

    International Nuclear Information System (INIS)

    Fung, J C H; Perkins, R J

    2008-01-01

    A new technique has been developed to compute mean and fluctuating concentrations in complex turbulent flows (tidal currents near a coast and in the deep ocean). An initial distribution of material is discretized into many small clouds which are advected by a combination of the mean flow and large-scale turbulence. The turbulence can be simulated either by kinematic simulation (KS) or by direct numerical simulation. The clouds also diffuse relative to their centroids; the statistics for this are obtained from a separate calculation of the growth of individual clouds in small-scale turbulence, generated by KS. The ensemble of discrete clouds is periodically re-discretized, to limit the size of the small clouds and prevent overlapping. The model is illustrated with simulations of dispersion in uniform flow, and the results are compared with analytic, steady-state solutions. The aim of this study is to understand how pollutants disperse in a turbulent flow through a numerical simulation of fluid particle motion in a random flow field generated by Fourier modes. Although this homogeneous turbulence is a rather 'simple' flow, it represents a building block toward understanding pollutant dispersion in more complex flows. The results presented here are preliminary in nature, but we expect that similar qualitative results should be observed in a genuine turbulent flow.

  1. Forecasting Lightning Threat using Cloud-resolving Model Simulations

    Science.gov (United States)

    McCaul, E. W., Jr.; Goodman, S. J.; LaCasse, K. M.; Cecil, D. J.

    2009-01-01

    As numerical forecasts capable of resolving individual convective clouds become more common, it is of interest to see if quantitative forecasts of lightning flash rate density are possible, based on fields computed by the numerical model. Previous observational research has shown robust relationships between observed lightning flash rates and inferred updraft and large precipitation ice fields in the mixed phase regions of storms, and that these relationships might allow simulated fields to serve as proxies for lightning flash rate density. It is shown in this paper that two simple proxy fields do indeed provide reasonable and cost-effective bases for creating time-evolving maps of predicted lightning flash rate density, judging from a series of diverse simulation case study events in North Alabama for which Lightning Mapping Array data provide ground truth. One method is based on the product of upward velocity and the mixing ratio of precipitating ice hydrometeors, modeled as graupel only, in the mixed phase region of storms at the -15 °C level, while the second method is based on the vertically integrated amounts of ice hydrometeors in each model grid column. Each method can be calibrated by comparing domainwide statistics of the peak values of simulated flash rate proxy fields against domainwide peak total lightning flash rate density data from observations. Tests show that the first method is able to capture much of the temporal variability of the lightning threat, while the second method does a better job of depicting the areal coverage of the threat. A blended solution is designed to retain most of the temporal sensitivity of the first method, while adding the improved spatial coverage of the second. Weather Research and Forecast Model simulations of selected North Alabama cases show that this model can distinguish the general character and intensity of most convective events, and that the proposed methods show promise as a means of generating
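    The two proxy fields can be computed from model output in a few lines; the sketch below uses a toy single-column profile, and the calibration constants and blend weights are placeholders rather than the values fitted to the North Alabama cases.

```python
# Toy single-column sketch of the two lightning proxies: (1) updraft
# speed times graupel mixing ratio at the -15 C level, and (2) the
# vertically integrated ice content. Profiles and constants are
# illustrative assumptions.
import numpy as np

nz, dz = 40, 500.0                         # model levels, layer depth (m)
rho = np.linspace(1.1, 0.3, nz)            # air density profile, kg/m3
w = np.linspace(0.0, 12.0, nz)             # updraft speed, m/s (toy)
q_graupel = 2e-3 * np.exp(-0.5 * ((np.arange(nz) - 14) / 5.0) ** 2)  # kg/kg
q_ice_all = 1.5 * q_graupel                # all precipitating/cloud ice

k15 = 14                                   # index of the -15 C level (toy)
proxy1 = w[k15] * q_graupel[k15]           # flux-type proxy at -15 C
proxy2 = np.sum(rho * q_ice_all * dz)      # column-integrated ice, kg/m2

c1, c2 = 0.042, 0.20                       # placeholder calibration factors
blend = 0.95 * c1 * proxy1 + 0.05 * c2 * proxy2   # blended threat estimate
print(f"proxy1={proxy1:.4f}, proxy2={proxy2:.3f}, blended ~{blend:.4f}")
```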

  2. Intercomparison of terrestrial carbon fluxes and carbon use efficiency simulated by CMIP5 Earth System Models

    Science.gov (United States)

    Kim, Dongmin; Lee, Myong-In; Jeong, Su-Jong; Im, Jungho; Cha, Dong Hyun; Lee, Sanggyun

    2017-12-01

    This study compares historical simulations of the terrestrial carbon cycle produced by 10 Earth System Models (ESMs) that participated in the fifth phase of the Coupled Model Intercomparison Project (CMIP5). Using MODIS satellite estimates, this study validates the simulation of gross primary production (GPP), net primary production (NPP), and carbon use efficiency (CUE), which depend on plant function types (PFTs). The models show noticeable deficiencies compared to the MODIS data in the simulation of the spatial patterns of GPP and NPP and large differences among the simulations, although the multi-model ensemble (MME) mean provides a realistic global mean value and spatial distributions. The larger model spreads in GPP and NPP compared to those of surface temperature and precipitation suggest that the differences among simulations in terms of the terrestrial carbon cycle are largely due to uncertainties in the parameterization of terrestrial carbon fluxes by vegetation. The models also exhibit large spatial differences in their simulated CUE values and at locations where the dominant PFT changes, primarily due to differences in the parameterizations. While the MME-simulated CUE values show a strong dependence on surface temperatures, the observed CUE values from MODIS show greater complexity, as well as non-linear sensitivity. This leads to the overall underestimation of CUE using most of the PFTs incorporated into current ESMs. The results of this comparison suggest that more careful and extensive validation is needed to improve the terrestrial carbon cycle in terms of ecosystem-level processes.
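    For clarity, the carbon use efficiency compared above is simply the ratio CUE = NPP/GPP, the fraction of gross fixed carbon retained as net primary production; a toy grid-cell example:

```python
# CUE = NPP / GPP per grid cell; the values below are toy numbers, not
# MODIS or CMIP5 output.
import numpy as np

gpp = np.array([2.1, 1.4, 0.9])   # gC m-2 day-1 per grid cell (toy)
npp = np.array([1.0, 0.8, 0.4])
cue = npp / gpp
print(cue.round(2))               # fraction of fixed carbon retained
```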

  3. The Deficit Model and the Forgotten Moral Values

    Directory of Open Access Journals (Sweden)

    Marko Ahteensuu

    2011-03-01

    This paper was presented at the first meeting of the NSU study group “Conceptions of ethical and social values in post-secular society: Towards a new ethical imagination in a cosmopolitan world society”, held on January 28-30, 2011 at Copenhagen Business School. The deficit model explains the general public’s negative attitudes towards science and/or certain scientific applications by the public’s scientific ignorance. The deficit model is commonly criticized for oversimplifying the connection between scientific knowledge and attitudes. Other relevant factors – such as ideology, social identity, trust, culture, and worldviews – should be taken into consideration to a greater extent. We argue that explanations based on the proposed factors sometimes implicitly reintroduce deficit-model-type thinking. The strength of these factors is that they broaden the explanations to cover moral issues. We analyse two central argument types in the GMO discussion and show the central role of moral values in them. Thus, as long as arguments are seen to affect the attitudes of the general public, the role of moral values should be made explicit in explanations of those attitudes.

  4. Comparison of performance of simulation models for floor heating

    DEFF Research Database (Denmark)

    Weitzmann, Peter; Svendsen, Svend

    2005-01-01

    This paper describes the comparison of performance of simulation models for floor heating with different level of detail in the modelling process. The models are compared in an otherwise identical simulation model containing room model, walls, windows, ceiling and ventilation system. By exchanging...

  5. Conceptual Model of Quantities, Units, Dimensions, and Values

    Science.gov (United States)

    Rouquette, Nicolas F.; DeKoenig, Hans-Peter; Burkhart, Roger; Espinoza, Huascar

    2011-01-01

    JPL collaborated with experts from industry and other organizations to develop a conceptual model of quantities, units, dimensions, and values based on the current work of the ISO 80000 committee revising the International System of Units & Quantities based on the International Vocabulary of Metrology (VIM). By providing support for ISO 80000 in SysML via the International Vocabulary of Metrology (VIM), this conceptual model provides, for the first time, a standard-based approach for bringing issues of unit coherence and dimensional analysis into the practice of systems engineering with SysML-based tools. This conceptual model provides support for two kinds of analyses specified in the International Vocabulary of Metrology (VIM): coherence of units as well as of systems of units, and dimension analysis of systems of quantities. To provide a solid and stable foundation, the model for defining quantities, units, dimensions, and values in SysML is explicitly based on the concepts defined in VIM. At the same time, the model library is designed in such a way that extensions to the ISQ (International System of Quantities) and SI Units (Système International d'Unités) can be represented, as well as any alternative systems of quantities and units. The model library can be used to support SysML user models in various ways. A simple approach is to define and document libraries of reusable systems of units and quantities for reuse across multiple projects, and to link units and quantity kinds from these libraries to Unit and QuantityKind stereotypes defined in SysML user models.

  6. A complex-valued firing-rate model that approximates the dynamics of spiking networks.

    Directory of Open Access Journals (Sweden)

    Evan S Schaffer

    2013-10-01

    Firing-rate models provide an attractive approach for studying large neural networks because they can be simulated rapidly and are amenable to mathematical analysis. Traditional firing-rate models assume a simple form in which the dynamics are governed by a single time constant. These models fail to replicate certain dynamic features of populations of spiking neurons, especially those involving synchronization. We present a complex-valued firing-rate model derived from an eigenfunction expansion of the Fokker-Planck equation and apply it to the linear, quadratic and exponential integrate-and-fire models. Despite being almost as simple as a traditional firing-rate description, this model can reproduce firing-rate dynamics due to partial synchronization of the action potentials in a spiking model, and it successfully predicts the transition to spike synchronization in networks of coupled excitatory and inhibitory neurons.

  7. A complex-valued firing-rate model that approximates the dynamics of spiking networks.

    Science.gov (United States)

    Schaffer, Evan S; Ostojic, Srdjan; Abbott, L F

    2013-10-01

    Firing-rate models provide an attractive approach for studying large neural networks because they can be simulated rapidly and are amenable to mathematical analysis. Traditional firing-rate models assume a simple form in which the dynamics are governed by a single time constant. These models fail to replicate certain dynamic features of populations of spiking neurons, especially those involving synchronization. We present a complex-valued firing-rate model derived from an eigenfunction expansion of the Fokker-Planck equation and apply it to the linear, quadratic and exponential integrate-and-fire models. Despite being almost as simple as a traditional firing-rate description, this model can reproduce firing-rate dynamics due to partial synchronization of the action potentials in a spiking model, and it successfully predicts the transition to spike synchronization in networks of coupled excitatory and inhibitory neurons.
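    To illustrate the qualitative difference from a single-time-constant rate model, here is a caricature in Python (not the paper's derived equations): a rate variable governed by one complex eigenvalue λ = -α + iω rings after a step in its input, mimicking transient spike synchronization, whereas a real, single-time-constant model relaxes monotonically. All parameter values are assumptions.

```python
# Caricature of a complex-valued firing-rate model: the rate relaxes
# toward its steady-state value through a single complex eigenmode, so
# step responses ring instead of decaying monotonically. Parameter
# values are illustrative assumptions.
alpha, omega = 20.0, 120.0          # 1/s; damping and oscillation frequency
dt, T = 1e-4, 0.3
lam = complex(-alpha, omega)

r = 5.0 + 0j                        # complex rate variable, Hz
rates = []
for step in range(int(T / dt)):
    r_inf = 5.0 if step * dt < 0.1 else 12.0    # step in the input drive
    r += dt * lam * (r - r_inf)     # relax toward r_inf through lam
    rates.append(r.real)            # observable firing rate

print(f"peak rate {max(rates):.1f} Hz, final rate {rates[-1]:.1f} Hz")
```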

  8. Representation of Solar Capacity Value in the ReEDS Capacity Expansion Model

    Energy Technology Data Exchange (ETDEWEB)

    Sigrin, B. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Sullivan, P. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Ibanez, E. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Margolis, R. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2014-03-01

    An important issue for electricity system operators is the estimation of renewables' capacity contributions to reliably meeting system demand, or their capacity value. While the capacity value of thermal generation can be estimated easily, assessment of wind and solar requires a more nuanced approach due to the resource variability. Reliability-based methods, particularly assessment of the Effective Load-Carrying Capacity, are considered to be the most robust and widely-accepted techniques for addressing this resource variability. This report compares estimates of solar PV capacity value by the Regional Energy Deployment System (ReEDS) capacity expansion model against two sources. The first comparison is against values published by utilities or other entities for known electrical systems at existing solar penetration levels. The second comparison is against a time-series ELCC simulation tool for high renewable penetration scenarios in the Western Interconnection. Results from the ReEDS model are found to compare well with both comparisons, despite being resolved at a super-hourly temporal resolution. Two results are relevant for other capacity-based models that use a super-hourly resolution to model solar capacity value. First, solar capacity value should not be parameterized as a static value, but must decay with increasing penetration. This is because -- for an afternoon-peaking system -- as solar penetration increases, the system's peak net load shifts to later in the day -- when solar output is lower. Second, long-term planning models should determine system adequacy requirements in each time period in order to approximate LOLP calculations. Within the ReEDS model we resolve these issues by using a capacity value estimate that varies by time-slice. Within each time period the net load and shadow price on ReEDS's planning reserve constraint signals the relative importance of additional firm capacity.
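    As a rough illustration of why solar capacity value decays with penetration, the following sketch uses a common top-net-load-hours approximation (not the reliability-based ELCC calculation that the report relies on); the load and solar profiles are synthetic.

```python
# Top-net-load-hours approximation of solar capacity value: as solar
# penetration grows, the peak net-load hours shift into the evening
# and the capacity value falls. Profiles are synthetic.
import numpy as np

rng = np.random.default_rng(11)
hours = np.arange(8760)
load = (1000 + 300 * np.cos(2 * np.pi * (hours % 24 - 15) / 24)
        + rng.normal(0, 40, 8760))                  # afternoon-peaking load, MW
shape = np.clip(np.sin(2 * np.pi * (hours % 24 - 6) / 24), 0, None)  # per MW

for nameplate in (100, 500, 1000):                  # MW of installed solar
    net_load = load - nameplate * shape
    top = np.argsort(net_load)[-100:]               # 100 highest net-load hours
    cv = shape[top].mean()                          # fraction of nameplate
    print(f"{nameplate:>5} MW solar -> capacity value ~{cv:.2f}")
```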

  9. Heat waves over Central Europe in regional climate model simulations

    Science.gov (United States)

    Lhotka, Ondřej; Kyselý, Jan

    2014-05-01

    Regional climate models (RCMs) have become a powerful tool for exploring impacts of global climate change on a regional scale. The aim of the study is to evaluate the capability of RCMs to reproduce characteristics of major heat waves over Central Europe in their simulations of the recent climate (1961-2000), with a focus on the most severe and longest Central European heat wave, which occurred in 1994. We analyzed 7 RCM simulations with a high resolution (0.22°) from the ENSEMBLES project, driven by the ERA-40 reanalysis. In observed data (the E-OBS 9.0 dataset), heat waves were defined on the basis of deviations of daily maximum temperature (Tmax) from the 95% quantile of the summer Tmax distribution in grid points over Central Europe. The same methodology was applied in the RCM simulations; we used corresponding 95% quantiles (calculated for each RCM and grid point) in order to remove the bias of modelled Tmax. While climatological characteristics of heat waves are reproduced reasonably well in the RCM ensemble, we found major deficiencies in simulating heat waves in individual years. For example, METNOHIRHAM simulated very severe heat waves in 1996, when no heat wave was observed. Focusing on the major 1994 heat wave, considerable differences in simulated temperature patterns were found among the RCMs. The differences in the temperature patterns were clearly linked to the simulated amount of precipitation during this event. The 1994 heat wave was almost absent in all RCMs that did not capture the observed precipitation deficit, while it was by far most pronounced in KNMI-RACMO, which simulated virtually no precipitation over Central Europe during the 15-day period of the heat wave. In contrast to precipitation, values of evaporative fraction in the RCMs were not linked to the severity of the simulated 1994 heat wave. This suggests a possible major contribution of other factors such as cloud cover and associated downward shortwave radiation. Therefore, a more detailed
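    The detection step described above (flag days whose Tmax exceeds the grid point's 95% summer quantile and group consecutive flagged days into events) can be sketched as follows; synthetic data stand in for the E-OBS and RCM fields, and the 3-day minimum duration is an assumption.

```python
# Sketch of percentile-based heat-wave detection on one grid point's
# summer Tmax series; data are synthetic and the 3-day minimum event
# length is assumed.
import numpy as np

rng = np.random.default_rng(0)
tmax = rng.normal(26.0, 4.0, size=92)          # one summer of daily Tmax (toy)
q95 = np.quantile(tmax, 0.95)                  # per-gridpoint threshold
hot = tmax > q95

events, start = [], None
for day, flag in enumerate(hot):
    if flag and start is None:
        start = day
    elif not flag and start is not None:
        if day - start >= 3:                   # assumed minimum length
            events.append((start, day - 1))
        start = None
if start is not None and len(hot) - start >= 3:
    events.append((start, len(hot) - 1))       # event running to season end

print(f"threshold {q95:.1f} C, events: {events}")
```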

  10. Systematic simulations of modified gravity: symmetron and dilaton models

    International Nuclear Information System (INIS)

    Brax, Philippe; Davis, Anne-Christine; Li, Baojiu; Winther, Hans A.; Zhao, Gong-Bo

    2012-01-01

    We study the linear and nonlinear structure formation in the dilaton and symmetron models of modified gravity using a generic parameterisation which describes a large class of scenarios using only a few parameters, such as the coupling between the scalar field and the matter, and the range of the scalar force on very large scales. For this we have modified the N-body simulation code ECOSMOG, which is a variant of RAMSES working in modified gravity scenarios, to perform a set of 110 simulations for different models and parameter values, including the default ΛCDM. These simulations enable us to explore a large portion of the parameter space. We have studied the effects of modified gravity on the matter power spectrum and mass function, and found a rich and interesting phenomenology where the difference with the ΛCDM template cannot be reproduced by a linear analysis even on scales as large as k ∼ 0.05 h Mpc⁻¹. Our results show the full effect of screening on nonlinear structure formation and the associated deviation from ΛCDM. We also investigate how differences in the force mediated by the scalar field in modified gravity models lead to qualitatively different features for the nonlinear power spectrum and the halo mass function, and how varying the individual model parameters changes these observables. The differences are particularly large in the nonlinear power spectra whose shapes for f(R), dilaton and symmetron models vary greatly, and where the characteristic bump around 1 h Mpc⁻¹ of f(R) models is preserved for symmetrons, whereas an increase on much smaller scales is particular to symmetrons. No bump is present for dilatons where a flattening of the power spectrum takes place on small scales. These deviations from ΛCDM and the differences between modified gravity models, such as dilatons and symmetrons, could be tested with future surveys.

  11. Tecnomatix Plant Simulation modeling and programming by means of examples

    CERN Document Server

    Bangsow, Steffen

    2015-01-01

    This book systematically introduces the development of simulation models as well as the implementation and evaluation of simulation experiments with Tecnomatix Plant Simulation. It is addressed to all users of Plant Simulation who have more complex tasks to handle, while also offering an easy entry into the program. Particular attention is paid to introducing the simulation flow language SimTalk and its use in various areas of the simulation. The author demonstrates with over 200 examples how to combine the blocks into simulation models and how to use SimTalk for complex control and analysis.

  12. Nonlinear distortion in wireless systems modeling and simulation with Matlab

    CERN Document Server

    Gharaibeh, Khaled M

    2011-01-01

    This book covers the principles of modeling and simulation of nonlinear distortion in wireless communication systems, with MATLAB simulations and techniques. In this book, the author describes the principles of modeling and simulation of nonlinear distortion in single- and multichannel wireless communication systems using both deterministic and stochastic signals. Models and simulation methods for nonlinear amplifiers explain in detail how to analyze and evaluate the performance of data communication links under nonlinear amplification. The book addresses the analysis of nonlinear systems

  13. Multiple Time Series Ising Model for Financial Market Simulations

    International Nuclear Information System (INIS)

    Takaishi, Tetsuya

    2015-01-01

    In this paper we propose an Ising model which simulates multiple financial time series. Our model introduces an interaction which couples to the spins of other systems. Simulations from our model show that the time series exhibit the volatility clustering that is often observed in real financial markets. Furthermore, we find non-zero cross-correlations between the volatilities from our model. Thus our model can simulate stock markets where the volatilities of stocks are mutually correlated.

  14. Cognitive Modeling for Agent-Based Simulation of Child Maltreatment

    Science.gov (United States)

    Hu, Xiaolin; Puddy, Richard

    This paper extends previous work to develop cognitive modeling for agent-based simulation of child maltreatment (CM). The developed model is inspired by parental efficacy, parenting stress, and the theory of planned behavior. It provides an explanatory, process-oriented model of CM and incorporates causal relationships and feedback loops among different factors in the social ecology in order to simulate the dynamics of CM. We describe the model and present simulation results to demonstrate the features of this model.

  15. Modeling and Simulation Techniques for Large-Scale Communications Modeling

    National Research Council Canada - National Science Library

    Webb, Steve

    1997-01-01

    .... Tests of random number generators were also developed and applied to CECOM models. It was found that synchronization of random number strings in simulations is easy to implement and can provide significant savings for making comparative studies. If synchronization is in place, then statistical experiment design can be used to provide information on the sensitivity of the output to input parameters. The report concludes with recommendations and an implementation plan.

  16. An improved multi-value cellular automata model for heterogeneous bicycle traffic flow

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Sheng [College of Civil Engineering and Architecture, Zhejiang University, Hangzhou, 310058 China (China); Qu, Xiaobo [Griffith School of Engineering, Griffith University, Gold Coast, 4222 Australia (Australia); Xu, Cheng [Department of Transportation Management Engineering, Zhejiang Police College, Hangzhou, 310053 China (China); College of Transportation, Jilin University, Changchun, 130022 China (China); Ma, Dongfang, E-mail: mdf2004@zju.edu.cn [Ocean College, Zhejiang University, Hangzhou, 310058 China (China); Wang, Dianhai [College of Civil Engineering and Architecture, Zhejiang University, Hangzhou, 310058 China (China)

    2015-10-16

    This letter develops an improved multi-value cellular automata model for heterogeneous bicycle traffic flow taking the higher maximum speed of electric bicycles into consideration. The update rules of both regular and electric bicycles are improved, with maximum speeds of two and three cells per second respectively. Numerical simulation results for deterministic and stochastic cases are obtained. The fundamental diagrams and multiple states effects under different model parameters are analyzed and discussed. Field observations were made to calibrate the slowdown probabilities. The results imply that the improved extended Burgers cellular automata (IEBCA) model is more consistent with the field observations than previous models and greatly enhances the realism of the bicycle traffic model. - Highlights: • We proposed an improved multi-value CA model with higher maximum speed. • Update rules are introduced for heterogeneous bicycle traffic with maximum speed 2 and 3 cells/s. • Simulation results of the proposed model are consistent with field bicycle data. • Slowdown probabilities of both regular and electric bicycles are calibrated.

  17. An improved multi-value cellular automata model for heterogeneous bicycle traffic flow

    International Nuclear Information System (INIS)

    Jin, Sheng; Qu, Xiaobo; Xu, Cheng; Ma, Dongfang; Wang, Dianhai

    2015-01-01

    This letter develops an improved multi-value cellular automata model for heterogeneous bicycle traffic flow taking the higher maximum speed of electric bicycles into consideration. The update rules of both regular and electric bicycles are improved, with maximum speeds of two and three cells per second, respectively. Numerical simulation results for deterministic and stochastic cases are obtained. The fundamental diagrams and multiple states effects under different model parameters are analyzed and discussed. Field observations were made to calibrate the slowdown probabilities. The results imply that the improved extended Burgers cellular automata (IEBCA) model is more consistent with the field observations than previous models and greatly enhances the realism of the bicycle traffic model. - Highlights: • We proposed an improved multi-value CA model with higher maximum speed. • Update rules are introduced for heterogeneous bicycle traffic with maximum speed 2 and 3 cells/s. • Simulation results of the proposed model are consistent with field bicycle data. • Slowdown probabilities of both regular and electric bicycles are calibrated.
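    The flavor of the underlying multi-value (Burgers) cellular automaton that these update rules extend can be shown in a few lines; the sketch below implements only the basic speed-1 rule on a ring (the speed-2/speed-3 rules and the calibrated slowdown probabilities of the IEBCA are not reproduced), with an assumed cell capacity and initial density.

```python
# Basic multi-value (Burgers) CA on a ring: each cell holds up to L
# bicycles, and the number moving forward is limited by the free
# capacity of the next cell. Parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)
L, n_cells, steps = 3, 100, 200          # cell capacity, ring length, steps
u = rng.integers(0, L + 1, n_cells)      # bicycles per cell

flows = []
for _ in range(steps):
    space = L - np.roll(u, -1)           # free capacity of the next cell
    move = np.minimum(u, space)          # bicycles able to advance one cell
    u = u - move + np.roll(move, 1)      # synchronous, conservative update
    flows.append(move.mean())

print(f"density {u.mean() / L:.2f}, "
      f"mean flow {np.mean(flows):.3f} bikes/cell/step")
```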

  18. Good Models Gone Bad: Quantifying and Predicting Parameter-Induced Climate Model Simulation Failures

    Science.gov (United States)

    Lucas, D. D.; Klein, R.; Tannahill, J.; Brandon, S.; Covey, C. C.; Domyancic, D.; Ivanova, D. P.

    2012-12-01

    Simulations using IPCC-class climate models are subject to fail or crash for a variety of reasons. Statistical analysis of the failures can yield useful insights to better understand and improve the models. During the course of uncertainty quantification (UQ) ensemble simulations to assess the effects of ocean model parameter uncertainties on climate simulations, we experienced a series of simulation failures of the Parallel Ocean Program (POP2). About 8.5% of our POP2 runs failed for numerical reasons at certain combinations of parameter values. We apply support vector machine (SVM) classification from the fields of pattern recognition and machine learning to quantify and predict the probability of failure as a function of the values of 18 POP2 parameters. The SVM classifiers readily predict POP2 failures in an independent validation ensemble, and are subsequently used to determine the causes of the failures via a global sensitivity analysis. Four parameters related to ocean mixing and viscosity are identified as the major sources of POP2 failures. Our method can be used to improve the robustness of complex scientific models to parameter perturbations and to better steer UQ ensembles. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344 and was funded by the Uncertainty Quantification Strategic Initiative Laboratory Directed Research and Development Project at LLNL under project tracking code 10-SI-013 (UCRL LLNL-ABS-569112).
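
    The classification step can be illustrated with a small scikit-learn sketch on synthetic stand-in data; the 18-dimensional parameter space is kept, but the failure rule, sample sizes and hyperparameters below are invented, not the actual POP2 ensemble or the authors' settings.

    ```python
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = rng.uniform(0.0, 1.0, size=(2000, 18))       # 18 normalized model parameters
    # invented failure rule: runs crash when two "mixing" parameters are both extreme
    y = ((X[:, 0] > 0.8) & (X[:, 1] < 0.2)).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = SVC(kernel="rbf", C=10.0, gamma="scale", probability=True).fit(X_tr, y_tr)
    print("validation accuracy    :", clf.score(X_te, y_te))
    print("P(failure), test point :", clf.predict_proba(X_te[:1])[0, 1])
    ```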

  19. Modeling of pathogen survival during simulated gastric digestion.

    Science.gov (United States)

    Koseki, Shige; Mizuno, Yasuko; Sotome, Itaru

    2011-02-01

    The objective of the present study was to develop a mathematical model of pathogenic bacterial inactivation kinetics in a gastric environment in order to further understand a part of the infectious dose-response mechanism. The major bacterial pathogens Listeria monocytogenes, Escherichia coli O157:H7, and Salmonella spp. were examined by using simulated gastric fluid adjusted to various pH values. To correspond to the various pHs in a stomach during digestion, a modified logistic differential equation model and the Weibull differential equation model were examined. The specific inactivation rate for each pathogen was successfully described by a square-root model as a function of pH. The square-root models were combined with the modified logistic differential equation to obtain a complete inactivation curve. Both the modified logistic and Weibull models provided a highly accurate fitting of the static pH conditions for every pathogen. However, while the residuals plots of the modified logistic model indicated no systematic bias and/or regional prediction problems, the residuals plots of the Weibull model showed a systematic bias. The modified logistic model appropriately predicted the pathogen behavior in the simulated gastric digestion process with actual food, including cut lettuce, minced tuna, hamburger, and scrambled egg. Although the developed model enabled us to predict pathogen inactivation during gastric digestion, its results also suggested that the ingested bacteria in the stomach would barely be inactivated in the real digestion process. The results of this study will provide important information on a part of the dose-response mechanism of bacterial pathogens.
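
    A hedged sketch of the modeling idea: a pH-dependent specific inactivation rate following a square-root law, integrated over a time-varying gastric pH profile. The constants and the pH(t) profile are illustrative rather than the paper's calibrated values, and the modified-logistic shoulder term is omitted for brevity.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    def pH(t):
        """Illustrative gastric pH profile: drops from 5 to 2 over two hours."""
        return 5.0 - 3.0 * min(t / 120.0, 1.0)

    def rate(ph, b=0.05, ph_crit=4.5):
        """Square-root-type rate law: sqrt(k) rises linearly as pH falls below ph_crit."""
        return (b * max(ph_crit - ph, 0.0)) ** 2

    def dlogN(t, y):
        return [-rate(pH(t))]                 # first-order decline of log10 survivors

    sol = solve_ivp(dlogN, (0.0, 180.0), [7.0])   # start at 10^7 CFU, 3 h digestion
    print(f"log10 N after 3 h of simulated digestion: {sol.y[0, -1]:.2f}")
    ```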

  1. Confirming the Value of Swimming-Performance Models for Adolescents.

    Science.gov (United States)

    Dormehl, Shilo J; Robertson, Samuel J; Barker, Alan R; Williams, Craig A

    2017-10-01

    To evaluate the efficacy of existing performance models to assess the progression of male and female adolescent swimmers through a quantitative and qualitative mixed-methods approach. Fourteen published models were tested using retrospective data from an independent sample of Dutch junior national-level swimmers from when they were 12-18 y of age (n = 13). The degree of association by Pearson correlations was compared between the calculated differences from the models and quadratic functions derived from the Dutch junior national qualifying times. Swimmers were grouped based on their differences from the models and compared with their swimming histories that were extracted from questionnaires and follow-up interviews. Correlations of the deviations from both the models and quadratic functions derived from the Dutch qualifying times were all significant except for the 100-m breaststroke and butterfly and the 200-m freestyle for females (P < […]). […] motivation appeared to be synonymous with higher-level career performance. This mixed-methods approach helped confirm the validity of the models, which were found to be applicable to adolescent swimmers at all levels, allowing coaches to track performance and set goals. The value of the models in being able to account for the expected performance gains during adolescence enables quantification of peripheral factors that could affect performance.

  2. Autoregressive-model-based missing value estimation for DNA microarray time series data.

    Science.gov (United States)

    Choong, Miew Keen; Charbit, Maurice; Yan, Hong

    2009-01-01

    Missing value estimation is important in DNA microarray data analysis. A number of algorithms have been developed to solve this problem, but they have several limitations. Most existing algorithms are not able to deal with the situation where a particular time point (column) of the data is missing entirely. In this paper, we present an autoregressive-model-based missing value estimation method (ARLSimpute) that takes into account the dynamic property of microarray temporal data and the local similarity structures in the data. ARLSimpute is especially effective for the situation where a particular time point contains many missing values or where the entire time point is missing. Experimental results suggest that our proposed algorithm is an accurate missing value estimator in comparison with other imputation methods on simulated as well as real microarray time series datasets.
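
    The autoregressive ingredient can be sketched in a few lines: fit an AR(p) model to a gene's expression profile by least squares and extrapolate an entirely missing time point. ARLSimpute additionally pools information from locally similar genes, which this toy omits; the function and data below are illustrative.

    ```python
    import numpy as np

    def ar_predict_next(series, p=2):
        """Fit an AR(p) model by ordinary least squares; return the one-step forecast."""
        x = np.asarray(series, float)
        X = np.column_stack([x[p - k - 1 : len(x) - k - 1] for k in range(p)])
        coef, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
        return x[-1:-p - 1:-1] @ coef         # most recent p values, newest first

    rng = np.random.default_rng(3)
    t = np.arange(12)
    expr = np.sin(0.6 * t) + 0.05 * rng.standard_normal(12)   # toy expression profile
    print("imputed value at the missing time point:", round(float(ar_predict_next(expr)), 3))
    print("true underlying continuation           :", round(float(np.sin(0.6 * 12)), 3))
    ```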

  3. Monte Carlo simulation of Markov unreliability models

    International Nuclear Information System (INIS)

    Lewis, E.E.; Boehm, F.

    1984-01-01

    A Monte Carlo method is formulated for the evaluation of the unreliability of complex systems with known component failure and repair rates. The formulation is in terms of a Markov process, allowing dependences between components to be modeled and computational efficiencies to be achieved in the Monte Carlo simulation. Two variance reduction techniques, forced transition and failure biasing, are employed to increase the computational efficiency of the random walk procedure. For an example problem these result in improved computational efficiency by more than three orders of magnitude over analog Monte Carlo. The method is generalized to treat problems with distributed failure and repair rate data, and a batching technique is introduced and shown to result in substantial increases in computational efficiency for an example problem. A method for separating the variance due to the data uncertainty from that due to the finite number of random walks is presented. (orig.)
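
    An analog (unbiased) version of such a random walk is straightforward to sketch: two independent repairable components in parallel, with unreliability estimated as the fraction of histories in which both are simultaneously down before the mission time ends. The rates are illustrative, and the paper's forced-transition and failure-biasing variance reduction is deliberately left out.

    ```python
    import random

    def system_failed(lam, mu, T):
        """One Markov history of two independent repairable components; return True
        if both are ever down at the same time before mission time T."""
        t, up = 0.0, [True, True]
        nxt = [random.expovariate(lam), random.expovariate(lam)]  # first failures
        while True:
            i = 0 if nxt[0] < nxt[1] else 1
            t = nxt[i]
            if t > T:
                return False
            up[i] = not up[i]                                  # fail or repair
            if not up[0] and not up[1]:
                return True                                    # system failure
            nxt[i] = t + random.expovariate(lam if up[i] else mu)

    random.seed(7)
    lam, mu, T, n = 1e-3, 0.1, 1000.0, 20000
    u = sum(system_failed(lam, mu, T) for _ in range(n)) / n
    print(f"estimated mission unreliability: {u:.4f}")
    ```

    Failure biasing would replace the natural transition rates with distorted ones and reweight each history accordingly, concentrating samples on the rare failure paths.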

  4. Comparing Productivity Simulated with Inventory Data Using Different Modelling Technologies

    Science.gov (United States)

    Klopf, M.; Pietsch, S. A.; Hasenauer, H.

    2009-04-01

    The Lime Stone National Park in Austria was established in 1997 to protect sensitive limestone soils from degradation due to heavy forest management. Since 1997 the management activities have been successively reduced, standing volume and coarse woody debris (CWD) have increased, and degraded soils have begun to recover. One option for studying the rehabilitation process towards a natural virgin forest state is the use of modelling technology. In this study we test two different modelling approaches for their applicability to the Lime Stone National Park. We compare the standing tree volume simulated by (i) the individual tree growth model MOSES, and (ii) the species- and management-sensitive adaptation of the biogeochemical-mechanistic model Biome-BGC. The results from the two models are compared with field observations from repeated permanent forest inventory plots of the Lime Stone National Park in Austria. The simulated CWD predictions of the BGC model were compared with dead wood measurements (standing and lying dead wood) recorded at the permanent inventory plots. The inventory was established between 1994 and 1996 and remeasured from 2004 to 2005. For this analysis 40 plots of this inventory were selected which comprise the required dead wood components and are dominated by a single tree species. First we used the distance-dependent individual tree growth model MOSES to derive the standing timber and the amount of mortality per hectare. MOSES is initialized with the inventory data at plot establishment and each sampling plot is treated as a forest stand. Biome-BGC is a process-based biogeochemical model with extensions for Austrian tree species, a self-initialization and a forest management tool. The initialization for the actual simulations with the BGC model was done as follows: we first used spin-up runs to derive a balanced forest vegetation, similar to an undisturbed forest. Next we considered the management history of the past centuries (heavy clear cuts

  5. Value of the distant future: Model-independent results

    Science.gov (United States)

    Katz, Yuri A.

    2017-01-01

    This paper shows that the model-independent account of correlations in an interest rate process or a log-consumption growth process leads to declining long-term tails of discount curves. Under the assumption of an exponentially decaying memory in fluctuations of risk-free real interest rates, I derive the analytical expression for an apt value of the long run discount factor and provide a detailed comparison of the obtained result with the outcome of the benchmark risk-free interest rate models. Utilizing the standard consumption-based model with an isoelastic power utility of the representative economic agent, I derive the non-Markovian generalization of the Ramsey discounting formula. Obtained analytical results allowing simple calibration, may augment the rigorous cost-benefit and regulatory impact analysis of long-term environmental and infrastructure projects.
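
    One way to see how exponentially decaying memory flattens the discount curve, assuming (as a simplification that may differ from the paper's exact setup) a stationary Gaussian rate r(s) with mean \bar{r} and autocovariance \sigma^2 e^{-\tau/\tau_c}:

    ```latex
    % Sketch: discount factor for an assumed stationary Gaussian rate with
    % exponentially decaying autocovariance C(\tau) = \sigma^2 e^{-\tau/\tau_c}.
    D(t) = \mathbb{E}\!\left[ e^{-\int_0^t r(s)\,ds} \right]
         = \exp\!\left( -\bar{r}\,t + \tfrac{1}{2}\,\operatorname{Var}\!\int_0^t r(s)\,ds \right),
    \qquad
    \operatorname{Var}\!\int_0^t r(s)\,ds
         = 2\sigma^2 \tau_c^2 \left( \frac{t}{\tau_c} - 1 + e^{-t/\tau_c} \right).
    ```

    The implied forward discount rate -d\ln D(t)/dt then declines from \bar{r} toward \bar{r} - \sigma^2 \tau_c once t exceeds the correlation time \tau_c, so persistent rate fluctuations lower the effective long-run rate.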

  6. Genomic value prediction for quantitative traits under the epistatic model

    Directory of Open Access Journals (Sweden)

    Xu Shizhong

    2011-01-01

    Full Text Available Abstract Background Most quantitative traits are controlled by multiple quantitative trait loci (QTL). The contribution of each locus may be negligible but the collective contribution of all loci is usually significant. Genome selection that uses markers of the entire genome to predict the genomic values of individual plants or animals can be more efficient than selection on phenotypic values and pedigree information alone for genetic improvement. When a quantitative trait is contributed by epistatic effects, using all markers (main effects) and marker pairs (epistatic effects) to predict the genomic values of plants can achieve the maximum efficiency for genetic improvement. Results In this study, we created 126 recombinant inbred lines of soybean and genotyped 80 markers across the genome. We applied the genome selection technique to predict the genomic value of somatic embryo number (a quantitative trait) for each line. Cross validation analysis showed that the squared correlation coefficient between the observed and predicted embryo numbers was 0.33 when only main (additive) effects were used for prediction. When the interaction (epistatic) effects were also included in the model, the squared correlation coefficient reached 0.78. Conclusions This study provided an excellent example for the application of genome selection to plant breeding.
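
    A compact way to reproduce the additive-versus-epistatic comparison is ridge regression with main-effect and pairwise-interaction marker features, scored by cross-validation. Everything below is a synthetic stand-in for the 126 RILs and 80 markers, and ridge is only a proxy for whatever shrinkage estimator the study actually used.

    ```python
    import numpy as np
    from itertools import combinations
    from sklearn.linear_model import Ridge
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(0)
    M = rng.choice([-1.0, 1.0], size=(126, 80))                 # RIL marker genotypes
    pairs = list(combinations(range(80), 2))
    E = np.column_stack([M[:, i] * M[:, j] for i, j in pairs])  # epistatic features
    y = M[:, 3] + 2.0 * M[:, 10] * M[:, 40] + rng.standard_normal(126)  # toy trait

    for name, X in (("additive only", M), ("additive + epistatic", np.hstack([M, E]))):
        pred = cross_val_predict(Ridge(alpha=50.0), X, y, cv=5)
        print(f"{name:22s} cross-validated r^2 = {np.corrcoef(pred, y)[0, 1]**2:.2f}")
    ```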

  7. Model for Simulating Fasting Glucose in Type 2 Diabetes and the Effect of Adherence to Treatment

    DEFF Research Database (Denmark)

    Aradóttir, Tinna Björk; Boiroux, Dimitri; Bengtsson, Henrik

    2017-01-01

    The primary goal of this paper is to predict fasting glucose levels in type 2 diabetes (T2D) in long-acting insulin treatment. The paper presents a model for simulating insulin-glucose dynamics in T2D patients. The model combines a physiological model of type 1 diabetes (T1D) and an endogenous insulin production model in T2D. We include a review of sources of variance in fasting glucose values in long-acting insulin treatment, with respect to dose guidance algorithms. We use the model to simulate fasting glucose levels in T2D long-acting insulin treatment and compare the results with clinical trial results where a dose guidance algorithm was used. We investigate sources of variance and through simulations evaluate the contribution of adherence to variance and dose guidance quality. The results suggest that the model for simulation of T2D patients is sufficient for simulating fasting glucose.

  8. Business Value of Information Technology Service Quality Based on Probabilistic Business-Driven Model

    Directory of Open Access Journals (Sweden)

    Jaka Sembiring

    2015-08-01

    Full Text Available The business value of information technology (IT) services is often difficult to assess, especially from the point of view of a non-IT manager. This condition could severely impact organizational IT strategic decisions. Various approaches have been proposed to quantify the business value, but some are trapped in technical complexity while others misguide managers into directly and subjectively judging some technical entities outside their domain of expertise. This paper describes a method on how to properly capture both perspectives based on a probabilistic business-driven model. The proposed model presents a procedure to calculate the business value of IT services. The model also covers IT security services and their business value as an important aspect of IT services that is not covered in previously published research. The impact of changes in the quality of IT services on business value will also be discussed. A simulation and a case illustration are provided to show the possible application of the proposed model for a simple business process in an enterprise.

  9. Developing Cognitive Models for Social Simulation from Survey Data

    Science.gov (United States)

    Alt, Jonathan K.; Lieberman, Stephen

    The representation of human behavior and cognition continues to challenge the modeling and simulation community. The use of survey and polling instruments to inform belief states, issue stances and action choice models provides a compelling means of developing models and simulations with empirical data. Using these types of data to populate social simulations can greatly enhance the feasibility of validation efforts, the reusability of social and behavioral modeling frameworks, and the testable reliability of simulations. We provide a case study demonstrating these effects, document the use of survey data to develop cognitive models, and suggest future paths forward for social and behavioral modeling.

  10. A Geostationary Earth Orbit Satellite Model Using Easy Java Simulation

    Science.gov (United States)

    Wee, Loo Kang; Goh, Giam Hwee

    2013-01-01

    We develop an Easy Java Simulation (EJS) model for students to visualize geostationary orbits near Earth, modelled using a Java 3D implementation of the EJS 3D library. The simplified physics model is described and simulated using a simple constant angular velocity equation. We discuss four computer model design ideas: (1) a simple and realistic…

  11. Parameter discovery in stochastic biological models using simulated annealing and statistical model checking.

    Science.gov (United States)

    Hussain, Faraz; Jha, Sumit K; Jha, Susmit; Langmead, Christopher J

    2014-01-01

    Stochastic models are increasingly used to study the behaviour of biochemical systems. While the structure of such models is often readily available from first principles, unknown quantitative features of the model are incorporated into the model as parameters. Algorithmic discovery of parameter values from experimentally observed facts remains a challenge for the computational systems biology community. We present a new parameter discovery algorithm that uses simulated annealing, sequential hypothesis testing, and statistical model checking to learn the parameters in a stochastic model. We apply our technique to a model of glucose and insulin metabolism used for in-silico validation of artificial pancreata and demonstrate its effectiveness by developing parallel CUDA-based implementation for parameter synthesis in this model.
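
    The outer search loop is easy to sketch: simulated annealing over a rate parameter, with each candidate scored by a Monte Carlo estimate of the probability that the stochastic model satisfies a property (a crude stand-in for the paper's sequential hypothesis testing and statistical model checking). The toy birth process and all constants are invented, not the glucose-insulin model.

    ```python
    import math, random

    def satisfies(theta, n=200):
        """Monte Carlo estimate of P(property): a toy birth process with rate
        theta*x keeps its population below 30 up to time 10."""
        ok = 0
        for _ in range(n):
            x, t = 1, 0.0
            while t < 10.0 and x < 30:
                t += random.expovariate(theta * x)    # waiting time to next birth
                x += 1
            ok += x < 30
        return ok / n

    random.seed(4)
    target = 0.8                                      # observed satisfaction level
    theta = 0.5                                       # initial guess
    for k in range(200):
        T = 1.0 * 0.97 ** k                           # geometric cooling schedule
        cand = max(1e-3, theta + random.gauss(0.0, 0.05))
        err_cand = abs(satisfies(cand) - target)
        err_curr = abs(satisfies(theta) - target)     # re-estimated each round
        if err_cand < err_curr or random.random() < math.exp(-(err_cand - err_curr) / T):
            theta = cand
    print("discovered rate parameter:", round(theta, 3))
    ```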

  12. Four Models of In Situ Simulation

    DEFF Research Database (Denmark)

    Musaeus, Peter; Krogh, Kristian; Paltved, Charlotte

    2014-01-01

    Introduction: In situ simulation is characterized by being situated in the clinical environment as opposed to the simulation laboratory, but it bears a family resemblance to other types of on-the-job training. We explore a typology of in situ simulation and suggest that there are four fruitful approaches: (1) in situ simulation informed by critical incidents and adverse events reported from the emergency departments (EDs) in which team training is about to be conducted, used to write scenarios; (2) in situ simulation through ethnographic studies at the ED; (3) using […] the following processes: transition processes, action processes and interpersonal processes. Design and purpose: This abstract suggests four approaches to in situ simulation. A pilot study will evaluate the different approaches in two emergency departments in the Central Region of Denmark. Methods: The typology […]

  13. Modeling and simulation of the SDC data collection chip

    International Nuclear Information System (INIS)

    Hughes, E.; Haney, M.; Golin, E.; Jones, L.; Knapp, D.; Tharakan, G.; Downing, R.

    1992-01-01

    This paper describes modeling and simulation of the Data Collection Chip (DCC) design for the Solenoidal Detector Collaboration (SDC). Models of the DCC written in Verilog and VHDL are described, and results are presented. The models have been simulated to study queue depth requirements and to compare control feedback alternatives. Insight into the management of models and simulation tools is given. Finally, techniques useful in the design process for data acquisition systems are discussed

  14. Molecular Simulation towards Efficient and Representative Subsurface Reservoirs Modeling

    KAUST Repository

    Kadoura, Ahmad Salim

    2016-01-01

    This dissertation focuses on the application of Monte Carlo (MC) molecular simulation and Molecular Dynamics (MD) in modeling thermodynamics and flow of subsurface reservoir fluids. At first, MC molecular simulation is proposed as a promising method

  15. Modelization and simulation of capillary barriers

    International Nuclear Information System (INIS)

    Lisbona Cortes, F.; Aguilar Villa, G.; Clavero Gracia, C.; Gracia Lozano, J.L.

    1998-01-01

    Among the different underground transport phenomena, that due to water flows is of great relevance. Water flows in infiltration and percolation processes are responsible for the transport of hazardous wastes towards phreatic layers. From the industrial and geological standpoints, there is great interest in the design of natural devices to prevent flows from transporting polluting substances. This interest increases when such devices are used to isolate radioactive waste repositories, whose lifetime must exceed several hundred years. The so-called natural devices are based on the superimposition of materials with different hydraulic properties. In particular, flow retention in this kind of stratified medium, under unsaturated conditions, is basically due to the capillary barrier effect, which results from placing a low-conductivity material over another with high hydraulic conductivity. Covers designed around this effect must also allow drainage of the upper layer. The lower cost of these covers, with respect to other kinds of protection systems, and the long-term stability of their components make them very attractive. However, a prior investigation to determine their effectiveness is required. In this report we present the computer code BCSIM, useful for straightforward simulations of unsaturated flows in a capillary barrier configuration with drainage, and intended to serve as a tool for designing efficient covers. The model, the numerical algorithm and several implementation aspects are described. Results obtained in several simulations, confirming the effectiveness of capillary barriers as a technique to build safety covers for hazardous waste repositories, are presented. (Author)

  16. Simulation Model developed for a Small-Scale PV-System in a Distribution Network

    DEFF Research Database (Denmark)

    Koch-Ciobotaru, C.; Mihet-Popa, Lucian; Isleifsson, Fridrik Rafn

    2012-01-01

    This paper presents a PV panel simulation model using the single-diode four-parameter model based on data sheet values. The model was implemented first in MATLAB/Simulink, and the results were compared with the data sheet values and characteristics of the PV panels under standard test conditions. The model was then implemented in PowerFactory to study load flow, steady-state voltage stability and the dynamic behavior of a distributed power system.
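
    Evaluating the single-diode four-parameter model amounts to solving the implicit equation I = IL - I0*(exp((V + I*Rs)/a) - 1) for the current at each voltage. A hedged sketch follows; the parameter values are assumed for illustration, not extracted from a particular data sheet as in the paper.

    ```python
    import numpy as np
    from scipy.optimize import brentq

    IL, I0, Rs, a = 8.21, 1.7e-9, 0.22, 1.8   # assumed: light current (A), saturation
                                              # current (A), series resistance (ohm),
                                              # modified ideality factor (V)

    def current(V):
        """Solve the implicit diode equation for the terminal current at voltage V."""
        f = lambda I: IL - I0 * (np.exp((V + I * Rs) / a) - 1.0) - I
        return brentq(f, 0.0, IL + 1.0)       # current is bracketed by 0 and IL + 1

    for v in np.linspace(0.0, 40.0, 9):
        i = current(v)
        print(f"V = {v:5.1f} V   I = {i:6.3f} A   P = {v * i:7.2f} W")
    ```

    A bracketing root-finder is used here for robustness; Newton iteration is a common alternative inside simulation environments.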

  17. Simulation Models for Socioeconomic Inequalities in Health: A Systematic Review

    Directory of Open Access Journals (Sweden)

    Niko Speybroeck

    2013-11-01

    Full Text Available Background: The emergence and evolution of socioeconomic inequalities in health involves multiple factors interacting with each other at different levels. Simulation models are suitable for studying such complex and dynamic systems and have the ability to test the impact of policy interventions in silico. Objective: To explore how simulation models were used in the field of socioeconomic inequalities in health. Methods: An electronic search of studies assessing socioeconomic inequalities in health using a simulation model was conducted. Characteristics of the simulation models were extracted and distinct simulation approaches were identified. As an illustration, a simple agent-based model of the emergence of socioeconomic differences in alcohol abuse was developed. Results: We found 61 studies published between 1989 and 2013. Ten different simulation approaches were identified. The agent-based model illustration showed that multilevel, reciprocal and indirect effects of social determinants on health can be modeled flexibly. Discussion and Conclusions: Based on the review, we discuss the utility of using simulation models for studying health inequalities, and refer to good modeling practices for developing such models. The review and the simulation model example suggest that the use of simulation models may enhance the understanding and debate about existing and new socioeconomic inequalities of health frameworks.
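
    In the spirit of the review's illustrative agent-based model, here is a hedged toy in which an agent's risk of alcohol abuse depends on its socioeconomic status and on the behaviour of its peers, so that group differences emerge and reinforce. The network, rates and update rule are invented for illustration, not taken from the article.

    ```python
    import random

    random.seed(2)
    N = 500
    ses = [random.random() for _ in range(N)]               # socioeconomic status, 0..1
    abuse = [random.random() < 0.10 for _ in range(N)]      # initial abuse states
    peers = [random.sample(range(N), 5) for _ in range(N)]  # fixed random peer network

    for _ in range(50):                                     # repeated annual updates
        new = []
        for i in range(N):
            peer_rate = sum(abuse[j] for j in peers[i]) / 5.0
            p = 0.02 + 0.08 * (1.0 - ses[i]) + 0.15 * peer_rate  # illustrative risks
            # each year an agent reconsiders its state with probability 0.3
            new.append(abuse[i] if random.random() > 0.3 else random.random() < p)
        abuse = new

    low = sum(a for a, s in zip(abuse, ses) if s < 0.5)
    high = sum(a for a, s in zip(abuse, ses) if s >= 0.5)
    print(f"abusing agents - low SES: {low}, high SES: {high}")
    ```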

  18. An electrical circuit model for simulation of indoor radon concentration.

    Science.gov (United States)

    Musavi Nasab, S M; Negarestani, A

    2013-01-01

    In this study, a new model based on electric circuit theory was introduced to simulate the behaviour of indoor radon concentration. In this model, a voltage source simulates radon generation in walls, conductivity simulates migration through walls, and the voltage across a capacitor simulates the radon concentration in a room. The simulation considers migration of radon through walls by a diffusion mechanism in one-dimensional geometry. Data reported for a typical Greek house were employed to examine the application of this simulation technique to the behaviour of radon.
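
    The circuit analogy reduces to a first-order ODE: the wall acts as a source (voltage) behind a diffusive resistance, the room as a capacitor, and ventilation plus decay as a loss term. The equation form and all parameter values below are illustrative, not those fitted to the Greek house data.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    C_wall = 2000.0    # Bq/m^3: radon concentration at the wall (the source "voltage")
    R = 1.0            # h/m^3: diffusive resistance of the wall
    Vol = 40.0         # m^3: room volume (the "capacitance")
    lam_v = 0.5        # 1/h: ventilation plus decay loss rate

    def drdt(t, c):
        # charging a capacitor through a resistor, with a leak: the RC analogy
        return [(C_wall - c[0]) / (R * Vol) - lam_v * c[0]]

    sol = solve_ivp(drdt, (0, 24), [0.0], t_eval=np.linspace(0, 24, 7))
    for t, c in zip(sol.t, sol.y[0]):
        print(f"t = {t:4.1f} h   indoor radon = {c:6.1f} Bq/m^3")
    ```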

  19. Aircraft vulnerability analysis by modeling and simulation

    Science.gov (United States)

    Willers, Cornelius J.; Willers, Maria S.; de Waal, Alta

    2014-10-01

    […] guidance acceleration and seeker sensitivity. For the purpose of this investigation the aircraft is equipped with conventional pyrotechnic decoy flares and the missile has no counter-countermeasure means (security restrictions on open publication). This complete simulation is used to calculate the missile miss distance when the missile is launched from different locations around the aircraft. The miss distance data are then graphically presented, showing miss distance (aircraft vulnerability) as a function of launch direction and range. The aircraft vulnerability graph accounts for aircraft and missile characteristics, but does not account for missile deployment doctrine. A Bayesian network is constructed to fuse the doctrinal rules with the aircraft vulnerability data. The Bayesian network then provides the capability to evaluate the combined risk of missile launch and aircraft vulnerability. It is shown in this paper that it is indeed possible to predict aircraft vulnerability to missile attack through a comprehensive modelling and holistic process. By using the appropriate real-world models, this approach is used to evaluate the effectiveness of specific countermeasure techniques against specific missile threats. The use of a Bayesian network provides the means to fuse simulated performance data with more abstract doctrinal rules to provide a realistic assessment of the aircraft vulnerability.

  20. Using Transport Diagnostics to Understand Chemistry Climate Model Ozone Simulations

    Science.gov (United States)

    Strahan, S. E.; Douglass, A. R.; Stolarski, R. S.; Akiyoshi, H.; Bekki, S.; Braesicke, P.; Butchart, N.; Chipperfield, M. P.; Cugnet, D.; Dhomse, S.; hide

    2010-01-01

    We demonstrate how observations of N2O and mean age in the tropical and midlatitude lower stratosphere (LS) can be used to identify realistic transport in models. The results are applied to 15 Chemistry Climate Models (CCMs) participating in the 2010 WMO assessment. Comparison of the observed and simulated N2O/mean age relationship identifies models with fast or slow circulations and reveals details of model ascent and tropical isolation. The use of this process-oriented N2O/mean age diagnostic identifies models with compensating transport deficiencies that produce fortuitous agreement with mean age. We compare the diagnosed model transport behavior with a model's ability to produce realistic LS O3 profiles in the tropics and midlatitudes. Models with the greatest tropical transport problems show the poorest agreement with observations. Models with the most realistic LS transport agree more closely with LS observations and each other. We incorporate the results of the chemistry evaluations in the SPARC CCMVal Report (2010) to explain the range of CCM predictions for the return-to-1980 dates for global (60°S-60°N) and Antarctic column ozone. Later (earlier) Antarctic return dates are generally correlated with higher (lower) vortex Cl_y levels in the LS, and vortex Cl_y is generally correlated with the model's circulation, although model Cl_y chemistry or Cl_y conservation can have a significant effect. In both regions, models that have good LS transport produce a smaller range of predictions for the return-to-1980 ozone values. This study suggests that the current range of predicted return dates is unnecessarily large due to identifiable model transport deficiencies.

  1. Modeling Value Chain Analysis of Distance Education using UML

    Science.gov (United States)

    Acharya, Anal; Mukherjee, Soumen

    2010-10-01

    Distance education continues to grow as a methodology for the delivery of course content in higher education in India as well as abroad. To manage this growing demand and to provide certain flexibility, there must be strategic planning about the use of ICT tools. Value chain analysis is a framework for breaking down the sequence of business functions into a set of activities through which utility can be added to a service. Thus it can help to determine the competitive advantage enjoyed by an institute. To implement these business functions a certain visual representation is required. UML allows for this representation by using a set of structural and behavioral diagrams. In this paper, the first section defines a framework for value chain analysis and highlights its advantages. The second section gives a brief overview of related work in this field. The third section gives a brief discussion on distance education. The fourth section very briefly introduces UML. The fifth section models the value chain of distance education using UML. Finally, we discuss the limitations and the problems posed in this domain.

  2. Predictive Capability Maturity Model for computational modeling and simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy; Pilch, Martin M.

    2007-10-01

    The Predictive Capability Maturity Model (PCMM) is a new model that can be used to assess the level of maturity of computational modeling and simulation (M&S) efforts. The development of the model is based on both the authors' experience and their analysis of similar investigations in the past. The perspective taken in this report is one of judging the usefulness of a predictive capability that relies on the numerical solution to partial differential equations to better inform and improve decision making. The review of past investigations, such as the Software Engineering Institute's Capability Maturity Model Integration and the National Aeronautics and Space Administration and Department of Defense Technology Readiness Levels, indicates that a more restricted, more interpretable method is needed to assess the maturity of an M&S effort. The PCMM addresses six contributing elements to M&S: (1) representation and geometric fidelity, (2) physics and material model fidelity, (3) code verification, (4) solution verification, (5) model validation, and (6) uncertainty quantification and sensitivity analysis. For each of these elements, attributes are identified that characterize four increasing levels of maturity. Importantly, the PCMM is a structured method for assessing the maturity of an M&S effort that is directed toward an engineering application of interest. The PCMM does not assess whether the M&S effort, the accuracy of the predictions, or the performance of the engineering system satisfies or does not satisfy specified application requirements.

  3. Sunspot Modeling: From Simplified Models to Radiative MHD Simulations

    Directory of Open Access Journals (Sweden)

    Rolf Schlichenmaier

    2011-09-01

    Full Text Available We review our current understanding of sunspots, from the scales of their fine structure to their large-scale (global) structure, including the processes of their formation and decay. Recently, sunspot models have undergone a dramatic change. In the past, several aspects of sunspot structure were addressed by static MHD models with parametrized energy transport. Models of sunspot fine structure relied heavily on strong assumptions about flow and field geometry (e.g., flux tubes, "gaps", convective rolls), which were motivated in part by the observed filamentary structure of penumbrae or by the necessity of explaining the substantial energy transport required to maintain the penumbral brightness. However, none of these models could self-consistently explain all aspects of penumbral structure (energy transport, filamentation, Evershed flow). In recent years, 3D radiative MHD simulations have advanced dramatically to the point at which models of complete sunspots with sufficient resolution to capture sunspot fine structure are feasible. Here overturning convection is the central element responsible for energy transport, for the filamentation leading to fine structure, and for the driving of strong outflows. On the larger scale these models are also in the process of addressing the subsurface structure of sunspots as well as sunspot formation. With this shift in modeling capabilities and the recent advances in high-resolution observations, future research will be guided by comparing observation and theory.

  4. Simulation of parametric model towards the fixed covariate of right censored lung cancer data

    Science.gov (United States)

    Afiqah Muhamad Jamil, Siti; Asrul Affendi Abdullah, M.; Kek, Sie Long; Ridwan Olaniran, Oyebayo; Enera Amran, Syahila

    2017-09-01

    In this study, a simulation procedure was applied to assess the effect of a fixed covariate in right-censored data using a parametric survival model. The scale and shape parameters were varied to differentiate the analysis of the parametric regression survival model. The biases, mean biases and coverage probabilities were used as statistical criteria in this analysis. Different sample sizes of 50, 100, 150 and 200 were employed to distinguish the impact of the parametric regression model on right-censored data. The R statistical software was used to develop the simulation code for right-censored data. Finally, the final simulation model was compared with right-censored lung cancer data from Malaysia. It was found that varying the shape and scale parameters across different sample sizes helps to improve the simulation strategy for right-censored data, and that the Weibull regression survival model provides a suitable fit for simulating the survival of lung cancer patients in Malaysia.
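
    A condensed stand-in for this kind of study: simulate Weibull survival times with random right censoring at each sample size, fit the parameters by censored maximum likelihood, and report the bias of the shape estimate. The true parameters and censoring scheme are assumed for illustration; the paper's R workflow differs in detail.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def negloglik(params, t, event):
        k, s = np.exp(params)                       # log-parametrization keeps k, s > 0
        z = (t / s) ** k
        logf = np.log(k / s) + (k - 1) * np.log(t / s) - z   # log density
        return -(event * logf - (1 - event) * z).sum()       # censored: log S = -z

    rng = np.random.default_rng(10)
    k_true, s_true = 1.5, 12.0                      # assumed "true" parameters
    for n in (50, 100, 150, 200):
        est = []
        for _ in range(200):                        # replications per sample size
            t_full = s_true * rng.weibull(k_true, n)
            c = rng.uniform(5, 30, n)               # random right-censoring times
            t, event = np.minimum(t_full, c), (t_full <= c).astype(float)
            res = minimize(negloglik, np.log([1.0, 10.0]), args=(t, event),
                           method="Nelder-Mead")
            est.append(np.exp(res.x[0]))
        print(f"n={n:3d}  mean bias of shape estimate: {np.mean(est) - k_true:+.3f}")
    ```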

  5. The European Model of Sport: Values, Rules and Interests

    Directory of Open Access Journals (Sweden)

    Zuev V.

    2018-03-01

    Full Text Available Recent transformations in the ways that modern sport is managed have fundamentally changed its role in society; previously a simple form of leisure activity and health promotion, sport has become a complex phenomenon and a multibillion dollar business. The combination of sociocultural and economic dimensions makes sport an important tool for the promotion of interests. A leading role in the development of sport throughout history gives the European Union (EU an advantage in setting the rules for its management, while the size of the sports market in Europe further facilitates the EU’s leading role in developing the regulatory basis in this field. The sports model developed by EU institutions plays an important role in the deepening of regional integration processes, promoting the European model outside the region and also the EU’s transformation into one of the drivers of the development of the global sports management system. The goal of this article is to identify the specificities of the European model of sport, the instruments and resources used by the EU to promote European values in this field and the universal features of the European approach that make it applicable in other regions. The analysis shows that the EU actively promotes its values, norms and interests by entrenching them into the European sport model and then promoting this model to other countries and regions. Practices and norms developed in the European context are being actively transferred to the international level. Sport, and especially football which is the most popular and among the most profitable sports, has become another area in which European management practices demonstrate their consistency and are being actively applied at the global level. The spread of the European sports model is facilitated by the “spillover” of EU law to the organizations and institutions in which it participates. The EU model is promoted through soft power supported by the

  6. Guidelines for Reproducibly Building and Simulating Systems Biology Models.

    Science.gov (United States)

    Medley, J Kyle; Goldberg, Arthur P; Karr, Jonathan R

    2016-10-01

    Reproducibility is the cornerstone of the scientific method. However, currently, many systems biology models cannot easily be reproduced. This paper presents methods that address this problem. We analyzed the recent Mycoplasma genitalium whole-cell (WC) model to determine the requirements for reproducible modeling. We determined that reproducible modeling requires both repeatable model building and repeatable simulation. New standards and simulation software tools are needed to enhance and verify the reproducibility of modeling. New standards are needed to explicitly document every data source and assumption, and new deterministic parallel simulation tools are needed to quickly simulate large, complex models. We anticipate that these new standards and software will enable researchers to reproducibly build and simulate more complex models, including WC models.

  7. Mass balances for a biological life support system simulation model

    Science.gov (United States)

    Volk, Tyler; Rummel, John D.

    1987-01-01

    Design decisions to aid the development of future space-based biological life support systems (BLSS) can be made with simulation models. The biochemical stoichiometry was developed for: (1) protein, carbohydrate, fat, fiber, and lignin production in the edible and inedible parts of plants; (2) food consumption and production of organic solids in urine, feces, and wash water by the humans; and (3) operation of the waste processor. Flux values for all components are derived for a steady-state system with wheat as the sole food source. The large-scale dynamics of a materially closed BLSS computer model is described in a companion paper. An extension of this methodology can explore multifood systems and more complex biochemical dynamics while maintaining whole-system closure as a focus.

  8. Monte Carlo modelling of Schottky diode for rectenna simulation

    Science.gov (United States)

    Bernuchon, E.; Aniel, F.; Zerounian, N.; Grimault-Jacquin, A. S.

    2017-09-01

    Before designing a detector circuit, the extraction of the Schottky diode's electrical parameters is a critical step. This article is based on a Monte-Carlo (MC) solver of the Boltzmann Transport Equation (BTE) including different transport mechanisms at the metal-semiconductor contact, such as the image force effect and tunneling. The weights of the tunneling and thermionic currents are quantified according to different degrees of tunneling modelling. The I-V characteristic highlights the dependence of the ideality factor and the saturation current on bias. Harmonic Balance (HB) simulation of a rectifier circuit within the Advanced Design System (ADS) software shows that considering a non-linear ideality factor and saturation current in the electrical model of the Schottky diode does not seem essential. Indeed, bias-independent values extracted in the forward regime from the I-V curve are sufficient. However, the non-linear series resistance extracted from a small-signal analysis (SSA) strongly influences the conversion efficiency at low input powers.

  9. Developing the Value Management Maturity Model (VM3©

    Directory of Open Access Journals (Sweden)

    Saipol Bari Abd Karim

    2013-06-01

    Full Text Available Value management (VM) practices have expanded and become a well-received technique globally. Organisations are now progressing towards a better implementation of VM and should be assessing their strengths and weaknesses in order to move forward competitively. There is a need to benchmark existing VM practices to reflect their maturity levels, which is currently not possible. This paper outlines the concept of the Value Management Maturity Model (VM3©) as a structured plan of maturity and performance growth for businesses. It proposes five levels of maturity, and each level has its own criteria or attributes to be achieved before progressing to a higher level. The framework for VM3© was developed based on a review of the literature related to VM and maturity models (MM). Data were collected through questionnaire surveys of organisations that have implemented the VM methodology. Additionally, semi-structured interviews were conducted with selected individuals involved in implementing VM. The questions were developed to achieve the research objectives: investigating the current implementation of VM, and exploring the organisations' MM knowledge and practices. However, this research was limited to VM implementation in the Malaysian government's projects and programmes. VM3© introduces a new paradigm in VM as it provides a rating method for capabilities and performance. This VM3© framework is still being refined in order to provide a comprehensive and well-accepted method of rating organisations' maturity.

  10. Values of Land and Renewable Resources in a Three-Sector Economic Growth Model

    Directory of Open Access Journals (Sweden)

    Zhang Wei-Bin

    2015-04-01

    Full Text Available This paper studies the dynamic interdependence of capital, land and resource values in a three-sector growth model with endogenous wealth and renewable resources. The model is based on neoclassical growth theory, Ricardian theory, and growth theory with renewable resources. The household's decision is modeled with an alternative approach proposed by Zhang two decades ago. The economic system consists of households and the industrial, agricultural, and resource sectors. The model describes a dynamic interdependence between wealth accumulation, resource change, and the division of labor under perfect competition. We simulate the model to demonstrate the existence of a unique stable equilibrium point and plot the motion of the dynamic system. The study conducts comparative dynamic analysis with regard to changes in the propensity to consume resources, the propensity to consume housing, the propensity to consume agricultural goods, the propensity to consume industrial goods, the propensity to save, the population, and the output elasticity of capital of the resource sector.

  11. Evaluation of the value of radar QPE data and rain gauge data for hydrological modeling

    DEFF Research Database (Denmark)

    He, Xin; Sonnenborg, Torben Obel; Refsgaard, Jens Christian

    2013-01-01

    Weather radar-based quantitative precipitation estimation (QPE) is in principle superior to the areal precipitation estimated from rain gauge data only, and has therefore become increasingly popular in applications such as hydrological modeling. The present study investigates the potential […] rainfall and subsequently the simulated hydrological responses. A headwater catchment located in western Denmark is chosen as the study site. Two hydrological models are built using the MIKE SHE code; they have identical model structures except for the rainfall forcing: one model is based on rain […] value of the extra information from radar when rain gauge density decreases; however, it is not able to sustain the level of model performance preceding the reduction in the number of rain gauges.

  12. ICFD modeling of final settlers - developing consistent and effective simulation model structures

    DEFF Research Database (Denmark)

    Plósz, Benedek G.; Guyonvarch, Estelle; Ramin, Elham

    […] CFD concept. The case of secondary settling tanks (SSTs) is used to demonstrate the methodological steps, using the validated CFD model with the hindered-transient-compression settling velocity model by (10). Factor screening and Latin hypercube sampling (LHS) are used to degenerate a 2-D axi-symmetrical CFD model […] of (i) assessing different density current sub-models; (ii) implementation of a combined flocculation, hindered, transient and compression settling velocity function; and (iii) assessment of modelling the onset of transient and compression settling. Results suggest that the iCFD model developed […] the feed-layer. These scenarios were inspired by the literature (1; 2; 9). As for the D0-iCFD model, the SSRE values obtained are below 1, with an average SSRE = 0.206. The simulation model thus can predict the solids distribution inside the tank with satisfactory accuracy. Averaged relative errors of 8.1 %, 3 […]

  13. Beyond Modeling: All-Atom Olfactory Receptor Model Simulations

    Directory of Open Access Journals (Sweden)

    Peter C Lai

    2012-05-01

    Full Text Available Olfactory receptors (ORs) are a type of GTP-binding protein-coupled receptor (GPCR). These receptors are responsible for mediating the sense of smell through their interaction with odor ligands. OR-odorant interactions mark the first step in the process that leads to olfaction. Computational studies on model OR structures can validate experimental functional studies as well as generate focused and novel hypotheses for further bench investigation by providing a view of these interactions at the molecular level. Here we have shown the specific advantages of simulating the dynamic environment that is associated with OR-odorant interactions. We present a rigorous methodology that ranges from the creation of a computationally derived model of an olfactory receptor to simulating the interactions between an OR and an odorant molecule. Given the ubiquitous occurrence of GPCRs in the membranes of cells, we anticipate that our OR-developed methodology will serve as a model for the computational structural biology of all GPCRs.

  14. Evaluation of the usefulness of ecological simulation models in power plant impact assessment

    International Nuclear Information System (INIS)

    Swartzman, G.L.; Haar, R.T.; McKenzie, D.H.

    1981-05-01

    Comparisons were made of the equations, rationale, data sources and parameter values of 26 simulation models of fish and zooplankton population dynamics and energetics, and the results were compared in standard notation and units, process by process. The major process categories considered were consumption, predation, metabolic processes, assimilation, growth, fecundity, recruitment and mortality. A model simulation language, AEGIS (Aquatic Ecosystem General Impact Simulator), was built to compare model equations process by process, allowing convenient interchange of model equations for any process module. This simulator was parameterized to a test site, Lake Keowee, South Carolina, on which resides the Oconee Nuclear Power Station. Model parameter estimation and comparison of these models with biological monitoring data allows evaluation of ecosystem models from the standpoint of prediction of behavior under normal and perturbed conditions, organization of data into an ecosystem framework, and evaluation of data to address impact questions.

  15. Evaluation of scalar mixing and time scale models in PDF simulations of a turbulent premixed flame

    Energy Technology Data Exchange (ETDEWEB)

    Stoellinger, Michael; Heinz, Stefan [Department of Mathematics, University of Wyoming, Laramie, WY (United States)

    2010-09-15

    Numerical simulation results obtained with a transported scalar probability density function (PDF) method are presented for a piloted turbulent premixed flame. The accuracy of the PDF method depends on the scalar mixing model and the scalar time scale model. Three widely used scalar mixing models are evaluated: the interaction by exchange with the mean (IEM) model, the modified Curl's coalescence/dispersion (CD) model and the Euclidean minimum spanning tree (EMST) model. The three scalar mixing models are combined with a simple model for the scalar time scale which assumes a constant C_phi = 12 value. A comparison of the simulation results with available measurements shows that only the EMST model calculates accurately the mean and variance of the reaction progress variable. An evaluation of the structure of the PDFs of the reaction progress variable predicted by the three scalar mixing models confirms this conclusion: the IEM and CD models predict an unrealistic shape of the PDF. Simulations using various C_phi values ranging from 2 to 50 combined with the three scalar mixing models have been performed. The observed deficiencies of the IEM and CD models persisted for all C_phi values considered. The value C_phi = 12 combined with the EMST model was found to be an optimal choice. To avoid the ad hoc choice for C_phi, more sophisticated models for the scalar time scale have been used in simulations using the EMST model. A new model for the scalar time scale which is based on a linear blending between a model for flamelet combustion and a model for distributed combustion is developed. The new model has proven to be very promising as a scalar time scale model which can be applied from flamelet to distributed combustion. (author)

  16. Global SWOT Data Assimilation of River Hydrodynamic Model; the Twin Simulation Test of CaMa-Flood

    Science.gov (United States)

    Ikeshima, D.; Yamazaki, D.; Kanae, S.

    2016-12-01

    CaMa-Flood is a global-scale model for simulating hydrodynamics in large rivers. It can simulate river hydrodynamics such as river discharge, flooded area and water depth by taking as input the runoff derived from a land surface model. Recently, many improvements to its parameters and terrestrial data have been under way to enhance its reproduction of natural phenomena. However, there are still errors between nature and the simulated results due to uncertainties in each model. SWOT (Surface Water and Ocean Topography) is a satellite, to be launched in 2021, that can measure open-water surface elevation. SWOT observations can be used to calibrate the hydrodynamic model for river flow forecasting and are expected to improve its accuracy. Combining observed data with a model to calibrate it is called data assimilation. In this research, we developed a data-assimilated river flow simulation system at the global scale, using CaMa-Flood as the river hydrodynamics model and simulated SWOT data as the observations. Generally in data assimilation, calibrating the "model value" with the "observation value" produces the "assimilated value". However, the observed data of the SWOT satellite will not be available until its launch in 2021. Instead, we simulated the SWOT observations using CaMa-Flood. Putting "pure input" into CaMa-Flood produces the "true water storage". Extracting the actual daily swath of SWOT from the "true water storage" provides the simulated observations. For the "model value", we made a "disturbed water storage" by putting noise-disturbed input into CaMa-Flood. Since both the "model value" and the "observation value" are made by the same model, we named this a twin simulation. In the twin simulation, the simulated observation of the "true water storage" is combined with the "disturbed water storage" to make the "assimilated value". As the data assimilation method, we used the ensemble Kalman filter. If the "assimilated value" is closer to the "true water storage" than the "disturbed water storage", the data assimilation can be judged effective. Also […]
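
    The assimilation step can be sketched as a stochastic ensemble Kalman filter update, with a partial-identity observation operator standing in for the daily SWOT swath sampling; the state dimension, ensemble size and error levels below are illustrative only.

    ```python
    import numpy as np

    def enkf_update(ens, obs, obs_err, H):
        """Stochastic EnKF: ens (n_state, n_mem), obs (n_obs,), H (n_obs, n_state)."""
        n_obs, n_mem = len(obs), ens.shape[1]
        A = ens - ens.mean(axis=1, keepdims=True)             # ensemble anomalies
        HA = H @ A
        P_ho = (A @ HA.T) / (n_mem - 1)                        # state-obs covariance
        P_oo = (HA @ HA.T) / (n_mem - 1) + obs_err**2 * np.eye(n_obs)
        K = P_ho @ np.linalg.inv(P_oo)                         # Kalman gain
        perturbed = obs[:, None] + obs_err * np.random.randn(n_obs, n_mem)
        return ens + K @ (perturbed - H @ ens)

    np.random.seed(0)
    truth = np.array([3.0, 2.5, 4.0])                          # "true water storage"
    ens = truth[:, None] + 1.0 * np.random.randn(3, 40)        # "disturbed" ensemble
    H = np.eye(3)[:2]                                          # swath sees 2 of 3 reaches
    obs = truth[:2] + 0.1 * np.random.randn(2)                 # simulated SWOT observation
    post = enkf_update(ens, obs, 0.1, H)
    print("prior mean    :", ens.mean(1).round(2))
    print("posterior mean:", post.mean(1).round(2), " truth:", truth)
    ```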

  17. VHDL-AMS modelling and simulation of a planar electrostatic micromotor

    Science.gov (United States)

    Endemaño, A.; Fourniols, J. Y.; Camon, H.; Marchese, A.; Muratet, S.; Bony, F.; Dunnigan, M.; Desmulliez, M. P. Y.; Overton, G.

    2003-09-01

    System-level simulation results of a planar electrostatic micromotor, based on analytical models of the static and dynamic torque behaviours, are presented. A planar variable-capacitance (VC) electrostatic micromotor designed, fabricated and tested at LAAS (Toulouse) in 1995 is simulated using the high-level language VHDL-AMS (VHSIC (very high speed integrated circuits) hardware description language-analog mixed signal). The analytical torque model is obtained by first calculating the overlaps and capacitances between different electrodes based on a conformal mapping transformation. Capacitance values on the order of 10^-16 F and torque values on the order of 10^-11 N m have been calculated, in agreement with previous measurements and simulations for this type of motor. A dynamic model has been developed for the motor by calculating the inertia coefficient and estimating the friction coefficient based on values calculated previously for other similar devices. Starting voltage results obtained from experimental measurement are in good agreement with our proposed simulation model. Simulation results for starting voltage values, step response, switching response and continuous operation of the micromotor, based on the dynamic model of the torque, are also presented. Four VHDL-AMS blocks were created, validated and simulated for the power supply, excitation control, micromotor torque creation and micromotor dynamics. These blocks can be considered the initial phase towards the creation of intellectual property (IP) blocks for microsystems in general and electrostatic micromotors in particular.

  18. Simulation of upward flux from shallow water-table using UPFLOW model

    Directory of Open Access Journals (Sweden)

    M. H. Ali

    2013-11-01

    Full Text Available The upward movement of water by capillary rise from a shallow water table to the root zone is an important incoming flux. For determining the exact amount of irrigation required, estimation of the capillary or upward flux is essential. A simulation model can provide a reliable estimate of upward flux under variable soil and climatic conditions. In this study, the performance of the UPFLOW model in estimating upward flux was evaluated. Model performance was assessed with both graphical displays and statistical criteria. When plotting simulated capillary rise values against observed field data, most data points lie close to the 1:1 line, which means that the model output is reliable and reasonable. The coefficient of determination between observed and simulated values was 0.806 (r = 0.93), which indicates good agreement between observed and simulated values. The relative error, model efficiency, and index of agreement were 27.91%, 85.93% and 0.96, respectively. Considering the graphical comparison of observed and simulated upward flux and the statistical indicators, it can be concluded that the overall performance of the UPFLOW model in simulating actual upward flux from a crop field under variable water-table conditions is satisfactory. Thus, the model can be used to estimate capillary rise from a shallow water table for proper estimation of irrigation requirements, which would save valuable water from over-irrigation.
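
    The statistics quoted in the abstract can be computed as below. The record does not give the exact formula used for "relative error", so a sum-based version is assumed here, alongside Nash-Sutcliffe model efficiency and Willmott's index of agreement; the flux values are invented for illustration.

    ```python
    import numpy as np

    def fit_stats(obs, sim):
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        re = 100.0 * np.abs(sim - obs).sum() / obs.sum()        # relative error (%)
        ef = 1.0 - ((obs - sim) ** 2).sum() / ((obs - obs.mean()) ** 2).sum()
        d = 1.0 - ((obs - sim) ** 2).sum() / (
            (np.abs(sim - obs.mean()) + np.abs(obs - obs.mean())) ** 2).sum()
        return re, ef, d                        # error, efficiency, agreement

    obs = [0.8, 1.1, 1.6, 2.0, 2.4, 1.9]    # illustrative observed upward flux, mm/day
    sim = [0.9, 1.0, 1.5, 2.3, 2.2, 2.0]    # illustrative simulated values
    re, ef, d = fit_stats(obs, sim)
    print(f"relative error {re:.1f}%, efficiency {ef:.2f}, index of agreement {d:.2f}")
    ```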

  19. Application of wildfire simulation models for risk analysis

    Science.gov (United States)

    Ager, A.; Finney, M.

    2009-04-01

    Wildfire simulation models are being widely used by fire and fuels specialists in the U.S. to support tactical and strategic decisions related to the mitigation of wildfire risk. Much of this application has resulted from the development of a minimum travel time (MTT) fire spread algorithm (M. Finney) that makes it computationally feasible to simulate thousands of fires and generate burn probability and intensity maps over large areas (10,000 - 2,000,000 ha). The MTT algorithm is parallelized for multi-threaded processing and is embedded in a number of research and applied fire modeling applications. High-performance computers (e.g., 32-way 64-bit SMP) are typically used for MTT simulations, although the algorithm is also implemented in the 32-bit desktop FlamMap3 program (www.fire.org). Extensive testing has shown that this algorithm can replicate large fire boundaries in the heterogeneous landscapes that typify much of the wildlands in the western U.S. In this paper, we describe the application of the MTT algorithm to understand spatial patterns of burn probability (BP), and to analyze wildfire risk to key human and ecological values. The work is focused on a federally managed 2,000,000-ha landscape in the central interior region of Oregon, USA. The fire-prone study area encompasses a wide array of topography and fuel types and a number of highly valued resources that are susceptible to fire. We quantitatively defined risk as the product of the probability of a fire and the resulting consequence. Burn probabilities at specific intensity classes were estimated for each 100 x 100 m pixel by simulating 100,000 wildfires under burn conditions that replicated recent severe wildfire events that occurred under conditions where fire suppression was generally ineffective (97th percentile, August weather). We repeated the simulation under milder weather (70th percentile, August weather) to replicate a "wildland fire use scenario" where suppression is minimized to
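    The risk definition above (probability of fire times consequence) can be sketched as an expected net value change per pixel. The burn counts, intensity classes, and response values below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy landscape: burn probability per pixel for 3 intensity classes,
# estimated from a made-up count of simulated fires burning each pixel
n_pixels, n_fires = 1000, 100_000
burn_counts = rng.poisson(lam=[40.0, 15.0, 5.0], size=(n_pixels, 3))
bp = burn_counts / n_fires                # burn probability per class

# Response function: relative change in resource value at each intensity
# class (negative = loss); values assumed for illustration
response = np.array([-0.1, -0.4, -0.9])

# Expected net value change per pixel: risk = sum_i BP_i * consequence_i
expected_loss = bp @ response
print("mean expected value change per pixel:", expected_loss.mean())
```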

  20. Hybrid Simulation Modeling to Estimate U.S. Energy Elasticities

    Science.gov (United States)

    Baylin-Stern, Adam C.

    This paper demonstrates how a U.S. application of CIMS, a technologically explicit and behaviourally realistic energy-economy simulation model which includes macro-economic feedbacks, can be used to derive estimates of elasticity of substitution (ESUB) and autonomous energy efficiency index (AEEI) parameters. The ability of economies to reduce greenhouse gas emissions depends on the potential for households and industry to decrease overall energy usage and move from higher- to lower-emissions fuels. Energy economists commonly refer to ESUB estimates to understand the degree of responsiveness of various sectors of an economy, and use the estimates to inform computable general equilibrium models used to study climate policies. Using CIMS, I generated a set of future 'pseudo-data' from a series of simulations in which energy and capital input prices were varied over a wide range. I then used this dataset to estimate the parameters of transcendental logarithmic (translog) production functions using regression techniques. From the production function parameter estimates, I calculated an array of elasticity-of-substitution values between input pairs. Additionally, this paper demonstrates how CIMS can be used to calculate price-independent changes in energy efficiency in the form of the AEEI, by comparing energy consumption between technologically frozen and 'business as usual' simulations. The paper concludes with some ideas for model and methodological improvement, and how these might figure into future work on the estimation of ESUBs from CIMS. Keywords: Elasticity of substitution; hybrid energy-economy model; translog; autonomous energy efficiency index; rebound effect; fuel switching.
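    The estimation strategy described above, fitting a translog production function to simulated pseudo-data by regression, can be sketched as follows for a two-input case. The data-generating process and coefficients are assumptions, and the final step from fitted coefficients to elasticities of substitution is only indicated.

```python
import numpy as np

rng = np.random.default_rng(2)

# Pseudo-data: energy (E) and capital (K) inputs varied over a wide range,
# output Y generated from an assumed technology plus noise.
n = 500
lnE, lnK = rng.uniform(-1, 1, n), rng.uniform(-1, 1, n)
lnY = 0.3 * lnE + 0.6 * lnK - 0.05 * (lnE - lnK) ** 2 + rng.normal(0, 0.01, n)

# Translog production function:
# lnY = a0 + aE lnE + aK lnK + 0.5 bEE lnE^2 + bEK lnE lnK + 0.5 bKK lnK^2
X = np.column_stack([np.ones(n), lnE, lnK,
                     0.5 * lnE**2, lnE * lnK, 0.5 * lnK**2])
coef, *_ = np.linalg.lstsq(X, lnY, rcond=None)
print("fitted translog coefficients:", np.round(coef, 3))
# Elasticities of substitution are then computed from these coefficients
# and the fitted input shares (formulas omitted in this sketch).
```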

  1. Model for Simulating a Spiral Software-Development Process

    Science.gov (United States)

    Mizell, Carolyn; Curley, Charles; Nayak, Umanath

    2010-01-01

    A discrete-event simulation model, and a computer program that implements it, have been developed as a means of analyzing a spiral software-development process. The model can be tailored to specific development environments for use by software project managers in making quantitative cases for deciding among different software-development processes, courses of action, and cost estimates. A spiral process can be contrasted with a waterfall process, a traditional process consisting of a sequence of activities that includes analysis of requirements, design, coding, testing, and support. A spiral process is an iterative process that can be regarded as a repeating, modified waterfall process. Each iteration includes assessment of risk, analysis of requirements, design, coding, testing, delivery, and evaluation. A key difference between a spiral and a waterfall process is that a spiral process can accommodate changes in requirements at each iteration, whereas in a waterfall process requirements are considered fixed from the beginning; a waterfall process is therefore not flexible enough for some projects, especially those in which requirements are not known at the beginning or may change during development. For a given project, a spiral process may cost more and take longer than a waterfall process, but may better satisfy a customer's expectations and needs. Models for simulating various waterfall processes have been developed previously, but until now there have been no models for simulating spiral processes. The present spiral-process model and the software that implements it were developed by extending a discrete-event simulation process model of the IEEE 12207 Software Development Process, which was built using commercially available software known as the Process Analysis Tradeoff Tool (PATT). Typical inputs to PATT models include industry-average values of product size (expressed as number of lines of code
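    A highly simplified sketch of a spiral-process simulation of this kind is shown below: each cycle processes an increment of the backlog while random requirement churn adds new work. All rates and distributions are invented for illustration and are not PATT's.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy spiral-process simulation: per-iteration effort grows with the
# requirements addressed in that cycle; all rates are assumed values.
base_reqs, change_rate = 100, 0.15      # initial requirements, churn per cycle
effort_per_req = 2.0                    # person-days per requirement
total_effort, remaining = 0.0, base_reqs

for cycle in range(1, 6):
    # risk assessment, analysis, design, code, test for this increment
    increment = min(remaining, 30)
    new_reqs = rng.poisson(change_rate * base_reqs)   # requirement churn
    effort = (increment + new_reqs) * effort_per_req
    effort *= rng.uniform(0.9, 1.3)                   # estimation uncertainty
    total_effort += effort
    remaining = remaining - increment + new_reqs
    print(f"cycle {cycle}: effort {effort:6.1f} pd, backlog {remaining}")

print(f"total effort: {total_effort:.1f} person-days")
```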

  2. Monte Carlo simulations of the cellular S-value, lineal energy and RBE for BNCT

    International Nuclear Information System (INIS)

    Liu Chingsheng; Tung Chuanjong

    2006-01-01

    Due to the non-uniform uptake of boron-containing pharmaceuticals in cells and the short ranges of the alpha and lithium particles, microdosimetry provides useful information on the cellular dose and response of boron neutron capture therapy (BNCT). Radiation dose and quality in BNCT may be expressed in terms of the cellular S-value and the lineal energy spectrum. In the present work, Monte Carlo simulations were performed to calculate these microdosimetric parameters for different source-target configurations and sizes in cells. The effective relative biological effectiveness (RBE) of the Tsing Hua Open-pool Reactor (THOR) epithermal neutron beam was evaluated using biological weighting functions that depend on the lineal energy. RBE changes with source-target configurations and sizes were analyzed. (author)

  3. The COD Model: Simulating Workgroup Performance

    Science.gov (United States)

    Biggiero, Lucio; Sevi, Enrico

    Though the question of the determinants of workgroup performance is one of the most central in organization science, precise theoretical frameworks and formal demonstrations are still missing. In order to fill this gap, the COD agent-based simulation model is presented here and used to study the effects of task interdependence and bounded rationality on workgroup performance. The first relevant finding is an algorithmic demonstration of the ordering of interdependencies in terms of complexity, showing that the parallel mode is the simplest, followed by the sequential and then by the reciprocal. This result is far from new in organization science, but what is remarkable is that it now has the strength of an algorithmic demonstration instead of resting on the authoritativeness of some scholar or on episodic empirical findings. The second important result is that the progressive introduction of realistic limits to agents' rationality dramatically reduces workgroup performance and leads to a rather interesting result: when agents' rationality is severely bounded, simple norms work better than complex norms. The third main finding is that when the complexity of interdependence is high, the appropriate coordination mechanism is agents' direct and active collaboration, which means teamwork.

  4. Optical Imaging and Radiometric Modeling and Simulation

    Science.gov (United States)

    Ha, Kong Q.; Fitzmaurice, Michael W.; Moiser, Gary E.; Howard, Joseph M.; Le, Chi M.

    2010-01-01

    OPTOOL software is a general-purpose optical systems analysis tool that was developed to offer a solution to problems associated with computational programs written for the James Webb Space Telescope optical system. It integrates existing routines into coherent processes, and provides a structure with reusable capabilities that allow additional processes to be quickly developed and integrated. It has an extensive graphical user interface, which makes the tool more intuitive and friendly. OPTOOL is implemented using MATLAB with a Fourier optics-based approach for point spread function (PSF) calculations. It features parametric and Monte Carlo simulation capabilities, and uses a direct integration calculation to permit high spatial sampling of the PSF. Exit pupil optical path difference (OPD) maps can be generated using combinations of Zernike polynomials or shaped power spectral densities. The graphical user interface allows rapid creation of arbitrary pupil geometries, and entry of all other modeling parameters to support basic imaging and radiometric analyses. OPTOOL provides the capability to generate wavefront-error (WFE) maps for arbitrary grid sizes. These maps are 2D arrays containing digitally sampled versions of functions ranging from Zernike polynomials to combinations of sinusoidal wave functions in 2D, to functions generated from a spatial frequency power spectral distribution (PSD). It also can generate optical transfer functions (OTFs), which are incorporated into the PSF calculation. The user can specify radiometrics for the target and sky background, and key performance parameters for the instrument's focal plane array (FPA). This radiometric and detector model setup is fairly extensive, and includes parameters such as zodiacal background, thermal emission noise, read noise, and dark current. The setup also includes target spectral energy distribution as a function of wavelength for polychromatic sources, detector pixel size, and the FPA's charge
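    The core Fourier-optics step described above, from an exit-pupil OPD map to a PSF, can be sketched as follows; the grid size, aperture, and defocus-like aberration are assumed for illustration.

```python
import numpy as np

# Minimal Fourier-optics PSF calculation of the kind OPTOOL performs:
# build a circular pupil, apply a wavefront-error (OPD) map, and
# transform to get the point spread function.
n = 256
y, x = np.mgrid[-n//2:n//2, -n//2:n//2] / (n / 4)   # pupil coordinates
r2 = x**2 + y**2
pupil = (r2 <= 1.0).astype(float)

# Simple Zernike defocus term Z4 ~ 2r^2 - 1 as the OPD map (in waves)
opd = 0.25 * (2.0 * r2 - 1.0) * pupil
field = pupil * np.exp(2j * np.pi * opd)

psf = np.abs(np.fft.fftshift(np.fft.fft2(field)))**2
psf /= psf.sum()
print("peak (Strehl-like) value:", psf.max())
```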

  5. Modeling ground-based timber harvesting systems using computer simulation

    Science.gov (United States)

    Jingxin Wang; Chris B. LeDoux

    2001-01-01

    The modeling of ground-based timber harvesting systems with an object-oriented methodology was investigated. Object-oriented modeling and design promote a better understanding of requirements, cleaner designs, and better maintainability of the harvesting simulation system. The model developed simulates chainsaw felling, drive-to-tree feller-buncher, swing-to-tree single-grip...

  6. Simulation Modeling of a Facility Layout in Operations Management Classes

    Science.gov (United States)

    Yazici, Hulya Julie

    2006-01-01

    Teaching quantitative courses can be challenging. Similarly, layout modeling and lean production concepts can be difficult to grasp in an introductory OM (operations management) class. This article describes a simulation model developed in PROMODEL to facilitate the learning of layout modeling and lean manufacturing. Simulation allows for the…

  7. Dynamic wind turbine models in power system simulation tool

    DEFF Research Database (Denmark)

    Hansen, A.; Jauch, Clemens; Soerensen, P.

    The present report describes the dynamic wind turbine models implemented in the power system simulation tool DIgSILENT. The developed models are a part of the results of a national research project, whose overall objective is to create a model database in different simulation tools. The report...

  8. A New Model for Simulating TSS Washoff in Urban Areas

    Directory of Open Access Journals (Sweden)

    E. Crobeddu

    2011-01-01

    Full Text Available This paper presents the formulation and validation of the conceptual Runoff Quality Simulation Model (RQSM), which was developed to simulate the erosion and transport of solid particles in urban areas. The RQSM assumes that the supply of solid particles on pervious and impervious areas is infinite. The RQSM simulates soil erosion using rainfall kinetic energy, and solid-particle transport using linear system theory. A sensitivity analysis was conducted on the RQSM to show the influence of each parameter on the simulated load. Total suspended solid (TSS) loads monitored at the outlet of the borough of Verdun in Canada and at three catchment outlets of the City of Champaign in the United States were used to validate the RQSM. TSS loads simulated by the RQSM were compared to measured loads and to loads simulated by the Rating Curve model and the Exponential model of the SWMM software. The simulation performance of the RQSM was comparable to that of the Exponential and Rating Curve models.
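    A minimal sketch of the two mechanisms named above, detachment driven by rainfall kinetic energy and transport as a linear-system convolution, is given below. The kinetic-energy relation, coefficients, and response kernel are assumptions, not the RQSM's calibrated forms.

```python
import numpy as np

# Detachment driven by rainfall kinetic energy, transport as a
# unit-response convolution (linear system theory).
rain = np.array([0.0, 2.0, 8.0, 5.0, 1.0, 0.0])      # rainfall intensity, mm/h
ke = 8.95 + 8.44 * np.log10(np.maximum(rain, 0.1))   # kinetic energy proxy
detached = 0.002 * ke * rain                          # TSS detachment, g/m^2

# Linear transport: convolve detachment with an assumed response kernel
kernel = np.array([0.5, 0.3, 0.2])
tss_load = np.convolve(detached, kernel)[: len(rain)]
print(np.round(tss_load, 2))
```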

  9. Business Process Simulation: Requirements for Business and Resource Models

    Directory of Open Access Journals (Sweden)

    Audrius Rima

    2015-07-01

    Full Text Available The purpose of Business Process Model and Notation (BPMN) is to provide an easily understandable graphical representation of business processes. BPMN is therefore widely applied in various areas, one of them being business process simulation. This paper addresses some problems of BPMN-model-based business process simulation and formulates requirements for business process and resource models to enable their use for business process simulation.

  10. Value creation in the cloud: understanding business model factors affecting value of cloud computing

    OpenAIRE

    Morgan, Lorraine; Conboy, Kieran

    2013-01-01

    Despite the rapid emergence of cloud technology, its prevalence and accessibility to all types of organizations, and its potential to profoundly shift competitive landscapes by providing a new platform for creating and delivering business value, empirical research on the business value of cloud computing, and in particular on how service providers create value for their customers, is quite limited. Of what little research exists to date, most focuses on technical issu...

  11. Structure simulation of a pre-stressed concrete containment model

    International Nuclear Information System (INIS)

    Grebner, H.; Sievers, J.

    2004-01-01

    An axisymmetric finite-element model of the 1:4-scale pre-stressed containment model tested at SANDIA was developed. The model is loaded by the pre-stressing of the tendons and by increasing internal pressure (up to 1.3 MPa). The analysis results, in terms of displacements and strains in the liner, the rebars, the tendons and the concrete of the cylindrical part, agree well with measured data up to about 0.6 MPa internal pressure (i.e. 1.5 times the design pressure). The first circumferential micro-cracks in the concrete appear at about 0.75 MPa. With increasing pressure, micro-cracks extend through the whole wall. Above about 0.9 MPa, the formation of micro-cracks in the radial and meridional directions is calculated. At the maximum load (1.3 MPa), almost all concrete parts of the model have micro-cracks, which may cause leaks. Nevertheless, failure of the containment model is not expected for loads up to 1.3 MPa if geometric inhomogeneities due to penetrations in the wall are not considered. Although the calculated strains in the liner, rebars and tendons show some plastification, the maximum values are below the critical ones. The safety margin against failure is smallest in some hoop tendons. At present, parametric studies are being performed to investigate the differences between calculations and measured data, and three-dimensional models are being developed for a better simulation of the meridional tendons in the dome region. (orig.)

  12. A new method to assess the added value of high-resolution regional climate simulations: application to the EURO-CORDEX dataset

    Science.gov (United States)

    Soares, P. M. M.; Cardoso, R. M.

    2017-12-01

    Regional climate models (RCMs) are used at increasingly fine resolutions in pursuit of an improved representation of regional- to local-scale atmospheric phenomena. The EURO-CORDEX simulations at 0.11° and simulations with grid spacings approaching the convection-permitting regime are representative examples. These climate runs are computationally very demanding and do not always show improvements; the gains depend on the region, the variable, and the object of study. The gain or loss associated with the use of higher resolution relative to the forcing model (global climate model or reanalysis), or to RCM simulations at other resolutions, is known as added value. Its characterization is a long-standing issue, and many different added-value measures have been proposed. In the current paper, a new method is proposed to assess the added value of finer-resolution simulations in comparison with their forcing data or coarser-resolution counterparts. The approach builds on a probability density function (PDF) matching score, giving a normalised measure of the difference between PDFs at diverse resolutions, mediated by the observational ones. The distribution added value (DAV) is an objective added-value measure that can be applied to any variable, region or temporal scale, from hindcast or historical (non-synchronous) simulations. The DAV metric and an application to the EURO-CORDEX simulations, for daily temperature and precipitation, are presented here. For precipitation, the EURO-CORDEX simulations at both resolutions (0.44°, 0.11°) display a clear added value relative to ERA-Interim, with values around 30% in summer and 20% in the intermediate seasons. When the two RCM resolutions are compared directly, the added value is limited. The regions with the larger precipitation DAVs are areas where convection is relevant, e.g. the Alps and Iberia. When looking at the extreme-precipitation PDF tail, the higher-resolution improvement is generally greater than the lower-resolution one for seasons
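    A PDF matching score of the kind underlying the DAV can be sketched as the overlap of binned empirical distributions. The synthetic data and the exact DAV normalisation below are assumptions for illustration, not the authors' code.

```python
import numpy as np

def pdf_match_score(model, obs, bins=50):
    """Overlap of two empirical PDFs on a common grid (0 = none, 1 = identical)."""
    lo = min(model.min(), obs.min())
    hi = max(model.max(), obs.max())
    pm, _ = np.histogram(model, bins=bins, range=(lo, hi))
    po, _ = np.histogram(obs, bins=bins, range=(lo, hi))
    pm = pm / pm.sum()
    po = po / po.sum()
    return np.minimum(pm, po).sum()

rng = np.random.default_rng(4)
obs = rng.gamma(2.0, 3.0, 5000)       # stand-in daily precipitation
rcm_hi = rng.gamma(2.0, 3.2, 5000)    # finer-grid simulation
rcm_lo = rng.gamma(1.5, 4.0, 5000)    # coarser counterpart / forcing data

s_hi = pdf_match_score(rcm_hi, obs)
s_lo = pdf_match_score(rcm_lo, obs)
# One plausible normalised added-value measure (assumed form)
print(f"skill hi={s_hi:.3f} lo={s_lo:.3f} DAV={(s_hi - s_lo) / s_lo:+.2%}")
```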

  13. EVT in electricity price modeling : extreme value theory not only on the extreme events

    International Nuclear Information System (INIS)

    Marossy, Z.

    2007-01-01

    Extreme value theory (EVT) is commonly used in electricity and financial risk modelling. In this study, EVT was used to model the distribution of electricity prices, building on price formation in electricity auction markets. The paper reviewed the 3 main modelling approaches used to describe the distribution of electricity prices. The first approach is based on a stochastic model of the electricity price time series and uses this stochastic model to derive the price distribution. The second approach involves the electricity supply and demand factors that determine the price distribution. The third approach involves agent-based models, which use simulation techniques to produce the price distribution. A fourth modelling approach was then proposed: it determines the distribution of electricity prices directly, without requiring knowledge of the data-generating process or the market driving forces. Empirical data confirmed that electricity prices follow a generalized extreme value (GEV) distribution. 8 refs., 2 tabs., 5 figs
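    Fitting a GEV distribution to a price series, as in the paper's empirical check, can be sketched with scipy; the synthetic "prices" and their parameter values are placeholders.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
# Synthetic stand-in for an electricity price series (assumed parameters)
prices = stats.genextreme.rvs(c=-0.2, loc=40.0, scale=10.0,
                              size=2000, random_state=rng)

# Maximum-likelihood fit of the GEV family
shape, loc, scale = stats.genextreme.fit(prices)
print(f"GEV fit: shape={shape:.3f}, loc={loc:.2f}, scale={scale:.2f}")

# Kolmogorov-Smirnov check of the fitted distribution
ks = stats.kstest(prices, "genextreme", args=(shape, loc, scale))
print(f"KS statistic={ks.statistic:.4f}, p-value={ks.pvalue:.3f}")
```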

  14. Mathematical model and simulations of radiation fluxes from buried radionuclides

    International Nuclear Information System (INIS)

    Ahmad Saat

    1999-01-01

    A mathematical model and a simple Monte Carlo simulation were developed to predict radiation fluxes from buried radionuclides, and both were applied to measured (experimental) data. The results of the mathematical model showed acceptable order-of-magnitude agreement, and good agreement was also obtained between the simple simulation and the experimental results. Thus, knowing the radionuclide distribution profile in soil from a core sample, the model or the simulation can be applied to estimate the radiation fluxes emerging from the soil surface. (author)
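    A toy version of such a simulation follows: photons emitted at depth escape the soil surface if they survive exponential attenuation along their slant path. The attenuation coefficient and depth profile are assumed, and only upward-going emissions are tracked.

```python
import numpy as np

rng = np.random.default_rng(6)

mu = 0.1                                   # soil attenuation coefficient, 1/cm
n = 100_000
depth = rng.exponential(5.0, n)            # assumed radionuclide depth profile, cm
cos_t = rng.uniform(1e-6, 1.0, n)          # upward emission direction cosines
path = depth / cos_t                       # slant path length to the surface

# A photon escapes if it survives attenuation along its path
escaped = rng.uniform(0.0, 1.0, n) < np.exp(-mu * path)
print("fraction of upward emissions reaching the surface:", escaped.mean())
```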

  15. Quantum Link Models and Quantum Simulation of Gauge Theories

    International Nuclear Information System (INIS)

    Wiese, U.J.

    2015-01-01

    This lecture is about Quantum Link Models and Quantum Simulation of Gauge Theories, and it consists of four parts. The first part gives a brief history of computing and introduces pioneers of quantum computing and quantum simulations of quantum spin systems. The second part is about high-temperature superconductors versus QCD, Wilson's lattice QCD and Abelian quantum link models. The third part deals with quantum simulators for Abelian lattice gauge theories and non-Abelian quantum link models. The last part of the lecture discusses quantum simulators mimicking 'nuclear' physics and the continuum limit of D-theory models. (nowak)

  16. Modeling and Simulation of U-tube Steam Generator

    Science.gov (United States)

    Zhang, Mingming; Fu, Zhongguang; Li, Jinyao; Wang, Mingfei

    2018-03-01

    The U-tube natural-circulation steam generator was investigated mainly through modeling and simulation in this article. The research is based on the simuworks system simulation software platform. By analyzing the structural characteristics and the operating principle of a U-tube steam generator, the model was divided into 14 control volumes, including the primary side, secondary side, downcomer channel, steam plenum, etc. The model depends entirely on conservation laws, and it was applied to several simulation tests. The results show that the model properly simulates the dynamic response of a U-tube steam generator.

  17. Functional Decomposition of Modeling and Simulation Terrain Database Generation Process

    National Research Council Canada - National Science Library

    Yakich, Valerie R; Lashlee, J. D

    2008-01-01

    .... This report documents the conceptual procedure as implemented by Lockheed Martin Simulation, Training, and Support and decomposes terrain database construction using the Integration Definition for Function Modeling (IDEF...

  18. Global Information Enterprise (GIE) Modeling and Simulation (GIESIM)

    National Research Council Canada - National Science Library

    Bell, Paul

    2005-01-01

    ... AND S) toolkits into the Global Information Enterprise (GIE) Modeling and Simulation (GIESim) framework to create effective user analysis of candidate communications architectures and technologies...

  19. Modeling, Simulation and Position Control of 3DOF Articulated Manipulator

    Directory of Open Access Journals (Sweden)

    Hossein Sadegh Lafmejani

    2014-08-01

    Full Text Available In this paper, the modeling, simulation and position control of a 3-degree-of-freedom (3DOF) articulated robotic manipulator are studied. First, the kinematic and dynamic equations of the manipulator are derived using the Lagrange method. To validate the analytical model, the model simulated in the Matlab environment is compared with a model built with the SimMechanics toolbox. A sample path is designed for analyzing trajectory tracking. The system is linearized via feedback linearization, and a PID controller is then applied to track a reference trajectory. Finally, the control results are compared with those of a nonlinear PID controller.
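    A one-joint sketch of the control scheme named above, feedback linearization reducing the joint to a double integrator with an outer PID loop tracking a reference angle, is shown below. The plant parameters and gains are assumptions for illustration.

```python
import numpy as np

J, b, g_term = 0.5, 0.1, 2.0            # inertia, damping, gravity torque
kp, ki, kd = 100.0, 20.0, 25.0          # assumed PID gains
dt, t_end = 1e-3, 3.0

theta, omega, integ = 0.0, 0.0, 0.0
for k in range(int(t_end / dt)):
    t = k * dt
    ref = 0.5 * np.sin(t)               # reference trajectory
    err = ref - theta
    integ += err * dt
    v = kp * err + ki * integ - kd * omega       # outer PID law
    # feedback linearization: cancel known dynamics, command acceleration v
    tau = J * v + b * omega + g_term * np.sin(theta)
    # plant: J*theta'' = tau - b*theta' - g_term*sin(theta)
    alpha = (tau - b * omega - g_term * np.sin(theta)) / J
    omega += alpha * dt
    theta += omega * dt

print(f"final tracking error: {err:+.4f} rad")
```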

  20. Evaluation of Three Models for Simulating Pesticide Runoff from Irrigated Agricultural Fields.

    Science.gov (United States)

    Zhang, Xuyang; Goh, Kean S

    2015-11-01

    Three models were evaluated for their accuracy in simulating pesticide runoff at the edge of agricultural fields: the Pesticide Root Zone Model (PRZM), the Root Zone Water Quality Model (RZWQM), and OpusCZ. Modeling results for runoff volume, sediment erosion, and pesticide loss were compared with measurements taken from field studies. The models were also compared on their theoretical foundations and ease of use. For runoff events generated by sprinkler irrigation and rainfall, all models performed equally well, with small errors in simulating water, sediment, and pesticide runoff; the mean absolute percentage errors (MAPEs) were between 3 and 161%. For flood irrigation, OpusCZ simulated runoff and pesticide mass with the highest accuracy, followed by RZWQM and PRZM, likely owing to its unique hydrological algorithm for runoff simulation during flood irrigation. Simulation results from cold model runs by OpusCZ and RZWQM, using measured values for model inputs, matched the observed values closely; the MAPE ranged from 28 to 384% and 42 to 168% for OpusCZ and RZWQM, respectively. These satisfactory outputs demonstrated the models' ability to mimic reality. Theoretical evaluations indicated that OpusCZ and RZWQM use mechanistic approaches for hydrology simulation, output data on a subdaily time step, and are able to simulate management practices and subsurface flow via tile drainage. In contrast, PRZM operates at a daily time step and simulates surface runoff using the USDA Soil Conservation Service's curve number method. Among the three models, OpusCZ and RZWQM were suitable for simulating pesticide runoff in semiarid areas where agriculture is heavily dependent on irrigation. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
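    The comparison statistic used above is straightforward to compute; a small sketch with made-up event loads:

```python
import numpy as np

def mape(obs, sim):
    """Mean absolute percentage error, as reported in the model comparison."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100.0 * np.mean(np.abs((sim - obs) / obs))

# Example with invented event loads (kg/ha) for one model vs. observations
obs = np.array([0.52, 1.10, 0.33, 2.40])
sim = np.array([0.61, 0.95, 0.41, 2.10])
print(f"MAPE = {mape(obs, sim):.1f}%")
```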