WorldWideScience

Sample records for event simulation modeling

  1. Desktop Modeling and Simulation: Parsimonious, yet Effective Discrete-Event Simulation Analysis

    Science.gov (United States)

    Bradley, James R.

    2012-01-01

    This paper evaluates how quickly students can be trained to construct useful discrete-event simulation models using Excel. The typical supply chain used by many large national retailers is described, and an Excel-based simulation model of it is constructed. The set of programming and simulation skills required to develop that model is then determined, and we conclude that six hours of training are sufficient to teach those skills to MBA students. The simulation presented here contains all the fundamental functionality of a simulation model, and so our result holds for any discrete-event simulation model. We argue, therefore, that industry workers with the same technical skill set as students who have completed one year of an MBA program can be quickly trained to construct simulation models. This result gives credence to the efficacy of Desktop Modeling and Simulation, whereby simulation analyses can be quickly developed, run, and analyzed with widely available software, namely Excel.
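
    As a rough illustration of the kind of logic such a spreadsheet model encodes, the sketch below simulates a single-echelon, periodic-review retail inventory system in Python rather than Excel; the demand distribution, lead time, and order-up-to level are invented placeholders, not parameters from the paper.

      import random

      random.seed(1)
      ORDER_UP_TO = 120      # hypothetical order-up-to level
      LEAD_TIME = 2          # weeks between ordering and receipt (assumed)
      WEEKS = 52

      on_hand, pipeline = 100, []   # pipeline holds (arrival_week, qty)
      stockouts = 0
      for week in range(WEEKS):
          # receive any orders due this week
          on_hand += sum(q for t, q in pipeline if t == week)
          pipeline = [(t, q) for t, q in pipeline if t > week]
          demand = random.randint(10, 30)          # placeholder demand
          stockouts += max(0, demand - on_hand)
          on_hand = max(0, on_hand - demand)
          # periodic review: order up to the target, arriving after the lead time
          outstanding = sum(q for _, q in pipeline)
          order = ORDER_UP_TO - on_hand - outstanding
          if order > 0:
              pipeline.append((week + LEAD_TIME, order))
      print(f"unmet demand over {WEEKS} weeks: {stockouts}")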

  2. Modeling and simulation of discrete event systems

    CERN Document Server

    Choi, Byoung Kyu

    2013-01-01

    Computer modeling and simulation (M&S) allows engineers to study and analyze complex systems. Discrete-event system (DES)-M&S is used in modern management, industrial engineering, computer science, and the military. As computer speeds and memory capacity increase, DES-M&S tools become more powerful and more widely used in solving real-life problems. Based on over 20 years of evolution within a classroom environment, as well as on decades-long experience in developing simulation-based solutions for high-tech industries, Modeling and Simulation of Discrete-Event Systems is the only book on

  3. Event-based Simulation Model for Quantum Optics Experiments

    NARCIS (Netherlands)

    De Raedt, H.; Michielsen, K.; Jaeger, G.; Khrennikov, A.; Schlosshauer, M.; Weihs, G.

    2011-01-01

    We present a corpuscular simulation model of optical phenomena that does not require the knowledge of the solution of a wave equation of the whole system and reproduces the results of Maxwell's theory by generating detection events one-by-one. The event-based corpuscular model gives a unified

  4. Discrete event simulation: Modeling simultaneous complications and outcomes

    NARCIS (Netherlands)

    Quik, E.H.; Feenstra, T.L.; Krabbe, P.F.M.

    2012-01-01

    OBJECTIVES: To present an effective and elegant model approach to deal with specific characteristics of complex modeling. METHODS: A discrete event simulation (DES) model with multiple complications and multiple outcomes that each can occur simultaneously was developed. In this DES model parameters,

  5. Optimization of Operations Resources via Discrete Event Simulation Modeling

    Science.gov (United States)

    Joshi, B.; Morris, D.; White, N.; Unal, R.

    1996-01-01

    The resource levels required for operation and support of reusable launch vehicles are typically defined through discrete event simulation modeling. Minimizing these resources constitutes an optimization problem involving discrete variables and simulation. Conventional approaches to solve such optimization problems involving integer valued decision variables are the pattern search and statistical methods. However, in a simulation environment that is characterized by search spaces of unknown topology and stochastic measures, these optimization approaches often prove inadequate. In this paper, we have explored the applicability of genetic algorithms to the simulation domain. Genetic algorithms provide a robust search strategy that does not require continuity and differentiability of the problem domain. The genetic algorithm successfully minimized the operation and support activities for a space vehicle, through a discrete event simulation model. The practical issues associated with simulation optimization, such as stochastic variables and constraints, were also taken into consideration.
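
    The record gives no implementation details, so the following is only a sketch of the general technique it names: a genetic algorithm searching over integer resource levels, with a noisy cost function standing in for the discrete event simulation. All names and constants are illustrative.

      import random

      random.seed(7)

      def simulate(levels):
          """Stand-in for a stochastic discrete-event simulation run: cost
          grows with resources held and with shortfalls below a target."""
          target = [5, 3, 8, 4]
          noise = random.gauss(0, 2)
          return (sum(levels) * 10
                  + sum(max(0, t - x) ** 2 * 50 for x, t in zip(levels, target))
                  + noise)

      def fitness(ind, reps=5):
          # average several replications to damp simulation noise
          return sum(simulate(ind) for _ in range(reps)) / reps

      def evolve(pop_size=20, gens=40, n_vars=4, lo=0, hi=10):
          pop = [[random.randint(lo, hi) for _ in range(n_vars)]
                 for _ in range(pop_size)]
          for _ in range(gens):
              pop.sort(key=fitness)
              survivors = pop[:pop_size // 2]
              children = []
              while len(children) < pop_size - len(survivors):
                  a, b = random.sample(survivors, 2)
                  cut = random.randrange(1, n_vars)          # one-point crossover
                  child = a[:cut] + b[cut:]
                  if random.random() < 0.2:                  # mutation
                      i = random.randrange(n_vars)
                      child[i] = random.randint(lo, hi)
                  children.append(child)
              pop = survivors + children
          return min(pop, key=fitness)

      print("best resource levels:", evolve())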

  6. Discrete-Event Simulation

    Directory of Open Access Journals (Sweden)

    Prateek Sharma

    2015-04-01

    Simulation can be regarded as the emulation of the behavior of a real-world system over an interval of time. The process of simulation relies upon generating the history of a system and then analyzing that history to predict outcomes and improve the working of real systems. Simulations can be of various kinds, but the topic of interest here is one of the most important kinds of simulation, Discrete-Event Simulation, which models the system as a discrete sequence of events in time. This paper therefore aims to introduce Discrete-Event Simulation and to analyze how it benefits real-world systems.
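
    To make "a discrete sequence of events in time" concrete, here is a minimal next-event loop for a single-server queue, assuming exponential interarrival and service times; it is a generic textbook sketch, not code from the paper.

      import heapq, random

      random.seed(0)
      clock, queue_len, busy = 0.0, 0, False
      events = [(random.expovariate(1.0), "arrival")]   # (time, kind)
      served = 0

      while events and clock < 1000:
          clock, kind = heapq.heappop(events)
          if kind == "arrival":
              heapq.heappush(events, (clock + random.expovariate(1.0), "arrival"))
              if busy:
                  queue_len += 1
              else:
                  busy = True
                  heapq.heappush(events, (clock + random.expovariate(1.25), "departure"))
          else:  # departure
              served += 1
              if queue_len:
                  queue_len -= 1
                  heapq.heappush(events, (clock + random.expovariate(1.25), "departure"))
              else:
                  busy = False
      print(f"served {served} customers by t={clock:.1f}")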

  7. Powering stochastic reliability models by discrete event simulation

    DEFF Research Database (Denmark)

    Kozine, Igor; Wang, Xiaoyun

    2012-01-01

    … it difficult to find a solution to the problem. The power of modern computers and recent developments in discrete-event simulation (DES) software make it possible to diminish some of the drawbacks of stochastic models. In this paper we describe the insights we have gained from using both Markov and DES models...

  8. Modeling and simulation of single-event effect in CMOS circuit

    International Nuclear Information System (INIS)

    Yue Suge; Zhang Xiaolin; Zhao Yuanfu; Liu Lin; Wang Hanning

    2015-01-01

    This paper reviews the status of research in modeling and simulation of single-event effects (SEE) in digital devices and integrated circuits. After a brief historical overview of SEE simulation, simulation approaches at different levels are detailed: material-level physical simulation, where the two primary methods by which ionizing radiation releases charge in a semiconductor device (direct ionization and indirect ionization) are introduced; device-level simulation, which focuses on the main emerging physical phenomena affecting nanometer devices (the bipolar transistor effect and the charge sharing effect) and the methods envisaged for taking them into account; and circuit-level simulation, where methods for predicting the single-event response, i.e. the production and propagation of single-event transients (SETs) in sequential and combinatorial logic, are detailed. Soft error rate trends with scaling are also addressed. (review)

  9. A Framework for the Optimization of Discrete-Event Simulation Models

    Science.gov (United States)

    Joshi, B. D.; Unal, R.; White, N. H.; Morris, W. D.

    1996-01-01

    With the growing use of computer modeling and simulation in all aspects of engineering, the scope of traditional optimization has to be extended to include simulation models. Some unique aspects have to be addressed when optimizing via stochastic simulation models. The optimization procedure has to explicitly account for the randomness inherent in the stochastic measures predicted by the model. This paper outlines a general-purpose framework for the optimization of terminating discrete-event simulation models. The methodology combines a chance-constraint approach for problem formulation with standard statistical estimation and analysis techniques. The applicability of the optimization framework is illustrated by minimizing the operation and support resources of a launch vehicle through a simulation model.
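
    The abstract does not spell out the chance-constraint formulation; a generic version for a decision vector x and a simulated response g(x, ω), written here as an assumed illustration, is

      \[
      \min_{x \in \mathbb{Z}^n} \; \mathbb{E}[f(x,\omega)]
      \quad \text{subject to} \quad
      \Pr\{\, g(x,\omega) \le b \,\} \ge 1 - \alpha,
      \]

    where the expectation and the probability are estimated from independent simulation replications and \alpha is the accepted risk of violating the constraint.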

  10. Synchronization Of Parallel Discrete Event Simulations

    Science.gov (United States)

    Steinman, Jeffrey S.

    1992-01-01

    Adaptive, parallel, discrete-event-simulation-synchronization algorithm, Breathing Time Buckets, developed in Synchronous Parallel Environment for Emulation and Discrete Event Simulation (SPEEDES) operating system. Algorithm allows parallel simulations to process events optimistically in fluctuating time cycles that naturally adapt while simulation is in progress. Combines best of optimistic and conservative synchronization strategies while avoiding major disadvantages. Well suited for modeling communication networks, for large-scale war games, for simulated flights of aircraft, for simulations of computer equipment, for mathematical modeling, for interactive engineering simulations, and for depictions of flows of information.
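
    SPEEDES itself is not reproduced here; the sketch below only illustrates the adaptive-cycle idea in sequential form: each cycle, events are processed optimistically up to an event horizon (the earliest timestamp of any newly generated event), so the cycle width "breathes" with the workload. The names and structure are illustrative, not the SPEEDES API.

      import heapq

      def breathing_time_buckets(pending, handler):
          """pending: list of (timestamp, payload) events; handler returns
          a list of new (timestamp, payload) events it generates."""
          heapq.heapify(pending)
          while pending:
              horizon = float("inf")
              cycle, generated = [], []
              # Process events until one would cross the horizon: no event at
              # t < horizon can be affected by a newly generated event, so the
              # whole cycle can be committed without rollback.
              while pending and pending[0][0] < horizon:
                  t, payload = heapq.heappop(pending)
                  new_events = handler(t, payload)
                  generated.extend(new_events)
                  if new_events:
                      horizon = min(horizon, min(ts for ts, _ in new_events))
                  cycle.append((t, payload))
              for ev in generated:                 # commit: merge new events
                  heapq.heappush(pending, ev)
              yield cycle                          # one adaptive "time bucket"

      # toy handler: each event at t spawns one follow-up at t + 3 until t > 10
      for bucket in breathing_time_buckets(
              [(0.0, "a"), (1.0, "b")],
              lambda t, p: [(t + 3, p)] if t <= 10 else []):
          print([f"{t:.0f}{p}" for t, p in bucket])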

  11. Modeling a Million-Node Slim Fly Network Using Parallel Discrete-Event Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Wolfe, Noah; Carothers, Christopher; Mubarak, Misbah; Ross, Robert; Carns, Philip

    2016-05-15

    As supercomputers close in on exascale performance, the increased number of processors and processing power translates to an increased demand on the underlying network interconnect. The Slim Fly network topology, a new low-diameter and low-latency interconnection network, is gaining interest as one possible solution for next-generation supercomputing interconnect systems. In this paper, we present a high-fidelity Slim Fly flit-level model leveraging the Rensselaer Optimistic Simulation System (ROSS) and Co-Design of Exascale Storage (CODES) frameworks. We validate our Slim Fly model against the Kathareios et al. Slim Fly model results provided at moderately sized network scales. We further scale the model up to an unprecedented 1 million compute nodes; and through visualization of network simulation metrics such as link bandwidth, packet latency, and port occupancy, we gain insight into the network behavior at the million-node scale. We also show linear strong scaling of the Slim Fly model on an Intel cluster, achieving a peak event rate of 36 million events per second using 128 MPI tasks to process 7 billion events. Detailed analysis of the underlying discrete-event simulation performance shows that a million-node Slim Fly model simulation can execute in 198 seconds on the Intel cluster.

  12. Discrete Event Simulation Model of the Polaris 2.1 Gamma Ray Imaging Radiation Detection Device

    Science.gov (United States)

    2016-06-01

    Master's thesis, June 2016; approved for public release, distribution is unlimited. The platform Simkit was utilized to create a discrete event simulation (DES) model of the Polaris 2.1 gamma ray imaging radiation detection device.

  13. Synchronization Techniques in Parallel Discrete Event Simulation

    OpenAIRE

    Lindén, Jonatan

    2018-01-01

    Discrete event simulation is an important tool for evaluating system models in many fields of science and engineering. To improve the performance of large-scale discrete event simulations, several techniques to parallelize discrete event simulation have been developed. In parallel discrete event simulation, the work of a single discrete event simulation is distributed over multiple processing elements. A key challenge in parallel discrete event simulation is to ensure that causally dependent ...

  14. Discrete event simulation tool for analysis of qualitative models of continuous processing systems

    Science.gov (United States)

    Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)

    1990-01-01

    An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized into four modules: a library design module, a model construction module, a simulation module, and an experimentation and analysis module. The library design module supports the building of library knowledge, including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability to compare log files.

  15. Can discrete event simulation be of use in modelling major depression?

    Science.gov (United States)

    Le Lay, Agathe; Despiegel, Nicolas; François, Clément; Duru, Gérard

    2006-12-05

    Depression is among the major contributors to worldwide disease burden, and adequate modelling requires a framework designed to depict real-world disease progression, as well as its economic implications, as closely as possible. In light of the specific characteristics associated with depression (multiple episodes at varying intervals, impact of disease history on course of illness, sociodemographic factors), our aim was to clarify to what extent "Discrete Event Simulation" (DES) models provide methodological benefits in depicting disease evolution. We conducted a comprehensive review of published Markov models in depression and identified potential limits to their methodology. A model based on DES principles was developed to investigate the benefits and drawbacks of this simulation method compared with Markov modelling techniques. The major drawback to Markov models is that they may not be suitable for tracking patients' disease history properly unless the analyst defines multiple health states, which may lead to intractable situations. They are also too rigid to take into consideration multiple patient-specific sociodemographic characteristics in a single model; to do so would likewise require defining multiple health states, rendering the analysis entirely too complex. We show that DES resolves these weaknesses and that its flexibility allows patients with differing attributes to move from one event to another in sequential order while simultaneously taking into account important risk factors such as age, gender, disease history and patients' attitudes towards treatment, together with any disease-related events (adverse events, suicide attempts, etc.). DES modelling appears to be an accurate, flexible and comprehensive means of depicting disease progression compared with conventional simulation methodologies. Its use in analysing recurrent and chronic diseases appears particularly useful compared with Markov processes.
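
    As a schematic (not the authors' model) of the patient-level flexibility described above, each simulated patient below carries attributes that modulate the sampled time to the next depressive episode; the hazard constants are invented, not clinical estimates.

      import random

      random.seed(42)

      def next_episode_gap(age, prior_episodes):
          """Hypothetical hazard: more prior episodes -> shorter remission;
          the rate constants are illustrative only."""
          rate = 0.1 * (1 + 0.3 * prior_episodes) * (1.1 if age > 60 else 1.0)
          return random.expovariate(rate)

      def simulate_patient(age, horizon_years=10):
          t, episodes, history = 0.0, 0, []
          while True:
              t += next_episode_gap(age + t, episodes)
              if t > horizon_years:
                  break
              episodes += 1
              history.append(round(t, 1))
          return history

      for pid in range(3):
          age = random.randint(25, 70)
          print(f"patient {pid} (age {age}): episodes at years {simulate_patient(age)}")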

  16. DeMO: An Ontology for Discrete-event Modeling and Simulation

    Science.gov (United States)

    Silver, Gregory A.; Miller, John A.; Hybinette, Maria; Baramidze, Gregory; York, William S.

    2011-01-01

    Several fields have created ontologies for their subdomains. For example, the biological sciences have developed extensive ontologies such as the Gene Ontology, which is considered a great success. Ontologies could provide similar advantages to the Modeling and Simulation community. They provide a way to establish common vocabularies and capture knowledge about a particular domain with community-wide agreement. Ontologies can support significantly improved (semantic) search and browsing, integration of heterogeneous information sources, and improved knowledge discovery capabilities. This paper discusses the design and development of an ontology for Modeling and Simulation called the Discrete-event Modeling Ontology (DeMO), and it presents prototype applications that demonstrate various uses and benefits that such an ontology may provide to the Modeling and Simulation community. PMID:22919114

  17. Can discrete event simulation be of use in modelling major depression?

    Directory of Open Access Journals (Sweden)

    François Clément

    2006-12-01

    Background: Depression is among the major contributors to worldwide disease burden, and adequate modelling requires a framework designed to depict real-world disease progression, as well as its economic implications, as closely as possible. Objectives: In light of the specific characteristics associated with depression (multiple episodes at varying intervals, impact of disease history on course of illness, sociodemographic factors), our aim was to clarify to what extent "Discrete Event Simulation" (DES) models provide methodological benefits in depicting disease evolution. Methods: We conducted a comprehensive review of published Markov models in depression and identified potential limits to their methodology. A model based on DES principles was developed to investigate the benefits and drawbacks of this simulation method compared with Markov modelling techniques. Results: The major drawback to Markov models is that they may not be suitable for tracking patients' disease history properly unless the analyst defines multiple health states, which may lead to intractable situations. They are also too rigid to take into consideration multiple patient-specific sociodemographic characteristics in a single model; to do so would likewise require defining multiple health states, rendering the analysis entirely too complex. We show that DES resolves these weaknesses and that its flexibility allows patients with differing attributes to move from one event to another in sequential order while simultaneously taking into account important risk factors such as age, gender, disease history and patients' attitudes towards treatment, together with any disease-related events (adverse events, suicide attempts, etc.). Conclusion: DES modelling appears to be an accurate, flexible and comprehensive means of depicting disease progression compared with conventional simulation methodologies. Its use in analysing recurrent and chronic diseases appears particularly useful compared with Markov processes.

  18. Discrete-Event Simulation

    OpenAIRE

    Prateek Sharma

    2015-01-01

    Simulation can be regarded as the emulation of the behavior of a real-world system over an interval of time. The process of simulation relies upon generating the history of a system and then analyzing that history to predict outcomes and improve the working of real systems. Simulations can be of various kinds, but the topic of interest here is one of the most important kinds of simulation, Discrete-Event Simulation, which models the system as a discrete sequence of ev...

  19. Parallel discrete event simulation using shared memory

    Science.gov (United States)

    Reed, Daniel A.; Malony, Allen D.; Mccredie, Bradley D.

    1988-01-01

    With traditional event-list techniques, evaluating a detailed discrete-event simulation model can often require hours or even days of computation time. By eliminating the event list and maintaining only sufficient synchronization to ensure causality, parallel simulation can potentially provide speedups that are linear in the number of processors. A set of shared-memory experiments, using the Chandy-Misra distributed simulation algorithm to simulate networks of queues, is presented. Parameters of the study include queueing network topology and routing probabilities, number of processors, and assignment of network nodes to processors. These experiments show that Chandy-Misra distributed simulation is a questionable alternative to sequential simulation of most queueing network models.
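
    For readers unfamiliar with the algorithm, the Chandy-Misra approach is conservative: a logical process may only consume an input event once every input channel guarantees that no earlier message can arrive, with null messages carrying that guarantee to avoid deadlock. Below is a compressed sketch of the safety rule, not the paper's shared-memory implementation.

      from collections import deque

      class Channel:
          def __init__(self):
              self.queue = deque()   # timestamped messages, FIFO per channel
              self.clock = 0.0       # bound: no future message arrives earlier

          def send(self, t, payload=None):
              self.queue.append((t, payload))
              self.clock = t         # includes null messages (payload=None)

      class LogicalProcess:
          def __init__(self, inputs, lookahead):
              self.inputs = inputs
              self.lookahead = lookahead

          def safe_time(self):
              # Earliest bound over all input channels: events with a smaller
              # timestamp can be processed without risking causality violation.
              return min(ch.queue[0][0] if ch.queue else ch.clock
                         for ch in self.inputs)

      a, b = Channel(), Channel()
      lp = LogicalProcess([a, b], lookahead=1.0)
      a.send(5.0, "job")
      b.send(3.0)            # null message: "nothing arrives before t=3"
      print(lp.safe_time())  # 3.0 -> must wait for more messages or nulls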

  20. Modeling extreme (Carrington-type) space weather events using three-dimensional MHD code simulations

    Science.gov (United States)

    Ngwira, C. M.; Pulkkinen, A. A.; Kuznetsova, M. M.; Glocer, A.

    2013-12-01

    There is growing concern over possible severe societal consequences related to adverse space weather impacts on man-made technological infrastructure and systems. In the last two decades, significant progress has been made towards the modeling of space weather events. Three-dimensional (3-D) global magnetohydrodynamics (MHD) models have been at the forefront of this transition, and have played a critical role in advancing our understanding of space weather. However, the modeling of extreme space weather events is still a major challenge even for existing global MHD models. In this study, we introduce a specially adapted University of Michigan 3-D global MHD model for simulating extreme space weather events that have a ground footprint comparable to, or larger than, that of the Carrington superstorm. Results are presented for an initial simulation run with ``very extreme'' constructed/idealized solar wind boundary conditions driving the magnetosphere. In particular, we describe the reaction of the magnetosphere-ionosphere system and the associated ground-induced geoelectric field to such extreme driving conditions. We also discuss the results and what they might mean for the accuracy of the simulations. The model is further tested using input data for an observed space weather event to verify the MHD model consistency and to draw guidance for future work. This extreme space weather MHD model is designed specifically for practical application to the modeling of extreme geomagnetically induced electric fields, which can drive large currents in earth conductors such as power transmission grids.

  1. Simulating large-scale pedestrian movement using CA and event driven model: Methodology and case study

    Science.gov (United States)

    Li, Jun; Fu, Siyao; He, Haibo; Jia, Hongfei; Li, Yanzhong; Guo, Yi

    2015-11-01

    Large-scale regional evacuation is an important part of national security emergency response planning, and the emergency evacuation of large commercial shopping areas, as typical service systems, is an active research topic. A systematic methodology based on Cellular Automata with a Dynamic Floor Field combined with an event-driven model is proposed, and the methodology is examined within the context of a case study involving evacuation from a commercial shopping mall. Pedestrian walking is governed by the Cellular Automata, while the event-driven model is adopted to simulate pedestrian movement patterns; the simulation process is divided into a normal situation and an emergency evacuation. The model is composed of four layers: an environment layer, a customer layer, a clerk layer and a trajectory layer. For simulating the movement routes of pedestrians, the model takes into account the purchase intentions of customers and the density of pedestrians. Based on this combined evacuation model, we can reflect the behavioral characteristics of customers and clerks in both normal and emergency-evacuation situations. The distribution of individual evacuation times as a function of initial position, and the dynamics of the evacuation process, are studied. Our results indicate that the evacuation model combining Cellular Automata with a Dynamic Floor Field and event-driven scheduling can be used to simulate the evacuation of pedestrian flows in indoor areas with complicated surroundings and to investigate the layout of a shopping mall.
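
    A minimal sketch of the cellular automaton component, with a static floor field biasing pedestrians toward a single exit; the grid size, conflict resolution, and update rule are illustrative simplifications of the model described above.

      import random

      random.seed(3)
      W, H = 8, 6
      EXIT = (0, 3)

      # Static floor field: each cell holds its distance to the exit; pedestrians
      # step to the admissible neighboring cell with the lowest field value.
      field = {(x, y): abs(x - EXIT[0]) + abs(y - EXIT[1])
               for x in range(W) for y in range(H)}

      cells = [c for c in field if c != EXIT]
      peds = random.sample(cells, 6)
      step = 0
      while peds and step < 100:
          moved = []
          # pedestrians nearest the exit move first (simple conflict resolution)
          for (x, y) in sorted(peds, key=lambda c: field[c]):
              options = [(x + dx, y + dy)
                         for dx, dy in [(0, 0), (1, 0), (-1, 0), (0, 1), (0, -1)]
                         if (x + dx, y + dy) in field
                         and (x + dx, y + dy) not in moved]
              # blocked pedestrians simply wait in place
              target = min(options, key=lambda c: field[c]) if options else (x, y)
              if target != EXIT:        # reaching the exit removes the pedestrian
                  moved.append(target)
          peds = moved
          step += 1
      print(f"evacuated in {step} steps" if not peds else f"{len(peds)} still inside")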

  2. Nuclear facility safeguards systems modeling using discrete event simulation

    International Nuclear Information System (INIS)

    Engi, D.

    1977-01-01

    The threat of theft or dispersal of special nuclear material at a nuclear facility is treated by studying the temporal relationships between adversaries having authorized access to the facility (insiders) and safeguards system events by using a GASP IV discrete event simulation. The safeguards system events--detection, assessment, delay, communications, and neutralization--are modeled for the general insider adversary strategy which includes degradation of the safeguards system elements followed by an attempt to steal or disperse special nuclear material. The performance measure used in the analysis is the estimated probability of safeguards system success in countering the adversary based upon a predetermined set of adversary actions. An exemplary problem which includes generated results is presented for a hypothetical nuclear facility. The results illustrate representative information that could be utilized by safeguards decision-makers

  3. Disaster Response Modeling Through Discrete-Event Simulation

    Science.gov (United States)

    Wang, Jeffrey; Gilmer, Graham

    2012-01-01

    Organizations today are required to plan against a rapidly changing, high-cost environment. This is especially true for first responders to disasters and other incidents, where critical decisions must be made in a timely manner to save lives and resources. Discrete-event simulations enable organizations to make better decisions by visualizing complex processes and the impact of proposed changes before they are implemented. A discrete-event simulation using Simio software has been developed to effectively analyze and quantify the imagery capabilities of domestic aviation resources conducting relief missions. This approach has helped synthesize large amounts of data to better visualize process flows, manage resources, and pinpoint capability gaps and shortfalls in disaster response scenarios. Simulation outputs and results have supported decision makers in the understanding of high risk locations, key resource placement, and the effectiveness of proposed improvements.

  4. Discrete event model-based simulation for train movement on a single-line railway

    International Nuclear Information System (INIS)

    Xu Xiao-Ming; Li Ke-Ping; Yang Li-Xing

    2014-01-01

    The aim of this paper is to present a discrete event model-based approach to simulating train movement with energy-saving factors taken into account. We conduct extensive case studies to show the dynamic characteristics of the traffic flow and demonstrate the effectiveness of the proposed approach. The simulation results indicate that the proposed discrete event model-based simulation approach is suitable for characterizing the movements of a group of trains on a single railway line with fewer iterations and less CPU time. Additionally, some other qualitative and quantitative characteristics are investigated. In particular, because of the cumulative influence of the preceding trains, the following trains must be accelerated or braked frequently to control the headway distance, leading to more energy consumption. (general)

  5. A novel approach for modelling complex maintenance systems using discrete event simulation

    International Nuclear Information System (INIS)

    Alrabghi, Abdullah; Tiwari, Ashutosh

    2016-01-01

    Existing approaches for modelling maintenance rely on oversimplified assumptions which prevent them from reflecting the complexity found in industrial systems. In this paper, we propose a novel approach that enables the modelling of non-identical multi-unit systems without restrictive assumptions on the number of units or their maintenance characteristics. Modelling complex interactions between maintenance strategies and their effects on assets in the system is achieved by accessing event queues in Discrete Event Simulation (DES). The approach builds on the wide success DES has achieved in manufacturing by allowing integration with models that are closely related to maintenance, such as production and spare parts systems. Additional advantages of using DES include rapid modelling and visual interactive simulation. The proposed approach is demonstrated in a simulation-based optimisation study of a published case. The current research is one of the first to optimise maintenance strategies simultaneously with their parameters while considering production dynamics and spare parts management. The findings of this research provide insights for non-conflicting objectives in maintenance systems. In addition, the proposed approach can be used to facilitate the simulation and optimisation of industrial maintenance systems. - Highlights: • This research is one of the first to optimise maintenance strategies simultaneously with their parameters. • New insights for non-conflicting objectives in maintenance systems. • The approach can be used to optimise industrial maintenance systems.

  6. Parallel discrete event simulation: A shared memory approach

    Science.gov (United States)

    Reed, Daniel A.; Malony, Allen D.; Mccredie, Bradley D.

    1987-01-01

    With traditional event list techniques, evaluating a detailed discrete event simulation model can often require hours or even days of computation time. Parallel simulation mimics the interacting servers and queues of a real system by assigning each simulated entity to a processor. By eliminating the event list and maintaining only sufficient synchronization to ensure causality, parallel simulation can potentially provide speedups that are linear in the number of processors. A set of shared memory experiments is presented using the Chandy-Misra distributed simulation algorithm to simulate networks of queues. Parameters include queueing network topology and routing probabilities, number of processors, and assignment of network nodes to processors. These experiments show that Chandy-Misra distributed simulation is a questionable alternative to sequential simulation of most queueing network models.

  7. Statistical and Probabilistic Extensions to Ground Operations' Discrete Event Simulation Modeling

    Science.gov (United States)

    Trocine, Linda; Cummings, Nicholas H.; Bazzana, Ashley M.; Rychlik, Nathan; LeCroy, Kenneth L.; Cates, Grant R.

    2010-01-01

    NASA's human exploration initiatives will invest in technologies, public/private partnerships, and infrastructure, paving the way for the expansion of human civilization into the solar system and beyond. As it has been for the past half century, the Kennedy Space Center will be the embarkation point for humankind's journey into the cosmos. Functioning as a next-generation space launch complex, Kennedy's launch pads, integration facilities, processing areas, and launch and recovery ranges will bustle with the activities of the world's space transportation providers. In developing this complex, KSC teams work through the potential operational scenarios: conducting trade studies, planning and budgeting for expensive and limited resources, and simulating alternative operational schemes. Numerous tools, among them discrete event simulation (DES), were matured during the Constellation Program to conduct such analyses with the purpose of optimizing the launch complex for maximum efficiency, safety, and flexibility while minimizing life cycle costs. Discrete event simulation is a computer-based modeling technique for complex and dynamic systems where the state of the system changes at discrete points in time and whose inputs may include random variables. DES is used to assess timelines and throughput, and to support operability studies and contingency analyses. It is applicable to any space launch campaign and informs decision-makers of the effects of varying numbers of expensive resources and of the impact of off-nominal scenarios on measures of performance. In order to develop representative DES models, methods were adopted, exploited, or created to extend traditional uses of DES. The Delphi method was adopted and utilized for task duration estimation. DES software was exploited for probabilistic event variation. A roll-up process was developed and used to reuse models and model elements in other, less detailed models. The DES team continues to innovate and expand

  8. Using Discrete Event Simulation for Programming Model Exploration at Extreme-Scale: Macroscale Components for the Structural Simulation Toolkit (SST).

    Energy Technology Data Exchange (ETDEWEB)

    Wilke, Jeremiah J [Sandia National Laboratories (SNL-CA), Livermore, CA (United States); Kenny, Joseph P. [Sandia National Laboratories (SNL-CA), Livermore, CA (United States)

    2015-02-01

    Discrete event simulation provides a powerful mechanism for designing and testing new extreme-scale programming models for high-performance computing. Rather than debugging, running, and waiting for results on an actual system, a design can first be iterated through a simulator. This is particularly useful when test beds cannot be used, i.e., to explore hardware or scales that do not yet exist or are inaccessible. Here we detail the macroscale components of the structural simulation toolkit (SST). Instead of depending on trace replay or state machines, the simulator is architected to execute real code on real software stacks. Our particular user-space threading framework allows massive scales to be simulated even on small clusters. The link between the discrete event core and the threading framework allows interesting performance metrics like call graphs to be collected from a simulated run. Performance analysis via simulation can thus become an important phase in extreme-scale programming model and runtime system design via the SST macroscale components.
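
    The idea of running application logic on a virtual clock can be miniaturized with coroutines: each simulated rank below is a generator that yields the virtual duration of its next compute or communication phase, and a scheduler advances simulated time. This is a toy illustration of the concept, not SST's actual threading framework or API.

      import heapq

      def rank(rank_id, n_iters):
          """A fake 'application' whose yields are virtual durations (seconds)."""
          for _ in range(n_iters):
              yield 0.002 * (rank_id + 1)   # unbalanced compute phase
              yield 0.001                   # communication phase (assumed fixed)

      def simulate(ranks):
          # event queue of (virtual_time, rank_id, coroutine)
          queue = [(0.0, i, r) for i, r in enumerate(ranks)]
          heapq.heapify(queue)
          finish = {}
          while queue:
              t, i, co = heapq.heappop(queue)
              try:
                  dt = next(co)                  # run to the next yield point
                  heapq.heappush(queue, (t + dt, i, co))
              except StopIteration:
                  finish[i] = t
          return finish

      times = simulate([rank(i, 100) for i in range(4)])
      for i, t in sorted(times.items()):
          print(f"rank {i} finished at virtual t = {t * 1000:.1f} ms")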

  9. A participative and facilitative conceptual modelling framework for discrete event simulation studies in healthcare

    OpenAIRE

    Kotiadis, Kathy; Tako, Antuela; Vasilakis, Christos

    2014-01-01

    Existing approaches to conceptual modelling (CM) in discrete-event simulation do not formally support the participation of a group of stakeholders. Simulation in healthcare can benefit from stakeholder participation as it makes possible to share multiple views and tacit knowledge from different parts of the system. We put forward a framework tailored to healthcare that supports the interaction of simulation modellers with a group of stakeholders to arrive at a common conceptual model. The fra...

  10. Numerical simulations of an advection fog event over Shanghai Pudong International Airport with the WRF model

    Science.gov (United States)

    Lin, Caiyan; Zhang, Zhongfeng; Pu, Zhaoxia; Wang, Fengyun

    2017-10-01

    A series of numerical simulations is conducted to understand the formation, evolution, and dissipation of an advection fog event over Shanghai Pudong International Airport (ZSPD) with the Weather Research and Forecasting (WRF) model. Using the current operational settings at the Meteorological Center of East China Air Traffic Management Bureau, the WRF model successfully predicts the fog event at ZSPD. Additional numerical experiments are performed to examine the physical processes associated with the fog event. The results indicate that prediction of this particular fog event is sensitive to microphysical schemes for the time of fog dissipation but not for the time of fog onset. The simulated timing of the arrival and dissipation of the fog, as well as the cloud distribution, is substantially sensitive to the planetary boundary layer and radiation (both longwave and shortwave) processes. Moreover, varying forecast lead times also produces different simulation results for the fog event regarding its onset and duration, suggesting a trade-off between more accurate initial conditions and a proper forecast lead time that allows model physical processes to spin up adequately during the fog simulation. The overall outcomes from this study imply that the complexity of physical processes and their interactions within the WRF model during fog evolution and dissipation is a key area of future research.

  11. A Madden-Julian oscillation event realistically simulated by a global cloud-resolving model.

    Science.gov (United States)

    Miura, Hiroaki; Satoh, Masaki; Nasuno, Tomoe; Noda, Akira T.; Oouchi, Kazuyoshi

    2007-12-14

    A Madden-Julian Oscillation (MJO) is a massive weather event consisting of deep convection coupled with atmospheric circulation, moving slowly eastward over the Indian and Pacific Oceans. Despite its enormous influence on many weather and climate systems worldwide, it has proven very difficult to simulate an MJO because of assumptions about cumulus clouds in global meteorological models. Using a model that allows direct coupling of the atmospheric circulation and clouds, we successfully simulated the slow eastward migration of an MJO event. Topography, the zonal sea surface temperature gradient, and interplay between eastward- and westward-propagating signals controlled the timing of the eastward transition of the convective center. Our results demonstrate the potential for making month-long MJO predictions when global cloud-resolving models with realistic initial conditions are used.

  12. Discrete-Event Simulation with Agents for Modeling of Dynamic Asymmetric Threats in Maritime Security

    National Research Council Canada - National Science Library

    Ng, Chee W

    2007-01-01

    Discrete-event simulation (DES) was used to simulate a typical port-security, local, waterside-threat response model and to test the adaptive response of asymmetric threats in reaction to port-security procedures, while a multi-agent system (MAS...

  13. Manual for the Jet Event and Background Simulation Library

    Energy Technology Data Exchange (ETDEWEB)

    Heinz, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Soltz, R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Angerami, A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-09-11

    Jets are the collimated streams of particles resulting from hard scattering in the initial state of high-energy collisions. In heavy-ion collisions, jets interact with the quark-gluon plasma (QGP) before freezeout, providing a probe into the internal structure and properties of the QGP. In order to study jets, background must be subtracted from the measured event, potentially introducing a bias. We aim to understand and quantify this subtraction bias. PYTHIA, a library to simulate pure jet events, is used to simulate a model for a signature with one pure jet (a photon) and one quenched jet, where all quenched particle momenta are reduced by a user-defined constant fraction. Background for the event is simulated using multiplicity values generated by the TRENTO initial state model of heavy-ion collisions fed into a thermal model consisting of a 3-dimensional Boltzmann distribution for particle types and momenta. Data from the simulated events are used to train a statistical model, which computes a posterior distribution of the quench factor for a data set. The model was tested first on pure jet events and then on full events including the background. This model will allow for a quantitative determination of biases induced by various methods of background subtraction.

  14. Manufacturing plant performance evaluation by discrete event simulation

    International Nuclear Information System (INIS)

    Rosli Darmawan; Mohd Rasid Osman; Rosnah Mohd Yusuff; Napsiah Ismail; Zulkiflie Leman

    2002-01-01

    A case study was conducted to evaluate the performance of a manufacturing plant using a discrete event simulation technique. The study was carried out on an animal feed production plant, the Sterifeed plant at the Malaysian Institute for Nuclear Technology Research (MINT), Selangor, Malaysia. The plant was modelled based on the actual manufacturing activities recorded by the operators. The simulation was carried out using discrete event simulation software. The model was validated by comparing the simulation results with the actual operational data of the plant. The simulation results show some weaknesses in the current plant design, and proposals were made to improve the plant performance. (Author)

  15. Markov modeling and discrete event simulation in health care: a systematic comparison.

    Science.gov (United States)

    Standfield, Lachlan; Comans, Tracy; Scuffham, Paul

    2014-04-01

    The aim of this study was to assess if the use of Markov modeling (MM) or discrete event simulation (DES) for cost-effectiveness analysis (CEA) may alter healthcare resource allocation decisions. A systematic literature search and review of empirical and non-empirical studies comparing MM and DES techniques used in the CEA of healthcare technologies was conducted. Twenty-two pertinent publications were identified. Two publications compared MM and DES models empirically, one presented a conceptual DES and MM, two described a DES consensus guideline, and seventeen drew comparisons between MM and DES through the authors' experience. The primary advantages described for DES over MM were the ability to model queuing for limited resources, capture individual patient histories, accommodate complexity and uncertainty, represent time flexibly, model competing risks, and accommodate multiple events simultaneously. The disadvantages of DES over MM were the potential for model overspecification, increased data requirements, specialized expensive software, and increased model development, validation, and computational time. Where individual patient history is an important driver of future events an individual patient simulation technique like DES may be preferred over MM. Where supply shortages, subsequent queuing, and diversion of patients through other pathways in the healthcare system are likely to be drivers of cost-effectiveness, DES modeling methods may provide decision makers with more accurate information on which to base resource allocation decisions. Where these are not major features of the cost-effectiveness question, MM remains an efficient, easily validated, parsimonious, and accurate method of determining the cost-effectiveness of new healthcare interventions.

  16. Developing Flexible Discrete Event Simulation Models in an Uncertain Policy Environment

    Science.gov (United States)

    Miranda, David J.; Fayez, Sam; Steele, Martin J.

    2011-01-01

    On February 1st, 2010 U.S. President Barack Obama submitted to Congress his proposed budget request for Fiscal Year 2011. This budget included significant changes to the National Aeronautics and Space Administration (NASA), including the proposed cancellation of the Constellation Program. This change proved to be controversial and Congressional approval of the program's official cancellation would take many months to complete. During this same period an end-to-end discrete event simulation (DES) model of Constellation operations was being built through the joint efforts of Productivity Apex Inc. (PAl) and Science Applications International Corporation (SAIC) teams under the guidance of NASA. The uncertainty in regards to the Constellation program presented a major challenge to the DES team, as to: continue the development of this program-of-record simulation, while at the same time remain prepared for possible changes to the program. This required the team to rethink how it would develop it's model and make it flexible enough to support possible future vehicles while at the same time be specific enough to support the program-of-record. This challenge was compounded by the fact that this model was being developed through the traditional DES process-orientation which lacked the flexibility of object-oriented approaches. The team met this challenge through significant pre-planning that led to the "modularization" of the model's structure by identifying what was generic, finding natural logic break points, and the standardization of interlogic numbering system. The outcome of this work resulted in a model that not only was ready to be easily modified to support any future rocket programs, but also a model that was extremely structured and organized in a way that facilitated rapid verification. This paper discusses in detail the process the team followed to build this model and the many advantages this method provides builders of traditional process-oriented discrete

  17. Modeling extreme "Carrington-type" space weather events using three-dimensional global MHD simulations

    Science.gov (United States)

    Ngwira, Chigomezyo M.; Pulkkinen, Antti; Kuznetsova, Maria M.; Glocer, Alex

    2014-06-01

    There is a growing concern over possible severe societal consequences related to adverse space weather impacts on man-made technological infrastructure. In the last two decades, significant progress has been made toward the first-principles modeling of space weather events, and three-dimensional (3-D) global magnetohydrodynamics (MHD) models have been at the forefront of this transition, thereby playing a critical role in advancing our understanding of space weather. However, the modeling of extreme space weather events is still a major challenge even for the modern global MHD models. In this study, we introduce a specially adapted University of Michigan 3-D global MHD model for simulating extreme space weather events with a Dst footprint comparable to the Carrington superstorm of September 1859 based on the estimate by Tsurutani et. al. (2003). Results are presented for a simulation run with "very extreme" constructed/idealized solar wind boundary conditions driving the magnetosphere. In particular, we describe the reaction of the magnetosphere-ionosphere system and the associated induced geoelectric field on the ground to such extreme driving conditions. The model setup is further tested using input data for an observed space weather event of Halloween storm October 2003 to verify the MHD model consistence and to draw additional guidance for future work. This extreme space weather MHD model setup is designed specifically for practical application to the modeling of extreme geomagnetically induced electric fields, which can drive large currents in ground-based conductor systems such as power transmission grids. Therefore, our ultimate goal is to explore the level of geoelectric fields that can be induced from an assumed storm of the reported magnitude, i.e., Dst˜=-1600 nT.

  18. Event-by-event simulation of quantum phenomena

    NARCIS (Netherlands)

    De Raedt, H.; Zhao, S.; Yuan, S.; Jin, F.; Michielsen, K.; Miyashita, S.

    We discuss recent progress in the development of simulation algorithms that do not rely on any concept of quantum theory but are nevertheless capable of reproducing the averages computed from quantum theory through an event-by-event simulation. The simulation approach is illustrated by applications

  19. Modeling energy market dynamics using discrete event system simulation

    International Nuclear Information System (INIS)

    Gutierrez-Alcaraz, G.; Sheble, G.B.

    2009-01-01

    This paper proposes the use of Discrete Event System Simulation to study the interactions among fuel and electricity markets and consumers, and the decision-making processes of fuel companies (FUELCOs), generation companies (GENCOs), and consumers in a simple artificial energy market. In reality, since markets can reach a stable equilibrium or fail, it is important to observe how they behave in a dynamic framework. We consider a Nash-Cournot model in which marketers are depicted as Nash-Cournot players that determine supply to meet end-use consumption. Detailed engineering considerations such as transportation network flows are omitted, because the focus is upon the selection and use of appropriate market models to provide answers to policy questions. (author)

  20. Modeling Anti-Air Warfare With Discrete Event Simulation and Analyzing Naval Convoy Operations

    Science.gov (United States)

    Opcin, Ali E.

    Master's thesis, June 2016; thesis advisor: Arnold H. Buss. Modeling anti-air warfare with discrete event simulation and analyzing naval convoy operations.

  1. Discretely Integrated Condition Event (DICE) Simulation for Pharmacoeconomics.

    Science.gov (United States)

    Caro, J Jaime

    2016-07-01

    Several decision-analytic modeling techniques are in use for pharmacoeconomic analyses. Discretely integrated condition event (DICE) simulation is proposed as a unifying approach that has been deliberately designed to meet the modeling requirements in a straightforward transparent way, without forcing assumptions (e.g., only one transition per cycle) or unnecessary complexity. At the core of DICE are conditions that represent aspects that persist over time. They have levels that can change and many may coexist. Events reflect instantaneous occurrences that may modify some conditions or the timing of other events. The conditions are discretely integrated with events by updating their levels at those times. Profiles of determinant values allow for differences among patients in the predictors of the disease course. Any number of valuations (e.g., utility, cost, willingness-to-pay) of conditions and events can be applied concurrently in a single run. A DICE model is conveniently specified in a series of tables that follow a consistent format and the simulation can be implemented fully in MS Excel, facilitating review and validation. DICE incorporates both state-transition (Markov) models and non-resource-constrained discrete event simulation in a single formulation; it can be executed as a cohort or a microsimulation; and deterministically or stochastically.
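
    A DICE specification is table-driven; the fragment below renders the core loop in Python terms: conditions persist with levels, events fire at scheduled times and update those levels, and a valuation is integrated between events. The table contents and utility weights are invented for illustration.

      # conditions: name -> current level (persist between events)
      conditions = {"disease_severity": 1.0, "on_treatment": 0.0}

      # event table: (time, updates) where updates maps condition -> new level
      events = sorted([
          (0.5, {"on_treatment": 1.0}),                           # start therapy
          (2.0, {"disease_severity": 0.4}),                       # response
          (5.0, {"disease_severity": 0.9, "on_treatment": 0.0}),  # relapse
      ])

      # valuation: utility per year as a function of current condition levels
      def utility(c):
          return 0.9 - 0.3 * c["disease_severity"] + 0.02 * c["on_treatment"]

      qaly, prev_t = 0.0, 0.0
      for t, updates in events + [(10.0, {})]:          # 10-year horizon (assumed)
          qaly += utility(conditions) * (t - prev_t)    # integrate between events
          conditions.update(updates)                    # discrete integration step
          prev_t = t
      print(f"total QALYs over 10 years: {qaly:.2f}")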

  2. An Advanced Simulation Framework for Parallel Discrete-Event Simulation

    Science.gov (United States)

    Li, P. P.; Tyrrell, R. Yeung D.; Adhami, N.; Li, T.; Henry, H.

    1994-01-01

    Discrete-event simulation (DEVS) users have long been faced with a three-way trade-off of balancing execution time, model fidelity, and the number of objects simulated. Because of the limits of computer processing power, the analyst is often forced to settle for less than the desired performance in one or more of these areas.

  3. Rare event simulation using Monte Carlo methods

    CERN Document Server

    Rubino, Gerardo

    2009-01-01

    In a probabilistic model, a rare event is an event with a very small probability of occurrence. The forecasting of rare events is a formidable task but is important in many areas, for instance a catastrophic failure in a transport system or in a nuclear power plant, or the failure of an information processing system in a bank or in the communication network of a group of banks, leading to financial losses. Being able to evaluate the probability of rare events is therefore a critical issue. Monte Carlo methods, that is, the simulation of the corresponding models, are used to analyze rare events. This book sets out to present the mathematical tools available for the efficient simulation of rare events. Importance sampling and splitting are presented along with an exposition of how to apply these tools to a variety of fields, ranging from performance and dependability evaluation of complex systems, typically in computer science or in telecommunications, to chemical reaction analysis in biology or particle transport in physics. ...
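
    As a flavor of the importance-sampling technique the book presents, the sketch below estimates the small tail probability P(X > 4) for a standard normal X by sampling from a shifted distribution and reweighting with the likelihood ratio; the threshold and sample sizes are arbitrary.

      import math, random

      random.seed(0)
      THRESH, N = 4.0, 100_000

      def phi(x, mu=0.0):
          # density of a normal with mean mu and unit variance
          return math.exp(-(x - mu) ** 2 / 2) / math.sqrt(2 * math.pi)

      # Naive Monte Carlo: almost no samples exceed 4, so the estimate is useless.
      naive = sum(random.gauss(0, 1) > THRESH for _ in range(N)) / N

      # Importance sampling: draw from N(4, 1) and reweight each hit by the
      # likelihood ratio phi(x) / phi(x - 4), concentrating samples on the event.
      acc = 0.0
      for _ in range(N):
          x = random.gauss(THRESH, 1)
          if x > THRESH:
              acc += phi(x) / phi(x, mu=THRESH)
      est = acc / N

      exact = 0.5 * math.erfc(THRESH / math.sqrt(2))
      print(f"naive={naive:.2e}  IS={est:.2e}  exact={exact:.2e}")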

  4. On constructing optimistic simulation algorithms for the discrete event system specification

    International Nuclear Information System (INIS)

    Nutaro, James J.

    2008-01-01

    This article describes a Time Warp simulation algorithm for discrete event models that are described in terms of the Discrete Event System Specification (DEVS). The article shows how the total state transition and total output function of a DEVS atomic model can be transformed into an event processing procedure for a logical process. A specific Time Warp algorithm is constructed around this logical process, and it is shown that the algorithm correctly simulates a DEVS coupled model that consists entirely of interacting atomic models. The simulation algorithm is presented abstractly; it is intended to provide a basis for implementing efficient and scalable parallel algorithms that correctly simulate DEVS models
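
    The DEVS formalism underlying the paper defines an atomic model by internal and external transition functions, an output function, and a time-advance function; the following is a minimal, purely sequential rendering of that interface, not the Time Warp machinery the article constructs.

      class AtomicDEVS:
          """Minimal DEVS atomic model: a buffer that emits one job per unit time."""
          def __init__(self):
              self.jobs = 0                      # state

          def time_advance(self):
              return 1.0 if self.jobs else float("inf")

          def output(self):                      # called just before delta_int
              return "job_done"

          def delta_int(self):                   # internal transition
              self.jobs -= 1

          def delta_ext(self, elapsed, msg):     # external transition on input
              self.jobs += 1

      # a trivial sequential simulator driving the model with two arrivals
      m, t = AtomicDEVS(), 0.0
      arrivals = [0.0, 0.2]
      while arrivals or m.time_advance() != float("inf"):
          t_int = t + m.time_advance()
          if arrivals and arrivals[0] <= t_int:
              t_next = arrivals.pop(0)
              m.delta_ext(t_next - t, "job")
          else:
              t_next = t_int
              print(f"t={t_next:.1f}: {m.output()}")
              m.delta_int()
          t = t_next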

  5. Event-based computer simulation model of aspect-type experiments strictly satisfying Einstein's locality conditions

    NARCIS (Netherlands)

    De Raedt, Hans; De Raedt, Koen; Michielsen, Kristel; Keimpema, Koenraad; Miyashita, Seiji

    2007-01-01

    Inspired by Einstein-Podolsky-Rosen-Bohm experiments with photons, we construct an event-based simulation model in which every essential element in the ideal experiment has a counterpart. The model satisfies Einstein's criterion of local causality and does not rely on concepts of quantum and

  6. Synchronization of autonomous objects in discrete event simulation

    Science.gov (United States)

    Rogers, Ralph V.

    1990-01-01

    Autonomous objects in event-driven discrete event simulation offer the potential to combine the freedom of unrestricted movement and positional accuracy through Euclidean space of time-driven models with the computational efficiency of event-driven simulation. The principal challenge to autonomous object implementation is object synchronization. The concept of a spatial blackboard is offered as a potential methodology for synchronization. The issues facing implementation of a spatial blackboard are outlined and discussed.

  7. Event-by-event simulation of quantum phenomena

    NARCIS (Netherlands)

    De Raedt, Hans; Michielsen, Kristel

    A discrete-event simulation approach is reviewed that does not require the knowledge of the solution of the wave equation of the whole system, yet reproduces the statistical distributions of wave theory by generating detection events one-by-one. The simulation approach is illustrated by applications

  8. A conceptual modeling framework for discrete event simulation using hierarchical control structures.

    Science.gov (United States)

    Furian, N; O'Sullivan, M; Walker, C; Vössner, S; Neubacher, D

    2015-08-01

    Conceptual Modeling (CM) is a fundamental step in a simulation project. Nevertheless, it is only recently that structured approaches towards the definition and formulation of conceptual models have gained importance in the Discrete Event Simulation (DES) community. As a consequence, frameworks and guidelines for applying CM to DES have emerged and discussion of CM for DES is increasing. However, both the organization of model components and the identification of behavior and system control in standard CM approaches have shortcomings that limit CM's applicability to DES. Therefore, we discuss the different aspects of previous CM frameworks and identify their limitations. Further, we present the Hierarchical Control Conceptual Modeling framework, which pays more attention to the identification of a model's system behavior, control policies and dispatching routines, and their structured representation within a conceptual model. The framework guides the user step by step through the modeling process and is illustrated by a worked example.

  9. Modeling Temporal Processes in Early Spacecraft Design: Application of Discrete-Event Simulations for Darpa's F6 Program

    Science.gov (United States)

    Dubos, Gregory F.; Cornford, Steven

    2012-01-01

    While the ability to model the state of a space system over time is essential during spacecraft operations, the use of time-based simulations remains rare in preliminary design. The absence of the time dimension in most traditional early design tools can, however, become a hurdle when designing complex systems whose development and operations can be disrupted by various events, such as delays or failures. As the value delivered by a space system is highly affected by such events, exploring the trade space for designs that yield the maximum value calls for the explicit modeling of time. This paper discusses the use of discrete-event models to simulate spacecraft development schedules as well as operational scenarios and on-orbit resources in the presence of uncertainty. It illustrates how such simulations can be utilized to support trade studies, through the example of a tool developed for DARPA's F6 program to assist the design of "fractionated spacecraft".

  10. Simulation of Flash-Flood-Producing Storm Events in Saudi Arabia Using the Weather Research and Forecasting Model

    KAUST Repository

    Deng, Liping

    2015-05-01

    The challenges of monitoring and forecasting flash-flood-producing storm events in data-sparse and arid regions are explored using the Weather Research and Forecasting (WRF) Model (version 3.5) in conjunction with a range of available satellite, in situ, and reanalysis data. Here, we focus on characterizing the initial synoptic features and examining the impact of model parameterization and resolution on the reproduction of a number of flood-producing rainfall events that occurred over the western Saudi Arabian city of Jeddah. Analysis from the European Centre for Medium-Range Weather Forecasts (ECMWF) interim reanalysis (ERA-Interim) data suggests that mesoscale convective systems associated with strong moisture convergence ahead of a trough were the major initial features for the occurrence of these intense rain events. The WRF Model was able to simulate the heavy rainfall, with driving convective processes well characterized by a high-resolution cloud-resolving model. The use of higher (1 km vs. 5 km) resolution along the Jeddah coastline favors the simulation of local convective systems and adds value to the simulation of heavy rainfall, especially for deep-convection-related extreme values. At the 5-km resolution, corresponding to an intermediate study domain, simulation without a cumulus scheme led to the formation of deeper convective systems and enhanced rainfall around Jeddah, illustrating the need for careful model scheme selection at this transition resolution. In analysis of multiple nested WRF simulations (25, 5, and 1 km), the localized volume and intensity of heavy rainfall together with the duration of rainstorms within the Jeddah catchment area were captured reasonably well, although there was evidence of some displacements of rainstorm events.

  11. CONFIG - Adapting qualitative modeling and discrete event simulation for design of fault management systems

    Science.gov (United States)

    Malin, Jane T.; Basham, Bryan D.

    1989-01-01

    CONFIG is a modeling and simulation tool prototype for analyzing the normal and faulty qualitative behaviors of engineered systems. Qualitative modeling and discrete-event simulation have been adapted and integrated, to support early development, during system design, of software and procedures for management of failures, especially in diagnostic expert systems. Qualitative component models are defined in terms of normal and faulty modes and processes, which are defined by invocation statements and effect statements with time delays. System models are constructed graphically by using instances of components and relations from object-oriented hierarchical model libraries. Extension and reuse of CONFIG models and analysis capabilities in hybrid rule- and model-based expert fault-management support systems are discussed.

  12. Developing a discrete event simulation model for university student shuttle buses

    Science.gov (United States)

    Zulkepli, Jafri; Khalid, Ruzelan; Nawawi, Mohd Kamal Mohd; Hamid, Muhammad Hafizan

    2017-11-01

    Providing shuttle buses for university students to attend their classes is crucial, especially when the number of students is large and the distances between their classes and residential halls are far. These factors, compounded by the currently non-optimal bus services, typically require the students to wait longer, which eventually gives rise to complaints. To reduce the waiting time considerably, it is thus important to provide the optimal number of buses to transport students from location to location, together with effective route schedules that fulfil the students' demand at the relevant time ranges. The optimal bus number and schedules are to be determined and tested using a flexible decision platform. This paper thus models the current student shuttle bus services in a university using a Discrete Event Simulation approach. The model can flexibly simulate whatever changes are configured to the current system and report their effects on the performance measures. How the model was conceptualized and formulated for future system configurations is the main interest of this paper.

  13. A conceptual modeling framework for discrete event simulation using hierarchical control structures

    Science.gov (United States)

    Furian, N.; O’Sullivan, M.; Walker, C.; Vössner, S.; Neubacher, D.

    2015-01-01

    Conceptual Modeling (CM) is a fundamental step in a simulation project. Nevertheless, it is only recently that structured approaches towards the definition and formulation of conceptual models have gained importance in the Discrete Event Simulation (DES) community. As a consequence, frameworks and guidelines for applying CM to DES have emerged and discussion of CM for DES is increasing. However, both the organization of model-components and the identification of behavior and system control from standard CM approaches have shortcomings that limit CM’s applicability to DES. Therefore, we discuss the different aspects of previous CM frameworks and identify their limitations. Further, we present the Hierarchical Control Conceptual Modeling framework that pays more attention to the identification of a model’s system behavior, control policies and dispatching routines and their structured representation within a conceptual model. The framework guides the user step-by-step through the modeling process and is illustrated by a worked example. PMID:26778940

  14. DECISION WITH ARTIFICIAL NEURAL NETWORKS IN DISCRETE EVENT SIMULATION MODELS ON A TRAFFIC SYSTEM

    Directory of Open Access Journals (Sweden)

    Marília Gonçalves Dutra da Silva

    2016-04-01

    This work demonstrates a mechanism, to be applied in the development of discrete-event simulation models, that performs decision operations through the implementation of an artificial neural network. Actions that involve complex operations performed by a human agent in a process, for example, are often modeled in simplified form with the usual mechanisms of simulation software. A traffic system controlled by a traffic officer, with a flow of vehicles and pedestrians, was therefore chosen to demonstrate the proposed solution. Through a module built in the simulation software itself, it was possible to connect the intelligent decision algorithm to the simulation model. The results showed that the model responded as expected when subjected to situations requiring different decisions to keep the system operating under changes in the flow of people and vehicles.
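
    A minimal sketch of the kind of coupling the record describes, with a single hand-wired sigmoid neuron standing in for the trained network; the weights, arrival rates and the officer's one-customer-per-event service rule are all chosen for illustration only:

        import heapq, math, random

        def ann_decide(car_q, ped_q, w=(0.8, -0.6), b=0.1):
            """One-neuron stand-in for the trained network: sigmoid over the two queue lengths."""
            z = w[0] * car_q + w[1] * ped_q + b
            return 1.0 / (1.0 + math.exp(-z)) > 0.5  # True -> officer favours the vehicles

        def simulate(horizon=480.0):
            cars = peds = 0
            fel = [(random.expovariate(1 / 2.0), "car"), (random.expovariate(1 / 1.5), "ped")]
            heapq.heapify(fel)
            while fel:
                t, kind = heapq.heappop(fel)
                if t > horizon:
                    break
                if kind == "car":
                    cars += 1
                    heapq.heappush(fel, (t + random.expovariate(1 / 2.0), "car"))
                else:
                    peds += 1
                    heapq.heappush(fel, (t + random.expovariate(1 / 1.5), "ped"))
                if ann_decide(cars, peds):  # decision hook called from inside the event routine
                    cars = max(0, cars - 1)
                else:
                    peds = max(0, peds - 1)
            return cars, peds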

  15. The effects of indoor environmental exposures on pediatric asthma: a discrete event simulation model

    Directory of Open Access Journals (Sweden)

    Fabian M Patricia

    2012-09-01

    Background: In the United States, asthma is the most common chronic disease of childhood across all socioeconomic classes and is the most frequent cause of hospitalization among children. Asthma exacerbations have been associated with exposure to residential indoor environmental stressors such as allergens and air pollutants as well as numerous additional factors. Simulation modeling is a valuable tool that can be used to evaluate interventions for complex multifactorial diseases such as asthma but in spite of its flexibility and applicability, modeling applications in either environmental exposures or asthma have been limited to date. Methods: We designed a discrete event simulation model to study the effect of environmental factors on asthma exacerbations in school-age children living in low-income multi-family housing. Model outcomes include asthma symptoms, medication use, hospitalizations, and emergency room visits. Environmental factors were linked to percent predicted forced expiratory volume in 1 second (FEV1%), which in turn was linked to risk equations for each outcome. Exposures affecting FEV1% included indoor and outdoor sources of NO2 and PM2.5, cockroach allergen, and dampness as a proxy for mold. Results: Model design parameters and equations are described in detail. We evaluated the model by simulating 50,000 children over 10 years and showed that pollutant concentrations and health outcome rates are comparable to values reported in the literature. In an application example, we simulated what would happen if the kitchen and bathroom exhaust fans were improved for the entire cohort, and showed reductions in pollutant concentrations and healthcare utilization rates. Conclusions: We describe the design and evaluation of a discrete event simulation model of pediatric asthma for children living in low-income multi-family housing. Our model simulates the effect of environmental factors (combustion pollutants and allergens).
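
    A minimal sketch of the exposure-to-outcome chain described above; every coefficient (the exposure effects on FEV1%, the hazard equation, the fan-upgrade factor) is an illustrative placeholder rather than a value from the published model:

        import math, random

        def simulate_child(years=10.0, pm25=12.0, no2=20.0, fan_upgrade=False):
            """One child's ER-visit history under illustrative placeholder coefficients."""
            if fan_upgrade:  # better exhaust fans cut indoor pollutant sources (assumed factor)
                pm25, no2 = 0.7 * pm25, 0.7 * no2
            fev1_pct = 95.0 - 0.4 * pm25 - 0.1 * no2           # exposure -> lung function
            hazard = 0.5 * math.exp(-0.05 * (fev1_pct - 80.0))  # ER visits/year rises as FEV1% falls
            t, visits = 0.0, 0
            while True:
                t += random.expovariate(hazard)                 # time to next exacerbation event
                if t > years:
                    return visits
                visits += 1

        cohort = [simulate_child(fan_upgrade=False) for _ in range(50_000)]
        upgraded = [simulate_child(fan_upgrade=True) for _ in range(50_000)]
        print(sum(cohort) / len(cohort), sum(upgraded) / len(upgraded))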

  16. Assessment of the Weather Research and Forecasting (WRF) model for simulation of extreme rainfall events in the upper Ganga Basin

    Science.gov (United States)

    Chawla, Ila; Osuri, Krishna K.; Mujumdar, Pradeep P.; Niyogi, Dev

    2018-02-01

    Reliable estimates of extreme rainfall events are necessary for an accurate prediction of floods. Most of the global rainfall products are available at a coarse resolution, rendering them less desirable for extreme rainfall analysis. Therefore, regional mesoscale models such as the advanced research version of the Weather Research and Forecasting (WRF) model are often used to provide rainfall estimates at fine grid spacing. Modelling heavy rainfall events is an enduring challenge, as such events depend on multi-scale interactions and on model configuration choices such as grid spacing, physical parameterization and initialization. With this background, the WRF model is implemented in this study to investigate the impact of different processes on extreme rainfall simulation, by considering a representative event that occurred during 15-18 June 2013 over the Ganga Basin in India, which is located at the foothills of the Himalayas. This event is simulated with ensembles involving four different microphysics (MP) schemes, two cumulus (CU) parameterizations, two planetary boundary layer (PBL) schemes and two land surface physics options, as well as different resolutions (grid spacing) within the WRF model. The simulated rainfall is evaluated against the observations from 18 rain gauges and the Tropical Rainfall Measuring Mission Multi-Satellite Precipitation Analysis (TMPA) 3B42RT version 7 data. The analysis shows that the choice of MP scheme influences the spatial pattern of rainfall, while the choice of PBL and CU parameterizations influences the magnitude of rainfall in the model simulations. Further, the WRF run with the Goddard MP, Mellor-Yamada-Janjic PBL and Betts-Miller-Janjic CU schemes is found to perform best in simulating this heavy rain event. The selected configuration is evaluated for several heavy to extremely heavy rainfall events that occurred across different months of the monsoon season in the region. The model performance improved through incorporation

  17. The cost of conservative synchronization in parallel discrete event simulations

    Science.gov (United States)

    Nicol, David M.

    1990-01-01

    The performance of a synchronous conservative parallel discrete-event simulation protocol is analyzed. The class of simulation models considered is oriented around a physical domain and possesses a limited ability to predict future behavior. A stochastic model is used to show that as the volume of simulation activity in the model increases relative to a fixed architecture, the complexity of the average per-event overhead due to synchronization, event list manipulation, lookahead calculations, and processor idle time approaches the complexity of the average per-event overhead of a serial simulation. The method is therefore within a constant factor of optimal. The analysis demonstrates that on large problems--those for which parallel processing is ideally suited--there is often enough parallel workload so that processors are not usually idle. The viability of the method is also demonstrated empirically, showing how good performance is achieved on large problems using a thirty-two node Intel iPSC/2 distributed memory multiprocessor.

  18. The use of discrete-event simulation modelling to improve radiation therapy planning processes.

    Science.gov (United States)

    Werker, Greg; Sauré, Antoine; French, John; Shechter, Steven

    2009-07-01

    The planning portion of the radiation therapy treatment process at the British Columbia Cancer Agency is efficient but nevertheless contains room for improvement. The purpose of this study is to show how a discrete-event simulation (DES) model can be used to represent this complex process and to suggest improvements that may reduce the planning time and ultimately reduce overall waiting times. A simulation model of the radiation therapy (RT) planning process was constructed using the Arena simulation software, representing the complexities of the system. Several types of inputs feed into the model; these inputs come from historical data, a staff survey, and interviews with planners. The simulation model was validated against historical data and then used to test various scenarios to identify and quantify potential improvements to the RT planning process. Simulation modelling is an attractive tool for describing complex systems, and can be used to identify improvements to the processes involved. It is possible to use this technique in the area of radiation therapy planning with the intent of reducing process times and subsequent delays for patient treatment. In this particular system, reducing the variability and length of oncologist-related delays contributes most to improving the planning time.

  19. A discrete event modelling framework for simulation of long-term outcomes of sequential treatment strategies for ankylosing spondylitis

    NARCIS (Netherlands)

    A. Tran-Duy (An); A. Boonen (Annelies); M.A.F.J. van de Laar (Mart); A. Franke (Andre); J.L. Severens (Hans)

    2011-01-01

    textabstractObjective: To develop a modelling framework which can simulate long-term quality of life, societal costs and cost-effectiveness as affected by sequential drug treatment strategies for ankylosing spondylitis (AS). Methods: Discrete event simulation paradigm was selected for model

  20. A discrete event modelling framework for simulation of long-term outcomes of sequential treatment strategies for ankylosing spondylitis

    NARCIS (Netherlands)

    Tran-Duy, A.; Boonen, A.; Laar, M.A.F.J.; Franke, A.C.; Severens, J.L.

    2011-01-01

    Objective To develop a modelling framework which can simulate long-term quality of life, societal costs and cost-effectiveness as affected by sequential drug treatment strategies for ankylosing spondylitis (AS). Methods Discrete event simulation paradigm was selected for model development. Drug

  1. Core discrete event simulation model for the evaluation of health care technologies in major depressive disorder.

    Science.gov (United States)

    Vataire, Anne-Lise; Aballéa, Samuel; Antonanzas, Fernando; Roijen, Leona Hakkaart-van; Lam, Raymond W; McCrone, Paul; Persson, Ulf; Toumi, Mondher

    2014-03-01

    A review of existing economic models in major depressive disorder (MDD) highlighted the need for models with longer time horizons that also account for heterogeneity in treatment pathways between patients. A core discrete event simulation model was developed to estimate health and cost outcomes associated with alternative treatment strategies. This model simulated short- and long-term clinical events (partial response, remission, relapse, recovery, and recurrence), adverse events, and treatment changes (titration, switch, addition, and discontinuation) over up to 5 years. Several treatment pathways were defined on the basis of fictitious antidepressants with three levels of efficacy, tolerability, and price (low, medium, and high) from first line to third line. The model was populated with input data from the literature for the UK setting. Model outputs include time in different health states, quality-adjusted life-years (QALYs), and costs from National Health Service and societal perspectives. The codes are open source. Predicted costs and QALYs from this model are within the range of results from previous economic evaluations. The largest cost components from the payer perspective were physician visits and hospitalizations. Key parameters driving the predicted costs and QALYs were utility values, effectiveness, and frequency of physician visits. Differences in QALYs and costs between two strategies with different effectiveness increased approximately twofold when the time horizon increased from 1 to 5 years. The discrete event simulation model can provide a more comprehensive evaluation of different therapeutic options in MDD, compared with existing Markov models, and can be used to compare a wide range of health care technologies in various groups of patients with MDD.
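
    A minimal sketch of such an event-driven treatment-pathway simulation, using competing exponential times to remission, relapse and treatment switch; the rates, utilities and costs below are fictitious placeholders in the spirit of the paper's fictitious antidepressants:

        import random

        RATES = [(0.17, 0.06), (0.13, 0.07), (0.10, 0.00)]  # (remission, switch) per month, per line

        def simulate_patient(horizon=60.0):
            """Months of follow-up with competing-risk event times (all values fictitious)."""
            t, line, state = 0.0, 0, "depressed"
            cost = qalys = 0.0
            while t < horizon:
                r_rem, r_switch = RATES[line]
                if state == "depressed":
                    dt_rem = random.expovariate(r_rem)
                    dt_sw = random.expovariate(r_switch) if r_switch > 0 else float("inf")
                    dt, nxt = min((dt_rem, "remission"), (dt_sw, "switch"))
                else:
                    dt, nxt = random.expovariate(0.03), "relapse"  # time to relapse
                dt = min(dt, horizon - t)
                cost += 30.0 * (line + 1) * dt                     # monthly drug + visit cost
                qalys += (0.85 if state == "remission" else 0.60) * dt / 12.0
                t += dt
                if t >= horizon:
                    break
                if nxt == "remission":
                    state = "remission"
                elif nxt == "relapse":
                    state = "depressed"
                else:
                    line += 1                                      # move to next treatment line
            return cost, qalys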

  2. Manual for the Jet Event and Background Simulation Library (JEBSimLib)

    Energy Technology Data Exchange (ETDEWEB)

    Heinz, Matthias [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Soltz, Ron [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Angerami, Aaron [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-08-29

    Jets are the collimated streams of particles resulting from hard scattering in the initial state of high-energy collisions. In heavy-ion collisions, jets interact with the quark-gluon plasma (QGP) before freezeout, providing a probe into the internal structure and properties of the QGP. In order to study jets, background must be subtracted from the measured event, potentially introducing a bias. We aim to understand and quantify this subtraction bias. PYTHIA, a library to simulate pure jet events, is used to simulate a model for a signature with one pure jet (a photon) and one quenched jet, where all quenched particle momenta are reduced by a user-defined constant fraction. Background for the event is simulated using multiplicity values generated by the TRENTO initial state model of heavy-ion collisions fed into a thermal model consisting of a 3-dimensional Boltzmann distribution for particle types and momenta. Data from the simulated events is used to train a statistical model, which computes a posterior distribution of the quench factor for a data set. The model was tested first on pure jet events and then on full events including the background. This model will allow for a quantitative determination of biases induced by various methods of background subtraction.

  3. Synchronous Parallel Emulation and Discrete Event Simulation System with Self-Contained Simulation Objects and Active Event Objects

    Science.gov (United States)

    Steinman, Jeffrey S. (Inventor)

    1998-01-01

    The present invention is embodied in a method of performing object-oriented simulation and a system having inter-connected processor nodes operating in parallel to simulate mutual interactions of a set of discrete simulation objects distributed among the nodes as a sequence of discrete events changing state variables of respective simulation objects so as to generate new event-defining messages addressed to respective ones of the nodes. The object-oriented simulation is performed at each one of the nodes by assigning passive self-contained simulation objects to each one of the nodes, responding to messages received at one node by generating corresponding active event objects having user-defined inherent capabilities and individual time stamps and corresponding to respective events affecting one of the passive self-contained simulation objects of the one node, restricting the respective passive self-contained simulation objects to only providing and receiving information from the respective active event objects, requesting information and changing variables within a passive self-contained simulation object by the active event object, and producing corresponding messages specifying events resulting therefrom by the active event objects.
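
    A minimal sketch of the separation the patent describes, with passive self-contained simulation objects that only hold state and time-stamped active event objects that carry the behaviour; the class names and the ten-step example are illustrative, not from the patent:

        import heapq

        class PassiveObject:
            """Self-contained simulation object: it only holds state that event objects
            query and change."""
            def __init__(self, name):
                self.name, self.count = name, 0

        class ActiveEvent:
            """Time-stamped event object that carries the behaviour."""
            def __init__(self, time, target):
                self.time, self.target = time, target
            def __lt__(self, other):       # lets heapq order events by time stamp
                return self.time < other.time
            def execute(self, schedule):
                self.target.count += 1     # the event, not the object, mutates state
                if self.time + 1.0 < 10.0: # generate a new event-defining message
                    heapq.heappush(schedule, ActiveEvent(self.time + 1.0, self.target))

        node = PassiveObject("n0")
        schedule = [ActiveEvent(0.0, node)]
        while schedule:
            heapq.heappop(schedule).execute(schedule)
        print(node.name, node.count)       # -> n0 10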

  4. Parallel Stochastic discrete event simulation of calcium dynamics in neuron.

    Science.gov (United States)

    Ishlam Patoary, Mohammad Nazrul; Tropper, Carl; McDougal, Robert A; Zhongwei, Lin; Lytton, William W

    2017-09-26

    The intracellular calcium signaling pathways of a neuron depend on both biochemical reactions and diffusion. Some quasi-isolated compartments (e.g. spines) are so small and calcium concentrations so low that a single extra molecule diffusing in by chance can make a nontrivial (percentage-wise) difference in concentration. These rare events can affect dynamics discretely in such a way that they cannot be evaluated by a deterministic simulation. Stochastic models of such a system provide a more detailed understanding than existing deterministic models because they capture behavior at the molecular level. Our research focuses on the development of a high-performance parallel discrete event simulation environment, Neuron Time Warp (NTW), which is intended for use in the parallel simulation of stochastic reaction-diffusion systems such as intracellular calcium signaling. NTW is integrated with NEURON, a simulator which is widely used within the neuroscience community. We simulate two models, a calcium buffer and a calcium wave model. The calcium buffer model is employed to verify the correctness and performance of NTW by comparing it to a serial deterministic simulation in NEURON. We also derived a discrete event calcium wave model from a deterministic model using the stochastic IP3R structure.

  5. Evaluation of a proposed optimization method for discrete-event simulation models

    Directory of Open Access Journals (Sweden)

    Alexandre Ferreira de Pinho

    2012-12-01

    Optimization methods combined with computer-based simulation have been utilized in a wide range of manufacturing applications. However, current implementations of these methods exhibit low performance and are able to manipulate only a single decision variable at a time. The objective of this article is therefore to evaluate a proposed optimization method for discrete-event simulation models, based on genetic algorithms, which is more efficient in terms of computational time than software packages on the market. It should be emphasized that the quality of the response variables is not altered; that is, the proposed method maintains the solutions' effectiveness. The study draws a comparison between the proposed method and a simulation tool already available on the market that has been examined in the academic literature. Conclusions are presented, confirming the proposed optimization method's efficiency.

  6. Simulation of interim spent fuel storage system with discrete event model

    International Nuclear Information System (INIS)

    Yoon, Wan Ki; Song, Ki Chan; Lee, Jae Sol; Park, Hyun Soo

    1989-01-01

    This paper describes dynamic simulation of a spent fuel storage system described by statistical discrete event models. The simulation visualizes the flow and queues of the system over time, assesses the operational performance of the system activities, and establishes the system components and streams. It gives information on system organization and operation policy with reference to the design. The system was tested and analyzed over a number of critical parameters to establish the optimal configuration. Workforce schedules and resources with long processing times dominate the process. A combination of two workforce shifts a day and two cooling pits gives the optimal solution for the storage system. Discrete system simulation is a useful tool for obtaining information on the optimal design and operation of the storage system. (Author)
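
    A minimal sketch of such a storage-system simulation, with cooling pits as a scarce resource and the shift schedule folded into the effective handling time; the arrival and handling times are illustrative, not from the study:

        import heapq, random

        def simulate(pits=2, shifts_per_day=2, days=1000.0):
            """Casks arrive, queue for a cooling pit, and occupy it during handling.
            Fewer shifts stretch the effective handling time (illustrative numbers)."""
            work_frac = shifts_per_day * 8.0 / 24.0          # fraction of each day staff work
            free, queue, waits = pits, [], []
            fel = [(random.expovariate(1 / 3.0), "arrive")]  # mean inter-arrival: 3 days
            while fel:
                t, kind = heapq.heappop(fel)
                if t > days:
                    break
                if kind == "arrive":
                    queue.append(t)
                    heapq.heappush(fel, (t + random.expovariate(1 / 3.0), "arrive"))
                else:                                        # a pit has been freed
                    free += 1
                while free and queue:                        # start service whenever possible
                    free -= 1
                    waits.append(t - queue.pop(0))
                    handling = random.expovariate(1 / 3.0) / work_frac
                    heapq.heappush(fel, (t + handling, "release"))
            return sum(waits) / len(waits)

        print(simulate(pits=2, shifts_per_day=2))            # the configuration the study favours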

  7. The null-event method in computer simulation

    International Nuclear Information System (INIS)

    Lin, S.L.

    1978-01-01

    The simulation of collisions of ions moving under the influence of an external field through a neutral gas at non-zero temperatures is discussed as an example of computer models of processes in which a probe particle undergoes a series of interactions with an ensemble of other particles, such that the frequency and outcome of the events depend on internal properties of the second particles. The introduction of null events removes the need for much complicated algebra, leads to a more efficient simulation and reduces the likelihood of logical error. (Auth.)
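
    The null-event idea can be stated in a few lines: pad the state-dependent collision rate up to a constant majorant, draw candidate times from the single resulting exponential, and accept each candidate with probability rate/majorant, treating rejections as null events that change nothing. A minimal sketch (the oscillating rate function is illustrative):

        import math, random

        def next_real_collision(t, rate_of, rate_max):
            """Advance to the next *real* collision with the null-event method:
            rate_of(t) is the true, state-dependent rate; rate_max bounds it."""
            while True:
                t += random.expovariate(rate_max)          # candidate from the majorant rate
                if random.random() < rate_of(t) / rate_max:
                    return t                               # accepted: a real collision
                # else: null event -- no state change, no per-case algebra needed

        rate = lambda t: 2.0 + 1.5 * math.sin(t)           # state-dependent rate, always <= 3.5
        t = 0.0
        for _ in range(5):
            t = next_real_collision(t, rate, rate_max=3.5)
            print(round(t, 3))

    Because candidates are drawn from a rate that bounds the true one, the accepted times have exactly the statistics of the inhomogeneous process; this is the same construction known elsewhere as thinning.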

  8. Event-by-event simulation of Einstein-Podolsky-Rosen-Bohm experiments

    NARCIS (Netherlands)

    Zhao, Shuang; De Raedt, Hans; Michielsen, Kristel

    We construct an event-based computer simulation model of the Einstein-Podolsky-Rosen-Bohm experiments with photons. The algorithm is a one-to-one copy of the data gathering and analysis procedures used in real laboratory experiments. We consider two types of experiments, those with a source emitting

  9. A PC-based discrete event simulation model of the civilian radioactive waste management system

    International Nuclear Information System (INIS)

    Airth, G.L.; Joy, D.S.; Nehls, J.W.

    1992-01-01

    This paper discusses a System Simulation Model which has been developed for the Department of Energy to simulate the movement of individual waste packages (spent fuel assemblies and fuel containers) through the Civilian Radioactive Waste Management System (CRWMS). A discrete event simulation language, GPSS/PC, which runs on an IBM/PC and operates under DOS 5.0, mathematically represents the movement and processing of radioactive waste packages through the CRWMS and the interaction of these packages with the equipment in the various facilities. The major features of the System Simulation Model are: the ability to reference characteristics of the different types of radioactive waste (age, burnup, etc.) in order to make operational and/or system design decisions, the ability to place stochastic variations on operational parameters such as processing time and equipment outages, and the ability to include a rigorous simulation of the transportation system. Output from the model includes the numbers, types, and characteristics of waste packages at selected points in the CRWMS and the extent to which various resources will be utilized in order to transport, process, and emplace the waste

  10. A Discrete-Event Simulation Model for Evaluating Air Force Reusable Military Launch Vehicle Post-Landing Operations

    National Research Council Canada - National Science Library

    Martindale, Michael

    2006-01-01

    The purpose of this research was to develop a discrete-event computer simulation model of the post-landing vehicle recovery operations to allow the Air Force Research Laboratory, Air Vehicles Directorate...

  11. Discrete-event simulation of nuclear-waste transport in geologic sites subject to disruptive events. Final report

    International Nuclear Information System (INIS)

    Aggarwal, S.; Ryland, S.; Peck, R.

    1980-01-01

    This report outlines a methodology to study the effects of disruptive events on nuclear waste material in stable geologic sites. The methodology is based upon developing a discrete events model that can be simulated on the computer. This methodology allows a natural development of simulation models that use computer resources in an efficient manner. Accurate modeling in this area depends in large part upon accurate modeling of ion transport behavior in the storage media. Unfortunately, developments in this area are not at a stage where there is any consensus on proper models for such transport. Consequently, our work is directed primarily towards showing how disruptive events can be properly incorporated in such a model, rather than as a predictive tool at this stage. When and if proper geologic parameters can be determined, then it would be possible to use this as a predictive model. Assumptions and their bases are discussed, and the mathematical and computer model are described

  12. Discrete event simulation of Maglev transport considering traffic waves

    Directory of Open Access Journals (Sweden)

    Moo Hyun Cha

    2014-10-01

    A magnetically levitated vehicle (Maglev) system is under commercialization as a new transportation system in Korea. The Maglev is operated by an unmanned automatic control system. Therefore, the plan of train operation should be carefully established and validated in advance. In general, when making a train operation plan, statistically predicted traffic data is used. However, a traffic wave often occurs in real train service, and demand-driven simulation technology is required to review a train operation plan and service quality considering traffic waves. We propose a method and model to simulate Maglev operation considering continuous demand changes. For this purpose, we employed a discrete event model that is suitable for modeling the behavior of railway passenger transportation. We modeled the system hierarchically using the discrete event system specification (DEVS) formalism. In addition, through implementation and an experiment using the DEVSim++ simulation environment, we tested the feasibility of the proposed model. Our experimental results also verified that our demand-driven simulation technology can be used for a priori review of train operation plans and strategies.
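
    A minimal sketch of what a DEVS atomic model looks like, with the formalism's four ingredients (time advance, external and internal transitions, output); this is generic Python, not DEVSim++ syntax, and the station behaviour is invented for illustration:

        class Station:
            """DEVS-style atomic model of a station (a sketch of the formalism)."""
            def __init__(self):
                self.waiting = 0                  # state: passengers on the platform
                self.boarding = False

            def ta(self):                         # time advance function
                return 0.5 if self.boarding else float("inf")  # board for 0.5, else wait passively

            def delta_ext(self, elapsed, msg):    # external transition: an input event arrives
                if msg == "passenger":
                    self.waiting += 1
                elif msg == "train":
                    self.boarding = True

            def output(self):                     # output function, fired just before delta_int
                return ("boarded", self.waiting)

            def delta_int(self):                  # internal transition: boarding completes
                self.waiting, self.boarding = 0, False

    A coordinator then couples such atomic models hierarchically, advancing each by its own ta() and routing outputs to the delta_ext of its neighbours.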

  13. A Generic Discrete-Event Simulation Model for Outpatient Clinics in a Large Public Hospital

    Directory of Open Access Journals (Sweden)

    Waressara Weerawat

    2013-01-01

    The orthopedic outpatient department (OPD) ward in a large Thai public hospital is modeled using Discrete-Event Stochastic (DES) simulation. Key Performance Indicators (KPIs) are used to measure effects across various clinical operations during different shifts throughout the day. By considering various KPIs such as wait times to see doctors, the percentage of patients who can see a doctor within a target time frame, and the time that the last patient completes their doctor consultation, bottlenecks are identified and resource-critical clinics can be prioritized. The simulation model quantifies the chronic, high patient congestion that is prevalent amongst Thai public hospitals with very high patient-to-doctor ratios. Our model can be applied across five different OPD wards by modifying the model parameters. Throughout this work, we show how DES models can be used as decision-support tools for hospital management.
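
    A minimal sketch of a single clinic computing the three KPIs the record names (mean wait, fraction seen within a target, last completion time); the arrival and consultation distributions are illustrative placeholders for the hospital's data:

        import heapq, random

        def clinic_day(doctors=3, patients=120, target=60.0):
            """One OPD session in minutes, under illustrative placeholder distributions."""
            arrivals = sorted(random.uniform(0.0, 240.0) for _ in range(patients))
            free = [0.0] * doctors                      # time each doctor next becomes free
            heapq.heapify(free)
            waits, last_done = [], 0.0
            for a in arrivals:
                start = max(a, heapq.heappop(free))     # earliest-free doctor sees the patient
                finish = start + random.expovariate(1 / 6.0)
                heapq.heappush(free, finish)
                waits.append(start - a)
                last_done = max(last_done, finish)
            within = sum(w <= target for w in waits) / len(waits)
            return sum(waits) / len(waits), within, last_done   # the three KPIs

    With the parameters above the clinic runs near full utilization, which is exactly the congested regime the record describes; lowering the patient-to-doctor ratio shifts all three KPIs visibly.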

  14. Running Parallel Discrete Event Simulators on Sierra

    Energy Technology Data Exchange (ETDEWEB)

    Barnes, P. D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Jefferson, D. R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-12-03

    In this proposal we consider porting the ROSS/Charm++ simulator and the discrete event models that run under its control so that they run on the Sierra architecture and make efficient use of the Volta GPUs.

  15. Episodes, events, and models

    Directory of Open Access Journals (Sweden)

    Sangeet Khemlani

    2015-10-01

    We describe a novel computational theory of how individuals segment perceptual information into representations of events. The theory is inspired by recent findings in the cognitive science and cognitive neuroscience of event segmentation. In line with recent theories, it holds that online event segmentation is automatic, and that event segmentation yields mental simulations of events. But it posits two novel principles as well: first, discrete episodic markers track perceptual and conceptual changes, and can be retrieved to construct event models. Second, the process of retrieving and reconstructing those episodic markers is constrained and prioritized. We describe a computational implementation of the theory, as well as a robotic extension of the theory that demonstrates the processes of online event segmentation and event model construction. The theory is the first unified computational account of event segmentation and temporal inference. We conclude by demonstrating how neuroimaging data can constrain and inspire the construction of process-level theories of human reasoning.

  16. A regional model simulation of the 1991 severe precipitation event over the Yangtze-Huai River Valley. Part 2: Model bias

    Energy Technology Data Exchange (ETDEWEB)

    Gong, W.; Wang, W.C.

    2000-01-01

    This is the second part of a study investigating the 1991 severe precipitation event over the Yangtze-Huai River valley (YHRV) in China using both observations and regional model simulations. While Part 1 reported on the Mei-yu front and its association with large-scale circulation, this study documents the biases associated with the treatment of the lateral boundary in the regional model. Two aspects of the biases were studied: the driving field, which provides large-scale boundary forcing, and the coupling scheme, which specifies how the forcing is adopted by the model. The former bias is defined as model uncertainty because it is not related to the model itself, while the latter bias (as well as those biases attributed to other sources) is referred to as model error. These two aspects were examined by analyzing the regional model simulations of the 1991 summer severe precipitation event over YHRV using different driving fields (ECMWF-TOGA objective analysis, ECMWF reanalysis, and NCEP-NCAR reanalysis) and coupling schemes (distribution function of the nudging coefficient and width of the buffer zone). Spectral analysis was also used to study the frequency distribution of the bias.

  17. Corpuscular event-by-event simulation of quantum optics experiments : application to a quantum-controlled delayed-choice experiment

    NARCIS (Netherlands)

    De Raedt, Hans; Delina, M; Jin, Fengping; Michielsen, Kristel

    2012-01-01

    A corpuscular simulation model of optical phenomena that does not require knowledge of the solution of a wave equation of the whole system and reproduces the results of Maxwell's theory by generating detection events one by one is discussed. The event-based corpuscular model gives a unified

  18. Simulation modeling and analysis with Arena

    CERN Document Server

    Altiok, Tayfur

    2007-01-01

    Simulation Modeling and Analysis with Arena is a highly readable textbook which treats the essentials of the Monte Carlo discrete-event simulation methodology, and does so in the context of the popular Arena simulation environment. It treats simulation modeling as an in-vitro laboratory that facilitates the understanding of complex systems and experimentation with what-if scenarios in order to estimate their performance metrics. The book contains chapters on the simulation modeling methodology and the underpinnings of discrete-event systems, as well as the relevant underlying probability, statistics, stochastic processes, input analysis, model validation and output analysis. All simulation-related concepts are illustrated in numerous Arena examples, encompassing production lines, manufacturing and inventory systems, transportation systems, and computer information systems in networked settings. It introduces the concept of discrete event Monte Carlo simulation, the most commonly used methodology for modeli...

  19. Reproductive Health Services Discrete-Event Simulation

    OpenAIRE

    Lee, Sungjoo; Giles, Denise F.; Goldsman, David; Cook, Douglas A.; Mishra, Ninad; McCarthy, Brian

    2006-01-01

    Low resource healthcare environments are often characterized by patient flow patterns with varying patient risks, extensive patient waiting times, uneven workload distributions, and inefficient service delivery. Models from industrial and systems engineering allow for a greater examination of processes by applying discrete-event computer simulation techniques to evaluate and optimize hospital performance.

  20. A discrete event simulation model for evaluating time delays in a pipeline network

    Energy Technology Data Exchange (ETDEWEB)

    Spricigo, Deisi; Muggiati, Filipe V.; Lueders, Ricardo; Neves Junior, Flavio [Federal University of Technology of Parana (UTFPR), Curitiba, PR (Brazil)

    2009-07-01

    Currently in the oil industry the logistic chain stands out as a strong candidate for obtaining the highest profit, since recent studies have pointed to cost reductions from adopting better policies for the distribution of oil derivatives, particularly where pipelines are used to transport products. Although there are models to represent transfers of oil derivatives in pipelines, they are quite complex and computationally burdensome. In this paper, we are interested in models that are less detailed in terms of fluid dynamics but provide more information about operational decisions in a pipeline network. We propose a discrete event simulation model in ARENA that allows simulating a pipeline network based on average historical data. Time delays for transferring different products can be evaluated through different routes. It is assumed that transport operations follow historical behavior, so that average time delays can be estimated within certain bounds. Due to their stochastic nature, time quantities are characterized by average and dispersion measures. This allows comparing different operational scenarios for product transportation. Simulation results are compared to data obtained from a real-world pipeline network, and different scenarios of production and demand are analyzed. (author)
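
    A minimal sketch of estimating route transfer times with average and dispersion measures from historical per-segment statistics; the network, the lognormal distribution choice and all numbers are illustrative, not from the paper:

        import math, random, statistics

        SEGMENTS = {                                   # (median hours, spread) per leg -- illustrative
            ("refinery", "terminal_A"): (10.0, 0.25),
            ("terminal_A", "terminal_B"): (6.0, 0.30),
        }

        def route_delay(route):
            """Total transfer time along a route, with lognormal leg-by-leg variability."""
            total = 0.0
            for leg in zip(route, route[1:]):
                median, sigma = SEGMENTS[leg]
                total += random.lognormvariate(math.log(median), sigma)
            return total

        samples = [route_delay(["refinery", "terminal_A", "terminal_B"]) for _ in range(10_000)]
        print(statistics.mean(samples), statistics.stdev(samples))  # average and dispersion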

  1. A PC-based discrete event simulation model of the Civilian Radioactive Waste Management System

    International Nuclear Information System (INIS)

    Airth, G.L.; Joy, D.S.; Nehls, J.W.

    1991-01-01

    A System Simulation Model has been developed for the Department of Energy to simulate the movement of individual waste packages (spent fuel assemblies and fuel containers) through the Civilian Radioactive Waste Management System (CRWMS). A discrete event simulation language, GPSS/PC, which runs on an IBM/PC and operates under DOS 5.0, mathematically represents the movement and processing of radioactive waste packages through the CRWMS and the interaction of these packages with the equipment in the various facilities. This model can be used to quantify the impacts of different operating schedules, operational rules, system configurations, and equipment reliability and availability considerations on the performance of processes comprising the CRWMS and how these factors combine to determine overall system performance for the purpose of making system design decisions. The major features of the System Simulation Model are: the ability to reference characteristics of the different types of radioactive waste (age, burnup, etc.) in order to make operational and/or system design decisions, the ability to place stochastic variations on operational parameters such as processing time and equipment outages, and the ability to include a rigorous simulation of the transportation system. Output from the model includes the numbers, types, and characteristics of waste packages at selected points in the CRWMS and the extent to which various resources will be utilized in order to transport, process, and emplace the waste

  2. Discrete Event Modeling and Simulation-Driven Engineering for the ATLAS Data Acquisition Network

    CERN Document Server

    Bonaventura, Matias Alejandro; The ATLAS collaboration; Castro, Rodrigo Daniel

    2016-01-01

    We present an iterative and incremental development methodology for simulation models in network engineering projects. Driven by the DEVS (Discrete Event Systems Specification) formal framework for modeling and simulation, we assist network design, test, analysis and optimization processes. A practical application of the methodology is presented for a case study in the ATLAS particle physics detector, the largest scientific experiment ever built, where scientists around the globe search for answers about the origins of the universe. The ATLAS data network conveys real-time information produced by the physics detectors as beams of particles collide. The produced sub-atomic evidence must be filtered and recorded for further offline scrutiny. Due to the criticality of the transported data, networks and applications undergo careful engineering processes with stringent quality-of-service requirements. A tight project schedule imposes time pressure on design decisions, while rapid technology evolution widens the palett...

  3. Event-by-event simulation of quantum cryptography protocols

    NARCIS (Netherlands)

    Zhao, S.; Raedt, H. De

    We present a new approach to simulate quantum cryptography protocols using event-based processes. The method is validated by simulating the BB84 protocol and the Ekert protocol, both without and with the presence of an eavesdropper.

  4. Discrete event simulation as an ergonomic tool to predict workload exposures during systems design

    NARCIS (Netherlands)

    Perez, J.; Looze, M.P. de; Bosch, T.; Neumann, W.P.

    2014-01-01

    This methodological paper presents a novel approach to predict operator's mechanical exposure and fatigue accumulation in discrete event simulations. A biomechanical model of work-cycle loading is combined with a discrete event simulation model which provides work cycle patterns over the shift

  5. Event-driven simulation of neural population synchronization facilitated by electrical coupling.

    Science.gov (United States)

    Carrillo, Richard R; Ros, Eduardo; Barbour, Boris; Boucheny, Christian; Coenen, Olivier

    2007-02-01

    Most neural communication and processing tasks are driven by spikes. This has enabled the application of event-driven simulation schemes. However, the simulation of spiking neural networks based on complex models that cannot be simplified to analytical expressions (thus requiring numerical calculation) is very time consuming. Here we briefly describe an event-driven simulation scheme that uses pre-calculated table-based neuron characterizations to avoid numerical calculations during a network simulation, allowing the simulation of large-scale neural systems. More concretely, we explain how electrical coupling can be simulated efficiently within this computation scheme, reproducing synchronization processes observed in detailed simulations of neural populations.
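
    A minimal sketch of the table-based idea: the neuron's response is pre-computed once on a time grid, and the event-driven run only performs lookups when spikes arrive; the exponential membrane model, time constant and spike train are illustrative, not the paper's characterizations:

        import bisect, math

        # Pre-calculated characterization of the membrane: decay factors tabulated once,
        # so no numerical integration is needed during the event-driven run.
        DTS = [0.1 * i for i in range(101)]                  # 0..10 ms grid
        TABLE = [math.exp(-dt / 5.0) for dt in DTS]          # tau = 5 ms (illustrative)

        def decay(dt):
            """Nearest-grid-point table lookup in place of solving the membrane ODE."""
            i = min(bisect.bisect_left(DTS, dt), len(DTS) - 1)
            return TABLE[i]

        v, t_last = 0.0, 0.0
        for t_spike, weight in [(1.0, 0.4), (2.5, 0.9), (3.1, 0.3)]:  # incoming spike events
            v = v * decay(t_spike - t_last) + weight         # state jumps event-to-event
            t_last = t_spike
            if v > 1.0:                                      # threshold crossing -> output spike
                print("output spike at", t_spike)
                v = 0.0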

  6. A discrete event modelling framework for simulation of long-term outcomes of sequential treatment strategies for ankylosing spondylitis.

    Science.gov (United States)

    Tran-Duy, An; Boonen, Annelies; van de Laar, Mart A F J; Franke, Angelinus C; Severens, Johan L

    2011-12-01

    To develop a modelling framework which can simulate long-term quality of life, societal costs and cost-effectiveness as affected by sequential drug treatment strategies for ankylosing spondylitis (AS). The discrete event simulation paradigm was selected for model development. Drug efficacy was modelled as changes in disease activity (Bath Ankylosing Spondylitis Disease Activity Index (BASDAI)) and functional status (Bath Ankylosing Spondylitis Functional Index (BASFI)), which were linked to costs and health utility using statistical models fitted on an observational AS cohort. Published clinical data were used to estimate drug efficacy and time to events. Two strategies were compared: (1) five available non-steroidal anti-inflammatory drugs (strategy 1) and (2) the same as strategy 1 plus two tumour necrosis factor α inhibitors (strategy 2). 13,000 patients were followed up individually until death. For probabilistic sensitivity analysis, Monte Carlo simulations were performed with 1000 sets of parameters sampled from the appropriate probability distributions. The models successfully generated valid data on treatments, BASDAI, BASFI, utility, quality-adjusted life years (QALYs) and costs at time points with intervals of 1-3 months over the simulation length of 70 years. The incremental cost per QALY gained in strategy 2 compared with strategy 1 was €35,186. At a willingness-to-pay threshold of €80,000, it was 99.9% certain that strategy 2 was cost-effective. The modelling framework provides great flexibility to implement complex algorithms representing treatment selection, disease progression and changes in costs and utilities over time for patients with AS. Results obtained from the simulation are plausible.

  7. Discrete event simulation model of sudden cardiac death predicts high impact of preventive interventions.

    Science.gov (United States)

    Andreev, Victor P; Head, Trajen; Johnson, Neil; Deo, Sapna K; Daunert, Sylvia; Goldschmidt-Clermont, Pascal J

    2013-01-01

    Sudden Cardiac Death (SCD) is responsible for at least 180,000 deaths a year and incurs an average cost of $286 billion annually in the United States alone. Herein, we present a novel discrete event simulation model of SCD, which quantifies the chains of events associated with the formation, growth, and rupture of atheroma plaques, and the subsequent formation of clots, thrombosis and onset of arrhythmias within a population. The predictions generated by the model are in good agreement both with results obtained from pathological examinations on the frequencies of three major types of atheroma, and with epidemiological data on the prevalence and risk of SCD. These model predictions allow for the identification of interventions, and importantly of the optimal time of intervention, with high potential impact on SCD risk reduction (up to 8-fold reduction in the number of SCDs in the population) as well as an increase in life expectancy.

  8. Discrete Event Simulation Computers can be used to simulate the ...

    Indian Academy of Sciences (India)

    IAS Admin

    How is discrete event simulation different from other kinds of simulation? ... Schedule the CustomerDeparture event for this customer.

  9. Visual Data-Analytics of Large-Scale Parallel Discrete-Event Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Ross, Caitlin; Carothers, Christopher D.; Mubarak, Misbah; Carns, Philip; Ross, Robert; Li, Jianping Kelvin; Ma, Kwan-Liu

    2016-11-13

    Parallel discrete-event simulation (PDES) is an important tool in the codesign of extreme-scale systems because PDES provides a cost-effective way to evaluate designs of highperformance computing systems. Optimistic synchronization algorithms for PDES, such as Time Warp, allow events to be processed without global synchronization among the processing elements. A rollback mechanism is provided when events are processed out of timestamp order. Although optimistic synchronization protocols enable the scalability of large-scale PDES, the performance of the simulations must be tuned to reduce the number of rollbacks and provide an improved simulation runtime. To enable efficient large-scale optimistic simulations, one has to gain insight into the factors that affect the rollback behavior and simulation performance. We developed a tool for ROSS model developers that gives them detailed metrics on the performance of their large-scale optimistic simulations at varying levels of simulation granularity. Model developers can use this information for parameter tuning of optimistic simulations in order to achieve better runtime and fewer rollbacks. In this work, we instrument the ROSS optimistic PDES framework to gather detailed statistics about the simulation engine. We have also developed an interactive visualization interface that uses the data collected by the ROSS instrumentation to understand the underlying behavior of the simulation engine. The interface connects real time to virtual time in the simulation and provides the ability to view simulation data at different granularities. We demonstrate the usefulness of our framework by performing a visual analysis of the dragonfly network topology model provided by the CODES simulation framework built on top of ROSS. The instrumentation needs to minimize overhead in order to accurately collect data about the simulation performance. To ensure that the instrumentation does not introduce unnecessary overhead, we perform a

  10. Discrete Event Simulation for the Analysis of Artillery Fired Projectiles from Shore

    Science.gov (United States)

    2017-06-01

    Discrete Event Simulation with Simkit: Simkit is a library of classes and interfaces, written in Java, that supports ease of implementation... Simkit allows simulation modelers to break complex systems into components through a framework of Listener Event Graph Objects (LEGOs), described in... A disadvantage of using Java Enum Types is the inability to change the values of Enum Type parameters while conducting a designed experiment.

  11. Hybrid simulation models of production networks

    CERN Document Server

    Kouikoglou, Vassilis S

    2001-01-01

    This book is concerned with a most important area of industrial production, that of analysis and optimization of production lines and networks using discrete-event models and simulation. The book introduces a novel approach that combines analytic models and discrete-event simulation. Unlike conventional piece-by-piece simulation, this method observes a reduced number of events between which the evolution of the system is tracked analytically. Using this hybrid approach, several models are developed for the analysis of production lines and networks. The hybrid approach combines speed and accuracy for exceptional analysis of most practical situations. A number of optimization problems, involving buffer design, workforce planning, and production control, are solved through the use of hybrid models.

  12. Discrete event simulation versus conventional system reliability analysis approaches

    DEFF Research Database (Denmark)

    Kozine, Igor

    2010-01-01

    Discrete Event Simulation (DES) environments are rapidly developing and appear to be promising tools for building reliability and risk analysis models of safety-critical systems and human operators. If properly developed, they are an alternative to conventional human reliability analysis models ... and systems analysis methods such as fault and event trees and Bayesian networks. As one part, the paper briefly describes the author’s experience in applying DES models to the analysis of safety-critical systems in different domains. The other part of the paper is devoted to comparing conventional approaches...

  13. Dermatopathology effects of simulated solar particle event radiation exposure in the porcine model.

    Science.gov (United States)

    Sanzari, Jenine K; Diffenderfer, Eric S; Hagan, Sarah; Billings, Paul C; Gridley, Daila S; Seykora, John T; Kennedy, Ann R; Cengel, Keith A

    2015-07-01

    The space environment exposes astronauts to risks of acute and chronic exposure to ionizing radiation. Of particular concern is possible exposure to ionizing radiation from a solar particle event (SPE). During an SPE, magnetic disturbances in specific regions of the Sun result in the release of intense bursts of ionizing radiation, primarily consisting of protons that have a highly variable energy spectrum. Thus, SPE events can lead to significant total body radiation exposures to astronauts in space vehicles and especially while performing extravehicular activities. Simulated energy profiles suggest that SPE radiation exposures are likely to be highest in the skin. In the current report, we have used our established miniature pig model system to evaluate the skin toxicity of simulated SPE radiation exposures that closely resemble the energy and fluence profile of the September 1989 SPE using either conventional radiation (electrons) or proton simulated SPE radiation. Exposure of animals to electron or proton radiation led to dose-dependent increases in epidermal pigmentation, the presence of necrotic keratinocytes at the dermal-epidermal boundary and pigment incontinence, manifested by the presence of melanophages in the dermis upon histological examination. We also observed epidermal hyperplasia and a reduction in vascular density at 30 days following exposure to electron or proton simulated SPE radiation. These results suggest that the doses of electron or proton simulated SPE radiation result in significant skin toxicity that is quantitatively and qualitatively similar. Radiation-induced skin damage is often one of the first clinical signs of both acute and non-acute radiation injury, where infection may occur if not treated. In this report, histopathology analyses of acute radiation-induced skin injury are discussed.

  14. Discrete Event Simulation for Decision Modeling in Health Care: Lessons from Abdominal Aortic Aneurysm Screening

    Science.gov (United States)

    Jones, Edmund; Masconi, Katya L.; Sweeting, Michael J.; Thompson, Simon G.; Powell, Janet T.

    2018-01-01

    Markov models are often used to evaluate the cost-effectiveness of new healthcare interventions but they are sometimes not flexible enough to allow accurate modeling or investigation of alternative scenarios and policies. A Markov model previously demonstrated that a one-off invitation to screening for abdominal aortic aneurysm (AAA) for men aged 65 y in the UK and subsequent follow-up of identified AAAs was likely to be highly cost-effective at thresholds commonly adopted in the UK (£20,000 to £30,000 per quality adjusted life-year). However, new evidence has emerged and the decision problem has evolved to include exploration of the circumstances under which AAA screening may be cost-effective, which the Markov model is not easily able to address. A new model to handle this more complex decision problem was needed, and the case of AAA screening thus provides an illustration of the relative merits of Markov models and discrete event simulation (DES) models. An individual-level DES model was built using the R programming language to reflect possible events and pathways of individuals invited to screening v. those not invited. The model was validated against key events and cost-effectiveness, as observed in a large, randomized trial. Different screening protocol scenarios were investigated to demonstrate the flexibility of the DES. The case of AAA screening highlights the benefits of DES, particularly in the context of screening studies.
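
    A minimal sketch of why an individual-level DES suits this problem: each man carries a continuous aneurysm diameter that evolves and triggers events, which a cohort Markov model with a handful of discrete states cannot easily represent. Every rate, growth term and threshold below is illustrative, not a value from the trial:

        import random

        def person(screened, horizon=30):
            """One man followed from age 65 under illustrative placeholder parameters."""
            diameter = random.gauss(2.0, 0.5)               # aortic diameter at 65 (cm)
            if screened and diameter >= 3.0:
                return "surveillance"                       # detected AAA enters follow-up
            for _ in range(horizon):                        # yearly cycle of events
                diameter += random.gauss(0.1, 0.05)         # individual growth trajectory
                if diameter > 5.5 and random.random() < 0.05:
                    return "rupture"                        # rupture hazard above threshold
            return "event-free"

        for arm in (True, False):
            outcomes = [person(screened=arm) for _ in range(10_000)]
            print(arm, outcomes.count("rupture") / len(outcomes))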

  15. Simulating an extreme over-the-horizon optical propagation event over Lake Michigan using a coupled mesoscale modeling and ray tracing framework

    NARCIS (Netherlands)

    Basu, S.

    2017-01-01

    Accurate simulation and forecasting of over-the-horizon propagation events are essential for various civilian and defense applications. We demonstrate the prowess of a newly proposed coupled mesoscale modeling and ray tracing framework in reproducing such an event. Wherever possible, routinely

  16. Out-of-order parallel discrete event simulation for electronic system-level design

    CERN Document Server

    Chen, Weiwei

    2014-01-01

    This book offers readers a set of new approaches, tools, and techniques for facing the challenges of parallelization in the design of embedded systems. It provides an advanced parallel simulation infrastructure for efficient and effective system-level model validation and development so as to build better products in less time. Since parallel discrete event simulation (PDES) has the potential to exploit the underlying parallel computational capability in today's multi-core simulation hosts, the author begins by reviewing the parallelization of discrete event simulation, identifying...

  17. Mars Exploration Rover Terminal Descent Mission Modeling and Simulation

    Science.gov (United States)

    Raiszadeh, Behzad; Queen, Eric M.

    2004-01-01

    Because of NASA's added reliance on simulation for successful interplanetary missions, the MER mission has developed a detailed entry, descent, and landing (EDL) trajectory modeling and simulation capability. This paper summarizes how the MER EDL sequence of events is modeled, verification of the methods used, and the inputs. This simulation is built upon a multibody parachute trajectory simulation tool developed in POST II that accurately simulates the trajectory of multiple vehicles in flight with interacting forces. In this model the parachute and the suspended bodies are treated as 6 Degree-of-Freedom (6 DOF) bodies. The terminal descent phase of the mission consists of several EDL events, such as parachute deployment, heatshield separation, deployment of the lander from the backshell, deployment of the airbags, RAD firings, TIRS firings, etc. For an accurate, reliable simulation these events need to be modeled seamlessly and robustly so that the simulations will remain numerically stable during Monte-Carlo runs. This paper also summarizes how the events have been modeled, the numerical issues, and modeling challenges.

  18. Event-by-event simulation of quantum phenomena: Application to Einstein-Podolsky-Rosen-Bohm experiments

    NARCIS (Netherlands)

    De Raedt, H.; De Raedt, K.; Michielsen, K.; Keimpema, K.; Miyashita, S.

    We review the data gathering and analysis procedure used in real Einstein-Podolsky-Rosen-Bohm experiments with photons and we illustrate the procedure by analyzing experimental data. Based on this analysis, we construct event-based computer simulation models in which every essential element in the

  19. Rare event simulation in radiation transport

    International Nuclear Information System (INIS)

    Kollman, C.

    1993-10-01

    This dissertation studies methods for estimating extremely small probabilities by Monte Carlo simulation. Problems in radiation transport typically involve estimating very rare events or the expected value of a random variable which is with overwhelming probability equal to zero. These problems often have high dimensional state spaces and irregular geometries so that analytic solutions are not possible. Monte Carlo simulation must be used to estimate the radiation dosage being transported to a particular location. If the area is well shielded the probability of any one particular particle getting through is very small. Because of the large number of particles involved, even a tiny fraction penetrating the shield may represent an unacceptable level of radiation. It therefore becomes critical to be able to accurately estimate this extremely small probability. Importance sampling is a well known technique for improving the efficiency of rare event calculations. Here, a new set of probabilities is used in the simulation runs. The results are multiplied by the likelihood ratio between the true and simulated probabilities so as to keep the estimator unbiased. The variance of the resulting estimator is very sensitive to which new set of transition probabilities is chosen. It is shown that a zero variance estimator does exist, but that its computation requires exact knowledge of the solution. A simple random walk with an associated killing model for the scatter of neutrons is introduced. Large deviation results for optimal importance sampling in random walks are extended to the case where killing is present. An adaptive "learning" algorithm for implementing importance sampling is given for more general Markov chain models of neutron scatter. For finite state spaces this algorithm is shown to give, with probability one, a sequence of estimates converging exponentially fast to the true solution.
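
    The likelihood-ratio reweighting described here can be illustrated with a toy example; the sketch below estimates a simple exponential tail probability rather than a transport problem, and the tilting parameter is illustrative. Note how the choice of biased density governs the variance, which is exactly the sensitivity the author analyzes.

```python
import random, math

# Toy importance sampling: estimate P(X >= b) for X ~ Exp(1) with b large,
# by sampling from Exp(lam) instead and reweighting with the likelihood ratio.
def is_estimate(b: float, lam: float, n: int, seed: int = 0) -> float:
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.expovariate(lam)            # draw from the biased density g
        if x >= b:                          # rare event under the true density f
            # likelihood ratio f(x)/g(x) = exp(-x) / (lam * exp(-lam*x))
            total += math.exp(-x) / (lam * math.exp(-lam * x))
    return total / n

b = 20.0
print("true   :", math.exp(-b))
print("naive  :", is_estimate(b, 1.0, 100_000))      # lam=1 is plain MC: ~no hits
print("tilted :", is_estimate(b, 1.0 / b, 100_000))  # biased toward the rare event
```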

  20. Modeled seasonality of glacial abrupt climate events

    Energy Technology Data Exchange (ETDEWEB)

    Flueckiger, Jacqueline [Institute of Arctic and Alpine Research, University of Colorado, Boulder, CO (United States); Environmental Physics, Institute of Biogeochemistry and Pollutant Dynamics, ETH Zuerich, Zurich (Switzerland); Knutti, Reto [Institute for Atmospheric and Climate Science, ETH Zuerich, Zurich (Switzerland); White, James W.C. [Institute of Arctic and Alpine Research, University of Colorado, Boulder, CO (United States); Renssen, Hans [Vrije Universiteit Amsterdam, Faculty of Earth and Life Sciences, Amsterdam (Netherlands)

    2008-11-15

    Greenland ice cores, as well as many other paleo-archives from the northern hemisphere, recorded a series of 25 warm interstadial events, the so-called Dansgaard-Oeschger (D-O) events, during the last glacial period. We use the three-dimensional coupled global ocean-atmosphere-sea ice model ECBILT-CLIO and force it with freshwater input into the North Atlantic to simulate abrupt glacial climate events, which we use as analogues for D-O events. We focus our analysis on the Northern Hemisphere. The simulated events show large differences in the regional and seasonal distribution of the temperature and precipitation changes. While the temperature changes in high northern latitudes and in the North Atlantic region are dominated by winter changes, the largest temperature increases in most other land regions are seen in spring. Smallest changes over land are found during the summer months. Our model simulations also demonstrate that the temperature and precipitation change patterns for different intensifications of the Atlantic meridional overturning circulation are not linear. The extent of the transitions varies, and local non-linearities influence the amplitude of the annual mean response as well as the response in different seasons. Implications for the interpretation of paleo-records are discussed. (orig.)

  1. Comparison of discrete event simulation tools in an academic environment

    Directory of Open Access Journals (Sweden)

    Mario Jadrić

    2014-12-01

    A new research model for simulation software evaluation is proposed, consisting of three main categories of criteria: the modeling capabilities, the simulation capabilities, and the input/output analysis possibilities of the explored tools, each with respective sub-criteria. Using the presented model, two discrete event simulation tools are evaluated in detail using a task-centred scenario. Both tools (Arena and ExtendSim) were used for teaching discrete event simulation in preceding academic years. With the aim of inspecting their effectiveness and helping us determine which tool is more suitable for students, i.e., for academic purposes, we used a simple simulation model of entities competing for limited resources. The main goal was to measure subjective (primarily attitude) and objective indicators while the tools were used on the same simulation scenario. The subjects were first-year students of the Master's programme in Information Management at the Faculty of Economics in Split taking a course in Business Process Simulations (BPS). In a controlled environment (a computer lab), two groups of students were given detailed, step-by-step instructions for building models using both tools, first using ExtendSim and then Arena, or vice versa. Subjective indicators (students' attitudes) were collected using an online survey completed immediately upon building each model. The subjective indicators primarily include students' personal estimations of Arena and ExtendSim capabilities/features for model building, model simulation and result analysis. Objective indicators were measured using specialised software that logs information on users' behavior while performing a particular task on their computer, such as the distance crossed by the mouse during model building, the number of mouse clicks, the usage of the mouse wheel and the speed achieved. The results indicate that ExtendSim is clearly preferred to Arena with regard to the subjective indicators, while the objective indicators are

  2. Discrete event simulation modelling of patient service management with Arena

    Science.gov (United States)

    Guseva, Elena; Varfolomeyeva, Tatyana; Efimova, Irina; Movchan, Irina

    2018-05-01

    This paper describes a simulation modeling methodology intended to aid in solving practical problems in the research and analysis of complex systems. The paper gives a review of simulation platforms and an example of simulation model development with Arena 15.0 (Rockwell Automation). The provided example of a simulation model for patient service management helps to evaluate the workload of the clinic doctors, determine the number of general practitioners, surgeons, traumatologists and other specialized doctors required for patient service, and develop recommendations to ensure timely delivery of medical care and improve the efficiency of the clinic operation.
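
    A minimal sketch of this kind of staffing question is shown below, assuming a simple M/M/c clinic with hypothetical arrival and consultation rates; this is not the authors' Arena model, only an illustration of how waiting time responds to the number of doctors.

```python
import heapq, random

# Hypothetical M/M/c clinic: c doctors, Poisson arrivals, exponential service.
def simulate_clinic(doctors: int, arrival_rate: float, service_rate: float,
                    n_patients: int, seed: int = 0) -> float:
    rng = random.Random(seed)
    free, waits, queue = doctors, [], []    # queue holds arrival times
    events = [(rng.expovariate(arrival_rate), "arrival")]
    arrivals = 0
    while events:
        t, kind = heapq.heappop(events)
        if kind == "arrival":
            arrivals += 1
            if arrivals < n_patients:       # schedule the next arrival
                heapq.heappush(events, (t + rng.expovariate(arrival_rate), "arrival"))
            queue.append(t)
        else:                               # a doctor finishes a consultation
            free += 1
        while free and queue:               # start consultations for waiters
            free -= 1
            waits.append(t - queue.pop(0))
            heapq.heappush(events, (t + rng.expovariate(service_rate), "departure"))
    return sum(waits) / len(waits)

for c in (3, 4, 5):
    print(c, "doctors -> mean wait", round(simulate_clinic(c, 8.0, 3.0, 20_000), 3))
```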

  3. Device simulation of charge collection and single-event upset

    International Nuclear Information System (INIS)

    Dodd, P.E.

    1996-01-01

    In this paper the author reviews the current status of device simulation of ionizing-radiation-induced charge collection and single-event upset (SEU), with an emphasis on significant results of recent years. The author presents an overview of device-modeling techniques applicable to the SEU problem and the unique challenges this task presents to the device modeler. He examines unloaded simulations of radiation-induced charge collection in simple p/n diodes, SEU in dynamic random access memories (DRAMs), and SEU in static random access memories (SRAMs). The author concludes with a few thoughts on future issues likely to confront the SEU device modeler.

  4. Analysis of manufacturing based on object oriented discrete event simulation

    Directory of Open Access Journals (Sweden)

    Eirik Borgen

    1990-01-01

    This paper describes SIMMEK, a computer-based tool for performing analysis of manufacturing systems, developed at the Production Engineering Laboratory, NTH-SINTEF. Its main use will be in the analysis of job shop type manufacturing, but certain facilities make it suitable for FMS as well as production line manufacturing. This type of simulation is very useful in the analysis of any type of change that occurs in a manufacturing system. These changes may be investments in new machines or equipment, a change in layout, a change in product mix, use of late shifts, etc. The effects these changes have on, for instance, the throughput, the amount of WIP, the costs or the net profit can be analysed, and this can be done before the changes are made and without disturbing the real system. Unlike other tools for the analysis of manufacturing systems, simulation takes into consideration uncertainty in arrival rates, process and operation times, and machine availability. It also shows the interaction effects that a job which is late at one machine has on the remaining machines in its route through the layout. It is these effects that cause every production plan not to be fulfilled completely. SIMMEK is based on discrete event simulation, and the modeling environment is object oriented. The object-oriented models are transformed by an object linker into data structures executable by the simulation kernel. The processes of the entity objects, i.e. the products, are broken down into events and put into an event list. The user-friendly graphical modeling environment makes it possible for end users to build models in a quick and reliable way, using terms from manufacturing. Various tests and a check of model logic are helpful functions when testing the validity of the models. Integration with software packages with business graphics and statistical functions is convenient in the result presentation phase.
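
    The event-list mechanism described here, with entity processes broken down into time-stamped events, is the core of any DES kernel. A minimal sketch follows; the API is hypothetical and not SIMMEK's.

```python
import heapq
from typing import Callable

class EventList:
    """Minimal future-event-list kernel: events are (time, handler) pairs."""
    def __init__(self):
        self._now = 0.0
        self._heap = []             # (time, sequence number, handler)
        self._seq = 0               # tie-breaker preserves insertion order

    @property
    def now(self) -> float:
        return self._now

    def schedule(self, delay: float, handler: Callable[[], None]) -> None:
        heapq.heappush(self._heap, (self._now + delay, self._seq, handler))
        self._seq += 1

    def run(self, until: float) -> None:
        while self._heap and self._heap[0][0] <= until:
            self._now, _, handler = heapq.heappop(self._heap)
            handler()               # an event may schedule further events

# Usage: a machine that finishes a job every 2.5 time units.
sim = EventList()
def finish_job():
    print(f"t={sim.now:.1f}: job finished")
    sim.schedule(2.5, finish_job)
sim.schedule(2.5, finish_job)
sim.run(until=10)
```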

  5. A Simbol-X Event Simulator

    International Nuclear Information System (INIS)

    Puccetti, S.; Giommi, P.; Fiore, F.

    2009-01-01

    The ASI Science Data Center (ASDC) has developed an X-ray event simulator to support users (and team members) in the simulation of data taken with the two cameras on board the Simbol-X X-ray telescope. The Simbol-X simulator is very fast and flexible compared to a ray-tracing simulator. These properties make it well suited to supporting users in planning proposals, comparing real data with theoretical expectations, and quickly detecting unexpected features. We present here the simulator outline and a few examples of simulated data.

  6. A Simbol-X Event Simulator

    Science.gov (United States)

    Puccetti, S.; Fiore, F.; Giommi, P.

    2009-05-01

    The ASI Science Data Center (ASDC) has developed an X-ray event simulator to support users (and team members) in the simulation of data taken with the two cameras on board the Simbol-X X-ray telescope. The Simbol-X simulator is very fast and flexible compared to a ray-tracing simulator. These properties make it well suited to supporting users in planning proposals, comparing real data with theoretical expectations, and quickly detecting unexpected features. We present here the simulator outline and a few examples of simulated data.

  7. Hygrothermal modelling of flooding events within historic buildings

    NARCIS (Netherlands)

    Huijbregts, Z.; Schellen, H.L.; Schijndel, van A.W.M.; Blades, N.

    2014-01-01

    Flooding events pose a high risk to valuable monumental buildings and their interiors. Due to higher river discharges and sea level rise, flooding events may occur more often in future. Hygrothermal building simulation models can be applied to investigate the impact of a flooding event on the

  8. Hygrothermal modelling of flooding events within historic buildings

    NARCIS (Netherlands)

    Huijbregts, Z.; Schijndel, van A.W.M.; Schellen, H.L.; Blades, N.; Mahdavi, A.; Mertens, B.

    2013-01-01

    Flooding events pose a high risk to valuable monumental buildings and their interiors. Due to higher river discharges and sea level rise, flooding events may occur more often in future. Hygrothermal building simulation models can be applied to investigate the impact of a flooding event on the

  9. Event-Based Corpuscular Model for Quantum Optics Experiments

    NARCIS (Netherlands)

    Michielsen, K.; Jin, F.; Raedt, H. De

    A corpuscular simulation model of optical phenomena that does not require the knowledge of the solution of a wave equation of the whole system and reproduces the results of Maxwell's theory by generating detection events one-by-one is presented. The event-based corpuscular model is shown to give a

  10. Simulation of overpressure events with a Laguna Verde model for the RELAP code to conditions of extended power up rate

    International Nuclear Information System (INIS)

    Rodriguez H, A.; Araiza M, E.; Fuentes M, L.; Ortiz V, J.

    2012-10-01

    In this work the main results of the simulation of overpressure events are presented, using a model of the Laguna Verde nuclear power plant developed for the RELAP/SCDAPSIM code. The starting point is a Laguna Verde model representing a steady state at conditions similar to operation of the plant with Extended Power Uprate (EPU). The simulated pressure transients are compared with those documented in the Laguna Verde Final Safety Analysis Report (FSAR). Results are shown for the turbine trip transient with and without main turbine bypass, and for the closure of all main steam isolation valves. A preliminary simulation was performed and, based on its results, some adjustments were made for EPU operation, taking into account the plant's Technical Specifications for Operation. The final simulation results were compared against and analyzed with the FSAR content. The plant response to the transients, as reflected in the RELAP model, was satisfactory. Finally, comments on improving the model are included, for example regarding the response times of the plant's protection and mitigation systems. (Author)

  11. Synchronous Parallel System for Emulation and Discrete Event Simulation

    Science.gov (United States)

    Steinman, Jeffrey S. (Inventor)

    2001-01-01

    A synchronous parallel system for emulation and discrete event simulation having parallel nodes responds to received messages at each node by generating event objects having individual time stamps, stores only the changes to the state variables of the simulation object attributable to the event object and produces corresponding messages. The system refrains from transmitting the messages and changing the state variables while it determines whether the changes are superseded, and then stores the unchanged state variables in the event object for later restoral to the simulation object if called for. This determination preferably includes sensing the time stamp of each new event object and determining which new event object has the earliest time stamp as the local event horizon, determining the earliest local event horizon of the nodes as the global event horizon, and ignoring events whose time stamps are less than the global event horizon. Host processing between the system and external terminals enables such a terminal to query, monitor, command or participate with a simulation object during the simulation process.
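
    The horizon bookkeeping described in this abstract can be sketched as follows. This is an illustrative reading of the published description, not the patented implementation: each node's earliest new event time stamp is its local event horizon, the minimum over nodes is the global event horizon (GEH), and events up to the GEH can be committed safely.

```python
# Illustrative event-horizon bookkeeping (hypothetical, simplified).
def global_event_horizon(new_events_per_node: list[list[float]]) -> float:
    """Earliest new time stamp per node -> minimum over all nodes."""
    local_horizons = [min(stamps) for stamps in new_events_per_node if stamps]
    return min(local_horizons) if local_horizons else float("inf")

def commit_window(pending: list[float], geh: float):
    """Split pending events into those safe to commit and those deferred."""
    committed = sorted(t for t in pending if t <= geh)
    deferred = [t for t in pending if t > geh]
    return committed, deferred

new_events = [[4.0, 7.5], [3.2, 9.1], [5.0]]     # per-node new event stamps
geh = global_event_horizon(new_events)           # -> 3.2
pending = [2.1, 3.0, 3.2, 4.0, 6.6]
print(commit_window(pending, geh))               # events <= 3.2 are safe
```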

  12. MHD simulation of the Bastille day event

    Energy Technology Data Exchange (ETDEWEB)

    Linker, Jon, E-mail: linkerj@predsci.com; Torok, Tibor; Downs, Cooper; Lionello, Roberto; Titov, Viacheslav; Caplan, Ronald M.; Mikić, Zoran; Riley, Pete [Predictive Science Inc., 9990 Mesa Rim Road, Suite 170, San Diego CA, USA 92121 (United States)

    2016-03-25

    We describe a time-dependent, thermodynamic, three-dimensional MHD simulation of the July 14, 2000 coronal mass ejection (CME) and flare. The simulation starts with a background corona developed using an MDI-derived magnetic map for the boundary condition. Flux ropes using the modified Titov-Demoulin (TDm) model are used to energize the pre-event active region, which is then destabilized by photospheric flows that cancel flux near the polarity inversion line. More than 10^33 ergs are impulsively released in the simulated eruption, driving a CME at 1500 km/s, close to the observed speed of 1700 km/s. The post-flare emission in the simulation is morphologically similar to the observed post-flare loops. The resulting flux rope that propagates to 1 AU is similar in character to the flux rope observed at 1 AU, but the simulated ICME center passes 15° north of Earth.

  13. Simulating events

    Energy Technology Data Exchange (ETDEWEB)

    Ferretti, C; Bruzzone, L [Techint Italimpianti, Milan (Italy)

    2000-06-01

    The Petacalco marine terminal on the Pacific coast, in the harbour of Lazaro Cardenas (Michoacan) in Mexico, provides coal to the thermoelectric power plant at Pdte Plutarco Elias Calles in the port area. The plant is being converted from oil to coal firing to generate 2100 MW of power. The article describes the layout of the terminal and the equipment employed in the unloading, coal stacking and coal handling areas and in the receiving area at the power plant. The contractor, Techint Italimpianti, has developed a software system for marine terminal management, MHATIS, which is nearly complete. The discrete event simulator with its graphic interface provides a real-time decision support system for simulating changes to the terminal operations and evaluating their impacts. The article describes how MHATIS is used. 7 figs.

  14. Discrete Event Simulation-Based Resource Modelling in Health Technology Assessment.

    Science.gov (United States)

    Salleh, Syed; Thokala, Praveen; Brennan, Alan; Hughes, Ruby; Dixon, Simon

    2017-10-01

    The objective of this article was to conduct a systematic review of published research on the use of discrete event simulation (DES) for resource modelling (RM) in health technology assessment (HTA). RM is broadly defined as incorporating and measuring the effects of constraints on physical resources (e.g. beds, doctors, nurses) in HTA models. Systematic literature searches were conducted in academic databases (JSTOR, SAGE, SPRINGER, SCOPUS, IEEE, Science Direct, PubMed, EMBASE) and grey literature (Google Scholar, NHS journal library), enhanced by manual searches (i.e. reference list checking, citation searching and hand-searching techniques). The search strategy yielded 4117 potentially relevant citations. Following the screening and manual searches, ten articles were included. Reviewing these articles provided insights into the applications of RM: firstly, different types of economic analyses, model settings, RM and cost-effectiveness analysis (CEA) outcomes were identified. Secondly, variation in the characteristics of the constraints, such as the types and nature of the constraints and the sources of data for the constraints, was identified. Thirdly, it was found that including the effects of constraints caused the CEA results to change in these articles. The review found that DES proved to be an effective technique for RM, but only a small number of studies applied it in HTA. These studies showed the important consequences of modelling physical constraints and point to the need for a framework to be developed to guide future applications of this approach.

  15. Agent Based Simulation of Group Emotions Evolution and Strategy Intervention in Extreme Events

    Directory of Open Access Journals (Sweden)

    Bo Li

    2014-01-01

    Agent-based simulation has become a prominent approach in the computational modeling and analysis of public emergency management in social science research. Group emotion evolution, information diffusion, and collective behavior selection make the study of extreme incidents a complex systems problem, which requires new methods for incident management and strategy evaluation. This paper studies group emotion evolution and intervention strategy effectiveness using an agent-based simulation method. By employing a computational experimentation methodology, we construct group emotion evolution as a complex system and test the effects of three strategies. In addition, an events-chain model is proposed to capture the accumulated influence of temporally successive events. Each strategy is examined through three simulation experiments, including two constructed scenarios and a real case study. We show how various strategies can impact group emotion evolution in terms of complex emergence and emotion accumulation in extreme events. This paper also provides an effective method for using agent-based simulation to study complex collective behavior evolution problems in the extreme incident, emergency, and security domains.

  16. Use cases of discrete event simulation. Appliance and research

    Energy Technology Data Exchange (ETDEWEB)

    Bangsow, Steffen (ed.)

    2012-11-01

    Use Cases of Discrete Event Simulation includes case studies from various important industries such as the automotive, aerospace, robotics and production industries, written by leading experts in the field. Over the last decades Discrete Event Simulation has conquered many different application areas. This trend is driven, on the one hand, by an ever wider use of this technology in different fields of science and, on the other hand, by an incredibly creative use of available software programs by dedicated experts. This book contains articles from scientists and experts from 10 countries. They illuminate the breadth of application of this technology and the quality of problems solved using Discrete Event Simulation. Practical applications of simulation dominate in the present book. The book is aimed at researchers and students who deal with Discrete Event Simulation in their work and who want to inform themselves about current applications. By focusing on discrete event simulation, this book can also serve as a source of inspiration for practitioners solving specific problems during their work. For decision makers who face the question of introducing discrete event simulation for planning support and optimization, this book provides orientation on which specific problems can be solved with the help of Discrete Event Simulation within an organization.

  17. Application of discrete event simulation to MRS design

    International Nuclear Information System (INIS)

    Bali, M.; Standley, W.

    1993-01-01

    The application of discrete event simulation to the Monitored Retrievable Storage (MRS) material handling operations supported the MRS conceptual design effort and established a set of tools for use during MRS detail design and license application. The effort to develop a design analysis tool to support the MRS project started in 1991. The MRS simulation has so far identified potential savings and suggested methods of improving operations to enhance throughput. From the outset, simulation aided the MRS conceptual design effort through the investigation of alternative cask handling operations and the sizing and sharing of expensive equipment. The simulation also helped analyze the operability of the current MRS design under various waste acceptance scenarios. Throughout the simulation effort, model development and experimentation resulted in the early identification and resolution of several design and operational issues.

  18. Characterizing Drought Events from a Hydrological Model Ensemble

    Science.gov (United States)

    Smith, Katie; Parry, Simon; Prudhomme, Christel; Hannaford, Jamie; Tanguy, Maliko; Barker, Lucy; Svensson, Cecilia

    2017-04-01

    Hydrological droughts are a slow onset natural hazard that can affect large areas. Within the United Kingdom there have been eight major drought events over the last 50 years, with several events acting at the continental scale, and covering the entire nation. Many of these events have lasted several years and had significant impacts on agriculture, the environment and the economy. Generally in the UK, due to a northwest-southeast gradient in rainfall and relief, as well as varying underlying geology, droughts tend to be most severe in the southeast, which can threaten water supplies to the capital in London. With the impacts of climate change likely to increase the severity and duration of drought events worldwide, it is crucial that we gain an understanding of the characteristics of some of the longer and more extreme droughts of the 19th and 20th centuries, so we may utilize this information in planning for the future. Hydrological models are essential both for reconstructing such events that predate streamflow records, and for use in drought forecasting. However, whilst the uncertainties involved in modelling hydrological extremes on the flooding end of the flow regime have been studied in depth over the past few decades, the uncertainties in simulating droughts and low flow events have not yet received such rigorous academic attention. The "Cascade of Uncertainty" approach has been applied to explore uncertainty and coherence across simulations of notable drought events from the past 50 years using the airGR family of daily lumped catchment models. Parameter uncertainty has been addressed using a Latin Hypercube sampled experiment of 500,000 parameter sets per model (GR4J, GR5J and GR6J), over more than 200 catchments across the UK. The best performing model parameterisations, determined using a multi-objective function approach, have then been taken forward for use in the assessment of the impact of model parameters and model structure on drought event
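
    A minimal sketch of the Latin hypercube parameter sampling mentioned above is shown below; the parameter names and ranges are hypothetical placeholders, not the airGR parameters.

```python
import random

# Latin hypercube sampling: one stratified draw per equal-probability
# interval in each dimension, randomly paired across dimensions.
def latin_hypercube(n_samples: int, bounds: dict, seed: int = 0) -> list[dict]:
    rng = random.Random(seed)
    columns = {}
    for name, (lo, hi) in bounds.items():
        # one draw from each of n strata, then shuffle to decorrelate pairing
        column = [lo + (hi - lo) * (i + rng.random()) / n_samples
                  for i in range(n_samples)]
        rng.shuffle(column)
        columns[name] = column
    return [{name: columns[name][i] for name in bounds} for i in range(n_samples)]

# Hypothetical parameter names and ranges for illustration only.
bounds = {"store_capacity_mm": (10.0, 500.0), "routing_days": (0.5, 10.0)}
for params in latin_hypercube(5, bounds)[:3]:
    print(params)
```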

  19. Multiscale models and stochastic simulation methods for computing rare but key binding events in cell biology

    Energy Technology Data Exchange (ETDEWEB)

    Guerrier, C. [Applied Mathematics and Computational Biology, IBENS, Ecole Normale Supérieure, 46 rue d' Ulm, 75005 Paris (France); Holcman, D., E-mail: david.holcman@ens.fr [Applied Mathematics and Computational Biology, IBENS, Ecole Normale Supérieure, 46 rue d' Ulm, 75005 Paris (France); Mathematical Institute, Oxford OX2 6GG, Newton Institute (United Kingdom)

    2017-07-01

    The main difficulty in simulating diffusion processes at a molecular level in cell microdomains is due to the multiple scales involved, from nano- to micrometers. Few to many particles have to be simulated and simultaneously tracked while they are exploring a large portion of the space in search of small binding targets, such as buffers or active sites. Bridging the small and large spatial scales is achieved by rare events representing Brownian particles finding small targets, characterized by long-time distributions. These rare events are the bottleneck of numerical simulations. A naive stochastic simulation requires running many Brownian particles together, which is computationally greedy and inefficient. Solving the associated partial differential equations is also difficult due to the time-dependent boundary conditions, narrow passages and mixed boundary conditions at small windows. We present here two reduced modeling approaches for a fast computation of diffusing fluxes in microdomains. The first approach is based on Markov mass-action law equations coupled to a Markov chain. The second is a Gillespie method based on narrow escape theory for coarse-graining the geometry of the domain into Poissonian rates. The main application concerns diffusion in cell biology, where we compute as an example the distribution of arrival times of calcium ions to small hidden targets that trigger vesicular release.

  20. Multiscale models and stochastic simulation methods for computing rare but key binding events in cell biology

    International Nuclear Information System (INIS)

    Guerrier, C.; Holcman, D.

    2017-01-01

    The main difficulty in simulating diffusion processes at a molecular level in cell microdomains is due to the multiple scales involved, from nano- to micrometers. Few to many particles have to be simulated and simultaneously tracked while they are exploring a large portion of the space in search of small binding targets, such as buffers or active sites. Bridging the small and large spatial scales is achieved by rare events representing Brownian particles finding small targets, characterized by long-time distributions. These rare events are the bottleneck of numerical simulations. A naive stochastic simulation requires running many Brownian particles together, which is computationally greedy and inefficient. Solving the associated partial differential equations is also difficult due to the time-dependent boundary conditions, narrow passages and mixed boundary conditions at small windows. We present here two reduced modeling approaches for a fast computation of diffusing fluxes in microdomains. The first approach is based on Markov mass-action law equations coupled to a Markov chain. The second is a Gillespie method based on narrow escape theory for coarse-graining the geometry of the domain into Poissonian rates. The main application concerns diffusion in cell biology, where we compute as an example the distribution of arrival times of calcium ions to small hidden targets that trigger vesicular release.
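
    The second approach, coarse-graining rare binding events into Poissonian rates and advancing time with exponential waiting times, can be sketched as follows. The rates below are illustrative placeholders, not values derived from narrow escape theory.

```python
import random

# Minimal Gillespie-style sketch: each free particle binds a small target
# at a coarse-grained Poissonian rate (illustrative rate, hypothetical model).
def gillespie_binding(n_particles: int, bind_rate: float, t_end: float,
                      seed: int = 0) -> list[float]:
    """Return the times at which binding events occur."""
    rng = random.Random(seed)
    t, free, times = 0.0, n_particles, []
    while free and t < t_end:
        total_rate = free * bind_rate          # propensity of the next event
        t += rng.expovariate(total_rate)       # exponential waiting time
        if t >= t_end:
            break
        free -= 1                              # one particle finds a target
        times.append(t)
    return times

times = gillespie_binding(n_particles=50, bind_rate=0.02, t_end=200.0)
print(f"{len(times)} binding events"
      + (f"; first at t={times[0]:.2f}" if times else ""))
```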

  1. ATLAS simulated black hole event

    CERN Multimedia

    Pequenão, J

    2008-01-01

    The simulated collision event shown is viewed along the beampipe. The event is one in which a microscopic black hole was produced in the collision of two protons (not shown). The microscopic black hole decayed immediately into many particles. The colors of the tracks show the different types of particles emerging from the collision (at the center).

  2. Event-by-event simulation of single-neutron experiments to test uncertainty relations

    International Nuclear Information System (INIS)

    Raedt, H De; Michielsen, K

    2014-01-01

    Results from a discrete-event simulation of a recent single-neutron experiment that tests Ozawa's generalization of Heisenberg's uncertainty relation are presented. The event-based simulation algorithm reproduces the results of the quantum theoretical description of the experiment but does not require the knowledge of the solution of a wave equation, nor does it rely on detailed concepts of quantum theory. In particular, the data from these non-quantum simulations satisfy uncertainty relations derived in the context of quantum theory. (paper)

  3. Use Cases of Discrete Event Simulation Appliance and Research

    CERN Document Server

    2012-01-01

    Over the last decades Discrete Event Simulation has conquered many different application areas. This trend is driven, on the one hand, by an ever wider use of this technology in different fields of science and, on the other hand, by an incredibly creative use of available software programs by dedicated experts. This book contains articles from scientists and experts from 10 countries. They illuminate the breadth of application of this technology and the quality of problems solved using Discrete Event Simulation. Practical applications of simulation dominate in the present book. The book is aimed at researchers and students who deal with Discrete Event Simulation in their work and who want to inform themselves about current applications. By focusing on discrete event simulation, this book can also serve as a source of inspiration for practitioners solving specific problems during their work. Decision makers who deal with the question of the introduction of discrete event simulation for planning support and o...

  4. Event-by-event simulation of a quantum delayed-choice experiment

    NARCIS (Netherlands)

    Donker, Hylke C.; De Raedt, Hans; Michielsen, Kristel

    2014-01-01

    The quantum delayed-choice experiment of Tang et al. (2012) is simulated on the level of individual events, without making reference to concepts of quantum theory and without solving a wave equation. The simulation results are in excellent agreement with the quantum theoretical predictions of this

  5. PRODUCTION SYSTEM MODELING AND SIMULATION USING DEVS FORMALISM

    OpenAIRE

    Amaya Hurtado, Darío; Castillo Estepa, Ricardo Andrés; Avilés Montaño, Óscar Fernando; Ramos Sandoval, Olga Lucía

    2014-01-01

    This article presents the Discrete Event System Specification (DEVS) formalism, in its atomic and coupled configurations, as used for discrete event systems modeling and simulation. Initially this work describes the analysis of discrete event systems concepts and their applicability. Then a comprehensive description of the structure of the DEVS formalism is presented, in order to model and simulate an industrial process, taking into account changes in parameters such as process service time, each

  6. Program For Parallel Discrete-Event Simulation

    Science.gov (United States)

    Beckman, Brian C.; Blume, Leo R.; Geiselman, John S.; Presley, Matthew T.; Wedel, John J., Jr.; Bellenot, Steven F.; Diloreto, Michael; Hontalas, Philip J.; Reiher, Peter L.; Weiland, Frederick P.

    1991-01-01

    The user does not have to add any special logic to aid in synchronization. The Time Warp Operating System (TWOS) computer program is a special-purpose operating system designed to support parallel discrete-event simulation. It is a complete implementation of the Time Warp mechanism and supports only simulations and other computations designed for virtual time. The Time Warp Simulator (TWSIM) subdirectory contains a sequential simulation engine that is interface-compatible with TWOS. TWOS and TWSIM are written in, and support simulations in, the C programming language.

  7. Dynamic information architecture system (DIAS) : multiple model simulation management

    International Nuclear Information System (INIS)

    Simunich, K. L.; Sydelko, P.; Dolph, J.; Christiansen, J.

    2002-01-01

    Dynamic Information Architecture System (DIAS) is a flexible, extensible, object-based framework for developing and maintaining complex multidisciplinary simulations of a wide variety of application contexts. The modeling domain of a specific DIAS-based simulation is determined by (1) software Entity (domain-specific) objects that represent the real-world entities that comprise the problem space (atmosphere, watershed, human), and (2) simulation models and other data processing applications that express the dynamic behaviors of the domain entities. In DIAS, models communicate only with Entity objects, never with each other. Each Entity object has a number of Parameter and Aspect (of behavior) objects associated with it. The Parameter objects contain the state properties of the Entity object. The Aspect objects represent the behaviors of the Entity object and how it interacts with other objects. DIAS extends the "Object" paradigm by abstraction of the object's dynamic behaviors, separating the "WHAT" from the "HOW." DIAS object class definitions contain an abstract description of the various aspects of the object's behavior (the WHAT), but no implementation details (the HOW). Separate DIAS models/applications carry the implementation of object behaviors (the HOW). Any model deemed appropriate, including existing legacy-type models written in other languages, can drive entity object behavior. The DIAS design promotes plug-and-play of alternative models, with minimal recoding of existing applications. The DIAS Context Builder object builds a construct or scenario for the simulation, based on developer specification and user inputs. Because DIAS is a discrete event simulation system, there is a Simulation Manager object through which all events are processed. Any class that registers to receive events must implement an event handler (method) to process the event during execution. Event handlers can schedule other events; create or remove Entities from the
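
    The Entity/Parameter/Aspect separation described above can be sketched in a few lines. The class names follow the abstract, but the code is a hypothetical toy, not DIAS itself; it only illustrates how behavior implementations can be plugged into entities while models never talk to each other directly.

```python
# Illustrative Entity/Aspect toy: state lives on the Entity ("WHAT"),
# interchangeable models implement the behavior ("HOW").
class Entity:
    def __init__(self, name: str):
        self.name = name
        self.parameters = {}        # state properties of the entity
        self.aspects = {}           # behavior name -> model implementing it

    def register_aspect(self, behavior: str, model) -> None:
        self.aspects[behavior] = model          # plug-and-play of models

    def perform(self, behavior: str) -> None:
        self.aspects[behavior].update(self)     # delegate the "HOW"

class SimpleEvaporationModel:
    """One interchangeable implementation of a hypothetical 'evaporate' aspect."""
    def update(self, entity: Entity) -> None:
        entity.parameters["water_mm"] *= 0.95   # lose 5% per step

watershed = Entity("watershed")
watershed.parameters["water_mm"] = 100.0
watershed.register_aspect("evaporate", SimpleEvaporationModel())
for _ in range(3):
    watershed.perform("evaporate")
print(round(watershed.parameters["water_mm"], 2))   # 85.74
```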

  8. Simulation of the Tornado Event of 22 March, 2013 over ...

    Indian Academy of Sciences (India)

    An attempt has been made to simulate this rare event using the Weather Research and Forecasting (WRF) model. The model was run on a single domain at 9 km resolution for a period of 24 hours, starting at 0000 UTC on 22 March 2013. The meteorological conditions that led to the formation of this tornado have been analyzed.

  9. Event Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2001-01-01

    The purpose of this chapter is to discuss conceptual event modeling within a context of information modeling. Traditionally, information modeling has been concerned with the modeling of a universe of discourse in terms of information structures. However, most interesting universes of discourse are dynamic, and we present a modeling approach that can be used to model such dynamics. We characterize events as both information objects and change agents (Bækgaard 1997). When viewed as information objects, events are phenomena that can be observed and described. For example, borrow events in a library can

  10. Simulation and verification of transient events in large wind power installations

    Energy Technology Data Exchange (ETDEWEB)

    Soerensen, P.; Hansen, A.D.; Christensen, P.; Meritz, M.; Bech, J.; Bak-Jensen, B.; Nielsen, H.

    2003-10-01

    Models of wind power installations excited by transient events have been developed and verified. A number of cases have been investigated, including comparisons of simulations of a three-phase short circuit, validation against measurements of the tripping of a single wind turbine, islanding of a group of two wind turbines, and voltage steps caused by the tripping of wind turbines and by manual transformer tap-changing. A benchmark model is also presented, enabling the reader to test their own simulation results against results obtained with models developed in EMTDC and DIgSILENT. (au)

  11. Quality Improvement With Discrete Event Simulation: A Primer for Radiologists.

    Science.gov (United States)

    Booker, Michael T; O'Connell, Ryan J; Desai, Bhushan; Duddalwar, Vinay A

    2016-04-01

    The application of simulation software in health care has transformed quality and process improvement. Specifically, software based on discrete-event simulation (DES) has shown the ability to improve radiology workflows and systems. Nevertheless, despite the successful application of DES in the medical literature, the power and value of simulation remain underutilized. For this reason, the basics of DES modeling are introduced, with specific attention to medical imaging. In an effort to provide readers with the tools necessary to begin their own DES analyses, the practical steps of choosing a software package and building a basic radiology model are discussed. In addition, three radiology system examples are presented, with accompanying DES models that assist in analysis and decision making. Through these simulations, we provide readers with an understanding of the theory, requirements, and benefits of implementing DES in their own radiology practices. Copyright © 2016 American College of Radiology. All rights reserved.

  12. Managing emergency department overcrowding via ambulance diversion: A discrete event simulation model

    Directory of Open Access Journals (Sweden)

    Chih-Hao Lin

    2015-01-01

    Conclusion: An input–throughput–output simulation model is proposed for simulating ED operation. The effectiveness of several ambulance diversion (AD) strategies in relieving ED overcrowding was assessed via computer simulations based on this model. With appropriate parameter settings, the model can represent medical resource providers of different scales. It is also feasible to extend the simulations to evaluate the effect of AD strategies on a community basis. The results may offer insights for making effective AD policies.

  13. Estimating ICU bed capacity using discrete event simulation.

    Science.gov (United States)

    Zhu, Zhecheng; Hen, Bee Hoon; Teow, Kiok Liang

    2012-01-01

    The intensive care unit (ICU) in a hospital caters for critically ill patients. The number of ICU beds has a direct impact on many aspects of hospital performance. A lack of ICU beds may cause ambulance diversion and surgery cancellation, while an excess of ICU beds may cause a waste of resources. This paper aims to develop a discrete event simulation (DES) model to help healthcare service providers determine the ICU bed capacity that strikes a balance between service level and cost-effectiveness. The DES model is developed to reflect the complex patient flow of the ICU system. Actual operational data, including emergency arrivals, elective arrivals and length of stay, are fed directly into the DES model to capture the variations in the system. The DES model is validated by open-box and black-box tests. The validated model is used to test two what-if scenarios of interest to the healthcare service providers: the number of ICU beds in service needed to meet a target rejection rate, and the extra ICU beds in service needed to meet demand growth. A 12-month period of actual operational data was collected from an ICU department with 13 ICU beds in service. Comparison between the simulation results and the actual situation shows that the DES model accurately captures the variations in the system and is flexible enough to simulate various what-if scenarios. DES helps healthcare service providers describe the current situation and simulate what-if scenarios for future planning.
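
    A minimal sketch of the bed-capacity question follows: an Erlang-loss-style DES in which a patient is rejected if every bed is busy. The arrival rate and length of stay below are illustrative, not the hospital's data or the authors' model.

```python
import heapq, random

# Hypothetical ICU loss model: Poisson arrivals, exponential length of stay,
# rejection when all beds are occupied (illustrative rates only).
def rejection_rate(beds: int, arrivals_per_day: float, mean_los_days: float,
                   n_patients: int, seed: int = 0) -> float:
    rng = random.Random(seed)
    t, rejected = 0.0, 0
    discharges = []                 # heap of discharge times (occupied beds)
    for _ in range(n_patients):
        t += rng.expovariate(arrivals_per_day)
        while discharges and discharges[0] <= t:
            heapq.heappop(discharges)           # free beds whose stay ended
        if len(discharges) >= beds:
            rejected += 1                       # no bed available
        else:
            heapq.heappush(discharges, t + rng.expovariate(1.0 / mean_los_days))
    return rejected / n_patients

for beds in (11, 13, 15):
    print(beds, "beds -> rejection rate",
          round(rejection_rate(beds, 3.0, 3.5, 100_000), 3))
```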

  14. Rare Event Simulation in Radiation Transport

    Science.gov (United States)

    Kollman, Craig

    This dissertation studies methods for estimating extremely small probabilities by Monte Carlo simulation. Problems in radiation transport typically involve estimating very rare events or the expected value of a random variable which is with overwhelming probability equal to zero. These problems often have high dimensional state spaces and irregular geometries so that analytic solutions are not possible. Monte Carlo simulation must be used to estimate the radiation dosage being transported to a particular location. If the area is well shielded the probability of any one particular particle getting through is very small. Because of the large number of particles involved, even a tiny fraction penetrating the shield may represent an unacceptable level of radiation. It therefore becomes critical to be able to accurately estimate this extremely small probability. Importance sampling is a well known technique for improving the efficiency of rare event calculations. Here, a new set of probabilities is used in the simulation runs. The results are multiplied by the likelihood ratio between the true and simulated probabilities so as to keep our estimator unbiased. The variance of the resulting estimator is very sensitive to which new set of transition probabilities is chosen. It is shown that a zero variance estimator does exist, but that its computation requires exact knowledge of the solution. A simple random walk with an associated killing model for the scatter of neutrons is introduced. Large deviation results for optimal importance sampling in random walks are extended to the case where killing is present. An adaptive "learning" algorithm for implementing importance sampling is given for more general Markov chain models of neutron scatter. For finite state spaces this algorithm is shown to give, with probability one, a sequence of estimates converging exponentially fast to the true solution. In the final chapter, an attempt to generalize this algorithm to a continuous

  15. Dynamic information architecture system (DIAS) : multiple model simulation management.

    Energy Technology Data Exchange (ETDEWEB)

    Simunich, K. L.; Sydelko, P.; Dolph, J.; Christiansen, J.

    2002-05-13

    Dynamic Information Architecture System (DIAS) is a flexible, extensible, object-based framework for developing and maintaining complex multidisciplinary simulations of a wide variety of application contexts. The modeling domain of a specific DIAS-based simulation is determined by (1) software Entity (domain-specific) objects that represent the real-world entities that comprise the problem space (atmosphere, watershed, human), and (2) simulation models and other data processing applications that express the dynamic behaviors of the domain entities. In DIAS, models communicate only with Entity objects, never with each other. Each Entity object has a number of Parameter and Aspect (of behavior) objects associated with it. The Parameter objects contain the state properties of the Entity object. The Aspect objects represent the behaviors of the Entity object and how it interacts with other objects. DIAS extends the ''Object'' paradigm by abstraction of the object's dynamic behaviors, separating the ''WHAT'' from the ''HOW.'' DIAS object class definitions contain an abstract description of the various aspects of the object's behavior (the WHAT), but no implementation details (the HOW). Separate DIAS models/applications carry the implementation of object behaviors (the HOW). Any model deemed appropriate, including existing legacy-type models written in other languages, can drive entity object behavior. The DIAS design promotes plug-and-play of alternative models, with minimal recoding of existing applications. The DIAS Context Builder object builds a constructs or scenario for the simulation, based on developer specification and user inputs. Because DIAS is a discrete event simulation system, there is a Simulation Manager object with which all events are processed. Any class that registers to receive events must implement an event handler (method) to process the event during execution. Event handlers

  16. Event-based model diagnosis of rainfall-runoff model structures

    International Nuclear Information System (INIS)

    Stanzel, P.

    2012-01-01

    The objective of this research is a comparative evaluation of different rainfall-runoff model structures. Comparative model diagnostics facilitate the assessment of strengths and weaknesses of each model. The application of multiple models allows an analysis of simulation uncertainties arising from the selection of model structure, as compared with effects of uncertain parameters and precipitation input. Four different model structures, including conceptual and physically based approaches, are compared. In addition to runoff simulations, results for soil moisture and the runoff components of overland flow, interflow and base flow are analysed. Catchment runoff is simulated satisfactorily by all four model structures and shows only minor differences. Systematic deviations from runoff observations provide insight into model structural deficiencies. While physically based model structures capture some single runoff events better, they do not generally outperform conceptual model structures. Contributions to uncertainty in runoff simulations stemming from the choice of model structure show similar dimensions to those arising from parameter selection and the representation of precipitation input. Variations in precipitation mainly affect the general level and peaks of runoff, while different model structures lead to different simulated runoff dynamics. Large differences between the four analysed models are detected for simulations of soil moisture and, even more pronounced, runoff components. Soil moisture changes are more dynamical in the physically based model structures, which is in better agreement with observations. Streamflow contributions of overland flow are considerably lower in these models than in the more conceptual approaches. Observations of runoff components are rarely made and are not available in this study, but are shown to have high potential for an effective selection of appropriate model structures (author) [de

  17. Discrete event simulation for petroleum transfers involving harbors, refineries and pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Martins, Marcella S.R.; Lueders, Ricardo; Delgado, Myriam R.B.S. [Universidade Tecnologica Federal do Parana (UTFPR), Curitiba, PR (Brazil)

    2009-07-01

    Nowadays great effort is spent by companies to improve their logistics in terms of the programming of events that affect the production and distribution of products. In this context, simulation can be a valuable tool for evaluating different behaviors. The objective of this work is to build a discrete event simulation model for the scheduling of operational activities in complexes containing one harbor and two refineries interconnected by a pipeline infrastructure. The model was developed in the Arena package, based on three sub-models that control pier allocation, the loading of tanks, and transfers to refineries through pipelines. Preliminary results obtained for a given control policy show that profit can be calculated by taking into account many parameters such as oil costs on ships, pier usage, the over-stay of ships and interface costs. Such problems have already been considered in the literature, but using different strategies. All these factors should be considered in a real-world operation, where decision-making tools are necessary to obtain high returns. (author)

  18. Event simulation for the WA80 experiment

    International Nuclear Information System (INIS)

    Sorensen, S.P.

    1986-01-01

    The HIJET and LUND event generators are compared. It is concluded that for detector construction and design of experimental setups, the differences between the two models are marginal. The coverage of the WA80 setup in pseudorapidity and energy is demonstrated. The performance of some of the WA80 detectors (zero-degree calorimeter, wall calorimeter, multiplicity array, and SAPHIR lead-glass detector) is evaluated based on calculations with the LUND or the HIJET codes combined with codes simulating the detector responses. 9 refs., 3 figs

  19. SPEEDES - A multiple-synchronization environment for parallel discrete-event simulation

    Science.gov (United States)

    Steinman, Jeff S.

    1992-01-01

    Synchronous Parallel Environment for Emulation and Discrete-Event Simulation (SPEEDES) is a unified parallel simulation environment. It supports multiple-synchronization protocols without requiring users to recompile their code. When a SPEEDES simulation runs on one node, all the extra parallel overhead is removed automatically at run time. When the same executable runs in parallel, the user preselects the synchronization algorithm from a list of options. SPEEDES currently runs on UNIX networks and on the California Institute of Technology/Jet Propulsion Laboratory Mark III Hypercube. SPEEDES also supports interactive simulations. Featured in the SPEEDES environment is a new parallel synchronization approach called Breathing Time Buckets. This algorithm uses some of the conservative techniques found in Time Bucket synchronization, along with the optimism that characterizes the Time Warp approach. A mathematical model derived from first principles predicts the performance of Breathing Time Buckets. Along with the Breathing Time Buckets algorithm, this paper discusses the rules for processing events in SPEEDES, describes the implementation of various other synchronization protocols supported by SPEEDES, describes some new ones for the future, discusses interactive simulations, and then gives some performance results.

  20. LCG MCDB - a Knowledgebase of Monte Carlo Simulated Events

    CERN Document Server

    Belov, S; Galkin, E; Gusev, A; Pokorski, Witold; Sherstnev, A V

    2008-01-01

    In this paper we report on the LCG Monte Carlo Data Base (MCDB) and the software which has been developed to operate it. The main purpose of the LCG MCDB project is to provide a storage and documentation system for sophisticated event samples simulated for the LHC collaborations by experts. In many cases, modern Monte Carlo simulation of physical processes requires expert knowledge of Monte Carlo generators or a significant amount of CPU time to produce the events. MCDB is a knowledgebase intended mainly to accumulate simulated events of this type. The main motivation behind LCG MCDB is to make sophisticated MC event samples available to various physics groups. All the data in MCDB are accessible in several convenient ways. LCG MCDB is being developed within the CERN LCG Application Area Simulation project.

  1. NEVESIM: event-driven neural simulation framework with a Python interface.

    Science.gov (United States)

    Pecevski, Dejan; Kappel, David; Jonke, Zeno

    2014-01-01

    NEVESIM is a software package for event-driven simulation of networks of spiking neurons with a fast simulation core in C++, and a scripting user interface in the Python programming language. It supports simulation of heterogeneous networks with different types of neurons and synapses, and can be easily extended by the user with new neuron and synapse types. To enable heterogeneous networks and extensibility, NEVESIM is designed to decouple the simulation logic of communicating events (spikes) between the neurons at a network level from the implementation of the internal dynamics of individual neurons. In this paper we will present the simulation framework of NEVESIM, its concepts and features, as well as some aspects of the object-oriented design approaches and simulation strategies that were utilized to efficiently implement the concepts and functionalities of the framework. We will also give an overview of the Python user interface, its basic commands and constructs, and also discuss the benefits of integrating NEVESIM with Python. One of the valuable capabilities of the simulator is to simulate exactly and efficiently networks of stochastic spiking neurons from the recently developed theoretical framework of neural sampling. This functionality was implemented as an extension on top of the basic NEVESIM framework. Altogether, the intended purpose of the NEVESIM framework is to provide a basis for further extensions that support simulation of various neural network models incorporating different neuron and synapse types that can potentially also use different simulation strategies.
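
    The decoupling NEVESIM describes, network-level event delivery separated from per-neuron dynamics, can be sketched as follows. This is an illustrative toy in plain Python, not NEVESIM's actual API: spikes are time-stamped events on a heap, and a neuron's state is updated only when an event reaches it.

```python
import heapq, itertools

_seq = itertools.count()            # tie-breaker for simultaneous events

class LIFNeuron:
    """Toy event-driven integrate-and-fire neuron (hypothetical model)."""
    def __init__(self, name: str, threshold: float = 1.0, delay: float = 1.0):
        self.name, self.v = name, 0.0
        self.threshold, self.delay = threshold, delay
        self.synapses = []          # list of (target neuron, weight)

    def receive(self, t, weight, events):
        self.v += weight                        # integrate the incoming spike
        if self.v >= self.threshold:            # fire and reset
            self.v = 0.0
            print(f"{self.name} spiked at t={t}")
            for target, w in self.synapses:     # deliver after axonal delay
                heapq.heappush(events, (t + self.delay, next(_seq), target, w))

n1, n2 = LIFNeuron("n1"), LIFNeuron("n2")
n1.synapses.append((n2, 1.0))
events = [(0.0, next(_seq), n1, 0.3) for _ in range(4)]  # 4 input spikes to n1
heapq.heapify(events)
while events:                       # network-level event loop
    t, _, neuron, w = heapq.heappop(events)
    neuron.receive(t, w, events)
```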

  2. Discrete event simulation in an artificial intelligence environment: Some examples

    International Nuclear Information System (INIS)

    Roberts, D.J.; Farish, T.

    1991-01-01

    Several Los Alamos National Laboratory (LANL) object-oriented discrete-event simulation efforts have been completed during the past three years. One of these systems has been put into production and has a growing customer base. Another (started two years earlier than the first project) was completed but has not yet been used. This paper describes these simulation projects. Factors pertinent to the success of the one project and to the failure of the second are discussed (success is measured as the extent to which the simulation model was used as originally intended). 5 figs

  3. Design and validation of a dynamic discrete event stochastic simulation model of mastitis control in dairy herds.

    Science.gov (United States)

    Allore, H G; Schruben, L W; Erb, H N; Oltenacu, P A

    1998-03-01

    A dynamic stochastic simulation model for discrete events, SIMMAST, was developed to simulate the effect of mastitis on the composition of the bulk tank milk of dairy herds. Intramammary infections caused by Streptococcus agalactiae, Streptococcus spp. other than Strep. agalactiae, Staphylococcus aureus, and coagulase-negative staphylococci were modeled as were the milk, fat, and protein test day solutions for individual cows, which accounted for the fixed effects of days in milk, age at calving, season of calving, somatic cell count (SCC), and random effects of test day, cow yield differences from herdmates, and autocorrelated errors. Probabilities for the transitions among various states of udder health (uninfected or subclinically or clinically infected) were calculated to account for exposure, heifer infection, spontaneous recovery, lactation cure, infection or cure during the dry period, month of lactation, parity, within-herd yields, and the number of quarters with clinical intramammary infection in the previous and current lactations. The stochastic simulation model was constructed using estimates from the literature and also using data from 164 herds enrolled with Quality Milk Promotion Services that each had bulk tank SCC between 500,000 and 750,000/ml. Model parameters and outputs were validated against a separate data file of 69 herds from the Northeast Dairy Herd Improvement Association, each with a bulk tank SCC that was > or = 500,000/ml. Sensitivity analysis was performed on all input parameters for control herds. Using the validated stochastic simulation model, the control herds had a stable time average bulk tank SCC between 500,000 and 750,000/ml.
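
    The monthly transitions among udder-health states can be sketched as follows; the transition probabilities below are placeholders, not the fitted SIMMAST parameters, which condition on exposure, parity, lactation month and other effects described above.

```python
import random

# Placeholder monthly transition probabilities among udder-health states
# (illustrative only; NOT the fitted SIMMAST parameters).
TRANSITIONS = {
    "uninfected":  {"uninfected": 0.95, "subclinical": 0.04, "clinical": 0.01},
    "subclinical": {"uninfected": 0.10, "subclinical": 0.80, "clinical": 0.10},
    "clinical":    {"uninfected": 0.30, "subclinical": 0.50, "clinical": 0.20},
}

def simulate_cow(months: int, rng: random.Random) -> list[str]:
    """Sample one cow's monthly udder-health state trajectory."""
    state, history = "uninfected", []
    for _ in range(months):
        probs = TRANSITIONS[state]
        state = rng.choices(list(probs), weights=list(probs.values()))[0]
        history.append(state)
    return history

rng = random.Random(42)
herd = [simulate_cow(10, rng) for _ in range(200)]
clinical = sum(h[-1] == "clinical" for h in herd)
print(f"{clinical} of 200 cows clinically infected in month 10")
```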

  4. Simulation of a Rapid Dropout Event for Highly Relativistic Electrons with the RBE Model

    Science.gov (United States)

    Kang, S-B.; Fok, M.-C.; Glocer, A.; Min, K.-W.; Choi, C.-R.; Choi, E.; Hwang, J.

    2016-01-01

    A flux dropout is a sudden and sizable decrease in the energetic electron population of the outer radiation belt on a time scale of a few hours. We simulated a flux dropout of highly relativistic 2.5 MeV electrons using the Radiation Belt Environment model, incorporating the pitch angle diffusion coefficients caused by electromagnetic ion cyclotron (EMIC) waves for the geomagnetic storm events of 23-26 October 2002. This simulation showed a remarkable decrease in the 2.5 MeV electron flux during the main phase of the storm, compared to a simulation without EMIC waves. This decrease was independent of magnetopause shadowing or drift loss to the magnetopause. We suggest that the flux decrease was likely primarily due to pitch angle scattering into the loss cone by EMIC waves. Furthermore, the 2.5 MeV electron flux calculated with EMIC waves corresponds very well with that observed from the Solar Anomalous and Magnetospheric Particle EXplorer spacecraft. EMIC wave scattering is therefore likely one of the key mechanisms for understanding flux dropouts. We modeled EMIC wave intensities by the Kp index. However, the calculated dropout occurs several hours earlier than the observed one. We propose that Kp is not the best parameter to predict EMIC waves.

  5. Discrete event simulation of crop operations in sweet pepper in support of work method innovation

    NARCIS (Netherlands)

    Ooster, van 't Bert; Aantjes, Wiger; Melamed, Z.

    2017-01-01

    Greenhouse Work Simulation, GWorkS, is a model that simulates crop operations in greenhouses for the purpose of analysing work methods. GWorkS is a discrete event model that approaches reality as a discrete stochastic dynamic system. GWorkS was developed and validated using cut-rose as a case study.

  6. Discrete event simulation methods applied to advanced importance measures of repairable components in multistate network flow systems

    International Nuclear Information System (INIS)

    Huseby, Arne B.; Natvig, Bent

    2013-01-01

    Discrete event models are frequently used in simulation studies to model and analyze pure jump processes. A discrete event model can be viewed as a system consisting of a collection of stochastic processes, where the states of the individual processes change as a result of various kinds of events occurring at random points of time. We always assume that each event only affects one of the processes. Between these events the states of the processes are considered to be constant. In the present paper we use discrete event simulation in order to analyze a multistate network flow system of repairable components. In order to study how the different components contribute to the system, it is necessary to describe the often complicated interaction between component processes and processes at the system level. While analytical considerations may throw some light on this, a simulation study often allows the analyst to explore more details. By producing stable curve estimates for the development of the various processes, one gets a much better insight into how such systems develop over time. These methods are particularly useful in the study of advanced importance measures of repairable components. Such measures can be very complicated, and thus impossible to calculate analytically. By using discrete event simulations, however, this can be done in a very natural and intuitive way. In particular, significant differences between the Barlow–Proschan measure and the Natvig measure in multistate network flow systems can be explored.
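
    The sketch below illustrates the kind of discrete event model described here, under invented failure and repair rates: each component is a pure jump process alternating exponential up- and down-times, each event changes the state of exactly one component, and a merged event stream is used to estimate the availability of a simple k-out-of-n system (a toy stand-in for a multistate network flow system).

      import heapq, random

      def simulate_component(rate_fail, rate_repair, t_end, rng):
          # Pure jump process for one repairable component: alternate exponential
          # up-times and down-times; return the times at which its state changes.
          t, up, changes = 0.0, True, []
          while t < t_end:
              t += rng.expovariate(rate_fail if up else rate_repair)
              up = not up
              changes.append((t, up))
          return changes

      def system_availability(n_components, k_required, t_end=1000.0, seed=1):
          # Merge the per-component event streams and measure the fraction of
          # time at least k of n components are up. Each event affects exactly
          # one component; states are constant between events.
          rng = random.Random(seed)
          events = []  # (time, component index, new state)
          for c in range(n_components):
              for t, up in simulate_component(0.01, 0.1, t_end, rng):
                  heapq.heappush(events, (t, c, up))
          state, last_t, up_time = [True] * n_components, 0.0, 0.0
          n_up = n_components
          while events:
              t, c, up = heapq.heappop(events)
              if t > t_end:
                  break
              if n_up >= k_required:
                  up_time += t - last_t
              last_t, state[c] = t, up
              n_up = sum(state)
          if n_up >= k_required:
              up_time += t_end - last_t
          return up_time / t_end

      print(system_availability(n_components=4, k_required=3))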

  7. Corpuscular event-by-event simulation of quantum optics experiments: application to a quantum-controlled delayed-choice experiment

    International Nuclear Information System (INIS)

    De Raedt, Hans; Delina, M; Jin, Fengping; Michielsen, Kristel

    2012-01-01

    A corpuscular simulation model of optical phenomena that does not require knowledge of the solution of a wave equation of the whole system and reproduces the results of Maxwell's theory by generating detection events one by one is discussed. The event-based corpuscular model gives a unified description of multiple-beam fringes of a plane parallel plate and a single-photon Mach-Zehnder interferometer, Wheeler's delayed choice, photon tunneling, quantum eraser, two-beam interference, Einstein-Podolsky-Rosen-Bohm and Hanbury Brown-Twiss experiments. The approach is illustrated by applying it to a recent proposal for a quantum-controlled delayed-choice experiment, demonstrating that this thought experiment, too, can be understood in terms of particle processes only.
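
    For orientation only, the fragment below shows what accumulating Mach-Zehnder detection events one photon at a time looks like. Note the hedge in the comment: this toy samples the known quantum probability directly, whereas the event-based model discussed in the paper reproduces the same statistics with deterministic particle processes and without using any wave solution.

      import math, random

      def mach_zehnder_counts(phase, n_photons, rng):
          # Send photons one at a time through a Mach-Zehnder interferometer and
          # count clicks at the two detectors. NOTE: this samples the known
          # quantum probability cos^2(phase/2) directly; the paper's event-based
          # model reproduces the same statistics without using that solution.
          d0 = sum(rng.random() < math.cos(phase / 2) ** 2
                   for _ in range(n_photons))
          return d0, n_photons - d0

      rng = random.Random(0)
      for k in range(9):  # scan the phase shifter and watch the fringe build up
          phase = k * math.pi / 4
          d0, d1 = mach_zehnder_counts(phase, 10000, rng)
          print(f"phase={phase:4.2f}  D0={d0:5d}  D1={d1:5d}")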

  8. Integrating Continuous-Time and Discrete-Event Concepts in Process Modelling, Simulation and Control

    NARCIS (Netherlands)

    Beek, van D.A.; Gordijn, S.H.F.; Rooda, J.E.; Ertas, A.

    1995-01-01

    Currently, modelling of systems in the process industry requires the use of different specification languages for the specification of the discrete-event and continuous-time subsystems. In this way, models are restricted to individual subsystems of either a continuous-time or discrete-event nature.
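
    A minimal sketch of the kind of integration the paper argues for, combining a continuous-time submodel with discrete-event switching in a single description (the tank example and all constants are hypothetical, not the paper's specification language):

      # A tank whose level evolves continuously while a discrete-event
      # controller toggles the inlet valve at threshold crossings.
      def hybrid_tank(t_end=60.0, dt=0.01):
          level, valve_open, events = 0.5, True, []
          t = 0.0
          while t < t_end:
              inflow = 1.0 if valve_open else 0.0      # continuous-time part:
              level += (inflow - 0.4 * level) * dt     # d(level)/dt = in - out
              # discrete-event part: events fire when thresholds are crossed
              if valve_open and level >= 2.0:
                  valve_open = False
                  events.append((round(t, 2), "close valve"))
              elif not valve_open and level <= 1.0:
                  valve_open = True
                  events.append((round(t, 2), "open valve"))
              t += dt
          return events

      print(hybrid_tank()[:6])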

  9. Tropical climate and vegetation cover during Heinrich event 1: Simulations with coupled climate vegetation models

    OpenAIRE

    Handiani, Dian Noor

    2012-01-01

    This study focuses on the climate and vegetation responses to abrupt climate change in the Northern Hemisphere during the last glacial period. Two abrupt climate events are explored: the abrupt cooling of the Heinrich event 1 (HE1), followed by the abrupt warming of the Bølling-Allerød interstadial (BA). These two events are simulated by perturbing the freshwater balance of the Atlantic Ocean, with the intention of altering the Atlantic Meridional Overturning Circulation (AMOC) and also of in...

  10. Modeling Documents with Event Model

    Directory of Open Access Journals (Sweden)

    Longhui Wang

    2015-08-01

    Currently deep learning has made great breakthroughs in visual and speech processing, mainly because it draws lessons from the hierarchical way the brain deals with images and speech. In the field of NLP, a topic model is one of the important ways of modeling documents. Topic models are built on a generative model that clearly does not match the way humans write. In this paper, we propose the Event Model, which is unsupervised and based on the language processing mechanisms of neurolinguistics, to model documents. In the Event Model, documents are descriptions of concrete or abstract events seen, heard, or sensed by people, and words are objects in the events. The Event Model has two stages: word learning and dimensionality reduction. Word learning is to learn the semantics of words based on deep learning. Dimensionality reduction is the process of representing a document as a low-dimensional vector by a linear method that is completely different from topic models. The Event Model achieves state-of-the-art results on document retrieval tasks.

  11. Numerical simulation of a mistral wind event occurring

    Science.gov (United States)

    Guenard, V.; Caccia, J. L.; Tedeschi, G.

    2003-04-01

    The experimental network of the ESCOMPTE field experiment (June-July 2001) is turned to account to investigate the Mistral wind affecting the Marseille area (South of France). The Mistral is a northerly flow blowing across the Rhône valley and toward the Mediterranean sea, resulting from the dynamical low pressure generated in the wake of the Alps ridge. It brings cold, dry air masses and clear-sky conditions over the south-eastern part of France. Up to now, few scientific studies have been carried out on the Mistral, especially on the evolution of its 3-D structure, so that its mesoscale numerical simulation is still relevant. The non-hydrostatic RAMS model is used to better investigate this mesoscale phenomenon. Simulations at a 12 km horizontal resolution are compared to boundary layer wind profilers and ground measurements. Preliminary results agree quite well with the Mistral statistical studies carried out by the operational service of Météo-France, and observed wind profiles are correctly reproduced by the numerical model RAMS, which appears to be an efficient tool for the understanding of the Mistral. Owing to the absence of diabatic effects in Mistral events, which complicates numerical simulations, the present work is the first step in the validation of the RAMS model in that area. Further work will consist of studying the interaction of the Mistral with land-sea breezes. Also, RAMS simulations will be combined with aerosol production and ocean circulation models to supply chemists and oceanographers with some answers for their studies.

  12. A simple conceptual model of abrupt glacial climate events

    Directory of Open Access Journals (Sweden)

    H. Braun

    2007-11-01

    Here we use a very simple conceptual model in an attempt to reduce essential parts of the complex nonlinearity of abrupt glacial climate changes (the so-called Dansgaard-Oeschger events) to a few simple principles, namely (i) the existence of two different climate states, (ii) a threshold process and (iii) an overshooting in the stability of the system at the start and the end of the events, which is followed by a millennial-scale relaxation. By comparison with a so-called Earth system model of intermediate complexity (CLIMBER-2), in which the events represent oscillations between two climate states corresponding to two fundamentally different modes of deep-water formation in the North Atlantic, we demonstrate that the conceptual model captures fundamental aspects of the nonlinearity of the events in that model. We use the conceptual model in order to reproduce and reanalyse nonlinear resonance mechanisms that were already suggested in order to explain the characteristic time scale of Dansgaard-Oeschger events. In doing so we identify a new form of stochastic resonance (i.e. an overshooting stochastic resonance) and provide the first explicitly reported manifestation of ghost resonance in a geosystem, i.e. of a mechanism which could be relevant for other systems with thresholds and with multiple states of operation. Our work enables us to explicitly simulate realistic probability measures of Dansgaard-Oeschger events (e.g. waiting time distributions), which are a prerequisite for statistical analyses on the regularity of the events by means of Monte-Carlo simulations. We thus think that our study is an important advance in order to develop more adequate methods to test the statistical significance and the origin of the proposed glacial 1470-year climate cycle.
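
    A toy version of the three principles listed above, with invented parameters (two states, a threshold, and an overshooting threshold that relaxes on a millennial scale), can be sketched in a few lines; the waiting times between simulated switches are exactly the kind of probability measure the authors analyse:

      import math, random

      def conceptual_do_model(t_end=50000, dt=10, seed=3):
          # Toy two-state threshold model with overshoot and relaxation
          # (hypothetical parameters, loosely following the principles above):
          # the system switches state when forcing plus noise crosses a
          # threshold that overshoots after each switch and then relaxes back.
          rng = random.Random(seed)
          state, threshold, events = 0, 1.0, []  # 0 = stadial, 1 = interstadial
          for step in range(int(t_end / dt)):
              t = step * dt
              # forcing: two solar-like cycles (87 and 210 yr) plus white noise
              f = (0.4 * math.sin(2 * math.pi * t / 87)
                   + 0.4 * math.sin(2 * math.pi * t / 210)
                   + rng.gauss(0, 0.3))
              threshold += (1.0 - threshold) * dt / 800  # millennial relaxation
              if f > threshold:
                  state = 1 - state
                  threshold = 3.0                        # overshoot after a switch
                  events.append(t)
          # waiting-time distribution between simulated events
          waits = [b - a for a, b in zip(events, events[1:])]
          print(len(events), "events; mean waiting time:",
                sum(waits) / len(waits), "yr")

      conceptual_do_model()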

  13. Simulating flaring events in complex active regions driven by observed magnetograms

    Science.gov (United States)

    Dimitropoulou, M.; Isliker, H.; Vlahos, L.; Georgoulis, M. K.

    2011-05-01

    Context. We interpret solar flares as events originating in active regions that have reached the self organized critical state, by using a refined cellular automaton model with initial conditions derived from observations. Aims: We investigate whether the system, with its imposed physical elements, reaches a self organized critical state and whether well-known statistical properties of flares, such as scaling laws observed in the distribution functions of characteristic parameters, are reproduced after this state has been reached. Methods: To investigate whether the distribution functions of total energy, peak energy and event duration follow the expected scaling laws, we first applied a nonlinear force-free extrapolation that reconstructs the three-dimensional magnetic fields from two-dimensional vector magnetograms. We then locate magnetic discontinuities exceeding a threshold in the Laplacian of the magnetic field. These discontinuities are relaxed in local diffusion events, implemented in the form of cellular automaton evolution rules. Subsequent loading and relaxation steps lead the system to self organized criticality, after which the statistical properties of the simulated events are examined. Physical requirements, such as the divergence-free condition for the magnetic field vector, are approximately imposed on all elements of the model. Results: Our results show that self organized criticality is indeed reached when applying specific loading and relaxation rules. Power-law indices obtained from the distribution functions of the modeled flaring events are in good agreement with observations. Single power laws (peak and total flare energy) are obtained, as are power laws with exponential cutoff and double power laws (flare duration). The results are also compared with observational X-ray data from the GOES satellite for our active-region sample. Conclusions: We conclude that well-known statistical properties of flares are reproduced after the system has reached the self organized critical state.
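
    The loading and relaxation cycle described here is the classic self-organized-criticality recipe. The sketch below uses the familiar sandpile cellular automaton (a stand-in for the magnetic-field automaton of the paper, with invented parameters) to show how repeated loading and local relaxation produce avalanches of widely varying size:

      import random

      def sandpile(size=30, grains=20000, threshold=4, seed=0):
          # Minimal self-organized-criticality sketch: load single grains,
          # relax unstable cells, and record the size of each avalanche.
          rng = random.Random(seed)
          grid = [[0] * size for _ in range(size)]
          sizes = []
          for _ in range(grains):
              x, y = rng.randrange(size), rng.randrange(size)   # loading step
              grid[y][x] += 1
              unstable, aval = [(x, y)], 0
              while unstable:                                   # relaxation steps
                  cx, cy = unstable.pop()
                  if grid[cy][cx] < threshold:
                      continue
                  grid[cy][cx] -= 4
                  aval += 1
                  for nx, ny in ((cx+1, cy), (cx-1, cy), (cx, cy+1), (cx, cy-1)):
                      if 0 <= nx < size and 0 <= ny < size:     # open boundaries
                          grid[ny][nx] += 1
                          unstable.append((nx, ny))
              if aval:
                  sizes.append(aval)
          # once the SOC state is reached, avalanche sizes follow a power law
          print("avalanches:", len(sizes), "largest:", max(sizes))

      sandpile()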

  14. Examining Passenger Flow Choke Points at Airports Using Discrete Event Simulation

    Science.gov (United States)

    Brown, Jeremy R.; Madhavan, Poomima

    2011-01-01

    The movement of passengers through an airport quickly, safely, and efficiently is the main function of the various checkpoints (check-in, security, etc.) found in airports. Human error combined with other breakdowns in the complex system of the airport can disrupt passenger flow through the airport, leading to lengthy waiting times, missing luggage and missed flights. In this paper we present a model of passenger flow through an airport using discrete event simulation that provides a closer look into the possible reasons for breakdowns and their implications for passenger flow. The simulation is based on data collected at Norfolk International Airport (ORF). The primary goal of this simulation is to present ways to optimize the work force to keep passenger flow smooth even during peak travel times and for emergency preparedness at ORF in case of adverse events. In this simulation we ran three different scenarios: real world, increased check-in stations, and multiple waiting lines. Increased check-in stations increased waiting time and instantaneous utilization, while the multiple waiting lines decreased both the waiting time and instantaneous utilization. This simulation was able to show how different changes affected the passenger flow through the airport.
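
    The core of such a checkpoint model is a multi-server queue. The sketch below (hypothetical arrival and service rates, not the ORF data) shows the discrete-event skeleton and how varying the number of check-in stations changes the mean waiting time:

      import heapq, random

      def checkin_simulation(n_stations=3, arrival_rate=1.0, service_rate=0.4,
                             n_passengers=2000, seed=7):
          # Minimal discrete-event sketch of one airport checkpoint:
          # exponential arrivals, parallel check-in stations, FIFO queue.
          rng = random.Random(seed)
          t = 0.0
          free_at = [0.0] * n_stations   # next time each station becomes free
          heapq.heapify(free_at)
          total_wait = 0.0
          for _ in range(n_passengers):
              t += rng.expovariate(arrival_rate)        # next passenger arrives
              station_free = heapq.heappop(free_at)
              start = max(t, station_free)              # wait if all stations busy
              total_wait += start - t
              heapq.heappush(free_at, start + rng.expovariate(service_rate))
          return total_wait / n_passengers

      for c in (3, 4, 5):
          print(f"{c} check-in stations: mean wait {checkin_simulation(c):.2f} min")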

  15. Quantitative Simulation of QARBM Challenge Events During Radiation Belt Enhancements

    Science.gov (United States)

    Li, W.; Ma, Q.; Thorne, R. M.; Bortnik, J.; Chu, X.

    2017-12-01

    Various physical processes are known to affect energetic electron dynamics in the Earth's radiation belts, but their quantitative effects at different times and locations in space need further investigation. This presentation focuses on discussing the quantitative roles of various physical processes that affect Earth's radiation belt electron dynamics during radiation belt enhancement challenge events (storm-time vs. non-storm-time) selected by the GEM Quantitative Assessment of Radiation Belt Modeling (QARBM) focus group. We construct realistic global distributions of whistler-mode chorus waves, adopt various versions of radial diffusion models (statistical and event-specific), and use the global evolution of other potentially important plasma waves including plasmaspheric hiss, magnetosonic waves, and electromagnetic ion cyclotron waves from all available multi-satellite measurements. These state-of-the-art wave properties and distributions on a global scale are used to calculate diffusion coefficients, that are then adopted as inputs to simulate the dynamical electron evolution using a 3D diffusion simulation during the storm-time and the non-storm-time acceleration events respectively. We explore the similarities and differences in the dominant physical processes that cause radiation belt electron dynamics during the storm-time and non-storm-time acceleration events. The quantitative role of each physical process is determined by comparing against the Van Allen Probes electron observations at different energies, pitch angles, and L-MLT regions. This quantitative comparison further indicates instances when quasilinear theory is sufficient to explain the observed electron dynamics or when nonlinear interaction is required to reproduce the energetic electron evolution observed by the Van Allen Probes.

  16. Event-Based Conceptual Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2009-01-01

    The purpose of the paper is to obtain insight into and provide practical advice for event-based conceptual modeling. We analyze a set of event concepts and use the results to formulate a conceptual event model that is used to identify guidelines for creation of dynamic process models and static information models. We characterize events as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms of information structures. The conceptual event model is used to characterize a variety of event concepts and it is used to illustrate how events can be used to integrate dynamic modeling of processes and static modeling of information structures. The results are unique in the sense that no other general event concept has been used to unify a similar broad variety of seemingly incompatible event concepts. The general event concept can be used...

  17. Simulation modeling for the health care manager.

    Science.gov (United States)

    Kennedy, Michael H

    2009-01-01

    This article addresses the use of simulation software to solve administrative problems faced by health care managers. Spreadsheet add-ins, process simulation software, and discrete event simulation software are available at a range of costs and complexity. All use the Monte Carlo method to realistically integrate probability distributions into models of the health care environment. Problems typically addressed by health care simulation modeling are facility planning, resource allocation, staffing, patient flow and wait time, routing and transportation, supply chain management, and process improvement.
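
    As a small illustration of the Monte Carlo idea common to all three software categories, the sketch below (with hypothetical distributions) samples daily patient loads and service times to estimate how often a clinic day overruns its capacity:

      import random

      # Monte Carlo sketch of a staffing question (hypothetical distributions):
      # sample patient service times to estimate how often a clinic day overruns.
      rng = random.Random(11)
      overruns = 0
      for _ in range(10000):                       # 10,000 simulated clinic days
          patients = rng.randint(18, 26)           # daily demand
          minutes = sum(rng.triangular(10, 40, 20) for _ in range(patients))
          if minutes > 8 * 60:                     # one provider, 8-hour day
              overruns += 1
      print(f"P(day overruns) = {overruns / 10000:.2%}")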

  18. Top-Level Simulation of a Smart-Bolometer Using VHDL Modeling

    Directory of Open Access Journals (Sweden)

    Matthieu DENOUAL

    2012-03-01

    An event-driven modeling technique in standard VHDL is presented in this paper for the high-level simulation of a resistive bolometer operating in closed-loop mode and implementing smart functions. The closed-loop mode of operation is achieved by the capacitively coupled electrical substitution technique. The event-driven VHDL modeling technique is successfully applied to behavioral modeling and simulation of such a multi-physics system involving optical, thermal and electronic mechanisms. The modeling technique allows high-level simulations for the development and validation of the smart-function algorithms of the future integrated smart device.

  19. Simulating single-event burnout of n-channel power MOSFET's

    International Nuclear Information System (INIS)

    Johnson, G.H.; Hohl, J.H.; Schrimpf, R.D.; Galloway, K.F.

    1993-01-01

    Heavy ions are ubiquitous in a space environment. Single-event burnout of power MOSFET's is a sudden catastrophic failure mechanism that is initiated by the passage of a heavy ion through the device structure. The passage of the heavy ion generates a current filament that locally turns on a parasitic n-p-n transistor inherent to the power MOSFET. Subsequent high currents and high voltage in the device induce second breakdown of the parasitic bipolar transistor and hence meltdown of the device. This paper presents a model that can be used for simulating the burnout mechanism in order to gain insight into the significant device parameters that most influence the single-event burnout susceptibility of n-channel power MOSFET's

  20. Simulation of air admission in a propeller hydroturbine during transient events

    Science.gov (United States)

    Nicolle, J.; Morissette, J.-F.

    2016-11-01

    In this study, multiphysics simulations are carried out in order to model fluid loading and structural stresses on propeller blades during startup and runaway. It is found that air admission plays an important role during these transient events and that biphasic simulations are therefore required. At the speed-no-load regime, a large air pocket with a vertical free surface forms in the centre of the runner, displacing the water flow near the shroud. This significantly affects the torque developed on the blades and thus the structural loading. The resulting pressures are applied to a quasi-static structural model, and good agreement is obtained with experimental strain gauge data.

  1. Event-Based Conceptual Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    The paper demonstrates that a wide variety of event-based modeling approaches are based on special cases of the same general event concept, and that the general event concept can be used to unify the otherwise unrelated fields of information modeling and process modeling. A set of event-based modeling approaches are analyzed and the results are used to formulate a general event concept that can be used for unifying the seemingly unrelated event concepts. Events are characterized as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms of information structures. The general event concept can be used to guide systems analysis and design and to improve modeling approaches.

  2. Analysis of convection-permitting simulations for capturing heavy rainfall events over Myanmar Region

    Science.gov (United States)

    Acierto, R. A. E.; Kawasaki, A.

    2017-12-01

    Perennial flooding due to heavy rainfall events has strong impacts on society and the economy. With the increasing pressures of rapid development and potential climate change impacts, Myanmar is experiencing a rapid increase in disaster risk. Heavy rainfall hazard assessment is key to quantifying this disaster risk under both current and future conditions. Downscaling with Regional Climate Models (RCMs) such as the Weather Research and Forecasting model has been used extensively for assessing such heavy rainfall events; however, the use of convective parameterizations can introduce large errors in simulating rainfall. Convection-permitting simulations deal with this problem by increasing the resolution of RCMs to 4 km. This study focuses on heavy rainfall events during the wet seasons (May to September) of the six-year period 2010-2015 in Myanmar. The investigation primarily uses rain gauge observations to evaluate heavy rainfall events downscaled to 4 km resolution from ERA-Interim boundary conditions with a 12 km-4 km one-way nesting method. The study aims to provide a basis for the production of high-resolution climate projections over Myanmar in order to contribute to flood hazard and risk assessment.

  3. Managing emergency department overcrowding via ambulance diversion: a discrete event simulation model.

    Science.gov (United States)

    Lin, Chih-Hao; Kao, Chung-Yao; Huang, Chong-Ye

    2015-01-01

    Ambulance diversion (AD) is considered one of the possible solutions to relieve emergency department (ED) overcrowding. Study of the effectiveness of various AD strategies is a prerequisite for policy-making. Our aim is to develop a tool that quantitatively evaluates the effectiveness of various AD strategies. A simulation model and a computer simulation program were developed. Three sets of simulations were executed to evaluate AD initiating criteria, patient-blocking rules, and AD intervals, respectively. The crowdedness index, the patient waiting time for service, and the percentage of adverse patients were assessed to determine the effect of various AD policies. Simulation results suggest that, in a certain setting, the best timing for implementing AD is when the crowdedness index reaches the critical value of 1.0 - an indicator that the ED is operating at its maximal capacity. The strategy of diverting all patients transported by ambulance is more effective than diverting either high-acuity patients only or low-acuity patients only. Given a total allowable AD duration, implementing AD multiple times with short intervals generally has a better effect than having a single AD with the maximal allowable duration. An input-throughput-output simulation model is proposed for simulating ED operation. The effectiveness of several AD strategies in relieving ED overcrowding was assessed via computer simulations based on this model. By appropriate parameter settings, the model can represent medical resource providers of different scales. It is also feasible to expand the simulations to evaluate the effect of AD strategies on a community basis. The results may offer insights for making effective AD policies.
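
    A stripped-down version of the input-throughput-output idea with a diversion trigger can be written as a continuous-time event loop; all rates below are invented, and the model only illustrates how the crowdedness-index trigger feeds back on ambulance arrivals:

      import random

      def ed_simulation(divert_at=1.0, capacity=20, t_end=720.0, seed=5):
          # Toy continuous-time ED model (hypothetical rates, time in hours):
          # patients arrive by ambulance or walk-in, stay an exponential length
          # of time, and ambulances are diverted while the crowdedness index
          # (census / capacity) is at or above the trigger value.
          rng = random.Random(seed)
          t, census, diverted, blocked = 0.0, 0, 0.0, 0
          rate_amb, rate_walk, rate_dis = 1.0, 2.0, 0.15  # per hour
          while t < t_end:
              total = rate_amb + rate_walk + census * rate_dis
              dt = rng.expovariate(total)
              t += dt
              diverting = census / capacity >= divert_at
              if diverting:
                  diverted += dt
              r = rng.random() * total
              if r < rate_amb:                      # ambulance arrival
                  if diverting:
                      blocked += 1                  # sent to another hospital
                  elif census < capacity:
                      census += 1
              elif r < rate_amb + rate_walk:        # walk-in arrival
                  if census < capacity:
                      census += 1
              elif census > 0:                      # a discharge
                  census -= 1
          print(f"trigger {divert_at}: {blocked} ambulances diverted, "
                f"{100 * diverted / t_end:.1f}% of time on diversion")

      for trigger in (0.8, 0.9, 1.0):
          ed_simulation(trigger)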

  4. A Study on Modeling Approaches in Discrete Event Simulation Using Design Patterns

    National Research Council Canada - National Science Library

    Kim, Leng Koh

    2007-01-01

    .... This modeling paradigm encompasses several modeling approaches (an active role for events, entities as independent components, and the chaining of components to enable interactivity) that are excellent ways of building a DES system...

  5. Unified Modeling of Discrete Event and Control Systems Applied in Manufacturing

    Directory of Open Access Journals (Sweden)

    Amanda Arêas de Souza

    2015-05-01

    For the development of both a simulation model and a control system, it is necessary to build, in advance, a conceptual model. This is what is usually suggested by the methodologies applied in projects of this nature. Some conceptual modeling techniques allow for a better understanding of the simulation model, and a clear description of the logic of control systems. Therefore, this paper aims to present and evaluate conceptual languages for the unified modeling of models of discrete event simulation and control systems applied in manufacturing. The results show that the IDEF-SIM language can be applied both in simulation systems and in process control.

  6. Data-Model and Inter-Model Comparisons of the GEM Outflow Events Using the Space Weather Modeling Framework

    Science.gov (United States)

    Welling, D. T.; Eccles, J. V.; Barakat, A. R.; Kistler, L. M.; Haaland, S.; Schunk, R. W.; Chappell, C. R.

    2015-12-01

    Two storm periods were selected by the Geospace Environment Modeling Ionospheric Outflow focus group for community collaborative study because of their high magnetospheric activity and extensive data coverage: the September 27 - October 4, 2002 corotating interaction region event and the October 22 - 29 coronal mass ejection event. During both events, the FAST, Polar, Cluster, and other missions made key observations, creating prime periods for data-model comparison. The GEM community has come together to simulate these periods using many different methods in order to evaluate models, compare results, and expand our knowledge of ionospheric outflow and its effects on global dynamics. This paper presents Space Weather Modeling Framework (SWMF) simulations of these important periods compared against observations from the Polar TIDE, Cluster CODIF and EFW instruments. Emphasis will be given to the second event. Density and velocity of oxygen and hydrogen throughout the lobes, plasma sheet, and inner magnetosphere will be the focus of these comparisons. For these simulations, the SWMF couples the multifluid version of BATS-R-US MHD to a variety of ionospheric outflow models of varying complexity. The simplest is outflow arising from constant MHD inner boundary conditions. Two first-principles-based models are also leveraged: the Polar Wind Outflow Model (PWOM), a fluid treatment of outflow dynamics, and the Generalized Polar Wind (GPW) model, which combines fluid and particle-in-cell approaches. Each model is capable of capturing a different set of energization mechanisms, yielding different outflow results. The data-model comparisons will illustrate how well each approach captures reality and which energization mechanisms are most important. Inter-model comparisons will illustrate how the different outflow specifications affect the magnetosphere. Specifically, it is found that the GPW provides increased heavy ion outflow over a broader spatial range than the alternative

  7. Discrete Event System Based Pyroprocessing Modeling and Simulation: Oxide Reduction

    International Nuclear Information System (INIS)

    Lee, H. J.; Ko, W. I.; Choi, S. Y.; Kim, S. K.; Hur, J. M.; Choi, E. Y.; Im, H. S.; Park, K. I.; Kim, I. T.

    2014-01-01

    Dynamic changes due to batch operation cannot be predicted by an equilibrium material flow. This study therefore began to build a dynamic material balance model based on the previously developed pyroprocessing flowsheet. As a mid- and long-term research effort, an integrated pyroprocessing simulator is being developed at the Korea Atomic Energy Research Institute (KAERI) to cope with reviews of technical feasibility, safeguards assessment, conceptual facility design, and economic feasibility evaluation. The most fundamental part of such a simulator development is to establish the dynamic material flow framework. This study focused on the operation modeling of pyroprocessing to implement a dynamic material flow. As a case study, oxide reduction was investigated in terms of dynamic material flow. DES-based modeling was applied to build a pyroprocessing operation model. A dynamic material flow, as the basic framework for an integrated pyroprocessing, was successfully implemented through ExtendSim's internal database and item blocks. Complex operation logic behavior was verified, for example an oxide reduction process, in terms of dynamic material flow. Compared to an equilibrium material flow, a model-based dynamic material flow provides such detailed information that a careful analysis of every batch is necessary to confirm the dynamic material balance results. With the default oxide reduction scenario, the batch mass balance was verified in comparison with a one-year equilibrium mass balance. This study is still in progress, with a mid- and long-term goal: the development of a multi-purpose pyroprocessing simulator that is able to cope with safeguards assessment, economic feasibility, technical evaluation, conceptual design, and support of licensing for a future pyroprocessing facility

  8. Formulation of Generic Simulation Models for Analyzing Construction Claims

    Directory of Open Access Journals (Sweden)

    Rifat Rustom

    2012-11-01

    While there are several techniques for analyzing the impact of claims on time schedule and productivity, very few are considered adequate and comprehensive enough to account for risks and uncertainties. A generic approach for claims analysis using simulation is proposed. The formulation of the generic methodology presented in this paper depends on three simulation models: the As-Planned Model (APM), the As-Built Model (ABM), and the What-Would-Have-Been Model (WWHBM). The proposed generic methodology provides a good basis for a more elaborate approach to analyzing claims and their impacts on project time and productivity utilizing discrete event simulation. The approach allows for scenario analysis to account for the disputed events and workflow disruptions. The proposed models will assist claimants in presenting their cases effectively and professionally.

  9. Numerical Simulations of Slow Stick Slip Events with PFC, a DEM Based Code

    Science.gov (United States)

    Ye, S. H.; Young, R. P.

    2017-12-01

    Nonvolcanic tremors around subduction zones have become a fascinating subject in seismology in recent years. Previous studies have shown that the nonvolcanic tremor beneath western Shikoku is composed of low frequency seismic waves overlapping each other. This finding provides a direct link between tremor and slow earthquakes. Slow stick slip events are considered to be laboratory-scale slow earthquakes. Slow stick slip events are traditionally studied with a direct shear or double direct shear experimental setup, in which the sliding velocity can be controlled to model a range of fast and slow stick slips. In this study, a PFC* model based on double direct shear is presented, with a central block clamped by two side blocks. The gauge layers between the central and side blocks are modelled as discrete fracture networks with smooth joint bonds between pairs of discrete elements. In addition, a second model is presented in this study. This model consists of a cylindrical sample subjected to triaxial stress. Similar to the previous model, a weak gauge layer at 45 degrees is added to the sample, on which shear slipping is allowed. Several different simulations are conducted on this sample. While the confining stress is maintained at the same level in different simulations, the axial loading rate (displacement rate) varies. By varying the displacement rate, a range of slipping behaviour, from stick slip to slow stick slip, is observed based on the stress-strain relationship. Currently, the stick slip and slow stick slip events are observed strictly on the basis of the stress-strain relationship. In the future, we hope to monitor the displacement and velocity of the balls surrounding the gauge layer as a function of time, so as to generate a synthetic seismogram. This will allow us to extract seismic waveforms and potentially simulate the tremor-like waves found around subduction zones. *Particle flow code, a discrete element method based numerical simulation code developed by

  10. Hydrodynamic modelling of extreme flood events in the Kashmir valley in India

    Science.gov (United States)

    Jain, Manoj; Parvaze, Sabah

    2017-04-01

    Floods are among the most frequent, costly and deadly of all natural hazards. Every year, floods take a heavy toll on human life and property in many parts of the world. The prediction of river stages and discharge during flood extremes plays a vital role in planning structural and non-structural measures of flood management. The predictions are also valuable for preparing flood inundation maps and river floodplain zoning. In the Kashmir Valley, floods occur mainly in the Jhelum Basin, mostly due to extreme precipitation events and the rugged mountainous topography of the basin. These floods cause extreme damage to life and property in the valley from time to time. Excessive rainfall, particularly in the higher sub-catchments, causes the snow to melt, resulting in excessive runoff downhill to the streams and causing floods in the Kashmir Valley, where Srinagar city is located. However, very few hydrological studies have been undertaken for the Jhelum Basin, mainly due to the non-availability of hydrological data in the very complex mountainous terrain. Therefore, the present study has been conducted to model extreme flood events in the Jhelum Basin in the Kashmir Valley. An integrated NAM and MIKE 11 HD model has been set up for the Jhelum basin up to the Ram Munshi Bagh gauging site, and the four most extreme historical flood events in the time series have been analyzed separately, including the most recent and most extreme flood event of 2014. In September 2014, the Kashmir Valley witnessed the most severe flood of the past 60 years due to catastrophic rainfall from 1 to 6 September, wherein the valley received unprecedented rainfall of more than 650 mm in just 3 days, breaking records of many decades. The MIKE 11 HD and NAM models have been calibrated using 21 years (1985-2005) of data and validated using 9 years (2006-2014) of data. The efficiency indices of the model for the calibration and validation periods are 0.749 and 0.792, respectively. The model simulated

  11. Analysis hierarchical model for discrete event systems

    Science.gov (United States)

    Ciortea, E. M.

    2015-11-01

    This paper presents a hierarchical model based on discrete event networks for robotic systems. Following the hierarchical approach, the Petri net is analysed as a network spanning from the highest conceptual level down to the lowest level of local control, and extended Petri nets are used for the modelling and control of complex robotic systems. Such a system is structured, controlled and analysed in this paper using the Visual Object Net++ package, which is relatively simple and easy to use, and the results are shown as representations that are easy to interpret. The hierarchical structure of the robotic system is implemented and analysed on computers using specialized programs. Implementation of the hierarchical discrete event system model as a real-time operating system on a computer network connected via a serial bus is possible, where each computer is dedicated to the local Petri model of one subsystem of the global robotic system. Since Petri models are simple enough to run on general computers, the analysis, modelling and control of complex manufacturing systems can be achieved using Petri nets, which makes discrete event systems a pragmatic tool for modelling industrial systems. To highlight auxiliary times, the Petri model of the transport stream is divided into hierarchical levels whose sections are analysed successively. Simulation of the proposed robotic system using timed Petri nets offers the opportunity to examine its timing behaviour: from transport and transmission times measured on the spot, graphs are obtained showing the average time for the transport activity for individual parameter sets of finished products.
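
    Since the paper's formalism is the Petri net, a minimal token-game sketch may help fix ideas (the two-transition robot cell below is hypothetical, and far simpler than the extended and timed nets used in the paper):

      # Places hold tokens; a transition fires when every input place is
      # marked, moving tokens from its inputs to its outputs.
      def enabled(marking, transition):
          return all(marking[p] > 0 for p in transition["in"])

      def fire(marking, transition):
          for p in transition["in"]:
              marking[p] -= 1
          for p in transition["out"]:
              marking[p] += 1

      # A robot cell: a part and a free robot are consumed to start processing;
      # finishing releases the robot and produces a finished part.
      marking = {"parts": 3, "robot_free": 1, "processing": 0, "done": 0}
      transitions = [
          {"name": "start",  "in": ["parts", "robot_free"], "out": ["processing"]},
          {"name": "finish", "in": ["processing"], "out": ["robot_free", "done"]},
      ]

      while True:
          ready = [tr for tr in transitions if enabled(marking, tr)]
          if not ready:
              break
          fire(marking, ready[0])          # fire the first enabled transition
          print(ready[0]["name"], marking)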

  12. Numerical Simulations of an Inversion Fog Event in the Salt Lake Valley during the MATERHORN-Fog Field Campaign

    Science.gov (United States)

    Chachere, Catherine N.; Pu, Zhaoxia

    2018-01-01

    An advanced research version of the Weather Research and Forecasting (WRF) Model is employed to simulate a wintertime inversion fog event in the Salt Lake Valley during the Mountain Terrain Atmospheric Modeling and Observations Program (MATERHORN) field campaign in January 2015. Simulation results are compared to observations obtained from the field program. The sensitivity of the numerical simulations to the available cloud microphysical (CM), planetary boundary layer (PBL), radiation, and land surface models (LSMs) is evaluated. The influence of differing visibility algorithms and initialization times on simulation results is also examined. Results indicate that the numerical simulations of the fog event are sensitive to the choice of CM, PBL, radiation, and LSM schemes as well as to the visibility algorithm and initialization time. Although the majority of experiments accurately captured the synoptic setup environment, errors were found in most experiments within the boundary layer, specifically a 3° warm bias in simulated surface temperatures compared to observations. Accurate representation of surface and boundary layer variables is vital to correctly predicting fog in the numerical model.

  13. Evaluation and simulation of event building techniques for a detector at the LHC

    CERN Document Server

    Spiwoks, R

    1995-01-01

    The main objectives of future experiments at the Large Hadron Collider are the search for the Higgs boson (or bosons), the verification of the Standard Model and the search beyond the Standard Model in a new energy range up to a few TeV. These experiments will have to cope with unprecedentedly high data rates and will need event building systems which can offer a bandwidth of 1 to 100 GB/s and which can assemble events from 100 to 1000 readout memories at rates of 1 to 100 kHz. This work investigates the feasibility of parallel event building systems using commercially available high speed interconnects and switches. Studies are performed by building a small-scale prototype and by modelling this prototype and realistic architectures with discrete-event simulations. The prototype is based on the HiPPI standard and uses commercially available VME-HiPPI interfaces and a HiPPI switch together with modular and scalable software. The setup operates successfully as a parallel event building system of limited size in...

  14. Discrete-event simulation for the design and evaluation of physical protection systems

    International Nuclear Information System (INIS)

    Jordan, S.E.; Snell, M.K.; Madsen, M.M.; Smith, J.S.; Peters, B.A.

    1998-01-01

    This paper explores the use of discrete-event simulation for the design and control of physical protection systems for fixed-site facilities housing items of significant value. It begins by discussing several modeling and simulation activities currently performed in designing and analyzing these protection systems and then discusses capabilities that design/analysis tools should have. The remainder of the article then discusses in detail how some of these new capabilities have been implemented in software to achieve a prototype design and analysis tool. The simulation software technology provides a communications mechanism between a running simulation and one or more external programs. In the prototype security analysis tool, these capabilities are used to facilitate human-in-the-loop interaction and to support a real-time connection to a virtual reality (VR) model of the facility being analyzed. This simulation tool can be used for both training (in real-time mode) and facility analysis and design (in fast mode)

  15. Sensitivity of a Simulated Derecho Event to Model Initial Conditions

    Science.gov (United States)

    Wang, Wei

    2014-05-01

    Since 2003, the MMM division at NCAR has been experimenting with cloud-permitting-scale weather forecasting using the Weather Research and Forecasting (WRF) model. Over the years, we have tested different model physics and tried different initial and boundary conditions. Not surprisingly, we found that the model's forecasts are more sensitive to the initial conditions than to model physics. In the 2012 real-time experiment, WRF-DART (Data Assimilation Research Testbed) at 15 km was employed to produce initial conditions for a twice-a-day forecast at 3 km. On June 29, this forecast system captured one of the most destructive derecho events on record. In this presentation, we will examine forecast sensitivity to different model initial conditions and try to understand the important features that may contribute to the success of the forecast.

  16. U.S. Marine Corps Communication-Electronics School Training Process: Discrete-Event Simulation and Lean Options

    National Research Council Canada - National Science Library

    Neu, Charles R; Davenport, Jon; Smith, William R

    2007-01-01

    This paper uses discrete-event simulation modeling, inventory-reduction, and process improvement concepts to identify and analyze possibilities for improving the training continuum at the Marine Corps...

  17. Three Dimensional Simulation of the Baneberry Nuclear Event

    Science.gov (United States)

    Lomov, Ilya N.; Antoun, Tarabay H.; Wagoner, Jeff; Rambo, John T.

    2004-07-01

    Baneberry, a 10-kiloton nuclear event, was detonated at a depth of 278 m at the Nevada Test Site on December 18, 1970. Shortly after detonation, radioactive gases emanating from the cavity were released into the atmosphere through a shock-induced fissure near surface ground zero. Extensive geophysical investigations, coupled with a series of 1D and 2D computational studies were used to reconstruct the sequence of events that led to the catastrophic failure. However, the geological profile of the Baneberry site is complex and inherently three-dimensional, which meant that some geological features had to be simplified or ignored in the 2D simulations. This left open the possibility that features unaccounted for in the 2D simulations could have had an important influence on the eventual containment failure of the Baneberry event. This paper presents results from a high-fidelity 3D Baneberry simulation based on the most accurate geologic and geophysical data available. The results are compared with available data, and contrasted against the results of the previous 2D computational studies.

  18. Validating numerical simulations of snow avalanches using dendrochronology: the Cerro Ventana event in Northern Patagonia, Argentina

    Directory of Open Access Journals (Sweden)

    A. Casteller

    2008-05-01

    The damage caused by snow avalanches to property and human lives is underestimated in many regions around the world, especially where this natural hazard remains poorly documented. One such region is the Argentinean Andes, where numerous settlements are threatened almost every winter by large snow avalanches. On 1 September 2002, the largest tragedy in the history of Argentinean mountaineering took place at Cerro Ventana, Northern Patagonia: nine persons were killed and seven others injured by a snow avalanche. In this paper, we combine numerical modeling and dendrochronological investigations to reconstruct this event. Using information released by local governmental authorities and compiled in the field, the avalanche event was numerically simulated using the avalanche dynamics programs AVAL-1D and RAMMS. Avalanche characteristics, such as extent and date, were determined using dendrochronological techniques. Model simulation results were compared with documentary and tree-ring evidence for the 2002 event. Our results show a good agreement between the simulated projection of the avalanche and its reconstructed extent using tree-ring records. Differences between the observed and the simulated avalanche, principally related to the snow height deposition in the run-out zone, are mostly attributed to the low resolution of the digital elevation model used to represent the valley topography. The main contributions of this study are (1) to provide the first calibration of numerical avalanche models for the Patagonian Andes and (2) to highlight the potential of Nothofagus pumilio tree-ring records to reconstruct past snow-avalanche events in time and space. Future research should focus on testing this combined approach in other forested regions of the Andes.

  19. Simulation Modelling and Strategic Change: Creating the Sustainable Enterprise

    Directory of Open Access Journals (Sweden)

    Patrick Dawson

    2010-01-01

    This paper highlights the benefits of using discrete event simulation models for developing change management frameworks which facilitate productivity and environmental improvements in order to create a sustainable enterprise. There is an increasing need for organisations to be more socially and environmentally responsible; however, these objectives cannot be realised in isolation from the strategic, operations and business objectives of the enterprise. Discrete Event Simulation models facilitate a multidimensional approach to enterprise modelling which can integrate operations and strategic considerations with environmental and social issues. Moreover, these models can provide a dynamic roadmap for implementing a change strategy for realising the optimal conditions for operational and environmental performance. It is important to note that the nature of change is itself dynamic and that simulation models are capable of characterising the dynamics of the change process. The paper argues that incorporating social and environmental challenges into a strategic business model for an enterprise can result in improved profits and long term viability, and that a multidimensional simulation approach can support decision making throughout the change process to more effectively achieve these goals.

  20. Discrete event simulation model for external yard choice of import container terminal in a port buffer area

    Science.gov (United States)

    Rusgiyarto, Ferry; Sjafruddin, Ade; Frazila, Russ Bona; Suprayogi

    2017-06-01

    Increasing container traffic and land acquisition problems for terminal expansion lead to the usage of external yards in a port buffer area. This condition influences terminal performance because the road connecting the terminal and the external yard is also used by non-container traffic. A location choice problem is considered to address this condition, but previous research has not yet taken into account the stochastic nature of container arrival rates and service times. A bi-level programming framework was used to find the optimum location configuration. In the lower level, there is a difficulty in constructing an equation that correlates terminal and road operation, because the two have different equilibrium time cycles: containers move from the quay to the terminal gate in daily units of time, whereas they move from the terminal gate to the external yard through the road in units of minutes. If the equation is formulated as an hourly equilibrium, it cannot capture the container movement characteristics in the terminal; if it is formulated as a daily equilibrium, it cannot capture the road traffic characteristics. This problem can be addressed using a simulation model. A Discrete Event Simulation model was used to simulate the import container flow processes in the container terminal and external yard. The optimum location configuration in the upper level is a combinatorial problem, which was solved by a Full Enumeration approach. The objective function of the external yard location model was to minimize user transport cost (or time) and to maximize operator benefit. A numerical experiment was run for a scenario assuming two container handling ways, three external yards, and a thirty-day simulation period. Jakarta International Container Terminal (JICT) container characteristics data were used for the simulation. Based on five simulation runs of 5, 10, 15, 20, and 30 repetitions, operation of one of the three available external yards (external yard
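
    The upper level of such a bi-level scheme can be sketched as full enumeration over yard subsets, with each candidate configuration scored by repeated replications of the lower-level simulation. Everything below is invented for illustration; the cost function is a stand-in for the discrete event simulation and the JICT parameters:

      import itertools, random

      def simulated_cost(config, seed=0):
          # Stand-in for the lower-level discrete-event simulation: returns a
          # noisy transport-plus-operation cost for a yard configuration
          # (hypothetical cost model, not the JICT parameters).
          rng = random.Random(hash(config) ^ seed)
          base = {1: 120.0, 2: 95.0, 3: 105.0}          # per-yard unit costs
          congestion = 40.0 / len(config)               # fewer yards -> more queueing
          return (sum(base[y] for y in config) / len(config)
                  + congestion + rng.gauss(0, 2))

      # Upper level: full enumeration over non-empty yard combinations,
      # averaging several replications per configuration (the study above
      # used runs of 5 to 30 repetitions).
      best = None
      for r in (1, 2, 3):
          for config in itertools.combinations((1, 2, 3), r):
              cost = sum(simulated_cost(config, s) for s in range(5)) / 5
              print(config, round(cost, 1))
              if best is None or cost < best[1]:
                  best = (config, cost)
      print("best configuration:", best)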

  1. The Skateboard Factory: a teaching case on discrete-event simulation

    Directory of Open Access Journals (Sweden)

    Marco Aurélio de Mesquita

    Real-life applications during the teaching process are a desirable practice in simulation education. However, access to real cases makes such practice difficult to implement, especially when classes are large. This paper presents a teaching case for a computer simulation course in a production engineering undergraduate program. The motivation for the teaching case was to provide students with a realistic manufacturing case to stimulate the learning of simulation concepts and methods in the context of industrial engineering. The case considers a virtual factory of skateboards, whose operations include parts manufacturing, final assembly and storage of raw materials, work-in-process and finished products. Students model and simulate the factory, under push and pull production strategies, using any simulation software available in the laboratory. The teaching case, applied over the last two years, has helped motivate and consolidate the students' learning of discrete-event simulation. It proved to be a feasible alternative to the previous practice of letting students freely choose a case for their final project, while keeping the essence of the project-based learning approach.

  2. Simulation and event reconstruction inside the PandaRoot framework

    International Nuclear Information System (INIS)

    Spataro, S

    2008-01-01

    The PANDA detector will be located at the future GSI accelerator FAIR. Its primary objective is the investigation of the strong interaction with anti-proton beams with momenta of up to 15 GeV/c for the incoming anti-proton. The PANDA offline simulation framework is called 'PandaRoot', as it is based upon the ROOT 5.14 package. It is characterized by high versatility: it allows one to perform simulation and analysis, and to run different event generators (EvtGen, Pluto, UrQmd) and different transport models (Geant3, Geant4, Fluka) with the same code, and thus to compare results simply by changing a few macro lines, without recompiling at all. Moreover, auto-configuration scripts allow the full framework to be installed easily on different Linux distributions and with different compilers (the framework was installed and tested on more than 10 Linux platforms) without further manipulation. The final data are in a tree format, easily accessible and readable through simple clicks in the ROOT browser. The presentation will report on the current status of the computing development inside the PandaRoot framework, in terms of detector implementation and event reconstruction

  3. Comparative Study of Aircraft Boarding Strategies Using Cellular Discrete Event Simulation

    Directory of Open Access Journals (Sweden)

    Shafagh Jafer

    2017-11-01

    Time is crucial in the airline industry. Among all factors contributing to an aircraft's turnaround time, passenger boarding delay is the most challenging one. Airlines do not have control over the behavior of passengers and thus focus their efforts on reducing passenger boarding time through implementing efficient boarding strategies. In this work, we use cellular Discrete-Event System Specification (Cell-DEVS) modeling and simulation to provide a comprehensive evaluation of aircraft boarding strategies. We have developed a simulation benchmark consisting of eight boarding strategies: Back-to-Front, Window Middle Aisle, Random, Zone Rotate, Reverse Pyramid, Optimal, Optimal Practical, and Efficient. Our simulation models are scalable and adaptive, providing a powerful analysis apparatus for investigating any existing or yet to be discovered boarding strategy. We explain the details of our models and present the results both visually and numerically to evaluate the eight implemented boarding strategies. We also compare our results with other studies that have used different modeling techniques, reporting nearly identical performance results. The simulations revealed that Window Middle Aisle provides the least boarding delay, with a small fraction of time difference compared to the optimal strategy. The results of this work could highly benefit the commercial airline industry by optimizing and reducing passenger boarding delays.

  4. Identification of coronal heating events in 3D simulations

    Science.gov (United States)

    Kanella, Charalambos; Gudiksen, Boris V.

    2017-07-01

    Context. The solar coronal heating problem has been an open question in the science community since 1939. One of the proposed models for the transport and release of mechanical energy generated in the sub-photospheric layers and photosphere is the magnetic reconnection model, which incorporates Ohmic heating releasing part of the energy stored in the magnetic field. In this model many unresolved flaring events occur in the solar corona, releasing enough energy to heat the corona. Aims: The problem with the verification and quantification of this model is that we cannot resolve small-scale events due to limitations of current observational instrumentation. Flaring events have scaling behavior extending from large X-class flares down to the so far unobserved nanoflares. Histograms of observable characteristics of flares show power-law behavior for energy release rate, size, and total energy. Depending on the power-law index of the energy release, nanoflares might be an important candidate for coronal heating; we seek to find that index. Methods: In this paper we employ a numerical three-dimensional (3D) magnetohydrodynamic (MHD) simulation produced by the numerical code Bifrost, which enables us to look into smaller structures, and a new technique to identify the 3D heating events at a specific instant. The quantity we explore is the Joule heating, a term calculated directly by the code, which is explicitly correlated with magnetic reconnection because it depends on the curl of the magnetic field. Results: We are able to identify 4136 events in a volume of 24 × 24 × 9.5 Mm³ (i.e., 768 × 786 × 331 grid cells) of a specific snapshot. We find a power-law slope of the released energy per second equal to αP = 1.5 ± 0.02, and two power-law slopes of the identified volume equal to αV = 1.53 ± 0.03 and αV = 2.53 ± 0.22. The identified energy events do not represent all the released energy, but of the identified events, the total energy of the largest events

  5. Efficient Simulation Modeling of an Integrated High-Level-Waste Processing Complex

    International Nuclear Information System (INIS)

    Gregory, Michael V.; Paul, Pran K.

    2000-01-01

    An integrated computational tool named the Production Planning Model (ProdMod) has been developed to simulate the operation of the entire high-level-waste (HLW) complex at the Savannah River Site (SRS) over its full life cycle. ProdMod is used to guide SRS management in operating the waste complex in an economically efficient and environmentally sound manner. SRS HLW operations are modeled using coupled algebraic equations. The dynamic nature of plant processes is modeled in the form of a linear construct in which the time dependence is implicit. Batch processes are modeled in discrete event-space, while continuous processes are modeled in time-space. The ProdMod methodology maps between event-space and time-space such that the inherent mathematical discontinuities in batch process simulation are avoided without sacrificing any of the necessary detail in the batch recipe steps. Modeling the processes separately in event- and time-space using linear constructs, and then coupling the two spaces, has made the simulation faster than a typical dynamic simulation. The ProdMod simulator models have been validated against operating data and other computer codes. Case studies have demonstrated the usefulness of the ProdMod simulator in developing strategies that yield significant cost savings in operating the SRS HLW complex and in verifying the feasibility of newly proposed processes.

  6. A hadron-nucleus collision event generator for simulations at intermediate energies

    CERN Document Server

    Ackerstaff, K; Bollmann, R

    2002-01-01

    Several available codes for hadronic event generation and shower simulation are discussed and their predictions are compared to experimental data in order to obtain a satisfactory description of hadronic processes in Monte Carlo studies of detector systems for medium-energy experiments. The most reasonable description is found for the intra-nuclear-cascade (INC) model of Bertini, which employs a microscopic description of the INC, taking into account elastic and inelastic pion-nucleon and nucleon-nucleon scattering. The isobar model of Sternheimer and Lindenbaum is used to simulate the inelastic elementary collisions inside the nucleus via formation and decay of the Δ33 resonance, which, however, limits the model at higher energies. To overcome this limitation, the INC model has been extended by using the resonance model of the HADRIN code, considering all resonances in elementary collisions contributing more than 2% to the total cross section up to kinetic energies of 5 GeV. In addition, angular d...

  7. Modelling and real-time simulation of continuous-discrete systems in mechatronics

    Energy Technology Data Exchange (ETDEWEB)

    Lindow, H. [Rostocker, Magdeburg (Germany)]

    1996-12-31

    This work presents a methodology for the simulation and modelling of systems with continuous-discrete dynamics. It derives hybrid discrete event models from Lagrange's equations of motion. The method combines continuous mechanical, electrical, and thermodynamic submodels on the one hand with discrete event models on the other into a hybrid discrete event model. This straightforward software development avoids numerical overhead.

  8. Comparison of Explicitly Simulated and Downscaled Tropical Cyclone Activity in a High-Resolution Global Climate Model

    Directory of Open Access Journals (Sweden)

    Hirofumi Tomita

    2010-01-01

    The response of tropical cyclone activity to climate change is a matter of great inherent interest and practical importance. Most current global climate models are not, however, capable of adequately resolving tropical cyclones; this has led to the development of downscaling techniques designed to infer tropical cyclone activity from the large-scale fields produced by climate models. Here we compare the statistics of tropical cyclones simulated explicitly in a very high resolution (~14 km grid mesh) global climate model to the results of one such downscaling technique driven by the same global model. This is done for a simulation of the current climate and also for a simulation of a climate warmed by the addition of carbon dioxide. The explicitly simulated and downscaled storms are similarly distributed in space, but the intensity distribution of the downscaled events has a somewhat longer high-intensity tail, owing to the higher resolution of the downscaling model. Both explicitly simulated and downscaled events show large increases in the frequency of events at the high-intensity ends of their respective intensity distributions, but the downscaled storms also show increases in low-intensity events, whereas the explicitly simulated weaker events decline in number. On the regional scale, there are large differences in the responses of the explicitly simulated and downscaled events to global warming. In particular, the power dissipation of downscaled events shows a 175% increase in the Atlantic, while the power dissipation of explicitly simulated events declines there.

  9. A Modeling and Simulation Framework for Adverse Events in Erlotinib-Treated Non-Small-Cell Lung Cancer Patients.

    Science.gov (United States)

    Suleiman, Ahmed Abbas; Frechen, Sebastian; Scheffler, Matthias; Zander, Thomas; Nogova, Lucia; Kocher, Martin; Jaehde, Ulrich; Wolf, Jürgen; Fuhr, Uwe

    2015-11-01

    Treatment with erlotinib, an epidermal growth factor receptor tyrosine kinase inhibitor used for treating non-small-cell lung cancer (NSCLC) and other cancers, is frequently associated with adverse events (AEs). We present a modeling and simulation framework for the most common erlotinib-induced AEs, rash and diarrhea, providing insights into erlotinib toxicity. We used the framework to investigate the safety of high-dose erlotinib pulses proposed to limit acquired resistance while treating NSCLC. Continuous-time Markov models were developed using rash and diarrhea AE data from 39 NSCLC patients treated with erlotinib (150 mg/day). Exposure and different covariates were investigated as predictors of variability. Rash was also tested as a survival predictor. The models developed were used in a simulation analysis to compare the toxicities of different regimens, including the aforementioned pulsed strategy. The probabilities of experiencing rash or diarrhea were found to be highest early during treatment. Rash, but not diarrhea, was positively correlated with erlotinib exposure. In contrast with some common understandings, radiotherapy decreased transitioning to higher rash grades by 81%. Simulations predicted that the proposed pulsed regimen (1600 mg/week + 50 mg/day on the remaining weekdays) results in a maximum of 20% of patients suffering from severe rash throughout the treatment course, in comparison to 12% under standard dosing (150 mg/day). In conclusion, the framework demonstrated that radiotherapy attenuates erlotinib-induced rash, providing an opportunity to use radiotherapy and erlotinib together, and demonstrated the tolerability of high-dose pulses intended to address acquired resistance to erlotinib.
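
    The core machinery here, a continuous-time Markov model of toxicity grades, is easy to sketch. The rates below are placeholders rather than the study's estimates, and the three-grade state space is a simplification; the point is the Gillespie-style sampling of exponential sojourn times and grade transitions.

```python
import numpy as np

# Hypothetical transition-rate matrix (per day) between rash grades 0-2;
# the numbers are illustrative, not the estimates from the cited study.
Q = np.array([[-0.10,  0.08,  0.02],
              [ 0.05, -0.09,  0.04],
              [ 0.02,  0.06, -0.08]])

def simulate_ctmc(Q, state, t_end, rng):
    """Gillespie-style simulation of one patient's grade trajectory."""
    t, path = 0.0, [(0.0, state)]
    while True:
        rate = -Q[state, state]               # total exit rate from current state
        t += rng.exponential(1.0 / rate)      # exponential sojourn time
        if t >= t_end:
            return path
        probs = Q[state].clip(min=0.0)
        probs /= probs.sum()                  # jump probabilities to other states
        state = rng.choice(len(Q), p=probs)
        path.append((t, state))

rng = np.random.default_rng(1)
trajectories = [simulate_ctmc(Q, state=0, t_end=56.0, rng=rng) for _ in range(1000)]
severe = sum(any(s == 2 for _, s in traj) for traj in trajectories)
print(f"{100 * severe / 1000:.1f}% of simulated patients ever reach grade 2")
```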

  10. Event-by-Event Simulation of Induced Fission

    Energy Technology Data Exchange (ETDEWEB)

    Vogt, R; Randrup, J

    2007-12-13

    We are developing a novel code that treats induced fission by statistical (or Monte-Carlo) simulation of individual decay chains. After its initial excitation, the fissionable compound nucleus may either deexcite by evaporation or undergo binary fission into a large number of fission channels each with different energetics involving both energy dissipation and deformed scission prefragments. After separation and Coulomb acceleration, each fission fragment undergoes a succession of individual (neutron) evaporations, leading to two bound but still excited fission products (that may further decay electromagnetically and, ultimately, weakly), as well as typically several neutrons. (The inclusion of other possible ejectiles is planned.) This kind of approach makes it possible to study more detailed observables than could be addressed with previous treatments which have tended to focus on average quantities. In particular, any type of correlation observable can readily be extracted from a generated set of events. With a view towards making the code practically useful in a variety of applications, emphasis is being put on making it numerically efficient so that large event samples can be generated quickly. In its present form, the code can generate one million full events in about 12 seconds on a MacBook laptop computer. The development of this qualitatively new tool is still at an early stage and quantitative reproduction of existing data should not be expected until a number of detailed refinements have been implemented.
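
    The evaporation stage of such a decay chain is straightforward to illustrate. The sketch below is a toy, not the physics of the code described above: it assumes a fixed neutron separation energy, a schematic temperature, and a simple E·exp(-E/T) evaporation spectrum, just to show how event-by-event sampling yields full multiplicity distributions rather than averages.

```python
import numpy as np

def evaporate(excitation_mev, separation_mev=6.0, rng=None):
    """Toy event-by-event neutron evaporation from one excited fragment:
    emit neutrons until the excitation energy drops below the (fixed)
    separation energy. Returns the sampled neutron kinetic energies."""
    rng = rng or np.random.default_rng()
    neutrons, e = [], excitation_mev
    while e > separation_mev:
        temp = np.sqrt(e / 10.0)               # crude T ~ sqrt(E*/a) temperature
        ekin = rng.gamma(shape=2.0, scale=temp)  # p(E) ~ E * exp(-E/T)
        if separation_mev + ekin > e:          # not enough energy left to emit
            break
        e -= separation_mev + ekin             # energy bookkeeping per emission
        neutrons.append(ekin)
    return neutrons

rng = np.random.default_rng(42)
events = [evaporate(rng.normal(20.0, 4.0), rng=rng) for _ in range(10_000)]
nu = np.array([len(ev) for ev in events])
print(f"mean multiplicity = {nu.mean():.2f}; P(nu) = {np.bincount(nu) / nu.size}")
```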

  11. GRMHD Simulations of Visibility Amplitude Variability for Event Horizon Telescope Images of Sgr A*

    Science.gov (United States)

    Medeiros, Lia; Chan, Chi-kwan; Özel, Feryal; Psaltis, Dimitrios; Kim, Junhan; Marrone, Daniel P.; Sądowski, Aleksander

    2018-04-01

    The Event Horizon Telescope will generate horizon scale images of the black hole in the center of the Milky Way, Sgr A*. Image reconstruction using interferometric visibilities rests on the assumption of a stationary image. We explore the limitations of this assumption using high-cadence disk- and jet-dominated GRMHD simulations of Sgr A*. We also employ analytic models that capture the basic characteristics of the images to understand the origin of the variability in the simulated visibility amplitudes. We find that, in all simulations, the visibility amplitudes for baselines oriented parallel and perpendicular to the spin axis of the black hole follow general trends that do not depend strongly on accretion-flow properties. This suggests that fitting Event Horizon Telescope observations with simple geometric models may lead to a reasonably accurate determination of the orientation of the black hole on the plane of the sky. However, in the disk-dominated models, the locations and depths of the minima in the visibility amplitudes are highly variable and are not related simply to the size of the black hole shadow. This suggests that using time-independent models to infer additional black hole parameters, such as the shadow size or the spin magnitude, will be severely affected by the variability of the accretion flow.

  12. Understanding Emergency Care Delivery Through Computer Simulation Modeling.

    Science.gov (United States)

    Laker, Lauren F; Torabi, Elham; France, Daniel J; Froehle, Craig M; Goldlust, Eric J; Hoot, Nathan R; Kasaie, Parastu; Lyons, Michael S; Barg-Walkow, Laura H; Ward, Michael J; Wears, Robert L

    2018-02-01

    In 2017, Academic Emergency Medicine convened a consensus conference entitled, "Catalyzing System Change through Health Care Simulation: Systems, Competency, and Outcomes." This article, a product of the breakout session on "understanding complex interactions through systems modeling," explores the role that computer simulation modeling can and should play in research and development of emergency care delivery systems. This article discusses areas central to the use of computer simulation modeling in emergency care research. The four central approaches to computer simulation modeling are described (Monte Carlo simulation, system dynamics modeling, discrete-event simulation, and agent-based simulation), along with problems amenable to their use and relevant examples to emergency care. Also discussed is an introduction to available software modeling platforms and how to explore their use for research, along with a research agenda for computer simulation modeling. Through this article, our goal is to enhance adoption of computer simulation, a set of methods that hold great promise in addressing emergency care organization and design challenges.
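
    Of the four approaches listed, discrete-event simulation is the most common for patient-flow questions, and its core mechanic, a time-ordered event list, fits in a few lines. The sketch below is a generic M/M/1 "emergency department" illustration with arbitrary rates, not a model from the article:

```python
import heapq, random

def simulate_ed(arrival_rate=4.0, service_rate=5.0, horizon=1000.0, seed=0):
    """Minimal discrete-event simulation of a single-server ED queue:
    a time-ordered event list (the heap) drives arrivals and departures."""
    rng = random.Random(seed)
    events = [(rng.expovariate(arrival_rate), "arrival")]
    queue, busy, waits = [], False, []
    while events:
        t, kind = heapq.heappop(events)
        if t > horizon:
            break
        if kind == "arrival":
            # Schedule the next arrival, then start service or join the queue.
            heapq.heappush(events, (t + rng.expovariate(arrival_rate), "arrival"))
            if busy:
                queue.append(t)
            else:
                busy = True
                waits.append(0.0)
                heapq.heappush(events, (t + rng.expovariate(service_rate), "departure"))
        else:
            if queue:
                waits.append(t - queue.pop(0))   # next patient starts service
                heapq.heappush(events, (t + rng.expovariate(service_rate), "departure"))
            else:
                busy = False
    return sum(waits) / len(waits)

print(f"simulated mean wait: {simulate_ed():.2f} time units")
```

    The printed mean wait can be checked against the M/M/1 queueing formula λ/(μ(μ-λ)) = 0.8, a useful sanity check before layering on triage classes, multiple servers, or priorities.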

  13. A General Simulation Framework for Supply Chain Modeling: State of the Art and Case Study

    OpenAIRE

    Antonio Cimino; Francesco Longo; Giovanni Mirabelli

    2010-01-01

    Nowadays there is a large availability of discrete event simulation software that can be easily used in different domains: from industry to supply chain, from healthcare to business management, from training to complex systems design. Simulation engines of commercial discrete event simulation software use specific rules and logics for simulation time and events management. Difficulties and limitations come up when commercial discrete event simulation software is used for modeling complex rea...

  14. System modeling and simulation at EBR-II

    International Nuclear Information System (INIS)

    Dean, E.M.; Lehto, W.K.; Larson, H.A.

    1986-01-01

    The codes being developed and verified using EBR-II data are NATDEMO, DSNP, and CSYRED. NATDEMO is a variation of the Westinghouse DEMO code coupled to the NATCON code, previously used to simulate perturbations of reactor flow and inlet temperature and loss-of-flow transients leading to natural convection in EBR-II. CSYRED uses the Continuous System Modeling Program (CSMP) to simulate the EBR-II core, including power, temperature, control-rod-movement reactivity effects, and flow, and is used primarily to model reactivity-induced power transients. The Dynamic Simulator for Nuclear Power Plants (DSNP) allows a whole-plant, thermal-hydraulic simulation using specific component and system models called from libraries. It has been used to simulate flow-coastdown transients, reactivity insertion events, and balance-of-plant perturbations.

  15. A global MHD simulation of an event with a quasi-steady northward IMF component

    Directory of Open Access Journals (Sweden)

    V. G. Merkin

    2007-06-01

    We show results of Lyon-Fedder-Mobarry (LFM) global MHD simulations of an event previously examined using Iridium spacecraft observations as well as DMSP and IMAGE FUV data. The event is chosen for the steady northward IMF sustained over a three-hour period during 16 July 2000. The Iridium observations showed very weak or absent Region 2 currents in the ionosphere, which makes the event favorable for global MHD modeling. Here we are interested in examining the model's performance during weak magnetospheric forcing, in particular its ability to reproduce gross signatures of the ionospheric currents and convection pattern and the energy deposition in the ionosphere due both to the Poynting flux and to particle precipitation. We compare the ionospheric field-aligned current and electric potential patterns with those recovered from Iridium and DMSP observations, respectively. In addition, DMSP magnetometer data are used for comparisons of ionospheric magnetic perturbations. The electromagnetic energy flux is compared with Iridium-inferred values, while IMAGE FUV observations are utilized to verify the simulated particle energy flux.

  16. Modelling and Simulating multi-echelon food systems

    NARCIS (Netherlands)

    Vorst, van der J.G.A.J.; Beulens, A.J.M.; Beek, van P.

    2000-01-01

    This paper presents a method for modelling the dynamic behaviour of food supply chains and evaluating alternative designs of the supply chain by applying discrete-event simulation. The modelling method is based on the concepts of business processes, design variables at strategic and operational

  17. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    Science.gov (United States)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, compared with traditional hosting, cloud computing is at a disadvantage in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized, network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.

  1. Impact of the Assimilation of Hyperspectral Infrared Profiles on Advanced Weather and Research Model Simulations of a Non-Convective Wind Event

    Science.gov (United States)

    Berndt, Emily B.; Zavodsky, Bradley T; Jedlovec, Gary J.; Elmer, Nicholas J.

    2013-01-01

    Non-convective wind events commonly occur with passing extratropical cyclones and have significant societal and economic impacts. Since non-convective winds often occur in the absence of specific phenomena such as a thunderstorm, tornado, or hurricane, the public is less likely to heed high wind warnings and continues daily activities. Thus, non-convective wind events result in as many fatalities as straight-line thunderstorm winds. One physical explanation for non-convective winds involves tropopause folds. Improved model representation of stratospheric air and associated non-convective wind events could improve non-convective wind forecasts and associated warnings. In recent years, satellite data assimilation has improved skill in forecasting extratropical cyclones; however, errors still remain in forecasting the position and strength of extratropical cyclones as well as the tropopause folding process. The goal of this study is to determine the impact of assimilating satellite temperature and moisture retrieved profiles from hyperspectral infrared (IR) sounders (i.e., the Atmospheric Infrared Sounder (AIRS), Cross-track Infrared and Microwave Sounding Suite (CrIMSS), and Infrared Atmospheric Sounding Interferometer (IASI)) on the model representation of the tropopause fold and an associated high wind event that impacted the Northeast United States on 09 February 2013. Model simulations using the Advanced Research Weather Research and Forecasting Model (ARW) were conducted on a 12-km grid with cycled data assimilation mimicking the operational North American Model (NAM). The results from the satellite assimilation run are compared to a control experiment (without hyperspectral IR retrievals), the North American Regional Reanalysis (NARR), and Rapid Refresh analyses.

  2. Charge-dependent correlations from event-by-event anomalous hydrodynamics

    Energy Technology Data Exchange (ETDEWEB)

    Hirono, Yuji [Department of Physics and Astronomy, Stony Brook University, Stony Brook, NY 11794-3800 (United States); Hirano, Tetsufumi [Department of Physics, Sophia University, Tokyo 102-8554 (Japan); Kharzeev, Dmitri E. [Department of Physics and Astronomy, Stony Brook University, Stony Brook, NY 11794-3800 (United States); Department of Physics and RIKEN-BNL Research Center, Brookhaven National Laboratory, Upton, NY 11973-5000 (United States)

    2016-12-15

    We report on our recent attempt at quantitative modeling of the Chiral Magnetic Effect (CME) in heavy-ion collisions. We perform 3+1 dimensional anomalous hydrodynamic simulations on an event-by-event basis, with constitutive equations that contain the anomaly-induced effects. We also develop a model of the initial condition for the axial charge density that captures the statistical nature of the random chirality imbalances created by the color flux tubes. Based on the event-by-event hydrodynamic simulations for hundreds of thousands of collisions, we calculate the correlation functions that are measured in experiments, and discuss how the anomalous transport affects these observables.

  3. ANALYSIS OF INPATIENT HOSPITAL STAFF MENTAL WORKLOAD BY MEANS OF DISCRETE-EVENT SIMULATION

    Science.gov (United States)

    2016-03-24

    ANALYSIS OF INPATIENT HOSPITAL STAFF MENTAL WORKLOAD BY MEANS OF DISCRETE-EVENT SIMULATION (AFIT-ENV-MS-16-M-166).

  4. Simulation and study of small numbers of random events

    Science.gov (United States)

    Shelton, R. D.

    1986-01-01

    Random events were simulated by computer and subjected to various statistical methods to extract important parameters. Various forms of curve fitting were explored, such as least squares, least distance from a line, and maximum likelihood. Problems considered were dead time, exponential decay, and spectrum extraction from cosmic-ray data, using both binned data and data from individual events. Computer programs, mostly of an iterative nature, were developed to perform these simulations and extractions and are partially listed as appendices. The mathematical basis for the computer programs is given.
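
    For the exponential-decay problem, the maximum-likelihood extraction is short enough to show whole. This is a generic illustration (ignoring dead time), not code from the report; for an exponential density the lifetime estimator reduces to the sample mean:

```python
import numpy as np

rng = np.random.default_rng(7)
tau_true = 2.5
n_events = 50                                  # deliberately small event sample
times = rng.exponential(tau_true, size=n_events)

# For p(t) = (1/tau) exp(-t/tau), the maximum-likelihood estimate of tau is
# the sample mean, with statistical uncertainty tau_hat / sqrt(n).
tau_hat = times.mean()
tau_err = tau_hat / np.sqrt(n_events)
print(f"tau = {tau_hat:.2f} +/- {tau_err:.2f} (true value {tau_true})")
```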

  5. Event-based scenario manager for multibody dynamics simulation of heavy load lifting operations in shipyards

    Directory of Open Access Journals (Sweden)

    Sol Ha

    2016-01-01

    This paper suggests an event-based scenario manager capable of creating and editing a scenario for shipbuilding process simulation based on multibody dynamics. To configure various situations in shipyards and connect easily with the multibody dynamics, the proposed method has two main concepts: an Actor and an Action List. The Actor represents the atomic unit of action in the multibody dynamics and can be connected to a specific component of the dynamics kernel, such as a body or joint. The user makes up a scenario by combining actors. The Action List contains the information for arranging and executing the actors. Since the shipbuilding process is an event-based sequence, all simulation models were configured using the Discrete Event System Specification (DEVS) formalism. The proposed method was applied to simulations of various operations in shipyards, such as lifting and erection of a block and heavy-load lifting operations using multiple cranes.
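
    The Actor / Action List split can be pictured in a few lines of Python. The class names, fields, and completion predicates below are hypothetical, intended only to show the pattern of actors bound to kernel components and fired in sequence by completion events, not the paper's actual interfaces:

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Actor:
    """Atomic unit of action, bound to a dynamics-kernel component
    (e.g., a body or joint); all names here are illustrative only."""
    name: str
    target: str                        # e.g., "crane_1.hoist_joint"
    execute: Callable[[float], bool]   # returns True when the action finishes

@dataclass
class ActionList:
    """Orders actors and fires each one when its predecessor completes,
    mimicking an event-based (DEVS-like) scenario sequence."""
    actors: List[Actor] = field(default_factory=list)

    def run(self, dt: float = 0.1, t_max: float = 100.0) -> None:
        t, i = 0.0, 0
        while i < len(self.actors) and t < t_max:
            if self.actors[i].execute(t):      # completion event
                print(f"t={t:5.1f}s  {self.actors[i].name} done")
                i += 1
            t += dt

# Toy scenario: lift a block, then translate it with the trolley.
lift = Actor("lift_block", "crane_1.hoist", lambda t: t >= 5.0)
move = Actor("move_block", "crane_1.trolley", lambda t: t >= 12.0)
ActionList([lift, move]).run()
```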

  6. A model management system for combat simulation

    OpenAIRE

    Dolk, Daniel R.

    1986-01-01

    The design and implementation of a model management system to support combat modeling is discussed. Structured modeling is introduced as a formalism for representing mathematical models. A relational information resource dictionary system is developed which can accommodate structured models. An implementation is described. Structured modeling is then compared to Jackson System Development (JSD) as a methodology for facilitating discrete event simulation. JSD is currently better at representin...

  7. Geologic simulation model for a hypothetical site in the Columbia Plateau

    International Nuclear Information System (INIS)

    Petrie, G.M.; Zellmer, J.T.; Lindberg, J.W.; Foley, M.G.

    1981-04-01

    This report describes the structure and operation of the Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Geologic Simulation Model, a computer simulation model of the geology and hydrology of an area of the Columbia Plateau, Washington. The model is used to study the long-term suitability of the Columbia Plateau basalts for the storage of nuclear waste in a mined repository. It is also a starting point for analyses of such repositories in other geologic settings. The Geologic Simulation Model will aid in formulating design disruptive sequences (i.e., those to be used for more detailed hydrologic, transport, and dose analyses) from the spectrum of hypothetical geological and hydrological developments that could result in transport of radionuclides out of a repository. Quantitative and auditable execution of this task, however, is impossible without computer simulation. The computer simulation model aids the geoscientist by generating the wide spectrum of possible future evolutionary paths of the areal geology and hydrology, identifying those that may affect repository integrity. This allows the geoscientist to focus on potentially disruptive processes or series of events. Eleven separate submodels are used in the simulation portion of the model: Climate, Continental Glaciation, Deformation, Geomorphic Events, Hydrology, Magmatic Events, Meteorite Impact, Sea-Level Fluctuations, Shaft-Seal Failure, Sub-Basalt Basement Faulting, and Undetected Features. Because of the modular construction of the model, each submodel can easily be replaced with an updated or modified version as new information or developments in the state of the art become available. The model simulates the geologic and hydrologic systems of a hypothetical repository site and region for a million years following repository decommissioning. The Geologic Simulation Model operates in both single-run and Monte Carlo modes.

  8. Simulated CONUS Flash Flood Climatologies from Distributed Hydrologic Models

    Science.gov (United States)

    Flamig, Z.; Gourley, J. J.; Vergara, H. J.; Kirstetter, P. E.; Hong, Y.

    2016-12-01

    This study will describe a CONUS flash flood climatology created over the period from 2002 through 2011. The MRMS reanalysis precipitation dataset was used as forcing for the Ensemble Framework For Flash Flood Forecasting (EF5). This high-resolution (1 sq km, 5-minute) dataset is ideal for simulating flash floods with a distributed hydrologic model. EF5 features multiple water balance components, including SAC-SMA, CREST, and a hydrophobic model, all coupled with kinematic wave routing. The EF5/SAC-SMA and EF5/CREST water balance schemes were used to create dual flash flood climatologies based on their differing water balance principles. For the period from 2002 through 2011, the daily maximum streamflow, unit streamflow, and time of peak streamflow were stored along with the minimum soil moisture. These variables describe the state of the soils right before a flash flood event and the peak streamflow simulated during the event. The results will be shown, compared, and contrasted. The resulting model simulations will be verified on basins smaller than 1,000 sq km with USGS gauges to ensure the distributed hydrologic models are reliable. The results will also be compared spatially to Storm Data flash flood event observations to judge the degree of agreement between the simulated climatologies and observations.

  9. Modelling extreme climatic events in Guadalquivir Estuary ( Spain)

    Science.gov (United States)

    Delgado, Juan; Moreno-Navas, Juan; Pulido, Antoine; García-Lafuente, Juan; Calero Quesada, Maria C.; García, Rodrigo

    2017-04-01

    Extreme climatic events, such as heat waves and severe storms, are predicted to increase in frequency and magnitude as a consequence of global warming, but their socio-ecological effects are poorly understood, particularly in estuarine ecosystems. The Guadalquivir Estuary has been anthropogenically modified several times: the original salt marshes have been transformed to grow rice and cotton, and approximately one-fourth of the total surface of the estuary is now part of two protected areas, one of them a UNESCO MAB Biosphere Reserve. Such climatic events are likely to affect Europe in forthcoming decades, and a better understanding of how these climatic disturbances drive abrupt changes in the Guadalquivir Estuary is needed. A barotropic model has been developed to study how severe storm events affect the estuary by conducting paired control and climate-event simulations. The changes in the local wind and atmospheric pressure conditions in the estuary have been studied in detail, and several scenarios are obtained by running the model under control and real storm conditions. The model output has been validated with in situ water elevations, and good agreement between modelled and real measurements has been obtained. Our preliminary results show that the model is capable of describing the tide-surge levels in the estuary, opening the possibility of studying the interaction between climatic events and port operations and food production activities. The barotropic hydrodynamic model provides spatially explicit information on the key variables governing the tide dynamics of estuarine areas under severe climatic scenarios. The numerical model will be a powerful tool in future climate change mitigation and adaptation programs in a complex socio-ecological system.

  10. Simulating Flaring Events via an Intelligent Cellular Automata Mechanism

    Science.gov (United States)

    Dimitropoulou, M.; Vlahos, L.; Isliker, H.; Georgoulis, M.

    2010-07-01

    We simulate flaring events through a cellular automaton (CA) model in which, for the first time, we use observed vector magnetograms as initial conditions. After nonlinear force-free extrapolation of the magnetic field from the vector magnetograms, we identify magnetic discontinuities using two alternative criteria: (1) the average magnetic field gradient, or (2) the normalized magnetic field curl (i.e., the current). Magnetic discontinuities are identified at the grid sites where the magnetic field gradient or curl exceeds a specified threshold. We then relax the magnetic discontinuities according to the rules of Lu and Hamilton (1991) or Lu et al. (1993), i.e., we redistribute the magnetic field locally so that the discontinuities disappear. In order to simulate the flaring events, we consider several alternative scenarios with regard to: (1) the threshold above which magnetic discontinuities are identified (applying low, high, and height-dependent threshold values); and (2) the driving process that occasionally causes new discontinuities (at randomly chosen grid sites, magnetic field increments are added that are perpendicular, or maybe also parallel, to the existing magnetic field). We address the question of whether the coronal active-region magnetic fields can indeed be considered to be in a state of self-organized criticality (SOC).
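
    The Lu and Hamilton (1991) relaxation rule referenced here is compact enough to sketch. Below is a simplified 2D scalar version with placeholder threshold and driving (the original model is 3D and vector-valued, and this sketch relaxes all unstable sites synchronously): where the local field exceeds its neighbour average by more than a critical amount, 4/5 of the excess is shed by the site and 1/5 is handed to each of the four neighbours, which zeroes the local discontinuity.

```python
import numpy as np

def relax_step(B, Bc=0.3):
    """One synchronous pass of a Lu & Hamilton-style rule on a 2D field:
    redistribute field wherever the local 'curvature' dB exceeds Bc."""
    nbr = (np.roll(B, 1, 0) + np.roll(B, -1, 0) +
           np.roll(B, 1, 1) + np.roll(B, -1, 1))
    dB = B - nbr / 4.0
    release = np.where(np.abs(dB) > Bc, dB, 0.0)
    B = B - 0.8 * release                              # site sheds 4/5 of the excess
    for shift, axis in ((1, 0), (-1, 0), (1, 1), (-1, 1)):
        B = B + 0.2 * np.roll(release, shift, axis)    # each neighbour gains 1/5
    return B, int(np.count_nonzero(release))

rng = np.random.default_rng(3)
B = rng.normal(scale=0.1, size=(64, 64))               # initial "magnetic" field
sizes = []
for step in range(2000):
    i, j = rng.integers(0, 64, size=2)
    B[i, j] += rng.normal(scale=0.1)                   # drive one random site
    total, n = 0, 1
    while n:                                           # let each avalanche run out
        B, n = relax_step(B)
        total += n
    if total:
        sizes.append(total)
print(f"{len(sizes)} avalanches; largest involved {max(sizes, default=0)} relaxations")
```

    In the record's model the same redistribution runs on 3D extrapolated fields, and the released energies, rather than site counts, build the flare statistics.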

  11. Comprehensive Assessment of Models and Events based on Library tools (CAMEL)

    Science.gov (United States)

    Rastaetter, L.; Boblitt, J. M.; DeZeeuw, D.; Mays, M. L.; Kuznetsova, M. M.; Wiegand, C.

    2017-12-01

    At the Community Coordinated Modeling Center (CCMC), the assessment of modeling skill using a library of model-data comparison metrics is taken to the next level by fully integrating the ability to request a series of runs with the same model parameters for a list of events. The CAMEL framework initiates and runs a series of selected, pre-defined simulation settings for participating models (e.g., WSA-ENLIL and SWMF-SC+IH for the heliosphere; SWMF-GM, OpenGGCM, LFM, and GUMICS for the magnetosphere) and performs post-processing using existing tools for a host of different output parameters. The framework compares the resulting time series data with the respective observational data and computes a suite of metrics, such as Prediction Efficiency, Root Mean Square Error, Probability of Detection, Probability of False Detection, and Heidke Skill Score, for each model-data pair. The system then plots scores by event, and aggregated over all events, for all participating models and run settings. We are building on past experience with model-data comparisons of magnetosphere and ionosphere model outputs in the GEM2008, GEM-CEDAR CETI2010, and Operational Space Weather Model challenges (2010-2013). The framework can also be applied to solar-heliosphere and radiation belt models. The CAMEL framework takes advantage of model simulations described with Space Physics Archive Search and Extract (SPASE) metadata and a database backend design developed for a next-generation Run-on-Request system at the CCMC.
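
    The skill scores named here all derive from simple formulas. The sketch below computes the event-based ones from a 2 × 2 contingency table plus the two continuous scores, on synthetic data; the threshold and distributions are arbitrary, and "Prediction Efficiency" is taken in its common Nash-Sutcliffe-style 1 - MSE/variance form, which may differ from CAMEL's exact definition.

```python
import numpy as np

def skill_scores(model, obs, threshold):
    """Common skill metrics for a model-data pair of time series."""
    pred, seen = model >= threshold, obs >= threshold
    a = np.sum(pred & seen)           # hits
    b = np.sum(pred & ~seen)          # false alarms
    c = np.sum(~pred & seen)          # misses
    d = np.sum(~pred & ~seen)         # correct negatives
    pod = a / (a + c)                 # probability of detection
    pofd = b / (b + d)                # probability of false detection
    hss = 2.0 * (a * d - b * c) / ((a + c) * (c + d) + (a + b) * (b + d))
    rmse = np.sqrt(np.mean((model - obs) ** 2))
    pe = 1.0 - np.mean((model - obs) ** 2) / np.var(obs)
    return dict(POD=pod, POFD=pofd, HSS=hss, RMSE=rmse, PE=pe)

rng = np.random.default_rng(5)
obs = rng.gamma(2.0, 1.0, size=500)               # synthetic observed index
model = obs + rng.normal(scale=0.5, size=500)     # imperfect model output
print(skill_scores(model, obs, threshold=3.0))
```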

  12. Event Based Simulator for Parallel Computing over the Wide Area Network for Real Time Visualization

    Science.gov (United States)

    Sundararajan, Elankovan; Harwood, Aaron; Kotagiri, Ramamohanarao; Satria Prabuwono, Anton

    As the computational requirements of applications in computational science continue to grow tremendously, the use of computational resources distributed across the Wide Area Network (WAN) becomes advantageous. However, not all applications can be executed over the WAN due to communication overhead that can drastically slow down the computation. In this paper, we introduce an event-based simulator to investigate the performance of parallel algorithms executed over the WAN. The event-based simulator, known as SIMPAR (SIMulator for PARallel computation), simulates the actual computations and communications involved in parallel computation over the WAN using time stamps. Visualization of real-time applications requires a steady stream of processed data; hence, SIMPAR may prove to be a valuable tool to investigate the types of applications and computing resource requirements needed to provide an uninterrupted flow of processed data for real-time visualization purposes. The results obtained from the simulation show concurrence with the expected performance using the L-BSP model.

  13. Capturing flood-to-drought transitions in regional climate model simulations

    Science.gov (United States)

    Anders, Ivonne; Haslinger, Klaus; Hofstätter, Michael; Salzmann, Manuela; Resch, Gernot

    2017-04-01

    In previous studies, atmospheric cyclones have been investigated in terms of related precipitation extremes in Central Europe. Mediterranean (Vb-like) cyclones are of special relevance, as they are frequently related to high atmospheric moisture fluxes leading to floods and landslides in the Alpine region. Another focus in this area is on droughts, which affect soil moisture as well as surface and sub-surface runoff. Such events develop differently depending on the pre-existing saturation of water in the soil. In a first step, we investigated two time periods that each encompass a flood event and a subsequent drought on very different time scales: one long-lasting transition (2002/2003) and a rather short one between May and August 2013. In a second step, we extended the investigation to the long period 1950-2016. We focused on high spatial and temporal scales and assessed the currently achievable accuracy in simulating the Vb events on the one hand and the subsequent drought events on the other. The state-of-the-art regional climate model CCLM is applied in hindcast mode, simulating the single events described above but also the period from 1948 to 2016, in order to confirm that the results from the short runs are valid for the long period. Besides the conventional forcing of the regional climate model at its lateral boundaries, a spectral nudging technique is applied. In the simulations covering the European domain, different model parameters have been varied systematically. The resulting precipitation amounts have been compared to the E-OBS gridded European precipitation data set and a recent high-resolution precipitation data set for Austria (GPARD-6). For the drought events, the Standardized Precipitation Evapotranspiration Index (SPEI), soil moisture, and runoff have been investigated. Varying the spectral nudging setup helps us to understand the 3D processes during these events, but also to identify model deficiencies. To improve the simulation of such events in the past

  14. A systematic comparison of recurrent event models for application to composite endpoints.

    Science.gov (United States)

    Ozga, Ann-Kathrin; Kieser, Meinhard; Rauch, Geraldine

    2018-01-04

    Many clinical trials focus on the comparison of the treatment effect between two or more groups with respect to a rarely occurring event. In this situation, showing a relevant effect with an acceptable power requires the observation of a large number of patients over a long period of time. For feasibility reasons, it is therefore often considered to include several event types of interest, non-fatal or fatal, and to combine them within a composite endpoint. Commonly, a composite endpoint is analyzed with standard survival analysis techniques by assessing the time to the first occurring event. This approach neglects the fact that an individual may experience more than one event, which leads to a loss of information. As an alternative, composite endpoints could be analyzed by models for recurrent events. A number of such models exist, e.g., regression models based on count data or Cox-based models such as the approaches of Andersen and Gill; Prentice, Williams and Peterson; or Wei, Lin and Weissfeld. Although some of the methods have already been compared in the literature, there exists no systematic investigation of the special requirements regarding composite endpoints. Within this work, a simulation-based comparison of recurrent event models applied to composite endpoints is provided for different realistic clinical trial scenarios. We demonstrate that the Andersen-Gill model and the Prentice-Williams-Peterson models show similar results under various data scenarios, whereas the Wei-Lin-Weissfeld model delivers effect estimators that can deviate considerably under commonly met data scenarios. Based on the conducted simulation study, this paper helps to understand the pros and cons of the investigated methods in the context of composite endpoints and therefore provides recommendations for an adequate statistical analysis strategy and a meaningful interpretation of results.
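
    The information loss from time-to-first-event analysis can be demonstrated without any Cox machinery. The sketch below is a deliberately simplified stand-in for the paper's simulation study: it generates recurrent events from homogeneous Poisson processes in two arms (rates and sample sizes are arbitrary) and contrasts a rate ratio that uses all events with one implied by first events only.

```python
import numpy as np

rng = np.random.default_rng(11)
n, follow_up = 500, 2.0                        # patients per arm, years
rate_ctrl, rate_trt = 0.8, 0.6                 # illustrative event rates per year

def rate_ratio_estimates():
    """One simulated trial: recurrent events per patient are Poisson counts."""
    ctrl = rng.poisson(rate_ctrl * follow_up, size=n)
    trt = rng.poisson(rate_trt * follow_up, size=n)
    rr_all = trt.sum() / ctrl.sum()            # uses every event
    # "Time to first event" keeps only whether a patient had any event;
    # under a Poisson process the implied rate is -log(1 - p) / T.
    rr_first = np.log(1 - (trt > 0).mean()) / np.log(1 - (ctrl > 0).mean())
    return rr_all, rr_first

reps = np.array([rate_ratio_estimates() for _ in range(2000)])
for name, col in (("all events ", 0), ("first event", 1)):
    print(f"{name}: mean {reps[:, col].mean():.3f}, sd {reps[:, col].std():.3f}")
# Both recover the true ratio (0.75), but discarding repeat events inflates
# the spread of the estimate -- the information loss discussed in the record.
```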

  15. WRF-Chem Model Simulations of Arizona Dust Storms

    Science.gov (United States)

    Mohebbi, A.; Chang, H. I.; Hondula, D.

    2017-12-01

    The online Weather Research and Forecasting model with coupled chemistry module (WRF-Chem) is applied to simulate the transport, deposition, and emission of dust aerosols in an intense dust outbreak event that took place over Arizona on July 5th, 2011. The Goddard Chemistry Aerosol Radiation and Transport (GOCART), Air Force Weather Agency (AFWA), and University of Cologne (UoC) parameterization schemes for dust emission were evaluated. The model was found to simulate well the synoptic meteorological conditions that are also widely documented in previous studies. The chemistry module's performance in reproducing the atmospheric desert dust load was evaluated using the horizontal field of Aerosol Optical Depth (AOD) from the Moderate Resolution Imaging Spectroradiometer (MODIS) aboard Terra/Aqua and from the Aerosol Robotic Network (AERONET), employing the standard Dark Target (DT) and Deep Blue (DB) algorithms. To assess the temporal variability of the dust storm, particulate matter mass concentration data (PM10 and PM2.5) from Arizona Department of Environmental Quality (AZDEQ) ground-based air quality stations were used. The promising performance of WRF-Chem indicates that the model is capable of simulating the right timing and loading of a dust event in the planetary boundary layer (PBL), and that it can be used to forecast approaching severe dust events and to communicate an effective early warning.

  16. Event-based rainfall-runoff modelling of the Kelantan River Basin

    Science.gov (United States)

    Basarudin, Z.; Adnan, N. A.; Latif, A. R. A.; Tahir, W.; Syafiqah, N.

    2014-02-01

    Flood is one of the most common natural disasters in Malaysia. According to hydrologists, there are many causes that contribute to flood events. The two most dominant factors are meteorology (i.e., climate change) and change in land use. These two factors have contributed to floods in recent decades, especially in monsoonal catchments such as those in Malaysia. This paper intends to quantify the influence of rainfall during extreme rainfall events on the hydrological model of the Kelantan River catchment. Two dynamic inputs were used in the study: rainfall and river discharge. The extreme flood events in 2008 and 2004 were compared based on rainfall data for both years. The events were modeled via a semi-distributed HEC-HMS hydrological model. Land use change was not incorporated, because the study only tries to quantify rainfall changes between these two events in simulating the discharge and runoff values. Therefore, the land use data representing the year 2004 were used as inputs in the 2008 runoff model. The study managed to demonstrate that rainfall change has a significant impact on the peak discharge and runoff depth for the study area.
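
    Event-based HEC-HMS setups like this one commonly convert a storm's rainfall to runoff with the SCS curve-number loss method, whose core relation is a one-liner. The sketch below shows that standard formula with illustrative curve numbers, not the study's calibrated values:

```python
def scs_runoff_depth(rain_mm: float, curve_number: float) -> float:
    """Event runoff depth from the standard SCS curve-number relation:
    S = 25400/CN - 254 (mm); Q = (P - 0.2 S)^2 / (P + 0.8 S) for P > 0.2 S."""
    s = 25400.0 / curve_number - 254.0         # potential maximum retention
    ia = 0.2 * s                               # initial abstraction
    if rain_mm <= ia:
        return 0.0
    return (rain_mm - ia) ** 2 / (rain_mm + 0.8 * s)

# Same storm over two illustrative curve numbers (dry vs wet antecedent state):
for cn in (60, 85):
    print(f"CN={cn}: {scs_runoff_depth(120.0, cn):.1f} mm runoff from 120 mm rain")
```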

  17. Simulating X-ray bursts during a transient accretion event

    Science.gov (United States)

    Johnston, Zac; Heger, Alexander; Galloway, Duncan K.

    2018-06-01

    Modelling of thermonuclear X-ray bursts on accreting neutron stars has to date focused on stable accretion rates. However, bursts are also observed during episodes of transient accretion. During such events, the accretion rate can evolve significantly between bursts, and this regime provides a unique test for burst models. The accretion-powered millisecond pulsar SAX J1808.4-3658 exhibits accretion outbursts every 2-3 yr. During the well-sampled month-long outburst of 2002 October, four helium-rich X-ray bursts were observed. Using this event as a test case, we present the first multizone simulations of X-ray bursts under a time-dependent accretion rate. We investigate the effect of using a time-dependent accretion rate in comparison to constant, averaged rates. Initial results suggest that using a constant, average accretion rate between bursts may underestimate the recurrence time when the accretion rate is decreasing, and overestimate it when the accretion rate is increasing. Our model, with an accreted hydrogen fraction of X = 0.44 and a CNO metallicity of Z_CNO = 0.02, reproduces the observed burst arrival times and fluences with root mean square (rms) errors of 2.8 h and 0.11 × 10^{-6} erg cm^{-2}, respectively. Our results support previous modelling that predicted two unobserved bursts and indicate that additional bursts were also missed by observations.

  18. Discrete event simulation and virtual reality use in industry: new opportunities and future trends

    OpenAIRE

    Turner, Christopher; Hutabarat, Windo; Oyekan, John; Tiwari, Ashutosh

    2016-01-01

    This paper reviews the area of combined discrete event simulation (DES) and virtual reality (VR) use within industry. While establishing a state of the art for progress in this area, this paper makes the case for VR DES as the vehicle of choice for complex data analysis through interactive simulation models, highlighting both its advantages and current limitations. This paper reviews active research topics such as VR and DES real-time integration, communication protocols,...

  19. Automating the Simulation of SME Processes through a Discrete Event Parametric Model

    Directory of Open Access Journals (Sweden)

    Francesco Aggogeri

    2015-02-01

    At the factory level, the manufacturing system can be described as a group of processes governed by complex weaves of engineering strategies and technologies. Decision-making processes involve a lot of information, driven by managerial strategies, technological implications, and layout constraints. Many factors affect decisions, and their combination must be carefully managed to determine the best solutions to optimize performance. Advanced simulation tools could thus support the decision-making processes of many SMEs. The accessibility of these tools, however, is limited by knowledge, cost, data availability, and development time. Such tools should be used to support strategic decisions rather than specific situations. In this paper, a novel approach is proposed that aims to facilitate the simulation of manufacturing processes by fast modelling and evaluation. The idea is to realize a model that can be automatically adapted to the user's specific needs. The model must be characterized by a high degree of flexibility, configurability, and adaptability in order to automatically simulate multiple, heterogeneous industrial scenarios. In this way, even an SME can easily access a complex tool, perform thorough analyses, and be supported in making strategic decisions. The parametric DES model is part of a greater software platform developed during the EU-funded COPERNICO project.

  1. Using Discrete Event Simulation to Model Integrated Commodities Consumption for a Launch Campaign of the Space Launch System

    Science.gov (United States)

    Leonard, Daniel; Parsons, Jeremy W.; Cates, Grant

    2014-01-01

    In May 2013, NASA's GSDO Program requested a study to develop a discrete event simulation (DES) model that analyzes the launch campaign process of the Space Launch System (SLS) from an integrated commodities perspective. The scope of the study includes launch countdown and scrub turnaround and focuses on four core launch commodities: hydrogen, oxygen, nitrogen, and helium. Previously, the commodities were only analyzed individually and deterministically for their launch support capability, but this study was the first to integrate them to examine the impact of their interactions on a launch campaign as well as the effects of process variability on commodity availability. The study produced a validated DES model with Rockwell Arena that showed that Kennedy Space Center's ground systems were capable of supporting a 48-hour scrub turnaround for the SLS. The model will be maintained and updated to provide commodity consumption analysis of future ground system and SLS configurations.

  2. SAPS simulation with GITM/UCLA-RCM coupled model

    Science.gov (United States)

    Lu, Y.; Deng, Y.; Guo, J.; Zhang, D.; Wang, C. P.; Sheng, C.

    2017-12-01

    Ion velocity in the sub-auroral region observed by satellites during storm time often shows a significant westward component. These high-speed westward streams are distinct from the convection pattern, and such events are called Sub-Auroral Polarization Streams (SAPS). During the 17 March 2013 storm, the DMSP F18 satellite observed several SAPS cases when crossing the sub-auroral region. In this study, the Global Ionosphere-Thermosphere Model (GITM) has been coupled to the UCLA-RCM model to simulate the impact of SAPS during the March 2013 event on the ionosphere/thermosphere. The particle precipitation and electric field from RCM have been used to drive GITM. The conductance calculated from GITM is fed back to RCM to make the coupling self-consistent. Comparisons of GITM simulations with different SAPS specifications will be conducted. The neutral wind from the simulation will be compared with GOCE satellite observations. The comparison between runs with and without SAPS will separate the effect of SAPS from other effects and illustrate the impact on TIDs/TADs propagating in both poleward and equatorward directions.

  3. Simulation of Quantum Computation: A Deterministic Event-Based Approach

    NARCIS (Netherlands)

    Michielsen, K.; Raedt, K. De; Raedt, H. De

    2005-01-01

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and

  4. Modelling the interaction between flooding events and economic growth

    Directory of Open Access Journals (Sweden)

    J. Grames

    2015-06-01

    Socio-hydrology describes the interaction between the socio-economy and water. Recent models analyze the interplay of community risk-coping culture, flooding damage, and economic growth (Di Baldassarre et al., 2013; Viglione et al., 2014). These models descriptively explain the feedbacks between socio-economic development and natural disasters such as floods. In contrast to these descriptive models, our approach develops an optimization model, in which the intertemporal decision of an economic agent interacts with the hydrological system. In order to build this first economic growth model describing the interaction between the consumption and investment decisions of an economic agent and the occurrence of flooding events, we transform an existing descriptive stochastic model into an optimal deterministic model. The intermediate step is to formulate and simulate a descriptive deterministic model. We develop a periodic water function to approximate the former discrete stochastic time series of rainfall events. Due to the non-autonomous, exogenous periodic rainfall function, the long-term path of consumption and investment will be periodic.
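
    The "periodic water function" step admits a very small sketch: replace the stochastic rainfall series with a smooth cycle of the same long-run mean, so that the optimization stays deterministic. The functional form and parameters below are illustrative assumptions, not those of the paper:

```python
import numpy as np

def rainfall(t, mean=1.0, amplitude=0.8, period=1.0):
    """Deterministic periodic stand-in for a stochastic rainfall series:
    a smooth cycle with the same long-run mean (amplitude < 1 keeps it
    non-negative)."""
    return mean * (1.0 + amplitude * np.sin(2.0 * np.pi * t / period))

t = np.linspace(0.0, 5.0, 1000)
w = rainfall(t)
print(f"long-run mean {w.mean():.3f}, min {w.min():.3f}, max {w.max():.3f}")
```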

  5. Integrating physically based simulators with Event Detection Systems: Multi-site detection approach.

    Science.gov (United States)

    Housh, Mashor; Ohar, Ziv

    2017-03-01

    The fault detection (FD) problem in control theory concerns monitoring a system to identify when a fault has occurred. Two approaches to FD can be distinguished: signal-processing-based FD and model-based FD. The former develops algorithms to infer faults directly from sensors' readings, while the latter uses a simulation model of the real system to analyze the discrepancy between sensors' readings and the values expected from the simulation model. Most contamination Event Detection Systems (EDSs) for water distribution systems have followed signal-processing-based FD, which relies on analyzing the signals from monitoring stations independently of each other, rather than evaluating all stations simultaneously within an integrated network. In this study, we show that a model-based EDS, which utilizes physically based water quality and hydraulics simulation models, can outperform a signal-processing-based EDS. We also show that the model-based EDS can facilitate the development of a Multi-Site EDS (MSEDS), which analyzes the data from all the monitoring stations simultaneously within an integrated network. The advantage of the joint analysis in the MSEDS is expressed in increased detection accuracy (more true positive alarms and fewer false alarms) and shorter detection time.
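
    The essence of model-based, multi-site detection can be sketched in a few lines: compare each station's reading with the simulator's prediction and fuse the standardized residuals across the network. Everything below is a schematic assumption, including the sqrt(n) threshold scaling for the joint test; it is not the paper's algorithm, and the "simulation" is a placeholder array rather than a hydraulic model.

```python
import numpy as np

def detect_events(observed, simulated, window=24, k=4.0):
    """Model-based event detection: flag times where the discrepancy between
    sensor readings and the simulation exceeds k robust sigmas.
    `observed` and `simulated` are (stations x time) arrays."""
    residual = observed - simulated
    # Robust per-station noise scale from an assumed clean calibration window.
    sigma = 1.4826 * np.median(np.abs(residual[:, :window]), axis=1, keepdims=True)
    scores = np.abs(residual) / sigma
    single_site = scores > k                    # station-by-station alarms
    joint = scores.mean(axis=0) > k / np.sqrt(len(observed))  # multi-site fusion
    return single_site, joint

rng = np.random.default_rng(8)
sim = np.zeros((5, 200))                        # hypothetical 5-station network
obs = sim + rng.normal(scale=0.1, size=sim.shape)
obs[:, 150:] += 0.25                            # weak, network-wide contamination
single, joint = detect_events(obs, sim)
print("single-site alarm rate:", single[:, 150:].mean().round(2),
      "| joint alarm rate:", joint[150:].mean().round(2))
```

    On this toy data the weak, network-wide anomaly rarely trips any single station but is caught almost everywhere by the fused test, mirroring the accuracy gain the abstract attributes to joint analysis.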

  7. Impact of the Assimilation of Hyperspectral Infrared Retrieved Profiles on Advanced Weather and Research Model Simulations of a Non-Convective Wind Event

    Science.gov (United States)

    Berndt, E. B.; Zavodsky, B. T.; Folmer, M. J.; Jedlovec, G. J.

    2014-01-01

    Non-convective wind events commonly occur with passing extratropical cyclones and have significant societal and economic impacts. Since non-convective winds often occur in the absence of specific phenomena such as a thunderstorm, tornado, or hurricane, the public are less likely to heed high wind warnings and tend to continue daily activities. Thus, non-convective wind events result in as many fatalities as straight-line thunderstorm winds. One physical explanation for non-convective winds is tropopause folds. Improved model representation of stratospheric air and associated non-convective wind events could improve non-convective wind forecasts and associated warnings. In recent years, satellite data assimilation has improved skill in forecasting extratropical cyclones; however, errors still remain in forecasting the position and strength of extratropical cyclones as well as the tropopause folding process. The goal of this study is to determine the impact of assimilating satellite temperature and moisture retrieved profiles from hyperspectral infrared (IR) sounders (i.e. the Atmospheric Infrared Sounder (AIRS), the Cross-track Infrared and Microwave Sounding Suite (CrIMSS), and the Infrared Atmospheric Sounding Interferometer (IASI)) on the model representation of the tropopause fold and an associated high wind event that impacted the Northeast United States on 9 February 2013. Model simulations using the Advanced Research Weather Research and Forecasting Model (ARW) were conducted on a 12-km grid with cycled data assimilation mimicking the operational North American Model (NAM). The results from the satellite assimilation run are compared to a control experiment (without hyperspectral IR retrievals), the 32-km North American Regional Reanalysis (NARR) interpolated to a 12-km grid, and 13-km Rapid Refresh analyses.

  8. Modeling the Magnetopause Shadowing Loss during the October 2012 Dropout Event

    Science.gov (United States)

    Tu, Weichao; Cunningham, Gregory

    2017-04-01

    The relativistic electron fluxes in Earth's outer radiation belt are observed to drop by orders of magnitude on timescales of a few hours, events known as radiation belt dropouts. Where do the electrons go during the dropouts? This is one of the most important outstanding questions in radiation belt studies. Radiation belt electrons can be lost either by precipitation into the atmosphere or by transport across the magnetopause into interplanetary space. The latter mechanism is called magnetopause shadowing, usually combined with outward radial diffusion of electrons due to the sharp radial gradient it creates. In order to quantify the relative contribution of these two mechanisms to radiation belt dropouts, we performed an event study on the October 2012 dropout event observed by the Van Allen Probes. First, the precipitating MeV electrons observed by multiple NOAA POES satellites at low altitude did not show evidence of enhanced precipitation during the dropout, which suggested that precipitation was not the dominant loss mechanism for the event. Then, in order to simulate the magnetopause shadowing loss and outward radial diffusion during the dropout, we applied a radial diffusion model with electron lifetimes on the order of electron drift periods outside the last closed drift shell. In addition, realistic and event-specific inputs of radial diffusion coefficients (DLL) and the last closed drift shell (LCDS) were implemented in the model. Specifically, we used the new DLL developed by Cunningham [JGR 2016], which was estimated in the realistic TS04 [Tsyganenko and Sitnov, JGR 2005] storm-time magnetic field model and includes a physical K (second adiabatic invariant), i.e. pitch angle, dependence. An event-specific LCDS traced in the TS04 model with realistic K dependence was also implemented. Our simulation results showed that these event-specific inputs are critical to explain the electron dropout during the event. The new DLL greatly improved the model performance at low L* regions (L

  9. Calculation of Fission Observables Through Event-by-Event Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Randrup, J; Vogt, R

    2009-06-04

    The increased interest in more exclusive fission observables has demanded more detailed models. We present here a new computational model, FREYA, that aims to meet this need by producing large samples of complete fission events, from which any observable of interest can then be extracted consistently, including arbitrary correlations. The various model assumptions are described, and the potential utility of the model is illustrated by means of several novel correlation observables.

  10. High resolution modelling of extreme precipitation events in urban areas

    Science.gov (United States)

    Siemerink, Martijn; Volp, Nicolette; Schuurmans, Wytze; Deckers, Dave

    2015-04-01

    Present-day society needs to adjust to the effects of climate change. More extreme weather conditions are expected, which can lead to longer periods of drought, but also to more extreme precipitation events. Urban water systems are not designed for such extreme events. Most sewer systems are not able to drain the excessive storm water, causing urban flooding. This leads to high economic damage. In order to take appropriate measures against extreme urban storms, detailed knowledge about the behaviour of the urban water system above and below the streets is required. To investigate the behaviour of urban water systems during extreme precipitation events, new assessment tools are necessary. These tools should provide a detailed and integral description of the flow in the full domain of overland runoff, sewer flow, surface water flow and groundwater flow. We developed a new assessment tool, called 3Di, which provides detailed insight into the urban water system. This tool is based on a new numerical methodology that can accurately deal with the interaction between overland runoff, sewer flow and surface water flow. A one-dimensional model for the sewer system and open channel flow is fully coupled to a two-dimensional depth-averaged model that simulates the overland flow. The tool uses a subgrid-based approach in order to take high-resolution information of the sewer system and of the terrain into account [1, 2]. The combination of the high-resolution information and the subgrid-based approach results in an accurate and efficient modelling tool. It is now possible to simulate entire urban water systems using extremely high-resolution (0.5 m x 0.5 m) terrain data in combination with a detailed sewer and surface water network representation. The new tool has been tested in several Dutch cities, such as Rotterdam, Amsterdam and The Hague. We will present the results of an extreme precipitation event in the city of Schiedam (The Netherlands). This city deals with

  11. Handling of the Generation of Primary Events in Gauss, the LHCb Simulation Framework

    CERN Multimedia

    Corti, G; Brambach, T; Brook, N H; Gauvin, N; Harrison, K; Harrison, P; He, J; Ilten, P J; Jones, C R; Lieng, M H; Manca, G; Miglioranzi, S; Robbe, P; Vagnoni, V; Whitehead, M; Wishahi, J

    2010-01-01

    The LHCb simulation application, Gauss, consists of two independent phases, the generation of the primary event and the tracking of particles produced in the experimental setup. For the LHCb experimental program it is particularly important to model B meson decays: the EvtGen code developed in CLEO and BaBar has been chosen and customized for non-coherent B production as occurring in pp collisions at the LHC. The initial proton-proton collision is provided by a different generator engine, currently Pythia 6, for massive production of signal and generic pp collision events. Beam gas events, background events originating from proton halo, cosmics and calibration events for different detectors can be generated in addition to pp collisions. Different generator packages are available in the physics community or specifically developed in LHCb, and are used for these different purposes. Running conditions affecting the events generated, such as the size of the luminous region, the number of collisions occurring in a bunc...

  12. DROpS: an object of learning in computer simulation of discrete events

    Directory of Open Access Journals (Sweden)

    Hugo Alves Silva Ribeiro

    2015-09-01

    This work presents “Realistic Dynamics Of Simulated Operations” (DROpS), the name given to the dynamics using the “dropper” device as an object of teaching and learning. The objective is to present alternatives for professors teaching content related to discrete-event simulation to graduate students in production engineering. The aim is to enable students to develop skills related to data collection, modeling, statistical analysis, and interpretation of results. This dynamic has been developed and applied to the students by placing them in a situation analogous to a real industry, where various concepts related to computer simulation were discussed, allowing the students to put these concepts into practice in an interactive manner, thus facilitating learning.

  13. Event Shape Sorting: selecting events with similar evolution

    Directory of Open Access Journals (Sweden)

    Tomášik Boris

    2017-01-01

    We present a novel method for the organisation of events. The method is based on comparing event-by-event histograms of a chosen quantity Q that is measured for each particle in every event. The events are organised in such a way that those with similar shapes of the Q-histograms end up placed close to each other. We apply the method to histograms of the azimuthal angle of the produced hadrons in ultrarelativistic nuclear collisions. By selecting events with a similar azimuthal shape of their hadron distribution, one chooses events which likely underwent a similar evolution from the initial state to the freeze-out. Such events can more easily be compared to theoretical simulations where all conditions can be controlled. We illustrate the method on data simulated by the AMPT model.

  14. Trunk muscle recruitment patterns in simulated precrash events.

    Science.gov (United States)

    Ólafsdóttir, Jóna Marín; Fice, Jason B; Mang, Daniel W H; Brolin, Karin; Davidsson, Johan; Blouin, Jean-Sébastien; Siegmund, Gunter P

    2018-02-28

    To quantify trunk muscle activation levels during whole body accelerations that simulate precrash events in multiple directions and to identify recruitment patterns for the development of active human body models. Four subjects (1 female, 3 males) were accelerated at 0.55 g (net Δv = 4.0 m/s) in 8 directions while seated on a sled-mounted car seat to simulate a precrash pulse. Electromyographic (EMG) activity in 4 trunk muscles was measured using wire electrodes inserted into the left rectus abdominis, internal oblique, iliocostalis, and multifidus muscles at the L2-L3 level. Muscle activity evoked by the perturbations was normalized by each muscle's isometric maximum voluntary contraction (MVC) activity. Spatial tuning curves were plotted at 150, 300, and 600 ms after acceleration onset. EMG activity remained below 40% MVC for the three time points for most directions. At the 150- and 300 ms time points, the highest EMG amplitudes were observed during perturbations to the left (-90°) and left rearward (-135°). EMG activity diminished by 600 ms for the anterior muscles, but not for the posterior muscles. These preliminary results suggest that trunk muscle activity may be directionally tuned at the acceleration level tested here. Although data from more subjects are needed, these preliminary data support the development of modeled trunk muscle recruitment strategies in active human body models that predict occupant responses in precrash scenarios.

  15. Numerical simulation of a winter hailstorm event over Delhi, India on 17 January 2013

    Science.gov (United States)

    Chevuturi, A.; Dimri, A. P.; Gunturu, U. B.

    2014-09-01

    This study analyzes the cause of a rare occurrence of a winter hailstorm over New Delhi/NCR (National Capital Region), India. The absence of increased surface temperature or low-level moisture incursion during winter cannot generate the deep convection required for sustaining a hailstorm. Consequently, the NCR shows very few cases of hailstorms in the months of December-January-February, making winter hail formation a question of interest. For this study, the recent winter hailstorm event of 17 January 2013 (16:00-18:00 UTC) over the NCR is investigated. The storm is simulated using the Weather Research and Forecasting (WRF) model with the Goddard Cumulus Ensemble (GCE) microphysics scheme with two different options, hail or graupel. The aim of the study is to understand and describe the cause of the hailstorm event over the NCR through a comparative analysis of the two options of the GCE microphysics. On evaluating the model simulations, it is observed that the hail option reproduces the TRMM-observed precipitation intensity better than the graupel option and is able to simulate hail precipitation. Using the model output with the hail option, a detailed investigation of the hailstorm dynamics is performed. The analysis based on the numerical simulation suggests that deep instability in the atmospheric column led to the formation of hailstones, as the cloud formation reached up to the glaciated zone, promoting ice nucleation. In winter, such instability conditions rarely form, since they require low-level available potential energy and moisture incursion along with upper-level baroclinic instability due to the presence of a western disturbance (WD). Such a rare configuration is found to lower the tropopause and increase the temperature gradient, leading to winter hailstorm formation.

  16. BAYESIAN TECHNIQUES FOR COMPARING TIME-DEPENDENT GRMHD SIMULATIONS TO VARIABLE EVENT HORIZON TELESCOPE OBSERVATIONS

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Junhan; Marrone, Daniel P.; Chan, Chi-Kwan; Medeiros, Lia; Özel, Feryal; Psaltis, Dimitrios, E-mail: junhankim@email.arizona.edu [Department of Astronomy and Steward Observatory, University of Arizona, 933 N. Cherry Avenue, Tucson, AZ 85721 (United States)

    2016-12-01

    The Event Horizon Telescope (EHT) is a millimeter-wavelength, very-long-baseline interferometry (VLBI) experiment that is capable of observing black holes with horizon-scale resolution. Early observations have revealed variable horizon-scale emission in the Galactic Center black hole, Sagittarius A* (Sgr A*). Comparing such observations to time-dependent general relativistic magnetohydrodynamic (GRMHD) simulations requires statistical tools that explicitly consider the variability in both the data and the models. We develop here a Bayesian method to compare time-resolved simulation images to variable VLBI data, in order to infer model parameters and perform model comparisons. We use mock EHT data based on GRMHD simulations to explore the robustness of this Bayesian method and contrast it to approaches that do not consider the effects of variability. We find that time-independent models lead to offset values of the inferred parameters with artificially reduced uncertainties. Moreover, neglecting the variability in the data and the models often leads to erroneous model selections. We finally apply our method to the early EHT data on Sgr A*.

  17. Discrete events simulation of a route with traffic lights through automated control in real time

    Directory of Open Access Journals (Sweden)

    Rodrigo César Teixeira Baptista

    2013-03-01

    This paper presents the real-time integration and communication of a discrete-event simulation model with an automatic control system. The simulation model of an intersection of roads with traffic lights was built in the Arena environment. The integration and communication were made via network, and the control system was operated by a programmable logic controller. Scenarios were simulated for free, regular and congested traffic situations. The results showed the average number of vehicles that entered the system and were retained, as well as the average total time for vehicles to cross the road. In general, the model allowed evaluating the behavior of the traffic on each of the ways and the commands from the controller for activation and deactivation of the traffic lights.

  18. Using a discrete-event simulation to balance ambulance availability and demand in static deployment systems.

    Science.gov (United States)

    Wu, Ching-Han; Hwang, Kevin P

    2009-12-01

    To improve ambulance response time, matching ambulance availability with the emergency demand is crucial. To maintain the standard of 90% of response times within 9 minutes, the authors introduce a discrete-event simulation method to estimate the threshold for expanding the ambulance fleet when demand increases and to find the optimal dispatching strategies when provisional events create temporary decreases in ambulance availability. The simulation model was developed with information from the literature. Although the development was theoretical, the model was validated on the emergency medical services (EMS) system of Tainan City. The data were divided: one part for model development, the other for validation. For increasing demand, the effect on response time of increasing call arrival rates was modeled. For temporary availability decreases, the authors simulated all possible alternatives of ambulance deployment in accordance with the number of out-of-routine-duty ambulances and the durations of three types of mass gatherings: marathon races (06:00-10:00 hr), rock concerts (18:00-22:00 hr), and New Year's Eve parties (20:00-01:00 hr). Statistical analysis confirmed that the model reasonably represented the actual Tainan EMS system. The response-time standard could not be reached when the incremental ratio of call arrivals exceeded 56%, which is the threshold for the Tainan EMS system to expand its ambulance fleet. When provisional events created temporary availability decreases, the Tainan EMS system could spare at most two ambulances from the standard configuration, except between 20:00 and 01:00, when it could spare three. The model also demonstrated that the current Tainan EMS has two excess ambulances that could be dropped. The authors suggest dispatching strategies to minimize the response times in routine daily emergencies. Strategies of capacity management based on this model improved response times. The more ambulances that are out of routine duty
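
    The queueing logic at the core of such an ambulance model is compact enough to sketch. The following is a minimal, hypothetical Python sketch (arrival rate, fleet size and service times are made-up illustrations, not Tainan data): calls arrive as a Poisson process, each call takes the earliest-free ambulance, and the measured output is the fraction of responses within 9 minutes.

        import heapq, random

        def simulate_ems(n_ambulances=6, call_rate=0.1, mean_service=40.0,
                         mean_travel=5.0, n_calls=20000, seed=1):
            # Toy static-deployment EMS model; all times are in minutes.
            rng = random.Random(seed)
            free_at = [0.0] * n_ambulances   # time each ambulance becomes free
            heapq.heapify(free_at)
            t, within_9 = 0.0, 0
            for _ in range(n_calls):
                t += rng.expovariate(call_rate)        # next call arrival
                available = heapq.heappop(free_at)     # earliest-free unit
                start = max(t, available)              # call may have to queue
                travel = rng.expovariate(1.0 / mean_travel)
                within_9 += (start - t) + travel <= 9.0
                busy_until = start + travel + rng.expovariate(1.0 / mean_service)
                heapq.heappush(free_at, busy_until)
            return within_9 / n_calls

        # Sweep call_rate upward until the 90%-within-9-minutes standard breaks.
        print(simulate_ems())

    Re-running such a sweep for each candidate fleet size reproduces, in miniature, the threshold analysis described in the abstract.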

  19. Collaborative design for embedded systems co-modelling and co-simulation

    CERN Document Server

    Fitzgerald, John; Verhoef, Marcel

    2014-01-01

    This book presents a framework that allows the very different kinds of design models - discrete-event models of software and continuous time models of the physical environment - to be analyzed and simulated jointly, based on common scenarios.

  20. Methods and Model Dependency of Extreme Event Attribution: The 2015 European Drought

    Science.gov (United States)

    Hauser, Mathias; Gudmundsson, Lukas; Orth, René; Jézéquel, Aglaé; Haustein, Karsten; Vautard, Robert; van Oldenborgh, Geert J.; Wilcox, Laura; Seneviratne, Sonia I.

    2017-10-01

    Science on the role of anthropogenic influence on extreme weather events, such as heatwaves or droughts, has evolved rapidly in recent years. The approach of "event attribution" compares the occurrence probability of an event in the present, factual climate with its probability in a hypothetical, counterfactual climate without human-induced climate change. Several methods can be used for event attribution, based on climate model simulations and observations, and usually researchers only assess a subset of methods and data sources. Here, we explore the role of methodological choices for the attribution of the 2015 meteorological summer drought in Europe. We present contradicting conclusions on the relevance of human influence as a function of the chosen data source and event attribution methodology. Assessments using the maximum number of models and counterfactual climates with pre-industrial greenhouse gas concentrations point to an enhanced drought risk in Europe. However, other evaluations show contradictory evidence. These results highlight the need for a multi-model and multi-method framework in event attribution research, especially for events with a low signal-to-noise ratio and high model dependency such as regional droughts.
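
    The core computation behind this kind of risk-based attribution is the probability ratio PR = P1/P0, the occurrence probability of the event in the factual ensemble divided by that in the counterfactual ensemble. A minimal sketch, assuming two hypothetical ensembles of a summer drought index where the "event" is the index falling below a threshold:

        import numpy as np

        def probability_ratio(factual, counterfactual, threshold,
                              n_boot=10000, seed=0):
            # PR = P(event | factual) / P(event | counterfactual), with a
            # bootstrap 5-95% interval over ensemble members.
            rng = np.random.default_rng(seed)
            f, c = np.asarray(factual), np.asarray(counterfactual)
            pr = np.mean(f < threshold) / np.mean(c < threshold)
            boot = []
            for _ in range(n_boot):
                pf = np.mean(rng.choice(f, f.size) < threshold)
                pc = max(np.mean(rng.choice(c, c.size) < threshold), 1.0 / c.size)
                boot.append(pf / pc)
            return pr, np.percentile(boot, [5, 95])

    A PR well above 1 indicates enhanced drought risk in the factual climate; the paper's point is that the inferred PR can straddle 1 depending on which models and which counterfactual definition feed these two ensembles.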

  1. Modelling of dynamic and quasistatic events with special focus on wood-drying distortions

    OpenAIRE

    Ekevad, Mats

    2006-01-01

    This thesis deals mainly with computer simulations of wood-drying distortions, especially twist. The reason for this is that such distortions often appear in dried timber, and the results are quality downgrades and thus value losses in the wood value chain. A computer simulation is a way to theoretically simulate what happens in reality when moisture content in timber changes. If the computer simulation model is appropriate and capable of realistic simulations of real events, then it is possi...

  2. A model of return intervals between earthquake events

    Science.gov (United States)

    Zhou, Yu; Chechkin, Aleksei; Sokolov, Igor M.; Kantz, Holger

    2016-06-01

    Application of the diffusion entropy analysis and the standard deviation analysis to the time sequence of southern California earthquake events from 1976 to 2002 uncovered scaling behavior typical of anomalous diffusion. However, the origin of such behavior is still under debate. Some studies attribute the scaling behavior to the correlations in the return intervals, or waiting times, between aftershocks or mainshocks. To elucidate the nature of the scaling, we applied specific reshuffling techniques to eliminate correlations between different types of events and then examined how this affects the scaling behavior. We demonstrate that the origin of the observed scaling behavior is the interplay between the mainshock waiting time distribution and the structure of clusters of aftershocks, not correlations in waiting times between the mainshocks and aftershocks themselves. Our findings are corroborated by numerical simulations of a simple model showing very similar behavior. The mainshocks are modeled by a renewal process with a power-law waiting time distribution between events, and aftershocks follow a nonhomogeneous Poisson process with the rate governed by Omori's law.
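
    The two-ingredient model in the last sentence is easy to simulate. Below is a hedged Python sketch (all parameter values are illustrative, not fitted to the Southern California catalogue): mainshocks form a renewal process with Pareto (power-law) waiting times, and each mainshock spawns aftershocks from a nonhomogeneous Poisson process whose rate follows Omori's law, generated here by thinning.

        import numpy as np

        rng = np.random.default_rng(42)

        def powerlaw_waiting_times(n, alpha=1.5, t_min=1.0):
            # Pareto-distributed inter-mainshock times (renewal process)
            return t_min * (1.0 - rng.random(n)) ** (-1.0 / alpha)

        def omori_aftershocks(t_main, K=10.0, c=0.1, p=1.1, window=50.0):
            # Nonhomogeneous Poisson process with rate n(t) = K/(c+t)**p,
            # simulated by thinning a homogeneous process of rate K/c**p.
            rate_max = K / c ** p
            t, times = 0.0, []
            while True:
                t += rng.exponential(1.0 / rate_max)
                if t >= window:
                    return times
                if rng.random() < (K / (c + t) ** p) / rate_max:
                    times.append(t_main + t)

        mains = np.cumsum(powerlaw_waiting_times(200))
        catalogue = np.sort(np.concatenate(
            [mains] + [omori_aftershocks(tm) for tm in mains]))
        intervals = np.diff(catalogue)  # return intervals whose scaling is analysed

    Applying diffusion entropy or standard deviation analysis to `intervals`, and to reshuffled versions of the catalogue, reproduces the kind of comparison the authors describe.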

  3. Modeling of Single Event Transients With Dual Double-Exponential Current Sources: Implications for Logic Cell Characterization

    Science.gov (United States)

    Black, Dolores A.; Robinson, William H.; Wilcox, Ian Z.; Limbrick, Daniel B.; Black, Jeffrey D.

    2015-08-01

    Single event effects (SEE) are a reliability concern for modern microelectronics. Bit corruptions can be caused by single event upsets (SEUs) in the storage cells or by sampling single event transients (SETs) from a logic path. An accurate prediction of soft error susceptibility from SETs requires good models to convert collected charge into compact descriptions of the current injection process. This paper describes a simple, yet effective, method to model the current waveform resulting from a charge collection event for SET circuit simulations. The model uses two double-exponential current sources in parallel, and the results illustrate why a conventional model based on one double-exponential source can be incomplete. A small set of logic cells with varying input conditions, drive strengths, and output loadings is simulated to extract the parameters for the dual double-exponential current sources. The parameters are based upon both the node capacitance and the restoring current (i.e., drive strength) of the logic cell.
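
    The compact model described here amounts to summing two standard double-exponential pulses. A minimal numerical sketch follows; the peak currents and rise/fall time constants are invented placeholders for whatever values a cell characterization would extract, not numbers from the paper.

        import numpy as np

        def double_exp(t, i_peak, tau_rise, tau_fall):
            # Classic double-exponential current pulse, zero before t = 0.
            t = np.asarray(t, dtype=float)
            pulse = i_peak * (np.exp(-t / tau_fall) - np.exp(-t / tau_rise))
            return np.where(t >= 0.0, pulse, 0.0)

        def set_current(t,
                        prompt=(1.2e-3, 5e-12, 50e-12),      # (A, s, s) placeholders
                        delayed=(0.3e-3, 50e-12, 500e-12)):
            # Dual double-exponential source: a fast prompt component plus a
            # slower component, summed in parallel.
            return double_exp(t, *prompt) + double_exp(t, *delayed)

        t = np.linspace(0.0, 2e-9, 2001)            # 0 to 2 ns
        q_collected = np.trapz(set_current(t), t)   # collected charge = integral of I(t)

    The second, slower source is what lets the waveform carry a long tail without distorting the prompt peak, which is the incompleteness the authors point out in single-source models.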

  4. Self-Adaptive Event-Driven Simulation of Multi-Scale Plasma Systems

    Science.gov (United States)

    Omelchenko, Yuri; Karimabadi, Homayoun

    2005-10-01

    Multi-scale plasmas pose a formidable computational challenge. Explicit time-stepping models suffer from the global CFL restriction. Efficient application of adaptive mesh refinement (AMR) to systems with irregular dynamics (e.g. turbulence, diffusion-convection-reaction, particle acceleration, etc.) may be problematic. To address these issues, we developed an alternative approach to time stepping: self-adaptive discrete-event simulation (DES). DES has its origins in operations research, war games and telecommunications. We combine finite-difference and particle-in-cell techniques with this methodology by adopting two assumptions: (1) a local time increment dt for a discrete quantity f can be expressed in terms of a physically meaningful quantum value df; (2) f is considered to be modified only when its change exceeds df. Event-driven time integration is self-adaptive as it makes use of causality rules rather than parametric time dependencies. This technique enables asynchronous, flux-conservative updates of the solution in accordance with local temporal scales, removes the curse of the global CFL condition, eliminates unnecessary computation in inactive spatial regions and results in robust and fast parallelizable codes. It can be naturally combined with various mesh refinement techniques. We discuss applications of this novel technology to diffusion-convection-reaction systems and hybrid simulations of magnetosonic shocks.
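
    A toy illustration of assumptions (1) and (2) can be written with an ordinary priority queue; this is a conceptual sketch of quantum-based asynchronous updating under stated assumptions (simple exponential relaxation, a fixed quantum df), not the authors' plasma code.

        import heapq

        def event_driven_decay(f0, rates, df=0.01, t_end=5.0):
            # Each cell i relaxes as df_i/dt = -rates[i] * f_i and is updated
            # only when its accumulated change would reach the quantum df,
            # so fast cells fire many events and slow cells very few.
            f, t_last = list(f0), [0.0] * len(f0)
            queue = []
            for i in range(len(f)):
                dt = df / (abs(rates[i] * f[i]) + 1e-12)  # time to change by df
                heapq.heappush(queue, (dt, i))
            n_events = 0
            while queue:
                t, i = heapq.heappop(queue)
                if t > t_end:
                    break
                f[i] -= rates[i] * f[i] * (t - t_last[i])  # local explicit step
                t_last[i] = t
                n_events += 1
                dt = df / (abs(rates[i] * f[i]) + 1e-12)
                heapq.heappush(queue, (t + dt, i))
            return f, n_events

        # A fast cell (rate 10) and a slow cell (rate 0.1): the slow cell is
        # touched far less often, with no global CFL step in sight.
        print(event_driven_decay([1.0, 1.0], rates=[10.0, 0.1]))

    The payoff named in the abstract (no global CFL, no work in inactive regions) shows up here as the event count: it is set by each cell's own dynamics rather than by the stiffest cell in the mesh.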

  5. Improving Energy Efficiency for the Vehicle Assembly Industry: A Discrete Event Simulation Approach

    Science.gov (United States)

    Oumer, Abduaziz; Mekbib Atnaw, Samson; Kie Cheng, Jack; Singh, Lakveer

    2016-11-01

    This paper presents a Discrete Event Simulation (DES) model for investigating and improving energy efficiency in a vehicle assembly line. The car manufacturing industry is one of the most energy-intensive industries. Using the Rockwell Arena DES package, a detailed model was constructed for an actual vehicle assembly plant. The sources of energy considered in this research are electricity and fuel, the two main types of energy sources used in a typical vehicle assembly plant. The model depicts the performance measurement for process-specific energy measures of the painting, welding, and assembling processes. A sound energy efficiency model within this industry has a two-fold advantage: reducing CO2 emissions and cutting the costs associated with fuel and electricity consumption. The paper starts with an overview of challenges in energy consumption within the facilities of an automotive assembly line and highlights the parameters for energy efficiency. The results of the simulation model indicated improvements for energy-saving objectives and reduced costs.

  6. The Impact of Inpatient Boarding on ED Efficiency: A Discrete-Event Simulation Study

    OpenAIRE

    Bair, Aaron E.; Song, Wheyming T.; Chen, Yi-Chun; Morris, Beth A.

    2009-01-01

    In this study, a discrete-event simulation approach was used to model Emergency Department (ED) patient flow to investigate the effect of inpatient boarding on ED efficiency in terms of the National Emergency Department Crowding Scale (NEDOCS) score and the rate of patients who leave without being seen (LWBS). The decision variable in this model was the boarder-released-ratio, defined as the ratio of admitted patients whose boarding time is zero to all admitted patients. Our analysis sho...

  7. Evaluating MJO Event Initiation and Decay in the Skeleton Model using an RMM-like Index

    Science.gov (United States)

    2015-11-25

    …univariate zonal wind EOF analysis, the mean number of continuing events exceeds … observations, though the observed number falls within the 95… year simulation period using the truncated, observed SSTs. Approximately two-thirds of the observed events fall within 20-100 days with a … Advances in simulating atmospheric variability with the ECMWF model: From synoptic to decadal time-scales, Q. J. Roy. Meteor. Soc., 134, 1337

  8. The Impact of the Assimilation of Hyperspectral Infrared Retrieved Profiles on Advanced Weather and Research Model Simulations of a Non-Convective Wind Event

    Science.gov (United States)

    Berndt, Emily; Zavodsky, Bradley; Jedlovec, Gary; Elmer, Nicholas

    2013-01-01

    Non-convective wind events commonly occur with passing extratropical cyclones and have significant societal and economic impacts. Since non-convective winds often occur in the absence of specific phenomena such as a thunderstorm, tornado, or hurricane, the public are less likely to heed high wind warnings and tend to continue daily activities. Thus, non-convective wind events result in as many fatalities as straight-line thunderstorm winds. One physical explanation for non-convective winds is tropopause folds. Improved model representation of stratospheric air and associated non-convective wind events could improve non-convective wind forecasts and associated warnings. In recent years, satellite data assimilation has improved skill in forecasting extratropical cyclones; however, errors still remain in forecasting the position and strength of extratropical cyclones as well as the tropopause folding process. The goal of this study is to determine the impact of assimilating satellite temperature and moisture retrieved profiles from hyperspectral infrared (IR) sounders (i.e. the Atmospheric Infrared Sounder (AIRS), the Cross-track Infrared and Microwave Sounding Suite (CrIMSS), and the Infrared Atmospheric Sounding Interferometer (IASI)) on the model representation of the tropopause fold and an associated high wind event that impacted the Northeast United States on 9 February 2013. Model simulations using the Advanced Research Weather Research and Forecasting Model (ARW) were conducted on a 12-km grid with cycled data assimilation mimicking the operational North American Model (NAM). The results from the satellite assimilation run are compared to a control experiment (without hyperspectral IR retrievals), Modern Era-Retrospective Analysis for Research and Applications (MERRA) reanalysis, and Rapid Refresh analyses.

  9. Wavelet spectra of JACEE events

    International Nuclear Information System (INIS)

    Suzuki, Naomichi; Biyajima, Minoru; Ohsawa, Akinori.

    1995-01-01

    Pseudo-rapidity distributions of two high-multiplicity events, Ca-C and Si-AgBr, observed by the JACEE collaboration are analyzed by a wavelet transform. Wavelet spectra of those events are calculated and compared with simulation calculations. The wavelet spectrum of the Ca-C event somewhat resembles that simulated with uniform random numbers. That of the Si-AgBr event, however, is not reproduced by simulation calculations with Poisson random numbers, uniform random numbers, or a p-model. (author)
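
    For readers unfamiliar with the technique, a wavelet power spectrum of a binned pseudo-rapidity distribution can be computed in a few lines. The following sketch uses the Haar wavelet for simplicity (the JACEE analysis may well have used a different mother wavelet) and made-up surrogate data in place of the emulsion-chamber events.

        import numpy as np

        def haar_power_spectrum(signal):
            # Haar wavelet power per scale for a length-2**n signal,
            # finest scale first.
            x = np.asarray(signal, dtype=float)
            powers = []
            while x.size > 1:
                s = (x[0::2] + x[1::2]) / np.sqrt(2.0)  # smooth part
                d = (x[0::2] - x[1::2]) / np.sqrt(2.0)  # detail part
                powers.append(float(np.sum(d ** 2)))    # power at this scale
                x = s
            return powers

        # Hypothetical comparison: a 64-bin "measured" distribution versus a
        # uniform-random surrogate with the same total multiplicity.
        rng = np.random.default_rng(7)
        measured = rng.poisson(30, 64).astype(float)
        surrogate = np.bincount(rng.integers(0, 64, int(measured.sum())),
                                minlength=64).astype(float)
        print(haar_power_spectrum(measured))
        print(haar_power_spectrum(surrogate))

    Comparing the scale-by-scale power of an event against such random surrogates is, in essence, the test that separated the Ca-C event from the Si-AgBr event.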

  10. A Numerical Approach for Hybrid Simulation of Power System Dynamics Considering Extreme Icing Events

    DEFF Research Database (Denmark)

    Chen, Lizheng; Zhang, Hengxu; Wu, Qiuwei

    2017-01-01

    A numerical simulation scheme integrating icing weather events with power system dynamics is proposed to extend power system numerical simulation. A technique is developed to efficiently simulate the interaction of the slow dynamics of weather events and the fast dynamics of power systems. An extended package for PSS

  11. Vaporization studies of plasma interactive materials in simulated plasma disruption events

    International Nuclear Information System (INIS)

    Stone, C.A. IV; Croessmann, C.D.; Whitley, J.B.

    1988-03-01

    The melting and vaporization that occur when plasma-facing materials are subjected to a plasma disruption will severely limit component lifetime and plasma performance. A series of high heat flux experiments was performed on a group of fusion reactor candidate materials to model the material erosion which occurs during plasma disruption events. The Electron Beam Test System was used to simulate single-disruption and multiple-disruption phenomena. Samples of aluminum, nickel, copper, molybdenum, and 304 stainless steel were subjected to a variety of heat loads, ranging from 100 to 400 msec pulses of 8 to 18 kW/cm2. It was found that the initial surface temperature of a material strongly influences the vaporization process and that multiple disruptions do not scale linearly with respect to single disruption events. 2 refs., 9 figs., 5 tabs

  12. Weather model performance on extreme rainfall event simulations over the Western Iberian Peninsula

    Science.gov (United States)

    Pereira, S. C.; Carvalho, A. C.; Ferreira, J.; Nunes, J. P.; Kaiser, J. J.; Rocha, A.

    2012-08-01

    This study evaluates the performance of the WRF-ARW numerical weather model in simulating the spatial and temporal patterns of an extreme rainfall period over a complex orographic region in north-central Portugal. The analysis was performed for December 2009, during the mainland Portugal rainy season. The heavy to extremely heavy rainfall periods were due to several low surface-pressure systems associated with frontal surfaces. The total amount of precipitation for December exceeded, on average, the climatological mean for the 1971-2000 period by 89 mm, varying from 190 mm (southern part of the country) to 1175 mm (northern part of the country). Three model runs were conducted to assess possible improvements in model performance: (1) the WRF-ARW is forced with the initial fields from a global domain model (RunRef); (2) data assimilation for a specific location (RunObsN) is included; (3) nudging is used to adjust the analysis field (RunGridN). Model performance was evaluated against an observed hourly precipitation dataset of 15 rainfall stations using several statistical parameters. The WRF-ARW model reproduced the temporal rainfall patterns well but tended to overestimate precipitation amounts. The RunGridN simulation provided the best results, but the model performance of the other two runs was good too, so that the selected extreme rainfall episode was successfully reproduced.

  13. 3D Simulation of External Flooding Events for the RISMC Pathway

    International Nuclear Information System (INIS)

    Prescott, Steven; Mandelli, Diego; Sampath, Ramprasad; Smith, Curtis; Lin, Linyu

    2015-01-01

    Incorporating 3D simulations as part of the Risk-Informed Safety Margins Characterization (RISMC) Toolkit allows analysts to obtain a more complete picture of complex system behavior for events including external plant hazards. External events such as flooding have become more important recently; however, these can be analyzed with existing and validated simulated physics toolkits. In this report, we describe these approaches specific to flooding-based analysis using an approach called Smoothed Particle Hydrodynamics. The theory, validation, and example applications of the 3D flooding simulation are described. Integrating these 3D simulation methods into computational risk analysis adds a spatial/visual aspect to the design, improves the realism of results, and provides the visual understanding needed to validate the flooding analysis.

  14. 3D Simulation of External Flooding Events for the RISMC Pathway

    Energy Technology Data Exchange (ETDEWEB)

    Prescott, Steven [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Sampath, Ramprasad [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States); Lin, Linyu [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    Incorporating 3D simulations as part of the Risk-Informed Safety Margins Characterization (RISMC) Toolkit allows analysts to obtain a more complete picture of complex system behavior for events including external plant hazards. External events such as flooding have become more important recently; however, these can be analyzed with existing and validated simulated physics toolkits. In this report, we describe these approaches specific to flooding-based analysis using an approach called Smoothed Particle Hydrodynamics. The theory, validation, and example applications of the 3D flooding simulation are described. Integrating these 3D simulation methods into computational risk analysis adds a spatial/visual aspect to the design, improves the realism of results, and provides the visual understanding needed to validate the flooding analysis.

  15. Evaluation of cool season precipitation event characteristics over the Northeast US in a suite of downscaled climate model hindcasts

    Science.gov (United States)

    Loikith, Paul C.; Waliser, Duane E.; Kim, Jinwon; Ferraro, Robert

    2017-08-01

    Cool season precipitation event characteristics are evaluated across a suite of downscaled climate models over the northeastern US. Downscaled hindcast simulations are produced by dynamically downscaling the Modern-Era Retrospective Analysis for Research and Applications version 2 (MERRA2) using the National Aeronautics and Space Administration (NASA)-Unified Weather Research and Forecasting (WRF) regional climate model (RCM) and the Goddard Earth Observing System Model, Version 5 (GEOS-5) global climate model. NU-WRF RCM simulations are produced at 24, 12, and 4-km horizontal resolutions using a range of spectral nudging schemes, while the MERRA2 global downscaled run is provided at 12.5 km. All model runs are evaluated using four metrics designed to capture key features of precipitation events: event frequency, event intensity, event total, and event duration. Overall, the downscaling approaches result in a reasonable representation of many of the key features of precipitation events over the region; however, considerable biases exist in the magnitude of each metric. Based on this evaluation, there is no clear indication that higher-resolution simulations produce more realistic results in general; however, many small-scale features such as orographic enhancement of precipitation are only captured at higher resolutions, suggesting some added value over coarser resolutions. While the differences between simulations produced using nudging and no nudging are small, there is some improvement in model fidelity when nudging is introduced, especially at a cutoff wavelength of 600 km compared to 2000 km. Based on the results of this evaluation, dynamical regional downscaling using NU-WRF results in a more realistic representation of precipitation event climatology than the global downscaling of MERRA2 using GEOS-5.

  16. Formal Analysis of BPMN Models Using Event-B

    Science.gov (United States)

    Bryans, Jeremy W.; Wei, Wei

    The use of business process models has gone far beyond documentation purposes. In the development of business applications, they can play the role of an artifact on which high-level properties can be verified and design errors can be revealed, in an effort to reduce overhead at later software development and diagnosis stages. This paper demonstrates how formal verification may add value to the specification, design and development of business process models in an industrial setting. The analysis of these models is achieved via an algorithmic translation from the de facto standard business process modeling language BPMN to Event-B, a widely used formal language supported by the Rodin platform, which offers a range of simulation and verification technologies.

  17. Autocalibration of a one-dimensional hydrodynamic-ecological model (DYRESM 4.0-CAEDYM 3.1) using a Monte Carlo approach: simulations of hypoxic events in a polymictic lake

    Directory of Open Access Journals (Sweden)

    L. Luo

    2018-03-01

    Automated calibration of complex deterministic water quality models with a large number of biogeochemical parameters can reduce time-consuming iterative simulations involving empirical judgements of model fit. We undertook autocalibration of the one-dimensional hydrodynamic-ecological lake model DYRESM-CAEDYM, using a Monte Carlo sampling (MCS) method, in order to test the applicability of this procedure for shallow, polymictic Lake Rotorua (New Zealand). The calibration procedure involved independently minimizing the root-mean-square error (RMSE) and maximizing the Pearson correlation coefficient (r) and Nash–Sutcliffe efficiency coefficient (Nr) for comparisons of model state variables against measured data. An assigned number of parameter permutations was used for 10 000 simulation iterations. The "optimal" temperature calibration produced a RMSE of 0.54 °C, an Nr value of 0.99, and an r value of 0.98 through the whole water column, based on comparisons with 540 observed water temperatures collected between 13 July 2007 and 13 January 2009. The modeled bottom dissolved oxygen concentration (20.5 m below surface) was compared with 467 available observations. The calculated RMSE of the simulations compared with the measurements was 1.78 mg L−1, the Nr value was 0.75, and the r value was 0.87. The autocalibrated model was further tested on an independent data set by simulating bottom-water hypoxia events from 15 January 2009 to 8 June 2011 (875 days). This verification produced an accurate simulation of five hypoxic events corresponding to DO < 2 mg L−1 during the summers of 2009-2011. The RMSE was 2.07 mg L−1, the Nr value 0.62, and the r value 0.81, based on the available data set of 738 days. The autocalibration software of DYRESM-CAEDYM developed here is substantially less time-consuming and more efficient in parameter optimization than traditional manual calibration, which has been the standard tool practiced for similar

  18. Autocalibration of a one-dimensional hydrodynamic-ecological model (DYRESM 4.0-CAEDYM 3.1) using a Monte Carlo approach: simulations of hypoxic events in a polymictic lake

    Science.gov (United States)

    Luo, Liancong; Hamilton, David; Lan, Jia; McBride, Chris; Trolle, Dennis

    2018-03-01

    Automated calibration of complex deterministic water quality models with a large number of biogeochemical parameters can reduce time-consuming iterative simulations involving empirical judgements of model fit. We undertook autocalibration of the one-dimensional hydrodynamic-ecological lake model DYRESM-CAEDYM, using a Monte Carlo sampling (MCS) method, in order to test the applicability of this procedure for shallow, polymictic Lake Rotorua (New Zealand). The calibration procedure involved independently minimizing the root-mean-square error (RMSE) and maximizing the Pearson correlation coefficient (r) and Nash-Sutcliffe efficiency coefficient (Nr) for comparisons of model state variables against measured data. An assigned number of parameter permutations was used for 10 000 simulation iterations. The "optimal" temperature calibration produced a RMSE of 0.54 °C, an Nr value of 0.99, and an r value of 0.98 through the whole water column, based on comparisons with 540 observed water temperatures collected between 13 July 2007 and 13 January 2009. The modeled bottom dissolved oxygen concentration (20.5 m below surface) was compared with 467 available observations. The calculated RMSE of the simulations compared with the measurements was 1.78 mg L-1, the Nr value was 0.75, and the r value was 0.87. The autocalibrated model was further tested on an independent data set by simulating bottom-water hypoxia events from 15 January 2009 to 8 June 2011 (875 days). This verification produced an accurate simulation of five hypoxic events corresponding to DO < 2 mg L-1 during the summers of 2009-2011. The RMSE was 2.07 mg L-1, the Nr value 0.62, and the r value 0.81, based on the available data set of 738 days. The autocalibration software of DYRESM-CAEDYM developed here is substantially less time-consuming and more efficient in parameter optimization than traditional manual calibration, which has been the standard tool practiced for similar complex water quality models.
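
    The three fit statistics and the Monte Carlo parameter search are straightforward to reproduce in outline. In the sketch below, `run_model` is a hypothetical stand-in for a DYRESM-CAEDYM run with a given parameter set; everything else follows the procedure described in the abstract.

        import numpy as np

        rng = np.random.default_rng(0)

        def fit_metrics(obs, sim):
            # RMSE, Pearson r and Nash-Sutcliffe efficiency (Nr)
            obs, sim = np.asarray(obs, float), np.asarray(sim, float)
            rmse = np.sqrt(np.mean((sim - obs) ** 2))
            r = np.corrcoef(obs, sim)[0, 1]
            nr = 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)
            return rmse, r, nr

        def monte_carlo_calibrate(run_model, obs, bounds, n_iter=10000):
            # Sample parameter sets uniformly within bounds and keep the set
            # with the lowest RMSE (r and Nr are tracked alongside).
            best = None
            for _ in range(n_iter):
                params = {k: rng.uniform(lo, hi) for k, (lo, hi) in bounds.items()}
                rmse, r, nr = fit_metrics(obs, run_model(params))
                if best is None or rmse < best[0]:
                    best = (rmse, r, nr, params)
            return best

    In the paper the three criteria were optimized independently; a single-objective RMSE loop like the one above is the simplest variant of that procedure.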

  19. Modeling sediment yield in small catchments at event scale: Model comparison, development and evaluation

    Science.gov (United States)

    Tan, Z.; Leung, L. R.; Li, H. Y.; Tesfa, T. K.

    2017-12-01

    Sediment yield (SY) has significant impacts on river biogeochemistry and aquatic ecosystems, but it is rarely represented in Earth System Models (ESMs). Existing SY models focus on estimating SY from large river basins or individual catchments, so it is not clear how well they would simulate SY in ESMs at larger spatial scales and globally. In this study, we compare the strengths and weaknesses of eight well-known SY models in simulating annual mean SY at about 400 small catchments ranging in size from 0.22 to 200 km2 in the US, Canada and Puerto Rico. In addition, we also investigate the performance of these models in simulating event-scale SY at six catchments in the US using high-quality hydrological inputs. The model comparison shows that none of the models can reproduce the SY at large spatial scales, but the Morgan model performs better than the others despite its simplicity. In all model simulations, large underestimates occur in catchments with very high SY. A possible pathway to reducing the discrepancies is to incorporate sediment detachment by landsliding, which is currently not included in the models being evaluated. We propose a new SY model that is based on the Morgan model but includes a landsliding soil detachment scheme that is being developed. Along with the results of the model comparison and evaluation, preliminary findings from the revised Morgan model will be presented.

  20. A hybrid load flow and event driven simulation approach to multi-state system reliability evaluation

    International Nuclear Information System (INIS)

    George-Williams, Hindolo; Patelli, Edoardo

    2016-01-01

    Structural complexity of systems, coupled with their multi-state characteristics, renders their reliability and availability evaluation difficult. Notwithstanding the emergence of various techniques dedicated to complex multi-state system analysis, simulation remains the only approach applicable to realistic systems. However, most simulation algorithms are either system-specific or limited to simple systems, since they require enumerating all possible system states, defining the cut-sets associated with each state and monitoring their occurrence. In addition to being extremely tedious for large complex systems, state enumeration and cut-set definition require a detailed understanding of the system's failure mechanism. In this paper, a simple and generally applicable simulation approach, enhanced for multi-state systems of any topology, is presented. Here, each component is modelled as a semi-Markov stochastic process, and the operation of the system is mimicked via discrete-event simulation. The principles of flow conservation are invoked to determine the flow across the system for every performance-level change of its components, using the interior-point algorithm. This eliminates the need for cut-set definition and overcomes the limitations of existing techniques. The methodology can also be exploited to account for the effects of transmission efficiency and loading restrictions of components on system reliability and performance. The principles and algorithms developed are applied to two numerical examples to demonstrate their applicability. - Highlights: • A discrete event simulation model based on load flow principles. • Model does not require system path or cut sets. • Applicable to binary and multi-state systems of any topology. • Supports multiple output systems with competing demand. • Model is intuitive and generally applicable.
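
    The flow-conservation step can be illustrated with an off-the-shelf maximum-flow routine standing in for the paper's interior-point solver; for a pure transmission network the two give the same deliverable flow. The bridge topology and capacities below are invented for illustration.

        import networkx as nx

        def system_flow(cap, demand=100.0):
            # Component performance levels enter as edge capacities; the
            # deliverable output is a max-flow computation, so no cut-sets
            # or path sets ever need to be enumerated.
            g = nx.DiGraph()
            g.add_edge("s", "a", capacity=cap["c1"])
            g.add_edge("s", "b", capacity=cap["c2"])
            g.add_edge("a", "b", capacity=cap["c3"])   # bridge element
            g.add_edge("a", "t", capacity=cap["c4"])
            g.add_edge("b", "t", capacity=cap["c5"])
            flow_value, _ = nx.maximum_flow(g, "s", "t")
            return min(flow_value, demand)

        # Re-evaluated after every simulated performance-level change:
        print(system_flow({"c1": 60, "c2": 60, "c3": 20, "c4": 50, "c5": 50}))

    In the discrete-event loop, each semi-Markov component transition simply rewrites one capacity and triggers a re-solve, which is what makes the approach topology-agnostic.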

  1. Event-based soil loss models for construction sites

    Science.gov (United States)

    Trenouth, William R.; Gharabaghi, Bahram

    2015-05-01

    The elevated rates of soil erosion stemming from land clearing and grading activities during urban development can result in excessive amounts of eroded sediments entering waterways and harming the biota living therein. However, construction site event-based soil loss simulations - required for reliable design of erosion and sediment controls - are among the most uncertain types of hydrologic modeling. This study presents models with an improved degree of accuracy to advance the design of erosion and sediment controls for construction sites. The new models are developed using multiple linear regression (MLR) on event-based permutations of the Universal Soil Loss Equation (USLE) and artificial neural networks (ANN). These models were developed using surface runoff monitoring datasets obtained from three sites - Greensborough, Cookstown, and Alcona - in Ontario and datasets mined from the literature for three additional sites - Treynor, Iowa; Coshocton, Ohio; and Cordoba, Spain. The predictive MLR and ANN models can serve as both diagnostic and design tools for the effective sizing of erosion and sediment controls on active construction sites, and can be used for dynamic scenario forecasting when considering rapidly changing land use conditions during various phases of construction.
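
    As a concrete (and entirely hypothetical) illustration of the MLR-on-USLE-permutations idea, one can regress log-transformed event soil loss on log-transformed event predictors, so the fitted model is a multiplicative, USLE-like power law. The feature choice and numbers below are invented for illustration, not the paper's dataset.

        import numpy as np
        from sklearn.linear_model import LinearRegression

        # Hypothetical event records: rainfall erosivity EI30, runoff depth Q,
        # and a lumped USLE-style site factor (K*LS*C*P) per event.
        X = np.array([[120.0, 14.0, 0.31],
                      [45.0,   4.0, 0.31],
                      [210.0, 25.0, 0.18],
                      [88.0,   9.0, 0.18]])
        y = np.array([5.2, 0.9, 6.8, 1.7])   # event soil loss, t/ha (made up)

        # log(A) = b0 + b1*log(EI30) + b2*log(Q) + b3*log(K*LS*C*P)
        mlr = LinearRegression().fit(np.log(X), np.log(y))
        predicted = np.exp(mlr.predict(np.log(X)))

    An ANN variant replaces the linear regressor with a small feed-forward network on the same event-based predictors; both serve the control-sizing calculations mentioned above.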

  2. Reconstruction and numerical modelling of a flash flood event: Atrani 2010

    Science.gov (United States)

    Ciervo, F.; Papa, M. N.; Medina, V.; Bateman, A.

    2012-04-01

    This work reproduces the flash-flood event that occurred in Atrani (Amalfi Coast, southern Italy) on 9 September 2010. In the days leading up to the event, an intense low-pressure system affected northern Europe, attracting hot, humid air masses from the Mediterranean areas and pushing them towards the southern regions of Italy. These conditions contributed to the development of strong convective storm systems of the Mesoscale Convective System (MCS) type. The development of intense convective rain cells over an extremely confined area led to a cumulative daily rainfall of 129.2 mm; the maximum precipitation in 1 hr was 19.4 mm. The Dragone river is artificially forced to flow underneath the urban estate of Atrani through a culvert until it finally flows out into the sea. At the culvert inlet, a minor fraction of the water discharge (5.9 m^3/s), skimming over the channel cover, flowed onto the street and invaded the village. The channelized flow generated overpressure that broke the cover slab of the culvert and created a new discharge point (20 m^3/s) on the street, modifying the downstream flood dynamics. Information acquired soon after the event, through interviews with local people and field measurements, contributed significantly to the reconstruction of the rainfall event and to the characterization of the induced effects. In the absence of hydrometric data, amateur videos were of crucial importance for the development and calibration of the hydraulic model. A geomorphology-based rainfall-runoff model of the WFIUH (Width Function Instantaneous Unit Hydrograph) type is implemented to extract the hydrograph of the hydrological event. All analyses are performed with GIS support, based on a 5 m x 5 m Digital Terrain Model (DTM). Two parameters have been used to calibrate the model: the average watershed velocity (Vmean = 0.08 m/s) and the hydrodynamic diffusivity (D = 10E^-6 m^2/s). The model is calibrated based on the assessed peak discharge value

  3. Evaluating TCMS Train-to-Ground communication performances based on the LTE technology and discreet event simulations

    DEFF Research Database (Denmark)

    Bouaziz, Maha; Yan, Ying; Kassab, Mohamed

    2018-01-01

    The LTE (Long Term Evolution) network is considered as an alternative communication technology, instead of GSM-R (Global System for Mobile communications-Railway), because of some capacity and capability limits. In a first step, a pure simulation is used to evaluate the network load for a high-speed scenario, when the LTE network is shared between the train and different passengers. The simulation is based on the discrete-event network simulator Riverbed Modeler. A second step focuses on a co-simulation testbed, to evaluate performances with real traffic based on Hardware-In-The-Loop and OpenAirInterface modules. Preliminary simulation and co-simulation results show that LTE provides good performance for the TCMS traffic exchange in terms of packet delay and data integrity.

  4. Numerical Modeling of the Severe Cold Weather Event over Central Europe (January 2006

    Directory of Open Access Journals (Sweden)

    D. Hari Prasad

    2010-01-01

    Cold waves commonly occur at higher latitudes under prevailing high-pressure systems, especially during the winter season, and cause serious economic losses and cold-related deaths. Accurate prediction of such severe weather events is important for decision-making by administrators and for mitigation planning. The advanced high-resolution Weather Research and Forecasting (WRF) mesoscale model is used to simulate a severe cold wave event that occurred during January 2006 over Europe. The model is integrated for 31 days starting from 00 UTC on 1 January 2006 with 30 km horizontal resolution. Comparison of the model-derived, area-averaged daily mean temperatures at 2 m height for different zones over central Europe with observations indicates that the model is able to simulate the occurrence of the cold wave, with the observed time lag of 1 to 3 days but with lesser intensity. The temperature, winds, surface pressure and geopotential heights at 500 hPa reveal that the cold wave development is associated with the southward progression of a high-pressure system and cold air advection. The good agreement with the analysis fields indicates that the model is able to reproduce the time evolution of the cold wave event.

  5. Extreme temperature events on Greenland in observations and the MAR regional climate model

    Science.gov (United States)

    Leeson, Amber A.; Eastoe, Emma; Fettweis, Xavier

    2018-03-01

    Meltwater from the Greenland Ice Sheet contributed 1.7-6.12 mm to global sea level between 1993 and 2010 and is expected to contribute 20-110 mm to future sea level rise by 2100. These estimates were produced by regional climate models (RCMs), which are known to be robust at the ice sheet scale but occasionally miss regional- and local-scale climate variability (e.g. Leeson et al., 2017; Medley et al., 2013). To date, the fidelity of these models in the context of short-period variability in time (i.e. intra-seasonal) has not been fully assessed, for example their ability to simulate extreme temperature events. We use an event identification algorithm commonly used in extreme value analysis, together with observations from the Greenland Climate Network (GC-Net), to assess the ability of the MAR (Modèle Atmosphérique Régional) RCM to reproduce observed extreme positive-temperature events at 14 sites around Greenland. We find that MAR is able to accurately simulate the frequency and duration of these events but underestimates their magnitude by more than half a degree Celsius, although this bias is much smaller than that exhibited by coarse-scale ERA-Interim reanalysis data. As a result, melt energy in MAR output is underestimated by between 16 and 41 %, depending on the global forcing applied. Further work is needed to precisely determine the drivers of extreme temperature events, and why the model underperforms in this area, but our findings suggest that biases are passed into MAR from the boundary forcing data. This is important because these forcings are common between RCMs and their range of predictions of past and future ice sheet melting. We propose that examining extreme events should become a routine part of global and regional climate model evaluation and that addressing shortcomings in this area should be a priority for model development.

  6. Constructing Dynamic Event Trees from Markov Models

    International Nuclear Information System (INIS)

    Paolo Bucci; Jason Kirschenbaum; Tunc Aldemir; Curtis Smith; Ted Wood

    2006-01-01

    In the probabilistic risk assessment (PRA) of process plants, Markov models can be used to model accurately the complex dynamic interactions between plant physical process variables (e.g., temperature, pressure, etc.) and the instrumentation and control system that monitors and manages the process. One limitation of this approach that has prevented its use in nuclear power plant PRAs is the difficulty of integrating the results of a Markov analysis into an existing PRA. In this paper, we explore a new approach to the generation of failure scenarios and their compilation into dynamic event trees from a Markov model of the system. These event trees can be integrated into an existing PRA using software tools such as SAPHIRE. To implement our approach, we first construct a discrete-time Markov chain modeling the system of interest by: (a) partitioning the process variable state space into magnitude intervals (cells), (b) using analytical equations or a system simulator to determine the transition probabilities between the cells through the cell-to-cell mapping technique, and, (c) using given failure/repair data for all the components of interest. The Markov transition matrix thus generated can be thought of as a process model describing the stochastic dynamic behavior of the finite-state system. We can therefore search the state space starting from a set of initial states to explore all possible paths to failure (scenarios) with associated probabilities. We can also construct event trees of arbitrary depth by tracing paths from a chosen initiating event and recording the following events while keeping track of the probabilities associated with each branch in the tree. As an example of our approach, we use the simple level control system often used as a benchmark in the literature with one process variable (liquid level in a tank), and three control units: a drain unit and two supply units. Each unit includes a separate level sensor to observe the liquid level in the tank.
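
    The path-tracing step described here reduces to a bounded depth-first search over the Markov transition matrix. A minimal sketch with an invented three-state chain (the actual benchmark partitions the tank level into many cells and includes component states):

        import numpy as np

        def event_tree_paths(P, start, max_depth, p_min=1e-6):
            # Enumerate branch sequences from the initiating state, keeping
            # each path's probability and pruning branches below p_min.
            paths, stack = [], [([start], 1.0)]
            while stack:
                path, prob = stack.pop()
                if len(path) - 1 == max_depth:
                    paths.append((path, prob))
                    continue
                for nxt, p in enumerate(P[path[-1]]):
                    if prob * p >= p_min:
                        stack.append((path + [nxt], prob * p))
            return paths

        # Toy chain: 0 = nominal, 1 = degraded, 2 = failed (absorbing).
        P = np.array([[0.90, 0.08, 0.02],
                      [0.10, 0.80, 0.10],
                      [0.00, 0.00, 1.00]])
        for path, prob in event_tree_paths(P, start=0, max_depth=3):
            print(path, round(prob, 4))

    Each surviving path is one event-tree branch, with its probability given by the product of transition probabilities along the path - exactly the quantity that would be handed to a tool such as SAPHIRE.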

  7. Using discrete event simulation to compare the performance of family health unit and primary health care centre organizational models in Portugal.

    Science.gov (United States)

    Fialho, André S; Oliveira, Mónica D; Sá, Armando B

    2011-10-15

    Recent reforms in Portugal aimed at strengthening the role of primary care in order to improve the quality of the health care system. Since 2006 new policies aiming to change the organization, incentive structures and funding of the primary health care sector were designed, promoting the evolution of traditional primary health care centres (PHCCs) into a new type of organizational unit: family health units (FHUs). This study aimed to compare the performance of the PHCC and FHU organizational models and to assess the potential gains from converting PHCCs into FHUs. Stochastic discrete event simulation models for the two types of organizational models were designed and implemented using Simul8 software. These models were applied to data from nineteen primary care units in three municipalities of the Greater Lisbon area. The conversion of PHCCs into FHUs seems to have the potential to generate substantial improvements in productivity and accessibility, while not having a significant impact on costs. This conversion might entail a 45% reduction in the average number of days required to obtain a medical appointment and a 7% and 9% increase in the average number of medical and nursing consultations, respectively. Reorganization of PHCCs into FHUs might increase the accessibility of patients to services and the efficiency of primary care service provision.
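
    The study used Simul8; the sketch below illustrates the same kind of stochastic discrete event model of a primary care unit with SimPy, a common open-source alternative. All numerical parameters are invented placeholders, not values from the study:

```python
# Toy discrete-event model of a primary care unit, sketched with SimPy
# (the study itself used Simul8); numbers are made up for illustration.
import random
import simpy

N_DOCTORS = 3              # consultation rooms staffed in parallel
MEAN_INTERARRIVAL = 10.0   # minutes between patient arrivals
MEAN_CONSULT = 25.0        # minutes per consultation
SIM_TIME = 8 * 60          # one working day, in minutes

waits = []

def patient(env, doctors):
    arrival = env.now
    with doctors.request() as req:
        yield req                      # queue for a free doctor
        waits.append(env.now - arrival)
        yield env.timeout(random.expovariate(1.0 / MEAN_CONSULT))

def arrivals(env, doctors):
    while True:
        yield env.timeout(random.expovariate(1.0 / MEAN_INTERARRIVAL))
        env.process(patient(env, doctors))

random.seed(42)
env = simpy.Environment()
doctors = simpy.Resource(env, capacity=N_DOCTORS)
env.process(arrivals(env, doctors))
env.run(until=SIM_TIME)

print(f"patients seen: {len(waits)}, mean wait: {sum(waits)/len(waits):.1f} min")
```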

  8. 3-D topological signatures and a new discrimination method for single-electron events and 0νββ events in CdZnTe: A Monte Carlo simulation study

    Energy Technology Data Exchange (ETDEWEB)

    Zeng, Ming; Li, Teng-Lin; Cang, Ji-Rong [Key Laboratory of Particle & Radiation Imaging (Tsinghua University), Ministry of Education (China); Department of Engineering Physics, Tsinghua University, Beijing 100084 (China); Zeng, Zhi, E-mail: zengzhi@tsinghua.edu.cn [Key Laboratory of Particle & Radiation Imaging (Tsinghua University), Ministry of Education (China); Department of Engineering Physics, Tsinghua University, Beijing 100084 (China); Fu, Jian-Qiang; Zeng, Wei-He; Cheng, Jian-Ping; Ma, Hao; Liu, Yi-Nong [Key Laboratory of Particle & Radiation Imaging (Tsinghua University), Ministry of Education (China); Department of Engineering Physics, Tsinghua University, Beijing 100084 (China)

    2017-06-21

    In neutrinoless double beta (0νββ) decay experiments, the diversity of topological signatures of different particles provides an important tool to distinguish double beta events from background events and reduce background rates. Aiming at suppressing single-electron backgrounds, which are the most challenging, several groups have established Monte Carlo simulation packages to study the topological characteristics of single-electron events and 0νββ events and to develop methods to differentiate them. In this paper, applying knowledge from graph theory, a new topological signature called the REF track (Refined Energy-Filtered track) is proposed and shown to be an accurate approximation of the real particle trajectory. Based on the analysis of the energy depositions along the REF tracks of single-electron events and 0νββ events, REF energy deposition models for both kinds of events are proposed that indicate the significant differences between them. With these differences, this paper presents a new discrimination method, which, in the Monte Carlo simulation, achieved a single-electron rejection factor of 93.8±0.3 (stat.)% as well as a 0νββ efficiency of 85.6±0.4 (stat.)% with optimized parameters in CdZnTe.

  9. Simultaneous Modeling of Gradual SEP Events at the Earth and the Mars

    Science.gov (United States)

    Hu, J.; Li, G.

    2017-12-01

    Solar energetic particle (SEP) events are the number one space hazard for spacecraft instruments and astronauts' safety. Recent studies have shown that both the longitudinal and radial extent of SEP events can be very significant. In this work, we use the improved Particle Acceleration and Transport in the Heliosphere (iPATH) model to simulate gradual SEP events that have impacts at both the Earth and Mars. We follow the propagation of a 2D CME-driven shock. Particles are accelerated at the shock via the diffusive shock acceleration (DSA) mechanism. Transport of the escaped particles to the Earth and Mars is then followed using a backward stochastic differential equation method. Perpendicular diffusion is considered in both the DSA and the transport process. Model results such as time intensity profiles and energetic particle spectra at the two locations are compared to understand the spatial extent of an SEP event. Observational data at the Earth and Mars are also studied to validate the model.

  10. Improved simulation of two types of El Niño in CMIP5 models

    International Nuclear Information System (INIS)

    Kug, Jong-Seong; Ham, Yoo-Geun; Lee, June-Yi; Jin, Fei-Fei

    2012-01-01

    Using the coupled general circulation models (CGCMs) participating in phases 3 and 5 of the Coupled Model Intercomparison Project (CMIP3 and CMIP5), simulations of the two types of El Niño event are evaluated. Previous studies using CMIP3 models pointed out that most of the models tend to simulate a single type of El Niño, and have serious problems in simulating the two types of El Niño independently. On average, the CGCMs in CMIP5 perform slightly better in simulating the two types of El Niño event independently, with more distinct spatial patterns, compared to those in CMIP3. It is demonstrated that the precipitation response to the Cold Tongue El Niño is one of the important factors in simulating the two types of El Niño independently in coupled models, and this precipitation response is closely related to the dry bias over the equatorial eastern Pacific. (letter)

  11. Deliverable 6.2 - Software: upgraded MC simulation tools capable of simulating a complete in-beam ET experiment, from the beam to the detected events. Report with the description of one (or few) reference clinical case(s), including the complete patient model and beam characteristics

    CERN Document Server

    The ENVISION Collaboration

    2014-01-01

    Deliverable 6.2 - Software: upgraded MC simulation tools capable of simulating a complete in-beam ET experiment, from the beam to the detected events. Report with the description of one (or few) reference clinical case(s), including the complete patient model and beam characteristics

  12. Modeling Multi-Event Non-Point Source Pollution in a Data-Scarce Catchment Using ANN and Entropy Analysis

    Directory of Open Access Journals (Sweden)

    Lei Chen

    2017-06-01

    Full Text Available Event-based runoff–pollutant relationships have been key to water quality management, but the scarcity of measured data results in poor model performance, especially for multiple rainfall events. In this study, a new framework was proposed for event-based non-point source (NPS) prediction and evaluation. The artificial neural network (ANN) was used to extend the runoff–pollutant relationship from complete data events to other data-scarce events. The interpolation method was then used to solve the problem of tail deviation in the simulated pollutographs. In addition, the entropy method was utilized to train the ANN for comprehensive evaluations. A case study was performed in the Three Gorges Reservoir Region, China. Results showed that the ANN performed well in the NPS simulation, especially for light rainfall events, and the phosphorus predictions were always more accurate than the nitrogen predictions under scarce data conditions. In addition, peak pollutant data scarcity had a significant impact on the model performance. Furthermore, traditional indicators would lead to certain information loss during the model evaluation, but the entropy weighting method could provide a more accurate model evaluation. These results would be valuable for monitoring schemes and the quantitation of event-based NPS pollution, especially in data-poor catchments.
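
    The record does not give the network architecture; as a hedged illustration of the ANN step only, the following sketch fits a small scikit-learn multilayer perceptron to a synthetic event runoff-to-pollutant-load relationship (feature choices, data, and hyperparameters are invented; the paper's entropy weighting is not reproduced):

```python
# Sketch of the ANN step: learn an event runoff -> pollutant-load mapping
# on synthetic data with scikit-learn.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 200
runoff = rng.gamma(shape=2.0, scale=5.0, size=n)       # event runoff (mm)
intensity = rng.gamma(shape=2.0, scale=3.0, size=n)    # rainfall intensity
X = np.column_stack([runoff, intensity])
# synthetic "pollutant load" with a nonlinear, first-flush-like response
y = 0.8 * runoff**0.7 * (1 + 0.1 * intensity) + rng.normal(0, 0.5, n)

scaler = StandardScaler().fit(X)
ann = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
ann.fit(scaler.transform(X), y)

x_new = scaler.transform([[12.0, 6.0]])   # a hypothetical data-scarce event
print(f"predicted load: {ann.predict(x_new)[0]:.2f}")
```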

  13. A Software Development Simulation Model of a Spiral Process

    OpenAIRE

    Carolyn Mizell; Linda Malone

    2009-01-01

    This paper will present a discrete event simulation model of a spiral development lifecycle that can be used to analyze the cost and schedule effects of using such a process in comparison to a waterfall process. There is a need for simulation models of software development processes other than the waterfall, as new processes are becoming more widely used in order to overcome the limitations of the traditional waterfall lifecycle. The use of a spiral process can make the inherently difficult job of...

  14. A model of spreading of sudden events on social networks

    Science.gov (United States)

    Wu, Jiao; Zheng, Muhua; Zhang, Zi-Ke; Wang, Wei; Gu, Changgui; Liu, Zonghua

    2018-03-01

    Information spreading has been studied for decades, but its underlying mechanism is still under debate, especially for information that spreads extremely fast through the Internet. By focusing on the information spreading data of six typical events on Sina Weibo, we surprisingly find that the spreading of modern information shows some new features, i.e., it is either extremely fast or slow, depending on the individual events. To understand this mechanism, we present a susceptible-accepted-recovered model with both information sensitivity and social reinforcement. Numerical simulations show that the model can reproduce the main spreading patterns of the six typical events. With this model, we further reveal that the spreading can be speeded up by increasing either the strength of information sensitivity or social reinforcement. Depending on the transmission probability and information sensitivity, the final accepted size can change from a continuous to a discontinuous transition when the strength of the social reinforcement is large. Moreover, an edge-based compartmental theory is presented to explain the numerical results. These findings may be of significance for the control of information spreading in modern society.
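
    A minimal reading of the described model, sketched on a random graph: susceptible nodes accept the information with a probability that grows with repeated exposures (social reinforcement), scaled by an information-sensitivity parameter. The functional form and all parameter values are assumptions for illustration, not the paper's equations:

```python
# Minimal susceptible-accepted-recovered (SAR) sketch on a random graph with
# social reinforcement: acceptance probability grows with cumulative exposures.
import random
import networkx as nx

def sar(g, beta=0.1, sensitivity=1.0, reinforcement=0.5,
        gamma=0.2, seeds=3, steps=100):
    state = {v: "S" for v in g}
    exposures = {v: 0 for v in g}
    for v in random.sample(list(g), seeds):
        state[v] = "A"
    accepted_total = seeds
    for _ in range(steps):
        to_accept, to_recover = [], []
        for v in g:
            if state[v] == "A":
                for u in g.neighbors(v):
                    if state[u] == "S":
                        exposures[u] += 1
                        # reinforcement: each repeated exposure raises the odds
                        p = min(1.0, beta * sensitivity
                                * (1 + reinforcement * (exposures[u] - 1)))
                        if random.random() < p:
                            to_accept.append(u)
                if random.random() < gamma:
                    to_recover.append(v)   # accepted node loses interest
        for u in to_accept:
            if state[u] == "S":
                state[u] = "A"
                accepted_total += 1
        for v in to_recover:
            state[v] = "R"
    return accepted_total

random.seed(1)
g = nx.erdos_renyi_graph(n=2000, p=0.005)
print("final accepted size:", sar(g))
```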

  15. An Efficient Simulation Method for Rare Events

    KAUST Repository

    Rached, Nadhir B.

    2015-01-07

    Estimating the probability that a sum of random variables (RVs) exceeds a given threshold is a well-known challenging problem. Closed-form expressions for the sum distribution do not generally exist, which has led to an increasing interest in simulation approaches. A crude Monte Carlo (MC) simulation is the standard technique for the estimation of this type of probability. However, this approach is computationally expensive, especially when dealing with rare events. Variance reduction techniques are alternative approaches that can improve the computational efficiency of naive MC simulations. We propose an Importance Sampling (IS) simulation technique based on the well-known hazard rate twisting approach, which has the advantage of being asymptotically optimal for any arbitrary RVs. The wide scope of applicability of the proposed method is mainly due to our particular way of selecting the twisting parameter. It is worth observing that this interesting feature is rarely satisfied by variance reduction algorithms, whose performance has usually only been proven under restrictive assumptions. The method also achieves good efficiency, illustrated by selected simulation results comparing its performance with that of an algorithm based on a conditional MC technique.

  16. An Efficient Simulation Method for Rare Events

    KAUST Repository

    Rached, Nadhir B.; Benkhelifa, Fatma; Kammoun, Abla; Alouini, Mohamed-Slim; Tempone, Raul

    2015-01-01

    Estimating the probability that a sum of random variables (RVs) exceeds a given threshold is a well-known challenging problem. Closed-form expressions for the sum distribution do not generally exist, which has led to an increasing interest in simulation approaches. A crude Monte Carlo (MC) simulation is the standard technique for the estimation of this type of probability. However, this approach is computationally expensive, especially when dealing with rare events. Variance reduction techniques are alternative approaches that can improve the computational efficiency of naive MC simulations. We propose an Importance Sampling (IS) simulation technique based on the well-known hazard rate twisting approach, which has the advantage of being asymptotically optimal for any arbitrary RVs. The wide scope of applicability of the proposed method is mainly due to our particular way of selecting the twisting parameter. It is worth observing that this interesting feature is rarely satisfied by variance reduction algorithms, whose performance has usually only been proven under restrictive assumptions. The method also achieves good efficiency, illustrated by selected simulation results comparing its performance with that of an algorithm based on a conditional MC technique.
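
    A sketch of the general idea for the special case of i.i.d. exponential summands; the heuristic choice of the twisting parameter below (matching the twisted mean of the sum to the threshold) stands in for, and is not, the paper's optimal selection rule:

```python
# Importance sampling with exponential hazard-rate twisting for
# P(sum of n i.i.d. Exp(lam) RVs > gamma).
import numpy as np

def is_estimate(n=10, lam=1.0, gamma=30.0, samples=100_000, seed=0):
    rng = np.random.default_rng(seed)
    # heuristic twist: make the twisted mean of the sum equal to gamma
    theta = max(0.0, 1.0 - n / (lam * gamma))
    lam_t = (1.0 - theta) * lam            # twisted (reduced) hazard rate
    x = rng.exponential(1.0 / lam_t, size=(samples, n))
    s = x.sum(axis=1)
    # likelihood ratio f(x)/f_theta(x) for the whole sample vector
    lr = (1.0 - theta) ** (-n) * np.exp(-theta * lam * s)
    weighted = (s > gamma) * lr
    return weighted.mean(), weighted.std() / np.sqrt(samples)

est, err = is_estimate()
print(f"P(S > gamma) ~ {est:.3e} +/- {err:.1e}")
```

    Sampling from the twisted (heavier-tailed) density makes threshold crossings common, and the likelihood ratio re-weights each sample so the estimator stays unbiased; this is what lets the IS estimator reach small probabilities with far fewer samples than crude MC.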

  17. Simulation of Electrical Grid with Omnet++ Open Source Discrete Event System Simulator

    Directory of Open Access Journals (Sweden)

    Sőrés Milán

    2016-12-01

    Full Text Available The simulation of electrical networks is very important before the development and servicing of electrical networks and grids can occur. There are software packages that can simulate the behaviour of electrical grids under different operating conditions, but these simulation environments cannot be used in a single cloud-based project, because they are not GNU-licensed software products. In this paper, an integrated framework was proposed that models and simulates communication networks. The design and operation of the simulation environment are investigated and a model of electrical components is proposed. After simulation, the simulation results were compared to manually computed results.

  18. Numerical simulation of a rare winter hailstorm event over Delhi, India on 17 January 2013

    Science.gov (United States)

    Chevuturi, A.; Dimri, A. P.; Gunturu, U. B.

    2014-12-01

    This study analyzes the cause of the rare occurrence of a winter hailstorm over New Delhi/NCR (National Capital Region), India. The absence of increased surface temperature or low-level moisture incursion during winter cannot generate the deep convection required for sustaining a hailstorm. Consequently, NCR shows very few cases of hailstorms in the months of December-January-February, making winter hail formation a question of interest. For this study, a recent winter hailstorm event on 17 January 2013 (16:00-18:00 UTC) occurring over NCR is investigated. The storm is simulated using the Weather Research and Forecasting (WRF) model with the Goddard Cumulus Ensemble (GCE) microphysics scheme with two different options: hail and graupel. The aim of the study is to understand and describe the cause of the hailstorm event over NCR through a comparative analysis of the two options of GCE microphysics. Upon evaluating the model simulations, it is observed that the hail option shows a precipitation intensity more similar to the Tropical Rainfall Measuring Mission (TRMM) observation than the graupel option does, and it is able to simulate hail precipitation. Using the model-simulated output with the hail option, a detailed investigation into the dynamics of the hailstorm is performed. The analysis based on the numerical simulation suggests that deep instability in the atmospheric column led to the formation of hailstones as the cloud formation reached up to the glaciated zone, promoting ice nucleation. In winters, such instability conditions rarely form, due to low levels of available potential energy and moisture incursion, along with upper-level baroclinic instability due to the presence of a western disturbance (WD). Such rare positioning is found to lower the tropopause with an increased temperature gradient, leading to winter hailstorm formation.

  19. Numerical simulation of a rare winter hailstorm event over Delhi, India on 17 January 2013

    KAUST Repository

    Chevuturi, A.

    2014-12-19

    This study analyzes the cause of the rare occurrence of a winter hailstorm over New Delhi/NCR (National Capital Region), India. The absence of increased surface temperature or low-level moisture incursion during winter cannot generate the deep convection required for sustaining a hailstorm. Consequently, NCR shows very few cases of hailstorms in the months of December-January-February, making winter hail formation a question of interest. For this study, a recent winter hailstorm event on 17 January 2013 (16:00–18:00 UTC) occurring over NCR is investigated. The storm is simulated using the Weather Research and Forecasting (WRF) model with the Goddard Cumulus Ensemble (GCE) microphysics scheme with two different options: hail and graupel. The aim of the study is to understand and describe the cause of the hailstorm event over NCR through a comparative analysis of the two options of GCE microphysics. Upon evaluating the model simulations, it is observed that the hail option shows a precipitation intensity more similar to the Tropical Rainfall Measuring Mission (TRMM) observation than the graupel option does, and it is able to simulate hail precipitation. Using the model-simulated output with the hail option, a detailed investigation into the dynamics of the hailstorm is performed. The analysis based on the numerical simulation suggests that deep instability in the atmospheric column led to the formation of hailstones as the cloud formation reached up to the glaciated zone, promoting ice nucleation. In winters, such instability conditions rarely form, due to low levels of available potential energy and moisture incursion, along with upper-level baroclinic instability due to the presence of a western disturbance (WD). Such rare positioning is found to lower the tropopause with an increased temperature gradient, leading to winter hailstorm formation.

  20. Hybrid High-Fidelity Modeling of Radar Scenarios Using Atemporal, Discrete-Event, and Time-Step Simulation

    Science.gov (United States)

    2016-12-01

    [No abstract available for this record; the extracted text contains only list-of-figures fragments, e.g. "High-efficiency and high-fidelity radar system simulation flowchart", "Methodology roadmaps: experimental-design flowchart showing hybrid sensor models integrated from three simulation categories", and "Hybrid simulation display and output produced by Java Simkit program".]

  1. Dynamic vegetation modeling of tropical biomes during Heinrich events

    Science.gov (United States)

    Handiani, Dian Noor; Paul, André; Dupont, Lydie M.

    2010-05-01

    Heinrich events are thought to be associated with a slowdown of the Atlantic Meridional Overturning Circulation (AMOC), which in turn would lead to a cooling of the North Atlantic Ocean and a warming of the South Atlantic Ocean (the "bipolar seesaw" hypothesis). The accompanying abrupt climate changes occurred not only in the ocean but also on the continents. Changes were strongest in the Northern Hemisphere but were registered in the tropics as well. Pollen data from Angola and Brazil showed that climate changes during Heinrich events affected vegetation patterns very differently in eastern South America and western Africa. To understand the differential response in the terrestrial tropics, we studied the vegetation changes during Heinrich events by using a dynamic global vegetation model (TRIFFID) as part of the University of Victoria (UVic) Earth System-Climate Model (ESCM). The model results show a bipolar seesaw pattern in temperature and precipitation during a near-collapse of the AMOC. The succession in plant-functional types (PFTs) showed changes from forest to shrubs to desert, including spreading desert in northwest Africa, retreating broadleaf trees in West Africa and northern South America, but advancing broadleaf trees in Brazil. The pattern is explained by a southward shift of the tropical rainbelt resulting in a strong decrease in precipitation over northwest and West Africa as well as in northern South America, but an increase in precipitation in eastern Brazil. To facilitate the comparison between modeled vegetation results with pollen data, we diagnosed the distribution of biomes from the PFT coverage and the simulated model climate. The biome distribution was computed for Heinrich event 1 and the Last Glacial Maximum as well as for pre-industrial conditions. We used a classification of biomes in terms of "mega-biomes", which were defined following a scheme originally proposed by BIOME 6000 (v 4.2). The biome distribution of the Sahel region

  2. Discrete-event system simulation on small and medium enterprises productivity improvement

    Science.gov (United States)

    Sulistio, J.; Hidayah, N. A.

    2017-12-01

    Small and medium industries in Indonesia are currently developing. The problem faced by SMEs is the difficulty of meeting the growing demand coming into the company. Therefore, SMEs need an analysis and evaluation of their production processes in order to meet all orders. The purpose of this research is to increase the productivity of the SME production floor by applying discrete-event system simulation. This method is preferred because it can solve complex problems arising from the dynamic and stochastic nature of the system. To increase the credibility of the simulation, the model was validated by comparing the averages of two trials, the variances of two trials, and a chi-square test. Afterwards, the Bonferroni method was applied to develop several alternatives. The article concludes that the productivity of the SME production floor can be increased by up to 50% by adding to the capacity of the dyeing and drying machines.
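
    The compare-alternatives step can be sketched with Bonferroni-adjusted paired confidence intervals; the replication outputs below are fabricated placeholders, and the pairing assumes common random numbers across designs:

```python
# Sketch: Bonferroni-adjusted confidence intervals for the mean throughput
# gain of k alternative configurations against the base design.
import numpy as np
from scipy import stats

base = np.array([102, 98, 105, 101, 99, 103, 100, 104])   # base-case replications
alternatives = {
    "extra dyeing machine": np.array([118, 121, 115, 120, 119, 122, 117, 116]),
    "extra drying machine": np.array([110, 108, 112, 111, 109, 113, 107, 110]),
}

alpha = 0.05
k = len(alternatives)                 # number of simultaneous comparisons
for name, alt in alternatives.items():
    diff = alt - base                 # paired by common random numbers
    ci = stats.t.interval(1 - alpha / k,          # Bonferroni correction
                          df=len(diff) - 1,
                          loc=diff.mean(),
                          scale=stats.sem(diff))
    print(f"{name}: mean gain {diff.mean():.1f}, CI {ci[0]:.1f}..{ci[1]:.1f}")
```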

  3. An improved methodology for dynamic modelling and simulation of ...

    Indian Academy of Sciences (India)

    The diversity of the processes and the complexity of the drive system .... modelling the specific event, general simulation tools such as MATLAB® provide the user with tools for creating ..... using the pulse width modulation (PWM) techniques.

  4. Hydrometeorological multi-model ensemble simulations of the 4 November 2011 flash flood event in Genoa, Italy, in the framework of the DRIHM project

    Directory of Open Access Journals (Sweden)

    A. Hally

    2015-03-01

    Full Text Available The e-Science environment developed in the framework of the EU-funded DRIHM project was used to demonstrate its ability to provide relevant, meaningful hydrometeorological forecasts. This was illustrated for the tragic case of 4 November 2011, when Genoa, Italy, was flooded as the result of heavy, convective precipitation that inundated the Bisagno catchment. The Meteorological Model Bridge (MMB), an innovative software component developed within the DRIHM project for the interoperability of meteorological and hydrological models, is a key component of the DRIHM e-Science environment. The MMB allowed three different rainfall-discharge models (DRiFt, RIBS and HBV) to be driven by four mesoscale limited-area atmospheric models (WRF-NMM, WRF-ARW, Meso-NH and AROME) and a downscaling algorithm (RainFARM) in a seamless fashion. In addition to this multi-model configuration, some of the models were run in probabilistic mode, thus giving a comprehensive account of modelling errors and a very large number of likely hydrometeorological scenarios (> 1500). The multi-model approach proved to be necessary because, whilst various aspects of the event were successfully simulated by different models, none of the models reproduced all of these aspects correctly. It was shown that the resulting set of simulations helped identify key atmospheric processes responsible for the large rainfall accumulations over the Bisagno basin. The DRIHM e-Science environment facilitated an evaluation of the sensitivity to atmospheric and hydrological modelling errors. This showed that both had a significant impact on predicted discharges, the former being larger than the latter. Finally, the usefulness of the set of hydrometeorological simulations was assessed from a flash flood early-warning perspective.

  5. Algorithm and simulation development in support of response strategies for contamination events in air and water systems.

    Energy Technology Data Exchange (ETDEWEB)

    Waanders, Bart Van Bloemen

    2006-01-01

    Chemical/Biological/Radiological (CBR) contamination events pose a considerable threat to our nation's infrastructure, especially in large internal facilities, external flows, and water distribution systems. Because physical security can only be enforced to a limited degree, deployment of early warning systems is being considered. However, to achieve reliable and efficient functionality, several complex questions must be answered: (1) where should sensors be placed, (2) how can sparse sensor information be efficiently used to determine the location of the original intrusion, (3) what are the model and data uncertainties, (4) how should these uncertainties be handled, and (5) how can our algorithms and forward simulations be sufficiently improved to achieve real-time performance? This report presents the results of a three-year algorithmic and application development effort to support the identification, mitigation, and risk assessment of CBR contamination events. The main thrust of this investigation was to develop (1) computationally efficient algorithms for strategically placing sensors, (2) processes for identifying contamination events from sparse observations, (3) characterization of uncertainty through accurate demand forecasts and investigation of uncertain simulation model parameters, (4) risk assessment capabilities, and (5) reduced-order modeling methods. The development effort was focused on water distribution systems, large internal facilities, and outdoor areas.
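
    Sensor placement of this kind is often cast as a max-coverage problem; the greedy sketch below (an illustration of that framing, not the report's algorithm) repeatedly picks the candidate location that detects the most not-yet-covered contamination scenarios:

```python
# Greedy max-coverage sketch for sensor placement over a set of simulated
# contamination scenarios. Locations and detection sets are toy placeholders.
def greedy_placement(detects, budget):
    """detects: dict location -> set of scenario ids it would detect."""
    chosen, covered = [], set()
    for _ in range(budget):
        best = max(detects, key=lambda loc: len(detects[loc] - covered))
        gain = detects[best] - covered
        if not gain:
            break                    # nothing left to gain
        chosen.append(best)
        covered |= gain
    return chosen, covered

detects = {
    "node_A": {1, 2, 3, 7},
    "node_B": {3, 4, 5},
    "node_C": {5, 6, 7, 8},
    "node_D": {1, 8},
}
chosen, covered = greedy_placement(detects, budget=2)
print("place sensors at:", chosen, "covering scenarios:", sorted(covered))
```

    For submodular coverage objectives, this greedy rule carries the classic (1 - 1/e) approximation guarantee, which is one reason it is a popular baseline for sensor placement.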

  6. Optimized Parallel Discrete Event Simulation (PDES) for High Performance Computing (HPC) Clusters

    National Research Council Canada - National Science Library

    Abu-Ghazaleh, Nael

    2005-01-01

    The aim of this project was to study the communication subsystem performance of state of the art optimistic simulator Synchronous Parallel Environment for Emulation and Discrete-Event Simulation (SPEEDES...

  7. Advanced training simulator models. Implementation and validation

    International Nuclear Information System (INIS)

    Borkowsky, Jeffrey; Judd, Jerry; Belblidia, Lotfi; O'farrell, David; Andersen, Peter

    2008-01-01

    Modern training simulators are required to replicate plant data for both thermal-hydraulic and neutronic response. Replication is required such that reactivity manipulation on the simulator properly trains the operator for reactivity manipulation at the plant. This paper discusses advanced models which perform this function in real time using the coupled code system THOR/S3R. This code system models all fluid systems in detail using an advanced two-phase thermal-hydraulic model. The nuclear core is modeled using an advanced three-dimensional nodal method together with cycle-specific nuclear data. These models are configured to run interactively from a graphical instructor station or hardware operation panels. The simulator models are theoretically rigorous and are expected to replicate the physics of the plant. However, to verify replication, the models must be independently assessed. Plant data is the preferred validation method, but plant data is often not available for many important training scenarios. In the absence of data, validation may be obtained by slower-than-real-time transient analysis. This analysis can be performed by coupling a safety analysis code and a core design code. Such a coupling exists between the codes RELAP5 and SIMULATE-3K (S3K). RELAP5/S3K is used to validate the real-time model for several postulated plant events. (author)

  8. Hybrid modelling in discrete-event control system design

    NARCIS (Netherlands)

    Beek, van D.A.; Rooda, J.E.; Gordijn, S.H.F.; Borne, P.

    1996-01-01

    Simulation-based testing of discrete-event control systems can be advantageous. There is, however, a considerable difference between languages for real-time control and simulation languages. The Chi language, presented in this paper, is suited to specification and simulation of real-time control

  9. Quasi-Monte Carlo simulation and variance reduction techniques substantially reduce computational requirements of patient-level simulation models: An application to a discrete event simulation model

    NARCIS (Netherlands)

    Treur, M.; Postma, M.

    2014-01-01

    Objectives: Patient-level simulation models provide increased flexibility to overcome the limitations of cohort-based approaches in health-economic analysis. However, the computational requirements of reaching convergence are a notorious barrier. The objective was to assess the impact of using
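
    A minimal sketch of the contrast being studied, using SciPy's scrambled Sobol sequence in place of pseudorandom draws for a toy "patient-level" outcome function (the function and its inputs are invented, not the study's model):

```python
# Crude Monte Carlo vs. quasi-Monte Carlo (scrambled Sobol) estimation of a
# toy expected-cost integral over two uniform inputs.
import numpy as np
from scipy.stats import qmc, norm

def cost(u):
    # map two uniforms to model inputs, then to a nonlinear "cost"
    age_effect = norm.ppf(u[:, 0])         # standard-normal patient effect
    event_risk = u[:, 1]
    return np.exp(0.3 * age_effect) * (1.0 + 4.0 * event_risk**2)

n = 2 ** 12                                # power of two suits Sobol points
rng = np.random.default_rng(0)
mc = cost(rng.random((n, 2))).mean()

sobol = qmc.Sobol(d=2, scramble=True, seed=0)
qmc_est = cost(sobol.random(n)).mean()

print(f"crude MC:  {mc:.4f}")
print(f"quasi-MC:  {qmc_est:.4f}")
```

    Low-discrepancy sequences fill the input space more evenly than pseudorandom draws, which typically shrinks the error of such integrals for a fixed number of simulated patients.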

  10. Optimization of the resolution of remotely sensed digital elevation model to facilitate the simulation and spatial propagation of flood events in flat areas

    Science.gov (United States)

    Karapetsas, Nikolaos; Skoulikaris, Charalampos; Katsogiannos, Fotis; Zalidis, George; Alexandridis, Thomas

    2013-04-01

    The use of satellite remote sensing products, such as Digital Elevation Models (DEMs), under specific computational interfaces of Geographic Information Systems (GIS) has fostered and facilitated the acquisition of data on specific hydrologic features, such as slope, flow direction and flow accumulation, which are crucial inputs to hydrology or hydraulic models at the river basin scale. However, even though DEMs of different resolutions, varying from a few km down to 20 m, are freely available for the European continent, these remotely sensed elevation data are rather coarse in cases where large flat areas are dominant inside a watershed, resulting in an unsatisfactory representation of the terrain characteristics. This scientific work aims at implementing a combined interpolation technique to improve the resolution of a DEM so that it can be used as the input ground model to a hydraulic model for assessing the propagation of potential flood events in plains. More specifically, the second version of the ASTER Global Digital Elevation Model (GDEM2), which has an overall accuracy of around 20 meters, was interpolated with a vast number of aerial control points available from the Hellenic Mapping and Cadastral Organization (HMCO). The uncertainty inherent in both available datasets (ASTER & HMCO), together with uncorrelated errors and artifacts, was minimized by incorporating geostatistical filtering. The resolution of the produced DEM was approximately 10 meters and its validation was conducted with an external dataset of 220 geodetic survey points. The derived DEM was then used as input to the hydraulic model InfoWorks RS, whose operation is based on the relief characteristics contained in the ground model, to define, in an automated way, the cross-section parameters and to simulate the flood spatial distribution. The plain of Serres, which is located in the downstream part of the Struma/Strymon transboundary river basin shared

  11. Analysis and verification of a prediction model of solar energetic proton events

    Science.gov (United States)

    Wang, J.; Zhong, Q.

    2017-12-01

    Solar energetic particle events can cause severe radiation damage near Earth. Alerts and summary products for solar energetic proton events are provided by the Space Environment Prediction Center (SEPC) according to the flux of greater than 10 MeV protons measured by the GOES satellite in geosynchronous orbit. The start of a solar energetic proton event is defined as the time when the flux of greater than 10 MeV protons equals or exceeds 10 proton flux units (pfu). In this study, a model was developed to predict solar energetic proton events and provide warnings minutes in advance, based on both the soft X-ray flux and the integral proton flux measured by GOES. The quality of the forecast model was measured through verification of its accuracy, reliability, discrimination capability, and forecast skill. The peak flux and rise time of the solar energetic proton events in the six channels, >1 MeV, >5 MeV, >10 MeV, >30 MeV, >50 MeV, >100 MeV, were also simulated and analyzed.
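
    The verification attributes listed above are conventionally computed from a 2x2 contingency table of forecast versus observed events; a minimal sketch with placeholder counts:

```python
# Standard event-forecast verification from a 2x2 contingency table:
# probability of detection (POD), false alarm ratio (FAR) and the Heidke
# skill score (HSS). The counts are invented placeholders.
def verify(hits, false_alarms, misses, correct_negatives):
    a, b, c, d = hits, false_alarms, misses, correct_negatives
    pod = a / (a + c)                  # fraction of observed events forecast
    far = b / (a + b)                  # fraction of forecasts that failed
    hss = 2 * (a * d - b * c) / ((a + c) * (c + d) + (a + b) * (b + d))
    return pod, far, hss

pod, far, hss = verify(hits=18, false_alarms=7, misses=4, correct_negatives=271)
print(f"POD={pod:.2f}  FAR={far:.2f}  HSS={hss:.2f}")
```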

  12. PATHWAY: a simulation model of radionuclide-transport through agricultural food chains

    International Nuclear Information System (INIS)

    Kirchner, T.B.; Whicker, F.W.; Otis, M.D.

    1982-01-01

    PATHWAY simulates the transport of radionuclides from fallout through an agricultural ecosystem. The agro-ecosystem is subdivided into several land management units, each of which is used either for grazing animals, for growing hay, or for growing food crops. The model simulates the transport of radionuclides by both discrete events and continuous, time-dependent processes. The discrete events include tillage of soil, harvest and storage of crops, and deposition of fallout. The continuous processes include the transport of radionuclides due to resuspension, weathering, rain splash, percolation, leaching, adsorption and desorption of radionuclides in the soil, root uptake, foliar absorption, growth and senescence of vegetation, and the ingestion, assimilation, and excretion of radionuclides by animals. Preliminary validation studies indicate that the model dynamics and simulated values of radionuclide concentrations in several agricultural products agree well with measured values when the model is driven with site-specific data on deposition from world-wide fallout.
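
    The mixture of discrete events and continuous processes can be sketched as an integration loop interrupted by scheduled events; the two-compartment structure, rate constants and event times below are invented for illustration and are far simpler than PATHWAY itself:

```python
# Toy hybrid scheme in the spirit of PATHWAY: continuous transfers integrated
# with explicit Euler steps, interrupted by scheduled discrete events.
DT = 0.1                  # time step (days)
K_WEATHER = 0.05          # 1/day, vegetation surface -> soil (weathering)
K_UPTAKE = 0.002          # 1/day, soil -> vegetation (root uptake)

events = {
    round(50 / DT): ("deposit", 100.0),   # day 50: fallout on vegetation (Bq/m2)
    round(120 / DT): ("harvest", 0.9),    # day 120: 90% of vegetation removed
}

veg = soil = 0.0
for step in range(round(200 / DT) + 1):
    if step in events:                    # discrete events fire instantaneously
        kind, value = events[step]
        if kind == "deposit":
            veg += value
        elif kind == "harvest":
            veg *= 1.0 - value
    # continuous, time-dependent transfer between compartments
    flux = K_WEATHER * veg - K_UPTAKE * soil
    veg -= flux * DT
    soil += flux * DT

print(f"day 200: vegetation {veg:.1f} Bq/m2, soil {soil:.1f} Bq/m2")
```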

  13. LISEM: a physically based model to simulate runoff and soil erosion in catchments: model structure

    NARCIS (Netherlands)

    Roo, de A.P.J.; Wesseling, C.G.; Cremers, N.H.D.T.; Verzandvoort, M.A.; Ritsema, C.J.; Oostindie, K.

    1996-01-01

    The Limburg Soil Erosion Model (LISEM) is described as a way of simulating hydrological and soil erosion processes during single rainfall events on the catchment scale. Sensitivity analysis of the model shows that the initial matric pressure potential, the hydraulic conductivity of the soil and

  14. Tropical climate and vegetation changes during Heinrich Event 1: a model-data comparison

    Directory of Open Access Journals (Sweden)

    D. Handiani

    2012-01-01

    Full Text Available Abrupt climate changes from 18 to 15 thousand years before present (kyr BP) associated with Heinrich Event 1 (HE1) had a strong impact on vegetation patterns not only at high latitudes of the Northern Hemisphere, but also in the tropical regions around the Atlantic Ocean. To gain a better understanding of the linkage between high and low latitudes, we used the University of Victoria (UVic) Earth System-Climate Model (ESCM) with dynamical vegetation and land surface components to simulate four scenarios of climate-vegetation interaction: the pre-industrial era, the Last Glacial Maximum (LGM), and a Heinrich-like event with two different climate backgrounds (interglacial and glacial). We calculated mega-biomes from the plant-functional types (PFTs) generated by the model to allow for a direct comparison between model results and palynological vegetation reconstructions.

    Our calculated mega-biomes for the pre-industrial period and the LGM corresponded well with biome reconstructions of the modern and LGM time slices, respectively, except that our pre-industrial simulation predicted the dominance of grassland in southern Europe and our LGM simulation resulted in more forest cover in tropical and sub-tropical South America.

    The HE1-like simulation with a glacial climate background produced sea-surface temperature patterns and enhanced inter-hemispheric thermal gradients in accordance with the "bipolar seesaw" hypothesis. We found that the cooling of the Northern Hemisphere caused a southward shift of those PFTs that are indicative of an increased desertification and a retreat of broadleaf forests in West Africa and northern South America. The mega-biomes from our HE1 simulation agreed well with paleovegetation data from tropical Africa and northern South America. Thus, according to our model-data comparison, the reconstructed vegetation changes for the tropical regions around the Atlantic Ocean were physically consistent with the remote

  15. Simulation modeling and analysis in safety. II

    International Nuclear Information System (INIS)

    Ayoub, M.A.

    1981-01-01

    The paper introduces and illustrates simulation modeling as a viable approach for dealing with complex issues and decisions in safety and health. The author details two studies: evaluation of employee exposure to airborne radioactive materials and effectiveness of the safety organization. The first study seeks to define a policy to manage a facility used in testing employees for radiation contamination. An acceptable policy is one that would permit the testing of all employees as defined under regulatory requirements, while not exceeding available resources. The second study evaluates the relationship between safety performance and the characteristics of the organization, its management, its policy, and communication patterns among various functions and levels. Both studies use models where decisions are reached based on the prevailing conditions and occurrence of key events within the simulation environment. Finally, several problem areas suitable for simulation studies are highlighted. (Auth.)

  16. Flood modelling with a distributed event-based parsimonious rainfall-runoff model: case of the karstic Lez river catchment

    Directory of Open Access Journals (Sweden)

    M. Coustau

    2012-04-01

    Full Text Available Rainfall-runoff models are crucial tools for the statistical prediction of flash floods and real-time forecasting. This paper focuses on a karstic basin in the south of France and proposes a distributed parsimonious event-based rainfall-runoff model, coherent with the poor knowledge of both evaporative and underground fluxes. The model combines an SCS runoff model and a Lag and Route routing model for each cell of a regular grid mesh. The ability of the model not only to satisfactorily simulate floods but also to provide robust relationships between its initial condition and various predictors of the initial wetness state of the basin, such as the base flow, the Hu2 index from the Meteo-France SIM model and the piezometric levels of the aquifer, is discussed. The advantage of using meteorological radar rainfall in flood modelling is also assessed. Model calibration proved to be satisfactory using an hourly time step, with Nash criterion values ranging between 0.66 and 0.94 for eighteen of the twenty-one selected events. The radar rainfall inputs significantly improved the simulations or the assessment of the initial condition of the model for 5 events at the beginning of autumn, mostly in September–October (mean improvement of Nash is 0.09; corrections in the initial condition range from −205 to 124 mm), but were less efficient for the events at the end of autumn. In this period, the weak vertical extension of the precipitation system and the low altitude of the 0 °C isotherm could affect the efficiency of radar measurements due to the distance between the basin and the radar (~60 km). The model initial condition S is correlated with the three tested predictors (R2 > 0.6). The interpretation of the model suggests that groundwater does not affect the first peaks of the flood, but can strongly impact subsequent peaks in the case of a multi-storm event. Because this kind of model is based on a limited
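
    For reference, a minimal sketch of the SCS curve-number runoff calculation applied per grid cell; the Lag and Route routing and the event-based initial-condition handling are omitted, and the 0.2·S initial-abstraction ratio is the textbook default rather than necessarily the paper's calibrated choice:

```python
# SCS curve-number event runoff: Q = (P - Ia)^2 / (P - Ia + S) for P > Ia,
# with S = 25400/CN - 254 (depths in mm) and Ia = 0.2*S by convention.
def scs_runoff(p_mm, cn):
    """Event runoff depth (mm) from rainfall depth p_mm for curve number cn."""
    s = 25400.0 / cn - 254.0         # potential maximum retention (mm)
    ia = 0.2 * s                     # initial abstraction (mm)
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

for cn in (55, 70, 85):
    print(f"CN={cn}: runoff from an 80 mm storm = {scs_runoff(80.0, cn):.1f} mm")
```

    Calibrating the initial condition of such a model to wetness predictors (base flow, soil-moisture indices, piezometric levels) amounts to adjusting the effective retention before each event, which is exactly the sensitivity the abstract discusses.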

  17. The Integrated Medical Model: A Probabilistic Simulation Model Predicting In-Flight Medical Risks

    Science.gov (United States)

    Keenan, Alexandra; Young, Millennia; Saile, Lynn; Boley, Lynn; Walton, Marlei; Kerstman, Eric; Shah, Ronak; Goodenow, Debra A.; Myers, Jerry G., Jr.

    2015-01-01

    The Integrated Medical Model (IMM) is a probabilistic model that uses simulation to predict mission medical risk. Given a specific mission and crew scenario, medical events are simulated using Monte Carlo methodology to provide estimates of resource utilization, probability of evacuation, probability of loss of crew, and the amount of mission time lost due to illness. Mission and crew scenarios are defined by mission length, extravehicular activity (EVA) schedule, and crew characteristics including: sex, coronary artery calcium score, contacts, dental crowns, history of abdominal surgery, and EVA eligibility. The Integrated Medical Evidence Database (iMED) houses the model inputs for one hundred medical conditions using in-flight, analog, and terrestrial medical data. Inputs include incidence, event durations, resource utilization, and crew functional impairment. Severity of conditions is addressed by defining statistical distributions on the dichotomized best and worst-case scenarios for each condition. The outcome distributions for conditions are bounded by the treatment extremes of the fully treated scenario in which all required resources are available and the untreated scenario in which no required resources are available. Upon occurrence of a simulated medical event, treatment availability is assessed, and outcomes are generated depending on the status of the affected crewmember at the time of onset, including any pre-existing functional impairments or ongoing treatment of concurrent conditions. The main IMM outcomes, including probability of evacuation and loss of crew life, time lost due to medical events, and resource utilization, are useful in informing mission planning decisions. To date, the IMM has been used to assess mission-specific risks with and without certain crewmember characteristics, to determine the impact of eliminating certain resources from the mission medical kit, and to design medical kits that maximally benefit crew health while meeting

  18. Event-by-event Simulation of EPR-Bohm Experiments

    NARCIS (Netherlands)

    De Raedt, K.; Keimpema, K.; De Raedt, H.; Michielsen, K.; Miyashita, S.; Landau, DP; Lewis, SP; Schuttler, HB

    2009-01-01

    We present a computer simulation model that is strictly causal and local in Einstein's sense, does not rely on concepts of quantum theory, but can nevertheless reproduce the results of quantum theory for the single-spin expectation values and two-spin correlations in an Einstein-Podolsky-Rosen-Bohm

  19. A Dynamic Defense Modeling and Simulation Methodology using Semantic Web Services

    Directory of Open Access Journals (Sweden)

    Kangsun Lee

    2010-04-01

    Full Text Available Defense Modeling and Simulations require interoperable and autonomous federates in order to fully simulate the complex behavior of war-fighters and to dynamically adapt themselves to various war-game events, commands and controls. In this paper, we propose a semantic web service based methodology to develop war-game simulations. Our methodology encapsulates war-game logic into a set of web services with additional semantic information in WSDL (Web Service Description Language) and OWL (Web Ontology Language). By utilizing the dynamic discovery and binding power of semantic web services, we are able to dynamically reconfigure federates according to various simulation events. An ASuW (Anti-Surface Warfare) simulator is constructed to demonstrate the methodology and successfully shows that the level of interoperability and autonomy can be greatly improved.

  20. StochKit2: software for discrete stochastic simulation of biochemical systems with events.

    Science.gov (United States)

    Sanft, Kevin R; Wu, Sheng; Roh, Min; Fu, Jin; Lim, Rone Kwei; Petzold, Linda R

    2011-09-01

    StochKit2 is the first major upgrade of the popular StochKit stochastic simulation software package. StochKit2 provides highly efficient implementations of several variants of Gillespie's stochastic simulation algorithm (SSA), and tau-leaping with automatic step size selection. StochKit2 features include automatic selection of the optimal SSA method based on model properties, event handling, and automatic parallelism on multicore architectures. The underlying structure of the code has been completely updated to provide a flexible framework for extending its functionality. StochKit2 runs on Linux/Unix, Mac OS X and Windows. It is freely available under GPL version 3 and can be downloaded from http://sourceforge.net/projects/stochkit/. petzold@engineering.ucsb.edu.
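
    For orientation, here is a minimal direct-method SSA of the kind StochKit2 implements in heavily optimized form, for an invented two-reaction system (this is a pedagogical sketch, not StochKit2 code):

```python
# Minimal direct-method Gillespie SSA for the toy system A -> B -> C.
import math
import random

def ssa(a=100, b=0, c=0, k1=0.5, k2=0.2, t_end=20.0, seed=1):
    random.seed(seed)
    t = 0.0
    while t < t_end:
        p1, p2 = k1 * a, k2 * b          # reaction propensities
        total = p1 + p2
        if total == 0.0:
            break                        # no more reactions possible
        t += -math.log(1.0 - random.random()) / total   # exponential wait
        if t >= t_end:
            break
        if random.random() * total < p1:
            a, b = a - 1, b + 1          # fire A -> B
        else:
            b, c = b - 1, c + 1          # fire B -> C
    return a, b, c

print("state at t_end:", ssa())
```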

  1. Benchmarking Simulation of Long Term Station Blackout Events

    International Nuclear Information System (INIS)

    Kim, Sung Kyum; Lee, John C.; Fynan, Douglas A.; Lee, John C.

    2013-01-01

    The importance of passive cooling systems has emerged since the SBO events. The turbine-driven auxiliary feedwater (TD-AFW) system is the only passive cooling system for steam generators (SGs) in current PWRs. During SBO events, all alternating current (AC) and direct current (DC) power is interrupted and the water levels of the steam generators become high. In this case, turbine blades could be degraded and can no longer cool down the SGs. To prevent this kind of degradation, an improved TD-AFW system should be installed in current PWRs, especially OPR 1000 plants. A long-term station blackout (LTSBO) scenario based on the improved TD-AFW system has been benchmarked as a reference input file. The following task is a safety analysis to find important parameters causing the peak cladding temperature (PCT) to vary. This task has been initiated with the benchmarked input deck applied to the State-of-the-Art Reactor Consequence Analyses (SOARCA) Report. The point of the improved TD-AFW is to control the water level of the SG by using an auxiliary battery charged by a generator connected to the auxiliary turbine. However, this battery could also be disconnected from the generator. To analyze the uncertainties associated with the failure of the auxiliary battery, a simulation of the time-dependent failure of the TD-AFW has been performed. In addition to the cases simulated in the paper, some valves (e.g., the pressurizer safety valve) that remain available during SBO events could be important parameters for assessing uncertainties in estimated PCTs. The results for these parameters will be included in a future study, in addition to results for the leakage of the RCP seals. After the simulation of several transient cases, the alternating conditional expectation (ACE) algorithm will be used to derive functional relationships between the PCT and several system parameters

  2. Nuclear data relevant to single-event upsets (SEU) in microelectronics and their application to SEU simulation

    International Nuclear Information System (INIS)

    Watanabe, Yukinobu; Tukamoto, Yasuyuki; Kodama, Akihiro; Nakashima, Hideki

    2004-01-01

    A cross-section database for neutron-induced reactions on 28Si was developed in the energy range between 2 MeV and 3 GeV in order to analyze single-event upset (SEU) phenomena induced by cosmic-ray neutrons in microelectronic devices. A simplified spherical device model was proposed for simulation of the initial process of SEUs. The model was applied to SEU cross-section calculations for semiconductor memory devices. The calculated results were compared with measured SEU cross-sections and another simulation result. The dependence of SEU cross-sections on incident neutron energy and the secondary ions having the most important effects on SEUs are discussed. (author)

  3. Simulating spontaneous aseismic and seismic slip events on evolving faults

    Science.gov (United States)

    Herrendörfer, Robert; van Dinther, Ylona; Pranger, Casper; Gerya, Taras

    2017-04-01

    Plate motion along tectonic boundaries is accommodated by different slip modes: steady creep, seismic slip and slow slip transients. Due mainly to indirect observations and the difficulty of scaling laboratory results to nature, it remains enigmatic which fault conditions favour certain slip modes. Therefore, we are developing a numerical modelling approach that is capable of simulating different slip modes together with the long-term fault evolution in a large-scale tectonic setting. We extend the 2D, continuum mechanics-based, visco-elasto-plastic thermo-mechanical model that was designed to simulate slip transients in large-scale geodynamic simulations (van Dinther et al., JGR, 2013). We improve the numerical approach to accurately treat the non-linear problem of plasticity (see also EGU 2017 abstract by Pranger et al.). To resolve a wide slip rate spectrum on evolving faults, we develop an invariant reformulation of the conventional rate-and-state dependent friction (RSF) and adapt the time step (Lapusta et al., JGR, 2000). A crucial part of this development is a conceptual ductile fault zone model that relates slip rates along discrete planes to the effective macroscopic plastic strain rates in the continuum. We first test our implementation in a simple 2D setup with a single fault zone that has a predefined initial thickness. Results show that deformation localizes, for steady creep and very slow slip transients, to a bell-shaped strain rate profile across the fault zone, which suggests that a length scale across the fault zone may exist. This continuum length scale would overcome the common mesh-dependency in plasticity simulations and calls into question the conventional treatment of aseismic slip on infinitely thin fault zones. We test the introduction of a diffusion term (similar to the damage description in Lyakhovsky et al., JMPS, 2011) into the state evolution equation and its effect on (de-)localization during faster slip events. We compare
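
    A common 0-D analogue of the slip-mode spectrum discussed here is the rate-and-state spring-slider; the sketch below (aging law, quasi-dynamic radiation damping, generic laboratory-like parameters, none taken from the abstract) produces stick-slip events when the spring is softer than the critical stiffness:

```python
# 0-D rate-and-state spring-slider sketch (aging law, radiation damping).
import numpy as np
from scipy.integrate import solve_ivp

A, B = 0.010, 0.015   # rate-and-state parameters (B > A: velocity weakening)
DC = 1e-4             # critical slip distance (m)
K = 1e7               # spring stiffness (Pa/m), well below SIGMA*(B-A)/DC
SIGMA = 50e6          # effective normal stress (Pa)
ETA = 5e6             # radiation damping coefficient (Pa*s/m)
VPL = 1e-9            # plate loading velocity (m/s)

def rhs(t, y):
    v, theta = y
    dtheta = 1.0 - v * theta / DC                 # aging law
    # time derivative of the quasi-dynamic force balance
    dv = (K * (VPL - v) - SIGMA * B / theta * dtheta) / (SIGMA * A / v + ETA)
    return [dv, dtheta]

# start at steady state under plate loading, integrate ~15 years
sol = solve_ivp(rhs, (0.0, 5e8), [VPL, DC / VPL], method="LSODA",
                rtol=1e-8, atol=[1e-14, 1e-6], max_step=1e7)
print(f"peak slip rate: {sol.y[0].max():.2e} m/s (loading rate {VPL:.0e} m/s)")
```

    Varying the stiffness ratio or the (B - A) contrast moves this toy system between steady creep, slow oscillations and fast stick-slip, which is the same qualitative spectrum the continuum model aims to capture on evolving faults.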

  4. Proceedings of the 17. IASTED international conference on modelling and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Wamkeue, R. (comp.) [Quebec Univ., Abitibi-Temiscaminque, PQ (Canada)

    2006-07-01

    The International Association of Science and Technology for Development (IASTED) hosted this conference to provide a forum for international researchers and practitioners interested in all areas of modelling and simulation. The conference featured 12 sessions entitled: (1) automation, control and robotics, (2) hydraulic and hydrologic modelling, (3) applications in processes and design optimization, (4) environmental systems, (5) biomedicine and biomechanics, (6) communications, computers and informatics 1, (7) economics, management and operations research 1, (8) modelling and simulation methodologies 1, (9) economics, management and operations research 2, (10) modelling, optimization, identification and simulation, (11) communications, computers and informatics 2, and, (12) modelling and simulation methodologies 2. Participants took the opportunity to present the latest research, results, and ideas in mathematical modelling; physically-based modelling; agent-based modelling; dynamic modelling; 3-dimensional modelling; computational geometry; time series analysis; finite element methods; discrete event simulation; web-based simulation; Monte Carlo simulation; simulation optimization; simulation uncertainty; fuzzy systems; data modelling; computer aided design; and, visualization. Case studies in engineering design were also presented along with simulation tools and languages. The conference also highlighted topical issues in environmental systems modelling such as air modelling and simulation, atmospheric modelling, hazardous materials, mobile source emissions, ecosystem modelling, hydrological modelling, aquatic ecosystems, terrestrial ecosystems, biological systems, agricultural modelling, terrain analysis, meteorological modelling, earth system modelling, climatic modelling, and natural resource management. The conference featured 110 presentations, of which 3 have been catalogued separately for inclusion in this database. refs., tabs., figs.

  5. Determining Nurse Aide Staffing Requirements to Provide Care Based on Resident Workload: A Discrete Event Simulation Model.

    Science.gov (United States)

    Schnelle, John F; Schroyer, L Dale; Saraf, Avantika A; Simmons, Sandra F

    2016-11-01

    Nursing aides provide most of the labor-intensive activities of daily living (ADL) care to nursing home (NH) residents. Currently, most NHs do not determine nurse aide staffing requirements based on the time to provide ADL care for their unique resident population. The lack of an objective method to determine nurse aide staffing requirements suggests that many NHs could be understaffed in their capacity to provide consistent ADL care to all residents in need. Discrete event simulation (DES) mathematically models key work parameters (eg, time to provide an episode of care and available staff) to predict the ability of the work setting to provide care over time and offers an objective method to determine nurse aide staffing needs in NHs. This study had 2 primary objectives: (1) to describe the relationship between ADL workload and the level of nurse aide staffing reported by NHs; and, (2) to use a DES model to determine the relationship between ADL workload and nurse aide staffing necessary for consistent, timely ADL care. Minimum Data Set data related to the level of dependency on staff for ADL care for residents in over 13,500 NHs nationwide were converted into 7 workload categories that captured 98% of all residents. In addition, data related to the time to provide care for the ADLs within each workload category was used to calculate a workload score for each facility. The correlation between workload and reported nurse aide staffing levels was calculated to determine the association between staffing reported by NHs and workload. Simulations to project staffing requirements necessary to provide ADL care were then conducted for 65 different workload scenarios, which included 13 different nurse aide staffing levels (ranging from 1.6 to 4.0 total hours per resident day) and 5 different workload percentiles (ranging from the 5th to the 95th percentile). The purpose of the simulation model was to determine the staffing necessary to provide care within each workload

  6. Thermohydraulic modeling and simulation of breeder reactors

    International Nuclear Information System (INIS)

    Agrawal, A.K.; Khatib-Rahbar, M.; Curtis, R.T.; Hetrick, D.L.; Girijashankar, P.V.

    1982-01-01

    This paper deals with the modeling and simulation of system-wide transients in LMFBRs. Unprotected events (i.e., the presumption of failure of the plant protection system) leading to core-melt are not considered in this paper. The existing computational capabilities in the area of protected transients in the US are noted. Various physical and numerical approximations that are made in these codes are discussed. Finally, the future direction in the area of model verification and improvements is discussed

  7. Construction and Updating of Event Models in Auditory Event Processing

    Science.gov (United States)

    Huff, Markus; Maurer, Annika E.; Brich, Irina; Pagenkopf, Anne; Wickelmaier, Florian; Papenmeier, Frank

    2018-01-01

    Humans segment the continuous stream of sensory information into distinct events at points of change. Between two events, humans perceive an event boundary. Current theories propose that changes in the sensory information trigger updating of the current event model. Increased encoding effort finally leads to a memory benefit at event…

  8. Producing physically consistent and bias free extreme precipitation events over the Switzerland: Bridging gaps between meteorology and impact models

    Science.gov (United States)

    José Gómez-Navarro, Juan; Raible, Christoph C.; Blumer, Sandro; Martius, Olivia; Felder, Guido

    2016-04-01

    Extreme precipitation episodes, although rare, are natural phenomena that can threaten human activities, especially in densely populated areas such as Switzerland. Their relevance demands the design of public policies that protect public assets and private property. Therefore, an increase in the current understanding of such exceptional situations is required, i.e. the climatic characterisation of their triggering circumstances, severity, frequency, and spatial distribution. Such increased knowledge shall eventually lead us to produce more reliable projections about the behaviour of these events under ongoing climate change. Unfortunately, the study of extreme situations is hampered by the short instrumental record, which precludes a proper characterization of events with return periods exceeding a few decades. This study proposes a new approach that allows storms to be studied based on a synthetic, but physically consistent, database of weather situations obtained from a long climate simulation. Our starting point is a 500-yr control simulation carried out with the Community Earth System Model (CESM). In a second step, this dataset is dynamically downscaled with the Weather Research and Forecasting model (WRF) to a final resolution of 2 km over the Alpine area. However, downscaling the full CESM simulation at such high resolution is infeasible nowadays. Hence, a number of case studies are selected first. This selection is carried out by examining the precipitation averaged over an area encompassing Switzerland in the ESM. Following a hydrological criterion, precipitation is accumulated over several temporal windows: 1 day, 2 days, 3 days, 5 days and 10 days. The 4 most extreme events in each category and season are selected, leading to a total of 336 days to be simulated. The simulated events are affected by systematic biases that have to be accounted for before this data set can be used as input to hydrological models. Thus, quantile mapping is used to remove such biases. For this task
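
    The quantile-mapping step mentioned above can be sketched as an empirical quantile-to-quantile transfer built on a common reference period; the data below are synthetic, not the CESM/WRF output:

```python
# Empirical quantile mapping: correct model values through the quantile
# relationship between biased model output and observations.
import numpy as np

rng = np.random.default_rng(0)
obs_ref = rng.gamma(shape=2.0, scale=6.0, size=3000)   # "observed" precip
mod_ref = rng.gamma(shape=2.0, scale=4.5, size=3000)   # biased model precip

qs = np.linspace(0.0, 1.0, 101)
mod_q = np.quantile(mod_ref, qs)
obs_q = np.quantile(obs_ref, qs)

def quantile_map(x):
    """Bias-correct model values x via the empirical quantile mapping."""
    return np.interp(x, mod_q, obs_q)   # flat extrapolation beyond the range

event = np.array([40.0, 55.0, 70.0])    # simulated extreme-event totals
print("corrected:", np.round(quantile_map(event), 1))
```

    A known caveat of this simple form is its behaviour beyond the reference range, which matters precisely for the extremes studied here; parametric or extrapolating variants are often used for the upper tail.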

  9. Application of Tecnomatix Plant Simulation for Modeling Production and Logistics Processes

    Directory of Open Access Journals (Sweden)

    Julia Siderska

    2016-06-01

    The main objective of the article was to present the possibilities and examples of using Tecnomatix Plant Simulation (by Siemens) to simulate production and logistics processes. This tool makes it possible to simulate discrete events and create digital models of logistic systems (e.g. production), to optimize the operation of production plants and production lines, as well as individual logistics processes. A review of implementations of Tecnomatix Plant Simulation for modeling processes in production engineering and logistics was conducted and a few selected examples of simulations were presented. The author's future studies are going to focus on the simulation of production and logistic processes and their optimization with the use of genetic algorithms and artificial neural networks.

  10. Discrete event simulation of the ATLAS second level trigger

    International Nuclear Information System (INIS)

    Vermeulen, J.C.; Dankers, R.J.; Hunt, S.; Harris, F.; Hortnagl, C.; Erasov, A.; Bogaerts, A.

    1998-01-01

    Discrete event simulation is applied to determine the computing and networking resources needed for the ATLAS second level trigger. This paper discusses the techniques used and some of the results obtained so far for well-defined laboratory configurations and for the full system.

  11. A comparison of the conditional inference survival forest model to random survival forests based on a simulation study as well as on two applications with time-to-event data.

    Science.gov (United States)

    Nasejje, Justine B; Mwambi, Henry; Dheda, Keertan; Lesosky, Maia

    2017-07-28

    Random survival forest (RSF) models have been identified as alternative methods to the Cox proportional hazards model for analysing time-to-event data. These methods, however, have been criticised for the bias that results from favouring covariates with many split-points, and hence conditional inference forests for time-to-event data have been suggested. Conditional inference forests (CIF) are known to correct the bias in RSF models by separating the procedure for choosing the best covariate to split on from the search for the best split point for the selected covariate. In this study, we compare the random survival forest model to the conditional inference forest (CIF) model using twenty-two simulated time-to-event datasets. We also analysed two real time-to-event datasets. The first dataset is based on the survival of children under five years of age in Uganda and consists of categorical covariates, most of them having more than two levels (many split-points). The second dataset is based on the survival of patients with extensively drug-resistant tuberculosis (XDR TB) and consists mainly of categorical covariates with two levels (few split-points). The study findings indicate that the conditional inference forest model is superior to random survival forest models in analysing time-to-event data consisting of covariates with many split-points, based on the values of the bootstrap cross-validated estimates of the integrated Brier score. However, conditional inference forests perform comparably to random survival forest models in analysing time-to-event data consisting of covariates with fewer split-points. Although survival forests are promising methods for analysing time-to-event data, it is important to identify the best forest model for analysis based on the nature of the covariates of the dataset in question.

  12. A comparison of the conditional inference survival forest model to random survival forests based on a simulation study as well as on two applications with time-to-event data

    Directory of Open Access Journals (Sweden)

    Justine B. Nasejje

    2017-07-01

    Abstract. Background: Random survival forest (RSF) models have been identified as alternative methods to the Cox proportional hazards model for analysing time-to-event data. These methods, however, have been criticised for the bias that results from favouring covariates with many split-points, and hence conditional inference forests for time-to-event data have been suggested. Conditional inference forests (CIF) are known to correct the bias in RSF models by separating the procedure for choosing the best covariate to split on from the search for the best split point for the selected covariate. Methods: In this study, we compare the random survival forest model to the conditional inference forest (CIF) model using twenty-two simulated time-to-event datasets. We also analysed two real time-to-event datasets. The first dataset is based on the survival of children under five years of age in Uganda and consists of categorical covariates, most of them having more than two levels (many split-points). The second dataset is based on the survival of patients with extensively drug-resistant tuberculosis (XDR TB) and consists mainly of categorical covariates with two levels (few split-points). Results: The study findings indicate that the conditional inference forest model is superior to random survival forest models in analysing time-to-event data consisting of covariates with many split-points, based on the values of the bootstrap cross-validated estimates of the integrated Brier score. However, conditional inference forests perform comparably to random survival forest models in analysing time-to-event data consisting of covariates with fewer split-points. Conclusion: Although survival forests are promising methods for analysing time-to-event data, it is important to identify the best forest model for analysis based on the nature of the covariates of the dataset in question.
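
    As a rough illustration of how such a comparison can be set up in code, the sketch below fits a random survival forest and computes an integrated Brier score on a held-out split. It relies on the scikit-survival package and its bundled veterans lung cancer dataset as stand-ins for the study's data; the CIF counterpart comes from R-side tooling and is not shown, and exact API details may differ across scikit-survival versions.

    ```python
    # Sketch: RSF plus integrated Brier score evaluation (not the study's code).
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sksurv.datasets import load_veterans_lung_cancer
    from sksurv.ensemble import RandomSurvivalForest
    from sksurv.metrics import integrated_brier_score

    X, y = load_veterans_lung_cancer()    # y: structured array (Status, Survival_in_days)
    X = X.select_dtypes("number")         # keep numeric covariates only, for brevity
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

    rsf = RandomSurvivalForest(n_estimators=200, random_state=0).fit(X_tr, y_tr)

    # Evaluate predicted survival curves on a grid inside the follow-up range
    times = np.percentile(y_te["Survival_in_days"], np.linspace(10, 80, 15))
    surv = np.vstack([fn(times) for fn in rsf.predict_survival_function(X_te)])
    print("integrated Brier score:", integrated_brier_score(y_tr, y_te, surv, times))
    ```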

  13. A model of nitrous oxide evolution from soil driven by rainfall events. I - Model structure and sensitivity. II - Model applications

    Science.gov (United States)

    Li, Changsheng; Frolking, Steve; Frolking, Tod A.

    1992-01-01

    Simulations of N2O and CO2 emissions from soils were conducted with a rain-event driven, process-oriented model (DNDC) of nitrogen and carbon cycling processes in soils. The magnitude and trends of simulated N2O (or N2O + N2) and CO2 emissions were consistent with the results obtained in field experiments. The successful simulation of these emissions from the range of soil types examined demonstrates that the DNDC will be a useful tool for the study of linkages among climate, soil-atmosphere interactions, land use, and trace gas fluxes.

  14. Discrete event dynamic system (DES)-based modeling for dynamic material flow in the pyroprocess

    International Nuclear Information System (INIS)

    Lee, Hyo Jik; Kim, Kiho; Kim, Ho Dong; Lee, Han Soo

    2011-01-01

    A modeling and simulation methodology was proposed to implement the dynamic material flow of the pyroprocess. Since a static mass balance provides only limited information on the material flow, it is hard to predict dynamic behavior in response to events. Therefore, a discrete event system (DES)-based model named PyroFlow was developed at the Korea Atomic Energy Research Institute (KAERI). PyroFlow is able to calculate a dynamic mass balance and also show various dynamic operational results in real time. Using PyroFlow, it is easy to rapidly predict otherwise unforeseeable results, such as throughput in a unit process, accumulated product in buffers, and operation status. As preliminary simulations, bottleneck analyses of the pyroprocess were carried out, which showed that the operation strategy influences the productivity of the pyroprocess.
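
    For readers unfamiliar with how a DES of material flow looks in practice, here is a minimal sketch in the same spirit (not the actual PyroFlow/KAERI code): two processing stages pass batches through a small buffer, so throughput and bottlenecks can be read off the event log. It uses the SimPy library; the stage names, processing times and buffer size are illustrative assumptions.

    ```python
    # Minimal two-stage material-flow DES with a buffer, using SimPy.
    import simpy

    def stage(env, name, inbox, outbox, proc_time):
        """Take a batch from inbox, process it, then put it in outbox."""
        while True:
            batch = yield inbox.get()
            yield env.timeout(proc_time)
            yield outbox.put(batch)          # blocks if the buffer is full
            print(f"{env.now:6.1f} h: {name} finished batch {batch}")

    env = simpy.Environment()
    feed = simpy.Store(env)                  # incoming material
    buffer = simpy.Store(env, capacity=2)    # small buffer -> potential bottleneck
    product = simpy.Store(env)               # finished product

    for i in range(5):
        feed.put(i)                          # preload five batches

    env.process(stage(env, "stage A", feed, buffer, proc_time=8.0))
    env.process(stage(env, "stage B", buffer, product, proc_time=12.0))
    env.run(until=100)
    print("throughput:", len(product.items), "batches")
    ```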

  15. The devil is in the details: Comparisons of episodic simulations of positive and negative future events.

    Science.gov (United States)

    Puig, Vannia A; Szpunar, Karl K

    2017-08-01

    Over the past decade, psychologists have devoted considerable attention to episodic simulation, the ability to imagine specific hypothetical events. Perhaps one of the most consistent patterns of data to emerge from this literature is that positive simulations of the future are rated as more detailed than negative simulations of the future, a pattern of results that is commonly interpreted as evidence for a positivity bias in future thinking. In the present article, we demonstrate across two experiments that negative future events are consistently simulated in more detail than positive future events when frequency of prior thinking is taken into account as a possible confounding variable and when the level of detail associated with simulated events is assessed using an objective scoring criterion. Our findings are interpreted in the context of the mobilization-minimization hypothesis of event cognition, which suggests that people are especially likely to devote cognitive resources to processing negative scenarios.

  16. Connecting macroscopic observables and microscopic assembly events in amyloid formation using coarse grained simulations.

    Directory of Open Access Journals (Sweden)

    Noah S Bieler

    The pre-fibrillar stages of amyloid formation have been implicated in cellular toxicity, but have proved challenging to study directly in experiments and simulations. Rational strategies to suppress the formation of toxic amyloid oligomers require a better understanding of the mechanisms by which they are generated. We report Dynamical Monte Carlo simulations that allow us to study the early stages of amyloid formation. We use a generic, coarse-grained model of an amyloidogenic peptide that has two internal states: the first representing the soluble random-coil structure and the second the β-sheet conformation. We find that this system exhibits a propensity towards fibrillar self-assembly following the formation of a critical nucleus. Our calculations establish connections between the early nucleation events and the kinetic information available in the later stages of the aggregation process that are commonly probed in experiments. We analyse the kinetic behaviour in our simulations within the framework of the theory of classical nucleated polymerisation, and are able to connect the structural events at the early stages of amyloid growth with the resulting macroscopic observables, such as the effective nucleus size. Furthermore, the free-energy landscapes that emerge from these simulations allow us to identify pertinent properties of the monomeric state that could be targeted to suppress oligomer formation.

  17. Application of Discrete Event Simulation in Mine Production Forecast

    African Journals Online (AJOL)

    Kaba, Felix Adaania; Temeng, Victor Amoako; Eshun, Peter Arroja

    Mine production forecast is pertinent to mining as it serves production goals for a production period. Perseus Mining Ghana Limited (PMGL), Ayanfuri, deterministically forecasts ...

  18. Parallel discrete-event simulation of FCFS stochastic queueing networks

    Science.gov (United States)

    Nicol, David M.

    1988-01-01

    Physical systems are inherently parallel. Intuition suggests that simulations of these systems may be amenable to parallel execution. The parallel execution of a discrete-event simulation requires careful synchronization of processes in order to ensure the execution's correctness; this synchronization can degrade performance. Largely negative results were recently reported in a study which used a well-known synchronization method on queueing network simulations. Discussed here is a synchronization method (appointments) which has proven effective on simulations of FCFS queueing networks. The key concept behind appointments is the provision of lookahead. Lookahead is a prediction of a processor's future behavior, based on an analysis of the processor's simulation state. We show how lookahead can be computed for FCFS queueing network simulations, give performance data demonstrating the method's effectiveness under moderate to heavy loads, and discuss performance tradeoffs between the quality of lookahead and the cost of computing it.

  19. A coupled classification - evolutionary optimization model for contamination event detection in water distribution systems.

    Science.gov (United States)

    Oliker, Nurit; Ostfeld, Avi

    2014-03-15

    This study describes a decision support system that raises alerts for contamination events in water distribution systems. The developed model comprises a weighted support vector machine (SVM) for the detection of outliers, followed by a sequence analysis for the classification of contamination events. The contribution of this study is an improvement in contamination event detection ability and a multi-dimensional analysis of the data, differing from the parallel one-dimensional analyses conducted so far. The multivariate analysis examines the relationships between water quality parameters and detects changes in their mutual patterns. The weights of the SVM model accomplish two goals: blurring the difference between the sizes of the two classes' data sets (as there are many more normal/regular than event-time measurements), and incorporating the time factor through a time-decay coefficient, ascribing higher importance to recent observations when classifying a time-step measurement. All model parameters were determined by data-driven optimization, so the calibration of the model was completely autonomous. The model was trained and tested on a real water distribution system (WDS) data set with randomly simulated events superimposed on the original measurements. The model is notable for its ability to detect events that were only partly expressed in the data (i.e., affecting only some of the measured parameters). The model showed high accuracy and better detection ability compared to previous modeling attempts at contamination event detection.
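
    A minimal sketch of the two weighting ideas, using scikit-learn's SVC as a stand-in for the paper's weighted SVM: class weights offset the rarity of event-time measurements, and an exponential time-decay sample weight favours recent observations. The data, decay constant and weight values are illustrative assumptions, not the study's calibrated parameters.

    ```python
    # Class-balancing weights plus time-decay sample weights on a toy data set.
    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    n = 500
    X = rng.normal(size=(n, 4))             # 4 water-quality parameters (toy)
    y = (rng.random(n) < 0.05).astype(int)  # rare "event" class (~5%)
    t = np.arange(n)                        # time index of each measurement

    # Class weights blur the size difference between normal and event classes
    class_w = {0: 1.0, 1: (y == 0).sum() / max((y == 1).sum(), 1)}

    # Time-decay coefficient: the most recent samples get weight close to 1
    decay = np.exp(-(t.max() - t) / 200.0)

    model = SVC(kernel="rbf", class_weight=class_w)
    model.fit(X, y, sample_weight=decay)
    ```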

  20. Towards renewed health economic simulation of type 2 diabetes: risk equations for first and second cardiovascular events from Swedish register data.

    Directory of Open Access Journals (Sweden)

    Aliasghar Ahmad Kiadaliri

    OBJECTIVE: Predicting the risk of future events is an essential part of health economic simulation models. In pursuit of this goal, the current study aims to predict the risk of developing first and second acute myocardial infarction, heart failure, non-acute ischaemic heart disease, and stroke after diagnosis in patients with type 2 diabetes, using data from the Swedish National Diabetes Register. MATERIAL AND METHODS: Register data on 29,034 patients with type 2 diabetes were analysed over five years of follow-up (baseline 2003). To develop and validate the risk equations, the sample was randomly divided into training (75%) and test (25%) subsamples. The Weibull proportional hazard model was used to estimate the coefficients of the risk equations, and these were validated in both the training and the test samples. RESULTS: In total, 4,547 first and 2,418 second events were observed during the five years of follow-up. Experiencing a first event substantially elevated the risk of subsequent events. There were heterogeneities in the effects of covariates within as well as between events; for example, while for females the hazard ratio of having a first acute myocardial infarction was 0.79 (0.70-0.90), the hazard ratio of a second was 1.21 (0.98-1.48). The hazards of second events decreased as the time since the first event elapsed. The equations showed adequate calibration and discrimination (C statistics range: 0.70-0.84) in the test samples. CONCLUSION: The accuracy of health economic simulation models of type 2 diabetes can be improved by ensuring that they account for the heterogeneous effects of covariates on the risk of first and second cardiovascular events. It is thus important to extend such models by including risk equations for second cardiovascular events.
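
    A sketch of how such risk equations drive a simulation model: given Weibull proportional hazards coefficients, an event time can be drawn by inverting the survival function. The coefficients, covariates and scale/shape values below are illustrative placeholders, not the Swedish register estimates.

    ```python
    # Inverse-CDF draw of a time-to-event from a Weibull proportional hazards
    # model: S(t) = exp(-lam * exp(x.beta) * t**shape).
    import numpy as np

    rng = np.random.default_rng(1)

    def weibull_ph_time(x, beta, lam=0.002, shape=1.3):
        """Draw one event time by solving S(t) = u for uniform u."""
        u = rng.random()
        rate = lam * np.exp(x @ beta)
        return (-np.log(u) / rate) ** (1.0 / shape)

    beta = np.array([-0.24, 0.031, 0.18])   # e.g. female, age, HbA1c (made up)
    x = np.array([1.0, 62.0, 7.1])          # one simulated patient (made up)
    print(f"simulated years to first event: {weibull_ph_time(x, beta):.1f}")
    ```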

  1. ReDecay, a method to re-use the underlying events to speed up the simulation in LHCb

    CERN Multimedia

    Muller, Dominik

    2017-01-01

    With the steady increase in the precision of flavour physics measurements collected during LHC Run 2, the LHCb experiment requires simulated data samples of ever increasing size to study the detector response in detail. However, relying on an increase in available computing power for the production of simulated events will not suffice to achieve this goal. The simulation of the detector response is the main contribution to the time needed to generate a sample, which scales linearly with the particle multiplicity of the event. Of the dozens of particles present in the simulation only a few, namely those participating in the studied signal decay, are of particular interest, while all remaining ones, the so-called underlying event, mainly affect the resolution and efficiencies of the detector. This talk presents a novel development for the LHCb simulation software which re-uses the underlying event from previously simulated events. This approach achieves an order of magnitude increase in speed and the same ...

  2. Discrete Event Simulation Method as a Tool for Improvement of Manufacturing Systems

    Directory of Open Access Journals (Sweden)

    Adrian Kampa

    2017-02-01

    The problem of production flow in manufacturing systems is analyzed. The machines can be operated by workers or by robots; since breakdowns and human factors destabilize production processes, robots are often preferred. The problem is how to determine the real difference in work efficiency between humans and robots. We present an analysis of the production efficiency and reliability of press shop lines operated by human operators or industrial robots. This is a problem from the field of Operations Research for which the Discrete Event Simulation (DES) method has been used. Three models have been developed, covering the manufacturing line before and after robotization and taking into account stochastic parameters of availability and reliability of the machines, operators, and robots. We apply the OEE (Overall Equipment Effectiveness) indicator to show how the availability, reliability, and quality parameters influence the performance of the workstations, especially in the short run and in the long run. In addition, the stability of the simulation model was analyzed. This approach enables a better representation of real manufacturing processes.
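
    For reference, the OEE indicator mentioned above is conventionally the product of availability, performance and quality. The sketch below computes it for one illustrative shift; the numbers are assumptions, not the paper's data.

    ```python
    # OEE = Availability * Performance * Quality, on made-up shift data.
    def oee(availability: float, performance: float, quality: float) -> float:
        return availability * performance * quality

    planned_time, downtime = 480.0, 45.0           # minutes per shift
    ideal_cycle, produced, defects = 0.5, 800, 24  # min/part, parts, rejects

    a = (planned_time - downtime) / planned_time   # availability
    p = ideal_cycle * produced / (planned_time - downtime)  # performance
    q = (produced - defects) / produced            # quality
    print(f"A={a:.2f} P={p:.2f} Q={q:.2f} OEE={oee(a, p, q):.2%}")
    ```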

  3. Simulation of tokamak runaway-electron events

    International Nuclear Information System (INIS)

    Bolt, H.; Miyahara, A.; Miyake, M.; Yamamoto, T.

    1987-08-01

    High-energy runaway-electron events, which can occur in tokamaks when the plasma hits the first wall, are a critical issue for the materials selection of future devices. Runaway-electron events are simulated with an electron linear accelerator to better understand the observed runaway-electron damage to tokamak first wall materials and to consider the runaway-electron issue in further materials development and selection. The electron linear accelerator produces beam energies of 20 to 30 MeV at an integrated power input of up to 1.3 kW. Graphite, SiC + 2 % AlN, stainless steel, molybdenum and tungsten have been tested as bulk materials. To test the reliability of actively cooled systems under runaway-electron impact, layer systems of graphite fixed to metal substrates have been tested. The irradiation damaged the metal components but left graphite and SiC + 2 % AlN undamaged. Metal substrates of graphite-metal systems for actively cooled structures suffer severe damage unless thick graphite shielding is provided. (author)

  4. Coupled atmosphere-ocean-wave simulations of a storm event over the Gulf of Lion and Balearic Sea

    Science.gov (United States)

    Renault, Lionel; Chiggiato, Jacopo; Warner, John C.; Gomez, Marta; Vizoso, Guillermo; Tintore, Joaquin

    2012-01-01

    The coastal areas of the North-Western Mediterranean Sea are one of the most challenging places for ocean forecasting. This region is exposed to severe storm events of short duration. During these events, significant air-sea interactions, strong winds and a large sea state can have catastrophic consequences in coastal areas. To investigate these air-sea interactions and the oceanic response to such events, we implemented the Coupled Ocean-Atmosphere-Wave-Sediment Transport Modeling System to simulate a severe storm in the Mediterranean Sea that occurred in May 2010. During this event, wind speed reached up to 25 m/s, inducing significant sea surface cooling (up to 2°C) over the Gulf of Lion (GoL) and along the storm track, and generating surface waves with a significant height of 6 m. It is shown that the event, associated with a cyclogenesis between the Balearic Islands and the GoL, is relatively well reproduced by the coupled system. A surface heat budget analysis showed that ocean vertical mixing was a major contributor to the cooling tendency along the storm track and in the GoL, where turbulent heat fluxes also played an important role. Sensitivity experiments on the ocean-atmosphere coupling suggested that the coupled system is sensitive to the momentum flux parameterization as well as to the air-sea and air-wave coupling. Comparisons with available atmospheric and oceanic observations showed that the fully coupled system provides the most skillful simulation, illustrating the benefit of using a fully coupled ocean-atmosphere-wave model for the assessment of these storm events.

  5. The Integrated Medical Model: A Probabilistic Simulation Model for Predicting In-Flight Medical Risks

    Science.gov (United States)

    Keenan, Alexandra; Young, Millennia; Saile, Lynn; Boley, Lynn; Walton, Marlei; Kerstman, Eric; Shah, Ronak; Goodenow, Debra A.; Myers, Jerry G.

    2015-01-01

    The Integrated Medical Model (IMM) is a probabilistic model that uses simulation to predict mission medical risk. Given a specific mission and crew scenario, medical events are simulated using Monte Carlo methodology to provide estimates of resource utilization, probability of evacuation, probability of loss of crew, and the amount of mission time lost due to illness. Mission and crew scenarios are defined by mission length, extravehicular activity (EVA) schedule, and crew characteristics including sex, coronary artery calcium score, contacts, dental crowns, history of abdominal surgery, and EVA eligibility. The Integrated Medical Evidence Database (iMED) houses the model inputs for one hundred medical conditions using in-flight, analog, and terrestrial medical data. Inputs include incidence, event durations, resource utilization, and crew functional impairment. Severity of conditions is addressed by defining statistical distributions on the dichotomized best- and worst-case scenarios for each condition. The outcome distributions for conditions are bounded by the treatment extremes of the fully treated scenario, in which all required resources are available, and the untreated scenario, in which no required resources are available. Upon occurrence of a simulated medical event, treatment availability is assessed, and outcomes are generated depending on the status of the affected crewmember at the time of onset, including any pre-existing functional impairments or ongoing treatment of concurrent conditions. The main IMM outcomes, including probability of evacuation and loss of crew life, time lost due to medical events, and resource utilization, are useful in informing mission planning decisions. To date, the IMM has been used to assess mission-specific risks with and without certain crewmember characteristics, to determine the impact of eliminating certain resources from the mission medical kit, and to design medical kits that maximally benefit crew health while meeting
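
    A toy Monte Carlo in the spirit of the IMM illustrates the mechanics: sample per-condition event counts over a mission and accumulate lost crew time across many trials. The condition names, incidence rates and durations below are invented placeholders, not iMED inputs.

    ```python
    # Toy mission-risk Monte Carlo: Poisson event counts, exponential durations.
    import numpy as np

    rng = np.random.default_rng(42)
    mission_days, n_trials = 180, 10_000

    conditions = {  # incidence per person-day, mean days lost per occurrence
        "condition A": (1e-3, 2.0),
        "condition B": (5e-4, 1.0),
        "condition C": (1e-4, 3.0),
    }

    lost = np.zeros(n_trials)
    for rate, mean_days in conditions.values():
        counts = rng.poisson(rate * mission_days, size=n_trials)
        # One shared duration draw per trial keeps the sketch short; summing
        # independent draws per event would be more faithful.
        lost += counts * rng.exponential(mean_days, size=n_trials)

    print(f"mean mission time lost: {lost.mean():.2f} d, "
          f"95th percentile: {np.percentile(lost, 95):.2f} d")
    ```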

  6. Attributing anthropogenic impact on regional heat wave events using CAM5 model large ensemble simulations

    Science.gov (United States)

    Lo, S. H.; Chen, C. T.

    2017-12-01

    Extreme heat waves have serious impacts on society. It has been argued that anthropogenic forcing may substantially increase the risk of extreme heat wave events (e.g. over western Europe in 2003 and over Russia in 2010). However, the regional dependence of such anthropogenic impact and the sensitivity of the attributed risk to the definition of a heat wave still require further study. In our research framework, the change in the frequency and severity of a heat wave event under current conditions is calculated and compared with the probability and magnitude of the event if the effects of particular external forcings, such as human influence, had been absent. We use the CAM5 large ensemble simulation from the CLIVAR C20C+ Detection and Attribution project (http://portal.nersc.gov/c20c/main.html, Folland et al. 2014) to detect heat wave events occurring in both the historical all-forcing run and the natural-forcing-only run. The heat wave events are identified by the partial duration series method (Huth et al., 2000). We test the sensitivity of heat wave thresholds derived from daily maximum temperature (Tmax) in the warm season (May to September) between 1959 and 2013. We consider the anthropogenic effect in the later period (2000-2013), when the warming due to human impact is more evident. Using Taiwan and the surrounding area as our preliminary research target, we found that the anthropogenic effect increases the heat wave days per year from 30 days to 75 days and makes the mean starting (ending) day of heat wave events about 15-30 days earlier (later). Using the Fraction of Attributable Risk analysis to estimate the risk in the frequency of heat wave days, our results show that anthropogenic forcing very likely increases the heat wave days over Taiwan by more than 50%. Further regional differences and the sensitivity of the attributed risk to the definition of a heat wave will be compared and discussed.
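
    The Fraction of Attributable Risk used here reduces to FAR = 1 - P_nat / P_all. Plugging in the heat-wave-day frequencies quoted in the abstract (30 vs. 75 days per year, treated loosely as daily probabilities) gives a quick worked example:

    ```python
    # FAR from heat-wave-day frequencies (values taken from the abstract above).
    p_all = 75 / 365.0   # probability of a heat-wave day, all-forcings run
    p_nat = 30 / 365.0   # same, natural-forcings-only run

    far = 1.0 - p_nat / p_all
    print(f"FAR = {far:.2f}")  # -> 0.60: ~60% of the risk attributable to humans
    ```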

  7. Event-by-Event Simulations of Early Gluon Fields in High Energy Nuclear Collisions

    Science.gov (United States)

    Nickel, Matthew; Rose, Steven; Fries, Rainer

    2017-09-01

    Collisions of heavy ions are carried out at ultra-relativistic speeds at the Relativistic Heavy Ion Collider and the Large Hadron Collider to create Quark Gluon Plasma. The earliest stages of such collisions are dominated by the dynamics of classical gluon fields. The McLerran-Venugopalan (MV) model of color glass condensate provides a model for this process. Previous research has provided an analytic solution for event-averaged observables in the MV model. Using the High Performance Research Computing Center (HPRC) at Texas A&M, we have developed a C++ code to explicitly calculate the initial gluon fields and energy-momentum tensor event by event using the analytic recursive solution. The code has been tested against previously known analytic results up to fourth order. We have also been able to test the convergence of the recursive solution at high orders in time and studied the time evolution of the color glass condensate.

  8. Advances in Discrete-Event Simulation for MSL Command Validation

    Science.gov (United States)

    Patrikalakis, Alexander; O'Reilly, Taifun

    2013-01-01

    In the last five years, the discrete event simulator, SEQuence GENerator (SEQGEN), developed at the Jet Propulsion Laboratory to plan deep-space missions, has greatly increased uplink operations capacity to deal with increasingly complicated missions. In this paper, we describe how the Mars Science Laboratory (MSL) project makes full use of an interpreted environment to simulate change in more than fifty thousand flight software parameters and conditional command sequences to predict the result of executing a conditional branch in a command sequence, and enable the ability to warn users whenever one or more simulated spacecraft states change in an unexpected manner. Using these new SEQGEN features, operators plan more activities in one sol than ever before.

  9. Sensitivity of the WRF model to the lower boundary in an extreme precipitation event - Madeira island case study

    Science.gov (United States)

    Teixeira, J. C.; Carvalho, A. C.; Carvalho, M. J.; Luna, T.; Rocha, A.

    2014-08-01

    The advances in satellite technology in recent years have made feasible the acquisition of high-resolution information on the Earth's surface. Examples of such information include elevation and land use, which have become more detailed. Including this information in numerical atmospheric models can improve their results in simulating lower-boundary-forced events, by providing detailed information on their characteristics. Consequently, this work aims to study the sensitivity of the Weather Research and Forecasting (WRF) model to different topography as well as land-use simulations in an extreme precipitation event. The test case focused on a topographically driven precipitation event over the island of Madeira, which triggered flash floods and mudslides in the southern parts of the island. Difference fields between simulations were computed, showing that the change in the data sets produced statistically significant changes to the flow, the planetary boundary layer structure and precipitation patterns. Moreover, model results show an improvement in model skill in the windward region for precipitation and in the leeward region for wind, in spite of the non-significant enhancement in the overall results with higher-resolution data sets of topography and land use.

  10. A Software Development Simulation Model of a Spiral Process

    Science.gov (United States)

    Mizell, Carolyn; Malone, Linda

    2007-01-01

    There is a need for simulation models of software development processes other than the waterfall because processes such as spiral development are becoming more and more popular. The use of a spiral process can make the inherently difficult job of cost and schedule estimation even more challenging due to its evolutionary nature, but this allows for a more flexible process that can better meet customers' needs. This paper will present a discrete event simulation model of spiral development that can be used to analyze cost and schedule effects of using such a process in comparison to a waterfall process.

  11. Evaluation of the HadGEM3-A simulations in view of detection and attribution of human influence on extreme events in Europe

    Science.gov (United States)

    Vautard, Robert; Christidis, Nikolaos; Ciavarella, Andrew; Alvarez-Castro, Carmen; Bellprat, Omar; Christiansen, Bo; Colfescu, Ioana; Cowan, Tim; Doblas-Reyes, Francisco; Eden, Jonathan; Hauser, Mathias; Hegerl, Gabriele; Hempelmann, Nils; Klehmet, Katharina; Lott, Fraser; Nangini, Cathy; Orth, René; Radanovics, Sabine; Seneviratne, Sonia I.; van Oldenborgh, Geert Jan; Stott, Peter; Tett, Simon; Wilcox, Laura; Yiou, Pascal

    2018-04-01

    A detailed analysis is carried out to assess the HadGEM3-A global atmospheric model's skill in simulating extreme temperatures, precipitation and storm surges in Europe, in view of their attribution to human influence. The analysis is performed on an ensemble of 15 atmospheric simulations forced with observed sea surface temperature over the 54-year period 1960-2013. These simulations, together with dual simulations without human influence in the forcing, are intended to be used in weather and climate event attribution. The analysis investigates the main processes leading to extreme events, including atmospheric circulation patterns, their links with temperature extremes, land-atmosphere and troposphere-stratosphere interactions. It also compares observed and simulated variability, trends and generalized extreme value theory parameters for temperature and precipitation. One of the most striking findings is the ability of the model to capture North Atlantic atmospheric weather regimes as obtained from a cluster analysis of sea level pressure fields. The model also reproduces the main observed weather patterns responsible for temperature and precipitation extreme events. However, biases are found in many physical processes. Slightly excessive drying may be the cause of overestimated summer interannual variability and too intense heat waves, especially in central/northern Europe. However, this does not seem to hinder proper simulation of summer temperature trends. Cold extremes appear well simulated, as do the underlying blocking frequency and stratosphere-troposphere interactions. Extreme precipitation amounts are overestimated and too variable. The atmospheric conditions leading to storm surges were also examined in the Baltic region. There, simulated weather conditions do not appear to lead to strong enough storm surges, but winds were found to be in very good agreement with reanalyses. The performance in reproducing atmospheric weather patterns

  12. A sequential threshold cure model for genetic analysis of time-to-event data

    DEFF Research Database (Denmark)

    Ødegård, J; Madsen, Per; Labouriau, Rodrigo S.

    2011-01-01

    In the analysis of time-to-event data, classical survival models ignore the presence of potential nonsusceptible (cured) individuals, which, if present, will invalidate the inference procedures. The existence of nonsusceptible individuals is particularly relevant under challenge testing with specific pathogens, which is a common procedure in aquaculture breeding schemes. A cure model is a survival model accounting for a fraction of nonsusceptible individuals in the population. This study proposes a mixed cure model for time-to-event data, measured as sequential binary records. In a simulation study, survival data were generated through 2 underlying traits: susceptibility and endurance (risk of dying per time-unit), associated with 2 sets of underlying liabilities. Despite considerable phenotypic confounding, the proposed model was largely able to distinguish the 2 traits. Furthermore, if selection
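
    A small sketch of the data-generating process described above, assuming nothing beyond the abstract: a binary susceptibility trait fixes the cure fraction, and an endurance trait sets the per-time-unit death risk of susceptible individuals, producing sequential binary records. All parameter values are illustrative.

    ```python
    # Simulate sequential binary survival records under a two-trait cure model.
    import numpy as np

    rng = np.random.default_rng(7)
    n, n_periods = 1000, 20
    p_susceptible, hazard = 0.6, 0.15  # cure fraction = 1 - p_susceptible

    susceptible = rng.random(n) < p_susceptible
    records = np.zeros((n, n_periods), dtype=int)  # 1 = died in that period
    for t in range(n_periods):
        alive = records.sum(axis=1) == 0
        dies = alive & susceptible & (rng.random(n) < hazard)
        records[dies, t] = 1

    # Observed mortality stays below p_susceptible: cured animals never die.
    print("observed mortality:", records.any(axis=1).mean())
    ```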

  13. Current fluctuations and statistics during a large deviation event in an exactly solvable transport model

    International Nuclear Information System (INIS)

    Hurtado, Pablo I; Garrido, Pedro L

    2009-01-01

    We study the distribution of the time-integrated current in an exactly solvable toy model of heat conduction, both analytically and numerically. The simplicity of the model allows us to derive the full current large deviation function and the system statistics during a large deviation event. In this way we unveil a relation between system statistics at the end of a large deviation event and for intermediate times. The mid-time statistics is independent of the sign of the current, a reflection of the time-reversal symmetry of microscopic dynamics, while the end-time statistics does depend on the current sign, and also on its microscopic definition. We compare our exact results with simulations based on the direct evaluation of large deviation functions, analyzing the finite-size corrections of this simulation method and deriving detailed bounds for its applicability. We also show how the Gallavotti–Cohen fluctuation theorem can be used to determine the range of validity of simulation results

  14. Error Analysis of Satellite Precipitation-Driven Modeling of Flood Events in Complex Alpine Terrain

    Directory of Open Access Journals (Sweden)

    Yiwen Mei

    2016-03-01

    The error in satellite precipitation-driven complex terrain flood simulations is characterized in this study for eight different global satellite products and 128 flood events over the Eastern Italian Alps. The flood events are grouped according to two flood types: rain floods and flash floods. The satellite precipitation products and runoff simulations are evaluated based on systematic and random error metrics applied to the matched event pairs and basin-scale event properties (i.e., rainfall and runoff cumulative depth and time series shape). Overall, the error characteristics exhibit a dependency on the flood type. Generally, the timing of the event precipitation mass center and the dispersion of the time series derived from satellite precipitation exhibit good agreement with the reference; the cumulative depth is mostly underestimated. The study shows a dampening effect in both systematic and random error components of the satellite-driven hydrograph relative to the satellite-retrieved hyetograph. The systematic error in the shape of the time series shows a significant dampening effect. The random error dampening effect is less pronounced for the flash flood events and the rain flood events with a high runoff coefficient. This event-based analysis of satellite precipitation error propagation in flood modeling sheds light on the application of satellite precipitation in mountain flood hydrology.

  15. Event-by-event simulation of quantum phenomena

    NARCIS (Netherlands)

    Raedt, H. De; Raedt, K. De; Michielsen, K.; Landau, DP; Lewis, SP; Schuttler, HB

    2006-01-01

    In various basic experiments in quantum physics, observations are recorded event-by-event. The final outcome of such experiments can be computed according to the rules of quantum theory, but quantum theory does not describe single events. In this paper, we describe a simulation approach that does

  16. Numerical simulation of internal reconnection event in spherical tokamak

    International Nuclear Information System (INIS)

    Hayashi, Takaya; Mizuguchi, Naoki; Sato, Tetsuya

    1999-07-01

    Three-dimensional magnetohydrodynamic simulations are executed in a full toroidal geometry to clarify the physical mechanisms of the Internal Reconnection Event (IRE), which is observed in the spherical tokamak experiments. The simulation results reproduce several main properties of IRE. Comparison between the numerical results and experimental observation indicates fairly good agreements regarding nonlinear behavior, such as appearance of localized helical distortion, appearance of characteristic conical shape in the pressure profile during thermal quench, and subsequent appearance of the m=2/n=1 type helical distortion of the torus. (author)

  17. Comparison of Thunderstorm Simulations from WRF-NMM and WRF-ARW Models over East Indian Region

    Directory of Open Access Journals (Sweden)

    A. J. Litta

    2012-01-01

    Thunderstorms are typical mesoscale systems dominated by intense convection. Mesoscale models are essential for the accurate prediction of such high-impact weather events. In the present study, an attempt has been made to compare the simulated results of three thunderstorm events using the NMM and ARW model cores of the WRF system and to validate the model results against observations. Both models performed well in capturing stability indices, which are indicators of severe convective activity. Comparison of model-simulated radar reflectivity imagery with observations revealed that the NMM model simulated the propagation of the squall line well, while the squall line movement was slow in ARW. From the model-simulated spatial plots of cloud top temperature, we can see that the NMM model better captured the genesis, intensification, and propagation of the thunder squall than the ARW model. The statistical analysis of rainfall indicates the better performance of NMM compared to ARW. Comparison of model-simulated thunderstorm-affected parameters with observations showed that NMM performed better than ARW in capturing the sharp rise in humidity and drop in temperature. This suggests that the NMM model has the potential to provide unique and valuable information for severe thunderstorm forecasters over the east Indian region.

  18. Attribution of Extreme Rainfall Events in the South of France Using EURO-CORDEX Simulations

    Science.gov (United States)

    Luu, L. N.; Vautard, R.; Yiou, P.

    2017-12-01

    The Mediterranean region regularly undergoes episodes of intense precipitation in the fall season that exceed 300 mm a day. This study focuses on the role of climate change in the dynamics of the events that occur in the South of France. We used an ensemble of 10 EURO-CORDEX model simulations with two horizontal resolutions (EUR-11: 0.11° and EUR-44: 0.44°) for the attribution of extreme fall rainfall in the Cevennes mountain range (South of France). The biases of the simulations were corrected with a simple scaling adjustment and a quantile correction (CDFt). This produces five datasets, including EUR-44 and EUR-11 with and without scaling adjustment and CDFt-EUR-11, on which we test the impact of resolution and bias correction on the extremes. These datasets, after pooling all models together, are fitted with a stationary Generalized Extreme Value distribution for several periods to estimate a climate change signal in the tail of the distribution of extreme rainfall in the Cevennes region. The changes are then interpreted with a scaling model that links extreme rainfall with mean and maximum daily temperature. The results show that higher-resolution simulations with bias adjustment indicate a robust and confident increase in the intensity and likelihood of occurrence of autumn extreme rainfall in the area in the current climate in comparison with the historical climate. The exceedance probability of a 1-in-1000-year event in the historical climate may increase by a factor of 1.8 under the current climate, with a confidence interval of 0.4 to 5.3, according to the CDFt bias-adjusted EUR-11. The change in magnitude appears to follow the Clausius-Clapeyron relation, which indicates a 7% increase in rainfall per 1°C increase in temperature.
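
    The attribution step can be sketched as follows: fit a stationary GEV to rainfall maxima from two periods and compare exceedance probabilities at a fixed threshold; their ratio is the probability ratio reported above. The data here are synthetic stand-ins for the bias-adjusted EURO-CORDEX output, using SciPy's genextreme.

    ```python
    # GEV-based probability ratio on synthetic "historical" vs "current" maxima.
    import numpy as np
    from scipy.stats import genextreme

    rng = np.random.default_rng(3)
    hist = genextreme.rvs(c=-0.1, loc=80, scale=25, size=500, random_state=rng)
    curr = genextreme.rvs(c=-0.1, loc=88, scale=27, size=500, random_state=rng)

    params_hist = genextreme.fit(hist)
    params_curr = genextreme.fit(curr)

    # 1-in-1000-year level under the fitted historical distribution
    threshold = genextreme.isf(1e-3, *params_hist)
    p_hist = genextreme.sf(threshold, *params_hist)   # ~1e-3 by construction
    p_curr = genextreme.sf(threshold, *params_curr)
    print(f"probability ratio: {p_curr / p_hist:.1f}")
    ```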

  19. The 2010 Pakistan floods: high-resolution simulations with the WRF model

    Science.gov (United States)

    Viterbo, Francesca; Parodi, Antonio; Molini, Luca; Provenzale, Antonello; von Hardenberg, Jost; Palazzi, Elisa

    2013-04-01

    Estimating current and future water resources in high mountain regions with complex orography is a difficult but crucial task. In particular, the French-Italian project PAPRIKA is focused on two specific regions in the Hindu Kush-Himalaya-Karakorum (HKKH) region: the Shigar basin in Pakistan, at the foot of K2, and the Khumbu valley in Nepal, at the foot of Mount Everest. In this framework, we use the WRF model to simulate precipitation and meteorological conditions at high resolution in areas with extreme orographic slopes, comparing the model output with station and satellite data. Once the model is validated, we shall run a set of three future time-slices at very high spatial resolution, for the periods 2046-2050, 2071-2075 and 2096-2100, nested in different climate change scenarios (EXtreme PREcipitation and Hydrological climate Scenario Simulations, EXPRESS-Hydro project). As a prelude to this study, here we discuss the simulation of specific, high-intensity rainfall events in this area. In this paper we focus on the 2010 Pakistan floods, which began in late July 2010, produced heavy monsoon rains in the Khyber Pakhtunkhwa, Sindh, Punjab and Balochistan regions of Pakistan and affected the Indus River basin. Approximately one-fifth of Pakistan's total land area was underwater, with a death toll of about 2000 people. This event has been simulated with the WRF model (version 3.3) in cloud-permitting mode (d01 14 km and d02 3.5 km); different convective closures and microphysics parameterizations have been used. A deeper understanding of the processes responsible for this event has been gained through comparison with rainfall depth observations, radiosounding data and geostationary/polar satellite images.

  20. Mathematical basis for the process of model simulation of drilling operations

    Energy Technology Data Exchange (ETDEWEB)

    Lipovetskiy, G M; Lebedinskiy, G L

    1979-01-01

    The authors describe the application of a method for the model simulation of drilling operations and for the solution of problems concerned with the planning and management of such operations. An approach to the simulation process when the drilling operations are part of a large system is described. An algorithm is provided for calculating complex events.

  1. Modeling the Process of Event Sequence Data Generated for Working Condition Diagnosis

    Directory of Open Access Journals (Sweden)

    Jianwei Ding

    2015-01-01

    Condition monitoring systems are widely used to monitor the working condition of equipment, generating a vast amount and variety of telemetry data in the process. The main task of surveillance focuses on analyzing these routinely collected telemetry data to help assess the working condition of the equipment. However, with the rapid increase in the volume of telemetry data, it is a nontrivial task to analyze all the telemetry data to understand the working condition of the equipment without any a priori knowledge. In this paper, we propose a probabilistic generative model called the working condition model (WCM), which is capable of simulating the process by which event sequence data are generated and of depicting the working condition of equipment at runtime. With the help of the WCM, we are able to analyze how event sequence data behave in different working modes and, at the same time, to detect the working mode of an event sequence (working condition diagnosis). Furthermore, we have applied the WCM to illustrative applications such as automated detection of anomalous event sequences during equipment runtime. Our experimental results on real data sets demonstrate the effectiveness of the model.

  2. Discrete event simulation for exploring strategies: an urban water management case.

    Science.gov (United States)

    Huang, Dong-Bin; Scholz, Roland W; Gujer, Willi; Chitwood, Derek E; Loukopoulos, Peter; Schertenleib, Roland; Siegrist, Hansruedi

    2007-02-01

    This paper presents a model structure aimed at offering an overview of the various elements of a strategy and exploring their multidimensional effects through time in an efficient way. It treats a strategy as a set of discrete events planned to achieve a certain strategic goal and develops a new form of causal networks as an interfacing component between decision makers and environment models, e.g., life cycle inventory and material flow models. The causal network receives a strategic plan as input in a discrete manner and then outputs the updated parameter sets to the subsequent environmental models. Accordingly, the potential dynamic evolution of environmental systems caused by various strategies can be stepwise simulated. It enables a way to incorporate discontinuous change in models for environmental strategy analysis, and enhances the interpretability and extendibility of a complex model by its cellular constructs. It is exemplified using an urban water management case in Kunming, a major city in Southwest China. By utilizing the presented method, the case study modeled the cross-scale interdependencies of the urban drainage system and regional water balance systems, and evaluated the effectiveness of various strategies for improving the situation of Dianchi Lake.

  3. Analysis of hypoglycemic events using negative binomial models.

    Science.gov (United States)

    Luo, Junxiang; Qu, Yongming

    2013-01-01

    Negative binomial regression is a standard model for analyzing hypoglycemic events in diabetes clinical trials. Adjusting for baseline covariates could potentially increase the estimation efficiency of negative binomial regression. However, adjusting for covariates raises concerns about model misspecification, against which negative binomial regression is not robust because of its strong model assumptions. Some literature suggests correcting the standard error of the maximum likelihood estimator by introducing overdispersion, which can be estimated by the deviance or Pearson chi-square. We propose conducting negative binomial regression using sandwich estimation to calculate the covariance matrix of the parameter estimates, together with a Pearson overdispersion correction (denoted NBSP). In this research, we compared several commonly used negative binomial model options with our proposed NBSP. Simulations and real data analyses showed that NBSP is the most robust to model misspecification, and that estimation efficiency is improved by adjusting for baseline hypoglycemia.
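
    A sketch of the NBSP idea on synthetic data, assuming the statsmodels API: a negative binomial GLM is fitted with a robust (sandwich, HC0) covariance, and the Pearson chi-square divided by the residual degrees of freedom serves as the overdispersion diagnostic. The data-generating parameters are illustrative, not from the paper.

    ```python
    # Negative binomial GLM with sandwich covariance and Pearson dispersion.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 400
    x = rng.normal(size=n)                 # e.g. baseline hypoglycemia covariate
    mu = np.exp(0.3 + 0.5 * x)
    y = rng.negative_binomial(n=2, p=2 / (2 + mu))  # overdispersed counts, mean mu

    X = sm.add_constant(x)
    fit = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=0.5)).fit(
        cov_type="HC0")                    # sandwich (robust) standard errors
    print(fit.summary())
    print("Pearson dispersion:", fit.pearson_chi2 / fit.df_resid)
    ```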

  4. Forecasting Lightning Threat using Cloud-resolving Model Simulations

    Science.gov (United States)

    McCaul, E. W., Jr.; Goodman, S. J.; LaCasse, K. M.; Cecil, D. J.

    2009-01-01

    As numerical forecasts capable of resolving individual convective clouds become more common, it is of interest to see whether quantitative forecasts of lightning flash rate density are possible, based on fields computed by the numerical model. Previous observational research has shown robust relationships between observed lightning flash rates and inferred updraft and large precipitation ice fields in the mixed-phase regions of storms, and that these relationships might allow simulated fields to serve as proxies for lightning flash rate density. It is shown in this paper that two simple proxy fields do indeed provide reasonable and cost-effective bases for creating time-evolving maps of predicted lightning flash rate density, judging from a series of diverse simulation case study events in North Alabama for which Lightning Mapping Array data provide ground truth. One method is based on the product of upward velocity and the mixing ratio of precipitating ice hydrometeors, modeled as graupel only, in the mixed-phase region of storms at the -15 °C level, while the second method is based on the vertically integrated amounts of ice hydrometeors in each model grid column. Each method can be calibrated by comparing domain-wide statistics of the peak values of simulated flash rate proxy fields against domain-wide peak total lightning flash rate density data from observations. Tests show that the first method is able to capture much of the temporal variability of the lightning threat, while the second method does a better job of depicting the areal coverage of the threat. A blended solution is designed to retain most of the temporal sensitivity of the first method, while adding the improved spatial coverage of the second. Weather Research and Forecast Model simulations of selected North Alabama cases show that this model can distinguish the general character and intensity of most convective events, and that the proposed methods show promise as a means of generating
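
    The two proxies are straightforward to compute from model output. The sketch below evaluates both on toy arrays standing in for WRF fields; the grid shape, the -15 °C level index, the observed peak flash rate and all field values are illustrative assumptions.

    ```python
    # Two lightning proxies: w * q_graupel at -15 °C, and column-integrated ice.
    import numpy as np

    rng = np.random.default_rng(5)
    nz, ny, nx = 40, 60, 60
    w = rng.gamma(2.0, 1.5, size=(nz, ny, nx))       # updraft speed (m/s)
    q_ice = rng.gamma(1.5, 4e-4, size=(nz, ny, nx))  # ice mixing ratio (kg/kg)
    rho = np.linspace(1.1, 0.3, nz)[:, None, None]   # air density (kg/m^3)
    dz = 400.0                                       # layer thickness (m)
    k15 = 22                                         # index of the -15 °C level

    proxy1 = w[k15] * q_ice[k15]                     # flux-style proxy
    proxy2 = (rho * q_ice * dz).sum(axis=0)          # column ice (kg/m^2)

    # Calibrate each proxy so its domain peak matches an observed peak rate
    obs_peak = 0.4                                   # flashes/km^2/min (assumed)
    for name, p in [("proxy1", proxy1), ("proxy2", proxy2)]:
        print(name, "calibration factor:", obs_peak / p.max())
    ```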

  5. A code for simulation of human failure events in nuclear power plants: SIMPROC

    International Nuclear Information System (INIS)

    Gil, Jesus; Fernandez, Ivan; Murcia, Santiago; Gomez, Javier; Marrao, Hugo; Queral, Cesar; Exposito, Antonio; Rodriguez, Gabriel; Ibanez, Luisa; Hortal, Javier; Izquierdo, Jose M.; Sanchez, Miguel; Melendez, Enrique

    2011-01-01

    Over the past years, many Nuclear Power Plant organizations have performed Probabilistic Safety Assessments (PSAs) to identify and understand key plant vulnerabilities. As part of enhancing PSA quality, Human Reliability Analysis (HRA) is essential to making a realistic evaluation of safety and of the facility's potential weaknesses. Moreover, it has to be noted that HRA continues to be a large source of uncertainty in PSAs. Within their current joint collaborative activities, Indizen, Universidad Politecnica de Madrid and the Consejo de Seguridad Nuclear have developed the so-called SIMulator of PROCedures (SIMPROC), a tool that aims to simulate events related to human actions and is able to interact with a plant simulation model. The tool helps the analyst quantify the importance of human actions in the final plant state. Among other things, the main goal of SIMPROC is to check the Emergency Operating Procedures used by the operating crew to lead the plant to a safe shutdown state. Currently SIMPROC is coupled with the SCAIS software package, but the tool is flexible enough to be linked to other plant simulation codes. SIMPROC-SCAIS applications are presented in this article to illustrate the tool's performance. The applications were developed in the framework of the Nuclear Energy Agency project on Safety Margin Assessment and Applications (SM2A). First, an introductory example was performed to obtain the damage domain boundary of a selected sequence from a SBLOCA. Second, the damage domain area of a selected sequence from a loss of Component Cooling Water with a subsequent seal LOCA was calculated. SIMPROC simulates the corresponding human actions in both cases. The results achieved show how the system can be adapted to a wide range of purposes, such as Dynamic Event Tree delineation, Emergency Operating Procedures and damage domain search.

  6. Toward Improving Predictability of Extreme Hydrometeorological Events: the Use of Multi-scale Climate Modeling in the Northern High Plains

    Science.gov (United States)

    Munoz-Arriola, F.; Torres-Alavez, J.; Mohamad Abadi, A.; Walko, R. L.

    2014-12-01

    Our goal is to investigate possible sources of predictability of hydrometeorological extreme events in the Northern High Plains. Hydrometeorological extreme events are considered the most costly natural phenomena. Water deficits and surpluses highlight how the water-climate interdependence becomes crucial in areas where single activities drive economies, such as agriculture in the NHP. Although we recognize the water-climate interdependence and the regulatory role that human activities play, we still grapple with identifying what sources of predictability could be added to flood and drought forecasts. To identify the benefit of multi-scale climate modeling and the role of initial conditions in flood and drought predictability in the NHP, we use the Ocean Land Atmospheric Model (OLAM). OLAM is characterized by a dynamic core with a global geodesic grid with hexagonal (and variably refined) mesh cells, a finite volume discretization of the full compressible Navier-Stokes equations, and a cut-grid cell method for topography (which reduces errors in gradient computation and anomalous vertical dispersion). Our hypothesis is that wet initial conditions will drive OLAM's simulations of precipitation toward wetter conditions, affecting both flood and drought forecasts. To test this hypothesis we simulate precipitation during identified historical flood events followed by drought events in the NHP (i.e., 2011-2012). We initialized OLAM with CFS data 1-10 days prior to a flooding event (as initial conditions) to explore (1) short-term, high-resolution and (2) long-term, coarse-resolution simulations of flood and drought events, respectively. While floods are assessed during a maximum of 15 days of refined-mesh simulations, drought is evaluated during the following 15 months. Simulated precipitation will be compared with the Sub-continental Observation Dataset, a gridded 1/16th-degree resolution dataset obtained from climatological stations in Canada, US, and

  7. Construction and updating of event models in auditory event processing.

    Science.gov (United States)

    Huff, Markus; Maurer, Annika E; Brich, Irina; Pagenkopf, Anne; Wickelmaier, Florian; Papenmeier, Frank

    2018-02-01

    Humans segment the continuous stream of sensory information into distinct events at points of change. Between 2 events, humans perceive an event boundary. Present theories propose that changes in the sensory information trigger updating processes of the present event model. Increased encoding effort finally leads to a memory benefit at event boundaries. Evidence from reading time studies (increased reading times with increasing amount of change) suggests that the updating of event models is incremental. We present results from 5 experiments that studied event processing (including memory formation processes and reading times) using an audio drama as well as a transcript thereof as stimulus material. Experiments 1a and 1b replicated the event boundary advantage effect for memory. In contrast to recent evidence from studies using visual stimulus material, Experiments 2a and 2b found no support for incremental updating in recognition memory for normally sighted and blind participants. In Experiment 3, we replicated Experiment 2a using a written transcript of the audio drama as stimulus material, allowing us to disentangle encoding and retrieval processes. Our results indicate incremental updating processes at encoding (as measured with reading times). At the same time, we again found recognition performance to be unaffected by the amount of change. We discuss these findings in light of current event cognition theories.

  8. Physical model of the nuclear fuel cycle simulation code SITON

    International Nuclear Information System (INIS)

    Brolly, Á.; Halász, M.; Szieberth, M.; Nagy, L.; Fehér, S.

    2017-01-01

    Finding answers to the main challenges of nuclear energy, such as resource utilisation and waste minimisation, calls for transient fuel cycle modelling. This motivation led to the development of SITON v2.0, a dynamic, discrete-facilities/discrete-materials and discrete-events fuel cycle simulation code. The physical model of the code includes the most important fuel cycle facilities. Facilities can be connected flexibly and their number is not limited. Material transfer between facilities is tracked, taking into account 52 nuclides. The composition of discharged fuel is determined using burnup tables, except for the 2400 MW thermal power design of the Gas-Cooled Fast Reactor (GFR2400). For the GFR2400 the FITXS method is used, which fits one-group microscopic cross-sections as polynomial functions of the fuel composition. This method is accurate and fast enough to be used in fuel cycle simulations. Operation of the fuel cycle, i.e. material requests and transfers, is described by discrete events. In advance of the simulation, reactors and plants formulate their requests as events; triggered requests are tracked. After that, the events are simulated, i.e. the requests are fulfilled and the composition of the material flow between facilities is calculated. To demonstrate the capabilities of SITON v2.0, a hypothetical transient fuel cycle is presented in which a 4-unit VVER-440 reactor park was replaced by one GFR2400 that recycled its own spent fuel. It is found that the GFR2400 can be started if the cooling time of its spent fuel is 2 years. However, if the cooling time is 5 years, it needs an additional plutonium feed, which can be covered from the spent fuel of a Generation III light water reactor.
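
    The request/fulfil event loop described above is easy to illustrate. Below is a minimal, hypothetical Python sketch of that discrete-event pattern; the class names, units and partial-fulfilment rule are our own assumptions, not SITON's actual interfaces.

    ```python
    import heapq

    # Material requests are posted in advance as timed events and fulfilled
    # in time order, mirroring the discrete-event operation described above.
    class Facility:
        def __init__(self, name, stock_kg=0.0):
            self.name = name
            self.stock_kg = stock_kg

    def run(requests):
        """requests: list of (time_y, seq, source, sink, mass_kg) events."""
        heapq.heapify(requests)
        while requests:
            t, _, src, dst, mass = heapq.heappop(requests)
            moved = min(mass, src.stock_kg)      # fulfil what is available
            src.stock_kg -= moved
            dst.stock_kg += moved
            print(f"t={t:4.1f} y  {src.name} -> {dst.name}: {moved:.0f} kg")

    pond = Facility("cooling pond", stock_kg=12000.0)
    gfr = Facility("GFR2400 fresh-fuel store")
    run([(2.0, 0, pond, gfr, 8000.0), (5.0, 1, pond, gfr, 8000.0)])
    ```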

  9. CATASTROPHIC EVENTS MODELING

    Directory of Open Access Journals (Sweden)

    Ciumas Cristina

    2013-07-01

    Full Text Available This paper presents the emergence and evolution of catastrophe models (cat models). Starting from the present context of extreme weather events and the features of catastrophic risk (cat risk), we give a chronological, theoretical illustration of the main steps taken in building such models, highlighting the importance of interdisciplinarity. The first cat model considered contains three modules. For each of the identified modules (hazard, vulnerability and financial losses), a detailed overview is provided, together with an exemplification based on an earthquake measuring more than 7 on the Richter scale occurring nowadays in Bucharest. The key areas exposed to earthquakes in Romania are identified. Then, based on past catastrophe data and taking into account present conditions of the housing stock, insurance coverage and the population of Bucharest, the impact is quantified by determining potential losses. To accomplish this work we consider a scenario with data representing average values for the dwelling's surface, location and finishing works. At each step we refer back to the earthquake of March 4, 1977 to see what would happen today if a similar event occurred. The value of Bucharest's housing stock is determined taking firstly the market value, then the replacement value and ultimately the real value, to quantify potential damages. Through this approach we can find the insurance coverage of potential losses and also the uncovered gap. A solution that may be taken into account by public authorities, for example by Bucharest City Hall, is offered: in case such an event occurs, the impossibility of paying compensation to insured people, rebuilding infrastructure and public buildings and helping the suffering persons should be avoided. An active public-private partnership should be created between government authorities, the Natural Disaster Insurance Pool, private

  10. A View on Future Building System Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Wetter, Michael

    2011-04-01

    This chapter presents what a future environment for building system modeling and simulation may look like. As buildings continue to require increased performance and better comfort, their energy and control systems are becoming more integrated and complex. We therefore focus in this chapter on the modeling, simulation and analysis of building energy and control systems. Such systems can be classified as heterogeneous systems because they involve multiple domains, such as thermodynamics, fluid dynamics, heat and mass transfer, electrical systems, control systems and communication systems. They also typically involve multiple temporal and spatial scales, and their evolution can be described by coupled differential equations, discrete equations and events. Modeling and simulating such systems requires a higher level of abstraction and modularisation to manage the increased complexity compared to what is used in today's building simulation programs. Therefore, the trend towards more integrated building systems is likely to be a driving force for changing the status quo of today's building simulation programs. This chapter discusses evolving modeling requirements and outlines a path toward a future environment for modeling and simulation of heterogeneous building systems. A range of topics that would require many additional pages of discussion has been omitted. Examples include computational fluid dynamics for air and particle flow in and around buildings, people movement, daylight simulation, uncertainty propagation and optimisation methods for building design and controls. For different discussions and perspectives on the future of building modeling and simulation, we refer to Sahlin (2000), Augenbroe (2001) and Malkawi and Augenbroe (2004).

  11. Mixed-realism simulation of adverse event disclosure: an educational methodology and assessment instrument.

    Science.gov (United States)

    Matos, Francisco M; Raemer, Daniel B

    2013-04-01

    Physicians have an ethical duty to disclose adverse events to patients or families. Various strategies have been reported for teaching disclosure, but no instruments have been shown to be reliable for assessing them. The aims of this study were to report a structured method for teaching adverse event disclosure using mixed-realism simulation, to develop and begin to validate an instrument for assessing performance, and to describe the disclosure practice of anesthesiology trainees. Forty-two anesthesiology trainees participated in a 2-part exercise with mixed-realism simulation. The first part took place with a mannequin patient in a simulated operating room, where trainees became enmeshed in a clinical episode that led to an adverse event; the second part took place in a simulated postoperative care unit, where the learner is asked to disclose to a standardized patient who systematically moves through epochs of grief response. Two raters scored subjects using an assessment instrument we developed that combines a 4-element behaviorally anchored rating scale (BARS) with a 5-stage objective rating scale. The performance scores for elements within the BARS and the 5-stage instrument showed excellent interrater reliability (Cohen's κ = 0.7), appropriate range (mean range for BARS, 4.20-4.47; mean range for 5-stage instrument, 3.73-4.46), and high internal consistency. We describe a method of mixed-realism simulation that engages learners in an adverse event and allows them to practice disclosure across a structured range of patient responses. We have developed a reliable 2-part instrument with strong psychometric properties for assessing disclosure performance.

  12. Proactive modeling of water quality impacts of extreme precipitation events in a drinking water reservoir.

    Science.gov (United States)

    Jeznach, Lillian C; Hagemann, Mark; Park, Mi-Hyun; Tobiason, John E

    2017-10-01

    Extreme precipitation events are of concern to managers of drinking water sources because these occurrences can affect both water supply quantity and quality. However, little is known about how these low-probability events impact organic matter and nutrient loads to surface water sources, and how these loads may impact raw water quality. This study describes a method for evaluating the sensitivity of a water body of interest using watershed input simulations of extreme precipitation events. An example application of the method is illustrated using the Wachusett Reservoir, an oligo-mesotrophic surface water reservoir in central Massachusetts and a major drinking water supply for metropolitan Boston. Extreme precipitation event simulations during the spring and summer resulted in total organic carbon, UV-254 (a surrogate measurement for reactive organic matter), and total algae concentrations at the drinking water intake that exceeded recorded maximums. Nutrient concentrations after storm events were less likely to exceed recorded historical maximums. For this particular reservoir, increasing inter-reservoir transfers of water with lower organic matter content after a large precipitation event has been shown, in practice and in model simulations, to decrease organic matter levels at the drinking water intake, thereby decreasing treatment-associated oxidant demand, energy for UV disinfection, and the potential for formation of disinfection byproducts. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. Applications Of Monte Carlo Radiation Transport Simulation Techniques For Predicting Single Event Effects In Microelectronics

    International Nuclear Information System (INIS)

    Warren, Kevin; Reed, Robert; Weller, Robert; Mendenhall, Marcus; Sierawski, Brian; Schrimpf, Ronald

    2011-01-01

    MRED (Monte Carlo Radiative Energy Deposition) is Vanderbilt University's Geant4 application for simulating radiation events in semiconductors. Geant4 comprises the best available computational physics models for the transport of radiation through matter. In addition to the basic radiation transport physics contained in the Geant4 core, MRED can track energy loss in tetrahedral geometric objects, includes a cross-section biasing and track weighting technique for variance reduction, and provides additional features relevant to semiconductor device applications. The crucial element in predicting Single Event Upset (SEU) parameters using radiation transport software is the creation of a dosimetry model that accurately approximates the net collected charge at transistor contacts as a function of deposited energy. The dosimetry technique described here is the multiple sensitive volume (MSV) model. It is shown to be a reasonable approximation of the charge collection process, and its parameters can be calibrated to experimental measurements of SEU cross sections. The MSV model, within the framework of MRED, is examined for heavy ion and high-energy proton SEU measurements of a static random access memory.

  14. Simulation of land use evolution by discrete events method: Application to “la chaîne des puys” from XV to XVIII Century

    Directory of Open Access Journals (Sweden)

    Y. Michelin

    1998-01-01

    Full Text Available By using a discrete event method, a simulation of land use evolution has been applied to a landscape model of "la Chaîne des Puys" (French Massif Central) over a long period (XV-XVIII centuries). The indications concerning the evolution of land use conform to observations of actual situations, but the simulated dynamics are faster than the actual changes. In spite of the limitations due to necessary simplifications, it is now established that the discrete event method is efficient for simulating land use evolution over a long period. The model can immediately describe actual dynamics and identify sensitive variables together with their critical values. Although oversimplified, it shows how far factors such as the level of crop production and taxation can influence land use and landscape changes over a more or less lengthy period. In the future, the model should be improved by introducing other deterministic and/or stochastic events.

  15. NUMERICAL MODELING OF THE 2009 IMPACT EVENT ON JUPITER

    Energy Technology Data Exchange (ETDEWEB)

    Pond, Jarrad W. T.; Palotai, Csaba; Gabriel, Travis; Harrington, Joseph; Rebeli, Noemi [Planetary Sciences Group, Department of Physics, University of Central Florida, Orlando, FL 32816-2385 (United States); Korycansky, Donald G., E-mail: jarradpond@gmail.com [Department of Earth and Planetary Science, University of California, Santa Cruz, CA 95064 (United States)

    2012-02-01

    We have investigated the 2009 July impact event on Jupiter using the ZEUS-MP 2 three-dimensional hydrodynamics code. We studied the impact itself and the following plume development. Eight impactors were considered: 0.5 km and 1 km porous (ρ = 1.760 g cm⁻³) and non-porous (ρ = 2.700 g cm⁻³) basalt impactors, and 0.5 km and 1 km porous (ρ = 0.600 g cm⁻³) and non-porous (ρ = 0.917 g cm⁻³) ice impactors. The simulations consisted of these bolides colliding with Jupiter at an incident angle of θ = 69° from the vertical and with an impact velocity of v = 61.4 km s⁻¹. Our simulations show the development of relatively larger, faster plumes created after impacts involving 1 km diameter bodies. Comparing simulations of the 2009 event with simulations of the Shoemaker-Levy 9 (SL9) events reveals a difference in plume development, with the higher incident angle of the 2009 impact leading to a shallower terminal depth and a smaller and slower plume. We also studied the amount of dynamical chaos present in the simulations conducted at the 2009 incident angle. Compared to the chaos of the SL9 simulations, where θ ≈ 45°, we find no significant difference in chaos at the higher 2009 incident angle.

  16. Joint two-part Tobit models for longitudinal and time-to-event data.

    Science.gov (United States)

    Dagne, Getachew A

    2017-11-20

    In this article, we show how Tobit models can address the problem of identifying characteristics of subjects with left-censored outcomes, in the context of developing a method for jointly analyzing time-to-event and longitudinal data. There are methods for handling these types of data separately, but they may not be appropriate when the time to event depends on the longitudinal outcome and a substantial portion of values are reported to be below the limits of detection. An alternative approach is to develop a joint model for the time-to-event outcome and a two-part longitudinal outcome, linking them through random effects. This proposed approach is implemented to assess the association between the risk of decline of the CD4/CD8 ratio and rates of change in viral load, and to discriminate patients who potentially progress to AIDS from those who do not. We develop a fully Bayesian approach for fitting joint two-part Tobit models and illustrate the proposed methods on simulated and real data from an AIDS clinical study. Copyright © 2017 John Wiley & Sons, Ltd.

  17. Kinematics of a Head-Neck Model Simulating Whiplash

    Science.gov (United States)

    Colicchia, Giuseppe; Zollman, Dean; Wiesner, Hartmut; Sen, Ahmet Ilhan

    2008-01-01

    A whiplash event is a relative motion between the head and torso that occurs in rear-end automobile collisions. In particular, the large inertia of the head results in a horizontal translation relative to the thorax. This paper describes a simulation of the motion of the head and neck during a rear-end (whiplash) collision. A head-neck model that…

  18. Structure, Function, and Applications of the Georgetown-Einstein (GE) Breast Cancer Simulation Model.

    Science.gov (United States)

    Schechter, Clyde B; Near, Aimee M; Jayasekera, Jinani; Chandler, Young; Mandelblatt, Jeanne S

    2018-04-01

    The Georgetown University-Albert Einstein College of Medicine breast cancer simulation model (Model GE) has evolved over time in structure and function to reflect advances in knowledge about breast cancer, improvements in early detection and treatment technology, and progress in computing resources. This article describes the model and provides examples of model applications. The model is a discrete-event microsimulation of single-life histories of women from multiple birth cohorts. Events are simulated in the absence of screening and treatment, and interventions are then applied to assess their impact on population breast cancer trends. The model accommodates differences in natural history associated with estrogen receptor (ER) and human epidermal growth factor receptor 2 (HER2) biomarkers, as well as conventional breast cancer risk factors. The approach for simulating breast cancer natural history is phenomenological, relying on dates, stage, and age of clinical and screen detection for a tumor molecular subtype without explicitly modeling tumor growth. The inputs to the model are regularly updated to reflect current practice. Numerous technical modifications, including the use of object-oriented programming (C++) and more efficient algorithms, along with hardware advances, have increased program efficiency, permitting simulations of large samples. The model results consistently match key temporal trends in US breast cancer incidence and mortality. The model has been used in collaboration with other CISNET models to assess cancer control policies and will be applied to evaluate clinical trial design, recurrence risk, and polygenic risk-based screening.

  19. Power System Event Ranking Using a New Linear Parameter-Varying Modeling with a Wide Area Measurement System-Based Approach

    Directory of Open Access Journals (Sweden)

    Mohammad Bagher Abolhasani Jabali

    2017-07-01

    Full Text Available Detecting critical power system events for Dynamic Security Assessment (DSA) is required for reliability improvement. The approach proposed in this paper investigates the effects of events on dynamic behavior during the nonlinear system response, whereas common approaches use steady-state conditions after events. This paper presents new and enhanced indices for event ranking based on time-domain simulation and polytopic linear parameter-varying (LPV) modeling of a power system. In the proposed approach, a polytopic LPV representation is generated via linearization about selected points of the nonlinear dynamic behavior of the power system using wide-area measurement system (WAMS) concepts, and event ranking is then based on the frequency response of the system models at the vertices. The nonlinear behavior of the system at the time of fault occurrence is therefore considered in the ranking. The proposed algorithm is applied to a power system using nonlinear simulation. Comparison of the results, especially under different fault conditions, shows the advantages of the proposed approach and indices.

  20. Probabilistic Models for Solar Particle Events

    Science.gov (United States)

    Adams, James H., Jr.; Dietrich, W. F.; Xapsos, M. A.; Welton, A. M.

    2009-01-01

    Probabilistic models of Solar Particle Events (SPEs) are used in space mission design studies to provide a description of the worst-case radiation environment that the mission must be designed to tolerate. The models determine the worst-case environment using a description of the mission and a user-specified confidence level that the provided environment will not be exceeded. This poster focuses on completing the existing suite of models by developing models for peak flux and event-integrated fluence elemental spectra for the Z>2 elements. It also discusses methods to take into account uncertainties in the database and the uncertainties resulting from the limited number of solar particle events in the database. These new probabilistic models are based on an extensive survey of SPE measurements of peak and event-integrated elemental differential energy spectra. Attempts are made to fit the measured spectra with eight different published models. The model giving the best fit to each spectrum is chosen and used to represent that spectrum for any energy in the energy range covered by the measurements. The set of all such spectral representations for each element is then used to determine the worst-case spectrum as a function of confidence level. The spectral representation that best fits these worst-case spectra is found and its dependence on confidence level is parameterized. This procedure creates probabilistic models for the peak and event-integrated spectra.
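
    As a simplified illustration of the final step, the following Python sketch takes a set of fitted event-integrated spectra (synthetic toy data here) and extracts a worst-case spectrum at a given confidence level as a per-energy quantile. The real models additionally account for mission duration and database uncertainties, so this is only a schematic.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    energies = np.logspace(0, 3, 30)                 # MeV/nuc grid, toy values
    # One row per observed SPE: toy power-law fluence spectra with random scale.
    spectra = rng.lognormal(mean=8.0, sigma=1.5, size=(200, 1)) * energies**-2

    def worst_case(fitted_spectra, confidence=0.9):
        # At each energy, the fluence exceeded by only (1 - confidence) of events.
        return np.quantile(fitted_spectra, confidence, axis=0)

    wc90 = worst_case(spectra, 0.9)                  # 90% confidence spectrum
    ```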

  1. Two-dimensional numerical simulation of the effect of single event burnout for n-channel VDMOSFET

    International Nuclear Information System (INIS)

    Guo Hongxia; Chen Yusheng; Wang Wei; Zhao Jinlong; Zhang Yimen; Zhou Hui

    2004-01-01

    The 2D MEDICI simulator is used to investigate the effect of Single Event Burnout (SEB) in n-channel power VDMOSFETs. The simulation results are consistent with published experimental results and are of great interest for a better understanding of how burnout events occur. The effects of the minority carrier lifetime in the base region, the base width and the emitter doping density on SEB susceptibility are verified, and some hardening solutions against SEB are provided. The work shows that the 2D simulator MEDICI is a useful tool for burnout prediction and for the evaluation of hardening solutions. (authors)

  2. Physiologically-based toxicokinetic models help identifying the key factors affecting contaminant uptake during flood events

    International Nuclear Information System (INIS)

    Brinkmann, Markus; Eichbaum, Kathrin; Kammann, Ulrike; Hudjetz, Sebastian; Cofalla, Catrina; Buchinger, Sebastian; Reifferscheid, Georg; Schüttrumpf, Holger; Preuss, Thomas

    2014-01-01

    Highlights: • A PBTK model for trout was coupled with a sediment equilibrium partitioning model. • The influence of physical exercise on pollutant uptake was studied using the model. • Physical exercise during flood events can increase the level of biliary metabolites. • Cardiac output and effective respiratory volume were identified as relevant factors. • These confounding factors also need to be considered in bioconcentration studies. - Abstract: As a consequence of global climate change, we will likely be facing an increasing frequency and intensity of flood events. Thus, the ecotoxicological relevance of sediment re-suspension is of growing concern. It is vital to understand contaminant uptake from suspended sediments and relate it to effects in aquatic biota. Here we report on a computational study that utilizes a physiologically based toxicokinetic model to predict uptake, metabolism and excretion of sediment-borne pyrene in rainbow trout (Oncorhynchus mykiss). To this end, data from two experimental studies were compared with the model predictions: (a) batch re-suspension experiments with a constant concentration of suspended particulate matter at two different temperatures (12 and 24 °C), and (b) simulated flood events in an annular flume. The model predicted well both the final concentrations and the kinetics of 1-hydroxypyrene secretion into the gall bladder of exposed rainbow trout. We were able to show that exhaustive exercise during exposure in simulated flood events can lead to increased levels of biliary metabolites, and we identified cardiac output and effective respiratory volume as the two most important factors for contaminant uptake. The results of our study clearly demonstrate the relevance of and the necessity to investigate uptake of contaminants from suspended sediments under realistic exposure scenarios

  4. In situ simulation: Taking reported critical incidents and adverse events back to the clinic

    DEFF Research Database (Denmark)

    Juul, Jonas; Paltved, Charlotte; Krogh, Kristian

    2014-01-01

    for content analysis and thematic analysis. Medical experts and simulation faculty will design scenarios for in situ simulation training based on the analysis. Short-term observations using time logs will be performed, along with interviews with key informants at the departments. Video data will be collected... improve patient safety if coupled with training and organisational support. Insight into the nature of reported critical incidents and adverse events can be used in writing in situ simulation scenarios and can thus lead to interventions that enhance patient safety. The patient safety literature emphasises... well-developed non-technical skills in preventing medical errors. Furthermore, critical incident and adverse event reporting systems comprise a knowledge base from which to gain in-depth insights into patient safety issues. This study explores the use of critical incident and adverse event reports to inform...

  5. Multi-day activity scheduling reactions to planned activities and future events in a dynamic model of activity-travel behavior

    Science.gov (United States)

    Nijland, Linda; Arentze, Theo; Timmermans, Harry

    2014-01-01

    Modeling multi-day planning has so far received scarce attention in activity-based transport demand modeling. However, new dynamic activity-based approaches are currently being developed. The frequency and inflexibility of planned activities and events in the activity schedules of individuals indicate the importance of incorporating such pre-planned activities in the new generation of dynamic travel demand models. Elaborating and combining previous work on event-driven activity generation, the aim of this paper is to develop and illustrate an extension of a need-based model of activity generation that takes into account the possible influences of pre-planned activities and events. This paper describes the theory and shows the results of simulations with the extension. The simulation was conducted for six different activities, and the parameter values used were consistent with an earlier estimation study. The results show that the model works well and that the influences of the parameters are consistent, logical, and have clear interpretations. These findings offer further evidence of face and construct validity for the suggested modeling approach.

  6. Quasi-continuous stochastic simulation framework for flood modelling

    Science.gov (United States)

    Moustakis, Yiannis; Kossieris, Panagiotis; Tsoukalas, Ioannis; Efstratiadis, Andreas

    2017-04-01

    Typically, flood modelling in the context of everyday engineering practice is addressed through event-based deterministic tools, e.g., the well-known SCS-CN method. A major shortcoming of such approaches is that they ignore uncertainty, which is associated with the variability of soil moisture conditions and of rainfall during the storm event. In event-based modelling, the sole expression of uncertainty is the return period of the design storm, which is assumed to represent the acceptable risk of all output quantities (flood volume, peak discharge, etc.). The varying antecedent soil moisture conditions across the basin are represented by means of scenarios (e.g., the three AMC types of the SCS), while the temporal distribution of rainfall is represented through standard deterministic patterns (e.g., the alternating blocks method). In order to address these major inconsistencies, while preserving the simplicity and parsimony of the SCS-CN method, we have developed a quasi-continuous stochastic simulation approach comprising the following steps: (1) generation of synthetic daily rainfall time series; (2) update of the potential maximum soil moisture retention, on the basis of accumulated five-day rainfall; (3) estimation of daily runoff through the SCS-CN formula, using as inputs the daily rainfall and the updated value of soil moisture retention; (4) selection of extreme events and application of the standard SCS-CN procedure to each specific event, on the basis of the synthetic rainfall. This scheme requires two stochastic modelling components, namely the CastaliaR model, for the generation of synthetic daily data, and the HyetosMinute model, for the disaggregation of daily rainfall to finer temporal scales. Outcomes of this approach are a large number of synthetic flood events, allowing the design variables to be expressed in statistical terms and the flood risk to be properly evaluated.
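
    Steps (1)-(3) of the scheme can be sketched compactly. The Python fragment below uses a toy daily rainfall generator and invented AMC thresholds in place of the CastaliaR and HyetosMinute components, together with the standard SCS-CN relations S = 25400/CN - 254 (mm) and Q = (P - 0.2S)^2 / (P + 0.8S); it is an illustration of the idea, not the authors' implementation.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    days = 365
    rain = rng.gamma(shape=0.3, scale=12.0, size=days)       # mm/day, toy series
    rain[rng.random(days) > 0.35] = 0.0                      # impose dry days

    def retention(cn):
        return 25400.0 / cn - 254.0                          # S in mm

    def scs_cn_runoff(p, s, ia_ratio=0.2):
        ia = ia_ratio * s                                    # initial abstraction
        return (p - ia) ** 2 / (p + s - ia) if p > ia else 0.0

    cn2 = 75.0                                               # average-condition CN
    runoff = np.zeros(days)
    for t in range(days):
        p5 = rain[max(0, t - 5):t].sum()                     # 5-day antecedent rain
        cn = 60.0 if p5 < 13 else (88.0 if p5 > 28 else cn2) # toy AMC I/II/III rule
        runoff[t] = scs_cn_runoff(rain[t], retention(cn))
    ```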

  7. An Event-Driven Hybrid Molecular Dynamics and Direct Simulation Monte Carlo Algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Donev, A; Garcia, A L; Alder, B J

    2007-07-30

    A novel algorithm is developed for the simulation of polymer chains suspended in a solvent. The polymers are represented as chains of hard spheres tethered by square wells, and they interact with the solvent particles through hard-core potentials. The algorithm uses event-driven molecular dynamics (MD) for the simulation of the polymer chain and of the interactions between the chain beads and the surrounding solvent particles. The interactions between the solvent particles themselves are not treated deterministically as in event-driven algorithms; rather, the momentum and energy exchange in the solvent is determined stochastically using the Direct Simulation Monte Carlo (DSMC) method. The coupling between the solvent and the solute is consistently represented at the particle level; however, unlike full MD simulations of both the solvent and the solute, the spatial structure of the solvent is ignored. The algorithm is described in detail and applied to the study of the dynamics of a polymer chain tethered to a hard wall subjected to uniform shear. The algorithm closely reproduces full MD simulations with two orders of magnitude greater efficiency. The results do not confirm the existence of periodic (cycling) motion of the polymer chain.
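
    At the heart of any event-driven hard-sphere code is the exact computation of the next collision time, obtained by solving |r_ij + v_ij t| = d for two spheres of diameter d. The generic sketch below illustrates that kernel only; it is not the authors' coupled MD/DSMC implementation.

    ```python
    import numpy as np

    def collision_time(r_ij, v_ij, d):
        """Time until contact of two hard spheres of diameter d.

        r_ij, v_ij: relative position and velocity vectors.
        Returns np.inf if the pair never collides.
        """
        b = np.dot(r_ij, v_ij)
        if b >= 0.0:                       # moving apart: no collision
            return np.inf
        v2 = np.dot(v_ij, v_ij)
        disc = b * b - v2 * (np.dot(r_ij, r_ij) - d * d)
        if disc < 0.0:                     # trajectories miss each other
            return np.inf
        return (-b - np.sqrt(disc)) / v2   # earliest root of the quadratic

    t = collision_time(np.array([2.0, 0.0, 0.0]),
                       np.array([-1.0, 0.1, 0.0]), 1.0)
    ```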

  8. Staffs' and managers' perceptions of how and when discrete event simulation modelling can be used as a decision support in quality improvement: a focus group discussion study at two hospital settings in Sweden.

    Science.gov (United States)

    Hvitfeldt-Forsberg, Helena; Mazzocato, Pamela; Glaser, Daniel; Keller, Christina; Unbeck, Maria

    2017-06-06

    To explore healthcare staffs' and managers' perceptions of how and when discrete event simulation modelling can be used as a decision support in improvement efforts. Two focus group discussions were performed. Two settings were included: a rheumatology department and an orthopaedic section both situated in Sweden. Healthcare staff and managers (n=13) from the two settings. Two workshops were performed, one at each setting. Workshops were initiated by a short introduction to simulation modelling. Results from the respective simulation model were then presented and discussed in the following focus group discussion. Categories from the content analysis are presented according to the following research questions: how and when simulation modelling can assist healthcare improvement? Regarding how, the participants mentioned that simulation modelling could act as a tool for support and a way to visualise problems, potential solutions and their effects. Regarding when, simulation modelling could be used both locally and by management, as well as a pedagogical tool to develop and test innovative ideas and to involve everyone in the improvement work. Its potential as an information and communication tool and as an instrument for pedagogic work within healthcare improvement render a broader application and value of simulation modelling than previously reported. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  9. Efficient rare-event simulation for multiple jump events in regularly varying random walks and compound Poisson processes

    NARCIS (Netherlands)

    B. Chen (Bohan); J. Blanchet; C.H. Rhee (Chang-Han); A.P. Zwart (Bert)

    2017-01-01

    We propose a class of strongly efficient rare event simulation estimators for random walks and compound Poisson processes with a regularly varying increment/jump-size distribution in a general large deviations regime. Our estimator is based on an importance sampling strategy that hinges

  10. Assessing the Impacts of Flooding Caused by Extreme Rainfall Events Through a Combined Geospatial and Numerical Modeling Approach

    Science.gov (United States)

    Santillan, J. R.; Amora, A. M.; Makinano-Santillan, M.; Marqueso, J. T.; Cutamora, L. C.; Serviano, J. L.; Makinano, R. M.

    2016-06-01

    In this paper, we present a combined geospatial and two-dimensional (2D) flood modeling approach to assess the impacts of flooding due to extreme rainfall events. We developed and implemented this approach for the Tago River Basin in the province of Surigao del Sur in Mindanao, Philippines, an area which suffered great damage from flooding caused by Tropical Storms Lingling and Jangmi in 2014. The geospatial component of the approach involves the extraction of several layers of information, such as detailed topography/terrain and man-made features (buildings, roads, bridges) from 1-m spatial resolution LiDAR Digital Surface and Terrain Models (DSM/DTM), and recent land cover from Landsat 7 ETM+ and Landsat 8 OLI images. We then used these layers as inputs in developing a Hydrologic Engineering Center Hydrologic Modeling System (HEC-HMS)-based hydrologic model, and a hydraulic model based on the 2D module of the latest version of the HEC River Analysis System (RAS), to dynamically simulate and map the depth and extent of flooding due to extreme rainfall events. The extreme rainfall events used in the simulation represent six hypothetical rainfall events with return periods of 2, 5, 10, 25, 50, and 100 years. For each event, maximum flood depth maps were generated from the simulations, and these maps were further transformed into hazard maps by categorizing flood depth into low, medium and high hazard levels. Using both the flood hazard maps and the layers of information extracted from remotely sensed datasets in spatial overlay analysis, we were then able to estimate and assess the impacts of these flooding events on buildings, roads, bridges and land cover. The assessments revealed increases in the number of buildings, roads and bridges, and in the areas of land cover, exposed to various flood hazards as rainfall events become more extreme. The wealth of information generated from the flood impact assessment using the approach can be very useful to the

  11. Modeling associations between latent event processes governing time series of pulsing hormones.

    Science.gov (United States)

    Liu, Huayu; Carlson, Nichole E; Grunwald, Gary K; Polotsky, Alex J

    2017-10-31

    This work is motivated by a desire to quantify relationships between two time series of pulsing hormone concentrations. The locations of pulses are not directly observed and may be considered latent event processes. The latent event processes of pulsing hormones are often associated. It is this joint relationship we model. Current approaches to jointly modeling pulsing hormone data generally assume that a pulse in one hormone is coupled with a pulse in another hormone (one-to-one association). However, pulse coupling is often imperfect. Existing joint models are not flexible enough for imperfect systems. In this article, we develop a more flexible class of pulse association models that incorporate parameters quantifying imperfect pulse associations. We propose a novel use of the Cox process model as a model of how pulse events co-occur in time. We embed the Cox process model into a hormone concentration model. Hormone concentration is the observed data. Spatial birth and death Markov chain Monte Carlo is used for estimation. Simulations show the joint model works well for quantifying both perfect and imperfect associations and offers estimation improvements over single hormone analyses. We apply this model to luteinizing hormone (LH) and follicle stimulating hormone (FSH), two reproductive hormones. Use of our joint model results in an ability to investigate novel hypotheses regarding associations between LH and FSH secretion in obese and non-obese women. © 2017, The International Biometric Society.
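
    One simple way to picture imperfect pulse association is a shared parent process with independent thinning, as in the hypothetical sketch below. The authors' actual model embeds a Cox process in a hormone concentration model and is estimated by spatial birth-and-death MCMC, which this toy does not attempt.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    T = 24.0                                   # hours observed
    rate = 1.0                                 # candidate pulses per hour
    n = rng.poisson(rate * T)
    candidates = np.sort(rng.uniform(0, T, n)) # common parent event times

    # Each candidate survives into each hormone's pulse train independently,
    # so only a fraction of pulses co-occur (imperfect association).
    p_lh, p_fsh = 0.9, 0.6
    lh_pulses = candidates[rng.random(n) < p_lh]
    fsh_pulses = candidates[rng.random(n) < p_fsh]
    # A shared parent is kept in both trains with probability p_lh * p_fsh.
    ```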

  12. Discrete-Event Simulation Unmasks the Quantum Cheshire Cat

    Science.gov (United States)

    Michielsen, Kristel; Lippert, Thomas; Raedt, Hans De

    2017-05-01

    It is shown that discrete-event simulation accurately reproduces the experimental data of a single-neutron interferometry experiment [T. Denkmayr et al., Nat. Commun. 5, 4492 (2014)] and provides a logically consistent, paradox-free, cause-and-effect explanation of the quantum Cheshire cat effect without invoking the notion that the neutron and its magnetic moment separate. Describing the experimental neutron data using weak-measurement theory is shown to be useless for unravelling the quantum Cheshire cat effect.

  13. Operational analysis and improvement of a spent nuclear fuel handling and treatment facility using discrete event simulation

    International Nuclear Information System (INIS)

    Garcia, H.E.

    2000-01-01

    Spent nuclear fuel handling and treatment often require facilities with a high level of operational complexity. Simulation models can reveal undesirable characteristics and production problems before they become readily apparent during system operations. The value of this approach is illustrated here through an operational study, using discrete event modeling techniques, to analyze the Fuel Conditioning Facility at Argonne National Laboratory and to identify enhanced nuclear waste treatment configurations. The modeling approach and results of what-if studies are discussed. An example on how to improve productivity is presented.

  14. Dependency models and probability of joint events

    International Nuclear Information System (INIS)

    Oerjasaeter, O.

    1982-08-01

    Probabilistic dependencies between components/systems are discussed with reference to a broad classification of potential failure mechanisms. Further, a generalized time-dependency model, based on conditional probabilities, for estimating the probability of joint events and event sequences is described. The applicability of this model is clarified and demonstrated through various examples. It is concluded that the described dependency model is a useful tool for solving a variety of practical problems concerning the probability of joint events and event sequences where common-cause and time-dependent failure mechanisms are involved. (Auth.)
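
    As a worked illustration of the conditional-probability decomposition on which such a model rests (using the common beta-factor form of dependency, which is our assumption rather than the report's exact formulation):

    \[
    P(A \cap B) = P(A)\,P(B \mid A), \qquad
    P(B \mid A) = \beta + (1 - \beta)\,P(B).
    \]

    For example, with \(P(A) = P(B) = 10^{-3}\) and \(\beta = 0.1\), one gets \(P(A \cap B) \approx 1.0 \times 10^{-4}\), two orders of magnitude above the value \(10^{-6}\) obtained under independence.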

  15. Multi-spacecraft observations and transport simulations of solar energetic particles for the May 17th 2012 event

    Science.gov (United States)

    Battarbee, M.; Guo, J.; Dalla, S.; Wimmer-Schweingruber, R.; Swalwell, B.; Lawrence, D. J.

    2018-05-01

    Context. The injection, propagation and arrival of solar energetic particles (SEPs) during eruptive solar events are an important and current research topic of heliospheric physics. During the largest solar events, particles may have energies up to a few GeV and sometimes even trigger ground-level enhancements (GLEs) at Earth. These large SEP events are best investigated through multi-spacecraft observations. Aims: We aim to study the first GLE event of solar cycle 24, from 17th May 2012, using data from multiple spacecraft (SOHO, GOES, MSL, STEREO-A, STEREO-B and MESSENGER). These spacecraft are located throughout the inner heliosphere, at heliocentric distances between 0.34 and 1.5 astronomical units (au), covering nearly the whole range of heliospheric longitudes. Methods: We present and investigate sub-GeV proton time profiles for the event at several energy channels, obtained via different instruments aboard the above spacecraft. We investigate issues caused by magnetic connectivity and present results of three-dimensional SEP propagation simulations. We gather virtual time profiles and perform qualitative and quantitative comparisons with observations, assessing longitudinal injection and transport effects as well as peak intensities. Results: We distinguish different time profile shapes for well-connected and weakly connected observers, and find our onset time analysis to agree with this distinction. At select observers, we identify an additional low-energy component of Energetic Storm Particles (ESPs). Using well-connected observers for normalisation, our simulations are able to accurately recreate both time profile shapes and peak intensities at multiple observer locations. Conclusions: This synergistic approach combining numerical modelling with multi-spacecraft observations is crucial for understanding the propagation of SEPs within the interplanetary magnetic field. Our novel analysis provides valuable proof of the ability to simulate SEP propagation

  16. Evaluation of 6 and 10 Year-Old Child Human Body Models in Emergency Events.

    Science.gov (United States)

    Gras, Laure-Lise; Stockman, Isabelle; Brolin, Karin

    2017-01-01

    Emergency events can influence a child's kinematics prior to a car crash, and thus its interaction with the restraint system. Numerical Human Body Models (HBMs) can help understand the behaviour of children in emergency events. The kinematic responses of two child HBMs, the MADYMO 6 and 10 year-old models, were evaluated and compared with child volunteer data during emergency events (braking and steering), with a focus on forehead and sternum displacements. The response of the 6 year-old HBM was similar to that of the 10 year-old HBM; however, both models responded differently from the volunteers. The forward and lateral displacements were within the range of the volunteer data up to approximately 0.3 s, but the HBMs' head and sternum then moved significantly downwards, while the volunteers experienced smaller displacements and tended to return to their initial posture. These HBMs, originally intended for crash simulations, are therefore not too stiff and could properly reproduce emergency events if, for instance, postural control were accounted for.

  17. Interferences and events on epistemic shifts in physics through computer simulations

    CERN Document Server

    Warnke, Martin

    2017-01-01

    Computer simulations are omnipresent media in today's knowledge production. For scientific endeavors such as the detection of gravitational waves and the exploration of subatomic worlds, simulations are essential; however, the epistemic status of computer simulations is rather controversial as they are neither just theory nor just experiment. Therefore, computer simulations have challenged well-established insights and common scientific practices as well as our very understanding of knowledge. This volume contributes to the ongoing discussion on the epistemic position of computer simulations in a variety of physical disciplines, such as quantum optics, quantum mechanics, and computational physics. Originating from an interdisciplinary event, it shows that accounts of contemporary physics can constructively interfere with media theory, philosophy, and the history of science.

  18. Discrete dynamic event tree modeling and analysis of nuclear power plant crews for safety assessment

    International Nuclear Information System (INIS)

    Mercurio, D.

    2011-01-01

    Current Probabilistic Risk Assessment (PRA) and Human Reliability Analysis (HRA) methodologies model the evolution of accident sequences in Nuclear Power Plants (NPPs) mainly based on logic trees. The evolution of these sequences is a result of the interactions between the crew and the plant; current PRA methodologies use simplified models of these complex interactions. In this study, the Accident Dynamic Simulator (ADS), a modeling framework based on the Discrete Dynamic Event Tree (DDET), has been used to simulate crew-plant interactions during potential accident scenarios in NPPs. In addition, an operator/crew model has been developed to treat the response of the crew to the plant. The 'crew model' is made up of three operators whose behavior is guided by a set of rules-of-behavior (which represents the knowledge and training of the operators) coupled with written and mental procedures. In addition, an approach for addressing crew timing variability in DDETs has been developed and implemented based on a set of HRA data from a simulator study. Finally, grouping techniques were developed and applied to the analysis of the scenarios generated by the crew-plant simulation. These techniques support the post-simulation analysis by grouping similar accident sequences, identifying the key contributing events, and quantifying the conditional probability of the groups. They are used to characterize the context of the crew actions in order to obtain insights for HRA. The model has been applied to the analysis of a Small Loss Of Coolant Accident (SLOCA) event for a Pressurized Water Reactor (PWR). The simulation results support an improved characterization of the performance conditions or context of operator actions, which can be used in an HRA in the analysis of the reliability of the actions. By providing information on the evolution of system indications, dynamics of cues, crew timing in performing procedure steps, situation
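
    The branching idea behind a DDET can be conveyed with a toy sketch: at each scheduled branch point the crew either succeeds or fails at a procedure step, and every combination of outcomes yields one accident sequence with a conditional probability. The steps and probabilities below are invented for illustration; ADS additionally advances a plant simulation between branchings.

    ```python
    from itertools import product

    branch_points = [("diagnose SLOCA", 0.95),   # (step, P(success)), invented
                     ("start HPI", 0.90),
                     ("isolate break", 0.85)]

    sequences = []
    for outcome in product([True, False], repeat=len(branch_points)):
        p = 1.0
        for (step, p_ok), ok in zip(branch_points, outcome):
            p *= p_ok if ok else (1.0 - p_ok)
        sequences.append((outcome, p))

    sequences.sort(key=lambda s: -s[1])          # dominant sequences first
    assert abs(sum(p for _, p in sequences) - 1.0) < 1e-12
    ```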

  19. Measurement of Forward-Backward Asymmetry of Simulated and Reconstructed $Z' \\to \\mu^{+}\\mu^{-}$ Events in CMS

    CERN Document Server

    Cousins, Robert; Valuev, Vyacheslav

    2005-01-01

    This note describes a fitting technique for measuring the forward-backward asymmetry A_FB of Z' --> mu+ mu- events. We extract A_FB from fully simulated and reconstructed events by using an unbinned maximum likelihood fit based on a probability density function of six observables. We illustrate the potential for the measured on-peak A_FB to be used to distinguish among various couplings of a Z', such as those associated with the Zssm, Zpsi, Zeta, Zchi, ZLRM, or ZALRM models. With 400 fb^-1 of integrated luminosity at CMS, one can distinguish between either a Zchi or ZALRM and one of the other four models with significance level alpha >= 3 sigma up to a Z' mass between 2.0 and 2.7 TeV. One can distinguish among the other four models with the same level of significance only up to a Z' mass of 1-1.5 TeV, whereas ZALRM and Zchi are indistinguishable for Z' masses >= 1 TeV.
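
    For intuition, a reduced one-observable version of such a fit can be written down: at leading order the angular density in c = cos(theta*) is f(c) = (3/8)(1 + c^2) + A_FB * c, and A_FB follows from an unbinned maximum-likelihood fit. The sketch below is illustrative only; the note's actual fit uses six observables.

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar

    rng = np.random.default_rng(3)

    def sample(a_fb, n):
        """Draw cos(theta*) values from f(c) by accept-reject sampling."""
        out = []
        while len(out) < n:
            c = rng.uniform(-1, 1)
            if rng.uniform(0, 1.25) < 0.375 * (1 + c * c) + a_fb * c:
                out.append(c)
        return np.array(out)

    def nll(a_fb, c):
        # Negative log-likelihood; f(c) integrates to 1 on [-1, 1].
        return -np.sum(np.log(0.375 * (1 + c * c) + a_fb * c))

    data = sample(0.1, 5000)                      # toy sample with A_FB = 0.1
    fit = minimize_scalar(nll, args=(data,), bounds=(-0.7, 0.7),
                          method="bounded")
    print(f"fitted A_FB = {fit.x:.3f}")
    ```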

  20. Future changes in extreme precipitation in the Rhine basin based on global and regional climate model simulations

    NARCIS (Netherlands)

    Pelt, van S.C.; Beersma, J.J.; Buishand, T.A.; Hurk, van den B.J.J.M.; Kabat, P.

    2012-01-01

    Probability estimates of the future change of extreme precipitation events are usually based on a limited number of available global climate model (GCM) or regional climate model (RCM) simulations. Since floods are related to heavy precipitation events, this restricts the assessment of flood risks.

  1. Kanban simulation model for production process optimization

    Directory of Open Access Journals (Sweden)

    Golchev Riste

    2015-01-01

    Full Text Available A long time has passed since the KANBAN system was established as an efficient method for coping with excessive inventory. Still, the possibilities for improving it through integration with other approaches should be investigated further. The basic research challenge of this paper is to present the benefits of KANBAN implementation supported by Discrete Event Simulation (DES). To that end, the basics of the KANBAN system are first presented, with emphasis on information and material flow, together with a methodology for implementing a KANBAN system. An analysis of combining simulation with this methodology is then presented. The paper concludes with a practical example showing that, by understanding the implementation methodology of the KANBAN system and the simulation methodology, a simulation model can be created that serves as a basis for a variety of experiments conducted within a short period of time, resulting in production process optimization.
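
    To make the KANBAN-plus-DES combination concrete, here is a minimal single-stage kanban loop written with the SimPy discrete-event library (assumed available): a fixed number of cards caps work-in-progress, production may only start when a card is free, and demand releases cards back upstream. All parameters are invented; this does not reproduce the paper's own model.

    ```python
    import simpy

    N_CARDS, PROC_TIME, DEMAND_GAP, HORIZON = 3, 4.0, 5.0, 200.0

    def producer(env, cards, goods):
        while True:
            yield cards.get(1)            # wait for a free kanban card
            yield env.timeout(PROC_TIME)  # processing time
            yield goods.put(1)            # finished item enters supermarket

    def customer(env, cards, goods, served):
        while True:
            yield env.timeout(DEMAND_GAP)
            yield goods.get(1)            # consume one item...
            yield cards.put(1)            # ...and release its card upstream
            served.append(env.now)

    env = simpy.Environment()
    cards = simpy.Container(env, capacity=N_CARDS, init=N_CARDS)
    goods = simpy.Container(env, capacity=N_CARDS, init=0)
    served = []
    env.process(producer(env, cards, goods))
    env.process(customer(env, cards, goods, served))
    env.run(until=HORIZON)
    print(f"items served: {len(served)}")
    ```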

  2. Modeling of episodic particulate matter events using a 3-D air quality model with fine grid: Applications to a pair of cities in the US/Mexico border

    Science.gov (United States)

    Choi, Yu-Jin; Hyde, Peter; Fernando, H. J. S.

    High (episodic) particulate matter (PM) events over the sister cities of Douglas (AZ) and Agua Prieta (Sonora), located on the US-Mexico border, were simulated using the 3D Eulerian air quality model MODELS-3/CMAQ. The best available input information was used for the simulations, with the pollution inventory specified on a fine grid. In spite of the inherent uncertainties associated with the emission inventory as well as the chemistry and meteorology of the air quality simulation tool, model evaluations showed acceptable PM predictions, while demonstrating the need to include the interaction between meteorology and emissions in an interactive mode, a capability currently unavailable in MODELS-3/CMAQ when dealing with PM. Sensitivity studies on boundary influence indicate an insignificant regional (advection) contribution of PM to the study area. The contribution of secondary particles to the occurrence of high PM events was negligible. High PM episodes in the study area are therefore purely local events that largely depend on local meteorological conditions. The major PM emission sources were identified as vehicular activity on unpaved/paved roads and wind-blown dust. The results will be of immediate utility in devising PM mitigation strategies for the study area, which is one of the US EPA-designated non-attainment areas with respect to PM.

  3. Hierarchical Context Modeling for Video Event Recognition.

    Science.gov (United States)

    Wang, Xiaoyang; Ji, Qiang

    2016-10-11

    Current video event recognition research remains largely target-centered. For real-world surveillance videos, target-centered event recognition faces great challenges due to large intra-class target variation, limited image resolution, and poor detection and tracking results. To mitigate these challenges, we introduce a context-augmented video event recognition approach. Specifically, we explicitly capture contexts at three levels: the image level, the semantic level, and the prior level. At the image level, we introduce two types of contextual features, appearance context features and interaction context features, to capture the appearance of context objects and their interactions with the target objects. At the semantic level, we propose a deep model based on a deep Boltzmann machine to learn event object representations and their interactions. At the prior level, we utilize two types of prior-level contexts: scene priming and dynamic cueing. Finally, we introduce a hierarchical context model that systematically integrates the contextual information at the different levels, so that contexts at all levels jointly contribute to event recognition. We evaluate the hierarchical context model on benchmark surveillance video datasets. Results show that incorporating contexts at each level improves event recognition performance, and jointly integrating the three levels of contexts through our hierarchical model achieves the best performance.

  4. Constrained optimization via simulation models for new product innovation

    Science.gov (United States)

    Pujowidianto, Nugroho A.

    2017-11-01

    We consider the problem of constrained optimization, where decision makers aim to optimize a primary performance measure while constraining secondary performance measures. This paper provides a brief overview of stochastically constrained optimization via discrete event simulation. Most review papers tend to be methodology-based; this review attempts to be problem-based, since decision makers may have already decided on the problem formulation. We consider constrained optimization models because there are usually constraints on secondary performance measures as trade-offs in new product development. The paper starts by laying out different possible methods and the reasons for using constrained optimization via simulation models. It then reviews different simulation optimization approaches to constrained optimization, depending on the number of decision variables, the type of constraints, and the decision makers' risk preferences in handling uncertainties.
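
    A bare-bones form of the problem class is sketched below: choose the design whose simulated primary measure is best among those whose simulated secondary measure satisfies the constraint. The toy "simulator" and all parameters are invented stand-ins for a discrete event simulation.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    def simulate(design, n_reps=200):
        """Return (mean throughput, mean cost) estimated over replications."""
        throughput = rng.normal(10 + 2 * design, 1.0, n_reps)
        cost = rng.normal(5 + 3 * design, 1.0, n_reps)
        return throughput.mean(), cost.mean()

    designs, budget = [0, 1, 2, 3], 12.0
    feasible = []
    for d in designs:
        perf, cost = simulate(d)
        if cost <= budget:                # secondary measure as a constraint
            feasible.append((perf, d))

    best_perf, best_design = max(feasible)  # best primary measure, feasible set
    ```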

  5. A semi-analytical foreshock model for energetic storm particle events inside 1 AU

    Directory of Open Access Journals (Sweden)

    Vainio Rami

    2014-02-01

    Full Text Available We have constructed a semi-analytical model of the energetic-ion foreshock of a CME-driven coronal/interplanetary shock wave responsible for the acceleration of large solar energetic particle (SEP) events. The model is based on the analytical model of diffusive shock acceleration of Bell (1978), appended with a temporal dependence of the cut-off momentum of the energetic particles accelerated at the shock, derived from the theory. Parameters of the model are re-calibrated using a fully time-dependent, self-consistent simulation model of the coupled particle acceleration and Alfvén-wave generation upstream of the shock. Our results show that analytical estimates of the cut-off energy resulting from the simplified theory, frequently used in SEP modelling, overestimate the cut-off momentum at the shock by one order of magnitude. We also show that the cut-off momentum observed remotely far upstream of the shock (e.g., at 1 AU) can be used to infer the properties of the foreshock and the resulting energetic storm particle (ESP) event while the shock is still at small distances from the Sun, inaccessible to in-situ observations. Our results can be used in ESP event modelling for future missions to the inner heliosphere, like Solar Orbiter and Solar Probe Plus, as well as in developing acceleration models for SEP events in the solar corona.

  6. Simulating the physiology of athletes during endurance sports events: modelling human energy conversion and metabolism

    NARCIS (Netherlands)

    van Beek, J.H.G.M.; Supandi, F.B.; Gavai, Anand; de Graaf, A.A.; Binsl, T.W.; Hettling, H.

    2011-01-01

    The human physiological system is stressed to its limits during endurance sports competition events. We describe a whole-body computational model for energy conversion during bicycle racing. About 23 per cent of the metabolic energy is used for muscle work; the rest is converted to heat.

  7. A Monte Carlo study on event-by-event transverse momentum fluctuation at RHIC

    International Nuclear Information System (INIS)

    Xu Mingmei

    2005-01-01

    The experimental observation of the multiplicity dependence of event-by-event transverse momentum fluctuation in relativistic heavy ion collisions is studied using Monte Carlo simulation. It is found that the Monte Carlo generator HIJING is unable to describe the experimental phenomenon well. A simple Monte Carlo model is proposed, which can recover the data and thus shed some light on the dynamical origin of the multiplicity dependence of event-by-event transverse momentum fluctuation. (authors)
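
    The flavour of such a toy Monte Carlo can be captured in a few lines: sample events of fixed multiplicity from an assumed exponential pT spectrum and watch the event-by-event dispersion of the mean pT shrink with multiplicity. This is only an independent-emission baseline, not HIJING or the authors' model.

      import numpy as np

      rng = np.random.default_rng(42)
      mean_pt = 0.4  # GeV/c, assumed inverse slope of the exponential spectrum

      for multiplicity in (10, 50, 200):
          events = rng.exponential(mean_pt, size=(20000, multiplicity))
          event_mean_pt = events.mean(axis=1)
          # For independent emission the dispersion scales like 1/sqrt(N);
          # dynamical fluctuations would show up as deviations from this.
          print(multiplicity, event_mean_pt.std())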

  8. An event-based model for contracts

    Directory of Open Access Journals (Sweden)

    Tiziana Cimoli

    2013-02-01

    We introduce a basic model for contracts. Our model extends event structures with a new relation, which faithfully captures the circular dependencies among contract clauses. We establish whether an agreement exists which respects all the contracts at hand (i.e., whether all the dependencies can be resolved), and we detect the obligations of each participant. The main technical contribution is a correspondence between our model and a fragment of the contract logic PCL. More precisely, we show that the reachable events are exactly those which correspond to provable atoms in the logic. Despite this strong correspondence, our model improves on previous work on PCL by exhibiting a finer-grained notion of culpability, which takes into account the legitimate orderings of events.

  9. A Two-Account Life Insurance Model for Scenario-Based Valuation Including Event Risk

    Directory of Open Access Journals (Sweden)

    Ninna Reitzel Jensen

    2015-06-01

    Using a two-account model with event risk, we model life insurance contracts taking into account both guaranteed and non-guaranteed payments in participating life insurance as well as in unit-linked insurance. Here, event risk is used as a generic term for life insurance events, such as death, disability, etc. In our treatment of participating life insurance, we place special focus on the bonus schemes “consolidation” and “additional benefits”, and one goal is to formalize how these work and interact. Another goal is to describe similarities and differences between participating life insurance and unit-linked insurance. By use of a two-account model, we are able to illustrate general concepts without making the model too abstract. To allow for complicated financial markets without dramatically increasing the mathematical complexity, we focus on economic scenarios. We illustrate the use of our model by conducting scenario analysis based on Monte Carlo simulation, but the model applies to scenarios in general and to worst-case and best-estimate scenarios in particular. In addition to easy computations, our model offers a common framework for the valuation of life insurance payments across product types. This enables comparison of participating life insurance products and unit-linked insurance products, thus building a bridge between the two different ways of formalizing life insurance products. Finally, our model distinguishes itself from the existing literature by taking into account the Markov model for the state of the policyholder and, hereby, facilitating event risk.
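
    A heavily simplified numerical sketch of the two-account idea, assuming a single terminal payment, no mortality or other event risk, and lognormal scenario returns (all parameters invented): one account accrues a guaranteed rate while the other follows the scenarios, and the payout is the larger of the two.

      import numpy as np

      rng = np.random.default_rng(0)
      years, n_scenarios = 20, 100_000
      premium, r_guaranteed, r_discount = 100.0, 0.015, 0.02
      mu, sigma = 0.05, 0.15

      guaranteed = premium * (1 + r_guaranteed) ** years
      log_returns = rng.normal(mu - 0.5 * sigma**2, sigma, size=(n_scenarios, years))
      unit_linked = premium * np.exp(log_returns.sum(axis=1))

      payout = np.maximum(guaranteed, unit_linked)       # guarantee acts as a floor
      present_value = payout.mean() * np.exp(-r_discount * years)
      print(f"Monte Carlo value of the floored payout: {present_value:.2f}")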

  10. SPATKIN: a simulator for rule-based modeling of biomolecular site dynamics on surfaces.

    Science.gov (United States)

    Kochanczyk, Marek; Hlavacek, William S; Lipniacki, Tomasz

    2017-11-15

    Rule-based modeling is a powerful approach for studying biomolecular site dynamics. Here, we present SPATKIN, a general-purpose simulator for rule-based modeling in two spatial dimensions. The simulation algorithm is a lattice-based method that tracks Brownian motion of individual molecules and the stochastic firing of rule-defined reaction events. Because rules are used as event generators, the algorithm is network-free, meaning that it does not require generating the complete reaction network implied by the rules prior to simulation. In a simulation, each molecule (or complex of molecules) is taken to occupy a single lattice site that cannot be shared with another molecule (or complex). SPATKIN is capable of simulating a wide array of membrane-associated processes, including adsorption, desorption and crowding. Models are specified using an extension of the BioNetGen language, which makes it possible to account for spatial features of the simulated process. The C++ source code for SPATKIN is distributed freely under the terms of the GNU GPLv3 license and can be compiled for execution on popular platforms (Windows, Mac and Linux); an installer for 64-bit Windows and a macOS app are also available. The source code and precompiled binaries are available at the SPATKIN web site (http://pmbm.ippt.pan.pl/software/spatkin).
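
    The single-occupancy lattice rule at the heart of the algorithm is easy to illustrate: a diffusion move is rejected whenever the target site is occupied, which is precisely how crowding arises. This toy loop shows the concept only; it is not SPATKIN's actual code or its rule engine.

      import random

      SIZE, N_MOLECULES, STEPS = 50, 400, 10_000
      occupied = set()
      while len(occupied) < N_MOLECULES:
          occupied.add((random.randrange(SIZE), random.randrange(SIZE)))
      molecules = list(occupied)

      for _ in range(STEPS):
          i = random.randrange(N_MOLECULES)
          x, y = molecules[i]
          dx, dy = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
          target = ((x + dx) % SIZE, (y + dy) % SIZE)
          if target not in occupied:          # single occupancy: reject if crowded
              occupied.remove((x, y))
              occupied.add(target)
              molecules[i] = target
      print(f"{N_MOLECULES} molecules diffused on a {SIZE}x{SIZE} periodic lattice")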

  11. Evaluation on the cost-effective threshold of osteoporosis treatment on elderly women in China using discrete event simulation model.

    Science.gov (United States)

    Ni, W; Jiang, Y

    2017-02-01

    This study used a simulation model to determine the cost-effective threshold of fracture risk for treating osteoporosis among elderly Chinese women. Osteoporosis treatment is cost-effective among average-risk women who are at least 75 years old and above-average-risk women who are younger than 75. Aging of the Chinese population is imposing an increasing economic burden of osteoporosis, and this study evaluated the cost-effectiveness of osteoporosis treatment among senior Chinese women. A discrete event simulation model using age-specific probabilities of hip fracture, clinical vertebral fracture, wrist fracture, humerus fracture, and other fractures; costs (2015 US dollars); and quality-adjusted life years (QALYs) was used to assess the cost-effectiveness of osteoporosis treatment. The incremental cost-effectiveness ratio (ICER) was calculated and compared with the willingness to pay (WTP) for a QALY in China to decide cost-effectiveness. To determine the absolute 10-year hip fracture probability at which osteoporosis treatment became cost-effective, average age-specific probabilities for all fractures were multiplied by a relative risk (RR) that was systematically varied from 0 to 10 until the WTP threshold was reached for treatment relative to no intervention. Sensitivity analyses were also performed to evaluate the impacts of WTP and annual treatment costs. In the baseline analysis, simulated ICERs were higher than the WTP threshold among Chinese women younger than 75, but much lower than the WTP among the older population. Sensitivity analyses indicated that cost-effectiveness could vary with a higher WTP threshold or a lower annual treatment cost: a 30% increase in WTP or a 30% reduction in annual treatment costs would make osteoporosis treatment cost-effective for Chinese women aged 55 to 85. The current study provides evidence that osteoporosis treatment is cost-effective among a subpopulation of elderly Chinese women.
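
    The decision rule described above boils down to comparing an incremental cost-effectiveness ratio, ICER = (C1 - C0) / (E1 - E0), against a willingness-to-pay threshold while scaling fracture risk by a relative risk RR. The cost and QALY expressions below are invented placeholders that merely show the mechanics of the RR scan.

      WTP = 20_000.0   # hypothetical willingness to pay per QALY gained

      def cost_effective(rr):
          # Placeholder incremental outcomes: higher fracture risk (rr) means
          # treatment averts more cost and gains more QALYs, so the ICER falls.
          delta_cost = 3_000.0 - 250.0 * rr   # incremental cost of treatment
          delta_qaly = 0.01 * rr              # incremental QALYs gained
          return delta_cost / delta_qaly <= WTP

      for k in range(1, 21):                  # scan RR from 0.5 to 10
          rr = 0.5 * k
          if cost_effective(rr):
              print(f"treatment becomes cost-effective at RR = {rr}")
              break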

  12. Analysis of nucleation events in the European boundary layer using the regional aerosol–climate model REMO-HAM with a solar radiation-driven OH-proxy

    Directory of Open Access Journals (Sweden)

    J.-P. Pietikäinen

    2014-11-01

    This work describes improvements to the regional aerosol–climate model REMO-HAM that allow a more realistic simulation of atmospheric new particle formation (NPF). First, a new scheme was implemented to simulate OH radical concentrations using a proxy approach based on observations, also accounting for the effects of clouds upon OH concentrations. Second, the nucleation rate calculation was modified to directly simulate the formation rates of 3 nm particles, which removes some unnecessary steps from the formation rate calculations used earlier in the model. Using the updated model version, NPF over Europe was simulated for the periods 2003–2004 and 2008–2009. The statistics of the simulated particle formation events were subsequently compared to observations from 13 ground-based measurement sites. The new model shows improved agreement with the observed NPF rates compared to former versions and can simulate the event statistics realistically for most parts of Europe.

  13. Structured Event-B Models and Proofs

    DEFF Research Database (Denmark)

    Hallerstede, Stefan

    2010-01-01

    Event-B does not provide specific support for the modelling of problems that require some structuring, such as local variables or sequential ordering of events. All variables need to be declared globally, and sequential ordering of events can only be achieved by abstract program counters.

  14. Foundations for Streaming Model Transformations by Complex Event Processing.

    Science.gov (United States)

    Dávid, István; Ráth, István; Varró, Dániel

    2018-01-01

    Streaming model transformations represent a novel class of transformations for manipulating models whose elements are continuously produced or modified in high volume and with a rapid rate of change. Executing streaming transformations requires efficient techniques to recognize activated transformation rules over a live model and a potentially infinite stream of events. In this paper, we propose foundations for streaming model transformations that integrate incremental model queries, complex event processing (CEP) and reactive (event-driven) transformation techniques. Complex event processing makes it possible to identify relevant patterns and sequences of events over an event stream. Our approach enables event streams to include model change events, which are automatically and continuously populated by incremental model queries; a reactive rule engine then carries out transformations on identified complex event patterns. We provide an integrated domain-specific language with precise semantics for capturing complex event patterns and streaming transformations, together with an execution engine, all of which is now part of the Viatra reactive transformation framework. We demonstrate the feasibility of our approach with two case studies: one in an advanced model engineering workflow, and one in the context of on-the-fly gesture recognition.

  15. Galaxy Alignments: Theory, Modelling & Simulations

    Science.gov (United States)

    Kiessling, Alina; Cacciato, Marcello; Joachimi, Benjamin; Kirk, Donnacha; Kitching, Thomas D.; Leonard, Adrienne; Mandelbaum, Rachel; Schäfer, Björn Malte; Sifón, Cristóbal; Brown, Michael L.; Rassat, Anais

    2015-11-01

    The shapes of galaxies are not randomly oriented on the sky. During the galaxy formation and evolution process, environment has a strong influence, as tidal gravitational fields in the large-scale structure tend to align nearby galaxies. Additionally, events such as galaxy mergers affect the relative alignments of both the shapes and angular momenta of galaxies throughout their history. These "intrinsic galaxy alignments" are known to exist, but are still poorly understood. This review will offer a pedagogical introduction to the current theories that describe intrinsic galaxy alignments, including the apparent difference in intrinsic alignment between early- and late-type galaxies and the latest efforts to model them analytically. It will then describe the ongoing efforts to simulate intrinsic alignments using both N-body and hydrodynamic simulations. Due to the relative youth of this field, there is still much to be done to understand intrinsic galaxy alignments and this review summarises the current state of the field, providing a solid basis for future work.

  16. An observational and modeling study of the August 2017 Florida climate extreme event.

    Science.gov (United States)

    Konduru, R.; Singh, V.; Routray, A.

    2017-12-01

    A special report on climate extremes by the Intergovernmental Panel on Climate Change (IPCC) elucidates that the root cause of disasters is the exposure and vulnerability of human and natural systems to climate extremes. The cause of a given climate extreme may be anthropogenic or non-anthropogenic, so it is challenging to discern the critical factor of influence for a particular event; studying such events with reasonable confidence is possible only where past case studies exist. A climate extreme of comparable rarity was encountered in the case of the Houston floods and the extreme rainfall over Florida in August 2017, when a succession of hurricanes, Harvey and Irma, struck the Florida region and caused catastrophe. The rarity of the August 2017 Florida climate extreme event calls for an in-depth study of this case. Understanding the multi-faceted nature of the event requires studying the development of Hurricane Harvey and its progression and dynamics, and the current article focuses on an observational and modeling study of the Harvey hurricane. A global model, NCUM (the global UK Met Office Unified Model (UM) operational at the National Centre for Medium Range Weather Forecasting, India), was utilized to simulate the Harvey hurricane. The simulated rainfall and wind fields were compared with observational datasets such as Tropical Rainfall Measuring Mission rainfall data and ERA-Interim wind fields. The National Centers for Environmental Prediction (NCEP) automated tracking system was utilized to track the Harvey hurricane, and the tracks from different forecasts were analyzed statistically against the Joint Typhoon Warning Center track of the hurricane. The current study will be continued to investigate the atmospheric processes involved in the August 2017 Florida climate extreme event.

  17. Modeling and evaluation of urban pollution events of atmospheric heavy metals from a large Cu-smelter.

    Science.gov (United States)

    Chen, Bing; Stein, Ariel F; Castell, Nuria; Gonzalez-Castanedo, Yolanda; Sanchez de la Campa, A M; de la Rosa, J D

    2016-01-01

    Metal smelting and processing are highly polluting activities that strongly influence the levels of heavy metals in air, soil, and crops. We employ an atmospheric transport and dispersion model to predict the pollution levels originating from the second largest Cu-smelter in Europe. The model predicts that the concentrations of copper (Cu), zinc (Zn), and arsenic (As) in an urban area close to the Cu-smelter can reach 170, 70, and 30 ng m−3, respectively. The model captures all the observed urban pollution events, but the magnitudes of the elemental concentrations are predicted to be lower than the observed values of ~300, ~500, and ~100 ng m−3 for Cu, Zn, and As, respectively. The comparison between model and observations shows an average correlation coefficient of 0.62 ± 0.13. The simulation shows that the transport of heavy metals over the urban area reaches a peak in the afternoon. The under-prediction of the peak is explained by simulated winds that are stronger than the monitoring data indicate; the stronger simulated winds enhance the transport and dispersion of heavy metals to the regional area, diminishing the impact of pollution events in the urban area. This model, driven by high-resolution meteorology (2 km horizontal resolution), predicts the hourly evolution of atmospheric heavy metal pollution in the urban area close to this industrial hotspot.

  18. Application of declarative modeling approaches for external events

    International Nuclear Information System (INIS)

    Anoba, R.C.

    2005-01-01

    Probabilistic Safety Assessments (PSAs) are increasingly being used as a tool for supporting the acceptability of design, procurement, construction, operation, and maintenance activities at nuclear power plants. Since the issuance of Generic Letter 88-20 and the subsequent IPE/IPEEE assessments, the NRC has issued several Regulatory Guides, such as RG 1.174, describing the use of PSA in risk-informed regulation activities. Most PSAs have the capability to address internal events, including internal floods. As more demands are placed on the PSA to support risk-informed applications, there has been a growing need to integrate other external events (seismic, fire, etc.) into the logic models. Most external events involve spatial dependencies and usually impact the logic models at the component level. Therefore, manual insertion of external event impacts into a complex integrated fault tree model may be too cumbersome for routine use of the PSA. Within the past year, a declarative modeling approach has been developed to automate the injection of external events into the PSA, and the intent of this paper is to introduce the concept of declarative modeling in the context of external event applications. A declarative modeling approach involves the definition of rules for injecting external event impacts into the fault tree logic. A software tool such as EPRI's XInit program can be used to interpret the pre-defined rules and automatically inject external event elements into the PSA. The injection process can easily be repeated, as required, to address plant changes, sensitivity issues, changes in boundary conditions, etc. External event elements may include fire initiating events, seismic initiating events, seismic fragilities, fire-induced hot short events, special human failure events, etc. This approach has been applied at a number of US nuclear power plants, as well as at a nuclear power plant in Romania. (authors)

  19. A class of Box-Cox transformation models for recurrent event data.

    Science.gov (United States)

    Sun, Liuquan; Tong, Xingwei; Zhou, Xian

    2011-04-01

    In this article, we propose a class of Box-Cox transformation models for recurrent event data, which includes the proportional means models as special cases. The new model offers great flexibility in formulating the effects of covariates on the mean functions of counting processes while leaving the stochastic structure completely unspecified. For inference on the proposed models, we apply a profile pseudo-partial likelihood method to estimate the model parameters via estimating equation approaches, establish large-sample properties of the estimators, and examine their performance in moderate-sized samples through simulation studies. In addition, some graphical and numerical procedures are presented for model checking. An application to a set of multiple-infection data taken from a clinical study on chronic granulomatous disease (CGD) is also illustrated.
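
    For reference, the Box-Cox family that gives the model class its name, with the logarithm recovered in the limit lambda -> 0 (lambda = 1 corresponds, up to a shift, to the proportional means special case); a minimal numerical illustration only:

      import math

      def box_cox(x, lam):
          """g(x; lambda) = (x**lambda - 1) / lambda, with log(x) at lambda = 0."""
          if lam == 0:
              return math.log(x)
          return (x**lam - 1.0) / lam

      for lam in (0, 0.5, 1.0):
          print(lam, box_cox(2.0, lam))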

  1. Using discrete-event simulation in strategic capacity planning for an outpatient physical therapy service.

    Science.gov (United States)

    Rau, Chi-Lun; Tsai, Pei-Fang Jennifer; Liang, Sheau-Farn Max; Tan, Jhih-Cian; Syu, Hong-Cheng; Jheng, Yue-Ling; Ciou, Ting-Syuan; Jaw, Fu-Shan

    2013-12-01

    This study uses a simulation model as a tool for strategic capacity planning for an outpatient physical therapy clinic in Taipei, Taiwan. The clinic provides a wide range of physical treatments, with six full-time therapists in each session. We constructed a discrete-event simulation model to study the dynamics of patient mixes with realistic treatment plans and to estimate the practical capacity of the physical therapy room. Changes in time-related and space-related performance measurements were used to evaluate the impact of various strategies on the capacity of the clinic. The simulation results confirmed that the clinic is extremely patient-oriented, with a bottleneck occurring at the traction units for Intermittent Pelvic Traction (IPT), with usage at 58.9%. Sensitivity analysis showed that attending to more patients would significantly increase the number of patients staying for overtime sessions. We found that pooling the therapists produced beneficial results: the average waiting time per patient could be reduced by 45% when two therapists were pooled. We also found that treating up to 12 new patients per session had no significant negative impact on returning patients, and that the average waiting time for new patients decreased if they were given priority over returning patients when called by the therapists.
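
    The pooling result has a simple queueing intuition that a compact discrete-event sketch can reproduce: patients arrive at random and are served by the earliest-free therapist in a pool. The arrival and service rates below are illustrative stand-ins, not the clinic's data.

      import heapq
      import random

      def mean_wait(n_therapists, arrival_rate, service_rate=0.12,
                    n_patients=20_000, seed=1):
          random.seed(seed)
          t, waits = 0.0, []
          free_at = [0.0] * n_therapists        # times at which each therapist frees up
          heapq.heapify(free_at)
          for _ in range(n_patients):
              t += random.expovariate(arrival_rate)           # next arrival
              start = max(t, heapq.heappop(free_at))          # earliest-free server
              waits.append(start - t)
              heapq.heappush(free_at, start + random.expovariate(service_rate))
          return sum(waits) / len(waits)

      # Same load per therapist, but pooling two servers cuts the average wait.
      print("1 therapist :", mean_wait(1, arrival_rate=0.1))
      print("2 pooled    :", mean_wait(2, arrival_rate=0.2))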

  2. Debris flow run-out simulation and analysis using a dynamic model

    Science.gov (United States)

    Melo, Raquel; van Asch, Theo; Zêzere, José L.

    2018-02-01

    Only two months after a huge forest fire occurred in the upper part of a valley located in central Portugal, several debris flows were triggered by intense rainfall. The event caused infrastructural and economic damage, although no lives were lost. The present research aims to simulate the run-out of two debris flows that occurred during the event and to calculate, via back-analysis, the rheological parameters and the excess rain involved. A dynamic model was used which integrates surface runoff, concentrated erosion along the channels, and the propagation and deposition of flow material. The model was then validated using 32 debris flows triggered during the same event that were not considered for calibration. The rheological and entrainment parameters obtained for the most accurate simulation were then used to compute three scenarios of debris flow run-out at the basin scale. The results were compared with the buildings exposed in the study area, and the worst-case scenario showed a potential inundation that may affect 345 buildings. In addition, six streams where debris flows occurred in the past and caused material damage and loss of life were identified.

  3. Measurement-Based Hybrid Fluid-Flow Models for Fast Multi-Scale Simulation and Control

    National Research Council Canada - National Science Library

    Sohraby, Khosrow

    2004-01-01

    .... We point out that traditional queuing models are intractable or provide a poor fit to real-life networks, while discrete-event simulation at the packet level can consume prohibitive amounts of CPU time...

  4. Computer simulation of thermal-hydraulic transient events in multi-circuits with multipumps

    International Nuclear Information System (INIS)

    Veloso, Marcelo Antonio

    2003-01-01

    PANTERA-2 (from Programa para Analise Termo-hidraulica de Reatores a Agua - Program for Thermal-hydraulic Analysis of Water Reactors, Version 2), whose fundamentals are described in this work, is intended to carry out rod bundle subchannel analysis in conjunction with multiloop simulation. It solves simultaneously the conservation equations of mass, axial and lateral momentum, and energy for subchannel geometry, coupled with the balance equations that describe the fluid flows in any number of coolant loops connected to a pressure vessel containing the rod bundle. As far as subchannel analysis is concerned, the basic computational strategy of PANTERA-2 comes from the COBRA codes, but an alternative implicit solution method oriented to the pressure field has been used to solve the finite difference approximations of the balance laws. The results provided by the subchannel model comprise the fluid density, enthalpy, flow rate, and pressure fields in the subchannels. The loop model predicts the individual loop flows, total flow through the pressure vessel, and pump rotational speeds as a function of time subsequent to the failure of any number of the coolant pumps. The flow transients in the loops may be initiated by partial, total or sequential loss of electric power to the operating pumps. Transient events caused by either shaft break or rotor locking may also be simulated. The changes in rotational speed of the pumps as a function of time are determined from a torque balance. Pump dynamic head and hydraulic torque are calculated as a function of rotational speed and volumetric flow from two polar homologous curves supplied to the code in tabular form. In order to illustrate the analytical capability of PANTERA-2, three sample problems are presented and discussed. Comparisons between calculated and measured results indicate that the program reproduces with good accuracy experimental data for subchannel exit temperatures and critical heat fluxes in 5x5 rod bundles.

  5. Event-based aquifer-to-atmosphere modeling over the European CORDEX domain

    Science.gov (United States)

    Keune, J.; Goergen, K.; Sulis, M.; Shrestha, P.; Springer, A.; Kusche, J.; Ohlwein, C.; Kollet, S. J.

    2014-12-01

    Despite the fact that recent studies focus on the impact of soil moisture on climate and especially on land-energy feedbacks, groundwater dynamics are often neglected or conceptual groundwater flow models are used. In particular, in the context of climate change and the occurrence of droughts and floods, a better understanding and an improved simulation of the physical processes involving groundwater on continental scales is necessary. This requires the implementation of a physically consistent terrestrial modeling system which explicitly incorporates groundwater dynamics and the connection with shallow soil moisture. Such a physics-based system enables simulation and monitoring of groundwater storage and enhanced representations of the terrestrial energy and hydrologic cycles over long time periods. On shorter timescales, predictions of groundwater-related extremes, such as floods and droughts, are expected to improve because of the improved simulation of the components of the hydrological cycle. In this study, we present a fully coupled aquifer-to-atmosphere modeling system over the European CORDEX domain. The integrated Terrestrial Systems Modeling Platform, TerrSysMP, consisting of the three-dimensional subsurface model ParFlow, the Community Land Model CLM3.5 and the numerical weather prediction model COSMO of the German Weather Service, is used. The system is set up with a spatial resolution of 0.11° (12.5 km) and closes the terrestrial water and energy cycles from aquifers into the atmosphere. Simulations of the fully coupled system are performed over events, such as the 2013 flood in Central Europe and the 2003 European heat wave, and over extended time periods on the order of 10 years. State and flux variables of the terrestrial hydrologic and energy cycles are analyzed and compared to both in situ observations (e.g., stream and water level gauge networks, FLUXNET) and remotely sensed observations (e.g., GRACE, ESA CCI ECV soil moisture and SMOS).

  6. Developing future precipitation events from historic events: An Amsterdam case study.

    Science.gov (United States)

    Manola, Iris; van den Hurk, Bart; de Moel, Hans; Aerts, Jeroen

    2016-04-01

    Due to climate change, the frequency and intensity of extreme precipitation events are expected to increase. It is therefore of high importance to develop climate change scenarios tailored towards the local and regional needs of policy makers, in order to develop efficient adaptation strategies that reduce the risks from extreme weather events. Current approaches to tailoring climate scenarios are often not well adopted in hazard management, since average changes in climate are not a main concern of policy makers, and tailoring climate scenarios to simulate future extremes can be complex. Therefore, a new concept has been introduced recently that uses known historic extreme events as a basis and modifies the observed data for these events so that the outcome shows how the same event would unfold in a warmer climate. This concept is introduced as 'Future Weather' and appeals to the experience of stakeholders and users. This research presents a novel method of projecting a future extreme precipitation event based on a historic event. The selected precipitation event took place over the broader area of Amsterdam, the Netherlands in the summer of 2014 and resulted in blocked highways, disruption of air transportation, and flooded buildings and public facilities. An analysis of rain monitoring stations showed that an event of such intensity has a 5- to 15-year return period. The method of projecting a future event follows a non-linear delta transformation that is applied directly to the observed event, assuming a warmer climate, to produce an "up-scaled" future precipitation event. The delta transformation is based on the observed behaviour of precipitation intensity as a function of dew point temperature during summers. The outcome is then compared to a benchmark method using the HARMONIE numerical weather prediction model, in which the boundary conditions of the event from the Ensemble Prediction System of ECMWF (ENS) are perturbed to represent a warmer climate.
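
    A minimal stand-in for the delta-transformation idea, assuming a Clausius-Clapeyron-like scaling of roughly 7% per kelvin of dew-point warming (the paper derives its own non-linear, observation-based scaling, which this simple rule only approximates):

      def transform_event(observed_mm_per_h, delta_dewpoint_k, cc_rate=0.07):
          """Scale an observed precipitation series to a warmer climate."""
          factor = (1.0 + cc_rate) ** delta_dewpoint_k
          return [p * factor for p in observed_mm_per_h]

      event = [0.5, 4.2, 12.0, 30.5, 8.1]   # hypothetical hourly intensities (mm/h)
      print(transform_event(event, delta_dewpoint_k=2.0))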

  7. Modeling for operational event risk assessment

    International Nuclear Information System (INIS)

    Sattison, M.B.

    1997-01-01

    The U.S. Nuclear Regulatory Commission has been using risk models to evaluate the risk significance of operational events in U.S. commercial nuclear power plants for more than seventeen years. During that time, the models have evolved in response to advances in risk assessment technology and insights gained with experience. Evaluation techniques fall into two categories: initiating event assessments and condition assessments. The models used for these analyses have become uniquely specialized for just this purpose.

  8. Durango: Scalable Synthetic Workload Generation for Extreme-Scale Application Performance Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Carothers, Christopher D. [Rensselaer Polytechnic Institute (RPI); Meredith, Jeremy S. [ORNL; Blanco, Marc [Rensselaer Polytechnic Institute (RPI); Vetter, Jeffrey S. [ORNL; Mubarak, Misbah [Argonne National Laboratory; LaPre, Justin [Rensselaer Polytechnic Institute (RPI); Moore, Shirley V. [ORNL

    2017-05-01

    Performance modeling of extreme-scale applications on accurate representations of potential architectures is critical for designing next-generation supercomputing systems, because it is impractical to construct prototype systems at scale with new network hardware in order to explore designs and policies. However, these simulations often rely on static application traces that can be difficult to work with because of their size and their lack of flexibility to extend or scale up without rerunning the original application. To address this problem, we have created a new technique for generating scalable, flexible workloads from real applications, and we have implemented a prototype, called Durango, that combines a proven analytical performance modeling language, Aspen, with the massively parallel HPC network modeling capabilities of the CODES framework. Our models are compact, parameterized and representative of real applications with computation events. They are not resource-intensive to create and are portable across simulator environments. We demonstrate the utility of Durango by simulating the LULESH application in the CODES simulation environment on several topologies and show that Durango is practical to use for simulation without loss of fidelity, as quantified by simulation metrics. During our validation of Durango's generated communication model of LULESH, we found that the original LULESH miniapp code had a latent bug where the MPI_Waitall operation was used incorrectly. This finding underscores the potential need for a tool such as Durango, beyond its benefits for flexible workload generation and modeling. Additionally, we demonstrate the efficacy of Durango's direct integration approach, which links Aspen into CODES as part of the running network simulation model: Aspen generates the application-level computation timing events, which in turn drive the start of a network communication phase. Results show that Durango's performance scales well.

  9. Application of RADSAFE to Model Single Event Upset Response of a 0.25 micron CMOS SRAM

    Science.gov (United States)

    Warren, Kevin M.; Weller, Robert A.; Sierawski, Brian; Reed, Robert A.; Mendenhall, Marcus H.; Schrimpf, Ronald D.; Massengill, Lloyd; Porter, Mark; Wilkerson, Jeff; LaBel, Kenneth A.

    2006-01-01

    The RADSAFE simulation framework is described and applied to model Single Event Upsets (SEU) in a 0.25 micron CMOS 4Mbit Static Random Access Memory (SRAM). For this circuit, the RADSAFE approach produces trends similar to those expected from classical models, but more closely represents the physical mechanisms responsible for SEU in the SRAM circuit.

  10. Model for Simulating a Spiral Software-Development Process

    Science.gov (United States)

    Mizell, Carolyn; Curley, Charles; Nayak, Umanath

    2010-01-01

    A discrete-event simulation model, and a computer program that implements the model, have been developed as a means of analyzing a spiral software-development process. This model can be tailored to specific development environments for use by software project managers in making quantitative cases for deciding among different software-development processes, courses of action, and cost estimates. A spiral process can be contrasted with a waterfall process, which is a traditional process consisting of a sequence of activities that include analysis of requirements, design, coding, testing, and support. A spiral process is an iterative process that can be regarded as a repeating, modified waterfall process. Each iteration includes assessment of risk, analysis of requirements, design, coding, testing, delivery, and evaluation. A key difference between a spiral and a waterfall process is that a spiral process can accommodate changes in requirements at each iteration, whereas in a waterfall process, requirements are considered to be fixed from the beginning; a waterfall process is therefore not flexible enough for some projects, especially those in which requirements are not known at the beginning or may change during development. For a given project, a spiral process may cost more and take more time than a waterfall process, but may better satisfy a customer's expectations and needs. Models for simulating various waterfall processes have been developed previously, but until now there have been no models for simulating spiral processes. The present spiral-process-simulating model and the software that implements it were developed by extending a discrete-event simulation process model of the IEEE 12207 Software Development Process, which was built using commercially available software known as the Process Analysis Tradeoff Tool (PATT). Typical inputs to PATT models include industry-average values of product size (expressed as number of lines of code).

  11. Discrete Event Simulation of Distributed Team Communication

    Science.gov (United States)

    2012-03-22

    performs, and auditory information that is provided through multiple audio devices with speech response. This paper extends previous discrete event workload... 2008, pg. 1) notes that “Architecture modeling furnishes abstractions for use in managing complexities, allowing engineers to visualise the proposed

  12. Statistical modelling for recurrent events: an application to sports injuries.

    Science.gov (United States)

    Ullah, Shahid; Gabbett, Tim J; Finch, Caroline F

    2014-09-01

    Injuries are often recurrent, with subsequent injuries influenced by previous occurrences, and hence correlation between events needs to be taken into account when analysing such data. This paper compares five different survival models for the analysis of recurrent injury data: the Cox proportional hazards (CoxPH) model and the following generalisations to recurrent event data: Andersen-Gill (A-G), frailty, Wei-Lin-Weissfeld total time (WLW-TT) marginal, and Prentice-Williams-Peterson gap time (PWP-GT) conditional models. Empirical evaluation and comparison of the different models were performed using model selection criteria and goodness-of-fit statistics, and simulation studies assessed the size and power of each model fit. The modelling approach is demonstrated through direct application to Australian National Rugby League recurrent injury data collected over the 2008 playing season. Of the 35 players analysed, 14 (40%) had more than one injury, and 47 contact injuries were sustained over 29 matches. The CoxPH model provided the poorest fit to the recurrent sports injury data. The fit was improved with the A-G and frailty models, compared to the WLW-TT and PWP-GT models. Despite little difference in fit between the A-G and frailty models, in the interest of fewer statistical assumptions it is recommended that, where relevant, future studies involving modelling of recurrent sports injury data use the frailty model in preference to the CoxPH model or its other generalisations. The paper provides a rationale for future statistical modelling approaches for recurrent sports injury.
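
    The role of the recommended frailty term can be seen in a small simulation: a shared per-player random effect multiplies the baseline injury rate, inflating the between-player variance of injury counts beyond what a plain Poisson model allows. Player and match counts echo the data above; the rates themselves are invented.

      import numpy as np

      rng = np.random.default_rng(7)
      n_players, matches, base_rate, theta = 35, 29, 0.05, 1.0

      frailty = rng.gamma(shape=1.0 / theta, scale=theta, size=n_players)  # E[Z] = 1
      counts = rng.poisson(frailty * base_rate * matches)

      print("mean injuries per player:", counts.mean())
      print("variance (a plain Poisson would match the mean):", counts.var())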

  13. Reduced herbivory during simulated ENSO rainy events increases native herbaceous plants in semiarid Chile

    NARCIS (Netherlands)

    Manrique, R.; Gutierrez, J.R.; Holmgren, M.; Squeo, F.A.

    2007-01-01

    El Niño Southern Oscillation (ENSO) events have profound consequences for the dynamics of terrestrial ecosystems. Since increased climate variability is expected to favour the invasive success of exotic species, we conducted a field experiment to study the effects of simulated rainy ENSO events.

  14. Hydrological simulation of flood transformations in the upper Danube River: Case study of large flood events

    Directory of Open Access Journals (Sweden)

    Mitková Veronika Bačová

    2016-12-01

    The problem of understanding natural processes as factors that restrict, limit or even jeopardize the interests of human society is currently of great concern. The natural transformation of flood waves is increasingly affected and disturbed by artificial interventions in river basins. The Danube River basin is an area of high economic and water management importance. Channel training can result in changes in the transformation of flood waves and in hydrograph shapes that differ from those of the past. Estimating how historical flood waves would transform under recent river conditions is only possible through model simulations. For this purpose, a nonlinear reservoir cascade model was constructed. The NLN-Danube nonlinear reservoir river model was used to simulate the transformation of flood waves in four sections of the Danube River from Kienstock (Austria) to Štúrovo (Slovakia) under relatively recent river reach conditions. The model was individually calibrated for two extreme events, in August 2002 and June 2013. Some floods that occurred on the Danube during the period 1991–2002 were used for validation of the model. The model was then used to identify changes in the transformation properties of the Danube channel in the selected river reach for several historical summer floods (1899, 1954, 1965 and 1975). Finally, a simulation of the flood wave propagation of the most destructive Danube flood of the last millennium (August 1501) is discussed.
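
    The model class named above can be illustrated in a few lines: each reservoir in the cascade obeys dS/dt = I - O with a nonlinear outflow O = k * S**m, and the outflow of one reservoir feeds the next. The parameters here are arbitrary, not the calibrated NLN-Danube values.

      def route(inflow, n_reservoirs=4, k=0.1, m=1.2, dt=1.0):
          storages = [0.0] * n_reservoirs
          outflows = []
          for q_in in inflow:
              for i in range(n_reservoirs):
                  q_out = k * storages[i] ** m
                  storages[i] += dt * (q_in - q_out)   # water balance of reservoir i
                  q_in = q_out                         # feed the next reservoir
              outflows.append(q_in)
          return outflows

      wave = [100 + 900 * min(t, 20 - t) / 10 for t in range(21)]  # triangular flood wave
      print([round(q, 1) for q in route(wave)])

    The cascade attenuates and delays the peak, which is the transformation behaviour the study calibrates against observed floods.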

  15. Optimal design of supply chain network under uncertainty environment using hybrid analytical and simulation modeling approach

    Science.gov (United States)

    Chiadamrong, N.; Piyathanavong, V.

    2017-12-01

    Models that aim to optimize the design of supply chain networks have gained more interest in the supply chain literature. Mixed-integer linear programming and discrete-event simulation are widely used for such optimization problems. We present a hybrid approach to support decisions for supply chain network design using a combination of analytical and discrete-event simulation models. The proposed approach is based on iterative procedures that continue until the difference between subsequent solutions satisfies pre-determined termination criteria. The effectiveness of the proposed approach is illustrated by an example, which shows results closer to optimal, with much faster solution times, than those obtained from a conventional simulation-based optimization model. The efficacy of this hybrid approach is promising, and it can be applied as a powerful tool in designing real supply chain networks. It also provides the possibility to model and solve more realistic problems, which incorporate dynamism and uncertainty.

  16. Survival analysis with covariates in combination with multinomial analysis to parametrize time to event for multi-state models

    NARCIS (Netherlands)

    Feenstra, T.L.; Postmus, D.; Quik, E.H.; Langendijk, H.; Krabbe, P.F.M.

    2013-01-01

    Objectives: Recent ISPOR Good practice guidelines as well as literature encourage to use a single distribution rather than the latent failure approach to model time to event for patient level simulation models with multiple competing outcomes. Aim was to apply the preferred method of a single

  17. A simulation model of MAPS for the FairRoot framework

    Energy Technology Data Exchange (ETDEWEB)

    Amar-Youcef, Samir; Linnik, Benjamin; Sitzmann, Philipp [Goethe-Universitaet Frankfurt (Germany); Collaboration: CBM-MVD-Collaboration

    2014-07-01

    CMOS MAPS are the sensors of choice for the MVD of the CBM experiment at the FAIR facility. They offer a unique combination of features required for the CBM detector, such as low material budget, spatial resolution, radiation tolerance and yet sufficient read-out speed. The physics performance of various designs of the MVD integrated into the CBM detector system is evaluated in the CBM-/FairRoot simulation framework. In this context, algorithms are developed to simulate the realistic detector response and to optimize feature extraction from the sensor information. The objective of the sensor response model is to provide a fast and realistic pixel response for a given track energy loss and position. In addition, we discuss aspects of simulating event pile-up and dataflow in the context of the CBM FLES event extraction and selection concept. This is of particular importance for the MVD, since the sensors feature a comparably long integration time and a frame-wise read-out, whereas all other detector systems operate with untriggered front-end electronics and freely stream time-stamped data to the FLES. Because of the large data rates, event extraction is performed via distributed networking on a large HPC compute farm. We present an overview and status of the MVD software developments, focusing on the integration of the system into a free-flowing read-out system and on the concurrent application to simulated and real data.

  18. Modeling Supermassive Black Holes in Cosmological Simulations

    Science.gov (United States)

    Tremmel, Michael

    My thesis work has focused on improving the implementation of supermassive black hole (SMBH) physics in cosmological hydrodynamic simulations. SMBHs are ubiquitous in massive galaxies, as well as bulge-less galaxies and dwarfs, and are thought to be a critical component of massive galaxy evolution. Still, much is unknown about how SMBHs form, grow, and affect their host galaxies. Cosmological simulations are an invaluable tool for understanding the formation of galaxies, self-consistently tracking their evolution with realistic merger and gas accretion histories. SMBHs are often modeled in these simulations (generally as a necessity to produce realistic massive galaxies), but their implementations are commonly simplified in ways that can limit what can be learned. Current and future observations are opening new windows into the lifecycle of SMBHs and their host galaxies, but require more detailed, physically motivated simulations. Within the novel framework I have developed, SMBHs 1) are seeded at early times without a priori assumptions of galaxy occupation, 2) grow in a way that accounts for the angular momentum of gas, and 3) experience realistic orbital evolution. I show how this model, properly tuned with a novel parameter optimization technique, results in realistic galaxies and SMBHs. Utilizing the unique ability of these simulations to capture the dynamical evolution of SMBHs, I present the first self-consistent prediction for the formation timescales of close SMBH pairs, precursors to SMBH binaries and merger events potentially detected by future gravitational wave experiments.

  19. Simulating extreme low-discharge events for the Rhine using a stochastic model

    Science.gov (United States)

    Macian-Sorribes, Hector; Mens, Marjolein; Schasfoort, Femke; Diermanse, Ferdinand; Pulido-Velazquez, Manuel

    2017-04-01

    The specific features of hydrological droughts make them more difficult to analyse than other water-related phenomena: the time scales are longer (months to several years), so fewer historical events are available, and drought severity and the associated damage depend on a combination of variables with no clear prevalence (e.g., total water deficit, maximum deficit and duration). As part of drought risk analysis, which aims to provide insight into the variability of hydrological conditions and the associated socio-economic impacts, long synthetic time series should therefore be developed. In this contribution, we increase the length of the available inflow time series using stochastic autoregressive modelling. This extension can improve the characterization of the extreme range and can define extreme droughts with similar return periods but different patterns, which can lead to distinctly different damages. The methodology consists of: 1) fitting an autoregressive model (AR, ARMA, ...) to the available records; 2) generating extended time series (thousands of years); 3) performing a frequency analysis with different characteristic variables (total deficit, maximum deficit and so on); and 4) selecting extreme drought events associated with different characteristic variables and return periods. The methodology was applied to the Rhine river discharge at Lobith, where the Rhine enters the Netherlands. A monthly ARMA(1,1) autoregressive model with seasonally varying parameters was fitted and successfully validated against the historical records available since 1901. The maximum monthly deficit with respect to a threshold value of 1800 m3/s and the average discharge over a given time span (in m3/s) were chosen as indicators to identify drought periods. A synthetic series of 10,000 years of discharges was generated using the validated ARMA model. Two time spans were considered in the analysis: the whole calendar year and the half-year period between April and September.
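
    A condensed version of steps 1)-4) above, with invented ARMA(1,1) coefficients and an invented seasonal climatology standing in for the fitted ones: generate a long synthetic monthly series and extract deficits below the 1800 m3/s threshold.

      import numpy as np

      rng = np.random.default_rng(3)
      phi, theta, sigma = 0.7, -0.2, 0.15          # assumed ARMA(1,1) parameters
      months = np.arange(12)
      seasonal_mean = 2300 + 500 * np.sin(2 * np.pi * months / 12)  # m3/s, illustrative

      n_months = 12 * 10_000                        # a 10,000-year synthetic series
      eps = rng.normal(0.0, sigma, n_months)
      z = np.zeros(n_months)
      for t in range(1, n_months):
          z[t] = phi * z[t - 1] + eps[t] + theta * eps[t - 1]   # standardized anomaly

      discharge = seasonal_mean[np.arange(n_months) % 12] * (1.0 + z)
      deficit = np.clip(1800.0 - discharge, 0.0, None)
      print("fraction of months in deficit:", (deficit > 0).mean())
      print("max monthly deficit (m3/s):", deficit.max())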

  1. Nuclear reactor core modelling in multifunctional simulators

    International Nuclear Information System (INIS)

    Puska, E.K.

    1999-01-01

    The thesis concentrates on the development of nuclear reactor core models for the APROS multifunctional simulation environment and the use of the core models in various kinds of applications. The work was started in 1986 as a part of the development of the entire APROS simulation system. The aim was to create core models that would serve in a reliable manner in an interactive, modular and multifunctional simulator/plant analyser environment. One-dimensional and three-dimensional core neutronics models have been developed. Both models have two energy groups and six delayed neutron groups. The three-dimensional finite difference type core model is able to describe both BWR- and PWR-type cores with quadratic fuel assemblies and VVER-type cores with hexagonal fuel assemblies. The one- and three-dimensional core neutronics models can be connected with the homogeneous, the five-equation or the six-equation thermal hydraulic models of APROS. The key feature of APROS is that the same physical models can be used in various applications, and the nuclear reactor core models of APROS have been built in such a manner that the same models can be used in simulator and plant analyser applications as well as in safety analysis. In the APROS environment the user can select the number of flow channels in the three-dimensional reactor core and either the homogeneous, the five- or the six-equation thermal hydraulic model for these channels. The thermal hydraulic model and the number of flow channels have a decisive effect on the calculation time of the three-dimensional core model and thus, at present, these particular selections make the major difference between a safety analysis core model and a training simulator core model. The emphasis of this thesis is on the three-dimensional core model and its capability to analyse symmetric and asymmetric events in the core. The factors affecting the calculation times of various three-dimensional BWR-, PWR- and VVER-type APROS core models have also been analysed.

  2. Evaluation of Three Models for Simulating Pesticide Runoff from Irrigated Agricultural Fields.

    Science.gov (United States)

    Zhang, Xuyang; Goh, Kean S

    2015-11-01

    Three models were evaluated for their accuracy in simulating pesticide runoff at the edge of agricultural fields: the Pesticide Root Zone Model (PRZM), the Root Zone Water Quality Model (RZWQM), and OpusCZ. Modeling results on runoff volume, sediment erosion, and pesticide loss were compared with measurements taken from field studies, and the models were also compared on their theoretical foundations and ease of use. For runoff events generated by sprinkler irrigation and rainfall, all models performed equally well, with small errors in simulating water, sediment, and pesticide runoff; the mean absolute percentage errors (MAPEs) were between 3 and 161%. For flood irrigation, OpusCZ simulated runoff and pesticide mass with the highest accuracy, followed by RZWQM and PRZM, likely owing to its unique hydrological algorithm for runoff simulation during flood irrigation. Simulation results from cold model runs by OpusCZ and RZWQM using measured values for model inputs matched the observed values closely; the MAPE ranged from 28 to 384% and from 42 to 168% for OpusCZ and RZWQM, respectively. These satisfactory model outputs showed the models' abilities to mimic reality. Theoretical evaluations indicated that OpusCZ and RZWQM use mechanistic approaches for hydrology simulation, output data at a subdaily time step, and are able to simulate management practices and subsurface flow via tile drainage. In contrast, PRZM operates at a daily time step and simulates surface runoff using the USDA Soil Conservation Service's curve number method. Among the three models, OpusCZ and RZWQM were suitable for simulating pesticide runoff in semiarid areas where agriculture is heavily dependent on irrigation.
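
    The error metric quoted above, written out explicitly (the values are made up):

      def mape(observed, simulated):
          """Mean absolute percentage error, in percent."""
          return 100.0 * sum(abs(s - o) / abs(o)
                             for o, s in zip(observed, simulated)) / len(observed)

      observed = [12.0, 30.0, 7.5]     # e.g. event runoff volumes (mm)
      simulated = [10.5, 41.0, 6.0]
      print(f"MAPE = {mape(observed, simulated):.1f}%")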

  4. Geomechanical Modeling of Fault Responses and the Potential for Notable Seismic Events during Underground CO2 Injection

    Science.gov (United States)

    Rutqvist, J.; Cappa, F.; Mazzoldi, A.; Rinaldi, A.

    2012-12-01

    The importance of geomechanics associated with large-scale geologic carbon storage (GCS) operations is now widely recognized. There are concerns related to the potential for triggering notable (felt) seismic events and to how such events could impact the long-term integrity of a CO2 repository (as well as how they could impact the public perception of GCS). In this context, we review a number of modeling studies and field observations related to the potential for injection-induced fault reactivation and seismic events. We present recent model simulations of CO2 injection and fault reactivation, including both aseismic and seismic fault responses. The model simulations were conducted using a slip-weakening fault model enabling sudden (seismic) fault rupture, and some of the numerical analyses were extended to fully dynamic modeling of the seismic source, wave propagation, and ground motion. The model simulations illustrate what it would take to create a magnitude 3 or 4 earthquake that would not result in any significant damage at the ground surface, but could raise concerns in the local community and could also affect the deep containment of the stored CO2. The analyses show that the local in situ stress field, fault orientation, fault strength, and injection-induced overpressure are critical factors in determining the likelihood and magnitude of such an event. We would like to clarify, though, that in our modeling we had to apply very high injection pressures to be able to intentionally induce any fault reactivation. Consequently, our model simulations represent extreme cases, which in a real GCS operation could be avoided by estimating the maximum sustainable injection pressure and carefully controlling the injection pressure. In fact, no notable seismic event has been reported from any of the current CO2 storage projects, although some unfelt microseismic activity has been detected by geophones.

  5. Evaluating the use of different precipitation datasets in simulating a flood event

    Science.gov (United States)

    Akyurek, Z.; Ozkaya, A.

    2016-12-01

    Floods caused by convective storms in mountainous regions are sensitive to the temporal and spatial variability of rainfall. Space-time estimates of rainfall from weather radar, satellites, and numerical weather prediction models can help represent the rainfall pattern, albeit with some inaccuracy. However, there is a strong need to evaluate the performance and limitations of these estimates in hydrology. This study provides a comparison of gauge, radar, satellite (Hydro-Estimator (HE)), and numerical weather prediction (Weather Research and Forecasting (WRF)) precipitation datasets during an extreme flood event (22.11.2014) lasting 40 hours in Samsun, Turkey. Hourly rainfall data from 13 ground observation stations were used in the analyses. The event, with a peak discharge of 541 m3/s, caused flooding downstream of the Terme Basin. Comparisons were performed in two parts. First, the analyses were performed in an areal and point-based manner. Secondly, a semi-distributed hydrological model was used to assess the accuracy of the rainfall datasets in simulating river flows for the flood event. Kalman filtering was used to bias-correct the radar rainfall data against the gauge measurements. Radar, gauge, corrected radar, HE, and WRF rainfall data were used as model inputs. Generally, the HE product underestimates the cumulative rainfall amounts at all stations; the radar data also underestimate in a cumulative sense but remain consistent. On the other hand, the WRF mean statistics are better than those of the HE product at almost all stations, but worse than those of the radar dataset. Point comparisons indicated that the radar rainfall estimates capture the temporal trend of the rainfall well but underestimate the maximum values; relative to the cumulative gauge value, radar underestimated the cumulative rainfall amount by 32%. Contrary to the other datasets, the bias of WRF is positive
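
    Mean-field bias correction of radar rainfall against gauges is often cast as a scalar Kalman filter tracking a slowly varying (log) bias; the sketch below illustrates that common formulation and is not the paper's exact filter design (the parameters q and r are illustrative):

        import numpy as np

        def kalman_bias_correct(radar, gauge, q=0.01, r=0.1):
            """Track the log mean-field bias beta with a random-walk model,
            so that gauge ~ exp(beta) * radar; returns corrected radar."""
            beta, p = 0.0, 1.0                  # bias estimate and its variance
            corrected = []
            for z_r, z_g in zip(radar, gauge):
                p += q                          # predict: variance grows by q
                if z_r > 0 and z_g > 0:         # update only when both see rain
                    innovation = np.log(z_g / z_r) - beta
                    gain = p / (p + r)          # Kalman gain
                    beta += gain * innovation
                    p *= 1.0 - gain
                corrected.append(np.exp(beta) * z_r)
            return np.array(corrected)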

  6. Various sizes of sliding event bursts in the plastic flow of metallic glasses based on a spatiotemporal dynamic model

    Energy Technology Data Exchange (ETDEWEB)

    Ren, Jingli, E-mail: renjl@zzu.edu.cn, E-mail: g.wang@shu.edu.cn; Chen, Cun [School of Mathematics and Statistics, Zhengzhou University, Zhengzhou 450001 (China); Wang, Gang, E-mail: renjl@zzu.edu.cn, E-mail: g.wang@shu.edu.cn [Laboratory for Microstructures, Shanghai University, Shanghai 200444 (China); Cheung, Wing-Sum [Department of Mathematics, The University of Hong Kong, Hong Kong (China); Sun, Baoan; Mattern, Norbert [IFW Dresden, Institute for Complex Materials, P.O. Box 27 01 16, D-01171 Dresden (Germany); Siegmund, Stefan [Department of Mathematics, TU Dresden, D-01062 Dresden (Germany); Eckert, Jürgen [IFW Dresden, Institute for Complex Materials, P.O. Box 27 01 16, D-01171 Dresden (Germany); Institute of Materials Science, TU Dresden, D-01062 Dresden (Germany)

    2014-07-21

    This paper presents a spatiotemporal dynamic model based on the interaction between multiple shear bands in the plastic flow of metallic glasses during compressive deformation. Sliding events of various sizes burst in the plastic deformation as shear branches of different scales are generated; microscopic creep events and delocalized sliding events were analyzed based on the established model. The paper discusses the spatially uniform solutions and the traveling wave solution. The phase space of the spatially uniform system reflected the chaotic state of the system at a lower strain rate. Moreover, numerical simulation showed that microscopic creep events are manifested at a lower strain rate, whereas delocalized sliding events are manifested at a higher strain rate.

  7. A Basis Function Approach to Simulate Storm Surge Events for Coastal Flood Risk Assessment

    Science.gov (United States)

    Wu, Wenyan; Westra, Seth; Leonard, Michael

    2017-04-01

    Storm surge is a significant contributor to flooding in coastal and estuarine regions, especially when it coincides with other flood-producing mechanisms such as extreme rainfall. Therefore, storm surge has always been a research focus in coastal flood risk assessment. Numerical models have often been developed to understand storm surge events for risk assessment (Kumagai et al. 2016; Li et al. 2016; Zhang et al. 2016; Bastidas et al. 2016; Bilskie et al. 2016; Dalledonne and Mayerle 2016; Haigh et al. 2014; Kodaira et al. 2016; Lapetina and Sheng 2015) and to assess how these events may change or evolve in the future (Izuru et al. 2015; Oey and Chou 2016). However, numerical models often require a large amount of input information, and difficulties arise when sufficient data are not available (Madsen et al. 2015). Alternatively, statistical methods have been used to forecast storm surge from historical data (Hashemi et al. 2016; Kim et al. 2016) or to examine long-term trends in storm surge events, especially under climate change (Balaguru et al. 2016; Oh et al. 2016; Rueda et al. 2016). These studies often use only the peak of surge events, which results in the loss of dynamic information within a tidal cycle or surge event (i.e., the time series of storm surge values). In this study, we propose an alternative basis function (BF) based approach to examine the different attributes (e.g., peak and duration) of storm surge events using historical data. Two simple two-parameter BFs were used: the exponential function and the triangular function. High-quality hourly storm surge records from 15 tide gauges around Australia were examined. Significant location and seasonal variability was found in the peak and duration of storm surge events, which provides additional insight into coastal flood risk. In addition, the simple form of these BFs allows fast simulation of storm surge events and minimises the complexity of joint probability
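
    The two basis functions named here are simple two-parameter shapes; one plausible parameterisation, with peak height h (m) and duration scale d (h) as the two parameters (illustrative, not the paper's exact definitions), is:

        import numpy as np

        def exponential_bf(t, h, d):
            """Exponential surge shape: peak h at t = 0, e-folding scale d."""
            return h * np.exp(-np.abs(t) / d)

        def triangular_bf(t, h, d):
            """Triangular surge shape: peak h at t = 0, base half-width d."""
            return h * np.maximum(0.0, 1.0 - np.abs(t) / d)

        t = np.linspace(-24.0, 24.0, 97)             # hours around the peak
        surge = exponential_bf(t, h=0.8, d=6.0)      # 0.8 m peak, 6 h scale

    Fitting h and d to each recorded event then reduces a full surge hydrograph to two numbers whose joint behaviour with other flood drivers is far easier to model.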

  8. Parallel discrete event simulation

    NARCIS (Netherlands)

    Overeinder, B.J.; Hertzberger, L.O.; Sloot, P.M.A.; Withagen, W.J.

    1991-01-01

    In simulating applications for execution on specific computing systems, the simulation performance figures must be known in a short period of time. One basic approach to the problem of reducing the required simulation time is the exploitation of parallelism. However, in parallelizing the simulation
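
    At its core, a discrete event simulation advances a clock through a time-ordered event list; parallel DES partitions that list across processors, which must then synchronize to preserve causality (conservatively, or optimistically as in Time Warp). For orientation, a minimal sequential kernel is sketched below (the names are illustrative):

        import heapq
        import itertools
        import random

        def run(initial_events, horizon):
            """initial_events: iterable of (time, action) pairs; each
            action(now, schedule) may enqueue future events."""
            counter = itertools.count()          # tie-breaker for equal times
            queue = [(t, next(counter), a) for t, a in initial_events]
            heapq.heapify(queue)
            schedule = lambda t, a: heapq.heappush(queue, (t, next(counter), a))
            while queue:
                now, _, action = heapq.heappop(queue)
                if now > horizon:
                    break
                action(now, schedule)            # may schedule new events

        def arrival(now, schedule):              # self-rescheduling arrivals
            print(f"arrival at t={now:.2f}")
            schedule(now + random.expovariate(1.0), arrival)

        run([(0.0, arrival)], horizon=10.0)

    Parallelizing this loop amounts to running one such kernel per partition and exchanging timestamped messages between partitions without violating each partition's local clock.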

  9. Analytic expressions for the construction of a fire event PSA model

    International Nuclear Information System (INIS)

    Kang, Dae Il; Kim, Kil Yoo; Kim, Dong San; Hwang, Mee Jeong; Yang, Joon Eon

    2016-01-01

    In this study, the process of converting an internal event PSA model into a fire event PSA model is analytically presented and discussed. Many fire PSA models include fire-induced initiating event fault trees that do not appear in an internal event PSA model; these fault tree models are developed to address multiple initiating event issues, because a single fire event within a fire compartment or fire scenario can cause multiple initiating events. As an example, a fire in a turbine building area can cause both a loss of main feed-water and a loss of off-site power initiating event. Up to now, there has been no analytic study on the construction of a fire event PSA model from an internal event PSA model with initiating event fault trees. The results show that additional cutsets can be obtained if the initiating event fault trees for a fire event PSA model are not exactly developed.

  11. Random vs. Combinatorial Methods for Discrete Event Simulation of a Grid Computer Network

    Science.gov (United States)

    Kuhn, D. Richard; Kacker, Raghu; Lei, Yu

    2010-01-01

    This study compared random and t-way combinatorial inputs of a network simulator to determine whether these two approaches produce significantly different deadlock detection for varying network configurations. Modeling deadlock detection is important for analyzing configuration changes that could inadvertently degrade network operations, or for determining modifications that attackers could make to deliberately induce deadlock. Discrete event simulation of a network may be conducted using random generation of inputs; in this study, we compare random with combinatorial generation of inputs. Combinatorial (or t-way) testing requires every combination of any t parameter values to be covered by at least one test. Combinatorial methods can be highly effective because empirical data suggest that nearly all failures involve the interaction of a small number of parameters (1 to 6). Thus, for example, if all deadlocks involve at most 5-way interactions between n parameters, then exhaustive testing of all n-way interactions adds no information beyond what testing all 5-way interactions would provide. While the maximum degree of interaction between parameters involved in deadlocks clearly cannot be known in advance, covering all t-way interactions may be more efficient than random generation of inputs. In this study we tested this hypothesis for t = 2, 3, and 4 for deadlock detection in a network simulation. Achieving the same degree of coverage provided by the 4-way tests would have required approximately 3.2 times as many random tests; combinatorial methods were thus more efficient for detecting deadlocks involving a higher degree of interaction. The paper reviews explanations for these results and their implications for modeling and simulation.
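
    To make the t-way requirement concrete, the sketch below enumerates every 2-way (parameter, value) combination a test suite must cover and measures the coverage of a given suite; the parameter names are hypothetical, not the simulator's actual configuration space:

        from itertools import combinations, product

        params = {                       # hypothetical configuration space
            "topology": ["ring", "star", "mesh"],
            "buffer":   [1, 2, 4],
            "protocol": ["A", "B"],
        }
        names = list(params)

        def required_pairs():
            """All 2-way value combinations that must be covered."""
            return {((p1, v1), (p2, v2))
                    for p1, p2 in combinations(names, 2)
                    for v1, v2 in product(params[p1], params[p2])}

        def covered_pairs(suite):
            """All 2-way combinations actually hit by a test suite."""
            return {((p1, test[p1]), (p2, test[p2]))
                    for test in suite
                    for p1, p2 in combinations(names, 2)}

        suite = [dict(zip(names, vals)) for vals in
                 [("ring", 1, "A"), ("star", 2, "B"), ("mesh", 4, "A")]]
        req = required_pairs()
        print(f"{len(covered_pairs(suite) & req)}/{len(req)} pairs covered")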

  12. The millennium water vapour drop in chemistry–climate model simulations

    Directory of Open Access Journals (Sweden)

    S. Brinkop

    2016-07-01

    This study investigates the abrupt and severe water vapour decline in the stratosphere beginning in the year 2000 (the "millennium water vapour drop") and other similarly strong stratospheric water vapour reductions by means of various simulations with the state-of-the-art Chemistry-Climate Model (CCM) EMAC (ECHAM/MESSy Atmospheric Chemistry Model). The model simulations differ with respect to the prescribed sea surface temperatures (SSTs) and whether nudging is applied. The CCM EMAC most closely reproduces the signature and pattern of the water vapour drop, in agreement with those derived from satellite observations, when the model is nudged. Model results confirm that this extraordinary water vapour decline is particularly obvious in the tropical lower stratosphere and is related to a large decrease in cold point temperature. The drop signal propagates, under dilution, to the higher stratosphere and to the poles via the Brewer–Dobson circulation (BDC). We found that the driving forces for this significant decline in water vapour mixing ratios are tropical sea surface temperature (SST) changes due to the coincidence of a preceding strong El Niño–Southern Oscillation event (1997/1998) followed by a strong La Niña event (1999/2000), supported by the change from the westerly to the easterly phase of the equatorial stratospheric quasi-biennial oscillation (QBO) in 2000. Correct (observed) SSTs are important for triggering the strong decline in water vapour. There are indications that, at least partly, SSTs contribute to the long period of low water vapour values from 2001 to 2006. For this period, the specific dynamical state of the atmosphere (the overall atmospheric large-scale wind and temperature distribution) is important as well, as it causes the observed persistent low cold point temperatures. These are induced by a period of increased upwelling, which, however, has no corresponding pronounced signature in SST anomalies in the tropics

  13. Numerical simulations of atmospheric dispersion of iodine-131 by different models.

    Directory of Open Access Journals (Sweden)

    Ádám Leelőssy

    Nowadays, several dispersion models are available to simulate the transport of air pollutants and toxic substances, including radionuclides, in the atmosphere. The reliability of atmospheric transport models has been demonstrated in several recent cases from local to global scale; however, very few actual emission datasets are available against which to evaluate model results in real-life cases. In this study, the atmospheric dispersion of 131I emitted to the atmosphere during an industrial process was simulated with different models, namely the WRF-Chem Eulerian online coupled model and the HYSPLIT and RAPTOR Lagrangian models. Although only limited 131I detection data were available, the accuracy of the modeled plume direction could be evaluated in complex late-autumn weather situations. For the studied cases, the general reliability of the models was demonstrated. However, serious uncertainties arise in connection with low-level inversions, above all for an emission event on 4 November 2011, when significant wind shear caused a large difference between simulated and actual transport directions. The results underline the importance of prudent interpretation of dispersion model results and of identifying weather conditions with the potential to cause large model errors.

  14. Modeling study of the 2010 regional haze event in the North China Plain

    Directory of Open Access Journals (Sweden)

    M. Gao

    2016-02-01

    The online coupled Weather Research and Forecasting-Chemistry (WRF-Chem) model was applied to simulate a haze event that occurred in January 2010 in the North China Plain (NCP) and was validated against various types of measurements. The evaluations indicate that WRF-Chem provides reliable simulations of the 2010 haze event in the NCP. The event was mainly caused by high emissions of air pollutants in the NCP and stable winter weather conditions. Secondary inorganic aerosols also played an important role, and cloud chemistry made important contributions. Air pollutants from outside Beijing contributed about 64.5% of the PM2.5 levels in Beijing during this haze event, most of them originating from southern Hebei, the city of Tianjin, and the Shandong and Henan provinces. In addition, aerosol feedback has important impacts on surface temperature, relative humidity (RH), and wind speed, and these meteorological variables in turn affect aerosol distribution and formation. In Shijiazhuang, the planetary boundary layer (PBL) height decreased by about 278.2 m and PM2.5 increased by more than 20 µg m−3 due to aerosol feedback. It was also shown that black carbon (BC) absorption has significant impacts on meteorology and air quality, indicating that more attention should be paid to BC from both air pollution control and climate change perspectives.

  15. A Dynamic Approach to Modeling Dependence Between Human Failure Events

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald Laurids [Idaho National Laboratory

    2015-09-01

    In practice, most HRA methods use direct dependence from THERP—the notion that error begets error, and that one human failure event (HFE) may increase the likelihood of subsequent HFEs. In this paper, we approach dependence from a simulation perspective in which the effects of human errors are dynamically modeled. Three key concepts play into this modeling: (1) Errors are driven by performance shaping factors (PSFs). In this context, error propagation is not a result of the presence of an HFE yielding overall increases in subsequent HFEs; rather, it is shared PSFs that cause dependence. (2) PSFs have qualities of lag and latency. These two qualities are not currently considered in HRA methods that use PSFs. Yet, to model the effects of PSFs, it is not simply a matter of identifying the discrete effects of a particular PSF on performance; the effects of PSFs must be considered temporally, as the PSFs will have a range of effects across the event sequence. (3) Finally, there is the concept of error spilling: when PSFs are activated, they have not only temporal effects but also lateral effects on other PSFs, leading to emergent errors. This paper presents the framework for tying together these dynamic dependence concepts.

  16. Quantifying the predictive accuracy of time-to-event models in the presence of competing risks.

    Science.gov (United States)

    Schoop, Rotraut; Beyersmann, Jan; Schumacher, Martin; Binder, Harald

    2011-02-01

    Prognostic models for time-to-event data play a prominent role in therapy assignment, risk stratification, and inter-hospital quality assurance. The assessment of their prognostic value is vital not only for responsible resource allocation but also for their widespread acceptance. The additional presence of risks competing with the event of interest requires proper handling not only during model building but also during assessment. Research into methods for evaluating the prognostic potential of models accounting for competing risks is still needed, as most proposed methods measure either discrimination or calibration but do not examine both simultaneously. We adapt the prediction error proposal of Graf et al. (Statistics in Medicine 1999, 18, 2529-2545) and Gerds and Schumacher (Biometrical Journal 2006, 48, 1029-1040) to handle models with competing risks, i.e. more than one possible event type, and introduce a consistent estimator. A simulation study investigating the behaviour of the estimator in small-sample situations and at different levels of censoring, together with a real data application, follows.
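
    In its standard form, the prediction error referred to here is a time-dependent Brier score; for the event of interest (type 1) in a competing-risks setting, a common formulation (our sketch of the general quantity, not necessarily the authors' exact notation) is

        \[
        \mathrm{BS}(t) \;=\; \mathbb{E}\!\left[\left( \mathbf{1}\{T \le t,\ \varepsilon = 1\} \;-\; \hat{\pi}_1(t \mid X) \right)^{2}\right],
        \]

    where \(\hat{\pi}_1(t \mid X)\) is the model's predicted cumulative incidence of a type-1 event by time \(t\); under right-censoring the expectation is typically estimated with inverse-probability-of-censoring weights. Because the score compares predicted probabilities with observed event status, it penalizes poor discrimination and poor calibration simultaneously.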

  17. Event-Entity-Relationship Modeling in Data Warehouse Environments

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    We use the event-entity-relationship model (EVER) to illustrate the use of entity-based modeling languages for conceptual schema design in data warehouse environments. EVER is a general-purpose information modeling language that supports the specification of both general schema structures and multi-dimensional schemes that are customized to serve specific information needs. EVER is based on an event concept that is very well suited for multi-dimensional modeling because measurement data often represent events in multi-dimensional databases...

  18. Event Modeling in UML. Unified Modeling Language and Unified Process

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2002-01-01

    We show how events can be modeled in terms of UML. We view events as change agents that have consequences and as information objects that represent information. We show how to create object-oriented structures that represent events in terms of attributes, associations, operations, state charts, and messages. We outline a run-time environment for the processing of events with multiple participants.
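
    As a loose illustration of the idea (our example, not the authors' notation), an event can be represented as an object that carries attributes, is associated with its participants, and exposes an operation that applies its consequences:

        from dataclasses import dataclass
        from datetime import datetime

        @dataclass
        class Account:
            owner: str
            balance: float = 0.0

        @dataclass
        class TransferEvent:
            """An event as both change agent and information object."""
            when: datetime                 # attribute: when the event occurred
            amount: float                  # attribute: what changed
            source: Account                # association: participant
            target: Account                # association: participant

            def apply(self):               # operation: the event's consequence
                self.source.balance -= self.amount
                self.target.balance += self.amount

        a, b = Account("alice", 100.0), Account("bob")
        event = TransferEvent(datetime.now(), 25.0, a, b)
        event.apply()   # the change occurs; the object remains as its record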

  19. Irregularities in Early Seismic Rupture Propagation for Large Events in a Crustal Earthquake Model

    Science.gov (United States)

    Lapusta, N.; Rice, J. R.

    2001-12-01

    We study the early seismic propagation of model earthquakes in a 2-D model of a vertical strike-slip fault with depth-variable rate and state friction properties. Our model earthquakes are obtained in fully dynamic simulations of sequences of instabilities on a fault subjected to realistically slow tectonic loading (Lapusta et al., JGR, 2000). This work is motivated by the results of Ellsworth and Beroza (Science, 1995), who observed that for many earthquakes, far-field velocity seismograms during the initial stages of dynamic rupture propagation show irregular fluctuations that constitute a "seismic nucleation phase". In our simulations, we find that such irregularities in velocity seismograms can be caused by two factors: (1) rupture propagation over regions of stress concentration and (2) partial arrest of rupture in neighboring creeping regions. As rupture approaches a region of stress concentration, it sees increasing background stress, and its moment acceleration (to which far-field velocity seismograms are proportional) increases. After the peak in stress concentration, the rupture sees decreasing background stress and moment acceleration decreases; hence a fluctuation in moment acceleration is created. If rupture starts sufficiently far from a creeping region, then partial arrest of the rupture in the creeping region causes a decrease in moment acceleration. As the other parts of the rupture continue to develop, moment acceleration starts to grow again, and a fluctuation again results. Other factors may cause irregularities in moment acceleration, e.g., phenomena such as branching and/or intermittent rupture propagation (Poliakov et al., submitted to JGR, 2001), which we have not studied here. Regions of stress concentration are created in our model by the arrest of previous smaller events as well as by interactions with creeping regions. One such region is deep in the fault zone, and is caused by the temperature-induced transition from seismogenic to creeping

  20. Using global magnetospheric models for simulation and interpretation of Swarm external field measurements

    DEFF Research Database (Denmark)

    Moretto, T.; Vennerstrøm, Susanne; Olsen, Nils

    2006-01-01

    simulated external contributions relevant for internal field modeling. These have proven very valuable for the design and planning of the upcoming multi-satellite Swarm mission. In addition, a real event simulation was carried out for a moderately active time interval when observations from the Orsted... it consistently underestimates the dayside region 2 currents and overestimates the horizontal ionospheric closure currents in the dayside polar cap. Furthermore, with this example we illustrate the great benefit of utilizing the global model for the interpretation of Swarm external field observations and, likewise, the potential of using Swarm measurements to test and improve the global model.

  1. Occurrence of blowing snow events at an alpine site over a 10-year period: Observations and modelling

    Science.gov (United States)

    Vionnet, V.; Guyomarc'h, G.; Naaim Bouvet, F.; Martin, E.; Durand, Y.; Bellot, H.; Bel, C.; Puglièse, P.

    2013-05-01

    Blowing snow events control the evolution of the snowpack in mountainous areas and cause inhomogeneous snow distribution. The goal of this study is to identify the main features of blowing snow events at an alpine site and to assess the ability of the detailed snowpack model Crocus to reproduce the occurrence of these events in a 1D configuration. We created a database of blowing snow events observed over 10 years at our experimental site. Occurrences were divided into cases with and without concurrent falling snow. Overall, snow transport is observed 10.5% of the time in winter and occurs with concurrent falling snow 37.3% of the time. Wind speed and snow age control the frequency of occurrence. Model results illustrate the necessity of taking the wind-dependence of falling snow grain characteristics into account in order to satisfactorily simulate periods of snow transport and the mass fluxes during those periods. The high rate of false alarms produced by the model is investigated in detail for winter 2010/2011 using measurements from snow particle counters.

  2. Urban nonpoint source pollution buildup and washoff models for simulating storm runoff quality in the Los Angeles County

    International Nuclear Information System (INIS)

    Wang Long; Wei Jiahua; Huang Yuefei; Wang Guangqian; Maqsood, Imran

    2011-01-01

    Many urban nonpoint source pollution models use pollutant buildup and washoff functions to simulate the storm runoff quality of urban catchments. In this paper, two urban pollutant washoff load models are derived using pollutant buildup and washoff functions. The first model assumes that there is no residual pollutant after a storm event, while the second assumes that there is always residual pollutant after each storm event. The developed models are calibrated and verified with observed data from an urban catchment in Los Angeles County. The application results show that the model that accounts for residual pollutant is more capable of simulating nonpoint source pollution from urban storm runoff than the one that does not. For the study area, residual pollutant should be considered in the pollutant buildup and washoff functions when simulating urban nonpoint source pollution if the total runoff volume is less than 30 mm. - Highlights: → An improved urban NPS model was developed. → It performs well in areas where storm events have great temporal variation. → A threshold of total runoff volume for ignoring residual pollutant was determined.
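
    Buildup and washoff are typically represented with saturating exponential functions; a minimal sketch of the two variants described (with and without residual pollutant), using illustrative parameter values, is:

        import numpy as np

        def washoff_loads(runoffs, dry_days, b_max=50.0, k_b=0.4, k_w=0.18,
                          residual=True):
            """runoffs: event runoff depths (mm); dry_days: antecedent dry
            days per event. Returns the washoff load per event (e.g. kg/ha)."""
            loads, b0 = [], 0.0
            for runoff, dry in zip(runoffs, dry_days):
                # exponential buildup toward b_max over the dry period
                b = b_max - (b_max - b0) * np.exp(-k_b * dry)
                washed = b * (1.0 - np.exp(-k_w * runoff))   # washoff function
                loads.append(washed)
                b0 = (b - washed) if residual else 0.0       # carry-over residue
            return np.array(loads)

        print(washoff_loads([12.0, 35.0], [5, 3], residual=True))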

  3. Numerical Investigation of Acoustic Emission Events of Argillaceous Sandstones under Confining Pressure

    Directory of Open Access Journals (Sweden)

    Zhaohui Chong

    2017-01-01

    At the laboratory scale, locating acoustic emission (AE) events is a comparatively mature method for evaluating cracks in rock materials, and the method plays an important role in numerical simulations. This study aims to develop a quantitative method for the measurement of AE events in numerical simulations and applies it to estimate crack initiation, propagation, and coalescence in rock materials. The discrete element method-acoustic emission (DEM-AE) model was developed as an independent subprogram. The model calculates the scalar seismic moment of particles as they move and, from it, determines the magnitude of AE events. An algorithm for identifying the same spatiotemporal AE event is presented. To validate the model, a systematic physical experiment and a numerical simulation of argillaceous sandstones under confining pressure were performed to provide a quantitative comparison of the results. The results showed good agreement in terms of magnitude and spatiotemporal evolution between the simulation and the physical experiment. Finally, the magnitude of AE events was analyzed, and the relationship between AE events and microcracks discussed. The model can provide a research basis for preventing seismic hazards caused by underground coal mining.
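
    Magnitudes in such DEM studies are usually obtained from the scalar seismic moment through the standard moment-magnitude relation (the general relation, not necessarily this paper's exact calibration):

        \[
        M_w \;=\; \tfrac{2}{3}\left(\log_{10} M_0 \;-\; 9.1\right),
        \]

    where \(M_0\) (in N·m) is the scalar moment assembled from the particles' contact-force and displacement changes during an event.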

  4. An integrated model for the assessment of unmitigated fault events in ITER's superconducting magnets

    Energy Technology Data Exchange (ETDEWEB)

    McIntosh, S., E-mail: simon.mcintosh@ccfe.ac.uk [Culham Centre for Fusion Energy, Culham Science Center, Abingdon OX14 3DB, Oxfordshire (United Kingdom); Holmes, A. [Marcham Scientific Ltd., Sarum House, 10 Salisbury Rd., Hungerford RG17 0LH, Berkshire (United Kingdom); Cave-Ayland, K.; Ash, A.; Domptail, F.; Zheng, S.; Surrey, E.; Taylor, N. [Culham Centre for Fusion Energy, Culham Science Center, Abingdon OX14 3DB, Oxfordshire (United Kingdom); Hamada, K.; Mitchell, N. [ITER Organization, Magnet Division, St Paul Lez Durance Cedex (France)

    2016-11-01

    A large amount of energy is stored in ITER's superconducting magnet system. Faults that initiate a discharge are typically mitigated by quickly transferring the stored magnetic energy away for dissipation through a bank of resistors. In an extremely unlikely occurrence, an unmitigated fault event represents a potentially severe discharge of energy into the coils and the surrounding structure. A new simulation tool has been developed for the detailed study of these unmitigated fault events. The tool integrates: the propagation of multiple quench fronts initiated by an initial fault or by subsequent coil heating; the 3D convection and conduction of heat through the magnet structure; and the 3D conduction of current and Ohmic heating, both along the conductor and via alternate pathways created by arcing or material melt. Arcs linking broken sections of conductor or separate turns are simulated with a new unconstrained arc model that balances electrical current paths and heat generation within the arc column in the multi-physics model. The influence of the high Lorentz forces present is taken into account. Simulation results for an unmitigated fault in a poloidal field coil are presented.

  5. Modelling the interaction between flooding events and economic growth

    Science.gov (United States)

    Grames, Johanna; Fürnkranz-Prskawetz, Alexia; Grass, Dieter; Viglione, Alberto; Blöschl, Günter

    2016-04-01

    Recently, socio-hydrology models have been proposed to analyze the interplay of community risk-coping culture, flood damage, and economic growth. These models descriptively explain the feedbacks between socio-economic development and natural disasters such as floods. Complementary to these descriptive models, we develop a dynamic optimization model in which the inter-temporal decisions of an economic agent interact with the hydrological system. This interdisciplinary approach matches the goals of Panta Rhei, i.e., to understand feedbacks between hydrology and society. It enables new perspectives but also shows the limitations of each discipline. Young scientists need mentors from various scientific backgrounds to learn their different research approaches and how best to combine them so that interdisciplinary scientific work is also accepted by the different science communities. In our socio-hydrology model, we apply a macro-economic decision framework to a long-term flood scenario. We assume a standard macro-economic growth model in which agents derive utility from consumption and output depends on physical capital that can be accumulated through investment. To this framework we add the occurrence of flooding events, which destroy part of the capital. We identify two specific periodic long-term solutions, which we denote rich and poor economies. Whereas rich economies can afford to invest in flood defense and therefore avoid flood damage and develop high living standards, poor economies prefer consumption to investing in flood defense capital and end up facing flood damage every time the water level rises. Nevertheless, they manage to sustain at least a low level of physical capital. We identify optimal investment strategies and compare simulations with more frequent and more intense high-water-level events.
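
    In outline, such a model couples a Ramsey-type growth problem to a flood process; a stylized statement of the model class (our sketch, not the authors' exact equations) is

        \[
        \max_{C(t),\, I_F(t)} \int_0^{\infty} e^{-\rho t}\, U\!\left(C(t)\right) dt,
        \qquad
        \dot{K} = Y(K) - C - I_F - \delta K - D\!\left(W(t), F\right) K,
        \]

    where \(K\) is physical capital, \(I_F\) is investment in flood-defense capital \(F\), and \(D(W, F)\) is the fraction of capital destroyed when the water level \(W(t)\) exceeds the protection afforded by \(F\). On this reading, the rich and poor regimes correspond to whether the optimal flood-defense investment is positive or zero.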

  6. Analytical model and behavioral simulation approach for a ΣΔ fractional-N synthesizer employing a sample-hold element

    DEFF Research Database (Denmark)

    Cassia, Marco; Shah, Peter Jivan; Bruun, Erik

    2003-01-01

    is discussed. Also, a new methodology for behavioral simulation is presented: the proposed methodology is based on an object-oriented, event-driven approach and makes it possible to perform very fast and accurate simulations, and the theoretical models developed validate the simulation results. We show...

  7. Satellite Collision Modeling with Physics-Based Hydrocodes: Debris Generation Predictions of the Iridium-Cosmos Collision Event and Other Impact Events

    International Nuclear Information System (INIS)

    Springer, H.K.; Miller, W.O.; Levatin, J.L.; Pertica, A.J.; Olivier, S.S.

    2010-01-01

    Satellite collision debris poses risks to existing space assets and future space missions. Predictive models of the debris generated by these hypervelocity collisions are critical for developing accurate space situational awareness tools and effective mitigation strategies. Hypervelocity collisions involve complex phenomena that span several time- and length-scales. We have developed a satellite collision debris modeling approach consisting of a Lagrangian hydrocode enriched with smooth particle hydrodynamics (SPH), advanced material failure models, detailed satellite mesh models, and massively parallel computers. These computational studies enable us to investigate the influence of satellite center-of-mass (CM) overlap and orientation, relative velocity, and material composition on the size, velocity, and material type distributions of collision debris. We have applied our debris modeling capability to the recent Iridium 33-Cosmos 2251 collision event. While the relative velocity was well understood in this event, the degree of satellite CM overlap and orientation was ill-defined. In our simulations, we varied the collision CM overlap and orientation of the satellites from nearly maximum overlap to partial overlap on the outermost extents of the satellites (i.e., solar panels and gravity boom). As expected, we found that with increased satellite overlap, the overall debris cloud mass and momentum (transfer) increase, the average debris size decreases, and the debris velocity increases. The largest predicted debris can also provide insight into which satellite components were further removed from the impact location. A significant fraction of the momentum transfer is imparted to the smallest debris (<1-5 mm, depending on mesh resolution), especially in large CM overlap simulations. While the inclusion of the smallest debris is critical to enforcing mass and momentum conservation in hydrocode simulations, there seems to be relatively little interest in their

  8. Evolution of Storm-time Subauroral Electric Fields: RCM Event Simulations

    Science.gov (United States)

    Sazykin, S.; Spiro, R. W.; Wolf, R. A.; Toffoletto, F.; Baker, J.; Ruohoniemi, J. M.

    2012-12-01

    Subauroral polarization streams (SAPS) are regions of strongly enhanced westward ExB plasma drift (poleward-directed electric fields) located just equatorward of the evening auroral oval. Several recently installed HF (coherent scatter) radars in the mid-latitude SuperDARN chain present a novel opportunity for obtaining two-dimensional maps of ionospheric ExB flows at F-region altitudes that span several hours of the evening and nighttime subauroral ionosphere. These new and exciting observations of SAPS provide both an opportunity and a challenge for coupled magnetosphere-ionosphere models. In this paper, we use the Rice Convection Model (RCM) to simulate several events in which SAPS were observed by the mid-latitude SuperDARN chain. The RCM frequently predicts the occurrence of SAPS in the subauroral evening MLT sector; the mechanism is essentially current closure on the dusk side, where downward Birkeland currents (associated with the inner edge of the ion plasma sheet) map to a region of reduced ionospheric conductance just equatorward of the diffuse auroral precipitation (associated with the inner edge of the electron plasma sheet). We present detailed comparisons of model-computed ionospheric convection patterns with observations, with two goals in mind: (1) to analyze to what extent the observed appearance and time evolution of SAPS structures are driven by time variations of the cross-polar-cap potential drop (or, equivalently, the z-component of the interplanetary magnetic field), and (2) to evaluate the ability of the model to reproduce the spatial extent and magnitude of SAPS structures.

  9. MadGraph/MadEvent. The new web generation

    International Nuclear Information System (INIS)

    Alwall, J.

    2007-01-01

    The new web-based version of the automatized process and event generator MadGraph/MadEvent is now available. Recent developments include: new models, notably the MSSM and 2HDM, and a framework for adding user-defined models; inclusive sample generation; and on-line hadronization and detector simulation. Event generation can be done on-line on any of our clusters. (author)

  10. Wroclaw neutrino event generator

    International Nuclear Information System (INIS)

    Nowak, J A

    2006-01-01

    A neutrino event generator developed by the Wroclaw Neutrino Group is described. The physical models included in the generator are discussed and illustrated with the results of simulations. The processes considered are quasi-elastic scattering and pion production, the latter modelled by combining Δ resonance excitation and deep inelastic scattering

  11. Discrete event simulation and the resultant data storage system response in the operational mission environment of Jupiter-Saturn /Voyager/ spacecraft

    Science.gov (United States)

    Mukhopadhyay, A. K.

    1978-01-01

    The Data Storage Subsystem Simulator (DSSSIM), which simulates in ground software the occurrence of discrete events in the Voyager mission, is described. Functional requirements for Data Storage Subsystem (DSS) simulation are discussed, and discrete event simulation/DSSSIM processing is covered. Four types of output associated with a typical DSSSIM run are presented, and DSSSIM limitations and constraints are outlined.

  12. Non-axisymmetric simulation of the vertical displacement event in tokamaks

    International Nuclear Information System (INIS)

    Lim, Y.Y.; Lee, J.K.; Shin, K.J.; Hur, M.S.

    1999-01-01

    Tokamak plasmas with highly elongated cross sections are subject to a vertical displacement event (VDE). The nonlinear magnetohydrodynamic (MHD) evolutions of tokamak plasmas during the VDE are simulated by a three-dimensional MHD code as a combination of N=0 and N=1 components. The nonlinear evolution during the VDE is strongly affected by the relative amplitude of the N=1 to the N=0 modes. (author)

  13. Event-by-Event Simulation of the Hanbury Brown-Twiss Experiment with Coherent Light

    NARCIS (Netherlands)

    Jin, F.; De Raedt, H.; Michielsen, K.

    We present a computer simulation model for the Hanbury Brown-Twiss experiment that is entirely particle-based and reproduces the results of wave theory. The model is solely based on experimental facts, satisfies Einstein's criterion of local causality and does not require knowledge of the solution

  14. Multiple simultaneous event model for radiation carcinogenesis

    International Nuclear Information System (INIS)

    Baum, J.W.

    1979-01-01

    Theoretical Radiobiology and Risk Estimates includes reports on: Multiple Simultaneous Event Model for Radiation Carcinogenesis; Cancer Risk Estimates and Neutron RBE Based on Human Exposures; A Rationale for Nonlinear Dose Response Functions of Power Greater or Less Than One; and Rationale for One Double Event in Model for Radiation Carcinogenesis

  15. Investigating the impact of climate change on crop phenological events in Europe with a phenology model

    Science.gov (United States)

    Ma, Shaoxiu; Churkina, Galina; Trusilova, Kristina

    2012-07-01

    Predicting regional and global carbon and water dynamics requires a realistic representation of vegetation phenology. Vegetation models including cropland models exist (e.g. LPJmL, Daycent, SIBcrop, ORCHIDEE-STICS, PIXGRO), but they have various limitations in predicting cropland phenological events and their responses to climate change. Here, we investigate how the leaf onset and offset days of major European croplands responded to changes in climate from 1971 to 2000 using a newly developed phenological model that relies solely on climate data. Net ecosystem exchange (NEE) data measured with the eddy covariance technique at seven sites in Europe were used to adjust model parameters for wheat, barley, and rapeseed. Observational data from the International Phenology Gardens were used to corroborate the modeled phenological responses to changes in climate. The enhanced vegetation index (EVI) and a crop calendar were explored as alternative predictors of leaf onset and harvest days, respectively, over a large spatial scale. In each spatial model simulation, we assumed that all European croplands were covered by only one crop type. Given this assumption, the model estimated that the leaf onset days for wheat, barley, and rapeseed in Germany advanced by 1.6, 3.4, and 3.4 days per decade, respectively, during 1961-2000. The majority of European croplands (71.4%) had an advanced mean leaf onset day for wheat, barley, and rapeseed (7.0% significant), whereas 28.6% of European croplands had a delayed leaf onset day (0.9% significant) during 1971-2000. The trend of advancing onset days estimated by the model is similar to observations from the International Phenology Gardens in Europe. The developed phenological model can be integrated into a large-scale ecosystem model to simulate the dynamics of phenological events at different temporal and spatial scales. Crop calendars and the enhanced vegetation index carry substantial uncertainties as predictors of cropland phenological events. Caution

  16. Data-driven Markov models and their application in the evaluation of adverse events in radiotherapy

    CERN Document Server

    Abler, Daniel; Davies, Jim; Dosanjh, Manjit; Jena, Raj; Kirkby, Norman; Peach, Ken

    2013-01-01

    Decision-making processes in medicine rely increasingly on modelling and simulation techniques; they are especially useful when combining evidence from multiple sources. Markov models are frequently used to synthesize the available evidence for such simulation studies, by describing disease and treatment progress, as well as associated factors such as the treatment's effects on a patient's life and the costs to society. When the same decision problem is investigated by multiple stakeholders, differing modelling assumptions are often applied, making synthesis and interpretation of the results difficult. This paper proposes a standardized approach towards the creation of Markov models. It introduces the notion of ‘general Markov models’, providing a common definition of the Markov models that underlie many similar decision problems, and develops a language for their specification. We demonstrate the application of this language by developing a general Markov model for adverse event analysis in radiotherapy ...
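
    For orientation, a Markov model of this kind evolves a cohort distribution over health states by repeated application of a transition matrix; a minimal sketch with hypothetical states and probabilities:

        import numpy as np

        # hypothetical states: 0 = disease-free, 1 = adverse event, 2 = dead
        P = np.array([[0.90, 0.07, 0.03],      # transition probabilities per
                      [0.10, 0.80, 0.10],      # cycle; each row sums to one
                      [0.00, 0.00, 1.00]])     # death is absorbing

        cohort = np.array([1.0, 0.0, 0.0])     # everyone starts disease-free
        for cycle in range(10):                # e.g. ten yearly cycles
            cohort = cohort @ P
        print(cohort)                          # state occupancy after 10 cycles

    On this reading, a 'general' Markov model fixes a shared state space and structure, while differing stakeholder assumptions amount to different entries of the transition matrix.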

  17. Simulation modeling and arena

    CERN Document Server

    Rossetti, Manuel D

    2015-01-01

    Emphasizes a hands-on approach to learning statistical analysis and model building through the use of comprehensive examples, problem sets, and software applications. With a unique blend of theory and applications, Simulation Modeling and Arena®, Second Edition integrates coverage of statistical analysis and model building to emphasize the importance of both topics in simulation. Featuring introductory coverage of how simulation works and why it matters, the Second Edition expands coverage of static simulation and the applications of spreadsheets to perform simulation. The new edition also...

  18. Selecting a dynamic simulation modeling method for health care delivery research-part 2: report of the ISPOR Dynamic Simulation Modeling Emerging Good Practices Task Force.

    Science.gov (United States)

    Marshall, Deborah A; Burgos-Liz, Lina; IJzerman, Maarten J; Crown, William; Padula, William V; Wong, Peter K; Pasupathy, Kalyan S; Higashi, Mitchell K; Osgood, Nathaniel D

    2015-03-01

    In a previous report, the ISPOR Task Force on Dynamic Simulation Modeling Applications in Health Care Delivery Research Emerging Good Practices introduced the fundamentals of dynamic simulation modeling and identified the types of health care delivery problems for which dynamic simulation modeling can be used more effectively than other modeling methods. The hierarchical relationships between the health care delivery system, providers, patients, and other stakeholders exhibit a level of complexity that ought to be captured using dynamic simulation modeling methods. As a tool to help researchers decide whether dynamic simulation modeling is an appropriate method for modeling the effects of an intervention on a health care system, we presented the System, Interactions, Multilevel, Understanding, Loops, Agents, Time, Emergence (SIMULATE) checklist, consisting of eight elements. This report builds on that work, systematically comparing the three most commonly used dynamic simulation modeling methods: system dynamics, discrete-event simulation, and agent-based modeling. We review criteria for selecting the most suitable method depending on (1) the purpose: the type of problem and the research questions being investigated; (2) the object: the scope of the model; and (3) the method used to model the object to achieve the purpose. Finally, we provide guidance on emerging good practices for dynamic simulation modeling in the health sector, covering all aspects from the engagement of decision makers in the model design through model maintenance and upkeep. We conclude with recommendations on applying these methods to add value to informed decision making, with an emphasis on stakeholder engagement, starting with the problem definition. Finally, we identify areas in which further methodological development is likely, given the growing "volume, velocity and variety" and availability of "big data" to provide empirical evidence and techniques

  19. World, We Have Problems: Simulation for Large Complex, Risky Projects, and Events

    Science.gov (United States)

    Elfrey, Priscilla

    2010-01-01

    Prior to a spacewalk during the NASA STS-129 mission in November 2009, Columbia Broadcasting System (CBS) correspondent William Harwood reported that astronauts "were awakened again", as they had been the day before. Fearing that something not properly connected was causing a leak, the crew, both on the ground and in space, stopped and checked everything. The alarm proved false. The crew did complete its work ahead of schedule, but the incident reminds us that correctly connecting hundreds and thousands of entities, subsystems, and systems, finding leaks, loosening stuck valves, and adding replacements to very large complex systems over time does not occur magically. Major projects everywhere present similar pressures. Lives are at risk. Responsibility is heavy. Large natural and human-created disasters introduce parallel difficulties as people work across the boundaries of their countries, disciplines, languages, and cultures, with known immediate dangers as well as the unexpected. NASA has long accepted that when humans have to go where humans cannot go, simulation is the sole solution. The Agency uses simulation to achieve consensus, reduce ambiguity and uncertainty, understand problems, make decisions, support design, do planning and troubleshooting, as well as for operations, training, testing, and evaluation. Simulation is at the heart of all such complex systems, products, projects, programs, and events. Difficult, hazardous short- and, especially, long-term activities have a persistent need for simulation, from the first insight into a possibly workable idea or answer until the final report, perhaps beyond our lifetime, is put in the archive. With simulation we create a common mental model, try out breakdowns of machinery or teamwork, and find opportunities for improvement. Lifecycle simulation proves to be increasingly important as risks and consequences intensify. Across the world, disasters are increasing. We anticipate more of them, as the results of global warming

  20. Cost-effectiveness of total hip and knee replacements for the Australian population with osteoarthritis: discrete-event simulation model.

    Directory of Open Access Journals (Sweden)

    Hideki Higashi

    BACKGROUND: Osteoarthritis constitutes a major musculoskeletal burden for aged Australians. Hip and knee replacement surgeries are effective interventions once all conservative therapies to manage the symptoms have been exhausted. This study aims to evaluate the cost-effectiveness of hip and knee replacements in Australia. To the best of our knowledge, it is the first attempt to account for the dual nature of hip and knee osteoarthritis by modelling the severities of the right and left joints separately. METHODOLOGY/PRINCIPAL FINDINGS: We developed a discrete-event simulation model that follows individuals with osteoarthritis over their lifetimes. The model defines separate attributes for the right and left joints and accounts for several repeat replacements. The Australian population with osteoarthritis who were 40 years of age or older in 2003 were followed up until extinction of the cohort. Intervention effects were modelled by means of disability-adjusted life-years (DALYs) averted. Both hip and knee replacements are highly cost-effective (AUD 5,000 per DALY and AUD 12,000 per DALY, respectively) under an AUD 50,000/DALY threshold level. The exclusion of cost offsets, and the inclusion of future unrelated health care costs in extended years of life, did not change the finding that the interventions are cost-effective (AUD 17,000 per DALY and AUD 26,000 per DALY, respectively). However, there was a substantial difference between hip and knee replacements, with surgeries administered for hips more cost-effective than those for knees. CONCLUSIONS/SIGNIFICANCE: Both hip and knee replacements are cost-effective interventions to improve the quality of life of people with osteoarthritis. It was also shown that the dual nature of hip and knee OA should be taken into account to provide more accurate estimates of the cost-effectiveness of hip and knee replacements.

  1. Catchment & sewer network simulation model to benchmark control strategies within urban wastewater systems

    DEFF Research Database (Denmark)

    Saagi, Ramesh; Flores Alsina, Xavier; Fu, Guangtao

    2016-01-01

    This paper aims to develop a benchmark simulation model for evaluating control strategies in the urban catchment and sewer network. Various modules describing wastewater generation in the catchment and its subsequent transport and storage in the sewer system are presented. Global/local overflow-based evaluation criteria describing the cumulative and acute effects are presented. Simulation results show that the proposed set of models is capable of generating daily, weekly, and seasonal variations as well as describing the effect of rain events on wastewater characteristics. Two sets of case studies...

  2. POLARIS: Agent-based modeling framework development and implementation for integrated travel demand and network and operations simulations

    Energy Technology Data Exchange (ETDEWEB)

    Auld, Joshua; Hope, Michael; Ley, Hubert; Sokolov, Vadim; Xu, Bo; Zhang, Kuilin

    2016-03-01

    This paper discusses the development of an agent-based modeling software development kit, and the implementation and validation of a model built with it that integrates dynamic simulation of travel demand, network supply, and network operations. The core utilities in the kit are described: a parallel discrete event engine, an interprocess exchange engine, and a memory allocator; the ancillary utilities include a visualization library, a database IO library, and a scenario manager. The overall framework emphasizes the design goals of generality, code agility, and high performance. The framework allows several aspects of the transportation system that are typically handled by separate stand-alone software applications to be modeled together in a high-performance and extensible manner. Integrating models such as dynamic traffic assignment and disaggregate demand models has been a long-standing challenge for transportation modelers, and the integrated approach shows a possible way to resolve this difficulty. The simulation model built from the POLARIS framework is a single, shared-memory process that handles all aspects of the integrated urban simulation. The resulting gains in computational efficiency and performance allow planning models to be extended to include previously separate aspects of the urban system, enhancing the utility of such models from the planning perspective. Initial tests with case studies involving traffic management center impacts on various network events, such as accidents, congestion, and weather events, show the potential of the system.

  3. Spatially Explicit Modelling of the Belgian Major Endurance Event 'The 100 km Dodentocht'.

    Directory of Open Access Journals (Sweden)

    Steffie Van Nieuland

    'The 100 km Dodentocht', which takes place annually and starts in Bornem, Belgium, is a long-distance march in which participants have to cover a 100 km trail in at most 24 hours. The approximately 11,000 marchers per edition are tracked using passive radio-frequency identification (RFID). These tracking data were analyzed to build a spatially explicit marching model that gives insight into the dynamics of the event and allows the effect of changes in the starting procedure to be evaluated. To build the model, the empirical distribution functions (EDFs) of the marching speeds on every section of the trail between two consecutive checkpoints, and of the checkpoints at which marchers retire, are determined, taking into account age, gender, and marching speeds on previous sections. These distribution functions are then used to sample the consecutive speeds and retirements, and thus to simulate the times at which individual marchers pass the consecutive checkpoints. We concluded that the data-driven model simulates the event reliably. Furthermore, we tested three scenarios for reducing the crowdedness along the first part of the trail and concluded that either the start should be moved to a location outside the town center where the streets are at least 25% wider, or the marchers should start in two groups at two different locations, with these groups ideally merging about 20 km after the start. The crowdedness at the start might also be reduced by installing a bottleneck at the start to limit the number of marchers that can pass per unit of time; consequently, the operating hours of the consecutive checkpoints would be longer. The developed framework can likewise be used to analyze and improve the operation of other endurance events if sufficient tracking data are available.
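
    Sampling a marcher's speed on the next section from an empirical distribution function can be done with inverse-transform sampling on the observed speeds; a minimal sketch (the conditioning on age, gender, and previous speed is omitted for brevity):

        import numpy as np

        rng = np.random.default_rng(42)

        def sample_edf(observed, size=1):
            """Draw from the empirical distribution of observed values via
            inverse-transform sampling with linear interpolation."""
            xs = np.sort(np.asarray(observed, dtype=float))
            ps = np.arange(1, xs.size + 1) / xs.size   # empirical CDF levels
            return np.interp(rng.random(size), ps, xs)

        speeds = [4.8, 5.1, 5.3, 5.6, 6.0, 6.2]  # hypothetical speeds, km/h
        print(sample_edf(speeds, size=3))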

  4. OBEST: The Object-Based Event Scenario Tree Methodology

    International Nuclear Information System (INIS)

    WYSS, GREGORY D.; DURAN, FELICIA A.

    2001-01-01

    Event tree analysis and Monte Carlo-based discrete event simulation have been used in risk assessment studies for many years. This report details how features of these two methods can be combined with concepts from object-oriented analysis to develop a new risk assessment methodology with some of the best features of each. The resulting Object-Based Event Scenario Tree (OBEST) methodology enables an analyst to rapidly construct realistic models for scenarios for which an a priori discovery of event ordering is either cumbersome or impossible (especially those that exhibit inconsistent or variable event ordering, which are difficult to represent in an event tree analysis). Each scenario produced by OBEST is automatically associated with a likelihood estimate because probabilistic branching is integral to the object model definition. The OBEST method uses a recursive algorithm to solve the object model and identify all possible scenarios and their associated probabilities. Since scenario likelihoods are developed directly by the solution algorithm, they need not be computed by statistical inference based on Monte Carlo observations (as required by some discrete event simulation methods). Thus, OBEST is not only much more computationally efficient than these simulation methods, but it also discovers scenarios that have extremely low probabilities as a natural analytical result: scenarios that would likely be missed by a Monte Carlo-based method. This report documents the OBEST methodology and the demonstration software that implements it, and provides example OBEST models for several application domains, including interactions among failing interdependent infrastructure systems, circuit analysis for fire risk evaluation in nuclear power plants, and aviation safety studies
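
    The recursive solution algorithm can be pictured as a depth-first walk over probabilistic branch points that multiplies branch probabilities along each path; a toy sketch of that idea (object behavior reduced to a single branching function, names hypothetical):

        def enumerate_scenarios(state, prob, branch, path=()):
            """Depth-first enumeration of all scenarios and their likelihoods.
            branch(state) returns [] for terminal states, otherwise a list
            of (next_state, branch_probability) pairs."""
            options = branch(state)
            if not options:
                yield path + (state,), prob              # complete scenario
                return
            for nxt, p in options:
                yield from enumerate_scenarios(nxt, prob * p, branch,
                                               path + (state,))

        # toy model: a pump either runs or fails; a failure may be recovered
        def branch(state):
            return {"start": [("runs", 0.95), ("fails", 0.05)],
                    "fails": [("recovered", 0.6), ("lost", 0.4)]}.get(state, [])

        for scenario, p in enumerate_scenarios("start", 1.0, branch):
            print(" -> ".join(scenario), f"p={p:.3f}")

    Because every path is enumerated exactly, even scenarios with vanishingly small probabilities appear in the output, which is the property the methodology exploits.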

  5. Simulation of Flash-Flood-Producing Storm Events in Saudi Arabia Using the Weather Research and Forecasting Model

    KAUST Repository

    Deng, Liping; McCabe, Matthew; Stenchikov, Georgiy L.; Evans, Jason P.; Kucera, Paul A.

    2015-01-01

    The challenges of monitoring and forecasting flash-flood-producing storm events in data-sparse and arid regions are explored using the Weather Research and Forecasting (WRF) Model (version 3.5) in conjunction with a range of available satellite

  6. How update schemes influence crowd simulations

    International Nuclear Information System (INIS)

    Seitz, Michael J; Köster, Gerta

    2014-01-01

    Time discretization is a key modeling aspect of dynamic computer simulations. In current pedestrian motion models based on discrete events, e.g. cellular automata and the Optimal Steps Model, fixed-order sequential updates and shuffle updates are prevalent. We propose to use event-driven updates that process events in the order they occur, and thus better match natural movement. In addition, we present a parallel update with collision detection and resolution for situations where computational speed is crucial. Two simulation studies serve to demonstrate the practical impact of the choice of update scheme. Not only do density-speed relations differ, but there is a statistically significant effect on evacuation times. Fixed-order sequential and random shuffle updates with a short update period come close to event-driven updates. The parallel update scheme overestimates evacuation times. All schemes can be employed for arbitrary simulation models with discrete events, such as car traffic or animal behavior.
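
    An event-driven update can be realized with a priority queue keyed on each agent's next event time. The sketch below is a minimal, hypothetical Python rendering (the 0.4 m stride and the Pedestrian class are assumptions, and collision detection is omitted): faster walkers are simply scheduled more often, so updates occur in the order the steps would naturally happen.

    ```python
    import heapq
    import random

    class Pedestrian:
        def __init__(self, speed):
            self.speed = speed            # free walking speed in m/s
            self.position = 0.0
            self.next_step_at = 0.0
        def step_duration(self):
            return 0.4 / self.speed       # time to cover one 0.4 m stride
        def step(self):
            self.position += 0.4

    def event_driven_update(peds, t_end):
        """Advance all pedestrians by processing steps in the order they occur."""
        queue = [(p.next_step_at, i) for i, p in enumerate(peds)]
        heapq.heapify(queue)
        while queue:
            t, i = heapq.heappop(queue)
            if t > t_end:
                break
            peds[i].step()                # faster walkers step sooner in global time
            peds[i].next_step_at = t + peds[i].step_duration()
            heapq.heappush(queue, (peds[i].next_step_at, i))

    peds = [Pedestrian(random.uniform(1.0, 1.6)) for _ in range(5)]
    event_driven_update(peds, t_end=10.0)
    print([round(p.position, 1) for p in peds])
    ```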

  7. Simulation of extreme rainfall event of November 2009 over Jeddah, Saudi Arabia: the explicit role of topography and surface heating

    Science.gov (United States)

    Almazroui, Mansour; Raju, P. V. S.; Yusef, A.; Hussein, M. A. A.; Omar, M.

    2018-04-01

    In this paper, the nonhydrostatic Weather Research and Forecasting (WRF) model is used to simulate the extreme precipitation event of 25 November 2009 over Jeddah, Saudi Arabia. The model is integrated in three nested domains (27, 9, and 3 km) with the initial and boundary forcing derived from the NCEP reanalysis datasets. As a control experiment, the model was integrated for 48 h starting at 0000 UTC on 24 November 2009. The simulated rainfall in the control experiment is in good agreement with Tropical Rainfall Measuring Mission rainfall estimates in terms of intensity as well as spatio-temporal distribution. Results indicate that a strong low-level (850 hPa) wind over Jeddah and the surrounding region enhanced the moisture and temperature gradients and created a conditionally unstable atmosphere that favored the development of the mesoscale system. To investigate the influence of topography and of the surface heat exchange process on the development of the extreme precipitation event, two sensitivity experiments were carried out: one without topography and another without exchange of surface heating to the atmosphere. The results show that both surface heating and topography played a crucial role in determining the spatial distribution and intensity of the extreme rainfall over Jeddah. The topography favored enhanced uplift that further strengthened the low-level jet and hence the rainfall over Jeddah and adjacent areas. The absence of surface heating, on the other hand, considerably reduced the simulated rainfall, by 30% compared with the observations.

  8. Scalable Joint Models for Reliable Uncertainty-Aware Event Prediction.

    Science.gov (United States)

    Soleimani, Hossein; Hensman, James; Saria, Suchi

    2017-08-21

    Missing data and noisy observations pose significant challenges for reliably predicting events from irregularly sampled multivariate time series (longitudinal) data. Imputation methods, which are typically used for completing the data prior to event prediction, lack a principled mechanism to account for the uncertainty due to missingness. Alternatively, state-of-the-art joint modeling techniques can be used to jointly model the longitudinal and event data and to compute event probabilities conditioned on the longitudinal observations. These approaches, however, make strong parametric assumptions and do not easily scale to multivariate signals with many observations. Our proposed approach consists of several key innovations. First, we develop a flexible and scalable joint model based upon sparse multiple-output Gaussian processes. Unlike state-of-the-art joint models, the proposed model can explain highly challenging structure, including non-Gaussian noise, while scaling to large data. Second, we derive an optimal policy for predicting events using the distribution of the event occurrence estimated by the joint model. The derived policy trades off the cost of a delayed detection against that of incorrect assessments, and abstains from making decisions when the estimated event probability does not satisfy the derived confidence criteria. Experiments on a large dataset show that the proposed framework significantly outperforms state-of-the-art techniques in event prediction.
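
    The flavor of a policy that abstains under uncertainty can be conveyed with a toy decision rule. The Python sketch below is not the optimal policy derived in the paper; it is an illustrative stand-in in which the confidence criterion is a simple cap on the width of the credible interval and the alarm rule compares expected costs. All parameter names and values are assumptions.

    ```python
    def decide(p_event, ci_width, cost_miss=10.0, cost_false=1.0, max_width=0.2):
        """Illustrative event-prediction policy with abstention.

        p_event  : estimated probability that the event occurs
        ci_width : width of the credible interval around that estimate
        Alarms when the expected cost of staying silent exceeds the expected
        cost of alarming; abstains when the estimate is too uncertain.
        """
        if ci_width > max_width:
            return "abstain"                    # confidence criterion not met
        if p_event * cost_miss > (1 - p_event) * cost_false:
            return "alarm"
        return "no alarm"

    print(decide(0.4, 0.05))   # 'alarm': 0.4 * 10 > 0.6 * 1
    print(decide(0.4, 0.30))   # 'abstain': estimate too uncertain
    ```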

  9. Simulation and Non-Simulation Based Human Reliability Analysis Approaches

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald Laurids [Idaho National Lab. (INL), Idaho Falls, ID (United States); Shirley, Rachel Elizabeth [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey Clark [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2014-12-01

    Part of the U.S. Department of Energy’s Light Water Reactor Sustainability (LWRS) Program, the Risk-Informed Safety Margin Characterization (RISMC) Pathway develops approaches to estimating and managing safety margins. RISMC simulations pair deterministic plant physics models with probabilistic risk models. As human interactions are an essential element of plant risk, it is necessary to integrate human actions into the RISMC risk model. In this report, we review simulation-based and non-simulation-based human reliability assessment (HRA) methods. Chapter 2 surveys non-simulation-based HRA methods. Conventional HRA methods target static Probabilistic Risk Assessments for Level 1 events. These methods would require significant modification for use in dynamic simulation of Level 2 and Level 3 events. Chapter 3 is a review of human performance models. A variety of methods and models simulate dynamic human performance; however, most of these human performance models were developed outside the risk domain and have not been used for HRA. The exception is the ADS-IDAC model, which can be thought of as a virtual operator program. This model is resource-intensive but provides a detailed model of every operator action in a given scenario, along with models of numerous factors that can influence operator performance. Finally, Chapter 4 reviews the treatment of timing of operator actions in HRA methods. This chapter is an example of one of the critical gaps between existing HRA methods and the needs of dynamic HRA. This report summarizes the foundational information needed to develop a feasible approach to modeling human interactions in the RISMC simulations.

  10. The More Extreme Nature of North American Monsoon Precipitation in the Southwestern United States as Revealed by a Historical Climatology of Simulated Severe Weather Events

    KAUST Repository

    Luong, Thang M.; Castro, Christopher L.; Chang, Hsin-I; Lahmers, Timothy; Adams, David K.; Ochoa-Moya, Carlos A.

    2017-01-01

    Long-term changes in North American monsoon (NAM) precipitation intensity in the southwestern United States are evaluated through the use of convective-permitting model simulations of objectively identified severe weather events during

  11. The More Extreme Nature of North American Monsoon Precipitation in the Southwestern United States as Revealed by a Historical Climatology of Simulated Severe Weather Events

    KAUST Repository

    Luong, Thang M.

    2017-07-03

    Long-term changes in North American monsoon (NAM) precipitation intensity in the southwestern United States are evaluated through the use of convective-permitting model simulations of objectively identified severe weather events during

  12. Development of a Nuclear Reaction Database on Silicon for Simulation of Neutron-Induced Single-Event Upsets in Microelectronics and its Application

    International Nuclear Information System (INIS)

    Watanabe, Yukinobu; Kodama, Akihiro; Tukamoto, Yasuyuki; Nakashima, Hideki

    2005-01-01

    We have developed a cross-section database for neutron-induced reactions on 28Si in the energy range between 2 MeV and 3 GeV in order to analyze single-event upset (SEU) phenomena induced by cosmic-ray neutrons in microelectronic devices. A simplified spherical device model is proposed for simulating the initial processes of SEUs. The model is applied to SEU cross-section calculations for semiconductor memory devices. The calculated results are compared with measured SEU cross sections and with another simulation result. The dependence of SEU cross sections on incident neutron energy, and the secondary ions having the most important effects on SEUs, are discussed.

  13. A Participatory Modeling Application of a Distributed Hydrologic Model in Nuevo Leon, Mexico for the 2010 Hurricane Alex Flood Event

    Science.gov (United States)

    Baish, A. S.; Vivoni, E. R.; Payan, J. G.; Robles-Morua, A.; Basile, G. M.

    2011-12-01

    A distributed hydrologic model can help build consensus among diverse stakeholders in regional flood planning by producing quantifiable sets of alternative futures. This value is especially acute in areas with high uncertainties in hydrologic conditions and sparse observations. In this study, we conduct an application of the Triangulated Irregular Network (TIN)-based Real-time Integrated Basin Simulator (tRIBS) in the Santa Catarina basin of Nuevo Leon, Mexico, where Hurricane Alex in July 2010 led to catastrophic flooding of the capital city of Monterrey. Distributed model simulations utilize the best available information on the regional topography, land cover, and soils, obtained from Mexican government agencies or from analysis of remotely sensed imagery from MODIS and ASTER. Furthermore, we developed meteorological forcing for the flood event based on multiple data sources, including three local gauge networks, satellite-based estimates from TRMM and PERSIANN, and the North American Land Data Assimilation System (NLDAS). Remotely sensed data allowed us to quantify rainfall distributions in the upland, rural portions of the Santa Catarina that are sparsely populated and ungauged. Rural areas contributed significantly to the flood event and as a result were considered by stakeholders for flood control measures, including new reservoirs and upland vegetation management. Participatory modeling workshops with the stakeholders revealed a disconnect between urban and rural populations in regard to understanding the hydrologic conditions of the flood event and the effectiveness of existing and potential flood control measures. Despite these challenges, the use of distributed flood forecasts developed within this participatory framework facilitated building consensus among diverse stakeholders and exploring alternative futures in the basin.

  14. Numerical Simulation of a Breaking Gravity Wave Event Over Greenland Observed During Fastex

    National Research Council Canada - National Science Library

    Doyle, James

    1997-01-01

    Measurements from the NOAA G4 research aircraft and high-resolution numerical simulations are used to study the evolution and dynamics of a large-amplitude gravity wave event over Greenland that took...

  15. The effect of medical trainees on pediatric emergency department flow: a discrete event simulation modeling study.

    Science.gov (United States)

    Genuis, Emerson D; Doan, Quynh

    2013-11-01

    Providing patient care and medical education are both important missions of teaching hospital emergency departments (EDs). With medical school enrollment rising and ED crowding becoming an increasingly prevalent issue, it is important for both pediatric EDs (PEDs) and general EDs to find a balance between these two potentially competing goals. The objective was to determine how the number of trainees in a PED affects patient wait time, total ED length of stay (LOS), and the rate of patients leaving without being seen (LWBS), for PED patients overall and stratified by acuity level as defined by the Pediatric Canadian Triage and Acuity Scale (CTAS), using discrete event simulation (DES) modeling. A DES model of an urban tertiary care PED, which receives approximately 40,000 visits annually, was created and validated. Thirteen different trainee schedules, ranging from zero to six trainees per shift on average, were input into the DES model, and the outcome measures were determined using the combined output of five model iterations. An increase in LOS of approximately 7 minutes was associated with each additional trainee per attending emergency physician working in the PED. The relationship between the number of trainees and wait time varied with patients' level of acuity and with the degree of PED utilization. Patient wait time decreased as the number of trainees increased for low-acuity visits and when the PED was not operating at full capacity. With rising numbers of trainees, the LWBS rate decreased in the whole department and in the CTAS 4 and 5 patient groups, but rose in patients triaged CTAS 3 or higher. A rising number of trainees was not associated with any change in flow outcomes for CTAS 1 patients. The results of this study demonstrate that trainees in PEDs have an impact mainly on patient LOS and that the effect on wait time differs between patients presenting with varying degrees of acuity. These findings will assist PEDs in finding a

  16. Urban nonpoint source pollution buildup and washoff models for simulating storm runoff quality in the Los Angeles County.

    Science.gov (United States)

    Wang, Long; Wei, Jiahua; Huang, Yuefei; Wang, Guangqian; Maqsood, Imran

    2011-07-01

    Many urban nonpoint source pollution models use pollutant buildup and washoff functions to simulate the storm runoff quality of urban catchments. In this paper, two urban pollutant washoff load models are derived using pollutant buildup and washoff functions. The first model assumes that there is no residual pollutant after a storm event, while the second assumes that there is always residual pollutant after each storm event. The developed models are calibrated and verified with observed data from an urban catchment in Los Angeles County. The application results show that the model that accounts for residual pollutant simulates nonpoint source pollution from urban storm runoff better than the one that does not. For the study area, residual pollutant should be considered in the pollutant buildup and washoff functions when the total runoff volume is less than 30 mm.
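
    For concreteness, such models are commonly written with exponential buildup over dry days and exponential washoff as a function of runoff depth (the SWMM-style forms). The Python sketch below is an assumption-laden illustration, not the paper's calibrated functions: the parameter values are invented, and the second model's residual carryover is represented by feeding the unwashed load back into the next buildup period.

    ```python
    import math

    def buildup(b0, dry_days, b_max=50.0, k_b=0.4):
        """Exponential pollutant buildup (kg/ha) over dry days, starting from b0."""
        return b_max - (b_max - b0) * math.exp(-k_b * dry_days)

    def washoff(b, runoff_mm, k_w=0.18):
        """Split the built-up load into washed-off and residual parts."""
        washed = b * (1.0 - math.exp(-k_w * runoff_mm))
        return washed, b - washed      # (load in runoff, residual on surface)

    b = 0.0
    for dry_days, runoff in [(7, 12.0), (3, 45.0)]:
        b = buildup(b, dry_days)
        load, b = washoff(b, runoff)   # model 2: residual carries to the next event
        print(f"event load {load:.1f} kg/ha, residual {b:.1f} kg/ha")
    # Model 1 (no residual) would instead reset b to 0 after every event.
    ```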

  17. Budget Impact Analysis of Switching to Digital Mammography in a Population-Based Breast Cancer Screening Program: A Discrete Event Simulation Model

    Science.gov (United States)

    Comas, Mercè; Arrospide, Arantzazu; Mar, Javier; Sala, Maria; Vilaprinyó, Ester; Hernández, Cristina; Cots, Francesc; Martínez, Juan; Castells, Xavier

    2014-01-01

    Objective To assess the budgetary impact of switching from screen-film mammography to full-field digital mammography in a population-based breast cancer screening program. Methods A discrete-event simulation model was built to reproduce the breast cancer screening process (biennial mammographic screening of women aged 50 to 69 years) combined with the natural history of breast cancer. The simulation started with 100,000 women and, during a 20-year simulation horizon, new women dynamically entered the model according to the aging of the Spanish population. Data on screening were obtained from Spanish breast cancer screening programs. Data on the natural history of breast cancer were based on US data adapted to our population. A budget impact analysis comparing digital with screen-film screening mammography was performed on a sample of 2,000 simulation runs. A sensitivity analysis was performed for crucial screening-related parameters. Distinct scenarios for recall and detection rates were compared. Results Statistically significant savings were found for overall costs, treatment costs and the costs of additional tests in the long term. The overall cost saving was 1,115,857€ (95%CI from 932,147 to 1,299,567) in the 10th year and 2,866,124€ (95%CI from 2,492,610 to 3,239,638) in the 20th year, representing 4.5% and 8.1% of the overall cost associated with screen-film mammography. The sensitivity analysis showed net savings in the long term. Conclusions Switching to digital mammography in a population-based breast cancer screening program produces long-term budget savings, in addition to providing technical advantages. Our results were consistent across distinct scenarios representing the different results obtained in European breast cancer screening programs. PMID:24832200

  18. Productivity improvement using discrete events simulation

    Science.gov (United States)

    Hazza, M. H. F. Al; Elbishari, E. M. Y.; Ismail, M. Y. Bin; Adesta, E. Y. T.; Rahman, Nur Salihah Binti Abdul

    2018-01-01

    The increasing complexity of manufacturing systems has raised the cost of investment in many industries, and theoretical feasibility studies alone are not enough to support investment decisions. Advanced simulation software protects manufacturers from investing in production lines that may not meet their requirements in terms of machine utilization and productivity. Conducting a simulation with an accurate model reduces or eliminates the risk associated with a new investment. The aim of this research is to demonstrate and highlight the importance of simulation in the decision-making process. Delmia Quest software was used to simulate the production line. A simulation was first run for the existing production line and showed an estimated production rate of 261 units/day. The results were analyzed based on utilization percentage and idle time. Two different scenarios were then proposed, each with a different objective. The first scenario focuses on machines with low utilization and long idle times; it reduced the number of machines used by three, together with the workers who maintain them, without affecting the production rate. The second scenario increases the production rate by upgrading the curing machine, which raises daily productivity by 7%, from 261 units to 281 units.

  19. Modeling and simulation of botnet based cyber-threats

    Directory of Open Access Journals (Sweden)

    Kasprzyk Rafał

    2017-01-01

    Full Text Available The paper presents an analysis of cyber-threats, with particular emphasis on threats resulting from botnet activity. Botnets are among the most common types of threats and are often perceived as crucial in terms of national security. Their classification and methods of spreading are the basis for creating a cyberspace model that includes the presence of different types of cyber-threats. A well-designed cyberspace model makes it possible to construct an experimental environment that allows the analysis of botnet characteristics, testing of their resistance to various events, and simulation of their spread and evolution. For this purpose, dedicated platforms with the capabilities and functional characteristics needed to meet these requirements have been proposed.

  20. Conceptual modeling for simulation-based serious gaming

    NARCIS (Netherlands)

    van der Zee, D.J.; Holkenborg, Bart; Robinson, Stewart

    2012-01-01

    In recent years many simulation-based serious games have been developed for supporting (future) managers in operations management decision making. They illustrate the high potential of using discrete event simulation for pedagogical purposes. Unfortunately, this potential does not seem to go

  1. Simulation of Wind-Driven Snow Redistribution at a High-Elevation Alpine Site Using a Meso-Scale Atmospheric Model

    Science.gov (United States)

    Vionnet, V.; Martin, E.; Masson, V.; Guyomarc'h, G.; Naaim-Bouvet, F.; Prokop, A.; Durand, Y.; Lac, C.

    2012-12-01

    In alpine regions, blowing snow events strongly influence the temporal and spatial evolution of the snow depth distribution throughout the winter season. We recently developed a new simulation system to gain understanding of the complex processes that drive the redistribution of snow by the wind in complex terrain. This new system directly couples the detailed snowpack model Crocus with the meso-scale atmospheric model Meso-NH. A blowing snow scheme allows Meso-NH to simulate the transport of snow particles in the atmosphere. We used the coupled system to study a blowing snow event with snowfall that occurred in February 2011 in the Grandes Rousses range (French Alps). Three nested domains at horizontal resolutions of 450, 150 and 50 m allow the model to simulate the complex 3D precipitation and wind fields around our experimental site (2720 m a.s.l.) during this 22-hour event. Wind-induced snow transport is activated over the two domains of higher resolution (150 and 50 m). We first assessed the ability of the model to reproduce atmospheric flows at high resolution in alpine terrain using a large dataset of observations (meteorological data and vertical profiles of wind speed). Simulated blowing snow fluxes are then compared with measurements from a snow particle counter (SPC) and mechanical snow traps. Finally, a map of snow erosion and accumulation produced by terrestrial laser scanning allows us to evaluate the quality of the simulated snow depth redistribution.

  2. Using the Integration of Discrete Event and Agent-Based Simulation to Enhance Outpatient Service Quality in an Orthopedic Department

    Directory of Open Access Journals (Sweden)

    Cholada Kittipittayakorn

    2016-01-01

    Full Text Available Many hospitals are currently paying more attention to patient satisfaction since it is an important service quality index. Many Asian countries’ healthcare systems have a mixed-type registration, accepting both walk-in patients and scheduled patients. This complex registration system causes a long patient waiting time in outpatient clinics. Different approaches have been proposed to reduce the waiting time. This study uses the integration of discrete event simulation (DES and agent-based simulation (ABS to improve patient waiting time and is the first attempt to apply this approach to solve this key problem faced by orthopedic departments. From the data collected, patient behaviors are modeled and incorporated into a massive agent-based simulation. The proposed approach is an aid for analyzing and modifying orthopedic department processes, allows us to consider far more details, and provides more reliable results. After applying the proposed approach, the total waiting time of the orthopedic department fell from 1246.39 minutes to 847.21 minutes. Thus, using the correct simulation model significantly reduces patient waiting time in an orthopedic department.

  3. Using the Integration of Discrete Event and Agent-Based Simulation to Enhance Outpatient Service Quality in an Orthopedic Department.

    Science.gov (United States)

    Kittipittayakorn, Cholada; Ying, Kuo-Ching

    2016-01-01

    Many hospitals are currently paying more attention to patient satisfaction since it is an important service quality index. Many Asian countries' healthcare systems have a mixed-type registration, accepting both walk-in patients and scheduled patients. This complex registration system causes a long patient waiting time in outpatient clinics. Different approaches have been proposed to reduce the waiting time. This study uses the integration of discrete event simulation (DES) and agent-based simulation (ABS) to improve patient waiting time and is the first attempt to apply this approach to solve this key problem faced by orthopedic departments. From the data collected, patient behaviors are modeled and incorporated into a massive agent-based simulation. The proposed approach is an aid for analyzing and modifying orthopedic department processes, allows us to consider far more details, and provides more reliable results. After applying the proposed approach, the total waiting time of the orthopedic department fell from 1246.39 minutes to 847.21 minutes. Thus, using the correct simulation model significantly reduces patient waiting time in an orthopedic department.

  4. A Probabilistic Model of Meter Perception: Simulating Enculturation

    Directory of Open Access Journals (Sweden)

    Bastiaan van der Weij

    2017-05-01

    Full Text Available Enculturation is known to shape the perception of meter in music but this is not explicitly accounted for by current cognitive models of meter perception. We hypothesize that the induction of meter is a result of predictive coding: interpreting onsets in a rhythm relative to a periodic meter facilitates prediction of future onsets. Such prediction, we hypothesize, is based on previous exposure to rhythms. As such, predictive coding provides a possible explanation for the way meter perception is shaped by the cultural environment. Based on this hypothesis, we present a probabilistic model of meter perception that uses statistical properties of the relation between rhythm and meter to infer meter from quantized rhythms. We show that our model can successfully predict annotated time signatures from quantized rhythmic patterns derived from folk melodies. Furthermore, we show that by inferring meter, our model improves prediction of the onsets of future events compared to a similar probabilistic model that does not infer meter. Finally, as a proof of concept, we demonstrate how our model can be used in a simulation of enculturation. From the results of this simulation, we derive a class of rhythms that are likely to be interpreted differently by enculturated listeners with different histories of exposure to rhythms.

  5. Combining Latin Hypercube Designs and Discrete Event Simulation in a Study of a Surgical Unit

    DEFF Research Database (Denmark)

    Dehlendorff, Christian; Andersen, Klaus Kaae; Kulahci, Murat

    In this article, experiments on a discrete event simulation model of an orthopedic surgery unit are considered. The model is developed as part of a larger project in co-operation with Copenhagen University Hospital in Gentofte. Experiments on the model are performed using Latin hypercube designs. The parameter set consists of system settings, such as the use of a preparation room for sedation and the number of operating rooms, as well as management decisions, such as staffing, the size of the recovery room, and the number of simultaneously active operating rooms. Sensitivity analysis and optimization combined with meta-modeling are employed in the search for optimal setups. The primary objective in this article is to minimize the time spent by patients in the system. The overall long-term objective for the orthopedic surgery unit is to minimize time lost during the pre- and post-operation...
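
    A Latin hypercube design stratifies each factor into as many slices as there are runs and uses every slice exactly once per factor, which spreads a small number of simulation runs evenly over a high-dimensional parameter space. The Python sketch below is a minimal generator of such a design; the three example factor ranges are invented, not those of the hospital study.

    ```python
    import random

    def latin_hypercube(n, bounds):
        """Generate n design points over d factors; each factor is stratified
        into n equal slices and each slice is used exactly once."""
        d = len(bounds)
        strata = [random.sample(range(n), n) for _ in range(d)]  # one permutation per factor
        design = []
        for i in range(n):
            point = []
            for j, (lo, hi) in enumerate(bounds):
                u = (strata[j][i] + random.random()) / n         # jitter within the slice
                point.append(lo + u * (hi - lo))
            design.append(point)
        return design

    # e.g. 8 runs over (operating rooms, recovery beds, staff), ranges invented
    for run in latin_hypercube(8, [(2, 6), (4, 12), (5, 15)]):
        print([round(x, 1) for x in run])
    ```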

  6. A Discrete Event Simulation Model for Evaluating the Performances of an M/G/C/C State Dependent Queuing System

    Science.gov (United States)

    Khalid, Ruzelan; M. Nawawi, Mohd Kamal; Kawsar, Luthful A.; Ghani, Noraida A.; Kamil, Anton A.; Mustafa, Adli

    2013-01-01

    M/G/C/C state-dependent queuing networks consider service rates as a function of the number of residing entities (e.g., pedestrians, vehicles, and products). However, modeling such dynamic rates is not supported in modern discrete event simulation (DES) software. We designed an approach to work around this limitation and used it to construct an M/G/C/C state-dependent queuing model in Arena software. Using the model, we evaluated and analyzed the impacts of various arrival rates on the throughput, the blocking probability, the expected service time, and the expected number of entities in a complex network topology. Results indicated that for each network there is a range of arrival rates within which the simulation results fluctuate drastically across replications, causing discrepancies between the simulation and analytical results. Detailed results showing how closely the simulation and analytical results tally, in both tabular and graphical forms, together with scientific justifications, are documented and discussed. PMID:23560037
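
    The essence of an M/G/C/C state-dependent queue can be captured with a small event-driven loop. The Python sketch below is a simplified, hypothetical illustration for a single pedestrian corridor: walking speed decays linearly with occupancy (a common choice in this literature), and an arrival that finds C entities is blocked and lost. It freezes each entity's travel time at the speed seen on entry, whereas exact M/G/C/C treatments recompute the remaining travel whenever the occupancy changes; all parameter values are invented.

    ```python
    import heapq
    import random

    def mgcc_sim(lam, C, length=10.0, v1=1.5, t_end=10_000.0):
        """Event-driven simulation of one M/G/C/C state-dependent queue.

        Walking speed decays with occupancy n: v(n) = v1 * (C - n + 1) / C.
        An arrival finding n == C is blocked.
        """
        t, n, arrivals, blocked, done = 0.0, 0, 0, 0, 0
        events = [(random.expovariate(lam), "arrive")]
        while events:
            t, kind = heapq.heappop(events)
            if t > t_end:
                break
            if kind == "arrive":
                arrivals += 1
                heapq.heappush(events, (t + random.expovariate(lam), "arrive"))
                if n == C:
                    blocked += 1               # corridor full: arrival is lost
                else:
                    n += 1
                    v = v1 * (C - n + 1) / C   # state-dependent speed on entry
                    heapq.heappush(events, (t + length / v, "depart"))
            else:
                n -= 1
                done += 1
        print(f"throughput {done / t:.3f}/s, blocking {blocked / arrivals:.3%}")

    mgcc_sim(lam=0.8, C=20)
    ```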

  7. Initiating Events Modeling for On-Line Risk Monitoring Application

    International Nuclear Information System (INIS)

    Simic, Z.; Mikulicic, V.

    1998-01-01

    In order to make the on-line risk monitoring application of Probabilistic Risk Assessment more complete and realistic, special attention needs to be dedicated to initiating event modeling. Two different issues are of special importance: the first is how to model initiating event frequencies according to the current plant configuration (equipment alignment and out-of-service status) and operating conditions (weather and various activities), and the second is how to preserve the dependencies between the initiating event model and the rest of the PRA model. First, the paper discusses how initiating events can be treated in an on-line risk monitoring application. Second, a practical example of initiating event modeling in EPRI's Equipment Out of Service on-line monitoring tool is presented. Gains from the application and possible improvements are discussed in the conclusion.

  8. Forecasting severe ice storms using numerical weather prediction: the March 2010 Newfoundland event

    Directory of Open Access Journals (Sweden)

    J. Hosek

    2011-02-01

    Full Text Available The northeast coast of North America is frequently hit by severe ice storms. These freezing rain events can produce large ice accretions that damage structures, most often power transmission and distribution infrastructure. For this reason, it is highly desirable to model and forecast such icing events so that the consequent damage can be prevented or mitigated. The case study presented in this paper focuses on the March 2010 ice storm event that took place in eastern Newfoundland. We apply a combination of a numerical weather prediction model and an ice accretion algorithm to simulate a forecast of this event.

    The main goals of this study are to compare the simulated meteorological variables to observations and to assess the ability of the model to accurately predict the ice accretion load for different forecast horizons. The duration and timing of the freezing rain event that occurred between the night of 4 March and the morning of 6 March were simulated well in all model runs. The total precipitation amounts in the model, however, differed by up to a factor of two from the observations. The accuracy of the model air temperature depended strongly on the forecast horizon, but it was acceptable for all simulation runs. The simulated accretion loads were also compared to the design values for power delivery structures in the region. The results indicated that the simulated values exceeded the design criteria in the areas of reported damage and power outages.

  9. Impact of Assimilation on Heavy Rainfall Simulations Using WRF Model: Sensitivity of Assimilation Results to Background Error Statistics

    Science.gov (United States)

    Rakesh, V.; Kantharao, B.

    2017-03-01

    Data assimilation is considered one of the effective tools for improving the forecast skill of mesoscale models. However, for optimum utilization and effective assimilation of observations, many factors need to be taken into account while designing a data assimilation methodology. One of the critical components that determines the amount and propagation of observation information into the analysis is the model background error statistics (BES). The objective of this study is to quantify how the BES used in data assimilation impact the simulation of heavy rainfall events over Karnataka, a southern state in India. Simulations of 40 heavy rainfall events were carried out using the Weather Research and Forecasting (WRF) Model with and without data assimilation. The assimilation experiments were conducted using global and regional BES, while the experiment with no assimilation was used as the baseline for assessing the impact of data assimilation. The simulated rainfall is verified against high-resolution rain-gage observations over Karnataka. Statistical evaluation using several accuracy and skill measures shows that data assimilation improved the heavy rainfall simulation. Our results showed that the experiment using regional BES outperformed the one that used global BES. Critical thermodynamic variables conducive to heavy rainfall, such as convective available potential energy, are simulated more realistically using regional BES than using global BES. These results have important practical implications for the design of forecast platforms and for decision-making during extreme weather events.

  10. Fractional counts: the simulation of low probability events

    International Nuclear Information System (INIS)

    Coldwell, R.L.; Lasche, G.P.; Jadczyk, A.

    2001-01-01

    The code RobSim has been added to RobWin [1]. It simulates spectra resulting from gamma rays striking an array of detectors made up of different components. These components are frequently used to set coincidence and anti-coincidence windows that decide whether individual events are part of the signal. The first problem addressed is the construction of the detector. Owing to the statistical nature of the responses of these elements, there is randomness in the detector response that can be taken into account by including fractional counts in the output spectrum. This somewhat complicates the error analysis, as Poisson statistics are no longer applicable.

  11. Calibration of a rainfall-runoff hydrological model and flood simulation using data assimilation

    Science.gov (United States)

    Piacentini, A.; Ricci, S. M.; Thual, O.; Coustau, M.; Marchandise, A.

    2010-12-01

    velocity travel before the flood peak. These optimal values are used for a new simulation of the event in forecast mode (under the assumption of perfect rainfall). On both catchments, it was shown over a significant number of flood events that the data assimilation procedure improves the flood peak forecast. The improvement is globally more important for the Gardon d'Anduze catchment, where the flood events are stronger. The peak can be forecasted up to 36 hours ahead of time by assimilating very few observations (up to 4) during the rise of the water level. For multiple-peak events, the assimilation of observations from the first peak leads to a significant improvement of the simulation of the second peak. It was also shown that the flood rise is often faster in reality than it is represented in the model. In this case, and when the flood peak is underestimated in the simulation, the use of the first observations can mislead the data assimilation algorithm. Careful estimation of the observation and background error variances enabled satisfactory use of data assimilation even in these complex cases, although it does not allow correction of the model error.
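
    The variance weighting mentioned at the end is the standard best-linear-unbiased (Kalman-type) analysis step that underlies such assimilation schemes. The scalar Python sketch below shows this general form only and is not the paper's algorithm; the discharge numbers are invented.

    ```python
    def assimilate(x_b, var_b, y, var_o):
        """Scalar BLUE/Kalman analysis update: weight the background x_b and the
        observation y by the inverse of their error variances."""
        k = var_b / (var_b + var_o)       # gain: trust the data more when var_o is small
        x_a = x_b + k * (y - x_b)
        var_a = (1.0 - k) * var_b
        return x_a, var_a

    # Background peak-discharge estimate 800 m3/s (variance 90000) corrected by a
    # water-level-derived observation of 1100 m3/s (variance 40000):
    print(assimilate(800.0, 9.0e4, 1100.0, 4.0e4))   # -> (~1007.7, ~27692.3)
    ```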

  12. Modelado del transformador para eventos de alta frecuencia; Transformer model for high frequency events

    Directory of Open Access Journals (Sweden)

    Verónica Adriana Galván Sánchez

    2012-07-01

    Full Text Available The function of a transformer is to change the voltage level through a magnetic coupling. Because of its physical construction, its representation as a circuit and its mathematical model are very complex. The electromagnetic behavior of the transformer, like that of all elements of the power network, depends on the frequency involved, so for high-frequency phenomena its model must be very detailed in order to reproduce the transient behavior. This work analyzes how to pass from a very simple model to a very detailed model for simulating high-frequency events. The simulated events are the operation of a circuit breaker due to a fault in the system and the impact of an atmospheric discharge (direct stroke) on the transmission line, 5 km from a power substation.

  13. Modelado del transformador para eventos de alta frecuencia; Transformer model for high frequency events

    Directory of Open Access Journals (Sweden)

    Verónica Adriana Galván Sánchez

    2012-07-01

    Full Text Available The function of a transformer is to change the voltage level through a magnetic coupling. Because of its physical construction, its representation as a circuit and its mathematical model are very complex. The electromagnetic behavior of the transformer, like that of all elements of the power network, depends on the frequency involved, so for high-frequency phenomena its model must be very detailed in order to reproduce the transient behavior. This work analyzes how to pass from a very simple model to a very detailed model for simulating high-frequency events. The simulated events are the operation of a circuit breaker due to a fault in the system and the impact of an atmospheric discharge (direct stroke) on the transmission line, 5 km from a power substation.

  14. Micromechanics and statistics of slipping events in a granular seismic fault model

    Energy Technology Data Exchange (ETDEWEB)

    Arcangelis, L de [Department of Information Engineering and CNISM, Second University of Naples, Aversa (Italy); Ciamarra, M Pica [CNR-SPIN, Dipartimento di Scienze Fisiche, Universita di Napoli Federico II (Italy); Lippiello, E; Godano, C, E-mail: dearcangelis@na.infn.it [Department of Environmental Sciences and CNISM, Second University of Naples, Caserta (Italy)

    2011-09-15

    Stick-slip is investigated in a seismic fault model made of a confined granular system under shear stress via three-dimensional molecular dynamics simulations. We study the statistics of slipping events and, in particular, the dependence of their distribution on the model parameters. The distribution consistently exhibits two regimes: an initial power law and a bump at large slips. The initial power-law decay is in agreement with the Gutenberg-Richter law characterizing real seismic occurrence. The exponent of the initial regime is quite independent of the model parameters, and its value is in agreement with experimental results. Conversely, the position of the bump is controlled solely by the ratio of the drive elastic constant to the system size. Large slips also become less probable in the absence of fault gouge and tend to disappear for stiff drives. A two-time force-force correlation function and a susceptibility related to the system's response to pressure changes characterize the micromechanics of slipping events. The correlation function unveils the micromechanical changes occurring both during microslips and slips. The mechanical susceptibility encodes the magnitude of the incoming microslip. Numerical results for the cellular-automaton version of the spring-block model confirm the parameter dependence observed for the size distribution in the granular model.

  15. Simulation of Random Events for Air Traffic Applications

    Directory of Open Access Journals (Sweden)

    Stéphane Puechmorel

    2018-05-01

    Full Text Available Resilience to uncertainties must be ensured in air traffic management. Unexpected events can be disruptive, like thunderstorms or the famous volcanic ash cloud resulting from the Eyjafjallajökull eruption in Iceland, or simply due to imprecise measurements or incomplete knowledge of the environment. While human operators are able to cope with such situations, this is generally not the case for automated decision support tools. Important examples originate from the numerous attempts made to design algorithms able to solve conflicts between aircraft occurring during flights. The STARGATE (STochastic AppRoach for naviGATion functions in uncertain Environment) project was initiated in order to study the feasibility of inherently robust automated planning algorithms that will not fail when subjected to random perturbations. A mandatory first step is the ability to simulate the usual stochastic phenomena impairing the system: delays due to airport platforms or air traffic control (ATC), and uncertainties in the wind velocity. The work presented here details algorithms suitable for this simulation task.

  16. Optimizing patient flow in a large hospital surgical centre by means of discrete-event computer simulation models.

    Science.gov (United States)

    Ferreira, Rodrigo B; Coelli, Fernando C; Pereira, Wagner C A; Almeida, Renan M V R

    2008-12-01

    This study used discrete-event computer simulation to model a large hospital surgical centre (SC), in order to analyse the impact of increases in the number of post-anaesthetic beds (PABs), of changes in surgical room scheduling strategies, and of increases in surgery numbers. The inputs were: number of surgeries per day, type of surgical room scheduling, anaesthesia and surgery duration, surgical teams' specialty, and number of PABs; the main outputs were: number of surgeries per day, surgical rooms' use rate and blocking rate, surgical teams' use rate, patients' blocking rate, surgery delays (minutes), and the occurrence of postponed surgeries. Two basic strategies were implemented: in the first, the number of PABs was increased under two assumptions: (a) following the scheduling plan actually used by the hospital ('rigid' scheduling: surgical rooms were assigned in advance and assignments could not be changed) and (b) following a 'flexible' scheduling (surgical rooms, when available, could be freely used by any surgical team). In the second, the same analysis was performed, increasing the number of patients (up to the system's 'feasible maximum') but fixing the number of PABs, in order to evaluate the impact of the number of patients on surgery delays. It was observed that the introduction of flexible scheduling or an increase in PABs would lead to a significant improvement in SC productivity.

  17. A predictive model of nuclear power plant crew decision-making and performance in a dynamic simulation environment

    Science.gov (United States)

    Coyne, Kevin Anthony

    The safe operation of complex systems such as nuclear power plants requires close coordination between the human operators and plant systems. In order to maintain an adequate level of safety following an accident or other off-normal event, the operators often are called upon to perform complex tasks during dynamic situations with incomplete information. The safety of such complex systems can be greatly improved if the conditions that could lead operators to make poor decisions and commit erroneous actions during these situations can be predicted and mitigated. The primary goal of this research project was the development and validation of a cognitive model capable of simulating nuclear plant operator decision-making during accident conditions. Dynamic probabilistic risk assessment methods can improve the prediction of human error events by providing rich contextual information and an explicit consideration of feedback arising from man-machine interactions. The Accident Dynamics Simulator paired with the Information, Decision, and Action in a Crew context cognitive model (ADS-IDAC) shows promise for predicting situational contexts that might lead to human error events, particularly knowledge driven errors of commission. ADS-IDAC generates a discrete dynamic event tree (DDET) by applying simple branching rules that reflect variations in crew responses to plant events and system status changes. Branches can be generated to simulate slow or fast procedure execution speed, skipping of procedure steps, reliance on memorized information, activation of mental beliefs, variations in control inputs, and equipment failures. Complex operator mental models of plant behavior that guide crew actions can be represented within the ADS-IDAC mental belief framework and used to identify situational contexts that may lead to human error events. This research increased the capabilities of ADS-IDAC in several key areas. The ADS-IDAC computer code was improved to support additional

  18. Evolution of Precipitation Structure During the November DYNAMO MJO Event: Cloud-Resolving Model Intercomparison and Cross Validation Using Radar Observations

    Science.gov (United States)

    Li, Xiaowen; Janiga, Matthew A.; Wang, Shuguang; Tao, Wei-Kuo; Rowe, Angela; Xu, Weixin; Liu, Chuntao; Matsui, Toshihisa; Zhang, Chidong

    2018-04-01

    The evolution of precipitation structures is simulated and compared with radar observations for the November Madden-Julian Oscillation (MJO) event during the DYNAmics of the MJO (DYNAMO) field campaign. Three ground-based, ship-borne, and spaceborne precipitation radars and three cloud-resolving models (CRMs) driven by observed large-scale forcing are used to study precipitation structures at different locations over the central equatorial Indian Ocean. Convective strength is represented by 0-dBZ echo-top heights, and convective organization by contiguous 17-dBZ areas. The multi-radar and multi-model framework allows for more stringent model validation. The emphasis is on testing the models' ability to simulate the subtle differences observed at different radar sites as the MJO event passed through. The results show that CRMs forced by site-specific large-scale forcing can reproduce not only the common features of the cloud populations but also the subtle variations observed by different radars. The comparisons also revealed a common deficiency of the CRM simulations: they underestimate radar echo-top heights for the strongest convection within large, organized precipitation features. Cross validation with multiple radars and models also enables quantitative comparisons in CRM sensitivity studies using different large-scale forcings, microphysical schemes and parameters, resolutions, and domain sizes. In terms of radar echo-top height temporal variations, many model sensitivity tests have better correlations than the radar/model comparisons, indicating robust model performance in this respect. It is further shown that well-validated model simulations could be used to constrain uncertainties in observed echo-top heights when a low-resolution surveillance scanning strategy is used.

  19. Modelling machine ensembles with discrete event dynamical system theory

    Science.gov (United States)

    Hunter, Dan

    1990-01-01

    Discrete Event Dynamical System (DEDS) theory can be utilized as a control strategy for the future complex machine ensembles that will be required for in-space construction. The control strategy involves orchestrating a set of interactive submachines to perform a set of tasks under a given set of constraints, such as minimum time, minimum energy, or maximum machine utilization. Machine ensembles can be hierarchically modeled as a global model that combines the operations of the individual submachines. These submachines are represented in the global model as local models. A local model, from the perspective of DEDS theory, is described by the following: a set of system and transition states; an event alphabet that portrays actions that take a submachine from one state to another; an initial system state; a partial function that maps the current state and event alphabet to the next state; and the time required for the event to occur. Each submachine in the machine ensemble is represented by a unique local model. The global model combines the local models such that the local models can operate in parallel under the additional logistic and physical constraints due to submachine interactions. The global model is constructed from the states, events, event functions, and timing requirements of the local models. Supervisory control can be implemented in the global model by various methods, such as task scheduling (open-loop control) or implementing a feedback DEDS controller (closed-loop control).
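
    That five-part description maps naturally onto a small data structure. The Python sketch below is a hypothetical rendering of one local model as a timed automaton (the names LocalModel and fire are illustrative, not from the paper): the partial transition function rejects events that are not enabled in the current state, and each fired event advances the submachine's clock by the event's duration. Composing several such local models into a global model with synchronization constraints is omitted here.

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class LocalModel:
        """A DEDS local model: states, an event alphabet, an initial state, a
        partial transition function, and a duration for each event."""
        states: set
        alphabet: set
        initial: str
        delta: dict            # (state, event) -> next state (partial function)
        duration: dict         # event -> time required for the event to occur
        state: str = field(init=False)
        clock: float = field(init=False, default=0.0)

        def __post_init__(self):
            self.state = self.initial

        def fire(self, event):
            key = (self.state, event)
            if key not in self.delta:
                raise ValueError(f"event {event!r} not enabled in state {self.state!r}")
            self.state = self.delta[key]
            self.clock += self.duration[event]

    # A two-state welder submachine: idle --start--> busy --finish--> idle
    welder = LocalModel(
        states={"idle", "busy"},
        alphabet={"start", "finish"},
        initial="idle",
        delta={("idle", "start"): "busy", ("busy", "finish"): "idle"},
        duration={"start": 1.0, "finish": 42.0},
    )
    welder.fire("start")
    welder.fire("finish")
    print(welder.state, welder.clock)    # idle 43.0
    ```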

  20. Discrete event simulations for glycolysis pathway and energy balance

    NARCIS (Netherlands)

    Zwieten, van D.A.J.; Rooda, J.E.; Armbruster, H.D.; Nagy, J.D.

    2010-01-01

    In this report, the biological network of the glycolysis pathway has been modeled using discrete event models (DEMs). The most important feature of this pathway is that energy is released. To create a stable steady-state system, an energy molecule equilibrating enzyme and metabolic reactions have