WorldWideScience

Sample records for event realistically simulated

  1. A Madden-Julian oscillation event realistically simulated by a global cloud-resolving model.

    Science.gov (United States)

    Miura, Hiroaki; Satoh, Masaki; Nasuno, Tomoe; Noda, Akira T; Oouchi, Kazuyoshi

    2007-12-14

    A Madden-Julian Oscillation (MJO) is a massive weather event consisting of deep convection coupled with atmospheric circulation, moving slowly eastward over the Indian and Pacific Oceans. Despite its enormous influence on many weather and climate systems worldwide, it has proven very difficult to simulate an MJO because of assumptions about cumulus clouds in global meteorological models. Using a model that allows direct coupling of the atmospheric circulation and clouds, we successfully simulated the slow eastward migration of an MJO event. Topography, the zonal sea surface temperature gradient, and interplay between eastward- and westward-propagating signals controlled the timing of the eastward transition of the convective center. Our results demonstrate the potential for making month-long MJO predictions when global cloud-resolving models with realistic initial conditions are used.

  2. Novel high-fidelity realistic explosion damage simulation for urban environments

    Science.gov (United States)

    Liu, Xiaoqing; Yadegar, Jacob; Zhu, Youding; Raju, Chaitanya; Bhagavathula, Jaya

    2010-04-01

    Realistic building damage simulation has a significant impact on modern modeling and simulation systems, especially in the diverse panoply of military and civil applications where these systems are widely used for personnel training, critical mission planning, disaster management, etc. Realistic building damage simulation should incorporate accurate physics-based explosion models, rubble generation, rubble flyout, and interactions between flying rubble and the surrounding entities. However, none of the existing building damage simulation systems realizes the criteria of realism faithfully enough for effective military applications. In this paper, we present a novel physics-based, high-fidelity, and runtime-efficient explosion simulation system to realistically simulate destruction to buildings. In the proposed system, a family of novel blast models is applied to accurately and realistically simulate explosions based on static and/or dynamic detonation conditions. The system also takes account of rubble pile formation and applies a generic and scalable multi-component-based object representation to describe scene entities, together with a highly scalable agent-subsumption architecture and scheduler to schedule clusters of sequential and parallel events. The proposed system utilizes a highly efficient and scalable tetrahedral decomposition approach to realistically simulate rubble formation. Experimental results demonstrate that the proposed system has the capability to realistically simulate rubble generation, rubble flyout and their primary and secondary impacts on surrounding objects including buildings, constructions, vehicles and pedestrians in clusters of sequential and parallel damage events.

  3. Realistic and efficient 2D crack simulation

    Science.gov (United States)

    Yadegar, Jacob; Liu, Xiaoqing; Singh, Abhishek

    2010-04-01

    Although numerical algorithms for 2D crack simulation have been studied in Modeling and Simulation (M&S) and computer graphics for decades, realism and computational efficiency are still major challenges. In this paper, we introduce a high-fidelity, scalable, adaptive, and runtime-efficient 2D crack/fracture simulation system by applying the mathematically elegant Peano-Cesaro triangular meshing/remeshing technique to model the generation of shards/fragments. The recursive fractal sweep associated with the Peano-Cesaro triangulation provides efficient local multi-resolution refinement to any level-of-detail. The generated binary decomposition tree also provides an efficient neighbor retrieval mechanism used for mesh element splitting and merging with minimal memory requirements essential for realistic 2D fragment formation. Upon load impact/contact/penetration, a number of factors including impact angle, impact energy, and material properties are all taken into account to produce the criteria of crack initialization, propagation, and termination, leading to realistic fractal-like rubble/fragment formation. The aforementioned parameters are used as variables of probabilistic models of crack/shard formation, making the proposed solution highly adaptive by allowing machine learning mechanisms to learn the optimal values for the variables/parameters based on prior benchmark data generated by off-line physics-based simulation solutions that produce accurate fractures/shards, though at a highly non-real-time pace. Crack/fracture simulation has been conducted on various load impacts with different initial locations at various impulse scales. The simulation results demonstrate that the proposed system has the capability to realistically and efficiently simulate 2D crack phenomena (such as window shattering and shards generation) with diverse potentials in military and civil M&S applications such as training and mission planning.
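
    The abstract's key idea is a binary decomposition tree built by recursive triangle bisection that refines the mesh only where fragments form. The sketch below is an illustrative, hypothetical reconstruction of that idea (newest-vertex bisection of a right triangle, refined near an impact point); it is not the authors' implementation, and the refinement criterion and names are made up.

```python
"""Illustrative sketch only: recursive triangle bisection around an impact point,
loosely inspired by the binary-decomposition-tree meshing described above."""
import math

class Tri:
    def __init__(self, a, b, c):
        # Vertices ordered so that edge (a, c) is the longest edge (the one bisected).
        self.a, self.b, self.c = a, b, c
        self.children = []          # binary decomposition tree

    def split(self):
        # Bisect the longest edge at its midpoint, producing two child triangles.
        m = ((self.a[0] + self.c[0]) / 2.0, (self.a[1] + self.c[1]) / 2.0)
        self.children = [Tri(self.a, m, self.b), Tri(self.b, m, self.c)]
        return self.children

def refine(tri, impact, max_depth, radius=0.3, depth=0):
    """Refine only triangles whose centroid lies near the impact point."""
    cx = (tri.a[0] + tri.b[0] + tri.c[0]) / 3.0
    cy = (tri.a[1] + tri.b[1] + tri.c[1]) / 3.0
    if depth < max_depth and math.hypot(cx - impact[0], cy - impact[1]) < radius:
        for child in tri.split():
            refine(child, impact, max_depth, radius, depth + 1)

def leaves(tri):
    if not tri.children:
        return [tri]
    return [leaf for child in tri.children for leaf in leaves(child)]

root = Tri((0.0, 0.0), (0.0, 1.0), (1.0, 1.0))   # right triangle, hypotenuse a-c
refine(root, impact=(0.2, 0.4), max_depth=8)
print(len(leaves(root)), "leaf fragments after local refinement")
```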

  4. A more realistic simulation of the performance of the infra-sound monitoring network

    International Nuclear Information System (INIS)

    Le Pichon, A.; Vergoz, J.; Blanc, E.

    2008-01-01

    The first global maps showing the performance of the infra-sound network of the international monitoring system were produced in the nineties. Recent measurements of the background noise at the 36 operating stations, combined with advanced wind models, now give a more realistic mapping. It has become possible to validate simulations against measurements of real events. For instance, the explosion that occurred in March 2008 in an ammunition storehouse in Albania was detected as far away as Zalesovo (Russia), 4920 km distant. These new simulations confirm the capability of the network to detect and localize atmospheric explosions whose energy exceeds 1 kt. It is also shown that the detection performance is very sensitive to both time and location. (A.C.)

  5. MetAssimulo: Simulation of Realistic NMR Metabolic Profiles

    Directory of Open Access Journals (Sweden)

    De Iorio Maria

    2010-10-01

    Full Text Available Abstract Background Probing the complex fusion of genetic and environmental interactions, metabolic profiling (or metabolomics/metabonomics), the study of small molecules involved in metabolic reactions, is a rapidly expanding 'omics' field. A major technique for capturing metabolite data is 1H-NMR spectroscopy and this yields highly complex profiles that require sophisticated statistical analysis methods. However, experimental data is difficult to control and expensive to obtain. Thus data simulation is a productive route to aid algorithm development. Results MetAssimulo is a MATLAB-based package that has been developed to simulate 1H-NMR spectra of complex mixtures such as metabolic profiles. Drawing data from a metabolite standard spectral database in conjunction with concentration information input by the user or constructed automatically from the Human Metabolome Database, MetAssimulo is able to create realistic metabolic profiles containing large numbers of metabolites with a range of user-defined properties. Current features include the simulation of two groups ('case' and 'control') specified by means and standard deviations of concentrations for each metabolite. The software enables addition of spectral noise with a realistic autocorrelation structure at user-controllable levels. A crucial feature of the algorithm is its ability to simulate both intra- and inter-metabolite correlations, the analysis of which is fundamental to many techniques in the field. Further, MetAssimulo is able to simulate shifts in NMR peak positions that result from matrix effects such as pH differences which are often observed in metabolic NMR spectra and pose serious challenges for statistical algorithms. Conclusions No other software is currently able to simulate NMR metabolic profiles with such complexity and flexibility. This paper describes the algorithm behind MetAssimulo and demonstrates how it can be used to simulate realistic NMR metabolic profiles with
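
    MetAssimulo itself is a MATLAB package; the Python fragment below is only a minimal sketch of two ingredients the abstract describes: drawing inter-correlated metabolite concentrations for 'case' and 'control' groups, and adding noise with an autocorrelation structure. All metabolite names, concentrations, and correlation values are invented for illustration.

```python
"""Minimal sketch (not MetAssimulo): correlated concentrations plus AR(1) noise."""
import numpy as np

rng = np.random.default_rng(0)

metabolites  = ["glucose", "lactate", "alanine"]
mean_control = np.array([5.0, 1.2, 0.4])      # hypothetical mean concentrations
mean_case    = np.array([6.5, 2.0, 0.4])
sd           = np.array([0.5, 0.3, 0.05])
corr = np.array([[1.0, 0.6, 0.2],             # inter-metabolite correlations
                 [0.6, 1.0, 0.1],
                 [0.2, 0.1, 1.0]])
cov = corr * np.outer(sd, sd)

def simulate_group(mean, n_subjects):
    """Draw one concentration vector per subject with the given correlations."""
    return rng.multivariate_normal(mean, cov, size=n_subjects)

def ar1_noise(n_points, rho=0.9, scale=0.01):
    """Spectral noise with a simple AR(1) autocorrelation structure."""
    noise = np.zeros(n_points)
    eps = rng.normal(scale=scale, size=n_points)
    for i in range(1, n_points):
        noise[i] = rho * noise[i - 1] + eps[i]
    return noise

control = simulate_group(mean_control, n_subjects=20)
case    = simulate_group(mean_case,    n_subjects=20)
spectrum_noise = ar1_noise(n_points=2048)
print(control.shape, case.shape, round(spectrum_noise.std(), 4))
```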

  6. Realistic electricity market simulator for energy and economic studies

    International Nuclear Information System (INIS)

    Bernal-Agustin, Jose L.; Contreras, Javier; Conejo, Antonio J.; Martin-Flores, Raul

    2007-01-01

    Electricity market simulators have become a useful tool to train engineers in the power industry. With the maturing of electricity markets throughout the world, there is a need for sophisticated software tools that can replicate the actual behavior of power markets. In most of these markets, power producers/consumers submit production/demand bids and the Market Operator clears the market producing a single price per hour. What makes markets different from each other are the bidding rules and the clearing algorithms to balance the market. This paper presents a realistic simulator of the day-ahead electricity market of mainland Spain. All the rules that govern this market are modeled. This simulator can be used either to train employees by power companies or to teach electricity markets courses in universities. To illustrate the tool, several realistic case studies are presented and discussed. (author)
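
    The abstract notes that producers and consumers submit bids and the Market Operator clears the market to a single hourly price. The sketch below shows a generic merit-order, single-price clearing for one hour; it is illustrative only and does not reproduce the specific rules of the Spanish day-ahead market modeled by the simulator. All bid values are hypothetical.

```python
"""Generic single-price clearing for one hour (illustrative only)."""

# (price EUR/MWh, quantity MWh) bids; values are made up
supply_bids = [(10.0, 200), (25.0, 150), (40.0, 100), (60.0, 80)]
demand_bids = [(80.0, 180), (55.0, 120), (30.0, 100), (15.0, 90)]

def clear_hour(supply, demand):
    supply = sorted(supply, key=lambda b: b[0])     # cheapest offers first
    demand = sorted(demand, key=lambda b: -b[0])    # highest-value bids first
    traded, price = 0.0, 0.0
    s_i = d_i = 0
    s_left, d_left = supply[0][1], demand[0][1]
    # Match bids while the best remaining demand bid still exceeds the supply offer
    while s_i < len(supply) and d_i < len(demand) and demand[d_i][0] >= supply[s_i][0]:
        q = min(s_left, d_left)
        traded += q
        price = supply[s_i][0]                      # marginal accepted supply price
        s_left -= q
        d_left -= q
        if s_left == 0:
            s_i += 1
            s_left = supply[s_i][1] if s_i < len(supply) else 0
        if d_left == 0:
            d_i += 1
            d_left = demand[d_i][1] if d_i < len(demand) else 0
    return price, traded

price, volume = clear_hour(supply_bids, demand_bids)
print(f"clearing price {price} EUR/MWh, traded volume {volume} MWh")
```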

  7. TMS modeling toolbox for realistic simulation.

    Science.gov (United States)

    Cho, Young Sun; Suh, Hyun Sang; Lee, Won Hee; Kim, Tae-Seong

    2010-01-01

    Transcranial magnetic stimulation (TMS) is a technique for brain stimulation using rapidly changing magnetic fields generated by coils. It has been established as an effective stimulation technique to treat patients suffering from damaged brain functions. Although TMS is known to be painless and noninvasive, it can also be harmful to the brain through incorrect focusing and excessive stimulation, which might result in seizure. Therefore, there is an ongoing research effort to elucidate and better understand the effect and mechanism of TMS. Lately, the boundary element method (BEM) and the finite element method (FEM) have been used to simulate the electromagnetic phenomenon of TMS. However, there is a lack of general tools to generate the models of TMS due to some difficulties in realistic modeling of the human head and TMS coils. In this study, we have developed a toolbox through which one can generate high-resolution FE TMS models. The toolbox allows creating FE models of the head with isotropic and anisotropic electrical conductivities in five different tissues of the head and the coils in 3D. The generated TMS model is importable to FE software packages such as ANSYS for further, efficient electromagnetic analysis. We present a set of demonstrative results of realistic simulation of TMS with our toolbox.

  8. Comparative study of the effectiveness of three learning environments: Hyper-realistic virtual simulations, traditional schematic simulations and traditional laboratory

    Directory of Open Access Journals (Sweden)

    Maria Isabel Suero

    2011-10-01

    Full Text Available This study compared the educational effects of computer simulations developed in a hyper-realistic virtual environment with the educational effects of either traditional schematic simulations or a traditional optics laboratory. The virtual environment was constructed on the basis of Java applets complemented with a photorealistic visual output. This new virtual environment concept, which we call hyper-realistic, transcends basic schematic simulation; it provides the user with a more realistic perception of a physical phenomenon being simulated. We compared the learning achievements of three equivalent, homogeneous groups of undergraduates—an experimental group who used only the hyper-realistic virtual laboratory, a first control group who used a schematic simulation, and a second control group who used the traditional laboratory. The three groups received the same theoretical preparation and carried out equivalent practicals in their respective learning environments. The topic chosen for the experiment was optical aberrations. An analysis of variance applied to the data of the study demonstrated a statistically significant difference (p value <0.05) between the three groups. The learning achievements attained by the group using the hyper-realistic virtual environment were 6.1 percentage points higher than those for the group using the traditional schematic simulations and 9.5 percentage points higher than those for the group using the traditional laboratory.

  9. Development of BWR [boiling water reactor] and PWR [pressurized water reactor] event descriptions for nuclear facility simulator training

    International Nuclear Information System (INIS)

    Carter, R.J.; Bovell, C.R.

    1987-01-01

    A number of tools that can aid nuclear facility training developers in designing realistic simulator scenarios have been developed. This paper describes each of the tools, i.e., event lists, events-by-competencies matrices, and event descriptions, and illustrates how the tools can be used to construct scenarios.

  10. Realistic Simulation of Rice Plant

    Directory of Open Access Journals (Sweden)

    Wei-long DING

    2011-09-01

    Full Text Available The existing research results on virtual modeling of the rice plant, however, are far from perfect compared to those for other crops, due to its complex structure and growth process. Techniques to visually simulate the architecture of the rice plant and its growth process are presented based on the analysis of the morphological characteristics at different stages. Firstly, the simulations of geometrical shape, the bending status and the structural distortion of rice leaves are conducted. Then, by using an improved model for bending deformation, the curved patterns of the panicle axis and various types of panicle branches are generated, and the spatial shape of the rice panicle is therefore created. A parametric L-system is employed to generate its topological structures, and a finite-state automaton is adopted to describe the development of geometrical structures. Finally, the computer visualization of three-dimensional morphologies of the rice plant at both organ and individual levels is achieved. The experimental results showed that the proposed methods of modeling the three-dimensional shapes of organs and simulating the growth of the rice plant are feasible and effective, and the generated three-dimensional images are realistic.
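
    The abstract states that an L-system generates the plant's topology. The toy bracketed L-system below illustrates the rewriting idea only; the paper's actual production rules and parameters are not given in the abstract, so the symbols, rule, and ages here are invented.

```python
"""Toy bracketed L-system for a tiller-like topology (illustrative only).
Symbols: A(n) apex at age n, I internode, L leaf, [ ] branch scope."""

def produce(symbol):
    kind, n = symbol
    if kind == "A" and n < 4:
        # An apex produces an internode, a leaf, a side branch, and an older apex.
        return [("I", 0), ("L", 0), ("[", 0), ("A", n + 1), ("]", 0), ("A", n + 1)]
    return [symbol]                       # all other symbols are unchanged

def derive(axiom, steps):
    string = axiom
    for _ in range(steps):
        string = [s for sym in string for s in produce(sym)]
    return string

axiom = [("A", 0)]
result = derive(axiom, steps=3)
print("".join(k if k in "[]" else f"{k}{n}" for k, n in result))
```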

  11. HELIOSEISMOLOGY OF A REALISTIC MAGNETOCONVECTIVE SUNSPOT SIMULATION

    International Nuclear Information System (INIS)

    Braun, D. C.; Birch, A. C.; Rempel, M.; Duvall, T. L. Jr.

    2012-01-01

    We compare helioseismic travel-time shifts measured from a realistic magnetoconvective sunspot simulation using both helioseismic holography and time-distance helioseismology, and measured from real sunspots observed with the Helioseismic and Magnetic Imager instrument on board the Solar Dynamics Observatory and the Michelson Doppler Imager instrument on board the Solar and Heliospheric Observatory. We find remarkable similarities in the travel-time shifts measured between the methodologies applied and between the simulated and real sunspots. Forward modeling of the travel-time shifts using either Born or ray approximation kernels and the sound-speed perturbations present in the simulation indicates major disagreements with the measured travel-time shifts. These findings do not substantially change with the application of a correction for the reduction of wave amplitudes in the simulated and real sunspots. Overall, our findings demonstrate the need for new methods for inferring the subsurface structure of sunspots through helioseismic inversions.

  12. Using remotely sensed data and stochastic models to simulate realistic flood hazard footprints across the continental US

    Science.gov (United States)

    Bates, P. D.; Quinn, N.; Sampson, C. C.; Smith, A.; Wing, O.; Neal, J. C.

    2017-12-01

    Remotely sensed data has transformed the field of large scale hydraulic modelling. New digital elevation, hydrography and river width data has allowed such models to be created for the first time, and remotely sensed observations of water height, slope and water extent have allowed them to be calibrated and tested. As a result, we are now able to conduct flood risk analyses at national, continental or even global scales. However, continental scale analyses have significant additional complexity compared to typical flood risk modelling approaches. Traditional flood risk assessment uses frequency curves to define the magnitude of extreme flows at gauging stations. The flow values for given design events, such as the 1 in 100 year return period flow, are then used to drive hydraulic models in order to produce maps of flood hazard. Such an approach works well for single gauge locations and local models because over relatively short river reaches (say 10-60km) one can assume that the return period of an event does not vary. At regional to national scales and across multiple river catchments this assumption breaks down, and for a given flood event the return period will be different at different gauging stations, a pattern known as the event 'footprint'. Despite this, many national scale risk analyses still use 'constant in space' return period hazard layers (e.g. the FEMA Special Flood Hazard Areas) in their calculations. Such an approach can estimate potential exposure, but will over-estimate risk and cannot determine likely flood losses over a whole region or country. We address this problem by using a stochastic model to simulate many realistic extreme event footprints based on observed gauged flows and the statistics of gauge to gauge correlations. We take the entire USGS gauge data catalogue for sites with > 45 years of record and use a conditional approach for multivariate extreme values to generate sets of flood events with realistic return period variation in
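
    The abstract describes a conditional multivariate extremes approach built on gauge-to-gauge correlations. The stand-in sketch below uses a much simpler Gaussian copula, only to show how correlated gauges produce a single event whose return period varies in space (the "footprint"); the correlation structure, number of gauges, and annual-maxima framing are all assumptions, not the authors' model.

```python
"""Simplified footprint generator: correlated gauges -> spatially varying return periods."""
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)

n_gauges = 5
# Hypothetical gauge-to-gauge correlation decaying with distance
dist = np.abs(np.subtract.outer(np.arange(n_gauges), np.arange(n_gauges)))
corr = np.exp(-dist / 2.0)

def simulate_footprints(n_events):
    z = rng.multivariate_normal(np.zeros(n_gauges), corr, size=n_events)
    u = norm.cdf(z)                 # correlated non-exceedance probabilities
    return 1.0 / (1.0 - u)          # return period (years) at each gauge per event

footprints = simulate_footprints(n_events=10)
for i, event in enumerate(footprints):
    print(f"event {i}: return periods per gauge =", np.round(event, 1))
```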

  13. UE4Sim: A Photo-Realistic Simulator for Computer Vision Applications

    KAUST Repository

    Mueller, Matthias; Casser, Vincent; Lahoud, Jean; Smith, Neil; Ghanem, Bernard

    2017-01-01

    We present a photo-realistic training and evaluation simulator (UE4Sim) with extensive applications across various fields of computer vision. Built on top of the Unreal Engine, the simulator integrates full-featured, physics-based cars, unmanned aerial vehicles (UAVs), and animated human actors in diverse urban and suburban 3D environments. We demonstrate the versatility of the simulator with two case studies: autonomous UAV-based tracking of moving objects and autonomous driving using supervised learning. The simulator fully integrates several state-of-the-art tracking algorithms with a benchmark evaluation tool, as well as a deep neural network (DNN) architecture for training vehicles to drive autonomously. It generates synthetic photo-realistic datasets with automatic ground truth annotations to easily extend existing real-world datasets and provides extensive synthetic data variety through its ability to reconfigure synthetic worlds on the fly using an automatic world generation tool.

  14. UE4Sim: A Photo-Realistic Simulator for Computer Vision Applications

    KAUST Repository

    Mueller, Matthias

    2017-08-19

    We present a photo-realistic training and evaluation simulator (UE4Sim) with extensive applications across various fields of computer vision. Built on top of the Unreal Engine, the simulator integrates full-featured, physics-based cars, unmanned aerial vehicles (UAVs), and animated human actors in diverse urban and suburban 3D environments. We demonstrate the versatility of the simulator with two case studies: autonomous UAV-based tracking of moving objects and autonomous driving using supervised learning. The simulator fully integrates several state-of-the-art tracking algorithms with a benchmark evaluation tool, as well as a deep neural network (DNN) architecture for training vehicles to drive autonomously. It generates synthetic photo-realistic datasets with automatic ground truth annotations to easily extend existing real-world datasets and provides extensive synthetic data variety through its ability to reconfigure synthetic worlds on the fly using an automatic world generation tool.

  15. Sim4CV: A Photo-Realistic Simulator for Computer Vision Applications

    KAUST Repository

    Müller, Matthias

    2018-03-24

    We present a photo-realistic training and evaluation simulator (Sim4CV) (http://www.sim4cv.org) with extensive applications across various fields of computer vision. Built on top of the Unreal Engine, the simulator integrates full-featured, physics-based cars, unmanned aerial vehicles (UAVs), and animated human actors in diverse urban and suburban 3D environments. We demonstrate the versatility of the simulator with two case studies: autonomous UAV-based tracking of moving objects and autonomous driving using supervised learning. The simulator fully integrates several state-of-the-art tracking algorithms with a benchmark evaluation tool, as well as a deep neural network architecture for training vehicles to drive autonomously. It generates synthetic photo-realistic datasets with automatic ground truth annotations to easily extend existing real-world datasets and provides extensive synthetic data variety through its ability to reconfigure synthetic worlds on the fly using an automatic world generation tool.

  16. Discrete-Event Simulation

    Directory of Open Access Journals (Sweden)

    Prateek Sharma

    2015-04-01

    Full Text Available Abstract Simulation can be regarded as the emulation of the behavior of a real-world system over an interval of time. The process of simulation relies upon generating the history of a system and then analyzing that history to predict the outcome and improve the working of real systems. Simulations can be of various kinds, but the topic of interest here is one of the most important kinds: discrete-event simulation, which models the system as a discrete sequence of events in time. This paper therefore introduces discrete-event simulation and analyzes how it benefits real-world systems.
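
    To make the "discrete sequence of events in time" concrete, here is a minimal discrete-event loop driven by a future-event list ordered by timestamp. The single-server queue and its arrival/service rates are hypothetical examples, not taken from the paper.

```python
"""Minimal discrete-event simulation: a future-event list drives a single-server queue."""
import heapq
import random

random.seed(42)
ARRIVAL_RATE, SERVICE_RATE = 1.0, 1.5   # hypothetical rates (events per time unit)

def simulate(until=100.0):
    clock, queue_len, busy, served = 0.0, 0, False, 0
    events = [(random.expovariate(ARRIVAL_RATE), "arrival")]   # (time, kind) heap
    while events and clock < until:
        clock, kind = heapq.heappop(events)                    # next event in time order
        if kind == "arrival":
            queue_len += 1
            heapq.heappush(events, (clock + random.expovariate(ARRIVAL_RATE), "arrival"))
            if not busy:                                       # start service immediately
                busy = True
                queue_len -= 1
                heapq.heappush(events, (clock + random.expovariate(SERVICE_RATE), "departure"))
        else:                                                  # departure
            served += 1
            if queue_len > 0:
                queue_len -= 1
                heapq.heappush(events, (clock + random.expovariate(SERVICE_RATE), "departure"))
            else:
                busy = False
    return served

print("customers served:", simulate())
```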

  17. Event-by-event simulation of quantum phenomena

    NARCIS (Netherlands)

    De Raedt, H.; Zhao, S.; Yuan, S.; Jin, F.; Michielsen, K.; Miyashita, S.

    We discuss recent progress in the development of simulation algorithms that do not rely on any concept of quantum theory but are nevertheless capable of reproducing the averages computed from quantum theory through an event-by-event simulation. The simulation approach is illustrated by applications

  18. Convective aggregation in realistic convective-scale simulations

    Science.gov (United States)

    Holloway, Christopher E.

    2017-06-01

    To investigate the real-world relevance of idealized-model convective self-aggregation, five 15-day cases of real organized convection in the tropics are simulated. These include multiple simulations of each case to test sensitivities of the convective organization and mean states to interactive radiation, interactive surface fluxes, and evaporation of rain. These simulations are compared to self-aggregation seen in the same model configured to run in idealized radiative-convective equilibrium. Analysis of the budget of the spatial variance of column-integrated frozen moist static energy shows that control runs have significant positive contributions to organization from radiation and negative contributions from surface fluxes and transport, similar to idealized runs once they become aggregated. Despite identical lateral boundary conditions for all experiments in each case, systematic differences in mean column water vapor (CWV), CWV distribution shape, and CWV autocorrelation length scale are found between the different sensitivity runs, particularly for those without interactive radiation, showing that there are at least some similarities in sensitivities to these feedbacks in both idealized and realistic simulations (although the organization of precipitation shows less sensitivity to interactive radiation). The magnitudes and signs of these systematic differences are consistent with a rough equilibrium between (1) equalization due to advection from the lateral boundaries and (2) disaggregation due to the absence of interactive radiation, implying disaggregation rates comparable to those in idealized runs with aggregated initial conditions and noninteractive radiation. This points to a plausible similarity in the way that radiation feedbacks maintain aggregated convection in both idealized simulations and the real world. Plain Language Summary: Understanding the processes that lead to the organization of tropical rainstorms is an important challenge for weather

  19. Challenges and solutions for realistic room simulation

    Science.gov (United States)

    Begault, Durand R.

    2002-05-01

    Virtual room acoustic simulation (auralization) techniques have traditionally focused on answering questions related to speech intelligibility or musical quality, typically in large volumetric spaces. More recently, auralization techniques have been found to be important for the externalization of headphone-reproduced virtual acoustic images. Although externalization can be accomplished using a minimal simulation, data indicate that realistic auralizations need to be responsive to head motion cues for accurate localization. Computational demands increase when providing for the simulation of coupled spaces, small rooms lacking meaningful reverberant decays, or reflective surfaces in outdoor environments. Auditory threshold data for both early reflections and late reverberant energy levels indicate that much of the information captured in acoustical measurements is inaudible, minimizing the intensive computational requirements of real-time auralization systems. Results are presented for early reflection thresholds as a function of azimuth angle, arrival time, and sound-source type, and reverberation thresholds as a function of reverberation time and level within 250-Hz-2-kHz octave bands. Good agreement is found between data obtained in virtual room simulations and those obtained in real rooms, allowing a strategy for minimizing computational requirements of real-time auralization systems.

  20. Simulation of microarray data with realistic characteristics

    Directory of Open Access Journals (Sweden)

    Lehmussola Antti

    2006-07-01

    Full Text Available Abstract Background Microarray technologies have become common tools in biological research. As a result, a need for effective computational methods for data analysis has emerged. Numerous different algorithms have been proposed for analyzing the data. However, an objective evaluation of the proposed algorithms is not possible due to the lack of biological ground truth information. To overcome this fundamental problem, the use of simulated microarray data for algorithm validation has been proposed. Results We present a microarray simulation model which can be used to validate different kinds of data analysis algorithms. The proposed model is unique in the sense that it includes all the steps that affect the quality of real microarray data. These steps include the simulation of biological ground truth data, applying biological and measurement technology specific error models, and finally simulating the microarray slide manufacturing and hybridization. After all these steps are taken into account, the simulated data has realistic biological and statistical characteristics. The applicability of the proposed model is demonstrated by several examples. Conclusion The proposed microarray simulation model is modular and can be used in different kinds of applications. It includes several error models that have been proposed earlier and it can be used with different types of input data. The model can be used to simulate both spotted two-channel and oligonucleotide based single-channel microarrays. All this makes the model a valuable tool for example in validation of data analysis algorithms.
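
    The abstract lists the pipeline stages the model covers: biological ground truth, error models, and measurement. The toy pipeline below mirrors that structure only; the log-normal expression levels, fold changes, and the multiplicative-plus-additive error form are illustrative assumptions and not the paper's actual models.

```python
"""Toy microarray simulation pipeline: ground truth -> error model -> measurement."""
import numpy as np

rng = np.random.default_rng(7)

def ground_truth(n_genes, n_diff=50, fold=2.0):
    base = rng.lognormal(mean=6.0, sigma=1.0, size=n_genes)   # true expression levels
    diff = np.zeros(n_genes, dtype=bool)
    diff[:n_diff] = True                                      # differentially expressed genes
    case = base.copy()
    case[diff] *= fold
    return base, case, diff

def measure(expression, additive_sd=20.0, multiplicative_sd=0.15):
    # Simple two-component error model: multiplicative noise plus additive background
    noisy = expression * rng.lognormal(0.0, multiplicative_sd, expression.size)
    return noisy + rng.normal(0.0, additive_sd, expression.size)

control_truth, case_truth, is_diff = ground_truth(n_genes=1000)
control_chip = measure(control_truth)
case_chip = measure(case_truth)
log_ratio = np.log2(np.clip(case_chip, 1, None) / np.clip(control_chip, 1, None))
print("mean |log2 ratio| (diff vs non-diff):",
      round(np.abs(log_ratio[is_diff]).mean(), 3),
      round(np.abs(log_ratio[~is_diff]).mean(), 3))
```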

  1. Fault-Tolerant Robot Programming through Simulation with Realistic Sensor Models

    Directory of Open Access Journals (Sweden)

    Axel Waggershauser

    2008-11-01

    Full Text Available We introduce a simulation system for mobile robots that allows a realistic interaction of multiple robots in a common environment. The simulated robots are closely modeled after robots from the EyeBot family and have an identical application programmer interface. The simulation supports driving commands at two levels of abstraction as well as numerous sensors such as shaft encoders, infrared distance sensors, and compass. Simulation of on-board digital cameras via synthetic images allows the use of image processing routines for robot control within the simulation. Specific error models for actuators, distance sensors, camera sensor, and wireless communication have been implemented. Progressively increasing error levels for an application program allows for testing and improving its robustness and fault-tolerance.
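
    The abstract highlights specific error models for sensors whose error levels can be increased progressively. The fragment below is a generic infrared distance sensor error model of that kind, written for illustration; it is not the EyeBot or EyeSim API, and all parameter values are made up.

```python
"""Generic distance-sensor error model: bias, noise, quantization, and dropouts."""
import random

random.seed(3)

def ir_distance_reading(true_distance_mm,
                        bias_mm=5.0, noise_sd_mm=10.0,
                        resolution_mm=5, dropout_prob=0.02,
                        max_range_mm=1200):
    if random.random() < dropout_prob:
        return max_range_mm                      # dropout reads as "no obstacle"
    reading = true_distance_mm + bias_mm + random.gauss(0.0, noise_sd_mm)
    reading = max(0.0, min(reading, max_range_mm))
    return int(round(reading / resolution_mm) * resolution_mm)

# Progressively increasing the error level exercises an application's fault tolerance
for sd in (0.0, 10.0, 40.0):
    samples = [ir_distance_reading(500.0, noise_sd_mm=sd) for _ in range(5)]
    print(f"noise sd {sd:>4} mm ->", samples)
```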

  2. Studies on switch-based event building systems in RD13

    International Nuclear Information System (INIS)

    Bee, C.P.; Eshghi, S.; Jones, R.

    1996-01-01

    One of the goals of the RD13 project at CERN is to investigate the feasibility of a parallel event building system for detectors at the LHC. Studies were performed by building a prototype based on the HiPPI standard and by modeling this prototype and extended architectures with MODSIM II. The prototype used commercially available VME-HiPPI interfaces and a HiPPI switch together with modular software. The setup was tested successfully as a parallel event building system in different configurations and with different data flow control schemes. The simulation program was used with realistic parameters from the prototype measurements to simulate large-scale event building systems. This includes simulations of a realistic setup of the ATLAS event building system. The influence of different parameters and the scaling behavior were investigated. The influence of realistic event size distributions was checked with data from off-line simulations. Different control schemes for destination assignment and traffic shaping were investigated, as well as a two-stage event building system. (author)

  3. Interactive Web-based Floodplain Simulation System for Realistic Experiments of Flooding and Flood Damage

    Science.gov (United States)

    Demir, I.

    2013-12-01

    Recent developments in web technologies make it easy to manage and visualize large data sets with the general public. Novel visualization techniques and dynamic user interfaces allow users to create realistic environments, and interact with data to gain insight from simulations and environmental observations. The floodplain simulation system is a web-based 3D interactive flood simulation environment to create real world flooding scenarios. The simulation system provides a visually striking platform with realistic terrain information and water simulation. Students can create and modify predefined scenarios, control environmental parameters, and evaluate flood mitigation techniques. The web-based simulation system provides an environment in which children and adults can learn about flooding, flood damage, and the effects of development and human activity in the floodplain. The system provides various scenarios customized to fit the age and education level of the users. This presentation provides an overview of the web-based flood simulation system, and demonstrates the capabilities of the system for various flooding and land use scenarios.

  4. The Skateboard Factory: a teaching case on discrete-event simulation

    Directory of Open Access Journals (Sweden)

    Marco Aurélio de Mesquita

    Full Text Available Abstract Real-life applications during the teaching process are a desirable practice in simulation education. However, access to real cases imposes some difficulty in implementing such a practice, especially when the classes are large. This paper presents a teaching case for a computer simulation course in a production engineering undergraduate program. The motivation for the teaching case was to provide students with a realistic manufacturing case to stimulate the learning of simulation concepts and methods in the context of industrial engineering. The case considers a virtual factory of skateboards, whose operations include parts manufacturing, final assembly and storage of raw materials, work-in-process and finished products. Students should model and simulate the factory, under push and pull production strategies, using any simulation software available in the laboratory. The teaching case, applied in the last two years, helped to motivate and consolidate the students’ learning of discrete-event simulation. It proved to be a feasible alternative to the previous practice of letting students freely choose a case for their final project, while keeping the essence of the project-based learning approach.

  5. Realistic Simulations of Coronagraphic Observations with WFIRST

    Science.gov (United States)

    Rizzo, Maxime; Zimmerman, Neil; Roberge, Aki; Lincowski, Andrew; Arney, Giada; Stark, Chris; Jansen, Tiffany; Turnbull, Margaret; WFIRST Science Investigation Team (Turnbull)

    2018-01-01

    We present a framework to simulate observing scenarios with the WFIRST Coronagraphic Instrument (CGI). The Coronagraph and Rapid Imaging Spectrograph in Python (crispy) is an open-source package that can be used to create CGI data products for analysis and development of post-processing routines. The software convolves time-varying coronagraphic PSFs with realistic astrophysical scenes which contain a planetary architecture, a consistent dust structure, and a background field composed of stars and galaxies. The focal plane can be read out by a WFIRST electron-multiplying CCD model directly, or passed through a WFIRST integral field spectrograph model first. Several elementary post-processing routines are provided as part of the package.
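
    The abstract says crispy convolves time-varying coronagraphic PSFs with astrophysical scenes before detector readout. The minimal fragment below illustrates only that convolution step with a toy scene, a toy Gaussian PSF, and Poisson photon noise; it does not use crispy's actual API, and every number is invented.

```python
"""Toy scene-times-PSF convolution step (not crispy's API)."""
import numpy as np
from scipy.signal import fftconvolve

rng = np.random.default_rng(5)

n = 128
scene = np.zeros((n, n))
scene[n // 2, n // 2] = 1e6                 # residual starlight
scene[n // 2 + 12, n // 2 + 20] = 50.0      # a faint planet

# Toy PSF: narrow Gaussian standing in for the coronagraphic response
y, x = np.mgrid[-16:17, -16:17]
psf = np.exp(-(x**2 + y**2) / (2 * 1.5**2))
psf /= psf.sum()

image = fftconvolve(scene, psf, mode="same")
image_with_noise = rng.poisson(np.clip(image, 0, None)).astype(float)
print("planet-pixel counts:", image_with_noise[n // 2 + 12, n // 2 + 20])
```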

  6. Event-by-event simulation of quantum phenomena

    NARCIS (Netherlands)

    De Raedt, Hans; Michielsen, Kristel

    A discrete-event simulation approach is reviewed that does not require the knowledge of the solution of the wave equation of the whole system, yet reproduces the statistical distributions of wave theory by generating detection events one-by-one. The simulation approach is illustrated by applications

  7. DROpS: an object of learning in computer simulation of discrete events

    Directory of Open Access Journals (Sweden)

    Hugo Alves Silva Ribeiro

    2015-09-01

    Full Text Available This work presents the “Realistic Dynamics Of Simulated Operations” (DROpS, the name given to the dynamics using the “dropper” device as an object of teaching and learning. The objective is to present alternatives for professors teaching content related to simulation of discrete events to graduate students in production engineering. The aim is to enable students to develop skills related to data collection, modeling, statistical analysis, and interpretation of results. This dynamic has been developed and applied to the students by placing them in a situation analogous to a real industry, where various concepts related to computer simulation were discussed, allowing the students to put these concepts into practice in an interactive manner, thus facilitating learning

  8. The Attentional Demand of Automobile Driving Revisited: Occlusion Distance as a Function of Task-Relevant Event Density in Realistic Driving Scenarios.

    Science.gov (United States)

    Kujala, Tuomo; Mäkelä, Jakke; Kotilainen, Ilkka; Tokkonen, Timo

    2016-02-01

    We studied the utility of occlusion distance as a function of task-relevant event density in realistic traffic scenarios with self-controlled speed. The visual occlusion technique is an established method for assessing visual demands of driving. However, occlusion time is not a highly informative measure of environmental task-relevant event density in self-paced driving scenarios because it partials out the effects of changes in driving speed. Self-determined occlusion times and distances of 97 drivers with varying backgrounds were analyzed in driving scenarios simulating real Finnish suburban and highway traffic environments with self-determined vehicle speed. Occlusion distances varied systematically with the expected environmental demands of the manipulated driving scenarios whereas the distributions of occlusion times remained more static across the scenarios. Systematic individual differences in the preferred occlusion distances were observed. More experienced drivers achieved better lane-keeping accuracy than inexperienced drivers with similar occlusion distances; however, driving experience was unexpectedly not a major factor for the preferred occlusion distances. Occlusion distance seems to be an informative measure for assessing task-relevant event density in realistic traffic scenarios with self-controlled speed. Occlusion time measures the visual demand of driving as the task-relevant event rate in time intervals, whereas occlusion distance measures the experienced task-relevant event density in distance intervals. The findings can be utilized in context-aware distraction mitigation systems, human-automated vehicle interaction, road speed prediction and design, as well as in the testing of visual in-vehicle tasks for inappropriate in-vehicle glancing behaviors in any dynamic traffic scenario for which appropriate individual occlusion distances can be defined. © 2015, Human Factors and Ergonomics Society.
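
    The core distinction above is that occlusion time measures event rate in time intervals while occlusion distance measures event density in distance intervals, so the same occlusion time means very different distances at different self-controlled speeds. The tiny sketch below only illustrates that conversion; the speeds and occlusion times are made up, not data from the study.

```python
"""Occlusion time vs. occlusion distance at different self-controlled speeds."""

def occlusion_distance_m(speed_kmh, occlusion_time_s):
    # distance travelled while the view is occluded
    return speed_kmh / 3.6 * occlusion_time_s

for speed, t_occ in [(40, 2.0), (80, 2.0), (100, 1.5)]:
    print(f"{speed} km/h, occluded {t_occ:.1f} s -> "
          f"{occlusion_distance_m(speed, t_occ):.1f} m driven without vision")
```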

  9. Synchronous Parallel Emulation and Discrete Event Simulation System with Self-Contained Simulation Objects and Active Event Objects

    Science.gov (United States)

    Steinman, Jeffrey S. (Inventor)

    1998-01-01

    The present invention is embodied in a method of performing object-oriented simulation and a system having inter-connected processor nodes operating in parallel to simulate mutual interactions of a set of discrete simulation objects distributed among the nodes as a sequence of discrete events changing state variables of respective simulation objects so as to generate new event-defining messages addressed to respective ones of the nodes. The object-oriented simulation is performed at each one of the nodes by assigning passive self-contained simulation objects to each one of the nodes, responding to messages received at one node by generating corresponding active event objects having user-defined inherent capabilities and individual time stamps and corresponding to respective events affecting one of the passive self-contained simulation objects of the one node, restricting the respective passive self-contained simulation objects to only providing and receiving information from the respective active event objects, requesting information and changing variables within a passive self-contained simulation object by the active event object, and producing corresponding messages specifying events resulting therefrom by the active event objects.

  10. Generic Simulator Environment for Realistic Simulation - Autonomous Entity Proof and Emotion in Decision Making

    Directory of Open Access Journals (Sweden)

    Mickaël Camus

    2004-10-01

    Full Text Available Simulation is usually used as an evaluation and testing system. Many sectors are concerned, such as the European Space Agency or European defence. It is important to make sure that the project is error-free in order to continue it. The difficulty is to develop a realistic environment for the simulation and the execution of a scenario. This paper presents PALOMA, a Generic Simulator Environment. The project is based essentially on chaos theory and complex systems to create and direct an environment for a simulation. An important point is its generic aspect: PALOMA will be able to create an environment for different sectors (aerospace, biology, mathematics, ...). PALOMA includes six components: the Simulation Engine, the Direction Module, the Environment Generator, the Natural Behavior Restriction, the Communication API and the User API. Three languages are used to develop this simulator: SCHEME for the direction language, C/C++ for the development of modules, and Oz/Mozart for the heart of PALOMA.

  11. Synchronization Of Parallel Discrete Event Simulations

    Science.gov (United States)

    Steinman, Jeffrey S.

    1992-01-01

    Adaptive, parallel, discrete-event-simulation-synchronization algorithm, Breathing Time Buckets, developed in Synchronous Parallel Environment for Emulation and Discrete Event Simulation (SPEEDES) operating system. Algorithm allows parallel simulations to process events optimistically in fluctuating time cycles that naturally adapt while simulation in progress. Combines best of optimistic and conservative synchronization strategies while avoiding major disadvantages. Algorithm processes events optimistically in time cycles adapting while simulation in progress. Well suited for modeling communication networks, for large-scale war games, for simulated flights of aircraft, for simulations of computer equipment, for mathematical modeling, for interactive engineering simulations, and for depictions of flows of information.

  12. Synchronization Techniques in Parallel Discrete Event Simulation

    OpenAIRE

    Lindén, Jonatan

    2018-01-01

    Discrete event simulation is an important tool for evaluating system models in many fields of science and engineering. To improve the performance of large-scale discrete event simulations, several techniques to parallelize discrete event simulation have been developed. In parallel discrete event simulation, the work of a single discrete event simulation is distributed over multiple processing elements. A key challenge in parallel discrete event simulation is to ensure that causally dependent ...

  13. Comparative Study of the Effectiveness of Three Learning Environments: Hyper-Realistic Virtual Simulations, Traditional Schematic Simulations and Traditional Laboratory

    Science.gov (United States)

    Martinez, Guadalupe; Naranjo, Francisco L.; Perez, Angel L.; Suero, Maria Isabel; Pardo, Pedro J.

    2011-01-01

    This study compared the educational effects of computer simulations developed in a hyper-realistic virtual environment with the educational effects of either traditional schematic simulations or a traditional optics laboratory. The virtual environment was constructed on the basis of Java applets complemented with a photorealistic visual output.…

  14. Meredys, a multi-compartment reaction-diffusion simulator using multistate realistic molecular complexes

    Directory of Open Access Journals (Sweden)

    Le Novère Nicolas

    2010-03-01

    Full Text Available Abstract Background Most cellular signal transduction mechanisms depend on a few molecular partners whose roles depend on their position and movement in relation to the input signal. This movement can follow various rules and take place in different compartments. Additionally, the molecules can form transient complexes. Complexation and signal transduction depend on the specific states partners and complexes adopt. Several spatial simulators have been developed to date, but none is able to model reaction-diffusion of realistic multi-state transient complexes. Results Meredys allows for the simulation of multi-component, multi-feature state molecular species in two and three dimensions. Several compartments can be defined with different diffusion and boundary properties. The software employs a Brownian dynamics engine to simulate reaction-diffusion systems at the reactive particle level, based on compartment properties, complex structure, and hydrodynamic radii. Zeroth-, first-, and second-order reactions are supported. The molecular complexes have realistic geometries. Reactive species can contain user-defined feature states which can modify reaction rates and outcome. Models are defined in a versatile NeuroML input file. The simulation volume can be split into subvolumes to speed up run-time. Conclusions Meredys provides a powerful and versatile way to run accurate simulations of molecular and sub-cellular systems that complement existing multi-agent simulation systems. Meredys is Free Software and the source code is available at http://meredys.sourceforge.net/.
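
    To illustrate the particle-level Brownian dynamics approach mentioned above, here is a minimal sketch in which particles take Gaussian steps sized by their diffusion coefficient and an A + B -> C reaction fires when two reactive particles come within a reaction distance. This is not Meredys and does not read NeuroML; the diffusion coefficients, box size, and reaction radius are arbitrary, and boundaries are ignored for brevity.

```python
"""Minimal Brownian-dynamics sketch with a distance-triggered A + B -> C reaction."""
import numpy as np

rng = np.random.default_rng(11)

DT = 1e-6                 # time step, s
D_A, D_B = 1e-12, 2e-12   # made-up diffusion coefficients, m^2/s
RADIUS_SUM = 5e-9         # reaction distance (sum of hypothetical radii), m

a = rng.uniform(0, 1e-7, size=(100, 3))   # A particle positions in a 100 nm box
b = rng.uniform(0, 1e-7, size=(100, 3))   # B particle positions

def step(pos, d_coef):
    # Gaussian displacement with standard deviation sqrt(2 * D * dt) per axis
    return pos + rng.normal(0.0, np.sqrt(2 * d_coef * DT), size=pos.shape)

products = 0
for _ in range(1000):
    a, b = step(a, D_A), step(b, D_B)
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)   # all-pairs distances
    hits = np.argwhere(d < RADIUS_SUM)
    for i, j in hits[:1]:                 # consume at most one pair per step
        a = np.delete(a, i, axis=0)
        b = np.delete(b, j, axis=0)
        products += 1
print("C molecules formed:", products)
```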

  15. GADEN: A 3D Gas Dispersion Simulator for Mobile Robot Olfaction in Realistic Environments.

    Science.gov (United States)

    Monroy, Javier; Hernandez-Bennets, Victor; Fan, Han; Lilienthal, Achim; Gonzalez-Jimenez, Javier

    2017-06-23

    This work presents a simulation framework developed under the widely used Robot Operating System (ROS) to enable the validation of robotics systems and gas sensing algorithms under realistic environments. The framework is rooted in the principles of computational fluid dynamics and filament dispersion theory, modeling wind flow and gas dispersion in 3D real-world scenarios (i.e., accounting for walls, furniture, etc.). Moreover, it integrates the simulation of different environmental sensors, such as metal oxide gas sensors, photo ionization detectors, or anemometers. We illustrate the potential and applicability of the proposed tool by presenting a simulation case in a complex and realistic office-like environment where gas leaks of different chemicals occur simultaneously. Furthermore, we accomplish quantitative and qualitative validation by comparing our simulated results against real-world data recorded inside a wind tunnel where methane was released under different wind flow profiles. Based on these results, we conclude that our simulation framework can provide a good approximation to real world measurements when advective airflows are present in the environment.

  16. Simulating realistic implementations of spin field effect transistor

    Science.gov (United States)

    Gao, Yunfei; Lundstrom, Mark S.; Nikonov, Dmitri E.

    2011-04-01

    The spin field effect transistor (spinFET), consisting of two ferromagnetic source/drain contacts and a Si channel, is predicted to have outstanding device and circuit performance. We carry out a rigorous numerical simulation of the spinFET based on the nonequilibrium Green's function formalism self-consistently coupled with a Poisson solver to produce the device I-V characteristics. Good agreement with the recent experiments in terms of spin injection, spin transport, and the magnetoresistance ratio (MR) is obtained. We include factors crucial for realistic devices: tunneling through a dielectric barrier, and spin relaxation at the interface and in the channel. Using these simulations, we suggest ways of optimizing the device. We propose that by choosing the right contact material and inserting tunnel oxide barriers between the source/drain and channel to filter different spins, the MR can be restored to ˜2000%, which would be beneficial to the reconfigurable logic circuit application.

  17. Discrete-Event Simulation

    OpenAIRE

    Prateek Sharma

    2015-01-01

    Abstract Simulation can be regarded as the emulation of the behavior of a real-world system over an interval of time. The process of simulation relies upon the generation of the history of a system and then analyzing that history to predict the outcome and improve the working of real systems. Simulations can be of various kinds but the topic of interest here is one of the most important kind of simulation which is Discrete-Event Simulation which models the system as a discrete sequence of ev...

  18. Integrative computational models of cardiac arrhythmias -- simulating the structurally realistic heart

    Science.gov (United States)

    Trayanova, Natalia A; Tice, Brock M

    2009-01-01

    Simulation of cardiac electrical function, and specifically, simulation aimed at understanding the mechanisms of cardiac rhythm disorders, represents an example of a successful integrative multiscale modeling approach, uncovering emergent behavior at the successive scales in the hierarchy of structural complexity. The goal of this article is to present a review of the integrative multiscale models of realistic ventricular structure used in the quest to understand and treat ventricular arrhythmias. It concludes with the new advances in image-based modeling of the heart and the promise it holds for the development of individualized models of ventricular function in health and disease. PMID:20628585

  19. Quantitative Simulation of QARBM Challenge Events During Radiation Belt Enhancements

    Science.gov (United States)

    Li, W.; Ma, Q.; Thorne, R. M.; Bortnik, J.; Chu, X.

    2017-12-01

    Various physical processes are known to affect energetic electron dynamics in the Earth's radiation belts, but their quantitative effects at different times and locations in space need further investigation. This presentation focuses on discussing the quantitative roles of various physical processes that affect Earth's radiation belt electron dynamics during radiation belt enhancement challenge events (storm-time vs. non-storm-time) selected by the GEM Quantitative Assessment of Radiation Belt Modeling (QARBM) focus group. We construct realistic global distributions of whistler-mode chorus waves, adopt various versions of radial diffusion models (statistical and event-specific), and use the global evolution of other potentially important plasma waves including plasmaspheric hiss, magnetosonic waves, and electromagnetic ion cyclotron waves from all available multi-satellite measurements. These state-of-the-art wave properties and distributions on a global scale are used to calculate diffusion coefficients, that are then adopted as inputs to simulate the dynamical electron evolution using a 3D diffusion simulation during the storm-time and the non-storm-time acceleration events respectively. We explore the similarities and differences in the dominant physical processes that cause radiation belt electron dynamics during the storm-time and non-storm-time acceleration events. The quantitative role of each physical process is determined by comparing against the Van Allen Probes electron observations at different energies, pitch angles, and L-MLT regions. This quantitative comparison further indicates instances when quasilinear theory is sufficient to explain the observed electron dynamics or when nonlinear interaction is required to reproduce the energetic electron evolution observed by the Van Allen Probes.

  20. Atomistic simulations of materials: Methods for accurate potentials and realistic time scales

    Science.gov (United States)

    Tiwary, Pratyush

    This thesis deals with achieving more realistic atomistic simulations of materials, by developing accurate and robust force-fields, and algorithms for practical time scales. I develop a formalism for generating interatomic potentials for simulating atomistic phenomena occurring at energy scales ranging from lattice vibrations to crystal defects to high-energy collisions. This is done by fitting against an extensive database of ab initio results, as well as to experimental measurements for mixed oxide nuclear fuels. The applicability of these interactions to a variety of mixed environments beyond the fitting domain is also assessed. The employed formalism makes these potentials applicable across all interatomic distances without the need for any ambiguous splining to the well-established short-range Ziegler-Biersack-Littmark universal pair potential. We expect these to be reliable potentials for carrying out damage simulations (and molecular dynamics simulations in general) in nuclear fuels of varying compositions for all relevant atomic collision energies. A hybrid stochastic and deterministic algorithm is proposed that while maintaining fully atomistic resolution, allows one to achieve milliseconds and longer time scales for several thousands of atoms. The method exploits the rare event nature of the dynamics like other such methods, but goes beyond them by (i) not having to pick a scheme for biasing the energy landscape, (ii) providing control on the accuracy of the boosted time scale, (iii) not assuming any harmonic transition state theory (HTST), and (iv) not having to identify collective coordinates or interesting degrees of freedom. The method is validated by calculating diffusion constants for vacancy-mediated diffusion in iron metal at low temperatures, and comparing against brute-force high temperature molecular dynamics. We also calculate diffusion constants for vacancy diffusion in tantalum metal, where we compare against low-temperature HTST as well

  1. Methods and computing challenges of the realistic simulation of physics events in the presence of pile-up in the ATLAS experiment

    CERN Document Server

    Chapman, J D; The ATLAS collaboration

    2014-01-01

    We are now in a regime where we observe substantial multiple proton-proton collisions within each filled LHC bunch-crossing and also multiple filled bunch-crossings within the sensitive time window of the ATLAS detector. This will increase with increased luminosity in the near future. Including these effects in Monte Carlo simulation poses significant computing challenges. We present a description of the standard approach used by the ATLAS experiment and details of how we manage the conflicting demands of keeping the background dataset size as small as possible while minimizing the effect of background event re-use. We also present details of the methods used to minimize the memory footprint of these digitization jobs, to keep them within the grid limit, despite combining the information from thousands of simulated events at once. We also describe an alternative approach, known as Overlay. Here, the actual detector conditions are sampled from raw data using a special zero-bias trigger, and the simulated physi...

  2. Realistically Rendering SoC Traffic Patterns with Interrupt Awareness

    DEFF Research Database (Denmark)

    Angiolini, Frederico; Mahadevan, Sharkar; Madsen, Jan

    2005-01-01

    to generate realistic test traffic. This paper presents a selection of applications using interrupt-based synchronization; a reference methodology to split such applications into execution subflows and to adjust the overall execution stream based upon hardware events; and a reactive simulation device capable of correctly replicating such software behaviours in the MPSoC design phase. Additionally, we validate the proposed concept by showing cycle-accurate reproduction of a previously traced application flow....

  3. Evaluation and simulation of event building techniques for a detector at the LHC

    CERN Document Server

    Spiwoks, R

    1995-01-01

    The main objectives of future experiments at the Large Hadron Collider are the search for the Higgs boson (or bosons), the verification of the Standard Model and the search beyond the Standard Model in a new energy range up to a few TeV. These experiments will have to cope with unprecedented high data rates and will need event building systems which can offer a bandwidth of 1 to 100GB/s and which can assemble events from 100 to 1000 readout memories at rates of 1 to 100kHz. This work investigates the feasibility of parallel event building systems using commercially available high speed interconnects and switches. Studies are performed by building a small-scale prototype and by modelling this prototype and realistic architectures with discrete-event simulations. The prototype is based on the HiPPI standard and uses commercially available VME-HiPPI interfaces and a HiPPI switch together with modular and scalable software. The setup operates successfully as a parallel event building system of limited size in...

  4. Radiation Damage to Nervous System: Designing Optimal Models for Realistic Neuron Morphology in Hippocampus

    Science.gov (United States)

    Batmunkh, Munkhbaatar; Bugay, Alexander; Bayarchimeg, Lkhagvaa; Lkhagva, Oidov

    2018-02-01

    The present study is focused on the development of optimal models of neuron morphology for Monte Carlo microdosimetry simulations of initial radiation-induced events of heavy charged particles in the specific types of cells of the hippocampus, which is the most radiation-sensitive structure of the central nervous system. The neuron geometry and particle track structures were simulated by the Geant4/Geant4-DNA Monte Carlo toolkits. The calculations were made for beams of protons and heavy ions with different energies and doses corresponding to real fluxes of galactic cosmic rays. A simple compartmental model and a complex model with realistic morphology extracted from experimental data were constructed and compared. We estimated the distribution of the energy deposition events and the production of reactive chemical species within the developed models of CA3/CA1 pyramidal neurons and DG granule cells of the rat hippocampus under exposure to different particles with the same dose. Similar distributions of the energy deposition events and concentration of some oxidative radical species were obtained in both the simplified and realistic neuron models.

  5. Simulation Evaluation of Controller-Managed Spacing Tools under Realistic Operational Conditions

    Science.gov (United States)

    Callantine, Todd J.; Hunt, Sarah M.; Prevot, Thomas

    2014-01-01

    Controller-Managed Spacing (CMS) tools have been developed to aid air traffic controllers in managing high volumes of arriving aircraft according to a schedule while enabling them to fly efficient descent profiles. The CMS tools are undergoing refinement in preparation for field demonstration as part of NASA's Air Traffic Management (ATM) Technology Demonstration-1 (ATD-1). System-level ATD-1 simulations have been conducted to quantify expected efficiency and capacity gains under realistic operational conditions. This paper presents simulation results with a focus on CMS-tool human factors. The results suggest experienced controllers new to the tools find them acceptable and can use them effectively in ATD-1 operations.

  6. Realistic 3D Terrain Roaming and Real-Time Flight Simulation

    Science.gov (United States)

    Que, Xiang; Liu, Gang; He, Zhenwen; Qi, Guang

    2014-12-01

    This paper presents an integrated method, which provides access to the current status and a dynamic visible-scanning topography, to enhance interactivity during terrain roaming and real-time flight simulation. A digital elevation model and digital orthophoto map data integration algorithm is proposed as the basis of our approach to build a realistic 3D terrain scene. A new technique based on render-to-texture and a head-up display is used to generate the navigation pane. In the flight simulation, in order to eliminate the flying "jump", we employ a multidimensional linear interpolation method to adjust the camera parameters dynamically and smoothly. Meanwhile, based on the principle of scanning laser imaging, we draw pseudo-color figures by scanning the topography in different directions according to the real-time flight status. Simulation results demonstrate that the proposed algorithm is promising for applications and that the method can improve the visual effect and enhance dynamic interaction during real-time flight.
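
    The multidimensional linear interpolation used to remove the flying "jump" can be illustrated with a short sketch; the parameter set and step count below are invented for the example and are not taken from the paper.

      def lerp(a, b, t):
          # linear interpolation between scalars a and b for t in [0, 1]
          return a + (b - a) * t

      def smooth_camera(prev, target, t):
          # interpolate every camera parameter component-wise between the previous
          # and the target state instead of jumping straight to the target
          # (note: angle wrap-around at 360 degrees is not handled in this sketch)
          return {key: lerp(prev[key], target[key], t) for key in prev}

      prev   = {"x": 0.0,  "y": 0.0,  "altitude": 1000.0, "heading": 90.0}
      target = {"x": 50.0, "y": 10.0, "altitude": 1200.0, "heading": 95.0}
      frames = [smooth_camera(prev, target, step / 10) for step in range(11)]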

  7. Characteristics of 454 pyrosequencing data--enabling realistic simulation with flowsim.

    Science.gov (United States)

    Balzer, Susanne; Malde, Ketil; Lanzén, Anders; Sharma, Animesh; Jonassen, Inge

    2010-09-15

    The commercial launch of 454 pyrosequencing in 2005 was a milestone in genome sequencing in terms of performance and cost. Throughout the three available releases, average read lengths have increased to approximately 500 base pairs and are thus approaching read lengths obtained from traditional Sanger sequencing. Study design of sequencing projects would benefit from being able to simulate experiments. We explore 454 raw data to investigate its characteristics and derive empirical distributions for the flow values generated by pyrosequencing. Based on our findings, we implement Flowsim, a simulator that generates realistic pyrosequencing data files of arbitrary size from a given set of input DNA sequences. We finally use our simulator to examine the impact of sequence lengths on the results of concrete whole-genome assemblies, and we suggest its use in planning of sequencing projects, benchmarking of assembly methods and other fields. Flowsim is freely available under the General Public License from http://blog.malde.org/index.php/flowsim/.
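
    The core idea, converting an input sequence into noisy flow values, can be sketched as follows. This is not Flowsim itself: the noise widths below are invented stand-ins for the empirical distributions the authors derive from raw 454 data, and ambiguous bases (e.g. N) are not handled.

      import random

      FLOW_ORDER = "TACG"   # the 454 nucleotide flow cycle

      def to_flowgram(seq, seed=1):
          # each flow ideally reads out the homopolymer length at the current position;
          # real flow values scatter around that integer, modelled here as Gaussian noise
          rng = random.Random(seed)
          flows, i = [], 0
          while i < len(seq):
              for base in FLOW_ORDER:
                  run = 0
                  while i < len(seq) and seq[i] == base:
                      run += 1
                      i += 1
                  sigma = 0.05 if run == 0 else 0.03 + 0.03 * run
                  flows.append(max(0.0, rng.gauss(run, sigma)))
                  if i >= len(seq):
                      break
          return flows

      print(to_flowgram("TTTACGGA"))   # roughly [3.0, 1.0, 1.0, 2.0, 0.0, 1.0]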

  8. A task-related and resting state realistic fMRI simulator for fMRI data validation

    Science.gov (United States)

    Hill, Jason E.; Liu, Xiangyu; Nutter, Brian; Mitra, Sunanda

    2017-02-01

    After more than 25 years of published functional magnetic resonance imaging (fMRI) studies, careful scrutiny reveals that most of the reported results lack fully decisive validation. The complex nature of fMRI data generation and acquisition results in unavoidable uncertainties in the true estimation and interpretation of both task-related activation maps and resting state functional connectivity networks, despite the use of various statistical data analysis methodologies. The goal of developing the proposed STANCE (Spontaneous and Task-related Activation of Neuronally Correlated Events) simulator is to generate realistic task-related and/or resting-state 4D blood oxygenation level dependent (BOLD) signals, given the experimental paradigm and scan protocol, by using digital phantoms of twenty normal brains available from BrainWeb (http://brainweb.bic.mni.mcgill.ca/brainweb/). The proposed simulator will include estimated system and modelled physiological noise as well as motion to serve as a reference to measured brain activities. In its current form, STANCE is a MATLAB toolbox with command line functions serving as an open-source add-on to SPM8 (http://www.fil.ion.ucl.ac.uk/spm/software/spm8/). The STANCE simulator has been designed in a modular framework so that the hemodynamic response (HR) and various noise models can be iteratively improved to include evolving knowledge about such models.
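
    The generative idea behind a task-related BOLD time course can be written down compactly: convolve the experimental paradigm with a haemodynamic response function and add noise. The sketch below uses the commonly quoted double-gamma parameters and an invented block design and noise level; STANCE's actual system, physiological noise and motion models are considerably more elaborate.

      import numpy as np
      from scipy.stats import gamma

      def double_gamma_hrf(t, a1=6.0, a2=16.0, b1=1.0, b2=1.0, c=1.0 / 6.0):
          # canonical double-gamma haemodynamic response function (SPM-like shape)
          return gamma.pdf(t, a1, scale=b1) - c * gamma.pdf(t, a2, scale=b2)

      TR, n_vols = 2.0, 120                                # assumed scan protocol
      t = np.arange(n_vols) * TR
      paradigm = ((t // 30) % 2 == 1).astype(float)        # 30 s off / 30 s on block design
      hrf = double_gamma_hrf(np.arange(0.0, 32.0, TR))

      activation = np.convolve(paradigm, hrf)[:n_vols]     # task-related BOLD component
      noise = 0.1 * np.random.default_rng(0).standard_normal(n_vols)
      bold = 100.0 * (1.0 + 0.02 * activation) + noise     # 2% signal change on a baseline of 100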

  9. Event-by-event simulation of quantum cryptography protocols

    NARCIS (Netherlands)

    Zhao, S.; Raedt, H. De

    We present a new approach to simulate quantum cryptography protocols using event-based processes. The method is validated by simulating the BB84 protocol and the Ekert protocol, both without and with the presence of an eavesdropper.
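
    As a flavour of what "event-based" means here, the sifting statistics of BB84 can be reproduced one photon at a time with nothing but random choices; the toy below (no eavesdropper, no detector model) is an illustration of the idea, not the authors' simulation algorithm.

      import random

      def bb84_sift(n_events, seed=42):
          # one event = one photon: Alice picks a bit and a basis, Bob picks a basis;
          # only events with matching bases survive sifting, and for those Bob's
          # outcome equals Alice's bit, so the sifted keys agree
          rng = random.Random(seed)
          alice_key, bob_key = [], []
          for _ in range(n_events):
              bit, alice_basis, bob_basis = rng.randint(0, 1), rng.randint(0, 1), rng.randint(0, 1)
              if alice_basis != bob_basis:
                  continue                      # discarded during public basis comparison
              alice_key.append(bit)
              bob_key.append(bit)
          return alice_key, bob_key

      a, b = bb84_sift(10000)
      print(len(a) / 10000, a == b)             # about half the events survive; keys match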

  10. Discrete Event Simulation Computers can be used to simulate the ...

    Indian Academy of Sciences (India)

    IAS Admin

    people who use computers every moment of their waking lives, others even ... How is discrete event simulation different from other kinds of simulation? ... time, energy consumption .... Schedule the CustomerDeparture event for this customer.

  11. Design of a Realistic Test Simulator For a Built-In Self Test Environment

    Directory of Open Access Journals (Sweden)

    A. Ahmad

    2010-12-01

    This paper presents a realistic test approach suitable for Design For Testability (DFT) and Built-In Self Test (BIST) environments. The approach is realized in the form of a test simulator which is capable of providing a required goal of test for the System Under Test (SUT). The simulator uses a fault diagnostics approach with a fault grading procedure to provide the tests. The tool is developed on a common PC platform and no special software is required, which makes it a low-cost and economical tool. The tool is well suited for determining realistic test sequences for a targeted goal of testing for any SUT. The developed tool incorporates a flexible Graphical User Interface (GUI) and can be operated without any special programming skill. The tool has been debugged and tested against the results of many benchmark circuits. Furthermore, the developed tool can be utilized for educational purposes in courses such as fault-tolerant computing, fault diagnosis, digital electronics, and safe, reliable, testable digital logic design.

  12. Parallel discrete event simulation using shared memory

    Science.gov (United States)

    Reed, Daniel A.; Malony, Allen D.; Mccredie, Bradley D.

    1988-01-01

    With traditional event-list techniques, evaluating a detailed discrete-event simulation model can often require hours or even days of computation time. By eliminating the event list and maintaining only sufficient synchronization to ensure causality, parallel simulation can potentially provide speedups that are linear in the number of processors. A set of shared-memory experiments, using the Chandy-Misra distributed-simulation algorithm to simulate networks of queues, is presented. Parameters of the study include queueing network topology and routing probabilities, number of processors, and assignment of network nodes to processors. These experiments show that Chandy-Misra distributed simulation is a questionable alternative to sequential simulation of most queueing network models.
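
    The safety rule at the heart of the Chandy-Misra approach is easy to state: a logical process may only consume events up to the minimum timestamp seen on its input channels, and null messages keep those channel clocks advancing. The class below is a minimal sketch of that rule for a single node; the names and data layout are ours, not the paper's.

      import heapq

      class LogicalProcess:
          def __init__(self, input_channels, lookahead):
              self.clock = {ch: 0.0 for ch in input_channels}   # last timestamp per input channel
              self.lookahead = lookahead                        # minimum service delay of this node
              self.pending = []                                 # future-event list for this node only
              self._seq = 0

          def receive(self, timestamp, channel, payload=None):
              # a null message (payload None) carries only a timestamp and advances the channel clock
              self.clock[channel] = timestamp
              if payload is not None:
                  heapq.heappush(self.pending, (timestamp, self._seq, channel, payload))
                  self._seq += 1

          def safe_events(self):
              # nothing with a timestamp below the minimum channel clock can still arrive,
              # so these events can be processed without violating causality
              horizon = min(self.clock.values())
              ready = []
              while self.pending and self.pending[0][0] <= horizon:
                  ready.append(heapq.heappop(self.pending))
              return ready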

  13. Atomistic simulations of graphite etching at realistic time scales.

    Science.gov (United States)

    Aussems, D U B; Bal, K M; Morgan, T W; van de Sanden, M C M; Neyts, E C

    2017-10-01

    Hydrogen-graphite interactions are relevant to a wide variety of applications, ranging from astrophysics to fusion devices and nano-electronics. In order to shed light on these interactions, atomistic simulation using Molecular Dynamics (MD) has been shown to be an invaluable tool. It suffers, however, from severe time-scale limitations. In this work we apply the recently developed Collective Variable-Driven Hyperdynamics (CVHD) method to hydrogen etching of graphite for varying inter-impact times up to a realistic value of 1 ms, which corresponds to a flux of ∼10^20 m^-2 s^-1. The results show that the erosion yield, hydrogen surface coverage and species distribution are significantly affected by the time between impacts. This can be explained by the higher probability of C-C bond breaking due to the prolonged exposure to thermal stress and the subsequent transition from ion- to thermal-induced etching. This latter regime of thermal-induced etching - chemical erosion - is here accessed for the first time using atomistic simulations. In conclusion, this study demonstrates that accounting for long time-scales significantly affects ion bombardment simulations and should not be neglected in a wide range of conditions, in contrast to what is typically assumed.

  14. Thermohydraulic simulation of HTR-10 nuclear reactor core using realistic CFD approach

    International Nuclear Information System (INIS)

    Silva, Alexandro S.; Dominguez, Dany S.; Mazaira, Leorlen Y. Rojas; Hernandez, Carlos R.G.; Lira, Carlos Alberto Brayner de Oliveira

    2015-01-01

    High-temperature gas-cooled reactors (HTGRs) have the potential to be used as possible energy generation sources in the near future, owing to their inherently safe performance achieved by using a large amount of graphite, a low power density design, and high conversion efficiency. However, safety is the most important issue for their commercialization in the nuclear energy industry, and it is very important for the safety design and operation of an HTGR to investigate its thermal-hydraulic characteristics. In this article, the thermal-hydraulic simulation of compressible flow inside the core of the pebble bed reactor HTR-10 (High Temperature Reactor) was performed using Computational Fluid Dynamics (CFD). A realistic approach was used, in which every closely packed pebble is modelled explicitly, considering a graphite layer and a sphere of fuel. Because the computational cost of simulating the full core is prohibitive, the geometry used is a column of FCC (Face Centered Cubic) cells, with 41 layers and 82 pebbles. The input data were taken from the IAEA thermohydraulic benchmark (TECDOC-1694). The results show the velocity and temperature profiles of the coolant in the core, and the temperature distribution inside the pebbles. The maximum temperatures in the pebbles do not exceed the allowable limit for this type of nuclear fuel. (author)

  15. Cerebral blood flow simulations in realistic geometries

    Directory of Open Access Journals (Sweden)

    Szopos Marcela

    2012-04-01

    The aim of this work is to perform the computation of the blood flow in the whole cerebral network (arterial and venous), obtained from medical images such as 3D cerebral angiographies. We use free finite element codes such as FreeFEM++. We first test the code on analytical solutions in simplified geometries and study the influence of the boundary conditions imposed on the flow, before finally performing first computations on realistic meshes.

  16. Investigation of realistic PET simulations incorporating tumor patient's specificity using anthropomorphic models: Creation of an oncology database

    International Nuclear Information System (INIS)

    Papadimitroulas, Panagiotis; Efthimiou, Nikos; Nikiforidis, George C.; Kagadis, George C.; Loudos, George; Le Maitre, Amandine; Hatt, Mathieu; Tixier, Florent; Visvikis, Dimitris

    2013-01-01

    Purpose: The GATE Monte Carlo simulation toolkit is used for the implementation of realistic PET simulations incorporating tumor heterogeneous activity distributions. The reconstructed patient images include noise from the acquisition process, imaging system's performance restrictions and have limited spatial resolution. For those reasons, the measured intensity cannot be simply introduced in GATE simulations, to reproduce clinical data. Investigation of the heterogeneity distribution within tumors applying partial volume correction (PVC) algorithms was assessed. The purpose of the present study was to create a simulated oncology database based on clinical data with realistic intratumor uptake heterogeneity properties.Methods: PET/CT data of seven oncology patients were used in order to create a realistic tumor database investigating the heterogeneity activity distribution of the simulated tumors. The anthropomorphic models (NURBS based cardiac torso and Zubal phantoms) were adapted to the CT data of each patient, and the activity distribution was extracted from the respective PET data. The patient-specific models were simulated with the Monte Carlo Geant4 application for tomography emission (GATE) in three different levels for each case: (a) using homogeneous activity within the tumor, (b) using heterogeneous activity distribution in every voxel within the tumor as it was extracted from the PET image, and (c) using heterogeneous activity distribution corresponding to the clinical image following PVC. The three different types of simulated data in each case were reconstructed with two iterations and filtered with a 3D Gaussian postfilter, in order to simulate the intratumor heterogeneous uptake. Heterogeneity in all generated images was quantified using textural feature derived parameters in 3D according to the ground truth of the simulation, and compared to clinical measurements. Finally, profiles were plotted in central slices of the tumors, across lines with

  17. Nucleation and arrest of slow slip earthquakes: mechanisms and nonlinear simulations using realistic fault geometries and heterogeneous medium properties

    Science.gov (United States)

    Alves da Silva Junior, J.; Frank, W.; Campillo, M.; Juanes, R.

    2017-12-01

    Current models for slow slip earthquakes (SSE) assume a simplified fault embedded in a homogeneous half-space. In these models, SSE events nucleate at the transition from velocity strengthening (VS) to velocity weakening (VW) down dip from the trench and propagate towards the base of the seismogenic zone, where high effective normal stress is assumed to arrest slip. Here, we investigate SSE nucleation and arrest using quasi-static finite element simulations, with rate and state friction, on a domain with heterogeneous properties and realistic fault geometry. We use the fault geometry of the Guerrero Gap in the Cocos subduction zone, where SSE events occur every 4 years, as a proxy for a subduction zone. Our model is calibrated using surface displacements from GPS observations. We apply boundary conditions according to the plate convergence rate and impose a depth-dependent pore pressure on the fault. Our simulations indicate that the fault geometry and the elastic properties of the medium play a key role in the arrest of SSE events at the base of the seismogenic zone. SSE arrest occurs due to aseismic deformation of the domain that results in areas with elevated effective stress. SSE nucleation occurs at the transition from VS to VW and propagates as a crack-like expansion with increasing nucleation length prior to dynamic instability. Our simulations encompassing multiple seismic cycles indicate SSE recurrence times between 1 and 10 years and, importantly, a systematic increase of rupture area prior to dynamic instability, followed by a hiatus in SSE occurrence. We hypothesize that these SSE characteristics, if confirmed by GPS observations in different subduction zones, can add to the understanding of the nucleation of large earthquakes in the seismogenic zone.
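
    For reference, rate-and-state friction is usually written in the Dieterich (aging-law) form shown below; this is the standard textbook formulation, not necessarily the exact variant implemented by the authors.

      \mu(V, \theta) = \mu_0 + a \ln\!\left(\frac{V}{V_0}\right) + b \ln\!\left(\frac{V_0 \theta}{D_c}\right),
      \qquad
      \frac{d\theta}{dt} = 1 - \frac{V \theta}{D_c},

    where V is the slip rate, \theta the state variable, D_c the characteristic slip distance, and V_0, \mu_0 reference values; a - b > 0 corresponds to velocity strengthening (VS) and a - b < 0 to velocity weakening (VW).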

  18. A Simbol-X Event Simulator

    International Nuclear Information System (INIS)

    Puccetti, S.; Giommi, P.; Fiore, F.

    2009-01-01

    The ASI Science Data Center (ASDC) has developed an X-ray event simulator to support users (and team members) in the simulation of data taken with the two cameras on board the Simbol-X X-Ray Telescope. The Simbol-X simulator is very fast and flexible compared to a ray-tracing simulator. These properties make our simulator well suited to supporting the user in planning proposals, comparing real data with theoretical expectations, and quickly detecting unexpected features. We present here the simulator outline and a few examples of simulated data.

  19. A Simbol-X Event Simulator

    Science.gov (United States)

    Puccetti, S.; Fiore, F.; Giommi, P.

    2009-05-01

    The ASI Science Data Center (ASDC) has developed an X-ray event simulator to support users (and team members) in the simulation of data taken with the two cameras on board the Simbol-X X-Ray Telescope. The Simbol-X simulator is very fast and flexible compared to a ray-tracing simulator. These properties make our simulator well suited to supporting the user in planning proposals, comparing real data with theoretical expectations, and quickly detecting unexpected features. We present here the simulator outline and a few examples of simulated data.

  20. 3D realistic head model simulation based on transcranial magnetic stimulation.

    Science.gov (United States)

    Yang, Shuo; Xu, Guizhi; Wang, Lei; Chen, Yong; Wu, Huanli; Li, Ying; Yang, Qingxin

    2006-01-01

    Transcranial magnetic stimulation (TMS) is a powerful non-invasive tool for investigating functions in the brain. The target inside the head is stimulated with eddy currents induced in the tissue by the time-varying magnetic field. Precise spatial localization of the stimulation sites is the key to efficient functional magnetic stimulation. Many researchers have devoted themselves to magnetic field analysis in empty free space. In this paper, a realistic head model for use with the Finite Element Method has been developed, and the magnetic field induced in the head by TMS has been analysed. This three-dimensional simulation is useful for the spatial localization of stimulation.

  1. Synchronous Parallel System for Emulation and Discrete Event Simulation

    Science.gov (United States)

    Steinman, Jeffrey S. (Inventor)

    2001-01-01

    A synchronous parallel system for emulation and discrete event simulation having parallel nodes responds to received messages at each node by generating event objects having individual time stamps, stores only the changes to the state variables of the simulation object attributable to the event object and produces corresponding messages. The system refrains from transmitting the messages and changing the state variables while it determines whether the changes are superseded, and then stores the unchanged state variables in the event object for later restoral to the simulation object if called for. This determination preferably includes sensing the time stamp of each new event object and determining which new event object has the earliest time stamp as the local event horizon, determining the earliest local event horizon of the nodes as the global event horizon, and ignoring events whose time stamps are less than the global event horizon. Host processing between the system and external terminals enables such a terminal to query, monitor, command or participate with a simulation object during the simulation process.
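
    One common reading of the horizon bookkeeping described above can be put in a few lines; the data layout below is invented for illustration and is not the patented implementation.

      def local_event_horizon(new_event_timestamps):
          # earliest time stamp among the event objects one node generated this cycle
          return min(new_event_timestamps)

      def global_event_horizon(per_node_timestamps):
          # earliest local horizon across all nodes
          return min(local_event_horizon(ts) for ts in per_node_timestamps)

      def committable(processed_events, horizon):
          # events processed at times earlier than the global horizon can no longer be
          # superseded, so their state changes and held messages may finally be released
          return [event for event in processed_events if event["t"] < horizon]

      nodes = [[7.0, 9.5], [6.2, 8.1], [11.0]]                  # new-event time stamps per node
      horizon = global_event_horizon(nodes)                     # -> 6.2
      safe = committable([{"t": 3.0}, {"t": 6.5}], horizon)     # only the t=3.0 event commits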

  2. Parallel discrete event simulation: A shared memory approach

    Science.gov (United States)

    Reed, Daniel A.; Malony, Allen D.; Mccredie, Bradley D.

    1987-01-01

    With traditional event list techniques, evaluating a detailed discrete event simulation model can often require hours or even days of computation time. Parallel simulation mimics the interacting servers and queues of a real system by assigning each simulated entity to a processor. By eliminating the event list and maintaining only sufficient synchronization to insure causality, parallel simulation can potentially provide speedups that are linear in the number of processors. A set of shared memory experiments is presented using the Chandy-Misra distributed simulation algorithm to simulate networks of queues. Parameters include queueing network topology and routing probabilities, number of processors, and assignment of network nodes to processors. These experiments show that Chandy-Misra distributed simulation is a questionable alternative to sequential simulation of most queueing network models.

  3. Realistic simulations of a cyclotron spiral inflector within a particle-in-cell framework

    Science.gov (United States)

    Winklehner, Daniel; Adelmann, Andreas; Gsell, Achim; Kaman, Tulin; Campo, Daniela

    2017-12-01

    We present an upgrade to the particle-in-cell ion beam simulation code opal that enables us to run highly realistic simulations of the spiral inflector system of a compact cyclotron. This upgrade includes a new geometry class and field solver that can handle the complicated boundary conditions posed by the electrode system in the central region of the cyclotron, both in terms of particle termination and calculation of self-fields. Results are benchmarked against the analytical solution of a coasting beam. As a practical example, the spiral inflector and the first revolution in a 1 MeV/amu test cyclotron, located at Best Cyclotron Systems, Inc., are modeled and compared to the simulation results. We find that opal can now handle arbitrary boundary geometries with relative ease. Simulated injection efficiencies and beam shape compare well with measured efficiencies and a preliminary measurement of the beam distribution after injection.

  4. How realistic are air quality hindcasts driven by forcings from climate model simulations?

    Science.gov (United States)

    Lacressonnière, G.; Peuch, V.-H.; Arteta, J.; Josse, B.; Joly, M.; Marécal, V.; Saint Martin, D.; Déqué, M.; Watson, L.

    2012-12-01

    Predicting how European air quality could evolve over the next decades in the context of changing climate requires the use of climate models to produce results that can be averaged in a climatologically and statistically sound manner. This is a very different approach from the one that is generally used for air quality hindcasts for the present period; analysed meteorological fields are used to represent specifically each date and hour. Differences arise both from the fact that a climate model run results in a pure model output, with no influence from observations (which are useful to correct for a range of errors), and that in a "climate" set-up, simulations on a given day, month or even season cannot be related to any specific period of time (but can just be interpreted in a climatological sense). Hence, although an air quality model can be thoroughly validated in a "realistic" set-up using analysed meteorological fields, the question remains of how far its outputs can be interpreted in a "climate" set-up. For this purpose, we focus on Europe and on the current decade using three 5-yr simulations performed with the multiscale chemistry-transport model MOCAGE and use meteorological forcings either from operational meteorological analyses or from climate simulations. We investigate how statistical skill indicators compare in the different simulations, discriminating also the effects of meteorology on atmospheric fields (winds, temperature, humidity, pressure, etc.) and on the dependent emissions and deposition processes (volatile organic compound emissions, deposition velocities, etc.). Our results show in particular how differing boundary layer heights and deposition velocities affect horizontal and vertical distributions of species. When the model is driven by operational analyses, the simulation accurately reproduces the observed values of O3, NOx, SO2 and, with some bias that can be explained by the set-up, PM10. We study how the simulations driven by climate

  5. Simulating events

    Energy Technology Data Exchange (ETDEWEB)

    Ferretti, C; Bruzzone, L [Techint Italimpianti, Milan (Italy)

    2000-06-01

    The Petacalco Marine Terminal on the Pacific coast, in the harbour of Lazaro Cardenas (Michoacan), Mexico, provides coal to the thermoelectric power plant at Pdte Plutarco Elias Calles in the port area. The plant is being converted from oil to coal firing to generate 2100 MW of power. The article describes the layout of the terminal and the equipment employed in the unloading, coal stacking and coal handling areas and in the receiving area at the power plant. The contractor Techint Italimpianti has developed a software system, MHATIS, for marine terminal management which is nearly complete. The discrete event simulator with its graphical interface provides a real-time decision support system for simulating changes to the terminal operations and evaluating their impacts. The article describes how MHATIS is used. 7 figs.

  6. Using GPU parallelization to perform realistic simulations of the LPCTrap experiments

    Energy Technology Data Exchange (ETDEWEB)

    Fabian, X., E-mail: fabian@lpccaen.in2p3.fr; Mauger, F.; Quéméner, G. [Université de Caen, CNRS/IN2P3, LPC-Caen, ENSICAEN (France); Velten, Ph. [KU Leuven, Instituut voor Kern- en Straglingsfysica (Belgium); Ban, G.; Couratin, C. [Université de Caen, CNRS/IN2P3, LPC-Caen, ENSICAEN (France); Delahaye, P. [GANIL, CEA/DSM-CNRS/IN2P3 (France); Durand, D. [Université de Caen, CNRS/IN2P3, LPC-Caen, ENSICAEN (France); Fabre, B. [CELIA, Université de Bordeaux, CEA/CNRS (France); Finlay, P. [KU Leuven, Instituut voor Kern- en Straglingsfysica (Belgium); Fléchard, X.; Liénard, E. [Université de Caen, CNRS/IN2P3, LPC-Caen, ENSICAEN (France); Méry, A. [Université de Caen, CIMAP, CEA/CNRS/ENSICAEN (France); Naviliat-Cuncic, O. [NSCL and Department of Physics and Astronomy, MSU (United States); Pons, B. [CELIA, Université de Bordeaux, CEA/CNRS (France); Porobic, T.; Severijns, N. [KU Leuven, Instituut voor Kern- en Straglingsfysica (Belgium); Thomas, J. C. [GANIL, CEA/DSM-CNRS/IN2P3 (France)

    2015-11-15

    The LPCTrap setup is a sensitive tool to measure the β−ν angular correlation coefficient, a_βν, which can yield the mixing ratio ρ of a β decay transition. The latter enables the extraction of the Cabibbo-Kobayashi-Maskawa (CKM) matrix element V_ud. In such a measurement, the most relevant observable is the energy distribution of the recoiling daughter nuclei following the nuclear β decay, which is obtained using a time-of-flight technique. In order to maximize the precision, one can reduce the systematic errors through a thorough simulation of the whole set-up, especially with a correct model of the trapped ion cloud. This paper presents such a simulation package and focuses on the ion cloud features; particular attention is therefore paid to realistic descriptions of trapping field dynamics, buffer gas cooling and the N-body space charge effects.

  7. Using GPU parallelization to perform realistic simulations of the LPCTrap experiments

    International Nuclear Information System (INIS)

    Fabian, X.; Mauger, F.; Quéméner, G.; Velten, Ph.; Ban, G.; Couratin, C.; Delahaye, P.; Durand, D.; Fabre, B.; Finlay, P.; Fléchard, X.; Liénard, E.; Méry, A.; Naviliat-Cuncic, O.; Pons, B.; Porobic, T.; Severijns, N.; Thomas, J. C.

    2015-01-01

    The LPCTrap setup is a sensitive tool to measure the β−ν angular correlation coefficient, a_βν, which can yield the mixing ratio ρ of a β decay transition. The latter enables the extraction of the Cabibbo-Kobayashi-Maskawa (CKM) matrix element V_ud. In such a measurement, the most relevant observable is the energy distribution of the recoiling daughter nuclei following the nuclear β decay, which is obtained using a time-of-flight technique. In order to maximize the precision, one can reduce the systematic errors through a thorough simulation of the whole set-up, especially with a correct model of the trapped ion cloud. This paper presents such a simulation package and focuses on the ion cloud features; particular attention is therefore paid to realistic descriptions of trapping field dynamics, buffer gas cooling and the N-body space charge effects.

  8. Use cases of discrete event simulation. Appliance and research

    Energy Technology Data Exchange (ETDEWEB)

    Bangsow, Steffen (ed.)

    2012-11-01

    Use Cases of Discrete Event Simulation: Appliance and Research includes case studies from various important industries such as automotive, aerospace, robotics and the production industry, written by leading experts in the field. Over the last decades, Discrete Event Simulation has conquered many different application areas. This trend is driven, on the one hand, by an ever wider use of this technology in different fields of science and, on the other hand, by an incredibly creative use of available software programs by dedicated experts. This book contains articles from scientists and experts from 10 countries. They illustrate the breadth of application of this technology and the quality of the problems solved using Discrete Event Simulation. Practical applications of simulation dominate in the present book. The book is aimed at researchers and students who deal with Discrete Event Simulation in their work and who want to inform themselves about current applications. By focusing on discrete event simulation, this book can also serve as a source of inspiration for practitioners solving specific problems in their work. For decision makers who deal with the question of introducing discrete event simulation for planning support and optimization, this book provides orientation on which specific problems could be solved with the help of Discrete Event Simulation within their organization.

  9. Full Quantum Dynamics Simulation of a Realistic Molecular System Using the Adaptive Time-Dependent Density Matrix Renormalization Group Method.

    Science.gov (United States)

    Yao, Yao; Sun, Ke-Wei; Luo, Zhen; Ma, Haibo

    2018-01-18

    The accurate theoretical interpretation of ultrafast time-resolved spectroscopy experiments relies on full quantum dynamics simulations for the investigated system, which is nevertheless computationally prohibitive for realistic molecular systems with a large number of electronic and/or vibrational degrees of freedom. In this work, we propose a unitary transformation approach for realistic vibronic Hamiltonians, which can be coped with using the adaptive time-dependent density matrix renormalization group (t-DMRG) method to efficiently evolve the nonadiabatic dynamics of a large molecular system. We demonstrate the accuracy and efficiency of this approach with an example of simulating the exciton dissociation process within an oligothiophene/fullerene heterojunction, indicating that t-DMRG can be a promising method for full quantum dynamics simulation in large chemical systems. Moreover, it is also shown that the proper vibronic features in the ultrafast electronic process can be obtained by simulating the two-dimensional (2D) electronic spectrum by virtue of the high computational efficiency of the t-DMRG method.

  10. Investigation of realistic PET simulations incorporating tumor patient's specificity using anthropomorphic models: Creation of an oncology database

    Energy Technology Data Exchange (ETDEWEB)

    Papadimitroulas, Panagiotis; Efthimiou, Nikos; Nikiforidis, George C.; Kagadis, George C. [Department of Medical Physics, School of Medicine, University of Patras, Rion, GR 265 04 (Greece); Loudos, George [Department of Biomedical Engineering, Technological Educational Institute of Athens, Ag. Spyridonos Street, Egaleo GR 122 10, Athens (Greece); Le Maitre, Amandine; Hatt, Mathieu; Tixier, Florent; Visvikis, Dimitris [Medical Information Processing Laboratory (LaTIM), National Institute of Health and Medical Research (INSERM), 29609 Brest (France)

    2013-11-15

    Purpose: The GATE Monte Carlo simulation toolkit is used for the implementation of realistic PET simulations incorporating tumor heterogeneous activity distributions. The reconstructed patient images include noise from the acquisition process, imaging system's performance restrictions and have limited spatial resolution. For those reasons, the measured intensity cannot be simply introduced in GATE simulations, to reproduce clinical data. Investigation of the heterogeneity distribution within tumors applying partial volume correction (PVC) algorithms was assessed. The purpose of the present study was to create a simulated oncology database based on clinical data with realistic intratumor uptake heterogeneity properties.Methods: PET/CT data of seven oncology patients were used in order to create a realistic tumor database investigating the heterogeneity activity distribution of the simulated tumors. The anthropomorphic models (NURBS based cardiac torso and Zubal phantoms) were adapted to the CT data of each patient, and the activity distribution was extracted from the respective PET data. The patient-specific models were simulated with the Monte Carlo Geant4 application for tomography emission (GATE) in three different levels for each case: (a) using homogeneous activity within the tumor, (b) using heterogeneous activity distribution in every voxel within the tumor as it was extracted from the PET image, and (c) using heterogeneous activity distribution corresponding to the clinical image following PVC. The three different types of simulated data in each case were reconstructed with two iterations and filtered with a 3D Gaussian postfilter, in order to simulate the intratumor heterogeneous uptake. Heterogeneity in all generated images was quantified using textural feature derived parameters in 3D according to the ground truth of the simulation, and compared to clinical measurements. Finally, profiles were plotted in central slices of the tumors, across lines

  11. ATLAS simulated black hole event

    CERN Multimedia

    Pequenão, J

    2008-01-01

    The simulated collision event shown is viewed along the beampipe. The event is one in which a microscopic black hole was produced in the collision of two protons (not shown). The microscopic black hole decayed immediately into many particles. The colors of the tracks show the different types of particles emerging from the collision (at the center).

  12. A Novel CPU/GPU Simulation Environment for Large-Scale Biologically-Realistic Neural Modeling

    Directory of Open Access Journals (Sweden)

    Roger V Hoang

    2013-10-01

    Computational Neuroscience is an emerging field that provides unique opportunities to study complex brain structures through realistic neural simulations. However, as biological details are added to models, the execution time for the simulation becomes longer. Graphics Processing Units (GPUs) are now being utilized to accelerate simulations due to their ability to perform computations in parallel. As such, they have shown significant improvement in execution time compared to Central Processing Units (CPUs). Most neural simulators utilize either multiple CPUs or a single GPU for better performance, but still show limitations in execution time when biological details are not sacrificed. Therefore, we present a novel CPU/GPU simulation environment for large-scale biological networks, the NeoCortical Simulator version 6 (NCS6). NCS6 is a free, open-source, parallelizable, and scalable simulator, designed to run on clusters of multiple machines, potentially with high performance computing devices in each of them. It has built-in leaky-integrate-and-fire (LIF) and Izhikevich (IZH) neuron models, but users also have the capability to design their own plug-in interface for different neuron types as desired. NCS6 is currently able to simulate one million cells and 100 million synapses in quasi real time by distributing data across these heterogeneous clusters of CPUs and GPUs.
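
    To make the built-in neuron models concrete, one Euler step of the Izhikevich model (one of the two model types named above) looks like the sketch below; the parameter values are the commonly quoted regular-spiking set, and NCS6's internal implementation may of course differ.

      def izhikevich_step(v, u, current, dt=0.5, a=0.02, b=0.2, c=-65.0, d=8.0):
          # membrane potential v and recovery variable u, advanced by one Euler step
          dv = 0.04 * v * v + 5.0 * v + 140.0 - u + current
          du = a * (b * v - u)
          v, u = v + dt * dv, u + dt * du
          fired = v >= 30.0
          if fired:                       # spike: reset v and bump the recovery variable
              v, u = c, u + d
          return v, u, fired

      # drive a single cell with a constant input and count spikes over 1000 steps
      v, u, spikes = -65.0, -13.0, 0
      for _ in range(1000):
          v, u, fired = izhikevich_step(v, u, current=10.0)
          spikes += fired
      print(spikes)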

  13. Event-by-event simulation of single-neutron experiments to test uncertainty relations

    International Nuclear Information System (INIS)

    Raedt, H De; Michielsen, K

    2014-01-01

    Results from a discrete-event simulation of a recent single-neutron experiment that tests Ozawa's generalization of Heisenberg's uncertainty relation are presented. The event-based simulation algorithm reproduces the results of the quantum theoretical description of the experiment but does not require the knowledge of the solution of a wave equation, nor does it rely on detailed concepts of quantum theory. In particular, the data from these non-quantum simulations satisfy uncertainty relations derived in the context of quantum theory. (paper)

  14. Use Cases of Discrete Event Simulation Appliance and Research

    CERN Document Server

    2012-01-01

    Over the last decades, Discrete Event Simulation has conquered many different application areas. This trend is driven, on the one hand, by an ever wider use of this technology in different fields of science and, on the other hand, by an incredibly creative use of available software programs by dedicated experts. This book contains articles from scientists and experts from 10 countries. They illustrate the breadth of application of this technology and the quality of the problems solved using Discrete Event Simulation. Practical applications of simulation dominate in the present book.   The book is aimed at researchers and students who deal with Discrete Event Simulation in their work and who want to inform themselves about current applications. By focusing on discrete event simulation, this book can also serve as a source of inspiration for practitioners solving specific problems in their work. Decision makers who deal with the question of the introduction of discrete event simulation for planning support and o...

  15. Event-by-event simulation of a quantum delayed-choice experiment

    NARCIS (Netherlands)

    Donker, Hylke C.; De Raedt, Hans; Michielsen, Kristel

    2014-01-01

    The quantum delayed-choice experiment of Tang et al. (2012) is simulated on the level of individual events, without making reference to concepts of quantum theory and without solving a wave equation. The simulation results are in excellent agreement with the quantum theoretical predictions of this

  16. A code for simulation of human failure events in nuclear power plants: SIMPROC

    International Nuclear Information System (INIS)

    Gil, Jesus; Fernandez, Ivan; Murcia, Santiago; Gomez, Javier; Marrao, Hugo; Queral, Cesar; Exposito, Antonio; Rodriguez, Gabriel; Ibanez, Luisa; Hortal, Javier; Izquierdo, Jose M.; Sanchez, Miguel; Melendez, Enrique

    2011-01-01

    Over the past years, many Nuclear Power Plant organizations have performed Probabilistic Safety Assessments (PSAs) to identify and understand key plant vulnerabilities. As part of enhancing PSA quality, Human Reliability Analysis (HRA) is essential to make a realistic evaluation of safety and of the facility's potential weaknesses. Moreover, it has to be noted that HRA continues to be a large source of uncertainty in PSAs. Within their current joint collaborative activities, Indizen, Universidad Politecnica de Madrid and Consejo de Seguridad Nuclear have developed the so-called SIMulator of PROCedures (SIMPROC), a tool aimed at simulating events related to human actions and able to interact with a plant simulation model. The tool helps the analyst to quantify the importance of human actions in the final plant state. Among others, the main goal of SIMPROC is to check the Emergency Operating Procedures used by the operating crew to lead the plant to a safe shutdown state. Currently SIMPROC is coupled with the SCAIS software package, but the tool is flexible enough to be linked to other plant simulation codes. SIMPROC-SCAIS applications are shown in the present article to illustrate the tool's performance. The applications were developed in the framework of the Nuclear Energy Agency project on Safety Margin Assessment and Applications (SM2A). First, an introductory example was performed to obtain the damage domain boundary of a selected sequence from a SBLOCA. Second, the damage domain area of a selected sequence from a loss of Component Cooling Water with a subsequent seal LOCA was calculated. SIMPROC simulates the corresponding human actions in both cases. The results achieved show how the system can be adapted to a wide range of purposes such as Dynamic Event Tree delineation, Emergency Operating Procedure analysis and damage domain search.

  17. Desktop Modeling and Simulation: Parsimonious, yet Effective Discrete-Event Simulation Analysis

    Science.gov (United States)

    Bradley, James R.

    2012-01-01

    This paper evaluates how quickly students can be trained to construct useful discrete-event simulation models using Excel. The typical supply chain used by many large national retailers is described, and an Excel-based simulation model of it is constructed. The set of programming and simulation skills required for the development of that model is then determined; we conclude that six hours of training are required to teach the skills to MBA students. The simulation presented here contains all the fundamental functionality of a simulation model, and so our result holds for any discrete-event simulation model. We argue therefore that industry workers with the same technical skill set as students who have completed one year in an MBA program can be quickly trained to construct simulation models. This result gives credence to the efficacy of Desktop Modeling and Simulation, whereby simulation analyses can be quickly developed, run, and analyzed with widely available software, namely Excel.
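
    The "fundamental functionality" referred to above (a future-event list, event routines and state updates) fits in a few lines outside Excel as well; the single-server queue below is a generic Python illustration, not the supply-chain model from the paper.

      import heapq
      import random

      def mm1_simulation(arrival_rate, service_rate, horizon, seed=0):
          # minimal event-list simulation of a single-server queue (M/M/1)
          rng = random.Random(seed)
          clock, queue_length, busy, served = 0.0, 0, False, 0
          events = [(rng.expovariate(arrival_rate), "arrival")]   # future-event list
          while events and clock < horizon:
              clock, kind = heapq.heappop(events)
              if kind == "arrival":
                  heapq.heappush(events, (clock + rng.expovariate(arrival_rate), "arrival"))
                  if busy:
                      queue_length += 1
                  else:
                      busy = True
                      heapq.heappush(events, (clock + rng.expovariate(service_rate), "departure"))
              else:                                               # departure
                  served += 1
                  if queue_length > 0:
                      queue_length -= 1
                      heapq.heappush(events, (clock + rng.expovariate(service_rate), "departure"))
                  else:
                      busy = False
          return served

      print(mm1_simulation(arrival_rate=0.8, service_rate=1.0, horizon=10_000))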

  18. Program For Parallel Discrete-Event Simulation

    Science.gov (United States)

    Beckman, Brian C.; Blume, Leo R.; Geiselman, John S.; Presley, Matthew T.; Wedel, John J., Jr.; Bellenot, Steven F.; Diloreto, Michael; Hontalas, Philip J.; Reiher, Peter L.; Weiland, Frederick P.

    1991-01-01

    The user does not have to add any special logic to aid in synchronization. The Time Warp Operating System (TWOS) computer program is a special-purpose operating system designed to support parallel discrete-event simulation. It is a complete implementation of the Time Warp mechanism and supports only simulations and other computations designed for virtual time. The Time Warp Simulator (TWSIM) subdirectory contains a sequential simulation engine that is interface-compatible with TWOS. TWOS and TWSIM are written in, and support simulations in, the C programming language.

  19. Collaborative Event-Driven Coverage and Rate Allocation for Event Miss-Ratio Assurances in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Ozgur Sanli H

    2010-01-01

    Wireless sensor networks are often required to provide event miss-ratio assurance for a given event type. To meet such assurances along with minimum energy consumption, this paper shows how a node's activation and rate assignment depend on its distance to event sources, and proposes a practical coverage and rate allocation (CORA) protocol to exploit this dependency in realistic environments. Both uniform and nonuniform event distributions are considered, and the notion of an ideal correlation distance around a clusterhead is introduced for on-duty node selection. In correlation-distance-guided CORA, rate assignment assists coverage scheduling by determining which nodes should be activated for minimizing data redundancy in transmission. Coverage scheduling assists rate assignment by controlling the amount of overlap among the sensing regions of neighboring nodes, thereby providing sufficient data correlation for rate assignment. Extensive simulation results show that CORA meets the required event miss-ratios in realistic environments. CORA's joint coverage scheduling and rate allocation reduce the total energy expenditure by 85%, average battery energy consumption by 25%, and the overhead of source coding by up to 90% as compared to existing rate allocation techniques.

  20. Modeling and simulation of discrete event systems

    CERN Document Server

    Choi, Byoung Kyu

    2013-01-01

    Computer modeling and simulation (M&S) allows engineers to study and analyze complex systems. Discrete-event system (DES)-M&S is used in modern management, industrial engineering, computer science, and the military. As computer speeds and memory capacity increase, so DES-M&S tools become more powerful and more widely used in solving real-life problems. Based on over 20 years of evolution within a classroom environment, as well as on decades-long experience in developing simulation-based solutions for high-tech industries, Modeling and Simulation of Discrete-Event Systems is the only book on

  1. Positive Affect Is Associated With Reduced Fixation in a Realistic Medical Simulation.

    Science.gov (United States)

    Crane, Monique F; Brouwers, Sue; Forrest, Kirsty; Tan, Suyin; Loveday, Thomas; Wiggins, Mark W; Munday, Chris; David, Leila

    2017-08-01

    This study extends previous research by exploring the association between mood states (i.e., positive and negative affect) and fixation in practicing anesthetists using a realistic medical simulation. The impact of practitioner emotional states on fixation is a neglected area of research. Emerging evidence is demonstrating the role of positive affect in facilitating problem solving and innovation, with demonstrated implications for practitioner fixation. Twelve practicing anesthetists (4 females; mean age = 39 years; SD = 6.71) were involved in a medical simulation. Prior to the simulation, practitioners rated the frequency with which they had experienced various positive and negative emotions in the previous three days. During the simulation, the patient deteriorated rapidly, and anesthetists were observed for their degree of fixation. After the simulation, practitioners indicated the frequency of these same emotions during the simulation. Nonparametric correlations were used to explore the independent relationships between positive and negative affect and the behavioral measures. Only positive affect impacted the likelihood of fixation. Anesthetists who reported more frequent recent positive affect in the three days prior to the simulation and during the simulation tended to be less fixated as judged by independent raters, identified a decline in patient oxygen saturation more quickly, and more rapidly implemented the necessary intervention (surgical cricothyroidotomy). These findings have real-world implications for the role of positive affect in patient safety. This research has broad implications for professions where fixation may impair practice, and it suggests that professional training should teach practitioners to identify their emotions and understand the role of these emotions in fixation.

  2. An Advanced Simulation Framework for Parallel Discrete-Event Simulation

    Science.gov (United States)

    Li, P. P.; Tyrrell, R. Yeung D.; Adhami, N.; Li, T.; Henry, H.

    1994-01-01

    Discrete-event simulation (DEVS) users have long been faced with a three-way trade-off of balancing execution time, model fidelity, and the number of objects simulated. Because of the limits of computer processing power, the analyst is often forced to settle for less than desired performance in one or more of these areas.

  3. Realistic Visualization of Virtual Views

    DEFF Research Database (Denmark)

    Livatino, Salvatore

    2005-01-01

    that can be impractical and sometimes impossible. In addition, the artificial nature of data often makes visualized virtual scenarios not realistic enough. Not realistic in the sense that a synthetic scene is easy to discriminate visually from a natural scene. A new field of research has consequently...... developed and received much attention in recent years: Realistic Virtual View Synthesis. The main goal is a high fidelity representation of virtual scenarios while easing modeling and physical phenomena simulation. In particular, realism is achieved by the transfer to the novel view of all the physical...... phenomena captured in the reference photographs (i.e., the transfer of photographic realism). An overview of the most prominent approaches in realistic virtual view synthesis will be presented and briefly discussed. Applications of the proposed methods to visual survey, virtual cinematography, as well as mobile...

  4. Simple gaze-contingent cues guide eye movements in a realistic driving simulator

    Science.gov (United States)

    Pomarjanschi, Laura; Dorr, Michael; Bex, Peter J.; Barth, Erhardt

    2013-03-01

    Looking at the right place at the right time is a critical component of driving skill. Therefore, gaze guidance has the potential to become a valuable driving assistance system. In previous work, we have already shown that complex gaze-contingent stimuli can guide attention and reduce the number of accidents in a simple driving simulator. We here set out to investigate whether cues that are simple enough to be implemented in a real car can also capture gaze during a more realistic driving task in a high-fidelity driving simulator. We used a state-of-the-art, wide-field-of-view driving simulator with an integrated eye tracker. Gaze-contingent warnings were implemented using two arrays of light-emitting diodes horizontally fitted below and above the simulated windshield. Thirteen volunteering subjects drove along predetermined routes in a simulated environment populated with autonomous traffic. Warnings were triggered during the approach to half of the intersections, cueing either towards the right or to the left. The remaining intersections were not cued, and served as controls. The analysis of the recorded gaze data revealed that the gaze-contingent cues did indeed have a gaze guiding effect, triggering a significant shift in gaze position towards the highlighted direction. This gaze shift was not accompanied by changes in driving behaviour, suggesting that the cues do not interfere with the driving task itself.

  5. Universality and Realistic Extensions to the Semi-Analytic Simulation Principle in GNSS Signal Processing

    Directory of Open Access Journals (Sweden)

    O. Jakubov

    2012-06-01

    The semi-analytic simulation principle in GNSS signal processing bypasses the bit-true operations at high sampling frequency. Instead, the signals at the output branches of the integrate&dump blocks are modeled directly, thus making extensive Monte Carlo simulations feasible. Methods for simulations of code and carrier tracking loops with BPSK and BOC signals have been introduced in the literature, and Matlab toolboxes have been designed and published. In this paper, we further extend the applicability of the approach. Firstly, we describe any GNSS signal as a special instance of linear multi-dimensional modulation, thereby stating a universal framework for the classification of differently modulated signals. Using such a description, we derive the semi-analytic models in general form. Secondly, we extend the model for realistic scenarios including delay in the feedback, slowly fading multipath effects, finite bandwidth, phase noise, and combinations of these. Finally, a discussion of the connection between this semi-analytic model and the position-velocity-time estimator is delivered, as well as a comparison of theoretical and simulated characteristics produced by a prototype simulator developed at CTU in Prague.

  6. Rare event simulation using Monte Carlo methods

    CERN Document Server

    Rubino, Gerardo

    2009-01-01

    In a probabilistic model, a rare event is an event with a very small probability of occurrence. The forecasting of rare events is a formidable task but is important in many areas: for instance, a catastrophic failure in a transport system or in a nuclear power plant, or the failure of an information processing system in a bank or in the communication network of a group of banks, leading to financial losses. Being able to evaluate the probability of rare events is therefore a critical issue. Monte Carlo methods, the simulation of corresponding models, are used to analyze rare events. This book sets out to present the mathematical tools available for the efficient simulation of rare events. Importance sampling and splitting are presented along with an exposition of how to apply these tools to a variety of fields, ranging from performance and dependability evaluation of complex systems, typically in computer science or in telecommunications, to chemical reaction analysis in biology or particle transport in physics. ...
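
    Importance sampling, one of the two techniques named above, can be demonstrated on a textbook rare event: estimating P(X > 6) for a standard normal X. Crude Monte Carlo essentially never observes the event, whereas sampling from a normal shifted onto the rare region and reweighting by the likelihood ratio gives a usable estimate. The sketch below is a generic illustration, not an example taken from the book.

      import math
      import random

      def crude_and_importance_sampling(threshold=6.0, n=100_000, shift=6.0, seed=7):
          rng = random.Random(seed)
          crude_hits, weighted_sum = 0, 0.0
          for _ in range(n):
              x = rng.gauss(0.0, 1.0)                   # crude Monte Carlo sample
              crude_hits += x > threshold
              y = rng.gauss(shift, 1.0)                 # proposal centred on the rare region
              if y > threshold:
                  # likelihood ratio N(0,1)/N(shift,1) evaluated at y
                  weighted_sum += math.exp(-shift * y + 0.5 * shift * shift)
          return crude_hits / n, weighted_sum / n

      # crude estimate is almost always 0.0; the importance-sampling estimate is near 9.9e-10
      print(crude_and_importance_sampling())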

  7. Manufacturing plant performance evaluation by discrete event simulation

    International Nuclear Information System (INIS)

    Rosli Darmawan; Mohd Rasid Osman; Rosnah Mohd Yusuff; Napsiah Ismail; Zulkiflie Leman

    2002-01-01

    A case study was conducted to evaluate the performance of a manufacturing plant using the discrete event simulation technique. The study was carried out on an animal feed production plant, the Sterifeed plant at the Malaysian Institute for Nuclear Technology Research (MINT), Selangor, Malaysia. The plant was modelled based on the actual manufacturing activities recorded by the operators, and the simulation was carried out using discrete event simulation software. The model was validated by comparing the simulation results with the actual operational data of the plant. The simulation results show some weaknesses in the current plant design, and proposals were made to improve the plant performance. (Author)

  8. Synchronization of autonomous objects in discrete event simulation

    Science.gov (United States)

    Rogers, Ralph V.

    1990-01-01

    Autonomous objects in event-driven discrete event simulation offer the potential to combine the freedom of unrestricted movement and positional accuracy through Euclidean space of time-driven models with the computational efficiency of event-driven simulation. The principal challenge to autonomous object implementation is object synchronization. The concept of a spatial blackboard is offered as a potential methodology for synchronization. The issues facing implementation of a spatial blackboard are outlined and discussed.

  9. A hybrid load flow and event driven simulation approach to multi-state system reliability evaluation

    International Nuclear Information System (INIS)

    George-Williams, Hindolo; Patelli, Edoardo

    2016-01-01

    Structural complexity of systems, coupled with their multi-state characteristics, renders their reliability and availability evaluation difficult. Notwithstanding the emergence of various techniques dedicated to complex multi-state system analysis, simulation remains the only approach applicable to realistic systems. However, most simulation algorithms are either system specific or limited to simple systems since they require enumerating all possible system states, defining the cut-sets associated with each state and monitoring their occurrence. In addition to being extremely tedious for large complex systems, state enumeration and cut-set definition require a detailed understanding of the system's failure mechanism. In this paper, a simple and generally applicable simulation approach, enhanced for multi-state systems of any topology is presented. Here, each component is defined as a Semi-Markov stochastic process and via discrete-event simulation, the operation of the system is mimicked. The principles of flow conservation are invoked to determine flow across the system for every performance level change of its components using the interior-point algorithm. This eliminates the need for cut-set definition and overcomes the limitations of existing techniques. The methodology can also be exploited to account for effects of transmission efficiency and loading restrictions of components on system reliability and performance. The principles and algorithms developed are applied to two numerical examples to demonstrate their applicability. - Highlights: • A discrete event simulation model based on load flow principles. • Model does not require system path or cut sets. • Applicable to binary and multi-state systems of any topology. • Supports multiple output systems with competing demand. • Model is intuitive and generally applicable.
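
    The flow-conservation step described above can be illustrated with a tiny linear program: given the capacity that each component's current state allows, maximize the flow delivered from source to sink subject to conservation at the internal nodes. The four-node network and its capacities below are invented for the example, and any LP solver (interior-point or otherwise) could take the place of the default method used here.

      import numpy as np
      from scipy.optimize import linprog

      # nodes: 0 = source, 1 and 2 = internal components, 3 = sink (hypothetical layout)
      edges = [(0, 1), (0, 2), (1, 3), (2, 3), (1, 2)]
      capacity = np.array([8.0, 5.0, 6.0, 7.0, 3.0])      # set by each component's performance level

      # maximize flow out of the source  <=>  minimize its negative
      c = np.array([-1.0 if u == 0 else 0.0 for u, v in edges])

      # flow conservation at the internal nodes only (source and sink are unconstrained)
      internal_nodes = (1, 2)
      A_eq = np.zeros((len(internal_nodes), len(edges)))
      for j, (u, v) in enumerate(edges):
          for row, node in enumerate(internal_nodes):
              if u == node:
                  A_eq[row, j] -= 1.0                     # flow leaving the node
              if v == node:
                  A_eq[row, j] += 1.0                     # flow entering the node
      b_eq = np.zeros(len(internal_nodes))

      bounds = [(0.0, float(cap)) for cap in capacity]
      result = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
      print(-result.fun)                                  # maximum deliverable flow, here 13.0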

  10. Multimodal interaction in the perception of impact events displayed via a multichannel audio and simulated structure-borne vibration

    Science.gov (United States)

    Martens, William L.; Woszczyk, Wieslaw

    2005-09-01

    For multimodal display systems in which realistic reproduction of impact events is desired, presenting structure-borne vibration along with multichannel audio recordings has been observed to create a greater sense of immersion in a virtual acoustic environment. Furthermore, there is an increased proportion of reports that the impact event took place within the observer's local area (this is termed "presence with" the event, in contrast to "presence in" the environment in which the event occurred). While holding the audio reproduction constant, varying the intermodal arrival time and level of mechanically displayed, synthetic whole-body vibration revealed a number of other subjective attributes that depend upon multimodal interaction in the perception of a representative impact event. For example, when the structure-borne component of the displayed impact event arrived 10 to 20 ms later than the airborne component, the intermodal delay was not only tolerated, but gave rise to an increase in the proportion of reports that the impact event had greater power. These results have enabled the refinement of a multimodal simulation in which the manipulation of synthetic whole-body vibration can be used to control perceptual attributes of impact events heard within an acoustic environment reproduced via a multichannel loudspeaker array.

  11. LCG MCDB - a Knowledgebase of Monte Carlo Simulated Events

    CERN Document Server

    Belov, S; Galkin, E; Gusev, A; Pokorski, Witold; Sherstnev, A V

    2008-01-01

    In this paper we report on the LCG Monte Carlo Data Base (MCDB) and the software which has been developed to operate it. The main purpose of the LCG MCDB project is to provide a storage and documentation system for sophisticated event samples simulated for the LHC collaborations by experts. In many cases, modern Monte Carlo simulation of physical processes requires expert knowledge of Monte Carlo generators or a significant amount of CPU time to produce the events. MCDB is a knowledgebase intended mainly to accumulate simulated events of this type. The main motivation behind LCG MCDB is to make such sophisticated MC event samples available to various physics groups. All the data in MCDB are accessible in several convenient ways. LCG MCDB is being developed within the CERN LCG Application Area Simulation project.

  12. Recent advances on thermohydraulic simulation of HTR-10 nuclear reactor core using realistic CFD approach

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Alexandro S., E-mail: alexandrossilva@ifba.edu.br [Instituto Federal de Educacao, Ciencia e Tecnologia da Bahia (IFBA), Vitoria da Conquista, BA (Brazil); Mazaira, Leorlen Y.R., E-mail: leored1984@gmail.com, E-mail: cgh@instec.cu [Instituto Superior de Tecnologias y Ciencias Aplicadas (INSTEC), La Habana (Cuba); Dominguez, Dany S.; Hernandez, Carlos R.G., E-mail: alexandrossilva@gmail.com, E-mail: dsdominguez@gmail.com [Universidade Estadual de Santa Cruz (UESC), Ilheus, BA (Brazil). Programa de Pos-Graduacao em Modelagem Computacional; Lira, Carlos A.B.O., E-mail: cabol@ufpe.br [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil)

    2015-07-01

    High-temperature gas-cooled reactors (HTGRs) have the potential to be used as energy generation sources in the near future, owing to their inherently safe performance achieved through a large amount of graphite, a low power density design, and high conversion efficiency. However, safety is the most important issue for their commercialization in the nuclear energy industry. It is very important for the safety design and operation of an HTGR to investigate its thermal-hydraulic characteristics. In this article, a thermal-hydraulic simulation of compressible flow inside the core of the pebble bed reactor HTR (High Temperature Reactor)-10 was performed using Computational Fluid Dynamics (CFD). A realistic approach was used, in which every closely packed pebble is modelled explicitly with a graphite layer and a sphere of fuel. Because of the high computational cost, it is impossible to simulate the full core; therefore, the geometry used is an FCC (Face Centered Cubic) cell with half the height of the core, comprising 21 layers and 95 pebbles. The input data were taken from the thermal-hydraulic IAEA Benchmark. The results show the velocity and temperature profiles of the coolant in the core, and the temperature distribution inside the pebbles. The maximum temperatures in the pebbles do not exceed the allowable limit for this type of nuclear fuel. (author)

  13. Recent advances on thermohydraulic simulation of HTR-10 nuclear reactor core using realistic CFD approach

    International Nuclear Information System (INIS)

    Silva, Alexandro S.; Mazaira, Leorlen Y.R.; Dominguez, Dany S.; Hernandez, Carlos R.G.

    2015-01-01

    High-temperature gas-cooled reactors (HTGRs) have the potential to be used as energy generation sources in the near future, owing to their inherently safe performance achieved through a large amount of graphite, a low power density design, and high conversion efficiency. However, safety is the most important issue for their commercialization in the nuclear energy industry. It is very important for the safety design and operation of an HTGR to investigate its thermal-hydraulic characteristics. In this article, a thermal-hydraulic simulation of compressible flow inside the core of the pebble bed reactor HTR (High Temperature Reactor)-10 was performed using Computational Fluid Dynamics (CFD). A realistic approach was used, in which every closely packed pebble is modelled explicitly with a graphite layer and a sphere of fuel. Because of the high computational cost, it is impossible to simulate the full core; therefore, the geometry used is an FCC (Face Centered Cubic) cell with half the height of the core, comprising 21 layers and 95 pebbles. The input data were taken from the thermal-hydraulic IAEA Benchmark. The results show the velocity and temperature profiles of the coolant in the core, and the temperature distribution inside the pebbles. The maximum temperatures in the pebbles do not exceed the allowable limit for this type of nuclear fuel. (author)

  14. New simulated gas detector offers realistic training for mine rescue teams

    Energy Technology Data Exchange (ETDEWEB)

    Bealko, S.B.; Alexander, D.; Chasko, L.L. [National Inst. for Occupational Safety and Health, Pittsburgh, PA (United States). Office of Mine Safety and Health Research; Holtan, J. [LightsOn Safety Solutions, Spring, TX (United States)

    2010-07-01

    The National Institute for Occupational Safety and Health, together with LightsOn Safety Solutions, evaluated 2 versions of a multi-gas simulated gas monitor system (GMS) in separate field trials with mine rescue teams. This paper described the GMS wireless simulation tool along with its development and testing. It also described the GMS functions for the initial phase of testing as well as plans for the next phase of research which may introduce tracking and automation features. The GMS requires a personal computer and uses a wireless local area network. The GMS teaches mine rescue members about gas detection and helps them understand the importance of gas concentrations. In addition, it promotes decision-making actions by team members and offers a more realistic method of receiving gas concentration readings using a simulated hand-held gas detector. The purpose of the evaluation was to determine if the electronic placard in the GMS could be used by mine rescue teams instead of the currently used cardboard placards, and if the functionality of the device was suitable, reliable and practical. Results from the second field trial demonstrated improvements with the GMS over the original prototype technology, particularly with regards to wireless and connectivity issues. The GMS was successfully incorporated into the mine rescue exercises as planned, with very few problems encountered. 4 refs., 2 figs.

  15. New simulated gas detector offers realistic training for mine rescue teams

    International Nuclear Information System (INIS)

    Bealko, S.B.; Alexander, D.; Chasko, L.L.

    2010-01-01

    The National Institute for Occupational Safety and Health, together with LightsOn Safety Solutions, evaluated 2 versions of a multi-gas simulated gas monitor system (GMS) in separate field trials with mine rescue teams. This paper described the GMS wireless simulation tool along with its development and testing. It also described the GMS functions for the initial phase of testing as well as plans for the next phase of research which may introduce tracking and automation features. The GMS requires a personal computer and uses a wireless local area network. The GMS teaches mine rescue members about gas detection and helps them understand the importance of gas concentrations. In addition, it promotes decision-making actions by team members and offers a more realistic method of receiving gas concentration readings using a simulated hand-held gas detector. The purpose of the evaluation was to determine if the electronic placard in the GMS could be used by mine rescue teams instead of the currently used cardboard placards, and if the functionality of the device was suitable, reliable and practical. Results from the second field trial demonstrated improvements with the GMS over the original prototype technology, particularly with regards to wireless and connectivity issues. The GMS was successfully incorporated into the mine rescue exercises as planned, with very few problems encountered. 4 refs., 2 figs.

  16. Creating a Realistic Weather Environment for Motion-Based Piloted Flight Simulation

    Science.gov (United States)

    Daniels, Taumi S.; Schaffner, Philip R.; Evans, Emory T.; Neece, Robert T.; Young, Steve D.

    2012-01-01

    A flight simulation environment is being enhanced to facilitate experiments that evaluate research prototypes of advanced onboard weather radar, hazard/integrity monitoring (HIM), and integrated alerting and notification (IAN) concepts in adverse weather conditions. The simulation environment uses weather data based on real weather events to support operational scenarios in a terminal area. A simulated atmospheric environment was realized by using numerical weather data sets. These were produced from the High-Resolution Rapid Refresh (HRRR) model hosted and run by the National Oceanic and Atmospheric Administration (NOAA). To align with the planned flight simulation experiment requirements, several HRRR data sets were acquired courtesy of NOAA. These data sets coincided with severe weather events at the Memphis International Airport (MEM) in Memphis, TN. In addition, representative flight tracks for approaches and departures at MEM were generated and used to develop and test simulations of (1) what onboard sensors such as the weather radar would observe; (2) what datalinks of weather information would provide; and (3) what atmospheric conditions the aircraft would experience (e.g. turbulence, winds, and icing). The simulation includes a weather radar display that provides weather and turbulence modes, derived from the modeled weather along the flight track. The radar capabilities and the pilot's controls simulate current-generation commercial weather radar systems. Appropriate data-linked weather advisories (e.g., SIGMET) were derived from the HRRR weather models and provided to the pilot consistent with NextGen concepts of use for Aeronautical Information Service (AIS) and Meteorological (MET) data link products. The net result of this simulation development was the creation of an environment that supports investigations of new flight deck information systems, methods for incorporation of better weather information, and pilot interface and operational improvements.

  17. Discretely Integrated Condition Event (DICE) Simulation for Pharmacoeconomics.

    Science.gov (United States)

    Caro, J Jaime

    2016-07-01

    Several decision-analytic modeling techniques are in use for pharmacoeconomic analyses. Discretely integrated condition event (DICE) simulation is proposed as a unifying approach that has been deliberately designed to meet the modeling requirements in a straightforward transparent way, without forcing assumptions (e.g., only one transition per cycle) or unnecessary complexity. At the core of DICE are conditions that represent aspects that persist over time. They have levels that can change and many may coexist. Events reflect instantaneous occurrences that may modify some conditions or the timing of other events. The conditions are discretely integrated with events by updating their levels at those times. Profiles of determinant values allow for differences among patients in the predictors of the disease course. Any number of valuations (e.g., utility, cost, willingness-to-pay) of conditions and events can be applied concurrently in a single run. A DICE model is conveniently specified in a series of tables that follow a consistent format and the simulation can be implemented fully in MS Excel, facilitating review and validation. DICE incorporates both state-transition (Markov) models and non-resource-constrained discrete event simulation in a single formulation; it can be executed as a cohort or a microsimulation; and deterministically or stochastically.
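    As a rough illustration of the DICE idea rather than Caro's actual specification, the sketch below keeps a small table of conditions with levels, processes discretely scheduled events that modify those conditions, and integrates cost and utility valuations over the intervals between events. All condition names, rates, and valuation figures are hypothetical.

```python
import random

# Minimal DICE-style sketch (illustrative only): "conditions" persist and carry
# levels; "events" occur at discrete times, update conditions, and may reschedule
# other events. Costs and utilities are accrued as the conditions are discretely
# integrated over the interval between successive events.
def run_patient(horizon=10.0, seed=0):
    rng = random.Random(seed)
    conditions = {"alive": 1, "disease_severity": 1.0}
    valuations = {"cost": 0.0, "qaly": 0.0}
    # Event table: name -> next scheduled time (hypothetical rates).
    events = {"progression": rng.expovariate(0.3), "death": rng.expovariate(0.05)}
    t = 0.0
    while t < horizon:
        name, t_next = min(events.items(), key=lambda kv: kv[1])
        t_next = min(t_next, horizon)
        dt = t_next - t
        # Integrate condition-dependent valuations over the elapsed interval.
        valuations["cost"] += 1000.0 * conditions["disease_severity"] * dt
        valuations["qaly"] += max(0.0, 1.0 - 0.2 * conditions["disease_severity"]) * dt
        t = t_next
        if t >= horizon:
            break
        if name == "death":
            conditions["alive"] = 0
            break
        # Progression event: modify the condition and reschedule itself.
        conditions["disease_severity"] += 1.0
        events["progression"] = t + rng.expovariate(0.3)
    return valuations

print(run_patient())
```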

  18. The null-event method in computer simulation

    International Nuclear Information System (INIS)

    Lin, S.L.

    1978-01-01

    The simulation of collisions of ions moving under the influence of an external field through a neutral gas at non-zero temperature is discussed as an example of computer models of processes in which a probe particle undergoes a series of interactions with an ensemble of other particles, such that the frequency and outcome of the events depend on internal properties of those particles. The introduction of null events removes the need for much complicated algebra, leads to a more efficient simulation, and reduces the likelihood of logical error. (Auth.)
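    A minimal sketch of the null-event (null-collision) idea follows: tentative collisions are sampled at a constant majorant rate, and each is accepted as a real collision with probability equal to the ratio of the true, velocity-dependent rate to the majorant, otherwise treated as a null event that leaves the ion unchanged. The rate function, field strength, and thermalization rule are illustrative, not taken from the paper.

```python
import random

# Null-event sketch: sample tentative collisions at a constant majorant rate
# NU_MAX; accept each as "real" with probability nu(v)/NU_MAX, otherwise treat
# it as a null event that changes nothing. nu(v) below is purely illustrative.
NU_MAX = 5.0

def nu(v):
    return 5.0 * abs(v) / (1.0 + abs(v))    # always <= NU_MAX

def simulate_ion(t_end=100.0, accel=1.0, seed=2):
    rng = random.Random(seed)
    t, v, real, null = 0.0, 0.0, 0, 0
    while True:
        dt = rng.expovariate(NU_MAX)        # time to the next tentative collision
        if t + dt > t_end:
            break
        t += dt
        v += accel * dt                     # free flight under the external field
        if rng.random() < nu(v) / NU_MAX:   # real collision: here, full thermalization
            v = rng.gauss(0.0, 1.0)
            real += 1
        else:                               # null event: state unchanged
            null += 1
    return real, null

print("real/null collisions:", simulate_ion())
```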

  19. The Problem of Realist Events in American Journalism

    Directory of Open Access Journals (Sweden)

    Kevin G. Barnhurst

    2014-10-01

    Since the nineteenth century, more kinds of news outlets and ways of presenting news grew along with telegraphic, telephonic, and digital communications, leading journalists, policymakers, and critics to assume that more events became available than ever before. Attentive audiences say in surveys that they feel overloaded with information, and journalists tend to agree. Although news seems to have become more focused on events, several studies analyzing U.S. news content for the past century and a half show that journalists have been including fewer events within their coverage. In newspapers the events in stories declined over the twentieth century, and national newscasts decreased the share of event coverage since 1968 on television and since 1980 on public radio. Mainstream news websites continued the trend through the 2000s. Instead of providing access to more of the “what”, journalists moved from event-centered to meaning-centered news, still claiming to give a factual account in their stories, built on a foundation of American realism. As journalists concentrated on fewer and bigger events to compete, audiences turned away from mainstream news to look for what seems like an abundance of events in digital media.

  20. Manual for the Jet Event and Background Simulation Library

    Energy Technology Data Exchange (ETDEWEB)

    Heinz, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Soltz, R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Angerami, A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-09-11

    Jets are the collimated streams of particles resulting from hard scattering in the initial state of high-energy collisions. In heavy-ion collisions, jets interact with the quark-gluon plasma (QGP) before freezeout, providing a probe into the internal structure and properties of the QGP. In order to study jets, background must be subtracted from the measured event, potentially introducing a bias. We aim to understand and quantify this subtraction bias. PYTHIA, a library to simulate pure jet events, is used to simulate a model for a signature with one pure jet (a photon) and one quenched jet, where all quenched particle momenta are reduced by a user-defined constant fraction. Background for the event is simulated using multiplicity values generated by the TRENTO initial state model of heavy-ion collisions fed into a thermal model consisting of a 3-dimensional Boltzmann distribution for particle types and momenta. Data from the simulated events is used to train a statistical model, which computes a posterior distribution of the quench factor for a data set. The model was tested first on pure jet events and then on full events including the background. This model will allow for a quantitative determination of biases induced by various methods of background subtraction.
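    A toy sketch of the event-composition step described above (not the library's implementation): jet-particle momenta are scaled by a constant quench factor and embedded in a thermal background drawn from a Boltzmann-like (exponential) pT spectrum, after which a naive average-background subtraction illustrates where the bias comes from. Multiplicity, temperature, and jet momenta are placeholder values.

```python
import random

# Toy sketch of composing a quenched-jet event with a thermal background
# (illustrative placeholders only, not the JEBSimLib code).
def make_event(jet_pts, quench_factor=0.8, background_mult=500, temp=0.3, seed=3):
    rng = random.Random(seed)
    quenched = [pt * quench_factor for pt in jet_pts]              # quench the jet
    background = [rng.expovariate(1.0 / temp) for _ in range(background_mult)]
    return quenched + background                                    # full event (pT list)

def naive_subtraction(event_pts, background_mult, temp):
    # Subtract the *average* background pT sum; residual fluctuations are the
    # kind of bias a statistical model of the quench factor has to account for.
    return sum(event_pts) - background_mult * temp

jet = [30.0, 20.0, 10.0]                     # a toy "quenched jet", GeV
event = make_event(jet)
print("reconstructed jet pT:", round(naive_subtraction(event, 500, 0.3), 1),
      "  true quenched pT:", sum(pt * 0.8 for pt in jet))
```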

  1. Massively parallel simulations of strong electronic correlations: Realistic Coulomb vertex and multiplet effects

    Science.gov (United States)

    Baumgärtel, M.; Ghanem, K.; Kiani, A.; Koch, E.; Pavarini, E.; Sims, H.; Zhang, G.

    2017-07-01

    We discuss the efficient implementation of general impurity solvers for dynamical mean-field theory. We show that both Lanczos and quantum Monte Carlo in different flavors (Hirsch-Fye, continuous-time hybridization- and interaction-expansion) exhibit excellent scaling on massively parallel supercomputers. We apply these algorithms to simulate realistic model Hamiltonians including the full Coulomb vertex, crystal-field splitting, and spin-orbit interaction. We discuss how to remove the sign problem in the presence of non-diagonal crystal-field and hybridization matrices. We show how to extract the physically observable quantities from imaginary time data, in particular correlation functions and susceptibilities. Finally, we present benchmarks and applications for representative correlated systems.

  2. Using discrete-event simulation in strategic capacity planning for an outpatient physical therapy service.

    Science.gov (United States)

    Rau, Chi-Lun; Tsai, Pei-Fang Jennifer; Liang, Sheau-Farn Max; Tan, Jhih-Cian; Syu, Hong-Cheng; Jheng, Yue-Ling; Ciou, Ting-Syuan; Jaw, Fu-Shan

    2013-12-01

    This study uses a simulation model as a tool for strategic capacity planning for an outpatient physical therapy clinic in Taipei, Taiwan. The clinic provides a wide range of physical treatments, with 6 full-time therapists in each session. We constructed a discrete-event simulation model to study the dynamics of patient mixes with realistic treatment plans, and to estimate the practical capacity of the physical therapy room. The changes in time-related and space-related performance measurements were used to evaluate the impact of various strategies on the capacity of the clinic. The simulation results confirmed that the clinic is extremely patient-oriented, with a bottleneck occurring at the traction units for Intermittent Pelvic Traction (IPT), with usage at 58.9 %. Sensitivity analysis showed that attending to more patients would significantly increase the number of patients staying for overtime sessions. We found that pooling the therapists produced beneficial results. The average waiting time per patient could be reduced by 45 % when we pooled 2 therapists. We found that treating up to 12 new patients per session had no significantly negative impact on returning patients. Moreover, we found that the average waiting time for new patients decreased if they were given priority over returning patients when called by the therapists.

  3. Transfer validity of laparoscopic knot-tying training on a VR simulator to a realistic environment : A randomized controlled trial

    NARCIS (Netherlands)

    Verdaasdonk, E.G.G.; Dankelman, J.; Lange, J.F.; Stassen, L.P.S.

    2007-01-01

    Background- Laparoscopic suturing is one of the most difficult tasks in endoscopic surgery, requiring extensive training. The aim of this study was to determine the transfer validity of knot-tying training on a virtual-reality (VR) simulator to a realistic laparoscopic environment. Methods- Twenty

  4. Modeling and simulation of single-event effect in CMOS circuit

    International Nuclear Information System (INIS)

    Yue Suge; Zhang Xiaolin; Zhao Yuanfu; Liu Lin; Wang Hanning

    2015-01-01

    This paper reviews the status of research in modeling and simulation of single-event effects (SEE) in digital devices and integrated circuits. After introducing a brief historical overview of SEE simulation, different level simulation approaches of SEE are detailed, including material-level physical simulation where two primary methods by which ionizing radiation releases charge in a semiconductor device (direct ionization and indirect ionization) are introduced, device-level simulation where the main emerging physical phenomena affecting nanometer devices (bipolar transistor effect, charge sharing effect) and the methods envisaged for taking them into account are focused on, and circuit-level simulation where the methods for predicting single-event response about the production and propagation of single-event transients (SETs) in sequential and combinatorial logic are detailed, as well as the soft error rate trends with scaling are particularly addressed. (review)

  5. Modelling of dynamic and quasistatic events with special focus on wood-drying distortions

    OpenAIRE

    Ekevad, Mats

    2006-01-01

    This thesis deals mainly with computer simulations of wood-drying distortions, especially twist. The reason for this is that such distortions often appear in dried timber, and the results are quality downgrades and thus value losses in the wood value chain. A computer simulation is a way to theoretically simulate what happens in reality when moisture content in timber changes. If the computer simulation model is appropriate and capable of realistic simulations of real events, then it is possi...

  6. Disaster Response Modeling Through Discrete-Event Simulation

    Science.gov (United States)

    Wang, Jeffrey; Gilmer, Graham

    2012-01-01

    Organizations today are required to plan against a rapidly changing, high-cost environment. This is especially true for first responders to disasters and other incidents, where critical decisions must be made in a timely manner to save lives and resources. Discrete-event simulations enable organizations to make better decisions by visualizing complex processes and the impact of proposed changes before they are implemented. A discrete-event simulation using Simio software has been developed to effectively analyze and quantify the imagery capabilities of domestic aviation resources conducting relief missions. This approach has helped synthesize large amounts of data to better visualize process flows, manage resources, and pinpoint capability gaps and shortfalls in disaster response scenarios. Simulation outputs and results have supported decision makers in the understanding of high risk locations, key resource placement, and the effectiveness of proposed improvements.

  7. Progress in realistic LOCA analysis

    Energy Technology Data Exchange (ETDEWEB)

    Young, M Y; Bajorek, S M; Ohkawa, K [Westinghouse Electric Corporation, Pittsburgh, PA (United States)

    1994-12-31

    While LOCA is a complex transient to simulate, the state of the art in thermal hydraulics has advanced sufficiently to allow its realistic prediction and the application of advanced methods to actual reactor design, as demonstrated by the methodology described in this paper. 6 refs, 5 figs, 3 tabs

  8. SU-D-BRF-06: A Brachytherapy Simulator with Realistic Haptic Force Feedback and Real-Time Ultrasounds Image Simulation for Training and Teaching

    International Nuclear Information System (INIS)

    Beaulieu, L; Carette, A; Comtois, S; Lavigueur, M; Cardou, P; Laurendeau, D

    2014-01-01

    Purpose: Surgical procedures require dexterity, expertise and repetition to reach optimal patient outcomes. However, efficient training opportunities are usually limited. This work presents a simulator system with realistic haptic force-feedback and full, real-time ultrasound image simulation. Methods: The simulator is composed of a custom-made Linear-DELTA force-feedback robotic platform. The needle tip is mounted on a force gauge at the end effector of the robot, which responds to needle insertion by providing reaction forces. The 3D geometry of the tissue is represented using a tetrahedral finite element mesh (FEM) mimicking tissue properties. As the needle is inserted/retracted, tissue deformation is computed using a mass-tensor nonlinear visco-elastic FEM. The real-time deformation is fed to the L-DELTA to take into account the force imparted to the needle, providing feedback to the end-user when crossing tissue boundaries or when the needle bends. A real-time 2D US image is also generated synchronously, showing the anatomy, needle insertion and tissue deformation. The simulator is running on an Intel i7 6-core CPU at 3.26 GHz. 3D tissue rendering and ultrasound display are performed on a Windows 7 computer; the FEM computation and L-DELTA control are executed on a similar PC using the Neutrino real-time OS. Both machines communicate through an Ethernet link. Results: The system runs at 500 Hz for an 8333-tetrahedron tissue mesh and a 100-node angular spring needle model. This frame rate ensures a relatively smooth displacement of the needle when pushed or retracted (±20 N in all directions at speeds of up to 2 m/s). Unlike commercially-available haptic platforms, the oblong workspace of the L-DELTA robot complies with that required for brachytherapy needle displacements of 0.1 m by 0.1 m by 0.25 m. Conclusion: We have demonstrated a real-life, realistic brachytherapy simulator developed for prostate implants (LDR/HDR). The platform could be adapted to other sites or training for other

  9. Realistic training scenario simulations and simulation techniques

    Energy Technology Data Exchange (ETDEWEB)

    Dunlop, William H.; Koncher, Tawny R.; Luke, Stanley John; Sweeney, Jerry Joseph; White, Gregory K.

    2017-12-05

    In one embodiment, a system includes a signal generator operatively coupleable to one or more detectors; and a controller, the controller being both operably coupled to the signal generator and configured to cause the signal generator to: generate one or more signals each signal being representative of at least one emergency event; and communicate one or more of the generated signal(s) to a detector to which the signal generator is operably coupled. In another embodiment, a method includes: receiving data corresponding to one or more emergency events; generating at least one signal based on the data; and communicating the generated signal(s) to a detector.

  10. Event-by-event simulation of Einstein-Podolsky-Rosen-Bohm experiments

    NARCIS (Netherlands)

    Zhao, Shuang; De Raedt, Hans; Michielsen, Kristel

    We construct an event-based computer simulation model of the Einstein-Podolsky-Rosen-Bohm experiments with photons. The algorithm is a one-to-one copy of the data gathering and analysis procedures used in real laboratory experiments. We consider two types of experiments, those with a source emitting

  11. Clinical simulation as an evaluation method in health informatics

    DEFF Research Database (Denmark)

    Jensen, Sanne

    2016-01-01

    Safe work processes and information systems are vital in health care. Methods for design of health IT focusing on patient safety are one of many initiatives trying to prevent adverse events. Possible patient safety hazards need to be investigated before health IT is integrated with local clinical work practice, including other technology and organizational structure. Clinical simulation is ideal for proactive evaluation of new technology for clinical work practice. Clinical simulations involve real end-users as they simulate the use of technology in realistic environments performing realistic tasks. A clinical simulation study assesses effects on clinical workflow and enables identification and evaluation of patient safety hazards before implementation at a hospital. Clinical simulation also offers an opportunity to create a space in which healthcare professionals working in different...

  12. The cost of conservative synchronization in parallel discrete event simulations

    Science.gov (United States)

    Nicol, David M.

    1990-01-01

    The performance of a synchronous conservative parallel discrete-event simulation protocol is analyzed. The class of simulation models considered is oriented around a physical domain and possesses a limited ability to predict future behavior. A stochastic model is used to show that as the volume of simulation activity in the model increases relative to a fixed architecture, the complexity of the average per-event overhead due to synchronization, event list manipulation, lookahead calculations, and processor idle time approach the complexity of the average per-event overhead of a serial simulation. The method is therefore within a constant factor of optimal. The analysis demonstrates that on large problems--those for which parallel processing is ideally suited--there is often enough parallel workload so that processors are not usually idle. The viability of the method is also demonstrated empirically, showing how good performance is achieved on large problems using a thirty-two node Intel iPSC/2 distributed memory multiprocessor.
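    The protocol analyzed above can be caricatured in a few lines: in each round every logical process declares a safe horizon from its next local event time plus its lookahead, all processes handle events up to the global minimum horizon, and a barrier separates rounds. The sketch below is sequential Python standing in for the parallel execution, with an arbitrary toy workload; it is illustrative only and not the protocol analyzed in the paper.

```python
import heapq

# Sketch of a synchronous conservative window protocol (illustrative only):
# each round, every logical process (LP) declares a safe horizon = next local
# event time + lookahead; all LPs then process events up to the global minimum
# horizon and (conceptually) barrier-synchronize before the next round.
LOOKAHEAD = 1.0

class LP:
    def __init__(self, name):
        self.name, self.queue, self.now = name, [], 0.0
    def schedule(self, t, payload):
        heapq.heappush(self.queue, (t, payload))
    def horizon(self):
        return (self.queue[0][0] if self.queue else float("inf")) + LOOKAHEAD
    def advance(self, bound, lps):
        while self.queue and self.queue[0][0] <= bound:
            t, payload = heapq.heappop(self.queue)
            self.now = t
            # Toy behavior: forward the event to a neighbor after the lookahead delay.
            target = lps[(lps.index(self) + 1) % len(lps)]
            if t + LOOKAHEAD < 20.0:
                target.schedule(t + LOOKAHEAD, payload)

lps = [LP("A"), LP("B"), LP("C")]
lps[0].schedule(0.0, "token")
while any(lp.queue for lp in lps):
    bound = min(lp.horizon() for lp in lps)   # global safe window for this round
    for lp in lps:                            # "parallel" processing phase
        lp.advance(bound, lps)
print("final local clocks:", {lp.name: lp.now for lp in lps})
```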

  13. SINDBAD: a realistic multi-purpose and scalable X-ray simulation tool for NDT applications

    International Nuclear Information System (INIS)

    Tabary, J.; Hugonnard, P.; Mathy, F.

    2007-01-01

    The X-ray radiographic simulation software SINDBAD has been developed to support the design stage of radiographic systems and to evaluate the efficiency of image processing techniques, in both medical imaging and Non-Destructive Evaluation (NDE) industrial fields. This software can model any radiographic set-up, including the X-ray source, the beam interaction inside the object represented by its Computer Aided Design (CAD) model, and the imaging process in the detector. For each step of the virtual experimental bench, SINDBAD combines different modelling modules, accessed via Graphical User Interfaces (GUI), to provide realistic synthetic images. In this paper, we present an overview of all the functionalities which are available in SINDBAD, with a complete description of all the physics taken into account in the models as well as the CAD and GUI facilities available on many computing platforms. We underline the different modules usable for different applications, which make SINDBAD a multi-purpose and scalable X-ray simulation tool. (authors)

  14. Three Dimensional Simulation of the Baneberry Nuclear Event

    Science.gov (United States)

    Lomov, Ilya N.; Antoun, Tarabay H.; Wagoner, Jeff; Rambo, John T.

    2004-07-01

    Baneberry, a 10-kiloton nuclear event, was detonated at a depth of 278 m at the Nevada Test Site on December 18, 1970. Shortly after detonation, radioactive gases emanating from the cavity were released into the atmosphere through a shock-induced fissure near surface ground zero. Extensive geophysical investigations, coupled with a series of 1D and 2D computational studies were used to reconstruct the sequence of events that led to the catastrophic failure. However, the geological profile of the Baneberry site is complex and inherently three-dimensional, which meant that some geological features had to be simplified or ignored in the 2D simulations. This left open the possibility that features unaccounted for in the 2D simulations could have had an important influence on the eventual containment failure of the Baneberry event. This paper presents results from a high-fidelity 3D Baneberry simulation based on the most accurate geologic and geophysical data available. The results are compared with available data, and contrasted against the results of the previous 2D computational studies.

  15. Optimization of Operations Resources via Discrete Event Simulation Modeling

    Science.gov (United States)

    Joshi, B.; Morris, D.; White, N.; Unal, R.

    1996-01-01

    The resource levels required for operation and support of reusable launch vehicles are typically defined through discrete event simulation modeling. Minimizing these resources constitutes an optimization problem involving discrete variables and simulation. Conventional approaches to solve such optimization problems involving integer valued decision variables are the pattern search and statistical methods. However, in a simulation environment that is characterized by search spaces of unknown topology and stochastic measures, these optimization approaches often prove inadequate. In this paper, we have explored the applicability of genetic algorithms to the simulation domain. Genetic algorithms provide a robust search strategy that does not require continuity and differentiability of the problem domain. The genetic algorithm successfully minimized the operation and support activities for a space vehicle, through a discrete event simulation model. The practical issues associated with simulation optimization, such as stochastic variables and constraints, were also taken into consideration.
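    The coupling of a genetic algorithm to a stochastic simulation objective can be sketched as follows; the queue-like cost function stands in for the operations-and-support simulation model, and the population size, crossover, and mutation choices are illustrative rather than those used in the paper.

```python
import random

# Illustrative sketch: a simple genetic algorithm searching integer resource
# levels (e.g., number of servers/crews) against a stochastic, simulation-based
# objective. The "simulation" here is a toy Monte Carlo queue-cost estimate.
def simulate_cost(servers, rng, arrivals=200, rate=1.0, service=0.9):
    busy_until = [0.0] * servers
    t, wait = 0.0, 0.0
    for _ in range(arrivals):
        t += rng.expovariate(rate)                      # next arrival
        k = min(range(servers), key=lambda i: busy_until[i])
        start = max(t, busy_until[k])
        wait += start - t
        busy_until[k] = start + rng.expovariate(1.0 / service)
    return wait / arrivals + 5.0 * servers              # waiting cost + resource cost

def genetic_search(pop_size=20, generations=30, seed=4):
    rng = random.Random(seed)
    pop = [rng.randint(1, 15) for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=lambda s: simulate_cost(s, rng))
        parents = scored[: pop_size // 2]               # selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = (a + b) // 2                        # crossover
            if rng.random() < 0.3:                      # mutation
                child = max(1, child + rng.choice([-1, 1]))
            children.append(child)
        pop = parents + children
    return min(pop, key=lambda s: simulate_cost(s, rng))

print("best resource level:", genetic_search())
```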

  16. Event-based Simulation Model for Quantum Optics Experiments

    NARCIS (Netherlands)

    De Raedt, H.; Michielsen, K.; Jaeger, G; Khrennikov, A; Schlosshauer, M; Weihs, G

    2011-01-01

    We present a corpuscular simulation model of optical phenomena that does not require the knowledge of the solution of a wave equation of the whole system and reproduces the results of Maxwell's theory by generating detection events one-by-one. The event-based corpuscular model gives a unified

  17. On constructing optimistic simulation algorithms for the discrete event system specification

    International Nuclear Information System (INIS)

    Nutaro, James J.

    2008-01-01

    This article describes a Time Warp simulation algorithm for discrete event models that are described in terms of the Discrete Event System Specification (DEVS). The article shows how the total state transition and total output function of a DEVS atomic model can be transformed into an event processing procedure for a logical process. A specific Time Warp algorithm is constructed around this logical process, and it is shown that the algorithm correctly simulates a DEVS coupled model that consists entirely of interacting atomic models. The simulation algorithm is presented abstractly; it is intended to provide a basis for implementing efficient and scalable parallel algorithms that correctly simulate DEVS models
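    The DEVS atomic-model interface that the article builds on can be shown with a minimal sequential abstract simulator (this is not the Time Warp algorithm itself, only the event-processing form a logical process would consume); the buffer model and its parameters are illustrative.

```python
# Minimal DEVS atomic-model sketch: a model exposes a time-advance function ta,
# internal and external transitions, and an output function; the simulator turns
# these into timestamped event processing. Illustrative model and values only.
class Buffer:
    """One-slot buffer: holds a job for a fixed service time, then emits it."""
    SERVICE = 2.0
    def __init__(self):
        self.job, self.sigma = None, float("inf")
    def ta(self):                          # time advance in the current state
        return self.sigma
    def delta_int(self):                   # internal transition: release the job
        self.job, self.sigma = None, float("inf")
    def delta_ext(self, elapsed, x):       # external transition on input x
        if self.job is None:
            self.job, self.sigma = x, self.SERVICE
        else:
            self.sigma -= elapsed          # busy: just account for elapsed time
    def output(self):                      # output function, fired before delta_int
        return self.job

def simulate(model, external_events, t_end=10.0):
    events = sorted(external_events)       # (time, value) external inputs
    t_last = 0.0
    while True:
        t_int = t_last + model.ta()
        t_ext = events[0][0] if events else float("inf")
        t = min(t_int, t_ext)
        if t >= t_end:
            break
        if t == t_int:                     # internal event: emit output, transition
            print(f"t={t:.1f}: output {model.output()}")
            model.delta_int()
        else:                              # external event
            _, x = events.pop(0)
            model.delta_ext(t - t_last, x)
        t_last = t

simulate(Buffer(), [(1.0, "job-A"), (2.5, "job-B"), (6.0, "job-C")])
```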

  18. Toward realistic pursuit-evasion using a roadmap-based approach

    KAUST Repository

    Rodriguez, Samuel

    2011-05-01

    In this work, we describe an approach for modeling and simulating group behaviors for pursuit-evasion that uses a graph-based representation of the environment and integrates multi-agent simulation with roadmap-based path planning. Our approach can be applied to more realistic scenarios than are typically studied in most previous work, including agents moving in 3D environments such as terrains, multi-story buildings, and dynamic environments. We also support more realistic three-dimensional visibility computations that allow evading agents to hide in crowds or behind hills. We demonstrate the utility of this approach on mobile robots and in simulation for a variety of scenarios including pursuit-evasion and tag on terrains, in multi-level buildings, and in crowds. © 2011 IEEE.

  19. The propagation of orographic gravity waves into the stratosphere. Linear theory, idealized and realistic numerical simulation; Die Ausbreitung orographisch angeregter Schwerewellen in die Stratosphaere. Lineare Theorie, idealisierte und realitaetsnahe numerische Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Leutbecher, M. [DLR Deutsches Zentrum fuer Luft- und Raumfahrt e.V., Wessling (Germany). Inst. fuer Physik der Atmosphaere]

    1998-07-01

    Flow over mountains in the stably stratified atmosphere excites gravity waves. The three-dimensional propagation of these waves into the stratosphere is studied using linear theory as well as idealized and realistic numerical simulations. Stagnation, momentum fluxes and temperature anomalies are analyzed for idealized types of flow. Isolated mountains with elliptical contours are considered. The unperturbed atmosphere has constant wind speed and constant static stability or two layers (troposphere/stratosphere) of constant stability each. Real flow over orography is investigated where gravity waves in the stratosphere have been observed. Characteristics of the gravity wave event over the southern tip of Greenland on 6 January 1992 were recorded on a flight of the ER-2 at an altitude of 20 km. In the second case, polar stratospheric clouds (PSC) were observed by an airborne Lidar over Northern Scandinavia on 9 January 1997. The PSC were induced by temperature anomalies in orographic gravity waves. (orig.)

  20. Considering economic and geological uncertainty in the simulation of realistic investment decisions for CO2-EOR projects in the North Sea

    NARCIS (Netherlands)

    Welkenhuysen, Kris; Rupert, Jort; Compernolle, Tine; Ramirez, Andrea|info:eu-repo/dai/nl/284852414; Swennen, Rudy; Piessens, Kris

    2017-01-01

    The use of anthropogenic CO2 for enhancing oil recovery from mature oil fields in the North Sea has several potential benefits, and a number of assessments have been conducted. It remains, however, difficult to realistically simulate the economic circumstances and decisions, while including the

  1. Modeling the Magnetopause Shadowing Loss during the October 2012 Dropout Event

    Science.gov (United States)

    Tu, Weichao; Cunningham, Gregory

    2017-04-01

    The relativistic electron fluxes in Earth's outer radiation belt are observed to drop by orders of magnitude on timescales of a few hours, events known as radiation belt dropouts. Where do the electrons go during the dropouts? This is one of the most important outstanding questions in radiation belt studies. Radiation belt electrons can be lost either by precipitation into the atmosphere or by transport across the magnetopause into interplanetary space. The latter mechanism is called magnetopause shadowing, usually combined with outward radial diffusion of electrons due to the sharp radial gradient it creates. In order to quantify the relative contribution of these two mechanisms to radiation belt dropout, we performed an event study on the October 2012 dropout event observed by the Van Allen Probes. First, the precipitating MeV electrons observed by multiple NOAA POES satellites at low altitude did not show evidence of enhanced precipitation during the dropout, which suggested that precipitation was not the dominant loss mechanism for the event. Then, in order to simulate the magnetopause shadowing loss and outward radial diffusion during the dropout, we applied a radial diffusion model with electron lifetimes on the order of electron drift periods outside the last closed drift shell. In addition, realistic and event-specific inputs of radial diffusion coefficients (DLL) and last closed drift shell (LCDS) were implemented in the model. Specifically, we used the new DLL developed by Cunningham [JGR 2016], which were estimated in the realistic TS04 [Tsyganenko and Sitnov, JGR 2005] storm-time magnetic field model and include a physical K (second adiabatic invariant), or pitch angle, dependence. Event-specific LCDS traced in the TS04 model with realistic K dependence were also implemented. Our simulation results showed that these event-specific inputs are critical to explain the electron dropout during the event. The new DLL greatly improved the model performance at low L* regions (L
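    For reference, the standard one-dimensional radial diffusion equation with a loss term, of the kind described above, is given below; the specific boundary conditions, DLL parameterization, and lifetimes are event-specific in the study and are not reproduced here.

```latex
% Radial diffusion of the phase space density f(L*, t) at fixed first and second
% adiabatic invariants, with a loss term representing magnetopause shadowing.
\[
  \frac{\partial f}{\partial t}
    = L^{*2}\,\frac{\partial}{\partial L^{*}}
      \left( \frac{D_{LL}}{L^{*2}}\,\frac{\partial f}{\partial L^{*}} \right)
    - \frac{f}{\tau},
\]
% where D_{LL} is the radial diffusion coefficient and \tau is the electron
% lifetime, taken to be of the order of a drift period outside the last closed
% drift shell (and effectively infinite, i.e. no loss, inside it).
```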

  2. Simulation modeling for the health care manager.

    Science.gov (United States)

    Kennedy, Michael H

    2009-01-01

    This article addresses the use of simulation software to solve administrative problems faced by health care managers. Spreadsheet add-ins, process simulation software, and discrete event simulation software are available at a range of costs and levels of complexity. All use the Monte Carlo method to realistically integrate probability distributions into models of the health care environment. Problems typically addressed by health care simulation modeling are facility planning, resource allocation, staffing, patient flow and wait time, routing and transportation, supply chain management, and process improvement.
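    A spreadsheet-style Monte Carlo sketch of the kind of facility-planning question mentioned above, for illustration only: daily admissions and lengths of stay are sampled from assumed distributions to estimate the probability that bed demand exceeds capacity. All figures are hypothetical.

```python
import random

# Monte Carlo facility-planning sketch (hypothetical numbers): estimate the
# probability that daily bed demand exceeds capacity over a year, given random
# admissions and lengths of stay.
def bed_shortage_probability(beds=30, days=365, trials=1000, seed=5):
    rng = random.Random(seed)
    shortages = 0
    for _ in range(trials):
        occupied = []                      # remaining length of stay per patient
        shortage = False
        for _ in range(days):
            occupied = [d - 1 for d in occupied if d > 1]          # discharges
            admissions = sum(1 for _ in range(60) if rng.random() < 0.1)  # ~Binomial(60, 0.1)
            occupied += [max(1, round(rng.gauss(4.0, 2.0))) for _ in range(admissions)]
            if len(occupied) > beds:
                shortage = True
        shortages += shortage
    return shortages / trials

print("P(at least one shortage day in a year):", bed_shortage_probability())
```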

  3. Event-by-Event Simulation of Induced Fission

    Energy Technology Data Exchange (ETDEWEB)

    Vogt, R; Randrup, J

    2007-12-13

    We are developing a novel code that treats induced fission by statistical (or Monte-Carlo) simulation of individual decay chains. After its initial excitation, the fissionable compound nucleus may either deexcite by evaporation or undergo binary fission into a large number of fission channels each with different energetics involving both energy dissipation and deformed scission prefragments. After separation and Coulomb acceleration, each fission fragment undergoes a succession of individual (neutron) evaporations, leading to two bound but still excited fission products (that may further decay electromagnetically and, ultimately, weakly), as well as typically several neutrons. (The inclusion of other possible ejectiles is planned.) This kind of approach makes it possible to study more detailed observables than could be addressed with previous treatments which have tended to focus on average quantities. In particular, any type of correlation observable can readily be extracted from a generated set of events. With a view towards making the code practically useful in a variety of applications, emphasis is being put on making it numerically efficient so that large event samples can be generated quickly. In its present form, the code can generate one million full events in about 12 seconds on a MacBook laptop computer. The development of this qualitatively new tool is still at an early stage and quantitative reproduction of existing data should not be expected until a number of detailed refinements have been implemented.

  4. Event-by-Event Simulation of Induced Fission

    Science.gov (United States)

    Vogt, Ramona; Randrup, Jørgen

    2008-04-01

    We are developing a novel code that treats induced fission by statistical (or Monte-Carlo) simulation of individual decay chains. After its initial excitation, the fissionable compound nucleus may either de-excite by evaporation or undergo binary fission into a large number of fission channels each with different energetics involving both energy dissipation and deformed scission pre-fragments. After separation and Coulomb acceleration, each fission fragment undergoes a succession of individual (neutron) evaporations, leading to two bound but still excited fission products (that may further decay electromagnetically and, ultimately, weakly), as well as typically several neutrons. (The inclusion of other possible ejectiles is planned.) This kind of approach makes it possible to study more detailed observables than could be addressed with previous treatments which have tended to focus on average quantities. In particular, any type of correlation observable can readily be extracted from a generated set of events. With a view towards making the code practically useful in a variety of applications, emphasis is being put on making it numerically efficient so that large event samples can be generated quickly. In its present form, the code can generate one million full events in about 12 seconds on a MacBook laptop computer. The development of this qualitatively new tool is still at an early stage and quantitative reproduction of existing data should not be expected until a number of detailed refinements have been implemented.

  5. Event-by-Event Simulation of Induced Fission

    International Nuclear Information System (INIS)

    Vogt, Ramona; Randrup, Joergen

    2008-01-01

    We are developing a novel code that treats induced fission by statistical (or Monte-Carlo) simulation of individual decay chains. After its initial excitation, the fissionable compound nucleus may either de-excite by evaporation or undergo binary fission into a large number of fission channels each with different energetics involving both energy dissipation and deformed scission pre-fragments. After separation and Coulomb acceleration, each fission fragment undergoes a succession of individual (neutron) evaporations, leading to two bound but still excited fission products (that may further decay electromagnetically and, ultimately, weakly), as well as typically several neutrons. (The inclusion of other possible ejectiles is planned.) This kind of approach makes it possible to study more detailed observables than could be addressed with previous treatments which have tended to focus on average quantities. In particular, any type of correlation observable can readily be extracted from a generated set of events. With a view towards making the code practically useful in a variety of applications, emphasis is being put on making it numerically efficient so that large event samples can be generated quickly. In its present form, the code can generate one million full events in about 12 seconds on a MacBook laptop computer. The development of this qualitatively new tool is still at an early stage and quantitative reproduction of existing data should not be expected until a number of detailed refinements have been implemented.

  6. Event-by-Event Simulation of Induced Fission

    International Nuclear Information System (INIS)

    Vogt, R; Randrup, J

    2007-01-01

    We are developing a novel code that treats induced fission by statistical (or Monte-Carlo) simulation of individual decay chains. After its initial excitation, the fissionable compound nucleus may either deexcite by evaporation or undergo binary fission into a large number of fission channels each with different energetics involving both energy dissipation and deformed scission prefragments. After separation and Coulomb acceleration, each fission fragment undergoes a succession of individual (neutron) evaporations, leading to two bound but still excited fission products (that may further decay electromagnetically and, ultimately, weakly), as well as typically several neutrons. (The inclusion of other possible ejectiles is planned.) This kind of approach makes it possible to study more detailed observables than could be addressed with previous treatments which have tended to focus on average quantities. In particular, any type of correlation observable can readily be extracted from a generated set of events. With a view towards making the code practically useful in a variety of applications, emphasis is being put on making it numerically efficient so that large event samples can be generated quickly. In its present form, the code can generate one million full events in about 12 seconds on a MacBook laptop computer. The development of this qualitatively new tool is still at an early stage and quantitative reproduction of existing data should not be expected until a number of detailed refinements have been implemented.

  7. Towards a unified European electricity market: The contribution of data-mining to support realistic simulation studies

    DEFF Research Database (Denmark)

    Pinto, Tiago; Santos, Gabriel; Pereira, Ivo F.

    2014-01-01

    Worldwide electricity markets have been evolving towards regional and even continental scales. One of the main reasons is the aim of making efficient use of renewable-based generation in places where it exceeds local needs. A reference case of this evolution is the European Electricity Market, where countries are connected and several regional markets were created, each one grouping several countries and supporting transactions of huge amounts of electrical energy. The continuous transformations electricity markets have been experiencing over the years create the need to use simulation platforms to support operators, regulators, and involved players in understanding and dealing with this complex environment. This paper focuses on demonstrating the advantage that real electricity market data has for the creation of realistic simulation scenarios, which allow the study of the impacts...

  8. Event-driven simulation of neural population synchronization facilitated by electrical coupling.

    Science.gov (United States)

    Carrillo, Richard R; Ros, Eduardo; Barbour, Boris; Boucheny, Christian; Coenen, Olivier

    2007-02-01

    Most neural communication and processing tasks are driven by spikes. This has enabled the application of the event-driven simulation schemes. However the simulation of spiking neural networks based on complex models that cannot be simplified to analytical expressions (requiring numerical calculation) is very time consuming. Here we describe briefly an event-driven simulation scheme that uses pre-calculated table-based neuron characterizations to avoid numerical calculations during a network simulation, allowing the simulation of large-scale neural systems. More concretely we explain how electrical coupling can be simulated efficiently within this computation scheme, reproducing synchronization processes observed in detailed simulations of neural populations.
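    The table-based trick can be illustrated in a few lines (this is a caricature, not the authors' simulator): the membrane-potential decay between events is pre-computed into a lookup table so that processing an incoming spike needs only a table read and a state update, with no numerical integration during the network run. The time constant, synaptic weight, and threshold are arbitrary.

```python
import heapq, math

# Illustrative sketch of a table-based event-driven neuron update: the decay of
# the membrane potential between events is pre-calculated into DECAY_TABLE, so
# each spike event is handled with a table lookup instead of ODE integration.
TAU, DT, TABLE_LEN = 20.0, 0.1, 10000
DECAY_TABLE = [math.exp(-i * DT / TAU) for i in range(TABLE_LEN)]   # pre-calculated

def decay(elapsed):
    return DECAY_TABLE[min(int(elapsed / DT), TABLE_LEN - 1)]

def run(spike_inputs, threshold=1.0, weight=0.4):
    v, t_last, out_spikes = 0.0, 0.0, []
    events = list(spike_inputs)            # incoming spike times
    heapq.heapify(events)
    while events:
        t = heapq.heappop(events)
        v = v * decay(t - t_last) + weight  # decay since last event, then add PSP
        t_last = t
        if v >= threshold:                  # fire and reset
            out_spikes.append(t)
            v = 0.0
    return out_spikes

print("output spike times:", run([1.0, 2.0, 2.5, 40.0, 41.0, 41.5]))
```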

  9. Realistic simulation of reduced-dose CT with noise modeling and sinogram synthesis using DICOM CT images

    International Nuclear Information System (INIS)

    Won Kim, Chang; Kim, Jong Hyo

    2014-01-01

    Purpose: Reducing the patient dose while maintaining the diagnostic image quality during CT exams is the subject of a growing number of studies, in which simulations of reduced-dose CT with patient data have been used as an effective technique when exploring the potential of various dose reduction techniques. Difficulties in accessing raw sinogram data, however, have restricted the use of this technique to a limited number of institutions. Here, we present a novel reduced-dose CT simulation technique which provides realistic low-dose images without the requirement of raw sinogram data. Methods: Two key characteristics of CT systems, the noise equivalent quanta (NEQ) and the algorithmic modulation transfer function (MTF), were measured for various combinations of object attenuation and tube currents by analyzing the noise power spectrum (NPS) of CT images obtained with a set of phantoms. Those measurements were used to develop a comprehensive CT noise model covering the reduced x-ray photon flux, object attenuation, system noise, and bow-tie filter, which was then employed to generate a simulated noise sinogram for the reduced-dose condition with the use of a synthetic sinogram generated from a reference CT image. The simulated noise sinogram was filtered with the algorithmic MTF and back-projected to create a noise CT image, which was then added to the reference CT image, finally providing a simulated reduced-dose CT image. The simulation performance was evaluated in terms of the degree of NPS similarity, the noise magnitude, the bow-tie filter effect, and the streak noise pattern at photon starvation sites with the set of phantom images. Results: The simulation results showed good agreement with actual low-dose CT images in terms of their visual appearance and in a quantitative evaluation test. The magnitude and shape of the NPS curves of the simulated low-dose images agreed well with those of real low-dose images, showing discrepancies of less than +/−3.2% in
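    A greatly simplified, pixel-domain stand-in for the idea is sketched below (the actual technique described above works in the sinogram domain with measured NEQ and algorithmic MTF): quantum-noise variance is assumed to scale inversely with the tube-current-time product, and a small smoothing kernel stands in for the MTF shaping. All numbers are illustrative.

```python
import numpy as np

# Simplified low-dose CT simulation sketch (illustrative, not the authors'
# sinogram-domain method): add zero-mean noise whose standard deviation follows
# sigma_ref * sqrt(mAs_ref / mAs_low - 1), shaped by a crude separable kernel
# standing in for the MTF, to a reference image.
def simulate_low_dose(image, sigma_ref, mas_ref, mas_low, seed=6):
    rng = np.random.default_rng(seed)
    extra_sigma = sigma_ref * np.sqrt(mas_ref / mas_low - 1.0)
    noise = rng.normal(0.0, extra_sigma, image.shape)
    kernel = np.array([0.25, 0.5, 0.25])                 # toy stand-in for the MTF
    noise = np.apply_along_axis(lambda r: np.convolve(r, kernel, "same"), 0, noise)
    noise = np.apply_along_axis(lambda r: np.convolve(r, kernel, "same"), 1, noise)
    return image + noise

reference = np.full((64, 64), 40.0)                      # flat water-like ROI (HU)
low = simulate_low_dose(reference, sigma_ref=10.0, mas_ref=200, mas_low=50)
print("simulated added noise std:", round(float(low.std()), 2))
```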

  10. Realistic full wave modeling of focal plane array pixels.

    Energy Technology Data Exchange (ETDEWEB)

    Campione, Salvatore [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States). Electromagnetic Theory Dept.; Warne, Larry K. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States). Electromagnetic Theory Dept.; Jorgenson, Roy E. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States). Electromagnetic Theory Dept.; Davids, Paul [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States). Applied Photonic Microsystems Dept.; Peters, David W. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States). Applied Photonic Microsystems Dept.

    2017-11-01

    Here, we investigate full-wave simulations of realistic implementations of multifunctional nanoantenna enabled detectors (NEDs). We focus on a 2x2 pixelated array structure that supports two wavelengths of operation. We design each resonating structure independently using full-wave simulations with periodic boundary conditions mimicking the whole infinite array. We then construct a supercell made of a 2x2 pixelated array with periodic boundary conditions mimicking the full NED; in this case, however, each pixel comprises 10-20 antennas per side. In this way, the cross-talk between contiguous pixels is accounted for in our simulations. We observe that, even though there are finite extent effects, the pixels work as designed, each responding at the respective wavelength of operation. This allows us to stress that realistic simulations of multifunctional NEDs need to be performed to verify the design functionality by taking into account finite extent and cross-talk effects.

  11. Running Parallel Discrete Event Simulators on Sierra

    Energy Technology Data Exchange (ETDEWEB)

    Barnes, P. D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Jefferson, D. R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-12-03

    In this proposal we consider porting the ROSS/Charm++ simulator and the discrete event models that run under its control so that they run on the Sierra architecture and make efficient use of the Volta GPUs.

  12. Reproductive Health Services Discrete-Event Simulation

    OpenAIRE

    Lee, Sungjoo; Giles, Denise F.; Goldsman, David; Cook, Douglas A.; Mishra, Ninad; McCarthy, Brian

    2006-01-01

    Low-resource healthcare environments are often characterized by patient flow patterns with varying patient risks, extensive patient waiting times, uneven workload distributions, and inefficient service delivery. Models from industrial and systems engineering allow for a closer examination of these processes by applying discrete-event computer simulation techniques to evaluate and optimize hospital performance.

  13. Realistic simulations of coaxial atomisation

    Science.gov (United States)

    Zaleski, Stephane; Fuster, Daniel; Arrufat Jackson, Tomas; Ling, Yue; Cenni, Matteo; Scardovelli, Ruben; Tryggvason, Gretar

    2015-11-01

    We discuss advances in the methodology for Direct Numerical Simulations of coaxial atomization in typical experimental conditions. Such conditions are extremely demanding for the numerical methods. The key difficulty seems to be the combination of high density ratios, surface tension, and large Reynolds numbers. We explore how using a momentum-conserving Volume-Of-Fluid scheme allows us to improve the stability and accuracy of the simulations. We show computational evidence that the use of momentum-conserving methods allows the required number of grid points to be reduced by an order of magnitude in the simple case of a falling rain drop. We then apply these ideas to coaxial atomization. We show that in moderate-size simulations under air-water conditions close to real experiments, instabilities are still present, and we then discuss ways to fix them. Among those, removing small VOF debris and improving the time-stepping scheme are two important directions. The accuracy of the simulations is then discussed in comparison with experimental results, in particular the angle of ejection of the structures. The code used for this research is free and distributed at http://parissimulator.sf.net.

  14. Super-Eddington Accretion in Tidal Disruption Events: the Impact of Realistic Fallback Rates on Accretion Rates

    Science.gov (United States)

    Wu, Samantha; Coughlin, Eric R.; Nixon, Chris

    2018-04-01

    After the tidal disruption of a star by a massive black hole, disrupted stellar debris can fall back to the hole at a rate significantly exceeding its Eddington limit. To understand how black hole mass affects the duration of super-Eddington accretion in tidal disruption events, we first run a suite of simulations of the disruption of a Solar-like star by a supermassive black hole of varying mass to directly measure the fallback rate onto the hole, and we compare these fallback rates to the analytic predictions of the "frozen-in" model. Then, adopting a Zero-Bernoulli Accretion flow as an analytic prescription for the accretion flow around the hole, we investigate how the accretion rate onto the black hole evolves with the more accurate fallback rates calculated from the simulations. We find that numerically-simulated fallback rates yield accretion rates onto the hole that can, depending on the black hole mass, be nearly an order of magnitude larger than those predicted by the frozen-in approximation. Our results place new limits on the maximum black hole mass for which super-Eddington accretion occurs in tidal disruption events.
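
    The "frozen-in" approximation referenced above admits a simple closed-form fallback estimate; the sketch below computes the return time of the most bound debris and the canonical t^(-5/3) fallback decline for a Solar-like star, using textbook scalings rather than anything fitted in the paper, so the constants and the function name are illustrative only.

        import numpy as np

        G     = 6.674e-8        # cm^3 g^-1 s^-2
        M_SUN = 1.989e33        # g
        R_SUN = 6.957e10        # cm

        def frozen_in_fallback(m_bh_solar, t, m_star=M_SUN, r_star=R_SUN):
            """Canonical 'frozen-in' estimate of the fallback rate (g/s).

            m_bh_solar : black hole mass in solar masses
            t          : time since disruption in seconds (scalar or array)
            Returns (t_fb, mdot): return time of the most bound debris and the
            t^(-5/3) fallback rate for t >= t_fb (zero before).
            """
            m_bh = m_bh_solar * M_SUN
            r_t  = r_star * (m_bh / m_star) ** (1.0 / 3.0)   # tidal radius
            # specific orbital energy of the most bound debris
            e_min = G * m_bh * r_star / r_t**2
            # Keplerian period associated with that energy
            t_fb = 2.0 * np.pi * G * m_bh / (2.0 * e_min) ** 1.5
            t    = np.atleast_1d(t).astype(float)
            mdot = np.where(t >= t_fb,
                            (m_star / (3.0 * t_fb)) * (t / t_fb) ** (-5.0 / 3.0),
                            0.0)
            return t_fb, mdot

        # Example: compare peak fallback times for 1e6 and 1e7 solar-mass holes
        for m in (1e6, 1e7):
            t_fb, _ = frozen_in_fallback(m, 1.0)
            print(f"M_BH = {m:.0e} M_sun  ->  t_fb ~ {t_fb / 86400.0:.1f} days")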

  15. ANALYSIS OF INPATIENT HOSPITAL STAFF MENTAL WORKLOAD BY MEANS OF DISCRETE-EVENT SIMULATION

    Science.gov (United States)

    2016-03-24

    Analysis of Inpatient Hospital Staff Mental Workload by Means of Discrete-Event Simulation, AFIT-ENV-MS-16-M-166, Erich W.

  16. Manual for the Jet Event and Background Simulation Library (JEBSimLib)

    Energy Technology Data Exchange (ETDEWEB)

    Heinz, Matthias [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Soltz, Ron [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Angerami, Aaron [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-08-29

    Jets are the collimated streams of particles resulting from hard scattering in the initial state of high-energy collisions. In heavy-ion collisions, jets interact with the quark-gluon plasma (QGP) before freezeout, providing a probe into the internal structure and properties of the QGP. In order to study jets, background must be subtracted from the measured event, potentially introducing a bias. We aim to understand and quantify this subtraction bias. PYTHIA, a library to simulate pure jet events, is used to simulate a model for a signature with one pure jet (a photon) and one quenched jet, where all quenched particle momenta are reduced by a user-defined constant fraction. Background for the event is simulated using multiplicity values generated by the TRENTO initial state model of heavy-ion collisions fed into a thermal model consisting of a 3-dimensional Boltzmann distribution for particle types and momenta. Data from the simulated events is used to train a statistical model, which computes a posterior distribution of the quench factor for a data set. The model was tested first on pure jet events and then on full events including the background. This model will allow for a quantitative determination of biases induced by various methods of background subtraction.
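
    The quenching prescription described above (all particle momenta in one jet reduced by a constant fraction, embedded in a thermal background) is easy to mock up independently of PYTHIA and TRENTO; the following toy sketch is illustrative only, with every parameter value and function name invented here rather than taken from JEBSimLib.

        import numpy as np

        rng = np.random.default_rng(seed=1)

        def quench_jet(particle_pts, quench_factor):
            """Scale every particle transverse momentum in the jet by a constant
            fraction, mimicking a uniformly quenched jet."""
            return np.asarray(particle_pts) * quench_factor

        def thermal_background(multiplicity, temperature=0.3):
            """Draw background particle pT from a simple Boltzmann-like
            (exponential) spectrum; 'temperature' in GeV is illustrative."""
            return rng.exponential(scale=temperature, size=multiplicity)

        # Toy "event": a photon-tagged jet with known particle pTs (GeV)
        jet_pts  = np.array([40.0, 25.0, 12.0, 6.0, 3.0])
        quenched = quench_jet(jet_pts, quench_factor=0.8)   # 20% energy loss
        bkg      = thermal_background(multiplicity=2000)

        measured_jet_pt = quenched.sum() + bkg[:50].sum()   # naive cone capture
        print(f"true quenched jet pT: {quenched.sum():.1f} GeV, "
              f"with background contamination: {measured_jet_pt:.1f} GeV")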

  17. Simulation and study of small numbers of random events

    Science.gov (United States)

    Shelton, R. D.

    1986-01-01

    Random events were simulated by computer and subjected to various statistical methods to extract important parameters. Various forms of curve fitting were explored, such as least squares, least distance from a line, and maximum likelihood. Problems considered were dead time, exponential decay, and spectrum extraction from cosmic ray data using binned data and data from individual events. Computer programs, mostly of an iterative nature, were developed to do these simulations and extractions and are partially listed as appendices. The mathematical basis for the computer programs is given.
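
    For the exponential-decay problem mentioned above, the maximum-likelihood estimate of the mean lifetime from a small number of recorded event times has a closed form (the sample mean); a brief sketch with simulated data standing in for the original appendix programs:

        import numpy as np

        rng = np.random.default_rng(seed=7)

        true_tau = 2.5                    # true mean lifetime (arbitrary units)
        n_events = 10                     # deliberately small sample
        times    = rng.exponential(true_tau, size=n_events)

        # For an exponential pdf f(t) = (1/tau) exp(-t/tau), maximizing the
        # log-likelihood sum(-log(tau) - t_i/tau) gives tau_hat = mean(t_i).
        tau_hat   = times.mean()
        tau_sigma = tau_hat / np.sqrt(n_events)   # leading-order uncertainty

        print(f"ML estimate: tau = {tau_hat:.2f} +/- {tau_sigma:.2f} "
              f"(true value {true_tau})")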

  18. Description of the signal and background event mixing as implemented in the Marlin processor OverlayTiming

    CERN Document Server

    Schade, P

    2011-01-01

    This note documents OverlayTiming, a processor in the Marlin software framework. OverlayTiming can model the timing structure of a linear collider bunch train and offers the possibility to merge simulated physics events with beam-beam background events. In addition, a realistic structure of the detector readout can be imitated by defining readout time windows for each subdetector.

  19. Realistic RF system and Beam Simulation in Real Time for a Synchrotron

    CERN Document Server

    Tückmantel, Joachim

    2001-01-01

    Due to heavy beam loading with gaps in the LHC beams, RF and beam are intimately linked into a complex system with fast transients where the RF loops and their limitations play a decisive role. Such a system is difficult to assess with analytical methods. To learn about overall system stability and to define the RF components to be built, it is essential to understand the complete system long before the machine really exists. Therefore the author has written a general-purpose real-time simulation program and applied it to model the LHC machine with its beam pattern and complete double RF system. The latter is equipped with fast RF vector feedback loops having loop delay, transmitter power limitation and limited amplifier bandwidth, as well as including one-turn-delay feedback and longitudinal batch injection damping. The evolution of all RF and beam quantities can be displayed graphically turn by turn. These frames can be assembled into a realistic multi-trace scope movie.

  20. Analysis of nucleation events in the European boundary layer using the regional aerosol–climate model REMO-HAM with a solar radiation-driven OH-proxy

    Directory of Open Access Journals (Sweden)

    J.-P. Pietikäinen

    2014-11-01

    This work describes improvements in the regional aerosol–climate model REMO-HAM in order to simulate more realistically the process of atmospheric new particle formation (NPF). A new scheme was implemented to simulate OH radical concentrations using a proxy approach based on observations and also accounting for the effects of clouds upon OH concentrations. Second, the nucleation rate calculation was modified to directly simulate the formation rates of 3 nm particles, which removes some unnecessary steps in the formation rate calculations used earlier in the model. Using the updated model version, NPF over Europe was simulated for the periods 2003–2004 and 2008–2009. The statistics of the simulated particle formation events were subsequently compared to observations from 13 ground-based measurement sites. The new model shows improved agreement with the observed NPF rates compared to former versions and can simulate the event statistics realistically for most parts of Europe.

  1. Does preliminary optimisation of an anatomically correct skull-brain model using simple simulants produce clinically realistic ballistic injury fracture patterns?

    Science.gov (United States)

    Mahoney, P F; Carr, D J; Delaney, R J; Hunt, N; Harrison, S; Breeze, J; Gibb, I

    2017-07-01

    Ballistic head injury remains a significant threat to military personnel. Studying such injuries requires a model that can be used with a military helmet. This paper describes further work on a skull-brain model using skulls made from three different polyurethane plastics and a series of skull 'fills' to simulate brain (3, 5, 7 and 10% gelatine by mass and PermaGel™). The models were subjected to ballistic impact from 7.62 × 39 mm mild steel core bullets. The first part of the work compares the different polyurethanes (mean bullet muzzle velocity of 708 m/s), and the second part compares the different fills (mean bullet muzzle velocity of 680 m/s). The impact events were filmed using high speed cameras. The resulting fracture patterns in the skulls were reviewed and scored by five clinicians experienced in assessing penetrating head injury. In over half of the models, one or more assessors felt aspects of the fracture pattern were close to real injury. Limitations of the model include the skull being manufactured in two parts and the lack of a realistic skin layer. Further work is ongoing to address these.

  2. Perceived synchrony for realistic and dynamic audiovisual events.

    Science.gov (United States)

    Eg, Ragnhild; Behne, Dawn M

    2015-01-01

    In well-controlled laboratory experiments, researchers have found that humans can perceive delays between auditory and visual signals as short as 20 ms. Conversely, other experiments have shown that humans can tolerate audiovisual asynchrony that exceeds 200 ms. This seeming contradiction in human temporal sensitivity can be attributed to a number of factors such as experimental approaches and precedence of the asynchronous signals, along with the nature, duration, location, complexity and repetitiveness of the audiovisual stimuli, and even individual differences. In order to better understand how temporal integration of audiovisual events occurs in the real world, we need to close the gap between the experimental setting and the complex setting of everyday life. With this work, we aimed to contribute one brick to the bridge that will close this gap. We compared perceived synchrony for long-running and eventful audiovisual sequences to shorter sequences that contain a single audiovisual event, for three types of content: action, music, and speech. The resulting windows of temporal integration showed that participants were better at detecting asynchrony for the longer stimuli, possibly because the long-running sequences contain multiple corresponding events that offer audiovisual timing cues. Moreover, the points of subjective simultaneity differ between content types, suggesting that the nature of a visual scene could influence the temporal perception of events. An expected outcome from this type of experiment was the rich variation among participants' distributions and the derived points of subjective simultaneity. Hence, the designs of similar experiments call for more participants than traditional psychophysical studies. Heeding this caution, we conclude that existing theories on multisensory perception are ready to be tested on more natural and representative stimuli.

  3. Modeling Supermassive Black Holes in Cosmological Simulations

    Science.gov (United States)

    Tremmel, Michael

    My thesis work has focused on improving the implementation of supermassive black hole (SMBH) physics in cosmological hydrodynamic simulations. SMBHs are ubiquitous in massive galaxies, as well as bulge-less galaxies and dwarfs, and are thought to be a critical component to massive galaxy evolution. Still, much is unknown about how SMBHs form, grow, and affect their host galaxies. Cosmological simulations are an invaluable tool for understanding the formation of galaxies, self-consistently tracking their evolution with realistic merger and gas accretion histories. SMBHs are often modeled in these simulations (generally as a necessity to produce realistic massive galaxies), but their implementations are commonly simplified in ways that can limit what can be learned. Current and future observations are opening new windows into the lifecycle of SMBHs and their host galaxies, but require more detailed, physically motivated simulations. Within the novel framework I have developed, SMBHs 1) are seeded at early times without a priori assumptions of galaxy occupation, 2) grow in a way that accounts for the angular momentum of gas, and 3) experience realistic orbital evolution. I show how this model, properly tuned with a novel parameter optimization technique, results in realistic galaxies and SMBHs. Utilizing the unique ability of these simulations to capture the dynamical evolution of SMBHs, I present the first self-consistent prediction for the formation timescales of close SMBH pairs, precursors to SMBH binaries and merger events potentially detected by future gravitational wave experiments.

  4. Parallel Stochastic discrete event simulation of calcium dynamics in neuron.

    Science.gov (United States)

    Ishlam Patoary, Mohammad Nazrul; Tropper, Carl; McDougal, Robert A; Zhongwei, Lin; Lytton, William W

    2017-09-26

    The intra-cellular calcium signaling pathways of a neuron depend on both biochemical reactions and diffusion. Some quasi-isolated compartments (e.g. spines) are so small and calcium concentrations are so low that one extra molecule diffusing in by chance can make a nontrivial difference in its concentration (percentage-wise). These rare events can affect dynamics discretely in such a way that they cannot be evaluated by a deterministic simulation. Stochastic models of such a system provide a more detailed understanding of these systems than existing deterministic models because they capture their behavior at a molecular level. Our research focuses on the development of a high performance parallel discrete event simulation environment, Neuron Time Warp (NTW), which is intended for use in the parallel simulation of stochastic reaction-diffusion systems such as intracellular calcium signaling. NTW is integrated with NEURON, a simulator which is widely used within the neuroscience community. We simulate two models, a calcium buffer and a calcium wave model. The calcium buffer model is employed in order to verify the correctness and performance of NTW by comparing it to a serial deterministic simulation in NEURON. We also derived a discrete event calcium wave model from a deterministic model using the stochastic IP3R structure.
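
    The internals of NTW are not reproduced here, but the underlying idea of treating low-copy-number chemistry as a sequence of discrete stochastic events is captured by Gillespie's direct method; the sketch below simulates a single calcium-buffer reaction Ca + B <-> CaB event by event, with all rate constants chosen purely for illustration.

        import numpy as np

        rng = np.random.default_rng(seed=3)

        def gillespie_buffer(ca, b, cab, k_on=0.01, k_off=0.1, t_end=50.0):
            """Exact stochastic simulation of Ca + B <-> CaB with integer copy
            numbers; returns lists of event times and free-Ca counts."""
            t, times, trace = 0.0, [0.0], [ca]
            while t < t_end:
                a_bind   = k_on * ca * b       # propensity of Ca + B -> CaB
                a_unbind = k_off * cab         # propensity of CaB -> Ca + B
                a_total  = a_bind + a_unbind
                if a_total == 0.0:
                    break
                t += rng.exponential(1.0 / a_total)       # time to next event
                if rng.random() < a_bind / a_total:       # choose which event
                    ca, b, cab = ca - 1, b - 1, cab + 1
                else:
                    ca, b, cab = ca + 1, b + 1, cab - 1
                times.append(t)
                trace.append(ca)
            return times, trace

        times, ca_trace = gillespie_buffer(ca=20, b=100, cab=0)
        print(f"{len(times) - 1} reaction events; final free Ca count: {ca_trace[-1]}")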

  5. Fatigue - determination of a more realistic usage factor

    International Nuclear Information System (INIS)

    Lang, H.

    2001-01-01

    The ability to use a suitable counting method for determining the stress range spectrum in elastic and simplified elastic-plastic fatigue analyses is of crucial importance for enabling determination of a realistic usage factor. Determination of the elastic-plastic strain range using the Ke factor from fictitious elastically calculated loads is also important in the event of elastic behaviour being exceeded. This paper thus examines both points in detail. A fatigue module with additional options, which functions on this basis, is presented. The much more realistic determination of the usage factor presented here offers various economic benefits depending on the application
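
    The usage factor discussed above is conventionally accumulated from the counted stress-range spectrum by a linear damage (Miner's rule) summation against an allowable-cycles curve; the sketch below assumes a counted spectrum is already available and uses a placeholder design curve, so the constants and the Ke value are illustrative and not taken from the paper.

        def allowable_cycles(stress_range_mpa):
            """Illustrative design-fatigue-curve stand-in: N_allow = C / S^m.
            The constants are placeholders, not values from any code or standard."""
            C, m = 1.0e12, 3.0
            return C / stress_range_mpa ** m

        def usage_factor(counted_spectrum, ke=1.0):
            """Cumulative usage factor by Miner's rule.

            counted_spectrum : list of (stress_range_mpa, n_cycles) pairs produced
                               by whatever cycle-counting method is used upstream.
            ke               : elastic-plastic correction applied to the fictitious
                               elastically calculated stress ranges.
            """
            return sum(n / allowable_cycles(ke * s) for s, n in counted_spectrum)

        # Example spectrum: (stress range in MPa, number of counted cycles)
        spectrum = [(300.0, 50), (200.0, 400), (120.0, 5000)]
        print(f"usage factor U = {usage_factor(spectrum, ke=1.2):.3f}")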

  6. Building a ROS-Based Testbed for Realistic Multi-Robot Simulation: Taking the Exploration as an Example

    Directory of Open Access Journals (Sweden)

    Zhi Yan

    2017-09-01

    While the robotics community agrees that benchmarking is of high importance to objectively compare different solutions, there are only few and limited tools to support it. To address this issue in the context of multi-robot systems, we have defined a benchmarking process based on experimental designs, aimed at improving the reproducibility of experiments by making explicit all elements of a benchmark such as parameters, measurements and metrics. We have also developed a ROS (Robot Operating System)-based testbed with the goal of making it easy for users to validate, benchmark, and compare different algorithms including coordination strategies. Our testbed uses the MORSE (Modular OpenRobots Simulation Engine) simulator for realistic simulation and a computer cluster for decentralized computation. In this paper, we present our testbed in detail with the architecture and infrastructure, the issues encountered in implementing the infrastructure, and the automation of the deployment. We also report a series of experiments on multi-robot exploration, in order to demonstrate the capabilities of our testbed.

  7. Realistic Affective Forecasting: The Role of Personality

    Science.gov (United States)

    Hoerger, Michael; Chapman, Ben; Duberstein, Paul

    2016-01-01

    Affective forecasting often drives decision making. Although affective forecasting research has often focused on identifying sources of error at the event level, the present investigation draws upon the ‘realistic paradigm’ in seeking to identify factors that similarly influence predicted and actual emotions, explaining their concordance across individuals. We hypothesized that the personality traits neuroticism and extraversion would account for variation in both predicted and actual emotional reactions to a wide array of stimuli and events (football games, an election, Valentine’s Day, birthdays, happy/sad film clips, and an intrusive interview). As hypothesized, individuals who were more introverted and neurotic anticipated, correctly, that they would experience relatively more unpleasant emotional reactions, and those who were more extraverted and less neurotic anticipated, correctly, that they would experience relatively more pleasant emotional reactions. Personality explained 30% of the concordance between predicted and actual emotional reactions. Findings suggest three purported personality processes implicated in affective forecasting, highlight the importance of individual-differences research in this domain, and call for more research on realistic affective forecasts. PMID:26212463

  8. Development of a realistic human airway model.

    Science.gov (United States)

    Lizal, Frantisek; Elcner, Jakub; Hopke, Philip K; Jedelsky, Jan; Jicha, Miroslav

    2012-03-01

    Numerous models of human lungs with various levels of idealization have been reported in the literature; consequently, results acquired using these models are difficult to compare to in vivo measurements. We have developed a set of model components based on realistic geometries, which permits the analysis of the effects of subsequent model simplification. A realistic digital upper airway geometry, except for the lack of an oral cavity, has been created which proved suitable both for computational fluid dynamics (CFD) simulations and for the fabrication of physical models. Subsequently, an oral cavity was added to the tracheobronchial geometry. The airway geometry including the oral cavity was adjusted to enable fabrication of a semi-realistic model. Five physical models were created based on these three digital geometries. Two optically transparent models, one with and one without the oral cavity, were constructed for flow velocity measurements; two realistic segmented models, one with and one without the oral cavity, were constructed for particle deposition measurements; and a semi-realistic model with glass cylindrical airways was developed for optical measurements of flow velocity and in situ particle size measurements. One-dimensional phase Doppler anemometry measurements were made and compared to the CFD calculations for this model and good agreement was obtained.

  9. Simulation of quantum computation : A deterministic event-based approach

    NARCIS (Netherlands)

    Michielsen, K; De Raedt, K; De Raedt, H

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and

  10. Simulation of Quantum Computation : A Deterministic Event-Based Approach

    NARCIS (Netherlands)

    Michielsen, K.; Raedt, K. De; Raedt, H. De

    2005-01-01

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and

  11. Visual Data-Analytics of Large-Scale Parallel Discrete-Event Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Ross, Caitlin; Carothers, Christopher D.; Mubarak, Misbah; Carns, Philip; Ross, Robert; Li, Jianping Kelvin; Ma, Kwan-Liu

    2016-11-13

    Parallel discrete-event simulation (PDES) is an important tool in the codesign of extreme-scale systems because PDES provides a cost-effective way to evaluate designs of high-performance computing systems. Optimistic synchronization algorithms for PDES, such as Time Warp, allow events to be processed without global synchronization among the processing elements. A rollback mechanism is provided when events are processed out of timestamp order. Although optimistic synchronization protocols enable the scalability of large-scale PDES, the performance of the simulations must be tuned to reduce the number of rollbacks and provide an improved simulation runtime. To enable efficient large-scale optimistic simulations, one has to gain insight into the factors that affect the rollback behavior and simulation performance. We developed a tool for ROSS model developers that gives them detailed metrics on the performance of their large-scale optimistic simulations at varying levels of simulation granularity. Model developers can use this information for parameter tuning of optimistic simulations in order to achieve better runtime and fewer rollbacks. In this work, we instrument the ROSS optimistic PDES framework to gather detailed statistics about the simulation engine. We have also developed an interactive visualization interface that uses the data collected by the ROSS instrumentation to understand the underlying behavior of the simulation engine. The interface connects real time to virtual time in the simulation and provides the ability to view simulation data at different granularities. We demonstrate the usefulness of our framework by performing a visual analysis of the dragonfly network topology model provided by the CODES simulation framework built on top of ROSS. The instrumentation needs to minimize overhead in order to accurately collect data about the simulation performance. To ensure that the instrumentation does not introduce unnecessary overhead, we perform a
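
    One of the simplest quantities such instrumentation yields is a per-processing-element rollback efficiency (committed events as a fraction of processed events); the sketch below shows how collected counters might be aggregated, with field names that are illustrative only and not the actual ROSS instrumentation schema.

        from dataclasses import dataclass

        @dataclass
        class PeSample:
            """Counters collected for one processing element in one sampling
            interval. Field names are illustrative only."""
            pe_id: int
            events_processed: int
            events_rolled_back: int
            virtual_time: float

        def rollback_efficiency(samples):
            """Return {pe_id: efficiency}, where efficiency is the fraction of
            processed events that were NOT later rolled back."""
            eff = {}
            for s in samples:
                committed = s.events_processed - s.events_rolled_back
                eff[s.pe_id] = committed / s.events_processed if s.events_processed else 1.0
            return eff

        samples = [PeSample(0, 120_000, 8_000, 5.0e3),
                   PeSample(1, 115_000, 31_000, 5.0e3)]
        for pe, e in rollback_efficiency(samples).items():
            print(f"PE {pe}: rollback efficiency {e:.2%}")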

  12. Challenges to the development of complex virtual reality surgical simulations.

    Science.gov (United States)

    Seymour, N E; Røtnes, J S

    2006-11-01

    Virtual reality simulation in surgical training has become more widely used and intensely investigated in an effort to develop safer, more efficient, measurable training processes. The development of virtual reality simulation of surgical procedures has begun, but well-described technical obstacles must be overcome to permit varied training in a clinically realistic computer-generated environment. These challenges include development of realistic surgical interfaces and physical objects within the computer-generated environment, modeling of realistic interactions between objects, rendering of the surgical field, and development of signal processing for complex events associated with surgery. Of these, the realistic modeling of tissue objects that are fully responsive to surgical manipulations is the most challenging. Threats to early success include relatively limited resources for development and procurement, as well as smaller potential for return on investment than in other simulation industries that face similar problems. Despite these difficulties, steady progress continues to be made in these areas. If executed properly, virtual reality offers inherent advantages over other training systems in creating a realistic surgical environment and facilitating measurement of surgeon performance. Once developed, complex new virtual reality training devices must be validated for their usefulness in formative training and assessment of skill to be established.

  13. Analysis of the Steam Generator Tubes Rupture Initiating Event

    International Nuclear Information System (INIS)

    Trillo, A.; Minguez, E.; Munoz, R.; Melendez, E.; Sanchez-Perea, M.; Izquierd, J.M.

    1998-01-01

    In PSA studies, Event Tree-Fault Tree techniques are used to analyse the consequences associated with the evolution of an initiating event. The Event Tree is built in the sequence identification stage, following the expected behaviour of the plant in a qualitative way. Computer simulation of the sequences is performed mainly to determine the allowed time for operator actions, and does not play a central role in ET validation. The simulation of the sequence evolution can instead be performed by using standard tools, helping the analyst obtain a more realistic ET. Long-existing methods and tools can be used to automate the construction of the event tree associated to a given initiator. These methods automatically construct the ET by simulating the plant behaviour following the initiator, allowing some of the systems to fail during the sequence evolution. Then, the sequences with and without the failure are followed. The outcome of all this is a Dynamic Event Tree. The work described here is the application of one such method to the particular case of the SGTR initiating event. The DYLAM scheduler, designed at the Ispra (Italy) JRC of the European Communities, is used to automatically drive the simulation of all the sequences constituting the Event Tree. Similarly to the static Event Tree, each time a system is demanded, two branches are opened: one corresponding to the success and the other to the failure of the system. Both branches are followed by the plant simulator until a new system is demanded, and the process repeats. The plant simulation modelling allows the treatment of degraded sequences that enter into the severe accident domain as well as of success sequences in which long-term cooling is started. (Author)
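
    The branching logic described above (open a success branch and a failure branch at every system demand and let the simulator follow both) can be sketched compactly; the stub below stands in for a real plant simulator, and the system names and failure probabilities are purely illustrative.

        def simulate_until_demand(state):
            """Stand-in for the plant simulator: advance the state until the next
            system demand (or sequence end) and return the demanded system."""
            pending = state.setdefault("pending_demands", ["HPSI", "AFW", "RHR"])
            return pending.pop(0) if pending else None

        def build_dynamic_event_tree(state, prob=1.0, history=()):
            """Recursively branch on success/failure at every demand, following
            each branch with the simulator; returns [(sequence, probability)]."""
            demand = simulate_until_demand(state)
            if demand is None:
                return [(history, prob)]
            p_fail = {"HPSI": 1e-3, "AFW": 5e-4, "RHR": 2e-3}[demand]  # illustrative
            sequences = []
            for outcome, p in (("ok", 1.0 - p_fail), ("fail", p_fail)):
                branch_state = {"pending_demands": list(state["pending_demands"])}
                sequences += build_dynamic_event_tree(
                    branch_state, prob * p, history + ((demand, outcome),))
            return sequences

        for seq, p in build_dynamic_event_tree({}):
            print(f"{p:.3e}  {seq}")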

  14. EIT forward problem parallel simulation environment with anisotropic tissue and realistic electrode models.

    Science.gov (United States)

    De Marco, Tommaso; Ries, Florian; Guermandi, Marco; Guerrieri, Roberto

    2012-05-01

    Electrical impedance tomography (EIT) is an imaging technology based on impedance measurements. To retrieve meaningful insights from these measurements, EIT relies on detailed knowledge of the underlying electrical properties of the body. This is obtained from numerical models of current flows therein. The nonhomogeneous and anisotropic electric properties of human tissues make accurate modeling and simulation very challenging, leading to a tradeoff between physical accuracy and technical feasibility, which at present severely limits the capabilities of EIT. This work presents a complete algorithmic flow for an accurate EIT modeling environment featuring high anatomical fidelity with a spatial resolution equal to that provided by an MRI and a novel realistic complete electrode model implementation. At the same time, we demonstrate that current graphics processing unit (GPU)-based platforms provide enough computational power that a domain discretized with five million voxels can be numerically modeled in about 30 s.

  15. Building Realistic Mobility Models for Mobile Ad Hoc Networks

    Directory of Open Access Journals (Sweden)

    Adrian Pullin

    2018-04-01

    A mobile ad hoc network (MANET) is a self-configuring wireless network in which each node could act as a router, as well as a data source or sink. Its application areas include battlefields and vehicular and disaster areas. Many techniques applied to infrastructure-based networks are less effective in MANETs, with routing being a particular challenge. This paper presents a rigorous study into simulation techniques for evaluating routing solutions for MANETs with the aim of producing more realistic simulation models and thereby, more accurate protocol evaluations. MANET simulations require models that reflect the world in which the MANET is to operate. Much of the published research uses movement models, such as the random waypoint (RWP) model, with arbitrary world sizes and node counts. This paper presents a technique for developing more realistic simulation models to test and evaluate MANET protocols. The technique is animation, which is applied to a realistic scenario to produce a model that accurately reflects the size and shape of the world, node count, movement patterns, and time period over which the MANET may operate. The animation technique has been used to develop a battlefield model based on established military tactics. Trace data has been used to build a model of maritime movements in the Irish Sea. Similar world models have been built using the random waypoint movement model for comparison. All models have been built using the ns-2 simulator. These models have been used to compare the performance of three routing protocols: dynamic source routing (DSR), destination-sequenced distance-vector routing (DSDV), and ad hoc on-demand distance vector routing (AODV). The findings reveal that protocol performance is dependent on the model used. In particular, it is shown that RWP models do not reflect the performance of these protocols under realistic circumstances, and protocol selection is subject to the scenario to which it is applied.
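
    The random waypoint model criticized above is simple to state: each node repeatedly picks a uniform random destination in the world, moves toward it at a uniform random speed, and pauses on arrival. A compact sketch follows, with world size, speed range and pause time chosen arbitrarily (a real evaluation would use ns-2 scenario files as in the paper).

        import random

        def random_waypoint(world=(1000.0, 1000.0), speed=(1.0, 20.0),
                            pause=2.0, duration=300.0, dt=1.0, seed=0):
            """Yield (t, x, y) positions of one node under the random waypoint model."""
            rng = random.Random(seed)
            x, y, t = rng.uniform(0, world[0]), rng.uniform(0, world[1]), 0.0
            while t < duration:
                dest = (rng.uniform(0, world[0]), rng.uniform(0, world[1]))
                v    = rng.uniform(*speed)
                dist = ((dest[0] - x) ** 2 + (dest[1] - y) ** 2) ** 0.5
                steps = max(1, int(dist / (v * dt)))
                for i in range(1, steps + 1):
                    t += dt
                    yield t, x + (dest[0] - x) * i / steps, y + (dest[1] - y) * i / steps
                x, y = dest
                t += pause                           # pause at the waypoint

        trace = list(random_waypoint())
        print(f"{len(trace)} position samples; final position {trace[-1][1:]}")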

  16. The Electrostatic Instability for Realistic Pair Distributions in Blazar/EBL Cascades

    Science.gov (United States)

    Vafin, S.; Rafighi, I.; Pohl, M.; Niemiec, J.

    2018-04-01

    This work revisits the electrostatic instability for blazar-induced pair beams propagating through the intergalactic medium (IGM) using linear analysis and PIC simulations. We study the impact of the realistic distribution function of pairs resulting from the interaction of high-energy gamma-rays with the extragalactic background light. We present analytical and numerical calculations of the linear growth rate of the instability for the arbitrary orientation of wave vectors. Our results explicitly demonstrate that the finite angular spread of the beam dramatically affects the growth rate of the waves, leading to the fastest growth for wave vectors quasi-parallel to the beam direction and a growth rate at oblique directions that is only a factor of 2–4 smaller compared to the maximum. To study the nonlinear beam relaxation, we performed PIC simulations that take into account a realistic wide-energy distribution of beam particles. The parameters of the simulated beam-plasma system provide an adequate physical picture that can be extrapolated to realistic blazar-induced pairs. In our simulations, the beam loses only 1% of its energy, and we analytically estimate that the beam would lose its total energy over about 100 simulation times. An analytical scaling is then used to extrapolate the parameters of realistic blazar-induced pair beams. We find that they can dissipate their energy slightly faster by the electrostatic instability than through inverse-Compton scattering. The uncertainties arising from, e.g., details of the primary gamma-ray spectrum are too large to make firm statements for individual blazars, and an analysis based on their specific properties is required.

  17. MHD simulation of the Bastille day event

    Energy Technology Data Exchange (ETDEWEB)

    Linker, Jon, E-mail: linkerj@predsci.com; Torok, Tibor; Downs, Cooper; Lionello, Roberto; Titov, Viacheslav; Caplan, Ronald M.; Mikić, Zoran; Riley, Pete [Predictive Science Inc., 9990 Mesa Rim Road, Suite 170, San Diego CA, USA 92121 (United States)

    2016-03-25

    We describe a time-dependent, thermodynamic, three-dimensional MHD simulation of the July 14, 2000 coronal mass ejection (CME) and flare. The simulation starts with a background corona developed using an MDI-derived magnetic map for the boundary condition. Flux ropes using the modified Titov-Demoulin (TDm) model are used to energize the pre-event active region, which is then destabilized by photospheric flows that cancel flux near the polarity inversion line. More than 10³³ ergs are impulsively released in the simulated eruption, driving a CME at 1500 km/s, close to the observed speed of 1700 km/s. The post-flare emission in the simulation is morphologically similar to the observed post-flare loops. The resulting flux rope that propagates to 1 AU is similar in character to the flux rope observed at 1 AU, but the simulated ICME center passes 15° north of Earth.

  18. Rare event simulation in radiation transport

    International Nuclear Information System (INIS)

    Kollman, C.

    1993-10-01

    This dissertation studies methods for estimating extremely small probabilities by Monte Carlo simulation. Problems in radiation transport typically involve estimating very rare events or the expected value of a random variable which is with overwhelming probability equal to zero. These problems often have high dimensional state spaces and irregular geometries so that analytic solutions are not possible. Monte Carlo simulation must be used to estimate the radiation dosage being transported to a particular location. If the area is well shielded, the probability of any one particular particle getting through is very small. Because of the large number of particles involved, even a tiny fraction penetrating the shield may represent an unacceptable level of radiation. It therefore becomes critical to be able to accurately estimate this extremely small probability. Importance sampling is a well known technique for improving the efficiency of rare event calculations. Here, a new set of probabilities is used in the simulation runs. The results are multiplied by the likelihood ratio between the true and simulated probabilities so as to keep the estimator unbiased. The variance of the resulting estimator is very sensitive to which new set of transition probabilities is chosen. It is shown that a zero variance estimator does exist, but that its computation requires exact knowledge of the solution. A simple random walk with an associated killing model for the scatter of neutrons is introduced. Large deviation results for optimal importance sampling in random walks are extended to the case where killing is present. An adaptive 'learning' algorithm for implementing importance sampling is given for more general Markov chain models of neutron scatter. For finite state spaces this algorithm is shown to give, with probability one, a sequence of estimates converging exponentially fast to the true solution
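
    The likelihood-ratio correction described above is easiest to see outside the transport setting; the sketch below estimates a tiny Gaussian tail probability by sampling from a shifted distribution and re-weighting each sample, mirroring the unbiasedness argument in the text (the threshold and shift are arbitrary illustrations).

        import numpy as np

        rng = np.random.default_rng(seed=11)

        def rare_tail_probability(threshold=8.0, n=100_000, shift=None):
            """Estimate P(X > threshold) for X ~ N(0,1) by importance sampling:
            draw from N(shift,1) and weight each sample by the likelihood ratio
            f(x)/g(x) = exp(-shift*x + shift^2/2) so the estimator stays unbiased."""
            shift = threshold if shift is None else shift
            x = rng.normal(loc=shift, scale=1.0, size=n)
            weights = np.exp(-shift * x + 0.5 * shift ** 2)
            return np.mean((x > threshold) * weights)

        est = rare_tail_probability()
        print(f"importance-sampling estimate of P(X > 8): {est:.3e}")
        # For comparison, the true value is about 6.2e-16 -- far beyond the
        # reach of naive Monte Carlo with any practical sample size.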

  19. Out-of-order parallel discrete event simulation for electronic system-level design

    CERN Document Server

    Chen, Weiwei

    2014-01-01

    This book offers readers a set of new approaches, tools, and techniques for facing the challenges of parallelization in the design of embedded systems. It provides an advanced parallel simulation infrastructure for efficient and effective system-level model validation and development so as to build better products in less time. Since parallel discrete event simulation (PDES) has the potential to exploit the underlying parallel computational capability in today's multi-core simulation hosts, the author begins by reviewing the parallelization of discrete event simulation, identifying

  20. Evaluation of cool season precipitation event characteristics over the Northeast US in a suite of downscaled climate model hindcasts

    Science.gov (United States)

    Loikith, Paul C.; Waliser, Duane E.; Kim, Jinwon; Ferraro, Robert

    2017-08-01

    Cool season precipitation event characteristics are evaluated across a suite of downscaled climate models over the northeastern US. Downscaled hindcast simulations are produced by dynamically downscaling the Modern-Era Retrospective Analysis for Research and Applications version 2 (MERRA2) using the National Aeronautics and Space Administration (NASA)-Unified Weather Research and Forecasting (WRF) regional climate model (RCM) and the Goddard Earth Observing System Model, Version 5 (GEOS-5) global climate model. NU-WRF RCM simulations are produced at 24, 12, and 4-km horizontal resolutions using a range of spectral nudging schemes while the MERRA2 global downscaled run is provided at 12.5-km. All model runs are evaluated using four metrics designed to capture key features of precipitation events: event frequency, event intensity, event total, and event duration. Overall, the downscaling approaches result in a reasonable representation of many of the key features of precipitation events over the region; however, considerable biases exist in the magnitude of each metric. Based on this evaluation there is no clear indication that higher resolution simulations result in more realistic results in general; however, many small-scale features such as orographic enhancement of precipitation are only captured at higher resolutions, suggesting some added value over coarser resolution. While the differences between simulations produced using nudging and no nudging are small, there is some improvement in model fidelity when nudging is introduced, especially at a cutoff wavelength of 600 km compared to 2000 km. Based on the results of this evaluation, dynamical regional downscaling using NU-WRF results in a more realistic representation of precipitation event climatology than the global downscaling of MERRA2 using GEOS-5.
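
    The four metrics listed above are straightforward to compute from a precipitation series once a wet threshold and an event definition are fixed; the sketch below uses a synthetic daily series and an arbitrary 1 mm/day threshold, which may differ from the definitions used in the paper.

        import numpy as np

        def event_metrics(precip, threshold=1.0):
            """Compute event frequency, mean intensity, mean event total and mean
            event duration from a 1-D precipitation series (e.g. mm/day).
            An 'event' is a maximal run of consecutive values above the threshold."""
            wet = precip > threshold
            events, current = [], []
            for p, w in zip(precip, wet):
                if w:
                    current.append(p)
                elif current:
                    events.append(current)
                    current = []
            if current:
                events.append(current)
            totals    = [sum(e) for e in events]
            durations = [len(e) for e in events]
            return {
                "event_frequency": len(events) / len(precip),   # events per step
                "event_intensity": float(np.mean([t / d for t, d in zip(totals, durations)])) if events else 0.0,
                "event_total":     float(np.mean(totals)) if events else 0.0,
                "event_duration":  float(np.mean(durations)) if events else 0.0,
            }

        rng = np.random.default_rng(seed=5)
        daily = rng.gamma(shape=0.4, scale=6.0, size=365)   # synthetic mm/day series
        print(event_metrics(daily))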

  1. Gyrokinetic simulation study of magnetic island effects on neoclassical physics and micro-instabilities in a realistic KSTAR plasma

    Science.gov (United States)

    Kwon, Jae-Min; Ku, S.; Choi, M. J.; Chang, C. S.; Hager, R.; Yoon, E. S.; Lee, H. H.; Kim, H. S.

    2018-05-01

    We perform gyrokinetic simulations to study the effects of a stationary magnetic island on neoclassical flow and micro-instability in a realistic KSTAR plasma condition. Through the simulations, we aim to analyze a recent KSTAR experiment, which was to measure the details of poloidal flow and fluctuation around a stationary (2, 1) magnetic island [M. J. Choi et al., Nucl. Fusion 57, 126058 (2017)]. From the simulations, it is found that the magnetic island can significantly enhance the equilibrium E × B flow. The corresponding flow shearing is strong enough to suppress a substantial portion of ambient micro-instabilities, particularly ∇Te -driven trapped electron modes. This implies that the enhanced E × B flow can sustain a quasi-internal transport barrier for Te in an inner region neighboring the magnetic island. The enhanced E × B flow has a (2, 1) mode structure with a finite phase shift from the mode structure of the magnetic island. It is shown that the flow shear and the fluctuation suppression patterns implied from the simulations are consistent with the observations on the KSTAR experiment.

  2. Any realistic theory must be computationally realistic: a response to N. Gisin's definition of a Realistic Physics Theory

    OpenAIRE

    Bolotin, Arkady

    2014-01-01

    It is argued that the recent definition of a realistic physics theory by N. Gisin cannot be considered comprehensive unless it is supplemented with the requirement that any realistic theory must be computationally realistic as well.

  3. Simulating the value of electric-vehicle-grid integration using a behaviourally realistic model

    Science.gov (United States)

    Wolinetz, Michael; Axsen, Jonn; Peters, Jotham; Crawford, Curran

    2018-02-01

    Vehicle-grid integration (VGI) uses the interaction between electric vehicles and the electrical grid to provide benefits that may include reducing the cost of using intermittent renewable electricity or providing a financial incentive for electric vehicle ownership. However, studies that estimate the value of VGI benefits have largely ignored how consumer behaviour will affect the magnitude of the impact. Here, we simulate the long-term impact of VGI using behaviourally realistic and empirically derived models of vehicle adoption and charging combined with an electricity system model. We focus on the case where a central entity manages the charging rate and timing for participating electric vehicles. VGI is found not to increase the adoption of electric vehicles, but does have a small beneficial impact on electricity prices. By 2050, VGI reduces wholesale electricity prices by 0.6-0.7% (0.7 MWh-1, 2010 CAD) relative to an equivalent scenario without VGI. Excluding consumer behaviour from the analysis inflates the value of VGI.

  4. Discrete event simulation of Maglev transport considering traffic waves

    Directory of Open Access Journals (Sweden)

    Moo Hyun Cha

    2014-10-01

    A magnetically levitated vehicle (Maglev) system is under commercialization as a new transportation system in Korea. The Maglev is operated by an unmanned automatic control system. Therefore, the plan of train operation should be carefully established and validated in advance. In general, when making a train operation plan, statistically predicted traffic data is used. However, a traffic wave often occurs in real train service, and demand-driven simulation technology is required to review a train operation plan and service quality considering traffic waves. We propose a method and model to simulate Maglev operation considering continuous demand changes. For this purpose, we employed a discrete event model that is suitable for modeling the behavior of railway passenger transportation. We modeled the system hierarchically using discrete event system specification (DEVS) formalism. In addition, through implementation and an experiment using the DEVSim++ simulation environment, we tested the feasibility of the proposed model. Our experimental results also verified that our demand-driven simulation technology can be used for a priori review of train operation plans and strategies.
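
    A demand-driven discrete-event treatment of a shuttle service needs little more than a time-ordered queue of events; the sketch below is a minimal illustration of that idea and is unrelated to the actual DEVSim++/DEVS model, with headway, capacity and demand rate invented for the example.

        import heapq, random

        def simulate_shuttle(headway=360.0, trip_time=600.0, capacity=230,
                             mean_arrivals_per_s=0.3, horizon=3 * 3600.0, seed=2):
            """Minimal discrete-event loop: passenger arrivals and train departures
            are events in a time-ordered heap; returns the boarded load at each
            departure, a crude stand-in for service quality under demand waves."""
            rng, events, queue, loads = random.Random(seed), [], 0, []
            t = rng.expovariate(mean_arrivals_per_s)
            heapq.heappush(events, (t, "arrival"))
            heapq.heappush(events, (headway, "departure"))
            while events:
                t, kind = heapq.heappop(events)
                if t > horizon:
                    break
                if kind == "arrival":
                    queue += 1
                    heapq.heappush(events, (t + rng.expovariate(mean_arrivals_per_s), "arrival"))
                else:                               # train departs the platform
                    boarded = min(queue, capacity)
                    queue -= boarded
                    loads.append(boarded)
                    heapq.heappush(events, (t + headway, "departure"))
            return loads, queue

        loads, left_behind = simulate_shuttle()
        print(f"{len(loads)} departures, mean load {sum(loads) / len(loads):.0f}, "
              f"{left_behind} passengers still waiting")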

  5. Application of discrete event simulation to MRS design

    International Nuclear Information System (INIS)

    Bali, M.; Standley, W.

    1993-01-01

    The application of discrete event simulation to the Monitored Retrievable Storage (MRS) material handling operations supported the MRS conceptual design effort and established a set of tools for use during MRS detail design and license application. The effort to develop a design analysis tool to support the MRS project started in 1991. The MRS simulation has so far identified potential savings and suggested methods of improving operations to enhance throughput. Immediately, simulation aided the MRS conceptual design effort through the investigation of alternative cask handling operations and the sizing and sharing of expensive equipment. The simulation also helped analyze the operability of the current design of MRS under various waste acceptance scenarios. Throughout the simulation effort, the model development and experimentation resulted in early identification and resolution of several design and operational issues

  6. A Numerical Approach for Hybrid Simulation of Power System Dynamics Considering Extreme Icing Events

    DEFF Research Database (Denmark)

    Chen, Lizheng; Zhang, Hengxu; Wu, Qiuwei

    2017-01-01

    numerical simulation scheme integrating icing weather events with power system dynamics is proposed to extend power system numerical simulation. A technique is developed to efficiently simulate the interaction of slow dynamics of weather events and fast dynamics of power systems. An extended package for PSS...

  7. Application of real space Kerker method in simulating gate-all-around nanowire transistors with realistic discrete dopants*

    International Nuclear Information System (INIS)

    Li Chang-Sheng; Ma Lei; Guo Jie-Rong

    2017-01-01

    We adopt a self-consistent real space Kerker method to prevent the divergence from charge sloshing in simulating transistors with realistic discrete dopants in the source and drain regions. The method achieves efficient convergence by avoiding unrealistic long range charge sloshing but keeping effects from short range charge sloshing. Numerical results show that discrete dopants in the source and drain regions could have a bigger influence on the electrical variability than the usual continuous doping without considering charge sloshing. Few discrete dopants and the narrow geometry create a situation with short range Coulomb screening and oscillations of charge density in real space. The dopant-induced quasi-localized defect modes in the source region experience short range oscillations in order to reach the drain end of the device. The charging of the defect modes and the oscillations of the charge density are identified by the simulation of the electron density. (paper)

  8. Electromagnetic field effect simulation over a realistic pixelated phantom human's brain

    International Nuclear Information System (INIS)

    Rojas, R.; Calderon, J. A.; Rivera, T.; Azorin, J.

    2012-10-01

    Exposure to different types of electromagnetic radiation can damage or injure human tissues. Scientists spend time and resources studying the effects of electromagnetic fields on organs. In medical settings in particular, specialists in imaging methodologies and radiological treatment are very concerned with not injuring their patients. Determining matter-radiation interaction, whether experimentally or theoretically, is not an easy task. In the first case, it is not possible to take measurements inside the patient, so the experimental procedure consists of measurements in a human dummy; however, it is not possible to see the deformation of electromagnetic fields due to the presence of organs. In the second case, it is necessary to solve Maxwell's equations for an electromagnetic field crossing many organs and tissues, each with different electric and magnetic properties. One alternative for a theoretical solution is computational simulation; however, this option requires an enormous amount of memory and long computation times. Consequently, most simulations are carried out in two or three dimensions using approximate human models built with basic geometric figures such as spheres, cylinders, and ellipsoids. Obviously, these models yield only a coarse approximation of the actual situation. In this work, we propose a novel methodology to build a realistic pixelated phantom of human organs and solve Maxwell's equations over these models; evidently, the solutions are closer to the real behaviour. Additionally, these models are optimized when they are discretized and the finite element method is used to calculate the electromagnetic field and the induced currents. (Author)

  9. 3D Simulation of External Flooding Events for the RISMC Pathway

    International Nuclear Information System (INIS)

    Prescott, Steven; Mandelli, Diego; Sampath, Ramprasad; Smith, Curtis; Lin, Linyu

    2015-01-01

    Incorporating 3D simulations as part of the Risk-Informed Safety Margins Characterization (RISMIC) Toolkit allows analysts to obtain a more complete picture of complex system behavior for events including external plant hazards. External events such as flooding have become more important recently – however these can be analyzed with existing and validated simulated physics toolkits. In this report, we describe these approaches specific to flooding-based analysis using an approach called Smoothed Particle Hydrodynamics. The theory, validation, and example applications of the 3D flooding simulation are described. Integrating these 3D simulation methods into computational risk analysis provides a spatial/visual aspect to the design, improves the realism of results, and can prove visual understanding to validate the analysis of flooding.

  10. 3D Simulation of External Flooding Events for the RISMC Pathway

    Energy Technology Data Exchange (ETDEWEB)

    Prescott, Steven [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Sampath, Ramprasad [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States); Lin, Linyu [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    Incorporating 3D simulations as part of the Risk-Informed Safety Margins Characterization (RISMIC) Toolkit allows analysts to obtain a more complete picture of complex system behavior for events including external plant hazards. External events such as flooding have become more important recently – however these can be analyzed with existing and validated simulated physics toolkits. In this report, we describe these approaches specific to flooding-based analysis using an approach called Smoothed Particle Hydrodynamics. The theory, validation, and example applications of the 3D flooding simulation are described. Integrating these 3D simulation methods into computational risk analysis provides a spatial/visual aspect to the design, improves the realism of results, and can prove visual understanding to validate the analysis of flooding.

  11. Corpuscular event-by-event simulation of quantum optics experiments : application to a quantum-controlled delayed-choice experiment

    NARCIS (Netherlands)

    De Raedt, Hans; Delina, M; Jin, Fengping; Michielsen, Kristel

    2012-01-01

    A corpuscular simulation model of optical phenomena that does not require knowledge of the solution of a wave equation of the whole system and reproduces the results of Maxwell's theory by generating detection events one by one is discussed. The event-based corpuscular model gives a unified

  12. OBEST: The Object-Based Event Scenario Tree Methodology

    International Nuclear Information System (INIS)

    WYSS, GREGORY D.; DURAN, FELICIA A.

    2001-01-01

    Event tree analysis and Monte Carlo-based discrete event simulation have been used in risk assessment studies for many years. This report details how features of these two methods can be combined with concepts from object-oriented analysis to develop a new risk assessment methodology with some of the best features of each. The resultant Object-Based Event Scenarios Tree (OBEST) methodology enables an analyst to rapidly construct realistic models for scenarios for which an a priori discovery of event ordering is either cumbersome or impossible (especially those that exhibit inconsistent or variable event ordering, which are difficult to represent in an event tree analysis). Each scenario produced by OBEST is automatically associated with a likelihood estimate because probabilistic branching is integral to the object model definition. The OBEST method uses a recursive algorithm to solve the object model and identify all possible scenarios and their associated probabilities. Since scenario likelihoods are developed directly by the solution algorithm, they need not be computed by statistical inference based on Monte Carlo observations (as required by some discrete event simulation methods). Thus, OBEST is not only much more computationally efficient than these simulation methods, but it also discovers scenarios that have extremely low probabilities as a natural analytical result--scenarios that would likely be missed by a Monte Carlo-based method. This report documents the OBEST methodology, the demonstration software that implements it, and provides example OBEST models for several different application domains, including interactions among failing interdependent infrastructure systems, circuit analysis for fire risk evaluation in nuclear power plants, and aviation safety studies

  13. Discrete event simulation: Modeling simultaneous complications and outcomes

    NARCIS (Netherlands)

    Quik, E.H.; Feenstra, T.L.; Krabbe, P.F.M.

    2012-01-01

    OBJECTIVES: To present an effective and elegant model approach to deal with specific characteristics of complex modeling. METHODS: A discrete event simulation (DES) model with multiple complications and multiple outcomes that each can occur simultaneously was developed. In this DES model parameters,

  14. Modeling of ultrasonic wave propagation in composite laminates with realistic discontinuity representation.

    Science.gov (United States)

    Zelenyak, Andreea-Manuela; Schorer, Nora; Sause, Markus G R

    2018-02-01

    This paper presents a method for embedding realistic defect geometries of a fiber reinforced material in a finite element modeling environment in order to simulate active ultrasonic inspection. When ultrasonic inspection is used experimentally to investigate the presence of defects in composite materials, the microscopic defect geometry may cause signal characteristics that are difficult to interpret. Hence, modeling of this interaction is key to improving our understanding and interpretation of the acquired ultrasonic signals. To model the true interaction of the ultrasonic wave field with such defect structures as pores, cracks or delamination, a realistic three dimensional geometry reconstruction is required. We present a 3D-image based reconstruction process which converts computed tomography data into adequate surface representations ready to be embedded for processing with finite element methods. Subsequent modeling using these geometries uses a multi-scale and multi-physics simulation approach which results in quantitative A-scan ultrasonic signals which can be directly compared with experimental signals. Therefore, besides the properties of the composite material, a full transducer implementation, piezoelectric conversion and simultaneous modeling of the attached circuit are applied. Comparison between simulated and experimental signals provides very good agreement in electrical voltage amplitude and signal arrival time and thus validates the proposed modeling approach. Simulating ultrasound wave propagation in a medium with a realistic geometry clearly shows a difference in how the disturbance of the waves takes place and ultimately allows more realistic modeling of A-scans.

  15. NEVESIM: event-driven neural simulation framework with a Python interface.

    Science.gov (United States)

    Pecevski, Dejan; Kappel, David; Jonke, Zeno

    2014-01-01

    NEVESIM is a software package for event-driven simulation of networks of spiking neurons with a fast simulation core in C++, and a scripting user interface in the Python programming language. It supports simulation of heterogeneous networks with different types of neurons and synapses, and can be easily extended by the user with new neuron and synapse types. To enable heterogeneous networks and extensibility, NEVESIM is designed to decouple the simulation logic of communicating events (spikes) between the neurons at a network level from the implementation of the internal dynamics of individual neurons. In this paper we will present the simulation framework of NEVESIM, its concepts and features, as well as some aspects of the object-oriented design approaches and simulation strategies that were utilized to efficiently implement the concepts and functionalities of the framework. We will also give an overview of the Python user interface, its basic commands and constructs, and also discuss the benefits of integrating NEVESIM with Python. One of the valuable capabilities of the simulator is to simulate exactly and efficiently networks of stochastic spiking neurons from the recently developed theoretical framework of neural sampling. This functionality was implemented as an extension on top of the basic NEVESIM framework. Altogether, the intended purpose of the NEVESIM framework is to provide a basis for further extensions that support simulation of various neural network models incorporating different neuron and synapse types that can potentially also use different simulation strategies.
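
    The core of an event-driven spiking-network simulator is a queue of timestamped spike events that triggers state updates only when something arrives; the toy sketch below illustrates that loop for leaky integrate-and-fire-like neurons on a ring, and its neuron model, weights and delays are invented here rather than taken from NEVESIM.

        import heapq
        import math

        class LIFNeuron:
            """Leaky integrate-and-fire neuron updated only at incoming events."""
            def __init__(self, tau=20.0, threshold=1.0):
                self.tau, self.threshold = tau, threshold
                self.v, self.last_t = 0.0, 0.0

            def receive(self, t, weight):
                self.v *= math.exp(-(t - self.last_t) / self.tau)  # decay since last event
                self.v += weight
                self.last_t = t
                if self.v >= self.threshold:
                    self.v = 0.0
                    return True                                    # neuron fires
                return False

        def run(n_neurons=5, t_end=100.0, delay=1.0, weight=1.2):
            """Each spike is suprathreshold here, so activity circulates around
            the ring and the event queue drives the whole simulation."""
            neurons = [LIFNeuron() for _ in range(n_neurons)]
            events = [(0.0, 0)]                 # (spike arrival time, target id)
            heapq.heapify(events)
            fired = []
            while events:
                t, target = heapq.heappop(events)
                if t > t_end:
                    break
                if neurons[target].receive(t, weight):
                    fired.append((t, target))
                    heapq.heappush(events, (t + delay, (target + 1) % n_neurons))
            return fired

        print(run()[:10])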

  16. The accuracy of dysphoric and nondepressed groups' predictions of life events.

    Science.gov (United States)

    Kapçi, E G; Cramer, D

    1998-11-01

    The phenomenon of depressive realism was examined in relation to the future prediction of positive and negative life events. A group of dysphoric (n = 20) and nondepressed (n = 38) British undergraduates participated in a prospective study lasting 3 months. Partly consistent with the depressive realism hypotheses, dysphoric participants were more realistic concerning the negative life events they would experience, but they were less realistic concerning the negative life events they would not experience. Although no difference was found for predicting the occurrence of positive life events, dysphoric participants were found to be more realistic concerning positive life events that they would not experience.

  17. Towards a realistic plasma simulation code

    International Nuclear Information System (INIS)

    Anderson, D.V.

    1991-06-01

    Several new developments in the technology of simulating plasmas, both in particle and fluid models, now allow a stage of synthesis in which many of these advances can be combined into one simulation model. Accuracy and efficiency are the criteria to be satisfied in this quest. We want to build on the following research: 1. The development of the δf method of Barnes. 2. The moving node Galerkin model of Glasser, Miller and Carlson. 3. Particle moving schemes on unstructured grids by Ambrosiano and Bradon. 4. Particle simulations using sorted particles by Anderson and Shumaker. Rather than being competing developments, these presumably can be combined into one computational model. We begin by summarizing the physics model for the plasma. The Vlasov equation can be solved as an initial value problem by integrating the plasma distribution function forward in time. 5 refs

  18. Discrete-event simulation of nuclear-waste transport in geologic sites subject to disruptive events. Final report

    International Nuclear Information System (INIS)

    Aggarwal, S.; Ryland, S.; Peck, R.

    1980-01-01

    This report outlines a methodology to study the effects of disruptive events on nuclear waste material in stable geologic sites. The methodology is based upon developing a discrete events model that can be simulated on the computer. This methodology allows a natural development of simulation models that use computer resources in an efficient manner. Accurate modeling in this area depends in large part upon accurate modeling of ion transport behavior in the storage media. Unfortunately, developments in this area are not at a stage where there is any consensus on proper models for such transport. Consequently, our work is directed primarily towards showing how disruptive events can be properly incorporated in such a model, rather than as a predictive tool at this stage. When and if proper geologic parameters can be determined, then it would be possible to use this as a predictive model. Assumptions and their bases are discussed, and the mathematical and computer model are described

  19. A simulation model of MAPS for the FairRoot framework

    Energy Technology Data Exchange (ETDEWEB)

    Amar-Youcef, Samir; Linnik, Benjamin; Sitzmann, Philipp [Goethe-Universitaet Frankfurt (Germany); Collaboration: CBM-MVD-Collaboration

    2014-07-01

    CMOS MAPS are the sensors of choice for the MVD of the CBM experiment at the FAIR facility. They offer a unique combination of features required for the CBM detector, such as low material budget, spatial resolution, radiation tolerance and yet sufficient read-out speed. The physics performance of various designs of the MVD integrated into the CBM detector system is evaluated in the CBM-/FairRoot simulation framework. In this context, algorithms are developed to simulate the realistic detector response and to optimize feature extraction from the sensor information. The objective of the sensor response model is to provide a fast and realistic pixel response for a given track energy loss and position. In addition, we discuss aspects of simulating event pile-up and dataflow in the context of the CBM FLES event extraction and selection concept. This is of particular importance for the MVD since the sensors feature a comparably long integration time and a frame-wise read-out. All other detector systems operate with un-triggered front-end electronics and are freely streaming time-stamped data to the FLES. Because of the large data rates, event extraction is performed via distributed networking on a large HPC compute farm. We present an overview and status of the MVD software developments, focusing on the integration of the system in a free-flowing read-out system and on the concurrent application for simulated and real data.

  20. Realistic page-turning of electronic books

    Science.gov (United States)

    Fan, Chaoran; Li, Haisheng; Bai, Yannan

    2014-01-01

    The booming electronic books (e-books), as an extension to the paper book, are popular with readers. Recently, many efforts have been put into realistic page-turning simulation of e-books to improve the reading experience. This paper presents a new 3D page-turning simulation approach, which employs piecewise time-dependent cylindrical surfaces to describe the turning page and constructs a smooth transition method between time-dependent cylinders. The page-turning animation is produced by sequentially mapping the turning page onto the cylinders with different radii and positions. Compared to previous approaches, our method is able to imitate various effects efficiently and obtains a more natural animation of the turning page.
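
    The core geometric idea, mapping points of the flat page onto a cylinder whose radius and position change over time, can be sketched as follows. This is a generic illustration of cylinder-based page deformation, not the authors' piecewise time-dependent formulation; the spine is assumed to lie along the y axis at x = 0, and the sample radii are arbitrary.

        import numpy as np

        def wrap_page_on_cylinder(x, y, radius):
            """Map flat page coordinates (x, y), spine at x = 0, onto a cylinder of the
            given radius whose axis is parallel to the y axis and tangent to the page.
            Arc length is preserved: a point at distance x from the spine ends up at
            angle x / radius around the cylinder."""
            theta = x / radius
            px = radius * np.sin(theta)           # horizontal position after bending
            py = y                                # unchanged along the spine direction
            pz = radius * (1.0 - np.cos(theta))   # lift of the page off the table
            return px, py, pz

        # A page of width 1.0; decreasing the radius over "time" curls it progressively.
        x = np.linspace(0.0, 1.0, 5)
        for r in (10.0, 1.0, 0.4):
            px, _, pz = wrap_page_on_cylinder(x, np.zeros_like(x), r)
            print(f"radius {r:4.1f}: free edge at x = {px[-1]:+.2f}, z = {pz[-1]:+.2f}")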

  1. Spontaneous abrupt climate change due to an atmospheric blocking–sea-ice–ocean feedback in an unforced climate model simulation

    NARCIS (Netherlands)

    Drijfhout, S.S.; Gleeson, E.; Dijkstra, H.A.|info:eu-repo/dai/nl/073504467; Livina, V.

    2013-01-01

    Abrupt climate change is abundant in geological records, but climate models rarely have been able to simulate such events in response to realistic forcing. Here we report on a spontaneous abrupt cooling event, lasting for more than a century, with a temperature anomaly similar to that of the Little

  2. Event-by-event simulation of quantum phenomena: Application to Einstein-Podolsky-Rosen-Bohm experiments

    NARCIS (Netherlands)

    De Raedt, H.; De Raedt, K.; Michielsen, K.; Keimpema, K.; Miyashita, S.

    We review the data gathering and analysis procedure used in real Einstein-Podolsky-Rosen-Bohm experiments with photons and we illustrate the procedure by analyzing experimental data. Based on this analysis, we construct event-based computer simulation models in which every essential element in the

  3. An Efficient Simulation Method for Rare Events

    KAUST Repository

    Rached, Nadhir B.

    2015-01-07

    Estimating the probability that a sum of random variables (RVs) exceeds a given threshold is a well-known challenging problem. Closed-form expressions for the sum distribution do not generally exist, which has led to an increasing interest in simulation approaches. A crude Monte Carlo (MC) simulation is the standard technique for the estimation of this type of probability. However, this approach is computationally expensive, especially when dealing with rare events. Variance reduction techniques are alternative approaches that can improve the computational efficiency of naive MC simulations. We propose an Importance Sampling (IS) simulation technique based on the well-known hazard rate twisting approach, that presents the advantage of being asymptotically optimal for any arbitrary RVs. The wide scope of applicability of the proposed method is mainly due to our particular way of selecting the twisting parameter. It is worth observing that this interesting feature is rarely satisfied by variance reduction algorithms whose performances were only proven under some restrictive assumptions. It comes along with a good efficiency, illustrated by some selected simulation results comparing the performance of our method with that of an algorithm based on a conditional MC technique.
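
    A minimal numerical illustration of why importance sampling helps for this kind of tail probability is sketched below, using plain exponential tilting of i.i.d. exponential variables rather than the hazard rate twisting proposed in the paper; the rate, threshold and sample sizes are arbitrary choices.

        import numpy as np

        rng = np.random.default_rng(0)
        n, lam, gamma, N = 10, 1.0, 40.0, 100_000   # sum of n Exp(lam) variables, threshold gamma

        # Crude Monte Carlo: average the indicator of the rare event {S > gamma}.
        S = rng.exponential(1 / lam, size=(N, n)).sum(axis=1)
        p_mc = (S > gamma).mean()

        # Exponential tilting: sample from Exp(lam - theta) and reweight by the likelihood
        # ratio M(theta)^n * exp(-theta * S), with theta chosen so the tilted mean of S is gamma.
        theta = lam - n / gamma                     # valid when gamma > n / lam
        S_is = rng.exponential(1 / (lam - theta), size=(N, n)).sum(axis=1)
        weights = (lam / (lam - theta)) ** n * np.exp(-theta * S_is)
        est = (S_is > gamma) * weights
        p_is, err_is = est.mean(), est.std(ddof=1) / np.sqrt(N)

        print(f"crude MC estimate:   {p_mc:.3e}")
        print(f"importance sampling: {p_is:.3e} +/- {err_is:.1e}")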

  4. An Efficient Simulation Method for Rare Events

    KAUST Repository

    Rached, Nadhir B.; Benkhelifa, Fatma; Kammoun, Abla; Alouini, Mohamed-Slim; Tempone, Raul

    2015-01-01

    Estimating the probability that a sum of random variables (RVs) exceeds a given threshold is a well-known challenging problem. Closed-form expressions for the sum distribution do not generally exist, which has led to an increasing interest in simulation approaches. A crude Monte Carlo (MC) simulation is the standard technique for the estimation of this type of probability. However, this approach is computationally expensive, especially when dealing with rare events. Variance reduction techniques are alternative approaches that can improve the computational efficiency of naive MC simulations. We propose an Importance Sampling (IS) simulation technique based on the well-known hazard rate twisting approach, that presents the advantage of being asymptotically optimal for any arbitrary RVs. The wide scope of applicability of the proposed method is mainly due to our particular way of selecting the twisting parameter. It is worth observing that this interesting feature is rarely satisfied by variance reduction algorithms whose performances were only proven under some restrictive assumptions. It comes along with a good efficiency, illustrated by some selected simulation results comparing the performance of our method with that of an algorithm based on a conditional MC technique.

  5. Discrete Event Simulation Model of the Polaris 2.1 Gamma Ray Imaging Radiation Detection Device

    Science.gov (United States)

    2016-06-01

    Master's thesis (June 2016), approved for public release; distribution is unlimited. The Simkit platform was utilized to create a discrete event simulation (DES) model of the Polaris 2.1 gamma ray imaging radiation detection device.

  6. SPEEDES - A multiple-synchronization environment for parallel discrete-event simulation

    Science.gov (United States)

    Steinman, Jeff S.

    1992-01-01

    Synchronous Parallel Environment for Emulation and Discrete-Event Simulation (SPEEDES) is a unified parallel simulation environment. It supports multiple-synchronization protocols without requiring users to recompile their code. When a SPEEDES simulation runs on one node, all the extra parallel overhead is removed automatically at run time. When the same executable runs in parallel, the user preselects the synchronization algorithm from a list of options. SPEEDES currently runs on UNIX networks and on the California Institute of Technology/Jet Propulsion Laboratory Mark III Hypercube. SPEEDES also supports interactive simulations. Featured in the SPEEDES environment is a new parallel synchronization approach called Breathing Time Buckets. This algorithm uses some of the conservative techniques found in Time Bucket synchronization, along with the optimism that characterizes the Time Warp approach. A mathematical model derived from first principles predicts the performance of Breathing Time Buckets. Along with the Breathing Time Buckets algorithm, this paper discusses the rules for processing events in SPEEDES, describes the implementation of various other synchronization protocols supported by SPEEDES, describes some new ones for the future, discusses interactive simulations, and then gives some performance results.

  7. Discrete Event Simulation for the Analysis of Artillery Fired Projectiles from Shore

    Science.gov (United States)

    2017-06-01

    Simkit is a library of classes and interfaces, written in Java, that supports ease of implementation of discrete event simulation models. Simkit allows simulation modelers to break complex systems into components through a framework of Listener Event Graph Objects (LEGOs). A disadvantage of using Java Enum Types is the inability to change the values of Enum Type parameters while conducting a designed experiment.

  8. Powering stochastic reliability models by discrete event simulation

    DEFF Research Database (Denmark)

    Kozine, Igor; Wang, Xiaoyun

    2012-01-01

    it difficult to find a solution to the problem. The power of modern computers and recent developments in discrete-event simulation (DES) software enable to diminish some of the drawbacks of stochastic models. In this paper we describe the insights we have gained based on using both Markov and DES models...

  9. Modeling a Million-Node Slim Fly Network Using Parallel Discrete-Event Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Wolfe, Noah; Carothers, Christopher; Mubarak, Misbah; Ross, Robert; Carns, Philip

    2016-05-15

    As supercomputers close in on exascale performance, the increased number of processors and processing power translates to an increased demand on the underlying network interconnect. The Slim Fly network topology, a new low-diameter and low-latency interconnection network, is gaining interest as one possible solution for next-generation supercomputing interconnect systems. In this paper, we present a high-fidelity Slim Fly flit-level model leveraging the Rensselaer Optimistic Simulation System (ROSS) and Co-Design of Exascale Storage (CODES) frameworks. We validate our Slim Fly model against the Kathareios et al. Slim Fly model results provided at moderately sized network scales. We further scale the model size up to an unprecedented 1 million compute nodes; and through visualization of network simulation metrics such as link bandwidth, packet latency, and port occupancy, we gain insight into the network behavior at the million-node scale. We also show linear strong scaling of the Slim Fly model on an Intel cluster, achieving a peak event rate of 36 million events per second using 128 MPI tasks to process 7 billion events. Detailed analysis of the underlying discrete-event simulation performance shows that a million-node Slim Fly model simulation can execute in 198 seconds on the Intel cluster.

  10. Convective aggregation in realistic convective-scale simulations

    OpenAIRE

    Holloway, Christopher E.

    2017-01-01

    To investigate the real-world relevance of idealized-model convective self-aggregation, five 15-day cases of real organized convection in the tropics are simulated. These include multiple simulations of each case to test sensitivities of the convective organization and mean states to interactive radiation, interactive surface fluxes, and evaporation of rain. These simulations are compared to self-aggregation seen in the same model configured to run in idealized radiative-convective equilibriu...

  11. Optimized Parallel Discrete Event Simulation (PDES) for High Performance Computing (HPC) Clusters

    National Research Council Canada - National Science Library

    Abu-Ghazaleh, Nael

    2005-01-01

    The aim of this project was to study the communication subsystem performance of state of the art optimistic simulator Synchronous Parallel Environment for Emulation and Discrete-Event Simulation (SPEEDES...

  12. Discrete event simulation as an ergonomic tool to predict workload exposures during systems design

    NARCIS (Netherlands)

    Perez, J.; Looze, M.P. de; Bosch, T.; Neumann, W.P.

    2014-01-01

    This methodological paper presents a novel approach to predict operator's mechanical exposure and fatigue accumulation in discrete event simulations. A biomechanical model of work-cycle loading is combined with a discrete event simulation model which provides work cycle patterns over the shift

  13. Rare Event Simulation in Radiation Transport

    Science.gov (United States)

    Kollman, Craig

    This dissertation studies methods for estimating extremely small probabilities by Monte Carlo simulation. Problems in radiation transport typically involve estimating very rare events or the expected value of a random variable which is with overwhelming probability equal to zero. These problems often have high dimensional state spaces and irregular geometries so that analytic solutions are not possible. Monte Carlo simulation must be used to estimate the radiation dosage being transported to a particular location. If the area is well shielded the probability of any one particular particle getting through is very small. Because of the large number of particles involved, even a tiny fraction penetrating the shield may represent an unacceptable level of radiation. It therefore becomes critical to be able to accurately estimate this extremely small probability. Importance sampling is a well known technique for improving the efficiency of rare event calculations. Here, a new set of probabilities is used in the simulation runs. The results are multiplied by the likelihood ratio between the true and simulated probabilities so as to keep our estimator unbiased. The variance of the resulting estimator is very sensitive to which new set of transition probabilities are chosen. It is shown that a zero variance estimator does exist, but that its computation requires exact knowledge of the solution. A simple random walk with an associated killing model for the scatter of neutrons is introduced. Large deviation results for optimal importance sampling in random walks are extended to the case where killing is present. An adaptive "learning" algorithm for implementing importance sampling is given for more general Markov chain models of neutron scatter. For finite state spaces this algorithm is shown to give, with probability one, a sequence of estimates converging exponentially fast to the true solution. In the final chapter, an attempt to generalize this algorithm to a continuous
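
    A toy version of the random-walk-with-killing setting described above makes the likelihood-ratio bookkeeping concrete. The sketch below estimates the probability that a symmetric walk, killed with fixed probability at every step, reaches a level m before being killed; the importance-sampling run removes the killing and biases the steps upward, accumulating the nominal-to-sampling probability ratio per step. The parameters and the particular upward bias are illustrative assumptions, not values from the dissertation.

        import numpy as np

        rng = np.random.default_rng(1)
        m, p_survive, N = 25, 0.9, 20_000      # target level, per-step survival probability, runs

        def crude_run():
            """Simulate the nominal walk; return 1.0 if it reaches m before being killed."""
            x = 0
            while True:
                if rng.random() > p_survive:   # killed on this step
                    return 0.0
                x += 1 if rng.random() < 0.5 else -1
                if x >= m:
                    return 1.0

        def is_run(q_up=0.7):
            """Importance sampling: no killing, steps biased upward with probability q_up.
            The weight accumulates (nominal step probability) / (sampling step probability)."""
            x, w = 0, 1.0
            while x < m:
                up = rng.random() < q_up
                w *= (p_survive * 0.5) / (q_up if up else 1.0 - q_up)
                x += 1 if up else -1
            return w

        mc = np.array([crude_run() for _ in range(N)])
        iw = np.array([is_run() for _ in range(N)])
        print(f"crude MC: {mc.mean():.2e} ({int(mc.sum())} hits out of {N})")
        print(f"IS:       {iw.mean():.2e} +/- {iw.std(ddof=1) / np.sqrt(N):.1e}")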

  14. Physically realistic modeling of maritime training simulation

    OpenAIRE

    Cieutat , Jean-Marc

    2003-01-01

    Maritime training simulation is an important matter of maritime teaching, which requires a lot of scientific and technical skills.In this framework, where the real time constraint has to be maintained, all physical phenomena cannot be studied; the most visual physical phenomena relating to the natural elements and the ship behaviour are reproduced only. Our swell model, based on a surface wave simulation approach, permits to simulate the shape and the propagation of a regular train of waves f...

  15. Simulation of an extreme heavy rainfall event over Chennai, India using WRF: Sensitivity to grid resolution and boundary layer physics

    KAUST Repository

    Srinivas, C.V.

    2018-05-04

    In this study, the heavy precipitation event on 01 December 2015 over Chennai, located on the southeast coast of India, was simulated using the Weather Research and Forecast (WRF) model. A series of simulations were conducted using explicit convection and varying the planetary boundary layer (PBL) parameterization schemes. The model results were compared with available surface, satellite and Doppler Weather Radar observations. Simulations indicate that strong, sustained moist convection associated with the development of a mesoscale upper-air cyclonic circulation during the passage of a synoptic-scale low-pressure trough caused heavy rainfall over Chennai and its surroundings. Results suggest that veering of wind with height, associated with strong wind shear in the 800–400 hPa layer, together with dry air advection facilitated the development of instability and the initiation of convection. The 1-km domain using explicit convection improved the prediction of the rainfall intensity of about 450 mm and its distribution. The PBL physics strongly influenced the rainfall prediction by changing the location of the upper air circulation, energy transport, moisture convergence and intensity of convection in the schemes YSU, MYJ and MYNN. All the simulations underestimated the first spell of the heavy rainfall. While the YSU and MYJ schemes grossly underestimated the rainfall and dislocated the area of maximum rainfall, the higher order MYNN scheme simulated the rainfall pattern in better agreement with observations. The MYNN showed less mixing and simulated a more humid boundary layer, higher convective available potential energy (CAPE) and stronger winds at mid-troposphere than did the other schemes. The MYNN also realistically simulated the location of the upper air cyclonic flow and various dynamic and thermodynamic features. Consequently, it simulated stronger moisture convergence and higher precipitation.

  16. Simulation of an extreme heavy rainfall event over Chennai, India using WRF: Sensitivity to grid resolution and boundary layer physics

    KAUST Repository

    Srinivas, C.V.; Yesubabu, V.; Hari Prasad, D.; Hari Prasad, K.B.R.R.; Greeshma, M.M.; Baskaran, R.; Venkatraman, B.

    2018-01-01

    In this study, the heavy precipitation event on 01 December 2015 over Chennai, located on the southeast coast of India, was simulated using the Weather Research and Forecast (WRF) model. A series of simulations were conducted using explicit convection and varying the planetary boundary layer (PBL) parameterization schemes. The model results were compared with available surface, satellite and Doppler Weather Radar observations. Simulations indicate that strong, sustained moist convection associated with the development of a mesoscale upper-air cyclonic circulation during the passage of a synoptic-scale low-pressure trough caused heavy rainfall over Chennai and its surroundings. Results suggest that veering of wind with height, associated with strong wind shear in the 800–400 hPa layer, together with dry air advection facilitated the development of instability and the initiation of convection. The 1-km domain using explicit convection improved the prediction of the rainfall intensity of about 450 mm and its distribution. The PBL physics strongly influenced the rainfall prediction by changing the location of the upper air circulation, energy transport, moisture convergence and intensity of convection in the schemes YSU, MYJ and MYNN. All the simulations underestimated the first spell of the heavy rainfall. While the YSU and MYJ schemes grossly underestimated the rainfall and dislocated the area of maximum rainfall, the higher order MYNN scheme simulated the rainfall pattern in better agreement with observations. The MYNN showed less mixing and simulated a more humid boundary layer, higher convective available potential energy (CAPE) and stronger winds at mid-troposphere than did the other schemes. The MYNN also realistically simulated the location of the upper air cyclonic flow and various dynamic and thermodynamic features. Consequently, it simulated stronger moisture convergence and higher precipitation.

  17. An Overview of Westinghouse Realistic Large Break LOCA Evaluation Model

    Directory of Open Access Journals (Sweden)

    Cesare Frepoli

    2008-01-01

    Full Text Available Since the 1988 amendment of the 10 CFR 50.46 rule, Westinghouse has been developing and applying realistic or best-estimate methods to perform LOCA safety analyses. A realistic analysis requires the execution of various realistic LOCA transient simulations where the effects of both model and input uncertainties are ranged and propagated throughout the transients. The outcome is typically a range of results with associated probabilities. The thermal/hydraulic code is the engine of the methodology, but a procedure is developed to assess the code and determine its biases and uncertainties. In addition, inputs to the simulation are also affected by uncertainty and these uncertainties are incorporated into the process. Several approaches have been proposed and applied in the industry in the framework of best-estimate methods. Most of the implementations, including Westinghouse, follow the Code Scaling, Applicability and Uncertainty (CSAU) methodology. The Westinghouse methodology is based on the use of the WCOBRA/TRAC thermal-hydraulic code. The paper starts with an overview of the regulations and their interpretation in the context of realistic analysis. The CSAU roadmap is reviewed in the context of its implementation in the Westinghouse evaluation model. An overview of the code (WCOBRA/TRAC) and methodology is provided. Finally, the recent evolution to nonparametric statistics in the current edition of the Westinghouse methodology is discussed. Sample results of a typical large break LOCA analysis for a PWR are provided.

  18. Benchmarking Simulation of Long Term Station Blackout Events

    International Nuclear Information System (INIS)

    Kim, Sung Kyum; Lee, John C.; Fynan, Douglas A.; Lee, John C.

    2013-01-01

    The importance of passive cooling systems has emerged since the SBO events. The turbine-driven auxiliary feedwater (TD-AFW) system is the only passive cooling system for steam generators (SGs) in current PWRs. During SBO events, all alternating current (AC) and direct current (DC) power is interrupted and the water levels of the steam generators become high. In this case, turbine blades could be degraded and could no longer cool down the SGs. To prevent this kind of degradation, an improved TD-AFW system should be installed in current PWRs, especially OPR 1000 plants. A long-term station blackout (LTSBO) scenario based on the improved TD-AFW system has been benchmarked as a reference input file. The following task is a safety analysis to find important parameters causing the peak cladding temperature (PCT) to vary. This task has been initiated with the benchmarked input deck following the State-of-the-Art Reactor Consequence Analyses (SOARCA) Report. The point of the improved TD-AFW is to control the water level of the SG by using an auxiliary battery charged by a generator connected to the auxiliary turbine. However, this battery could also be disconnected from the generator. To analyze the uncertainties of the failure of the auxiliary battery, a simulation of the time-dependent failure of the TD-AFW has been performed. In addition to the cases simulated in the paper, some valves (e.g., the pressurizer safety valve), available during SBO events in the paper, could be important parameters for assessing uncertainties in the estimated PCTs. The results for these parameters will be included in a future study, in addition to the results for the leakage of the RCP seals. After the simulation of several transient cases, the alternating conditional expectation (ACE) algorithm will be used to derive functional relationships between the PCT and several system parameters

  19. Discrete event simulation tool for analysis of qualitative models of continuous processing systems

    Science.gov (United States)

    Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)

    1990-01-01

    An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: library design module, model construction module, simulation module, and experimentation and analysis. The library design module supports the building of library knowledge including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability of log file comparisons.

  20. Discrete event simulation of the ATLAS second level trigger

    International Nuclear Information System (INIS)

    Vermeulen, J.C.; Dankers, R.J.; Hunt, S.; Harris, F.; Hortnagl, C.; Erasov, A.; Bogaerts, A.

    1998-01-01

    Discrete event simulation is applied for determining the computing and networking resources needed for the ATLAS second level trigger. This paper discusses the techniques used and some of the results obtained so far for well defined laboratory configurations and for the full system

  1. MR-based measurements and simulations of the magnetic field created by a realistic transcranial magnetic stimulation (TMS) coil and stimulator.

    Science.gov (United States)

    Mandija, Stefano; Petrov, Petar I; Neggers, Sebastian F W; Luijten, Peter R; van den Berg, Cornelis A T

    2016-11-01

    Transcranial magnetic stimulation (TMS) is an emerging technique that allows non-invasive neurostimulation. However, the correct validation of electromagnetic models of typical TMS coils and the correct assessment of the incident TMS field (B_TMS) produced by standard TMS stimulators are still lacking. Such a validation can be performed by mapping B_TMS produced by a realistic TMS setup. In this study, we show that MRI can provide precise quantification of the magnetic field produced by a realistic TMS coil and a clinically used TMS stimulator in the region in which neurostimulation occurs. Measurements of the phase accumulation created by TMS pulses applied during a tailored MR sequence were performed in a phantom. Dedicated hardware was developed to synchronize a typical, clinically used, TMS setup with a 3-T MR scanner. For comparison purposes, electromagnetic simulations of B_TMS were performed. MR-based measurements allow the mapping and quantification of B_TMS starting 2.5 cm from the TMS coil. For closer regions, the intra-voxel dephasing induced by B_TMS prohibits TMS field measurements. For 1% TMS output, the maximum measured value was ~0.1 mT. Simulations reflect quantitatively the experimental data. These measurements can be used to validate electromagnetic models of TMS coils, to guide TMS coil positioning, and for dosimetry and quality assessment of concurrent TMS-MRI studies without the need for crude methods, such as motor threshold, for stimulation dose determination. Copyright © 2016 John Wiley & Sons, Ltd.

  2. Using Discrete Event Simulation for Programming Model Exploration at Extreme-Scale: Macroscale Components for the Structural Simulation Toolkit (SST).

    Energy Technology Data Exchange (ETDEWEB)

    Wilke, Jeremiah J [Sandia National Laboratories (SNL-CA), Livermore, CA (United States); Kenny, Joseph P. [Sandia National Laboratories (SNL-CA), Livermore, CA (United States)

    2015-02-01

    Discrete event simulation provides a powerful mechanism for designing and testing new extreme- scale programming models for high-performance computing. Rather than debug, run, and wait for results on an actual system, design can first iterate through a simulator. This is particularly useful when test beds cannot be used, i.e. to explore hardware or scales that do not yet exist or are inaccessible. Here we detail the macroscale components of the structural simulation toolkit (SST). Instead of depending on trace replay or state machines, the simulator is architected to execute real code on real software stacks. Our particular user-space threading framework allows massive scales to be simulated even on small clusters. The link between the discrete event core and the threading framework allows interesting performance metrics like call graphs to be collected from a simulated run. Performance analysis via simulation can thus become an important phase in extreme-scale programming model and runtime system design via the SST macroscale components.

  3. Realistic molecular model of kerogen's nanostructure.

    Science.gov (United States)

    Bousige, Colin; Ghimbeu, Camélia Matei; Vix-Guterl, Cathie; Pomerantz, Andrew E; Suleimenova, Assiya; Vaughan, Gavin; Garbarino, Gaston; Feygenson, Mikhail; Wildgruber, Christoph; Ulm, Franz-Josef; Pellenq, Roland J-M; Coasne, Benoit

    2016-05-01

    Despite kerogen's importance as the organic backbone for hydrocarbon production from source rocks such as gas shale, the interplay between kerogen's chemistry, morphology and mechanics remains unexplored. As the environmental impact of shale gas rises, identifying functional relations between its geochemical, transport, elastic and fracture properties from realistic molecular models of kerogens becomes all the more important. Here, by using a hybrid experimental-simulation method, we propose a panel of realistic molecular models of mature and immature kerogens that provide a detailed picture of kerogen's nanostructure without considering the presence of clays and other minerals in shales. We probe the models' strengths and limitations, and show that they predict essential features amenable to experimental validation, including pore distribution, vibrational density of states and stiffness. We also show that kerogen's maturation, which manifests itself as an increase in the sp(2)/sp(3) hybridization ratio, entails a crossover from plastic-to-brittle rupture mechanisms.

  4. The devil is in the details: Comparisons of episodic simulations of positive and negative future events.

    Science.gov (United States)

    Puig, Vannia A; Szpunar, Karl K

    2017-08-01

    Over the past decade, psychologists have devoted considerable attention to episodic simulation-the ability to imagine specific hypothetical events. Perhaps one of the most consistent patterns of data to emerge from this literature is that positive simulations of the future are rated as more detailed than negative simulations of the future, a pattern of results that is commonly interpreted as evidence for a positivity bias in future thinking. In the present article, we demonstrate across two experiments that negative future events are consistently simulated in more detail than positive future events when frequency of prior thinking is taken into account as a possible confounding variable and when level of detail associated with simulated events is assessed using an objective scoring criterion. Our findings are interpreted in the context of the mobilization-minimization hypothesis of event cognition that suggests people are especially likely to devote cognitive resources to processing negative scenarios. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  5. Simplified realistic human head model for simulating Tumor Treating Fields (TTFields).

    Science.gov (United States)

    Wenger, Cornelia; Bomzon, Ze'ev; Salvador, Ricardo; Basser, Peter J; Miranda, Pedro C

    2016-08-01

    Tumor Treating Fields (TTFields) are low-intensity (1-3 V/cm) alternating electric fields in the intermediate frequency range (100-300 kHz). TTFields are an anti-mitotic treatment against solid tumors that is approved for Glioblastoma Multiforme (GBM) patients. These electric fields are induced non-invasively by transducer arrays placed directly on the patient's scalp. Cell culture experiments showed that treatment efficacy is dependent on the induced field intensity. In clinical practice, a software tool called NovoTal(TM) uses head measurements to estimate the optimal array placement to maximize the electric field delivery to the tumor. Computational studies predict an increase in the tumor's electric field strength when adapting transducer arrays to its location. Ideally, a personalized head model could be created for each patient to calculate the electric field distribution for the specific situation. Thus, the optimal transducer layout could be inferred from field calculation rather than distance measurements. Nonetheless, creating realistic head models of patients is time-consuming and often needs user interaction, because automated image segmentation is prone to failure. This study presents a first approach to creating simplified head models consisting of convex hulls of the tissue layers. The model is able to account for anisotropic conductivity in the cortical tissues by using a tensor representation estimated from Diffusion Tensor Imaging. The induced electric field distribution is compared in the simplified and realistic head models. The average field intensities in the brain and tumor are generally slightly higher in the realistic head model, with a maximal ratio of 114% for a simplified model with reasonable layer thicknesses. Thus, the present pipeline is a fast and efficient means towards personalized head models, with less complexity involved in characterizing tissue interfaces, while enabling accurate predictions of the electric field distribution.
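
    The idea of replacing detailed tissue surfaces by their convex hulls can be illustrated in a few lines: given a point cloud sampled from a segmented tissue boundary, the hull yields a closed, simplified surface. This is a generic sketch using scipy on a synthetic point cloud, not the authors' pipeline; the ellipsoid stand-in and its dimensions are assumptions.

        import numpy as np
        from scipy.spatial import ConvexHull

        rng = np.random.default_rng(0)

        # Stand-in for segmented boundary points of one tissue layer: a noisy ellipsoid cloud.
        u = rng.uniform(0, np.pi, 2000)
        v = rng.uniform(0, 2 * np.pi, 2000)
        pts = np.column_stack([
            9.0 * np.sin(u) * np.cos(v),
            7.0 * np.sin(u) * np.sin(v),
            6.0 * np.cos(u),
        ]) + rng.normal(scale=0.3, size=(2000, 3))

        hull = ConvexHull(pts)   # closed, simplified surface of the layer
        print(f"{len(pts)} boundary points -> {len(hull.vertices)} hull vertices, "
              f"{hull.simplices.shape[0]} triangular facets, enclosed volume {hull.volume:.0f} (arb. units)")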

  6. A Framework for the Optimization of Discrete-Event Simulation Models

    Science.gov (United States)

    Joshi, B. D.; Unal, R.; White, N. H.; Morris, W. D.

    1996-01-01

    With the growing use of computer modeling and simulation in all aspects of engineering, the scope of traditional optimization has to be extended to include simulation models. Some unique aspects have to be addressed while optimizing via stochastic simulation models. The optimization procedure has to explicitly account for the randomness inherent in the stochastic measures predicted by the model. This paper outlines a general purpose framework for optimization of terminating discrete-event simulation models. The methodology combines a chance constraint approach for problem formulation together with standard statistical estimation and analysis techniques. The applicability of the optimization framework is illustrated by minimizing the operation and support resources of a launch vehicle through a simulation model.
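
    One way to make the chance-constraint idea concrete: for each candidate resource level, replicate the stochastic simulation, estimate the probability that the performance target is met, and accept the smallest level whose lower confidence bound clears the required probability. The toy "simulation model" below is a stand-in random function; the crew sizes, time limit and distributions are all assumptions for illustration.

        import numpy as np

        rng = np.random.default_rng(42)
        N, limit, alpha, z = 200, 30.0, 0.90, 1.645   # replications, time limit, required prob., one-sided 95% z

        def simulate_turnaround(crew_size):
            """Stand-in stochastic model: processing time shrinks with crew size but stays noisy."""
            return 60.0 / crew_size + rng.gamma(shape=2.0, scale=3.0)

        for crew in range(1, 6):
            times = np.array([simulate_turnaround(crew) for _ in range(N)])
            p_hat = (times <= limit).mean()
            lcb = p_hat - z * np.sqrt(p_hat * (1 - p_hat) / N)   # normal-approximation lower bound
            verdict = "accept" if lcb >= alpha else "reject"
            print(f"crew={crew}: P(time <= {limit}) ~ {p_hat:.2f}, 95% LCB {lcb:.2f} -> {verdict}")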

  7. Application of Discrete Event Simulation in Mine Production Forecast

    African Journals Online (AJOL)

    Application of Discrete Event Simulation in Mine Production Forecast. Felix Adaania Kaba, Victor Amoako Temeng, Peter Arroja Eshun. Abstract. Mine production forecast is pertinent to mining as it serves production goals for a production period. Perseus Mining Ghana Limited (PMGL), Ayanfuri, deterministically forecasts ...

  8. Details of regional particle deposition and airflow structures in a realistic model of human tracheobronchial airways: two-phase flow simulation.

    Science.gov (United States)

    Rahimi-Gorji, Mohammad; Gorji, Tahereh B; Gorji-Bandpy, Mofid

    2016-07-01

    In the present investigation, detailed two-phase flow modeling of airflow, transport and deposition of micro-particles (1-10µm) in a realistic tracheobronchial airway geometry based on CT scan images under various breathing conditions (i.e. 10-60l/min) was considered. Lagrangian particle tracking has been used to investigate the particle deposition patterns in a model comprising mouth up to generation G6 of tracheobronchial airways. The results demonstrated that during all breathing patterns, the maximum velocity change occurred in the narrow throat region (Larynx). Due to implementing a realistic geometry for simulations, many irregularities and bending deflections exist in the airways model. Thereby, at higher inhalation rates, these areas are prone to vortical effects which tend to entrap the inhaled particles. According to the results, deposition fraction has a direct relationship with particle aerodynamic diameter (for dp=1-10µm). Enhancing inhalation flow rate and particle size will largely increase the inertial force and consequently, more particle deposition is evident suggesting that inertial impaction is the dominant deposition mechanism in tracheobronchial airways. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. Parallel discrete-event simulation of FCFS stochastic queueing networks

    Science.gov (United States)

    Nicol, David M.

    1988-01-01

    Physical systems are inherently parallel. Intuition suggests that simulations of these systems may be amenable to parallel execution. The parallel execution of a discrete-event simulation requires careful synchronization of processes in order to ensure the execution's correctness; this synchronization can degrade performance. Largely negative results were recently reported in a study which used a well-known synchronization method on queueing network simulations. Discussed here is a synchronization method (appointments), which has proven itself to be effective on simulations of FCFS queueing networks. The key concept behind appointments is the provision of lookahead. Lookahead is a prediction of a processor's future behavior, based on an analysis of the processor's simulation state. It is shown how lookahead can be computed for FCFS queueing network simulations; performance data are given that demonstrate the method's effectiveness under moderate to heavy loads, and performance tradeoffs between the quality of lookahead and the cost of computing it are discussed.

  10. Discrete event simulation versus conventional system reliability analysis approaches

    DEFF Research Database (Denmark)

    Kozine, Igor

    2010-01-01

    Discrete Event Simulation (DES) environments are rapidly developing and appear to be promising tools for building reliability and risk analysis models of safety-critical systems and human operators. If properly developed, they are an alternative to the conventional human reliability analysis models...... and systems analysis methods such as fault and event trees and Bayesian networks. As one part, the paper describes briefly the author’s experience in applying DES models to the analysis of safety-critical systems in different domains. The other part of the paper is devoted to comparing conventional approaches...

  11. ReDecay, a method to re-use the underlying events to speed up the simulation in LHCb

    CERN Multimedia

    Muller, Dominik

    2017-01-01

    With the steady increase in the precision of flavour physics measurements collected during LHC Run 2, the LHCb experiment requires simulated data samples of ever increasing magnitude to study the detector response in detail. However, relying on an increase of available computing power for the production of simulated events will not suffice to achieve this goal. The simulation of the detector response is the main contribution to the time needed to generate a sample, which scales linearly with the particle multiplicity of the event. Of the dozens of particles present in the simulation only a few, namely those participating in the studied signal decay, are of particular interest, while all remaining ones, the so-called underlying event, mainly affect the resolution and efficiencies of the detector. This talk presents a novel development for the LHCb simulation software which re-uses the underlying event from previously simulated events. This approach achieves an order of magnitude increase in speed and the same ...

  12. Simulation of tokamak runaway-electron events

    International Nuclear Information System (INIS)

    Bolt, H.; Miyahara, A.; Miyake, M.; Yamamoto, T.

    1987-08-01

    High energy runaway-electron events which can occur in tokamaks when the plasma hits the first wall are a critical issue for the materials selection of future devices. Runaway-electron events are simulated with an electron linear accelerator to better understand the observed runaway-electron damage to tokamak first wall materials and to consider the runaway-electron issue in further materials development and selection. The electron linear accelerator produces beam energies of 20 to 30 MeV at an integrated power input of up to 1.3 kW. Graphite, SiC + 2 % AlN, stainless steel, molybdenum and tungsten have been tested as bulk materials. To test the reliability of actively cooled systems under runaway-electron impact layer systems of graphite fixed to metal substrates have been tested. The irradiation resulted in damage to the metal compounds but left graphite and SiC + 2 % AlN without damage. Metal substrates of graphite - metal systems for actively cooled structures suffer severe damage unless thick graphite shielding is provided. (author)

  13. Examining Passenger Flow Choke Points at Airports Using Discrete Event Simulation

    Science.gov (United States)

    Brown, Jeremy R.; Madhavan, Poomima

    2011-01-01

    The movement of passengers through an airport quickly, safely, and efficiently is the main function of the various checkpoints (check-in, security, etc.) found in airports. Human error combined with other breakdowns in the complex system of the airport can disrupt passenger flow through the airport, leading to lengthy waiting times, missing luggage and missed flights. In this paper we present a model of passenger flow through an airport using discrete event simulation that provides a closer look into the possible reasons for breakdowns and their implications for passenger flow. The simulation is based on data collected at Norfolk International Airport (ORF). The primary goal of this simulation is to present ways to optimize the work force to keep passenger flow smooth even during peak travel times and for emergency preparedness at ORF in case of adverse events. In this simulation we ran three different scenarios: real world, increased check-in stations, and multiple waiting lines. Increased check-in stations increased waiting time and instantaneous utilization, while the multiple waiting lines decreased both the waiting time and instantaneous utilization. This simulation was able to show how different changes affected the passenger flow through the airport.
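
    A compact way to explore such staffing questions is the standard multi-server FCFS recursion: each passenger joins a single queue and is served by the earliest-available station, and the waiting time follows directly. The sketch below is a generic illustration with made-up arrival and service rates, not the ORF model described in the paper.

        import numpy as np

        def mean_wait(n_stations, arrival_rate, service_rate, n_passengers=20_000, seed=0):
            """Average waiting time in a single FCFS queue feeding n_stations identical counters."""
            rng = np.random.default_rng(seed)
            arrivals = np.cumsum(rng.exponential(1 / arrival_rate, n_passengers))
            services = rng.exponential(1 / service_rate, n_passengers)
            free_at = np.zeros(n_stations)        # time at which each counter next becomes free
            waits = np.empty(n_passengers)
            for i, (t, s) in enumerate(zip(arrivals, services)):
                k = np.argmin(free_at)            # earliest-available counter
                start = max(t, free_at[k])
                waits[i] = start - t
                free_at[k] = start + s
            return waits.mean()

        # Peak-hour toy numbers: 3 passengers/min arriving, each served in ~2 min on average.
        for c in (7, 8, 9, 10):
            print(f"{c} check-in stations: mean wait {mean_wait(c, 3.0, 0.5):.2f} min")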

  14. Asynchronous discrete event schemes for PDEs

    Science.gov (United States)

    Stone, D.; Geiger, S.; Lord, G. J.

    2017-08-01

    A new class of asynchronous discrete-event simulation schemes for advection-diffusion-reaction equations is introduced, based on the principle of allowing quanta of mass to pass through faces of a (regular, structured) Cartesian finite volume grid. The timescales of these events are linked to the flux on the face. The resulting schemes are self-adaptive, and local in both time and space. Experiments are performed on realistic physical systems related to porous media flow applications, including a large 3D advection diffusion equation and advection diffusion reaction systems. The results are compared to highly accurate reference solutions where the temporal evolution is computed with exponential integrator schemes using the same finite volume discretisation. This allows a reliable estimation of the solution error. Our results indicate a first order convergence of the error as a control parameter is decreased, and we outline a framework for analysis.
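
    A stripped-down version of the quantum-of-mass idea for pure 1D diffusion is sketched below: each interior face schedules the time at which one quantum would cross it under its current flux, and firing an event transfers the quantum and invalidates the now-stale events of neighbouring faces. This is a simplified illustration of the principle (no advection or reaction terms, lazy invalidation via version counters), not the authors' scheme; the grid size, quantum and diffusivity are arbitrary.

        import heapq
        import numpy as np

        D, dx, q = 1.0, 1.0, 0.05                 # diffusivity, cell size, mass quantum
        u = np.zeros(20); u[:5] = 1.0             # initial concentration step
        version = np.zeros(19, dtype=int)         # one counter per interior face
        heap, t = [], 0.0

        def schedule(face, now):
            """Schedule the next quantum crossing of a face, based on its current flux."""
            flux = D * (u[face] - u[face + 1]) / dx
            if abs(flux) > 1e-12:
                heapq.heappush(heap, (now + q / abs(flux), version[face], face))

        for f in range(19):
            schedule(f, 0.0)

        for _ in range(2000):                     # process a fixed number of events
            if not heap:
                break
            t_ev, ver, f = heapq.heappop(heap)
            if ver != version[f]:                 # stale event: the flux changed since scheduling
                continue
            t = t_ev
            src, dst = (f, f + 1) if u[f] > u[f + 1] else (f + 1, f)
            u[src] -= q; u[dst] += q              # move one quantum down-gradient
            for g in (f - 1, f, f + 1):           # neighbouring faces now see a different flux
                if 0 <= g < 19:
                    version[g] += 1
                    schedule(g, t)

        print(f"t = {t:.2f}, total mass = {u.sum():.2f}")
        print(np.round(u, 2))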

  15. What are the assets and weaknesses of HFO detectors? A benchmark framework based on realistic simulations.

    Directory of Open Access Journals (Sweden)

    Nicolas Roehri

    Full Text Available High-frequency oscillations (HFOs) have been suggested as biomarkers of epileptic tissues. While visual marking of these short and small oscillations is tedious and time-consuming, automatic HFO detectors have not yet met a large consensus. Even though detectors have been shown to perform well when validated against visual marking, the large number of false detections due to their lack of robustness hinders their clinical application. In this study, we developed a validation framework based on realistic and controlled simulations to quantify precisely the assets and weaknesses of current detectors. We constructed a dictionary of synthesized elements (HFOs and epileptic spikes) from different patients and brain areas by extracting these elements from the original data using discrete wavelet transform coefficients. These elements were then added to their corresponding simulated background activity (preserving patient- and region-specific spectra). We tested five existing detectors against this benchmark. Compared to other studies confronting detectors, we not only ranked them according to their performance but also investigated the reasons leading to these results. Our simulations, thanks to their realism and their variability, enabled us to highlight unreported issues of current detectors: (1) the lack of robust estimation of the background activity, (2) the underestimated impact of the 1/f spectrum, and (3) the inadequate criteria defining an HFO. We believe that our benchmark framework could be a valuable tool to translate HFOs into a clinical environment.
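
    The benchmark construction described above (synthesized HFOs added to background activity with a realistic 1/f spectrum) can be mimicked in a few lines: spectrally shaped noise for the background plus a Gaussian-windowed oscillatory burst for the HFO. The sampling rate, frequencies, durations and amplitudes below are arbitrary choices, not those of the study.

        import numpy as np

        rng = np.random.default_rng(3)
        fs, dur = 2048, 4.0                        # sampling rate (Hz) and duration (s)
        n = int(fs * dur)

        # 1/f background: shape the spectrum of white noise by 1/sqrt(f) and transform back.
        spec = np.fft.rfft(rng.normal(size=n))
        freqs = np.fft.rfftfreq(n, 1 / fs)
        spec[1:] /= np.sqrt(freqs[1:])             # power ~ 1/f; leave the DC bin untouched
        background = np.fft.irfft(spec, n)
        background *= 50.0 / background.std()      # scale to ~50 uV RMS

        # Synthetic HFO: 150 Hz burst of roughly 80 ms, inserted at t = 2 s.
        t = np.arange(n) / fs
        hfo = 20.0 * np.exp(-((t - 2.0) / 0.02) ** 2) * np.sin(2 * np.pi * 150.0 * (t - 2.0))

        signal = background + hfo
        print(f"background RMS {background.std():.1f} uV, HFO peak {np.abs(hfo).max():.1f} uV")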

  16. Advances in Discrete-Event Simulation for MSL Command Validation

    Science.gov (United States)

    Patrikalakis, Alexander; O'Reilly, Taifun

    2013-01-01

    In the last five years, the discrete event simulator, SEQuence GENerator (SEQGEN), developed at the Jet Propulsion Laboratory to plan deep-space missions, has greatly increased uplink operations capacity to deal with increasingly complicated missions. In this paper, we describe how the Mars Science Laboratory (MSL) project makes full use of an interpreted environment to simulate change in more than fifty thousand flight software parameters and conditional command sequences to predict the result of executing a conditional branch in a command sequence, and enable the ability to warn users whenever one or more simulated spacecraft states change in an unexpected manner. Using these new SEQGEN features, operators plan more activities in one sol than ever before.

  17. Track-based event recognition in a realistic crowded environment

    Science.gov (United States)

    van Huis, Jasper R.; Bouma, Henri; Baan, Jan; Burghouts, Gertjan J.; Eendebak, Pieter T.; den Hollander, Richard J. M.; Dijk, Judith; van Rest, Jeroen H.

    2014-10-01

    Automatic detection of abnormal behavior in CCTV cameras is important to improve the security in crowded environments, such as shopping malls, airports and railway stations. This behavior can be characterized at different time scales, e.g., by small-scale subtle and obvious actions or by large-scale walking patterns and interactions between people. For example, pickpocketing can be recognized by the actual snatch (small scale), when he follows the victim, or when he interacts with an accomplice before and after the incident (longer time scale). This paper focusses on event recognition by detecting large-scale track-based patterns. Our event recognition method consists of several steps: pedestrian detection, object tracking, track-based feature computation and rule-based event classification. In the experiment, we focused on single track actions (walk, run, loiter, stop, turn) and track interactions (pass, meet, merge, split). The experiment includes a controlled setup, where 10 actors perform these actions. The method is also applied to all tracks that are generated in a crowded shopping mall in a selected time frame. The results show that most of the actions can be detected reliably (on average 90%) at a low false positive rate (1.1%), and that the interactions obtain lower detection rates (70% at 0.3% FP). This method may become one of the components that assists operators to find threatening behavior and enrich the selection of videos that are to be observed.
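
    The rule-based classification stage of such a pipeline can be sketched as simple thresholds on track features, here speed statistics and net displacement computed from a sequence of ground-plane positions. The thresholds, frame rate and class names are illustrative placeholders, not the ones used in the paper.

        import numpy as np

        def classify_track(positions, fps=10.0):
            """Classify a single track given an (N, 2) array of positions in metres."""
            pos = np.asarray(positions, dtype=float)
            steps = np.diff(pos, axis=0)
            step_len = np.linalg.norm(steps, axis=1)
            speeds = step_len * fps                        # instantaneous speed (m/s)
            net_disp = np.linalg.norm(pos[-1] - pos[0])
            path_len = step_len.sum()

            if speeds.mean() < 0.2:
                return "stop"
            if net_disp < 0.3 * path_len:                  # wandering without getting anywhere
                return "loiter"
            return "run" if speeds.mean() > 2.5 else "walk"

        # Toy tracks sampled at 10 Hz.
        t = np.arange(0.0, 5.0, 0.1)
        tracks = {
            "walker":   np.column_stack([1.2 * t, np.zeros_like(t)]),
            "runner":   np.column_stack([3.5 * t, np.zeros_like(t)]),
            "loiterer": np.column_stack([np.cos(2 * t), np.sin(2 * t)]),
        }
        for name, trk in tracks.items():
            print(name, "->", classify_track(trk))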

  18. Event-by-event simulation of quantum phenomena

    NARCIS (Netherlands)

    Raedt, H. De; Raedt, K. De; Michielsen, K.; Landau, DP; Lewis, SP; Schuttler, HB

    2006-01-01

    In various basic experiments in quantum physics, observations are recorded event-by-event. The final outcome of such experiments can be computed according to the rules of quantum theory, but quantum theory does not describe single events. In this paper, we describe a simulation approach that does

  19. Numerical simulation of internal reconnection event in spherical tokamak

    International Nuclear Information System (INIS)

    Hayashi, Takaya; Mizuguchi, Naoki; Sato, Tetsuya

    1999-07-01

    Three-dimensional magnetohydrodynamic simulations are executed in a full toroidal geometry to clarify the physical mechanisms of the Internal Reconnection Event (IRE), which is observed in the spherical tokamak experiments. The simulation results reproduce several main properties of IRE. Comparison between the numerical results and experimental observation indicates fairly good agreements regarding nonlinear behavior, such as appearance of localized helical distortion, appearance of characteristic conical shape in the pressure profile during thermal quench, and subsequent appearance of the m=2/n=1 type helical distortion of the torus. (author)

  20. Agent Based Simulation of Group Emotions Evolution and Strategy Intervention in Extreme Events

    Directory of Open Access Journals (Sweden)

    Bo Li

    2014-01-01

    Full Text Available Agent-based simulation has become a prominent approach in computational modeling and analysis of public emergency management in social science research. Group emotion evolution, information diffusion, and collective behavior selection make the study of extreme incidents a complex-systems problem, which requires new methods for incident management and strategy evaluation. This paper studies group emotion evolution and intervention strategy effectiveness using an agent-based simulation method. By employing a computational experimentation methodology, we model group emotion evolution as a complex system and test the effects of three strategies. In addition, an events-chain model is proposed to capture the cumulative influence of temporally successive events. Each strategy is examined through three simulation experiments, including two constructed scenarios and a real case study. We show how various strategies can impact group emotion evolution in terms of complex emergence and cumulative emotion influence in extreme events. This paper also provides an effective method for using agent-based simulation to study complex collective behavior evolution in extreme incidents, emergency, and security domains.
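
    The kind of agent-based setup described here can be sketched with a simple contagion update: each agent's emotion decays toward a baseline, is pushed up by an external extreme event, spreads from network neighbours, and an intervention strategy damps it for a randomly treated subset of agents each step. The network, update rule and parameters below are made-up illustrations, not the paper's model.

        import numpy as np

        rng = np.random.default_rng(7)
        n_agents, steps = 200, 60
        adj = (rng.random((n_agents, n_agents)) < 0.05).astype(float)   # random contact network
        np.fill_diagonal(adj, 0.0)

        def run(intervention_frac=0.0):
            """Simulate mean group emotion (each agent in [0, 1]); optionally calm a random fraction each step."""
            e = rng.uniform(0.0, 0.2, n_agents)
            history = []
            for t in range(steps):
                stimulus = 0.6 if t == 5 else 0.0                       # a single extreme event at t = 5
                neighbour_mean = (adj @ e) / np.maximum(adj.sum(axis=1), 1.0)
                e = 0.85 * e + 0.10 * neighbour_mean + stimulus         # decay + contagion + event
                if intervention_frac > 0.0:
                    calmed = rng.random(n_agents) < intervention_frac   # e.g. targeted information release
                    e[calmed] *= 0.5
                e = np.clip(e, 0.0, 1.0)
                history.append(e.mean())
            return np.array(history)

        base, managed = run(0.0), run(0.3)
        print(f"peak mean emotion: {base.max():.2f} without intervention, {managed.max():.2f} with intervention")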

  1. Neural Correlates of Realistic and Unrealistic Auditory Space Perception

    Directory of Open Access Journals (Sweden)

    Akiko Callan

    2011-10-01

    Full Text Available Binaural recordings can simulate externalized auditory space perception over headphones. However, if the orientation of the recorder's head and the orientation of the listener's head are incongruent, the simulated auditory space is not realistic. For example, if a person lying flat on a bed listens to an environmental sound that was recorded by microphones inserted in the ears of a person who was in an upright position, the sound simulates an auditory space rotated 90 degrees to the real-world horizontal axis. Our question is whether brain activation patterns differ between the unrealistic auditory space (i.e., the orientation of the listener's head and the orientation of the recorder's head are incongruent) and the realistic auditory space (i.e., the orientations are congruent). River sounds that were binaurally recorded either in a supine position or in an upright body position served as auditory stimuli. During fMRI experiments, participants listened to the stimuli and pressed one of two buttons indicating the direction of the water flow (horizontal/vertical). Behavioral results indicated that participants could not differentiate between the congruent and the incongruent conditions. However, neuroimaging results showed that the congruent condition activated the planum temporale significantly more than the incongruent condition.

  2. Atomic level insights into realistic molecular models of dendrimer-drug complexes through MD simulations

    Science.gov (United States)

    Jain, Vaibhav; Maiti, Prabal K.; Bharatam, Prasad V.

    2016-09-01

    Computational studies performed on dendrimer-drug complexes usually consider 1:1 stoichiometry, which is far from reality, since in experiments a larger number of drug molecules is encapsulated inside a dendrimer. In the present study, molecular dynamics (MD) simulations were implemented to characterize more realistic molecular models of dendrimer-drug complexes (1:n stoichiometry) in order to understand the effect of high drug loading on the structural properties and also to unveil the atomistic level details. For this purpose, possible inclusion complexes of the model drug Nateglinide (Ntg) (an antidiabetic belonging to Biopharmaceutics Classification System class II) with amine- and acetyl-terminated G4 poly(amidoamine) (G4 PAMAM(NH2) and G4 PAMAM(Ac)) dendrimers at neutral and low pH conditions are explored in this work. MD simulation analysis of the dendrimer-drug complexes revealed that the drug encapsulation efficiency of G4 PAMAM(NH2) and G4 PAMAM(Ac) dendrimers at neutral pH was 6 and 5, respectively, while at low pH it was 12 and 13, respectively. Center-of-mass distance analysis showed that most of the drug molecules are located in the interior hydrophobic pockets of G4 PAMAM(NH2) at both pH conditions, while in the case of G4 PAMAM(Ac), most of them are distributed near the surface at neutral pH and in the interior hydrophobic pockets at low pH. Structural properties such as radius of gyration, shape, radial density distribution, and solvent accessible surface area of the dendrimer-drug complexes were also assessed and compared with those of the drug-unloaded dendrimers. Further, binding energy calculations using the molecular mechanics Poisson-Boltzmann surface area approach revealed that the location of drug molecules in the dendrimer is not the decisive factor for the higher and lower binding affinity of the complex, but the charged state of dendrimer and drug, intermolecular interactions, pH-induced conformational changes, and surface groups of dendrimer do play an

  3. Nuclear facility safeguards systems modeling using discrete event simulation

    International Nuclear Information System (INIS)

    Engi, D.

    1977-01-01

    The threat of theft or dispersal of special nuclear material at a nuclear facility is treated by studying the temporal relationships between adversaries having authorized access to the facility (insiders) and safeguards system events, using a GASP IV discrete event simulation. The safeguards system events--detection, assessment, delay, communications, and neutralization--are modeled for the general insider adversary strategy, which includes degradation of the safeguards system elements followed by an attempt to steal or disperse special nuclear material. The performance measure used in the analysis is the estimated probability of safeguards system success in countering the adversary, based upon a predetermined set of adversary actions. An example problem with generated results is presented for a hypothetical nuclear facility. The results illustrate representative information that could be utilized by safeguards decision-makers.
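    The GASP IV model itself is not reproduced here, but the underlying idea (a timed race between an insider's action sequence and the safeguards detection-assessment-response sequence, with the probability of safeguards success estimated over many trials) can be sketched with a modern discrete-event library such as SimPy. All timing distributions below are invented for illustration.

```python
# Illustrative sketch only (not the GASP IV safeguards model): a discrete-event
# race between an insider's theft timeline and the safeguards response sequence.
import random
import simpy

def trial(seed):
    random.seed(seed)
    env = simpy.Environment()
    outcome = {}

    def adversary(env):
        yield env.timeout(random.expovariate(1 / 20.0))  # degrade safeguards elements (placeholder)
        yield env.timeout(random.expovariate(1 / 10.0))  # reach and remove material (placeholder)
        outcome.setdefault("winner", "adversary")        # first process to finish wins

    def safeguards(env):
        yield env.timeout(random.expovariate(1 / 5.0))   # detection (placeholder)
        yield env.timeout(2.0)                           # assessment (placeholder)
        yield env.timeout(random.expovariate(1 / 8.0))   # communication + response delay (placeholder)
        outcome.setdefault("winner", "safeguards")

    env.process(adversary(env))
    env.process(safeguards(env))
    env.run()
    return outcome["winner"] == "safeguards"

wins = sum(trial(s) for s in range(10000))
print("estimated probability of safeguards success:", wins / 10000)
```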

  4. Simulating microtransport in realistic porous media

    NARCIS (Netherlands)

    Lopez Penha, D.J.

    2012-01-01

    Simulations in porous media widely adopt macroscopic models of transport phenomena. These models are computationally efficient as not all geometrical details at the pore scale are accounted for. Generally, these models require closure relations for effective transport parameters, where the

  5. Event simulation for the WA80 experiment

    International Nuclear Information System (INIS)

    Sorensen, S.P.

    1986-01-01

    The HIJET and LUND event generators are compared. It is concluded that for detector construction and design of experimental setups, the differences between the two models are marginal. The coverage of the WA80 setup in pseudorapidity and energy is demonstrated. The performance of some of the WA80 detectors (zero-degree calorimeter, wall calorimeter, multiplicity array, and SAPHIR lead-glass detector) is evaluated based on calculations with the LUND or the HIJET codes combined with codes simulating the detector responses. 9 refs., 3 figs

  6. Discrete event simulation in an artificial intelligence environment: Some examples

    International Nuclear Information System (INIS)

    Roberts, D.J.; Farish, T.

    1991-01-01

    Several Los Alamos National Laboratory (LANL) object-oriented discrete-event simulation efforts have been completed during the past three years. One of these systems has been put into production and has a growing customer base. Another (started two years earlier than the first project) was completed but has not yet been used. This paper will describe these simulation projects. Factors pertinent to the success of the one project and to the failure of the second will be discussed (success will be measured as the extent to which the simulation model was used as originally intended). 5 figs

  7. FROG: The Fast And Realistic OpenGL Event Displayer

    CERN Document Server

    Quertenmont, Loic

    2009-01-01

    FROG [1] [2] is a generic framework dedicated to the visualisation of events in high energy physics experiments. It is suitable for any particular physics experiment or detector design. The code is light (< 3 MB) and fast (a browsing time of 20 events per second for a large High Energy Physics experiment) and can run on various operating systems, as its object-oriented structure (C++) relies on the cross-platform OPENGL [3] and GLUT [4] libraries. Moreover, FROG does not require installation of third-party libraries for the visualisation. This document describes the features and principles of FROG version 1.106, its working scheme and its numerous functionalities, such as 3D and 2D visualisation, graphical user interface, mouse interface, configuration files, production of pictures in various formats, integration of personal objects, etc. Finally, the application of FROG to physics experiments and environments, such as Gastof, CMS, ILD and Delphes, is presented for illustration.

  8. Automatic temperature computation for realistic IR simulation

    Science.gov (United States)

    Le Goff, Alain; Kersaudy, Philippe; Latger, Jean; Cathala, Thierry; Stolte, Nilo; Barillot, Philippe

    2000-07-01

    Polygon temperature computation in 3D virtual scenes is fundamental for IR image simulation. This article describes in detail the temperature calculation software and its current extensions, briefly presented in [1]. This software, called MURET, is used by the simulation workshop CHORALE of the French DGA. MURET is a one-dimensional thermal software package which accurately takes into account the material thermal attributes of the three-dimensional scene and the variation of the environment characteristics (atmosphere) as a function of time. Concerning the environment, absorbed incident fluxes are computed wavelength by wavelength, every half hour, during the 24 hours preceding the time of the simulation. For each polygon, incident fluxes are composed of direct solar fluxes and sky illumination (including diffuse solar fluxes). Concerning the materials, classical thermal attributes such as conductivity, absorption, spectral emissivity, density, specific heat, thickness and convection coefficients are associated with several layers and taken into account. In the future, MURET will be able to simulate permeable natural materials (water influence) and vegetation (woods). This model of thermal attributes yields a very accurate polygon temperature computation for the complex 3D databases often found in CHORALE simulations. The kernel of MURET consists of an efficient ray tracer, which computes the history (over 24 hours) of the shadowed parts of the 3D scene, and a library responsible for the thermal computations. The main originality concerns the way the heating fluxes are computed. Using ray tracing, the flux received at each 3D point of the scene accurately takes into account the masking (hidden surfaces) between objects. This library also supplies other thermal modules, such as a thermal shadow computation tool.

  9. Computational investigation of nonlinear microwave tomography on anatomically realistic breast phantoms

    DEFF Research Database (Denmark)

    Jensen, P. D.; Rubæk, Tonny; Mohr, J. J.

    2013-01-01

    The performance of a nonlinear microwave tomography algorithm is tested using simulated data from anatomically realistic breast phantoms. These tests include several different anatomically correct breast models from the University of Wisconsin-Madison repository with and without tumors inserted....

  10. Event-by-event jet quenching

    Energy Technology Data Exchange (ETDEWEB)

    Fries, R.J.; Rodriguez, R.; Ramirez, E.

    2010-08-14

    High momentum jets and hadrons can be used as probes for the quark gluon plasma (QGP) formed in nuclear collisions at high energies. We investigate the influence of fluctuations in the fireball on jet quenching observables by comparing propagation of light quarks and gluons through averaged, smooth QGP fireballs with event-by-event jet quenching using realistic inhomogeneous fireballs. We find that the transverse momentum and impact parameter dependence of the nuclear modification factor R_AA can be fit well in an event-by-event quenching scenario within experimental errors. However the transport coefficient q̂ extracted from fits to the measured nuclear modification factor R_AA in averaged fireballs underestimates the value from event-by-event calculations by up to 50%. On the other hand, after adjusting q̂ to fit R_AA in the event-by-event analysis we find residual deviations in the azimuthal asymmetry v_2 and in two-particle correlations, that provide a possible faint signature for a spatial tomography of the fireball. We discuss a correlation function that is a measure for spatial inhomogeneities in a collision and can be constrained from data.
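    The nuclear modification factor R_AA referred to above is, by definition, the measured AA spectrum divided by the binary-collision-scaled pp reference spectrum. The toy numpy sketch below shows only that definition (not the authors' jet-quenching calculation); all spectra and the N_coll value are placeholders.

```python
# Hedged sketch of the definition R_AA = (dN_AA/dpT) / (N_coll * dN_pp/dpT).
# The spectra and N_coll value are invented placeholder numbers.
import numpy as np

pt = np.linspace(2.0, 20.0, 10)          # GeV/c bins (placeholder)
dN_AA_dpt = 1.0e3 * pt ** -6.0 * 0.35    # quenched AA yield (placeholder)
dN_pp_dpt = 1.0e1 * pt ** -6.0           # pp reference yield (placeholder)
N_coll = 100.0                           # binary-collision scaling factor (placeholder)

R_AA = dN_AA_dpt / (N_coll * dN_pp_dpt)
for p, r in zip(pt, R_AA):
    print(f"pT = {p:5.1f} GeV/c   R_AA = {r:.3f}")
```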

  11. Event-by-event jet quenching

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez, R. [Cyclotron Institute and Physics Department, Texas A and M University, College Station, TX 77843 (United States); Fries, R.J., E-mail: rjfries@comp.tamu.ed [Cyclotron Institute and Physics Department, Texas A and M University, College Station, TX 77843 (United States); RIKEN/BNL Research Center, Brookhaven National Laboratory, Upton, NY 11973 (United States); Ramirez, E. [Physics Department, University of Texas El Paso, El Paso, TX 79968 (United States)

    2010-09-27

    High momentum jets and hadrons can be used as probes for the quark gluon plasma (QGP) formed in nuclear collisions at high energies. We investigate the influence of fluctuations in the fireball on jet quenching observables by comparing propagation of light quarks and gluons through averaged, smooth QGP fireballs with event-by-event jet quenching using realistic inhomogeneous fireballs. We find that the transverse momentum and impact parameter dependence of the nuclear modification factor R_AA can be fit well in an event-by-event quenching scenario within experimental errors. However the transport coefficient q extracted from fits to the measured nuclear modification factor R_AA in averaged fireballs underestimates the value from event-by-event calculations by up to 50%. On the other hand, after adjusting q to fit R_AA in the event-by-event analysis we find residual deviations in the azimuthal asymmetry v_2 and in two-particle correlations, that provide a possible faint signature for a spatial tomography of the fireball. We discuss a correlation function that is a measure for spatial inhomogeneities in a collision and can be constrained from data.

  12. Analysis of manufacturing based on object oriented discrete event simulation

    Directory of Open Access Journals (Sweden)

    Eirik Borgen

    1990-01-01

    Full Text Available This paper describes SIMMEK, a computer-based tool for performing analysis of manufacturing systems, developed at the Production Engineering Laboratory, NTH-SINTEF. Its main use will be in the analysis of job shop types of manufacturing, but certain facilities make it suitable for FMS as well as production line manufacturing. This type of simulation is very useful in the analysis of any type of change that occurs in a manufacturing system. These changes may be investments in new machines or equipment, a change in layout, a change in product mix, use of late shifts, etc. The effects these changes have on, for instance, the throughput, the amount of WIP (work in progress), the costs or the net profit can be analysed, and this can be done before the changes are made and without disturbing the real system. Unlike other tools for analysis of manufacturing systems, simulation takes into consideration uncertainty in arrival rates, process and operation times, and machine availability. It also shows the interaction effects that a job which is late in one machine has on the remaining machines in its route through the layout. It is these effects that cause production plans not to be fulfilled completely. SIMMEK is based on discrete event simulation, and the modeling environment is object oriented. The object-oriented models are transformed by an object linker into data structures executable by the simulation kernel. The processes of the entity objects, i.e. the products, are broken down into events and put into an event list. The user-friendly graphical modeling environment makes it possible for end users to build models in a quick and reliable way, using terms from manufacturing. Various tests and a check of model logic are helpful functions when testing the validity of the models. Integration with software packages providing business graphics and statistical functions is convenient in the result presentation phase.
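    A minimal sketch of the kind of job-shop discrete-event model described above, written with the SimPy library rather than with SIMMEK itself; machine names, routings, and timing distributions are invented for illustration, and the reported statistics (throughput and mean WIP) mirror the performance measures mentioned in the abstract.

```python
# Illustrative job-shop sketch (not SIMMEK): jobs with stochastic operation times
# visit three machines in a fixed route; throughput and mean WIP are reported.
import random
import simpy

random.seed(1)
env = simpy.Environment()
machines = {name: simpy.Resource(env, capacity=1) for name in ("lathe", "mill", "drill")}
route = ["lathe", "mill", "drill"]
finished, wip_samples = [], []
wip = 0

def job(env):
    global wip
    wip += 1
    for station in route:
        with machines[station].request() as req:
            yield req
            yield env.timeout(random.expovariate(1 / 4.0))   # operation time (placeholder)
    wip -= 1
    finished.append(env.now)

def arrivals(env):
    while True:
        yield env.timeout(random.expovariate(1 / 5.0))       # inter-arrival time (placeholder)
        env.process(job(env))

def monitor(env):
    while True:
        wip_samples.append(wip)
        yield env.timeout(10.0)

env.process(arrivals(env))
env.process(monitor(env))
env.run(until=10000)
print("jobs completed:", len(finished))
print("mean WIP:", sum(wip_samples) / len(wip_samples))
```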

  13. Mixed-realism simulation of adverse event disclosure: an educational methodology and assessment instrument.

    Science.gov (United States)

    Matos, Francisco M; Raemer, Daniel B

    2013-04-01

    Physicians have an ethical duty to disclose adverse events to patients or families. Various strategies have been reported for teaching disclosure, but no instruments have been shown to be reliable for assessing them. The aims of this study were to report a structured method for teaching adverse event disclosure using mixed-realism simulation, to develop and begin to validate an instrument for assessing performance, and to describe the disclosure practice of anesthesiology trainees. Forty-two anesthesiology trainees participated in a 2-part exercise with mixed-realism simulation. The first part took place with a mannequin patient in a simulated operating room, where trainees became enmeshed in a clinical episode that led to an adverse event; the second part took place in a simulated postoperative care unit, where the learner is asked to disclose to a standardized patient who systematically moves through epochs of grief response. Two raters scored subjects using an assessment instrument we developed that combines a 4-element behaviorally anchored rating scale (BARS) and a 5-stage objective rating scale. The performance scores for elements within the BARS and the 5-stage instrument showed excellent interrater reliability (Cohen's κ = 0.7), appropriate range (mean range for BARS, 4.20-4.47; mean range for 5-stage instrument, 3.73-4.46), and high internal consistency. This method of mixed-realism simulation engages learners in an adverse event and allows them to practice disclosure to a structured range of patient responses. We have developed a reliable 2-part instrument with strong psychometric properties for assessing disclosure performance.
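    The interrater reliability quoted above is Cohen's kappa. For readers unfamiliar with the statistic, it can be computed directly, for example with scikit-learn; the rating data below are invented and not the study's data.

```python
# Sketch of the agreement statistic cited in the abstract (Cohen's kappa),
# computed with scikit-learn on invented two-rater BARS scores.
from sklearn.metrics import cohen_kappa_score

rater_a = [4, 3, 5, 2, 4, 4, 3, 5, 1, 4]   # hypothetical scores from rater A
rater_b = [4, 3, 4, 2, 4, 5, 3, 5, 2, 4]   # hypothetical scores from rater B
print("Cohen's kappa:", round(cohen_kappa_score(rater_a, rater_b), 2))
```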

  14. Event-by-Event Observables and Fluctuations

    International Nuclear Information System (INIS)

    Petersen, Hannah

    2013-01-01

    In this talk the status and open questions of the phenomenological description of all the stages of a heavy ion reaction are highlighted. Special emphasis is put on event-by-event fluctuations and associated observables. The first part is concentrated on high RHIC and LHC energies and the second part reviews the challenges for modeling heavy ion reactions at lower beam energies in a more realistic fashion. Overall, the main conclusion is that sophisticated theoretical dynamical approaches that describe many observables in the same framework are essential for the quantitative understanding of the properties of hot and dense nuclear matter

  15. Advanced Mechanistic 3D Spatial Modeling and Analysis Methods to Accurately Represent Nuclear Facility External Event Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Sezen, Halil [The Ohio State Univ., Columbus, OH (United States). Dept. of Civil, Environmental and Geodetic Engineering; Aldemir, Tunc [The Ohio State Univ., Columbus, OH (United States). College of Engineering, Nuclear Engineering Program, Dept. of Mechanical and Aerospace Engineering; Denning, R. [The Ohio State Univ., Columbus, OH (United States); Vaidya, N. [Rizzo Associates, Pittsburgh, PA (United States)

    2017-12-29

    Probabilistic risk assessment of nuclear power plants initially focused on events initiated by internal faults at the plant, rather than external hazards including earthquakes and flooding. Although the importance of external hazards risk analysis is now well recognized, the methods for analyzing low probability external hazards rely heavily on subjective judgment of specialists, often resulting in substantial conservatism. This research developed a framework to integrate the risk of seismic and flooding events using realistic structural models and simulation of response of nuclear structures. The results of four application case studies are presented.

  16. Multiple discrete-energy ion features in the inner magnetosphere: 9 February 1998, event

    Directory of Open Access Journals (Sweden)

    Y. Ebihara

    2004-04-01

    Full Text Available Multiple discrete-energy ion bands observed by the Polar satellite in the inner magnetosphere on 9 February 1998 were investigated by means of particle simulation with a realistic model of the convection electric field. The multiple bands appeared in the energy vs. L spectrum in the 1–100 keV range when Polar traveled through the heart of the ring current along the outbound and inbound paths. We performed particle tracing and simulated the energy vs. L spectra of proton fluxes under the dipole magnetic field, the corotation electric field, and the realistic convection electric field model with its parameters depending on the solar wind data. The simulated spectra are shown to agree well with the observed ones. A better agreement is achieved when we rotate the convection electric potential eastward by 2 h in MLT and change the distribution function in time in the near-Earth magnetotail. It is concluded that the multiple bands are likely produced by two processes for this particular event, that is, changes in the convection electric field (for >3 keV protons) and changes in the distribution function in the near-Earth magnetotail (for <3 keV protons). Key words: Magnetospheric physics (energetic particles, trapped; electric field) – Space plasma physics (numerical simulation studies)

  17. Device simulation of charge collection and single-event upset

    International Nuclear Information System (INIS)

    Dodd, P.E.

    1996-01-01

    In this paper the author reviews the current status of device simulation of ionizing-radiation-induced charge collection and single-event upset (SEU), with an emphasis on significant results of recent years. The author presents an overview of device-modeling techniques applicable to the SEU problem and the unique challenges this task presents to the device modeler. He examines unloaded simulations of radiation-induced charge collection in simple p/n diodes, SEU in dynamic random access memories (DRAM's), and SEU in static random access memories (SRAM's). The author concludes with a few thoughts on future issues likely to confront the SEU device modeler

  18. Assessing methane emission estimation methods based on atmospheric measurements from oil and gas production using LES simulations

    Science.gov (United States)

    Saide, P. E.; Steinhoff, D.; Kosovic, B.; Weil, J.; Smith, N.; Blewitt, D.; Delle Monache, L.

    2017-12-01

    There are a wide variety of methods that have been proposed and used to estimate methane emissions from oil and gas production by using air composition and meteorology observations in conjunction with dispersion models. Although there has been some verification of these methodologies using controlled releases and concurrent atmospheric measurements, it is difficult to assess the accuracy of these methods for more realistic scenarios considering factors such as terrain, emissions from multiple components within a well pad, and time-varying emissions representative of typical operations. In this work we use a large-eddy simulation (LES) to generate controlled but realistic synthetic observations, which can be used to test multiple source term estimation methods, also known as an Observing System Simulation Experiment (OSSE). The LES is based on idealized simulations of the Weather Research & Forecasting (WRF) model at 10 m horizontal grid-spacing covering an 8 km by 7 km domain with terrain representative of a region located in the Barnett shale. Well pads are set up in the domain following a realistic distribution, and emissions are prescribed every second for the components of each well pad (e.g., chemical injection pump, pneumatics, compressor, tanks, and dehydrator) using a simulator driven by oil and gas production volume, composition and realistic operational conditions. The system is set up to allow assessments under different scenarios such as normal operations, during liquids unloading events, or during other prescribed operational upset events. Methane and meteorology model output are sampled following the specifications of the emission estimation methodologies and considering typical instrument uncertainties, resulting in realistic observations (see Figure 1). We will show the evaluation of several emission estimation methods including the EPA Other Test Method 33A and estimates using the EPA AERMOD regulatory model. We will also show source estimation

  19. Prototype Development Capabilities of 3D Spatial Interactions and Failures During Scenario Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Steven Prescott; Ramprasad Sampath; Curtis Smith; Tony Koonce

    2014-09-01

    Computers have been used for 3D modeling and simulation, but only recently have computational resources been able to give realistic results in a reasonable time frame for large complex models. This report addresses the methods, techniques, and resources used to develop a prototype that uses a 3D modeling and simulation engine to improve risk analysis and to evaluate reactor structures and components for a given scenario. The simulations done for this evaluation were focused on external events, specifically tsunami floods, for a hypothetical nuclear power facility on a coastline.

  20. Two-dimensional numerical simulation of the effect of single event burnout for n-channel VDMOSFET

    International Nuclear Information System (INIS)

    Guo Hongxia; Chen Yusheng; Wang Wei; Zhao Jinlong; Zhang Yimen; Zhou Hui

    2004-01-01

    The 2D MEDICI simulator is used to investigate the effect of Single Event Burnout (SEB) in n-channel power VDMOSFETs. The simulation results are consistent with previously published experimental results and are of great interest for a better understanding of the occurrence of burnout events. The effects of the minority carrier lifetime in the base region, the base width and the emitter doping density on SEB susceptibility are verified. Some hardening solutions against SEB are provided. The work shows that the 2D simulator MEDICI is a useful tool for burnout prediction and for the evaluation of hardening solutions. (authors)

  1. In situ simulation: Taking reported critical incidents and adverse events back to the clinic

    DEFF Research Database (Denmark)

    Juul, Jonas; Paltved, Charlotte; Krogh, Kristian

    2014-01-01

    ... for content analysis [4] and thematic analysis [5]. Medical experts and simulation faculty will design scenarios for in situ simulation training based on the analysis. Short-term observations using time logs will be performed along with interviews with key informants at the departments. Video data will be collected ... improve patient safety if coupled with training and organisational support [2]. Insight into the nature of reported critical incidents and adverse events can be used in writing in situ simulation scenarios and thus lead to interventions that enhance patient safety. The patient safety literature emphasises ... well-developed non-technical skills in preventing medical errors [3]. Furthermore, critical incident and adverse event reporting systems comprise a knowledge base for gaining in-depth insights into patient safety issues. This study explores the use of critical incident and adverse event reports to inform ...

  2. Comparison of discrete event simulation tools in an academic environment

    Directory of Open Access Journals (Sweden)

    Mario Jadrić

    2014-12-01

    Full Text Available A new research model for simulation software evaluation is proposed, consisting of three main categories of criteria: the modeling and simulation capabilities of the explored tools and the tools' input/output analysis possibilities, all with respective sub-criteria. Using the presented model, two discrete event simulation tools are evaluated in detail using a task-centred scenario. Both tools (Arena and ExtendSim) were used for teaching discrete event simulation in preceding academic years. With the aim of inspecting their effectiveness and helping us determine which tool is more suitable for students, i.e. academic purposes, we used a simple simulation model of entities competing for limited resources. The main goal was to measure subjective (primarily attitude) and objective indicators while using the tools on the same simulation scenario. The subjects were first-year students of the Master studies in Information Management at the Faculty of Economics in Split taking a course in Business Process Simulations (BPS). In a controlled environment – in a computer lab – two groups of students were given detailed, step-by-step instructions for building models using both tools, first using ExtendSim then Arena or vice versa. Subjective indicators (students' attitudes) were collected using an online survey completed immediately upon building each model. Subjective indicators primarily include students' personal estimations of Arena and ExtendSim capabilities/features for model building, model simulation and result analysis. Objective indicators were measured using specialised software that logs information on the user's behavior while performing a particular task on their computer, such as the distance crossed by the mouse during model building, the number of mouse clicks, usage of the mouse wheel and the speed achieved. The results indicate that ExtendSim is clearly preferred over Arena with regard to the subjective indicators, while the objective indicators are

  3. An Event-Driven Hybrid Molecular Dynamics and Direct Simulation Monte Carlo Algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Donev, A; Garcia, A L; Alder, B J

    2007-07-30

    A novel algorithm is developed for the simulation of polymer chains suspended in a solvent. The polymers are represented as chains of hard spheres tethered by square wells and interact with the solvent particles through hard-core potentials. The algorithm uses event-driven molecular dynamics (MD) for the simulation of the polymer chain and the interactions between the chain beads and the surrounding solvent particles. The interactions between the solvent particles themselves are not treated deterministically as in event-driven algorithms; rather, the momentum and energy exchange in the solvent is determined stochastically using the Direct Simulation Monte Carlo (DSMC) method. The coupling between the solvent and the solute is consistently represented at the particle level; however, unlike full MD simulations of both the solvent and the solute, the spatial structure of the solvent is ignored. The algorithm is described in detail and applied to the study of the dynamics of a polymer chain tethered to a hard wall and subjected to uniform shear. The algorithm closely reproduces full MD simulations with two orders of magnitude greater efficiency. The results do not confirm the existence of periodic (cycling) motion of the polymer chain.
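    The core step of any event-driven hard-sphere algorithm of the kind described above is predicting the next collision time between two ballistically moving spheres. The self-contained sketch below shows only that geometric step, not the authors' hybrid MD/DSMC code.

```python
# Predict the time at which two hard spheres moving ballistically will collide,
# i.e. solve |(r1 - r2) + (v1 - v2) t| = sigma; returns None if they never touch.
import numpy as np

def collision_time(r1, r2, v1, v2, sigma):
    dr = np.asarray(r1, float) - np.asarray(r2, float)
    dv = np.asarray(v1, float) - np.asarray(v2, float)
    b = np.dot(dr, dv)
    if b >= 0.0:                       # moving apart or tangentially: no collision
        return None
    dv2 = np.dot(dv, dv)
    disc = b * b - dv2 * (np.dot(dr, dr) - sigma ** 2)
    if disc < 0.0:                     # closest approach larger than the contact distance
        return None
    return (-b - np.sqrt(disc)) / dv2

# Two unit-diameter spheres approaching head-on along x: contact after t = 2.
print(collision_time([0, 0, 0], [3, 0, 0], [1, 0, 0], [0, 0, 0], sigma=1.0))
```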

  4. Realistic modeling of chamber transport for heavy-ion fusion

    International Nuclear Information System (INIS)

    Sharp, W.M.; Grote, D.P.; Callahan, D.A.; Tabak, M.; Henestroza, E.; Yu, S.S.; Peterson, P.F.; Welch, D.R.; Rose, D.V.

    2003-01-01

    Transport of intense heavy-ion beams to an inertial-fusion target after final focus is simulated here using a realistic computer model. It is found that passing the beam through a rarefied plasma layer before it enters the fusion chamber can largely neutralize the beam space charge and lead to a usable focal spot for a range of ion species and input conditions

  5. Efficient rare-event simulation for multiple jump events in regularly varying random walks and compound Poisson processes

    NARCIS (Netherlands)

    B. Chen (Bohan); J. Blanchet; C.H. Rhee (Chang-Han); A.P. Zwart (Bert)

    2017-01-01

    We propose a class of strongly efficient rare event simulation estimators for random walks and compound Poisson processes with a regularly varying increment/jump-size distribution in a general large deviations regime. Our estimator is based on an importance sampling strategy that hinges

  6. Can discrete event simulation be of use in modelling major depression?

    Science.gov (United States)

    Le Lay, Agathe; Despiegel, Nicolas; François, Clément; Duru, Gérard

    2006-12-05

    Depression is among the major contributors to worldwide disease burden, and adequate modelling requires a framework designed to depict real-world disease progression as well as its economic implications as closely as possible. In light of the specific characteristics associated with depression (multiple episodes at varying intervals, impact of disease history on course of illness, sociodemographic factors), our aim was to clarify to what extent "Discrete Event Simulation" (DES) models provide methodological benefits in depicting disease evolution. We conducted a comprehensive review of published Markov models in depression and identified potential limits to their methodology. A model based on DES principles was developed to investigate the benefits and drawbacks of this simulation method compared with Markov modelling techniques. The major drawback to Markov models is that they may not be suitable for tracking patients' disease history properly, unless the analyst defines multiple health states, which may lead to intractable situations. They are also too rigid to take into consideration multiple patient-specific sociodemographic characteristics in a single model. To do so would also require defining multiple health states, which would render the analysis entirely too complex. We show that DES resolves these weaknesses and that its flexibility allows patients with differing attributes to move from one event to another in sequential order while simultaneously taking into account important risk factors such as age, gender, disease history and patients' attitude towards treatment, together with any disease-related events (adverse events, suicide attempts, etc.). DES modelling appears to be an accurate, flexible and comprehensive means of depicting disease progression compared with conventional simulation methodologies. Its use in analysing recurrent and chronic diseases appears particularly useful.

  7. Discrete-Event Simulation Unmasks the Quantum Cheshire Cat

    Science.gov (United States)

    Michielsen, Kristel; Lippert, Thomas; Raedt, Hans De

    2017-05-01

    It is shown that discrete-event simulation accurately reproduces the experimental data of a single-neutron interferometry experiment [T. Denkmayr et al., Nat. Commun. 5, 4492 (2014)] and provides a logically consistent, paradox-free, cause-and-effect explanation of the quantum Cheshire cat effect without invoking the notion that the neutron and its magnetic moment separate. Describing the experimental neutron data using weak-measurement theory is shown to be useless for unravelling the quantum Cheshire cat effect.

  8. Discrete event model-based simulation for train movement on a single-line railway

    International Nuclear Information System (INIS)

    Xu Xiao-Ming; Li Ke-Ping; Yang Li-Xing

    2014-01-01

    The aim of this paper is to present a discrete event model-based approach to simulate train movement with energy-saving factors taken into account. We conduct extensive case studies to show the dynamic characteristics of the traffic flow and demonstrate the effectiveness of the proposed approach. The simulation results indicate that the proposed discrete event model-based simulation approach is suitable for characterizing the movements of a group of trains on a single railway line with fewer iterations and less CPU time. Additionally, some other qualitative and quantitative characteristics are investigated. In particular, because of the cumulative influence of the preceding trains, the following trains have to be accelerated or braked frequently to control the headway distance, leading to more energy consumption. (general)

  9. Electromagnetic field effect simulation over a realistic pixelated phantom of the human brain

    Energy Technology Data Exchange (ETDEWEB)

    Rojas, R.; Calderon, J. A.; Rivera, T. [IPN, Centro de Investigacion en Ciencia Aplicada y Tecnologia Avanzada, Calz. Legaria No. 694, Col. Irrigacion, 11500 Mexico D. F. (Mexico); Azorin, J., E-mail: rafaelturing@prodigy.net.mx [Universidad Autonoma Metropolitana, Unidad Iztapalapa, Av. San Rafael Atlixco 186, Col. Vicentina, 09340 Mexico D. F. (Mexico)

    2012-10-15

    Exposure to different types of electromagnetic radiation can produce damage and injuries to human tissues. Scientists spend time and resources studying the effects of electromagnetic fields on the organs. Particularly in medical areas, specialists in imaging methodologies and radiological treatment are very concerned about not injuring their patients. Determining radiation-matter interaction, whether experimentally or theoretically, is not an easy task. In the first case, it is not possible to make measurements inside the patient, so the experimental procedure consists of measurements on a human dummy; however, it is not possible to observe the deformation of the electromagnetic fields due to the presence of the organs. In the second case, it is necessary to solve Maxwell's equations for an electromagnetic field crossing many organs and tissues, each with different electric and magnetic properties. One alternative for a theoretical solution is computational simulation; however, this option requires an enormous amount of memory and long computation times. Consequently, most simulations are performed in two or three dimensions using approximate human models built from basic geometric figures such as spheres, cylinders, and ellipsoids. Obviously, such models only yield a coarse solution of the actual situation. In this work, we propose a novel methodology to build a realistic pixelated phantom of human organs and to solve Maxwell's equations over these models; the solutions are evidently closer to the real behaviour. Additionally, these models are optimized when they are discretized and the finite element method is used to calculate the electromagnetic field and the induced currents. (Author)
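    As a much-simplified illustration of the kind of computation involved, the sketch below performs a quasi-static finite-difference solve of div(sigma grad V) = 0 on a 2D pixelated conductivity map and estimates the induced current density. This is not the finite element, full-Maxwell approach proposed by the authors; the geometry, conductivities, and boundary conditions are invented.

```python
# Quasi-static sketch: potential and induced current on a 2D pixelated "phantom".
import numpy as np

n = 60
sigma = np.full((n, n), 0.3)                # background "tissue" conductivity (placeholder units)
yy, xx = np.mgrid[0:n, 0:n]
sigma[(xx - n // 2) ** 2 + (yy - n // 2) ** 2 < (n // 5) ** 2] = 1.5  # a more conductive "organ"

# Face conductivities between neighbouring pixels (simple arithmetic means).
sE = 0.5 * (sigma[1:-1, 1:-1] + sigma[1:-1, 2:])
sW = 0.5 * (sigma[1:-1, 1:-1] + sigma[1:-1, :-2])
sN = 0.5 * (sigma[1:-1, 1:-1] + sigma[:-2, 1:-1])
sS = 0.5 * (sigma[1:-1, 1:-1] + sigma[2:, 1:-1])

V = np.zeros((n, n))
V[:, 0], V[:, -1] = 1.0, 0.0                # applied potential on left/right edges;
                                            # top/bottom rows stay at 0 (grounded frame)
for _ in range(5000):                       # fixed number of Jacobi sweeps, for brevity
    V[1:-1, 1:-1] = (sE * V[1:-1, 2:] + sW * V[1:-1, :-2]
                     + sN * V[:-2, 1:-1] + sS * V[2:, 1:-1]) / (sE + sW + sN + sS)

Jx = -sigma[:, :-1] * np.diff(V, axis=1)    # x component of induced current density (arbitrary units)
print("peak |Jx| inside the phantom:", float(np.abs(Jx).max()))
```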

  10. Interferences and events on epistemic shifts in physics through computer simulations

    CERN Document Server

    Warnke, Martin

    2017-01-01

    Computer simulations are omnipresent media in today's knowledge production. For scientific endeavors such as the detection of gravitational waves and the exploration of subatomic worlds, simulations are essential; however, the epistemic status of computer simulations is rather controversial as they are neither just theory nor just experiment. Therefore, computer simulations have challenged well-established insights and common scientific practices as well as our very understanding of knowledge. This volume contributes to the ongoing discussion on the epistemic position of computer simulations in a variety of physical disciplines, such as quantum optics, quantum mechanics, and computational physics. Originating from an interdisciplinary event, it shows that accounts of contemporary physics can constructively interfere with media theory, philosophy, and the history of science.

  11. Realistic Vendor-Specific Synthetic Ultrasound Data for Quality Assurance of 2-D Speckle Tracking Echocardiography: Simulation Pipeline and Open Access Database.

    Science.gov (United States)

    Alessandrini, Martino; Chakraborty, Bidisha; Heyde, Brecht; Bernard, Olivier; De Craene, Mathieu; Sermesant, Maxime; D'Hooge, Jan

    2018-03-01

    Two-dimensional (2-D) echocardiography is the modality of choice in the clinic for the diagnosis of cardiac disease. Hereto, speckle tracking (ST) packages complement visual assessment by the cardiologist by providing quantitative diagnostic markers of global and regional cardiac function (e.g., displacement, strain, and strain-rate). Yet, the reported high vendor-dependence between the outputs of different ST packages raises clinical concern and hampers the widespread dissemination of the ST technology. In part, this is due to the lack of a solid commonly accepted quality assurance pipeline for ST packages. Recently, we have developed a framework to benchmark ST algorithms for 3-D echocardiography by using realistic simulated volumetric echocardiographic recordings. Yet, 3-D echocardiography remains an emerging technology, whereas the compelling clinical concern is, so far, directed to the standardization of 2-D ST only. Therefore, by building upon our previous work, we present in this paper a pipeline to generate realistic synthetic sequences for 2-D ST algorithms. Hereto, the synthetic cardiac motion is obtained from a complex electromechanical heart model, whereas realistic vendor-specific texture is obtained by sampling a real clinical ultrasound recording. By modifying the parameters in our pipeline, we generated an open-access library of 105 synthetic sequences encompassing: 1) healthy and ischemic motion patterns; 2) the most common apical probe orientations; and 3) vendor-specific image quality from seven different systems. Ground truth deformation is also provided to allow performance analysis. The application of the provided data set is also demonstrated in the benchmarking of a recent academic ST algorithm.

  12. Simulating single-event burnout of n-channel power MOSFET's

    International Nuclear Information System (INIS)

    Johnson, G.H.; Hohl, J.H.; Schrimpf, R.D.; Galloway, K.F.

    1993-01-01

    Heavy ions are ubiquitous in a space environment. Single-event burnout of power MOSFET's is a sudden catastrophic failure mechanism that is initiated by the passage of a heavy ion through the device structure. The passage of the heavy ion generates a current filament that locally turns on a parasitic n-p-n transistor inherent to the power MOSFET. Subsequent high currents and high voltage in the device induce second breakdown of the parasitic bipolar transistor and hence meltdown of the device. This paper presents a model that can be used for simulating the burnout mechanism in order to gain insight into the significant device parameters that most influence the single-event burnout susceptibility of n-channel power MOSFET's

  13. Combining NMR ensembles and molecular dynamics simulations provides more realistic models of protein structures in solution and leads to better chemical shift prediction

    International Nuclear Information System (INIS)

    Lehtivarjo, Juuso; Tuppurainen, Kari; Hassinen, Tommi; Laatikainen, Reino; Peräkylä, Mikael

    2012-01-01

    While chemical shifts are invaluable for obtaining structural information from proteins, they also offer one of the rare ways to obtain information about protein dynamics. A necessary tool in transforming chemical shifts into structural and dynamic information is chemical shift prediction. In our previous work we developed a method for 4D prediction of protein 1H chemical shifts in which molecular motions, the 4th dimension, were modeled using molecular dynamics (MD) simulations. Although the approach clearly improved the prediction, the X-ray structures and single NMR conformers used in the model cannot be considered fully realistic models of protein in solution. In this work, NMR ensembles (NMRE) were used to expand the conformational space of proteins (e.g. side chains, flexible loops, termini), followed by MD simulations for each conformer to map the local fluctuations. Compared with the non-dynamic model, the NMRE+MD model gave 6–17% lower root-mean-square (RMS) errors for different backbone nuclei. The improved prediction indicates that NMR ensembles with MD simulations can be used to obtain a more realistic picture of protein structures in solution and moreover underlines the importance of short and long time-scale dynamics for the prediction. The RMS errors of the NMRE+MD model were 0.24, 0.43, 0.98, 1.03, 1.16 and 2.39 ppm for 1Hα, 1HN, 13Cα, 13Cβ, 13CO and backbone 15N chemical shifts, respectively. The model is implemented in the prediction program 4DSPOT, available at http://www.uef.fi/4dspot.

  14. Combining NMR ensembles and molecular dynamics simulations provides more realistic models of protein structures in solution and leads to better chemical shift prediction

    Energy Technology Data Exchange (ETDEWEB)

    Lehtivarjo, Juuso, E-mail: juuso.lehtivarjo@uef.fi; Tuppurainen, Kari; Hassinen, Tommi; Laatikainen, Reino [University of Eastern Finland, School of Pharmacy (Finland); Peraekylae, Mikael [University of Eastern Finland, Institute of Biomedicine (Finland)

    2012-03-15

    While chemical shifts are invaluable for obtaining structural information from proteins, they also offer one of the rare ways to obtain information about protein dynamics. A necessary tool in transforming chemical shifts into structural and dynamic information is chemical shift prediction. In our previous work we developed a method for 4D prediction of protein 1H chemical shifts in which molecular motions, the 4th dimension, were modeled using molecular dynamics (MD) simulations. Although the approach clearly improved the prediction, the X-ray structures and single NMR conformers used in the model cannot be considered fully realistic models of protein in solution. In this work, NMR ensembles (NMRE) were used to expand the conformational space of proteins (e.g. side chains, flexible loops, termini), followed by MD simulations for each conformer to map the local fluctuations. Compared with the non-dynamic model, the NMRE+MD model gave 6-17% lower root-mean-square (RMS) errors for different backbone nuclei. The improved prediction indicates that NMR ensembles with MD simulations can be used to obtain a more realistic picture of protein structures in solution and moreover underlines the importance of short and long time-scale dynamics for the prediction. The RMS errors of the NMRE+MD model were 0.24, 0.43, 0.98, 1.03, 1.16 and 2.39 ppm for 1Hα, 1HN, 13Cα, 13Cβ, 13CO and backbone 15N chemical shifts, respectively. The model is implemented in the prediction program 4DSPOT, available at http://www.uef.fi/4dspot.

  15. Electron percolation in realistic models of carbon nanotube networks

    International Nuclear Information System (INIS)

    Simoneau, Louis-Philippe; Villeneuve, Jérémie; Rochefort, Alain

    2015-01-01

    The influence of penetrable and curved carbon nanotubes (CNT) on the charge percolation in three-dimensional disordered CNT networks has been studied with Monte-Carlo simulations. By considering carbon nanotubes as solid objects whose electron-cloud overlap can be controlled, we observed that the structural characteristics of networks containing lower aspect ratio CNT are highly sensitive to the degree of penetration between crossed nanotubes. Following our efficient strategy to displace CNT to different positions to create more realistic statistical models, we conclude that the connectivity between objects increases with the hard-core/soft-shell radii ratio. In contrast, the presence of curved CNT in the random networks leads to an increasing percolation threshold and to a decreasing electrical conductivity at saturation. The waviness of CNT decreases the effective distance between the nanotube extremities, hence reducing their connectivity and degrading their electrical properties. We present the results of our simulation in terms of thickness of the CNT network, from which simple structural parameters such as the volume fraction or the carbon nanotube density can be accurately evaluated with our more realistic models
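    A reduced two-dimensional analogue of such a Monte-Carlo percolation study can be sketched with straight, widthless sticks in a unit square instead of the authors' three-dimensional soft-shell nanotubes: sticks are clustered by segment intersection using union-find and a left-to-right spanning cluster is tested for. Stick length, densities, and trial counts are arbitrary illustration values.

```python
# 2D stick-percolation sketch (simplified analogue, not the paper's 3D model).
import math
import random

def find(parent, i):
    """Union-find root lookup with path halving."""
    while parent[i] != i:
        parent[i] = parent[parent[i]]
        i = parent[i]
    return i

def union(parent, a, b):
    parent[find(parent, a)] = find(parent, b)

def crossed(p1, p2, q1, q2):
    """True if segments p1-p2 and q1-q2 intersect (general position)."""
    def orient(a, b, c):
        return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])
    return (orient(p1, p2, q1) * orient(p1, p2, q2) < 0
            and orient(q1, q2, p1) * orient(q1, q2, p2) < 0)

def has_spanning_cluster(n_sticks, length=0.15):
    sticks = []
    for _ in range(n_sticks):
        x, y, th = random.random(), random.random(), random.uniform(0.0, math.pi)
        dx, dy = 0.5 * length * math.cos(th), 0.5 * length * math.sin(th)
        sticks.append(((x - dx, y - dy), (x + dx, y + dy)))
    parent = list(range(n_sticks))
    for i in range(n_sticks):
        for j in range(i + 1, n_sticks):
            if crossed(*sticks[i], *sticks[j]):
                union(parent, i, j)
    # Clusters touching the left (x < 0) and right (x > 1) edges of the unit square.
    left = {find(parent, i) for i, (a, b) in enumerate(sticks) if min(a[0], b[0]) < 0.0}
    right = {find(parent, i) for i, (a, b) in enumerate(sticks) if max(a[0], b[0]) > 1.0}
    return bool(left & right)

random.seed(0)
for n in (150, 300, 450):
    frac = sum(has_spanning_cluster(n) for _ in range(10)) / 10
    print(f"{n} sticks: spanning fraction = {frac:.1f}")
```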

  16. Electron percolation in realistic models of carbon nanotube networks

    Science.gov (United States)

    Simoneau, Louis-Philippe; Villeneuve, Jérémie; Rochefort, Alain

    2015-09-01

    The influence of penetrable and curved carbon nanotubes (CNT) on the charge percolation in three-dimensional disordered CNT networks has been studied with Monte-Carlo simulations. By considering carbon nanotubes as solid objects whose electron-cloud overlap can be controlled, we observed that the structural characteristics of networks containing lower aspect ratio CNT are highly sensitive to the degree of penetration between crossed nanotubes. Following our efficient strategy to displace CNT to different positions to create more realistic statistical models, we conclude that the connectivity between objects increases with the hard-core/soft-shell radii ratio. In contrast, the presence of curved CNT in the random networks leads to an increasing percolation threshold and to a decreasing electrical conductivity at saturation. The waviness of CNT decreases the effective distance between the nanotube extremities, hence reducing their connectivity and degrading their electrical properties. We present the results of our simulation in terms of thickness of the CNT network, from which simple structural parameters such as the volume fraction or the carbon nanotube density can be accurately evaluated with our more realistic models.

  17. Modeling the dynamics of a storm-time acceleration event: combining MHD effects with wave-particle interactions

    Science.gov (United States)

    Elkington, S. R.; Alam, S. S.; Chan, A. A.; Albert, J.; Jaynes, A. N.; Baker, D. N.; Wiltberger, M. J.

    2017-12-01

    Global simulations of radiation belt dynamics are often undertaken using either a transport formalism (e.g. Fokker-Planck), or via test particle simulations in model electric and magnetic fields. While transport formalisms offer computational efficiency and the ability to deal with a wide range of wave-particle interactions, they typically rely on simplified background fields, and often are limited to empirically specified stochastic (diffusive) wave-particle interactions. On the other hand, test particle simulations may be carried out in global MHD simulations that include realistic physical effects such as magnetopause shadowing, convection, and substorm injections, but lack the ability to handle physics outside the MHD approximation in the realm of higher frequency (kHz) wave populations. In this work we introduce a comprehensive simulation framework combining global MHD/test particle techniques to provide realistic background fields and radial transport processes, with a Stochastic Differential Equation (SDE) method for addressing high frequency wave-particle interactions. We examine the March 17, 2013 storm-time acceleration period, an NSF-GEM focus challenge event, and use the framework to examine the relative importance of physical effects such as magnetopause shadowing, diffusive and advective transport processes, and wave-particle interactions through the various phases of the storm.
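    An SDE method of the kind mentioned above advances an ensemble of test particles with an Euler-Maruyama step. The sketch below shows only that machinery; the drift A(L) and diffusion amplitude B(L) are placeholders, whereas in an actual radiation-belt solver they would be derived from the diffusion coefficients and phase-space Jacobian of the underlying Fokker-Planck equation.

```python
# Bare-bones Euler-Maruyama integration of dL = A(L) dt + B(L) dW for an ensemble
# of test electrons; coefficients and units are invented placeholders.
import numpy as np

rng = np.random.default_rng(0)

def A(L):                      # placeholder drift term
    return 0.05 * (5.0 - L)

def B(L):                      # placeholder diffusion amplitude, of order sqrt(2 D_LL)
    return 0.2 * (L / 5.0) ** 3

L = np.full(20000, 4.5)        # ensemble starts at L = 4.5
dt, n_steps = 0.01, 2000       # time step and number of steps (placeholder units)
for _ in range(n_steps):
    dW = rng.normal(0.0, np.sqrt(dt), L.size)
    L = L + A(L) * dt + B(L) * dW
    L = np.clip(L, 1.1, 10.0)  # crude inner/outer boundaries

hist, edges = np.histogram(L, bins=30, range=(1.0, 10.0))
print("final L distribution (counts per bin):", hist)
```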

  18. A Green's function method for simulation of time-dependent solute transport and reaction in realistic microvascular geometries.

    Science.gov (United States)

    Secomb, Timothy W

    2016-12-01

    A novel theoretical method is presented for simulating the spatially resolved convective and diffusive transport of reacting solutes between microvascular networks and the surrounding tissues. The method allows for efficient computational solution of problems involving convection and non-linear binding of solutes in blood flowing through microvascular networks with realistic 3D geometries, coupled with transvascular exchange and diffusion and reaction in the surrounding tissue space. The method is based on a Green's function approach, in which the solute concentration distribution in the tissue is expressed as a sum of fields generated by time-varying distributions of discrete sources and sinks. As an example of the application of the method, the washout of an inert diffusible tracer substance from a tissue region perfused by a network of microvessels is simulated, showing its dependence on the solute's transvascular permeability and tissue diffusivity. Exponential decay of the washout concentration is predicted, with rate constants that are about 10-30% lower than the rate constants for a tissue cylinder model with the same vessel length, vessel surface area and blood flow rate per tissue volume. © The authors 2015. Published by Oxford University Press on behalf of the Institute of Mathematics and its Applications. All rights reserved.
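    The exponential washout decay predicted above can be characterized by fitting a rate constant to a concentration-time curve. A hedged illustration using scipy.optimize.curve_fit on synthetic (invented) data follows.

```python
# Fit an exponential washout curve c(t) = c0 * exp(-k t) to synthetic data.
import numpy as np
from scipy.optimize import curve_fit

def washout(t, c0, k):
    return c0 * np.exp(-k * t)

t = np.linspace(0.0, 60.0, 31)                 # time points in seconds (placeholder)
true_k = 0.08
noise = 1 + 0.02 * np.random.default_rng(1).normal(size=t.size)
c = washout(t, 1.0, true_k) * noise            # synthetic washout concentrations

(c0_fit, k_fit), _ = curve_fit(washout, t, c, p0=(1.0, 0.05))
print(f"fitted rate constant k = {k_fit:.3f} per s (true value {true_k})")
```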

  19. Simulation and event reconstruction inside the PandaRoot framework

    International Nuclear Information System (INIS)

    Spataro, S

    2008-01-01

    The PANDA detector will be located at the future GSI accelerator FAIR. Its primary objective is the investigation of the strong interaction with anti-proton beams with momenta of the incoming anti-proton of up to 15 GeV/c. The PANDA offline simulation framework is called 'PandaRoot', as it is based upon the ROOT 5.14 package. It is characterized by high versatility: it allows one to perform simulation and analysis, to run different event generators (EvtGen, Pluto, UrQmd) and different transport models (Geant3, Geant4, Fluka) with the same code, and thus to compare the results simply by changing a few macro lines without recompiling at all. Moreover, auto-configuration scripts allow the full framework to be installed easily on different Linux distributions and with different compilers (the framework was installed and tested on more than 10 Linux platforms) without further manipulation. The final data are in a tree format, easily accessible and readable through simple clicks in the ROOT browsers. The presentation reports on the current status of the computing development inside the PandaRoot framework, in terms of detector implementation and event reconstruction.

  20. Role-playing for more realistic technical skills training.

    Science.gov (United States)

    Nikendei, C; Zeuch, A; Dieckmann, P; Roth, C; Schäfer, S; Völkl, M; Schellberg, D; Herzog, W; Jünger, J

    2005-03-01

    Clinical skills are an important and necessary part of clinical competence. Simulation plays an important role in many fields of medical education. Although role-playing is common in communication training, there are no reports about the use of student role-plays in the training of technical clinical skills. This article describes an educational intervention with analysis of pre- and post-intervention self-selected student survey evaluations. After one term of skills training, a thorough evaluation showed that the skills-lab training did not seem very realistic nor was it very demanding for trainees. To create a more realistic training situation and to enhance students' involvement, case studies and role-plays with defined roles for students (i.e. intern, senior consultant) were introduced into half of the sessions. Results of the evaluation in the second term showed that sessions with role-playing were rated significantly higher than sessions without role-playing.

  1. Can discrete event simulation be of use in modelling major depression?

    Directory of Open Access Journals (Sweden)

    François Clément

    2006-12-01

    Full Text Available Abstract. Background: Depression is among the major contributors to worldwide disease burden, and adequate modelling requires a framework designed to depict real-world disease progression as well as its economic implications as closely as possible. Objectives: In light of the specific characteristics associated with depression (multiple episodes at varying intervals, impact of disease history on course of illness, sociodemographic factors), our aim was to clarify to what extent "Discrete Event Simulation" (DES) models provide methodological benefits in depicting disease evolution. Methods: We conducted a comprehensive review of published Markov models in depression and identified potential limits to their methodology. A model based on DES principles was developed to investigate the benefits and drawbacks of this simulation method compared with Markov modelling techniques. Results: The major drawback to Markov models is that they may not be suitable for tracking patients' disease history properly, unless the analyst defines multiple health states, which may lead to intractable situations. They are also too rigid to take into consideration multiple patient-specific sociodemographic characteristics in a single model. To do so would also require defining multiple health states, which would render the analysis entirely too complex. We show that DES resolves these weaknesses and that its flexibility allows patients with differing attributes to move from one event to another in sequential order while simultaneously taking into account important risk factors such as age, gender, disease history and patients' attitude towards treatment, together with any disease-related events (adverse events, suicide attempts, etc.). Conclusion: DES modelling appears to be an accurate, flexible and comprehensive means of depicting disease progression compared with conventional simulation methodologies. Its use in analysing recurrent and chronic diseases appears particularly useful compared with Markov processes.

  2. DeMO: An Ontology for Discrete-event Modeling and Simulation

    Science.gov (United States)

    Silver, Gregory A; Miller, John A; Hybinette, Maria; Baramidze, Gregory; York, William S

    2011-01-01

    Several fields have created ontologies for their subdomains. For example, the biological sciences have developed extensive ontologies such as the Gene Ontology, which is considered a great success. Ontologies could provide similar advantages to the Modeling and Simulation community. They provide a way to establish common vocabularies and capture knowledge about a particular domain with community-wide agreement. Ontologies can support significantly improved (semantic) search and browsing, integration of heterogeneous information sources, and improved knowledge discovery capabilities. This paper discusses the design and development of an ontology for Modeling and Simulation called the Discrete-event Modeling Ontology (DeMO), and it presents prototype applications that demonstrate various uses and benefits that such an ontology may provide to the Modeling and Simulation community. PMID:22919114

  3. Realistic terrain visualization based on 3D virtual world technology

    Science.gov (United States)

    Huang, Fengru; Lin, Hui; Chen, Bin; Xiao, Cai

    2010-11-01

    The rapid advances in information technologies, e.g., network, graphics processing, and virtual world, have provided challenges and opportunities for new capabilities in information systems, Internet applications, and virtual geographic environments, especially geographic visualization and collaboration. In order to achieve meaningful geographic capabilities, we need to explore and understand how these technologies can be used to construct virtual geographic environments to help to engage geographic research. The generation of three-dimensional (3D) terrain plays an important part in geographical visualization, computer simulation, and virtual geographic environment applications. The paper introduces concepts and technologies of virtual worlds and virtual geographic environments, explores integration of realistic terrain and other geographic objects and phenomena of natural geographic environment based on SL/OpenSim virtual world technologies. Realistic 3D terrain visualization is a foundation of construction of a mirror world or a sand box model of the earth landscape and geographic environment. The capabilities of interaction and collaboration on geographic information are discussed as well. Further virtual geographic applications can be developed based on the foundation work of realistic terrain visualization in virtual environments.

  4. Asymmetric focusing study from twin input power couplers using realistic rf cavity field maps

    Directory of Open Access Journals (Sweden)

    Colwyn Gulliford

    2011-03-01

    Full Text Available Advanced simulation codes now exist that can self-consistently solve Maxwell’s equations for the combined system of an rf cavity and a beam bunch. While these simulations are important for a complete understanding of the beam dynamics in rf cavities, they require significant time and computing power. These techniques are therefore not readily included in real time simulations useful to the beam physicist during beam operations. Thus, there exists a need for a simplified algorithm which simulates realistic cavity fields significantly faster than self-consistent codes, while still incorporating enough of the necessary physics to ensure accurate beam dynamics computation. To this end, we establish a procedure for producing realistic field maps using lossless cavity eigenmode field solvers. This algorithm incorporates all relevant cavity design and operating parameters, including beam loading from a nonrelativistic beam. The algorithm is then used to investigate the asymmetric quadrupolelike focusing produced by the input couplers of the Cornell ERL injector cavity for a variety of beam and operating parameters.

  5. Numerical simulations of an advection fog event over Shanghai Pudong International Airport with the WRF model

    Science.gov (United States)

    Lin, Caiyan; Zhang, Zhongfeng; Pu, Zhaoxia; Wang, Fengyun

    2017-10-01

    A series of numerical simulations is conducted to understand the formation, evolution, and dissipation of an advection fog event over Shanghai Pudong International Airport (ZSPD) with the Weather Research and Forecasting (WRF) model. Using the current operational settings at the Meteorological Center of East China Air Traffic Management Bureau, the WRF model successfully predicts the fog event at ZSPD. Additional numerical experiments are performed to examine the physical processes associated with the fog event. The results indicate that prediction of this particular fog event is sensitive to microphysical schemes for the time of fog dissipation but not for the time of fog onset. The simulated timing of the arrival and dissipation of the fog, as well as the cloud distribution, is substantially sensitive to the planetary boundary layer and radiation (both longwave and shortwave) processes. Moreover, varying forecast lead times also produces different simulation results for the fog event regarding its onset and duration, suggesting a trade-off between more accurate initial conditions and a proper forecast lead time that allows model physical processes to spin up adequately during the fog simulation. The overall outcomes from this study imply that the complexity of physical processes and their interactions within the WRF model during fog evolution and dissipation is a key area of future research.

  6. Discrete-Event Simulation with Agents for Modeling of Dynamic Asymmetric Threats in Maritime Security

    National Research Council Canada - National Science Library

    Ng, Chee W

    2007-01-01

    .... Discrete-event simulation (DES) was used to simulate a typical port-security, local, waterside-threat response model and to test the adaptive response of asymmetric threats in reaction to port-security procedures, while a multi-agent system (MAS...

  7. Quality Improvement With Discrete Event Simulation: A Primer for Radiologists.

    Science.gov (United States)

    Booker, Michael T; O'Connell, Ryan J; Desai, Bhushan; Duddalwar, Vinay A

    2016-04-01

    The application of simulation software in health care has transformed quality and process improvement. Specifically, software based on discrete-event simulation (DES) has shown the ability to improve radiology workflows and systems. Nevertheless, despite the successful application of DES in the medical literature, the power and value of simulation remains underutilized. For this reason, the basics of DES modeling are introduced, with specific attention to medical imaging. In an effort to provide readers with the tools necessary to begin their own DES analyses, the practical steps of choosing a software package and building a basic radiology model are discussed. In addition, three radiology system examples are presented, with accompanying DES models that assist in analysis and decision making. Through these simulations, we provide readers with an understanding of the theory, requirements, and benefits of implementing DES in their own radiology practices. Copyright © 2016 American College of Radiology. All rights reserved.

  8. Nuclear power plant simulation on the AD10

    International Nuclear Information System (INIS)

    Wulff, W.; Cheng, H.S.; Mallen, A.N.; Stritar, A.

    1985-01-01

    A combination of advanced modeling techniques and the modern, special-purpose peripheral minicomputer AD10 is presented which affords realistic predictions of plant transient and severe off-normal events in LWR power plants through on-line simulations at a speed ten times greater than actual process speeds. Results are shown for a BWR plant simulation. The mathematical models account for nonequilibrium, nonhomogeneous two-phase flow effects in the coolant, for acoustical effects in the steam line and for the dynamics of the recirculation loop and feedwater train. Point kinetics incorporate reactivity feedback for void fraction, for fuel temperature, for coolant temperature, and for boron concentration. Control systems and trip logic are simulated for the nuclear steam supply system. 4 refs., 3 figs
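
    The point-kinetics-with-feedback part of such a plant model can be sketched in a few lines. The following is a minimal illustration, not the AD10/BWR model itself: one delayed-neutron group, explicit Euler integration, and an assumed fuel-temperature (Doppler) feedback; all constants are illustrative assumptions.

```python
# Minimal point-kinetics sketch: one delayed-neutron group plus a crude
# fuel-temperature (Doppler) feedback.  All constants are illustrative
# assumptions, not parameters of the AD10/BWR plant model.
beta, Lam, lam = 0.0065, 1e-4, 0.08      # delayed fraction, generation time (s), precursor decay (1/s)
alpha_T = -2e-5                          # assumed temperature reactivity coefficient (1/K)
heat_gain, cool = 5.0, 0.1               # assumed fuel heating/cooling constants

def simulate(rho_ext, t_end=20.0, dt=1e-3):
    n, C, T = 1.0, beta / (Lam * lam), 0.0        # start at equilibrium, T relative to nominal
    for _ in range(int(t_end / dt)):
        rho = rho_ext + alpha_T * T               # net reactivity = insertion + feedback
        dn = ((rho - beta) / Lam) * n + lam * C
        dC = (beta / Lam) * n - lam * C
        dT = heat_gain * (n - 1.0) - cool * T
        n, C, T = n + dt * dn, C + dt * dC, T + dt * dT
    return n, T

n_end, T_end = simulate(rho_ext=0.001)            # 100 pcm step insertion
print(f"relative power after 20 s: {n_end:.2f}, fuel temperature rise: {T_end:.1f} K")
```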

  9. Simulating large-scale pedestrian movement using CA and event driven model: Methodology and case study

    Science.gov (United States)

    Li, Jun; Fu, Siyao; He, Haibo; Jia, Hongfei; Li, Yanzhong; Guo, Yi

    2015-11-01

    Large-scale regional evacuation is an important part of national security emergency response plans. As a typical service system, the large commercial shopping area is a topic of active research in emergency evacuation. A systematic methodology based on Cellular Automata with a Dynamic Floor Field combined with an event-driven model has been proposed and examined within the context of a case study involving evacuation from a commercial shopping mall. Pedestrian walking is modeled with the Cellular Automata, while the event-driven model is adopted to simulate pedestrian movement patterns; the simulation process is divided into a normal situation and an emergency evacuation. The model is composed of four layers: an environment layer, a customer layer, a clerk layer and a trajectory layer. For simulating the movement routes of pedestrians, the model takes into account customers' purchase intentions and pedestrian density. Based on the combined Cellular Automata with Dynamic Floor Field and event-driven model, the behavior characteristics of customers and clerks can be reflected in both normal and emergency evacuation situations. The distribution of individual evacuation times as a function of initial positions and the dynamics of the evacuation process are studied. Our results indicate that an evacuation model combining Cellular Automata with a Dynamic Floor Field and event-driven scheduling can be used to simulate the evacuation of pedestrian flows in indoor areas with complicated surroundings and to investigate the layout of shopping malls.
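
    As an illustration of the floor-field idea, the sketch below implements a minimal static-floor-field CA on an assumed rectangular room with one exit: each pedestrian hops to the free neighbouring cell with the smallest distance-to-exit value. The geometry, agent count and update rule are simplified assumptions, not the four-layer shopping-mall model of the paper.

```python
import numpy as np

# Minimal static-floor-field CA: pedestrians hop each step to the free
# neighbouring cell with the smallest distance-to-exit value.
H, W = 15, 20
exit_cell = (7, 0)                                    # exit on the left wall
yy, xx = np.mgrid[0:H, 0:W]
floor = np.abs(yy - exit_cell[0]) + np.abs(xx - exit_cell[1])   # Manhattan distance field

rng = np.random.default_rng(0)
occupied = set()
while len(occupied) < 40:                             # scatter 40 pedestrians
    occupied.add((int(rng.integers(H)), int(rng.integers(W))))

def step(occupied):
    new = set()
    for (r, c) in sorted(occupied, key=lambda p: floor[p]):   # agents closest to the exit move first
        if (r, c) == exit_cell:
            continue                                          # agent leaves the room
        cands = [(r + dr, c + dc) for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))
                 if 0 <= r + dr < H and 0 <= c + dc < W]
        cands = [p for p in cands if p not in new and p not in occupied]
        best = min(cands + [(r, c)], key=lambda p: floor[p])
        new.add(best if floor[best] < floor[r, c] else (r, c))   # move only strictly downhill
    return new

t = 0
while occupied:
    occupied = step(occupied)
    t += 1
print("room evacuated after", t, "time steps")
```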

  10. Monte Carlo simulations in small animal PET imaging

    Energy Technology Data Exchange (ETDEWEB)

    Branco, Susana [Universidade de Lisboa, Faculdade de Ciencias, Instituto de Biofisica e Engenharia Biomedica, Lisbon (Portugal)], E-mail: susana.silva@fc.ul.pt; Jan, Sebastien [Service Hospitalier Frederic Joliot, CEA/DSV/DRM, Orsay (France); Almeida, Pedro [Universidade de Lisboa, Faculdade de Ciencias, Instituto de Biofisica e Engenharia Biomedica, Lisbon (Portugal)

    2007-10-01

    This work is based on the use of an implemented Positron Emission Tomography (PET) simulation system dedicated for small animal PET imaging. Geant4 Application for Tomographic Emission (GATE), a Monte Carlo simulation platform based on the Geant4 libraries, is well suited for modeling the microPET FOCUS system and to implement realistic phantoms, such as the MOBY phantom, and data maps from real examinations. The use of a microPET FOCUS simulation model with GATE has been validated for spatial resolution, counting rates performances, imaging contrast recovery and quantitative analysis. Results from realistic studies of the mouse body using 18F- and [18F]FDG imaging protocols are presented. These simulations include the injection of realistic doses into the animal and realistic time framing. The results have shown that it is possible to simulate small animal PET acquisitions under realistic conditions, and are expected to be useful to improve the quantitative analysis in PET mouse body studies.

  11. Simulating Flaring Events via an Intelligent Cellular Automata Mechanism

    Science.gov (United States)

    Dimitropoulou, M.; Vlahos, L.; Isliker, H.; Georgoulis, M.

    2010-07-01

    We simulate flaring events through a Cellular Automaton (CA) model, in which, for the first time, we use observed vector magnetograms as initial conditions. After non-linear force free extrapolation of the magnetic field from the vector magnetograms, we identify magnetic discontinuities, using two alternative criteria: (1) the average magnetic field gradient, or (2) the normalized magnetic field curl (i.e. the current). Magnetic discontinuities are identified at the grid-sites where the magnetic field gradient or curl exceeds a specified threshold. We then relax the magnetic discontinuities according to the rules of Lu and Hamilton (1991) or Lu et al. (1993), i.e. we redistribute the magnetic field locally so that the discontinuities disappear. In order to simulate the flaring events, we consider several alternative scenarios with regard to: (1) The threshold above which magnetic discontinuities are identified (applying low, high, and height-dependent threshold values); (2) The driving process that occasionally causes new discontinuities (at randomly chosen grid sites, magnetic field increments are added that are perpendicular (or possibly also parallel) to the existing magnetic field). We address the question whether the coronal active region magnetic fields can indeed be considered to be in a state of self-organized criticality (SOC).
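
    A stripped-down version of this kind of CA can be sketched as follows: a scalar field on a 2-D grid is driven by small random increments, and cells whose deviation from the neighbour average exceeds a threshold relax by redistributing the excess (Lu & Hamilton-style rules), so that relaxations cascade into avalanches ("flares"). Grid size, threshold and driving amplitude are illustrative assumptions; the paper's model uses extrapolated 3-D vector magnetograms instead.

```python
import numpy as np

# Minimal 2-D Lu & Hamilton-type cellular automaton (scalar field, assumed
# parameters): cells whose deviation from the neighbour average exceeds a
# threshold relax by redistributing the excess; relaxations can cascade.
rng = np.random.default_rng(1)
N, Bc = 32, 3.0                          # grid size and instability threshold (assumed)
B = np.zeros((N, N))

def relax_once(B):
    """Relax all currently unstable interior cells; return how many relaxed."""
    dB = B[1:-1, 1:-1] - 0.25 * (B[:-2, 1:-1] + B[2:, 1:-1] + B[1:-1, :-2] + B[1:-1, 2:])
    unstable = np.abs(dB) > Bc
    if not unstable.any():
        return 0
    excess = np.where(unstable, dB, 0.0)
    give = 0.2 * excess                  # each of the 4 neighbours receives 1/5 of the excess
    B[1:-1, 1:-1] -= 0.8 * excess        # the unstable cell sheds 4/5 of it
    B[:-2, 1:-1] += give
    B[2:, 1:-1] += give
    B[1:-1, :-2] += give
    B[1:-1, 2:] += give
    return int(unstable.sum())

sizes = []
for _ in range(50_000):
    i, j = rng.integers(1, N - 1, size=2)
    B[i, j] += rng.uniform(0.0, 0.5)                     # slow random driving
    size = 0
    while True:
        relaxed = relax_once(B)
        B[0, :] = B[-1, :] = B[:, 0] = B[:, -1] = 0.0    # open boundaries lose field
        if relaxed == 0:
            break
        size += relaxed
    if size:
        sizes.append(size)                               # avalanche ("flare") size

print("avalanches:", len(sizes), " largest:", max(sizes, default=0))
```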

  12. Simulation of the Tornado Event of 22 March, 2013 over ...

    Indian Academy of Sciences (India)

    An attempt has been made to simulate this rare event using the Weather Research and Forecasting (WRF) model. The model was run in a single domain at 9 km resolution for a period of 24 hrs, starting at 0000 UTC on 22 March, 2013. The meteorological conditions that led to form this tornado have been analyzed.

  13. Assessment of realistic nowcasting lead-times based on predictability analysis of Mediterranean Heavy Precipitation Events

    Science.gov (United States)

    Bech, Joan; Berenguer, Marc

    2014-05-01

    ' precipitation forecasts showed some skill (improvement over persistence) for lead times up to 60' for moderate intensities (up to 1 mm in 30') and up to 2.5h for lower rates (above 0.1 mm). However, an important event-to-event variability has been found as illustrated by the fact that hit rates of rain-no-rain forecasts achieved the 60% value at 90' in the 7 September 2005 case and only 40' in the 2 November 2008 case. The discussion of these results provides useful information on the potential application of nowcasting systems and realistic values to be contrasted with specific end-user requirements. This work has been done in the framework of the Hymex research programme and has been partly funded by the ProFEWS project (CGL2010-15892). References Bech J, N Pineda, T Rigo, M Aran, J Amaro, M Gayà, J Arús, J Montanyà, O van der Velde, 2011: A Mediterranean nocturnal heavy rainfall and tornadic event. Part I: Overview, damage survey and radar analysis. Atmospheric Research 100:621-637 http://dx.doi.org/10.1016/j.atmosres.2010.12.024 Bech J, R Pascual, T Rigo, N Pineda, JM López, J Arús, and M Gayà, 2007: An observational study of the 7 September 2005 Barcelona tornado outbreak. Natural Hazards and Earth System Science 7:129-139 http://dx.doi.org/10.5194/nhess-7-129-2007 Berenguer M, C Corral, R Sánchez-Diezma, D Sempere-Torres, 2005: Hydrological validation of a radar-based nowcasting technique. Journal of Hydrometeorology 6: 532-549 http://dx.doi.org/10.1175/JHM433.1 Berenguer M, D Sempere, G Pegram, 2011: SBMcast - An ensemble nowcasting technique to assess the uncertainty in rainfall forecasts by Lagrangian extrapolation. Journal of Hydrology 404: 226-240 http://dx.doi.org/10.1016/j.jhydrol.2011.04.033 Pierce C, A Seed, S Ballard, D Simonin, Z Li, 2012: Nowcasting. In Doppler Radar Observations (J Bech, JL Chau, ed.) Ch. 13, 98-142. InTech, Rijeka, Croatia http://dx.doi.org/10.5772/39054

  14. Consistent simulation of nonresonant diphoton production in hadron collisions including associated jet production up to two jets

    Science.gov (United States)

    Odaka, Shigeru; Kurihara, Yoshimasa

    2016-12-01

    An event generator for diphoton (γ γ ) production in hadron collisions that includes associated jet production up to two jets has been developed using a subtraction method based on the limited leading-log subtraction. The parton shower (PS) simulation to restore the subtracted divergent components involves both quantum electrodynamic (QED) and quantum chromodynamic radiation, and QED radiation at very small Q2 is simulated by referring to a fragmentation function (FF). The PS/FF simulation has the ability to enforce the radiation of a given number of energetic photons. The generated events can be fed to PYTHIA to obtain particle (hadron) level event information, which enables us to perform realistic simulations of photon isolation and hadron-jet reconstruction. The simulated events, in which the loop-mediated g g →γ γ process is involved, reasonably reproduce the diphoton kinematics measured at the LHC. Using the developed simulation, we found that the two-jet processes significantly contribute to diphoton production. A large two-jet contribution can be considered as a common feature in electroweak-boson production in hadron collisions although the reason is yet to be understood. Discussion concerning the treatment of the underlying events in photon isolation is necessary for future higher precision measurements.

  15. Interpretation of Cellular Imaging and AQP4 Quantification Data in a Single Cell Simulator

    Directory of Open Access Journals (Sweden)

    Seon B. Kim

    2014-03-01

    Full Text Available The goal of the present study is to integrate different datasets in cell biology to derive additional quantitative information about a gene or protein of interest within a single cell using computational simulations. We propose a novel prototype cell simulator as a quantitative tool to integrate datasets including dynamic information about transcript and protein levels and the spatial information on protein trafficking in a complex cellular geometry. In order to represent the stochastic nature of transcription and gene expression, our cell simulator uses event-based stochastic simulations to capture transcription, translation, and dynamic trafficking events. In a reconstructed cellular geometry, a realistic microtubule structure is generated with a novel growth algorithm for simulating vesicular transport and trafficking events. In a case study, we investigate the change in quantitative expression levels of a water channel-aquaporin 4-in a single astrocyte cell, upon pharmacological treatment. Gillespie based discrete time approximation method results in stochastic fluctuation of mRNA and protein levels. In addition, we compute the dynamic trafficking of aquaporin-4 on microtubules in this reconstructed astrocyte. Computational predictions are validated with experimental data. The demonstrated cell simulator facilitates the analysis and prediction of protein expression dynamics.
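
    The event-based stochastic core of such a simulator is typically a Gillespie-type algorithm. The sketch below shows a minimal Gillespie simulation of transcription, translation and degradation for a single gene; the rate constants are illustrative assumptions, not AQP4 parameters, and vesicular trafficking on microtubules is not modelled here.

```python
import numpy as np

# Minimal Gillespie (stochastic simulation algorithm) sketch for one gene:
# transcription, translation, and first-order degradation.
rng = np.random.default_rng(42)
k_tx, k_tl = 0.5, 2.0            # mRNA production (1/min), protein production per mRNA (1/min)
d_m, d_p = 0.1, 0.02             # mRNA and protein degradation rates (1/min)

def gillespie(t_end=2000.0):
    t, m, p = 0.0, 0, 0
    samples = []
    while t < t_end:
        rates = np.array([k_tx, k_tl * m, d_m * m, d_p * p])
        total = rates.sum()
        t += rng.exponential(1.0 / total)          # waiting time to the next event
        event = rng.choice(4, p=rates / total)     # which reaction fires
        if event == 0:   m += 1                    # transcription
        elif event == 1: p += 1                    # translation
        elif event == 2: m -= 1                    # mRNA decay
        else:            p -= 1                    # protein decay
        samples.append((m, p))
    return np.array(samples)

traj = gillespie()
# Event-weighted averages; adequate for a sketch of the stochastic fluctuations.
print("mean mRNA    (steady state k_tx/d_m = 5):            %.1f" % traj[:, 0].mean())
print("mean protein (steady state k_tx*k_tl/(d_m*d_p) = 500): %.0f" % traj[:, 1].mean())
```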

  16. BlackMax: A black-hole event generator with rotation, recoil, split branes, and brane tension

    International Nuclear Information System (INIS)

    Dai Dechang; Starkman, Glenn; Stojkovic, Dejan; Issever, Cigdem; Tseng, Jeff; Rizvi, Eram

    2008-01-01

    We present a comprehensive black-hole event generator, BlackMax, which simulates the experimental signatures of microscopic and Planckian black-hole production and evolution at the LHC in the context of brane world models with low-scale quantum gravity. The generator is based on phenomenologically realistic models free of serious problems that plague low-scale gravity, thus offering more realistic predictions for hadron-hadron colliders. The generator includes all of the black-hole gray-body factors known to date and incorporates the effects of black-hole rotation, splitting between the fermions, nonzero brane tension, and black-hole recoil due to Hawking radiation (although not all simultaneously). The generator can be interfaced with Herwig and Pythia. The main code can be downloaded from http://www-pnp.physics.ox.ac.uk/~issever/BlackMax/blackmax.html.

  17. Impact of a realistic river routing in coupled ocean-atmosphere simulations of the Last Glacial Maximum climate

    Energy Technology Data Exchange (ETDEWEB)

    Alkama, Ramdane [IPSL, Laboratoire des Sciences du Climat et de l'Environnement, Gif-sur-Yvette Cedex (France); Universite Pierre et Marie Curie, Structure et fonctionnement des systemes hydriques continentaux (Sisyphe), Paris (France); Kageyama, M.; Ramstein, G.; Marti, O.; Swingedouw, D. [IPSL, Laboratoire des Sciences du Climat et de l'Environnement, Gif-sur-Yvette Cedex (France); Ribstein, P. [Universite Pierre et Marie Curie, Structure et fonctionnement des systemes hydriques continentaux (Sisyphe), Paris (France)

    2008-06-15

    The presence of large ice sheets over North America and North Europe at the Last Glacial Maximum (LGM) strongly impacted Northern hemisphere river pathways. Despite the fact that such changes may significantly alter the freshwater input to the ocean, modified surface hydrology has never been accounted for in coupled ocean-atmosphere general circulation model simulations of the LGM climate. To reconstruct the LGM river routing, we use the ICE-5G LGM topography. Because of the uncertainties in the extent of the Fennoscandian ice sheet in the Eastern part of the Kara Sea, we consider two more realistic river routing scenarios. The first scenario is characterised by the presence of an ice dammed lake south of the Fennoscandian ice sheet, and corresponds to the ICE-5G topography. This lake is fed by the Ob and Yenisei rivers. In the second scenario, both these rivers flow directly into the Arctic Ocean, which is more consistent with the latest QUEEN ice sheet margin reconstructions. We study the impact of these changes on the LGM climate as simulated by the IPSL-CM4 model and focus on the overturning thermohaline circulation. A comparison with a classical LGM simulation performed using the same model and modern river basins as designed in the PMIP2 exercise leads to the following conclusions: (1) The discharge into the North Atlantic Ocean is increased by 2,000 m³/s between 38°N and 54°N in both simulations that contain LGM river routing, compared to the classical LGM experiment. (2) The ice dammed lake is shown to have a weak impact, relative to the classical simulation, both in terms of climate and ocean circulation. (3) In contrast, the North Atlantic deep convection and meridional overturning are weaker than during the classical LGM run if the Ob and Yenisei rivers flow directly into the Arctic Ocean. The total discharge into the Arctic Ocean is increased by 31,000 m³/s, relative to the classical LGM simulation. Consequentially, northward ocean heat

  18. Nuclear equation of state for core-collapse supernova simulations with realistic nuclear forces

    Energy Technology Data Exchange (ETDEWEB)

    Togashi, H., E-mail: hajime.togashi@riken.jp [Nishina Center for Accelerator-Based Science, Institute of Physical and Chemical Research (RIKEN), 2-1 Hirosawa, Wako, Saitama 351-0198 (Japan); Research Institute for Science and Engineering, Waseda University, 3-4-1 Okubo, Shinjuku-ku, Tokyo 169-8555 (Japan); Nakazato, K. [Faculty of Arts and Science, Kyushu University, 744 Motooka, Nishi-ku, Fukuoka 819-0395 (Japan); Takehara, Y.; Yamamuro, S.; Suzuki, H. [Department of Physics, Faculty of Science and Technology, Tokyo University of Science, Yamazaki 2641, Noda, Chiba 278-8510 (Japan); Takano, M. [Research Institute for Science and Engineering, Waseda University, 3-4-1 Okubo, Shinjuku-ku, Tokyo 169-8555 (Japan); Department of Pure and Applied Physics, Graduate School of Advanced Science and Engineering, Waseda University, 3-4-1 Okubo, Shinjuku-ku, Tokyo 169-8555 (Japan)

    2017-05-15

    A new table of the nuclear equation of state (EOS) based on realistic nuclear potentials is constructed for core-collapse supernova numerical simulations. Adopting the EOS of uniform nuclear matter constructed by two of the present authors with the cluster variational method starting from the Argonne v18 and Urbana IX nuclear potentials, the Thomas–Fermi calculation is performed to obtain the minimized free energy of a Wigner–Seitz cell in non-uniform nuclear matter. As a preparation for the Thomas–Fermi calculation, the EOS of uniform nuclear matter is modified so as to remove the effects of deuteron cluster formation in uniform matter at low densities. Mixing of alpha particles is also taken into account following the procedure used by Shen et al. (1998, 2011). The critical densities with respect to the phase transition from non-uniform to uniform phase with the present EOS are slightly higher than those with the Shen EOS at small proton fractions. The critical temperature with respect to the liquid–gas phase transition decreases with the proton fraction in a more gradual manner than in the Shen EOS. Furthermore, the mass and proton numbers of nuclides appearing in non-uniform nuclear matter with small proton fractions are larger than those of the Shen EOS. These results are consequences of the fact that the density derivative coefficient of the symmetry energy of our EOS is smaller than that of the Shen EOS.

  19. Reduced herbivory during simulated ENSO rainy events increases native herbaceous plants in semiarid Chile

    NARCIS (Netherlands)

    Manrique, R.; Gutierrez, J.R.; Holmgren, M.; Squeo, F.A.

    2007-01-01

    El Niño Southern Oscillation (ENSO) events have profound consequences for the dynamics of terrestrial ecosystems. Since increased climate variability is expected to favour the invasive success of exotic species, we conducted a field experiment to study the effects that simulated rainy ENSO events in

  20. Robust mode space approach for atomistic modeling of realistically large nanowire transistors

    Science.gov (United States)

    Huang, Jun Z.; Ilatikhameneh, Hesameddin; Povolotskyi, Michael; Klimeck, Gerhard

    2018-01-01

    Nanoelectronic transistors have reached 3D length scales in which the number of atoms is countable. Truly atomistic device representations are needed to capture the essential functionalities of the devices. Atomistic quantum transport simulations of realistically extended devices are, however, computationally very demanding. The widely used mode space (MS) approach can significantly reduce the numerical cost, but a good MS basis is usually very hard to obtain for atomistic full-band models. In this work, a robust and parallel algorithm is developed to optimize the MS basis for atomistic nanowires. This enables engineering-level, reliable tight binding non-equilibrium Green's function simulation of nanowire metal-oxide-semiconductor field-effect transistor (MOSFET) with a realistic cross section of 10 nm × 10 nm using a small computer cluster. This approach is applied to compare the performance of InGaAs and Si nanowire n-type MOSFETs (nMOSFETs) with various channel lengths and cross sections. Simulation results with full-band accuracy indicate that InGaAs nanowire nMOSFETs have no drive current advantage over their Si counterparts for cross sections up to about 10 nm × 10 nm.

  1. Modeling extreme (Carrington-type) space weather events using three-dimensional MHD code simulations

    Science.gov (United States)

    Ngwira, C. M.; Pulkkinen, A. A.; Kuznetsova, M. M.; Glocer, A.

    2013-12-01

    There is growing concern over possible severe societal consequences related to adverse space weather impacts on man-made technological infrastructure and systems. In the last two decades, significant progress has been made towards the modeling of space weather events. Three-dimensional (3-D) global magnetohydrodynamics (MHD) models have been at the forefront of this transition, and have played a critical role in advancing our understanding of space weather. However, the modeling of extreme space weather events is still a major challenge even for existing global MHD models. In this study, we introduce a specially adapted University of Michigan 3-D global MHD model for simulating extreme space weather events that have a ground footprint comparable to (or larger than) that of the Carrington superstorm. Results are presented for an initial simulation run with "very extreme" constructed/idealized solar wind boundary conditions driving the magnetosphere. In particular, we describe the reaction of the magnetosphere-ionosphere system and the associated ground-induced geoelectric field to such extreme driving conditions. We also discuss the results and what they might mean for the accuracy of the simulations. The model is further tested using input data for an observed space weather event to verify the MHD model consistency and to draw guidance for future work. This extreme space weather MHD model is designed specifically for practical application to the modeling of extreme geomagnetically induced electric fields, which can drive large currents in earth conductors such as power transmission grids.

  2. Monte Carlo simulations of the electric field close to the body in realistic environments for application in personal radiofrequency dosimetry

    International Nuclear Information System (INIS)

    Iskra, S.; McKenzie, R.; Cosic, I.

    2011-01-01

    Personal dosemeters can play an important role in epidemiological studies and in radiofrequency safety programmes. In this study, a Monte Carlo approach is used in conjunction with the finite difference time domain method to obtain distributions of the electric field strength close to a human body model in simulated realistic environments. The field is a proxy for the response of an ideal body-worn electric field dosemeter. A set of eight environments were modelled based on the statistics of Rayleigh, Rice and log-normal fading to simulate outdoor and indoor multi-path exposures at 450, 900 and 2100 MHz. Results indicate that a dosemeter mounted randomly within 10-50 mm of the adult or child body model (torso region) will on average underestimate the spatially averaged value of the incident electric field strength by a factor of 0.52 to 0.74 over the frequencies of 450, 900 and 2100 MHz. The uncertainty in results, assessed at the 95 % confidence level (between the 2.5th and 97.5th percentiles) was largest at 2100 MHz and smallest at 450 MHz. (authors)
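
    The fading statistics named in the record can be sampled directly. The sketch below draws Rayleigh, Rician and log-normal field-strength samples and summarises their spread with the same kind of 2.5th/97.5th-percentile interval; the Rice K-factor and shadowing sigma are assumed values, and the body/dosemeter coupling itself is not modelled.

```python
import numpy as np

# Sample the three fading statistics (Rayleigh, Rice, log-normal) and report
# the mean and 95 % interval of the resulting field-strength envelope.
rng = np.random.default_rng(7)
n = 100_000

rayleigh = rng.rayleigh(scale=np.sqrt(0.5), size=n)              # pure multipath (unit mean power)
K = 4.0                                                          # assumed Rice K-factor
los = np.sqrt(K / (K + 1))
scat = np.sqrt(1.0 / (2 * (K + 1)))
rice = np.abs(los + scat * (rng.normal(size=n) + 1j * rng.normal(size=n)))
lognorm = rng.lognormal(mean=0.0, sigma=0.5, size=n)             # shadowing, sigma assumed

for name, e in (("Rayleigh", rayleigh), ("Rice", rice), ("log-normal", lognorm)):
    lo, hi = np.percentile(e, [2.5, 97.5])
    print(f"{name:10s} mean = {e.mean():.2f}   95% interval = [{lo:.2f}, {hi:.2f}]")
```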

  3. Number distribution of leakage neutrons for single neutron emission event and one source emission event in multiplying medium for two variables - a GEANT4 study

    International Nuclear Information System (INIS)

    Roy, Arup Singha; Raman, Anand; Chaudhury, Probal; Thomas, Renju G.

    2018-01-01

    Quantitative knowledge of the neutron-multiplying character of media such as highly enriched uranium (HEU), weapons-grade plutonium (WGPu) and similar special nuclear materials is essential for improving the probability of detecting these materials and countering their illicit trafficking. The objective of this study is to gain deeper insight into the neutron and gamma multiplication behaviour of these materials. The leakage number distributions of neutrons and gammas initiated by a source emission event (spontaneous fission) as well as by a single neutron emission event have been obtained in the course of this study. The computations were carried out with GEANT4 simulations, with FREYA incorporated into the toolkit, which allowed each history to be analysed in detail and more realistic, reliable results to be obtained.

  4. Discrete event simulation methods applied to advanced importance measures of repairable components in multistate network flow systems

    International Nuclear Information System (INIS)

    Huseby, Arne B.; Natvig, Bent

    2013-01-01

    Discrete event models are frequently used in simulation studies to model and analyze pure jump processes. A discrete event model can be viewed as a system consisting of a collection of stochastic processes, where the states of the individual processes change as a result of various kinds of events occurring at random points of time. We always assume that each event only affects one of the processes. Between these events the states of the processes are considered to be constant. In the present paper we use discrete event simulation in order to analyze a multistate network flow system of repairable components. In order to study how the different components contribute to the system, it is necessary to describe the often complicated interaction between component processes and processes at the system level. While analytical considerations may throw some light on this, a simulation study often allows the analyst to explore more details. By producing stable curve estimates for the development of the various processes, one gets a much better insight into how such systems develop over time. These methods are particularly useful in the study of advanced importance measures of repairable components. Such measures can be very complicated, and thus impossible to calculate analytically. By using discrete event simulations, however, this can be done in a very natural and intuitive way. In particular, significant differences between the Barlow–Proschan measure and the Natvig measure in multistate network flow systems can be explored.
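
    The basic machinery, a queue of timed events that flip component states, can be sketched for a single repairable component as below: exponential times to failure and repair, a heap-based event list, and an availability estimate compared with the analytic value. Rates are illustrative assumptions, not those of the multistate flow systems analysed in the paper.

```python
import heapq, random

# Minimal discrete-event model of one repairable component.
random.seed(3)
FAIL, REPAIR = "fail", "repair"
mttf, mttr, horizon = 100.0, 8.0, 1_000_000.0

events = [(random.expovariate(1.0 / mttf), FAIL)]     # (event time, event type)
up, last_t, up_time = True, 0.0, 0.0

while events:
    t, kind = heapq.heappop(events)
    if t > horizon:
        break
    if up:
        up_time += t - last_t                         # accumulate time spent in the up state
    last_t = t
    if kind == FAIL:
        up = False
        heapq.heappush(events, (t + random.expovariate(1.0 / mttr), REPAIR))
    else:
        up = True
        heapq.heappush(events, (t + random.expovariate(1.0 / mttf), FAIL))

up_time += (horizon - last_t) if up else 0.0
print("simulated availability:", round(up_time / horizon, 4))
print("analytic availability: ", round(mttf / (mttf + mttr), 4))
```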

  5. Discrete event simulation for petroleum transfers involving harbors, refineries and pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Martins, Marcella S.R.; Lueders, Ricardo; Delgado, Myriam R.B.S. [Universidade Tecnologica Federal do Parana (UTFPR), Curitiba, PR (Brazil)

    2009-07-01

    Nowadays, companies spend great effort to improve their logistics in terms of the programming of events that affect the production and distribution of products. In this case, simulation can be a valuable tool for evaluating different behaviors. The objective of this work is to build a discrete event simulation model for scheduling of operational activities in complexes containing one harbor and two refineries interconnected by a pipeline infrastructure. The model was developed in the Arena package, based on three sub-models that control pier allocation, loading of tanks, and transfers to refineries through pipelines. Preliminary results obtained for a given control policy show that profit can be calculated by taking into account many parameters such as oil costs on ships, pier usage, over-stay of ships and interface costs. Such a problem has already been considered in the literature but using different strategies. All these factors should be considered in a real-world operation where decision making tools are necessary to obtain high returns. (author)

  6. Event detection and localization for small mobile robots using reservoir computing.

    Science.gov (United States)

    Antonelo, E A; Schrauwen, B; Stroobandt, D

    2008-08-01

    Reservoir Computing (RC) techniques use a fixed (usually randomly created) recurrent neural network, or more generally any dynamic system, which operates at the edge of stability, where only a linear static readout output layer is trained by standard linear regression methods. In this work, RC is used for detecting complex events in autonomous robot navigation. This can be extended to robot localization tasks which are solely based on a few low-range, high-noise sensory data. The robot thus builds an implicit map of the environment (after learning) that is used for efficient localization by simply processing the input stream of distance sensors. These techniques are demonstrated in both a simple simulation environment and in the physically realistic Webots simulation of the commercially available e-puck robot, using several complex and even dynamic environments.
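
    The essence of the RC approach, a fixed random recurrent reservoir with only a linear readout trained, can be sketched as follows. The toy task here (reproducing a delayed copy of the input) is only a stand-in for the robot event-detection task of the paper; reservoir size, spectral radius and ridge parameter are assumed values.

```python
import numpy as np

# Minimal echo-state-network sketch: fixed random reservoir, trained linear readout.
rng = np.random.default_rng(0)
n_res, T, delay = 200, 2000, 10

W = rng.normal(size=(n_res, n_res))
W *= 0.9 / np.abs(np.linalg.eigvals(W)).max()        # scale spectral radius below 1 (echo state property)
W_in = rng.uniform(-0.5, 0.5, size=n_res)

u = rng.uniform(-1, 1, size=T)                       # input stream
x = np.zeros(n_res)
states = np.zeros((T, n_res))
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])                 # reservoir state update (fixed weights)
    states[t] = x

y = np.roll(u, delay)                                # target: the input delayed by 10 steps
washout = 100                                        # discard the initial transient
X, Y = states[washout:], y[washout:]
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ Y)   # only the readout is trained

pred = X @ W_out
print("readout NRMSE:", np.sqrt(np.mean((pred - Y) ** 2)) / np.std(Y))
```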

  7. On Event Detection and Localization in Acyclic Flow Networks

    KAUST Repository

    Suresh, Mahima Agumbe

    2013-05-01

    Acyclic flow networks, present in many infrastructures of national importance (e.g., oil and gas and water distribution systems), have been attracting immense research interest. Existing solutions for detecting and locating attacks against these infrastructures have been proven costly and imprecise, particularly when dealing with large-scale distribution systems. In this article, to the best of our knowledge, for the first time, we investigate how mobile sensor networks can be used for optimal event detection and localization in acyclic flow networks. We propose the idea of using sensors that move along the edges of the network and detect events (i.e., attacks). To localize the events, sensors detect proximity to beacons, which are devices with known placement in the network. We formulate the problem of minimizing the cost of monitoring infrastructure (i.e., minimizing the number of sensors and beacons deployed) in a predetermined zone of interest, while ensuring a degree of coverage by sensors and a required accuracy in locating events using beacons. We propose algorithms for solving the aforementioned problem and demonstrate their effectiveness with results obtained from a realistic flow network simulator.

  8. Parallel discrete event simulation

    NARCIS (Netherlands)

    Overeinder, B.J.; Hertzberger, L.O.; Sloot, P.M.A.; Withagen, W.J.

    1991-01-01

    In simulating applications for execution on specific computing systems, the simulation performance figures must be known in a short period of time. One basic approach to the problem of reducing the required simulation time is the exploitation of parallelism. However, in parallelizing the simulation

  9. Simulation for ward processes of surgical care.

    Science.gov (United States)

    Pucher, Philip H; Darzi, Ara; Aggarwal, Rajesh

    2013-07-01

    The role of simulation in surgical education, initially confined to technical skills and procedural tasks, increasingly includes training nontechnical skills including communication, crisis management, and teamwork. Research suggests that many preventable adverse events can be attributed to nontechnical error occurring within a ward context. Ward rounds represent the primary point of interaction between patient and physician but take place without formalized training or assessment. The simulated ward should provide an environment in which processes of perioperative care can be performed safely and realistically, allowing multidisciplinary assessment and training of full ward rounds. We review existing literature and describe our experience in setting up our ward simulator. We examine the facilities, equipment, cost, and personnel required for establishing a surgical ward simulator and consider the scenario development, assessment, and feedback tools necessary to integrate it into a surgical curriculum. Copyright © 2013 Elsevier Inc. All rights reserved.

  10. Analysis of convection-permitting simulations for capturing heavy rainfall events over Myanmar Region

    Science.gov (United States)

    Acierto, R. A. E.; Kawasaki, A.

    2017-12-01

    Perennial flooding due to heavy rainfall events has strong impacts on society and the economy. With the increasing pressures of rapid development and potential climate change impacts, Myanmar is experiencing a rapid increase in disaster risk. Heavy rainfall hazard assessment is key to quantifying such disaster risk under both current and future conditions. Downscaling with Regional Climate Models (RCMs), such as the Weather Research and Forecasting (WRF) model, has been used extensively to assess such heavy rainfall events. However, the use of convective parameterizations can introduce large errors in simulated rainfall. Convection-permitting simulations address this problem by increasing the resolution of RCMs to 4 km. This study focuses on heavy rainfall events during the wet seasons (May to September) of the six-year period 2010-2015 in Myanmar. The investigation primarily uses rain gauge observations to evaluate heavy rainfall events downscaled to 4 km resolution with ERA-Interim boundary conditions and a 12 km-4 km one-way nesting method. The study aims to provide a basis for producing high-resolution climate projections over Myanmar that contribute to flood hazard and risk assessment.

  11. A methodological approach to a realistic evaluation of skin absorbed doses during manipulation of radioactive sources by means of GAMOS Monte Carlo simulations

    Science.gov (United States)

    Italiano, Antonio; Amato, Ernesto; Auditore, Lucrezia; Baldari, Sergio

    2018-05-01

    The accurate evaluation of the radiation burden associated with radiation absorbed doses to the skin of the extremities during the manipulation of radioactive sources is a critical issue in operational radiological protection, deserving the most accurate calculation approaches available. Monte Carlo simulation of the radiation transport and interaction is the gold standard for the calculation of dose distributions in complex geometries and in the presence of extended spectra of multi-radiation sources. We propose the use of Monte Carlo simulations in GAMOS, in order to accurately estimate the dose to the extremities during manipulation of radioactive sources. We report the results of these simulations for 90Y, 131I, 18F and 111In nuclides in water solutions enclosed in glass or plastic receptacles, such as vials or syringes. Skin equivalent doses at 70 μm of depth and dose-depth profiles are reported for different configurations, highlighting the importance of adopting a realistic geometrical configuration in order to obtain accurate dosimetric estimates. Due to the ease of implementing GAMOS simulations, case-specific geometries and nuclides can be adopted and results can be obtained in less than about ten minutes of computation time with a common workstation.

  12. Simulation and verification of transient events in large wind power installations

    Energy Technology Data Exchange (ETDEWEB)

    Soerensen, P.; Hansen, A.D.; Christensen, P.; Meritz, M.; Bech, J.; Bak-Jensen, B.; Nielsen, H.

    2003-10-01

    Models for wind power installations excited by transient events have been developed and verified. A number of cases have been investigated, including comparisons of simulations of a three-phase short circuit, validation with measurements of tripping of single wind turbine, islanding of a group of two wind turbines, and voltage steps caused by tripping of wind turbines and by manual transformer tap-changing. A Benchmark model is also presented, enabling the reader to test own simulation results against results obtained with models developed in EMTDC and DIgSILENT. (au)

  13. A simple conceptual model of abrupt glacial climate events

    Directory of Open Access Journals (Sweden)

    H. Braun

    2007-11-01

    Full Text Available Here we use a very simple conceptual model in an attempt to reduce essential parts of the complex nonlinearity of abrupt glacial climate changes (the so-called Dansgaard-Oeschger events) to a few simple principles, namely (i) the existence of two different climate states, (ii) a threshold process and (iii) an overshooting in the stability of the system at the start and the end of the events, which is followed by a millennial-scale relaxation. By comparison with a so-called Earth system model of intermediate complexity (CLIMBER-2), in which the events represent oscillations between two climate states corresponding to two fundamentally different modes of deep-water formation in the North Atlantic, we demonstrate that the conceptual model captures fundamental aspects of the nonlinearity of the events in that model. We use the conceptual model in order to reproduce and reanalyse nonlinear resonance mechanisms that were already suggested in order to explain the characteristic time scale of Dansgaard-Oeschger events. In doing so we identify a new form of stochastic resonance (i.e. an overshooting stochastic resonance) and provide the first explicitly reported manifestation of ghost resonance in a geosystem, i.e. of a mechanism which could be relevant for other systems with thresholds and with multiple states of operation. Our work enables us to explicitly simulate realistic probability measures of Dansgaard-Oeschger events (e.g. waiting time distributions), which are a prerequisite for statistical analyses on the regularity of the events by means of Monte-Carlo simulations. We thus think that our study is an important advance in order to develop more adequate methods to test the statistical significance and the origin of the proposed glacial 1470-year climate cycle.
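
    A toy version of such a threshold-and-relaxation model can be sketched as below: after each event the stability threshold overshoots and then relaxes back on a millennial time scale, and events are triggered whenever a noisy, weakly periodic forcing crosses the current threshold, so that waiting-time statistics can be collected by Monte Carlo. All constants are illustrative assumptions, not the calibrated values of the paper.

```python
import numpy as np

# Threshold-overshoot toy model: events fire on threshold crossings of a noisy
# periodic forcing; the threshold overshoots after each event and relaxes back.
rng = np.random.default_rng(5)
dt, t_end = 10.0, 2_000_000.0                 # time step and run length (years)
tau, b0, overshoot = 1200.0, 1.0, 2.5         # relaxation time, baseline threshold, overshoot
period, amp, noise = 1470.0, 0.4, 0.6         # forcing period (yr), amplitude, noise level

t, thr, events = 0.0, b0, []
while t < t_end:
    forcing = amp * np.sin(2 * np.pi * t / period) + noise * rng.normal()
    if forcing > thr:                         # threshold crossing triggers an event
        events.append(t)
        thr = b0 + overshoot                  # the threshold overshoots ...
    thr = b0 + (thr - b0) * np.exp(-dt / tau) # ... and relaxes back toward its baseline
    t += dt

if len(events) > 1:
    waits = np.diff(events)
    print("events:", len(events), " mean waiting time: %.0f yr" % waits.mean())
```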

  14. Event-based scenario manager for multibody dynamics simulation of heavy load lifting operations in shipyards

    Directory of Open Access Journals (Sweden)

    Sol Ha

    2016-01-01

    Full Text Available This paper suggests an event-based scenario manager capable of creating and editing a scenario for shipbuilding process simulation based on multibody dynamics. To configure various situations in shipyards and easily connect with multibody dynamics, the proposed method has two main concepts: an Actor and an Action List. The Actor represents the atomic unit of action in the multibody dynamics and can be connected to a specific component of the dynamics kernel such as a body or joint. The user can build a scenario by combining actors. The Action List contains information for arranging and executing the actors. Since the shipbuilding process is a kind of event-based sequence, all simulation models were configured using the Discrete EVent System Specification (DEVS) formalism. The proposed method was applied to simulations of various operations in shipyards such as lifting and erection of a block and heavy load lifting operations using multiple cranes.
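
    The Actor / Action List idea can be sketched in a few lines: each Actor wraps one elementary action, and the Action List orders and executes the Actors as timed events in sequence. The class names mirror the paper's terminology, but the implementation and the lifting-scenario timings below are illustrative assumptions, not the DEVS-based tool itself.

```python
import heapq

class Actor:
    """One elementary action with a name and a duration (seconds)."""
    def __init__(self, name, duration):
        self.name, self.duration = name, duration
    def execute(self, start_time):
        print(f"t={start_time:6.1f}s  start '{self.name}' (takes {self.duration}s)")
        return start_time + self.duration

class ActionList:
    """Orders Actors as timed events and runs them sequentially."""
    def __init__(self):
        self.queue, self.count = [], 0          # heap of (start time, sequence number, actor)
    def schedule(self, start_time, actor):
        heapq.heappush(self.queue, (start_time, self.count, actor))
        self.count += 1
    def run(self):
        while self.queue:
            t, _, actor = heapq.heappop(self.queue)
            end = actor.execute(t)
            # sequential dependency: later actions cannot start before this one ends
            self.queue = [(max(ts, end), n, a) for ts, n, a in self.queue]
            heapq.heapify(self.queue)

scenario = ActionList()
scenario.schedule(0.0,   Actor("attach crane hooks to block", 60))
scenario.schedule(60.0,  Actor("lift block", 120))
scenario.schedule(180.0, Actor("translate block to erection site", 300))
scenario.schedule(480.0, Actor("lower and seat block", 150))
scenario.run()
```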

  15. Modeling Anti-Air Warfare With Discrete Event Simulation and Analyzing Naval Convoy Operations

    Science.gov (United States)

    2016-06-01

    Master's thesis by Ali E. Opcin, June 2016; thesis advisor: Arnold H. Buss. Title: Modeling Anti-Air Warfare with Discrete Event Simulation and Analyzing Naval Convoy Operations.

  16. Convective aggregation in idealised models and realistic equatorial cases

    Science.gov (United States)

    Holloway, Chris

    2015-04-01

    Idealised explicit convection simulations of the Met Office Unified Model are shown to exhibit spontaneous self-aggregation in radiative-convective equilibrium, as seen previously in other models in several recent studies. This self-aggregation is linked to feedbacks between radiation, surface fluxes, and convection, and the organization is intimately related to the evolution of the column water vapour (CWV) field. To investigate the relevance of this behaviour to the real world, these idealized simulations are compared with five 15-day cases of real organized convection in the tropics, including multiple simulations of each case testing sensitivities of the convective organization and mean states to interactive radiation, interactive surface fluxes, and evaporation of rain. Despite similar large-scale forcing via lateral boundary conditions, systematic differences in mean CWV, CWV distribution shape, and the length scale of CWV features are found between the different sensitivity runs, showing that there are at least some similarities in sensitivities to these feedbacks in both idealized and realistic simulations.

  17. Neutron dosemeter responses in workplace fields and the implications of using realistic neutron calibration fields

    International Nuclear Information System (INIS)

    Thomas, D.J.; Horwood, N.; Taylor, G.C.

    1999-01-01

    The use of realistic neutron calibration fields to overcome some of the problems associated with the response functions of presently available dosemeters, both area survey instruments and personal dosemeters, has been investigated. Realistic calibration fields have spectra which, compared to conventional radionuclide source based calibration fields, more closely match those of the workplace fields in which dosemeters are used. Monte Carlo simulations were performed to identify laboratory systems which would produce appropriate workplace-like calibration fields. A detailed analysis was then undertaken of the predicted under- and over-responses of dosemeters in a wide selection of measured workplace field spectra assuming calibration in a selection of calibration fields. These included both conventional radionuclide source calibration fields, and also several proposed realistic calibration fields. The present state of the art for dosemeter performance, and the possibilities of improving accuracy by using realistic calibration fields are both presented. (author)

  18. Corpuscular event-by-event simulation of quantum optics experiments: application to a quantum-controlled delayed-choice experiment

    International Nuclear Information System (INIS)

    De Raedt, Hans; Delina, M; Jin, Fengping; Michielsen, Kristel

    2012-01-01

    A corpuscular simulation model of optical phenomena that does not require knowledge of the solution of a wave equation of the whole system and reproduces the results of Maxwell's theory by generating detection events one by one is discussed. The event-based corpuscular model gives a unified description of multiple-beam fringes of a plane parallel plate and a single-photon Mach-Zehnder interferometer, Wheeler's delayed choice, photon tunneling, quantum eraser, two-beam interference, Einstein-Podolsky-Rosen-Bohm and Hanbury Brown-Twiss experiments. The approach is illustrated by applying it to a recent proposal for a quantum-controlled delayed choice experiment, demonstrating that also this thought experiment can be understood in terms of particle processes only.

  19. Event Based Simulator for Parallel Computing over the Wide Area Network for Real Time Visualization

    Science.gov (United States)

    Sundararajan, Elankovan; Harwood, Aaron; Kotagiri, Ramamohanarao; Satria Prabuwono, Anton

    As the computational requirements of applications in computational science continue to grow tremendously, the use of computational resources distributed across the Wide Area Network (WAN) becomes advantageous. However, not all applications can be executed over the WAN due to communication overhead that can drastically slow down the computation. In this paper, we introduce an event-based simulator to investigate the performance of parallel algorithms executed over the WAN. The event-based simulator known as SIMPAR (SIMulator for PARallel computation) simulates the actual computations and communications involved in parallel computation over the WAN using time stamps. Visualization of real-time applications requires a steady stream of processed data. Hence, SIMPAR may prove to be a valuable tool to investigate the types of applications and computing resource requirements needed to provide an uninterrupted flow of processed data for real-time visualization. The results obtained from the simulation show concurrence with the expected performance using the L-BSP model.
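
    The timestamp-based accounting that such a simulator performs can be sketched with a BSP-style model: each superstep costs local compute time plus a communication/synchronisation delay drawn from an assumed latency distribution, and per-processor virtual clocks advance accordingly. The constants below are illustrative; this is not the SIMPAR code itself.

```python
import random

# BSP-style timestamp sketch: compute phase, barrier, then WAN communication.
random.seed(11)

def simulate(n_procs, n_supersteps, work_per_step, wan_latency_ms, jitter_ms):
    clock = [0.0] * n_procs                      # per-processor virtual time stamps (s)
    for _ in range(n_supersteps):
        for p in range(n_procs):
            clock[p] += work_per_step * random.uniform(0.9, 1.1)   # local computation
        sync = max(clock)                        # barrier: wait for the slowest processor
        comm = wan_latency_ms / 1000.0 + random.expovariate(1.0 / (jitter_ms / 1000.0))
        clock = [sync + comm] * n_procs          # communication + synchronisation
    return clock[0]

serial = 8 * 200 * 0.05                          # same total work done on one machine (s)
for lat in (5, 50, 300):                         # LAN-like vs WAN-like latencies (ms)
    t = simulate(n_procs=8, n_supersteps=200, work_per_step=0.05,
                 wan_latency_ms=lat, jitter_ms=0.2 * lat)
    print(f"latency {lat:4d} ms: parallel time {t:7.2f} s, speed-up {serial / t:4.2f}")
```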

  20. PhyloSim - Monte Carlo simulation of sequence evolution in the R statistical computing environment

    Directory of Open Access Journals (Sweden)

    Massingham Tim

    2011-04-01

    Full Text Available Abstract Background The Monte Carlo simulation of sequence evolution is routinely used to assess the performance of phylogenetic inference methods and sequence alignment algorithms. Progress in the field of molecular evolution fuels the need for more realistic and hence more complex simulations, adapted to particular situations, yet current software makes unreasonable assumptions such as homogeneous substitution dynamics or a uniform distribution of indels across the simulated sequences. This calls for an extensible simulation framework written in a high-level functional language, offering new functionality and making it easy to incorporate further complexity. Results PhyloSim is an extensible framework for the Monte Carlo simulation of sequence evolution, written in R, using the Gillespie algorithm to integrate the actions of many concurrent processes such as substitutions, insertions and deletions. Uniquely among sequence simulation tools, PhyloSim can simulate arbitrarily complex patterns of rate variation and multiple indel processes, and allows for the incorporation of selective constraints on indel events. User-defined complex patterns of mutation and selection can be easily integrated into simulations, allowing PhyloSim to be adapted to specific needs. Conclusions Close integration with R and the wide range of features implemented offer unmatched flexibility, making it possible to simulate sequence evolution under a wide range of realistic settings. We believe that PhyloSim will be useful to future studies involving simulated alignments.

  1. Simulation of heavy, long-term rainfall over low mountain ranges; Simulation von Starkniederschlaegen mit langer Andauer ueber Mittelgebirgen

    Energy Technology Data Exchange (ETDEWEB)

    Kunz, M.

    2003-03-01

    A diagnostic model for the estimation of orographic precipitation during large-scale upslide motions is presented. It is based on linear theory for 3-D mountain overflow. From the simulated vertical velocities, rain intensities at the ground are calculated using a model for precipitation formation. Due to the small number of free parameters and because of the simple initialisation method, e.g. with data from a single radiosonde, the model is used for regionalisation of precipitation from rain gauge observations as well as for deriving its statistics under dynamical constraints. For Southwest Germany and Eastern France, with the low mountain ranges of the Vosges, Black Forest and Swabian Alb, model simulations are performed for individual events with heavy rainfall. It is thereby evaluated how realistic rainfall patterns can be obtained with a combination of model simulations and measurement data. Mean rainfall distributions are derived from simulations of all extreme events with 24-h totals over 60 mm at selected rain gauge stations between 1971 and 2000. Furthermore, the calculation of rain sums for different return periods is performed using extreme value statistics. In this way it is possible to quantify the hazard potential of heavy rainfall, which may cause flooding or landslides, at high spatial resolution (2.5 x 2.5 km). (orig.)
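
    The diagnostic core of such a model, terrain-forced vertical velocity converted into a rain rate, can be sketched as follows for a synthetic Gaussian ridge; the wind, moisture and efficiency values are assumed, and the full 3-D linear-theory and microphysics treatment of the report is not reproduced.

```python
import numpy as np

# Terrain-forced vertical velocity turned into a crude surface rain rate via a
# moisture-flux / condensation-efficiency argument (synthetic ridge and values).
nx, dx = 400, 2500.0                               # 2.5 km grid spacing, as in the report
x = np.arange(nx) * dx
h = 1000.0 * np.exp(-(((x - 500e3) / 40e3) ** 2))  # 1000 m high, ~40 km wide ridge
U = 15.0                                           # cross-ridge wind speed (m/s)
rho_air, q_v, eff = 1.2, 8e-3, 0.7                 # air density, vapour mixing ratio, efficiency

w = U * np.gradient(h, dx)                         # linearised lower-boundary vertical velocity
condensation = np.clip(rho_air * q_v * w * eff, 0.0, None)   # kg m^-2 s^-1 on the upslope side
rain_mm_h = condensation * 3600.0                  # 1 kg m^-2 of water equals 1 mm depth
print("peak upslope rain rate: %.1f mm/h" % rain_mm_h.max())
```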

  2. Discrete event simulation and the resultant data storage system response in the operational mission environment of Jupiter-Saturn /Voyager/ spacecraft

    Science.gov (United States)

    Mukhopadhyay, A. K.

    1978-01-01

    The Data Storage Subsystem Simulator (DSSSIM), which simulates (in ground software) the occurrence of discrete events in the Voyager mission, is described. Functional requirements for Data Storage Subsystems (DSS) simulation are discussed, and discrete event simulation/DSSSIM processing is covered. Four types of outputs associated with a typical DSSSIM run are presented, and DSSSIM limitations and constraints are outlined.

  3. Non-axisymmetric simulation of the vertical displacement event in tokamaks

    International Nuclear Information System (INIS)

    Lim, Y.Y.; Lee, J.K.; Shin, K.J.; Hur, M.S.

    1999-01-01

    Tokamak plasmas with highly elongated cross sections are subject to a vertical displacement event (VDE). The nonlinear magnetohydrodynamic (MHD) evolutions of tokamak plasmas during the VDE are simulated by a three-dimensional MHD code as a combination of N=0 and N=1 components. The nonlinear evolution during the VDE is strongly affected by the relative amplitude of the N=1 to the N=0 modes. (author)

  4. The Xenon Test Chamber Q-SUN® for testing realistic tolerances of fungi exposed to simulated full spectrum solar radiation.

    Science.gov (United States)

    Dias, Luciana P; Araújo, Claudinéia A S; Pupin, Breno; Ferreira, Paulo C; Braga, Gilberto Ú L; Rangel, Drauzio E N

    2018-06-01

    The low survival of insect-pathogenic fungi when used for insect control in agriculture is mainly due to the deleterious effects of ultraviolet radiation and heat from solar irradiation. In this study, conidia of 15 species of entomopathogenic fungi were exposed to simulated full-spectrum solar radiation emitted by a Xenon Test Chamber Q-SUN XE-3-HC 340S (Q-LAB ® Corporation, Westlake, OH, USA), which very closely simulates full-spectrum solar radiation. A dendrogram obtained from cluster analyses, based on lethal time 50 % and 90 % calculated by Probit analyses, separated the fungi into three clusters: cluster 3 contains species with highest tolerance to simulated full-spectrum solar radiation, included Metarhizium acridum, Cladosporium herbarum, and Trichothecium roseum with LT 50  > 200 min irradiation. Cluster 2 contains eight species with moderate UV tolerance: Aschersonia aleyrodis, Isaria fumosorosea, Mariannaea pruinosa, Metarhizium anisopliae, Metarhizium brunneum, Metarhizium robertsii, Simplicillium lanosoniveum, and Torrubiella homopterorum with LT 50 between 120 and 150 min irradiation. The four species in cluster 1 had the lowest UV tolerance: Lecanicillium aphanocladii, Beauveria bassiana, Tolypocladium cylindrosporum, and Tolypocladium inflatum with LT 50  solar radiation before. We conclude that the equipment provided an excellent tool for testing realistic tolerances of fungi to full-spectrum solar radiation of microbial agents for insect biological control in agriculture. Copyright © 2018 British Mycological Society. Published by Elsevier Ltd. All rights reserved.

  5. Performance Evaluation of Wireless Sensor Networks for Event-Detection with Shadowing-Induced Radio Irregularities

    Directory of Open Access Journals (Sweden)

    Giuseppe De Marco

    2007-01-01

    Full Text Available In this paper, we study a particular application of wireless sensor networks for event-detection and tracking. In this kind of application, the transport of data is simplified, and guaranteeing a minimum number of packets at the monitoring node is the only constraint on the performance of the sensor network. This minimum number of packets is called event-reliability. Contrary to other studies on the subject, here we consider the behavior of such a network in the presence of a realistic radio model, such as the shadowing of the radio signal. With this setting, we extend our previous analysis of the event-reliability approach for the transport of data. In particular, both regular and random networks are considered. The contribution of this work is to show via simulations that, in the presence of randomness or irregularities in the radio channel, the event-reliability can be jeopardized, that is, the constraint on the minimum number of packets at the sink node may not be satisfied.
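
    The effect of shadowing on event-reliability can be illustrated with a simple Monte Carlo sketch: sensors at random positions report to a central sink, a packet is counted as received when the log-normally shadowed path loss stays within an assumed link budget, and the event-reliability constraint requires a minimum number of received packets. All radio constants below are assumptions.

```python
import numpy as np

# Monte Carlo sketch of event-reliability under log-normal shadowing.
rng = np.random.default_rng(2)
n_sensors, n_trials, required = 60, 5000, 10
area, d0, ple, sigma_db, budget_db = 100.0, 1.0, 3.0, 6.0, 80.0

met = 0
for _ in range(n_trials):
    pos = rng.uniform(0.0, area, size=(n_sensors, 2))
    d = np.linalg.norm(pos - area / 2.0, axis=1).clip(min=d0)   # distance to the sink at the centre
    path_loss = 40.0 + 10.0 * ple * np.log10(d / d0)            # assumed mean path-loss model (dB)
    shadowing = rng.normal(0.0, sigma_db, size=n_sensors)       # log-normal shadowing term (dB)
    received = (path_loss + shadowing) <= budget_db             # link closes within the budget
    met += int(received.sum() >= required)                      # event-reliability satisfied?

print("probability of meeting the event-reliability constraint:", met / n_trials)
```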

  6. An ECG simulator for generating maternal-foetal activity mixtures on abdominal ECG recordings

    International Nuclear Information System (INIS)

    Behar, Joachim; Andreotti, Fernando; Li, Qiao; Oster, Julien; Clifford, Gari D; Zaunseder, Sebastian

    2014-01-01

    Accurate foetal electrocardiogram (FECG) morphology extraction from non-invasive sensors remains an open problem. This is partly due to the paucity of available public databases. Even when gold standard information (i.e derived from the scalp electrode) is present, the collection of FECG can be problematic, particularly during stressful or clinically important events. In order to address this problem we have introduced an FECG simulator based on earlier work on foetal and adult ECG modelling. The open source foetal ECG synthetic simulator, fecgsyn, is able to generate maternal-foetal ECG mixtures with realistic amplitudes, morphology, beat-to-beat variability, heart rate changes and noise. Positional (rotation and translation-related) movements in the foetal and maternal heart due to respiration, foetal activity and uterine contractions were also added to the simulator. The simulator was used to generate some of the signals that were part of the 2013 PhysioNet Computing in Cardiology Challenge dataset and has been posted on Physionet.org (together with scripts to generate realistic scenarios) under an open source license. The toolbox enables further research in the field and provides part of a standard for industry and regulatory testing of rare pathological scenarios. (paper)

  7. A novel approach for modelling complex maintenance systems using discrete event simulation

    International Nuclear Information System (INIS)

    Alrabghi, Abdullah; Tiwari, Ashutosh

    2016-01-01

    Existing approaches for modelling maintenance rely on oversimplified assumptions which prevent them from reflecting the complexity found in industrial systems. In this paper, we propose a novel approach that enables the modelling of non-identical multi-unit systems without restrictive assumptions on the number of units or their maintenance characteristics. Modelling complex interactions between maintenance strategies and their effects on assets in the system is achieved by accessing event queues in Discrete Event Simulation (DES). The approach builds on the wide success DES has achieved in manufacturing by allowing integration with models that are closely related to maintenance, such as production and spare-parts systems. Additional advantages of using DES include rapid modelling and visual interactive simulation. The proposed approach is demonstrated in a simulation-based optimisation study of a published case. The current research is one of the first to optimise maintenance strategies simultaneously with their parameters while considering production dynamics and spare parts management. The findings of this research provide insights into non-conflicting objectives in maintenance systems. In addition, the proposed approach can be used to facilitate the simulation and optimisation of industrial maintenance systems. - Highlights: • This research is one of the first to optimise maintenance strategies simultaneously with their parameters. • New insights into non-conflicting objectives in maintenance systems. • The approach can be used to optimise industrial maintenance systems.
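
    As a rough illustration of driving maintenance behaviour from an event queue, the sketch below implements a minimal discrete-event loop for two non-identical units, each with its own failure and repair characteristics. It is a generic DES skeleton, not the authors' model; the failure rates, repair times and corrective-maintenance-only policy are assumptions.

```python
import heapq, random

random.seed(42)

# Non-identical units: each has its own mean time to failure and repair duration (hours).
units = {"pump": {"mttf": 120.0, "repair": 8.0},
         "conveyor": {"mttf": 300.0, "repair": 24.0}}

def simulate(horizon=10_000.0):
    events, downtime = [], {u: 0.0 for u in units}
    for u, p in units.items():
        heapq.heappush(events, (random.expovariate(1.0 / p["mttf"]), "fail", u))
    while events:
        t, kind, u = heapq.heappop(events)
        if t > horizon:
            break
        if kind == "fail":
            downtime[u] += units[u]["repair"]
            heapq.heappush(events, (t + units[u]["repair"], "repaired", u))
        else:  # unit back in service, schedule its next failure
            heapq.heappush(events, (t + random.expovariate(1.0 / units[u]["mttf"]), "fail", u))
    return {u: 1.0 - d / horizon for u, d in downtime.items()}

print(simulate())   # approximate availability per unit
```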

  8. Towards realistic molecular dynamics simulations of grain boundary mobility

    International Nuclear Information System (INIS)

    Zhou, J.; Mohles, V.

    2011-01-01

    In order to investigate grain boundary migration by molecular dynamics (MD) simulations a new approach involving a crystal orientation-dependent driving force has been developed by imposing an appropriate driving force on grain boundary atoms and enlarging the effective range of driving force. The new approach has been validated by the work of the driving force associated with the motion of grain boundaries. With the new approach the relation between boundary migration velocity and driving force is found to be nonlinear, as was expected from rate theory for large driving forces applied in MD simulations. By evaluating grain boundary mobility nonlinearly for a set of symmetrical tilt boundaries in aluminum at high temperature, high-angle grain boundaries were shown to move much faster than low-angle grain boundaries. This agrees well with experimental findings for recrystallization and grain growth. In comparison with the available data the simulated mobility of a 38.21° Σ7 boundary was found to be significantly lower than other MD simulation results and comparable with the experimental values. Furthermore, the average volume involved during atomic jumps for boundary migration is determined in MD simulations for the first time. The large magnitude of the volume indicates that grain boundary migration is accomplished by the correlated motion of atom groups.

  9. Numerical Simulation of a Breaking Gravity Wave Event Over Greenland Observed During Fastex

    National Research Council Canada - National Science Library

    Doyle, James

    1997-01-01

    Measurements from the NOAA G4 research aircraft and high-resolution numerical simulations are used to study the evolution and dynamics of a large-amplitude gravity wave event over Greenland that took...

  10. Statistical and Probabilistic Extensions to Ground Operations' Discrete Event Simulation Modeling

    Science.gov (United States)

    Trocine, Linda; Cummings, Nicholas H.; Bazzana, Ashley M.; Rychlik, Nathan; LeCroy, Kenneth L.; Cates, Grant R.

    2010-01-01

    NASA's human exploration initiatives will invest in technologies, public/private partnerships, and infrastructure, paving the way for the expansion of human civilization into the solar system and beyond. As it has been for the past half century, the Kennedy Space Center will be the embarkation point for humankind's journey into the cosmos. Functioning as a next generation space launch complex, Kennedy's launch pads, integration facilities, processing areas, and launch and recovery ranges will bustle with the activities of the world's space transportation providers. In developing this complex, KSC teams work through the potential operational scenarios: conducting trade studies, planning and budgeting for expensive and limited resources, and simulating alternative operational schemes. Numerous tools, among them discrete event simulation (DES), were matured during the Constellation Program to conduct such analyses with the purpose of optimizing the launch complex for maximum efficiency, safety, and flexibility while minimizing life cycle costs. Discrete event simulation is a computer-based modeling technique for complex and dynamic systems where the state of the system changes at discrete points in time and whose inputs may include random variables. DES is used to assess timelines and throughput, and to support operability studies and contingency analyses. It is applicable to any space launch campaign and informs decision-makers of the effects of varying numbers of expensive resources and the impact of off-nominal scenarios on measures of performance. In order to develop representative DES models, methods were adopted, exploited, or created to extend traditional uses of DES. The Delphi method was adopted and utilized for task duration estimation. DES software was exploited for probabilistic event variation. A roll-up process was developed and used to reuse models and model elements in other, less detailed models. The DES team continues to innovate and expand
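
    A minimal sketch of the probabilistic duration roll-up described above: three-point (Delphi-style) estimates are sampled from triangular distributions and combined by Monte Carlo to give percentile campaign durations. The task names and numbers are invented for illustration and do not come from the KSC models.

```python
import random

random.seed(7)

# Hypothetical processing tasks with Delphi-style (min, most likely, max)
# duration estimates in days; values are illustrative only.
tasks = [("stack vehicle", (3, 5, 9)),
         ("integrated test", (2, 4, 10)),
         ("pad operations", (1, 2, 5))]

def campaign_duration():
    # random.triangular(low, high, mode) samples one task duration
    return sum(random.triangular(lo, hi, mode) for _, (lo, mode, hi) in tasks)

samples = sorted(campaign_duration() for _ in range(10_000))
p50 = samples[len(samples) // 2]
p90 = samples[int(0.9 * len(samples))]
print(f"median campaign: {p50:.1f} days, 90th percentile: {p90:.1f} days")
```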

  11. A participative and facilitative conceptual modelling framework for discrete event simulation studies in healthcare

    OpenAIRE

    Kotiadis, Kathy; Tako, Antuela; Vasilakis, Christos

    2014-01-01

    Existing approaches to conceptual modelling (CM) in discrete-event simulation do not formally support the participation of a group of stakeholders. Simulation in healthcare can benefit from stakeholder participation as it makes it possible to share multiple views and tacit knowledge from different parts of the system. We put forward a framework tailored to healthcare that supports the interaction of simulation modellers with a group of stakeholders to arrive at a common conceptual model. The fra...

  12. Fractional counts-the simulation of low probability events

    International Nuclear Information System (INIS)

    Coldwell, R.L.; Lasche, G.P.; Jadczyk, A.

    2001-01-01

    The code RobSim has been added to RobWin [1]. It simulates spectra resulting from gamma rays striking an array of detectors made up of different components. These components are frequently used to set coincidence and anti-coincidence windows that decide whether individual events are part of the signal. The first problem addressed is the construction of the detector. Then, owing to the statistical nature of the responses of these elements, the randomness of the response can be taken into account by including fractional counts in the output spectrum. This somewhat complicates the error analysis, as Poisson statistics are no longer applicable.
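
    The sketch below illustrates, under assumed weights and energies, why fractional counts break plain Poisson error bars: when each event contributes a weight w to a bin, the bin variance is the sum of w² rather than the bin content itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Each simulated event deposits a "fractional count" w in a spectrum bin,
# e.g. the probability that it would have passed a coincidence window.
weights = rng.uniform(0.0, 1.0, size=5000)     # illustrative weights
energies = rng.normal(661.7, 30.0, size=5000)  # illustrative photopeak, keV

bins = np.linspace(500.0, 800.0, 61)
content, _ = np.histogram(energies, bins=bins, weights=weights)
# For weighted (fractional) counts the per-bin variance is sum(w^2),
# not the bin content itself as plain Poisson statistics would give.
variance, _ = np.histogram(energies, bins=bins, weights=weights**2)
sigma = np.sqrt(variance)
```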

  13. Estimating ICU bed capacity using discrete event simulation.

    Science.gov (United States)

    Zhu, Zhecheng; Hen, Bee Hoon; Teow, Kiok Liang

    2012-01-01

    The intensive care unit (ICU) in a hospital caters for critically ill patients. The number of ICU beds has a direct impact on many aspects of hospital performance: a lack of ICU beds may cause ambulance diversion and surgery cancellation, while an excess of ICU beds may cause a waste of resources. This paper aims to develop a discrete event simulation (DES) model to help healthcare service providers determine the ICU bed capacity that strikes a balance between service level and cost effectiveness. The DES model is developed to reflect the complex patient flow of the ICU system. Actual operational data, including emergency arrivals, elective arrivals and length of stay, are fed directly into the DES model to capture the variations in the system. The DES model is validated by open-box and black-box tests. The validated model is used to test two what-if scenarios of interest to the healthcare service providers: the number of ICU beds in service needed to meet a target rejection rate, and the extra ICU beds in service needed to meet demand growth. A 12-month period of actual operational data was collected from an ICU department with 13 ICU beds in service. Comparison between the simulation results and the actual situation shows that the DES model accurately captures the variations in the system and is flexible enough to simulate various what-if scenarios. DES helps the healthcare service providers describe the current situation and simulate what-if scenarios for future planning.
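
    A stripped-down version of the bed-capacity question can be written as a loss-system simulation, as sketched below. The arrival rate, exponential length-of-stay distribution and bed counts are illustrative assumptions (the paper feeds in actual operational data and a more detailed patient flow); the sketch only shows how a rejection rate emerges from a given number of beds.

```python
import heapq, random

def icu_rejection_rate(beds, arrivals_per_day, mean_los_days, days=365 * 5, seed=1):
    """Toy bed-occupancy simulation: Poisson arrivals, exponential length of stay,
    patients are rejected (diverted) when all beds are occupied."""
    random.seed(seed)
    t, rejected, admitted = 0.0, 0, 0
    discharges = []            # heap of discharge times for occupied beds
    while t < days:
        t += random.expovariate(arrivals_per_day)          # next arrival
        while discharges and discharges[0] <= t:
            heapq.heappop(discharges)                       # free beds
        if len(discharges) < beds:
            admitted += 1
            heapq.heappush(discharges, t + random.expovariate(1.0 / mean_los_days))
        else:
            rejected += 1
    return rejected / (rejected + admitted)

for beds in (11, 13, 15):
    print(beds, "beds ->", round(icu_rejection_rate(beds, 3.0, 3.5), 3))
```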

  14. Numerical simulation of a Mistral wind event occurring

    Science.gov (United States)

    Guenard, V.; Caccia, J. L.; Tedeschi, G.

    2003-04-01

    The experimental network of the ESCOMPTE field experiment (June-July 2001) is exploited to investigate the Mistral wind affecting the Marseille area (south of France). The Mistral is a northerly flow blowing across the Rhône valley and toward the Mediterranean sea, resulting from the dynamical low pressure generated in the wake of the Alps ridge. It brings cold, dry air masses and clear-sky conditions over the south-eastern part of France. Up to now, few scientific studies have been carried out on the Mistral, especially on the evolution of its 3-D structure, so its mesoscale numerical simulation is still relevant. The non-hydrostatic RAMS model is used to better investigate this mesoscale phenomenon. Simulations at a 12 km horizontal resolution are compared to boundary-layer wind profilers and ground measurements. Preliminary results agree quite well with the Mistral statistical studies carried out by the operational service of Météo-France, and observed wind profiles are correctly reproduced by the RAMS model, which appears to be an efficient tool for understanding the Mistral. Because Mistral events lack the diabatic effects that complicate numerical simulations, the present work is a first step toward validating the RAMS model in this area. Further work will address the interaction of the Mistral with the land-sea breeze. RAMS simulations will also be combined with aerosol production and ocean circulation models to supply chemists and oceanographers with answers for their studies.

  15. Self-Adaptive Event-Driven Simulation of Multi-Scale Plasma Systems

    Science.gov (United States)

    Omelchenko, Yuri; Karimabadi, Homayoun

    2005-10-01

    Multi-scale plasmas pose a formidable computational challenge. Explicit time-stepping models suffer from the global CFL restriction. Efficient application of adaptive mesh refinement (AMR) to systems with irregular dynamics (e.g. turbulence, diffusion-convection-reaction, particle acceleration, etc.) may be problematic. To address these issues, we developed an alternative approach to time stepping: self-adaptive discrete-event simulation (DES). DES has its origins in operations research, war games and telecommunications. We combine finite-difference and particle-in-cell techniques with this methodology under two assumptions: (1) the local time increment dt for a discrete quantity f can be expressed in terms of a physically meaningful quantum value df; (2) f is considered to be modified only when its change exceeds df. Event-driven time integration is self-adaptive as it makes use of causality rules rather than parametric time dependencies. This technique enables asynchronous, flux-conservative updates of the solution in accordance with local temporal scales, removes the curse of the global CFL condition, eliminates unnecessary computation in inactive spatial regions and results in robust and fast parallelizable codes. It can be naturally combined with various mesh refinement techniques. We discuss applications of this novel technology to diffusion-convection-reaction systems and hybrid simulations of magnetosonic shocks.
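
    The quantum-based update rule described above can be sketched for a toy 1D diffusion problem: each cell schedules its own next event at the time its value would have changed by df, and only the earliest event is processed. This is an illustrative reconstruction of the general idea, not the authors' plasma code.

```python
import heapq
import numpy as np

def event_driven_diffusion(f, D, df, t_end):
    """Asynchronous, event-driven update of 1D diffusion: a cell is only
    updated once its accumulated change would exceed the quantum df."""
    f = f.astype(float).copy()
    n = len(f)

    def rate(i):                      # df/dt from a simple 3-point Laplacian
        return D * (f[(i - 1) % n] - 2.0 * f[i] + f[(i + 1) % n])

    def next_time(i, now):
        r = rate(i)
        return now + df / abs(r) if r != 0.0 else np.inf

    heap = [(next_time(i, 0.0), i) for i in range(n)]
    heapq.heapify(heap)
    stamp = {i: te for te, i in heap}        # latest scheduled event per cell
    while heap:
        te, i = heapq.heappop(heap)
        if te != stamp[i]:                   # stale event, skip
            continue
        if te > t_end:
            break
        f[i] += np.sign(rate(i)) * df        # quantized local update
        for j in (i - 1, i, i + 1):          # reschedule cell and neighbours
            j %= n
            stamp[j] = next_time(j, te)
            heapq.heappush(heap, (stamp[j], j))
    return f

f0 = np.zeros(64); f0[32] = 1.0
print(event_driven_diffusion(f0, D=1.0, df=0.01, t_end=1.0).round(3))
```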

  16. Simulation of Random Events for Air Traffic Applications

    Directory of Open Access Journals (Sweden)

    Stéphane Puechmorel

    2018-05-01

    Full Text Available Resilience to uncertainties must be ensured in air traffic management. Unexpected events can be either disruptive, like thunderstorms or the famous volcanic ash cloud resulting from the Eyjafjallajökull eruption in Iceland, or simply the result of imprecise measurements or incomplete knowledge of the environment. While human operators are able to cope with such situations, this is generally not the case for automated decision-support tools. Important examples originate from the numerous attempts made to design algorithms able to solve conflicts between aircraft occurring during flights. The STARGATE (STochastic AppRoach for naviGATion functions in uncertain Environment) project was initiated in order to study the feasibility of inherently robust automated planning algorithms that will not fail when subjected to random perturbations. A mandatory first step is the ability to simulate the usual stochastic phenomena impairing the system: delays due to airport platforms or air traffic control (ATC) and uncertainties in the wind velocity. The work presented here details algorithms suitable for this simulation task.

  17. Rational versus Emotional Reasoning in a Realistic Multi-Objective Environment

    OpenAIRE

    Mayboudi, Seyed Mohammad Hossein

    2011-01-01

    ABSTRACT: Emotional intelligence and its associated models have recently become one of the new active areas of study in the field of artificial intelligence. Several works have been performed on the modelling of emotional behaviours such as love, hate, happiness and sadness. This study presents a comparative evaluation of rational and emotional behaviours and the effects of emotions on the decision-making process of agents in a realistic multi-objective environment. NetLogo simulation environment is u...

  18. Handling of the Generation of Primary Events in Gauss, the LHCb Simulation Framework

    CERN Multimedia

    Corti, G; Brambach, T; Brook, N H; Gauvin, N; Harrison, K; Harrison, P; He, J; Ilten, P J; Jones, C R; Lieng, M H; Manca, G; Miglioranzi, S; Robbe, P; Vagnoni, V; Whitehead, M; Wishahi, J

    2010-01-01

    The LHCb simulation application, Gauss, consists of two independent phases, the generation of the primary event and the tracking of particles produced in the experimental setup. For the LHCb experimental program it is particularly important to model B meson decays: the EvtGen code developed in CLEO and BaBar has been chosen and customized for non-coherent B production as occurring in pp collisions at the LHC. The initial proton-proton collision is provided by a different generator engine, currently Pythia 6, for massive production of signal and generic pp collision events. Beam-gas events, background events originating from proton halo, cosmics and calibration events for different detectors can be generated in addition to pp collisions. Different generator packages are available in the physics community or specifically developed in LHCb, and are used for the different purposes. Running conditions affecting the events generated, such as the size of the luminous region, the number of collisions occurring in a bunc...

  19. Simulating flaring events in complex active regions driven by observed magnetograms

    Science.gov (United States)

    Dimitropoulou, M.; Isliker, H.; Vlahos, L.; Georgoulis, M. K.

    2011-05-01

    Context. We interpret solar flares as events originating in active regions that have reached the self-organized critical state, by using a refined cellular automaton model with initial conditions derived from observations. Aims: We investigate whether the system, with its imposed physical elements, reaches a self-organized critical state and whether well-known statistical properties of flares, such as scaling laws observed in the distribution functions of characteristic parameters, are reproduced after this state has been reached. Methods: To investigate whether the distribution functions of total energy, peak energy and event duration follow the expected scaling laws, we first applied a nonlinear force-free extrapolation that reconstructs the three-dimensional magnetic fields from two-dimensional vector magnetograms. We then located magnetic discontinuities exceeding a threshold in the Laplacian of the magnetic field. These discontinuities are relaxed in local diffusion events, implemented in the form of cellular automaton evolution rules. Subsequent loading and relaxation steps lead the system to self-organized criticality, after which the statistical properties of the simulated events are examined. Physical requirements, such as the divergence-free condition for the magnetic field vector, are approximately imposed on all elements of the model. Results: Our results show that self-organized criticality is indeed reached when applying specific loading and relaxation rules. Power-law indices obtained from the distribution functions of the modeled flaring events are in good agreement with observations. Single power laws (peak and total flare energy) are obtained, as are power laws with exponential cutoff and double power laws (flare duration). The results are also compared with observational X-ray data from the GOES satellite for our active-region sample. Conclusions: We conclude that well-known statistical properties of flares are reproduced after the system has
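
    The loading/relaxation cycle that drives the model to self-organized criticality is structurally similar to a classic sandpile cellular automaton. The sketch below is that generic sandpile, offered only to illustrate how threshold relaxation rules produce avalanches with a broad (power-law-like) size distribution; it does not include the magnetic-field, divergence-free or force-free ingredients of the actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

def run_sandpile(n=32, steps=20_000, zc=4):
    """Minimal 2D sandpile: slow loading plus threshold relaxation.
    Avalanche sizes develop a broad distribution once the system
    has self-organized."""
    z = np.zeros((n, n), dtype=int)
    sizes = []
    for _ in range(steps):
        i, j = rng.integers(0, n, size=2)
        z[i, j] += 1                              # loading step
        size = 0
        unstable = [(i, j)] if z[i, j] >= zc else []
        while unstable:
            x, y = unstable.pop()
            if z[x, y] < zc:
                continue
            z[x, y] -= 4                          # relaxation (toppling)
            size += 1
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nx, ny = x + dx, y + dy
                if 0 <= nx < n and 0 <= ny < n:   # open boundaries lose grains
                    z[nx, ny] += 1
                    if z[nx, ny] >= zc:
                        unstable.append((nx, ny))
        if size:
            sizes.append(size)
    return np.array(sizes)

sizes = run_sandpile()
print("number of avalanches:", sizes.size, "largest:", sizes.max())
```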

  20. Triangulating and guarding realistic polygons

    NARCIS (Netherlands)

    Aloupis, G.; Bose, P.; Dujmovic, V.; Gray, C.M.; Langerman, S.; Speckmann, B.

    2014-01-01

    We propose a new model of realistic input: k-guardable objects. An object is k-guardable if its boundary can be seen by k guards. We show that k-guardable polygons generalize two previously identified classes of realistic input. Following this, we give two simple algorithms for triangulating

  1. Thermohydraulic simulation of HTR-10 nuclear reactor core using realistic CFD approach; Simulacao termohidraulica do nucleo do reator nuclear HTR-10 com o uso da abordagem realistica CFD

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Alexandro S.; Dominguez, Dany S., E-mail: alexandrossilva@gmail.com, E-mail: dsdominguez@gmail.com [Universidade Estadual de Santa Cruz (UESC), Ilheus, BA (Brazil); Mazaira, Leorlen Y. Rojas; Hernandez, Carlos R.G., E-mail: leored1984@gmail.com, E-mail: cgh@instec.cu [Instituto Superior de Tecnologias y Ciencias Aplicadas, La Habana (Cuba); Lira, Carlos Alberto Brayner de Oliveira, E-mail: cabol@ufpe.br [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil)

    2015-07-01

    High-temperature gas-cooled reactors (HTGRs) have the potential to be used as energy generation sources in the near future, owing to their inherently safe performance achieved by using a large amount of graphite, a low-power-density design, and high conversion efficiency. However, safety is the most important issue for their commercialization in the nuclear energy industry, and it is very important for the safety design and operation of an HTGR to investigate its thermal-hydraulic characteristics. In this article, a thermal-hydraulic simulation of the compressible flow inside the core of the pebble-bed reactor HTR-10 (High Temperature Reactor) was performed using Computational Fluid Dynamics (CFD). A realistic approach was used, in which every closely packed pebble is modelled explicitly with a graphite layer and a fuel sphere. Because of the high computational cost it is impossible to simulate the full core; therefore, the geometry used is a column of FCC (face-centred cubic) cells with 41 layers and 82 pebbles. The input data were taken from the thermohydraulic IAEA benchmark (TECDOC-1694). The results show the velocity and temperature profiles of the coolant in the core, and the temperature distribution inside the pebbles. The maximum temperatures in the pebbles do not exceed the allowable limit for this type of nuclear fuel. (author)

  2. Simulation of air admission in a propeller hydroturbine during transient events

    Science.gov (United States)

    Nicolle, J.; Morissette, J.-F.

    2016-11-01

    In this study, multiphysics simulations are carried out in order to model the fluid loading and structural stresses on propeller blades during startup and runaway. It is found that air admission plays an important role during these transient events and that two-phase simulations are therefore required. At the speed-no-load regime, a large air pocket with a vertical free surface forms in the centre of the runner, displacing the water flow toward the shroud. This significantly affects the torque developed on the blades and thus the structural loading. The resulting pressures are applied to a quasi-static structural model, and good agreement is obtained with experimental strain-gauge data.

  3. Discrete event simulation of the Defense Waste Processing Facility (DWPF) analytical laboratory

    International Nuclear Information System (INIS)

    Shanahan, K.L.

    1992-02-01

    A discrete event simulation of the Savannah River Site (SRS) Defense Waste Processing Facility (DWPF) analytical laboratory has been constructed in the GPSS language. It was used to estimate laboratory analysis times at process analytical hold points and to study the effect of sample number on those times. Typical results are presented for three different simulations representing increasing levels of complexity, and for different sampling schemes. Example equipment utilization time plots are also included. SRS DWPF laboratory management and chemists found the simulations very useful for resource and schedule planning.

  4. Robust control of decoherence in realistic one-qubit quantum gates

    International Nuclear Information System (INIS)

    Protopopescu, V; Perez, R; D'Helon, C; Schmulen, J

    2003-01-01

    We present an open-loop (bang-bang) scheme to control decoherence in a generic one-qubit quantum gate and implement it in a realistic simulation. The system is consistently described within the spin-boson model, with interactions accounting for both adiabatic and thermal decoherence. The external control is included from the beginning in the Hamiltonian as an independent interaction term. After tracing out the environment modes, reduced equations are obtained for the two-level system in which the effects of both decoherence and external control appear explicitly. The controls are determined exactly from the condition to eliminate decoherence, i.e. to restore unitarity. Numerical simulations show excellent performance and robustness of the proposed control scheme

  5. Optimizing a Water Simulation based on Wavefront Parameter Optimization

    OpenAIRE

    Lundgren, Martin

    2017-01-01

    DICE, a Swedish game company, wanted a more realistic water simulation. Currently, most large scale water simulations used in games are based upon ocean simulation technology. These techniques falter when used in other scenarios, such as coastlines. In order to produce a more realistic simulation, a new one was created based upon the water simulation technique "Wavefront Parameter Interpolation". This technique involves a rather extensive preprocess that enables ocean simulations to have inte...

  6. Identification of coronal heating events in 3D simulations

    Science.gov (United States)

    Kanella, Charalambos; Gudiksen, Boris V.

    2017-07-01

    Context. The solar coronal heating problem has been an open question in the science community since 1939. One of the proposed models for the transport and release of mechanical energy generated in the sub-photospheric layers and photosphere is the magnetic reconnection model that incorporates Ohmic heating, which releases a part of the energy stored in the magnetic field. In this model many unresolved flaring events occur in the solar corona, releasing enough energy to heat the corona. Aims: The problem with the verification and quantification of this model is that we cannot resolve small-scale events due to limitations of the current observational instrumentation. Flaring events have scaling behavior extending from large X-class flares down to the so far unobserved nanoflares. Histograms of observable characteristics of flares show power-law behavior for energy release rate, size, and total energy. Depending on the power-law index of the energy release, nanoflares might be an important candidate for coronal heating; we seek to find that index. Methods: In this paper we employ a numerical three-dimensional (3D) magnetohydrodynamic (MHD) simulation produced by the numerical code Bifrost, which enables us to look into smaller structures, and a new technique to identify the 3D heating events at a specific instant. The quantity we explore is the Joule heating, a term calculated directly by the code, which is explicitly correlated with the magnetic reconnection because it depends on the curl of the magnetic field. Results: We are able to identify 4136 events in a volume of 24 × 24 × 9.5 Mm³ (i.e., 768 × 786 × 331 grid cells) of a specific snapshot. We find a power-law slope of the released energy per second equal to αP = 1.5 ± 0.02, and two power-law slopes of the identified volume equal to αV = 1.53 ± 0.03 and αV = 2.53 ± 0.22. The identified energy events do not represent all the released energy, but of the identified events, the total energy of the largest events
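
    Power-law indices such as αP are commonly estimated with a maximum-likelihood (Hill/Clauset-type) estimator rather than by fitting a line to a binned histogram. The sketch below shows that estimator on synthetic data; it is a generic illustration, not the fitting procedure used in the paper.

```python
import numpy as np

def power_law_index(x, x_min):
    """Maximum-likelihood estimate of alpha for p(x) ~ x^-alpha, x >= x_min
    (continuous Hill/Clauset-style estimator), with its standard error."""
    x = np.asarray(x, dtype=float)
    x = x[x >= x_min]
    n = x.size
    alpha = 1.0 + n / np.sum(np.log(x / x_min))
    return alpha, (alpha - 1.0) / np.sqrt(n)

# Illustrative check on synthetic data drawn with alpha = 1.5
rng = np.random.default_rng(3)
u = rng.random(4000)
samples = (1.0 - u) ** (-1.0 / 0.5)          # inverse-CDF sampling, alpha = 1.5
print(power_law_index(samples, x_min=1.0))   # roughly (1.5, 0.008)
```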

  7. Risk-based determination of design pressure of LNG fuel storage tanks based on dynamic process simulation combined with Monte Carlo method

    International Nuclear Information System (INIS)

    Noh, Yeelyong; Chang, Kwangpil; Seo, Yutaek; Chang, Daejun

    2014-01-01

    This study proposes a new methodology that combines dynamic process simulation (DPS) and Monte Carlo simulation (MCS) to determine the design pressure of fuel storage tanks on LNG-fueled ships. Because the pressure of such tanks varies with time, DPS is employed to predict the pressure profile. Though equipment failure and subsequent repair affect transient pressure development, it is difficult to implement these features directly in the process simulation due to the randomness of failures. To predict the pressure behavior realistically, MCS is combined with DPS. In MCS, discrete events are generated to create a lifetime scenario for a system. The combination of MCS with long-term DPS reveals the frequency of pressure exceedance. The exceedance curve of the pressure provides risk-based information for determining the design pressure based on risk acceptance criteria, which may vary with different points of view. - Highlights: • The realistic operation scenario of the LNG FGS system is estimated by MCS. • In repeated MCS trials, the availability of the FGS system is evaluated. • The realistic pressure profile is obtained by the proposed methodology. • The exceedance curve provides risk-based information for determining design pressure
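
    The sketch below caricatures the DPS+MCS coupling: failures are drawn as a Poisson process over an assumed ship lifetime and each episode's peak pressure is drawn from a placeholder distribution standing in for the dynamic process simulation. The failure rate, lifetime and log-normal peak-pressure model are assumptions; the point is only how an exceedance frequency per candidate design pressure is obtained.

```python
import numpy as np

rng = np.random.default_rng(2)
YEARS = 20.0                       # assumed ship lifetime

def lifetime_peak_pressures(failure_rate=0.5, n_lifetimes=5000):
    """Toy MCS: equipment failures occur as a Poisson process over the lifetime;
    each failure episode produces a peak tank pressure drawn from an assumed
    distribution (a stand-in for the dynamic process simulation)."""
    peaks = np.zeros(n_lifetimes)
    for k in range(n_lifetimes):
        n_events = rng.poisson(failure_rate * YEARS)
        if n_events:
            peaks[k] = rng.lognormal(np.log(6.0), 0.25, size=n_events).max()
    return peaks

peaks = lifetime_peak_pressures()
for p_design in (7.0, 8.0, 9.0):                   # barg, illustrative candidates
    freq = np.mean(peaks > p_design) / YEARS        # exceedances per year
    print(f"design pressure {p_design} barg -> exceedance frequency {freq:.1e} /yr")
```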

  8. RailSiTe® (Rail Simulation and Testing)

    Directory of Open Access Journals (Sweden)

    Martin Johne

    2016-10-01

    Full Text Available RailSiTe® (Rail Simulation and Testing) is DLR's rail simulation and testing laboratory (see Figure 1). It is the implementation of a fully modular concept for the simulation of on-board and trackside control and safety technology. The RailSiTe® laboratory additionally comprises the RailSET (Railway Simulation Environment for Train Drivers and Operators) human-factors laboratory, a realistic environment containing a realistic train mockup including 3D simulation.

  9. Discrete-event simulation for the design and evaluation of physical protection systems

    International Nuclear Information System (INIS)

    Jordan, S.E.; Snell, M.K.; Madsen, M.M.; Smith, J.S.; Peters, B.A.

    1998-01-01

    This paper explores the use of discrete-event simulation for the design and control of physical protection systems for fixed-site facilities housing items of significant value. It begins by discussing several modeling and simulation activities currently performed in designing and analyzing these protection systems and then discusses capabilities that design/analysis tools should have. The remainder of the article then discusses in detail how some of these new capabilities have been implemented in software to achieve a prototype design and analysis tool. The simulation software technology provides a communications mechanism between a running simulation and one or more external programs. In the prototype security analysis tool, these capabilities are used to facilitate human-in-the-loop interaction and to support a real-time connection to a virtual reality (VR) model of the facility being analyzed. This simulation tool can be used for both training (in real-time mode) and facility analysis and design (in fast mode)

  10. Discrete event simulation of crop operations in sweet pepper in support of work method innovation

    NARCIS (Netherlands)

    Ooster, van 't Bert; Aantjes, Wiger; Melamed, Z.

    2017-01-01

    Greenhouse Work Simulation, GWorkS, is a model that simulates crop operations in greenhouses for the purpose of analysing work methods. GWorkS is a discrete event model that approaches reality as a discrete stochastic dynamic system. GWorkS was developed and validated using cut-rose as a case

  11. Decentralized real-time simulation of forest machines

    Science.gov (United States)

    Freund, Eckhard; Adam, Frank; Hoffmann, Katharina; Rossmann, Juergen; Kraemer, Michael; Schluse, Michael

    2000-10-01

    Developing realistic forest machine simulators is a demanding task. A useful simulator has to provide a close-to-reality simulation of the forest environment as well as a simulation of the physics of the vehicle. Customers demand a highly realistic three-dimensional forestry landscape and a realistic simulation of the complex motion of the vehicle, even in rough terrain, so that the simulator can be used for operator training under close-to-reality conditions. The realistic simulation of the vehicle, especially with the driver's seat mounted on a motion platform, greatly improves the immersion into the virtual reality of a simulated forest and the achievable level of driver training. Thus, the connection of the real control devices of forest machines to the simulation system has to be supported, i.e. devices such as the joysticks or the on-board computer system that controls the crane, the aggregate, etc. In addition, the fusion of the on-board computer system and the simulation system is realized by means of sensors, i.e. digital and analog signals. The decentralized system structure allows several virtual reality systems to evaluate and visualize the information from the control devices and the sensors. So, while the driver is practicing, the instructor can immerse himself in the same virtual forest to monitor the session from his own viewpoint. In this paper, we describe the realized structure as well as the necessary software and hardware components and application experiences.

  12. Simulation of rainfall-runoff for major flash flood events in Karachi

    Science.gov (United States)

    Zafar, Sumaira

    2016-07-01

    The metropolitan city of Karachi has strategic importance for Pakistan. With each passing decade the city has faced urban sprawl and rapid population growth. These rapid changes directly affect the natural resources of the city, including its drainage pattern. Karachi has major rivers, including the Malir River, with a catchment area of 2252 sq km, and the Lyari River, with a catchment area of about 470.4 sq km. These are non-perennial rivers, active only during storms. The conversion of natural surfaces into hard pavement is increasing the rainfall-runoff response: the curve number has increased, which now causes flash floods in the urban localities of Karachi. Only one gauge is installed upstream on the river, and no discharge records are available; a single upstream gauge is not sufficient for discharge measurement. To simulate the maximum discharge of the Malir River, rainfall data (1985 to 2014) were collected from the Pakistan Meteorological Department, and major rainfall events were used to simulate the rainfall-runoff. The maximum rainfall-runoff responses were recorded during 1994, 2007 and 2013. This runoff causes damage and inundation in floodplain areas of Karachi. These flash-flood events not only damage property but also cause loss of life.
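
    The curve number mentioned above enters runoff estimation through the standard SCS-CN relation, sketched below; the storm depth and curve numbers are illustrative, not values from the study.

```python
def scs_runoff_mm(rainfall_mm, curve_number):
    """SCS Curve Number method: direct runoff depth Q from event rainfall P.
    S is the potential maximum retention; Ia = 0.2*S is the standard
    initial abstraction."""
    s = 25400.0 / curve_number - 254.0          # retention, mm
    ia = 0.2 * s
    if rainfall_mm <= ia:
        return 0.0
    return (rainfall_mm - ia) ** 2 / (rainfall_mm + 0.8 * s)

# Illustrative comparison: the same 80 mm storm on natural vs paved surfaces
for cn in (70, 85, 95):                         # hypothetical curve numbers
    print(f"CN={cn}: runoff = {scs_runoff_mm(80.0, cn):.1f} mm")
```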

  13. Simulation of size-dependent aerosol deposition in a realistic model of the upper human airways

    NARCIS (Netherlands)

    Frederix, E.M.A.; Kuczaj, Arkadiusz K.; Nordlund, Markus; Belka, M.; Lizal, F.; Elcner, J.; Jicha, M.; Geurts, Bernardus J.

    An Eulerian internally mixed aerosol model is used for predictions of deposition inside a realistic cast of the human upper airways. The model, formulated in the multi-species and compressible framework, is solved using the sectional discretization of the droplet size distribution function to

  14. Impacts of Realistic Urban Heating, Part I: Spatial Variability of Mean Flow, Turbulent Exchange and Pollutant Dispersion

    Science.gov (United States)

    Nazarian, Negin; Martilli, Alberto; Kleissl, Jan

    2018-03-01

    As urbanization progresses, more realistic methods are required to analyze the urban microclimate. However, given the complexity and computational cost of numerical models, the effects of realistic representations should be evaluated to identify the level of detail required for an accurate analysis. We consider the realistic representation of surface heating in an idealized three-dimensional urban configuration, and evaluate the spatial variability of flow statistics (mean flow and turbulent fluxes) in urban streets. Large-eddy simulations coupled with an urban energy balance model are employed, and the heating distribution of urban surfaces is parametrized using sets of horizontal and vertical Richardson numbers, characterizing thermal stratification and heating orientation with respect to the wind direction. For all studied conditions, the thermal field is strongly affected by the orientation of heating with respect to the airflow. The modification of airflow by the horizontal heating is also pronounced for strongly unstable conditions. The formation of the canyon vortices is affected by the three-dimensional heating distribution in both spanwise and streamwise street canyons, such that the secondary vortex is seen adjacent to the windward wall. For the dispersion field, however, the overall heating of urban surfaces, and more importantly, the vertical temperature gradient, dominate the distribution of concentration and the removal of pollutants from the building canyon. Accordingly, the spatial variability of concentration is not significantly affected by the detailed heating distribution. The analysis is extended to assess the effects of three-dimensional surface heating on turbulent transfer. Quadrant analysis reveals that the differential heating also affects the dominance of ejection and sweep events and the efficiency of turbulent transfer (exuberance) within the street canyon and at the roof level, while the vertical variation of these parameters is less

  15. Development of a realistic, dynamic digital brain phantom for CT perfusion validation

    Science.gov (United States)

    Divel, Sarah E.; Segars, W. Paul; Christensen, Soren; Wintermark, Max; Lansberg, Maarten G.; Pelc, Norbert J.

    2016-03-01

    Physicians rely on CT Perfusion (CTP) images and quantitative image data, including cerebral blood flow, cerebral blood volume, and bolus arrival delay, to diagnose and treat stroke patients. However, the quantification of these metrics may vary depending on the computational method used. Therefore, we have developed a dynamic and realistic digital brain phantom upon which CTP scans can be simulated based on a set of ground truth scenarios. Building upon the previously developed 4D extended cardiac-torso (XCAT) phantom containing a highly detailed brain model, this work consisted of expanding the intricate vasculature by semi-automatically segmenting existing MRA data and fitting nonuniform rational B-spline surfaces to the new vessels. Using time attenuation curves input by the user as reference, the contrast enhancement in the vessels changes dynamically. At each time point, the iodine concentration in the arteries and veins is calculated from the curves and the material composition of the blood changes to reflect the expected values. CatSim, a CT system simulator, generates simulated data sets of this dynamic digital phantom which can be further analyzed to validate CTP studies and post-processing methods. The development of this dynamic and realistic digital phantom provides a valuable resource with which current uncertainties and controversies surrounding the quantitative computations generated from CTP data can be examined and resolved.
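
    Time attenuation curves of the kind supplied to the phantom are often described analytically by gamma-variate functions. The sketch below generates such curves; the timing, shape parameters and peak enhancements are illustrative assumptions, and this is not the XCAT/CatSim interface.

```python
import numpy as np

def gamma_variate_tac(t, t0=8.0, alpha=3.0, beta=1.5, peak_hu=350.0):
    """Gamma-variate time-attenuation curve, a common analytic model for
    first-pass contrast enhancement. Returns enhancement in HU."""
    tau = np.clip(t - t0, 0.0, None)
    shape = tau**alpha * np.exp(-tau / beta)
    return peak_hu * shape / shape.max() if shape.max() > 0 else shape

t = np.arange(0.0, 60.0, 0.5)                 # seconds
arterial = gamma_variate_tac(t)
venous = gamma_variate_tac(t, t0=14.0, alpha=3.5, beta=2.0, peak_hu=280.0)
```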

  16. Event Shape Sorting: selecting events with similar evolution

    Directory of Open Access Journals (Sweden)

    Tomášik Boris

    2017-01-01

    Full Text Available We present a novel method for the organisation of events. The method is based on comparing event-by-event histograms of a chosen quantity Q that is measured for each particle in every event. The events are organised in such a way that those with similar shapes of the Q-histograms end up placed close to each other. We apply the method to histograms of the azimuthal angle of the produced hadrons in ultrarelativistic nuclear collisions. By selecting events with a similar azimuthal shape of their hadron distribution, one chooses events that are likely to have undergone similar evolution from the initial state to freeze-out. Such events can more easily be compared to theoretical simulations where all conditions can be controlled. We illustrate the method on data simulated by the AMPT model.
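
    A heavily simplified stand-in for the sorting idea is sketched below: each event is reduced to a normalised azimuthal histogram (measured from its own event plane) and events are then ordered along the first principal component of the histogram matrix, so that neighbouring events have similar shapes. The published method uses an iterative sorting procedure; this sketch, with invented toy events, only conveys the notion of ordering events by histogram shape.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy events: azimuthal angles with event-by-event elliptic modulation
def make_event(n_particles, v2, psi):
    # accept-reject sampling of dN/dphi ~ 1 + 2*v2*cos(2*(phi - psi))
    phi = []
    while len(phi) < n_particles:
        x = rng.uniform(0.0, 2.0 * np.pi)
        if rng.uniform(0.0, 1.0 + 2.0 * v2) < 1.0 + 2.0 * v2 * np.cos(2.0 * (x - psi)):
            phi.append(x)
    return np.array(phi)

events = [make_event(300, v2=rng.uniform(0.0, 0.15), psi=rng.uniform(0.0, np.pi))
          for _ in range(200)]

# Event-by-event normalised histograms of phi measured from each event's own plane
bins = np.linspace(0.0, 2.0 * np.pi, 17)
H = []
for phi in events:
    psi2 = 0.5 * np.arctan2(np.mean(np.sin(2 * phi)), np.mean(np.cos(2 * phi)))
    h, _ = np.histogram((phi - psi2) % (2.0 * np.pi), bins=bins, density=True)
    H.append(h)
H = np.array(H)

# Simplified sorting: order events along the first principal component of the
# histogram matrix, so neighbouring events have similar histogram shapes.
Hc = H - H.mean(axis=0)
u, s, vt = np.linalg.svd(Hc, full_matrices=False)
order = np.argsort(u[:, 0] * s[0])
sorted_events = [events[i] for i in order]
```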

  17. Realistic roofs over a rectilinear polygon

    KAUST Repository

    Ahn, Heekap

    2013-11-01

    Given a simple rectilinear polygon P in the xy-plane, a roof over P is a terrain over P whose faces are supported by planes through edges of P that make a dihedral angle π/4 with the xy-plane. According to this definition, some roofs may have faces isolated from the boundary of P or even local minima, which are undesirable for several practical reasons. In this paper, we introduce realistic roofs by imposing a few additional constraints. We investigate the geometric and combinatorial properties of realistic roofs and show that the straight skeleton induces a realistic roof with maximum height and volume. We also show that the maximum possible number of distinct realistic roofs over P is the binomial coefficient ((n-4)/2 choose ⌊(n-4)/4⌋) when P has n vertices. We present an algorithm that enumerates a combinatorial representation of each such roof in O(1) time per roof without repetition, after O(n⁴) preprocessing time. We also present an O(n⁵)-time algorithm for computing a realistic roof with minimum height or volume. © 2013 Elsevier B.V.

  18. Modeling and Analysis of Realistic Fire Scenarios in Spacecraft

    Science.gov (United States)

    Brooker, J. E.; Dietrich, D. L.; Gokoglu, S. A.; Urban, D. L.; Ruff, G. A.

    2015-01-01

    An accidental fire inside a spacecraft is an unlikely, but very real emergency situation that can easily have dire consequences. While much has been learned over the past 25+ years of dedicated research on flame behavior in microgravity, a quantitative understanding of the initiation, spread, detection and extinguishment of a realistic fire aboard a spacecraft is lacking. Virtually all combustion experiments in microgravity have been small-scale, by necessity (hardware limitations in ground-based facilities and safety concerns in space-based facilities). Large-scale, realistic fire experiments are unlikely for the foreseeable future (unlike in terrestrial situations). Therefore, NASA will have to rely on scale modeling, extrapolation of small-scale experiments and detailed numerical modeling to provide the data necessary for vehicle and safety system design. This paper presents the results of parallel efforts to better model the initiation, spread, detection and extinguishment of fires aboard spacecraft. The first is a detailed numerical model using the freely available Fire Dynamics Simulator (FDS). FDS is a CFD code that numerically solves a large eddy simulation form of the Navier-Stokes equations. FDS provides a detailed treatment of the smoke and energy transport from a fire. The simulations provide a wealth of information, but are computationally intensive and not suitable for parametric studies where the detailed treatment of the mass and energy transport is unnecessary. The second path extends a model previously documented at ICES meetings that attempted to predict maximum survivable fires aboard spacecraft. This one-dimensional model simplifies the heat and mass transfer as well as the toxic species production from a fire. These simplifications result in a code that is faster and more suitable for parametric studies (having already been used to help in the hatch design of the Multi-Purpose Crew Vehicle, MPCV).

  19. Simulating the influence of life trajectory events on transport mode behavior in an agent-based system

    NARCIS (Netherlands)

    Verhoeven, M.; Arentze, T.A.; Timmermans, H.J.P.; Waerden, van der P.J.H.J.

    2007-01-01

    This paper describes the results of a study on the impact of lifecycle or life trajectory events on activity-travel decisions. The lifecycle trajectory of individual agents can easily be incorporated in an agent-based simulation system. This paper focuses on two lifecycle events, change in

  20. Precision measurements in the weak interaction framework: development of realistic simulations for the LPCTrap device installed at GANIL

    International Nuclear Information System (INIS)

    Fabian, Xavier

    2015-01-01

    This work belongs to the effort presently deployed to measure the angular correlation parameter a_βν in three nuclear beta decays (⁶He⁺, ³⁵Ar⁺ and ¹⁹Ne⁺). The V-A structure of the weak interaction implies that a_βν = +1 for a pure Fermi transition and a_βν = -1/3 for a pure Gamow-Teller transition. A thorough measurement of this parameter to check any deviation from these values may lead to the discovery of possible exotic currents. Furthermore, the measurement of a_βν in mirror transitions allows the extraction of V_ud, the first element of the Cabibbo-Kobayashi-Maskawa (CKM) matrix. The LPCTrap apparatus, installed at GANIL, is designed to prepare a continuous ion beam for injection into a dedicated Paul trap. The latter device provides a quasi-point-like source from which the decay products are detected in coincidence. It is from the study of the recoil-ion time-of-flight (TOF) distribution that a_βν is extracted and, since 2010, the associated shake-off (SO) probabilities. This study requires the complete simulation of the LPCTrap experiments. The major part of this work is dedicated to such simulations, especially to the modelling of the trapped-ion-cloud dynamics. The Clouda program, which takes advantage of graphics processing units (GPUs), was developed in this context and its full characterization is presented here. Three important aspects are addressed: the electromagnetic trapping field, the realistic collisions between the ions and the buffer-gas atoms, and the space-charge effect. The present work shows the importance of these simulations for improving control of the systematic errors on a_βν. (author) [fr]

  1. The crystal zero degree detector at BESIII as a realistic high rate environment for evaluating PANDA data acquisition modules

    International Nuclear Information System (INIS)

    Werner, Marcel

    2015-03-01

    The BESIII experiment located in Beijing, China, is investigating physics in the energy region of the charm quark via electron-positron annihilation reactions. A small detector to be placed in the very forward/backward region around θ=0 at BESIII is foreseen to measure photons from the initial state. This is especially interesting, because it opens the door for various physics measurements over a wide range of energies, even below the experiment's designated energy threshold, which is fixed by the accelerator. This thesis investigates the capabilities of a crystal zero degree detector (cZDD) consisting of PbWO₄ crystals placed in that region of BESIII. Detailed Geant4-based simulations have been performed, and the energy resolution of the detector has been determined to be σ/μ=0.06+0.025/√(E[GeV]). The determination of the center-of-mass energy √s_ISR after the emission of the photon is of great importance for the study of such events. Preliminary simulations estimated the resolution of the reconstructed √s_ISR using the cZDD information to be significantly better than 10 % for appropriate photon impacts on the detector. Such events can only be investigated when data from the cZDD and other detectors of BESIII can be correlated. A fast and powerful data acquisition (DAQ) system capable of performing event correlation in real time is needed. DAQ modules capable of performing real-time event correlation are being developed for the PANDA experiment at the future FAIR facility in Darmstadt, Germany. Investigating these modules in a realistic high-rate environment such as that provided at BESIII offers a great opportunity to gain experience in real-time event correlation before the start of PANDA. Developments for the cZDD's DAQ using prototype PANDA DAQ modules have been carried out and successfully tested in experiments with radioactive sources and a beamtest with 210 MeV electrons at the Mainz Microtron.

  2. Development of a randomized 3D cell model for Monte Carlo microdosimetry simulations

    Energy Technology Data Exchange (ETDEWEB)

    Douglass, Michael; Bezak, Eva; Penfold, Scott [School of Chemistry and Physics, University of Adelaide, North Terrace, Adelaide 5005, South Australia (Australia) and Department of Medical Physics, Royal Adelaide Hospital, North Terrace, Adelaide 5000, South Australia (Australia)

    2012-06-15

    Purpose: The objective of the current work was to develop an algorithm for growing a macroscopic tumor volume from individual randomized quasi-realistic cells. The major physical and chemical components of the cell need to be modeled. It is intended to import the tumor volume into GEANT4 (and potentially other Monte Carlo packages) to simulate ionization events within the cell regions. Methods: A MATLAB® code was developed to produce a tumor coordinate system consisting of individual ellipsoidal cells randomized in their spatial coordinates, sizes, and rotations. An eigenvalue method using a mathematical equation to represent individual cells was used to detect overlapping cells. GEANT4 code was then developed to import the coordinate system into GEANT4 and populate it with individual cells of varying sizes and composed of the membrane, cytoplasm, reticulum, nucleus, and nucleolus. Each region is composed of chemically realistic materials. Results: The in-house developed MATLAB® code was able to grow semi-realistic cell distributions (≈2 × 10⁸ cells in 1 cm³) in under 36 h. The cell distribution can be used in any number of Monte Carlo particle tracking toolkits including GEANT4, which has been demonstrated in this work. Conclusions: Using the cell distribution and GEANT4, the authors were able to simulate ionization events in the individual cell components resulting from 80 keV gamma radiation (the code is applicable to other particles and a wide range of energies). This virtual microdosimetry tool will allow for a more complete picture of cell damage to be developed.

  3. Tropical climate and vegetation cover during Heinrich event 1: Simulations with coupled climate vegetation models

    OpenAIRE

    Handiani, Dian Noor

    2012-01-01

    This study focuses on the climate and vegetation responses to abrupt climate change in the Northern Hemisphere during the last glacial period. Two abrupt climate events are explored: the abrupt cooling of the Heinrich event 1 (HE1), followed by the abrupt warming of the Bølling-Allerød interstadial (BA). These two events are simulated by perturbing the freshwater balance of the Atlantic Ocean, with the intention of altering the Atlantic Meridional Overturning Circulation (AMOC) and also of in...

  4. Crash avoidance in response to challenging driving events: The roles of age, serialization, and driving simulator platform.

    Science.gov (United States)

    Bélanger, Alexandre; Gagnon, Sylvain; Stinchcombe, Arne

    2015-09-01

    We examined the crash avoidance behaviors of older and middle-aged drivers in reaction to six simulated challenging road events using two different driving simulator platforms. Thirty-five healthy adults aged 21-36 years old (M=28.9±3.96) and 35 healthy adults aged 65-83 years old (M=72.1±4.34) were tested using a mid-level simulator, and 27 adults aged 21-38 years old (M=28.6±6.63) and 27 healthy adults aged 65-83 years old (M=72.7±5.39) were tested on a low-cost desktop simulator. Participants completed a set of six challenging events varying in terms of the maneuvers required, avoiding space given, directional avoidance cues, and time pressure. Results indicated that older drivers showed higher crash risk when events required multiple synchronized reactions. In situations that required simultaneous use of steering and braking, older adults tended to crash significantly more frequently. As for middle-aged drivers, their crashes were attributable to faster driving speed. The same age-related driving patterns were observed across simulator platforms. Our findings support the hypothesis that older adults tend to react serially while engaging in cognitively challenging road maneuvers. Copyright © 2015 Elsevier Ltd. All rights reserved.

  5. Evaluation of a proposed optimization method for discrete-event simulation models

    Directory of Open Access Journals (Sweden)

    Alexandre Ferreira de Pinho

    2012-12-01

    Full Text Available Optimization methods combined with computer-based simulation have been utilized in a wide range of manufacturing applications. However, with current technology these methods exhibit low performance and are often able to manipulate only a single decision variable at a time. The objective of this article is therefore to evaluate a proposed optimization method for discrete-event simulation models, based on genetic algorithms, which is more efficient in terms of computational time than software packages on the market. It should be emphasized that the quality of the response variable is not altered; that is, the proposed method maintains the effectiveness of the solutions. The study therefore compares the proposed method with a simulation optimization tool already available on the market and examined in the academic literature. Conclusions are presented, confirming the proposed optimization method's efficiency.
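
    A minimal genetic-algorithm loop of the kind evaluated here is sketched below, with a noisy stand-in function playing the role of the discrete-event simulation and averaged replications used as fitness. The population size, operators and toy objective are assumptions for illustration only.

```python
import random

random.seed(0)

def simulate(x):
    """Stand-in for a stochastic discrete-event simulation run: returns a noisy
    cost for a vector of integer decision variables (e.g. buffer sizes)."""
    target = (4, 7, 2, 9)
    return sum((a - b) ** 2 for a, b in zip(x, target)) + random.gauss(0.0, 0.5)

def fitness(x, replications=5):
    return sum(simulate(x) for _ in range(replications)) / replications

def genetic_search(bounds=(0, 10), n_vars=4, pop_size=30, generations=40):
    pop = [tuple(random.randint(*bounds) for _ in range(n_vars)) for _ in range(pop_size)]
    for _ in range(generations):
        parents = sorted(pop, key=fitness)[: pop_size // 2]    # selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_vars)                  # one-point crossover
            child = list(a[:cut] + b[cut:])
            if random.random() < 0.2:                          # mutation
                child[random.randrange(n_vars)] = random.randint(*bounds)
            children.append(tuple(child))
        pop = parents + children
    return min(pop, key=fitness)

print(genetic_search())   # should approach the target (4, 7, 2, 9)
```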

  6. Design principles and optimal performance for molecular motors under realistic constraints

    Science.gov (United States)

    Tu, Yuhai; Cao, Yuansheng

    2018-02-01

    The performance of a molecular motor, characterized by its power output and energy efficiency, is investigated in the motor design space spanned by the stepping rate function and the motor-track interaction potential. Analytic results and simulations show that a gating mechanism that restricts forward stepping in a narrow window in configuration space is needed for generating high power at physiologically relevant loads. By deriving general thermodynamics laws for nonequilibrium motors, we find that the maximum torque (force) at stall is less than its theoretical limit for any realistic motor-track interactions due to speed fluctuations. Our study reveals a tradeoff for the motor-track interaction: while a strong interaction generates a high power output for forward steps, it also leads to a higher probability of wasteful spontaneous back steps. Our analysis and simulations show that this tradeoff sets a fundamental limit to the maximum motor efficiency in the presence of spontaneous back steps, i.e., loose-coupling. Balancing this tradeoff leads to an optimal design of the motor-track interaction for achieving a maximum efficiency close to 1 for realistic motors that are not perfectly coupled with the energy source. Comparison with existing data and suggestions for future experiments are discussed.

  7. High-resolution simulation and forecasting of Jeddah floods using WRF version 3.5

    KAUST Repository

    Deng, Liping

    2013-12-01

    Modeling flash flood events in arid environments is a difficult but important task that has impacts on both water-resource-related issues and emergency management and response. The challenge is often related to adequately describing the precursor intense rainfall events that cause these flood responses, as they are generally poorly simulated and forecast. Jeddah, the second largest city in the Kingdom of Saudi Arabia, has suffered from a number of flash floods over the last decade, following short, intense rainfall events. The research presented here focuses on examining four historic Jeddah flash floods (Nov. 25-26 2009, Dec. 29-30 2010, Jan. 14-15 2011 and Jan. 25-26 2011) and investigates the feasibility of using numerical weather prediction models to achieve a more realistic simulation of these flood-producing rainfall events. The Weather Research and Forecasting (WRF) model (version 3.5) is used to simulate precipitation and meteorological conditions via a high-resolution inner domain (1 km) around Jeddah. A range of different convective closure and microphysics parameterizations, together with high-resolution (4 km) sea surface temperature data, are employed. Through comparisons between the WRF model output and in-situ, radar and satellite data, the characteristics and mechanisms producing the extreme rainfall events are discussed and the capacity of the WRF model to accurately forecast these rainstorms is evaluated.

  8. High-resolution simulation and forecasting of Jeddah floods using WRF version 3.5

    KAUST Repository

    Deng, Liping; McCabe, Matthew; Stenchikov, Georgiy L.; Evans, Jason; Kucera, Paul

    2013-01-01

    Modeling flash flood events in arid environments is a difficult but important task that has impacts on both water-resource-related issues and emergency management and response. The challenge is often related to adequately describing the precursor intense rainfall events that cause these flood responses, as they are generally poorly simulated and forecast. Jeddah, the second largest city in the Kingdom of Saudi Arabia, has suffered from a number of flash floods over the last decade, following short, intense rainfall events. The research presented here focuses on examining four historic Jeddah flash floods (Nov. 25-26 2009, Dec. 29-30 2010, Jan. 14-15 2011 and Jan. 25-26 2011) and investigates the feasibility of using numerical weather prediction models to achieve a more realistic simulation of these flood-producing rainfall events. The Weather Research and Forecasting (WRF) model (version 3.5) is used to simulate precipitation and meteorological conditions via a high-resolution inner domain (1 km) around Jeddah. A range of different convective closure and microphysics parameterizations, together with high-resolution (4 km) sea surface temperature data, are employed. Through comparisons between the WRF model output and in-situ, radar and satellite data, the characteristics and mechanisms producing the extreme rainfall events are discussed and the capacity of the WRF model to accurately forecast these rainstorms is evaluated.

  9. A global MHD simulation of an event with a quasi-steady northward IMF component

    Directory of Open Access Journals (Sweden)

    V. G. Merkin

    2007-06-01

    We show results of the Lyon-Fedder-Mobarry (LFM) global MHD simulations of an event previously examined using Iridium spacecraft observations as well as DMSP and IMAGE FUV data. The event is chosen for the steady northward IMF sustained over a three-hour period during 16 July 2000. The Iridium observations showed very weak or absent Region 2 currents in the ionosphere, which makes the event favorable for global MHD modeling. Here we are interested in examining the model's performance during weak magnetospheric forcing, in particular its ability to reproduce gross signatures of the ionospheric currents and convection pattern, as well as the energy deposition in the ionosphere due both to the Poynting flux and to particle precipitation. We compare the ionospheric field-aligned current and electric potential patterns with those recovered from Iridium and DMSP observations, respectively. In addition, DMSP magnetometer data are used for comparisons of ionospheric magnetic perturbations. The electromagnetic energy flux is compared with Iridium-inferred values, while IMAGE FUV observations are utilized to verify the simulated particle energy flux.

  10. Numerical simulation of realistic high-temperature superconductors

    International Nuclear Information System (INIS)

    1997-01-01

    One of the main obstacles in the development of practical high-temperature superconducting (HTS) materials is dissipation, caused by the motion of magnetic flux quanta called vortices. Numerical simulations provide a promising new approach for studying these vortices. By exploiting the extraordinary memory and speed of massively parallel computers, researchers can obtain the extremely fine temporal and spatial resolution needed to model complex vortex behavior. The results may help identify new mechanisms to increase the current-carrying capabilities and to predict the performance characteristics of HTS materials intended for industrial applications.

  11. Ride Motion Simulator (RMS)

    Data.gov (United States)

    Federal Laboratory Consortium — The RMS is a simulator designed for crew station and man-in-the-loop experimentation. The simulator immerses users in a synthetic battlefield to experience realistic...

  12. Monte Carlo generator ELRADGEN 2.0 for simulation of radiative events in elastic ep-scattering of polarized particles

    Science.gov (United States)

    Akushevich, I.; Filoti, O. F.; Ilyichev, A.; Shumeiko, N.

    2012-07-01

    The structure and algorithms of the Monte Carlo generator ELRADGEN 2.0 designed to simulate radiative events in polarized ep-scattering are presented. The full set of analytical expressions for the QED radiative corrections is presented and discussed in detail. Algorithmic improvements implemented to provide faster simulation of hard real photon events are described. Numerical tests show high quality of generation of photonic variables and of the radiatively corrected cross section. The comparison of the elastic radiative tail simulated within the kinematical conditions of the BLAST experiment at MIT BATES shows good agreement with experimental data. Catalogue identifier: AELO_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AELO_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC license, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 1299. No. of bytes in distributed program, including test data, etc.: 11 348. Distribution format: tar.gz. Programming language: FORTRAN 77. Computer: All. Operating system: Any. RAM: 1 MB. Classification: 11.2, 11.4. Nature of problem: Simulation of radiative events in polarized ep-scattering. Solution method: Monte Carlo simulation according to the distributions of the real photon kinematic variables that are calculated by the covariant method of QED radiative correction estimation. The approach provides rather fast and accurate generation. Running time: The simulation of 10^8 radiative events for itest:=1 takes up to 52 seconds on a Pentium(R) Dual-Core 2.00 GHz processor.
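
    As a generic illustration of how an event generator of this kind draws photon kinematic variables, the sketch below uses simple acceptance-rejection sampling; the function names and the toy 1/E spectrum are our own assumptions and do not reproduce ELRADGEN's covariant algorithm:

        import random

        def sample_photon_energy(dsigma, e_min, e_max, d_max, rng=random.random):
            """Acceptance-rejection draw of one radiative-photon energy.
            dsigma : unnormalized differential cross section d(sigma)/dE_gamma
            d_max  : an upper bound of dsigma on [e_min, e_max]
            Generic illustration only, not ELRADGEN's actual method."""
            while True:
                e_gamma = e_min + (e_max - e_min) * rng()   # uniform trial energy
                if d_max * rng() <= dsigma(e_gamma):        # accept with prob dsigma/d_max
                    return e_gamma

        # toy bremsstrahlung-like spectrum ~ 1/E_gamma (hypothetical stand-in)
        events = [sample_photon_energy(lambda e: 1.0 / e, 0.01, 1.0, 100.0)
                  for _ in range(100000)]
        print(f"generated {len(events)} toy photon energies")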

  13. Modeling Temporal Processes in Early Spacecraft Design: Application of Discrete-Event Simulations for Darpa's F6 Program

    Science.gov (United States)

    Dubos, Gregory F.; Cornford, Steven

    2012-01-01

    While the ability to model the state of a space system over time is essential during spacecraft operations, the use of time-based simulations remains rare in preliminary design. The absence of the time dimension in most traditional early design tools can however become a hurdle when designing complex systems whose development and operations can be disrupted by various events, such as delays or failures. As the value delivered by a space system is highly affected by such events, exploring the trade space for designs that yield the maximum value calls for the explicit modeling of time. This paper discusses the use of discrete-event models to simulate the spacecraft development schedule as well as operational scenarios and on-orbit resources in the presence of uncertainty. It illustrates how such simulations can be utilized to support trade studies, through the example of a tool developed for DARPA's F6 program to assist the design of "fractionated spacecraft".

  14. Integrated hydraulic and organophosphate pesticide injection simulations for enhancing event detection in water distribution systems.

    Science.gov (United States)

    Schwartz, Rafi; Lahav, Ori; Ostfeld, Avi

    2014-10-15

    As a complementary step towards solving the general event detection problem of water distribution systems, injection of the organophosphate pesticides chlorpyrifos (CP) and parathion (PA) was simulated at various locations within example networks and hydraulic parameters were calculated over a 24-h duration. The uniqueness of this study is that the chemical reactions and byproducts of the contaminants' oxidation were also simulated, as well as other indicative water quality parameters such as alkalinity, acidity, pH and the total concentration of free chlorine species. The information on the change in water quality parameters induced by the contaminant injection may facilitate on-line detection of an actual event involving this specific substance and pave the way to development of a generic methodology for detecting events involving introduction of pesticides into water distribution systems. Simulation of the contaminant injection was performed at several nodes within two different networks. For each injection, concentrations of the relevant contaminants' parent and daughter species, free chlorine species and water quality parameters were simulated at nodes downstream of the injection location. The results indicate that injection of these substances can be detected under certain conditions by a very rapid drop in Cl2, functioning as the indicative parameter, as well as a drop in alkalinity concentration and a small decrease in pH, both functioning as supporting parameters, whose usage may reduce false positive alarms. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. Do absorption and realistic distraction influence performance of component task surgical procedure?

    OpenAIRE

    Pluyter, J.R.; Buzink, S.N.; Rutkowski, A.F.; Jakimowicz, J.J.

    2009-01-01

    Background. Surgeons perform complex tasks while exposed to multiple distracting sources that may increase stress in the operating room (e.g., music, conversation, and unadapted use of sophisticated technologies). This study aimed to examine whether such realistic social and technological distracting conditions may influence surgical performance. Methods. Twelve medical interns performed a laparoscopic cholecystectomy task with the Xitact LC 3.0 virtual reality simulator under distracting con...

  16. Toward Simulating Realistic Pursuit-Evasion Using a Roadmap-Based Approach

    KAUST Repository

    Rodriguez, Samuel; Denny, Jory; Zourntos, Takis; Amato, Nancy M.

    2010-01-01

    In this work, we describe an approach for modeling and simulating group behaviors for pursuit-evasion that uses a graph-based representation of the environment and integrates multi-agent simulation with roadmap-based path planning. We demonstrate

  17. U.S. Marine Corps Communication-Electronics School Training Process: Discrete-Event Simulation and Lean Options

    National Research Council Canada - National Science Library

    Neu, Charles R; Davenport, Jon; Smith, William R

    2007-01-01

    This paper uses discrete-event simulation modeling, inventory-reduction, and process improvement concepts to identify and analyze possibilities for improving the training continuum at the Marine Corps...

  18. 3-D topological signatures and a new discrimination method for single-electron events and 0νββ events in CdZnTe: A Monte Carlo simulation study

    Energy Technology Data Exchange (ETDEWEB)

    Zeng, Ming; Li, Teng-Lin; Cang, Ji-Rong [Key Laboratory of Particle & Radiation Imaging (Tsinghua University), Ministry of Education (China); Department of Engineering Physics, Tsinghua University, Beijing 100084 (China); Zeng, Zhi, E-mail: zengzhi@tsinghua.edu.cn [Key Laboratory of Particle & Radiation Imaging (Tsinghua University), Ministry of Education (China); Department of Engineering Physics, Tsinghua University, Beijing 100084 (China); Fu, Jian-Qiang; Zeng, Wei-He; Cheng, Jian-Ping; Ma, Hao; Liu, Yi-Nong [Key Laboratory of Particle & Radiation Imaging (Tsinghua University), Ministry of Education (China); Department of Engineering Physics, Tsinghua University, Beijing 100084 (China)

    2017-06-21

    In neutrinoless double beta (0νββ) decay experiments, the diversity of topological signatures of different particles provides an important tool to distinguish double beta events from background events and reduce background rates. Aiming at suppressing the single-electron backgrounds which are most challenging, several groups have established Monte Carlo simulation packages to study the topological characteristics of single-electron events and 0νββ events and develop methods to differentiate them. In this paper, applying the knowledge of graph theory, a new topological signature called REF track (Refined Energy-Filtered track) is proposed and proven to be an accurate approximation of the real particle trajectory. Based on the analysis of the energy depositions along the REF track of single-electron events and 0νββ events, the REF energy deposition models for both events are proposed to indicate the significant differences between them. With these differences, this paper presents a new discrimination method, which, in the Monte Carlo simulation, achieved a single-electron rejection factor of 93.8±0.3 (stat.)% as well as a 0νββ efficiency of 85.6±0.4 (stat.)% with optimized parameters in CdZnTe.

  19. Discrete event simulation and virtual reality use in industry: new opportunities and future trends

    OpenAIRE

    Turner, Christopher; Hutabarat, Windo; Oyekan, John; Tiwari, Ashutosh

    2016-01-01

    This paper reviews the area of combined discrete event simulation (DES) and virtual reality (VR) use within industry. While establishing a state of the art for progress in this area, this paper makes the case for VR DES as the vehicle of choice for complex data analysis through interactive simulation models, highlighting both its advantages and current limitations. This paper reviews active research topics such as VR and DES real-time integration, communication protocols,...

  20. Simulation of interim spent fuel storage system with discrete event model

    International Nuclear Information System (INIS)

    Yoon, Wan Ki; Song, Ki Chan; Lee, Jae Sol; Park, Hyun Soo

    1989-01-01

    This paper describes dynamic simulation of the spent fuel storage system, which is described by statistical discrete event models. It visualizes the flow and queues of the system over time, assesses the operational performance of the system activities and establishes the system components and streams. It gives information on system organization and operation policy with reference to the design. The system was tested and analyzed over a number of critical parameters to establish the optimal system. The workforce schedule and resources with long processing times dominate the process. A combination of two workforce shifts a day and two cooling pits gives the optimal solution for the storage system. Discrete event simulation is a useful tool for obtaining information on the optimal design and operation of the storage system. (Author)
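
    To make the modelling style concrete, the following is a minimal event-queue sketch of a receipt-and-cooling line with a fixed number of cooling pits; the structure (arrival, wait for a pit, unload, cool) follows the description above, but all names and numbers are hypothetical and are not taken from the cited study:

        import heapq, itertools

        def simulate_storage(n_casks=50, n_pits=2, arrive_every=2.0,
                             unload_h=6.0, cool_h=24.0):
            """Toy discrete-event model of a spent-fuel receipt/cooling line.
            Casks arrive at fixed intervals, wait for a free cooling pit, are
            unloaded, then occupy the pit for a cooling period.  Illustrative
            parameters only."""
            counter = itertools.count()
            events = [(i * arrive_every, next(counter), "arrive", i) for i in range(n_casks)]
            heapq.heapify(events)
            pit_free_at = [0.0] * n_pits            # next time each pit is available
            finished, makespan = 0, 0.0
            while events:
                t, _, kind, cask = heapq.heappop(events)
                if kind == "arrive":
                    pit = min(range(n_pits), key=pit_free_at.__getitem__)
                    start = max(t, pit_free_at[pit])        # queue if no pit is free
                    pit_free_at[pit] = start + unload_h + cool_h
                    heapq.heappush(events, (pit_free_at[pit], next(counter), "done", cask))
                else:
                    finished += 1
                    makespan = max(makespan, t)
            return finished, makespan

        done, hours = simulate_storage()
        print(f"{done} casks processed; last cask finished at hour {hours:.0f} (2 cooling pits)")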

  1. Simulation of Flash-Flood-Producing Storm Events in Saudi Arabia Using the Weather Research and Forecasting Model

    KAUST Repository

    Deng, Liping

    2015-05-01

    The challenges of monitoring and forecasting flash-flood-producing storm events in data-sparse and arid regions are explored using the Weather Research and Forecasting (WRF) Model (version 3.5) in conjunction with a range of available satellite, in situ, and reanalysis data. Here, we focus on characterizing the initial synoptic features and examining the impact of model parameterization and resolution on the reproduction of a number of flood-producing rainfall events that occurred over the western Saudi Arabian city of Jeddah. Analysis from the European Centre for Medium-Range Weather Forecasts (ECMWF) interim reanalysis (ERA-Interim) data suggests that mesoscale convective systems associated with strong moisture convergence ahead of a trough were the major initial features for the occurrence of these intense rain events. The WRF Model was able to simulate the heavy rainfall, with driving convective processes well characterized by a high-resolution cloud-resolving model. The use of higher (1 km vs 5 km) resolution along the Jeddah coastline favors the simulation of local convective systems and adds value to the simulation of heavy rainfall, especially for deep-convection-related extreme values. At the 5-km resolution, corresponding to an intermediate study domain, simulation without a cumulus scheme led to the formation of deeper convective systems and enhanced rainfall around Jeddah, illustrating the need for careful model scheme selection in this transition resolution. In analysis of multiple nested WRF simulations (25, 5, and 1 km), localized volume and intensity of heavy rainfall together with the duration of rainstorms within the Jeddah catchment area were captured reasonably well, although there was evidence of some displacements of rainstorm events.

  2. Vaporization studies of plasma interactive materials in simulated plasma disruption events

    International Nuclear Information System (INIS)

    Stone, C.A. IV; Croessmann, C.D.; Whitley, J.B.

    1988-03-01

    The melting and vaporization that occur when plasma facing materials are subjected to a plasma disruption will severely limit component lifetime and plasma performance. A series of high heat flux experiments was performed on a group of fusion reactor candidate materials to model material erosion which occurs during plasma disruption events. The Electron Beam Test System was used to simulate single disruption and multiple disruption phenomena. Samples of aluminum, nickel, copper, molybdenum, and 304 stainless steel were subjected to a variety of heat loads, ranging from 100 to 400 msec pulses of 8 to 18 kW/cm2. It was found that the initial surface temperature of a material strongly influences the vaporization process and that multiple disruptions do not scale linearly with respect to single disruption events. 2 refs., 9 figs., 5 tabs

  3. Validating numerical simulations of snow avalanches using dendrochronology: the Cerro Ventana event in Northern Patagonia, Argentina

    Directory of Open Access Journals (Sweden)

    A. Casteller

    2008-05-01

    The damage caused by snow avalanches to property and human lives is underestimated in many regions around the world, especially where this natural hazard remains poorly documented. One such region is the Argentinean Andes, where numerous settlements are threatened almost every winter by large snow avalanches. On 1 September 2002, the largest tragedy in the history of Argentinean mountaineering took place at Cerro Ventana, Northern Patagonia: nine persons were killed and seven others injured by a snow avalanche. In this paper, we combine both numerical modeling and dendrochronological investigations to reconstruct this event. Using information released by local governmental authorities and compiled in the field, the avalanche event was numerically simulated using the avalanche dynamics programs AVAL-1D and RAMMS. Avalanche characteristics, such as extent and date, were determined using dendrochronological techniques. Model simulation results were compared with documentary and tree-ring evidence for the 2002 event. Our results show a good agreement between the simulated projection of the avalanche and its reconstructed extent using tree-ring records. Differences between the observed and the simulated avalanche, principally related to the snow height deposition in the run-out zone, are mostly attributed to the low resolution of the digital elevation model used to represent the valley topography. The main contributions of this study are (1) to provide the first calibration of numerical avalanche models for the Patagonian Andes and (2) to highlight the potential of Nothofagus pumilio tree-ring records to reconstruct past snow-avalanche events in time and space. Future research should focus on testing this combined approach in other forested regions of the Andes.

  4. Soil organic carbon loss and selective transportation under field simulated rainfall events.

    Science.gov (United States)

    Nie, Xiaodong; Li, Zhongwu; Huang, Jinquan; Huang, Bin; Zhang, Yan; Ma, Wenming; Hu, Yanbiao; Zeng, Guangming

    2014-01-01

    The study of the lateral movement of soil organic carbon (SOC) during soil erosion can improve the understanding of the global carbon budget. Simulated rainfall experiments on small field plots were conducted to investigate SOC lateral movement under different rainfall intensities and tillage practices. Two rainfall intensities (High intensity (HI) and Low intensity (LI)) and two tillage practices (No tillage (NT) and Conventional tillage (CT)) were maintained on three plots (2 m width × 5 m length): HI-NT, LI-NT and LI-CT. The rainfall lasted 60 minutes after runoff was generated, and the sediment yield and runoff volume were measured and sampled at 6-min intervals. The SOC concentration of sediment and runoff as well as the sediment particle size distribution were measured. The results showed that most of the eroded organic carbon (OC) was lost in the form of sediment-bound organic carbon in all events. The amount of lost SOC in the LI-NT event was 12.76 times greater than that in the LI-CT event, whereas this measure in the HI-NT event was 3.25 times greater than that in the LI-NT event. These results suggest that conventional tillage as well as lower rainfall intensity can reduce the amount of SOC lost during short-term soil erosion. Meanwhile, the eroded sediment in all events was enriched in OC, and a higher enrichment ratio of OC (ERoc) in sediment was observed in the LI events than in the HI event, whereas similar ERoc curves were found in the LI-CT and LI-NT events. Furthermore, significant correlations between ERoc and different-sized sediment particles were only observed in the HI-NT event. This indicates that the enrichment of OC is dependent on the erosion process, and the specific enrichment mechanisms with respect to different erosion processes should be studied in the future.

  5. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    Science.gov (United States)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.

  6. Rare event simulation in finite-infinite dimensional space

    International Nuclear Information System (INIS)

    Au, Siu-Kui; Patelli, Edoardo

    2016-01-01

    Modern engineering systems are becoming increasingly complex. Assessing their risk by simulation is intimately related to the efficient generation of rare failure events. Subset Simulation is an advanced Monte Carlo method for risk assessment and it has been applied in different disciplines. Pivotal to its success is the efficient generation of conditional failure samples, which is generally non-trivial. Conventionally an independent-component Markov Chain Monte Carlo (MCMC) algorithm is used, which is applicable to high dimensional problems (i.e., a large number of random variables) without suffering from ‘curse of dimension’. Experience suggests that the algorithm may perform even better for high dimensional problems. Motivated by this, for any given problem we construct an equivalent problem where each random variable is represented by an arbitrary (hence possibly infinite) number of ‘hidden’ variables. We study analytically the limiting behavior of the algorithm as the number of hidden variables increases indefinitely. This leads to a new algorithm that is more generic and offers greater flexibility and control. It coincides with an algorithm recently suggested by independent researchers, where a joint Gaussian distribution is imposed between the current sample and the candidate. The present work provides theoretical reasoning and insights into the algorithm.
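
    The correlated-Gaussian candidate mentioned at the end of the abstract can be written, in independent standard-normal space, as candidate = rho*x + sqrt(1 - rho^2)*xi; a minimal sketch of one conditional-level move is given below (our own illustration with a toy limit-state function, not the authors' implementation):

        import numpy as np

        def conditional_step(x, g, threshold, rho=0.8, rng=np.random.default_rng()):
            """One MCMC move for a conditional level of Subset Simulation.
            x   : current sample in independent standard-normal space
            g   : limit-state function; the conditional event is g(x) > threshold
            rho : correlation between the current sample and the Gaussian candidate
            Generic illustration of the correlated-candidate idea only."""
            candidate = rho * x + np.sqrt(1.0 - rho**2) * rng.standard_normal(x.shape)
            # The proposal leaves the standard-normal distribution invariant, so the
            # Metropolis ratio reduces to the indicator of the conditional region.
            return candidate if g(candidate) > threshold else x

        # toy limit state: "failure" when the norm of a 100-dimensional vector is large
        g = lambda x: np.linalg.norm(x)
        x = np.full(100, 1.2)                    # a seed already inside {g > 10}
        samples = []
        for _ in range(1000):
            x = conditional_step(x, g, threshold=10.0)
            samples.append(x)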

  7. Numerical Simulations of Slow Stick Slip Events with PFC, a DEM Based Code

    Science.gov (United States)

    Ye, S. H.; Young, R. P.

    2017-12-01

    Nonvolcanic tremors around subduction zones have become a fascinating subject in seismology in recent years. Previous studies have shown that the nonvolcanic tremor beneath western Shikoku is composed of low frequency seismic waves overlapping each other. This finding provides a direct link between tremor and slow earthquakes. Slow stick slip events are considered to be laboratory-scale slow earthquakes. Slow stick slip events are traditionally studied with direct shear or double direct shear experiment setups, in which the sliding velocity can be controlled to model a range of fast and slow stick slips. In this study, a PFC* model based on double direct shear is presented, with a central block clamped by two side blocks. The gouge layers between the central and side blocks are modelled as discrete fracture networks with smooth joint bonds between pairs of discrete elements. In addition, a second model is presented in this study. This model consists of a cylindrical sample subjected to triaxial stress. Similar to the previous model, a weak gouge layer at 45 degrees is added to the sample, on which shear slipping is allowed. Several different simulations are conducted on this sample. While the confining stress is maintained at the same level in different simulations, the axial loading rate (displacement rate) varies. By varying the displacement rate, a range of slipping behaviour, from stick slip to slow stick slip, is observed based on the stress-strain relationship. Currently, the stick slip and slow stick slip events are observed strictly based on the stress-strain relationship. In the future, we hope to monitor the displacement and velocity of the balls surrounding the gouge layer as a function of time, so as to generate a synthetic seismogram. This will allow us to extract seismic waveforms and potentially simulate the tremor-like waves found around subduction zones. *Particle flow code, a discrete element method based numerical simulation code developed by

  8. Insights into factors contributing to the observability of a submarine at periscope depth by modern radar, Part 2: EM simulation of mast RCS in a realistic sea surface environment

    CSIR Research Space (South Africa)

    Smit, JC

    2012-09-01

    Presented at the IEEE-APS Topical Conference on Antennas and Propagation in Wireless Communications (APWC), Cape Town, 2-7 September 2012. Authors: Smit JC; Cilliers JE (CSIR, Defence, Peace, Safety and Security, PO Box 395, Pretoria, 0001). Abstract: Recently, a set of high resolution radar measurements was undertaken in South...

  9. Halo current and resistive wall simulations of ITER

    International Nuclear Information System (INIS)

    Strauss, H.R.; Zheng Linjin; Kotschenreuther, M.; Park, W.; Jardin, S.; Breslau, J.; Pletzer, A.; Paccagnella, R.; Sugiyama, L.; Chu, M.; Chance, M.; Turnbull, A.

    2005-01-01

    A number of ITER relevant problems in resistive MHD concern the effects of a resistive wall: vertical displacement events (VDE), halo currents caused by disruptions, and resistive wall modes. Simulations of these events have been carried out using the M3D code. We have verified the growth rate scaling of VDEs, which is proportional to the wall resistivity. Simulations have been done of disruptions caused by large inversion radius internal kink modes, as well as by nonlinear growth of resistive wall modes. Halo current flowing during the disruption has asymmetries with toroidal peaking factor up to about 3. VDEs have larger growth rates during disruption simulations, which may account for the loss of vertical feedback control during disruptions in experiments. Further simulations have been made of disruptions caused by resistive wall modes in ITER equilibria. For these modes the toroidal peaking factor is close to 1. Resistive wall modes in ITER and reactors have also been investigated utilizing the newly developed AEGIS (Adaptive EiGenfunction Independent Solution) linear full MHD code, for realistically shaped, fully toroidal equilibria. The AEGIS code uses an adaptive mesh in the radial direction which allows thin inertial layers to be accurately resolved, such as those responsible for the stabilization of resistive wall modes (RWM) by plasma rotation. Stabilization of resistive wall modes by rotation and wall thickness effects are examined. (author)

  10. Computing return times or return periods with rare event algorithms

    Science.gov (United States)

    Lestang, Thibault; Ragone, Francesco; Bréhier, Charles-Edouard; Herbert, Corentin; Bouchet, Freddy

    2018-04-01

    The average time between two occurrences of the same event, referred to as its return time (or return period), is a useful statistical concept for practical applications. For instance, insurers or public agencies may be interested in the return time of a 10 m flood of the Seine river in Paris. However, due to their scarcity, reliably estimating return times for rare events is very difficult using either observational data or direct numerical simulations. For rare events, an estimator for return times can be built from the extrema of the observable on trajectory blocks. Here, we show that this estimator can be improved to remain accurate for return times of the order of the block size. More importantly, we show that this approach can be generalised to estimate return times from numerical algorithms specifically designed to sample rare events. So far those algorithms often compute probabilities, rather than return times. The approach we propose provides a computationally extremely efficient way to estimate numerically the return times of rare events for a dynamical system, saving several orders of magnitude in computational cost. We illustrate the method on two kinds of observables, instantaneous and time-averaged, using two different rare event algorithms, for a simple stochastic process, the Ornstein–Uhlenbeck process. As an example of realistic applications to complex systems, we finally discuss extreme values of the drag on an object in a turbulent flow.
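
    A minimal sketch of the block-maximum idea on synthetic data is given below; the specific formula r(a) = -T_block / ln(1 - p(a)) is the standard form for rare, approximately independent exceedances and is used here as an assumption rather than copied from the paper:

        import numpy as np

        def return_time_from_blocks(series, dt, block_len, thresholds):
            """Estimate return times r(a) from block maxima of a time series.
            Uses r(a) = -T_block / ln(1 - p(a)), where p(a) is the fraction of
            blocks whose maximum exceeds a.  Illustration on synthetic data,
            not the authors' algorithm."""
            n_blocks = len(series) // block_len
            maxima = np.asarray(series[:n_blocks * block_len]).reshape(n_blocks, block_len).max(axis=1)
            t_block = block_len * dt
            p = np.array([(maxima > a).mean() for a in thresholds])
            with np.errstate(divide="ignore"):
                return -t_block / np.log1p(-p)   # infinite where no block exceeds a

        # synthetic Ornstein-Uhlenbeck record as a stand-in observable (dt = 0.01)
        rng, dt = np.random.default_rng(0), 0.01
        x = np.zeros(200_000)
        for i in range(1, x.size):
            x[i] = x[i - 1] * (1.0 - dt) + np.sqrt(2.0 * dt) * rng.standard_normal()
        print(return_time_from_blocks(x, dt, block_len=1_000, thresholds=[2.0, 3.0]))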

  11. Simulation of thermal-neutron-induced single-event upset using particle and heavy-ion transport code system

    International Nuclear Information System (INIS)

    Arita, Yutaka; Kihara, Yuji; Mitsuhasi, Junichi; Niita, Koji; Takai, Mikio; Ogawa, Izumi; Kishimoto, Tadafumi; Yoshihara, Tsutomu

    2007-01-01

    The simulation of a thermal-neutron-induced single-event upset (SEU) was performed on a 0.4-μm-design-rule 4 Mbit static random access memory (SRAM) using the Particle and Heavy Ion Transport code System (PHITS). The SEU rates obtained by the simulation were in very good agreement with the results of experiments. PHITS is a useful tool for simulating SEUs in semiconductor devices. To further improve the accuracy of the simulation, additional methods for tallying the energy deposition are required for PHITS. (author)

  12. Analysis of cyclic variations of liquid fuel-air mixing processes in a realistic DISI IC-engine using Large Eddy Simulation

    International Nuclear Information System (INIS)

    Goryntsev, D.; Sadiki, A.; Klein, M.; Janicka, J.

    2010-01-01

    Direct injection spark ignition (DISI) engines have a large potential to reduce emissions and specific fuel consumption. One of the most important problems in the design of DISI engines is the cycle-to-cycle variation of the flow, mixing and combustion processes. A Large Eddy Simulation (LES) based analysis is used to characterize the cycle-to-cycle fluctuations of the flow field as well as the mixture preparation in a realistic four-stroke internal combustion engine with a variable charge motion system. Based on the analysis of cycle-to-cycle velocity fluctuations of the in-cylinder flow, the impact of various fuel spray boundary conditions on injection processes and mixture preparation is pointed out. The joint effect of both cycle-to-cycle velocity fluctuations and variable spray boundary conditions is discussed in terms of the mean and standard deviation of the relative air-fuel ratio, velocity and mass fraction. Finally, a qualitative analysis of the intensity of cyclic fluctuations below the spark plug is provided.

  13. Evacuation Simulation in Kalayaan Residence Hall, up Diliman Using Gama Simulation Software

    Science.gov (United States)

    Claridades, A. R. C.; Villanueva, J. K. S.; Macatulad, E. G.

    2016-09-01

    Agent-Based Modeling (ABM) has recently been adopted in some studies for the modelling of events as a dynamic system given a set of events and parameters. In principle, ABM employs individual agents with assigned attributes and behaviors and simulates their behavior within their environment and their interactions with other agents. This can be a useful tool in both micro- and macroscale applications. In this study, a model initially created and applied to an academic building was implemented in a dormitory. In particular, this research integrates a three-dimensional Geographic Information System (GIS) with GAMA for multi-agent-based evacuation simulation and is implemented in Kalayaan Residence Hall. A three-dimensional GIS model is created based on the floor plans and demographic data of the dorm, including the respective pathways as networks, rooms, floors, exits and appropriate attributes. This model is then re-implemented in GAMA. Different states of the agents and their effects on evacuation time were then observed. GAMA simulation with varying path widths was also implemented. It was found that, compared to their original states, panic, eating and studying hasten evacuation, whereas sleeping and being in the bathrooms are impediments. It is also concluded that evacuation time is halved when path widths are doubled; however, it is recommended that further studies model pathways as spaces instead of lines. A more scientific basis for predicting agent behavior in these states is also recommended for more realistic results.

  14. Simulating at realistic quark masses. Light quark masses

    International Nuclear Information System (INIS)

    Goeckeler, M.; Streuer, T.

    2006-11-01

    We present new results for light quark masses. The calculations are performed using two flavours of O(a) improved Wilson fermions. We have reached lattice spacings as small as a ≈ 0.07 fm and pion masses down to m_π ≈ 340 MeV in our simulations. This gives us significantly better control on the chiral and continuum extrapolations. (orig.)

  15. Optimal design of supply chain network under uncertainty environment using hybrid analytical and simulation modeling approach

    Science.gov (United States)

    Chiadamrong, N.; Piyathanavong, V.

    2017-12-01

    Models that aim to optimize the design of supply chain networks have gained more interest in the supply chain literature. Mixed-integer linear programming and discrete-event simulation are widely used for such an optimization problem. We present a hybrid approach to support decisions for supply chain network design using a combination of analytical and discrete-event simulation models. The proposed approach is based on iterative procedures until the difference between subsequent solutions satisfies the pre-determined termination criteria. The effectiveness of the proposed approach is illustrated by an example, which shows results closer to optimal, with much faster solution times, than those obtained from a conventional simulation-based optimization model. The proposed hybrid approach is promising and can be applied as a powerful tool in designing real supply chain networks. It also provides the possibility to model and solve more realistic problems that incorporate dynamism and uncertainty.

  16. Simulation of complex data structures for planning of studies with focus on biomarker comparison.

    Science.gov (United States)

    Schulz, Andreas; Zöller, Daniela; Nickels, Stefan; Beutel, Manfred E; Blettner, Maria; Wild, Philipp S; Binder, Harald

    2017-06-13

    There are a growing number of observational studies that do not only focus on single biomarkers for predicting an outcome event, but address questions in a multivariable setting. For example, when quantifying the added value of new biomarkers in addition to established risk factors, the aim might be to rank several new markers with respect to their prediction performance. This makes it important to consider the marker correlation structure for planning such a study. Because of the complexity, a simulation approach may be required to adequately assess sample size or other aspects, such as the choice of a performance measure. In a simulation study based on real data, we investigated how to generate covariates with realistic distributions and what generating model should be used for the outcome, aiming to determine the least amount of information and complexity needed to obtain realistic results. As a basis for the simulation a large epidemiological cohort study, the Gutenberg Health Study was used. The added value of markers was quantified and ranked in subsampling data sets of this population data, and simulation approaches were judged by the quality of the ranking. One of the evaluated approaches, the random forest, requires original data at the individual level. Therefore, also the effect of the size of a pilot study for random forest based simulation was investigated. We found that simple logistic regression models failed to adequately generate realistic data, even with extensions such as interaction terms or non-linear effects. The random forest approach was seen to be more appropriate for simulation of complex data structures. Pilot studies starting at about 250 observations were seen to provide a reasonable level of information for this approach. We advise to avoid oversimplified regression models for simulation, in particular when focusing on multivariable research questions. More generally, a simulation should be based on real data for adequately reflecting
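
    A hedged sketch of the general strategy described above (fit a flexible outcome model on pilot data, then use it as the data-generating mechanism in the simulation) is given below using scikit-learn; the variable names and numbers are hypothetical and this is not the authors' code:

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        def build_outcome_generator(pilot_X, pilot_y, rng):
            """Fit a random forest on pilot data and return a function that draws
            binary outcomes for new covariate vectors from the forest's predicted
            event probabilities.  Illustrative sketch of the general idea only."""
            forest = RandomForestClassifier(n_estimators=500, min_samples_leaf=10,
                                            random_state=0).fit(pilot_X, pilot_y)
            def draw(X_new):
                p_event = forest.predict_proba(X_new)[:, 1]
                return rng.binomial(1, p_event)
            return draw

        # toy pilot study of ~250 observations with correlated 'biomarkers'
        rng = np.random.default_rng(1)
        X_pilot = rng.multivariate_normal([0, 0, 0],
                                          [[1, .6, .3], [.6, 1, .4], [.3, .4, 1]], 250)
        y_pilot = rng.binomial(1, 1 / (1 + np.exp(-(0.8 * X_pilot[:, 0] + 0.4 * X_pilot[:, 1]))))
        simulate_outcomes = build_outcome_generator(X_pilot, y_pilot, rng)
        y_sim = simulate_outcomes(rng.multivariate_normal([0, 0, 0], np.eye(3), 10_000))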

  17. Simulation of complex data structures for planning of studies with focus on biomarker comparison

    Directory of Open Access Journals (Sweden)

    Andreas Schulz

    2017-06-01

    Background: There are a growing number of observational studies that do not only focus on single biomarkers for predicting an outcome event, but address questions in a multivariable setting. For example, when quantifying the added value of new biomarkers in addition to established risk factors, the aim might be to rank several new markers with respect to their prediction performance. This makes it important to consider the marker correlation structure for planning such a study. Because of the complexity, a simulation approach may be required to adequately assess sample size or other aspects, such as the choice of a performance measure. Methods: In a simulation study based on real data, we investigated how to generate covariates with realistic distributions and what generating model should be used for the outcome, aiming to determine the least amount of information and complexity needed to obtain realistic results. As a basis for the simulation a large epidemiological cohort study, the Gutenberg Health Study, was used. The added value of markers was quantified and ranked in subsampling data sets of this population data, and simulation approaches were judged by the quality of the ranking. One of the evaluated approaches, the random forest, requires original data at the individual level. Therefore, also the effect of the size of a pilot study for random forest based simulation was investigated. Results: We found that simple logistic regression models failed to adequately generate realistic data, even with extensions such as interaction terms or non-linear effects. The random forest approach was seen to be more appropriate for simulation of complex data structures. Pilot studies starting at about 250 observations were seen to provide a reasonable level of information for this approach. Conclusions: We advise to avoid oversimplified regression models for simulation, in particular when focusing on multivariable research questions. More generally

  18. Realistic Paleobathymetry of the Cenomanian–Turonian (94 Ma) Boundary Global Ocean

    Directory of Open Access Journals (Sweden)

    Arghya Goswami

    2018-01-01

    At present, global paleoclimate simulations are prepared with bathtub-like, flat, featureless and steep-walled ocean bathymetry, which is neither realistic nor suitable. In this article, we present the first enhanced version of a reconstructed paleobathymetry for Cenomanian–Turonian (94 Ma) time at 0.1° × 0.1° resolution that is both realistic and suitable for use in paleoclimate studies. This reconstruction is an extrapolation of a parameterized modern ocean bathymetry that combines simple geophysical models (a standard plate cooling model for the oceanic lithosphere based on ocean crustal age), global modern oceanic sediment thicknesses, and generalized shelf-slope-rise structures calibrated from a published global relief model of the modern world (ETOPO1) at active and passive continental margins. The base version of this Cenomanian–Turonian paleobathymetry reconstruction is then updated with known submarine large igneous provinces, plateaus, and seamounts to minimize the difference between the reconstructed paleobathymetry and the real bathymetry that once existed.

  19. Assessment of the Weather Research and Forecasting (WRF) model for simulation of extreme rainfall events in the upper Ganga Basin

    Science.gov (United States)

    Chawla, Ila; Osuri, Krishna K.; Mujumdar, Pradeep P.; Niyogi, Dev

    2018-02-01

    Reliable estimates of extreme rainfall events are necessary for an accurate prediction of floods. Most of the global rainfall products are available at a coarse resolution, rendering them less desirable for extreme rainfall analysis. Therefore, regional mesoscale models such as the advanced research version of the Weather Research and Forecasting (WRF) model are often used to provide rainfall estimates at fine grid spacing. Modelling heavy rainfall events is an enduring challenge, as such events depend on multi-scale interactions, and the model configurations such as grid spacing, physical parameterization and initialization. With this background, the WRF model is implemented in this study to investigate the impact of different processes on extreme rainfall simulation, by considering a representative event that occurred during 15-18 June 2013 over the Ganga Basin in India, which is located at the foothills of the Himalayas. This event is simulated with ensembles involving four different microphysics (MP), two cumulus (CU) parameterizations, two planetary boundary layers (PBLs) and two land surface physics options, as well as different resolutions (grid spacing) within the WRF model. The simulated rainfall is evaluated against the observations from 18 rain gauges and the Tropical Rainfall Measuring Mission Multi-Satellite Precipitation Analysis (TMPA) 3B42RT version 7 data. From the analysis, it should be noted that the choice of MP scheme influences the spatial pattern of rainfall, while the choice of PBL and CU parameterizations influences the magnitude of rainfall in the model simulations. Further, the WRF run with Goddard MP, Mellor-Yamada-Janjic PBL and Betts-Miller-Janjic CU scheme is found to perform best in simulating this heavy rain event. The selected configuration is evaluated for several heavy to extremely heavy rainfall events that occurred across different months of the monsoon season in the region. The model performance improved through incorporation

  20. Connecting macroscopic observables and microscopic assembly events in amyloid formation using coarse grained simulations.

    Directory of Open Access Journals (Sweden)

    Noah S Bieler

    The pre-fibrillar stages of amyloid formation have been implicated in cellular toxicity, but have proved to be challenging to study directly in experiments and simulations. Rational strategies to suppress the formation of toxic amyloid oligomers require a better understanding of the mechanisms by which they are generated. We report Dynamical Monte Carlo simulations that allow us to study the early stages of amyloid formation. We use a generic, coarse-grained model of an amyloidogenic peptide that has two internal states: the first one representing the soluble random coil structure and the second one the β-sheet conformation. We find that this system exhibits a propensity towards fibrillar self-assembly following the formation of a critical nucleus. Our calculations establish connections between the early nucleation events and the kinetic information available in the later stages of the aggregation process that are commonly probed in experiments. We analyze the kinetic behaviour in our simulations within the framework of the theory of classical nucleated polymerisation, and are able to connect the structural events at the early stages in amyloid growth with the resulting macroscopic observables such as the effective nucleus size. Furthermore, the free-energy landscapes that emerge from these simulations allow us to identify pertinent properties of the monomeric state that could be targeted to suppress oligomer formation.
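
    For orientation, the "classical nucleated polymerisation" framework referred to above is commonly written in the Oosawa-type form below (a textbook paraphrase under the assumption of primary nucleation and elongation only, not the paper's specific coarse-grained model):

        \frac{dP}{dt} = k_n\, m(t)^{n_c}, \qquad
        \frac{dM}{dt} = 2 k_+\, m(t)\, P(t), \qquad
        m(t) = m_{\mathrm{tot}} - M(t),

    where P is the fibril number concentration, M the fibril mass concentration, m the free monomer concentration, k_n and k_+ the nucleation and elongation rate constants, and n_c the effective nucleus size. In this framework the aggregation half-time scales roughly as t_{1/2} ~ m_tot^(-n_c/2), which is how an effective nucleus size is typically read off macroscopic kinetics of the kind discussed above.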

  1. Evolutionary paths, applications and future development of discrete event simulation systems; Simulazione a eventi discreti: nuove linee di sviluppo e applicazioni

    Energy Technology Data Exchange (ETDEWEB)

    Garetti, M. [Milan Politecnico, Milan (Italy). Dipt. di Economia e Produzione; Bartolotta, A.

    2000-10-01

    The state of the art of discrete event simulation tools is presented, with special reference to their application in the manufacturing systems area. After presenting the basics of discrete event computer simulation, the different steps to be followed for the successful use of simulation are defined and discussed. The evolution of software packages for discrete event simulation is also presented, highlighting the main technological changes. Finally, the future development lines of simulation are outlined. [Italian abstract, translated] The state of the art of discrete event simulation is presented. After a brief description of the simulation technique and its evolution, with particular regard to the simulation of production systems, the phases of the procedure to follow in conducting a simulation study and the possible approaches to model building are described. Finally, the evolution of the main simulation software packages available on the market is described.

  2. Simulating at realistic quark masses. Light quark masses

    Energy Technology Data Exchange (ETDEWEB)

    Goeckeler, M. [Regensburg Univ. (Germany). Inst. fuer Physik 1 - Theoretische Physik; Horsley, R.; Zanotti, J.M. [Edinburgh Univ. (United Kingdom). School of Physics; Nakamura, Y.; Pleiter, D. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Rakow, P.E.L. [Liverpool Univ. (United Kingdom). Dept. of Mathematical Sciences; Schierholz, G. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC]|[Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Streuer, T. [Kentucky Univ., Lexington, KY (United States). Dept. of Physics and Astronomy; Stueben, H. [Konrad-Zuse-Zentrum fuer Informationstechnik Berlin (ZIB) (Germany)

    2006-11-15

    We present new results for light quark masses. The calculations are performed using two flavours of O(a) improved Wilson fermions. We have reached lattice spacings as small as a ≈ 0.07 fm and pion masses down to m_π ≈ 340 MeV in our simulations. This gives us significantly better control on the chiral and continuum extrapolations. (orig.)

  3. 3D numerical simulations of negative hydrogen ion extraction using realistic plasma parameters, geometry of the extraction aperture and full 3D magnetic field map

    Science.gov (United States)

    Mochalskyy, S.; Wünderlich, D.; Ruf, B.; Franzen, P.; Fantz, U.; Minea, T.

    2014-02-01

    Decreasing the co-extracted electron current while simultaneously keeping the negative ion (NI) current sufficiently high is a crucial issue in the development of the plasma source system for the ITER Neutral Beam Injector. To support finding the best extraction conditions, the 3D Particle-in-Cell Monte Carlo Collision electrostatic code ONIX (Orsay Negative Ion eXtraction) has been developed. Close collaboration with experiments and other numerical models allows performing realistic simulations with relevant input parameters: plasma properties, geometry of the extraction aperture, a full 3D magnetic field map, etc. For the first time, ONIX has been benchmarked against the commercial positive-ion tracing code KOBRA3D. A very good agreement in terms of the meniscus position and depth has been found. Simulations of NI extraction with different e/NI ratios in the bulk plasma show the high relevance of direct extraction of surface-produced NIs for obtaining extracted NI currents comparable to the experimental results from the BATMAN testbed.

  4. Modeling energy market dynamics using discrete event system simulation

    International Nuclear Information System (INIS)

    Gutierrez-Alcaraz, G.; Sheble, G.B.

    2009-01-01

    This paper proposes the use of Discrete Event System Simulation to study the interactions among fuel and electricity markets and consumers, and the decision-making processes of fuel companies (FUELCOs), generation companies (GENCOs), and consumers in a simple artificial energy market. In reality, since markets can reach a stable equilibrium or fail, it is important to observe how they behave in a dynamic framework. We consider a Nash-Cournot model in which marketers are depicted as Nash-Cournot players that determine supply to meet end-use consumption. Detailed engineering considerations such as transportation network flows are omitted, because the focus is upon the selection and use of appropriate market models to provide answers to policy questions. (author)
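
    As a point of reference for such a simulation, the symmetric Nash-Cournot equilibrium has a closed form under linear inverse demand P(Q) = a - bQ and a common marginal cost c; the sketch below uses hypothetical numbers and is our own illustration rather than the paper's model (which additionally couples a fuel market), showing the closed form and how repeated best responses settle there:

        def cournot_symmetric(a, b, c, n):
            """Closed-form symmetric Cournot-Nash equilibrium for inverse demand
            P(Q) = a - b*Q and common marginal cost c among n players."""
            q = (a - c) / (b * (n + 1))        # per-player equilibrium quantity
            return q, a - b * n * q            # (quantity, market price)

        def sequential_best_response(a, b, c, n, rounds=200):
            """Players best-respond one at a time, mimicking how a simulated
            market can settle toward the Nash equilibrium."""
            q = [0.0] * n
            for _ in range(rounds):
                for i in range(n):
                    others = sum(q) - q[i]
                    q[i] = max(0.0, (a - c - b * others) / (2.0 * b))
            return q

        print(cournot_symmetric(a=100.0, b=1.0, c=20.0, n=3))         # (20.0, 40.0)
        print(sequential_best_response(a=100.0, b=1.0, c=20.0, n=3))  # approx [20, 20, 20]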

  5. Interpretation of the spin glass behaviour of diluted magnetic semiconductors below the nearest-neighbour percolation threshold via realistic Monte Carlo simulations

    CERN Document Server

    Karaoulanis, D; Bacalis, N C

    2000-01-01

    We have performed Monte Carlo simulations of magnetic semiconductors above and below the nearest-neighbour percolation threshold (NNPT) using a classical Heisenberg Hamiltonian with up to third nearest-neighbour (nn) interactions. Large clusters were created allowing use of realistically low magnetic fields (10 G). Above NNPT our results, apart from confirming the existing picture of this class of materials, also show that the inclusion of the second and third (nn) interactions increases the frustration, thus making the transition temperature smaller and closer to experiment than calculated via the first nn interactions only. A physically plausible explanation is given. Below NNPT our results strongly support the validity of the hypothesis (D. Karaoulanis, J.P. Xanthakis, C. Papatriantafillou, J. Magn. Magn. Mater. 161 (1996) 231), that the experimentally observed susceptibility is the sum of two contributions: a paramagnetic one due to isolated magnetic clusters, and a spin-glass contribution due to an 'infi...

  6. A systematic comparison of recurrent event models for application to composite endpoints.

    Science.gov (United States)

    Ozga, Ann-Kathrin; Kieser, Meinhard; Rauch, Geraldine

    2018-01-04

    Many clinical trials focus on the comparison of the treatment effect between two or more groups concerning a rarely occurring event. In this situation, showing a relevant effect with an acceptable power requires the observation of a large number of patients over a long period of time. For feasibility reasons, it is therefore often considered to include several event types of interest, non-fatal or fatal, and to combine them within a composite endpoint. Commonly, a composite endpoint is analyzed with standard survival analysis techniques by assessing the time to the first occurring event. This approach neglects the fact that an individual may experience more than one event, which leads to a loss of information. As an alternative, composite endpoints could be analyzed by models for recurrent events. There exist a number of such models, e.g. regression models based on count data or Cox-based models such as the approaches of Andersen and Gill; Prentice, Williams and Peterson; or Wei, Lin and Weissfeld. Although some of the methods have already been compared in the literature, there exists no systematic investigation of the special requirements regarding composite endpoints. In this work, a simulation-based comparison of recurrent event models applied to composite endpoints is provided for different realistic clinical trial scenarios. We demonstrate that the Andersen-Gill model and the Prentice-Williams-Peterson models show similar results under various data scenarios, whereas the Wei-Lin-Weissfeld model delivers effect estimators that can deviate considerably under commonly encountered data scenarios. Based on the conducted simulation study, this paper helps to understand the pros and cons of the investigated methods in the context of composite endpoints and therefore provides recommendations for an adequate statistical analysis strategy and a meaningful interpretation of results.
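
    The data-management step shared by the Andersen-Gill and Prentice-Williams-Peterson approaches is the expansion of each patient's event history into counting-process (start, stop] intervals; a hedged sketch with hypothetical column names is given below (the fitted models would then come from a start/stop-capable Cox routine, with robust cluster-by-patient variances for Andersen-Gill or stratification by event number for the PWP models):

        import pandas as pd

        def to_counting_process(event_times, followup, patient_id, covariates):
            """Expand one patient's recurrent-event history into Andersen-Gill style
            (start, stop] risk intervals.  Column names are illustrative only."""
            rows, start = [], 0.0
            for k, t in enumerate(sorted(event_times) + [followup]):
                is_event = k < len(event_times)
                if t > start:                      # skip zero-length intervals
                    rows.append({"id": patient_id, "start": start, "stop": t,
                                 "event": int(is_event), "episode": k + 1, **covariates})
                start = t
            return pd.DataFrame(rows)

        # hypothetical patient: events at 2.1 and 5.4 months, censored at 9 months
        df = to_counting_process([2.1, 5.4], followup=9.0, patient_id=1,
                                 covariates={"treatment": 1})
        print(df)
        # Stacked over patients, such a data frame can be fed to a Cox-type fitter
        # that accepts start/stop intervals; the 'episode' column supports the
        # stratification used by the Prentice-Williams-Peterson models.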

  7. Event Prediction for Modeling Mental Simulation in Naturalistic Decision Making

    National Research Council Canada - National Science Library

    Kunde, Dietmar

    2005-01-01

    ... and increasingly important asymmetric warfare scenarios. Although improvements in computer technology support more and more detailed representations, human decision making is still far from being automated in a realistic way...

  8. Markov modeling and discrete event simulation in health care: a systematic comparison.

    Science.gov (United States)

    Standfield, Lachlan; Comans, Tracy; Scuffham, Paul

    2014-04-01

    The aim of this study was to assess if the use of Markov modeling (MM) or discrete event simulation (DES) for cost-effectiveness analysis (CEA) may alter healthcare resource allocation decisions. A systematic literature search and review of empirical and non-empirical studies comparing MM and DES techniques used in the CEA of healthcare technologies was conducted. Twenty-two pertinent publications were identified. Two publications compared MM and DES models empirically, one presented a conceptual DES and MM, two described a DES consensus guideline, and seventeen drew comparisons between MM and DES through the authors' experience. The primary advantages described for DES over MM were the ability to model queuing for limited resources, capture individual patient histories, accommodate complexity and uncertainty, represent time flexibly, model competing risks, and accommodate multiple events simultaneously. The disadvantages of DES over MM were the potential for model overspecification, increased data requirements, specialized expensive software, and increased model development, validation, and computational time. Where individual patient history is an important driver of future events an individual patient simulation technique like DES may be preferred over MM. Where supply shortages, subsequent queuing, and diversion of patients through other pathways in the healthcare system are likely to be drivers of cost-effectiveness, DES modeling methods may provide decision makers with more accurate information on which to base resource allocation decisions. Where these are not major features of the cost-effectiveness question, MM remains an efficient, easily validated, parsimonious, and accurate method of determining the cost-effectiveness of new healthcare interventions.
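
    The contrast drawn above can be seen in the simplest Markov cohort model, where a transition matrix moves the whole cohort each cycle with no queues or individual histories; the sketch below uses hypothetical states and numbers, not figures from any of the reviewed studies:

        import numpy as np

        # Hypothetical 3-state cohort model (Well, Sick, Dead); cycle length = 1 year.
        P = np.array([[0.85, 0.10, 0.05],    # transition probabilities from Well
                      [0.00, 0.80, 0.20],    # from Sick
                      [0.00, 0.00, 1.00]])   # Dead is absorbing
        cost_per_cycle = np.array([500.0, 4000.0, 0.0])
        utility = np.array([0.90, 0.60, 0.0])

        cohort = np.array([1.0, 0.0, 0.0])   # everyone starts Well
        total_cost = total_qaly = 0.0
        for _ in range(30):                  # 30 yearly cycles, no discounting
            total_cost += cohort @ cost_per_cycle
            total_qaly += cohort @ utility
            cohort = cohort @ P              # the whole cohort moves in one step
        print(f"expected cost {total_cost:.0f}, QALYs {total_qaly:.2f}")
        # A DES of the same decision problem would instead sample individual patient
        # trajectories and could add queues for constrained resources.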

  9. Building the evidence on simulation validity: comparison of anesthesiologists' communication patterns in real and simulated cases.

    Science.gov (United States)

    Weller, Jennifer; Henderson, Robert; Webster, Craig S; Shulruf, Boaz; Torrie, Jane; Davies, Elaine; Henderson, Kaylene; Frampton, Chris; Merry, Alan F

    2014-01-01

    Effective teamwork is important for patient safety, and verbal communication underpins many dimensions of teamwork. The validity of the simulated environment would be supported if it elicited similar verbal communications to the real setting. The authors hypothesized that anesthesiologists would exhibit similar verbal communication patterns in routine operating room (OR) cases and routine simulated cases. The authors further hypothesized that anesthesiologists would exhibit different communication patterns in routine cases (real or simulated) and simulated cases involving a crisis. Key communications relevant to teamwork were coded from video recordings of anesthesiologists in the OR, routine simulation and crisis simulation and percentages were compared. The authors recorded comparable videos of 20 anesthesiologists in the two simulations, and 17 of these anesthesiologists in the OR, generating 400 coded events in the OR, 683 in the routine simulation, and 1,419 in the crisis simulation. The authors found no significant differences in communication patterns in the OR and the routine simulations. The authors did find significant differences in communication patterns between the crisis simulation and both the OR and the routine simulations. Participants rated team communication as realistic and considered their communications occurred with a similar frequency in the simulations as in comparable cases in the OR. The similarity of teamwork-related communications elicited from anesthesiologists in simulated cases and the real setting lends support for the ecological validity of the simulation environment and its value in teamwork training. Different communication patterns and frequencies under the challenge of a crisis support the use of simulation to assess crisis management skills.

  10. Assessment of long-term knowledge retention following single-day simulation training for uncommon but critical obstetrical events

    Science.gov (United States)

    Vadnais, Mary A.; Dodge, Laura E.; Awtrey, Christopher S.; Ricciotti, Hope A.; Golen, Toni H.; Hacker, Michele R.

    2013-01-01

    Objective The objectives were to determine (i) whether simulation training results in short-term and long-term improvement in the management of uncommon but critical obstetrical events and (ii) whether there was additional benefit from annual exposure to the workshop. Methods Physicians completed a pretest to measure knowledge and confidence in the management of eclampsia, shoulder dystocia, postpartum hemorrhage and vacuum-assisted vaginal delivery. They then attended a simulation workshop and immediately completed a posttest. Residents completed the same posttests 4 and 12 months later, and attending physicians completed the posttest at 12 months. Physicians participated in the same simulation workshop 1 year later and then completed a final posttest. Scores were compared using paired t-tests. Results Physicians demonstrated improved knowledge and comfort immediately after simulation. Residents maintained this improvement at 1 year. Attending physicians remained more comfortable managing these scenarios up to 1 year later; however, knowledge retention diminished with time. Repeating the simulation after 1 year brought additional improvement to physicians. Conclusion Simulation training can result in short-term improvement and contribute to long-term improvement in objective measures of knowledge and comfort level in managing uncommon but critical obstetrical events. Repeat exposure to simulation training after 1 year can yield additional benefits. PMID:22191668

  11. Dermatopathology effects of simulated solar particle event radiation exposure in the porcine model.

    Science.gov (United States)

    Sanzari, Jenine K; Diffenderfer, Eric S; Hagan, Sarah; Billings, Paul C; Gridley, Daila S; Seykora, John T; Kennedy, Ann R; Cengel, Keith A

    2015-07-01

    The space environment exposes astronauts to risks of acute and chronic exposure to ionizing radiation. Of particular concern is possible exposure to ionizing radiation from a solar particle event (SPE). During an SPE, magnetic disturbances in specific regions of the Sun result in the release of intense bursts of ionizing radiation, primarily consisting of protons that have a highly variable energy spectrum. Thus, SPE events can lead to significant total body radiation exposures to astronauts in space vehicles and especially while performing extravehicular activities. Simulated energy profiles suggest that SPE radiation exposures are likely to be highest in the skin. In the current report, we have used our established miniature pig model system to evaluate the skin toxicity of simulated SPE radiation exposures that closely resemble the energy and fluence profile of the September 1989 SPE using either conventional radiation (electrons) or proton simulated SPE radiation. Exposure of animals to electron or proton radiation led to dose-dependent increases in epidermal pigmentation, the presence of necrotic keratinocytes at the dermal-epidermal boundary and pigment incontinence, manifested by the presence of melanophages in the dermis upon histological examination. We also observed epidermal hyperplasia and a reduction in vascular density at 30 days following exposure to electron or proton simulated SPE radiation. These results suggest that the doses of electron or proton simulated SPE radiation result in significant skin toxicity that is quantitatively and qualitatively similar. Radiation-induced skin damage is often one of the first clinical signs of both acute and non-acute radiation injury, and infection may occur if it is not treated. In this report, histopathology analyses of acute radiation-induced skin injury are discussed. Copyright © 2015 The Committee on Space Research (COSPAR). Published by Elsevier Ltd. All rights reserved.

  12. Toward Simulating Realistic Pursuit-Evasion Using a Roadmap-Based Approach

    KAUST Repository

    Rodriguez, Samuel

    2010-01-01

    In this work, we describe an approach for modeling and simulating group behaviors for pursuit-evasion that uses a graph-based representation of the environment and integrates multi-agent simulation with roadmap-based path planning. We demonstrate the utility of this approach for a variety of scenarios including pursuit-evasion on terrains, in multi-level buildings, and in crowds. © 2010 Springer-Verlag Berlin Heidelberg.

  13. Event-based computer simulation model of Aspect-type experiments strictly satisfying Einstein's locality conditions

    NARCIS (Netherlands)

    De Raedt, Hans; De Raedt, Koen; Michielsen, Kristel; Keimpema, Koenraad; Miyashita, Seiji

    2007-01-01

    Inspired by Einstein-Podolsky-Rosen-Bohm experiments with photons, we construct an event-based simulation model in which every essential element in the ideal experiment has a counterpart. The model satisfies Einstein's criterion of local causality and does not rely on concepts of quantum and

  14. Modeling extreme "Carrington-type" space weather events using three-dimensional global MHD simulations

    Science.gov (United States)

    Ngwira, Chigomezyo M.; Pulkkinen, Antti; Kuznetsova, Maria M.; Glocer, Alex

    2014-06-01

    There is a growing concern over possible severe societal consequences related to adverse space weather impacts on man-made technological infrastructure. In the last two decades, significant progress has been made toward the first-principles modeling of space weather events, and three-dimensional (3-D) global magnetohydrodynamics (MHD) models have been at the forefront of this transition, thereby playing a critical role in advancing our understanding of space weather. However, the modeling of extreme space weather events is still a major challenge even for the modern global MHD models. In this study, we introduce a specially adapted University of Michigan 3-D global MHD model for simulating extreme space weather events with a Dst footprint comparable to the Carrington superstorm of September 1859 based on the estimate by Tsurutani et al. (2003). Results are presented for a simulation run with "very extreme" constructed/idealized solar wind boundary conditions driving the magnetosphere. In particular, we describe the reaction of the magnetosphere-ionosphere system and the associated induced geoelectric field on the ground to such extreme driving conditions. The model setup is further tested using input data for an observed space weather event, the Halloween storm of October 2003, to verify the MHD model's consistency and to draw additional guidance for future work. This extreme space weather MHD model setup is designed specifically for practical application to the modeling of extreme geomagnetically induced electric fields, which can drive large currents in ground-based conductor systems such as power transmission grids. Therefore, our ultimate goal is to explore the level of geoelectric fields that can be induced from an assumed storm of the reported magnitude, i.e., Dst ≈ -1600 nT.
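
    As context for the geoelectric-field estimates discussed above, a common simplified approach (separate from the MHD model itself) is the plane-wave method over a uniform half-space, in which a surface impedance converts the magnetic disturbance into a horizontal electric field. The sketch below is a hedged illustration of that idea only; the synthetic magnetic time series, the ground conductivity, and the field orientations and sign conventions are placeholder assumptions.

    ```python
    import numpy as np

    MU0 = 4e-7 * np.pi   # vacuum permeability
    SIGMA = 1e-3         # assumed uniform ground conductivity, S/m (placeholder)
    DT = 10.0            # sampling interval of the magnetic record, s

    # synthetic horizontal magnetic disturbance in nT (placeholder for observed/modeled data)
    t = np.arange(0.0, 3600.0, DT)
    b_horizontal = 500.0 * np.exp(-((t - 1800.0) / 300.0) ** 2)

    # plane-wave method over a uniform half-space:
    #   E(w) = Z(w) * H(w),  Z(w) = sqrt(i * w * mu0 / sigma),  H = B / mu0
    # (geometry and sign conventions between field components are glossed over here)
    b_f = np.fft.rfft(b_horizontal * 1e-9)                 # nT -> T, to frequency domain
    w = 2.0 * np.pi * np.fft.rfftfreq(len(t), d=DT)
    z = np.sqrt(1j * w * MU0 / SIGMA)                      # half-space surface impedance
    e_f = z * b_f / MU0
    e = np.fft.irfft(e_f, n=len(t))                        # geoelectric field, V/m

    print(f"peak geoelectric field: {1e6 * np.max(np.abs(e)):.1f} mV/km")
    ```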

  15. Evaluating TCMS Train-to-Ground communication performances based on the LTE technology and discrete event simulations

    DEFF Research Database (Denmark)

    Bouaziz, Maha; Yan, Ying; Kassab, Mohamed

    2018-01-01

    This work evaluates TCMS train-to-ground communication performance over an LTE (Long Term Evolution) network as an alternative communication technology, instead of GSM-R (Global System for Mobile communications-Railway), because of some capacity and capability limits. In a first step, a pure simulation is used to evaluate the network load for a high-speed scenario, when the LTE network is shared between the train and different passengers. The simulation is based on the discrete-event network simulator Riverbed Modeler. A second step focusses on a co-simulation testbed, to evaluate performances with real traffic based on Hardware-In-The-Loop and OpenAirInterface modules. Preliminary simulation and co-simulation results show that LTE provides good performance for the TCMS traffic exchange in terms of packet delay and data integrity.

  16. Numerical Simulations of an Inversion Fog Event in the Salt Lake Valley during the MATERHORN-Fog Field Campaign

    Science.gov (United States)

    Chachere, Catherine N.; Pu, Zhaoxia

    2018-01-01

    An advanced research version of the Weather Research and Forecasting (WRF) Model is employed to simulate a wintertime inversion fog event in the Salt Lake Valley during the Mountain Terrain Atmospheric Modeling and Observations Program (MATERHORN) field campaign during January 2015. Simulation results are compared to observations obtained from the field program. The sensitivity of numerical simulations to available cloud microphysical (CM), planetary boundary layer (PBL), radiation, and land surface models (LSMs) is evaluated. The influence of differing visibility algorithms and initialization times on simulation results is also examined. Results indicate that the numerical simulations of the fog event are sensitive to the choice of CM, PBL, radiation, and LSM as well as the visibility algorithm and initialization time. Although the majority of experiments accurately captured the synoptic setup environment, errors were found in most experiments within the boundary layer, specifically a 3° warm bias in simulated surface temperatures compared to observations. Accurate representation of surface and boundary layer variables is vital in correctly predicting fog in the numerical model.
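
    Visibility algorithms of the kind whose influence is tested above typically convert the model's liquid water content into an extinction coefficient through an empirical power law and then apply a contrast threshold. The sketch below illustrates that general idea only; the coefficients and the clear-sky cap are placeholder assumptions, not the algorithms actually evaluated in the study.

    ```python
    import numpy as np

    def visibility_km(lwc_gm3, a=144.7, b=0.88):
        """Estimate visibility (km) from liquid water content (g m^-3) using an
        assumed Kunkel-style power-law extinction relation:
            beta = a * LWC**b   [km^-1],   vis = 3.912 / beta   (Koschmieder threshold).
        The coefficients a and b are illustrative placeholders."""
        lwc = np.asarray(lwc_gm3, dtype=float)
        beta = a * np.maximum(lwc, 1e-9) ** b          # avoid division by zero
        return np.minimum(3.912 / beta, 20.0)          # cap at an assumed clear-sky value

    print(visibility_km([0.01, 0.05, 0.2]))            # denser fog -> shorter visibility
    ```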

  17. Outcomes and challenges of global high-resolution non-hydrostatic atmospheric simulations using the K computer

    Science.gov (United States)

    Satoh, Masaki; Tomita, Hirofumi; Yashiro, Hisashi; Kajikawa, Yoshiyuki; Miyamoto, Yoshiaki; Yamaura, Tsuyoshi; Miyakawa, Tomoki; Nakano, Masuo; Kodama, Chihiro; Noda, Akira T.; Nasuno, Tomoe; Yamada, Yohei; Fukutomi, Yoshiki

    2017-12-01

    This article reviews the major outcomes of a 5-year (2011-2016) project using the K computer to perform global numerical atmospheric simulations based on the non-hydrostatic icosahedral atmospheric model (NICAM). The K computer was made available to the public in September 2012 and was used as a primary resource for Japan's Strategic Programs for Innovative Research (SPIRE), an initiative to investigate five strategic research areas; the NICAM project fell under the research area of climate and weather simulation sciences. Combining NICAM with high-performance computing has created new opportunities in three areas of research: (1) higher resolution global simulations that produce more realistic representations of convective systems, (2) multi-member ensemble simulations that are able to perform extended-range forecasts 10-30 days in advance, and (3) multi-decadal simulations for climatology and variability. Before the K computer era, NICAM was used to demonstrate realistic simulations of intra-seasonal oscillations including the Madden-Julian oscillation (MJO), merely as a case study approach. Thanks to the big leap in computational performance of the K computer, we could greatly increase the number of cases of MJO events for numerical simulations, in addition to integrating time and horizontal resolution. We conclude that the high-resolution global non-hydrostatic model, as used in this five-year project, improves the ability to forecast intra-seasonal oscillations and associated tropical cyclogenesis compared with that of the relatively coarser operational models currently in use. The impacts of the sub-kilometer resolution simulation and the multi-decadal simulations using NICAM are also reviewed.

  18. Developing Flexible Discrete Event Simulation Models in an Uncertain Policy Environment

    Science.gov (United States)

    Miranda, David J.; Fayez, Sam; Steele, Martin J.

    2011-01-01

    On February 1st, 2010 U.S. President Barack Obama submitted to Congress his proposed budget request for Fiscal Year 2011. This budget included significant changes to the National Aeronautics and Space Administration (NASA), including the proposed cancellation of the Constellation Program. This change proved to be controversial, and Congressional approval of the program's official cancellation would take many months to complete. During this same period an end-to-end discrete event simulation (DES) model of Constellation operations was being built through the joint efforts of Productivity Apex Inc. (PAI) and Science Applications International Corporation (SAIC) teams under the guidance of NASA. The uncertainty in regards to the Constellation program presented a major challenge to the DES team: how to continue the development of this program-of-record simulation while at the same time remaining prepared for possible changes to the program. This required the team to rethink how it would develop its model and make it flexible enough to support possible future vehicles while at the same time being specific enough to support the program-of-record. This challenge was compounded by the fact that this model was being developed through the traditional DES process-orientation, which lacked the flexibility of object-oriented approaches. The team met this challenge through significant pre-planning that led to the "modularization" of the model's structure by identifying what was generic, finding natural logic break points, and standardizing the interlogic numbering system. The outcome of this work was a model that not only was ready to be easily modified to support any future rocket programs, but also a model that was extremely structured and organized in a way that facilitated rapid verification. This paper discusses in detail the process the team followed to build this model and the many advantages this method provides to builders of traditional process-oriented discrete event simulations.

  19. Comparative Study of Aircraft Boarding Strategies Using Cellular Discrete Event Simulation

    Directory of Open Access Journals (Sweden)

    Shafagh Jafer

    2017-11-01

    Full Text Available Time is crucial in the airline industry. Among all the factors contributing to aircraft turnaround time, passenger boarding delay is the most challenging one. Airlines do not have control over the behavior of passengers; they therefore focus their efforts on reducing passenger boarding time by implementing efficient boarding strategies. In this work, we attempt to use cellular Discrete-Event System Specification (Cell-DEVS) modeling and simulation to provide a comprehensive evaluation of aircraft boarding strategies. We have developed a simulation benchmark consisting of eight boarding strategies, including Back-to-Front, Window Middle Aisle, Random, Zone Rotate, Reverse Pyramid, Optimal, Optimal Practical, and Efficient. Our simulation models are scalable and adaptive, providing a powerful analysis apparatus for investigating any existing or yet-to-be-discovered boarding strategy. We explain the details of our models and present the results both visually and numerically to evaluate the eight implemented boarding strategies. We also compare our results with other studies that have used different modeling techniques, reporting nearly identical performance results. The simulations revealed that Window Middle Aisle provides the least boarding delay, with a small fraction of time difference compared to the optimal strategy. The results of this work could highly benefit the commercial airline industry by optimizing and reducing passenger boarding delays.
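
    As a rough, hedged illustration of how boarding strategies can be compared, the toy model below (not the paper's Cell-DEVS implementation) marches passengers down a single aisle in discrete time steps and counts the steps until everyone is seated; the cabin layout, stowing delay, and strategy definitions are simplifying assumptions.

    ```python
    import random

    ROWS, SEATS_PER_ROW, STOW_TIME = 30, 6, 4   # assumed cabin layout and stow delay
    random.seed(0)

    def boarding_time(order):
        """order: list of target row numbers (1..ROWS), one per passenger, in boarding order."""
        queue = list(order)                 # passengers still outside the aircraft
        aisle = {}                          # aisle position -> [target_row, stow_steps_left]
        t = 0
        while queue or aisle:
            t += 1
            # advance passengers already in the aisle, from the front backwards
            for pos in sorted(aisle, reverse=True):
                row, stow = aisle[pos]
                if pos == row:              # at own row: stow bag for a few steps, then sit
                    if stow > 1:
                        aisle[pos][1] -= 1
                    else:
                        del aisle[pos]
                elif pos + 1 not in aisle:  # move forward if the next aisle cell is free
                    aisle[pos + 1] = aisle.pop(pos)
            # the next passenger steps aboard if the entrance aisle cell is free
            if queue and 1 not in aisle:
                aisle[1] = [queue.pop(0), STOW_TIME]
        return t

    passengers = [r for r in range(1, ROWS + 1) for _ in range(SEATS_PER_ROW)]
    back_to_front = sorted(passengers, reverse=True)
    random_order = passengers[:]
    random.shuffle(random_order)

    print("back-to-front:", boarding_time(back_to_front), "steps")
    print("random order :", boarding_time(random_order), "steps")
    ```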

  20. Assessing the realism of colonoscopy simulation: the development of an instrument and systematic comparison of 4 simulators.

    Science.gov (United States)

    Hill, Andrew; Horswill, Mark S; Plooy, Annaliese M; Watson, Marcus O; Karamatic, Rozemary; Basit, Tabinda A; Wallis, Guy M; Riek, Stephan; Burgess-Limerick, Robin; Hewett, David G

    2012-03-01

    No useful comparative data exist on the relative realism of commercially available devices for simulating colonoscopy. To develop an instrument for quantifying realism and provide the first wide-ranging empiric comparison. Repeated measures, observational study. Nineteen experienced colonoscopists completed cases on 4 colonoscopy simulators (AccuTouch, GI Mentor II, Koken, and Kyoto Kagaku) and evaluated each device. A medical simulation center in a large tertiary hospital. For each device, colonoscopists completed the newly developed Colonoscopy Simulator Realism Questionnaire (CSRQ), which contains 58 items grouped into 10 subscales measuring the realism of different aspects of the simulation. Subscale scores are weighted and combined into an aggregated score, and there is also a single overall realism item. Overall, current colonoscopy simulators were rated as only moderately realistic compared with real human colonoscopy (mean aggregated score, 56.28/100; range, 48.39-60.45, where 0 = "extremely unrealistic" and 100 = "extremely realistic"). On both overall realism measures, the GI Mentor II was rated significantly less realistic than the AccuTouch, Kyoto Kagaku, and Koken. There is no clear "first choice" simulator among those assessed. Each has unique strengths and weaknesses, reflected in the differing results observed across 9 subscales. These findings may facilitate the targeted selection of simulators for various aspects of colonoscopy training. Copyright © 2012 American Society for Gastrointestinal Endoscopy. Published by Mosby, Inc. All rights reserved.

  1. Charge-dependent correlations from event-by-event anomalous hydrodynamics

    Energy Technology Data Exchange (ETDEWEB)

    Hirono, Yuji [Department of Physics and Astronomy, Stony Brook University, Stony Brook, NY 11794-3800 (United States); Hirano, Tetsufumi [Department of Physics, Sophia University, Tokyo 102-8554 (Japan); Kharzeev, Dmitri E. [Department of Physics and Astronomy, Stony Brook University, Stony Brook, NY 11794-3800 (United States); Department of Physics and RIKEN-BNL Research Center, Brookhaven National Laboratory, Upton, NY 11973-5000 (United States)

    2016-12-15

    We report on our recent attempt at quantitative modeling of the Chiral Magnetic Effect (CME) in heavy-ion collisions. We perform 3+1 dimensional anomalous hydrodynamic simulations on an event-by-event basis, with constitutive equations that contain the anomaly-induced effects. We also develop a model of the initial condition for the axial charge density that captures the statistical nature of random chirality imbalances created by the color flux tubes. Based on the event-by-event hydrodynamic simulations for hundreds of thousands of collisions, we calculate the correlation functions that are measured in experiments, and discuss how the anomalous transport affects these observables.

  2. A discrete event modelling framework for simulation of long-term outcomes of sequential treatment strategies for ankylosing spondylitis

    NARCIS (Netherlands)

    A. Tran-Duy (An); A. Boonen (Annelies); M.A.F.J. van de Laar (Mart); A. Franke (Andre); J.L. Severens (Hans)

    2011-01-01

    Objective: To develop a modelling framework which can simulate long-term quality of life, societal costs and cost-effectiveness as affected by sequential drug treatment strategies for ankylosing spondylitis (AS). Methods: Discrete event simulation paradigm was selected for model development.

  3. A discrete event modelling framework for simulation of long-term outcomes of sequential treatment strategies for ankylosing spondylitis

    NARCIS (Netherlands)

    Tran-Duy, A.; Boonen, A.; Laar, M.A.F.J.; Franke, A.C.; Severens, J.L.

    2011-01-01

    Objective To develop a modelling framework which can simulate long-term quality of life, societal costs and cost-effectiveness as affected by sequential drug treatment strategies for ankylosing spondylitis (AS). Methods Discrete event simulation paradigm was selected for model development. Drug

  4. Characterization of photomultiplier tubes with a realistic model through GPU-boosted simulation

    Science.gov (United States)

    Anthony, M.; Aprile, E.; Grandi, L.; Lin, Q.; Saldanha, R.

    2018-02-01

    The accurate characterization of a photomultiplier tube (PMT) is crucial in a wide-variety of applications. However, current methods do not give fully accurate representations of the response of a PMT, especially at very low light levels. In this work, we present a new and more realistic model of the response of a PMT, called the cascade model, and use it to characterize two different PMTs at various voltages and light levels. The cascade model is shown to outperform the more common Gaussian model in almost all circumstances and to agree well with a newly introduced model independent approach. The technical and computational challenges of this model are also presented along with the employed solution of developing a robust GPU-based analysis framework for this and other non-analytical models.
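
    A generic Monte Carlo of a dynode cascade shows why a cascade-type response differs from a single Gaussian at low light levels; the number of stages, the per-stage gains, and the use of Poisson branching at every stage below are illustrative assumptions and not necessarily the authors' exact cascade model.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    MU_PE = 0.5               # assumed mean photoelectrons per pulse (low light level)
    STAGE_GAINS = [4.0] * 8   # assumed mean multiplication per dynode stage (8 stages)
    N_PULSES = 100_000

    def charge_spectrum():
        n_pe = rng.poisson(MU_PE, N_PULSES)      # photoelectrons per pulse
        electrons = n_pe.astype(float)
        for g in STAGE_GAINS:                    # each stage multiplies the electron count,
            electrons = rng.poisson(g * electrons)   # with Poisson branching fluctuations
        return electrons

    q = charge_spectrum()
    print("mean gain per photoelectron:", q.mean() / MU_PE)     # ~ product of stage gains
    print("fraction of pulses with zero charge:", np.mean(q == 0))
    ```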

  5. Tuukka Kaidesoja on Critical Realist Transcendental Realism

    Directory of Open Access Journals (Sweden)

    Groff Ruth

    2015-09-01

    Full Text Available I argue that critical realists think pretty much what Tuukka Kaidesoja says that he himself thinks, but also that Kaidesoja’s objections to the views that he attributes to critical realists are not persuasive.

  6. Sensitivity studies on the approaches for addressing multiple initiating events in fire events PSA

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Dae Il; Lim, Ho Gon [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    A single fire event within a fire compartment or a fire scenario can cause multiple initiating events (IEs). As an example, a fire in a turbine building fire area can cause a loss of the main feed-water (LOMF) and loss of off-site power (LOOP) IEs. Previous domestic fire events PSA had considered only the most severe initiating event among multiple initiating events. NUREG/CR-6850 and the ANS/ASME PRA Standard require that multiple IEs be addressed in fire events PSA. In this paper, sensitivity studies on the approaches for addressing multiple IEs in the fire events PSA for Hanul Unit 3 were performed and their results are presented. From the sensitivity analysis results, we find that incorporating multiple IEs into the fire events PSA model results in an increase in core damage frequency (CDF) and may lead to the generation of duplicate cutsets. Multiple IEs can also occur in internal flooding events or other external events such as seismic events. They should be considered in the construction of PSA models in order to realistically estimate the risk due to flooding or seismic events.

  7. Discrete-event system simulation on small and medium enterprises productivity improvement

    Science.gov (United States)

    Sulistio, J.; Hidayah, N. A.

    2017-12-01

    Small and medium industries in Indonesia are currently developing. The problem faced by SMEs is the difficulty of meeting the growing demand coming into the company. Therefore, SMEs need an analysis and evaluation of their production process in order to meet all orders. The purpose of this research is to increase the productivity of the SME production floor by applying discrete-event system simulation. This method is preferred because it can solve complex problems due to the dynamic and stochastic nature of the system. To increase the credibility of the simulation, the model was validated by comparing the averages of two trials, the variances of two trials, and a chi-square test. Afterwards, the Bonferroni method was applied to develop several alternatives. The article concludes that the productivity of the SME production floor increased by up to 50% by adding the capacity of the dyeing and drying machines.

  8. Validation of a realistic powder sample using data from DMC at PSI

    International Nuclear Information System (INIS)

    Willendrup, Peter; Filges, Uwe; Keller, Lukas; Farhi, Emmanuel; Lefmann, Kim

    2006-01-01

    We present results of a virtual experiment, carried out by means of a McStas simulation of the powder diffractometer DMC at PSI, using the new powder sample component PowderN. This powder component takes tabulated crystallographic input to define realistic powder lines. The simulated output data from the virtual experiment on the compound Na2Ca3Al2F14 are compared to real measurement data from the DMC instrument. The agreement is very good with respect to peak positions, widths, background intensity and relative peak intensities. This work represents an important step towards reliable virtual experiments and also acts as a validation of the PowderN sample component in McStas

  9. Validation of a realistic powder sample using data from DMC at PSI

    Energy Technology Data Exchange (ETDEWEB)

    Willendrup, Peter [Riso National Laboratory, Frederiksborgvej 399, DK-4000 Roskilde (Denmark)]. E-mail: peter.willendrup@risoe.dk; Filges, Uwe [Laboratory for Development and Methods ETHZ and PSI CH-5232 Villigen PSI (Switzerland); Keller, Lukas [Laboratory for Neutron Scattering ETHZ and PSI CH-5232 Villigen PSI (Switzerland); Farhi, Emmanuel [Institut Laue-Langevin (ILL) Grenoble, 6 rue J. Horowitz, BP 156, 38042 Grenoble Cedex 9 (France); Lefmann, Kim [Riso National Laboratory, Frederiksborgvej 399, DK-4000 Roskilde (Denmark)

    2006-11-15

    We present results of a virtual experiment, carried out by means of a McStas simulation of the powder diffractometer DMC at PSI, using the new powder sample component PowderN. This powder component takes tabulated crystallographic input to define realistic powder lines. The simulated output data from the virtual experiment on the compound Na2Ca3Al2F14 are compared to real measurement data from the DMC instrument. The agreement is very good with respect to peak positions, widths, background intensity and relative peak intensities. This work represents an important step towards reliable virtual experiments and also acts as a validation of the PowderN sample component in McStas.

  10. Validation of a realistic powder sample using data from DMC at PSI

    DEFF Research Database (Denmark)

    Willendrup, Peter Kjær; Filges, U.; Keller, L.

    2006-01-01

    We present results of a virtual experiment, carried out by means of a McStas simulation of the powder diffractometer DMC at PSI, using the new powder sample component PowderN. This powder component takes tabulated crystallographic input to define realistic powder lines. The simulated output data...... from the virtual experiment on the compound Na2Ca3Al2F14 are compared to real measurement data from the DMC instrument. The agreement is very good with respect to peak positions, widths, background intensity and relative peak intensities. This work represents an important step towards reliable virtual...... experiments and also act as a validation of the PowderN sample component in McStas....

  11. Automatic procedure for realistic 3D finite element modelling of human brain for bioelectromagnetic computations

    International Nuclear Information System (INIS)

    Aristovich, K Y; Khan, S H

    2010-01-01

    Realistic computer modelling of biological objects requires building very accurate and realistic computer models based on geometric and material data and on the type and accuracy of the numerical analyses. This paper presents some of the automatic tools and algorithms that were used to build an accurate and realistic 3D finite element (FE) model of the whole brain. These models were used to solve the forward problem in magnetic field tomography (MFT) based on Magnetoencephalography (MEG). The forward problem involves modelling and computation of the magnetic fields produced by the human brain during cognitive processing. The geometric parameters of the model were obtained from accurate Magnetic Resonance Imaging (MRI) data and the material properties from Diffusion Tensor MRI (DTMRI) data. The 3D FE models of the brain built using this approach have been shown to be very accurate in terms of both geometric and material properties. The model is stored on the computer in Computer-Aided Parametrical Design (CAD) format. This allows the model to be used in a wide range of methods of analysis, such as the finite element method (FEM), the Boundary Element Method (BEM), Monte-Carlo simulations, etc. The generic model building approach presented here could be used for accurate and realistic modelling of the human brain and many other biological objects.

  12. Knowledge-based simulation using object-oriented programming

    Science.gov (United States)

    Sidoran, Karen M.

    1993-01-01

    Simulations have become a powerful mechanism for understanding and modeling complex phenomena. Their results have had substantial impact on a broad range of decisions in the military, government, and industry. Because of this, new techniques are continually being explored and developed to make them even more useful, understandable, extendable, and efficient. One such area of research is the application of the knowledge-based methods of artificial intelligence (AI) to the computer simulation field. The goal of knowledge-based simulation is to facilitate building simulations of greatly increased power and comprehensibility by making use of deeper knowledge about the behavior of the simulated world. One technique for representing and manipulating knowledge that has been enhanced by the AI community is object-oriented programming. Using this technique, the entities of a discrete-event simulation can be viewed as objects in an object-oriented formulation. Knowledge can be factual (i.e., attributes of an entity) or behavioral (i.e., how the entity is to behave in certain circumstances). Rome Laboratory's Advanced Simulation Environment (RASE) was developed as a research vehicle to provide an enhanced simulation development environment for building more intelligent, interactive, flexible, and realistic simulations. This capability will support current and future battle management research and provide a test of the object-oriented paradigm for use in large scale military applications.

  13. Augmented versus virtual reality laparoscopic simulation: what is the difference? A comparison of the ProMIS augmented reality laparoscopic simulator versus LapSim virtual reality laparoscopic simulator

    NARCIS (Netherlands)

    Botden, Sanne M. B. I.; Buzink, Sonja N.; Schijven, Marlies P.; Jakimowicz, Jack J.

    2007-01-01

    BACKGROUND: Virtual reality (VR) is an emerging new modality for laparoscopic skills training; however, most simulators lack realistic haptic feedback. Augmented reality (AR) is a new laparoscopic simulation system offering a combination of physical objects and VR simulation. Laparoscopic

  14. Effects of realistic topography on the ground motion of the Colombian Andes - A case study at the Aburrá Valley, Antioquia

    Science.gov (United States)

    Restrepo, Doriam; Bielak, Jacobo; Serrano, Ricardo; Gómez, Juan; Jaramillo, Juan

    2016-03-01

    This paper presents a set of deterministic 3-D ground motion simulations for the greater metropolitan area of Medellín in the Aburrá Valley, an earthquake-prone region of the Colombian Andes that exhibits moderate-to-strong topographic irregularities. We created the velocity model of the Aburrá Valley region (version 1) using the geological structures as a basis for determining the shear wave velocity. The irregular surficial topography is considered by means of a fictitious domain strategy. The simulations cover a 50 × 50 × 25 km3 volume, and four Mw = 5 rupture scenarios along a segment of the Romeral fault, a significant source of seismic activity in Colombia. In order to examine the sensitivity of ground motion to the irregular topography and the 3-D effects of the valley, each earthquake scenario was simulated with three different models: (i) realistic 3-D velocity structure plus realistic topography, (ii) realistic 3-D velocity structure without topography, and (iii) homogeneous half-space with realistic topography. Our results show how surface topography affects the ground response. In particular, our findings highlight the importance of the combined interaction between source-effects, source-directivity, focusing, soft-soil conditions, and 3-D topography. We provide quantitative evidence of this interaction and show that topographic amplification factors can be as high as 500 per cent at some locations. In other areas within the valley, the topographic effects result in relative reductions, but these lie in the 0-150 per cent range.

  15. A Discrete-Event Simulation Model for Evaluating Air Force Reusable Military Launch Vehicle Post-Landing Operations

    National Research Council Canada - National Science Library

    Martindale, Michael

    2006-01-01

    The purpose of this research was to develop a discrete-event computer simulation model of the post-landing vehicle recovery operations to allow the Air Force Research Laboratory, Air Vehicles Directorate...

  16. Wavelet spectra of JACEE events

    International Nuclear Information System (INIS)

    Suzuki, Naomichi; Biyajima, Minoru; Ohsawa, Akinori.

    1995-01-01

    Pseudo-rapidity distributions of two high-multiplicity events, Ca-C and Si-AgBr, observed by JACEE are analyzed by a wavelet transform. Wavelet spectra of those events are calculated and compared with simulation calculations. The wavelet spectrum of the Ca-C event somewhat resembles that simulated with uniform random numbers. That of the Si-AgBr event, however, is not reproduced by simulation calculations with Poisson random numbers, uniform random numbers, or a p-model. (author)
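
    One simple way to form a wavelet spectrum of a binned pseudo-rapidity distribution is to sum the energy of the wavelet detail coefficients at each scale. The sketch below, which assumes the PyWavelets package and uses toy histograms rather than JACEE data, compares an "event" with an injected local fluctuation against a uniform-random baseline.

    ```python
    import numpy as np
    import pywt

    rng = np.random.default_rng(2)

    # toy binned pseudo-rapidity densities (128 bins); the 'event' has an injected bump
    n_bins = 128
    baseline = rng.poisson(20, n_bins).astype(float)       # featureless random baseline
    event = rng.poisson(20, n_bins).astype(float)
    event[40:56] += 15.0                                    # local fluctuation

    def wavelet_spectrum(signal, wavelet="db2", level=5):
        """Energy of the detail coefficients at each scale (a simple wavelet spectrum)."""
        coeffs = pywt.wavedec(signal - signal.mean(), wavelet, level=level)
        return [float(np.sum(c ** 2)) for c in coeffs[1:]]  # skip the approximation band

    print("event   :", wavelet_spectrum(event))
    print("baseline:", wavelet_spectrum(baseline))
    ```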

  17. A Simple Ensemble Simulation Technique for Assessment of Future Variations in Specific High-Impact Weather Events

    Science.gov (United States)

    Taniguchi, Kenji

    2018-04-01

    To investigate future variations in high-impact weather events, numerous samples are required. For the detailed assessment in a specific region, a high spatial resolution is also required. A simple ensemble simulation technique is proposed in this paper. In the proposed technique, new ensemble members were generated from one basic state vector and two perturbation vectors, which were obtained by lagged average forecasting simulations. Sensitivity experiments with different numbers of ensemble members, different simulation lengths, and different perturbation magnitudes were performed. Experimental application to a global warming study was also implemented for a typhoon event. Ensemble-mean results and ensemble spreads of total precipitation, atmospheric conditions showed similar characteristics across the sensitivity experiments. The frequencies of the maximum total and hourly precipitation also showed similar distributions. These results indicate the robustness of the proposed technique. On the other hand, considerable ensemble spread was found in each ensemble experiment. In addition, the results of the application to a global warming study showed possible variations in the future. These results indicate that the proposed technique is useful for investigating various meteorological phenomena and the impacts of global warming. The results of the ensemble simulations also enable the stochastic evaluation of differences in high-impact weather events. In addition, the impacts of a spectral nudging technique were also examined. The tracks of a typhoon were quite different between cases with and without spectral nudging; however, the ranges of the tracks among ensemble members were comparable. It indicates that spectral nudging does not necessarily suppress ensemble spread.
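
    The core of the technique, generating many members from one basic state vector and two perturbation vectors obtained from lagged forecasts, can be sketched as a simple linear combination. The combination weights and normalization below are illustrative assumptions, not the paper's exact prescription.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    state_dim = 1000
    x_basic = rng.normal(size=state_dim)            # stand-in for the basic state vector
    p1 = rng.normal(scale=0.1, size=state_dim)      # perturbation vectors, e.g. differences
    p2 = rng.normal(scale=0.1, size=state_dim)      # between lagged forecasts and the basic state

    def make_members(n_members, amplitude=1.0):
        """Initial conditions x_i = x_basic + a_i*p1 + b_i*p2 with assumed weights."""
        angles = 2.0 * np.pi * np.arange(n_members) / n_members
        return [x_basic + amplitude * (np.cos(a) * p1 + np.sin(a) * p2) for a in angles]

    members = make_members(16)
    spread = np.std(np.stack(members), axis=0).mean()
    print(f"16 members, mean initial spread: {spread:.4f}")
    ```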

  18. GRMHD Simulations of Visibility Amplitude Variability for Event Horizon Telescope Images of Sgr A*

    Science.gov (United States)

    Medeiros, Lia; Chan, Chi-kwan; Özel, Feryal; Psaltis, Dimitrios; Kim, Junhan; Marrone, Daniel P.; Sądowski, Aleksander

    2018-04-01

    The Event Horizon Telescope will generate horizon scale images of the black hole in the center of the Milky Way, Sgr A*. Image reconstruction using interferometric visibilities rests on the assumption of a stationary image. We explore the limitations of this assumption using high-cadence disk- and jet-dominated GRMHD simulations of Sgr A*. We also employ analytic models that capture the basic characteristics of the images to understand the origin of the variability in the simulated visibility amplitudes. We find that, in all simulations, the visibility amplitudes for baselines oriented parallel and perpendicular to the spin axis of the black hole follow general trends that do not depend strongly on accretion-flow properties. This suggests that fitting Event Horizon Telescope observations with simple geometric models may lead to a reasonably accurate determination of the orientation of the black hole on the plane of the sky. However, in the disk-dominated models, the locations and depths of the minima in the visibility amplitudes are highly variable and are not related simply to the size of the black hole shadow. This suggests that using time-independent models to infer additional black hole parameters, such as the shadow size or the spin magnitude, will be severely affected by the variability of the accretion flow.
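
    Up to calibration, visibility amplitudes are the magnitude of the Fourier transform of the sky image sampled at each baseline's (u, v) point. The sketch below computes normalized amplitudes along two orthogonal cuts through the (u, v) plane for a toy ring-like image; the image, pixel scale, and cut orientations are placeholders rather than GRMHD output or EHT data.

    ```python
    import numpy as np

    # toy 'shadow' image: a thin bright ring on a square grid
    n = 256
    x = np.arange(n) - n / 2
    X, Y = np.meshgrid(x, x)
    r = np.hypot(X, Y)
    image = np.exp(-0.5 * ((r - 25.0) / 2.0) ** 2)       # ring of radius 25 pixels

    # complex visibilities = 2-D Fourier transform of the image
    vis = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(image)))
    amp = np.abs(vis) / np.abs(vis).max()                # normalized visibility amplitude

    # two orthogonal cuts through the (u, v) plane; for this symmetric toy image the
    # cuts are identical, whereas for a real source they depend on orientation
    centre = n // 2
    parallel_cut = amp[centre, centre:]                  # along +u
    perpendicular_cut = amp[centre:, centre]             # along +v
    print("first visibility minimum (parallel cut) at baseline index:",
          int(np.argmin(parallel_cut[:10])))
    ```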

  19. Plant analyzer development for high-speed interactive simulation of BWR plant transients

    International Nuclear Information System (INIS)

    Wulff, W.; Cheng, H.S.; Mallen, A.N.

    1986-01-01

    Advanced modeling techniques have been combined with modern, special-purpose peripheral minicomputer technology to develop a plant analyzer which provides realistic and accurate predictions of plant transients and severe off-normal events in nuclear power plants through on-line simulations at speeds approximately 10 times faster than actual process speeds. The new simulation technology serves not only to carry out safety analyses, optimizations of emergency procedures and design changes, parametric studies for obtaining safety margins, and generic training routinely and efficiently, but also to assist plant operations. Five modeling principles are presented which serve to achieve high-speed simulation of neutron kinetics, thermal conduction, nonhomogeneous and nonequilibrium two-phase flow coolant dynamics, steam line acoustical effects, and the dynamics of the balance-of-plant and containment systems, control systems and plant protection systems. 21 refs

  20. ATC-lab(Advanced): an air traffic control simulator with realism and control.

    Science.gov (United States)

    Fothergill, Selina; Loft, Shayne; Neal, Andrew

    2009-02-01

    ATC-lab(Advanced) is a new, publicly available air traffic control (ATC) simulation package that provides both realism and experimental control. ATC-lab(Advanced) simulations are realistic to the extent that the display features (including aircraft performance) and the manner in which participants interact with the system are similar to those used in an operational environment. Experimental control allows researchers to standardize air traffic scenarios, control levels of realism, and isolate specific ATC tasks. Importantly, ATC-lab(Advanced) also provides the programming control required to cost effectively adapt simulations to serve different research purposes without the need for technical support. In addition, ATC-lab(Advanced) includes a package for training participants and mathematical spreadsheets for designing air traffic events. Preliminary studies have demonstrated that ATC-lab(Advanced) is a flexible tool for applied and basic research.

  1. Simulating continuous renal replacement therapy: usefulness of a new simulator device.

    Science.gov (United States)

    Mencía, Santiago; López, Manuel; López-Herce, Jesús; Ferrero, Luis; Rodríguez-Núñez, Antonio

    2014-03-01

    Simulation allows the training of life-support procedures without patient risk. We analyzed the performance and usefulness of a new device that makes feasible the external control of continuous renal replacement therapy (CRRT) machines in order to realistically generate clinical conditions and problems in simulated patients. A simple mechanical device was designed according to training needs and then handmade. This device permits the control of all monitorable pressures and therefore allows simulation of a range of clinical situations and eventual complications that might occur in real patients. We tested its performance in vitro and then during 16 high-fidelity patient-simulation scenarios included in the program of pediatric CRRT courses. Student and teacher satisfaction was assessed through an anonymous survey. Quick, accurate, real-time monitoring of pressure changes, concordant with the usual clinical problems to be simulated (catheter complications, filter coagulation, inadequate CRRT device settings), was easily achieved with the new device. Instructors rated the device as user friendly and well adapted to the reality being simulated. During scenarios, students were not aware of the simulator and considered that the simulated clinical conditions were realistic. Our device may be very useful for training healthcare professionals in CRRT management, thus avoiding risk to patients.

  2. Instructional environments for simulations.

    NARCIS (Netherlands)

    van Berkum, J.J.A.; de Jong, T.

    1991-01-01

    The use of computer simulations in education and training can have substantial advantages over other approaches. In comparison with alternatives such as textbooks, lectures, and tutorial courseware, a simulation-based approach offers the opportunity to learn in a relatively realistic problem-solving

  3. Instructional environments for simulations

    NARCIS (Netherlands)

    van Berkum, Jos J.A.; de Jong, Anthonius J.M.

    1991-01-01

    The use of computer simulations in education and training can have substantial advantages over other approaches. In comparison with alternatives such as textbooks, lectures, and tutorial courseware, a simulation-based approach offers the opportunity to learn in a relatively realistic problem-solving

  4. The neural basis of event simulation: an FMRI study.

    Directory of Open Access Journals (Sweden)

    Yukihito Yomogida

    Full Text Available Event simulation (ES is the situational inference process in which perceived event features such as objects, agents, and actions are associated in the brain to represent the whole situation. ES provides a common basis for various cognitive processes, such as perceptual prediction, situational understanding/prediction, and social cognition (such as mentalizing/trait inference. Here, functional magnetic resonance imaging was used to elucidate the neural substrates underlying important subdivisions within ES. First, the study investigated whether ES depends on different neural substrates when it is conducted explicitly and implicitly. Second, the existence of neural substrates specific to the future-prediction component of ES was assessed. Subjects were shown contextually related object pictures implying a situation and performed several picture-word-matching tasks. By varying task goals, subjects were made to infer the implied situation implicitly/explicitly or predict the future consequence of that situation. The results indicate that, whereas implicit ES activated the lateral prefrontal cortex and medial/lateral parietal cortex, explicit ES activated the medial prefrontal cortex, posterior cingulate cortex, and medial/lateral temporal cortex. Additionally, the left temporoparietal junction plays an important role in the future-prediction component of ES. These findings enrich our understanding of the neural substrates of the implicit/explicit/predictive aspects of ES-related cognitive processes.

  5. Numerical simulation of a winter hailstorm event over Delhi, India on 17 January 2013

    Science.gov (United States)

    Chevuturi, A.; Dimri, A. P.; Gunturu, U. B.

    2014-09-01

    This study analyzes the cause of a rare occurrence of a winter hailstorm over New Delhi/NCR (National Capital Region), India. In the absence of increased surface temperature or low-level moisture incursion during winter, the deep convection required for sustaining a hailstorm cannot be generated. Consequently, NCR shows very few cases of hailstorms in the months of December-January-February, making winter hail formation a question of interest. For this study, a recent winter hailstorm event on 17 January 2013 (16:00-18:00 UTC) occurring over NCR is investigated. The storm is simulated using the Weather Research and Forecasting (WRF) model with the Goddard Cumulus Ensemble (GCE) microphysics scheme with two different options, hail or graupel. The aim of the study is to understand and describe the cause of the hailstorm event over NCR through a comparative analysis of the two options of GCE microphysics. On evaluating the model simulations, it is observed that the hail option shows precipitation intensity closer to the TRMM observations than the graupel option does and is able to simulate hail precipitation. Using the model-simulated output with the hail option, a detailed investigation of the dynamics of the hailstorm is performed. The analysis based on the numerical simulation suggests that deep instability in the atmospheric column led to the formation of hailstones, as the cloud formation reached up to the glaciated zone, promoting ice nucleation. In winter, such instability conditions rarely form; they require low-level available potential energy and moisture incursion along with upper-level baroclinic instability due to the presence of a western disturbance (WD). Such a rare configuration is found to lower the tropopause and increase the temperature gradient, leading to winter hailstorm formation.

  6. Simulation of the catastrophic floods caused by extreme rainfall events - Uh River basin case study

    OpenAIRE

    Pekárová, Pavla; Halmová, Dana; Mitková, Veronika

    2005-01-01

    The extreme rainfall events in Central and East Europe in August 2002 raise the question of how other basins would respond to such rainfall situations. Such theorisation helps us to arrange in advance the necessary activities in the basin to reduce the consequences of the assumed disaster. The aim of the study is to recognise the reaction of the Uh River basin (Slovakia, Ukraine) to the simulated catastrophic rainfall events from August 2002. Two precipitation scenarios, sc1 and sc2, were created. Th...

  7. A Local Realistic Reconciliation of the EPR Paradox

    Science.gov (United States)

    Sanctuary, Bryan

    2014-03-01

    The exact violation of Bell's Inequalities is obtained with a local realistic model for spin. The model treats one particle that comprises a quantum ensemble and simulates the EPR data one coincidence at a time as a product state. Such a spin is represented by operators σx , iσy ,σz in its body frame rather than the usual set of σX ,σY ,σZ in the laboratory frame. This model, assumed valid in the absence of a measuring probe, contains both quantum polarizations and coherences. Each carries half the EPR correlation, but only half can be measured using coincidence techniques. The model further predicts the filter angles that maximize the spin correlation in EPR experiments.

  8. Turbulence studies in tokamak boundary plasmas with realistic divertor geometry

    International Nuclear Information System (INIS)

    Xu, X.Q.; Cohen, R.H.; Porter, G.D.; Rognlien, T.; Ryutov, D.D.; Myra, J.R.; D'Ippolito, D.A.; Moyer, R.; Groebner, R.J.

    2001-01-01

    Results are presented from the 3D nonlocal electromagnetic turbulence code BOUT and the linearized shooting code BAL for studies of turbulence in tokamak boundary plasmas and its relationship to the L-H transition, in a realistic divertor plasma geometry. The key results include: (1) the identification of the dominant resistive X-point mode in divertor geometry and (2) turbulence suppression in the L-H transition by shear in the E x B drift speed, ion diamagnetism and finite polarization. Based on the simulation results, a parameterization of the transport is given that includes the dependence on the relevant physical parameters. (author)

  9. Turbulence studies in tokamak boundary plasmas with realistic divertor geometry

    International Nuclear Information System (INIS)

    Xu, X.Q.; Cohen, R.H.; Porter, G.D.; Rognlien, T.D.; Ryutov, D.D.; Myra, J.R.; D'Ippolito, D.A.; Moyer, R.; Groebner, R.J.

    1999-01-01

    Results are presented from the 3D nonlocal electromagnetic turbulence code BOUT and the linearized shooting code BAL for studies of turbulence in tokamak boundary plasmas and its relationship to the L-H transition, in a realistic divertor plasma geometry. The key results include: (1) the identification of the dominant resistive X-point mode in divertor geometry and (2) turbulence suppression in the L-H transition by shear in the E x B drift speed, ion diamagnetism and finite polarization. Based on the simulation results, a parameterization of the transport is given that includes the dependence on the relevant physical parameters. (author)

  10. Multi-spacecraft observations and transport simulations of solar energetic particles for the May 17th 2012 event

    Science.gov (United States)

    Battarbee, M.; Guo, J.; Dalla, S.; Wimmer-Schweingruber, R.; Swalwell, B.; Lawrence, D. J.

    2018-05-01

    Context. The injection, propagation and arrival of solar energetic particles (SEPs) during eruptive solar events is an important and current research topic of heliospheric physics. During the largest solar events, particles may have energies up to a few GeVs and sometimes even trigger ground-level enhancements (GLEs) at Earth. These large SEP events are best investigated through multi-spacecraft observations. Aims: We aim to study the first GLE-event of solar cycle 24, from 17th May 2012, using data from multiple spacecraft (SOHO, GOES, MSL, STEREO-A, STEREO-B and MESSENGER). These spacecraft are located throughout the inner heliosphere, at heliocentric distances between 0.34 and 1.5 astronomical units (au), covering nearly the whole range of heliospheric longitudes. Methods: We present and investigate sub-GeV proton time profiles for the event at several energy channels, obtained via different instruments aboard the above spacecraft. We investigated issues caused by magnetic connectivity, and present results of three-dimensional SEP propagation simulations. We gathered virtual time profiles and perform qualitative and quantitative comparisons with observations, assessed longitudinal injection and transport effects as well as peak intensities. Results: We distinguish different time profile shapes for well-connected and weakly connected observers, and find our onset time analysis to agree with this distinction. At select observers, we identify an additional low-energy component of Energetic Storm Particles (ESPs). Using well-connected observers for normalisation, our simulations are able to accurately recreate both time profile shapes and peak intensities at multiple observer locations. Conclusions: This synergetic approach combining numerical modelling with multi-spacecraft observations is crucial for understanding the propagation of SEPs within the interplanetary magnetic field. Our novel analysis provides valuable proof of the ability to simulate SEP propagation

  11. Generating realistic roofs over a rectilinear polygon

    KAUST Repository

    Ahn, Heekap

    2011-01-01

    Given a simple rectilinear polygon P in the xy-plane, a roof over P is a terrain over P whose faces are supported by planes through edges of P that make a dihedral angle π/4 with the xy-plane. In this paper, we introduce realistic roofs by imposing a few additional constraints. We investigate the geometric and combinatorial properties of realistic roofs, and show a connection with the straight skeleton of P. We show that the maximum possible number of distinct realistic roofs over P is the binomial coefficient ((n-4)/2 choose ⌊(n-4)/4⌋) when P has n vertices. We present an algorithm that enumerates a combinatorial representation of each such roof in O(1) time per roof without repetition, after O(n^4) preprocessing time. We also present an O(n^5)-time algorithm for computing a realistic roof with minimum height or volume. © 2011 Springer-Verlag.

  12. Two Hours of Teamwork Training Improves Teamwork in Simulated Cardiopulmonary Arrest Events.

    Science.gov (United States)

    Mahramus, Tara L; Penoyer, Daleen A; Waterval, Eugene M E; Sole, Mary L; Bowe, Eileen M

    2016-01-01

    Teamwork during cardiopulmonary arrest events is important for resuscitation. Teamwork improvement programs are usually lengthy. This study assessed the effectiveness of a 2-hour teamwork training program. A prospective, pretest/posttest, quasi-experimental design assessed the teamwork training program targeted to resident physicians, nurses, and respiratory therapists. Participants took part in a simulated cardiac arrest. After the simulation, participants and trained observers assessed perceptions of teamwork using the Team Emergency Assessment Measure (TEAM) tool (ratings of 0 [low] to 4 [high]). A debriefing and 45 minutes of teamwork education followed. Participants then took part in a second simulated cardiac arrest scenario. Afterward, participants and observers assessed teamwork. Seventy-three team members participated: resident physicians (25%), registered nurses (32%), and respiratory therapists (41%). The physicians had significantly less experience on code teams. Observer-rated teamwork scores were 2.57 to 2.72. Participants' mean (SD) scores on the TEAM tool for the first and second simulations were 3.2 (0.5) and 3.7 (0.4), respectively, a significant improvement. The 2-hour teamwork educational intervention resulted in improved perceptions of teamwork behaviors. Participants reported that interactions with other disciplines, teamwork behavior education, and debriefing sessions were beneficial for enhancing the program.

  13. COCOA: Simulating Observations of Star Cluster Simulations

    Science.gov (United States)

    Askar, Abbas; Giersz, Mirek; Pych, Wojciech; Dalessandro, Emanuele

    2017-03-01

    COCOA (Cluster simulatiOn Comparison with ObservAtions) creates idealized mock photometric observations using results from numerical simulations of star cluster evolution. COCOA is able to present the output of realistic numerical simulations of star clusters carried out using Monte Carlo or N-body codes in a way that is useful for direct comparison with photometric observations. The code can simulate optical observations from simulation snapshots in which positions and magnitudes of objects are known. The parameters for simulating the observations can be adjusted to mimic telescopes of various sizes. COCOA also has a photometry pipeline that can use standalone versions of DAOPHOT (ascl:1104.011) and ALLSTAR to produce photometric catalogs for all observed stars.

  14. Gravitational Reference Sensor Front-End Electronics Simulator for LISA

    International Nuclear Information System (INIS)

    Meshksar, Neda; Ferraioli, Luigi; Mance, Davor; Zweifel, Peter; Giardini, Domenico; Ten Pierick, Jan

    2017-01-01

    At the ETH Zurich we are developing a modular simulator that provides a realistic simulation of the Front End Electronics (FEE) for LISA Gravitational Reference Sensor (GRS). It is based on the GRS FEE-simulator already implemented for LISA Pathfinder. It considers, in particular, the non-linearity and the critical details of hardware, such as the non-linear multiplicative noise caused by voltage reference instability, test mass charging and detailed actuation and sensing algorithms. We present the simulation modules, considering the above-mentioned features. Based on the ETH GRS FEE-simulator for LISA Pathfinder we aim to develop a modular simulator that provides a realistic simulation of GRS FEE for LISA. (paper)

  15. [A new age of mass casuality education? : The InSitu project: realistic training in virtual reality environments].

    Science.gov (United States)

    Lorenz, D; Armbruster, W; Vogelgesang, C; Hoffmann, H; Pattar, A; Schmidt, D; Volk, T; Kubulus, D

    2016-09-01

    Chief emergency physicians are regarded as an important element in the care of the injured and sick following mass casualty accidents. Their education is very theoretical; practical content, in contrast, often falls short. The limitations are usually the very high costs of realistic (large-scale) exercises, poor reproducibility of the scenarios, and poor corresponding results. Because of the complexity of mass casualty accidents, modified training concepts are required to substantially improve the educational level; these concepts must teach not only the theoretical but above all the practical skills considerably more intensively than at present. Modern training concepts should make it possible for the learner to realistically simulate decision processes. This article examines how interactive virtual environments are applicable for the education of emergency personnel and how they could be designed. Virtual simulation and training environments offer the possibility of simulating complex situations in an adequately realistic manner. The so-called virtual reality (VR) used in this context is an interface technology that enables free interaction in addition to a stereoscopic and spatial representation of virtual large-scale emergencies in a virtual environment. Variables in scenarios, such as the weather, the number of wounded, and the availability of resources, can be changed at any time. The trainees are able to practice the procedures in many virtual accident scenes and act them out repeatedly, thereby testing the different variants. With the aid of the "InSitu" project, it is possible to train in a virtual reality with realistically reproduced accident situations. These integrated, interactive training environments can depict very complex situations on a scale of 1:1. Because of the highly developed interactivity, the trainees can feel as if they are a direct part of the accident scene and therefore identify much more with the virtual world than is possible with desktop systems.

  16. Trunk muscle recruitment patterns in simulated precrash events.

    Science.gov (United States)

    Ólafsdóttir, Jóna Marín; Fice, Jason B; Mang, Daniel W H; Brolin, Karin; Davidsson, Johan; Blouin, Jean-Sébastien; Siegmund, Gunter P

    2018-02-28

    To quantify trunk muscle activation levels during whole body accelerations that simulate precrash events in multiple directions and to identify recruitment patterns for the development of active human body models. Four subjects (1 female, 3 males) were accelerated at 0.55 g (net Δv = 4.0 m/s) in 8 directions while seated on a sled-mounted car seat to simulate a precrash pulse. Electromyographic (EMG) activity in 4 trunk muscles was measured using wire electrodes inserted into the left rectus abdominis, internal oblique, iliocostalis, and multifidus muscles at the L2-L3 level. Muscle activity evoked by the perturbations was normalized by each muscle's isometric maximum voluntary contraction (MVC) activity. Spatial tuning curves were plotted at 150, 300, and 600 ms after acceleration onset. EMG activity remained below 40% MVC for the three time points for most directions. At the 150- and 300 ms time points, the highest EMG amplitudes were observed during perturbations to the left (-90°) and left rearward (-135°). EMG activity diminished by 600 ms for the anterior muscles, but not for the posterior muscles. These preliminary results suggest that trunk muscle activity may be directionally tuned at the acceleration level tested here. Although data from more subjects are needed, these preliminary data support the development of modeled trunk muscle recruitment strategies in active human body models that predict occupant responses in precrash scenarios.

  17. Mini Combat Trauma Patient Simulation System Defense Acquisition Challenge Program (DACP): Mini Combat Trauma Patient Simulation (Mini CTPS)

    National Research Council Canada - National Science Library

    2004-01-01

    .... It consists of networked realistic casualty generators, patient simulators and computer-based casualty simulations, virtual patients and equipment, data and sensor recorders, and an After- Action Review System...

  18. Suitability of simple rheological laws for the numerical simulation of dense pyroclastic flows and long-runout volcanic avalanches

    Science.gov (United States)

    Kelfoun, Karim

    2011-08-01

    The rheology of volcanic rock avalanches and dense pyroclastic flows is complex, and it is difficult at present to constrain the physics of their processes. The problem lies in defining the most suitable parameters for simulating the behavior of these natural flows. Existing models are often based on the Coulomb rheology, sometimes with a velocity-dependent stress (e.g., Voellmy), but other laws have also been used. Here I explore the characteristics of flows, and their deposits, obtained on simplified topographies by varying source conditions and rheology. The Coulomb rheology, irrespective of whether there is a velocity-dependent stress, forms cone-shaped deposits that do not resemble those of natural long-runout events. A purely viscous or a purely turbulent flow can achieve realistic velocities and thicknesses but cannot form a deposit on slopes. The plastic rheology, with (e.g., Bingham) or without a velocity-dependent stress, is more suitable for the simulation of dense pyroclastic flows and long-runout volcanic avalanches. With this rheology, numerical flows form by pulses, which are often observed during natural flow emplacement. The flows exhibit realistic velocities and deposits of realistic thicknesses. The plastic rheology is also able to generate the frontal lobes and lateral levées which are commonly observed in the field. With the plastic rheology, levée formation occurs at the flow front due to a divergence of the driving stresses at the edges. Once formed, the levées then channel the remaining flow mass. The results should help future modelers of volcanic flows with their choice of which mechanical law corresponds best to the event they are studying.
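
    The contrast the abstract draws between Coulomb and plastic (constant retarding stress) rheologies can be sketched with a simple depth-averaged stopping criterion; the density, friction angle, and yield stress below are illustrative assumptions, not values from the paper.

```python
# Sketch of depth-averaged stopping criteria for two simple rheologies
# (illustrative values; not the paper's numerical model).
import numpy as np

RHO, G = 1500.0, 9.81          # bulk density (kg/m3), gravity (m/s2)
PHI = np.radians(30.0)         # Coulomb basal friction angle (assumed)
TAU_Y = 50e3                   # constant retarding (yield) stress, Pa (assumed)

def coulomb_moving(h, slope_deg):
    """Coulomb: driving stress rho*g*h*sin(theta) vs rho*g*h*cos(theta)*tan(phi).
    Thickness cancels, so motion depends on slope alone."""
    return np.tan(np.radians(slope_deg)) > np.tan(PHI)

def plastic_moving(h, slope_deg):
    """Plastic: driving stress rho*g*h*sin(theta) vs a constant yield stress."""
    return RHO * G * h * np.sin(np.radians(slope_deg)) > TAU_Y

for h in (1.0, 5.0, 20.0):     # flow thickness in metres, slope of 10 degrees
    print(h, coulomb_moving(h, 10.0), plastic_moving(h, 10.0))
```

    Because thickness cancels in the Coulomb balance, a Coulomb flow stops wherever the slope falls below the friction angle regardless of how thick it is, whereas a plastic flow keeps moving only while it is thick enough, which is why it can leave deposits of finite thickness on slopes.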

  19. CATCC/AATCC Simulator

    Data.gov (United States)

    Federal Laboratory Consortium — The 15G30 CATCC/AATCC simulator provides high fidelity training for Navy Air Traffic Control (ATC) trainees in a realistic shipboard air traffic control environment....

  20. Dynamics and predictions in the co-event interpretation

    International Nuclear Information System (INIS)

    Ghazi-Tabatabai, Yousef; Wallden, Petros

    2009-01-01

    Sorkin has introduced a new, observer independent, interpretation of quantum mechanics that can give a successful realist account of the 'quantum micro-world' as well as explaining how classicality emerges at the level of observable events for a range of systems including single time 'Copenhagen measurements'. This 'co-event interpretation' presents us with a new ontology, in which a single 'co-event' is real. A new ontology necessitates a review of the dynamical and predictive mechanism of a theory, and in this paper we begin the process by exploring means of expressing the dynamical and predictive content of histories theories in terms of co-events

  1. Kuhn: Realist or Antirealist?

    Directory of Open Access Journals (Sweden)

    Michel Ghins

    1998-06-01

    Although Kuhn is much more an antirealist than a realist, the earlier and later articulations of realist and antirealist ingredients in his views merit close scrutiny. What are the constituents of the real invariant World posited by Kuhn and its relation to the mutable paradigm-related worlds? Various proposed solutions to this problem (dubbed the "new-world problem" by Ian Hacking) are examined and shown to be unsatisfactory. In The Structure of Scientific Revolutions, the stable World can reasonably be taken to be made up of ordinary perceived objects, whereas in Kuhn's later works the transparadigmatic World is identified with something akin to the Kantian world-in-itself. It is argued that both proposals are beset with insuperable difficulties which render Kuhn's earlier and later versions of antirealism implausible.

  2. Methodology Development of Computationally-Efficient Full Vehicle Simulations for the Entire Blast Event

    Science.gov (United States)

    2015-08-06

    Performing organization: Altair Engineering, 888 W Big Beaver Road #402, Troy MI 48084. ...soldiers, it is imperative to analyze the impact of each sub-event on soldier injuries. Using traditional finite element analysis techniques [1-6] to ...CONSTRAINED_LAGRANGE_IN_SOLID) and the results from another commonly used non-linear explicit solver for impact simulations (RADIOSS, [4]) using a coupling

  3. Model for Simulation of Atmospheric Turbulence

    DEFF Research Database (Denmark)

    Lundtang Petersen, Erik

    1976-01-01

    A method that produces realistic simulations of atmospheric turbulence is developed and analyzed. The procedure makes use of a generalized spectral analysis, often called a proper orthogonal decomposition or the Karhunen-Loève expansion. A set of criteria, emphasizing a realistic appearance...... eigenfunctions and estimates of the distributions of the corresponding expansion coefficients. The simulation method utilizes the eigenfunction expansion procedure to produce preliminary time histories of the three velocity components simultaneously. As a final step, a spectral shaping procedure is then applied....... The method is unique in modeling the three velocity components simultaneously, and it is found that important cross-statistical features are reasonably well-behaved. It is concluded that the model provides a practical, operational simulator of atmospheric turbulence....
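
    A minimal numerical sketch of the Karhunen-Loève (proper orthogonal decomposition) idea described above, for a single velocity component with an assumed exponential covariance rather than the paper's atmospheric statistics: the eigenvectors of the covariance matrix act as basis functions, and realizations are sums of these weighted by random coefficients whose variances are the eigenvalues.

```python
# Karhunen-Loeve (proper orthogonal decomposition) sketch for simulating one
# stationary velocity component with an assumed exponential covariance.
import numpy as np

n, dt, t_corr, variance = 512, 0.1, 5.0, 1.0
t = np.arange(n) * dt
cov = variance * np.exp(-np.abs(t[:, None] - t[None, :]) / t_corr)

# Eigen-decomposition of the covariance matrix gives the KL basis.
eigvals, eigvecs = np.linalg.eigh(cov)
eigvals = np.clip(eigvals, 0.0, None)            # guard against round-off

def kl_realization(rng):
    """One simulated time history: eigenvectors times N(0, eigenvalue) coefficients."""
    coeffs = rng.normal(size=n) * np.sqrt(eigvals)
    return eigvecs @ coeffs

rng = np.random.default_rng(1)
u = kl_realization(rng)                          # synthetic turbulence-like series
```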

  4. Simulation of dense colloids

    NARCIS (Netherlands)

    Herrmann, H.J.; Harting, J.D.R.; Hecht, M.; Ben-Naim, E.

    2008-01-01

    We present in these proceedings recent large-scale simulations of dense colloids. On the one hand, we simulate model clay consisting of nanometric aluminum oxide spheres in water using realistic DLVO potentials and a combination of MD and SRD. We find pronounced cluster formation and retrieve the shear

  5. DECISION WITH ARTIFICIAL NEURAL NETWORKS IN DISCRETE EVENT SIMULATION MODELS ON A TRAFFIC SYSTEM

    Directory of Open Access Journals (Sweden)

    Marília Gonçalves Dutra da Silva

    2016-04-01

    This work aims to demonstrate the use of a mechanism, applied in the development of discrete-event simulation models, that performs decision operations through the implementation of an artificial neural network. Actions that involve complex operations performed by a human agent in a process, for example, are often modeled in simplified form with the usual mechanisms of simulation software. Therefore, a traffic system controlled by a traffic officer, with a flow of vehicles and pedestrians, was chosen to demonstrate the proposed solution. From a module built in the simulation software itself, it was possible to connect the algorithm for intelligent decision-making to the simulation model. The results showed that the model responded as expected when subjected to actions that required different decisions to maintain the operation of the system under changes in the flow of people and vehicles.

  6. Comparative Effectiveness of Tacrolimus-Based Steroid Sparing versus Steroid Maintenance Regimens in Kidney Transplantation: Results from Discrete Event Simulation.

    Science.gov (United States)

    Desai, Vibha C A; Ferrand, Yann; Cavanaugh, Teresa M; Kelton, Christina M L; Caro, J Jaime; Goebel, Jens; Heaton, Pamela C

    2017-10-01

    Corticosteroids used as immunosuppressants to prevent acute rejection (AR) and graft loss (GL) following kidney transplantation are associated with serious cardiovascular and other adverse events. Evidence from short-term randomized controlled trials suggests that many patients on a tacrolimus-based immunosuppressant regimen can withdraw from steroids without increased AR or GL risk. To measure the long-term tradeoff between GL and adverse events for a heterogeneous-risk population and determine the optimal timing of steroid withdrawal. A discrete event simulation was developed including, as events, AR, GL, myocardial infarction (MI), stroke, cytomegalovirus, and new onset diabetes mellitus (NODM), among others. Data from the United States Renal Data System were used to estimate event-specific parametric regressions, which accounted for steroid-sparing regimen (avoidance, early 7-d withdrawal, 6-mo withdrawal, 12-mo withdrawal, and maintenance) as well as patients' demographics, immunologic risks, and comorbidities. Regression-equation results were used to derive individual time-to-event Weibull distributions, used, in turn, to simulate the course of patients over 20 y. Patients on steroid avoidance or an early-withdrawal regimen were more likely to experience AR (45.9% to 55.0% v. 33.6%, P events and other outcomes with no worsening of AR or GL rates compared with steroid maintenance.
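
    A hedged sketch of the simulation mechanics described above (not the authors' model or the USRDS-derived parameters): each competing event receives an individual Weibull time-to-event draw, and the earliest draw determines which event occurs next; the shape and scale values below are placeholders.

```python
# Sketch of one discrete-event step: draw competing Weibull times-to-event and
# take the earliest. Shape/scale parameters are placeholders, not USRDS estimates.
import numpy as np

rng = np.random.default_rng(42)

# event name -> (Weibull shape k, scale lambda in years); illustrative only
events = {
    "acute_rejection": (0.9, 8.0),
    "graft_loss":      (1.1, 15.0),
    "myocardial_inf":  (1.0, 20.0),
    "new_onset_dm":    (1.2, 12.0),
}

def next_event(events, rng):
    """Draw a time for each event and return the earliest one."""
    draws = {name: lam * rng.weibull(k) for name, (k, lam) in events.items()}
    name = min(draws, key=draws.get)
    return name, draws[name]

name, t = next_event(events, rng)
print(f"first simulated event: {name} at {t:.2f} years")
```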

  7. Simulation of the lateral pole-impact on a crash simulation system; Simulation des seitlichen Pfahlaufpralls auf einer Katapultanlage

    Energy Technology Data Exchange (ETDEWEB)

    Hoegner, C.; Gajewski, M.; Zippel, I. [ACTS GmbH und Co. KG, Sailauf (Germany)

    2001-07-01

    The test set-up for the simulation of a lateral pole impact has the following characteristics: - Simulation of pole impact procedures Euro-NCAP and others - Very realistic reproduction of the deformation performance of side panels (with simulated or original part assemblies) - Consideration of the dynamic displacement of the cant rail - Seat displacement and deformation - Realistic depiction of the vehicle displacement (i.e. car-to-pole) - Test velocity: up to 50 km/h - Intrusion: up to 550 mm (more also possible on demand) - Pole diameter 254 mm (variable). This test system design results in variable set-up possibilities: - Test-set up possible from very simple (on the basis of simulation data without real parts) to very complex (use of side panels and interior door trim) - Support of different protection systems (window bag and side bag) - A maximum of two occupants feasible (set up weight max. 300 kg) - Optimal camera perspectives, stationary or onboard - Front as well as rear side impact simulation possible - Front-rear interaction of protection systems feasible (two dummies). A test system has been realised with which the complex process in a lateral pole impact can be simulated with excellent approximation and with relatively simple means. Due to the avoidance of point validation this methodology can be ideally implemented for development tests. (orig.)

  8. Study of Event Topology for a new Fast Primary Vertex Finder for the ATLAS Trigger

    CERN Document Server

    AUTHOR|(SzGeCERN)739389; The ATLAS collaboration

    2016-01-01

    This document presents a transform-based approach to primary vertex finding and a feasibility analysis. The feasibility analysis first shows the theoretical distinguishability of different signal events and pileup with a metric devised for this purpose. The results show high distinguishability for the majority of event types, with expectedly low distinguishability for special cases. The algorithm is intended for use in the high level trigger. At this stage of computation, event types can be distinguished through the trigger, allowing this algorithm to be chosen only for appropriate events. An implementation of the algorithm with different, increasingly realistic settings shows the impact of the different factors on efficiency. With realistic settings, distinguishability only reduces by a small margin, remaining for applicable events between 95% and 100% depending on the scenario. By gradually increasing the degree of realism of the setting, efficient countermeasures could be devised for different problems, which are al...
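
    The record does not give the algorithm's details, so the toy sketch below only illustrates the general flavour of histogram-style primary-vertex finding: track longitudinal impact parameters (z0) are binned along the beam line and the highest peak is taken as the vertex candidate; the track sample and bin width are assumptions, not the ATLAS implementation.

```python
# Toy illustration of histogram-based primary-vertex finding (not the ATLAS
# algorithm): bin track z0 values along the beam line and pick the peak.
import numpy as np

rng = np.random.default_rng(7)
# Hypothetical event: 40 signal tracks from a vertex at z = +23 mm plus
# 200 pileup tracks spread along the luminous region (sigma_z ~ 50 mm).
z0 = np.concatenate([rng.normal(23.0, 0.5, 40), rng.normal(0.0, 50.0, 200)])

bin_width = 1.0                                   # mm, assumed
edges = np.arange(-150.0, 150.0 + bin_width, bin_width)
counts, _ = np.histogram(z0, bins=edges)
peak = np.argmax(counts)
z_vertex = 0.5 * (edges[peak] + edges[peak + 1])  # bin centre of the peak
print(f"candidate primary vertex at z = {z_vertex:.1f} mm")
```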

  9. StochKit2: software for discrete stochastic simulation of biochemical systems with events.

    Science.gov (United States)

    Sanft, Kevin R; Wu, Sheng; Roh, Min; Fu, Jin; Lim, Rone Kwei; Petzold, Linda R

    2011-09-01

    StochKit2 is the first major upgrade of the popular StochKit stochastic simulation software package. StochKit2 provides highly efficient implementations of several variants of Gillespie's stochastic simulation algorithm (SSA), and tau-leaping with automatic step size selection. StochKit2 features include automatic selection of the optimal SSA method based on model properties, event handling, and automatic parallelism on multicore architectures. The underlying structure of the code has been completely updated to provide a flexible framework for extending its functionality. StochKit2 runs on Linux/Unix, Mac OS X and Windows. It is freely available under GPL version 3 and can be downloaded from http://sourceforge.net/projects/stochkit/. petzold@engineering.ucsb.edu.
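
    For readers unfamiliar with the stochastic simulation algorithm that StochKit2 implements, here is a minimal direct-method Gillespie sketch for a toy reversible dimerization; it illustrates the algorithm only and is unrelated to StochKit2's own code or interfaces.

```python
# Minimal direct-method Gillespie SSA for a toy system: A + A <-> A2.
import numpy as np

def ssa(x_a, x_a2, k_f, k_r, t_end, rng):
    t, traj = 0.0, [(0.0, x_a, x_a2)]
    while t < t_end:
        a1 = k_f * x_a * (x_a - 1) / 2.0      # propensity of dimerization
        a2 = k_r * x_a2                       # propensity of dissociation
        a0 = a1 + a2
        if a0 == 0.0:
            break
        t += rng.exponential(1.0 / a0)        # time to the next reaction
        if rng.uniform() * a0 < a1:           # choose which reaction fires
            x_a, x_a2 = x_a - 2, x_a2 + 1
        else:
            x_a, x_a2 = x_a + 2, x_a2 - 1
        traj.append((t, x_a, x_a2))
    return traj

trajectory = ssa(x_a=100, x_a2=0, k_f=0.002, k_r=0.1, t_end=50.0,
                 rng=np.random.default_rng(3))
```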

  10. Simulating the energy deposits of particles in the KASCADE-grande detector stations as a preliminary step for EAS event reconstruction

    International Nuclear Information System (INIS)

    Toma, G.; Brancus, I.M.; Mitrica, B.; Sima, O.; Rebel, H.; Haungs, A.

    2005-01-01

    The study of primary cosmic rays with energies higher than 10^14 eV is done mostly by indirect observation techniques such as the study of Extensive Air Showers (EAS). Within the much larger framework effort of inferring the mass and energy of the primaries from EAS observables, the present study aims at developing a versatile method and software tool to reconstruct lateral particle densities from the energy deposits of particles in the KASCADE-Grande detector stations. The study has been performed on simulated events, taking into account the interaction of the EAS components with the detector array (energy deposits). The energy deposits have been simulated using the GEANT code and then parametrized for different incident energies and angles of EAS particles. Thus the results obtained for simulated events have the same level of consistency as the experimental data. This technique will allow an increased speed of lateral particle density reconstruction when studying real events detected by the KASCADE-Grande array. The particle densities in the detectors have been reconstructed from the energy deposits. A correlation between lateral particle density (at ∼600 m from the shower core) and primary mass and primary energy has been established. The study puts great emphasis on the quality of reconstruction and also on the speed of the technique. The data obtained from the study of simulated events create the basis for the next stage, the study of real events detected by the KASCADE-Grande array. (authors)

  11. A hadron-nucleus collision event generator for simulations at intermediate energies

    CERN Document Server

    Ackerstaff, K; Bollmann, R

    2002-01-01

    Several available codes for hadronic event generation and shower simulation are discussed and their predictions are compared to experimental data in order to obtain a satisfactory description of hadronic processes in Monte Carlo studies of detector systems for medium energy experiments. The most reasonable description is found for the intra-nuclear-cascade (INC) model of Bertini, which employs a microscopic description of the INC, taking into account elastic and inelastic pion-nucleon and nucleon-nucleon scattering. The isobar model of Sternheimer and Lindenbaum is used to simulate the inelastic elementary collisions inside the nucleus via formation and decay of the Δ33 resonance which, however, limits the model at higher energies. To overcome this limitation, the INC model has been extended by using the resonance model of the HADRIN code, considering all resonances in elementary collisions contributing more than 2% to the total cross-section up to kinetic energies of 5 GeV. In addition, angular d...

  12. Developing a discrete event simulation model for university student shuttle buses

    Science.gov (United States)

    Zulkepli, Jafri; Khalid, Ruzelan; Nawawi, Mohd Kamal Mohd; Hamid, Muhammad Hafizan

    2017-11-01

    Providing shuttle buses for university students to attend their classes is crucial, especially when their number is large and the distances between their classes and residential halls are far. These factors, in addition to the non-optimal current bus services, typically require the students to wait longer, which eventually opens a space for them to complain. To considerably reduce the waiting time, it is thus important to provide the optimal number of buses to transport students from location to location, together with effective route schedules that fulfil the students' demand in the relevant time ranges. The optimal bus number and schedules are to be determined and tested using a flexible decision platform. This paper thus models the current services of student shuttle buses in a university using a Discrete Event Simulation approach. The model can flexibly simulate whatever changes are configured to the current system and report their effects on the performance measures. How the model was conceptualized and formulated for future system configurations is the main interest of this paper.
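
    The paper does not state its implementation, so the sketch below uses the open-source SimPy library to show how such a shuttle-bus model is typically structured: Poisson student arrivals queue at a stop, buses with a fixed capacity arrive on a fixed headway, and waiting times are recorded; all rates and capacities are assumed.

```python
# Sketch of a shuttle-bus DES in SimPy (assumed parameters, not the paper's model).
import random
import simpy

ARRIVAL_RATE = 2.0      # students per minute (assumed)
HEADWAY = 15.0          # minutes between buses (assumed)
CAPACITY = 40           # seats per bus (assumed)

def student_arrivals(env, queue):
    while True:
        yield env.timeout(random.expovariate(ARRIVAL_RATE))
        queue.append(env.now)                 # record one student's arrival time

def bus_service(env, queue, waits):
    while True:
        yield env.timeout(HEADWAY)
        boarding = queue[:CAPACITY]           # first-come first-served boarding
        del queue[:CAPACITY]
        waits.extend(env.now - t for t in boarding)

random.seed(0)
env = simpy.Environment()
queue, waits = [], []
env.process(student_arrivals(env, queue))
env.process(bus_service(env, queue, waits))
env.run(until=8 * 60)                         # one 8-hour operating day
print(f"mean wait: {sum(waits) / len(waits):.1f} min over {len(waits)} boardings")
```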

  13. A SAS-based solution to evaluate study design efficiency of phase I pediatric oncology trials via discrete event simulation.

    Science.gov (United States)

    Barrett, Jeffrey S; Jayaraman, Bhuvana; Patel, Dimple; Skolnik, Jeffrey M

    2008-06-01

    Previous exploration of oncology study design efficiency has focused on Markov processes alone (probability-based events) without consideration for time dependencies. Barriers to study completion include time delays associated with patient accrual, inevaluability (IE), time to dose limiting toxicities (DLT) and administrative and review time. Discrete event simulation (DES) can incorporate probability-based assignment of DLT and IE frequency, correlated with cohort in the case of DLT, with time-based events defined by stochastic relationships. A SAS-based solution to examine study efficiency metrics and evaluate design modifications that would improve study efficiency is presented. Virtual patients are simulated with attributes defined from prior distributions of relevant patient characteristics. Study population datasets are read into SAS macros which select patients and enroll them into a study based on the specific design criteria if the study is open to enrollment. Waiting times, arrival times and time to study events are also sampled from prior distributions; post-processing of study simulations is provided within the decision macros and compared across designs in a separate post-processing algorithm. This solution is examined via comparison of the standard 3+3 decision rule relative to the "rolling 6" design, a newly proposed enrollment strategy for the phase I pediatric oncology setting.

  14. Beyond Iconic Simulation

    Science.gov (United States)

    Dormans, Joris

    2011-01-01

    Realism remains a prominent topic in game design and industry research; yet, a strong academic case can be made that games are anything but realistic. This article frames realism in games in semiotic terms as iconic simulation and argues that games can gain expressiveness when they move beyond the current focus on iconic simulation. In parallel…

  15. Simulating X-ray bursts during a transient accretion event

    Science.gov (United States)

    Johnston, Zac; Heger, Alexander; Galloway, Duncan K.

    2018-06-01

    Modelling of thermonuclear X-ray bursts on accreting neutron stars has to date focused on stable accretion rates. However, bursts are also observed during episodes of transient accretion. During such events, the accretion rate can evolve significantly between bursts, and this regime provides a unique test for burst models. The accretion-powered millisecond pulsar SAX J1808.4-3658 exhibits accretion outbursts every 2-3 yr. During the well-sampled month-long outburst of 2002 October, four helium-rich X-ray bursts were observed. Using this event as a test case, we present the first multizone simulations of X-ray bursts under a time-dependent accretion rate. We investigate the effect of using a time-dependent accretion rate in comparison to constant, averaged rates. Initial results suggest that using a constant, average accretion rate between bursts may underestimate the recurrence time when the accretion rate is decreasing, and overestimate it when the accretion rate is increasing. Our model, with an accreted hydrogen fraction of X = 0.44 and a CNO metallicity of Z_CNO = 0.02, reproduces the observed burst arrival times and fluences with root mean square (rms) errors of 2.8 h and 0.11×10⁻⁶ erg cm⁻², respectively. Our results support previous modelling that predicted two unobserved bursts and indicate that additional bursts were also missed by observations.

  16. Impact of Assimilation on Heavy Rainfall Simulations Using WRF Model: Sensitivity of Assimilation Results to Background Error Statistics

    Science.gov (United States)

    Rakesh, V.; Kantharao, B.

    2017-03-01

    Data assimilation is considered one of the effective tools for improving the forecast skill of mesoscale models. However, for optimum utilization and effective assimilation of observations, many factors need to be taken into account while designing the data assimilation methodology. One of the critical components that determines the amount and propagation of observation information into the analysis is the model background error statistics (BES). The objective of this study is to quantify how the BES used in data assimilation affect the simulation of heavy rainfall events over a southern state in India, Karnataka. Simulations of 40 heavy rainfall events were carried out using the Weather Research and Forecasting Model with and without data assimilation. The assimilation experiments were conducted using global and regional BES, while the experiment with no assimilation was used as the baseline for assessing the impact of data assimilation. The simulated rainfall is verified against high-resolution rain-gage observations over Karnataka. Statistical evaluation using several accuracy and skill measures shows that data assimilation has improved the heavy rainfall simulation. Our results showed that the experiment using regional BES outperformed the one which used global BES. Critical thermodynamic variables conducive to heavy rainfall, such as convective available potential energy, are simulated more realistically with regional BES than with global BES. It is pointed out that these results have important practical implications for the design of forecast platforms and for decision-making during extreme weather events.

  17. Coniferous Canopy BRF Simulation Based on 3-D Realistic Scene

    Science.gov (United States)

    Wang, Xin-yun; Guo, Zhi-feng; Qin, Wen-han; Sun, Guo-qing

    2011-01-01

    It is difficult for computer simulation methods to study the radiation regime at large scales, so a simplified coniferous model was investigated in the present study. It makes computer simulation methods such as L-systems and the radiosity-graphics combined method (RGM) more powerful for remote sensing of heterogeneous coniferous forests over a large-scale region. L-systems is applied to render 3-D coniferous forest scenarios, and the RGM model is used to calculate the BRF (bidirectional reflectance factor) in the visible and near-infrared regions. Results in this study show that in most cases both agreed well, at both the tree and the forest level.

  18. A PC-based discrete event simulation model of the civilian radioactive waste management system

    International Nuclear Information System (INIS)

    Airth, G.L.; Joy, D.S.; Nehls, J.W.

    1992-01-01

    This paper discusses a System Simulation Model which has been developed for the Department of Energy to simulate the movement of individual waste packages (spent fuel assemblies and fuel containers) through the Civilian Radioactive Waste Management System (CRWMS). A discrete event simulation language, GPSS/PC, which runs on an IBM/PC and operates under DOS 5.0, mathematically represents the movement and processing of radioactive waste packages through the CRWMS and the interaction of these packages with the equipment in the various facilities. The major features of the System Simulation Model are: the ability to reference characteristics of the different types of radioactive waste (age, burnup, etc.) in order to make operational and/or system design decisions, the ability to place stochastic variations on operational parameters such as processing time and equipment outages, and the ability to include a rigorous simulation of the transportation system. Output from the model includes the numbers, types, and characteristics of waste packages at selected points in the CRWMS and the extent to which various resources will be utilized in order to transport, process, and emplace the waste

  19. Core discrete event simulation model for the evaluation of health care technologies in major depressive disorder.

    Science.gov (United States)

    Vataire, Anne-Lise; Aballéa, Samuel; Antonanzas, Fernando; Roijen, Leona Hakkaart-van; Lam, Raymond W; McCrone, Paul; Persson, Ulf; Toumi, Mondher

    2014-03-01

    A review of existing economic models in major depressive disorder (MDD) highlighted the need for models with longer time horizons that also account for heterogeneity in treatment pathways between patients. A core discrete event simulation model was developed to estimate health and cost outcomes associated with alternative treatment strategies. This model simulated short- and long-term clinical events (partial response, remission, relapse, recovery, and recurrence), adverse events, and treatment changes (titration, switch, addition, and discontinuation) over up to 5 years. Several treatment pathways were defined on the basis of fictitious antidepressants with three levels of efficacy, tolerability, and price (low, medium, and high) from first line to third line. The model was populated with input data from the literature for the UK setting. Model outputs include time in different health states, quality-adjusted life-years (QALYs), and costs from National Health Service and societal perspectives. The codes are open source. Predicted costs and QALYs from this model are within the range of results from previous economic evaluations. The largest cost components from the payer perspective were physician visits and hospitalizations. Key parameters driving the predicted costs and QALYs were utility values, effectiveness, and frequency of physician visits. Differences in QALYs and costs between two strategies with different effectiveness increased approximately twofold when the time horizon increased from 1 to 5 years. The discrete event simulation model can provide a more comprehensive evaluation of different therapeutic options in MDD, compared with existing Markov models, and can be used to compare a wide range of health care technologies in various groups of patients with MDD. Copyright © 2014 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  20. The Effects of 3D Computer Simulation on Biology Students' Achievement and Memory Retention

    Science.gov (United States)

    Elangovan, Tavasuria; Ismail, Zurida

    2014-01-01

    A quasi experimental study was conducted for six weeks to determine the effectiveness of two different 3D computer simulation based teaching methods, that is, realistic simulation and non-realistic simulation on Form Four Biology students' achievement and memory retention in Perak, Malaysia. A sample of 136 Form Four Biology students in Perak,…

  1. The effects of indoor environmental exposures on pediatric asthma: a discrete event simulation model

    Directory of Open Access Journals (Sweden)

    Fabian M Patricia

    2012-09-01

    Background: In the United States, asthma is the most common chronic disease of childhood across all socioeconomic classes and is the most frequent cause of hospitalization among children. Asthma exacerbations have been associated with exposure to residential indoor environmental stressors such as allergens and air pollutants as well as numerous additional factors. Simulation modeling is a valuable tool that can be used to evaluate interventions for complex multifactorial diseases such as asthma but, in spite of its flexibility and applicability, modeling applications in either environmental exposures or asthma have been limited to date. Methods: We designed a discrete event simulation model to study the effect of environmental factors on asthma exacerbations in school-age children living in low-income multi-family housing. Model outcomes include asthma symptoms, medication use, hospitalizations, and emergency room visits. Environmental factors were linked to percent predicted forced expiratory volume in 1 second (FEV1%), which in turn was linked to risk equations for each outcome. Exposures affecting FEV1% included indoor and outdoor sources of NO2 and PM2.5, cockroach allergen, and dampness as a proxy for mold. Results: Model design parameters and equations are described in detail. We evaluated the model by simulating 50,000 children over 10 years and showed that pollutant concentrations and health outcome rates are comparable to values reported in the literature. In an application example, we simulated what would happen if the kitchen and bathroom exhaust fans were improved for the entire cohort, and showed reductions in pollutant concentrations and healthcare utilization rates. Conclusions: We describe the design and evaluation of a discrete event simulation model of pediatric asthma for children living in low-income multi-family housing. Our model simulates the effect of environmental factors (combustion pollutants and allergens

  2. Overcoming the Time Limitation in Molecular Dynamics Simulation of Crystal Nucleation: A Persistent-Embryo Approach

    Science.gov (United States)

    Sun, Yang; Song, Huajing; Zhang, Feng; Yang, Lin; Ye, Zhuo; Mendelev, Mikhail I.; Wang, Cai-Zhuang; Ho, Kai-Ming

    2018-02-01

    The crystal nucleation from liquid in most cases is too rare to be accessed within the limited time scales of the conventional molecular dynamics (MD) simulation. Here, we developed a "persistent embryo" method to facilitate crystal nucleation in MD simulations by preventing small crystal embryos from melting using external spring forces. We applied this method to the pure Ni case for a moderate undercooling where no nucleation can be observed in the conventional MD simulation, and obtained nucleation rate in good agreement with the experimental data. Moreover, the method is applied to simulate an even more sluggish event: the nucleation of the B2 phase in a strong glass-forming Cu-Zr alloy. The nucleation rate was found to be 8 orders of magnitude smaller than Ni at the same undercooling, which well explains the good glass formability of the alloy. Thus, our work opens a new avenue to study solidification under realistic experimental conditions via atomistic computer simulation.
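
    A minimal sketch of the core ingredient of the persistent-embryo idea, i.e., harmonic spring restraints that pull embryo atoms toward reference crystal positions; the spring constant and coordinates are placeholders, and in practice the restraint is evaluated inside the MD engine at every time step.

```python
# Sketch of the persistent-embryo restraint: harmonic springs pulling embryo
# atoms toward reference crystal positions (placeholder values only).
import numpy as np

def embryo_restraint(positions, references, k_spring):
    """Return restraint forces F = -k (r - r0) and the total restraint energy."""
    disp = positions - references
    forces = -k_spring * disp
    energy = 0.5 * k_spring * np.sum(disp ** 2)
    return forces, energy

rng = np.random.default_rng(5)
ref = rng.normal(size=(50, 3))                  # reference embryo lattice sites
pos = ref + rng.normal(scale=0.05, size=ref.shape)
F, E = embryo_restraint(pos, ref, k_spring=2.0)
```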

  3. Overcoming the Time Limitation in Molecular Dynamics Simulation of Crystal Nucleation: A Persistent-Embryo Approach.

    Science.gov (United States)

    Sun, Yang; Song, Huajing; Zhang, Feng; Yang, Lin; Ye, Zhuo; Mendelev, Mikhail I; Wang, Cai-Zhuang; Ho, Kai-Ming

    2018-02-23

    The crystal nucleation from liquid in most cases is too rare to be accessed within the limited time scales of the conventional molecular dynamics (MD) simulation. Here, we developed a "persistent embryo" method to facilitate crystal nucleation in MD simulations by preventing small crystal embryos from melting using external spring forces. We applied this method to the pure Ni case for a moderate undercooling where no nucleation can be observed in the conventional MD simulation, and obtained nucleation rate in good agreement with the experimental data. Moreover, the method is applied to simulate an even more sluggish event: the nucleation of the B2 phase in a strong glass-forming Cu-Zr alloy. The nucleation rate was found to be 8 orders of magnitude smaller than Ni at the same undercooling, which well explains the good glass formability of the alloy. Thus, our work opens a new avenue to study solidification under realistic experimental conditions via atomistic computer simulation.

  4. Dynamics and predictions in the co-event interpretation

    Energy Technology Data Exchange (ETDEWEB)

    Ghazi-Tabatabai, Yousef [Blackett Laboratory, Imperial College, London, SW7 2AZ (United Kingdom); Wallden, Petros [Raman Research Institute, Bangalore 560 080 (India)

    2009-06-12

    Sorkin has introduced a new, observer independent, interpretation of quantum mechanics that can give a successful realist account of the 'quantum micro-world' as well as explaining how classicality emerges at the level of observable events for a range of systems including single time 'Copenhagen measurements'. This 'co-event interpretation' presents us with a new ontology, in which a single 'co-event' is real. A new ontology necessitates a review of the dynamical and predictive mechanism of a theory, and in this paper we begin the process by exploring means of expressing the dynamical and predictive content of histories theories in terms of co-events.

  5. Development and verification of an efficient spatial neutron kinetics method for reactivity-initiated event analyses

    International Nuclear Information System (INIS)

    Ikeda, Hideaki; Takeda, Toshikazu

    2001-01-01

    A space/time nodal diffusion code based on the nodal expansion method (NEM), EPISODE, was developed in order to evaluate transient neutron behavior in light water reactor cores. The present code employs the improved quasistatic (IQS) method for spatial neutron kinetics, and the neutron flux distribution is obtained numerically by solving the neutron diffusion equation with a nonlinear iteration scheme to achieve fast computation. A predictor-corrector (PC) method developed in the present study made it possible to apply a coarser time mesh to the transient spatial neutron calculation than is applicable in the conventional IQS model, which further improved computational efficiency. Its computational advantage was demonstrated by applying it to numerical benchmark problems that simulate reactivity-initiated events, showing reductions in computational time of up to a factor of three compared with the conventional IQS. A thermohydraulics model was also incorporated in EPISODE, and the capability for realistic reactivity event analyses was verified using the SPERT-III/E-Core experimental data. (author)
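
    For context, the standard quasistatic factorization on which IQS methods are built (textbook form, not quoted from the paper) splits the flux into a rapidly varying amplitude and a slowly varying shape, with the amplitude advanced by point-kinetics-like equations:

```latex
% Standard improved quasistatic factorization (textbook form; notation assumed)
\phi(\mathbf{r},E,t) = p(t)\,\psi(\mathbf{r},E,t), \qquad
\frac{dp}{dt} = \frac{\rho(t)-\bar{\beta}(t)}{\Lambda(t)}\,p(t) + \sum_i \lambda_i c_i(t), \qquad
\frac{dc_i}{dt} = \frac{\bar{\beta}_i(t)}{\Lambda(t)}\,p(t) - \lambda_i c_i(t)
```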

  6. 3D nonlinear magnetohydrodynamic simulations of macroscopic internal instabilities in tokamak plasmas

    International Nuclear Information System (INIS)

    Krebs, Isabel

    2017-01-01

    kept from decreasing below unity by flux pumping. A detailed analysis of this flux pumping mechanism is presented, and it is discussed under which conditions the mechanism is able to sustain itself. This includes a linear stability analysis of an equilibrium with low central magnetic shear and a central q∼1. Furthermore, a step towards more realistic simulations of Hybrid discharges has been made by performing 3D nonlinear MHD simulations based on ASDEX Upgrade geometry in which some features of the current ramp-up phase in realistic Hybrid discharges are imitated. Although in the framework of single-fluid MHD and with the used parameters it is not expected that all features of realistic sawtooth cycles are reproduced, the reconnection events obtained in the simulations share some of their characteristics and two interesting phenomena are found for specific sets of parameters. First, in some cases, the reconnection process during these sawtooth-like events does not complete but stops and reverses. And second, in one case the sawtooth-like reconnection events are separated by quiescent phases showing similar characteristics as the sawtooth-free stationary states.

  7. 3D nonlinear magnetohydrodynamic simulations of macroscopic internal instabilities in tokamak plasmas

    Energy Technology Data Exchange (ETDEWEB)

    Krebs, Isabel

    2017-08-08

    kept from decreasing below unity by flux pumping. A detailed analysis of this flux pumping mechanism is presented, and it is discussed under which conditions the mechanism is able to sustain itself. This includes a linear stability analysis of an equilibrium with low central magnetic shear and a central q∼1. Furthermore, a step towards more realistic simulations of Hybrid discharges has been made by performing 3D nonlinear MHD simulations based on ASDEX Upgrade geometry in which some features of the current ramp-up phase in realistic Hybrid discharges are imitated. Although in the framework of single-fluid MHD and with the used parameters it is not expected that all features of realistic sawtooth cycles are reproduced, the reconnection events obtained in the simulations share some of their characteristics and two interesting phenomena are found for specific sets of parameters. First, in some cases, the reconnection process during these sawtooth-like events does not complete but stops and reverses. And second, in one case the sawtooth-like reconnection events are separated by quiescent phases showing similar characteristics as the sawtooth-free stationary states.

  8. The use of discrete-event simulation modelling to improve radiation therapy planning processes.

    Science.gov (United States)

    Werker, Greg; Sauré, Antoine; French, John; Shechter, Steven

    2009-07-01

    The planning portion of the radiation therapy treatment process at the British Columbia Cancer Agency is efficient but nevertheless contains room for improvement. The purpose of this study is to show how a discrete-event simulation (DES) model can be used to represent this complex process and to suggest improvements that may reduce the planning time and ultimately reduce overall waiting times. A simulation model of the radiation therapy (RT) planning process was constructed using the Arena simulation software, representing the complexities of the system. Several types of inputs feed into the model; these inputs come from historical data, a staff survey, and interviews with planners. The simulation model was validated against historical data and then used to test various scenarios to identify and quantify potential improvements to the RT planning process. Simulation modelling is an attractive tool for describing complex systems, and can be used to identify improvements to the processes involved. It is possible to use this technique in the area of radiation therapy planning with the intent of reducing process times and subsequent delays for patient treatment. In this particular system, reducing the variability and length of oncologist-related delays contributes most to improving the planning time.

  9. PROBABILISTIC CROSS-IDENTIFICATION OF COSMIC EVENTS

    International Nuclear Information System (INIS)

    Budavari, Tamas

    2011-01-01

    I discuss a novel approach to identifying cosmic events in separate and independent observations. The focus is on the true events, such as supernova explosions, that happen once and, hence, whose measurements are not repeatable. Their classification and analysis must make the best use of all available data. Bayesian hypothesis testing is used to associate streams of events in space and time. Probabilities are assigned to the matches by studying their rates of occurrence. A case study of Type Ia supernovae illustrates how to use light curves in the cross-identification process. Constraints from realistic light curves happen to be well approximated by Gaussians in time, which makes the matching process very efficient. Model-dependent associations are computationally more demanding but can further boost one's confidence.
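
    A hedged, one-dimensional sketch of the kind of association the abstract describes: for two detections with Gaussian timing uncertainties and a uniform prior over a survey window, the Bayes factor for "same event" versus "unrelated events" reduces to a Gaussian in the time difference; the window length and uncertainties below are assumptions, and the full calculation also uses positions and light-curve constraints.

```python
# 1-D sketch of a Bayes factor for associating two detections by time alone,
# assuming Gaussian timing errors and a uniform prior over a survey window T.
import numpy as np

def time_match_bayes_factor(t1, t2, sigma1, sigma2, window):
    """B = window * N(t1 - t2 | 0, sigma1^2 + sigma2^2); B >> 1 favours a match."""
    var = sigma1 ** 2 + sigma2 ** 2
    dt = t1 - t2
    return window * np.exp(-0.5 * dt ** 2 / var) / np.sqrt(2.0 * np.pi * var)

# Two detections 0.8 days apart, ~1-day uncertainties, one-year survey window.
print(time_match_bayes_factor(t1=100.0, t2=100.8, sigma1=1.0, sigma2=0.7, window=365.0))
```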

  10. QUALITY THROUGH INTEGRATION OF PRODUCTION AND SHOP FLOOR MANAGEMENT BY DISCRETE EVENT SIMULATION

    Directory of Open Access Journals (Sweden)

    Zoran Mirović

    2007-06-01

    With the intention of integrating strategic and tactical decision making and developing the capability to reconfigure and synchronize plans and schedules in a very short cycle time, many firms have proceeded to the adoption of ERP and Advanced Planning and Scheduling (APS) technologies. The final goal is a purposeful scheduling system that guides the current, high-priority needs of the shop floor in the right direction while remaining consistent with long-term production plans. The difference, and the power, of Discrete-Event Simulation (DES) is its ability to mimic dynamic manufacturing systems consisting of complex structures and many heterogeneous interacting components. This paper describes such an integrated system (ERP/APS/DES) and draws attention to the essential role of simulation-based scheduling within it.

  11. An analysis of simulated and observed storm characteristics

    Science.gov (United States)

    Benestad, R. E.

    2010-09-01

    A calculus-based cyclone identification (CCI) method has been applied to the most recent re-analysis (ERAINT) from the European Centre for Medium-range Weather Forecasts and to results from regional climate model (RCM) simulations. The storm frequency for events with central pressure below threshold values of 960-990 hPa was examined, and the gradient wind from the simulated storm systems was compared with corresponding estimates from the re-analysis. The analysis also yielded estimates of the spatial extent of the storm systems, which were included in the regional climate model cyclone evaluation. A comparison is presented between a number of RCMs and the ERAINT re-analysis in terms of their description of the gradient winds, number of cyclones, and spatial extent. Furthermore, a comparison between geostrophic winds estimated through triangles of interpolated or station measurements of SLP is presented. Wind still represents one of the more challenging variables to model realistically.

  12. Realistic clinical simulation: an experience with undergraduate medical students

    Directory of Open Access Journals (Sweden)

    Javier Riancho

    2012-06-01

    Introduction. Simulation with high-realism models is often used in the training of healthcare professionals. However, experiences at the undergraduate level are scarce. The objective of this work was to assess the feasibility and acceptance of its use with sixth-year undergraduate medical students. Materials and methods. Eight scenarios simulating common clinical problems were designed to be run with high-realism manikins. The students were divided into groups of 6-8, each of which managed two cases for 30 minutes, followed by a reflective debriefing of 25-40 minutes. The activity was repeated in two consecutive years. At the end, the students' opinions were collected through anonymous surveys. Results. The activity was rated very positively by the students, who considered it "useful" (4.8 and 4.9 points out of 5) and "interesting" (4.9 and 4.9 points). The time needed to prepare each scenario was about 3 hours. A full working day of one teacher, one technician and one nurse was required for a group of about 40 students to be exposed to two clinical cases. Conclusions. This pilot experience suggests that high-realism simulation is feasible at the undergraduate level, involves a reasonable consumption of resources, and is highly accepted by students. Nevertheless, further studies are needed to confirm the subjective impression that it is useful for enhancing students' learning and their clinical competence.

  13. Multiscale models and stochastic simulation methods for computing rare but key binding events in cell biology

    Energy Technology Data Exchange (ETDEWEB)

    Guerrier, C. [Applied Mathematics and Computational Biology, IBENS, Ecole Normale Supérieure, 46 rue d' Ulm, 75005 Paris (France); Holcman, D., E-mail: david.holcman@ens.fr [Applied Mathematics and Computational Biology, IBENS, Ecole Normale Supérieure, 46 rue d' Ulm, 75005 Paris (France); Mathematical Institute, Oxford OX2 6GG, Newton Institute (United Kingdom)

    2017-07-01

    The main difficulty in simulating diffusion processes at a molecular level in cell microdomains is due to the multiple scales involving nano- to micrometers. Few to many particles have to be simulated and simultaneously tracked while they are exploring a large portion of the space for binding small targets, such as buffers or active sites. Bridging the small and large spatial scales is achieved by rare events representing Brownian particles finding small targets and characterized by long-time distribution. These rare events are the bottleneck of numerical simulations. A naive stochastic simulation requires running many Brownian particles together, which is computationally greedy and inefficient. Solving the associated partial differential equations is also difficult due to the time-dependent boundary conditions, narrow passages and mixed boundary conditions at small windows. We present here two reduced modeling approaches for a fast computation of diffusing fluxes in microdomains. The first approach is based on Markov mass-action law equations coupled to a Markov chain. The second is a Gillespie method based on the narrow escape theory for coarse-graining the geometry of the domain into Poissonian rates. The main application concerns diffusion in cellular biology, where we compute as an example the distribution of arrival times of calcium ions to small hidden targets to trigger vesicular release.

  14. Multiscale models and stochastic simulation methods for computing rare but key binding events in cell biology

    International Nuclear Information System (INIS)

    Guerrier, C.; Holcman, D.

    2017-01-01

    The main difficulty in simulating diffusion processes at a molecular level in cell microdomains is due to the multiple scales involving nano- to micrometers. Few to many particles have to be simulated and simultaneously tracked while they are exploring a large portion of the space for binding small targets, such as buffers or active sites. Bridging the small and large spatial scales is achieved by rare events representing Brownian particles finding small targets and characterized by long-time distribution. These rare events are the bottleneck of numerical simulations. A naive stochastic simulation requires running many Brownian particles together, which is computationally greedy and inefficient. Solving the associated partial differential equations is also difficult due to the time-dependent boundary conditions, narrow passages and mixed boundary conditions at small windows. We present here two reduced modeling approaches for a fast computation of diffusing fluxes in microdomains. The first approach is based on Markov mass-action law equations coupled to a Markov chain. The second is a Gillespie method based on the narrow escape theory for coarse-graining the geometry of the domain into Poissonian rates. The main application concerns diffusion in cellular biology, where we compute as an example the distribution of arrival times of calcium ions to small hidden targets to trigger vesicular release.

  15. Simulating spontaneous aseismic and seismic slip events on evolving faults

    Science.gov (United States)

    Herrendörfer, Robert; van Dinther, Ylona; Pranger, Casper; Gerya, Taras

    2017-04-01

    Plate motion along tectonic boundaries is accommodated by different slip modes: steady creep, seismic slip and slow slip transients. Due to mainly indirect observations and difficulties to scale results from laboratory experiments to nature, it remains enigmatic which fault conditions favour certain slip modes. Therefore, we are developing a numerical modelling approach that is capable of simulating different slip modes together with the long-term fault evolution in a large-scale tectonic setting. We extend the 2D, continuum mechanics-based, visco-elasto-plastic thermo-mechanical model that was designed to simulate slip transients in large-scale geodynamic simulations (van Dinther et al., JGR, 2013). We improve the numerical approach to accurately treat the non-linear problem of plasticity (see also EGU 2017 abstract by Pranger et al.). To resolve a wide slip rate spectrum on evolving faults, we develop an invariant reformulation of the conventional rate-and-state dependent friction (RSF) and adapt the time step (Lapusta et al., JGR, 2000). A crucial part of this development is a conceptual ductile fault zone model that relates slip rates along discrete planes to the effective macroscopic plastic strain rates in the continuum. We test our implementation first in a simple 2D setup with a single fault zone that has a predefined initial thickness. Results show that deformation localizes in case of steady creep and for very slow slip transients to a bell-shaped strain rate profile across the fault zone, which suggests that a length scale across the fault zone may exist. This continuum length scale would overcome the common mesh-dependency in plasticity simulations and question the conventional treatment of aseismic slip on infinitely thin fault zones. We test the introduction of a diffusion term (similar to the damage description in Lyakhovsky et al., JMPS, 2011) into the state evolution equation and its effect on (de-)localization during faster slip events. We compare

  16. Adapting realist synthesis methodology: The case of workplace harassment interventions.

    Science.gov (United States)

    Carr, Tracey; Quinlan, Elizabeth; Robertson, Susan; Gerrard, Angie

    2017-12-01

    Realist synthesis techniques can be used to assess complex interventions by extracting and synthesizing configurations of contexts, mechanisms, and outcomes found in the literature. Our novel and multi-pronged approach to the realist synthesis of workplace harassment interventions describes our pursuit of theory to link macro and program level theories. After discovering the limitations of a dogmatic approach to realist synthesis, we adapted our search strategy and focused our analysis on a subset of data. We tailored our realist synthesis to understand how, why, and under what circumstances workplace harassment interventions are effective. The result was a conceptual framework to test our theory-based interventions and provide the basis for subsequent realist evaluation. Our experience documented in this article contributes to an understanding of how, under what circumstances, and with what consequences realist synthesis principles can be customized. Copyright © 2017 John Wiley & Sons, Ltd.

  17. A predictive model of nuclear power plant crew decision-making and performance in a dynamic simulation environment

    Science.gov (United States)

    Coyne, Kevin Anthony

    branching events and provide a better representation of the IDAC cognitive model. An operator decision-making engine capable of responding to dynamic changes in situational context was implemented. The IDAC human performance model was fully integrated with a detailed nuclear plant model in order to realistically simulate plant accident scenarios. Finally, the improved ADS-IDAC model was calibrated, validated, and updated using actual nuclear plant crew performance data. This research led to the following general conclusions: (1) A relatively small number of branching rules are capable of efficiently capturing a wide spectrum of crew-to-crew variabilities. (2) Compared to traditional static risk assessment methods, ADS-IDAC can provide a more realistic and integrated assessment of human error events by directly determining the effect of operator behaviors on plant thermal hydraulic parameters. (3) The ADS-IDAC approach provides an efficient framework for capturing actual operator performance data such as timing of operator actions, mental models, and decision-making activities.

  18. Incorporating information from source simulations into searches for gravitational-wave bursts

    International Nuclear Information System (INIS)

    Brady, Patrick R; Ray-Majumder, Saikat

    2004-01-01

    The detection of gravitational waves from astrophysical sources is a realistic goal for the current generation of interferometric gravitational-wave detectors. Short-duration bursts of gravitational waves from core-collapse supernovae or mergers of binary black holes may bring a wealth of astronomical and astrophysical information. The weakness of the waves and the rarity of the events urge the development of optimal methods to detect them. The waves from these sources are not generally known well enough to use matched filtering, however; this drives the need to develop new ways to exploit source simulation information in both detection and information extraction. We present an algorithmic approach to using catalogues of gravitational-wave signals developed through numerical simulation, or otherwise, to enhance our ability to detect these waves. As more detailed simulations become available, it is straightforward to incorporate the new information into the search method. This approach may also be useful when trying to extract information from a gravitational-wave observation by allowing direct comparison between the observation and simulations.

  19. Simulation of Martian surface-atmosphere interaction in a space-simulator: Technical considerations and feasibility

    Science.gov (United States)

    Moehlmann, D.; Kochan, H.

    1992-01-01

    The Space Simulator of the German Aerospace Research Establishment (DLR) at Cologne, formerly used for testing satellites, has been, since 1987, the central unit within the research sub-program 'Comet-Simulation' (KOSI). The KOSI team has investigated physical processes relevant to comets and their surfaces. As a byproduct we gained experience in sample handling under simulated space conditions. In broadening the scope of the research activities of the DLR Institute of Space Simulation, an extension to 'Laboratory-Planetology' is planned. Following the KOSI experiments, a Mars surface simulation with realistic minerals and surface soil in a suitable environment (temperature, pressure, and CO2 atmosphere) is foreseen as the next step. Here, our main interest is centered on the thermophysical properties of the Martian surface and on energy transport (and related gas transport) through the surface. These laboratory simulation activities can support space missions with typical pre-mission and during-mission support of experiment design and operations (simulation in parallel). Post-mission experiments for confirmation and interpretation of results are of great value. The physical dimensions of the Space Simulator (a cylinder of about 2.5 m diameter and 5 m length) allow for testing and qualification of experimental hardware under realistic Martian conditions.

  20. Breaking with fun, educational and realistic learning games

    DEFF Research Database (Denmark)

    Duus Henriksen, Thomas

    2009-01-01

    This paper addresses the game conceptions and values that learning games inherit from regular gaming, as well as how they affect the use and development of learning games. Its key points concern the issues of thinking learning games as fun, educative and realistic, which is how learning games are commonly conceived as means for staging learning processes, and that thinking learning games so has an inhibiting effect in regard to creating learning processes. The paper draws upon a qualitative study of participants' experiences with 'the EIS Simulation', which is a computer-based learning game for teaching change management and change implementation. The EIS is played in groups, who share the game on a computer, and played by making change decisions in order to implement an IT system in an organisation. In this study, alternative participatory incentives, means for creating learning processes...

  1. Evaluation of photovoltaic panel temperature in realistic scenarios

    International Nuclear Information System (INIS)

    Du, Yanping; Fell, Christopher J.; Duck, Benjamin; Chen, Dong; Liffman, Kurt; Zhang, Yinan; Gu, Min; Zhu, Yonggang

    2016-01-01

    Highlights: • The developed realistic model more reasonably captures the thermal response and hysteresis effects. • The predicted panel temperature reaches 60 °C under a solar irradiance of 1000 W/m² in no-wind weather. • In realistic scenarios, the thermal response normally takes 50–250 s. • The actual heating effect may cause a photoelectric efficiency drop of 2.9–9.0%. - Abstract: Photovoltaic (PV) panel temperature was evaluated by developing theoretical models that are feasible for use in realistic scenarios. The effects of solar irradiance, wind speed and ambient temperature on the PV panel temperature were studied. The parametric study shows a significant influence of solar irradiance and wind speed on the PV panel temperature. With an increase of ambient temperature, the temperature rise of the solar cells is reduced. The characteristics of panel temperature in realistic scenarios were analyzed. In steady weather conditions, the thermal response time of a solar cell with a Si thickness of 100–500 μm is around 50–250 s. In realistic scenarios, however, the panel temperature variation over a day differs from that in steady weather conditions due to the effect of thermal hysteresis. The heating effect on the photovoltaic efficiency was assessed based on real-time temperature measurement of solar cells in realistic weather conditions. For solar cells with a temperature coefficient in the range of −0.21% to −0.50%, the current field tests indicated an approximate efficiency loss between 2.9% and 9.0%.
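
    The abstract does not reproduce the model equations; as a hedged stand-in for the kind of steady-state estimate involved, the sketch below combines the standard NOCT approximation for cell temperature with a linear temperature coefficient for the efficiency penalty. The NOCT value and the coefficient are assumed typical figures, not parameters from the paper, and this crude estimate ignores wind cooling, so it will generally overstate the loss relative to the field values quoted above.

```python
def cell_temperature(ambient_c, irradiance_w_m2, noct_c=45.0):
    """Standard NOCT approximation: T_cell = T_amb + (NOCT - 20) / 800 * G.
    An NOCT of 45 °C is an assumed typical value, not taken from the paper."""
    return ambient_c + (noct_c - 20.0) / 800.0 * irradiance_w_m2

def relative_efficiency_loss(cell_temp_c, temp_coeff_per_c=-0.004, ref_temp_c=25.0):
    """Linear power-temperature model; -0.40 %/°C lies inside the -0.21 % to
    -0.50 % per °C range quoted in the abstract."""
    return -temp_coeff_per_c * (cell_temp_c - ref_temp_c)   # positive value = loss fraction

t_cell = cell_temperature(ambient_c=25.0, irradiance_w_m2=1000.0)
loss = relative_efficiency_loss(t_cell)
print(f"Estimated cell temperature: {t_cell:.1f} °C")    # ~56 °C under these assumptions
print(f"Relative efficiency loss:   {100 * loss:.1f} %")  # no wind cooling, so pessimistic
```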

  2. Comparative study of non-premixed and partially-premixed combustion simulations in a realistic Tay model combustor

    OpenAIRE

    Zhang, K.; Ghobadian, A.; Nouri, J. M.

    2017-01-01

    A comparative study of two combustion models, one based on a non-premixed assumption and the other on a partially premixed assumption, using the overall models of the Zimont Turbulent Flame Speed Closure (ZTFSC) method and the Extended Coherent Flamelet Method (ECFM), is conducted for the first time through Reynolds stress turbulence modelling of the Tay model gas turbine combustor. The Tay model combustor retains all essential features of a realistic gas turbine combustor. It is seen that the non-premixed combustion model fa...

  3. Traffic simulation based ship collision probability modeling

    Energy Technology Data Exchange (ETDEWEB)

    Goerlandt, Floris, E-mail: floris.goerlandt@tkk.f [Aalto University, School of Science and Technology, Department of Applied Mechanics, Marine Technology, P.O. Box 15300, FI-00076 AALTO, Espoo (Finland); Kujala, Pentti [Aalto University, School of Science and Technology, Department of Applied Mechanics, Marine Technology, P.O. Box 15300, FI-00076 AALTO, Espoo (Finland)

    2011-01-15

    Maritime traffic poses various risks in terms of human, environmental and economic loss. In a risk analysis of ship collisions, it is important to get a reasonable estimate of the probability of such accidents and of the consequences they lead to. In this paper, a method is proposed to assess the probability of vessels colliding with each other. The method is capable of determining the expected number of accidents and the locations and times where they are most likely to occur, while providing input for models concerned with the expected consequences. At the basis of the collision detection algorithm lies an extensive time-domain micro-simulation of vessel traffic in the given area. The Monte Carlo simulation technique is applied to obtain a meaningful prediction of the relevant factors of the collision events. Data obtained through the Automatic Identification System are analyzed in detail to obtain realistic input data for the traffic simulation: traffic routes, the number of vessels on each route, ship departure times, main dimensions and sailing speed. The results obtained by the proposed method for the studied case of the Gulf of Finland are presented, showing reasonable agreement with registered accident and near-miss data.
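
    A minimal sketch of the kind of Monte Carlo encounter counting involved is shown below, assuming two invented crossing routes, sampled departure times and speeds, and a closest-point-of-approach check with an arbitrary 0.5 nm threshold. The authors' micro-simulation additionally models evasive behaviour and uses AIS-derived traffic data, none of which is reproduced here.

```python
import math
import random

# Illustrative Monte Carlo count of close encounters between two crossing routes.
# Route geometry, traffic volumes, speeds and the 0.5 nm threshold are invented
# for the example and are not taken from the Gulf of Finland case study.

random.seed(1)

def sample_ships(n_per_day, heading_deg, speed_kn, origin):
    """Sample one day of departures on a straight route: (t0 [h], origin [nm], velocity [kn])."""
    ships = []
    for _ in range(random.randint(int(0.8 * n_per_day), int(1.2 * n_per_day))):
        t0 = random.uniform(0.0, 24.0)
        v = max(random.gauss(speed_kn, 1.5), 5.0)
        h = math.radians(heading_deg)
        ships.append((t0, origin, (v * math.sin(h), v * math.cos(h))))
    return ships

def cpa_distance(ship_a, ship_b, horizon_h=24.0):
    """Closest point of approach between two straight-line tracks, in nautical miles."""
    (ta, (xa, ya), (ua, va)), (tb, (xb, yb), (ub, vb)) = ship_a, ship_b
    t0 = max(ta, tb)
    rx = (xa + ua * (t0 - ta)) - (xb + ub * (t0 - tb))   # relative position at t0
    ry = (ya + va * (t0 - ta)) - (yb + vb * (t0 - tb))
    vx, vy = ua - ub, va - vb
    v2 = vx * vx + vy * vy
    t_star = t0 if v2 == 0.0 else t0 - (rx * vx + ry * vy) / v2
    t_star = min(max(t_star, t0), horizon_h)             # clamp to the simulated day
    dt = t_star - t0
    return math.hypot(rx + vx * dt, ry + vy * dt)

n_days, encounters = 200, 0
for _ in range(n_days):
    eastbound = sample_ships(30, 90.0, 14.0, origin=(-20.0, 0.0))
    northbound = sample_ships(20, 0.0, 12.0, origin=(0.0, -20.0))
    encounters += sum(1 for a in eastbound for b in northbound if cpa_distance(a, b) < 0.5)

print(f"Mean close encounters (< 0.5 nm) per day: {encounters / n_days:.2f}")
```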

  4. Simulating coronal condensation dynamics in 3D

    Science.gov (United States)

    Moschou, S. P.; Keppens, R.; Xia, C.; Fang, X.

    2015-12-01

    We present numerical simulations in 3D settings where coronal rain phenomena take place in a magnetic configuration of a quadrupolar arcade system. Our magnetohydrodynamic simulation includes anisotropic thermal conduction, optically thin radiative losses, and parametrised heating as the main thermodynamic features to construct a realistic arcade configuration from chromospheric to coronal heights. Plasma evaporation from chromospheric and transition-region heights eventually causes localised runaway condensation events, and we witness the formation of plasma blobs due to thermal instability, which evolve dynamically in the heated arcade part and move gradually downwards due to interchange-type dynamics. Unlike earlier 2.5D simulations, no large-scale prominence formation is observed in this case; instead, a continuous coronal rain develops which shows clear indications of Rayleigh-Taylor or interchange instability, causing the denser plasma located above the transition region to fall down as the system moves towards a more stable state. Linear stability analysis is used in the non-linear regime to gain insight and to give a prediction of the system's evolution. After the plasma blobs descend through interchange, they follow the magnetic field topology more closely in the lower coronal regions, where they are guided by the magnetic dips.

  5. Simulation of Material Flow Through a Sample Divider

    Directory of Open Access Journals (Sweden)

    Jiří Rozbroj

    2018-03-01

    The utilization of simulation methods, such as the Discrete Element Method (DEM), is a prerequisite for a modern approach to the innovative development of existing or newly designed equipment for the transport of particulate materials. This article focuses on the basic, or initial, validation of the movement of material through a sample divider. The mechanical-physical properties of brown coal were measured. Based on these parameters, the preliminary input values for EDEM Academic were selected, and a simulation of the dividing process was run. The key monitored parameters included density and friction coefficient. Experiments on a realistic model of the equipment were performed and assessed. The total weights of brown coal at the exit from the divider were determined for a specific speed of the divider. The aim of this task was to reproduce in simulation the weight division of the brown coal sample determined on the realistic model. The result from the DEM was compared with the results of measurement on the realistic model.

  6. Simulation study of pedestrian flow in a station hall during the Spring Festival travel rush

    Science.gov (United States)

    Wang, Lei; Zhang, Qian; Cai, Yun; Zhang, Jianlin; Ma, Qingguo

    2013-05-01

    The Spring Festival is the most important festival in China. How can passengers go home smoothly and quickly during the Spring Festival travel rush, especially when emergencies such as severe winter weather occur? By modifying the social force model, we simulated the pedestrian flow in a station hall. The simulation revealed that casualties occurred when passengers escaped in a panic induced by crowd turbulence. The results suggest that passenger numbers, ticket-checking patterns, baggage volumes, and anxiety can affect the speed of passing through the waiting corridor. Our approach is useful for understanding the features of crowd movement and can be used to reproduce mass events. It therefore not only provides a realistic model of pedestrian flow but is also important for better preparation of emergency management.
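
    For orientation, a minimal Helbing-style social force update is sketched below for pedestrians heading toward a single ticket gate; the parameter values and geometry are illustrative and are not the modified model calibrated in the study.

```python
import numpy as np

# Minimal social-force update for pedestrians heading toward a ticket gate.
# All parameter values and the scene geometry are illustrative assumptions.

TAU = 0.5        # relaxation time [s]
A, B = 2.0, 0.3  # repulsion strength [m/s^2] and range [m]
DT = 0.05        # time step [s]

def social_force_step(pos, vel, goal, desired_speed=1.3):
    """One explicit Euler step of a simplified social force model."""
    acc = np.zeros_like(pos)
    for i in range(len(pos)):
        # driving force toward the goal at the desired speed
        direction = goal - pos[i]
        direction /= np.linalg.norm(direction) + 1e-9
        acc[i] += (desired_speed * direction - vel[i]) / TAU
        # pairwise repulsion from the other pedestrians
        for j in range(len(pos)):
            if i == j:
                continue
            d_vec = pos[i] - pos[j]
            d = np.linalg.norm(d_vec) + 1e-9
            acc[i] += A * np.exp(-d / B) * d_vec / d
    vel_new = vel + acc * DT
    pos_new = pos + vel_new * DT
    return pos_new, vel_new

rng = np.random.default_rng(0)
pos = rng.uniform(0.0, 5.0, size=(20, 2))    # 20 pedestrians in a 5 m x 5 m area
vel = np.zeros((20, 2))
gate = np.array([10.0, 2.5])                 # ticket gate location
for _ in range(400):                         # simulate 20 s
    pos, vel = social_force_step(pos, vel, gate)
print("Mean distance to gate after 20 s:", np.linalg.norm(pos - gate, axis=1).mean())
```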

  7. Using discrete event simulation to change from a functional layout to a cellular layout in an auto parts industry

    Directory of Open Access Journals (Sweden)

    Thiago Buselato Maurício

    2015-07-01

    This paper presents a discrete event simulation employed in a Brazilian automotive company, where substantial waste was caused by scrap from one product family. It was believed that one reason was the company's functional layout; by changing from the current layout to a cellular layout, employee synergy and knowledge about this family were expected to increase. Dimensioning a new cellular layout is complex, mainly because of batch sizes and variation in client demand, so discrete event simulation was used, which made it possible to include those effects and improve the accuracy of the final results. This accuracy is shown by comparing the results obtained with simulation and without it (as the company used to do). In conclusion, the cellular layout was responsible for increasing productivity by 15%, reducing lead time by 7 days, and reducing scrap by 15% for this family.
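
    The study's model is not reproduced in this record; the sketch below is a minimal single-server queueing simulation, with invented inter-arrival and processing times, showing in principle how a discrete-event comparison of two layout configurations can be set up.

```python
import random

# Minimal single-server queueing sketch standing in for a fuller discrete-event
# model: batches arrive at random and are processed in order. Inter-arrival and
# processing times are invented, not the company's data; the cellular layout is
# represented simply by a shorter effective processing time per batch.

random.seed(42)

def mean_lead_time(process_min, n_batches=2000, interarrival_min=12.0):
    """Return the mean lead time [min] of a simple batch queue."""
    clock, server_free_at, total_lead = 0.0, 0.0, 0.0
    for _ in range(n_batches):
        clock += random.expovariate(1.0 / interarrival_min)              # arrival event
        start = max(clock, server_free_at)                               # wait if busy
        server_free_at = start + random.expovariate(1.0 / process_min)   # departure event
        total_lead += server_free_at - clock
    return total_lead / n_batches

print(f"Functional layout (10 min/batch): {mean_lead_time(10.0):.1f} min mean lead time")
print(f"Cellular layout   (8 min/batch):  {mean_lead_time(8.0):.1f} min mean lead time")
```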

  8. Coupled atmosphere-ocean-wave simulations of a storm event over the Gulf of Lion and Balearic Sea

    Science.gov (United States)

    Renault, Lionel; Chiggiato, Jacopo; Warner, John C.; Gomez, Marta; Vizoso, Guillermo; Tintore, Joaquin

    2012-01-01

    The coastal areas of the north-western Mediterranean Sea are among the most challenging places for ocean forecasting. This region is exposed to severe storm events of short duration. During these events, significant air-sea interactions, strong winds and a large sea state can have catastrophic consequences in coastal areas. To investigate these air-sea interactions and the oceanic response to such events, we implemented the Coupled Ocean-Atmosphere-Wave-Sediment Transport Modeling System to simulate a severe storm that occurred in the Mediterranean Sea in May 2010. During this event, the wind speed reached up to 25 m/s, inducing significant sea surface cooling (up to 2°C) over the Gulf of Lion (GoL) and along the storm track, and generating surface waves with a significant height of 6 m. It is shown that the event, associated with a cyclogenesis between the Balearic Islands and the GoL, is relatively well reproduced by the coupled system. A surface heat budget analysis showed that ocean vertical mixing was a major contributor to the cooling tendency along the storm track and in the GoL, where turbulent heat fluxes also played an important role. Sensitivity experiments on the ocean-atmosphere coupling suggested that the coupled system is sensitive to the momentum flux parameterization as well as to air-sea and air-wave coupling. Comparisons with available atmospheric and oceanic observations showed that the fully coupled system provides the most skillful simulation, illustrating the benefit of using a fully coupled ocean-atmosphere-wave model for the assessment of these storm events.

  9. Simulating human behavior for national security human interactions.

    Energy Technology Data Exchange (ETDEWEB)

    Bernard, Michael Lewis; Hart, Dereck H.; Verzi, Stephen J.; Glickman, Matthew R.; Wolfenbarger, Paul R.; Xavier, Patrick Gordon

    2007-01-01

    This 3-year research and development effort focused on what we believe is a significant technical gap in existing modeling and simulation capabilities: the representation of plausible human cognition and behaviors within a dynamic, simulated environment. Specifically, the intent of the 'Simulating Human Behavior for National Security Human Interactions' project was to demonstrate an initial simulated-human modeling capability that realistically represents intra- and inter-group interaction behaviors between simulated humans and human-controlled avatars as they respond to their environment. Significant progress was made towards simulating human behaviors through the development of a framework that produces realistic characteristics and movement. The simulated humans were created from models designed to be psychologically plausible by being based on robust psychological research and theory. Progress was also made towards enhancing Sandia National Laboratories' existing cognitive models to support culturally plausible behaviors that are important in representing group interactions. These models were implemented in the modular, interoperable, and commercially supported Umbra® simulation framework.

  10. Numerical simulation of a rare winter hailstorm event over Delhi, India on 17 January 2013

    Science.gov (United States)

    Chevuturi, A.; Dimri, A. P.; Gunturu, U. B.

    2014-12-01

    This study analyzes the cause of the rare occurrence of a winter hailstorm over New Delhi/NCR (National Capital Region), India. The absence of increased surface temperature or low-level moisture incursion during winter cannot generate the deep convection required to sustain a hailstorm. Consequently, NCR sees very few hailstorms in the months of December-January-February, making winter hail formation a question of interest. For this study, a recent winter hailstorm event on 17 January 2013 (16:00-18:00 UTC) over NCR is investigated. The storm is simulated using the Weather Research and Forecasting (WRF) model with the Goddard Cumulus Ensemble (GCE) microphysics scheme with two different options: hail and graupel. The aim of the study is to understand and describe the cause of the hailstorm event over NCR through a comparative analysis of the two GCE microphysics options. Upon evaluating the model simulations, it is observed that the hail option shows precipitation intensity closer to the Tropical Rainfall Measuring Mission (TRMM) observations than the graupel option does, and it is able to simulate hail precipitation. Using the model-simulated output with the hail option, a detailed investigation of the hailstorm dynamics is performed. The analysis based on the numerical simulation suggests that deep instability in the atmospheric column led to the formation of hailstones as the cloud reached up to the glaciated zone, promoting ice nucleation. In winter, such instability conditions rarely form; they require low-level available potential energy and moisture incursion along with upper-level baroclinic instability due to the presence of a western disturbance (WD). Such a rare configuration is found to lower the tropopause and increase the temperature gradient, leading to winter hailstorm formation.

  11. Numerical simulation of a rare winter hailstorm event over Delhi, India on 17 January 2013

    KAUST Repository

    Chevuturi, A.

    2014-12-19

    This study analyzes the cause of the rare occurrence of a winter hailstorm over New Delhi/NCR (National Capital Region), India. The absence of increased surface temperature or low-level moisture incursion during winter cannot generate the deep convection required to sustain a hailstorm. Consequently, NCR sees very few hailstorms in the months of December-January-February, making winter hail formation a question of interest. For this study, a recent winter hailstorm event on 17 January 2013 (16:00–18:00 UTC) over NCR is investigated. The storm is simulated using the Weather Research and Forecasting (WRF) model with the Goddard Cumulus Ensemble (GCE) microphysics scheme with two different options: hail and graupel. The aim of the study is to understand and describe the cause of the hailstorm event over NCR through a comparative analysis of the two GCE microphysics options. Upon evaluating the model simulations, it is observed that the hail option shows precipitation intensity closer to the Tropical Rainfall Measuring Mission (TRMM) observations than the graupel option does, and it is able to simulate hail precipitation. Using the model-simulated output with the hail option, a detailed investigation of the hailstorm dynamics is performed. The analysis based on the numerical simulation suggests that deep instability in the atmospheric column led to the formation of hailstones as the cloud reached up to the glaciated zone, promoting ice nucleation. In winter, such instability conditions rarely form; they require low-level available potential energy and moisture incursion along with upper-level baroclinic instability due to the presence of a western disturbance (WD). Such a rare configuration is found to lower the tropopause and increase the temperature gradient, leading to winter hailstorm formation.

  12. TWO-DIMENSIONAL SIMULATIONS OF EXPLOSIVE ERUPTIONS OF KICK-EM JENNY AND OTHER SUBMARINE VOLCANOS

    Directory of Open Access Journals (Sweden)

    Galen Gisler

    2006-01-01

    Kick-em Jenny, in the Eastern Caribbean, is a submerged volcanic cone that has erupted a dozen or more times since its discovery in 1939. The most likely hazard posed by this volcano is to shipping in the immediate vicinity (through volcanic missiles or loss of buoyancy), but it is of interest to estimate upper limits on tsunamis that might be produced by a catastrophic explosive eruption. To this end, we have performed two-dimensional simulations of such an event in a geometry resembling that of Kick-em Jenny with our SAGE adaptive mesh Eulerian multifluid compressible hydrocode. We use realistic equations of state for air, water, and basalt, and follow the event from the initial explosive eruption, through the generation of a transient water cavity, to the propagation of waves away from the site. We find that even for extremely catastrophic explosive eruptions, tsunamis from Kick-em Jenny are unlikely to pose significant danger to nearby islands. For comparison, we have also performed simulations of explosive eruptions at the much larger shield volcano Vailulu'u in the Samoan chain, where the greater energy available can produce a more impressive wave. In general, however, we conclude that explosive eruptions do not couple well to water waves. The waves produced by such events are turbulent and highly dissipative, and do not propagate well. This is consistent with what we have found previously in simulations of asteroid-impact-generated tsunamis. Non-explosive events, however, such as landslides or gas hydrate releases, do couple well to waves, and our simulations of tsunamis generated by sub-aerial and sub-aqueous landslides demonstrate this.

  13. Tree Simulation Techniques for Integrated Safety Assessment

    International Nuclear Information System (INIS)

    Melendez Asensio, E.; Izquierdo Rocha, J.M.; Sanchez Perez, M.; Hortal Reymundo, J.; Perez Mulas, A.

    1999-01-01

    In the development of a PSA, a central role is played by the construction of the event trees that, stemming from each initiating event considered, finally lead to the calculation of the core damage frequency (CDF). This construction is largely done by means of expert judgement, with sequences simulated using integrated codes only when doubts arise (typically for small-break LOCAs). Transient analysis done within the framework of a PSA involves important simplifications that affect the transient simulation, especially in relation to the control and protection systems, and may fail to identify complicated sequences. Specialised simulation tools such as RELAP5 have been used only at very specific points of the PSAs due to time and model complexity constraints. On the other hand, a large amount of work and expertise has been invested in the development of simulation tools, independently of the PSA effort. Tools simplified in the phenomena they consider, but complete in their treatment of automatic and manual control and protection systems, are now available for the automatic generation and testing of the sequences appearing in the Event Trees. Additionally, increasing effort and interest have been devoted to developing efficient means of representing the Emergency Operating Procedures, which, incorporated into the simulation tool, would give a more realistic picture of actual plant performance. At present, both types of analyses - transient and probabilistic - can converge to dynamically generate an Event Tree while taking into account the actual performance of the control and protection systems of the plant and the operator actions. Such a tool would allow for an assessment of the Event Trees and, if fed with probabilistic data, could provide results of the same nature. Key aspects of this approach that imply new theoretical as well as practical developments in addition to those traditionally covered by classical simulation

  14. Realist synthesis: illustrating the method for implementation research

    Directory of Open Access Journals (Sweden)

    Rycroft-Malone Jo

    2012-04-01

    Abstract Background Realist synthesis is an increasingly popular approach to the review and synthesis of evidence, which focuses on understanding the mechanisms by which an intervention works (or not). There are few published examples of realist synthesis. This paper therefore fills a gap by describing, in detail, the process used for a realist review and synthesis to answer the question 'what interventions and strategies are effective in enabling evidence-informed healthcare?' The strengths and challenges of conducting realist review are also considered. Methods The realist approach involves identifying underlying causal mechanisms and exploring how they work, and under what conditions. The stages of this review included: defining the scope of the review (concept mining and framework formulation); searching for and scrutinising the evidence; extracting and synthesising the evidence; and developing the narrative, including hypotheses. Results Based on key terms and concepts related to various interventions to promote evidence-informed healthcare, we developed an outcome-focused theoretical framework. Questions were tailored for each of four theory/intervention areas within the theoretical framework and were used to guide development of a review and data extraction process. The search for literature within our first theory area, change agency, was executed and the screening procedure resulted in the inclusion of 52 papers. Using the questions relevant to this theory area, data were extracted by one reviewer and validated by a second reviewer. Synthesis involved organisation of the extracted data into evidence tables, theming and formulation of chains of inference, linking between the chains of inference, and hypothesis formulation. The narrative was developed around the hypotheses generated within the change agency theory area. Conclusions Realist synthesis lends itself to the review of complex interventions because it accounts for context as well as

  15. Production of a faithful realistic phantom to human head and thermal neutron flux measurement on the brain surface. Cooperative research

    Energy Technology Data Exchange (ETDEWEB)

    Yamamoto, Kazuyoshi; Kumada, Hiroaki; Kishi, Toshiaki; Torii, Yoshiya; Uchiyama, Junzo [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Endo, Kiyoshi; Yamamoto, Tetsuya; Matsumura, Akira; Nose, Tadao [Tsukuba Univ., Tsukuba, Ibaraki (Japan)

    2002-12-01

    Thermal neutron flux is determined using gold wires in current BNCT irradiation, so evaluation at arbitrary points after the irradiation is limited by the number of these detectors. To compensate for this weakness, the dose to a patient is estimated with a computational dose calculation support system. Alternatively, without computer simulation, a medical irradiation condition can be replicated experimentally using a realistic phantom produced from CT images by a rapid prototyping technique. This phantom was irradiated in the same JRR-4 neutron beam under the same conditions as the clinical irradiation of the patient, and the thermal neutron distribution on the brain surface was measured in detail. This experimental evaluation technique using a realistic phantom is applicable to in vitro cell irradiation experiments on radiation biological effects as well as to in-phantom dosimetry experiments under conditions close to the medical irradiation of a patient. (author)

  16. Production of a faithful realistic phantom to human head and thermal neutron flux measurement on the brain surface. Cooperative research

    CERN Document Server

    Yamamoto, K; Kishi, T; Kumada, H; Matsumura, A; Nose, T; Torii, Y; Uchiyama, J; Yamamoto, T

    2002-01-01

    Thermal neutron flux is determined using gold wires in current BNCT irradiation, so evaluation at arbitrary points after the irradiation is limited by the number of these detectors. To compensate for this weakness, the dose to a patient is estimated with a computational dose calculation support system. Alternatively, without computer simulation, a medical irradiation condition can be replicated experimentally using a realistic phantom produced from CT images by a rapid prototyping technique. This phantom was irradiated in the same JRR-4 neutron beam under the same conditions as the clinical irradiation of the patient, and the thermal neutron distribution on the brain surface was measured in detail. This experimental evaluation technique using a realistic phantom is applicable to in vitro cell irradiation experiments on radiation biological effects as well as to in-phantom dosimetry experiments under conditions close to the medical irradiation of a patient.

  17. Socialist realist Keskküla

    Index Scriptorium Estoniae

    1998-01-01

    The monograph "Socialist Realist Painting" by the English art critic Matthew Cullerne Bown, published in London in 1998, features the Estonian artists Enn Põldroos, Nikolai Kormashov and Ando Keskküla, and includes reproductions of paintings by Kormashov and Keskküla.

  18. Virtual reality in urban water management: communicating urban flooding with particle-based CFD simulations.

    Science.gov (United States)

    Winkler, Daniel; Zischg, Jonatan; Rauch, Wolfgang

    2018-01-01

    For communicating urban flood risk to authorities and the public, a realistic three-dimensional visual display is frequently more suitable than detailed flood maps. Virtual reality could also serve to plan short-term flooding interventions. We introduce here an alternative approach for simulating three-dimensional flooding dynamics in large- and small-scale urban scenes by drawing on methods from computer graphics. This approach, denoted 'particle in cell', is a particle-based CFD method that is used to predict physically plausible results instead of accurate flow dynamics. We exemplify the approach with the real flooding event of July 2016 in Innsbruck.
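
    The particle-in-cell solver itself is not described in this record; in the same spirit of plausibility over accuracy, the toy sketch below routes rain particles down a synthetic height field to produce a ponding map. The terrain, particle count, and routing rule are invented and do not correspond to the authors' method.

```python
import numpy as np

# Toy particle routing on a terrain grid: each "rain" particle walks to its
# lowest neighbouring cell until it can descend no further, and the count of
# resting particles per cell gives a plausible (not hydraulically accurate)
# ponding map for visualisation purposes only.

rng = np.random.default_rng(0)
n = 64
x, y = np.meshgrid(np.linspace(0, 1, n), np.linspace(0, 1, n))
terrain = 0.3 * np.sin(6 * x) * np.cos(4 * y) + 0.5 * x      # synthetic height field

def route_particle(i, j, height, max_steps=500):
    """Follow the steepest descent from cell (i, j); return the resting cell."""
    for _ in range(max_steps):
        window = height[max(i - 1, 0):i + 2, max(j - 1, 0):j + 2]
        di, dj = np.unravel_index(np.argmin(window), window.shape)
        ni, nj = max(i - 1, 0) + di, max(j - 1, 0) + dj
        if height[ni, nj] >= height[i, j]:
            return i, j                                       # local minimum: particle rests
        i, j = ni, nj
    return i, j

accumulation = np.zeros((n, n))
for _ in range(5000):                                         # 5000 rain particles
    i, j = rng.integers(0, n, size=2)
    ri, rj = route_particle(i, j, terrain)
    accumulation[ri, rj] += 1

print("Deepest ponding cell receives", int(accumulation.max()), "particles")
```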

  19. CONFIG - Adapting qualitative modeling and discrete event simulation for design of fault management systems

    Science.gov (United States)

    Malin, Jane T.; Basham, Bryan D.

    1989-01-01

    CONFIG is a modeling and simulation tool prototype for analyzing the normal and faulty qualitative behaviors of engineered systems. Qualitative modeling and discrete-event simulation have been adapted and integrated, to support early development, during system design, of software and procedures for management of failures, especially in diagnostic expert systems. Qualitative component models are defined in terms of normal and faulty modes and processes, which are defined by invocation statements and effect statements with time delays. System models are constructed graphically by using instances of components and relations from object-oriented hierarchical model libraries. Extension and reuse of CONFIG models and analysis capabilities in hybrid rule- and model-based expert fault-management support systems are discussed.
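
    CONFIG's actual modeling syntax is not given in this record; the sketch below only illustrates the underlying idea, assuming a hypothetical valve component with a normal and a faulty mode whose opening process posts a delayed effect on a small discrete-event queue.

```python
import heapq

# Minimal sketch of a component with normal and faulty modes whose processes
# post effects after a time delay on a discrete-event queue. The component,
# modes and delays are invented; this is not CONFIG's model syntax.

class Valve:
    def __init__(self, name, mode="normal"):
        self.name = name
        self.mode = mode            # "normal" or "stuck_closed"
        self.flow = "zero"          # qualitative state: "zero" or "nominal"

    def command_open(self, sim, t):
        """Invocation: opening takes effect after a mode-dependent delay."""
        if self.mode == "normal":
            sim.schedule(t + 2.0, self._effect_open)   # 2 s stroke time (assumed)
        # in the stuck_closed mode the command has no effect

    def _effect_open(self, sim, t):
        self.flow = "nominal"
        print(f"t={t:4.1f} s  {self.name}: flow -> {self.flow}")

class Simulator:
    def __init__(self):
        self._queue = []
        self._counter = 0           # tie-breaker so heapq never compares callables

    def schedule(self, t, effect):
        heapq.heappush(self._queue, (t, self._counter, effect))
        self._counter += 1

    def run(self):
        while self._queue:
            t, _, effect = heapq.heappop(self._queue)
            effect(self, t)

sim = Simulator()
good_valve = Valve("isolation valve A")
bad_valve = Valve("isolation valve B", mode="stuck_closed")
sim.schedule(0.0, lambda s, t: good_valve.command_open(s, t))
sim.schedule(0.0, lambda s, t: bad_valve.command_open(s, t))
sim.run()
print("Final qualitative flows:", good_valve.flow, "/", bad_valve.flow)
```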

  20. Simulating an extreme over-the-horizon optical propagation event over Lake Michigan using a coupled mesoscale modeling and ray tracing framework

    NARCIS (Netherlands)

    Basu, S.

    2017-01-01

    Accurate simulation and forecasting of over-the-horizon propagation events are essential for various civilian and defense applications. We demonstrate the prowess of a newly proposed coupled mesoscale modeling and ray tracing framework in reproducing such an event. Wherever possible, routinely