WorldWideScience

Sample records for model simulations performed

  1. Maintenance Personnel Performance Simulation (MAPPS) model

    International Nuclear Information System (INIS)

    Siegel, A.I.; Bartter, W.D.; Wolf, J.J.; Knee, H.E.; Haas, P.M.

    1984-01-01

    A stochastic computer model for simulating the actions and behavior of nuclear power plant maintenance personnel is described. The model considers personnel, environmental, and motivational variables to yield predictions of maintenance performance quality and time to perform. The model has been fully developed and sensitivity tested. Additional evaluation of the model is now taking place.

  2. Comparison of performance of simulation models for floor heating

    DEFF Research Database (Denmark)

    Weitzmann, Peter; Svendsen, Svend

    2005-01-01

    This paper describes a comparison of the performance of simulation models for floor heating with different levels of detail in the modelling process. The models are compared in an otherwise identical simulation model containing a room model, walls, windows, ceiling and ventilation system. By exchanging...

  3. Impact of reactive settler models on simulated WWTP performance

    DEFF Research Database (Denmark)

    Gernaey, Krist; Jeppsson, Ulf; Batstone, Damien J.

    2006-01-01

    ... for an ASM1 case study. Simulations with a whole plant model including the non-reactive Takacs settler model are used as a reference, and are compared to simulation results considering two reactive settler models. The first is a return sludge model block removing oxygen and a user-defined fraction of nitrate, combined with a non-reactive Takacs settler. The second is a fully reactive ASM1 Takacs settler model. Simulations with the ASM1 reactive settler model predicted a 15.3% and 7.4% improvement of the simulated N removal performance, for constant (steady-state) and dynamic influent conditions respectively. The oxygen/nitrate return sludge model block predicts a 10% improvement of N removal performance under dynamic conditions, and might be the better modelling option for ASM1 plants: it is computationally more efficient and it will not overrate the importance of decay processes in the settler. ...

  4. MAPPS (Maintenance Personnel Performance Simulation): a computer simulation model for human reliability analysis

    International Nuclear Information System (INIS)

    Knee, H.E.; Haas, P.M.

    1985-01-01

    A computer model capable of generating reliable estimates of human performance measures in the nuclear power plant (NPP) maintenance context has been developed, sensitivity tested, and evaluated. The model, entitled MAPPS (Maintenance Personnel Performance Simulation), is of the simulation type and is task-oriented. It addresses a number of person-machine, person-environment, and person-person variables and is capable of providing the user with a rich spectrum of important performance measures, including the mean time for successful task performance by a maintenance team and the maintenance team's probability of task success. These two measures are particularly important as input to probabilistic risk assessment (PRA) studies, which were the primary impetus for the development of MAPPS. The simulation nature of the model, along with its generous set of input parameters and output variables, allows its usefulness to extend beyond providing input to PRA.
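
    The sketch below is not MAPPS itself; it is a minimal Monte Carlo illustration, with invented subtask names and parameters, of the two performance measures the abstract highlights: the probability of task success and the mean time for successful task performance.

```python
import random

# Hypothetical maintenance task broken into subtasks.
# Each subtask: (name, mean duration in minutes, std dev, probability of error-free execution).
# These numbers are illustrative only, not MAPPS data.
SUBTASKS = [
    ("isolate valve", 12.0, 3.0, 0.98),
    ("remove actuator", 25.0, 6.0, 0.95),
    ("replace seals", 30.0, 8.0, 0.92),
    ("reassemble and test", 20.0, 5.0, 0.97),
]

def simulate_task(trials=10_000, seed=1):
    """Monte Carlo estimate of task success probability and mean successful duration."""
    rng = random.Random(seed)
    successes, success_time = 0, 0.0
    for _ in range(trials):
        duration, failed = 0.0, False
        for _, mean, sd, p_ok in SUBTASKS:
            duration += max(0.0, rng.gauss(mean, sd))
            if rng.random() > p_ok:        # an error on any subtask fails the whole task
                failed = True
                break
        if not failed:
            successes += 1
            success_time += duration
    p_success = successes / trials
    mean_time = success_time / successes if successes else float("nan")
    return p_success, mean_time

if __name__ == "__main__":
    p, t = simulate_task()
    print(f"P(task success) ~ {p:.3f}, mean successful duration ~ {t:.1f} min")
```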

  5. MODELING, SIMULATION AND PERFORMANCE STUDY OF GRID-CONNECTED PHOTOVOLTAIC ENERGY SYSTEM

    OpenAIRE

    Nagendra K; Karthik J; Keerthi Rao C; Kumar Raja Pemmadi

    2017-01-01

    This paper presents modeling and simulation of a grid-connected photovoltaic energy system and a performance study using MATLAB/Simulink. The photovoltaic energy system is considered in three main parts: the PV model, the power conditioning system and the grid interface. The photovoltaic model is interconnected with the grid through full-scale power electronic devices. The simulation is conducted on the PV energy system at normal temperature and at constant load using MATLAB.

  6. Comprehensive Simulation Lifecycle Management for High Performance Computing Modeling and Simulation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — There are significant logistical barriers to entry-level high performance computing (HPC) modeling and simulation (M&S). IllinoisRocstar sets up the infrastructure for...

  7. Propagation of uncertainty in nasal spray in vitro performance models using Monte Carlo simulation: Part II. Error propagation during product performance modeling.

    Science.gov (United States)

    Guo, Changning; Doub, William H; Kauffman, John F

    2010-08-01

    Monte Carlo simulations were applied to investigate the propagation of uncertainty in both input variables and response measurements on model prediction for nasal spray product performance design of experiment (DOE) models in the first part of this study, under the initial assumption that the models perfectly represent the relationship between input variables and the measured responses. In this article, we discard that assumption and extend the Monte Carlo simulation study to examine the influence of both input variable variation and product performance measurement variation on the uncertainty in DOE model coefficients. The Monte Carlo simulations presented in this article illustrate the importance of careful error propagation during product performance modeling. Our results show that error estimates based on Monte Carlo simulation result in smaller model coefficient standard deviations than those from regression methods. This suggests that the estimated standard deviations from regression may overestimate the uncertainties in the model coefficients. Monte Carlo simulations provide a simple software solution to understand the propagation of uncertainty in complex DOE models so that the design space can be specified with statistically meaningful confidence levels. (c) 2010 Wiley-Liss, Inc. and the American Pharmacists Association
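
    A minimal sketch of the technique described, using synthetic data rather than the nasal spray measurements: input variables and responses are perturbed by assumed measurement uncertainties and the DOE regression is refit many times, so the spread of the refit coefficients estimates the propagated uncertainty. The factor names, noise levels, and coefficients are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative DOE-style data: two input factors and a noisy response.
# Values are synthetic; they stand in for spray inputs and performance measurements.
X_raw = rng.uniform(-1, 1, size=(30, 2))
true_beta = np.array([5.0, 1.2, -0.8])              # intercept, x1, x2 (assumed)
X = np.column_stack([np.ones(len(X_raw)), X_raw])
y = X @ true_beta + rng.normal(0, 0.3, size=len(X_raw))

def fit(Xd, yd):
    """Ordinary least squares fit of the DOE model coefficients."""
    return np.linalg.lstsq(Xd, yd, rcond=None)[0]

# Monte Carlo propagation: perturb inputs and responses by their assumed
# measurement uncertainties and refit the model each time.
sigma_x, sigma_y, n_mc = 0.02, 0.3, 5000
coefs = np.empty((n_mc, X.shape[1]))
for i in range(n_mc):
    X_pert = X.copy()
    X_pert[:, 1:] += rng.normal(0, sigma_x, size=X_raw.shape)
    y_pert = y + rng.normal(0, sigma_y, size=y.shape)
    coefs[i] = fit(X_pert, y_pert)

print("MC mean of coefficients:", coefs.mean(axis=0).round(3))
print("MC std of coefficients :", coefs.std(axis=0).round(3))
```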

  8. Simulation model of a twin-tail, high performance airplane

    Science.gov (United States)

    Buttrill, Carey S.; Arbuckle, P. Douglas; Hoffler, Keith D.

    1992-01-01

    The mathematical model and associated computer program to simulate a twin-tailed high performance fighter airplane (McDonnell Douglas F/A-18) are described. The simulation program is written in the Advanced Continuous Simulation Language. The simulation math model includes the nonlinear six-degree-of-freedom rigid-body equations, an engine model, sensors, and first-order actuators with rate and position limiting. A simplified form of the F/A-18 digital control laws (version 8.3.3) is implemented. The simulated control law includes only inner-loop augmentation in the up-and-away flight mode. The aerodynamic forces and moments are calculated from a wind-tunnel-derived database using table look-ups with linear interpolation. The aerodynamic database has an angle-of-attack range of -10 to +90 degrees and a sideslip range of -20 to +20 degrees. The effects of elastic deformation are incorporated in a quasi-static-elastic manner. Elastic degrees of freedom are not actively simulated. In the engine model, the throttle-commanded steady-state thrust level and the dynamic response characteristics of the engine are based on airflow rate as determined from a table look-up. Afterburner dynamics are switched in at a threshold based on the engine airflow and commanded thrust.

  9. Maintenance Personnel Performance Simulation (MAPPS) model: description of model content, structure, and sensitivity testing. Volume 2

    International Nuclear Information System (INIS)

    Siegel, A.I.; Bartter, W.D.; Wolf, J.J.; Knee, H.E.

    1984-12-01

    This volume of NUREG/CR-3626 presents details of the content, structure, and sensitivity testing of the Maintenance Personnel Performance Simulation (MAPPS) model that was described in summary in volume one of this report. The MAPPS model is a generalized stochastic computer simulation model developed to simulate the performance of maintenance personnel in nuclear power plants. The MAPPS model considers workplace, maintenance technician, motivation, human factors, and task-oriented variables to yield predictive information about the effects of these variables on successful maintenance task performance. All major model variables are discussed in detail, and their implementation and interactive effects are outlined. The model was examined for disqualifying defects from a number of viewpoints, including sensitivity testing. This examination led to the identification of some minor recalibration efforts, which were carried out. These positive results indicate that MAPPS is ready for initial and controlled applications that are in conformity with its purposes.

  10. Performance Evaluation of UML2-Modeled Embedded Streaming Applications with System-Level Simulation

    Directory of Open Access Journals (Sweden)

    Arpinen Tero

    2009-01-01

    This article presents an efficient method to capture an abstract performance model of streaming-data real-time embedded systems (RTESs). Unified Modeling Language version 2 (UML2) is used for the performance modeling and as a front-end for a tool framework that enables simulation-based performance evaluation and design-space exploration. The adopted application meta-model in UML resembles the Kahn Process Network (KPN) model and is targeted at simulation-based performance evaluation. The application workload modeling is done using UML2 activity diagrams, and the platform is described with structural UML2 diagrams and model elements. These concepts are defined using a subset of the profile for Modeling and Analysis of Real-Time and Embedded (MARTE) systems from OMG and custom stereotype extensions. The goal of the performance modeling and simulation is to achieve early estimates of task response times and of processing element, memory, and on-chip network utilizations, among other information that is used for design-space exploration. As a case study, a video codec application on multiple processors is modeled, evaluated, and explored. In comparison to related work, this is the first proposal that defines a transformation between UML activity diagrams and streaming-data application workload metamodels and successfully adopts it for RTES performance evaluation.

  11. Software life cycle dynamic simulation model: The organizational performance submodel

    Science.gov (United States)

    Tausworthe, Robert C.

    1985-01-01

    The submodel structure of a software life cycle dynamic simulation model is described. The software process is divided into seven phases, each with product, staff, and funding flows. The model is subdivided into an organizational response submodel, a management submodel, a management influence interface, and a model analyst interface. The concentration here is on the organizational response model, which simulates the performance characteristics of a software development subject to external and internal influences. These influences emanate from two sources: the model analyst interface, which configures the model to simulate the response of an implementing organization subject to its own internal influences, and the management submodel that exerts external dynamic control over the production process. A complete characterization is given of the organizational response submodel in the form of parameterized differential equations governing product, staffing, and funding levels. The parameter values and functions are allocated to the two interfaces.

  12. An accurate behavioral model for single-photon avalanche diode statistical performance simulation

    Science.gov (United States)

    Xu, Yue; Zhao, Tingchen; Li, Ding

    2018-01-01

    An accurate behavioral model is presented to simulate important statistical performance of single-photon avalanche diodes (SPADs), such as dark count and after-pulsing noise. The derived simulation model takes into account all important generation mechanisms of the two kinds of noise. For the first time, thermal agitation, trap-assisted tunneling and band-to-band tunneling mechanisms are simultaneously incorporated in the simulation model to evaluate the dark count behavior of SPADs fabricated in deep sub-micron CMOS technology. Meanwhile, a complete carrier trapping and de-trapping process is considered in the after-pulsing model, and a simple analytical expression is derived to estimate the after-pulsing probability. In particular, the key model parameters of avalanche triggering probability and the electric field dependence of excess bias voltage are extracted from Geiger-mode TCAD simulation, and this behavioral simulation model does not include any empirical parameters. The developed SPAD model is implemented in the Verilog-A behavioral hardware description language and successfully operated on the commercial Cadence Spectre simulator, showing good universality and compatibility. The model simulation results are in good accordance with the test data, validating the high simulation accuracy.

  13. Atomic scale simulations for improved CRUD and fuel performance modeling

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, Anders David Ragnar [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Cooper, Michael William Donald [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-01-06

    A more mechanistic description of fuel performance codes can be achieved by deriving models and parameters from atomistic scale simulations rather than fitting models empirically to experimental data. The same argument applies to modeling deposition of corrosion products on fuel rods (CRUD). Here are some results from publications in 2016 carried out using the CASL allocation at LANL.

  14. Surrogate model approach for improving the performance of reactive transport simulations

    Science.gov (United States)

    Jatnieks, Janis; De Lucia, Marco; Sips, Mike; Dransch, Doris

    2016-04-01

    Reactive transport models can serve a large number of important geoscientific applications involving underground resources in industry and scientific research. It is common for a simulation of reactive transport to consist of at least two coupled simulation models. The first is a hydrodynamics simulator that is responsible for simulating the flow of groundwater and the transport of solutes. Hydrodynamics simulators are well-established technology and can be very efficient; when hydrodynamics simulations are performed without coupled geochemistry, their spatial geometries can span millions of elements even when running on desktop workstations. The second is a geochemical simulation model that is coupled to the hydrodynamics simulator. Geochemical simulation models are much more computationally costly, which makes reactive transport simulations spanning millions of spatial elements very difficult to achieve. To address this problem we propose to replace the coupled geochemical simulation model with a surrogate model. A surrogate is a statistical model created to include only the necessary subset of simulator complexity for a particular scenario. To demonstrate the viability of such an approach we tested it on a popular reactive transport benchmark problem that involves 1D calcite transport. This is a published benchmark problem (Kolditz, 2012) for simulation models, and for this reason we use it to test the surrogate model approach. To do this we tried a number of statistical models available through the caret and DiceEval packages for R as surrogate models. These were trained on a randomly sampled subset of the input-output data from the geochemical simulation model used in the original reactive transport simulation. For validation we use the surrogate model to predict the simulator output using the part of the sampled input data that was not used for training the statistical model. For this scenario we find that the multivariate adaptive regression splines ...
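
    A minimal sketch of the surrogate idea under stated assumptions: the expensive geochemical step is replaced here by an invented stand-in function, and a scikit-learn regressor in Python stands in for the R caret/DiceEval models used by the authors. The surrogate is trained on a sampled subset of simulator input-output pairs and validated on held-out samples, mirroring the workflow described in the abstract.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(42)

def expensive_geochem_model(c_in, temperature):
    """Stand-in for a costly geochemical solver (purely illustrative)."""
    return np.tanh(3.0 * c_in) * np.exp(-0.01 * (temperature - 25.0) ** 2)

# Sample the simulator's input space and record its outputs.
n_samples = 2000
c_in = rng.uniform(0.0, 1.0, n_samples)
temp = rng.uniform(10.0, 60.0, n_samples)
X = np.column_stack([c_in, temp])
y = expensive_geochem_model(c_in, temp)

# Train a surrogate on part of the data, validate on the held-out rest.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
surrogate = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)

rmse = mean_squared_error(y_te, surrogate.predict(X_te)) ** 0.5
print(f"Surrogate validation RMSE: {rmse:.4f}")
```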

  15. Water desalination price from recent performances: Modelling, simulation and analysis

    International Nuclear Information System (INIS)

    Metaiche, M.; Kettab, A.

    2005-01-01

    The subject of the present article is the technical simulation of seawater desalination by a one-stage reverse osmosis system. The objectives are an updated valuation of the cost price through the use of new membrane and permeator performances, the use of new means of simulation and modelling of desalination parameters, and the identification of the main parameters influencing the cost price. We have taken as the simulation example the seawater desalting centre of Djannet (Boumerdes, Algeria). Present performances allow water desalting at a price of 0.5 $/m3, which is an interesting and promising price corresponding to a very acceptable product water quality, on the order of 269 ppm. It is important to run desalting systems by reverse osmosis under high pressure, resulting in a further decrease of the desalting cost and the production of good quality water. A poor choice of operating conditions produces high prices and unacceptable quality; however, there exists the possibility of decreasing the price by relaxing the requirement on product quality. The seawater temperature has an effect on the cost price and quality. The installation of big desalting centres contributes to the decrease in prices. A very long and tedious calculation is required, which is impossible to conduct without programming and computational tools. The use of the simulation model has been very efficient in the design of desalination centres that can perform at much improved prices. (author)

  16. Durango: Scalable Synthetic Workload Generation for Extreme-Scale Application Performance Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Carothers, Christopher D. [Rensselaer Polytechnic Institute (RPI); Meredith, Jeremy S. [ORNL; Blanco, Marc [Rensselaer Polytechnic Institute (RPI); Vetter, Jeffrey S. [ORNL; Mubarak, Misbah [Argonne National Laboratory; LaPre, Justin [Rensselaer Polytechnic Institute (RPI); Moore, Shirley V. [ORNL

    2017-05-01

    Performance modeling of extreme-scale applications on accurate representations of potential architectures is critical for designing next generation supercomputing systems, because it is impractical to construct prototype systems at scale with new network hardware in order to explore designs and policies. However, these simulations often rely on static application traces that can be difficult to work with because of their size and lack of flexibility to extend or scale up without rerunning the original application. To address this problem, we have created a new technique for generating scalable, flexible workloads from real applications, and we have implemented a prototype, called Durango, that combines a proven analytical performance modeling language, Aspen, with the massively parallel HPC network modeling capabilities of the CODES framework. Our models are compact, parameterized and representative of real applications with computation events. They are not resource intensive to create and are portable across simulator environments. We demonstrate the utility of Durango by simulating the LULESH application in the CODES simulation environment on several topologies and show that Durango is practical to use for simulation without loss of fidelity, as quantified by simulation metrics. During our validation of Durango's generated communication model of LULESH, we found that the original LULESH miniapp code had a latent bug where the MPI_Waitall operation was used incorrectly. This finding underscores the potential need for a tool such as Durango, beyond its benefits for flexible workload generation and modeling. Additionally, we demonstrate the efficacy of Durango's direct integration approach, which links Aspen into CODES as part of the running network simulation model. Here, Aspen generates the application-level computation timing events, which in turn drive the start of a network communication phase. Results show that Durango's performance scales well when ...

  17. Acoustic performance of industrial mufflers with CAE modeling and simulation

    Directory of Open Access Journals (Sweden)

    Jeon Soohong

    2014-12-01

    This paper investigates the noise transmission performance of industrial mufflers widely used in ships, based on CAE modeling and simulation. Since industrial mufflers have very complicated internal structures, the conventional Transfer Matrix Method (TMM) is of limited use. CAE modeling and simulation is therefore required, incorporating commercial software: CATIA for geometry modeling, MSC/PATRAN for FE meshing and LMS/SYSNOISE for analysis. The main sources of difficulty in this study arise from the complicated arrangement of reactive elements, perforated walls and absorption materials. The reactive elements and absorbent materials are modeled by applying boundary conditions given by impedance. The perforated walls are modeled by applying the transfer impedance on the duplicated node mesh. The CAE approach presented in this paper is verified by comparison with the theoretical solution of a concentric-tube resonator and is applied to industrial mufflers.

  19. Weather model performance in simulating extreme rainfall events over the western Iberian Peninsula

    Science.gov (United States)

    Pereira, S. C.; Carvalho, A. C.; Ferreira, J.; Nunes, J. P.; Kaiser, J. J.; Rocha, A.

    2012-08-01

    This study evaluates the performance of the WRF-ARW numerical weather model in simulating the spatial and temporal patterns of an extreme rainfall period over a complex orographic region in north-central Portugal. The analysis was performed for December 2009, during the Portugal mainland rainy season. The heavy to extremely heavy rainfall periods were due to several low surface pressure systems associated with frontal surfaces. The total amount of precipitation for December exceeded the 1971-2000 climatological mean by 89 mm on average, varying from 190 mm (south of the country) to 1175 mm (north of the country). Three model runs were conducted to assess possible improvements in model performance: (1) the WRF-ARW is forced with the initial fields from a global domain model (RunRef); (2) data assimilation for a specific location is included (RunObsN); (3) nudging is used to adjust the analysis field (RunGridN). Model performance was evaluated against an observed hourly precipitation dataset of 15 rainfall stations using several statistical parameters. The WRF-ARW model reproduced the temporal rainfall patterns well but tended to overestimate precipitation amounts. The RunGridN simulation provided the best results, but the performance of the other two runs was good too, so that the selected extreme rainfall episode was successfully reproduced.

  20. Science-Based Simulation Model of Human Performance for Human Reliability Analysis

    International Nuclear Information System (INIS)

    Kelly, Dana L.; Boring, Ronald L.; Mosleh, Ali; Smidts, Carol

    2011-01-01

    Human reliability analysis (HRA), a component of an integrated probabilistic risk assessment (PRA), is the means by which the human contribution to risk is assessed, both qualitatively and quantitatively. However, among the literally dozens of HRA methods that have been developed, most cannot fully model and quantify the types of errors that occurred at Three Mile Island. Furthermore, all of the methods lack a solid empirical basis, relying heavily on expert judgment or empirical results derived in non-reactor domains. Finally, all of the methods are essentially static, and are thus unable to capture the dynamics of an accident in progress. The objective of this work is to begin exploring a dynamic simulation approach to HRA, one whose models have a basis in psychological theories of human performance, and whose quantitative estimates have an empirical basis. This paper highlights a plan to formalize collaboration among the Idaho National Laboratory (INL), the University of Maryland, and The Ohio State University (OSU) to continue development of a simulation model initially formulated at the University of Maryland. Initial work will focus on enhancing the underlying human performance models with the most recent psychological research, and on planning follow-on studies to establish an empirical basis for the model, based on simulator experiments to be carried out at the INL and at the OSU.

  1. Performance of the general circulation models in simulating temperature and precipitation over Iran

    Science.gov (United States)

    Abbasian, Mohammadsadegh; Moghim, Sanaz; Abrishamchi, Ahmad

    2018-03-01

    General Circulation Models (GCMs) are advanced tools for impact assessment and climate change studies. Previous studies show that the performance of GCMs in simulating climate variables varies significantly over different regions. This study evaluates the performance of the Coupled Model Intercomparison Project phase 5 (CMIP5) GCMs in simulating temperature and precipitation over Iran. Simulations from 37 GCMs and observations from the Climatic Research Unit (CRU) were obtained for the period 1901-2005. Six measures of performance, namely mean bias, root mean square error (RMSE), Nash-Sutcliffe efficiency (NSE), linear correlation coefficient (r), the Kolmogorov-Smirnov statistic (KS) and Sen's slope estimator, together with the Taylor diagram, are used for the evaluation. GCMs are ranked based on each statistic at seasonal and annual time scales. Results show that most GCMs perform reasonably well in simulating the annual and seasonal temperature over Iran. The majority of the GCMs show poor skill in simulating precipitation, particularly at the seasonal scale. Based on the results, the best GCMs to represent temperature and precipitation simulations over Iran are the CMCC-CMS (Euro-Mediterranean Center on Climate Change) and the MRI-CGCM3 (Meteorological Research Institute), respectively. The results are valuable for climate and hydrometeorological studies and can help water resources planners and managers choose the proper GCM based on their criteria.
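
    A sketch of the evaluation statistics named in the abstract, computed for one simulated series against observations. The data below are synthetic, and applying Sen's slope to the error series is an assumption about how the trend statistic is used; SciPy's theilslopes and ks_2samp supply the estimators.

```python
import numpy as np
from scipy import stats

def evaluate_gcm(sim, obs):
    """Evaluation statistics for one model series against observations (1-D arrays over time)."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    bias = np.mean(sim - obs)
    rmse = np.sqrt(np.mean((sim - obs) ** 2))
    nse = 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)
    r = np.corrcoef(sim, obs)[0, 1]
    ks = stats.ks_2samp(sim, obs).statistic
    sen = stats.theilslopes(sim - obs, np.arange(len(sim)))[0]   # trend of the model error
    return {"bias": bias, "RMSE": rmse, "NSE": nse, "r": r, "KS": ks, "Sen": sen}

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    obs = 15 + 10 * np.sin(np.linspace(0, 20 * np.pi, 1260))   # synthetic monthly temperature
    sim = obs + rng.normal(0.5, 1.5, obs.size)                 # a biased, noisy "GCM"
    print(evaluate_gcm(sim, obs))
```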

  2. High Performance Electrical Modeling and Simulation Verification Test Suite - Tier I; TOPICAL

    International Nuclear Information System (INIS)

    SCHELLS, REGINA L.; BOGDAN, CAROLYN W.; WIX, STEVEN D.

    2001-01-01

    This document describes the High Performance Electrical Modeling and Simulation (HPEMS) Global Verification Test Suite (VERTS). The VERTS is a regression test suite used for verification of the electrical circuit simulation codes currently being developed by the HPEMS code development team. This document contains descriptions of the Tier I test cases

  3. Hybrid Building Performance Simulation Models for Industrial Energy Efficiency Applications

    Directory of Open Access Journals (Sweden)

    Peter Smolek

    2018-06-01

    In the challenge of achieving environmental sustainability, industrial production plants, as large contributors to the overall energy demand of a country, are prime candidates for applying energy efficiency measures. A modelling approach using cubes is used to decompose a production facility into manageable modules. All aspects of the facility are considered, classified into the building, energy system, production and logistics. This approach leads to specific challenges for building performance simulations since all parts of the facility are highly interconnected. To meet this challenge, models for the building, thermal zones, energy converters and energy grids are presented and the interfaces to the production and logistics equipment are illustrated. The advantages and limitations of the chosen approach are discussed. In an example implementation, the feasibility of the approach and models is shown. Different scenarios are simulated to highlight the models and the results are compared.

  4. CASTOR detector. Model, objectives and simulated performance

    International Nuclear Information System (INIS)

    Angelis, A. L. S.; Mavromanolakis, G.; Panagiotou, A. D.; Aslanoglou, X.; Nicolis, N.; Lobanov, M.; Erine, S.; Kharlov, Y. V.; Bogolyubsky, M. Y.; Kurepin, A. B.; Chileev, K.; Wlodarczyk, Z.

    2001-01-01

    A phenomenological model is presented describing the formation and evolution of a Centauro fireball in the baryon-rich region in nucleus-nucleus interactions in the upper atmosphere and at the LHC. The small particle multiplicity and the imbalance of electromagnetic and hadronic content characterizing a Centauro event, and also the strongly penetrating particles (assumed to be strangelets) frequently accompanying them, can be naturally explained. The CASTOR calorimeter is described, a subdetector of the ALICE experiment dedicated to the search for Centauros in the very forward, baryon-rich region of central Pb+Pb collisions at the LHC. The basic characteristics and simulated performance of the calorimeter are presented.

  5. The COD Model: Simulating Workgroup Performance

    Science.gov (United States)

    Biggiero, Lucio; Sevi, Enrico

    Though the question of the determinants of workgroup performance is one of the most central in organization science, precise theoretical frameworks and formal demonstrations are still missing. In order to fill this gap the COD agent-based simulation model is presented here and used to study the effects of task interdependence and bounded rationality on workgroup performance. The first relevant finding is an algorithmic demonstration of the ordering of interdependencies in terms of complexity, showing that the parallel mode is the simplest, followed by the sequential and then by the reciprocal. This result is far from new in organization science, but what is remarkable is that it now has the strength of an algorithmic demonstration instead of being based on the authoritativeness of some scholar or on some episodic empirical finding. The second important result is that the progressive introduction of realistic limits to agents' rationality dramatically reduces workgroup performance and leads to a rather interesting result: when agents' rationality is severely bounded, simple norms work better than complex norms. The third main finding is that when the complexity of interdependence is high, the appropriate coordination mechanism is agents' direct and active collaboration, which means teamwork.
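
    The toy below is not the COD model; it is a minimal illustration, with invented composition rules, of why reciprocal interdependence is harder to coordinate than sequential, and sequential than parallel, once each agent is fallible.

```python
import random

rng = random.Random(0)
N_AGENTS, N_RUNS, P_CORRECT = 4, 20_000, 0.9   # P_CORRECT: chance an agent handles its subtask correctly

def run_once(mode):
    """Toy composition rules for one work cycle (illustrative only, not the COD model)."""
    results = [rng.random() < P_CORRECT for _ in range(N_AGENTS)]
    if mode == "parallel":
        # independent subtasks: group output is the fraction done correctly
        return sum(results) / N_AGENTS
    if mode == "sequential":
        # each agent depends on the previous one: the first error stops useful work
        done = 0
        for ok in results:
            if not ok:
                break
            done += 1
        return done / N_AGENTS
    if mode == "reciprocal":
        # everyone depends on everyone: any error forces a full redo of the cycle
        return 1.0 if all(results) else 0.0
    raise ValueError(mode)

for mode in ("parallel", "sequential", "reciprocal"):
    mean_perf = sum(run_once(mode) for _ in range(N_RUNS)) / N_RUNS
    print(f"{mode:10s} mean performance ~ {mean_perf:.3f}")
```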

  6. Performance modeling & simulation of complex systems (A systems engineering design & analysis approach)

    Science.gov (United States)

    Hall, Laverne

    1995-01-01

    Modeling of the Multi-mission Image Processing System (MIPS) is described as an example of the use of a modeling tool to design a distributed system that supports multiple application scenarios. This paper examines: (a) modeling tool selection, capabilities, and operation (namely NETWORK 2.5 by CACI), (b) pointers for building or constructing a model and how the MIPS model was developed, (c) the importance of benchmarking or testing the performance of equipment/subsystems being considered for incorporation into the design/architecture, (d) the essential step of model validation and/or calibration using the benchmark results, (e) sample simulation results from the MIPS model, and (f) how modeling and simulation analysis affected the MIPS design process by having a supportive and informative impact.

  7. Modelling and simulating fire tube boiler performance

    DEFF Research Database (Denmark)

    Sørensen, K.; Condra, T.; Houbak, Niels

    2003-01-01

    A model for a flue gas boiler covering the flue gas and the water/steam side has been formulated. The model has been formulated as a number of sub-models that are merged into an overall model for the complete boiler. Sub-models have been defined for the furnace, the convection zone (split in two: a zone submerged in water and a zone covered by steam), a model for the material in the boiler (the steel), and two models for, respectively, the water/steam zone (the boiling) and the steam. The dynamic model has been developed as a number of Differential-Algebraic-Equation (DAE) systems. Subsequently, MatLab/Simulink has been applied for carrying out the simulations. To verify the simulated results, experiments have been carried out on a full-scale boiler plant.

  8. Shoulder Arthroscopy Simulator Training Improves Shoulder Arthroscopy Performance in a Cadaver Model

    Science.gov (United States)

    Henn, R. Frank; Shah, Neel; Warner, Jon J.P.; Gomoll, Andreas H.

    2013-01-01

    Purpose: The purpose of this study was to quantify the benefits of shoulder arthroscopy simulator training with a cadaver model of shoulder arthroscopy. Methods: Seventeen first-year medical students with no prior experience in shoulder arthroscopy were enrolled and completed this study. Each subject completed a baseline proctored arthroscopy on a cadaveric shoulder, which included controlling the camera and completing a standard series of tasks using the probe. The subjects were randomized, and nine of the subjects received training on a virtual reality simulator for shoulder arthroscopy. All subjects then repeated the same cadaveric arthroscopy. The arthroscopic videos were analyzed in a blinded fashion for time to task completion and subjective assessment of technical performance. The two groups were compared with Student's t tests, and change over time within groups was analyzed with paired t tests. Results: There were no observed differences between the two groups on the baseline evaluation. The simulator group improved significantly from baseline with respect to time to completion and subjective performance. Shoulder arthroscopy simulator training resulted in significant benefits in clinical shoulder arthroscopy time to task completion in this cadaver model. This study provides important additional evidence of the benefit of simulators in orthopaedic surgical training. Clinical Relevance: There may be a role for simulator training in shoulder arthroscopy education. PMID:23591380

  9. Shoulder arthroscopy simulator training improves shoulder arthroscopy performance in a cadaveric model.

    Science.gov (United States)

    Henn, R Frank; Shah, Neel; Warner, Jon J P; Gomoll, Andreas H

    2013-06-01

    The purpose of this study was to quantify the benefits of shoulder arthroscopy simulator training with a cadaveric model of shoulder arthroscopy. Seventeen first-year medical students with no prior experience in shoulder arthroscopy were enrolled and completed this study. Each subject completed a baseline proctored arthroscopy on a cadaveric shoulder, which included controlling the camera and completing a standard series of tasks using the probe. The subjects were randomized, and 9 of the subjects received training on a virtual reality simulator for shoulder arthroscopy. All subjects then repeated the same cadaveric arthroscopy. The arthroscopic videos were analyzed in a blinded fashion for time to task completion and subjective assessment of technical performance. The 2 groups were compared by use of Student t tests, and change over time within groups was analyzed with paired t tests. There were no observed differences between the 2 groups on the baseline evaluation. The simulator group improved significantly from baseline with respect to time to completion and subjective performance. Shoulder arthroscopy simulator training resulted in significant benefits in clinical shoulder arthroscopy time to task completion in this cadaveric model. This study provides important additional evidence of the benefit of simulators in orthopaedic surgical training. There may be a role for simulator training in shoulder arthroscopy education. Copyright © 2013 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.

  10. A New Model to Simulate Energy Performance of VRF Systems

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Tianzhen; Pang, Xiufeng; Schetrit, Oren; Wang, Liping; Kasahara, Shinichi; Yura, Yoshinori; Hinokuma, Ryohei

    2014-03-30

    This paper presents a new model to simulate the energy performance of variable refrigerant flow (VRF) systems in heat pump operation mode (either cooling or heating is provided, but not simultaneously). The main improvement of the new model is the introduction of the evaporating and condensing temperature in the indoor and outdoor unit capacity modifier functions. The independent variables in the capacity modifier functions of the existing VRF model in EnergyPlus are mainly room wet-bulb temperature and outdoor dry-bulb temperature in cooling mode, and room dry-bulb temperature and outdoor wet-bulb temperature in heating mode. The new approach allows compliance with different specifications of each indoor unit, so modeling accuracy is improved. The new VRF model was implemented in a custom version of EnergyPlus 7.2. This paper first describes the algorithm for the new VRF model, which is then used to simulate the energy performance of a VRF system in a Prototype House in California that complies with the requirements of Title 24, the California Building Energy Efficiency Standards. The VRF system performance is then compared with three other types of HVAC systems: the Title 24-2005 Baseline system, the traditional High Efficiency system, and the EnergyStar Heat Pump system, in three typical California climates: Sunnyvale, Pasadena and Fresno. Calculated energy savings from the VRF systems are significant. The HVAC site energy savings range from 51% to 85%, while the TDV (Time Dependent Valuation) energy savings range from 31% to 66% compared to the Title 24 Baseline systems across the three climates. The largest energy savings are in the Fresno climate, followed by Sunnyvale and Pasadena. The paper discusses various characteristics of the VRF systems contributing to the energy savings. It should be noted that these savings are calculated using the Title 24 prototype House D under standard operating conditions. Actual performance of the VRF systems for real ...
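
    A sketch of the capacity-modifier idea under stated assumptions: an EnergyPlus-style biquadratic performance curve, written here as a function of evaporating and condensing temperature as the new model proposes. The coefficients and rated capacity are invented for illustration, not taken from the paper or from manufacturer data.

```python
def biquadratic(x, y, c):
    """EnergyPlus-style biquadratic performance curve:
    f = c0 + c1*x + c2*x^2 + c3*y + c4*y^2 + c5*x*y"""
    return c[0] + c[1]*x + c[2]*x*x + c[3]*y + c[4]*y*y + c[5]*x*y

# Illustrative (not manufacturer) coefficients for a cooling-capacity modifier
# written in terms of evaporating and condensing temperature.
COEF_CAP = [0.80, 0.02, -0.0003, -0.004, -0.0001, 0.0002]
RATED_CAPACITY_KW = 10.0   # assumed indoor unit rating

def indoor_unit_capacity(t_evap_c, t_cond_c):
    """Available cooling capacity at the given refrigerant-side temperatures."""
    return RATED_CAPACITY_KW * biquadratic(t_evap_c, t_cond_c, COEF_CAP)

if __name__ == "__main__":
    for te, tc in [(6.0, 40.0), (10.0, 45.0), (4.0, 50.0)]:
        print(f"Tevap={te:4.1f} C, Tcond={tc:4.1f} C -> capacity ~ {indoor_unit_capacity(te, tc):.2f} kW")
```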

  11. Maintenance personnel performance simulation (MAPPS): a model for predicting maintenance performance reliability in nuclear power plants

    International Nuclear Information System (INIS)

    Knee, H.E.; Krois, P.A.; Haas, P.M.; Siegel, A.I.; Ryan, T.G.

    1983-01-01

    The NRC has developed a structured, quantitative, predictive methodology in the form of a computerized simulation model for assessing maintainer task performance. The objective of the overall program is to develop, validate, and disseminate a practical, useful, and acceptable methodology for the quantitative assessment of NPP maintenance personnel reliability. The program was organized into four phases: (1) scoping study, (2) model development, (3) model evaluation, and (4) model dissemination. The program is currently nearing completion of Phase 2, Model Development.

  12. Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC).

    Energy Technology Data Exchange (ETDEWEB)

    Schultz, Peter Andrew

    2011-12-01

    The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M&S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. Achieving the objective of modeling the performance of a disposal scenario requires describing processes involved in waste form degradation and radionuclide release at the subcontinuum scale, beginning with mechanistic descriptions of chemical reactions and chemical kinetics at the atomic scale, and upscaling into effective, validated constitutive models for input to high-fidelity continuum scale codes for coupled multiphysics simulations of release and transport. Verification and validation (V&V) is required throughout the system to establish evidence-based metrics for the level of confidence in M&S codes and capabilities, including at the subcontinuum scale and the constitutive models they inform or generate. This report outlines the nature of the V&V challenge at the subcontinuum scale, an approach to incorporate V&V concepts into subcontinuum scale modeling and simulation (M&S), and a plan to incrementally incorporate effective V&V into subcontinuum scale M&S destined for use in the NEAMS Waste IPSC work flow to meet requirements of quantitative confidence in the constitutive models informed by subcontinuum scale phenomena.

  13. Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC)

    International Nuclear Information System (INIS)

    Schultz, Peter Andrew

    2011-01-01

    The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M and S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. Achieving the objective of modeling the performance of a disposal scenario requires describing processes involved in waste form degradation and radionuclide release at the subcontinuum scale, beginning with mechanistic descriptions of chemical reactions and chemical kinetics at the atomic scale, and upscaling into effective, validated constitutive models for input to high-fidelity continuum scale codes for coupled multiphysics simulations of release and transport. Verification and validation (V and V) is required throughout the system to establish evidence-based metrics for the level of confidence in M and S codes and capabilities, including at the subcontinuum scale and the constitutive models they inform or generate. This report outlines the nature of the V and V challenge at the subcontinuum scale, an approach to incorporate V and V concepts into subcontinuum scale modeling and simulation (M and S), and a plan to incrementally incorporate effective V and V into subcontinuum scale M and S destined for use in the NEAMS Waste IPSC work flow to meet requirements of quantitative confidence in the constitutive models informed by subcontinuum scale phenomena.

  14. Photovoltaic array performance simulation models

    Energy Technology Data Exchange (ETDEWEB)

    Menicucci, D. F.

    1986-09-15

    The experience of the solar industry confirms that, despite recent cost reductions, the profitability of photovoltaic (PV) systems is often marginal and the configuration and sizing of a system is a critical problem for the design engineer. Construction and evaluation of experimental systems are expensive and seldom justifiable. A mathematical model or computer-simulation program is a desirable alternative, provided reliable results can be obtained. Sandia National Laboratories, Albuquerque (SNLA), has been studying PV-system modeling techniques in an effort to develop an effective tool to be used by engineers and architects in the design of cost-effective PV systems. This paper reviews two of the sources of error found in previous PV modeling programs, presents the remedies developed to correct these errors, and describes a new program that incorporates these improvements.
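
    As a hedged illustration of what a simple PV array performance model computes (this is not the Sandia model described above), the sketch below scales rated power by irradiance and corrects for cell temperature estimated with the standard NOCT relation; all parameter values are generic assumptions.

```python
def pv_array_power(irradiance, t_ambient, p_stc=5000.0, noct=45.0,
                   gamma=-0.004, g_stc=1000.0):
    """Very simple PV array output model (illustrative only).

    irradiance : plane-of-array irradiance, W/m^2
    t_ambient  : ambient temperature, deg C
    p_stc      : array rating at standard test conditions, W (assumed)
    noct       : nominal operating cell temperature, deg C (assumed)
    gamma      : power temperature coefficient, 1/deg C (assumed)
    """
    # Standard NOCT estimate of cell temperature.
    t_cell = t_ambient + (noct - 20.0) * irradiance / 800.0
    return p_stc * (irradiance / g_stc) * (1.0 + gamma * (t_cell - 25.0))

if __name__ == "__main__":
    for g, t in [(200, 10), (600, 20), (1000, 35)]:
        print(f"G={g:4d} W/m2, Ta={t:2d} C -> P ~ {pv_array_power(g, t):7.1f} W")
```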

  15. The Maintenance Personnel Performance Simulation (MAPPS) model: A human reliability analysis tool

    International Nuclear Information System (INIS)

    Knee, H.E.

    1985-01-01

    The Maintenance Personnel Performance Simulation (MAPPS) model is a computerized, stochastic, task-oriented human behavioral model developed to provide estimates of nuclear power plant (NPP) maintenance team performance measures. It is capable of addressing person-machine, person-environment, and person-person relationships, and accounts for interdependencies that exist between the subelements that make up the maintenance task of interest. The primary measures of performance estimated by MAPPS are: 1) the probability of successfully completing the task of interest and 2) the task duration time. MAPPS also estimates a host of other performance indices, including the probability of an undetected error, identification of the most- and least-likely error-prone subelements, and maintenance team stress profiles during task execution

  16. Simulating Performance Risk for Lighting Retrofit Decisions

    Directory of Open Access Journals (Sweden)

    Jia Hu

    2015-05-01

    In building retrofit projects, dynamic simulations are performed to simulate building performance. Uncertainty may negatively affect model calibration and predicted lighting energy savings, which increases the chance of default on performance-based contracts. Therefore, the aim of this paper is to develop a simulation-based method that can analyze lighting performance risk in lighting retrofit decisions. The method uses a surrogate model, which is constructed by adaptively selecting sample points and generating approximation surfaces with fast computing time; the surrogate model replaces the computation-intensive process. A statistical method is developed to generate extreme weather profiles based on 20 years of historical weather data. A stochastic occupancy model was created using actual occupancy data to generate realistic occupancy patterns. Energy usage of lighting and of heating, ventilation, and air conditioning (HVAC) is simulated using EnergyPlus. The method can evaluate the influence of different risk factors (e.g., variation of luminaire input wattage, varying weather conditions) on lighting and HVAC energy consumption and lighting electricity demand. Probability distributions are generated to quantify the risk values. A case study was conducted to demonstrate and validate the methods. The surrogate model is a good solution for quantifying the risk factors and the probability distribution of building performance.

  17. Improving firm performance in out-of-equilibrium, deregulated markets using feedback simulation models

    International Nuclear Information System (INIS)

    Gary, S.; Larsen, E.R.

    2000-01-01

    Deregulation has reshaped the utility sector in many countries around the world. Organisations in these deregulated industries must adopt new policies which guide strategic decisions, in an uncertain and unfamiliar environment, that determine the short- and long-term fate of their companies. Traditional economic equilibrium models do not adequately address the issues facing these organisations in the shift towards deregulated market competition. Equilibrium assumptions break down in the out-of-equilibrium transition to competitive markets, and therefore different underpinning assumptions must be adopted in order to guide management in these periods. Simulation models incorporating information feedback through behavioural policies fill the void left by equilibrium models and support strategic policy analysis in out-of-equilibrium markets. As an example, we present a feedback simulation model developed to examine firm- and industry-level performance consequences of new generation capacity investment policies in the deregulated UK electricity sector. The model explicitly captures the behavioural decision policies of boundedly rational managers and avoids equilibrium assumptions. Such models are essential to help managers evaluate the performance impact of various strategic policies in environments in which disequilibrium behaviour dominates. (Author)

  18. Comparison of Two Models for Damage Accumulation in Simulations of System Performance

    Energy Technology Data Exchange (ETDEWEB)

    Youngblood, R. [Idaho National Laboratory, Idaho Falls, ID (United States); Mandelli, D. [Idaho National Laboratory, Idaho Falls, ID (United States)

    2015-11-01

    A comprehensive simulation study of system performance needs to address variations in component behavior, variations in phenomenology, and the coupling between phenomenology and component failure. This paper discusses two models of this coupling: (1) damage accumulation is modeled as a random walk process in each time history, with component failure occurring when damage accumulation reaches a specified threshold; or (2) damage accumulation is modeled mechanistically within each time history, but failure occurs when damage reaches a time-history-specific threshold, sampled at time zero from each component's distribution of damage tolerance. A limiting case of the latter is classical discrete-event simulation, with component failure times sampled a priori from failure time distributions; but in such models, the failure times are not typically adjusted for operating conditions varying within a time history. Nowadays, as discussed below, it is practical to account for this. The paper compares the interpretations and computational aspects of the two models.
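
    A minimal sketch of the two damage-accumulation models under stated assumptions (threshold, increment, and tolerance parameters are invented): model 1 walks damage up stochastically until a fixed threshold is crossed, while model 2 accumulates damage deterministically but samples each component's damage tolerance once at time zero.

```python
import numpy as np

rng = np.random.default_rng(7)
N_HISTORIES, N_STEPS, THRESHOLD = 5000, 2000, 100.0

def failure_time_random_walk(mean_increment=0.3):
    """Model 1: stochastic damage increments, fixed failure threshold."""
    damage, t = 0.0, 0
    while damage < THRESHOLD and t < N_STEPS:
        damage += rng.exponential(mean_increment)   # damage only accumulates
        t += 1
    return t

def failure_time_sampled_threshold(rate=0.3):
    """Model 2: deterministic damage rate, tolerance sampled once per component."""
    tolerance = max(1.0, rng.normal(THRESHOLD, 15.0))
    return min(int(np.ceil(tolerance / rate)), N_STEPS)

t1 = np.array([failure_time_random_walk() for _ in range(N_HISTORIES)])
t2 = np.array([failure_time_sampled_threshold() for _ in range(N_HISTORIES)])
print("Random-walk model      : mean failure step", t1.mean().round(1), "std", t1.std().round(1))
print("Sampled-threshold model: mean failure step", t2.mean().round(1), "std", t2.std().round(1))
```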

  19. Maintenance personnel performance simulation (MAPPS) model: overview and evaluation efforts

    International Nuclear Information System (INIS)

    Knee, H.E.; Haas, P.M.; Siegel, A.I.; Bartter, W.D.; Wolf, J.J.; Ryan, T.G.

    1984-01-01

    The development of the MAPPS model has been completed and the model is currently undergoing evaluation. These efforts address a number of identified issues concerning practicality, acceptability, usefulness, and validity. Preliminary analysis of the evaluation data that has been collected indicates that MAPPS will provide comprehensive and reliable data for PRA purposes and for a number of other applications. The MAPPS computer simulation model provides the user with a sophisticated tool for gaining insights into tasks performed by NPP maintenance personnel. Its wide variety of input parameters and output data makes it extremely flexible for a number of diverse applications. With the demonstration of favorable model evaluation results, the MAPPS model will represent a valuable source of NPP maintainer reliability data and provide PRA studies with a source of data on maintainers that has previously not existed.

  20. Simulation modeling and arena

    CERN Document Server

    Rossetti, Manuel D

    2015-01-01

    Emphasizes a hands-on approach to learning statistical analysis and model building through the use of comprehensive examples, problem sets, and software applications. With a unique blend of theory and applications, Simulation Modeling and Arena®, Second Edition integrates coverage of statistical analysis and model building to emphasize the importance of both topics in simulation. Featuring introductory coverage of how simulation works and why it matters, the Second Edition expands coverage of static simulation and the applications of spreadsheets to perform simulation. The new edition also ...

  1. High-Performance Modeling of Carbon Dioxide Sequestration by Coupling Reservoir Simulation and Molecular Dynamics

    KAUST Repository

    Bao, Kai

    2015-10-26

    The present work describes a parallel computational framework for carbon dioxide (CO2) sequestration simulation by coupling reservoir simulation and molecular dynamics (MD) on massively parallel high-performance-computing (HPC) systems. In this framework, a parallel reservoir simulator, reservoir-simulation toolbox (RST), solves the flow and transport equations that describe the subsurface flow behavior, whereas the MD simulations are performed to provide the required physical parameters. Technologies from several different fields are used to make this novel coupled system work efficiently. One of the major applications of the framework is the modeling of large-scale CO2 sequestration for long-term storage in subsurface geological formations, such as depleted oil and gas reservoirs and deep saline aquifers, which has been proposed as one of the few attractive and practical solutions to reduce CO2 emissions and address the global-warming threat. Fine grids and accurate prediction of the properties of fluid mixtures under geological conditions are essential for accurate simulations. In this work, CO2 sequestration is presented as a first example for coupling reservoir simulation and MD, although the framework can be extended naturally to the full multiphase multicomponent compositional flow simulation to handle more complicated physical processes in the future. Accuracy and scalability analysis are performed on an IBM BlueGene/P and on an IBM BlueGene/Q, the latest IBM supercomputer. Results show good accuracy of our MD simulations compared with published data, and good scalability is observed with the massively parallel HPC systems. The performance and capacity of the proposed framework are well-demonstrated with several experiments with hundreds of millions to one billion cells. To the best of our knowledge, the present work represents the first attempt to couple reservoir simulation and molecular simulation for large-scale modeling. Because of the complexity of

  2. High-Performance Modeling and Simulation of Anchoring in Granular Media for NEO Applications

    Science.gov (United States)

    Quadrelli, Marco B.; Jain, Abhinandan; Negrut, Dan; Mazhar, Hammad

    2012-01-01

    NASA is interested in designing a spacecraft capable of visiting a near-Earth object (NEO), performing experiments, and then returning safely. Certain periods of this mission would require the spacecraft to remain stationary relative to the NEO, in an environment characterized by very low gravity levels; such situations require an anchoring mechanism that is compact, easy to deploy, and, upon mission completion, easy to remove. The design philosophy used in this task relies on the simulation capability of a high-performance multibody dynamics physics engine. On Earth, it is difficult to create low-gravity conditions, and testing in low-gravity environments, whether artificial or in space, can be costly and very difficult to achieve. Through simulation, the effect of gravity can be controlled with great accuracy, making simulation ideally suited to analyze the problem at hand. Using Chrono::Engine, a simulation package capable of utilizing massively parallel graphics processing unit (GPU) hardware, several validation experiments were performed. Modeling of the regolith interaction was carried out, after which the anchor penetration tests were performed and analyzed. The regolith was modeled by a granular medium composed of very large numbers of convex three-dimensional rigid bodies, subject to microgravity levels and interacting with each other through contact, friction, and cohesional forces. The multibody dynamics simulation approach used for simulating anchors penetrating a soil uses a differential variational inequality (DVI) methodology to solve the contact problem posed as a linear complementarity problem (LCP). Implemented within a GPU processing environment, collision detection is greatly accelerated compared to traditional central processing unit (CPU)-based collision detection. Hence, systems of millions of particles interacting with complex dynamic systems can be efficiently analyzed, and design recommendations can be made in a much shorter time.

  3. Thermodynamic simulation model for predicting the performance of spark ignition engines using biogas as fuel

    International Nuclear Information System (INIS)

    Nunes de Faria, Mário M.; Vargas Machuca Bueno, Juan P.; Ayad, Sami M.M. Elmassalami; Belchior, Carlos R. Pereira

    2017-01-01

    Highlights: • A 0-D model for performance prediction of SI ICE fueled with biogas is proposed. • Relative difference between simulated and experimental values was under 5%. • Can be adapted for different biogas compositions and operating ranges. • Could be a valuable tool for predicting trends and guiding experimentation. • Is suitable for use with biogas supplies in developing regions. - Abstract: Biogas has found its way from developing countries and is now an alternative to fossil fuels in internal combustion engines, with the advantage of lower greenhouse gas emissions. However, its use in gas engines requires engine modifications or adaptations that may be costly. This paper reports the results of experimental performance and emissions tests of an engine-generator unit fueled with biogas produced in a sewage plant in Brazil, operating under different loads, and with suitable engine modifications. These emissions and performance results were in agreement with the literature, and it was confirmed that the penalties to engine performance were more significant than the emission reductions in the operating range tested. Furthermore, a zero-dimensional simulation model was employed to predict performance characteristics. A differential thermodynamic equation system was solved, obtaining the pressure inside the cylinder as a function of the crank angle for different engine conditions. Mean effective pressure and indicated power were also obtained. The results of simulation and experimental tests of the engine in similar conditions were compared and the model validated. Although several simplifying assumptions were adopted and empirical correlations were used for the Wiebe function, the model was adequate in predicting engine performance, as the relative difference between simulated and experimental values was lower than 5%. The model can be adapted for use with different raw or enriched biogas compositions and could prove to be a valuable tool to guide
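
    For reference, zero-dimensional models of this kind commonly represent the burn rate with a Wiebe function; a minimal sketch with generic default parameters (not the values calibrated in the paper) is shown below.

```python
import math

def wiebe_burn_fraction(theta, theta_start, duration, a=5.0, m=2.0):
    """Cumulative mass fraction burned versus crank angle (degrees).
    a and m are empirical Wiebe parameters; the values here are common defaults."""
    if theta < theta_start:
        return 0.0
    x = min((theta - theta_start) / duration, 1.0)
    return 1.0 - math.exp(-a * x ** (m + 1))

# Example: combustion starting 10 degrees before TDC and lasting 60 degrees of crank angle.
for theta in range(-20, 61, 10):
    print(theta, round(wiebe_burn_fraction(theta, theta_start=-10.0, duration=60.0), 3))
```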

  4. Manufacturing plant performance evaluation by discrete event simulation

    International Nuclear Information System (INIS)

    Rosli Darmawan; Mohd Rasid Osman; Rosnah Mohd Yusuff; Napsiah Ismail; Zulkiflie Leman

    2002-01-01

    A case study was conducted to evaluate the performance of a manufacturing plant using the discrete event simulation technique. The study was carried out on an animal feed production plant, the Sterifeed plant at the Malaysian Institute for Nuclear Technology Research (MINT), Selangor, Malaysia. The plant was modelled based on the actual manufacturing activities recorded by the operators. The simulation was carried out using a discrete event simulation software package. The model was validated by comparing the simulation results with the actual operational data of the plant. The simulation results show some weaknesses in the current plant design, and proposals were made to improve the plant performance. (Author)
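
    As a generic illustration of the discrete-event technique used in such studies (the plant model itself is not reproduced here), a single-server station can be simulated in a few lines; the arrival and service rates below are invented.

```python
import random

def simulate_single_server(arrival_rate, service_rate, n_jobs=50_000, seed=1):
    """Event-driven FIFO single-server queue; returns the mean time a job spends in the system."""
    rng = random.Random(seed)
    t = 0.0                      # current arrival time
    server_free_at = 0.0         # time the server next becomes idle
    total_time = 0.0
    for _ in range(n_jobs):
        t += rng.expovariate(arrival_rate)       # next job arrives
        start = max(t, server_free_at)           # wait if the server is busy
        server_free_at = start + rng.expovariate(service_rate)
        total_time += server_free_at - t         # waiting time + service time
    return total_time / n_jobs

# Utilisation 0.8: M/M/1 theory gives a mean time in system of 1 / (mu - lambda) = 5.
print(simulate_single_server(arrival_rate=0.8, service_rate=1.0))
```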

  5. Maintenance Personnel Performance Simulation (MAPPS) model: a human reliability analysis tool

    International Nuclear Information System (INIS)

    Knee, H.E.

    1985-01-01

    The Maintenance Personnel Performance Simulation (MAPPS) model is a computerized, stochastic, task-oriented human behavioral model developed to provide estimates of nuclear power plant (NPP) maintenance team performance measures. It is capable of addressing person-machine, person-environment, and person-person relationships, and accounts for interdependencies that exist between the subelements that make up the maintenance task of interest. The primary measures of performance estimated by MAPPS are: (1) the probability of successfully completing the task of interest; and (2) the task duration time. MAPPS also estimates a host of other performance indices, including the probability of an undetected error, identification of the most- and least-likely error-prone subelements, and maintenance team stress profiles during task execution. The MAPPS model was subjected to a number of evaluation efforts that focused upon its practicality, acceptability, usefulness, and validity. Methods used for these efforts included a case method approach, consensus estimation, and comparison with observed task performance measures at a NPP. Favorable results, such as close agreement between task duration times for two tasks observed in the field (67.0 and 119.8 minutes, respectively), and estimates by MAPPS (72.0 and 124.0 minutes, respectively) enhance the confidence in the future use of MAPPS. 8 refs., 1 fig

  6. A Dynamic Simulation Model of Organizational Culture and Business Strategy Effects on Performance

    Science.gov (United States)

    Trivellas, Panagiotis; Reklitis, Panagiotis; Konstantopoulos, Nikolaos

    2007-12-01

    In the past two decades, the organizational culture literature has gained tremendous interest among both academics and practitioners. This is based not only on the suggestion that culture is related to performance, but also on the view that it is subject to direct managerial control and can be manipulated in the desired direction. In the present paper, we adopt the Competing Values Framework (CVF) to operationalise organizational culture and Porter's typology to conceptualize business strategy (cost leadership, innovative and marketing differentiation, and focus). Although simulating social phenomena is a difficult task, since so many considerations (not all well understood) are involved, in the present study we developed a dynamic model to simulate the effects of organizational culture and strategy on financial performance. Data obtained from a six-year survey in the banking sector of a European developing economy were used to develop the proposed dynamic model.

  7. Teamwork skills, shared mental models, and performance in simulated trauma teams: an independent group design

    Directory of Open Access Journals (Sweden)

    Westli Heidi

    2010-08-01

    Full Text Available Abstract Background Non-technical skills are seen as an important contributor to reducing adverse events and improving medical management in healthcare teams. Previous research on the effectiveness of teams has suggested that shared mental models facilitate coordination and team performance. The purpose of the study was to investigate whether demonstrated teamwork skills and behaviour indicating shared mental models would be associated with observed improved medical management in trauma team simulations. Methods Revised versions of the 'Anesthetists' Non-Technical Skills Behavioural marker system' and 'Anti-Air Teamwork Observation Measure' were field tested in moment-to-moment observation of 27 trauma team simulations in Norwegian hospitals. Independent subject matter experts rated medical management in the teams. An independent group design was used to explore differences in teamwork skills between higher-performing and lower-performing teams. Results Specific teamwork skills and behavioural markers were associated with indicators of good team performance. Higher- and lower-performing teams differed in information exchange, supporting behaviour and communication, with higher-performing teams showing more effective information exchange and communication, and fewer supporting behaviours. Behavioural markers of shared mental models predicted effective medical management better than teamwork skills. Conclusions The present study replicates and extends previous research by providing new empirical evidence of the significance of specific teamwork skills and a shared mental model for the effective medical management of trauma teams. In addition, the study underlines the generic nature of teamwork skills by demonstrating their transferability from different clinical simulations, such as the anaesthesia environment, to trauma care, as well as the potential usefulness of behavioural frequency analysis in future research on non-technical skills.

  8. High Performance Electrical Modeling and Simulation Software Normal Environment Verification and Validation Plan, Version 1.0; TOPICAL

    International Nuclear Information System (INIS)

    WIX, STEVEN D.; BOGDAN, CAROLYN W.; MARCHIONDO JR., JULIO P.; DEVENEY, MICHAEL F.; NUNEZ, ALBERT V.

    2002-01-01

    The requirements in modeling and simulation are driven by two fundamental changes in the nuclear weapons landscape: (1) The Comprehensive Test Ban Treaty and (2) The Stockpile Life Extension Program which extends weapon lifetimes well beyond their originally anticipated field lifetimes. The move from confidence based on nuclear testing to confidence based on predictive simulation forces a profound change in the performance asked of codes. The scope of this document is to improve the confidence in the computational results by demonstration and documentation of the predictive capability of electrical circuit codes and the underlying conceptual, mathematical and numerical models as applied to a specific stockpile driver. This document describes the High Performance Electrical Modeling and Simulation software normal environment Verification and Validation Plan

  9. Visualization and Analysis of Climate Simulation Performance Data

    Science.gov (United States)

    Röber, Niklas; Adamidis, Panagiotis; Behrens, Jörg

    2015-04-01

    Visualization is the key process of transforming abstract (scientific) data into a graphical representation, to aid in the understanding of the information hidden within the data. Climate simulation data sets are typically quite large, time varying, and consist of many different variables sampled on an underlying grid. A large variety of climate models - and sub-models - exist to simulate various aspects of the climate system. Generally, one is mainly interested in the physical variables produced by the simulation runs, but model developers are also interested in performance data measured along with these simulations. Climate simulation models are carefully developed complex software systems, designed to run in parallel on large HPC systems. An important goal thereby is to utilize the entire hardware as efficiently as possible, that is, to distribute the workload as evenly as possible among the individual components. This is a very challenging task, and detailed performance data, such as timings, cache misses etc., have to be used to locate and understand performance problems in order to optimize the model implementation. Furthermore, the correlation of performance data to the processes of the application and the sub-domains of the decomposed underlying grid is vital when addressing communication and load imbalance issues. High resolution climate simulations are carried out on tens to hundreds of thousands of cores, thus yielding a vast amount of profiling data, which cannot be analyzed without appropriate visualization techniques. This PICO presentation displays and discusses the ICON simulation model, which is jointly developed by the Max Planck Institute for Meteorology and the German Weather Service, in partnership with DKRZ. The visualization and analysis of the model's performance data allow us to optimize and fine-tune the model, as well as to understand its execution on the HPC system. We show and discuss our workflow, as well as present new ideas and

  10. Performance and Uncertainty Evaluation of Snow Models on Snowmelt Flow Simulations over a Nordic Catchment (Mistassibi, Canada

    Directory of Open Access Journals (Sweden)

    Magali Troin

    2015-11-01

    Full Text Available An analysis of hydrological response to a multi-model approach based on an ensemble of seven snow models (SM; degree-day and mixed degree-day/energy balance models) coupled with three hydrological models (HM) is presented for a snowmelt-dominated basin in Canada. The present study aims to compare the performance and the reliability of different types of SM-HM combinations at simulating snowmelt flows over the 1961–2000 historical period. The multi-model approach also allows evaluating the uncertainties associated with the structure of the SM-HM ensemble to better predict river flows in Nordic environments. The 20-year calibration shows a satisfactory performance of the ensemble of 21 SM-HM combinations at simulating daily discharges and snow water equivalents (SWEs), with low streamflow volume biases. The validation of the ensemble of 21 SM-HM combinations is conducted over a 20-year period. Performances are similar to the calibration in simulating the daily discharges and SWEs, again with low model biases for streamflow. The spring-snowmelt-generated peak flow is captured only in timing by the ensemble of 21 SM-HM combinations. The results for specific hydrologic indicators show that the uncertainty related to the choice of HM in the SM-HM combinations cannot be neglected when simulating snowmelt flows. The selection of the SM plays a larger role than the choice of the SM approach (degree-day versus mixed degree-day/energy balance) in simulating spring flows. Overall, the snow models contribute a low degree of uncertainty to the total uncertainty in hydrological modeling for snow hydrology studies.
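
    The degree-day family of snow models referred to above is simple enough to sketch directly; the melt factor and thresholds below are illustrative defaults, not values calibrated in the study.

```python
def degree_day_snow(temps_c, precips_mm, melt_factor=3.0, t_melt=0.0, t_snow=1.0):
    """Track snow water equivalent (SWE, mm) and daily melt with a degree-day model.
    melt_factor is in mm per degree C per day; thresholds are illustrative defaults."""
    swe, melt_series = 0.0, []
    for t, p in zip(temps_c, precips_mm):
        if t <= t_snow:                       # precipitation accumulates as snow
            swe += p
        melt = min(swe, melt_factor * max(t - t_melt, 0.0))
        swe -= melt
        melt_series.append(melt)
    return melt_series

# Five illustrative days: cold and snowy, then a warm spell releasing the snowpack.
print(degree_day_snow(temps_c=[-5, -2, 1, 4, 6], precips_mm=[10, 5, 0, 0, 0]))
```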

  11. Simulation and performance of brushless DC motor actuators

    OpenAIRE

    Gerba, Alex

    1985-01-01

    The simulation model for a Brushless D.C. Motor and the associated commutation power conditioner transistor model are presented. The necessary conditions for maximum power output while operating at steady-state speed and sinusoidally distributed air-gap flux are developed. Comparisons of the simulated model with the measured performance of a typical motor are made both on time response waveforms and on average performance characteristics. These preliminary results indicate good ...

  12. Simulation and performance of brushless dc motor actuators

    Science.gov (United States)

    Gerba, A., Jr.

    1985-12-01

    The simulation model for a Brushless D.C. Motor and the associated commutation power conditioner transistor model are presented. The necessary conditions for maximum power output while operating at steady-state speed and sinusoidally distributed air-gap flux are developed. Comparisons of the simulated model with the measured performance of a typical motor are made both on time response waveforms and on average performance characteristics. These preliminary results indicate good agreement. Plans for model improvement and testing of a motor-driven positioning device for model evaluation are outlined.

  13. Comparison of the development of performance skills in ultrasound-guided regional anesthesia simulations with different phantom models.

    Science.gov (United States)

    Liu, Yang; Glass, Nancy L; Glover, Chris D; Power, Robert W; Watcha, Mehernoor F

    2013-12-01

    Ultrasound-guided regional anesthesia (UGRA) skills are traditionally obtained by supervised performance on patients, but practice on phantom models improves success. Currently available models are expensive or use perishable products, for example, olive-in-chicken breasts (OCB). We constructed 2 inexpensive phantom (transparent and opaque) models with readily available nonperishable products and compared the process of learning UGRA skills by novice practitioners on these models with the OCB model. Three experts first established criteria for a satisfactory completion of the simulated UGRA task in the 3 models. Thirty-six novice trainees (simulations was accomplished. The number of errors, needle passes, and time for task completion per attempt progressively decreased in all 3 groups. However, failure to identify the target and to visualize the needle on the ultrasound image occurred more frequently with the OCB model. The time to complete simulator training was shortest with the transparent model, owing to shorter target identification times. However, trainees were less likely to agree strongly that this model was realistic for teaching UGRA skills. Training on inexpensive synthetic simulation models with no perishable products permits novices to learn UGRA skills. The OCB model has the disadvantages that it contains potentially infective material, requires refrigeration, cannot be used after multiple needle punctures, and is associated with more failures during simulated UGRA. Direct visualization of the target in the transparent model allows the trainee to focus on needle insertion skills, but the opaque model may be more realistic for learning the target identification skills required when UGRA is performed on real patients in the operating room.

  14. High-performance modeling of CO2 sequestration by coupling reservoir simulation and molecular dynamics

    KAUST Repository

    Bao, Kai

    2013-01-01

    The present work describes a parallel computational framework for CO2 sequestration simulation by coupling reservoir simulation and molecular dynamics (MD) on massively parallel HPC systems. In this framework, a parallel reservoir simulator, the Reservoir Simulation Toolbox (RST), solves the flow and transport equations that describe the subsurface flow behavior, while the molecular dynamics simulations are performed to provide the required physical parameters. Numerous technologies from different fields are employed to make this novel coupled system work efficiently. One of the major applications of the framework is the modeling of large-scale CO2 sequestration for long-term storage in subsurface geological formations, such as depleted reservoirs and deep saline aquifers, which has been proposed as one of the most attractive and practical solutions to reduce CO2 emissions and address the global-warming threat. To solve such problems effectively, fine grids and accurate prediction of the properties of fluid mixtures are essential for accuracy. In this work, CO2 sequestration is presented as our first example of coupling reservoir simulation and molecular dynamics, while the framework can be extended naturally to the full multiphase multicomponent compositional flow simulation to handle more complicated physical processes in the future. Accuracy and scalability analyses are performed on an IBM BlueGene/P and on an IBM BlueGene/Q, the latest IBM supercomputer. Results show good accuracy of our MD simulations compared with published data, and good scalability is observed with the massively parallel HPC systems. The performance and capacity of the proposed framework are well demonstrated with several experiments with hundreds of millions to a billion cells. To the best of our knowledge, this work represents the first attempt to couple reservoir simulation and molecular simulation for large-scale modeling. Due to the complexity of the subsurface systems

  15. A Simulation Approach for Performance Validation during Embedded Systems Design

    Science.gov (United States)

    Wang, Zhonglei; Haberl, Wolfgang; Herkersdorf, Andreas; Wechs, Martin

    Due to the time-to-market pressure, it is highly desirable to design hardware and software of embedded systems in parallel. However, hardware and software are developed mostly using very different methods, so that performance evaluation and validation of the whole system is not an easy task. In this paper, we propose a simulation approach to bridge the gap between model-driven software development and simulation based hardware design, by merging hardware and software models into a SystemC based simulation environment. An automated procedure has been established to generate software simulation models from formal models, while the hardware design is originally modeled in SystemC. As the simulation models are annotated with timing information, performance issues are tackled in the same pass as system functionality, rather than in a dedicated approach.

  16. Battery Performance Modelling and Simulation: a Neural Network Based Approach

    Science.gov (United States)

    Ottavianelli, Giuseppe; Donati, Alessandro

    2002-01-01

    This project developed against the background of ongoing research within the Control Technology Unit (TOS-OSC) of the Special Projects Division at the European Space Operations Centre (ESOC) of the European Space Agency. The purpose of this research is to develop and validate an Artificial Neural Network (ANN) tool able to model, simulate and predict the Cluster II battery system's performance degradation. (The Cluster II mission consists of four spacecraft flying in tetrahedral formation, aimed at observing and studying the interaction between the Sun and the Earth by passing in and out of our planet's magnetic field.) This prototype tool, named BAPER and developed with a commercial neural network toolbox, could be used to support short- and medium-term mission planning in order to improve and maximise the batteries' lifetime, determining the future best charge/discharge cycles for the batteries given their present states, in view of a Cluster II mission extension. This study focuses on the five Silver-Cadmium batteries onboard Tango, the fourth Cluster II satellite, but time constraints have so far allowed an assessment of only the first battery. In their most basic form, ANNs are hyper-dimensional curve fits for non-linear data. With their remarkable ability to derive meaning from complicated or imprecise history data, ANNs can be used to extract patterns and detect trends that are too complex to be noticed by either humans or other computer techniques. ANNs learn by example, and this is why they can be described as inductive, or data-based, models for the simulation of input/target mappings. A trained ANN can be thought of as an "expert" in the category of information it has been given to analyse, and this expert can then be used, as in this project, to provide projections given new situations of interest and answer "what if" questions. The most appropriate algorithm, in terms of training speed and memory storage requirements, is clearly the Levenberg
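
    As a generic illustration of the data-based approach described (not the BAPER tool itself), a small feed-forward regressor can be fitted to historical charge/discharge features; the feature set and numbers below are entirely invented.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Hypothetical features: [cycle number, depth of discharge, mean temperature (C)]
X = np.array([[10, 0.5, 20], [200, 0.6, 22], [500, 0.7, 25],
              [800, 0.6, 24], [1200, 0.8, 27], [1500, 0.7, 26]], dtype=float)
# Hypothetical target: remaining usable capacity as a fraction of nominal capacity.
y = np.array([0.99, 0.96, 0.91, 0.87, 0.80, 0.76])

model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=0)
model.fit(X, y)
print(model.predict([[1000, 0.7, 25]]))  # projected capacity for a hypothetical future state
```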

  17. Simulation in Complex Modelling

    DEFF Research Database (Denmark)

    Nicholas, Paul; Ramsgaard Thomsen, Mette; Tamke, Martin

    2017-01-01

    This paper will discuss the role of simulation in extended architectural design modelling. As a framing paper, the aim is to present and discuss the role of integrated design simulation and feedback between design and simulation in a series of projects under the Complex Modelling framework. Complex...... performance, engage with high degrees of interdependency and allow the emergence of design agency and feedback between the multiple scales of architectural construction. This paper presents examples for integrated design simulation from a series of projects including Lace Wall, A Bridge Too Far and Inflated...... Restraint developed for the research exhibition Complex Modelling, Meldahls Smedie Gallery, Copenhagen in 2016. Where the direct project aims and outcomes have been reported elsewhere, the aim for this paper is to discuss overarching strategies for working with design integrated simulation....

  18. Numerical simulation of Higgs models

    International Nuclear Information System (INIS)

    Jaster, A.

    1995-10-01

    The SU(2) Higgs model and the Schwinger model on the lattice were analysed. Numerical simulations of the SU(2) Higgs model were performed to study the finite-temperature electroweak phase transition. With the help of the multicanonical method, the distribution of an order parameter at the phase transition point was measured. This was used to obtain the order of the phase transition and the value of the interface tension with the histogram method. Numerical simulations were also performed at zero temperature to carry out the renormalization. The measured values of the Wilson loops were used to determine the static potential and, from this, the renormalized gauge coupling. The Schwinger model was simulated at different gauge couplings to analyse the properties of the Kaplan-Shamir fermions. The prediction that the mass parameter receives only multiplicative renormalization was tested and verified. (orig.)

  19. A VRLA battery simulation model

    International Nuclear Information System (INIS)

    Pascoe, Phillip E.; Anbuky, Adnan H.

    2004-01-01

    A valve regulated lead acid (VRLA) battery simulation model is an invaluable tool for the standby power system engineer. The obvious use for such a model is to allow the assessment of battery performance. This may involve determining the influence of cells suffering from state of health (SOH) degradation on the performance of the entire string, or the running of test scenarios to ascertain the most suitable battery size for the application. In addition, it enables the engineer to assess the performance of the overall power system. This includes, for example, running test scenarios to determine the benefits of various load shedding schemes. It also allows the assessment of other power system components, either for determining their requirements and/or vulnerabilities. Finally, a VRLA battery simulation model is vital as a stand alone tool for educational purposes. Despite the fundamentals of the VRLA battery having been established for over 100 years, its operating behaviour is often poorly understood. An accurate simulation model enables the engineer to gain a better understanding of VRLA battery behaviour. A system level multipurpose VRLA battery simulation model is presented. It allows an arbitrary battery (capacity, SOH, number of cells and number of strings) to be simulated under arbitrary operating conditions (discharge rate, ambient temperature, end voltage, charge rate and initial state of charge). The model accurately reflects the VRLA battery discharge and recharge behaviour. This includes the complex start of discharge region known as the coup de fouet

  20. Contribution to the modelling and analysis of logistics system performance by Petri nets and simulation models: Application in a supply chain

    Science.gov (United States)

    Azougagh, Yassine; Benhida, Khalid; Elfezazi, Said

    2016-02-01

    In this paper, the focus is on studying the performance of complex systems in a supply chain context by developing a structured modelling approach based on the ASDI methodology (Analysis, Specification, Design and Implementation), combining modelling by Petri nets with simulation using ARENA. The linear approach typically followed for this kind of problem faces modelling difficulties due to the complexity and the number of parameters of concern. The approach used in this work therefore structures the modelling in a way that covers all aspects of the performance study. The structured modelling approach is first introduced before being applied to the case of an industrial system in the field of phosphate. The performance indicators obtained from the models developed made it possible to test the behaviour and fluctuations of this system and to develop improved models of the current situation. In addition, this paper shows how the ARENA software can be adopted to simulate complex systems effectively. The method in this research can be applied to investigate various improvement scenarios and their consequences before implementing them in reality.
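
    A token-game interpreter for a small Petri net, the kind of model combined here with ARENA simulation, can be sketched as follows; the places and transitions are invented for illustration rather than taken from the phosphate case study.

```python
# Marking: tokens per place; a transition consumes one token from each input place
# and produces one token in each output place.
marking = {"raw": 3, "machine_free": 1, "in_process": 0, "finished": 0}

transitions = {
    "start_job": {"in": ["raw", "machine_free"], "out": ["in_process"]},
    "end_job":   {"in": ["in_process"],          "out": ["finished", "machine_free"]},
}

def enabled(name):
    return all(marking[p] > 0 for p in transitions[name]["in"])

def fire(name):
    for p in transitions[name]["in"]:
        marking[p] -= 1
    for p in transitions[name]["out"]:
        marking[p] += 1

# Play the token game until no transition is enabled (deadlock / completion).
while any(enabled(t) for t in transitions):
    fire(next(t for t in transitions if enabled(t)))

print(marking)  # {'raw': 0, 'machine_free': 1, 'in_process': 0, 'finished': 3}
```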

  1. Using queuing theory and simulation model to optimize hospital pharmacy performance.

    Science.gov (United States)

    Bahadori, Mohammadkarim; Mohammadnejhad, Seyed Mohsen; Ravangard, Ramin; Teymourzadeh, Ehsan

    2014-03-01

    Hospital pharmacy is responsible for controlling and monitoring the medication use process and ensures timely access to safe, effective and economical use of drugs and medicines for patients and hospital staff. This study aimed to optimize the management of the studied outpatient pharmacy by developing a suitable queuing theory and simulation technique. A descriptive-analytical study was conducted in a military hospital in Tehran, Iran, in 2013. A sample of 220 patients referred to the outpatient pharmacy of the hospital in two shifts, morning and evening, was selected to collect the necessary data to determine the arrival rate, service rate, and other data needed to calculate the patient flow and queuing network performance variables. After the initial analysis of collected data using the software SPSS 18, the pharmacy queuing network performance indicators were calculated for both shifts. Then, based on the collected data and to provide appropriate solutions, the queuing system of the current situation for both shifts was modeled and simulated using the software ARENA 12, and 4 scenarios were explored. Results showed that the queue characteristics of the studied pharmacy during the situation analysis were very undesirable in both morning and evening shifts. The average numbers of patients in the pharmacy were 19.21 and 14.66 in the morning and evening, respectively. The average times spent in the system by clients were 39 minutes in the morning and 35 minutes in the evening. The system utilization in the morning and evening was, respectively, 25% and 21%. The simulation results showed that reducing the staff in the morning from 2 to 1 at the prescription-receiving stage did not change the queue performance indicators. Adding one staff member at the prescription-filling stage decreased the average queue length by 10 persons and the average waiting time by 18 minutes and 14 seconds. On the other hand, simulation results showed that in the evening, decreasing the staff
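
    For context, the steady-state M/M/c formulas behind such queue indicators can be computed directly; the arrival rate, service rate and number of servers below are illustrative and are not the hospital's measured values.

```python
from math import factorial

def mmc_metrics(lam, mu, c):
    """Steady-state M/M/c metrics: utilisation, mean queue length Lq, mean wait Wq."""
    rho = lam / (c * mu)
    assert rho < 1, "the system must be stable"
    a = lam / mu
    p0 = 1.0 / (sum(a**n / factorial(n) for n in range(c)) +
                a**c / (factorial(c) * (1 - rho)))
    lq = p0 * a**c * rho / (factorial(c) * (1 - rho) ** 2)
    return rho, lq, lq / lam

# Illustrative numbers: 30 prescriptions/hour, 12 served/hour per counter, 3 counters.
rho, lq, wq = mmc_metrics(lam=30.0, mu=12.0, c=3)
print(f"utilisation={rho:.2f}, Lq={lq:.2f} patients, Wq={wq * 60:.1f} minutes")
```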

  2. Integrating Soft Set Theory and Fuzzy Linguistic Model to Evaluate the Performance of Training Simulation Systems.

    Science.gov (United States)

    Chang, Kuei-Hu; Chang, Yung-Chia; Chain, Kai; Chung, Hsiang-Yu

    2016-01-01

    The advancement of high technologies and the arrival of the information age have changed modern warfare. The military forces of many countries have partially replaced real training drills with training simulation systems to achieve combat readiness. However, many types of training simulation systems are used in military settings. In addition, differences in system set-up time, functions, the environment, and the competency of system operators, as well as incomplete information, have made it difficult to evaluate the performance of training simulation systems. To address the aforementioned problems, this study integrated the analytic hierarchy process, soft set theory, and the fuzzy linguistic representation model to evaluate the performance of various training simulation systems. Furthermore, importance-performance analysis was adopted to examine the influence of cost savings and training safety of training simulation systems. The findings of this study are expected to facilitate the application of military training simulation systems, avoid wasted resources (e.g., low utilization and idle time), and provide data for subsequent applications and analysis. To verify the method proposed in this study, numerical examples of the performance evaluation of training simulation systems were adopted and compared with the numerical results of an AHP and a novel AHP-based ranking technique. The results verified that not only could expert-provided questionnaire information be fully considered to lower the repetition rate of performance rankings, but a two-dimensional graph could also be used to help administrators allocate limited resources, thereby enhancing the investment benefits and training effectiveness of a training simulation system.
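
    As an illustration of the AHP component mentioned above (the soft set and fuzzy linguistic parts are not reproduced), criterion weights can be approximated from a pairwise-comparison matrix by normalised row geometric means; the comparison matrix below is invented.

```python
import numpy as np

def ahp_priorities(pairwise):
    """Approximate AHP priority vector via the row geometric-mean method."""
    gm = np.prod(pairwise, axis=1) ** (1.0 / pairwise.shape[0])
    return gm / gm.sum()

# Hypothetical comparison of three criteria: cost saving, training safety, fidelity.
M = np.array([[1.0,   3.0,   5.0],
              [1/3.0, 1.0,   2.0],
              [1/5.0, 1/2.0, 1.0]])
print(ahp_priorities(M))  # roughly [0.65, 0.23, 0.12]
```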

  3. The cognitive environment simulation as a tool for modeling human performance and reliability

    International Nuclear Information System (INIS)

    Woods, D.D.; Pople, H. Jr.; Roth, E.M.

    1990-01-01

    The US Nuclear Regulatory Commission is sponsoring a research program to develop improved methods to model the cognitive behavior of nuclear power plant (NPP) personnel. Under this program, a tool for simulating how people form intentions to act in NPP emergency situations was developed using artificial intelligence (AI) techniques. This tool is called the Cognitive Environment Simulation (CES). The Cognitive Reliability Assessment Technique (or CREATE) was also developed to specify how CES can be used to enhance the measurement of the human contribution to risk in probabilistic risk assessment (PRA) studies. The next step in the research program was to evaluate the modeling tool and the method for using the tool for Human Reliability Analysis (HRA) in PRAs. Three evaluation activities were conducted. First, a panel of highly distinguished experts in cognitive modeling, AI, PRA and HRA provided a technical review of the simulation development work. Second, based on panel recommendations, CES was exercised on a family of steam generator tube rupture incidents for which empirical data on operator performance already existed. Third, a workshop with HRA practitioners was held to analyze a worked example of the CREATE method to evaluate the role of CES/CREATE in HRA. The results of all three evaluations indicate that CES/CREATE represents a promising approach to modeling operator intention formation during emergency operations

  4. Constructing probability distributions of uncertain variables in models of the performance of the Waste Isolation Pilot Plant: The 1990 performance simulations

    International Nuclear Information System (INIS)

    Tierney, M.S.

    1990-12-01

    A five-step procedure was used in the 1990 performance simulations to construct probability distributions of the uncertain variables appearing in the mathematical models used to simulate the Waste Isolation Pilot Plant's (WIPP's) performance. This procedure provides a consistent approach to the construction of probability distributions in cases where empirical data concerning a variable are sparse or absent and minimizes the amount of spurious information that is often introduced into a distribution by assumptions of nonspecialists. The procedure gives first priority to the professional judgment of subject-matter experts and emphasizes the use of site-specific empirical data for the construction of the probability distributions when such data are available. In the absence of sufficient empirical data, the procedure employs the Maximum Entropy Formalism and the subject-matter experts' subjective estimates of the parameters of the distribution to construct a distribution that can be used in a performance simulation. (author)
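
    For reference, the Maximum Entropy Formalism referred to here selects, among all densities consistent with the stated constraints, the one of exponential form; this is the standard textbook result rather than anything specific to the WIPP analysis.

```latex
% Maximum-entropy density on [a,b] given expert-supplied moment constraints \mu_i:
\max_{p}\; H[p] = -\int_a^b p(x)\,\ln p(x)\,dx
\quad\text{s.t.}\quad
\int_a^b p(x)\,dx = 1,\qquad
\int_a^b f_i(x)\,p(x)\,dx = \mu_i,\quad i=1,\dots,m.
% The stationary solution has the exponential-family form
p(x) = \exp\!\Big(-\lambda_0 - \sum_{i=1}^{m}\lambda_i f_i(x)\Big),
% so with no moment constraints (only the range [a,b]) it reduces to the uniform density
% p(x) = 1/(b-a), and with a prescribed mean it becomes a truncated exponential.
```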

  5. Constructing probability distributions of uncertain variables in models of the performance of the Waste Isolation Pilot Plant: The 1990 performance simulations

    Energy Technology Data Exchange (ETDEWEB)

    Tierney, M S

    1990-12-15

    A five-step procedure was used in the 1990 performance simulations to construct probability distributions of the uncertain variables appearing in the mathematical models used to simulate the Waste Isolation Pilot Plant's (WIPP's) performance. This procedure provides a consistent approach to the construction of probability distributions in cases where empirical data concerning a variable are sparse or absent and minimizes the amount of spurious information that is often introduced into a distribution by assumptions of nonspecialists. The procedure gives first priority to the professional judgment of subject-matter experts and emphasizes the use of site-specific empirical data for the construction of the probability distributions when such data are available. In the absence of sufficient empirical data, the procedure employs the Maximum Entropy Formalism and the subject-matter experts' subjective estimates of the parameters of the distribution to construct a distribution that can be used in a performance simulation. (author)

  6. Modeling and simulation of blood collection systems.

    Science.gov (United States)

    Alfonso, Edgar; Xie, Xiaolan; Augusto, Vincent; Garraud, Olivier

    2012-03-01

    This paper addresses the modeling and simulation of blood collection systems in France for both fixed-site and mobile blood collection with walk-in whole blood donors and scheduled plasma and platelet donors. Petri net models are first proposed to precisely describe different blood collection processes, donor behaviors, their material/human resource requirements and relevant regulations. The Petri net models are then enriched with quantitative modeling of donor arrivals, donor behaviors, activity times and resource capacities. Relevant performance indicators are defined. The resulting simulation models can be straightforwardly implemented with any simulation language. Numerical experiments are performed to show how the simulation models can be used to select, for different walk-in donor arrival patterns, appropriate human resource planning and donor appointment strategies.

  7. Verification of Temperature and Precipitation Simulated Data by Individual and Ensemble Performance of Five AOGCM Models for North East of Iran

    Directory of Open Access Journals (Sweden)

    B. Ashraf

    2014-08-01

    Full Text Available Since climate models are the basic tools for studying climate change, and because of the multiplicity of these models, selecting the most appropriate model for the study location is very important. In this research, the temperature and precipitation data simulated by the BCM2, CGCM3, CNRMCM3, MRICGCM2.3 and MIROC3 models were first downscaled with a proportional method according to the A1B, A2 and B1 emission scenarios for Torbat-heydariye, Sabzevar and Mashhad. Then, using the coefficient of determination (R2), the index of agreement (D) and mean-square deviations (MSD), the models were verified individually and as an ensemble. The results showed that, based on individual performance and the three emission scenarios, the MRICGCM2.3 model in Torbat-heydariye and Mashhad and the MIROC3.2 model in Sabzevar had the best performance in simulating temperature, and the MIROC3.2, MRICGCM2.3 and CNRMCM3 models provided the most accurate predictions for precipitation in Torbat-heydariye, Sabzevar and Mashhad, respectively. Also, the temperature simulated by all models in Torbat-heydariye and Sabzevar based on the B1 scenario and, in Mashhad, based on the A2 scenario had the lowest uncertainty. The highest accuracy in modeling precipitation was obtained based on the A2 scenario in Torbat-heydariye and the B1 scenario in Sabzevar and Mashhad. Investigation of the statistics derived from the ensemble performance of the 5 selected models showed a notable reduction of simulation error and thus an increase in the accuracy of predictions based on all emission scenarios generally. In this case, the best fit between simulated and observed temperature data was achieved based on the B1 scenario in Torbat-heydariye and Sabzevar and the A2 scenario in Mashhad, and the best fit between simulated and observed precipitation data was obtained based on the A2 scenario in Torbat-heydariye and the B1 scenario in Sabzevar and Mashhad. According to the results of this research, before any climate change research it is necessary to select the
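
    The verification statistics named here are straightforward to compute; the sketch below uses one common definition of each (coefficient of determination as the squared Pearson correlation, Willmott's index of agreement, and mean-square deviation), with invented observed/simulated series.

```python
import numpy as np

def verification_stats(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    r2 = np.corrcoef(obs, sim)[0, 1] ** 2                  # coefficient of determination
    msd = np.mean((sim - obs) ** 2)                        # mean-square deviation
    d = 1 - np.sum((sim - obs) ** 2) / np.sum(             # Willmott's index of agreement
        (np.abs(sim - obs.mean()) + np.abs(obs - obs.mean())) ** 2)
    return r2, d, msd

obs = [21.1, 23.4, 25.0, 27.8, 29.3]   # e.g. observed monthly mean temperatures (C)
sim = [20.5, 24.0, 24.2, 28.5, 30.1]   # values simulated by one downscaled AOGCM
print(verification_stats(obs, sim))
```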

  8. NIF capsule performance modeling

    Directory of Open Access Journals (Sweden)

    Weber S.

    2013-11-01

    Full Text Available Post-shot modeling of NIF capsule implosions was performed in order to validate our physical and numerical models. Cryogenic layered target implosions and experiments with surrogate targets produce an abundance of capsule performance data including implosion velocity, remaining ablator mass, times of peak x-ray and neutron emission, core image size, core symmetry, neutron yield, and x-ray spectra. We have attempted to match the integrated data set with capsule-only simulations by adjusting the drive and other physics parameters within expected uncertainties. The simulations include interface roughness, time-dependent symmetry, and a model of mix. We were able to match many of the measured performance parameters for a selection of shots.

  9. Modeling of electrochemistry and steam-methane reforming performance for simulating pressurized solid oxide fuel cell stacks

    Energy Technology Data Exchange (ETDEWEB)

    Recknagle, Kurtis P.; Ryan, Emily M.; Koeppel, Brian J.; Mahoney, Lenna A.; Khaleel, Moe A. [Pacific Northwest National Laboratory, Richland, WA 99352 (United States)

    2010-10-01

    This paper examines the electrochemical and direct internal steam-methane reforming performance of the solid oxide fuel cell when subjected to pressurization. Pressurized operation boosts the Nernst potential and decreases the activation polarization, both of which serve to increase cell voltage and power while lowering the heat load and operating temperature. A model considering the activation polarization in both the fuel and the air electrodes was adopted to address this effect on the electrochemical performance. The pressurized methane conversion kinetics and the increase in equilibrium methane concentration are considered in a new rate expression. The models were then applied in simulations to predict how the distributions of direct internal reforming rate, temperature, and current density are affected within stacks operating at elevated pressure. A generic 10 cm counter-flow stack model was created and used for the simulations of pressurized operation. The predictions showed improved thermal and electrical performance with increased operating pressure. The average and maximum cell temperatures decreased by 3% (20 °C) while the cell voltage increased by 9% as the operating pressure was increased from 1 to 10 atm. (author)
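
    The pressure effect on the open-circuit potential described above follows directly from the Nernst equation for hydrogen oxidation; the calculation below uses illustrative gas compositions and an assumed standard potential, not the paper's operating point.

```python
import math

R, F = 8.314, 96485.0   # gas constant J/(mol K), Faraday constant C/mol

def nernst_potential(T, p_h2, p_o2, p_h2o, e0=0.98):
    """Open-circuit voltage for H2 + 1/2 O2 -> H2O (n = 2 electrons).
    e0 is an assumed standard potential at temperature T; partial pressures in atm."""
    return e0 + (R * T / (2 * F)) * math.log(p_h2 * math.sqrt(p_o2) / p_h2o)

T = 1073.0  # about 800 C
for p_total in (1.0, 10.0):
    # Same mole fractions at both pressures, so partial pressures scale with p_total.
    e = nernst_potential(T, 0.5 * p_total, 0.21 * p_total, 0.5 * p_total)
    print(p_total, "atm ->", round(e, 3), "V")   # the 10 atm case is roughly 50 mV higher
```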

  10. Performance of Regional Climate Model in Simulating Monsoon Onset Over Indian Subcontinent

    Science.gov (United States)

    Bhatla, R.; Mandal, B.; Verma, Shruti; Ghosh, Soumik; Mall, R. K.

    2018-06-01

    The performance of various Convective Parameterization Schemes (CPSs) of the Regional Climate Model version 4.3 (RegCM-4.3) in simulating the onset phase of the Indian summer monsoon (ISM) over Kerala was studied for the period 2001-2010. The onset date and its associated spatial variation were simulated using four core RegCM-4.3 CPSs, namely Kuo, Tiedtke, Emanuel and Grell, and two mixed convection schemes, Mix98 (Emanuel over land and Grell over ocean) and Mix99 (Grell over land and Emanuel over ocean), on the basis of the criteria given by the India Meteorological Department (IMD) (Pai and Rajeevan in Indian summer monsoon onset: variability and prediction. National Climate Centre, India Meteorological Department, 2007). It was found that out of the six CPSs, two schemes, namely Tiedtke and Mix99, simulated the onset date properly. The onset phase is characterized by several transition phases of the atmosphere. Therefore, to study the thermal response of the Indian summer monsoon, the effect of two different sea surface temperature (SST) forcings, namely ERA-Interim (ERSST) and weekly optimal interpolation (OI_WK SST), on the simulated onset date was investigated. In addition, spatial atmospheric circulation patterns during the onset phase were analyzed using the ERA-Interim (EIN15) reanalysis and National Oceanic and Atmospheric Administration (NOAA) datasets for wind and outgoing long-wave radiation (OLR), respectively. Among the six convective schemes of the RegCM-4.3 model, Tiedtke is in good agreement with the actual onset dates, and the OI_WK SST forcing is better for simulating the onset of the ISM over Kerala.

  11. Using Queuing Theory and Simulation Model to Optimize Hospital Pharmacy Performance

    Science.gov (United States)

    Bahadori, Mohammadkarim; Mohammadnejhad, Seyed Mohsen; Ravangard, Ramin; Teymourzadeh, Ehsan

    2014-01-01

    Background: Hospital pharmacy is responsible for controlling and monitoring the medication use process and ensures timely access to safe, effective and economical use of drugs and medicines for patients and hospital staff. Objectives: This study aimed to optimize the management of the studied outpatient pharmacy by developing a suitable queuing theory and simulation technique. Patients and Methods: A descriptive-analytical study was conducted in a military hospital in Tehran, Iran, in 2013. A sample of 220 patients referred to the outpatient pharmacy of the hospital in two shifts, morning and evening, was selected to collect the necessary data to determine the arrival rate, service rate, and other data needed to calculate the patient flow and queuing network performance variables. After the initial analysis of collected data using the software SPSS 18, the pharmacy queuing network performance indicators were calculated for both shifts. Then, based on the collected data and to provide appropriate solutions, the queuing system of the current situation for both shifts was modeled and simulated using the software ARENA 12, and 4 scenarios were explored. Results: Results showed that the queue characteristics of the studied pharmacy during the situation analysis were very undesirable in both morning and evening shifts. The average numbers of patients in the pharmacy were 19.21 and 14.66 in the morning and evening, respectively. The average times spent in the system by clients were 39 minutes in the morning and 35 minutes in the evening. The system utilization in the morning and evening was, respectively, 25% and 21%. The simulation results showed that reducing the staff in the morning from 2 to 1 at the prescription-receiving stage did not change the queue performance indicators. Adding one staff member at the prescription-filling stage decreased the average queue length by 10 persons and the average waiting time by 18 minutes and 14 seconds. On the other hand, simulation

  12. High performance MRI simulations of motion on multi-GPU systems.

    Science.gov (United States)

    Xanthis, Christos G; Venetis, Ioannis E; Aletras, Anthony H

    2014-07-04

    MRI physics simulators have been developed in the past for optimizing imaging protocols and for training purposes. However, these simulators have only addressed motion within a limited scope. The purpose of this study was the incorporation of realistic motion, such as cardiac motion, respiratory motion and flow, within MRI simulations in a high-performance multi-GPU environment. Three different motion models were introduced in the Magnetic Resonance Imaging SIMULator (MRISIMUL) of this study: cardiac motion, respiratory motion and flow. Simulation of a simple Gradient Echo pulse sequence and a CINE pulse sequence on the corresponding anatomical model was performed. Myocardial tagging was also investigated. In pulse sequence design, software crushers were introduced to accommodate the long execution times and to avoid spurious echo formation. The displacement of the anatomical model isochromats was calculated within the Graphics Processing Unit (GPU) kernel for every timestep of the pulse sequence. Experiments that would allow simulation of custom anatomical and motion models were also performed. Last, simulations of motion with MRISIMUL on single-node and multi-node multi-GPU systems were examined. Gradient Echo and CINE images of the three motion models were produced and motion-related artifacts were demonstrated. The temporal evolution of the contractility of the heart was presented through the application of myocardial tagging. Better simulation performance and image quality were achieved through the introduction of software crushers without the need to further increase the computational load and GPU resources. Last, MRISIMUL demonstrated almost linearly scalable performance with the increasing number of available GPU cards, in both single-node and multi-node multi-GPU computer systems. MRISIMUL is the first MR physics simulator to have implemented motion with a large 3D computational load on a single-computer multi-GPU configuration. The incorporation
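
    For orientation, the core kernel operation in such an MR physics simulator is a per-isochromat Bloch update (free precession plus relaxation) applied at every timestep; the scalar Python sketch below is illustrative only, whereas the actual tool evaluates it per spin on GPUs.

```python
import math

def bloch_step(mx, my, mz, dt, df_hz, t1, t2, m0=1.0):
    """Advance one isochromat by dt seconds: precession at off-resonance df_hz plus T1/T2 relaxation."""
    phi = 2.0 * math.pi * df_hz * dt               # precession angle about z
    c, s = math.cos(phi), math.sin(phi)
    mx, my = c * mx - s * my, s * mx + c * my      # rotate the transverse magnetisation
    e1, e2 = math.exp(-dt / t1), math.exp(-dt / t2)
    return mx * e2, my * e2, m0 + (mz - m0) * e1   # relax towards thermal equilibrium

# After an ideal 90-degree pulse the magnetisation lies along x; watch T2 decay and T1 recovery.
mx, my, mz = 1.0, 0.0, 0.0
for _ in range(1000):                              # 1 s of free precession at dt = 1 ms
    mx, my, mz = bloch_step(mx, my, mz, dt=1e-3, df_hz=50.0, t1=0.8, t2=0.1)
print(round(math.hypot(mx, my), 4), round(mz, 4))
```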

  13. Designing Citizen Business Loan Model to Reduce Non-Performing Loan: An Agent-based Modeling and Simulation Approach in Regional Development

    Directory of Open Access Journals (Sweden)

    Moses L Singgih

    2015-09-01

    Full Text Available Citizen Business Loan (CBL) constitutes a poverty alleviation program based on the economic empowerment of small and medium enterprises. This study focuses on the implementation of CBL at Regional Development Bank branch X. The problem is the existence of interdependencies between the CBL's implementers (the Bank) and the uncertainty of the debtors' capability to repay the credit. The impact of this circumstance is that the non-performing loan (NPL) rate becomes relatively high (22%). The ultimate objective is to minimize NPL by designing an agent-based model that can represent the problem through a simulation using agent-based modeling and simulation (ABMS). The model is built by managing the probability that each debtor pays or not based on the 5 C categories (character, capacity, capital, condition, and collateral) inherent to each debtor. Two improvement scenarios are proposed in this model. The first scenario involves only the first category of debtors in the simulation and results in an NPL value of 0%. The second scenario includes the first and second categories of debtors in the simulation and results in an NPL value between 4.6% and 11.4%.
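
    A stripped-down sketch of the idea (debtor agents defaulting with category-dependent probabilities while the bank observes the resulting NPL) could look like the following; every probability, weight and loan size is invented rather than taken from branch X.

```python
import random

def simulate_npl(n_debtors=1000, seed=42):
    """Monte Carlo over debtor agents; NPL = defaulted principal / total principal."""
    rng = random.Random(seed)
    # Hypothetical 5C-based credit categories and their default probabilities.
    default_prob = {"first": 0.00, "second": 0.08, "third": 0.25}
    weights = [0.5, 0.35, 0.15]                  # share of debtors in each category
    total, defaulted = 0.0, 0.0
    for _ in range(n_debtors):
        cat = rng.choices(list(default_prob), weights=weights)[0]
        principal = rng.uniform(5e6, 20e6)       # illustrative loan size
        total += principal
        if rng.random() < default_prob[cat]:
            defaulted += principal
    return defaulted / total

print(f"simulated NPL = {simulate_npl():.1%}")
```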

  14. Predictive performance models and multiple task performance

    Science.gov (United States)

    Wickens, Christopher D.; Larish, Inge; Contorer, Aaron

    1989-01-01

    Five models that predict how performance of multiple tasks will interact in complex task scenarios are discussed. The models are shown in terms of the assumptions they make about human operator divided attention. The different assumptions about attention are then empirically validated in a multitask helicopter flight simulation. It is concluded from this simulation that the most important assumption relates to the coding of demand level of different component tasks.

  15. An evidence accumulation model for conflict detection performance in a simulated air traffic control task.

    Science.gov (United States)

    Neal, Andrew; Kwantes, Peter J

    2009-04-01

    The aim of this article is to develop a formal model of conflict detection performance. Our model assumes that participants iteratively sample evidence regarding the state of the world and accumulate it over time. A decision is made when the evidence reaches a threshold that changes over time in response to the increasing urgency of the task. Two experiments were conducted to examine the effects of conflict geometry and timing on response proportions and response time. The model is able to predict the observed pattern of response times, including a nonmonotonic relationship between distance at point of closest approach and response time, as well as effects of angle of approach and relative velocity. The results demonstrate that evidence accumulation models provide a good account of performance on a conflict detection task. Evidence accumulation models are a form of dynamic signal detection theory, allowing for the analysis of response times as well as response proportions, and can be used for simulating human performance on dynamic decision tasks.
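
    A minimal accumulator of the kind described, with noisy evidence integrated against a threshold that collapses as urgency grows, can be simulated as follows; all parameter values are illustrative rather than fitted to the reported experiments.

```python
import random

def accumulate(drift, dt=0.05, noise=0.3, b0=2.0, collapse=0.02, max_t=60.0, seed=None):
    """Return (decision, response_time); +1 = 'conflict', -1 = 'no conflict', 0 = timeout."""
    rng = random.Random(seed)
    x, t = 0.0, 0.0
    while t < max_t:
        t += dt
        x += drift * dt + rng.gauss(0.0, noise) * dt ** 0.5   # Wiener-style evidence increment
        bound = max(0.2, b0 - collapse * t)                   # urgency: collapsing threshold
        if abs(x) >= bound:
            return (1 if x > 0 else -1), t
    return 0, max_t

print(accumulate(drift=0.15, seed=1))
```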

  16. Modeling and Simulation for Safeguards

    International Nuclear Information System (INIS)

    Swinhoe, Martyn T.

    2012-01-01

    The purpose of this talk is to give an overview of the role of modeling and simulation in Safeguards R and D and to introduce (some of) the tools used. Some definitions are: (1) Modeling - the representation, often mathematical, of a process, concept, or operation of a system, often implemented by a computer program; (2) Simulation - the representation of the behavior or characteristics of one system through the use of another system, especially a computer program designed for the purpose; and (3) Safeguards - the timely detection of diversion of significant quantities of nuclear material. The roles of modeling and simulation are: (1) calculating amounts of material (plant modeling); (2) calculating signatures of nuclear material, etc. (source terms); and (3) modeling detector performance (radiation transport and detection). Plant modeling software (e.g. FACSIM) gives the flows and the amount of material stored at all parts of the process. In safeguards this allows us to calculate the expected uncertainty of the mass and evaluate the expected MUF. We can determine the measurement accuracy required to achieve a certain performance.
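
    Concretely, the material balance behind MUF and the propagation of its measurement uncertainty are simple to write down; the inventory figures and uncertainties below are invented for illustration.

```python
import math

def muf(beginning_inventory, receipts, shipments, ending_inventory):
    """Material Unaccounted For = (BI + receipts) - (shipments + EI), e.g. in kg of material."""
    return (beginning_inventory + receipts) - (shipments + ending_inventory)

def sigma_muf(*sigmas):
    """Standard deviation of MUF assuming independent measurement errors (quadrature sum)."""
    return math.sqrt(sum(s * s for s in sigmas))

m = muf(beginning_inventory=120.0, receipts=40.0, shipments=35.0, ending_inventory=124.2)
s = sigma_muf(0.3, 0.2, 0.2, 0.3)
print(f"MUF = {m:.2f} kg, sigma_MUF = {s:.2f} kg, MUF/sigma = {m / s:.1f}")
```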

  17. Software-In-the-Loop based Modeling and Simulation of Unmanned Semi-submersible Vehicle for Performance Verification of Autonomous Navigation

    Science.gov (United States)

    Lee, Kwangkook; Jeong, Mijin; Kim, Dong Hun

    2017-12-01

    Since an unmanned semi-submersible is mainly used to carry out dangerous missions at sea, it can work in regions that are difficult to access for safety reasons. In this study, a USV hull design was determined using a Myring hull profile, and reinforcement was performed by designing and implementing inner stiffener members for 3D printing. In order to simulate sea state 5.0 or higher, which is difficult to reproduce in practice, regular and irregular wave equations were implemented in Matlab/Simulink. We performed modeling and simulation of the semi-submersible based on DMWorks, considering the rolling motion in waves. To identify and correct unpredicted errors, we implemented numeric and physical simulation models of the USV based on the software-in-the-loop (SIL) method. This simulation allows shipbuilders to participate in new value-added markets such as engineering, procurement, construction, installation, commissioning, operation, and maintenance for the USV.
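
    The irregular-wave input mentioned above is commonly generated as a superposition of sinusoidal components with random phases; in the sketch below the component amplitudes are arbitrary placeholders rather than values drawn from a specific wave spectrum.

```python
import math
import random

def irregular_wave(t, components, phases):
    """Free-surface elevation (m) at time t as a sum of regular wave components."""
    return sum(a * math.cos(w * t + ph)
               for (a, w), ph in zip(components, phases))

# (amplitude m, angular frequency rad/s) pairs; in practice these come from a wave spectrum.
components = [(0.8, 0.5), (0.5, 0.8), (0.3, 1.2), (0.2, 1.6)]
rng = random.Random(0)
phases = [rng.uniform(0.0, 2.0 * math.pi) for _ in components]

elevation = [irregular_wave(0.5 * n, components, phases) for n in range(20)]
print([round(e, 2) for e in elevation])
```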

  18. Simulation modeling and analysis with Arena

    CERN Document Server

    Altiok, Tayfur

    2007-01-01

    Simulation Modeling and Analysis with Arena is a highly readable textbook which treats the essentials of the Monte Carlo discrete-event simulation methodology, and does so in the context of a popular Arena simulation environment. It treats simulation modeling as an in-vitro laboratory that facilitates the understanding of complex systems and experimentation with what-if scenarios in order to estimate their performance metrics. The book contains chapters on the simulation modeling methodology and the underpinnings of discrete-event systems, as well as the relevant underlying probability, statistics, stochastic processes, input analysis, model validation and output analysis. All simulation-related concepts are illustrated in numerous Arena examples, encompassing production lines, manufacturing and inventory systems, transportation systems, and computer information systems in networked settings. It introduces the concept of discrete event Monte Carlo simulation, the most commonly used methodology for modeli...

  19. Multidisciplinary Energy Assessment of Tertiary Buildings: Automated Geomatic Inspection, Building Information Modeling Reconstruction and Building Performance Simulation

    Directory of Open Access Journals (Sweden)

    Faustino Patiño-Cambeiro

    2017-07-01

    Full Text Available There is an urgent need for energy efficiency in buildings within the European framework, considering its environmental implications and Europe's energy dependence. Furthermore, the need to increase productivity in the building industry makes new technologies and building energy performance simulation environments extremely interesting tools for rigorous analysis and decision making in renovation within acceptable risk levels. The present work describes a multidisciplinary approach for the estimation of the energy performance of an educational building. The research involved data acquisition with advanced geomatic tools, the development of an optimized building information model, and energy assessment in Building Performance Simulation (BPS) software. Interoperability issues were observed in the different steps of the process. The inspection and diagnostic phases were conducted in a timely, accurate manner thanks to automated data acquisition and subsequent analysis using Building Information Modeling (BIM) based tools. Energy simulation was performed using Design Builder, and the results obtained were compared with those yielded by the official software tool established by Spanish regulations for energy certification. The discrepancies between the results of both programs have proven that the official software program is conservative in this sense. This may cause the depreciation of the assessed buildings.

  20. Simulating and stimulating performance: Introducing distributed simulation to enhance musical learning and performance

    Directory of Open Access Journals (Sweden)

    Aaron eWilliamon

    2014-02-01

    Full Text Available Musicians typically rehearse far away from their audiences and in practice rooms that differ significantly from the concert venues in which they aspire to perform. Due to the high costs and inaccessibility of such venues, much current international music training lacks repeated exposure to realistic performance situations, with students learning all too late (or not at all) how to manage performance stress and the demands of their audiences. Virtual environments have been shown to be an effective training tool in the fields of medicine and sport, offering practitioners access to real-life performance scenarios but with lower risk of negative evaluation and outcomes. The aim of this research was to design and test the efficacy of simulated performance environments in which conditions of real performance could be recreated. Advanced violin students (n=11) were recruited to perform in two simulations: a solo recital with a small virtual audience and an audition situation with three expert virtual judges. Each simulation contained back-stage and on-stage areas, life-sized interactive virtual observers, and pre- and post-performance protocols designed to match those found at leading international performance venues. Participants completed a questionnaire on their experiences of using the simulations. Results show that both simulated environments offered realistic experience of performance contexts and were rated particularly useful for developing performance skills. For a subset of 7 violinists, state anxiety and electrocardiographic data were collected during the simulated audition and an actual audition with real judges. Results display comparable levels of reported state anxiety and patterns of heart rate variability in both situations, suggesting that responses to the simulated audition closely approximate those of a real audition. The findings are discussed in relation to their implications, both generalizable and individual-specific, for performance training.

  1. Simulating and stimulating performance: introducing distributed simulation to enhance musical learning and performance.

    Science.gov (United States)

    Williamon, Aaron; Aufegger, Lisa; Eiholzer, Hubert

    2014-01-01

    Musicians typically rehearse far away from their audiences and in practice rooms that differ significantly from the concert venues in which they aspire to perform. Due to the high costs and inaccessibility of such venues, much current international music training lacks repeated exposure to realistic performance situations, with students learning all too late (or not at all) how to manage performance stress and the demands of their audiences. Virtual environments have been shown to be an effective training tool in the fields of medicine and sport, offering practitioners access to real-life performance scenarios but with lower risk of negative evaluation and outcomes. The aim of this research was to design and test the efficacy of simulated performance environments in which conditions of "real" performance could be recreated. Advanced violin students (n = 11) were recruited to perform in two simulations: a solo recital with a small virtual audience and an audition situation with three "expert" virtual judges. Each simulation contained back-stage and on-stage areas, life-sized interactive virtual observers, and pre- and post-performance protocols designed to match those found at leading international performance venues. Participants completed a questionnaire on their experiences of using the simulations. Results show that both simulated environments offered realistic experience of performance contexts and were rated particularly useful for developing performance skills. For a subset of 7 violinists, state anxiety and electrocardiographic data were collected during the simulated audition and an actual audition with real judges. Results display comparable levels of reported state anxiety and patterns of heart rate variability in both situations, suggesting that responses to the simulated audition closely approximate those of a real audition. The findings are discussed in relation to their implications, both generalizable and individual-specific, for performance training.

  2. Improving the performance of a filling line based on simulation

    Science.gov (United States)

    Jasiulewicz-Kaczmarek, M.; Bartkowiak, T.

    2016-08-01

    The paper describes a method of improving the performance of a filling line based on simulation. This study concerns a production line located in a manufacturing centre of an FMCG company. A discrete event simulation model was built using data provided by the maintenance data acquisition system. Two types of failures were identified in the system and were approximated using continuous statistical distributions. The model was validated taking into consideration line performance measures. A brief Pareto analysis of line failures was conducted to identify potential areas of improvement. Two improvement scenarios were proposed and tested via simulation. The outcomes of the simulations were the basis of a financial analysis. NPV and ROI values were calculated taking into account depreciation, profits, losses, the current CIT rate and inflation. A validated simulation model can be a useful tool in the maintenance decision-making process.
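
The abstract does not give the cash flows or rates used in the financial analysis, so the following Python sketch only illustrates how NPV and ROI can be computed for an improvement scenario; the investment, savings, discount rate and the simple ROI definition are hypothetical.

```python
def npv(rate, cash_flows):
    """Net present value of cash flows; cash_flows[0] is the initial (negative) investment."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

def roi(cash_flows):
    """Simple return on investment: net gain over the initial outlay."""
    investment = -cash_flows[0]
    return (sum(cash_flows[1:]) - investment) / investment

# Hypothetical improvement scenario: 120 k initial cost, 5 years of savings
# (profit gains minus losses), assumed already adjusted for depreciation, CIT and inflation.
flows = [-120_000, 45_000, 44_000, 43_000, 42_000, 41_000]
print(f"NPV @ 8%: {npv(0.08, flows):,.0f}")
print(f"ROI: {roi(flows):.1%}")
```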

  3. Determining the energy performance of manually controlled solar shades: A stochastic model based co-simulation analysis

    International Nuclear Information System (INIS)

    Yao, Jian

    2014-01-01

    Highlights: • Driving factor for adjustment of manually controlled solar shades was determined. • A stochastic model for manual solar shades was constructed using the Markov method. • Co-simulation with EnergyPlus was carried out in BCVTB. • External shading, even manually controlled, should be used prior to LOW-E windows. • Previous studies on manual solar shades may overestimate energy savings. - Abstract: Solar shading devices play a significant role in reducing building energy consumption and maintaining a comfortable indoor condition. In this paper, a typical office building with internal roller shades in the hot summer and cold winter zone was selected to determine the driving factor behind the control behavior of manual solar shades. Solar radiation was identified as the major factor driving solar shading adjustment, based on field measurements and logit analysis, and a stochastic model for manually adjusted solar shades was then constructed using the Markov method. This model was used in BCVTB for co-simulation with EnergyPlus to determine the impact of shade control behavior on energy performance. The results show that manually adjusted solar shades, whether located inside or outside, achieve relatively high energy savings compared with clear-pane windows, while only external shades perform better than regularly used LOW-E windows. The simulations also indicate that using an idealized assumption of solar shade adjustment, as most building simulation studies do, may overestimate energy savings by about 16–30%. There is a need to improve occupants' actions on shades so that they respond more effectively to outdoor conditions and thereby lower energy consumption, and this improvement can easily be achieved by using simple strategies as a guide to controlling manual solar shades.
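
As a rough illustration of the kind of stochastic shade model described above, the sketch below implements a two-state Markov chain whose transition probabilities depend on solar radiation; the logit-style probability curves and thresholds are assumptions, not the fitted coefficients from the paper, and the resulting state series is the sort of signal that would be exchanged with EnergyPlus through BCVTB at each timestep.

```python
import numpy as np

def transition_probs(solar_radiation):
    """Illustrative radiation-dependent transition probabilities for a two-state
    (0 = retracted, 1 = deployed) shade model; values are not the paper's coefficients."""
    p_deploy = 1.0 / (1.0 + np.exp(-(solar_radiation - 300.0) / 60.0))
    p_retract = 1.0 / (1.0 + np.exp((solar_radiation - 150.0) / 60.0))
    return p_deploy, p_retract

def simulate_shade(radiation_series, seed=0):
    """Hourly Markov simulation of the shade state driven by solar radiation (W/m2)."""
    rng = np.random.default_rng(seed)
    state, states = 0, []
    for rad in radiation_series:
        p_deploy, p_retract = transition_probs(rad)
        if state == 0 and rng.random() < p_deploy:
            state = 1
        elif state == 1 and rng.random() < p_retract:
            state = 0
        states.append(state)
    return np.array(states)   # could be passed to the thermal model each timestep

radiation = np.clip(800.0 * np.sin(np.linspace(0.0, np.pi, 24)), 0.0, None)  # one synthetic day
print(simulate_shade(radiation))
```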

  4. submitter Simulation-Based Performance Analysis of the ALICE Mass Storage System

    CERN Document Server

    Vickovic, L; Celar, S

    2016-01-01

    CERN, the European Organization for Nuclear Research, is today, in the era of big data, one of the biggest data generators in the world. Especially interesting is the transient data storage system in the ALICE experiment. With the goal of optimizing its performance, this paper discusses a dynamic, discrete event simulation model of a disk-based Storage Area Network (SAN) and its usage for performance analyses. The storage system model is based on a modular, bottom-up approach, and the differences between measured and simulated values vary between 1.5% and 4% depending on the simulated component. Once finished, the simulation model was used for detailed performance analyses. Among other findings, it showed that system performance can be seriously degraded if the array stripe size is larger than the size of the cache on the individual disks in the array, which so far has been completely ignored in the literature.

  5. Trickle bed reactor model to simulate the performance of commercial diesel hydrotreating unit

    Energy Technology Data Exchange (ETDEWEB)

    C. Murali; R.K. Voolapalli; N. Ravichander; D.T. Gokak; N.V. Choudary [Bharat Petroleum Corporation Ltd., Udyog Kendra (India). Corporate R&D Centre]

    2007-05-15

    A two-phase mathematical model was developed to simulate the performance of bench-scale and commercial hydrotreating reactors. The major hydrotreating reactions, namely hydrodesulphurization, hydrodearomatization and olefin saturation, were modeled. Experiments were carried out in a fixed bed reactor to study the effect of different process variables, and these results were used for estimating kinetic parameters. A significant amount of feed vaporization (20-50%) was estimated under normal DHDS operating conditions, suggesting the importance of considering feed vaporization in DHDS modeling. The model was validated with plant operating data at close to ultra-low sulphur levels by correctly accounting for feed vaporization in the heat balance relations and by appropriate use of hydrodynamic correlations. The model could adequately predict the product quality, reactor bed temperature profiles and chemical hydrogen consumption in the commercial plant. 14 refs., 7 figs., 6 tabs.

  6. Identifying a key physical factor sensitive to the performance of Madden-Julian oscillation simulation in climate models

    Science.gov (United States)

    Kim, Go-Un; Seo, Kyong-Hwan

    2018-01-01

    A key physical factor in regulating the performance of Madden-Julian oscillation (MJO) simulation is examined by using 26 climate model simulations from the World Meteorological Organization's Working Group for Numerical Experimentation/Global Energy and Water Cycle Experiment Atmospheric System Study (WGNE and MJO-Task Force/GASS) global model comparison project. For this, the intraseasonal moisture budget equation is analyzed and a simple, efficient physical quantity is developed. The result shows that MJO skill is most sensitive to the vertically integrated intraseasonal zonal wind convergence (ZC). In particular, a specific threshold value of the strength of the ZC can be used to distinguish between good and poor models. An additional finding is that good models exhibit the correct simultaneous phase relationship between convection and the large-scale circulation. In poor models, however, the peak circulation response appears 3 days after peak rainfall, suggesting unfavorable coupling between convection and circulation. To improve the simulation of the MJO in climate models, we propose that this delay of the circulation response to convection needs to be corrected in the cumulus parameterization scheme.
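
A minimal sketch of the kind of diagnostic described above, computing a vertically integrated intraseasonal zonal wind convergence from band-pass-filtered wind anomalies; the sign and unit conventions, the grid and the use of simple finite differences are assumptions for illustration, not the authors' exact formulation.

```python
import numpy as np

G = 9.81  # m s-2

def zonal_convergence(u_anom, lon_deg, lat_deg, p_levels_pa):
    """Vertically integrated zonal wind convergence, ZC = -(1/g) * integral(du'/dx dp),
    for intraseasonal u anomalies with shape (plev, lat, lon)."""
    lat = np.deg2rad(lat_deg)
    dx = np.deg2rad(np.gradient(lon_deg)) * 6.371e6 * np.cos(lat)[:, None]  # metres, (lat, lon)
    dudx = np.gradient(u_anom, axis=2) / dx[None, :, :]
    dp = np.abs(np.gradient(p_levels_pa))                                   # Pa per level
    return -np.sum(dudx * dp[:, None, None], axis=0) / G

# toy example: 10 pressure levels, 5 lat x 8 lon grid of band-pass-filtered u anomalies
u_anom = np.random.randn(10, 5, 8)
zc = zonal_convergence(u_anom, lon_deg=np.arange(0, 80, 10),
                       lat_deg=np.array([-10, -5, 0, 5, 10]),
                       p_levels_pa=np.linspace(100_000, 10_000, 10))
print(zc.shape)  # (5, 8)
```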

  7. Development of a simplified simulation model for performance characterization of a pixellated CdZnTe multimodality imaging system

    Energy Technology Data Exchange (ETDEWEB)

    Guerra, P; Santos, A [Departamento de Ingenieria Electronica, Universidad Politecnica de Madrid, Ciudad Universitaria s/n, 28040 Madrid (Spain); Darambara, D G [Joint Department of Physics, Royal Marsden NHS Foundation Trust and The Institute of Cancer Research, Fulham Road, London SW3 6JJ (United Kingdom)], E-mail: pguerra@die.um.es

    2008-02-21

    Current requirements of molecular imaging lead to the complete integration of complementary modalities in a single hybrid imaging system to correlate function and structure. Among the various existing detector technologies that can be implemented to integrate nuclear modalities (PET and/or single-photon emission computed tomography) with x-rays (CT) and most probably with MR, pixellated wide bandgap room temperature semiconductor detectors, such as CdZnTe and/or CdTe, are promising candidates. This paper deals with the development of a simplified simulation model for pixellated semiconductor radiation detectors, as a first step towards the performance characterization of a multimodality imaging system based on CdZnTe. In particular, this work presents a simple computational model, based on a 1D approximate solution of the Shockley-Ramo theorem, and its integration into the Geant4 application for tomographic emission (GATE) platform in order to perform accurate and, therefore, improved simulations of pixellated detectors in different configurations with simultaneous cathode and anode pixel readout. The model presented here is successfully validated against an existing detailed finite element simulator, the multi-geometry simulation code, with respect to the charge induced at the anode, taking into consideration interpixel charge sharing and crosstalk, and to the detector charge induction efficiency. Finally, the model provides estimated energy spectra and time resolution for 57Co and 18F sources obtained with the GATE code after the incorporation of the proposed model.
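
The paper's 1D model is based on the Shockley-Ramo theorem; the sketch below uses the closely related planar Hecht-type expression for electron charge induction efficiency as a simplified stand-in, with detector thickness, bias and mobility-lifetime values chosen only for illustration.

```python
import numpy as np

def charge_induction_efficiency(x0, d=5e-3, mu_tau=5e-7, bias=500.0):
    """1D planar approximation of charge induction efficiency (Hecht-type relation)
    for electrons generated at depth x0 (measured from the cathode) in a detector of
    thickness d [m], mobility-lifetime product mu_tau [m2/V] and bias voltage [V].
    This is a simplified stand-in for the paper's 1D Shockley-Ramo-based model."""
    E = bias / d                    # uniform-field assumption
    lam = mu_tau * E                # electron drift length before trapping
    return (lam / d) * (1.0 - np.exp(-(d - x0) / lam))

for x0 in np.linspace(0.0, 5e-3, 6):
    print(f"x0 = {x0*1e3:4.1f} mm -> CIE = {charge_induction_efficiency(x0):.3f}")
```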

  8. Performance assessment of Large Eddy Simulation (LES) for modeling dispersion in an urban street canyon with tree planting

    Science.gov (United States)

    Moonen, P.; Gromke, C.; Dorer, V.

    2013-08-01

    The potential of a Large Eddy Simulation (LES) model to reliably predict near-field pollutant dispersion is assessed. To that end, detailed time-resolved numerical simulations of coupled flow and dispersion are conducted for a street canyon with tree planting. Different crown porosities are considered. The model performance is assessed in several steps, ranging from a qualitative comparison to measured concentrations, over statistical data analysis by means of scatter plots and box plots, up to the calculation of objective validation metrics. The extensive validation effort highlights and quantifies notable features and shortcomings of the model, which would otherwise remain unnoticed. The model performance is found to be spatially non-uniform. Closer agreement with measurement data is achieved near the canyon ends than for the central part of the canyon, and typical model acceptance criteria are satisfied more easily for the leeward than for the windward canyon wall. This demonstrates the need for rigorous model evaluation. Only quality-assured models can be used with confidence to support assessment, planning and implementation of pollutant mitigation strategies.
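
The abstract mentions objective validation metrics and model acceptance criteria without listing them; the following sketch computes three metrics commonly used in dispersion model evaluation (fractional bias, normalized mean square error and FAC2), with typical literature acceptance thresholds noted in comments as assumptions.

```python
import numpy as np

def validation_metrics(observed, predicted):
    """Common dispersion-model acceptance metrics; the thresholds quoted in the
    comments are typical literature values, assumed here rather than taken from the paper."""
    obs, pred = np.asarray(observed, float), np.asarray(predicted, float)
    fb = 2.0 * np.mean(obs - pred) / (np.mean(obs) + np.mean(pred))     # |FB| <= 0.3 often required
    nmse = np.mean((obs - pred) ** 2) / (np.mean(obs) * np.mean(pred))  # NMSE <= 1.5
    ratio = pred / obs
    fac2 = np.mean((ratio >= 0.5) & (ratio <= 2.0))                     # FAC2 >= 0.5
    return {"FB": fb, "NMSE": nmse, "FAC2": fac2}

obs = np.array([1.2, 0.8, 2.5, 3.1, 0.4])      # measured concentrations (arbitrary units)
les = np.array([1.0, 1.1, 2.0, 2.8, 0.7])      # LES-predicted concentrations
print(validation_metrics(obs, les))
```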

  9. Towards a benchmark simulation model for plant-wide control strategy performance evaluation of WWTPs

    DEFF Research Database (Denmark)

    Jeppsson, Ulf; Rosen, Christian; Alex, Jens

    2006-01-01

    The COST/IWA benchmark simulation model has been available for seven years. Its primary purpose has been to create a platform for control strategy benchmarking of activated sludge processes. The fact that the benchmark has resulted in more than 100 publications, not only in Europe but also worldwide, demonstrates the interest in such a tool within the research community. In this paper, an extension of the benchmark simulation model no 1 (BSM1) is proposed. This extension aims at facilitating control strategy development and performance evaluation at a plant-wide level and, consequently, ... the changes, the evaluation period has been extended to one year. A prolonged evaluation period allows for long-term control strategies to be assessed and enables the use of control handles that cannot be evaluated in a realistic fashion in the one-week BSM1 evaluation period. In the paper, the extended plant...

  10. A Network Contention Model for the Extreme-scale Simulator

    Energy Technology Data Exchange (ETDEWEB)

    Engelmann, Christian [ORNL]; Naughton III, Thomas J [ORNL]

    2015-01-01

    The Extreme-scale Simulator (xSim) is a performance investigation toolkit for high-performance computing (HPC) hardware/software co-design. It permits running an HPC application with millions of concurrent execution threads, while observing its performance in a simulated extreme-scale system. This paper details a newly developed network modeling feature for xSim, eliminating the shortcomings of the existing network modeling capabilities. The approach takes a different path for implementing network contention and bandwidth capacity modeling, using a less synchronous but sufficiently accurate model design. With the new network modeling feature, xSim is able to simulate on-chip and on-node networks with reasonable accuracy and overheads.
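
xSim's actual network model is not described in detail in the record above; the sketch below is only a toy analytical latency model in the same spirit, where concurrent messages share a link's bandwidth. The latency, bandwidth and equal-sharing rule are assumptions.

```python
def message_latency(size_bytes, concurrent_messages, latency_s=1e-6,
                    link_bandwidth_Bps=10e9):
    """Toy contention-aware latency/bandwidth model: base latency plus serialization
    time, with the link bandwidth shared equally by concurrent messages."""
    effective_bw = link_bandwidth_Bps / max(1, concurrent_messages)
    return latency_s + size_bytes / effective_bw

# 1 MiB message, uncontended vs. shared with 7 other messages on the same link
print(message_latency(2**20, 1))   # ~1.06e-4 s
print(message_latency(2**20, 8))   # ~8.4e-4 s
```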

  11. High performance real-time flight simulation at NASA Langley

    Science.gov (United States)

    Cleveland, Jeff I., II

    1994-01-01

    In order to meet the stringent time-critical requirements for real-time man-in-the-loop flight simulation, computer processing operations must be deterministic and be completed in as short a time as possible. This includes simulation mathematical model computations and data input/output to the simulators. In 1986, in response to increased demands for flight simulation performance, personnel at NASA's Langley Research Center (LaRC), working with the contractor, developed extensions to a standard input/output system to provide high-bandwidth, low-latency data acquisition and distribution. The Computer Automated Measurement and Control technology (IEEE standard 595) was extended to meet the performance requirements for real-time simulation. This technology extension increased the effective bandwidth by a factor of ten and increased the performance of modules necessary for simulator communications. This technology is being used by more than 80 leading technological developers in the United States, Canada, and Europe. Included among the commercial applications of this technology are nuclear process control, power grid analysis, process monitoring, real-time simulation, and radar data acquisition. Personnel at LaRC have completed the development of the use of supercomputers for simulation mathematical model computations to support real-time flight simulation. This includes the development of a real-time operating system and the development of specialized software and hardware for the CAMAC simulator network. This work, coupled with the use of an open systems software architecture, has advanced the state of the art in real-time flight simulation. The data acquisition technology innovation and experience with recent developments in this technology are described.

  12. Microcomputer simulation model for facility performance assessment: a case study of nuclear spent fuel handling facility operations

    International Nuclear Information System (INIS)

    Chockie, A.D.; Hostick, C.J.; Otis, P.T.

    1985-10-01

    A microcomputer-based simulation model was recently developed at the Pacific Northwest Laboratory (PNL) to assist in the evaluation of design alternatives for a proposed facility to receive, consolidate and store nuclear spent fuel from US commercial power plants. Previous performance assessments were limited to deterministic calculations and Gantt chart representations of the facility operations. To ensure that the design of the facility will be adequate to meet the specified throughput requirements, the simulation model was used to analyze such factors as material flow, equipment capability and the interface between the MRS facility and the nuclear waste transportation system. The simulation analysis model was based on commercially available software and application programs designed to represent the MRS waste handling facility operations. The results of the evaluation were used by the design review team at PNL to identify areas where design modifications should be considered. 4 figs

  13. Cognitive load predicts point-of-care ultrasound simulator performance.

    Science.gov (United States)

    Aldekhyl, Sara; Cavalcanti, Rodrigo B; Naismith, Laura M

    2018-02-01

    The ability to maintain good performance with low cognitive load is an important marker of expertise. Incorporating cognitive load measurements in the context of simulation training may help to inform judgements of competence. This exploratory study investigated relationships between demographic markers of expertise, cognitive load measures, and simulator performance in the context of point-of-care ultrasonography. Twenty-nine medical trainees and clinicians at the University of Toronto with a range of clinical ultrasound experience were recruited. Participants answered a demographic questionnaire then used an ultrasound simulator to perform targeted scanning tasks based on clinical vignettes. Participants were scored on their ability to both acquire and interpret ultrasound images. Cognitive load measures included participant self-report, eye-based physiological indices, and behavioural measures. Data were analyzed using a multilevel linear modelling approach, wherein observations were clustered by participants. Experienced participants outperformed novice participants on ultrasound image acquisition. Ultrasound image interpretation was comparable between the two groups. Ultrasound image acquisition performance was predicted by level of training, prior ultrasound training, and cognitive load. There was significant convergence between cognitive load measurement techniques. A marginal model of ultrasound image acquisition performance including prior ultrasound training and cognitive load as fixed effects provided the best overall fit for the observed data. In this proof-of-principle study, the combination of demographic and cognitive load measures provided more sensitive metrics to predict ultrasound simulator performance. Performance assessments which include cognitive load can help differentiate between levels of expertise in simulation environments, and may serve as better predictors of skill transfer to clinical practice.
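
A minimal sketch of a multilevel (mixed-effects) analysis of the kind described, using statsmodels with observations clustered by participant; the data frame values and column names are hypothetical, not the study's data.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per scanning task per participant.
# Column names (acquisition_score, cognitive_load, prior_us_training, participant)
# are illustrative, not the study's actual variable names.
df = pd.DataFrame({
    "participant": [1, 1, 1, 2, 2, 2, 3, 3, 3, 4, 4, 4],
    "acquisition_score": [62, 70, 66, 55, 58, 60, 80, 85, 83, 74, 79, 77],
    "cognitive_load": [7.1, 6.4, 6.8, 8.2, 7.9, 8.0, 4.3, 4.0, 4.1, 5.5, 5.1, 5.3],
    "prior_us_training": [0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1],
})

# Multilevel linear model: fixed effects for prior training and cognitive load,
# random intercept per participant to account for clustering of observations.
model = smf.mixedlm("acquisition_score ~ prior_us_training + cognitive_load",
                    data=df, groups=df["participant"])
result = model.fit()
print(result.summary())
```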

  14. Computer simulation of steady-state performance of air-to-air heat pumps

    Energy Technology Data Exchange (ETDEWEB)

    Ellison, R D; Creswick, F A

    1978-03-01

    A computer model by which the performance of air-to-air heat pumps can be simulated is described. The intended use of the model is to evaluate analytically the improvements in performance that can be effected by various component improvements. The model is based on a trio of independent simulation programs originated at the Massachusetts Institute of Technology Heat Transfer Laboratory. The three programs have been combined so that user intervention and decision making between major steps of the simulation are unnecessary. The program was further modified by substituting a new compressor model and adding a capillary tube model, both of which are described. Performance predicted by the computer model is shown to be in reasonable agreement with performance data observed in our laboratory. Planned modifications by which the utility of the computer model can be enhanced in the future are described. User instructions and a FORTRAN listing of the program are included.

  15. A high-performance model for shallow-water simulations in distributed and heterogeneous architectures

    Science.gov (United States)

    Conde, Daniel; Canelas, Ricardo B.; Ferreira, Rui M. L.

    2017-04-01

    unstructured nature of the mesh topology with the corresponding employed solution, based on space-filling curves, being analyzed and discussed. Intra-node parallelism is achieved through OpenMP for CPUs and CUDA for GPUs, depending on which kind of device the process is running. Here the main difficulty is associated with the Object-Oriented approach, where the presence of complex data structures can degrade model performance considerably. STAV-2D now supports fully distributed and heterogeneous simulations where multiple different devices can be used to accelerate computation time. The advantages, shortcomings and specific solutions for the employed unified Object-Oriented approach, where the source code for CPU and GPU shares the same compilation units (no device-specific branches as seen in available models), are discussed and quantified with a thorough scalability and performance analysis. The assembled parallel model is expected to achieve faster than real-time simulations for high resolutions (from meters to sub-meter) in large-scale problems (from cities to watersheds), effectively bridging the gap between detailed and timely simulation results. Acknowledgements: This research was partially supported by Portuguese and European funds, within programs COMPETE2020 and PORL-FEDER, through project PTDC/ECM-HID/6387/2014 and Doctoral Grant SFRH/BD/97933/2013 granted by the National Foundation for Science and Technology (FCT). References: Canelas, R.; Murillo, J. & Ferreira, R.M.L. (2013), Two-dimensional depth-averaged modelling of dam-break flows over mobile beds. Journal of Hydraulic Research, 51(4), 392-407. Conde, D. A. S.; Baptista, M. A. V.; Sousa Oliveira, C. & Ferreira, R. M. L. (2013), A shallow-flow model for the propagation of tsunamis over complex geometries and mobile beds, Nat. Hazards and Earth Syst. Sci., 13, 2533-2542. Conde, D. A. S.; Telhado, M. J.; Viana Baptista, M. A. & Ferreira, R. M. L. (2015) Severity and exposure associated with tsunami actions in
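
The record mentions a space-filling-curve-based solution for the unstructured mesh without specifying the curve; the sketch below uses a Morton (Z-order) key, purely as an assumed example, to order cell centres and split them into contiguous chunks per process so that spatially nearby cells tend to stay together.

```python
import random

def morton_key(x, y, bits=16):
    """Interleave the bits of integer grid coordinates (x, y) to get a Z-order key."""
    key = 0
    for i in range(bits):
        key |= ((x >> i) & 1) << (2 * i) | ((y >> i) & 1) << (2 * i + 1)
    return key

def partition_cells(cell_centres, n_parts, domain=1.0, bits=16):
    """Sort cell centres along the Morton curve and split into contiguous chunks,
    one per process; nearby cells tend to land in the same chunk."""
    scale = (2**bits - 1) / domain
    ordered = sorted(range(len(cell_centres)),
                     key=lambda i: morton_key(int(cell_centres[i][0] * scale),
                                              int(cell_centres[i][1] * scale), bits))
    chunk = (len(ordered) + n_parts - 1) // n_parts
    return [ordered[k:k + chunk] for k in range(0, len(ordered), chunk)]

random.seed(0)
centres = [(random.random(), random.random()) for _ in range(20)]
print(partition_cells(centres, n_parts=4))
```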

  16. A satellite simulator for TRMM PR applied to climate model simulations

    Science.gov (United States)

    Spangehl, T.; Schroeder, M.; Bodas-Salcedo, A.; Hollmann, R.; Riley Dellaripa, E. M.; Schumacher, C.

    2017-12-01

    Climate model simulations have to be compared against observation-based datasets in order to assess their skill in representing precipitation characteristics. Here we use a satellite simulator for TRMM PR in order to evaluate simulations performed with MPI-ESM (the Earth system model of the Max Planck Institute for Meteorology in Hamburg, Germany) within the MiKlip project (https://www.fona-miklip.de/, funded by the Federal Ministry of Education and Research in Germany). While classical evaluation methods focus on geophysical parameters such as precipitation amounts, the application of the satellite simulator enables an evaluation in the instrument's parameter space, thereby reducing uncertainties on the reference side. The CFMIP Observation Simulator Package (COSP) provides a framework for the application of satellite simulators to climate model simulations. The approach requires the introduction of sub-grid cloud and precipitation variability. Radar reflectivities are obtained by applying Mie theory, with the microphysical assumptions being chosen to match the atmosphere component of MPI-ESM (ECHAM6). The results are found to be sensitive to the methods used to distribute the convective precipitation over the sub-grid boxes. Simple parameterization methods are used to introduce sub-grid variability of convective clouds and precipitation. In order to constrain uncertainties, a comprehensive comparison with sub-grid scale convective precipitation variability deduced from TRMM PR observations is carried out.
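
The actual forward operator in COSP applies Mie theory with ECHAM6-consistent microphysics; as a much simpler stand-in that conveys the idea of evaluating the model in the instrument's parameter space, the sketch below converts sub-grid precipitation rates to radar reflectivity with a classic Z-R relation. The coefficients and the quoted TRMM PR detection threshold are assumptions taken from common usage, not from the study.

```python
import numpy as np

def rain_rate_to_reflectivity_dbz(rain_rate_mm_h, a=200.0, b=1.6):
    """Convert precipitation rate to radar reflectivity (dBZ) with a Marshall-Palmer-type
    Z = a * R**b relation; a simple stand-in for a Mie-theory forward operator."""
    z_linear = a * np.asarray(rain_rate_mm_h, float) ** b   # mm^6 m^-3
    return 10.0 * np.log10(np.maximum(z_linear, 1e-3))

# sub-grid precipitation rates (mm/h) for one model column after the sub-grid
# variability step; values are illustrative
subcolumns = np.array([0.0, 0.5, 2.0, 10.0, 30.0])
dbz = rain_rate_to_reflectivity_dbz(np.maximum(subcolumns, 1e-6))
print(np.round(dbz, 1))   # compare against TRMM PR's roughly 17 dBZ detection threshold
```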

  17. Software-Engineering Process Simulation (SEPS) model

    Science.gov (United States)

    Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.

    1992-01-01

    The Software Engineering Process Simulation (SEPS) model, developed at JPL, is described. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among various software life cycle development activities and management decision-making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and perform postmortem assessments.

  18. Repository simulation model: Final report

    International Nuclear Information System (INIS)

    1988-03-01

    This report documents the application of computer simulation to the design analysis of the nuclear waste repository's waste handling and packaging operations. The Salt Repository Simulation Model was used to evaluate design alternatives during the conceptual design phase of the Salt Repository Project. Code development and verification were performed by the Office of Nuclear Waste Isolation (ONWL). The focus of this report is to relate the experience gained during the development and application of the Salt Repository Simulation Model to future repository design phases. Design of the repository's waste handling and packaging systems will require sophisticated analysis tools to evaluate complex operational and logistical design alternatives. Selection of these design alternatives in the Advanced Conceptual Design (ACD) and License Application Design (LAD) phases must be supported by analysis to demonstrate that the repository design will cost-effectively meet DOE's mandated emplacement schedule and that uncertainties in the performance of the repository's systems have been objectively evaluated. Computer simulation of repository operations will provide future repository designers with data and insights that no other form of analysis can provide. 6 refs., 10 figs

  19. Structured building model reduction toward parallel simulation

    Energy Technology Data Exchange (ETDEWEB)

    Dobbs, Justin R. [Cornell University]; Hencey, Brondon M. [Cornell University]

    2013-08-26

    Building energy model reduction exchanges accuracy for improved simulation speed by reducing the number of dynamical equations. Parallel computing aims to improve simulation times without loss of accuracy but is poorly utilized by contemporary simulators and is inherently limited by inter-processor communication. This paper bridges these disparate techniques to implement efficient parallel building thermal simulation. We begin with a survey of three structured reduction approaches that compares their performance to a leading unstructured method. We then use structured model reduction to find thermal clusters in the building energy model and allocate processing resources. Experimental results demonstrate faster simulation and low error without any interprocessor communication.

  20. Proceedings of eSim 2006 : IBPSA-Canada's 4. biennial building performance simulation conference

    International Nuclear Information System (INIS)

    Kesik, T.

    2006-01-01

    This conference was attended by professionals, academics and students interested in promoting the science of building performance simulation in order to optimize design, construction, operation and maintenance of new and existing buildings around the world. This biennial conference and exhibition covered all topics related to computerized simulation of a building's energy performance and energy efficiency. Computerized simulation is widely used to predict the environmental performance of buildings during all stages of a building's life cycle, from the design, commissioning, construction, occupancy and management stages. Newly developed simulation methods for optimal comfort in new and existing buildings were evaluated. The themes of the conference were: recent developments for modelling the physical processes relevant to buildings; algorithms for modelling conventional and innovative HVAC systems; methods for modelling whole-building performance; building simulation software development; the use of building simulation tools in code compliance; moving simulation into practice; validation of building simulation software; architectural design; and optimization approaches in building design. The conference also covered the modeling of energy supply systems with reference to renewable energy sources such as ground source heat pumps or hybrid systems incorporating solar energy. The conference featured 32 presentations, of which 28 have been catalogued separately for inclusion in this database. refs., tabs., figs

  1. Simulations of KSTAR high performance steady state operation scenarios

    International Nuclear Information System (INIS)

    Na, Yong-Su; Kessel, C.E.; Park, J.M.; Yi, Sumin; Kim, J.Y.; Becoulet, A.; Sips, A.C.C.

    2009-01-01

    We report the results of predictive modelling of high performance steady state operation scenarios in KSTAR. Firstly, the capabilities of steady state operation are investigated with time-dependent simulations using a free-boundary plasma equilibrium evolution code coupled with transport calculations. Secondly, the reproducibility of high performance steady state operation scenarios developed in the DIII-D tokamak, of similar size to that of KSTAR, is investigated using the experimental data taken from DIII-D. Finally, the capability of ITER-relevant steady state operation is investigated in KSTAR. It is found that KSTAR is able to establish high performance steady state operation scenarios: β_N above 3, H_98(y,2) up to 2.0, f_BS up to 0.76 and f_NI equal to 1.0. In this work, a realistic density profile is newly introduced for predictive simulations by employing the scaling law of the density peaking factor. The influence of the current ramp-up scenario and the transport model is discussed with respect to the fusion performance and non-inductive current drive fraction in the transport simulations. As observed in the experiments, both the heating and the plasma current waveforms in the current ramp-up phase have a strong effect on the q-profile, the fusion performance and also on the non-inductive current drive fraction in the current flattop phase. A criterion in terms of q_min is found to establish ITER-relevant steady state operation scenarios. This will provide a guideline for designing the current ramp-up phase in KSTAR. It is observed that the transport model also affects the predicted fusion performance as well as the non-inductive current drive fraction. The Weiland transport model predicts the highest fusion performance as well as non-inductive current drive fraction in KSTAR. In contrast, the GLF23 model exhibits the lowest. ITER-relevant advanced scenarios cannot be obtained with the GLF23 model under the conditions given in this work.

  2. Power converter topologies for wind energy conversion systems: Integrated modeling, control strategy and performance simulation

    Energy Technology Data Exchange (ETDEWEB)

    Melicio, R.; Catalao, J.P.S. [Department of Electromechanical Engineering, University of Beira Interior, R. Fonte do Lameiro, 6201-001 Covilha (Portugal); Mendes, V.M.F. [Department of Electrical Engineering and Automation, Instituto Superior de Engenharia de Lisboa, R. Conselheiro Emidio Navarro, 1950-062 Lisbon (Portugal)

    2010-10-15

    This paper presents a new integrated model for variable-speed wind energy conversion systems, considering a more accurate dynamic representation of the wind turbine, rotor, generator, power converter and filter. Pulse width modulation by space vector modulation, associated with sliding mode control, is used for controlling the power converters. Also, power factor control is introduced at the output of the power converters. Comprehensive performance simulation studies are carried out with matrix, two-level and multilevel power converter topologies in order to adequately assess the system performance. Conclusions are duly drawn. (author)

  3. Dual Arm Work Package performance estimates and telerobot task network simulation

    International Nuclear Information System (INIS)

    Draper, J.V.

    1997-01-01

    This paper describes the methodology and results of a network simulation study of the Dual Arm Work Package (DAWP), to be employed for dismantling the Argonne National Laboratory CP-5 reactor. The development of the simulation model was based upon the results of a task analysis for the same system. This study was performed by the Oak Ridge National Laboratory (ORNL), in the Robotics and Process Systems Division. Funding was provided by the US Department of Energy's Office of Technology Development, Robotics Technology Development Program (RTDP). The RTDP is developing methods of computer simulation to estimate telerobotic system performance. Data were collected to provide point estimates to be used in a task network simulation model. Three skilled operators performed six repetitions of a pipe cutting task representative of typical teleoperation cutting operations.

  4. Modeling and simulation of pressurized water reactor power plant

    International Nuclear Information System (INIS)

    Wang, S.J.

    1983-01-01

    Two kinds of balance of plant (BOP) models of a pressurized water reactor (PWR) system are developed in this work: a detailed BOP model and a simple BOP model. The detailed model is used to simulate the normal operational performance of the whole BOP system. The simple model is intended to be combined with the NSSS model for whole-plant simulation. The trends of the steady state values of the detailed model are correct and the dynamic responses are reasonable. The simple BOP model approach starts the modelling work from the overall point of view. The responses of the normalized turbine power and of the feedwater inlet temperature to the steam generator in the simple model are compared with those of the detailed model. Both the steady state values and the dynamic responses are close to those of the detailed model. The simple BOP model is found adequate to represent the main performance of the BOP system. The simple BOP model was then coupled with an NSSS model for a whole-plant simulation. The NSSS model consists of the reactor core model, the steam generator model, and the coolant temperature control system. A closed-loop whole-plant simulation for an electric load perturbation was performed. The results are plausible. The coupling effect between the NSSS system and the BOP system was analyzed. The feedback of the BOP system has little effect on the steam generator performance, while the performance of the BOP system is strongly affected by the steam flow rate from the NSSS.

  5. Scale-dependent performances of CMIP5 earth system models in simulating terrestrial vegetation carbon

    Science.gov (United States)

    Jiang, L.; Luo, Y.; Yan, Y.; Hararuk, O.

    2013-12-01

    Mitigation of global change will depend on reliable projections of the future. As the major tools for predicting future climate, the Earth System Models (ESMs) used in the Coupled Model Intercomparison Project Phase 5 (CMIP5) for the IPCC Fifth Assessment Report have incorporated carbon cycle components, which account for the important fluxes of carbon between the ocean, atmosphere, and terrestrial biosphere reservoirs, and are therefore expected to provide more detailed and more certain projections. However, ESMs are never perfect, and evaluating them can help identify uncertainties in prediction and set priorities for model development. In this study, we benchmarked the carbon in live vegetation in the terrestrial ecosystems simulated by 19 CMIP5 ESMs against an observationally estimated data set of the global vegetation carbon pool, 'Olson's Major World Ecosystem Complexes Ranked by Carbon in Live Vegetation: An Updated Database Using the GLC2000 Land Cover Product' by Gibbs (2006). Our aim is to evaluate the ability of ESMs to reproduce the global vegetation carbon pool at different scales and to identify possible causes of bias. We found that the performance of the CMIP5 ESMs is strongly scale-dependent. CESM1-BGC, CESM1-CAM5, CESM1-FASTCHEM and CESM1-WACCM, together with NorESM1-M and NorESM1-ME (which share the same model structure), reproduce the observed global sum closely but usually perform poorly at the grid-cell and biome scales. In contrast, MIROC-ESM and MIROC-ESM-CHEM perform best at the grid-cell and biome scales but show larger differences in the global sum than the others. Our results will help improve CMIP5 ESMs for more reliable prediction.

  6. Model Performance Evaluation and Scenario Analysis (MPESA)

    Science.gov (United States)

    Model Performance Evaluation and Scenario Analysis (MPESA) assesses the performance with which models predict time series data. The tool was developed for the Hydrological Simulation Program-FORTRAN (HSPF) and the Stormwater Management Model (SWMM).
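
A small sketch of the kind of time-series performance assessment such a tool performs; the specific statistics (Nash-Sutcliffe efficiency, percent bias, RMSE) are standard hydrology metrics chosen for illustration and are not necessarily MPESA's exact set.

```python
import numpy as np

def performance_metrics(observed, simulated):
    """Standard goodness-of-fit statistics for simulated vs. observed time series."""
    obs, sim = np.asarray(observed, float), np.asarray(simulated, float)
    nse = 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)
    pbias = 100.0 * np.sum(sim - obs) / np.sum(obs)
    rmse = np.sqrt(np.mean((obs - sim) ** 2))
    return {"NSE": nse, "PBIAS_%": pbias, "RMSE": rmse}

obs = np.array([12.0, 15.0, 30.0, 22.0, 9.0, 6.0])    # e.g. observed daily flows
sim = np.array([10.5, 16.0, 27.0, 25.0, 8.0, 7.5])    # e.g. HSPF/SWMM output
print(performance_metrics(obs, sim))
```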

  7. Performance of process-based models for simulation of grain N in crop rotations across Europe

    DEFF Research Database (Denmark)

    Yin, Xiaogang; Kersebaum, KC; Kollas, C

    2017-01-01

    The accurate estimation of crop grain nitrogen (N; N in grain yield) is crucial for optimizing agricultural N management, especially in crop rotations. In the present study, 12 process-based models were applied to simulate the grain N of i) seven crops in rotations, ii) across various pedo...... (Brassica napus L.). These differences are linked to the intensity of parameterization with better parameterized crops showing lower prediction errors. The model performance was influenced by N fertilization and irrigation treatments, and a majority of the predictions were more accurate under low N...

  8. Urban weather data and building models for the inclusion of the urban heat island effect in building performance simulation.

    Science.gov (United States)

    Palme, M; Inostroza, L; Villacreses, G; Lobato, A; Carrasco, C

    2017-10-01

    This data article presents files supporting the calculation for urban heat island (UHI) inclusion in building performance simulation (BPS). The methodology is used in the research article "From urban climate to energy consumption. Enhancing building performance simulation by including the urban heat island effect" (Palme et al., 2017) [1]. In this research, a Geographical Information System (GIS) study is carried out in order to statistically represent the most important urban scenarios of four South American cities (Guayaquil, Lima, Antofagasta and Valparaíso). Then, a Principal Component Analysis (PCA) is performed to obtain reference Urban Tissue Categories (UTC) to be used in urban weather simulation. The urban weather files are generated using the Urban Weather Generator (UWG) software (version 4.1 beta). Finally, BPS is run with the Transient System Simulation (TRNSYS) software (version 17). In this data paper, four sets of data are presented: 1) PCA data (Excel) explaining how to group different urban samples into representative UTC; 2) UWG data (text) to reproduce the urban weather generation for the UTC used in the four cities (4 UTC in Lima, Guayaquil and Antofagasta, and 5 UTC in Valparaíso); 3) weather data (text) with the resulting rural and urban weather; 4) BPS model data (text) containing the TRNSYS models (four building models).
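
A compact sketch of the PCA-plus-grouping step used to derive representative Urban Tissue Categories from GIS samples; the morphology indicators, the number of components and the use of k-means for the grouping are assumptions for illustration, not the study's exact procedure.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical urban-morphology indicators for GIS samples (one row per sample);
# the feature set is illustrative, not the study's exact indicator list.
rng = np.random.default_rng(0)
samples = rng.random((40, 4))      # e.g. building density, mean height, vegetation, albedo

# Standardize, project onto the leading principal components, then group the samples
# into a handful of representative Urban Tissue Categories (UTC).
scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(samples))
utc_labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scores)
print(utc_labels)                  # one UTC label per urban sample
```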

  9. Transient performance simulation of aircraft engine integrated with fuel and control systems

    International Nuclear Information System (INIS)

    Wang, C.; Li, Y.G.; Yang, B.Y.

    2017-01-01

    Highlights: • A new performance simulation method for engine hydraulic fuel systems is introduced. • The time delay of engine performance due to the fuel system model is noticeable but small. • The method provides details of fuel system behavior in engine transient processes. • The method could be used to support engine and fuel system designs. - Abstract: A new method for the simulation of gas turbine fuel systems based on an inter-component volume method has been developed. It is able to simulate the performance of each of the hydraulic components of a fuel system using physics-based models, which potentially offers more accurate results than those based on transfer functions. A transient performance simulation system has been set up for gas turbine engines based on an inter-component volume (ICV) method. A proportional-integral (PI) control strategy is used for the simulation of the engine controller. An integrated model of the engine and its control and hydraulic fuel systems has been set up to investigate their coupling effect during engine transient processes. The developed simulation system has been applied to a model aero engine. The results show that the delay of the engine transient response due to the inclusion of the fuel system model is noticeable although relatively small. The developed method is generic and can be applied to any other gas turbine and its control and fuel systems.
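
As a toy illustration of the PI control strategy mentioned above, the sketch below closes a loop between a discrete PI controller commanding fuel flow and a first-order spool-speed response; the gains, time constant and units are invented for the example and do not come from the paper's engine model.

```python
def simulate_pi_fuel_control(setpoint_rpm=9000.0, kp=2e-4, ki=2e-4,
                             dt=0.01, t_end=10.0, tau=1.5, gain=5e3):
    """Toy closed loop: a discrete PI controller adjusts fuel flow to drive a
    first-order spool-speed response toward the setpoint. All parameter values
    are illustrative assumptions."""
    n_steps = int(t_end / dt)
    speed, integral = 0.0, 0.0
    history = []
    for _ in range(n_steps):
        error = setpoint_rpm - speed
        integral += error * dt
        fuel = kp * error + ki * integral          # PI law -> fuel flow command
        speed += dt * (gain * fuel - speed) / tau  # first-order engine response
        history.append(speed)
    return history

trace = simulate_pi_fuel_control()
print(round(trace[-1], 1))   # spool speed settles near the 9000 rpm setpoint
```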

  10. Model and tool requirements for co-simulation of building performance

    NARCIS (Netherlands)

    Trcka, M.; Hensen, J.L.M.

    2006-01-01

    The use of building performance simulation (BPS) can substantially help in improving building design towards higher occupant comfort and lower fuel consumption, while reducing emission of greenhouse gasses. Unfortunately, current BPS tools do not allow inter-tool communication and thus limit a

  11. A predictive model of nuclear power plant crew decision-making and performance in a dynamic simulation environment

    Science.gov (United States)

    Coyne, Kevin Anthony

    The safe operation of complex systems such as nuclear power plants requires close coordination between the human operators and plant systems. In order to maintain an adequate level of safety following an accident or other off-normal event, the operators often are called upon to perform complex tasks during dynamic situations with incomplete information. The safety of such complex systems can be greatly improved if the conditions that could lead operators to make poor decisions and commit erroneous actions during these situations can be predicted and mitigated. The primary goal of this research project was the development and validation of a cognitive model capable of simulating nuclear plant operator decision-making during accident conditions. Dynamic probabilistic risk assessment methods can improve the prediction of human error events by providing rich contextual information and an explicit consideration of feedback arising from man-machine interactions. The Accident Dynamics Simulator paired with the Information, Decision, and Action in a Crew context cognitive model (ADS-IDAC) shows promise for predicting situational contexts that might lead to human error events, particularly knowledge driven errors of commission. ADS-IDAC generates a discrete dynamic event tree (DDET) by applying simple branching rules that reflect variations in crew responses to plant events and system status changes. Branches can be generated to simulate slow or fast procedure execution speed, skipping of procedure steps, reliance on memorized information, activation of mental beliefs, variations in control inputs, and equipment failures. Complex operator mental models of plant behavior that guide crew actions can be represented within the ADS-IDAC mental belief framework and used to identify situational contexts that may lead to human error events. This research increased the capabilities of ADS-IDAC in several key areas. The ADS-IDAC computer code was improved to support additional

  12. Methodology and Toolset for Model Verification, Hardware/Software co-simulation, Performance Optimisation and Customisable Source-code generation

    DEFF Research Database (Denmark)

    Berger, Michael Stübert; Soler, José; Yu, Hao

    2013-01-01

    The MODUS project aims to provide a pragmatic and viable solution that will allow SMEs to substantially improve their positioning in the embedded-systems development market. The MODUS tool will provide a model verification and Hardware/Software co-simulation tool (TRIAL) and a performance optimisation and customisable source-code generation tool (TUNE). The concept is depicted in automated modelling and optimisation of embedded-systems development. The tool will enable model verification by guiding the selection of existing open-source model verification engines, based on the automated analysis

  13. Facility/equipment performance evaluation using microcomputer simulation analysis

    International Nuclear Information System (INIS)

    Chockie, A.D.; Hostick, C.J.

    1985-08-01

    A computer simulation analysis model was developed at the Pacific Northwest Laboratory to assist in assuring the adequacy of the Monitored Retrievable Storage facility design to meet the specified spent nuclear fuel throughput requirements. The microcomputer-based model was applied to the analysis of material flow, equipment capability and facility layout. The simulation analysis evaluated uncertainties concerning both facility throughput requirements and process duration times as part of the development of a comprehensive estimate of facility performance. The evaluations provided feedback into the design review task to identify areas where design modifications should be considered.

  14. Predictive Maturity of Multi-Scale Simulation Models for Fuel Performance

    International Nuclear Information System (INIS)

    Atamturktur, Sez; Unal, Cetin; Hemez, Francois; Williams, Brian; Tome, Carlos

    2015-01-01

    The project proposed to provide a Predictive Maturity Framework with its companion metrics that (1) introduce a formalized, quantitative means to communicate information between interested parties, (2) provide scientifically dependable means to claim completion of Validation and Uncertainty Quantification (VU) activities, and (3) guide the decision makers in the allocation of Nuclear Energy's resources for code development and physical experiments. The project team proposed to develop this framework based on two complementary criteria: (1) the extent of experimental evidence available for the calibration of simulation models and (2) the sophistication of the physics incorporated in simulation models. The proposed framework is capable of quantifying the interaction between the required number of physical experiments and the degree of physics sophistication. The project team has developed this framework and implemented it with a multi-scale model for simulating creep of reactor core cladding. The multi-scale model is composed of the viscoplastic self-consistent (VPSC) code at the meso-scale, which represents the visco-plastic behavior and changing properties of a highly anisotropic material, and a Finite Element (FE) code at the macro-scale to represent the elastic behavior and apply the loading. The framework developed takes advantage of the transparency provided by partitioned analysis, where independent constituent codes are coupled in an iterative manner. This transparency allows model developers to better understand and remedy the sources of biases and uncertainties, whether they stem from the constituents or the coupling interface, by exploiting separate-effect experiments conducted within the constituent domain and integral-effect experiments conducted within the full-system domain. The project team has implemented this procedure with the multi-scale VPSC-FE model and demonstrated its ability to improve the predictive capability of the model. Within this

  15. Predictive Maturity of Multi-Scale Simulation Models for Fuel Performance

    Energy Technology Data Exchange (ETDEWEB)

    Atamturktur, Sez [Clemson Univ., SC (United States); Unal, Cetin [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hemez, Francois [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Williams, Brian [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Tome, Carlos [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-03-16

    The project proposed to provide a Predictive Maturity Framework with its companion metrics that (1) introduce a formalized, quantitative means to communicate information between interested parties, (2) provide scientifically dependable means to claim completion of Validation and Uncertainty Quantification (VU) activities, and (3) guide the decision makers in the allocation of Nuclear Energy’s resources for code development and physical experiments. The project team proposed to develop this framework based on two complementary criteria: (1) the extent of experimental evidence available for the calibration of simulation models and (2) the sophistication of the physics incorporated in simulation models. The proposed framework is capable of quantifying the interaction between the required number of physical experiments and the degree of physics sophistication. The project team has developed this framework and implemented it with a multi-scale model for simulating creep of reactor core cladding. The multi-scale model is composed of the viscoplastic self-consistent (VPSC) code at the meso-scale, which represents the visco-plastic behavior and changing properties of a highly anisotropic material, and a Finite Element (FE) code at the macro-scale to represent the elastic behavior and apply the loading. The framework developed takes advantage of the transparency provided by partitioned analysis, where independent constituent codes are coupled in an iterative manner. This transparency allows model developers to better understand and remedy the sources of biases and uncertainties, whether they stem from the constituents or the coupling interface, by exploiting separate-effect experiments conducted within the constituent domain and integral-effect experiments conducted within the full-system domain. The project team has implemented this procedure with the multi-scale VPSC-FE model and demonstrated its ability to improve the predictive capability of the model. Within this

  16. Investigation the performance of 0-D and 3-d combustion simulation softwares for modelling HCCI engine with high air excess ratios

    Directory of Open Access Journals (Sweden)

    Gökhan Coşkun

    2017-10-01

    Full Text Available In this study, the performance of zero- and three-dimensional simulation codes used to simulate a homogeneous charge compression ignition (HCCI) engine fueled with primary reference fuel (PRF, 85% iso-octane and 15% n-heptane) was investigated. The 0-D code, SRM Suite (Stochastic Reactor Model), which simulates engine combustion using a stochastic reactor model technique, was used. Ansys-Fluent, which simulates computational fluid dynamics (CFD), was used for the 3-D engine combustion simulations. The simulations of both commercial codes were evaluated in terms of combustion, heat transfer and emissions in an HCCI engine. A chemical kinetic mechanism developed by Tsurushima, including 33 species and 38 reactions for the surrogate PRF fuel, was used for the combustion simulations. The analysis showed that both codes have advantages over each other.

  17. Coupling of Large Eddy Simulations with Meteorological Models to simulate Methane Leaks from Natural Gas Storage Facilities

    Science.gov (United States)

    Prasad, K.

    2017-12-01

    Atmospheric transport is usually performed with weather models, e.g., the Weather Research and Forecasting (WRF) model, which employs a parameterized turbulence model and does not resolve the fine-scale dynamics generated by the flow around the buildings and features comprising a large city. The NIST Fire Dynamics Simulator (FDS) is a computational fluid dynamics model that utilizes large eddy simulation methods to model flow around buildings at length scales much smaller than is practical with models like WRF. FDS has the potential to evaluate the impact of complex topography on near-field dispersion and mixing that is difficult to simulate with a mesoscale atmospheric model. A methodology has been developed to couple the FDS model with WRF mesoscale transport models. The coupling is based on nudging the FDS flow field towards that computed by WRF, and is currently limited to one-way coupling performed in an off-line mode. This approach allows the FDS model to operate as a sub-grid scale model within a WRF simulation. To test and validate the coupled FDS-WRF model, the methane leak from the Aliso Canyon underground storage facility was simulated. Large eddy simulations were performed over the complex topography of various natural gas storage facilities, including Aliso Canyon, Honor Rancho and MacDonald Island, at 10 m horizontal and vertical resolution. The goals of these simulations included improving and validating transport models as well as testing leak hypotheses. Forward simulation results were compared with aircraft- and tower-based in situ measurements as well as methane plumes observed using the NASA Airborne Visible InfraRed Imaging Spectrometer (AVIRIS) and the next-generation instrument AVIRIS-NG. Comparison of simulation results with measurement data demonstrates the capability of the coupled FDS-WRF models to accurately simulate the transport and dispersion of methane plumes over urban domains. Simulated integrated methane enhancements will be presented and
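
The coupling described above nudges the FDS flow field toward the WRF-computed wind; the sketch below shows the generic relaxation idea on a toy velocity field, with the relaxation time scale and the simple explicit update being assumptions for illustration rather than the actual FDS implementation.

```python
import numpy as np

def nudge_velocity(u_les, u_wrf, dt, tau=60.0):
    """One-way nudging of an LES velocity field toward the mesoscale (WRF) wind:
    du/dt += (u_wrf - u_les) / tau, integrated with a simple explicit step."""
    return u_les + dt * (u_wrf - u_les) / tau

# toy 3D LES u-velocity field being relaxed toward a uniform 5 m/s WRF wind
u_les = np.random.normal(3.0, 1.0, size=(16, 16, 8))
u_wrf = np.full_like(u_les, 5.0)
for _ in range(600):                      # 600 steps of dt = 1 s
    u_les = nudge_velocity(u_les, u_wrf, dt=1.0)
print(round(float(u_les.mean()), 2))      # mean drifts toward 5 m/s
```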

  18. Performance evaluation of RANS-based turbulence models in simulating a honeycomb heat sink

    Science.gov (United States)

    Subasi, Abdussamet; Ozsipahi, Mustafa; Sahin, Bayram; Gunes, Hasan

    2017-07-01

    As is well known, there is no universal turbulence model that can be used for all engineering problems. Each turbulence model has specific applications for which it is appropriate, and it is vital to select a model and wall function combination that matches the physics of the problem considered. Therefore, in this study, the performance of six well-known Reynolds-Averaged Navier-Stokes (RANS) based turbulence models, namely the Standard k-ɛ, the Renormalization Group k-ɛ, the Realizable k-ɛ, the Reynolds Stress Model, the k-ω and the Shear Stress Transport k-ω, and accompanying wall functions, namely the standard, the non-equilibrium and the enhanced, is evaluated via 3D simulation of a honeycomb heat sink. The CutCell method is used to generate the grid for the part containing the heat sink, called the test section, while a hexahedral mesh is employed to discretize the inlet and outlet sections. A grid convergence study is conducted for the verification process, while experimental data and well-known correlations are used to validate the numerical results. Prediction of the pressure drop along the test section, the mean base plate temperature of the heat sink and the temperature at the test section outlet is regarded as a measure of the performance of the employed models and wall functions. The results indicate that the selection of turbulence model and wall function has a great influence on the results and therefore needs to be made carefully. Hydraulic and thermal characteristics of the honeycomb heat sink can be determined with reasonable accuracy using RANS-based turbulence models, provided that a suitable turbulence model and wall function combination is selected.

  19. A predictive analytic model for high-performance tunneling field-effect transistors approaching non-equilibrium Green's function simulations

    International Nuclear Information System (INIS)

    Salazar, Ramon B.; Appenzeller, Joerg; Ilatikhameneh, Hesameddin; Rahman, Rajib; Klimeck, Gerhard

    2015-01-01

    A new compact modeling approach is presented which describes the full current-voltage (I-V) characteristic of high-performance (aggressively scaled-down) tunneling field-effect-transistors (TFETs) based on homojunction direct-bandgap semiconductors. The model is based on an analytic description of two key features, which capture the main physical phenomena related to TFETs: (1) the potential profile from source to channel and (2) the elliptic curvature of the complex bands in the bandgap region. It is proposed to use 1D Poisson's equations in the source and the channel to describe the potential profile in homojunction TFETs. This makes it possible to quantify the impact of source/drain doping on device performance, an aspect usually ignored in TFET modeling but highly relevant in ultra-scaled devices. The compact model is validated by comparison with state-of-the-art quantum transport simulations using a 3D full-band atomistic approach based on non-equilibrium Green's functions. It is shown that the model reproduces with good accuracy the data obtained from the simulations in all regions of operation: the on/off states and the n/p branches of conduction. This approach allows calculation of energy-dependent band-to-band tunneling currents in TFETs, a feature that provides deep insight into the underlying device physics. The simplicity and accuracy of the approach provide a powerful tool to explore in a quantitative manner how a wide variety of parameters (material-, size-, and/or geometry-dependent) impact the TFET performance under any bias conditions. The proposed model thus presents a practical complement to computationally expensive simulations such as the 3D NEGF approach.
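
    To make the role of the 1D electrostatics concrete, the following minimal sketch solves a one-dimensional Poisson equation with fixed boundary potentials by a standard finite-difference approach; the doping level, permittivity and contact potentials are illustrative assumptions, and the sketch is not the analytic potential model proposed in the paper.

```python
import numpy as np

# Minimal 1D Poisson sketch (illustrative only, not the paper's analytic model):
# solve d2(phi)/dx2 = -rho/eps with fixed potentials at both contacts using a
# standard finite-difference discretization.
def poisson_1d(rho, eps, dx, phi_left, phi_right):
    n = rho.size
    A = np.zeros((n, n))
    b = -rho / eps * dx**2
    for i in range(n):
        A[i, i] = -2.0
        if i > 0:
            A[i, i - 1] = 1.0
        if i < n - 1:
            A[i, i + 1] = 1.0
    b[0] -= phi_left                   # Dirichlet boundary values folded into b
    b[-1] -= phi_right
    return np.linalg.solve(A, b)

# e.g. a uniformly doped 20 nm source region between two contacts (assumed numbers)
nm, q, eps0 = 1e-9, 1.602e-19, 8.854e-12
x = np.linspace(0.0, 20 * nm, 200)
rho = np.full_like(x, q * 1e25)        # 1e19 cm^-3 ionized donors, in C/m^3
phi = poisson_1d(rho, eps=11.7 * eps0, dx=x[1] - x[0], phi_left=0.0, phi_right=0.3)
print(float(phi.min()), float(phi.max()))
```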

  20. Conceptual Modeling Framework for E-Area PA HELP Infiltration Model Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Dyer, J. A. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-11-30

    A conceptual modeling framework based on the proposed E-Area Low-Level Waste Facility (LLWF) closure cap design is presented for conducting Hydrologic Evaluation of Landfill Performance (HELP) model simulations of intact and subsided cap infiltration scenarios for the next E-Area Performance Assessment (PA).

  1. SEMI Modeling and Simulation Roadmap

    Energy Technology Data Exchange (ETDEWEB)

    Hermina, W.L.

    2000-10-02

    With the exponential growth in the power of computing hardware and software, modeling and simulation is becoming a key enabler for the rapid design of reliable Microsystems. One vision of the future microsystem design process would include the following primary software capabilities: (1) The development of 3D part design, through standard CAD packages, with automatic design rule checks that guarantee the manufacturability and performance of the microsystem. (2) Automatic mesh generation, for 3D parts as manufactured, that permits computational simulation of the process steps, and the performance and reliability analysis for the final microsystem. (3) Computer generated 2D layouts for process steps that utilize detailed process models to generate the layout and process parameter recipe required to achieve the desired 3D part. (4) Science-based computational tools that can simulate the process physics, and the coupled thermal, fluid, structural, solid mechanics, electromagnetic and material response governing the performance and reliability of the microsystem. (5) Visualization software that permits the rapid visualization of 3D parts including cross-sectional maps, performance and reliability analysis results, and process simulation results. In addition to these desired software capabilities, a desired computing infrastructure would include massively parallel computers that enable rapid high-fidelity analysis, coupled with networked compute servers that permit computing at a distance. We now discuss the individual computational components that are required to achieve this vision. There are three primary areas of focus: design capabilities, science-based capabilities and computing infrastructure. Within each of these areas, there are several key capability requirements.

  2. Numerical Simulation and Performance Analysis of Twin Screw Air Compressors

    Directory of Open Access Journals (Sweden)

    W. S. Lee

    2001-01-01

    Full Text Available A theoretical model is proposed in this paper in order to study the performance of oil-less and oil-injected twin screw air compressors. Based on this model, a computer simulation program is developed and the effects of different design parameters including rotor profile, geometric clearance, oil-injected angle, oil temperature, oil flow rate, built-in volume ratio and other operation conditions on the performance of twin screw air compressors are investigated. The simulation program gives us output variables such as specific power, compression ratio, compression efficiency, volumetric efficiency, and discharge temperature. Some of the above results are then compared with experimentally measured data and good agreement is found between the simulation results and the measured data.

  3. Urban weather data and building models for the inclusion of the urban heat island effect in building performance simulation

    Directory of Open Access Journals (Sweden)

    M. Palme

    2017-10-01

    Full Text Available This data article presents files supporting the calculation for urban heat island (UHI) inclusion in building performance simulation (BPS). The methodology is used in the research article "From urban climate to energy consumption. Enhancing building performance simulation by including the urban heat island effect" (Palme et al., 2017 [1]). In this research, a Geographical Information System (GIS) study is done in order to statistically represent the most important urban scenarios of four South-American cities (Guayaquil, Lima, Antofagasta and Valparaíso). Then, a Principal Component Analysis (PCA) is done to obtain reference Urban Tissue Categories (UTC) to be used in urban weather simulation. The urban weather files are generated by using the Urban Weather Generator (UWG) software (version 4.1 beta). Finally, BPS is run with the Transient System Simulation (TRNSYS) software (version 17). In this data paper, four sets of data are presented: 1) PCA data (Excel) explaining how to group different urban samples into representative UTC; 2) UWG data (text) to reproduce the urban weather generation for the UTC used in the four cities (4 UTC in Lima, Guayaquil and Antofagasta, and 5 UTC in Valparaíso); 3) weather data (text) with the resulting rural and urban weather; 4) BPS models (text) containing the TRNSYS models (four building models).

  4. Advanced training simulator models. Implementation and validation

    International Nuclear Information System (INIS)

    Borkowsky, Jeffrey; Judd, Jerry; Belblidia, Lotfi; O'farrell, David; Andersen, Peter

    2008-01-01

    Modern training simulators are required to replicate plant data for both thermal-hydraulic and neutronic response. Replication is required such that reactivity manipulation on the simulator properly trains the operator for reactivity manipulation at the plant. This paper discusses advanced models which perform this function in real time using the coupled code system THOR/S3R. This code system models all fluid systems in detail using an advanced two-phase thermal-hydraulic model. The nuclear core is modeled using an advanced, three-dimensional nodal method and also by using cycle-specific nuclear data. These models are configured to run interactively from a graphical instructor station or hardware operation panels. The simulator models are theoretically rigorous and are expected to replicate the physics of the plant. However, to verify replication, the models must be independently assessed. Plant data is the preferred validation method, but plant data is often not available for many important training scenarios. In the absence of data, validation may be obtained by slower-than-real-time transient analysis. This analysis can be performed by coupling a safety analysis code and a core design code. Such a coupling exists between the codes RELAP5 and SIMULATE-3K (S3K). RELAP5/S3K is used to validate the real-time model for several postulated plant events. (author)

  5. Performance analyses of naval ships based on engineering level of simulation at the initial design stage

    Directory of Open Access Journals (Sweden)

    Dong-Hoon Jeong

    2017-07-01

    Full Text Available Naval ships are assigned many and varied missions. Their performance is critical for mission success, and depends on the specifications of the components. This is why performance analyses of naval ships are required at the initial design stage. Since the design and construction of naval ships take a very long time and incur a huge cost, Modeling and Simulation (M&S) is an effective method for performance analyses. Thus, in this study, a simulation core is proposed to analyze the performance of naval ships considering their specifications. This simulation core can perform the engineering level of simulations, considering the mathematical models for naval ships, such as maneuvering equations and passive sonar equations. Also, the simulation models of the simulation core follow Discrete EVent system Specification (DEVS) and Discrete Time System Specification (DTSS) formalisms, so that simulations can progress over discrete events and discrete times. In addition, applying DEVS and DTSS formalisms makes the structure of simulation models flexible and reusable. To verify the applicability of this simulation core, it was applied to simulations for the performance analyses of a submarine in an Anti-SUrface Warfare (ASUW) mission. These simulations were composed of two scenarios. The first scenario, submarine diving, carried out maneuvering performance analysis by analyzing the pitch angle variation and depth variation of the submarine over time. The second scenario, submarine detection, carried out detection performance analysis by analyzing how well the sonar of the submarine resolves adjacent targets. The results of these simulations confirm that the simulation core of this study can be applied to the performance analyses of naval ships considering their specifications.
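
    The following sketch illustrates the general flavour of an atomic DEVS model driven by timestamped external events, using a toy "sonar processor" that emits a detection report a fixed processing time after a contact arrives; the class and event names are assumptions for this example and are unrelated to the simulation core described in the paper.

```python
class SonarProcessor:
    """Toy atomic-DEVS-style model: idle until a contact arrives, then busy for a
    fixed processing time before emitting a detection report; contacts arriving
    while busy are discarded."""
    def __init__(self, processing_time=5.0):
        self.processing_time = processing_time
        self.phase, self.sigma, self.contact = "idle", float("inf"), None

    def time_advance(self):                        # ta(s)
        return self.sigma

    def ext_transition(self, elapsed, contact):    # delta_ext(s, e, x)
        if self.phase == "idle":
            self.phase, self.sigma, self.contact = "busy", self.processing_time, contact
        else:
            self.sigma -= elapsed                  # keep the pending internal event on schedule

    def int_transition(self):                      # lambda(s) followed by delta_int(s)
        report = (self.contact, "detected")
        self.phase, self.sigma = "idle", float("inf")
        return report

def run(model, external_events):
    """Drive one atomic model over timestamped external events (time, contact)."""
    t, outputs = 0.0, []
    for ev_time, contact in sorted(external_events):
        # fire any internal events scheduled before the next external event
        while t + model.time_advance() <= ev_time:
            t += model.time_advance()
            outputs.append((t, model.int_transition()))
        model.ext_transition(ev_time - t, contact)
        t = ev_time
    if model.time_advance() != float("inf"):       # flush the last pending internal event
        t += model.time_advance()
        outputs.append((t, model.int_transition()))
    return outputs

print(run(SonarProcessor(), [(1.0, "contact-A"), (2.0, "contact-B"), (20.0, "contact-C")]))
```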

  6. A cycle simulation model for predicting the performance of a diesel engine fuelled by diesel and biodiesel blends

    International Nuclear Information System (INIS)

    Gogoi, T.K.; Baruah, D.C.

    2010-01-01

    Among the alternative fuels, biodiesel and its blends are considered suitable and the most promising fuels for diesel engines. The properties of biodiesel are similar to those of diesel. Many researchers have experimentally evaluated the performance characteristics of conventional diesel engines fuelled by biodiesel and its blends. However, experiments require enormous effort, money and time. Hence, a cycle simulation model incorporating a thermodynamics-based single-zone combustion model is developed to predict the performance of a diesel engine. The effect of engine speed and compression ratio on brake power and brake thermal efficiency is analysed through the model. The fuels considered for the analysis are diesel and 20%, 40% and 60% blends of diesel with biodiesel derived from Karanja oil (Pongamia glabra). The model predicts similar performance for diesel and the 20% and 40% blends. However, with the 60% blend, it reveals better performance in terms of brake power and brake thermal efficiency.
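
    A thermodynamic single-zone cycle model of the kind mentioned above can be sketched, under simplifying assumptions, as a Wiebe heat-release law combined with the first law integrated over crank angle; all geometry and fuel-energy numbers below are illustrative and the sketch is not the authors' model.

```python
import numpy as np

# Minimal single-zone cycle sketch (illustrative assumptions, not the authors'
# model): Wiebe heat release plus the first law for an ideal gas, integrated
# over crank angle with explicit Euler steps.
gamma, Q_total = 1.35, 1200.0                       # heat capacity ratio, heat released per cycle [J]
bore, stroke, conrod, cr = 0.10, 0.10, 0.16, 17.5   # geometry [m], compression ratio
theta_soc, theta_dur = np.radians(-10), np.radians(50)   # start and duration of combustion

def cylinder_volume(theta):
    """Slider-crank cylinder volume [m^3] as a function of crank angle [rad], TDC at 0."""
    a = stroke / 2.0
    Vd = np.pi / 4 * bore**2 * stroke               # displaced volume
    Vc = Vd / (cr - 1.0)                            # clearance volume
    s = a * np.cos(theta) + np.sqrt(conrod**2 - (a * np.sin(theta))**2)
    return Vc + np.pi / 4 * bore**2 * (conrod + a - s)

def burn_fraction(theta, a=5.0, m=2.0):
    """Cumulative mass fraction burned (Wiebe function)."""
    x = np.clip((theta - theta_soc) / theta_dur, 0.0, None)
    return 1.0 - np.exp(-a * x**(m + 1))

# integrate dp/dtheta = (gamma-1)/V * dQ/dtheta - gamma*p/V * dV/dtheta from BDC to BDC
theta = np.linspace(np.radians(-180), np.radians(180), 20000)
p = np.empty_like(theta)
p[0] = 1.0e5                                        # pressure at bottom dead centre [Pa]
for i in range(1, theta.size):
    dth = theta[i] - theta[i - 1]
    V = cylinder_volume(theta[i - 1])
    dV = (cylinder_volume(theta[i]) - V) / dth
    dQ = Q_total * (burn_fraction(theta[i]) - burn_fraction(theta[i - 1])) / dth
    p[i] = p[i - 1] + dth * ((gamma - 1.0) / V * dQ - gamma * p[i - 1] / V * dV)

Vs = np.array([cylinder_volume(t) for t in theta])
work = float(np.sum(0.5 * (p[1:] + p[:-1]) * np.diff(Vs)))   # indicated work per cycle [J]
print(f"indicated work: {work:.0f} J, indicated thermal efficiency: {work / Q_total:.2f}")
```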

  7. High performance cellular level agent-based simulation with FLAME for the GPU.

    Science.gov (United States)

    Richmond, Paul; Walker, Dawn; Coakley, Simon; Romano, Daniela

    2010-05-01

    Driven by the availability of experimental data and ability to simulate a biological scale which is of immediate interest, the cellular scale is fast emerging as an ideal candidate for middle-out modelling. As with 'bottom-up' simulation approaches, cellular level simulations demand a high degree of computational power, which in large-scale simulations can only be achieved through parallel computing. The flexible large-scale agent modelling environment (FLAME) is a template driven framework for agent-based modelling (ABM) on parallel architectures ideally suited to the simulation of cellular systems. It is available for both high performance computing clusters (www.flame.ac.uk) and GPU hardware (www.flamegpu.com) and uses a formal specification technique that acts as a universal modelling format. This not only creates an abstraction from the underlying hardware architectures, but avoids the steep learning curve associated with programming them. In benchmarking tests and simulations of advanced cellular systems, FLAME GPU has reported massive improvement in performance over more traditional ABM frameworks. This allows the time spent in the development and testing stages of modelling to be drastically reduced and creates the possibility of real-time visualisation for simple visual face-validation.

  8. Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC): gap analysis for high fidelity and performance assessment code development

    International Nuclear Information System (INIS)

    Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe Jr.; Webb, Stephen Walter; Dewers, Thomas A.; Mariner, Paul E.; Edwards, Harold Carter; Fuller, Timothy J.; Freeze, Geoffrey A.; Jove-Colon, Carlos F.; Wang, Yifeng

    2011-01-01

    This report describes a gap analysis performed in the process of developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with rigorous verification, validation, and software quality requirements. The gap analyses documented in this report were performed during an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC, and during follow-on activities that delved into more detailed assessments of the various codes that were acquired, studied, and tested. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. The gap analysis indicates that significant capabilities may already exist in the existing THC codes although there is no single code able to fully account for all physical and chemical processes involved in a waste disposal system. Large gaps exist in modeling chemical processes and their couplings with other processes. The coupling of chemical processes with flow transport and mechanical deformation remains challenging. The data for extreme environments (e.g., for elevated temperature and high ionic strength media) that are

  9. Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC) : gap analysis for high fidelity and performance assessment code development.

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe, Jr.; Webb, Stephen Walter; Dewers, Thomas A.; Mariner, Paul E.; Edwards, Harold Carter; Fuller, Timothy J.; Freeze, Geoffrey A.; Jove-Colon, Carlos F.; Wang, Yifeng

    2011-03-01

    This report describes a gap analysis performed in the process of developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with rigorous verification, validation, and software quality requirements. The gap analyses documented in this report were performed during an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC, and during follow-on activities that delved into more detailed assessments of the various codes that were acquired, studied, and tested. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. The gap analysis indicates that significant capabilities may already exist in the existing THC codes although there is no single code able to fully account for all physical and chemical processes involved in a waste disposal system. Large gaps exist in modeling chemical processes and their couplings with other processes. The coupling of chemical processes with flow transport and mechanical deformation remains challenging. The data for extreme environments (e.g., for elevated temperature and high ionic strength media) that are

  10. Protein Simulation Data in the Relational Model.

    Science.gov (United States)

    Simms, Andrew M; Daggett, Valerie

    2012-10-01

    High performance computing is leading to unprecedented volumes of data. Relational databases offer a robust and scalable model for storing and analyzing scientific data. However, these features do not come without a cost: significant design effort is required to build a functional and efficient repository. Modeling protein simulation data in a relational database presents several challenges: the data captured from individual simulations are large, multi-dimensional, and must integrate with both simulation software and external data sites. Here we present the dimensional design and relational implementation of a comprehensive data warehouse for storing and analyzing molecular dynamics simulations using SQL Server.
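
    As a rough illustration of a dimensional layout for trajectory data, the sketch below builds a small star-style schema in SQLite (a fact table of per-frame, per-residue measurements keyed to simulation and residue dimensions); the table and column names are assumptions for this example and do not describe the SQL Server warehouse presented in the paper.

```python
import sqlite3

# Assumed star-style schema for illustration only; it does not describe the
# SQL Server warehouse presented in the paper.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_simulation (
    sim_id       INTEGER PRIMARY KEY,
    protein      TEXT NOT NULL,
    temperature  REAL,           -- Kelvin
    force_field  TEXT
);
CREATE TABLE dim_residue (
    residue_id   INTEGER PRIMARY KEY,
    sim_id       INTEGER REFERENCES dim_simulation(sim_id),
    chain        TEXT,
    residue_num  INTEGER,
    residue_name TEXT
);
CREATE TABLE fact_frame_residue (
    sim_id       INTEGER REFERENCES dim_simulation(sim_id),
    residue_id   INTEGER REFERENCES dim_residue(residue_id),
    frame        INTEGER,        -- snapshot index within the trajectory
    time_ps      REAL,
    rmsd         REAL,           -- vs. the starting structure
    sasa         REAL,           -- solvent-accessible surface area
    PRIMARY KEY (sim_id, residue_id, frame)
);
""")

conn.execute("INSERT INTO dim_simulation VALUES (1, 'example protein', 298.0, 'example-ff')")
conn.execute("INSERT INTO dim_residue VALUES (10, 1, 'A', 10, 'LYS')")
conn.execute("INSERT INTO fact_frame_residue VALUES (1, 10, 0, 0.0, 0.0, 112.5)")

# a typical analytical query: average per-residue RMSD for one simulation
for row in conn.execute("""
        SELECT r.residue_num, AVG(f.rmsd)
        FROM fact_frame_residue f JOIN dim_residue r USING (residue_id)
        WHERE f.sim_id = 1 GROUP BY r.residue_num"""):
    print(row)
```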

  11. Switching performance of OBS network model under prefetched real traffic

    Science.gov (United States)

    Huang, Zhenhua; Xu, Du; Lei, Wen

    2005-11-01

    Optical Burst Switching (OBS) [1] is now widely considered an efficient switching technique for building the next-generation optical Internet, so it is important to evaluate the performance of the OBS network model precisely. The performance of the OBS network model varies under different conditions, but the most important question is how it works under real traffic load. In traditional simulation models, uniform traffic is usually generated by simulation software to imitate the data source of the edge node in the OBS network model, and the performance of the OBS network is evaluated on that basis. Unfortunately, without being driven by real traffic, the traditional simulation models have several problems and their results are doubtful. To deal with this problem, we present a new simulation model for analysis and performance evaluation of the OBS network, which uses prefetched IP traffic as the data source of the OBS network model. The prefetched IP traffic can be considered a real IP source for the OBS edge node, and the OBS network model has the same clock rate as a real OBS system. It is therefore easy to conclude that this model is closer to a real OBS system than the traditional ones. The simulation results also indicate that this model evaluates the performance of the OBS network system more accurately and that its results are closer to the actual situation.
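
    The idea of feeding an OBS edge node with a prefetched packet trace can be illustrated with a simple hybrid timer/size burst-assembly sketch; the thresholds and the single-queue simplification below are assumptions for this example, not the assembly scheme used in the paper.

```python
from collections import namedtuple

Packet = namedtuple("Packet", "arrival size")       # arrival time [s], size [bytes]

def assemble_bursts(packets, max_burst_bytes=64_000, timeout=0.001):
    """Hybrid timer/size burst assembly at an OBS edge node (illustrative sketch,
    single destination queue; timer expiry is checked at packet arrivals)."""
    bursts, queue, t_open = [], [], None
    for pkt in sorted(packets):
        # close the open burst if its assembly timer expired before this arrival
        if queue and pkt.arrival - t_open >= timeout:
            bursts.append((t_open + timeout, queue))
            queue, t_open = [], None
        if not queue:
            t_open = pkt.arrival
        queue.append(pkt)
        # close the burst early if the size threshold is reached
        if sum(p.size for p in queue) >= max_burst_bytes:
            bursts.append((pkt.arrival, queue))
            queue, t_open = [], None
    if queue:                                        # flush the last partial burst
        bursts.append((t_open + timeout, queue))
    return bursts

# e.g. feed the assembler with a prefetched packet trace
trace = [Packet(0.0000, 1500), Packet(0.0004, 1500), Packet(0.0030, 400)]
for release_time, burst in assemble_bursts(trace):
    print(release_time, len(burst), sum(p.size for p in burst))
```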

  12. Calibration and validation of a model for simulating thermal and electric performance of an internal combustion engine-based micro-cogeneration device

    International Nuclear Information System (INIS)

    Rosato, A.; Sibilio, S.

    2012-01-01

    The growing worldwide demand for more efficient and less polluting forms of energy production has led to a renewed interest in the use of micro-cogeneration technologies in the residential sector. Among these technologies, internal combustion engine-based micro-cogeneration devices are a market-ready technology gaining increasing appeal thanks to their high efficiency, fuel flexibility, low emissions, and low noise and vibration. In order to explore and assess the feasibility of using internal combustion engine-based cogeneration systems in the residential sector, an accurate and practical simulation model that can be used to conduct sensitivity and what-if analyses is needed. A residential cogeneration device model has been developed within IEA/ECBCS Annex 42 and implemented into a number of building simulation programs. This model is potentially able to accurately predict the thermal and electrical outputs of residential cogeneration devices, but it relies almost entirely on empirical data because the model specification uses experimental measurements contained within a performance map to represent the device-specific performance characteristics, coupled with thermally massive elements to characterize the device's dynamic thermal performance. At the Built Environment Control Laboratory of Seconda Università degli studi di Napoli, an AISIN SEIKI micro-cogeneration device based on a natural-gas-fuelled reciprocating internal combustion engine is available. This unit has been intensively tested in order to calibrate and validate the Annex 42 model. This paper shows in detail the series of experiments conducted for the calibration activity and examines the validity of this model by contrasting simulation predictions with measurements derived by operating the system under an electric-load-following control strategy. The statistical comparison was made both for the whole database and for the data segregated by system operation mode. The good agreement found in the predictions of

  13. HVDC System Characteristics and Simulation Models

    Energy Technology Data Exchange (ETDEWEB)

    Moon, S.I.; Han, B.M.; Jang, G.S. [Electric Enginnering and Science Research Institute, Seoul (Korea)

    2001-07-01

    This report deals with the AC-DC power system simulation method by PSS/E and EUROSTAG for the development of a strategy for the reliable operation of the Cheju-Haenam interconnected system. The simulation using both programs is performed to analyze HVDC simulation models. In addition, the control characteristics of the Cheju-Haenam HVDC system as well as Cheju AC system characteristics are described in this work. (author). 104 figs., 8 tabs.

  14. Performance modelling and simulation of an absorption solar cooling system for Malaysia

    International Nuclear Information System (INIS)

    Assilzadeh, F.; Ali, Y.; Kamaruzzaman Sopian

    2006-01-01

    Solar radiation contains huge amounts of energy and is required for almost all the natural processes on earth. Solar-powered air-conditioning has many advantages when compared to a conventional electrically driven system. This paper presents a solar cooling system that has been designed for Malaysia and other tropical regions using an evacuated-tube solar collector and a LiBr absorption system. The absorption solar cooling system is modelled and simulated in the Transient System Simulation (TRNSYS) environment. The typical meteorological year file containing the weather parameters is used to simulate the system. Then a system optimization is carried out in order to select the appropriate type of collector, the optimum size of the storage tank, the optimum collector slope and area, and the optimum thermostat setting of the auxiliary boiler.

  15. Signal and image processing systems performance evaluation, simulation, and modeling; Proceedings of the Meeting, Orlando, FL, Apr. 4, 5, 1991

    Science.gov (United States)

    Nasr, Hatem N.; Bazakos, Michael E.

    The various aspects of the evaluation and modeling problems in algorithms, sensors, and systems are addressed. Consideration is given to a generic modular imaging IR signal processor, real-time architecture based on the image-processing module family, application of the Proto Ware simulation testbed to the design and evaluation of advanced avionics, development of a fire-and-forget imaging infrared seeker missile simulation, an adaptive morphological filter for image processing, laboratory development of a nonlinear optical tracking filter, a dynamic end-to-end model testbed for IR detection algorithms, wind tunnel model aircraft attitude and motion analysis, an information-theoretic approach to optimal quantization, parametric analysis of target/decoy performance, neural networks for automated target recognition parameters adaptation, performance evaluation of a texture-based segmentation algorithm, evaluation of image tracker algorithms, and multisensor fusion methodologies. (No individual items are abstracted in this volume)

  16. Problem reporting management system performance simulation

    Science.gov (United States)

    Vannatta, David S.

    1993-01-01

    This paper proposes the Problem Reporting Management System (PRMS) model as an effective discrete simulation tool for determining the risks involved during the development phase of a Trouble Tracking Reporting Data Base replacement system. The model considers the type of equipment and networks which will be used in the replacement system as well as varying user loads, size of the database, and expected operational availability. The paper discusses the dynamics, stability, and application of the PRMS and addresses concepts suggested to enhance and enrich service performance.

  17. Simulation of the hydraulic performance of highway filter drains through laboratory models and stormwater management tools.

    Science.gov (United States)

    Sañudo-Fontaneda, Luis A; Jato-Espino, Daniel; Lashford, Craig; Coupe, Stephen J

    2017-05-23

    Road drainage is one of the most relevant assets in transport infrastructure due to its inherent influence on traffic management and road safety. Highway filter drains (HFDs), also known as "French Drains", are the main drainage system currently in use in the UK, throughout 7000 km of its strategic road network. Despite being a widespread technique across the whole country, little research has been completed on their design considerations and their subsequent impact on their hydraulic performance, representing a gap in the field. Laboratory experiments have proven to be a reliable indicator for the simulation of the hydraulic performance of stormwater best management practices (BMPs). In addition to this, stormwater management tools (SMT) have been preferentially chosen as a design tool for BMPs by practitioners from all over the world. In this context, this research aims to investigate the hydraulic performance of HFDs by comparing the results from laboratory simulation and two widely used SMT, the US EPA's stormwater management model (SWMM) and MicroDrainage®. Statistical analyses were applied to a series of simulated rainfall scenarios, showing a high level of agreement between the results obtained in the laboratory and those obtained using SMT, as indicated by the high Nash-Sutcliffe and R² coefficients and the low root-mean-square error (RMSE) values reached, which validated the usefulness of SMT to determine the hydraulic performance of HFDs.
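
    The goodness-of-fit statistics cited above can be computed as in the following sketch; the helper functions and the toy hydrograph values are illustrative assumptions, not the study's actual data or code.

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 means no better than
    predicting the mean of the observations."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return float(1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2))

def rmse(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return float(np.sqrt(np.mean((obs - sim) ** 2)))

def r_squared(obs, sim):
    """Coefficient of determination of the linear correlation between obs and sim."""
    return float(np.corrcoef(obs, sim)[0, 1] ** 2)

# hypothetical outflow hydrographs (l/s) from a laboratory rig and from an SMT run
observed  = [0.0, 1.2, 3.4, 5.1, 4.0, 2.2, 0.9]
simulated = [0.0, 1.0, 3.6, 4.8, 4.2, 2.0, 1.1]
print(nash_sutcliffe(observed, simulated), rmse(observed, simulated), r_squared(observed, simulated))
```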

  18. Landscape Modelling and Simulation Using Spatial Data

    Directory of Open Access Journals (Sweden)

    Amjed Naser Mohsin AL-Hameedawi

    2017-08-01

    Full Text Available In this paper, a procedure was developed for generating a spatial model of a landscape suited to realistic simulation. The procedure is based on combining spatial data and field measurements with computer graphics produced using the Blender software. It is then possible to build a 3D simulation based on VIS ALL packages. The objective was to create a model utilising GIS, including inputs to the feature attribute data, by assembling a suitable spatial prototype, defining a facilitation scheme and outlining the intended framework; the eventual result was used in simulation form. The procedure covers not only data gathering, fieldwork and model preparation, but also extends to a new method for producing the corresponding 3D simulation mapping, which allows decision makers as well as investors to adopt an independent navigation system for geoscience applications.

  19. Meta-stochastic simulation of biochemical models for systems and synthetic biology.

    Science.gov (United States)

    Sanassy, Daven; Widera, Paweł; Krasnogor, Natalio

    2015-01-16

    Stochastic simulation algorithms (SSAs) are used to trace realistic trajectories of biochemical systems at low species concentrations. As the complexity of modeled biosystems increases, it is important to select the best performing SSA. Numerous improvements to SSAs have been introduced but they each only tend to apply to a certain class of models. This makes it difficult for a systems or synthetic biologist to decide which algorithm to employ when confronted with a new model that requires simulation. In this paper, we demonstrate that it is possible to determine which algorithm is best suited to simulate a particular model and that this can be predicted a priori to algorithm execution. We present a Web based tool ssapredict that allows scientists to upload a biochemical model and obtain a prediction of the best performing SSA. Furthermore, ssapredict gives the user the option to download our high performance simulator ngss preconfigured to perform the simulation of the queried biochemical model with the predicted fastest algorithm as the simulation engine. The ssapredict Web application is available at http://ssapredict.ico2s.org. It is free software and its source code is distributed under the terms of the GNU Affero General Public License.
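
    For readers unfamiliar with SSAs, the sketch below shows the classic Gillespie direct method on a toy dimerisation model; it is a minimal illustration of what such algorithms do, not the ngss simulator or any of the optimised variants that ssapredict chooses between.

```python
import math
import random

def gillespie_direct(x, reactions, t_end, seed=0):
    """Gillespie direct-method SSA (minimal illustration, not the ngss engine).
    x: dict of species counts; reactions: list of (rate_constant, reactants, state_change)."""
    rng = random.Random(seed)
    t, trajectory = 0.0, [(0.0, dict(x))]
    while t < t_end:
        # mass-action propensities
        a = []
        for k, reactants, _ in reactions:
            prop = k
            for species, order in reactants.items():
                for i in range(order):
                    prop *= max(x[species] - i, 0)
            a.append(prop)
        a0 = sum(a)
        if a0 == 0.0:
            break                                   # no reaction can fire
        # sample the waiting time and which reaction fires
        t += -math.log(1.0 - rng.random()) / a0
        r, cumulative, chosen = rng.random() * a0, 0.0, len(reactions) - 1
        for j, aj in enumerate(a):
            cumulative += aj
            if r < cumulative:
                chosen = j
                break
        for species, delta in reactions[chosen][2].items():
            x[species] += delta
        trajectory.append((t, dict(x)))
    return trajectory

# toy dimerisation model: 2A -> B with stochastic rate constant 0.01
traj = gillespie_direct({"A": 100, "B": 0},
                        [(0.01, {"A": 2}, {"A": -2, "B": +1})],
                        t_end=5.0)
print(traj[-1])
```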

  20. Contribution to the Development of Simulation Model of Ship Turbine

    Directory of Open Access Journals (Sweden)

    Božić Ratko

    2015-01-01

    Full Text Available Simulation modelling, performed using the System Dynamics modelling approach and intensive use of computers, is one of the most convenient and most successful scientific methods for analysing the performance dynamics of nonlinear and very complex natural, technical and organizational systems [1]. The purpose of this work is to demonstrate the successful application of system dynamics simulation modelling in analyzing the performance dynamics of a complex ship propulsion system. The gas turbine is a complex non-linear system, which needs to be systematically investigated as a unit consisting of a number of subsystems and elements, which are linked by cause-effect (UPV) feedback loops (KPD), both within the propulsion system and with the relevant surroundings. In this paper the authors will present an efficient application of scientific methods for the study of complex dynamic systems, the qualitative and quantitative simulation System Dynamics methodology. The gas turbine will be represented by a set of non-linear differential equations, after which mental-verbal structural models and flowcharts in System Dynamics symbols will be produced, and the performance dynamics under load conditions will be simulated in the POWERSIM simulation language.

  1. Analysing the temporal dynamics of model performance for hydrological models

    NARCIS (Netherlands)

    Reusser, D.E.; Blume, T.; Schaefli, B.; Zehe, E.

    2009-01-01

    The temporal dynamics of hydrological model performance gives insights into errors that cannot be obtained from global performance measures assigning a single number to the fit of a simulated time series to an observed reference series. These errors can include errors in data, model parameters, or

  2. Optimization Model for Web Based Multimodal Interactive Simulations.

    Science.gov (United States)

    Halic, Tansel; Ahn, Woojin; De, Suvranu

    2015-07-15

    This paper presents a technique for optimizing the performance of web-based multimodal interactive simulations. For such applications, where visual quality and the performance of simulations directly influence user experience, overloading of hardware resources may result in an unsatisfactory reduction in simulation quality and user satisfaction. However, optimization of simulation performance on individual hardware platforms is not practical. Hence, we present a mixed integer programming model to optimize graphical rendering and simulation performance while satisfying application-specific constraints. Our approach includes three distinct phases: identification, optimization and update. In the identification phase, the computing and rendering capabilities of the client device are evaluated using an exploratory proxy code. This data is utilized in conjunction with user-specified design requirements in the optimization phase to ensure the best possible computational resource allocation. The optimum solution is used to set rendering parameters (e.g. texture size, canvas resolution) and simulation parameters (e.g. simulation domain) in the update phase. Test results are presented on multiple hardware platforms with diverse computing and graphics capabilities to demonstrate the effectiveness of our approach.
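
    The spirit of the optimization phase can be illustrated by a small discrete search that picks rendering and simulation settings maximising a quality score subject to an estimated per-frame cost budget; the settings, cost model and weights below are assumptions for this example and stand in for the paper's mixed integer programming formulation.

```python
from itertools import product

# Illustrative stand-in for the optimization phase (assumed settings, cost model
# and weights; not the paper's mixed integer programming formulation).
TEXTURE_SIZES = [256, 512, 1024, 2048]                 # pixels
CANVAS_RES = [(640, 480), (1280, 720), (1920, 1080)]
SIM_NODES = [500, 1000, 2000, 4000]                    # simulation domain size

def quality(tex, canvas, nodes):
    # larger settings give higher quality (weights are assumptions)
    return 0.4 * TEXTURE_SIZES.index(tex) + 0.3 * CANVAS_RES.index(canvas) + 0.3 * SIM_NODES.index(nodes)

def estimated_cost_ms(tex, canvas, nodes, gpu_factor, cpu_factor):
    # simple cost model, to be calibrated from the identification-phase proxy benchmark
    render = gpu_factor * (tex / 256) * (canvas[0] * canvas[1] / (640 * 480))
    physics = cpu_factor * (nodes / 500)
    return render + physics

def best_configuration(frame_budget_ms=16.0, gpu_factor=2.0, cpu_factor=3.0):
    feasible = [
        (quality(t, c, n), t, c, n)
        for t, c, n in product(TEXTURE_SIZES, CANVAS_RES, SIM_NODES)
        if estimated_cost_ms(t, c, n, gpu_factor, cpu_factor) <= frame_budget_ms
    ]
    return max(feasible) if feasible else None         # highest-quality feasible setting

print(best_configuration())
```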

  3. Simulation and analysis on thermodynamic performance of surface water source heat pump system

    Institute of Scientific and Technical Information of China (English)

    Nan Lv; Qing Zhang; Zhenqian Chen; Dongsheng Wu

    2017-01-01

    This work established a thermodynamic performance model of a heat pump system containing a heat pump unit model, an air conditioning cooling and heating load calculation model, a heat exchanger model and a water pump performance model based on mass and energy balances. The thermodynamic performance of a surface water source heat pump air conditioning system was simulated and verified by comparing the simulation results to an actual engineering project. In addition, the effects of the surface water temperature, heat exchanger structure and surface water pipeline transportation system on the thermodynamic performance of the heat pump air conditioning system were analyzed. Under the simulated conditions in this paper with a cooling load of 3400 kW, the results showed that a 1 ℃ decrease in the surface water temperature leads to a 2.3 percent increase in the coefficient of performance; furthermore, an additional 100 m of length for the closed-loop surface water heat exchanger tube leads to a 0.08 percent increase in the coefficient of performance. To decrease the system energy consumption, the optimal working point should be specified according to the surface water transportation length.

  4. Simulation of upward flux from shallow water-table using UPFLOW model

    Directory of Open Access Journals (Sweden)

    M. H. Ali

    2013-11-01

    Full Text Available The upward movement of water by capillary rise from a shallow water table to the root zone is an important incoming flux. For determining the exact irrigation requirement, estimation of the capillary (upward) flux is essential. A simulation model can provide a reliable estimate of upward flux under variable soil and climatic conditions. In this study, the performance of the UPFLOW model in estimating upward flux was evaluated. Model performance was evaluated with both graphical displays and statistical criteria. In the distribution of simulated capillary rise values against observed field data, most data points lie around the 1:1 line, which means that the model output is reliable and reasonable. The coefficient of determination between observed and simulated values was 0.806 (r = 0.93), which indicates a good correlation between observed and simulated values. The relative error, model efficiency, and index of agreement were found to be 27.91%, 85.93% and 0.96, respectively. Considering the graphical display of observed and simulated upward flux and the statistical indicators, it can be concluded that the overall performance of the UPFLOW model in simulating actual upward flux from a crop field under variable water-table conditions is satisfactory. Thus, the model can be used to estimate capillary rise from a shallow water table for proper estimation of the irrigation requirement, which would save valuable water by preventing over-irrigation.
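
    The index of agreement and relative error reported above can be computed as in the following sketch; the helper functions and sample flux values are illustrative assumptions, not UPFLOW code or the study's data.

```python
import numpy as np

def index_of_agreement(obs, sim):
    """Willmott's index of agreement d (0..1, 1 = perfect fit)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    denom = np.sum((np.abs(sim - obs.mean()) + np.abs(obs - obs.mean())) ** 2)
    return float(1.0 - np.sum((obs - sim) ** 2) / denom)

def mean_relative_error(obs, sim):
    """Mean absolute relative error in percent."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return float(100.0 * np.mean(np.abs(sim - obs) / np.abs(obs)))

obs = [2.1, 1.8, 2.6, 3.0]   # observed capillary flux, e.g. mm/day (made-up values)
sim = [2.4, 1.5, 2.9, 2.7]   # simulated capillary flux
print(index_of_agreement(obs, sim), mean_relative_error(obs, sim))
```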

  5. Computer Simulation Performed for Columbia Project Cooling System

    Science.gov (United States)

    Ahmad, Jasim

    2005-01-01

    This demo shows a high-fidelity simulation of the air flow in the main computer room housing the Columbia (10,024 Intel Itanium processors) system. The simulation assesses the performance of the cooling system, identifies deficiencies, and recommends modifications to eliminate them. It used two in-house software packages on NAS supercomputers: Chimera Grid Tools to generate a geometric model of the computer room, and the OVERFLOW-2 code for fluid and thermal simulation. This state-of-the-art technology can be easily extended to provide a general capability for air flow analyses of any modern computer room.

  6. A Simulation Model Articulation of the REA Ontology

    Science.gov (United States)

    Laurier, Wim; Poels, Geert

    This paper demonstrates how the REA enterprise ontology can be used to construct simulation models for business processes, value chains and collaboration spaces in supply chains. These models support various high-level and operational management simulation applications, e.g. the analysis of enterprise sustainability and day-to-day planning. First, the basic constructs of the REA ontology and the ExSpect modelling language for simulation are introduced. Second, collaboration space, value chain and business process models and their conceptual dependencies are shown, using the ExSpect language. Third, an exhibit demonstrates the use of value chain models in predicting the financial performance of an enterprise.

  7. Predictors of laparoscopic simulation performance among practicing obstetrician gynecologists.

    Science.gov (United States)

    Mathews, Shyama; Brodman, Michael; D'Angelo, Debra; Chudnoff, Scott; McGovern, Peter; Kolev, Tamara; Bensinger, Giti; Mudiraj, Santosh; Nemes, Andreea; Feldman, David; Kischak, Patricia; Ascher-Walsh, Charles

    2017-11-01

    While simulation training has been established as an effective method for improving laparoscopic surgical performance in surgical residents, few studies have focused on its use for attending surgeons, particularly in obstetrics and gynecology. Surgical simulation may have a role in improving and maintaining proficiency in the operating room for practicing obstetrician gynecologists. We sought to determine if parameters of performance for validated laparoscopic virtual simulation tasks correlate with surgical volume and characteristics of practicing obstetricians and gynecologists. All gynecologists with laparoscopic privileges (n = 347) from 5 academic medical centers in New York City were required to complete a laparoscopic surgery simulation assessment. The physicians took a presimulation survey gathering physician self-reported characteristics and then performed 3 basic skills tasks (enforced peg transfer, lifting/grasping, and cutting) on the LapSim virtual reality laparoscopic simulator (Surgical Science Ltd, Gothenburg, Sweden). The association between simulation outcome scores (time, efficiency, and errors) and self-rated clinical skills measures (self-rated laparoscopic skill score or surgical volume category) was examined with regression models. The average number of laparoscopic procedures per month was a significant predictor of total time on all 3 tasks (P = .001 for peg transfer; P = .041 for lifting and grasping; P simulation performance as it correlates to active physician practice, further studies may help assess skill and individualize training to maintain skill levels as case volumes fluctuate. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. Methodology to evaluate the performance of simulation models for alternative compiler and operating system configurations

    Science.gov (United States)

    Simulation modelers increasingly require greater flexibility for model implementation on diverse operating systems, and they demand high computational speed for efficient iterative simulations. Additionally, model users may differ in preference for proprietary versus open-source software environment...

  9. Modeling and Simulation of Power Distribution System in More Electric Aircraft

    Directory of Open Access Journals (Sweden)

    Zhangang Yang

    2015-01-01

    Full Text Available The More Electric Aircraft concept is a fast-developing trend in the modern aircraft industry. With this new concept, the performance of the aircraft can be further optimized while the operating and maintenance costs are decreased effectively. In order to optimize the power system integrity and have the ability to investigate the performance of the overall system in any possible situation, an accurate simulation model of the aircraft power system is very helpful and necessary. This paper mainly introduces a method to build a simulation model for the power distribution system, which is based on detailed component models. The power distribution system model consists of a power generation unit, a transformer rectifier unit, a DC-DC converter unit, and a DC-AC inverter unit. In order to optimize the performance of the power distribution system and improve the quality of the distributed power, a feedback control network is designed based on the characteristics of the power distribution system. The simulation results indicate that this new simulation model is well designed and works accurately. Moreover, the steady-state and transient performance of the model fulfill the requirements of an aircraft power distribution system in realistic applications.

  10. A study for production simulation model generation system based on data model at a shipyard

    Directory of Open Access Journals (Sweden)

    Myung-Gi Back

    2016-09-01

    Full Text Available Simulation technology is a type of shipbuilding product lifecycle management solution used to support production planning or decision-making. Normally, most shipbuilding processes consist of job shop production, and the modeling and simulation require professional skills and experience in shipbuilding. For these reasons, many shipbuilding companies have difficulties in adapting simulation systems, regardless of the necessity for the technology. In this paper, the data model for shipyard production simulation model generation was defined by analyzing the iterative simulation modeling procedure. The shipyard production simulation data model defined in this study contains the information necessary for the conventional simulation modeling procedure and can serve as a basis for simulation model generation. The efficacy of the developed system was validated by applying it to the simulation model generation of the panel block production line. By implementing the initial simulation model generation process, which was performed in the past with a simulation modeler, the proposed system substantially reduced the modeling time. In addition, by reducing the difficulties posed by different modeler-dependent generation methods, the proposed system makes the standardization of simulation model quality possible.

  11. Performance and Evaluation of the Global Modeling and Assimilation Office Observing System Simulation Experiment

    Science.gov (United States)

    Prive, Nikki; Errico, R. M.; Carvalho, D.

    2018-01-01

    The National Aeronautics and Space Administration Global Modeling and Assimilation Office (NASA/GMAO) has spent more than a decade developing and implementing a global Observing System Simulation Experiment (OSSE) framework for use in evaluating both new observation types and the behavior of data assimilation systems. The NASA/GMAO OSSE has constantly evolved to reflect changes in the Gridpoint Statistical Interpolation data assimilation system, the Goddard Earth Observing System model, version 5 (GEOS-5), and the real-world observational network. Software and observational datasets for the GMAO OSSE are publicly available, along with a technical report. Substantial modifications have recently been made to the NASA/GMAO OSSE framework, including the character of synthetic observation errors, new instrument types, and more sophisticated atmospheric wind vectors. These improvements will be described, along with the overall performance of the current OSSE. Lessons learned from investigations into correlated errors and model error will be discussed.

  12. Simulation-Based Internal Models for Safer Robots

    Directory of Open Access Journals (Sweden)

    Christian Blum

    2018-01-01

    Full Text Available In this paper, we explore the potential of mobile robots with simulation-based internal models for safety in highly dynamic environments. We propose a robot with a simulation of itself, other dynamic actors and its environment, inside itself. Operating in real time, this simulation-based internal model is able to look ahead and predict the consequences of both the robot’s own actions and those of the other dynamic actors in its vicinity. Hence, the robot continuously modifies its own actions in order to actively maintain its own safety while also achieving its goal. Inspired by the problem of how mobile robots could move quickly and safely through crowds of moving humans, we present experimental results which compare the performance of our internal simulation-based controller with a purely reactive approach as a proof-of-concept study for the practical use of simulation-based internal models.
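
    A minimal sketch of the idea, under the assumption of constant-velocity pedestrian prediction, is shown below: each candidate action is rolled forward in a fast internal simulation and scored for safety and progress; the function names and parameters are assumptions for this example, not the authors' controller.

```python
import math

# Illustrative simulation-based internal model (assumed names and parameters,
# not the authors' code): candidate actions are rolled forward together with
# constant-velocity pedestrian predictions and scored for safety and progress.
def rollout(robot, velocity, pedestrians, horizon=2.0, dt=0.1):
    """Forward-simulate the robot and pedestrians; return the minimum
    robot-pedestrian distance over the horizon and the robot's final position."""
    rx, ry = robot
    peds = [list(p) for p in pedestrians]          # each is [x, y, vx, vy]
    min_dist = float("inf")
    for _ in range(int(horizon / dt)):
        rx += velocity[0] * dt
        ry += velocity[1] * dt
        for p in peds:
            p[0] += p[2] * dt
            p[1] += p[3] * dt
            min_dist = min(min_dist, math.hypot(rx - p[0], ry - p[1]))
    return min_dist, (rx, ry)

def choose_action(robot, goal, pedestrians, candidates, safety_radius=0.5):
    """Pick the candidate velocity with the best predicted progress toward the
    goal, discarding any rollout that violates the safety radius."""
    best, best_score = (0.0, 0.0), -float("inf")   # default: stop
    for v in candidates:
        min_dist, (rx, ry) = rollout(robot, v, pedestrians)
        if min_dist < safety_radius:
            continue                                # predicted to be unsafe
        progress = -math.hypot(goal[0] - rx, goal[1] - ry)
        if progress > best_score:
            best, best_score = v, progress
    return best

candidates = [(vx, vy) for vx in (-1.0, 0.0, 1.0) for vy in (-1.0, 0.0, 1.0)]
print(choose_action(robot=(0.0, 0.0), goal=(5.0, 0.0),
                    pedestrians=[(3.0, 0.0, -1.0, 0.0)], candidates=candidates))
```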

  13. Wake modeling and simulation

    DEFF Research Database (Denmark)

    Larsen, Gunner Chr.; Madsen Aagaard, Helge; Larsen, Torben J.

    We present a consistent, physically based theory for the wake meandering phenomenon, which we consider of crucial importance for the overall description of wind turbine loadings in wind farms. In its present version the model is confined to single wake situations. The model philosophy does, howev...... methodology has been implemented in the aeroelastic code HAWC2, and example simulations of wake situations, from the small Tjæreborg wind farm, have been performed showing satisfactory agreement between predictions and measurements...

  14. Fracture network modeling and GoldSim simulation support

    International Nuclear Information System (INIS)

    Sugita, Kenichirou; Dershowitz, W.

    2005-01-01

    During Heisei-16, Golder Associates provided support for JNC Tokai through discrete fracture network data analysis and simulation of the Mizunami Underground Research Laboratory (MIU), participation in Task 6 of the AEspoe Task Force on Modeling of Groundwater Flow and Transport, and development of methodologies for analysis of repository site characterization strategies and safety assessment. MIU support during H-16 involved updating the H-15 FracMan discrete fracture network (DFN) models for the MIU shaft region, and developing improved simulation procedures. Updates to the conceptual model included incorporation of 'Step2' (2004) versions of the deterministic structures, and revision of background fractures to be consistent with conductive structure data from the DH-2 borehole. Golder developed improved simulation procedures for these models through the use of hybrid discrete fracture network (DFN), equivalent porous medium (EPM), and nested DFN/EPM approaches. For each of these models, procedures were documented for the entire modeling process including model implementation, MMP simulation, and shaft grouting simulation. Golder supported JNC participation in Task 6AB, 6D and 6E of the AEspoe Task Force on Modeling of Groundwater Flow and Transport during H-16. For Task 6AB, Golder developed a new technique to evaluate the role of grout in performance assessment time-scale transport. For Task 6D, Golder submitted a report of H-15 simulations to SKB. For Task 6E, Golder carried out safety assessment time-scale simulations at the block scale, using the Laplace Transform Galerkin method. During H-16, Golder supported JNC's Total System Performance Assessment (TSPA) strategy by developing technologies for the analysis of the use of site characterization data in safety assessment. This approach will aid in the understanding of the use of site characterization to progressively reduce site characterization uncertainty. (author)

  15. NASA-STD-7009 Guidance Document for Human Health and Performance Models and Simulations

    Science.gov (United States)

    Walton, Marlei; Mulugeta, Lealem; Nelson, Emily S.; Myers, Jerry G.

    2014-01-01

    Rigorous verification, validation, and credibility (VVC) processes are imperative to ensure that models and simulations (MS) are sufficiently reliable to address issues within their intended scope. The NASA standard for MS, NASA-STD-7009 (7009) [1], was an outcome of the Columbia Accident Investigation Board (CAIB), intended to ensure that MS are developed, applied, and interpreted appropriately for making decisions that may impact crew or mission safety. Because the focus of 7009 is engineering systems, a NASA-STD-7009 Guidance Document is being developed to augment the 7009 and provide information, tools, and techniques applicable to the probabilistic and deterministic biological MS more prevalent in human health and performance (HHP) and space biomedical research and operations.

  16. Multiprocessor performance modeling with ADAS

    Science.gov (United States)

    Hayes, Paul J.; Andrews, Asa M.

    1989-01-01

    A graph managing strategy referred to as the Algorithm to Architecture Mapping Model (ATAMM) appears useful for the time-optimized execution of application algorithm graphs in embedded multiprocessors and for the performance prediction of graph designs. This paper reports the modeling of ATAMM in the Architecture Design and Assessment System (ADAS) to make an independent verification of ATAMM's performance prediction capability and to provide a user framework for the evaluation of arbitrary algorithm graphs. Following an overview of ATAMM and its major functional rules are descriptions of the ADAS model of ATAMM, methods to enter an arbitrary graph into the model, and techniques to analyze the simulation results. The performance of a 7-node graph example is evaluated using the ADAS model and verifies the ATAMM concept by substantiating previously published performance results.

  17. High performance ultrasonic field simulation on complex geometries

    Science.gov (United States)

    Chouh, H.; Rougeron, G.; Chatillon, S.; Iehl, J. C.; Farrugia, J. P.; Ostromoukhov, V.

    2016-02-01

    Ultrasonic field simulation is a key ingredient for the design of new testing methods as well as a crucial step for NDT inspection simulation. As presented in a previous paper [1], CEA-LIST has worked on the acceleration of these simulations, focusing on simple geometries (planar interfaces, isotropic materials). In this context, significant accelerations were achieved on multicore processors and GPUs (Graphics Processing Units), bringing the execution time of realistic computations into the 0.1 s range. In this paper, we present recent work that aims at similar performance on a wider range of configurations. We adapted the physical model used by the CIVA platform to design and implement a new algorithm providing a fast ultrasonic field simulation that yields nearly interactive results for complex cases. The improvements over the CIVA pencil-tracing method include adaptive strategies for pencil subdivisions to achieve a good refinement of the sensor geometry while keeping a reasonable number of ray-tracing operations. Also, interpolation of the times of flight was used to avoid time-consuming computations in the impulse response reconstruction stage. To achieve the best performance, our algorithm runs on multi-core superscalar CPUs and uses high-performance specialized libraries such as Intel Embree for ray-tracing, Intel MKL for signal processing and Intel TBB for parallelization. We validated the simulation results by comparing them to the ones produced by CIVA on identical test configurations including mono-element and multiple-element transducers, homogeneous, meshed 3D CAD specimens, isotropic and anisotropic materials and wave paths that can involve several interactions with interfaces. We show performance results on complete simulations that achieve computation times in the 1 s range.

  18. High-Performance Modeling of Carbon Dioxide Sequestration by Coupling Reservoir Simulation and Molecular Dynamics

    KAUST Repository

    Bao, Kai; Yan, Mi; Allen, Rebecca; Salama, Amgad; Lu, Ligang; Jordan, Kirk E.; Sun, Shuyu; Keyes, David E.

    2015-01-01

    The present work describes a parallel computational framework for carbon dioxide (CO2) sequestration simulation by coupling reservoir simulation and molecular dynamics (MD) on massively parallel high-performance-computing (HPC) systems

  19. Performance Test of Core Protection and Monitoring Algorithm with DLL for SMART Simulator Implementation

    International Nuclear Information System (INIS)

    Koo, Bonseung; Hwang, Daehyun; Kim, Keungkoo

    2014-01-01

    A multi-purpose best-estimate simulator for SMART is being established, which is intended to be used as a tool to evaluate the impacts of design changes on the safety performance, and to improve and/or optimize the operating procedure of SMART. In keeping with these intentions, a real-time model of the digital core protection and monitoring systems was developed and the real-time performance of the models was verified for various simulation scenarios. In this paper, a performance test of the core protection and monitoring algorithms with a DLL file for the SMART simulator implementation was performed. A DLL file of the simulator version of the application code was made, and several real-time evaluation tests were conducted for steady-state, transient and other scenarios with simulated system variables. The results of all test cases showed good agreement with the reference results, and some features caused by the algorithm changes were properly reflected in the DLL results. Therefore, it was concluded that the SCOPS_SIM and SCOMS_SIM algorithms and their calculational capabilities are appropriate for the core protection and monitoring program in the SMART simulator

  20. Comprehensive Performance Evaluation for Hydrological and Nutrients Simulation Using the Hydrological Simulation Program-Fortran in a Mesoscale Monsoon Watershed, China.

    Science.gov (United States)

    Li, Zhaofu; Luo, Chuan; Jiang, Kaixia; Wan, Rongrong; Li, Hengpeng

    2017-12-19

    The Hydrological Simulation Program-Fortran (HSPF) is a hydrological and water quality computer model that was developed by the United States Environmental Protection Agency. Comprehensive performance evaluations were carried out for hydrological and nutrient simulation using the HSPF model in the Xitiaoxi watershed in China. Streamflow simulation was calibrated from 1 January 2002 to 31 December 2007 and then validated from 1 January 2008 to 31 December 2010 using daily observed data, and nutrient simulation was calibrated and validated using monthly observed data during the period from July 2009 to July 2010. The results of the model performance evaluation showed that the streamflows were well simulated over the study period. The determination coefficient (R²) was 0.87, 0.77 and 0.63, and the Nash-Sutcliffe coefficient of efficiency (Ens) was 0.82, 0.76 and 0.65 for the streamflow simulation in annual, monthly and daily time-steps, respectively. Although limited to monthly observed data, satisfactory performance was still achieved during the quantitative evaluation for nutrients. The R² was 0.73, 0.82 and 0.92, and the Ens was 0.67, 0.74 and 0.86 for nitrate, ammonium and orthophosphate simulation, respectively. Some issues that may affect the application of HSPF, such as input data quality and parameter values, were also discussed. Overall, the HSPF model can be successfully used to describe streamflow and nutrient transport in a mesoscale watershed located in the East Asian monsoon climate area. This study is expected to serve as a comprehensive and systematic documentation for understanding the HSPF model, supporting its wide application and avoiding possible misuses.
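
    The two goodness-of-fit statistics quoted above (R² and the Nash-Sutcliffe efficiency, Ens) are straightforward to reproduce when checking a calibration of this kind. The following is a generic sketch; the arrays are illustrative placeholders, not data from the study.

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency (Ens): 1 - SSE / variance of the observations."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def determination_coefficient(obs, sim):
    """R²: squared Pearson correlation between observed and simulated series."""
    r = np.corrcoef(obs, sim)[0, 1]
    return r ** 2

# Illustrative daily streamflow values (m^3/s), not data from the study.
obs = [12.1, 15.4, 30.2, 22.8, 18.3]
sim = [11.5, 16.0, 27.9, 24.1, 17.6]
print(nash_sutcliffe(obs, sim), determination_coefficient(obs, sim))
```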

  1. Simulating the Performance of Ground-Based Optical Asteroid Surveys

    Science.gov (United States)

    Christensen, Eric J.; Shelly, Frank C.; Gibbs, Alex R.; Grauer, Albert D.; Hill, Richard E.; Johnson, Jess A.; Kowalski, Richard A.; Larson, Stephen M.

    2014-11-01

    We are developing a set of asteroid survey simulation tools in order to estimate the capability of existing and planned ground-based optical surveys, and to test a variety of possible survey cadences and strategies. The survey simulator is composed of several layers, including a model population of solar system objects and an orbital integrator, a site-specific atmospheric model (including inputs for seeing, haze and seasonal cloud cover), a model telescope (with a complete optical path to estimate throughput), a model camera (including FOV, pixel scale, and focal plane fill factor) and model source extraction and moving object detection layers with tunable detection requirements. We have also developed a flexible survey cadence planning tool to automatically generate nightly survey plans. Inputs to the cadence planner include camera properties (FOV, readout time), telescope limits (horizon, declination, hour angle, lunar and zenithal avoidance), preferred and restricted survey regions in RA/Dec, ecliptic, and Galactic coordinate systems, and recent coverage by other asteroid surveys. Simulated surveys are created for a subset of current and previous NEO surveys (LINEAR, Pan-STARRS and the three Catalina Sky Survey telescopes), and compared against the actual performance of these surveys in order to validate the model’s performance. The simulator tracks objects within the FOV of any pointing that were not discovered (e.g. too few observations, too trailed, focal plane array gaps, too fast or slow), thus dividing the population into “discoverable” and “discovered” subsets, to inform possible survey design changes. Ongoing and future work includes generating a realistic “known” subset of the model NEO population, running multiple independent simulated surveys in coordinated and uncoordinated modes, and testing various cadences to find optimal strategies for detecting NEO sub-populations. These tools can also assist in quantifying the efficiency of novel

  2. Performance Evaluation of PBL Schemes of ARW Model in Simulating Thermo-Dynamical Structure of Pre-Monsoon Convective Episodes over Kharagpur Using STORM Data Sets

    Science.gov (United States)

    Madala, Srikanth; Satyanarayana, A. N. V.; Srinivas, C. V.; Tyagi, Bhishma

    2016-05-01

    In the present study, the advanced research WRF (ARW) model is employed to simulate convective thunderstorm episodes over the Kharagpur (22°30'N, 87°20'E) region of Gangetic West Bengal, India. High-resolution simulations are conducted using 1 × 1 degree NCEP final analysis meteorological fields for the initial and boundary conditions of the events. The performance of two non-local [Yonsei University (YSU), Asymmetric Convective Model version 2 (ACM2)] and two local turbulence kinetic energy closures [Mellor-Yamada-Janjic (MYJ), Bougeault-Lacarrere (BouLac)] is evaluated in simulating planetary boundary layer (PBL) parameters and the thermodynamic structure of the atmosphere. The model-simulated parameters are validated with available in situ meteorological observations obtained from a micro-meteorological tower as well as high-resolution DigiCORA radiosonde ascents during the STORM-2007 field experiment at the study location and Doppler Weather Radar (DWR) imageries. It has been found that the PBL structure simulated with the TKE closures MYJ and BouLac is in better agreement with observations than that from the non-local closures. The model simulations with these schemes also captured the reflectivity, surface pressure patterns such as wake-low, meso-high, pre-squall low and the convective updrafts and downdrafts reasonably well. Qualitative and quantitative comparisons reveal that the MYJ followed by the BouLac scheme better simulated various features of the thunderstorm events over the Kharagpur region. The better performance of MYJ followed by BouLac is evident in the lower mean bias, mean absolute error and root mean square error and the good correlation coefficient for various surface meteorological variables as well as the thermo-dynamical structure of the atmosphere relative to the other PBL schemes. The better performance of the TKE closures may be attributed to their higher mixing efficiency, larger convective energy and better simulation of humidity promoting moist convection relative to non

  3. cellGPU: Massively parallel simulations of dynamic vertex models

    Science.gov (United States)

    Sussman, Daniel M.

    2017-10-01

    Vertex models represent confluent tissue by polygonal or polyhedral tilings of space, with the individual cells interacting via force laws that depend on both the geometry of the cells and the topology of the tessellation. This dependence on the connectivity of the cellular network introduces several complications to performing molecular-dynamics-like simulations of vertex models, and in particular makes parallelizing the simulations difficult. cellGPU addresses this difficulty and lays the foundation for massively parallelized, GPU-based simulations of these models. This article discusses its implementation for a pair of two-dimensional models, and compares the typical performance that can be expected between running cellGPU entirely on the CPU versus its performance when running on a range of commercial and server-grade graphics cards. By implementing the calculation of topological changes and forces on cells in a highly parallelizable fashion, cellGPU enables researchers to simulate time- and length-scales previously inaccessible via existing single-threaded CPU implementations. Program Files doi:http://dx.doi.org/10.17632/6j2cj29t3r.1 Licensing provisions: MIT Programming language: CUDA/C++ Nature of problem: Simulations of off-lattice "vertex models" of cells, in which the interaction forces depend on both the geometry and the topology of the cellular aggregate. Solution method: Highly parallelized GPU-accelerated dynamical simulations in which the force calculations and the topological features can be handled on either the CPU or GPU. Additional comments: The code is hosted at https://gitlab.com/dmsussman/cellGPU, with documentation additionally maintained at http://dmsussman.gitlab.io/cellGPUdocumentation
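
    The force law mentioned above is typically derived from a per-cell energy that penalises deviations of cell area and perimeter from target values. The sketch below shows the standard 2D vertex-model energy for a single polygonal cell; it is a generic CPU illustration, not cellGPU code, and the parameter values are arbitrary.

```python
import numpy as np

def cell_energy(vertices, K_A=1.0, A0=1.0, K_P=1.0, P0=3.8):
    """Standard 2D vertex-model energy of one cell:
    E = K_A*(A - A0)^2 + K_P*(P - P0)^2, with the area A from the shoelace formula."""
    v = np.asarray(vertices, float)
    x, y = v[:, 0], v[:, 1]
    area = 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))
    perimeter = np.sum(np.linalg.norm(v - np.roll(v, -1, axis=0), axis=1))
    return K_A * (area - A0) ** 2 + K_P * (perimeter - P0) ** 2

# A unit hexagon as an example cell (counter-clockwise vertices).
hexagon = [(np.cos(t), np.sin(t)) for t in np.linspace(0, 2 * np.pi, 7)[:-1]]
print(cell_energy(hexagon))
```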

  4. Simulation Modelling in Healthcare: An Umbrella Review of Systematic Literature Reviews.

    Science.gov (United States)

    Salleh, Syed; Thokala, Praveen; Brennan, Alan; Hughes, Ruby; Booth, Andrew

    2017-09-01

    Numerous studies examine simulation modelling in healthcare. These studies present a bewildering array of simulation techniques and applications, making it challenging to characterise the literature. The aim of this paper is to provide an overview of the level of activity of simulation modelling in healthcare and the key themes. We performed an umbrella review of systematic literature reviews of simulation modelling in healthcare. Searches were conducted of academic databases (JSTOR, Scopus, PubMed, IEEE, SAGE, ACM, Wiley Online Library, ScienceDirect) and grey literature sources, enhanced by citation searches. The articles were included if they performed a systematic review of simulation modelling techniques in healthcare. After quality assessment of all included articles, data were extracted on numbers of studies included in each review, types of applications, techniques used for simulation modelling, data sources and simulation software. The search strategy yielded a total of 117 potential articles. Following sifting, 37 heterogeneous reviews were included. Most reviews achieved moderate quality rating on a modified AMSTAR (A Measurement Tool used to Assess systematic Reviews) checklist. All the review articles described the types of applications used for simulation modelling; 15 reviews described techniques used for simulation modelling; three reviews described data sources used for simulation modelling; and six reviews described software used for simulation modelling. The remaining reviews either did not report or did not provide enough detail for the data to be extracted. Simulation modelling techniques have been used for a wide range of applications in healthcare, with a variety of software tools and data sources. The number of reviews published in recent years suggest an increased interest in simulation modelling in healthcare.

  5. Optical modeling and simulation of thin-film photovoltaic devices

    CERN Document Server

    Krc, Janez

    2013-01-01

    In wafer-based and thin-film photovoltaic (PV) devices, the management of light is a crucial aspect of optimization since trapping sunlight in active parts of PV devices is essential for efficient energy conversions. Optical modeling and simulation enable efficient analysis and optimization of the optical situation in optoelectronic and PV devices. Optical Modeling and Simulation of Thin-Film Photovoltaic Devices provides readers with a thorough guide to performing optical modeling and simulations of thin-film solar cells and PV modules. It offers insight on examples of existing optical models

  6. Dynamic Performance Comparison for MPPT-PV Systems using Hybrid Pspice/Matlab Simulation

    Science.gov (United States)

    Aouchiche, N.; Becherif, M.; HadjArab, A.; Aitcheikh, M. S.; Ramadan, H. S.; Cheknane, A.

    2016-10-01

    The power generated by a solar photovoltaic (PV) module depends on the surrounding irradiance and temperature. This paper presents a hybrid Matlab™/Pspice™ simulation model of a PV system, combined with the Cadence software SLPS. The hybridization is performed in order to gain the advantages of both simulation tools, namely the accuracy of the Pspice™ electronic circuit simulation and the efficiency of the Matlab™ mathematical modelling, respectively. For this purpose, the PV panel and the boost converter are developed using Pspice™ and hybridized with the mathematical Matlab™ model of the maximum power point tracking (MPPT) controller through SLPS. The main objective is to verify the significance of using the proposed hybrid simulation techniques in comparing the different MPPT algorithms, such as perturbation and observation (P&O), incremental conductance (Inc-Cond) and counter reaction voltage using a pilot cell (Pilot-Cell). Various simulations are performed under different atmospheric conditions in order to evaluate the dynamic behaviour of the system under study in terms of stability, efficiency and rapidity.
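
    For context on the algorithms compared above, the perturb-and-observe (P&O) logic reduces to perturbing the operating voltage and continuing in whichever direction increases the measured power. The sketch below is a generic illustration in Python; the study itself implements the controller in Matlab™ coupled to a Pspice™ circuit, and the step size and variable names here are assumptions.

```python
def perturb_and_observe(v_ref, v_meas, p_meas, state, step=0.5):
    """One P&O iteration: returns the new voltage reference and the updated state.
    `state` carries the previous (voltage, power) sample between calls."""
    v_prev, p_prev = state
    dP, dV = p_meas - p_prev, v_meas - v_prev
    if dP == 0:
        direction = 0           # at (or oscillating around) the MPP: hold
    elif dP * dV > 0:
        direction = +1          # power rose in this direction: keep going
    else:
        direction = -1          # moved past the MPP: reverse
    return v_ref + direction * step, (v_meas, p_meas)

# Usage: called once per control period with fresh PV voltage/power samples.
state = (0.0, 0.0)
v_ref = 17.0
v_ref, state = perturb_and_observe(v_ref, 17.0, 55.0, state)
```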

  7. Improving streamflow simulations and forecasting performance of SWAT model by assimilating remotely sensed soil moisture observations

    Science.gov (United States)

    Patil, Amol; Ramsankaran, RAAJ

    2017-12-01

    This article presents a study carried out using EnKF-based assimilation of coarser-scale SMOS soil moisture retrievals to improve the streamflow simulations and forecasting performance of the SWAT model in a large catchment. The study has been carried out in the Munneru river catchment, India, which is about 10,156 km2. In this study, a new EnKF-based approach is proposed for improving the inherent vertical coupling of the soil layers of the SWAT hydrological model during soil moisture data assimilation. Evaluation of the vertical error correlation obtained between surface and subsurface layers indicates that the vertical coupling can be improved significantly using an ensemble of soil storages compared to the traditional EnKF approach based on static soil storages. However, the improvements in the simulated streamflow are moderate, which is due to the limitations of the SWAT model in reflecting the profile soil moisture updates in surface runoff computations. Further, it is observed that the durability of streamflow improvements is longer when the assimilation system effectively updates the subsurface flow component. Overall, the results of the present study indicate that passive microwave-based coarser-scale soil moisture products like SMOS hold significant potential to improve streamflow estimates when assimilated into large-scale distributed hydrological models operating at a daily time step.
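
    At the heart of an assimilation scheme of this kind is the stochastic EnKF analysis step, which corrects each ensemble member towards the (perturbed) observation in proportion to the sampled covariance between observed and unobserved states. The following is a generic sketch of that update, not the authors' SWAT-specific implementation; dimensions and values are illustrative.

```python
import numpy as np

def enkf_update(X, y_obs, H, obs_var, rng=np.random.default_rng(0)):
    """Stochastic EnKF analysis.
    X: (n_state, n_ens) forecast ensemble; H: (n_obs, n_state) observation operator;
    y_obs: (n_obs,) observation vector; obs_var: observation error variance."""
    n_state, n_ens = X.shape
    A = X - X.mean(axis=1, keepdims=True)        # state anomalies
    HX = H @ X
    HA = HX - HX.mean(axis=1, keepdims=True)     # observed-state anomalies
    P_xy = A @ HA.T / (n_ens - 1)                # cross covariance
    P_yy = HA @ HA.T / (n_ens - 1) + obs_var * np.eye(H.shape[0])
    K = P_xy @ np.linalg.inv(P_yy)               # Kalman gain
    Y = y_obs[:, None] + rng.normal(0, np.sqrt(obs_var), (H.shape[0], n_ens))
    return X + K @ (Y - HX)                      # analysis ensemble

# Example: 3 soil layers, only the surface layer (index 0) is observed.
X = np.random.default_rng(1).normal(0.25, 0.05, (3, 20))   # soil moisture ensemble
H = np.array([[1.0, 0.0, 0.0]])
X_a = enkf_update(X, np.array([0.30]), H, obs_var=0.02**2)
```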

  8. Nonlinear distortion in wireless systems modeling and simulation with Matlab

    CERN Document Server

    Gharaibeh, Khaled M

    2011-01-01

    This book covers the principles of modeling and simulation of nonlinear distortion in wireless communication systems with MATLAB simulations and techniques In this book, the author describes the principles of modeling and simulation of nonlinear distortion in single and multichannel wireless communication systems using both deterministic and stochastic signals. Models and simulation methods of nonlinear amplifiers explain in detail how to analyze and evaluate the performance of data communication links under nonlinear amplification. The book addresses the analysis of nonlinear systems

  9. Control-Oriented Models for Real-Time Simulation of Automotive Transmission Systems

    Directory of Open Access Journals (Sweden)

    Cavina N.

    2015-01-01

    Full Text Available A control-oriented model of a Dual Clutch Transmission (DCT) was developed for real-time Hardware In the Loop (HIL) applications, to support model-based development of the DCT controller and to systematically test its performance. The model is an innovative attempt to reproduce the fast dynamics of the actuation system while maintaining a simulation step size large enough for real-time applications. The model comprises a detailed physical description of the hydraulic circuit, clutches, synchronizers and gears, and simplified vehicle and internal combustion engine sub-models. As the oil circulating in the system has a large bulk modulus, the pressure dynamics are very fast, possibly causing instability in a real-time simulation; the same challenge applies to the servo valve dynamics, due to the very small masses of the moving elements. Therefore, the hydraulic circuit model has been modified and simplified without losing physical validity, in order to adapt it to the real-time simulation requirements. The results of offline simulations have been compared to on-board measurements to verify the validity of the developed model, which was then implemented in a HIL system and connected to the Transmission Control Unit (TCU). Several tests have been performed on the HIL simulator, to verify the TCU performance: electrical failure tests on sensors and actuators, hydraulic and mechanical failure tests on hydraulic valves, clutches and synchronizers, and application tests covering all the main features of the control actions performed by the TCU. Being based on physical laws, in every condition the model simulates a plausible reaction of the system. A test automation procedure has finally been developed to permit the execution of a pattern of tests without the interaction of the user; perfectly repeatable tests can be performed for non-regression verification, allowing the testing of new software releases in fully automatic mode.
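
    The stiffness issue described above follows from the usual lumped hydraulic capacitance relation, dp/dt = (beta/V)(Q_in - Q_out): with a large bulk modulus beta and small oil volumes V, the pressure time constants fall far below a typical real-time step. A toy illustration of why a plain explicit update diverges at real-time step sizes; the parameter values are arbitrary and are not taken from the DCT model.

```python
# dp/dt = (beta / V) * (Q_in - Q_out), integrated with explicit Euler.
beta = 1.5e9        # oil bulk modulus [Pa]
V = 5.0e-6          # chamber volume [m^3]
C_leak = 1.0e-9     # linearised leakage/flow-pressure coefficient [m^3/(s*Pa)]
Q_in = 1.0e-4       # supply flow [m^3/s]

def simulate(dt, steps=50):
    p = 0.0
    for _ in range(steps):
        p += dt * (beta / V) * (Q_in - C_leak * p)
    return p

# Steady state is Q_in / C_leak = 1e5 Pa; the ODE time constant is
# V / (beta * C_leak) ~ 3 microseconds, so a 1 ms real-time step diverges.
print(simulate(dt=1e-6), simulate(dt=1e-3))
```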

  10. The cognitive environment simulation as a tool for modeling human performance and reliability

    International Nuclear Information System (INIS)

    Woods, D.D.; Pople, H. Jr.

    1989-01-01

    Various studies have shown that intention errors, or cognitive errors, are a major contributor to the risk of disaster. Intention formation refers to the cognitive processes by which an agent decides on what actions are appropriate to carry out (information gathering, situation assessment, diagnosis, response selection). Understanding, measuring, predicting and correcting cognitive errors depends on the answers to the question - what are difficult problems? The answer to this question defines which situations are risky, in terms of which incidents the human-technical system will manage safely and which it will manage poorly, evolving towards negative outcomes. The authors have made progress in the development of such measuring devices through an NRC-sponsored research program on cognitive modeling of operator performance. The approach is based on the demand-resource match view of human error. In this approach the difficulty of a problem depends on both the nature of the problem itself and on the resources (e.g., knowledge, plans) available to solve the problem. One can test the difficulty posed by a domain incident, given some set of resources, by running the incident through a cognitive simulation that carries out the cognitive activities of a limited-resource problem solver in a dynamic, uncertain, risky and highly doctrinal (pre-planned routines and procedures) world. The cognitive simulation that they have developed to do this in NPP accidents is called the Cognitive Environment Simulation (CES). They will illustrate the power of this approach by comparing the behavior of operators in variants on a simulated accident to the behavior of CES in the same accidents

  11. Evapotranspiration simulated by CRITERIA and AquaCrop models in stony soils

    Directory of Open Access Journals (Sweden)

    Pasquale Campi

    2015-06-01

    Full Text Available The performance of a water balance model is also based on its ability to correctly perform simulations in heterogeneous soils. The objective of this paper is to test the CRITERIA and AquaCrop models in order to evaluate their suitability for estimating evapotranspiration at the field scale in two types of soil in the Mediterranean region: non-stony and stony soil. The first step of the work was to calibrate both models under the non-stony conditions. The models were calibrated by using observations on the wheat crop (leaf area index or canopy cover, and phenological stages as a function of degree days) and pedo-climatic measurements. The second step consisted in analysing the impact of the soil type on the models' performances by comparing simulated and measured values. The outputs retained in the analysis were soil water content (at the daily scale) and crop evapotranspiration (at two time scales: daily and crop season). The model performances were evaluated through four statistical tests: normalised difference (D%) at the seasonal time scale; and relative root mean square error (RRMSE), efficiency index (EF) and coefficient of determination (r²) at the daily scale. At the seasonal scale, values of D% were less than 15% in stony and non-stony soils, indicating a good performance attained by both models. At the daily scale, the RRMSE values (<30%) indicate that the evapotranspiration simulated by CRITERIA is acceptable in both soil types. In the stony soil conditions, 3 out of 4 statistical tests (RRMSE, EF, r²) indicate the inadequacy of AquaCrop to correctly simulate daily evapotranspiration. The higher performance of the CRITERIA model in simulating daily evapotranspiration in stony soils is due to its soil submodel, which requires the skeleton percentage as an input, while the AquaCrop model takes the presence of skeleton into account by reducing the soil volume.

  12. SIMULATING AN EVOLUTIONARY MULTI-AGENT BASED MODEL OF THE STOCK MARKET

    Directory of Open Access Journals (Sweden)

    Diana MARICA

    2015-08-01

    Full Text Available The paper focuses on artificial stock market simulations using a multi-agent model incorporating 2,000 heterogeneous agents interacting on the artificial market. The agents interact through trading activity on the market via a call auction trading mechanism. The multi-agent model uses evolutionary techniques such as genetic programming in order to generate an adaptive and evolving population of agents. Each artificial agent is endowed with wealth and a genetic-programming-induced trading strategy. The trading strategy evolves and adapts to the new market conditions through a process called breeding, which implies that at each simulation step new agents with better trading strategies are generated by the model by recombining the best-performing trading strategies and replacing the agents with the worst-performing trading strategies. The simulation model was built with the help of the simulation software Altreva Adaptive Modeler, which offers a suitable platform for financial market simulations of evolutionary agent-based models, the S&P500 composite index being used as a benchmark for the simulation results.

  13. Modelling, simulation and validation of the industrial robot

    Directory of Open Access Journals (Sweden)

    Aleksandrov Slobodan Č.

    2014-01-01

    Full Text Available In this paper, a DH model of an industrial robot with an anthropomorphic configuration and five degrees of freedom - the Mitsubishi RV2AJ - is developed. The model is verified on the example robot Mitsubishi RV2AJ. The paper presents in detail the complete mathematical model of the robot and the programming parameters. On the basis of this model, simulation of robot motion from point to point is performed, as well as continuous movement along a pre-defined path. The industrial robot is also programmed identically to the simulation programs, and a comparative analysis of the real and simulated experiments is shown. In the final section, a detailed analysis of the robot motion is given.
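
    For reference, the DH formalism used above composes one homogeneous transform per link from the four parameters (theta, d, a, alpha) and chains them to obtain the tool pose. The sketch below is a generic forward-kinematics illustration; the DH table is a placeholder, not the actual RV2AJ parameter set.

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Standard Denavit-Hartenberg homogeneous transform for one link."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([[ct, -st * ca,  st * sa, a * ct],
                     [st,  ct * ca, -ct * sa, a * st],
                     [ 0,       sa,       ca,      d],
                     [ 0,        0,        0,      1]])

def forward_kinematics(joint_angles, dh_table):
    """Chain the link transforms; returns the 4x4 base-to-tool pose."""
    T = np.eye(4)
    for q, (d, a, alpha) in zip(joint_angles, dh_table):
        T = T @ dh_transform(q, d, a, alpha)
    return T

# Placeholder DH table (d, a, alpha) for a 5-DOF anthropomorphic arm.
dh_table = [(0.30, 0.0, np.pi / 2), (0.0, 0.25, 0.0), (0.0, 0.16, 0.0),
            (0.0, 0.0, np.pi / 2), (0.072, 0.0, 0.0)]
print(forward_kinematics([0.0, np.pi / 4, -np.pi / 4, 0.0, 0.0], dh_table))
```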

  14. Optical ensemble analysis of intraocular lens performance through a simulated clinical trial with ZEMAX.

    Science.gov (United States)

    Zhao, Huawei

    2009-01-01

    A ZEMAX model was constructed to simulate a clinical trial of intraocular lenses (IOLs) based on a clinically oriented Monte Carlo ensemble analysis using postoperative ocular parameters. The purpose of this model is to test the feasibility of streamlining and optimizing both the design process and the clinical testing of IOLs. This optical ensemble analysis (OEA) is also validated. Simulated pseudophakic eyes were generated by using the tolerancing and programming features of ZEMAX optical design software. OEA methodology was verified by demonstrating that the results of clinical performance simulations were consistent with previously published clinical performance data using the same types of IOLs. From these results we conclude that the OEA method can objectively simulate the potential clinical trial performance of IOLs.

  15. Vermont Yankee simulator BOP model upgrade

    International Nuclear Information System (INIS)

    Alejandro, R.; Udbinac, M.J.

    2006-01-01

    The Vermont Yankee simulator has undergone significant changes in the 20 years since the original order was placed. After the move from the original Unix to MS Windows environment, and upgrade to the latest version of SimPort, now called MASTER, the platform was set for an overhaul and replacement of major plant system models. Over a period of a few months, the VY simulator team, in partnership with WSC engineers, replaced outdated legacy models of the main steam, condenser, condensate, circulating water, feedwater and feedwater heaters, and main turbine and auxiliaries. The timing was ideal, as the plant was undergoing a power up-rate, so the opportunity was taken to replace the legacy models with industry-leading, true on-line object oriented graphical models. Due to the efficiency of design and ease of use of the MASTER tools, VY staff performed the majority of the modeling work themselves with great success, with only occasional assistance from WSC, in a relatively short time-period, despite having to maintain all of their 'regular' simulator maintenance responsibilities. This paper will provide a more detailed view of the VY simulator, including how it is used and how it has benefited from the enhancements and upgrades implemented during the project. (author)

  16. Fully Coupled Simulation of Lithium Ion Battery Cell Performance

    Energy Technology Data Exchange (ETDEWEB)

    Trembacki, Bradley L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Murthy, Jayathi Y. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Roberts, Scott Alan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    Lithium-ion battery particle-scale (non-porous electrode) simulations applied to resolved electrode geometries predict localized phenomena and can lead to better informed decisions on electrode design and manufacturing. This work develops and implements a fully-coupled finite volume methodology for the simulation of the electrochemical equations in a lithium-ion battery cell. The model implementation is used to investigate 3D battery electrode architectures that offer potential energy density and power density improvements over traditional layer-by-layer particle bed battery geometries. Advancement of micro-scale additive manufacturing techniques has made it possible to fabricate these 3D electrode microarchitectures. A variety of 3D battery electrode geometries are simulated and compared across various battery discharge rates and length scales in order to quantify performance trends and investigate geometrical factors that improve battery performance. The energy density and power density of the 3D battery microstructures are compared in several ways, including a uniform surface area to volume ratio comparison as well as a comparison requiring a minimum manufacturable feature size. Significant performance improvements over traditional particle bed electrode designs are observed, and electrode microarchitectures derived from minimal surfaces are shown to be superior. A reduced-order volume-averaged porous electrode theory formulation for these unique 3D batteries is also developed, allowing simulations on the full-battery scale. Electrode concentration gradients are modeled using the diffusion length method, and results for plate and cylinder electrode geometries are compared to particle-scale simulation results. Additionally, effective diffusion lengths that minimize error with respect to particle-scale results for gyroid and Schwarz P electrode microstructures are determined.

  17. Modelling and evaluation of surgical performance using hidden Markov models.

    Science.gov (United States)

    Megali, Giuseppe; Sinigaglia, Stefano; Tonet, Oliver; Dario, Paolo

    2006-10-01

    Minimally invasive surgery has become very widespread in the last ten years. Since surgeons experience difficulties in learning and mastering minimally invasive techniques, the development of training methods is of great importance. While the introduction of virtual reality-based simulators has introduced a new paradigm in surgical training, skill evaluation methods are far from being objective. This paper proposes a method for defining a model of surgical expertise and an objective metric to evaluate performance in laparoscopic surgery. Our approach is based on the processing of kinematic data describing movements of surgical instruments. We use hidden Markov model theory to define an expert model that describes expert surgical gesture. The model is trained on kinematic data related to exercises performed on a surgical simulator by experienced surgeons. Subsequently, we use this expert model as a reference model in the definition of an objective metric to evaluate performance of surgeons with different abilities. Preliminary results show that, using different topologies for the expert model, the method can be efficiently used both for the discrimination between experienced and novice surgeons, and for the quantitative assessment of surgical ability.
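
    In practice, the metric described above amounts to scoring a new kinematic sequence against the trained expert HMM, i.e. computing its log-likelihood with the forward algorithm and comparing it with the scores obtained by expert trials. Below is a discrete-observation sketch of that scoring step; the paper works with continuous kinematic features, and all matrices here are toy values.

```python
import numpy as np

def forward_log_likelihood(obs_seq, pi, A, B):
    """Log P(observations | HMM) via the scaled forward algorithm.
    pi: (N,) initial probs; A: (N, N) transitions; B: (N, M) emission probs."""
    alpha = pi * B[:, obs_seq[0]]
    log_lik = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs_seq[1:]:
        alpha = (alpha @ A) * B[:, o]
        log_lik += np.log(alpha.sum())
        alpha /= alpha.sum()
    return log_lik

# Toy 2-state "expert gesture" model over 3 discretised motion symbols.
pi = np.array([0.7, 0.3])
A = np.array([[0.85, 0.15], [0.10, 0.90]])
B = np.array([[0.6, 0.3, 0.1], [0.1, 0.3, 0.6]])
trial = [0, 0, 1, 2, 2, 2, 1]
print(forward_log_likelihood(trial, pi, A, B))   # higher = closer to the expert model
```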

  18. Dynamic modelling and simulation for control of a cylindrical robotic manipulator

    International Nuclear Information System (INIS)

    Iqbal, A.; Athar, S.M.

    1995-03-01

    In this report a dynamic model for the three-degrees-of-freedom cylindrical manipulator INFOMATE has been developed. Although the robot dynamics are highly coupled and non-linear, the developed model is relatively straightforward and compact for control engineering and simulation applications. The model has been simulated using the graphical simulation package SIMULINK. Different aspects of INFOMATE associated with forward dynamics, inverse dynamics and control have been investigated by performing various simulation experiments. These simulation experiments confirm the accuracy and applicability of the dynamic robot model. (author) 18 figs

  19. Comprehensive Performance Evaluation for Hydrological and Nutrients Simulation Using the Hydrological Simulation Program–Fortran in a Mesoscale Monsoon Watershed, China

    OpenAIRE

    Zhaofu Li; Chuan Luo; Kaixia Jiang; Rongrong Wan; Hengpeng Li

    2017-01-01

    The Hydrological Simulation Program–Fortran (HSPF) is a hydrological and water quality computer model that was developed by the United States Environmental Protection Agency. Comprehensive performance evaluations were carried out for hydrological and nutrient simulation using the HSPF model in the Xitiaoxi watershed in China. Streamflow simulation was calibrated from 1 January 2002 to 31 December 2007 and then validated from 1 January 2008 to 31 December 2010 using daily observed data, and nu...

  20. Modelling, simulating and optimizing boiler heating surfaces and evaporator circuits

    DEFF Research Database (Denmark)

    Sørensen, K.; Condra, T.; Houbak, Niels

    2003-01-01

    A model for optimizing the dynamic performance of a boiler has been developed. Design variables related to the size of the boiler and its dynamic performance have been defined. The objective function to be optimized takes the weight of the boiler and its dynamic capability into account. As constraints... for the optimization a dynamic model of the boiler is applied. Furthermore, a function for the value of the dynamic performance is included in the model. The dynamic models for simulating boiler performance consist of a model for the flue gas side, a model for the evaporator circuit and a model for the drum.... The dynamic model has been developed for the purpose of determining boiler material temperatures and heat transfer from the flue gas side to the water-/steam side in order to simulate the circulation in the evaporator circuit and hereby the water level fluctuations in the drum. The dynamic model has been...

  1. Modeling and Simulation of a 12 MW Wind Farm

    Directory of Open Access Journals (Sweden)

    GROZA, V.

    2010-05-01

    Full Text Available The installation of wind turbines in power systems has developed rapidly through the last 20 years. In this paper a complete simulation model of a 6 x 2 MW wind farm is presented using data from a wind farm installed in Denmark. A model of the wind turbine with a cage-rotor induction generator is presented in detail. A set of simulations is performed, showing that it is possible to simulate a complete wind farm from the wind to the grid. The simulation tool can also be used to simulate bigger wind farms connected to the grid.

  2. Nuclear Energy Advanced Modeling and Simulation (NEAMS) Waste Integrated Performance and Safety Codes (IPSC) : FY10 development and integration.

    Energy Technology Data Exchange (ETDEWEB)

    Criscenti, Louise Jacqueline; Sassani, David Carl; Arguello, Jose Guadalupe, Jr.; Dewers, Thomas A.; Bouchard, Julie F.; Edwards, Harold Carter; Freeze, Geoffrey A.; Wang, Yifeng; Schultz, Peter Andrew

    2011-02-01

    This report describes the progress in fiscal year 2010 in developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with robust verification, validation, and software quality requirements. Waste IPSC activities in fiscal year 2010 focused on specifying a challenge problem to demonstrate proof of concept, developing a verification and validation plan, and performing an initial gap analyses to identify candidate codes and tools to support the development and integration of the Waste IPSC. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. This year-end progress report documents the FY10 status of acquisition, development, and integration of thermal-hydrologic-chemical-mechanical (THCM) code capabilities, frameworks, and enabling tools and infrastructure.

  3. Analyzing Strategic Business Rules through Simulation Modeling

    Science.gov (United States)

    Orta, Elena; Ruiz, Mercedes; Toro, Miguel

    Service Oriented Architecture (SOA) holds promise for business agility since it allows business process to change to meet new customer demands or market needs without causing a cascade effect of changes in the underlying IT systems. Business rules are the instrument chosen to help business and IT to collaborate. In this paper, we propose the utilization of simulation models to model and simulate strategic business rules that are then disaggregated at different levels of an SOA architecture. Our proposal is aimed to help find a good configuration for strategic business objectives and IT parameters. The paper includes a case study where a simulation model is built to help business decision-making in a context where finding a good configuration for different business parameters and performance is too complex to analyze by trial and error.

  4. Conceptual Design of Simulation Models in an Early Development Phase of Lunar Spacecraft Simulator Using SMP2 Standard

    Science.gov (United States)

    Lee, Hoon Hee; Koo, Cheol Hea; Moon, Sung Tae; Han, Sang Hyuck; Ju, Gwang Hyeok

    2013-08-01

    The conceptual study for a Korean lunar orbiter/lander prototype has been performed at the Korea Aerospace Research Institute (KARI). Across diverse space programs around European countries, a variety of simulation applications have been developed using the SMP2 (Simulation Modelling Platform) standard, which addresses portability and reuse of simulation models by various model users. KARI has not only first-hand experience of developing an SMP-compatible simulation environment but also an ongoing study to apply the SMP2 simulation model development process to a simulator development project for lunar missions. KARI has tried to extend the coverage of the development domain based on the SMP2 standard across the whole simulation model life-cycle from software design to its validation through a lunar exploration project. Figure 1 shows a snapshot from a visualization tool for the simulation of lunar lander motion. In reality, a demonstrator prototype on the right-hand side of the image was made and tested in 2012. In an early phase of simulator development prior to a kick-off start in the near future, targeted hardware to be modelled has been investigated and identified at the end of 2012. The architectural breakdown of the lunar simulator at system level was performed and the architecture with a hierarchical tree of models from the system to parts at lower level has been established. Finally, SMP Documents such as Catalogue, Assembly, Schedule and so on were converted using an XML (eXtensible Mark-up Language) converter. To obtain the benefits of the suggested approaches and design mechanisms in the SMP2 standard as far as possible, the object-oriented and component-based design concepts were strictly chosen throughout the whole model development process.

  5. Effects of incentives on psychosocial performances in simulated space-dwelling groups

    Science.gov (United States)

    Hienz, Robert D.; Brady, Joseph V.; Hursh, Steven R.; Gasior, Eric D.; Spence, Kevin R.; Emurian, Henry H.

    Prior research with individually isolated 3-person crews in a distributed, interactive, planetary exploration simulation examined the effects of communication constraints and crew configuration changes on crew performance and psychosocial self-report measures. The present report extends these findings to a model of performance maintenance that operationalizes conditions under which disruptive affective responses by crew participants might be anticipated to emerge. Experiments evaluated the effects of changes in incentive conditions on crew performance and self-report measures in simulated space-dwelling groups. Crews participated in a simulated planetary exploration mission that required identification, collection, and analysis of geologic samples. Results showed that crew performance effectiveness was unaffected by either positive or negative incentive conditions, while self-report measures were differentially affected—negative incentive conditions produced pronounced increases in negative self-report ratings and decreases in positive self-report ratings, while positive incentive conditions produced increased positive self-report ratings only. Thus, incentive conditions associated with simulated spaceflight missions can significantly affect psychosocial adaptation without compromising task performance effectiveness in trained and experienced crews.

  6. Modeling and simulation goals and accomplishments

    International Nuclear Information System (INIS)

    Turinsky, P.

    2013-01-01

    The CASL (Consortium for Advanced Simulation of Light Water Reactors) mission is to develop and apply the Virtual Reactor simulator (VERA) to optimise nuclear power in terms of capital and operating costs, of nuclear waste production and of nuclear safety. An efficient and reliable virtual reactor simulator relies on 3-dimensional calculations, accurate physics models and code coupling. Advances in computer hardware, along with comparable advances in numerical solvers make the VERA project achievable. This series of slides details the VERA project and presents the specificities and performance of the codes involved in the project and ends by listing the computing needs

  7. Performance of uncertainty quantification methodologies and linear solvers in cardiovascular simulations

    Science.gov (United States)

    Seo, Jongmin; Schiavazzi, Daniele; Marsden, Alison

    2017-11-01

    Cardiovascular simulations are increasingly used in clinical decision making, surgical planning, and disease diagnostics. Patient-specific modeling and simulation typically proceeds through a pipeline from anatomic model construction using medical image data to blood flow simulation and analysis. To provide confidence intervals on simulation predictions, we use an uncertainty quantification (UQ) framework to analyze the effects of numerous uncertainties that stem from clinical data acquisition, modeling, material properties, and boundary condition selection. However, UQ poses a computational challenge requiring multiple evaluations of the Navier-Stokes equations in complex 3-D models. To achieve efficiency in UQ problems with many function evaluations, we implement and compare a range of iterative linear solver and preconditioning techniques in our flow solver. We then discuss applications to patient-specific cardiovascular simulation and how the problem/boundary condition formulation in the solver affects the selection of the most efficient linear solver. Finally, we discuss performance improvements in the context of uncertainty propagation. Support from National Institute of Health (R01 EB018302) is greatly appreciated.

  8. Using Discrete Event Simulation for Programming Model Exploration at Extreme-Scale: Macroscale Components for the Structural Simulation Toolkit (SST).

    Energy Technology Data Exchange (ETDEWEB)

    Wilke, Jeremiah J [Sandia National Laboratories (SNL-CA), Livermore, CA (United States); Kenny, Joseph P. [Sandia National Laboratories (SNL-CA), Livermore, CA (United States)

    2015-02-01

    Discrete event simulation provides a powerful mechanism for designing and testing new extreme- scale programming models for high-performance computing. Rather than debug, run, and wait for results on an actual system, design can first iterate through a simulator. This is particularly useful when test beds cannot be used, i.e. to explore hardware or scales that do not yet exist or are inaccessible. Here we detail the macroscale components of the structural simulation toolkit (SST). Instead of depending on trace replay or state machines, the simulator is architected to execute real code on real software stacks. Our particular user-space threading framework allows massive scales to be simulated even on small clusters. The link between the discrete event core and the threading framework allows interesting performance metrics like call graphs to be collected from a simulated run. Performance analysis via simulation can thus become an important phase in extreme-scale programming model and runtime system design via the SST macroscale components.
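
    The discrete event core that such macroscale components rest on can be illustrated in a few lines: a priority queue of timestamped events, a clock that jumps from one event to the next, and handlers that may schedule further events. This is a generic sketch of the technique, not SST code.

```python
import heapq

class Simulator:
    """Minimal discrete event engine: events are (time, seq, handler) tuples."""
    def __init__(self):
        self.now, self._seq, self._queue = 0.0, 0, []

    def schedule(self, delay, handler):
        self._seq += 1                       # tie-breaker for equal timestamps
        heapq.heappush(self._queue, (self.now + delay, self._seq, handler))

    def run(self, until=float("inf")):
        while self._queue and self._queue[0][0] <= until:
            self.now, _, handler = heapq.heappop(self._queue)
            handler(self)

# Toy model: a compute phase followed by a simulated message send.
def compute(sim):
    print(f"t={sim.now:.1f}  compute done, sending message")
    sim.schedule(2.5, lambda s: print(f"t={s.now:.1f}  message delivered"))

sim = Simulator()
sim.schedule(1.0, compute)
sim.run()
```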

  9. Constrained optimization via simulation models for new product innovation

    Science.gov (United States)

    Pujowidianto, Nugroho A.

    2017-11-01

    We consider the problem of constrained optimization where the decision makers aim to optimize the primary performance measure while constraining the secondary performance measures. This paper provides a brief overview of stochastically constrained optimization via discrete event simulation. Most review papers tend to be methodology-based. This review attempts to be problem-based, as decision makers may have already decided on the problem formulation. We consider constrained optimization models as there are usually constraints on secondary performance measures as trade-offs in new product development. It starts by laying out different possible methods and the reasons for using constrained optimization via simulation models. It is then followed by a review of different simulation optimization approaches to address constrained optimization depending on the number of decision variables, the type of constraints, and the risk preferences of the decision makers in handling uncertainties.
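
    A simple (if sample-hungry) baseline for the problem class described above is to replicate each candidate design many times, discard designs whose estimated secondary measures violate the constraints, and return the feasible design with the best estimated primary measure. The sketch below assumes profit as the primary measure and time-to-market as the constrained secondary; all names and distributions are illustrative.

```python
import random
import statistics

def select_design(designs, simulate, n_reps=200, ttm_limit=12.0):
    """Pick the feasible design with the best mean primary measure.
    simulate(d) must return (profit, time_to_market) for one replication."""
    best, best_profit = None, float("-inf")
    for d in designs:
        reps = [simulate(d) for _ in range(n_reps)]
        mean_profit = statistics.mean(p for p, _ in reps)
        mean_ttm = statistics.mean(t for _, t in reps)
        if mean_ttm <= ttm_limit and mean_profit > best_profit:   # feasibility first
            best, best_profit = d, mean_profit
    return best

# Illustrative stochastic simulator for three hypothetical product designs.
def toy_simulate(design):
    profit = random.gauss({"A": 10.0, "B": 14.0, "C": 16.0}[design], 3.0)
    ttm = random.gauss({"A": 9.0, "B": 11.0, "C": 14.0}[design], 1.0)
    return profit, ttm

print(select_design(["A", "B", "C"], toy_simulate))
```

    More sophisticated procedures account for the sampling error in both the objective and the constraint estimates rather than comparing plain sample means.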

  10. MASADA: A MODELING AND SIMULATION AUTOMATED DATA ANALYSIS FRAMEWORK FOR CONTINUOUS DATA-INTENSIVE VALIDATION OF SIMULATION MODELS

    CERN Document Server

    Foguelman, Daniel Jacob; The ATLAS collaboration

    2016-01-01

    Complex networked computer systems are usually subjected to upgrades and enhancements on a continuous basis. Modeling and simulation of such systems helps with guiding their engineering processes, in particular when testing candidate design alternatives directly on the real system is not an option. Models are built and simulation exercises are run guided by specific research and/or design questions. A vast amount of operational conditions for the real system need to be assumed in order to focus on the relevant questions at hand. A typical boundary condition for computer systems is the exogenously imposed workload. Meanwhile, in typical projects huge amounts of monitoring information are logged and stored with the purpose of studying the system’s performance in search for improvements. Also research questions change as systems’ operational conditions vary throughout its lifetime. This context poses many challenges to determine the validity of simulation models. As the behavioral empirical base of the sys...

  12. Simulation Modeling of Software Development Processes

    Science.gov (United States)

    Calavaro, G. F.; Basili, V. R.; Iazeolla, G.

    1996-01-01

    A simulation modeling approach is proposed for the prediction of software process productivity indices, such as cost and time-to-market, and the sensitivity analysis of such indices to changes in the organization parameters and user requirements. The approach uses a timed Petri Net and Object Oriented top-down model specification. Results demonstrate the model's representativeness and its usefulness in verifying process conformance to expectations and in performing continuous process improvement and optimization.
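
    To make the timed Petri net idea concrete, the sketch below runs a minimal net in which transitions consume input tokens, hold them for a deterministic delay, and then release output tokens. The three-activity "software process" and its delays are invented for illustration and are not taken from the paper.

```python
import heapq

def run_timed_petri_net(marking, transitions, horizon=100.0):
    """Minimal timed Petri net: `marking` maps place -> tokens; each transition is
    (name, delay, inputs, outputs) with inputs/outputs as place -> token-count dicts.
    Enabled transitions consume their inputs immediately and release their outputs
    after `delay` time units."""
    clock, pending, seq = 0.0, [], 0      # pending completions: (finish_time, seq, name, outs)
    while clock <= horizon:
        for name, delay, ins, outs in transitions:
            while all(marking.get(p, 0) >= n for p, n in ins.items()):
                for p, n in ins.items():
                    marking[p] -= n
                seq += 1
                heapq.heappush(pending, (clock + delay, seq, name, outs))
        if not pending:
            break
        clock, _, name, outs = heapq.heappop(pending)
        for p, n in outs.items():
            marking[p] = marking.get(p, 0) + n
        print(f"t={clock:>5.1f}  {name} completed  marking={marking}")
    return marking

# Toy software process: requirements -> design -> code -> test with one developer token.
transitions = [("design", 5.0, {"req": 1, "dev": 1}, {"spec": 1, "dev": 1}),
               ("code", 8.0, {"spec": 1, "dev": 1}, {"build": 1, "dev": 1}),
               ("test", 3.0, {"build": 1, "dev": 1}, {"done": 1, "dev": 1})]
run_timed_petri_net({"req": 2, "dev": 1}, transitions)
```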

  13. Uncertainty and sensitivity analysis in building performance simulation for decision support and design optimization

    NARCIS (Netherlands)

    Hopfe, C.J.

    2009-01-01

    Building performance simulation (BPS) uses computer-based models that cover performance aspects such as energy consumption and thermal comfort in buildings. The uptake of BPS in current building design projects is limited. Although there is a large number of building simulation tools available, the

  14. Architecture oriented modeling and simulation method for combat mission profile

    Directory of Open Access Journals (Sweden)

    CHEN Xia

    2017-05-01

    Full Text Available In order to effectively analyze the system behavior and system performance of a combat mission profile, an architecture-oriented modeling and simulation method is proposed. Starting from the architecture modeling, this paper describes the mission profile based on the definition from the National Military Standard of China and the US Department of Defense Architecture Framework (DoDAF) model, and constructs the architecture model of the mission profile. Then the transformation relationship between the architecture model and the agent simulation model is proposed to form the mission profile executable model. At last, taking the air-defense mission profile as an example, the agent simulation model is established based on the architecture model, and the input and output relations of the simulation model are analyzed. It provides method guidance for combat mission profile design.

  15. New exploration on TMSR: modelling and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Si, S.; Chen, Q.; Bei, H.; Zhao, J., E-mail: ssy@snerdi.com.cn [Shanghai Nuclear Engineering Research & Design Inst., Shanghai (China)

    2015-07-01

    A tightly coupled multi-physics model for an MSR (Molten Salt Reactor) system, involving the reactor core and the rest of the primary loop, has been developed and employed in the in-house developed computer code TANG-MSR. In this paper, the computer code is used to simulate the steady-state and transient behavior of our redesigned TMSR. The presented simulation results demonstrate that the models employed in TANG-MSR can capture the major physics phenomena in an MSR and that the redesigned TMSR has excellent safety and sustainability performance. (author)

  16. On the performance simulation of inter-stage turbine reheat

    International Nuclear Information System (INIS)

    Pellegrini, Alvise; Nikolaidis, Theoklis; Pachidis, Vassilios; Köhler, Stephan

    2017-01-01

    Highlights: • An innovative gas turbine performance simulation methodology is proposed. • It allows DP and OD performance calculations to be performed for complex engine layouts. • It is essential for inter-turbine reheat (ITR) engine performance calculation. • A detailed description is provided for fast and flexible implementation. • The methodology is successfully verified against a commercial closed-source code. - Abstract: Several authors have suggested the implementation of reheat in high By-Pass Ratio (BPR) aero engines, to improve engine performance. In contrast to military afterburning, civil aero engines would aim at reducing Specific Fuel Consumption (SFC) by introducing ‘Inter-stage Turbine Reheat’ (ITR). To maximise benefits, the second combustor should be placed at an early stage of the expansion process, e.g. between the first and second High-Pressure Turbine (HPT) stages. The aforementioned cycle design requires the accurate simulation of two or more turbine stages on the same shaft. The Design Point (DP) performance can be easily evaluated by defining a Turbine Work Split (TWS) ratio between the turbine stages. However, the performance simulation of Off-Design (OD) operating points requires the calculation of the TWS parameter for every OD step, by taking into account the thermodynamic behaviour of each turbine stage, represented by their respective maps. No analytical solution of the aforementioned problem is currently available in the public domain. This paper presents an analytical methodology by which ITR can be simulated at DP and OD. Results show excellent agreement with a commercial, closed-source performance code; discrepancies range from 0% to 3.48%, and are ascribed to the different gas models implemented in the codes.

  17. Numerical simulation investigation on centrifugal compressor performance of turbocharger

    International Nuclear Information System (INIS)

    Li, Jie; Yin, Yuting; Li, Shuqi; Zhang, Jizhong

    2013-01-01

    In this paper, the mathematical model of the flow field in the centrifugal compressor of a turbocharger was studied. Based on the theory of computational fluid dynamics (CFD), performance curves and parameter distributions of the compressor were obtained from 3-D numerical simulation using CFX. Meanwhile, the influences of grid number and distribution on compressor performance were investigated, and the numerical calculation method was analyzed and validated through comparison with test data. The results obtained show that the increase of the grid number has little influence on compressor performance when the grid number of a single passage is above 300,000. The results also show that the numerically calculated mass flow rate at the compressor choke condition is in good agreement with the test results, and that the maximum difference of the diffuser exit pressure between simulation and experiment decreases to 3.5% under the assumption of a 6 kPa additional total pressure loss at the compressor inlet. The numerical simulation method in this paper can be used to predict compressor performance; the difference in total pressure ratio between calculation and test is less than 7%, and the total-to-total efficiency also agrees well with the test data.
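
    The total-to-total efficiency compared above can be recovered from stagnation conditions with the usual ideal-gas relation. The sketch below is generic post-processing, not part of the CFX setup described in the paper, and the operating-point numbers are illustrative.

```python
def total_to_total_efficiency(pr_tt, tr_tt, gamma=1.4):
    """Isentropic total-to-total compressor efficiency from the
    total pressure ratio pr_tt and total temperature ratio tr_tt."""
    return (pr_tt ** ((gamma - 1.0) / gamma) - 1.0) / (tr_tt - 1.0)

# Example operating point: pressure ratio 2.8, 120 K total temperature rise from 298 K.
print(total_to_total_efficiency(pr_tt=2.8, tr_tt=(298.0 + 120.0) / 298.0))
```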

  18. Numerical simulation investigation on centrifugal compressor performance of turbocharger

    Energy Technology Data Exchange (ETDEWEB)

    Li, Jie [China Iron and Steel Research Institute Group, Beijing (China); Yin, Yuting [China North Engine Research Institute, Datong (China); Li, Shuqi; Zhang, Jizhong [Science and Technology Diesel Engine Turbocharging Laboratory, Datong (China)

    2013-06-15

    In this paper, the mathematical model of the flow field in the centrifugal compressor of a turbocharger was studied. Based on the theory of computational fluid dynamics (CFD), performance curves and parameter distributions of the compressor were obtained from 3-D numerical simulation using CFX. Meanwhile, the influences of grid number and distribution on compressor performance were investigated, and the numerical calculation method was analyzed and validated through comparison with test data. The results obtained show that the increase of the grid number has little influence on compressor performance when the grid number of a single passage is above 300,000. The results also show that the numerically calculated mass flow rate at the compressor choke condition is in good agreement with the test results, and that the maximum difference of the diffuser exit pressure between simulation and experiment decreases to 3.5% under the assumption of a 6 kPa additional total pressure loss at the compressor inlet. The numerical simulation method in this paper can be used to predict compressor performance; the difference in total pressure ratio between calculation and test is less than 7%, and the total-to-total efficiency also agrees well with the test data.

  19. Driving Simulator Development and Performance Study

    OpenAIRE

    Juto, Erik

    2010-01-01

    The driving simulator is a vital tool for much of the research performed at the Swedish National Road and Transport Institute (VTI). Currently VTI possesses three driving simulators: two high-fidelity simulators developed and constructed by VTI, and a medium-fidelity simulator from the German company Dr.-Ing. Reiner Foerst GmbH. The two high-fidelity simulators run the same simulation software, developed at VTI. The medium-fidelity simulator runs proprietary simulation software. At VTI there is...

  20. Simulating spin models on GPU

    Science.gov (United States)

    Weigel, Martin

    2011-09-01

    Over the last couple of years it has been realized that the vast computational power of graphics processing units (GPUs) could be harvested for purposes other than the video game industry. This power, which at least nominally exceeds that of current CPUs by large factors, results from the relative simplicity of the GPU architectures as compared to CPUs, combined with a large number of parallel processing units on a single chip. To benefit from this setup for general computing purposes, the problems at hand need to be prepared in a way to profit from the inherent parallelism and hierarchical structure of memory accesses. In this contribution I discuss the performance potential for simulating spin models, such as the Ising model, on GPU as compared to conventional simulations on CPU.
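
    As a point of reference for the spin-model simulations discussed above, the sketch below implements a plain CPU Metropolis sweep for the 2D Ising model in Python; a GPU version along the lines of this contribution would instead update the two checkerboard sublattices in parallel. The lattice size, coupling and sweep count are illustrative assumptions, not values from the paper.

    ```python
    # Minimal CPU reference for 2D Ising Metropolis updates (illustrative only).
    import numpy as np

    def metropolis_sweep(spins, beta, rng):
        """One sweep of single-spin-flip Metropolis updates on a periodic LxL lattice."""
        L = spins.shape[0]
        for _ in range(L * L):
            i, j = rng.integers(0, L, size=2)
            # Sum of the four nearest neighbours with periodic boundaries.
            nn = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
                  + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
            dE = 2.0 * spins[i, j] * nn          # energy change of flipping spin (i, j)
            if dE <= 0 or rng.random() < np.exp(-beta * dE):
                spins[i, j] *= -1
        return spins

    rng = np.random.default_rng(0)
    spins = rng.choice([-1, 1], size=(32, 32))
    for _ in range(100):
        spins = metropolis_sweep(spins, beta=0.44, rng=rng)  # near the critical coupling
    print("magnetisation per spin:", spins.mean())
    ```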

  1. Validation of advanced NSSS simulator model for loss-of-coolant accidents

    Energy Technology Data Exchange (ETDEWEB)

    Kao, S.P.; Chang, S.K.; Huang, H.C. [Nuclear Training Branch, Northeast Utilities, Waterford, CT (United States)

    1995-09-01

    The replacement of the NSSS (Nuclear Steam Supply System) model on the Millstone 2 full-scope simulator has significantly increased its fidelity to simulate adverse conditions in the RCS. The new simulator NSSS model is a real-time derivative of the Nuclear Plant Analyzer by ABB. The thermal-hydraulic model is a five-equation, non-homogeneous model for water, steam, and non-condensible gases. The neutronic model is a three-dimensional nodal diffusion model. In order to certify the new NSSS model for operator training, an extensive validation effort has been performed by benchmarking the model performance against RELAP5/MOD2. This paper presents the validation results for the cases of small- and large-break loss-of-coolant accidents (LOCA). Detailed comparisons in the phenomena of reflux-condensation, phase separation, and two-phase natural circulation are discussed.

  2. The Five Key Questions of Human Performance Modeling.

    Science.gov (United States)

    Wu, Changxu

    2018-01-01

    Via building computational (typically mathematical and computer simulation) models, human performance modeling (HPM) quantifies, predicts, and maximizes human performance, human-machine system productivity and safety. This paper describes and summarizes the five key questions of human performance modeling: 1) Why we build models of human performance; 2) What the expectations of a good human performance model are; 3) What the procedures and requirements in building and verifying a human performance model are; 4) How we integrate a human performance model with system design; and 5) What the possible future directions of human performance modeling research are. Recent and classic HPM findings are addressed in the five questions to provide new thinking in HPM's motivations, expectations, procedures, system integration and future directions.

  3. Performance evaluation by simulation and analysis with applications to computer networks

    CERN Document Server

    Chen, Ken

    2015-01-01

    This book is devoted to the most used methodologies for performance evaluation: simulation using specialized software and mathematical modeling. An important part is dedicated to simulation, particularly its theoretical framework and the precautions to be taken in the implementation of the experimental procedure. These principles are illustrated by concrete examples achieved through operational simulation languages (OMNeT++, OPNET). Presented as a complementary approach, the mathematical method is essential for the simulation. Both methodologies based largely on the theory of

  4. Improving the Performance of the Extreme-scale Simulator

    Energy Technology Data Exchange (ETDEWEB)

    Engelmann, Christian [ORNL; Naughton III, Thomas J [ORNL

    2014-01-01

    Investigating the performance of parallel applications at scale on future high-performance computing (HPC) architectures and the performance impact of different architecture choices is an important component of HPC hardware/software co-design. The Extreme-scale Simulator (xSim) is a simulation-based toolkit for investigating the performance of parallel applications at scale. xSim scales to millions of simulated Message Passing Interface (MPI) processes. The overhead introduced by a simulation tool is an important performance and productivity aspect. This paper documents two improvements to xSim: (1) a new deadlock resolution protocol to reduce the parallel discrete event simulation management overhead and (2) a new simulated MPI message matching algorithm to reduce the oversubscription management overhead. The results clearly show a significant performance improvement, such as reducing the simulation overhead for running the NAS Parallel Benchmark suite inside the simulator from 1,020% to 238% for the conjugate gradient (CG) benchmark and from 102% to 0% for the embarrassingly parallel (EP) benchmark, as well as from 37,511% to 13,808% for CG and from 3,332% to 204% for EP with accurate process failure simulation.

  5. Enhanced Contact Graph Routing (ECGR) MACHETE Simulation Model

    Science.gov (United States)

    Segui, John S.; Jennings, Esther H.; Clare, Loren P.

    2013-01-01

    Contact Graph Routing (CGR) for Delay/Disruption Tolerant Networking (DTN) space-based networks makes use of the predictable nature of node contacts to make real-time routing decisions given unpredictable traffic patterns. The contact graph will have been disseminated to all nodes before the start of route computation. CGR was designed for space-based networking environments where future contact plans are known or are independently computable (e.g., using known orbital dynamics). For each data item (known as a bundle in DTN), a node independently performs route selection by examining possible paths to the destination. Route computation could conceivably run thousands of times a second, so computational load is important. This work refers to the simulation software model of Enhanced Contact Graph Routing (ECGR) for DTN Bundle Protocol in JPL's MACHETE simulation tool. The simulation model was used for performance analysis of CGR and led to several performance enhancements. The simulation model was used to demonstrate the improvements of ECGR over CGR as well as other routing methods in space network scenarios. ECGR moved to using earliest arrival time because it is a global monotonically increasing metric that guarantees the safety properties needed for the solution's correctness since route re-computation occurs at each node to accommodate unpredicted changes (e.g., traffic pattern, link quality). Furthermore, using earliest arrival time enabled the use of the standard Dijkstra algorithm for path selection. The Dijkstra algorithm for path selection has a well-known inexpensive computational cost. These enhancements have been integrated into the open source CGR implementation. The ECGR model is also useful for route metric experimentation and comparisons with other DTN routing protocols particularly when combined with MACHETE's space networking models and Delay Tolerant Link State Routing (DTLSR) model.
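
    The earliest-arrival-time idea behind ECGR can be illustrated with a short, hedged sketch: a Dijkstra search over a simplified contact plan. The contact tuple layout, the toy contact values and the single one-way-light-time delay term are assumptions made for illustration and do not reproduce the actual CGR/ION data structures.

    ```python
    # Hedged sketch of earliest-arrival-time route selection over a contact plan.
    import heapq

    # Each contact: (from_node, to_node, start_time, end_time, one_way_light_time)
    contacts = [
        ("A", "B", 0, 100, 5),
        ("B", "C", 50, 200, 10),
        ("A", "C", 120, 300, 20),
    ]

    def earliest_arrival(contacts, source, destination, t0=0.0):
        """Dijkstra on earliest arrival time; returns (arrival_time, path) or None."""
        by_sender = {}
        for c in contacts:
            by_sender.setdefault(c[0], []).append(c)
        best = {source: t0}
        queue = [(t0, source, [source])]
        while queue:
            t, node, path = heapq.heappop(queue)
            if node == destination:
                return t, path
            for frm, to, start, end, owlt in by_sender.get(node, []):
                if t > end:
                    continue                      # contact already over
                arrival = max(t, start) + owlt    # wait for contact start, then propagate
                if arrival < best.get(to, float("inf")):
                    best[to] = arrival
                    heapq.heappush(queue, (arrival, to, path + [to]))
        return None

    print(earliest_arrival(contacts, "A", "C"))   # expected: (60, ['A', 'B', 'C'])
    ```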

  6. Uterus models for use in virtual reality hysteroscopy simulators.

    Science.gov (United States)

    Niederer, Peter; Weiss, Stephan; Caduff, Rosmarie; Bajka, Michael; Szekély, Gabor; Harders, Matthias

    2009-05-01

    Virtual reality models of human organs are needed in surgery simulators which are developed for educational and training purposes. A simulation can only be useful, however, if the mechanical performance of the system in terms of force-feedback for the user as well as the visual representation is realistic. We therefore aim at developing a mechanical computer model of the organ in question which yields realistic force-deformation behavior under virtual instrument-tissue interactions and which, in particular, runs in real time. The modeling of the human uterus is described as it is to be implemented in a simulator for minimally invasive gynecological procedures. To this end, anatomical information which was obtained from specially designed computed tomography and magnetic resonance imaging procedures as well as constitutive tissue properties recorded from mechanical testing were used. In order to achieve real-time performance, the combination of mechanically realistic numerical uterus models of various levels of complexity with a statistical deformation approach is suggested. In view of mechanical accuracy of such models, anatomical characteristics including the fiber architecture along with the mechanical deformation properties are outlined. In addition, an approach to make this numerical representation potentially usable in an interactive simulation is discussed. The numerical simulation of hydrometra is shown in this communication. The results were validated experimentally. In order to meet the real-time requirements and to accommodate the large biological variability associated with the uterus, a statistical modeling approach is demonstrated to be useful.

  7. Regional model simulations of New Zealand climate

    Science.gov (United States)

    Renwick, James A.; Katzfey, Jack J.; Nguyen, Kim C.; McGregor, John L.

    1998-03-01

    Simulation of New Zealand climate is examined through the use of a regional climate model nested within the output of the Commonwealth Scientific and Industrial Research Organisation nine-level general circulation model (GCM). R21 resolution GCM output is used to drive a regional model run at 125 km grid spacing over the Australasian region. The 125 km run is used in turn to drive a simulation at 50 km resolution over New Zealand. Simulations with a full seasonal cycle are performed for 10 model years. The focus is on the quality of the simulation of present-day climate, but results of a doubled-CO2 run are discussed briefly. Spatial patterns of mean simulated precipitation and surface temperatures improve markedly as horizontal resolution is increased, through the better resolution of the country's orography. However, increased horizontal resolution leads to a positive bias in precipitation. At 50 km resolution, simulated frequency distributions of daily maximum/minimum temperatures are statistically similar to those of observations at many stations, while frequency distributions of daily precipitation appear to be statistically different to those of observations at most stations. Modeled daily precipitation variability at 125 km resolution is considerably less than observed, but is comparable to, or exceeds, observed variability at 50 km resolution. The sensitivity of the simulated climate to changes in the specification of the land surface is discussed briefly. Spatial patterns of the frequency of extreme temperatures and precipitation are generally well modeled. Under a doubling of CO2, the frequency of precipitation extremes changes only slightly at most locations, while air frosts become virtually unknown except at high-elevation sites.

  8. Simulation modeling for quality and productivity in steel cord manufacturing

    OpenAIRE

    Türkseven, Can Hulusi; Turkseven, Can Hulusi; Ertek, Gürdal; Ertek, Gurdal

    2003-01-01

    We describe the application of simulation modeling to estimate and improve quality and productivity performance of a steel cord manufacturing system. We describe the typical steel cord manufacturing plant, emphasize its distinguishing characteristics, identify various production settings and discuss applicability of simulation as a management decision support tool. Besides presenting the general structure of the developed simulation model, we focus on wire fractures, which can be an important...

  9. Effects of Structural Transparency in System Dynamics Simulators on Performance and Understanding

    Directory of Open Access Journals (Sweden)

    Birgit Kopainsky

    2015-10-01

    Prior exploration is an instructional strategy that has improved performance and understanding in system-dynamics-based simulators, but only to a limited degree. This study investigates whether model transparency, that is, showing users the internal structure of models, can extend the prior exploration strategy and improve learning even more. In an experimental study, participants in a web-based simulation learned about and managed a small developing nation. All participants were provided the prior exploration strategy, but only half received prior exploration embedded in a structure-behavior diagram intended to make the underlying model’s structure more transparent. Participants provided with the more transparent strategy demonstrated better understanding of the underlying model. Their performance, however, was equivalent to that of participants in the less transparent condition. Combined with previous studies, our results suggest that while prior exploration is a beneficial strategy for both performance and understanding, making the model structure transparent with structure-behavior diagrams is more limited in its effect.

  10. On the Fidelity of Semi-distributed Hydrologic Model Simulations for Large Scale Catchment Applications

    Science.gov (United States)

    Ajami, H.; Sharma, A.; Lakshmi, V.

    2017-12-01

    Application of semi-distributed hydrologic modeling frameworks is a viable alternative to fully distributed hyper-resolution hydrologic models, offering computational efficiency while still resolving the fine-scale spatial structure of hydrologic fluxes and states. However, the fidelity of semi-distributed model simulations is impacted by (1) the formulation of hydrologic response units (HRUs), and (2) the aggregation of catchment properties for formulating simulation elements. Here, we evaluate the performance of a recently developed Soil Moisture and Runoff simulation Toolkit (SMART) for large catchment scale simulations. In SMART, topologically connected HRUs are delineated using thresholds obtained from topographic and geomorphic analysis of a catchment, and simulation elements are equivalent cross sections (ECS) representative of a hillslope in first-order sub-basins. Earlier investigations have shown that formulation of ECSs at the scale of a first-order sub-basin reduces computational time significantly without compromising simulation accuracy. However, the implementation of this approach has not been fully explored for catchment scale simulations. To assess SMART performance, we set up the model over the Little Washita watershed in Oklahoma. Model evaluations using in-situ soil moisture observations show satisfactory model performance. In addition, we evaluated the performance of a number of soil moisture disaggregation schemes recently developed to provide spatially explicit soil moisture outputs at fine-scale resolution. Our results illustrate that the statistical disaggregation scheme performs significantly better than the methods based on topographic data. Future work is focused on assessing the performance of SMART using remotely sensed soil moisture observations with spatially based model evaluation metrics.

  11. USING COPULAS TO MODEL DEPENDENCE IN SIMULATION RISK ASSESSMENT

    Energy Technology Data Exchange (ETDEWEB)

    Dana L. Kelly

    2007-11-01

    Typical engineering systems in applications with high failure consequences such as nuclear reactor plants often employ redundancy and diversity of equipment in an effort to lower the probability of failure and therefore risk. However, it has long been recognized that dependencies exist in these redundant and diverse systems. Some dependencies, such as common sources of electrical power, are typically captured in the logic structure of the risk model. Others, usually referred to as intercomponent dependencies, are treated implicitly by introducing one or more statistical parameters into the model. Such common-cause failure models have limitations in a simulation environment. In addition, substantial subjectivity is associated with parameter estimation for these models. This paper describes an approach in which system performance is simulated by drawing samples from the joint distributions of dependent variables. The approach relies on the notion of a copula distribution, a notion which has been employed by the actuarial community for ten years or more, but which has seen only limited application in technological risk assessment. The paper also illustrates how equipment failure data can be used in a Bayesian framework to estimate the parameter values in the copula model. This approach avoids much of the subjectivity required to estimate parameters in traditional common-cause failure models. Simulation examples are presented for failures in time. The open-source software package R is used to perform the simulations. The open-source software package WinBUGS is used to perform the Bayesian inference via Markov chain Monte Carlo sampling.
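
    The copula idea described above can be sketched as follows: dependent uniforms are drawn from a Gaussian copula and mapped through inverse exponential CDFs to obtain correlated failure times. This is an illustrative Python sketch, not the paper's R/WinBUGS implementation; the correlation and failure-rate values are assumptions chosen for demonstration.

    ```python
    # Drawing dependent failure times from a Gaussian copula with exponential marginals.
    import math
    import numpy as np

    def std_normal_cdf(x):
        return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

    def sample_gaussian_copula(n, rho, rates, seed=0):
        """Return n pairs of dependent exponential failure times (rate parameters `rates`)."""
        rng = np.random.default_rng(seed)
        cov = np.array([[1.0, rho], [rho, 1.0]])
        z = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=n)
        u = np.vectorize(std_normal_cdf)(z)          # uniforms with Gaussian dependence
        # Inverse-CDF transform to exponential marginals: t = -ln(1 - u) / lambda
        return -np.log(1.0 - u) / np.asarray(rates)

    times = sample_gaussian_copula(n=10_000, rho=0.8, rates=[1e-3, 1e-3])
    ranks0 = times[:, 0].argsort().argsort()
    ranks1 = times[:, 1].argsort().argsort()
    print("empirical rank correlation:", np.corrcoef(ranks0, ranks1)[0, 1])
    ```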

  12. Using Modeling and Simulation to Predict Operator Performance and Automation-Induced Complacency With Robotic Automation: A Case Study and Empirical Validation.

    Science.gov (United States)

    Wickens, Christopher D; Sebok, Angelia; Li, Huiyang; Sarter, Nadine; Gacy, Andrew M

    2015-09-01

    The aim of this study was to develop and validate a computational model of the automation complacency effect, as operators work on a robotic arm task, supported by three different degrees of automation. Some computational models of complacency in human-automation interaction exist, but those are formed and validated within the context of fairly simplified monitoring failures. This research extends model validation to a much more complex task, so that system designers can establish, without need for human-in-the-loop (HITL) experimentation, merits and shortcomings of different automation degrees. We developed a realistic simulation of a space-based robotic arm task that could be carried out with three different levels of trajectory visualization and execution automation support. Using this simulation, we performed HITL testing. Complacency was induced via several trials of correctly performing automation and then was assessed on trials when automation failed. Following a cognitive task analysis of the robotic arm operation, we developed a multicomponent model of the robotic operator and his or her reliance on automation, based in part on visual scanning. The comparison of model predictions with empirical results revealed that the model accurately predicted routine performance and predicted the responses to these failures after complacency developed. However, the scanning models do not account for the entire attention allocation effects of complacency. Complacency modeling can provide a useful tool for predicting the effects of different types of imperfect automation. The results from this research suggest that focus should be given to supporting situation awareness in automation development. © 2015, Human Factors and Ergonomics Society.

  13. Intraocular Telescopic System Design: Optical and Visual Simulation in a Human Eye Model

    OpenAIRE

    Zoulinakis, Georgios; Ferrer-Blasco, Teresa

    2017-01-01

    Purpose. To design an intraocular telescopic system (ITS) for magnifying the retinal image and to simulate its optical and visual performance after implantation in a human eye model. Methods. Design and simulation were carried out with ray-tracing and optical design software. Two different ITS were designed, and their visual performance was simulated using the Liou-Brennan eye model. The differences between the two ITS were the placement of their lenses in the eye model and their powers. Ray tracing in bot...

  14. CFD simulation of rotor aerodynamic performance when using additional surface structure array

    Science.gov (United States)

    Wang, Bing; Kong, Deyi

    2017-10-01

    The present work analyses the aerodynamic performance of a rotor with an additional surface structure array in an attempt to maximize its performance in hover flight. Unstructured grids and the Reynolds-averaged Navier-Stokes equations were used to calculate the performance of the prototype rotor and the rotor with the additional surface structure array in air. The computational fluid dynamics software FLUENT was used to simulate the thrust of the rotors. The results of the calculations are in reasonable agreement with experimental data, which shows that the calculation model used in this work is useful for simulating the performance of the rotor with an additional surface structure array. With this theoretical calculation model, the thrusts of rotors with surface structure arrays of three different shapes were calculated. According to the simulation results and the experimental data, the rotor with the triangular surface structure array has better aerodynamic performance than the other rotors. In contrast with the prototype rotor, the thrust of the rotor with the triangular surface structure array increases by 5.2% at the operating rotating speed of 3000 r/min, and the additional triangular surface structure array has almost no influence on the efficiency of the rotor.

  15. 3D numerical simulation on heat transfer performance of a cylindrical liquid immersion solar receiver

    International Nuclear Information System (INIS)

    Xiang Haijun; Wang Yiping; Zhu Li; Han Xinyue; Sun Yong; Zhao Zhengjian

    2012-01-01

    Highlights: ► Establishment of a three-dimensional numerical simulation model of a cylindrical liquid immersion solar receiver. ► Determination of model parameters and validation of the model using real collected data. ► Optimization of liquid flow rate and fin structure for better heat transfer performance. - Abstract: Liquid immersion cooling for a cylindrical solar receiver in a dish concentrator photovoltaic system has been experimentally verified to be a promising method of removing surplus heat from densely packed solar cells. In the present study, a three-dimensional (3D) numerical simulation model of the prototype was established to better understand the mechanism of the direct-contact heat transfer process. With the standard k–ε turbulence model, detailed simulation results for the velocity field and temperature characteristics were obtained. The heat transfer performance of two structural modules (a bare module and a finned module) under actual weather conditions was simulated. The predicted temperature distributions of the two structural modules in the axial and lateral directions were found to be in good agreement with the experimental data. Based on the validated simulation model, the influence of liquid flow rate and module geometric parameters on the cell temperature was then investigated. The simulated results indicated that the cell module with a fin height of 4 mm and a fin count of 11 has the best heat transfer performance and will be used in further work.

  16. Influence of World and Gravity Model Selection on Surface Interacting Vehicle Simulations

    Science.gov (United States)

    Madden, Michael M.

    2007-01-01

    A vehicle simulation is surface-interacting if the state of the vehicle (position, velocity, and acceleration) relative to the surface is important. Surface-interacting simulations perform ascent, entry, descent, landing, surface travel, or atmospheric flight. Modeling of gravity is an influential environmental factor for surface-interacting simulations. Gravity is the free-fall acceleration observed from a world-fixed frame that rotates with the world. Thus, gravity is the sum of gravitation and the centrifugal acceleration due to the world's rotation. In surface-interacting simulations, the fidelity of gravity at heights above the surface is more significant than gravity fidelity at locations in inertial space. A surface-interacting simulation cannot treat the gravity model separately from the world model, which simulates the motion and shape of the world. The world model's simulation of the world's rotation, or lack thereof, produces the centrifugal acceleration component of gravity. The world model's reproduction of the world's shape will produce different positions relative to the world center for a given height above the surface. These differences produce variations in the gravitation component of gravity. This paper examines the actual performance of world and gravity/gravitation pairs in a simulation using the Earth.
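
    The decomposition of gravity into gravitation plus centrifugal acceleration can be illustrated with a minimal sketch. A spherical Earth, a point-mass gravitation term and projecting the centrifugal term onto the local vertical are simplifying assumptions made only for demonstration, not the world/gravity model pairs examined in the paper.

    ```python
    # Effective gravity seen in a world-fixed rotating frame = gravitation + centrifugal term.
    import math

    GM = 3.986004418e14       # m^3/s^2, Earth's gravitational parameter
    OMEGA = 7.292115e-5       # rad/s, Earth rotation rate
    R_EARTH = 6_371_000.0     # m, mean radius (spherical-Earth assumption)

    def effective_gravity(height_m, latitude_deg):
        """Magnitude of free-fall acceleration observed in the rotating frame (m/s^2)."""
        r = R_EARTH + height_m
        lat = math.radians(latitude_deg)
        g_grav = GM / r**2                               # gravitation toward the centre
        a_centrifugal = OMEGA**2 * r * math.cos(lat)     # directed away from the spin axis
        # Project the centrifugal term onto the local vertical (radial) direction.
        return g_grav - a_centrifugal * math.cos(lat)

    print(effective_gravity(0.0, 0.0))    # ~9.79 m/s^2 at the equator
    print(effective_gravity(0.0, 90.0))   # ~9.82 m/s^2 at the pole (no centrifugal term)
    ```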

  17. An applied artificial intelligence approach towards assessing building performance simulation tools

    Energy Technology Data Exchange (ETDEWEB)

    Yezioro, Abraham [Faculty of Architecture and Town Planning, Technion IIT (Israel); Dong, Bing [Center for Building Performance and Diagnostics, School of Architecture, Carnegie Mellon University (United States); Leite, Fernanda [Department of Civil and Environmental Engineering, Carnegie Mellon University (United States)

    2008-07-01

    With the development of modern computer technology, a large number of building energy simulation tools are available on the market. When choosing which simulation tool to use in a project, the user must consider the tool's accuracy and reliability, given the building information they have at hand, which will serve as input for the tool. This paper presents an approach towards comparing building performance simulation results with actual measurements, using artificial neural networks (ANN) for predicting building energy performance. Training and testing of the ANN were carried out with energy consumption data acquired for 1 week in the case-study building, called the Solar House. The predicted results show a good fit with the mathematical model, with a mean absolute error of 0.9%. Moreover, four building simulation tools were selected in this study in order to compare their results with the ANN-predicted energy consumption: Energy-10, the Green Building Studio web tool, eQuest and EnergyPlus. The results showed that the more detailed simulation tools have the best simulation performance in terms of heating and cooling electricity consumption, within a 3% mean absolute error. (author)
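
    A minimal sketch of the ANN-based prediction idea (not the authors' network or data) is shown below, using synthetic hourly records in place of the Solar House measurements; the features, network size and scikit-learn estimator are assumptions chosen only for illustration.

    ```python
    # Hedged sketch: train a small neural network to predict hourly energy use.
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(1)
    hours = rng.uniform(0, 24, 500)
    outdoor_temp = rng.uniform(-5, 35, 500)
    occupancy = rng.integers(0, 2, 500)
    # Synthetic "measured" hourly energy use (kWh) with noise.
    energy = 2.0 + 0.15 * np.abs(outdoor_temp - 21) + 1.2 * occupancy + rng.normal(0, 0.2, 500)

    X = np.column_stack([hours, outdoor_temp, occupancy])
    X_scaled = StandardScaler().fit_transform(X)

    model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=0)
    model.fit(X_scaled[:400], energy[:400])            # train on the first 400 samples
    pred = model.predict(X_scaled[400:])               # test on the remaining 100
    mae_percent = 100 * np.mean(np.abs(pred - energy[400:]) / energy[400:])
    print(f"mean absolute error: {mae_percent:.1f}%")
    ```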

  18. Accelerating transient simulation of linear reduced order models.

    Energy Technology Data Exchange (ETDEWEB)

    Thornquist, Heidi K.; Mei, Ting; Keiter, Eric Richard; Bond, Brad

    2011-10-01

    Model order reduction (MOR) techniques have been used to facilitate the analysis of dynamical systems for many years. Although existing model reduction techniques are capable of providing huge speedups in the frequency domain analysis (i.e. AC response) of linear systems, such speedups are often not obtained when performing transient analysis on the systems, particularly when coupled with other circuit components. Reduced system size, which is the ostensible goal of MOR methods, is often insufficient to improve transient simulation speed on realistic circuit problems. It can be shown that making the correct reduced order model (ROM) implementation choices is crucial to the practical application of MOR methods. In this report we investigate methods for accelerating the simulation of circuits containing ROM blocks using the circuit simulator Xyce.
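
    The projection step at the heart of such MOR methods can be sketched generically: an orthonormal Krylov basis built from A^-1 B is used to project a linear state-space model onto a much smaller one. This is a textbook-style illustration on assumed random system matrices, not the specific algorithms or ROM implementation choices studied for Xyce.

    ```python
    # Hedged sketch of projection-based model order reduction for x' = A x + B u, y = C x.
    import numpy as np

    def reduce_model(A, B, C, order):
        """Galerkin projection onto a Krylov basis of A^-1 B (moment matching at s = 0)."""
        n = A.shape[0]
        K = np.zeros((n, order))
        v = np.linalg.solve(A, B[:, 0])
        for k in range(order):
            K[:, k] = v
            v = np.linalg.solve(A, v)
        V, _ = np.linalg.qr(K)                   # orthonormal basis of the Krylov subspace
        return V.T @ A @ V, V.T @ B, C @ V

    rng = np.random.default_rng(0)
    n = 200
    A = -np.diag(rng.uniform(1.0, 10.0, n)) + 0.01 * rng.standard_normal((n, n))
    B = rng.standard_normal((n, 1))
    C = rng.standard_normal((1, n))
    Ar, Br, Cr = reduce_model(A, B, C, order=10)

    # DC gains y = -C A^-1 B of the full and reduced models should agree closely.
    print((-C @ np.linalg.solve(A, B)).item(), (-Cr @ np.linalg.solve(Ar, Br)).item())
    ```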

  19. Performance of the CORDEX regional climate models in simulating offshore wind and wind potential

    Science.gov (United States)

    Kulkarni, Sumeet; Deo, M. C.; Ghosh, Subimal

    2018-03-01

    This study is oriented towards quantifying the skill added by regional climate models (RCMs) over their parent general circulation models (GCMs) in simulating wind speed and wind potential, with particular reference to the Indian offshore region. To arrive at a suitable reference dataset, the performance of wind outputs from three different reanalysis datasets is evaluated. The comparison across the RCMs and their corresponding parent GCMs is done on the basis of annual/seasonal wind statistics, intermodel bias, wind climatology, and classes of wind potential. It was observed that while the RCMs could simulate the spatial variability of winds well for certain subregions, they generally failed to replicate the overall spatial pattern, especially in monsoon and winter. Various causes of biases in the RCMs were determined by assessing the corresponding maps of wind vectors, surface temperature, and sea-level pressure. The results highlight the necessity to carefully assess RCM-yielded winds before using them for sensitive applications such as coastal vulnerability and hazard assessment. A supplementary outcome of this study is a wind potential atlas based on the spatial distribution of wind classes. This could be beneficial in identifying viable subregions for developing offshore wind farms by intercomparing both the RCM and GCM outcomes. It is encouraging that most of the RCMs and GCMs indicate that around 70% of the Indian offshore locations in monsoon would experience mean wind potential greater than 200 W/m2.

  20. Critical review of glass performance modeling

    International Nuclear Information System (INIS)

    Bourcier, W.L.

    1994-07-01

    Borosilicate glass is to be used for permanent disposal of high-level nuclear waste in a geologic repository. Mechanistic chemical models are used to predict the rate at which radionuclides will be released from the glass under repository conditions. The most successful and useful of these models link reaction path geochemical modeling programs with a glass dissolution rate law that is consistent with transition state theory. These models have been used to simulate several types of short-term laboratory tests of glass dissolution and to predict the long-term performance of the glass in a repository. Although mechanistically based, the current models are limited by a lack of unambiguous experimental support for some of their assumptions. The most severe problem of this type is the lack of an existing validated mechanism that controls long-term glass dissolution rates. Current models can be improved by performing carefully designed experiments and using the experimental results to validate the rate-controlling mechanisms implicit in the models. These models should be supported with long-term experiments to be used for model validation. The mechanistic basis of the models should be explored by using modern molecular simulations such as molecular orbital and molecular dynamics to investigate both the glass structure and its dissolution process

  1. Critical review of glass performance modeling

    Energy Technology Data Exchange (ETDEWEB)

    Bourcier, W.L. [Lawrence Livermore National Lab., CA (United States)

    1994-07-01

    Borosilicate glass is to be used for permanent disposal of high-level nuclear waste in a geologic repository. Mechanistic chemical models are used to predict the rate at which radionuclides will be released from the glass under repository conditions. The most successful and useful of these models link reaction path geochemical modeling programs with a glass dissolution rate law that is consistent with transition state theory. These models have been used to simulate several types of short-term laboratory tests of glass dissolution and to predict the long-term performance of the glass in a repository. Although mechanistically based, the current models are limited by a lack of unambiguous experimental support for some of their assumptions. The most severe problem of this type is the lack of an existing validated mechanism that controls long-term glass dissolution rates. Current models can be improved by performing carefully designed experiments and using the experimental results to validate the rate-controlling mechanisms implicit in the models. These models should be supported with long-term experiments to be used for model validation. The mechanistic basis of the models should be explored by using modern molecular simulations such as molecular orbital and molecular dynamics to investigate both the glass structure and its dissolution process.

  2. MMSNF 2005. Materials models and simulations for nuclear fuels

    Energy Technology Data Exchange (ETDEWEB)

    Freyss, M.; Durinck, J.; Carlot, G.; Sabathier, C.; Martin, P.; Garcia, P.; Ripert, M.; Blanpain, P.; Lippens, M.; Schut, H.; Federov, A.V.; Bakker, K.; Osaka, M.; Miwa, S.; Sato, I.; Tanaka, K.; Kurosaki, K.; Uno, M.; Yamanaka, S.; Govers, K.; Verwerft, M.; Hou, M.; Lemehov, S.E.; Terentyev, D.; Govers, K.; Kotomin, E.A.; Ashley, N.J.; Grimes, R.W.; Van Uffelen, P.; Mastrikov, Y.; Zhukovskii, Y.; Rondinella, V.V.; Kurosaki, K.; Uno, M.; Yamanaka, S.; Minato, K.; Phillpot, S.; Watanabe, T.; Shukla, P.; Sinnott, S.; Nino, J.; Grimes, R.; Staicu, D.; Hiernaut, J.P.; Wiss, T.; Rondinella, V.V.; Ronchi, C.; Yakub, E.; Kaye, M.H.; Morrison, C.; Higgs, J.D.; Akbari, F.; Lewis, B.J.; Thompson, W.T.; Gueneau, C.; Gosse, S.; Chatain, S.; Dumas, J.C.; Sundman, B.; Dupin, N.; Konings, R.; Noel, H.; Veshchunov, M.; Dubourg, R.; Ozrin, C.V.; Veshchunov, M.S.; Welland, M.T.; Blanc, V.; Michel, B.; Ricaud, J.M.; Calabrese, R.; Vettraino, F.; Tverberg, T.; Kissane, M.; Tulenko, J.; Stan, M.; Ramirez, J.C.; Cristea, P.; Rachid, J.; Kotomin, E.; Ciriello, A.; Rondinella, V.V.; Staicu, D.; Wiss, T.; Konings, R.; Somers, J.; Killeen, J

    2006-07-01

    The MMSNF Workshop series aims at stimulating research and discussions on models and simulations of nuclear fuels and coupling the results into fuel performance codes. This edition focused on materials science and engineering for fuel performance codes. The presentations were grouped in three technical sessions: fundamental modelling of fuel properties; integral fuel performance codes and their validation; collaborations and integration of activities. (A.L.B.)

  3. MMSNF 2005. Materials models and simulations for nuclear fuels

    International Nuclear Information System (INIS)

    Freyss, M.; Durinck, J.; Carlot, G.; Sabathier, C.; Martin, P.; Garcia, P.; Ripert, M.; Blanpain, P.; Lippens, M.; Schut, H.; Federov, A.V.; Bakker, K.; Osaka, M.; Miwa, S.; Sato, I.; Tanaka, K.; Kurosaki, K.; Uno, M.; Yamanaka, S.; Govers, K.; Verwerft, M.; Hou, M.; Lemehov, S.E.; Terentyev, D.; Govers, K.; Kotomin, E.A.; Ashley, N.J.; Grimes, R.W.; Van Uffelen, P.; Mastrikov, Y.; Zhukovskii, Y.; Rondinella, V.V.; Kurosaki, K.; Uno, M.; Yamanaka, S.; Minato, K.; Phillpot, S.; Watanabe, T.; Shukla, P.; Sinnott, S.; Nino, J.; Grimes, R.; Staicu, D.; Hiernaut, J.P.; Wiss, T.; Rondinella, V.V.; Ronchi, C.; Yakub, E.; Kaye, M.H.; Morrison, C.; Higgs, J.D.; Akbari, F.; Lewis, B.J.; Thompson, W.T.; Gueneau, C.; Gosse, S.; Chatain, S.; Dumas, J.C.; Sundman, B.; Dupin, N.; Konings, R.; Noel, H.; Veshchunov, M.; Dubourg, R.; Ozrin, C.V.; Veshchunov, M.S.; Welland, M.T.; Blanc, V.; Michel, B.; Ricaud, J.M.; Calabrese, R.; Vettraino, F.; Tverberg, T.; Kissane, M.; Tulenko, J.; Stan, M.; Ramirez, J.C.; Cristea, P.; Rachid, J.; Kotomin, E.; Ciriello, A.; Rondinella, V.V.; Staicu, D.; Wiss, T.; Konings, R.; Somers, J.; Killeen, J.

    2006-01-01

    The MMSNF Workshop series aims at stimulating research and discussions on models and simulations of nuclear fuels and coupling the results into fuel performance codes. This edition focused on materials science and engineering for fuel performance codes. The presentations were grouped in three technical sessions: fundamental modelling of fuel properties; integral fuel performance codes and their validation; collaborations and integration of activities. (A.L.B.)

  4. Four dimensional data assimilation (FDDA) impacts on WRF performance in simulating inversion layer structure and distributions of CMAQ-simulated winter ozone concentrations in Uintah Basin

    Science.gov (United States)

    Tran, Trang; Tran, Huy; Mansfield, Marc; Lyman, Seth; Crosman, Erik

    2018-03-01

    Four-dimensional data assimilation (FDDA) was applied in WRF-CMAQ model sensitivity tests to study the impact of observational and analysis nudging on model performance in simulating inversion layers and O3 concentration distributions within the Uintah Basin, Utah, U.S.A. in winter 2013. Observational nudging substantially improved WRF model performance in simulating surface wind fields, correcting a 10 °C warm surface temperature bias, correcting overestimation of the planetary boundary layer height (PBLH) and correcting underestimation of inversion strengths produced by regular WRF model physics without nudging. However, the combined effects of poor performance of WRF meteorological model physical parameterization schemes in simulating low clouds, and warm and moist biases in the temperature and moisture initialization and subsequent simulation fields, likely amplified the overestimation of warm clouds during inversion days when observational nudging was applied, impacting the resulting O3 photochemical formation in the chemistry model. To reduce the impact of a moist bias in the simulations on warm cloud formation, nudging with the analysis water mixing ratio above the planetary boundary layer (PBL) was applied. However, due to poor analysis vertical temperature profiles, applying analysis nudging also increased the errors in the modeled inversion layer vertical structure compared to observational nudging. Combining both observational and analysis nudging methods resulted in unrealistically extreme stratified stability that trapped pollutants at the lowest elevations at the center of the Uintah Basin and yielded the worst WRF performance in simulating inversion layer structure among the four sensitivity tests. The results of this study illustrate the importance of carefully considering the representativeness and quality of the observational and model analysis data sets when applying nudging techniques within stable PBLs, and the need to evaluate model results

  5. Dynamic modeling and simulation of power transformer maintenance costs

    Directory of Open Access Journals (Sweden)

    Ristić Olga

    2016-01-01

    The paper presents a dynamic model of the maintenance costs of the power transformer functional components. Reliability is modeled by combining the exponential and Weibull distributions. The simulation was performed for corrective maintenance and for the installation of a continuous monitoring system on the most critical components. The Simulation Dynamic System (SDS) method and the VENSIM PLE software were used to simulate the costs. In this way, significant savings in maintenance costs will be achieved with a small initial investment. [Project of the Ministry of Science of the Republic of Serbia, No. III 41025 and No. OI 171007]
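
    A hedged sketch of the kind of maintenance-cost simulation described above is given below: component lifetimes are drawn from exponential and Weibull distributions, and corrective-repair costs are accumulated over the transformer's life by Monte Carlo. The component list, distribution parameters and cost figures are assumptions made for illustration, not values from the cited study.

    ```python
    # Monte Carlo estimate of corrective-maintenance costs with mixed lifetime distributions.
    import numpy as np

    rng = np.random.default_rng(42)
    HORIZON_YEARS = 40.0
    N_RUNS = 10_000

    # (name, sampler for time-to-failure in years, corrective repair cost)
    components = [
        ("bushing",     lambda: rng.exponential(25.0),          15_000.0),
        ("tap_changer", lambda: rng.weibull(2.0) * 30.0,         40_000.0),  # shape 2, scale 30
        ("winding",     lambda: rng.weibull(3.0) * 45.0,        120_000.0),  # shape 3, scale 45
    ]

    def lifecycle_cost():
        """Total corrective-maintenance cost of one simulated transformer life."""
        total = 0.0
        for _, sample_ttf, repair_cost in components:
            t = sample_ttf()
            while t < HORIZON_YEARS:       # repair restores the component as-good-as-new
                total += repair_cost
                t += sample_ttf()
        return total

    costs = np.array([lifecycle_cost() for _ in range(N_RUNS)])
    print(f"mean cost {costs.mean():,.0f}, 95th percentile {np.percentile(costs, 95):,.0f}")
    ```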

  6. A New Model for Simulating TSS Washoff in Urban Areas

    Directory of Open Access Journals (Sweden)

    E. Crobeddu

    2011-01-01

    This paper presents the formulation and validation of the conceptual Runoff Quality Simulation Model (RQSM), which was developed to simulate the erosion and transport of solid particles in urban areas. The RQSM assumes that solid particle accumulation on pervious and impervious areas is infinite. The RQSM simulates soil erosion using rainfall kinetic energy and solid particle transport with linear system theory. A sensitivity analysis was conducted on the RQSM to show the influence of each parameter on the simulated load. Total suspended solid (TSS) loads monitored at the outlet of the borough of Verdun in Canada and at three catchment outlets of the City of Champaign in the United States were used to validate the RQSM. TSS loads simulated by the RQSM were compared to measured loads and to loads simulated by the Rating Curve model and the Exponential model of the SWMM software. The simulation performance of the RQSM was comparable to that of the Exponential and Rating Curve models.
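
    The two RQSM ingredients summarised above (detachment driven by rainfall kinetic energy and transport treated as a linear system) can be sketched as follows; the kinetic-energy relation, detachment coefficient and reservoir constant below are illustrative assumptions rather than calibrated RQSM parameters.

    ```python
    # Hedged sketch: kinetic-energy-driven detachment followed by linear-reservoir washoff.
    import numpy as np

    dt_h = 5.0 / 60.0                                          # 5-minute time step, in hours
    rain_mm_h = np.array([0, 5, 20, 35, 15, 5, 0, 0], float)   # hyetograph (mm/h)

    # Generic literature-style kinetic-energy relation e = a + b*log10(i) (J/m2 per mm of rain).
    intensity = np.clip(rain_mm_h, 1e-6, None)
    ke_per_mm = 8.95 + 8.44 * np.log10(intensity)
    ke_rate = np.clip(ke_per_mm, 0.0, None) * rain_mm_h        # J/m2/h

    detach_coeff = 0.002                                       # g of TSS detached per J (assumed)
    supply = detach_coeff * ke_rate * dt_h                     # g/m2 detached per step

    # Linear-reservoir transport: washoff load proportional to the stored detached mass.
    k_washoff = 1.5                                            # 1/h, reservoir constant (assumed)
    store, load = 0.0, []
    for s in supply:
        store += s
        out = store * (1.0 - np.exp(-k_washoff * dt_h))        # exact linear-reservoir release
        store -= out
        load.append(out)

    print("TSS washoff per step (g/m2):", np.round(load, 3))
    ```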

  7. OPNET Modeler simulations of performance for multi nodes wireless systems

    Directory of Open Access Journals (Sweden)

    Krupanek Beata

    2016-01-01

    This paper presents a study of Quality of Service in modern wireless sensor networks. Such networks are characterized by small amounts of data transmitted at fixed periods. Very often this data must be transmitted in real time, so data transmission delays should be well known. This article presents a multi-node network simulated in the packet-based simulator OPNET Modeler. Nowadays the quality of service is also very important, especially in multi-node systems such as home automation or measurement systems.

  8. Advanced Models and Algorithms for Self-Similar IP Network Traffic Simulation and Performance Analysis

    Science.gov (United States)

    Radev, Dimitar; Lokshina, Izabella

    2010-11-01

    The paper examines the self-similar (or fractal) properties of real communication network traffic data over a wide range of time scales. These self-similar properties are very different from the properties of traditional models based on Poisson and Markov-modulated Poisson processes. Advanced fractal models of sequential generators and fixed-length sequence generators, together with efficient algorithms used to simulate the self-similar behavior of IP network traffic data, are developed and applied. Numerical examples are provided, and simulation results are obtained and analyzed.
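
    One standard way to obtain approximately self-similar traffic, in the spirit of the generators discussed above, is to superpose ON/OFF sources with heavy-tailed (Pareto) period lengths. The sketch below is illustrative only; the tail index, number of sources and the crude variance-time check are assumptions, not the paper's models or algorithms.

    ```python
    # Approximate self-similar traffic by superposing heavy-tailed ON/OFF sources.
    import numpy as np

    def pareto_periods(rng, alpha, minimum, n):
        """Heavy-tailed period lengths; alpha in (1, 2) yields long-range dependence."""
        return minimum * (1.0 + rng.pareto(alpha, n))

    def onoff_source(rng, n_slots, alpha=1.5, rate=1.0):
        """Packets per slot from one ON/OFF source with Pareto ON and OFF periods."""
        out = np.zeros(n_slots)
        t, on = 0, rng.random() < 0.5
        while t < n_slots:
            length = int(pareto_periods(rng, alpha, 5.0, 1)[0])
            if on:
                out[t:t + length] = rate
            t += length
            on = not on
        return out

    rng = np.random.default_rng(7)
    traffic = sum(onoff_source(rng, 50_000) for _ in range(32))

    # Crude self-similarity check: variance of the aggregated series decays slower than 1/m.
    for m in (1, 10, 100):
        agg = traffic[: len(traffic) // m * m].reshape(-1, m).mean(axis=1)
        print(f"aggregation m={m:4d}  variance={agg.var():.3f}")
    ```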

  9. Leakage flow simulation in a specific pump model

    International Nuclear Information System (INIS)

    Dupont, P; Bayeul-Lainé, A C; Dazin, A; Bois, G; Roussette, O; Si, Q

    2014-01-01

    This paper deals with the influence of the leakage flow present in the SHF pump model on the internal flow behaviour inside the vane diffuser and on the pump model performance, using both experiments and calculations. PIV measurements have been performed at different hub-to-shroud planes inside one diffuser channel passage for a given speed of rotation and various flow rates. For each operating condition, the PIV measurements have been triggered at different angular impeller positions. The performance and the static pressure rise of the diffuser were also measured using a three-hole probe. The numerical simulations were carried out with the Star CCM+ 8.06 code (RANS frozen and unsteady calculations). Comparisons between numerical and experimental results are presented and discussed for three flow rates. The diffuser performance obtained from the numerical simulations is compared to that obtained from the three-hole probe measurements. The comparisons show little influence of the fluid leakage on the global performance but a real improvement concerning the efficiency of the impeller, the pump and the velocity distributions. These results show that leakage is an important parameter that has to be taken into account in order to make improved comparisons between numerical approaches and experiments in such a specific model set-up.

  10. Tsunami simulation using submarine displacement calculated from simulation of ground motion due to seismic source model

    Science.gov (United States)

    Akiyama, S.; Kawaji, K.; Fujihara, S.

    2013-12-01

    Since fault fracturing due to an earthquake can simultaneously cause ground motion and a tsunami, it is appropriate to evaluate the ground motion and the tsunami with a single fault model. However, separate source models are often used independently in ground motion simulation or tsunami simulation, because of the difficulty of evaluating both phenomena simultaneously. Many source models for the 2011 off the Pacific coast of Tohoku Earthquake have been proposed from inversion analyses of seismic observations or of tsunami observations. Most of these models show similar features, in which a large amount of slip is located at the shallower part of the fault area near the Japan Trench. This indicates that the ground motion and the tsunami can be evaluated with a single source model. Therefore, we examine the possibility of tsunami prediction using a fault model estimated from seismic observation records. In this study, we carry out a tsunami simulation using the displacement field of oceanic crustal movements calculated from a ground motion simulation of the 2011 off the Pacific coast of Tohoku Earthquake. We use two fault models by Yoshida et al. (2011), which are based on the teleseismic body wave and on strong ground motion records, respectively. Although those fault models share common features, the amount of slip near the Japan Trench is larger in the fault model from the strong ground motion records than in that from the teleseismic body wave. First, large-scale ground motion simulations applying those fault models are performed for the whole of eastern Japan using a voxel-type finite element method. The synthetic waveforms computed from the simulations are generally consistent with the observation records of K-NET (Kinoshita (1998)) and KiK-net stations (Aoi et al. (2000)), deployed by the National Research Institute for Earth Science and Disaster Prevention (NIED). Next, the tsunami simulations are performed by the finite

  11. Multi-scale modelling of supercapacitors: From molecular simulations to a transmission line model

    Science.gov (United States)

    Pean, C.; Rotenberg, B.; Simon, P.; Salanne, M.

    2016-09-01

    We perform molecular dynamics simulations of a typical nanoporous-carbon based supercapacitor. The organic electrolyte consists of 1-ethyl-3-methylimidazolium and hexafluorophosphate ions dissolved in acetonitrile. We simulate systems at equilibrium for various applied voltages. This allows us to determine the relevant thermodynamic (capacitance) and transport (in-pore resistivities) properties. These quantities are then injected into a transmission line model to test its ability to predict the charging properties of the device. The results from this macroscopic model are in good agreement with non-equilibrium molecular dynamics simulations, which validates its use for interpreting electrochemical impedance experiments.
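
    The transmission-line picture referred to above can be illustrated by discretising a single pore into N resistor-capacitor segments and charging it through the in-pore resistance. The R and C values below are placeholders for illustration, not the capacitances and resistivities extracted from the molecular dynamics simulations.

    ```python
    # Explicit-Euler charging of an RC transmission-line (ladder) model of a single pore.
    import numpy as np

    N = 50                       # ladder segments along the pore
    R_seg = 2.0                  # ohm, in-pore (ionic) resistance per segment
    C_seg = 1e-3                 # farad, double-layer capacitance per segment
    V_applied = 1.0              # volt applied at the pore mouth
    dt, n_steps = 2e-4, 100_000  # time step and horizon (20 s total)

    v = np.zeros(N)              # capacitor voltage of each segment
    for _ in range(n_steps):
        upstream = np.concatenate(([V_applied], v[:-1]))          # segment 0 sees the source
        i_in = (upstream - v) / R_seg                              # current arriving from upstream
        i_out = np.concatenate(((v[:-1] - v[1:]) / R_seg, [0.0]))  # blocked (open) pore end
        v += dt * (i_in - i_out) / C_seg

    charge = C_seg * v.sum()
    print(f"stored charge after {n_steps * dt:.0f} s: {charge * 1e3:.2f} mC "
          f"(full-charge limit {C_seg * N * V_applied * 1e3:.2f} mC)")
    ```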

  12. LISA Pathfinder E2E performance simulation: optical and self-gravity stability analysis

    Science.gov (United States)

    Brandt, N.; Fichter, W.; Kersten, M.; Lucarelli, S.; Montemurro, F.

    2005-05-01

    End-to-end (E2E) modelling and simulation, i.e. verifying the science performance of LISA Pathfinder (spacecraft and payload), is mandatory in order to minimize mission risks. In this paper, focus is on two particular applications of the E2E performance simulator currently being developed at EADS Astrium GmbH: the opto-dynamical stability and the self-gravity disturbance stability analysis. The E2E models applied here comprise the opto-dynamical modelling of the optical metrology systems (OMS) laser interferometry, the thermo-elastic distortion modelling of the OMS optical elements and the self-gravity disturbance model accounting for structural distortions. Preliminary analysis results are presented in detail, identifying shortcomings of the current LISA technology package (LTP) mounting baseline. As a consequence, the design is now being revised.

  13. LISA Pathfinder E2E performance simulation: optical and self-gravity stability analysis

    International Nuclear Information System (INIS)

    Brandt, N; Fichter, W; Kersten, M; Lucarelli, S; Montemurro, F

    2005-01-01

    End-to-end (E2E) modelling and simulation, i.e. verifying the science performance of LISA Pathfinder (spacecraft and payload), is mandatory in order to minimize mission risks. In this paper, focus is on two particular applications of the E2E performance simulator currently being developed at EADS Astrium GmbH: the opto-dynamical stability and the self-gravity disturbance stability analysis. The E2E models applied here comprise the opto-dynamical modelling of the optical metrology systems (OMS) laser interferometry, the thermo-elastic distortion modelling of the OMS optical elements and the self-gravity disturbance model accounting for structural distortions. Preliminary analysis results are presented in detail, identifying shortcomings of the current LISA technology package (LTP) mounting baseline. As a consequence, the design is now being revised

  14. Science-based HRA: experimental comparison of operator performance to IDAC (Information-Decision-Action Crew) simulations

    Energy Technology Data Exchange (ETDEWEB)

    Shirley, Rachel [The Ohio State Univ., Columbus, OH (United States); Smidts, Carol [The Ohio State Univ., Columbus, OH (United States); Boring, Ronald [Idaho National Lab. (INL), Idaho Falls, ID (United States); Li, Yuandan [Univ. of Maryland, College Park, MD (United States); Mosleh, Ali [Univ. of Maryland, College Park, MD (United States)

    2015-02-01

    Information-Decision-Action Crew (IDAC) operator model simulations of a Steam Generator Tube Rupture are compared to student operator performance in studies conducted in the Ohio State University’s Nuclear Power Plant Simulator Facility. This study is presented as a prototype for conducting simulator studies to validate key aspects of Human Reliability Analysis (HRA) methods. Seven student operator crews are compared to simulation results for crews designed to demonstrate three different decision-making strategies. The IDAC model used in the simulations is modified slightly to capture novice behavior rather than that of expert operators. Operator actions and scenario pacing are compared. A preliminary review of available performance shaping factors (PSFs) is presented. After the scenario in the NPP Simulator Facility, student operators review a video of the scenario and evaluate six PSFs at pre-determined points in the scenario. This provides a dynamic record of the PSFs experienced by the OSU student operators. In this preliminary analysis, the Time Constraint Load (TCL) calculated in the IDAC simulations is compared to the TCL reported by the student operators. We identify potential modifications to the IDAC model to develop an “IDAC Student Operator Model.” This analysis provides insights into how similar experiments could be conducted using expert operators to improve the fidelity of IDAC simulations.

  15. Modeling and Simulation of Long-Term Performance of Near-Surface Barriers

    International Nuclear Information System (INIS)

    Piet, S. J.; Jacobson, J. J.; Martian, P.; Martineau, R.; Soto, R.

    2003-01-01

    INEEL started a new project on long-term barrier integrity in April 2002 that aims to catalyze a Barrier Improvement Cycle (iterative learning and application) and thus enable Remediation System Performance Management (doing the right maintenance neither too early nor too late, prior to system-level failure). This paper describes our computer simulation approach for better understanding the relationships and dynamics between the various components and management decisions in a cap. The simulation is designed to clarify the complex relationships between the various components within the cap system and the various management practices that affect barrier performance. We have also conceptualized a time-dependent 3-D simulation with a rigorous solution of the unsaturated flow physics under complex surface boundary conditions.

  16. Extended behavioural device modelling and circuit simulation with Qucs-S

    Science.gov (United States)

    Brinson, M. E.; Kuznetsov, V.

    2018-03-01

    Current trends in circuit simulation suggest a growing interest in open source software that allows access to more than one simulation engine while simultaneously supporting schematic drawing tools, behavioural Verilog-A and XSPICE component modelling, and output data post-processing. This article introduces a number of new features recently implemented in the 'Quite universal circuit simulator - SPICE variant' (Qucs-S), including its structure and fundamental schematic capture algorithms, at the same time highlighting their use in behavioural semiconductor device modelling. Particular importance is placed on the interaction between Qucs-S schematics, equation-defined devices, SPICE B behavioural sources and hardware description language (HDL) scripts. The multi-simulator version of Qucs is a freely available tool that offers extended modelling and simulation features compared to those provided by legacy circuit simulators. The performance of a number of Qucs-S modelling extensions is demonstrated with a GaN HEMT compact device model and data obtained from tests using the Qucs-S/Ngspice/Xyce©/SPICE OPUS multi-engine circuit simulator.

  17. Aviation Safety Simulation Model

    Science.gov (United States)

    Houser, Scott; Yackovetsky, Robert (Technical Monitor)

    2001-01-01

    The Aviation Safety Simulation Model is a software tool that enables users to configure a terrain, a flight path, and an aircraft and simulate the aircraft's flight along the path. The simulation monitors the aircraft's proximity to terrain obstructions, and reports when the aircraft violates accepted minimum distances from an obstruction. This model design facilitates future enhancements to address other flight safety issues, particularly air and runway traffic scenarios. This report shows the user how to build a simulation scenario and run it. It also explains the model's output.

  18. A holistic 3D finite element simulation model for thermoelectric power generator element

    International Nuclear Information System (INIS)

    Wu, Guangxi; Yu, Xiong

    2014-01-01

    Highlights: • Development of a holistic simulation model for the thermoelectric energy harvester. • Accounts for variations in the delta Seebeck coefficient and carrier charge densities. • Solution of the thermo-electric coupling problem with the finite element method. • Model capable of predicting phenomena not captured by traditional models. • A simulation tool for the design of innovative TEM materials and structures. - Abstract: Harvesting the thermal energy stored in the ambient environment provides a potential sustainable energy source. Thermoelectric power generators have the advantages of no moving parts, durability, and light weight. These unique features are advantageous for many applications (i.e., carry-on medical devices, embedded infrastructure sensors, aerospace, transportation, etc.). To ensure the efficient application of thermoelectric energy harvesting systems, the behavior of such systems needs to be fully understood. Finite element simulations provide important tools for this purpose. Although modeling of the performance of thermoelectric modules has been conducted by many researchers, the influences of the effective Seebeck coefficient and carrier density variations on the performance of the thermoelectric system are generally neglected due to the complexity of solving the coupled problem. This results in an overestimation of power generator performance in the strong-ionization temperature region. This paper presents an advanced simulation model for thermoelectric elements that considers the effects of both factors. The mathematical basis of this model is first presented. Finite element simulations are then implemented on a thermoelectric power generator unit. The characteristics of the thermoelectric power generator and their relationship to its performance are discussed for different working temperature regions. The internal physics processes of the TEM harvester are analyzed from the results of computational simulations. The new model
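
    As a simple baseline against which a holistic model like the one described above can be compared, the sketch below evaluates a single thermoelectric leg with constant properties; the paper's point is precisely that temperature-dependent Seebeck coefficient and carrier density effects modify this picture, and the property values used here are illustrative assumptions only.

    ```python
    # Constant-property baseline for a single thermoelectric leg: power and efficiency vs load.
    import numpy as np

    S = 200e-6        # V/K, Seebeck coefficient (assumed constant)
    R_int = 0.01      # ohm, internal electrical resistance of the leg
    K_th = 0.05       # W/K, thermal conductance of the leg
    T_hot, T_cold = 500.0, 300.0
    dT = T_hot - T_cold

    loads = np.linspace(0.001, 0.05, 200)               # external load resistances (ohm)
    current = S * dT / (R_int + loads)
    p_out = current**2 * loads                           # electrical power delivered
    # Heat drawn at the hot side: Peltier + conduction - half the Joule heating.
    q_hot = S * T_hot * current + K_th * dT - 0.5 * current**2 * R_int
    efficiency = p_out / q_hot

    best = np.argmax(p_out)
    print(f"max power {p_out[best] * 1e3:.1f} mW at R_load = {loads[best] * 1e3:.1f} mohm "
          f"(matched-load rule: R_load close to R_int), efficiency {efficiency[best] * 100:.2f}%")
    ```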

  19. Performance Modeling and Optimization of a High Energy CollidingBeam Simulation Code

    Energy Technology Data Exchange (ETDEWEB)

    Shan, Hongzhang; Strohmaier, Erich; Qiang, Ji; Bailey, David H.; Yelick, Kathy

    2006-06-01

    An accurate modeling of the beam-beam interaction is essential to maximizing the luminosity in existing and future colliders. BeamBeam3D was the first parallel code that can be used to study this interaction fully self-consistently on high-performance computing platforms. Various all-to-all personalized communication (AAPC) algorithms dominate its communication patterns, for which we developed a sequence of performance models using a series of micro-benchmarks. We find that for SMP-based systems the most important performance constraint is node-adapter contention, while for 3D-torus topologies good performance models are not possible without considering link contention. The best average model prediction error is very low on SMP-based systems, with errors of 3% to 7%. On torus-based systems errors are higher at 29%, but optimized performance can again be predicted within 8% in some cases. These excellent results across five different systems indicate that this methodology for performance modeling can be applied to a large class of algorithms.
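
    The flavour of such analytical AAPC timing models can be conveyed with a hedged sketch: a latency/bandwidth term per pairwise-exchange step, multiplied by a node-adapter contention factor for SMP nodes. The functional form and parameter values below are placeholders for illustration, not the fitted models of the paper.

    ```python
    # Toy all-to-all personalized communication (AAPC) timing model with adapter contention.
    def aapc_time(p, msg_bytes, cores_per_node, alpha=5e-6, beta=1.0e9):
        """Estimated AAPC time in seconds.

        p              -- total number of MPI processes
        msg_bytes      -- per-destination message size
        cores_per_node -- processes sharing one network adapter (contention factor)
        alpha, beta    -- per-message latency (s) and adapter bandwidth (bytes/s)
        """
        steps = p - 1                            # pairwise-exchange algorithm
        contention = min(cores_per_node, p)      # senders sharing the same adapter
        return steps * (alpha + contention * msg_bytes / beta)

    for cores in (1, 4, 16):
        t = aapc_time(p=1024, msg_bytes=64 * 1024, cores_per_node=cores)
        print(f"{cores:2d} cores/node: {t * 1e3:.1f} ms")
    ```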

  20. Performance Modeling and Optimization of a High Energy Colliding Beam Simulation Code

    International Nuclear Information System (INIS)

    Shan, Hongzhang; Strohmaier, Erich; Qiang, Ji; Bailey, David H.; Yelick, Kathy

    2006-01-01

    An accurate modeling of the beam-beam interaction is essential to maximizing the luminosity in existing and future colliders. BeamBeam3D was the first parallel code that can be used to study this interaction fully self-consistently on high-performance computing platforms. Various all-to-all personalized communication (AAPC) algorithms dominate its communication patterns, for which we developed a sequence of performance models using a series of micro-benchmarks. We find that for SMP-based systems the most important performance constraint is node-adapter contention, while for 3D-torus topologies good performance models are not possible without considering link contention. The best average model prediction error is very low on SMP-based systems, at 3% to 7%. On torus-based systems errors are higher, around 29%, but optimized performance can again be predicted within 8% in some cases. These excellent results across five different systems indicate that this methodology for performance modeling can be applied to a large class of algorithms.

  1. Evaluation of outbreak detection performance using multi-stream syndromic surveillance for influenza-like illness in rural Hubei Province, China: a temporal simulation model based on healthcare-seeking behaviors.

    Directory of Open Access Journals (Sweden)

    Yunzhou Fan

    Full Text Available BACKGROUND: Syndromic surveillance promotes the early detection of disease outbreaks. Although syndromic surveillance has increased in developing countries, performance on outbreak detection, particularly in the case of multi-stream surveillance, has scarcely been evaluated in rural areas. OBJECTIVE: This study introduces a temporal simulation model based on healthcare-seeking behaviors to evaluate the performance of multi-stream syndromic surveillance for influenza-like illness. METHODS: Data were obtained in six towns of rural Hubei Province, China, from April 2012 to June 2013. A Susceptible-Exposed-Infectious-Recovered model generated 27 scenarios of simulated influenza A (H1N1) outbreaks, which were converted into corresponding simulated syndromic datasets through the healthcare-seeking behavior model. We then superimposed the converted syndromic datasets onto the baselines obtained from the observed data to create the testing datasets. Outbreak detection performance of single-stream surveillance (clinic visits, frequency of over-the-counter drug purchases, and school absenteeism) and of multi-stream surveillance of their combinations was evaluated using receiver operating characteristic curves and activity monitoring operation curves. RESULTS: In the six towns examined, clinic visit surveillance and school absenteeism surveillance exhibited better outbreak detection performance than over-the-counter drug purchase frequency surveillance; the performance of multi-stream surveillance was preferable to single-stream surveillance, particularly at low specificity (Sp <90%). CONCLUSIONS: The temporal simulation model based on healthcare-seeking behaviors offers an accessible method for evaluating the performance of multi-stream surveillance.
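
    The outbreak-generation step described above rests on a standard SEIR formulation. As an illustration only (the paper's parameter values and the healthcare-seeking conversion step are not reproduced here), a minimal deterministic SEIR trajectory can be simulated as follows; converting it to syndromic counts would then apply an assumed care-seeking probability to the daily curve.

```python
def seir_trajectory(beta, sigma, gamma, s0, e0, i0, days, dt=0.1):
    """Minimal deterministic SEIR model integrated with an explicit Euler scheme.

    beta: transmission rate, sigma: 1/latent period, gamma: 1/infectious period.
    Returns the daily number of infectious individuals (prevalence).
    """
    n = s0 + e0 + i0
    s, e, i, r = float(s0), float(e0), float(i0), 0.0
    daily_prevalence = []
    steps_per_day = int(round(1.0 / dt))
    for _ in range(days):
        for _ in range(steps_per_day):
            new_e = beta * s * i / n * dt
            new_i = sigma * e * dt
            new_r = gamma * i * dt
            s, e, i, r = s - new_e, e + new_e - new_i, i + new_i - new_r, r + new_r
        daily_prevalence.append(i)
    return daily_prevalence

# Hypothetical parameters, for illustration only.
curve = seir_trajectory(beta=0.6, sigma=1 / 2.0, gamma=1 / 3.0,
                        s0=9990, e0=5, i0=5, days=60)
```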

  2. In Patients with Cirrhosis, Driving Simulator Performance is Associated With Real-life Driving

    DEFF Research Database (Denmark)

    Lauridsen, Mette Enok Munk; Thacker, Leroy R; White, Melanie B

    2016-01-01

    BACKGROUND & AIMS: Minimal hepatic encephalopathy (MHE) has been linked to higher real-life rates of automobile crashes and poor performance in driving simulation studies, but the link between driving simulator performance and real-life automobile crashes has not been clearly established. Further......, not all patients with MHE are unsafe drivers, but it is unclear how to distinguish them from unsafe drivers. We investigated the link between performance on driving simulators and real-life automobile accidents and traffic violations. We also aimed to identify features of unsafe drivers with cirrhosis...... and evaluated changes in simulated driving skills and MHE status after 1 year. METHODS: We performed a study of outpatients with cirrhosis (n=205; median 55 years old; median model for end-stage liver disease score, 9.5; none with overt hepatic encephalopathy or alcohol or illicit drug use within previous 6...

  3. Cognitive models embedded in system simulation models

    International Nuclear Information System (INIS)

    Siegel, A.I.; Wolf, J.J.

    1982-01-01

    If we are to discuss and consider cognitive models, we must first come to grips with two questions: (1) What is cognition; (2) What is a model. Presumably, the answers to these questions can provide a basis for defining a cognitive model. Accordingly, this paper first places these two questions into perspective. Then, cognitive models are set within the context of computer simulation models and a number of computer simulations of cognitive processes are described. Finally, pervasive issues are discussed vis-a-vis cognitive modeling in the computer simulation context

  4. A Simulation Study: The Impact of Random and Realistic Mobility Models on the Performance of Bypass-AODV in Ad Hoc Wireless Networks

    Directory of Open Access Journals (Sweden)

    Baroudi Uthman

    2010-01-01

    Full Text Available To bring VANET into reality, it is crucial to devise routing protocols that can exploit the inherent characteristics of the VANET environment to enhance the performance of the running applications. Previous studies have shown that a given routing protocol behaves differently under different presumed mobility patterns. Bypass-AODV is a new optimization of the AODV routing protocol for mobile ad-hoc networks. It is proposed as a local recovery mechanism to enhance the performance of the AODV routing protocol. It shows outstanding performance under the Random Waypoint mobility model compared with AODV. However, Random Waypoint is a simple model that may be applicable to some scenarios, but it is not sufficient to capture some important mobility characteristics of the scenarios where VANETs are deployed. In this paper, we investigate the performance of Bypass-AODV under a wide range of mobility models including other random mobility models, group mobility models, and vehicular mobility models. Simulation results show an interesting feature: Bypass-AODV is insensitive to the choice of random mobility model, and it shows a clear performance improvement compared to AODV. For group mobility models, both protocols show comparable performance, but for vehicular mobility models, Bypass-AODV suffers from performance degradation in high-speed conditions.

  5. Hypersonic Combustor Model Inlet CFD Simulations and Experimental Comparisons

    Science.gov (United States)

    Venkatapathy, E.; TokarcikPolsky, S.; Deiwert, G. S.; Edwards, Thomas A. (Technical Monitor)

    1995-01-01

    Numerous two- and three-dimensional computational simulations were performed for the inlet associated with the combustor model for the hypersonic propulsion experiment in the NASA Ames 16-Inch Shock Tunnel. The inlet was designed to produce a combustor-inlet flow that is nearly two-dimensional and of sufficient mass flow rate for large-scale combustor testing. The three-dimensional simulations demonstrated that the inlet design met all the design objectives and that the inlet produced a very nearly two-dimensional combustor inflow profile. Numerous two-dimensional simulations were performed with various levels of approximation, such as in the choice of chemical and physical models, as well as numerical approximations. Parametric studies were conducted to better understand and to characterize the inlet flow. Results from the two- and three-dimensional simulations were used to predict the mass flux entering the combustor, and a mass flux correlation as a function of facility stagnation pressure was developed. Surface heat flux and pressure measurements were compared with the computed results and good agreement was found. The computational simulations helped determine the inlet flow characteristics in the high-enthalpy environment, the important parameters that affect the combustor-inlet flow, and the sensitivity of the inlet flow to various modeling assumptions.

  6. Crystal and molecular simulation of high-performance polymers.

    Science.gov (United States)

    Colquhoun, H M; Williams, D J

    2000-03-01

    Single-crystal X-ray analyses of oligomeric models for high-performance aromatic polymers, interfaced to computer-based molecular modeling and diffraction simulation, have enabled the determination of a range of previously unknown polymer crystal structures from X-ray powder data. Materials which have been successfully analyzed using this approach include aromatic polyesters, polyetherketones, polythioetherketones, polyphenylenes, and polycarboranes. Pure macrocyclic homologues of noncrystalline polyethersulfones afford high-quality single crystals, even at very large ring sizes, and have provided the first examples of a "protein crystallographic" approach to the structures of conventionally amorphous synthetic polymers.

  7. Speech Perception With Combined Electric-Acoustic Stimulation: A Simulation and Model Comparison.

    Science.gov (United States)

    Rader, Tobias; Adel, Youssef; Fastl, Hugo; Baumann, Uwe

    2015-01-01

    The aim of this study is to simulate speech perception with combined electric-acoustic stimulation (EAS), verify the advantage of combined stimulation in normal-hearing (NH) subjects, and then compare it with cochlear implant (CI) and EAS user results from the authors' previous study. Furthermore, an automatic speech recognition (ASR) system was built to examine the impact of low-frequency information and is proposed as an applied model to study different hypotheses of the combined-stimulation advantage. Signal-detection-theory (SDT) models were applied to assess predictions of subject performance without the need to assume any synergistic effects. Speech perception was tested using a closed-set matrix test (Oldenburg sentence test), and its speech material was processed to simulate CI and EAS hearing. A total of 43 NH subjects and a customized ASR system were tested. CI hearing was simulated by an aurally adequate signal spectrum analysis and representation, the part-tone-time-pattern, which was vocoded at 12 center frequencies according to the MED-EL DUET speech processor. Residual acoustic hearing was simulated by low-pass (LP)-filtered speech with cutoff frequencies 200 and 500 Hz for NH subjects and in the range from 100 to 500 Hz for the ASR system. Speech reception thresholds were determined in amplitude-modulated noise and in pseudocontinuous noise. Previously proposed SDT models were lastly applied to predict NH subject performance with EAS simulations. NH subjects tested with EAS simulations demonstrated the combined-stimulation advantage. Increasing the LP cutoff frequency from 200 to 500 Hz significantly improved speech reception thresholds in both noise conditions. In continuous noise, CI and EAS users showed generally better performance than NH subjects tested with simulations. In modulated noise, performance was comparable except for the EAS at cutoff frequency 500 Hz where NH subject performance was superior. The ASR system showed similar behavior

  8. Implementation of angular response function modeling in SPECT simulations with GATE

    International Nuclear Information System (INIS)

    Descourt, P; Visvikis, D; Carlier, T; Bardies, M; Du, Y; Song, X; Frey, E C; Tsui, B M W; Buvat, I

    2010-01-01

    Among Monte Carlo simulation codes in medical imaging, the GATE simulation platform is widely used today given its flexibility and accuracy, despite long run times, which in SPECT simulations are mostly spent in tracking photons through the collimators. In this work, a tabulated model of the collimator/detector response was implemented within the GATE framework to significantly reduce the simulation times in SPECT. This implementation uses the angular response function (ARF) model. The performance of the implemented ARF approach has been compared to standard SPECT GATE simulations in terms of the ARF tables' accuracy, overall SPECT system performance and run times. Considering the simulation of the Siemens Symbia T SPECT system using high-energy collimators, differences of less than 1% were measured between the ARF-based and the standard GATE-based simulations, while considering the same noise level in the projections, acceleration factors of up to 180 were obtained when simulating a planar 364 keV source seen with the same SPECT system. The ARF-based and the standard GATE simulation results also agreed very well when considering a four-head SPECT simulation of a realistic Jaszczak phantom filled with iodine-131, with a resulting acceleration factor of 100. In conclusion, the implementation of an ARF-based model of collimator/detector response for SPECT simulations within GATE significantly reduces the simulation run times without compromising accuracy. (note)

  9. Implementation of angular response function modeling in SPECT simulations with GATE

    Energy Technology Data Exchange (ETDEWEB)

    Descourt, P; Visvikis, D [INSERM, U650, LaTIM, IFR SclnBioS, Universite de Brest, CHU Brest, Brest, F-29200 (France); Carlier, T; Bardies, M [CRCNA INSERM U892, Nantes (France); Du, Y; Song, X; Frey, E C; Tsui, B M W [Department of Radiology, J Hopkins University, Baltimore, MD (United States); Buvat, I, E-mail: dimitris@univ-brest.f [IMNC-UMR 8165 CNRS Universites Paris 7 et Paris 11, Orsay (France)

    2010-05-07

    Among Monte Carlo simulation codes in medical imaging, the GATE simulation platform is widely used today given its flexibility and accuracy, despite long run times, which in SPECT simulations are mostly spent in tracking photons through the collimators. In this work, a tabulated model of the collimator/detector response was implemented within the GATE framework to significantly reduce the simulation times in SPECT. This implementation uses the angular response function (ARF) model. The performance of the implemented ARF approach has been compared to standard SPECT GATE simulations in terms of the ARF tables' accuracy, overall SPECT system performance and run times. Considering the simulation of the Siemens Symbia T SPECT system using high-energy collimators, differences of less than 1% were measured between the ARF-based and the standard GATE-based simulations, while considering the same noise level in the projections, acceleration factors of up to 180 were obtained when simulating a planar 364 keV source seen with the same SPECT system. The ARF-based and the standard GATE simulation results also agreed very well when considering a four-head SPECT simulation of a realistic Jaszczak phantom filled with iodine-131, with a resulting acceleration factor of 100. In conclusion, the implementation of an ARF-based model of collimator/detector response for SPECT simulations within GATE significantly reduces the simulation run times without compromising accuracy. (note)

  10. Photovoltaic performance models - A report card

    Science.gov (United States)

    Smith, J. H.; Reiter, L. R.

    1985-01-01

    Models for the analysis of photovoltaic (PV) systems' designs, implementation policies, and economic performance have proliferated while keeping pace with rapid changes in basic PV technology and the extensive empirical data compiled for such systems' performance. Attention is presently given to the results of a comparative assessment of ten well-documented and widely used models, which range in complexity from first-order approximations of PV system performance to in-depth, circuit-level characterizations. The comparisons were made on the basis of the performance of their subsystem, as well as system, elements. The models fall into three categories in light of their degree of aggregation into subsystems: (1) simplified models for first-order calculation of system performance, with easily met input requirements but limited capability to address more than a small variety of design considerations; (2) models simulating PV systems in greater detail, encompassing types primarily intended for either concentrator-incorporating or flat-plate collector PV systems; and (3) models not specifically designed for PV system performance modeling, but applicable to aspects of electrical system design. Models ignoring subsystem failure or degradation are noted to exclude operating and maintenance characteristics as well.
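
    For the first category of models, a first-order estimate of array output needs little more than a nameplate rating, the plane-of-array irradiance, a linear temperature coefficient, and a lumped derate factor. The sketch below illustrates that level of approximation; the rating, coefficient and derate values are placeholders, not figures taken from any of the ten models assessed.

```python
def pv_dc_power(irradiance_w_m2, cell_temp_c,
                p_rated_w=3000.0, gamma_per_c=-0.004, derate=0.85):
    """First-order PV output: scale the rating at standard test conditions
    (1000 W/m2, 25 C cell temperature) by irradiance, a linear temperature
    correction and a lumped derate for wiring, mismatch and soiling losses."""
    return (p_rated_w
            * irradiance_w_m2 / 1000.0
            * (1.0 + gamma_per_c * (cell_temp_c - 25.0))
            * derate)

# Example: a bright, hot afternoon.
print(round(pv_dc_power(irradiance_w_m2=850.0, cell_temp_c=55.0), 1))
```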

  11. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    Science.gov (United States)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized, network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
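
    A minimal version of the demand-versus-resource-constraint structure described above can be written as a multi-server queue: service requests arrive according to a probability distribution and wait whenever all provisioned servers are busy. The sketch below is an illustration of that structure, not the authors' model; the exponential distributions and parameter values are assumptions.

```python
import heapq
import random

def mean_wait(num_servers, arrival_rate, service_rate, n_requests, seed=1):
    """FIFO multi-server queue: each request is handled by the server that
    frees up earliest; the heap stores the next-free times of the servers."""
    rng = random.Random(seed)
    free_at = [0.0] * num_servers
    heapq.heapify(free_at)
    t, total_wait = 0.0, 0.0
    for _ in range(n_requests):
        t += rng.expovariate(arrival_rate)        # next request arrives
        earliest = heapq.heappop(free_at)         # earliest-available server
        start = max(t, earliest)                  # wait only if all servers are busy
        heapq.heappush(free_at, start + rng.expovariate(service_rate))
        total_wait += start - t
    return total_wait / n_requests

# Example: 20 virtual servers, 15 requests/s, mean service time of 1 s.
print(mean_wait(num_servers=20, arrival_rate=15.0, service_rate=1.0,
                n_requests=100_000))
```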

  12. Modeling and Simulation of Claus Unit Reaction Furnace

    Directory of Open Access Journals (Sweden)

    Maryam Pahlavan

    2016-01-01

    Full Text Available The reaction furnace is the most important part of the Claus sulfur recovery unit and its performance has a significant impact on the process efficiency. Many reactions take place in the furnace and their kinetics and mechanisms are not completely understood; therefore, modeling the reaction furnace is difficult, and several studies have addressed this topic so far. Equilibrium models are commonly used to simulate the furnace, but the related literature states that the outlet of the furnace is not in equilibrium and the furnace reactions are controlled by kinetic laws; therefore, in this study, the reaction furnace is simulated by a kinetic model. The outlet temperature and concentrations predicted by this model are compared with experimental data published in the literature and with the data obtained by the PROMAX V2.0 simulator. The results show that the accuracy of the proposed kinetic model and the PROMAX simulator is similar, but the kinetic model used in this paper has two important capabilities. Firstly, it is a distributed model and can be used to obtain the temperature and concentration profiles along the furnace. Secondly, it is a dynamic model and can be used for analyzing the transient behavior and designing the control system.
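
    The "distributed" character of the kinetic model means that composition is resolved along the furnace length rather than lumped at the outlet. The following sketch shows that idea for a single, hypothetical first-order reaction in an isothermal plug-flow element; it is not the Claus chemistry or the kinetic model of the paper.

```python
def plug_flow_profile(c_in, k, velocity, length, n_cells=200):
    """Axial concentration profile for A -> products with first-order rate k,
    integrating dC/dz = -k * C / u cell by cell (explicit Euler)."""
    dz = length / n_cells
    c = c_in
    profile = [c]
    for _ in range(n_cells):
        c += -k * c / velocity * dz
        profile.append(c)
    return profile

# Hypothetical values: 0.1 kmol/m3 feed, k = 5 1/s, 10 m/s gas velocity, 8 m furnace.
outlet_concentration = plug_flow_profile(c_in=0.1, k=5.0, velocity=10.0, length=8.0)[-1]
```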

  13. NRTA simulation by modeling PFPF

    International Nuclear Information System (INIS)

    Asano, Takashi; Fujiwara, Shigeo; Takahashi, Saburo; Shibata, Junichi; Totsu, Noriko

    2003-01-01

    In PFPF, the NRTA system has been applied since 1991. Evaluation of the facility material accountancy data provided by the operator at each IIV has confirmed that no significant MUF was generated. For throughput at the PFPF scale, MUF can be evaluated with a sufficient detection probability by the present NRTA evaluation method. However, as throughput increases, the uncertainty of the material accountancy will increase and the detection probability will decline. The relationship between increasing throughput and declining detection probability, as well as the maximum throughput for which the following measures still provide a sufficient detection probability, were evaluated by simulation of the NRTA system. This simulation was performed by modeling PFPF. The measures for increasing the detection probability are: shortening of the evaluation interval, and segmentation of the evaluation area. This report shows the results of these simulations. (author)
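
    The trade-off explored by such a simulation can be illustrated with a toy Monte Carlo calculation: the measured MUF is a true loss plus a measurement error whose standard deviation grows with throughput, and an alarm is raised when the MUF exceeds a threshold. The numbers below are placeholders, not PFPF data or the actual NRTA evaluation procedure.

```python
import random

def detection_probability(true_loss, sigma_muf, threshold, trials=20_000, seed=0):
    """Fraction of accounting periods in which the measured MUF
    (= true loss + measurement error) exceeds the alarm threshold."""
    rng = random.Random(seed)
    alarms = sum(1 for _ in range(trials)
                 if true_loss + rng.gauss(0.0, sigma_muf) > threshold)
    return alarms / trials

# As throughput grows, sigma_muf grows and the detection probability falls
# for the same loss and the same false-alarm-limited threshold.
for sigma in (1.0, 2.0, 4.0):
    print(sigma, detection_probability(true_loss=8.0, sigma_muf=sigma,
                                       threshold=3.3 * sigma))
```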

  14. Fracture network modeling and GoldSim simulation support

    International Nuclear Information System (INIS)

    Sugita, Kenichiro; Dershowitz, William

    2003-01-01

    During Heisei-14, Golder Associates provided support for JNC Tokai through data analysis and simulation of the MIU Underground Rock Laboratory, participation in Task 6 of the Aespoe Task Force on Modelling of Groundwater Flow and Transport, and analysis of repository safety assessment technologies including cell networks for evaluation of the disturbed rock zone (DRZ) and total systems performance assessment (TSPA). MIU Underground Rock Laboratory support during H-14 involved discrete fracture network (DFN) modelling in support of the Multiple Modelling Project (MMP) and the Long Term Pumping Test (LPT). Golder developed updated DFN models for the MIU site, reflecting updated analyses of fracture data. Golder also developed scripts to support JNC simulations of flow and transport pathways within the MMP. Golder supported JNC participation in Task 6 of the Aespoe Task Force on Modelling of Groundwater Flow and Transport during H-14. Tasks 6A and 6B compared safety assessment (PA) and experimental time scale simulations along a pipe transport pathway. Task 6B2 extended the Task 6B simulations from 1-D to 2-D. For Task 6B2, Golder carried out single-fracture transport simulations on a wide variety of generic heterogeneous 2D fractures using both experimental and safety assessment boundary conditions. The heterogeneous 2D fractures were implemented according to a variety of in-plane heterogeneity patterns. Multiple immobile zones were considered, including stagnant zones, infillings, altered wall rock, and intact rock. During H-14, JNC carried out extensive studies of the disturbed rock zone (DRZ) surrounding repository tunnels and drifts. Golder supported this activity by evaluating the calculation time necessary for simulating a reference heterogeneous DRZ cell network for a range of computational strategies. To support the development of JNC's total system performance assessment (TSPA) strategy, Golder carried out a review of the US DOE Yucca Mountain Project TSPA. This

  15. Control-relevant modeling and simulation of a SOFC-GT hybrid system

    OpenAIRE

    Rambabu Kandepu; Lars Imsland; Christoph Stiller; Bjarne A. Foss; Vinay Kariwala

    2006-01-01

    In this paper, control-relevant models of the most important components in a SOFC-GT hybrid system are described. Dynamic simulations are performed on the overall hybrid system. The model is used to develop a simple control structure, but the simulations show that more elaborate control is needed.

  16. Management of Industrial Performance Indicators: Regression Analysis and Simulation

    Directory of Open Access Journals (Sweden)

    Walter Roberto Hernandez Vergara

    2017-11-01

    Full Text Available Stochastic methods can be used in problem solving and in the explanation of natural phenomena through the application of statistical procedures. The article aims to combine regression analysis and systems simulation in order to facilitate the practical understanding of data analysis. The algorithms were developed in Microsoft Office Excel, using statistical techniques such as regression theory, ANOVA and Cholesky factorization, which made it possible to create models of single and multiple systems with up to five independent variables. For the analysis of these models, Monte Carlo simulation and the analysis of industrial performance indicators were used, resulting in numerical indices intended to improve the management of goals for compliance indicators by identifying system instability, correlations and anomalies. The analytical models presented in the study indicated satisfactory results, with numerous possibilities for industrial and academic applications as well as the potential for deployment in new analytical techniques.
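
    The role of the Cholesky factorization in such a tool is to impose a target correlation structure on the random inputs of the Monte Carlo runs. A minimal sketch of that step is given below; the correlation coefficient, means and standard deviations are illustrative values, not data from the article.

```python
import numpy as np

rng = np.random.default_rng(42)

# Assumed target correlation between two performance indicators.
corr = np.array([[1.0, 0.6],
                 [0.6, 1.0]])
L = np.linalg.cholesky(corr)

z = rng.standard_normal((100_000, 2))     # independent standard normals
x = z @ L.T                               # correlated standard normals

# Scale to illustrative indicator distributions (std, mean).
indicators = x * np.array([5.0, 0.8]) + np.array([95.0, 12.0])

print(np.corrcoef(indicators, rowvar=False)[0, 1])   # close to the target 0.6
print((indicators[:, 0] < 90.0).mean())              # probability of missing a goal of 90
```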

  17. Theoretical Models of Protostellar Binary and Multiple Systems with AMR Simulations

    Science.gov (United States)

    Matsumoto, Tomoaki; Tokuda, Kazuki; Onishi, Toshikazu; Inutsuka, Shu-ichiro; Saigo, Kazuya; Takakuwa, Shigehisa

    2017-05-01

    We present theoretical models for protostellar binary and multiple systems based on the high-resolution numerical simulation with an adaptive mesh refinement (AMR) code, SFUMATO. The recent ALMA observations have revealed early phases of the binary and multiple star formation with high spatial resolutions. These observations should be compared with theoretical models with high spatial resolutions. We present two theoretical models for (1) a high density molecular cloud core, MC27/L1521F, and (2) a protobinary system, L1551 NE. For the model for MC27, we performed numerical simulations for gravitational collapse of a turbulent cloud core. The cloud core exhibits fragmentation during the collapse, and dynamical interaction between the fragments produces an arc-like structure, which is one of the prominent structures observed by ALMA. For the model for L1551 NE, we performed numerical simulations of gas accretion onto protobinary. The simulations exhibit asymmetry of a circumbinary disk. Such asymmetry has been also observed by ALMA in the circumbinary disk of L1551 NE.

  18. GENERAL REQUIREMENTS FOR SIMULATION MODELS IN WASTE MANAGEMENT

    International Nuclear Information System (INIS)

    Miller, Ian; Kossik, Rick; Voss, Charlie

    2003-01-01

    Most waste management activities are decided upon and carried out in a public or semi-public arena, typically involving the waste management organization, one or more regulators, and often other stakeholders and members of the public. In these environments, simulation modeling can be a powerful tool in reaching a consensus on the best path forward, but only if the models that are developed are understood and accepted by all of the parties involved. These requirements for understanding and acceptance of the models constrain the appropriate software and model development procedures that are employed. This paper discusses requirements for both simulation software and for the models that are developed using the software. Requirements for the software include transparency, accessibility, flexibility, extensibility, quality assurance, ability to do discrete and/or continuous simulation, and efficiency. Requirements for the models that are developed include traceability, transparency, credibility/validity, and quality control. The paper discusses these requirements with specific reference to the requirements for performance assessment models that are used for predicting the long-term safety of waste disposal facilities, such as the proposed Yucca Mountain repository

  19. An individual-based probabilistic model for simulating fisheries population dynamics

    Directory of Open Access Journals (Sweden)

    Jie Cao

    2016-12-01

    Full Text Available The purpose of stock assessment is to support managers in making informed decisions regarding removals from fish populations. Errors in assessment models may have devastating impacts on population fitness and negative impacts on the economy of the resource users. Thus, accurate estimates of population size and growth rates are critical for success. Evaluating and testing the behavior and performance of stock assessment models, and assessing the consequences of model mis-specification and the impact of management strategies, require an operating model that accurately describes the dynamics of the target species and can resolve spatial and seasonal changes. In addition, the most thorough evaluations of assessment models use an operating model that takes a different form than the assessment model. This paper presents an individual-based probabilistic model used to simulate the complex dynamics of populations and their associated fisheries. Various components of population dynamics are expressed as random Bernoulli trials in the model, and detailed life and fishery histories of each individual are tracked over their life span. The simulation model is designed to be flexible so it can be used for different species and fisheries. It can simulate mixing among multiple stocks and link stock-recruit relationships to environmental factors. Furthermore, the model allows for flexibility in sub-models (e.g., growth and recruitment) and model assumptions (e.g., age- or size-dependent selectivity). This model enables the user to conduct various simulation studies, including testing the performance of assessment models under different assumptions, assessing the impacts of model mis-specification and evaluating management strategies.
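
    The Bernoulli-trial structure described above can be sketched in a few lines: each individual's capture, natural survival and reproduction in a year are independent coin flips. The rates and rules below are illustrative assumptions and far simpler than the operating model of the paper.

```python
import random

def simulate_population(n0, capture_prob, survival_prob, recruit_prob, years, seed=7):
    """Toy individual-based model; returns (population size, catch) per year."""
    rng = random.Random(seed)
    ages = [1] * n0
    history = []
    for _ in range(years):
        survivors, catch = [], 0
        for age in ages:
            if rng.random() < capture_prob:          # removed by the fishery
                catch += 1
            elif rng.random() < survival_prob:       # natural survival
                survivors.append(age + 1)
        adults = sum(1 for a in survivors if a >= 3)  # assumed age at maturity
        recruits = sum(1 for _ in range(adults) if rng.random() < recruit_prob)
        ages = survivors + [1] * recruits
        history.append((len(ages), catch))
    return history

print(simulate_population(n0=5000, capture_prob=0.15, survival_prob=0.75,
                          recruit_prob=0.9, years=20)[-1])
```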

  20. Modeling and simulation of complex systems a framework for efficient agent-based modeling and simulation

    CERN Document Server

    Siegfried, Robert

    2014-01-01

    Robert Siegfried presents a framework for efficient agent-based modeling and simulation of complex systems. He compares different approaches for describing the structure and dynamics of agent-based models in detail. Based on this evaluation the author introduces the "General Reference Model for Agent-based Modeling and Simulation" (GRAMS). Furthermore he presents parallel and distributed simulation approaches for the execution of agent-based models, from small scale to very large scale. The author shows how agent-based models may be executed by different simulation engines that utilize the underlying hardware.

  1. Verifying and Validating Simulation Models

    Energy Technology Data Exchange (ETDEWEB)

    Hemez, Francois M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-02-23

    This presentation is a high-level discussion of the Verification and Validation (V&V) of computational models. Definitions of V&V are given to emphasize that “validation” is never performed in a vacuum; it accounts, instead, for the current state of knowledge in the discipline considered. In particular, comparisons between physical measurements and numerical predictions should account for their respective sources of uncertainty. The differences between error (bias), aleatoric uncertainty (randomness) and epistemic uncertainty (ignorance, lack of knowledge) are briefly discussed. Four types of uncertainty in physics and engineering are discussed: 1) experimental variability, 2) variability and randomness, 3) numerical uncertainty and 4) model-form uncertainty. Statistical sampling methods are available to propagate, and analyze, variability and randomness. Numerical uncertainty originates from the truncation error introduced by the discretization of partial differential equations in time and space. Model-form uncertainty is introduced by assumptions often formulated to render a complex problem more tractable and amenable to modeling and simulation. The discussion concludes with high-level guidance to assess the “credibility” of numerical simulations, which stems from the level of rigor with which these various sources of uncertainty are assessed and quantified.

  2. TRSM-a thermal-hydraulic real-time simulation model for PWR

    International Nuclear Information System (INIS)

    Zhou Weichang

    1997-01-01

    TRSM (a Thermal-hydraulic Real-time Simulation Model) has been developed for PWR real-time simulation and best-estimate prediction of normal operating and abnormal accident conditions. It is a non-equilibrium two-phase flow thermal-hydraulic model based on five basic conservation equations. A drift-flux model is used to account for the unequal velocities of the liquid and gas phases, with or without the presence of noncondensables. Critical flow models are applied for break flow and valve flow calculations. A five-regime two-phase heat convection model is applied for clad-to-coolant as well as fluid-to-tubing heat transfer. A rigorous reactor coolant pump model is used to calculate the pressure drop and rise at the suction and discharge ends, with complete pump characteristic curves included. The TRSM model has been adapted in the full-scale training simulator of the Qinshan Nuclear Power Plant 300 MW unit to simulate the thermal-hydraulic performance of the NSSS. The simulation results of a cold leg LOCA and a steam generator tube rupture (SGTR) accident are presented.
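
    For reference, drift-flux closures of the kind mentioned here typically relate the gas velocity to the mixture volumetric flux; a generic form is shown below (the specific correlation implemented in TRSM is not given in the abstract):

```latex
u_g \;=\; C_0\, j \;+\; \bar{v}_{gj},
\qquad
j \;=\; \alpha\, u_g + (1-\alpha)\, u_l ,
```

    where \(\alpha\) is the void fraction, \(j\) the mixture volumetric flux, \(C_0\) the distribution parameter and \(\bar{v}_{gj}\) the drift velocity, the latter two being supplied by flow-regime-dependent correlations.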

  3. Control-relevant modeling and simulation of a SOFC-GT hybrid system

    Directory of Open Access Journals (Sweden)

    Rambabu Kandepu

    2006-07-01

    Full Text Available In this paper, control-relevant models of the most important components in a SOFC-GT hybrid system are described. Dynamic simulations are performed on the overall hybrid system. The model is used to develop a simple control structure, but the simulations show that more elaborate control is needed.

  4. Simulation of performance of centrifugal circulators with vaneless diffuser for GCR applications

    International Nuclear Information System (INIS)

    Tauveron, N.; Dor, I.

    2010-01-01

    In the frame of the Generation IV international forum, CEA has selected various innovative concepts of gas-cooled nuclear reactors. Thermal-hydraulic performance is a key issue for the design. For transient conditions and decay heat removal situations, the thermal-hydraulic performance must remain as high as possible. In this context, all the transient situations and the incidental and accidental scenarios must be evaluated by a validated system code able to correctly describe, in particular, the thermal hydraulics of the whole plant. As the concepts use a helium compressor to maintain the flow in the core, special emphasis must be placed on compressor modelling. Centrifugal circulators with a vaneless diffuser have attractive properties in terms of simplicity, cost, and the ability to operate over a wide range of conditions. The objective of this paper is to present a dedicated description of the centrifugal compressor, based on a one-dimensional approach. This type of model requires various correlations as input data. The present contribution consists in establishing and validating the numerical simulations (including different sets of correlations) by comparison with representative experimental data. The results obtained show a qualitatively correct behaviour of the model compared to open-literature cases from the aircraft gas turbine community and helium circulators of high-temperature gas reactors. The model is finally used in a depressurised transient simulation of a small-power gas-cooled fast reactor (the ALLEGRO concept). Advantages of this model over first preliminary simulations are shown. Further work on modelling and validation is nevertheless needed to gain better confidence in the simulation predictions.

  5. Simulation of performance of centrifugal circulators with vaneless diffuser for GCR applications

    Energy Technology Data Exchange (ETDEWEB)

    Tauveron, N., E-mail: nicolas.tauveron@cea.f [CEA, DEN, DER/SSTH, 17 rue des Martyrs, F-38054 Grenoble (France); Dor, I., E-mail: isabelle.dor@cea.f [CEA, DEN, DER/SSTH, 17 rue des Martyrs, F-38054 Grenoble (France)

    2010-10-15

    In the frame of the Generation IV international forum, CEA has selected various innovative concepts of gas-cooled nuclear reactors. Thermal-hydraulic performance is a key issue for the design. For transient conditions and decay heat removal situations, the thermal-hydraulic performance must remain as high as possible. In this context, all the transient situations and the incidental and accidental scenarios must be evaluated by a validated system code able to correctly describe, in particular, the thermal hydraulics of the whole plant. As the concepts use a helium compressor to maintain the flow in the core, special emphasis must be placed on compressor modelling. Centrifugal circulators with a vaneless diffuser have attractive properties in terms of simplicity, cost, and the ability to operate over a wide range of conditions. The objective of this paper is to present a dedicated description of the centrifugal compressor, based on a one-dimensional approach. This type of model requires various correlations as input data. The present contribution consists in establishing and validating the numerical simulations (including different sets of correlations) by comparison with representative experimental data. The results obtained show a qualitatively correct behaviour of the model compared to open-literature cases from the aircraft gas turbine community and helium circulators of high-temperature gas reactors. The model is finally used in a depressurised transient simulation of a small-power gas-cooled fast reactor (the ALLEGRO concept). Advantages of this model over first preliminary simulations are shown. Further work on modelling and validation is nevertheless needed to gain better confidence in the simulation predictions.

  6. The Effect of Bypass Nozzle Exit Area on Fan Aerodynamic Performance and Noise in a Model Turbofan Simulator

    Science.gov (United States)

    Hughes, Christopher E.; Podboy, Gary, G.; Woodward, Richard P.; Jeracki, Robert, J.

    2013-01-01

    The design of effective new technologies to reduce aircraft propulsion noise is dependent on identifying and understanding the noise sources and noise generation mechanisms in the modern turbofan engine, as well as determining their contribution to the overall aircraft noise signature. Therefore, a comprehensive aeroacoustic wind tunnel test program was conducted called the Fan Broadband Source Diagnostic Test as part of the NASA Quiet Aircraft Technology program. The test was performed in the anechoic NASA Glenn 9- by 15-Foot Low Speed Wind Tunnel using a 1/5 scale model turbofan simulator which represented a current generation, medium pressure ratio, high bypass turbofan aircraft engine. The investigation focused on simulating in model scale only the bypass section of the turbofan engine. The test objectives were to: identify the noise sources within the model and determine their noise level; investigate several component design technologies by determining their impact on the aerodynamic and acoustic performance of the fan stage; and conduct detailed flow diagnostics within the fan flow field to characterize the physics of the noise generation mechanisms in a turbofan model. This report discusses results obtained for one aspect of the Source Diagnostic Test that investigated the effect of the bypass or fan nozzle exit area on the bypass stage aerodynamic performance, specifically the fan and outlet guide vanes or stators, as well as the farfield acoustic noise level. The aerodynamic performance, farfield acoustics, and Laser Doppler Velocimeter flow diagnostic results are presented for the fan and four different fixed-area bypass nozzle configurations. The nozzles simulated fixed engine operating lines and encompassed the fan stage operating envelope from near stall to cruise. One nozzle was selected as a baseline reference, representing the nozzle area which would achieve the design point operating conditions and fan stage performance. The total area change from

  7. Simulation Modelling and Strategic Change: Creating the Sustainable Enterprise

    Directory of Open Access Journals (Sweden)

    Patrick Dawson

    2010-01-01

    Full Text Available This paper highlights the benefits of using discrete event simulation models for developing change management frameworks which facilitate productivity and environmental improvements in order to create a sustainable enterprise. There is an increasing need for organisations to be more socially and environmentally responsible, however these objectives cannot be realised in isolation of the strategic, operations and business objectives of the enterprise. Discrete Event Simulation models facilitate a multidimensional approach to enterprise modelling which can integrate operations and strategic considerations with environmental and social issues. Moreover these models can provide a dynamic roadmap for implementing a change strategy for realising the optimal conditions for operational and environmental performance. It is important to note that the nature of change is itself dynamic and that simulation models are capable of characterising the dynamics of the change process. The paper argues that incorporating social and environmental challenges into a strategic business model for an enterprise can result in improved profits and long term viability and that a multidimensional simulation approach can support decision making throughout the change process to more effectively achieve these goals.

  8. Cost and Performance Model for Photovoltaic Systems

    Science.gov (United States)

    Borden, C. S.; Smith, J. H.; Davisson, M. C.; Reiter, L. J.

    1986-01-01

    The Lifetime Cost and Performance (LCP) model assists in the assessment of design options for photovoltaic systems. LCP is a simulation of the performance, cost, and revenue streams associated with photovoltaic power systems connected to an electric-utility grid. LCP provides the user with substantial flexibility in specifying the technical and economic environment of the application.

  9. Simulating Study on Drive System Performance for Hybrid Electric Bus Based on ADVISOR

    Directory of Open Access Journals (Sweden)

    Wang Xingxing

    2017-01-01

    Full Text Available A hybrid electric bus has a number of advantages compared with ordinary passenger cars, but its dynamic matching and vehicle performance are difficult to verify, which limits its development. In this paper, based on an actual vehicle, the hybrid electric bus module parameters were modified in the ADVISOR (Advanced Vehicle Simulator) software, mainly including the vehicle module, the wheel module, the motor module, the battery module and the engine module. Three bus models, A, B and C, were established, and the performance to be analyzed was defined, such as acceleration, gradability, emissions and energy utilization. To ensure that the vehicles run in the same environment and to allow convenient comparison, a fixed driving cycle was chosen. The simulation results were then analyzed, and the performance of the models was compared against dynamic and economic indicators determined by reference to traditional city bus standards and against each other. Finally, model B was selected as the model with the best performance that meets the demand. Its performance parameters from the simulation results are as follows: the best gradability is 26%, the maximum speed is 72.7 km/h, the maximum acceleration is 1.7 m/s2, the 0-50 km/h acceleration time is 9.5 s and the fuel consumption is 25 L/km.

  10. Review of Methods Related to Assessing Human Performance in Nuclear Power Plant Control Room Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Katya L Le Blanc; Ronald L Boring; David I Gertman

    2001-11-01

    With the increased use of digital systems in Nuclear Power Plant (NPP) control rooms comes a need to thoroughly understand the human performance issues associated with digital systems. A common way to evaluate human performance is to test operators and crews in NPP control room simulators. However, it is often challenging to characterize human performance in meaningful ways when measuring performance in NPP control room simulations. A review of the literature in NPP simulator studies reveals a variety of ways to measure human performance in NPP control room simulations including direct observation, automated computer logging, recordings from physiological equipment, self-report techniques, protocol analysis and structured debriefs, and application of model-based evaluation. These methods and the particular measures used are summarized and evaluated.

  11. Simulation modeling and analysis in safety. II

    International Nuclear Information System (INIS)

    Ayoub, M.A.

    1981-01-01

    The paper introduces and illustrates simulation modeling as a viable approach for dealing with complex issues and decisions in safety and health. The author details two studies: evaluation of employee exposure to airborne radioactive materials and effectiveness of the safety organization. The first study seeks to define a policy to manage a facility used in testing employees for radiation contamination. An acceptable policy is one that would permit the testing of all employees as defined under regulatory requirements, while not exceeding available resources. The second study evaluates the relationship between safety performance and the characteristics of the organization, its management, its policy, and communication patterns among various functions and levels. Both studies use models where decisions are reached based on the prevailing conditions and occurrence of key events within the simulation environment. Finally, several problem areas suitable for simulation studies are highlighted. (Auth.)

  12. 20th Joint Workshop on Sustained Simulation Performance

    CERN Document Server

    Bez, Wolfgang; Focht, Erich; Patel, Nisarg; Kobayashi, Hiroaki

    2016-01-01

    The book presents the state of the art in high-performance computing and simulation on modern supercomputer architectures. It explores general trends in hardware and software development, and then focuses specifically on the future of high-performance systems and heterogeneous architectures. It also covers applications such as computational fluid dynamics, material science, medical applications and climate research and discusses innovative fields like coupled multi-physics or multi-scale simulations. The papers included were selected from the presentations given at the 20th Workshop on Sustained Simulation Performance at the HLRS, University of Stuttgart, Germany in December 2015, and the subsequent Workshop on Sustained Simulation Performance at Tohoku University in February 2016.

  13. Predictive Simulation of Material Failure Using Peridynamics -- Advanced Constitutive Modeling, Verification and Validation

    Science.gov (United States)

    2016-03-31

    AFRL-AFOSR-VA-TR-2016-0309. Predictive simulation of material failure using peridynamics: advanced constitutive modeling, verification, and validation. Approved for public release. John T

  14. A hybrid parallel framework for the cellular Potts model simulations

    Energy Technology Data Exchange (ETDEWEB)

    Jiang, Yi [Los Alamos National Laboratory; He, Kejing [SOUTH CHINA UNIV; Dong, Shoubin [SOUTH CHINA UNIV

    2009-01-01

    The Cellular Potts Model (CPM) has been widely used for biological simulations. However, most current implementations are either sequential or approximate, and cannot be used for large-scale, complex 3D simulations. In this paper we present a hybrid parallel framework for CPM simulations. The time-consuming PDE solving, cell division, and cell reaction operations are distributed to clusters using the Message Passing Interface (MPI). The Monte Carlo lattice update is parallelized on shared-memory SMP systems using OpenMP. Because the Monte Carlo lattice update is much faster than the PDE solving and SMP systems are increasingly common, this hybrid approach achieves good performance and high accuracy at the same time. Based on the parallel Cellular Potts Model, we studied avascular tumor growth using a multiscale model. The application and performance analysis show that the hybrid parallel framework is quite efficient. The hybrid parallel CPM can be used for the large-scale simulation (~10^8 sites) of the complex collective behavior of numerous cells (~10^6).
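
    The Monte Carlo lattice update that the OpenMP layer parallelizes is, at its core, a Metropolis spin-copy attempt. The toy single-threaded sketch below shows that kernel for a 2D lattice with an adhesion term and a volume constraint; the contact-energy function J and all parameters are assumptions, and no parallelization or PDE coupling is included.

```python
import math
import random

def cpm_attempt(lattice, volumes, target_volume, J, lam, temperature, rng):
    """One Metropolis spin-copy attempt of a toy 2D cellular Potts model.

    lattice: n x n list of cell ids (0 = medium); volumes: dict cell id -> site count;
    J(a, b): contact energy between cell ids a and b, with J(a, a) == 0.
    """
    n = len(lattice)
    x, y = rng.randrange(n), rng.randrange(n)
    nx, ny = (x + rng.choice((-1, 0, 1))) % n, (y + rng.choice((-1, 0, 1))) % n
    old, new = lattice[x][y], lattice[nx][ny]
    if old == new:
        return False
    # Adhesion-energy change over the 4-neighbourhood of the target site.
    dH = 0.0
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        s = lattice[(x + dx) % n][(y + dy) % n]
        dH += J(new, s) - J(old, s)
    # Volume-constraint change for the shrinking and the growing cell.
    for cid, dv in ((old, -1), (new, +1)):
        if cid != 0:
            v = volumes[cid]
            dH += lam * ((v + dv - target_volume) ** 2 - (v - target_volume) ** 2)
    if dH <= 0.0 or rng.random() < math.exp(-dH / temperature):
        lattice[x][y] = new
        if old != 0:
            volumes[old] -= 1
        if new != 0:
            volumes[new] += 1
        return True
    return False
```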

  15. Cell-element simulations to optimize the performance of osmotic processes in porous membranes

    KAUST Repository

    Calo, Victor M.

    2018-05-11

    We present a new module of the software tool PoreChem for 3D simulations of osmotic processes at the cell-element scale. We consider the most general fully coupled model (see e.g., Sagiv and Semiat (2011)) in 3D to evaluate the impact on the membrane performance of both internal and external concentration polarization, which occurs in a cell-element for different operational conditions. The model consists of the Navier–Stokes–Brinkman system to describe the free fluid flow and the flow within the membrane with selective and support layers, a convection–diffusion equation to describe the solute transport, and nonlinear interface conditions to fully couple these equations. First, we briefly describe the mathematical model and discuss the discretization of the continuous model, the iterative solution, and the software implementation. Then, we present the analytical and numerical validation of the simulation tool. Next, we perform and discuss numerical simulations for a case study. The case study concerns the design of a cell element for the forward osmosis experiments. Using the developed software tool we qualitatively and quantitatively investigate the performance of a cell element that we designed for laboratory experiments of forward osmosis, and discuss the differences between the numerical solutions obtained with the full 3D and reduced 2D models. Finally, we demonstrate how the software enables investigating membrane heterogeneities.

  16. Desktop Modeling and Simulation: Parsimonious, yet Effective Discrete-Event Simulation Analysis

    Science.gov (United States)

    Bradley, James R.

    2012-01-01

    This paper evaluates how quickly students can be trained to construct useful discrete-event simulation models using Excel. The typical supply chain used by many large national retailers is described, and an Excel-based simulation model of it is constructed. The set of programming and simulation skills required for development of that model is then determined; we conclude that six hours of training are required to teach the skills to MBA students. The simulation presented here contains all the fundamental functionality of a simulation model, and so our result holds for any discrete-event simulation model. We argue therefore that industry workers with the same technical skill set as students having completed one year in an MBA program can be quickly trained to construct simulation models. This result gives credence to the efficacy of Desktop Modeling and Simulation, whereby simulation analyses can be quickly developed, run, and analyzed with widely available software, namely Excel.

  17. Evaluating the performance of coupled snow-soil models in SURFEXv8 to simulate the permafrost thermal regime at a high Arctic site

    Science.gov (United States)

    Barrere, Mathieu; Domine, Florent; Decharme, Bertrand; Morin, Samuel; Vionnet, Vincent; Lafaysse, Matthieu

    2017-09-01

    Climate change projections still suffer from a limited representation of the permafrost-carbon feedback. Predicting the response of permafrost temperature to climate change requires accurate simulations of Arctic snow and soil properties. This study assesses the capacity of the coupled land surface and snow models ISBA-Crocus and ISBA-ES to simulate snow and soil properties at Bylot Island, a high Arctic site. Field measurements complemented with ERA-Interim reanalyses were used to drive the models and to evaluate simulation outputs. Snow height, density, temperature, thermal conductivity and thermal insulance are examined to determine the critical variables involved in the soil and snow thermal regime. Simulated soil properties are compared to measurements of thermal conductivity, temperature and water content. The simulated snow density profiles are unrealistic, which is most likely caused by the lack of representation in snow models of the upward water vapor fluxes generated by the strong temperature gradients within the snowpack. The resulting vertical profiles of thermal conductivity are inverted compared to observations, with high simulated values at the bottom of the snowpack. Still, ISBA-Crocus manages to successfully simulate the soil temperature in winter. Results are satisfactory in summer, but the temperature of the top soil could be better reproduced by adequately representing surface organic layers, i.e., mosses and litter, and in particular their water retention capacity. Transition periods (soil freezing and thawing) are the least well reproduced because the high basal snow thermal conductivity induces an excessively rapid heat transfer between the soil and the snow in simulations. Hence, global climate models should carefully consider Arctic snow thermal properties, and especially the thermal conductivity of the basal snow layer, to perform accurate predictions of the permafrost evolution under climate change.

  18. Simulating the Sky as Seen by the Square Kilometer Array using the MIT Array Performance Simulator (MAPS)

    Science.gov (United States)

    Matthews, Lynn D.; Cappallo, R. J.; Doeleman, S. S.; Fish, V. L.; Lonsdale, C. J.; Oberoi, D.; Wayth, R. B.

    2009-05-01

    The Square Kilometer Array (SKA) is a proposed next-generation radio telescope that will operate at frequencies of 0.1-30 GHz and be 50-100 times more sensitive than existing radio arrays. Meeting the performance goals of this instrument will require innovative new hardware and software developments, a variety of which are now under consideration. Key to evaluating the performance characteristics of proposed SKA designs and testing the feasibility of new data calibration and processing algorithms is the ability to carry out realistic simulations of radio wavelength arrays under a variety of observing conditions. The MIT Array Performance Simulator (MAPS) (http://www.haystack.mit.edu/ast/arrays/maps/index.html) is an observations simulation package designed to achieve this goal. MAPS accepts an input source list or sky model and generates a model visibility set for a user-defined "virtual observatory'', incorporating such factors as array geometry, primary beam shape, field-of-view, and time and frequency resolution. Optionally, effects such as thermal noise, out-of-beam sources, variable station beams, and time/location-dependent ionospheric effects can be included. We will showcase current capabilities of MAPS for SKA applications by presenting results from an analysis of the effects of realistic sky backgrounds on the achievable image fidelity and dynamic range of SKA-like arrays comprising large numbers of small-diameter antennas.
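
    The central operation in such a simulator, generating model visibilities from a sky model for a given array geometry, reduces for unresolved sources to a sum of complex exponentials over the baselines. The sketch below shows only that step (no primary beam, noise, or ionospheric effects), with an assumed sign convention and (u, v) expressed in wavelengths; it is not the MAPS implementation.

```python
import numpy as np

def point_source_visibilities(uv_wavelengths, sources):
    """Model visibilities of point sources in the small-field approximation.

    uv_wavelengths : (N, 2) array of baseline (u, v) coordinates in wavelengths.
    sources        : iterable of (flux_jy, l, m) with direction cosines (l, m).
    """
    uv = np.asarray(uv_wavelengths, dtype=float)
    vis = np.zeros(len(uv), dtype=complex)
    for flux, l, m in sources:
        vis += flux * np.exp(-2.0j * np.pi * (uv[:, 0] * l + uv[:, 1] * m))
    return vis

# Two hypothetical sources observed on three baselines.
uv = [(120.0, -40.0), (300.5, 210.0), (-75.0, 980.0)]
print(point_source_visibilities(uv, [(1.0, 0.0, 0.0), (0.3, 0.01, -0.005)]))
```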

  19. Optimal Spatial Subdivision method for improving geometry navigation performance in Monte Carlo particle transport simulation

    International Nuclear Information System (INIS)

    Chen, Zhenping; Song, Jing; Zheng, Huaqing; Wu, Bin; Hu, Liqin

    2015-01-01

    Highlights: • The subdivision combines the advantages of both uniform and non-uniform schemes. • The grid models were proved to be more efficient than traditional CSG models. • Monte Carlo simulation performance was enhanced by Optimal Spatial Subdivision. • Efficiency gains were obtained for realistic whole reactor core models. - Abstract: Geometry navigation is one of the key aspects dominating Monte Carlo particle transport simulation performance for large-scale whole-reactor models. In such cases, spatial subdivision is an easily established and high-potential method to improve the run-time performance. In this study, a dedicated method, named Optimal Spatial Subdivision, is proposed for generating numerically optimal spatial grid models, which are demonstrated to be more efficient for geometry navigation than traditional Constructive Solid Geometry (CSG) models. The method uses a recursive subdivision algorithm to subdivide a CSG model into non-overlapping grids, which are labeled as totally or partially occupied, or not occupied at all, by CSG objects. The most important point is that, at each stage of subdivision, a quality factor based on a cost-estimation function is derived to evaluate the quality of the candidate subdivision schemes. Only the scheme with the optimal quality factor is chosen as the final subdivision strategy for generating the grid model. Eventually, the model built with the optimal quality factor will be efficient for Monte Carlo particle transport simulation. The method has been implemented and integrated into the Super Monte Carlo program SuperMC developed by the FDS Team. Test cases were used to highlight the performance gains that could be achieved. Results showed that Monte Carlo simulation runtime could be reduced significantly when using the new method, even as cases reached whole reactor core model sizes.
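
    The recursive, cost-driven subdivision can be illustrated with a schematic routine that compares the estimated tracking cost of keeping a box as a leaf against the cost after splitting it into octants. The cost estimate and stopping rules below are simplified assumptions, not the quality factor defined in SuperMC.

```python
def _overlaps(a, b):
    """Axis-aligned boxes a and b, each ((xmin, ymin, zmin), (xmax, ymax, zmax))."""
    return all(a[0][i] <= b[1][i] and b[0][i] <= a[1][i] for i in range(3))

def _octants(box):
    lo, hi = box
    mid = tuple((lo[i] + hi[i]) / 2.0 for i in range(3))
    out = []
    for ix in range(2):
        for iy in range(2):
            for iz in range(2):
                lo_c = (lo[0] if ix == 0 else mid[0],
                        lo[1] if iy == 0 else mid[1],
                        lo[2] if iz == 0 else mid[2])
                hi_c = (mid[0] if ix == 0 else hi[0],
                        mid[1] if iy == 0 else hi[1],
                        mid[2] if iz == 0 else hi[2])
                out.append((lo_c, hi_c))
    return out

def subdivide(box, objects, depth=0, max_depth=6):
    """Subdivide while the estimated per-track cost after splitting beats the leaf cost."""
    inside = [o for o in objects if _overlaps(o, box)]
    leaf_cost = len(inside)                      # ~ number of objects tested in this cell
    if depth == max_depth or leaf_cost <= 2:
        return {"box": box, "objects": inside}
    children = _octants(box)
    child_counts = [sum(1 for o in inside if _overlaps(o, c)) for c in children]
    split_cost = 1.0 + sum(child_counts) / len(children)   # crude "quality factor"
    if split_cost >= leaf_cost:                  # subdividing does not pay off here
        return {"box": box, "objects": inside}
    return {"box": box,
            "children": [subdivide(c, inside, depth + 1, max_depth) for c in children]}
```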

  20. Generic simplified simulation model for DFIG with active crowbar

    Energy Technology Data Exchange (ETDEWEB)

    Buendia, Francisco Jimenez [Gamesa Innovation and Technology, Sarriguren, Navarra (Spain). Technology Dept.; Barrasa Gordo, Borja [Assystem Iberia, Bilbao, Vizcaya (Spain)

    2012-07-01

    Simplified models for transient stability studies are a general requirement placed by transmission system operators on wind turbine (WTG) manufacturers. Such models must represent the performance of the WTGs in transient stability studies, mainly voltage dips originated by short circuits in the electrical network. These models are implemented in simulation software such as PSS/E, DIgSILENT or PSLF, platforms that allow simulation of transients in large electrical networks with thousands of buses, generators and loads. The high complexity of the grid requires that the models inserted into it be simplified so that the simulations can be executed as fast as possible. Developing a model that is simplified enough to be integrated into those complex grids while still representing the performance of the WTG is a challenge. The IEC TC88 working group has developed generic models for different types of generators, among others for WTGs using doubly fed induction generators (DFIG). This paper focuses on an extension of the IEC models for DFIG WTGs so that they can represent a simplified model of a DFIG with an active crowbar, which is required to withstand voltage dips without disconnecting from the grid. The paper improves the current generic Type 3 DFIG model by adding a simplified version of the generator including crowbar functionality and a simplified version of the crowbar firing. In addition, the simplified model is validated by correlation with voltage dip field tests from a real wind turbine. (orig.)

  1. Simulation of hybrid vehicle propulsion with an advanced battery model

    Energy Technology Data Exchange (ETDEWEB)

    Nallabolu, S.; Kostetzer, L.; Rudnyi, E. [CADFEM GmbH, Grafing (Germany); Geppert, M.; Quinger, D. [LION Smart GmbH, Frieding (Germany)

    2011-07-01

    In recent years there has been increasing concern about global warming and greenhouse gas emissions. In addition to the environmental issues, the predicted scarcity of oil supplies and the dramatic increase in oil price put new demands on vehicle design. As a result, energy efficiency and reduced emissions have become main selling points for automobiles. Hybrid electric vehicles (HEV) have therefore become an interesting technology for governments and the automotive industry. HEVs are more complicated than conventional vehicles because they contain more electrical components such as electric machines, power electronics, electronic continuously variable transmissions (CVT), and embedded powertrain controllers. Advanced energy storage devices and energy converters, such as Li-ion batteries, ultracapacitors, and fuel cells, are also considered. A detailed vehicle model is necessary for energy flow analysis and vehicle performance simulation. Computer simulation is indispensable for examining the vast hybrid electric vehicle design space with the aim of predicting vehicle performance over driving profiles and estimating fuel consumption and pollutant emissions. Various types of mathematical models and simulators are available to perform system simulation of vehicle propulsion. One of the standard methods to model the complete vehicle powertrain is ''backward quasistatic modeling''. In this method, vehicle subsystems are defined based on empirical models in the form of look-up tables and efficiency maps. The interaction between adjacent subsystems of the vehicle is defined through the amount of power flow. Under this technique, modeling of the vehicle subsystems such as the motor, engine, gearbox and battery is based on block diagrams. The vehicle model is applied in two case studies to evaluate the vehicle performance and fuel consumption. In the first case study the effect
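    As a hedged illustration of the backward quasistatic idea described above (not the authors' model), the sketch below propagates the power demand of an assumed drive cycle backward through look-up-table efficiency maps for the gearbox and motor to obtain the battery power; all parameter values and map shapes are placeholders.

```python
import numpy as np

# Hypothetical drive cycle: 1 s time step, vehicle speed in m/s.
dt = 1.0
speed = np.concatenate([np.linspace(0, 15, 30), np.full(60, 15.0), np.linspace(15, 0, 30)])
accel = np.gradient(speed, dt)

# Assumed vehicle parameters (illustrative only).
mass, c_rr, rho, cd_a, g = 1500.0, 0.01, 1.2, 0.7, 9.81

# Wheel power demand from the driving resistances (backward: the cycle dictates the demand).
f_trac = mass * accel + c_rr * mass * g + 0.5 * rho * cd_a * speed**2
p_wheel = f_trac * speed  # W

# Efficiency maps as simple look-up tables (placeholders for measured maps).
p_grid = np.array([0.0, 10e3, 30e3, 60e3])
eta_gearbox = np.interp(np.abs(p_wheel), p_grid, [0.90, 0.95, 0.97, 0.96])
eta_motor = np.interp(np.abs(p_wheel), p_grid, [0.80, 0.90, 0.93, 0.91])

# Power flows backward through the chain: wheel -> gearbox -> motor -> battery.
# When braking (negative power), the efficiencies act the other way (regeneration).
driving = p_wheel >= 0
p_batt = np.where(driving,
                  p_wheel / (eta_gearbox * eta_motor),
                  p_wheel * eta_gearbox * eta_motor)

energy_kwh = np.sum(p_batt) * dt / 3.6e6
print(f"net battery energy over the cycle: {energy_kwh:.3f} kWh")
```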

  2. Using interactive model simulations in co-design : An experiment in urban design

    NARCIS (Netherlands)

    Steen, M.G.D.; Arendsen, J.; Cremers, A.H.M.; Vries, A. de; Jong, J.M.G. de; Koning, N.M. de

    2013-01-01

    This paper presents an experiment in which people performed a co-design task in urban design, using a multi-user touch table application with or without interactive model simulations. We hypothesised that using the interactive model simulations would improve communication and co-operation between

  3. Modeling, simulation, and design of SAW grating filters

    Science.gov (United States)

    Schwelb, Otto; Adler, E. L.; Slaboszewicz, J. K.

    1990-05-01

    A systematic procedure for modeling, simulating, and designing SAW (surface acoustic wave) grating filters, taking losses into account, is described. Grating structures and IDTs (interdigital transducers) coupling to SAWs are defined by cascadable transmission-matrix building blocks. Driving point and transfer characteristics (immittances) of complex architectures consisting of gratings, transducers, and coupling networks are obtained by chain-multiplying building-block matrices. This modular approach to resonator filter analysis and design combines the elements of lossy filter synthesis with the transmission-matrix description of SAW components. A multipole filter design procedure based on a lumped-element-model approximation of one-pole two-port resonator building blocks is given and the range of validity of this model examined. The software for simulating the performance of SAW grating devices based on this matrix approach is described, and its performance, when linked to the design procedure to form a CAD/CAA (computer-aided design and analysis) multiple-filter design package, is illustrated with a resonator filter design example.
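    The modular, chain-multiplied transmission-matrix idea can be sketched generically. The following Python fragment is not the SAW-specific building-block library described in the paper; it cascades ordinary ABCD two-port matrices for illustrative lumped elements and evaluates the terminated voltage transfer, which is the same matrix bookkeeping the paper applies to gratings, transducers and coupling networks.

```python
import numpy as np

def abcd_series(Z):
    """ABCD matrix of a series impedance Z."""
    return np.array([[1.0, Z], [0.0, 1.0]], dtype=complex)

def abcd_shunt(Y):
    """ABCD matrix of a shunt admittance Y."""
    return np.array([[1.0, 0.0], [Y, 1.0]], dtype=complex)

def cascade(blocks):
    """Chain-multiply the building-block matrices, left to right."""
    total = np.eye(2, dtype=complex)
    for b in blocks:
        total = total @ b
    return total

def transfer(total, z_source=50.0, z_load=50.0):
    """Voltage transfer of the cascaded two-port terminated in z_load, driven through z_source."""
    A, B, C, D = total.ravel()
    return z_load / (A * z_load + B + z_source * (C * z_load + D))

# Illustrative chain: a lossy LC section between 50-ohm terminations, swept over frequency.
freqs = np.linspace(1e6, 100e6, 400)
response = []
for f in freqs:
    w = 2 * np.pi * f
    L, Cap, R = 1e-6, 100e-12, 2.0            # placeholder element values
    chain = [abcd_series(R + 1j * w * L), abcd_shunt(1j * w * Cap)]
    response.append(20 * np.log10(abs(transfer(cascade(chain)))))
print(f"minimum of swept response: {min(response):.1f} dB")
```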

  4. A View on Future Building System Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Wetter, Michael

    2011-04-01

    This chapter presents what a future environment for building system modeling and simulation may look like. As buildings continue to require increased performance and better comfort, their energy and control systems are becoming more integrated and complex. We therefore focus in this chapter on the modeling, simulation and analysis of building energy and control systems. Such systems can be classified as heterogeneous systems because they involve multiple domains, such as thermodynamics, fluid dynamics, heat and mass transfer, electrical systems, control systems and communication systems. Also, they typically involve multiple temporal and spatial scales, and their evolution can be described by coupled differential equations, discrete equations and events. Modeling and simulating such systems requires a higher level of abstraction and modularisation to manage the increased complexity compared to what is used in today's building simulation programs. Therefore, the trend towards more integrated building systems is likely to be a driving force for changing the status quo of today's building simulation programs. This chapter discusses evolving modeling requirements and outlines a path toward a future environment for modeling and simulation of heterogeneous building systems. A range of topics that would require many additional pages of discussion has been omitted. Examples include computational fluid dynamics for air and particle flow in and around buildings, people movement, daylight simulation, uncertainty propagation and optimisation methods for building design and controls. For different discussions and perspectives on the future of building modeling and simulation, we refer to Sahlin (2000), Augenbroe (2001) and Malkawi and Augenbroe (2004).

  5. Extended two-fluid model for simulating magneto-rheological fluid flows

    International Nuclear Information System (INIS)

    Shivaram, A C

    2011-01-01

    The current practice of designing magneto-rheological (MR) fluid-based devices is, to a large extent, based on simple phenomenological models like the Bingham model. Though useful for initial force or torque estimation and sizing, these models lack the capability to predict performance degradation due to changes in the particle volume fraction distribution. The present work demonstrates the use of the two-fluid model for predicting the particle volume fraction distribution inside a device in the absence of a field and proposes a novel modeling scheme which can simulate the fluid flow in the presence of a field. This modeling scheme can be used to (a) visualize flow patterns inside a device under various operating conditions, (b) predict the spatial distribution of particles inside a device after multiple operating cycles, (c) assist in estimating the extent of performance degradation due to non-uniform particle distribution and (d) enable testing of various design strategies to mitigate such performance issues using simulations. This is illustrated through numerical examples of a few case studies of typical MR device configurations

  6. STEADY-STATE modeling and simulation of pipeline networks for compressible fluids

    Directory of Open Access Journals (Sweden)

    A.L.H. Costa

    1998-12-01

    Full Text Available This paper presents a model and an algorithm for the simulation of pipeline networks with compressible fluids. The model can predict pressures, flow rates, temperatures and gas compositions at any point of the network. Any network configuration can be simulated; the existence of cycles is not an obstacle. Numerical results from simulated data on a proposed network are shown for illustration. The potential of the simulator is explored by the analysis of a pressure relief network, using a stochastic procedure for the evaluation of system performance.
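    A minimal sketch of the node-balance approach for a compressible-flow network is given below. It is not the authors' simulator: it assumes a generic Weymouth-type edge relation Q = C*sign(p_i^2 - p_j^2)*sqrt(|p_i^2 - p_j^2|), a three-node network with made-up conductances and demands, and solves the mass balances at the unknown-pressure nodes with a root finder.

```python
import numpy as np
from scipy.optimize import fsolve

# Small illustrative gas network: node 0 is a fixed-pressure source, nodes 1 and 2 are unknown.
edges = [(0, 1, 3.0e-6), (1, 2, 2.0e-6), (0, 2, 1.5e-6)]   # (from, to, assumed conductance)
p_source = 60e5                                            # Pa, fixed at node 0
demand = {1: 4.0, 2: 6.0}                                  # kg/s withdrawn at the unknown nodes

def edge_flow(p_i, p_j, c):
    """Generic Weymouth-type relation for steady compressible flow in a pipe segment."""
    dp2 = p_i**2 - p_j**2
    return c * np.sign(dp2) * np.sqrt(abs(dp2))

def residuals(p_unknown):
    p = {0: p_source, 1: p_unknown[0], 2: p_unknown[1]}
    net = {1: -demand[1], 2: -demand[2]}                   # mass balance at the unknown nodes
    for i, j, c in edges:
        q = edge_flow(p[i], p[j], c)
        if i in net:
            net[i] -= q
        if j in net:
            net[j] += q
    return [net[1], net[2]]

p_guess = np.array([55e5, 50e5])
p_solution = fsolve(residuals, p_guess)
print("node pressures [bar]:", np.round(p_solution / 1e5, 2))
```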

  7. Evaluating the Effect of Virtual Reality Temporal Bone Simulation on Mastoidectomy Performance: A Meta-analysis.

    Science.gov (United States)

    Lui, Justin T; Hoy, Monica Y

    2017-06-01

    Background: The increasing prevalence of virtual reality simulation in temporal bone surgery warrants an investigation to assess training effectiveness. Objectives: To determine if temporal bone simulator use improves mastoidectomy performance. Data Sources: Ovid Medline, Embase, and PubMed databases were systematically searched per the PRISMA guidelines. Review Methods: Inclusion criteria were peer-reviewed publications that utilized quantitative data of mastoidectomy performance following the use of a temporal bone simulator. The search was restricted to human studies published in English. Studies were excluded if they were in non-peer-reviewed format, were descriptive in nature, or failed to provide surgical performance outcomes. Meta-analysis calculations were then performed. Results: A meta-analysis based on the random-effects model revealed an improvement in overall mastoidectomy performance following training on the temporal bone simulator. A standardized mean difference of 0.87 (95% CI, 0.38-1.35) was generated in the setting of a heterogeneous study population (I² = 64.3%). Across the included virtual reality temporal bone surgery simulation studies, the meta-analysis calculations demonstrate an improvement in trainee mastoidectomy performance with virtual simulation training.
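    For readers unfamiliar with the random-effects pooling used here, the sketch below shows a generic DerSimonian-Laird calculation of a pooled standardized mean difference, its 95% CI and I². The per-study effect sizes and variances are hypothetical placeholders, not the data extracted in this meta-analysis.

```python
import numpy as np

# Hypothetical per-study standardized mean differences and their variances (not the paper's data).
y = np.array([0.5, 1.2, 0.3, 1.5, 0.9])
v = np.array([0.10, 0.08, 0.12, 0.15, 0.09])
k = len(y)

# Fixed-effect weights and heterogeneity statistics.
w = 1.0 / v
y_fixed = np.sum(w * y) / np.sum(w)
Q = np.sum(w * (y - y_fixed) ** 2)
C = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (Q - (k - 1)) / C)          # DerSimonian-Laird between-study variance
I2 = max(0.0, (Q - (k - 1)) / Q) * 100.0    # percentage of variation due to heterogeneity

# Random-effects pooling.
w_star = 1.0 / (v + tau2)
smd = np.sum(w_star * y) / np.sum(w_star)
se = np.sqrt(1.0 / np.sum(w_star))
ci = (smd - 1.96 * se, smd + 1.96 * se)
print(f"pooled SMD = {smd:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f}), I^2 = {I2:.1f}%")
```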

  8. The NASA Ames Hypersonic Combustor-Model Inlet CFD Simulations and Experimental Comparisons

    Science.gov (United States)

    Venkatapathy, E.; Tokarcik-Polsky, S.; Deiwert, G. S.; Edwards, Thomas A. (Technical Monitor)

    1995-01-01

    Computations have been performed on a three-dimensional inlet associated with the NASA Ames combustor model for the hypersonic propulsion experiment in the 16-inch shock tunnel. The 3-dimensional inlet was designed to have the combustor inlet flow nearly two-dimensional and of sufficient mass flow necessary for combustion. The 16-inch shock tunnel experiment is a short duration test with test time of the order of milliseconds. The flow through the inlet is in chemical non-equilibrium. Two test entries have been completed and limited experimental results for the inlet region of the combustor-model are available. A number of CFD simulations, with various levels of simplifications such as 2-D simulations, 3-D simulations with and without chemical reactions, simulations with and without turbulent conditions, etc., have been performed. These simulations have helped determine the model inlet flow characteristics and the important factors that affect the combustor inlet flow and the sensitivity of the flow field to these simplifications. In the proposed paper, CFD modeling of the hypersonic inlet, results from the simulations and comparison with available experimental results will be presented.

  9. The use of vestibular models for design and evaluation of flight simulator motion

    Science.gov (United States)

    Bussolari, Steven R.; Young, Laurence R.; Lee, Alfred T.

    1989-01-01

    Quantitative models for the dynamics of the human vestibular system are applied to the design and evaluation of flight simulator platform motion. An optimal simulator motion control algorithm is generated to minimize the vector difference between perceived spatial orientation estimated in flight and in simulation. The motion controller has been implemented on the Vertical Motion Simulator at NASA Ames Research Center and evaluated experimentally through measurement of pilot performance and subjective rating during VTOL aircraft simulation. In general, pilot performance in a longitudinal tracking task (formation flight) did not appear to be sensitive to variations in platform motion condition as long as motion was present. However, pilot assessments of motion fidelity, made by means of a rating scale designed for this purpose, were sensitive to motion controller design. Platform motion generated with the optimal motion controller was found to be generally equivalent to that generated by conventional linear crossfeed washout. The vestibular models are used to evaluate the motion fidelity of transport category aircraft (Boeing 727) simulation in a pilot performance and simulator acceptability study at the Man-Vehicle Systems Research Facility at NASA Ames Research Center. Eighteen airline pilots, currently flying the B-727, were given a series of flight scenarios in the simulator under various conditions of simulator motion. The scenarios were chosen to reflect the flight maneuvers that these pilots might expect to be given during a routine pilot proficiency check. Pilot performance and subjective ratings of simulator fidelity were relatively insensitive to the motion condition, despite large differences in the amplitude of motion provided. This lack of sensitivity may be explained by means of the vestibular models, which predict little difference in the modeled motion sensations of the pilots when different motion conditions are imposed.

  10. Unmanned Aerial ad Hoc Networks: Simulation-Based Evaluation of Entity Mobility Models’ Impact on Routing Performance

    Directory of Open Access Journals (Sweden)

    Jean-Daniel Medjo Me Biomo

    2015-06-01

    Full Text Available An unmanned aerial ad hoc network (UAANET) is a special type of mobile ad hoc network (MANET). For these networks, researchers rely mostly on simulations to evaluate their proposed networking protocols. Hence, it is of great importance that the simulation environment of a UAANET replicates as much as possible the reality of UAVs. One major component of that environment is the movement pattern of the UAVs. This means that the mobility model used in simulations has to be thoroughly understood in terms of its impact on the performance of the network. In this paper, we investigate how mobility models affect the performance of UAANET in simulations in order to come up with conclusions/recommendations that provide a benchmark for future UAANET simulations. To that end, we first propose a few metrics to evaluate the mobility models. Then, we present five random entity mobility models that allow nodes to move almost freely and independently from one another and evaluate four carefully-chosen MANET/UAANET routing protocols: ad hoc on-demand distance vector (AODV), optimized link state routing (OLSR), reactive-geographic hybrid routing (RGR) and geographic routing protocol (GRP). In addition, flooding is also evaluated. The results show a wide variation of the protocol performance over different mobility models. These performance differences can be explained by the mobility model characteristics, and we discuss these effects. The results of our analysis show that: (i) the enhanced Gauss–Markov (EGM) mobility model is best suited for UAANET; (ii) OLSR, a table-driven proactive routing protocol, and GRP, a position-based geographic protocol, are the protocols most sensitive to the change of mobility models; (iii) RGR, a reactive-geographic hybrid routing protocol, is best suited for UAANET.

  11. Assessing the Impact of Equipment Aging on System Performance Using Simulation Modeling Methods

    International Nuclear Information System (INIS)

    Gupta, N. K.

    2005-01-01

    The radiological Inductively Coupled Plasma Mass Spectrometer (ICP-MS) is used to analyze radioactive samples collected from different radioactive material processing operations at the Savannah River Site (SRS). The expeditious processing of these samples is important for safe and reliable operations at SRS. As the radiological (RAD) ICP-MS machine ages, experience shows that replacement parts and repairs become difficult to obtain in time for reliable operation after 5 years of service. A discrete event model using the commercial software EXTEND was prepared to assess the impact on sample turnaround times as the ICP-MS gets older. The model was prepared using the sample statistics from the previous 4 years. Machine utilization rates were calculated for a new machine and for 5-, 10-, and 12-year-old machines. Computer simulations were run for these periods and the sample delay times calculated. The model was validated against the sample statistics collected from the previous 4 quarters. 90% confidence intervals were calculated for the 10th, 25th, 50th, and 90th percentiles of the samples. The simulation results show that if 50% of the samples are needed on time for efficient site operations, a 10-year-old machine could take nearly 50 days longer to process these samples than a 5-year-old machine. This simulation effort quantifies the impact on sample turnaround time as the ICP-MS gets older.
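    The discrete-event idea, a single analyzer whose effective service time degrades with age, can be sketched without the commercial EXTEND package. The fragment below is only illustrative: arrival, service, breakdown and repair parameters are assumed, and a simple single-server queue (Lindley recursion) stands in for the full model.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_turnaround(n_samples, mean_interarrival, mean_service, p_breakdown, mean_repair):
    """Single-server queue via the Lindley recursion; breakdowns lengthen the service time."""
    interarrival = rng.exponential(mean_interarrival, n_samples)
    service = rng.exponential(mean_service, n_samples)
    repairs = rng.exponential(mean_repair, n_samples) * (rng.random(n_samples) < p_breakdown)
    effective = service + repairs
    wait = np.zeros(n_samples)
    for i in range(1, n_samples):
        wait[i] = max(0.0, wait[i - 1] + effective[i - 1] - interarrival[i])
    return wait + effective                      # turnaround = waiting + processing

# Assumed parameters (days): the older machine breaks down more often and takes longer to repair.
young = simulate_turnaround(5000, 1.0, 0.6, p_breakdown=0.01, mean_repair=4.0)
old = simulate_turnaround(5000, 1.0, 0.6, p_breakdown=0.05, mean_repair=6.0)

for q in (10, 25, 50, 90):
    print(f"{q}th percentile turnaround: "
          f"{np.percentile(young, q):6.1f} d (5-yr) vs {np.percentile(old, q):6.1f} d (10-yr)")
```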

  12. A Tutorial on RxODE: Simulating Differential Equation Pharmacometric Models in R.

    Science.gov (United States)

    Wang, W; Hallow, K M; James, D A

    2016-01-01

    This tutorial presents the application of an R package, RxODE, that facilitates quick, efficient simulations of ordinary differential equation models completely within R. Its application is illustrated through simulation of design decision effects on an adaptive dosing regimen. The package provides an efficient, versatile way to specify dosing scenarios and to perform simulation with variability with minimal custom coding. Models can be directly translated to Rshiny applications to facilitate interactive, real-time evaluation/iteration on simulation scenarios.
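    RxODE itself is an R package; as a language-neutral analogue of the kind of ODE-based dosing simulation the tutorial describes, the hedged Python sketch below integrates a one-compartment model with first-order absorption over a repeated dosing regimen. The model structure and parameter values are illustrative assumptions, not taken from the tutorial.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Assumed one-compartment model with first-order absorption (illustrative parameters).
ka, cl, v = 1.0, 5.0, 50.0            # 1/h, L/h, L
ke = cl / v

def pk_rhs(t, y):
    depot, central = y
    return [-ka * depot, ka * depot - ke * central]

# Repeated oral dosing: 100 mg every 12 h for 5 doses, integrated interval by interval.
dose, tau, n_doses = 100.0, 12.0, 5
y0 = np.array([0.0, 0.0])
times, conc = [], []
for d in range(n_doses):
    y0[0] += dose                                            # add the dose to the depot compartment
    sol = solve_ivp(pk_rhs, (0.0, tau), y0, max_step=0.25)
    times.append(sol.t + d * tau)
    conc.append(sol.y[1] / v)                                # central amount -> concentration
    y0 = sol.y[:, -1].copy()

times, conc = np.concatenate(times), np.concatenate(conc)
print(f"peak concentration over the regimen: {conc.max():.2f} mg/L")
```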

  13. Mathematical modelling and simulation of the thermal performance of a solar heated indoor swimming pool

    Directory of Open Access Journals (Sweden)

    Mančić Marko V.

    2014-01-01

    Full Text Available Buildings with indoor swimming pools have a large energy footprint. The major source of energy loss is the swimming pool hall, where air humidity is increased by evaporation from the pool water surface. This increases energy consumption for heating and ventilation of the pool hall, fresh water supply losses, and the heat demand for pool water heating. In this paper, a mathematical model of the swimming pool was made to assess the energy demands of an indoor swimming pool building. The mathematical model of the swimming pool is used with the created multi-zone building model in the TRNSYS software to determine pool hall energy demand and pool losses. Energy losses for pool water heating and for pool hall heating and ventilation are analyzed for different target pool water and air temperatures. The simulation showed that pool water heating accounts for around 22%, and heating and ventilation of the pool hall for around 60%, of the total pool hall heat demand. With a change of preset controller air and water temperatures in the simulations, evaporation loss was in the range 46-54% of the total pool losses. A solar thermal sanitary hot water system was modelled and simulated to analyze its potential for energy savings for the presented demand-side model. The simulation showed that up to 87% of water heating demands could be met by the solar thermal system, while avoiding stagnation. [Project of the Ministry of Science of the Republic of Serbia, No. III 42006: Research and development of energy- and environmentally highly effective polygeneration systems based on the use of renewable energy sources]

  14. Modeling & Simulation Education for the Acquisition and T&E Workforce: FY07 Deliverable Package

    National Research Council Canada - National Science Library

    Olwell, David H; Johnson, Jean; Few, Stephanie; Didoszak, Jarema M

    2007-01-01

    This technical report presents the deliverables for calendar year 2007 for the "Educating the Modeling and Simulation Workforce" project performed for the DoD Modeling and Simulation Steering Committee...

  15. Evaluation of outbreak detection performance using multi-stream syndromic surveillance for influenza-like illness in rural Hubei Province, China: a temporal simulation model based on healthcare-seeking behaviors.

    Science.gov (United States)

    Fan, Yunzhou; Wang, Ying; Jiang, Hongbo; Yang, Wenwen; Yu, Miao; Yan, Weirong; Diwan, Vinod K; Xu, Biao; Dong, Hengjin; Palm, Lars; Nie, Shaofa

    2014-01-01

    Syndromic surveillance promotes the early detection of disease outbreaks. Although syndromic surveillance has increased in developing countries, outbreak detection performance, particularly for multi-stream surveillance, has scarcely been evaluated in rural areas. This study introduces a temporal simulation model based on healthcare-seeking behaviors to evaluate the performance of multi-stream syndromic surveillance for influenza-like illness. Data were obtained in six towns of rural Hubei Province, China, from April 2012 to June 2013. A Susceptible-Exposed-Infectious-Recovered model generated 27 scenarios of simulated influenza A (H1N1) outbreaks, which were converted into corresponding simulated syndromic datasets through the healthcare-seeking-behaviors model. We then superimposed the converted syndromic datasets onto the baselines obtained to create the testing datasets. Outbreak detection performance of single-stream surveillance of clinic visits, over-the-counter drug purchase frequency, and school absenteeism, and of multi-stream surveillance of their combinations, was evaluated using receiver operating characteristic curves and activity monitoring operating characteristic curves. In the six towns examined, clinic visit surveillance and school absenteeism surveillance exhibited better outbreak detection performance than over-the-counter drug purchase frequency surveillance; the performance of multi-stream surveillance was preferable to single-stream surveillance, particularly at low specificity (Sp), supporting the use of multi-stream surveillance.
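    A minimal sketch of the simulation chain described above, generating outbreaks with an SEIR model and converting incidence into a syndromic stream superimposed on a baseline, is given below. Parameters (population, transmission and recovery rates, care-seeking probability, baseline counts) are illustrative placeholders, not the study's fitted H1N1 values.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Generic SEIR model; parameter values are illustrative, not the study's fitted H1N1 values.
N, beta, sigma, gamma = 50000, 0.6, 1 / 2.0, 1 / 3.0   # population, transmission, 1/latent, 1/infectious

def seir(t, y):
    S, E, I, R = y
    new_inf = beta * S * I / N
    return [-new_inf, new_inf - sigma * E, sigma * E - gamma * I, gamma * I]

days = 120
sol = solve_ivp(seir, (0, days), [N - 5, 0, 5, 0], t_eval=np.arange(days + 1))
S = sol.y[0]
daily_incidence = -np.diff(S)                      # new infections per day

# Convert incidence into a simulated syndromic stream and superimpose it on an observed baseline.
rng = np.random.default_rng(0)
p_visit = 0.3                                      # assumed probability that a case seeks clinic care
baseline = rng.poisson(40, days)                   # stand-in for the observed baseline counts
clinic_visits = baseline + rng.binomial(daily_incidence.astype(int), p_visit)
print("peak simulated clinic visits:", clinic_visits.max())
```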

  16. Conversion of a mainframe simulation for maintenance performance to a PC environment

    International Nuclear Information System (INIS)

    Gertman, D.I.

    1991-01-01

    A computer-based simulation capable of generating human error probabilities (HEPs) for maintenance activities is presented. The HEPs are suitable for use in probabilistic risk assessments (PRAs) and are an important source of information for data management systems such as NUCLARR, the Nuclear Computerized Library for Assessing Reactor Reliability (1). The basic computer model, MAPPS (the Maintenance Personnel Performance Simulation), was developed and validated by the US NRC in order to improve maintenance practices and procedures at nuclear power plants. This previously validated model has now been implemented and improved in a PC environment and renamed MicroMAPPS. The model is stochastically based and able to simulate the performance of 2- to 15-person crews for a variety of maintenance conditions. These conditions include aspects of crew actions as potentially influenced by the task, the environment, or characteristics of the personnel involved. The nature of the software code makes it particularly appropriate for determining changes in HEP rates due to fluctuations in important task, environment, or personnel parameters. The presentation gives a brief review of the mainframe version of the code and summarizes the enhancements, which dramatically change the nature of the human-computer interaction.

  17. Flat Knitting Loop Deformation Simulation Based on Interlacing Point Model

    Directory of Open Access Journals (Sweden)

    Jiang Gaoming

    2017-12-01

    Full Text Available In order to create realistic loop primitives suitable for the faster CAD of the flat-knitted fabric, we have performed research on the model of the loop as well as the variation of the loop surface. This paper proposes an interlacing point-based model for the loop center curve, and uses the cubic Bezier curve to fit the central curve of the regular loop, elongated loop, transfer loop, and irregular deformed loop. In this way, a general model for the central curve of the deformed loop is obtained. The obtained model is then utilized to perform texture mapping, texture interpolation, and brightness processing, simulating a clearly structured and lifelike deformed loop. The computer program LOOP was developed using this algorithm. The deformed loop is simulated with different yarns, and the deformed loop is applied to the design of a cable stitch, demonstrating the feasibility of the proposed algorithm. This paper provides a loop primitive simulation method characterized by lifelikeness, yarn material variability, and deformation flexibility, and facilitates the loop-based fast computer-aided design (CAD) of the knitted fabric.
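    The cubic Bezier fit at the heart of the model is easy to reproduce. The sketch below evaluates the standard cubic Bezier form B(t) = (1-t)^3 P0 + 3(1-t)^2 t P1 + 3(1-t) t^2 P2 + t^3 P3 and shows how displacing control points yields a deformed (elongated) variant; the control-point coordinates are illustrative, not the paper's interlacing-point values.

```python
import numpy as np

def cubic_bezier(p0, p1, p2, p3, n=100):
    """Evaluate a cubic Bezier curve B(t) = (1-t)^3 P0 + 3(1-t)^2 t P1 + 3(1-t) t^2 P2 + t^3 P3."""
    t = np.linspace(0.0, 1.0, n)[:, None]
    return ((1 - t) ** 3 * p0 + 3 * (1 - t) ** 2 * t * p1
            + 3 * (1 - t) * t ** 2 * p2 + t ** 3 * p3)

# Illustrative control points for one leg of a loop center curve (not the paper's coordinates).
p0, p1, p2, p3 = map(np.array, [(0.0, 0.0), (0.2, 1.2), (0.8, 1.2), (1.0, 0.0)])
curve = cubic_bezier(p0, p1, p2, p3)

# A deformed (e.g. elongated) loop reuses the same model with displaced control points.
elongated = cubic_bezier(p0, p1 + (0.0, 0.8), p2 + (0.0, 0.8), p3)
print("regular loop height:", curve[:, 1].max(), " elongated loop height:", elongated[:, 1].max())
```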

  18. Building performance simulation for sustainable buildings

    NARCIS (Netherlands)

    Hensen, J.L.M.

    2010-01-01

    This paper aims to provide a general view of the background and current state of building performance simulation, which has the potential to deliver, directly or indirectly, substantial benefits to building stakeholders and to the environment. However the building simulation community faces many

  19. Validation of a power-law noise model for simulating small-scale breast tissue

    International Nuclear Information System (INIS)

    Reiser, I; Edwards, A; Nishikawa, R M

    2013-01-01

    We have validated a small-scale breast tissue model based on power-law noise. A set of 110 patient images served as truth. The statistical model parameters were determined by matching the radially averaged power-spectrum of the projected simulated tissue with that of the central tomosynthesis patient breast projections. Observer performance in a signal-known exactly detection task in simulated and actual breast backgrounds was compared. Observers included human readers, a pre-whitening observer model and a channelized Hotelling observer model. For all observers, good agreement between performance in the simulated and actual backgrounds was found, both in the tomosynthesis central projections and the reconstructed images. This tissue model can be used for breast x-ray imaging system optimization. The complete statistical description of the model is provided. (paper)
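    A common way to realize such a model, and the one sketched here as an assumption rather than the paper's exact procedure, is to shape white Gaussian noise in the Fourier domain with a radially symmetric f^(-beta/2) amplitude filter so that the power spectrum falls off as f^(-beta). The value of beta below is a placeholder, not the parameter fitted to the 110 patient images.

```python
import numpy as np

def power_law_background(n=256, beta=3.0, seed=0):
    """Simulate an n x n background whose power spectrum falls off as 1/f^beta."""
    rng = np.random.default_rng(seed)
    white = rng.standard_normal((n, n))
    fx = np.fft.fftfreq(n)
    f = np.sqrt(fx[:, None] ** 2 + fx[None, :] ** 2)
    f[0, 0] = f[0, 1]                              # avoid division by zero at DC
    amplitude = f ** (-beta / 2.0)                 # power ~ amplitude^2 ~ f^-beta
    shaped = np.fft.ifft2(np.fft.fft2(white) * amplitude).real
    return (shaped - shaped.mean()) / shaped.std()

# beta ~ 3 is a commonly quoted value for breast backgrounds; treat it as a placeholder here.
bg = power_law_background(beta=3.0)
print(bg.shape, round(bg.std(), 2))
```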

  20. Predictive neuromechanical simulations indicate why walking performance declines with ageing.

    Science.gov (United States)

    Song, Seungmoon; Geyer, Hartmut

    2018-04-01

    Although the natural decline in walking performance with ageing affects the quality of life of a growing elderly population, its physiological origins remain unknown. By using predictive neuromechanical simulations of human walking with age-related neuro-musculo-skeletal changes, we find evidence that the loss of muscle strength and muscle contraction speed dominantly contribute to the reduced walking economy and speed. The findings imply that focusing on recovering these muscular changes may be the only effective way to improve performance in elderly walking. More generally, the work is of interest for investigating the physiological causes of altered gait due to age, injury and disorders. Healthy elderly people walk slower and energetically less efficiently than young adults. This decline in walking performance lowers the quality of life for a growing ageing population, and understanding its physiological origin is critical for devising interventions that can delay or revert it. However, the origin of the decline in walking performance remains unknown, as ageing produces a range of physiological changes whose individual effects on gait are difficult to separate in experiments with human subjects. Here we use a predictive neuromechanical model to separately address the effects of common age-related changes to the skeletal, muscular and nervous systems. We find in computer simulations of this model that the combined changes produce gait consistent with elderly walking and that mainly the loss of muscle strength and mass reduces energy efficiency. In addition, we find that the slower preferred walking speed of elderly people emerges in the simulations when adapting to muscle fatigue, again mainly caused by muscle-related changes. The results suggest that a focus on recovering these muscular changes may be the only effective way to improve performance in elderly walking. © 2018 The Authors. The Journal of Physiology © 2018 The Physiological Society.

  1. Computer Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Pronskikh, V. S. [Fermilab

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of scientific practice of high importance and have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to a model's relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimentation with them) needs to be made. Holding on to that distinction, I propose to relate verification (based on theoretical strategies such as inference) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement, I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviating the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.

  2. WRF-Chem Model Simulations of Arizona Dust Storms

    Science.gov (United States)

    Mohebbi, A.; Chang, H. I.; Hondula, D.

    2017-12-01

    The online Weather Research and Forecasting model with coupled chemistry module (WRF-Chem) is applied to simulate the transport, deposition and emission of dust aerosols in an intense dust outbreak event that took place over Arizona on July 5th, 2011. Goddard Chemistry Aerosol Radiation and Transport (GOCART), Air Force Weather Agency (AFWA), and University of Cologne (UoC) parameterization schemes for dust emission were evaluated. The model was found to simulate the synoptic meteorological conditions well, as also widely documented in previous studies. The chemistry module's performance in reproducing the atmospheric desert dust load was evaluated using horizontal fields of Aerosol Optical Depth (AOD) from the Moderate Resolution Imaging Spectroradiometer (MODIS) aboard the Terra/Aqua satellites, retrieved with the standard Dark Target (DT) and Deep Blue (DB) algorithms, and from the Aerosol Robotic Network (AERONET). To assess the temporal variability of the dust storm, particulate matter mass concentration data (PM10 and PM2.5) from Arizona Department of Environmental Quality (AZDEQ) ground-based air quality stations were used. The promising performance of WRF-Chem indicates that the model is capable of simulating the right timing and loading of a dust event in the planetary boundary layer (PBL), and it can therefore be used to forecast approaching severe dust events and to communicate an effective early warning.

  3. Multiscale modeling and characterization for performance and safety of lithium-ion batteries

    International Nuclear Information System (INIS)

    Pannala, S.; Turner, J. A.; Allu, S.; Elwasif, W. R.; Kalnaus, S.; Simunovic, S.; Kumar, A.; Billings, J. J.; Wang, H.; Nanda, J.

    2015-01-01

    Lithium-ion batteries are highly complex electrochemical systems whose performance and safety are governed by coupled nonlinear electrochemical-electrical-thermal-mechanical processes over a range of spatiotemporal scales. Gaining an understanding of the role of these processes, as well as developing predictive capabilities for the design of better performing batteries, requires synergy between theory, modeling, and simulation, and fundamental experimental work to support the models. This paper presents an overview of the work performed by the authors across both experimental and computational efforts. We describe a new, open source computational environment for battery simulations with an initial focus on lithium-ion systems but designed to support a variety of model types and formulations. This system has been used to create three-dimensional cell and battery pack models that explicitly simulate all the battery components (current collectors, electrodes, and separator). The models are used to predict battery performance under normal operations and to study thermal and mechanical safety aspects under adverse conditions. This paper also provides an overview of the experimental techniques used to obtain crucial validation data to benchmark the simulations at various scales for performance as well as abuse. We detail some initial validation using characterization experiments such as infrared and neutron imaging and micro-Raman mapping. In addition, we identify opportunities for future integration of theory, modeling, and experiments.

  4. Stochastic Processes and Queueing Theory used in Cloud Computer Performance Simulations

    Directory of Open Access Journals (Sweden)

    Florin-Catalin ENACHE

    2015-10-01

    Full Text Available The cloud business has grown exponentially in the last 5 years. Capacity managers need a practical way to simulate the random demands a cloud infrastructure could face, even though there are not many mathematical tools for simulating such demands. This paper presents an introduction to the most important stochastic processes and queueing theory concepts used for modeling computer performance. Moreover, it shows the cases where such concepts are applicable and where they are not, using clear programming examples of how to simulate a queue and how to use and validate a simulation when there are no mathematical results to back it up.
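    In the spirit of the programming examples the paper advocates, the hedged sketch below simulates waiting times in an M/M/1 queue with the Lindley recursion and validates the simulation against the analytic mean waiting time Wq = rho / (mu - lambda); the arrival and service rates are arbitrary illustrative values.

```python
import numpy as np

rng = np.random.default_rng(42)
lam, mu, n = 0.8, 1.0, 200_000          # arrival rate, service rate, number of customers

# Simulate waiting times in the queue with the Lindley recursion.
interarrival = rng.exponential(1 / lam, n)
service = rng.exponential(1 / mu, n)
wait = np.zeros(n)
for i in range(1, n):
    wait[i] = max(0.0, wait[i - 1] + service[i - 1] - interarrival[i])

# Validate against the analytic M/M/1 result Wq = rho / (mu - lambda), rho = lambda / mu.
rho = lam / mu
wq_theory = rho / (mu - lam)
print(f"simulated mean wait = {wait.mean():.2f}, analytic M/M/1 mean wait = {wq_theory:.2f}")
```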

  5. Biocellion: accelerating computer simulation of multicellular biological system models.

    Science.gov (United States)

    Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

    2014-11-01

    Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling the function body of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in soil aggregate as case studies. Biocellion runs on x86 compatible systems with the 64 bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  6. Reference Manual for the System Advisor Model's Wind Power Performance Model

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, J.; Jorgenson, J.; Gilman, P.; Ferguson, T.

    2014-08-01

    This manual describes the National Renewable Energy Laboratory's System Advisor Model (SAM) wind power performance model. The model calculates the hourly electrical output of a single wind turbine or of a wind farm. The wind power performance model requires information about the wind resource, wind turbine specifications, wind farm layout (if applicable), and costs. In SAM, the performance model can be coupled to one of the financial models to calculate economic metrics for residential, commercial, or utility-scale wind projects. This manual describes the algorithms used by the wind power performance model, which is available in the SAM user interface and as part of the SAM Simulation Core (SSC) library, and is intended to supplement the user documentation that comes with the software.
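    The sketch below is not SAM or the SSC API; it only illustrates the generic shape of such a calculation: hourly wind speeds are mapped through an assumed turbine power curve, scaled by turbine count, and reduced by a lumped loss fraction to produce hourly and annual energy estimates.

```python
import numpy as np

# Illustrative turbine power curve (wind speed in m/s -> power in kW); not an actual SAM input.
ws_points = np.array([0, 3, 5, 8, 11, 13, 25, 25.01])
kw_points = np.array([0, 0, 300, 1200, 2400, 3000, 3000, 0])   # cut-in 3 m/s, cut-out 25 m/s

def farm_energy(hourly_wind, n_turbines=10, loss_fraction=0.12):
    """Hourly farm output: per-turbine power-curve lookup scaled by turbine count and a lumped loss."""
    per_turbine_kw = np.interp(hourly_wind, ws_points, kw_points)
    return per_turbine_kw * n_turbines * (1.0 - loss_fraction)   # kWh, with 1-hour steps

rng = np.random.default_rng(7)
hourly_wind = rng.weibull(2.0, 8760) * 8.0          # assumed Weibull-distributed wind resource
hourly_kwh = farm_energy(hourly_wind)
print(f"annual energy: {hourly_kwh.sum() / 1e6:.2f} GWh, "
      f"capacity factor: {hourly_kwh.sum() / (10 * 3000 * 8760):.2%}")
```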

  7. Simulating Effects of High Angle of Attack on Turbofan Engine Performance

    Science.gov (United States)

    Liu, Yuan; Claus, Russell W.; Litt, Jonathan S.; Guo, Ten-Huei

    2013-01-01

    A method of investigating the effects of high angle of attack (AOA) flight on turbofan engine performance is presented. The methodology involves combining a suite of diverse simulation tools. Three-dimensional, steady-state computational fluid dynamics (CFD) software is used to model the change in performance of a commercial aircraft-type inlet and fan geometry due to various levels of AOA. Parallel compressor theory is then applied to assimilate the CFD data with a zero-dimensional, nonlinear, dynamic turbofan engine model. The combined model shows that high AOA operation degrades fan performance and, thus, negatively impacts compressor stability margins and engine thrust. In addition, the engine response to high AOA conditions is shown to be highly dependent upon the type of control system employed.

  8. Development of a Simulation Model for Swimming with Diving Fins

    Directory of Open Access Journals (Sweden)

    Motomu Nakashima

    2018-02-01

    Full Text Available A simulation model to assess the performance of diving fins was developed by extending the swimming human simulation model SWUM. A diving fin was modeled as a series of five rigid plates and connected to the human model by springs and dampers. These plates were connected to each other by virtual springs and dampers, and the fin's bending property was represented by these springs and dampers as well. An actual diver's swimming motion with fins was acquired in a motion capture experiment. In order to determine the bending property of the fin, two bending tests on land were conducted. In addition, an experiment was conducted to determine the fluid force coefficients in the fluid force model for the fin. Finally, using all measured and identified information, a simulation reproducing the experimental situation was carried out. It was confirmed that the diver in the simulation propelled forward in the water successfully.

  9. iCrowd: agent-based behavior modeling and crowd simulator

    Science.gov (United States)

    Kountouriotis, Vassilios I.; Paterakis, Manolis; Thomopoulos, Stelios C. A.

    2016-05-01

    Initially designed in the context of the TASS (Total Airport Security System) FP-7 project, the Crowd Simulation platform developed by the Integrated Systems Lab of the Institute of Informatics and Telecommunications at N.C.S.R. Demokritos has evolved into a complete domain-independent agent-based behavior simulator with an emphasis on crowd behavior and building evacuation simulation. Under continuous development, it reflects an effort to implement a modern, multithreaded, data-oriented simulation engine employing the latest state-of-the-art programming technologies and paradigms. It is based on an extensible architecture that separates core services from the individual layers of agent behavior, offering a concrete simulation kernel designed for high performance and stability. Its primary goal is to deliver an abstract platform to facilitate the implementation of several Agent-Based Simulation solutions with applicability in several domains of knowledge, such as: (i) Crowd behavior simulation during indoor/outdoor evacuation. (ii) Non-Player Character AI for Game-oriented applications and Gamification activities. (iii) Vessel traffic modeling and simulation for Maritime Security and Surveillance applications. (iv) Urban and Highway Traffic and Transportation Simulations. (v) Social Behavior Simulation and Modeling.

  10. Performance engineering in the community atmosphere model

    International Nuclear Information System (INIS)

    Worley, P; Mirin, A; Drake, J; Sawyer, W

    2006-01-01

    The Community Atmosphere Model (CAM) is the atmospheric component of the Community Climate System Model (CCSM) and is the primary consumer of computer resources in typical CCSM simulations. Performance engineering has been an important aspect of CAM development throughout its existence. This paper briefly summarizes these efforts and their impacts over the past five years

  11. Neurocognitive Correlates of Young Drivers' Performance in a Driving Simulator.

    Science.gov (United States)

    Guinosso, Stephanie A; Johnson, Sara B; Schultheis, Maria T; Graefe, Anna C; Bishai, David M

    2016-04-01

    Differences in neurocognitive functioning may contribute to driving performance among young drivers. However, few studies have examined this relation. This pilot study investigated whether common neurocognitive measures were associated with driving performance among young drivers in a driving simulator. Young drivers (mean age 19.8 years, standard deviation [SD] = 1.9; N = 74) participated in a battery of neurocognitive assessments measuring general intellectual capacity (Full-Scale Intelligence Quotient, FSIQ) and executive functioning, including the Stroop Color-Word Test (cognitive inhibition), Wisconsin Card Sort Test-64 (cognitive flexibility), and Attention Network Task (alerting, orienting, and executive attention). Participants then drove in a simulated vehicle under two conditions: a baseline and a driving challenge. During the driving challenge, participants completed a verbal working memory task to increase demand on executive attention. Multiple regression models were used to evaluate the relations between the neurocognitive measures and driving performance under the two conditions. FSIQ, cognitive inhibition, and alerting were associated with better driving performance at baseline. FSIQ and cognitive inhibition were also associated with better driving performance during the verbal challenge. Measures of cognitive flexibility, orienting, and conflict executive control were not associated with driving performance under either condition. FSIQ and, to some extent, measures of executive function are associated with driving performance in a driving simulator. Further research is needed to determine if executive function is associated with more advanced driving performance under conditions that demand greater cognitive load. Copyright © 2016 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.

  12. Improving Simulations of Extreme Flows by Coupling a Physically-based Hydrologic Model with a Machine Learning Model

    Science.gov (United States)

    Mohammed, K.; Islam, A. S.; Khan, M. J. U.; Das, M. K.

    2017-12-01

    With the large number of hydrologic models presently available along with the global weather and geographic datasets, streamflows of almost any river in the world can be easily modeled. And if a reasonable amount of observed data from that river is available, then simulations of high accuracy can sometimes be performed after calibrating the model parameters against those observed data through inverse modeling. Although such calibrated models can succeed in simulating the general trend or mean of the observed flows very well, more often than not they fail to adequately simulate the extreme flows. This causes difficulty in tasks such as generating reliable projections of future changes in extreme flows due to climate change, an obviously important task since floods and droughts are closely connected to people's lives and livelihoods. We propose an approach in which the outputs of a physically-based hydrologic model are used as input to a machine learning model to better simulate the extreme flows. To demonstrate this offline-coupling approach, the Soil and Water Assessment Tool (SWAT) was selected as the physically-based hydrologic model, the Artificial Neural Network (ANN) as the machine learning model and the Ganges-Brahmaputra-Meghna (GBM) river system as the study area. The GBM river system, located in South Asia, is the third largest in the world in terms of freshwater generated and forms the largest delta in the world. The flows of the GBM rivers were simulated separately in order to test the performance of this proposed approach in accurately simulating the extreme flows generated by different basins that vary in size, climate, hydrology and anthropogenic intervention on stream networks. Results show that by post-processing the simulated flows of the SWAT models with ANN models, simulations of extreme flows can be significantly improved. The mean absolute errors in simulating annual maximum/minimum daily flows were minimized from 4967
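    A hedged illustration of the offline-coupling idea, with synthetic data standing in for SWAT output and observations, and a small scikit-learn network standing in for the authors' ANN, is sketched below; the feature choice (simulated flow plus two lags) and all numbers are assumptions made for the example.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Synthetic stand-ins: "observed" flows and a physically based simulation that damps the extremes.
n = 3000
observed = rng.gamma(shape=2.0, scale=500.0, size=n)
simulated = 0.8 * observed + 150.0 * np.sin(np.arange(n) / 58.0) + rng.normal(0, 80, n)

# Features for the correction model: the simulated flow plus short lags, as a simple memory proxy.
X = np.column_stack([simulated, np.roll(simulated, 1), np.roll(simulated, 2)])[2:]
y = observed[2:]
split = int(0.7 * len(y))

ann = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=2000, random_state=0))
ann.fit(X[:split], y[:split])
corrected = ann.predict(X[split:])

def mae_of_peaks(est, obs, q=0.95):
    peaks = obs > np.quantile(obs, q)
    return np.mean(np.abs(est[peaks] - obs[peaks]))

print("MAE on peak flows, raw simulation: %.1f" % mae_of_peaks(simulated[2:][split:], y[split:]))
print("MAE on peak flows, ANN-corrected: %.1f" % mae_of_peaks(corrected, y[split:]))
```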

  13. Cognitive modeling and dynamic probabilistic simulation of operating crew response to complex system accidents. Part 2: IDAC performance influencing factors model

    International Nuclear Information System (INIS)

    Chang, Y.H.J.; Mosleh, A.

    2007-01-01

    This is the second in a series of five papers describing the information, decision, and action in crew context (IDAC) model for human reliability analysis. An example application of this modeling technique is also discussed in this series. The model is developed to probabilistically predict the responses of the nuclear power plant control room operating crew in accident conditions. The operator response spectrum includes cognitive, psychological, and physical activities during the course of an accident. This paper identifies the IDAC set of performance influencing factors (PIFs), providing their definitions and causal organization in the form of a modular influence diagram. Fifty PIFs are identified to support the IDAC model as implemented in a computer simulation environment. They are classified into eleven hierarchically structured groups. The PIFs within each group are independent of each other; however, dependencies may exist between PIFs within different groups. The supporting evidence for the selection and organization of the influence paths, based on psychological literature, observations, and various human reliability analysis methodologies, is also indicated.

  14. Comparison of Thunderstorm Simulations from WRF-NMM and WRF-ARW Models over East Indian Region

    Directory of Open Access Journals (Sweden)

    A. J. Litta

    2012-01-01

    Full Text Available Thunderstorms are typical mesoscale systems dominated by intense convection. Mesoscale models are essential for the accurate prediction of such high-impact weather events. In the present study, an attempt has been made to compare the simulated results of three thunderstorm events using the NMM and ARW model cores of the WRF system and to validate the model results with observations. Both models performed well in capturing stability indices, which are indicators of severe convective activity. Comparison of model-simulated radar reflectivity imagery with observations revealed that the NMM model simulated the propagation of the squall line well, while the squall line movement was slow in ARW. From the model-simulated spatial plots of cloud top temperature, we can see that the NMM model captured the genesis, intensification, and propagation of the thunder squall better than the ARW model. The statistical analysis of rainfall also indicates better performance of NMM than ARW. Comparison of model-simulated thunderstorm-affected parameters with observations showed that NMM performed better than ARW in capturing the sharp rise in humidity and drop in temperature. This suggests that the NMM model has the potential to provide unique and valuable information for severe thunderstorm forecasters over the east Indian region.

  15. Dynamical Downscaling of NASA/GISS ModelE: Continuous, Multi-Year WRF Simulations

    Science.gov (United States)

    Otte, T.; Bowden, J. H.; Nolte, C. G.; Otte, M. J.; Herwehe, J. A.; Faluvegi, G.; Shindell, D. T.

    2010-12-01

    The WRF Model is being used at the U.S. EPA for dynamical downscaling of the NASA/GISS ModelE fields to assess regional impacts of climate change in the United States. The WRF model has been successfully linked to the ModelE fields in their raw hybrid vertical coordinate, and continuous, multi-year WRF downscaling simulations have been performed. WRF will be used to downscale decadal time slices of ModelE for recent past, current, and future climate as the simulations being conducted for the IPCC Fifth Assessment Report become available. This presentation will focus on the sensitivity to interior nudging within the RCM. The use of interior nudging for downscaled regional climate simulations has been somewhat controversial over the past several years but has been recently attracting attention. Several recent studies that have used reanalysis (i.e., verifiable) fields as a proxy for GCM input have shown that interior nudging can be beneficial toward achieving the desired downscaled fields. In this study, the value of nudging will be shown using fields from ModelE that are downscaled using WRF. Several different methods of nudging are explored, and it will be shown that the method of nudging and the choices made with respect to how nudging is used in WRF are critical to balance the constraint of ModelE against the freedom of WRF to develop its own fields.

  16. Performances on simulator and da Vinci robot on subjects with and without surgical background.

    Science.gov (United States)

    Moglia, Andrea; Ferrari, Vincenzo; Melfi, Franca; Ferrari, Mauro; Mosca, Franco; Cuschieri, Alfred; Morelli, Luca

    2017-08-17

    To assess whether previous training in surgery influences performance on the da Vinci Skills Simulator and the da Vinci robot. In this prospective study, thirty-seven participants (11 medical students, 17 residents, and 9 attending surgeons) without previous experience in laparoscopy and robotic surgery performed 26 exercises at the da Vinci Skills Simulator. Thirty-five then executed a suture using a da Vinci robot. The overall scores on the exercises at the da Vinci Skills Simulator show similar performance among the groups, with no statistically significant pair-wise differences. The score for the suture performed with the da Vinci robot was poor for the untrained groups (5 (3.5, 9)), again without a statistically significant difference among the groups. This study showed, for subjects new to laparoscopy and robotic surgery, insignificant differences in the scores at the da Vinci Skills Simulator and at the da Vinci robot on inanimate models.

  17. The new rosetta targets observations, simulations and instrument performances

    CERN Document Server

    Epifani, Elena; Palumbo, Pasquale

    2004-01-01

    The Rosetta mission was successfully launched on March 2nd, 2004 for a rendezvous with the short-period comet 67P/Churyumov-Gerasimenko in 2014. The new baseline mission also foresees a double fly-by of the asteroids 21 Lutetia and 2867 Steins on the way towards the primary target. This volume collects papers presented at the workshop on "The NEW Rosetta targets: Observations, simulations and instrument performances", held in Capri on October 13-15, 2003. The papers cover the fields of observations of the new Rosetta targets, laboratory experiments and theoretical simulation of cometary processes, and the expected performances of the Rosetta experiments. Until real operations around 67P/Churyumov-Gerasimenko start 10 years from now, new astronomical observations, laboratory experiments and theoretical models are required. The goals are to increase knowledge about the physics and chemistry of comets and to prepare to exploit Rosetta data to the fullest.

  18. Modeling and performance improvement of the constant power regulator systems in variable displacement axial piston pump.

    Science.gov (United States)

    Park, Sung Hwan; Lee, Ji Min; Kim, Jong Shik

    2013-01-01

    An irregular performance of a mechanical-type constant power regulator is considered. In order to find the cause of an irregular discharge flow at the cut-off pressure area, modeling and numerical simulations are performed to observe dynamic behavior of internal parts of the constant power regulator system for a swashplate-type axial piston pump. The commercial numerical simulation software AMESim is applied to model the mechanical-type regulator with hydraulic pump and simulate the performance of it. The validity of the simulation model of the constant power regulator system is verified by comparing simulation results with experiments. In order to find the cause of the irregular performance of the mechanical-type constant power regulator system, the behavior of main components such as the spool, sleeve, and counterbalance piston is investigated using computer simulation. The shape modification of the counterbalance piston is proposed to improve the undesirable performance of the mechanical-type constant power regulator. The performance improvement is verified by computer simulation using AMESim software.

  19. Application of laboratory data from small-scale simulators to human performance issues in the nuclear industry

    International Nuclear Information System (INIS)

    Spettell, C.M.

    1986-01-01

    Laboratory analogs of nuclear power plant tasks were simulated on personal computers in two experimental studies. Human performance data were collected during each experimental study. The goal of the first experiment was to validate a quantitative model of dependence among human errors during testing, calibration, and maintenance activities. This model, the Multiple Sequential Failure (MSF) model (NUREG/CR-2211) has been used to quantify dependent human error failure probabilities for human reliability analyses in Probabilistic Risk Assessments (PRAs). The goal of the second experiment was to examine the relationship among psychological and behavioral characteristics of individuals and their performance at controlling a simulated nuclear power plant. These studies demonstrated the usefulness of the experimental psychology approach for validating models of human performance at nuclear power plant tasks

  20. SEAscan 3.5: A simulator performance analyzer

    International Nuclear Information System (INIS)

    Dennis, T.; Eisenmann, S.

    1990-01-01

    SEAscan 3.5 is a personal-computer-based tool developed to analyze the dynamic performance of nuclear power plant training simulators. The system integrates features that support its use by human analysts. In this paper, the program is described as a tool for the analysis of training simulator performance, and its structure and operating characteristics are presented. Hardcopy outputs are shown to aid in verifying conformance to ANSI/ANS-3.5-1985

  1. Mesoscopic modelling and simulation of soft matter.

    Science.gov (United States)

    Schiller, Ulf D; Krüger, Timm; Henrich, Oliver

    2017-12-20

    The deformability of soft condensed matter often requires modelling of hydrodynamical aspects to gain quantitative understanding. This, however, requires specialised methods that can resolve the multiscale nature of soft matter systems. We review a number of the most popular simulation methods that have emerged, such as Langevin dynamics, dissipative particle dynamics, multi-particle collision dynamics, sometimes also referred to as stochastic rotation dynamics, and the lattice-Boltzmann method. We conclude this review with a short glance at current compute architectures for high-performance computing and community codes for soft matter simulation.
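
    As a concrete illustration of the simplest method in the list above, the sketch below integrates an overdamped Langevin equation for non-interacting particles in a harmonic trap and checks equipartition. It is a minimal, hedged example, not code from the review; the trap stiffness, friction coefficient and temperature are arbitrary illustrative values.

      import numpy as np

      def overdamped_langevin(n_particles=1000, n_steps=5000, dt=1e-3,
                              k_trap=1.0, gamma=1.0, kT=1.0, seed=0):
          """Euler-Maruyama integration of dx = -(k/gamma) x dt + sqrt(2 kT/gamma) dW."""
          rng = np.random.default_rng(seed)
          x = np.zeros(n_particles)
          noise_amp = np.sqrt(2.0 * kT * dt / gamma)
          for _ in range(n_steps):
              drift = -(k_trap / gamma) * x * dt
              x += drift + noise_amp * rng.standard_normal(n_particles)
          return x

      if __name__ == "__main__":
          positions = overdamped_langevin()
          # Equipartition check: <x^2> should approach kT / k_trap = 1.0
          print("sampled <x^2> =", positions.var())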

  2. Modelling and Simulation for Requirements Engineering and Options Analysis

    Science.gov (United States)

    2010-05-01

    should be performed to work successfully in the domain; and process-based techniques model the processes that occur in the work domain. There is a crisp ... Can the current technique for developing simulation models for assessments ...

  3. Solar power plant performance evaluation: simulation and experimental validation

    Science.gov (United States)

    Natsheh, E. M.; Albarbar, A.

    2012-05-01

    In this work the performance of a solar power plant is evaluated based on a developed model comprising a photovoltaic array, battery storage, controller and converters. The model is implemented using the MATLAB/SIMULINK software package. A perturb and observe (P&O) algorithm is used for maximizing the generated power through a maximum power point tracker (MPPT) implementation. The outcomes of the developed model are validated and supported by a case study carried out using an operational 28.8 kW grid-connected solar power plant located in central Manchester. Measurements were taken over a 21-month period, using hourly average irradiance and cell temperature. It was found that system degradation could be clearly monitored by determining the residual (the difference) between the output power predicted by the model and the actual measured power. It was found that the residual exceeded the healthy threshold, 1.7 kW, due to heavy snow in Manchester last winter. More importantly, the developed performance evaluation technique could be adopted to detect other causes of degraded PV panel performance, such as shading and dirt. Repeatability and reliability of the developed system performance were validated during this period. Good agreement was achieved between the theoretical simulation and the real-time measurements taken from the online grid-connected solar power plant.
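
    The perturb and observe logic mentioned above can be summarised in a few lines. The sketch below is a hedged, generic P&O step applied to a synthetic power-voltage curve; the toy panel curve, the open-circuit voltage and the perturbation step are illustrative assumptions, not parameters of the Manchester plant model.

      def pv_power(v, v_oc=40.0, i_sc=8.0):
          """Toy PV curve: current falls off sharply near the open-circuit voltage."""
          if v <= 0 or v >= v_oc:
              return 0.0
          i = i_sc * (1.0 - (v / v_oc) ** 12)
          return v * i

      def perturb_and_observe(v0=20.0, step=0.5, iterations=200):
          """Classic P&O: keep moving the operating voltage in the direction
          that last increased the measured power."""
          v, direction = v0, +1
          p_prev = pv_power(v)
          for _ in range(iterations):
              v += direction * step
              p = pv_power(v)
              if p < p_prev:          # power dropped -> reverse the perturbation
                  direction = -direction
              p_prev = p
          return v, p_prev

      if __name__ == "__main__":
          v_mpp, p_mpp = perturb_and_observe()
          print(f"tracked MPP near V = {v_mpp:.1f} V, P = {p_mpp:.1f} W")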

  4. Specialty Payment Model Opportunities and Assessment: Oncology Simulation Report.

    Science.gov (United States)

    White, Chapin; Chan, Chris; Huckfeldt, Peter J; Kofner, Aaron; Mulcahy, Andrew W; Pollak, Julia; Popescu, Ioana; Timbie, Justin W; Hussey, Peter S

    2015-07-15

    This article describes the results of a simulation analysis of a payment model for specialty oncology services that is being developed for possible testing by the Center for Medicare and Medicaid Innovation at the Centers for Medicare & Medicaid Services (CMS). CMS asked MITRE and RAND to conduct simulation analyses to preview some of the possible impacts of the payment model and to inform design decisions related to the model. The simulation analysis used an episode-level dataset based on Medicare fee-for-service (FFS) claims for historical oncology episodes provided to Medicare FFS beneficiaries in 2010. Under the proposed model, participating practices would continue to receive FFS payments, would also receive per-beneficiary per-month care management payments for episodes lasting up to six months, and would be eligible for performance-based payments based on per-episode spending for attributed episodes relative to a per-episode spending target. The simulation offers several insights into the proposed payment model for oncology: (1) The care management payments used in the simulation analysis ($960 total per six-month episode) represent only 4 percent of projected average total spending per episode (around $27,000 in 2016), but they are large relative to the FFS revenues of participating oncology practices, which are projected to be around $2,000 per oncology episode. By themselves, the care management payments would increase physician practices' Medicare revenues by roughly 50 percent on average. This represents a substantial new outlay for the Medicare program and a substantial new source of revenues for oncology practices. (2) For the Medicare program to break even, participating oncology practices would have to reduce utilization and intensity by roughly 4 percent. (3) The break-even point can be reduced if the care management payments are reduced or if the performance-based payments are reduced.
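
    The headline percentages quoted above follow from simple arithmetic on the rounded figures given in the abstract; the short check below reproduces them and is not part of the MITRE/RAND analysis.

      # Rounded figures quoted in the abstract (2016 projections).
      care_management_per_episode = 960.0     # $ per six-month episode
      total_spending_per_episode = 27_000.0   # $ average projected spending per episode
      ffs_revenue_per_episode = 2_000.0       # $ practice FFS revenue per episode

      share_of_spending = care_management_per_episode / total_spending_per_episode
      revenue_increase = care_management_per_episode / ffs_revenue_per_episode
      break_even_reduction = share_of_spending   # utilization cut needed to offset the new payments

      print(f"care management share of spending: {share_of_spending:.1%}")   # ~3.6% (about 4%)
      print(f"practice revenue increase:        {revenue_increase:.0%}")     # ~48% (roughly 50%)
      print(f"break-even utilization reduction: {break_even_reduction:.1%}")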

  5. Advances in Intelligent Modelling and Simulation Simulation Tools and Applications

    CERN Document Server

    Oplatková, Zuzana; Carvalho, Marco; Kisiel-Dorohinicki, Marek

    2012-01-01

    The human capacity to abstract complex systems and phenomena into simplified models has played a critical role in the rapid evolution of our modern industrial processes and scientific research. As a science and an art, Modelling and Simulation have been one of the core enablers of this remarkable human trace, and have become a topic of great importance for researchers and practitioners. This book was created to compile some of the most recent concepts, advances, challenges and ideas associated with Intelligent Modelling and Simulation frameworks, tools and applications. The first chapter discusses the important aspects of a human interaction and the correct interpretation of results during simulations. The second chapter gets to the heart of the analysis of entrepreneurship by means of agent-based modelling and simulations. The following three chapters bring together the central theme of simulation frameworks, first describing an agent-based simulation framework, then a simulator for electrical machines, and...

  6. The development of fully dynamic rotating machine models for nuclear training simulators

    International Nuclear Information System (INIS)

    Birsa, J.J.

    1990-01-01

    Prior to beginning the development of an enhanced set of electrical plant models for several nuclear training simulators, an extensive literature search was conducted to evaluate and select rotating machine models for use on these simulators. These models include the main generator, diesel generators, in-plant electric power distribution and off-site power. From the results of this search, various models were investigated and several were selected for further evaluation. Several computer studies were performed on the selected models in order to determine their suitability for use in a training simulator environment. One surprising result of this study was that a number of established, classical models could not be made to reproduce actual plant steady-state data over the range necessary for a training simulator. This evaluation process and its results are presented in this paper. Various historical, as well as contemporary, electrical models of rotating machines are discussed. Specific criteria for the selection of rotating machine models for training simulator use are presented
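
    One of the classical rotating-machine representations that an evaluation like this would cover is the swing-equation model of a synchronous generator tied to an infinite bus. The sketch below is a hedged, textbook-style example; the per-unit inertia, damping, reactance and voltage values are illustrative and do not correspond to any of the models selected in the paper.

      import math

      def swing_equation_step(delta, omega, p_mech, e, v_bus, x, h, d, dt, f0=60.0):
          """One explicit Euler step of the classical swing equation (per unit).

          delta : rotor angle [rad],  omega : speed deviation [pu]
          p_e = E*V/X * sin(delta) is the electrical power delivered to the bus.
          """
          p_elec = e * v_bus / x * math.sin(delta)
          domega = (p_mech - p_elec - d * omega) / (2.0 * h) * dt
          ddelta = omega * 2.0 * math.pi * f0 * dt
          return delta + ddelta, omega + domega

      if __name__ == "__main__":
          delta, omega = 0.5, 0.0           # initial rotor angle and speed deviation
          p_mech, e, v_bus, x, h, d = 0.8, 1.05, 1.0, 0.65, 3.5, 10.0
          dt, t_end = 0.001, 5.0
          for _ in range(int(t_end / dt)):
              delta, omega = swing_equation_step(delta, omega, p_mech, e, v_bus, x, h, d, dt)
          # equilibrium satisfies sin(delta) = P_mech * X / (E * V)
          print(f"rotor angle settles near {math.degrees(delta):.1f} degrees")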

  7. An ocular biomechanic model for dynamic simulation of different eye movements.

    Science.gov (United States)

    Iskander, J; Hossny, M; Nahavandi, S; Del Porto, L

    2018-04-11

    Simulating and analysing eye movement is useful for assessing visual system contribution to discomfort with respect to body movements, especially in virtual environments where simulation sickness might occur. It can also be used in the design of eye prosthesis or humanoid robot eye. In this paper, we present two biomechanic ocular models that are easily integrated into the available musculoskeletal models. The model was previously used to simulate eye-head coordination. The models are used to simulate and analyse eye movements. The proposed models are based on physiological and kinematic properties of the human eye. They incorporate an eye-globe, orbital suspension tissues and six muscles with their connective tissues (pulleys). Pulleys were incorporated in rectus and inferior oblique muscles. The two proposed models are the passive pulleys and the active pulleys models. Dynamic simulations of different eye movements, including fixation, saccade and smooth pursuit, are performed to validate both models. The resultant force-length curves of the models were similar to the experimental data. The simulation results show that the proposed models are suitable to generate eye movement simulations with results comparable to other musculoskeletal models. The maximum kinematic root mean square error (RMSE) is 5.68° and 4.35° for the passive and active pulley models, respectively. The analysis of the muscle forces showed realistic muscle activation with increased muscle synergy in the active pulley model. Copyright © 2018 Elsevier Ltd. All rights reserved.

  8. Qualification of a Plant Disease Simulation Model: Performance of the LATEBLIGHT Model Across a Broad Range of Environments.

    Science.gov (United States)

    Andrade-Piedra, Jorge L; Forbes, Gregory A; Shtienberg, Dani; Grünwald, Niklaus J; Chacón, María G; Taipe, Marco V; Hijmans, Robert J; Fry, William E

    2005-12-01

    The concept of model qualification, i.e., discovering the domain over which a validated model may be properly used, was illustrated with LATEBLIGHT, a mathematical model that simulates the effect of weather, host growth and resistance, and fungicide use on asexual development and growth of Phytophthora infestans on potato foliage. Late blight epidemics from Ecuador, Mexico, Israel, and the United States involving 13 potato cultivars (32 epidemics in total) were compared with model predictions using graphical and statistical tests. Fungicides were not applied in any of the epidemics. For the simulations, a host resistance level was assigned to each cultivar based on general categories reported by local investigators. For eight cultivars, the model predictions fit the observed data. For four cultivars, the model predictions overestimated disease, likely due to inaccurate estimates of host resistance. Model predictions were inconsistent for one cultivar and for one location. It was concluded that the domain of applicability of LATEBLIGHT can be extended from the range of conditions in Peru for which it has been previously validated to those observed in this study. A sensitivity analysis showed that, within the range of values observed empirically, LATEBLIGHT is more sensitive to changes in variables related to initial inoculum and to weather than to changes in variables relating to host resistance.

  9. Human performance modeling for system of systems analytics :soldier fatigue.

    Energy Technology Data Exchange (ETDEWEB)

    Lawton, Craig R.; Campbell, James E.; Miller, Dwight Peter

    2005-10-01

    The military has identified Human Performance Modeling (HPM) as a significant requirement and challenge of future systems modeling and analysis initiatives as can be seen in the Department of Defense's (DoD) Defense Modeling and Simulation Office's (DMSO) Master Plan (DoD 5000.59-P 1995). To this goal, the military is currently spending millions of dollars on programs devoted to HPM in various military contexts. Examples include the Human Performance Modeling Integration (HPMI) program within the Air Force Research Laboratory, which focuses on integrating HPMs with constructive models of systems (e.g. cockpit simulations) and the Navy's Human Performance Center (HPC) established in September 2003. Nearly all of these initiatives focus on the interface between humans and a single system. This is insufficient in the era of highly complex network centric SoS. This report presents research and development in the area of HPM in a system-of-systems (SoS). Specifically, this report addresses modeling soldier fatigue and the potential impacts soldier fatigue can have on SoS performance.

  10. Development and Integration of an Advanced Stirling Convertor Linear Alternator Model for a Tool Simulating Convertor Performance and Creating Phasor Diagrams

    Science.gov (United States)

    Metscher, Jonathan F.; Lewandowski, Edward J.

    2013-01-01

    A simple model of the Advanced Stirling Convertor (ASC) linear alternator and an AC bus controller has been developed and combined with a previously developed thermodynamic model of the convertor for a more complete simulation and analysis of the system performance. The model was developed using Sage, a 1-D thermodynamic modeling program that now includes electro-magnetic components. The convertor, consisting of a free-piston Stirling engine combined with a linear alternator, has sufficiently sinusoidal steady-state behavior to allow for phasor analysis of the forces and voltages acting in the system. A MATLAB graphical user interface (GUI) has been developed to interface with the Sage software for simplified use of the ASC model, calculation of forces, and automated creation of phasor diagrams. The GUI allows the user to vary convertor parameters while fixing different input or output parameters and observe the effect on the phasor diagrams or system performance. The new ASC model and GUI help create a better understanding of the relationship between the electrical component voltages and mechanical forces. This allows better insight into the overall convertor dynamics and performance.

  11. SLC positron source: Simulation and performance

    International Nuclear Information System (INIS)

    Pitthan, R.; Braun, H.; Clendenin, J.E.; Ecklund, S.D.; Helm, R.H.; Kulikov, A.V.; Odian, A.C.; Pei, G.X.; Ross, M.C.; Woodley, M.D.

    1991-06-01

    Performance of the source was found to be in good general agreement with computer simulations with S-band acceleration; where it was not, the simulations led to the identification of problems, in particular the underestimated impact of linac misalignments due to the 1989 Loma Prieta earthquake. 13 refs., 7 figs.

  12. Notes on modeling and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Redondo, Antonio [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-03-10

    These notes present a high-level overview of how modeling and simulation are carried out by practitioners. The discussion is of a general nature; no specific techniques are examined but the activities associated with all modeling and simulation approaches are briefly addressed. There is also a discussion of validation and verification and, at the end, a section on why modeling and simulation are useful.

  13. Comparison between the performance of some KEK-klystrons and simulation results

    Energy Technology Data Exchange (ETDEWEB)

    Fukuda, Shigeki [National Lab. for High Energy Physics, Tsukuba, Ibaraki (Japan)

    1997-04-01

    Recent developments in various klystron simulation codes have enabled us to design klystrons realistically. This paper presents various simulation results using the FCI code and the performance of tubes manufactured based on this code. Upgrading a 30-MW S-band klystron and developing a 50-MW S-band klystron for the KEKB project are successful examples based on FCI-code predictions. Mass production of these tubes has already started. On the other hand, discrepancies have been found between the FCI simulation results and the performance of real tubes: in some cases the simulation predicts high efficiency, while the manufactured tubes show the usual, or a lower, efficiency. One possible cause may be a data mismatch between the electron-gun simulation and the input data set of the FCI code for the gun region. This kind of discrepancy has been observed in 30-MW S-band pulsed tubes, sub-booster pulsed tubes and L-band high-duty pulsed klystrons. Sometimes JPNDSK (a one-dimensional disk-model code) gives similar results. Some examples using the FCI code are given in this article. The Arsenal-MSU code was also applied to the 50-MW klystron in collaboration with Moscow State University; good agreement has been found between the predictions of the code and the measured performance. (author)

  14. Impact of a function-based payment model on the financial performance of acute inpatient medical rehabilitation providers: a simulation analysis.

    Science.gov (United States)

    Sutton, J P; DeJong, G; Song, H; Wilkerson, D

    1997-12-01

    To operationalize research findings about a medical rehabilitation classification and payment model by building a prototype of a prospective payment system, and to determine whether this prototype model promotes payment equity. This latter objective is accomplished by identifying whether any facility or payment model characteristics are systematically associated with financial performance. This study was conducted in two phases. In Phase 1 the components of a diagnosis-related group (DRG)-like payment system, including a base rate, function-related group (FRG) weights, and adjusters, were identified and estimated using hospital cost functions. Phase 2 consisted of a simulation analysis in which each facility's financial performance was modeled, based on its 1990-1991 case mix. A multivariate regression analysis was conducted to assess the extent to which characteristics of 42 rehabilitation facilities contribute toward determining financial performance under the present Medicare payment system as well as under the hypothetical model developed. Phase 1 (model development) included 61 rehabilitation hospitals. Approximately 59% were rehabilitation units within a general hospital and 48% were teaching facilities. The number of rehabilitation beds averaged 52. Phase 2 of the simulation analysis included 42 rehabilitation facilities, subscribers to UDS in 1990-1991. Of these, 69% were rehabilitation units and 52% were teaching facilities. The number of rehabilitation beds averaged 48. Financial performance was measured as the ratio of reimbursement to average costs. Case-mix index is the primary determinant of financial performance under the present Medicare payment system. None of the facility characteristics included in this analysis were associated with financial performance under the hypothetical FRG payment model. The most notable impact of an FRG-based payment model would be to create a stronger link between resource intensity and level of reimbursement

  15. Simulation platform to model, optimize and design wind turbines

    Energy Technology Data Exchange (ETDEWEB)

    Iov, F.; Hansen, A.D.; Soerensen, P.; Blaabjerg, F.

    2004-03-01

    farms. The performance of these models has been proven, and they can be directly implemented in different simulation tools. The general conclusions regarding the results achieved during the project are then summarized, and some guidelines for future work are given. A general conclusion is that the main goals of the project have been achieved. Finally, the papers and reports published during the project are presented. (au)

  16. Challenge problem and milestones for : Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC).

    Energy Technology Data Exchange (ETDEWEB)

    Freeze, Geoffrey A.; Wang, Yifeng; Howard, Robert; McNeish, Jerry A.; Schultz, Peter Andrew; Arguello, Jose Guadalupe, Jr.

    2010-09-01

    This report describes the specification of a challenge problem and associated challenge milestones for the Waste Integrated Performance and Safety Codes (IPSC) supporting the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The NEAMS challenge problems are designed to demonstrate proof of concept and progress towards IPSC goals. The goal of the Waste IPSC is to develop an integrated suite of modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with robust verification, validation, and software quality requirements. To demonstrate proof of concept and progress towards these goals and requirements, a Waste IPSC challenge problem is specified that includes coupled thermal-hydrologic-chemical-mechanical (THCM) processes that describe (1) the degradation of a borosilicate glass waste form and the corresponding mobilization of radionuclides (i.e., the processes that produce the radionuclide source term), (2) the associated near-field physical and chemical environment for waste emplacement within a salt formation, and (3) radionuclide transport in the near field (i.e., through the engineered components - waste form, waste package, and backfill - and the immediately adjacent salt). The initial details of a set of challenge milestones that collectively comprise the full challenge problem are also specified.

  17. Challenge problem and milestones for: Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC)

    International Nuclear Information System (INIS)

    Freeze, Geoffrey A.; Wang, Yifeng; Howard, Robert; McNeish, Jerry A.; Schultz, Peter Andrew; Arguello, Jose Guadalupe Jr.

    2010-01-01

    This report describes the specification of a challenge problem and associated challenge milestones for the Waste Integrated Performance and Safety Codes (IPSC) supporting the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The NEAMS challenge problems are designed to demonstrate proof of concept and progress towards IPSC goals. The goal of the Waste IPSC is to develop an integrated suite of modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with robust verification, validation, and software quality requirements. To demonstrate proof of concept and progress towards these goals and requirements, a Waste IPSC challenge problem is specified that includes coupled thermal-hydrologic-chemical-mechanical (THCM) processes that describe (1) the degradation of a borosilicate glass waste form and the corresponding mobilization of radionuclides (i.e., the processes that produce the radionuclide source term), (2) the associated near-field physical and chemical environment for waste emplacement within a salt formation, and (3) radionuclide transport in the near field (i.e., through the engineered components - waste form, waste package, and backfill - and the immediately adjacent salt). The initial details of a set of challenge milestones that collectively comprise the full challenge problem are also specified.

  18. High-Performance Beam Simulator for the LANSCE Linac

    International Nuclear Information System (INIS)

    Pang, Xiaoying; Rybarcyk, Lawrence J.; Baily, Scott A.

    2012-01-01

    A high performance multiparticle tracking simulator is currently under development at Los Alamos. The heart of the simulator is based upon the beam dynamics simulation algorithms of the PARMILA code, but implemented in C++ on Graphics Processing Unit (GPU) hardware using NVIDIA's CUDA platform. Linac operating set points are provided to the simulator via the EPICS control system so that changes of the real time linac parameters are tracked and the simulation results updated automatically. This simulator will provide valuable insight into the beam dynamics along a linac in pseudo real-time, especially where direct measurements of the beam properties do not exist. Details regarding the approach, benefits and performance are presented.

  19. Thermal unit availability modeling in a regional simulation model

    International Nuclear Information System (INIS)

    Yamayee, Z.A.; Port, J.; Robinett, W.

    1983-01-01

    The System Analysis Model (SAM), developed under the umbrella of PNUCC's System Analysis Committee, is capable of simulating the operation of a given load/resource scenario. This model employs a Monte-Carlo simulation to incorporate uncertainties. Among the uncertainties modeled is thermal unit availability, both for energy simulations (seasonal) and capacity simulations (hourly). This paper presents the availability modeling in the capacity and energy models. The use of regional and national data in deriving the two availability models, the interaction between the two, and the modifications made to the capacity model to reflect regional practices are presented. A sample problem is presented to show the modification process. Results for modeling a nuclear unit using NERC-GADS data are presented
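
    The Monte-Carlo treatment of thermal unit availability can be illustrated with a minimal two-state (up/down) sampler. The sketch below, with a hypothetical forced outage rate and mean repair time, only shows the kind of hourly availability draws an hourly capacity model relies on; it is not the SAM implementation.

      import random

      def sample_hourly_availability(hours=8760, efor=0.08, mttr_hours=48.0, seed=7):
          """Sample an hourly up/down trace for one thermal unit.

          efor       : equivalent forced outage rate (long-run unavailable fraction)
          mttr_hours : mean time to repair; together these fix the hourly failure rate.
          """
          random.seed(seed)
          repair_rate = 1.0 / mttr_hours
          failure_rate = repair_rate * efor / (1.0 - efor)   # long-run downtime equals efor
          up, trace = True, []
          for _ in range(hours):
              if up and random.random() < failure_rate:
                  up = False
              elif not up and random.random() < repair_rate:
                  up = True
              trace.append(1 if up else 0)
          return trace

      if __name__ == "__main__":
          trace = sample_hourly_availability()
          print("simulated availability:", sum(trace) / len(trace))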

  20. Nuclear reactor core modelling in multifunctional simulators

    International Nuclear Information System (INIS)

    Puska, E.K.

    1999-01-01

    studied to assess the possibilities for using three-dimensional cores in training simulators. The core model results have been compared with the Loviisa WWER-type plant measurement data in steady state and in some transients. Hypothetical control rod withdrawal, ejection and boron dilution transients have been calculated with various three-dimensional core models for the Loviisa WWER-440 core. Several ATWS analyses for the WWER-1000/91 plant have been performed using the three-dimensional core model. In this context, the results of APROS have been compared in detail with the results of the HEXTRAN code. The three-dimensional Olkiluoto BWR-type core model has been used for transient calculation and for severe accident re-criticality studies. The one-dimensional core model is at present used in several plant analyser and training simulator applications and it has been used extensively for safety analyses in the Loviisa WWER-440 plant modernisation project. (orig.)

  1. Nuclear reactor core modelling in multifunctional simulators

    Energy Technology Data Exchange (ETDEWEB)

    Puska, E.K. [VTT Energy, Nuclear Energy, Espoo (Finland)

    1999-06-01

    studied to assess the possibilities for using three-dimensional cores in training simulators. The core model results have been compared with the Loviisa WWER-type plant measurement data in steady state and in some transients. Hypothetical control rod withdrawal, ejection and boron dilution transients have been calculated with various three-dimensional core models for the Loviisa WWER-440 core. Several ATWS analyses for the WWER-1000/91 plant have been performed using the three-dimensional core model. In this context, the results of APROS have been compared in detail with the results of the HEXTRAN code. The three-dimensional Olkiluoto BWR-type core model has been used for transient calculation and for severe accident re-criticality studies. The one-dimensional core model is at present used in several plant analyser and training simulator applications and it has been used extensively for safety analyses in the Loviisa WWER-440 plant modernisation project. (orig.) 75 refs. The thesis includes also eight previous publications by author

  2. Ravenscar Computational Model compliant AADL Simulation on LEON2

    Directory of Open Access Journals (Sweden)

    Roberto Varona-Gómez

    2013-02-01

    AADL has been proposed for designing and analyzing SW and HW architectures for real-time mission-critical embedded systems. Although the Behavioral Annex improves its simulation semantics, AADL is a language for analyzing architectures and not for simulating them. AADS-T is an AADL simulation tool that supports the performance analysis of the AADL specification throughout the refinement process from the initial system architecture until the complete, detailed application and execution platform are developed. In this way, AADS-T enables the verification of the initial timing constraints during the complete design process. In this paper we focus on the compatibility of AADS-T with the Ravenscar Computational Model (RCM) as part of the TASTE toolset. Its flexibility enables AADS-T to support different processors. In this work we have focused on performing the simulation on a LEON2 processor.

  3. ExaSAT: An exascale co-design tool for performance modeling

    International Nuclear Information System (INIS)

    Unat, Didem; Chan, Cy; Zhang, Weiqun; Williams, Samuel; Bachan, John

    2015-01-01

    One of the emerging challenges to designing HPC systems is understanding and projecting the requirements of exascale applications. In order to determine the performance consequences of different hardware designs, analytic models are essential because they can provide fast feedback to the co-design centers and chip designers without costly simulations. However, current attempts to analytically model program performance typically rely on the user manually specifying a performance model. Here we introduce the ExaSAT framework that automates the extraction of parameterized performance models directly from source code using compiler analysis. The parameterized analytic model enables quantitative evaluation of a broad range of hardware design trade-offs and software optimizations on a variety of different performance metrics, with a primary focus on data movement as a metric. Finally, we demonstrate the ExaSAT framework’s ability to perform deep code analysis of a proxy application from the Department of Energy Combustion Co-design Center to illustrate its value to the exascale co-design process. ExaSAT analysis provides insights into the hardware and software trade-offs and lays the groundwork for exploring a more targeted set of design points using cycle-accurate architectural simulators.
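
    The flavour of parameterized analytic model described above can be sketched in a few lines: predicted kernel time is bounded by the larger of its compute time and its data-movement time. The roofline-style estimate below is a generic illustration with made-up kernel and machine parameters, not the ExaSAT model itself.

      from dataclasses import dataclass

      @dataclass
      class Kernel:
          flops: float          # floating-point operations per sweep
          bytes_moved: float    # DRAM traffic per sweep (bytes)

      @dataclass
      class Machine:
          peak_flops: float     # flop/s
          bandwidth: float      # bytes/s

      def predicted_time(kernel: Kernel, machine: Machine) -> float:
          """Simple bound: the kernel can go no faster than its compute time
          or its data-movement time, whichever is larger."""
          t_compute = kernel.flops / machine.peak_flops
          t_memory = kernel.bytes_moved / machine.bandwidth
          return max(t_compute, t_memory)

      if __name__ == "__main__":
          stencil = Kernel(flops=8e9, bytes_moved=6.4e9)    # hypothetical stencil sweep
          node = Machine(peak_flops=1e12, bandwidth=2e11)   # hypothetical compute node
          t = predicted_time(stencil, node)
          bound = "memory" if stencil.bytes_moved / node.bandwidth >= stencil.flops / node.peak_flops else "compute"
          print(f"predicted sweep time {t*1e3:.1f} ms ({bound}-bound)")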

  4. Aircraft Performance for Open Air Traffic Simulations

    NARCIS (Netherlands)

    Metz, I.C.; Hoekstra, J.M.; Ellerbroek, J.; Kugler, D.

    2016-01-01

    The BlueSky Open Air Traffic Simulator developed by the Control & Simulation section of TU Delft aims at supporting research for analysing Air Traffic Management concepts by providing an open source simulation platform. The goal of this study was to complement BlueSky with aircraft performance

  5. Flight Testing an Iced Business Jet for Flight Simulation Model Validation

    Science.gov (United States)

    Ratvasky, Thomas P.; Barnhart, Billy P.; Lee, Sam; Cooper, Jon

    2007-01-01

    A flight test of a business jet aircraft with various ice accretions was performed to obtain data to validate flight simulation models developed through wind tunnel tests. Three types of ice accretions were tested: pre-activation roughness, runback shapes that form downstream of the thermal wing ice protection system, and a wing ice protection system failure shape. The high fidelity flight simulation models of this business jet aircraft were validated using a software tool called "Overdrive." Through comparisons of flight-extracted aerodynamic forces and moments to simulation-predicted forces and moments, the simulation models were successfully validated. Only minor adjustments in the simulation database were required to obtain adequate match, signifying the process used to develop the simulation models was successful. The simulation models were implemented in the NASA Ice Contamination Effects Flight Training Device (ICEFTD) to enable company pilots to evaluate flight characteristics of the simulation models. By and large, the pilots confirmed good similarities in the flight characteristics when compared to the real airplane. However, pilots noted pitch up tendencies at stall with the flaps extended that were not representative of the airplane and identified some differences in pilot forces. The elevator hinge moment model and implementation of the control forces on the ICEFTD were identified as a driver in the pitch ups and control force issues, and will be an area for future work.

  6. Modeling and Performance Analysis of Manufacturing Systems in ...

    African Journals Online (AJOL)

    Modeling and Performance Analysis of Manufacturing Systems in Footwear Industry. ... researcher to experiment with different variables and controls the manufacturing process ... In this study Arena simulation software is employed to model and measure ...

  7. Conversion of a mainframe simulation for maintenance performance to a PC environment

    International Nuclear Information System (INIS)

    Gertman, D.I.

    1990-01-01

    The computer model MAPPS, the Maintenance Personnel Performance Simulation, has been developed and validated by the US NRC [Nuclear Regulatory Commission] in order to improve maintenance practices and procedures at nuclear power plants. This model has now been implemented and improved in a PC [personal computer] environment and renamed MICROMAPPS. The model is stochastically based, and users are able to simulate the performance of 2- to 8-person crews for a variety of maintenance tasks under a variety of conditions. These conditions include aspects of crew actions as potentially influenced by the task, the environment, or the personnel involved. For example, the influence of the following factors is currently modeled within the MAPPS computer code: (1) personnel characteristics, including but not limited to intellectual and perceptual motor ability levels, the effects of fatigue and, conversely, of rest breaks on performance, stress, communication, supervisor acceptance, motivation, organizational climate, time since the task was last performed, and the staffing level available; (2) task variables, including but not limited to time allowed, occurrence of shift change, intellectual requirements, perceptual motor requirements, procedures quality, necessity for protective clothing, and essentiality of a subtask; and (3) environment variables, including temperature of the workplace, radiation level, and noise levels. The output describing maintainer performance includes subtask and task identification, success proportion, work and wait durations, time spent repeating various subtasks, and outcome in terms of errors detected by the crew, false alarms, undetected errors, duration, and the probability of success. The model is comprehensive and allows for the modeling of decision making, trouble-shooting, and branching of tasks

  8. Lithium-ion Battery Electrothermal Model, Parameter Estimation, and Simulation Environment

    Directory of Open Access Journals (Sweden)

    Simone Orcioni

    2017-03-01

    The market for lithium-ion batteries is growing exponentially. The performance of battery cells is growing due to improving production technology, but market request is growing even more rapidly. Modeling and characterization of single cells and an efficient simulation environment is fundamental for the development of an efficient battery management system. The present work is devoted to defining a novel lumped electrothermal circuit of a single battery cell, the extraction procedure of the parameters of the single cell from experiments, and a simulation environment in SystemC-WMS for the simulation of a battery pack. The electrothermal model of the cell was validated against experimental measurements obtained in a climatic chamber. The model is then used to simulate a 48-cell battery, allowing statistical variations among parameters. The different behaviors of the cells in terms of state of charge, current, voltage, or heat flow rate can be observed in the results of the simulation environment.
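
    The behaviour captured by a lumped electrothermal cell model can be illustrated with a minimal first-order RC equivalent circuit coupled to a single thermal node. The parameters below (linear open-circuit-voltage curve, series resistance, RC pair, thermal mass) are illustrative placeholders, not the SystemC-WMS model or values extracted in the paper.

      import numpy as np

      def simulate_cell(current_a, dt=1.0, capacity_ah=2.5, r0=0.03, r1=0.02, c1=2000.0,
                        mass_cp=40.0, r_thermal=3.0, t_amb=25.0):
          """First-order RC equivalent circuit with a simple lumped thermal node.

          current_a : array of discharge currents [A] (positive = discharge)
          Returns terminal voltage, state of charge and cell temperature traces.
          """
          soc, v_rc, temp = 1.0, 0.0, t_amb
          v_out, soc_out, temp_out = [], [], []
          for i in current_a:
              ocv = 3.0 + 1.2 * soc                     # crude linear OCV(SOC) placeholder
              v_rc += dt * (i / c1 - v_rc / (r1 * c1))  # RC branch dynamics
              v_term = ocv - i * r0 - v_rc
              soc -= i * dt / (capacity_ah * 3600.0)
              heat = i * i * r0 + v_rc * v_rc / r1      # ohmic losses [W]
              temp += dt * (heat - (temp - t_amb) / r_thermal) / mass_cp
              v_out.append(v_term); soc_out.append(soc); temp_out.append(temp)
          return np.array(v_out), np.array(soc_out), np.array(temp_out)

      if __name__ == "__main__":
          current = np.full(1800, 2.5)   # 1C discharge for 30 minutes, 1 s steps
          v, soc, temp = simulate_cell(current)
          print(f"final: V={v[-1]:.3f} V, SOC={soc[-1]:.2f}, T={temp[-1]:.1f} C")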

  9. Parallel Beam Dynamics Simulation Tools for Future Light Source Linac Modeling

    International Nuclear Information System (INIS)

    Qiang, Ji; Pogorelov, Ilya v.; Ryne, Robert D.

    2007-01-01

    Large-scale modeling on parallel computers is playing an increasingly important role in the design of future light sources. Such modeling provides a means to accurately and efficiently explore issues such as limits to beam brightness, emittance preservation, the growth of instabilities, etc. Recently the IMPACT code suite was enhanced to be applicable to future light source design. Simulations with IMPACT-Z were performed using up to one billion simulation particles for the main linac of a future light source to study the microbunching instability. Combined with the time domain code IMPACT-T, it is now possible to perform large-scale start-to-end linac simulations for future light sources, including the injector, main linac, chicanes, and transfer lines. In this paper we provide an overview of the IMPACT code suite, its key capabilities, and recent enhancements pertinent to accelerator modeling for future linac-based light sources

  10. General introduction to simulation models

    DEFF Research Database (Denmark)

    Hisham Beshara Halasa, Tariq; Boklund, Anette

    2012-01-01

    Monte Carlo simulation can be defined as a representation of real life systems to gain insight into their functions and to investigate the effects of alternative conditions or actions on the modeled system. Models are a simplification of a system. Most often, it is best to use experiments and field trials to investigate the effect of alternative conditions or actions on a specific system. Nonetheless, field trials are expensive and sometimes not possible to conduct, as in the case of foot-and-mouth disease (FMD). Instead, simulation models can be a good and cheap substitute for experiments and field trials. However, if simulation models are to be used, good quality input data must be available. To model FMD, several disease spread models are available. For this project, we chose three simulation models: Davis Animal Disease Spread (DADS), which has been upgraded to DTU-DADS, InterSpread Plus (ISP) ...

  11. Use of high performance networks and supercomputers for real-time flight simulation

    Science.gov (United States)

    Cleveland, Jeff I., II

    1993-01-01

    In order to meet the stringent time-critical requirements for real-time man-in-the-loop flight simulation, computer processing operations must be consistent in processing time and be completed in as short a time as possible. These operations include simulation mathematical model computation and data input/output to the simulators. In 1986, in response to increased demands for flight simulation performance, NASA's Langley Research Center (LaRC), working with the contractor, developed extensions to the Computer Automated Measurement and Control (CAMAC) technology which resulted in a factor of ten increase in the effective bandwidth and reduced latency of modules necessary for simulator communication. This technology extension is being used by more than 80 leading technological developers in the United States, Canada, and Europe. Included among the commercial applications are nuclear process control, power grid analysis, process monitoring, real-time simulation, and radar data acquisition. Personnel at LaRC are completing the development of the use of supercomputers for mathematical model computation to support real-time flight simulation. This includes the development of a real-time operating system and development of specialized software and hardware for the simulator network. This paper describes the data acquisition technology and the development of supercomputing for flight simulation.

  12. Modeling and simulation of gamma camera

    International Nuclear Information System (INIS)

    Singh, B.; Kataria, S.K.; Samuel, A.M.

    2002-08-01

    Simulation techniques play a vital role in the design of sophisticated instruments and also in the training of operating and maintenance staff. Gamma camera systems have been used for functional imaging in nuclear medicine. Functional images are derived from external counting of a gamma-emitting radioactive tracer that, after introduction into the body, mimics the behavior of a native biochemical compound. The position-sensitive detector yields the coordinates of the gamma ray interaction with the detector, which are used to estimate the point of gamma ray emission within the tracer distribution space. This advanced imaging device is thus dependent on the performance of algorithms for coordinate computation, estimation of the point of emission, generation of the image and display of the image data. Contemporary systems also have protocols for quality control and clinical evaluation of imaging studies. Simulation of this processing leads to an understanding of basic camera design problems. This report describes a PC-based package for the design and simulation of a gamma camera, along with options for simulating data acquisition and quality control of imaging studies. Image display and data processing, the other options implemented in SIMCAM, will be described in separate reports (under preparation). Gamma camera modeling and simulation in SIMCAM has preset configurations of the design parameters for various sizes of crystal detector, with the option to pack the PMTs on a hexagonal or square lattice. Different algorithms for computation of coordinates and spatial distortion removal are allowed, in addition to simulation of the energy correction circuit. The user can simulate different static, dynamic, MUGA and SPECT studies. The acquired/simulated data are processed for quality control and clinical evaluation of the imaging studies. Results show that the program can be used to assess these performances. Also the variations in performance parameters can be assessed due to the induced
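
    The coordinate-computation step mentioned above is classically performed with Anger (centre-of-gravity) logic over the photomultiplier signals. The sketch below is a generic, hedged illustration of that weighting scheme on a toy square PMT layout with a Gaussian light spread; it is not the SIMCAM implementation.

      import numpy as np

      def anger_position(pmt_xy, pmt_signals):
          """Centre-of-gravity (Anger logic) estimate of the interaction position.

          pmt_xy      : (N, 2) array of PMT centre coordinates [cm]
          pmt_signals : (N,) array of PMT pulse amplitudes
          """
          weights = pmt_signals / pmt_signals.sum()
          return weights @ pmt_xy

      if __name__ == "__main__":
          rng = np.random.default_rng(3)
          # Toy 5 x 5 square PMT layout with 6 cm pitch.
          xs, ys = np.meshgrid(np.arange(5) * 6.0, np.arange(5) * 6.0)
          pmt_xy = np.column_stack([xs.ravel(), ys.ravel()])
          true_xy = np.array([13.0, 9.5])                       # simulated scintillation point
          dist2 = ((pmt_xy - true_xy) ** 2).sum(axis=1)
          signals = np.exp(-dist2 / (2 * 5.0 ** 2))             # light spread onto the PMTs
          signals = np.clip(signals + rng.normal(0.0, 0.01, signals.shape), 0.0, None)  # noise
          est = anger_position(pmt_xy, signals)
          print("estimated position:", est.round(2), "true:", true_xy)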

  13. How model and input uncertainty impact maize yield simulations in West Africa

    Science.gov (United States)

    Waha, Katharina; Huth, Neil; Carberry, Peter; Wang, Enli

    2015-02-01

    Crop models are common tools for simulating crop yields and crop production in studies on food security and global change. Various uncertainties, however, exist not only in the model design and model parameters, but also, and perhaps even more importantly, in the soil, climate and management input data. We analyze the performance of the point-scale crop model APSIM and the global-scale crop model LPJmL under different climate and soil conditions and different agricultural management in the low-input maize-growing areas of Burkina Faso, West Africa. We test the models' response to different levels of input information, from little to detailed information on soil, climate (1961-2000) and agricultural management, and compare the models' ability to represent the observed spatial (between locations) and temporal (between years) variability in crop yields. We found that the resolution of the soil, climate and management information influences the simulated crop yields in both models. However, the difference between models is larger than between input datasets, and larger between simulations with different climate and management information than between simulations with different soil information. The observed spatial variability can be represented well by both models even with little information on soils and management, but APSIM simulates a higher variation between single locations than LPJmL. The agreement of simulated and observed temporal variability is lower due to non-climatic factors, e.g. investment in agricultural research and development between 1987 and 1991 in Burkina Faso, which resulted in a doubling of maize yields. The findings of our study highlight the importance of scale and model choice and show that the most detailed input data do not necessarily improve model performance.

  14. Modelling the climate of the last millennium: what causes the differences between simulations?

    NARCIS (Netherlands)

    Goosse, H.; Crowley, T.J.; Zorita, E.; Ammann, C.M.; Renssen, H.; Driesschaert, E.

    2005-01-01

    An ensemble of simulations performed with a coarse resolution 3-D climate model driven by various combinations of external forcing is used to investigate possible causes for differences noticed in two recent simulations of the climate of the past millennium using General Circulation Models (GCMs).

  15. Real-time volumetric deformable models for surgery simulation using finite elements and condensation

    DEFF Research Database (Denmark)

    Bro-Nielsen, Morten; Cotin, S.

    1996-01-01

    This paper discusses the application of 3D solid volumetric Finite Element models to surgery simulation. In particular it introduces three new ideas for solving the problem of achieving real-time performance for these models. The simulation system we have developed is described and we demonstrate...
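
    The condensation referred to in the title is static condensation: interior degrees of freedom of the finite element system are eliminated off-line so that only surface nodes need to be solved at run time. The sketch below is a hedged linear-algebra illustration of that Schur-complement step on a random symmetric positive definite system, not the authors' surgery-simulation code.

      import numpy as np

      def condense(K, f, surface_idx):
          """Statically condense a linear system K u = f onto the surface DOFs.

          Returns (K_c, f_c) such that K_c u_s = f_c gives the surface solution,
          with K_c = K_ss - K_si K_ii^{-1} K_is (the Schur complement).
          """
          n = K.shape[0]
          interior_idx = np.setdiff1d(np.arange(n), surface_idx)
          K_ss = K[np.ix_(surface_idx, surface_idx)]
          K_si = K[np.ix_(surface_idx, interior_idx)]
          K_ii = K[np.ix_(interior_idx, interior_idx)]
          K_is = K[np.ix_(interior_idx, surface_idx)]
          K_ii_inv = np.linalg.inv(K_ii)               # precomputed once, off-line
          K_c = K_ss - K_si @ K_ii_inv @ K_is
          f_c = f[surface_idx] - K_si @ K_ii_inv @ f[interior_idx]
          return K_c, f_c

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          A = rng.standard_normal((12, 12))
          K = A @ A.T + 12 * np.eye(12)                # symmetric positive definite "stiffness"
          f = rng.standard_normal(12)
          surface = np.array([0, 1, 2, 3])
          K_c, f_c = condense(K, f, surface)
          u_full = np.linalg.solve(K, f)
          u_surface = np.linalg.solve(K_c, f_c)
          print("max difference on surface DOFs:", np.abs(u_full[surface] - u_surface).max())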

  16. [Preparation of simulate craniocerebral models via three dimensional printing technique].

    Science.gov (United States)

    Lan, Q; Chen, A L; Zhang, T; Zhu, Q; Xu, T

    2016-08-09

    A three-dimensional (3D) printing technique was used to prepare simulated craniocerebral models, which were applied to preoperative planning and surgical simulation. The image data were collected from a PACS system. Image data of the skull bone, brain tissue and tumors, cerebral arteries and aneurysms, and functional regions and related neural tracts of the brain were extracted from thin-slice computed tomography (CT, slice thickness 0.5 mm), magnetic resonance imaging (MRI, slice thickness 1 mm), computed tomography angiography (CTA), and functional magnetic resonance imaging (fMRI) data, respectively. MIMICS software was applied to reconstruct colored virtual models by identifying and differentiating tissues according to their gray scales. The colored virtual models were then submitted to a 3D printer, which produced life-sized craniocerebral models for surgical planning and surgical simulation. 3D-printed craniocerebral models allowed neurosurgeons to perform complex procedures in specific clinical cases through detailed surgical planning. They offered great convenience for evaluating the size of the spatial fissure of the sellar region before surgery, which helped to optimize surgical approach planning. These 3D models also provided detailed information about the location of aneurysms and their parent arteries, which helped surgeons to choose appropriate aneurysm clips as well as perform surgical simulation. The models further gave clear indications of the depth and extent of tumors and their relationship to eloquent cortical areas and adjacent neural tracts, which helped avoid surgical damage to important neural structures. As a novel and promising technique, the application of 3D-printed craniocerebral models could improve surgical planning by converting virtual visualization into real life-sized models. It also contributes to the study of functional anatomy.

  17. A comparison of single column model simulations of summertime midlatitude continental convection

    Energy Technology Data Exchange (ETDEWEB)

    Ghan, Steven [Pacific Northwest National Laboratory, Richland, Washington (United States); Randall, David [Department of Atmospospheric Science, Colorado State University, Fort Collins, Colorado (United States); Xu, Kuan-Man [Department of Atmospospheric Science, Colorado State University, Fort Collins, Colorado (United States); Cederwall, Richard [Lawrence Livermore National Laboratory, Livermore, California (United States); Cripe, Douglas [Department of Atmospospheric Science, Colorado State University, Fort Collins, Colorado (United States); Hack, James [National Center for Atmospheric Research, Boulder, Colorado (United States); Iacobellis, Sam [Scripps Institution of Oceanography, University of California, La Jolla (United States); Klein, Stephen [Geophysical Fluid Dynamics Laboratory, Princeton, New Jersey (United States); Krueger, Steven [Department of Meterology, University of Utah, Salt Lake City, Utah (United States); Lohmann, Ulrike [Department of Physics and Oceanography, Dalhousie University, Halifax, Nova Scotia (Canada)] (and others)

    2000-01-27

    Eleven different single-column models (SCMs) and one cloud ensemble model (CEM) are driven by boundary conditions observed at the Atmospheric Radiation Measurement (ARM) program southern Great Plains site for a 17 day period during the summer of 1995. Comparison of the model simulations reveals common signatures identifiable as products of errors in the boundary conditions. Intermodel differences in the simulated temperature, humidity, cloud, precipitation, and radiative fluxes reflect differences in model resolution or physical parameterizations, although sensitive dependence on initial conditions can also contribute to intermodel differences. All models perform well at times but poorly at others. Although none of the SCM simulations stands out as superior to the others, the simulation by the CEM is in several respects in better agreement with the observations than the simulations by the SCMs. Nudging of the simulated temperature and humidity toward observations generally improves the simulated cloud and radiation fields as well as the simulated temperature and humidity but degrades the precipitation simulation for models with large temperature and humidity biases without nudging. Although some of the intermodel differences have not been explained, others have been identified as model problems that can be or have been corrected as a result of the comparison. (c) 2000 American Geophysical Union.

  18. A comparison of single column model simulations of summertime midlatitude continental convection

    International Nuclear Information System (INIS)

    Ghan, Steven; Randall, David; Xu, Kuan-Man; Cederwall, Richard; Cripe, Douglas; Hack, James; Iacobellis, Sam; Klein, Stephen; Krueger, Steven; Lohmann, Ulrike

    2000-01-01

    Eleven different single-column models (SCMs) and one cloud ensemble model (CEM) are driven by boundary conditions observed at the Atmospheric Radiation Measurement (ARM) program southern Great Plains site for a 17 day period during the summer of 1995. Comparison of the model simulations reveals common signatures identifiable as products of errors in the boundary conditions. Intermodel differences in the simulated temperature, humidity, cloud, precipitation, and radiative fluxes reflect differences in model resolution or physical parameterizations, although sensitive dependence on initial conditions can also contribute to intermodel differences. All models perform well at times but poorly at others. Although none of the SCM simulations stands out as superior to the others, the simulation by the CEM is in several respects in better agreement with the observations than the simulations by the SCMs. Nudging of the simulated temperature and humidity toward observations generally improves the simulated cloud and radiation fields as well as the simulated temperature and humidity but degrades the precipitation simulation for models with large temperature and humidity biases without nudging. Although some of the intermodel differences have not been explained, others have been identified as model problems that can be or have been corrected as a result of the comparison. (c) 2000 American Geophysical Union

  19. Particle tracking in sophisticated CAD models for simulation purposes

    International Nuclear Information System (INIS)

    Sulkimo, J.; Vuoskoski, J.

    1995-01-01

    The transfer of physics detector models from computer aided design systems to physics simulation packages like GEANT suffers from certain limitations. In addition, GEANT is not able to perform particle tracking in CAD models. We describe an application which is able to perform particle tracking in boundary models constructed in CAD systems. The transfer file format used is the new international standard, STEP. The design and implementation of the application was carried out using object-oriented techniques. It will be integrated in the future object-oriented version of GEANT. (orig.)

  20. Particle tracking in sophisticated CAD models for simulation purposes

    Science.gov (United States)

    Sulkimo, J.; Vuoskoski, J.

    1996-02-01

    The transfer of physics detector models from computer aided design systems to physics simulation packages like GEANT suffers from certain limitations. In addition, GEANT is not able to perform particle tracking in CAD models. We describe an application which is able to perform particle tracking in boundary models constructed in CAD systems. The transfer file format used is the new international standard, STEP. The design and implementation of the application was carried out using object-oriented techniques. It will be integrated in the future object-oriented version of GEANT.

  1. Research on Multi Hydrological Models Applicability and Modelling Data Uncertainty Analysis for Flash Flood Simulation in Hilly Area

    Science.gov (United States)

    Ye, L.; Wu, J.; Wang, L.; Song, T.; Ji, R.

    2017-12-01

    Flooding in small-scale watersheds in hilly areas is characterized by short time periods and rapid rise and recession due to complex underlying surfaces, various climate types and the strong effect of human activities. It is almost impossible for a single hydrological model to describe the variation of flooding in both time and space accurately for all catchments in hilly areas, because the hydrological characteristics can vary significantly among catchments. In this study, we compare the performance of 5 hydrological models with varying degrees of complexity for the simulation of flash floods in 14 small-scale watersheds in China, in order to find the relationship between the applicability of the hydrological models and the catchment characteristics. Meanwhile, given the fact that hydrological data are sparse in hilly areas, the effects of precipitation data, DEM resolution and their interaction on the uncertainty of flood simulation are also illustrated. In general, the results showed that the distributed hydrological model (HEC-HMS in this study) performed better than the lumped hydrological models. The Xinanjiang and API models gave good simulations for the humid catchments when long-term and continuous rainfall data are provided. The Dahuofang model can simulate the flood peak well, while its runoff generation module is relatively poor. In addition, the effects of diverse modelling data on the simulations are not simply superposed, and there is a complex interaction effect among different modelling data. Overall, both the catchment hydrological characteristics and the modelling data situation should be taken into consideration in order to choose a suitable hydrological model for flood simulation for small-scale catchments in hilly areas.
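
    When several rainfall-runoff models are compared across catchments in this way, goodness-of-fit is commonly summarised with the Nash-Sutcliffe efficiency. The helper below is a generic, hedged illustration of that metric; the example hydrograph values are made up and are not data from the study.

      import numpy as np

      def nash_sutcliffe(observed, simulated):
          """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 means the model is
          no better than the mean of the observations."""
          observed = np.asarray(observed, dtype=float)
          simulated = np.asarray(simulated, dtype=float)
          return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

      if __name__ == "__main__":
          obs = [5.0, 12.0, 48.0, 95.0, 60.0, 30.0, 14.0, 8.0]   # made-up flood hydrograph [m3/s]
          sim = [6.0, 15.0, 40.0, 88.0, 66.0, 27.0, 15.0, 9.0]   # made-up model output
          print(f"NSE = {nash_sutcliffe(obs, sim):.3f}")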

  2. Performance Evaluation of a PID and a Fuzzy PID Controllers Designed for Controlling a Simulated Quadcopter Rotational Dynamics Model

    Directory of Open Access Journals (Sweden)

    Laith Jasim Saud

    2017-07-01

    This work is concerned with designing two types of controllers, a PID and a Fuzzy PID, to be used for flying and stabilizing a quadcopter. The designed controllers have been tuned, tested, and compared using two performance indices, the Integral Square Error (ISE) and the Integral Absolute Error (IAE), as well as some response characteristics such as the rise time, overshoot, settling time, and steady state error. To test the controllers, a quadcopter mathematical model has been developed. The model concentrates on the rotational dynamics of the quadcopter, i.e. the roll, pitch, and yaw variables. The work has been simulated with MATLAB. To make testing of the simulated model and the controllers more realistic, the test signals have been applied by a user through a joystick interfaced to the computer. The results obtained indicate a general superiority in performance of the Fuzzy PID controller over the PID controller used in this work. This conclusion is based on the following observations: lower ISE for the roll, pitch, and yaw; lower IAE for the roll, pitch, and yaw; lower rise time and settling time for the roll and pitch; and lower settling time for the yaw. Moreover, the FPID gave zero overshoot for the roll, pitch, and yaw, in contrast to the PID case. Both controllers gave zero steady state error, with close rise times, for the yaw. This superiority of the FPID controller is gained because its fuzzy part continuously adapts the parameters of the PID part online.
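
    The performance indices used in the comparison are straightforward to compute from a simulated step response. The sketch below closes a discrete PID loop around a toy second-order plant and reports ISE and IAE; the plant, the gains and the sample time are illustrative assumptions, not the quadcopter model or the tuned controllers of the paper.

      import numpy as np

      def simulate_pid(kp, ki, kd, dt=0.01, t_end=5.0, setpoint=1.0, wn=4.0, zeta=0.3):
          """Discrete PID controlling a generic second-order plant
          y'' + 2*zeta*wn*y' + wn^2*y = wn^2*u, driven to a unit step setpoint."""
          n = int(t_end / dt)
          y = y_dot = integral = prev_err = 0.0
          errors = np.empty(n)
          for k in range(n):
              err = setpoint - y
              integral += err * dt
              derivative = (err - prev_err) / dt if k else 0.0
              u = kp * err + ki * integral + kd * derivative
              y_ddot = wn**2 * (u - y) - 2.0 * zeta * wn * y_dot
              y_dot += y_ddot * dt
              y += y_dot * dt
              prev_err = err
              errors[k] = err
          ise = np.sum(errors**2) * dt            # Integral Square Error
          iae = np.sum(np.abs(errors)) * dt       # Integral Absolute Error
          return ise, iae

      if __name__ == "__main__":
          ise, iae = simulate_pid(kp=2.0, ki=1.5, kd=0.4)
          print(f"ISE = {ise:.3f}, IAE = {iae:.3f}")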

  3. CFD modelling of hydrogen stratification in enclosures: Model validation and application to PAR performance

    Energy Technology Data Exchange (ETDEWEB)

    Hoyes, J.R., E-mail: james.hoyes@hsl.gsi.gov.uk; Ivings, M.J.

    2016-12-15

    Highlights: • The ability of CFD to predict hydrogen stratification phenomena is investigated. • Contrary to expectation, simulations on tetrahedral meshes under-predict mixing. • Simulations on structured meshes give good agreement with experimental data. • CFD model used to investigate the effects of stratification on PAR performance. • Results show stratification can have a significant effect on PAR performance. - Abstract: Computational Fluid Dynamics (CFD) models are maturing into useful tools for supporting safety analyses. This paper investigates the capabilities of CFD models for predicting hydrogen stratification in a containment vessel using data from the NEA/OECD SETH2 MISTRA experiments. Further simulations are then carried out to illustrate the qualitative effects of hydrogen stratification on the performance of Passive Autocatalytic Recombiner (PAR) units. The MISTRA experiments have well-defined initial and boundary conditions which makes them well suited for use in a validation study. Results are presented for the sensitivity to mesh resolution and mesh type. Whilst the predictions are shown to be largely insensitive to the mesh resolution they are surprisingly sensitive to the mesh type. In particular, tetrahedral meshes are found to induce small unphysical convection currents that result in molecular diffusion and turbulent mixing being under-predicted. This behaviour is not unique to the CFD model used here (ANSYS CFX) and furthermore, it may affect simulations run on other non-aligned meshes (meshes that are not aligned perpendicular to gravity), including non-aligned structured meshes. Following existing best practice guidelines can help to identify potential unphysical predictions, but as an additional precaution consideration should be given to using gravity-aligned meshes for modelling stratified flows. CFD simulations of hydrogen recombination in the Becker Technologies THAI facility are presented with high and low PAR positions

  4. High-Performance Computer Modeling of the Cosmos-Iridium Collision

    Energy Technology Data Exchange (ETDEWEB)

    Olivier, S; Cook, K; Fasenfest, B; Jefferson, D; Jiang, M; Leek, J; Levatin, J; Nikolaev, S; Pertica, A; Phillion, D; Springer, K; De Vries, W

    2009-08-28

    This paper describes the application of a new, integrated modeling and simulation framework, encompassing the space situational awareness (SSA) enterprise, to the recent Cosmos-Iridium collision. This framework is based on a flexible, scalable architecture to enable efficient simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel, high-performance computer systems available, for example, at Lawrence Livermore National Laboratory. We will describe the application of this framework to the recent collision of the Cosmos and Iridium satellites, including (1) detailed hydrodynamic modeling of the satellite collision and resulting debris generation, (2) orbital propagation of the simulated debris and analysis of the increased risk to other satellites, (3) calculation of the radar and optical signatures of the simulated debris and modeling of debris detection with space surveillance radar and optical systems, (4) determination of simulated debris orbits from modeled space surveillance observations and analysis of the resulting orbital accuracy, and (5) comparison of these modeling and simulation results with Space Surveillance Network observations. We will also discuss the use of this integrated modeling and simulation framework to analyze the risks and consequences of future satellite collisions and to assess strategies for mitigating or avoiding future incidents, including the addition of new sensor systems, used in conjunction with the Space Surveillance Network, for improving space situational awareness.

  5. Numerical simulation of structure integrated cold storages with the model CST-WM; Numerische Simulation gebaeudeintegrierter Kaeltespeicher mit dem Modell CST-WM

    Energy Technology Data Exchange (ETDEWEB)

    Koppatz, Stefan; Urbaneck, Thorsten; Platzer, Bernd [TU Chemnitz (Germany). Fakultaet Maschinenbau; Kalz, Doreen; Sonntag, Martin [Fraunhofer ISE, Freiburg (Germany). Bereich Energieeffiziente und Solare Kuehlung

    2013-04-15

    Decentralized, structure-integrated cold water storages have only recently become a subject of research in Germany, which is why appropriate system simulation models for representing their thermal performance are missing. The intention of this article is to present the MATLAB model CST-WM, which is adapted to the special requirements of this storage type and thereby differs from existing models. A specific method reduces the programming and computation effort.

  6. Application of air pollution dispersion modeling for source-contribution assessment and model performance evaluation at integrated industrial estate-Pantnagar

    Energy Technology Data Exchange (ETDEWEB)

    Banerjee, T., E-mail: tirthankaronline@gmail.com [Department of Environmental Science, G.B. Pant University of Agriculture and Technology, Pantnagar, U.S. Nagar, Uttarakhand 263 145 (India); Barman, S.C., E-mail: scbarman@yahoo.com [Department of Environmental Monitoring, Indian Institute of Toxicology Research, Post Box No. 80, Mahatma Gandhi Marg, Lucknow-226 001, Uttar Pradesh (India); Srivastava, R.K., E-mail: rajeevsrivastava08@gmail.com [Department of Environmental Science, G.B. Pant University of Agriculture and Technology, Pantnagar, U.S. Nagar, Uttarakhand 263 145 (India)

    2011-04-15

    Source-contribution assessment of the ambient NO₂ concentration was performed at Pantnagar, India through simulation of two urban mathematical dispersion models, namely the Gaussian Finite Line Source Model (GFLSM) and the Industrial Source Complex Model (ISCST-3), and the model performances were evaluated. The principal approaches were development of a comprehensive emission inventory, monitoring of traffic density and regional air quality and, finally, simulation of the urban dispersion models. Initially, 18 industries were found responsible for emission of 39.11 kg/h of NO₂ through 43 elevated stacks. Further, the vehicular emission potential in terms of NO₂ was computed as 7.1 kg/h. Air quality monitoring indicated an annual average NO₂ concentration of 32.6 µg/m³. Finally, GFLSM and ISCST-3 were simulated in conjunction with the developed emission inventories and the existing meteorological conditions. The model simulations indicated that the contributions of NO₂ from industrial and vehicular sources were in the ranges of 45-70% and 9-39%, respectively. Further, statistical analysis revealed satisfactory model performance with an aggregate accuracy of 61.9%. - Research highlights: > Application of dispersion modeling for source-contribution assessment of ambient NO₂. > Inventorization revealed emissions from industry and vehicles of 39.11 and 7.1 kg/h. > GFLSM revealed that vehicular pollution contributes in a range of 9.0-38.6%. > A source contribution of 45-70% was found for industrial emissions through ISCST-3. > Aggregate performance of both models shows good agreement with an accuracy of 61.9%. - Development of industrial and vehicular inventories in terms of ambient NO₂ for model simulation at Pantnagar, India and model validation revealed a satisfactory outcome.
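
    For readers unfamiliar with Gaussian dispersion modelling of the kind underlying GFLSM and ISCST-3, the sketch below implements the textbook Gaussian plume equation for a single elevated point source with ground reflection (Python; the emission rate, wind speed, stack height and dispersion coefficients are illustrative assumptions, and the finite-line-source and multi-stack treatments of the actual models are not reproduced).

    import numpy as np

    def gaussian_plume(q, u, h, y, z, sigma_y, sigma_z):
        # Ground-reflected Gaussian plume concentration (g/m^3) at crosswind
        # distance y and height z, for emission rate q (g/s), wind speed u (m/s)
        # and effective stack height h (m).
        lateral = np.exp(-y**2 / (2.0 * sigma_y**2))
        vertical = (np.exp(-(z - h)**2 / (2.0 * sigma_z**2)) +
                    np.exp(-(z + h)**2 / (2.0 * sigma_z**2)))
        return q / (2.0 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

    # Illustrative numbers only: 1 g/s NO2 source, 30 m stack, 3 m/s wind,
    # dispersion coefficients typical of a neutral atmosphere ~1 km downwind.
    c = gaussian_plume(q=1.0, u=3.0, h=30.0, y=0.0, z=1.5,
                       sigma_y=80.0, sigma_z=40.0)
    print(c * 1e6, "µg/m^3")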

  7. ECONOMIC MODELING STOCKS CONTROL SYSTEM: SIMULATION MODEL

    OpenAIRE

    Климак, М.С.; Войтко, С.В.

    2016-01-01

    Theoretical and applied aspects of the development of simulation models for predicting the optimal development of production systems that create tangible products and services are considered. It is shown that the process of inventory control requires economic and mathematical modeling, in view of the complexity of purely theoretical studies. A simulation model of stock control is proposed that supports management decisions in production logistics.

  8. Strategy and gaps for modeling, simulation, and control of hybrid systems

    Energy Technology Data Exchange (ETDEWEB)

    Rabiti, Cristian [Idaho National Lab. (INL), Idaho Falls, ID (United States); Garcia, Humberto E. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Hovsapian, Rob [Idaho National Lab. (INL), Idaho Falls, ID (United States); Kinoshita, Robert [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mesina, George L. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Bragg-Sitton, Shannon M. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Boardman, Richard D. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-04-01

    The purpose of this report is to establish a strategy for modeling and simulation of candidate hybrid energy systems. Modeling and simulation are necessary to design, evaluate, and optimize the system's technical and economic performance. Accordingly, this report first establishes the simulation requirements for analyzing candidate hybrid systems. Simulation fidelity levels are established based on the temporal scale, real and synthetic data availability or needs, solution accuracy, and the output parameters needed to evaluate case-specific figures of merit. The associated computational and co-simulation resources needed are then established, including physical models when needed, code assembly and integrated solution platforms, mathematical solvers, and data processing. This report first attempts to describe the figures of merit, system requirements, and constraints that are necessary and sufficient to characterize the grid and hybrid systems behavior and market interactions. Loss of Load Probability (LOLP) and the Effective Cost of Energy (ECE), as opposed to the standard Levelized Cost of Electricity (LCOE), are introduced as technical and economic indices for integrated energy system evaluations. Financial assessment methods are subsequently introduced for evaluation of non-traditional, hybrid energy systems. Algorithms for coupled and iterative evaluation of the technical and economic performance are subsequently discussed. This report further defines the modeling objectives, computational tools, solution approaches, and real-time data collection and processing (in some cases using real test units) that will be required to model, co-simulate, and optimize: (a) energy system components (e.g., power generation unit, chemical process, electricity management unit), (b) system domains (e.g., thermal, electrical or chemical energy generation, conversion, and transport), and (c) systems control modules. Co-simulation of complex, tightly coupled
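
    As a rough illustration of one of the indices named above, the sketch below estimates Loss of Load Probability (LOLP) by Monte Carlo sampling of generating-unit availability against an hourly load series (Python; the unit capacities, forced outage rates and load profile are invented for illustration and do not come from the report).

    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical generating units: (capacity in MW, forced outage rate)
    units = [(400.0, 0.05), (300.0, 0.08), (200.0, 0.10), (150.0, 0.10)]

    # Hypothetical hourly load for one year (MW): flat base plus a daily swing
    hours = np.arange(8760)
    load = 600.0 + 150.0 * np.sin(2.0 * np.pi * hours / 24.0)

    def lolp(units, load, n_samples=2000):
        # Fraction of sampled hours in which available capacity < load.
        shortfalls, total = 0, 0
        for _ in range(n_samples):
            # Sample each unit as available (prob. 1 - q) or on forced outage
            avail = sum(cap * (rng.random() > q) for cap, q in units)
            shortfalls += np.count_nonzero(load > avail)
            total += load.size
        return shortfalls / total

    print(f"LOLP ~ {lolp(units, load):.4f}")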

  9. Benchmarking Model Variants in Development of a Hardware-in-the-Loop Simulation System

    Science.gov (United States)

    Aretskin-Hariton, Eliot D.; Zinnecker, Alicia M.; Kratz, Jonathan L.; Culley, Dennis E.; Thomas, George L.

    2016-01-01

    Distributed engine control architecture presents a significant increase in complexity over traditional implementations when viewed from the perspective of system simulation and hardware design and test. Even if the overall function of the control scheme remains the same, the hardware implementation can have a significant effect on the overall system performance due to differences in the creation and flow of data between control elements. A Hardware-in-the-Loop (HIL) simulation system is under development at NASA Glenn Research Center that enables the exploration of these hardware dependent issues. The system is based on, but not limited to, the Commercial Modular Aero-Propulsion System Simulation 40k (C-MAPSS40k). This paper describes the step-by-step conversion from the self-contained baseline model to the hardware in the loop model, and the validation of each step. As the control model hardware fidelity was improved during HIL system development, benchmarking simulations were performed to verify that engine system performance characteristics remained the same. The results demonstrate the goal of the effort; the new HIL configurations have similar functionality and performance compared to the baseline C-MAPSS40k system.

  10. Nonlinear Model Predictive Control of a Cable-Robot-Based Motion Simulator

    DEFF Research Database (Denmark)

    Katliar, Mikhail; Fischer, Joerg; Frison, Gianluca

    2017-01-01

    In this paper we present the implementation of a model-predictive controller (MPC) for real-time control of a cable-robot-based motion simulator. The controller computes control inputs such that a desired acceleration and angular velocity at a defined point in the simulator's cabin are tracked while satisfying constraints imposed by the working space and allowed cable forces of the robot. In order to fully use the simulator capabilities, we propose an approach that includes the motion platform actuation in the MPC model. The tracking performance and computation time of the algorithm are investigated...

  11. Performance simulation of a grid connected photovoltaic power system using TRNSYS 17

    Science.gov (United States)

    Raja Sekhar, Y.; Ganesh, D.; Kumar, A. Suresh; Abraham, Raju; Padmanathan, P.

    2017-11-01

    Energy plays an important role in a country's economic growth. In the current energy scenario, the major problem is that non-renewable energy sources are being depleted faster than they are formed. One prominent solution is to minimize the use of fossil fuels by utilizing renewable energy resources. A photovoltaic system is an efficient option for utilizing the solar energy resource. The electricity output produced by photovoltaic systems depends upon the incident solar radiation. This paper examines the performance simulation of a 200 kW photovoltaic power system at VIT University, Vellore. The main objective of this paper is to correlate the predicted simulation data with the experimental data. The simulation tool used here is TRNSYS. Using TRNSYS modelling, the electricity produced throughout the year can be predicted with the help of the TRNSYS weather station. The deviation of the simulated results from the experimental results varies with the choice of weather station. Results from the field test and the simulation are correlated to attain the maximum performance of the system.

  12. Modeling and Performance Improvement of the Constant Power Regulator Systems in Variable Displacement Axial Piston Pump

    Science.gov (United States)

    Park, Sung Hwan; Lee, Ji Min; Kim, Jong Shik

    2013-01-01

    An irregular performance of a mechanical-type constant power regulator is considered. In order to find the cause of an irregular discharge flow at the cut-off pressure area, modeling and numerical simulations are performed to observe the dynamic behavior of the internal parts of the constant power regulator system for a swashplate-type axial piston pump. The commercial numerical simulation software AMESim is applied to model the mechanical-type regulator with the hydraulic pump and to simulate its performance. The validity of the simulation model of the constant power regulator system is verified by comparing simulation results with experiments. In order to find the cause of the irregular performance of the mechanical-type constant power regulator system, the behavior of main components such as the spool, sleeve, and counterbalance piston is investigated using computer simulation. A shape modification of the counterbalance piston is proposed to improve the undesirable performance of the mechanical-type constant power regulator. The performance improvement is verified by computer simulation using AMESim software. PMID:24282389

  13. Modeling and Performance Improvement of the Constant Power Regulator Systems in Variable Displacement Axial Piston Pump

    Directory of Open Access Journals (Sweden)

    Sung Hwan Park

    2013-01-01

    Full Text Available An irregular performance of a mechanical-type constant power regulator is considered. In order to find the cause of an irregular discharge flow at the cut-off pressure area, modeling and numerical simulations are performed to observe the dynamic behavior of the internal parts of the constant power regulator system for a swashplate-type axial piston pump. The commercial numerical simulation software AMESim is applied to model the mechanical-type regulator with the hydraulic pump and to simulate its performance. The validity of the simulation model of the constant power regulator system is verified by comparing simulation results with experiments. In order to find the cause of the irregular performance of the mechanical-type constant power regulator system, the behavior of main components such as the spool, sleeve, and counterbalance piston is investigated using computer simulation. A shape modification of the counterbalance piston is proposed to improve the undesirable performance of the mechanical-type constant power regulator. The performance improvement is verified by computer simulation using AMESim software.

  14. ADVANCED TECHNIQUES FOR RESERVOIR SIMULATION AND MODELING OF NONCONVENTIONAL WELLS

    Energy Technology Data Exchange (ETDEWEB)

    Louis J. Durlofsky; Khalid Aziz

    2004-08-20

    Nonconventional wells, which include horizontal, deviated, multilateral and ''smart'' wells, offer great potential for the efficient management of oil and gas reservoirs. These wells are able to contact larger regions of the reservoir than conventional wells and can also be used to target isolated hydrocarbon accumulations. The use of nonconventional wells instrumented with downhole inflow control devices allows for even greater flexibility in production. Because nonconventional wells can be very expensive to drill, complete and instrument, it is important to be able to optimize their deployment, which requires the accurate prediction of their performance. However, predictions of nonconventional well performance are often inaccurate. This is likely due to inadequacies in some of the reservoir engineering and reservoir simulation tools used to model and optimize nonconventional well performance. A number of new issues arise in the modeling and optimization of nonconventional wells. For example, the optimal use of downhole inflow control devices has not been addressed for practical problems. In addition, the impact of geological and engineering uncertainty (e.g., valve reliability) has not been previously considered. In order to model and optimize nonconventional wells in different settings, it is essential that the tools be implemented into a general reservoir simulator. This simulator must be sufficiently general and robust and must in addition be linked to a sophisticated well model. Our research under this five year project addressed all of the key areas indicated above. The overall project was divided into three main categories: (1) advanced reservoir simulation techniques for modeling nonconventional wells; (2) improved techniques for computing well productivity (for use in reservoir engineering calculations) and for coupling the well to the simulator (which includes the accurate calculation of well index and the modeling of multiphase flow

  15. Whole-building Hygrothermal Simulation Model

    DEFF Research Database (Denmark)

    Rode, Carsten; Grau, Karl

    2003-01-01

    An existing integrated simulation tool for dynamic thermal simulation of buildings was extended with a transient model for moisture release and uptake in building materials. Validation of the new model was begun with comparison against measurements in an outdoor test cell furnished with single materials. Almost quasi-steady, cyclic experiments were used to compare the indoor humidity variation and the numerical results of the integrated simulation tool with the new moisture model. Except for the case with chipboard as furnishing, the predictions of indoor humidity with the detailed model were...

  16. Mathematical models and numerical simulation in electromagnetism

    CERN Document Server

    Bermúdez, Alfredo; Salgado, Pilar

    2014-01-01

    The book represents a basic support for a master course in electromagnetism oriented to numerical simulation. The main goal of the book is that the reader knows the boundary-value problems of partial differential equations that should be solved in order to perform computer simulation of electromagnetic processes. Moreover it includes a part devoted to electric circuit theory  based on ordinary differential equations. The book is mainly oriented to electric engineering applications, going from the general to the specific, namely, from the full Maxwell’s equations to the particular cases of electrostatics, direct current, magnetostatics and eddy currents models. Apart from standard exercises related to analytical calculus, the book includes some others oriented to real-life applications solved with MaxFEM free simulation software.

  17. Comparative Performance of Four Single Extreme Outlier Discordancy Tests from Monte Carlo Simulations

    Directory of Open Access Journals (Sweden)

    Surendra P. Verma

    2014-01-01

    Full Text Available Using highly precise and accurate Monte Carlo simulations of 20,000,000 replications and 102 independent simulation experiments with extremely low simulation errors and total uncertainties, we evaluated the performance of four single-outlier discordancy tests (Grubbs test N2, Dixon test N8, skewness test N14, and kurtosis test N15) for normal samples of sizes 5 to 20. Statistical contaminations of a single observation resulting from parameters called δ from ±0.1 up to ±20 for modeling the slippage of central tendency, or ε from ±1.1 up to ±200 for slippage of dispersion, as well as no contamination (δ=0 and ε=±1), were simulated. Because of the use of precise and accurate random and normally distributed simulated data, very large numbers of replications, and a large number of independent experiments, this paper presents a novel approach for precise and accurate estimation of the power functions of four popular discordancy tests and, therefore, should not be considered a simple simulation exercise unrelated to probability and statistics. From both the Power of Test criterion proposed by Hayes and Kinsella and the Test Performance Criterion of Barnett and Lewis, Dixon test N8 performs less well than the other three tests. The overall performance of these four tests can be summarized as N2≅N15>N14>N8.
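
    To make the Monte Carlo procedure concrete, the sketch below estimates the power of the single-outlier Grubbs test (N2) for a normal sample contaminated by a shift δ in one observation (Python; the sample size, δ, significance level and replication counts are illustrative and far smaller than those used in the paper, and the critical value is obtained empirically rather than from tables).

    import numpy as np

    rng = np.random.default_rng(0)

    def grubbs_stat(x):
        # Two-sided single-outlier Grubbs statistic: largest absolute deviation
        # from the sample mean, scaled by the sample standard deviation.
        return np.max(np.abs(x - x.mean())) / x.std(ddof=1)

    def grubbs_power(n=10, delta=4.0, alpha=0.05, reps=20000):
        # Empirical (1 - alpha) critical value under the null of no contamination.
        null = np.array([grubbs_stat(rng.standard_normal(n)) for _ in range(reps)])
        crit = np.quantile(null, 1.0 - alpha)
        # Power: fraction of contaminated samples declared discordant, where one
        # observation is shifted by delta (slippage of the central tendency).
        hits = 0
        for _ in range(reps):
            x = rng.standard_normal(n)
            x[0] += delta
            hits += grubbs_stat(x) > crit
        return hits / reps

    print(grubbs_power())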

  18. Used Nuclear Fuel Loading and Structural Performance Under Normal Conditions of Transport - Modeling, Simulation and Experimental Integration RD&D Plan

    Energy Technology Data Exchange (ETDEWEB)

    Adkins, Harold E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2013-04-01

    Under current U.S. Nuclear Regulatory Commission regulation, it is not sufficient for used nuclear fuel (UNF) to simply maintain its integrity during the storage period, it must maintain its integrity in such a way that it can withstand the physical forces of handling and transportation associated with restaging the fuel and moving it to treatment or recycling facilities, or a geologic repository. Hence it is necessary to understand the performance characteristics of aged UNF cladding and ancillary components under loadings stemming from transport initiatives. Researchers would like to demonstrate that enough information, including experimental support and modeling and simulation capabilities, exists to establish a preliminary determination of UNF structural performance under normal conditions of transport (NCT). This research, development and demonstration (RD&D) plan describes a methodology, including development and use of analytical models, to evaluate loading and associated mechanical responses of UNF rods and key structural components. This methodology will be used to provide a preliminary assessment of the performance characteristics of UNF cladding and ancillary components under rail-related NCT loading. The methodology couples modeling and simulation and experimental efforts currently under way within the Used Fuel Disposition Campaign (UFDC). The methodology will involve limited uncertainty quantification in the form of sensitivity evaluations focused around available fuel and ancillary fuel structure properties exclusively. The work includes collecting information via literature review, soliciting input/guidance from subject matter experts, performing computational analyses, planning experimental measurement and possible execution (depending on timing), and preparing a variety of supporting documents that will feed into and provide the basis for future initiatives. The methodology demonstration will focus on structural performance evaluation of

  19. High-performance modeling of CO2 sequestration by coupling reservoir simulation and molecular dynamics

    KAUST Repository

    Bao, Kai; Yan, Mi; Lu, Ligang; Allen, Rebecca; Salam, Amgad; Jordan, Kirk E.; Sun, Shuyu

    2013-01-01

    multicomponent compositional flow simulation to handle more complicated physical process in the future. Accuracy and scalability analysis are performed on an IBM BlueGene/P and on an IBM BlueGene/Q, the latest IBM supercomputer. Results show good accuracy of our

  20. DynaSim: A MATLAB Toolbox for Neural Modeling and Simulation.

    Science.gov (United States)

    Sherfey, Jason S; Soplata, Austin E; Ardid, Salva; Roberts, Erik A; Stanley, David A; Pittman-Polletta, Benjamin R; Kopell, Nancy J

    2018-01-01

    DynaSim is an open-source MATLAB/GNU Octave toolbox for rapid prototyping of neural models and batch simulation management. It is designed to speed up and simplify the process of generating, sharing, and exploring network models of neurons with one or more compartments. Models can be specified by equations directly (similar to XPP or the Brian simulator) or by lists of predefined or custom model components. The higher-level specification supports arbitrarily complex population models and networks of interconnected populations. DynaSim also includes a large set of features that simplify exploring model dynamics over parameter spaces, running simulations in parallel using both multicore processors and high-performance computer clusters, and analyzing and plotting large numbers of simulated data sets in parallel. It also includes a graphical user interface (DynaSim GUI) that supports full functionality without requiring user programming. The software has been implemented in MATLAB to enable advanced neural modeling using MATLAB, given its popularity and a growing interest in modeling neural systems. The design of DynaSim incorporates a novel schema for model specification to facilitate future interoperability with other specifications (e.g., NeuroML, SBML), simulators (e.g., NEURON, Brian, NEST), and web-based applications (e.g., Geppetto) outside MATLAB. DynaSim is freely available at http://dynasimtoolbox.org. This tool promises to reduce barriers for investigating dynamics in large neural models, facilitate collaborative modeling, and complement other tools being developed in the neuroinformatics community.

  1. Modeling and simulation of loss of the ultimate heat sink in a typical material testing reactor

    International Nuclear Information System (INIS)

    El-Khatib, Hisham; El-Morshedy, Salah El-Din; Higazy, Maher G.; El-Shazly, Karam

    2013-01-01

    Highlights: ► A thermal–hydraulic model has been developed to simulate loss of the ultimate heat sink in MTR. ► The model involves three coupled sub-models for core, heat exchanger and cooling tower. ► The model is validated against PARET for steady-state and verified by operation data for transients. ► The model is used to simulate the behavior of the reactor under a loss of the ultimate heat sink. ► The model results are analyzed and discussed. -- Abstract: A thermal–hydraulic model has been developed to simulate loss of the ultimate heat sink in a typical material testing reactor (MTR). The model involves three interactively coupled sub-models for reactor core, heat exchanger and cooling tower. The model is validated against PARET code for steady-state operation and verified by the reactor operation records for transients. Then, the model is used to simulate the thermal–hydraulic behavior of the reactor under a loss of the ultimate heat sink event. The simulation is performed for two operation regimes: regime I representing 11 MW power and three cooling tower cells operated, and regime II representing 22 MW power and six cooling tower cells operated. In regime I, the simulation is performed for 1, 2 and 3 cooling tower cells failed while in regime II, it is performed for 1, 2, 3, 4, 5 and 6 cooling tower cells failed. The simulation is performed under protected conditions where the safety action called power reduction is triggered by reactor protection system to decrease the reactor power by 20% when the coolant inlet temperature to the core reaches 43 °C and scram is triggered if the core inlet temperature reaches 44 °C. The model results are analyzed and discussed.

  2. Improvement of Cycle Dependent Core Model for NPP Simulator

    International Nuclear Information System (INIS)

    Song, J. S.; Koo, B. S.; Kim, H. Y. and others

    2003-11-01

    The purpose of this study is to establish an automatic core model generation system and to develop a four-cycle real-time core analysis methodology meeting 5% power distribution and 500 pcm reactivity difference criteria for a nuclear power plant simulator. A standardized procedure to generate the database from ROCS and ANC, which are used for domestic PWR core design, was established for cycle-specific simulator core model generation. An automatic data interface system to generate the core model was also established. The system includes ARCADIS, which edits group constants, and DHCGEN, which generates the interface coupling coefficient correction database. The interface coupling coefficient correction method developed in this study has four-cycle real-time capability, with accuracies such that the maximum differences from core design results are within 103 pcm in reactivity, 1% in relative power distribution and 6% in control rod worth. A nuclear power plant core simulation program, R-MASTER, was developed using the methodology and applied through the concept of a distributed client system in the simulator. The performance was verified by the site acceptance test on Simulator No. 2 in the Kori Training Center for the generation of 30 initial conditions and for 27 steady-state, transient and postulated accident situations.

  3. Improvement of Cycle Dependent Core Model for NPP Simulator

    Energy Technology Data Exchange (ETDEWEB)

    Song, J. S.; Koo, B. S.; Kim, H. Y. and others

    2003-11-15

    The purpose of this study is to establish an automatic core model generation system and to develop a four-cycle real-time core analysis methodology meeting 5% power distribution and 500 pcm reactivity difference criteria for a nuclear power plant simulator. A standardized procedure to generate the database from ROCS and ANC, which are used for domestic PWR core design, was established for cycle-specific simulator core model generation. An automatic data interface system to generate the core model was also established. The system includes ARCADIS, which edits group constants, and DHCGEN, which generates the interface coupling coefficient correction database. The interface coupling coefficient correction method developed in this study has four-cycle real-time capability, with accuracies such that the maximum differences from core design results are within 103 pcm in reactivity, 1% in relative power distribution and 6% in control rod worth. A nuclear power plant core simulation program, R-MASTER, was developed using the methodology and applied through the concept of a distributed client system in the simulator. The performance was verified by the site acceptance test on Simulator No. 2 in the Kori Training Center for the generation of 30 initial conditions and for 27 steady-state, transient and postulated accident situations.

  4. Characterization uncertainty and its effects on models and performance

    International Nuclear Information System (INIS)

    Rautman, C.A.; Treadway, A.H.

    1991-01-01

    Geostatistical simulation is being used to develop multiple geologic models of rock properties at the proposed Yucca Mountain repository site. Because each replicate model contains the same known information, and is thus essentially indistinguishable statistically from others, the differences between models may be thought of as representing the uncertainty in the site description. The variability among performance measures, such as ground water travel time, calculated using these replicate models therefore quantifies the uncertainty in performance that arises from uncertainty in site characterization

  5. Large-watershed flood simulation and forecasting based on different-resolution distributed hydrological model

    Science.gov (United States)

    Li, J.

    2017-12-01

    Large-watershed flood simulation and forecasting is very important for the application of a distributed hydrological model. There are several challenges, including the effect of the model's spatial resolution and the model's performance and accuracy. To address the spatial resolution effect, different resolutions (1000 m × 1000 m, 600 m × 600 m, 500 m × 500 m, 400 m × 400 m and 200 m × 200 m) were used to build the distributed hydrological model—the Liuxihe model—respectively. The purpose is to find the best resolution for the Liuxihe model in large-watershed flood simulation and forecasting. This study sets up a physically based distributed hydrological model for flood forecasting of the Liujiang River basin in south China. The terrain data (digital elevation model, DEM), soil type and land-use type were downloaded freely from the web. The model parameters are optimized by using an improved Particle Swarm Optimization (PSO) algorithm; parameter optimization reduces the parameter uncertainty that exists when model parameters are derived physically. Model resolutions from 200 m × 200 m to 1000 m × 1000 m are evaluated for modeling the Liujiang River basin flood with the Liuxihe model in this study. The best spatial resolution for flood simulation and forecasting is 200 m × 200 m, and as the spatial resolution is coarsened, the model performance and accuracy deteriorate. When the model resolution is 1000 m × 1000 m, the flood simulation and forecasting results are the worst, and the river channel network derived at this resolution differs from the actual one. To keep the model at an acceptable performance, a minimum spatial resolution is needed. The suggested threshold spatial resolution for modeling the Liujiang River basin flood is a 500 m × 500 m grid cell, but a 200 m × 200 m grid cell is recommended in this study to keep the model at its best performance.
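
    The parameter optimization step mentioned above can be illustrated with a bare-bones global-best particle swarm optimizer (Python; the objective here is a generic synthetic function standing in for a model-calibration objective such as a Nash-Sutcliffe-based error, and all swarm settings are assumptions rather than the improved PSO variant used in the study).

    import numpy as np

    rng = np.random.default_rng(7)

    def objective(params):
        # Stand-in calibration objective to be minimized (e.g., 1 - NSE).
        return float(np.sum((params - np.array([0.3, 1.5, 0.8]))**2))

    def pso(objective, bounds, n_particles=20, n_iter=100, w=0.7, c1=1.5, c2=1.5):
        lo, hi = np.array(bounds).T
        dim = lo.size
        x = rng.uniform(lo, hi, size=(n_particles, dim))        # positions
        v = np.zeros_like(x)                                     # velocities
        pbest, pbest_val = x.copy(), np.array([objective(p) for p in x])
        gbest = pbest[np.argmin(pbest_val)].copy()
        for _ in range(n_iter):
            r1, r2 = rng.random((2, n_particles, dim))
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
            x = np.clip(x + v, lo, hi)                           # keep in bounds
            vals = np.array([objective(p) for p in x])
            improved = vals < pbest_val
            pbest[improved], pbest_val[improved] = x[improved], vals[improved]
            gbest = pbest[np.argmin(pbest_val)].copy()
        return gbest, float(np.min(pbest_val))

    best, best_val = pso(objective, bounds=[(0, 1), (0, 3), (0, 2)])
    print(best, best_val)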

  6. Progress in modeling and simulation.

    Science.gov (United States)

    Kindler, E

    1998-01-01

    For the modeling of systems, computers are used more and more, while the other "media" (including the human intellect) carrying the models are abandoned. For the modeling of knowledge, i.e. of more or less general concepts (possibly used to model systems composed of instances of such concepts), object-oriented programming is nowadays widely used. For the modeling of processes existing and developing in time, computer simulation is used, the results of which are often presented by means of animation (graphical pictures moving and changing in time). Unfortunately, object-oriented programming tools are commonly not designed to be of great use for simulation, while programming tools for simulation do not enable their users to apply the advantages of object-oriented programming. Nevertheless, there are exceptions that enable general concepts represented in a computer to be used for constructing simulation models and for their easy modification. They are described in the present paper, together with precise definitions of modeling, simulation and object-oriented programming (including cases that do not satisfy the definitions but risk introducing misunderstanding), an outline of their applications and of their further development. Given that computing systems are being introduced as control components into a large spectrum of (technological, social and biological) systems, attention is directed to models of systems containing modeling components.

  7. Simulation modelling of fynbos ecosystems: Systems analysis and conceptual models

    CSIR Research Space (South Africa)

    Kruger, FJ

    1985-03-01

    Full Text Available -animal interactions. An additional two models, which expand aspects of the FYNBOS model, are described: a model for simulating canopy processes; and a Fire Recovery Simulator. The canopy process model will simulate ecophysiological processes in more detail than FYNBOS...

  8. Study on driver model for hybrid truck based on driving simulator experimental results

    Directory of Open Access Journals (Sweden)

    Dam Hoang Phuc

    2018-04-01

    Full Text Available In this paper, a proposed car-following driver model, taking into account features of both the compensatory and the anticipatory models of human pedal operation, has been verified by driving simulator experiments with several real drivers. The comparison of computer simulations performed with the identified model parameters against the experimental results confirms the correctness of this mathematical driver model and of the identified parameters. The driver model is then coupled to a hybrid vehicle dynamics model, and moderate car-following manoeuvre simulations with various driver parameters are conducted to investigate the influence of driver parameters on the vehicle dynamic response and fuel economy. Finally, the major driver parameters involved in the longitudinal control performed by drivers are clarified. Keywords: Driver model, Driver-vehicle closed-loop system, Car Following, Driving simulator/hybrid electric vehicle (B1

  9. Diagnostic and model dependent uncertainty of simulated Tibetan permafrost area

    Science.gov (United States)

    Wang, W.; Rinke, A.; Moore, J. C.; Cui, X.; Ji, D.; Li, Q.; Zhang, N.; Wang, C.; Zhang, S.; Lawrence, D. M.; McGuire, A. D.; Zhang, W.; Delire, C.; Koven, C.; Saito, K.; MacDougall, A.; Burke, E.; Decharme, B.

    2016-02-01

    We perform a land-surface model intercomparison to investigate how the simulation of permafrost area on the Tibetan Plateau (TP) varies among six modern stand-alone land-surface models (CLM4.5, CoLM, ISBA, JULES, LPJ-GUESS, UVic). We also examine the variability in simulated permafrost area and distribution introduced by five different methods of diagnosing permafrost (from modeled monthly ground temperature, mean annual ground and air temperatures, air and surface frost indexes). There is good agreement (99 to 135 × 104 km2) between the two diagnostic methods based on air temperature which are also consistent with the observation-based estimate of actual permafrost area (101 × 104 km2). However the uncertainty (1 to 128 × 104 km2) using the three methods that require simulation of ground temperature is much greater. Moreover simulated permafrost distribution on the TP is generally only fair to poor for these three methods (diagnosis of permafrost from monthly, and mean annual ground temperature, and surface frost index), while permafrost distribution using air-temperature-based methods is generally good. Model evaluation at field sites highlights specific problems in process simulations likely related to soil texture specification, vegetation types and snow cover. Models are particularly poor at simulating permafrost distribution using the definition that soil temperature remains at or below 0 °C for 24 consecutive months, which requires reliable simulation of both mean annual ground temperatures and seasonal cycle, and hence is relatively demanding. Although models can produce better permafrost maps using mean annual ground temperature and surface frost index, analysis of simulated soil temperature profiles reveals substantial biases. The current generation of land-surface models need to reduce biases in simulated soil temperature profiles before reliable contemporary permafrost maps and predictions of changes in future permafrost distribution can be made for
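
    The most demanding diagnostic described above (soil remaining at or below 0 °C for 24 consecutive months) is easy to state in code; the sketch below applies it, together with a mean-annual-ground-temperature criterion, to a monthly soil temperature series (Python; the synthetic three-year temperature series is an illustrative assumption, not model output or observations from the study).

    import numpy as np

    def permafrost_by_consecutive_months(t_soil_monthly, n_consecutive=24):
        # True if the soil temperature stays at or below 0 °C for at least
        # n_consecutive months anywhere in the record.
        frozen = np.asarray(t_soil_monthly) <= 0.0
        run = 0
        for f in frozen:
            run = run + 1 if f else 0
            if run >= n_consecutive:
                return True
        return False

    def permafrost_by_magt(t_soil_monthly):
        # True if the mean annual ground temperature is at or below 0 °C.
        return float(np.mean(t_soil_monthly)) <= 0.0

    # Synthetic 3-year monthly series: cold mean with a seasonal cycle
    months = np.arange(36)
    t_soil = -3.0 + 5.0 * np.sin(2.0 * np.pi * months / 12.0)

    print(permafrost_by_consecutive_months(t_soil))   # False: summers thaw
    print(permafrost_by_magt(t_soil))                 # True: MAGT = -3 °C

    The two functions can disagree for the same series, which is one way to see why the choice of diagnostic matters for the simulated permafrost area discussed above.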

  10. Evaluating TCMS Train-to-Ground communication performances based on the LTE technology and discreet event simulations

    DEFF Research Database (Denmark)

    Bouaziz, Maha; Yan, Ying; Kassab, Mohamed

    2018-01-01

    ... (Long Term Evolution) network as an alternative communication technology, instead of GSM-R (Global System for Mobile communications-Railway), because of some capacity and capability limits. In a first step, a pure simulation is used to evaluate the network load for a high-speed scenario, when the LTE network is shared between the train and different passengers. The simulation is based on the discrete-event network simulator Riverbed Modeler. A second step focusses on a co-simulation testbed, to evaluate performances with real traffic based on Hardware-In-The-Loop and OpenAirInterface modules. Preliminary simulation and co-simulation results show that LTE provides good performance for the TCMS traffic exchange in terms of packet delay and data integrity...

  11. Solar power plant performance evaluation: simulation and experimental validation

    International Nuclear Information System (INIS)

    Natsheh, E M; Albarbar, A

    2012-01-01

    In this work the performance of a solar power plant is evaluated based on a developed model comprising a photovoltaic array, battery storage, controller and converters. The model is implemented using the MATLAB/SIMULINK software package. A perturb-and-observe (P&O) algorithm is used for maximizing the generated power through a maximum power point tracker (MPPT) implementation. The outcomes of the developed model are validated and supported by a case study carried out using an operational 28.8 kW grid-connected solar power plant located in central Manchester. Measurements were taken over a 21-month period, using hourly average irradiance and cell temperature. It was found that system degradation could be clearly monitored by determining the residual (the difference) between the output power predicted by the model and the actual measured power. The residual exceeded the healthy threshold, 1.7 kW, due to heavy snow in Manchester last winter. More importantly, the developed performance evaluation technique could be adopted to detect other causes of degraded PV panel performance, such as shading and dirt. Repeatability and reliability of the developed system were validated during this period. Good agreement was achieved between the theoretical simulation and the real-time measurements taken from the online grid-connected solar power plant.
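
    The perturb-and-observe logic referred to above can be sketched in a few lines: perturb the operating voltage, observe whether output power rises, and keep or reverse the perturbation direction accordingly (Python; the PV power-voltage curve below is a simple synthetic stand-in, not the plant model developed in the paper).

    import numpy as np

    def pv_power(v):
        # Toy PV power-voltage curve with a single maximum near 28 V.
        return max(0.0, 8.0 * v * (1.0 - np.exp((v - 36.0) / 4.0)))

    def perturb_and_observe(v0=15.0, dv=0.5, steps=60):
        v, p, direction = v0, pv_power(v0), +1.0
        for _ in range(steps):
            v_new = v + direction * dv          # perturb
            p_new = pv_power(v_new)             # observe
            if p_new < p:                       # power fell: reverse direction
                direction = -direction
            v, p = v_new, p_new
        return v, p

    print(perturb_and_observe())   # settles into oscillation about the maximum power point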

  12. A simple dynamic model and transient simulation of the nuclear power reactor on microcomputers

    Energy Technology Data Exchange (ETDEWEB)

    Han, Yang Gee; Park, Cheol [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1998-12-31

    A simple dynamic model is developed for the transient simulation of a nuclear power reactor. The dynamic model includes a normalized neutron kinetics model with reactivity feedback effects and a core thermal-hydraulics model. The main objective of this paper is to demonstrate the capability of the developed dynamic model to simulate various important variables of interest during a nuclear power reactor transient. Some representative results of transient simulations show the expected trends in all cases, even though no data are available for comparison. In this work transient simulations are performed on a microcomputer using the DESIRE/N96T continuous system simulation language, which is applicable to nuclear power reactor transient analysis. 3 refs., 9 figs. (Author)
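
    A normalized neutron kinetics model with reactivity feedback of the kind described above can be illustrated by one-delayed-group point kinetics coupled to a single fuel-temperature feedback equation (Python rather than the DESIRE/N96T language used in the paper; all constants, the feedback coefficient and the reactivity step are generic textbook-style values, not the parameters of the model in the paper).

    # Generic one-delayed-group point-kinetics constants (illustrative values)
    beta, lam, Lambda = 0.0065, 0.08, 1.0e-4   # delayed fraction, decay const., generation time
    alpha_T = -2.0e-5                           # fuel temperature feedback (1/K)
    h, tau = 50.0, 5.0                          # heating factor (K per unit power), cooling time (s)

    def simulate(rho_ext=0.002, t_end=20.0, dt=1.0e-4):
        # Normalized power, precursor concentration at equilibrium, temperature rise
        n, c, T = 1.0, beta / (lam * Lambda), 0.0
        for _ in range(int(t_end / dt)):
            rho = rho_ext + alpha_T * T             # net reactivity with feedback
            dn = ((rho - beta) / Lambda) * n + lam * c
            dc = (beta / Lambda) * n - lam * c
            dT = h * (n - 1.0) - T / tau            # crude lumped fuel heat balance
            n, c, T = n + dn * dt, c + dc * dt, T + dT * dt
        return n, T

    print(simulate())   # power rises on the reactivity step, then feedback limits it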

  13. A simple dynamic model and transient simulation of the nuclear power reactor on microcomputers

    Energy Technology Data Exchange (ETDEWEB)

    Han, Yang Gee; Park, Cheol [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1997-12-31

    A simple dynamic model is developed for the transient simulation of a nuclear power reactor. The dynamic model includes a normalized neutron kinetics model with reactivity feedback effects and a core thermal-hydraulics model. The main objective of this paper is to demonstrate the capability of the developed dynamic model to simulate various important variables of interest during a nuclear power reactor transient. Some representative results of transient simulations show the expected trends in all cases, even though no data are available for comparison. In this work transient simulations are performed on a microcomputer using the DESIRE/N96T continuous system simulation language, which is applicable to nuclear power reactor transient analysis. 3 refs., 9 figs. (Author)

  14. Performance evaluation of sea surface simulation methods for target detection

    Science.gov (United States)

    Xia, Renjie; Wu, Xin; Yang, Chen; Han, Yiping; Zhang, Jianqi

    2017-11-01

    With the fast development of sea surface target detection by optoelectronic sensors, machine learning has been adopted to improve detection performance. Many features can be learned automatically from training images. However, field images of sea surface targets are not sufficient as training data. 3D scene simulation is a promising method to address this problem. For ocean scene simulation, sea surface height field generation is the key to achieving high fidelity. In this paper, two spectrum-based height field generation methods are evaluated. A comparison between the linear superposition and the linear filter method is made quantitatively with a statistical model. 3D ocean scene simulation results show the different features of the two methods, which can provide a reference for synthesizing sea surface target images under different ocean conditions.

  15. Modeling a Million-Node Slim Fly Network Using Parallel Discrete-Event Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Wolfe, Noah; Carothers, Christopher; Mubarak, Misbah; Ross, Robert; Carns, Philip

    2016-05-15

    As supercomputers close in on exascale performance, the increased number of processors and processing power translates to an increased demand on the underlying network interconnect. The Slim Fly network topology, a new low-diameter and low-latency interconnection network, is gaining interest as one possible solution for next-generation supercomputing interconnect systems. In this paper, we present a high-fidelity Slim Fly flit-level model leveraging the Rensselaer Optimistic Simulation System (ROSS) and Co-Design of Exascale Storage (CODES) frameworks. We validate our Slim Fly model against the Kathareios et al. Slim Fly model results provided at moderately sized network scales. We further scale the model size up to an unprecedented 1 million compute nodes; and through visualization of network simulation metrics such as link bandwidth, packet latency, and port occupancy, we gain insight into the network behavior at the million-node scale. We also show linear strong scaling of the Slim Fly model on an Intel cluster, achieving a peak event rate of 36 million events per second using 128 MPI tasks to process 7 billion events. Detailed analysis of the underlying discrete-event simulation performance shows that a million-node Slim Fly model simulation can execute in 198 seconds on the Intel cluster.

  16. Using Modeling and Simulation to Complement Testing for Increased Understanding of Weapon Subassembly Response.

    Energy Technology Data Exchange (ETDEWEB)

    Wong, Michael K. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Davidson, Megan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-09-01

    As part of Sandia’s nuclear deterrence mission, the B61-12 Life Extension Program (LEP) aims to modernize the aging weapon system. Modernization requires requalification and Sandia is using high performance computing to perform advanced computational simulations to better understand, evaluate, and verify weapon system performance in conjunction with limited physical testing. The Nose Bomb Subassembly (NBSA) of the B61-12 is responsible for producing a fuzing signal upon ground impact. The fuzing signal is dependent upon electromechanical impact sensors producing valid electrical fuzing signals at impact. Computer generated models were used to assess the timing between the impact sensor’s response to the deceleration of impact and damage to major components and system subassemblies. The modeling and simulation team worked alongside the physical test team to design a large-scale reverse ballistic test to not only assess system performance, but to also validate their computational models. The reverse ballistic test conducted at Sandia’s sled test facility sent a rocket sled with a representative target into a stationary B61-12 (NBSA) to characterize the nose crush and functional response of NBSA components. Data obtained from data recorders and high-speed photometrics were integrated with previously generated computer models in order to refine and validate the model’s ability to reliably simulate real-world effects. Large-scale tests are impractical to conduct for every single impact scenario. By creating reliable computer models, we can perform simulations that identify trends and produce estimates of outcomes over the entire range of required impact conditions. Sandia’s HPCs enable geometric resolution that was unachievable before, allowing for more fidelity and detail, and creating simulations that can provide insight to support evaluation of requirements and performance margins. As computing resources continue to improve, researchers at Sandia are hoping

  17. A fire management simulation model using stochastic arrival times

    Science.gov (United States)

    Eric L. Smith

    1987-01-01

    Fire management simulation models are used to predict the impact of changes in the fire management program on fire outcomes. As with all models, the goal is to abstract reality without seriously distorting relationships between variables of interest. One important variable of fire organization performance is the length of time it takes to get suppression units to the...

  18. Integration of control and building performance simulation software by run-time coupling

    NARCIS (Netherlands)

    Yahiaoui, A.; Hensen, J.L.M.; Soethout, L.L.

    2003-01-01

    This paper presents the background, approach and initial results of a project, which aims to achieve better integrated building and systems control modeling in building performance simulation by runtime coupling of distributed computer programs. This paper focuses on one of the essential steps

  19. An Advanced HIL Simulation Battery Model for Battery Management System Testing

    DEFF Research Database (Denmark)

    Barreras, Jorge Varela; Fleischer, Christian; Christensen, Andreas Elkjær

    2016-01-01

    Developers and manufacturers of battery management systems (BMSs) require extensive testing of controller Hardware (HW) and Software (SW), such as analog front-end and performance of generated control code. In comparison with the tests conducted on real batteries, tests conducted on a state-of-the-art hardware-in-the-loop (HIL) simulator can be more cost and time effective, easier to reproduce, and safer beyond the normal range of operation, especially at early stages in the development process or during fault insertion. In this paper, an HIL simulation battery model is developed for purposes of BMS testing on a commercial HIL simulator. A multicell electrothermal Li-ion battery (LIB) model is integrated in a system-level simulation. Then, the LIB system model is converted to C code and run in real time with the HIL simulator. Finally, in order to demonstrate the capabilities of the setup...
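
    As a hint of what a cell model suitable for real-time execution can look like, the sketch below steps a first-order Thevenin equivalent-circuit cell (open-circuit voltage, series resistance and one RC branch) through a constant-current discharge (Python; the paper's multicell electrothermal model is far richer, and every parameter and the OCV curve below are illustrative assumptions).

    import numpy as np

    # Illustrative single-cell equivalent-circuit parameters
    Q_AH, R0, R1, C1 = 2.5, 0.02, 0.015, 2000.0    # capacity (Ah), ohmic R, RC branch

    def ocv(soc):
        # Very rough open-circuit-voltage curve for a Li-ion cell (V).
        return 3.0 + 1.2 * soc - 0.2 * np.exp(-10.0 * soc)

    def simulate_discharge(i_load=2.5, dt=1.0, t_end=1800.0):
        soc, v_rc, log = 1.0, 0.0, []
        for _ in range(int(t_end / dt)):
            soc -= i_load * dt / (Q_AH * 3600.0)            # coulomb counting
            v_rc += dt * (i_load / C1 - v_rc / (R1 * C1))   # RC branch dynamics
            v_term = ocv(soc) - R0 * i_load - v_rc          # terminal voltage
            log.append((soc, v_term))
        return log

    traj = simulate_discharge()
    print(traj[0], traj[-1])   # (soc, terminal voltage) at start and after 30 min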

  20. Fracture modelling of a high performance armour steel

    Science.gov (United States)

    Skoglund, P.; Nilsson, M.; Tjernberg, A.

    2006-08-01

    The fracture characteristics of the high-performance armour steel Armox 500T are investigated. Tensile mechanical experiments using samples with different notch geometries are used to investigate the effect of multi-axial stress states on the strain to fracture. The experiments are numerically simulated, and from the simulations the stress at the point of fracture initiation is determined as a function of strain; these data are then used to extract parameters for fracture models. A fracture model based on quasi-static experiments is suggested, and the model is tested against independent experiments performed at both static and dynamic loading. The results show that the fracture model gives reasonably good agreement between simulations and experiments at both static and dynamic loading conditions. This indicates that multi-axial loading is more important to the strain to fracture than the deformation rate in the investigated loading range. However, ongoing work will further characterise the fracture behaviour of Armox 500T.
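
    One common way to encode the multi-axial-stress dependence of the strain to fracture discussed above is a Johnson-Cook-style failure strain that decays with stress triaxiality, with linear damage accumulation along the loading path; the sketch below evaluates such a curve (Python; the D1-D3 coefficients and the strain path are purely illustrative, not the calibrated Armox 500T parameters from the study).

    import numpy as np

    # Illustrative Johnson-Cook-style failure-strain coefficients (not Armox 500T)
    D1, D2, D3 = 0.05, 0.80, -1.5

    def failure_strain(triaxiality):
        # Equivalent plastic strain at fracture as a function of stress triaxiality.
        return D1 + D2 * np.exp(D3 * triaxiality)

    def damage(strain_increments, triaxialities):
        # Linear damage accumulation; fracture is predicted when D reaches 1.
        return float(np.sum(np.asarray(strain_increments) /
                            failure_strain(np.asarray(triaxialities))))

    # Example: 0.01 strain increments at a triaxiality rising from 0.33 to 1.0
    eta = np.linspace(0.33, 1.0, 40)
    d_eps = np.full(40, 0.01)
    print(failure_strain(0.33), damage(d_eps, eta))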

  1. Wall modeling for the simulation of highly non-isothermal unsteady flows

    International Nuclear Information System (INIS)

    Devesa, A.

    2006-12-01

    Nuclear industry flows are most of the time characterized by their high Reynolds number, density variations (at low Mach numbers) and a highly unsteady behaviour (low to moderate frequencies). High Reynolds numbers are unaffordable for direct numerical simulation (DNS), and simulations must either be performed by solving averaged equations (RANS), or by resolving only the large eddies (LES), both using a wall model. A first part of this thesis dealt with the derivation and testing of two variable-density wall models: an algebraic law (CWM) and a zonal approach dedicated to LES (TBLE-ρ). These models were validated in quasi-isothermal cases before being used in academic and industrial non-isothermal flows, with satisfactory results. Then, a numerical experiment of pulsed passive scalars was performed by DNS, where two forcing conditions were considered: oscillations imposed in the outer flow, and oscillations coming from the wall. Several frequencies and amplitudes of oscillation were taken into account in order to gain insight into unsteady effects in the boundary layer, and to create a database for validating wall models in such a context. The temporal behaviour of the two wall models (algebraic and zonal) was studied and showed that the zonal model produces better results when used in the simulation of unsteady flows. (author)

  2. Modeling Of A Reactive Distillation Column: Methyl Tertiary Butyl Ether (Mtbe Simulation Studies

    Directory of Open Access Journals (Sweden)

    Ismail Mohd Saaid Abdul Rahman Mohamed and Subhash Bhatia

    2012-10-01

    Full Text Available A process simulation model of a stage-wise reactive distillation column, formulated from equilibrium stage theory, was developed. The algorithm for solving the mathematical model, represented by sets of differential-algebraic equations, was based on the relaxation method. A numerical integration scheme based on the backward differentiation formula was selected to handle the stiffness of the differential-algebraic equations. Simulations were performed on a personal computer (Pentium processor) through a computer program developed in the FORTRAN90 programming language. The proposed model was validated by comparing the simulated results with published simulation results and with pilot plant data from the literature. The model was capable of predicting the high isobutene conversion of the heterogeneous system, as desired in the industrial MTBE production process. The comparisons of temperature profiles, liquid composition profiles and operating conditions of the reactive distillation column also showed promising results. Therefore the proposed model can be used as a tool for the development and simulation of reactive distillation columns. Keywords: Modeling, simulation, reactive distillation, relaxation method, equilibrium stage, heterogeneous, MTBE
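
    As a minimal illustration of the numerical strategy described above (stiff equations integrated with a backward-differentiation formula), the sketch below integrates a small stiff system with SciPy's BDF method (Python rather than the FORTRAN90 used in the paper; the two-state system is a generic stiff example, not the column's stage equations).

    import numpy as np
    from scipy.integrate import solve_ivp

    def rhs(t, y):
        # Classic stiff test system: one fast and one slow mode
        # (eigenvalues roughly -1001 and -1).
        y1, y2 = y
        return [-1000.0 * y1 + 999.0 * y2, y1 - 2.0 * y2]

    sol = solve_ivp(rhs, (0.0, 5.0), [1.0, 0.0], method="BDF", rtol=1e-6, atol=1e-9)
    print(sol.t.size, sol.y[:, -1])   # few steps are needed despite the fast mode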

  3. Modified network simulation model with token method of bus access

    Directory of Open Access Journals (Sweden)

    L.V. Stribulevich

    2013-08-01

    Full Text Available Purpose. To study the characteristics of a local network with token-based bus access, a modified simulation model of the network was developed. Methodology. The network characteristics are determined with the developed simulation model, which is based on a state diagram of a network station with a priority-processing mechanism, covering both steady-state operation and the control procedures: initiation of the logical ring, and entry of a station into and exit from the logical ring. Findings. A simulation model was developed from which the following dependencies can be obtained: the maximum waiting time of a request in the queue for different access classes, and the reaction time and usable bandwidth as functions of the data rate, the number of network stations, the request generation rate, the number of frames transmitted per token holding time, and the frame length. Originality. A network simulation technique was proposed that reflects the operation of the network in steady state and during the control procedures, as well as the priority ranking and handling mechanism. Practical value. The developed simulation model allows network characteristics to be determined for real-time systems in railway transport.

  4. Cloud radiative effects and changes simulated by the Coupled Model Intercomparison Project Phase 5 models

    Science.gov (United States)

    Shin, Sun-Hee; Kim, Ok-Yeon; Kim, Dongmin; Lee, Myong-In

    2017-07-01

    Using 32 CMIP5 (Coupled Model Intercomparison Project Phase 5) models, this study examines the fidelity of the simulated cloud amount and cloud radiative effects (CREs) in the historical run driven by observed external radiative forcing for 1850-2005, and their future changes in the RCP (Representative Concentration Pathway) 4.5 scenario runs for 2006-2100. Validation metrics for the historical run are designed to examine the accuracy in the representation of spatial patterns for the climatological mean, and annual and interannual variations of clouds and CREs. The models show a large spread in the simulation of cloud amounts, specifically in the low cloud amount. The observed relationship between cloud amount and the controlling large-scale environment is also reproduced diversely by the various models. Based on the validation metrics, four models—ACCESS1.0, ACCESS1.3, HadGEM2-CC, and HadGEM2-ES—are selected as the best models, and the average of the four models performs more skillfully than the multimodel ensemble average. All models project global-mean SST warming as greenhouse gases increase, but the magnitude varies across the simulations between 1 and 2 K, which is largely attributable to differences in the change of cloud amount and distribution. The models that simulate more SST warming show a greater increase in the net CRE due to reduced low cloud and increased incoming shortwave radiation, particularly over the regions of the marine boundary layer in the subtropics. The selected best-performing models project a significant reduction in global-mean cloud amount of about -0.99% K-1 and net radiative warming of 0.46 W m-2 K-1, suggesting a positive cloud feedback role in global warming.

  5. Development of the Transport Class Model (TCM) Aircraft Simulation From a Sub-Scale Generic Transport Model (GTM) Simulation

    Science.gov (United States)

    Hueschen, Richard M.

    2011-01-01

    A six degree-of-freedom, flat-earth dynamics, non-linear, and non-proprietary aircraft simulation was developed that is representative of a generic mid-sized twin-jet transport aircraft. The simulation was developed from a non-proprietary, publicly available, subscale twin-jet transport aircraft simulation using scaling relationships and a modified aerodynamic database. The simulation has an extended aerodynamics database with aero data outside the normal transport-operating envelope (large angle-of-attack and sideslip values). The simulation has representative transport aircraft surface actuator models with variable rate-limits and generally fixed position limits. The simulation contains a generic 40,000 lb sea level thrust engine model. The engine model is a first order dynamic model with a variable time constant that changes according to simulation conditions. The simulation provides a means for interfacing a flight control system to use the simulation sensor variables and to command the surface actuators and throttle position of the engine model.
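
    As a hedged sketch of the engine description above, the code below integrates a first-order thrust lag whose time constant varies with flight condition; the time-constant schedule and all numerical values are hypothetical, not TCM data.

      import numpy as np

      # First-order engine lag: dT/dt = (T_cmd - T) / tau, with a time constant that
      # varies with flight condition. The schedule below is a hypothetical placeholder.
      def tau_schedule(mach, thrust_frac):
          return 0.8 + 1.2 * thrust_frac + 0.5 * mach   # seconds

      def simulate_engine(t_end=20.0, dt=0.01, mach=0.4):
          n = int(t_end / dt)
          thrust, cmd = 0.2, 0.9            # normalized thrust and throttle command
          history = np.empty(n)
          for i in range(n):
              tau = tau_schedule(mach, thrust)
              thrust += dt * (cmd - thrust) / tau   # explicit Euler step
              history[i] = thrust
          return history

      if __name__ == "__main__":
          h = simulate_engine()
          print("normalized thrust after 20 s:", round(float(h[-1]), 3))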

  6. MEMS 3-DoF gyroscope design, modeling and simulation through equivalent circuit lumped parameter model

    International Nuclear Information System (INIS)

    Mian, Muhammad Umer; Khir, M. H. Md.; Tang, T. B.; Dennis, John Ojur; Riaz, Kashif; Iqbal, Abid; Bazaz, Shafaat A.

    2015-01-01

    Pre-fabrication behavioural and performance analysis with computer aided design (CAD) tools is a common and fabrication-cost-effective practice. In light of this, we present a simulation methodology for a dual-mass-oscillator-based 3 Degree of Freedom (3-DoF) MEMS gyroscope. The 3-DoF gyroscope is modeled through lumped parameter models using equivalent circuit elements. These equivalent circuits consist of elementary components which are the counterparts of the respective mechanical components used to design and fabricate the 3-DoF MEMS gyroscope. The complete design of the equivalent circuit model, the mathematical modeling and the simulations are presented in this paper. The behaviour of the equivalent lumped models derived for the proposed device design is simulated in MEMSPRO T-SPICE software. Simulations are carried out with design specifications following the design rules of the MetalMUMPS fabrication process. The drive mass resonant frequencies simulated by this technique are 1.59 kHz and 2.05 kHz respectively, which are close to the resonant frequencies found by the analytical formulation of the gyroscope. The lumped equivalent circuit modeling technique proved to be a time-efficient technique for the analysis of complex MEMS devices such as 3-DoF gyroscopes, and an alternative approach to the complex and time-consuming coupled-field Finite Element Analysis (FEA) used previously.
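
    The element values below are hypothetical; the sketch only illustrates the standard mechanical-to-electrical lumped-parameter mapping (mass to inductance, compliance to capacitance, damping to resistance) behind equivalent-circuit MEMS models such as the one described, not the actual MetalMUMPS design.

      import numpy as np

      # Mechanical lumped parameters of a drive-mode resonator (hypothetical values).
      m = 2.0e-8    # kg, proof mass
      k = 2.0       # N/m, suspension stiffness
      b = 1.0e-6    # N*s/m, damping

      f0 = np.sqrt(k / m) / (2 * np.pi)           # mechanical resonant frequency
      Q  = np.sqrt(k * m) / b                     # quality factor

      # Equivalent series RLC circuit (impedance analogy): L = m, C = 1/k, R = b.
      L, C, R = m, 1.0 / k, b
      f0_elec = 1.0 / (2 * np.pi * np.sqrt(L * C))  # same resonance as the mechanical side

      print(f"mechanical f0 = {f0:.1f} Hz, Q = {Q:.1f}, electrical f0 = {f0_elec:.1f} Hz")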

  7. MEMS 3-DoF gyroscope design, modeling and simulation through equivalent circuit lumped parameter model

    Energy Technology Data Exchange (ETDEWEB)

    Mian, Muhammad Umer, E-mail: umermian@gmail.com; Khir, M. H. Md.; Tang, T. B. [Department of Electrical and Electronic Engineering, Universiti Teknologi PETRONAS, Tronoh, Perak (Malaysia); Dennis, John Ojur [Department of Fundamental & Applied Sciences, Universiti Teknologi PETRONAS, Tronoh, Perak (Malaysia); Riaz, Kashif; Iqbal, Abid [Faculty of Electronics Engineering, GIK Institute of Engineering Sciences and Technology, Topi, Khyber Pakhtunkhaw (Pakistan); Bazaz, Shafaat A. [Department of Computer Science, Center for Advance Studies in Engineering, Islamabad (Pakistan)

    2015-07-22

    Pre-fabrication behavioural and performance analysis with computer aided design (CAD) tools is a common and fabrication-cost-effective practice. In light of this, we present a simulation methodology for a dual-mass-oscillator-based 3 Degree of Freedom (3-DoF) MEMS gyroscope. The 3-DoF gyroscope is modeled through lumped parameter models using equivalent circuit elements. These equivalent circuits consist of elementary components which are the counterparts of the respective mechanical components used to design and fabricate the 3-DoF MEMS gyroscope. The complete design of the equivalent circuit model, the mathematical modeling and the simulations are presented in this paper. The behaviour of the equivalent lumped models derived for the proposed device design is simulated in MEMSPRO T-SPICE software. Simulations are carried out with design specifications following the design rules of the MetalMUMPS fabrication process. The drive mass resonant frequencies simulated by this technique are 1.59 kHz and 2.05 kHz respectively, which are close to the resonant frequencies found by the analytical formulation of the gyroscope. The lumped equivalent circuit modeling technique proved to be a time-efficient technique for the analysis of complex MEMS devices such as 3-DoF gyroscopes, and an alternative approach to the complex and time-consuming coupled-field Finite Element Analysis (FEA) used previously.

  8. A SIMULATION OF CONTRACT FARMING USING AGENT BASED MODELING

    Directory of Open Access Journals (Sweden)

    Yuanita Handayati

    2016-12-01

    Full Text Available This study aims to simulate the effects of contract farming and farmer commitment to contract farming on supply chain performance by using agent based modeling as a methodology. Supply chain performance is represented by profits and service levels. The simulation results indicate that farmers should pay attention to customer requirements and plan their agricultural activities in order to fulfill these requirements. Contract farming helps farmers deal with demand and price uncertainties. We also find that farmer commitment is crucial to fulfilling contract requirements. This study contributes to this field from a conceptual as well as a practical point of view. From the conceptual point of view, our simulation results show that different levels of farmer commitment have an impact on farmer performance when implementing contract farming. From a practical point of view, the uncertainty faced by farmers and the market can be managed by implementing cultivation and harvesting scheduling, information sharing, and collective learning as ways of committing to contract farming.

  9. SIMULATION OF POROSITY AND PTFE CONTENT IN GAS DIFFUSION LAYER ON PROTON EXCHANGE MEMBRANE FUEL CELL PERFORMANCE

    Directory of Open Access Journals (Sweden)

    NUR H. MASLAN

    2016-01-01

    Full Text Available Numerous research and development activities have been conducted to optimize the operating parameters of a proton exchange membrane fuel cell (PEMFC) by experiments and simulations. This study explains the development of a 3D model using ANSYS FLUENT 14.5 to determine the optimum PEMFC parameters, namely, porosity and polytetrafluoroethylene (PTFE) content, in the gas diffusion layer (GDL). A 3D model was developed to analyze the properties and effects of the GDL. Simulation results showed that the increase in GDL porosity significantly improved the performance of the PEMFC in generating electrical power. However, the performance of the PEMFC decreased with increasing PTFE content in the GDL. Thus, the PTFE content in the GDL must be optimized, and the optimum PTFE content should be 5 wt%. The model developed in this simulation showed good capability in simulating the PEMFC parameters to assist the development process of PEMFC design.

  10. Modelling, simulation and applications of longitudinal train dynamics

    Science.gov (United States)

    Cole, Colin; Spiryagin, Maksym; Wu, Qing; Sun, Yan Quan

    2017-10-01

    Significant developments in longitudinal train simulation and an overview of the approaches to train models and to modelling vehicle force inputs are firstly presented. The most important modelling task, that of the wagon connection, consisting of energy absorption devices such as draft gears and buffers, draw gear stiffness, coupler slack and structural stiffness, is then presented. Detailed attention is given to the modelling approaches for friction-wedge-damped and polymer draft gears. A significant issue in longitudinal train dynamics is the modelling and calculation of the input forces - the co-dimensional problem. The need to push traction performance higher has led to research into, and improvement of, the accuracy of traction modelling, which is discussed. A co-simulation method that combines longitudinal train simulation, locomotive traction control and locomotive vehicle dynamics is presented. The modelling of other forces - braking, propulsion resistance, curve drag and grade forces - is also discussed. As extensions to conventional longitudinal train dynamics, lateral forces and coupler impacts are examined with regard to their interaction with wagon lateral and vertical dynamics. Various applications of longitudinal train dynamics are then presented. As an alternative to the traditional single-wagon-mass approach to longitudinal train dynamics, an example incorporating fully detailed wagon dynamics is presented for a crash analysis problem. Further applications to starting traction, air braking, distributed power, energy analysis and tippler operation are also presented.
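
    As a minimal, hypothetical illustration of the single-wagon-mass approach mentioned above, the sketch below integrates a chain of vehicles joined by linear spring-damper connections under a constant locomotive traction force; real draft gear models are nonlinear and hysteretic.

      import numpy as np

      # Minimal longitudinal train model: n vehicles in a chain, each wagon connection
      # idealized as a linear spring-damper (illustrative parameter values only).
      n, m = 10, 80e3                 # vehicles, mass per vehicle [kg]
      k, c = 5e6, 5e4                 # connection stiffness [N/m] and damping [N*s/m]
      F_loco = 300e3                  # constant traction force on the lead vehicle [N]

      x = np.zeros(n); v = np.zeros(n)
      dt, t_end = 1e-3, 20.0
      for _ in range(int(t_end / dt)):
          dx = x[:-1] - x[1:]                        # relative displacement per connection
          dv = v[:-1] - v[1:]
          f_conn = k * dx + c * dv                   # positive when the coupler is stretched
          a = np.zeros(n)
          a[0]    = (F_loco - f_conn[0]) / m
          a[1:-1] = (f_conn[:-1] - f_conn[1:]) / m
          a[-1]   = f_conn[-1] / m
          v += a * dt
          x += v * dt
      print("peak coupler spring force [kN]:",
            round(float(np.abs(k * (x[:-1] - x[1:])).max() / 1e3), 1))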

  11. Generating performance portable geoscientific simulation code with Firedrake (Invited)

    Science.gov (United States)

    Ham, D. A.; Bercea, G.; Cotter, C. J.; Kelly, P. H.; Loriant, N.; Luporini, F.; McRae, A. T.; Mitchell, L.; Rathgeber, F.

    2013-12-01

    This presentation will demonstrate how a change in simulation programming paradigm can be exploited to deliver sophisticated simulation capability which is far easier to programme than are conventional models, is capable of exploiting different emerging parallel hardware, and is tailored to the specific needs of geoscientific simulation. Geoscientific simulation represents a grand challenge computational task: many of the largest computers in the world are tasked with this field, and the requirements of resolution and complexity of scientists in this field are far from being sated. However, single thread performance has stalled, even sometimes decreased, over the last decade, and has been replaced by ever more parallel systems: both as conventional multicore CPUs and in the emerging world of accelerators. At the same time, the needs of scientists to couple ever-more complex dynamics and parametrisations into their models makes the model development task vastly more complex. The conventional approach of writing code in low level languages such as Fortran or C/C++ and then hand-coding parallelism for different platforms by adding library calls and directives forces the intermingling of the numerical code with its implementation. This results in an almost impossible set of skill requirements for developers, who must simultaneously be domain science experts, numericists, software engineers and parallelisation specialists. Even more critically, it requires code to be essentially rewritten for each emerging hardware platform. Since new platforms are emerging constantly, and since code owners do not usually control the procurement of the supercomputers on which they must run, this represents an unsustainable development load. The Firedrake system, conversely, offers the developer the opportunity to write PDE discretisations in the high-level mathematical language UFL from the FEniCS project (http://fenicsproject.org). Non-PDE model components, such as parametrisations
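
    Firedrake code is not included in the record; as a generic example of the programming paradigm described (assuming a standard Firedrake installation), the sketch below states a Poisson problem in UFL and lets Firedrake generate and execute the low-level implementation. It is not one of the geoscientific models discussed.

      from firedrake import *  # assumes a working Firedrake installation

      mesh = UnitSquareMesh(32, 32)                 # structured triangular mesh
      V = FunctionSpace(mesh, "CG", 1)              # piecewise-linear continuous elements

      u = TrialFunction(V)
      v = TestFunction(V)
      x, y = SpatialCoordinate(mesh)
      f = Function(V).interpolate(sin(pi * x) * sin(pi * y))   # manufactured source term

      a = inner(grad(u), grad(v)) * dx              # weak form of -laplacian(u) = f
      L = f * v * dx
      bc = DirichletBC(V, Constant(0.0), "on_boundary")

      uh = Function(V)
      solve(a == L, uh, bcs=bc)                     # Firedrake assembles and calls PETSc
      print("max of discrete solution:", uh.dat.data_ro.max())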

  12. Simulated astigmatism impairs academic-related performance in children.

    Science.gov (United States)

    Narayanasamy, Sumithira; Vincent, Stephen J; Sampson, Geoff P; Wood, Joanne M

    2015-01-01

    Astigmatism is an important refractive condition in children. However, the functional impact of uncorrected astigmatism in this population is not well established, particularly with regard to academic performance. This study investigated the impact of simulated bilateral astigmatism on academic-related tasks before and after sustained near work in children. Twenty visually normal children (mean age: 10.8 ± 0.7 years; six males and 14 females) completed a range of standardised academic-related tests with and without 1.50 D of simulated bilateral astigmatism (with both the academic-related tests and the visual condition administered in a randomised order). The simulated astigmatism was induced using a positive cylindrical lens while maintaining a plano spherical equivalent. Performance was assessed before and after 20 min of sustained near work, during two separate testing sessions. Academic-related measures included a standardised reading test (the Neale Analysis of Reading Ability), visual information processing tests (Coding and Symbol Search subtests from the Wechsler Intelligence Scale for Children) and a reading-related eye movement test (the Developmental Eye Movement test). Each participant was systematically assigned either with-the-rule (WTR, axis 180°) or against-the-rule (ATR, axis 90°) simulated astigmatism to evaluate the influence of axis orientation on any decrements in performance. Reading, visual information processing and reading-related eye movement performance were all significantly impaired by simulated bilateral astigmatism (p < 0.05). Simulated astigmatism led to a reduction of between 5% and 12% in performance across the academic-related outcome measures, but there was no significant effect of the axis (WTR or ATR) of astigmatism (p > 0.05). Simulated bilateral astigmatism impaired children's performance on a range of academic-related outcome measures irrespective of the orientation of the astigmatism. These findings have

  13. A particle based simulation model for glacier dynamics

    Directory of Open Access Journals (Sweden)

    J. A. Åström

    2013-10-01

    Full Text Available A particle-based computer simulation model was developed for investigating the dynamics of glaciers. In the model, large ice bodies are made of discrete elastic particles which are bound together by massless elastic beams. These beams can break, which induces brittle behaviour. At loads below fracture, beams may also break and reform with small probabilities to incorporate slowly deforming viscous behaviour in the model. This model has the advantage that it can simulate important physical processes such as ice calving and fracturing in a more realistic way than traditional continuum models. For benchmarking purposes the deformation of an ice block on a slip-free surface was compared to that of a similar block simulated with a Finite Element full-Stokes continuum model. Two simulations were performed: (1) calving of an ice block partially supported in water, similar to a grounded marine glacier terminus, and (2) fracturing of an ice block on an inclined plane of varying basal friction, which could represent a transition to fast flow or surging. Despite several approximations, including restriction to two dimensions and simplified water-ice interaction, the model was able to reproduce the size distributions of the debris observed in calving, which may be approximated by universal scaling laws. On a moderate slope, a large ice block was stable and quiescent as long as there was enough friction against the substrate. For a critical length of frictional contact, global sliding began, and the model block disintegrated in a manner suggestive of a surging glacier. In this case the fragment size distribution produced was typical of a grinding process.

  14. Towards the Development of a Simulator for Investigating the Impact of People Management Practices on Retail Performance

    OpenAIRE

    Siebers, Peer-Olaf; Aickelin, Uwe; Celia, Helen; Clegg, Chris

    2010-01-01

    Often models for understanding the impact of management practices on retail performance are developed under the assumption of stability, equilibrium and linearity, whereas retail operations are considered in reality to be dynamic, non-linear and complex. Alternatively, discrete event and agent-based modelling are approaches that allow the development of simulation models of heterogeneous non-equilibrium systems for testing out different scenarios. When developing simulation models one h...

  15. DECISION WITH ARTIFICIAL NEURAL NETWORKS IN DISCRETE EVENT SIMULATION MODELS ON A TRAFFIC SYSTEM

    Directory of Open Access Journals (Sweden)

    Marília Gonçalves Dutra da Silva

    2016-04-01

    Full Text Available ABSTRACT This work demonstrates a mechanism, to be applied in the development of discrete-event simulation models, that performs decision operations through an artificial neural network. Actions that involve complex operations performed by a human agent in a process, for example, are often modeled in simplified form with the usual mechanisms of simulation software. A traffic system controlled by a traffic officer, with a flow of vehicles and pedestrians, was therefore chosen to demonstrate the proposed solution. Through a module built in the simulation software itself, it was possible to connect the intelligent decision algorithm to the simulation model. The results showed that the model responded as expected when subjected to actions that required different decisions to keep the system operating under changes in the flow of people and vehicles.

  16. Physical robustness of canopy temperature models for crop heat stress simulation across environments and production conditions

    DEFF Research Database (Denmark)

    Webber, Heidi; White, Jeffrey W; Kimball, Bruce

    2018-01-01

    to simulate Tc. Model performance in predicting Tc was evaluated for two experiments in continental North America with various water, nitrogen and CO2 treatments. An empirical model fit to one dataset had the best performance, followed by the EBSC models. Stability conditions explained much of the differences...... between modeling approaches. More accurate simulation of heat stress will likely require use of energy balance approaches that consider atmospheric stability conditions....

  17. An Iterative Algorithm to Determine the Dynamic User Equilibrium in a Traffic Simulation Model

    Science.gov (United States)

    Gawron, C.

    An iterative algorithm to determine the dynamic user equilibrium with respect to link costs defined by a traffic simulation model is presented. Each driver's route choice is modeled by a discrete probability distribution which is used to select a route in the simulation. After each simulation run, the probability distribution is adapted to minimize the travel costs. Although the algorithm does not depend on the simulation model, a queuing model is used for performance reasons. The stability of the algorithm is analyzed for a simple example network. As an application example, a dynamic version of Braess's paradox is studied.
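
    The record does not give the exact adaptation rule; as a hedged sketch of the iterate-simulate-adapt loop, the code below updates logit route-choice probabilities with the method of successive averages against costs returned by a toy volume-delay function standing in for the traffic simulation.

      import numpy as np

      # Two parallel routes; the "simulation" is replaced by a BPR-style volume-delay function.
      free_flow = np.array([10.0, 12.0])     # minutes
      capacity  = np.array([600.0, 800.0])   # vehicles/hour
      demand    = 1000.0                     # vehicles/hour
      theta     = 0.5                        # logit sensitivity to cost differences

      def route_costs(flows):
          return free_flow * (1.0 + 0.15 * (flows / capacity) ** 4)

      p = np.array([0.5, 0.5])               # initial route-choice probabilities
      for n in range(1, 101):
          costs = route_costs(demand * p)                    # run the "simulation"
          target = np.exp(-theta * costs)
          target /= target.sum()                             # logit response to costs
          p += (target - p) / n                              # method of successive averages

      print("equilibrium probabilities:", np.round(p, 3))
      print("equilibrium costs [min]:", np.round(route_costs(demand * p), 2))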

  18. Are water simulation models consistent with steady-state and ultrafast vibrational spectroscopy experiments?

    International Nuclear Information System (INIS)

    Schmidt, J.R.; Roberts, S.T.; Loparo, J.J.; Tokmakoff, A.; Fayer, M.D.; Skinner, J.L.

    2007-01-01

    Vibrational spectroscopy can provide important information about structure and dynamics in liquids. In the case of liquid water, this is particularly true for isotopically dilute HOD/D2O and HOD/H2O systems. Infrared and Raman line shapes for these systems were measured some time ago. Very recently, ultrafast three-pulse vibrational echo experiments have been performed on these systems, which provide new, exciting, and important dynamical benchmarks for liquid water. There has been tremendous theoretical effort expended on the development of classical simulation models for liquid water. These models have been parameterized from experimental structural and thermodynamic measurements. The goal of this paper is to determine if representative simulation models are consistent with steady-state, and especially with these new ultrafast, experiments. Such a comparison provides information about the accuracy of the dynamics of these simulation models. We perform this comparison using theoretical methods developed in previous papers, and calculate the experimental observables directly, without making the Condon and cumulant approximations, and taking into account molecular rotation, vibrational relaxation, and finite excitation pulses. On the whole, the simulation models do remarkably well; perhaps the best overall agreement with experiment comes from the SPC/E model.

  19. Including the effects of filamentous bulking sludge during the simulation of wastewater treatment plants using a risk assessment model

    DEFF Research Database (Denmark)

    Flores Alsina, Xavier; Comas, J.; Rodriquez-Roda, I.

    2009-01-01

    The main objective of this paper is to demonstrate how including the occurrence of filamentous bulking sludge in a secondary clarifier model will affect the predicted process performance during the simulation of WWTPs. The IWA Benchmark Simulation Model No. 2 (BSM2) is hereby used as a simulation...... are automatically changed during the simulation by modifying the settling model parameters to mimic the effect of growth of filamentous bacteria. The simulation results demonstrate that including effects of filamentous bulking in the secondary clarifier model results in a more realistic plant performance...

  20. Simulation and Modeling of Flow in a Gas Compressor

    Directory of Open Access Journals (Sweden)

    Anna Avramenko

    2015-01-01

    Full Text Available The presented research demonstrates the results of a series of numerical simulations of gas flow through a single-stage centrifugal compressor with a vaneless diffuser. The numerical results were validated with experiments consisting of eight regimes with different mass flow rates. The steady-state and unsteady simulations were done in ANSYS FLUENT 13.0 and NUMECA FINE/TURBO 8.9.1 for a one-period geometry due to the periodicity of the problem. First-order discretization is insufficient due to strong dissipation effects. Results obtained with second-order discretization agree with the experiments for the steady-state case in the region of high mass flow rates. In the area of low mass flow rates, nonstationary effects significantly influence the flow, leading the stationary model to poor predictions. Therefore, the unsteady simulations were performed in the region of low mass flow rates, and the calculation results were compared with experimental data. The numerical simulation method in this paper can be used to predict compressor performance.

  1. Interactive virtual simulation using a 3D computer graphics model for microvascular decompression surgery.

    Science.gov (United States)

    Oishi, Makoto; Fukuda, Masafumi; Hiraishi, Tetsuya; Yajima, Naoki; Sato, Yosuke; Fujii, Yukihiko

    2012-09-01

    The purpose of this paper is to report on the authors' advanced presurgical interactive virtual simulation technique using a 3D computer graphics model for microvascular decompression (MVD) surgery. The authors performed interactive virtual simulation prior to surgery in 26 patients with trigeminal neuralgia or hemifacial spasm. The 3D computer graphics models for interactive virtual simulation were composed of the brainstem, cerebellum, cranial nerves, vessels, and skull, individually created by image analysis, including segmentation, surface rendering, and data fusion, for data collected by 3-T MRI and 64-row multidetector CT systems. Interactive virtual simulation was performed by employing novel computer-aided design software with manipulation of a haptic device to imitate the surgical procedures of bone drilling and retraction of the cerebellum. The findings were compared with intraoperative findings. In all patients, interactive virtual simulation provided detailed and realistic surgical perspectives, of sufficient quality, representing the lateral suboccipital route. The causes of trigeminal neuralgia or hemifacial spasm determined by observing the 3D computer graphics models were concordant with those identified intraoperatively in 25 (96%) of 26 patients, which was a significantly higher rate than the 73% concordance rate (concordance in 19 of 26 patients) obtained by review of 2D images only (p < 0.05). The 3D computer graphics model provided a realistic environment for performing virtual simulations prior to MVD surgery and enabled us to ascertain complex microsurgical anatomy.

  2. Hybrid Reynolds-Averaged/Large Eddy Simulation of a Cavity Flameholder; Assessment of Modeling Sensitivities

    Science.gov (United States)

    Baurle, R. A.

    2015-01-01

    Steady-state and scale-resolving simulations have been performed for flow in and around a model scramjet combustor flameholder. The cases simulated corresponded to those used to examine this flowfield experimentally using particle image velocimetry. A variety of turbulence models were used for the steady-state Reynolds-averaged simulations, which included both linear and non-linear eddy viscosity models. The scale-resolving simulations used a hybrid Reynolds-averaged/large eddy simulation strategy that is designed to be a large eddy simulation everywhere except in the inner portion (log layer and below) of the boundary layer. Hence, this formulation can be regarded as a wall-modeled large eddy simulation. This effort was undertaken to formally assess the performance of the hybrid Reynolds-averaged/large eddy simulation modeling approach in a flowfield of interest to the scramjet research community. The numerical errors were quantified for both the steady-state and scale-resolving simulations prior to making any claims of predictive accuracy relative to the measurements. The steady-state Reynolds-averaged results showed a high degree of variability when comparing the predictions obtained from each turbulence model, with the non-linear eddy viscosity model (an explicit algebraic stress model) providing the most accurate prediction of the measured values. The hybrid Reynolds-averaged/large eddy simulation results were carefully scrutinized to ensure that even the coarsest grid had an acceptable level of resolution for large eddy simulation, and that the time-averaged statistics were acceptably accurate. The autocorrelation and its Fourier transform were the primary tools used for this assessment. The statistics extracted from the hybrid simulation strategy proved to be more accurate than the Reynolds-averaged results obtained using the linear eddy viscosity models. However, there was no predictive improvement noted over the results obtained from the explicit algebraic stress model.
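
    As a small illustration of the statistical checks mentioned (the autocorrelation and its Fourier transform), the sketch below computes both for a synthetic probe signal; the signal itself is an assumption standing in for resolved-velocity data.

      import numpy as np

      # Autocorrelation and power spectrum of a velocity time series, the kind of check
      # used to judge whether scale-resolving statistics are adequately converged.
      def autocorrelation(u):
          f = u - u.mean()
          r = np.correlate(f, f, mode="full")[f.size - 1:]   # one-sided correlation
          return r / r[0]                                    # normalize so rho(0) = 1

      rng = np.random.default_rng(0)
      dt, n = 1e-4, 20000
      t = np.arange(n) * dt
      # synthetic record: a low-frequency tone plus broadband noise
      u = np.sin(2 * np.pi * 200.0 * t) + 0.5 * rng.standard_normal(n)

      rho = autocorrelation(u)
      integral_time = np.trapz(rho[: np.argmax(rho < 0)], dx=dt)   # up to first zero crossing
      spectrum = np.abs(np.fft.rfft(u - u.mean())) ** 2
      freqs = np.fft.rfftfreq(n, d=dt)

      print("integral time scale [s]:", integral_time)
      print("dominant frequency [Hz]:", freqs[spectrum.argmax()])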

  3. The validation of evacuation simulation models through the analysis of behavioural uncertainty

    International Nuclear Information System (INIS)

    Lovreglio, Ruggiero; Ronchi, Enrico; Borri, Dino

    2014-01-01

    Both experimental and simulation data on fire evacuation are influenced by a component of uncertainty caused by the impact of the unexplained variance in human behaviour, namely behavioural uncertainty (BU). Evacuation model validation studies should include the study of this type of uncertainty during the comparison of experiments and simulation results. An evacuation model validation procedure is introduced in this paper to study the impact of BU. This methodology is presented through a case study for the comparison between repeated experimental data and simulation results produced by FDS+Evac, an evacuation model for the simulation of human behaviour in fire, which makes use of distribution laws. - Highlights: • Validation of evacuation models is investigated. • Quantitative evaluation of behavioural uncertainty is performed. • A validation procedure is presented through an evacuation case study

  4. Dynamic Simulation of an Organic Rankine Cycle—Detailed Model of a Kettle Boiler

    Directory of Open Access Journals (Sweden)

    Roberto Pili

    2017-04-01

    Full Text Available Organic Rankine Cycles (ORCs) are nowadays a valuable technology to produce electricity from low and medium temperature heat sources, e.g., in geothermal, biomass and waste heat recovery applications. Dynamic simulations can help improve the flexibility and operation of such plants, and guarantee a better economic performance. In this work, a dynamic model for a multi-pass kettle evaporator of a geothermal ORC power plant has been developed and its dynamics have been validated against measured data. The model combines the finite volume approach on the tube side and a two-volume cavity on the shell side. To validate the dynamic model, a positive and a negative step function in heat source flow rate are applied. The simulation model performed well in both cases. The liquid level appeared the most challenging quantity to simulate. A better agreement in temperature was achieved by increasing the volume flow rate of the geothermal brine by 2% over the entire simulation. Measurement errors, discrepancies in working fluid and thermal brine properties and uncertainties in heat transfer correlations can account for this. In the future, the entire geothermal power plant will be simulated, and suggestions to improve its dynamics and control by means of simulations will be provided.

  5. Improvements in Thermal Performance of Mango Hot-water Treatment Equipments: Data Analysis, Mathematical Modelling and Numerical-computational Simulation

    Directory of Open Access Journals (Sweden)

    Elder M. Mendoza Orbegoso

    2017-06-01

    Full Text Available Mango is one of the most popular and highest-priced tropical fruits in worldwide markets, and its exportation is subject to phytosanitary quality control aimed at killing the “fruit fly”. Thus, mangoes must undergo a hot-water treatment process that involves their immersion in hot water over a period of time. In this work, field measurements, analytical studies and simulation studies are carried out on available hot-water treatment equipment called “Original” that complies only with United States phytosanitary protocols. These approaches are used to characterize the fluid-dynamic and thermal behaviour that occurs during the hot-water treatment of mangoes. An analytical model and computational fluid dynamics simulations are then developed for designing new hot-water treatment equipment called “Hybrid” that simultaneously meets both United States and Japan phytosanitary certifications. Comparisons of the analytical results with the field measurements demonstrate that the “Hybrid” equipment offers better fluid-dynamic and thermal performance than the “Original” equipment.

  6. Comparison of Numerically Simulated and Experimentally Measured Performance of a Rotating Detonation Engine

    Science.gov (United States)

    Paxson, Daniel E.; Fotia, Matthew L.; Hoke, John; Schauer, Fred

    2015-01-01

    A quasi-two-dimensional, computational fluid dynamic (CFD) simulation of a rotating detonation engine (RDE) is described. The simulation operates in the detonation frame of reference and utilizes a relatively coarse grid such that only the essential primary flow field structure is captured. This construction and other simplifications yield rapidly converging, steady solutions. Viscous effects, and heat transfer effects are modeled using source terms. The effects of potential inlet flow reversals are modeled using boundary conditions. Results from the simulation are compared to measured data from an experimental RDE rig with a converging-diverging nozzle added. The comparison is favorable for the two operating points examined. The utility of the code as a performance optimization tool and a diagnostic tool are discussed.

  7. Performance simulation and analysis of a CMOS/nano hybrid nanoprocessor system

    International Nuclear Information System (INIS)

    Cabe, Adam C; Das, Shamik

    2009-01-01

    This paper provides detailed simulation results and analysis of the prospective performance of hybrid CMOS/nanoelectronic processor systems based upon the field-programmable nanowire interconnect (FPNI) architecture. To evaluate this architecture, a complete design was developed for an FPNI implementation using 90 nm CMOS with 15 nm wide nanowire interconnects. Detailed simulations of this design illustrate that critical design choices and tradeoffs exist beyond those specified by the architecture. This includes the selection of the types of junction nanodevices, as well as the implementation of low-level circuits. In particular, the simulation results presented here show that only nanodevices with an 'on/off' current ratio of 200 or more are suitable to produce correct system-level behaviour. Furthermore, the design of the CMOS logic gates in the FPNI system must be customized to accommodate the resistances of both 'on'-state and 'off'-state nanodevices. Using these customized designs together with models of suitable nanodevices, additional simulations demonstrate that, relative to conventional 90 nm CMOS FPGA systems, performance gains can be obtained of up to 70% greater speed or up to a ninefold reduction in energy consumption.

  8. Validating clustering of molecular dynamics simulations using polymer models

    Directory of Open Access Journals (Sweden)

    Phillips Joshua L

    2011-11-01

    Full Text Available Abstract Background Molecular dynamics (MD) simulation is a powerful technique for sampling the meta-stable and transitional conformations of proteins and other biomolecules. Computational data clustering has emerged as a useful, automated technique for extracting conformational states from MD simulation data. Despite extensive application, relatively little work has been done to determine if the clustering algorithms are actually extracting useful information. A primary goal of this paper therefore is to provide such an understanding through a detailed analysis of data clustering applied to a series of increasingly complex biopolymer models. Results We develop a novel series of models using basic polymer theory that have intuitive, clearly-defined dynamics and exhibit the essential properties that we are seeking to identify in MD simulations of real biomolecules. We then apply spectral clustering, an algorithm particularly well-suited for clustering polymer structures, to our models and MD simulations of several intrinsically disordered proteins. Clustering results for the polymer models provide clear evidence that the meta-stable and transitional conformations are detected by the algorithm. The results for the polymer models also help guide the analysis of the disordered protein simulations by comparing and contrasting the statistical properties of the extracted clusters. Conclusions We have developed a framework for validating the performance and utility of clustering algorithms for studying molecular biopolymer simulations that utilizes several analytic and dynamic polymer models which exhibit well-behaved dynamics including: meta-stable states, transition states, helical structures, and stochastic dynamics. We show that spectral clustering is robust to anomalies introduced by structural alignment and that different structural classes of intrinsically disordered proteins can be reliably discriminated from the clustering results. To our
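
    As a hedged sketch of the clustering step (using scikit-learn rather than the authors' implementation), the code below applies spectral clustering with a precomputed Gaussian affinity to a toy two-state data set standing in for conformational frames.

      import numpy as np
      from sklearn.cluster import SpectralClustering

      # Toy stand-in for conformational clustering: points drawn around two "meta-stable
      # states" in a 2-D collective-variable space (not real MD data).
      rng = np.random.default_rng(1)
      state_a = rng.normal(loc=[0.0, 0.0], scale=0.3, size=(200, 2))
      state_b = rng.normal(loc=[2.0, 2.0], scale=0.3, size=(200, 2))
      frames = np.vstack([state_a, state_b])

      # Pairwise distances -> Gaussian affinity, as commonly done for structural RMSD matrices.
      d = np.linalg.norm(frames[:, None, :] - frames[None, :, :], axis=-1)
      affinity = np.exp(-(d ** 2) / (2 * 0.5 ** 2))

      labels = SpectralClustering(n_clusters=2, affinity="precomputed",
                                  random_state=0).fit_predict(affinity)
      print("frames assigned to each cluster:", np.bincount(labels))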

  9. Total dose and dose rate models for bipolar transistors in circuit simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Campbell, Phillip Montgomery; Wix, Steven D.

    2013-05-01

    The objective of this work is to develop a model for total dose effects in bipolar junction transistors for use in circuit simulation. The components of the model are an electrical model of device performance that includes the effects of trapped charge on device behavior, and a model that calculates the trapped charge densities in a specific device structure as a function of radiation dose and dose rate. Simulations based on this model are found to agree well with measurements on a number of devices for which data are available.

  10. Performance Evaluation Model for Application Layer Firewalls.

    Science.gov (United States)

    Xuan, Shichang; Yang, Wu; Dong, Hui; Zhang, Jiangchuan

    2016-01-01

    Application layer firewalls protect the trusted area network against information security risks. However, firewall performance may affect user experience. Therefore, performance analysis plays a significant role in the evaluation of application layer firewalls. This paper presents an analytic model of the application layer firewall, based on a system analysis to evaluate the capability of the firewall. In order to enable users to improve the performance of the application layer firewall with limited resources, resource allocation was evaluated to obtain the optimal resource allocation scheme in terms of throughput, delay, and packet loss rate. The proposed model employs the Erlangian queuing model to analyze the performance parameters of the system with regard to the three layers (network, transport, and application layers). Then, the analysis results of all the layers are combined to obtain the overall system performance indicators. A discrete event simulation method was used to evaluate the proposed model. Finally, limited service desk resources were allocated to obtain the values of the performance indicators under different resource allocation scenarios in order to determine the optimal allocation scheme. Under limited resource allocation, this scheme enables users to maximize the performance of the application layer firewall.
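
    The record cites an Erlangian queuing model without giving its equations; as a hedged sketch, the code below computes standard M/M/c (Erlang C) indicators - utilisation, waiting probability and mean queueing delay - for a hypothetical service-desk allocation at a single layer.

      import math

      def erlang_c_metrics(arrival_rate, service_rate, servers):
          """Standard M/M/c results: probability of waiting (Erlang C) and mean delay."""
          a = arrival_rate / service_rate                 # offered load in Erlangs
          rho = a / servers                               # per-server utilisation (must be < 1)
          if rho >= 1.0:
              raise ValueError("unstable queue: utilisation >= 1")
          summation = sum(a ** k / math.factorial(k) for k in range(servers))
          top = a ** servers / (math.factorial(servers) * (1 - rho))
          p_wait = top / (summation + top)                # Erlang C formula
          wq = p_wait / (servers * service_rate - arrival_rate)   # mean wait in queue
          return rho, p_wait, wq

      if __name__ == "__main__":
          # Hypothetical application-layer stage: 900 requests/s, 250 requests/s per worker.
          for c in (4, 5, 6):
              rho, p_wait, wq = erlang_c_metrics(900.0, 250.0, c)
              print(f"servers={c}: utilisation={rho:.2f}, P(wait)={p_wait:.3f}, Wq={wq*1000:.2f} ms")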

  11. Performance Evaluation Model for Application Layer Firewalls.

    Directory of Open Access Journals (Sweden)

    Shichang Xuan

    Full Text Available Application layer firewalls protect the trusted area network against information security risks. However, firewall performance may affect user experience. Therefore, performance analysis plays a significant role in the evaluation of application layer firewalls. This paper presents an analytic model of the application layer firewall, based on a system analysis to evaluate the capability of the firewall. In order to enable users to improve the performance of the application layer firewall with limited resources, resource allocation was evaluated to obtain the optimal resource allocation scheme in terms of throughput, delay, and packet loss rate. The proposed model employs the Erlangian queuing model to analyze the performance parameters of the system with regard to the three layers (network, transport, and application layers). Then, the analysis results of all the layers are combined to obtain the overall system performance indicators. A discrete event simulation method was used to evaluate the proposed model. Finally, limited service desk resources were allocated to obtain the values of the performance indicators under different resource allocation scenarios in order to determine the optimal allocation scheme. Under limited resource allocation, this scheme enables users to maximize the performance of the application layer firewall.

  12. Efficient scatter model for simulation of ultrasound images from computed tomography data

    Science.gov (United States)

    D'Amato, J. P.; Lo Vercio, L.; Rubi, P.; Fernandez Vera, E.; Barbuzza, R.; Del Fresno, M.; Larrabide, I.

    2015-12-01

    Background and motivation: Real-time ultrasound simulation refers to the process of computationally creating fully synthetic ultrasound images instantly. Due to the high value of specialized low-cost training for healthcare professionals, there is a growing interest in the use of this technology and in the development of high-fidelity systems that simulate the acquisition of echographic images. The objective is to create an efficient and reproducible simulator that can run on either notebooks or desktops using low-cost devices. Materials and methods: We present an interactive ultrasound simulator based on CT data. This simulator is based on ray-casting and provides real-time interaction capabilities. The simulation of scattering that is coherent with the transducer position in real time is also introduced. Such noise is produced using a simplified model of multiplicative noise and convolution with point spread functions (PSF) tailored for this purpose. Results: The computational efficiency of scattering-map generation was revised, with improved performance. This allowed a more efficient simulation of coherent scattering in the synthetic echographic images while providing highly realistic results. We describe quality and performance metrics to validate these results, where a performance of up to 55 fps was achieved. Conclusion: The proposed technique for real-time scattering modeling provides realistic yet computationally efficient scatter distributions. The error between the original image and the simulated scattering image was compared for the proposed method and the state of the art, showing negligible differences in its distribution.
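
    As a hedged sketch of the scatter model described (multiplicative noise followed by convolution with a point spread function), the code below speckles a synthetic echogenicity map and blurs it with a separable Gaussian PSF; the map and the PSF widths are illustrative assumptions, not the paper's tailored PSFs.

      import numpy as np
      from scipy.ndimage import gaussian_filter
      from scipy.signal import fftconvolve

      rng = np.random.default_rng(0)

      echogenicity = np.zeros((256, 256))
      echogenicity[96:160, 96:160] = 1.0                 # bright inclusion in a dark background
      echogenicity = gaussian_filter(echogenicity, 3.0)  # soften tissue boundaries

      speckle = rng.rayleigh(scale=1.0, size=echogenicity.shape)   # multiplicative scatterers
      scattered = echogenicity * speckle

      # Separable Gaussian PSF standing in for the transducer response (hypothetical widths).
      ax = np.arange(-8, 9)
      psf = np.exp(-(ax[:, None] ** 2) / (2 * 1.5 ** 2) - (ax[None, :] ** 2) / (2 * 4.0 ** 2))
      psf /= psf.sum()

      image = fftconvolve(scattered, psf, mode="same")
      print("simulated B-mode stats: mean=%.3f, std=%.3f" % (image.mean(), image.std()))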

  13. NUMERICAL FLOW AND TRANSPORT SIMULATIONS SUPPORTING THE SALTSTONE FACILITY PERFORMANCE ASSESSMENT

    Energy Technology Data Exchange (ETDEWEB)

    Flach, G.

    2009-02-28

    The Saltstone Disposal Facility Performance Assessment (PA) is being revised to incorporate requirements of Section 3116 of the Ronald W. Reagan National Defense Authorization Act for Fiscal Year 2005 (NDAA), and updated data and understanding of vault performance since the 1992 PA (Cook and Fowler 1992) and related Special Analyses. A hybrid approach was chosen for modeling contaminant transport from vaults and future disposal cells to exposure points. A higher-resolution, largely deterministic analysis is performed on a best-estimate Base Case scenario using the PORFLOW numerical analysis code. A few additional sensitivity cases are simulated to examine alternative scenarios and parameter settings. Stochastic analysis is performed on a simpler representation of the SDF system using the GoldSim code to estimate uncertainty and sensitivity about the Base Case. This report describes the development of the PORFLOW models supporting the SDF PA, and presents sample results to illustrate model behaviors and define impacts relative to key facility performance objectives. The SDF PA document, when issued, should be consulted for a comprehensive presentation of results.

  14. OPNET Modeler Simulation Testing of the New Model Used to Cooperation Between QoS and Security Mechanisms

    Directory of Open Access Journals (Sweden)

    Jan Papaj

    2012-01-01

    Full Text Available In this article a performance analysis of a new model for integration between QoS and security is introduced. OPNET Modeler simulation testing of the new model, in comparison with the standard model, is presented. This new model enables the process of cooperation between QoS and security in MANET. How the model is implemented in the OPNET Modeler simulation is also shown. The model provides possibilities for integration and cooperation of QoS and security through a cross-layer design (CLD) with a modified security service vector (SSV). An overview of the simulation tests of the new model, a comparative study in mobile ad-hoc networks, and a description of requirements and directions for adapted solutions are presented. The main idea of the testing is to show how QoS- and security-related services could be provided simultaneously with minimal interference between the services.

  15. Comparison of HSPF and SWAT models performance for runoff and sediment yield prediction.

    Science.gov (United States)

    Im, Sangjun; Brannan, Kevin M; Mostaghimi, Saied; Kim, Sang Min

    2007-09-01

    A watershed model can be used to better understand the relationship between land use activities and the hydrologic/water quality processes that occur within a watershed. The physically based, distributed-parameter model SWAT and the conceptual, lumped-parameter model HSPF were selected, and their performance was compared in simulating runoff and sediment yields from the Polecat Creek watershed in Virginia, which is 12,048 ha in size. A monitoring project was conducted in the Polecat Creek watershed during the period of October 1994 to June 2000. The observed data (stream flow and sediment yield) from the monitoring project were used in the calibration/validation of the models. The period of September 1996 to June 2000 was used for the calibration and October 1994 to December 1995 for the validation of the models. The outputs from the models were compared to the observed data at several sub-watershed outlets and at the watershed outlet of the Polecat Creek watershed. The results indicated that both models were generally able to simulate stream flow and sediment yields well during both the calibration and validation periods. For annual and monthly loads, HSPF simulated hydrology and sediment yields more accurately than SWAT at all monitoring sites within the watershed. The results of this study indicate that both the SWAT and HSPF watershed models performed sufficiently well in the simulation of stream flow and sediment yield, with HSPF performing moderately better than SWAT for simulation time-steps greater than a month.

  16. Impact of Loss Synchronization on Reliable High Speed Networks: A Model Based Simulation

    Directory of Open Access Journals (Sweden)

    Suman Kumar

    2014-01-01

    Full Text Available The contemporary nature of network evolution demands simulation models which are flexible, scalable, and easily implementable. In this paper, we propose a fluid-based model for performance analysis of reliable high speed networks. In particular, this paper aims to study the dynamic relationship between congestion control algorithms and queue management schemes, in order to develop a better understanding of the causal linkages between the two. We propose a loss synchronization module which is user configurable. We validate our model through simulations under controlled settings. Also, we present a performance analysis to provide insights into two important issues concerning 10 Gbps high speed networks: (i) the impact of bottleneck buffer size on the performance of a 10 Gbps high speed network and (ii) the impact of the level of loss synchronization on link utilization-fairness tradeoffs. The practical impact of the proposed work is to provide design guidelines along with a powerful simulation tool to protocol designers and network developers.
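
    The authors' 10 Gbps model is not reproduced in the record; as a generic illustration of the fluid-modelling approach, the sketch below integrates a single-bottleneck TCP/AQM fluid model (congestion window and queue) with a simple proportional marking law; all parameters are hypothetical.

      # Generic single-bottleneck TCP/AQM fluid model (window W, queue q); illustrative
      # parameters only, not the authors' configuration.
      N = 50                 # number of long-lived flows
      C = 1.25e5             # bottleneck capacity [packets/s]
      q_ref = 200.0          # AQM target queue [packets]
      k_p = 2e-5             # proportional marking gain
      prop_delay = 0.05      # two-way propagation delay [s]

      W, q = 1.0, 0.0
      dt, t_end = 1e-4, 20.0
      for _ in range(int(t_end / dt)):
          R = prop_delay + q / C                          # round-trip time
          p = min(1.0, max(0.0, k_p * (q - q_ref)))       # marking/drop probability
          dW = 1.0 / R - (W * W / (2.0 * R)) * p          # additive increase, multiplicative decrease
          dq = N * W / R - C                              # queue build-up at the bottleneck
          W = max(W + dt * dW, 1.0)
          q = min(max(q + dt * dq, 0.0), 2000.0)

      print(f"steady state: W = {W:.1f} pkt, q = {q:.0f} pkt, "
            f"RTT = {(prop_delay + q / C) * 1000:.1f} ms")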

  17. Performance comparison of low and high temperature polymer electrolyte membrane fuel cells. Experimental examinations, modelling and numerical simulation; Leistungsvergleich von Nieder- und Hochtemperatur-Polymerelektrolytmembran-Brennstoffzellen. Experimentelle Untersuchungen, Modellierung und numerische Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Loehn, Helmut

    2010-11-03

    danger of washing out of the phosphoric acid. In an additional test series the Celtec-P-1000 HT-MEA was subjected to temperature change cycles (40 - 160 C), which led to irreversible voltage losses. In a final test series, performance tests were carried out with an HT-PEM fuel cell stack (16 cells / 1 kW) developed at the fuel cell research centre of Volkswagen with a special gas diffusion electrode intended to avoid degradation at low temperatures. In these examinations no irreversible voltage losses could be detected, but the tests had to be aborted because of leakage problems. The insight gained from the experimental examinations into the superior operating behaviour and the further advantages of the HT-PEMFC in comparison to the LT-PEMFC was crucial for the construction, in the theoretical part of this thesis, of a simulation model of a single HT-PEM fuel cell, which should also be suitable as a process simulation model for the computer-based development of a virtual fuel cell within the interdisciplinary project ''Virtual Fuel Cell'' at the TU Darmstadt. The model is a numerical 2D ''along the channel'' model constructed with the finite element software COMSOL Multiphysics (version 3.5a). The stationary, single-phase model comprises altogether ten dependent variables in seven application modules, forming a highly complex, coupled non-linear system of equations with 33713 degrees of freedom (1675 rectangular elements with 1768 nodes). The simulation model describes the mass transport processes and the electro-chemical reactions in an HT-PEM fuel cell with good accuracy, and the model validation by comparison of the model results with experimental data was successful. The 2D model is therefore basically suitable as a process simulation model for the design of a virtual HT-PEM fuel cell. (orig.)

  18. Development of a simulation model of semi-active suspension for monorail

    Science.gov (United States)

    Hasnan, K.; Didane, D. H.; Kamarudin, M. A.; Bakhsh, Qadir; Abdulmalik, R. E.

    2016-11-01

    The new Kuala Lumpur Monorail Fleet Expansion Project (KLMFEP) uses semi-active technology in its suspension system. It is recognized that the suspension system influences ride quality; thus, one way to further improve ride quality is to fine-tune the semi-active suspension system on the new KL Monorail. Hence, a simulation model is required that acts as a platform to test the design of a complete suspension system, particularly to investigate ride comfort performance. MSC Adams software was chosen as the tool to develop the simulation platform, in which all parameters and data are represented by mathematical equations, with the new KL Monorail as the reference model. In the simulation, the model was subjected to a step disturbance on the guideway for stability and ride comfort analysis. The model showed positive results: the monorail remains stable according to the stability analysis, and it achieves a Rating 1 classification (very comfortable) in ISO 2631 ride comfort performance according to the ride comfort analysis. The model is also adjustable, flexible and understandable by engineers in the field for the purpose of further development.
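
    The multibody MSC Adams model cannot be reproduced here; as a simplified, hypothetical illustration of semi-active suspension behaviour, the sketch below runs a quarter-car model with an on-off skyhook damping law over a step disturbance in the guideway. All parameter values are assumptions, not KL Monorail data.

      import numpy as np

      # Quarter-car parameters (hypothetical).
      m_body, m_bogie = 4000.0, 600.0        # kg
      k_susp, k_tyre = 2.0e5, 1.5e6          # N/m
      c_min, c_max = 2.0e3, 2.0e4            # N*s/m, semi-active damper limits

      def skyhook_damping(v_body, v_rel):
          # On-off skyhook: high damping only when it opposes body motion.
          return c_max if v_body * v_rel > 0.0 else c_min

      dt, t_end = 1e-4, 5.0
      zb = vb = zw = vw = 0.0                # body/bogie displacement and velocity
      acc_hist = []
      for i in range(int(t_end / dt)):
          zr = 0.01 if i * dt > 1.0 else 0.0          # 10 mm step in the guideway at t = 1 s
          c = skyhook_damping(vb, vb - vw)
          f_susp = k_susp * (zw - zb) + c * (vw - vb)  # suspension force acting on the body
          f_tyre = k_tyre * (zr - zw)                  # tyre force acting on the bogie
          ab = f_susp / m_body
          aw = (f_tyre - f_susp) / m_bogie
          vb += ab * dt; zb += vb * dt
          vw += aw * dt; zw += vw * dt
          acc_hist.append(ab)

      print("peak body acceleration [m/s^2]:", round(float(max(np.abs(acc_hist))), 3))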

  19. Novel Virtual User Models of Mild Cognitive Impairment for Simulating Dementia

    Directory of Open Access Journals (Sweden)

    Sofia Segkouli

    2015-01-01

    Full Text Available Virtual user modeling research has attempted to address critical issues of human-computer interaction (HCI) such as usability and utility through a large number of analytic, usability-oriented approaches as cognitive models in order to provide users with experiences fitting to their specific needs. However, there is demand for more specific modules embodied in cognitive architecture that will detect abnormal cognitive decline across new synthetic task environments. Also, accessibility evaluation of graphical user interfaces (GUIs) requires considerable effort for enhancing ICT products accessibility for older adults. The main aim of this study is to develop and test virtual user models (VUM) simulating mild cognitive impairment (MCI) through novel specific modules, embodied at cognitive models and defined by estimations of cognitive parameters. Well-established MCI detection tests assessed users’ cognition, elaborated their ability to perform multitasks, and monitored the performance of infotainment related tasks to provide more accurate simulation results on existing conceptual frameworks and enhanced predictive validity in interfaces’ design supported by increased tasks’ complexity to capture a more detailed profile of users’ capabilities and limitations. The final outcome is a more robust cognitive prediction model, accurately fitted to human data to be used for more reliable interfaces’ evaluation through simulation on the basis of virtual models of MCI users.

  20. Novel Virtual User Models of Mild Cognitive Impairment for Simulating Dementia

    Science.gov (United States)

    Segkouli, Sofia; Tzovaras, Dimitrios; Tsakiris, Thanos; Tsolaki, Magda; Karagiannidis, Charalampos

    2015-01-01

    Virtual user modeling research has attempted to address critical issues of human-computer interaction (HCI) such as usability and utility through a large number of analytic, usability-oriented approaches as cognitive models in order to provide users with experiences fitting to their specific needs. However, there is demand for more specific modules embodied in cognitive architecture that will detect abnormal cognitive decline across new synthetic task environments. Also, accessibility evaluation of graphical user interfaces (GUIs) requires considerable effort for enhancing ICT products accessibility for older adults. The main aim of this study is to develop and test virtual user models (VUM) simulating mild cognitive impairment (MCI) through novel specific modules, embodied at cognitive models and defined by estimations of cognitive parameters. Well-established MCI detection tests assessed users' cognition, elaborated their ability to perform multitasks, and monitored the performance of infotainment related tasks to provide more accurate simulation results on existing conceptual frameworks and enhanced predictive validity in interfaces' design supported by increased tasks' complexity to capture a more detailed profile of users' capabilities and limitations. The final outcome is a more robust cognitive prediction model, accurately fitted to human data to be used for more reliable interfaces' evaluation through simulation on the basis of virtual models of MCI users. PMID:26339282

  1. On Improving 4-km Mesoscale Model Simulations

    Science.gov (United States)

    Deng, Aijun; Stauffer, David R.

    2006-03-01

    A previous study showed that use of analysis-nudging four-dimensional data assimilation (FDDA) and improved physics in the fifth-generation Pennsylvania State University National Center for Atmospheric Research Mesoscale Model (MM5) produced the best overall performance on a 12-km-domain simulation, based on the 18 19 September 1983 Cross-Appalachian Tracer Experiment (CAPTEX) case. However, reducing the simulated grid length to 4 km had detrimental effects. The primary cause was likely the explicit representation of convection accompanying a cold-frontal system. Because no convective parameterization scheme (CPS) was used, the convective updrafts were forced on coarser-than-realistic scales, and the rainfall and the atmospheric response to the convection were too strong. The evaporative cooling and downdrafts were too vigorous, causing widespread disruption of the low-level winds and spurious advection of the simulated tracer. In this study, a series of experiments was designed to address this general problem involving 4-km model precipitation and gridpoint storms and associated model sensitivities to the use of FDDA, planetary boundary layer (PBL) turbulence physics, grid-explicit microphysics, a CPS, and enhanced horizontal diffusion. Some of the conclusions include the following: 1) Enhanced parameterized vertical mixing in the turbulent kinetic energy (TKE) turbulence scheme has shown marked improvements in the simulated fields. 2) Use of a CPS on the 4-km grid improved the precipitation and low-level wind results. 3) Use of the Hong and Pan Medium-Range Forecast PBL scheme showed larger model errors within the PBL and a clear tendency to predict much deeper PBL heights than the TKE scheme. 4) Combining observation-nudging FDDA with a CPS produced the best overall simulations. 5) Finer horizontal resolution does not always produce better simulations, especially in convectively unstable environments, and a new CPS suitable for 4-km resolution is needed. 6

  2. Performance of GeantV EM Physics Models

    Energy Technology Data Exchange (ETDEWEB)

    Amadio, G.; et al.

    2016-10-14

    The recent progress in parallel hardware architectures with deeper vector pipelines or many-cores technologies brings opportunities for HEP experiments to take advantage of SIMD and SIMT computing models. Launched in 2013, the GeantV project studies performance gains in propagating multiple particles in parallel, improving instruction throughput and data locality in HEP event simulation on modern parallel hardware architecture. Due to the complexity of geometry description and physics algorithms of a typical HEP application, performance analysis is indispensable in identifying factors limiting parallel execution. In this report, we will present design considerations and preliminary computing performance of GeantV physics models on coprocessors (Intel Xeon Phi and NVidia GPUs) as well as on mainstream CPUs.

  3. Performance of GeantV EM Physics Models

    Science.gov (United States)

    Amadio, G.; Ananya, A.; Apostolakis, J.; Aurora, A.; Bandieramonte, M.; Bhattacharyya, A.; Bianchini, C.; Brun, R.; Canal, P.; Carminati, F.; Cosmo, G.; Duhem, L.; Elvira, D.; Folger, G.; Gheata, A.; Gheata, M.; Goulas, I.; Iope, R.; Jun, S. Y.; Lima, G.; Mohanty, A.; Nikitina, T.; Novak, M.; Pokorski, W.; Ribon, A.; Seghal, R.; Shadura, O.; Vallecorsa, S.; Wenzel, S.; Zhang, Y.

    2017-10-01

    The recent progress in parallel hardware architectures with deeper vector pipelines or many-cores technologies brings opportunities for HEP experiments to take advantage of SIMD and SIMT computing models. Launched in 2013, the GeantV project studies performance gains in propagating multiple particles in parallel, improving instruction throughput and data locality in HEP event simulation on modern parallel hardware architecture. Due to the complexity of geometry description and physics algorithms of a typical HEP application, performance analysis is indispensable in identifying factors limiting parallel execution. In this report, we will present design considerations and preliminary computing performance of GeantV physics models on coprocessors (Intel Xeon Phi and NVidia GPUs) as well as on mainstream CPUs.

  4. Performance of GeantV EM Physics Models

    CERN Document Server

    Amadio, G; Apostolakis, J; Aurora, A; Bandieramonte, M; Bhattacharyya, A; Bianchini, C; Brun, R; Canal P; Carminati, F; Cosmo, G; Duhem, L; Elvira, D; Folger, G; Gheata, A; Gheata, M; Goulas, I; Iope, R; Jun, S Y; Lima, G; Mohanty, A; Nikitina, T; Novak, M; Pokorski, W; Ribon, A; Seghal, R; Shadura, O; Vallecorsa, S; Wenzel, S; Zhang, Y

    2017-01-01

    The recent progress in parallel hardware architectures with deeper vector pipelines or many-cores technologies brings opportunities for HEP experiments to take advantage of SIMD and SIMT computing models. Launched in 2013, the GeantV project studies performance gains in propagating multiple particles in parallel, improving instruction throughput and data locality in HEP event simulation on modern parallel hardware architecture. Due to the complexity of geometry description and physics algorithms of a typical HEP application, performance analysis is indispensable in identifying factors limiting parallel execution. In this report, we will present design considerations and preliminary computing performance of GeantV physics models on coprocessors (Intel Xeon Phi and NVidia GPUs) as well as on mainstream CPUs.

  5. Modeling lift operations with SAS Simulation Studio

    Science.gov (United States)

    Kar, Leow Soo

    2016-10-01

    Lifts or elevators are an essential part of multistorey buildings, providing vertical transportation for their occupants. In large and high-rise apartment buildings the occupants are permanent, while in buildings like hospitals or office blocks the occupants are temporary users of the buildings. They come in to work or to visit, and thus the population of such buildings is much higher than that of residential apartments. It is common these days for large office blocks or hospitals to have at least 8 to 10 lifts serving their population. In order to optimize the level of service performance, different transportation schemes are devised to control the lift operations. For example, one lift may be assigned to solely service the even floors and another solely the odd floors, etc. In this paper, a basic lift system is modelled using SAS Simulation Studio to study the effect of factors such as the number of floors, the capacity of the lift car, the arrival and exit rates of passengers at each floor, and peak and off-peak periods on the system performance. The simulation is applied to a real lift operation in Sunway College's North Building to validate the model.
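
    As a rough illustration of the discrete-event logic such a lift model encodes, the plain-Python sketch below simulates a single car sweeping a building under Poisson passenger arrivals and reports the mean passenger wait time. The floor count, car capacity, timings and the simple sweep dispatch policy are illustrative assumptions, not the Sunway College system or the SAS Simulation Studio model from this record.

```python
# Minimal single-lift discrete-event sketch; all parameters are illustrative.
import heapq
import random

def simulate_lift(n_floors=10, capacity=13, arrival_rate=0.2,
                  travel_time=3.0, stop_time=8.0, sim_time=3600.0, seed=1):
    rng = random.Random(seed)
    # Pre-generate passenger arrivals: (time, origin floor, destination floor)
    arrivals, t = [], 0.0
    while t < sim_time:
        t += rng.expovariate(arrival_rate)          # Poisson arrival process
        origin = rng.randrange(n_floors)
        dest = rng.choice([f for f in range(n_floors) if f != origin])
        heapq.heappush(arrivals, (t, origin, dest))

    waiting = {f: [] for f in range(n_floors)}       # (arrival time, dest) per floor
    onboard = []                                     # (board time, dest) of riders
    wait_times, clock, floor, direction = [], 0.0, 0, +1

    while arrivals or any(waiting.values()) or onboard:
        # Register everyone who has arrived by the time the lift reaches this stop
        while arrivals and arrivals[0][0] <= clock:
            at, origin, dest = heapq.heappop(arrivals)
            waiting[origin].append((at, dest))
        # Passengers alight and board at the current floor
        onboard = [(bt, d) for bt, d in onboard if d != floor]
        while waiting[floor] and len(onboard) < capacity:
            at, dest = waiting[floor].pop(0)
            wait_times.append(clock - at)
            onboard.append((clock, dest))
        clock += stop_time
        # Simple sweep policy: reverse direction at the terminal floors
        if floor == n_floors - 1: direction = -1
        if floor == 0: direction = +1
        floor += direction
        clock += travel_time

    return sum(wait_times) / len(wait_times) if wait_times else 0.0

print("mean wait (s):", round(simulate_lift(), 1))
```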

  6. Can we continue to ignore gender differences in performance on simulation trainers?

    Science.gov (United States)

    Thorson, Chad M; Kelly, Jason P; Forse, R Armour; Turaga, Kiran K

    2011-05-01

    There are differences between the genders in their innate performance on simulation trainers, which may impair accurate assessment of psychomotor skills. The performance of fourth-year students with no exposure to the Minimally Invasive Surgical Trainer was compared based on gender and other psychomotor skills. Our study included 16 male and 16 female students. After adjusting for choice of medical specialty, video game use (P=.6), and experience in the operating room (P=.4), female sex was an independent factor for worse performance (P=.04) in multivariate models. Women took more time than men (P<.01) and made more errors (29 versus 25 on 3 reps, P<.01). Among medical students with no previous exposure to laparoscopic trainers, female students perform worse than male students after adjusting for confounding factors. This difference must be recognized by training programs while using simulators for training and evaluation.

  7. Direct numerical simulation of reactor two-phase flows enabled by high-performance computing

    Energy Technology Data Exchange (ETDEWEB)

    Fang, Jun; Cambareri, Joseph J.; Brown, Cameron S.; Feng, Jinyong; Gouws, Andre; Li, Mengnan; Bolotnov, Igor A.

    2018-04-01

    Nuclear reactor two-phase flows remain a great engineering challenge, where the high-resolution two-phase flow database which can inform practical model development is still sparse due to the extreme reactor operation conditions and measurement difficulties. Owing to the rapid growth of computing power, the direct numerical simulation (DNS) is enjoying a renewed interest in investigating the related flow problems. A combination between DNS and an interface tracking method can provide a unique opportunity to study two-phase flows based on first principles calculations. More importantly, state-of-the-art high-performance computing (HPC) facilities are helping unlock this great potential. This paper reviews the recent research progress of two-phase flow DNS related to reactor applications. The progress in large-scale bubbly flow DNS has been focused not only on the sheer size of those simulations in terms of resolved Reynolds number, but also on the associated advanced modeling and analysis techniques. Specifically, the current areas of active research include modeling of sub-cooled boiling, bubble coalescence, as well as the advanced post-processing toolkit for bubbly flow simulations in reactor geometries. A novel bubble tracking method has been developed to track the evolution of bubbles in two-phase bubbly flow. Also, spectral analysis of DNS database in different geometries has been performed to investigate the modulation of the energy spectrum slope due to bubble-induced turbulence. In addition, the single-and two-phase analysis results are presented for turbulent flows within the pressurized water reactor (PWR) core geometries. The related simulations are possible to carry out only with the world leading HPC platforms. These simulations are allowing more complex turbulence model development and validation for use in 3D multiphase computational fluid dynamics (M-CFD) codes.

  8. MODELLING, SIMULATING AND OPTIMIZING BOILERS

    DEFF Research Database (Denmark)

    Sørensen, Kim; Condra, Thomas Joseph; Houbak, Niels

    2004-01-01

    In the present work a framework for optimizing the design of boilers for dynamic operation has been developed. A cost function to be minimized during the optimization has been formulated and for the present design variables related to the Boiler Volume and the Boiler load Gradient (i.e. firing rate...... on the boiler) have been defined. Furthermore a number of constraints related to: minimum and maximum boiler load gradient, minimum boiler size, Shrinking and Swelling and Steam Space Load have been defined. For defining the constraints related to the required boiler volume a dynamic model for simulating the boiler...... performance has been developed. Outputs from the simulations are shrinking and swelling of water level in the drum during for example a start-up of the boiler, these figures combined with the requirements with respect to allowable water level fluctuations in the drum defines the requirements with respect to drum

  9. Leukocyte Motility Models Assessed through Simulation and Multi-objective Optimization-Based Model Selection.

    Directory of Open Access Journals (Sweden)

    Mark N Read

    2016-09-01

    Full Text Available The advent of two-photon microscopy now reveals unprecedented, detailed spatio-temporal data on cellular motility and interactions in vivo. Understanding cellular motility patterns is key to gaining insight into the development and possible manipulation of the immune response. Computational simulation has become an established technique for understanding immune processes and evaluating hypotheses in the context of experimental data, and there is clear scope to integrate microscopy-informed motility dynamics. However, determining which motility model best reflects in vivo motility is non-trivial: 3D motility is an intricate process requiring several metrics to characterize. This complicates model selection and parameterization, which must be performed against several metrics simultaneously. Here we evaluate Brownian motion, Lévy walk and several correlated random walks (CRWs against the motility dynamics of neutrophils and lymph node T cells under inflammatory conditions by simultaneously considering cellular translational and turn speeds, and meandering indices. Heterogeneous cells exhibiting a continuum of inherent translational speeds and directionalities comprise both datasets, a feature significantly improving capture of in vivo motility when simulated as a CRW. Furthermore, translational and turn speeds are inversely correlated, and the corresponding CRW simulation again improves capture of our in vivo data, albeit to a lesser extent. In contrast, Brownian motion poorly reflects our data. Lévy walk is competitive in capturing some aspects of neutrophil motility, but T cell directional persistence only, therein highlighting the importance of evaluating models against several motility metrics simultaneously. This we achieve through novel application of multi-objective optimization, wherein each model is independently implemented and then parameterized to identify optimal trade-offs in performance against each metric. The resultant Pareto
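
    For readers unfamiliar with the motility models compared in this record, a correlated random walk is straightforward to simulate; the sketch below generates heterogeneous-speed CRW tracks and computes a meandering index, one of the metrics named above. The step count, the von Mises turn-angle concentration and the exponential speed distribution are illustrative assumptions, not the parameters fitted in the study.

```python
# Hedged sketch of a 2D correlated random walk (CRW) with a meandering index
# as one summary metric; parameters are illustrative, not fitted values.
import numpy as np

def simulate_crw(n_steps=200, mean_speed=5.0, turn_kappa=2.0, rng=None):
    """Return an (n_steps+1, 2) array of positions for one CRW track."""
    rng = rng or np.random.default_rng(0)
    heading = rng.uniform(0.0, 2.0 * np.pi)
    pos = np.zeros((n_steps + 1, 2))
    for i in range(n_steps):
        # Correlated turning: turn angles drawn from a von Mises distribution
        heading += rng.vonmises(0.0, turn_kappa)
        speed = rng.exponential(mean_speed)     # heterogeneous translational speeds
        pos[i + 1] = pos[i] + speed * np.array([np.cos(heading), np.sin(heading)])
    return pos

def meandering_index(track):
    """Net displacement over total path length (1 = straight, 0 = confined)."""
    net = np.linalg.norm(track[-1] - track[0])
    path = np.sum(np.linalg.norm(np.diff(track, axis=0), axis=1))
    return net / path if path > 0 else 0.0

tracks = [simulate_crw(rng=np.random.default_rng(seed)) for seed in range(50)]
print("mean meandering index:", np.mean([meandering_index(t) for t in tracks]))
```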

  10. AEGIS geologic simulation model

    International Nuclear Information System (INIS)

    Foley, M.G.

    1982-01-01

    The Geologic Simulation Model (GSM) is used by the AEGIS (Assessment of Effectiveness of Geologic Isolation Systems) program at the Pacific Northwest Laboratory to simulate the dynamic geology and hydrology of a geologic nuclear waste repository site over a million-year period following repository closure. The GSM helps to organize geologic/hydrologic data; to focus attention on active natural processes by requiring their simulation; and, through interactive simulation and calibration, to reduce subjective evaluations of the geologic system. During each computer run, the GSM produces a million-year geologic history that is possible for the region and the repository site. In addition, the GSM records in permanent history files everything that occurred during that time span. Statistical analyses of data in the history files of several hundred simulations are used to classify typical evolutionary paths, to establish the probabilities associated with deviations from the typical paths, and to determine which types of perturbations of the geologic/hydrologic system, if any, are most likely to occur. These simulations will be evaluated by geologists familiar with the repository region to determine validity of the results. Perturbed systems that are determined to be the most realistic, within whatever probability limits are established, will be used for the analyses that involve radionuclide transport and dose models. The GSM is designed to be continuously refined and updated. Simulation models are site specific, and, although the submodels may have limited general applicability, the input data requirements necessitate detailed characterization of each site before application.

  11. Human performance across decision making, selective attention, and working memory tasks: Experimental data and computer simulations.

    Science.gov (United States)

    Stocco, Andrea; Yamasaki, Brianna L; Prat, Chantel S

    2018-04-01

    This article describes the data analyzed in the paper "Individual differences in the Simon effect are underpinned by differences in the competitive dynamics in the basal ganglia: An experimental verification and a computational model" (Stocco et al., 2017) [1]. The data includes behavioral results from participants performing three cognitive tasks (Probabilistic Stimulus Selection (Frank et al., 2004) [2], Simon task (Craft and Simon, 1970) [3], and Automated Operation Span (Unsworth et al., 2005) [4]), as well as simulated traces generated by a computational neurocognitive model that accounts for individual variations in human performance across the tasks. The experimental data encompasses individual data files (in both preprocessed and native output format) as well as group-level summary files. The simulation data includes the entire model code, the results of a full-grid search of the model's parameter space, and the code used to partition the model space and parallelize the simulations. Finally, the repository includes the R scripts used to carry out the statistical analyses reported in the original paper.

  12. Plant-Level Modeling and Simulation of Used Nuclear Fuel Dissolution

    Energy Technology Data Exchange (ETDEWEB)

    de Almeida, Valmor F. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2012-09-07

    Plant-level modeling and simulation of a used nuclear fuel prototype dissolver is presented. Emphasis is given to developing a modeling and simulation approach to be explored by other processes involved in the recycle of used fuel. The commonality concepts presented in a previous communication were used to create a model and realize its software module. An initial model was established based on a theory of chemical thermomechanical network transport outlined previously. A software module prototype was developed with the required external behavior and internal mathematical structure. Results obtained demonstrate the generality of the design approach and establish an extensible mathematical model with its corresponding software module for a wide range of dissolvers. Scale-up numerical tests were made varying the type of used fuel (breeder and light-water reactors) and the capacity of dissolution (0.5 t/d to 1.7 t/d). These tests were motivated by user requirements in the area of nuclear materials safeguards. A computer module written in high-level programming languages (MATLAB and Octave) was developed, tested, and provided as open-source code (MATLAB) for integration into the Separations and Safeguards Performance Model application in development at Sandia National Laboratories. The modeling approach presented here is intended to serve as a template for a rational modeling of all plant-level modules. This will facilitate the practical application of the commonality features underlying the unifying network transport theory proposed recently. In addition, by example, this model describes, explicitly, the needed data from sub-scale models, and logical extensions for future model development. For example, from thermodynamics, an off-line simulation of molecular dynamics could quantify partial molar volumes for the species in the liquid phase; this simulation is currently within reach for high-performance computing. From fluid mechanics, a hold-up capacity function is needed

  13. Time-domain simulation and nonlinear analysis on ride performance of four-wheel vehicles

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Y S; He, H; Geng, A L [School of Automobile and Traffic Engineering, Liaoning University of Technology, Jinzhou 121001 (China)], E-mail: jzwbt@163.com

    2008-02-15

    A nonlinear dynamic model with eight DOFs of a four-wheel vehicle is established in this paper. After detaching the nonlinear characteristics of the leaf springs and shock absorbers, the multi-step linearizing method is used to simulate the vehicle vibration in the time domain, under a correlated four-wheel road roughness model. Experimental verifications suggest that the newly built vehicle model and simulation procedure are reasonable and feasible to be used in vehicle vibration analysis. Furthermore, some nonlinear factors of the leaf springs and shock absorbers, which affect the vehicle ride performance (or comfort), are investigated under different vehicle running speeds. Some substantial rules of the nonlinear vehicle vibrations are revealed in this paper.

  14. Time-domain simulation and nonlinear analysis on ride performance of four-wheel vehicles

    International Nuclear Information System (INIS)

    Wang, Y S; He, H; Geng, A L

    2008-01-01

    A nonlinear dynamic model with eight DOFs of a four-wheel vehicle is established in this paper. After detaching the nonlinear characteristics of the leaf springs and shock absorbers, the multi-step linearizing method is used to simulate the vehicle vibration in the time domain, under a correlated four-wheel road roughness model. Experimental verifications suggest that the newly built vehicle model and simulation procedure are reasonable and feasible to be used in vehicle vibration analysis. Furthermore, some nonlinear factors of the leaf springs and shock absorbers, which affect the vehicle ride performance (or comfort), are investigated under different vehicle running speeds. Some substantial rules of the nonlinear vehicle vibrations are revealed in this paper.
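
    The eight-DOF model of the two records above is beyond a short sketch, but the basic idea of time-domain ride simulation under a road input can be illustrated with a linear two-DOF quarter-car model; all masses, stiffnesses, damping values and the bump input below are generic illustrative numbers, not those of the vehicle studied.

```python
# Minimal quarter-car (2-DOF) ride simulation sketch; parameter values are
# generic assumptions, not the 8-DOF leaf-spring/shock-absorber model above.
import numpy as np
from scipy.integrate import solve_ivp

ms, mu = 300.0, 40.0          # sprung / unsprung mass (kg)
ks, kt = 20000.0, 180000.0    # suspension / tyre stiffness (N/m)
cs = 1500.0                   # suspension damping (N s/m)

def road(t, bump=0.05):
    """Single half-sine bump of 0.1 s duration encountered at t = 1 s."""
    tau = t - 1.0
    return bump * np.sin(np.pi * tau / 0.1) if 0.0 <= tau <= 0.1 else 0.0

def rhs(t, y):
    zs, vs, zu, vu = y                      # body and wheel displacement/velocity
    zr = road(t)
    fs = ks * (zu - zs) + cs * (vu - vs)    # suspension force on the body
    ft = kt * (zr - zu)                     # tyre force on the wheel
    return [vs, fs / ms, vu, (-fs + ft) / mu]

sol = solve_ivp(rhs, (0.0, 5.0), [0.0, 0.0, 0.0, 0.0], max_step=1e-3)
body_acc = np.gradient(sol.y[1], sol.t)     # body acceleration, a common comfort metric
print("RMS body acceleration (m/s^2):", float(np.sqrt(np.mean(body_acc**2))))
```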

  15. A Simulation Study on the Performance of Radiant Ceilings Combined with Free-Hanging Horizontal Sound Absorbers

    DEFF Research Database (Denmark)

    Kazanci, Ongun Berk; Domínguez, L. Marcos; Rage, Niels

    2018-01-01

    using TABS, most building simulation models assume an uncovered ceiling; however, this might not be the case in practice, due to the use of free-hanging horizontal (or vertical) sound absorbers for the control of room acoustic conditions. The use of sound absorbers will decrease the performance...... of radiant ceiling cooling systems. Therefore, the quantification of the effects during the design phase is important for predicting the resulting thermal indoor environment and for system dimensioning. In this study, a two-person office room equipped with TABS was simulated using a commercially available...... simulation software with a recently developed plug-in that allows simulating the effects of horizontal sound absorbers on the performance of TABS and on the thermal indoor environment. The change in thermal indoor environment and in performance of TABS were quantified, and the simulation results were...

  16. Simulation of Lake Surface Heat Fluxes by the Canadian Small Lake Model: Offline Performance Assessment for Future Coupling with a Regional Climate Model

    Science.gov (United States)

    Pernica, P.; Guerrero, J. L.; MacKay, M.; Wheater, H. S.

    2014-12-01

    Lakes strongly influence local and regional climate especially in regions where they are abundant. Development of a lake model for the purpose of integration within a regional climate model is therefore a subject of scientific interest. Of particular importance are the heat flux predictions provided by the lake model since they function as key forcings in a fully coupled atmosphere-land-lake system. The first step towards a coupled model is to validate and characterize the accuracy of the lake model over a range of conditions and to identify limitations. In this work, validation results from offline tests of the Canadian Small Lake Model; a deterministic, computationally efficient, 1D integral model, are presented. Heat fluxes (sensible and latent) and surface water temperatures simulated by the model are compared with in situ observations from two lakes; Landing Lake (NWT, Canada) and L239 (ELA, Canada) for the 2007-2009 period. Sensitivity analysis is performed to identify key parameters important for heat flux predictions. The results demonstrate the ability of the 1-D lake model to reproduce both diurnal and seasonal variations in heat fluxes and surface temperatures for the open water period. These results, in context of regional climate modelling are also discussed.
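
    The sensible and latent heat fluxes that such a lake model supplies to the atmosphere are commonly estimated with bulk aerodynamic formulas; the sketch below shows that standard calculation with constant, illustrative transfer coefficients, and is not the specific parameterization of the Canadian Small Lake Model.

```python
# Bulk aerodynamic estimate of lake surface sensible and latent heat fluxes.
# Constant transfer coefficients and the Magnus-type saturation fit are
# illustrative assumptions, not the Canadian Small Lake Model formulation.
import math

RHO_AIR = 1.2      # air density (kg/m^3)
CP_AIR = 1005.0    # specific heat of air (J/kg/K)
LV = 2.5e6         # latent heat of vaporization (J/kg)
CH = CE = 1.3e-3   # bulk transfer coefficients (dimensionless)

def sat_specific_humidity(temp_c, pressure_pa=101325.0):
    """Saturation specific humidity from a Magnus-type vapour pressure fit."""
    e_sat = 611.2 * math.exp(17.62 * temp_c / (243.12 + temp_c))  # Pa
    return 0.622 * e_sat / (pressure_pa - 0.378 * e_sat)

def surface_fluxes(t_surface_c, t_air_c, q_air, wind_speed):
    """Return (sensible, latent) heat fluxes in W/m^2, positive upward."""
    sensible = RHO_AIR * CP_AIR * CH * wind_speed * (t_surface_c - t_air_c)
    latent = RHO_AIR * LV * CE * wind_speed * (sat_specific_humidity(t_surface_c) - q_air)
    return sensible, latent

# Example: an 18 C lake surface under 15 C air at 60% of saturation humidity
q_air = 0.6 * sat_specific_humidity(15.0)
print(surface_fluxes(18.0, 15.0, q_air, wind_speed=5.0))
```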

  17. A Monte Carlo simulation study of the impact of novel scintillation crystals on performance characteristics of PET scanners

    DEFF Research Database (Denmark)

    Ghabrial, Amir; Franklin, Daniel; Zaidi, Habib

    2018-01-01

    Objective: The purpose of this study is to validate a Monte Carlo simulation model for the clinical Siemens Biograph mCT PET scanner using the GATE simulation toolkit, and to evaluate the performance of six different scintillation materials in this model using the National Electrical Manufacturers

  18. Simulated training in colonoscopic stenting of colonic strictures: validation of a cadaver model.

    Science.gov (United States)

    Iordache, F; Bucobo, J C; Devlin, D; You, K; Bergamaschi, R

    2015-07-01

    There are currently no available simulation models for training in colonoscopic stent deployment. The aim of this study was to validate a cadaver model for simulation training in colonoscopy with stent deployment for colonic strictures. This was a prospective study enrolling surgeons at a single institution. Participants performed colonoscopic stenting on a cadaver model. Their performance was assessed by two independent observers. Measurements were performed for quantitative analysis (time to identify stenosis, time for deployment, accuracy) and a weighted score was devised for assessment. The Mann-Whitney U-test and Student's t-test were used for nonparametric and parametric data, respectively. Cohen's kappa coefficient was used for reliability. Twenty participants performed a colonoscopy with deployment of a self-expandable metallic stent in two cadavers (groups A and B) with 20 strictures overall. The median time was 206 s. The model was able to differentiate between experts and novices (P = 0.013). The results showed a good consensus estimate of reliability, with kappa = 0.571. The cadaver model described in this study has content, construct and concurrent validity for simulation training in colonoscopic deployment of self-expandable stents for colonic strictures. Further studies are needed to evaluate the predictive validity of this model in terms of skill transfer to clinical practice. Colorectal Disease © 2014 The Association of Coloproctology of Great Britain and Ireland.

  19. Simulation and experimental validation of the performance of an absorption refrigerator

    International Nuclear Information System (INIS)

    Olbricht, Michael; Luke, Andrea

    2015-01-01

    The two biggest obstacles to a stronger market penetration of absorption refrigerators are their high cost and the size of the apparatus, both of which are due to inaccurate methods for plant design. In order to contribute to an improved design, a thermodynamic model is presented to describe the performance of an absorption refrigerator with the working fluid water/lithium. In this model, the processes in the individual apparatuses are represented and coupled to each other in the context of the overall system. In this way the interactions between the apparatuses can be specifically investigated and the process-limiting component can be identified under the respective conditions. A validation of the simulation model and the boundary conditions used is carried out based on experimental data from the operation of a self-developed absorption refrigerator. In the simulation, the heat transfer surfaces can be specified in accordance with the real system. The heat transport is taken into account based on typical values for the heat transfer in the individual apparatuses. Simulation results show good agreement with the experimental data. The physical relationships and the influences of externally defined operating parameters are correctly reproduced. Due to the low heat transfer coefficients chosen, the cooling capacities calculated by the model are below those measured experimentally. Finally, the possibilities and limitations of using the model are discussed and further possibilities for improvement are suggested. [de]

  20. Performance modelling of plasma microthruster nozzles in vacuum

    Science.gov (United States)

    Ho, Teck Seng; Charles, Christine; Boswell, Rod

    2018-05-01

    Computational fluid dynamics and plasma simulations of three geometrical variations of the Pocket Rocket radiofrequency plasma electrothermal microthruster are conducted, comparing pulsed plasma to steady state cold gas operation. While numerical limitations prevent plasma modelling in a vacuum environment, results may be obtained by extrapolating from plasma simulations performed in a pressurised environment, using the performance delta from cold gas simulations performed in both environments. Slip regime boundary layer effects are significant at these operating conditions. The present investigation targets a power budget of ˜10 W for applications on CubeSats. During plasma operation, the thrust force increases by ˜30% with a power efficiency of ˜30 μNW-1. These performance metrics represent instantaneous or pulsed operation and will increase over time as the discharge chamber attains thermal equilibrium with the heated propellant. Additionally, the sculpted nozzle geometry achieves plasma confinement facilitated by the formation of a plasma sheath at the nozzle throat, and fast recombination ensures a neutral exhaust plume that avoids the contamination of solar panels and interference with externally mounted instruments.

  1. Shuttle/TDRSS modelling and link simulation study

    Science.gov (United States)

    Braun, W. R.; Mckenzie, T. M.; Biederman, L.; Lindsey, W. C.

    1979-01-01

    A Shuttle/TDRSS S-band and Ku-band link simulation package called LinCsim was developed for the evaluation of link performance for specific Shuttle signal designs. The link models were described in detail and the transmitter distortion parameters or user constraints were carefully defined. The overall link degradation (excluding hardware degradations) relative to an ideal BPSK channel were given for various sets of user constraint values. The performance sensitivity to each individual user constraint was then illustrated. The effect of excessive Spacelab clock jitter on the return link BER performance was also investigated as was the problem of subcarrier recovery for the K-band Shuttle return link signal.

  2. Modeling and simulation challenges pursued by the Consortium for Advanced Simulation of Light Water Reactors (CASL)

    Science.gov (United States)

    Turinsky, Paul J.; Kothe, Douglas B.

    2016-05-01

    The Consortium for the Advanced Simulation of Light Water Reactors (CASL), the first Energy Innovation Hub of the Department of Energy, was established in 2010 with the goal of providing modeling and simulation (M&S) capabilities that support and accelerate the improvement of nuclear energy's economic competitiveness and the reduction of spent nuclear fuel volume per unit energy, and all while assuring nuclear safety. To accomplish this requires advances in M&S capabilities in radiation transport, thermal-hydraulics, fuel performance and corrosion chemistry. To focus CASL's R&D, industry challenge problems have been defined, which equate with long standing issues of the nuclear power industry that M&S can assist in addressing. To date CASL has developed a multi-physics ;core simulator; based upon pin-resolved radiation transport and subchannel (within fuel assembly) thermal-hydraulics, capitalizing on the capabilities of high performance computing. CASL's fuel performance M&S capability can also be optionally integrated into the core simulator, yielding a coupled multi-physics capability with untapped predictive potential. Material models have been developed to enhance predictive capabilities of fuel clad creep and growth, along with deeper understanding of zirconium alloy clad oxidation and hydrogen pickup. Understanding of corrosion chemistry (e.g., CRUD formation) has evolved at all scales: micro, meso and macro. CFD R&D has focused on improvement in closure models for subcooled boiling and bubbly flow, and the formulation of robust numerical solution algorithms. For multiphysics integration, several iterative acceleration methods have been assessed, illuminating areas where further research is needed. Finally, uncertainty quantification and data assimilation techniques, based upon sampling approaches, have been made more feasible for practicing nuclear engineers via R&D on dimensional reduction and biased sampling. Industry adoption of CASL's evolving M

  3. Utilization of Short-Simulations for Tuning High-Resolution Climate Model

    Science.gov (United States)

    Lin, W.; Xie, S.; Ma, P. L.; Rasch, P. J.; Qian, Y.; Wan, H.; Ma, H. Y.; Klein, S. A.

    2016-12-01

    Many physical parameterizations in atmospheric models are sensitive to resolution. Tuning the models that involve a multitude of parameters at high resolution is computationally expensive, particularly when relying primarily on multi-year simulations. This work describes a complementary set of strategies for tuning high-resolution atmospheric models, using ensembles of short simulations to reduce the computational cost and elapsed time. Specifically, we utilize the hindcast approach developed through the DOE Cloud Associated Parameterization Testbed (CAPT) project for high-resolution model tuning. Such short hindcast tests have been found to be effective in numerous previous studies in identifying model biases due to parameterized fast physics, and we demonstrate that they are also useful for tuning. After the most egregious errors are addressed through an initial "rough" tuning phase, longer simulations are performed to "hone in" on model features that evolve over longer timescales. We explore these strategies to tune the DOE ACME (Accelerated Climate Modeling for Energy) model. For the ACME model at 0.25° resolution, it is confirmed that, given the same parameters, major biases in global mean statistics and many spatial features are consistent between Atmospheric Model Intercomparison Project (AMIP)-type simulations and CAPT-type hindcasts, with just a small number of short-term simulations for the latter over the corresponding season. The use of CAPT hindcasts to find parameter choices for the reduction of large model biases dramatically improves the turnaround time for the tuning at high resolution. Improvement seen in CAPT hindcasts generally translates to improved AMIP-type simulations. An iterative CAPT-AMIP tuning approach is therefore adopted during each major tuning cycle, with the former to survey the likely responses and narrow the parameter space, and the latter to verify the results in climate context along with assessment in

  4. Global climate model performance over Alaska and Greenland

    DEFF Research Database (Denmark)

    Walsh, John E.; Chapman, William L.; Romanovsky, Vladimir

    2008-01-01

    The performance of a set of 15 global climate models used in the Coupled Model Intercomparison Project is evaluated for Alaska and Greenland, and compared with the performance over broader pan-Arctic and Northern Hemisphere extratropical domains. Root-mean-square errors relative to the 1958...... to narrowing the uncertainty and obtaining more robust estimates of future climate change in regions such as Alaska, Greenland, and the broader Arctic....... of the models are generally much larger than the biases of the composite output, indicating that the systematic errors differ considerably among the models. There is a tendency for the models with smaller errors to simulate a larger greenhouse warming over the Arctic, as well as larger increases of Arctic...

  5. Evaluation of Three Models for Simulating Pesticide Runoff from Irrigated Agricultural Fields.

    Science.gov (United States)

    Zhang, Xuyang; Goh, Kean S

    2015-11-01

    Three models were evaluated for their accuracy in simulating pesticide runoff at the edge of agricultural fields: Pesticide Root Zone Model (PRZM), Root Zone Water Quality Model (RZWQM), and OpusCZ. Modeling results on runoff volume, sediment erosion, and pesticide loss were compared with measurements taken from field studies. Models were also compared on their theoretical foundations and ease of use. For runoff events generated by sprinkler irrigation and rainfall, all models performed equally well with small errors in simulating water, sediment, and pesticide runoff. The mean absolute percentage errors (MAPEs) were between 3 and 161%. For flood irrigation, OpusCZ simulated runoff and pesticide mass with the highest accuracy, followed by RZWQM and PRZM, likely owing to its unique hydrological algorithm for runoff simulations during flood irrigation. Simulation results from cold model runs by OpusCZ and RZWQM using measured values for model inputs matched closely to the observed values. The MAPE ranged from 28 to 384 and 42 to 168% for OpusCZ and RZWQM, respectively. These satisfactory model outputs showed the models' abilities in mimicking reality. Theoretical evaluations indicated that OpusCZ and RZWQM use mechanistic approaches for hydrology simulation, output data on a subdaily time-step, and were able to simulate management practices and subsurface flow via tile drainage. In contrast, PRZM operates at a daily time-step and simulates surface runoff using the USDA Soil Conservation Service's curve number method. Among the three models, OpusCZ and RZWQM were suitable for simulating pesticide runoff in semiarid areas where agriculture is heavily dependent on irrigation. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
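
    The mean absolute percentage error used above to score the three models against field measurements is simple to reproduce; the observed and simulated runoff values in the sketch below are made up purely to illustrate the calculation.

```python
# Mean absolute percentage error (MAPE), the summary statistic quoted in the
# record above; the runoff numbers are hypothetical illustration values.
def mape(observed, simulated):
    pairs = [(o, s) for o, s in zip(observed, simulated) if o != 0]
    return 100.0 * sum(abs((o - s) / o) for o, s in pairs) / len(pairs)

observed_runoff = [12.0, 8.5, 30.1, 4.2]     # e.g. mm per event (hypothetical)
simulated_runoff = [10.5, 9.9, 24.0, 6.0]    # model output for the same events
print(f"MAPE = {mape(observed_runoff, simulated_runoff):.1f}%")
```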

  6. Simulation models generator. Applications in scheduling

    Directory of Open Access Journals (Sweden)

    Omar Danilo Castrillón

    2013-08-01

    Rev. Mate. Teor. Aplic. (ISSN 1409-2433), Vol. 20(2): 231–241, July 2013. … in order to approximate reality and thus support more assertive decision making. To test the prototype, a production system with 9 machines and 5 jobs in a job shop configuration was used as the modeling example, with stochastic processing times and machine stoppages, measuring machine utilization rates and the average time of jobs in the system as measures of system performance. This test shows the usefulness of the prototype in sparing the user the work of building the simulation model.

  7. Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) verification and validation plan. version 1.

    Energy Technology Data Exchange (ETDEWEB)

    Bartlett, Roscoe Ainsworth; Arguello, Jose Guadalupe, Jr.; Urbina, Angel; Bouchard, Julie F.; Edwards, Harold Carter; Freeze, Geoffrey A.; Knupp, Patrick Michael; Wang, Yifeng; Schultz, Peter Andrew; Howard, Robert (Oak Ridge National Laboratory, Oak Ridge, TN); McCornack, Marjorie Turner

    2011-01-01

    The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M&S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. To meet this objective, NEAMS Waste IPSC M&S capabilities will be applied to challenging spatial domains, temporal domains, multiphysics couplings, and multiscale couplings. A strategic verification and validation (V&V) goal is to establish evidence-based metrics for the level of confidence in M&S codes and capabilities. Because it is economically impractical to apply the maximum V&V rigor to each and every M&S capability, M&S capabilities will be ranked for their impact on the performance assessments of various components of the repository systems. Those M&S capabilities with greater impact will require a greater level of confidence and a correspondingly greater investment in V&V. This report includes five major components: (1) a background summary of the NEAMS Waste IPSC to emphasize M&S challenges; (2) the conceptual foundation for verification, validation, and confidence assessment of NEAMS Waste IPSC M&S capabilities; (3) specifications for the planned verification, validation, and confidence-assessment practices; (4) specifications for the planned evidence information management system; and (5) a path forward for the incremental implementation of this V&V plan.

  8. Modeling the Performance of Water-Zeolite 13X Adsorption Heat Pump

    Science.gov (United States)

    Kowalska, Kinga; Ambrożek, Bogdan

    2017-12-01

    The dynamic performance of a cylindrical double-tube adsorption heat pump is numerically analysed using a non-equilibrium model, which takes into account both heat and mass transfer processes. The model includes conservation equations for: heat transfer in heating/cooling fluids, heat transfer in the metal tube, and heat and mass transfer in the adsorbent. The mathematical model is numerically solved using the method of lines. Numerical simulations are performed for the system water-zeolite 13X, chosen as the working pair. The effect of the evaporator and condenser temperatures on the adsorption and desorption kinetics is examined. The results of the numerical investigation show that both of these parameters have a significant effect on the adsorption heat pump performance. Based on computer simulation results, the values of the coefficients of performance for heating and cooling are calculated. The results show that adsorption heat pumps have relatively low efficiency compared to other heat pumps. The value of the coefficient of performance for heating is higher than for cooling.
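
    The heating and cooling coefficients of performance reported from these simulations follow from an energy balance over the adsorption cycle; the sketch below shows that bookkeeping with hypothetical heat duties, since the record does not give the actual values.

```python
# Coefficients of performance for an adsorption heat pump from an overall
# energy balance; the heat duties are hypothetical placeholders, not results
# of the water-zeolite 13X simulations in the record above.
def cop_cooling(q_evaporator, q_desorption):
    """Useful cold delivered per unit of driving heat."""
    return q_evaporator / q_desorption

def cop_heating(q_condenser, q_adsorption, q_desorption):
    """Useful heat (condenser + adsorber) per unit of driving heat."""
    return (q_condenser + q_adsorption) / q_desorption

q_evap, q_cond, q_ads, q_des = 1.0, 1.1, 1.3, 2.0   # kJ per cycle, illustrative
print("COP_cooling =", round(cop_cooling(q_evap, q_des), 2))
print("COP_heating =", round(cop_heating(q_cond, q_ads, q_des), 2))
```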

  9. LIAR -- A computer program for the modeling and simulation of high performance linacs

    International Nuclear Information System (INIS)

    Assmann, R.; Adolphsen, C.; Bane, K.; Emma, P.; Raubenheimer, T.; Siemann, R.; Thompson, K.; Zimmermann, F.

    1997-04-01

    The computer program LIAR (LInear Accelerator Research Code) is a numerical modeling and simulation tool for high performance linacs. Amongst others, it addresses the needs of state-of-the-art linear colliders where low emittance, high-intensity beams must be accelerated to energies in the 0.05-1 TeV range. LIAR is designed to be used for a variety of different projects. LIAR allows the study of single- and multi-particle beam dynamics in linear accelerators. It calculates emittance dilutions due to wakefield deflections, linear and non-linear dispersion and chromatic effects in the presence of multiple accelerator imperfections. Both single-bunch and multi-bunch beams can be simulated. Several basic and advanced optimization schemes are implemented. Present limitations arise from the incomplete treatment of bending magnets and sextupoles. A major objective of the LIAR project is to provide an open programming platform for the accelerator physics community. Due to its design, LIAR allows straight-forward access to its internal FORTRAN data structures. The program can easily be extended and its interactive command language ensures maximum ease of use. Presently, versions of LIAR are compiled for UNIX and MS Windows operating systems. An interface for the graphical visualization of results is provided. Scientific graphs can be saved in the PS and EPS file formats. In addition a Mathematica interface has been developed. LIAR now contains more than 40,000 lines of source code in more than 130 subroutines. This report describes the theoretical basis of the program, provides a reference for existing features and explains how to add further commands. The LIAR home page and the ONLINE version of this manual can be accessed under: http://www.slac.stanford.edu/grp/arb/rwa/liar.htm

  10. Digital Quantum Simulation of Spin Models with Circuit Quantum Electrodynamics

    Directory of Open Access Journals (Sweden)

    Y. Salathé

    2015-06-01

    Full Text Available Systems of interacting quantum spins show a rich spectrum of quantum phases and display interesting many-body dynamics. Computing characteristics of even small systems on conventional computers poses significant challenges. A quantum simulator has the potential to outperform standard computers in calculating the evolution of complex quantum systems. Here, we perform a digital quantum simulation of the paradigmatic Heisenberg and Ising interacting spin models using a two transmon-qubit circuit quantum electrodynamics setup. We make use of the exchange interaction naturally present in the simulator to construct a digital decomposition of the model-specific evolution and extract its full dynamics. This approach is universal and efficient, employing only resources that are polynomial in the number of spins, and indicates a path towards the controlled simulation of general spin dynamics in superconducting qubit platforms.
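
    The digital approach described above decomposes the evolution under a spin Hamiltonian into a sequence of short steps (a Trotter decomposition). The numpy sketch below reproduces that idea classically for a two-spin transverse-field Ising model, purely to illustrate the decomposition and its accuracy; it is not the circuit-QED implementation of the paper.

```python
# Classical check of a first-order Trotter decomposition for a two-spin
# transverse-field Ising model; coupling, field and time are illustrative.
import numpy as np
from scipy.linalg import expm

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
I = np.eye(2, dtype=complex)

J, h, t, n_steps = 1.0, 0.5, 1.0, 50
H_zz = J * np.kron(Z, Z)                    # interaction term
H_x = h * (np.kron(X, I) + np.kron(I, X))   # transverse-field term
H = H_zz + H_x

exact = expm(-1j * H * t)
dt = t / n_steps
step = expm(-1j * H_zz * dt) @ expm(-1j * H_x * dt)   # one Trotter step
trotter = np.linalg.matrix_power(step, n_steps)

psi0 = np.array([1, 0, 0, 0], dtype=complex)           # |00> initial state
fidelity = abs(np.vdot(exact @ psi0, trotter @ psi0)) ** 2
print("Trotter fidelity vs exact evolution:", round(fidelity, 6))
```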

  11. Modeling and simulation challenges pursued by the Consortium for Advanced Simulation of Light Water Reactors (CASL)

    Energy Technology Data Exchange (ETDEWEB)

    Turinsky, Paul J., E-mail: turinsky@ncsu.edu [North Carolina State University, PO Box 7926, Raleigh, NC 27695-7926 (United States); Kothe, Douglas B., E-mail: kothe@ornl.gov [Oak Ridge National Laboratory, PO Box 2008, Oak Ridge, TN 37831-6164 (United States)

    2016-05-15

    The Consortium for the Advanced Simulation of Light Water Reactors (CASL), the first Energy Innovation Hub of the Department of Energy, was established in 2010 with the goal of providing modeling and simulation (M&S) capabilities that support and accelerate the improvement of nuclear energy's economic competitiveness and the reduction of spent nuclear fuel volume per unit energy, and all while assuring nuclear safety. To accomplish this requires advances in M&S capabilities in radiation transport, thermal-hydraulics, fuel performance and corrosion chemistry. To focus CASL's R&D, industry challenge problems have been defined, which equate with long standing issues of the nuclear power industry that M&S can assist in addressing. To date CASL has developed a multi-physics “core simulator” based upon pin-resolved radiation transport and subchannel (within fuel assembly) thermal-hydraulics, capitalizing on the capabilities of high performance computing. CASL's fuel performance M&S capability can also be optionally integrated into the core simulator, yielding a coupled multi-physics capability with untapped predictive potential. Material models have been developed to enhance predictive capabilities of fuel clad creep and growth, along with deeper understanding of zirconium alloy clad oxidation and hydrogen pickup. Understanding of corrosion chemistry (e.g., CRUD formation) has evolved at all scales: micro, meso and macro. CFD R&D has focused on improvement in closure models for subcooled boiling and bubbly flow, and the formulation of robust numerical solution algorithms. For multiphysics integration, several iterative acceleration methods have been assessed, illuminating areas where further research is needed. Finally, uncertainty quantification and data assimilation techniques, based upon sampling approaches, have been made more feasible for practicing nuclear engineers via R&D on dimensional reduction and biased sampling. Industry adoption of CASL

  12. Team Culture and Business Strategy Simulation Performance

    Science.gov (United States)

    Ritchie, William J.; Fornaciari, Charles J.; Drew, Stephen A. W.; Marlin, Dan

    2013-01-01

    Many capstone strategic management courses use computer-based simulations as core pedagogical tools. Simulations are touted as assisting students in developing much-valued skills in strategy formation, implementation, and team management in the pursuit of superior strategic performance. However, despite their rich nature, little is known regarding…

  13. Numerical Simulations of a Multiscale Model of Stratified Langmuir Circulation

    Science.gov (United States)

    Malecha, Ziemowit; Chini, Gregory; Julien, Keith

    2012-11-01

    Langmuir circulation (LC), a prominent form of wind and surface-wave driven shear turbulence in the ocean surface boundary layer (BL), is commonly modeled using the Craik-Leibovich (CL) equations, a phase-averaged variant of the Navier-Stokes (NS) equations. Although surface-wave filtering renders the CL equations more amenable to simulation than are the instantaneous NS equations, simulations in wide domains, hundreds of times the BL depth, currently earn the ``grand challenge'' designation. To facilitate simulations of LC in such spatially-extended domains, we have derived multiscale CL equations by exploiting the scale separation between submesoscale and BL flows in the upper ocean. The numerical algorithm for simulating this multiscale model resembles super-parameterization schemes used in meteorology, but retains a firm mathematical basis. We have validated our algorithm and here use it to perform multiscale simulations of the interaction between LC and upper ocean density stratification. ZMM, GPC, KJ gratefully acknowledge funding from NSF CMG Award 0934827.

  14. Reimplementation of the Biome-BGC model to simulate successional change.

    Science.gov (United States)

    Bond-Lamberty, Ben; Gower, Stith T; Ahl, Douglas E; Thornton, Peter E

    2005-04-01

    Biogeochemical process models are increasingly employed to simulate current and future forest dynamics, but most simulate only a single canopy type. This limitation means that mixed stands, canopy succession and understory dynamics cannot be modeled, severe handicaps in many forests. The goals of this study were to develop a version of Biome-BGC that supported multiple, interacting vegetation types, and to assess its performance and limitations by comparing modeled results to published data from a 150-year boreal black spruce (Picea mariana (Mill.) BSP) chronosequence in northern Manitoba, Canada. Model data structures and logic were modified to support an arbitrary number of interacting vegetation types; an explicit height calculation was necessary to prioritize radiation and precipitation interception. Two vegetation types, evergreen needle-leaf and deciduous broadleaf, were modeled based on site-specific meteorological and physiological data. The new version of Biome-BGC reliably simulated observed changes in leaf area, net primary production and carbon stocks, and should be useful for modeling the dynamics of mixed-species stands and ecological succession. We discuss the strengths and limitations of Biome-BGC for this application, and note areas in which further work is necessary for reliable simulation of boreal biogeochemical cycling at a landscape scale.

  15. Simulation of nuclear plant operation into a stochastic energy production model

    International Nuclear Information System (INIS)

    Pacheco, R.L.

    1983-04-01

    A simulation model of nuclear plant operation is developed to fit into a stochastic energy production model. In order to improve the stochastic model used, and also reduce its computational time burdened by the aggregation of the model of nuclear plant operation, a study of tail truncation of the unsupplied demand distribution function has been performed. (E.G.) [pt
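
    The tail-truncation study mentioned in this record amounts to asking how much the expected unsupplied energy changes when the upper tail of the unsupplied-demand distribution is ignored beyond a cutoff. The sketch below illustrates that comparison by capping a made-up lognormal distribution at several cutoffs; the actual distribution produced by the stochastic production model is not given in the record.

```python
# Effect of ignoring the tail of an unsupplied-demand distribution on the
# expected unserved energy; the lognormal shape and the cutoffs are made-up
# illustrations, not the distribution of the stochastic model above.
import numpy as np

rng = np.random.default_rng(42)
unsupplied = rng.lognormal(mean=3.0, sigma=1.0, size=200_000)   # MWh, hypothetical

def expected_unserved(samples, cutoff=None):
    """Mean unsupplied energy, optionally capping values beyond `cutoff`."""
    if cutoff is not None:
        samples = np.minimum(samples, cutoff)   # cap (truncate) the tail
    return samples.mean()

full = expected_unserved(unsupplied)
for cutoff in (200.0, 100.0, 50.0):
    capped = expected_unserved(unsupplied, cutoff)
    print(f"cutoff {cutoff:6.1f} MWh: relative error {100*(full-capped)/full:5.2f}%")
```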

  16. THE MARK I BUSINESS SYSTEM SIMULATION MODEL

    Science.gov (United States)

    of a large-scale business simulation model as a vehicle for doing research in management controls. The major results of the program were the...development of the Mark I business simulation model and the Simulation Package (SIMPAC). SIMPAC is a method and set of programs facilitating the construction...of large simulation models. The object of this document is to describe the Mark I Corporation model, state why parts of the business were modeled as they were, and indicate the research applications of the model. (Author)

  17. Diagnostic and model dependent uncertainty of simulated Tibetan permafrost area

    Science.gov (United States)

    Wang, A.; Moore, J.C.; Cui, Xingquan; Ji, D.; Li, Q.; Zhang, N.; Wang, C.; Zhang, S.; Lawrence, D.M.; McGuire, A.D.; Zhang, W.; Delire, C.; Koven, C.; Saito, K.; MacDougall, A.; Burke, E.; Decharme, B.

    2016-01-01

    We perform a land-surface model intercomparison to investigate how the simulation of permafrost area on the Tibetan Plateau (TP) varies among six modern stand-alone land-surface models (CLM4.5, CoLM, ISBA, JULES, LPJ-GUESS, UVic). We also examine the variability in simulated permafrost area and distribution introduced by five different methods of diagnosing permafrost (from modeled monthly ground temperature, mean annual ground and air temperatures, air and surface frost indexes). There is good agreement (99 to 135 × 10⁴ km²) between the two diagnostic methods based on air temperature which are also consistent with the observation-based estimate of actual permafrost area (101 × 10⁴ km²). However the uncertainty (1 to 128 × 10⁴ km²) using the three methods that require simulation of ground temperature is much greater. Moreover simulated permafrost distribution on the TP is generally only fair to poor for these three methods (diagnosis of permafrost from monthly, and mean annual ground temperature, and surface frost index), while permafrost distribution using air-temperature-based methods is generally good. Model evaluation at field sites highlights specific problems in process simulations likely related to soil texture specification, vegetation types and snow cover. Models are particularly poor at simulating permafrost distribution using the definition that soil temperature remains at or below 0 °C for 24 consecutive months, which requires reliable simulation of both mean annual ground temperatures and seasonal cycle, and hence is relatively demanding. Although models can produce better permafrost maps using mean annual ground temperature and surface frost index, analysis of simulated soil temperature profiles reveals substantial biases. The current generation of land-surface models needs to reduce biases in simulated soil temperature profiles before reliable contemporary permafrost maps and predictions of changes in future
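
    Of the diagnostic methods compared above, the air-temperature-based ones are the simplest to reproduce: a frost number computed from freezing and thawing degree-days, with permafrost diagnosed where the number exceeds 0.5. The sketch below implements that diagnostic on a synthetic temperature series; the 0.5 threshold follows the common Nelson-Outcalt frost number, and the exact frost-index formulation used in the intercomparison may differ.

```python
# Air frost number diagnostic for permafrost, an air-temperature-based method
# of the kind compared in the record above; the synthetic daily temperatures
# are purely illustrative.
import numpy as np

def frost_number(daily_mean_air_temp_c):
    """sqrt(FDD) / (sqrt(FDD) + sqrt(TDD)); values > 0.5 suggest permafrost."""
    t = np.asarray(daily_mean_air_temp_c, dtype=float)
    fdd = -t[t < 0.0].sum()            # freezing degree-days (positive number)
    tdd = t[t > 0.0].sum()             # thawing degree-days
    return np.sqrt(fdd) / (np.sqrt(fdd) + np.sqrt(tdd))

# Synthetic annual cycle: cold site with mean -4 C and 12 C seasonal amplitude
days = np.arange(365)
temps = -4.0 + 12.0 * np.sin(2.0 * np.pi * (days - 120) / 365.0)
fn = frost_number(temps)
print(f"frost number = {fn:.2f} -> {'permafrost' if fn > 0.5 else 'no permafrost'}")
```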

  18. Temperature sensitivity to the land-surface model in MM5 climate simulations over the Iberian Peninsula

    Energy Technology Data Exchange (ETDEWEB)

    Jerez, Sonia; Montavez, Juan P.; Gomez-Navarro, Juan J.; Jimenez-Guerrero, Pedro [Dept. de Fisica, Univ. de Murcia (Spain); Jimenez, Jose M.; Gonzalez-Rouco, Jesus F. [Dept. de Astrofisica y CC. de la Atmosfera, Univ. Complutense de Madrid (Spain)

    2010-06-15

    Three different Land Surface Models have been used in three high resolution climate simulations performed with the mesoscale model MM5 over the Iberian Peninsula. The main difference among them lies in the soil moisture treatment, which is dynamically modelled by only two of them (Noah and Pleim and Xiu models), while in the simplest model (Simple Five-Layers) it is fixed to climatological values. The simulated period covers 1958-2002, using the ERA40 reanalysis data as driving conditions. Focusing on near-surface air temperature, this work evaluates the skill of each simulation in reproducing mean values and temporal variability, by comparing the simulations with observed temperature series. When the simplest simulation was analyzed, the greatest discrepancies were observed for the summer season, when both the mean values and the temporal variability of the temperature series were badly underestimated. These weaknesses are largely overcome in the other two simulations (performed by coupling a more advanced soil model to MM5), and there was greater concordance between the simulated and observed spatial patterns. The influence of a dynamic soil moisture parameterization and, therefore, a more realistic simulation of the latent and sensible heat fluxes between the land and the atmosphere, helps to explain these results. (orig.)

  19. Simulation and Non-Simulation Based Human Reliability Analysis Approaches

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald Laurids [Idaho National Lab. (INL), Idaho Falls, ID (United States); Shirley, Rachel Elizabeth [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey Clark [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2014-12-01

    Part of the U.S. Department of Energy’s Light Water Reactor Sustainability (LWRS) Program, the Risk-Informed Safety Margin Characterization (RISMC) Pathway develops approaches to estimating and managing safety margins. RISMC simulations pair deterministic plant physics models with probabilistic risk models. As human interactions are an essential element of plant risk, it is necessary to integrate human actions into the RISMC risk model. In this report, we review simulation-based and non-simulation-based human reliability assessment (HRA) methods. Chapter 2 surveys non-simulation-based HRA methods. Conventional HRA methods target static Probabilistic Risk Assessments for Level 1 events. These methods would require significant modification for use in dynamic simulation of Level 2 and Level 3 events. Chapter 3 is a review of human performance models. A variety of methods and models simulate dynamic human performance; however, most of these human performance models were developed outside the risk domain and have not been used for HRA. The exception is the ADS-IDAC model, which can be thought of as a virtual operator program. This model is resource-intensive but provides a detailed model of every operator action in a given scenario, along with models of numerous factors that can influence operator performance. Finally, Chapter 4 reviews the treatment of timing of operator actions in HRA methods. This chapter is an example of one of the critical gaps between existing HRA methods and the needs of dynamic HRA. This report summarizes the foundational information needed to develop a feasible approach to modeling human interactions in the RISMC simulations.

  20. Simulation of MILD combustion using Perfectly Stirred Reactor model

    KAUST Repository

    Chen, Z.

    2016-07-06

    A simple model based on a Perfectly Stirred Reactor (PSR) is proposed for moderate or intense low-oxygen dilution (MILD) combustion. The PSR calculation is performed over the entire flammability range, and the tabulated-chemistry approach is used with a presumed joint probability density function (PDF). A jet-in-hot-and-diluted-coflow experimental set-up under MILD conditions is simulated using this reactor model for two oxygen dilution levels. The computed results for mean temperature and for major and minor species mass fractions are compared with the experimental data and with simulation results obtained recently using a multi-environment transported PDF approach. Overall, good agreement is observed at three different axial locations, despite an over-predicted peak value of CO formation. This suggests that MILD combustion can be modelled effectively by the proposed PSR model at lower computational cost.

  1. The performance of simulated annealing in parameter estimation for vapor-liquid equilibrium modeling

    Directory of Open Access Journals (Sweden)

    A. Bonilla-Petriciolet

    2007-03-01

    Full Text Available In this paper we report the application and evaluation of the simulated annealing (SA) optimization method for parameter estimation in vapor-liquid equilibrium (VLE) modeling. We tested this optimization method using the classical least-squares and error-in-variable approaches. The reliability and efficiency of the data-fitting procedure are also examined for different values of the algorithm parameters of the SA method. Our results indicate that this method, when properly implemented, is a robust procedure for nonlinear parameter estimation in thermodynamic models. However, in difficult problems it can still converge to local optima of the objective function.
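
    As a concrete illustration of the approach described above, the sketch below fits a two-parameter Margules activity-coefficient model to hypothetical activity-coefficient data with a least-squares criterion, using SciPy's dual_annealing as a stand-in for the SA variant evaluated in the paper; the data, model choice, and parameter bounds are illustrative assumptions only.

      # Least-squares VLE parameter estimation with simulated annealing (illustrative).
      # Data, model choice, and bounds are assumptions, not taken from the paper.
      import numpy as np
      from scipy.optimize import dual_annealing

      x1 = np.array([0.1, 0.3, 0.5, 0.7, 0.9])               # liquid mole fractions (assumed)
      gamma1_obs = np.array([1.85, 1.48, 1.25, 1.10, 1.01])   # "observed" activity coefficients (assumed)

      def margules_gamma1(x1, A12, A21):
          """Two-parameter Margules model for the activity coefficient of component 1."""
          x2 = 1.0 - x1
          return np.exp(x2**2 * (A12 + 2.0 * (A21 - A12) * x1))

      def objective(params):
          A12, A21 = params
          residuals = gamma1_obs - margules_gamma1(x1, A12, A21)
          return np.sum(residuals**2)                         # classical least-squares criterion

      result = dual_annealing(objective, bounds=[(-2.0, 2.0), (-2.0, 2.0)], seed=0)
      print(result.x, result.fun)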

  2. Experimental measurements and theoretical model of the cryogenic performance of bialkali photocathode and characterization with Monte Carlo simulation

    Directory of Open Access Journals (Sweden)

    Huamu Xie

    2016-10-01

    Full Text Available High-average-current, high-brightness electron sources have important applications, such as in high-repetition-rate free-electron lasers, or in the electron cooling of hadrons. Bialkali photocathodes are promising high-quantum-efficiency (QE) cathode materials, while superconducting rf (SRF) electron guns offer continuous-mode operation at high acceleration, as is needed for high-brightness electron sources. Thus, we must have a comprehensive understanding of the performance of bialkali photocathodes at cryogenic temperatures when they are to be used in SRF guns. To remove the heat produced by the radio-frequency field in these guns, the cathode should be cooled to cryogenic temperatures. We recorded an 80% reduction of the QE upon cooling the K_{2}CsSb cathode from room temperature down to the temperature of liquid nitrogen in Brookhaven National Laboratory (BNL)'s 704 MHz SRF gun. We conducted several experiments to identify the underlying mechanism of this reduction. The change in the spectral response of the bialkali photocathode, when cooled from room temperature (300 K) to 166 K, suggests that a change in the ionization energy (defined as the energy gap from the top of the valence band to the vacuum level) is the main reason for this reduction. We developed an analytical model of the process, based on Spicer's three-step model. The change in ionization energy with falling temperature gives a simplified description of the QE's temperature dependence. We also developed a 2D Monte Carlo code to simulate photoemission that accounts for the wavelength-dependent photon absorption in the first step, the scattering and diffusion in the second step, and the momentum conservation in the emission step. From this simulation, we established a correlation between ionization energy and reduction in the QE. The simulation yielded results comparable to those from the analytical model. The simulation offers us additional capabilities such as calculation

  3. MATLAB Simulation of Photovoltaic and Photovoltaic/Thermal Systems Performance

    Science.gov (United States)

    Nasir, Farah H. M.; Husaini, Yusnira

    2018-03-01

    The efficiency of a photovoltaic cell decreases as its temperature rises under solar irradiance. One solution is to add a cooling system to the photovoltaic system; this combination forms the photovoltaic-thermal (PV/T) system, which generates both electricity and heat at the same time. The aim of this research is the modeling and simulation of photovoltaic (PV) and photovoltaic-thermal (PV/T) electrical performance using a single-diode equivalent-circuit model. Both PV and PV/T models are developed in Matlab/Simulink. By providing the cooling system in the PV/T configuration, the efficiency of the system can be increased through a decrease in PV cell temperature. Maximum thermal, electrical, and total efficiency values of 35.18%, 15.56%, and 50.74%, respectively, were obtained at a solar irradiance of 400 W/m2, a mass flow rate of 0.05 kg s-1, and an inlet temperature of 25 °C. The photovoltaic-thermal system thus shows higher efficiency than the photovoltaic system alone.
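
    For reference, the sketch below shows the single-diode equivalent-circuit relation on which such electrical models are built, solved numerically for the I-V curve and maximum power point; the cell parameters used here are generic illustrative values, not those of the module modelled in the study.

      # Single-diode equivalent-circuit model of a PV cell (illustrative parameters).
      # Solves the implicit I-V relation and locates the maximum power point.
      import numpy as np
      from scipy.optimize import brentq

      q, k = 1.602e-19, 1.381e-23        # electron charge [C], Boltzmann constant [J/K]

      def cell_current(V, T_cell=298.15, I_ph=8.0, I_0=1e-9, n=1.3, Rs=0.005, Rsh=150.0):
          """Solve the implicit single-diode equation for the current at voltage V."""
          Vt = n * k * T_cell / q
          f = lambda I: I_ph - I_0 * (np.exp((V + I * Rs) / Vt) - 1.0) - (V + I * Rs) / Rsh - I
          return brentq(f, 0.0, I_ph * 1.1)

      V = np.linspace(0.0, 0.6, 200)                 # single-cell voltage sweep [V]
      P = np.array([v * cell_current(v) for v in V])
      print("Maximum power point: %.3f W at %.3f V" % (P.max(), V[P.argmax()]))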

  4. Operations planning simulation: Model study

    Science.gov (United States)

    1974-01-01

    The use of simulation modeling for the identification of system sensitivities to internal and external forces and variables is discussed. The technique provides a means of exploring alternate system procedures and processes, so that these alternatives may be considered on a mutually comparative basis, permitting the selection of a mode or modes of operation which have potential advantages to the system user and the operator. These advantages are measures of system efficiency: (1) the ability to meet specific schedules for operations, mission, or mission-readiness requirements or performance standards, and (2) the ability to accomplish the objectives within cost-effective limits.

  5. Simulations of an Offshore Wind Farm Using Large-Eddy Simulation and a Torque-Controlled Actuator Disc Model

    Science.gov (United States)

    Creech, Angus; Früh, Wolf-Gerrit; Maguire, A. Eoghan

    2015-05-01

    We present here a computational fluid dynamics (CFD) simulation of Lillgrund offshore wind farm, which is located in the Øresund Strait between Sweden and Denmark. The simulation combines a dynamic representation of wind turbines embedded within a large-eddy simulation CFD solver and uses hr-adaptive meshing to increase or decrease mesh resolution where required. This allows the resolution of both large-scale flow structures around the wind farm, and the local flow conditions at individual turbines; consequently, the response of each turbine to local conditions can be modelled, as well as the resulting evolution of the turbine wakes. This paper provides a detailed description of the turbine model which simulates the interaction between the wind, the turbine rotors, and the turbine generators by calculating the forces on the rotor, the body forces on the air, and instantaneous power output. This model was used to investigate a selection of key wind speeds and directions, investigating cases where a row of turbines would be fully aligned with the wind or at specific angles to the wind. Results shown here include presentations of the spin-up of turbines, the observation of eddies moving through the turbine array, meandering turbine wakes, and an extensive wind farm wake several kilometres in length. The key measurement available for cross-validation with operational wind farm data is the power output from the individual turbines, where the effect of unsteady turbine wakes on the performance of downstream turbines was a main point of interest. The results from the simulations were compared to the performance measurements from the real wind farm to provide a firm quantitative validation of this methodology. Having achieved good agreement between the model results and actual wind farm measurements, the potential of the methodology to provide a tool for further investigations of engineering and atmospheric science problems is outlined.
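
    To make the role of the embedded turbine model more concrete, the sketch below shows the standard one-dimensional actuator-disc (momentum-theory) relations that convert a local wind speed into rotor thrust and power. This is a simplified stand-in for the torque-controlled actuator-disc model of the paper; the air density, rotor diameter, and induction factor are assumed values.

      # Illustrative actuator-disc relations from one-dimensional momentum theory,
      # a simplified stand-in for the torque-controlled turbine model in the paper.
      # Air density, rotor diameter, and the induction factor are assumed values.
      import numpy as np

      rho = 1.225                # air density [kg/m^3]
      D = 93.0                   # rotor diameter [m] (assumed, of the order of the Lillgrund turbines)
      A = np.pi * (D / 2.0)**2
      a = 1.0 / 3.0              # axial induction factor at the Betz optimum (assumed)

      def thrust_and_power(U_inf):
          """Rotor thrust [N] and power [W] for a free-stream wind speed U_inf [m/s]."""
          Ct = 4.0 * a * (1.0 - a)          # thrust coefficient from momentum theory
          Cp = 4.0 * a * (1.0 - a)**2       # power coefficient from momentum theory
          return 0.5 * rho * A * Ct * U_inf**2, 0.5 * rho * A * Cp * U_inf**3

      print(thrust_and_power(8.0))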

  6. Hydrological model performance and parameter estimation in the wavelet-domain

    Directory of Open Access Journals (Sweden)

    B. Schaefli

    2009-10-01

    Full Text Available This paper proposes a method for rainfall-runoff model calibration and performance analysis in the wavelet-domain by fitting the estimated wavelet-power spectrum (a representation of the time-varying frequency content of a time series) of a simulated discharge series to that of the corresponding observed time series. As discussed in this paper, calibrating hydrological models so as to reproduce the time-varying frequency content of the observed signal can lead to different results than parameter estimation in the time-domain. Therefore, wavelet-domain parameter estimation has the potential to give new insights into model performance and to reveal model structural deficiencies. We apply the proposed method to synthetic case studies and to a real-world discharge modeling case study and discuss how model diagnosis can benefit from an analysis in the wavelet-domain. The results show that for the real-world case study of precipitation-runoff modeling for a high alpine catchment, the calibrated discharge simulation captures the dynamics of the observed time series better than the results obtained through calibration in the time-domain. In addition, the wavelet-domain performance assessment of this case study highlights the frequencies that are not well reproduced by the model, which gives specific indications about how to improve the model structure.
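
    A minimal sketch of such a wavelet-domain objective function is given below: it compares the continuous wavelet power of simulated and observed discharge series using PyWavelets' Morlet wavelet. This illustrates the general idea rather than the authors' spectrum estimator; the scales, wavelet choice, and calibration-loop usage are assumptions.

      # Illustrative wavelet-domain objective: compare the continuous wavelet power of
      # simulated and observed discharge. The Morlet wavelet and scale range are
      # stand-ins; the paper's exact spectrum estimator may differ.
      import numpy as np
      import pywt

      def wavelet_power(series, scales=np.arange(1, 65)):
          coeffs, _ = pywt.cwt(series, scales, 'morl')
          return np.abs(coeffs)**2                     # time-varying power per scale

      def wavelet_domain_error(q_sim, q_obs):
          """Sum of squared differences between simulated and observed wavelet power."""
          return np.sum((wavelet_power(q_sim) - wavelet_power(q_obs))**2)

      # Hypothetical use inside a calibration loop (run_model, candidate_params, q_obs are placeholders):
      # best = min(candidate_params, key=lambda p: wavelet_domain_error(run_model(p), q_obs))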

  7. Simulation of a G-tolerance curve using the pulsatile cardiovascular model

    Science.gov (United States)

    Solomon, M.; Srinivasan, R.

    1985-01-01

    A computer simulation study, performed to assess the ability of the cardiovascular model to reproduce the G-tolerance curve (G level versus tolerance time), is reported. A composite strength-duration curve derived from experimental data obtained in human centrifugation studies was used for comparison. The effects of abolishing autonomic control and of blood volume loss on G tolerance were also simulated. The results provide additional validation of the model. The need for the presence of autonomic reflexes even at low levels of G is pointed out. The low margin of safety with a loss of blood volume indicated by the simulation results underscores the necessity for protective measures during Shuttle reentry.

  8. Efficient simulation and likelihood methods for non-neutral multi-allele models.

    Science.gov (United States)

    Joyce, Paul; Genz, Alan; Buzbas, Erkan Ozge

    2012-06-01

    Throughout the 1980s, Simon Tavaré made numerous significant contributions to population genetics theory. As genetic data, in particular DNA sequence, became more readily available, a need to connect population-genetic models to data became the central issue. The seminal work of Griffiths and Tavaré (1994a, 1994b, 1994c) was among the first to develop a likelihood method to estimate the population-genetic parameters using full DNA sequences. Now, we are in the genomics era, where methods need to scale up to handle massive data sets, and Tavaré has led the way to new approaches. However, performing statistical inference under non-neutral models has proved elusive. In tribute to Simon Tavaré, we present an article in the spirit of his work that provides a computationally tractable method for simulating and analyzing data under a class of non-neutral population-genetic models. Computational methods for approximating likelihood functions and generating samples under a class of allele-frequency-based, non-neutral, parent-independent mutation models were proposed by Donnelly, Nordborg, and Joyce (DNJ) (Donnelly et al., 2001). DNJ (2001) simulated samples of allele frequencies from non-neutral models using neutral models as the auxiliary distribution in a rejection algorithm. However, patterns of allele frequencies produced by neutral models are dissimilar to patterns of allele frequencies produced by non-neutral models, making the rejection method inefficient. For example, in some cases the methods in DNJ (2001) require 10^9 rejections before a sample from the non-neutral model is accepted. Our method simulates samples directly from the distribution of non-neutral models, making simulation methods a practical tool to study the behavior of the likelihood and to perform inference on the strength of selection.
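
    The generic rejection sampler sketched below illustrates why this matters: when the proposal (auxiliary) density is a poor match for the target density, most draws are rejected and the acceptance rate collapses. The densities used here are simple placeholders, not the neutral and non-neutral allele-frequency models of the paper.

      # Generic rejection sampler illustrating the efficiency issue noted above.
      # Placeholder densities only: a uniform "neutral-like" proposal and a
      # Beta(20, 2) "selected-like" target.
      import numpy as np
      from scipy.stats import beta

      rng = np.random.default_rng(0)

      def rejection_sample(target_pdf, proposal_draw, proposal_pdf, M, n):
          """Draw n samples from target_pdf via a proposal with envelope constant M."""
          samples, tries = [], 0
          while len(samples) < n:
              x = proposal_draw()
              tries += 1
              if rng.uniform() < target_pdf(x) / (M * proposal_pdf(x)):
                  samples.append(x)
          return np.array(samples), tries

      target = beta(20, 2).pdf
      M = beta(20, 2).pdf(0.95)            # envelope constant: the target's maximum (mode at 0.95)
      samples, tries = rejection_sample(target, rng.uniform, lambda x: 1.0, M, n=1000)
      print("acceptance rate:", 1000 / tries)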

  9. Stochastic models: theory and simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Field, Richard V., Jr.

    2008-03-01

    Many problems in applied science and engineering involve physical phenomena that behave randomly in time and/or space. Examples are diverse and include turbulent flow over an aircraft wing, Earth climatology, material microstructure, and the financial markets. Mathematical models for these random phenomena are referred to as stochastic processes and/or random fields, and Monte Carlo simulation is the only general-purpose tool for solving problems of this type. The use of Monte Carlo simulation requires methods and algorithms to generate samples of the appropriate stochastic model; these samples then become inputs and/or boundary conditions to established deterministic simulation codes. While numerous algorithms and tools currently exist to generate samples of simple random variables and vectors, no cohesive simulation tool yet exists for generating samples of stochastic processes and/or random fields. There are two objectives of this report. First, we provide some theoretical background on stochastic processes and random fields that can be used to model phenomena that are random in space and/or time. Second, we provide simple algorithms that can be used to generate independent samples of general stochastic models. The theory and simulation of random variables and vectors are also reviewed for completeness.
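
    As a small illustration of the kind of algorithm discussed here, the sketch below generates independent samples of a zero-mean Gaussian process on a one-dimensional grid by Cholesky factorization of its covariance matrix; the squared-exponential covariance and its parameters are illustrative choices, and this is only one of several standard sampling approaches.

      # Minimal sketch: independent samples of a zero-mean Gaussian process on a 1-D
      # grid via Cholesky factorization of the covariance matrix. The squared-
      # exponential covariance and its parameters are illustrative choices.
      import numpy as np

      rng = np.random.default_rng(1)
      t = np.linspace(0.0, 10.0, 200)                       # time (or space) grid
      ell, sigma = 1.0, 1.0                                 # correlation length, standard deviation

      cov = sigma**2 * np.exp(-0.5 * (t[:, None] - t[None, :])**2 / ell**2)
      L = np.linalg.cholesky(cov + 1e-8 * np.eye(t.size))   # small jitter for numerical stability

      def sample_paths(n):
          """Return n independent realizations of the process, shape (n, len(t))."""
          return (L @ rng.standard_normal((t.size, n))).T

      paths = sample_paths(5)    # e.g. random inputs or boundary conditions for a deterministic code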

  10. Development of a Computational Simulation Model for Conflict Management in Team Building

    Directory of Open Access Journals (Sweden)

    W. M. Wang

    2011-05-01

    Full Text Available Conflict management is one of the most important issues in leveraging organizational competitiveness. However, the theories and models that traditional social scientists have built in this area, mostly expressed in words and diagrams, are insufficient. Social science research based on computational modeling and simulation is beginning to augment traditional theory building. Simulation provides a method for people to try their actions out in a way that is cost-effective, fast, appropriate, flexible, and ethical. In this paper, a computational simulation model for conflict management in team building is presented. The model is designed and used to explore individual performance arising from combinations of individuals with a range of conflict-handling styles, under various types of resources and policies. The model is developed using an agent-based modeling method. Each agent has one of five conflict-handling styles: accommodation, compromise, competition, contingency, and learning. There are three types of scenarios: normal, convex, and concave. There are two types of policies: no policy, and a reward-and-punishment policy. Results from running the model are also presented. The simulation has led us to derive two implications concerning conflict management. First, a concave type of resource promotes competition, while a convex type of resource promotes compromise and collaboration. Second, the performance ranking of the different styles can be influenced by introducing different policies; in other words, it is possible to promote a certain style by introducing an appropriate policy.
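
    The skeleton below sketches what such an agent-based model can look like in code: agents carry a conflict-handling style, interact pairwise, and accumulate performance, optionally under a reward policy. The payoff numbers, interaction rule, and policy are invented purely for illustration and do not reproduce the authors' model.

      # Schematic agent-based skeleton in the spirit of the model described above.
      # Payoffs, interaction rule, and reward policy are invented for illustration.
      import random

      STYLES = ["accommodation", "compromise", "competition", "contingency", "learning"]
      BASE_PAYOFF = {"accommodation": 0.4, "compromise": 0.6, "competition": 0.8,
                     "contingency": 0.5, "learning": 0.7}       # assumed values

      class Agent:
          def __init__(self, style):
              self.style, self.performance = style, 0.0

      def step(agents, reward_policy=False):
          a, b = random.sample(agents, 2)                        # one pairwise interaction
          for agent in (a, b):
              gain = BASE_PAYOFF[agent.style]
              if reward_policy and agent.style in ("compromise", "learning"):
                  gain += 0.2                                    # reward cooperative styles
              agent.performance += gain

      random.seed(0)
      team = [Agent(random.choice(STYLES)) for _ in range(20)]
      for _ in range(1000):
          step(team, reward_policy=True)

      by_style = {}
      for ag in team:
          by_style.setdefault(ag.style, []).append(ag.performance)
      print({s: round(sum(v) / len(v), 1) for s, v in by_style.items()})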

  11. Computer models and simulations of IGCC power plants with Canadian coals

    Energy Technology Data Exchange (ETDEWEB)

    Zheng, L.; Furimsky, E.

    1999-07-01

    In this paper, three steady-state computer models for the simulation of IGCC power plants with Shell, Texaco, and BGL (British Gas Lurgi) gasifiers will be presented. All models were based on a study by Bechtel for Nova Scotia Power Corporation. They were built using the Advanced System for Process Engineering (ASPEN) steady-state simulation software together with Fortran programs developed in house. Each model was integrated from several sections that can be simulated independently, such as coal preparation, gasification, gas cooling, acid gas removal, sulfur recovery, gas turbine, heat recovery steam generation, and steam cycle. A general description of each process, the model's overall structure, capability, testing results, and background references will be given. The performance of some Canadian coals in these models will be discussed as well. The authors also built a computer model of an IGCC power plant with a Kellogg-Rust-Westinghouse gasifier; however, due to paper-length limitations, it is not presented here.

  12. Simulation Model of a Transient

    DEFF Research Database (Denmark)

    Jauch, Clemens; Sørensen, Poul; Bak-Jensen, Birgitte

    2005-01-01

    This paper describes the simulation model of a controller that enables an active-stall wind turbine to ride through transient faults. The simulated wind turbine is connected to a simple model of a power system. Certain fault scenarios are specified and the turbine shall be able to sustain operati...

  13. Hybrid Reynolds-Averaged/Large Eddy Simulation of the Flow in a Model SCRamjet Cavity Flameholder

    Science.gov (United States)

    Baurle, R. A.

    2016-01-01

    Steady-state and scale-resolving simulations have been performed for flow in and around a model scramjet combustor flameholder. Experimental data available for this configuration include velocity statistics obtained from particle image velocimetry. Several turbulence models were used for the steady-state Reynolds-averaged simulations, which included both linear and non-linear eddy viscosity models. The scale-resolving simulations used a hybrid Reynolds-averaged/large eddy simulation strategy that is designed to be a large eddy simulation everywhere except in the inner portion (log layer and below) of the boundary layer. Hence, this formulation can be regarded as a wall-modeled large eddy simulation. This effort was undertaken to not only assess the performance of the hybrid Reynolds-averaged/large eddy simulation modeling approach in a flowfield of interest to the scramjet research community, but to also begin to understand how this capability can best be used to augment standard Reynolds-averaged simulations. The numerical errors were quantified for the steady-state simulations, and at least qualitatively assessed for the scale-resolving simulations prior to making any claims of predictive accuracy relative to the measurements. The steady-state Reynolds-averaged results displayed a high degree of variability when comparing the flameholder fuel distributions obtained from each turbulence model. This prompted the consideration of applying the higher-fidelity scale-resolving simulations as a surrogate "truth" model to calibrate the Reynolds-averaged closures in a non-reacting setting prior to their use for the combusting simulations. In general, the Reynolds-averaged velocity profile predictions at the lowest fueling level matched the particle imaging measurements almost as well as was observed for the non-reacting condition. However, the velocity field predictions proved to be more sensitive to the flameholder fueling rate than was indicated in the measurements.

  14. Modeling of HVAC operational faults in building performance simulation

    International Nuclear Information System (INIS)

    Zhang, Rongpeng; Hong, Tianzhen

    2017-01-01

    Highlights: •Discuss significance of capturing operational faults in existing buildings. •Develop a novel feature in EnergyPlus to model operational faults of HVAC systems. •Compare three approaches to faults modeling using EnergyPlus. •A case study demonstrates the use of the fault-modeling feature. •Future developments of new faults are discussed. -- Abstract: Operational faults are common in the heating, ventilating, and air conditioning (HVAC) systems of existing buildings, leading to a decrease in energy efficiency and occupant comfort. Various fault detection and diagnostic methods have been developed to identify and analyze HVAC operational faults at the component or subsystem level. However, current methods lack a holistic approach to predicting the overall impacts of faults at the building level—an approach that adequately addresses the coupling between various operational components, the synchronized effect between simultaneous faults, and the dynamic nature of fault severity. This study introduces the novel development of a fault-modeling feature in EnergyPlus which fills in the knowledge gap left by previous studies. This paper presents the design and implementation of the new feature in EnergyPlus and discusses in detail the fault-modeling challenges faced. The new fault-modeling feature enables EnergyPlus to quantify the impacts of faults on building energy use and occupant comfort, thus supporting the decision making of timely fault corrections. Including actual building operational faults in energy models also improves the accuracy of the baseline model, which is critical in the measurement and verification of retrofit or commissioning projects. As an example, EnergyPlus version 8.6 was used to investigate the impacts of a number of typical operational faults in an office building across several U.S. climate zones. The results demonstrate that the faults have significant impacts on building energy performance as well as on occupant

  15. The effects of model complexity and calibration period on groundwater recharge simulations

    Science.gov (United States)

    Moeck, Christian; Van Freyberg, Jana; Schirmer, Mario

    2017-04-01

    A significant number of groundwater recharge models exist that vary in terms of complexity (i.e., structure and parametrization). Typically, model selection and conceptualization are very subjective and can be a key source of uncertainty in the recharge simulations. Another source of uncertainty is the implicit assumption that model parameters, calibrated over historical periods, are also valid for the simulation period. To the best of our knowledge there is no systematic evaluation of the effect of model complexity and calibration strategy on the performance of recharge models. To address this gap, we utilized a long-term recharge data set (20 years) from a large weighing lysimeter. We performed a differential split-sample test with four groundwater recharge models that vary in terms of complexity. They were calibrated using six calibration periods with climatically contrasting conditions in a constrained Monte Carlo approach. Despite the climatically contrasting conditions, all models performed similarly well during calibration. However, during validation a clear effect of the model structure on model performance was evident. The more complex, physically based models predicted recharge best, even when the calibration and prediction periods had very different climatic conditions. In contrast, the simpler soil-water balance and lumped models performed poorly under such conditions. For these models we found a strong dependency on the chosen calibration period. In particular, our analysis showed that this can have relevant implications when using recharge models as decision-making tools in a broad range of applications (e.g., water availability, climate change impact studies, and water resource management).
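
    The differential split-sample idea can be sketched in a few lines: calibrate on one climatic period, validate on a contrasting one, and score the transfer with a skill measure such as the Nash-Sutcliffe efficiency. In the sketch below, run_model, calibrate, and the data table df are hypothetical placeholders, not components of the models compared in the study.

      # Schematic differential split-sample test. run_model, calibrate, and df are
      # hypothetical placeholders (e.g. a pandas DataFrame indexed by year with an
      # observed-recharge column).
      import numpy as np

      def nse(sim, obs):
          """Nash-Sutcliffe efficiency, a common skill score for recharge and runoff models."""
          return 1.0 - np.sum((sim - obs)**2) / np.sum((obs - np.mean(obs))**2)

      def split_sample_test(df, wet_years, dry_years, calibrate, run_model):
          results = {}
          for name, cal, val in [("calibrate wet, validate dry", wet_years, dry_years),
                                 ("calibrate dry, validate wet", dry_years, wet_years)]:
              params = calibrate(df.loc[cal])          # e.g. a constrained Monte Carlo search
              sim = run_model(params, df.loc[val])
              results[name] = nse(sim, df.loc[val, "recharge_obs"].to_numpy())
          return results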

  16. Integrating surrogate models into subsurface simulation framework allows computation of complex reactive transport scenarios

    Science.gov (United States)

    De Lucia, Marco; Kempka, Thomas; Jatnieks, Janis; Kühn, Michael

    2017-04-01

    Reactive transport simulations - where geochemical reactions are coupled with hydrodynamic transport of reactants - are extremely time consuming and suffer from significant numerical issues. Given the high uncertainties inherently associated with the geochemical models, which also constitute the major computational bottleneck, such computational demands may seem inappropriate and probably constitute the main limitation to their wide application. A promising way to ease and speed up such coupled simulations is to employ statistical surrogates instead of "full-physics" geochemical models [1]. Data-driven surrogates are reduced models obtained from a set of pre-calculated "full-physics" simulations, capturing their principal features while being extremely fast to compute. Model reduction of course comes at the price of a precision loss; however, this appears justified in the presence of large uncertainties regarding the parametrization of geochemical processes. This contribution illustrates the integration of surrogates into the flexible simulation framework currently being developed by the authors' research group [2]. The high-level language of choice for obtaining and dealing with surrogate models is R, which profits from state-of-the-art methods for statistical analysis of large simulation ensembles. A stand-alone advective mass transport module was furthermore developed in order to add such a capability to any multiphase finite volume hydrodynamic simulator within the simulation framework. We present 2D and 3D case studies benchmarking the performance of surrogates and "full-physics" chemistry in scenarios pertaining to the assessment of geological subsurface utilization. [1] Jatnieks, J., De Lucia, M., Dransch, D., Sips, M.: "Data-driven surrogate model approach for improving the performance of reactive transport simulations.", Energy Procedia 97, 2016, p. 447-453. [2] Kempka, T., Nakaten, B., De Lucia, M., Nakaten, N., Otto, C., Pohl, M., Chabab [Tillner], E., Kühn, M
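
    A minimal sketch of the surrogate idea is given below: a regression model is trained on an ensemble of pre-computed "full-physics" geochemistry runs and then substituted for the expensive solver inside the transport loop. scikit-learn is used here purely as a stand-in for the R-based workflow described above, and the training arrays are synthetic placeholders.

      # Data-driven surrogate sketch: fit a regression on a pre-computed ensemble and
      # call it in place of the geochemistry solver. Training data are placeholders.
      import numpy as np
      from sklearn.ensemble import RandomForestRegressor

      rng = np.random.default_rng(0)
      X_ensemble = rng.uniform(size=(5000, 4))      # e.g. concentrations, pH, temperature (assumed)
      Y_ensemble = np.column_stack([                # placeholder "full-physics" responses
          X_ensemble[:, 0] * np.exp(-X_ensemble[:, 3]),
          X_ensemble[:, 1] + 0.1 * X_ensemble[:, 2],
      ])

      surrogate = RandomForestRegressor(n_estimators=200, random_state=0)
      surrogate.fit(X_ensemble, Y_ensemble)

      def react_step(cell_states):
          """Stand-in for the geochemistry solver call inside the transport loop."""
          return surrogate.predict(cell_states)     # cell_states: 2-D array of cell inputs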

  17. Digital Quantum Simulation of Spin Models with Circuit Quantum Electrodynamics

    OpenAIRE

    Salathé, Y.; Mondal, M.; Oppliger, M.; Heinsoo, J.; Kurpiers, P.; Potočnik, A.; Mezzacapo, Antonio; Las Heras García, Urtzi; Lamata Manuel, Lucas; Solano Villanueva, Enrique Leónidas; Filipp, S.; Wallraff, A.

    2015-01-01

    Systems of interacting quantum spins show a rich spectrum of quantum phases and display interesting many-body dynamics. Computing characteristics of even small systems on conventional computers poses significant challenges. A quantum simulator has the potential to outperform standard computers in calculating the evolution of complex quantum systems. Here, we perform a digital quantum simulation of the paradigmatic Heisenberg and Ising interacting spin models using a two transmon-qubit circuit...

  18. A new approach to flow simulation using hybrid models

    Science.gov (United States)

    Solgi, Abazar; Zarei, Heidar; Nourani, Vahid; Bahmani, Ramin

    2017-11-01

    The need to predict river flow for proper water resource management, to determine the inflow to dam reservoirs, to design efficient flood warning systems, and so forth, has always led water researchers to look for models with fast response and low error. In recent years, the development of Artificial Neural Networks and wavelet theory, and the use of combined models, has helped researchers to estimate river flow increasingly well. In this study, daily and monthly scales were used for simulating the flow of the Gamasiyab River, Nahavand, Iran. The first simulations were performed using ANN and ANFIS models. Then, using wavelet theory, the input signals of the selected parameters were decomposed into sub-signals, which were fed into the ANN and ANFIS to obtain the hybrid WANN and WANFIS models. In addition to precipitation and flow, the parameters of temperature and evaporation were used to analyze their effects on the simulation. The results showed that using the wavelet transform improved the performance of the models at both the monthly and daily scales. However, the improvement was larger at the monthly scale, and WANFIS was the best-performing model.
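
    The hybrid (WANN-style) idea can be sketched as follows: each input series is decomposed into wavelet sub-signals, and the sub-signals, rather than the raw series, are fed to a neural-network regressor. In the sketch below, the db4 wavelet, three decomposition levels, scikit-learn's MLPRegressor, and the synthetic precipitation and flow arrays are all illustrative stand-ins for the authors' choices.

      # WANN-style sketch: wavelet sub-signals of the input drive a neural-network
      # regressor. Wavelet, level, network, and data are illustrative assumptions.
      import numpy as np
      import pywt
      from sklearn.neural_network import MLPRegressor

      def wavelet_features(series, wavelet="db4", level=3):
          """Stack the reconstructed approximation/detail sub-signals as model inputs."""
          coeffs = pywt.wavedec(series, wavelet, level=level)
          subsignals = []
          for i in range(len(coeffs)):
              part = [np.zeros_like(c) for c in coeffs]
              part[i] = coeffs[i]
              subsignals.append(pywt.waverec(part, wavelet)[: len(series)])
          return np.column_stack(subsignals)

      rng = np.random.default_rng(0)
      precip = rng.gamma(2.0, 2.0, size=1000)                   # placeholder daily precipitation
      flow = np.convolve(precip, np.exp(-np.arange(30) / 7.0), mode="full")[:1000]

      X, y = wavelet_features(precip), flow
      model = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0).fit(X, y)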

  19. Material model validation for laser shock peening process simulation

    International Nuclear Information System (INIS)

    Amarchinta, H K; Grandhi, R V; Langer, K; Stargel, D S

    2009-01-01

    Advanced mechanical surface enhancement techniques have been used successfully to increase the fatigue life of metallic components. These techniques impart deep compressive residual stresses into the component to counter potentially damage-inducing tensile stresses generated under service loading. Laser shock peening (LSP) is an advanced mechanical surface enhancement technique used predominantly in the aircraft industry. To reduce costs and make the technique available on a large-scale basis for industrial applications, simulation of the LSP process is required. Accurate simulation of the LSP process is a challenging task, because the process has many parameters, such as laser spot size, pressure profile and material model, that must be precisely determined. This work focuses on identifying an appropriate material model for use in simulation and design. In the LSP process the material is subjected to strain rates of 10^6 s^-1, which is very high compared with conventional strain rates. An accurate material model is all the more important because the material behaves significantly differently at such high strain rates. This work investigates the effect of multiple nonlinear material models for representing the elastic-plastic behavior of materials. Elastic perfectly plastic, Johnson-Cook, and Zerilli-Armstrong models are used, and the performance of each model is compared with available experimental results.
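
    For reference, the Johnson-Cook model named above expresses the flow stress as the product of strain-hardening, strain-rate, and thermal-softening terms. The short sketch below writes it out; the parameter values are generic illustrations, not the values calibrated in the study.

      # Johnson-Cook flow stress, one of the three material models compared above:
      #   sigma = (A + B*eps_p**n) * (1 + C*ln(eps_dot/eps_dot0)) * (1 - T_star**m),
      #   T_star = (T - T_ref) / (T_melt - T_ref).
      # Parameter values below are generic illustrations only.
      import numpy as np

      def johnson_cook_stress(eps_p, eps_dot, T,
                              A=520e6, B=477e6, n=0.52, C=0.025, m=1.0,
                              eps_dot0=1.0, T_ref=293.0, T_melt=1900.0):
          """Flow stress [Pa] at plastic strain eps_p, strain rate eps_dot [1/s], temperature T [K]."""
          T_star = (T - T_ref) / (T_melt - T_ref)
          return (A + B * eps_p**n) * (1.0 + C * np.log(eps_dot / eps_dot0)) * (1.0 - T_star**m)

      # Strain-rate sensitivity at an LSP-like rate versus a conventional quasi-static test:
      print(johnson_cook_stress(0.05, 1e6, 293.0) / johnson_cook_stress(0.05, 1e-3, 293.0))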

  20. Comments on ''Use of conditional simulation in nuclear waste site performance assessment'' by Carol Gotway

    International Nuclear Information System (INIS)

    Downing, D.J.

    1993-01-01

    This paper discusses Carol Gotway's paper, ''The Use of Conditional Simulation in Nuclear Waste Site Performance Assessment.'' The paper centers on the use of conditional simulation and of geostatistical methods to simulate an entire field of values for subsequent use in a complex computer model. The issues of sampling designs for geostatistics, semivariogram estimation and anisotropy, the turning bands method for random field generation, and estimation of the cumulative distribution function are brought out.
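
    To illustrate the central technique, the sketch below performs a one-dimensional conditional simulation using the classical conditioning-by-kriging identity, Z_cond(x) = Z_sk(x) + [Z_uncond(x) - Z_sk,uncond(x)], with simple kriging under an assumed exponential covariance. The grid, covariance parameters, and observed values are placeholders, not data from the assessment discussed in the paper.

      # Illustrative 1-D conditional simulation via conditioning by kriging.
      # Grid, covariance parameters, and observations are placeholders.
      import numpy as np

      rng = np.random.default_rng(2)
      x_grid = np.linspace(0.0, 100.0, 201)
      x_obs = np.array([10.3, 34.7, 61.2, 84.9])
      z_obs = np.array([1.2, -0.4, 0.8, 0.1])

      def cov(a, b, sill=1.0, corr_len=20.0):
          """Exponential covariance between point sets a and b (assumed model)."""
          return sill * np.exp(-np.abs(a[:, None] - b[None, :]) / corr_len)

      C_oo = cov(x_obs, x_obs) + 1e-8 * np.eye(x_obs.size)
      weights = cov(x_grid, x_obs) @ np.linalg.inv(C_oo)       # simple-kriging weights (zero mean)

      # One unconditional realization, drawn jointly on grid and observation locations
      x_all = np.concatenate([x_grid, x_obs])
      L = np.linalg.cholesky(cov(x_all, x_all) + 1e-8 * np.eye(x_all.size))
      z_all = L @ rng.standard_normal(x_all.size)
      z_uncond_grid, z_uncond_obs = z_all[: x_grid.size], z_all[x_grid.size:]

      # Conditional field: kriged data plus the unconditional realization's kriging residual
      z_cond = weights @ z_obs + (z_uncond_grid - weights @ z_uncond_obs)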