WorldWideScience

Sample records for model simulations performed

  1. The COD Model: Simulating Workgroup Performance

    Science.gov (United States)

    Biggiero, Lucio; Sevi, Enrico

    Though the question of the determinants of workgroup performance is one of the most central in organization science, precise theoretical frameworks and formal demonstrations are still missing. To fill this gap, the COD agent-based simulation model is presented here and used to study the effects of task interdependence and bounded rationality on workgroup performance. The first relevant finding is an algorithmic demonstration of the ordering of interdependencies in terms of complexity, showing that the parallel mode is the simplest, followed by the sequential and then by the reciprocal. This result is far from new in organization science, but what is remarkable is that it now has the strength of an algorithmic demonstration instead of resting on the authoritativeness of some scholar or on episodic empirical findings. The second important result is that the progressive introduction of realistic limits to agents' rationality dramatically reduces workgroup performance and leads to a rather interesting result: when agents' rationality is severely bounded, simple norms work better than complex norms. The third main finding is that when the complexity of interdependence is high, the appropriate coordination mechanism is agents' direct and active collaboration, which means teamwork.
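
    As a rough illustration of the kind of agent-based experiment described above (not the COD model itself), the sketch below encodes the three interdependence modes and treats bounded rationality as a simple error rate; the run_workgroup helper and all parameters are invented for this example.

```python
# Minimal illustrative sketch (not the COD model): agents process task units under three
# interdependence modes, with bounded rationality modeled as an error rate.
import random

def run_workgroup(mode, n_agents=4, n_tasks=100, error_rate=0.1, seed=1):
    rng = random.Random(seed)
    steps = 0
    for _ in range(n_tasks):
        if mode == "parallel":          # agents work independently
            attempts = [rng.random() > error_rate for _ in range(n_agents)]
            steps += 1 if any(attempts) else 2   # a fully failed round must be repeated
        elif mode == "sequential":      # each agent depends on the previous one
            for _ in range(n_agents):
                steps += 1
                while rng.random() < error_rate:  # rework before passing work downstream
                    steps += 1
        elif mode == "reciprocal":      # agents iterate until mutual adjustment succeeds
            agreed = False
            while not agreed:
                steps += n_agents
                agreed = all(rng.random() > error_rate for _ in range(n_agents))
    return steps  # more steps = higher coordination cost

for mode in ("parallel", "sequential", "reciprocal"):
    print(mode, run_workgroup(mode))
```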

  2. Modelling and simulating fire tube boiler performance

    DEFF Research Database (Denmark)

    Sørensen, K.; Condra, T.; Houbak, Niels

    2003-01-01

    A model for a flue gas boiler covering the flue gas and the water/steam side has been formulated. The model has been formulated as a number of sub-models that are merged into an overall model for the complete boiler. Sub-models have been defined for the furnace, the convection zone (split in two: a zone submerged in water and a zone covered by steam), a model for the material in the boiler (the steel) and two models for, respectively, the water/steam zone (the boiling) and the steam. The dynamic model has been developed as a number of Differential-Algebraic-Equation (DAE) systems. Subsequently, MatLab/Simulink has been applied for carrying out the simulations. To be able to verify the simulated results, experiments have been carried out on a full-scale boiler plant.
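
    The following is a minimal lumped-parameter sketch in the spirit of the boiler sub-model approach, assuming illustrative masses and heat-transfer coefficients; the algebraic relations are eliminated so the dynamic system can be integrated as an ODE with SciPy rather than a full DAE solver.

```python
# Illustrative lumped-parameter boiler energy balance (not the authors' model); the
# algebraic relations are eliminated so the DAE reduces to an ODE solved with SciPy.
import numpy as np
from scipy.integrate import solve_ivp

M_w, cp_w = 5000.0, 4186.0     # water mass [kg], heat capacity [J/kg/K] (assumed values)
M_s, cp_s = 2000.0, 470.0      # steel mass [kg], heat capacity [J/kg/K]
UA_gs, UA_sw = 8.0e4, 2.0e5    # heat transfer gas->steel, steel->water [W/K]

def rhs(t, y, T_gas=900.0, q_loss=5.0e4):
    T_s, T_w = y
    q_gs = UA_gs * (T_gas - T_s)        # flue gas to steel
    q_sw = UA_sw * (T_s - T_w)          # steel to water/steam side
    return [(q_gs - q_sw) / (M_s * cp_s),
            (q_sw - q_loss) / (M_w * cp_w)]

sol = solve_ivp(rhs, (0.0, 3600.0), [200.0, 180.0], max_step=10.0)
print("final steel/water temperatures [C]:", sol.y[:, -1])
```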

  3. Comparison of performance of simulation models for floor heating

    DEFF Research Database (Denmark)

    Weitzmann, Peter; Svendsen, Svend

    2005-01-01

    This paper describes the comparison of performance of simulation models for floor heating with different levels of detail in the modelling process. The models are compared in an otherwise identical simulation model containing a room model, walls, windows, ceiling and ventilation system. By exchanging...

  4. Modelling and simulating fire tube boiler performance

    DEFF Research Database (Denmark)

    Sørensen, K.; Condra, T.; Houbak, Niels

    2003-01-01

    : a zone submerged in water and a zone covered by steam), a model for the material in the boiler (the steel) and 2 models for resp. the water/steam zone (the boiling) and the steam. The dynamic model has been developed as a number of Differential-Algebraic-Equation system (DAE). Subsequently Mat...

  5. Modeling, simulation and performance evaluation of parabolic ...

    African Journals Online (AJOL)

    A model of a parabolic trough power plant, taking into consideration the different losses associated with collection of the solar irradiance and thermal losses, is presented. MATLAB software is employed to model the power plant at reference state points. The code is then used to find the different reference values which are ...

  6. Atomic scale simulations for improved CRUD and fuel performance modeling

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, Anders David Ragnar [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Cooper, Michael William Donald [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-01-06

    A more mechanistic description of fuel performance codes can be achieved by deriving models and parameters from atomistic scale simulations rather than fitting models empirically to experimental data. The same argument applies to modeling deposition of corrosion products on fuel rods (CRUD). This report summarizes results from publications in 2016 carried out using the CASL allocation at LANL.

  7. Impact of reactive settler models on simulated WWTP performance.

    Science.gov (United States)

    Gernaey, K V; Jeppsson, U; Batstone, D J; Ingildsen, P

    2006-01-01

    Including a reactive settler model in a wastewater treatment plant model allows representation of the biological reactions taking place in the sludge blanket in the settler, something that is neglected in many simulation studies. The idea of including a reactive settler model is investigated for an ASM1 case study. Simulations with a whole plant model including the non-reactive Takács settler model are used as a reference, and are compared to simulation results considering two reactive settler models. The first is a return sludge model block removing oxygen and a user-defined fraction of nitrate, combined with a non-reactive Takács settler. The second is a fully reactive ASM1 Takács settler model. Simulations with the ASM1 reactive settler model predicted a 15.3% and 7.4% improvement of the simulated N removal performance, for constant (steady-state) and dynamic influent conditions respectively. The oxygen/nitrate return sludge model block predicts a 10% improvement of N removal performance under dynamic conditions, and might be the better modelling option for ASM1 plants: it is computationally more efficient and it will not overrate the importance of decay processes in the settler.
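
    A hedged sketch of the simpler of the two options described above, the return-sludge model block: it zeroes dissolved oxygen and removes a user-defined fraction of nitrate from the return flow. Variable names follow ASM1 conventions, but the function and its default values are invented for illustration.

```python
# Sketch of a return-sludge model block: remove oxygen and a user-defined nitrate fraction.
def return_sludge_block(state, nitrate_removal_fraction=0.5):
    """state: dict of ASM1-style concentrations in the return sludge [g/m3]."""
    corrected = dict(state)
    corrected["S_O"] = 0.0                                                 # oxygen consumed in the blanket
    corrected["S_NO"] = state["S_NO"] * (1.0 - nitrate_removal_fraction)   # denitrification in the settler
    return corrected

print(return_sludge_block({"S_O": 2.0, "S_NO": 8.0, "S_NH": 1.5}))
```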

  8. Simulation model of a twin-tail, high performance airplane

    Science.gov (United States)

    Buttrill, Carey S.; Arbuckle, P. Douglas; Hoffler, Keith D.

    1992-01-01

    The mathematical model and associated computer program to simulate a twin-tailed high performance fighter airplane (McDonnell Douglas F/A-18) are described. The simulation program is written in the Advanced Continuous Simulation Language. The simulation math model includes the nonlinear six degree-of-freedom rigid-body equations, an engine model, sensors, and first order actuators with rate and position limiting. A simplified form of the F/A-18 digital control laws (version 8.3.3) is implemented. The simulated control law includes only inner loop augmentation in the up and away flight mode. The aerodynamic forces and moments are calculated from a wind-tunnel-derived database using table look-ups with linear interpolation. The aerodynamic database has an angle-of-attack range of -10 to +90 degrees and a sideslip range of -20 to +20 degrees. The effects of elastic deformation are incorporated in a quasi-static-elastic manner. Elastic degrees of freedom are not actively simulated. In the engine model, the throttle-commanded steady-state thrust level and the dynamic response characteristics of the engine are based on airflow rate as determined from a table look-up. Afterburner dynamics are switched in at a threshold based on the engine airflow and commanded thrust.
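
    The aerodynamic table look-up with linear interpolation can be illustrated as below; the coefficient table here is made up, whereas the real database is wind-tunnel derived.

```python
# Illustrative table look-up with linear interpolation over angle of attack and sideslip,
# using placeholder coefficient data in place of the wind-tunnel-derived database.
import numpy as np
from scipy.interpolate import RegularGridInterpolator

alpha = np.linspace(-10.0, 90.0, 21)      # angle of attack [deg]
beta = np.linspace(-20.0, 20.0, 9)        # sideslip [deg]
CL_table = 0.06 * alpha[:, None] * np.cos(np.radians(beta))[None, :]  # placeholder data

CL = RegularGridInterpolator((alpha, beta), CL_table, method="linear")
print("CL at alpha=12.3 deg, beta=-4.0 deg:", CL([[12.3, -4.0]])[0])
```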

  9. Hybrid Building Performance Simulation Models for Industrial Energy Efficiency Applications

    Directory of Open Access Journals (Sweden)

    Peter Smolek

    2018-06-01

    In the challenge of achieving environmental sustainability, industrial production plants, as large contributors to the overall energy demand of a country, are prime candidates for applying energy efficiency measures. A modelling approach using cubes is used to decompose a production facility into manageable modules. All aspects of the facility are considered, classified into the building, energy system, production and logistics. This approach leads to specific challenges for building performance simulations since all parts of the facility are highly interconnected. To meet this challenge, models for the building, thermal zones, energy converters and energy grids are presented and the interfaces to the production and logistics equipment are illustrated. The advantages and limitations of the chosen approach are discussed. In an example implementation, the feasibility of the approach and models is shown. Different scenarios are simulated to highlight the models and the results are compared.

  10. Water desalination price from recent performances: Modelling, simulation and analysis

    International Nuclear Information System (INIS)

    Metaiche, M.; Kettab, A.

    2005-01-01

    The subject of the present article is the technical simulation of seawater desalination by a one-stage reverse osmosis system. The objectives are to update the cost price on the basis of recent membrane and permeator performances, to use new means of simulation and modelling of desalination parameters, and to show the main parameters influencing the cost price. The seawater desalination centre of Djannet (Boumerdes, Algeria) is taken as the simulation example. Present performances allow water desalting at a price of 0.5 $/m³, an interesting and promising price that corresponds to a very acceptable product water quality of about 269 ppm. It is important to run reverse osmosis desalting systems under high pressure, which further decreases the desalting cost and yields good quality water, whereas a poor choice of operating conditions produces high prices and unacceptable quality. There remains, however, the possibility of decreasing the price by relaxing the requirement on product quality. The seawater temperature has an effect on both cost price and quality, and the installation of large desalting centres contributes to the decrease in prices. The calculation involved is long and tedious and impossible to conduct without programming and informatics tools; the use of the simulation model has been very efficient in the design of desalination centres that can perform at much improved prices. (author)

  11. Comprehensive Simulation Lifecycle Management for High Performance Computing Modeling and Simulation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — There are significant logistical barriers to entry-level high performance computing (HPC) modeling and simulation (M&S). IllinoisRocstar sets up the infrastructure for...

  12. A New Model to Simulate Energy Performance of VRF Systems

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Tianzhen; Pang, Xiufeng; Schetrit, Oren; Wang, Liping; Kasahara, Shinichi; Yura, Yoshinori; Hinokuma, Ryohei

    2014-03-30

    This paper presents a new model to simulate energy performance of variable refrigerant flow (VRF) systems in heat pump operation mode (either cooling or heating is provided but not simultaneously). The main improvement of the new model is the introduction of the evaporating and condensing temperature in the indoor and outdoor unit capacity modifier functions. The independent variables in the capacity modifier functions of the existing VRF model in EnergyPlus are mainly room wet-bulb temperature and outdoor dry-bulb temperature in cooling mode and room dry-bulb temperature and outdoor wet-bulb temperature in heating mode. The new approach allows compliance with different specifications of each indoor unit so that the modeling accuracy is improved. The new VRF model was implemented in a custom version of EnergyPlus 7.2. This paper first describes the algorithm for the new VRF model, which is then used to simulate the energy performance of a VRF system in a Prototype House in California that complies with the requirements of Title 24, the California Building Energy Efficiency Standards. The VRF system performance is then compared with three other types of HVAC systems: the Title 24-2005 Baseline system, the traditional High Efficiency system, and the EnergyStar Heat Pump system in three typical California climates: Sunnyvale, Pasadena and Fresno. Calculated energy savings from the VRF systems are significant. The HVAC site energy savings range from 51 to 85 percent, while the TDV (Time Dependent Valuation) energy savings range from 31 to 66 percent compared to the Title 24 Baseline Systems across the three climates. The largest energy savings are in Fresno climate followed by Sunnyvale and Pasadena. The paper discusses various characteristics of the VRF systems contributing to the energy savings. It should be noted that these savings are calculated using the Title 24 prototype House D under standard operating conditions. Actual performance of the VRF systems for real
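
    The capacity-modifier idea can be sketched as below, assuming an invented biquadratic curve in indoor wet-bulb and evaporating temperature; the coefficients are placeholders and not those of the EnergyPlus implementation.

```python
# Hedged sketch of a capacity-modifier curve in the spirit described above: cooling
# capacity corrected by indoor wet-bulb and evaporating temperature (coefficients invented).
def capacity_modifier(T_wb_indoor, T_evap, coeffs=(1.0, 0.02, 0.0005, -0.01, 0.0002, -0.0003)):
    a, b, c, d, e, f = coeffs   # biquadratic curve coefficients (illustrative only)
    return (a + b * T_wb_indoor + c * T_wb_indoor**2
              + d * T_evap + e * T_evap**2 + f * T_wb_indoor * T_evap)

rated_capacity_kw = 10.0
print("available cooling capacity [kW]:",
      rated_capacity_kw * capacity_modifier(T_wb_indoor=19.0, T_evap=6.0))
```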

  13. Maintenance personnel performance simulation (MAPPS): a model for predicting maintenance performance reliability in nuclear power plants

    International Nuclear Information System (INIS)

    Knee, H.E.; Krois, P.A.; Haas, P.M.; Siegel, A.I.; Ryan, T.G.

    1983-01-01

    The NRC has developed a structured, quantitative, predictive methodology in the form of a computerized simulation model for assessing maintainer task performance. The objective of the overall program is to develop, validate, and disseminate a practical, useful, and acceptable methodology for the quantitative assessment of NPP maintenance personnel reliability. The program was organized into four phases: (1) scoping study, (2) model development, (3) model evaluation, and (4) model dissemination. The program is currently nearing completion of Phase 2 - Model Development

  14. Impact of reactive settler models on simulated WWTP performance

    DEFF Research Database (Denmark)

    Gernaey, Krist; Jeppsson, Ulf; Batstone, Damien J.

    2006-01-01

    Including a reactive settler model in a wastewater treatment plant model allows representation of the biological reactions taking place in the sludge blanket in the settler, something that is neglected in many simulation studies. The idea of including a reactive settler model is investigated for ...

  15. Battery Performance Modelling and Simulation: a Neural Network Based Approach

    Science.gov (United States)

    Ottavianelli, Giuseppe; Donati, Alessandro

    2002-01-01

    This project has developed against the background of ongoing research within the Control Technology Unit (TOS-OSC) of the Special Projects Division at the European Space Operations Centre (ESOC) of the European Space Agency. The purpose of this research is to develop and validate an Artificial Neural Network (ANN) tool able to model, simulate and predict the Cluster II battery system's performance degradation. (The Cluster II mission is made of four spacecraft flying in tetrahedral formation, aimed at observing and studying the interaction between the Sun and the Earth by passing in and out of our planet's magnetic field.) This prototype tool, named BAPER and developed with a commercial neural network toolbox, could be used to support short- and medium-term mission planning in order to improve and maximise battery lifetime, determining the future best charge/discharge cycles for the batteries given their present states, in view of a Cluster II mission extension. This study focuses on the five Silver-Cadmium batteries onboard Tango, the fourth Cluster II satellite, but time restraints have so far allowed an assessment of only the first battery. In their most basic form, ANNs are hyper-dimensional curve fits for non-linear data. With their remarkable ability to derive meaning from complicated or imprecise historical data, ANNs can be used to extract patterns and detect trends that are too complex to be noticed by either humans or other computer techniques. ANNs learn by example, and this is why they can be described as inductive, or data-based, models for the simulation of input/target mappings. A trained ANN can be thought of as an "expert" in the category of information it has been given to analyse, and this expert can then be used, as in this project, to provide projections given new situations of interest and answer "what if" questions. The most appropriate algorithm, in terms of training speed and memory storage requirements, is clearly the Levenberg
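
    BAPER itself is not shown here; as a lightweight stand-in for the data-driven degradation modelling it performs, the sketch below fits an assumed capacity-fade law to synthetic cycle data with SciPy's Levenberg-Marquardt least squares.

```python
# Not the BAPER tool: a stand-in fitting a nonlinear capacity-fade curve with SciPy's
# Levenberg-Marquardt least squares, to illustrate data-driven degradation modelling.
import numpy as np
from scipy.optimize import curve_fit

def capacity(cycle, c0, k, p):
    return c0 * np.exp(-k * cycle**p)        # assumed empirical fade law

cycles = np.arange(0, 500, 10)
observed = capacity(cycles, 1.0, 2e-3, 0.9) + np.random.default_rng(0).normal(0, 0.005, cycles.size)

params, _ = curve_fit(capacity, cycles, observed, p0=[1.0, 1e-3, 1.0], method="lm")
print("fitted c0, k, p:", params)
print("predicted capacity after 800 cycles:", capacity(800, *params))
```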

  16. OPNET Modeler simulations of performance for multi nodes wireless systems

    Directory of Open Access Journals (Sweden)

    Krupanek Beata

    2016-01-01

    This paper presents a study of Quality of Service in modern wireless sensor networks. Such networks are characterized by small amounts of data transmitted in fixed periods. Very often this data must be transmitted in real time, so data transmission delays should be well known. The article shows a multi-node network simulated in the packet simulator OPNET Modeler. Nowadays, quality of service is very important, especially in multi-node systems such as home automation or measurement systems.

  17. Models, Web-Based Simulations, and Integrated Analysis Techniques for Improved Logistical Performance

    National Research Council Canada - National Science Library

    Hill, Raymond

    2001-01-01

    ... Laboratory, Logistics Research Division, Logistics Readiness Branch to propose a research agenda entitled, "Models, Web-based Simulations, and Integrated Analysis Techniques for Improved Logistical Performance...

  18. Building performance simulation in the early design stage: An introduction to integrated dynamic models

    DEFF Research Database (Denmark)

    Negendahl, Kristoffer

    2015-01-01

    Designing with building performance simulation feedback in the early design stage has existed since the early days of computational modeling. However, as a consequence of a fragmented building industry, building performance simulations (BPSs) in the early design stage are closely related to who is...

  19. Modeling, Simulation and Performance Evaluation of Parabolic Trough

    African Journals Online (AJOL)

    Mekuannint

    demand. Heat exchangers are used to transfer heat energy from the heat transfer fluid (HTF) to water coming from feedwater heaters. In this paper a proposed .... flexibility. The TRNSYS modeling includes the TRNSYS field model and power model. The solar field model shown in Fig. 4 includes weather data processors ...

  20. The WRF model performance for the simulation of heavy ...

    Indian Academy of Sciences (India)

    ... underestimated by both the cumulus parameterization schemes. The quantitative validation of the simulated rainfall is done by calculating the categorical skill scores like frequency bias, threat scores (TS) and equitable threat scores (ETS). In this case the KF scheme has outperformed the GD scheme for the low precipitation ...
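
    The categorical skill scores mentioned in the snippet have standard 2x2 contingency-table definitions, sketched below with placeholder counts (hits a, false alarms b, misses c, correct negatives d).

```python
# Standard 2x2 contingency-table skill scores; the counts used here are placeholders.
def skill_scores(a, b, c, d):
    n = a + b + c + d
    freq_bias = (a + b) / (a + c)                       # frequency bias
    ts = a / (a + b + c)                                # threat score (critical success index)
    a_random = (a + b) * (a + c) / n                    # hits expected by chance
    ets = (a - a_random) / (a + b + c - a_random)       # equitable threat score
    return freq_bias, ts, ets

print(skill_scores(a=42, b=18, c=25, d=915))
```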

  1. Using Modeling and Simulation to Evaluate Stability and Traction Performance of a Track Laying Robotic Vehicle

    National Research Council Canada - National Science Library

    Gunter, Dave D; Bylsma, Wesley W; Edgar, Kevin; Letherwood, Mike D; Gorsich, David J

    2005-01-01

    The objective of this paper will be to describe the computer-based modeling, simulation, and limited field testing effort that has been undertaken to investigate the dynamic performance of an unmanned...

  2. Modeling and Simulation of Ceramic Arrays to Improve Ballistic Performance

    Science.gov (United States)

    2014-04-30

    [Slide/table excerpt] Depth-of-penetration (DOP) results: 10.3 mm for tile impact with no adhesive; 17.2 mm with a 0.508 mm tile gap and no adhesive. An adhesive layer of epoxy resin was added between the SiC tile and the Al backing. Baseline seam-performance assessment used 2 ft x 2 ft panels of sintered 4'sq. SiC (Superior Graphite) on Kevlar/Phenolic with a 2-ply cover.

  3. Surrogate model approach for improving the performance of reactive transport simulations

    Science.gov (United States)

    Jatnieks, Janis; De Lucia, Marco; Sips, Mike; Dransch, Doris

    2016-04-01

    Reactive transport models can serve a large number of important geoscientific applications involving underground resources in industry and scientific research. It is common for simulation of reactive transport to consist of at least two coupled simulation models. First is a hydrodynamics simulator that is responsible for simulating the flow of groundwaters and transport of solutes. Hydrodynamics simulators are well established technology and can be very efficient. When hydrodynamics simulations are performed without coupled geochemistry, their spatial geometries can span millions of elements even when running on desktop workstations. Second is a geochemical simulation model that is coupled to the hydrodynamics simulator. Geochemical simulation models are much more computationally costly. This is a problem that makes reactive transport simulations spanning millions of spatial elements very difficult to achieve. To address this problem we propose to replace the coupled geochemical simulation model with a surrogate model. A surrogate is a statistical model created to include only the necessary subset of simulator complexity for a particular scenario. To demonstrate the viability of such an approach we tested it on a popular reactive transport benchmark problem that involves 1D Calcite transport. This is a published benchmark problem (Kolditz, 2012) for simulation models and for this reason we use it to test the surrogate model approach. To do this we tried a number of statistical models available through the caret and DiceEval packages for R, to be used as surrogate models. These were trained on randomly sampled subset of the input-output data from the geochemical simulation model used in the original reactive transport simulation. For validation we use the surrogate model to predict the simulator output using the part of sampled input data that was not used for training the statistical model. For this scenario we find that the multivariate adaptive regression splines
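
    The study trained surrogates with R's caret and DiceEval packages; the sketch below illustrates the same train-on-sampled-input/output, validate-on-held-out idea in Python with scikit-learn, where run_geochemistry is only a placeholder for the expensive coupled simulator.

```python
# Sketch of the surrogate idea: train a regression surrogate on sampled simulator
# input/output and validate it on a held-out split. run_geochemistry is a placeholder.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(42)
X = rng.uniform([1e-4, 6.0], [1e-2, 9.0], size=(500, 2))   # e.g. mineral amount, pH (assumed inputs)

def run_geochemistry(x):                                    # stands in for the real simulator
    return 0.3 * np.log10(x[:, 0]) + 0.1 * (x[:, 1] - 7.0) ** 2

y = run_geochemistry(X)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

surrogate = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)
print("held-out R2:", r2_score(y_test, surrogate.predict(X_test)))
```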

  4. An accurate behavioral model for single-photon avalanche diode statistical performance simulation

    Science.gov (United States)

    Xu, Yue; Zhao, Tingchen; Li, Ding

    2018-01-01

    An accurate behavioral model is presented to simulate important statistical performance of single-photon avalanche diodes (SPADs), such as dark count and after-pulsing noise. The derived simulation model takes into account all important generation mechanisms of the two kinds of noise. For the first time, thermal agitation, trap-assisted tunneling and band-to-band tunneling mechanisms are simultaneously incorporated in the simulation model to evaluate dark count behavior of SPADs fabricated in deep sub-micron CMOS technology. Meanwhile, a complete carrier trapping and de-trapping process is considered in the after-pulsing model and a simple analytical expression is derived to estimate after-pulsing probability. In particular, the key model parameters of avalanche triggering probability and electric field dependence of excess bias voltage are extracted from Geiger-mode TCAD simulation and this behavioral simulation model doesn't include any empirical parameters. The developed SPAD model is implemented in the Verilog-A behavioral hardware description language and successfully operated on the commercial Cadence Spectre simulator, showing good universality and compatibility. The model simulation results are in good accordance with the test data, validating high simulation accuracy.

  5. Performance Evaluation of UML2-Modeled Embedded Streaming Applications with System-Level Simulation

    Directory of Open Access Journals (Sweden)

    Arpinen Tero

    2009-01-01

    This article presents an efficient method to capture an abstract performance model of streaming data real-time embedded systems (RTESs). Unified Modeling Language version 2 (UML2) is used for the performance modeling and as a front-end for a tool framework that enables simulation-based performance evaluation and design-space exploration. The adopted application meta-model in UML resembles the Kahn Process Network (KPN) model and it is targeted at simulation-based performance evaluation. The application workload modeling is done using UML2 activity diagrams, and the platform is described with structural UML2 diagrams and model elements. These concepts are defined using a subset of the profile for Modeling and Analysis of Real-Time and Embedded (MARTE) systems from OMG and custom stereotype extensions. The goal of the performance modeling and simulation is to achieve early estimates on task response times, processing element, memory, and on-chip network utilizations, among other information that is used for design-space exploration. As a case study, a video codec application on multiple processors is modeled, evaluated, and explored. In comparison to related work, this is the first proposal that defines transformation between UML activity diagrams and streaming data application workload meta-models and successfully adopts it for RTES performance evaluation.

  6. An AnyLogic simulation model for power and performance analysis of data centres

    NARCIS (Netherlands)

    Postema, Björn Frits; Haverkort, Boudewijn R.H.M.; Beltrán, Marta; Knottenbelt, William; Bradley, Jeremy

    In this paper we propose a simulation framework that allows for the analysis of power and performance trade-offs for data centres that save energy via power management. The models are cooperating discrete-event and agent-based models, which enable a variety of data centre configurations, including

  7. EMU Suit Performance Simulation

    Science.gov (United States)

    Cowley, Matthew S.; Benson, Elizabeth; Harvill, Lauren; Rajulu, Sudhakar

    2014-01-01

    Introduction: Designing a planetary suit is very complex and often requires difficult trade-offs between performance, cost, mass, and system complexity. To verify that new suit designs meet requirements, full prototypes must be built and tested with human subjects. However, numerous design iterations will occur before the hardware meets those requirements. Traditional draw-prototype-test paradigms for research and development are prohibitively expensive with today's shrinking Government budgets. Personnel at NASA are developing modern simulation techniques that focus on a human-centric design paradigm. These new techniques make use of virtual prototype simulations and fully adjustable physical prototypes of suit hardware. This is extremely advantageous and enables comprehensive design down-selections to be made early in the design process. Objectives: The primary objective was to test modern simulation techniques for evaluating the human performance component of two EMU suit concepts, pivoted and planar style hard upper torso (HUT). Methods: This project simulated variations in EVA suit shoulder joint design and subject anthropometry and then measured the differences in shoulder mobility caused by the modifications. These estimations were compared to human-in-the-loop test data gathered during past suited testing using four subjects (two large males, two small females). Results: Results demonstrated that EVA suit modeling and simulation are feasible design tools for evaluating and optimizing suit design based on simulated performance. The suit simulation model was found to be advantageous in its ability to visually represent complex motions and volumetric reach zones in three dimensions, giving designers a faster and deeper comprehension of suit component performance vs. human performance. Suit models were able to discern differing movement capabilities between EMU HUT configurations, generic suit fit concerns, and specific suit fit concerns for crewmembers based

  8. Discharge simulations performed with a hydrological model using bias corrected regional climate model input

    Directory of Open Access Journals (Sweden)

    S. C. van Pelt

    2009-12-01

    Studies have demonstrated that precipitation on Northern Hemisphere mid-latitudes has increased in the last decades and that it is likely that this trend will continue. This will have an influence on discharge of the river Meuse. The use of bias correction methods is important when the effect of precipitation change on river discharge is studied. The objective of this paper is to investigate the effect of using two different bias correction methods on output from a Regional Climate Model (RCM) simulation. In this study a Regional Atmospheric Climate Model (RACMO2) run is used, forced by ECHAM5/MPIOM under the condition of the SRES-A1B emission scenario, with a 25 km horizontal resolution. The RACMO2 runs contain a systematic precipitation bias on which two bias correction methods are applied. The first method corrects for the wet day fraction and wet day average (WD bias correction) and the second method corrects for the mean and coefficient of variation (MV bias correction). The WD bias correction initially corrects well for the average, but it appears that too many successive precipitation days were removed with this correction. The second method performed less well on average bias correction, but the temporal precipitation pattern was better. Subsequently, the discharge was calculated by using RACMO2 output as forcing to the HBV-96 hydrological model. A large difference was found between the simulated discharge of the uncorrected RACMO2 run, the WD bias corrected run and the MV bias corrected run. These results show the importance of an appropriate bias correction.
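
    A hedged sketch of the two precipitation bias corrections described above, applied to synthetic daily series: the WD correction matches the observed wet-day fraction and wet-day average, and the MV correction matches the mean and coefficient of variation via a crude power-law adjustment. The thresholds, data, and exact correction formulas are illustrative assumptions, not those of the paper.

```python
# Illustrative wet-day (WD) and mean/coefficient-of-variation (MV) bias corrections.
import numpy as np

def wd_correction(rcm, obs, wet_threshold=0.1):
    """Match the observed wet-day fraction and wet-day average."""
    out = rcm.copy()
    n_wet_obs = int(round((obs > wet_threshold).mean() * rcm.size))
    cutoff = np.sort(rcm)[::-1][n_wet_obs - 1] if n_wet_obs > 0 else np.inf
    out[out < cutoff] = 0.0                              # remove surplus drizzle days
    wet = out > 0
    if wet.any():
        out[wet] *= obs[obs > wet_threshold].mean() / out[wet].mean()
    return out

def mv_correction(rcm, obs):
    """Match the observed mean and coefficient of variation (crude power-law form)."""
    b = (obs.std() / obs.mean()) / (rcm.std() / rcm.mean())   # variability exponent (assumption)
    adjusted = rcm ** b
    return adjusted * obs.mean() / adjusted.mean()

rng = np.random.default_rng(1)
obs = rng.gamma(0.7, 4.0, 3650) * (rng.random(3650) < 0.45)
rcm = rng.gamma(0.9, 2.5, 3650) * (rng.random(3650) < 0.65)    # too many drizzle days
print("corrected means:", wd_correction(rcm, obs).mean(), mv_correction(rcm, obs).mean())
```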

  9. Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC).

    Energy Technology Data Exchange (ETDEWEB)

    Schultz, Peter Andrew

    2011-12-01

    The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M&S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. Achieving the objective of modeling the performance of a disposal scenario requires describing processes involved in waste form degradation and radionuclide release at the subcontinuum scale, beginning with mechanistic descriptions of chemical reactions and chemical kinetics at the atomic scale, and upscaling into effective, validated constitutive models for input to high-fidelity continuum scale codes for coupled multiphysics simulations of release and transport. Verification and validation (V&V) is required throughout the system to establish evidence-based metrics for the level of confidence in M&S codes and capabilities, including at the subcontinuum scale and the constitutive models they inform or generate. This report outlines the nature of the V&V challenge at the subcontinuum scale, an approach to incorporate V&V concepts into subcontinuum scale modeling and simulation (M&S), and a plan to incrementally incorporate effective V&V into subcontinuum scale M&S destined for use in the NEAMS Waste IPSC work flow to meet requirements of quantitative confidence in the constitutive models informed by subcontinuum scale phenomena.

  10. Performance of the general circulation models in simulating temperature and precipitation over Iran

    Science.gov (United States)

    Abbasian, Mohammadsadegh; Moghim, Sanaz; Abrishamchi, Ahmad

    2018-03-01

    General Circulation Models (GCMs) are advanced tools for impact assessment and climate change studies. Previous studies show that the performance of the GCMs in simulating climate variables varies significantly over different regions. This study intends to evaluate the performance of the Coupled Model Intercomparison Project phase 5 (CMIP5) GCMs in simulating temperature and precipitation over Iran. Simulations from 37 GCMs and observations from the Climatic Research Unit (CRU) were obtained for the period of 1901-2005. Six measures of performance, including mean bias, root mean square error (RMSE), Nash-Sutcliffe efficiency (NSE), linear correlation coefficient (r), the Kolmogorov-Smirnov statistic (KS) and Sen's slope estimator, are used for the evaluation, together with the Taylor diagram. GCMs are ranked based on each statistic at seasonal and annual time scales. Results show that most GCMs perform reasonably well in simulating the annual and seasonal temperature over Iran. The majority of the GCMs have poor skill in simulating precipitation, particularly at the seasonal scale. Based on the results, the best GCMs to represent temperature and precipitation simulations over Iran are the CMCC-CMS (Euro-Mediterranean Center on Climate Change) and the MRI-CGCM3 (Meteorological Research Institute), respectively. The results are valuable for climate and hydrometeorological studies and can help water resources planners and managers to choose the proper GCM based on their criteria.
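
    The listed evaluation statistics can be computed for one model/observation pair as sketched below; the series are synthetic, and the Kolmogorov-Smirnov statistic and Sen's slope come from SciPy.

```python
# Evaluation statistics for one model/observation pair of annual-mean series (synthetic data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
obs = 17.0 + 0.01 * np.arange(105) + rng.normal(0, 0.4, 105)   # e.g. annual temperature, 1901-2005
gcm = obs + 0.8 + rng.normal(0, 0.5, 105)                      # a biased model series

bias = np.mean(gcm - obs)
rmse = np.sqrt(np.mean((gcm - obs) ** 2))
nse = 1.0 - np.sum((gcm - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)   # Nash-Sutcliffe efficiency
r = np.corrcoef(gcm, obs)[0, 1]
ks = stats.ks_2samp(gcm, obs).statistic
sen_slope = stats.theilslopes(gcm, np.arange(105))[0]                    # Sen's slope of the model series

print(f"bias={bias:.2f} rmse={rmse:.2f} NSE={nse:.2f} r={r:.2f} KS={ks:.2f} Sen={sen_slope:.4f}")
```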

  11. Performance modeling & simulation of complex systems (A systems engineering design & analysis approach)

    Science.gov (United States)

    Hall, Laverne

    1995-01-01

    Modeling of the Multi-mission Image Processing System (MIPS) will be described as an example of the use of a modeling tool to design a distributed system that supports multiple application scenarios. This paper examines: (a) modeling tool selection, capabilities, and operation (namely NETWORK 2.5 by CACI), (b) pointers for building or constructing a model and how the MIPS model was developed, (c) the importance of benchmarking or testing the performance of equipment/subsystems being considered for incorporation into the design/architecture, (d) the essential step of model validation and/or calibration using the benchmark results, (e) sample simulation results from the MIPS model, and (f) how modeling and simulation analysis affected the MIPS design process by having a supportive and informative impact.

  12. Simulation, Characterization, and Optimization of Metabolic Models with the High Performance Systems Biology Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Lunacek, M.; Nag, A.; Alber, D. M.; Gruchalla, K.; Chang, C. H.; Graf, P. A.

    2011-01-01

    The High Performance Systems Biology Toolkit (HiPer SBTK) is a collection of simulation and optimization components for metabolic modeling and the means to assemble these components into large parallel processing hierarchies suiting a particular simulation and optimization need. The components come in a variety of different categories: model translation, model simulation, parameter sampling, sensitivity analysis, parameter estimation, and optimization. They can be configured at runtime into hierarchically parallel arrangements to perform nested combinations of simulation and characterization tasks with excellent parallel scaling to thousands of processors. We describe the observations that led to the system, the components, and how one can arrange them. We show nearly 90% efficient scaling to over 13,000 processors, and we demonstrate three complex yet typical examples that have run on approximately 1,000 processors and accomplished billions of stiff ordinary differential equation simulations. This work opens the door for the systems biology metabolic modeling community to take effective advantage of large scale high performance computing resources for the first time.
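
    A toy version of the "parameter sampling wraps simulation" hierarchy follows: many stiff ODE runs of an invented two-metabolite model are evaluated in parallel worker processes. This is only a sketch of the nesting idea, not the HiPer SBTK interfaces.

```python
# Toy "parameter sampling wraps simulation" hierarchy: parallel stiff ODE runs.
import numpy as np
from multiprocessing import Pool
from scipy.integrate import solve_ivp

def metabolite_model(t, y, k1, k2):
    a, b = y
    return [-k1 * a, k1 * a - k2 * b]          # invented two-step pathway

def simulate(params):
    k1, k2 = params
    sol = solve_ivp(metabolite_model, (0, 50), [1.0, 0.0], args=(k1, k2), method="LSODA")
    return sol.y[1, -1]                        # final concentration of the product

if __name__ == "__main__":
    samples = np.random.default_rng(0).uniform(0.05, 1.0, size=(64, 2))
    with Pool() as pool:
        results = pool.map(simulate, samples.tolist())
    print("mean final product concentration:", float(np.mean(results)))
```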

  13. Integrating Soft Set Theory and Fuzzy Linguistic Model to Evaluate the Performance of Training Simulation Systems.

    Science.gov (United States)

    Chang, Kuei-Hu; Chang, Yung-Chia; Chain, Kai; Chung, Hsiang-Yu

    2016-01-01

    The advancement of high technologies and the arrival of the information age have caused changes to modern warfare. The military forces of many countries have partially replaced real training drills with training simulation systems to achieve combat readiness. However, many types of training simulation systems are used in military settings. In addition, differences in system set-up time, functions, the environment, and the competency of system operators, as well as incomplete information, have made it difficult to evaluate the performance of training simulation systems. To address the aforementioned problems, this study integrated the analytic hierarchy process (AHP), soft set theory, and the fuzzy linguistic representation model to evaluate the performance of various training simulation systems. Furthermore, importance-performance analysis was adopted to examine the influence of cost savings and training safety of training simulation systems. The findings of this study are expected to facilitate applying military training simulation systems, avoid wasting resources (e.g., low utility and idle time), and provide data for subsequent applications and analysis. To verify the method proposed in this study, numerical examples of the performance evaluation of training simulation systems were adopted and compared with the numerical results of an AHP and a novel AHP-based ranking technique. The results verified that not only could expert-provided questionnaire information be fully considered to lower the repetition rate of performance ranking, but a two-dimensional graph could also be used to help administrators allocate limited resources, thereby enhancing the investment benefits and training effectiveness of a training simulation system.

  14. Comparison of Two Models for Damage Accumulation in Simulations of System Performance

    Energy Technology Data Exchange (ETDEWEB)

    Youngblood, R. [Idaho National Laboratory, Idaho Falls, ID (United States); Mandelli, D. [Idaho National Laboratory, Idaho Falls, ID (United States)

    2015-11-01

    A comprehensive simulation study of system performance needs to address variations in component behavior, variations in phenomenology, and the coupling between phenomenology and component failure. This paper discusses two models of this: 1. damage accumulation is modeled as a random walk process in each time history, with component failure occurring when damage accumulation reaches a specified threshold; or 2. damage accumulation is modeled mechanistically within each time history, but failure occurs when damage reaches a time-history-specific threshold, sampled at time zero from each component’s distribution of damage tolerance. A limiting case of the latter is classical discrete-event simulation, with component failure times sampled a priori from failure time distributions; but in such models, the failure times are not typically adjusted for operating conditions varying within a time history. Nowadays, as discussed below, it is practical to account for this. The paper compares the interpretations and computational aspects of the two models mentioned above.
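
    A minimal sketch of the two damage-accumulation alternatives compared in the paper, with invented parameters: (1) a random walk to a fixed threshold and (2) deterministic accumulation against a threshold sampled at time zero from a damage-tolerance distribution.

```python
# Two damage-accumulation alternatives: random walk to a fixed threshold vs. deterministic
# accumulation to a threshold sampled at time zero. All parameters are illustrative.
import numpy as np

rng = np.random.default_rng(7)

def failure_time_random_walk(threshold=100.0, drift=0.5, sigma=2.0, dt=1.0):
    damage, t = 0.0, 0.0
    while damage < threshold:
        damage += drift * dt + sigma * np.sqrt(dt) * rng.normal()   # stochastic increments
        t += dt
    return t

def failure_time_sampled_threshold(load_profile, tolerance_mean=100.0, tolerance_sd=15.0):
    tolerance = rng.normal(tolerance_mean, tolerance_sd)            # sampled at time zero
    damage = np.cumsum(load_profile)                                # mechanistic accumulation
    hit = np.argmax(damage >= tolerance)
    return hit + 1 if damage[-1] >= tolerance else np.inf

loads = 0.5 + 0.2 * np.sin(np.arange(400) / 20.0)                   # operating-condition variation
print(np.mean([failure_time_random_walk() for _ in range(200)]),
      np.mean([failure_time_sampled_threshold(loads) for _ in range(200)]))
```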

  15. Modeling and simulation of a high-performance PACS based on a shared file system architecture

    Science.gov (United States)

    Meredith, Glenn; Anderson, Kenneth R.; Wirsz, Emil; Prior, Fred W.; Wilson, Dennis L.

    1992-07-01

    Siemens and Loral Western Development Labs have designed a Picture Archiving and Communication System capable of supporting a large, fully digital hospital. Its functions include the management, storage and retrieval of medical images. The system may be modeled as a heterogeneous network of processing elements, transfer devices and storage units. Several discrete event simulation models have been designed to investigate different levels of the design. These models include the System Model, focusing on the flow of image traffic throughout the system, the Workstation Models, focusing on the internal processing in the different types of workstations, and the Communication Network Model, focusing on the control communication and host computer processing. The first two of these models are addressed here, with reference being made to a separate paper regarding the Communication Network Model. This paper describes some of the issues addressed with the models, the modeling techniques used and the performance results from the simulations. Important parameters of interest include: time to retrieve images from different possible storage locations and the utilization levels of the transfer devices and other key hardware components. To understand system performance under fully loaded conditions, the proposed system for the Madigan Army Medical Center was modeled in detail, as part of the Medical Diagnostic Imaging Support System (MDIS) proposal.

  16. High-Performance Modeling of Carbon Dioxide Sequestration by Coupling Reservoir Simulation and Molecular Dynamics

    KAUST Repository

    Bao, Kai

    2015-10-26

    The present work describes a parallel computational framework for carbon dioxide (CO2) sequestration simulation by coupling reservoir simulation and molecular dynamics (MD) on massively parallel high-performance-computing (HPC) systems. In this framework, a parallel reservoir simulator, reservoir-simulation toolbox (RST), solves the flow and transport equations that describe the subsurface flow behavior, whereas the MD simulations are performed to provide the required physical parameters. Technologies from several different fields are used to make this novel coupled system work efficiently. One of the major applications of the framework is the modeling of large-scale CO2 sequestration for long-term storage in subsurface geological formations, such as depleted oil and gas reservoirs and deep saline aquifers, which has been proposed as one of the few attractive and practical solutions to reduce CO2 emissions and address the global-warming threat. Fine grids and accurate prediction of the properties of fluid mixtures under geological conditions are essential for accurate simulations. In this work, CO2 sequestration is presented as a first example for coupling reservoir simulation and MD, although the framework can be extended naturally to the full multiphase multicomponent compositional flow simulation to handle more complicated physical processes in the future. Accuracy and scalability analysis are performed on an IBM BlueGene/P and on an IBM BlueGene/Q, the latest IBM supercomputer. Results show good accuracy of our MD simulations compared with published data, and good scalability is observed with the massively parallel HPC systems. The performance and capacity of the proposed framework are well-demonstrated with several experiments with hundreds of millions to one billion cells. To the best of our knowledge, the present work represents the first attempt to couple reservoir simulation and molecular simulation for large-scale modeling. Because of the complexity of

  17. Performance Analysis of Transposition Models Simulating Solar Radiation on Inclined Surfaces: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Xie, Yu; Sengupta, Manajit

    2016-06-01

    Transposition models are widely used in the solar energy industry to simulate solar radiation on inclined photovoltaic (PV) panels. These transposition models have been developed using various assumptions about the distribution of the diffuse radiation, and most of the parameterizations in these models have been developed using hourly ground data sets. Numerous studies have compared the performance of transposition models, but this paper aims to understand the quantitative uncertainty in the state-of-the-art transposition models and the sources leading to the uncertainty using high-resolution ground measurements in the plane of array. Our results suggest that the amount of aerosol optical depth can affect the accuracy of isotropic models. The choice of empirical coefficients and the use of decomposition models can both result in uncertainty in the output from the transposition models. It is expected that the results of this study will ultimately lead to improvements of the parameterizations as well as the development of improved physical models.
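
    The simplest member of the transposition-model family discussed above is the isotropic model; a sketch with example inputs follows (the empirical anisotropic models add circumsolar and horizon-brightening terms on top of this).

```python
# Isotropic transposition model: plane-of-array irradiance from direct-normal,
# diffuse-horizontal and ground-reflected components. Inputs are example values.
import numpy as np

def poa_isotropic(dni, dhi, ghi, tilt_deg, aoi_deg, albedo=0.2):
    tilt, aoi = np.radians(tilt_deg), np.radians(aoi_deg)
    beam = dni * max(np.cos(aoi), 0.0)                      # direct beam on the panel
    sky_diffuse = dhi * (1.0 + np.cos(tilt)) / 2.0          # isotropic sky-diffuse term
    ground = ghi * albedo * (1.0 - np.cos(tilt)) / 2.0      # ground-reflected term
    return beam + sky_diffuse + ground

print("POA irradiance [W/m2]:", poa_isotropic(dni=700.0, dhi=120.0, ghi=650.0,
                                              tilt_deg=30.0, aoi_deg=25.0))
```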

  18. Methodology to evaluate the performance of simulation models for alternative compiler and operating system configurations

    Science.gov (United States)

    Simulation modelers increasingly require greater flexibility for model implementation on diverse operating systems, and they demand high computational speed for efficient iterative simulations. Additionally, model users may differ in preference for proprietary versus open-source software environment...

  19. Durango: Scalable Synthetic Workload Generation for Extreme-Scale Application Performance Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Carothers, Christopher D. [Rensselaer Polytechnic Institute (RPI); Meredith, Jeremy S. [ORNL; Blanco, Marc [Rensselaer Polytechnic Institute (RPI); Vetter, Jeffrey S. [ORNL; Mubarak, Misbah [Argonne National Laboratory; LaPre, Justin [Rensselaer Polytechnic Institute (RPI); Moore, Shirley V. [ORNL

    2017-05-01

    Performance modeling of extreme-scale applications on accurate representations of potential architectures is critical for designing next generation supercomputing systems because it is impractical to construct prototype systems at scale with new network hardware in order to explore designs and policies. However, these simulations often rely on static application traces that can be difficult to work with because of their size and lack of flexibility to extend or scale up without rerunning the original application. To address this problem, we have created a new technique for generating scalable, flexible workloads from real applications, and we have implemented a prototype, called Durango, that combines a proven analytical performance modeling language, Aspen, with the massively parallel HPC network modeling capabilities of the CODES framework. Our models are compact, parameterized and representative of real applications with computation events. They are not resource intensive to create and are portable across simulator environments. We demonstrate the utility of Durango by simulating the LULESH application in the CODES simulation environment on several topologies and show that Durango is practical to use for simulation without loss of fidelity, as quantified by simulation metrics. During our validation of Durango's generated communication model of LULESH, we found that the original LULESH miniapp code had a latent bug where the MPI_Waitall operation was used incorrectly. This finding underscores the potential need for a tool such as Durango, beyond its benefits for flexible workload generation and modeling. Additionally, we demonstrate the efficacy of Durango's direct integration approach, which links Aspen into CODES as part of the running network simulation model. Here, Aspen generates the application-level computation timing events, which in turn drive the start of a network communication phase. Results show that Durango's performance scales well when

  20. A Distributed Electrochemistry Modeling Tool for Simulating SOFC Performance and Degradation

    Energy Technology Data Exchange (ETDEWEB)

    Recknagle, Kurtis P.; Ryan, Emily M.; Khaleel, Mohammad A.

    2011-10-13

    This report presents a distributed electrochemistry (DEC) model capable of investigating the electrochemistry and local conditions within the SOFC MEA based on the local microstructure and multi-physics. The DEC model can calculate the global current-voltage (I-V) performance of the cell as determined by the spatially varying local conditions through the thickness of the electrodes and electrolyte. The simulation tool is able to investigate the electrochemical performance based on characteristics of the electrode microstructure, such as particle size, pore size, electrolyte and electrode phase volume fractions, and triple-phase-boundary length. It can also investigate performance as affected by fuel and oxidant gas flow distributions and other environmental/experimental conditions such as temperature and fuel gas composition. The long-term objective for the DEC modeling tool is to investigate factors that cause electrode degradation and the decay of SOFC performance which decrease longevity.

  1. Endoscopic simulator curriculum improves colonoscopy performance in novice surgical interns as demonstrated in a swine model.

    Science.gov (United States)

    Telem, Dana A; Rattner, David W; Gee, Denise W

    2014-05-01

    The purpose of this study was to determine whether independent virtual endoscopic training accelerates the acquisition of endoscopic skill by novice surgical interns. Nine novice surgical interns participated in a prospective study comparing colonoscopy performance in a swine model before and after an independent simulator curriculum. An independent observer evaluated each intern for the ability to reach the cecum within 20 min and for technical ability as determined by the Global Assessment of Gastrointestinal Endoscopic Skills-Colonoscopy (GAGES-C) score, and performance was compared. In addition, at the conclusion of training, a post-test of two basic simulated colonoscopy modules was completed and metrics evaluated. As a control, three attending physicians who routinely perform colonoscopy also completed colonoscopy in the swine model. Prior to endoscopic training, one (11 %) intern successfully intubated the cecum in 19.56 min. Following training, six (67 %) interns reached the cecum with a mean time of 9.2 min (p …). Comparison of attending times with post-curriculum intern times demonstrated the experts to be significantly faster (p …). Interns who completed the curriculum demonstrated significantly improved GI Mentor™ performance in the efficiency (79 vs. 67.1 %, p = 0.05) and time to cecum (3.37 vs. 5.59 min, p = 0.01) metrics. No other significant difference was demonstrated in GAGES-C categories or other simulator parameters. Simulator training on the GI Mentor™ alone significantly improved endoscopic skills in novice surgical interns as demonstrated in a swine model. This study also identified parameters on the GI Mentor™ that could indicate 'clinical readiness'. This study supports the role for endoscopic simulator training in surgical resident education as an adjunct to clinical experience.

  2. A Dynamic Simulation Model of Organizational Culture and Business Strategy Effects on Performance

    Science.gov (United States)

    Trivellas, Panagiotis; Reklitis, Panagiotis; Konstantopoulos, Nikolaos

    2007-12-01

    In the past two decades, the organizational culture literature has gained tremendous interest from both academics and practitioners. This is based not only on the suggestion that culture is related to performance, but also on the view that it is subject to direct managerial control and manipulation in the desired direction. In the present paper, we adopt the Competing Values Framework (CVF) to operationalise organizational culture and Porter's typology to conceptualize business strategy (cost leadership, innovative and marketing differentiation, and focus). Although simulation of social events is a quite difficult task, since there are so many considerations (not all well understood) involved, in the present study we developed a dynamic model to simulate the effects of organizational culture and strategy on financial performance. Data obtained from a six-year survey in the banking sector of a European developing economy were used for the development of the proposed dynamic model.

  3. High-performance modeling of CO2 sequestration by coupling reservoir simulation and molecular dynamics

    KAUST Repository

    Bao, Kai

    2013-01-01

    The present work describes a parallel computational framework for CO2 sequestration simulation by coupling reservoir simulation and molecular dynamics (MD) on massively parallel HPC systems. In this framework, a parallel reservoir simulator, the Reservoir Simulation Toolbox (RST), solves the flow and transport equations that describe the subsurface flow behavior, while the molecular dynamics simulations are performed to provide the required physical parameters. Numerous technologies from different fields are employed to make this novel coupled system work efficiently. One of the major applications of the framework is the modeling of large scale CO2 sequestration for long-term storage in subsurface geological formations, such as depleted reservoirs and deep saline aquifers, which has been proposed as one of the most attractive and practical solutions to reduce CO2 emissions and address the global-warming threat. To effectively solve such problems, fine grids and accurate prediction of the properties of fluid mixtures are essential. In this work, CO2 sequestration is presented as our first example of coupling reservoir simulation and molecular dynamics, while the framework can be extended naturally to full multiphase multicomponent compositional flow simulation to handle more complicated physical processes in the future. Accuracy and scalability analysis are performed on an IBM BlueGene/P and on an IBM BlueGene/Q, the latest IBM supercomputer. Results show good accuracy of our MD simulations compared with published data, and good scalability is observed with the massively parallel HPC systems. The performance and capacity of the proposed framework are well demonstrated with several experiments with hundreds of millions to a billion cells. To the best of our knowledge, this work represents the first attempt to couple reservoir simulation and molecular simulation for large scale modeling. Due to the complexity of the subsurface systems

  4. Teamwork skills, shared mental models, and performance in simulated trauma teams: an independent group design.

    Science.gov (United States)

    Westli, Heidi Kristina; Johnsen, Bjørn Helge; Eid, Jarle; Rasten, Ingvil; Brattebø, Guttorm

    2010-08-31

    Non-technical skills are seen as an important contributor to reducing adverse events and improving medical management in healthcare teams. Previous research on the effectiveness of teams has suggested that shared mental models facilitate coordination and team performance. The purpose of the study was to investigate whether demonstrated teamwork skills and behaviour indicating shared mental models would be associated with observed improved medical management in trauma team simulations. Revised versions of the 'Anesthetists' Non-Technical Skills Behavioural marker system' and 'Anti-Air Teamwork Observation Measure' were field tested in moment-to-moment observation of 27 trauma team simulations in Norwegian hospitals. Independent subject matter experts rated medical management in the teams. An independent group design was used to explore differences in teamwork skills between higher-performing and lower-performing teams. Specific teamwork skills and behavioural markers were associated with indicators of good team performance. Higher and lower-performing teams differed in information exchange, supporting behaviour and communication, with higher performing teams showing more effective information exchange and communication, and less supporting behaviours. Behavioural markers of shared mental models predicted effective medical management better than teamwork skills. The present study replicates and extends previous research by providing new empirical evidence of the significance of specific teamwork skills and a shared mental model for the effective medical management of trauma teams. In addition, the study underlines the generic nature of teamwork skills by demonstrating their transferability from different clinical simulations like the anaesthesia environment to trauma care, as well as the potential usefulness of behavioural frequency analysis in future research on non-technical skills.

  5. Teamwork skills, shared mental models, and performance in simulated trauma teams: an independent group design

    Directory of Open Access Journals (Sweden)

    Westli Heidi

    2010-08-01

    Background: Non-technical skills are seen as an important contributor to reducing adverse events and improving medical management in healthcare teams. Previous research on the effectiveness of teams has suggested that shared mental models facilitate coordination and team performance. The purpose of the study was to investigate whether demonstrated teamwork skills and behaviour indicating shared mental models would be associated with observed improved medical management in trauma team simulations. Methods: Revised versions of the 'Anesthetists' Non-Technical Skills Behavioural marker system' and 'Anti-Air Teamwork Observation Measure' were field tested in moment-to-moment observation of 27 trauma team simulations in Norwegian hospitals. Independent subject matter experts rated medical management in the teams. An independent group design was used to explore differences in teamwork skills between higher-performing and lower-performing teams. Results: Specific teamwork skills and behavioural markers were associated with indicators of good team performance. Higher and lower-performing teams differed in information exchange, supporting behaviour and communication, with higher performing teams showing more effective information exchange and communication, and less supporting behaviours. Behavioural markers of shared mental models predicted effective medical management better than teamwork skills. Conclusions: The present study replicates and extends previous research by providing new empirical evidence of the significance of specific teamwork skills and a shared mental model for the effective medical management of trauma teams. In addition, the study underlines the generic nature of teamwork skills by demonstrating their transferability from different clinical simulations like the anaesthesia environment to trauma care, as well as the potential usefulness of behavioural frequency analysis in future research on non-technical skills.

  6. Evaluation of Al-Najaf Hospital Intersection Performance Using Simulation model: Case Study

    Directory of Open Access Journals (Sweden)

    Hamid Athab Eedan Al-Jameel

    2016-03-01

Full Text Available Traffic congestion is a widespread problem throughout the world. It is mainly observed around intersections in urban areas. In this study, the Al-Najaf Hospital (Ibn Blal) intersection has been evaluated because it is considered the most congested T-intersection on the Kufa-Najaf road. This T-intersection suffers from heavy congestion, especially in the morning peak, which could be due to the many centers of activity (trip generation and attraction) on that road, such as the University of Kufa, four hospitals and other facilities. Although the Highway Capacity Manual (HCM 2000) suffers from several shortcomings and limitations, it is widely used in the evaluation of intersections in Iraq. On the other hand, simulation models have proved to be accurate tools for the evaluation of intersections. Therefore, a simulation model (S-Paramics) has been used to assess the performance of the current intersection. The simulation model was calibrated with field data collected from the intersection using a video camera installed on top of the Al-Najaf Hospital building. The results of this study show that the developed model closely mimics reality. Different alternatives were then tested using the developed model. Consequently, the construction of an overpass from the Najaf-Kufa road towards Al-Sahlaa road, with a protected U-turn, is the best alternative.

  7. Improving firm performance in out-of-equilibrium, deregulated markets using feedback simulation models

    International Nuclear Information System (INIS)

    Gary, S.; Larsen, E.R.

    2000-01-01

Deregulation has reshaped the utility sector in many countries around the world. Organisations in these deregulated industries must adopt new policies which guide strategic decisions, in an uncertain and unfamiliar environment, that determine the short- and long-term fate of their companies. Traditional economic equilibrium models do not adequately address the issues facing these organisations in the shift towards deregulated market competition. Equilibrium assumptions break down in the out-of-equilibrium transition to competitive markets, and therefore different underpinning assumptions must be adopted in order to guide management in these periods. Simulation models incorporating information feedback through behavioural policies fill the void left by equilibrium models and support strategic policy analysis in out-of-equilibrium markets. As an example, we present a feedback simulation model developed to examine firm and industry level performance consequences of new generation capacity investment policies in the deregulated UK electricity sector. The model explicitly captures the behavioural decision policies of boundedly rational managers and avoids equilibrium assumptions. Such models are essential to help managers evaluate the performance impact of various strategic policies in environments in which disequilibrium behaviour dominates. (Author)

  8. Towards a benchmark simulation model for plant-wide control strategy performance evaluation of WWTPs

    DEFF Research Database (Denmark)

    Jeppsson, Ulf; Rosen, Christian; Alex, Jens

    2006-01-01

The COST/IWA benchmark simulation model has been available for seven years. Its primary purpose has been to create a platform for control strategy benchmarking of activated sludge processes. The fact that the benchmark has resulted in more than 100 publications, not only in Europe but also worldwide, demonstrates the interest in such a tool within the research community. In this paper, an extension of the benchmark simulation model no 1 (BSM1) is proposed. This extension aims at facilitating control strategy development and performance evaluation at a plant-wide level and, consequently, ... the changes, the evaluation period has been extended to one year. A prolonged evaluation period allows for long-term control strategies to be assessed and enables the use of control handles that cannot be evaluated in a realistic fashion in the one-week BSM1 evaluation period. In the paper, the extended plant ...

  9. Simulation of the hydraulic performance of highway filter drains through laboratory models and stormwater management tools.

    Science.gov (United States)

    Sañudo-Fontaneda, Luis A; Jato-Espino, Daniel; Lashford, Craig; Coupe, Stephen J

    2017-05-23

Road drainage is one of the most relevant assets in transport infrastructure due to its inherent influence on traffic management and road safety. Highway filter drains (HFDs), also known as "French Drains", are the main drainage system currently in use in the UK, throughout 7000 km of its strategic road network. Despite being a widespread technique across the whole country, little research has been completed on their design considerations and their subsequent impact on their hydraulic performance, representing a gap in the field. Laboratory experiments have proven to be a reliable indicator for the simulation of the hydraulic performance of stormwater best management practices (BMPs). In addition to this, stormwater management tools (SMT) have been preferentially chosen as a design tool for BMPs by practitioners from all over the world. In this context, this research aims to investigate the hydraulic performance of HFDs by comparing the results from laboratory simulation and two widely used SMT, the US EPA's stormwater management model (SWMM) and MicroDrainage®. Statistical analyses were applied to a series of simulated rainfall scenarios, showing a high level of agreement between the results obtained in the laboratory and those obtained using SMT, as indicated by the high Nash-Sutcliffe and R2 coefficient values and the low root-mean-square error (RMSE) reached, which validated the usefulness of SMT to determine the hydraulic performance of HFDs.
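A minimal sketch (Python) of the three goodness-of-fit statistics cited above for comparing laboratory hydrographs with SWMM/MicroDrainage output; it assumes paired observed and simulated series of equal length, and the variable names are placeholders, not taken from the paper.

```python
import numpy as np

def goodness_of_fit(observed, simulated):
    """Nash-Sutcliffe efficiency, R^2 and RMSE for paired series."""
    obs = np.asarray(observed, dtype=float)
    sim = np.asarray(simulated, dtype=float)
    rmse = float(np.sqrt(np.mean((sim - obs) ** 2)))
    nse = float(1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2))
    r2 = float(np.corrcoef(obs, sim)[0, 1] ** 2)
    return nse, r2, rmse

# e.g. goodness_of_fit(lab_outflow, simulated_outflow)
```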

  10. PERFORMANCE STUDIES OF INTEGRATED FUZZY LOGIC CONTROLLER FOR BRUSHLESS DC MOTOR DRIVES USING ADVANCED SIMULATION MODEL

    Directory of Open Access Journals (Sweden)

    C. Subba Rami Reddy

    2011-07-01

Full Text Available This paper introduces an Integrated fuzzy logic controller (IFLC) for brushless dc (BLDC) motor drives using an advanced simulation model and presents a comparative study of the performances of a PID controller and the IFLC. The dynamic characteristics of speed and torque are effectively monitored and analyzed using the proposed model. The aim of the IFLC is to obtain better performance in terms of disturbance rejection or parameter variation than is obtained using the PID controller. The IFLC is constructed by using a Fuzzy logic controller (FLC) and a PID controller. A performance comparison of the controllers is also given based on the integral of the absolute value of the error (IAE), the integral of the squared error (ISE), the integral of the time-weighted absolute error (ITAE) and the integral of the time-weighted squared error (ITSE). The results show the effectiveness of the proposed controller.
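For reference, the four integral error indices named in the abstract can be approximated from a sampled speed-error signal as below; this is a generic sketch in Python that assumes uniform sampling, and the function and variable names are not from the paper.

```python
import numpy as np

def integral_error_indices(time_s, error):
    """IAE, ISE, ITAE and ITSE for a uniformly sampled error signal e(t)."""
    t = np.asarray(time_s, dtype=float)
    e = np.asarray(error, dtype=float)
    dt = t[1] - t[0]                      # uniform sampling step assumed
    iae = np.sum(np.abs(e)) * dt          # integral of |e|
    ise = np.sum(e ** 2) * dt             # integral of e^2
    itae = np.sum(t * np.abs(e)) * dt     # time-weighted |e|
    itse = np.sum(t * e ** 2) * dt        # time-weighted e^2
    return iae, ise, itae, itse
```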

  11. Performance of a bulb turbine suitable for low prototype head: model test and transient numerical simulation

    Science.gov (United States)

    Zhu, L.; Zhang, H. P.; Zhang, J. G.; Meng, X. C.; Lu, L.

    2012-11-01

In this paper, the performance of a bulb turbine with a unit specific speed of nq = 223.1 min-1, suitable for low prototype heads, was studied. The hydraulic model of the turbine was developed first, and then the model turbine was designed and manufactured. Performance tests were carried out on the high-accuracy hydraulic machinery model universal test rig located at IWHR, including energy, cavitation and pressure fluctuation tests. In order to investigate the internal flow field, a three-dimensional transient turbulence numerical simulation was conducted on the tested turbine, adopting the Reynolds-averaged Navier-Stokes equations and the RNG k-ɛ turbulence model. Test and simulation results show that: (1) the hydraulic efficiency of the model turbine ηM is up to 91.7% at the optimum operating point of n11o = 165.54 r/min and Q11o = 1.93 m3/s; (2) numerical results agree well with experimental results when comparing pressure fluctuations, which show that the pressure amplitude is very low at the optimum operating point; (3) hydraulic loss in the outflow domain accounts for more than 50% of the total hydraulic loss due to flow separation and secondary flow.
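The unit quantities quoted for the optimum operating point follow the usual turbine similarity relations; a small sketch (Python, assuming the runner diameter D and net head H of the model are known, names are illustrative) shows how n11 and Q11 are formed.

```python
import math

def unit_quantities(n_rpm, discharge_m3s, runner_diameter_m, head_m):
    """Unit speed n11 = n*D/sqrt(H) [r/min] and unit discharge
    Q11 = Q/(D^2*sqrt(H)) [m^3/s] from model test conditions."""
    n11 = n_rpm * runner_diameter_m / math.sqrt(head_m)
    q11 = discharge_m3s / (runner_diameter_m ** 2 * math.sqrt(head_m))
    return n11, q11
```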

  12. An individual-based model simulating goat response variability and long-term herd performance.

    Science.gov (United States)

    Puillet, L; Martin, O; Sauvant, D; Tichit, M

    2010-12-01

    Finding ways of increasing the efficiency of production systems is a key issue of sustainability. System efficiency is based on long-term individual efficiency, which is highly variable and management driven. To study the effects of management on herd and individual efficiency, we developed the model simulation of goat herd management (SIGHMA). This dynamic model is individual-based and represents the interactions between technical operations (relative to replacement, reproduction and feeding) and individual biological processes (performance dynamics based on energy partitioning and production potential). It simulates outputs at both herd and goat levels over 20 years. A farmer's production project (i.e. a targeted milk production pattern) is represented by configuring the herd into female groups reflecting the organisation of kidding periods. Each group is managed by discrete events applying decision rules to simulate the carrying out of technical operations. The animal level is represented by a set of individual goat models. Each model simulates a goat's biological dynamics through its productive life. It integrates the variability of biological responses driven by genetic scaling parameters (milk production potential and mature body weight), by the regulations of energy partitioning among physiological functions and by responses to diet energy defined by the feeding strategy. A sensitivity analysis shows that herd efficiency was mainly affected by feeding management and to a lesser extent by the herd production potential. The same effects were observed on herd milk feed costs with an even lower difference between production potential and feeding management. SIGHMA was used in a virtual experiment to observe the effects of feeding strategies on herd and individual performances. We found that overfeeding led to a herd production increase and a feed cost decrease. However, this apparent increase in efficiency at the herd level (as feed cost decreased) was related

  13. Performance and Evaluation of the Global Modeling and Assimilation Office Observing System Simulation Experiment

    Science.gov (United States)

    Prive, Nikki; Errico, R. M.; Carvalho, D.

    2018-01-01

The National Aeronautics and Space Administration Global Modeling and Assimilation Office (NASA/GMAO) has spent more than a decade developing and implementing a global Observing System Simulation Experiment framework for use in evaluating both new observation types and the behavior of data assimilation systems. The NASA/GMAO OSSE has constantly evolved to reflect changes in the Gridpoint Statistical Interpolation data assimilation system, the Goddard Earth Observing System model, version 5 (GEOS-5), and the real-world observational network. Software and observational datasets for the GMAO OSSE are publicly available, along with a technical report. Substantial modifications have recently been made to the NASA/GMAO OSSE framework, including the character of synthetic observation errors, new instrument types, and more sophisticated atmospheric wind vectors. These improvements will be described, along with the overall performance of the current OSSE. Lessons learned from investigations into correlated errors and model error will be discussed.

  14. Improving streamflow simulations and forecasting performance of SWAT model by assimilating remotely sensed soil moisture observations

    Science.gov (United States)

    Patil, Amol; Ramsankaran, RAAJ

    2017-12-01

This article presents a study carried out using EnKF-based assimilation of coarser-scale SMOS soil moisture retrievals to improve the streamflow simulations and forecasting performance of the SWAT model in a large catchment. The study has been carried out in the Munneru river catchment, India, which is about 10,156 km2. In this study, a new EnKF-based approach is proposed for improving the inherent vertical coupling of the soil layers of the SWAT hydrological model during soil moisture data assimilation. Evaluation of the vertical error correlation obtained between surface and subsurface layers indicates that the vertical coupling can be improved significantly using an ensemble of soil storages compared to the traditional static-soil-storage-based EnKF approach. However, the improvements in the simulated streamflow are moderate, which is due to the limitations of the SWAT model in reflecting the profile soil moisture updates in surface runoff computations. Further, it is observed that the durability of streamflow improvements is longer when the assimilation system effectively updates the subsurface flow component. Overall, the results of the present study indicate that passive microwave-based coarser-scale soil moisture products like SMOS hold significant potential to improve streamflow estimates when assimilated into large-scale distributed hydrological models operating at a daily time step.
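As a rough illustration of the assimilation step described above (a generic stochastic EnKF analysis update, not the authors' implementation), the observation operator H, the error statistics and the array shapes below are all assumptions for the sketch.

```python
import numpy as np

def enkf_update(ensemble, obs, obs_err_std, H, seed=0):
    """Stochastic EnKF analysis step.
    ensemble:    (n_state, n_members) forecast states, e.g. layer soil moistures
    obs:         (n_obs,) observations, e.g. SMOS surface soil moisture
    obs_err_std: (n_obs,) observation error standard deviations
    H:           (n_obs, n_state) linear observation operator
    """
    rng = np.random.default_rng(seed)
    obs = np.asarray(obs, dtype=float)
    obs_err_std = np.asarray(obs_err_std, dtype=float)
    n_members = ensemble.shape[1]
    anomalies = ensemble - ensemble.mean(axis=1, keepdims=True)
    P_HT = anomalies @ (H @ anomalies).T / (n_members - 1)   # P_f H^T
    S = H @ P_HT + np.diag(obs_err_std ** 2)                 # innovation covariance
    K = P_HT @ np.linalg.inv(S)                              # Kalman gain
    perturbed_obs = obs[:, None] + obs_err_std[:, None] * rng.standard_normal((obs.size, n_members))
    return ensemble + K @ (perturbed_obs - H @ ensemble)
```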

  15. Performance evaluation of RANS-based turbulence models in simulating a honeycomb heat sink

    Science.gov (United States)

    Subasi, Abdussamet; Ozsipahi, Mustafa; Sahin, Bayram; Gunes, Hasan

    2017-07-01

As is well known, there is no universal turbulence model that can be used to model all engineering problems. There are specific applications for each turbulence model that make it appropriate to use, and it is vital to select an appropriate model and wall function combination that matches the physics of the problem considered. Therefore, in this study, the performance of six well-known Reynolds-Averaged Navier-Stokes (RANS) based turbulence models, namely the Standard k-ɛ, the Renormalized Group k-ɛ, the Realizable k-ɛ, the Reynolds Stress Model, the k-ω and the Shear Stress Transport k-ω, and the accompanying wall functions, namely the standard, the non-equilibrium and the enhanced, is evaluated via 3D simulation of a honeycomb heat sink. The CutCell method is used to generate the grid for the part including the heat sink, called the test section, while a hexahedral mesh is employed to discretize the inlet and outlet sections. A grid convergence study is conducted for the verification process, while experimental data and well-known correlations are used to validate the numerical results. Prediction of the pressure drop along the test section, the mean base plate temperature of the heat sink and the temperature at the test section outlet is regarded as a measure of the performance of the employed models and wall functions. The results indicate that the selection of turbulence models and wall functions has a great influence on the results and, therefore, needs to be made carefully. The hydraulic and thermal characteristics of the honeycomb heat sink can be determined with reasonable accuracy using RANS-based turbulence models provided that a suitable turbulence model and wall function combination is selected.

  16. Simulation modeling and arena

    CERN Document Server

    Rossetti, Manuel D

    2015-01-01

    Emphasizes a hands-on approach to learning statistical analysis and model building through the use of comprehensive examples, problems sets, and software applications With a unique blend of theory and applications, Simulation Modeling and Arena®, Second Edition integrates coverage of statistical analysis and model building to emphasize the importance of both topics in simulation. Featuring introductory coverage on how simulation works and why it matters, the Second Edition expands coverage on static simulation and the applications of spreadsheets to perform simulation. The new edition als

  17. NASA-STD-7009 Guidance Document for Human Health and Performance Models and Simulations

    Science.gov (United States)

    Walton, Marlei; Mulugeta, Lealem; Nelson, Emily S.; Myers, Jerry G.

    2014-01-01

Rigorous verification, validation, and credibility (VVC) processes are imperative to ensure that models and simulations (MS) are sufficiently reliable to address issues within their intended scope. The NASA standard for MS, NASA-STD-7009 (7009) [1], was an outcome of the Columbia Accident Investigation Board (CAIB), intended to ensure that MS are developed, applied, and interpreted appropriately for making decisions that may impact crew or mission safety. Because the focus of 7009 is engineering systems, a NASA-STD-7009 Guidance Document is being developed to augment the 7009 and provide information, tools, and techniques applicable to the probabilistic and deterministic biological MS more prevalent in human health and performance (HHP) and space biomedical research and operations.

  18. Performance assessment of geospatial simulation models of land-use change--a landscape metric-based approach.

    Science.gov (United States)

    Sakieh, Yousef; Salmanmahiny, Abdolrassoul

    2016-03-01

    Performance evaluation is a critical step when developing land-use and cover change (LUCC) models. The present study proposes a spatially explicit model performance evaluation method, adopting a landscape metric-based approach. To quantify GEOMOD model performance, a set of composition- and configuration-based landscape metrics including number of patches, edge density, mean Euclidean nearest neighbor distance, largest patch index, class area, landscape shape index, and splitting index were employed. The model takes advantage of three decision rules including neighborhood effect, persistence of change direction, and urbanization suitability values. According to the results, while class area, largest patch index, and splitting indices demonstrated insignificant differences between spatial pattern of ground truth and simulated layers, there was a considerable inconsistency between simulation results and real dataset in terms of the remaining metrics. Specifically, simulation outputs were simplistic and the model tended to underestimate number of developed patches by producing a more compact landscape. Landscape-metric-based performance evaluation produces more detailed information (compared to conventional indices such as the Kappa index and overall accuracy) on the model's behavior in replicating spatial heterogeneity features of a landscape such as frequency, fragmentation, isolation, and density. Finally, as the main characteristic of the proposed method, landscape metrics employ the maximum potential of observed and simulated layers for a performance evaluation procedure, provide a basis for more robust interpretation of a calibration process, and also deepen modeler insight into the main strengths and pitfalls of a specific land-use change model when simulating a spatiotemporal phenomenon.
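Two of the configuration metrics mentioned above, number of patches and edge density, can be computed directly from a categorical raster; the following is a minimal Python sketch, assuming a binary map of one land-use class and a square cell size (function and argument names are illustrative, not from the paper).

```python
import numpy as np
from scipy import ndimage

def patch_count_and_edge_density(class_map, cell_size_m):
    """class_map: 2-D array with 1 where the class occurs and 0 elsewhere."""
    class_map = np.asarray(class_map, dtype=int)
    _, n_patches = ndimage.label(class_map)          # 4-connected patches
    edges = (np.abs(np.diff(class_map, axis=0)).sum()
             + np.abs(np.diff(class_map, axis=1)).sum())
    edge_length_m = edges * cell_size_m              # internal class/non-class boundaries
    landscape_area_m2 = class_map.size * cell_size_m ** 2
    return int(n_patches), edge_length_m / landscape_area_m2
```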

  19. On the performance of a viscoelastic constitutive model for micro injection moulding simulations

    Directory of Open Access Journals (Sweden)

    G. Lucchetta

    2012-05-01

    Full Text Available The numerical simulation of the injection moulding process involving microstructures presents several challenges, mainly due to the surface effects that dominate the flow behaviour at the microscale. In this paper a new approach, which employs weld lines as flow markers, is used to evaluate whether the numerical codes that are normally used to simulate the conventional injection moulding process, are suitable to characterize the melt flow patterns in the filling of micro features. The Cross-WLF viscous model and the Giesekus viscoelastic model were evaluated using 3D models of a micro part implemented in two different numerical codes. A micro cavity was designed in order to compare the results of numerical simulations and experiments. While the viscous simulations were found to be inappropriate for multi-scale structures, the accuracy of micro filling predictions was significantly improved by implementing a viscoelastic material model.
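For context, the Cross-WLF model referred to above combines a Cross shear-thinning term with a WLF temperature shift of the zero-shear viscosity; the sketch below shows the standard form with the conventional parameter names (D1, A1, A2, tau*, n, T*), which are assumptions here rather than values from the paper.

```python
import math

def cross_wlf_viscosity(shear_rate, temperature_k, n, tau_star, D1, A1, A2, T_star):
    """eta = eta0 / (1 + (eta0*gamma_dot/tau*)^(1-n)), with
    eta0 = D1 * exp(-A1*(T - T*) / (A2 + (T - T*)))   [Pa.s]"""
    eta0 = D1 * math.exp(-A1 * (temperature_k - T_star)
                         / (A2 + (temperature_k - T_star)))
    return eta0 / (1.0 + (eta0 * shear_rate / tau_star) ** (1.0 - n))
```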

  20. High Performance Simulation Tool for Multiphysics Propulsion Using Fidelity-Adaptive Combustion Modeling, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation proposed here is a fidelity-adaptive combustion model (FAM) implemented into the Loci-STREAM CFD code for use at NASA for simulation of rocket...

  1. The Impact of 3D Data Quality on Improving GNSS Performance Using City Models Initial Simulations

    Science.gov (United States)

    Ellul, C.; Adjrad, M.; Groves, P.

    2016-10-01

    There is an increasing demand for highly accurate positioning information in urban areas, to support applications such as people and vehicle tracking, real-time air quality detection and navigation. However systems such as GPS typically perform poorly in dense urban areas. A number of authors have made use of 3D city models to enhance accuracy, obtaining good results, but to date the influence of the quality of the 3D city model on these results has not been tested. This paper addresses the following question: how does the quality, and in particular the variation in height, level of generalization and completeness and currency of a 3D dataset, impact the results obtained for the preliminary calculations in a process known as Shadow Matching, which takes into account not only where satellite signals are visible on the street but also where they are predicted to be absent. We describe initial simulations to address this issue, examining the variation in elevation angle - i.e. the angle above which the satellite is visible, for three 3D city models in a test area in London, and note that even within one dataset using different available height values could cause a difference in elevation angle of up to 29°. Missing or extra buildings result in an elevation variation of around 85°. Variations such as these can significantly influence the predicted satellite visibility which will then not correspond to that experienced on the ground, reducing the accuracy of the resulting Shadow Matching process.
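The elevation angle discussed above is essentially the angle subtended by a building edge at the receiver; a minimal sketch (Python, assuming the building height, the receiver antenna height and the horizontal distance along the satellite azimuth are known; names are illustrative):

```python
import math

def blocking_elevation_angle(building_height_m, receiver_height_m, horizontal_dist_m):
    """Satellites below this elevation angle (degrees) are predicted to be
    blocked by the building edge; above it they are predicted visible."""
    return math.degrees(math.atan2(building_height_m - receiver_height_m,
                                   horizontal_dist_m))

# e.g. a 20 m roof edge 10 m from a 1.5 m antenna blocks satellites below ~61.6 deg
```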

  2. A high-performance model for shallow-water simulations in distributed and heterogeneous architectures

    Science.gov (United States)

    Conde, Daniel; Canelas, Ricardo B.; Ferreira, Rui M. L.

    2017-04-01

    unstructured nature of the mesh topology with the corresponding employed solution, based on space-filling curves, being analyzed and discussed. Intra-node parallelism is achieved through OpenMP for CPUs and CUDA for GPUs, depending on which kind of device the process is running. Here the main difficulty is associated with the Object-Oriented approach, where the presence of complex data structures can degrade model performance considerably. STAV-2D now supports fully distributed and heterogeneous simulations where multiple different devices can be used to accelerate computation time. The advantages, short-comings and specific solutions for the employed unified Object-Oriented approach, where the source code for CPU and GPU has the same compilation units (no device specific branches like seen in available models), are discussed and quantified with a thorough scalability and performance analysis. The assembled parallel model is expected to achieve faster than real-time simulations for high resolutions (from meters to sub-meter) in large scaled problems (from cities to watersheds), effectively bridging the gap between detailed and timely simulation results. Acknowledgements This research as partially supported by Portuguese and European funds, within programs COMPETE2020 and PORL-FEDER, through project PTDC/ECM-HID/6387/2014 and Doctoral Grant SFRH/BD/97933/2013 granted by the National Foundation for Science and Technology (FCT). References Canelas, R.; Murillo, J. & Ferreira, R.M.L. (2013), Two-dimensional depth-averaged modelling of dam-break flows over mobile beds. Journal of Hydraulic Research, 51(4), 392-407. Conde, D. A. S.; Baptista, M. A. V.; Sousa Oliveira, C. & Ferreira, R. M. L. (2013), A shallow-flow model for the propagation of tsunamis over complex geometries and mobile beds, Nat. Hazards and Earth Syst. Sci., 13, 2533-2542. Conde, D. A. S.; Telhado, M. J.; Viana Baptista, M. A. & Ferreira, R. M. L. (2015) Severity and exposure associated with tsunami actions in
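The space-filling-curve partitioning mentioned above is commonly realised by sorting cells on a Morton (Z-order) key built from their integer grid coordinates; the generic sketch below illustrates the idea and is not taken from the STAV-2D source.

```python
def morton_key_2d(ix, iy, bits=16):
    """Interleave the bits of integer cell indices (ix, iy) into a Z-order key."""
    key = 0
    for b in range(bits):
        key |= ((ix >> b) & 1) << (2 * b)
        key |= ((iy >> b) & 1) << (2 * b + 1)
    return key

# Sorting cells by morton_key_2d keeps spatially close cells contiguous in memory,
# so splitting the sorted list evenly yields compact per-process partitions.
```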

  3. LIAR -- A computer program for the modeling and simulation of high performance linacs

    Energy Technology Data Exchange (ETDEWEB)

    Assmann, R.; Adolphsen, C.; Bane, K.; Emma, P.; Raubenheimer, T.; Siemann, R.; Thompson, K.; Zimmermann, F.

    1997-04-01

    The computer program LIAR (LInear Accelerator Research Code) is a numerical modeling and simulation tool for high performance linacs. Amongst others, it addresses the needs of state-of-the-art linear colliders where low emittance, high-intensity beams must be accelerated to energies in the 0.05-1 TeV range. LIAR is designed to be used for a variety of different projects. LIAR allows the study of single- and multi-particle beam dynamics in linear accelerators. It calculates emittance dilutions due to wakefield deflections, linear and non-linear dispersion and chromatic effects in the presence of multiple accelerator imperfections. Both single-bunch and multi-bunch beams can be simulated. Several basic and advanced optimization schemes are implemented. Present limitations arise from the incomplete treatment of bending magnets and sextupoles. A major objective of the LIAR project is to provide an open programming platform for the accelerator physics community. Due to its design, LIAR allows straight-forward access to its internal FORTRAN data structures. The program can easily be extended and its interactive command language ensures maximum ease of use. Presently, versions of LIAR are compiled for UNIX and MS Windows operating systems. An interface for the graphical visualization of results is provided. Scientific graphs can be saved in the PS and EPS file formats. In addition a Mathematica interface has been developed. LIAR now contains more than 40,000 lines of source code in more than 130 subroutines. This report describes the theoretical basis of the program, provides a reference for existing features and explains how to add further commands. The LIAR home page and the ONLINE version of this manual can be accessed under: http://www.slac.stanford.edu/grp/arb/rwa/liar.htm.

  4. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

leaving students. It is a probabilistic model. In the next part of this article, two more models - an 'input/output model' used for production systems or economic studies and a 'discrete event simulation model' - are introduced. Aircraft Performance Model.

  5. Methodology and Toolset for Model Verification, Hardware/Software co-simulation, Performance Optimisation and Customisable Source-code generation

    DEFF Research Database (Denmark)

    Berger, Michael Stübert; Soler, José; Yu, Hao

    2013-01-01

The MODUS project aims to provide a pragmatic and viable solution that will allow SMEs to substantially improve their positioning in the embedded-systems development market. The MODUS tool will provide a model verification and Hardware/Software co-simulation tool (TRIAL) and a performance optimisation and customisable source-code generation tool (TUNE). The concept is depicted in automated modelling and optimisation of embedded-systems development. The tool will enable model verification by guiding the selection of existing open-source model verification engines, based on the automated analysis of system properties, and producing inputs to be fed into these engines, interfacing with standard (SystemC) simulation platforms for HW/SW co-simulation, customisable source-code generation towards respecting coding standards and conventions and software performance-tuning optimisation through automated...

  6. Performance modelling and simulation of an absorption solar cooling system for Malaysia

    International Nuclear Information System (INIS)

    Assilzadeh, F.; Ali, Y.; Kamaruzzaman Sopian

    2006-01-01

Solar radiation contains huge amounts of energy and is required for almost all the natural processes on earth. Solar-powered air-conditioning has many advantages when compared to a conventional electrically driven system. This paper presents a solar cooling system that has been designed for Malaysia and other tropical regions using an evacuated tube solar collector and a LiBr absorption system. The absorption solar cooling system is modelled and simulated in the Transient System Simulation (TRNSYS) environment. A typical meteorological year file containing the weather parameters is used to simulate the system. A system optimization is then carried out in order to select the appropriate type of collector, the optimum size of the storage tank, the optimum collector slope and area, and the optimum thermostat setting of the auxiliary boiler

  7. Comparing the performance of 11 crop simulation models in predicting yield response to nitrogen fertilization

    DEFF Research Database (Denmark)

    Salo, T J; Palosuo, T; Kersebaum, K C

    2016-01-01

    Eleven widely used crop simulation models (APSIM, CERES, CROPSYST, COUP, DAISY, EPIC, FASSET, HERMES, MONICA, STICS and WOFOST) were tested using spring barley (Hordeum vulgare L.) data set under varying nitrogen (N) fertilizer rates from three experimental years in the boreal climate of Jokioinen...

  8. Simulation in Complex Modelling

    DEFF Research Database (Denmark)

    Nicholas, Paul; Ramsgaard Thomsen, Mette; Tamke, Martin

    2017-01-01

    This paper will discuss the role of simulation in extended architectural design modelling. As a framing paper, the aim is to present and discuss the role of integrated design simulation and feedback between design and simulation in a series of projects under the Complex Modelling framework. Complex...... performance, engage with high degrees of interdependency and allow the emergence of design agency and feedback between the multiple scales of architectural construction. This paper presents examples for integrated design simulation from a series of projects including Lace Wall, A Bridge Too Far and Inflated...... Restraint developed for the research exhibition Complex Modelling, Meldahls Smedie Gallery, Copenhagen in 2016. Where the direct project aims and outcomes have been reported elsewhere, the aim for this paper is to discuss overarching strategies for working with design integrated simulation....

  9. Impact of spatial data resolution on simulated catchment water balances and model performance of the multi-scale TOPLATS model

    Directory of Open Access Journals (Sweden)

    H. Bormann

    2006-01-01

Full Text Available This paper analyses the effect of spatial input data resolution on the simulated water balances and flow components using the multi-scale hydrological model TOPLATS. A data set of 25 m resolution for the central German Dill catchment (693 km2) is used for the investigation. After aggregation of the digital elevation model, soil map and land use classification to 50 m, 75 m, 100 m, 150 m, 200 m, 300 m, 500 m, 1000 m and 2000 m, water balances and water flow components are calculated for the entire Dill catchment as well as for 3 subcatchments without any recalibration. The study shows that model performance measures and simulated water balances remain almost constant for most of the aggregation steps for all investigated catchments. Slight differences in the simulated water balances and statistical quality measures occur for single catchments at resolutions of 50 m to 500 m (e.g. 0–3% for annual stream flow), and significant differences at resolutions of 1000 m and 2000 m (e.g. 2–12% for annual stream flow). These differences can be explained by the fact that the statistics of certain input data (land use data in particular) as well as soil physical characteristics change significantly at these spatial resolutions. The impact of smoothing the relief by aggregation occurs continuously but is barely reflected by the simulation results. To study the effect of aggregation of land use data in detail, the effect of aggregation on the water balance calculations is investigated for three different land use scenarios in addition to the current land use. The land use scenarios were available aiming at economic optimisation of agricultural and forestry practices at different field sizes (0.5 ha, 1.5 ha and 5.0 ha). The changes in water balance terms induced by aggregation of the land use scenarios are, with respect to catchment water balances, comparable to those for the current land use. A correlation analysis between statistics of input data and simulated annual

  10. Mathematical modelling and simulation of the thermal performance of a solar heated indoor swimming pool

    Directory of Open Access Journals (Sweden)

    Mančić Marko V.

    2014-01-01

Full Text Available Buildings with indoor swimming pools have a large energy footprint. The source of major energy loss is the swimming pool hall, where air humidity is increased by evaporation from the pool water surface. This increases energy consumption for heating and ventilation of the pool hall, fresh water supply losses and the heat demand for pool water heating. In this paper, a mathematical model of the swimming pool was developed to assess the energy demands of an indoor swimming pool building. The mathematical model of the swimming pool is used with the created multi-zone building model in the TRNSYS software to determine the pool hall energy demand and pool losses. Energy losses for pool water and pool hall heating and ventilation are analyzed for different target pool water and air temperatures. The simulation showed that pool water heating accounts for around 22%, whereas heating and ventilation of the pool hall account for around 60%, of the total pool hall heat demand. With a change of the preset controller air and water temperatures in the simulations, evaporation loss was in the range 46-54% of the total pool losses. A solar thermal sanitary hot water system was modelled and simulated to analyze its potential for energy savings in the presented demand side model. The simulation showed that up to 87% of the water heating demand could be met by the solar thermal system, while avoiding stagnation. [Project of the Ministry of Science of the Republic of Serbia, No. III 42006: Research and development of energy and environmentally highly effective polygeneration systems based on using renewable energy sources]

  11. Assessing the Impact of Equipment Aging on System Performance Using Simulation Modeling Methods

    International Nuclear Information System (INIS)

    Gupta, N. K.

    2005-01-01

    The radiological Inductively Coupled Plasma Mass Spectrometer (ICP-MS) is used to analyze the radioactive samples collected from different radioactive material processing operations at Savannah River Site (SRS). The expeditious processing of these samples is important for safe and reliable operations at SRS. As the radiological (RAD) ICP-MS machine ages, the experience shows that replacement parts and repairs are difficult to obtain on time for reliable operations after 5 years of service. A discrete event model using commercial software EXTEND was prepared to assess the impact on sample turn around times as the ICP-MS gets older. The model was prepared using the sample statistics from the previous 4 years. Machine utilization rates were calculated for the new machine, 5 year old machine, 10 year old machine, and a 12 year old machine. Computer simulations were run for these periods and the sample delay times calculated. The model was validated against the sample statistics collected from the previous 4 quarters. 90% confidence intervals were calculated for the 10th, 25th, 50th, and 90th quantiles of the samples. The simulation results show that if 50% of the samples are needed on time for efficient site operations, a 10 year old machine could take nearly 50 days longer to process these samples than a 5-year old machine. This simulation effort quantifies the impact on sample turn around time as the ICP-MS gets older
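A toy discrete-event sketch of the same idea (plain Python, not the EXTEND model): samples queue for a single instrument whose breakdown probability and repair lead time grow with age, and the mean turnaround time is tallied. All rates and durations below are illustrative assumptions.

```python
import random

def mean_turnaround(n_samples, service_days, p_breakdown, repair_days, seed=42):
    """Samples arrive one per day; breakdowns delay the queue by a repair time."""
    random.seed(seed)
    free_at, total = 0.0, 0.0
    for day in range(n_samples):
        start = max(free_at, day)                  # wait for the instrument
        if random.random() < p_breakdown:          # age-dependent failure chance
            start += random.uniform(*repair_days)  # wait for parts and repair
        free_at = start + service_days
        total += free_at - day                     # turnaround of this sample
    return total / n_samples

# illustrative comparison: 5-year-old vs 10-year-old machine
print(mean_turnaround(1000, 0.5, 0.01, (1, 5)))
print(mean_turnaround(1000, 0.5, 0.05, (10, 40)))
```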

  12. The performance of simulated annealing in parameter estimation for vapor-liquid equilibrium modeling

    Directory of Open Access Journals (Sweden)

    A. Bonilla-Petriciolet

    2007-03-01

Full Text Available In this paper we report the application and evaluation of the simulated annealing (SA) optimization method in parameter estimation for vapor-liquid equilibrium (VLE) modeling. We tested this optimization method using the classical least squares and error-in-variable approaches. The reliability and efficiency of the data-fitting procedure are also considered using different values for the algorithm parameters of the SA method. Our results indicate that this method, when properly implemented, is a robust procedure for nonlinear parameter estimation in thermodynamic models. However, in difficult problems it can still converge to local optima of the objective function.
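A bare-bones version of an SA scheme of the kind evaluated above, applied to a generic least-squares objective over model parameters (Python; the step size and the geometric cooling schedule are assumptions, not the settings tested in the paper):

```python
import math
import random

def simulated_annealing(objective, x0, step=0.05, t0=1.0, cooling=0.95, n_iter=5000, seed=0):
    """Minimise objective(x), e.g. a least-squares VLE data-fitting function."""
    random.seed(seed)
    x, fx = list(x0), objective(x0)
    best_x, best_f, temp = list(x0), fx, t0
    for _ in range(n_iter):
        cand = [xi + random.uniform(-step, step) for xi in x]
        fc = objective(cand)
        # accept improvements always, worse moves with Metropolis probability
        if fc < fx or random.random() < math.exp(-(fc - fx) / temp):
            x, fx = cand, fc
            if fc < best_f:
                best_x, best_f = list(cand), fc
        temp *= cooling
    return best_x, best_f
```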

  13. Interactions of Team Mental Models and Monitoring Behaviors Predict Team Performance in Simulated Anesthesia Inductions

    Science.gov (United States)

    Burtscher, Michael J.; Kolbe, Michaela; Wacker, Johannes; Manser, Tanja

    2011-01-01

    In the present study, we investigated how two team mental model properties (similarity vs. accuracy) and two forms of monitoring behavior (team vs. systems) interacted to predict team performance in anesthesia. In particular, we were interested in whether the relationship between monitoring behavior and team performance was moderated by team…

  14. Performance and Uncertainty Evaluation of Snow Models on Snowmelt Flow Simulations over a Nordic Catchment (Mistassibi, Canada

    Directory of Open Access Journals (Sweden)

    Magali Troin

    2015-11-01

Full Text Available An analysis of hydrological response to a multi-model approach based on an ensemble of seven snow models (SM; degree-day and mixed degree-day/energy balance models) coupled with three hydrological models (HM) is presented for a snowmelt-dominated basin in Canada. The present study aims to compare the performance and the reliability of different types of SM-HM combinations at simulating snowmelt flows over the 1961–2000 historical period. The multi-model approach also allows evaluating the uncertainties associated with the structure of the SM-HM ensemble to better predict river flows in Nordic environments. The 20-year calibration shows a satisfactory performance of the ensemble of 21 SM-HM combinations at simulating daily discharges and snow water equivalents (SWEs), with low streamflow volume biases. The validation of the ensemble of 21 SM-HM combinations is conducted over a 20-year period. Performances are similar to the calibration in simulating the daily discharges and SWEs, again with low model biases for streamflow. The spring-snowmelt-generated peak flow is captured only in timing by the ensemble of 21 SM-HM combinations. The results of specific hydrologic indicators show that the uncertainty related to the choice of the given HM in the SM-HM combinations cannot be neglected in a more quantitative manner in simulating snowmelt flows. The selection of the SM plays a larger role than the choice of the SM approach (degree-day versus mixed degree-day/energy balance) in simulating spring flows. Overall, the snow models provide a low degree of uncertainty to the total uncertainty in hydrological modeling for snow hydrology studies.
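The degree-day members of such a snow-model ensemble reduce, in essence, to the routine sketched below (Python; the melt factor and threshold temperatures are placeholder values, not those calibrated in the study):

```python
def degree_day_snowmelt(air_temp_c, precip_mm, melt_factor=3.0, t_melt=0.0, t_snow=0.0):
    """Daily snow accumulation and melt; returns melt and SWE series (mm)."""
    swe, melt_out, swe_out = 0.0, [], []
    for t, p in zip(air_temp_c, precip_mm):
        if t <= t_snow:
            swe += p                                    # precipitation stored as snow
        melt = min(swe, melt_factor * max(t - t_melt, 0.0))
        swe -= melt
        melt_out.append(melt)
        swe_out.append(swe)
    return melt_out, swe_out
```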

  15. An Assessment of Mean Areal Precipitation Methods on Simulated Stream Flow: A SWAT Model Performance Assessment

    Directory of Open Access Journals (Sweden)

    Sean Zeiger

    2017-06-01

Full Text Available Accurate mean areal precipitation (MAP) estimates are essential input forcings for hydrologic models. However, the selection of the most accurate method to estimate MAP can be daunting because there are numerous methods to choose from (e.g., proximate gauge, direct weighted average, surface-fitting, and remotely sensed methods). Multiple methods (n = 19) were used to estimate MAP with precipitation data from 11 distributed monitoring sites, and 4 remotely sensed data sets. Each method was validated against the hydrologic model simulated stream flow using the Soil and Water Assessment Tool (SWAT). SWAT was validated using a split-site method and the observed stream flow data from five nested-scale gauging sites in a mixed-land-use watershed of the central USA. Cross-validation results showed the error associated with surface-fitting and remotely sensed methods ranging from −4.5 to −5.1%, and −9.8 to −14.7%, respectively. Split-site validation results showed percent bias (PBIAS) values that ranged from −4.5 to −160%. Second order polynomial functions especially overestimated precipitation and subsequent stream flow simulations (PBIAS = −160) in the headwaters. The results indicated that using an inverse-distance weighted, linear polynomial interpolation or multiquadric function method to estimate MAP may improve SWAT model simulations. Collectively, the results highlight the importance of spatially distributed observed hydroclimate data for precipitation and subsequent stream flow estimations. The MAP methods demonstrated in the current work can be used to reduce hydrologic model uncertainty caused by watershed physiographic differences.
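Two of the quantities central to the comparison above, an inverse-distance-weighted MAP estimate and the percent bias statistic, can be sketched as follows (Python; the gauge coordinates, the distance exponent and the PBIAS sign convention are common-usage assumptions, not details taken from the paper):

```python
import numpy as np

def idw_map(gauge_xy, gauge_precip, target_xy, power=2.0):
    """Inverse-distance-weighted mean areal precipitation at a target point."""
    gauge_xy = np.asarray(gauge_xy, dtype=float)
    d = np.hypot(gauge_xy[:, 0] - target_xy[0], gauge_xy[:, 1] - target_xy[1])
    w = 1.0 / np.maximum(d, 1e-6) ** power
    return float(np.sum(w * np.asarray(gauge_precip, dtype=float)) / np.sum(w))

def pbias(observed, simulated):
    """Percent bias; negative values indicate overestimation by the model."""
    obs = np.asarray(observed, dtype=float)
    sim = np.asarray(simulated, dtype=float)
    return float(100.0 * np.sum(obs - sim) / np.sum(obs))
```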

  16. Microcomputer simulation model for facility performance assessment: a case study of nuclear spent fuel handling facility operations

    Energy Technology Data Exchange (ETDEWEB)

    Chockie, A.D.; Hostick, C.J.; Otis, P.T.

    1985-10-01

A microcomputer-based simulation model was recently developed at the Pacific Northwest Laboratory (PNL) to assist in the evaluation of design alternatives for a proposed facility to receive, consolidate and store nuclear spent fuel from US commercial power plants. Previous performance assessments were limited to deterministic calculations and Gantt chart representations of the facility operations. To ensure that the design of the facility will be adequate to meet the specified throughput requirements, the simulation model was used to analyze such factors as material flow, equipment capability and the interface between the MRS facility and the nuclear waste transportation system. The simulation analysis model was based on commercially available software and application programs designed to represent the MRS waste handling facility operations. The results of the evaluation were used by the design review team at PNL to identify areas where design modifications should be considered. 4 figs.

  17. A cycle simulation model for predicting the performance of a diesel engine fuelled by diesel and biodiesel blends

    International Nuclear Information System (INIS)

    Gogoi, T.K.; Baruah, D.C.

    2010-01-01

Among the alternative fuels, biodiesel and its blends are considered suitable and the most promising fuels for diesel engines. The properties of biodiesel are found to be similar to those of diesel. Many researchers have experimentally evaluated the performance characteristics of conventional diesel engines fuelled by biodiesel and its blends. However, experiments require enormous effort, money and time. Hence, a cycle simulation model incorporating a thermodynamics-based single-zone combustion model is developed to predict the performance of a diesel engine. The effect of engine speed and compression ratio on brake power and brake thermal efficiency is analysed through the model. The fuels considered for the analysis are diesel and 20%, 40% and 60% blends of diesel and biodiesel derived from Karanja oil (Pongamia glabra). The model predicts similar performance for diesel and the 20% and 40% blends. However, with the 60% blend, it reveals better performance in terms of brake power and brake thermal efficiency.

  18. Mode of operation and performance of a simulation model for the electricity management

    International Nuclear Information System (INIS)

    Weible, H.

    1981-01-01

In the first two main parts of this report the structure of a simulation model to define the consequences of decisions in the electricity supply sector is explained, based on a careful review of the relevant problems. By means of systems analysis, the model represents an attempt to describe all essential relations between the electricity demand on the one hand and the consequences of the generation of electricity (including transport) for consumers, the state, the environment and the capital and fuel markets on the other hand. The manifold ways of operating and applying the model are demonstrated taking the public electricity management of Baden-Wuerttemberg as an example. Besides a validation of the model for 1970 to 1977, possible trends until the year 2000 are shown. As an essential result of the analyses it can be shown that forgoing any further expansion of nuclear energy turns out to be unrealistic under an assumed average increase in electricity demand of 4% p.a. A comparison of different model conceptions shows that the information loss resulting from the approximation of annual load curves leads to significant deviations in the model results. According to the sensitivity analyses, the growth in electricity consumption turns out to be the most important influence on the time-dependent results. (orig./UA) [de

  19. Modeling and Simulation of Long-Term Performance of Near-Surface Barriers

    International Nuclear Information System (INIS)

    Piet, S. J.; Jacobson, J. J.; Martian, P.; Martineau, R.; Soto, R.

    2003-01-01

    INEEL started a new project on long-term barrier integrity in April 2002 that aims to catalyze a Barrier Improvement Cycle (iterative learning and application) and thus enable Remediation System Performance Management (doing the right maintenance neither too early nor too late, prior to system-level failure). This paper describes our computer simulation approach for better understanding the relationships and dynamics between the various components and management decisions in a cap. The simulation is designed to clarify the complex relationships between the various components within the cap system and the various management practices that affect the barrier performance. We have also conceptualized a time-dependent 3-D simulation with rigorous solution to unsaturated flow physics with complex surface boundary conditions

  20. Modeling and Simulation of Long-Term Performance of Near-Surface Barriers

    Energy Technology Data Exchange (ETDEWEB)

    Piet, S. J.; Jacobson, J. J.; Martian, P.; Martineau, R.; Soto, R.

    2003-02-25

    . Thus, the INEEL started a new project on long-term barrier integrity in April 2002 that aims to catalyze a Barrier Improvement Cycle (iterative learning and application) and thus enable Remediation System Performance Management (doing the right maintenance neither too early nor too late, prior to system-level failure). This paper describes our computer simulation approach for better understanding the relationships and dynamics between the various components and management decisions in a cap. The simulation is designed to clarify the complex relationships between the various components within the cap system and the various management practices that affect the barrier performance. We have also conceptualized a time-dependent 3-D simulation with rigorous solution to unsaturated flow physics with complex surface boundary conditions.

  1. Modeling and Simulation of Long-Term Performance of Near-Surface Barriers

    Energy Technology Data Exchange (ETDEWEB)

    Piet, Steven James; Jacobson, Jacob Jordan; Soto, Rafael; Martian, Pete; Martineau, Richard Charles

    2003-02-01

    , the INEEL started a new project on long-term barrier integrity in April 2002 that aims to catalyze a Barrier Improvement Cycle (iterative learning and application) and thus enable Remediation System Performance Management (doing the right maintenance neither too early nor too late, prior to system-level failure). This paper describes our computer simulation approach for better understanding the relationships and dynamics between the various components and management decisions in a cap. The simulation is designed to clarify the complex relationships between the various components within the cap system and the various management practices that affect the barrier performance. We have also conceptualized a time-dependent 3-D simulation with rigorous solution to unsaturated flow physics with complex surface boundary conditions.

  2. Predictive Maturity of Multi-Scale Simulation Models for Fuel Performance

    International Nuclear Information System (INIS)

    Atamturktur, Sez; Unal, Cetin; Hemez, Francois; Williams, Brian; Tome, Carlos

    2015-01-01

The project proposed to provide a Predictive Maturity Framework with its companion metrics that (1) introduce a formalized, quantitative means to communicate information between interested parties, (2) provide scientifically dependable means to claim completion of Validation and Uncertainty Quantification (VU) activities, and (3) guide the decision makers in the allocation of Nuclear Energy's resources for code development and physical experiments. The project team proposed to develop this framework based on two complementary criteria: (1) the extent of experimental evidence available for the calibration of simulation models and (2) the sophistication of the physics incorporated in simulation models. The proposed framework is capable of quantifying the interaction between the required number of physical experiments and the degree of physics sophistication. The project team has developed this framework and implemented it with a multi-scale model for simulating creep of a core reactor cladding. The multi-scale model is composed of the viscoplastic self-consistent (VPSC) code at the meso-scale, which represents the visco-plastic behavior and changing properties of a highly anisotropic material, and a Finite Element (FE) code at the macro-scale to represent the elastic behavior and apply the loading. The framework developed takes advantage of the transparency provided by partitioned analysis, where independent constituent codes are coupled in an iterative manner. This transparency allows model developers to better understand and remedy the source of biases and uncertainties, whether they stem from the constituents or the coupling interface, by exploiting separate-effect experiments conducted within the constituent domain and integral-effect experiments conducted within the full-system domain. The project team has implemented this procedure with the multi-scale VPSC-FE model and demonstrated its ability to improve the predictive capability of the model. Within this

  3. Predictive Maturity of Multi-Scale Simulation Models for Fuel Performance

    Energy Technology Data Exchange (ETDEWEB)

    Atamturktur, Sez [Clemson Univ., SC (United States); Unal, Cetin [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hemez, Francois [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Williams, Brian [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Tome, Carlos [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-03-16

The project proposed to provide a Predictive Maturity Framework with its companion metrics that (1) introduce a formalized, quantitative means to communicate information between interested parties, (2) provide scientifically dependable means to claim completion of Validation and Uncertainty Quantification (VU) activities, and (3) guide the decision makers in the allocation of Nuclear Energy’s resources for code development and physical experiments. The project team proposed to develop this framework based on two complementary criteria: (1) the extent of experimental evidence available for the calibration of simulation models and (2) the sophistication of the physics incorporated in simulation models. The proposed framework is capable of quantifying the interaction between the required number of physical experiments and the degree of physics sophistication. The project team has developed this framework and implemented it with a multi-scale model for simulating creep of a core reactor cladding. The multi-scale model is composed of the viscoplastic self-consistent (VPSC) code at the meso-scale, which represents the visco-plastic behavior and changing properties of a highly anisotropic material, and a Finite Element (FE) code at the macro-scale to represent the elastic behavior and apply the loading. The framework developed takes advantage of the transparency provided by partitioned analysis, where independent constituent codes are coupled in an iterative manner. This transparency allows model developers to better understand and remedy the source of biases and uncertainties, whether they stem from the constituents or the coupling interface, by exploiting separate-effect experiments conducted within the constituent domain and integral-effect experiments conducted within the full-system domain. The project team has implemented this procedure with the multi-scale VPSC-FE model and demonstrated its ability to improve the predictive capability of the model. Within this

  4. Signal and image processing systems performance evaluation, simulation, and modeling; Proceedings of the Meeting, Orlando, FL, Apr. 4, 5, 1991

    Science.gov (United States)

    Nasr, Hatem N.; Bazakos, Michael E.

    The various aspects of the evaluation and modeling problems in algorithms, sensors, and systems are addressed. Consideration is given to a generic modular imaging IR signal processor, real-time architecture based on the image-processing module family, application of the Proto Ware simulation testbed to the design and evaluation of advanced avionics, development of a fire-and-forget imaging infrared seeker missile simulation, an adaptive morphological filter for image processing, laboratory development of a nonlinear optical tracking filter, a dynamic end-to-end model testbed for IR detection algorithms, wind tunnel model aircraft attitude and motion analysis, an information-theoretic approach to optimal quantization, parametric analysis of target/decoy performance, neural networks for automated target recognition parameters adaptation, performance evaluation of a texture-based segmentation algorithm, evaluation of image tracker algorithms, and multisensor fusion methodologies. (No individual items are abstracted in this volume)

  5. Wake modeling and simulation

    DEFF Research Database (Denmark)

    Larsen, Gunner Chr.; Madsen Aagaard, Helge; Larsen, Torben J.

We present a consistent, physically based theory for the wake meandering phenomenon, which we consider of crucial importance for the overall description of wind turbine loadings in wind farms. In its present version the model is confined to single wake situations. The model philosophy does, however, ... The methodology has been implemented in the aeroelastic code HAWC2, and example simulations of wake situations, from the small Tjæreborg wind farm, have been performed, showing satisfactory agreement between predictions and measurements.

  6. Constructing probability distributions of uncertain variables in models of the performance of the Waste Isolation Pilot Plant: The 1990 performance simulations

    International Nuclear Information System (INIS)

    Tierney, M.S.

    1990-12-01

    A five-step procedure was used in the 1990 performance simulations to construct probability distributions of the uncertain variables appearing in the mathematical models used to simulate the Waste Isolation Pilot Plant's (WIPP's) performance. This procedure provides a consistent approach to the construction of probability distributions in cases where empirical data concerning a variable are sparse or absent and minimizes the amount of spurious information that is often introduced into a distribution by assumptions of nonspecialists. The procedure gives first priority to the professional judgment of subject-matter experts and emphasizes the use of site-specific empirical data for the construction of the probability distributions when such data are available. In the absence of sufficient empirical data, the procedure employs the Maximum Entropy Formalism and the subject-matter experts' subjective estimates of the parameters of the distribution to construct a distribution that can be used in a performance simulation. (author)
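As an illustration of the Maximum Entropy Formalism step (not the WIPP implementation): when experts supply only a range, the maximum-entropy distribution is uniform; when they also supply a mean, it becomes a truncated exponential whose rate can be solved for numerically. A sketch under those assumptions, with illustrative names:

```python
import numpy as np
from scipy.optimize import brentq

def maxent_rate(lower, upper, mean):
    """Rate lam of the maximum-entropy density p(x) ~ exp(lam*(x - lower)) on
    [lower, upper] with the prescribed mean (lam = 0 recovers the uniform case).
    The mean must lie strictly inside the interval."""
    width = upper - lower

    def mean_given(lam):
        if abs(lam * width) < 1e-9:
            return lower + 0.5 * width
        return lower + width * np.exp(lam * width) / (np.exp(lam * width) - 1.0) - 1.0 / lam

    return brentq(lambda lam: mean_given(lam) - mean, -50.0 / width, 50.0 / width)

# e.g. maxent_rate(0.0, 1.0, 0.3) is negative: mass shifted toward the lower bound
```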

  7. Urban weather data and building models for the inclusion of the urban heat island effect in building performance simulation.

    Science.gov (United States)

    Palme, M; Inostroza, L; Villacreses, G; Lobato, A; Carrasco, C

    2017-10-01

    This data article presents files supporting calculations for urban heat island (UHI) inclusion in building performance simulation (BPS). The methodology is used in the research article "From urban climate to energy consumption. Enhancing building performance simulation by including the urban heat island effect" (Palme et al., 2017) [1]. In this research, a Geographical Information System (GIS) study is carried out in order to statistically represent the most important urban scenarios of four South American cities (Guayaquil, Lima, Antofagasta and Valparaíso). Then, a Principal Component Analysis (PCA) is performed to obtain reference Urban Tissue Categories (UTC) to be used in urban weather simulation. The urban weather files are generated using the Urban Weather Generator (UWG) software (version 4.1 beta). Finally, BPS is run with the Transient System Simulation (TRNSYS) software (version 17). In this data paper, four sets of data are presented: 1) PCA data (Excel) explaining how to group different urban samples into representative UTC; 2) UWG data (text) to reproduce the urban weather generation for the UTC used in the four cities (4 UTC in Lima, Guayaquil and Antofagasta, and 5 UTC in Valparaíso); 3) weather data (text) with the resulting rural and urban weather; 4) BPS model data (text) containing the TRNSYS models (four building models).

  8. Urban weather data and building models for the inclusion of the urban heat island effect in building performance simulation

    Directory of Open Access Journals (Sweden)

    M. Palme

    2017-10-01

    This data article presents files supporting calculations for urban heat island (UHI) inclusion in building performance simulation (BPS). The methodology is used in the research article "From urban climate to energy consumption. Enhancing building performance simulation by including the urban heat island effect" (Palme et al., 2017) [1]. In this research, a Geographical Information System (GIS) study is carried out in order to statistically represent the most important urban scenarios of four South American cities (Guayaquil, Lima, Antofagasta and Valparaíso). Then, a Principal Component Analysis (PCA) is performed to obtain reference Urban Tissue Categories (UTC) to be used in urban weather simulation. The urban weather files are generated using the Urban Weather Generator (UWG) software (version 4.1 beta). Finally, BPS is run with the Transient System Simulation (TRNSYS) software (version 17). In this data paper, four sets of data are presented: 1) PCA data (Excel) explaining how to group different urban samples into representative UTC; 2) UWG data (text) to reproduce the urban weather generation for the UTC used in the four cities (4 UTC in Lima, Guayaquil and Antofagasta, and 5 UTC in Valparaíso); 3) weather data (text) with the resulting rural and urban weather; 4) BPS model data (text) containing the TRNSYS models (four building models).

  9. Modeling and Simulation of Thermal Performance of Solar-Assisted Air Conditioning System under Iraq Climate

    Directory of Open Access Journals (Sweden)

    Najim Abid Jassim

    2016-08-01

    In Iraq, most small buildings deploy conventional air-conditioning technology, typically electrically driven compressor systems, which exhibits several clear disadvantages such as high energy consumption and high electricity demand at peak loads. In this work, the thermal performance of an air-conditioning system combined with a solar collector is investigated theoretically. The hybrid air conditioner consists of a semi-hermetic compressor, a water-cooled shell-and-tube condenser, a thermal expansion valve and a coil-with-tank evaporator. The theoretical analysis includes a simulation of the solar-assisted air-conditioning system using EES software to analyze the effect of different parameters on the compressor power consumption and the performance of the system. The results show that the refrigeration capacity increases from 2.7 kW to 4.4 kW as the evaporating temperature increases from 3 to 18 °C, while the power consumption increases from 0.89 kW to 1.08 kW, so the COP of the system increases from 3.068 to 4.117. The power consumption increases from 0.897 kW to 1.031 kW as the condensing temperature increases from 35 °C to 45 °C, while the COP decreases from 3.89 to 3.1. The power consumption decreases from 1.05 kW to 0.7 kW as the solar radiation intensity increases from 300 W/m2 to 1000 W/m2, while the COP increases from 3.15 to 4.8. A comparison between the simulation and available experimental data shows acceptable agreement.
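
    Under the standard definition COP = refrigeration capacity / compressor power, the reported figures can be spot-checked; the snippet below only re-divides the rounded values quoted in the abstract:

```python
# Quick consistency check of COP = Q_evap / W_comp for the reported operating points.
cases = {
    "evap 3 degC":  {"Q_kW": 2.7, "W_kW": 0.89},
    "evap 18 degC": {"Q_kW": 4.4, "W_kW": 1.08},
}
for name, c in cases.items():
    cop = c["Q_kW"] / c["W_kW"]
    print(f"{name}: COP = {c['Q_kW']} / {c['W_kW']} = {cop:.2f}")
# Yields roughly 3.0 and 4.1, consistent with the reported COP range of 3.068-4.117
# (small differences come from the rounded capacity and power figures).
```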

  10. SOLID OXIDE FUEL CELL MANUFACTURING COST MODEL: SIMULATING RELATIONSHIPS BETWEEN PERFORMANCE, MANUFACTURING, AND COST OF PRODUCTION

    Energy Technology Data Exchange (ETDEWEB)

    Eric J. Carlson; Yong Yang; Chandler Fulton

    2004-04-20

    The successful commercialization of fuel cells will depend on achieving competitive system costs and efficiencies. System cost directly impacts the capital-equipment component of the cost of electricity (COE) and is a major contributor to the O&M component. The replacement cost for equipment (also heavily influenced by stack life) is generally a major contributor to O&M costs. In this project, the authors worked with the SECA industrial teams to estimate the impact of general manufacturing issues of interest on stack cost, using an activities-based cost model for anode-supported planar SOFC stacks with metallic interconnects. An earlier model developed for NETL for anode-supported planar SOFCs was enhanced by a linkage to a performance/thermal/mechanical model, by the addition of quality-control steps with specific characterization methods to the process flow, and by an assessment of economies of scale. The 3-dimensional adiabatic performance model was used to calculate the average power density for the assumed geometry and operating conditions (i.e., inlet and exhaust temperatures, utilization, and fuel composition) based on publicly available polarization curves. The SECA team provided guidance on which manufacturing and design issues should be assessed in this Phase I demonstration of cost-modeling capabilities. The authors considered the impact of the following parameters on yield and cost: layer thickness (i.e., anode, electrolyte, and cathode) on cost and stress levels, the statistical nature of ceramic material failure on yield, and quality-control steps and strategies. In this demonstration of the capabilities of the linked model, only the active stack (i.e., anode, electrolyte, and cathode) and interconnect materials were included in the analysis. Factory costs are presented on an area and kilowatt basis to allow developers to extrapolate to their level of performance, stack design, materials, seal and system configurations, and internal corporate overheads and margins.
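
    The area-to-kilowatt conversion implied by presenting factory costs "on an area and kilowatt basis" hinges on the average power density; a minimal sketch of that conversion with purely illustrative numbers (not values from the SECA cost model):

```python
def cost_per_kw(cost_per_m2: float, power_density_w_per_cm2: float) -> float:
    """Convert an area-based stack cost into $/kW using the average power density."""
    w_per_m2 = power_density_w_per_cm2 * 1e4   # 1 m^2 = 1e4 cm^2
    kw_per_m2 = w_per_m2 / 1e3
    return cost_per_m2 / kw_per_m2

# Illustrative values only (not taken from the SECA cost model):
print(f"{cost_per_kw(cost_per_m2=1500.0, power_density_w_per_cm2=0.35):.0f} $/kW")  # ~429 $/kW
```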

  11. Simulator validation results and proposed reporting format from flight testing a software model of a complex, high-performance airplane.

    Science.gov (United States)

    2008-01-01

    Computer simulations are often used in aviation studies. These simulation tools may require complex, high-fidelity aircraft models. Since many of the flight models used are third-party developed products, independent validation is desired prior to im...

  12. Performance evaluation of land surface models and cumulus convection schemes in the simulation of Indian summer monsoon using a regional climate model

    Science.gov (United States)

    Maity, S.; Satyanarayana, A. N. V.; Mandal, M.; Nayak, S.

    2017-11-01

    In this study, an attempt has been made to investigate the sensitivity of land surface models (LSMs) and cumulus convection schemes (CCSs) in simulating the Indian Summer Monsoon (ISM) using a regional climate model, RegCM version 4.1. Numerical experiments were conducted on a seasonal scale (May-September) for three consecutive years (2007, 2008, 2009) with two LSMs (Biosphere Atmosphere Transfer Scheme (BATS) and Community Land Model (CLM 3.5)) and five CCSs (MIT, KUO, GRELL, GRELL over land and MIT over ocean (GL_MO), GRELL over ocean and MIT over land (GO_ML)). Important synoptic features are validated using various reanalysis datasets, satellite-derived products from TRMM, and CRU data. Seasonally averaged surface temperature is reasonably well simulated by the model with both LSMs when combined with the MIT, GO_ML and GL_MO schemes. Model simulations reveal a slight warm bias with these schemes, whereas a significant cold bias is seen with the KUO and GRELL schemes in all three years. The simulated Somali Jet (SJ) is weak in all simulations except those with the MIT scheme (with both BATS and CLM), in which the strength of the SJ is reasonably well captured. Although the model simulates the Tropical Easterly Jet (TEJ) and Sub-Tropical Westerly Jet (STWJ) with all the CCSs in terms of their location and strength, the MIT scheme performs better than the rest. Seasonal rainfall is not well simulated by the model: significant underestimation of Indian Summer Monsoon Rainfall (ISMR) is observed over central and northwest India. The spatial distribution of seasonal ISMR is comparatively better simulated with the MIT scheme, followed by GO_ML, in combination with CLM, although rainfall is overestimated over heavy-precipitation zones. Overall statistical analysis indicates that RegCM4 shows the best skill in simulating the ISM with the MIT scheme and CLM.

  13. Identifying a key physical factor sensitive to the performance of Madden-Julian oscillation simulation in climate models

    Science.gov (United States)

    Kim, Go-Un; Seo, Kyong-Hwan

    2018-01-01

    A key physical factor regulating the performance of Madden-Julian oscillation (MJO) simulation is examined using 26 climate model simulations from the World Meteorological Organization's Working Group for Numerical Experimentation/Global Energy and Water Cycle Experiment Atmospheric System Study (WGNE and MJO-Task Force/GASS) global model comparison project. For this, the intraseasonal moisture budget equation is analyzed and a simple, efficient physical quantity is developed. The result shows that MJO skill is most sensitive to the vertically integrated intraseasonal zonal wind convergence (ZC). In particular, a specific threshold value of the strength of the ZC can be used to distinguish between good and poor models. An additional finding is that good models exhibit the correct simultaneous phase relationship between convection and the large-scale circulation. In poor models, however, the peak circulation response appears 3 days after peak rainfall, suggesting unfavorable coupling between convection and circulation. To improve simulation of the MJO in climate models, we propose that this delay of the circulation response to convection be corrected in the cumulus parameterization scheme.
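
    One simplified way to form such a diagnostic is to take already bandpass-filtered (intraseasonal) zonal wind anomalies and compute the mass-weighted vertical integral of their zonal convergence; the grid, the synthetic wind values, and the exact formulation below are illustrative assumptions rather than the metric as defined in the study:

```python
import numpy as np

G = 9.81            # gravitational acceleration (m s^-2)
A_EARTH = 6.371e6   # Earth radius (m)

def zonal_convergence_vint(u_anom, plev_hpa, lon_deg, lat_deg):
    """Mass-weighted vertical integral of zonal wind convergence, -(1/g) * sum(du/dx * dp).
    u_anom: intraseasonal (already bandpass-filtered) zonal wind anomaly,
            shape (n_levels, n_lons), at a single latitude."""
    dx = np.deg2rad(np.gradient(lon_deg)) * A_EARTH * np.cos(np.deg2rad(lat_deg))
    dudx = np.gradient(u_anom, axis=1) / dx          # zonal derivative at each level (s^-1)
    dp = np.abs(np.gradient(plev_hpa)) * 100.0       # layer thickness (Pa)
    return -(dudx * dp[:, None]).sum(axis=0) / G     # kg m^-2 s^-1 at each longitude

# Toy profile: 5 pressure levels, 10 longitudes of synthetic wind anomalies.
plev = np.array([1000.0, 850.0, 700.0, 500.0, 300.0])
lons = np.linspace(60.0, 150.0, 10)
u_anom = np.sin(np.deg2rad(2.0 * lons))[None, :] * np.linspace(1.0, 0.2, 5)[:, None]
print(zonal_convergence_vint(u_anom, plev, lons, lat_deg=0.0))
```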

  14. Qualification of a Plant Disease Simulation Model: Performance of the LATEBLIGHT Model Across a Broad Range of Environments.

    Science.gov (United States)

    Andrade-Piedra, Jorge L; Forbes, Gregory A; Shtienberg, Dani; Grünwald, Niklaus J; Chacón, María G; Taipe, Marco V; Hijmans, Robert J; Fry, William E

    2005-12-01

    The concept of model qualification, i.e., discovering the domain over which a validated model may be properly used, was illustrated with LATEBLIGHT, a mathematical model that simulates the effect of weather, host growth and resistance, and fungicide use on asexual development and growth of Phytophthora infestans on potato foliage. Late blight epidemics from Ecuador, Mexico, Israel, and the United States involving 13 potato cultivars (32 epidemics in total) were compared with model predictions using graphical and statistical tests. Fungicides were not applied in any of the epidemics. For the simulations, a host resistance level was assigned to each cultivar based on general categories reported by local investigators. For eight cultivars, the model predictions fit the observed data. For four cultivars, the model predictions overestimated disease, likely due to inaccurate estimates of host resistance. Model predictions were inconsistent for one cultivar and for one location. It was concluded that the domain of applicability of LATEBLIGHT can be extended from the range of conditions in Peru for which it has been previously validated to those observed in this study. A sensitivity analysis showed that, within the range of values observed empirically, LATEBLIGHT is more sensitive to changes in variables related to initial inoculum and to weather than to changes in variables relating to host resistance.

  15. A comprehensive simulation model of the performance of photochromic films in absorbance-modulation-optical-lithography

    Directory of Open Access Journals (Sweden)

    Apratim Majumder

    2016-03-01

    Optical lithography is the most prevalent method of fabricating micro- and nano-scale structures in the semiconductor industry, because patterning with photons is fast, accurate and provides high throughput. However, the resolution of this technique is inherently limited by the physical phenomenon of diffraction. Absorbance-Modulation Optical Lithography (AMOL), a recently developed technique, has been successfully demonstrated to circumvent this diffraction limit. AMOL employs a dual-wavelength exposure system in conjunction with spectrally selective reversible photo-transitions in thin films of photochromic molecules to achieve patterning of features with sizes beyond the far-field diffraction limit. We have developed a finite-element-method-based full-electromagnetic-wave solution model that simulates the photochemical processes that occur within the thin film of photochromic molecules under illumination by the exposure and confining wavelengths in AMOL. This model allows us to understand how the material characteristics influence the confinement, to sub-diffraction dimensions, of the transmitted point spread function (PSF) of the exposure wavelength inside the recording medium. The model reported here provides the most comprehensive analysis of the AMOL process to date, and the results show that the most important factors governing the process are the polarization of the two beams, the ratio of the intensities of the two wavelengths, the relative absorption coefficients and the concentration of the photochromic species, the thickness of the photochromic layer, and the quantum yields of the photoreactions at the two wavelengths. The aim of this work is to elucidate the requirements for AMOL to successfully circumvent the far-field diffraction limit.

  16. EMC Simulation and Modeling

    Science.gov (United States)

    Takahashi, Takehiro; Schibuya, Noboru

    EMC simulation is now widely used in the design stage of electronic equipment to reduce electromagnetic noise. Because the electromagnetic behavior calculated by an EMC simulator depends on the EMC model of the equipment supplied as input, the modeling technique is important for obtaining useful results. In this paper, a brief outline of the EMC simulator and the EMC model is given. Some modeling techniques for EMC simulation are also described, with the example of an EMC model of a shielded box with an aperture.

  17. Modeling a photovoltaic energy storage system based on super capacitor, simulation and evaluation of experimental performance

    Science.gov (United States)

    Ben Fathallah, Mohamed Ali; Ben Othman, Afef; Besbes, Mongi

    2018-02-01

    Photovoltaic energy is very important for meeting electrical energy consumption needs in remote areas and for other applications. Energy storage systems are essential to compensate for the intermittent production of photovoltaic energy and to cover peaks in energy demand. The supercapacitor, also known as an electrochemical double-layer capacitor, is a storage device with a very high power density compared to a conventional battery; it can store a large amount of electrical energy over short time periods, which makes it attractive for the storage of photovoltaic energy. On this basis, this paper presents a three-branch RC model of the supercapacitor to describe its different operating dynamics during the charging, discharging and rest phases. After validating the behavior of this model against the experimental study of Zubieta, the supercapacitor performance is demonstrated and compared with a conventional battery in a photovoltaic converter chain powering an AC machine.
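
    The three-branch (immediate/delayed/long-term) RC structure mentioned above can be sketched with a simple forward-Euler simulation of a constant-current charge; the branch resistances and capacitances below are illustrative placeholders rather than Zubieta's identified parameters, and the voltage-dependent capacitance of the immediate branch is ignored:

```python
import numpy as np

# Minimal sketch of a three-branch RC supercapacitor model under constant-current charge.
R = np.array([0.01, 1.0, 10.0])      # ohm: immediate, delayed, long-term branches (illustrative)
C = np.array([100.0, 50.0, 30.0])    # farad (illustrative)
v = np.zeros(3)                      # capacitor voltage in each branch
I, dt = 10.0, 0.01                   # charge current (A), time step (s)

for _ in range(int(60.0 / dt)):      # charge for 60 s
    # Terminal voltage from Kirchhoff's laws: all branches share V, branch currents sum to I.
    V = (I + np.sum(v / R)) / np.sum(1.0 / R)
    i_branch = (V - v) / R
    v += dt * i_branch / C           # forward-Euler update of the capacitor voltages

print(f"terminal voltage after 60 s of {I} A charge: {V:.2f} V")
```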

  18. Nuclear Energy Advanced Modeling and Simulation (NEAMS) Waste Integrated Performance and Safety Codes (IPSC): FY10 development and integration

    International Nuclear Information System (INIS)

    Criscenti, Louise Jacqueline; Sassani, David Carl; Arguello, Jose Guadalupe Jr.; Dewers, Thomas A.; Bouchard, Julie F.; Edwards, Harold Carter; Freeze, Geoffrey A.; Wang, Yifeng; Schultz, Peter Andrew

    2011-01-01

    This report describes the progress in fiscal year 2010 in developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with robust verification, validation, and software quality requirements. Waste IPSC activities in fiscal year 2010 focused on specifying a challenge problem to demonstrate proof of concept, developing a verification and validation plan, and performing an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. This year-end progress report documents the FY10 status of acquisition, development, and integration of thermal-hydrologic-chemical-mechanical (THCM) code capabilities, frameworks, and enabling tools and infrastructure.

  19. Nuclear Energy Advanced Modeling and Simulation (NEAMS) Waste Integrated Performance and Safety Codes (IPSC) : FY10 development and integration.

    Energy Technology Data Exchange (ETDEWEB)

    Criscenti, Louise Jacqueline; Sassani, David Carl; Arguello, Jose Guadalupe, Jr.; Dewers, Thomas A.; Bouchard, Julie F.; Edwards, Harold Carter; Freeze, Geoffrey A.; Wang, Yifeng; Schultz, Peter Andrew

    2011-02-01

    This report describes the progress in fiscal year 2010 in developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with robust verification, validation, and software quality requirements. Waste IPSC activities in fiscal year 2010 focused on specifying a challenge problem to demonstrate proof of concept, developing a verification and validation plan, and performing an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. This year-end progress report documents the FY10 status of acquisition, development, and integration of thermal-hydrologic-chemical-mechanical (THCM) code capabilities, frameworks, and enabling tools and infrastructure.

  20. Bootstrap model selection had similar performance for selecting authentic and noise variables compared to backward variable elimination: a simulation study.

    Science.gov (United States)

    Austin, Peter C

    2008-10-01

    Researchers have proposed using bootstrap resampling in conjunction with automated variable selection methods to identify predictors of an outcome and to develop parsimonious regression models. Using this method, multiple bootstrap samples are drawn from the original data set. Traditional backward variable elimination is used in each bootstrap sample, and the proportion of bootstrap samples in which each candidate variable is identified as an independent predictor of the outcome is determined. The performance of this method for identifying predictor variables has not been examined. Monte Carlo simulation methods were used to determine the ability of bootstrap model selection to correctly identify predictors of an outcome when the variables selected for inclusion in at least 50% of the bootstrap samples are included in the final regression model. We compared the performance of the bootstrap model selection method with that of conventional backward variable elimination. Bootstrap model selection resulted in approximately the same proportion of selected models matching the true regression model as conventional backward variable elimination, and thus performed comparably to backward variable elimination for identifying the true predictors of a binary outcome.
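
    A minimal sketch of the selection procedure described above: bootstrap resampling, p-value-based backward elimination within each resample, and a 50% inclusion threshold. The simulated data, the 0.05 cutoff, and the use of ordinary least squares (rather than logistic regression for a binary outcome, as in the study) are simplifying assumptions:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Illustrative data: 3 true predictors, 5 noise variables (not the paper's simulation design).
n, p_true, p_noise = 200, 3, 5
X = rng.normal(size=(n, p_true + p_noise))
y = X[:, :p_true] @ np.array([1.0, 0.8, 0.5]) + rng.normal(size=n)

def backward_elimination(X, y, alpha=0.05):
    """Drop the least significant variable until all remaining p-values are below alpha."""
    keep = list(range(X.shape[1]))
    while keep:
        fit = sm.OLS(y, sm.add_constant(X[:, keep])).fit()
        pvals = fit.pvalues[1:]                 # skip the intercept
        worst = int(np.argmax(pvals))
        if pvals[worst] < alpha:
            break
        keep.pop(worst)
    return set(keep)

# Bootstrap model selection: variables kept in >= 50% of bootstrap fits are selected.
n_boot = 200
counts = np.zeros(X.shape[1])
for _ in range(n_boot):
    idx = rng.integers(0, n, size=n)
    for j in backward_elimination(X[idx], y[idx]):
        counts[j] += 1
selected = np.where(counts / n_boot >= 0.5)[0]
print("bootstrap-selected variables:", selected)
print("single backward elimination :", sorted(backward_elimination(X, y)))
```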

  1. Determining the energy performance of manually controlled solar shades: A stochastic model based co-simulation analysis

    International Nuclear Information System (INIS)

    Yao, Jian

    2014-01-01

    Highlights: • The driving factor behind adjustment of manually controlled solar shades was determined. • A stochastic model for manual solar shades was constructed using the Markov method. • Co-simulation with EnergyPlus was carried out in BCVTB. • External shading, even manually controlled, should be used in preference to low-E windows. • Previous studies on manual solar shades may overestimate energy savings. - Abstract: Solar shading devices play a significant role in reducing building energy consumption and maintaining a comfortable indoor environment. In this paper, a typical office building with internal roller shades in the hot-summer and cold-winter zone was selected to determine the driving factor behind the control behavior of manual solar shades. Solar radiation was identified as the major factor driving solar shade adjustment, based on field measurements and logit analysis, and a stochastic model for manually adjusted solar shades was then constructed using the Markov method. This model was used in BCVTB for co-simulation with EnergyPlus to determine the impact of shade control behavior on energy performance. The results show that manually adjusted solar shades, whether located inside or outside, achieve relatively high energy savings compared with clear-pane windows, while only external shades perform better than commonly used low-E windows. The simulations also indicate that using an idealized assumption of solar shade adjustment, as most building simulation studies do, may lead to an overestimation of energy savings by about 16–30%. Occupants’ operation of shades needs to respond more effectively to outdoor conditions in order to lower energy consumption, and this improvement can be achieved by using simple strategies as a guide to controlling manual solar shades.
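
    The combination of logit analysis and a Markov chain described above can be sketched as a two-state (shade raised/lowered) chain whose transition probabilities depend on solar radiation through logistic functions; the coefficients and the radiation profile below are illustrative, not the fitted values from the field survey:

```python
import numpy as np

rng = np.random.default_rng(42)

def p_lower(rad_w_m2, b0=-4.0, b1=0.008):
    """Logit-style probability of lowering the shade at a given solar radiation level
    (coefficients are illustrative, not the fitted values from the field measurements)."""
    return 1.0 / (1.0 + np.exp(-(b0 + b1 * rad_w_m2)))

def p_raise(rad_w_m2, b0=1.0, b1=-0.006):
    return 1.0 / (1.0 + np.exp(-(b0 + b1 * rad_w_m2)))

# Simulate one day of hourly shade states (0 = raised, 1 = lowered) as a Markov chain.
radiation = 600 * np.clip(np.sin(np.pi * (np.arange(24) - 6) / 12), 0, None)  # toy profile
state, states = 0, []
for rad in radiation:
    if state == 0 and rng.random() < p_lower(rad):
        state = 1
    elif state == 1 and rng.random() < p_raise(rad):
        state = 0
    states.append(state)
print("hourly shade states:", states)
```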

  2. Comparison of model performance and simulated water balance using NASIM and SWAT for the Wupper River Basin, Germany

    Science.gov (United States)

    Lorza, Paula; Nottebohm, Martin; Scheibel, Marc; aus der Beek, Tim

    2017-04-01

    Under the framework of the Horizon 2020 project BINGO (Bringing INnovation to onGOing water management), climate change impacts on the water cycle in the Wupper catchment area are being studied. For this purpose, a set of hydrological models in NASIM and SWAT has been set up, calibrated, and validated for past conditions using available data. NASIM is a physically based, lumped hydrological model built on the water balance equation. For the upper part of the Dhünn catchment area - the Wupper River's main tributary - a SWAT model was also implemented. Observed discharge and discharge simulated by NASIM and SWAT for the drainage area upstream of the Neumühle hydrometric station (close to the Große Dhünn reservoir's inlet) are compared, as is the simulated water balance of the two models for several hydrological years. While NASIM offers a high level of detail for modelling complex urban areas and the possibility of entering precipitation time series at fine temporal resolution (e.g., minute data), SWAT enables the study of long-term impacts, offering a wide variety of input and output variables including different soil properties, vegetation and land management practices. Besides runoff, sediment and nutrient transport can also be simulated. For most calculations, SWAT operates on a daily time step. The objective of this and future work is to determine catchment responses to different meteorological events and to study the parameter sensitivity of stationary inputs such as soil parameters, vegetation or land use. Model performance is assessed with different statistical metrics (relative volume error, coefficient of determination, and Nash-Sutcliffe Efficiency).
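
    As an illustration of the skill metrics listed at the end of the abstract, a minimal implementation of the Nash-Sutcliffe Efficiency and the relative volume error for paired observed/simulated discharge series (the sample discharge values are invented):

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2); 1 indicates a perfect fit."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def relative_volume_error(obs, sim):
    """Relative bias of the total simulated volume against the observed volume."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return (sim.sum() - obs.sum()) / obs.sum()

# Made-up daily discharge values (m^3/s), purely for illustration.
q_obs = [3.2, 4.1, 6.8, 5.5, 4.0, 3.6]
q_sim = [3.0, 4.5, 6.1, 5.9, 4.2, 3.3]
print(f"NSE = {nash_sutcliffe(q_obs, q_sim):.3f}")
print(f"RVE = {relative_volume_error(q_obs, q_sim):+.1%}")
```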

  3. Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC) : gap analysis for high fidelity and performance assessment code development.

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe, Jr.; Webb, Stephen Walter; Dewers, Thomas A.; Mariner, Paul E.; Edwards, Harold Carter; Fuller, Timothy J.; Freeze, Geoffrey A.; Jove-Colon, Carlos F.; Wang, Yifeng

    2011-03-01

    This report describes a gap analysis performed in the process of developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with rigorous verification, validation, and software quality requirements. The gap analyses documented in this report were performed during an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC, and during follow-on activities that delved into more detailed assessments of the various codes that were acquired, studied, and tested. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. The gap analysis indicates that significant capabilities may already exist in the existing THC codes, although no single code is able to fully account for all physical and chemical processes involved in a waste disposal system. Large gaps exist in modeling chemical processes and their couplings with other processes. The coupling of chemical processes with flow, transport, and mechanical deformation remains challenging. The data for extreme environments (e.g., for elevated temperature and high ionic strength media) that are

  4. New Concept for Museum Storage Buildings – Evaluation of Building Performance Model for Simulation of Storage

    DEFF Research Database (Denmark)

    Christensen, Jørgen Erik; Knudsen, Lise Ræder; Kollias, Christos Georgios

    2016-01-01

    is close to being CO2 neutral. The analysis shows very good agreement between simulations and measurements, meaning that the proposed methods can be used for designing museum storage buildings. The analysis also shows that the weather conditions of previous years affect the indoor environment...... and physical decay of the objects as low as possible. Museum storage buildings should be able to provide a considerably stable indoor environment in terms of temperature and relative humidity. This paper explores how to simulate and build low-energy museum storage buildings, and the paper shows...... that it is possible to make a building with low construction costs, very low running costs and very high quality. In addition, it is described that the energy consumption is only 2% of that of normal HVAC solutions, and this 2% can be delivered by excess wind power from Danish windmills, meaning that the building

  5. LIAR: A COMPUTER PROGRAM FOR THE SIMULATION AND MODELING OF HIGH PERFORMANCE LINACS

    International Nuclear Information System (INIS)

    Adolphsen, Chris

    2003-01-01

    The computer program LIAR ("LInear Accelerator Research code") is a numerical simulation and tracking program for linear colliders. The LIAR project was started at SLAC in August 1995 in order to provide a computing and simulation tool that specifically addresses the needs of high energy linear colliders. LIAR is designed to be used for a variety of different linear accelerators. It has been applied to and checked against the existing Stanford Linear Collider (SLC) as well as the linacs of the proposed Next Linear Collider (NLC) and the proposed Linac Coherent Light Source (LCLS). The program includes wakefield effects, a 4D coupled beam description, specific optimization algorithms and other advanced features. We describe the most important concepts and highlights of the program. After having presented the LIAR program at the LINAC96 and the PAC97 conferences, we now introduce it to the European particle accelerator community.

  6. Comparing the performance of 11 crop simulation models in predicting yield response to nitrogen fertilization

    Czech Academy of Sciences Publication Activity Database

    Salo, T.; Palosuo, T.; Kersebaum, K. C.; Nendel, C.; Angulo, C.; Ewert, F.; Bindi, M.; Calanca, P.; Klein, T.; Moriondo, M.; Ferrise, R.; Olesen, J. E.; Patil, R. H.; Ruget, F.; Takáč, J.; Hlavinka, Petr; Trnka, Miroslav; Rötter, R. P.

    2016-01-01

    Roč. 154, č. 7 (2016), s. 1218-1240 ISSN 0021-8596 R&D Projects: GA MŠk(CZ) LO1415; GA MZe QJ1310123; GA MŠk(CZ) LD13030 EU Projects: European Commission(XE) 268277; European Commission(XE) 292944 Institutional support: RVO:67179843 Keywords : Northern growing conditions * climate change impacts * spring barley * system simulations * soil properties * winter-wheat * dynamics * growth Subject RIV: GC - Agronomy Impact factor: 1.291, year: 2016

  7. Performance of process-based models for simulation of grain N in crop rotations across Europe

    DEFF Research Database (Denmark)

    Yin, Xiaogang; Kersebaum, KC; Kollas, C

    2017-01-01

    and rainfed treatments. Moreover, the multi-model mean provided better predictions of grain N compared to any individual model. With regard to the individual models, DAISY, FASSET, HERMES, MONICA and STICS are suitable for predicting grain N of the main crops in typical European crop rotations, which all...

  8. Modeling and simulation on temperature performance in fiber optic gyroscope fiber coil of shipborne strapdown inertial navigation system

    Science.gov (United States)

    Wang, Yueze; Ma, Lin; Yu, Hao; Gao, Hongyu; Yuan, Yujie

    2016-10-01

    Compared with traditional gyros, the fiber optic gyroscope (FOG), based on the Sagnac effect, has significant features such as long life, low cost and a wide dynamic range. These features have opened up new applications for the gyroscope, not only in industrial areas but also in aerospace. The FOG now plays a very important role in shipborne Strapdown Inertial Navigation Systems (SINS). The fiber coil, one of the most critical components in the FOG, is extremely sensitive to changes in temperature. Here, by studying the environmental temperature in a shipborne SINS, the temperature performance of the FOG was analyzed. First, based on the theory of Shupe non-reciprocal errors caused by temperature, a discrete mathematical formula for the temperature error in the FOG of the SINS was derived. Then an element model of the fiber coil in the SINS was built based on this discrete model of the temperature error in the FOG, and a turn-by-turn quantized temperature bias error model was established. Finally, based on the temperature models mentioned above, the temperature performance of the FOG in the shipborne SINS was analyzed. Using finite element analysis, numerical simulations were carried out to quantitatively analyze the angular error induced by temperature excitation in the SINS. The model was validated by comparing numerical and experimental results.

  9. Performance of process-based models for simulation of grain N in crop rotations across Europe

    Czech Academy of Sciences Publication Activity Database

    Xiaogang, Y.; Kesebaum, K. C.; Kollas, C.; Manevski, K.; Baby, S.; Beaudoin, N.; Öztürk, I.; Gaiser, T.; Wu, L.; Hoffmann, M.; Charfeddine, M.; Conradt, T.; Constantin, J.; Ewert, F.; de Cortazar-Atauri, I. G.; Giglio, L.; Hlavinka, Petr; Hoffmann, H.; Launay, M.; Louarn, G.; Manderscheid, R.; Mary, B.; Mirschel, W.; Nendel, C.; Pacholski, A.; Palouso, T.; Ripoche-Wachter, D.; Rötter, R. P.; Ruget, F.; Sharif, B.; Trnka, Miroslav; Ventrella, D.; Weigel, H-J.; Olesen, J. E.

    2017-01-01

    Roč. 154, JUN (2017), s. 63-77 ISSN 0308-521X R&D Projects: GA MŠk(CZ) LO1415; GA MZe QJ1310123 Institutional support: RVO:67179843 Keywords : Calibration * Crop model * Crop rotation * Grain N content * Model evaluation * Model initialization Subject RIV: EH - Ecology, Behaviour OBOR OECD: Environmental sciences (social aspects to be 5.7) Impact factor: 2.571, year: 2016

  10. Challenge problem and milestones for : Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC).

    Energy Technology Data Exchange (ETDEWEB)

    Freeze, Geoffrey A.; Wang, Yifeng; Howard, Robert; McNeish, Jerry A.; Schultz, Peter Andrew; Arguello, Jose Guadalupe, Jr.

    2010-09-01

    This report describes the specification of a challenge problem and associated challenge milestones for the Waste Integrated Performance and Safety Codes (IPSC) supporting the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The NEAMS challenge problems are designed to demonstrate proof of concept and progress towards IPSC goals. The goal of the Waste IPSC is to develop an integrated suite of modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with robust verification, validation, and software quality requirements. To demonstrate proof of concept and progress towards these goals and requirements, a Waste IPSC challenge problem is specified that includes coupled thermal-hydrologic-chemical-mechanical (THCM) processes that describe (1) the degradation of a borosilicate glass waste form and the corresponding mobilization of radionuclides (i.e., the processes that produce the radionuclide source term), (2) the associated near-field physical and chemical environment for waste emplacement within a salt formation, and (3) radionuclide transport in the near field (i.e., through the engineered components - waste form, waste package, and backfill - and the immediately adjacent salt). The initial details of a set of challenge milestones that collectively comprise the full challenge problem are also specified.

  11. Challenge problem and milestones for: Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC)

    International Nuclear Information System (INIS)

    Freeze, Geoffrey A.; Wang, Yifeng; Howard, Robert; McNeish, Jerry A.; Schultz, Peter Andrew; Arguello, Jose Guadalupe Jr.

    2010-01-01

    This report describes the specification of a challenge problem and associated challenge milestones for the Waste Integrated Performance and Safety Codes (IPSC) supporting the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The NEAMS challenge problems are designed to demonstrate proof of concept and progress towards IPSC goals. The goal of the Waste IPSC is to develop an integrated suite of modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with robust verification, validation, and software quality requirements. To demonstrate proof of concept and progress towards these goals and requirements, a Waste IPSC challenge problem is specified that includes coupled thermal-hydrologic-chemical-mechanical (THCM) processes that describe (1) the degradation of a borosilicate glass waste form and the corresponding mobilization of radionuclides (i.e., the processes that produce the radionuclide source term), (2) the associated near-field physical and chemical environment for waste emplacement within a salt formation, and (3) radionuclide transport in the near field (i.e., through the engineered components - waste form, waste package, and backfill - and the immediately adjacent salt). The initial details of a set of challenge milestones that collectively comprise the full challenge problem are also specified.

  12. Contribution to the modelling and analysis of logistics system performance by Petri nets and simulation models: Application in a supply chain

    Science.gov (United States)

    Azougagh, Yassine; Benhida, Khalid; Elfezazi, Said

    2016-02-01

    In this paper, the focus is on studying the performance of complex systems in a supply chain context by developing a structured modelling approach based on the ASDI methodology (Analysis, Specification, Design and Implementation), combining Petri net modelling with simulation using ARENA. The linear approach typically followed for this kind of problem faces modelling difficulties due to the complexity and the number of parameters of concern. The approach used in this work therefore structures the modelling so as to cover all aspects of the performance study. The structured modelling approach is first introduced and then applied to the case of an industrial system in the phosphate sector. Results for the performance indicators obtained from the models developed made it possible to test the behaviour and fluctuations of this system and to develop improved models of the current situation. In addition, it is shown how the ARENA software can be adopted to simulate complex systems effectively. The method in this research can be applied to investigate various improvement scenarios and their consequences before implementing them in reality.

  13. A high performance computing framework for physics-based modeling and simulation of military ground vehicles

    Science.gov (United States)

    Negrut, Dan; Lamb, David; Gorsich, David

    2011-06-01

    This paper describes a software infrastructure made up of tools and libraries designed to assist developers in implementing computational dynamics applications running on heterogeneous and distributed computing environments. Together, these tools and libraries compose a so-called Heterogeneous Computing Template (HCT). The heterogeneous and distributed computing hardware infrastructure is assumed herein to be made up of a combination of CPUs and Graphics Processing Units (GPUs). The computational dynamics applications targeted to execute on such a hardware topology include many-body dynamics, smoothed-particle hydrodynamics (SPH) fluid simulation, and fluid-solid interaction analysis. The underlying theme of the solution approach embraced by HCT is that of partitioning the domain of interest into a number of subdomains that are each managed by a separate core/accelerator (CPU/GPU) pair. Several components at the core of HCT enable the envisioned distributed computing approach to large-scale dynamical system simulation: (a) the ability to partition the problem according to the one-to-one mapping, i.e., the spatial subdivision discussed above (pre-processing); (b) a protocol for passing data between any two co-processors; (c) algorithms for element proximity computation; and (d) the ability to carry out post-processing in a distributed fashion. In this contribution, components (a) and (b) of the HCT are demonstrated via the example of the Discrete Element Method (DEM) for rigid body dynamics with friction and contact. The collision detection task required in frictional-contact dynamics (task (c) above) is shown to gain two orders of magnitude in efficiency on the GPU when compared to traditional sequential implementations. Note: Reference herein to any specific commercial products, process, or service by trade name, trademark, manufacturer, or otherwise, does not imply its endorsement, recommendation, or favoring by the United States Army. The views and

  14. Performance Evaluation of a PID and a Fuzzy PID Controllers Designed for Controlling a Simulated Quadcopter Rotational Dynamics Model

    Directory of Open Access Journals (Sweden)

    Laith Jasim Saud

    2017-07-01

    This work is concerned with designing two types of controllers, a PID and a Fuzzy PID (FPID), to be used for flying and stabilizing a quadcopter. The designed controllers have been tuned, tested, and compared using two performance indices, the Integral Square Error (ISE) and the Integral Absolute Error (IAE), as well as response characteristics such as rise time, overshoot, settling time, and steady-state error. To test the controllers, a quadcopter mathematical model has been developed. The model concentrates on the rotational dynamics of the quadcopter, i.e. the roll, pitch, and yaw variables. The work has been simulated with MATLAB. To make testing the simulated model and the controllers more realistic, the test signals were applied by a user through a joystick interfaced to the computer. The results obtained indicate a general superiority in performance of the Fuzzy PID controller over the PID controller used in this work. This conclusion is supported by the following figures: lower ISE for the roll, pitch, and yaw respectively; lower IAE for the roll, pitch, and yaw respectively; lower rise time and settling time for the roll and pitch; and lower settling time for the yaw. Moreover, the FPID gave zero overshoot for the roll, pitch, and yaw, whereas the PID did not. Both controllers gave zero steady-state error, with similar rise times, for the yaw. This superiority of the FPID controller arises because its fuzzy part continuously adapts the parameters of the PID part online.
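
    As a sketch of how such a comparison is scored, the snippet below runs a discrete PID loop on a toy first-order plant (not the quadcopter rotational model) and accumulates the ISE and IAE indices; the gains and plant constants are arbitrary illustrative values:

```python
import numpy as np

# Toy first-order plant dy/dt = (-y + u) / tau, driven toward a unit step setpoint.
tau, dt, t_end = 0.5, 0.01, 5.0
kp, ki, kd = 2.0, 1.0, 0.1          # illustrative PID gains, not tuned for a quadcopter

y, integ, prev_err = 0.0, 0.0, 0.0
ise = iae = 0.0
setpoint = 1.0

for _ in np.arange(0.0, t_end, dt):
    err = setpoint - y
    integ += err * dt
    deriv = (err - prev_err) / dt
    u = kp * err + ki * integ + kd * deriv   # PID control law
    prev_err = err
    y += dt * (-y + u) / tau                 # forward-Euler plant update
    ise += err ** 2 * dt                     # Integral Square Error
    iae += abs(err) * dt                     # Integral Absolute Error

print(f"ISE = {ise:.4f}, IAE = {iae:.4f}, final output = {y:.3f}")
```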

  15. A Simulation Model for Setting Terms for Performance Based Contract Terms

    Science.gov (United States)

    2010-05-01

    The report draws an analogy between performance-based logistics (PBL, also called performance-based life cycle product support) and the fictional superhero Batman: like Batman, PBL has received a poor reputation. (See Fowler, A. (2009), "Misunderstood Superheroes", www.dau.mil/pubscats/PubsCats/atl/fow_jf09.pdf, accessed March 28, 2010.)

  16. Modeling and Simulation of PV Array and its Performance Enhancement Using MPPT (P&O) Technique

    OpenAIRE

    T.Chaitanya; J.Surya Kumari; Ch.Saibabu

    2011-01-01

    Renewable energy will be an increasingly important part of power generation in the new millennium. Photovoltaic (PV) systems produce DC electricity when sunlight shines on the PV array, require little maintenance, and emit no noise, among other advantages. Day by day the energy demand is increasing, and thus the need for a renewable source that will not harm the environment is of prime importance. The proposed model uses the basic circuit equation of photovoltaic solar cells, including the ef...
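
    The perturb-and-observe (P&O) MPPT technique named in the title reduces to a short loop: perturb the operating voltage, observe the change in power, and keep stepping in whichever direction increases power. A minimal sketch with a made-up power-voltage curve standing in for the PV array model:

```python
def pv_power(v):
    """Toy P-V curve with a maximum near 17 V; stands in for the real array model."""
    return max(0.0, 3.0 * v - 0.088 * v * v)

def perturb_and_observe(v=12.0, step=0.2, iters=200):
    p_prev = pv_power(v)
    direction = +1
    for _ in range(iters):
        v += direction * step          # perturb the operating voltage
        p = pv_power(v)
        if p < p_prev:                 # observe: power dropped, so reverse direction
            direction = -direction
        p_prev = p
    return v, p_prev

v_mpp, p_mpp = perturb_and_observe()
print(f"converged near V = {v_mpp:.2f} V, P = {p_mpp:.1f} W")
```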

  17. Simulation in Complex Modelling

    DEFF Research Database (Denmark)

    Nicholas, Paul; Ramsgaard Thomsen, Mette; Tamke, Martin

    2017-01-01

    This paper will discuss the role of simulation in extended architectural design modelling. As a framing paper, the aim is to present and discuss the role of integrated design simulation and feedback between design and simulation in a series of projects under the Complex Modelling framework. Complex...... Restraint developed for the research exhibition Complex Modelling, Meldahls Smedie Gallery, Copenhagen in 2016. Where the direct project aims and outcomes have been reported elsewhere, the aim for this paper is to discuss overarching strategies for working with design integrated simulation....

  18. The Effect of Bypass Nozzle Exit Area on Fan Aerodynamic Performance and Noise in a Model Turbofan Simulator

    Science.gov (United States)

    Hughes, Christopher E.; Podboy, Gary, G.; Woodward, Richard P.; Jeracki, Robert, J.

    2013-01-01

    The design of effective new technologies to reduce aircraft propulsion noise is dependent on identifying and understanding the noise sources and noise generation mechanisms in the modern turbofan engine, as well as determining their contribution to the overall aircraft noise signature. Therefore, a comprehensive aeroacoustic wind tunnel test program was conducted called the Fan Broadband Source Diagnostic Test as part of the NASA Quiet Aircraft Technology program. The test was performed in the anechoic NASA Glenn 9- by 15-Foot Low Speed Wind Tunnel using a 1/5 scale model turbofan simulator which represented a current generation, medium pressure ratio, high bypass turbofan aircraft engine. The investigation focused on simulating in model scale only the bypass section of the turbofan engine. The test objectives were to: identify the noise sources within the model and determine their noise level; investigate several component design technologies by determining their impact on the aerodynamic and acoustic performance of the fan stage; and conduct detailed flow diagnostics within the fan flow field to characterize the physics of the noise generation mechanisms in a turbofan model. This report discusses results obtained for one aspect of the Source Diagnostic Test that investigated the effect of the bypass or fan nozzle exit area on the bypass stage aerodynamic performance, specifically the fan and outlet guide vanes or stators, as well as the farfield acoustic noise level. The aerodynamic performance, farfield acoustics, and Laser Doppler Velocimeter flow diagnostic results are presented for the fan and four different fixed-area bypass nozzle configurations. The nozzles simulated fixed engine operating lines and encompassed the fan stage operating envelope from near stall to cruise. One nozzle was selected as a baseline reference, representing the nozzle area which would achieve the design point operating conditions and fan stage performance. The total area change from

  19. Designing Citizen Business Loan Model to Reduce Non-Performing Loan: An Agent-based Modeling and Simulation Approach in Regional Development

    Directory of Open Access Journals (Sweden)

    Moses L Singgih

    2015-09-01

    Citizen Business Loan (CBL) is a poverty-alleviation program based on the economic empowerment of small and medium enterprises. This study focuses on the implementation of CBL at Regional Development Bank branch X. The problem is the existence of interdependencies between the CBL's implementers (the bank) and the uncertainty of debtors' ability to repay the credit. The impact of this circumstance is that the non-performing loan (NPL) ratio becomes relatively high (22%). The ultimate objective is to minimize the NPL by designing an agent-based model that can represent the problem through a simulation using agent-based modeling and simulation (ABMS). The model manages the probability that a debtor repays or defaults based on the five C categories inherent to each debtor: character, capacity, capital, condition, and collateral. Two improvement scenarios are proposed in this model. The first scenario involves only the first category of debtors in the simulation; the result of this scenario is an NPL value of 0%. The second scenario includes the first and second categories of debtors in the simulation, resulting in an NPL value between 4.6% and 11.4%.
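
    A minimal agent-based sketch of the idea: each debtor agent carries a repayment probability tied to its credit category, and the NPL is the share of loan value that goes unpaid. The category probabilities and loan amounts below are invented for illustration and are not the calibrated values from the study:

```python
import random

random.seed(1)

# Hypothetical repayment probabilities per credit category (not the paper's values).
REPAY_PROB = {"A": 0.99, "B": 0.90, "C": 0.75}

def simulate_npl(debtors):
    """debtors: list of (category, loan_amount); returns NPL as the unpaid share of total value."""
    total = unpaid = 0.0
    for category, amount in debtors:
        total += amount
        if random.random() > REPAY_PROB[category]:   # this agent defaults
            unpaid += amount
    return unpaid / total

# Scenario 1: only category-A debtors; Scenario 2: categories A and B mixed.
scenario1 = [("A", 10_000) for _ in range(500)]
scenario2 = [("A", 10_000) for _ in range(300)] + [("B", 10_000) for _ in range(200)]
print(f"scenario 1 NPL: {simulate_npl(scenario1):.1%}")
print(f"scenario 2 NPL: {simulate_npl(scenario2):.1%}")
```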

  20. Solar Irradiance from GOES Albedo performance in a Hydrologic Model Simulation of Snowmelt Runoff

    Science.gov (United States)

    Sumargo, E.; Cayan, D. R.; McGurk, B. J.

    2015-12-01

    In many hydrologic modeling applications, solar radiation has been parameterized using commonly available measures, such as the daily temperature range, because the in situ solar radiation measurement network is sparse. However, these parameterized estimates often carry significant biases. Here we test hourly solar irradiance derived from the Geostationary Operational Environmental Satellite (GOES) visible albedo product using several established algorithms. Focusing on the Sierra Nevada and White Mountain in California, we compared the GOES irradiance and that from a traditional temperature-based algorithm with incoming irradiance from pyranometers at 19 stations. The GOES-based estimates yielded a 21-27% reduction in root-mean-squared error (averaged over the 19 sites). The derived irradiance is then prescribed as an input to the Precipitation-Runoff Modeling System (PRMS). We constrain our experiment to the Tuolumne River watershed and focus on the winters and springs of 1996-2014. A root-mean-squared error reduction of 2-6% in daily inflow to Hetch Hetchy, at the lower end of the Tuolumne catchment, was achieved by incorporating the insolation estimates at only 8 out of 280 Hydrologic Response Units (HRUs) within the basin. Our ongoing work endeavors to apply satellite-derived irradiance at each individual HRU.
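
    The percentage improvements quoted above amount to RMSE comparisons of two irradiance estimates against the pyranometer record; a minimal sketch of that comparison with invented hourly values:

```python
import numpy as np

def rmse(est, obs):
    est, obs = np.asarray(est, float), np.asarray(obs, float)
    return np.sqrt(np.mean((est - obs) ** 2))

# Invented hourly irradiance samples (W/m^2), purely to illustrate the comparison.
pyranometer = [120, 350, 610, 780, 700, 430, 150]
temp_based  = [180, 300, 520, 860, 640, 510, 100]
goes_based  = [135, 340, 590, 800, 690, 450, 140]

r_temp, r_goes = rmse(temp_based, pyranometer), rmse(goes_based, pyranometer)
print(f"temperature-based RMSE: {r_temp:.1f} W/m^2")
print(f"GOES-derived RMSE     : {r_goes:.1f} W/m^2")
print(f"RMSE reduction        : {(1 - r_goes / r_temp):.0%}")
```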

  1. Methods for implementing Building Information Modeling and Building Performance Simulation approaches

    DEFF Research Database (Denmark)

    Mondrup, Thomas Fænø

    methodologies. Thesis studies showed that BIM approaches have the potential to improve AEC/FM communication and collaboration. BIM is by its nature multidisciplinary, bringing AEC/FM project participants together and creating constant communication. However, BIM adoption can lead to technical challenges......, for example, getting BIM-compatible tools to communicate properly. Furthermore, BIM adoption requires organizational change, that is changes in AEC/FM work practices and interpersonal dynamics. Consequently, to ensure that the adoption of BIM is successful, it is recommended that common IT regulations......, such as the Industry Foundation Classes (IFC), Information Delivery Manual (IDM), and Model View Definition (MVD), are proposed to provide clarification and consistency for BIM and BPS adoption, particularly, for BIM and BPS information exchange. As part of the thesis, a modular IDM Framework to define and organize...

  2. Experimental measurements and theoretical model of the cryogenic performance of bialkali photocathode and characterization with Monte Carlo simulation

    Directory of Open Access Journals (Sweden)

    Huamu Xie

    2016-10-01

    High-average-current, high-brightness electron sources have important applications, such as in high-repetition-rate free-electron lasers or in the electron cooling of hadrons. Bialkali photocathodes are promising high-quantum-efficiency (QE) cathode materials, while superconducting rf (SRF) electron guns offer continuous-mode operation at high acceleration, as is needed for high-brightness electron sources. Thus, we must have a comprehensive understanding of the performance of bialkali photocathodes at cryogenic temperatures when they are to be used in SRF guns. To remove the heat produced by the radio-frequency field in these guns, the cathode should be cooled to cryogenic temperatures. We recorded an 80% reduction of the QE upon cooling the K_{2}CsSb cathode from room temperature down to the temperature of liquid nitrogen in Brookhaven National Laboratory (BNL)'s 704 MHz SRF gun. We conducted several experiments to identify the underlying mechanism of this reduction. The change in the spectral response of the bialkali photocathode when cooled from room temperature (300 K) to 166 K suggests that a change in the ionization energy (defined as the energy gap from the top of the valence band to the vacuum level) is the main reason for this reduction. We developed an analytical model of the process, based on Spicer's three-step model. The change in ionization energy with falling temperature gives a simplified description of the QE's temperature dependence. We also developed a 2D Monte Carlo code to simulate photoemission that accounts for the wavelength-dependent photon absorption in the first step, the scattering and diffusion in the second step, and the momentum conservation in the emission step. From this simulation, we established a correlation between ionization energy and reduction in the QE. The simulation yielded results comparable to those from the analytical model. The simulation offers us additional capabilities such as calculation

  3. Investigation the performance of 0-D and 3-d combustion simulation softwares for modelling HCCI engine with high air excess ratios

    Directory of Open Access Journals (Sweden)

    Gökhan Coşkun

    2017-10-01

    In this study, the performance of zero-dimensional (0-D) and three-dimensional (3-D) simulation codes used to simulate a homogeneous charge compression ignition (HCCI) engine fueled with Primary Reference Fuel (PRF, 85% iso-octane and 15% n-heptane) was investigated. The 0-D code, SRM Suite (Stochastic Reactor Model), which simulates engine combustion using the stochastic reactor model technique, was used, while Ansys Fluent, a computational fluid dynamics (CFD) code, was used for the 3-D engine combustion simulations. The simulations from both commercial codes were evaluated in terms of combustion, heat transfer and emissions in an HCCI engine. A chemical kinetic mechanism developed by Tsurushima, comprising 33 species and 38 reactions for the surrogate PRF fuel, was used for the combustion simulations. The analysis showed that each code has advantages over the other.

  4. Human Performance Modeling in Military Simulation: Current State of the Art and the Way Ahead (2002 TTCP HUM Group Meeting)

    National Research Council Canada - National Science Library

    2004-01-01

    .... This report examines the requirements for human performance modeling within the military, assesses the state of the practice in current operational models, documents ongoing human performance research and development (R and D...

  5. Improvements in Thermal Performance of Mango Hot-water Treatment Equipments: Data Analysis, Mathematical Modelling and Numerical-computational Simulation

    Directory of Open Access Journals (Sweden)

    Elder M. Mendoza Orbegoso

    2017-06-01

    Full Text Available Mango is one of the most popular and best-paid tropical fruits in worldwide markets; its exportation is regulated under phytosanitary quality control aimed at killing the “fruit fly”. Thus, mangoes must be subjected to a hot-water treatment process that involves their immersion in hot water over a period of time. In this work, field measurements and analytical and simulation studies are carried out on available hot-water treatment equipment called “Original” that complies only with United States phytosanitary protocols. These approaches are used to characterize the fluid-dynamic and thermal behaviours that occur during the mangoes’ hot-water treatment process. Then, an analytical model and computational fluid dynamics simulations are developed for designing new hot-water treatment equipment called “Hybrid” that simultaneously meets both United States and Japan phytosanitary certifications. Comparisons of analytical results with field measurements demonstrate that the “Hybrid” equipment offers better fluid-dynamic and thermal performance than the “Original” equipment.

  6. NIF capsule performance modeling

    Directory of Open Access Journals (Sweden)

    Weber S.

    2013-11-01

    Full Text Available Post-shot modeling of NIF capsule implosions was performed in order to validate our physical and numerical models. Cryogenic layered target implosions and experiments with surrogate targets produce an abundance of capsule performance data including implosion velocity, remaining ablator mass, times of peak x-ray and neutron emission, core image size, core symmetry, neutron yield, and x-ray spectra. We have attempted to match the integrated data set with capsule-only simulations by adjusting the drive and other physics parameters within expected uncertainties. The simulations include interface roughness, time-dependent symmetry, and a model of mix. We were able to match many of the measured performance parameters for a selection of shots.

  7. Scientific Modeling and simulations

    CERN Document Server

    Diaz de la Rubia, Tomás

    2009-01-01

    Showcases the conceptual advantages of modeling which, coupled with the unprecedented computing power available through simulations, allow scientists to tackle the formidable problems of our society, such as the search for hydrocarbons, understanding the structure of a virus, or the intersection between simulations and real data in extreme environments

  8. Simulating Performance Risk for Lighting Retrofit Decisions

    Directory of Open Access Journals (Sweden)

    Jia Hu

    2015-05-01

    Full Text Available In building retrofit projects, dynamic simulations are performed to simulate building performance. Uncertainty may negatively affect model calibration and predicted lighting energy savings, which increases the chance of default on performance-based contracts. Therefore, the aim of this paper is to develop a simulation-based method that can analyze lighting performance risk in lighting retrofit decisions. The method uses a surrogate model, which is constructed by adaptively selecting sample points and generating approximation surfaces with fast computing time. The surrogate model is a replacement for the computation-intensive simulation process. A statistical method is developed to generate extreme weather profiles based on 20 years of historical weather data. A stochastic occupancy model was created using actual occupancy data to generate realistic occupancy patterns. Energy usage of lighting and of heating, ventilation, and air conditioning (HVAC) is simulated using EnergyPlus. The method can evaluate the influence of different risk factors (e.g., variation of luminaire input wattage, varying weather conditions) on lighting and HVAC energy consumption and lighting electricity demand. Probability distributions are generated to quantify the risk values. A case study was conducted to demonstrate and validate the methods. The surrogate model is a good solution for quantifying the risk factors and the probability distribution of the building performance.
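
    A minimal sketch of the surrogate-plus-Monte-Carlo idea described above, under stated assumptions: a cheap polynomial surrogate is fitted to a handful of runs of a placeholder "expensive" simulator, and uncertain risk factors are then sampled through it to obtain a probability distribution. The function names, factor ranges and coefficients are hypothetical; the paper's adaptive sampling, occupancy model and EnergyPlus simulations are not reproduced here.

    ```python
    # Minimal sketch of surrogate-based risk analysis (not the paper's adaptive
    # sampling or EnergyPlus model). A cheap polynomial surrogate stands in for an
    # expensive simulator; risk factors are then Monte Carlo sampled through it.
    import numpy as np

    rng = np.random.default_rng(0)

    def expensive_simulation(wattage, occupancy):
        """Placeholder for a slow building-energy run (purely illustrative)."""
        return 120.0 * wattage + 35.0 * occupancy + 5.0 * wattage * occupancy

    # 1) Sample a small design of experiments and fit a quadratic surrogate.
    X = rng.uniform([0.8, 0.4], [1.2, 1.0], size=(30, 2))   # wattage, occupancy factors
    y = np.array([expensive_simulation(w, o) for w, o in X])
    A = np.column_stack([np.ones(len(X)), X, X**2, X[:, :1] * X[:, 1:]])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)

    def surrogate(w, o):
        return np.dot([1.0, w, o, w * w, o * o, w * o], coef)

    # 2) Propagate uncertain risk factors through the fast surrogate.
    wattage = rng.normal(1.0, 0.05, 20_000)      # luminaire input-wattage variation
    occupancy = rng.beta(2, 2, 20_000)           # stochastic occupancy factor
    energy = np.array([surrogate(w, o) for w, o in zip(wattage, occupancy)])

    print("mean annual-energy proxy:", energy.mean().round(1))
    print("95th-percentile (risk) value:", np.percentile(energy, 95).round(1))
    ```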

  9. Computer Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Pronskikh, V. S. [Fermilab

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of the scientific practice of high importance and have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model’s relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model’s general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them) needs to be made. Holding on to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement, I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviate the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.

  10. Automated Simulation Model Generation

    NARCIS (Netherlands)

    Huang, Y.

    2013-01-01

    One of today's challenges in the field of modeling and simulation is to model increasingly larger and more complex systems. Complex models take a long time to develop and incur high costs. With the advances in data collection technologies and the more popular use of computer-aided systems, more data has become

  11. Improving the performance of a Seawater Greenhouse desalination system by assessment of simulation models for different condensers

    Energy Technology Data Exchange (ETDEWEB)

    Mahmoudi, Hacene [Laboratory of Water and Environment, Hassiba Ben Bouali University, Chlef BP151 (Algeria); Faculty of Sciences and Engineering Sciences, Hassiba Ben Bouali University, Chlef BP151 (Algeria); Spahis, Nawel [Faculty of Sciences and Engineering Sciences, Hassiba Ben Bouali University, Chlef BP151 (Algeria); Abdul-Wahab, Sabah A. [College of Engineering, P.O. Box 33, Sultan Qaboos University, Al-Khod 123, Muscat (Oman); Sablani, Shyam S. [Biological Systems Engineering, Washington State University, Pullman, WA (United States); Goosen, Mattheus F.A. [Alfaisal University, P.O. Box 50927, Riyadh 11533 (Saudi Arabia)

    2010-10-15

    The main aim of this paper was the development of a mathematical model for a new proposed passive condenser in order to enhance the performance of a humidification-dehumidification Seawater Greenhouse desalination system. Seawater Greenhouse desalination is used to create a cool environment and at the same time to produce fresh water for irrigation of crops grown inside the unit. The condenser in particular is currently one of the main bottlenecks in the commercialization of the technology. In addition to the current pump-driven condenser, two new designs were considered: a passive cooling system with a condenser immersed in a water basin, and an external passive condenser connected to a basin of water placed on top of the cooling unit. The simulated condensate values for the proposed passive cooling condenser were compared with the actual measured values of the installed condenser. Preliminary results suggest that the passive condenser has a much greater water production capacity than the existing pump-driven system. While the model for the proposed system still needs to be validated experimentally, the initial study indicates that the passive containment cooling system is a promising improvement in the further development of greenhouse desalination. (author)

  12. Using sea surface temperatures to improve performance of single dynamical downscaling model in flood simulation under climate change

    Science.gov (United States)

    Chao, Y.; Cheng, C. T.; Hsiao, Y. H.; Hsu, C. T.; Yeh, K. C.; Liu, P. L.

    2017-12-01

    On average, 5.3 typhoons hit Taiwan per year in the last decade. Typhoon Morakot in 2009, the most severe of them, caused huge damage in Taiwan, including 677 casualties and roughly NT 110 billion (3.3 billion USD) in economic loss. Some studies have documented that typhoon frequency will decrease but typhoon intensity will increase in the western North Pacific region. High-resolution dynamical models are usually preferred for projecting extreme events, because coarse-resolution models cannot simulate intense extreme events. Under that consideration, dynamically downscaled climate data were chosen to describe typhoons satisfactorily; this research used simulation data from the AGCM of the Meteorological Research Institute (MRI-AGCM). Because dynamical downscaling methods consume massive computing power and the number of typhoons in a single model simulation is very limited, using dynamically downscaled data alone could cause uncertainty in disaster risk assessment. In order to mitigate this problem, this research used four sea surface temperatures (SSTs) to increase the number of climate change scenarios under RCP 8.5. In this way, the MRI-AGCMs project 191 extreme typhoons affecting Taiwan (a typhoon is counted when its center enters the 300 km sea area around Taiwan) in the late 21st century. SOBEK, a two-dimensional flood simulation model, was used to assess the flood risk under the four SST climate change scenarios in Tainan, Taiwan. The results show that the uncertainty of future flood risk assessment for Tainan, Taiwan in the late 21st century is significantly decreased. Using four SSTs can efficiently alleviate the problem of limited typhoon numbers in a single model simulation.

  13. AEGIS geologic simulation model

    International Nuclear Information System (INIS)

    Foley, M.G.

    1982-01-01

    The Geologic Simulation Model (GSM) is used by the AEGIS (Assessment of Effectiveness of Geologic Isolation Systems) program at the Pacific Northwest Laboratory to simulate the dynamic geology and hydrology of a geologic nuclear waste repository site over a million-year period following repository closure. The GSM helps to organize geologic/hydrologic data; to focus attention on active natural processes by requiring their simulation; and, through interactive simulation and calibration, to reduce subjective evaluations of the geologic system. During each computer run, the GSM produces a million-year geologic history that is possible for the region and the repository site. In addition, the GSM records in permanent history files everything that occurred during that time span. Statistical analyses of data in the history files of several hundred simulations are used to classify typical evolutionary paths, to establish the probabilities associated with deviations from the typical paths, and to determine which types of perturbations of the geologic/hydrologic system, if any, are most likely to occur. These simulations will be evaluated by geologists familiar with the repository region to determine validity of the results. Perturbed systems that are determined to be the most realistic, within whatever probability limits are established, will be used for the analyses that involve radionuclide transport and dose models. The GSM is designed to be continuously refined and updated. Simulation models are site specific, and, although the submodels may have limited general applicability, the input data requirements necessitate detailed characterization of each site before application.

  14. Validation of simulation models

    DEFF Research Database (Denmark)

    Rehman, Muniza; Pedersen, Stig Andur

    2012-01-01

    In philosophy of science, interest in computational models and simulations has increased greatly during the past decades. Different positions regarding the validity of models have emerged, but these views have not succeeded in capturing the diversity of validation methods. The wide variety...

  15. High performance electromagnetic simulation tools

    Science.gov (United States)

    Gedney, Stephen D.; Whites, Keith W.

    1994-10-01

    Army Research Office Grant #DAAH04-93-G-0453 has supported the purchase of 24 additional compute nodes that were installed in the Intel iPSC/860 hypercube at the University of Kentucky (UK), rendering a 32-node multiprocessor. This facility has allowed the investigators to explore and extend the boundaries of electromagnetic simulation for important areas of defense concerns including microwave monolithic integrated circuit (MMIC) design/analysis and electromagnetic materials research and development. The iPSC/860 has also provided an ideal platform for MMIC circuit simulations. A number of parallel methods based on direct time-domain solutions of Maxwell's equations have been developed on the iPSC/860, including a parallel finite-difference time-domain (FDTD) algorithm, and a parallel planar generalized Yee-algorithm (PGY). The iPSC/860 has also provided an ideal platform on which to develop a 'virtual laboratory' to numerically analyze, scientifically study and develop new types of materials with beneficial electromagnetic properties. These materials simulations are capable of assembling hundreds of microscopic inclusions from which an electromagnetic full-wave solution will be obtained in toto. This powerful simulation tool has enabled research of the full-wave analysis of complex multicomponent MMIC devices and the electromagnetic properties of many types of materials to be performed numerically rather than strictly in the laboratory.
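
    For readers unfamiliar with the method named above, a minimal serial 1-D FDTD (Yee) update is sketched below in normalized units. It is not the grant's parallel iPSC/860 code; it only illustrates the leapfrog E/H update that a domain-decomposed parallel implementation distributes across processors, exchanging boundary values between neighbours at each step.

    ```python
    # Minimal serial 1-D FDTD (Yee) update, illustrating the kind of time-domain
    # Maxwell solver that such parallel codes distribute across processors.
    # Free space, normalized field units, additive Gaussian pulse source.
    import numpy as np

    nx, nsteps = 400, 1000
    ez = np.zeros(nx)          # electric field
    hy = np.zeros(nx - 1)      # magnetic field (staggered half a cell)
    courant = 0.5              # Courant number S = c*dt/dx <= 1 for stability

    for n in range(nsteps):
        hy += courant * (ez[1:] - ez[:-1])                    # update H from curl E
        ez[1:-1] += courant * (hy[1:] - hy[:-1])              # update E from curl H
        ez[nx // 4] += np.exp(-0.5 * ((n - 40) / 12.0) ** 2)  # soft Gaussian source

    print("peak |Ez| after propagation:", np.abs(ez).max().round(3))
    # A domain-decomposed parallel version exchanges only the boundary field values
    # between neighboring subdomains at each step, which is why FDTD scales well.
    ```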

  16. Interaction and Impact Studies for Distributed Energy Resource, Transactive Energy, and Electric Grid, using High Performance Computing-based Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Kelley, B M

    2017-02-10

    The electric utility industry is undergoing significant transformations in its operation model, including a greater emphasis on automation, monitoring technologies, and distributed energy resource management systems (DERMS). While these changes and new technologies drive greater efficiency and reliability, they may also introduce new vectors of cyber attack. The appropriate cybersecurity controls to address and mitigate these newly introduced attack vectors and potential vulnerabilities are still widely unknown, and the performance of such controls is difficult to vet. This proposal argues that modeling and simulation (M&S) is a necessary tool to address and better understand the problems introduced by emerging technologies for the grid. M&S will provide electric utilities a platform to model their transmission and distribution systems and to run various simulations against the model to better understand the operational impact and performance of cybersecurity controls.

  17. PSH Transient Simulation Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Muljadi, Eduard [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-12-21

    PSH Transient Simulation Modeling presentation from the WPTO FY14 - FY16 Peer Review. Transient effects are an important consideration when designing a PSH system, yet numerical techniques for hydraulic transient analysis still need improvements for adjustable-speed (AS) reversible pump-turbine applications.

  18. Verification of Temperature and Precipitation Simulated Data by Individual and Ensemble Performance of Five AOGCM Models for North East of Iran

    Directory of Open Access Journals (Sweden)

    B. Ashraf

    2014-08-01

    Full Text Available Since climatic models are the basic tools for studying climate change, and because of the multiplicity of these models, selecting the most appropriate model for a given location is very important. In this research, the temperature and precipitation data simulated by the BCM2, CGCM3, CNRMCM3, MRICGCM2.3 and MIROC3 models were first downscaled with a proportional method according to the A1B, A2 and B1 emission scenarios for Torbat-heydariye, Sabzevar and Mashhad. Then, using the coefficient of determination (R2), the index of agreement (D) and the mean square deviation (MSD), the models were verified both individually and as an ensemble. The results showed that, based on individual performance under the three emission scenarios, the MRICGCM2.3 model in Torbat-heydariye and Mashhad and the MIROC3.2 model in Sabzevar had the best performance in simulating temperature, while the MIROC3.2, MRICGCM2.3 and CNRMCM3 models provided the most accurate predictions of precipitation in Torbat-heydariye, Sabzevar and Mashhad, respectively. The temperature simulated by all models had the lowest uncertainty under the B1 scenario in Torbat-heydariye and Sabzevar and under the A2 scenario in Mashhad. The highest accuracy in modelling precipitation was obtained under the A2 scenario in Torbat-heydariye and under the B1 scenario in Sabzevar and Mashhad. The statistics derived from the ensemble performance of the five selected models showed a notable reduction in simulation error and thus an increase in prediction accuracy under all emission scenarios. In this case, the best fit between simulated and observed temperature data was achieved under the B1 scenario in Torbat-heydariye and Sabzevar and under the A2 scenario in Mashhad, and the best fit between simulated and observed precipitation data was obtained under the A2 scenario in Torbat-heydariye and under the B1 scenario in Sabzevar and Mashhad. According to the results of this research, before any climate change research it is necessary to select the
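
    A small sketch of the three agreement statistics named in the record (coefficient of determination R2, index of agreement D, and mean square deviation MSD), evaluated on synthetic observed/simulated series; the paper's exact formulations and data are not reproduced, and the Willmott (1981) form of D is assumed.

    ```python
    # Sketch of the three agreement statistics named above: coefficient of
    # determination (R2), Willmott's index of agreement (D) and mean squared
    # deviation (MSD), computed for synthetic observed/simulated series.
    # The paper's exact formulations may differ slightly.
    import numpy as np

    def r_squared(obs, sim):
        return np.corrcoef(obs, sim)[0, 1] ** 2

    def index_of_agreement(obs, sim):
        """Willmott (1981) index of agreement, bounded in [0, 1]."""
        num = np.sum((sim - obs) ** 2)
        den = np.sum((np.abs(sim - obs.mean()) + np.abs(obs - obs.mean())) ** 2)
        return 1.0 - num / den

    def mean_squared_deviation(obs, sim):
        return np.mean((sim - obs) ** 2)

    rng = np.random.default_rng(1)
    obs = 15 + 10 * np.sin(np.linspace(0, 2 * np.pi, 120))   # e.g. monthly temperature
    sim = obs + rng.normal(0, 1.5, obs.size)                  # a model with random error

    print("R2 :", round(r_squared(obs, sim), 3))
    print("D  :", round(index_of_agreement(obs, sim), 3))
    print("MSD:", round(mean_squared_deviation(obs, sim), 3))
    ```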

  19. Terrestrial ecosystem model performance in simulating productivity and its vulnerability to climate change in the northern permafrost region

    Science.gov (United States)

    Xia, Jianyang; McGuire, A. David; Lawrence, David; Burke, Eleanor J.; Chen, Guangsheng; Chen, Xiaodong; Delire, Christine; Koven, Charles; MacDougall, Andrew; Peng, Shushi; Rinke, Annette; Saito, Kazuyuki; Zhang, Wenxin; Alkama, Ramdane; Bohn, Theodore J.; Ciais, Philippe; Decharme, Bertrand; Gouttevin, Isabelle; Hajima, Tomohiro; Hayes, Daniel J.; Huang, Kun; Ji, Duoying; Krinner, Gerhard; Lettenmaier, Dennis P.; Miller, Paul A.; Moore, John C.; Smith, Benjamin; Sueyoshi, Tetsuo; Shi, Zheng; Yan, Liming; Liang, Junyi; Jiang, Lifen; Zhang, Qian; Luo, Yiqi

    2017-01-01

    Realistic projection of future climate-carbon (C) cycle feedbacks requires better understanding and an improved representation of the C cycle in permafrost regions in the current generation of Earth system models. Here we evaluated 10 terrestrial ecosystem models for their estimates of net primary productivity (NPP) and responses to historical climate change in permafrost regions in the Northern Hemisphere. In comparison with the satellite estimate from the Moderate Resolution Imaging Spectroradiometer (MODIS; 246 ± 6 g C m−2 yr−1), most models produced higher NPP (309 ± 12 g C m−2 yr−1) over the permafrost region during 2000–2009. By comparing the simulated gross primary productivity (GPP) with a flux tower-based database, we found that although mean GPP among the models was only overestimated by 10% over 1982–2009, there was a twofold discrepancy among models (380 to 800 g C m−2 yr−1), which mainly resulted from differences in simulated maximum monthly GPP (GPPmax). Most models overestimated C use efficiency (CUE) as compared to observations at both regional and site levels. Further analysis shows that model variability of GPP and CUE are nonlinearly correlated to variability in specific leaf area and the maximum rate of carboxylation by the enzyme Rubisco at 25°C (Vcmax_25), respectively. The models also varied in their sensitivities of NPP, GPP, and CUE to historical changes in climate and atmospheric CO2 concentration. These results indicate that model predictive ability of the C cycle in permafrost regions can be improved by better representation of the processes controlling CUE and GPPmax as well as their sensitivity to climate change.

  20. Cognitive modeling and dynamic probabilistic simulation of operating crew response to complex system accidents. Part 2: IDAC performance influencing factors model

    International Nuclear Information System (INIS)

    Chang, Y.H.J.; Mosleh, A.

    2007-01-01

    This is the second in a series of five papers describing the information, decision, and action in crew context (IDAC) model for human reliability analysis. An example application of this modeling technique is also discussed in this series. The model is developed to probabilistically predict the responses of the nuclear power plant control room operating crew in accident conditions. The operator response spectrum includes cognitive, psychological, and physical activities during the course of an accident. This paper identifies the IDAC set of performance influencing factors (PIFs), providing their definitions and causal organization in the form of a modular influence diagram. Fifty PIFs are identified to support the IDAC model to be implemented in a computer simulation environment. They are classified into eleven hierarchically structured groups. The PIFs within each group are independent of each other; however, dependencies may exist between PIFs within different groups. The supporting evidence for the selection and organization of the influence paths, based on psychological literature, observations, and various human reliability analysis methodologies, is also indicated.

  1. Modeling and simulation performance of photovoltaic system integration battery and supercapacitor paralellization of MPPT prototipe for solar vehicle

    Science.gov (United States)

    Ajiatmo, Dwi; Robandi, Imam

    2017-03-01

    This paper proposes a control scheme for a photovoltaic array, battery and supercapacitor connected in parallel for use in a solar vehicle. Based on the features of battery charging, the control scheme consists of three modes, namely a dynamic irradiance mode, a constant load mode and a constant voltage charging mode. Switching between the three modes is realized by controlling the duty cycle of the MOSFET in the boost converter system. Meanwhile, a higher voltage, which is more suitable for the application, can be obtained. Compared with the normal charging method with parallel-connected current limiting and with charging based only on the dynamic irradiance, constant load and constant voltage charging modes, the proposed control scheme shortens the charging time and increases the use of the power generated from the PV array. Simulations and analysis were conducted in Matlab/Simulink to determine the performance of the system in transient and steady state. The simulated responses demonstrate the suitability of the proposed concept.

  2. Evaluating the performance of coupled snow-soil models in SURFEXv8 to simulate the permafrost thermal regime at a high Arctic site

    Science.gov (United States)

    Barrere, Mathieu; Domine, Florent; Decharme, Bertrand; Morin, Samuel; Vionnet, Vincent; Lafaysse, Matthieu

    2017-09-01

    Climate change projections still suffer from a limited representation of the permafrost-carbon feedback. Predicting the response of permafrost temperature to climate change requires accurate simulations of Arctic snow and soil properties. This study assesses the capacity of the coupled land surface and snow models ISBA-Crocus and ISBA-ES to simulate snow and soil properties at Bylot Island, a high Arctic site. Field measurements complemented with ERA-Interim reanalyses were used to drive the models and to evaluate simulation outputs. Snow height, density, temperature, thermal conductivity and thermal insulance are examined to determine the critical variables involved in the soil and snow thermal regime. Simulated soil properties are compared to measurements of thermal conductivity, temperature and water content. The simulated snow density profiles are unrealistic, which is most likely caused by the lack of representation in snow models of the upward water vapor fluxes generated by the strong temperature gradients within the snowpack. The resulting vertical profiles of thermal conductivity are inverted compared to observations, with high simulated values at the bottom of the snowpack. Still, ISBA-Crocus manages to successfully simulate the soil temperature in winter. Results are satisfactory in summer, but the temperature of the top soil could be better reproduced by adequately representing surface organic layers, i.e., mosses and litter, and in particular their water retention capacity. Transition periods (soil freezing and thawing) are the least well reproduced because the high basal snow thermal conductivity induces an excessively rapid heat transfer between the soil and the snow in simulations. Hence, global climate models should carefully consider Arctic snow thermal properties, and especially the thermal conductivity of the basal snow layer, to perform accurate predictions of the permafrost evolution under climate change.
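
    The role of the basal snow layer discussed above can be illustrated with the layered thermal insulance R = Σ h_i / k_i. The sketch below compares a hypothetical observed-like profile (insulating depth hoar at the bottom) with an inverted, simulated-like profile; the layer thicknesses and conductivities are made up for illustration and are not the Bylot Island measurements.

    ```python
    # Illustrative calculation of snowpack thermal insulance R = sum(h_i / k_i),
    # comparing a hypothetical observed-like profile (low-conductivity depth hoar
    # at the bottom) with an inverted, simulated-like profile. Values are made up
    # for illustration; they are not the Bylot Island measurements.
    layers_obs = [  # (thickness m, thermal conductivity W m-1 K-1)
        (0.10, 0.30),   # wind slab near the surface
        (0.15, 0.15),
        (0.15, 0.05),   # basal depth hoar: very insulating
    ]
    layers_sim = [  # same layers with the conductivity profile inverted
        (0.10, 0.05),
        (0.15, 0.15),
        (0.15, 0.30),   # dense, conductive basal layer as in the simulations
    ]

    def insulance(layers):
        return sum(h / k for h, k in layers)

    print("observed-like R_s :", round(insulance(layers_obs), 2), "K m2 W-1")
    print("simulated-like R_s:", round(insulance(layers_sim), 2), "K m2 W-1")
    # A conductive basal layer lowers R_s and speeds soil-snow heat exchange,
    # which is the bias discussed for the freezing and thawing transition periods.
    ```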

  3. MODELLING, SIMULATING AND OPTIMIZING BOILERS

    DEFF Research Database (Denmark)

    Sørensen, K.; Condra, T.; Houbak, Niels

    2003-01-01

    This paper describes the modelling, simulating and optimizing including experimental verification as being carried out as part of a Ph.D. project being written resp. supervised by the authors. The work covers dynamic performance of both water-tube boilers and fire tube boilers. A detailed dynamic...... model of the boiler has been developed and simulations carried out by means of the Matlab integration routines. The model is prepared as a dynamic model consisting of both ordinary differential equations and algebraic equations, together formulated as a Differential-Algebraic-Equation system. Being able...... to operate a boiler plant dynamically means that the boiler designs must be able to absorb any fluctuations in water level and temperature gradients resulting from the pressure change in the boiler. On the one hand a large water-/steam space may be required, i.e. to build the boiler as big as possible. Due...

  4. MODELLING, SIMULATING AND OPTIMIZING BOILERS

    DEFF Research Database (Denmark)

    Sørensen, K.; Condra, T.; Houbak, Niels

    2003-01-01

    This paper describes the modelling, simulating and optimizing including experimental verification as being carried out as part of a Ph.D. project being written resp. supervised by the authors. The work covers dynamic performance of both water-tube boilers and fire tube boilers. A detailed dynamic...... to the internal pressure the consequence of the increased volume (i.e. water-/steam space) is an increased wall thickness in the pressure part of the boiler. The stresses introduced in the boiler pressure part as a result of the temperature gradients are proportional to the square of the wall thickness...... model of the boiler has been developed and simulations carried out by means of the Matlab integration routines. The model is prepared as a dynamic model consisting of both ordinary differential equations and algebraic equations, together formulated as a Differential-Algebraic-Equation system. Being able...
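
    The two records above describe a detailed differential-algebraic boiler model integrated with Matlab routines. As a heavily simplified stand-in, the sketch below integrates a two-state lumped drum model (stored-energy and water-mass balances) with SciPy; all parameter values are illustrative assumptions, not the thesis model.

    ```python
    # A heavily simplified lumped drum-boiler sketch (two ODEs: stored energy and
    # water mass), solved with SciPy. It only illustrates how such a dynamic
    # boiler model can be integrated numerically; the thesis model is a much more
    # detailed differential-algebraic (DAE) system.
    from scipy.integrate import solve_ivp

    H_FG = 2.0e6      # latent heat of vaporization, J/kg (rough)
    C_P = 4.5e3       # effective heat capacity of the water/metal lump, J/(kg K)

    def boiler(t, y, q_in, m_feed, m_steam):
        temp, mass = y                     # lump temperature [K], water mass [kg]
        evap = m_steam                     # steam drawn off (assumed saturated)
        dT = (q_in - evap * H_FG) / (mass * C_P)
        dM = m_feed - evap
        return [dT, dM]

    # Step in heat input at t = 0 with constant feedwater and steam flows (made up).
    sol = solve_ivp(boiler, (0, 600), [450.0, 5000.0],
                    args=(12e6, 4.0, 4.5), max_step=1.0)
    print("temperature change over 10 min:", round(sol.y[0, -1] - sol.y[0, 0], 1), "K")
    print("water mass change over 10 min :", round(sol.y[1, -1] - sol.y[1, 0], 1), "kg")
    ```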

  5. Modeling and Simulation for Safeguards

    Energy Technology Data Exchange (ETDEWEB)

    Swinhoe, Martyn T. [Los Alamos National Laboratory

    2012-07-26

    The purpose of this talk is to give an overview of the role of modeling and simulation in Safeguards R&D and introduce you to (some of) the tools used. Some definitions are: (1) Modeling - the representation, often mathematical, of a process, concept, or operation of a system, often implemented by a computer program; (2) Simulation - the representation of the behavior or characteristics of one system through the use of another system, especially a computer program designed for the purpose; and (3) Safeguards - the timely detection of diversion of significant quantities of nuclear material. The roles of modeling and simulation are: (1) to calculate amounts of material (plant modeling); (2) to calculate signatures of nuclear material etc. (source terms); and (3) to assess detector performance (radiation transport and detection). Plant modeling software (e.g. FACSIM) gives the flows and amount of material stored at all parts of the process. In safeguards this allows us to calculate the expected uncertainty of the mass and evaluate the expected MUF. We can determine the measurement accuracy required to achieve a certain performance.

  6. Simulation - modeling - experiment

    International Nuclear Information System (INIS)

    2004-01-01

    After two workshops held in 2001 on the same topics, and in order to take stock of the advances in the domain of simulation and measurements, the main goals proposed for this workshop are: the presentation of the state-of-the-art of tools, methods and experiments in the domains of interest of the Gedepeon research group, the exchange of information about the possibilities of use of computer codes and facilities, about the understanding of physical and chemical phenomena, and about development and experiment needs. This document gathers 18 presentations (slides) among the 19 given at this workshop and dealing with: the deterministic and stochastic codes in reactor physics (Rimpault G.); MURE: an evolution code coupled with MCNP (Meplan O.); neutronic calculation of future reactors at EdF (Lecarpentier D.); advance status of the MCNP/TRIO-U neutronic/thermal-hydraulics coupling (Nuttin A.); the FLICA4/TRIPOLI4 thermal-hydraulics/neutronics coupling (Aniel S.); methods of disturbances and sensitivity analysis of nuclear data in reactor physics, application to VENUS-2 experimental reactor (Bidaud A.); modeling for the reliability improvement of an ADS accelerator (Biarotte J.L.); residual gas compensation of the space charge of intense beams (Ben Ismail A.); experimental determination and numerical modeling of phase equilibrium diagrams of interest in nuclear applications (Gachon J.C.); modeling of irradiation effects (Barbu A.); elastic limit and irradiation damage in Fe-Cr alloys: simulation and experiment (Pontikis V.); experimental measurements of spallation residues, comparison with Monte-Carlo simulation codes (Fallot M.); the spallation target-reactor coupling (Rimpault G.); tools and data (Grouiller J.P.); models in high energy transport codes: status and perspective (Leray S.); other ways of investigation for spallation (Audoin L.); neutrons and light particles production at intermediate energies (20-200 MeV) with iron, lead and uranium targets (Le Colley F

  7. Software-In-the-Loop based Modeling and Simulation of Unmanned Semi-submersible Vehicle for Performance Verification of Autonomous Navigation

    Science.gov (United States)

    Lee, Kwangkook; Jeong, Mijin; Kim, Dong Hun

    2017-12-01

    Since an unmanned semi-submersible is mainly used to carry out dangerous missions at sea, it can operate in regions that are difficult to access for safety reasons. In this study, a USV hull design was determined using the Myring hull profile, and the hull was reinforced by designing and implementing internal stiffener members for 3D printing. In order to simulate sea state 5 or higher, which is difficult to reproduce in practice, regular and irregular wave equations were implemented in Matlab/Simulink. We performed modeling and simulation of the semi-submersible based on DMWorks, taking into account the rolling motion in waves. To verify and correct unpredicted errors, we implemented numerical and physical simulation models of the USV based on the software-in-the-loop (SIL) method. This simulation allows shipbuilders to participate in new value-added markets such as engineering, procurement, construction, installation, commissioning, operation, and maintenance for the USV.
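
    The "irregular wave equation" mentioned above is commonly built as a sum of sinusoids with random phases drawn from a wave spectrum. The sketch below assumes a Pierson-Moskowitz/Bretschneider spectral form parameterized by significant wave height Hs and peak period Tp; it is not the authors' Matlab/Simulink implementation, and the sea-state numbers are illustrative.

    ```python
    # Hedged sketch: synthesize an irregular long-crested wave elevation as a sum
    # of sinusoids with random phases drawn from a Pierson-Moskowitz-type spectrum.
    # Hs and Tp values are illustrative, roughly in the sea state 5 range.
    import numpy as np

    def pm_spectrum(omega, hs, tp):
        """Bretschneider/Pierson-Moskowitz form S(w) in m^2 s/rad."""
        wp = 2 * np.pi / tp
        return (5.0 / 16.0) * hs**2 * wp**4 / omega**5 * np.exp(-1.25 * (wp / omega) ** 4)

    def irregular_wave(t, hs=3.25, tp=9.7, n_comp=200, seed=0):
        """Wave elevation eta(t) for an illustrative sea state 5 condition."""
        rng = np.random.default_rng(seed)
        omega = np.linspace(0.2, 3.0, n_comp)
        d_omega = omega[1] - omega[0]
        amp = np.sqrt(2.0 * pm_spectrum(omega, hs, tp) * d_omega)
        phase = rng.uniform(0, 2 * np.pi, n_comp)
        return np.sum(amp[:, None] * np.cos(omega[:, None] * t + phase[:, None]), axis=0)

    t = np.arange(0, 600, 0.1)
    eta = irregular_wave(t)
    print("estimated Hs from the series:", round(4 * eta.std(), 2), "m")
    ```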

  8. Comparison of the performance of traditional advection-dispersion equation and mobile-immobile model for simulating solute transport in heterogeneous soils

    Directory of Open Access Journals (Sweden)

    Haizhu HU, Xiaomin MAO

    2016-09-01

    Full Text Available The traditional advection-dispersion equation (ADE and the mobile-immobile model (MIM are widely used to describe solute transport in heterogeneous porous media. However, the fitness of the two models is case-dependent. In this paper, the transport of conservative, adsorbing and degradable solutes through a 1 m heterogeneous soil column under steady flow condition was simulated by ADE and MIM, and sensitivity analysis was conducted. Results show that MIM tends to prolong the breakthrough process and decrease peak concentration for all three solutes, and tailing and skewness are more pronounced with increasing dispersivity. Breakthrough curves of the adsorbing solute simulated by MIM are less sensitive to the retardation factor compared with the results simulated by ADE. The breakthrough curves of degradable solute obtained by MIM and ADE nearly overlap with a high degradation rate coefficient, indicating that MIM and ADE perform similarly for simulating degradable solute transport when biochemical degradation prevails over the mass exchange between mobile and immobile zones. The results suggest that the physical significance of dispersivity should be carefully considered when MIM is applied to simulate the degradable solute transport and/or ADE is applied to simulate the adsorbing solute transport in highly dispersive soils.
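
    For concreteness, a minimal explicit finite-difference solver for the 1-D advection-dispersion equation with retardation and first-order decay (the ADE side of the comparison above) is sketched below; the grid, time step and transport parameters are illustrative and are not those fitted to the soil column.

    ```python
    # Minimal explicit finite-difference sketch of the 1-D advection-dispersion
    # equation with retardation R and first-order decay, of the kind compared with
    # MIM above. Parameters are illustrative, not those fitted to the soil column.
    import numpy as np

    L, nz = 1.0, 201                 # 1 m column
    dz = L / (nz - 1)
    v, D = 0.5, 0.01                 # pore velocity [m/d], dispersion coeff. [m2/d]
    R, mu = 1.5, 0.05                # retardation factor, decay rate [1/d]
    dt = 0.2 * dz**2 / D             # conservative explicit time step
    c = np.zeros(nz)

    t, t_end, c_in = 0.0, 4.0, 1.0   # continuous injection at the inlet
    while t < t_end:
        c[0] = c_in                                        # inlet boundary
        adv = -v * (c[1:-1] - c[:-2]) / dz                 # upwind advection
        disp = D * (c[2:] - 2 * c[1:-1] + c[:-2]) / dz**2  # dispersion
        c[1:-1] += dt * (adv + disp - mu * c[1:-1]) / R
        c[-1] = c[-2]                                      # zero-gradient outlet
        t += dt

    print("relative outlet concentration at t = 4 d:", round(c[-1] / c_in, 3))
    ```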

  9. Model Performance Evaluation and Scenario Analysis (MPESA)

    Science.gov (United States)

    Model Performance Evaluation and Scenario Analysis (MPESA) assesses the performance with which models predict time series data. The tool was developed for use with the Hydrological Simulation Program-Fortran (HSPF) and the Stormwater Management Model (SWMM).

  10. Stereoscopic (3D) versus monoscopic (2D) laparoscopy: comparative study of performance using advanced HD optical systems in a surgical simulator model.

    Science.gov (United States)

    Schoenthaler, Martin; Schnell, Daniel; Wilhelm, Konrad; Schlager, Daniel; Adams, Fabian; Hein, Simon; Wetterauer, Ulrich; Miernik, Arkadiusz

    2016-04-01

    To compare task performances of novices and experts using advanced high-definition 3D versus 2D optical systems in a surgical simulator model. Fifty medical students (novices in laparoscopy) were randomly assigned to perform five standardized tasks adopted from the Fundamentals of Laparoscopic Surgery (FLS) curriculum in either a 2D or 3D laparoscopy simulator system. In addition, eight experts performed the same tasks. Task performances were evaluated using a validated scoring system of the SAGES/FLS program. Participants were asked to rate 16 items in a questionnaire. Overall task performance of novices was significantly better using stereoscopic visualization. Superiority of performances in 3D reached a level of significance for tasks peg transfer and precision cutting. No significant differences were noted in performances of experts when using either 2D or 3D. Overall performances of experts compared to novices were better in both 2D and 3D. Scorings in the questionnaires showed a tendency toward lower scores in the group of novices using 3D. Stereoscopic imaging significantly improves performance of laparoscopic phantom tasks of novices. The current study confirms earlier data based on a large number of participants and a standardized task and scoring system. Participants felt more confident and comfortable when using a 3D laparoscopic system. However, the question remains open whether these findings translate into faster and safer operations in a clinical setting.

  11. Structured building model reduction toward parallel simulation

    Energy Technology Data Exchange (ETDEWEB)

    Dobbs, Justin R. [Cornell University; Hencey, Brondon M. [Cornell University

    2013-08-26

    Building energy model reduction exchanges accuracy for improved simulation speed by reducing the number of dynamical equations. Parallel computing aims to improve simulation times without loss of accuracy but is poorly utilized by contemporary simulators and is inherently limited by inter-processor communication. This paper bridges these disparate techniques to implement efficient parallel building thermal simulation. We begin with a survey of three structured reduction approaches that compares their performance to a leading unstructured method. We then use structured model reduction to find thermal clusters in the building energy model and allocate processing resources. Experimental results demonstrate faster simulation and low error without any interprocessor communication.

  12. Simulating spin models on GPU

    Science.gov (United States)

    Weigel, Martin

    2011-09-01

    Over the last couple of years it has been realized that the vast computational power of graphics processing units (GPUs) could be harvested for purposes other than the video game industry. This power, which at least nominally exceeds that of current CPUs by large factors, results from the relative simplicity of the GPU architectures as compared to CPUs, combined with a large number of parallel processing units on a single chip. To benefit from this setup for general computing purposes, the problems at hand need to be prepared in a way to profit from the inherent parallelism and hierarchical structure of memory accesses. In this contribution I discuss the performance potential for simulating spin models, such as the Ising model, on GPU as compared to conventional simulations on CPU.
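
    A plain NumPy Metropolis sketch of the 2-D Ising model is given below to make the simulated system concrete. It is deliberately serial; GPU implementations of the same algorithm typically use a checkerboard decomposition so that non-interacting sublattices of spins can be updated in parallel.

    ```python
    # Serial Metropolis sketch of the 2-D Ising model with periodic boundaries.
    # A GPU version would update the two checkerboard sublattices in parallel;
    # this sketch only shows the single-spin-flip update rule itself.
    import numpy as np

    rng = np.random.default_rng(0)
    L, beta, sweeps = 32, 0.44, 400          # beta near the critical coupling
    spins = rng.choice([-1, 1], size=(L, L))

    for _ in range(sweeps):
        for _ in range(L * L):               # one sweep = L*L attempted flips
            i, j = rng.integers(0, L, size=2)
            nn = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
                  + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
            dE = 2 * spins[i, j] * nn        # energy change of flipping spin (i, j)
            if dE <= 0 or rng.random() < np.exp(-beta * dE):
                spins[i, j] *= -1

    print("magnetization per spin:", abs(spins.mean()).round(3))
    ```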

  13. A predictive analytic model for high-performance tunneling field-effect transistors approaching non-equilibrium Green's function simulations

    International Nuclear Information System (INIS)

    Salazar, Ramon B.; Appenzeller, Joerg; Ilatikhameneh, Hesameddin; Rahman, Rajib; Klimeck, Gerhard

    2015-01-01

    A new compact modeling approach is presented which describes the full current-voltage (I-V) characteristic of high-performance (aggressively scaled-down) tunneling field-effect-transistors (TFETs) based on homojunction direct-bandgap semiconductors. The model is based on an analytic description of two key features, which capture the main physical phenomena related to TFETs: (1) the potential profile from source to channel and (2) the elliptic curvature of the complex bands in the bandgap region. It is proposed to use 1D Poisson's equations in the source and the channel to describe the potential profile in homojunction TFETs. This allows to quantify the impact of source/drain doping on device performance, an aspect usually ignored in TFET modeling but highly relevant in ultra-scaled devices. The compact model is validated by comparison with state-of-the-art quantum transport simulations using a 3D full band atomistic approach based on non-equilibrium Green's functions. It is shown that the model reproduces with good accuracy the data obtained from the simulations in all regions of operation: the on/off states and the n/p branches of conduction. This approach allows calculation of energy-dependent band-to-band tunneling currents in TFETs, a feature that allows gaining deep insights into the underlying device physics. The simplicity and accuracy of the approach provide a powerful tool to explore in a quantitatively manner how a wide variety of parameters (material-, size-, and/or geometry-dependent) impact the TFET performance under any bias conditions. The proposed model presents thus a practical complement to computationally expensive simulations such as the 3D NEGF approach

  14. SNR and BER Models and the Simulation for BER Performance of Selected Spectral Amplitude Codes for OCDMA

    Directory of Open Access Journals (Sweden)

    Abdul Latif Memon

    2014-01-01

    Full Text Available Many encoding schemes are used in OCDMA (Optical Code Division Multiple Access) networks, but SAC (Spectral Amplitude Codes) is widely used. It is considered an effective arrangement to eliminate the dominant noise called MAI (Multi Access Interference). Various codes are studied and evaluated with respect to their performance against three noises, namely shot noise, thermal noise and PIIN (Phase Induced Intensity Noise). Various mathematical models for SNR (Signal to Noise Ratio) and BER (Bit Error Rate) are discussed, where the SNRs are calculated and the BERs are computed using the Gaussian distribution assumption. After analyzing the results mathematically, it is concluded that the ZCC (Zero Cross Correlation) code performs better than the other selected SAC codes and can serve a larger number of active users than the other codes do. At various receiver power levels, the analysis points out that the RDC (Random Diagonal Code) also performs better than the other codes. For the power interval between -10 and -20 dBm, the performance of RDC is better than that of ZCC. Their low BER values suggest that these codes should be part of an efficient and cost-effective OCDMA network in the future.
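
    Under the Gaussian approximation mentioned above, SAC-OCDMA analyses commonly map SNR to BER via BER ≈ ½ erfc(√(SNR/8)). The sketch below evaluates this generic relation over a range of SNR values; it does not reproduce the code-specific SNR expressions of ZCC, RDC or the other schemes.

    ```python
    # Generic Gaussian-approximation mapping from SNR to BER commonly used in
    # SAC-OCDMA analyses: BER ~= 0.5 * erfc(sqrt(SNR / 8)). This is a sketch,
    # not the code-specific SNR expressions of the schemes compared above.
    import numpy as np
    from scipy.special import erfc

    def ber_from_snr(snr_linear):
        return 0.5 * erfc(np.sqrt(snr_linear / 8.0))

    for snr_db in (10, 15, 20, 25):
        snr = 10 ** (snr_db / 10)
        print(f"SNR = {snr_db:2d} dB -> BER ~ {ber_from_snr(snr):.2e}")
    ```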

  15. Manufacturing plant performance evaluation by discrete event simulation

    International Nuclear Information System (INIS)

    Rosli Darmawan; Mohd Rasid Osman; Rosnah Mohd Yusuff; Napsiah Ismail; Zulkiflie Leman

    2002-01-01

    A case study was conducted to evaluate the performance of a manufacturing plant using the discrete event simulation technique. The study was carried out on an animal feed production plant, the Sterifeed plant at the Malaysian Institute for Nuclear Technology Research (MINT), Selangor, Malaysia. The plant was modelled based on the actual manufacturing activities recorded by the operators. The simulation was carried out using discrete event simulation software. The model was validated by comparing the simulation results with the actual operational data of the plant. The simulation results show some weaknesses in the current plant design, and proposals were made to improve the plant performance. (Author)
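
    A hedged discrete-event sketch of a small two-stage production line, written with the SimPy library, is given below to illustrate the kind of model such plant-performance studies use. The station names, processing times, arrival rate and shift length are hypothetical, not the recorded Sterifeed plant data.

    ```python
    # Discrete-event sketch (SimPy) of a hypothetical two-stage production line.
    # Station names, processing times and run length are illustrative only.
    import random
    import simpy

    random.seed(42)
    PROCESS_TIME = {"mixing": 8.0, "packing": 6.0}    # minutes, illustrative

    def batch(env, name, mixer, packer, completed):
        with mixer.request() as req:                  # queue for the mixer
            yield req
            yield env.timeout(random.expovariate(1.0 / PROCESS_TIME["mixing"]))
        with packer.request() as req:                 # then queue for the packer
            yield req
            yield env.timeout(random.expovariate(1.0 / PROCESS_TIME["packing"]))
        completed.append((name, env.now))

    def arrivals(env, mixer, packer, completed):
        i = 0
        while True:
            yield env.timeout(random.expovariate(1.0 / 10.0))  # a batch every ~10 min
            i += 1
            env.process(batch(env, f"batch-{i}", mixer, packer, completed))

    env = simpy.Environment()
    mixer = simpy.Resource(env, capacity=1)
    packer = simpy.Resource(env, capacity=1)
    done = []
    env.process(arrivals(env, mixer, packer, done))
    env.run(until=8 * 60)                             # one 8-hour shift
    print(f"batches completed in one shift: {len(done)}")
    ```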

  16. Simulated influence of postweaning production system on performance of different biological types of cattle: I. Estimation of model parameters.

    Science.gov (United States)

    Williams, C B; Bennett, G L; Keele, J W

    1995-03-01

    Breed parameters for a computer model that simulated differences in the composition of empty-body gain of beef cattle, resulting from differences in postweaning level of nutrition that are not associated with empty BW, were estimated for 17 biological types of cattle (steers from F1 crosses of 16 sire breeds [Hereford, Angus, Jersey, South Devon, Limousin, Simmental, Charolais, Red Poll, Brown Swiss, Gelbvieh, Maine Anjou, Chianina, Brahman, Sahiwal, Pinzgauer, and Tarentaise] mated to Hereford and Angus dams). One value for the maximum fractional growth rate of fat-free matter (KMAX) was estimated and used across all breed types. Mature fat-free matter (FFMmat) was estimated from data on mature cows for each of the 17 breed types. Breed type values for a fattening parameter (THETA) were estimated from growth and composition data at slaughter on steers of the 17 breed types, using the previously estimated constant KMAX and breed values for FFMmat. For each breed type, THETA values were unique for given values of KMAX, FFMmat, and composition at slaughter. The results showed that THETA was most sensitive to KMAX and had similar sensitivity to FFMmat and composition at slaughter. Values for THETA were most sensitive for breed types with large THETA values (Chianina, Charolais, and Limousin crossbred steers) and least sensitive for breed types with small THETA values (purebred Angus, crossbred Jersey, and Red Poll steers).(ABSTRACT TRUNCATED AT 250 WORDS)

  17. Using Modeling and Simulation to Predict Operator Performance and Automation-Induced Complacency With Robotic Automation: A Case Study and Empirical Validation.

    Science.gov (United States)

    Wickens, Christopher D; Sebok, Angelia; Li, Huiyang; Sarter, Nadine; Gacy, Andrew M

    2015-09-01

    The aim of this study was to develop and validate a computational model of the automation complacency effect, as operators work on a robotic arm task, supported by three different degrees of automation. Some computational models of complacency in human-automation interaction exist, but those are formed and validated within the context of fairly simplified monitoring failures. This research extends model validation to a much more complex task, so that system designers can establish, without need for human-in-the-loop (HITL) experimentation, merits and shortcomings of different automation degrees. We developed a realistic simulation of a space-based robotic arm task that could be carried out with three different levels of trajectory visualization and execution automation support. Using this simulation, we performed HITL testing. Complacency was induced via several trials of correctly performing automation and then was assessed on trials when automation failed. Following a cognitive task analysis of the robotic arm operation, we developed a multicomponent model of the robotic operator and his or her reliance on automation, based in part on visual scanning. The comparison of model predictions with empirical results revealed that the model accurately predicted routine performance and predicted the responses to these failures after complacency developed. However, the scanning models do not account for the entire attention allocation effects of complacency. Complacency modeling can provide a useful tool for predicting the effects of different types of imperfect automation. The results from this research suggest that focus should be given to supporting situation awareness in automation development. © 2015, Human Factors and Ergonomics Society.

  18. Simulation modeling and analysis with Arena

    CERN Document Server

    Altiok, Tayfur

    2007-01-01

    Simulation Modeling and Analysis with Arena is a highly readable textbook which treats the essentials of the Monte Carlo discrete-event simulation methodology, and does so in the context of the popular Arena simulation environment. It treats simulation modeling as an in-vitro laboratory that facilitates the understanding of complex systems and experimentation with what-if scenarios in order to estimate their performance metrics. The book contains chapters on the simulation modeling methodology and the underpinnings of discrete-event systems, as well as the relevant underlying probability, statistics, stochastic processes, input analysis, model validation and output analysis. All simulation-related concepts are illustrated in numerous Arena examples, encompassing production lines, manufacturing and inventory systems, transportation systems, and computer information systems in networked settings. It introduces the concept of discrete event Monte Carlo simulation, the most commonly used methodology for modeli...

  19. TOWARD END-TO-END MODELING FOR NUCLEAR EXPLOSION MONITORING: SIMULATION OF UNDERGROUND NUCLEAR EXPLOSIONS AND EARTHQUAKES USING HYDRODYNAMIC AND ANELASTIC SIMULATIONS, HIGH-PERFORMANCE COMPUTING AND THREE-DIMENSIONAL EARTH MODELS

    Energy Technology Data Exchange (ETDEWEB)

    Rodgers, A; Vorobiev, O; Petersson, A; Sjogreen, B

    2009-07-06

    This paper describes new research being performed to improve understanding of seismic waves generated by underground nuclear explosions (UNE) by using full waveform simulation, high-performance computing and three-dimensional (3D) earth models. The goal of this effort is to develop an end-to-end modeling capability to cover the range of wave propagation required for nuclear explosion monitoring (NEM) from the buried nuclear device to the seismic sensor. The goal of this work is to improve understanding of the physical basis and prediction capabilities of seismic observables for NEM including source and path-propagation effects. We are pursuing research along three main thrusts. Firstly, we are modeling the non-linear hydrodynamic response of geologic materials to underground explosions in order to better understand how source emplacement conditions impact the seismic waves that emerge from the source region and are ultimately observed hundreds or thousands of kilometers away. Empirical evidence shows that the amplitudes and frequency content of seismic waves at all distances are strongly impacted by the physical properties of the source region (e.g. density, strength, porosity). To model the near-source shock-wave motions of an UNE, we use GEODYN, an Eulerian Godunov (finite volume) code incorporating thermodynamically consistent non-linear constitutive relations, including cavity formation, yielding, porous compaction, tensile failure, bulking and damage. In order to propagate motions to seismic distances we are developing a one-way coupling method to pass motions to WPP (a Cartesian anelastic finite difference code). Preliminary investigations of UNE's in canonical materials (granite, tuff and alluvium) confirm that emplacement conditions have a strong effect on seismic amplitudes and the generation of shear waves. Specifically, we find that motions from an explosion in high-strength, low-porosity granite have high compressional wave amplitudes and weak

  20. Performance of Irikura's Recipe Rupture Model Generator in Earthquake Ground Motion Simulations as Implemented in the Graves and Pitarka Hybrid Approach.

    Energy Technology Data Exchange (ETDEWEB)

    Pitarka, A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-11-22

    We analyzed the performance of the Irikura and Miyake (2011) (IM2011) asperity-based kinematic rupture model generator, as implemented in the hybrid broadband ground-motion simulation methodology of Graves and Pitarka (2010), for simulating ground motion from crustal earthquakes of intermediate size. The primary objective of our study is to investigate the transportability of IM2011 into the framework used by the Southern California Earthquake Center broadband simulation platform. In our analysis, we performed broadband (0–20 Hz) ground motion simulations for a suite of M6.7 crustal scenario earthquakes in a hard rock seismic velocity structure using rupture models produced with both IM2011 and the rupture generation method of Graves and Pitarka (2016) (GP2016). The level of simulated ground motions for the two approaches compares favorably with median estimates obtained from the 2014 Next Generation Attenuation-West2 Project (NGA-West2) ground-motion prediction equations (GMPEs) over the frequency band 0.1–10 Hz and for distances out to 22 km from the fault. We also found that, compared to GP2016, IM2011 generates ground motion with larger variability, particularly at near-fault distances (<12 km) and at long periods (>1 s). For this specific scenario, the largest systematic difference in ground motion level for the two approaches occurs in the period band 1–3 s, where the IM2011 motions are about 20–30% lower than those for GP2016. We found that increasing the rupture speed by 20% on the asperities in IM2011 produced ground motions in the 1–3 second bandwidth that are in much closer agreement with the GMPE medians and similar to those obtained with GP2016. The potential implications of this modification for other rupture mechanisms and magnitudes are not yet fully understood, and this topic is the subject of ongoing study.

  1. Aircraft Performance for Open Air Traffic Simulations

    NARCIS (Netherlands)

    Metz, I.C.; Hoekstra, J.M.; Ellerbroek, J.; Kugler, D.

    2016-01-01

    The BlueSky Open Air Traffic Simulator developed by the Control & Simulation section of TU Delft aims at supporting research for analysing Air Traffic Management concepts by providing an open source simulation platform. The goal of this study was to complement BlueSky with aircraft performance

  2. Simulating atmospheric composition over a South-East Asian tropical rainforest: performance of a chemistry box model

    Directory of Open Access Journals (Sweden)

    T. A. M. Pugh

    2010-01-01

    Full Text Available Atmospheric composition and chemistry above tropical rainforests are currently not well established, particularly for south-east Asia. In order to examine our understanding of chemical processes in this region, the performance of a box model of atmospheric boundary layer chemistry is tested against measurements made at the top of the rainforest canopy near Danum Valley, Malaysian Borneo. Multi-variate optimisation against ambient concentration measurements was used to estimate average canopy-scale emissions for isoprene, total monoterpenes and nitric oxide. The excellent agreement between estimated values and measured fluxes of isoprene and total monoterpenes provides confidence in the overall modelling strategy, and suggests that this method may be applied where measured fluxes are not available, assuming that the local chemistry and mixing are adequately understood. The largest contributors to the optimisation cost function at the point of best-fit are OH (29%), NO (22%) and total peroxy radicals (27%). Several factors affect the modelled VOC chemistry. In particular, concentrations of methacrolein (MACR) and methyl-vinyl ketone (MVK) are substantially overestimated, and the hydroxyl radical (OH) concentration is substantially underestimated, as has been seen before in tropical rainforest studies. It is shown that inclusion of dry deposition of MACR and MVK and wet deposition of species with high Henry's Law values substantially improves the fit of these oxidised species, whilst also substantially decreasing the OH sink. Increasing OH production arbitrarily, through a simple OH recycling mechanism, adversely affects the model fit for volatile organic compounds (VOCs). Given the constraints on isoprene flux provided by measurements, a substantial decrease in the rate of reaction of VOCs with OH is the only remaining option to explain the measurement/model discrepancy for OH. A reduction in the isoprene+OH rate constant of 50%, in conjunction with
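
    A very small box-model sketch is given below to make the kind of balance discussed above concrete: constant canopy-scale isoprene emission and dilution against oxidation by OH, with OH produced at a fixed gross rate. The rate constants and source terms are round illustrative numbers, not the study's mechanism; the final comment points at the rate-constant sensitivity the abstract describes.

    ```python
    # Illustrative two-species box model of isoprene-OH chemistry; rate constants
    # and source terms are round numbers, not the study's chemical mechanism.
    from scipy.integrate import solve_ivp

    K_ISOP_OH = 1.0e-10      # cm3 molec-1 s-1, isoprene + OH (approximate)
    P_OH = 1.0e7             # molec cm-3 s-1, gross OH production (illustrative)
    K_OH_OTHER = 10.0        # s-1, OH loss to everything else (illustrative)
    E_ISOP = 2.5e6           # molec cm-3 s-1, emission mixed into the boundary layer
    K_DIL = 1.0e-4           # s-1, dilution/entrainment loss

    def box(t, y):
        isop, oh = y
        d_isop = E_ISOP - K_ISOP_OH * isop * oh - K_DIL * isop
        d_oh = P_OH - K_ISOP_OH * isop * oh - K_OH_OTHER * oh
        return [d_isop, d_oh]

    sol = solve_ivp(box, (0, 6 * 3600), [1.0e10, 1.0e6], method="LSODA")
    isop, oh = sol.y[:, -1]
    print(f"isoprene ~ {isop:.2e} molec/cm3, OH ~ {oh:.2e} molec/cm3 after 6 h")
    # Halving K_ISOP_OH (the VOC+OH rate) raises the steady-state OH, which is the
    # kind of sensitivity discussed in the abstract.
    ```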

  3. Acoustic Performance of Novel Fan Noise Reduction Technologies for a High Bypass Model Turbofan at Simulated Flights Conditions

    Science.gov (United States)

    Elliott, David M.; Woodward, Richard P.; Podboy, Gary G.

    2010-01-01

    Two novel fan noise reduction technologies, over the rotor acoustic treatment and soft stator vane technologies, were tested in an ultra-high bypass ratio turbofan model in the NASA Glenn Research Center's 9- by 15-Foot Low-Speed Wind Tunnel. The performance of these technologies was compared to that of the baseline fan configuration, which did not have these technologies. Sideline acoustic data and hot film flow data were acquired and are used to determine the effectiveness of the various treatments. The material used for the over the rotor treatment was foam metal and two different types were used. The soft stator vanes had several internal cavities tuned to target certain frequencies. In order to accommodate the cavities it was necessary to use a cut-on stator to demonstrate the soft vane concept.

  4. Modeling and performance simulation of 100 MW PTC based solar thermal power plant in Udaipur India

    Directory of Open Access Journals (Sweden)

    Deepak Bishoyi

    2017-09-01

    Full Text Available Solar energy is a key renewable energy source and the most abundant energy source on the globe. Solar energy can be converted into electric energy by two different processes: photovoltaic (PV) conversion and thermodynamic cycles. Concentrated solar power (CSP) is viewed as one of the most promising alternatives in the field of solar energy utilization. The lifetime and efficiency of PV systems are lower than those of CSP technology. A 100 MW parabolic trough solar thermal power plant with 6 h of thermal energy storage has been evaluated in terms of design and thermal performance, based on the System Advisor Model (SAM). A location in Rajasthan receiving an annual DNI of 2248.17 kW h/m2 is chosen for the technical feasibility study of the hypothetical CSP plant. The plant design consists of 194 solar collector loops, with each loop comprising 8 parabolic trough collectors. HITEC solar salt is chosen as the HTF due to its excellent thermodynamic properties. The designed plant can generate annual electricity of 285,288,352 kW h with a plant efficiency of 21%. The proposed design of the PTC-based solar thermal power plant and its performance analysis encourage further innovation and development of solar thermal power plants in India.
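
    A back-of-envelope cross-check of the plant-level figures quoted above divides the annual generation by the annual DNI intercepted by the solar field. Only the generation, DNI and loop/collector counts come from the abstract; the aperture area per collector is an assumed placeholder, so the result is indicative only.

    ```python
    # Back-of-envelope cross-check: annual electricity divided by the annual DNI
    # intercepted by the solar field. The aperture area per collector is an
    # assumed placeholder; only the DNI, generation and loop counts come from
    # the abstract, so the result is indicative only.
    ANNUAL_GEN_KWH = 285_288_352          # from the abstract
    ANNUAL_DNI_KWH_M2 = 2248.17           # from the abstract
    N_LOOPS, COLLECTORS_PER_LOOP = 194, 8 # from the abstract
    APERTURE_PER_COLLECTOR_M2 = 470.0     # hypothetical assumption for illustration

    aperture = N_LOOPS * COLLECTORS_PER_LOOP * APERTURE_PER_COLLECTOR_M2
    solar_to_electric = ANNUAL_GEN_KWH / (ANNUAL_DNI_KWH_M2 * aperture)
    print(f"solar field aperture: {aperture:,.0f} m2")
    print(f"annual solar-to-electric efficiency: {solar_to_electric:.1%}")
    # The result is sensitive to the assumed collector aperture, which is why the
    # abstract's 21% figure should be read together with the SAM field layout.
    ```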

  5. VHDL simulation with access to transistor models

    Science.gov (United States)

    Gibson, J.

    1991-01-01

    Hardware description languages such as VHDL have evolved to aid in the design of systems with large numbers of elements and a wide range of electronic and logical abstractions. For high performance circuits, behavioral models may not be able to efficiently include enough detail to give designers confidence in a simulation's accuracy. One option is to provide a link between the VHDL environment and a transistor level simulation environment. The coupling of the Vantage Analysis Systems VHDL simulator and the NOVA simulator provides the combination of VHDL modeling and transistor modeling.

  6. Creating Simulated Microgravity Patient Models

    Science.gov (United States)

    Hurst, Victor; Doerr, Harold K.; Bacal, Kira

    2004-01-01

    The Medical Operational Support Team (MOST) has been tasked by the Space and Life Sciences Directorate (SLSD) at the NASA Johnson Space Center (JSC) to integrate medical simulation into 1) medical training for ground and flight crews and into 2) evaluations of medical procedures and equipment for the International Space Station (ISS). To do this, the MOST requires patient models that represent the physiological changes observed during spaceflight. Despite the presence of physiological data collected during spaceflight, there is no defined set of parameters that illustrate or mimic a 'space normal' patient. Methods: The MOST culled space-relevant medical literature and data from clinical studies performed in microgravity environments. The areas of focus for data collection were in the fields of cardiovascular, respiratory and renal physiology. Results: The MOST developed evidence-based patient models that mimic the physiology believed to be induced by human exposure to a microgravity environment. These models have been integrated into space-relevant scenarios using a human patient simulator and ISS medical resources. Discussion: Despite the lack of a set of physiological parameters representing 'space normal,' the MOST developed space-relevant patient models that mimic microgravity-induced changes in terrestrial physiology. These models are used in clinical scenarios that will medically train flight surgeons, biomedical flight controllers (biomedical engineers; BME) and, eventually, astronaut-crew medical officers (CMO).

  7. Development and Integration of an Advanced Stirling Convertor Linear Alternator Model for a Tool Simulating Convertor Performance and Creating Phasor Diagrams

    Science.gov (United States)

    Metscher, Jonathan F.; Lewandowski, Edward J.

    2013-01-01

    A simple model of the Advanced Stirling Convertors (ASC) linear alternator and an AC bus controller has been developed and combined with a previously developed thermodynamic model of the convertor for a more complete simulation and analysis of the system performance. The model was developed using Sage, a 1-D thermodynamic modeling program that now includes electro-magnetic components. The convertor, consisting of a free-piston Stirling engine combined with a linear alternator, has sufficiently sinusoidal steady-state behavior to allow for phasor analysis of the forces and voltages acting in the system. A MATLAB graphical user interface (GUI) has been developed to interface with the Sage software for simplified use of the ASC model, calculation of forces, and automated creation of phasor diagrams. The GUI allows the user to vary convertor parameters while fixing different input or output parameters and observe the effect on the phasor diagrams or system performance. The new ASC model and GUI help create a better understanding of the relationship between the electrical component voltages and mechanical forces. This allows better insight into the overall convertor dynamics and performance.

  8. MODELLING, SIMULATING AND OPTIMIZING BOILERS

    DEFF Research Database (Denmark)

    Sørensen, Kim; Condra, Thomas Joseph; Houbak, Niels

    2004-01-01

    In the present work a framework for optimizing the design of boilers for dynamic operation has been developed. A cost function to be minimized during the optimization has been formulated and for the present design variables related to the Boiler Volume and the Boiler load Gradient (i.e. firing rate...... on the boiler) have been defined. Furthermore a number of constraints related to: minimum and maximum boiler load gradient, minimum boiler size, Shrinking and Swelling and Steam Space Load have been defined. For defining the constraints related to the required boiler volume a dynamic model for simulating the boiler...... performance has been developed. Outputs from the simulations are shrinking and swelling of water level in the drum during for example a start-up of the boiler; these figures combined with the requirements with respect to allowable water level fluctuations in the drum defines the requirements with respect to drum...

  9. Modelling and Simulation: An Overview

    OpenAIRE

    McAleer, Michael; Chan, Felix; Oxley, Les

    2013-01-01

    This discussion paper resulted in a publication in 'Selected Papers of the MSSANZ 19th Biennial Conference on Modelling and Simulation Mathematics and Computers in Simulation', 2013, pp. viii. The papers in this special issue of Mathematics and Computers in Simulation cover the following topics: improving judgmental adjustment of model-based forecasts, whether forecast updates are progressive, on a constrained mixture vector autoregressive model, whether all estimators are born equal: the emp...

  10. Notes on modeling and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Redondo, Antonio [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-03-10

    These notes present a high-level overview of how modeling and simulation are carried out by practitioners. The discussion is of a general nature; no specific techniques are examined but the activities associated with all modeling and simulation approaches are briefly addressed. There is also a discussion of validation and verification and, at the end, a section on why modeling and simulation are useful.

  11. A Simulation Approach for Performance Validation during Embedded Systems Design

    Science.gov (United States)

    Wang, Zhonglei; Haberl, Wolfgang; Herkersdorf, Andreas; Wechs, Martin

    Due to the time-to-market pressure, it is highly desirable to design hardware and software of embedded systems in parallel. However, hardware and software are developed mostly using very different methods, so that performance evaluation and validation of the whole system is not an easy task. In this paper, we propose a simulation approach to bridge the gap between model-driven software development and simulation based hardware design, by merging hardware and software models into a SystemC based simulation environment. An automated procedure has been established to generate software simulation models from formal models, while the hardware design is originally modeled in SystemC. As the simulation models are annotated with timing information, performance issues are tackled in the same pass as system functionality, rather than in a dedicated approach.

  12. Modeling and simulation of blood collection systems.

    Science.gov (United States)

    Alfonso, Edgar; Xie, Xiaolan; Augusto, Vincent; Garraud, Olivier

    2012-03-01

    This paper addresses the modeling and simulation of blood collection systems in France for both fixed site and mobile blood collection with walk-in whole blood donors and scheduled plasma and platelet donors. Petri net models are first proposed to precisely describe different blood collection processes, donor behaviors, their material/human resource requirements and relevant regulations. Petri net models are then enriched with quantitative modeling of donor arrivals, donor behaviors, activity times and resource capacity. Relevant performance indicators are defined. The resulting simulation models can be straightforwardly implemented with any simulation language. Numerical experiments are performed to show how the simulation models can be used to select, for different walk-in donor arrival patterns, appropriate human resource planning and donor appointment strategies.

  13. Verifying and Validating Simulation Models

    Energy Technology Data Exchange (ETDEWEB)

    Hemez, Francois M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-02-23

    This presentation is a high-level discussion of the Verification and Validation (V&V) of computational models. Definitions of V&V are given to emphasize that “validation” is never performed in a vacuum; it accounts, instead, for the current state-of-knowledge in the discipline considered. In particular, comparisons between physical measurements and numerical predictions should account for their respective sources of uncertainty. The differences between error (bias), aleatoric uncertainty (randomness) and epistemic uncertainty (ignorance, lack-of-knowledge) are briefly discussed. Four types of uncertainty in physics and engineering are discussed: 1) experimental variability, 2) variability and randomness, 3) numerical uncertainty and 4) model-form uncertainty. Statistical sampling methods are available to propagate, and analyze, variability and randomness. Numerical uncertainty originates from the truncation error introduced by the discretization of partial differential equations in time and space. Model-form uncertainty is introduced by assumptions often formulated to render a complex problem more tractable and amenable to modeling and simulation. The discussion concludes with high-level guidance to assess the “credibility” of numerical simulations, which stems from the level of rigor with which these various sources of uncertainty are assessed and quantified.

  14. Used Nuclear Fuel Loading and Structural Performance Under Normal Conditions of Transport - Modeling, Simulation and Experimental Integration RD&D Plan

    Energy Technology Data Exchange (ETDEWEB)

    Adkins, Harold E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2013-04-01

    Under current U.S. Nuclear Regulatory Commission regulation, it is not sufficient for used nuclear fuel (UNF) to simply maintain its integrity during the storage period, it must maintain its integrity in such a way that it can withstand the physical forces of handling and transportation associated with restaging the fuel and moving it to treatment or recycling facilities, or a geologic repository. Hence it is necessary to understand the performance characteristics of aged UNF cladding and ancillary components under loadings stemming from transport initiatives. Researchers would like to demonstrate that enough information, including experimental support and modeling and simulation capabilities, exists to establish a preliminary determination of UNF structural performance under normal conditions of transport (NCT). This research, development and demonstration (RD&D) plan describes a methodology, including development and use of analytical models, to evaluate loading and associated mechanical responses of UNF rods and key structural components. This methodology will be used to provide a preliminary assessment of the performance characteristics of UNF cladding and ancillary components under rail-related NCT loading. The methodology couples modeling and simulation and experimental efforts currently under way within the Used Fuel Disposition Campaign (UFDC). The methodology will involve limited uncertainty quantification in the form of sensitivity evaluations focused around available fuel and ancillary fuel structure properties exclusively. The work includes collecting information via literature review, soliciting input/guidance from subject matter experts, performing computational analyses, planning experimental measurement and possible execution (depending on timing), and preparing a variety of supporting documents that will feed into and provide the basis for future initiatives. The methodology demonstration will focus on structural performance evaluation of

  15. Simulation Model of a Transient

    DEFF Research Database (Denmark)

    Jauch, Clemens; Sørensen, Poul; Bak-Jensen, Birgitte

    2005-01-01

    This paper describes the simulation model of a controller that enables an active-stall wind turbine to ride through transient faults. The simulated wind turbine is connected to a simple model of a power system. Certain fault scenarios are specified and the turbine shall be able to sustain operati...

  16. Gauss or Bernoulli? A Monte Carlo Comparison of the Performance of the Linear Mixed-Model and the Logistic Mixed-Model Analyses in Simulated Community Trials with a Dichotomous Outcome Variable at the Individual Level.

    Science.gov (United States)

    Hannan, Peter J.; Murray, David M.

    1996-01-01

    A Monte Carlo study compared performance of linear and logistic mixed-model analyses of simulated community trials having specific event rates, intraclass correlations, and degrees of freedom. Results indicate that in studies with adequate denominator degrees of freedom, the researcher may use either method of analysis, with certain cautions. (SLD)

  17. Multiprocessor performance modeling with ADAS

    Science.gov (United States)

    Hayes, Paul J.; Andrews, Asa M.

    1989-01-01

    A graph managing strategy referred to as the Algorithm to Architecture Mapping Model (ATAMM) appears useful for the time-optimized execution of application algorithm graphs in embedded multiprocessors and for the performance prediction of graph designs. This paper reports the modeling of ATAMM in the Architecture Design and Assessment System (ADAS) to make an independent verification of ATAMM's performance prediction capability and to provide a user framework for the evaluation of arbitrary algorithm graphs. Following an overview of ATAMM and its major functional rules are descriptions of the ADAS model of ATAMM, methods to enter an arbitrary graph into the model, and techniques to analyze the simulation results. The performance of a 7-node graph example is evaluated using the ADAS model and verifies the ATAMM concept by substantiating previously published performance results.

  18. Cognitive models embedded in system simulation models

    International Nuclear Information System (INIS)

    Siegel, A.I.; Wolf, J.J.

    1982-01-01

    If we are to discuss and consider cognitive models, we must first come to grips with two questions: (1) What is cognition; (2) What is a model. Presumably, the answers to these questions can provide a basis for defining a cognitive model. Accordingly, this paper first places these two questions into perspective. Then, cognitive models are set within the context of computer simulation models and a number of computer simulations of cognitive processes are described. Finally, pervasive issues are discussed vis-a-vis cognitive modeling in the computer simulation context

  19. A Systematic Analysis of the Sensitivity of Plasma Pharmacokinetics to Detect Differences in the Pulmonary Performance of Inhaled Fluticasone Propionate Products Using a Model-Based Simulation Approach.

    Science.gov (United States)

    Weber, Benjamin; Hochhaus, Guenther

    2015-07-01

    The role of plasma pharmacokinetics (PK) for assessing bioequivalence at the target site, the lung, for orally inhaled drugs remains unclear. A validated semi-mechanistic model, considering the presence of mucociliary clearance in central lung regions, was expanded for quantifying the sensitivity of PK studies in detecting differences in the pulmonary performance (total lung deposition, central-to-peripheral lung deposition ratio, and pulmonary dissolution characteristics) between test (T) and reference (R) inhaled fluticasone propionate (FP) products. PK bioequivalence trials for inhaled FP were simulated based on this PK model for a varying number of subjects and T products. The statistical power to conclude bioequivalence when T and R products are identical was demonstrated to be 90% for approximately 50 subjects. Furthermore, the simulations demonstrated that PK metrics (area under the concentration time curve (AUC) and C max) are capable of detecting differences between T and R formulations of inhaled FP products when the products differ by more than 20%, 30%, and 25% for total lung deposition, central-to-peripheral lung deposition ratio, and pulmonary dissolution characteristics, respectively. These results were derived using a rather conservative risk assessment approach with an error rate of <10%. The simulations thus indicated that PK studies might be a viable alternative to clinical studies comparing pulmonary efficacy biomarkers for slowly dissolving inhaled drugs. PK trials for pulmonary efficacy equivalence testing should be complemented by in vitro studies to avoid false positive bioequivalence assessments that are theoretically possible for some specific scenarios. Moreover, a user-friendly web application for simulating such PK equivalence trials with inhaled FP is provided.
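    The sketch below illustrates, under simplified assumptions, how such PK bioequivalence trials can be simulated: log-normal AUC values are drawn for test and reference products in a parallel design (the actual crossover design and semi-mechanistic lung model of the study are not reproduced), and the two one-sided tests (TOST) procedure is applied to estimate the power to conclude bioequivalence. All numeric parameters are illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def simulate_be_trial(n_per_arm, gmr=1.0, cv=0.3):
    """One simulated parallel-group trial; TOST on log-AUC with a 90% CI."""
    sigma = np.sqrt(np.log(1.0 + cv ** 2))        # log-scale SD from the CV
    log_ref = rng.normal(0.0, sigma, n_per_arm)
    log_test = rng.normal(np.log(gmr), sigma, n_per_arm)
    diff = log_test.mean() - log_ref.mean()
    se = np.sqrt(log_test.var(ddof=1) / n_per_arm + log_ref.var(ddof=1) / n_per_arm)
    t_crit = stats.t.ppf(0.95, 2 * (n_per_arm - 1))
    lo, hi = diff - t_crit * se, diff + t_crit * se
    return np.log(0.80) < lo and hi < np.log(1.25)   # bioequivalence concluded?

def power(n_per_arm, n_trials=2000, gmr=1.0, cv=0.3):
    return np.mean([simulate_be_trial(n_per_arm, gmr, cv) for _ in range(n_trials)])

# Identical T and R products, roughly 50 subjects in total.
print("power with identical products, n=25/arm:", power(25))
```

    Running the same loop with gmr set away from 1.0 (or with the lung-deposition parameters of a mechanistic model perturbed) traces out how quickly the power to conclude bioequivalence drops as the products diverge.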

  20. Simulation - modeling - experiment; Simulation - modelisation - experience

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-07-01

    After two workshops held in 2001 on the same topics, and in order to make a status of the advances in the domain of simulation and measurements, the main goals proposed for this workshop are: the presentation of the state-of-the-art of tools, methods and experiments in the domains of interest of the Gedepeon research group, the exchange of information about the possibilities of use of computer codes and facilities, about the understanding of physical and chemical phenomena, and about development and experiment needs. This document gathers 18 presentations (slides) among the 19 given at this workshop and dealing with: the deterministic and stochastic codes in reactor physics (Rimpault G.); MURE: an evolution code coupled with MCNP (Meplan O.); neutronic calculation of future reactors at EdF (Lecarpentier D.); advance status of the MCNP/TRIO-U neutronic/thermal-hydraulics coupling (Nuttin A.); the FLICA4/TRIPOLI4 thermal-hydraulics/neutronics coupling (Aniel S.); methods of disturbances and sensitivity analysis of nuclear data in reactor physics, application to VENUS-2 experimental reactor (Bidaud A.); modeling for the reliability improvement of an ADS accelerator (Biarotte J.L.); residual gas compensation of the space charge of intense beams (Ben Ismail A.); experimental determination and numerical modeling of phase equilibrium diagrams of interest in nuclear applications (Gachon J.C.); modeling of irradiation effects (Barbu A.); elastic limit and irradiation damage in Fe-Cr alloys: simulation and experiment (Pontikis V.); experimental measurements of spallation residues, comparison with Monte-Carlo simulation codes (Fallot M.); the spallation target-reactor coupling (Rimpault G.); tools and data (Grouiller J.P.); models in high energy transport codes: status and perspective (Leray S.); other ways of investigation for spallation (Audoin L.); neutrons and light particles production at intermediate energies (20-200 MeV) with iron, lead and uranium targets (Le Colley F

  1. Team Culture and Business Strategy Simulation Performance

    Science.gov (United States)

    Ritchie, William J.; Fornaciari, Charles J.; Drew, Stephen A. W.; Marlin, Dan

    2013-01-01

    Many capstone strategic management courses use computer-based simulations as core pedagogical tools. Simulations are touted as assisting students in developing much-valued skills in strategy formation, implementation, and team management in the pursuit of superior strategic performance. However, despite their rich nature, little is known regarding…

  2. Real-Gas Effects in ORC Turbine Flow Simulations : Influence of Thermodynamic Models on Flow Fields and Performance Parameters

    NARCIS (Netherlands)

    Colonna, P.; Rebay, S.; Harinck, J.; Guardone, A.

    2006-01-01

    The analysis and design of turbomachinery is usually performed by means of fluid dynamic computations employing ideal gas laws. This can lead to inaccurate predictions for Organic Rankine Cycle (ORC) turbines, which operate partly in the nonideal thermodynamic region. The objective of this work is to

  3. Terrestrial ecosystem model performance in simulating productivity and its vulnerability to climate change in the northern permafrost region

    DEFF Research Database (Denmark)

    Xia, Jianyang; McGuire, A. David; Lawrence, David

    2017-01-01

    and the maximum rate of carboxylation by the enzyme Rubisco at 25°C (Vcmax_25), respectively. The models also varied in their sensitivities of NPP, GPP, and CUE to historical changes in climate and atmospheric CO2 concentration. These results indicate that model predictive ability of the C cycle in permafrost...... productivity (NPP) and responses to historical climate change in permafrost regions in the Northern Hemisphere. In comparison with the satellite estimate from the Moderate Resolution Imaging Spectroradiometer (MODIS; 246 ± 6 g C m−2 yr−1), most models produced higher NPP (309 ± 12 g C m−2 yr−1) over...... regions can be improved by better representation of the processes controlling CUE and GPPmax as well as their sensitivity to climate change....

  4. Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) verification and validation plan. version 1.

    Energy Technology Data Exchange (ETDEWEB)

    Bartlett, Roscoe Ainsworth; Arguello, Jose Guadalupe, Jr.; Urbina, Angel; Bouchard, Julie F.; Edwards, Harold Carter; Freeze, Geoffrey A.; Knupp, Patrick Michael; Wang, Yifeng; Schultz, Peter Andrew; Howard, Robert (Oak Ridge National Laboratory, Oak Ridge, TN); McCornack, Marjorie Turner

    2011-01-01

    The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M&S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. To meet this objective, NEAMS Waste IPSC M&S capabilities will be applied to challenging spatial domains, temporal domains, multiphysics couplings, and multiscale couplings. A strategic verification and validation (V&V) goal is to establish evidence-based metrics for the level of confidence in M&S codes and capabilities. Because it is economically impractical to apply the maximum V&V rigor to each and every M&S capability, M&S capabilities will be ranked for their impact on the performance assessments of various components of the repository systems. Those M&S capabilities with greater impact will require a greater level of confidence and a correspondingly greater investment in V&V. This report includes five major components: (1) a background summary of the NEAMS Waste IPSC to emphasize M&S challenges; (2) the conceptual foundation for verification, validation, and confidence assessment of NEAMS Waste IPSC M&S capabilities; (3) specifications for the planned verification, validation, and confidence-assessment practices; (4) specifications for the planned evidence information management system; and (5) a path forward for the incremental implementation of this V&V plan.

  5. Improving the performance of a filling line based on simulation

    Science.gov (United States)

    Jasiulewicz-Kaczmarek, M.; Bartkowiak, T.

    2016-08-01

    The paper describes a method of improving the performance of a filling line based on simulation. This study concerns a production line located in a manufacturing centre of an FMCG company. A discrete event simulation model was built using data provided by the maintenance data acquisition system. Two types of failures were identified in the system and were approximated using continuous statistical distributions. The model was validated taking into consideration line performance measures. A brief Pareto analysis of line failures was conducted to identify potential areas of improvement. Two improvement scenarios were proposed and tested via simulation. The outcomes of the simulations were the basis of a financial analysis. NPV and ROI values were calculated taking into account depreciation, profits, losses, the current CIT rate and inflation. A validated simulation model can be a useful tool in the maintenance decision-making process.
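    A minimal sketch of the kind of financial evaluation mentioned above, assuming hypothetical cash flows, tax and discount rates; none of the figures come from the study, and the simple ROI definition used here is only one of several possible conventions.

```python
# Illustrative NPV/ROI calculation for one improvement scenario.
def npv(cash_flows, discount_rate):
    """Net present value of yearly cash flows, year 0 first."""
    return sum(cf / (1.0 + discount_rate) ** year
               for year, cf in enumerate(cash_flows))

investment = 120_000.0                 # year-0 outlay (hypothetical)
yearly_gain_gross = 55_000.0           # extra throughput value per year (hypothetical)
depreciation = investment / 5.0        # straight-line over 5 years
cit_rate = 0.19                        # corporate income tax rate (assumed)
inflation = 0.03
real_discount = (1.0 + 0.08) / (1.0 + inflation) - 1.0   # assumed 8% nominal rate

# After-tax cash flow: tax is paid on (gain - depreciation).
yearly_cash = yearly_gain_gross - cit_rate * (yearly_gain_gross - depreciation)
flows = [-investment] + [yearly_cash] * 5

project_npv = npv(flows, real_discount)
roi = (sum(flows[1:]) - investment) / investment   # simple (undiscounted) ROI
print(f"NPV = {project_npv:,.0f}, ROI = {roi:.1%}")
```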

  6. Equivalent drawbead model in finite element simulations

    NARCIS (Netherlands)

    Carleer, Bart D.; Carleer, B.D.; Meinders, Vincent T.; Huetink, Han; Lee, J.K.; Kinzel, G.L.; Wagoner, R.

    1996-01-01

    In 3D simulations of the deep drawing process the drawbead geometries are seldom included. Therefore equivalent drawbeads are used. In order to investigate the drawbead behaviour a 2D plane strain finite element model was used. For verification of this model experiments were performed. The analyses

  7. TREAT Modeling and Simulation Strategy

    Energy Technology Data Exchange (ETDEWEB)

    DeHart, Mark David [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    This report summarizes a four-phase process used to describe the strategy in developing modeling and simulation software for the Transient Reactor Test Facility. The four phases of this research and development task are identified as (1) full core transient calculations with feedback, (2) experiment modeling, (3) full core plus experiment simulation and (4) quality assurance. The document describes the four phases, the relationship between these research phases, and anticipated needs within each phase.

  8. FASTBUS simulation models in VHDL

    International Nuclear Information System (INIS)

    Appelquist, G.

    1992-11-01

    Four hardware simulation models implementing the FASTBUS protocol are described. The models are written in the VHDL hardware description language to obtain portability, i.e. without relations to any specific simulator. They include two complete FASTBUS devices, a full-duplex segment interconnect and ancillary logic for the segment. In addition, master and slave models using a high level interface to describe FASTBUS operations, are presented. With these models different configurations of FASTBUS systems can be evaluated and the FASTBUS transactions of new devices can be verified. (au)

  9. Complex Simulation Model of Mobile Fading Channel

    Directory of Open Access Journals (Sweden)

    Tomas Marek

    2005-01-01

    Full Text Available In the mobile communication environment the mobile channel is the main obstacle limiting the performance of a wireless system. Modeling of the radio channel consists of two basic fading mechanisms: long-term fading and short-term fading. The contribution deals with simulation of a complex mobile radio channel, which is the channel with all fading components. The simulation model is based on the Clarke-Gans theoretical model for the fading channel and is developed in the MATLAB environment. Simulation results have shown very good agreement with theory. This model was developed for a hybrid adaptation 3G uplink simulator (described in this issue) during the research project VEGA 1/0140/03.
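    A minimal sketch of a Clarke/Jakes-type short-term fading generator using a sum of sinusoids; the Doppler frequency, number of scatterers and sampling rate are illustrative assumptions, and the long-term (shadowing) component of a complete channel model is omitted here.

```python
import numpy as np

def rayleigh_fading(f_d, duration, fs, n_paths=64, seed=0):
    """Sum-of-sinusoids approximation of a Clarke-type flat-fading channel.
    f_d: maximum Doppler shift [Hz], fs: sampling rate [Hz]."""
    rng = np.random.default_rng(seed)
    t = np.arange(0, duration, 1.0 / fs)
    theta = rng.uniform(0, 2 * np.pi, n_paths)   # random angles of arrival
    phi = rng.uniform(0, 2 * np.pi, n_paths)     # random path phases
    doppler = 2 * np.pi * f_d * np.cos(theta)    # per-path Doppler shifts
    # Complex channel gain as a sum of equal-power scattered components.
    h = np.sum(np.exp(1j * (np.outer(t, doppler) + phi)), axis=1) / np.sqrt(n_paths)
    return t, h

t, h = rayleigh_fading(f_d=100.0, duration=0.5, fs=10_000.0)
print("mean envelope power (should be ~1):", np.mean(np.abs(h) ** 2))
```

    A complete long-plus-short-term model would multiply this envelope by a slowly varying log-normal shadowing term.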

  10. Equivalent drawbead performance in deep drawing simulations

    NARCIS (Netherlands)

    Meinders, Vincent T.; Geijselaers, Hubertus J.M.; Huetink, Han

    1999-01-01

    Drawbeads are applied in the deep drawing process to improve the control of the material flow during the forming operation. In simulations of the deep drawing process these drawbeads can be replaced by an equivalent drawbead model. In this paper the usage of an equivalent drawbead model in the

  11. Closed loop models for analyzing engineering requirements for simulators

    Science.gov (United States)

    Baron, S.; Muralidharan, R.; Kleinman, D.

    1980-01-01

    A closed loop analytic model, incorporating a model for the human pilot (namely, the optimal control model), was developed to allow certain simulation design tradeoffs to be evaluated quantitatively. This model was applied to a realistic flight control problem. The resulting model is used to analyze both overall simulation effects and the effects of individual elements. The results show that, as compared to an ideal continuous simulation, the discrete simulation can result in significant performance and/or workload penalties.

  12. Matlab-Based Modeling and Simulations to Study the Performance of Different MPPT Techniques Used for Photovoltaic Systems under Partially Shaded Conditions

    Directory of Open Access Journals (Sweden)

    Jehun Hahm

    2015-01-01

    Full Text Available A pulse-width-modulator (PWM)-based sliding mode controller is developed to study the effects of partial shade, temperature, and insolation on the performance of maximum power point tracking (MPPT) used in photovoltaic (PV) systems. Under partially shaded conditions and varying temperature, PV array characteristics become more complex, with multiple power-voltage maxima. MPPT is an automatic control technique to adjust power interfaces and deliver power for a diverse range of insolation values, temperatures, and partially shaded modules. The PV system is tested using two conventional algorithms: the Perturb and Observe (P&O) algorithm and the Incremental Conductance (IncCond) algorithm, which are simple to implement for a PV array. The proposed method applies a model to simulate the performance of the PV system for solar energy usage, which is compared to the conventional methods under nonuniform insolation, improving the PV system utilization efficiency and allowing optimization of the system performance. The PWM-based sliding mode controller successfully overcomes the issues presented by nonuniform conditions and tracks the global MPP. In this paper, the PV system consists of a solar module under shade connected to a boost converter that is controlled by three different algorithms and is simulated using Matlab/Simulink.
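    As a reference point for the conventional algorithms mentioned above, the sketch below implements a plain Perturb and Observe loop on a toy single-peak P-V curve; it does not reproduce the PWM sliding-mode controller or the partial-shading (multi-peak) case, and the module parameters are assumptions.

```python
import numpy as np

def pv_power(v, v_oc=40.0, i_sc=8.0):
    """Toy single-module P-V curve (no partial shading): one smooth maximum,
    used only to exercise the algorithm."""
    i = i_sc * (1.0 - np.exp((v - v_oc) / 3.0))
    return max(v * i, 0.0)

def perturb_and_observe(v0=20.0, step=0.2, iterations=200):
    """Classic P&O: keep perturbing the operating voltage in the direction
    that increased power on the previous step, otherwise reverse."""
    v, direction = v0, +1.0
    p_prev = pv_power(v)
    for _ in range(iterations):
        v += direction * step
        p = pv_power(v)
        if p < p_prev:          # power dropped -> reverse the perturbation
            direction = -direction
        p_prev = p
    return v, p_prev

v_mpp, p_mpp = perturb_and_observe()
print(f"P&O settled near V = {v_mpp:.1f} V, P = {p_mpp:.1f} W")
```

    Under partial shading the P-V curve has several local maxima, which is exactly where a plain P&O loop can lock onto the wrong peak; that failure mode motivates the global-tracking controller studied in the paper.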

  13. Landscape Modelling and Simulation Using Spatial Data

    Directory of Open Access Journals (Sweden)

    Amjed Naser Mohsin AL-Hameedawi

    2017-08-01

    Full Text Available In this paper a procedure is presented for generating a spatial model of a landscape suited to realistic simulation. The procedure is based on combining spatial data and field measurements with computer graphics reproduced using Blender software. A 3D simulation can then be formed based on the VIS ALL packages. The objective was to build a model utilising GIS, including inputs to the feature attribute data. These efforts concentrated on assembling an adequate spatial prototype, setting out a facilitation scheme and outlining the intended framework, so that the eventual result could be used in simulation form. The procedure covers not only data gathering, fieldwork and model provision, but extends to a new method for producing the respective 3D simulation mapping, which enables decision makers as well as investors to achieve permanent acceptance and an independent navigation system for Geoscience applications.

  14. Well performance model

    International Nuclear Information System (INIS)

    Thomas, L.K.; Evans, C.E.; Pierson, R.G.; Scott, S.L.

    1992-01-01

    This paper describes the development and application of a comprehensive oil or gas well performance model. The model contains six distinct sections: stimulation design, tubing and/or casing flow, reservoir and near-wellbore calculations, production forecasting, wellbore heat transmission, and economics. These calculations may be performed separately or in an integrated fashion with data and results shared among the different sections. The model analysis allows evaluation of all aspects of well completion design, including the effects on future production and overall well economics

  15. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    Most systems involve parameters and variables, which are random variables due to uncertainties. Probabilistic methods are powerful in modelling such systems. In this second part, we describe probabilistic models and Monte Carlo simulation along with 'classical' matrix methods and differential equations as most real ...

  16. Sensitivity Analysis of Simulation Models

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2009-01-01

    This contribution presents an overview of sensitivity analysis of simulation models, including the estimation of gradients. It covers classic designs and their corresponding (meta)models; namely, resolution-III designs including fractional-factorial two-level designs for first-order polynomial

  17. Modelling and Simulation: An Overview

    NARCIS (Netherlands)

    M.J. McAleer (Michael); F. Chan (Felix); L. Oxley (Les)

    2013-01-01

    The papers in this special issue of Mathematics and Computers in Simulation cover the following topics: improving judgmental adjustment of model-based forecasts, whether forecast updates are progressive, on a constrained mixture vector autoregressive model, whether all estimators are

  18. Performance Optimization of the ATLAS Detector Simulation

    CERN Document Server

    AUTHOR|(CDS)2091018

    In the thesis at hand the current performance of the ATLAS detector simulation, part of the Athena framework, is analyzed and possible optimizations are examined. For this purpose the event based sampling profiler VTune Amplifier by Intel is utilized. As the most important metric to measure improvements, the total execution time of the simulation of $t\\bar{t}$ events is also considered. All efforts are focused on structural changes, which do not influence the simulation output and can be attributed to CPU specific issues, especially front end stalls and vectorization. The most promising change is the activation of profile guided optimization for Geant4, which is a critical external dependency of the simulation. Profile guided optimization gives an average improvement of $8.9\\%$ and $10.0\\%$ for the two considered cases at the cost of one additional compilation (instrumented binaries) and execution (training to obtain profiling data) at build time.

  19. Approaching Sentient Building Performance Simulation Systems

    DEFF Research Database (Denmark)

    Negendahl, Kristoffer; Perkov, Thomas; Heller, Alfred

    2014-01-01

    Sentient BPS systems can combine one or more high precision BPS and provide near instantaneous performance feedback directly in the design tool, thus providing speed and precision of building performance in the early design stages. Sentient BPS systems are essentially combining: 1) design tools, 2) parametric tools, 3) BPS tools, 4) dynamic databases, 5) interpolation techniques and 6) prediction techniques as a fast and valid simulation system, in the early design stage....

  20. Vehicle dynamics modeling and simulation

    CERN Document Server

    Schramm, Dieter; Bardini, Roberto

    2014-01-01

    The authors examine in detail the fundamentals and mathematical descriptions of the dynamics of automobiles. In this context different levels of complexity will be presented, starting with basic single-track models up to complex three-dimensional multi-body models. A particular focus is on the process of establishing mathematical models on the basis of real cars and the validation of simulation results. The methods presented are explained in detail by means of selected application scenarios.

  1. Modeling and Simulation: An Overview

    OpenAIRE

    Michael McAleer; Felix Chan; Les Oxley

    2013-01-01

    The papers in this special issue of Mathematics and Computers in Simulation cover the following topics. Improving judgmental adjustment of model-based forecasts, whether forecast updates are progressive, on a constrained mixture vector autoregressive model, whether all estimators are born equal. The empirical properties of some estimators of long memory, characterising trader manipulation in a limit-order driven market, measuring bias in a term-structure model of commodity prices through the c...

  2. Stochastic models: theory and simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Field, Richard V., Jr.

    2008-03-01

    Many problems in applied science and engineering involve physical phenomena that behave randomly in time and/or space. Examples are diverse and include turbulent flow over an aircraft wing, Earth climatology, material microstructure, and the financial markets. Mathematical models for these random phenomena are referred to as stochastic processes and/or random fields, and Monte Carlo simulation is the only general-purpose tool for solving problems of this type. The use of Monte Carlo simulation requires methods and algorithms to generate samples of the appropriate stochastic model; these samples then become inputs and/or boundary conditions to established deterministic simulation codes. While numerous algorithms and tools currently exist to generate samples of simple random variables and vectors, no cohesive simulation tool yet exists for generating samples of stochastic processes and/or random fields. There are two objectives of this report. First, we provide some theoretical background on stochastic processes and random fields that can be used to model phenomena that are random in space and/or time. Second, we provide simple algorithms that can be used to generate independent samples of general stochastic models. The theory and simulation of random variables and vectors is also reviewed for completeness.
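    One of the simple algorithms alluded to above is the spectral representation method for stationary Gaussian processes; the sketch below generates a sample path from an assumed one-sided power spectral density (an Ornstein-Uhlenbeck-like spectrum chosen only for illustration, not taken from the report).

```python
import numpy as np

def spectral_sample(psd, omega_max, n_terms, t, rng):
    """Sample path of a zero-mean stationary Gaussian process by the spectral
    representation method: a sum of cosines with random phases, weighted by
    the one-sided power spectral density `psd`."""
    d_omega = omega_max / n_terms
    omegas = (np.arange(n_terms) + 0.5) * d_omega
    amplitudes = np.sqrt(2.0 * psd(omegas) * d_omega)
    phases = rng.uniform(0.0, 2.0 * np.pi, n_terms)
    return np.sum(amplitudes[:, None] * np.cos(np.outer(omegas, t) + phases[:, None]),
                  axis=0)

# Example: one-sided spectrum S(w) = 2a / (pi*(a^2 + w^2)), unit variance.
a = 1.0
psd = lambda w: 2.0 * a / (np.pi * (a ** 2 + w ** 2))
t = np.linspace(0.0, 50.0, 2000)
x = spectral_sample(psd, omega_max=20.0, n_terms=512, t=t,
                    rng=np.random.default_rng(0))
print("sample variance ~", x.var(), "(target ~ 1.0)")
```

    The generated path can then be fed as an input or boundary condition to a deterministic simulation code, which is the usage pattern the report describes.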

  3. Simulation-based evaluation of the performance of the F test in a linear multilevel model setting with sparseness at the level of the primary unit.

    Science.gov (United States)

    Bruyndonckx, Robin; Aerts, Marc; Hens, Niel

    2016-09-01

    In a linear multilevel model, significance of all fixed effects can be determined using F tests under maximum likelihood (ML) or restricted maximum likelihood (REML). In this paper, we demonstrate that in the presence of primary unit sparseness, the performance of the F test under both REML and ML is rather poor. Using simulations based on the structure of a data example on ceftriaxone consumption in hospitalized children, we studied variability, type I error rate and power in scenarios with a varying number of secondary units within the primary units. In general, the variability in the estimates for the effect of the primary unit decreased as the number of secondary units increased. In the presence of singletons (i.e., only one secondary unit within a primary unit), REML consistently outperformed ML, although even under REML the performance of the F test was found inadequate. When modeling the primary unit as a random effect, the power was lower while the type I error rate was unstable. The options of dropping, regrouping, or splitting the singletons could solve either the problem of a high type I error rate or a low power, while worsening the other. The permutation test appeared to be a valid alternative as it outperformed the F test, especially under REML. We conclude that in the presence of singletons, one should be careful in using the F test to determine the significance of the fixed effects, and propose the permutation test (under REML) as an alternative. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
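    A heavily simplified sketch of the permutation-test idea for a fixed effect assigned at the primary-unit level: instead of fitting the mixed model by REML, cluster means are compared and the primary-unit labels are permuted. The sample sizes, effect size and test statistic below are illustrative assumptions, not the simulation design of the paper.

```python
import numpy as np

rng = np.random.default_rng(11)

def make_data(n_hospitals=12, treatment_effect=1.0):
    """Outcomes on secondary units nested in hospitals, some of them singletons;
    the 'treatment' varies only at the hospital (primary-unit) level."""
    sizes = rng.integers(1, 6, n_hospitals)                 # 1..5 admissions each
    treat = rng.permutation([0, 1] * (n_hospitals // 2))
    hospital_effects = rng.normal(0.0, 0.5, n_hospitals)
    outcomes = [rng.normal(hospital_effects[h] + treatment_effect * treat[h],
                           1.0, sizes[h]) for h in range(n_hospitals)]
    return treat, [y.mean() for y in outcomes]

def permutation_p_value(treat, hospital_means, n_perm=5000):
    treat = np.asarray(treat)
    means = np.asarray(hospital_means)
    def stat(labels):
        return means[labels == 1].mean() - means[labels == 0].mean()
    observed = stat(treat)
    perm_stats = [stat(rng.permutation(treat)) for _ in range(n_perm)]
    return np.mean(np.abs(perm_stats) >= abs(observed))

treat, means = make_data()
print("permutation p-value:", permutation_p_value(treat, means))
```

    Permuting labels across whole primary units respects the clustering, which is why the test keeps its nominal type I error rate even when many clusters are singletons.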

  4. Model for Simulation Atmospheric Turbulence

    DEFF Research Database (Denmark)

    Lundtang Petersen, Erik

    1976-01-01

    A method that produces realistic simulations of atmospheric turbulence is developed and analyzed. The procedure makes use of a generalized spectral analysis, often called a proper orthogonal decomposition or the Karhunen-Loève expansion. A set of criteria, emphasizing a realistic appearance...... eigenfunctions and estimates of the distributions of the corresponding expansion coefficients. The simulation method utilizes the eigenfunction expansion procedure to produce preliminary time histories of the three velocity components simultaneously. As a final step, a spectral shaping procedure is then applied....... The method is unique in modeling the three velocity components simultaneously, and it is found that important cross-statistical features are reasonably well-behaved. It is concluded that the model provides a practical, operational simulator of atmospheric turbulence....
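    A minimal single-component sketch of the empirical Karhunen-Loève (proper orthogonal) decomposition and resynthesis idea, using an AR(1) surrogate ensemble instead of measured wind records and Gaussian expansion coefficients; the simultaneous three-component treatment and the final spectral shaping step of the method are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(2)

# Surrogate ensemble of turbulence-like records (AR(1) processes), standing
# in for the measured records used to build the expansion.
n_records, n_samples, rho = 200, 256, 0.95
ensemble = np.zeros((n_records, n_samples))
for k in range(n_records):
    noise = rng.normal(0.0, np.sqrt(1 - rho ** 2), n_samples)
    for i in range(1, n_samples):
        ensemble[k, i] = rho * ensemble[k, i - 1] + noise[i]

# Empirical Karhunen-Loeve (proper orthogonal) decomposition via SVD.
mean = ensemble.mean(axis=0)
u, s, vt = np.linalg.svd(ensemble - mean, full_matrices=False)
eigenfunctions = vt                     # rows: orthonormal modes
variances = s ** 2 / (n_records - 1)    # variance carried by each mode

# Synthesize a new realization from the leading modes with independent
# Gaussian coefficients (the simplest distributional assumption).
n_modes = 20
coeffs = rng.normal(0.0, np.sqrt(variances[:n_modes]))
simulated = mean + coeffs @ eigenfunctions[:n_modes]

print("energy captured by 20 modes:", variances[:n_modes].sum() / variances.sum())
print("std of simulated record:", simulated.std())
```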

  5. Performance comparison of low and high temperature polymer electrolyte membrane fuel cells. Experimental examinations, modelling and numerical simulation; Leistungsvergleich von Nieder- und Hochtemperatur-Polymerelektrolytmembran-Brennstoffzellen. Experimentelle Untersuchungen, Modellierung und numerische Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Loehn, Helmut

    2010-11-03

    danger of washing out of the phosphoric acid. In an additional test row the Celtec-P-1000 HT-MEA was subjected to temperature change cycles (40 - 160 C), which led to irreversible voltage losses. In a final test row, performance tests were carried out with an HT-PEM fuel cell stack (16 cells / 1 kW), developed in the fuel cell research centre of Volkswagen with a special gas diffusion electrode intended to avoid degradation at low temperatures. In these examinations no irreversible voltage losses could be detected, but the tests had to be aborted because of leakage problems. The insight gained from the experimental examinations into the superior operating behaviour and further advantages of the HT-PEMFC in comparison to the LT-PEMFC was crucial for the construction, in the theoretical part of this thesis, of a simulation model for a single HT-PEM fuel cell that should also be suitable as a process simulation model for the computer-based development of a virtual fuel cell within the interdisciplinary project "Virtual Fuel Cell" at the TU Darmstadt. The model is a numerical 2D "along the channel" model constructed with the finite element software COMSOL Multiphysics (version 3.5a). The stationary, one-phase model comprises altogether ten dependent variables in seven application modules in a highly complex, coupled nonlinear system of equations with 33713 degrees of freedom (1675 rectangle elements with 1768 nodes). The simulation model describes the mass transport processes and the electro-chemical reactions in an HT-PEM fuel cell with good accuracy, and the model was validated by comparing its results with experimental data. The 2D model is therefore basically suitable as a process simulation model for the design of a virtual HT-PEM fuel cell. (orig.)

  6. Modeling control in manufacturing simulation

    NARCIS (Netherlands)

    Zee, Durk-Jouke van der; Chick, S.; Sánchez, P.J.; Ferrin, D.; Morrice, D.J.

    2003-01-01

    A significant shortcoming of traditional simulation languages is the lack of attention paid to the modeling of control structures, i.e., the humans or systems responsible for manufacturing planning and control, their activities and the mutual tuning of their activities. Mostly they are hard coded

  7. Design and simulation of a new energy conscious system (ventilation and thermal performance simulation)

    Energy Technology Data Exchange (ETDEWEB)

    Gadi, Mohamed B. [Nottingham Univ., School of the Built Environment, Nottingham (United Kingdom)

    2000-04-01

    This paper presents the results of simulating the ventilation and thermal performance of a new passive cooling and heating system. The new system was integrated into the roof of a typical contemporary North African house, which was modelled and mounted inside a wind tunnel for natural ventilation simulation. Thermal performance of the new system was simulated using a new computer programme (BTS), developed by the author. Results are presented in terms of indoor temperature and CATD and HATD, which are newly introduced concepts for defining the building cooling and heating loads. (Author)

  8. Predictors of laparoscopic simulation performance among practicing obstetrician gynecologists.

    Science.gov (United States)

    Mathews, Shyama; Brodman, Michael; D'Angelo, Debra; Chudnoff, Scott; McGovern, Peter; Kolev, Tamara; Bensinger, Giti; Mudiraj, Santosh; Nemes, Andreea; Feldman, David; Kischak, Patricia; Ascher-Walsh, Charles

    2017-11-01

    While simulation training has been established as an effective method for improving laparoscopic surgical performance in surgical residents, few studies have focused on its use for attending surgeons, particularly in obstetrics and gynecology. Surgical simulation may have a role in improving and maintaining proficiency in the operating room for practicing obstetrician gynecologists. We sought to determine if parameters of performance for validated laparoscopic virtual simulation tasks correlate with surgical volume and characteristics of practicing obstetricians and gynecologists. All gynecologists with laparoscopic privileges (n = 347) from 5 academic medical centers in New York City were required to complete a laparoscopic surgery simulation assessment. The physicians took a presimulation survey gathering physician self-reported characteristics and then performed 3 basic skills tasks (enforced peg transfer, lifting/grasping, and cutting) on the LapSim virtual reality laparoscopic simulator (Surgical Science Ltd, Gothenburg, Sweden). The association between simulation outcome scores (time, efficiency, and errors) and self-rated clinical skills measures (self-rated laparoscopic skill score or surgical volume category) was examined with regression models. The average number of laparoscopic procedures per month was a significant predictor of total time on all 3 tasks (P = .001 for peg transfer; P = .041 for lifting and grasping; P simulation performance as it correlates to active physician practice, further studies may help assess skill and individualize training to maintain skill levels as case volumes fluctuate. Copyright © 2017 Elsevier Inc. All rights reserved.

  9. A Modeling & Simulation Implementation Framework for Large-Scale Simulation

    Directory of Open Access Journals (Sweden)

    Song Xiao

    2012-10-01

    Full Text Available Classical High Level Architecture (HLA) systems face development problems in large-scale complex simulation applications because they lack support for fine-grained component integration and interoperation. To provide efficient methods for this issue, an extensible, reusable and composable simulation framework is proposed. To promote reusability from coarse-grained federates to fine-grained components, this paper proposes a modelling & simulation framework which consists of a component-based architecture, modelling methods, and simulation services to support and simplify the construction of complex simulation applications. Moreover, a standard process and simulation tools are developed to ensure the rapid and effective development of simulation applications.

  10. Evaluating performance of container terminal operation using simulation

    Science.gov (United States)

    Nawawi, Mohd Kamal Mohd; Jamil, Fadhilah Che; Hamzah, Firdaus Mohamad

    2015-05-01

    A container terminal is a facility where containers are transshipped from one mode of transport to another. Congestion problems lead to a decrease in the customers' level of satisfaction. This study presents an application of simulation; its main objective is to develop a model of the current operation and evaluate the performance of the container terminal. The performance measures used in this study to evaluate the container terminal model are the average waiting time in queue, the average process time at berth, the number of vessels entering the berth and resource utilization. Simulation was found to be a suitable technique for this study. The results from the simulation model helped to solve the congestion problem that occurred in the container terminal.
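    A minimal sketch of the queueing logic behind such a terminal model, assuming a single berth with exponential inter-arrival and service times and using Lindley's recursion for waiting times; the real model's layout, resources and arrival patterns are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_berth(n_vessels=10_000, mean_interarrival=6.0, mean_service=4.5):
    """Single-berth container terminal as a FIFO queue (times in hours).
    Waiting times follow Lindley's recursion:
    W[k] = max(0, W[k-1] + S[k-1] - A[k])."""
    interarrivals = rng.exponential(mean_interarrival, n_vessels)
    services = rng.exponential(mean_service, n_vessels)
    wait = np.zeros(n_vessels)
    for k in range(1, n_vessels):
        wait[k] = max(0.0, wait[k - 1] + services[k - 1] - interarrivals[k])
    # Busy time divided by (approximate) total elapsed time.
    utilization = services.sum() / (interarrivals.sum() + wait[-1] + services[-1])
    return wait.mean(), services.mean(), utilization

avg_wait, avg_service, rho = simulate_berth()
print(f"avg waiting {avg_wait:.2f} h, avg service {avg_service:.2f} h, "
      f"utilization {rho:.2f}")
```

    The same recursion generalises to multiple berths by keeping one "next free" time per berth, which is essentially what a discrete event simulation package does internally.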

  11. Simulating and stimulating performance: Introducing distributed simulation to enhance musical learning and performance

    Directory of Open Access Journals (Sweden)

    Aaron eWilliamon

    2014-02-01

    Full Text Available Musicians typically rehearse far away from their audiences and in practice rooms that differ significantly from the concert venues in which they aspire to perform. Due to the high costs and inaccessibility of such venues, much current international music training lacks repeated exposure to realistic performance situations, with students learning all too late (or not at all) how to manage performance stress and the demands of their audiences. Virtual environments have been shown to be an effective training tool in the fields of medicine and sport, offering practitioners access to real-life performance scenarios but with lower risk of negative evaluation and outcomes. The aim of this research was to design and test the efficacy of simulated performance environments in which conditions of real performance could be recreated. Advanced violin students (n=11) were recruited to perform in two simulations: a solo recital with a small virtual audience and an audition situation with three expert virtual judges. Each simulation contained back-stage and on-stage areas, life-sized interactive virtual observers, and pre- and post-performance protocols designed to match those found at leading international performance venues. Participants completed a questionnaire on their experiences of using the simulations. Results show that both simulated environments offered realistic experience of performance contexts and were rated particularly useful for developing performance skills. For a subset of 7 violinists, state anxiety and electrocardiographic data were collected during the simulated audition and an actual audition with real judges. Results display comparable levels of reported state anxiety and patterns of heart rate variability in both situations, suggesting that responses to the simulated audition closely approximate those of a real audition. The findings are discussed in relation to their implications, both generalizable and individual-specific, for

  12. A rainfall simulation model for agricultural development in Bangladesh

    Directory of Open Access Journals (Sweden)

    M. Sayedur Rahman

    2000-01-01

    Full Text Available A rainfall simulation model based on a first-order Markov chain has been developed to simulate the annual variation in rainfall amount that is observed in Bangladesh. The model has been tested in the Barind Tract of Bangladesh. Few significant differences were found between the actual and simulated seasonal, annual and average monthly rainfall. The distribution of the number of successes is asymptotically normal. When actual and simulated daily rainfall data were used to drive a crop simulation model, there was no significant difference in the rice yield response. The results suggest that the rainfall simulation model performs adequately for many applications.
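    A minimal sketch of a first-order Markov-chain rainfall generator of the kind described: a two-state wet/dry chain for daily occurrence with gamma-distributed wet-day amounts. The transition probabilities and gamma parameters are illustrative, not fitted to the Barind Tract data.

```python
import numpy as np

rng = np.random.default_rng(7)

# First-order two-state Markov chain for daily rain occurrence, with
# gamma-distributed amounts on wet days (parameters are illustrative).
p_wet_given_dry = 0.25    # P(wet today | dry yesterday)
p_wet_given_wet = 0.60    # P(wet today | wet yesterday)
gamma_shape, gamma_scale = 0.8, 12.0   # wet-day amount distribution, mm

def simulate_year(n_days=365):
    rain = np.zeros(n_days)
    wet = False
    for d in range(n_days):
        p = p_wet_given_wet if wet else p_wet_given_dry
        wet = rng.random() < p
        if wet:
            rain[d] = rng.gamma(gamma_shape, gamma_scale)
    return rain

annual_totals = [simulate_year().sum() for _ in range(500)]
print("mean simulated annual rainfall [mm]:", round(np.mean(annual_totals), 1))
```

    In a full weather generator the transition probabilities and gamma parameters would be estimated month by month from the station record, which is how the seasonal variation is reproduced.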

  13. Nonlinear distortion in wireless systems modeling and simulation with Matlab

    CERN Document Server

    Gharaibeh, Khaled M

    2011-01-01

    This book covers the principles of modeling and simulation of nonlinear distortion in wireless communication systems with MATLAB simulations and techniques. In this book, the author describes the principles of modeling and simulation of nonlinear distortion in single and multichannel wireless communication systems using both deterministic and stochastic signals. Models and simulation methods of nonlinear amplifiers explain in detail how to analyze and evaluate the performance of data communication links under nonlinear amplification. The book addresses the analysis of nonlinear systems

  14. Validation process of simulation model

    International Nuclear Information System (INIS)

    San Isidro, M. J.

    1998-01-01

    A methodology for the empirical validation of any detailed simulation model is presented. This kind of validation is always related to an experimental case. The empirical validation has a residual sense, because the conclusions are based on comparisons between simulated outputs and experimental measurements. This methodology will guide us in detecting the failures of the simulation model. Furthermore, it can be used as a guide in the design of subsequent experiments. Three steps can be clearly differentiated: Sensitivity analysis, which can be performed with DSA (differential sensitivity analysis) or MCSA (Monte-Carlo sensitivity analysis). Finding the optimal domains of the input parameters, for which a procedure based on Monte-Carlo methods and cluster techniques has been developed. Residual analysis, carried out in the time domain and in the frequency domain using correlation analysis and spectral analysis. As an application of this methodology, the validation carried out on a thermal simulation model of buildings, Esp., is presented, studying the behaviour of building components in a Test Cell of LECE of CIEMAT. (Author) 17 refs

  15. Analyzing Strategic Business Rules through Simulation Modeling

    Science.gov (United States)

    Orta, Elena; Ruiz, Mercedes; Toro, Miguel

    Service Oriented Architecture (SOA) holds promise for business agility since it allows business processes to change to meet new customer demands or market needs without causing a cascade effect of changes in the underlying IT systems. Business rules are the instrument chosen to help business and IT collaborate. In this paper, we propose the utilization of simulation models to model and simulate strategic business rules that are then disaggregated at different levels of an SOA architecture. Our proposal is aimed at helping to find a good configuration for strategic business objectives and IT parameters. The paper includes a case study where a simulation model is built to help business decision-making in a context where finding a good configuration for different business parameters and performance is too complex to analyze by trial and error.

  16. Evaluation of outbreak detection performance using multi-stream syndromic surveillance for influenza-like illness in rural Hubei Province, China: a temporal simulation model based on healthcare-seeking behaviors.

    Directory of Open Access Journals (Sweden)

    Yunzhou Fan

    Full Text Available BACKGROUND: Syndromic surveillance promotes the early detection of disease outbreaks. Although syndromic surveillance has increased in developing countries, performance on outbreak detection, particularly in the case of multi-stream surveillance, has scarcely been evaluated in rural areas. OBJECTIVE: This study introduces a temporal simulation model based on healthcare-seeking behaviors to evaluate the performance of multi-stream syndromic surveillance for influenza-like illness. METHODS: Data were obtained in six towns of rural Hubei Province, China, from April 2012 to June 2013. A Susceptible-Exposed-Infectious-Recovered model generated 27 scenarios of simulated influenza A (H1N1) outbreaks, which were converted into corresponding simulated syndromic datasets through the healthcare-seeking behaviors model. We then superimposed the converted syndromic datasets onto the baselines obtained to create the testing datasets. Outbreak detection performance of single-stream surveillance of clinic visits, of the frequency of over-the-counter drug purchases, and of school absenteeism, and of multi-stream surveillance of their combinations, was evaluated using receiver operating characteristic curves and activity monitoring operation curves. RESULTS: In the six towns examined, clinic visit surveillance and school absenteeism surveillance exhibited better outbreak detection performance than over-the-counter drug purchase frequency surveillance; the performance of multi-stream surveillance was preferable to single-stream surveillance, particularly at low specificity (Sp <90%). CONCLUSIONS: The temporal simulation model based on healthcare-seeking behaviors offers an accessible method for evaluating the performance of multi-stream surveillance.
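    A minimal sketch of the Susceptible-Exposed-Infectious-Recovered (SEIR) component used to generate outbreak scenarios; the parameter values are illustrative, and the subsequent conversion of new infections into syndromic data streams through the healthcare-seeking behaviour model is not reproduced here.

```python
import numpy as np
from scipy.integrate import solve_ivp

def seir(t, y, beta, sigma, gamma):
    """Standard SEIR equations; beta = transmission rate,
    1/sigma = mean latent period, 1/gamma = mean infectious period."""
    s, e, i, r = y
    n = s + e + i + r
    new_infections = beta * s * i / n
    return [-new_infections,
            new_infections - sigma * e,
            sigma * e - gamma * i,
            gamma * i]

n_pop = 50_000
y0 = [n_pop - 10, 0, 10, 0]                  # 10 initial infectious cases
beta, sigma, gamma = 0.6, 1 / 2.0, 1 / 3.0   # assumed per-day rates
sol = solve_ivp(seir, (0, 120), y0, args=(beta, sigma, gamma), max_step=0.5)

print("peak infectious count:", int(sol.y[2].max()))
print("final attack rate:", round(sol.y[3, -1] / n_pop, 3))
```

    Varying beta and the initial case count across runs yields a family of outbreak curves like the 27 scenarios described; each curve of daily new infections would then be thinned by care-seeking and reporting probabilities to produce the clinic, pharmacy and absenteeism streams.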

  17. Simulation and Performance of Data Communication using AMSS

    Science.gov (United States)

    Ripamonti, Claudio; Konangi, Vijay K.; Kerczewski, Robert J.

    2006-01-01

    This paper reports the findings of a simulation of the Aeronautical Mobile Satellite Service (AMSS) to be used in the ATN (Aeronautical Telecommunications Network). The models of the protocols used in this simulation were designed to be compliant with the International Civil Aviation Organization (ICAO) Standards and Recommended Practices (SARP). The focus of this research is on the data communication capabilities of the AMSS. The simulated performance characteristics for a region of the AMSS are presented. The results are analyzed to determine the efficiency, limitations, and behavior of this service for the foreseen data communication

  18. Computer Simulation Performed for Columbia Project Cooling System

    Science.gov (United States)

    Ahmad, Jasim

    2005-01-01

    This demo shows a high-fidelity simulation of the air flow in the main computer room housing the Columbia (10,024 Intel Itanium processors) system. The simulation assesses the performance of the cooling system, identifies deficiencies, and recommends modifications to eliminate them. It used two in-house software packages on NAS supercomputers: Chimera Grid Tools to generate a geometric model of the computer room, and the OVERFLOW-2 code for fluid and thermal simulation. This state-of-the-art technology can be easily extended to provide a general capability for air flow analyses in any modern computer room.

  19. Study on bamboo gluing performance numerical simulation

    Science.gov (United States)

    Zhao, Z. R.; Sun, W. H.; Sui, X. M.; Zhang, X. F.

    2018-01-01

    Bamboo glued timber is a green building material that can be widely used for beams and columns in modern buildings. Existing bamboo glued timber is usually produced from bamboo columns or from bamboo bundles rolled from such columns, and the performance of the new product is determined by the adhesion behavior of the bamboo. On this basis, a cohesive damage model of the bamboo glue line is created and validated against experimental results; the model proposed in this work agrees with the experiments. The relation between bamboo bonding length and gluing performance is analysed. The model is helpful for the application of bamboo integrated timber.

  20. Experiments performed with bubbly flow in vertical pipes at different flow conditions covering the transition region: simulation by coupling Eulerian, Lagrangian and 3D random walks models

    Science.gov (United States)

    Muñoz-Cobo, José; Chiva, Sergio; El Aziz Essa, Mohamed; Mendes, Santos

    2012-08-01

    Two-phase flow experiments with different superficial velocities of gas and water were performed in a vertical upward isothermal cocurrent air-water flow column, with conditions ranging from bubbly flow with very low void fraction to transition flow with some cap and slug bubbles and void fractions around 25%. The superficial velocities of the liquid and gas phases were varied from 0.5 to 3 m/s and from 0 to 0.6 m/s, respectively. Also, to check the effect of changing the surface tension on the previous experiments, small amounts of 1-butanol were added to the water; these amounts ranged from 9 to 75 ppm and changed the surface tension. This study is interesting because in real cases the surface tension of the water diminishes with temperature, so with this kind of experiment the effect of changing the temperature on the void fraction distribution can be studied indirectly. The following axial and radial distributions were measured in all these experiments: void fraction, interfacial area concentration, interfacial velocity, Sauter mean diameter and turbulence intensity. The range of gas superficial velocities in these experiments covered the range from bubbly flow to the transition to cap/slug flow. Under transition flow conditions two groups of bubbles were distinguished in the experiments: the small spherical bubbles and the cap/slug bubbles. Special interest was devoted to the transition region from bubbly to cap/slug flow; the goal was to understand the physical phenomena that take place during this transition. A set of numerical simulations of some of these experiments for bubbly flow conditions has been performed by coupling a Lagrangian code, which tracks the three-dimensional motion of the individual bubbles in cylindrical coordinates inside the field of the carrier liquid, to an Eulerian model that computes the magnitudes of the continuous phase, and to a 3D random walk model that takes into account the fluctuations in the velocity field of the
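
    The coupling described in the last sentence pairs each tracked bubble with a fluctuating liquid velocity sampled from a random walk. The sketch below is a generic eddy-lifetime-style random walk for that fluctuating component only, not the authors' coupled Eulerian-Lagrangian code; the turbulent kinetic energy, eddy lifetime and time step are assumed values.

```python
# Illustrative discrete random-walk model for the fluctuating liquid velocity
# "seen" by a tracked bubble (not the authors' implementation).
import numpy as np

rng = np.random.default_rng(0)

def fluctuating_velocity(k_turb, tau_eddy, dt, n_steps):
    """Sample a 3D fluctuating velocity via a simple eddy-lifetime random walk.

    k_turb   : turbulent kinetic energy of the carrier liquid [m^2/s^2] (assumed)
    tau_eddy : eddy lifetime [s] (assumed)
    """
    sigma = np.sqrt(2.0 * k_turb / 3.0)       # isotropic rms fluctuation
    u_prime = np.zeros((n_steps, 3))
    current = sigma * rng.standard_normal(3)
    time_in_eddy = 0.0
    for n in range(n_steps):
        if time_in_eddy >= tau_eddy:          # bubble enters a new eddy
            current = sigma * rng.standard_normal(3)
            time_in_eddy = 0.0
        u_prime[n] = current
        time_in_eddy += dt
    return u_prime

print(fluctuating_velocity(k_turb=0.01, tau_eddy=0.05, dt=0.001, n_steps=5))
```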

  1. Model continuity in discrete event simulation: A framework for model-driven development of simulation models

    NARCIS (Netherlands)

    Cetinkaya, D; Verbraeck, A.; Seck, MD

    2015-01-01

    Most of the well-known modeling and simulation (M&S) methodologies state the importance of conceptual modeling in simulation studies, and they suggest the use of conceptual models during the simulation model development process. However, only a limited number of methodologies refers to how to

  2. Assessment of Molecular Modeling & Simulation

    Energy Technology Data Exchange (ETDEWEB)

    None

    2002-01-03

    This report reviews the development and applications of molecular and materials modeling in Europe and Japan in comparison to those in the United States. Topics covered include computational quantum chemistry, molecular simulations by molecular dynamics and Monte Carlo methods, mesoscale modeling of material domains, molecular-structure/macroscale property correlations like QSARs and QSPRs, and related information technologies like informatics and special-purpose molecular-modeling computers. The panel's findings include the following: The United States leads this field in many scientific areas. However, Canada has particular strengths in DFT methods and homogeneous catalysis; Europe in heterogeneous catalysis, mesoscale, and materials modeling; and Japan in materials modeling and special-purpose computing. Major government-industry initiatives are underway in Europe and Japan, notably in multi-scale materials modeling and in development of chemistry-capable ab-initio molecular dynamics codes.

  3. Simulating performance of solar cells with spectral downshifting layers

    NARCIS (Netherlands)

    van Sark, W.G.J.H.M.

    2008-01-01

    In order to estimate the performance of solar cells with downshifters under realistic irradiation conditions we used spectral distributions as they may be found outdoors. The spectral distributions were generated on a minutely basis by means of the spectrum simulation model SEDES2, using minutely

  4. Numerical simulation of the separating performance of hydrocyclones

    Energy Technology Data Exchange (ETDEWEB)

    Ba, Z.S.; Wang, H.L. [Institute of Pressure Vessel and Process Equipment, East China University of Science and Technology, Shanghai (China)

    2006-10-15

    The flow behavior in hydrocyclones is quite complex. The Computational Fluid Dynamics (CFD) method was used to simulate the flow fields inside a hydrocyclone in order to improve its separation efficiency. The RSM turbulent model (Reynolds Stress Model), which abandons the isotropic eddy-viscosity hypothesis, was used to analyze the highly swirling flow fields in hydrocyclones. The ASM Model (Algebraic Slip Mixture Model) was used to simulate the separation performance. The volume fraction distribution and grade efficiency curve are given. The separating efficiency for 60 µm water particles is more than 90%. The majority of 60 µm water particles are carried to the underflow. An increase in particle size will improve the efficiency by increasing the centrifugal force on the particles. Based on the simulation, the effects of the overflow tube dimensions on the separation performance were studied. The overflow tube dimensions of the hydrocyclone were modified, and the results showed that the Reynolds Stress Model successfully predicted the characteristics of the flow, and the simulated performances were in good agreement with those obtained by tests. (Abstract Copyright [2006], Wiley Periodicals, Inc.)

  5. Comparison of turbulence measurements from DIII-D low-mode and high-performance plasmas to turbulence simulations and models

    International Nuclear Information System (INIS)

    Rhodes, T.L.; Leboeuf, J.-N.; Sydora, R.D.; Groebner, R.J.; Doyle, E.J.; McKee, G.R.; Peebles, W.A.; Rettig, C.L.; Zeng, L.; Wang, G.

    2002-01-01

    Measured turbulence characteristics (correlation lengths, spectra, etc.) in low-confinement (L-mode) and high-performance plasmas in the DIII-D tokamak [Luxon et al., Proceedings Plasma Physics and Controlled Nuclear Fusion Research 1986 (International Atomic Energy Agency, Vienna, 1987), Vol. I, p. 159] show many similarities with the characteristics determined from turbulence simulations. Radial correlation lengths Δr of density fluctuations from L-mode discharges are found to be numerically similar to the ion poloidal gyroradius ρ_θ,s, or 5-10 times the ion gyroradius ρ_s, over the radial region 0.2 […]. To determine whether Δr scales as ρ_θ,s or as 5-10 times ρ_s, an experiment was performed which modified ρ_θ,s while keeping other plasma parameters approximately fixed. It was found that the experimental Δr did not scale as ρ_θ,s, which was similar to low-resolution UCAN simulations. Finally, both experimental measurements and gyrokinetic simulations indicate a significant reduction in the radial correlation length in high-performance quiescent double barrier discharges, as compared to normal L-mode, consistent with reduced transport in these high-performance plasmas

  6. Facility equipment performance evaluation using microcomputer simulation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chockie, A.D.; Hostick, C.J.

    1985-11-01

    The Pacific Northwest Laboratory (PNL) recently developed a facility performance assessment model as part of the US Department of Energy-sponsored monitored retrievable storage (MRS) program at PNL. The objective in the development of this model was to assist in the evaluation of the operational performance of the MRS facility design alternatives. The microcomputer-based simulation model provided a technique for the analysis of the design and performance of alternative MRS facility systems. The model was applied to the analysis of the material flow, equipment capability, and facility layout of various designs for a facility to receive and canister spent fuel from commercial nuclear power plants. Programs were also developed that evaluated alternative facility operating schedules and facility-equipment designs. The paper is a review of the facility performance assessment model and its advantages and benefits in the analysis of alternative facility designs employing varying degrees of remote handling capability.

  7. Numerical Simulation and Performance Analysis of Twin Screw Air Compressors

    Directory of Open Access Journals (Sweden)

    W. S. Lee

    2001-01-01

    Full Text Available A theoretical model is proposed in this paper in order to study the performance of oil-less and oil-injected twin screw air compressors. Based on this model, a computer simulation program is developed and the effects of different design parameters including rotor profile, geometric clearance, oil-injected angle, oil temperature, oil flow rate, built-in volume ratio and other operation conditions on the performance of twin screw air compressors are investigated. The simulation program gives us output variables such as specific power, compression ratio, compression efficiency, volumetric efficiency, and discharge temperature. Some of the above results are then compared with experimentally measured data and good agreement is found between the simulation results and the measured data.

  8. Facility/equipment performance evaluation using microcomputer simulation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chockie, A.D.; Hostick, C.J.

    1985-08-01

    A computer simulation analysis model was developed at the Pacific Northwest Laboratory to assist in assuring the adequacy of the Monitored Retrievable Storage facility design to meet the specified spent nuclear fuel throughput requirements. The microcomputer-based model was applied to the analysis of material flow, equipment capability and facility layout. The simulation analysis evaluated uncertainties concerning both facility throughput requirements and process duration times as part of the development of a comprehensive estimate of facility performance. The evaluations provided feedback into the design review task to identify areas where design modifications should be considered.

  9. Wedge Experiment Modeling and Simulation for Reactive Flow Model Calibration

    Science.gov (United States)

    Maestas, Joseph T.; Dorgan, Robert J.; Sutherland, Gerrit T.

    2017-06-01

    Wedge experiments are a typical method for generating pop-plot data (run-to-detonation distance versus input shock pressure), which is used to assess an explosive material's initiation behavior. Such data can be utilized to calibrate reactive flow models by running hydrocode simulations and successively tweaking model parameters until a match with experiment is achieved. These simulations are typically performed in 1D and use a flyer impact to achieve the prescribed shock loading pressure. In this effort, a wedge experiment performed at the Army Research Lab (ARL) was modeled using CTH (SNL hydrocode) in 1D, 2D, and 3D space in order to determine if there was any justification for using simplified models. A simulation was also performed using the BCAT code (CTH companion tool) that assumes a plate impact shock loading. Results from the simulations were compared to experimental data and show that the shock imparted into an explosive specimen is accurately captured with 2D and 3D simulations, but changes significantly in 1D space and with the BCAT tool. The difference in shock profile is shown to affect numerical predictions only for large run distances. This is attributed to incorrectly capturing the energy fluence for detonation waves versus flat shock loading. Portions of this work were funded through the Joint Insensitive Munitions Technology Program.
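
    The calibration workflow sketched in this abstract (adjust reactive-burn parameters until simulated run distances match the measured pop-plot) can be illustrated with a toy parameter sweep. The hydrocode call below is a placeholder function, and the pop-plot points and power-law response are invented for the example; they stand in for CTH runs and ARL data, which are not reproduced here.

```python
# Schematic calibration loop against pop-plot data (run distance vs shock pressure).
# `run_hydrocode` is a stand-in for an actual hydrocode run; all data are hypothetical.
import numpy as np

pop_plot = {12.0: 6.1, 15.0: 3.4, 20.0: 1.6}   # pressure [GPa] -> run distance [mm]

def run_hydrocode(rate_constant, pressure_gpa):
    """Placeholder reactive-burn response: run distance shrinks with pressure and rate."""
    return 50.0 / (rate_constant * pressure_gpa**1.5)

def misfit(rate_constant):
    errs = [run_hydrocode(rate_constant, p) - d for p, d in pop_plot.items()]
    return float(np.sqrt(np.mean(np.square(errs))))

candidates = np.linspace(0.1, 2.0, 40)
best = min(candidates, key=misfit)
print(f"best-fit rate constant ~ {best:.2f}, rms error {misfit(best):.2f} mm")
```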

  10. General introduction to simulation models

    DEFF Research Database (Denmark)

    Hisham Beshara Halasa, Tariq; Boklund, Anette

    2012-01-01

    Monte Carlo simulation can be defined as a representation of real-life systems to gain insight into their functions and to investigate the effects of alternative conditions or actions on the modeled system. Models are a simplification of a system. Most often, it is best to use experiments and fie... to support decision making. However, several other factors affect decision making, such as ethics, politics and economics. Furthermore, the insight gained when models are built points out areas where knowledge is lacking... of FMD spread that can provide useful and trustworthy advice, there are four important issues which the model should represent: 1) the herd structure of the country in question, 2) the dynamics of animal movements and contacts between herds, 3) the biology of the disease, and 4) the regulations...

  11. Vermont Yankee simulator BOP model upgrade

    International Nuclear Information System (INIS)

    Alejandro, R.; Udbinac, M.J.

    2006-01-01

    The Vermont Yankee simulator has undergone significant changes in the 20 years since the original order was placed. After the move from the original Unix to MS Windows environment, and upgrade to the latest version of SimPort, now called MASTER, the platform was set for an overhaul and replacement of major plant system models. Over a period of a few months, the VY simulator team, in partnership with WSC engineers, replaced outdated legacy models of the main steam, condenser, condensate, circulating water, feedwater and feedwater heaters, and main turbine and auxiliaries. The timing was ideal, as the plant was undergoing a power up-rate, so the opportunity was taken to replace the legacy models with industry-leading, true on-line object oriented graphical models. Due to the efficiency of design and ease of use of the MASTER tools, VY staff performed the majority of the modeling work themselves with great success, with only occasional assistance from WSC, in a relatively short time-period, despite having to maintain all of their 'regular' simulator maintenance responsibilities. This paper will provide a more detailed view of the VY simulator, including how it is used and how it has benefited from the enhancements and upgrades implemented during the project. (author)

  12. Advances in Intelligent Modelling and Simulation Simulation Tools and Applications

    CERN Document Server

    Oplatková, Zuzana; Carvalho, Marco; Kisiel-Dorohinicki, Marek

    2012-01-01

    The human capacity to abstract complex systems and phenomena into simplified models has played a critical role in the rapid evolution of our modern industrial processes and scientific research. As a science and an art, Modelling and Simulation have been one of the core enablers of this remarkable human trait, and have become a topic of great importance for researchers and practitioners. This book was created to compile some of the most recent concepts, advances, challenges and ideas associated with Intelligent Modelling and Simulation frameworks, tools and applications. The first chapter discusses the important aspects of human interaction and the correct interpretation of results during simulations. The second chapter gets to the heart of the analysis of entrepreneurship by means of agent-based modelling and simulations. The following three chapters bring together the central theme of simulation frameworks, first describing an agent-based simulation framework, then a simulator for electrical machines, and...

  13. Predictive accuracy of novel risk factors and markers: A simulation study of the sensitivity of different performance measures for the Cox proportional hazards regression model

    NARCIS (Netherlands)

    P.C. Austin (Peter); Pencinca, M.J. (Michael J.); E.W. Steyerberg (Ewout)

    2017-01-01

    textabstractPredicting outcomes that occur over time is important in clinical, population health, and health services research. We compared changes in different measures of performance when a novel risk factor or marker was added to an existing Cox proportional hazards regression model. We performed

  14. Simulation models in population breast cancer screening : A systematic review

    NARCIS (Netherlands)

    Koleva-Kolarova, Rositsa G; Zhan, Zhuozhao; Greuter, Marcel J W; Feenstra, Talitha L; De Bock, Geertruida H

    The aim of this review was to critically evaluate published simulation models for breast cancer screening of the general population and provide a direction for future modeling. A systematic literature search was performed to identify simulation models with more than one application. A framework for

  15. Maneuver simulation model of an experimental hovercraft for the Antarctic

    Science.gov (United States)

    Murao, Rinichi

    Results of an investigation of a hovercraft model designed for Antarctic conditions are presented. The buoyancy characteristics, the propellant control system, and simulation model control are examined. An ACV (air cushion vehicle) model of the hovercraft is used to examine the flexibility and friction of the skirt. Simulation results are presented which show the performance of the hovercraft.

  16. Analysing the temporal dynamics of model performance for hydrological models

    NARCIS (Netherlands)

    Reusser, D.E.; Blume, T.; Schaefli, B.; Zehe, E.

    2009-01-01

    The temporal dynamics of hydrological model performance gives insights into errors that cannot be obtained from global performance measures assigning a single number to the fit of a simulated time series to an observed reference series. These errors can include errors in data, model parameters, or

  17. Simulation of arc models with the block modelling method

    NARCIS (Netherlands)

    Thomas, R.; Lahaye, D.J.P.; Vuik, C.; Van der Sluis, L.

    2015-01-01

    Simulation of current interruption is currently performed with non-ideal switching devices for large power systems. Nevertheless, for small networks, non-ideal switching devices can be substituted by arc models. However, this substitution has a negative impact on the computation time. At the same

  18. Cognitive load predicts point-of-care ultrasound simulator performance.

    Science.gov (United States)

    Aldekhyl, Sara; Cavalcanti, Rodrigo B; Naismith, Laura M

    2018-02-01

    The ability to maintain good performance with low cognitive load is an important marker of expertise. Incorporating cognitive load measurements in the context of simulation training may help to inform judgements of competence. This exploratory study investigated relationships between demographic markers of expertise, cognitive load measures, and simulator performance in the context of point-of-care ultrasonography. Twenty-nine medical trainees and clinicians at the University of Toronto with a range of clinical ultrasound experience were recruited. Participants answered a demographic questionnaire then used an ultrasound simulator to perform targeted scanning tasks based on clinical vignettes. Participants were scored on their ability to both acquire and interpret ultrasound images. Cognitive load measures included participant self-report, eye-based physiological indices, and behavioural measures. Data were analyzed using a multilevel linear modelling approach, wherein observations were clustered by participants. Experienced participants outperformed novice participants on ultrasound image acquisition. Ultrasound image interpretation was comparable between the two groups. Ultrasound image acquisition performance was predicted by level of training, prior ultrasound training, and cognitive load. There was significant convergence between cognitive load measurement techniques. A marginal model of ultrasound image acquisition performance including prior ultrasound training and cognitive load as fixed effects provided the best overall fit for the observed data. In this proof-of-principle study, the combination of demographic and cognitive load measures provided more sensitive metrics to predict ultrasound simulator performance. Performance assessments which include cognitive load can help differentiate between levels of expertise in simulation environments, and may serve as better predictors of skill transfer to clinical practice.
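
    For readers unfamiliar with the analysis described above, the sketch below fits a mixed-effects (multilevel) linear model with a random intercept per participant using statsmodels. The data frame, column names and effect sizes are fabricated stand-ins, not the study's data.

```python
# Minimal mixed-effects (multilevel) sketch with statsmodels: random intercept per
# participant. All data below are fabricated for illustration only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_participants, n_tasks = 10, 6
participant = np.repeat([f"p{i}" for i in range(n_participants)], n_tasks)
prior_training = np.repeat(rng.integers(0, 2, n_participants), n_tasks)
cognitive_load = rng.normal(3.0, 1.0, n_participants * n_tasks)
subject_effect = np.repeat(rng.normal(0, 0.1, n_participants), n_tasks)
score = (0.8 - 0.10 * cognitive_load + 0.15 * prior_training
         + subject_effect + rng.normal(0, 0.05, n_participants * n_tasks))

df = pd.DataFrame({"score": score, "cognitive_load": cognitive_load,
                   "prior_training": prior_training, "participant": participant})

# Fixed effects: cognitive load and prior training; random intercept: participant.
model = smf.mixedlm("score ~ cognitive_load + prior_training",
                    data=df, groups=df["participant"])
print(model.fit().summary())
```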

  19. Simulated annealing model of acupuncture

    Science.gov (United States)

    Shang, Charles; Szu, Harold

    2015-05-01

    The growth control singularity model suggests that acupuncture points (acupoints) originate from organizers in embryogenesis. Organizers are singular points in growth control. Acupuncture can cause perturbation of a system with effects similar to simulated annealing. In a clinical trial, the goal of a treatment is to relieve a certain disorder, which corresponds to reaching a certain local optimum in simulated annealing. The self-organizing effect of the system is limited and related to the person's general health and age. Perturbation at acupoints can lead to a stronger local excitation (analogous to a higher annealing temperature) compared to perturbation at non-singular points (placebo control points). Such difference diminishes as the number of perturbed points increases, due to the wider distribution of the limited self-organizing activity. This model explains the following facts from systematic reviews of acupuncture trials: 1. Properly chosen single-acupoint treatment for a certain disorder can lead to highly repeatable efficacy above placebo. 2. When multiple acupoints are used, the result can be highly repeatable if the patients are relatively healthy and young, but is usually mixed if the patients are old, frail and have multiple disorders at the same time, as the number of local optima or comorbidities increases. 3. As the number of acupoints used increases, the efficacy difference between sham and real acupuncture often diminishes. The model predicts that the efficacy of acupuncture is negatively correlated with disease chronicity, severity and patient age. This is the first biological-physical model of acupuncture that can predict and guide clinical acupuncture research.
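
    The analogy above leans on the standard simulated annealing algorithm. For readers who have not seen it, a minimal generic implementation on a toy multi-modal landscape is sketched below; it is background for the analogy only and has nothing to do with the paper's biological content. The landscape, cooling schedule and step size are arbitrary choices.

```python
# Generic simulated annealing on a 1D multi-modal function (background illustration).
import math
import random

random.seed(0)

def energy(x):
    return x**2 + 10 * math.sin(3 * x)      # toy landscape with local optima

def simulated_annealing(x0=5.0, t0=5.0, cooling=0.95, steps=2000):
    x, t = x0, t0
    best_x, best_e = x, energy(x)
    for _ in range(steps):
        candidate = x + random.gauss(0, 0.5)            # local perturbation
        delta = energy(candidate) - energy(x)
        if delta < 0 or random.random() < math.exp(-delta / t):
            x = candidate                                # accept the move
        if energy(x) < best_e:
            best_x, best_e = x, energy(x)
        t *= cooling                                     # lower the temperature
    return best_x, best_e

print(simulated_annealing())
```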

  20. A parallel computational model for GATE simulations.

    Science.gov (United States)

    Rannou, F R; Vega-Acevedo, N; El Bitar, Z

    2013-12-01

    GATE/Geant4 Monte Carlo simulations are computationally demanding applications, requiring thousands of processor hours to produce realistic results. The classical strategy of distributing the simulation of individual events does not apply efficiently for Positron Emission Tomography (PET) experiments, because it requires a centralized coincidence processing and large communication overheads. We propose a parallel computational model for GATE that handles event generation and coincidence processing in a simple and efficient way by decentralizing event generation and processing but maintaining a centralized event and time coordinator. The model is implemented with the inclusion of a new set of factory classes that can run the same executable in sequential or parallel mode. A Mann-Whitney test shows that the output produced by this parallel model in terms of number of tallies is equivalent (but not equal) to its sequential counterpart. Computational performance evaluation shows that the software is scalable and well balanced. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
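
    The statistical check quoted above compares tally counts from sequential and parallel runs with a Mann-Whitney test. A minimal SciPy version of that comparison is sketched below; the tally values are synthetic stand-ins, not GATE output.

```python
# Comparing tally counts from sequential vs parallel runs with a Mann-Whitney U test.
# The tally values below are made up for illustration.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(42)
tallies_sequential = rng.poisson(lam=1000, size=30)
tallies_parallel = rng.poisson(lam=1000, size=30)

stat, p_value = mannwhitneyu(tallies_sequential, tallies_parallel,
                             alternative="two-sided")
print(f"U = {stat:.1f}, p = {p_value:.3f}")  # large p: no evidence of a difference
```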

  1. Development of improved thermal hydraulics and fuel performance technology; development of turbulence model and simulation code for flow analysis in nuclear fuel assembly

    Energy Technology Data Exchange (ETDEWEB)

    Myung, H. K.; Yang, S. Y.; Kim, B. H.; Song, J. H.; Oh, J. Z. [Kookmin University, Seoul (Korea)

    2002-03-01

    The flow through a nuclear rod bundle with mixing vanes is very complex and therefore requires a suitable turbulence model for its accurate prediction: the subchannel flow in a fuel bundle with mixing vanes exhibits complex turbulent behavior. The objective of this study is to investigate the predictive performance of the turbulence models contained in the STAR-CD code and to develop a suitable turbulence model that can predict the complex flow in a nuclear fuel assembly. The performance of several nonlinear κ-ε turbulence models in predicting the flow in a nuclear fuel assembly was investigated, and their problems are discussed in detail. The results obtained from the present research should help in the development of a turbulence model that can accurately predict the flow through rod bundles with mixing vanes. 19 refs., 32 figs., 3 tabs. (Author)

  2. Nuclear reactor core modelling in multifunctional simulators

    Energy Technology Data Exchange (ETDEWEB)

    Puska, E.K. [VTT Energy, Nuclear Energy, Espoo (Finland)

    1999-06-01

    studied to assess the possibilities for using three-dimensional cores in training simulators. The core model results have been compared with the Loviisa WWER-type plant measurement data in steady state and in some transients. Hypothetical control rod withdrawal, ejection and boron dilution transients have been calculated with various three-dimensional core models for the Loviisa WWER-440 core. Several ATWS analyses for the WWER-1000/91 plant have been performed using the three-dimensional core model. In this context, the results of APROS have been compared in detail with the results of the HEXTRAN code. The three-dimensional Olkiluoto BWR-type core model has been used for transient calculation and for severe accident re-criticality studies. The one-dimensional core model is at present used in several plant analyser and training simulator applications and it has been used extensively for safety analyses in the Loviisa WWER-440 plant modernisation project. (orig.) 75 refs. The thesis includes also eight previous publications by author

  3. Ion thruster modeling: Particle simulations and experimental validations

    International Nuclear Information System (INIS)

    Wang, Joseph; Polk, James; Brinza, David

    2003-01-01

    This paper presents results from ion thruster modeling studies performed in support of NASA's Deep Space 1 mission and NSTAR project. Fully 3-dimensional computer particle simulation models are presented for ion optics plasma flow and ion thruster plume. Ion optics simulation results are compared with measurements obtained from ground tests of the NSTAR ion thruster. Plume simulation results are compared with in-flight measurements from the Deep Space 1 spacecraft. Both models show excellent agreement with experimental data

  4. Mathematical models and numerical simulation in electromagnetism

    CERN Document Server

    Bermúdez, Alfredo; Salgado, Pilar

    2014-01-01

    The book represents a basic support for a master course in electromagnetism oriented to numerical simulation. The main goal of the book is that the reader knows the boundary-value problems of partial differential equations that should be solved in order to perform computer simulation of electromagnetic processes. Moreover, it includes a part devoted to electric circuit theory based on ordinary differential equations. The book is mainly oriented to electrical engineering applications, going from the general to the specific, namely, from the full Maxwell’s equations to the particular cases of electrostatics, direct current, magnetostatics and eddy current models. Apart from standard exercises related to analytical calculus, the book includes some others oriented to real-life applications solved with the MaxFEM free simulation software.

  5. Impulse pumping modelling and simulation

    International Nuclear Information System (INIS)

    Pierre, B; Gudmundsson, J S

    2010-01-01

    Impulse pumping is a new pumping method based on propagation of pressure waves. Of particular interest is the application of impulse pumping to artificial lift situations, where fluid is transported from wellbore to wellhead using pressure waves generated at wellhead. The motor driven element of an impulse pumping apparatus is therefore located at wellhead and can be separated from the flowline. Thus operation and maintenance of an impulse pump are facilitated. The paper describes the different elements of an impulse pumping apparatus, reviews the physical principles and details the modelling of the novel pumping method. Results from numerical simulations of propagation of pressure waves in water-filled pipelines are then presented for illustrating impulse pumping physical principles, and validating the described modelling with experimental data.

  6. Bridging experiments, models and simulations

    DEFF Research Database (Denmark)

    Carusi, Annamaria; Burrage, Kevin; Rodríguez, Blanca

    2012-01-01

    understanding of living organisms and also how they can reduce, replace, and refine animal experiments. A fundamental requirement to fulfill these expectations and achieve the full potential of computational physiology is a clear understanding of what models represent and how they can be validated. The present...... of biovariability; 2) testing and developing robust techniques and tools as a prerequisite to conducting physiological investigations; 3) defining and adopting standards to facilitate the interoperability of experiments, models, and simulations; 4) and understanding physiological validation as an iterative process...... that contributes to defining the specific aspects of cardiac electrophysiology the MSE system targets, rather than being only an external test, and that this is driven by advances in experimental and computational methods and the combination of both....

  7. Predictive accuracy of risk factors and markers: a simulation study of the effect of novel markers on different performance measures for logistic regression models.

    Science.gov (United States)

    Austin, Peter C; Steyerberg, Ewout W

    2013-02-20

    The change in c-statistic is frequently used to summarize the change in predictive accuracy when a novel risk factor is added to an existing logistic regression model. We explored the relationship between the absolute change in the c-statistic, Brier score, generalized R(2) , and the discrimination slope when a risk factor was added to an existing model in an extensive set of Monte Carlo simulations. The increase in model accuracy due to the inclusion of a novel marker was proportional to both the prevalence of the marker and to the odds ratio relating the marker to the outcome but inversely proportional to the accuracy of the logistic regression model with the marker omitted. We observed greater improvements in model accuracy when the novel risk factor or marker was uncorrelated with the existing predictor variable compared with when the risk factor has a positive correlation with the existing predictor variable. We illustrated these findings by using a study on mortality prediction in patients hospitalized with heart failure. In conclusion, the increase in predictive accuracy by adding a marker should be considered in the context of the accuracy of the initial model. Copyright © 2012 John Wiley & Sons, Ltd.
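
    The quantity at the centre of this study is the change in the c-statistic when a marker is added to a logistic regression model. The sketch below reproduces that calculation on simulated data with scikit-learn; the coefficients, prevalence and sample size are arbitrary and unrelated to the paper's Monte Carlo design.

```python
# Change in c-statistic (AUC) when a novel marker is added to a logistic model.
# Data are simulated purely for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 5000
x_base = rng.normal(size=n)                 # existing predictor
marker = rng.normal(size=n)                 # novel marker, uncorrelated with x_base
logit = -1.0 + 0.8 * x_base + 0.6 * marker
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

base = LogisticRegression().fit(x_base.reshape(-1, 1), y)
full = LogisticRegression().fit(np.column_stack([x_base, marker]), y)

auc_base = roc_auc_score(y, base.predict_proba(x_base.reshape(-1, 1))[:, 1])
auc_full = roc_auc_score(y, full.predict_proba(np.column_stack([x_base, marker]))[:, 1])
print(f"c-statistic: {auc_base:.3f} -> {auc_full:.3f} (delta {auc_full - auc_base:.3f})")
```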

  8. Simulation models generator. Applications in scheduling

    Directory of Open Access Journals (Sweden)

    Omar Danilo Castrillón

    2013-08-01

    ... in order to have an approach to reality and evaluate decisions so that more assertive ones can be taken. To test the prototype, a production system with 9 machines and 5 jobs in a job-shop configuration was used as a modeling example, with stochastic processing times and machine stops, measuring machine utilization rates and the average time of jobs in the system as measures of system performance. This test shows the usefulness of the prototype, saving the user the work of building the simulation model

  9. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    Science.gov (United States)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two part model framework characterizes both the demand using a probability distribution for each type of service request as well as enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
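
    The two-part framework described above (a stochastic demand stream plus constrained computing resources) maps naturally onto a queueing-style discrete event simulation. The sketch below uses SimPy as an example engine; the tool choice, arrival rate, service rate and server count are all assumptions for illustration, not the authors' model.

```python
# Minimal discrete-event sketch of service requests contending for a pool of
# virtual servers, using SimPy (tool choice and rates are assumptions).
import random
import simpy

random.seed(0)
WAITS = []

def handle_request(env, servers, service_rate):
    """A single service request: queue for a server, then hold it for service."""
    arrival = env.now
    with servers.request() as slot:
        yield slot                                    # wait for a free server
        WAITS.append(env.now - arrival)
        yield env.timeout(random.expovariate(service_rate))

def workload(env, servers, arrival_rate, service_rate):
    """Poisson stream of incoming service requests."""
    while True:
        yield env.timeout(random.expovariate(arrival_rate))
        env.process(handle_request(env, servers, service_rate))

env = simpy.Environment()
servers = simpy.Resource(env, capacity=4)             # cloud resource constraint
env.process(workload(env, servers, arrival_rate=3.0, service_rate=1.0))
env.run(until=500)
print(f"{len(WAITS)} requests served, mean queueing delay {sum(WAITS)/len(WAITS):.2f}")
```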

  10. Powertrain modeling and simulation for off-road vehicles

    Energy Technology Data Exchange (ETDEWEB)

    Ouellette, S. [McGill Univ., Montreal, PQ (Canada)

    2010-07-01

    Standard forward-facing automotive powertrain modeling and simulation methodology did not perform equally well for all vehicles in all applications in the 2010 Winter Olympics, the 2009 World Alpine Ski Championships, Summit Station in Greenland, the McGill Formula Hybrid, the Unicell QuickSider, and lunar mobility. This presentation provided a standard automotive powertrain modeling and simulation flow chart as well as an example. It also provided a flow chart for location-based powertrain modeling and simulation and discussed location-based powertrain modeling and simulation implementation. It was found that in certain applications, vehicle-environment interactions cannot be neglected in order to have good model fidelity. Powertrain modeling and simulation of off-road vehicles demands a new approach to powertrain modeling and simulation. It was concluded that the proposed location-based methodology could improve the results for off-road vehicles. tabs., figs.

  11. Human Performance in Simulated Reduced Gravity Environments

    Science.gov (United States)

    Cowley, Matthew; Harvill, Lauren; Rajulu, Sudhakar

    2014-01-01

    NASA is currently designing a new space suit capable of working in deep space and on Mars. Designing a suit is very difficult and often requires trade-offs between performance, cost, mass, and system complexity. Our current understanding of human performance in reduced gravity in a planetary environment (the moon or Mars) is limited to lunar observations, studies from the Apollo program, and recent suit tests conducted at JSC using reduced gravity simulators. This study will look at our most recent reduced gravity simulations performed on the new Active Response Gravity Offload System (ARGOS) compared to the C-9 reduced gravity plane. Methods: Subjects ambulated in reduced gravity analogs to obtain a baseline for human performance. Subjects were tested in lunar gravity (1.6 m/sq s) and Earth gravity (9.8 m/sq s) in shirt-sleeves. Subjects ambulated over ground at prescribed speeds on the ARGOS, but ambulated at a self-selected speed on the C-9 due to time limitations. Subjects on the ARGOS were given over 3 minutes to acclimate to the different conditions before data was collected. Nine healthy subjects were tested in the ARGOS (6 males, 3 females, 79.5 +/- 15.7 kg), while six subjects were tested on the C-9 (6 males, 78.8 +/- 11.2 kg). Data was collected with an optical motion capture system (Vicon, Oxford, UK) and was analyzed using customized analysis scripts in BodyBuilder (Vicon, Oxford, UK) and MATLAB (MathWorks, Natick, MA, USA). Results: In all offloaded conditions, variation between subjects increased compared to 1-g. Kinematics in the ARGOS at lunar gravity resembled earth gravity ambulation more closely than the C-9 ambulation. Toe-off occurred 10% earlier in both reduced gravity environments compared to earth gravity, shortening the stance phase. Likewise, ankle, knee, and hip angles remained consistently flexed and had reduced peaks compared to earth gravity. Ground reaction forces in lunar gravity (normalized to Earth body weight) were 0.4 +/- 0.2 on

  12. Active site modeling in copper azurin molecular dynamics simulations

    NARCIS (Netherlands)

    Rizzuti, B; Swart, M; Sportelli, L; Guzzi, R

    Active site modeling in molecular dynamics simulations is investigated for the reduced state of copper azurin. Five simulation runs (5 ns each) were performed at room temperature to study the consequences of a mixed electrostatic/constrained modeling for the coordination between the metal and the

  13. A Simulation Model for Extensor Tendon Repair

    Directory of Open Access Journals (Sweden)

    Elizabeth Aronstam

    2017-07-01

    Full Text Available Audience: This simulation model is designed for use by emergency medicine residents. Although we have instituted this at the PGY-2 level of our residency curriculum, it is appropriate for any level of emergency medicine residency training. It might also be adapted for use by a variety of other learners, such as practicing emergency physicians, orthopedic surgery residents, or hand surgery trainees. Introduction: Tendon injuries commonly present to the emergency department, so it is essential that emergency physicians be competent in evaluating such injuries. Indeed, extensor tendon repair is included as an ACGME Emergency Medicine Milestone (Milestone 13, Wound Management, Level 5 – "Performs advanced wound repairs, such as tendon repairs…" [1]). However, emergency medicine residents may have limited opportunity to develop these skills due to a lack of patients, competition from other trainees, or preexisting referral patterns. Simulation may provide an alternative means to effectively teach these skills in such settings. Previously described tendon repair simulation models that were designed for surgical trainees have used rubber worms [4], licorice [5], feeding tubes, catheters [6,7], drinking straws [8], microfoam tape [9], sheep forelimbs [10] and cadavers [11]. These models all suffer a variety of limitations, including high cost, lack of ready availability, or lack of realism. Objectives: We sought to develop an extensor tendon repair simulation model for emergency medicine residents, designed to meet ACGME Emergency Medicine Milestone 13, Level 5. We wished this model to be simple, inexpensive, and realistic. Methods: The learner responsible content/educational handout component of our innovation teaches residents about emergency department extensor tendon repair, and includes: (1) relevant anatomy; (2) indications and contraindications for emergency department extensor tendon repair; (3) physical exam findings; (4) tendon suture techniques; and (5) aftercare. During

  14. On the performance simulation of inter-stage turbine reheat

    International Nuclear Information System (INIS)

    Pellegrini, Alvise; Nikolaidis, Theoklis; Pachidis, Vassilios; Köhler, Stephan

    2017-01-01

    Highlights: • An innovative gas turbine performance simulation methodology is proposed. • It allows to perform DP and OD performance calculations for complex engines layouts. • It is essential for inter-turbine reheat (ITR) engine performance calculation. • A detailed description is provided for fast and flexible implementation. • The methodology is successfully verified against a commercial closed-source software. - Abstract: Several authors have suggested the implementation of reheat in high By-Pass Ratio (BPR) aero engines, to improve engine performance. In contrast to military afterburning, civil aero engines would aim at reducing Specific Fuel Consumption (SFC) by introducing ‘Inter-stage Turbine Reheat’ (ITR). To maximise benefits, the second combustor should be placed at an early stage of the expansion process, e.g. between the first and second High-Pressure Turbine (HPT) stages. The aforementioned cycle design requires the accurate simulation of two or more turbine stages on the same shaft. The Design Point (DP) performance can be easily evaluated by defining a Turbine Work Split (TWS) ratio between the turbine stages. However, the performance simulation of Off-Design (OD) operating points requires the calculation of the TWS parameter for every OD step, by taking into account the thermodynamic behaviour of each turbine stage, represented by their respective maps. No analytical solution of the aforementioned problem is currently available in the public domain. This paper presents an analytical methodology by which ITR can be simulated at DP and OD. Results show excellent agreement with a commercial, closed-source performance code; discrepancies range from 0% to 3.48%, and are ascribed to the different gas models implemented in the codes.

  15. Management of Industrial Performance Indicators: Regression Analysis and Simulation

    Directory of Open Access Journals (Sweden)

    Walter Roberto Hernandez Vergara

    2017-11-01

    Full Text Available Stochastic methods can be used in problem solving and in explaining natural phenomena through the application of statistical procedures. The article aims to combine regression analysis and systems simulation in order to facilitate the practical understanding of data analysis. The algorithms were developed in Microsoft Office Excel, using statistical techniques such as regression theory, ANOVA and Cholesky factorization, which made it possible to create models of single and multiple systems with up to five independent variables. For the analysis of these models, Monte Carlo simulation and the analysis of industrial performance indicators were used, resulting in numerical indices intended to improve the management of goals for compliance indicators by identifying system instability, correlation and anomalies. The analytical models presented in the study indicated satisfactory results, with numerous possibilities for industrial and academic application, as well as potential for deployment in new analytical techniques.
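
    The combination of Cholesky factorization and Monte Carlo simulation mentioned above is typically used to draw correlated random inputs for an indicator model. The sketch below shows that pattern in NumPy rather than Excel; the correlation matrix and the indicator formula are invented for illustration.

```python
# Drawing correlated input variables for a Monte Carlo run via Cholesky factorization.
# The correlation matrix and the indicator formula are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(7)
corr = np.array([[1.0, 0.6, 0.2],
                 [0.6, 1.0, 0.4],
                 [0.2, 0.4, 1.0]])
L = np.linalg.cholesky(corr)

n_trials = 10_000
z = rng.standard_normal((n_trials, 3))
correlated = z @ L.T                        # standard normals with target correlation

# Hypothetical performance indicator built from the three correlated drivers.
indicator = 100 + 5 * correlated[:, 0] + 3 * correlated[:, 1] - 2 * correlated[:, 2]
print(f"P(indicator < 95) ~ {np.mean(indicator < 95):.3f}")
```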

  16. A New Model for Simulating TSS Washoff in Urban Areas

    Directory of Open Access Journals (Sweden)

    E. Crobeddu

    2011-01-01

    Full Text Available This paper presents the formulation and validation of the conceptual Runoff Quality Simulation Model (RQSM that was developed to simulate the erosion and transport of solid particles in urban areas. The RQSM assumes that solid particle accumulation on pervious and impervious areas is infinite. The RQSM simulates soil erosion using rainfall kinetic energy and solid particle transport with linear system theory. A sensitivity analysis was conducted on the RQSM to show the influence of each parameter on the simulated load. Total suspended solid (TSS loads monitored at the outlet of the borough of Verdun in Canada and at three catchment outlets of the City of Champaign in the United States were used to validate the RQSM. TSS loads simulated by the RQSM were compared to measured loads and to loads simulated by the Rating Curve model and the Exponential model of the SWMM software. The simulation performance of the RQSM was comparable to the Exponential and Rating Curve models.

  17. Modeling and simulation of gamma camera

    International Nuclear Information System (INIS)

    Singh, B.; Kataria, S.K.; Samuel, A.M.

    2002-08-01

    Simulation techniques play a vital role in the design of sophisticated instruments and in the training of operating and maintenance staff. Gamma camera systems have been used for functional imaging in nuclear medicine. Functional images are derived from external counting of a gamma-emitting radioactive tracer that, after introduction into the body, mimics the behavior of a native biochemical compound. The position-sensitive detector yields the coordinates of the gamma ray interaction with the detector, which are used to estimate the point of gamma ray emission within the tracer distribution space. This imaging device is thus dependent on the performance of the algorithms for coordinate computation, estimation of the point of emission, image generation and display of the image data. Contemporary systems also have protocols for quality control and clinical evaluation of imaging studies. Simulation of this processing leads to an understanding of the basic camera design problems. This report describes a PC-based package for the design and simulation of a gamma camera, along with options for simulating data acquisition and quality control of imaging studies. Image display and data processing, the other options implemented in SIMCAM, will be described in separate reports (under preparation). Gamma camera modeling and simulation in SIMCAM has preset configurations of the design parameters for various crystal detector sizes, with the option to pack the PMTs on a hexagonal or square lattice. Different algorithms for coordinate computation and spatial distortion removal are available, in addition to simulation of the energy correction circuit. The user can simulate different static, dynamic, MUGA and SPECT studies. The acquired/simulated data are processed for quality control and clinical evaluation of the imaging studies. Results show that the program can be used to assess these performances. Also the variations in performance parameters can be assessed due to the induced
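
    The coordinate-computation step mentioned above is commonly implemented as a signal-weighted centroid (Anger logic) over the PMT array. The report does not state which algorithms SIMCAM implements, so the sketch below is only a generic illustration of that idea, with made-up PMT positions and signals.

```python
# Centroid (Anger-logic-style) position estimate from PMT signals; a generic
# illustration, not necessarily the algorithm implemented in SIMCAM.
import numpy as np

# Hypothetical PMT positions (cm) on a small patch of the crystal and their signals.
pmt_xy = np.array([[0.0, 0.0], [5.0, 0.0], [-5.0, 0.0],
                   [2.5, 4.3], [-2.5, 4.3], [2.5, -4.3], [-2.5, -4.3]])
signals = np.array([120.0, 40.0, 35.0, 25.0, 20.0, 22.0, 18.0])

weights = signals / signals.sum()
x_est, y_est = (weights[:, None] * pmt_xy).sum(axis=0)   # signal-weighted centroid
energy = signals.sum()                                   # uncorrected energy signal
print(f"estimated interaction point: ({x_est:.2f}, {y_est:.2f}) cm, energy {energy:.0f}")
```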

  18. Distributed simulation a model driven engineering approach

    CERN Document Server

    Topçu, Okan; Oğuztüzün, Halit; Yilmaz, Levent

    2016-01-01

    Backed by substantive case studies, the novel approach to software engineering for distributed simulation outlined in this text demonstrates the potent synergies between model-driven techniques, simulation, intelligent agents, and computer systems development.

  19. Crowd Human Behavior for Modeling and Simulation

    Science.gov (United States)

    2009-08-06

    Crowd Human Behavior for Modeling and Simulation. Elizabeth Mezzacappa, Ph.D., and Gordon Cooke, MEME, Target Behavioral Response Laboratory, ARDEC. Conference presentation, 2008-2009. The presentation addresses "understanding human behavior" and "model validation and verification" and focuses on modeling and simulation of crowds from a social scientist's …

  20. Fully Coupled Simulation of Lithium Ion Battery Cell Performance

    Energy Technology Data Exchange (ETDEWEB)

    Trembacki, Bradley L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Murthy, Jayathi Y. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Roberts, Scott Alan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    Lithium-ion battery particle-scale (non-porous electrode) simulations applied to resolved electrode geometries predict localized phenomena and can lead to better informed decisions on electrode design and manufacturing. This work develops and implements a fully-coupled finite volume methodology for the simulation of the electrochemical equations in a lithium-ion battery cell. The model implementation is used to investigate 3D battery electrode architectures that offer potential energy density and power density improvements over traditional layer-by-layer particle bed battery geometries. Advancement of micro-scale additive manufacturing techniques has made it possible to fabricate these 3D electrode microarchitectures. A variety of 3D battery electrode geometries are simulated and compared across various battery discharge rates and length scales in order to quantify performance trends and investigate geometrical factors that improve battery performance. The energy density and power density of the 3D battery microstructures are compared in several ways, including a uniform surface area to volume ratio comparison as well as a comparison requiring a minimum manufacturable feature size. Significant performance improvements over traditional particle bed electrode designs are observed, and electrode microarchitectures derived from minimal surfaces are shown to be superior. A reduced-order volume-averaged porous electrode theory formulation for these unique 3D batteries is also developed, allowing simulations on the full-battery scale. Electrode concentration gradients are modeled using the diffusion length method, and results for plate and cylinder electrode geometries are compared to particle-scale simulation results. Additionally, effective diffusion lengths that minimize error with respect to particle-scale results for gyroid and Schwarz P electrode microstructures are determined.

  1. Crystal and molecular simulation of high-performance polymers.

    Science.gov (United States)

    Colquhoun, H M; Williams, D J

    2000-03-01

    Single-crystal X-ray analyses of oligomeric models for high-performance aromatic polymers, interfaced to computer-based molecular modeling and diffraction simulation, have enabled the determination of a range of previously unknown polymer crystal structures from X-ray powder data. Materials which have been successfully analyzed using this approach include aromatic polyesters, polyetherketones, polythioetherketones, polyphenylenes, and polycarboranes. Pure macrocyclic homologues of noncrystalline polyethersulfones afford high-quality single crystals-even at very large ring sizes-and have provided the first examples of a "protein crystallographic" approach to the structures of conventionally amorphous synthetic polymers.

  2. Simulation Model for DMEK Donor Preparation.

    Science.gov (United States)

    Mittal, Vikas; Mittal, Ruchi; Singh, Swati; Narang, Purvasha; Sridhar, Priti

    2018-04-09

    To demonstrate a simulation model for donor preparation in Descemet membrane endothelial keratoplasty (DMEK). The inner transparent membrane of the onion (Allium cepa) was used as a simulation model for human Descemet membrane (DM). Surgical video (see Video, Supplemental Digital Content 1, http://links.lww.com/ICO/A663) demonstrating all the steps was recorded. This model closely simulates human DM and helps DMEK surgeons learn the nuances of DM donor preparation steps with ease. The technique is repeatable, and the model is cost-effective. The described simulation model can assist surgeons and eye bank technicians to learn steps in donor preparation in DMEK.

  3. Animal performance simulated by the Pampa Corte model with experimental records

    Directory of Open Access Journals (Sweden)

    Naíme de Barcellos Trevisan

    2009-02-01

    Full Text Available This study aimed to evaluate the reliability of the Pampa Corte model in predicting beef cattle performance in grazing systems. For this purpose, the model's predicted values were compared with data available in the literature. Correlation coefficients above 90% were obtained between simulated and real data in all tested alternatives. The model's database should be enlarged with forage productivity data under different climatic conditions. Mixtures of Italian ryegrass and oat need further study to obtain qualitative parameters (crude protein degradability and neutral detergent fiber), as does animal performance on single pastures of oat or Italian ryegrass.

  4. The new rosetta targets observations, simulations and instrument performances

    CERN Document Server

    Epifani, Elena; Palumbo, Pasquale

    2004-01-01

    The Rosetta mission was successfully launched on March 2nd, 2004 for a rendezvous with the short-period comet 67P/Churyumov-Gerasimenko in 2014. The new baseline mission also foresees a double fly-by of the asteroids 21 Lutetia and 2867 Steins on the way towards the primary target. This volume collects papers presented at the workshop on "The NEW Rosetta targets: Observations, simulations and instrument performances", held in Capri on October 13-15, 2003. The papers cover the fields of observations of the new Rosetta targets, laboratory experiments and theoretical simulation of cometary processes, and the expected performances of the Rosetta experiments. Until real operations around 67P/Churyumov-Gerasimenko start in 10 years from now, new astronomical observations, laboratory experiments and theoretical models are required. The goals are to increase knowledge about the physics and chemistry of comets and to prepare to exploit Rosetta data to the fullest.

  5. Simulation and Modeling Methodologies, Technologies and Applications

    CERN Document Server

    Filipe, Joaquim; Kacprzyk, Janusz; Pina, Nuno

    2014-01-01

    This book includes extended and revised versions of a set of selected papers from the 2012 International Conference on Simulation and Modeling Methodologies, Technologies and Applications (SIMULTECH 2012) which was sponsored by the Institute for Systems and Technologies of Information, Control and Communication (INSTICC) and held in Rome, Italy. SIMULTECH 2012 was technically co-sponsored by the Society for Modeling & Simulation International (SCS), GDR I3, Lionphant Simulation, Simulation Team and IFIP and held in cooperation with AIS Special Interest Group of Modeling and Simulation (AIS SIGMAS) and the Movimento Italiano Modellazione e Simulazione (MIMOS).

  6. An introduction to enterprise modeling and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Ostic, J.K.; Cannon, C.E. [Los Alamos National Lab., NM (United States). Technology Modeling and Analysis Group

    1996-09-01

    As part of an ongoing effort to continuously improve productivity, quality, and efficiency of both industry and Department of Energy enterprises, Los Alamos National Laboratory is investigating various manufacturing and business enterprise simulation methods. A number of enterprise simulation software models are being developed to enable engineering analysis of enterprise activities. In this document the authors define the scope of enterprise modeling and simulation efforts, and review recent work in enterprise simulation at Los Alamos National Laboratory as well as at other industrial, academic, and research institutions. References of enterprise modeling and simulation methods and a glossary of enterprise-related terms are provided.

  7. A physiological production model for cacao : results of model simulations

    NARCIS (Netherlands)

    Zuidema, P.A.; Leffelaar, P.A.

    2002-01-01

    CASE2 is a physiological model for cocoa (Theobroma cacao L.) growth and yield. This report introduces the CAcao Simulation Engine for water-limited production in a non-technical way and presents simulation results obtained with the model.

  8. Modeling and Performance Analysis of Manufacturing Systems in ...

    African Journals Online (AJOL)

    This study deals with modeling and performance analysis of footwear manufacturing using arena simulation modeling software. It was investigated that modeling and simulation is a potential tool for modeling and analysis of manufacturing assembly lines like footwear manufacturing because it allows the researcher to ...

  9. Simulations, evaluations and models. Vol. 1

    International Nuclear Information System (INIS)

    Brehmer, B.; Leplat, J.

    1992-01-01

    Papers presented at the Fourth MOHAWC (Models of Human Activities in Work Context) workshop. The general theme was simulations, evaluations and models. The emphasis was on time in relation to the modelling of human activities in modern, high tech. work. Such work often requires people to control dynamic systems, and the behaviour and misbehaviour of these systems in time is a principle focus of work in, for example, a modern process plant. The papers report on microworlds and on their innovative uses, both in the form of experiments and in the form of a new form of use, that of testing a program which performs diagnostic reasoning. They present new aspects on the problem of time in process control, showing the importance of considering the time scales of dynamic tasks, both in individual decision making and in distributed decision making, and in providing new formalisms, both for the representation of time and for reasoning involving time in diagnosis. (AB)

  10. Performance Analysis of Wavelet Channel Coding in COST207-based Channel Models on Simulated Radio-over-Fiber Systems at the W-Band

    DEFF Research Database (Denmark)

    Cavalcante, Lucas Costa Pereira; Silveira, Luiz F. Q.; Rommel, Simon

    2016-01-01

    Millimeter wave communications based on photonic technologies have gained increased attention to provide optic fiber-like capacity in wireless environments. However, the new hybrid fiber-wireless channel represents new challenges in terms of signal transmission performance analysis. Traditionally, such systems use diversity schemes in combination with digital signal processing (DSP) techniques to overcome effects such as fading and inter-symbol interference (ISI). Wavelet Channel Coding (WCC) has emerged as a technique to minimize the fading effects of wireless channels, which is a major challenge in systems operating in the millimeter wave regime. This work takes the WCC one step further by evaluating its performance in terms of bit error probability over time-varying, frequency-selective multipath Rayleigh fading channels. The adopted propagation model follows the COST207 norm, the main international...

  11. A study for production simulation model generation system based on data model at a shipyard

    Directory of Open Access Journals (Sweden)

    Myung-Gi Back

    2016-09-01

    Full Text Available Simulation technology is a type of shipbuilding product lifecycle management solution used to support production planning or decision-making. Normally, most shipbuilding processes consist of job shop production, and their modeling and simulation require professional skills and experience in shipbuilding. For these reasons, many shipbuilding companies have difficulty adopting simulation systems, despite the necessity of the technology. In this paper, the data model for shipyard production simulation model generation was defined by analyzing the iterative simulation modeling procedure. The shipyard production simulation data model defined in this study contains the information necessary for the conventional simulation modeling procedure and can serve as a basis for simulation model generation. The efficacy of the developed system was validated by applying it to the simulation model generation of a panel block production line. By implementing the initial simulation model generation process, which was previously performed by a simulation modeler, the proposed system substantially reduced the modeling time. In addition, by reducing the difficulties posed by different modeler-dependent generation methods, the proposed system makes the standardization of simulation model quality possible.

  12. Systematic simulations of modified gravity: chameleon models

    Energy Technology Data Exchange (ETDEWEB)

    Brax, Philippe [Institut de Physique Theorique, CEA, IPhT, CNRS, URA 2306, F-91191Gif/Yvette Cedex (France); Davis, Anne-Christine [DAMTP, Centre for Mathematical Sciences, University of Cambridge, Wilberforce Road, Cambridge CB3 0WA (United Kingdom); Li, Baojiu [Institute for Computational Cosmology, Department of Physics, Durham University, Durham DH1 3LE (United Kingdom); Winther, Hans A. [Institute of Theoretical Astrophysics, University of Oslo, 0315 Oslo (Norway); Zhao, Gong-Bo, E-mail: philippe.brax@cea.fr, E-mail: a.c.davis@damtp.cam.ac.uk, E-mail: baojiu.li@durham.ac.uk, E-mail: h.a.winther@astro.uio.no, E-mail: gong-bo.zhao@port.ac.uk [Institute of Cosmology and Gravitation, University of Portsmouth, Portsmouth PO1 3FX (United Kingdom)

    2013-04-01

    In this work we systematically study the linear and nonlinear structure formation in chameleon theories of modified gravity, using a generic parameterisation which describes a large class of models using only 4 parameters. For this we have modified the N-body simulation code ecosmog to perform a total of 65 simulations for different models and parameter values, including the default ΛCDM. These simulations enable us to explore a significant portion of the parameter space. We have studied the effects of modified gravity on the matter power spectrum and mass function, and found a rich and interesting phenomenology where the difference with the ΛCDM paradigm cannot be reproduced by a linear analysis even on scales as large as k ∼ 0.05 hMpc{sup −1}, since the latter incorrectly assumes that the modification of gravity depends only on the background matter density. Our results show that the chameleon screening mechanism is significantly more efficient than other mechanisms such as the dilaton and symmetron, especially in high-density regions and at early times, and can serve as a guidance to determine the parts of the chameleon parameter space which are cosmologically interesting and thus merit further studies in the future.

  13. Systematic simulations of modified gravity: chameleon models

    International Nuclear Information System (INIS)

    Brax, Philippe; Davis, Anne-Christine; Li, Baojiu; Winther, Hans A.; Zhao, Gong-Bo

    2013-01-01

    In this work we systematically study the linear and nonlinear structure formation in chameleon theories of modified gravity, using a generic parameterisation which describes a large class of models using only 4 parameters. For this we have modified the N-body simulation code ecosmog to perform a total of 65 simulations for different models and parameter values, including the default ΛCDM. These simulations enable us to explore a significant portion of the parameter space. We have studied the effects of modified gravity on the matter power spectrum and mass function, and found a rich and interesting phenomenology where the difference with the ΛCDM paradigm cannot be reproduced by a linear analysis even on scales as large as k ∼ 0.05 hMpc −1 , since the latter incorrectly assumes that the modification of gravity depends only on the background matter density. Our results show that the chameleon screening mechanism is significantly more efficient than other mechanisms such as the dilaton and symmetron, especially in high-density regions and at early times, and can serve as a guidance to determine the parts of the chameleon parameter space which are cosmologically interesting and thus merit further studies in the future

  14. Architectural and growth traits differ in effects on performance of clonal plants: an analysis using a field-parameterized simulation model

    Czech Academy of Sciences Publication Activity Database

    Wildová, Radka; Gough, L.; Herben, Tomáš; Hershock, Ch.; Goldberg, D. E.

    2007-01-01

    Roč. 116, č. 5 (2007), s. 836-852 ISSN 0030-1299 R&D Projects: GA ČR(CZ) GA206/02/0953; GA ČR(CZ) GA206/02/0578 Grant - others:NSF(US) DEB99-74296; NSF(US) DEB99-74284 Institutional research plan: CEZ:AV0Z60050516 Keywords : individual-based model * performance * plant architecture * competitive response * resource allocation Subject RIV: EF - Botanics Impact factor: 3.136, year: 2007

  15. Performance Simulation Comparison for Parabolic Trough Solar Collectors in China

    Directory of Open Access Journals (Sweden)

    Jinping Wang

    2016-01-01

    Full Text Available Parabolic trough systems are the most widely used concentrated solar power technology. The operating performance and optical efficiency of parabolic trough solar collectors (PTCs) differ between regions and seasons. To determine the optimum design and operation of the parabolic trough solar collector throughout the year, an accurate estimation of the daily performance is needed. In this study, a mathematical model for the optical efficiency of the parabolic trough solar collector was established and three typical regions of solar thermal utilization in China were selected. The performance characteristics of the cosine effect, shadowing effect, end loss effect, and optical efficiency were calculated and simulated over a whole year in these three areas by using the mathematical model. The simulation results show that the optical efficiency of PTCs changes from 0.4 to 0.8 over a whole year. The highest optical efficiency of PTCs is in June and the lowest is in December. The optical efficiency of PTCs is mainly influenced by the solar incidence angle. The model is validated by comparison with test results from a parabolic trough power plant, with relative errors ranging from 1% to about 5%.
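
    The seasonal trend reported above (highest optical efficiency in June, lowest in December) is dominated by the cosine of the solar incidence angle. As a rough illustration only, the sketch below evaluates a standard textbook incidence-angle relation for a trough with a horizontal north-south axis and east-west tracking; the latitude, dates and tracking geometry are assumptions made for the example and are not taken from the paper.

        import math

        def declination(day_of_year):
            # Cooper's formula for the solar declination angle (degrees)
            return 23.45 * math.sin(math.radians(360.0 * (284 + day_of_year) / 365.0))

        def incidence_angle_ns_axis(lat_deg, day_of_year, hour_angle_deg):
            # Incidence angle (degrees) on a trough with a horizontal north-south
            # axis and continuous east-west tracking (standard textbook relation).
            phi = math.radians(lat_deg)
            delta = math.radians(declination(day_of_year))
            omega = math.radians(hour_angle_deg)
            cos_zenith = (math.cos(phi) * math.cos(delta) * math.cos(omega)
                          + math.sin(phi) * math.sin(delta))
            cos_theta = math.sqrt(cos_zenith ** 2 + math.cos(delta) ** 2 * math.sin(omega) ** 2)
            return math.degrees(math.acos(min(1.0, cos_theta)))

        # Cosine factor at solar noon, mid-June vs. mid-December, latitude 40 N (assumed)
        for day in (172, 355):
            theta = incidence_angle_ns_axis(40.0, day, 0.0)
            print(day, round(math.cos(math.radians(theta)), 3))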

  16. Indium (In) Effects to The Efficiency Performance of Ga1-XInxP/GaAs Based Solar Cell Using Silvaco Software Modelling & Simulation

    Science.gov (United States)

    Norizan, M. N.; Zahari, S. M.; Mohamad, I. S.; Osman, R. A. M.; Shahimin, M. M.; Murad, S. A. Z.

    2017-06-01

    Ga1-xInxP composition has been applied to the top cell of multi-junction GaInP/GaAs based solar cell and currently have achieving a conversion efficiency of more than 46%, however its capability is unclear. We performed an analysis using Silvaco simulation method to evaluate the effect of In and the substitution was made to the Ga1-xInxP for the range of x from 0 to 1. We found that the highest efficiency recorded was 17.66% when the composition of Indium was x=1. The efficiency has been increasing about 11.71% from x=0 to x=1 In content. As the composition of In raised, the value of efficiency and short circuit current density, Jsc also become higher (13.60 mA/cm2) by having a greater photon absorption in a wider band gap energy. In addition to that, Voc, Pmax, Vmax, Imax and fill factor was measured to be 2.15 V, 2.44 mW/cm2, 2.0 V, 1.22 mA/cm2 and 83.34 respectively. In conclusion, this study confirms that the existence of In in Ga1-xInxP improves the solar cell efficiency by gaining a higher energy gap and producing more electrons for best achievement in multilayer solar cell applications.

  17. Modelling and simulation of a heat exchanger

    Science.gov (United States)

    Xia, Lei; Deabreu-Garcia, J. Alex; Hartley, Tom T.

    1991-01-01

    Two models for two different control systems are developed for a parallel heat exchanger. First by spatially lumping a heat exchanger model, a good approximate model which has a high system order is produced. Model reduction techniques are applied to these to obtain low order models that are suitable for dynamic analysis and control design. The simulation method is discussed to ensure a valid simulation result.

  18. Far-Field Acoustic Power Level and Performance Analyses of F31/A31 Open Rotor Model at Simulated Scaled Takeoff, Nominal Takeoff, and Approach Conditions: Technical Report I

    Science.gov (United States)

    Sree, Dave

    2015-01-01

    Far-field acoustic power level and performance analyses of open rotor model F31/A31 have been performed to determine its noise characteristics at simulated scaled takeoff, nominal takeoff, and approach flight conditions. The nonproprietary parts of the data obtained from experiments in the 9- by 15-Foot Low-Speed Wind Tunnel (9x15 LSWT) tests were provided by NASA Glenn Research Center to perform the analyses. The tone and broadband noise components have been separated from raw test data by using a new data analysis tool. Results in terms of sound pressure levels, acoustic power levels, and their variations with rotor speed, angle of attack, thrust, and input shaft power have been presented and discussed. The effect of an upstream pylon on the noise levels of the model has been addressed. Empirical equations relating the model's acoustic power level, thrust, and input shaft power have been developed. The far-field acoustic efficiency of the model is also determined for various simulated flight conditions. It is intended that the results presented in this work will serve as a database for comparison and improvement of other open rotor blade designs and also for validating open rotor noise prediction codes.

  19. Principles of Sonar Performance Modeling

    NARCIS (Netherlands)

    Ainslie, M.A.

    2010-01-01

    Sonar performance modelling (SPM) is concerned with the prediction of quantitative measures of sonar performance, such as probability of detection. It is a multidisciplinary subject, requiring knowledge and expertise in the disparate fields of underwater acoustics, acoustical oceanography, sonar

  20. Space Station Solar Dynamic Module modelling and simulation

    Science.gov (United States)

    Tylim, A.

    1989-01-01

    Efforts to model and simulate the Solar Dynamic Power Module (SDPM) for the Space Station are discussed. The SDPM configuration is given and the SDPM subsystems are described, including the concentrator assembly, the fine pointing and tracking system, the power generation system, the heat rejection assembly, the electrical equipment, the interface structure and integration hardware, and the beta gimbal assembly. Performance requirements and design considerations are given. The development of models to simulate the SDPM is examined, noting research on models such as the Electric Power System Transient Analysis Model, the Electric Power System on Orbit Performance model, and a spatial flux distribution function.

  1. Modeling lift operations with SAS Simulation Studio

    Science.gov (United States)

    Kar, Leow Soo

    2016-10-01

    Lifts or elevators are an essential part of multistorey buildings, providing vertical transportation for their occupants. In large high-rise apartment buildings the occupants are permanent residents, while in buildings like hospitals or office blocks the occupants are temporary users who come in to work or to visit; the population of such buildings is therefore much higher than that of residential apartments. It is common these days for large office blocks or hospitals to have at least 8 to 10 lifts serving their population. In order to optimize the level of service performance, different transportation schemes are devised to control the lift operations. For example, one lift may be assigned to solely service the even floors and another solely the odd floors, etc. In this paper, a basic lift system is modelled using SAS Simulation Studio to study the effect of factors such as the number of floors, capacity of the lift car, arrival rate and exit rate of passengers at each floor, and peak and off-peak periods on the system performance. The simulation is applied to a real lift operation in Sunway College's North Building to validate the model.

  2. Optical modeling and simulation of thin-film photovoltaic devices

    CERN Document Server

    Krc, Janez

    2013-01-01

    In wafer-based and thin-film photovoltaic (PV) devices, the management of light is a crucial aspect of optimization since trapping sunlight in active parts of PV devices is essential for efficient energy conversions. Optical modeling and simulation enable efficient analysis and optimization of the optical situation in optoelectronic and PV devices. Optical Modeling and Simulation of Thin-Film Photovoltaic Devices provides readers with a thorough guide to performing optical modeling and simulations of thin-film solar cells and PV modules. It offers insight on examples of existing optical models

  3. Improving hydrological simulations by incorporating GRACE data for model calibration

    Science.gov (United States)

    Bai, Peng; Liu, Xiaomang; Liu, Changming

    2018-02-01

    Hydrological model parameters are typically calibrated by observed streamflow data. This calibration strategy is questioned when the simulated hydrological variables of interest are not limited to streamflow. Well-performed streamflow simulations do not guarantee the reliable reproduction of other hydrological variables. One of the reasons is that hydrological model parameters are not reasonably identified. The Gravity Recovery and Climate Experiment (GRACE)-derived total water storage change (TWSC) data provide an opportunity to constrain hydrological model parameterizations in combination with streamflow observations. In this study, a multi-objective calibration scheme based on GRACE-derived TWSC and streamflow observations was compared with the traditional single-objective calibration scheme based on only streamflow simulations. Two hydrological models were employed on 22 catchments in China with different climatic conditions. The model evaluations were performed using observed streamflows, GRACE-derived TWSC, and actual evapotranspiration (ET) estimates from flux towers and from the water balance approach. Results showed that the multi-objective calibration scheme provided more reliable TWSC and ET simulations without significant deterioration in the accuracy of streamflow simulations than the single-objective calibration. The improvement in TWSC and ET simulations was more significant in relatively dry catchments than in relatively wet catchments. In addition, hydrological models calibrated using GRACE-derived TWSC data alone cannot obtain accurate runoff simulations in ungauged catchments. This study highlights the importance of including additional constraints in addition to streamflow observations to improve performances of hydrological models.
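
    As a minimal sketch of the multi-objective idea described above, the snippet below combines a streamflow skill score and a GRACE-derived TWSC skill score into a single value for a calibration algorithm to maximise. The Nash-Sutcliffe metric, the equal weighting and all variable names are illustrative assumptions, not the scheme used in the study.

        import numpy as np

        def nse(sim, obs):
            # Nash-Sutcliffe efficiency: 1 is a perfect fit, lower is worse.
            sim = np.asarray(sim, dtype=float)
            obs = np.asarray(obs, dtype=float)
            return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

        def multi_objective(sim_q, obs_q, sim_twsc, obs_twsc, w_q=0.5):
            # Aggregate streamflow (Q) and GRACE-derived TWSC skill into one
            # score to be maximised by the calibration algorithm (equal weights assumed).
            return w_q * nse(sim_q, obs_q) + (1.0 - w_q) * nse(sim_twsc, obs_twsc)

        # Example with toy series (replace with model output and observations):
        q_obs = np.array([1.0, 2.0, 3.0, 2.0, 1.0])
        q_sim = np.array([1.1, 1.9, 2.8, 2.2, 1.0])
        s_obs = np.array([5.0, 3.0, -2.0, -4.0, 1.0])
        s_sim = np.array([4.5, 3.2, -1.5, -3.8, 0.5])
        print(round(multi_objective(q_sim, q_obs, s_sim, s_obs), 3))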

  4. Perceived stress and team performance during a simulated resuscitation.

    Science.gov (United States)

    Hunziker, Sabina; Laschinger, Laura; Portmann-Schwarz, Simone; Semmer, Norbert K; Tschan, Franziska; Marsch, Stephan

    2011-09-01

    Barriers to optimal performance of cardiopulmonary resuscitation may partly relate to human factors, such as stress and specific emotions. The aim of this study was to investigate whether mental stress and different perceived emotions have a negative impact on the performance of rescuers. This prospective, observational study was conducted at the Simulator Center of the University Hospital Basel, Switzerland. A total of 120 medical students (70% female) participated in teams of three. They reported levels of perceived stress, feeling overwhelmed, motivation and specific emotions before, during, and after a simulated resuscitation. The association of stress/overload (index of stress and feeling overwhelmed), motivation, and specific emotions with resuscitation performance defined as hands-on time during the first 180 s after cardiac arrest was investigated. During resuscitation, levels of stress/overload, motivation, and negative emotions were significantly higher as compared to the periods before and after resuscitation. In contrast, positive emotions were highest before and after resuscitation and significantly lower during resuscitation. In general, females reported higher stress/overload and negative emotions, whereas males reported more positive emotions. A multivariate linear regression model showed negative associations of stress/overload (regression coefficient -18.12, 95% CI -30.73, -5.51, p = 0.006) and positive associations of motivation (regression coefficient 13.45, 95% CI 0.95, 25.95, p = 0.036) with resuscitation performance. A simulated cardiac arrest caused substantial perceived stress/overload and negative emotions, particularly in female students, which adversely impacted resuscitation performance. Further studies are required to expand our findings to more experienced medical professionals and investigate whether stress coping strategies improve resuscitation performance.

  5. Solar power plant performance evaluation: simulation and experimental validation

    Science.gov (United States)

    Natsheh, E. M.; Albarbar, A.

    2012-05-01

    In this work the performance of a solar power plant is evaluated based on a developed model comprising a photovoltaic array, battery storage, a controller and converters. The model is implemented using the MATLAB/SIMULINK software package. The perturb and observe (P&O) algorithm is used for maximizing the generated power through maximum power point tracker (MPPT) implementation. The outcomes of the developed model are validated and supported by a case study carried out using an operational 28.8 kW grid-connected solar power plant located in central Manchester. Measurements were taken over a 21-month period, using hourly average irradiance and cell temperature. It was found that system degradation could be clearly monitored by determining the residual (the difference) between the output power predicted by the model and the actual measured power parameters. It was found that the residual exceeded the healthy threshold of 1.7 kW due to heavy snow in Manchester last winter. More importantly, the developed performance evaluation technique could be adopted to detect any other factors that may degrade the performance of the PV panels, such as shading and dirt. Repeatability and reliability of the developed system performance were validated during this period. Good agreement was achieved between the theoretical simulation and the real-time measurements taken from the online grid-connected solar power plant.
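
    The perturb and observe logic mentioned above can be summarised in a few lines. The sketch below is a generic textbook formulation with an assumed fixed voltage step and a toy power-voltage curve; it is not the controller implemented in the cited MATLAB/SIMULINK model.

        def perturb_and_observe(v, p, v_prev, p_prev, step=0.5):
            # One P&O iteration: keep perturbing the voltage reference in the
            # direction that increased power, otherwise reverse direction.
            if p >= p_prev:
                direction = 1.0 if v >= v_prev else -1.0
            else:
                direction = -1.0 if v >= v_prev else 1.0
            return v + direction * step

        # Example: track towards the maximum of a toy P-V curve p(v) = v * (10 - v)
        p_of_v = lambda v: v * (10.0 - v)
        v_prev, v = 4.0, 4.5
        for _ in range(20):
            v_next = perturb_and_observe(v, p_of_v(v), v_prev, p_of_v(v_prev))
            v_prev, v = v, v_next
        print(round(v, 2))  # the reference oscillates around the maximum power point at v = 5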

  6. Solar power plant performance evaluation: simulation and experimental validation

    International Nuclear Information System (INIS)

    Natsheh, E M; Albarbar, A

    2012-01-01

    In this work the performance of a solar power plant is evaluated based on a developed model comprising a photovoltaic array, battery storage, a controller and converters. The model is implemented using the MATLAB/SIMULINK software package. The perturb and observe (P and O) algorithm is used for maximizing the generated power through maximum power point tracker (MPPT) implementation. The outcomes of the developed model are validated and supported by a case study carried out using an operational 28.8 kW grid-connected solar power plant located in central Manchester. Measurements were taken over a 21-month period, using hourly average irradiance and cell temperature. It was found that system degradation could be clearly monitored by determining the residual (the difference) between the output power predicted by the model and the actual measured power parameters. It was found that the residual exceeded the healthy threshold of 1.7 kW due to heavy snow in Manchester last winter. More importantly, the developed performance evaluation technique could be adopted to detect any other factors that may degrade the performance of the PV panels, such as shading and dirt. Repeatability and reliability of the developed system performance were validated during this period. Good agreement was achieved between the theoretical simulation and the real-time measurements taken from the online grid-connected solar power plant.

  7. Wireless Communication for Controlling Microgrids: Co-simulation and Performance Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Mao, Rukun [ORNL; Xu, Yan [ORNL; Li, Huijuan [ORNL; Li, Husheng [ORNL

    2013-01-01

    A microgrid with wireless communication links for microgrid control has been designed and developed. The complete simulation model has been developed in MatLab SimuLink with seamless integration of the power subsystem and the communication subsystem. Unlike the conventional co-simulators that usually glue two existing simulators together by creating an interface, which has a steep learning curve, the proposed simulator is a compact single-unit model. Detailed modeling of the power subsystem and communication system is presented as well as the microgrid control architecture and strategies. The impact of different communication system performances on microgrid control has been studied and evaluated in the proposed simulator.

  8. Policy advice derived from simulation models

    NARCIS (Netherlands)

    Brenner, T.; Werker, C.

    2009-01-01

    When advising policy we face the fundamental problem that economic processes are connected with uncertainty and thus policy can err. In this paper we show how the use of simulation models can reduce policy errors. We suggest that policy is best based on so-called abductive simulation models, which

  9. Assessing performance and validating finite element simulations using probabilistic knowledge

    Energy Technology Data Exchange (ETDEWEB)

    Dolin, Ronald M.; Rodriguez, E. A. (Edward A.)

    2002-01-01

    Two probabilistic approaches for assessing performance are presented. The first approach assesses the probability of failure by simultaneously modeling all likely events. The probability that each event causes failure, along with the event's likelihood of occurrence, contributes to the overall probability of failure. The second assessment method is based on stochastic sampling using an influence diagram. Latin-hypercube sampling is used to stochastically assess events. The overall probability of failure is taken as the maximum probability of failure of all the events. The Likelihood of Occurrence simulation suggests failure does not occur, while the Stochastic Sampling approach predicts failure. The Likelihood of Occurrence results are used to validate finite element predictions.
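
    A minimal sketch of the stochastic sampling idea: Latin-hypercube samples of two uncertain inputs are pushed through a simple limit-state check (failure when load exceeds capacity). The distributions, sample size and limit state below are illustrative assumptions, not those used in the report.

        import numpy as np

        def latin_hypercube(n_samples, n_dims, rng):
            # One point in each of n_samples equal-probability strata per dimension,
            # with the strata independently permuted across dimensions.
            u = np.empty((n_samples, n_dims))
            for d in range(n_dims):
                strata = (np.arange(n_samples) + rng.random(n_samples)) / n_samples
                u[:, d] = rng.permutation(strata)
            return u

        rng = np.random.default_rng(0)
        u = latin_hypercube(10_000, 2, rng)
        load = 80.0 + 20.0 * u[:, 0]        # applied load, uniform on [80, 100] (assumed)
        capacity = 90.0 + 30.0 * u[:, 1]    # capacity, uniform on [90, 120] (assumed)
        p_fail = float(np.mean(load > capacity))   # failure when load exceeds capacity
        print(round(p_fail, 4))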

  10. VISION: Verifiable Fuel Cycle Simulation Model

    Energy Technology Data Exchange (ETDEWEB)

    Jacob J. Jacobson; Abdellatif M. Yacout; Gretchen E. Matthern; Steven J. Piet; David E. Shropshire

    2009-04-01

    The nuclear fuel cycle is a very complex system that includes considerable dynamic complexity as well as detail complexity. In the nuclear power realm, there are experts and considerable research and development in nuclear fuel development, separations technology, reactor physics and waste management. What is lacking is an overall understanding of the entire nuclear fuel cycle and how the deployment of new fuel cycle technologies affects the overall performance of the fuel cycle. The Advanced Fuel Cycle Initiative’s systems analysis group is developing a dynamic simulation model, VISION, to capture the relationships, timing and delays in and among the fuel cycle components to help develop an understanding of how the overall fuel cycle works and can transition as technologies are changed. This paper is an overview of the philosophy and development strategy behind VISION. The paper includes some descriptions of the model and some examples of how to use VISION.

  11. Model Validation for Simulations of Vehicle Systems

    Science.gov (United States)

    2012-08-01

    ...Sandia National Laboratories and a battery model developed in the Automotive Research Center, a US Army Center of Excellence for modeling and simulation of ground vehicle systems...

  12. Transient Modeling and Simulation of Compact Photobioreactors

    OpenAIRE

    Ribeiro, Robert Luis Lara; Mariano, André Bellin; Souza, Jeferson Avila; Vargas, Jose Viriato Coelho

    2008-01-01

    In this paper, a mathematical model is developed to make possible the simulation of microalgae growth and its dependency on medium temperature and light intensity. The model is utilized to simulate a compact photobioreactor response in time with physicochemical parameters of the microalgae Phaeodactylum tricornutum. The model allows for the prediction of the transient and local evolution of the biomass concentration in the photobioreactor with low computational time. As a result, the model is...

  13. Characterising performance of environmental models

    NARCIS (Netherlands)

    Bennett, N.D.; Croke, B.F.W.; Guariso, G.; Guillaume, J.H.A.; Hamilton, S.H.; Jakeman, A.J.; Marsili-Libelli, S.; Newham, L.T.H.; Norton, J.; Perrin, C.; Pierce, S.; Robson, B.; Seppelt, R.; Voinov, A.; Fath, B.D.; Andreassian, V.

    2013-01-01

    In order to use environmental models effectively for management and decision-making, it is vital to establish an appropriate level of confidence in their performance. This paper reviews techniques available across various fields for characterising the performance of environmental models with focus

  14. Heinrich events modeled in transient glacial simulations

    Science.gov (United States)

    Ziemen, Florian; Kapsch, Marie; Mikolajewicz, Uwe

    2017-04-01

    Heinrich events are among the most prominent events of climate variability recorded in proxies across the northern hemisphere. They are the archetype of ice sheet — climate interactions on millennial time scales. Nevertheless, the exact mechanisms that cause Heinrich events are still under debate, and their climatic consequences are far from being fully understood. We address open questions by studying Heinrich events in a coupled ice sheet model (ISM) atmosphere-ocean-vegetation general circulation model (AOVGCM) framework, where this variability occurs as part of the model generated internal variability. The framework consists of a northern hemisphere setup of the modified Parallel Ice Sheet Model (mPISM) coupled to the global AOVGCM ECHAM5/MPIOM/LPJ. The simulations were performed fully coupled and with transient orbital and greenhouse gas forcing. They span from several millennia before the last glacial maximum into the deglaciation. To make these long simulations feasible, the atmosphere is accelerated by a factor of 10 relative to the other model components using a periodical-synchronous coupling technique. To disentangle effects of the Heinrich events and the deglaciation, we focus on the events occurring before the deglaciation. The modeled Heinrich events show a peak ice discharge of about 0.05 Sv and raise the sea level by 2.3 m on average. The resulting surface water freshening reduces the Atlantic meridional overturning circulation and ocean heat release. The reduction in ocean heat release causes a sub-surface warming and decreases the air temperature and precipitation regionally and downstream into Eurasia. The surface elevation decrease of the ice sheet enhances moisture transport onto the ice sheet and thus increases precipitation over the Hudson Bay area, thereby accelerating the recovery after an event.

  15. Simulation model for port shunting yards

    Science.gov (United States)

    Rusca, A.; Popa, M.; Rosca, E.; Rosca, M.; Dragu, V.; Rusca, F.

    2016-08-01

    Sea ports are important nodes in the supply chain, joining two high-capacity transport modes: rail and maritime transport. The huge cargo flows transiting a port require high-capacity constructions and installations such as berths, large-capacity cranes and shunting yards. However, the specificity of port shunting yards raises several problems, such as limited access (these are terminus stations for the rail network), the input/output of large transit flows of cargo relative to the scarcity of ship departures/arrivals, and limited land availability for implementing solutions to serve these flows. It is necessary to identify technological solutions that lead to an answer to these problems. The paper proposes a simulation model developed with the ARENA computer simulation software, suitable for shunting yards which serve sea ports with access to the rail network. It investigates the principal aspects of shunting yards and adequate measures to increase their transit capacity. The operation capacity of the shunting yard sub-system is assessed taking into consideration the required operating standards, and measures of performance of the railway station (e.g. waiting time for freight wagons, number of railway lines in the station, storage area, etc.) are computed. The conclusions and results drawn from the simulation help transport and logistics specialists to test proposals for improving port management.

  16. Numerical simulation of hydrodynamic performance of ship under oblique conditions

    Directory of Open Access Journals (Sweden)

    CHEN Zhiming

    2018-02-01

    Full Text Available [Objectives] This paper is intended to study the viscous flow field around a ship under oblique conditions and provide a research basis for ship maneuverability. [Methods] Using the commercial software STAR-CCM+, the SST k-ω turbulence model is selected to predict the hydrodynamic performance of the KVLCC2 model at different drift angles and to predict the hull flow field. The pressure distribution of the ship model at different drift angles is observed, as are the vortex shedding of the ship's hull and the constraint streamlines on the hull's surface. [Results] The results show that numerical simulation can satisfy the demands of engineering application in the prediction of the lateral force, yaw moment and hull surface pressure distribution of a ship. [Conclusions] The research results of this paper can provide valuable references for the study of the flow separation phenomenon under oblique conditions.

  17. Critical review of glass performance modeling

    International Nuclear Information System (INIS)

    Bourcier, W.L.

    1994-07-01

    Borosilicate glass is to be used for permanent disposal of high-level nuclear waste in a geologic repository. Mechanistic chemical models are used to predict the rate at which radionuclides will be released from the glass under repository conditions. The most successful and useful of these models link reaction path geochemical modeling programs with a glass dissolution rate law that is consistent with transition state theory. These models have been used to simulate several types of short-term laboratory tests of glass dissolution and to predict the long-term performance of the glass in a repository. Although mechanistically based, the current models are limited by a lack of unambiguous experimental support for some of their assumptions. The most severe problem of this type is the lack of an existing validated mechanism that controls long-term glass dissolution rates. Current models can be improved by performing carefully designed experiments and using the experimental results to validate the rate-controlling mechanisms implicit in the models. These models should be supported with long-term experiments to be used for model validation. The mechanistic basis of the models should be explored by using modern molecular simulations such as molecular orbital and molecular dynamics to investigate both the glass structure and its dissolution process
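
    The "rate law consistent with transition state theory" mentioned above is commonly written as a forward dissolution rate scaled by a chemical-affinity term. The sketch below shows this general form only; the symbols, parameter values and the simple (1 - Q/K) affinity dependence are illustrative assumptions rather than the specific parameterisation reviewed in the report.

        import math

        R = 8.314  # gas constant, J/(mol*K)

        def dissolution_rate(k0, surface_area, Ea, T, a_H, eta, Q, K):
            # General affinity-based rate law:
            #   rate = k0 * S * a_H^eta * exp(-Ea / (R*T)) * (1 - Q/K)
            # Far from saturation (Q << K) the forward term dominates;
            # as Q approaches K the net rate tends to zero.
            return k0 * surface_area * (a_H ** eta) * math.exp(-Ea / (R * T)) * (1.0 - Q / K)

        # Example: net rate far from saturation vs. near saturation (illustrative numbers)
        for Q in (1e-6, 9e-4):
            print(dissolution_rate(k0=1e-3, surface_area=1.0, Ea=60e3, T=363.0,
                                   a_H=1e-9, eta=-0.4, Q=Q, K=1e-3))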

  18. Critical review of glass performance modeling

    Energy Technology Data Exchange (ETDEWEB)

    Bourcier, W.L. [Lawrence Livermore National Lab., CA (United States)

    1994-07-01

    Borosilicate glass is to be used for permanent disposal of high-level nuclear waste in a geologic repository. Mechanistic chemical models are used to predict the rate at which radionuclides will be released from the glass under repository conditions. The most successful and useful of these models link reaction path geochemical modeling programs with a glass dissolution rate law that is consistent with transition state theory. These models have been used to simulate several types of short-term laboratory tests of glass dissolution and to predict the long-term performance of the glass in a repository. Although mechanistically based, the current models are limited by a lack of unambiguous experimental support for some of their assumptions. The most severe problem of this type is the lack of an existing validated mechanism that controls long-term glass dissolution rates. Current models can be improved by performing carefully designed experiments and using the experimental results to validate the rate-controlling mechanisms implicit in the models. These models should be supported with long-term experiments to be used for model validation. The mechanistic basis of the models should be explored by using modern molecular simulations such as molecular orbital and molecular dynamics to investigate both the glass structure and its dissolution process.

  19. Whole-building Hygrothermal Simulation Model

    DEFF Research Database (Denmark)

    Rode, Carsten; Grau, Karl

    2003-01-01

    An existing integrated simulation tool for dynamic thermal simulation of building was extended with a transient model for moisture release and uptake in building materials. Validation of the new model was begun with comparison against measurements in an outdoor test cell furnished with single...... materials. Almost quasi-steady, cyclic experiments were used to compare the indoor humidity variation and the numerical results of the integrated simulation tool with the new moisture model. Except for the case with chipboard as furnishing, the predictions of indoor humidity with the detailed model were...

  20. A Network Contention Model for the Extreme-scale Simulator

    Energy Technology Data Exchange (ETDEWEB)

    Engelmann, Christian [ORNL; Naughton III, Thomas J [ORNL

    2015-01-01

    The Extreme-scale Simulator (xSim) is a performance investigation toolkit for high-performance computing (HPC) hardware/software co-design. It permits running a HPC application with millions of concurrent execution threads, while observing its performance in a simulated extreme-scale system. This paper details a newly developed network modeling feature for xSim, eliminating the shortcomings of the existing network modeling capabilities. The approach takes a different path for implementing network contention and bandwidth capacity modeling using a less synchronous and accurate enough model design. With the new network modeling feature, xSim is able to simulate on-chip and on-node networks with reasonable accuracy and overheads.

  1. Photovoltaic performance models - A report card

    Science.gov (United States)

    Smith, J. H.; Reiter, L. R.

    1985-01-01

    Models for the analysis of photovoltaic (PV) systems' designs, implementation policies, and economic performance, have proliferated while keeping pace with rapid changes in basic PV technology and extensive empirical data compiled for such systems' performance. Attention is presently given to the results of a comparative assessment of ten well documented and widely used models, which range in complexity from first-order approximations of PV system performance to in-depth, circuit-level characterizations. The comparisons were made on the basis of the performance of their subsystem, as well as system, elements. The models fall into three categories in light of their degree of aggregation into subsystems: (1) simplified models for first-order calculation of system performance, with easily met input requirements but limited capability to address more than a small variety of design considerations; (2) models simulating PV systems in greater detail, encompassing types primarily intended for either concentrator-incorporating or flat plate collector PV systems; and (3) models not specifically designed for PV system performance modeling, but applicable to aspects of electrical system design. Models ignoring subsystem failure or degradation are noted to exclude operating and maintenance characteristics as well.

  2. Simulation modeling for the health care manager.

    Science.gov (United States)

    Kennedy, Michael H

    2009-01-01

    This article addresses the use of simulation software to solve administrative problems faced by health care managers. Spreadsheet add-ins, process simulation software, and discrete event simulation software are available at a range of costs and complexity. All use the Monte Carlo method to realistically integrate probability distributions into models of the health care environment. Problems typically addressed by health care simulation modeling are facility planning, resource allocation, staffing, patient flow and wait time, routing and transportation, supply chain management, and process improvement.
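
    As a minimal illustration of the Monte Carlo idea described above, the sketch below samples interarrival and service times from exponential distributions for a single-server clinic and estimates the average patient wait. The distributions, parameter values and single-server assumption are illustrative, not taken from the article.

        import random

        def simulate_clinic(n_patients=10_000, mean_interarrival=12.0, mean_service=10.0, seed=1):
            # Single-server clinic with exponential interarrival and service times.
            # Returns the average patient wait (minutes) over one long run.
            rng = random.Random(seed)
            clock = 0.0
            server_free_at = 0.0
            total_wait = 0.0
            for _ in range(n_patients):
                clock += rng.expovariate(1.0 / mean_interarrival)   # next arrival
                start = max(clock, server_free_at)
                total_wait += start - clock
                server_free_at = start + rng.expovariate(1.0 / mean_service)
            return total_wait / n_patients

        print(round(simulate_clinic(), 1))  # average wait grows sharply as utilisation nears 1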

  3. A Concept of Simulation-based SC Performance Analysis Using SCOR Metrics

    Directory of Open Access Journals (Sweden)

    Šitova Irīna

    2017-12-01

    Full Text Available The paper discusses a common approach for simulation specialists and supply chain managers to describing and analysing supply chains, based on Supply Chain Operations Reference (SCOR) model indicators and metrics. SCOR is a reference model of supply chain business processes. It is based on best practices and used in various business areas of supply chains. Supply chain performance indicators are defined by numerous measurable SCOR metrics. Some metrics can be estimated with simulation models. For an efficient supply chain analysis, one should evaluate the conformity of SCOR metrics with simulation-based assessment of performance indicators. Analysing projects in the Supply Chain (SC) modelling area as well as analysing the types of simulation results enables one to assess the conformity of simulation-based performance indicators with SCOR model metrics of different levels. Supply chain simulation modelling coordinated with the SCOR model expands the scope of simulation model applications for analysing supply chain performance indicators. It helps one estimate specific metrics with simulation results.

  4. Modeling VOC transport in simulated waste drums

    International Nuclear Information System (INIS)

    Liekhus, K.J.; Gresham, G.L.; Peterson, E.S.; Rae, C.; Hotz, N.J.; Connolly, M.J.

    1993-06-01

    A volatile organic compound (VOC) transport model has been developed to describe unsteady-state VOC permeation and diffusion within a waste drum. Model equations account for three primary mechanisms for VOC transport from a void volume within the drum. These mechanisms are VOC permeation across a polymer boundary, VOC diffusion across an opening in a volume boundary, and VOC solubilization in a polymer boundary. A series of lab-scale experiments was performed in which the VOC concentration was measured in simulated waste drums under different conditions. A lab-scale simulated waste drum consisted of a sized-down 55-gal metal drum containing a modified rigid polyethylene drum liner. Four polyethylene bags were sealed inside a large polyethylene bag, supported by a wire cage, and placed inside the drum liner. The small bags were filled with a VOC-air gas mixture and the VOC concentration was measured throughout the drum over a period of time. Test variables included the type of VOC-air gas mixtures introduced into the small bags, the small bag closure type, and the presence or absence of a variable external heat source. Model results were calculated for those trials where the VOC permeability had been measured. Permeabilities for five VOCs [methylene chloride, 1,1,2-trichloro-1,2,2-trifluoroethane (Freon-113), 1,1,1-trichloroethane, carbon tetrachloride, and trichloroethylene] were measured across a polyethylene bag. Comparison of model and experimental results of VOC concentration as a function of time indicates that the model accurately accounts for the significant VOC transport mechanisms in a lab-scale waste drum
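
    A minimal sketch of the first transport mechanism listed above (VOC permeation across a polymer boundary), written as a single well-mixed-volume ODE integrated with explicit Euler. The rate-constant form, parameter values and units are illustrative assumptions and not the report's model, which also treats diffusion through openings and solubilization.

        def simulate_permeation(c0, c_out, permeability, area, thickness, volume, dt, n_steps):
            # Explicit Euler integration of dC/dt = -(P*A / (V*l)) * (C - C_out):
            # VOC loss from a well-mixed volume V across a polymer film of area A,
            # thickness l and permeability P (consistent units assumed).
            rate_const = permeability * area / (volume * thickness)
            c = c0
            history = [c]
            for _ in range(n_steps):
                c += -rate_const * (c - c_out) * dt
                history.append(c)
            return history

        # Illustrative run: the bag concentration decays exponentially towards c_out.
        conc = simulate_permeation(c0=1000.0, c_out=0.0, permeability=1e-9,
                                   area=0.5, thickness=1e-4, volume=0.01,
                                   dt=60.0, n_steps=1000)
        print(round(conc[-1], 3))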

  5. Modeling and Simulation of Matrix Converter

    DEFF Research Database (Denmark)

    Liu, Fu-rong; Klumpner, Christian; Blaabjerg, Frede

    2005-01-01

    This paper discusses the modeling and simulation of the matrix converter. Two models of the matrix converter are presented: one is based on indirect space vector modulation and the other is based on the power balance equation. The basis of these two models is given and the process of modeling is introduced...

  6. Improving the Performance of the Extreme-scale Simulator

    Energy Technology Data Exchange (ETDEWEB)

    Engelmann, Christian [ORNL; Naughton III, Thomas J [ORNL

    2014-01-01

    Investigating the performance of parallel applications at scale on future high-performance computing (HPC) architectures and the performance impact of different architecture choices is an important component of HPC hardware/software co-design. The Extreme-scale Simulator (xSim) is a simulation-based toolkit for investigating the performance of parallel applications at scale. xSim scales to millions of simulated Message Passing Interface (MPI) processes. The overhead introduced by a simulation tool is an important performance and productivity aspect. This paper documents two improvements to xSim: (1) a new deadlock resolution protocol to reduce the parallel discrete event simulation management overhead and (2) a new simulated MPI message matching algorithm to reduce the oversubscription management overhead. The results clearly show a significant performance improvement, such as reducing the simulation overhead for running the NAS Parallel Benchmark suite inside the simulator from 1,020% to 238% for the conjugate gradient (CG) benchmark and from 102% to 0% for the embarrassingly parallel (EP) benchmark, as well as from 37,511% to 13,808% for CG and from 3,332% to 204% for EP with accurate process failure simulation.

  7. Firm Sustainability Performance Index Modeling

    Directory of Open Access Journals (Sweden)

    Che Wan Jasimah Bt Wan Mohamed Radzi

    2015-12-01

    Full Text Available The main objective of this paper is to present a model for a firm sustainability performance index by applying both classical and Bayesian structural equation modeling (parametric and semi-parametric modeling). Both techniques are applied to research data collected through a survey directed at the food manufacturing industries of China, Taiwan, and Malaysia. For estimating the firm sustainability performance index, we consider three main indicators: knowledge management, organizational learning, and business strategy. Based on both the Bayesian and classical methodologies, we confirmed that knowledge management and business strategy have a significant impact on the firm sustainability performance index.

  8. MMSNF 2005. Materials models and simulations for nuclear fuels

    Energy Technology Data Exchange (ETDEWEB)

    Freyss, M.; Durinck, J.; Carlot, G.; Sabathier, C.; Martin, P.; Garcia, P.; Ripert, M.; Blanpain, P.; Lippens, M.; Schut, H.; Federov, A.V.; Bakker, K.; Osaka, M.; Miwa, S.; Sato, I.; Tanaka, K.; Kurosaki, K.; Uno, M.; Yamanaka, S.; Govers, K.; Verwerft, M.; Hou, M.; Lemehov, S.E.; Terentyev, D.; Govers, K.; Kotomin, E.A.; Ashley, N.J.; Grimes, R.W.; Van Uffelen, P.; Mastrikov, Y.; Zhukovskii, Y.; Rondinella, V.V.; Kurosaki, K.; Uno, M.; Yamanaka, S.; Minato, K.; Phillpot, S.; Watanabe, T.; Shukla, P.; Sinnott, S.; Nino, J.; Grimes, R.; Staicu, D.; Hiernaut, J.P.; Wiss, T.; Rondinella, V.V.; Ronchi, C.; Yakub, E.; Kaye, M.H.; Morrison, C.; Higgs, J.D.; Akbari, F.; Lewis, B.J.; Thompson, W.T.; Gueneau, C.; Gosse, S.; Chatain, S.; Dumas, J.C.; Sundman, B.; Dupin, N.; Konings, R.; Noel, H.; Veshchunov, M.; Dubourg, R.; Ozrin, C.V.; Veshchunov, M.S.; Welland, M.T.; Blanc, V.; Michel, B.; Ricaud, J.M.; Calabrese, R.; Vettraino, F.; Tverberg, T.; Kissane, M.; Tulenko, J.; Stan, M.; Ramirez, J.C.; Cristea, P.; Rachid, J.; Kotomin, E.; Ciriello, A.; Rondinella, V.V.; Staicu, D.; Wiss, T.; Konings, R.; Somers, J.; Killeen, J

    2006-07-01

    The MMSNF Workshop series aims at stimulating research and discussions on models and simulations of nuclear fuels and coupling the results into fuel performance codes. This edition was focused on materials science and engineering for fuel performance codes. The presentations were grouped in three technical sessions: fundamental modelling of fuel properties; integral fuel performance codes and their validation; collaborations and integration of activities. (A.L.B.)

  9. Physically realistic modeling of maritime training simulation

    OpenAIRE

    Cieutat , Jean-Marc

    2003-01-01

    Maritime training simulation is an important part of maritime teaching, which requires a lot of scientific and technical skills. In this framework, where the real-time constraint has to be maintained, not all physical phenomena can be studied; only the most visible physical phenomena relating to the natural elements and the ship behaviour are reproduced. Our swell model, based on a surface wave simulation approach, makes it possible to simulate the shape and the propagation of a regular train of waves f...

  10. Systematic modelling and simulation of refrigeration systems

    DEFF Research Database (Denmark)

    Rasmussen, Bjarne D.; Jakobsen, Arne

    1998-01-01

    The task of developing a simulation model of a refrigeration system can be very difficult and time consuming. In order for this process to be effective, a systematic method for developing the system model is required. This method should aim at guiding the developer to clarify the purpose of the simulation, to select appropriate component models and to set up the equations in a well-arranged way. In this paper the outline of such a method is proposed and examples showing the use of this method for simulation of refrigeration systems are given.

  11. Monte Carlo simulation to analyze the performance of CPV modules

    Science.gov (United States)

    Herrero, Rebeca; Antón, Ignacio; Sala, Gabriel; De Nardis, Davide; Araki, Kenji; Yamaguchi, Masafumi

    2017-09-01

    A model to evaluate the performance of high concentrator photovoltaic (HCPV) modules (one that generates current-voltage curves) has been applied together with a Monte Carlo approach to obtain a distribution of modules with a given set of characteristics (e.g., receivers' electrical properties and misalignments within elementary units in modules) related to a manufacturing scenario. In this paper, the performance of CPV systems (tracker and inverter) that contain the set of simulated modules is evaluated depending on different system characteristics: inverter configuration, sorting of modules, and bending of the tracker frame. Thus, the study of the HCPV technology regarding its angular constraints is fully covered by analyzing all the possible elements affecting the generated electrical power.

  12. A quantum energy transport model for semiconductor device simulation

    Energy Technology Data Exchange (ETDEWEB)

    Sho, Shohiro, E-mail: shoshohiro@gmail.com [Graduate School of Information Science and Technology, Osaka University, Osaka (Japan); Odanaka, Shinji [Computer Assisted Science Division, Cybermedia Center, Osaka University, Osaka (Japan)

    2013-02-15

    This paper describes numerical methods for a quantum energy transport (QET) model in semiconductors, which is derived by using a diffusion scaling in the quantum hydrodynamic (QHD) model. We newly derive a four-moment QET model similar to the classical ET model. Space discretization is performed with a new set of unknown variables. Numerical stability and convergence are obtained by developing numerical schemes and an iterative solution method with a relaxation method. Numerical simulations of electron transport in a scaled MOSFET device are discussed. The QET model allows simulations of quantum confinement transport, and nonlocal and hot-carrier effects in scaled MOSFETs.

  13. A View on Future Building System Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Wetter, Michael

    2011-04-01

    This chapter presents what a future environment for building system modeling and simulation may look like. As buildings continue to require increased performance and better comfort, their energy and control systems are becoming more integrated and complex. We therefore focus in this chapter on the modeling, simulation and analysis of building energy and control systems. Such systems can be classified as heterogeneous systems because they involve multiple domains, such as thermodynamics, fluid dynamics, heat and mass transfer, electrical systems, control systems and communication systems. Also, they typically involve multiple temporal and spatial scales, and their evolution can be described by coupled differential equations, discrete equations and events. Modeling and simulating such systems requires a higher level of abstraction and modularisation to manage the increased complexity compared to what is used in today's building simulation programs. Therefore, the trend towards more integrated building systems is likely to be a driving force for changing the status quo of today's building simulation programs. This chapter discusses evolving modeling requirements and outlines a path toward a future environment for modeling and simulation of heterogeneous building systems. A range of topics that would require many additional pages of discussion has been omitted. Examples include computational fluid dynamics for air and particle flow in and around buildings, people movement, daylight simulation, uncertainty propagation and optimisation methods for building design and controls. For different discussions and perspectives on the future of building modeling and simulation, we refer to Sahlin (2000), Augenbroe (2001) and Malkawi and Augenbroe (2004).

  14. Simulation Model of Mobile Detection Systems

    International Nuclear Information System (INIS)

    Edmunds, T.; Faissol, D.; Yao, Y.

    2009-01-01

    In this paper, we consider a mobile source that we attempt to detect with man-portable, vehicle-mounted or boat-mounted radiation detectors. The source is assumed to transit an area populated with these mobile detectors, and the objective is to detect the source before it reaches a perimeter. We describe a simulation model developed to estimate the probability that one of the mobile detectors will come into close proximity of the moving source and detect it. We illustrate with a maritime simulation example. Our simulation takes place in a 10 km by 5 km rectangular bay patrolled by boats equipped with 2-inch x 4-inch x 16-inch NaI detectors. Boats to be inspected enter the bay and randomly proceed to one of seven harbors on the shore. A source-bearing boat enters the mouth of the bay and proceeds to a pier on the opposite side. We wish to determine the probability that the source is detected and its range from the target when detected. Patrol boats select the nearest in-bound boat for inspection and initiate an intercept course. Once within an operational range for the detection system, a detection algorithm is started. If the patrol boat confirms the source is not present, it selects the next nearest boat for inspection. Each run of the simulation ends either when a patrol successfully detects a source or when the source reaches its target. Several statistical detection algorithms have been implemented in the simulation model. First, a simple k-sigma algorithm, which alarms when the counts in a time window exceed the mean background plus k times the standard deviation of background, is available to the user. The time window used is optimized with respect to the signal-to-background ratio for that range and relative speed. Second, a sequential probability ratio test [Wald 1947] is available, and configured in this simulation with a target false positive probability of 0.001 and false negative probability of 0.1. This test is utilized when the mobile detector maintains
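
    A minimal sketch of the k-sigma rule described above: alarm when the gross counts in a time window exceed the mean background plus k times the background standard deviation (for Poisson counts, sigma is the square root of the mean). The background level, number of windows and k value below are illustrative assumptions; the sequential probability ratio test is not sketched here.

        import numpy as np

        def k_sigma_alarm(counts, background_mean, k=3.0):
            # Alarm when gross counts exceed mean background plus k standard deviations.
            return counts > background_mean + k * np.sqrt(background_mean)

        rng = np.random.default_rng(42)
        background_mean = 400.0                              # counts per window (assumed)
        bkg_windows = rng.poisson(background_mean, 100_000)  # background-only windows
        false_alarm_rate = float(np.mean(k_sigma_alarm(bkg_windows, background_mean, k=3.0)))
        print(false_alarm_rate)   # roughly the one-sided 3-sigma tail, on the order of 1e-3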

  15. Computer simulation of steady-state performance of air-to-air heat pumps

    Energy Technology Data Exchange (ETDEWEB)

    Ellison, R D; Creswick, F A

    1978-03-01

    A computer model by which the performance of air-to-air heat pumps can be simulated is described. The intended use of the model is to evaluate analytically the improvements in performance that can be effected by various component improvements. The model is based on a trio of independent simulation programs originated at the Massachusetts Institute of Technology Heat Transfer Laboratory. The three programs have been combined so that user intervention and decision making between major steps of the simulation are unnecessary. The program was further modified by substituting a new compressor model and adding a capillary tube model, both of which are described. Performance predicted by the computer model is shown to be in reasonable agreement with performance data observed in our laboratory. Planned modifications by which the utility of the computer model can be enhanced in the future are described. User instructions and a FORTRAN listing of the program are included.

  16. Development of a Simulation Model for Swimming with Diving Fins

    Directory of Open Access Journals (Sweden)

    Motomu Nakashima

    2018-02-01

    Full Text Available A simulation model to assess the performance of diving fins was developed by extending the swimming human simulation model SWUM. A diving fin was modeled as a series of five rigid plates connected to the human model by springs and dampers. These plates were connected to each other by virtual springs and dampers, and the fin's bending properties were represented by these springs and dampers as well. An actual diver's swimming motion with fins was acquired in a motion capture experiment. In order to determine the bending properties of the fin, two bending tests on land were conducted. In addition, an experiment was conducted to determine the fluid force coefficients in the fluid force model for the fin. Finally, using all measured and identified information, a simulation reproducing the experimental situation was carried out. It was confirmed that the diver in the simulation propelled forward in the water successfully.

  17. Simulation-Based Internal Models for Safer Robots

    Directory of Open Access Journals (Sweden)

    Christian Blum

    2018-01-01

    Full Text Available In this paper, we explore the potential of mobile robots with simulation-based internal models for safety in highly dynamic environments. We propose a robot with a simulation of itself, other dynamic actors and its environment, inside itself. Operating in real time, this simulation-based internal model is able to look ahead and predict the consequences of both the robot’s own actions and those of the other dynamic actors in its vicinity. Hence, the robot continuously modifies its own actions in order to actively maintain its own safety while also achieving its goal. Inspired by the problem of how mobile robots could move quickly and safely through crowds of moving humans, we present experimental results which compare the performance of our internal simulation-based controller with a purely reactive approach as a proof-of-concept study for the practical use of simulation-based internal models.
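
    The look-ahead loop described above can be illustrated schematically; the sketch below is not the authors' controller, and the one-step world model, candidate action set and safety margin are stand-in assumptions chosen only to show the idea of simulating consequences before acting.

      import math

      def simulate_step(pos, vel, dt=0.5):
          # Kinematic prediction used by the internal model.
          return (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt)

      def choose_action(robot_pos, goal, actors, actions, dt=0.5, safe_dist=1.0):
          """Look one step ahead for every candidate velocity and keep the action
          that stays clear of the predicted actor positions while making the most
          progress towards the goal."""
          best, best_score = None, -math.inf
          predicted_actors = [simulate_step(p, v, dt) for p, v in actors]
          for action in actions:
              next_pos = simulate_step(robot_pos, action, dt)
              clearance = min(math.dist(next_pos, a) for a in predicted_actors)
              if clearance < safe_dist:          # predicted to be unsafe: discard
                  continue
              progress = -math.dist(next_pos, goal)
              if progress > best_score:
                  best, best_score = action, progress
          return best if best is not None else (0.0, 0.0)   # stop if nothing is safe

      # One actor at (2, 0) walking towards the robot; four candidate velocities.
      actors = [((2.0, 0.0), (-1.0, 0.0))]
      actions = [(1.0, 0.0), (0.0, 1.0), (0.0, -1.0), (0.0, 0.0)]
      print(choose_action((0.0, 0.0), (5.0, 0.0), actors, actions))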

  18. Student-Perceived Influences on Performance During Simulation.

    Science.gov (United States)

    Burbach, Beth E; Thompson, Sarah A; Barnason, Susan A; Wilhelm, Susan L; Kotcherlakota, Suhasini; Miller, Connie L; Paulman, Paul M

    2016-07-01

    Understanding the effect of the context of simulation on learning and performance is critical, not only to ensure optimal learning but also to provide a valid and reliable means of evaluating performance. The purpose of this study is to identify influences on performance from the student perspective and to understand the contextual barriers inherent in simulation before using simulation for high-stakes testing. This study used a qualitative descriptive design. Senior nursing students (N = 29) provided nursing care during simulation. Vocalized thoughts during simulation and reflective debriefing were digitally recorded and transcribed verbatim. Thematic analysis was conducted on the transcribed data. Student performance during simulation was influenced by anxiety, uncertainty, technological limitations, and experience with the patient condition. Students had few previous simulation-based learning experiences, which may have influenced performance. More needs to be understood regarding factors affecting simulation performance before pass-or-fail decisions are made using this technology. [J Nurs Educ. 2016;55(7):396-398.] Copyright 2016, SLACK Incorporated.

  19. Modeling and simulation of complex systems a framework for efficient agent-based modeling and simulation

    CERN Document Server

    Siegfried, Robert

    2014-01-01

    Robert Siegfried presents a framework for efficient agent-based modeling and simulation of complex systems. He compares different approaches for describing structure and dynamics of agent-based models in detail. Based on this evaluation the author introduces the "General Reference Model for Agent-based Modeling and Simulation" (GRAMS). Furthermore he presents parallel and distributed simulation approaches for execution of agent-based models -from small scale to very large scale. The author shows how agent-based models may be executed by different simulation engines that utilize underlying hard

  20. Magnetosphere Modeling: From Cartoons to Simulations

    Science.gov (United States)

    Gombosi, T. I.

    2017-12-01

    Over the last half a century, physics-based global computer simulations became a bridge between experiment and basic theory, and they now represent the "third pillar" of geospace research. Today, many of our scientific publications utilize large-scale simulations to interpret observations, test new ideas, plan campaigns, or design new instruments. Realistic simulations of the complex Sun-Earth system have been made possible by the dramatically increased power of both computing hardware and numerical algorithms. Early magnetosphere models were based on simple E&M concepts (like the Chapman-Ferraro cavity) and hydrodynamic analogies (bow shock). At the beginning of the space age, current-system models were developed, culminating in the sophisticated Tsyganenko-type description of the magnetic configuration. The first 3D MHD simulations of the magnetosphere were published in the early 1980s. A decade later there were several competing global models that were able to reproduce many fundamental properties of the magnetosphere. The leading models included the impact of the ionosphere by using a height-integrated electric potential description. Dynamic coupling of global and regional models started in the early 2000s by integrating a ring current and a global magnetosphere model. It has been recognized for quite some time that plasma kinetic effects play an important role. Presently, global hybrid simulations of the dynamic magnetosphere are expected to be possible on exascale supercomputers, while fully kinetic simulations with realistic mass ratios are still decades away. In the 2010s several groups started to experiment with PIC simulations embedded in large-scale 3D MHD models. Presently this integrated MHD-PIC approach is at the forefront of magnetosphere simulations and this technique is expected to lead to some important advances in our understanding of magnetospheric physics. This talk will review the evolution of magnetosphere modeling from cartoons to current systems

  1. Coupling global models for hydrology and nutrient loading to simulate nitrogen and phosphorus retention in surface water – description of IMAGE–GNM and analysis of performance

    NARCIS (Netherlands)

    Beusen, A.H.W.; van Beek, L.P.H.; Bouwman, Lex; Mogollon, J.M.; Middelburg, J.B.M.

    2015-01-01

    The Integrated Model to Assess the Global Environment–Global Nutrient Model (IMAGE–GNM) is a global distributed, spatially explicit model using hydrology as the basis for describing nitrogen (N) and phosphorus (P) delivery to surface water, transport and in-stream retention in rivers, lakes,

  2. Assessing Ecosystem Model Performance in Semiarid Systems

    Science.gov (United States)

    Thomas, A.; Dietze, M.; Scott, R. L.; Biederman, J. A.

    2017-12-01

    In ecosystem process modelling, comparing outputs to benchmark datasets observed in the field is an important way to validate models, allowing the modelling community to track model performance over time and compare models at specific sites. Multi-model comparison projects as well as models themselves have largely been focused on temperate forests and similar biomes. Semiarid regions, on the other hand, are underrepresented in land surface and ecosystem modelling efforts, and yet will be disproportionately impacted by disturbances such as climate change due to their sensitivity to changes in the water balance. Benchmarking models at semiarid sites is an important step in assessing and improving models' suitability for predicting the impact of disturbance on semiarid ecosystems. In this study, several ecosystem models were compared at a semiarid grassland in southwestern Arizona using PEcAn, or the Predictive Ecosystem Analyzer, an open-source eco-informatics toolbox ideal for creating the repeatable model workflows necessary for benchmarking. Models included SIPNET, DALEC, JULES, ED2, GDAY, LPJ-GUESS, MAESPA, CLM, CABLE, and FATES. Comparison between model output and benchmarks such as net ecosystem exchange (NEE) tended to produce high root mean square error and low correlation coefficients, reflecting poor simulation of seasonality and the tendency for models to create much higher carbon sources than observed. These results indicate that ecosystem models do not currently adequately represent semiarid ecosystem processes.
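
    The benchmark statistics mentioned here (root mean square error and correlation against observed fluxes) reduce to a few lines of code; the NEE arrays in the sketch below are invented placeholders, not PEcAn outputs or flux-tower data.

      import numpy as np

      def benchmark(simulated, observed):
          """Root mean square error and Pearson correlation between a model
          time series and a benchmark time series of the same length."""
          sim = np.asarray(simulated, dtype=float)
          obs = np.asarray(observed, dtype=float)
          rmse = np.sqrt(np.mean((sim - obs) ** 2))
          corr = np.corrcoef(sim, obs)[0, 1]
          return rmse, corr

      # Hypothetical daily NEE values (umol CO2 m-2 s-1), for illustration only.
      observed_nee = np.array([-1.2, -0.8, 0.3, 0.9, 0.4, -0.5])
      simulated_nee = np.array([-2.0, -1.5, 1.1, 1.8, 1.0, 0.2])
      print(benchmark(simulated_nee, observed_nee))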

  3. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    A familiar example of a feedback loop is the business model in which part of the output or profit is fed back as input or additional capital - for instance, a company may choose to reinvest 10% of the profit for expansion of the business. Such simple models, like ..... would help scientists, engineers and managers towards better.
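
    The reinvestment feedback loop in this snippet can be written as a short recurrence; the starting capital and profit margin below are made up purely for illustration.

      def reinvestment_loop(capital, margin=0.2, reinvest_rate=0.10, periods=5):
          """Feedback loop: a fixed share of each period's profit is fed back
          into capital, which raises the next period's profit."""
          history = []
          for _ in range(periods):
              profit = margin * capital            # output of the system
              capital += reinvest_rate * profit    # part of the output fed back as input
              history.append((round(capital, 2), round(profit, 2)))
          return history

      # Illustrative numbers: starting capital 1000, 20% profit margin.
      for capital, profit in reinvestment_loop(1000.0):
          print(f"capital={capital:10.2f}  profit={profit:8.2f}")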

  4. Simulation Model Development for Mail Screening Process

    National Research Council Canada - National Science Library

    Vargo, Trish; Marvin, Freeman; Kooistra, Scott

    2005-01-01

    STUDY OBJECTIVE: Provide decision analysis support to the Homeland Defense Business Unit, Special Projects Team, in developing a simulation model to help determine the most effective way to eliminate backlog...

  5. SEIR model simulation for Hepatitis B

    Science.gov (United States)

    Side, Syafruddin; Irwan, Mulbar, Usman; Sanusi, Wahidah

    2017-09-01

    Mathematical modelling and simulation of Hepatitis B are discussed in this paper. The population is divided into four compartments, namely Susceptible, Exposed, Infected and Recovered (SEIR). Several factors affect the population in this model: vaccination, immigration and emigration. The SEIR model yields a four-dimensional system of non-linear ordinary differential equations (ODEs), which is then reduced to three dimensions. The SEIR model simulation is undertaken to predict the number of Hepatitis B cases. The simulation results indicate that the number of Hepatitis B cases will increase and then decrease over several months. The simulation using case numbers from Makassar also found a basic reproduction number of less than one, which means that Makassar city is not an endemic area for Hepatitis B.
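
    A minimal SEIR integration is sketched below; the rate constants are illustrative, and the vaccination, immigration and emigration terms of the paper's extended model are omitted.

      def seir_simulate(beta=0.5, sigma=0.2, gamma=0.1, days=180, dt=0.1,
                        s0=0.99, e0=0.01, i0=0.0, r0=0.0):
          """Forward-Euler integration of the basic SEIR equations:
          dS/dt = -beta*S*I, dE/dt = beta*S*I - sigma*E,
          dI/dt = sigma*E - gamma*I, dR/dt = gamma*I."""
          s, e, i, r = s0, e0, i0, r0
          trajectory = [(0.0, s, e, i, r)]
          steps = int(round(days / dt))
          record_every = int(round(10.0 / dt))      # record every 10 days
          for n in range(1, steps + 1):
              ds = -beta * s * i
              de = beta * s * i - sigma * e
              di = sigma * e - gamma * i
              dr = gamma * i
              s, e, i, r = s + dt * ds, e + dt * de, i + dt * di, r + dt * dr
              if n % record_every == 0:
                  trajectory.append((n * dt, s, e, i, r))
          return trajectory

      for t, s, e, i, r in seir_simulate():
          print(f"day {t:5.1f}: S={s:.3f} E={e:.3f} I={i:.3f} R={r:.3f}")

    With these illustrative rates the ratio beta/gamma is well above one, so the simulated outbreak grows and then declines; the Makassar result quoted in the abstract corresponds to the opposite case, where the basic reproduction number is below one.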

  6. Simulation data mapping in virtual cardiac model.

    Science.gov (United States)

    Jiquan, Liu; Jingyi, Feng; Duan, Huilong; Siping, Chen

    2004-01-01

    Although the 3D heart and torso models with realistic geometry are the basis of the simulation computations in the LFX virtual cardiac model, the simulation results are mostly output in 2D format. To solve this problem and enhance the virtual reality of the LFX virtual cardiac model, methods of voxel mapping and vertex projection mapping are presented. With these methods, the excitation isochrone map (EIM) was mapped from the heart model with realistic geometry to the real visible man heart model, and the body surface potential map (BSPM) was mapped from the torso model with realistic geometry to the real visible man body surface. By visualizing in 4Dview, a real-time 3D medical image visualization platform, the visualization results of the EIM and BSPM simulation data before and after mapping are also provided. According to the visualization results, the output format of the EIM and BSPM simulation data of the LFX virtual cardiac model was extended from 2D to 4D (spatio-temporal) and from the cardiac model with realistic geometry to the real cardiac model, and a more realistic and effective simulation was achieved.

  7. A fire management simulation model using stochastic arrival times

    Science.gov (United States)

    Eric L. Smith

    1987-01-01

    Fire management simulation models are used to predict the impact of changes in the fire management program on fire outcomes. As with all models, the goal is to abstract reality without seriously distorting relationships between variables of interest. One important variable of fire organization performance is the length of time it takes to get suppression units to the...

  8. Fully Adaptive Radar Modeling and Simulation Development

    Science.gov (United States)

    2017-04-01

    AFRL-RY-WP-TR-2017-0074: Fully Adaptive Radar Modeling and Simulation Development. Kristine L. Bell and Anthony Kellems, Metron, Inc. Small Business Innovation Research (SBIR) Phase I report, 2017; approved for public release, distribution unlimited (see additional restrictions). Contract number FA8650-16-M-1774.

  9. Theory, modeling, and simulation annual report, 1992

    Energy Technology Data Exchange (ETDEWEB)

    1993-05-01

    This report briefly discusses research on the following topics: development of electronic structure methods; modeling molecular processes in clusters; modeling molecular processes in solution; modeling molecular processes in separations chemistry; modeling interfacial molecular processes; modeling molecular processes in the atmosphere; methods for periodic calculations on solids; chemistry and physics of minerals; graphical user interfaces for computational chemistry codes; visualization and analysis of molecular simulations; integrated computational chemistry environment; and benchmark computations.

  10. MODELING SUPPLY CHAIN PERFORMANCE VARIABLES

    Directory of Open Access Journals (Sweden)

    Ashish Agarwal

    2005-01-01

    Full Text Available In order to understand the dynamic behavior of the variables that can play a major role in performance improvement in a supply chain, a System Dynamics-based model is proposed. The model provides an effective framework for analyzing different variables affecting supply chain performance. A causal relationship among the different variables has been identified. Variables emanating from performance measures such as gaps in customer satisfaction, cost minimization, lead-time reduction, service level improvement and quality improvement have been identified as goal-seeking loops. The proposed System Dynamics-based model analyzes the effect of the dynamic behavior of the variables over a period of 10 years on the performance of a case supply chain in the automotive business.

  11. Modeling of magnetic particle suspensions for simulations

    CERN Document Server

    Satoh, Akira

    2017-01-01

    The main objective of the book is to highlight the modeling of magnetic particles with different shapes and magnetic properties, to provide graduate students and young researchers information on the theoretical aspects and actual techniques for the treatment of magnetic particles in particle-based simulations. In simulation, we focus on the Monte Carlo, molecular dynamics, Brownian dynamics, lattice Boltzmann and stochastic rotation dynamics (multi-particle collision dynamics) methods. The latter two simulation methods can simulate both the particle motion and the ambient flow field simultaneously. In general, specialized knowledge can only be obtained in an effective manner under the supervision of an expert. The present book is written to play such a role for readers who wish to develop the skill of modeling magnetic particles and develop a computer simulation program using their own ability. This book is therefore a self-learning book for graduate students and young researchers. Armed with this knowledge,...

  12. Biofilm carrier migration model describes reactor performance.

    Science.gov (United States)

    Boltz, Joshua P; Johnson, Bruce R; Takács, Imre; Daigger, Glen T; Morgenroth, Eberhard; Brockmann, Doris; Kovács, Róbert; Calhoun, Jason M; Choubert, Jean-Marc; Derlon, Nicolas

    2017-06-01

    The accuracy of a biofilm reactor model depends on the extent to which physical system conditions (particularly bulk-liquid hydrodynamics and their influence on biofilm dynamics) deviate from the ideal conditions upon which the model is based. It follows that an improved capacity to model a biofilm reactor does not necessarily rely on an improved biofilm model, but does rely on an improved mathematical description of the biofilm reactor and its components. Existing biofilm reactor models typically include a one-dimensional biofilm model, a process (biokinetic and stoichiometric) model, and a continuous flow stirred tank reactor (CFSTR) mass balance that [when organizing CFSTRs in series] creates a pseudo two-dimensional (2-D) model of bulk-liquid hydrodynamics approaching plug flow. In such a biofilm reactor model, the user-defined biofilm area is specified for each CFSTR; thereby, X carrier does not exit the boundaries of the CFSTR to which it is assigned or exchange boundaries with other CFSTRs in the series. The error introduced by this pseudo 2-D biofilm reactor modeling approach may adversely affect model results and limit model-user capacity to accurately calibrate a model. This paper presents a new sub-model that describes the migration of X carrier and associated biofilms, and evaluates the impact that X carrier migration and axial dispersion have on simulated system performance. Relevance of the new biofilm reactor model to engineering situations is discussed by applying it to known biofilm reactor types and operational conditions.
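
    The tanks-in-series idea referred to above (a pseudo two-dimensional approximation of plug flow) can be illustrated with a bare-bones soluble-substrate mass balance; this sketch deliberately leaves out the biofilm, carrier and migration terms that the paper's sub-model adds, and every parameter value is invented.

      def cstr_series(n_tanks=4, inflow_conc=100.0, flow=10.0, volume=25.0,
                      removal_rate=0.3, dt=0.01, t_end=50.0):
          """Soluble substrate in N completely mixed tanks in series:
          V*dC_i/dt = Q*(C_{i-1} - C_i) - k*C_i*V   (first-order removal)."""
          conc = [0.0] * n_tanks
          steps = int(t_end / dt)
          for _ in range(steps):
              upstream = inflow_conc
              new_conc = []
              for c in conc:
                  dcdt = flow / volume * (upstream - c) - removal_rate * c
                  new_conc.append(c + dt * dcdt)
                  upstream = c                 # old value feeds the next tank
              conc = new_conc
          return conc

      # Near-steady-state concentration profile along the reactor, which
      # approximates plug flow as the number of tanks increases.
      print([round(c, 2) for c in cstr_series()])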

  13. Challenges for Modeling and Simulation

    National Research Council Canada - National Science Library

    Johnson, James

    2002-01-01

    This document deals with modeling and simulation. Its strengths are the ability to study processes that rarely or never occur, evaluate a wide range of alternatives, and generate new ideas, new concepts and innovative solutions...

  14. Modelling and Simulation of Wave Loads

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Thoft-Christensen, Palle

    1985-01-01

    A simple model of the wave load on slender members of offshore structures is described. The wave elevation of the sea state is modelled by a stationary Gaussian process. A new procedure to simulate realizations of the wave loads is developed. The simulation method assumes that the wave particle velocity can be approximated by a Gaussian Markov process. Known approximate results for the first-passage density or, equivalently, the distribution of the extremes of wave loads are presented and compared with rather precise simulation results. It is demonstrated that the approximate results...

  15. Modelling and Simulation of Wave Loads

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Thoft-Christensen, Palle

    A simple model of the wave load on slender members of offshore structures is described. The wave elevation of the sea state is modelled by a stationary Gaussian process. A new procedure to simulate realizations of the wave loads is developed. The simulation method assumes that the wave particle velocity can be approximated by a Gaussian Markov process. Known approximate results for the first-passage density or, equivalently, the distribution of the extremes of wave loads are presented and compared with rather precise simulation results. It is demonstrated that the approximate results...

  16. Modelling and Simulation of Wave Loads

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Thoft-Christensen, Palle

    1985-01-01

    A simple model of the wave load on slender members of offshore structures is described. The wave elevation of the sea state is modelled by a stationary Gaussian process. A new procedure to simulate realizations of the wave loads is developed. The simulation method assumes that the wave particle velocity can be approximated by a Gaussian Markov process. Known approximate results for the first-passage density or, equivalently, the distribution of the extremes of wave loads are presented and compared with rather precise simulation results. It is demonstrated that the approximate results...

  17. Modeling and simulation of discrete event systems

    CERN Document Server

    Choi, Byoung Kyu

    2013-01-01

    Computer modeling and simulation (M&S) allows engineers to study and analyze complex systems. Discrete-event system (DES)-M&S is used in modern management, industrial engineering, computer science, and the military. As computer speeds and memory capacity increase, so DES-M&S tools become more powerful and more widely used in solving real-life problems. Based on over 20 years of evolution within a classroom environment, as well as on decades-long experience in developing simulation-based solutions for high-tech industries, Modeling and Simulation of Discrete-Event Systems is the only book on

  18. Minimum-complexity helicopter simulation math model

    Science.gov (United States)

    Heffley, Robert K.; Mnich, Marc A.

    1988-01-01

    An example of a minimal-complexity simulation helicopter math model is presented. Motivating factors are the computational delays, cost, and inflexibility of the very sophisticated math models now in common use. A helicopter model form is given which addresses each of these factors and provides better engineering understanding of the specific handling qualities features which are apparent to the simulator pilot. The technical approach begins with specification of features which are to be modeled, followed by a build-up of individual vehicle components and definition of equations. Model matching and estimation procedures are given which enable the modeling of specific helicopters from basic data sources such as flight manuals. Checkout procedures are given which provide for total model validation. A number of possible model extensions and refinements are discussed. Math model computer programs are defined and listed.

  19. A simulation model for football championships

    OpenAIRE

    Koning, Ruud H.; Koolhaas, Michael; Renes, Gusta

    2001-01-01

    In this paper we discuss a simulation/probability model that identifies the team that is most likely to win a tournament. The model can also be used to answer other questions like ‘which team had a lucky draw?’ or ‘what is the probability that two teams meet at some moment in the tournament?’. Input to the simulation/probability model are scoring intensities, which are estimated as a weighted average of goals scored. The model has been used in practice to write articles for the popular press, ...
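
    The abstract does not spell out the model's equations, but a common way to implement such a simulation is to draw match scores from Poisson distributions whose means are the estimated scoring intensities; the sketch below follows that convention with made-up intensities and is not the authors' code.

      import math
      import random

      def poisson(lam, rng):
          """Draw a Poisson-distributed goal count (Knuth's algorithm)."""
          limit = math.exp(-lam)
          k, p = 0, 1.0
          while True:
              p *= rng.random()
              if p <= limit:
                  return k
              k += 1

      def win_probability(intensity_a, intensity_b, n_runs=20000, seed=1):
          """Monte Carlo estimate of team A beating team B in a single match
          (draws are split evenly for simplicity)."""
          rng = random.Random(seed)
          wins = 0.0
          for _ in range(n_runs):
              goals_a = poisson(intensity_a, rng)
              goals_b = poisson(intensity_b, rng)
              wins += 1.0 if goals_a > goals_b else 0.5 if goals_a == goals_b else 0.0
          return wins / n_runs

      # Hypothetical scoring intensities (expected goals per match).
      print(win_probability(1.8, 1.1))

    Repeating such match simulations over a given draw structure then yields tournament-level quantities such as the probability that two particular teams meet.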

  20. Performance Test of Core Protection and Monitoring Algorithm with DLL for SMART Simulator Implementation

    International Nuclear Information System (INIS)

    Koo, Bonseung; Hwang, Daehyun; Kim, Keungkoo

    2014-01-01

    A multi-purpose best-estimate simulator for SMART is being established, which is intended to be used as a tool to evaluate the impacts of design changes on safety performance, and to improve and/or optimize the operating procedure of SMART. In keeping with these intentions, a real-time model of the digital core protection and monitoring systems was developed and the real-time performance of the models was verified for various simulation scenarios. In this paper, a performance test of the core protection and monitoring algorithm with a DLL file for the SMART simulator implementation was performed. A DLL file of the simulator application code was made and several real-time evaluation tests were conducted for steady-state and transient conditions with simulated system variables. A performance test of the core protection and monitoring algorithms for the SMART simulator was performed. A DLL file of the simulator version code was made and several real-time evaluation tests were conducted for various scenarios with a DLL file and simulated system variables. The results of all test cases showed good agreement with the reference results, and some features caused by the algorithm change were properly reflected in the DLL results. Therefore, it was concluded that the SCOPS S SIM and SCOMS S SIM algorithms and calculational capabilities are appropriate for the core protection and monitoring program in the SMART simulator

  1. Performance Evaluation Model for Application Layer Firewalls.

    Directory of Open Access Journals (Sweden)

    Shichang Xuan

    Full Text Available Application layer firewalls protect the trusted area network against information security risks. However, firewall performance may affect user experience. Therefore, performance analysis plays a significant role in the evaluation of application layer firewalls. This paper presents an analytic model of the application layer firewall, based on a system analysis to evaluate the capability of the firewall. In order to enable users to improve the performance of the application layer firewall with limited resources, resource allocation was evaluated to obtain the optimal resource allocation scheme in terms of throughput, delay, and packet loss rate. The proposed model employs the Erlangian queuing model to analyze the performance parameters of the system with regard to the three layers (network, transport, and application layers. Then, the analysis results of all the layers are combined to obtain the overall system performance indicators. A discrete event simulation method was used to evaluate the proposed model. Finally, limited service desk resources were allocated to obtain the values of the performance indicators under different resource allocation scenarios in order to determine the optimal allocation scheme. Under limited resource allocation, this scheme enables users to maximize the performance of the application layer firewall.

  2. Performance Evaluation Model for Application Layer Firewalls.

    Science.gov (United States)

    Xuan, Shichang; Yang, Wu; Dong, Hui; Zhang, Jiangchuan

    2016-01-01

    Application layer firewalls protect the trusted area network against information security risks. However, firewall performance may affect user experience. Therefore, performance analysis plays a significant role in the evaluation of application layer firewalls. This paper presents an analytic model of the application layer firewall, based on a system analysis to evaluate the capability of the firewall. In order to enable users to improve the performance of the application layer firewall with limited resources, resource allocation was evaluated to obtain the optimal resource allocation scheme in terms of throughput, delay, and packet loss rate. The proposed model employs the Erlangian queuing model to analyze the performance parameters of the system with regard to the three layers (network, transport, and application layers). Then, the analysis results of all the layers are combined to obtain the overall system performance indicators. A discrete event simulation method was used to evaluate the proposed model. Finally, limited service desk resources were allocated to obtain the values of the performance indicators under different resource allocation scenarios in order to determine the optimal allocation scheme. Under limited resource allocation, this scheme enables users to maximize the performance of the application layer firewall.
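
    The Erlang-type queueing analysis used in the model can be illustrated for a single layer with the standard M/M/c formulas; the sketch below computes the Erlang-C waiting probability and mean waiting time for a hypothetical service-desk allocation and is not the authors' three-layer model.

      import math

      def erlang_c(arrival_rate, service_rate, servers):
          """Erlang-C probability that an arriving request must wait in an
          M/M/c queue, plus the mean waiting time in the queue."""
          a = arrival_rate / service_rate           # offered load (Erlangs)
          rho = a / servers                         # utilisation, must be < 1
          if rho >= 1.0:
              raise ValueError("queue is unstable (utilisation >= 1)")
          summation = sum(a ** k / math.factorial(k) for k in range(servers))
          top = a ** servers / (math.factorial(servers) * (1.0 - rho))
          p_wait = top / (summation + top)
          mean_wait = p_wait / (servers * service_rate - arrival_rate)
          return p_wait, mean_wait

      # Hypothetical application layer: 80 requests/s arriving, each service
      # desk handling 30 requests/s, 4 desks allocated.
      print(erlang_c(80.0, 30.0, 4))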

  3. Air Conditioner Compressor Performance Model

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Ning; Xie, YuLong; Huang, Zhenyu

    2008-09-05

    During the past three years, the Western Electricity Coordinating Council (WECC) Load Modeling Task Force (LMTF) has led the effort to develop the new modeling approach. As part of this effort, the Bonneville Power Administration (BPA), Southern California Edison (SCE), and Electric Power Research Institute (EPRI) Solutions tested 27 residential air-conditioning units to assess their response to delayed voltage recovery transients. After completing these tests, different modeling approaches were proposed, among them a performance modeling approach that proved to be one of the three favored for its simplicity and ability to recreate different SVR events satisfactorily. Funded by the California Energy Commission (CEC) under its load modeling project, researchers at Pacific Northwest National Laboratory (PNNL) led the follow-on task to analyze the motor testing data to derive the parameters needed to develop a performance model for the single-phase air-conditioning (SPAC) unit. To derive the performance model, PNNL researchers first used the motor voltage and frequency ramping test data to obtain the real (P) and reactive (Q) power versus voltage (V) and frequency (f) curves. Then, curve fitting was used to develop the P-V, Q-V, P-f, and Q-f relationships for the motor running and stalling states. The resulting performance model ignores the dynamic response of the air-conditioning motor. Because the inertia of the air-conditioning motor is very small (H<0.05), the motor moves from one steady state to another in a few cycles. So, the performance model is a fair representation of the motor behaviors in both running and stalling states.
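
    The curve-fitting step described above can be reproduced in outline with an ordinary least-squares polynomial fit; the voltage and power samples below are invented placeholders rather than the PNNL test data, and the quadratic form is assumed purely for illustration.

      import numpy as np

      # Hypothetical measured points from a voltage ramp test (running state).
      voltage = np.array([0.80, 0.85, 0.90, 0.95, 1.00, 1.05])      # per unit
      real_power = np.array([0.90, 0.93, 0.96, 0.98, 1.00, 1.03])   # per unit

      # Quadratic P-V relationship fitted by least squares.
      coeffs = np.polyfit(voltage, real_power, deg=2)
      p_of_v = np.poly1d(coeffs)

      print("fitted coefficients:", np.round(coeffs, 3))
      print("predicted P at V = 0.92 pu:", round(float(p_of_v(0.92)), 3))

    The same procedure, repeated for Q-V, P-f and Q-f data and for the stalled state, gives the full set of static curves that such a performance model interpolates at run time.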

  4. Inventory of Simulation and Modeling for the Analysis of Ground Manoeuvre Performance (inventarisatie van vragen en modellen voor de analyse van het grondgebonden optreden)

    Science.gov (United States)

    2006-04-01

    2007. For staff training at battalion and brigade level, the GUPPIS model is used in Germany. This model is comparable to KIBOWI, in which the...

  5. Predictive Capability Maturity Model for computational modeling and simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy; Pilch, Martin M.

    2007-10-01

    The Predictive Capability Maturity Model (PCMM) is a new model that can be used to assess the level of maturity of computational modeling and simulation (M&S) efforts. The development of the model is based on both the authors' experience and their analysis of similar investigations in the past. The perspective taken in this report is one of judging the usefulness of a predictive capability that relies on the numerical solution to partial differential equations to better inform and improve decision making. The review of past investigations, such as the Software Engineering Institute's Capability Maturity Model Integration and the National Aeronautics and Space Administration and Department of Defense Technology Readiness Levels, indicates that a more restricted, more interpretable method is needed to assess the maturity of an M&S effort. The PCMM addresses six contributing elements to M&S: (1) representation and geometric fidelity, (2) physics and material model fidelity, (3) code verification, (4) solution verification, (5) model validation, and (6) uncertainty quantification and sensitivity analysis. For each of these elements, attributes are identified that characterize four increasing levels of maturity. Importantly, the PCMM is a structured method for assessing the maturity of an M&S effort that is directed toward an engineering application of interest. The PCMM does not assess whether the M&S effort, the accuracy of the predictions, or the performance of the engineering system satisfies or does not satisfy specified application requirements.

  6. High performance computer code for molecular dynamics simulations

    International Nuclear Information System (INIS)

    Levay, I.; Toekesi, K.

    2007-01-01

    Complete text of publication follows. Molecular Dynamics (MD) simulation is a widely used technique for modeling complicated physical phenomena. Since 2005 we have been developing an MD simulation code for PC computers. The computer code is written in the C++ object-oriented programming language. The aim of our work is twofold: a) to develop a fast computer code for the study of the random walk of guest atoms in a Be crystal, b) 3-dimensional (3D) visualization of the particles' motion. In this case we mimic the motion of the guest atoms in the crystal (diffusion-type motion), and the motion of atoms in the crystal lattice (crystal deformation). Nowadays, it is common to use graphics devices for computationally intensive problems. There are several ways to use this extreme processing performance, but programming these devices has never been as easy as it is now. The CUDA (Compute Unified Device Architecture), introduced by the nVidia Corporation in 2007, is very useful for every processor-hungry application. A unified-architecture GPU includes 96-128 or more stream processors, so the raw calculation performance is 576(!) GFLOPS. It is ten times faster than the fastest dual-core CPU [Fig. 1]. Our improved MD simulation software uses this new technology, which speeds it up so that the code runs 10 times faster in the critical calculation segment. Although the GPU is a very powerful tool, it has a strongly parallel structure. This means that we have to create an algorithm that works on several processors without deadlock. Our code currently uses 256 threads and shared and constant on-chip memory instead of global memory, which is about 100 times slower. It is possible to implement the entire algorithm on the GPU, so we do not need to download and upload the data in every iteration. For maximal throughput, every thread runs with the same instructions

  7. Simulation

    DEFF Research Database (Denmark)

    Gould, Derek A; Chalmers, Nicholas; Johnson, Sheena J

    2012-01-01

    Recognition of the many limitations of traditional apprenticeship training is driving new approaches to learning medical procedural skills. Among simulation technologies and methods available today, computer-based systems are topical and bring the benefits of automated, repeatable, and reliable...... performance assessments. Human factors research is central to simulator model development that is relevant to real-world imaging-guided interventional tasks and to the credentialing programs in which it would be used....

  8. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    Modelling Deterministic Systems. N K Srinivasan graduated from the Indian Institute of Science and obtained his Doctorate from Columbia University, New York. He has taught in several universities, and later did system analysis, wargaming and simulation for defence. His other areas of interest are reliability engineering...

  9. Long-term simulation of temporal change of soil organic carbon in Denmark: comparison of three model performances under climate change

    DEFF Research Database (Denmark)

    Öztürk, Isik; Sharif, Behzad; Santhome, Sanmohan

    2018-01-01

    The temporal change in soil organic carbon (SOC) was analysed over an 80-year period based on climate change predictions of four regional circulation models under the International Panel on Climate Change (IPCC) A1B emission scenario in the 21st century. A 20-year (1991–2010) set of observed...... strategies. The results also suggested significant interactive effect of N input rate and climate variables on soil C and denitrification in response to climate change. The uncertainty was addressed by including the crop-soil models in a mixed-effect analysis so that the contribution of the models...... to the total variance of random variation was quantified. Statistical analysis showed that the crop-soil models are the main source for uncertainty in analysing soil C and N responses to climate change....

  10. A satellite simulator for TRMM PR applied to climate model simulations

    Science.gov (United States)

    Spangehl, T.; Schroeder, M.; Bodas-Salcedo, A.; Hollmann, R.; Riley Dellaripa, E. M.; Schumacher, C.

    2017-12-01

    Climate model simulations have to be compared against observation based datasets in order to assess their skill in representing precipitation characteristics. Here we use a satellite simulator for TRMM PR in order to evaluate simulations performed with MPI-ESM (Earth system model of the Max Planck Institute for Meteorology in Hamburg, Germany) performed within the MiKlip project (https://www.fona-miklip.de/, funded by Federal Ministry of Education and Research in Germany). While classical evaluation methods focus on geophysical parameters such as precipitation amounts, the application of the satellite simulator enables an evaluation in the instrument's parameter space thereby reducing uncertainties on the reference side. The CFMIP Observation Simulator Package (COSP) provides a framework for the application of satellite simulators to climate model simulations. The approach requires the introduction of sub-grid cloud and precipitation variability. Radar reflectivities are obtained by applying Mie theory, with the microphysical assumptions being chosen to match the atmosphere component of MPI-ESM (ECHAM6). The results are found to be sensitive to the methods used to distribute the convective precipitation over the sub-grid boxes. Simple parameterization methods are used to introduce sub-grid variability of convective clouds and precipitation. In order to constrain uncertainties a comprehensive comparison with sub-grid scale convective precipitation variability which is deduced from TRMM PR observations is carried out.

  11. Modelling and evaluation of surgical performance using hidden Markov models.

    Science.gov (United States)

    Megali, Giuseppe; Sinigaglia, Stefano; Tonet, Oliver; Dario, Paolo

    2006-10-01

    Minimally invasive surgery has become very widespread in the last ten years. Since surgeons experience difficulties in learning and mastering minimally invasive techniques, the development of training methods is of great importance. While the introduction of virtual reality-based simulators has introduced a new paradigm in surgical training, skill evaluation methods are far from being objective. This paper proposes a method for defining a model of surgical expertise and an objective metric to evaluate performance in laparoscopic surgery. Our approach is based on the processing of kinematic data describing movements of surgical instruments. We use hidden Markov model theory to define an expert model that describes expert surgical gesture. The model is trained on kinematic data related to exercises performed on a surgical simulator by experienced surgeons. Subsequently, we use this expert model as a reference model in the definition of an objective metric to evaluate performance of surgeons with different abilities. Preliminary results show that, using different topologies for the expert model, the method can be efficiently used both for the discrimination between experienced and novice surgeons, and for the quantitative assessment of surgical ability.
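
    Scoring a gesture against an expert model amounts to evaluating the likelihood of an observation sequence under a trained HMM; the sketch below implements the standard forward algorithm for a small discrete HMM with made-up parameters, in place of the continuous kinematic models used in the paper.

      import numpy as np

      def forward_log_likelihood(obs, start_p, trans_p, emit_p):
          """Scaled forward algorithm: log P(observations | HMM) for a discrete HMM.
          obs     : sequence of observation symbol indices
          start_p : (n_states,) initial state distribution
          trans_p : (n_states, n_states) transition matrix
          emit_p  : (n_states, n_symbols) emission matrix"""
          alpha = start_p * emit_p[:, obs[0]]
          scale = alpha.sum()
          log_lik = np.log(scale)
          alpha = alpha / scale
          for o in obs[1:]:
              alpha = (alpha @ trans_p) * emit_p[:, o]
              scale = alpha.sum()              # rescale to avoid numerical underflow
              log_lik += np.log(scale)
              alpha = alpha / scale
          return log_lik

      # Hypothetical 2-state "expert" model over 3 quantised motion symbols.
      start = np.array([0.7, 0.3])
      trans = np.array([[0.8, 0.2],
                        [0.3, 0.7]])
      emit = np.array([[0.6, 0.3, 0.1],
                       [0.1, 0.3, 0.6]])
      gesture = [0, 0, 1, 2, 2, 1]             # quantised instrument motions
      print(forward_log_likelihood(gesture, start, trans, emit))

    A markedly lower log-likelihood under the expert model would then flag a gesture as less expert-like, which is the essence of the metric described above.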

  12. Optimization Model for Web Based Multimodal Interactive Simulations.

    Science.gov (United States)

    Halic, Tansel; Ahn, Woojin; De, Suvranu

    2015-07-15

    This paper presents a technique for optimizing the performance of web-based multimodal interactive simulations. For such applications, where visual quality and the performance of simulations directly influence user experience, overloading of hardware resources may result in an unsatisfactory reduction in the quality of the simulation and user satisfaction. However, optimization of simulation performance on individual hardware platforms is not practical. Hence, we present a mixed integer programming model to optimize graphical rendering and simulation performance while satisfying application-specific constraints. Our approach includes three distinct phases: identification, optimization and update. In the identification phase, the computing and rendering capabilities of the client device are evaluated using an exploratory proxy code. This data is utilized in conjunction with user-specified design requirements in the optimization phase to ensure the best possible computational resource allocation. The optimum solution is used for rendering (e.g. texture size, canvas resolution) and simulation parameters (e.g. simulation domain) in the update phase. Test results are presented on multiple hardware platforms with diverse computing and graphics capabilities to demonstrate the effectiveness of our approach.

  13. Aircraft vulnerability analysis by modeling and simulation

    Science.gov (United States)

    Willers, Cornelius J.; Willers, Maria S.; de Waal, Alta

    2014-10-01

    guidance acceleration and seeker sensitivity. For the purpose of this investigation the aircraft is equipped with conventional pyrotechnic decoy flares and the missile has no counter-countermeasure means (security restrictions on open publication). This complete simulation is used to calculate the missile miss distance when the missile is launched from different locations around the aircraft. The miss distance data is then graphically presented showing miss distance (aircraft vulnerability) as a function of launch direction and range. The aircraft vulnerability graph accounts for aircraft and missile characteristics, but does not account for missile deployment doctrine. A Bayesian network is constructed to fuse the doctrinal rules with the aircraft vulnerability data. The Bayesian network now provides the capability to evaluate the combined risk of missile launch and aircraft vulnerability. It is shown in this paper that it is indeed possible to predict the aircraft's vulnerability to missile attack through a comprehensive and holistic modelling process. By using the appropriate real-world models, this approach is used to evaluate the effectiveness of specific countermeasure techniques against specific missile threats. The use of a Bayesian network provides the means to fuse simulated performance data with more abstract doctrinal rules to provide a realistic assessment of the aircraft vulnerability.

  14. Alcohol consumption for simulated driving performance: A systematic review

    Directory of Open Access Journals (Sweden)

    Mohammad Saeid Rezaee-Zavareh

    2017-06-01

    Conclusion: Alcohol consumption may decrease simulated driving performance in people who have consumed alcohol compared with those who have not, via changes in SDSD, LPSD, speed, MLPD, LC and NA. More well-designed randomized controlled clinical trials are recommended.

  15. 20th Joint Workshop on Sustained Simulation Performance

    CERN Document Server

    Bez, Wolfgang; Focht, Erich; Patel, Nisarg; Kobayashi, Hiroaki

    2016-01-01

    The book presents the state of the art in high-performance computing and simulation on modern supercomputer architectures. It explores general trends in hardware and software development, and then focuses specifically on the future of high-performance systems and heterogeneous architectures. It also covers applications such as computational fluid dynamics, material science, medical applications and climate research and discusses innovative fields like coupled multi-physics or multi-scale simulations. The papers included were selected from the presentations given at the 20th Workshop on Sustained Simulation Performance at the HLRS, University of Stuttgart, Germany in December 2015, and the subsequent Workshop on Sustained Simulation Performance at Tohoku University in February 2016.

  16. Representing Context in Simulator-based Human Performance Measurement

    National Research Council Canada - National Science Library

    Stacy, Webb; Merket, Danielle C; Puglisi, Matt; Haimson, Craig

    2006-01-01

    .... How would we measure FO performance in simulator-based training for this scenario? It's not enough simply to take obvious measurements like target location error or target/ammunition combination...

  17. High temperature and performance in a flight task simulator.

    Science.gov (United States)

    1972-05-01

    The effects of high cockpit temperature on physiological responses and performance were determined on pilots in a general aviation simulator. The pilots (all instrument rated) 'flew' an instrument flight while exposed to each of three cockpit tempera...

  18. Calibration of microscopic traffic simulation models using metaheuristic algorithms

    Directory of Open Access Journals (Sweden)

    Miao Yu

    2017-06-01

    Full Text Available This paper presents several metaheuristic algorithms to calibrate a microscopic traffic simulation model. The genetic algorithm (GA), Tabu Search (TS), and a combination of the GA and TS (i.e., warmed GA and warmed TS) are implemented and compared. A set of traffic data collected from the I-5 Freeway, Los Angeles, California, is used. Objective functions, built on flow and speed, are defined to minimize the difference between simulated and field traffic data. Several car-following parameters in VISSIM, which can significantly affect the simulation outputs, are selected for calibration. A better match to the field measurements is reached with the GA, TS, and warmed GA and TS than with only the default parameters in VISSIM. Overall, TS performs very well and can be used to calibrate parameters. Combining metaheuristic algorithms clearly performs better and is therefore highly recommended for calibrating microscopic traffic simulation models.
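
    A generic GA calibration loop of the kind compared in the paper can be sketched as follows; the "simulator" here is a stand-in error function, since driving VISSIM and the I-5 field data are outside the scope of this snippet, and all bounds and rates are illustrative.

      import random

      def surrogate_simulator(params):
          """Stand-in for a traffic simulator run: returns the error between
          simulated and field flow/speed for a parameter vector. A real
          calibration would launch the simulator here and compare its outputs."""
          target = [1.5, 2.0, 0.8]               # pretend 'true' car-following values
          return sum((p - t) ** 2 for p, t in zip(params, target))

      def calibrate_ga(bounds, pop_size=30, generations=40, seed=42):
          rng = random.Random(seed)
          pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
          for _ in range(generations):
              pop.sort(key=surrogate_simulator)            # lower error is better
              parents = pop[: pop_size // 2]               # selection
              children = []
              while len(children) < pop_size - len(parents):
                  a, b = rng.sample(parents, 2)
                  child = [(x + y) / 2 for x, y in zip(a, b)]   # crossover
                  k = rng.randrange(len(child))                 # mutation
                  lo, hi = bounds[k]
                  child[k] = min(hi, max(lo, child[k] + rng.gauss(0.0, 0.1)))
                  children.append(child)
              pop = parents + children
          return min(pop, key=surrogate_simulator)

      bounds = [(0.5, 3.0), (0.5, 3.0), (0.1, 2.0)]        # car-following parameter ranges
      print([round(p, 3) for p in calibrate_ga(bounds)])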

  19. submitter Simulation-Based Performance Analysis of the ALICE Mass Storage System

    CERN Document Server

    Vickovic, L; Celar, S

    2016-01-01

    CERN, the European Organization for Nuclear Research, is today, in the era of big data, one of the biggest data generators in the world. Especially interesting is the transient data storage system in the ALICE experiment. With the goal of optimizing its performance, this paper discusses a dynamic, discrete-event simulation model of a disk-based Storage Area Network (SAN) and its usage for performance analyses. The storage system model is based on a modular, bottom-up approach, and the differences between measured and simulated values vary between 1.5 % and 4 % depending on the simulated component. Once finished, the simulation model was used for detailed performance analyses. Among other findings, it showed that system performance can be seriously affected if the array stripe size is larger than the size of the cache on the individual disks in the array, which so far has been completely ignored in the literature.

  20. 24th & 25th Joint Workshop on Sustained Simulation Performance

    CERN Document Server

    Bez, Wolfgang; Focht, Erich; Gienger, Michael; Kobayashi, Hiroaki

    2017-01-01

    This book presents the state of the art in High Performance Computing on modern supercomputer architectures. It addresses trends in hardware and software development in general, as well as the future of High Performance Computing systems and heterogeneous architectures. The contributions cover a broad range of topics, from improved system management to Computational Fluid Dynamics, High Performance Data Analytics, and novel mathematical approaches for large-scale systems. In addition, they explore innovative fields like coupled multi-physics and multi-scale simulations. All contributions are based on selected papers presented at the 24th Workshop on Sustained Simulation Performance, held at the University of Stuttgart’s High Performance Computing Center in Stuttgart, Germany in December 2016 and the subsequent Workshop on Sustained Simulation Performance, held at the Cyberscience Center, Tohoku University, Japan in March 2017.

  1. Advances in Electrochemical Models for Predicting the Cycling Performance of Traction Batteries: Experimental Study on Ni-MH and Simulation

    Directory of Open Access Journals (Sweden)

    Bernard J.

    2009-11-01

    Full Text Available Rigorous electrochemical models to simulate the cycling performance of batteries have been successfully developed and reported in the literature. They constitute a very promising approach for State-of-Charge (SoC) estimation based on the physics of the cell, compared with other methods, since SoC is an internal parameter of these physical models. However, the computational time needed to solve electrochemical battery models for online applications requires the development of a simplified physics-based battery model. In this work, our goal is to present and validate an advanced 0D electrochemical model of a Ni-MH cell, as an example. This lumped-parameter model will be used to design an extended Kalman filter to predict the SoC of a Ni-MH pack. The model is presented, followed by an extensive experimental study conducted on Ni-MH cells to better understand the mechanisms of the physico-chemical phenomena occurring at both electrodes and to support the model development. The last part of the paper focuses on the evaluation of the model with regard to experimental results obtained on Ni-MH sealed cells but also on the related commercial HEV battery pack.
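
    As a rough illustration of coupling a cell model with Kalman filtering for SoC estimation, the sketch below runs a scalar Kalman filter on a coulomb-counting state equation with a linearised open-circuit-voltage measurement; this is far simpler than the 0D electrochemical model and extended Kalman filter described in the article, and every number is hypothetical.

      def soc_kalman(currents, voltages, capacity_as=3600.0 * 6.0, dt=1.0,
                     ocv_slope=0.7, ocv_offset=3.5, q=1e-7, r=1e-3):
          """Scalar Kalman filter for State of Charge (SoC in [0, 1]).
          Process model:     soc_k = soc_{k-1} - I*dt/capacity   (discharge positive)
          Measurement model: V ~= ocv_offset + ocv_slope*soc     (linearised OCV)"""
          soc, p = 0.5, 0.1                      # initial guess and its variance
          estimates = []
          for i, v in zip(currents, voltages):
              soc -= i * dt / capacity_as        # predict via coulomb counting
              p += q
              innovation = v - (ocv_offset + ocv_slope * soc)
              s = ocv_slope * p * ocv_slope + r
              k = p * ocv_slope / s              # Kalman gain
              soc += k * innovation              # correct with the voltage reading
              p *= (1.0 - k * ocv_slope)
              estimates.append(soc)
          return estimates

      # Hypothetical 1 A discharge with noisy terminal-voltage readings.
      currents = [1.0] * 5
      voltages = [4.06, 4.05, 4.05, 4.04, 4.04]
      print([round(s, 4) for s in soc_kalman(currents, voltages)])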

  2. Contribution to the Development of Simulation Model of Ship Turbine

    Directory of Open Access Journals (Sweden)

    Božić Ratko

    2015-01-01

    Full Text Available Simulation modelling, performed with the System Dynamics modelling approach and intensive use of computers, is one of the most convenient and most successful scientific methods for analyzing the performance dynamics of nonlinear and very complex natural, technical and organizational systems [1]. The purpose of this work is to demonstrate the successful application of System Dynamics simulation modelling in analyzing the performance dynamics of a complex ship propulsion system. A gas turbine is a complex non-linear system that needs to be systematically investigated as a unit consisting of a number of subsystems and elements, which are linked by cause-effect (UPV) feedback loops (KPD), both within the propulsion system and with the relevant surroundings. In this paper the authors will present an efficient application of the qualitative and quantitative System Dynamics simulation methodology for the study of complex dynamic systems. The gas turbine will be represented by a set of non-linear differential equations, after which mental-verbal structural models and flowcharts in System Dynamics symbols will be produced, and the performance dynamics under load conditions will be simulated in the POWERSIM simulation language.

  3. Model Driven Development of Simulation Models : Defining and Transforming Conceptual Models into Simulation Models by Using Metamodels and Model Transformations

    NARCIS (Netherlands)

    Küçükkeçeci Çetinkaya, D.

    2013-01-01

    Modeling and simulation (M&S) is an effective method for analyzing and designing systems and it is of interest to scientists and engineers from all disciplines. This thesis proposes the application of a model driven software development approach throughout the whole set of M&S activities and it

  4. Proceedings of eSim 2006 : IBPSA-Canada's 4. biennial building performance simulation conference

    International Nuclear Information System (INIS)

    Kesik, T.

    2006-01-01

    This conference was attended by professionals, academics and students interested in promoting the science of building performance simulation in order to optimize design, construction, operation and maintenance of new and existing buildings around the world. This biennial conference and exhibition covered all topics related to computerized simulation of a building's energy performance and energy efficiency. Computerized simulation is widely used to predict the environmental performance of buildings during all stages of a building's life cycle, from the design, commissioning, construction, occupancy and management stages. Newly developed simulation methods for optimal comfort in new and existing buildings were evaluated. The themes of the conference were: recent developments for modelling the physical processes relevant to buildings; algorithms for modelling conventional and innovative HVAC systems; methods for modelling whole-building performance; building simulation software development; the use of building simulation tools in code compliance; moving simulation into practice; validation of building simulation software; architectural design; and optimization approaches in building design. The conference also covered the modeling of energy supply systems with reference to renewable energy sources such as ground source heat pumps or hybrid systems incorporating solar energy. The conference featured 32 presentations, of which 28 have been catalogued separately for inclusion in this database. refs., tabs., figs

  5. Simulation and modeling of turbulent flows

    CERN Document Server

    Gatski, Thomas B; Lumley, John L

    1996-01-01

    This book provides students and researchers in fluid engineering with an up-to-date overview of turbulent flow research in the areas of simulation and modeling. A key element of the book is the systematic, rational development of turbulence closure models and related aspects of modern turbulent flow theory and prediction. Starting with a review of the spectral dynamics of homogenous and inhomogeneous turbulent flows, succeeding chapters deal with numerical simulation techniques, renormalization group methods and turbulent closure modeling. Each chapter is authored by recognized leaders in their respective fields, and each provides a thorough and cohesive treatment of the subject.

  6. Dynamic modeling and simulation of wind turbines

    International Nuclear Information System (INIS)

    Ghafari Seadat, M.H.; Kheradmand Keysami, M.; Lari, H.R.

    2002-01-01

    Using wind energy to generate electricity in wind turbines is a good way of using renewable energies. It can also help to protect the environment. The main objective of this paper is dynamic modeling of a wind turbine by the energy method and computer-aided simulation. In this paper, the equations of motion are derived for simulating the wind turbine system, and the behavior of the system then becomes apparent by solving the equations. For the simulation, the turbine is considered to have a three-blade rotor facing the wind direction, an induction generator connected to the network, and constant rotational speed. Every part of the wind turbine should be simulated in order to simulate the whole turbine. The main parts are the blades, gearbox, shafts and generator

  7. Blast Load Simulator Experiments for Computational Model Validation Report 3

    Science.gov (United States)

    2017-07-01

    establish confidence in the simulation results specific to their intended use. One method for providing experimental data for computational model ... walls, to higher blast pressures required to evaluate the performance of protective construction methods. [Figure 1: ERDC Blast Load Simulator (BLS).] Instrumentation included 3 pressure gauges mounted on the steel calibration plate, 2 pressure gauges mounted in the wall of the BLS, and 25 pressure gauges

  8. Hybrid simulation models of production networks

    CERN Document Server

    Kouikoglou, Vassilis S

    2001-01-01

    This book is concerned with a most important area of industrial production, that of analysis and optimization of production lines and networks using discrete-event models and simulation. The book introduces a novel approach that combines analytic models and discrete-event simulation. Unlike conventional piece-by-piece simulation, this method observes a reduced number of events between which the evolution of the system is tracked analytically. Using this hybrid approach, several models are developed for the analysis of production lines and networks. The hybrid approach combines speed and accuracy for exceptional analysis of most practical situations. A number of optimization problems, involving buffer design, workforce planning, and production control, are solved through the use of hybrid models.

  9. The behaviour of adaptive boneremodeling simulation models

    NARCIS (Netherlands)

    Weinans, H.; Huiskes, R.; Grootenboer, H.J.

    1992-01-01

    The process of adaptive bone remodeling can be described mathematically and simulated in a computer model, integrated with the finite element method. In the model discussed here, cortical and trabecular bone are described as continuous materials with variable density. The remodeling rule applied to

  10. Analytical system dynamics modeling and simulation

    CERN Document Server

    Fabien, Brian C

    2008-01-01

    This book, offering a modeling technique based on Lagrange's energy method, includes 125 worked examples. Using this technique enables one to model and simulate systems as diverse as a six-link, closed-loop mechanism or a transistor power amplifier.

  11. A simulation model for football championships

    NARCIS (Netherlands)

    Koning, RH; Koolhaas, M; Renes, G; Ridder, G

    2003-01-01

    In this paper we discuss a simulation/probability model that identifies the team that is most likely to win a tournament. The model can also be used to answer other questions like 'which team had a lucky draw?' or 'what is the probability that two teams meet at some moment in the tournament?' Input

  12. A simulation model for football championships

    NARCIS (Netherlands)

    Koning, Ruud H.; Koolhaas, Michael; Renes, Gusta

    2001-01-01

    In this paper we discuss a simulation/probability model that identifies the team that is most likely to win a tournament. The model can also be used to answer other questions like ‘which team had a lucky draw?’ or ‘what is the probability that two teams meet at some moment in the tournament?’. Input

  13. Regularization modeling for large-eddy simulation

    NARCIS (Netherlands)

    Geurts, Bernardus J.; Holm, D.D.

    2003-01-01

    A new modeling approach for large-eddy simulation (LES) is obtained by combining a "regularization principle" with an explicit filter and its inversion. This regularization approach allows a systematic derivation of the implied subgrid model, which resolves the closure problem. The central role of

  14. Validity of microgravity simulation models on earth

    DEFF Research Database (Denmark)

    Regnard, J; Heer, M; Drummer, C

    2001-01-01

    Many studies have used water immersion and head-down bed rest as experimental models to simulate responses to microgravity. However, some data collected during space missions are at variance or in contrast with observations collected from experimental models. These discrepancies could reflect inc...

  15. HPC Performance Analysis of a Distributed Information Enterprise Simulation

    National Research Council Canada - National Science Library

    Hanna, James P; Walter, Martin J; Hillman, Robert G

    2004-01-01

    .... The analysis identified several performance limitations and bottlenecks. One critical limitation addressed and eliminated was simultaneously mixing a periodic process model with an event driven model causing rollbacks...

  16. Relating Standardized Visual Perception Measures to Simulator Visual System Performance

    Science.gov (United States)

    Kaiser, Mary K.; Sweet, Barbara T.

    2013-01-01

    Human vision is quantified through the use of standardized clinical vision measurements. These measurements typically include visual acuity (near and far), contrast sensitivity, color vision, stereopsis (a.k.a. stereo acuity), and visual field periphery. Simulator visual system performance is specified in terms such as brightness, contrast, color depth, color gamut, gamma, resolution, and field-of-view. How do these simulator performance characteristics relate to the perceptual experience of the pilot in the simulator? In this paper, visual acuity and contrast sensitivity will be related to simulator visual system resolution, contrast, and dynamic range; similarly, color vision will be related to color depth/color gamut. Finally, we will consider how some characteristics of human vision not typically included in current clinical assessments could be used to better inform simulator requirements (e.g., relating dynamic characteristics of human vision to update rate and other temporal display characteristics).
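
    As an illustration (not taken from the paper), the sketch below converts Snellen acuity into a display-resolution requirement; it assumes one pixel per resolvable feature (equivalent to Nyquist sampling of a 30 cycle/degree grating at 20/20) and an illustrative 60-degree-wide channel.

      def required_pixels_per_degree(snellen_denominator: float = 20.0) -> float:
          """Pixels per degree so one pixel subtends the smallest resolvable
          feature at a given Snellen acuity (20/x): ~1 arcminute at 20/20."""
          resolvable_arcmin = snellen_denominator / 20.0   # 1.0 arcmin at 20/20
          return 60.0 / resolvable_arcmin                  # 60 arcmin per degree

      def required_horizontal_pixels(field_of_view_deg: float,
                                     snellen_denominator: float = 20.0) -> float:
          """Horizontal pixel count for a channel with the given field of view."""
          return field_of_view_deg * required_pixels_per_degree(snellen_denominator)

      # hypothetical 60-degree-wide simulator channel, 20/20 observer
      print(required_pixels_per_degree(20.0))          # 60 px/deg
      print(required_horizontal_pixels(60.0, 20.0))    # 3600 px

    A figure derived this way is only a starting point; as the abstract notes, contrast, dynamic range and temporal characteristics also shape what the pilot actually perceives.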

  17. Benchmark simulation models, quo vadis?

    DEFF Research Database (Denmark)

    Jeppsson, U.; Alex, J; Batstone, D. J.

    2013-01-01

    As the work of the IWA Task Group on Benchmarking of Control Strategies for wastewater treatment plants (WWTPs) is coming to an end, it is essential to disseminate the knowledge gained. For this reason, all authors of the IWA Scientific and Technical Report on benchmarking have come together...... to provide their insights, highlighting areas where knowledge may still be deficient and where new opportunities are emerging, and to propose potential avenues for future development and application of the general benchmarking framework and its associated tools. The paper focuses on the topics of temporal...... and spatial extension, process modifications within the WWTP, the realism of models, control strategy extensions and the potential for new evaluation tools within the existing benchmark system. We find that there are major opportunities for application within all of these areas, either from existing work...

  18. Comments on ''Use of conditional simulation in nuclear waste site performance assessment'' by Carol Gotway

    International Nuclear Information System (INIS)

    Downing, D.J.

    1993-01-01

    This paper discusses Carol Gotway's paper, ''The Use of Conditional Simulation in Nuclear Waste Site Performance Assessment.'' The paper centers on the use of conditional simulation and the use of geostatistical methods to simulate an entire field of values for subsequent use in a complex computer model. The issues of sampling designs for geostatistics, semivariogram estimation and anisotropy, the turning bands method for random field generation, and estimation of the cumulative distribution function are brought out

  19. A queuing model for road traffic simulation

    International Nuclear Information System (INIS)

    Guerrouahane, N.; Aissani, D.; Bouallouche-Medjkoune, L.; Farhi, N.

    2015-01-01

    We present in this article a stochastic queuing model for road traffic. The model is based on the M/G/c/c state dependent queuing model, and is inspired by the deterministic Godunov scheme for road traffic simulation. We first propose a variant of the M/G/c/c state dependent model that works with density-flow fundamental diagrams rather than density-speed relationships. We then extend this model in order to consider upstream traffic demand as well as downstream traffic supply. Finally, we show how to model a whole road by concatenating road sections as in the deterministic Godunov scheme
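
    As a point of reference for the deterministic scheme the authors build on (and not the paper's stochastic M/G/c/c model), the sketch below implements a Godunov/cell-transmission update with a triangular density-flow fundamental diagram and explicit upstream demand and downstream supply; all parameter values are illustrative.

      import numpy as np

      V_FREE = 25.0    # free-flow speed (m/s), illustrative
      W_CONG = 5.0     # congestion wave speed (m/s), illustrative
      RHO_MAX = 0.2    # jam density (veh/m), illustrative
      RHO_CRIT = RHO_MAX * W_CONG / (V_FREE + W_CONG)   # critical density
      Q_MAX = V_FREE * RHO_CRIT                         # capacity (veh/s)

      def demand(rho):
          """Upstream demand: flow that wants to leave a cell."""
          return np.minimum(V_FREE * rho, Q_MAX)

      def supply(rho):
          """Downstream supply: flow that a cell can accept."""
          return np.minimum(W_CONG * (RHO_MAX - rho), Q_MAX)

      def godunov_step(rho, dx, dt, q_in, q_out_supply):
          """One Godunov (cell-transmission) update of the cell densities."""
          # interface flow = min(demand of upstream cell, supply of downstream cell)
          inter = np.minimum(demand(rho[:-1]), supply(rho[1:]))
          inflow = np.concatenate(([min(q_in, supply(rho[0]))], inter))
          outflow = np.concatenate((inter, [min(demand(rho[-1]), q_out_supply)]))
          return rho + dt / dx * (inflow - outflow)

      # usage: a road of 20 cells of 100 m, 1 s steps, moderate inflow, free exit
      rho = np.full(20, 0.02)
      for _ in range(600):
          rho = godunov_step(rho, dx=100.0, dt=1.0, q_in=0.3, q_out_supply=Q_MAX)
      print(rho.round(4))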

  20. Quantitative interface models for simulating microstructure evolution

    International Nuclear Information System (INIS)

    Zhu, J.Z.; Wang, T.; Zhou, S.H.; Liu, Z.K.; Chen, L.Q.

    2004-01-01

    To quantitatively simulate microstructural evolution in real systems, we investigated three different interface models: a sharp-interface model implemented by the software DICTRA and two diffuse-interface models which use either physical order parameters or artificial order parameters. A particular example is considered, the diffusion-controlled growth of a γ′ precipitate in a supersaturated γ matrix in Ni-Al binary alloys. All three models use the thermodynamic and kinetic parameters from the same databases. The temporal evolution profiles of composition from different models are shown to agree with each other. The focus is on examining the advantages and disadvantages of each model as applied to microstructure evolution in alloys

  1. Constrained optimization via simulation models for new product innovation

    Science.gov (United States)

    Pujowidianto, Nugroho A.

    2017-11-01

    We consider the problem of constrained optimization where the decision makers aim to optimize the primary performance measure while constraining the secondary performance measures. This paper provides a brief overview of stochastically constrained optimization via discrete event simulation. Most review papers tend to be methodology-based. This review attempts to be problem-based, as decision makers may have already decided on the problem formulation. We consider constrained optimization models because there are usually constraints on secondary performance measures as trade-offs in new product development. The paper starts by laying out the different possible methods and the reasons for using constrained optimization via simulation models. This is followed by a review of different simulation optimization approaches to constrained optimization, depending on the number of decision variables, the type of constraints, and the risk preferences of the decision makers in handling uncertainties.

  2. Metrics for evaluating performance and uncertainty of Bayesian network models

    Science.gov (United States)

    Bruce G. Marcot

    2012-01-01

    This paper presents a selected set of existing and new metrics for gauging Bayesian network model performance and uncertainty. Selected existing and new metrics are discussed for conducting model sensitivity analysis (variance reduction, entropy reduction, case file simulation); evaluating scenarios (influence analysis); depicting model complexity (numbers of model...
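
    One of the listed metrics, entropy reduction, is simply the mutual information between a target node and a finding node; the sketch below computes it from a small joint probability table under that interpretation (the table values are hypothetical, not from the paper).

      import numpy as np

      def entropy(p):
          """Shannon entropy (bits) of a discrete distribution, ignoring zero cells."""
          p = np.asarray(p, dtype=float)
          nz = p[p > 0]
          return -np.sum(nz * np.log2(nz))

      def entropy_reduction(joint):
          """Mutual information I(T;F) = H(T) - H(T|F) from a joint table P(T, F).
          Rows index target states, columns index finding states."""
          joint = np.asarray(joint, dtype=float)
          joint = joint / joint.sum()
          p_t = joint.sum(axis=1)            # marginal of the target node
          p_f = joint.sum(axis=0)            # marginal of the finding node
          h_t_given_f = sum(p_f[j] * entropy(joint[:, j] / p_f[j])
                            for j in range(len(p_f)) if p_f[j] > 0)
          return entropy(p_t) - h_t_given_f

      # usage with a hypothetical 2x2 joint table
      print(round(entropy_reduction([[0.40, 0.10],
                                     [0.15, 0.35]]), 3))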

  3. Fracture network modeling and GoldSim simulation support

    International Nuclear Information System (INIS)

    Sugita, Kenichirou; Dershowitz, W.

    2005-01-01

    During Heisei-16, Golder Associates provided support for JNC Tokai through discrete fracture network data analysis and simulation of the Mizunami Underground Research Laboratory (MIU), participation in Task 6 of the AEspoe Task Force on Modeling of Groundwater Flow and Transport, and development of methodologies for analysis of repository site characterization strategies and safety assessment. MIU support during H-16 involved updating the H-15 FracMan discrete fracture network (DFN) models for the MIU shaft region, and developing improved simulation procedures. Updates to the conceptual model included incorporation of 'Step2' (2004) versions of the deterministic structures, and revision of background fractures to be consistent with conductive structure data from the DH-2 borehole. Golder developed improved simulation procedures for these models through the use of hybrid discrete fracture network (DFN), equivalent porous medium (EPM), and nested DFN/EPM approaches. For each of these models, procedures were documented for the entire modeling process including model implementation, MMP simulation, and shaft grouting simulation. Golder supported JNC participation in Tasks 6AB, 6D and 6E of the AEspoe Task Force on Modeling of Groundwater Flow and Transport during H-16. For Task 6AB, Golder developed a new technique to evaluate the role of grout in performance assessment time-scale transport. For Task 6D, Golder submitted a report of H-15 simulations to SKB. For Task 6E, Golder carried out safety assessment time-scale simulations at the block scale, using the Laplace Transform Galerkin method. During H-16, Golder supported JNC's Total System Performance Assessment (TSPA) strategy by developing technologies for the analysis of the use of site characterization data in safety assessment. This approach will aid in the understanding of the use of site characterization to progressively reduce site characterization uncertainty. (author)

  4. Virtual reality simulation training of mastoidectomy - studies on novice performance.

    Science.gov (United States)

    Andersen, Steven Arild Wuyts

    2016-08-01

    Virtual reality (VR) simulation-based training is increasingly used in surgical technical skills training including in temporal bone surgery. The potential of VR simulation in enabling high-quality surgical training is great and VR simulation allows high-stakes and complex procedures such as mastoidectomy to be trained repeatedly, independent of patients and surgical tutors, outside traditional learning environments such as the OR or the temporal bone lab, and with fewer of the constraints of traditional training. This thesis aims to increase the evidence-base of VR simulation training of mastoidectomy and, by studying the final-product performances of novices, investigates the transfer of skills to the current gold-standard training modality of cadaveric dissection, the effect of different practice conditions and simulator-integrated tutoring on performance and retention of skills, and the role of directed, self-regulated learning. Technical skills in mastoidectomy were transferable from the VR simulation environment to cadaveric dissection with significant improvement in performance after directed, self-regulated training in the VR temporal bone simulator. Distributed practice led to a better learning outcome and more consolidated skills than massed practice and also resulted in a more consistent performance after three months of non-practice. Simulator-integrated tutoring accelerated the initial learning curve but also caused over-reliance on tutoring, which resulted in a drop in performance when the simulator-integrated tutor-function was discontinued. The learning curves were highly individual but often plateaued early and at an inadequate level, which related to issues concerning both the procedure and the VR simulator, over-reliance on the tutor function and poor self-assessment skills. Future simulator-integrated automated assessment could potentially resolve some of these issues and provide trainees with both feedback during the procedure and immediate

  5. Simulation platform to model, optimize and design wind turbines

    Energy Technology Data Exchange (ETDEWEB)

    Iov, F.; Hansen, A.D.; Soerensen, P.; Blaabjerg, F.

    2004-03-01

    farms. The performance of these models is proven and they can be directly implemented in different simulation tools. Then, the general conclusions regarding the achieved results during the project are summarized and some guidelines for future work are given. A general conclusion is that the main goals of the project have been achieved. Finally, the papers and reports published during the project are presented. (au)

  6. Models Robustness for Simulating Drainage and NO3-N Fluxes

    Science.gov (United States)

    Jabro, Jay; Jabro, Ann

    2013-04-01

    Computer models simulate and forecast appropriate agricultural practices to reduce environmental impact. The objectives of this study were to assess and compare the robustness and performance of three models - LEACHM, NCSWAP, and SOIL-SOILN - for simulating drainage and NO3-N leaching fluxes in an intense pasture system without recalibration. A 3-yr study was conducted on a Hagerstown silt loam to measure drainage and NO3-N fluxes below 1 m depth from N-fertilized orchardgrass using intact core lysimeters. Five N-fertilizer treatments were replicated five times in a randomized complete block experimental design. The models were validated under orchardgrass using soil, water and N transformation rate parameters and C pool fractionation derived from a previous study conducted on similar soils under corn. The model efficiency (MEF) values for drainage and NO3-N fluxes were 0.53 and 0.69 for LEACHM; 0.75 and 0.39 for NCSWAP; and 0.94 and 0.91 for SOIL-SOILN. The models failed to produce reasonable simulations of drainage and NO3-N fluxes in January, February and March due to limited water movement associated with frozen soil and snow accumulation and melt. The differences between simulated and measured NO3-N leaching, and among the models' performances, may also be related to the soil N and C transformation processes embedded in the models. These results are a notable step in the validation of computer models, which will support their continued diffusion across diverse stakeholders.
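
    The MEF values quoted above are consistent with the Nash-Sutcliffe definition of model efficiency; assuming that definition, a minimal sketch is given below (the example observations and simulations are made up).

      import numpy as np

      def model_efficiency(observed, simulated):
          """Nash-Sutcliffe model efficiency:
          MEF = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2).
          1.0 is a perfect fit, 0.0 is no better than the observed mean,
          and negative values are worse than the mean."""
          obs = np.asarray(observed, dtype=float)
          sim = np.asarray(simulated, dtype=float)
          return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

      # usage with made-up monthly drainage values (mm)
      obs = [12.0, 30.0, 45.0, 20.0, 5.0]
      sim = [10.0, 28.0, 50.0, 18.0, 7.0]
      print(round(model_efficiency(obs, sim), 2))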

  7. A statistical model for predicting muscle performance

    Science.gov (United States)

    Byerly, Diane Leslie De Caix

    The objective of these studies was to develop a capability for predicting muscle performance and fatigue to be utilized for both space- and ground-based applications. To develop this predictive model, healthy test subjects performed a defined, repetitive dynamic exercise to failure using a Lordex spinal machine. Throughout the exercise, surface electromyography (SEMG) data were collected from the erector spinae using a Mega Electronics ME3000 muscle tester and surface electrodes placed on both sides of the back muscle. These data were analyzed using a 5th order Autoregressive (AR) model and statistical regression analysis. It was determined that an AR derived parameter, the mean average magnitude of AR poles, significantly correlated with the maximum number of repetitions (designated Rmax) that a test subject was able to perform. Using the mean average magnitude of AR poles, a test subject's performance to failure could be predicted as early as the sixth repetition of the exercise. This predictive model has the potential to provide a basis for improving post-space flight recovery, monitoring muscle atrophy in astronauts and assessing the effectiveness of countermeasures, monitoring astronaut performance and fatigue during Extravehicular Activity (EVA) operations, providing pre-flight assessment of the ability of an EVA crewmember to perform a given task, improving the design of training protocols and simulations for strenuous International Space Station assembly EVA, and enabling EVA work task sequences to be planned enhancing astronaut performance and safety. Potential ground-based, medical applications of the predictive model include monitoring muscle deterioration and performance resulting from illness, establishing safety guidelines in the industry for repetitive tasks, monitoring the stages of rehabilitation for muscle-related injuries sustained in sports and accidents, and enhancing athletic performance through improved training protocols while reducing
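
    The abstract does not give the estimation procedure, but the headline quantity, the mean magnitude of the AR(5) poles, can be illustrated with a minimal sketch: fit the AR coefficients by least squares and take the mean absolute value of the roots of the characteristic polynomial. The per-repetition signals below are hypothetical surrogates, not SEMG data.

      import numpy as np

      def ar_pole_mean_magnitude(signal, order=5):
          """Fit an AR(order) model by least squares and return the mean
          magnitude of the poles of its characteristic polynomial."""
          x = np.asarray(signal, dtype=float)
          # design matrix of lagged samples: x[n] ~ a1*x[n-1] + ... + ap*x[n-p]
          X = np.column_stack([x[order - k - 1: len(x) - k - 1] for k in range(order)])
          y = x[order:]
          coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
          # poles are the roots of z^p - a1*z^(p-1) - ... - ap
          poles = np.roots(np.concatenate(([1.0], -coeffs)))
          return float(np.mean(np.abs(poles)))

      # usage: one value per repetition epoch (hypothetical surrogate signals)
      rng = np.random.default_rng(0)
      epochs = [rng.standard_normal(512) for _ in range(10)]
      print([round(ar_pole_mean_magnitude(e), 3) for e in epochs])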

  8. Conceptual Modeling Framework for E-Area PA HELP Infiltration Model Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Dyer, J. A. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-11-30

    A conceptual modeling framework based on the proposed E-Area Low-Level Waste Facility (LLWF) closure cap design is presented for conducting Hydrologic Evaluation of Landfill Performance (HELP) model simulations of intact and subsided cap infiltration scenarios for the next E-Area Performance Assessment (PA).

  9. USING COPULAS TO MODEL DEPENDENCE IN SIMULATION RISK ASSESSMENT

    Energy Technology Data Exchange (ETDEWEB)

    Dana L. Kelly

    2007-11-01

    Typical engineering systems in applications with high failure consequences such as nuclear reactor plants often employ redundancy and diversity of equipment in an effort to lower the probability of failure and therefore risk. However, it has long been recognized that dependencies exist in these redundant and diverse systems. Some dependencies, such as common sources of electrical power, are typically captured in the logic structure of the risk model. Others, usually referred to as intercomponent dependencies, are treated implicitly by introducing one or more statistical parameters into the model. Such common-cause failure models have limitations in a simulation environment. In addition, substantial subjectivity is associated with parameter estimation for these models. This paper describes an approach in which system performance is simulated by drawing samples from the joint distributions of dependent variables. The approach relies on the notion of a copula distribution, a notion which has been employed by the actuarial community for ten years or more, but which has seen only limited application in technological risk assessment. The paper also illustrates how equipment failure data can be used in a Bayesian framework to estimate the parameter values in the copula model. This approach avoids much of the subjectivity required to estimate parameters in traditional common-cause failure models. Simulation examples are presented for failures in time. The open-source software package R is used to perform the simulations. The open-source software package WinBUGS is used to perform the Bayesian inference via Markov chain Monte Carlo sampling.
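
    The abstract does not state which copula family is used, so the sketch below uses a Gaussian copula as one concrete possibility: it draws dependent exponential failure times for two redundant components and estimates the probability that both fail within a mission time. All parameter values are hypothetical.

      import numpy as np
      from scipy.stats import norm, expon

      def dependent_failure_times(n, rho, mean_life_a, mean_life_b, seed=None):
          """Draw n pairs of dependent exponential failure times via a Gaussian copula.
          rho is the correlation of the underlying bivariate normal; the marginals
          are exponential with the given mean lives."""
          rng = np.random.default_rng(seed)
          z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
          u = norm.cdf(z)                      # uniforms carrying the copula dependence
          t_a = expon.ppf(u[:, 0], scale=mean_life_a)
          t_b = expon.ppf(u[:, 1], scale=mean_life_b)
          return t_a, t_b

      # usage: probability that both redundant components fail within 1000 hours
      t_a, t_b = dependent_failure_times(100_000, rho=0.6,
                                         mean_life_a=5_000.0,
                                         mean_life_b=5_000.0, seed=1)
      print(np.mean((t_a < 1_000.0) & (t_b < 1_000.0)))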

  10. Computational Fluid Dynamics and Building Energy Performance Simulation

    DEFF Research Database (Denmark)

    Nielsen, Peter V.; Tryggvason, Tryggvi

    An interconnection between a building energy performance simulation program and a Computational Fluid Dynamics program (CFD) for room air distribution will be introduced for improvement of the predictions of both the energy consumption and the indoor environment. The building energy performance...

  11. Computational Fluid Dynamics and Building Energy Performance Simulation

    DEFF Research Database (Denmark)

    Nielsen, Peter Vilhelm; Tryggvason, T.

    1998-01-01

    An interconnection between a building energy performance simulation program and a Computational Fluid Dynamics program (CFD) for room air distribution will be introduced for improvement of the predictions of both the energy consumption and the indoor environment. The building energy performance...

  12. NLTE atomic kinetics modeling in ICF target simulations

    Science.gov (United States)

    Patel, Mehul V.; Mauche, Christopher W.; Scott, Howard A.; Jones, Ogden S.; Shields, Benjamin T.

    2017-10-01

    Radiation hydrodynamics (HYDRA) simulations using recently developed 1D spherical and 2D cylindrical hohlraum models have enabled a reassessment of the accuracy of energetics modeling across a range of NIF target configurations. Higher-resolution hohlraum calculations generally find that the X-ray drive discrepancies are greater than previously reported. We identify important physics sensitivities in the modeling of the NLTE wall plasma and highlight sensitivity variations between different hohlraum configurations (e.g. hohlraum gas fill). Additionally, 1D capsule-only simulations show the importance of applying a similar level of rigor to NLTE capsule ablator modeling. Taken together, these results show how improved target performance predictions can be achieved by performing inline atomic kinetics using more complete models for the underlying atomic structure and transitions. Prepared by LLNL under Contract DE-AC52-07NA27344.

  13. Kanban simulation model for production process optimization

    Directory of Open Access Journals (Sweden)

    Golchev Riste

    2015-01-01

    A long time has passed since the KANBAN system was established as an efficient method for coping with excessive inventory. Still, the possibilities for its improvement through integration with other approaches should be investigated further. The basic research challenge of this paper is to present the benefits of KANBAN implementation supported with Discrete Event Simulation (DES). In that direction, the basics of the KANBAN system are first presented, with emphasis on the information and material flow, together with a methodology for implementation of the KANBAN system. An analysis of combining simulation with this methodology is presented. The paper concludes with a practical example which shows that, by understanding the philosophy of the KANBAN implementation methodology and the simulation methodology, a simulation model can be created which can serve as a basis for a variety of experiments that can be conducted within a short period of time, resulting in production process optimization.

  14. Suppressing correlations in massively parallel simulations of lattice models

    Science.gov (United States)

    Kelling, Jeffrey; Ódor, Géza; Gemming, Sibylle

    2017-11-01

    For lattice Monte Carlo simulations parallelization is crucial to make studies of large systems and long simulation time feasible, while sequential simulations remain the gold-standard for correlation-free dynamics. Here, various domain decomposition schemes are compared, concluding with one which delivers virtually correlation-free simulations on GPUs. Extensive simulations of the octahedron model for 2+1 dimensional Kardar-Parisi-Zhang surface growth, which is very sensitive to correlation in the site-selection dynamics, were performed to show self-consistency of the parallel runs and agreement with the sequential algorithm. We present a GPU implementation providing a speedup of about 30× over a parallel CPU implementation on a single socket and at least 180× with respect to the sequential reference.
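
    For readers unfamiliar with the idea, the sketch below shows the simplest form of domain decomposition for lattice Monte Carlo: a checkerboard update in which all sites of one parity can be updated simultaneously because they interact only with sites of the other parity. It is a generic illustration on a 2D Ising-type lattice, not the GPU decomposition schemes compared in the paper.

      import numpy as np

      def checkerboard_metropolis_sweep(spins, beta, rng):
          """One Metropolis sweep of a 2D Ising lattice (J = 1, periodic boundaries)
          using a checkerboard decomposition: each parity is updated in parallel."""
          ii, jj = np.indices(spins.shape)
          for parity in (0, 1):
              mask = (ii + jj) % 2 == parity
              nn = (np.roll(spins, 1, 0) + np.roll(spins, -1, 0) +
                    np.roll(spins, 1, 1) + np.roll(spins, -1, 1))
              dE = 2.0 * spins * nn                       # energy cost of flipping
              accept = rng.random(spins.shape) < np.exp(-beta * dE)
              spins = np.where(mask & accept, -spins, spins)
          return spins

      # usage: 64x64 lattice near the critical coupling
      rng = np.random.default_rng(0)
      spins = rng.choice([-1, 1], size=(64, 64))
      for _ in range(200):
          spins = checkerboard_metropolis_sweep(spins, beta=0.44, rng=rng)
      print(abs(spins.mean()))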

  15. Modelling, simulation and validation of the industrial robot

    Directory of Open Access Journals (Sweden)

    Aleksandrov Slobodan Č.

    2014-01-01

    In this paper, a DH model of an industrial robot with an anthropomorphic configuration and five degrees of freedom, the Mitsubishi RV2AJ, is developed. The model is verified on the example robot Mitsubishi RV2AJ. The paper presents in detail the complete mathematical model of the robot and its programming parameters. On the basis of this model, simulation of robot motion from point to point is performed, as well as continuous movement along a pre-defined path. The industrial robot is then programmed identically to the simulation programs, and a comparative analysis of the real and simulated experiments is shown. In the final section, a detailed analysis of the robot motion is given.
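
    The Denavit-Hartenberg convention itself is standard, so a forward-kinematics sketch is easy to give; the DH parameter values below are placeholders for illustration only, not the RV2AJ parameters reported in the paper.

      import numpy as np

      def dh_transform(theta, d, a, alpha):
          """Standard Denavit-Hartenberg homogeneous transform for one joint."""
          ct, st = np.cos(theta), np.sin(theta)
          ca, sa = np.cos(alpha), np.sin(alpha)
          return np.array([[ct, -st * ca,  st * sa, a * ct],
                           [st,  ct * ca, -ct * sa, a * st],
                           [0.0,      sa,       ca,      d],
                           [0.0,     0.0,      0.0,    1.0]])

      def forward_kinematics(joint_angles, dh_table):
          """Chain the per-joint DH transforms; returns the end-effector pose."""
          T = np.eye(4)
          for theta, (d, a, alpha) in zip(joint_angles, dh_table):
              T = T @ dh_transform(theta, d, a, alpha)
          return T

      # placeholder (d, a, alpha) per joint -- not the actual RV2AJ values
      dh_table = [(0.30, 0.00, -np.pi / 2),
                  (0.00, 0.25,  0.0),
                  (0.00, 0.16,  0.0),
                  (0.00, 0.00, -np.pi / 2),
                  (0.07, 0.00,  0.0)]
      pose = forward_kinematics([0.0, -np.pi / 4, np.pi / 3, 0.0, 0.0], dh_table)
      print(pose[:3, 3])   # end-effector position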

  16. An improved methodology for dynamic modelling and simulation of ...

    Indian Academy of Sciences (India)

    This presents a real challenge to engineers who want to design and implement such systems with high performance, efficiency and reliability. For this purpose, engineers need a tool capable of modelling and/or simulating components of diverse nature within the ECDS. However, a majority of the available tools are limited ...

  17. Geant4 models for simulation of multiple scattering

    CERN Document Server

    Ivanchenko, V N; Maire, M; Urban, L

    2010-01-01

    Recent progress in the development of single and multiple scattering models within the Geant4 toolkit is presented. Different options available to users are discussed. Comparisons with data are shown. The trade-off between precision and CPU performance is discussed, with a focus on LHC detector simulation

  18. Fracture Network Modeling and GoldSim Simulation Support

    OpenAIRE

    杉田 健一郎; Dershowiz, W.

    2003-01-01

    During Heisei-14, Golder Associates provided support for JNC Tokai through data analysis and simulation of the MIU Underground Rock Laboratory, participation in Task 6 of the Aspo Task Force on Modelling of Groundwater Flow and Transport, and analysis of repository safety assessment technologies including cell networks for evaluation of the disturbed rock zone (DRZ) and total systems performance assessment (TSPA).

  19. Distributed dynamic simulations of networked control and building performance applications.

    Science.gov (United States)

    Yahiaoui, Azzedine

    2018-02-01

    The use of computer-based automation and control systems for smart sustainable buildings, often called Automated Buildings (ABs), has become an effective way to automatically control, optimize, and supervise a wide range of building performance applications over a network while achieving the minimum possible energy consumption; such an arrangement is generally referred to as the Building Automation and Control Systems (BACS) architecture. Instead of costly and time-consuming experiments, this paper focuses on using distributed dynamic simulations to analyze the real-time performance of network-based building control systems in ABs and to improve the functions of the BACS technology. The paper also presents the development and design of a distributed dynamic simulation environment with the capability of representing the BACS architecture in simulation by run-time coupling of two or more different software tools over a network. The application and capability of this new dynamic simulation environment are demonstrated by an experimental design in this paper.

  20. MODELING AND SIMULATION OF INDUSTRIAL FORMALDEHYDE ABSORBERS

    NARCIS (Netherlands)

    WINKELMAN, JGM; SIJBRING, H; BEENACKERS, AACM; DEVRIES, ET

    1992-01-01

    The industrially important process of formaldehyde absorption in water constitutes a case of multicomponent mass transfer with multiple reactions and considerable heat effects. A stable solution algorithm is developed to simulate the performance of industrial absorbers for this process using a

  1. The Five Key Questions of Human Performance Modeling.

    Science.gov (United States)

    Wu, Changxu

    2018-01-01

    By building computational (typically mathematical and computer simulation) models, human performance modeling (HPM) quantifies, predicts, and maximizes human performance, human-machine system productivity and safety. This paper describes and summarizes the five key questions of human performance modeling: 1) Why we build models of human performance; 2) What the expectations of a good human performance model are; 3) What the procedures and requirements in building and verifying a human performance model are; 4) How we integrate a human performance model with system design; and 5) What the possible future directions of human performance modeling research are. Recent and classic HPM findings are addressed in the five questions to provide new thinking in HPM's motivations, expectations, procedures, system integration and future directions.

  2. Ravenscar Computational Model compliant AADL Simulation on LEON2

    Directory of Open Access Journals (Sweden)

    Roberto Varona-Gómez

    2013-02-01

    AADL has been proposed for designing and analyzing SW and HW architectures for real-time mission-critical embedded systems. Although the Behavioral Annex improves its simulation semantics, AADL is a language for analyzing architectures and not for simulating them. AADS-T is an AADL simulation tool that supports the performance analysis of the AADL specification throughout the refinement process, from the initial system architecture until the complete, detailed application and execution platform are developed. In this way, AADS-T enables the verification of the initial timing constraints during the complete design process. In this paper we focus on the compatibility of AADS-T with the Ravenscar Computational Model (RCM) as part of the TASTE toolset. Its flexibility enables AADS-T to support different processors. In this work we have focused on performing the simulation on a LEON2 processor.

  3. Millimeter waves sensor modeling and simulation

    Science.gov (United States)

    Latger, Jean; Cathala, Thierry

    2015-10-01

    Guidance of weapon systems relies on sensors to analyze target signatures. Defense weapon systems also need to detect and then identify threats, also using sensors. One important class of sensors is millimeter-wave radar, which is very efficient for seeing through the atmosphere and/or foliage, for example. This type of high frequency radar can produce high quality images with very tricky features such as dihedral and trihedral bright points, shadows and layover effects. Besides, image quality is very dependent on the carrier velocity and trajectory. Such sensor systems are so complex that they need simulation to be tested. This paper presents a state of the art of millimeter-wave sensor models. A short presentation of asymptotic methods shows that physical optics support is mandatory to reach realistic results. The SE-Workbench-RF tool is presented and typical examples of results are shown, both in the frame of Synthetic Aperture Radar sensors and Real Beam Ground Mapping radars. Several technical topics are then discussed, such as the rendering technique (ray tracing vs. rasterization), the implementation (CPU vs. GPGPU) and the trade-off between physical accuracy and computational performance. Examples of results using SE-Workbench-RF are shown and discussed.

  4. A Concept of Simulation-based SC Performance Analysis Using SCOR Metrics

    OpenAIRE

    Šitova Irīna; Pečerska Jeļena

    2017-01-01

    The paper discusses a common approach to describing and analysing supply chains between simulation specialists and supply chain managers, which is based on Supply Chain Operations Reference (SCOR) model indicators and metrics. SCOR is a reference model of supply chain business processes. It is based on best practices and used in various business areas of supply chains. Supply chain performance indicators are defined by numerous measurable SCOR metrics. Some metrics can be estimated with simul...

  5. Modeling, Simulation and Analysis of Public Key Infrastructure

    Science.gov (United States)

    Liu, Yuan-Kwei; Tuey, Richard; Ma, Paul (Technical Monitor)

    1998-01-01

    Security is an essential part of network communication. The advances in cryptography have provided solutions to many of the network security requirements. Public Key Infrastructure (PKI) is the foundation of cryptography applications. The main objective of this research is to design a model to simulate a reliable, scalable, manageable, and high-performance public key infrastructure. We build a model to simulate the NASA public key infrastructure using SimProcess and MATLAB software. The simulation extends from the top level all the way down to the computation needed for encryption, decryption, digital signatures, and a secure web server. The secure web server application could be utilized in wireless communications. The results of the simulation are analyzed and confirmed using queueing theory.

  6. Validation and performance studies for the ATLAS simulation

    International Nuclear Information System (INIS)

    Marshall, Zachary

    2010-01-01

    We present the validation of the ATLAS simulation software project. Software development is controlled by nightly builds and several levels of automatic tests to ensure stability. Software performance validation, including CPU time, memory, and disk space required per event, is monitored for all software releases. Several different physics processes are checked to thoroughly test all aspects of the detector simulation. The robustness of the simulation software is demonstrated by the production of 500 million events on the World-wide LHC Computing Grid in the last year.

  7. Review of Methods Related to Assessing Human Performance in Nuclear Power Plant Control Room Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Katya L Le Blanc; Ronald L Boring; David I Gertman

    2001-11-01

    With the increased use of digital systems in Nuclear Power Plant (NPP) control rooms comes a need to thoroughly understand the human performance issues associated with digital systems. A common way to evaluate human performance is to test operators and crews in NPP control room simulators. However, it is often challenging to characterize human performance in meaningful ways when measuring performance in NPP control room simulations. A review of the literature in NPP simulator studies reveals a variety of ways to measure human performance in NPP control room simulations including direct observation, automated computer logging, recordings from physiological equipment, self-report techniques, protocol analysis and structured debriefs, and application of model-based evaluation. These methods and the particular measures used are summarized and evaluated.

  8. A simulation infrastructure for examining the performance of resilience strategies at scale.

    Energy Technology Data Exchange (ETDEWEB)

    Ferreira, Kurt Brian; Levy, Scott N.; Bridges, Patrick G.

    2013-04-01

    Fault-tolerance is a major challenge for many current and future extreme-scale systems, with many studies showing it to be the key limiter to application scalability. While there are a number of studies investigating the performance of various resilience mechanisms, these are typically limited to scales orders of magnitude smaller than expected for next-generation systems and simple benchmark problems. In this paper we show how, with very minor changes, a previously published and validated simulation framework for investigating application performance of OS noise can be used to simulate the overheads of various resilience mechanisms at scale. Using this framework, we compare the failure-free performance of this simulator against an analytic model to validate its performance and demonstrate its ability to simulate the performance of two popular rollback recovery methods on traces from real

  9. A Simulation and Modeling Framework for Space Situational Awareness

    Science.gov (United States)

    Olivier, S.

    This paper describes the development and initial demonstration of a new, integrated modeling and simulation framework, encompassing the space situational awareness enterprise, for quantitatively assessing the benefit of specific sensor systems, technologies and data analysis techniques. This framework includes detailed models for threat scenarios, signatures, sensors, observables and knowledge extraction algorithms. The framework is based on a flexible, scalable architecture to enable efficient simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel computer systems available, for example, at Lawrence Livermore National Laboratory. We will describe the details of the modeling and simulation framework, including hydrodynamic models of satellite intercept and debris generation, orbital propagation algorithms, radar cross section calculations, optical and infra-red brightness calculations, generic radar system models, generic optical and infra-red system models, specific Space Surveillance Network models, object detection algorithms, orbit determination algorithms, and visualization tools. The specific modeling of the Space Surveillance Network is performed in collaboration with the Air Force Space Command Space Control Group. We will demonstrate the use of this integrated simulation and modeling framework on specific threat scenarios, including space debris and satellite maneuvers, and we will examine the results of case studies involving the addition of new sensor systems, used in conjunction with the Space Surveillance Network, for improving space situational awareness.

  10. TRANSFORM - TRANsient Simulation Framework of Reconfigurable Models

    Energy Technology Data Exchange (ETDEWEB)

    2017-09-01

    Existing development tools for early stage design and scoping of energy systems are often time-consuming to use, proprietary, and do not contain the necessary functionality to model complete systems (i.e., controls, primary, and secondary systems) in a common platform. The TRANSFORM tool, based on the Modelica programming language, (1) provides a standardized, common simulation environment for early design of energy systems (i.e., power plants), (2) provides a library of baseline component modules to be assembled into full plant models using available geometry, design, and thermal-hydraulic data, (3) defines modeling conventions for interconnecting component models, and (4) establishes user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.

  11. 3D simulation of the Cluster-Cluster Aggregation model

    Science.gov (United States)

    Li, Chao; Xiong, Hailing

    2014-12-01

    We have written a program implementing the Cluster-Cluster Aggregation (CCA) model in the Java programming language. Using the simulation program, the fractal aggregation growth process can be displayed dynamically in the form of a three-dimensional (3D) figure, and the related kinetics data of the aggregation simulation can also be recorded dynamically. Compared to traditional programs, the program has better real-time performance and is more helpful for observing the fractal growth process, which contributes to scientific study of fractal aggregation. Besides, because it is written in Java, the program has very good cross-platform portability.
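
    The authors' program is written in Java; purely as a language-neutral illustration of the basic CCA kinetics (clusters random-walking on a periodic lattice and merging irreversibly on contact), here is a much-simplified sketch with a deliberately crude contact rule. It is not the paper's implementation.

      import random

      MOVES = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]

      def cca_simulation(n_particles=200, size=30, steps=20000, seed=0):
          """Minimal on-lattice cluster-cluster aggregation in 3D."""
          rng = random.Random(seed)
          sites = rng.sample([(x, y, z) for x in range(size)
                              for y in range(size) for z in range(size)], n_particles)
          clusters = {i: {s} for i, s in enumerate(sites)}   # cluster id -> sites
          owner = {s: i for i, s in enumerate(sites)}        # site -> cluster id
          for _ in range(steps):
              if len(clusters) == 1:
                  break
              cid = rng.choice(list(clusters))
              dx, dy, dz = rng.choice(MOVES)
              moved = {((x + dx) % size, (y + dy) % size, (z + dz) % size)
                       for (x, y, z) in clusters[cid]}
              # clusters that the moved cluster would overlap or touch
              hit = set()
              for (x, y, z) in moved:
                  for mx, my, mz in [(0, 0, 0)] + MOVES:
                      nb = ((x + mx) % size, (y + my) % size, (z + mz) % size)
                      other = owner.get(nb)
                      if other is not None and other != cid:
                          hit.add(other)
              if hit:
                  # contact: merge irreversibly (positions left unchanged -- crude rule)
                  for other in hit:
                      other_sites = clusters.pop(other)
                      for s in other_sites:
                          owner[s] = cid
                      clusters[cid] |= other_sites
              else:
                  # free move: translate the whole cluster rigidly
                  for s in clusters[cid]:
                      del owner[s]
                  clusters[cid] = moved
                  for s in moved:
                      owner[s] = cid
          return clusters

      # usage: number of clusters left and size of the largest aggregate
      clusters = cca_simulation()
      print(len(clusters), max(len(c) for c in clusters.values()))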

  12. Biological transportation networks: Modeling and simulation

    KAUST Repository

    Albi, Giacomo

    2015-09-15

    We present a model for biological network formation originally introduced by Cai and Hu [Adaptation and optimization of biological transport networks, Phys. Rev. Lett. 111 (2013) 138701]. The modeling of fluid transportation (e.g., leaf venation and angiogenesis) and ion transportation networks (e.g., neural networks) is explained in detail and basic analytical features like the gradient flow structure of the fluid transportation network model and the impact of the model parameters on the geometry and topology of network formation are analyzed. We also present a numerical finite-element based discretization scheme and discuss sample cases of network formation simulations.

  13. A universal simulator for ecological models

    DEFF Research Database (Denmark)

    Holst, Niels

    2013-01-01

    Software design is an often neglected issue in ecological models, even though bad software design often becomes a hindrance for re-using, sharing and even grasping an ecological model. In this paper, the methodology of agile software design was applied to the domain of ecological models. Thus...... the principles for a universal design of ecological models were arrived at. To exemplify this design, the open-source software Universal Simulator was constructed using C++ and XML and is provided as a resource for inspiration....

  14. Object Oriented Modelling and Dynamical Simulation

    DEFF Research Database (Denmark)

    Wagner, Falko Jens; Poulsen, Mikael Zebbelin

    1998-01-01

    This report with appendix describes the work done in a master project at DTU. The goal of the project was to develop a concept for simulation of dynamical systems based on object oriented methods. The result was a library of C++-classes, for use when both building component-based models and when...

  15. preliminary multidomain modelling and simulation study

    African Journals Online (AJOL)


    Preliminary multidomain modelling and simulation study of a horizontal axis wind turbine (HAWT) tower vibration. I. Iliyasu, I. Iliyasu, I. K. Tanimu and D. O. Obada. Department of Mechanical Engineering, Ahmadu Bello University, Zaria, Kaduna State, Nigeria.

  16. Reproducibility in Computational Neuroscience Models and Simulations

    Science.gov (United States)

    McDougal, Robert A.; Bulanova, Anna S.; Lytton, William W.

    2016-01-01

    Objective: Like all scientific research, computational neuroscience research must be reproducible. Big data science, including simulation research, cannot depend exclusively on journal articles as the method to provide the sharing and transparency required for reproducibility. Methods: Ensuring model reproducibility requires the use of multiple standard software practices and tools, including version control, strong commenting and documentation, and code modularity. Results: Building on these standard practices, model sharing sites and tools have been developed that fit into several categories: 1. standardized neural simulators, 2. shared computational resources, 3. declarative model descriptors, ontologies and standardized annotations; 4. model sharing repositories and sharing standards. Conclusion: A number of complementary innovations have been proposed to enhance sharing, transparency and reproducibility. The individual user can be encouraged to make use of version control, commenting, documentation and modularity in development of models. The community can help by requiring model sharing as a condition of publication and funding. Significance: Model management will become increasingly important as multiscale models become larger, more detailed and correspondingly more difficult to manage by any single investigator or single laboratory. Additional big data management complexity will come as the models become more useful in interpreting experiments, thus increasing the need to ensure clear alignment between modeling data, both parameters and results, and experiment. PMID:27046845

  17. Survey of chemically amplified resist models and simulator algorithms

    Science.gov (United States)

    Croffie, Ebo H.; Yuan, Lei; Cheng, Mosong; Neureuther, Andrew R.

    2001-08-01

    Modeling has become an indispensable tool for chemically amplified resist (CAR) evaluations. It has been used extensively to study acid diffusion and its effects on resist image formation. Several commercial and academic simulators have been developed for CAR process simulation. For commercial simulators such as PROLITH (Finle Technologies) and Solid-C (Sigma-C), the user is allowed to choose between an empirical model and a concentration-dependent diffusion model. The empirical model is faster but not very accurate for two-dimensional resist simulations. In this case there is a trade-off between the speed of the simulator and the accuracy of the results. An academic simulator such as STORM (U.C. Berkeley) gives the user a choice of different algorithms, including a Fast Imaging 2nd-order finite difference algorithm and a Moving Boundary finite element algorithm. A user interested in simulating the volume shrinkage and polymer stress effects during post-exposure bake will need the Moving Boundary algorithm, whereas a user interested in the latent image formation without polymer deformations will find the Fast Imaging algorithm more appropriate. The Fast Imaging algorithm is generally faster and requires less computer memory. This choice of algorithm presents a trade-off between speed and level of detail in resist profile prediction. This paper surveys the different models and simulator algorithms available in the literature, including contributions to the characterization of CAR exposure and post-exposure bake (PEB) processes for different resist systems. Several numerical algorithms and their performance will also be discussed in this paper.

  18. Thermohydraulic modeling and simulation of breeder reactors

    International Nuclear Information System (INIS)

    Agrawal, A.K.; Khatib-Rahbar, M.; Curtis, R.T.; Hetrick, D.L.; Girijashankar, P.V.

    1982-01-01

    This paper deals with the modeling and simulation of system-wide transients in LMFBRs. Unprotected events (i.e., the presumption of failure of the plant protection system) leading to core-melt are not considered in this paper. The existing computational capabilities in the area of protected transients in the US are noted. Various physical and numerical approximations that are made in these codes are discussed. Finally, the future direction in the area of model verification and improvements is discussed

  19. 18th and 19th Workshop on Sustained Simulation Performance

    CERN Document Server

    Bez, Wolfgang; Focht, Erich; Kobayashi, Hiroaki; Patel, Nisarg

    2015-01-01

    This book presents the state of the art in high-performance computing and simulation on modern supercomputer architectures. It covers trends in hardware and software development in general and the future of high-performance systems and heterogeneous architectures in particular. The application-related contributions cover computational fluid dynamics, material science, medical applications and climate research; innovative fields such as coupled multi-physics and multi-scale simulations are highlighted. All papers were chosen from presentations given at the 18th Workshop on Sustained Simulation Performance held at the HLRS, University of Stuttgart, Germany in October 2013 and the subsequent workshop of the same name held at Tohoku University in March 2014.

  20. Tools for evaluating team performance in simulation-based training.

    Science.gov (United States)

    Rosen, Michael A; Weaver, Sallie J; Lazzara, Elizabeth H; Salas, Eduardo; Wu, Teresa; Silvestri, Salvatore; Schiebel, Nicola; Almeida, Sandra; King, Heidi B

    2010-10-01

    Teamwork training constitutes one of the core approaches for moving healthcare systems toward increased levels of quality and safety, and simulation provides a powerful method of delivering this training, especially for fast-paced and dynamic specialty areas such as Emergency Medicine. Team performance measurement and evaluation plays an integral role in ensuring that simulation-based training for teams (SBTT) is systematic and effective. However, this component of SBTT systems is overlooked frequently. This article addresses this gap by providing a review and practical introduction to the process of developing and implementing evaluation systems in SBTT. First, an overview of team performance evaluation is provided. Second, best practices for measuring team performance in simulation are reviewed. Third, some of the prominent measurement tools in the literature are summarized and discussed relative to the best practices. Subsequently, implications of the review are discussed for the practice of training teamwork in Emergency Medicine.

  1. Blood vessel modeling for interactive simulation of interventional neuroradiology procedures.

    Science.gov (United States)

    Kerrien, E; Yureidini, A; Dequidt, J; Duriez, C; Anxionnat, R; Cotin, S

    2017-01-01

    Endovascular interventions can benefit from interactive simulation in their training phase, but also during pre-operative and intra-operative phases if simulation scenarios are based on patient data. A key feature in this context is the ability to extract, from patient images, models of blood vessels that impede neither the realism nor the performance of simulation. This paper addresses both the segmentation and the reconstruction of the vasculature from 3D Rotational Angiography data in a way adapted to simulation: an original tracking algorithm is proposed to segment the vessel tree while filtering points extracted at the vessel surface in the vicinity of each point on the centerline; then an automatic procedure is described to reconstruct each local unstructured point set as a skeleton-based implicit surface (blobby model). The output of successively applying both algorithms is a new model of the vasculature as a tree of local implicit models. The segmentation algorithm is compared with the Multiple Hypothesis Testing (MHT) algorithm (Friman et al., 2010) on patient data, showing its greater ability to track blood vessels. The reconstruction algorithm is evaluated on both synthetic and patient data and demonstrates its ability to fit points with subvoxel precision. Various tests are also reported where our model is used to simulate catheter navigation in interventional neuroradiology. Excellent realism and much lower computational costs are reported when compared to triangular mesh surface models. Copyright © 2016 Elsevier B.V. All rights reserved.
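
    The kernel and fitting details of the blobby model are not given in the abstract; the sketch below only illustrates the general idea of a skeleton-based implicit surface, summing Gaussian blobs centered on centerline points with widths tied to the local vessel radius. The vessel geometry, kernel and iso-level are illustrative assumptions.

      import numpy as np

      def blobby_field(points, centers, radii, iso=1.0):
          """Evaluate a skeleton-based implicit ('blobby') field at query points.
          Each skeleton center contributes a Gaussian blob whose width scales with
          the local vessel radius; field > iso is treated as inside the vessel."""
          points = np.asarray(points, dtype=float)     # (N, 3) query points
          centers = np.asarray(centers, dtype=float)   # (M, 3) centerline points
          radii = np.asarray(radii, dtype=float)       # (M,)  local radii
          d2 = np.sum((points[:, None, :] - centers[None, :, :]) ** 2, axis=2)
          field = np.sum(np.exp(-d2 / (2.0 * radii ** 2)), axis=1)
          return field, field > iso

      # usage: a short hypothetical vessel segment along the x-axis, radius 1 mm
      centers = np.column_stack([np.linspace(0, 10, 21), np.zeros(21), np.zeros(21)])
      radii = np.full(21, 1.0)
      queries = np.array([[5.0, 0.0, 0.0],    # on the centerline -> inside
                          [5.0, 3.0, 0.0]])   # 3 mm off-axis -> outside
      field, inside = blobby_field(queries, centers, radii)
      print(field.round(3), inside)

    Because such a field is smooth and cheap to evaluate, collision response for a simulated catheter can query the implicit function directly rather than test against a triangle mesh, which is consistent with the lower computational costs reported above.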

  2. Characterization uncertainty and its effects on models and performance

    Energy Technology Data Exchange (ETDEWEB)

    Rautman, C.A.; Treadway, A.H.

    1991-01-01

    Geostatistical simulation is being used to develop multiple geologic models of rock properties at the proposed Yucca Mountain repository site. Because each replicate model contains the same known information, and is thus essentially indistinguishable statistically from others, the differences between models may be thought of as representing the uncertainty in the site description. The variability among performance measures, such as ground water travel time, calculated using these replicate models therefore quantifies the uncertainty in performance that arises from uncertainty in site characterization.
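
    A toy version of the workflow can be sketched as follows: generate many correlated random-field realizations of a rock property, push each through a simple performance surrogate, and read the spread of the results as the uncertainty inherited from site characterization. The exponential covariance, the harmonic-mean travel-time surrogate and all parameter values below are illustrative choices, not those of the Yucca Mountain models.

      import numpy as np

      def exponential_cov(x, sill=1.0, corr_length=50.0):
          """Exponential covariance between 1D locations x."""
          dist = np.abs(x[:, None] - x[None, :])
          return sill * np.exp(-dist / corr_length)

      def simulate_log_conductivity(x, n_realizations, mean_logk=-5.0, seed=0):
          """Unconditional Gaussian realizations of log10 hydraulic conductivity."""
          rng = np.random.default_rng(seed)
          cov = exponential_cov(x) + 1e-10 * np.eye(len(x))   # jitter for stability
          L = np.linalg.cholesky(cov)
          z = rng.standard_normal((n_realizations, len(x)))
          return mean_logk + z @ L.T

      # usage: spread of a travel-time surrogate across 200 replicate fields
      x = np.linspace(0.0, 500.0, 101)                 # 1D transect (m)
      logk = simulate_log_conductivity(x, 200)
      k = 10.0 ** logk
      k_eff = len(x) / np.sum(1.0 / k, axis=1)         # harmonic-mean conductivity
      travel_time = 1.0 / k_eff                        # crude surrogate: slower for lower k
      print(np.percentile(travel_time, [5, 50, 95]).round(1))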

  3. Twitter's tweet method modelling and simulation

    Science.gov (United States)

    Sarlis, Apostolos S.; Sakas, Damianos P.; Vlachos, D. S.

    2015-02-01

    This paper seeks to propose the concept of Twitter marketing methods. The tools that Twitter provides are modelled and simulated using iThink in the context of a Twitter media-marketing agency. The paper has leveraged the system dynamics paradigm to conduct Facebook marketing tools and methods modelling, using the iThink™ system to implement them. It uses the design science research methodology for the proof of concept of the models and modelling processes. The following models have been developed for a Twitter marketing agent/company and tested in real circumstances and with real numbers. These models were finalized through a number of revisions and iterations of the design, develop, simulate, test and evaluate cycle. The paper also addresses the methods that best suit organized promotion through targeting on the Twitter social media service. The validity and usefulness of these Twitter marketing method models for day-to-day decision making are confirmed by the management of the company. The paper implements system dynamics concepts of Twitter marketing method modelling and produces models of various Twitter marketing situations. The Tweet method that Twitter provides can be adjusted, depending on the situation, in order to maximize the profit of the company/agent.

  4. Advances in NLTE modeling for integrated simulations

    Science.gov (United States)

    Scott, H. A.; Hansen, S. B.

    2010-01-01

    The last few years have seen significant progress in constructing the atomic models required for non-local thermodynamic equilibrium (NLTE) simulations. Along with this has come an increased understanding of the requirements for accurately modeling the ionization balance, energy content and radiative properties of different atomic species for a wide range of densities and temperatures. Much of this progress is the result of a series of workshops dedicated to comparing the results from different codes and computational approaches applied to a series of test problems. The results of these workshops emphasized the importance of atomic model completeness, especially in doubly-excited states and autoionization transitions, to calculating ionization balance, and the importance of accurate, detailed atomic data to producing reliable spectra. We describe a simple screened-hydrogenic model that calculates NLTE ionization balance with sufficient accuracy, at a low enough computational cost for routine use in radiation-hydrodynamics codes. The model incorporates term splitting, Δn = 0 transitions, and approximate UTA widths for spectral calculations, with results comparable to those of much more detailed codes. Simulations done with this model have been increasingly successful at matching experimental data for laser-driven systems and hohlraums. Accurate and efficient atomic models are just one requirement for integrated NLTE simulations. Coupling the atomic kinetics to hydrodynamics and radiation transport constrains both discretizations and algorithms to retain energy conservation, accuracy and stability. In particular, the strong coupling between radiation and populations can require either very short time steps or significantly modified radiation transport algorithms to account for NLTE material response. Considerations such as these continue to provide challenges for NLTE simulations.

  5. Advances in NLTE Modeling for Integrated Simulations

    International Nuclear Information System (INIS)

    Scott, H.A.; Hansen, S.B.

    2009-01-01

    The last few years have seen significant progress in constructing the atomic models required for non-local thermodynamic equilibrium (NLTE) simulations. Along with this has come an increased understanding of the requirements for accurately modeling the ionization balance, energy content and radiative properties of different elements for a wide range of densities and temperatures. Much of this progress is the result of a series of workshops dedicated to comparing the results from different codes and computational approaches applied to a series of test problems. The results of these workshops emphasized the importance of atomic model completeness, especially in doubly excited states and autoionization transitions, to calculating ionization balance, and the importance of accurate, detailed atomic data to producing reliable spectra. We describe a simple screened-hydrogenic model that calculates NLTE ionization balance with surprising accuracy, at a low enough computational cost for routine use in radiation-hydrodynamics codes. The model incorporates term splitting, Δn = 0 transitions, and approximate UTA widths for spectral calculations, with results comparable to those of much more detailed codes. Simulations done with this model have been increasingly successful at matching experimental data for laser-driven systems and hohlraums. Accurate and efficient atomic models are just one requirement for integrated NLTE simulations. Coupling the atomic kinetics to hydrodynamics and radiation transport constrains both discretizations and algorithms to retain energy conservation, accuracy and stability. In particular, the strong coupling between radiation and populations can require either very short timesteps or significantly modified radiation transport algorithms to account for NLTE material response. Considerations such as these continue to provide challenges for NLTE simulations.

  6. Stochastic Processes and Queueing Theory used in Cloud Computer Performance Simulations

    Directory of Open Access Journals (Sweden)

    Florin-Catalin ENACHE

    2015-10-01

    Full Text Available The cloud business has grown exponentially over the last five years. Capacity managers need a practical way to simulate the random demands a cloud infrastructure could face, even though few mathematical tools exist for simulating such demands. This paper presents an introduction to the most important stochastic processes and queueing theory concepts used for modeling computer performance. It also shows where such concepts are applicable and where they are not, using clear programming examples of how to simulate a queue and how to use and validate a simulation when there are no mathematical concepts to back it up.
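
    The abstract mentions programming examples of queue simulation; as a generic illustration in that spirit (not code from the paper), the sketch below simulates an M/M/1 queue with exponential inter-arrival and service times and validates the average waiting time against the closed-form result. The rates lam and mu are arbitrary illustrative values.

      import random

      def simulate_mm1(lam=0.8, mu=1.0, n_customers=100_000, seed=42):
          """Discrete-event simulation of an M/M/1 queue (single server, FIFO).

          Customers arrive with exponential inter-arrival times (rate lam) and are
          served one at a time with exponential service times (rate mu). Returns the
          average time spent waiting in the queue before service starts.
          """
          rng = random.Random(seed)
          arrival = 0.0          # arrival time of the current customer
          server_free_at = 0.0   # time at which the server next becomes idle
          total_wait = 0.0
          for _ in range(n_customers):
              arrival += rng.expovariate(lam)
              start_service = max(arrival, server_free_at)
              total_wait += start_service - arrival
              server_free_at = start_service + rng.expovariate(mu)
          return total_wait / n_customers

      if __name__ == "__main__":
          lam, mu = 0.8, 1.0
          simulated = simulate_mm1(lam, mu)
          theory = (lam / mu) / (mu - lam)   # M/M/1 mean waiting time in queue
          print(f"simulated mean wait: {simulated:.3f}, theoretical: {theory:.3f}")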

  7. Performance evaluation by simulation and analysis with applications to computer networks

    CERN Document Server

    Chen, Ken

    2015-01-01

    This book is devoted to the most widely used methodologies for performance evaluation: simulation using specialized software and mathematical modeling. An important part is dedicated to simulation, particularly its theoretical framework and the precautions to be taken in the implementation of the experimental procedure. These principles are illustrated by concrete examples achieved through operational simulation languages (OMNeT++, OPNET). Presented as a complementary approach, the mathematical method is essential to the simulation. Both methodologies are based largely on the theory of

  8. Dual Arm Work Package performance estimates and telerobot task network simulation

    International Nuclear Information System (INIS)

    Draper, J.V.

    1997-01-01

    This paper describes the methodology and results of a network simulation study of the Dual Arm Work Package (DAWP), to be employed for dismantling the Argonne National Laboratory CP-5 reactor. The development of the simulation model was based upon the results of a task analysis for the same system. This study was performed by the Oak Ridge National Laboratory (ORNL), in the Robotics and Process Systems Division. Funding was provided by the US Department of Energy's Office of Technology Development, Robotics Technology Development Program (RTDP). The RTDP is developing methods of computer simulation to estimate telerobotic system performance. Data were collected to provide point estimates to be used in a task network simulation model. Three skilled operators performed six repetitions of a pipe cutting task representative of typical teleoperation cutting operations

  9. Particle tracking in sophisticated CAD models for simulation purposes

    International Nuclear Information System (INIS)

    Sulkimo, J.; Vuoskoski, J.

    1995-01-01

    The transfer of physics detector models from computer aided design systems to physics simulation packages like GEANT suffers from certain limitations. In addition, GEANT is not able to perform particle tracking in CAD models. We describe an application which is able to perform particle tracking in boundary models constructed in CAD systems. The transfer file format used is the new international standard, STEP. The design and implementation of the application was carried out using object-oriented techniques. It will be integrated in the future object-oriented version of GEANT. (orig.)

  10. cellGPU: Massively parallel simulations of dynamic vertex models

    Science.gov (United States)

    Sussman, Daniel M.

    2017-10-01

    Vertex models represent confluent tissue by polygonal or polyhedral tilings of space, with the individual cells interacting via force laws that depend on both the geometry of the cells and the topology of the tessellation. This dependence on the connectivity of the cellular network introduces several complications to performing molecular-dynamics-like simulations of vertex models, and in particular makes parallelizing the simulations difficult. cellGPU addresses this difficulty and lays the foundation for massively parallelized, GPU-based simulations of these models. This article discusses its implementation for a pair of two-dimensional models, and compares the typical performance that can be expected between running cellGPU entirely on the CPU versus its performance when running on a range of commercial and server-grade graphics cards. By implementing the calculation of topological changes and forces on cells in a highly parallelizable fashion, cellGPU enables researchers to simulate time- and length-scales previously inaccessible via existing single-threaded CPU implementations. Program Files doi:http://dx.doi.org/10.17632/6j2cj29t3r.1 Licensing provisions: MIT Programming language: CUDA/C++ Nature of problem: Simulations of off-lattice "vertex models" of cells, in which the interaction forces depend on both the geometry and the topology of the cellular aggregate. Solution method: Highly parallelized GPU-accelerated dynamical simulations in which the force calculations and the topological features can be handled on either the CPU or GPU. Additional comments: The code is hosted at https://gitlab.com/dmsussman/cellGPU, with documentation additionally maintained at http://dmsussman.gitlab.io/cellGPUdocumentation

  11. SIMULATION MODELING OF IT PROJECTS BASED ON PETRI NETS

    Directory of Open Access Journals (Sweden)

    Александр Михайлович ВОЗНЫЙ

    2015-05-01

    Full Text Available An integrated simulation model of an IT project is proposed, based on a modified Petri net that combines the product model with the model of project tasks. A substantive interpretation of the components of the simulation model is presented, and the simulation process is described. Conclusions are drawn about the integration of the product model and the model of project tasks.

  12. Transient performance simulation of aircraft engine integrated with fuel and control systems

    International Nuclear Information System (INIS)

    Wang, C.; Li, Y.G.; Yang, B.Y.

    2017-01-01

    Highlights: • A new performance simulation method for engine hydraulic fuel systems is introduced. • Time delay of engine performance due to the fuel system model is noticeable but small. • The method provides details of fuel system behavior in engine transient processes. • The method could be used to support engine and fuel system designs. - Abstract: A new method for the simulation of gas turbine fuel systems based on an inter-component volume method has been developed. It is able to simulate the performance of each of the hydraulic components of a fuel system using physics-based models, which potentially offers more accurate results compared with those using transfer functions. A transient performance simulation system has been set up for gas turbine engines based on an inter-component volume (ICV) method. A proportional-integral (PI) control strategy is used for the simulation of the engine controller. An integrated model of the engine with its control and hydraulic fuel systems has been set up to investigate their coupling effects during engine transient processes. The developed simulation system has been applied to a model aero engine. The results show that the delay of the engine transient response due to the inclusion of the fuel system model is noticeable although relatively small. The developed method is generic and can be applied to other gas turbines and their control and fuel systems.
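
    The abstract does not give the controller equations; purely as a generic illustration of the PI control strategy mentioned above (not the paper's model), the sketch below runs a discrete-time PI loop on a hypothetical first-order plant. The gains kp and ki, the time constant tau and the step sizes are invented for the example.

      # Minimal discrete-time PI control loop driving a hypothetical first-order plant.
      # All numerical values are illustrative; none are taken from the paper.

      def simulate_pi(setpoint=1.0, kp=2.0, ki=1.5, dt=0.01, t_end=5.0, tau=0.8):
          n_steps = int(t_end / dt)
          y = 0.0            # plant output (e.g. a normalized fuel flow or spool speed)
          integral = 0.0     # accumulated integral of the error
          history = []
          for k in range(n_steps):
              error = setpoint - y
              integral += error * dt
              u = kp * error + ki * integral      # PI control law
              y += dt * (u - y) / tau             # first-order plant, explicit Euler step
              history.append((k * dt, y))
          return history

      if __name__ == "__main__":
          trace = simulate_pi()
          print(f"output after {trace[-1][0]:.2f} s: {trace[-1][1]:.3f}")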

  13. Accelerating transient simulation of linear reduced order models.

    Energy Technology Data Exchange (ETDEWEB)

    Thornquist, Heidi K.; Mei, Ting; Keiter, Eric Richard; Bond, Brad

    2011-10-01

    Model order reduction (MOR) techniques have been used to facilitate the analysis of dynamical systems for many years. Although existing model reduction techniques are capable of providing huge speedups in the frequency domain analysis (i.e. AC response) of linear systems, such speedups are often not obtained when performing transient analysis on the systems, particularly when coupled with other circuit components. Reduced system size, which is the ostensible goal of MOR methods, is often insufficient to improve transient simulation speed on realistic circuit problems. It can be shown that making the correct reduced order model (ROM) implementation choices is crucial to the practical application of MOR methods. In this report we investigate methods for accelerating the simulation of circuits containing ROM blocks using the circuit simulator Xyce.

  14. Modeling, simulation and optimization of bipedal walking

    CERN Document Server

    Berns, Karsten

    2013-01-01

    The model-based investigation of motions of anthropomorphic systems is an important interdisciplinary research topic involving specialists from many fields such as Robotics, Biomechanics, Physiology, Orthopedics, Psychology, Neurosciences, Sports, Computer Graphics and Applied Mathematics. This book presents a study of basic locomotion forms such as walking and running, which are of particular interest due to the high demands on dynamic coordination, actuator efficiency and balance control. Mathematical models and numerical simulation and optimization techniques are explained, in combination with experimental data, which can help to better understand the basic underlying mechanisms of these motions and to improve them. Example topics treated in this book are Modeling techniques for anthropomorphic bipedal walking systems Optimized walking motions for different objective functions Identification of objective functions from measurements Simulation and optimization approaches for humanoid robots Biologically inspired con...

  15. Multiphase reacting flows modelling and simulation

    CERN Document Server

    Marchisio, Daniele L

    2007-01-01

    The papers in this book describe the most widely applicable modeling approaches and are organized in six groups covering from fundamentals to relevant applications. In the first part, some fundamentals of multiphase turbulent reacting flows are covered. In particular the introduction focuses on basic notions of turbulence theory in single-phase and multi-phase systems as well as on the interaction between turbulence and chemistry. In the second part, models for the physical and chemical processes involved are discussed. Among other things, particular emphasis is given to turbulence modeling strategies for multiphase flows based on the kinetic theory for granular flows. Next, the different numerical methods based on Lagrangian and/or Eulerian schemes are presented. In particular the most popular numerical approaches of computational fluid dynamics codes are described (i.e., Direct Numerical Simulation, Large Eddy Simulation, and Reynolds-Averaged Navier-Stokes approach). The book will cover particle-based meth...

  16. Fault diagnosis based on continuous simulation models

    Science.gov (United States)

    Feyock, Stefan

    1987-01-01

    The results are described of an investigation of techniques for using continuous simulation models as basis for reasoning about physical systems, with emphasis on the diagnosis of system faults. It is assumed that a continuous simulation model of the properly operating system is available. Malfunctions are diagnosed by posing the question: how can we make the model behave like that. The adjustments that must be made to the model to produce the observed behavior usually provide definitive clues to the nature of the malfunction. A novel application of Dijkstra's weakest precondition predicate transformer is used to derive the preconditions for producing the required model behavior. To minimize the size of the search space, an envisionment generator based on interval mathematics was developed. In addition to its intended application, the ability to generate qualitative state spaces automatically from quantitative simulations proved to be a fruitful avenue of investigation in its own right. Implementations of the Dijkstra transform and the envisionment generator are reproduced in the Appendix.

  17. Numerical model simulation of atmospheric coolant plumes

    International Nuclear Information System (INIS)

    Gaillard, P.

    1980-01-01

    The effect of humid atmospheric coolants on the atmosphere is simulated by means of a three-dimensional numerical model. The atmosphere is defined by its natural vertical profiles of horizontal velocity, temperature, pressure and relative humidity. Effluent discharge is characterised by its vertical velocity and the temperature of air saturated with water vapour. The subject of investigation is the area in the vicinity of the point of discharge, with due allowance for the wake effect of the tower and buildings and, where applicable, wind veer with altitude. The model equations express the conservation relationships for momentum, energy, total mass and water mass, for an incompressible fluid behaving in accordance with the Boussinesq assumptions. Condensation is represented by a simple thermodynamic model, and turbulent fluxes are simulated by introduction of turbulent viscosity and diffusivity data based on in-situ and experimental water model measurements. The three-dimensional problem expressed in terms of the primitive variables (u, v, w, p) is governed by an elliptic equation system which is solved numerically by application of an explicit time-marching algorithm in order to predict the steady-flow velocity distribution, temperature, water vapour concentration and the liquid-water concentration defining the visible plume. Windless conditions are simulated by a program processing the elliptic equations in an axisymmetrical revolution coordinate system. The calculated visible plumes are compared with plumes observed on site with a view to validating the models [fr]

  18. Biocellion: accelerating computer simulation of multicellular biological system models.

    Science.gov (United States)

    Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

    2014-11-01

    Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling the function body of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in soil aggregate as case studies. Biocellion runs on x86 compatible systems with the 64 bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  19. On the performance of small-scale living facilities in nursing homes: A simulation approach

    NARCIS (Netherlands)

    Moeke, D.; van de Geer, R.; Koole, G.M.; Bekker, R.

    2016-01-01

    Scientific evidence on the impact of small-scale living facilities (SSLFs) on quality of life of nursing home clients remains scarce. In this study a simulation model is developed to examine the performance of SSLFs, in terms of meeting the time preferences of their residents. We model scheduled

  20. Modelling and simulation of thermal power plants

    Energy Technology Data Exchange (ETDEWEB)

    Eborn, J.

    1998-02-01

    Mathematical modelling and simulation are important tools when dealing with engineering systems that today are becoming increasingly complex. Integrated production and recycling of materials are trends that give rise to heterogeneous systems, which are difficult to handle within one area of expertise. Model libraries are an excellent way to package engineering knowledge of systems and units to be reused by those who are not experts in modelling. Many commercial packages provide good model libraries, but they are usually domain-specific and closed. Heterogeneous, multi-domain systems require open model libraries written in general purpose modelling languages. This thesis describes a model database for thermal power plants written in the object-oriented modelling language OMOLA. The models are based on first principles. Subunits describe volumes with pressure and enthalpy dynamics and flows of heat or different media. The subunits are used to build basic units such as pumps, valves and heat exchangers which can be used to build system models. Several applications are described: a heat recovery steam generator, equipment for juice blending, steam generation in a sulphuric acid plant and a condensing steam plate heat exchanger. Model libraries for industrial use must be validated against measured data. The thesis describes how parameter estimation methods can be used for model validation. Results from a case-study on parameter optimization of a non-linear drum boiler model show how the technique can be used. 32 refs, 21 figs

  1. MODELING THE Ly α FOREST IN COLLISIONLESS SIMULATIONS

    Energy Technology Data Exchange (ETDEWEB)

    Sorini, Daniele; Oñorbe, José; Hennawi, Joseph F. [Max-Planck-Institut für Astronomie, Königstuhl 17, D-69117 Heidelberg (Germany); Lukić, Zarija, E-mail: sorini@mpia-hd.mpg.de [Lawrence Berkeley National Laboratory, CA 94720-8139 (United States)

    2016-08-20

    Cosmological hydrodynamic simulations can accurately predict the properties of the intergalactic medium (IGM), but only under the condition of retaining the high spatial resolution necessary to resolve density fluctuations in the IGM. This resolution constraint prohibits simulating large volumes, such as those probed by BOSS and future surveys, like DESI and 4MOST. To overcome this limitation, we present “Iteratively Matched Statistics” (IMS), a novel method to accurately model the Ly α forest with collisionless N-body simulations, where the relevant density fluctuations are unresolved. We use a small-box, high-resolution hydrodynamic simulation to obtain the probability distribution function (PDF) and the power spectrum of the real-space Ly α forest flux. These two statistics are iteratively mapped onto a pseudo-flux field of an N-body simulation, which we construct from the matter density. We demonstrate that our method can reproduce the PDF, line-of-sight and 3D power spectra of the Ly α forest with good accuracy (7%, 4%, and 7% respectively). We quantify the performance of the commonly used Gaussian smoothing technique and show that it has significantly lower accuracy (20%–80%), especially for N-body simulations with achievable mean inter-particle separations in large-volume simulations. In addition, we show that IMS produces reasonable and smooth spectra, making it a powerful tool for modeling the IGM in large cosmological volumes and for producing realistic “mock” skies for Ly α forest surveys.
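
    As a rough illustration of the PDF-matching step in a scheme of this kind (not the authors' code, and omitting the power-spectrum iteration that gives IMS its name), the sketch below rank-order maps a pseudo-flux field onto a target flux distribution using empirical quantiles. The input fields are synthetic stand-ins.

      import numpy as np

      def match_pdf(pseudo_flux, target_flux):
          """Rank-order map pseudo_flux so that its one-point PDF matches that of
          target_flux (the PDF-matching step of an IMS-like scheme; the full method
          also iterates on the power spectrum, which is omitted here)."""
          order = np.argsort(pseudo_flux)
          target_sorted = np.sort(target_flux)
          # One target quantile per pixel of the pseudo field.
          quantiles = np.interp(
              np.linspace(0.0, 1.0, pseudo_flux.size),
              np.linspace(0.0, 1.0, target_sorted.size),
              target_sorted,
          )
          matched = np.empty_like(pseudo_flux, dtype=float)
          matched[order] = quantiles        # assign quantile values in rank order
          return matched

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          pseudo = np.exp(-rng.lognormal(0.0, 1.0, 10_000))   # stand-in pseudo flux
          target = rng.beta(0.5, 0.5, 50_000)                 # stand-in target flux PDF
          matched = match_pdf(pseudo, target)
          print("matched PDF quartiles:", np.round(np.percentile(matched, [25, 50, 75]), 3))
          print("target  PDF quartiles:", np.round(np.percentile(target, [25, 50, 75]), 3))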

  2. Modeling and simulation of economic processes

    Directory of Open Access Journals (Sweden)

    Bogdan Brumar

    2010-12-01

    Full Text Available In general, any activity involves sustained action that is often characterized by a degree of uncertainty and insecurity about the size of the objective pursued. Because of the complexity of real economic systems and the stochastic dependencies between the variables and parameters considered, not all systems can be adequately represented by a model that can be solved by analytical methods while covering all the issues relevant to management decision analysis over a realistic economic horizon. In such cases, simulation is often considered the only available alternative. Using simulation techniques to study real-world systems often requires laborious work, and carrying out a simulation experiment is a process that takes place in several stages.

  3. Simulation as a surgical teaching model.

    Science.gov (United States)

    Ruiz-Gómez, José Luis; Martín-Parra, José Ignacio; González-Noriega, Mónica; Redondo-Figuero, Carlos Godofredo; Manuel-Palazuelos, José Carlos

    2018-01-01

    Teaching of surgery has been affected by many factors over recent years, such as the reduction of working hours, the optimization of the use of the operating room or patient safety. Traditional teaching methodology fails to reduce the impact of these factors on surgeons' training. Simulation as a teaching model minimizes such impact, and is more effective than traditional teaching methods for integrating knowledge and clinical-surgical skills. Simulation complements clinical assistance with training, creating a safe learning environment where patient safety is not affected, and ethical or legal conflicts are avoided. Simulation uses learning methodologies that allow teaching individualization, adapting it to the learning needs of each student. It also allows training of all kinds of technical, cognitive or behavioural skills. Copyright © 2017 AEC. Published by Elsevier España, S.L.U. All rights reserved.

  4. Simulated astigmatism impairs academic-related performance in children.

    Science.gov (United States)

    Narayanasamy, Sumithira; Vincent, Stephen J; Sampson, Geoff P; Wood, Joanne M

    2015-01-01

    Astigmatism is an important refractive condition in children. However, the functional impact of uncorrected astigmatism in this population is not well established, particularly with regard to academic performance. This study investigated the impact of simulated bilateral astigmatism on academic-related tasks before and after sustained near work in children. Twenty visually normal children (mean age: 10.8 ± 0.7 years; six males and 14 females) completed a range of standardised academic-related tests with and without 1.50 D of simulated bilateral astigmatism (with both academic-related tests and the visual condition administered in a randomised order). The simulated astigmatism was induced using a positive cylindrical lens while maintaining a plano spherical equivalent. Performance was assessed before and after 20 min of sustained near work, during two separate testing sessions. Academic-related measures included a standardised reading test (the Neale Analysis of Reading Ability), visual information processing tests (Coding and Symbol Search subtests from the Wechsler Intelligence Scale for Children) and a reading-related eye movement test (the Developmental Eye Movement test). Each participant was systematically assigned either with-the-rule (WTR, axis 180°) or against-the-rule (ATR, axis 90°) simulated astigmatism to evaluate the influence of axis orientation on any decrements in performance. Reading, visual information processing and reading-related eye movement performance were all significantly impaired by simulated bilateral astigmatism (p < 0.05). Simulated astigmatism led to a reduction of between 5% and 12% in performance across the academic-related outcome measures, but there was no significant effect of the axis (WTR or ATR) of astigmatism (p > 0.05). Simulated bilateral astigmatism impaired children's performance on a range of academic-related outcome measures irrespective of the orientation of the astigmatism. These findings have

  5. Intraocular Telescopic System Design: Optical and Visual Simulation in a Human Eye Model

    OpenAIRE

    Zoulinakis, Georgios; Ferrer-Blasco, Teresa

    2017-01-01

    Purpose. To design an intraocular telescopic system (ITS) for magnifying retinal image and to simulate its optical and visual performance after implantation in a human eye model. Methods. Design and simulation were carried out with a ray-tracing and optical design software. Two different ITS were designed, and their visual performance was simulated using the Liou-Brennan eye model. The difference between the ITS was their lenses’ placement in the eye model and their powers. Ray tracing in bot...

  6. Modeling and Simulation Tools for Heavy Lift Airships

    Science.gov (United States)

    Hochstetler, Ron; Chachad, Girish; Hardy, Gordon; Blanken, Matthew; Melton, John

    2016-01-01

    For conventional fixed wing and rotary wing aircraft a variety of modeling and simulation tools have been developed to provide designers the means to thoroughly investigate proposed designs and operational concepts. However, lighter-than-air (LTA) airships, hybrid air vehicles, and aerostats have some important aspects that are different from heavier-than-air (HTA) vehicles. In order to account for these differences, modifications are required to the standard design tools to fully characterize the LTA vehicle design and performance parameters. To address these LTA design and operational factors, LTA development organizations have created unique proprietary modeling tools, often at their own expense. An expansion of this limited LTA tool set could be accomplished by leveraging existing modeling and simulation capabilities available in the National laboratories and public research centers. Development of an expanded set of publicly available LTA modeling and simulation tools for LTA developers would mitigate the reliance on proprietary LTA design tools in use today. A set of well researched, open source, high fidelity LTA design modeling and simulation tools would advance LTA vehicle development and also provide the analytical basis for accurate LTA operational cost assessments. This paper will present the modeling and analysis tool capabilities required for LTA vehicle design, analysis of operations, and full life-cycle support. The tools currently available will be surveyed to identify the gaps between their capabilities and the LTA industry's needs. Options for development of new modeling and analysis capabilities to supplement contemporary tools will also be presented.

  7. Simple model of surface roughness for binary collision sputtering simulations

    International Nuclear Information System (INIS)

    Lindsey, Sloan J.; Hobler, Gerhard; Maciążek, Dawid; Postawa, Zbigniew

    2017-01-01

    Highlights: • A simple model of surface roughness is proposed. • Its key feature is a linearly varying target density at the surface. • The model can be used in 1D/2D/3D Monte Carlo binary collision simulations. • The model fits well experimental glancing incidence sputtering yield data. - Abstract: It has been shown that surface roughness can strongly influence the sputtering yield – especially at glancing incidence angles where the inclusion of surface roughness leads to an increase in sputtering yields. In this work, we propose a simple one-parameter model (the “density gradient model”) which imitates surface roughness effects. In the model, the target’s atomic density is assumed to vary linearly between the actual material density and zero. The layer width is the sole model parameter. The model has been implemented in the binary collision simulator IMSIL and has been evaluated against various geometric surface models for 5 keV Ga ions impinging an amorphous Si target. To aid the construction of a realistic rough surface topography, we have performed MD simulations of sequential 5 keV Ga impacts on an initially crystalline Si target. We show that our new model effectively reproduces the sputtering yield, with only minor variations in the energy and angular distributions of sputtered particles. The success of the density gradient model is attributed to a reduction of the reflection coefficient – leading to increased sputtering yields, similar in effect to surface roughness.

  8. Facebook's personal page modelling and simulation

    Science.gov (United States)

    Sarlis, Apostolos S.; Sakas, Damianos P.; Vlachos, D. S.

    2015-02-01

    In this paper we define the utility of Facebook's Personal Page marketing method. This tool that Facebook provides is modelled and simulated using iThink in the context of a Facebook marketing agency. The paper leverages the system dynamics paradigm to model Facebook marketing tools and methods, using the iThink™ system to implement them, and applies the design science research methodology for the proof of concept of the models and modelling processes. The model has been developed for a Facebook-oriented social media marketing agent/company and tested in real circumstances. It was finalized through a number of revisions and iterations of the design, development, simulation, testing and evaluation processes. The validity and usefulness of this Facebook marketing model for day-to-day decision making are confirmed by the management of the company. Facebook's Personal Page method can be adjusted, depending on the situation, in order to maximize the total profit of the company, which means attracting new customers, keeping the interest of existing customers and delivering traffic to its website.

  9. Modeling and simulation of photovoltaic solar panel

    International Nuclear Information System (INIS)

    Belarbi, M.; Haddouche, K.; Midoun, A.

    2006-01-01

    In this article, we present a new approach for estimating the model parameters of a photovoltaic solar panel as functions of irradiance and temperature. The parameters of the one-diode model are obtained from the knowledge of three operating points: short circuit, open circuit, and maximum power. In the first step, the system of equations defined by these three operating points is solved so that all model parameters are expressed as functions of the series resistance. In the second step, the Newton-Raphson method is applied iteratively at the maximum power point to calculate the series resistance and, from it, the remaining model parameters. Once the panel model is identified, additional equations are used to take the irradiance and temperature effects into account. The simulation results show the convergence speed of the model parameters and make it possible to visualize the electrical behaviour of the panel as a function of irradiance and temperature. Note that the algorithm is sensitive at the maximum power point: a small variation of the optimal voltage leads to a very large variation in the identified parameter values. With the identified model, maximum power point tracking algorithms can be developed and a solar water pumping system can be simulated. (Author)
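
    As an illustration of this identification strategy (a simplified variant, not the authors' exact algorithm: the ideality factor is fixed, the shunt resistance is neglected and the photo-current is approximated by the short-circuit current; the datasheet numbers are invented for the example), the sketch below solves for the series resistance with Newton-Raphson so that the one-diode equation is satisfied at the maximum power point.

      import math

      # Hypothetical datasheet values at standard test conditions (illustrative only).
      ISC, VOC = 8.21, 32.9               # short-circuit current (A), open-circuit voltage (V)
      IMP, VMP = 7.61, 26.3               # current (A) and voltage (V) at maximum power
      NS, N_IDEAL, VT = 54, 1.3, 0.0257   # cells in series, ideality factor, thermal voltage (V)
      A = N_IDEAL * NS * VT               # modified ideality factor (V)

      # Simplified one-diode model (shunt resistance neglected):
      #   I = Iph - I0 * (exp((V + I*Rs)/A) - 1)
      IPH = ISC                                  # photo-current approximated by Isc
      I0 = IPH / (math.exp(VOC / A) - 1.0)       # saturation current from the open-circuit point

      def f(rs):
          """Residual of the diode equation at the maximum power point."""
          return IPH - I0 * (math.exp((VMP + IMP * rs) / A) - 1.0) - IMP

      def df(rs):
          """Derivative of the residual with respect to Rs."""
          return -I0 * (IMP / A) * math.exp((VMP + IMP * rs) / A)

      rs = 0.1                                   # initial guess (ohm)
      for _ in range(30):
          step = f(rs) / df(rs)
          rs -= step
          if abs(step) < 1e-9:
              break

      print(f"identified series resistance: {rs:.4f} ohm")
      print(f"residual at maximum power point: {f(rs):.2e} A")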

  10. A simulation model for material accounting systems

    International Nuclear Information System (INIS)

    Coulter, C.A.; Thomas, K.E.

    1987-01-01

    A general-purpose model that was developed to simulate the operation of a chemical processing facility for nuclear materials has been extended to describe material measurement and accounting procedures as well. The model now provides descriptors for material balance areas, a large class of measurement instrument types and their associated measurement errors for various classes of materials, the measurement instruments themselves with their individual calibration schedules, and material balance closures. Delayed receipt of measurement results (as for off-line analytical chemistry assay), with interim use of a provisional measurement value, can be accurately represented. The simulation model can be used to estimate inventory difference variances for processing areas that do not operate at steady state, to evaluate the timeliness of measurement information, to determine process impacts of measurement requirements, and to evaluate the effectiveness of diversion-detection algorithms. Such information is usually difficult to obtain by other means. Use of the measurement simulation model is illustrated by applying it to estimate inventory difference variances for two material balance area structures of a fictitious nuclear material processing line

  11. Theory, modeling and simulation: Annual report 1993

    International Nuclear Information System (INIS)

    Dunning, T.H. Jr.; Garrett, B.C.

    1994-07-01

    Developing the knowledge base needed to address the environmental restoration issues of the US Department of Energy requires a fundamental understanding of molecules and their interactions in isolation and in liquids, on surfaces, and at interfaces. To meet these needs, the PNL has established the Environmental and Molecular Sciences Laboratory (EMSL) and will soon begin construction of a new, collaborative research facility devoted to advancing the understanding of environmental molecular science. Research in the Theory, Modeling, and Simulation program (TMS), which is one of seven research directorates in the EMSL, will play a critical role in understanding molecular processes important in restoring DOE's research, development and production sites, including understanding the migration and reactions of contaminants in soils and groundwater, the development of separation processes for isolation of pollutants, the development of improved materials for waste storage, understanding the enzymatic reactions involved in the biodegradation of contaminants, and understanding the interaction of hazardous chemicals with living organisms. The research objectives of the TMS program are to apply available techniques to study fundamental molecular processes involved in natural and contaminated systems; to extend current techniques to treat molecular systems of future importance and to develop techniques for addressing problems that are computationally intractable at present; to apply molecular modeling techniques to simulate molecular processes occurring in the multispecies, multiphase systems characteristic of natural and polluted environments; and to extend current molecular modeling techniques to treat complex molecular systems and to improve the reliability and accuracy of such simulations. The program contains three research activities: Molecular Theory/Modeling, Solid State Theory, and Biomolecular Modeling/Simulation. Extended abstracts are presented for 89 studies.

  12. Theory, modeling and simulation: Annual report 1993

    Energy Technology Data Exchange (ETDEWEB)

    Dunning, T.H. Jr.; Garrett, B.C.

    1994-07-01

    Developing the knowledge base needed to address the environmental restoration issues of the US Department of Energy requires a fundamental understanding of molecules and their interactions in isolation and in liquids, on surfaces, and at interfaces. To meet these needs, the PNL has established the Environmental and Molecular Sciences Laboratory (EMSL) and will soon begin construction of a new, collaborative research facility devoted to advancing the understanding of environmental molecular science. Research in the Theory, Modeling, and Simulation program (TMS), which is one of seven research directorates in the EMSL, will play a critical role in understanding molecular processes important in restoring DOE's research, development and production sites, including understanding the migration and reactions of contaminants in soils and groundwater, the development of separation processes for isolation of pollutants, the development of improved materials for waste storage, understanding the enzymatic reactions involved in the biodegradation of contaminants, and understanding the interaction of hazardous chemicals with living organisms. The research objectives of the TMS program are to apply available techniques to study fundamental molecular processes involved in natural and contaminated systems; to extend current techniques to treat molecular systems of future importance and to develop techniques for addressing problems that are computationally intractable at present; to apply molecular modeling techniques to simulate molecular processes occurring in the multispecies, multiphase systems characteristic of natural and polluted environments; and to extend current molecular modeling techniques to treat complex molecular systems and to improve the reliability and accuracy of such simulations. The program contains three research activities: Molecular Theory/Modeling, Solid State Theory, and Biomolecular Modeling/Simulation. Extended abstracts are presented for 89 studies.

  13. Performance simulation of a natural circulation solar air heater with ...

    African Journals Online (AJOL)

    Performance simulation of a natural circulation solar air heater with phase change material energy storage. ... The predicted temperatures of the system are compared with the experimental data under daytime no-load conditions over the ambient temperature range of 18.5–36.0 °C and daily global irradiation of ...

  14. Pilot performance evaluation of simulated flight approach and ...

    Indian Academy of Sciences (India)

    This research work examines the application of different statistical and empirical analysis methods to quantify pilot performance. A realistic approach and landing flight scenario is executed using the reconfigurable flight simulator at National Aerospace Laboratories and both subjective and quantitative measures are applied ...

  15. Visuospatial ability factors and performance variables in laparoscopic simulator training

    NARCIS (Netherlands)

    Luursema, J.M.; Verwey, Willem B.; Burie, Remke

    2012-01-01

    Visuospatial ability has been shown to be important to several aspects of laparoscopic performance, including simulator training. Only a limited subset of visuospatial ability factors however has been investigated in such studies. Tests for different visuospatial ability factors differ in stimulus

  16. Visuospatial Ability Factors and Performance Variables in Laparoscopic Simulator Training

    Science.gov (United States)

    Luursema, Jan-Maarten; Verwey, Willem B.; Burie, Remke

    2012-01-01

    Visuospatial ability has been shown to be important to several aspects of laparoscopic performance, including simulator training. Only a limited subset of visuospatial ability factors however has been investigated in such studies. Tests for different visuospatial ability factors differ in stimulus complexity, in their emphasis on identifying…

  17. Performance analysis of bullet trajectory estimation: Approach, simulation, and experiments

    Energy Technology Data Exchange (ETDEWEB)

    Ng, L.C.; Karr, T.J.

    1994-11-08

    This paper describes an approach to estimate a bullet's trajectory from a time sequence of angles-only observations from a high-speed camera, and analyzes its performance. The technique is based on fitting a ballistic model of a bullet in flight along with unknown source location parameters to a time series of angular observations. The theory is developed to precisely reconstruct, from firing range geometry, the actual bullet trajectory as it appeared on the focal plane array and in real space. A metric for measuring the effective trajectory track error is also presented. Detailed Monte-Carlo simulations assuming different bullet ranges, shot-angles, camera frame rates, and angular noise show that angular track error can be as small as 100 μrad for a 2 mrad/pixel sensor. It is also shown that if actual values of bullet ballistic parameters were available, the bullet's source location variables and the angle-of-flight information could also be determined.
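
    As a toy illustration of fitting a ballistic model to angles-only camera data (a simplified drag-free setup, not the paper's formulation; the geometry, noise level and frame rate below are invented), the sketch generates synthetic line-of-sight angles and recovers the source position and velocity by nonlinear least squares.

      import numpy as np
      from scipy.optimize import least_squares

      # Synthetic example: a bullet flying in the vertical (x, y) plane, observed by a
      # camera at the origin that records only line-of-sight angles.
      G = 9.81
      t = np.arange(0.0, 0.3, 0.0005)              # 2000 frames/s for 0.3 s

      def model_angles(params, t):
          """Line-of-sight angles predicted by a drag-free ballistic model."""
          x0, y0, vx, vy = params
          x = x0 + vx * t
          y = y0 + vy * t - 0.5 * G * t**2
          return np.arctan2(y, x)

      true_params = np.array([-50.0, 5.0, 300.0, 0.0])   # source position (m), velocity (m/s)
      rng = np.random.default_rng(1)
      measured = model_angles(true_params, t) + rng.normal(0.0, 1e-4, t.size)  # ~100 urad noise

      # Fit source location and velocity to the angles-only data (nonlinear least squares).
      fit = least_squares(lambda p: model_angles(p, t) - measured,
                          x0=[-40.0, 4.0, 250.0, 5.0])
      print("true:     ", true_params)
      print("estimated:", np.round(fit.x, 2))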

  18. MASADA: A MODELING AND SIMULATION AUTOMATED DATA ANALYSIS FRAMEWORK FOR CONTINUOUS DATA-INTENSIVE VALIDATION OF SIMULATION MODELS

    CERN Document Server

    Foguelman, Daniel Jacob; The ATLAS collaboration

    2016-01-01

    Complex networked computer systems are usually subjected to upgrades and enhancements on a continuous basis. Modeling and simulation of such systems helps with guiding their engineering processes, in particular when testing candidate design alternatives directly on the real system is not an option. Models are built and simulation exercises are run guided by specific research and/or design questions. A vast number of operational conditions for the real system need to be assumed in order to focus on the relevant questions at hand. A typical boundary condition for computer systems is the exogenously imposed workload. Meanwhile, in typical projects huge amounts of monitoring information are logged and stored with the purpose of studying the system’s performance in search of improvements. Research questions also change as systems’ operational conditions vary throughout their lifetime. This context poses many challenges for determining the validity of simulation models. As the behavioral empirical base of the sys...

  19. MASADA: A Modeling and Simulation Automated Data Analysis framework for continuous data-intensive validation of simulation models

    CERN Document Server

    Foguelman, Daniel Jacob; The ATLAS collaboration

    2016-01-01

    Complex networked computer systems are usually subjected to upgrades and enhancements on a continuous basis. Modeling and simulation of such systems helps with guiding their engineering processes, in particular when testing candidate design alternatives directly on the real system is not an option. Models are built and simulation exercises are run guided by specific research and/or design questions. A vast number of operational conditions for the real system need to be assumed in order to focus on the relevant questions at hand. A typical boundary condition for computer systems is the exogenously imposed workload. Meanwhile, in typical projects huge amounts of monitoring information are logged and stored with the purpose of studying the system’s performance in search of improvements. Research questions also change as systems’ operational conditions vary throughout their lifetime. This context poses many challenges for determining the validity of simulation models. As the behavioral empirical base of the sys...

  20. A Model Management Approach for Co-Simulation Model Evaluation

    NARCIS (Netherlands)

    Zhang, X.C.; Broenink, Johannes F.; Filipe, Joaquim; Kacprzyk, Janusz; Pina, Nuno

    2011-01-01

    Simulating formal models is a common means for validating the correctness of the system design and reduce the time-to-market. In most of the embedded control system design, multiple engineering disciplines and various domain-specific models are often involved, such as mechanical, control, software

  1. Comprehensive Performance Evaluation for Hydrological and Nutrients Simulation Using the Hydrological Simulation Program–Fortran in a Mesoscale Monsoon Watershed, China

    OpenAIRE

    Zhaofu Li; Chuan Luo; Kaixia Jiang; Rongrong Wan; Hengpeng Li

    2017-01-01

    The Hydrological Simulation Program–Fortran (HSPF) is a hydrological and water quality computer model that was developed by the United States Environmental Protection Agency. Comprehensive performance evaluations were carried out for hydrological and nutrient simulation using the HSPF model in the Xitiaoxi watershed in China. Streamflow simulation was calibrated from 1 January 2002 to 31 December 2007 and then validated from 1 January 2008 to 31 December 2010 using daily observed data, and nu...

  2. eShopper modeling and simulation

    Science.gov (United States)

    Petrushin, Valery A.

    2001-03-01

    The advent of e-commerce gives an opportunity to shift the paradigm of customer communication into a highly interactive mode. The new generation of commercial Web servers, such as Blue Martini's server, combines the collection of data on customer behavior with real-time processing and dynamic tailoring of a feedback page. New opportunities for direct product marketing and cross selling are arriving. The key problem is what kind of information do we need to achieve these goals, or in other words, how do we model the customer? The paper is devoted to customer modeling and simulation. The focus is on modeling an individual customer. The model is based on the customer's transaction data, click stream data, and demographics. The model includes the hierarchical profile of a customer's preferences for different types of products and brands; consumption models for the different types of products; the current focus, trends, and stochastic models for time intervals between purchases; product affinity models; and some generalized features, such as purchasing power, sensitivity to advertising, price sensitivity, etc. This type of model is used for predicting the date of the next visit, overall spending, and spending for different types of products and brands. For some types of stores (for example, a supermarket) and stable customers, it is possible to forecast the shopping lists rather accurately. The forecasting techniques are discussed. The forecasting results can be used for on-line direct marketing, customer retention, and inventory management. The customer model can also be used as a generative model for simulating the customer's purchasing behavior in different situations and for estimating the customer's features.

  3. Simulation modelling in agriculture: General considerations. | R.I. ...

    African Journals Online (AJOL)

    The computer does all the necessary arithmetic when the hypothesis is invoked to predict the future behaviour of the simulated system under given conditions. A general ... in the advisory service. Keywords: agriculture; botany; computer simulation; modelling; simulation model; simulation modelling; south africa; techniques ...

  4. Evaluating Asset Pricing Models in a Simulated Multifactor Approach

    Directory of Open Access Journals (Sweden)

    Wagner Piazza Gaglianone

    2012-12-01

    Full Text Available In this paper a methodology to compare the performance of different stochastic discount factor (SDF) models is suggested. The starting point is the estimation of several factor models in which the choice of the fundamental factors comes from different procedures. Then, a Monte Carlo simulation is designed in order to simulate a set of gross returns with the objective of mimicking the temporal dependency and the observed covariance across gross returns. Finally, the artificial returns are used to investigate the performance of the competing asset pricing models through the Hansen and Jagannathan (1997) distance and some goodness-of-fit statistics of the pricing error. An empirical application is provided for the U.S. stock market.
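
    For reference, a common textbook formulation of the Hansen and Jagannathan (1997) distance is sketched below, applied to a synthetic one-factor economy (this is a generic estimator, not the paper's data or exact procedure): the pricing errors of a candidate SDF are weighted by the inverse second-moment matrix of gross returns.

      import numpy as np

      def hj_distance(m, R):
          """Hansen-Jagannathan (1997) distance of a candidate SDF series.

          m : (T,) realisations of the candidate stochastic discount factor
          R : (T, N) gross returns on N test assets
          Pricing errors are g = E[m * R] - 1; the distance is
          sqrt(g' E[R R']^{-1} g), i.e. pricing errors weighted by the inverse
          second-moment matrix of returns.
          """
          m = np.asarray(m, float)
          R = np.asarray(R, float)
          g = (m[:, None] * R).mean(axis=0) - 1.0
          G = (R.T @ R) / R.shape[0]
          return float(np.sqrt(g @ np.linalg.solve(G, g)))

      if __name__ == "__main__":
          # Toy one-factor economy constructed so the "true" linear SDF prices the
          # assets exactly in expectation (all numbers are illustrative).
          rng = np.random.default_rng(0)
          T, N, b = 50_000, 5, 3.0
          f = rng.normal(0.0, 0.1, T)                      # pricing factor
          m_true = (1.0 - b * f) / 1.01                    # linear SDF, E[m] = 1/1.01
          betas = rng.uniform(0.5, 1.5, N)
          premia = b * 0.1**2 * betas                      # expected excess returns
          R = (1.01 + premia[None, :]
               + betas[None, :] * f[:, None]
               + rng.normal(0.0, 0.05, (T, N)))            # idiosyncratic noise
          print("HJ distance, true SDF:", round(hj_distance(m_true, R), 4))
          print("HJ distance, m = 1   :", round(hj_distance(np.ones(T), R), 4))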

  5. Modelling and Simulation Analysis of Rolling Motion of Spherical Robot

    Science.gov (United States)

    Kamis, N. N.; Embong, A. H.; Ahmad, S.

    2017-11-01

    This paper presents the modelling, control and analysis of a pendulum-driven spherical rolling robot within a simulation environment. The spherical robot is modelled using the Lagrange function based on the equations of rolling motion. A PD-type fuzzy logic controller (FLC) was designed to control the position of the spherical robot, with 25 rules constructed to control its rolling motion. The controller was then integrated with the model developed in the Simulink-MATLAB environment. The output scaling factor (output gain) of the FLC was heuristically tuned to improve the system performance. The simulation results show that the FLC eliminated the overshoot response and demonstrated better performance, with a 29.67% increase in the settling time required to reach 0.01% of steady-state error.
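
    As a generic illustration of a PD-type fuzzy logic controller of the kind described above (not the paper's 25-rule controller; this reduced sketch uses a 3 x 3 rule base, triangular membership functions and invented scaling gains), the following computes a control action from the position error and its rate.

      # Minimal PD-type fuzzy logic controller: Mamdani-style inference with singleton
      # outputs and weighted-average defuzzification. All shapes and gains are illustrative.

      def tri(x, a, b, c):
          """Triangular membership function with feet at a, c and peak at b."""
          if x <= a or x >= c:
              return 0.0
          return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

      def fuzzify(x):
          """Memberships of a normalized input in three fuzzy sets."""
          return {
              "NEG": tri(x, -2.0, -1.0, 0.0),
              "ZERO": tri(x, -1.0, 0.0, 1.0),
              "POS": tri(x, 0.0, 1.0, 2.0),
          }

      # Rule table: (error label, error-rate label) -> output singleton.
      RULES = {
          ("NEG", "NEG"): -1.0, ("NEG", "ZERO"): -0.7, ("NEG", "POS"): 0.0,
          ("ZERO", "NEG"): -0.5, ("ZERO", "ZERO"): 0.0, ("ZERO", "POS"): 0.5,
          ("POS", "NEG"): 0.0, ("POS", "ZERO"): 0.7, ("POS", "POS"): 1.0,
      }

      def fuzzy_pd(error, d_error, ke=1.0, kde=0.5, ku=2.0):
          """PD-type FLC: scale inputs, evaluate the rule base, defuzzify, scale output."""
          mu_e = fuzzify(ke * error)
          mu_de = fuzzify(kde * d_error)
          num = den = 0.0
          for (le, lde), out in RULES.items():
              w = min(mu_e[le], mu_de[lde])   # rule firing strength (min t-norm)
              num += w * out
              den += w
          return ku * (num / den if den > 0.0 else 0.0)

      if __name__ == "__main__":
          print(f"control action: {fuzzy_pd(error=0.8, d_error=-0.2):.3f}")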

  6. Aqueous Electrolytes: Model Parameters and Process Simulation

    DEFF Research Database (Denmark)

    Thomsen, Kaj

    This thesis deals with aqueous electrolyte mixtures. The Extended UNIQUAC model is being used to describe the excess Gibbs energy of such solutions. Extended UNIQUAC parameters for the twelve ions Na+, K+, NH4+, H+, Cl-, NO3-, SO42-, HSO4-, OH-, CO32-, HCO3-, and S2O82- are estimated. A computer ... program including a steady state process simulator for the design, simulation, and optimization of fractional crystallization processes is presented.

  7. Behavior model for performance assessment.

    Energy Technology Data Exchange (ETDEWEB)

    Brown-VanHoozer, S. A.

    1999-07-23

    Every individual channels information differently based on their preference for the sensory modality or representational system (visual, auditory or kinesthetic) they tend to favor most (their primary representational system (PRS)). Therefore, some of us access and store our information primarily visually, some auditorily, and others kinesthetically (through feel and touch), which in turn establishes our information processing patterns and strategies and our external-to-internal (and subsequently vice versa) experiential language representation. Because of the different ways we channel our information, each of us will respond differently to a task: the way we gather and process the external information (input), our response time (process), and the outcome (behavior). Traditional human models of decision making and response time focus on perception, cognitive and motor systems stimulated and influenced by the three sensory modalities: visual, auditory and kinesthetic. For us, these are the building blocks to knowing how someone is thinking. Being aware of what is taking place and how to ask questions is essential in assessing performance toward reducing human errors. Existing models give predictions based on time values or response times for a particular event, which may be summed and averaged for a generalization of behavior(s). However, without establishing a basic understanding of how the behavior was generated through a decision-making strategy process, predictive models are overall inefficient in their analysis of the means by which behavior was generated. What is seen is only the end result.

  8. Behavior model for performance assessment

    International Nuclear Information System (INIS)

    Brown-VanHoozer, S. A.

    1999-01-01

    Every individual channels information differently based on their preference for the sensory modality or representational system (visual, auditory or kinesthetic) they tend to favor most (their primary representational system (PRS)). Therefore, some of us access and store our information primarily visually, some auditorily, and others kinesthetically (through feel and touch), which in turn establishes our information processing patterns and strategies and our external-to-internal (and subsequently vice versa) experiential language representation. Because of the different ways we channel our information, each of us will respond differently to a task: the way we gather and process the external information (input), our response time (process), and the outcome (behavior). Traditional human models of decision making and response time focus on perception, cognitive and motor systems stimulated and influenced by the three sensory modalities: visual, auditory and kinesthetic. For us, these are the building blocks to knowing how someone is thinking. Being aware of what is taking place and how to ask questions is essential in assessing performance toward reducing human errors. Existing models give predictions based on time values or response times for a particular event, which may be summed and averaged for a generalization of behavior(s). However, without establishing a basic understanding of how the behavior was generated through a decision-making strategy process, predictive models are overall inefficient in their analysis of the means by which behavior was generated. What is seen is only the end result.

  9. Performance predictions for solar-chemical convertors by computer simulation

    Energy Technology Data Exchange (ETDEWEB)

    Luttmer, J.D.; Trachtenberg, I.

    1985-08-01

    A computer model which simulates the operation of the Texas Instruments solar-chemical convertor (SCC) was developed. The model allows optimization of SCC processes, material, and configuration by facilitating decisions on tradeoffs among ease of manufacturing, power conversion efficiency, and cost effectiveness. The model includes various algorithms which define the electrical, electrochemical, and resistance parameters and which describe the operation of the discrete components of the SCC. Results of the model which depict the effect of material and geometric changes on various parameters are presented. The computer-calculated operation is compared with experimentally observed hydrobromic acid electrolysis rates.

  10. A Placement Model for Flight Simulators.

    Science.gov (United States)

    1982-09-01

    simulator basing strategies. Captains David R. VanDenburg and Jon D. Veith developed a mathematical model to assist in the placement analysis of A-7 ...

  11. Rejection-free stochastic simulation of BNGL-encoded models

    Energy Technology Data Exchange (ETDEWEB)

    Hlavacek, William S [Los Alamos National Laboratory; Monine, Michael I [Los Alamos National Laboratory; Colvin, Joshua [TRANSLATIONAL GENOM; Posner, Richard G [NORTHERN ARIZONA UNIV.; Von Hoff, Daniel D [TRANSLATIONAL GENOMICS RESEARCH INSTIT.

    2009-01-01

    Formal rules encoded using the BioNetGen language (BNGL) can be used to represent the system-level dynamics of molecular interactions. Rules allow one to compactly and implicitly specify the reaction network implied by a set of molecules and their interactions. Typically, the reaction network implied by a set of rules is large, which makes generation of the underlying rule-defined network expensive. Moreover, the cost of conventional simulation methods typically depends on network size. Together these factors have limited application of the rule-based modeling approach. To overcome this limitation, several methods have recently been developed for determining the reaction dynamics implied by rules while avoiding the expensive step of network generation. The cost of these 'network-free' simulation methods is independent of the number of reactions implied by rules. Software implementing such methods is needed for the analysis of rule-based models of biochemical systems. Here, we present a software tool called RuleMonkey that implements a network-free stochastic simulation method for rule-based models. The method is rejection free, unlike other network-free methods that introduce null events (i.e., steps in the simulation procedure that do not change the state of the reaction system being simulated), and the software is capable of simulating models encoded in BNGL, a general-purpose model-specification language. We verify that RuleMonkey produces correct simulation results, and we compare its performance against DYNSTOC, another BNGL-compliant general-purpose simulator for rule-based models, as well as various problem-specific codes that implement network-free simulation methods. RuleMonkey enables the simulation of models defined by rule sets that imply large-scale reaction networks. It is faster than DYNSTOC for stiff problems, although it requires the use of more computer memory. RuleMonkey is freely available for non-commercial use as a stand
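
    For readers unfamiliar with rejection-free stochastic simulation, the sketch below implements the classic Gillespie direct method on a small explicit reaction network: each step draws the waiting time and the reaction to fire directly from the propensities, so no proposed event is discarded. This is a much simpler setting than the rule-based, network-free case handled by RuleMonkey and is not its actual algorithm; the toy reaction system and rate constants are invented.

      import random

      def gillespie_direct(stoich, propensity_fns, state, t_end, seed=0):
          """Gillespie direct-method SSA: rejection-free, every draw fires a reaction.

          stoich         : list of state-change vectors, one per reaction
          propensity_fns : list of callables state -> propensity, one per reaction
          state          : mutable list of species counts
          """
          rng = random.Random(seed)
          t = 0.0
          while True:
              props = [fn(state) for fn in propensity_fns]
              total = sum(props)
              if total == 0.0:
                  break                          # no reaction can fire
              tau = rng.expovariate(total)       # waiting time to the next event
              if t + tau > t_end:
                  break
              t += tau
              # Choose which reaction fires, with probability proportional to its propensity.
              r, acc = rng.random() * total, 0.0
              for j, a in enumerate(props):
                  acc += a
                  if r < acc:
                      for i, change in enumerate(stoich[j]):
                          state[i] += change
                      break
          return t, state

      if __name__ == "__main__":
          # Toy reversible binding A + B <-> AB with illustrative rate constants.
          stoich = [(-1, -1, +1), (+1, +1, -1)]
          rates = (0.001, 0.1)
          props = [lambda s: rates[0] * s[0] * s[1], lambda s: rates[1] * s[2]]
          t, final = gillespie_direct(stoich, props, state=[500, 300, 0], t_end=10.0)
          print(f"t = {t:.2f}, [A, B, AB] = {final}")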

  12. Simulation Modelling and Strategic Change: Creating the Sustainable Enterprise

    Directory of Open Access Journals (Sweden)

    Patrick Dawson

    2010-01-01

    This paper highlights the benefits of using discrete event simulation models for developing change management frameworks which facilitate productivity and environmental improvements in order to create a sustainable enterprise. There is an increasing need for organisations to be more socially and environmentally responsible; however, these objectives cannot be realised in isolation from the strategic, operations and business objectives of the enterprise. Discrete Event Simulation models facilitate a multidimensional approach to enterprise modelling which can integrate operations and strategic considerations with environmental and social issues. Moreover, these models can provide a dynamic roadmap for implementing a change strategy for realising the optimal conditions for operational and environmental performance. It is important to note that the nature of change is itself dynamic and that simulation models are capable of characterising the dynamics of the change process. The paper argues that incorporating social and environmental challenges into a strategic business model for an enterprise can result in improved profits and long-term viability, and that a multidimensional simulation approach can support decision making throughout the change process to more effectively achieve these goals.

  13. Simulation Tools Model Icing for Aircraft Design

    Science.gov (United States)

    2012-01-01

    Here's a simple science experiment to try: Place an unopened bottle of distilled water in your freezer. After 2-3 hours, if the water is pure enough, you will notice that it has not frozen. Carefully pour the water into a bowl with a piece of ice in it. When it strikes the ice, the water will instantly freeze. One of the most basic and commonly known scientific facts is that water freezes at around 32 F. But this is not always the case. Water lacking any impurities for ice crystals to form around can be supercooled to even lower temperatures without freezing. High in the atmosphere, water droplets can achieve this delicate, supercooled state. When a plane flies through clouds containing these droplets, the water can strike the airframe and, like the supercooled water hitting the ice in the experiment above, freeze instantly. The ice buildup alters the aerodynamics of the plane - reducing lift and increasing drag - affecting its performance and presenting a safety issue if the plane can no longer fly effectively. In certain circumstances, ice can form inside aircraft engines, another potential hazard. NASA has long studied ways of detecting and countering atmospheric icing conditions as part of the Agency's efforts to enhance aviation safety. To do this, the Icing Branch at Glenn Research Center utilizes a number of world-class tools, including the Center's Icing Research Tunnel and the NASA 607 icing research aircraft, a "flying laboratory" for studying icing conditions. The branch has also developed a suite of software programs to help aircraft and icing protection system designers understand the behavior of ice accumulation on various surfaces and in various conditions. One of these innovations is the LEWICE ice accretion simulation software. Initially developed in the 1980s (when Glenn was known as Lewis Research Center), LEWICE has become one of the most widely used tools in icing research and aircraft design and certification. LEWICE has been transformed over

  14. Modelling interplanetary CMEs using magnetohydrodynamic simulations

    Directory of Open Access Journals (Sweden)

    P. J. Cargill

    The dynamics of Interplanetary Coronal Mass Ejections (ICMEs) are discussed from the viewpoint of numerical modelling. Hydrodynamic models are shown to give a good zero-order picture of the plasma properties of ICMEs, but they cannot model the important magnetic field effects. Results from MHD simulations are shown for a number of cases of interest. It is demonstrated that the strong interaction of the ICME with the solar wind leads to the ICME and solar wind velocities being close to each other at 1 AU, despite their having very different speeds near the Sun. It is also pointed out that this interaction leads to a distortion of the ICME geometry, making cylindrical symmetry a dubious assumption for the CME field at 1 AU. In the presence of a significant solar wind magnetic field, the magnetic fields of the ICME and solar wind can reconnect with each other, leading to an ICME that has solar wind-like field lines. This effect is especially important when an ICME with the right sense of rotation propagates down the heliospheric current sheet. It is also noted that a lack of knowledge of the coronal magnetic field makes such simulations of little use in space weather forecasts that require knowledge of the ICME magnetic field strength.

    Key words. Interplanetary physics (interplanetary magnetic fields); Solar physics, astrophysics, and astronomy (flares and mass ejections); Space plasma physics (numerical simulation studies)

  15. MODELING AND SIMULATION OF A HYDROCRACKING UNIT

    Directory of Open Access Journals (Sweden)

    HASSAN A. FARAG

    2016-06-01

    Hydrocracking is used in the petroleum industry to convert low-quality feedstocks into high-valued transportation fuels such as gasoline, diesel, and jet fuel. The aim of the present work is to develop a rigorous steady-state two-dimensional mathematical model, including conservation equations of mass and energy, for simulating the operation of a hydrocracking unit. Both the catalyst bed and quench zone have been included in this integrated model. The model equations were numerically solved in both axial and radial directions using Matlab software. The model was tested against real plant data from Egypt. The results indicated very good agreement between the model predictions and industrial values for the temperature profiles, concentration profiles, and conversion in both the radial and axial directions of the hydrocracking unit. Simulation of the conversion and temperature profiles in the quench zone was also included and showed a low deviation from the actual values. For the concentration profiles, the percentage deviation was found to be 9.28% in the first reactor and 9.6% in the second reactor. The effect of several parameters, such as the pellet heat transfer coefficient, effective radial thermal conductivity, wall heat transfer coefficient, effective radial diffusivity, and cooling medium (quench zone), has been included in this study. Varying the wall heat transfer coefficient and the effective radial diffusivity for the near-wall region gave no remarkable changes in the temperature profiles. On the other hand, even small variations of the effective radial thermal conductivity affected the simulated temperature profiles significantly, and this effect could not be compensated by variations of the other model parameters.
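
    As a rough illustration of how a steady-state two-dimensional (axial-radial) reactor balance can be marched numerically, the sketch below solves a single-component convection-diffusion-reaction balance with an explicit axial march and central differences in the radial direction. All kinetics, property values and boundary conditions are assumed for illustration and are not those of the hydrocracker model described in the abstract.

```python
import numpy as np

# Minimal sketch of a steady-state 2-D (axial x radial) species balance,
# marched in the axial direction with central differences radially.
# A single pseudo-component with first-order consumption is used; the
# actual hydrocracking kinetics are not given in the abstract.
nr, nz = 20, 400                 # radial and axial grid points (assumed)
R, L = 0.5, 10.0                 # reactor radius and length, m (assumed)
u = 0.02                         # superficial axial velocity, m/s (assumed)
D_r = 1e-5                       # effective radial diffusivity, m^2/s (assumed)
k = 5e-3                         # first-order rate constant, 1/s (assumed)

dr, dz = R / (nr - 1), L / (nz - 1)
r = np.linspace(0.0, R, nr)
c = np.ones(nr)                  # normalised inlet concentration profile

for _ in range(nz - 1):
    lap = np.zeros(nr)
    # radial diffusion term (1/r) d/dr (r dc/dr), central differences
    lap[1:-1] = (c[2:] - 2 * c[1:-1] + c[:-2]) / dr**2 \
                + (c[2:] - c[:-2]) / (2 * dr * r[1:-1])
    lap[0] = 4.0 * (c[1] - c[0]) / dr**2     # symmetry at the axis
    lap[-1] = 2.0 * (c[-2] - c[-1]) / dr**2  # zero-flux wall
    c = c + dz / u * (D_r * lap - k * c)     # explicit axial march

print("approximate outlet conversion:", 1.0 - np.sum(c * r) / np.sum(r))
```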

  16. Reactive transport models and simulation with ALLIANCES

    International Nuclear Information System (INIS)

    Leterrier, N.; Deville, E.; Bary, B.; Trotignon, L.; Hedde, T.; Cochepin, B.; Stora, E.

    2009-01-01

    Many chemical processes influence the evolution of nuclear waste storage. As a result, simulations based only upon transport and hydraulic processes fail to describe adequately some industrial scenarios. We need to take into account complex chemical models (mass action laws, kinetics...) which are highly non-linear. In order to simulate the coupling of these chemical reactions with transport, we use a classical Sequential Iterative Approach (SIA), with a fixed-point algorithm, within the framework of the ALLIANCES platform. This approach allows us to use the various transport and chemical modules available in ALLIANCES, via an operator-splitting method based upon the structure of the chemical system. We present five different applications of reactive transport simulations in the context of nuclear waste storage: 1. A 2D simulation of the leaching by rain water of an underground polluted zone rich in uranium oxide; 2. The degradation of the steel envelope of a package in contact with clay. Corrosion of the steel creates corrosion products and the altered package becomes a porous medium. We follow the degradation front through kinetic reactions and the coupling with transport; 3. The degradation of a cement-based material by the injection of an aqueous solution of zinc and sulphate ions. In addition to the reactive transport coupling, we take into account in this case the hydraulic feedback of the porosity variation on the Darcy velocity; 4. The decalcification of a concrete beam in an underground storage structure. In this case, in addition to the reactive transport simulation, we take into account the interaction between chemical degradation and the mechanical forces (cracks...), and the retroactive influence of the structural changes on transport; 5. The degradation of the steel envelope of a package in contact with a clay material under a temperature gradient. In this case the reactive transport simulation is entirely directed by the temperature changes and
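
    The abstract mentions a classical Sequential Iterative Approach (SIA) with a fixed-point algorithm and operator splitting; the sketch below shows only the generic shape of such a coupling loop for one time step, with placeholder transport and chemistry operators. It is a minimal illustration of the concept, not the ALLIANCES implementation.

```python
import numpy as np

def sia_time_step(c_total, transport, equilibrate, tol=1e-8, max_iter=50):
    """One time step of a Sequential Iterative Approach (SIA) coupling.

    c_total     : total mobile concentrations at the start of the step
    transport   : placeholder callable applying the transport operator over
                  the step, given the current chemical source term
    equilibrate : placeholder callable returning (new totals, source term)
                  from local chemistry (equilibrium and/or kinetics)
    These callables are illustrative; ALLIANCES' actual operators differ.
    """
    source = np.zeros_like(c_total)
    c_new = c_total
    for iteration in range(max_iter):
        c_transported = transport(c_total, source)     # transport with current source
        c_new, source_new = equilibrate(c_transported)  # local chemical step
        if np.max(np.abs(source_new - source)) < tol:   # fixed-point convergence test
            return c_new, iteration + 1
        source = source_new
    return c_new, max_iter
```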

  17. Simulating the service life performance of an inspected group of jacket-type structures

    DEFF Research Database (Denmark)

    Schneider, Ronald; Thöns, Sebastian; Rogge, Andreas

    2017-01-01

    failure probability conditional on simulated inspection and repair histories, and evaluates the associated costs and risk. The expected total service life costs and risk for a strategy are finally determined using Monte Carlo simulation. The optimal strategy minimizes the expected total service life costs...... and risk. We intend to adopt this approach to optimize inspection, monitoring and repair activities for offshore wind park support structures. As a first step, we simulate – in analogy to an offshore wind park – the service life performance of an inspected group of jacket-type frames. The performance...... is quantified in terms of the group’s system failure probability conditional on simulated inspection and repair histories. The underlying system model accounts for the structural redundancy of the frames and the interdependence among their failure events due to similar loading conditions. The model also...
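
    As a minimal sketch of the Monte Carlo evaluation of expected service-life costs and risk described above, the following samples inspection and repair histories with an assumed degradation law, detection probability and cost figures; none of these numbers come from the study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed illustrative parameters (not taken from the paper).
n_years, n_samples = 25, 10_000
c_inspect, c_repair, c_failure = 1.0, 10.0, 1000.0   # relative costs
inspection_years = {5, 10, 15, 20}                    # assumed inspection plan
p_detect = 0.8                                        # probability of detecting damage

def annual_failure_prob(damage):
    """Assumed damage-dependent annual failure probability."""
    return 1e-4 * (1.0 + 50.0 * damage)

total_costs = np.zeros(n_samples)
for i in range(n_samples):
    damage, cost = 0.0, 0.0
    for year in range(1, n_years + 1):
        damage += rng.exponential(0.02)               # stochastic degradation increment
        if year in inspection_years:
            cost += c_inspect
            if rng.random() < p_detect and damage > 0.2:
                cost += c_repair
                damage = 0.0                          # repair restores the component
        if rng.random() < annual_failure_prob(damage):
            cost += c_failure                         # risk contribution of failure
    total_costs[i] = cost

print("expected total service-life cost and risk:", total_costs.mean())
```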

  18. Unsafe Acts from Human Performance Analysis with Simulator Training Data

    International Nuclear Information System (INIS)

    Choi, Sun Yeong; Kim, Yochan; Park, Jinkyun; Kim, Seunghwan; Jung, Wondea

    2015-01-01

    In this research, an unsafe act (UA) is defined as an inappropriate human behavior that has the potential to drive the safety of NPPs in a negative direction. The purpose of this paper is to describe a process of UA analysis through a case study under a simulated emergency, together with the analysis results. In this paper, we described the process for UA identification and demonstrated examples of UAs and their consequences through a case study of an ISLOCA scenario. It is well known that the contribution of human performance related problems (or human error) to the safety of sociotechnical systems is critical. Moreover, it is reported that about 75% of all accidents and/or incidents that have occurred in complicated process control systems are attributable to human error. Therefore, many efforts have been made to perform HRA (Human Reliability Analysis) based on various approaches to enhance safety. HRA data are an important prerequisite for improving HRA quality. For this reason, KAERI (Korea Atomic Energy Research Institute) developed a standardized guideline specifying how to gather HRA data from simulator training records and created an IGT (Information Gathering Template) specifying what kinds of measures should be observed during the simulations. Based on this data collection framework, we have performed data collection to analyze inappropriate human behavior (or UA; unsafe act) with simulator training data for various scenarios requiring AOP (Abnormal Operation Procedure) or EOP (Emergency Operation Procedure) operations, for HEP (Human Error Probability) calculation

  19. Performance comparison of hydrological model structures during low flows

    Science.gov (United States)

    Staudinger, Maria; Stahl, Kerstin; Tallaksen, Lena M.; Clark, Martyn P.; Seibert, Jan

    2010-05-01

    Low flows are still poorly reproduced by common hydrological models, since these models are traditionally designed to reproduce peak flow situations as well as possible. As low flows become increasingly important for several target areas, there is a need to improve the available models. We present a study that assesses the impact of model structure on low flow simulations. This is done using the Framework for Understanding Structural Errors (FUSE), which identifies the set of (subjective) decisions made when building a hydrological model and provides multiple options for each modeling decision. 79 models were built using the FUSE framework and applied to simulate stream flows in the Narsjø catchment in Norway (119 km²). To allow comparison, all new models were calibrated using an automatic optimization method. Low flow and recession analysis of the new models enables us to evaluate model performance with a focus on different aspects by using various objective functions. Additionally, model structures responsible for poor performance, and hence unsuitable, can be detected. We focused on elucidating model performance during summer (August - October) and winter low flows, which evolve from entirely different hydrological processes in the Narsjø catchment. Summer low flows develop out of a lack of precipitation, while winter low flows are due to water storage in ice and snow. The results showed that simulations of summer low flows were consistently poorer than simulations of winter low flows when evaluated with an objective function focusing on low flows; here, the model structure influencing winter low flow simulations is the lower layer architecture. Different model structures were found to influence model performance during the summer season. The choice of other objective functions has the potential to affect such an evaluation. These findings call for the use of different model structures tailored to particular needs.
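
    A common way to build an objective function that focuses on low flows (assumed here for illustration; the study's exact objective functions are not listed in the abstract) is to compute the Nash-Sutcliffe efficiency on log-transformed flows, which weights errors during low-flow periods far more heavily than the standard NSE.

```python
import numpy as np

def nse(sim, obs):
    """Standard Nash-Sutcliffe efficiency (dominated by peak-flow errors)."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def log_nse(sim, obs, eps=1e-3):
    """NSE on log-transformed flows; emphasises low-flow periods.
    eps is an assumed small constant to avoid log(0) on zero-flow days."""
    return nse(np.log(np.asarray(sim, float) + eps),
               np.log(np.asarray(obs, float) + eps))

# Illustrative comparison with synthetic flows (not Narsjø data):
obs = np.array([0.2, 0.15, 0.1, 0.4, 5.0, 12.0, 3.0, 0.5, 0.3, 0.2])
sim = np.array([0.4, 0.35, 0.3, 0.6, 4.8, 11.5, 2.8, 0.7, 0.5, 0.4])
print(round(nse(sim, obs), 3), round(log_nse(sim, obs), 3))
```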

  20. Computer Models Simulate Fine Particle Dispersion

    Science.gov (United States)

    2010-01-01

    Through a NASA Seed Fund partnership with DEM Solutions Inc., of Lebanon, New Hampshire, scientists at Kennedy Space Center refined existing software to study the electrostatic phenomena of granular and bulk materials as they apply to planetary surfaces. The software, EDEM, allows users to import particles and obtain accurate representations of their shapes for modeling purposes, such as simulating bulk solids behavior, and was enhanced to be able to more accurately model fine, abrasive, cohesive particles. These new EDEM capabilities can be applied in many industries unrelated to space exploration and have been adopted by several prominent U.S. companies, including John Deere, Pfizer, and Procter & Gamble.

  1. Modeling and simulation of reactive flows

    CERN Document Server

    Bortoli, De AL; Pereira, Felipe

    2015-01-01

    Modelling and Simulation of Reactive Flows presents information on modeling and how to numerically solve reactive flows. The book offers a distinctive approach that combines diffusion flames and geochemical flow problems, providing users with a comprehensive resource that bridges the gap for scientists, engineers, and the industry. Specifically, the book looks at the basic concepts related to reaction rates, chemical kinetics, and the development of reduced kinetic mechanisms. It considers the most common methods used in practical situations, along with equations for reactive flows, and va

  2. A Simulation Model for Measuring Customer Satisfaction through Employee Satisfaction

    Science.gov (United States)

    Zondiros, Dimitris; Konstantopoulos, Nikolaos; Tomaras, Petros

    2007-12-01

    Customer satisfaction is defined as a measure of how a firm's product or service performs compared to the customer's expectations. It has long been a subject of research due to its importance for measuring marketing and business performance. Many models have been developed for its measurement. This paper proposes a simulation model using employee satisfaction as one of the most important factors leading to customer satisfaction (the others being expectations and disconfirmation of expectations). Data obtained from a two-year survey of bank customers in Greece were used. The application of three approaches regarding employee satisfaction resulted in greater customer satisfaction when there is a serious effort to keep employees satisfied.

  3. TMS modeling toolbox for realistic simulation.

    Science.gov (United States)

    Cho, Young Sun; Suh, Hyun Sang; Lee, Won Hee; Kim, Tae-Seong

    2010-01-01

    Transcranial magnetic stimulation (TMS) is a technique for brain stimulation using rapidly changing magnetic fields generated by coils. It has been established as an effective stimulation technique to treat patients suffering from damaged brain functions. Although TMS is known to be painless and noninvasive, it can also be harmful to the brain through incorrect focusing and excessive stimulation, which might result in seizure. Therefore there is an ongoing research effort to elucidate and better understand the effect and mechanism of TMS. Lately, the boundary element method (BEM) and the finite element method (FEM) have been used to simulate the electromagnetic phenomena of TMS. However, there is a lack of general tools to generate models of TMS due to some difficulties in realistic modeling of the human head and TMS coils. In this study, we have developed a toolbox through which one can generate high-resolution FE TMS models. The toolbox allows creating FE models of the head with isotropic and anisotropic electrical conductivities in five different tissues of the head and the coils in 3D. The generated TMS model is importable to FE software packages such as ANSYS for further and efficient electromagnetic analysis. We present a set of demonstrative results of realistic simulation of TMS with our toolbox.

  4. Simulation of MILD combustion using Perfectly Stirred Reactor model

    KAUST Repository

    Chen, Z.

    2016-07-06

    A simple model based on a Perfectly Stirred Reactor (PSR) is proposed for moderate or intense low-oxygen dilution (MILD) combustion. The PSR calculation is performed covering the entire flammability range and the tabulated chemistry approach is used with a presumed joint probability density function (PDF). The jet-in-hot-and-diluted-coflow experimental set-up under MILD conditions is simulated using this reactor model for two oxygen dilution levels. The computed results for mean temperature and major and minor species mass fractions are compared with the experimental data and with simulation results obtained recently using a multi-environment transported PDF approach. Overall, a good agreement is observed at three different axial locations for these comparisons, despite the over-predicted peak value of CO formation. This suggests that MILD combustion can be effectively modelled by the proposed PSR model at lower computational cost.
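
    The tabulated-chemistry-with-presumed-PDF idea can be illustrated, in a deliberately simplified single-variable form, by averaging a tabulated quantity over a presumed beta PDF of mixture fraction with a given mean and variance. The table values and moments below are made up; the actual model uses a joint PDF and a PSR-generated table.

```python
import numpy as np
from scipy.stats import beta

def presumed_pdf_mean(z_table, q_table, z_mean, z_var):
    """Mean of a tabulated quantity q(Z) over a presumed beta PDF of Z.

    Single-variable beta-PDF closure shown only as a simplified illustration.
    """
    # Beta-distribution parameters recovered from the mean and variance of Z.
    g = z_mean * (1.0 - z_mean) / z_var - 1.0
    a, b = z_mean * g, (1.0 - z_mean) * g
    z = np.linspace(1e-6, 1.0 - 1e-6, 2001)
    w = beta.pdf(z, a, b)                      # presumed PDF of mixture fraction
    q = np.interp(z, z_table, q_table)         # look-up in the tabulated data
    return np.sum(w * q) / np.sum(w)           # PDF-weighted mean of q

# Illustrative table: temperature versus mixture fraction (made-up values).
z_tab = np.array([0.0, 0.05, 0.1, 0.2, 0.4, 1.0])
T_tab = np.array([1300., 1700., 1900., 1650., 1200., 300.])
print(presumed_pdf_mean(z_tab, T_tab, z_mean=0.1, z_var=0.002))
```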

  5. Markov chain Monte Carlo simulation for Bayesian Hidden Markov Models

    Science.gov (United States)

    Chan, Lay Guat; Ibrahim, Adriana Irawati Nur Binti

    2016-10-01

    A hidden Markov model (HMM) is a mixture model which has a Markov chain with finite states as its mixing distribution. HMMs have been applied to a variety of fields, such as speech and face recognition. The main purpose of this study is to investigate the Bayesian approach to HMMs. Using this approach, we can simulate from the parameters' posterior distribution using some Markov chain Monte Carlo (MCMC) sampling methods. HMMs seem to be useful, but there are some limitations. Therefore, by using the Mixture of Dirichlet processes Hidden Markov Model (MDPHMM) based on Yau et al. (2011), we hope to overcome these limitations. We shall conduct a simulation study using MCMC methods to investigate the performance of this model.
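
    A minimal sketch of MCMC for a Bayesian HMM (with Gaussian emissions whose parameters are assumed known, to keep the example short) alternates single-site Gibbs updates of the hidden states with conjugate Dirichlet updates of the transition-matrix rows. This is only an illustration of the general approach, not the MDPHMM sampler of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data from a 2-state Gaussian HMM (emission parameters treated as
# known here; a full sampler would also update them).
A_true = np.array([[0.9, 0.1], [0.2, 0.8]])
mu, sigma = np.array([0.0, 3.0]), 1.0
T = 300
s = np.zeros(T, dtype=int)
for t in range(1, T):
    s[t] = rng.choice(2, p=A_true[s[t - 1]])
y = rng.normal(mu[s], sigma)

def emission(t, k):
    """Gaussian emission likelihood of observation y[t] under state k."""
    return np.exp(-0.5 * ((y[t] - mu[k]) / sigma) ** 2)

# Gibbs sampler: (i) single-site updates of hidden states, (ii) conjugate
# Dirichlet update of each transition-matrix row from transition counts.
states = rng.integers(0, 2, T)
A = np.full((2, 2), 0.5)
for sweep in range(500):
    for t in range(T):
        w = np.array([emission(t, k) for k in range(2)], dtype=float)
        if t > 0:
            w *= A[states[t - 1]]
        if t < T - 1:
            w *= A[:, states[t + 1]]
        states[t] = rng.choice(2, p=w / w.sum())
    counts = np.zeros((2, 2))
    for t in range(1, T):
        counts[states[t - 1], states[t]] += 1
    A = np.array([rng.dirichlet(1.0 + counts[k]) for k in range(2)])

print("posterior draw of the transition matrix:\n", A.round(2))
```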

  6. Performance modelling for product development of advanced window systems

    DEFF Research Database (Denmark)

    Appelfeld, David

    The research presented in this doctoral thesis shows how the product development (PD) of Complex Fenestration Systems (CFSs) can be facilitated by computer-based analysis to improve the energy efficiency of fenestration systems as well as to improve the indoor environment. The first chapter defines...... and methods, which can address interrelated performance parameters of CFS, are sought. It is possible to evaluate such systems by measurements, however the high cost and complexity of the measurements are limiting factors. The studies in this thesis confirmed that the results from the performance measurements...... of CFSs can be interpreted by simulations and hence simulations can be used for the performance analysis of new CFSs. An advanced simulation model must be often developed and needs to be validated by measurements before the model can be reused. The validation of simulations against the measurements proved...

  7. Integrating Visualizations into Modeling NEST Simulations.

    Science.gov (United States)

    Nowke, Christian; Zielasko, Daniel; Weyers, Benjamin; Peyser, Alexander; Hentschel, Bernd; Kuhlen, Torsten W

    2015-01-01

    Modeling large-scale spiking neural networks showing realistic biological behavior in their dynamics is a complex and tedious task. Since these networks consist of millions of interconnected neurons, their simulation produces an immense amount of data. In recent years it has become possible to simulate even larger networks. However, solutions to assist researchers in understanding the simulation's complex emergent behavior by means of visualization are still lacking. While developing tools to partially fill this gap, we encountered the challenge of integrating these tools easily into the neuroscientists' daily workflow. To understand what makes this so challenging, we looked into the workflows of our collaborators and analyzed how they use the visualizations to solve their daily problems. We identified two major issues: first, the analysis process can rapidly change focus, which requires switching the visualization tool that assists in the current problem domain. Second, because of the heterogeneous data that result from simulations, researchers want to relate data sets in order to investigate them effectively. Since a monolithic application that processes and visualizes all data modalities and reflects all combinations of possible workflows in a holistic way is most likely impossible to develop and to maintain, a software architecture that offers specialized visualization tools that run simultaneously and can be linked together to reflect the current workflow is a more feasible approach. To this end, we have developed a software architecture that allows neuroscientists to integrate visualization tools more closely into the modeling tasks. In addition, it forms the basis for semantic linking of different visualizations to reflect the current workflow. In this paper, we present this architecture and substantiate the usefulness of our approach by common use cases we encountered in our collaborative work.

  8. Integrating Visualizations into Modeling NEST Simulations

    Directory of Open Access Journals (Sweden)

    Christian Nowke

    2015-12-01

    Modeling large-scale spiking neural networks showing realistic biological behavior in their dynamics is a complex and tedious task. Since these networks consist of millions of interconnected neurons, their simulation produces an immense amount of data. In recent years it has become possible to simulate even larger networks. However, solutions to assist researchers in understanding the simulation's complex emergent behavior by means of visualization are still lacking. While developing tools to partially fill this gap, we encountered the challenge of integrating these tools easily into the neuroscientists' daily workflow. To understand what makes this so challenging, we looked into the workflows of our collaborators and analyzed how they use the visualizations to solve their daily problems. We identified two major issues: first, the analysis process can rapidly change focus, which requires switching the visualization tool that assists in the current problem domain. Second, because of the heterogeneous data that result from simulations, researchers want to relate data sets in order to investigate them effectively. Since a monolithic application that processes and visualizes all data modalities and reflects all combinations of possible workflows in a holistic way is most likely impossible to develop and to maintain, a software architecture that offers specialized visualization tools that run simultaneously and can be linked together to reflect the current workflow is a more feasible approach. To this end, we have developed a software architecture that allows neuroscientists to integrate visualization tools more closely into the modeling tasks. In addition, it forms the basis for semantic linking of different visualizations to reflect the current workflow. In this paper, we present this architecture and substantiate the usefulness of our approach by common use cases we encountered in our collaborative work.

  9. Integrating Visualizations into Modeling NEST Simulations

    Science.gov (United States)

    Nowke, Christian; Zielasko, Daniel; Weyers, Benjamin; Peyser, Alexander; Hentschel, Bernd; Kuhlen, Torsten W.

    2015-01-01

    Modeling large-scale spiking neural networks showing realistic biological behavior in their dynamics is a complex and tedious task. Since these networks consist of millions of interconnected neurons, their simulation produces an immense amount of data. In recent years it has become possible to simulate even larger networks. However, solutions to assist researchers in understanding the simulation's complex emergent behavior by means of visualization are still lacking. While developing tools to partially fill this gap, we encountered the challenge of integrating these tools easily into the neuroscientists' daily workflow. To understand what makes this so challenging, we looked into the workflows of our collaborators and analyzed how they use the visualizations to solve their daily problems. We identified two major issues: first, the analysis process can rapidly change focus, which requires switching the visualization tool that assists in the current problem domain. Second, because of the heterogeneous data that result from simulations, researchers want to relate data sets in order to investigate them effectively. Since a monolithic application that processes and visualizes all data modalities and reflects all combinations of possible workflows in a holistic way is most likely impossible to develop and to maintain, a software architecture that offers specialized visualization tools that run simultaneously and can be linked together to reflect the current workflow is a more feasible approach. To this end, we have developed a software architecture that allows neuroscientists to integrate visualization tools more closely into the modeling tasks. In addition, it forms the basis for semantic linking of different visualizations to reflect the current workflow. In this paper, we present this architecture and substantiate the usefulness of our approach by common use cases we encountered in our collaborative work. PMID:26733860

  10. Large wind power plants modeling techniques for power system simulation studies

    Energy Technology Data Exchange (ETDEWEB)

    Larose, Christian; Gagnon, Richard; Turmel, Gilbert; Giroux, Pierre; Brochu, Jacques [IREQ Hydro-Quebec Research Institute, Varennes, QC (Canada); McNabb, Danielle; Lefebvre, Daniel [Hydro-Quebec TransEnergie, Montreal, QC (Canada)

    2009-07-01

    This paper presents efficient modeling techniques for the simulation of large wind power plants in the EMT domain using a parallel supercomputer. Using these techniques, large wind power plants can be simulated in detail, with each wind turbine individually represented, as well as the collector and receiving network. The simulation speed of the resulting models is fast enough to perform both EMT and transient stability studies. The techniques are applied to develop an EMT detailed model of a generic wind power plant consisting of 73 x 1.5-MW doubly-fed induction generator (DFIG) wind turbines. Validation of the modeling techniques is presented using a comparison with a Matlab/SimPowerSystems simulation. To demonstrate the simulation capabilities of these modeling techniques, simulations involving a 120-bus receiving network with two generic wind power plants (146 wind turbines) are performed. The complete system is modeled using the Hypersim simulator and Matlab/SimPowerSystems. The simulations are performed on a 32-processor supercomputer using an EMTP-like solution with a time step of 18.4 μs. The simulation performance is 10 times slower than real time, which is a huge gain in performance compared to traditional tools. The simulation is designed to run in real time so that it never stops, resulting in a capability to perform thousands of tests via automatic testing tools. (orig.)

  11. Efficient Turbulence Modeling for CFD Wake Simulations

    DEFF Research Database (Denmark)

    van der Laan, Paul

    , that can accurately and efficiently simulate wind turbine wakes. The linear k-ε eddy viscosity model (EVM) is a popular turbulence model in RANS; however, it underpredicts the velocity wake deficit and cannot predict the anisotropic Reynolds-stresses in the wake. In the current work, nonlinear eddy...... viscosity models (NLEVM) are applied to wind turbine wakes. NLEVMs can model anisotropic turbulence through a nonlinear stress-strain relation, and they can improve the velocity deficit by the use of a variable eddy viscosity coefficient, that delays the wake recovery. Unfortunately, all tested NLEVMs show...... numerically unstable behavior for fine grids, which inhibits a grid dependency study for numerical verification. Therefore, a simpler EVM is proposed, labeled as the k-ε - fp EVM, that has a linear stress-strain relation, but still has a variable eddy viscosity coefficient. The k-ε - fp EVM is numerically...
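
    For reference, the linear eddy-viscosity closure referred to above is the standard Boussinesq relation with a constant model coefficient; the thesis's k-ε-fp variant, which replaces the constant coefficient with a variable one to delay the wake recovery, is not reproduced here.

```latex
% Standard linear (Boussinesq) eddy-viscosity closure used by the k-epsilon model;
% the variable-coefficient k-epsilon-f_P modification of the thesis is not shown.
\[
  -\overline{u_i' u_j'} \;=\; 2\,\nu_T S_{ij} \;-\; \tfrac{2}{3}\,k\,\delta_{ij},
  \qquad
  S_{ij} \;=\; \tfrac{1}{2}\!\left(\frac{\partial \bar{u}_i}{\partial x_j}
                                   + \frac{\partial \bar{u}_j}{\partial x_i}\right),
  \qquad
  \nu_T \;=\; C_\mu \frac{k^2}{\varepsilon}.
\]
```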

  12. Fully Kinetic Ion Models for Magnetized Plasma Simulations

    Science.gov (United States)

    Sturdevant, Benjamin J.

    This thesis focuses on the development of simulation models, based on fully resolving the gyro-motion of ions with the Lorentz force equations of motion, for studying low-frequency phenomena in well-magnetized plasma systems. Such models, known as fully kinetic ion models, offer formal simplicity over higher order gyrokinetic ion models and may provide an important validation tool or replacement for gyrokinetic ion models in applications where the gyrokinetic ordering assumptions are in question. Methods for dealing with the added difficulty of resolving the short time scales associated with the ion gyro-motion in fully kinetic ion models are explored with the use of graphics processing units (GPUs) and advanced time integration algorithms, including sub-cycling, orbit averaging and variational integrators. Theoretical work is performed to analyze the effects of the ion Bernstein modes, which are known to cause difficulties in simulations based on fully kinetic ion models. In addition, the first simulation results for the ion temperature gradient driven instability in toroidal geometry using a fully kinetic ion model are presented. Finally, during the course of this work, a method for analyzing the effects of a finite time step size and spatial grid in the delta-f approach to the particle-in-cell method was developed for the first time. This method was applied to an implicit time integration scheme and has revealed some unusual numerical properties related to the delta-f method.
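
    A standard explicit integrator for the Lorentz-force equations of motion underlying fully kinetic ion models is the Boris push, sketched below purely as background; the thesis itself builds on this kind of integration with sub-cycling, orbit averaging and variational integrators, none of which are shown here.

```python
import numpy as np

def boris_push(x, v, E, B, q_over_m, dt):
    """One Boris step for the Lorentz force dv/dt = (q/m)(E + v x B).

    Standard textbook scheme, shown only to illustrate fully kinetic ion
    integration; it is not the specific algorithm developed in the thesis.
    """
    v_minus = v + 0.5 * dt * q_over_m * E          # half electric acceleration
    t = 0.5 * dt * q_over_m * B                    # rotation vector
    s = 2.0 * t / (1.0 + np.dot(t, t))
    v_prime = v_minus + np.cross(v_minus, t)       # magnetic rotation (two stages)
    v_plus = v_minus + np.cross(v_prime, s)
    v_new = v_plus + 0.5 * dt * q_over_m * E       # second half electric acceleration
    x_new = x + dt * v_new                         # leapfrog position update
    return x_new, v_new

# Example: ion gyrating in a uniform magnetic field along z (arbitrary units).
x, v = np.zeros(3), np.array([1.0, 0.0, 0.0])
E, B, q_over_m = np.zeros(3), np.array([0.0, 0.0, 1.0]), 1.0
dt = 0.05                                          # resolves the gyro-period 2*pi
for _ in range(1000):
    x, v = boris_push(x, v, E, B, q_over_m, dt)
print("speed is conserved:", np.linalg.norm(v))
```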

  13. Developed hydraulic simulation model for water pipeline networks

    Directory of Open Access Journals (Sweden)

    A. Ayad

    2013-03-01

    A numerical method that uses linear graph theory is presented for both steady-state and extended period simulation in a pipe network, including its hydraulic components (pumps, valves, junctions, etc.). The developed model is based on the Extended Linear Graph Theory (ELGT) technique. This technique is modified to include new network components such as flow control valves and tanks. The technique is also expanded for extended period simulation (EPS). A newly modified method for the calculation of updated flows, improving the convergence rate, is introduced. Both benchmark and actual networks are analyzed to check the reliability of the proposed method. The results demonstrate the good performance of the proposed method.

  14. Desktop Modeling and Simulation: Parsimonious, yet Effective Discrete-Event Simulation Analysis

    Science.gov (United States)

    Bradley, James R.

    2012-01-01

    This paper evaluates how quickly students can be trained to construct useful discrete-event simulation models using Excel. The typical supply chain used by many large national retailers is described, and an Excel-based simulation model of it is constructed. The set of programming and simulation skills required for development of that model is then determined; we conclude that six hours of training are required to teach the skills to MBA students. The simulation presented here contains all fundamental functionality of a simulation model, and so our result holds for any discrete-event simulation model. We argue, therefore, that industry workers with the same technical skill set as students having completed one year in an MBA program can be quickly trained to construct simulation models. This result gives credence to the efficacy of Desktop Modeling and Simulation, whereby simulation analyses can be quickly developed, run, and analyzed with widely available software, namely Excel.
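
    The fundamental functionality referred to above is the usual future-event-list mechanics of discrete-event simulation. The sketch below shows that core for a single-server queue, written in Python rather than Excel and with assumed arrival and service rates, simply to make the event-list logic explicit.

```python
import heapq
import random

random.seed(42)
ARRIVAL_RATE, SERVICE_RATE, HORIZON = 1.0, 1.25, 10_000.0   # assumed parameters

events = [(random.expovariate(ARRIVAL_RATE), "arrival")]    # future-event list
queue, busy, served = 0, False, 0
t = 0.0
while events and t < HORIZON:
    t, kind = heapq.heappop(events)                          # next imminent event
    if kind == "arrival":
        heapq.heappush(events, (t + random.expovariate(ARRIVAL_RATE), "arrival"))
        if busy:
            queue += 1                                       # wait in line
        else:
            busy = True
            heapq.heappush(events, (t + random.expovariate(SERVICE_RATE), "departure"))
    else:  # departure
        served += 1
        if queue > 0:
            queue -= 1                                       # start serving next customer
            heapq.heappush(events, (t + random.expovariate(SERVICE_RATE), "departure"))
        else:
            busy = False

print("customers served:", served)
```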

  15. H5hut: A High-Performance I/O Library for Particle-based Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Howison, Mark; Adelmann, Andreas; Bethel, E. Wes; Gsell, Achim; Oswald, Benedikt; Prabhat,

    2010-09-24

    Particle-based simulations running on large high-performance computing systems over many time steps can generate an enormous amount of particle- and field-based data for post-processing and analysis. Achieving high-performance I/O for this data, effectively managing it on disk, and interfacing it with analysis and visualization tools can be challenging, especially for domain scientists who do not have I/O and data management expertise. We present the H5hut library, an implementation of several data models for particle-based simulations that encapsulates the complexity of HDF5 and is simple to use, yet does not compromise performance.
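
    H5hut's own API is not reproduced here; as a generic illustration of the underlying data model (one HDF5 group per time step holding per-particle datasets plus step metadata), the sketch below uses the plain h5py interface instead.

```python
import numpy as np
import h5py

n_particles, n_steps = 100_000, 5

# Write one group per time step, each containing per-particle datasets.
with h5py.File("particles.h5", "w") as f:
    for step in range(n_steps):
        grp = f.create_group(f"Step#{step}")            # illustrative group naming
        for name in ("x", "y", "z", "px", "py", "pz"):
            grp.create_dataset(name,
                               data=np.random.rand(n_particles),
                               compression="gzip")       # chunked, compressed storage
        grp.attrs["time"] = 0.01 * step                  # per-step metadata

# Read back one coordinate array from a chosen step.
with h5py.File("particles.h5", "r") as f:
    x2 = f["Step#2/x"][:]
print(x2.shape)
```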

  16. Simulation Model of Bus Rapid Transit

    Directory of Open Access Journals (Sweden)

    Gunawan Fergyanto E.

    2014-03-01

    The bus rapid transit (BRT) system is a modern solution for mass transportation. In comparison to rail-based transportation systems, it is significantly cheaper and requires a shorter development time, but offers lower performance. BRT system performance strongly depends on variables related to station design and infrastructure. A numerical model offers an effective and efficient means to evaluate the system performance. This article offers a detailed numerical model on the basis of the discrete-event approach and demonstrates its application.

  17. A Pass Band Performance Simulation Code of Coupled Cavities

    CERN Document Server

    Tao, X

    2004-01-01

    A simulation code for accelerating cavities, named PPSC, has been developed based on solutions of the microwave equivalent-circuit equations. PPSC can give the pass band performance of periodic or non-periodic accelerating structures, such as the dispersion frequency and the reflection factor of the cavity, the field distribution of each mode, and so on. The natural parameters of the structure, such as the number of cavities, the resonant frequencies and Q-factors of each cavity, the coupling factor between two cavities, and the locations of the couplers, can be changed easily to see the different results of the simulation. The code is written in MS Visual Basic under MS Windows, which provides a user-friendly interface. Some simple examples were simulated and gave reliable results.

  18. Adaptive Performance-Constrained in Situ Visualization of Atmospheric Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Dorier, Matthieu; Sisneros, Roberto; Bautista Gomez, Leonard; Peterka, Tom; Orf, Leigh; Rahmani, Lokman; Antoniu, Gabriel; Bouge, Luc

    2016-09-12

    While many parallel visualization tools now provide in situ visualization capabilities, the trend has been to feed such tools with large amounts of unprocessed output data and let them render everything at the highest possible resolution. This leads to an increased run time of simulations that still have to complete within a fixed-length job allocation. In this paper, we tackle the challenge of enabling in situ visualization under performance constraints. Our approach shuffles data across processes according to its content and filters out part of it in order to feed a visualization pipeline with only a reorganized subset of the data produced by the simulation. Our framework leverages fast, generic evaluation procedures to score blocks of data, using information theory, statistics, and linear algebra. It monitors its own performance and adapts dynamically to achieve appropriate visual fidelity within predefined performance constraints. Experiments on the Blue Waters supercomputer with the CM1 simulation show that our approach enables a 5x speedup with respect to the initial visualization pipeline and is able to meet performance constraints.
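
    The paper's scoring procedures are only characterised at a high level (information theory, statistics, linear algebra), so the sketch below shows just the information-theoretic flavour of the idea: score blocks of a field by the Shannon entropy of their value histogram and forward only the highest-scoring fraction permitted by the performance budget. The block size, bin count and kept fraction are assumptions.

```python
import numpy as np

def block_entropy(block, bins=32):
    """Shannon entropy of a block's value histogram (one possible score)."""
    hist, _ = np.histogram(block, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def select_blocks(field, block_shape, keep_fraction):
    """Split a 2-D field into blocks, score them, keep the most informative ones."""
    by, bx = block_shape
    scores, blocks = [], []
    for j in range(0, field.shape[0], by):
        for i in range(0, field.shape[1], bx):
            block = field[j:j + by, i:i + bx]
            blocks.append(((j, i), block))
            scores.append(block_entropy(block))
    n_keep = max(1, int(keep_fraction * len(blocks)))     # performance budget
    order = np.argsort(scores)[::-1][:n_keep]
    return [blocks[k] for k in order]

# Illustrative field: smooth background plus a localised feature.
y, x = np.mgrid[0:256, 0:256]
field = np.sin(x / 40.0) + 3.0 * np.exp(-((x - 64) ** 2 + (y - 64) ** 2) / 200.0)
kept = select_blocks(field, (32, 32), keep_fraction=0.25)
print(len(kept), "of", (256 // 32) ** 2, "blocks forwarded to visualization")
```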

  19. The applicability and limitations of the geochemical models and tools used in simulating radionuclide behaviour in natural waters. Lessons learned from the Blind Predictive Modelling exercises performed in conjunction with Natural Analogue studies

    Energy Technology Data Exchange (ETDEWEB)

    Bruno, J.; Duro, L.; Grive, M. [QuantiSci SL, Parc Tecnologic del Valles (Spain)

    2001-07-01

    One of the key applications of Natural Analogue studies to the Performance Assessment (PA) of nuclear waste disposal has been the possibility to test the geochemical models and tools to be used in describing the migration of radionuclides in a future radioactive waste repository system. To this end, several geochemical modelling testing exercises (commonly denoted as Blind Predictive Modelling), have formed an integral part of Natural Analogue Studies over the last decade. Consequently, we thought that this is a timely occasion to make an evaluation of the experience gained and lessons learnt. We have reviewed, discussed and compared the results obtained from the Blind Prediction Modelling (BPM) exercises carried out within 7 Natural Analogue Studies: Oman, Pocos de Caldas, Cigar Lake, Maqarin, El Berrocal, Oklo and Palmottu. To make this comparison meaningful, we present the main geochemical characteristics of each site in order to highlight the most relevant mineralogical and hydrochemical differences. From the complete list of elements studied at all the investigated sites we have made a selection based on the relevance of a given element from a PA viewpoint and on the frequency this element has been included in the BPM exercises. The elements selected for discussion are: Sr, Ba, Sn, Pb, Se, Ni, Zn, REEs, Th and U. We have based our discussion on the results obtained from the speciation as well as solubility calculations. From the comparison of the results it is concluded that we can differentiate between three element categories: 1. Elements whose geochemical behaviour can be fairly well described by assuming solubility control exerted by pure solid phases of the given element (i.e. Th, U under reducing conditions and U in some sites under oxidising conditions); 2. Elements for which the association to major geochemical components of the system must be considered in order to explain their concentrations in groundwaters (i.e. Sr, Ba, Zn, Se, REEs and U under

  20. Best Practices for Crash Modeling and Simulation

    Science.gov (United States)

    Fasanella, Edwin L.; Jackson, Karen E.

    2002-01-01

    Aviation safety can be greatly enhanced by the expeditious use of computer simulations of crash impact. Unlike automotive impact testing, which is now routine, experimental crash tests of even small aircraft are expensive and complex due to the high cost of the aircraft and the myriad of crash impact conditions that must be considered. Ultimately, the goal is to utilize full-scale crash simulations of aircraft for design evaluation and certification. The objective of this publication is to describe "best practices" for modeling aircraft impact using explicit nonlinear dynamic finite element codes such as LS-DYNA, DYNA3D, and MSC.Dytran. Although "best practices" is somewhat relative, it is hoped that the authors' experience will help others to avoid some of the common pitfalls in modeling that are not documented in one single publication. In addition, a discussion of experimental data analysis, digital filtering, and test-analysis correlation is provided. Finally, some examples of aircraft crash simulations are described in several appendices following the main report.