WorldWideScience

Sample records for model simulations performed

  1. Photovoltaic array performance simulation models

    Energy Technology Data Exchange (ETDEWEB)

    Menicucci, D. F.

    1986-09-15

    The experience of the solar industry confirms that, despite recent cost reductions, the profitability of photovoltaic (PV) systems is often marginal and the configuration and sizing of a system is a critical problem for the design engineer. Construction and evaluation of experimental systems are expensive and seldom justifiable. A mathematical model or computer-simulation program is a desirable alternative, provided reliable results can be obtained. Sandia National Laboratories, Albuquerque (SNLA), has been studying PV-system modeling techniques in an effort to develop an effective tool to be used by engineers and architects in the design of cost-effective PV systems. This paper reviews two of the sources of error found in previous PV modeling programs, presents the remedies developed to correct these errors, and describes a new program that incorporates these improvements.

  2. Maintenance Personnel Performance Simulation (MAPPS) model

    International Nuclear Information System (INIS)

    Siegel, A.I.; Bartter, W.D.; Wolf, J.J.; Knee, H.E.; Haas, P.M.

    1984-01-01

    A stochastic computer model for simulating the actions and behavior of nuclear power plant maintenance personnel is described. The model considers personnel, environmental, and motivational variables to yield predictions of maintenance performance quality and time to perform. The model has been fully developed and sensitivity tested. Additional evaluation of the model is now taking place.

  3. CASTOR detector. Model, objectives and simulated performance

    International Nuclear Information System (INIS)

    Angelis, A. L. S.; Mavromanolakis, G.; Panagiotou, A. D.; Aslanoglou, X.; Nicolis, N.; Lobanov, M.; Erine, S.; Kharlov, Y. V.; Bogolyubsky, M. Y.; Kurepin, A. B.; Chileev, K.; Wlodarczyk, Z.

    2001-01-01

    A phenomenological model is presented describing the formation and evolution of a Centauro fireball in the baryon-rich region in nucleus-nucleus interactions in the upper atmosphere and at the LHC. The small particle multiplicity and the imbalance of electromagnetic and hadronic content characterizing a Centauro event, as well as the strongly penetrating particles (assumed to be strangelets) frequently accompanying them, can be naturally explained. The CASTOR calorimeter is described, a subdetector of the ALICE experiment dedicated to the search for Centauro events in the very forward, baryon-rich region of central Pb+Pb collisions at the LHC. The basic characteristics and simulated performance of the calorimeter are presented.

  4. The COD Model: Simulating Workgroup Performance

    Science.gov (United States)

    Biggiero, Lucio; Sevi, Enrico

    Though the question of the determinants of workgroup performance is one of the most central in organization science, precise theoretical frameworks and formal demonstrations are still missing. To fill this gap, the COD agent-based simulation model is presented here and used to study the effects of task interdependence and bounded rationality on workgroup performance. The first relevant finding is an algorithmic demonstration of the ordering of interdependencies in terms of complexity, showing that the parallel mode is the simplest, followed by the sequential and then by the reciprocal. This result is far from new in organization science, but what is remarkable is that it now has the strength of an algorithmic demonstration instead of resting on the authoritativeness of some scholar or on episodic empirical findings. The second important result is that the progressive introduction of realistic limits to agents' rationality dramatically reduces workgroup performance and leads to a rather interesting result: when agents' rationality is severely bounded, simple norms work better than complex norms. The third main finding is that when the complexity of interdependence is high, the appropriate coordination mechanism is agents' direct and active collaboration, which means teamwork.

  5. Modelling and simulating fire tube boiler performance

    DEFF Research Database (Denmark)

    Sørensen, K.; Condra, T.; Houbak, Niels

    2003-01-01

    A model for a flue gas boiler covering the flue gas and the water/steam side has been formulated. The model has been formulated as a number of sub-models that are merged into an overall model for the complete boiler. Sub-models have been defined for the furnace, the convection zone (split into two parts: a zone submerged in water and a zone covered by steam), the material in the boiler (the steel), and two models for the water/steam zone (the boiling) and the steam, respectively. The dynamic model has been developed as a number of differential-algebraic equation (DAE) systems. Subsequently, MATLAB/Simulink has been applied to carry out the simulations. To verify the simulated results, experiments have been carried out on a full-scale boiler plant.

  6. Comparison of performance of simulation models for floor heating

    DEFF Research Database (Denmark)

    Weitzmann, Peter; Svendsen, Svend

    2005-01-01

    This paper describes the comparison of performance of simulation models for floor heating with different levels of detail in the modelling process. The models are compared in an otherwise identical simulation model containing a room model, walls, windows, ceiling and ventilation system. By exchanging...

  7. Impact of reactive settler models on simulated WWTP performance

    DEFF Research Database (Denmark)

    Gernaey, Krist; Jeppsson, Ulf; Batstone, Damien J.

    2006-01-01

    For an ASM1 case study, simulations with a whole-plant model including the non-reactive Takacs settler model are used as a reference and are compared to simulation results considering two reactive settler models. The first is a return sludge model block removing oxygen and a user-defined fraction of nitrate, combined with a non-reactive Takacs settler. The second is a fully reactive ASM1 Takacs settler model. Simulations with the ASM1 reactive settler model predicted a 15.3% and 7.4% improvement of the simulated N removal performance for constant (steady-state) and dynamic influent conditions, respectively. The oxygen/nitrate return sludge model block predicts a 10% improvement of N removal performance under dynamic conditions, and might be the better modelling option for ASM1 plants: it is computationally more efficient and it will not overrate the importance of decay processes in the settler.

  8. MODELING, SIMULATION AND PERFORMANCE STUDY OF GRID-CONNECTED PHOTOVOLTAIC ENERGY SYSTEM

    OpenAIRE

    Nagendra K; Karthik J; Keerthi Rao C; Kumar Raja Pemmadi

    2017-01-01

    This paper presents the modeling and simulation of a grid-connected photovoltaic energy system and a performance study using MATLAB/Simulink. The photovoltaic energy system is considered in three main parts: the PV model, the power conditioning system and the grid interface. The photovoltaic model is interconnected with the grid through full-scale power electronic devices. The simulation is conducted on the PV energy system at normal temperature and at constant load using MATLAB.
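
    As a point of reference for the PV model component mentioned above, a minimal single-diode photovoltaic cell model is sketched below in Python (the cited work uses MATLAB/Simulink; the parameter values here are illustrative assumptions, not taken from the paper).

```python
import numpy as np
from scipy.optimize import brentq

# Illustrative single-diode PV cell model (parameters are assumed, not from the paper):
# I = Iph - I0*(exp((V + I*Rs)/(n*Vt)) - 1) - (V + I*Rs)/Rsh
q, k = 1.602e-19, 1.381e-23
T = 298.15                      # cell temperature [K]
Vt = k * T / q                  # thermal voltage [V]
Iph, I0 = 8.0, 1e-9             # photocurrent and diode saturation current [A] (assumed)
Rs, Rsh, n = 0.01, 200.0, 1.3   # series/shunt resistance [ohm] and ideality factor (assumed)

def cell_current(V):
    """Solve the implicit single-diode equation for the cell current at voltage V."""
    f = lambda I: Iph - I0 * (np.exp((V + I * Rs) / (n * Vt)) - 1.0) - (V + I * Rs) / Rsh - I
    return brentq(f, -1.0, Iph + 1.0)

voltages = np.linspace(0.0, 0.7, 50)
currents = np.array([cell_current(v) for v in voltages])
power = voltages * currents
print(f"Approximate maximum power point: {power.max():.2f} W at {voltages[power.argmax()]:.2f} V")
```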

  9. Atomic scale simulations for improved CRUD and fuel performance modeling

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, Anders David Ragnar [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Cooper, Michael William Donald [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-01-06

    A more mechanistic description of fuel performance codes can be achieved by deriving models and parameters from atomistic scale simulations rather than fitting models empirically to experimental data. The same argument applies to modeling deposition of corrosion products on fuel rods (CRUD). Here are some results from publications in 2016 carried out using the CASL allocation at LANL.

  10. Software life cycle dynamic simulation model: The organizational performance submodel

    Science.gov (United States)

    Tausworthe, Robert C.

    1985-01-01

    The submodel structure of a software life cycle dynamic simulation model is described. The software process is divided into seven phases, each with product, staff, and funding flows. The model is subdivided into an organizational response submodel, a management submodel, a management influence interface, and a model analyst interface. The concentration here is on the organizational response model, which simulates the performance characteristics of a software development subject to external and internal influences. These influences emanate from two sources: the model analyst interface, which configures the model to simulate the response of an implementing organization subject to its own internal influences, and the management submodel that exerts external dynamic control over the production process. A complete characterization is given of the organizational response submodel in the form of parameterized differential equations governing product, staffing, and funding levels. The parameter values and functions are allocated to the two interfaces.
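
    The parameterized differential-equation structure described above can be illustrated with a minimal sketch of coupled product/staffing/funding dynamics; the equations and coefficients below are hypothetical placeholders for illustration, not the submodel's actual formulation.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical coupled dynamics for product completed (P), staff level (S) and
# remaining funding (F); coefficients are illustrative, not those of the submodel.
PRODUCTIVITY = 0.8      # product units per staff-month
HIRING_RATE = 0.10      # fraction of the staffing gap closed per month
TARGET_STAFF = 20.0
COST_PER_STAFF = 1.0    # funding units per staff-month

def lifecycle(t, y):
    product, staff, funding = y
    d_product = PRODUCTIVITY * staff if funding > 0 else 0.0
    d_staff = HIRING_RATE * (TARGET_STAFF - staff) if funding > 0 else -staff
    d_funding = -COST_PER_STAFF * staff if funding > 0 else 0.0
    return [d_product, d_staff, d_funding]

# Integrate 36 months starting from no product, 5 staff and 500 funding units.
sol = solve_ivp(lifecycle, (0.0, 36.0), [0.0, 5.0, 500.0])
print(f"Product completed after 36 months: {sol.y[0, -1]:.1f} units")
```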

  11. Simulation model of a twin-tail, high performance airplane

    Science.gov (United States)

    Buttrill, Carey S.; Arbuckle, P. Douglas; Hoffler, Keith D.

    1992-01-01

    The mathematical model and associated computer program to simulate a twin-tailed high performance fighter airplane (McDonnell Douglas F/A-18) are described. The simulation program is written in the Advanced Continuous Simulation Language. The simulation math model includes the nonlinear six degree-of-freedom rigid-body equations, an engine model, sensors, and first order actuators with rate and position limiting. A simplified form of the F/A-18 digital control laws (version 8.3.3) is implemented. The simulated control law includes only inner loop augmentation in the up and away flight mode. The aerodynamic forces and moments are calculated from a wind-tunnel-derived database using table look-ups with linear interpolation. The aerodynamic database has an angle-of-attack range of -10 to +90 degrees and a sideslip range of -20 to +20 degrees. The effects of elastic deformation are incorporated in a quasi-static-elastic manner. Elastic degrees of freedom are not actively simulated. In the engine model, the throttle-commanded steady-state thrust level and the dynamic response characteristics of the engine are based on airflow rate as determined from a table look-up. Afterburner dynamics are switched in at a threshold based on the engine airflow and commanded thrust.
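
    The table look-up with linear interpolation used for the aerodynamic database can be sketched as follows; the grid spans the angle-of-attack and sideslip ranges quoted in the abstract, but the coefficient values are invented for illustration and are not the F/A-18 wind-tunnel data.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Illustrative aerodynamic table lookup: lift coefficient CL tabulated over
# angle of attack (deg) and sideslip (deg). The CL values are made up.
alpha = np.linspace(-10.0, 90.0, 21)        # angle-of-attack grid [deg]
beta = np.linspace(-20.0, 20.0, 9)          # sideslip grid [deg]
cl_table = 0.05 * alpha[:, None] * np.cos(np.radians(beta))[None, :]

cl_lookup = RegularGridInterpolator((alpha, beta), cl_table, method="linear")

# Query a single flight condition (alpha = 12.3 deg, beta = -4.0 deg).
print(f"CL at (12.3, -4.0): {cl_lookup([[12.3, -4.0]])[0]:.3f}")
```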

  12. Acoustic performance of industrial mufflers with CAE modeling and simulation

    Directory of Open Access Journals (Sweden)

    Jeon Soohong

    2014-12-01

    This paper investigates the noise transmission performance of industrial mufflers widely used in ships based on CAE modeling and simulation. Since industrial mufflers have very complicated internal structures, the conventional Transfer Matrix Method (TMM) is of limited use. CAE modeling and simulation is therefore required, incorporating commercial software packages: CATIA for geometry modeling, MSC/PATRAN for FE meshing and LMS/SYSNOISE for analysis. The main difficulties in this study arise from the complicated arrangement of reactive elements, perforated walls and absorption materials. The reactive elements and absorbent materials are modeled by applying boundary conditions given by impedance. The perforated walls are modeled by applying the transfer impedance on the duplicated node mesh. The CAE approach presented in this paper is verified by comparison with the theoretical solution for a concentric-tube resonator and is applied to industrial mufflers.

  14. Hybrid Building Performance Simulation Models for Industrial Energy Efficiency Applications

    Directory of Open Access Journals (Sweden)

    Peter Smolek

    2018-06-01

    In the challenge of achieving environmental sustainability, industrial production plants, as large contributors to the overall energy demand of a country, are prime candidates for applying energy efficiency measures. A modelling approach using cubes is used to decompose a production facility into manageable modules. All aspects of the facility are considered, classified into the building, energy system, production and logistics. This approach leads to specific challenges for building performance simulations since all parts of the facility are highly interconnected. To meet this challenge, models for the building, thermal zones, energy converters and energy grids are presented and the interfaces to the production and logistics equipment are illustrated. The advantages and limitations of the chosen approach are discussed. In an example implementation, the feasibility of the approach and models is shown. Different scenarios are simulated to highlight the models and the results are compared.

  15. Water desalination price from recent performances: Modelling, simulation and analysis

    International Nuclear Information System (INIS)

    Metaiche, M.; Kettab, A.

    2005-01-01

    The subject of the present article is the technical simulation of seawater desalination by a one-stage reverse osmosis system. The objectives are a recent valuation of the cost price through the use of new membrane and permeator performances, the use of new means of simulation and modelling of desalination parameters, and to show the main parameters influencing the cost price. We have taken as the simulation example the seawater desalting centre of Djannet (Boumerdes, Algeria). The present performances allow water desalting at a price of 0.5 $/m³, which is an interesting and promising price, corresponding to a very acceptable product water quality, in the order of 269 ppm. It is important to run the desalting systems by reverse osmosis under high pressure, resulting in a further decrease of the desalting cost and the production of good-quality water. A poor choice of operating conditions produces high prices and unacceptable quality. However, there exists the possibility of decreasing the price by relaxing the requirements on the product quality. The seawater temperature has an effect on the cost price and quality. The installation of big desalting centres contributes to the decrease in prices. A very long and tedious calculation is involved, which is impossible to conduct without programming and informatics tools. The use of the simulation model has been very efficient in the design of desalination centres that can perform at much improved prices. (author)

  16. Comprehensive Simulation Lifecycle Management for High Performance Computing Modeling and Simulation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — There are significant logistical barriers to entry-level high performance computing (HPC) modeling and simulation (M&S). IllinoisRocstar sets up the infrastructure for...

  17. Maintenance personnel performance simulation (MAPPS) model: overview and evaluation efforts

    International Nuclear Information System (INIS)

    Knee, H.E.; Haas, P.M.; Siegel, A.I.; Bartter, W.D.; Wolf, J.J.; Ryan, T.G.

    1984-01-01

    The development of the MAPPS model has been completed and the model is currently undergoing evaluation. These efforts are addressing a number of identified issues concerning practicality, acceptability, usefulness, and validity. Preliminary analysis of the evaluation data that has been collected indicates that MAPPS will provide comprehensive and reliable data for PRA purposes and for a number of other applications. The MAPPS computer simulation model provides the user with a sophisticated tool for gaining insights into tasks performed by NPP maintenance personnel. Its wide variety of input parameters and output data makes it extremely flexible for a number of diverse applications. With the demonstration of favorable model evaluation results, the MAPPS model will represent a valuable source of NPP maintainer reliability data and provide PRA studies with a source of data on maintainers that has previously not existed.

  18. A New Model to Simulate Energy Performance of VRF Systems

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Tianzhen; Pang, Xiufeng; Schetrit, Oren; Wang, Liping; Kasahara, Shinichi; Yura, Yoshinori; Hinokuma, Ryohei

    2014-03-30

    This paper presents a new model to simulate energy performance of variable refrigerant flow (VRF) systems in heat pump operation mode (either cooling or heating is provided but not simultaneously). The main improvement of the new model is the introduction of the evaporating and condensing temperature in the indoor and outdoor unit capacity modifier functions. The independent variables in the capacity modifier functions of the existing VRF model in EnergyPlus are mainly room wet-bulb temperature and outdoor dry-bulb temperature in cooling mode and room dry-bulb temperature and outdoor wet-bulb temperature in heating mode. The new approach allows compliance with different specifications of each indoor unit so that the modeling accuracy is improved. The new VRF model was implemented in a custom version of EnergyPlus 7.2. This paper first describes the algorithm for the new VRF model, which is then used to simulate the energy performance of a VRF system in a Prototype House in California that complies with the requirements of Title 24, the California Building Energy Efficiency Standards. The VRF system performance is then compared with three other types of HVAC systems: the Title 24-2005 Baseline system, the traditional High Efficiency system, and the EnergyStar Heat Pump system in three typical California climates: Sunnyvale, Pasadena and Fresno. Calculated energy savings from the VRF systems are significant. The HVAC site energy savings range from 51 to 85 percent, while the TDV (Time Dependent Valuation) energy savings range from 31 to 66 percent compared to the Title 24 Baseline Systems across the three climates. The largest energy savings are in the Fresno climate, followed by Sunnyvale and Pasadena. The paper discusses various characteristics of the VRF systems contributing to the energy savings. It should be noted that these savings are calculated using the Title 24 prototype House D under standard operating conditions. Actual performance of the VRF systems for real
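
    To illustrate the idea of a capacity modifier function, a generic biquadratic curve of the kind used in EnergyPlus performance curves is sketched below; the coefficients are placeholders and the choice of independent variables (indoor wet-bulb and condensing temperature) only illustrates substituting a refrigerant-side temperature into the modifier, it is not the published curve fit.

```python
# Illustrative biquadratic capacity modifier,
#   CAPFT(x, y) = a + b*x + c*x^2 + d*y + e*y^2 + f*x*y,
# evaluated with indoor wet-bulb temperature and condensing temperature [deg C].
# The coefficients are placeholders, not the curve fit of the EnergyPlus VRF model.
COEFFS = (0.60, 0.030, -0.0002, -0.005, -0.0001, 0.0004)  # a, b, c, d, e, f (assumed)

def capacity_modifier(t_wb_indoor: float, t_cond: float) -> float:
    a, b, c, d, e, f = COEFFS
    return (a + b * t_wb_indoor + c * t_wb_indoor**2
              + d * t_cond + e * t_cond**2 + f * t_wb_indoor * t_cond)

rated_capacity_kw = 10.0  # assumed rated cooling capacity of one indoor unit
print(f"Available cooling capacity: {rated_capacity_kw * capacity_modifier(19.4, 45.0):.2f} kW")
```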

  19. MAPPS (Maintenance Personnel Performance Simulation): a computer simulation model for human reliability analysis

    International Nuclear Information System (INIS)

    Knee, H.E.; Haas, P.M.

    1985-01-01

    A computer model capable of generating reliable estimates of human performance measures in the nuclear power plant (NPP) maintenance context has been developed, sensitivity tested, and evaluated. The model, entitled MAPPS (Maintenance Personnel Performance Simulation), is of the simulation type and is task-oriented. It addresses a number of person-machine, person-environment, and person-person variables and is capable of providing the user with a rich spectrum of important performance measures including mean time for successful task performance by a maintenance team and maintenance team probability of task success. These two measures are particularly important as input to probabilistic risk assessment (PRA) studies, which were the primary impetus for the development of MAPPS. The simulation nature of the model, along with its generous input parameters and output variables, allows its usefulness to extend beyond its input to PRA.

  20. Maintenance personnel performance simulation (MAPPS): a model for predicting maintenance performance reliability in nuclear power plants

    International Nuclear Information System (INIS)

    Knee, H.E.; Krois, P.A.; Haas, P.M.; Siegel, A.I.; Ryan, T.G.

    1983-01-01

    The NRC has developed a structured, quantitative, predictive methodology in the form of a computerized simulation model for assessing maintainer task performance. The objective of the overall program is to develop, validate, and disseminate a practical, useful, and acceptable methodology for the quantitative assessment of NPP maintenance personnel reliability. The program was organized into four phases: (1) scoping study, (2) model development, (3) model evaluation, and (4) model dissemination. The program is currently nearing completion of Phase 2 - Model Development.

  1. Battery Performance Modelling and Simulation: a Neural Network Based Approach

    Science.gov (United States)

    Ottavianelli, Giuseppe; Donati, Alessandro

    2002-01-01

    This project developed against the background of ongoing research within the Control Technology Unit (TOS-OSC) of the Special Projects Division at the European Space Operations Centre (ESOC) of the European Space Agency. The purpose of this research is to develop and validate an Artificial Neural Network (ANN) tool able to model, simulate and predict the Cluster II battery system's performance degradation. (The Cluster II mission consists of four spacecraft flying in tetrahedral formation, aimed at observing and studying the interaction between the Sun and the Earth by passing in and out of our planet's magnetic field.) This prototype tool, named BAPER and developed with a commercial neural network toolbox, could be used to support short- and medium-term mission planning in order to improve and maximise the batteries' lifetime, determining the best future charge/discharge cycles for the batteries given their present states, in view of a Cluster II mission extension. This study focuses on the five silver-cadmium batteries on board Tango, the fourth Cluster II satellite, but time constraints have so far allowed an assessment of only the first battery. In their most basic form, ANNs are hyper-dimensional curve fits for non-linear data. With their remarkable ability to derive meaning from complicated or imprecise history data, ANNs can be used to extract patterns and detect trends that are too complex to be noticed by either humans or other computer techniques. ANNs learn by example, and this is why they can be described as inductive, or data-based, models for the simulation of input/target mappings. A trained ANN can be thought of as an "expert" in the category of information it has been given to analyse, and this expert can then be used, as in this project, to provide projections given new situations of interest and answer "what if" questions. The most appropriate algorithm, in terms of training speed and memory storage requirements, is clearly the Levenberg

  2. Maintenance Personnel Performance Simulation (MAPPS) model: description of model content, structure, and sensitivity testing. Volume 2

    International Nuclear Information System (INIS)

    Siegel, A.I.; Bartter, W.D.; Wolf, J.J.; Knee, H.E.

    1984-12-01

    This volume of NUREG/CR-3626 presents details of the content, structure, and sensitivity testing of the Maintenance Personnel Performance Simulation (MAPPS) model that was described in summary in volume one of this report. The MAPPS model is a generalized stochastic computer simulation model developed to simulate the performance of maintenance personnel in nuclear power plants. The MAPPS model considers workplace, maintenance technician, motivation, human factors, and task oriented variables to yield predictive information about the effects of these variables on successful maintenance task performance. All major model variables are discussed in detail and their implementation and interactive effects are outlined. The model was examined for disqualifying defects from a number of viewpoints, including sensitivity testing. This examination led to the identification of some minor recalibration efforts which were carried out. These positive results indicate that MAPPS is ready for initial and controlled applications which are in conformity with its purposes.

  3. OPNET Modeler simulations of performance for multi nodes wireless systems

    Directory of Open Access Journals (Sweden)

    Krupanek Beata

    2016-01-01

    This paper presents a study of Quality of Service in modern wireless sensor networks. Such networks are characterized by small amounts of data transmitted at fixed intervals. Very often these data must be transmitted in real time, so data transmission delays should be well known. This article shows a multi-node network simulated in the packet-based OPNET Modeler. Nowadays the quality of service is especially important in multi-node systems such as home automation or measurement systems.

  4. Modeling of HVAC operational faults in building performance simulation

    International Nuclear Information System (INIS)

    Zhang, Rongpeng; Hong, Tianzhen

    2017-01-01

    Highlights: •Discuss significance of capturing operational faults in existing buildings. •Develop a novel feature in EnergyPlus to model operational faults of HVAC systems. •Compare three approaches to fault modeling using EnergyPlus. •A case study demonstrates the use of the fault-modeling feature. •Future developments of new faults are discussed. -- Abstract: Operational faults are common in the heating, ventilating, and air conditioning (HVAC) systems of existing buildings, leading to a decrease in energy efficiency and occupant comfort. Various fault detection and diagnostic methods have been developed to identify and analyze HVAC operational faults at the component or subsystem level. However, current methods lack a holistic approach to predicting the overall impacts of faults at the building level—an approach that adequately addresses the coupling between various operational components, the synchronized effect between simultaneous faults, and the dynamic nature of fault severity. This study introduces the novel development of a fault-modeling feature in EnergyPlus which fills in the knowledge gap left by previous studies. This paper presents the design and implementation of the new feature in EnergyPlus and discusses in detail the fault-modeling challenges faced. The new fault-modeling feature enables EnergyPlus to quantify the impacts of faults on building energy use and occupant comfort, thus supporting decision making on timely fault corrections. Including actual building operational faults in energy models also improves the accuracy of the baseline model, which is critical in the measurement and verification of retrofit or commissioning projects. As an example, EnergyPlus version 8.6 was used to investigate the impacts of a number of typical operational faults in an office building across several U.S. climate zones. The results demonstrate that the faults have significant impacts on building energy performance as well as on occupant

  5. Modeling, Simulation and Performance Evaluation of Parabolic Trough

    African Journals Online (AJOL)

    Mekuannint

    Heat Transfer Fluid (HTF); TRNSYS power plant model; STEC library; Solar Advisor Model (SAM); TRNSYS solar field model; Solar Electric Generation System (SEGS). INTRODUCTION: Parabolic troughs are currently the most widely used means of power generation from solar sources. Solar electric generation systems (SEGS) ...

  6. Maintenance Personnel Performance Simulation (MAPPS) model. Users' Manual

    International Nuclear Information System (INIS)

    Kopstein, F.F.; Wolf, J.J.

    1985-09-01

    This report (MAPPS User's Manual) is the last report to be published from this program and provides detailed guidelines for utilization of the MAPPS model. Although the model has been developed to be highly user-friendly and provides interactive means for controlling and running the model, the user's manual is provided as a guide for the user in the event that clarification or direction is required. The user will find that in general the model requires primarily user input that is self-explanatory. Once initial familiarization with the model has been achieved by the user, the amount of interaction between the user's manual and the computer model will be minimal. It is suggested, however, that even the experienced user keep the user's manual handy for quick reference. 5 refs., 10 figs., 7 tabs.

  7. Modeling and simulation performance of sucker rod beam pump

    Energy Technology Data Exchange (ETDEWEB)

    Aditsania, Annisa, E-mail: annisaaditsania@gmail.com [Department of Computational Sciences, Institut Teknologi Bandung (Indonesia); Rahmawati, Silvy Dewi, E-mail: silvyarahmawati@gmail.com; Sukarno, Pudjo, E-mail: psukarno@gmail.com [Department of Petroleum Engineering, Institut Teknologi Bandung (Indonesia); Soewono, Edy, E-mail: esoewono@math.itb.ac.id [Department of Mathematics, Institut Teknologi Bandung (Indonesia)

    2015-09-30

    Artificial lift is a mechanism to lift hydrocarbons, generally petroleum, from a well to the surface. It is used when the natural pressure from the reservoir has significantly decreased. Sucker rod beam pumping is a method of artificial lift. The sucker rod beam pump is modeled in this research as a function of the geometry of the surface part, the size of the sucker rod string, and fluid properties. Besides its length, the sucker rod string is also classified into tapered and un-tapered. At the beginning of this research, for ease of modeling, the sucker rod string was assumed to be un-tapered. This assumption proved unrealistic, so a tapered sucker rod string model was built as well. The numerical solution of this sucker rod beam pump model is computed using the finite difference method. The numerical results show that the peak polished rod load for sucker rod beam pump unit C-456-D-256-120 is 38504.2 lb for the non-tapered sucker rod string and 25723.3 lb for the tapered rod string. For that reason, to avoid the sucker rod string breaking due to overload, the use of a tapered sucker rod string is suggested in this research.
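
    A minimal finite-difference sketch of the kind of rod-string dynamics referred to above (the damped one-dimensional wave equation commonly used for sucker rod strings) is given below; the rod properties, damping and surface motion are invented for illustration and this is not the authors' model.

```python
import numpy as np

# Illustrative explicit finite-difference solution of the damped wave equation
#   u_tt = a^2 * u_xx - c * u_t
# for displacement u(x, t) along a rod string. All values are assumed.
LENGTH = 1500.0          # rod length [m]
A = 4900.0               # approximate stress-wave speed in steel rods [m/s]
DAMPING = 0.5            # viscous damping coefficient [1/s] (assumed)
NX, DT, T_END = 301, 5e-4, 5.0

x = np.linspace(0.0, LENGTH, NX)
dx = x[1] - x[0]                      # CFL check: A*DT/dx ~ 0.49, stable
u_prev = np.zeros(NX)
u = np.zeros(NX)

for step in range(int(T_END / DT)):
    t = step * DT
    u_next = np.empty_like(u)
    lap = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx**2
    u_next[1:-1] = (2.0 * u[1:-1] - u_prev[1:-1]
                    + DT**2 * A**2 * lap
                    - DAMPING * DT * (u[1:-1] - u_prev[1:-1]))
    u_next[0] = 1.0 * np.sin(2.0 * np.pi * 0.1 * t)   # prescribed polished-rod motion [m]
    u_next[-1] = u_next[-2]                            # simplified condition at the pump end
    u_prev, u = u, u_next

print(f"Downhole displacement at t = {T_END:.1f} s: {u[-1]:.3f} m")
```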

  9. Modeling and Simulation of Ceramic Arrays to Improve Ballistic Performance

    Science.gov (United States)

    2014-04-30

    [Briefing-slide excerpt: depth-of-penetration (DOP) results for a SiC tile on an Al backing with and without an epoxy resin adhesive layer (DOP of 10.3 mm for impact with no adhesive; 17.2 mm with a 0.508 mm tile gap and no adhesive), plus a baseline seam assessment of 2 ft x 2 ft panels of sintered 4 in. square SiC (Superior Graphite) on Kevlar/phenolic with a 2-ply cover.]

  10. Surrogate model approach for improving the performance of reactive transport simulations

    Science.gov (United States)

    Jatnieks, Janis; De Lucia, Marco; Sips, Mike; Dransch, Doris

    2016-04-01

    Reactive transport models can serve a large number of important geoscientific applications involving underground resources in industry and scientific research. It is common for simulation of reactive transport to consist of at least two coupled simulation models. First is a hydrodynamics simulator that is responsible for simulating the flow of groundwaters and transport of solutes. Hydrodynamics simulators are well established technology and can be very efficient. When hydrodynamics simulations are performed without coupled geochemistry, their spatial geometries can span millions of elements even when running on desktop workstations. Second is a geochemical simulation model that is coupled to the hydrodynamics simulator. Geochemical simulation models are much more computationally costly. This is a problem that makes reactive transport simulations spanning millions of spatial elements very difficult to achieve. To address this problem we propose to replace the coupled geochemical simulation model with a surrogate model. A surrogate is a statistical model created to include only the necessary subset of simulator complexity for a particular scenario. To demonstrate the viability of such an approach we tested it on a popular reactive transport benchmark problem that involves 1D Calcite transport. This is a published benchmark problem (Kolditz, 2012) for simulation models and for this reason we use it to test the surrogate model approach. To do this we tried a number of statistical models available through the caret and DiceEval packages for R, to be used as surrogate models. These were trained on randomly sampled subset of the input-output data from the geochemical simulation model used in the original reactive transport simulation. For validation we use the surrogate model to predict the simulator output using the part of sampled input data that was not used for training the statistical model. For this scenario we find that the multivariate adaptive regression splines
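
    A minimal Python analogue of the surrogate-training workflow described above is sketched below; the authors worked in R with the caret and DiceEval packages, so here scikit-learn and synthetic data merely stand in for the geochemical simulator's sampled input-output pairs.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(42)

# Stand-in for sampled geochemical-simulator runs: the two inputs might be,
# e.g., an inflow concentration and pH; the output a reacted calcite amount.
# The function below is synthetic and only mimics having simulator samples.
X = rng.uniform([0.0, 6.0], [1.0, 9.0], size=(2000, 2))
y = np.exp(-3.0 * X[:, 0]) * (X[:, 1] - 6.0) + 0.01 * rng.normal(size=2000)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# The trained surrogate replaces the costly geochemistry step inside the coupling loop.
surrogate = GradientBoostingRegressor().fit(X_train, y_train)
print(f"Surrogate R^2 on held-out simulator samples: "
      f"{r2_score(y_test, surrogate.predict(X_test)):.3f}")
```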

  11. High Performance Electrical Modeling and Simulation Verification Test Suite - Tier I; TOPICAL

    International Nuclear Information System (INIS)

    SCHELLS, REGINA L.; BOGDAN, CAROLYN W.; WIX, STEVEN D.

    2001-01-01

    This document describes the High Performance Electrical Modeling and Simulation (HPEMS) Global Verification Test Suite (VERTS). The VERTS is a regression test suite used for verification of the electrical circuit simulation codes currently being developed by the HPEMS code development team. This document contains descriptions of the Tier I test cases.

  12. An accurate behavioral model for single-photon avalanche diode statistical performance simulation

    Science.gov (United States)

    Xu, Yue; Zhao, Tingchen; Li, Ding

    2018-01-01

    An accurate behavioral model is presented to simulate important statistical performance of single-photon avalanche diodes (SPADs), such as dark count and after-pulsing noise. The derived simulation model takes into account all important generation mechanisms of the two kinds of noise. For the first time, thermal agitation, trap-assisted tunneling and band-to-band tunneling mechanisms are simultaneously incorporated in the simulation model to evaluate dark count behavior of SPADs fabricated in deep sub-micron CMOS technology. Meanwhile, a complete carrier trapping and de-trapping process is considered in the after-pulsing model and a simple analytical expression is derived to estimate the after-pulsing probability. In particular, the key model parameters of avalanche triggering probability and the electric field dependence of excess bias voltage are extracted from Geiger-mode TCAD simulation, and this behavioral simulation model does not include any empirical parameters. The developed SPAD model is implemented in the Verilog-A behavioral hardware description language and successfully operated on the commercial Cadence Spectre simulator, showing good universality and compatibility. The model simulation results are in good accordance with the test data, validating high simulation accuracy.
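
    A behavioral flavour of the dark-count and after-pulsing statistics discussed above can be sketched with a simple Monte Carlo model; the rate, trap lifetime and after-pulsing probability below are illustrative numbers, not the extracted TCAD parameters, and cascaded after-pulses are ignored for brevity.

```python
import numpy as np

rng = np.random.default_rng(7)

# Illustrative SPAD noise statistics (parameters assumed, not from the paper):
DARK_COUNT_RATE = 200.0      # primary dark counts per second
AFTERPULSE_PROB = 0.03       # probability that an avalanche releases a trapped carrier later
TRAP_LIFETIME = 100e-9       # exponential trap release time constant [s]
T_OBS = 1.0                  # observation window [s]

# Primary dark counts: homogeneous Poisson process over the observation window.
n_primary = rng.poisson(DARK_COUNT_RATE * T_OBS)
primary_times = np.sort(rng.uniform(0.0, T_OBS, n_primary))

# After-pulses: each avalanche may trigger one delayed secondary event.
triggers = rng.random(n_primary) < AFTERPULSE_PROB
afterpulse_times = primary_times[triggers] + rng.exponential(TRAP_LIFETIME, triggers.sum())

print(f"{n_primary} primary dark counts, {triggers.sum()} after-pulses, "
      f"{n_primary + triggers.sum()} total events")
```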

  13. Performance Evaluation of UML2-Modeled Embedded Streaming Applications with System-Level Simulation

    Directory of Open Access Journals (Sweden)

    Arpinen Tero

    2009-01-01

    This article presents an efficient method to capture an abstract performance model of streaming-data real-time embedded systems (RTESs). Unified Modeling Language version 2 (UML2) is used for the performance modeling and as a front-end for a tool framework that enables simulation-based performance evaluation and design-space exploration. The adopted application meta-model in UML resembles the Kahn Process Network (KPN) model and is targeted at simulation-based performance evaluation. The application workload modeling is done using UML2 activity diagrams, and the platform is described with structural UML2 diagrams and model elements. These concepts are defined using a subset of the profile for Modeling and Analysis of Real-time and Embedded (MARTE) systems from OMG and custom stereotype extensions. The goal of the performance modeling and simulation is to achieve early estimates of task response times and processing element, memory, and on-chip network utilizations, among other information that is used for design-space exploration. As a case study, a video codec application on multiple processors is modeled, evaluated, and explored. In comparison to related work, this is the first proposal that defines a transformation between UML activity diagrams and streaming-data application workload meta-models and successfully adopts it for RTES performance evaluation.

  14. Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC).

    Energy Technology Data Exchange (ETDEWEB)

    Schultz, Peter Andrew

    2011-12-01

    The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M&S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. Achieving the objective of modeling the performance of a disposal scenario requires describing processes involved in waste form degradation and radionuclide release at the subcontinuum scale, beginning with mechanistic descriptions of chemical reactions and chemical kinetics at the atomic scale, and upscaling into effective, validated constitutive models for input to high-fidelity continuum scale codes for coupled multiphysics simulations of release and transport. Verification and validation (V&V) is required throughout the system to establish evidence-based metrics for the level of confidence in M&S codes and capabilities, including at the subcontinuum scale and the constitutive models they inform or generate. This report outlines the nature of the V&V challenge at the subcontinuum scale, an approach to incorporate V&V concepts into subcontinuum scale modeling and simulation (M&S), and a plan to incrementally incorporate effective V&V into subcontinuum scale M&S destined for use in the NEAMS Waste IPSC work flow to meet requirements of quantitative confidence in the constitutive models informed by subcontinuum scale phenomena.

  16. Performance of the general circulation models in simulating temperature and precipitation over Iran

    Science.gov (United States)

    Abbasian, Mohammadsadegh; Moghim, Sanaz; Abrishamchi, Ahmad

    2018-03-01

    General Circulation Models (GCMs) are advanced tools for impact assessment and climate change studies. Previous studies show that the performance of GCMs in simulating climate variables varies significantly over different regions. This study evaluates the performance of the Coupled Model Intercomparison Project phase 5 (CMIP5) GCMs in simulating temperature and precipitation over Iran. Simulations from 37 GCMs and observations from the Climatic Research Unit (CRU) were obtained for the period 1901-2005. Six measures of performance, namely mean bias, root mean square error (RMSE), Nash-Sutcliffe efficiency (NSE), linear correlation coefficient (r), Kolmogorov-Smirnov statistic (KS) and Sen's slope estimator, together with the Taylor diagram, are used for the evaluation. GCMs are ranked based on each statistic at seasonal and annual time scales. Results show that most GCMs perform reasonably well in simulating the annual and seasonal temperature over Iran. The majority of the GCMs have poor skill in simulating precipitation, particularly at the seasonal scale. Based on the results, the best GCMs to represent temperature and precipitation simulations over Iran are the CMCC-CMS (Euro-Mediterranean Center on Climate Change) and the MRI-CGCM3 (Meteorological Research Institute), respectively. The results are valuable for climate and hydrometeorological studies and can help water resources planners and managers choose the proper GCM based on their criteria.
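
    The evaluation statistics listed above are straightforward to compute; a compact sketch is shown below, with synthetic monthly series standing in for the CRU observations and one GCM, and with Sen's slope applied here to the model-minus-observation difference as one possible use.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic stand-ins for observed (CRU) and simulated (GCM) monthly temperatures,
# 1260 months covering 1901-2005.
obs = 15.0 + 10.0 * np.sin(np.linspace(0.0, 20.0 * np.pi, 1260)) + rng.normal(0, 1, 1260)
sim = obs + 0.5 + rng.normal(0, 1.5, 1260)      # a biased, noisier "model"

bias = np.mean(sim - obs)                                                  # mean bias
rmse = np.sqrt(np.mean((sim - obs) ** 2))                                  # root mean square error
nse = 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)     # Nash-Sutcliffe efficiency
r = np.corrcoef(sim, obs)[0, 1]                                            # linear correlation
ks = stats.ks_2samp(sim, obs).statistic                                    # Kolmogorov-Smirnov statistic
sen_slope = stats.theilslopes(sim - obs, np.arange(sim.size))[0]           # Sen's slope of the difference

print(f"bias={bias:.2f}  RMSE={rmse:.2f}  NSE={nse:.2f}  r={r:.2f}  KS={ks:.2f}  Sen={sen_slope:.2e}")
```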

  17. Performance modeling & simulation of complex systems (A systems engineering design & analysis approach)

    Science.gov (United States)

    Hall, Laverne

    1995-01-01

    Modeling of the Multi-mission Image Processing System (MIPS) will be described as an example of the use of a modeling tool to design a distributed system that supports multiple application scenarios. This paper examines: (a) modeling tool selection, capabilities, and operation (namely NETWORK 2.5 by CACI), (b) pointers for building or constructing a model and how the MIPS model was developed, (c) the importance of benchmarking or testing the performance of equipment/subsystems being considered for incorporation into the design/architecture, (d) the essential step of model validation and/or calibration using the benchmark results, (e) sample simulation results from the MIPS model, and (f) how modeling and simulation analysis affected the MIPS design process by having a supportive and informative impact.

  18. Integrating Soft Set Theory and Fuzzy Linguistic Model to Evaluate the Performance of Training Simulation Systems.

    Science.gov (United States)

    Chang, Kuei-Hu; Chang, Yung-Chia; Chain, Kai; Chung, Hsiang-Yu

    2016-01-01

    The advancement of high technologies and the arrival of the information age have caused changes to modern warfare. The military forces of many countries have partially replaced real training drills with training simulation systems to achieve combat readiness. However, many different types of training simulation systems are used in military settings. In addition, differences in system set-up time, functions, the environment, and the competency of system operators, as well as incomplete information, have made it difficult to evaluate the performance of training simulation systems. To address the aforementioned problems, this study integrated the analytic hierarchy process (AHP), soft set theory, and the fuzzy linguistic representation model to evaluate the performance of various training simulation systems. Furthermore, importance-performance analysis was adopted to examine the influence of cost savings and training safety of training simulation systems. The findings of this study are expected to facilitate the application of military training simulation systems, avoid waste of resources (e.g., low utility and idle time), and provide data for subsequent applications and analysis. To verify the method proposed in this study, numerical examples of the performance evaluation of training simulation systems were adopted and compared with the numerical results of an AHP and a novel AHP-based ranking technique. The results verified that not only could expert-provided questionnaire information be fully considered to lower the repetition rate of performance ranking, but a two-dimensional graph could also be used to help administrators allocate limited resources, thereby enhancing the investment benefits and training effectiveness of a training simulation system.
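
    For the analytic hierarchy process part of the approach above, the criterion weights come from the principal eigenvector of a pairwise-comparison matrix; a minimal sketch with an invented three-criterion matrix follows (the soft-set and fuzzy-linguistic stages are not reproduced here).

```python
import numpy as np

# Invented pairwise-comparison matrix for three evaluation criteria
# (e.g. set-up time, training safety, cost); entries follow Saaty's 1-9 scale.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                      # AHP priority vector

# Consistency ratio: CR = ((lambda_max - n) / (n - 1)) / RI, with RI = 0.58 for n = 3.
lam_max, n = eigvals.real[k], A.shape[0]
cr = ((lam_max - n) / (n - 1)) / 0.58
print(f"weights = {np.round(weights, 3)}, consistency ratio = {cr:.3f}")
```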

  19. Weather model performance on extreme rainfall events simulation's over Western Iberian Peninsula

    Science.gov (United States)

    Pereira, S. C.; Carvalho, A. C.; Ferreira, J.; Nunes, J. P.; Kaiser, J. J.; Rocha, A.

    2012-08-01

    This study evaluates the performance of the WRF-ARW numerical weather model in simulating the spatial and temporal patterns of an extreme rainfall period over a complex orographic region in north-central Portugal. The analysis was performed for December 2009, during the mainland Portugal rainy season. The heavy to extremely heavy rainfall periods were due to several surface low-pressure systems associated with frontal surfaces. The total amount of precipitation for December exceeded, on average, the climatological mean for the 1971-2000 period by 89 mm, varying from 190 mm (southern part of the country) to 1175 mm (northern part of the country). Three model runs were conducted to assess possible improvements in model performance: (1) the WRF-ARW is forced with the initial fields from a global domain model (RunRef); (2) data assimilation for a specific location (RunObsN) is included; (3) nudging is used to adjust the analysis field (RunGridN). Model performance was evaluated against an observed hourly precipitation dataset of 15 rainfall stations using several statistical parameters. The WRF-ARW model reproduced the temporal rainfall patterns well but tended to overestimate precipitation amounts. The RunGridN simulation provided the best results, but model performance of the other two runs was also good, so the selected extreme rainfall episode was successfully reproduced.

  20. Shoulder Arthroscopy Simulator Training Improves Shoulder Arthroscopy Performance in a Cadaver Model

    Science.gov (United States)

    Henn, R. Frank; Shah, Neel; Warner, Jon J.P.; Gomoll, Andreas H.

    2013-01-01

    Purpose: The purpose of this study was to quantify the benefits of shoulder arthroscopy simulator training with a cadaver model of shoulder arthroscopy. Methods: Seventeen first-year medical students with no prior experience in shoulder arthroscopy were enrolled and completed this study. Each subject completed a baseline proctored arthroscopy on a cadaveric shoulder, which included controlling the camera and completing a standard series of tasks using the probe. The subjects were randomized, and nine of the subjects received training on a virtual reality simulator for shoulder arthroscopy. All subjects then repeated the same cadaveric arthroscopy. The arthroscopic videos were analyzed in a blinded fashion for time to task completion and subjective assessment of technical performance. The two groups were compared with Student's t tests, and change over time within groups was analyzed with paired t tests. Results: There were no observed differences between the two groups on the baseline evaluation. The simulator group improved significantly from baseline with respect to time to completion and subjective performance. Shoulder arthroscopy simulator training resulted in significant benefits in clinical shoulder arthroscopy time to task completion in this cadaver model. This study provides important additional evidence of the benefit of simulators in orthopaedic surgical training. Clinical Relevance: There may be a role for simulator training in shoulder arthroscopy education. PMID:23591380

  2. Thermodynamic simulation model for predicting the performance of spark ignition engines using biogas as fuel

    International Nuclear Information System (INIS)

    Nunes de Faria, Mário M.; Vargas Machuca Bueno, Juan P.; Ayad, Sami M.M. Elmassalami; Belchior, Carlos R. Pereira

    2017-01-01

    Highlights: • A 0-D model for performance prediction of SI ICE fueled with biogas is proposed. • Relative difference between simulated and experimental values was under 5%. • Can be adapted for different biogas compositions and operating ranges. • Could be a valuable tool for predicting trends and guiding experimentation. • Is suitable for use with biogas supplies in developing regions. - Abstract: Biogas has found its way from developing countries and is now an alternative to fossil fuels in internal combustion engines, with the advantage of lower greenhouse gas emissions. However, its use in gas engines requires engine modifications or adaptations that may be costly. This paper reports the results of experimental performance and emissions tests of an engine-generator unit fueled with biogas produced in a sewage plant in Brazil, operating under different loads, and with suitable engine modifications. These emissions and performance results were in agreement with the literature and it was confirmed that the penalties to engine performance were more significant than the emission reduction in the operating range tested. Furthermore, a zero-dimensional simulation model was employed to predict performance characteristics. Moreover, a system of thermodynamic differential equations was solved, obtaining the pressure inside the cylinder as a function of the crank angle for different engine conditions. Mean effective pressure and indicated power were also obtained. The results of simulation and experimental tests of the engine in similar conditions were compared and the model validated. Although several simplifying assumptions were adopted and empirical correlations were used for the Wiebe function, the model was adequate in predicting engine performance as the relative difference between simulated and experimental values was lower than 5%. The model can be adapted for use with different raw or enriched biogas compositions and could prove to be a valuable tool to guide
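
    The Wiebe function referred to above, which gives the mass fraction burned as a function of crank angle in zero-dimensional spark-ignition engine models, can be sketched as follows; the efficiency and form factors are typical textbook values, not the ones fitted to the biogas engine data in the paper.

```python
import numpy as np

def wiebe_burn_fraction(theta, theta_start=-15.0, burn_duration=50.0, a=5.0, m=2.0):
    """Mass fraction burned vs crank angle [deg]:
    x_b = 1 - exp(-a * ((theta - theta_0) / dtheta)^(m + 1)).

    a and m are typical textbook efficiency/form factors, not the values
    fitted from the biogas engine experiments in the paper.
    """
    x = np.clip((theta - theta_start) / burn_duration, 0.0, 1.0)
    return 1.0 - np.exp(-a * x ** (m + 1.0))

crank = np.linspace(-30.0, 60.0, 10)
for th, xb in zip(crank, wiebe_burn_fraction(crank)):
    print(f"theta = {th:6.1f} deg  burned fraction = {xb:.3f}")
```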

  3. Comparison of Two Models for Damage Accumulation in Simulations of System Performance

    Energy Technology Data Exchange (ETDEWEB)

    Youngblood, R. [Idaho National Laboratory, Idaho Falls, ID (United States); Mandelli, D. [Idaho National Laboratory, Idaho Falls, ID (United States)

    2015-11-01

    A comprehensive simulation study of system performance needs to address variations in component behavior, variations in phenomenology, and the coupling between phenomenology and component failure. This paper discusses two models of this: 1. damage accumulation is modeled as a random walk process in each time history, with component failure occurring when damage accumulation reaches a specified threshold; or 2. damage accumulation is modeled mechanistically within each time history, but failure occurs when damage reaches a time-history-specific threshold, sampled at time zero from each component’s distribution of damage tolerance. A limiting case of the latter is classical discrete-event simulation, with component failure times sampled a priori from failure time distributions; but in such models, the failure times are not typically adjusted for operating conditions varying within a time history. Nowadays, as discussed below, it is practical to account for this. The paper compares the interpretations and computational aspects of the two models mentioned above.
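
    A minimal Monte Carlo sketch contrasting the two failure representations described above is shown below; the damage rates, thresholds and distributions are invented for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)
N_HISTORIES, N_STEPS = 10_000, 500
DAMAGE_RATE, DAMAGE_STD = 0.002, 0.01     # per-step damage increment statistics (assumed)

# Model 1: damage accumulates as a random walk within each time history;
# failure occurs when the accumulated damage crosses a fixed threshold of 1.0.
damage = np.cumsum(rng.normal(DAMAGE_RATE, DAMAGE_STD, (N_HISTORIES, N_STEPS)), axis=1)
fail_1 = (damage >= 1.0).any(axis=1)

# Model 2: damage accumulates mechanistically (deterministically here), but each
# component draws its own damage-tolerance threshold at time zero.
thresholds = rng.lognormal(mean=0.0, sigma=0.3, size=N_HISTORIES)
fail_2 = DAMAGE_RATE * N_STEPS >= thresholds

print(f"failure probability, random-walk damage model: {fail_1.mean():.3f}")
print(f"failure probability, sampled-threshold model:  {fail_2.mean():.3f}")
```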

  4. High-Performance Modeling of Carbon Dioxide Sequestration by Coupling Reservoir Simulation and Molecular Dynamics

    KAUST Repository

    Bao, Kai

    2015-10-26

    The present work describes a parallel computational framework for carbon dioxide (CO2) sequestration simulation by coupling reservoir simulation and molecular dynamics (MD) on massively parallel high-performance-computing (HPC) systems. In this framework, a parallel reservoir simulator, reservoir-simulation toolbox (RST), solves the flow and transport equations that describe the subsurface flow behavior, whereas the MD simulations are performed to provide the required physical parameters. Technologies from several different fields are used to make this novel coupled system work efficiently. One of the major applications of the framework is the modeling of large-scale CO2 sequestration for long-term storage in subsurface geological formations, such as depleted oil and gas reservoirs and deep saline aquifers, which has been proposed as one of the few attractive and practical solutions to reduce CO2 emissions and address the global-warming threat. Fine grids and accurate prediction of the properties of fluid mixtures under geological conditions are essential for accurate simulations. In this work, CO2 sequestration is presented as a first example for coupling reservoir simulation and MD, although the framework can be extended naturally to the full multiphase multicomponent compositional flow simulation to handle more complicated physical processes in the future. Accuracy and scalability analysis are performed on an IBM BlueGene/P and on an IBM BlueGene/Q, the latest IBM supercomputer. Results show good accuracy of our MD simulations compared with published data, and good scalability is observed with the massively parallel HPC systems. The performance and capacity of the proposed framework are well-demonstrated with several experiments with hundreds of millions to one billion cells. To the best of our knowledge, the present work represents the first attempt to couple reservoir simulation and molecular simulation for large-scale modeling. Because of the complexity of

  5. Methodology to evaluate the performance of simulation models for alternative compiler and operating system configurations

    Science.gov (United States)

    Simulation modelers increasingly require greater flexibility for model implementation on diverse operating systems, and they demand high computational speed for efficient iterative simulations. Additionally, model users may differ in preference for proprietary versus open-source software environment...

  6. Durango: Scalable Synthetic Workload Generation for Extreme-Scale Application Performance Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Carothers, Christopher D. [Rensselaer Polytechnic Institute (RPI)]; Meredith, Jeremy S. [ORNL]; Blanco, Marc [Rensselaer Polytechnic Institute (RPI)]; Vetter, Jeffrey S. [ORNL]; Mubarak, Misbah [Argonne National Laboratory]; LaPre, Justin [Rensselaer Polytechnic Institute (RPI)]; Moore, Shirley V. [ORNL]

    2017-05-01

    Performance modeling of extreme-scale applications on accurate representations of potential architectures is critical for designing next generation supercomputing systems because it is impractical to construct prototype systems at scale with new network hardware in order to explore designs and policies. However, these simulations often rely on static application traces that can be difficult to work with because of their size and lack of flexibility to extend or scale up without rerunning the original application. To address this problem, we have created a new technique for generating scalable, flexible workloads from real applications and have implemented a prototype, called Durango, that combines a proven analytical performance modeling language, Aspen, with the massively parallel HPC network modeling capabilities of the CODES framework. Our models are compact, parameterized and representative of real applications with computation events. They are not resource intensive to create and are portable across simulator environments. We demonstrate the utility of Durango by simulating the LULESH application in the CODES simulation environment on several topologies and show that Durango is practical to use for simulation without loss of fidelity, as quantified by simulation metrics. During our validation of Durango's generated communication model of LULESH, we found that the original LULESH miniapp code had a latent bug where the MPI_Waitall operation was used incorrectly. This finding underscores the potential need for a tool such as Durango, beyond its benefits for flexible workload generation and modeling. Additionally, we demonstrate the efficacy of Durango's direct integration approach, which links Aspen into CODES as part of the running network simulation model. Here, Aspen generates the application-level computation timing events, which in turn drive the start of a network communication phase. Results show that Durango's performance scales well when

  7. Science-Based Simulation Model of Human Performance for Human Reliability Analysis

    International Nuclear Information System (INIS)

    Kelly, Dana L.; Boring, Ronald L.; Mosleh, Ali; Smidts, Carol

    2011-01-01

    Human reliability analysis (HRA), a component of an integrated probabilistic risk assessment (PRA), is the means by which the human contribution to risk is assessed, both qualitatively and quantitatively. However, among the literally dozens of HRA methods that have been developed, most cannot fully model and quantify the types of errors that occurred at Three Mile Island. Furthermore, all of the methods lack a solid empirical basis, relying heavily on expert judgment or empirical results derived in non-reactor domains. Finally, all of the methods are essentially static, and are thus unable to capture the dynamics of an accident in progress. The objective of this work is to begin exploring a dynamic simulation approach to HRA, one whose models have a basis in psychological theories of human performance, and whose quantitative estimates have an empirical basis. This paper highlights a plan to formalize collaboration among the Idaho National Laboratory (INL), the University of Maryland, and The Ohio State University (OSU) to continue development of a simulation model initially formulated at the University of Maryland. Initial work will focus on enhancing the underlying human performance models with the most recent psychological research, and on planning follow-on studies to establish an empirical basis for the model, based on simulator experiments to be carried out at the INL and at the OSU.

  8. A Dynamic Simulation Model of Organizational Culture and Business Strategy Effects on Performance

    Science.gov (United States)

    Trivellas, Panagiotis; Reklitis, Panagiotis; Konstantopoulos, Nikolaos

    2007-12-01

    In the past two decades, the organizational culture literature has gained tremendous interest among both academics and practitioners. This is based not only on the suggestion that culture is related to performance, but also on the view that it is subject to direct managerial control and manipulation in the desired direction. In the present paper, we adopt the Competing Values Framework (CVF) to operationalise organizational culture and Porter's typology to conceptualize business strategy (cost leadership, innovative and marketing differentiation, and focus). Although the simulation of social events is a quite difficult task, since there are so many considerations (not all well understood) involved, in the present study we developed a dynamic model to simulate the effects of organizational culture and strategy on financial performance. Data obtained from a six-year survey in the banking sector of a European developing economy were used to develop the proposed dynamic model.

  9. High-performance modeling of CO2 sequestration by coupling reservoir simulation and molecular dynamics

    KAUST Repository

    Bao, Kai

    2013-01-01

    The present work describes a parallel computational framework for CO2 sequestration simulation by coupling reservoir simulation and molecular dynamics (MD) on massively parallel HPC systems. In this framework, a parallel reservoir simulator, Reservoir Simulation Toolbox (RST), solves the flow and transport equations that describe the subsurface flow behavior, while the molecular dynamics simulations are performed to provide the required physical parameters. Numerous technologies from different fields are employed to make this novel coupled system work efficiently. One of the major applications of the framework is the modeling of large-scale CO2 sequestration for long-term storage in subsurface geological formations, such as depleted reservoirs and deep saline aquifers, which has been proposed as one of the most attractive and practical solutions to reduce CO2 emissions and address the global-warming threat. To effectively solve such problems, fine grids and accurate prediction of the properties of fluid mixtures are essential for accurate simulation. In this work, CO2 sequestration is presented as our first example of coupling reservoir simulation and molecular dynamics, while the framework can be extended naturally to the full multiphase multicomponent compositional flow simulation to handle more complicated physical processes in the future. Accuracy and scalability analyses are performed on an IBM BlueGene/P and on an IBM BlueGene/Q, the latest IBM supercomputer. Results show good accuracy of our MD simulations compared with published data, and good scalability is observed with the massively parallel HPC systems. The performance and capacity of the proposed framework are well demonstrated with several experiments with hundreds of millions to a billion cells. To the best of our knowledge, this work represents the first attempt to couple reservoir simulation and molecular simulation for large-scale modeling. Due to the complexity of the subsurface systems

  10. Teamwork skills, shared mental models, and performance in simulated trauma teams: an independent group design

    Directory of Open Access Journals (Sweden)

    Westli Heidi

    2010-08-01

    Background Non-technical skills are seen as an important contributor to reducing adverse events and improving medical management in healthcare teams. Previous research on the effectiveness of teams has suggested that shared mental models facilitate coordination and team performance. The purpose of the study was to investigate whether demonstrated teamwork skills and behaviour indicating shared mental models would be associated with observed improved medical management in trauma team simulations. Methods Revised versions of the 'Anesthetists' Non-Technical Skills Behavioural marker system' and 'Anti-Air Teamwork Observation Measure' were field tested in moment-to-moment observation of 27 trauma team simulations in Norwegian hospitals. Independent subject matter experts rated medical management in the teams. An independent group design was used to explore differences in teamwork skills between higher-performing and lower-performing teams. Results Specific teamwork skills and behavioural markers were associated with indicators of good team performance. Higher- and lower-performing teams differed in information exchange, supporting behaviour and communication, with higher-performing teams showing more effective information exchange and communication, and fewer supporting behaviours. Behavioural markers of shared mental models predicted effective medical management better than teamwork skills. Conclusions The present study replicates and extends previous research by providing new empirical evidence of the significance of specific teamwork skills and a shared mental model for the effective medical management of trauma teams. In addition, the study underlines the generic nature of teamwork skills by demonstrating their transferability from different clinical simulations such as the anaesthesia environment to trauma care, as well as the potential usefulness of behavioural frequency analysis in future research on non-technical skills.

  11. The Maintenance Personnel Performance Simulation (MAPPS) model: A human reliability analysis tool

    International Nuclear Information System (INIS)

    Knee, H.E.

    1985-01-01

    The Maintenance Personnel Performance Simulation (MAPPS) model is a computerized, stochastic, task-oriented human behavioral model developed to provide estimates of nuclear power plant (NPP) maintenance team performance measures. It is capable of addressing person-machine, person-environment, and person-person relationships, and accounts for interdependencies that exist between the subelements that make up the maintenance task of interest. The primary measures of performance estimated by MAPPS are: 1) the probability of successfully completing the task of interest and 2) the task duration time. MAPPS also estimates a host of other performance indices, including the probability of an undetected error, identification of the most- and least-likely error-prone subelements, and maintenance team stress profiles during task execution

  12. Improving firm performance in out-of-equilibrium, deregulated markets using feedback simulation models

    International Nuclear Information System (INIS)

    Gary, S.; Larsen, E.R.

    2000-01-01

    Deregulation has reshaped the utility sector in many countries around the world. Organisations in these deregulated industries must adopt new policies to guide strategic decisions, made in an uncertain and unfamiliar environment, that determine the short- and long-term fate of their companies. Traditional economic equilibrium models do not adequately address the issues facing these organisations in the shift towards deregulated market competition. Equilibrium assumptions break down in the out-of-equilibrium transition to competitive markets, and therefore different underpinning assumptions must be adopted in order to guide management in these periods. Simulation models incorporating information feedback through behavioural policies fill the void left by equilibrium models and support strategic policy analysis in out-of-equilibrium markets. As an example, we present a feedback simulation model developed to examine firm- and industry-level performance consequences of new generation capacity investment policies in the deregulated UK electricity sector. The model explicitly captures the behavioural decision policies of boundedly rational managers and avoids equilibrium assumptions. Such models are essential to help managers evaluate the performance impact of various strategic policies in environments in which disequilibrium behaviour dominates. (Author)
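
    The kind of feedback simulation described above can be illustrated with a very small system-dynamics-style loop. The sketch below is not the authors' UK electricity model; it is a hypothetical capacity-investment loop, with invented parameter values, in which price responds to the reserve margin and boundedly rational investors order new plant, subject to a construction delay, whenever price exceeds a hurdle. The delay combined with the behavioural rule produces out-of-equilibrium boom-and-bust cycles rather than a static equilibrium.

        # All parameter values below are invented for illustration.
        YEARS, DT = 30, 0.25
        capacity, demand = 60.0, 55.0               # GW installed / GW peak demand
        demand_growth = 0.02                        # fraction per year
        base_price, scarcity_slope = 30.0, 400.0    # currency/MWh
        hurdle_price, target_margin = 45.0, 0.15
        build_delay, order_rate = 4.0, 2.0          # years, GW/year ordered while investing
        pipeline = []                               # (completion_step, GW) under construction

        for k in range(int(YEARS / DT)):
            demand *= 1 + demand_growth * DT
            margin = (capacity - demand) / demand
            price = base_price + scarcity_slope * max(0.0, target_margin - margin)
            if price > hurdle_price:                                  # behavioural investment rule
                pipeline.append((k + round(build_delay / DT), order_rate * DT))
            capacity += sum(gw for due, gw in pipeline if due == k)   # commission finished plant
            pipeline = [(due, gw) for due, gw in pipeline if due > k]
            capacity *= 1 - 0.02 * DT                                 # retirements
            if k % 8 == 0:
                print(f"year {k * DT:4.1f}: capacity {capacity:6.1f} GW, "
                      f"demand {demand:6.1f} GW, price {price:6.1f}")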

  13. Power converter topologies for wind energy conversion systems: Integrated modeling, control strategy and performance simulation

    Energy Technology Data Exchange (ETDEWEB)

    Melicio, R.; Catalao, J.P.S. [Department of Electromechanical Engineering, University of Beira Interior, R. Fonte do Lameiro, 6201-001 Covilha (Portugal); Mendes, V.M.F. [Department of Electrical Engineering and Automation, Instituto Superior de Engenharia de Lisboa, R. Conselheiro Emidio Navarro, 1950-062 Lisbon (Portugal)

    2010-10-15

    This paper presents a new integrated model for variable-speed wind energy conversion systems, considering more accurate dynamics of the wind turbine, rotor, generator, power converter and filter. Pulse width modulation by space vector modulation associated with sliding mode is used for controlling the power converters. Also, power factor control is introduced at the output of the power converters. Comprehensive performance simulation studies are carried out with matrix, two-level and multilevel power converter topologies in order to adequately assess the system performance. Conclusions are duly drawn. (author)

  14. Performance of process-based models for simulation of grain N in crop rotations across Europe

    DEFF Research Database (Denmark)

    Yin, Xiaogang; Kersebaum, KC; Kollas, C

    2017-01-01

    The accurate estimation of crop grain nitrogen (N; N in grain yield) is crucial for optimizing agricultural N management, especially in crop rotations. In the present study, 12 process-based models were applied to simulate the grain N of i) seven crops in rotations, ii) across various pedo...... (Brassica napus L.). These differences are linked to the intensity of parameterization with better parameterized crops showing lower prediction errors. The model performance was influenced by N fertilization and irrigation treatments, and a majority of the predictions were more accurate under low N...

  15. Propagation of uncertainty in nasal spray in vitro performance models using Monte Carlo simulation: Part II. Error propagation during product performance modeling.

    Science.gov (United States)

    Guo, Changning; Doub, William H; Kauffman, John F

    2010-08-01

    Monte Carlo simulations were applied to investigate the propagation of uncertainty in both input variables and response measurements on model prediction for nasal spray product performance design of experiment (DOE) models in the first part of this study, with an initial assumption that the models perfectly represent the relationship between input variables and the measured responses. In this article, we discard the initial assumption and extend the Monte Carlo simulation study to examine the influence of both input variable variation and product performance measurement variation on the uncertainty in DOE model coefficients. The Monte Carlo simulations presented in this article illustrate the importance of careful error propagation during product performance modeling. Our results show that the error estimates based on Monte Carlo simulation result in smaller model coefficient standard deviations than those from regression methods. This suggests that the estimated standard deviations from regression may overestimate the uncertainties in the model coefficients. Monte Carlo simulations provide a simple software solution to understand the propagation of uncertainty in complex DOE models so that design space can be specified with statistically meaningful confidence levels.
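
    The error-propagation idea can be sketched as follows: perturb both the factor settings and the measured responses within assumed measurement uncertainties, refit the DOE regression many times, and read the coefficient uncertainty off the resulting distribution. The factor levels, responses and uncertainties below are invented for illustration and are not the nasal spray data used in the study.

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical two-factor DOE for a spray response (e.g., droplet size); the
        # factor levels, responses and uncertainties are illustrative only.
        X = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1], [0, 0], [0, 0]], dtype=float)
        y = np.array([42.0, 47.5, 55.0, 61.0, 51.0, 50.4])
        sx, sy = 0.05, 0.8   # assumed std. dev. of factor settings and of the response

        def fit(Xd, yd):
            """Ordinary least squares for y = b0 + b1*x1 + b2*x2."""
            A = np.column_stack([np.ones(len(Xd)), Xd])
            coef, *_ = np.linalg.lstsq(A, yd, rcond=None)
            return coef

        nominal = fit(X, y)
        # Perturb inputs and responses, refit, and collect the coefficient distribution.
        draws = np.array([fit(X + rng.normal(0, sx, X.shape),
                              y + rng.normal(0, sy, y.shape)) for _ in range(5000)])
        print("nominal coefficients:", nominal.round(3))
        print("Monte Carlo coefficient std devs:", draws.std(axis=0).round(3))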

  16. Trickle bed reactor model to simulate the performance of commercial diesel hydrotreating unit

    Energy Technology Data Exchange (ETDEWEB)

    C. Murali; R.K. Voolapalli; N. Ravichander; D.T. Gokak; N.V. Choudary [Bharat Petroleum Corporation Ltd., Udyog Kendra (India). Corporate R&D Centre]

    2007-05-15

    A two-phase mathematical model was developed to simulate the performance of bench-scale and commercial hydrotreating reactors. Major hydrotreating reactions, namely hydrodesulphurization, hydrodearomatization and olefin saturation, were modeled. Experiments were carried out in a fixed bed reactor to study the effect of different process variables, and these results were used for estimating kinetic parameters. A significant amount of feed vaporization (20-50%) was estimated under normal DHDS operating conditions, suggesting the importance of considering feed vaporization in DHDS modeling. The model was validated with plant operating data at close to ultra-low sulphur levels by correctly accounting for feed vaporization in the heat balance relations and appropriate use of hydrodynamic correlations. The model could adequately predict the product quality, reactor bed temperature profiles and chemical hydrogen consumption in the commercial plant. 14 refs., 7 figs., 6 tabs.

  17. Towards a benchmark simulation model for plant-wide control strategy performance evaluation of WWTPs

    DEFF Research Database (Denmark)

    Jeppsson, Ulf; Rosen, Christian; Alex, Jens

    2006-01-01

    The COST/IWA benchmark simulation model has been available for seven years. Its primary purpose has been to create a platform for control strategy benchmarking of activated sludge processes. The fact that the benchmark has resulted in more than 100 publications, not only in Europe but also...... worldwide, demonstrates the interest in such a tool within the research community. In this paper, an extension of the benchmark simulation model no 1 (BSM1) is proposed. This extension aims at facilitating control strategy development and performance evaluation at a plant-wide level and, consequently...... the changes, the evaluation period has been extended to one year. A prolonged evaluation period allows for long-term control strategies to be assessed and enables the use of control handles that cannot be evaluated in a realistic fashion in the one-week BSM1 evaluation period. In the paper, the extended plant...

  18. Simulation of the hydraulic performance of highway filter drains through laboratory models and stormwater management tools.

    Science.gov (United States)

    Sañudo-Fontaneda, Luis A; Jato-Espino, Daniel; Lashford, Craig; Coupe, Stephen J

    2017-05-23

    Road drainage is one of the most relevant assets in transport infrastructure due to its inherent influence on traffic management and road safety. Highway filter drains (HFDs), also known as "French Drains", are the main drainage system currently in use in the UK, throughout 7000 km of its strategic road network. Despite being a widespread technique across the whole country, little research has been completed on their design considerations and the subsequent impact on their hydraulic performance, representing a gap in the field. Laboratory experiments have been proven to be a reliable indicator for the simulation of the hydraulic performance of stormwater best management practices (BMPs). In addition to this, stormwater management tools (SMT) have been preferentially chosen as a design tool for BMPs by practitioners from all over the world. In this context, this research aims to investigate the hydraulic performance of HFDs by comparing the results from laboratory simulation with those from two widely used SMT, the US EPA's stormwater management model (SWMM) and MicroDrainage®. Statistical analyses were applied to a series of simulated rainfall scenarios, showing a high level of accuracy between the laboratory results and those obtained using SMT, as indicated by the high Nash-Sutcliffe and R2 coefficients and the low root-mean-square error (RMSE) reached, which validated the usefulness of SMT to determine the hydraulic performance of HFDs.
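
    The agreement statistics named above are straightforward to compute; a minimal sketch, using invented laboratory and SMT outflow hydrographs rather than the study's data, is given below.

        import numpy as np

        def nash_sutcliffe(obs, sim):
            """Nash-Sutcliffe efficiency: 1 is a perfect match, 0 is no better than the mean."""
            obs, sim = np.asarray(obs, float), np.asarray(sim, float)
            return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

        def rmse(obs, sim):
            return float(np.sqrt(np.mean((np.asarray(obs) - np.asarray(sim)) ** 2)))

        # Illustrative outflow hydrographs (L/s) from a laboratory rig and an SMT run.
        lab = [0.0, 0.4, 1.1, 1.8, 2.0, 1.6, 1.0, 0.5, 0.2]
        smt = [0.0, 0.5, 1.0, 1.7, 2.1, 1.5, 1.1, 0.6, 0.2]
        print(f"NSE = {nash_sutcliffe(lab, smt):.3f}, RMSE = {rmse(lab, smt):.3f} L/s")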

  19. The cognitive environment simulation as a tool for modeling human performance and reliability

    International Nuclear Information System (INIS)

    Woods, D.D.; Pople, H. Jr.; Roth, E.M.

    1990-01-01

    The US Nuclear Regulatory Commission is sponsoring a research program to develop improved methods to model the cognitive behavior of nuclear power plant (NPP) personnel. Under this program, a tool for simulating how people form intentions to act in NPP emergency situations was developed using artificial intelligence (AI) techniques. This tool is called the Cognitive Environment Simulation (CES). The Cognitive Reliability Assessment Technique (or CREATE) was also developed to specify how CES can be used to enhance the measurement of the human contribution to risk in probabilistic risk assessment (PRA) studies. The next step in the research program was to evaluate the modeling tool and the method for using the tool for Human Reliability Analysis (HRA) in PRAs. Three evaluation activities were conducted. First, a panel of highly distinguished experts in cognitive modeling, AI, PRA and HRA provided a technical review of the simulation development work. Second, based on panel recommendations, CES was exercised on a family of steam generator tube rupture incidents where empirical data on operator performance already existed. Third, a workshop with HRA practitioners was held to analyze a worked example of the CREATE method to evaluate the role of CES/CREATE in HRA. The results of all three evaluations indicate that CES/CREATE represents a promising approach to modeling operator intention formation during emergency operations

  20. High-Performance Modeling of Carbon Dioxide Sequestration by Coupling Reservoir Simulation and Molecular Dynamics

    KAUST Repository

    Bao, Kai; Yan, Mi; Allen, Rebecca; Salama, Amgad; Lu, Ligang; Jordan, Kirk E.; Sun, Shuyu; Keyes, David E.

    2015-01-01

    The present work describes a parallel computational framework for carbon dioxide (CO2) sequestration simulation by coupling reservoir simulation and molecular dynamics (MD) on massively parallel high-performance-computing (HPC) systems

  1. High-Performance Modeling and Simulation of Anchoring in Granular Media for NEO Applications

    Science.gov (United States)

    Quadrelli, Marco B.; Jain, Abhinandan; Negrut, Dan; Mazhar, Hammad

    2012-01-01

    NASA is interested in designing a spacecraft capable of visiting a near-Earth object (NEO), performing experiments, and then returning safely. Certain periods of this mission would require the spacecraft to remain stationary relative to the NEO, in an environment characterized by very low gravity levels; such situations require an anchoring mechanism that is compact, easy to deploy, and, upon mission completion, easy to remove. The design philosophy used in this task relies on the simulation capability of a high-performance multibody dynamics physics engine. On Earth, it is difficult to create low-gravity conditions, and testing in low-gravity environments, whether artificial or in space, can be costly and very difficult to achieve. Through simulation, the effect of gravity can be controlled with great accuracy, making it ideally suited to analyze the problem at hand. Using Chrono::Engine, a simulation package capable of utilizing massively parallel Graphic Processing Unit (GPU) hardware, several validation experiments were performed. Modeling of the regolith interaction has been carried out, after which the anchor penetration tests were performed and analyzed. The regolith was modeled by a granular medium composed of very large numbers of convex three-dimensional rigid bodies, subject to microgravity levels and interacting with each other with contact, friction, and cohesional forces. The multibody dynamics simulation approach used for simulating anchors penetrating a soil uses a differential variational inequality (DVI) methodology to solve the contact problem posed as a linear complementarity problem (LCP). Implemented within a GPU processing environment, collision detection is greatly accelerated compared to traditional CPU (central processing unit)-based collision detection. Hence, systems of millions of particles interacting with complex dynamic systems can be efficiently analyzed, and design recommendations can be made in a much shorter time. The figure

  2. Maintenance Personnel Performance Simulation (MAPPS) model: a human reliability analysis tool

    International Nuclear Information System (INIS)

    Knee, H.E.

    1985-01-01

    The Maintenance Personnel Performance Simulation (MAPPS) model is a computerized, stochastic, task-oriented human behavioral model developed to provide estimates of nuclear power plant (NPP) maintenance team performance measures. It is capable of addressing person-machine, person-environment, and person-person relationships, and accounts for interdependencies that exist between the subelements that make up the maintenance task of interest. The primary measures of performance estimated by MAPPS are: (1) the probability of successfully completing the task of interest; and (2) the task duration time. MAPPS also estimates a host of other performance indices, including the probability of an undetected error, identification of the most- and least-likely error-prone subelements, and maintenance team stress profiles during task execution. The MAPPS model was subjected to a number of evaluation efforts that focused upon its practicality, acceptability, usefulness, and validity. Methods used for these efforts included a case method approach, consensus estimation, and comparison with observed task performance measures at a NPP. Favorable results, such as close agreement between task duration times for two tasks observed in the field (67.0 and 119.8 minutes, respectively), and estimates by MAPPS (72.0 and 124.0 minutes, respectively) enhance the confidence in the future use of MAPPS. 8 refs., 1 fig

  3. An evidence accumulation model for conflict detection performance in a simulated air traffic control task.

    Science.gov (United States)

    Neal, Andrew; Kwantes, Peter J

    2009-04-01

    The aim of this article is to develop a formal model of conflict detection performance. Our model assumes that participants iteratively sample evidence regarding the state of the world and accumulate it over time. A decision is made when the evidence reaches a threshold that changes over time in response to the increasing urgency of the task. Two experiments were conducted to examine the effects of conflict geometry and timing on response proportions and response time. The model is able to predict the observed pattern of response times, including a nonmonotonic relationship between distance at point of closest approach and response time, as well as effects of angle of approach and relative velocity. The results demonstrate that evidence accumulation models provide a good account of performance on a conflict detection task. Evidence accumulation models are a form of dynamic signal detection theory, allowing for the analysis of response times as well as response proportions, and can be used for simulating human performance on dynamic decision tasks.
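
    A minimal sketch of such an evidence accumulation model is given below: evidence drifts toward a 'conflict' or 'no conflict' decision and is compared against a bound that collapses over time to represent growing urgency. The drift values, bound parameters and their mapping to conflict geometry are illustrative assumptions, not the fitted parameters from the experiments.

        import numpy as np

        rng = np.random.default_rng(1)

        def conflict_decision(drift, a0=2.0, collapse_rate=0.05, noise=1.0,
                              dt=0.05, t_max=30.0):
            """One trial of a single-accumulator evidence model: evidence drifts toward
            'conflict' (positive) or 'no conflict' (negative) and is compared against a
            bound that collapses linearly over time to reflect increasing urgency."""
            x, t = 0.0, 0.0
            while t < t_max:
                x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
                bound = max(a0 - collapse_rate * t, 0.1)
                if abs(x) >= bound:
                    return ("conflict" if x > 0 else "no conflict", t)
                t += dt
            return ("no conflict", t_max)      # time out without crossing a bound

        # Closer approach geometries are assumed to map to larger positive drift rates.
        for drift in (0.4, 0.1, -0.3):
            trials = [conflict_decision(drift) for _ in range(500)]
            p_conflict = np.mean([resp == "conflict" for resp, _ in trials])
            mean_rt = np.mean([rt for _, rt in trials])
            print(f"drift {drift:+.1f}: P(conflict) = {p_conflict:.2f}, "
                  f"mean RT = {mean_rt:.1f} s")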

  4. Performance of Regional Climate Model in Simulating Monsoon Onset Over Indian Subcontinent

    Science.gov (United States)

    Bhatla, R.; Mandal, B.; Verma, Shruti; Ghosh, Soumik; Mall, R. K.

    2018-06-01

    The performance of various Convective Parameterization Schemes (CPSs) of the Regional Climate Model version 4.3 (RegCM-4.3) in simulating the onset phase of the Indian summer monsoon (ISM) over Kerala was studied for the period 2001-2010. The onset date and its associated spatial variation were simulated using four core CPSs of RegCM-4.3, namely Kuo, Tiedtke, Emanuel and Grell, and two mixed convection schemes, Mix98 (Emanuel over land and Grell over ocean) and Mix99 (Grell over land and Emanuel over ocean), on the basis of criteria given by the India Meteorological Department (IMD) (Pai and Rajeevan in Indian summer monsoon onset: variability and prediction. National Climate Centre, India Meteorological Department, 2007). It was found that, of the six CPSs, two schemes, namely Tiedtke and Mix99, simulated the onset date properly. The onset phase is characterized by several transitional states of the atmosphere. Therefore, to study the thermal response, the effect of two different sea surface temperature (SST) forcings, ERA-Interim (ERSST) and weekly optimal interpolation (OI_WK SST), on the simulated onset date was investigated. In addition, spatial atmospheric circulation patterns during the onset phase were analyzed using reanalysis datasets from ERA-Interim (EIN15) and the National Oceanic and Atmospheric Administration (NOAA), respectively, for the wind and outgoing long-wave radiation (OLR) patterns. Among the six convective schemes of the RegCM-4.3 model, Tiedtke is in good agreement with the actual onset dates, and the OI_WK SST forcing is better for simulating the onset of the ISM over Kerala.

  5. Performance and Evaluation of the Global Modeling and Assimilation Office Observing System Simulation Experiment

    Science.gov (United States)

    Prive, Nikki; Errico, R. M.; Carvalho, D.

    2018-01-01

    The National Aeronautics and Space Administration Global Modeling and Assimilation Office (NASA/GMAO) has spent more than a decade developing and implementing a global Observing System Simulation Experiment (OSSE) framework for use in evaluating both new observation types and the behavior of data assimilation systems. The NASA/GMAO OSSE has constantly evolved to reflect changes in the Gridpoint Statistical Interpolation data assimilation system, the Goddard Earth Observing System model, version 5 (GEOS-5), and the real-world observational network. Software and observational datasets for the GMAO OSSE are publicly available, along with a technical report. Substantial modifications have recently been made to the NASA/GMAO OSSE framework, including the character of synthetic observation errors, new instrument types, and more sophisticated atmospheric wind vectors. These improvements will be described, along with the overall performance of the current OSSE. Lessons learned from investigations into correlated errors and model error will be discussed.

  6. Improving streamflow simulations and forecasting performance of SWAT model by assimilating remotely sensed soil moisture observations

    Science.gov (United States)

    Patil, Amol; Ramsankaran, RAAJ

    2017-12-01

    This article presents a study carried out using EnKF-based assimilation of coarser-scale SMOS soil moisture retrievals to improve the streamflow simulations and forecasting performance of the SWAT model in a large catchment. The study was carried out in the Munneru river catchment, India, which is about 10,156 km2. In this study, a new EnKF-based approach is proposed for improving the inherent vertical coupling of the soil layers of the SWAT hydrological model during soil moisture data assimilation. Evaluation of the vertical error correlation obtained between the surface and subsurface layers indicates that the vertical coupling can be improved significantly using an ensemble of soil storages, compared to the traditional static-soil-storage-based EnKF approach. However, the improvements in the simulated streamflow are moderate, owing to limitations in the SWAT model in reflecting the profile soil moisture updates in surface runoff computations. Further, it is observed that the durability of the streamflow improvements is longer when the assimilation system effectively updates the subsurface flow component. Overall, the results of the present study indicate that passive microwave-based coarser-scale soil moisture products such as SMOS hold significant potential to improve streamflow estimates when assimilated into large-scale distributed hydrological models operating at a daily time step.
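
    The core of an EnKF analysis step of the kind referred to above can be sketched as follows. This is a generic stochastic EnKF update, not the SWAT-coupled scheme of the study; the two-layer soil moisture state, the observation operator and all numerical values are illustrative. The point it demonstrates is the vertical coupling: a surface-layer observation also updates the subsurface layer through the ensemble cross-covariance.

        import numpy as np

        rng = np.random.default_rng(7)

        def enkf_update(ensemble, obs, obs_err_std, H):
            """Stochastic EnKF analysis step.  'ensemble' is (n_members, n_states) of
            model soil-moisture states, 'obs' the retrieved surface soil moisture, and
            H the observation operator mapping state to observation space."""
            n = ensemble.shape[0]
            Hx = ensemble @ H.T                              # predicted observations
            A = ensemble - ensemble.mean(axis=0)             # state anomalies
            D = Hx - Hx.mean(axis=0)                         # predicted-obs anomalies
            P_xy = A.T @ D / (n - 1)                         # state-obs cross covariance
            P_yy = D.T @ D / (n - 1) + obs_err_std**2 * np.eye(Hx.shape[1])
            K = P_xy @ np.linalg.inv(P_yy)                   # Kalman gain
            perturbed_obs = obs + obs_err_std * rng.standard_normal((n, Hx.shape[1]))
            return ensemble + (perturbed_obs - Hx) @ K.T     # analysis ensemble

        # Two soil layers; only the surface layer is observed (SMOS-like retrieval).
        cov = [[0.04**2, 0.6 * 0.04 * 0.05],                 # correlated layers, so the
               [0.6 * 0.04 * 0.05, 0.05**2]]                 # observation also corrects layer 2
        prior = rng.multivariate_normal([0.25, 0.30], cov, size=50)
        H = np.array([[1.0, 0.0]])
        analysis = enkf_update(prior, obs=np.array([0.20]), obs_err_std=0.03, H=H)
        print("prior mean:", prior.mean(axis=0).round(3),
              "-> analysis mean:", analysis.mean(axis=0).round(3))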

  7. Performance evaluation of RANS-based turbulence models in simulating a honeycomb heat sink

    Science.gov (United States)

    Subasi, Abdussamet; Ozsipahi, Mustafa; Sahin, Bayram; Gunes, Hasan

    2017-07-01

    As is well known, there is no universal turbulence model that can be used for all engineering problems. Each turbulence model has specific applications for which it is appropriate, and it is vital to select an appropriate model and wall function combination that matches the physics of the problem considered. Therefore, in this study, the performance of six well-known Reynolds-Averaged Navier-Stokes (RANS) based turbulence models, namely the Standard k-ε, the Renormalization Group (RNG) k-ε, the Realizable k-ε, the Reynolds Stress Model, the k-ω and the Shear Stress Transport (SST) k-ω, and the accompanying wall functions, namely the standard, the non-equilibrium and the enhanced, are evaluated via 3D simulation of a honeycomb heat sink. The CutCell method is used to generate the grid for the part including the heat sink, called the test section, while a hexahedral mesh is employed to discretize the inlet and outlet sections. A grid convergence study is conducted for the verification process, while experimental data and well-known correlations are used to validate the numerical results. Prediction of the pressure drop along the test section, the mean base plate temperature of the heat sink and the temperature at the test section outlet are regarded as measures of the performance of the employed models and wall functions. The results indicate that the selection of turbulence models and wall functions has a great influence on the results and therefore needs to be made carefully. The hydraulic and thermal characteristics of the honeycomb heat sink can be determined with reasonable accuracy using RANS-based turbulence models, provided that a suitable turbulence model and wall function combination is selected.

  8. Scale-dependent performances of CMIP5 earth system models in simulating terrestrial vegetation carbon

    Science.gov (United States)

    Jiang, L.; Luo, Y.; Yan, Y.; Hararuk, O.

    2013-12-01

    Mitigation of global change will depend on reliable projections of the future situation. As the major tools for predicting future climate, the Earth System Models (ESMs) used in the Coupled Model Intercomparison Project Phase 5 (CMIP5) for the IPCC Fifth Assessment Report have incorporated carbon cycle components, which account for the important fluxes of carbon between the ocean, atmosphere, and terrestrial biosphere carbon reservoirs, and are therefore expected to provide more detailed and more certain projections. However, ESMs are never perfect; evaluating them can help us to identify uncertainties in prediction and set priorities for model development. In this study, we benchmarked the carbon in live vegetation in the terrestrial ecosystems simulated by 19 ESMs from CMIP5 against an observationally estimated data set of the global vegetation carbon pool, 'Olson's Major World Ecosystem Complexes Ranked by Carbon in Live Vegetation: An Updated Database Using the GLC2000 Land Cover Product' by Gibbs (2006). Our aim is to evaluate the ability of ESMs to reproduce the global vegetation carbon pool at different scales and to identify possible causes of the bias. We found that the performance of the CMIP5 ESMs is very scale-dependent. CESM1-BGC, CESM1-CAM5, CESM1-FASTCHEM and CESM1-WACCM, and NorESM1-M and NorESM1-ME (which share the same model structure) have global sums very similar to the observational data, but they usually perform poorly at the grid-cell and biome scales. In contrast, MIROC-ESM and MIROC-ESM-CHEM simulate best at the grid-cell and biome scales but have larger differences in global sums than the others. Our results will help improve CMIP5 ESMs for more reliable prediction.

  9. Using queuing theory and simulation model to optimize hospital pharmacy performance.

    Science.gov (United States)

    Bahadori, Mohammadkarim; Mohammadnejhad, Seyed Mohsen; Ravangard, Ramin; Teymourzadeh, Ehsan

    2014-03-01

    Hospital pharmacy is responsible for controlling and monitoring the medication use process and ensures timely access to safe, effective and economical use of drugs and medicines for patients and hospital staff. This study aimed to optimize the management of the studied outpatient pharmacy by applying suitable queuing theory and simulation techniques. A descriptive-analytical study was conducted in a military hospital in Tehran, Iran, in 2013. A sample of 220 patients referred to the outpatient pharmacy of the hospital in two shifts, morning and evening, was selected to collect the necessary data to determine the arrival rate, service rate, and other data needed to calculate the patient flow and queuing network performance variables. After the initial analysis of the collected data using the software SPSS 18, the pharmacy queuing network performance indicators were calculated for both shifts. Then, based on the collected data and to provide appropriate solutions, the queuing system of the current situation for both shifts was modeled and simulated using the software ARENA 12, and 4 scenarios were explored. Results showed that the queue characteristics of the studied pharmacy during the situation analysis were very undesirable in both the morning and evening shifts. The average numbers of patients in the pharmacy were 19.21 and 14.66 in the morning and evening, respectively. The average times spent in the system by clients were 39 minutes in the morning and 35 minutes in the evening. The system utilization in the morning and evening were, respectively, 25% and 21%. The simulation results showed that reducing the staff in the morning from 2 to 1 in the receiving prescriptions stage didn't change the queue performance indicators. Increasing one staff member in the filling prescription drugs stage could cause a decrease of 10 persons in the average queue length and 18 minutes and 14 seconds in the average waiting time. On the other hand, simulation results showed that in the evening, decreasing the staff
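
    For reference, the steady-state indicators quoted above (utilization, average number in the system, average time in the system) can be computed analytically for a simple M/M/c approximation of one service stage; a sketch with purely illustrative arrival and service rates (not the pharmacy's measured rates) follows.

        import math

        def mmc_metrics(lam, mu, c):
            """Steady-state M/M/c indicators: utilization, mean queue length and waits.
            lam = arrival rate, mu = service rate per server, c = number of servers."""
            rho = lam / (c * mu)
            if rho >= 1.0:
                raise ValueError("unstable queue: utilization must be below 1")
            a = lam / mu
            p0 = 1.0 / (sum(a ** k / math.factorial(k) for k in range(c))
                        + a ** c / (math.factorial(c) * (1 - rho)))
            lq = p0 * a ** c * rho / (math.factorial(c) * (1 - rho) ** 2)  # mean queue length
            wq = lq / lam                                                  # mean wait in queue
            return {"utilization": rho, "Lq": lq, "Wq": wq, "L": lq + a, "W": wq + 1 / mu}

        # Purely illustrative rates: 24 arrivals/hour, 15 prescriptions/hour per server, 2 servers.
        print(mmc_metrics(lam=24, mu=15, c=2))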

  10. Using Queuing Theory and Simulation Model to Optimize Hospital Pharmacy Performance

    Science.gov (United States)

    Bahadori, Mohammadkarim; Mohammadnejhad, Seyed Mohsen; Ravangard, Ramin; Teymourzadeh, Ehsan

    2014-01-01

    Background: Hospital pharmacy is responsible for controlling and monitoring the medication use process and ensures timely access to safe, effective and economical use of drugs and medicines for patients and hospital staff. Objectives: This study aimed to optimize the management of the studied outpatient pharmacy by applying suitable queuing theory and simulation techniques. Patients and Methods: A descriptive-analytical study was conducted in a military hospital in Tehran, Iran, in 2013. A sample of 220 patients referred to the outpatient pharmacy of the hospital in two shifts, morning and evening, was selected to collect the necessary data to determine the arrival rate, service rate, and other data needed to calculate the patient flow and queuing network performance variables. After the initial analysis of the collected data using the software SPSS 18, the pharmacy queuing network performance indicators were calculated for both shifts. Then, based on the collected data and to provide appropriate solutions, the queuing system of the current situation for both shifts was modeled and simulated using the software ARENA 12, and 4 scenarios were explored. Results: Results showed that the queue characteristics of the studied pharmacy during the situation analysis were very undesirable in both the morning and evening shifts. The average numbers of patients in the pharmacy were 19.21 and 14.66 in the morning and evening, respectively. The average times spent in the system by clients were 39 minutes in the morning and 35 minutes in the evening. The system utilization in the morning and evening were, respectively, 25% and 21%. The simulation results showed that reducing the staff in the morning from 2 to 1 in the receiving prescriptions stage didn't change the queue performance indicators. Increasing one staff member in the filling prescription drugs stage could cause a decrease of 10 persons in the average queue length and 18 minutes and 14 seconds in the average waiting time. On the other hand, simulation

  11. Simulation modeling and arena

    CERN Document Server

    Rossetti, Manuel D

    2015-01-01

    Emphasizes a hands-on approach to learning statistical analysis and model building through the use of comprehensive examples, problem sets, and software applications. With a unique blend of theory and applications, Simulation Modeling and Arena®, Second Edition integrates coverage of statistical analysis and model building to emphasize the importance of both topics in simulation. Featuring introductory coverage of how simulation works and why it matters, the Second Edition expands coverage of static simulation and the applications of spreadsheets to perform simulation. The new edition als

  12. Performance of the CORDEX regional climate models in simulating offshore wind and wind potential

    Science.gov (United States)

    Kulkarni, Sumeet; Deo, M. C.; Ghosh, Subimal

    2018-03-01

    This study is oriented towards quantifying the skill added by regional climate models (RCMs) to their parent general circulation models (GCMs) in simulating wind speed and wind potential, with particular reference to the Indian offshore region. To arrive at a suitable reference dataset, the performance of wind outputs from three different reanalysis datasets is evaluated. The comparison across the RCMs and their corresponding parent GCMs is done on the basis of annual/seasonal wind statistics, intermodel bias, wind climatology, and classes of wind potential. It was observed that while the RCMs could simulate the spatial variability of winds well for certain subregions, they generally failed to replicate the overall spatial pattern, especially in the monsoon and winter. Various causes of biases in the RCMs were determined by assessing the corresponding maps of wind vectors, surface temperature, and sea-level pressure. The results highlight the necessity of carefully assessing the RCM-yielded winds before using them for sensitive applications such as coastal vulnerability and hazard assessment. A supplementary outcome of this study is a wind potential atlas based on the spatial distribution of wind classes. This could be beneficial in identifying viable subregions for developing offshore wind farms by intercomparing both the RCM and GCM outcomes. It is encouraging that most of the RCMs and GCMs indicate that around 70% of the Indian offshore locations would experience a mean wind potential greater than 200 W/m2 in the monsoon.
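
    The wind potential classification used here rests on the mean wind power density; a minimal sketch of its computation from a wind-speed series, with an illustrative sample and the 200 W/m2 viability threshold mentioned above, is given below.

        import numpy as np

        RHO_AIR = 1.225  # kg/m^3, standard sea-level air density

        def wind_power_density(speeds):
            """Mean wind power density (W/m^2); the cube is averaged per time step,
            not applied to the mean speed."""
            v = np.asarray(speeds, dtype=float)
            return 0.5 * RHO_AIR * np.mean(v ** 3)

        # Illustrative hourly wind speeds (m/s) at one offshore grid point.
        speeds = [5.2, 6.8, 7.5, 9.1, 8.3, 6.0, 4.4, 7.9]
        wpd = wind_power_density(speeds)
        verdict = "viable" if wpd > 200 else "below threshold"
        print(f"mean wind power density: {wpd:.0f} W/m^2 ({verdict} at the 200 W/m^2 criterion)")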

  13. The cognitive environment simulation as a tool for modeling human performance and reliability

    International Nuclear Information System (INIS)

    Woods, D.D.; Pople, H. Jr.

    1989-01-01

    Various studies have shown that intention errors, or cognitive errors, are a major contributor to the risk of disaster. Intention formation refers to the cognitive processes by which an agent decides what actions are appropriate to carry out (information gathering, situation assessment, diagnosis, response selection). Understanding, measuring, predicting and correcting cognitive errors depend on the answer to the question: what are difficult problems? The answer to this question defines which situations are risky, in the sense of which incidents the human-technical system will manage safely and which it will manage poorly, evolving towards negative outcomes. The authors have made progress in the development of such measuring devices through an NRC-sponsored research program on cognitive modeling of operator performance. The approach is based on the demand-resource match view of human error. In this approach, the difficulty of a problem depends both on the nature of the problem itself and on the resources (e.g., knowledge, plans) available to solve it. One can test the difficulty posed by a domain incident, given some set of resources, by running the incident through a cognitive simulation that carries out the cognitive activities of a limited-resource problem solver in a dynamic, uncertain, risky and highly doctrinal (pre-planned routines and procedures) world. The cognitive simulation that they have developed to do this for NPP accidents is called the Cognitive Environment Simulation (CES). They illustrate the power of this approach by comparing the behavior of operators in variants of a simulated accident to the behavior of CES in the same accidents

  14. NASA-STD-7009 Guidance Document for Human Health and Performance Models and Simulations

    Science.gov (United States)

    Walton, Marlei; Mulugeta, Lealem; Nelson, Emily S.; Myers, Jerry G.

    2014-01-01

    Rigorous verification, validation, and credibility (VVC) processes are imperative to ensure that models and simulations (MS) are sufficiently reliable to address issues within their intended scope. The NASA standard for MS, NASA-STD-7009 (7009) [1], was an outcome of the Columbia Accident Investigation Board (CAIB), intended to ensure that MS are developed, applied, and interpreted appropriately for making decisions that may impact crew or mission safety. Because the focus of 7009 is engineering systems, a NASA-STD-7009 Guidance Document is being developed to augment the 7009 and provide information, tools, and techniques applicable to the probabilistic and deterministic biological MS more prevalent in human health and performance (HHP) and space biomedical research and operations.

  15. A high-performance model for shallow-water simulations in distributed and heterogeneous architectures

    Science.gov (United States)

    Conde, Daniel; Canelas, Ricardo B.; Ferreira, Rui M. L.

    2017-04-01

    unstructured nature of the mesh topology, with the corresponding employed solution, based on space-filling curves, being analyzed and discussed. Intra-node parallelism is achieved through OpenMP for CPUs and CUDA for GPUs, depending on which kind of device the process is running on. Here the main difficulty is associated with the Object-Oriented approach, where the presence of complex data structures can degrade model performance considerably. STAV-2D now supports fully distributed and heterogeneous simulations where multiple different devices can be used to accelerate computation time. The advantages, shortcomings and specific solutions of the employed unified Object-Oriented approach, where the source code for CPU and GPU has the same compilation units (no device-specific branches as seen in available models), are discussed and quantified with a thorough scalability and performance analysis. The assembled parallel model is expected to achieve faster than real-time simulations for high resolutions (from meters to sub-meter) in large-scale problems (from cities to watersheds), effectively bridging the gap between detailed and timely simulation results. Acknowledgements This research was partially supported by Portuguese and European funds, within programs COMPETE2020 and PORL-FEDER, through project PTDC/ECM-HID/6387/2014 and Doctoral Grant SFRH/BD/97933/2013 granted by the National Foundation for Science and Technology (FCT). References Canelas, R.; Murillo, J. & Ferreira, R.M.L. (2013), Two-dimensional depth-averaged modelling of dam-break flows over mobile beds. Journal of Hydraulic Research, 51(4), 392-407. Conde, D. A. S.; Baptista, M. A. V.; Sousa Oliveira, C. & Ferreira, R. M. L. (2013), A shallow-flow model for the propagation of tsunamis over complex geometries and mobile beds, Nat. Hazards and Earth Syst. Sci., 13, 2533-2542. Conde, D. A. S.; Telhado, M. J.; Viana Baptista, M. A. & Ferreira, R. M. L. (2015) Severity and exposure associated with tsunami actions in

  16. Comparing the performance of 11 crop simulation models in predicting yield response to nitrogen fertilization

    OpenAIRE

    Salo, Tapio J.; Palosuo, Taru; Kersebaum, Kurt Christian; Nendel, Claas; Angulo, Carlos; Ewert, Frank; Bindi, Marco; Calanca, Pierluigi; Klein, Tommy; Moriondo, Marco; Ferrise, Roberto; Olesen, Jørgen Eivind; Patil, Rasmi H.; Ruget, Francoise; Takac, Jozef

    2016-01-01

    Eleven widely used crop simulation models (APSIM, CERES, CROPSYST, COUP, DAISY, EPIC, FASSET, HERMES, MONICA, STICS and WOFOST) were tested using spring barley (Hordeum vulgare L.) data set under varying nitrogen (N) fertilizer rates from three experimental years in the boreal climate of Jokioinen, Finland. This is the largest standardized crop model inter-comparison under different levels of N supply to date. The models were calibrated using data from 2002 and 2008, of which 2008 included si...

  17. LIAR -- A computer program for the modeling and simulation of high performance linacs

    International Nuclear Information System (INIS)

    Assmann, R.; Adolphsen, C.; Bane, K.; Emma, P.; Raubenheimer, T.; Siemann, R.; Thompson, K.; Zimmermann, F.

    1997-04-01

    The computer program LIAR (LInear Accelerator Research Code) is a numerical modeling and simulation tool for high performance linacs. Among other applications, it addresses the needs of state-of-the-art linear colliders where low-emittance, high-intensity beams must be accelerated to energies in the 0.05-1 TeV range. LIAR is designed to be used for a variety of different projects. LIAR allows the study of single- and multi-particle beam dynamics in linear accelerators. It calculates emittance dilutions due to wakefield deflections, linear and non-linear dispersion and chromatic effects in the presence of multiple accelerator imperfections. Both single-bunch and multi-bunch beams can be simulated. Several basic and advanced optimization schemes are implemented. Present limitations arise from the incomplete treatment of bending magnets and sextupoles. A major objective of the LIAR project is to provide an open programming platform for the accelerator physics community. Due to its design, LIAR allows straightforward access to its internal FORTRAN data structures. The program can easily be extended and its interactive command language ensures maximum ease of use. Presently, versions of LIAR are compiled for UNIX and MS Windows operating systems. An interface for the graphical visualization of results is provided. Scientific graphs can be saved in the PS and EPS file formats. In addition a Mathematica interface has been developed. LIAR now contains more than 40,000 lines of source code in more than 130 subroutines. This report describes the theoretical basis of the program, provides a reference for existing features and explains how to add further commands. The LIAR home page and the ONLINE version of this manual can be accessed under: http://www.slac.stanford.edu/grp/arb/rwa/liar.htm

  18. Advanced Models and Algorithms for Self-Similar IP Network Traffic Simulation and Performance Analysis

    Science.gov (United States)

    Radev, Dimitar; Lokshina, Izabella

    2010-11-01

    The paper examines self-similar (or fractal) properties of real communication network traffic data over a wide range of time scales. These self-similar properties are very different from the properties of traditional models based on Poisson and Markov-modulated Poisson processes. Advanced fractal models of sequential generators and fixed-length sequence generators, and efficient algorithms that are used to simulate the self-similar behavior of IP network traffic data, are developed and applied. Numerical examples are provided, and simulation results are obtained and analyzed.
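
    One standard way to generate asymptotically self-similar traffic, aggregating many ON/OFF sources whose ON and OFF period lengths are heavy-tailed (Pareto) distributed, is sketched below. This is a generic illustration, not necessarily the sequential or fixed-length sequence generators developed in the paper; all parameter values are illustrative.

        import numpy as np

        rng = np.random.default_rng(3)

        def pareto_duration(alpha, xm):
            """Heavy-tailed Pareto duration with shape alpha and minimum xm."""
            return xm * (1.0 + rng.pareto(alpha))

        def onoff_source(n_slots, alpha=1.4, xm=1.0):
            """One ON/OFF source: one packet per slot while ON, none while OFF; ON and
            OFF period lengths are Pareto distributed (infinite variance for alpha < 2)."""
            traffic = np.zeros(n_slots)
            t, on = 0, rng.random() < 0.5
            while t < n_slots:
                dur = int(np.ceil(pareto_duration(alpha, xm)))
                if on:
                    traffic[t:t + dur] = 1.0
                t += dur
                on = not on
            return traffic

        # The superposition of many such sources is asymptotically self-similar.
        n_slots, n_sources = 20_000, 50
        aggregate = sum(onoff_source(n_slots) for _ in range(n_sources))
        print(f"mean load {aggregate.mean():.1f} packets/slot, peak {aggregate.max():.0f}")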

  19. Model and tool requirements for co-simulation of building performance

    NARCIS (Netherlands)

    Trcka, M.; Hensen, J.L.M.

    2006-01-01

    The use of building performance simulation (BPS) can substantially help in improving building design towards higher occupant comfort and lower fuel consumption, while reducing emission of greenhouse gasses. Unfortunately, current BPS tools do not allow inter-tool communication and thus limit a

  20. High-performance modeling of CO2 sequestration by coupling reservoir simulation and molecular dynamics

    KAUST Repository

    Bao, Kai; Yan, Mi; Lu, Ligang; Allen, Rebecca; Salam, Amgad; Jordan, Kirk E.; Sun, Shuyu

    2013-01-01

    multicomponent compositional flow simulation to handle more complicated physical process in the future. Accuracy and scalability analysis are performed on an IBM BlueGene/P and on an IBM BlueGene/Q, the latest IBM supercomputer. Results show good accuracy of our

  1. Performance modelling and simulation of an absorption solar cooling system for Malaysia

    International Nuclear Information System (INIS)

    Assilzadeh, F.; Ali, Y.; Kamaruzzaman Sopian

    2006-01-01

    Solar radiation contains huge amounts of energy and is required for almost all the natural processes on earth. Solar-powered air-conditioning has many advantages when compared to a conventional electrically driven system. This paper presents a solar cooling system designed for Malaysia and other tropical regions using an evacuated tube solar collector and a LiBr absorption system. The absorption solar cooling system is modelled and simulated in the Transient System Simulation (TRNSYS) environment. The typical meteorological year file containing the weather parameters is used to simulate the system. Then a system optimization is carried out in order to select the appropriate type of collector, the optimum size of the storage tank, the optimum collector slope and area, and the optimum thermostat setting of the auxiliary boiler

  2. Simulation in Complex Modelling

    DEFF Research Database (Denmark)

    Nicholas, Paul; Ramsgaard Thomsen, Mette; Tamke, Martin

    2017-01-01

    This paper will discuss the role of simulation in extended architectural design modelling. As a framing paper, the aim is to present and discuss the role of integrated design simulation and feedback between design and simulation in a series of projects under the Complex Modelling framework. Complex...... performance, engage with high degrees of interdependency and allow the emergence of design agency and feedback between the multiple scales of architectural construction. This paper presents examples for integrated design simulation from a series of projects including Lace Wall, A Bridge Too Far and Inflated...... Restraint developed for the research exhibition Complex Modelling, Meldahls Smedie Gallery, Copenhagen in 2016. Where the direct project aims and outcomes have been reported elsewhere, the aim for this paper is to discuss overarching strategies for working with design integrated simulation....

  3. Comparing the performance of 11 crop simulation models in predicting yield response to nitrogen fertilization

    DEFF Research Database (Denmark)

    Salo, T J; Palosuo, T; Kersebaum, K C

    2016-01-01

    Eleven widely used crop simulation models (APSIM, CERES, CROPSYST, COUP, DAISY, EPIC, FASSET, HERMES, MONICA, STICS and WOFOST) were tested using spring barley (Hordeum vulgare L.) data set under varying nitrogen (N) fertilizer rates from three experimental years in the boreal climate of Jokioinen......, Finland. This is the largest standardized crop model inter-comparison under different levels of N supply to date. The models were calibrated using data from 2002 and 2008, of which 2008 included six N rates ranging from 0 to 150 kg N/ha. Calibration data consisted of weather, soil, phenology, leaf area...... ranged from 170 to 870 kg/ha. During the test year 2009, most models failed to accurately reproduce the observed low yield without N fertilizer as well as the steep yield response to N applications. The multi-model predictions were closer to observations than most single-model predictions, but multi...

  4. High Performance Electrical Modeling and Simulation Software Normal Environment Verification and Validation Plan, Version 1.0; TOPICAL

    International Nuclear Information System (INIS)

    WIX, STEVEN D.; BOGDAN, CAROLYN W.; MARCHIONDO JR., JULIO P.; DEVENEY, MICHAEL F.; NUNEZ, ALBERT V.

    2002-01-01

    The requirements in modeling and simulation are driven by two fundamental changes in the nuclear weapons landscape: (1) The Comprehensive Test Ban Treaty and (2) The Stockpile Life Extension Program which extends weapon lifetimes well beyond their originally anticipated field lifetimes. The move from confidence based on nuclear testing to confidence based on predictive simulation forces a profound change in the performance asked of codes. The scope of this document is to improve the confidence in the computational results by demonstration and documentation of the predictive capability of electrical circuit codes and the underlying conceptual, mathematical and numerical models as applied to a specific stockpile driver. This document describes the High Performance Electrical Modeling and Simulation software normal environment Verification and Validation Plan

  5. Comparison of the development of performance skills in ultrasound-guided regional anesthesia simulations with different phantom models.

    Science.gov (United States)

    Liu, Yang; Glass, Nancy L; Glover, Chris D; Power, Robert W; Watcha, Mehernoor F

    2013-12-01

    Ultrasound-guided regional anesthesia (UGRA) skills are traditionally obtained by supervised performance on patients, but practice on phantom models improves success. Currently available models are expensive or use perishable products, for example, olive-in-chicken breasts (OCB). We constructed 2 inexpensive phantom (transparent and opaque) models with readily available nonperishable products and compared the process of learning UGRA skills by novice practitioners on these models with the OCB model. Three experts first established criteria for a satisfactory completion of the simulated UGRA task in the 3 models. Thirty-six novice trainees then performed repeated attempts on the 3 models until satisfactory completion of the simulations was accomplished. The number of errors, needle passes, and time for task completion per attempt progressively decreased in all 3 groups. However, failure to identify the target and to visualize the needle on the ultrasound image occurred more frequently with the OCB model. The time to complete simulator training was shortest with the transparent model, owing to shorter target identification times. However, trainees were less likely to agree strongly that this model was realistic for teaching UGRA skills. Training on inexpensive synthetic simulation models with no perishable products permits learning of UGRA skills by novices. The OCB model has the disadvantages of containing potentially infective material, requiring refrigeration, not being reusable after multiple needle punctures, and being associated with more failures during simulated UGRA. Direct visualization of the target in the transparent model allows the trainee to focus on needle insertion skills, but the opaque model may be more realistic for learning target identification skills required when UGRA is performed on real patients in the operating room.

  6. Mathematical modelling and simulation of the thermal performance of a solar heated indoor swimming pool

    Directory of Open Access Journals (Sweden)

    Mančić Marko V.

    2014-01-01

    Buildings with indoor swimming pools have a large energy footprint. The source of major energy loss is the swimming pool hall, where air humidity is increased by evaporation from the pool water surface. This increases energy consumption for heating and ventilation of the pool hall, fresh water supply losses and the heat demand for pool water heating. In this paper, a mathematical model of the swimming pool was made to assess the energy demands of an indoor swimming pool building. The mathematical model of the swimming pool is used with the created multi-zone building model in TRNSYS software to determine pool hall energy demand and pool losses. Energy losses for pool water and pool hall heating and ventilation are analyzed for different target pool water and air temperatures. The simulation showed that pool water heating accounts for around 22%, whereas heating and ventilation of the pool hall account for around 60% of the total pool hall heat demand. With a change of preset controller air and water temperatures in simulations, evaporation loss was in the range 46-54% of the total pool losses. A solar thermal sanitary hot water system was modelled and simulated to analyze its potential for energy savings of the presented demand side model. The simulation showed that up to 87% of water heating demands could be met by the solar thermal system, while avoiding stagnation. [Project of the Ministry of Science of the Republic of Serbia, No. III 42006: Research and development of energy and environmentally highly effective polygeneration systems based on using renewable energy sources]
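
    As a rough illustration of the evaporation term that dominates the pool-hall losses discussed above, the following Python sketch uses a Carrier-type empirical correlation together with a Magnus fit for the saturation pressure; the numerical coefficients, the air-speed value and the correlation itself are illustrative assumptions, not the formulation used in the paper's TRNSYS model.

    import math

    def sat_pressure_kpa(t_c):
        # Magnus approximation for the saturation vapour pressure over water [kPa]
        return 0.6112 * math.exp(17.62 * t_c / (243.12 + t_c))

    def pool_evaporation(area_m2, t_water_c, t_air_c, rh_air, air_speed_ms=0.1,
                         latent_heat_kj_per_kg=2430.0):
        """Return evaporation rate [kg/s] and latent heat loss [kW] from the pool surface."""
        p_w = sat_pressure_kpa(t_water_c)            # vapour pressure at the water surface
        p_a = rh_air * sat_pressure_kpa(t_air_c)     # partial vapour pressure of the hall air
        # Carrier-type correlation; the numerical coefficients are illustrative assumptions
        m_evap = area_m2 * (p_w - p_a) * (0.089 + 0.0782 * air_speed_ms) / latent_heat_kj_per_kg
        return m_evap, m_evap * latent_heat_kj_per_kg

    # Example: 25 m x 12.5 m pool, 28 °C water, 30 °C hall air at 60% relative humidity
    m_dot, q_latent_kw = pool_evaporation(312.5, 28.0, 30.0, 0.60)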

  7. Assessing the Impact of Equipment Aging on System Performance Using Simulation Modeling Methods

    International Nuclear Information System (INIS)

    Gupta, N. K.

    2005-01-01

    The radiological Inductively Coupled Plasma Mass Spectrometer (ICP-MS) is used to analyze the radioactive samples collected from different radioactive material processing operations at Savannah River Site (SRS). The expeditious processing of these samples is important for safe and reliable operations at SRS. As the radiological (RAD) ICP-MS machine ages, experience shows that replacement parts and repairs are difficult to obtain on time for reliable operations after 5 years of service. A discrete event model using the commercial software EXTEND was prepared to assess the impact on sample turnaround times as the ICP-MS gets older. The model was prepared using the sample statistics from the previous 4 years. Machine utilization rates were calculated for the new machine, a 5-year-old machine, a 10-year-old machine, and a 12-year-old machine. Computer simulations were run for these periods and the sample delay times calculated. The model was validated against the sample statistics collected from the previous 4 quarters. 90% confidence intervals were calculated for the 10th, 25th, 50th, and 90th quantiles of the samples. The simulation results show that if 50% of the samples are needed on time for efficient site operations, a 10-year-old machine could take nearly 50 days longer to process these samples than a 5-year-old machine. This simulation effort quantifies the impact on sample turnaround time as the ICP-MS gets older.
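
    The abstract above describes a discrete-event model built in EXTEND; a minimal sketch of the same idea in Python with the SimPy library is shown below. The arrival rate, service time, and the way ageing is folded into a single service-time multiplier are illustrative assumptions, not parameters from the SRS study.

    import random
    import simpy

    def sample_flow(env, analyzer, service_mean_h, turnaround_h):
        """One sample: wait for the ICP-MS, get analyzed, record the turnaround time."""
        arrival = env.now
        with analyzer.request() as req:
            yield req
            yield env.timeout(random.expovariate(1.0 / service_mean_h))
        turnaround_h.append(env.now - arrival)

    def run(age_factor, n_samples=500, arrival_mean_h=6.0, base_service_h=4.0, seed=1):
        """Simulate turnaround times [h]; age_factor > 1 stands in for slower, repair-prone service."""
        random.seed(seed)
        env = simpy.Environment()
        analyzer = simpy.Resource(env, capacity=1)
        turnaround_h = []

        def generate(env):
            for _ in range(n_samples):
                env.process(sample_flow(env, analyzer, base_service_h * age_factor, turnaround_h))
                yield env.timeout(random.expovariate(1.0 / arrival_mean_h))

        env.process(generate(env))
        env.run()
        return sorted(turnaround_h)

    # Compare the median turnaround of a "new" and an "aged" machine
    new, aged = run(1.0), run(1.4)
    print(new[len(new) // 2], aged[len(aged) // 2])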

  8. Interactions of Team Mental Models and Monitoring Behaviors Predict Team Performance in Simulated Anesthesia Inductions

    Science.gov (United States)

    Burtscher, Michael J.; Kolbe, Michaela; Wacker, Johannes; Manser, Tanja

    2011-01-01

    In the present study, we investigated how two team mental model properties (similarity vs. accuracy) and two forms of monitoring behavior (team vs. systems) interacted to predict team performance in anesthesia. In particular, we were interested in whether the relationship between monitoring behavior and team performance was moderated by team…

  9. The performance of simulated annealing in parameter estimation for vapor-liquid equilibrium modeling

    Directory of Open Access Journals (Sweden)

    A. Bonilla-Petriciolet

    2007-03-01

    In this paper we report the application and evaluation of the simulated annealing (SA) optimization method in parameter estimation for vapor-liquid equilibrium (VLE) modeling. We tested this optimization method using the classical least squares and error-in-variable approaches. The reliability and efficiency of the data-fitting procedure are also considered using different values for the algorithm parameters of the SA method. Our results indicate that this method, when properly implemented, is a robust procedure for nonlinear parameter estimation in thermodynamic models. However, in difficult problems it can still converge to local optima of the objective function.
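
    A minimal Python sketch of a simulated annealing loop of the kind evaluated above is given below; the geometric cooling schedule, Gaussian move size and bounds handling are illustrative assumptions, and the objective function (for example, a least-squares residual between measured and model-predicted VLE data) must be supplied by the user.

    import math
    import random

    def simulated_annealing(objective, x0, bounds, t0=1.0, t_min=1e-4, alpha=0.95,
                            moves_per_temp=50, seed=0):
        """Minimize an objective over model parameters constrained to the given bounds."""
        rng = random.Random(seed)
        x, fx = list(x0), objective(x0)
        best_x, best_f = list(x), fx
        t = t0
        while t > t_min:
            for _ in range(moves_per_temp):
                # propose a random perturbation of one parameter, clipped to its bounds
                i = rng.randrange(len(x))
                lo, hi = bounds[i]
                cand = list(x)
                cand[i] = min(hi, max(lo, x[i] + rng.gauss(0.0, 0.1 * (hi - lo))))
                fc = objective(cand)
                # accept downhill moves always, uphill moves with Boltzmann probability
                if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
                    x, fx = cand, fc
                    if fx < best_f:
                        best_x, best_f = list(x), fx
            t *= alpha  # geometric cooling schedule
        return best_x, best_f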

  10. Performance and Uncertainty Evaluation of Snow Models on Snowmelt Flow Simulations over a Nordic Catchment (Mistassibi, Canada)

    Directory of Open Access Journals (Sweden)

    Magali Troin

    2015-11-01

    An analysis of hydrological response to a multi-model approach based on an ensemble of seven snow models (SM; degree-day and mixed degree-day/energy balance models) coupled with three hydrological models (HM) is presented for a snowmelt-dominated basin in Canada. The present study aims to compare the performance and the reliability of different types of SM-HM combinations at simulating snowmelt flows over the 1961–2000 historical period. The multi-model approach also allows evaluating the uncertainties associated with the structure of the SM-HM ensemble to better predict river flows in Nordic environments. The 20-year calibration shows a satisfactory performance of the ensemble of 21 SM-HM combinations at simulating daily discharges and snow water equivalents (SWEs), with low streamflow volume biases. The validation of the ensemble of 21 SM-HM combinations is conducted over a 20-year period. Performances are similar to the calibration in simulating the daily discharges and SWEs, again with low model biases for streamflow. The spring-snowmelt-generated peak flow is captured only in timing by the ensemble of 21 SM-HM combinations. The results for specific hydrologic indicators show that the uncertainty related to the choice of HM in the SM-HM combinations cannot be neglected when simulating snowmelt flows. The selection of the SM plays a larger role than the choice of the SM approach (degree-day versus mixed degree-day/energy balance) in simulating spring flows. Overall, the snow models contribute a small share of the total uncertainty in hydrological modeling for snow hydrology studies.
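
    The degree-day snow models in the ensemble above share a simple core idea: melt is proportional to the excess of air temperature over a threshold. A generic sketch is given below; the threshold temperature and degree-day factor are illustrative assumptions, not parameters of any of the seven SMs compared in the paper.

    def degree_day_snowmelt(temps_c, precip_mm, t_melt=0.0, ddf=3.0):
        """Daily snowpack evolution with a simple degree-day melt model.

        temps_c, precip_mm : daily mean air temperature [deg C] and precipitation [mm]
        t_melt             : threshold temperature for melt onset [deg C]
        ddf                : degree-day factor [mm per deg C per day]
        Returns the daily snow water equivalent [mm] and melt [mm].
        """
        swe, swe_series, melt_series = 0.0, [], []
        for t, p in zip(temps_c, precip_mm):
            if t <= t_melt:
                swe += p                                  # precipitation accumulates as snow
                melt = 0.0
            else:
                melt = min(swe, ddf * (t - t_melt))       # melt limited by available snow
                swe -= melt
            swe_series.append(swe)
            melt_series.append(melt)
        return swe_series, melt_series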

  11. Performance Modeling and Optimization of a High Energy Colliding Beam Simulation Code

    Energy Technology Data Exchange (ETDEWEB)

    Shan, Hongzhang; Strohmaier, Erich; Qiang, Ji; Bailey, David H.; Yelick, Kathy

    2006-06-01

    An accurate modeling of the beam-beam interaction is essential to maximizing the luminosity in existing and future colliders. BeamBeam3D was the first parallel code that can be used to study this interaction fully self-consistently on high-performance computing platforms. Various all-to-all personalized communication (AAPC) algorithms dominate its communication patterns, for which we developed a sequence of performance models using a series of micro-benchmarks. We find that for SMP-based systems the most important performance constraint is node-adapter contention, while for 3D-torus topologies good performance models are not possible without considering link contention. The best average model prediction error is very low on SMP-based systems, at 3% to 7%. On torus-based systems errors are higher, at 29%, but optimized performance can again be predicted within 8% in some cases. These excellent results across five different systems indicate that this methodology for performance modeling can be applied to a large class of algorithms.

  12. Performance Modeling and Optimization of a High Energy Colliding Beam Simulation Code

    International Nuclear Information System (INIS)

    Shan, Hongzhang; Strohmaier, Erich; Qiang, Ji; Bailey, David H.; Yelick, Kathy

    2006-01-01

    An accurate modeling of the beam-beam interaction is essential to maximizing the luminosity in existing and future colliders. BeamBeam3D was the first parallel code that can be used to study this interaction fully self-consistently on high-performance computing platforms. Various all-to-all personalized communication (AAPC) algorithms dominate its communication patterns, for which we developed a sequence of performance models using a series of micro-benchmarks. We find that for SMP-based systems the most important performance constraint is node-adapter contention, while for 3D-torus topologies good performance models are not possible without considering link contention. The best average model prediction error is very low on SMP-based systems, at 3% to 7%. On torus-based systems errors are higher, at 29%, but optimized performance can again be predicted within 8% in some cases. These excellent results across five different systems indicate that this methodology for performance modeling can be applied to a large class of algorithms.

  13. An Assessment of Mean Areal Precipitation Methods on Simulated Stream Flow: A SWAT Model Performance Assessment

    Directory of Open Access Journals (Sweden)

    Sean Zeiger

    2017-06-01

    Accurate mean areal precipitation (MAP) estimates are essential input forcings for hydrologic models. However, the selection of the most accurate method to estimate MAP can be daunting because there are numerous methods to choose from (e.g., proximate gauge, direct weighted average, surface-fitting, and remotely sensed methods). Multiple methods (n = 19) were used to estimate MAP with precipitation data from 11 distributed monitoring sites and 4 remotely sensed data sets. Each method was validated against the hydrologic model simulated stream flow using the Soil and Water Assessment Tool (SWAT). SWAT was validated using a split-site method and the observed stream flow data from five nested-scale gauging sites in a mixed-land-use watershed of the central USA. Cross-validation results showed the error associated with surface-fitting and remotely sensed methods ranging from −4.5 to −5.1%, and −9.8 to −14.7%, respectively. Split-site validation results showed percent bias (PBIAS) values that ranged from −4.5 to −160%. Second order polynomial functions especially overestimated precipitation and subsequent stream flow simulations (PBIAS = −160% in the headwaters). The results indicated that using an inverse-distance weighted, linear polynomial interpolation or multiquadric function method to estimate MAP may improve SWAT model simulations. Collectively, the results highlight the importance of spatially distributed observed hydroclimate data for precipitation and subsequent stream flow estimation. The MAP methods demonstrated in the current work can be used to reduce hydrologic model uncertainty caused by watershed physiographic differences.
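
    One of the interpolation approaches named above, inverse-distance weighting, can be sketched as follows; the power exponent and the set of points used to discretize the watershed are illustrative assumptions, not the settings used in the SWAT study.

    def idw_mean_areal_precip(gauge_coords, gauge_precip, watershed_points, power=2.0):
        """Mean areal precipitation by inverse-distance weighting over watershed points.

        gauge_coords     : list of (x, y) gauge locations
        gauge_precip     : precipitation observed at each gauge [mm]
        watershed_points : list of (x, y) points discretizing the watershed
        """
        def idw_at(pt):
            weights, weighted = [], []
            for (gx, gy), p in zip(gauge_coords, gauge_precip):
                d2 = (pt[0] - gx) ** 2 + (pt[1] - gy) ** 2
                if d2 == 0.0:
                    return p                      # point coincides with a gauge
                w = d2 ** (-power / 2.0)
                weights.append(w)
                weighted.append(w * p)
            return sum(weighted) / sum(weights)

        return sum(idw_at(pt) for pt in watershed_points) / len(watershed_points)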

  14. Microcomputer simulation model for facility performance assessment: a case study of nuclear spent fuel handling facility operations

    International Nuclear Information System (INIS)

    Chockie, A.D.; Hostick, C.J.; Otis, P.T.

    1985-10-01

    A microcomputer-based simulation model was recently developed at the Pacific Northwest Laboratory (PNL) to assist in the evaluation of design alternatives for a proposed facility to receive, consolidate and store nuclear spent fuel from US commercial power plants. Previous performance assessments were limited to deterministic calculations and Gantt chart representations of the facility operations. To ensure that the design of the facility will be adequate to meet the specified throughput requirements, the simulation model was used to analyze such factors as material flow, equipment capability and the interface between the MRS facility and the nuclear waste transportation system. The simulation analysis model was based on commercially available software and application programs designed to represent the MRS waste handling facility operations. The results of the evaluation were used by the design review team at PNL to identify areas where design modifications should be considered. 4 figs

  15. Development of a simplified simulation model for performance characterization of a pixellated CdZnTe multimodality imaging system

    Energy Technology Data Exchange (ETDEWEB)

    Guerra, P; Santos, A [Departamento de IngenierIa Electronica, Universidad Politecnica de Madrid, Ciudad Universitaria s/n, 28040 Madrid (Spain); Darambara, D G [Joint Department of Physics, Royal Marsden NHS Foundation Trust and The Institute of Cancer Research, Fulham Road, London SW3 6JJ (United Kingdom)], E-mail: pguerra@die.um.es

    2008-02-21

    Current requirements of molecular imaging lead to the complete integration of complementary modalities in a single hybrid imaging system to correlate function and structure. Among the various existing detector technologies which can be implemented to integrate nuclear modalities (PET and/or single-photon emission computed tomography) with x-rays (CT) and most probably with MR, pixellated wide bandgap room temperature semiconductor detectors, such as CdZnTe and/or CdTe, are promising candidates. This paper deals with the development of a simplified simulation model for pixellated semiconductor radiation detectors, as a first step towards the performance characterization of a multimodality imaging system based on CdZnTe. In particular, this work presents a simple computational model, based on a 1D approximate solution of the Shockley-Ramo theorem, and its integration into the Geant4 application for tomographic emission (GATE) platform in order to perform accurate and thereby improved simulations of pixellated detectors in different configurations with simultaneous cathode and anode pixel readout. The model presented here is successfully validated against an existing detailed finite element simulator, the multi-geometry simulation code, with respect to the charge induced at the anode, taking into consideration interpixel charge sharing and crosstalk, and to the detector charge induction efficiency. As a final point, the model provides estimated energy spectra and time resolution for 57Co and 18F sources obtained with the GATE code after the incorporation of the proposed model.
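
    For reference, the Shockley-Ramo theorem that the 1D model above approximates can be stated in its generic textbook form (the symbols are not taken from the paper, and sign conventions vary between texts): the charge induced on a readout electrode by a carrier of charge q moving from position x_i to x_f, and the corresponding instantaneous current, are

    Q_{\mathrm{ind}} = q\,\bigl[\varphi_w(\vec{x}_i) - \varphi_w(\vec{x}_f)\bigr],
    \qquad
    i_{\mathrm{ind}}(t) = q\,\vec{v}(t)\cdot\vec{E}_w\bigl(\vec{x}(t)\bigr),

    where the weighting potential φ_w and weighting field E_w are obtained by setting the readout electrode to unit potential and all other electrodes to zero.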

  16. A cycle simulation model for predicting the performance of a diesel engine fuelled by diesel and biodiesel blends

    International Nuclear Information System (INIS)

    Gogoi, T.K.; Baruah, D.C.

    2010-01-01

    Among the alternative fuels, biodiesel and its blends are considered suitable and the most promising fuels for diesel engines. The properties of biodiesel are similar to those of diesel. Many researchers have experimentally evaluated the performance characteristics of conventional diesel engines fuelled by biodiesel and its blends. However, experiments require enormous effort, money and time. Hence, a cycle simulation model incorporating a thermodynamics-based single-zone combustion model is developed to predict the performance of a diesel engine. The effect of engine speed and compression ratio on brake power and brake thermal efficiency is analysed with the model. The fuels considered for the analysis are diesel and 20%, 40% and 60% blends of diesel with biodiesel derived from Karanja oil (Pongamia glabra). The model predicts similar performance for diesel and the 20% and 40% blends; with the 60% blend, however, it predicts better performance in terms of brake power and brake thermal efficiency.
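
    Thermodynamic single-zone models of the kind described above commonly represent the combustion heat release with a Wiebe function; a minimal sketch follows, with the efficiency and shape parameters a and m as illustrative assumptions rather than values fitted in the paper.

    import math

    def wiebe_burn_fraction(theta, theta_soc, burn_duration, a=5.0, m=2.0):
        """Cumulative mass fraction burned at crank angle theta [deg] in a single-zone model.

        theta_soc     : crank angle at the start of combustion [deg]
        burn_duration : total combustion duration [deg]
        a, m          : Wiebe efficiency and shape parameters
        """
        if theta < theta_soc:
            return 0.0
        x = (theta - theta_soc) / burn_duration
        return 1.0 - math.exp(-a * min(x, 1.0) ** (m + 1.0))

    def heat_release_rate(theta, dtheta, q_total_j, theta_soc, burn_duration):
        """Approximate heat-release rate dQ/dtheta [J/deg] by finite differencing."""
        x1 = wiebe_burn_fraction(theta, theta_soc, burn_duration)
        x2 = wiebe_burn_fraction(theta + dtheta, theta_soc, burn_duration)
        return q_total_j * (x2 - x1) / dtheta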

  17. Mode of operation and performance of a simulation model for the electricity management

    International Nuclear Information System (INIS)

    Weible, H.

    1981-01-01

    The first two main parts of this report explain the structure of a simulation model for assessing the consequences of decisions in the electricity supply sector, based on a careful review of the relevant problems. Using systems analysis, the model attempts to describe all essential relations between electricity demand on the one hand and the consequences of electricity generation (including transmission) for consumers, the state, the environment, and the capital and fuel markets on the other. The many modes of operation and application of the model are demonstrated using the public electricity supply of Baden-Wuerttemberg as an example. In addition to a validation of the model for 1970 to 1977, possible trends up to the year 2000 are shown. An essential result of the analyses is that forgoing further expansion of nuclear energy proves unrealistic under an assumed average growth in electricity demand of 4% per annum. A comparison of different model formulations shows that the information loss resulting from the approximation of annual load curves leads to significant deviations in the model results. According to the sensitivity analyses, the growth of electricity consumption is the most important influence on the time-dependent results. (orig./UA) [de]

  18. The computer program LIAR for the simulation and modeling of high performance linacs

    International Nuclear Information System (INIS)

    Assmann, R.; Adolphsen, C.; Bane, K.; Emma, P.; Raubenheimer, T.O.; Siemann, R.; Thompson, K.; Zimmermann, F.

    1997-07-01

    High performance linear accelerators are the central components of the proposed next generation of linear colliders. They must provide acceleration of up to 750 GeV per beam while maintaining small normalized emittances. Standard simulation programs, mainly developed for storage rings, did not meet the specific requirements for high performance linacs with high bunch charges and strong wakefields. The authors present the program LIAR (LInear Accelerator Research code), which includes single- and multi-bunch wakefield effects, a 6D coupled beam description, specific optimization algorithms and other advanced features. LIAR has been applied to and checked against the existing Stanford Linear Collider (SLC), the linacs of the proposed Next Linear Collider (NLC) and the proposed Linac Coherent Light Source (LCLS) at SLAC. Its modular structure allows easy extension for different purposes. The program is available for UNIX workstations and Windows PCs.

  19. Modeling and Simulation of Long-Term Performance of Near-Surface Barriers

    International Nuclear Information System (INIS)

    Piet, S. J.; Jacobson, J. J.; Martian, P.; Martineau, R.; Soto, R.

    2003-01-01

    INEEL started a new project on long-term barrier integrity in April 2002 that aims to catalyze a Barrier Improvement Cycle (iterative learning and application) and thus enable Remediation System Performance Management (doing the right maintenance neither too early nor too late, prior to system-level failure). This paper describes our computer simulation approach for better understanding the relationships and dynamics between the various components and management decisions in a cap. The simulation is designed to clarify the complex relationships between the various components within the cap system and the various management practices that affect the barrier performance. We have also conceptualized a time-dependent 3-D simulation with rigorous solution to unsaturated flow physics with complex surface boundary conditions

  20. Predictive Maturity of Multi-Scale Simulation Models for Fuel Performance

    International Nuclear Information System (INIS)

    Atamturktur, Sez; Unal, Cetin; Hemez, Francois; Williams, Brian; Tome, Carlos

    2015-01-01

    The project proposed to provide a Predictive Maturity Framework with its companion metrics that (1) introduce a formalized, quantitative means to communicate information between interested parties, (2) provide scientifically dependable means to claim completion of Validation and Uncertainty Quantification (VU) activities, and (3) guide the decision makers in the allocation of Nuclear Energy's resources for code development and physical experiments. The project team proposed to develop this framework based on two complementary criteria: (1) the extent of experimental evidence available for the calibration of simulation models and (2) the sophistication of the physics incorporated in simulation models. The proposed framework is capable of quantifying the interaction between the required number of physical experiments and degree of physics sophistication. The project team has developed this framework and implemented it with a multi-scale model for simulating creep of a core reactor cladding. The multi-scale model is composed of the viscoplastic self-consistent (VPSC) code at the meso-scale, which represents the visco-plastic behavior and changing properties of a highly anisotropic material, and a Finite Element (FE) code at the macro-scale to represent the elastic behavior and apply the loading. The framework developed takes advantage of the transparency provided by partitioned analysis, where independent constituent codes are coupled in an iterative manner. This transparency allows model developers to better understand and remedy the source of biases and uncertainties, whether they stem from the constituents or the coupling interface, by exploiting separate-effect experiments conducted within the constituent domain and integral-effect experiments conducted within the full-system domain. The project team has implemented this procedure with the multi-scale VPSC-FE model and demonstrated its ability to improve the predictive capability of the model. Within this

  1. Predictive Maturity of Multi-Scale Simulation Models for Fuel Performance

    Energy Technology Data Exchange (ETDEWEB)

    Atamturktur, Sez [Clemson Univ., SC (United States); Unal, Cetin [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hemez, Francois [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Williams, Brian [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Tome, Carlos [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-03-16

    The project proposed to provide a Predictive Maturity Framework with its companion metrics that (1) introduce a formalized, quantitative means to communicate information between interested parties, (2) provide scientifically dependable means to claim completion of Validation and Uncertainty Quantification (VU) activities, and (3) guide the decision makers in the allocation of Nuclear Energy’s resources for code development and physical experiments. The project team proposed to develop this framework based on two complementary criteria: (1) the extent of experimental evidence available for the calibration of simulation models and (2) the sophistication of the physics incorporated in simulation models. The proposed framework is capable of quantifying the interaction between the required number of physical experiments and degree of physics sophistication. The project team has developed this framework and implemented it with a multi-scale model for simulating creep of a core reactor cladding. The multi-scale model is composed of the viscoplastic self-consistent (VPSC) code at the meso-scale, which represents the visco-plastic behavior and changing properties of a highly anisotropic material, and a Finite Element (FE) code at the macro-scale to represent the elastic behavior and apply the loading. The framework developed takes advantage of the transparency provided by partitioned analysis, where independent constituent codes are coupled in an iterative manner. This transparency allows model developers to better understand and remedy the source of biases and uncertainties, whether they stem from the constituents or the coupling interface, by exploiting separate-effect experiments conducted within the constituent domain and integral-effect experiments conducted within the full-system domain. The project team has implemented this procedure with the multi-scale VPSC-FE model and demonstrated its ability to improve the predictive capability of the model. Within this

  2. Methodology and Toolset for Model Verification, Hardware/Software co-simulation, Performance Optimisation and Customisable Source-code generation

    DEFF Research Database (Denmark)

    Berger, Michael Stübert; Soler, José; Yu, Hao

    2013-01-01

    The MODUS project aims to provide a pragmatic and viable solution that will allow SMEs to substantially improve their positioning in the embedded-systems development market. The MODUS tool will provide a model verification and Hardware/Software co-simulation tool (TRIAL) and a performance...... optimisation and customisable source-code generation tool (TUNE). The concept is depicted in automated modelling and optimisation of embedded-systems development. The tool will enable model verification by guiding the selection of existing open-source model verification engines, based on the automated analysis...

  3. Signal and image processing systems performance evaluation, simulation, and modeling; Proceedings of the Meeting, Orlando, FL, Apr. 4, 5, 1991

    Science.gov (United States)

    Nasr, Hatem N.; Bazakos, Michael E.

    The various aspects of the evaluation and modeling problems in algorithms, sensors, and systems are addressed. Consideration is given to a generic modular imaging IR signal processor, real-time architecture based on the image-processing module family, application of the Proto Ware simulation testbed to the design and evaluation of advanced avionics, development of a fire-and-forget imaging infrared seeker missile simulation, an adaptive morphological filter for image processing, laboratory development of a nonlinear optical tracking filter, a dynamic end-to-end model testbed for IR detection algorithms, wind tunnel model aircraft attitude and motion analysis, an information-theoretic approach to optimal quantization, parametric analysis of target/decoy performance, neural networks for automated target recognition parameters adaptation, performance evaluation of a texture-based segmentation algorithm, evaluation of image tracker algorithms, and multisensor fusion methodologies. (No individual items are abstracted in this volume)

  4. Constructing probability distributions of uncertain variables in models of the performance of the Waste Isolation Pilot Plant: The 1990 performance simulations

    International Nuclear Information System (INIS)

    Tierney, M.S.

    1990-12-01

    A five-step procedure was used in the 1990 performance simulations to construct probability distributions of the uncertain variables appearing in the mathematical models used to simulate the Waste Isolation Pilot Plant's (WIPP's) performance. This procedure provides a consistent approach to the construction of probability distributions in cases where empirical data concerning a variable are sparse or absent and minimizes the amount of spurious information that is often introduced into a distribution by assumptions of nonspecialists. The procedure gives first priority to the professional judgment of subject-matter experts and emphasizes the use of site-specific empirical data for the construction of the probability distributions when such data are available. In the absence of sufficient empirical data, the procedure employs the Maximum Entropy Formalism and the subject-matter experts' subjective estimates of the parameters of the distribution to construct a distribution that can be used in a performance simulation. (author)
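
    As an illustration of the Maximum Entropy Formalism invoked above (a generic textbook case, not the report's specific five-step procedure): for a variable known only to lie in an interval [a, b] with a subjectively estimated mean mu, maximizing the information entropy subject to those constraints yields a truncated exponential distribution,

    \max_{p}\; H[p] = -\int_a^b p(x)\,\ln p(x)\,dx
    \quad\text{subject to}\quad
    \int_a^b p(x)\,dx = 1,\qquad \int_a^b x\,p(x)\,dx = \mu,
    \qquad\Longrightarrow\qquad
    p(x) = \frac{e^{-\lambda x}}{\int_a^b e^{-\lambda x'}\,dx'},

    with the multiplier lambda fixed by the mean constraint (lambda tending to zero recovers the uniform distribution when mu is the interval midpoint); if no mean is specified, the maximum-entropy choice on [a, b] is simply the uniform distribution.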

  5. Constructing probability distributions of uncertain variables in models of the performance of the Waste Isolation Pilot Plant: The 1990 performance simulations

    Energy Technology Data Exchange (ETDEWEB)

    Tierney, M S

    1990-12-15

    A five-step procedure was used in the 1990 performance simulations to construct probability distributions of the uncertain variables appearing in the mathematical models used to simulate the Waste Isolation Pilot Plant's (WIPP's) performance. This procedure provides a consistent approach to the construction of probability distributions in cases where empirical data concerning a variable are sparse or absent and minimizes the amount of spurious information that is often introduced into a distribution by assumptions of nonspecialists. The procedure gives first priority to the professional judgment of subject-matter experts and emphasizes the use of site-specific empirical data for the construction of the probability distributions when such data are available. In the absence of sufficient empirical data, the procedure employs the Maximum Entropy Formalism and the subject-matter experts' subjective estimates of the parameters of the distribution to construct a distribution that can be used in a performance simulation. (author)

  6. Wake modeling and simulation

    DEFF Research Database (Denmark)

    Larsen, Gunner Chr.; Madsen Aagaard, Helge; Larsen, Torben J.

    We present a consistent, physically based theory for the wake meandering phenomenon, which we consider of crucial importance for the overall description of wind turbine loadings in wind farms. In its present version the model is confined to single wake situations. The model philosophy does, howev...... methodology has been implemented in the aeroelastic code HAWC2, and example simulations of wake situations, from the small Tjæreborg wind farm, have been performed showing satisfactory agreement between predictions and measurements...

  7. Modeling and Simulation of Thermal Performance of Solar-Assisted Air Conditioning System under Iraq Climate

    Directory of Open Access Journals (Sweden)

    Najim Abid Jassim

    2016-08-01

    In Iraq, most small buildings deploy conventional air-conditioning technology that typically uses electrically driven compressor systems, which exhibit clear disadvantages such as high energy consumption and high electricity demand at peak loads. In this work the thermal performance of an air-conditioning system combined with a solar collector is investigated theoretically. The hybrid air conditioner consists of a semi-hermetic compressor, a water-cooled shell-and-tube condenser, a thermal expansion valve and a coil-with-tank evaporator. The theoretical analysis includes a simulation of the solar-assisted air-conditioning system using EES software to analyze the effect of different parameters on the compressor power consumption and the system performance. The results show that the refrigeration capacity increases from 2.7 kW to 4.4 kW as the evaporating temperature increases from 3 to 18 ºC, while the power consumption increases from 0.89 kW to 1.08 kW, so the COP of the system increases from 3.068 to 4.117. The power consumption increases from 0.897 kW to 1.031 kW as the condensing temperature increases from 35 ºC to 45 ºC, while the COP decreases from 3.89 to 3.1. The power consumption decreases from 1.05 kW to 0.7 kW as the solar radiation intensity increases from 300 W/m2 to 1000 W/m2, while the COP increases from 3.15 to 4.8. A comparison between the simulation and available experimental data shows acceptable agreement.
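
    The COP values quoted above follow directly from the ratio of refrigeration capacity to compressor power; a trivial sketch, using the abstract's own example numbers (which are rounded), is:

    def cop(refrigeration_capacity_kw, compressor_power_kw):
        """Coefficient of performance of a vapour-compression cycle."""
        return refrigeration_capacity_kw / compressor_power_kw

    # At the higher evaporating temperature reported above: 4.4 kW capacity, 1.08 kW input
    print(round(cop(4.4, 1.08), 2))  # ~4.07, consistent with the reported COP of about 4.1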

  8. Urban weather data and building models for the inclusion of the urban heat island effect in building performance simulation.

    Science.gov (United States)

    Palme, M; Inostroza, L; Villacreses, G; Lobato, A; Carrasco, C

    2017-10-01

    This data article presents files supporting calculation for urban heat island (UHI) inclusion in building performance simulation (BPS). The methodology is used in the research article "From urban climate to energy consumption. Enhancing building performance simulation by including the urban heat island effect" (Palme et al., 2017) [1]. In this research, a Geographical Information System (GIS) study is done in order to statistically represent the most important urban scenarios of four South-American cities (Guayaquil, Lima, Antofagasta and Valparaíso). Then, a Principal Component Analysis (PCA) is done to obtain reference Urban Tissues Categories (UTC) to be used in urban weather simulation. The urban weather files are generated by using the Urban Weather Generator (UWG) software (version 4.1 beta). Finally, BPS is run with the Transient System Simulation (TRNSYS) software (version 17). In this data paper, four sets of data are presented: 1) PCA data (Excel) to explain how to group different urban samples in representative UTC; 2) UWG data (text) to reproduce the Urban Weather Generation for the UTC used in the four cities (4 UTC in Lima, Guayaquil, Antofagasta and 5 UTC in Valparaíso); 3) weather data (text) with the resulting rural and urban weather; 4) BPS model data (text) containing the TRNSYS models (four building models).

  9. Urban weather data and building models for the inclusion of the urban heat island effect in building performance simulation

    Directory of Open Access Journals (Sweden)

    M. Palme

    2017-10-01

    This data article presents files supporting calculation for urban heat island (UHI) inclusion in building performance simulation (BPS). The methodology is used in the research article “From urban climate to energy consumption. Enhancing building performance simulation by including the urban heat island effect” (Palme et al., 2017) [1]. In this research, a Geographical Information System (GIS) study is done in order to statistically represent the most important urban scenarios of four South-American cities (Guayaquil, Lima, Antofagasta and Valparaíso). Then, a Principal Component Analysis (PCA) is done to obtain reference Urban Tissues Categories (UTC) to be used in urban weather simulation. The urban weather files are generated by using the Urban Weather Generator (UWG) software (version 4.1 beta). Finally, BPS is run with the Transient System Simulation (TRNSYS) software (version 17). In this data paper, four sets of data are presented: 1) PCA data (Excel) to explain how to group different urban samples in representative UTC; 2) UWG data (text) to reproduce the Urban Weather Generation for the UTC used in the four cities (4 UTC in Lima, Guayaquil, Antofagasta and 5 UTC in Valparaíso); 3) weather data (text) with the resulting rural and urban weather; 4) BPS model data (text) containing the TRNSYS models (four building models).

  10. Physical modelling of the composting environment: A review. Part 2: Simulation performance

    International Nuclear Information System (INIS)

    Mason, I.G.; Milke, M.W.

    2005-01-01

    This paper reviews previously published heat balance data for experimental and full-scale composting reactors, and then presents an evaluation of the simulation performance of laboratory and pilot-scale reactors, using both quantitative and qualitative temperature profile characteristics. The review indicates that laboratory-scale reactors have typically demonstrated markedly different heat balance behaviour in comparison to full-scale systems, with ventilative heat losses of 36-67%, and 70-95% of the total flux, respectively. Similarly, conductive/convective/radiative (CCR) heat losses from laboratory reactors have been reported at 33-62% of the total flux, whereas CCR losses from full-scale composting systems have ranged from 3% to 15% of the total. Full-scale windrow temperature-time profiles from the literature were characterised by the present authors. Areas bounded by the curve and a 40 deg. C baseline (A40) exceeded 624 deg. C days, areas bounded by the curve and a 55 deg. C baseline (A55) exceeded 60 deg. C days, and times at 40 and 55 deg. C were >46 days and >24 days, respectively, over periods of 50-74 days. For forced aeration systems at full scale, values of A40 exceeded 224 deg. C days, values of A55 exceeded 26 deg. C days, and times at 40 and 55 deg. C were >14 days and >10 days, respectively, over periods of 15-35 days. Values of these four parameters for laboratory-scale reactors were typically considerably lower than for the full-scale systems, although temperature shape characteristics were often similar to those in full-scale profiles. Evaluation of laboratory-, pilot- and full-scale profiles from systems treating the same substrate showed that a laboratory-scale reactor and two pilot-scale reactors operated at comparatively high aeration rates poorly simulated full-scale temperature profiles. However, the curves from two moderately insulated, self-heating, pilot-scale reactors operated at relatively low aeration rates appeared to
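
    The A40 and A55 areas and the times-above-temperature used above can be computed from a sampled temperature profile as sketched below; the daily sampling interval and the rectangle-rule integration are illustrative assumptions.

    def area_above_baseline(temps_c, baseline_c, dt_days=1.0):
        """Area between a temperature-time profile and a baseline [deg. C days].

        temps_c    : time series of reactor temperatures [deg. C]
        baseline_c : baseline temperature, e.g. 40 or 55 deg. C
        dt_days    : sampling interval [days]
        """
        return sum(max(t - baseline_c, 0.0) * dt_days for t in temps_c)

    def time_above(temps_c, baseline_c, dt_days=1.0):
        """Total time the profile spends at or above the baseline [days]."""
        return sum(dt_days for t in temps_c if t >= baseline_c)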

  11. Modelling and simulation of the dynamic performance of a natural-gas turbine flowmeter

    Energy Technology Data Exchange (ETDEWEB)

    Lopez-Gonzalez, L.M. [Escuela Tecnica Superior de Ingenieria Industrial, Universidad de La Rioja, C/Luis de Ulloa, 20, E-26004 Logrono (La Rioja) (Spain); Sala, J.M.; Gonzalez-Bustamante, J.A. [Escuela Superior de Ingenieros Industriales de Bilbao, Universidad del Pais Vasco, Alameda de Urquijo, s/n 48013 Bilbao (Bizkaia) (Spain); Miguez, J.L. [Universidad de Vigo, Escuela Tecnica Superior de Ingenieros Industriales, C/Lagoas-Marcosende, s/n 36200 Vigo (Pontevedra) (Spain)

    2006-11-15

    Installations involving fluids often present problems in terms of the dynamic performances of their different parts. These problems can be analysed and dealt with at the design stage. This means that both the technologists who design the thermohydraulic process and those who carry out the regulation and control must be involved in the process from the early stages of the design. In this study, a dynamic model of the behaviour of a gas flowmeter has been developed, based on the laws of conservation of mass, linear momentum, energy and angular momentum. The model has been computerised via a software module. As there is no information available with which to compare the model's behaviour, a continuous rating validation has been carried out, using a comparison with the actual calibration curve of the flowmeter. The results obtained are satisfactory. (author)

  12. Multidisciplinary Energy Assessment of Tertiary Buildings: Automated Geomatic Inspection, Building Information Modeling Reconstruction and Building Performance Simulation

    Directory of Open Access Journals (Sweden)

    Faustino Patiño-Cambeiro

    2017-07-01

    There is an urgent need for energy efficiency in buildings within the European framework, considering its environmental implications and Europe's energy dependence. Furthermore, the need to enhance and increase productivity in the building industry makes new technologies and building energy performance simulation environments extremely interesting solutions for rigorous analysis and decision making in renovation within acceptable risk levels. The present work describes a multidisciplinary approach for the estimation of the energy performance of an educational building. The research involved data acquisition with advanced geomatic tools, the development of an optimized building information model, and energy assessment in Building Performance Simulation (BPS) software. Interoperability issues were observed in the different steps of the process. The inspection and diagnostic phases were conducted in a timely, accurate manner thanks to automated data acquisition and subsequent analysis using Building Information Modeling (BIM)-based tools. Energy simulation was performed using Design Builder, and the results obtained were compared with those yielded by the official software tool established by Spanish regulations for energy certification. The discrepancies between the results of both programs show that the official software program is conservative in this sense, which may lead to an undervaluation of the assessed buildings.

  13. Performance assessment of Large Eddy Simulation (LES) for modeling dispersion in an urban street canyon with tree planting

    Science.gov (United States)

    Moonen, P.; Gromke, C.; Dorer, V.

    2013-08-01

    The potential of a Large Eddy Simulation (LES) model to reliably predict near-field pollutant dispersion is assessed. To that extent, detailed time-resolved numerical simulations of coupled flow and dispersion are conducted for a street canyon with tree planting. Different crown porosities are considered. The model performance is assessed in several steps, ranging from a qualitative comparison to measured concentrations, over statistical data analysis by means of scatter plots and box plots, up to the calculation of objective validation metrics. The extensive validation effort highlights and quantifies notable features and shortcomings of the model, which would otherwise remain unnoticed. The model performance is found to be spatially non-uniform. Closer agreement with measurement data is achieved near the canyon ends than for the central part of the canyon, and typical model acceptance criteria are satisfied more easily for the leeward than for the windward canyon wall. This demonstrates the need for rigorous model evaluation. Only quality-assured models can be used with confidence to support assessment, planning and implementation of pollutant mitigation strategies.
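
    The abstract does not name the objective validation metrics that were calculated; metrics commonly used for near-field dispersion model evaluation, such as fractional bias (FB), normalized mean square error (NMSE) and the fraction of predictions within a factor of two of observations (FAC2), can be sketched as follows.

    def validation_metrics(observed, predicted):
        """FB, NMSE and FAC2 for paired observed/predicted concentrations (positive values, same length)."""
        n = len(observed)
        mo = sum(observed) / n
        mp = sum(predicted) / n
        fb = 2.0 * (mo - mp) / (mo + mp)
        nmse = sum((o - p) ** 2 for o, p in zip(observed, predicted)) / (n * mo * mp)
        fac2 = sum(1 for o, p in zip(observed, predicted) if 0.5 <= p / o <= 2.0) / n
        return fb, nmse, fac2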

  14. Identifying a key physical factor sensitive to the performance of Madden-Julian oscillation simulation in climate models

    Science.gov (United States)

    Kim, Go-Un; Seo, Kyong-Hwan

    2018-01-01

    A key physical factor in regulating the performance of Madden-Julian oscillation (MJO) simulation is examined by using 26 climate model simulations from the World Meteorological Organization's Working Group for Numerical Experimentation/Global Energy and Water Cycle Experiment Atmospheric System Study (WGNE and MJO-Task Force/GASS) global model comparison project. For this, the intraseasonal moisture budget equation is analyzed and a simple, efficient physical quantity is developed. The results show that MJO skill is most sensitive to the vertically integrated intraseasonal zonal wind convergence (ZC). In particular, a specific threshold value of the ZC strength can be used to distinguish between good and poor models. An additional finding is that good models exhibit the correct simultaneous convection and large-scale circulation phase relationship. In poor models, however, the peak circulation response appears 3 days after peak rainfall, suggesting unfavorable coupling between convection and circulation. To improve the simulation of the MJO in climate models, we propose that this delay of the circulation response to convection needs to be corrected in the cumulus parameterization scheme.
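
    For reference, the column-integrated intraseasonal moisture budget that such an analysis is based on can be written in a standard form (generic notation, not necessarily the exact formulation of the study):

    \frac{\partial \langle q \rangle'}{\partial t}
    = -\left\langle u\,\frac{\partial q}{\partial x} \right\rangle'
      -\left\langle v\,\frac{\partial q}{\partial y} \right\rangle'
      -\left\langle \omega\,\frac{\partial q}{\partial p} \right\rangle'
      + E' - P',
    \qquad
    \langle \cdot \rangle \equiv \frac{1}{g}\int_{p_t}^{p_s} (\cdot)\, dp,

    with primes denoting the intraseasonal (e.g., 20-100-day filtered) anomaly; the ZC diagnostic discussed above corresponds to the vertically integrated anomalous zonal wind convergence, -< \partial u / \partial x >'.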

  15. Evaluation of manual and automatic manually triggered ventilation performance and ergonomics using a simulation model.

    Science.gov (United States)

    Marjanovic, Nicolas; Le Floch, Soizig; Jaffrelot, Morgan; L'Her, Erwan

    2014-05-01

    In the absence of endotracheal intubation, the manual bag-valve-mask (BVM) is the most frequently used ventilation technique during resuscitation. The efficiency of other devices has been poorly studied. The bench-test study described here was designed to evaluate the effectiveness of an automatic, manually triggered system, and to compare it with manual BVM ventilation. A respiratory system bench model was assembled using a lung simulator connected to a manikin to simulate a patient with unprotected airways. Fifty health-care providers from different professional groups (emergency physicians, residents, advanced paramedics, nurses, and paramedics; n = 10 per group) evaluated manual BVM ventilation, and compared it with an automatic manually triggered device (EasyCPR). Three pathological situations were simulated (restrictive, obstructive, normal). Standard ventilation parameters were recorded; the ergonomics of the system were assessed by the health-care professionals using a standard numerical scale once the recordings were completed. The tidal volume fell within the standard range (400-600 mL) for 25.6% of breaths (0.6-45 breaths) using manual BVM ventilation, and for 28.6% of breaths (0.3-80 breaths) using the automatic manually triggered device (EasyCPR) (P < .0002). Peak inspiratory airway pressure was lower using the automatic manually triggered device (EasyCPR) (10.6 ± 5 vs 15.9 ± 10 cm H2O, P < .001). The ventilation rate fell consistently within the guidelines, in the case of the automatic manually triggered device (EasyCPR) only (10.3 ± 2 vs 17.6 ± 6, P < .001). Significant pulmonary overdistention was observed when using the manual BVM device during the normal and obstructive sequences. The nurses and paramedics considered the ergonomics of the automatic manually triggered device (EasyCPR) to be better than those of the manual device. The use of an automatic manually triggered device may improve ventilation efficiency and decrease the risk of

  16. Qualification of a Plant Disease Simulation Model: Performance of the LATEBLIGHT Model Across a Broad Range of Environments.

    Science.gov (United States)

    Andrade-Piedra, Jorge L; Forbes, Gregory A; Shtienberg, Dani; Grünwald, Niklaus J; Chacón, María G; Taipe, Marco V; Hijmans, Robert J; Fry, William E

    2005-12-01

    The concept of model qualification, i.e., discovering the domain over which a validated model may be properly used, was illustrated with LATEBLIGHT, a mathematical model that simulates the effect of weather, host growth and resistance, and fungicide use on asexual development and growth of Phytophthora infestans on potato foliage. Late blight epidemics from Ecuador, Mexico, Israel, and the United States involving 13 potato cultivars (32 epidemics in total) were compared with model predictions using graphical and statistical tests. Fungicides were not applied in any of the epidemics. For the simulations, a host resistance level was assigned to each cultivar based on general categories reported by local investigators. For eight cultivars, the model predictions fit the observed data. For four cultivars, the model predictions overestimated disease, likely due to inaccurate estimates of host resistance. Model predictions were inconsistent for one cultivar and for one location. It was concluded that the domain of applicability of LATEBLIGHT can be extended from the range of conditions in Peru for which it has been previously validated to those observed in this study. A sensitivity analysis showed that, within the range of values observed empirically, LATEBLIGHT is more sensitive to changes in variables related to initial inoculum and to weather than to changes in variables relating to host resistance.

  17. A comprehensive simulation model of the performance of photochromic films in absorbance-modulation-optical-lithography

    Directory of Open Access Journals (Sweden)

    Apratim Majumder

    2016-03-01

    Optical lithography is the most prevalent method of fabricating micro- and nano-scale structures in the semiconductor industry due to the fact that patterning using photons is fast, accurate and provides high throughput. However, the resolution of this technique is inherently limited by the physical phenomenon of diffraction. Absorbance-Modulation-Optical Lithography (AMOL), a recently developed technique, has been successfully demonstrated to be able to circumvent this diffraction limit. AMOL employs a dual-wavelength exposure system in conjunction with spectrally selective reversible photo-transitions in thin films of photochromic molecules to achieve patterning of features with sizes beyond the far-field diffraction limit. We have developed a finite-element-method based full-electromagnetic-wave solution model that simulates the photo-chemical processes that occur within the thin film of the photochromic molecules under illumination by the exposure and confining wavelengths in AMOL. This model allows us to understand how the material characteristics influence the confinement to sub-diffraction dimensions of the transmitted point spread function (PSF) of the exposure wavelength inside the recording medium. The model reported here provides the most comprehensive analysis of the AMOL process to date, and the results show that the most important factors that govern the process are the polarization of the two beams, the ratio of the intensities of the two wavelengths, the relative absorption coefficients and the concentration of the photochromic species, the thickness of the photochromic layer and the quantum yields of the photoreactions at the two wavelengths. The aim of this work is to elucidate the requirements of AMOL in successfully circumventing the far-field diffraction limit.

  18. A comprehensive simulation model of the performance of photochromic films in absorbance-modulation-optical-lithography

    Energy Technology Data Exchange (ETDEWEB)

    Majumder, Apratim; Helms, Phillip L.; Menon, Rajesh, E-mail: rmenon@eng.utah.edu [Department of Electrical and Computer Engineering, University of Utah, Salt Lake City, Utah 84112 (United States); Andrew, Trisha L. [Department of Chemistry, University of Wisconsin-Madison, Madison, Wisconsin 53706 (United States)

    2016-03-15

    Optical lithography is the most prevalent method of fabricating micro- and nano-scale structures in the semiconductor industry due to the fact that patterning using photons is fast, accurate and provides high throughput. However, the resolution of this technique is inherently limited by the physical phenomenon of diffraction. Absorbance-Modulation-Optical Lithography (AMOL), a recently developed technique, has been successfully demonstrated to be able to circumvent this diffraction limit. AMOL employs a dual-wavelength exposure system in conjunction with spectrally selective reversible photo-transitions in thin films of photochromic molecules to achieve patterning of features with sizes beyond the far-field diffraction limit. We have developed a finite-element-method based full-electromagnetic-wave solution model that simulates the photo-chemical processes that occur within the thin film of the photochromic molecules under illumination by the exposure and confining wavelengths in AMOL. This model allows us to understand how the material characteristics influence the confinement to sub-diffraction dimensions of the transmitted point spread function (PSF) of the exposure wavelength inside the recording medium. The model reported here provides the most comprehensive analysis of the AMOL process to date, and the results show that the most important factors that govern the process are the polarization of the two beams, the ratio of the intensities of the two wavelengths, the relative absorption coefficients and the concentration of the photochromic species, the thickness of the photochromic layer and the quantum yields of the photoreactions at the two wavelengths. The aim of this work is to elucidate the requirements of AMOL in successfully circumventing the far-field diffraction limit.

  19. Nuclear Energy Advanced Modeling and Simulation (NEAMS) Waste Integrated Performance and Safety Codes (IPSC) : FY10 development and integration.

    Energy Technology Data Exchange (ETDEWEB)

    Criscenti, Louise Jacqueline; Sassani, David Carl; Arguello, Jose Guadalupe, Jr.; Dewers, Thomas A.; Bouchard, Julie F.; Edwards, Harold Carter; Freeze, Geoffrey A.; Wang, Yifeng; Schultz, Peter Andrew

    2011-02-01

    This report describes the progress in fiscal year 2010 in developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with robust verification, validation, and software quality requirements. Waste IPSC activities in fiscal year 2010 focused on specifying a challenge problem to demonstrate proof of concept, developing a verification and validation plan, and performing an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. This year-end progress report documents the FY10 status of acquisition, development, and integration of thermal-hydrologic-chemical-mechanical (THCM) code capabilities, frameworks, and enabling tools and infrastructure.

  20. Determining the energy performance of manually controlled solar shades: A stochastic model based co-simulation analysis

    International Nuclear Information System (INIS)

    Yao, Jian

    2014-01-01

    Highlights: • The driving factor for adjustment of manually controlled solar shades was determined. • A stochastic model for manual solar shades was constructed using the Markov method. • Co-simulation with EnergyPlus was carried out in BCVTB. • External shading, even manually controlled, should be used in preference to LOW-E windows. • Previous studies on manual solar shades may overestimate energy savings. - Abstract: Solar shading devices play a significant role in reducing building energy consumption and maintaining a comfortable indoor condition. In this paper, a typical office building with internal roller shades in the hot-summer and cold-winter zone was selected to determine the driving factor behind the control behavior of manual solar shades. Solar radiation was identified as the major factor driving solar shading adjustment, based on field measurements and logit analysis, and a stochastic model for manually adjusted solar shades was then constructed using the Markov method. This model was used in BCVTB for co-simulation with EnergyPlus to determine the impact of the control behavior of solar shades on energy performance. The results show that manually adjusted solar shades, whether located inside or outside, have a higher energy-saving performance than clear-pane windows, while only external shades perform better than regularly used LOW-E windows. The simulations also indicate that using an idealized assumption of solar shade adjustment, as most building simulation studies do, may lead to an overestimation of energy savings by about 16–30%. Occupants’ operation of shades needs to respond more effectively to outdoor conditions in order to lower energy consumption, and this improvement can be achieved by using simple strategies as a guide to control manual solar shades.
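
    A Markov model of manual shade operation of the kind described above can be sketched as a state transition conditioned on the driving factor (solar radiation). In the Python sketch below, the radiation bands and all transition probabilities are illustrative placeholders, not the probabilities fitted from the field measurements in the paper.

    import random

    def simulate_shade_state(solar_radiation_wm2, hours, thresholds=(150.0, 400.0), seed=0):
        """Hourly shade position (0 = open, 1 = half, 2 = closed) from a simple Markov model."""
        rng = random.Random(seed)
        # transition_probs[band][current_state] -> probabilities of the next state (0, 1, 2)
        transition_probs = {
            "low":    [[0.90, 0.08, 0.02], [0.60, 0.35, 0.05], [0.40, 0.30, 0.30]],
            "medium": [[0.55, 0.35, 0.10], [0.20, 0.60, 0.20], [0.10, 0.30, 0.60]],
            "high":   [[0.20, 0.35, 0.45], [0.05, 0.35, 0.60], [0.02, 0.08, 0.90]],
        }
        state, states = 0, []
        for h in range(hours):
            rad = solar_radiation_wm2[h]
            band = "low" if rad < thresholds[0] else "medium" if rad < thresholds[1] else "high"
            state = rng.choices([0, 1, 2], weights=transition_probs[band][state])[0]
            states.append(state)
        return states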

  1. Lasertron performance simulation

    International Nuclear Information System (INIS)

    Dubrovin, A.; Coulon, J.P.

    1987-05-01

    This report presents a comparative simulation study of the Lasertron at different frequencies and emission conditions, with a view to establishing selection criteria for future experiments. The RING program used for these simulations is an improved version of the one presented in another report. A self-consistent treatment of the R.F. extraction zone has been added to it, together with the possibility of varying the initial conditions to better describe the laser illumination and the electron extraction from the cathode. Plane or curved cathodes are used.

  2. Modeling of electrochemistry and steam-methane reforming performance for simulating pressurized solid oxide fuel cell stacks

    Energy Technology Data Exchange (ETDEWEB)

    Recknagle, Kurtis P.; Ryan, Emily M.; Koeppel, Brian J.; Mahoney, Lenna A.; Khaleel, Moe A. [Pacific Northwest National Laboratory, Richland, WA 99352 (United States)

    2010-10-01

    This paper examines the electrochemical and direct internal steam-methane reforming performance of the solid oxide fuel cell when subjected to pressurization. Pressurized operation boosts the Nernst potential and decreases the activation polarization, both of which serve to increase cell voltage and power while lowering the heat load and operating temperature. A model considering the activation polarization in both the fuel and the air electrodes was adopted to address this effect on the electrochemical performance. The pressurized methane conversion kinetics and the increase in equilibrium methane concentration are considered in a new rate expression. The models were then applied in simulations to predict how the distributions of direct internal reforming rate, temperature, and current density are affected within stacks operating at elevated pressure. A generic 10 cm counter-flow stack model was created and used for the simulations of pressurized operation. The predictions showed improved thermal and electrical performance with increased operating pressure. The average and maximum cell temperatures decreased by 3% (20 C) while the cell voltage increased by 9% as the operating pressure was increased from 1 to 10 atm. (author)
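
    The pressure dependence of the Nernst potential mentioned above can be illustrated with the standard Nernst relation for hydrogen oxidation. The sketch below is not the stack model from this paper; the standard potential, temperature, and gas composition are assumed values chosen only to show the qualitative voltage gain with pressurization.

```python
import math

R = 8.314      # gas constant, J/(mol K)
F = 96485.0    # Faraday constant, C/mol

def nernst_potential(T_K, p_h2, p_o2, p_h2o, E0=0.95):
    """Nernst potential (V) for H2 + 1/2 O2 -> H2O; partial pressures in atm.
    E0 is an assumed standard potential at the given temperature, not a fitted value."""
    return E0 + (R * T_K / (2.0 * F)) * math.log(p_h2 * math.sqrt(p_o2) / p_h2o)

# Fixed gas composition (mole fractions), total pressure raised from 1 to 10 atm
x_h2, x_h2o, x_o2 = 0.5, 0.5, 0.21
for p_total in (1.0, 10.0):
    E = nernst_potential(1073.0, x_h2 * p_total, x_o2 * p_total, x_h2o * p_total)
    print(f"{p_total:4.0f} atm -> Nernst potential {E:.3f} V")
```

    With a fixed gas composition, only the oxygen partial pressure term survives the pressure scaling, so the potential rises by roughly (RT/4F)·ln(10), a few tens of millivolts at typical SOFC temperatures.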

  3. Building performance simulation in the early design stage: An introduction to integrated dynamic models

    DEFF Research Database (Denmark)

    Negendahl, Kristoffer

    2015-01-01

    , a visual programming language and a BPS to provide better support for the designer during the early stages of design as opposed to alternatives such as the current implementation of IFC or gbXML or the unaccompanied use of simulation packages. (C) 2015 Elsevier B.V. All rights reserved....

  4. Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC): gap analysis for high fidelity and performance assessment code development

    International Nuclear Information System (INIS)

    Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe Jr.; Webb, Stephen Walter; Dewers, Thomas A.; Mariner, Paul E.; Edwards, Harold Carter; Fuller, Timothy J.; Freeze, Geoffrey A.; Jove-Colon, Carlos F.; Wang, Yifeng

    2011-01-01

    This report describes a gap analysis performed in the process of developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with rigorous verification, validation, and software quality requirements. The gap analyses documented in this report were performed during an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC, and during follow-on activities that delved into more detailed assessments of the various codes that were acquired, studied, and tested. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. The gap analysis indicates that significant capabilities may already exist in the existing THC codes, although there is no single code able to fully account for all physical and chemical processes involved in a waste disposal system. Large gaps exist in modeling chemical processes and their couplings with other processes. The coupling of chemical processes with flow transport and mechanical deformation remains challenging. The data for extreme environments (e.g., for elevated temperature and high ionic strength media) that are

  5. Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC): gap analysis for high fidelity and performance assessment code development.

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe, Jr.; Webb, Stephen Walter; Dewers, Thomas A.; Mariner, Paul E.; Edwards, Harold Carter; Fuller, Timothy J.; Freeze, Geoffrey A.; Jove-Colon, Carlos F.; Wang, Yifeng

    2011-03-01

    This report describes a gap analysis performed in the process of developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with rigorous verification, validation, and software quality requirements. The gap analyses documented in this report were performed during an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC, and during follow-on activities that delved into more detailed assessments of the various codes that were acquired, studied, and tested. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. The gap analysis indicates that significant capabilities may already exist in the existing THC codes, although there is no single code able to fully account for all physical and chemical processes involved in a waste disposal system. Large gaps exist in modeling chemical processes and their couplings with other processes. The coupling of chemical processes with flow transport and mechanical deformation remains challenging. The data for extreme environments (e.g., for elevated temperature and high ionic strength media) that are

  6. International survey on current occupant modelling approaches in building performance simulation

    NARCIS (Netherlands)

    O'Brien, W.; Gaetani, I.; Gilani, S.; Carlucci, S.; Hoes, P.; Hensen, J.L.M.

    2017-01-01

    It is not evident that practitioners have kept pace with latest research developments in building occupant behaviour modelling; nor are the attitudes of practitioners regarding occupant behaviour modelling well understood. In order to guide research and development efforts, researchers,

  7. LIAR: A COMPUTER PROGRAM FOR THE SIMULATION AND MODELING OF HIGH PERFORMANCE LINACS

    International Nuclear Information System (INIS)

    Adolphsen, Chris

    2003-01-01

    The computer program LIAR ("LInear Accelerator Research code") is a numerical simulation and tracking program for linear colliders. The LIAR project was started at SLAC in August 1995 in order to provide a computing and simulation tool that specifically addresses the needs of high-energy linear colliders. LIAR is designed to be used for a variety of different linear accelerators. It has been applied to, and checked against, the existing Stanford Linear Collider (SLC) as well as the linacs of the proposed Next Linear Collider (NLC) and the proposed Linac Coherent Light Source (LCLS). The program includes wakefield effects, a 4D coupled beam description, specific optimization algorithms and other advanced features. We describe the most important concepts and highlights of the program. Having presented the LIAR program at the LINAC96 and PAC97 conferences, we now introduce it to the European particle accelerator community.

  8. Comparing the performance of 11 crop simulation models in predicting yield response to nitrogen fertilization

    Czech Academy of Sciences Publication Activity Database

    Salo, T.; Palosuo, T.; Kersebaum, K. C.; Nendel, C.; Angulo, C.; Ewert, F.; Bindi, M.; Calanca, P.; Klein, T.; Moriondo, M.; Ferrise, R.; Olesen, J. E.; Patil, R. H.; Ruget, F.; Takáč, J.; Hlavinka, Petr; Trnka, Miroslav; Rötter, R. P.

    2016-01-01

    Roč. 154, č. 7 (2016), s. 1218-1240 ISSN 0021-8596 R&D Projects: GA MŠk(CZ) LO1415; GA MZe QJ1310123; GA MŠk(CZ) LD13030 EU Projects: European Commission(XE) 268277; European Commission(XE) 292944 Institutional support: RVO:67179843 Keywords : Northern growing conditions * climate change impacts * spring barley * system simulations * soil properties * winter-wheat * dynamics * growth Subject RIV: GC - Agronomy Impact factor: 1.291, year: 2016

  9. Performance of process-based models for simulation of grain N in crop rotations across Europe

    Czech Academy of Sciences Publication Activity Database

    Xiaogang, Y.; Kersebaum, K. C.; Kollas, C.; Manevski, K.; Baby, S.; Beaudoin, N.; Öztürk, I.; Gaiser, T.; Wu, L.; Hoffmann, M.; Charfeddine, M.; Conradt, T.; Constantin, J.; Ewert, F.; de Cortazar-Atauri, I. G.; Giglio, L.; Hlavinka, Petr; Hoffmann, H.; Launay, M.; Louarn, G.; Manderscheid, R.; Mary, B.; Mirschel, W.; Nendel, C.; Pacholski, A.; Palosuo, T.; Ripoche-Wachter, D.; Rötter, R. P.; Ruget, F.; Sharif, B.; Trnka, Miroslav; Ventrella, D.; Weigel, H-J.; Olesen, J. E.

    2017-01-01

    Roč. 154, JUN (2017), s. 63-77 ISSN 0308-521X R&D Projects: GA MŠk(CZ) LO1415; GA MZe QJ1310123 Institutional support: RVO:67179843 Keywords : Calibration * Crop model * Crop rotation * Grain N content * Model evaluation * Model initialization Subject RIV: EH - Ecology, Behaviour OBOR OECD: Environmental sciences (social aspects to be 5.7) Impact factor: 2.571, year: 2016

  10. Stochastic Modeling and Simulation of Near-Fault Ground Motions for Performance-Based Earthquake Engineering

    OpenAIRE

    Dabaghi, Mayssa

    2014-01-01

    A comprehensive parameterized stochastic model of near-fault ground motions in two orthogonal horizontal directions is developed. The proposed model uniquely combines several existing and new sub-models to represent major characteristics of recorded near-fault ground motions. These characteristics include near-fault effects of directivity and fling step; temporal and spectral non-stationarity; intensity, duration and frequency content characteristics; directionality of components, as well as ...

  11. A predictive model of nuclear power plant crew decision-making and performance in a dynamic simulation environment

    Science.gov (United States)

    Coyne, Kevin Anthony

    The safe operation of complex systems such as nuclear power plants requires close coordination between the human operators and plant systems. In order to maintain an adequate level of safety following an accident or other off-normal event, the operators often are called upon to perform complex tasks during dynamic situations with incomplete information. The safety of such complex systems can be greatly improved if the conditions that could lead operators to make poor decisions and commit erroneous actions during these situations can be predicted and mitigated. The primary goal of this research project was the development and validation of a cognitive model capable of simulating nuclear plant operator decision-making during accident conditions. Dynamic probabilistic risk assessment methods can improve the prediction of human error events by providing rich contextual information and an explicit consideration of feedback arising from man-machine interactions. The Accident Dynamics Simulator paired with the Information, Decision, and Action in a Crew context cognitive model (ADS-IDAC) shows promise for predicting situational contexts that might lead to human error events, particularly knowledge driven errors of commission. ADS-IDAC generates a discrete dynamic event tree (DDET) by applying simple branching rules that reflect variations in crew responses to plant events and system status changes. Branches can be generated to simulate slow or fast procedure execution speed, skipping of procedure steps, reliance on memorized information, activation of mental beliefs, variations in control inputs, and equipment failures. Complex operator mental models of plant behavior that guide crew actions can be represented within the ADS-IDAC mental belief framework and used to identify situational contexts that may lead to human error events. This research increased the capabilities of ADS-IDAC in several key areas. The ADS-IDAC computer code was improved to support additional
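
    The discrete dynamic event tree (DDET) idea described above, branching the simulation on simple rules that reflect variations in crew response and equipment state, can be sketched generically as below. This is not the ADS-IDAC code; the branch labels and probabilities are invented purely for illustration.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Branch:
    history: List[str]   # sequence of crew/equipment outcomes so far
    probability: float   # cumulative path probability

def expand(branches, branching_rule):
    """Apply one branching rule (a list of (label, conditional probability) pairs)
    to every open branch, as a discrete dynamic event tree would at a branch point."""
    expanded = []
    for b in branches:
        for label, p in branching_rule:
            expanded.append(Branch(b.history + [label], b.probability * p))
    return expanded

# Illustrative branching rules: procedure execution speed, step skipping, equipment state
rules = [
    [("fast execution", 0.6), ("slow execution", 0.4)],
    [("step performed", 0.9), ("step skipped", 0.1)],
    [("pump available", 0.95), ("pump failed", 0.05)],
]

tree = [Branch([], 1.0)]
for rule in rules:
    tree = expand(tree, rule)

for leaf in sorted(tree, key=lambda b: -b.probability)[:4]:
    print(f"p={leaf.probability:.3f}  " + " -> ".join(leaf.history))
```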

  12. Challenge problem and milestones for: Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC).

    Energy Technology Data Exchange (ETDEWEB)

    Freeze, Geoffrey A.; Wang, Yifeng; Howard, Robert; McNeish, Jerry A.; Schultz, Peter Andrew; Arguello, Jose Guadalupe, Jr.

    2010-09-01

    This report describes the specification of a challenge problem and associated challenge milestones for the Waste Integrated Performance and Safety Codes (IPSC) supporting the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The NEAMS challenge problems are designed to demonstrate proof of concept and progress towards IPSC goals. The goal of the Waste IPSC is to develop an integrated suite of modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with robust verification, validation, and software quality requirements. To demonstrate proof of concept and progress towards these goals and requirements, a Waste IPSC challenge problem is specified that includes coupled thermal-hydrologic-chemical-mechanical (THCM) processes that describe (1) the degradation of a borosilicate glass waste form and the corresponding mobilization of radionuclides (i.e., the processes that produce the radionuclide source term), (2) the associated near-field physical and chemical environment for waste emplacement within a salt formation, and (3) radionuclide transport in the near field (i.e., through the engineered components - waste form, waste package, and backfill - and the immediately adjacent salt). The initial details of a set of challenge milestones that collectively comprise the full challenge problem are also specified.

  13. Challenge problem and milestones for: Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC)

    International Nuclear Information System (INIS)

    Freeze, Geoffrey A.; Wang, Yifeng; Howard, Robert; McNeish, Jerry A.; Schultz, Peter Andrew; Arguello, Jose Guadalupe Jr.

    2010-01-01

    This report describes the specification of a challenge problem and associated challenge milestones for the Waste Integrated Performance and Safety Codes (IPSC) supporting the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The NEAMS challenge problems are designed to demonstrate proof of concept and progress towards IPSC goals. The goal of the Waste IPSC is to develop an integrated suite of modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with robust verification, validation, and software quality requirements. To demonstrate proof of concept and progress towards these goals and requirements, a Waste IPSC challenge problem is specified that includes coupled thermal-hydrologic-chemical-mechanical (THCM) processes that describe (1) the degradation of a borosilicate glass waste form and the corresponding mobilization of radionuclides (i.e., the processes that produce the radionuclide source term), (2) the associated near-field physical and chemical environment for waste emplacement within a salt formation, and (3) radionuclide transport in the near field (i.e., through the engineered components - waste form, waste package, and backfill - and the immediately adjacent salt). The initial details of a set of challenge milestones that collectively comprise the full challenge problem are also specified.

  14. Energy Performance Measurement and Simulation Modeling of Tactical Soft-Wall Shelters

    Science.gov (United States)

    2015-07-01

    [Only abstract fragments are available for this record. They indicate that a quantity too low to measure persisted for on the order of 5 hours, that the research team did not have access to the site between 1700 and 0500 hours, that the calibration was implemented in Visual Basic for Applications (VBA), and that the objective function was the root-mean-square (RMS) error between the modeled and measured heating loads. Cited reference: Phase Change Energy Solutions (2013), BioPCM web page, http://phasechange.com/index.php/en/about/our-material, accessed 16 September.]

  15. Contribution to the modelling and analysis of logistics system performance by Petri nets and simulation models: Application in a supply chain

    Science.gov (United States)

    Azougagh, Yassine; Benhida, Khalid; Elfezazi, Said

    2016-02-01

    In this paper, the focus is on studying the performance of complex systems in a supply chain context by developing a structured modelling approach based on the ASDI methodology (Analysis, Specification, Design and Implementation), combining modelling by Petri nets with simulation using ARENA. The linear approach typically followed for this kind of problem has to cope with modelling difficulties due to the complexity and the number of parameters of concern. Therefore, the approach used in this work structures the modelling in a way that covers all aspects of the performance study. The structured modelling approach is first introduced and then applied to the case of an industrial system in the phosphate field. The performance indicators obtained from the developed models made it possible to test the behaviour and fluctuations of this system and to develop improved models of the current situation. In addition, this paper shows how the ARENA software can be adopted to simulate complex systems effectively. The method in this research can be applied to investigate various improvement scenarios and their consequences before implementing them in reality.
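
    The abstract describes discrete-event simulation of a logistics system in ARENA; the sketch below illustrates the same kind of performance-indicator experiment using the open-source SimPy library instead. The arrival and processing rates are assumed values, and the single-machine structure is only a placeholder for the phosphate supply chain studied in the paper.

```python
import random
import simpy

def order(env, machine, lead_times):
    """One order: wait for the machine, get processed, record total lead time."""
    arrival = env.now
    with machine.request() as req:
        yield req
        yield env.timeout(random.expovariate(1.0 / 4.0))  # assumed mean processing time: 4 h
    lead_times.append(env.now - arrival)

def generate_orders(env, machine, lead_times):
    """Orders arrive as a Poisson process (assumed mean inter-arrival time: 5 h)."""
    while True:
        yield env.timeout(random.expovariate(1.0 / 5.0))
        env.process(order(env, machine, lead_times))

random.seed(1)
env = simpy.Environment()
machine = simpy.Resource(env, capacity=1)
lead_times = []
env.process(generate_orders(env, machine, lead_times))
env.run(until=1000)
print(f"orders completed: {len(lead_times)}, mean lead time: {sum(lead_times) / len(lead_times):.1f} h")
```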

  16. A high performance computing framework for physics-based modeling and simulation of military ground vehicles

    Science.gov (United States)

    Negrut, Dan; Lamb, David; Gorsich, David

    2011-06-01

    This paper describes a software infrastructure made up of tools and libraries designed to assist developers in implementing computational dynamics applications running on heterogeneous and distributed computing environments. Together, these tools and libraries compose a so-called Heterogeneous Computing Template (HCT). The heterogeneous and distributed computing hardware infrastructure is assumed herein to be made up of a combination of CPUs and Graphics Processing Units (GPUs). The computational dynamics applications targeted to execute on such a hardware topology include many-body dynamics, smoothed-particle hydrodynamics (SPH) fluid simulation, and fluid-solid interaction analysis. The underlying theme of the solution approach embraced by HCT is that of partitioning the domain of interest into a number of subdomains that are each managed by a separate core/accelerator (CPU/GPU) pair. Several components at the core of HCT enable the envisioned distributed computing approach to large-scale dynamical system simulation: (a) the ability to partition the problem according to the one-to-one mapping, i.e., spatial subdivision, discussed above (pre-processing); (b) a protocol for passing data between any two co-processors; (c) algorithms for element proximity computation; and (d) the ability to carry out post-processing in a distributed fashion. In this contribution the components (a) and (b) of the HCT are demonstrated via the example of the Discrete Element Method (DEM) for rigid body dynamics with friction and contact. The collision detection task required in frictional-contact dynamics (task (c) above) is shown to benefit from a two-order-of-magnitude gain in efficiency on the GPU when compared to traditional sequential implementations. Note: Reference herein to any specific commercial products, process, or service by trade name, trademark, manufacturer, or otherwise, does not imply its endorsement, recommendation, or favoring by the United States Army. The views and

  17. Process Modeling, Performance Analysis and Configuration Simulation in Integrated Supply Chain Network Design

    OpenAIRE

    Dong, Ming

    2001-01-01

    Supply chain management has been recently introduced to address the integration of organizational functions ranging from the ordering and receipt of raw materials throughout the manufacturing processes, to the distribution and delivery of products to the customer. Its application demonstrates that this idea enables organizations to achieve higher quality products, better customer service, and lower inventory cost. In order to achieve high performance, supply chain functions must operate ...

  18. Comparing the Performance of Commonly Available Digital Elevation Models in GIS-based Flood Simulation

    Science.gov (United States)

    Ybanez, R. L.; Lagmay, A. M. A.; David, C. P.

    2016-12-01

    With climatological hazards increasing globally, the Philippines is listed as one of the most vulnerable countries in the world due to its location in the Western Pacific. Flood hazard mapping and modelling is one of the responses by local government and research institutions to help prepare for and mitigate the effects of flood hazards that constantly threaten towns and cities in floodplains during the 6-month rainy season. Available digital elevation models, which serve as the most important dataset used in 2D flood modelling, are limited in the Philippines, and testing is needed to determine which of the few would work best for flood hazard mapping and modelling. Two-dimensional GIS-based flood modelling with the flood-routing software FLO-2D was conducted using three different available DEMs: the ASTER GDEM, the SRTM GDEM, and the locally available IfSAR DTM. With all other parameters kept uniform, such as resolution, soil parameters, rainfall amount, and surface roughness, the three models were run over a 129-sq. km watershed with only the base map varying. The output flood hazard maps were compared on the basis of their flood distribution, extent, and depth. The ASTER and SRTM GDEMs contained too much error and noise, which manifested as dissipated and dissolved hazard areas in the lower watershed where clearly delineated flood hazards should be present. Noise in the two datasets is clearly visible as erratic mounds in the floodplain. The only dataset that produced a feasible flood hazard map is the IfSAR DTM, which delineates flood hazard areas clearly and properly. Despite the published resolution and accuracy of ASTER and SRTM, their use in GIS-based flood modelling would be unreliable. Although not as accessible, only IfSAR or better datasets should be used for creating secondary products from these base DEM datasets. For developing countries which are most prone to hazards, but with limited choices for basemaps used in hazards

  19. Performance Evaluation of a PID and a Fuzzy PID Controllers Designed for Controlling a Simulated Quadcopter Rotational Dynamics Model

    Directory of Open Access Journals (Sweden)

    Laith Jasim Saud

    2017-07-01

    Full Text Available This work is concerned with designing two types of controllers, a PID and a Fuzzy PID, to be used for flying and stabilizing a quadcopter. The designed controllers have been tuned, tested, and compared using two performance indices, the Integral Square Error (ISE) and the Integral Absolute Error (IAE), as well as response characteristics such as the rise time, overshoot, settling time, and steady-state error. To test the controllers, a quadcopter mathematical model has been developed. The model concentrates on the rotational dynamics of the quadcopter, i.e. the roll, pitch, and yaw variables. The work has been simulated with MATLAB. To make testing the simulated model and the controllers more realistic, the test signals were applied by a user through a joystick interfaced to the computer. The results obtained indicate a general superiority in performance of the Fuzzy PID controller over the PID controller used in this work. This conclusion is based on the following figures: lower ISE for the roll, pitch, and yaw, respectively; lower IAE for the roll, pitch, and yaw, respectively; lower rise time and settling time for the roll and pitch, respectively; and a lower settling time for the yaw. Moreover, the FPID gave zero overshoot for the roll, pitch, and yaw, in contrast to the PID case. Both controllers gave zero steady-state error with close rise times for the yaw. This superiority of the FPID controller is gained because its fuzzy part continuously and online adapts the parameters of the PID part.
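
    The two performance indices used in the comparison are straightforward to compute from a sampled error signal. The following sketch evaluates ISE and IAE for two hypothetical step responses; the response shapes and parameters are invented for illustration and are not taken from the quadcopter model.

```python
import numpy as np

def ise(error, dt):
    """Integral Square Error: integral of e(t)^2, approximated by the trapezoidal rule."""
    return np.trapz(error**2, dx=dt)

def iae(error, dt):
    """Integral Absolute Error: integral of |e(t)|, approximated by the trapezoidal rule."""
    return np.trapz(np.abs(error), dx=dt)

# Illustrative roll-angle step responses (rad) for two hypothetical controllers
dt = 0.01
t = np.arange(0.0, 5.0, dt)
setpoint = 1.0
resp_a = 1.0 - np.exp(-3.0 * t)                      # assumed non-overshooting response
resp_b = 1.0 - np.exp(-2.0 * t) * np.cos(4.0 * t)    # assumed oscillatory response

for name, resp in (("controller A", resp_a), ("controller B", resp_b)):
    e = setpoint - resp
    print(f"{name}: ISE = {ise(e, dt):.3f}, IAE = {iae(e, dt):.3f}")
```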

  20. Mathematical modelling and simulation of the thermal performance of a solar heated indoor swimming pool

    OpenAIRE

    Mančić Marko V.; Živković Dragoljub S.; Milosavljević Peđa M.; Todorović Milena N.

    2014-01-01

    Buildings with indoor swimming pools have a large energy footprint. The source of major energy loss is the swimming pool hall where air humidity is increased by evaporation from the pool water surface. This increases energy consumption for heating and ventilation of the pool hall, fresh water supply loss and heat demand for pool water heating. In this paper, a mathematical model of the swimming pool was made to assess energy demands of an indoor swimming po...

  1. The Effect of Bypass Nozzle Exit Area on Fan Aerodynamic Performance and Noise in a Model Turbofan Simulator

    Science.gov (United States)

    Hughes, Christopher E.; Podboy, Gary, G.; Woodward, Richard P.; Jeracki, Robert, J.

    2013-01-01

    The design of effective new technologies to reduce aircraft propulsion noise is dependent on identifying and understanding the noise sources and noise generation mechanisms in the modern turbofan engine, as well as determining their contribution to the overall aircraft noise signature. Therefore, a comprehensive aeroacoustic wind tunnel test program was conducted called the Fan Broadband Source Diagnostic Test as part of the NASA Quiet Aircraft Technology program. The test was performed in the anechoic NASA Glenn 9- by 15-Foot Low Speed Wind Tunnel using a 1/5 scale model turbofan simulator which represented a current generation, medium pressure ratio, high bypass turbofan aircraft engine. The investigation focused on simulating in model scale only the bypass section of the turbofan engine. The test objectives were to: identify the noise sources within the model and determine their noise level; investigate several component design technologies by determining their impact on the aerodynamic and acoustic performance of the fan stage; and conduct detailed flow diagnostics within the fan flow field to characterize the physics of the noise generation mechanisms in a turbofan model. This report discusses results obtained for one aspect of the Source Diagnostic Test that investigated the effect of the bypass or fan nozzle exit area on the bypass stage aerodynamic performance, specifically the fan and outlet guide vanes or stators, as well as the farfield acoustic noise level. The aerodynamic performance, farfield acoustics, and Laser Doppler Velocimeter flow diagnostic results are presented for the fan and four different fixed-area bypass nozzle configurations. The nozzles simulated fixed engine operating lines and encompassed the fan stage operating envelope from near stall to cruise. One nozzle was selected as a baseline reference, representing the nozzle area which would achieve the design point operating conditions and fan stage performance. The total area change from

  2. Methods for implementing Building Information Modeling and Building Performance Simulation approaches

    DEFF Research Database (Denmark)

    Mondrup, Thomas Fænø

    ..., Engineering, Construction, and Facility Management (AEC/FM) communication, and (b) BPS as a platform for early-stage building performance prediction. The second is to develop (a) relevant AEC/FM communication support instruments, and (b) standardized BIM and BPS execution guidelines and information exchange ... methodologies. Thesis studies showed that BIM approaches have the potential to improve AEC/FM communication and collaboration. BIM is by its nature multidisciplinary, bringing AEC/FM project participants together and creating constant communication. However, BIM adoption can lead to technical challenges, for example, getting BIM-compatible tools to communicate properly. Furthermore, BIM adoption requires organizational change, that is, changes in AEC/FM work practices and interpersonal dynamics. Consequently, to ensure that the adoption of BIM is successful, it is recommended that common IT regulations ...

  3. In-Service Design and Performance Prediction of Advanced Fusion Material Systems by Computational Modeling and Simulation

    International Nuclear Information System (INIS)

    G. R. Odette; G. E. Lucas

    2005-01-01

    This final report on "In-Service Design and Performance Prediction of Advanced Fusion Material Systems by Computational Modeling and Simulation" (DE-FG03-01ER54632) consists of a series of summaries of work that has been published, or presented at meetings, or both. It briefly describes results on the following topics: (1) A Transport and Fate Model for Helium and Helium Management; (2) Atomistic Studies of Point Defect Energetics, Dynamics and Interactions; (3) Multiscale Modeling of Fracture consisting of: (3a) A Micromechanical Model of the Master Curve (MC) Universal Fracture Toughness-Temperature Curve Relation, KJc(T - To), (3b) An Embrittlement DTo Prediction Model for the Irradiation Hardening Dominated Regime, (3c) Non-hardening Irradiation Assisted Thermal and Helium Embrittlement of 8Cr Tempered Martensitic Steels: Compilation and Analysis of Existing Data, (3d) A Model for the KJc(T) of a High Strength NFA MA957, (3e) Cracked Body Size and Geometry Effects of Measured and Effective Fracture Toughness-Model Based MC and To Evaluations of F82H and Eurofer 97, (3f) Size and Geometry Effects on the Effective Toughness of Cracked Fusion Structures; (4) Modeling the Multiscale Mechanics of Flow Localization-Ductility Loss in Irradiation Damaged BCC Alloys; and (5) A Universal Relation Between Indentation Hardness and True Stress-Strain Constitutive Behavior. Further details can be found in the cited references or presentations that generally can be accessed on the internet, or provided upon request to the authors. Finally, it is noted that this effort was integrated with our base program in fusion materials, also funded by the DOE OFES

  4. Designing Citizen Business Loan Model to Reduce Non-Performing Loan: An Agent-based Modeling and Simulation Approach in Regional Development

    Directory of Open Access Journals (Sweden)

    Moses L Singgih

    2015-09-01

    Full Text Available The Citizen Business Loan (CBL) is a poverty alleviation program based on the economic empowerment of small and medium enterprises. This study focuses on the implementation of CBL at Regional Development Bank branch X. The problem is the existence of interdependencies between the CBL's implementer (the Bank) and the uncertainty of the debtors' capability to repay the credit. The impact of this circumstance is that the non-performing loan (NPL) rate becomes relatively high (22%). The ultimate objective is to minimize the NPL by designing an agent-based model that can represent the problem through a simulation using agent-based modeling and simulation (ABMS). The model manages the probability of each debtor paying or defaulting based on the five C categories inherent to each debtor: character, capacity, capital, condition, and collateral. Two improvement scenarios are proposed in this model. The first scenario involves only the first debtor category in the simulation and results in an NPL of 0%. The second scenario includes the first and second debtor categories and results in an NPL between 4.6% and 11.4%.
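
    The scenario logic can be illustrated with a toy agent-based sketch in which each debtor agent defaults with a category-dependent monthly probability, and the NPL is the resulting share of defaulted debtors. The category mix and default probabilities below are hypothetical and are not the calibrated 5C-based probabilities used in the study.

```python
import random

def simulate_npl(n_debtors, p_default_by_category, category_mix, n_months=12, seed=42):
    """Toy agent-based sketch: each debtor agent belongs to a category with an assumed
    monthly default probability; NPL is the default share at the end of the horizon."""
    rng = random.Random(seed)
    defaults = 0
    for _ in range(n_debtors):
        category = rng.choices(list(category_mix), weights=list(category_mix.values()))[0]
        p = p_default_by_category[category]
        if any(rng.random() < p for _ in range(n_months)):
            defaults += 1
    return defaults / n_debtors

# Hypothetical default probabilities loosely standing in for "good" vs "risky" 5C profiles
p_default = {"good": 0.000, "risky": 0.010}

print("scenario 1 (good debtors only): NPL =", simulate_npl(1000, p_default, {"good": 1.0}))
print("scenario 2 (mixed portfolio):   NPL =", simulate_npl(1000, p_default, {"good": 0.7, "risky": 0.3}))
```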

  5. Aviation Safety Simulation Model

    Science.gov (United States)

    Houser, Scott; Yackovetsky, Robert (Technical Monitor)

    2001-01-01

    The Aviation Safety Simulation Model is a software tool that enables users to configure a terrain, a flight path, and an aircraft and simulate the aircraft's flight along the path. The simulation monitors the aircraft's proximity to terrain obstructions, and reports when the aircraft violates accepted minimum distances from an obstruction. This model design facilitates future enhancements to address other flight safety issues, particularly air and runway traffic scenarios. This report shows the user how to build a simulation scenario and run it. It also explains the model's output.

  6. Experimental measurements and theoretical model of the cryogenic performance of bialkali photocathode and characterization with Monte Carlo simulation

    Directory of Open Access Journals (Sweden)

    Huamu Xie

    2016-10-01

    Full Text Available High-average-current, high-brightness electron sources have important applications, such as in high-repetition-rate free-electron lasers, or in the electron cooling of hadrons. Bialkali photocathodes are promising high-quantum-efficiency (QE cathode materials, while superconducting rf (SRF electron guns offer continuous-mode operation at high acceleration, as is needed for high-brightness electron sources. Thus, we must have a comprehensive understanding of the performance of bialkali photocathode at cryogenic temperatures when they are to be used in SRF guns. To remove the heat produced by the radio-frequency field in these guns, the cathode should be cooled to cryogenic temperatures. We recorded an 80% reduction of the QE upon cooling the K_{2}CsSb cathode from room temperature down to the temperature of liquid nitrogen in Brookhaven National Laboratory (BNL’s 704 MHz SRF gun. We conducted several experiments to identify the underlying mechanism in this reduction. The change in the spectral response of the bialkali photocathode, when cooled from room temperature (300 K to 166 K, suggests that a change in the ionization energy (defined as the energy gap from the top of the valence band to vacuum level is the main reason for this reduction. We developed an analytical model of the process, based on Spicer’s three-step model. The change in ionization energy, with falling temperature, gives a simplified description of the QE’s temperature dependence. We also developed a 2D Monte Carlo code to simulate photoemission that accounts for the wavelength-dependent photon absorption in the first step, the scattering and diffusion in the second step, and the momentum conservation in the emission step. From this simulation, we established a correlation between ionization energy and reduction in the QE. The simulation yielded results comparable to those from the analytical model. The simulation offers us additional capabilities such as calculation

  7. Performance assessment of Large Eddy Simulation (LES) for modeling dispersion in an urban street canyon with tree planting

    NARCIS (Netherlands)

    Moonen, P.; Gromke, C.B.; Dorer, V.

    2013-01-01

    The potential of a Large Eddy Simulation (LES) model to reliably predict near-field pollutant dispersion is assessed. To that extent, detailed time-resolved numerical simulations of coupled flow and dispersion are conducted for a street canyon with tree planting. Different crown porosities are

  8. Wake modeling and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, G.C.; Aagaard Madsen, H.; Larsen, T.J.; Troldborg, N.

    2008-07-15

    We present a consistent, physically based theory for the wake meandering phenomenon, which we consider of crucial importance for the overall description of wind turbine loadings in wind farms. In its present version the model is confined to single-wake situations. The model philosophy does, however, have the potential to also include mutual wake interaction phenomena. The basic conjecture behind the dynamic wake meandering (DWM) model is that wake transportation in the atmospheric boundary layer is driven by the large-scale lateral and vertical turbulence components. Based on this conjecture a stochastic model of the downstream wake meandering is formulated. In addition to the kinematic formulation of the dynamics of the 'meandering frame of reference', models characterizing the mean wake deficit as well as the added wake turbulence, described in the meandering frame of reference, are an integrated part of the DWM model complex. For design applications, the computational efficiency of wake deficit prediction is a key issue, and a computationally low-cost model is developed for this purpose. Likewise, the character of the added wake turbulence, generated by the upstream turbine in the form of shed and trailed vorticity, has been approached by a simple semi-empirical model essentially based on an eddy viscosity philosophy. Contrary to previous attempts to model wake loading, the DWM approach opens the way for a unifying description in the sense that turbine power and load aspects can be treated simultaneously. This capability is a direct and attractive consequence of the model being based on the underlying physical process, and it potentially opens the way for optimization of wind farm topology, of wind farm operation, as well as of control strategies for the individual turbine. To establish an integrated modeling tool, the DWM methodology has been implemented in the aeroelastic code HAWC2, and example simulations of wake situations, from the small Tjaereborg wind farm, have

  9. Investigation the performance of 0-D and 3-d combustion simulation softwares for modelling HCCI engine with high air excess ratios

    Directory of Open Access Journals (Sweden)

    Gökhan Coşkun

    2017-10-01

    Full Text Available In this study, the performance of zero-dimensional and three-dimensional simulation codes used to simulate a homogeneous charge compression ignition (HCCI) engine fueled with a Primary Reference Fuel (PRF, 85% iso-octane and 15% n-heptane) was investigated. The 0-D code, called SRM Suite (Stochastic Reactor Model), which simulates engine combustion using the stochastic reactor model technique, was used. Ansys Fluent, a computational fluid dynamics (CFD) code, was used for the 3-D engine combustion simulations. Both commercial codes were evaluated in terms of combustion, heat transfer and emissions in an HCCI engine. A chemical kinetic mechanism developed by Tsurushima, including 33 species and 38 reactions for the surrogate PRF fuel, was used for the combustion simulations. The analysis showed that each code has advantages over the other.

  10. Human Performance Modeling in Military Simulation: Current State of the Art and the Way Ahead (2002 TTCP HUM Group Meeting)

    National Research Council Canada - National Science Library

    2004-01-01

    .... This report examines the requirements for human performance modeling within the military, assesses the state of the practice in current operational models, documents ongoing human performance research and development (R and D...

  11. NIF capsule performance modeling

    Directory of Open Access Journals (Sweden)

    Weber S.

    2013-11-01

    Full Text Available Post-shot modeling of NIF capsule implosions was performed in order to validate our physical and numerical models. Cryogenic layered target implosions and experiments with surrogate targets produce an abundance of capsule performance data including implosion velocity, remaining ablator mass, times of peak x-ray and neutron emission, core image size, core symmetry, neutron yield, and x-ray spectra. We have attempted to match the integrated data set with capsule-only simulations by adjusting the drive and other physics parameters within expected uncertainties. The simulations include interface roughness, time-dependent symmetry, and a model of mix. We were able to match many of the measured performance parameters for a selection of shots.

  12. Improvements in Thermal Performance of Mango Hot-water Treatment Equipments: Data Analysis, Mathematical Modelling and Numerical-computational Simulation

    Directory of Open Access Journals (Sweden)

    Elder M. Mendoza Orbegoso

    2017-06-01

    Full Text Available Mango is one of the most popular and best-paid tropical fruits in worldwide markets, and its exportation is regulated under a phytosanitary quality control aimed at killing the “fruit fly”. Thus, mangoes must be subjected to a hot-water treatment process that involves their immersion in hot water over a period of time. In this work, field measurements, analytical studies and simulation studies are carried out on the available hot-water treatment equipment called “Original”, which only complies with United States phytosanitary protocols. These approaches are used to characterize the fluid-dynamic and thermal behaviour that occurs during the mangoes’ hot-water treatment process. Then, an analytical model and computational fluid dynamics simulations are developed for designing new hot-water treatment equipment called “Hybrid” that simultaneously meets both United States and Japan phytosanitary certifications. Comparisons of the analytical results with field measurements demonstrate that the “Hybrid” equipment offers better fluid-dynamic and thermal performance than the “Original” one.

  13. Evaluation of performance of CMIP5 models in simulating the North Pacific Oscillation and El Niño Modoki

    Science.gov (United States)

    Wang, Xin; Chen, Mengyan; Wang, Chunzai; Yeh, Sang-Wook; Tan, Wei

    2018-04-01

    Previous observational studies have documented that the occurrence frequency of El Niño Modoki is closely linked to the North Pacific Oscillation (NPO). The present paper evaluates the relationship between the frequency of El Niño Modoki and the NPO in the historical runs of the Coupled Model Intercomparison Project Phase 5 (CMIP5) and examines the related physical processes. It is found that six of 25 CMIP5 models can reproduce the spatial patterns of both the NPO and El Niño Modoki. Four of these six models exhibit good performance in simulating the positive correlation between the NPO index and the frequency of El Niño Modoki. The analyses further show that the key physical process determining the relationship between the NPO and the frequency of El Niño Modoki is the intensity of the wind-evaporation-SST (WES) feedback in the subtropical northeastern North Pacific. This study enhances the understanding of the connections between the North Pacific mid-latitude climate system and El Niño Modoki, and has an important implication for the change of El Niño Modoki under global warming. If global warming favors producing an oceanic and atmospheric pattern similar to the positive phase of the NPO in the North Pacific, more El Niño Modoki events will occur in the tropical Pacific with the assistance of the WES feedback processes.

  14. Performance of overlapped shield tunneling through an integrated physical model tests, numerical simulations and real-time field monitoring

    Directory of Open Access Journals (Sweden)

    Junlong Yang

    2017-03-01

    Full Text Available In this work, the deformations and internal forces of an existing tunnel subjected to closely overlapped shield tunneling are monitored and analyzed using a series of physical model experiments and numerical simulations. Effects of different excavation sequences and speeds are explicitly considered in the analysis. The results of the physical model experiments show that the bottom-up tunneling procedure is better than the top-down tunneling procedure. The incurred deformations and internal forces of the existing tunnel increase with the excavation speed, and the range of influence areas also increases accordingly. For construction process control, real-time monitoring of the power tunnel is used. The monitoring processes feature full automation, adjustable frequency, real-time monitoring and dynamic feedback, which are used to guide the construction to achieve micro-disturbance control. In accordance with the crossing construction situation, a numerical study on the performance of the power tunnel is carried out. Construction control measures are given for the undercrossing construction, which helps to accomplish the desired result and meet the protection requirements of the existing tunnel structure. Finally, the monitoring data and numerical results are compared, and the displacement and joint fracture change models in the power tunnel subjected to the overlapped shield tunnel construction are analyzed. Keywords: Overlapped tunnel, Automatic monitoring, Micro-disturbance control

  15. Simulating Performance Risk for Lighting Retrofit Decisions

    Directory of Open Access Journals (Sweden)

    Jia Hu

    2015-05-01

    Full Text Available In building retrofit projects, dynamic simulations are performed to simulate building performance. Uncertainty may negatively affect model calibration and predicted lighting energy savings, which increases the chance of default on performance-based contracts. Therefore, the aim of this paper is to develop a simulation-based method that can analyze lighting performance risk in lighting retrofit decisions. The method uses a surrogate model, constructed by adaptively selecting sample points and generating approximation surfaces with fast computing time; the surrogate model replaces the computation-intensive simulation process. A statistical method is developed to generate an extreme weather profile based on 20 years of historical weather data. A stochastic occupancy model was created using actual occupancy data to generate realistic occupancy patterns. The energy usage of lighting and of heating, ventilation, and air conditioning (HVAC) is simulated using EnergyPlus. The method can evaluate the influence of different risk factors (e.g., variation of luminaire input wattage, varying weather conditions) on lighting and HVAC energy consumption and lighting electricity demand. Probability distributions are generated to quantify the risk values. A case study was conducted to demonstrate and validate the method. The surrogate model is a good solution for quantifying the risk factors and the probability distribution of building performance.
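
    The surrogate-plus-Monte-Carlo workflow described above can be sketched as: sample an expensive building model at a few design points, fit a cheap approximation, then propagate risk-factor distributions through the approximation. In the sketch below the "building model", its coefficients, and the risk-factor distributions are all invented stand-ins; the real method uses EnergyPlus runs and an adaptively refined surrogate.

```python
import numpy as np

rng = np.random.default_rng(7)

def building_model(luminaire_w, cooling_degree_days):
    """Stand-in for an expensive EnergyPlus run: annual lighting + HVAC energy (kWh).
    The functional form and coefficients are invented purely for illustration."""
    lighting = 2000.0 * luminaire_w / 50.0
    hvac = 1500.0 + 0.8 * cooling_degree_days + 0.3 * lighting
    return lighting + hvac

# 1) Sample the "expensive" model at a small number of design points
samples = rng.uniform([40.0, 500.0], [60.0, 1500.0], size=(30, 2))
energy = np.array([building_model(w, cdd) for w, cdd in samples])

# 2) Fit a simple polynomial surrogate (least squares on a quadratic basis)
def basis(x):
    w, c = x[:, 0], x[:, 1]
    return np.column_stack([np.ones_like(w), w, c, w * c, w**2, c**2])

coef, *_ = np.linalg.lstsq(basis(samples), energy, rcond=None)

# 3) Monte Carlo over risk factors (wattage variation, weather) using the cheap surrogate
risk_inputs = np.column_stack([rng.normal(50.0, 3.0, 10000), rng.normal(1000.0, 200.0, 10000)])
predicted = basis(risk_inputs) @ coef
print(f"mean annual energy {predicted.mean():.0f} kWh, "
      f"95th percentile {np.percentile(predicted, 95):.0f} kWh")
```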

  16. Scientific Modeling and simulations

    CERN Document Server

    Diaz de la Rubia, Tomás

    2009-01-01

    Showcases the conceptual advantages of modeling which, coupled with the unprecedented computing power available through simulations, allows scientists to tackle the formidable problems of our society, such as the search for hydrocarbons, understanding the structure of a virus, or the intersection between simulations and real data in extreme environments.

  17. Using sea surface temperatures to improve performance of single dynamical downscaling model in flood simulation under climate change

    Science.gov (United States)

    Chao, Y.; Cheng, C. T.; Hsiao, Y. H.; Hsu, C. T.; Yeh, K. C.; Liu, P. L.

    2017-12-01

    On average, 5.3 typhoons hit Taiwan per year over the last decade. Typhoon Morakot in 2009, the most severe of them, caused huge damage in Taiwan, including 677 casualties and roughly NT$110 billion (3.3 billion USD) in economic loss. Some studies have documented that typhoon frequency will decrease but typhoon intensity will increase in the western North Pacific region. High-resolution dynamical models are usually preferred for obtaining better projections of extreme events, because coarse-resolution models cannot simulate intense extreme events. Under that consideration, dynamically downscaled climate data were chosen to describe typhoons satisfactorily; this research used simulation data from the AGCM of the Meteorological Research Institute (MRI-AGCM). Considering that dynamical downscaling methods consume massive computing power, and that the number of typhoons in a single model simulation is very limited, using dynamically downscaled data could introduce uncertainty into disaster risk assessment. To alleviate this problem, this research used four sea surface temperatures (SSTs) to increase the number of climate change scenarios under RCP 8.5. In this way, the MRI-AGCMs project 191 extreme typhoons affecting Taiwan (typhoons whose centers enter the 300 km sea area around Taiwan) in the late 21st century. SOBEK, a two-dimensional flood simulation model, was used to assess the flood risk under the four SST climate change scenarios in Tainan, Taiwan. The results show that the uncertainty of future flood risk assessment for Tainan, Taiwan in the late 21st century is significantly decreased. The four SSTs can thus efficiently alleviate the problem of limited typhoon numbers in a single model simulation.

  18. Computer Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Pronskikh, V. S. [Fermilab

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of scientific practice of high importance that have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model's relation to the real world and its intended use. It has been argued that, because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them) needs to be made. Holding on to that distinction, I propose to relate verification (based on theoretical strategies such as inference) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviate the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.

  19. Automated Simulation Model Generation

    NARCIS (Netherlands)

    Huang, Y.

    2013-01-01

    One of today's challenges in the field of modeling and simulation is to model increasingly larger and more complex systems. Complex models take long to develop and incur high costs. With the advances in data collection technologies and more popular use of computer-aided systems, more data has become

  20. Simulation of Lake Surface Heat Fluxes by the Canadian Small Lake Model: Offline Performance Assessment for Future Coupling with a Regional Climate Model

    Science.gov (United States)

    Pernica, P.; Guerrero, J. L.; MacKay, M.; Wheater, H. S.

    2014-12-01

    Lakes strongly influence local and regional climate, especially in regions where they are abundant. Development of a lake model for the purpose of integration within a regional climate model is therefore a subject of scientific interest. Of particular importance are the heat flux predictions provided by the lake model, since they function as key forcings in a fully coupled atmosphere-land-lake system. The first step towards a coupled model is to validate and characterize the accuracy of the lake model over a range of conditions and to identify limitations. In this work, validation results from offline tests of the Canadian Small Lake Model, a deterministic, computationally efficient, 1D integral model, are presented. Heat fluxes (sensible and latent) and surface water temperatures simulated by the model are compared with in situ observations from two lakes, Landing Lake (NWT, Canada) and L239 (ELA, Canada), for the 2007-2009 period. Sensitivity analysis is performed to identify key parameters important for heat flux predictions. The results demonstrate the ability of the 1-D lake model to reproduce both diurnal and seasonal variations in heat fluxes and surface temperatures for the open-water period. These results are also discussed in the context of regional climate modelling.
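
    For orientation, the sensible and latent heat fluxes that such a lake model is validated against are commonly estimated with bulk aerodynamic formulas of the form H = rho·cp·C_H·U·(T_s - T_a) and LE = rho·L_v·C_E·U·(q_s - q_a). The sketch below uses assumed constant transfer coefficients and illustrative surface and air values; it is not the Canadian Small Lake Model's parameterization, which treats these exchanges in more detail.

```python
def surface_heat_fluxes(wind_speed, t_surface, t_air, q_surface, q_air,
                        rho_air=1.2, cp=1005.0, lv=2.5e6, c_h=1.3e-3, c_e=1.3e-3):
    """Bulk aerodynamic estimates of sensible (H) and latent (LE) heat flux in W/m2.
    The transfer coefficients c_h and c_e are assumed constants; a real lake model
    would derive them from atmospheric stability and surface roughness."""
    sensible = rho_air * cp * c_h * wind_speed * (t_surface - t_air)
    latent = rho_air * lv * c_e * wind_speed * (q_surface - q_air)
    return sensible, latent

# Example: 4 m/s wind, lake surface 2 K warmer and moister than the overlying air
H, LE = surface_heat_fluxes(wind_speed=4.0, t_surface=18.0, t_air=16.0,
                            q_surface=0.0129, q_air=0.0090)
print(f"sensible flux ~ {H:.0f} W/m2, latent flux ~ {LE:.0f} W/m2")
```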

  1. Predictive performance models and multiple task performance

    Science.gov (United States)

    Wickens, Christopher D.; Larish, Inge; Contorer, Aaron

    1989-01-01

    Five models that predict how performance of multiple tasks will interact in complex task scenarios are discussed. The models are shown in terms of the assumptions they make about human operator divided attention. The different assumptions about attention are then empirically validated in a multitask helicopter flight simulation. It is concluded from this simulation that the most important assumption relates to the coding of demand level of different component tasks.

  2. Numerical simulation of Higgs models

    International Nuclear Information System (INIS)

    Jaster, A.

    1995-10-01

    The SU(2) Higgs model and the Schwinger model on the lattice were analysed. Numerical simulations of the SU(2) Higgs model were performed to study the finite-temperature electroweak phase transition. With the help of the multicanonical method the distribution of an order parameter at the phase transition point was measured. This was used to obtain the order of the phase transition and the value of the interface tension with the histogram method. Numerical simulations were also performed at zero temperature to carry out the renormalization. The measured values of the Wilson loops were used to determine the static potential and, from this, the renormalized gauge coupling. The Schwinger model was simulated at different gauge couplings to analyse the properties of the Kaplan-Shamir fermions. The prediction that the mass parameter receives only multiplicative renormalization was tested and verified. (orig.)

  3. A VRLA battery simulation model

    International Nuclear Information System (INIS)

    Pascoe, Phillip E.; Anbuky, Adnan H.

    2004-01-01

    A valve-regulated lead-acid (VRLA) battery simulation model is an invaluable tool for the standby power system engineer. The obvious use for such a model is to allow the assessment of battery performance. This may involve determining the influence of cells suffering from state of health (SOH) degradation on the performance of the entire string, or the running of test scenarios to ascertain the most suitable battery size for the application. In addition, it enables the engineer to assess the performance of the overall power system. This includes, for example, running test scenarios to determine the benefits of various load shedding schemes. It also allows the assessment of other power system components, for determining their requirements and/or vulnerabilities. Finally, a VRLA battery simulation model is vital as a stand-alone tool for educational purposes. Despite the fundamentals of the VRLA battery having been established for over 100 years, its operating behaviour is often poorly understood. An accurate simulation model enables the engineer to gain a better understanding of VRLA battery behaviour. A system-level multipurpose VRLA battery simulation model is presented. It allows an arbitrary battery (capacity, SOH, number of cells and number of strings) to be simulated under arbitrary operating conditions (discharge rate, ambient temperature, end voltage, charge rate and initial state of charge). The model accurately reflects the VRLA battery discharge and recharge behaviour. This includes the complex start of discharge region known as the coup de fouet
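
    To make the idea of a system-level battery simulation concrete, here is a minimal Python sketch of a discharge run using a simple Thevenin-style cell (constant internal resistance, linear open-circuit voltage versus state of charge). All parameters are illustrative assumptions; the model described above is far richer (coup de fouet, SOH, temperature and recharge behaviour are not represented here).

    def simulate_discharge(capacity_ah=100.0, soc0=1.0, i_discharge=10.0,
                           r_internal=0.005, cells=6, end_voltage=1.75, dt_h=0.01):
        # Return (time_h, string_voltage) samples until the per-cell end voltage is reached.
        soc, t, history = soc0, 0.0, []
        while soc > 0.0:
            ocv_cell = 1.95 + 0.20 * soc                  # assumed linear OCV per cell, V
            v_cell = ocv_cell - i_discharge * r_internal  # terminal voltage per cell
            if v_cell <= end_voltage:
                break
            history.append((t, cells * v_cell))
            soc -= i_discharge * dt_h / capacity_ah       # coulomb counting
            t += dt_h
        return history

    samples = simulate_discharge()
    print(f"runtime: {samples[-1][0]:.1f} h, final voltage: {samples[-1][1]:.2f} V")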

  4. AEGIS geologic simulation model

    International Nuclear Information System (INIS)

    Foley, M.G.

    1982-01-01

    The Geologic Simulation Model (GSM) is used by the AEGIS (Assessment of Effectiveness of Geologic Isolation Systems) program at the Pacific Northwest Laboratory to simulate the dynamic geology and hydrology of a geologic nuclear waste repository site over a million-year period following repository closure. The GSM helps to organize geologic/hydrologic data; to focus attention on active natural processes by requiring their simulation; and, through interactive simulation and calibration, to reduce subjective evaluations of the geologic system. During each computer run, the GSM produces a million-year geologic history that is possible for the region and the repository site. In addition, the GSM records in permanent history files everything that occurred during that time span. Statistical analyses of data in the history files of several hundred simulations are used to classify typical evolutionary paths, to establish the probabilities associated with deviations from the typical paths, and to determine which types of perturbations of the geologic/hydrologic system, if any, are most likely to occur. These simulations will be evaluated by geologists familiar with the repository region to determine validity of the results. Perturbed systems that are determined to be the most realistic, within whatever probability limits are established, will be used for the analyses that involve radionuclide transport and dose models. The GSM is designed to be continuously refined and updated. Simulation models are site-specific, and, although the submodels may have limited general applicability, the input data requirements necessitate detailed characterization of each site before application

  5. Validation of simulation models

    DEFF Research Database (Denmark)

    Rehman, Muniza; Pedersen, Stig Andur

    2012-01-01

    In philosophy of science, the interest for computational models and simulations has increased heavily during the past decades. Different positions regarding the validity of models have emerged but the views have not succeeded in capturing the diversity of validation methods. The wide variety...

  6. High performance electromagnetic simulation tools

    Science.gov (United States)

    Gedney, Stephen D.; Whites, Keith W.

    1994-10-01

    Army Research Office Grant #DAAH04-93-G-0453 has supported the purchase of 24 additional compute nodes that were installed in the Intel iPSC/860 hypercube at the University of Kentucky (UK), rendering a 32-node multiprocessor. This facility has allowed the investigators to explore and extend the boundaries of electromagnetic simulation for important areas of defense concern, including microwave monolithic integrated circuit (MMIC) design/analysis and electromagnetic materials research and development. The iPSC/860 has also provided an ideal platform for MMIC circuit simulations. A number of parallel methods based on direct time-domain solutions of Maxwell's equations have been developed on the iPSC/860, including a parallel finite-difference time-domain (FDTD) algorithm, and a parallel planar generalized Yee-algorithm (PGY). The iPSC/860 has also provided an ideal platform on which to develop a 'virtual laboratory' to numerically analyze, scientifically study and develop new types of materials with beneficial electromagnetic properties. These materials simulations are capable of assembling hundreds of microscopic inclusions from which an electromagnetic full-wave solution will be obtained in toto. This powerful simulation tool has enabled research on the full-wave analysis of complex multicomponent MMIC devices and the electromagnetic properties of many types of materials to be performed numerically rather than strictly in the laboratory.
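
    The FDTD algorithm mentioned above reduces, in one dimension, to a pair of staggered leap-frog updates; the following Python sketch shows that serial kernel (grid size, source and boundaries are arbitrary choices, and a production parallel code would add absorbing boundaries and domain decomposition).

    import numpy as np

    nx, nt = 400, 1000
    dx = 1e-3                                # cell size, m
    dt = 0.5 * dx / 3e8                      # Courant-stable time step, s
    eps0, mu0 = 8.854e-12, 4e-7 * np.pi
    ez = np.zeros(nx)                        # electric field
    hy = np.zeros(nx - 1)                    # magnetic field on the staggered grid

    for n in range(nt):
        hy += (dt / (mu0 * dx)) * (ez[1:] - ez[:-1])         # update H from curl E
        ez[1:-1] += (dt / (eps0 * dx)) * (hy[1:] - hy[:-1])  # update E from curl H
        ez[nx // 2] += np.exp(-((n - 30) / 10.0) ** 2)       # soft Gaussian source

    print("peak |Ez| after propagation:", np.abs(ez).max())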

  7. A Discrete Event Simulation Model for Evaluating the Performances of an M/G/C/C State Dependent Queuing System

    Science.gov (United States)

    Khalid, Ruzelan; M. Nawawi, Mohd Kamal; Kawsar, Luthful A.; Ghani, Noraida A.; Kamil, Anton A.; Mustafa, Adli

    2013-01-01

    M/G/C/C state dependent queuing networks consider service rates as a function of the number of residing entities (e.g., pedestrians, vehicles, and products). However, modeling such dynamic rates is not supported in modern Discrete Simulation System (DES) software. We designed an approach to cater for this limitation and used it to construct the M/G/C/C state-dependent queuing model in Arena software. Using the model, we have evaluated and analyzed the impacts of various arrival rates on the throughput, the blocking probability, the expected service time and the expected number of entities in a complex network topology. Results indicated that there is a range of arrival rates for each network where the simulation results fluctuate drastically across replications, and this causes the simulation results and analytical results to exhibit discrepancies. Detailed results showing how closely the simulation results tally with the analytical results, in both abstract and graphical forms, together with some scientific justifications, have been documented and discussed. PMID:23560037
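
    As a simplified illustration of the state-dependent, finite-capacity queue being simulated, the sketch below uses exponential service times whose rate decays with occupancy (the Markovian special case of the M/G/C/C model); the arrival rate, capacity and rate function are assumed values, and blocked arrivals are lost.

    import random

    def simulate(lam=2.0, capacity=20, horizon=10_000.0, seed=1):
        rng = random.Random(seed)
        service_rate = lambda n: max(1.0 - 0.03 * (n - 1), 0.1)  # per-occupant rate slows as n grows
        t, n = 0.0, 0
        arrivals = blocked = 0
        area = 0.0                                               # time integral of n, for E[N]
        while t < horizon:
            total_rate = lam + n * service_rate(n)
            dt = rng.expovariate(total_rate)
            area += n * dt
            t += dt
            if rng.random() < lam / total_rate:                  # next event is an arrival
                arrivals += 1
                if n < capacity:
                    n += 1
                else:
                    blocked += 1                                 # arrival lost (blocking)
            else:                                                # next event is a departure
                n -= 1
        return blocked / arrivals, area / t

    p_block, mean_n = simulate()
    print(f"blocking probability ~ {p_block:.3f}, mean occupancy ~ {mean_n:.2f}")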

  8. Performance Evaluation of PBL Schemes of ARW Model in Simulating Thermo-Dynamical Structure of Pre-Monsoon Convective Episodes over Kharagpur Using STORM Data Sets

    Science.gov (United States)

    Madala, Srikanth; Satyanarayana, A. N. V.; Srinivas, C. V.; Tyagi, Bhishma

    2016-05-01

    In the present study, the Advanced Research WRF (ARW) model is employed to simulate convective thunderstorm episodes over the Kharagpur (22°30'N, 87°20'E) region of Gangetic West Bengal, India. High-resolution simulations are conducted using 1 × 1 degree NCEP final analysis meteorological fields for initial and boundary conditions for events. The performance of two non-local [Yonsei University (YSU), Asymmetric Convective Model version 2 (ACM2)] and two local turbulence kinetic energy closures [Mellor-Yamada-Janjic (MYJ), Bougeault-Lacarrere (BouLac)] is evaluated in simulating planetary boundary layer (PBL) parameters and the thermodynamic structure of the atmosphere. The model-simulated parameters are validated against available in situ meteorological observations obtained from a micro-meteorological tower as well as high-resolution DigiCORA radiosonde ascents during the STORM-2007 field experiment at the study location, and against Doppler Weather Radar (DWR) imageries. It has been found that the PBL structure simulated with the TKE closures MYJ and BouLac is in better agreement with observations than that from the non-local closures. The model simulations with these schemes also captured the reflectivity, surface pressure patterns such as wake-low, meso-high and pre-squall low, and the convective updrafts and downdrafts reasonably well. Qualitative and quantitative comparisons reveal that the MYJ followed by the BouLac scheme better simulated various features of the thunderstorm events over the Kharagpur region. The better performance of MYJ followed by BouLac is evident in the lower mean bias, mean absolute error and root mean square error and the good correlation coefficient for various surface meteorological variables as well as the thermo-dynamical structure of the atmosphere relative to the other PBL schemes. The better performance of the TKE closures may be attributed to their higher mixing efficiency, larger convective energy and better simulation of humidity promoting moist convection relative to non

  9. Driving Simulator Development and Performance Study

    OpenAIRE

    Juto, Erik

    2010-01-01

    The driving simulator is a vital tool for much of the research performed at the Swedish National Road and Transport Institute (VTI). Currently VTI possesses three driving simulators: two high-fidelity simulators developed and constructed by VTI, and a medium-fidelity simulator from the German company Dr.-Ing. Reiner Foerst GmbH. The two high-fidelity simulators run the same simulation software, developed at VTI. The medium-fidelity simulator runs proprietary simulation software. At VTI there is...

  10. Verification of Temperature and Precipitation Simulated Data by Individual and Ensemble Performance of Five AOGCM Models for North East of Iran

    Directory of Open Access Journals (Sweden)

    B. Ashraf

    2014-08-01

    Since climatic models are the basic tools for studying climate change, and because of the multiplicity of these models, selecting the most appropriate model for the study location is very important. In this research, the temperature and precipitation data simulated by the BCM2, CGCM3, CNRMCM3, MRICGCM2.3 and MIROC3 models were first downscaled with a proportional method according to the A1B, A2 and B1 emission scenarios for Torbat-heydariye, Sabzevar and Mashhad. Then, using the coefficient of determination (R2), the index of agreement (D) and the mean-square deviation (MSD), the models were verified both individually and as an ensemble. The results showed that, based on individual performance and the three emission scenarios, the MRICGCM2.3 model in Torbat-heydariye and Mashhad and the MIROC3.2 model in Sabzevar had the best performance in simulating temperature, while the MIROC3.2, MRICGCM2.3 and CNRMCM3 models provided the most accurate predictions of precipitation in Torbat-heydariye, Sabzevar and Mashhad, respectively. The temperature simulated by all models had the lowest uncertainty under the B1 scenario in Torbat-heydariye and Sabzevar and under the A2 scenario in Mashhad. The most accurate modeling of precipitation was obtained under the A2 scenario in Torbat-heydariye and under the B1 scenario in Sabzevar and Mashhad. The statistics calculated from the ensemble performance of the five selected models showed a notable reduction of simulation error and thus an increase in the accuracy of the predictions under all emission scenarios. In this case, the best fit between simulated and observed temperature data was achieved under the B1 scenario in Torbat-heydariye and Sabzevar and under the A2 scenario in Mashhad, and the best fit between simulated and observed precipitation data was obtained under the A2 scenario in Torbat-heydariye and under the B1 scenario in Sabzevar and Mashhad. According to the results of this research, before any climate change research it is necessary to select the
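
    The three verification statistics named above have straightforward implementations; the Python sketch below shows one common form of each (Willmott's index of agreement for D), applied to a small set of made-up observed and simulated values.

    import numpy as np

    def r2(obs, sim):
        # Coefficient of determination via the squared Pearson correlation.
        return np.corrcoef(obs, sim)[0, 1] ** 2

    def index_of_agreement(obs, sim):
        # Willmott's index of agreement D, between 0 and 1 (1 = perfect agreement).
        num = np.sum((sim - obs) ** 2)
        den = np.sum((np.abs(sim - obs.mean()) + np.abs(obs - obs.mean())) ** 2)
        return 1.0 - num / den

    def msd(obs, sim):
        # Mean-square deviation between simulated and observed values.
        return np.mean((sim - obs) ** 2)

    obs = np.array([12.1, 14.3, 18.0, 22.5, 25.1])   # assumed monthly temperatures
    sim = np.array([11.5, 15.0, 17.2, 23.4, 24.0])
    print(r2(obs, sim), index_of_agreement(obs, sim), msd(obs, sim))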

  11. Models and simulations

    International Nuclear Information System (INIS)

    Lee, M.J.; Sheppard, J.C.; Sullenberger, M.; Woodley, M.D.

    1983-09-01

    On-line mathematical models have been used successfully for computer controlled operation of SPEAR and PEP. The same model control concept is being implemented for the operation of the LINAC and for the Damping Ring, which will be part of the Stanford Linear Collider (SLC). The purpose of this paper is to describe the general relationships between models, simulations and the control system for any machine at SLAC. The work we have done on the development of the empirical model for the Damping Ring will be presented as an example

  12. PSH Transient Simulation Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Muljadi, Eduard [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-12-21

    PSH Transient Simulation Modeling presentation from the WPTO FY14 - FY16 Peer Review. Transient effects are an important consideration when designing a PSH system, yet numerical techniques for hydraulic transient analysis still need improvements for adjustable-speed (AS) reversible pump-turbine applications.

  13. Cognitive modeling and dynamic probabilistic simulation of operating crew response to complex system accidents. Part 2: IDAC performance influencing factors model

    International Nuclear Information System (INIS)

    Chang, Y.H.J.; Mosleh, A.

    2007-01-01

    This is the second in a series of five papers describing the information, decision, and action in crew context (IDAC) model for human reliability analysis. An example application of this modeling technique is also discussed in this series. The model is developed to probabilistically predict the responses of the nuclear power plant control room operating crew in accident conditions. The operator response spectrum includes cognitive, psychological, and physical activities during the course of an accident. This paper identifies the IDAC set of performance influencing factors (PIFs), providing their definitions and causal organization in the form of a modular influence diagram. Fifty PIFs are identified to support implementation of the IDAC model in a computer simulation environment. They are classified into eleven hierarchically structured groups. The PIFs within each group are independent of each other; however, dependencies may exist between PIFs in different groups. The supporting evidence for the selection and organization of the influence paths, based on psychological literature, observations, and various human reliability analysis methodologies, is also indicated

  14. A Simulation Study: The Impact of Random and Realistic Mobility Models on the Performance of Bypass-AODV in Ad Hoc Wireless Networks

    Directory of Open Access Journals (Sweden)

    Baroudi Uthman

    2010-01-01

    To bring VANET into reality, it is crucial to devise routing protocols that can exploit the inherent characteristics of the VANET environment to enhance the performance of the running applications. Previous studies have shown that a certain routing protocol behaves differently under different presumed mobility patterns. Bypass-AODV is a new optimization of the AODV routing protocol for mobile ad-hoc networks. It is proposed as a local recovery mechanism to enhance the performance of the AODV routing protocol. It shows outstanding performance under the Random Waypoint mobility model compared with AODV. However, Random Waypoint is a simple model that may be applicable to some scenarios but it is not sufficient to capture some important mobility characteristics of scenarios where VANETs are deployed. In this paper, we investigate the performance of Bypass-AODV under a wide range of mobility models including other random mobility models, group mobility models, and vehicular mobility models. Simulation results show an interesting feature: Bypass-AODV is insensitive to the selected random mobility model, and it shows a clear performance improvement compared to AODV. For the group mobility model, both protocols show a comparable performance, but for vehicular mobility models, Bypass-AODV suffers from performance degradation in high-speed conditions.
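
    For reference, the Random Waypoint model used as the baseline above can be sketched in a few lines of Python: each node repeatedly picks a random destination in the field, moves toward it at a random speed, then pauses. The field size, speed range and pause time below are assumed, not taken from the paper.

    import random

    def random_waypoint(steps=100, field=(1000.0, 1000.0),
                        speed=(1.0, 20.0), pause=2.0, dt=1.0, seed=0):
        rng = random.Random(seed)
        x, y = rng.uniform(0, field[0]), rng.uniform(0, field[1])
        trace = []
        while len(trace) < steps:
            dest = (rng.uniform(0, field[0]), rng.uniform(0, field[1]))
            v = rng.uniform(*speed)
            dist = ((dest[0] - x) ** 2 + (dest[1] - y) ** 2) ** 0.5
            n_move = max(int(dist / (v * dt)), 1)
            for i in range(1, n_move + 1):              # straight-line motion toward the waypoint
                trace.append((x + (dest[0] - x) * i / n_move,
                              y + (dest[1] - y) * i / n_move))
            x, y = dest
            trace.extend([(x, y)] * int(pause / dt))    # pause at the waypoint
        return trace[:steps]

    print(random_waypoint(steps=5))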

  15. Modeling and simulation performance of photovoltaic system integration battery and supercapacitor paralellization of MPPT prototipe for solar vehicle

    Science.gov (United States)

    Ajiatmo, Dwi; Robandi, Imam

    2017-03-01

    This paper proposes a control scheme for a photovoltaic array, battery and supercapacitor connected in parallel for use in a solar vehicle. Based on the features of battery charging, the control scheme consists of three modes, namely dynamic irradiance mode, constant load mode and constant voltage charging mode. The shift between the three modes is realized by controlling the duty cycle of the MOSFET boost converter; at the same time, a higher voltage more suitable for the application can be obtained. Compared with the normal charging method with parallel-connected current limiting, the proposed scheme with dynamic irradiance, constant load and constant voltage charging modes shortens the charging time and increases the use of the power generated from the PV array. Simulations and analysis were conducted in Matlab/Simulink to determine the performance of the system in transient and steady state, and the simulated responses demonstrate the suitability of the proposed concept.
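
    Duty-cycle control of a boost converter toward the PV maximum power point is often illustrated with the generic perturb-and-observe (P&O) logic sketched below; this is a textbook example for orientation only, not the specific three-mode scheme proposed in the paper, and the sample measurements are assumed.

    def perturb_and_observe(v, i, state, step=0.005):
        # Return (new_duty, new_state) given the latest PV voltage/current sample.
        p = v * i
        duty, last_p, direction = state
        if p < last_p:                        # power dropped: reverse the perturbation
            direction = -direction
        duty = min(max(duty + direction * step, 0.05), 0.95)
        return duty, (duty, p, direction)

    state = (0.5, 0.0, +1)                    # (duty cycle, last power, perturbation direction)
    for v, i in [(30.0, 5.0), (31.0, 5.1), (32.0, 4.9)]:   # assumed PV voltage/current samples
        duty, state = perturb_and_observe(v, i, state)
        print(f"duty cycle -> {duty:.3f}")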

  16. Evaluating the performance of coupled snow-soil models in SURFEXv8 to simulate the permafrost thermal regime at a high Arctic site

    Science.gov (United States)

    Barrere, Mathieu; Domine, Florent; Decharme, Bertrand; Morin, Samuel; Vionnet, Vincent; Lafaysse, Matthieu

    2017-09-01

    Climate change projections still suffer from a limited representation of the permafrost-carbon feedback. Predicting the response of permafrost temperature to climate change requires accurate simulations of Arctic snow and soil properties. This study assesses the capacity of the coupled land surface and snow models ISBA-Crocus and ISBA-ES to simulate snow and soil properties at Bylot Island, a high Arctic site. Field measurements complemented with ERA-Interim reanalyses were used to drive the models and to evaluate simulation outputs. Snow height, density, temperature, thermal conductivity and thermal insulance are examined to determine the critical variables involved in the soil and snow thermal regime. Simulated soil properties are compared to measurements of thermal conductivity, temperature and water content. The simulated snow density profiles are unrealistic, which is most likely caused by the lack of representation in snow models of the upward water vapor fluxes generated by the strong temperature gradients within the snowpack. The resulting vertical profiles of thermal conductivity are inverted compared to observations, with high simulated values at the bottom of the snowpack. Still, ISBA-Crocus manages to successfully simulate the soil temperature in winter. Results are satisfactory in summer, but the temperature of the top soil could be better reproduced by adequately representing surface organic layers, i.e., mosses and litter, and in particular their water retention capacity. Transition periods (soil freezing and thawing) are the least well reproduced because the high basal snow thermal conductivity induces an excessively rapid heat transfer between the soil and the snow in simulations. Hence, global climate models should carefully consider Arctic snow thermal properties, and especially the thermal conductivity of the basal snow layer, to perform accurate predictions of the permafrost evolution under climate change.

  17. ATR performance modeling concepts

    Science.gov (United States)

    Ross, Timothy D.; Baker, Hyatt B.; Nolan, Adam R.; McGinnis, Ryan E.; Paulson, Christopher R.

    2016-05-01

    Performance models are needed for automatic target recognition (ATR) development and use. ATRs consume sensor data and produce decisions about the scene observed. ATR performance models (APMs) on the other hand consume operating conditions (OCs) and produce probabilities about what the ATR will produce. APMs are needed for many modeling roles of many kinds of ATRs (each with different sensing modality and exploitation functionality combinations); moreover, there are different approaches to constructing the APMs. Therefore, although many APMs have been developed, there is rarely one that fits a particular need. Clarified APM concepts may allow us to recognize new uses of existing APMs and identify new APM technologies and components that better support coverage of the needed APMs. The concepts begin with thinking of ATRs as mapping OCs of the real scene (including the sensor data) to reports. An APM is then a mapping from explicit quantized OCs (represented with less resolution than the real OCs) and latent OC distributions to report distributions. The roles of APMs can be distinguished by the explicit OCs they consume. APMs used in simulations consume the true state that the ATR is attempting to report. APMs used online with the exploitation consume the sensor signal and derivatives, such as match scores. APMs used in sensor management consume neither of those, but estimate performance from other OCs. This paper will summarize the major building blocks for APMs, including knowledge sources, OC models, look-up tables, analytical and learned mappings, and tools for signal synthesis and exploitation.

  18. Software-In-the-Loop based Modeling and Simulation of Unmanned Semi-submersible Vehicle for Performance Verification of Autonomous Navigation

    Science.gov (United States)

    Lee, Kwangkook; Jeong, Mijin; Kim, Dong Hun

    2017-12-01

    Since an unmanned semi-submersible is mainly used to carry out dangerous missions at sea, it can work in regions that are difficult to access for safety reasons. In this study, a USV hull design was determined using the Myring hull profile, and reinforcement was provided by designing and implementing an inner stiffener member for 3D printing. In order to represent sea state 5.0 or higher, which is difficult to reproduce in practice, regular and irregular wave equations were implemented in Matlab/Simulink. We performed modeling and simulation of the semi-submersible based on DMWorks, considering the rolling motion in waves. To identify and correct unpredicted errors, we implemented a numerical and physical simulation model of the USV based on the software-in-the-loop (SIL) method. This simulation allows shipbuilders to participate in new value-added markets such as engineering, procurement, construction, installation, commissioning, operation, and maintenance for the USV.
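
    The regular and irregular wave forcing mentioned above can also be prototyped outside Simulink; the Python sketch below superposes components drawn from a Pierson-Moskowitz-type spectrum with random phases (significant wave height, peak period and frequency band are assumed values).

    import numpy as np

    def regular_wave(t, amplitude=1.0, period=8.0):
        # Single-frequency (regular) wave elevation.
        return amplitude * np.cos(2 * np.pi * t / period)

    def irregular_wave(t, hs=3.0, tp=9.0, n_components=200, seed=0):
        rng = np.random.default_rng(seed)
        w = np.linspace(0.3, 3.0, n_components)             # angular frequencies, rad/s
        dw = w[1] - w[0]
        wp = 2 * np.pi / tp
        # Pierson-Moskowitz-type spectrum parameterized by Hs and Tp
        s = (5.0 / 16.0) * hs**2 * wp**4 / w**5 * np.exp(-1.25 * (wp / w) ** 4)
        amp = np.sqrt(2.0 * s * dw)                         # component amplitudes
        phases = rng.uniform(0, 2 * np.pi, n_components)
        return np.sum(amp[:, None] * np.cos(w[:, None] * t[None, :] + phases[:, None]), axis=0)

    t = np.arange(0.0, 600.0, 0.5)
    eta = irregular_wave(t)
    print("4*std of elevation (approx. Hs):", 4.0 * eta.std())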

  19. Simulation - modeling - experiment

    International Nuclear Information System (INIS)

    2004-01-01

    After two workshops held in 2001 on the same topics, and in order to make a status of the advances in the domain of simulation and measurements, the main goals proposed for this workshop are: the presentation of the state-of-the-art of tools, methods and experiments in the domains of interest of the Gedepeon research group, the exchange of information about the possibilities of use of computer codes and facilities, about the understanding of physical and chemical phenomena, and about development and experiment needs. This document gathers 18 presentations (slides) among the 19 given at this workshop and dealing with: the deterministic and stochastic codes in reactor physics (Rimpault G.); MURE: an evolution code coupled with MCNP (Meplan O.); neutronic calculation of future reactors at EdF (Lecarpentier D.); advance status of the MCNP/TRIO-U neutronic/thermal-hydraulics coupling (Nuttin A.); the FLICA4/TRIPOLI4 thermal-hydraulics/neutronics coupling (Aniel S.); methods of disturbances and sensitivity analysis of nuclear data in reactor physics, application to VENUS-2 experimental reactor (Bidaud A.); modeling for the reliability improvement of an ADS accelerator (Biarotte J.L.); residual gas compensation of the space charge of intense beams (Ben Ismail A.); experimental determination and numerical modeling of phase equilibrium diagrams of interest in nuclear applications (Gachon J.C.); modeling of irradiation effects (Barbu A.); elastic limit and irradiation damage in Fe-Cr alloys: simulation and experiment (Pontikis V.); experimental measurements of spallation residues, comparison with Monte-Carlo simulation codes (Fallot M.); the spallation target-reactor coupling (Rimpault G.); tools and data (Grouiller J.P.); models in high energy transport codes: status and perspective (Leray S.); other ways of investigation for spallation (Audoin L.); neutrons and light particles production at intermediate energies (20-200 MeV) with iron, lead and uranium targets (Le Colley F

  20. Modeling and Simulation for Safeguards

    International Nuclear Information System (INIS)

    Swinhoe, Martyn T.

    2012-01-01

    The purpose of this talk is to give an overview of the role of modeling and simulation in Safeguards R and D and introduce you to (some of) the tools used. Some definitions are: (1) Modeling - the representation, often mathematical, of a process, concept, or operation of a system, often implemented by a computer program; (2) Simulation - the representation of the behavior or characteristics of one system through the use of another system, especially a computer program designed for the purpose; and (3) Safeguards - the timely detection of diversion of significant quantities of nuclear material. The roles of modeling and simulation are: (1) to calculate amounts of material (plant modeling); (2) to calculate signatures of nuclear material etc. (source terms); and (3) to evaluate detector performance (radiation transport and detection). Plant modeling software (e.g. FACSIM) gives the flows and amounts of material stored at all parts of the process. In safeguards this allows us to calculate the expected uncertainty of the mass and evaluate the expected MUF. We can determine the measurement accuracy required to achieve a certain performance.
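
    The material-balance bookkeeping behind MUF, and the propagation of measurement uncertainty that determines whether an observed MUF is significant, can be written down in a few lines; the quantities below are made-up examples, not data from any facility.

    import math

    def muf(beginning_inventory, receipts, shipments, ending_inventory):
        # MUF = (beginning inventory + receipts) - (shipments + ending inventory)
        return (beginning_inventory + receipts) - (shipments + ending_inventory)

    def sigma_muf(*sigmas):
        # Standard deviation of MUF, assuming independent measurement errors.
        return math.sqrt(sum(s ** 2 for s in sigmas))

    m = muf(beginning_inventory=100.0, receipts=25.0, shipments=20.0, ending_inventory=104.7)  # kg
    s = sigma_muf(0.2, 0.1, 0.1, 0.2)
    print(f"MUF = {m:.2f} kg, sigma_MUF = {s:.2f} kg, exceeds 3 sigma: {abs(m) > 3 * s}")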

  1. Model Performance Evaluation and Scenario Analysis (MPESA)

    Science.gov (United States)

    Model Performance Evaluation and Scenario Analysis (MPESA) assesses the performance with which models predict time series data. The tool was developed for the Hydrological Simulation Program-Fortran (HSPF) and the Stormwater Management Model (SWMM).

  2. Calibration and validation of a model for simulating thermal and electric performance of an internal combustion engine-based micro-cogeneration device

    International Nuclear Information System (INIS)

    Rosato, A.; Sibilio, S.

    2012-01-01

    The growing worldwide demand for more efficient and less polluting forms of energy production has led to a renewed interest in the use of micro-cogeneration technologies in the residential sector. Among other technologies, internal combustion engine-based micro-cogeneration devices are a market-ready technology gaining increasing appeal thanks to their high efficiency, fuel flexibility, low emissions, low noise and vibration. In order to explore and assess the feasibility of using internal combustion engine-based cogeneration systems in the residential sector, an accurate and practical simulation model that can be used to conduct sensitivity and what-if analyses is needed. A residential cogeneration device model has been developed within IEA/ECBCS Annex 42 and implemented into a number of building simulation programs. This model is potentially able to accurately predict the thermal and electrical outputs of residential cogeneration devices, but it relies almost entirely on empirical data because the model specification uses experimental measurements contained within a performance map to represent the device-specific performance characteristics, coupled with thermally massive elements to characterize the device's dynamic thermal performance. At the Built Environment Control Laboratory of Seconda Università degli studi di Napoli, an AISIN SEIKI micro-cogeneration device based on a natural gas-fuelled reciprocating internal combustion engine is available. This unit has been intensively tested in order to calibrate and validate the Annex 42 model. This paper shows in detail the series of experiments conducted for the calibration activity and examines the validity of this model by contrasting simulation predictions to measurements derived by operating the system in an electric load-following control strategy. The statistical comparison was made both for the whole database and for the data segregated by system operation mode. The good agreement found in the predictions of

  3. Biomolecular modelling and simulations

    CERN Document Server

    Karabencheva-Christova, Tatyana

    2014-01-01

    Published continuously since 1944, the Advances in Protein Chemistry and Structural Biology series is the essential resource for protein chemists. Each volume brings forth new information about protocols and analysis of proteins. Each thematically organized volume is guest edited by leading experts in a broad range of protein-related topics. Describes advances in biomolecular modelling and simulations Chapters are written by authorities in their field Targeted to a wide audience of researchers, specialists, and students The information provided in the volume is well supported by a number of high quality illustrations, figures, and tables.

  4. Problem reporting management system performance simulation

    Science.gov (United States)

    Vannatta, David S.

    1993-01-01

    This paper proposes the Problem Reporting Management System (PRMS) model as an effective discrete simulation tool that determines the risks involved during the development phase of a Trouble Tracking Reporting Data Base replacement system. The model considers the type of equipment and networks which will be used in the replacement system as well as varying user loads, size of the database, and expected operational availability. The paper discusses the dynamics, stability, and application of the PRMS and addresses concepts suggested to enhance the service performance.

  5. Stereoscopic (3D) versus monoscopic (2D) laparoscopy: comparative study of performance using advanced HD optical systems in a surgical simulator model.

    Science.gov (United States)

    Schoenthaler, Martin; Schnell, Daniel; Wilhelm, Konrad; Schlager, Daniel; Adams, Fabian; Hein, Simon; Wetterauer, Ulrich; Miernik, Arkadiusz

    2016-04-01

    To compare task performances of novices and experts using advanced high-definition 3D versus 2D optical systems in a surgical simulator model. Fifty medical students (novices in laparoscopy) were randomly assigned to perform five standardized tasks adopted from the Fundamentals of Laparoscopic Surgery (FLS) curriculum in either a 2D or 3D laparoscopy simulator system. In addition, eight experts performed the same tasks. Task performances were evaluated using a validated scoring system of the SAGES/FLS program. Participants were asked to rate 16 items in a questionnaire. Overall task performance of novices was significantly better using stereoscopic visualization. Superiority of performances in 3D reached a level of significance for tasks peg transfer and precision cutting. No significant differences were noted in performances of experts when using either 2D or 3D. Overall performances of experts compared to novices were better in both 2D and 3D. Scorings in the questionnaires showed a tendency toward lower scores in the group of novices using 3D. Stereoscopic imaging significantly improves performance of laparoscopic phantom tasks of novices. The current study confirms earlier data based on a large number of participants and a standardized task and scoring system. Participants felt more confident and comfortable when using a 3D laparoscopic system. However, the question remains open whether these findings translate into faster and safer operations in a clinical setting.

  6. Repository simulation model: Final report

    International Nuclear Information System (INIS)

    1988-03-01

    This report documents the application of computer simulation for the design analysis of the nuclear waste repository's waste handling and packaging operations. The Salt Repository Simulation Model was used to evaluate design alternatives during the conceptual design phase of the Salt Repository Project. Code development and verification were performed by the Office of Nuclear Waste Isolation (ONWI). The focus of this report is to relate the experience gained during the development and application of the Salt Repository Simulation Model to future repository design phases. Design of the repository's waste handling and packaging systems will require sophisticated analysis tools to evaluate complex operational and logistical design alternatives. Selection of these design alternatives in the Advanced Conceptual Design (ACD) and License Application Design (LAD) phases must be supported by analysis to demonstrate that the repository design will cost-effectively meet DOE's mandated emplacement schedule and that uncertainties in the performance of the repository's systems have been objectively evaluated. Computer simulation of repository operations will provide future repository designers with data and insights that no other analytical form of analysis can provide. 6 refs., 10 figs

  7. NRTA simulation by modeling PFPF

    International Nuclear Information System (INIS)

    Asano, Takashi; Fujiwara, Shigeo; Takahashi, Saburo; Shibata, Junichi; Totsu, Noriko

    2003-01-01

    In PFPF, an NRTA system has been applied since 1991. By evaluating the facility material accountancy data provided by the operator at each IIV, it has been confirmed that no significant MUF was generated. For a throughput of PFPF scale, MUF can be evaluated with a sufficient detection probability by the present NRTA evaluation method. However, as throughput increases, the uncertainty of the material accountancy will increase and the detection probability will decline. The relationship between increasing throughput and declining detection probability, and the maximum throughput for which a sufficient detection probability can still be achieved when the following measures are applied, were evaluated by simulation of the NRTA system. The simulation was performed by modeling PFPF. The measures for increasing the detection probability are: shortening the evaluation interval and segmenting the evaluation area. This report shows the results of these simulations. (author)

  8. Structured building model reduction toward parallel simulation

    Energy Technology Data Exchange (ETDEWEB)

    Dobbs, Justin R. [Cornell University; Hencey, Brondon M. [Cornell University

    2013-08-26

    Building energy model reduction exchanges accuracy for improved simulation speed by reducing the number of dynamical equations. Parallel computing aims to improve simulation times without loss of accuracy but is poorly utilized by contemporary simulators and is inherently limited by inter-processor communication. This paper bridges these disparate techniques to implement efficient parallel building thermal simulation. We begin with a survey of three structured reduction approaches that compares their performance to a leading unstructured method. We then use structured model reduction to find thermal clusters in the building energy model and allocate processing resources. Experimental results demonstrate faster simulation and low error without any interprocessor communication.

  9. SNR and BER Models and the Simulation for BER Performance of Selected Spectral Amplitude Codes for OCDMA

    Directory of Open Access Journals (Sweden)

    Abdul Latif Memon

    2014-01-01

    Many encoding schemes are used in OCDMA (Optical Code Division Multiple Access) networks, but SAC (Spectral Amplitude Codes) is widely used. It is considered an effective arrangement to eliminate the dominant noise called MAI (Multi Access Interference). Various codes are studied and evaluated with respect to their performance against three noises, namely shot noise, thermal noise and PIIN (Phase Induced Intensity Noise). Various mathematical models for SNR (Signal to Noise Ratio) and BER (Bit Error Rate) are discussed, where the SNRs are calculated and the BERs are computed using a Gaussian distribution assumption. After analyzing the results mathematically, it is concluded that the ZCC (Zero Cross Correlation) code performs better than the other selected SAC codes and can serve a larger number of active users than the other codes do. At various receiver power levels, the analysis points out that the RDC (Random Diagonal Code) also performs better than the other codes; for the power interval between -10 and -20 dBm, the performance of RDC is better than that of ZCC. Their lowest BER values suggest that these codes should be part of an efficient and cost-effective OCDM access network in the future.
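
    The Gaussian-approximation step mentioned above is commonly written in the SAC-OCDMA literature as BER = (1/2) erfc( sqrt(SNR/8) ); the short Python sketch below applies that relation to a few assumed SNR values purely for illustration.

    import math

    def ber_from_snr(snr):
        # Gaussian approximation: BER = 0.5 * erfc( sqrt(SNR / 8) )
        return 0.5 * math.erfc(math.sqrt(snr / 8.0))

    for snr_db in (10, 15, 20, 25):
        snr = 10 ** (snr_db / 10.0)                 # convert dB to linear SNR
        print(f"SNR = {snr_db} dB  ->  BER = {ber_from_snr(snr):.3e}")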

  10. A predictive analytic model for high-performance tunneling field-effect transistors approaching non-equilibrium Green's function simulations

    International Nuclear Information System (INIS)

    Salazar, Ramon B.; Appenzeller, Joerg; Ilatikhameneh, Hesameddin; Rahman, Rajib; Klimeck, Gerhard

    2015-01-01

    A new compact modeling approach is presented which describes the full current-voltage (I-V) characteristic of high-performance (aggressively scaled-down) tunneling field-effect-transistors (TFETs) based on homojunction direct-bandgap semiconductors. The model is based on an analytic description of two key features, which capture the main physical phenomena related to TFETs: (1) the potential profile from source to channel and (2) the elliptic curvature of the complex bands in the bandgap region. It is proposed to use 1D Poisson's equations in the source and the channel to describe the potential profile in homojunction TFETs. This makes it possible to quantify the impact of source/drain doping on device performance, an aspect usually ignored in TFET modeling but highly relevant in ultra-scaled devices. The compact model is validated by comparison with state-of-the-art quantum transport simulations using a 3D full band atomistic approach based on non-equilibrium Green's functions. It is shown that the model reproduces with good accuracy the data obtained from the simulations in all regions of operation: the on/off states and the n/p branches of conduction. This approach allows calculation of energy-dependent band-to-band tunneling currents in TFETs, a feature that allows gaining deep insights into the underlying device physics. The simplicity and accuracy of the approach provide a powerful tool to explore in a quantitative manner how a wide variety of parameters (material-, size-, and/or geometry-dependent) impact the TFET performance under any bias conditions. The proposed model thus presents a practical complement to computationally expensive simulations such as the 3D NEGF approach

  11. Simulating spin models on GPU

    Science.gov (United States)

    Weigel, Martin

    2011-09-01

    Over the last couple of years it has been realized that the vast computational power of graphics processing units (GPUs) could be harvested for purposes other than the video game industry. This power, which at least nominally exceeds that of current CPUs by large factors, results from the relative simplicity of the GPU architectures as compared to CPUs, combined with a large number of parallel processing units on a single chip. To benefit from this setup for general computing purposes, the problems at hand need to be prepared in a way to profit from the inherent parallelism and hierarchical structure of memory accesses. In this contribution I discuss the performance potential for simulating spin models, such as the Ising model, on GPU as compared to conventional simulations on CPU.
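
    The property that makes spin models attractive on GPUs is that a checkerboard decomposition lets all sites of one colour be updated simultaneously. The sketch below expresses one Metropolis sweep of the 2-D Ising model with vectorized NumPy operations as a stand-in for the parallel kernel; the lattice size, temperature and sweep count are arbitrary illustrative choices.

    import numpy as np

    def checkerboard_sweep(spins, beta, rng):
        ii, jj = np.indices(spins.shape)
        for color in (0, 1):                            # black sites, then white sites
            mask = (ii + jj) % 2 == color
            neighbors = (np.roll(spins, 1, 0) + np.roll(spins, -1, 0) +
                         np.roll(spins, 1, 1) + np.roll(spins, -1, 1))
            delta_e = 2.0 * spins * neighbors           # energy cost of flipping each site
            accept = rng.random(spins.shape) < np.exp(-beta * delta_e)
            spins[mask & accept] *= -1                  # flip accepted sites of this colour only
        return spins

    rng = np.random.default_rng(0)
    spins = rng.choice([-1, 1], size=(64, 64))
    for _ in range(200):
        checkerboard_sweep(spins, beta=0.5, rng=rng)    # beta above the critical value ~0.44
    print("magnetization per spin:", spins.mean())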

  12. SEMI Modeling and Simulation Roadmap

    Energy Technology Data Exchange (ETDEWEB)

    Hermina, W.L.

    2000-10-02

    With the exponential growth in the power of computing hardware and software, modeling and simulation is becoming a key enabler for the rapid design of reliable Microsystems. One vision of the future microsystem design process would include the following primary software capabilities: (1) The development of 3D part design, through standard CAD packages, with automatic design rule checks that guarantee the manufacturability and performance of the microsystem. (2) Automatic mesh generation, for 3D parts as manufactured, that permits computational simulation of the process steps, and the performance and reliability analysis for the final microsystem. (3) Computer generated 2D layouts for process steps that utilize detailed process models to generate the layout and process parameter recipe required to achieve the desired 3D part. (4) Science-based computational tools that can simulate the process physics, and the coupled thermal, fluid, structural, solid mechanics, electromagnetic and material response governing the performance and reliability of the microsystem. (5) Visualization software that permits the rapid visualization of 3D parts including cross-sectional maps, performance and reliability analysis results, and process simulation results. In addition to these desired software capabilities, a desired computing infrastructure would include massively parallel computers that enable rapid high-fidelity analysis, coupled with networked compute servers that permit computing at a distance. We now discuss the individual computational components that are required to achieve this vision. There are three primary areas of focus: design capabilities, science-based capabilities and computing infrastructure. Within each of these areas, there are several key capability requirements.

  13. Study of the performance of three micromixing models in transported scalar PDF simulations of a piloted jet diffusion flame ('Delft Flame III')

    Energy Technology Data Exchange (ETDEWEB)

    Merci, Bart [Department of Flow, Heat and Combustion Mechanics, Ghent University-UGent, B-9000 Ghent (Belgium); Roekaerts, Dirk [Department of Multi-Scale Physics, Delft University of Technology, Delft (Netherlands); Naud, Bertrand [CIEMAT, Madrid (Spain)

    2006-02-01

    Numerical simulation results are presented for a turbulent nonpremixed flame with local extinction and reignition. The transported scalar PDF approach is applied to the turbulence-chemistry interaction. The turbulent flow field is obtained with a nonlinear two-equation turbulence model. A C{sub 1} skeletal scheme is used as the chemistry model. The performance of three micromixing models is compared: the interaction by exchange with the mean model (IEM), the modified Curl's coalescence/dispersion model (CD) and the Euclidean minimum spanning tree model (EMST). With the IEM model, global extinction occurs. With the standard value of model constant C{sub f}=2, the CD model yields a lifted flame, unlike the experiments, while with the EMST model the correct flame shape is obtained. However, the conditional variances of the thermochemical quantities are underestimated with the EMST model, due to a lack of local extinction in the simulations. With the CD model, the flame becomes attached when either the value of C{sub f} is increased to 3 or the pilot flame thermal power is increased by a factor of 1.5. With increased value of C{sub f} better results for mixture fraction variance are obtained with both the CD and the EMST model. Lowering the value of C{sub f} leads to better predictions for mean temperature with EMST, but at the cost of stronger overprediction of mixture fraction variance. These trends are explained as a consequence of variance production by macroscopic inhomogeneity and the specific properties of the micromixing models. Local time stepping is applied so that convergence is obtained more quickly. Iteration averaging reduces statistical error so that the limited number of 50 particles per cell is sufficient to obtain accurate results. (author)
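
    Of the three micromixing models compared above, the IEM model has the simplest form: each notional particle's scalar relaxes toward the local mean at a rate set by the mixing constant and the turbulence frequency. The Python sketch below shows that update on a set of particles; C_phi, omega, the time step and the initial scalar field are all assumed illustrative values.

    import numpy as np

    def iem_step(phi, c_phi=2.0, omega=50.0, dt=1e-4):
        # IEM: dphi/dt = -0.5 * C_phi * omega * (phi - <phi>)
        return phi - 0.5 * c_phi * omega * dt * (phi - phi.mean())

    rng = np.random.default_rng(0)
    phi = rng.uniform(0.0, 1.0, size=1000)   # e.g. mixture fraction carried by 1000 particles
    for _ in range(500):
        phi = iem_step(phi)
    print("mean (conserved):", phi.mean(), " variance (decays):", phi.var())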

  14. Manufacturing plant performance evaluation by discrete event simulation

    International Nuclear Information System (INIS)

    Rosli Darmawan; Mohd Rasid Osman; Rosnah Mohd Yusuff; Napsiah Ismail; Zulkiflie Leman

    2002-01-01

    A case study was conducted to evaluate the performance of a manufacturing plant using a discrete event simulation technique. The study was carried out on an animal feed production plant, the Sterifeed plant at the Malaysian Institute for Nuclear Technology Research (MINT), Selangor, Malaysia. The plant was modelled based on the actual manufacturing activities recorded by the operators. The simulation was carried out using discrete event simulation software. The model was validated by comparing the simulation results with the actual operational data of the plant. The simulation results show some weaknesses in the current plant design, and proposals were made to improve the plant performance. (Author)

  15. Using Modeling and Simulation to Analyze Application and Network Performance at the Radioactive Waste and Nuclear Material Disposition Facility

    International Nuclear Information System (INIS)

    LIFE, ROY A.; MAESTAS, JOSEPH H.; BATEMAN, DENNIS B.

    2003-01-01

    Telecommunication services customers at the Radioactive Waste and Nuclear Material Disposition Facility (RWNMDF) have endured regular service outages that seem to be associated with a custom Microsoft Access Database. In addition, the same customers have noticed periods when application response times are noticeably worse than at others. To the customers, the two events appear to be correlated. Although many network design activities can be accomplished using trial-and-error methods, there are as many, if not more occasions where computerized analysis is necessary to verify the benefits of implementing one design alternative versus another. This is particularly true when network design is performed with application flows and response times in mind. More times than not, it is unclear whether upgrading certain aspects of the network will provide sufficient benefit to justify the corresponding costs, and network modeling tools can be used to help staff make these decisions. This report summarizes our analysis of the situation at the RWNMDF, in which computerized analysis was used to accomplish four objectives: (1) identify the source of the problem; (2) identify areas where improvements make the most sense; (3) evaluate various scenarios ranging from upgrading the network infrastructure, installing an additional fiber trunk as a way to improve local network performance, and re-locating the RWNMDF database onto corporate servers; and (4) demonstrate a methodology for network design using actual application response times to predict, select, and implement the design alternatives that provide the best performance and cost benefits

  16. Using Modeling and Simulation to Predict Operator Performance and Automation-Induced Complacency With Robotic Automation: A Case Study and Empirical Validation.

    Science.gov (United States)

    Wickens, Christopher D; Sebok, Angelia; Li, Huiyang; Sarter, Nadine; Gacy, Andrew M

    2015-09-01

    The aim of this study was to develop and validate a computational model of the automation complacency effect, as operators work on a robotic arm task, supported by three different degrees of automation. Some computational models of complacency in human-automation interaction exist, but those are formed and validated within the context of fairly simplified monitoring failures. This research extends model validation to a much more complex task, so that system designers can establish, without need for human-in-the-loop (HITL) experimentation, merits and shortcomings of different automation degrees. We developed a realistic simulation of a space-based robotic arm task that could be carried out with three different levels of trajectory visualization and execution automation support. Using this simulation, we performed HITL testing. Complacency was induced via several trials of correctly performing automation and then was assessed on trials when automation failed. Following a cognitive task analysis of the robotic arm operation, we developed a multicomponent model of the robotic operator and his or her reliance on automation, based in part on visual scanning. The comparison of model predictions with empirical results revealed that the model accurately predicted routine performance and predicted the responses to these failures after complacency developed. However, the scanning models do not account for the entire attention allocation effects of complacency. Complacency modeling can provide a useful tool for predicting the effects of different types of imperfect automation. The results from this research suggest that focus should be given to supporting situation awareness in automation development. © 2015, Human Factors and Ergonomics Society.

  17. Impact of a function-based payment model on the financial performance of acute inpatient medical rehabilitation providers: a simulation analysis.

    Science.gov (United States)

    Sutton, J P; DeJong, G; Song, H; Wilkerson, D

    1997-12-01

    To operationalize research findings about a medical rehabilitation classification and payment model by building a prototype of a prospective payment system, and to determine whether this prototype model promotes payment equity. This latter objective is accomplished by identifying whether any facility or payment model characteristics are systematically associated with financial performance. This study was conducted in two phases. In Phase 1 the components of a diagnosis-related group (DRG)-like payment system, including a base rate, function-related group (FRG) weights, and adjusters, were identified and estimated using hospital cost functions. Phase 2 consisted of a simulation analysis in which each facility's financial performance was modeled, based on its 1990-1991 case mix. A multivariate regression analysis was conducted to assess the extent to which characteristics of 42 rehabilitation facilities contribute toward determining financial performance under the present Medicare payment system as well as under the hypothetical model developed. Phase 1 (model development) included 61 rehabilitation hospitals. Approximately 59% were rehabilitation units within a general hospital and 48% were teaching facilities. The number of rehabilitation beds averaged 52. Phase 2 of the simulation analysis included 42 rehabilitation facilities, subscribers to UDS in 1990-1991. Of these, 69% were rehabilitation units and 52% were teaching facilities. The number of rehabilitation beds averaged 48. Financial performance was measured by the ratio of reimbursement to average costs. Case-mix index is the primary determinant of financial performance under the present Medicare payment system. None of the facility characteristics included in this analysis were associated with financial performance under the hypothetical FRG payment model. The most notable impact of an FRG-based payment model would be to create a stronger link between resource intensity and level of reimbursement

  18. Simulating atmospheric composition over a South-East Asian tropical rainforest: performance of a chemistry box model

    Directory of Open Access Journals (Sweden)

    T. A. M. Pugh

    2010-01-01

    Atmospheric composition and chemistry above tropical rainforests is currently not well established, particularly for south-east Asia. In order to examine our understanding of chemical processes in this region, the performance of a box model of atmospheric boundary layer chemistry is tested against measurements made at the top of the rainforest canopy near Danum Valley, Malaysian Borneo. Multi-variate optimisation against ambient concentration measurements was used to estimate average canopy-scale emissions for isoprene, total monoterpenes and nitric oxide. The excellent agreement between estimated values and measured fluxes of isoprene and total monoterpenes provides confidence in the overall modelling strategy, and suggests that this method may be applied where measured fluxes are not available, assuming that the local chemistry and mixing are adequately understood. The largest contributors to the optimisation cost function at the point of best fit are OH (29%), NO (22%) and total peroxy radicals (27%). Several factors affect the modelled VOC chemistry. In particular, concentrations of methacrolein (MACR) and methyl-vinyl ketone (MVK) are substantially overestimated, and the hydroxyl radical (OH) concentration is substantially underestimated, as has been seen before in tropical rainforest studies. It is shown that inclusion of dry deposition of MACR and MVK and wet deposition of species with high Henry's Law values substantially improves the fit of these oxidised species, whilst also substantially decreasing the OH sink. Increasing OH production arbitrarily, through a simple OH recycling mechanism, adversely affects the model fit for volatile organic compounds (VOCs). Given the constraints on isoprene flux provided by measurements, a substantial decrease in the rate of reaction of VOCs with OH is the only remaining option to explain the measurement/model discrepancy for OH. A reduction in the isoprene+OH rate constant of 50%, in conjunction with
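
    To indicate the structure of such a chemistry box model, the toy Python sketch below integrates a single species (isoprene) with a constant emission and loss by reaction with a fixed OH concentration; the rate constant, OH level and emission are assumed round numbers, not values from the study.

    from scipy.integrate import solve_ivp

    K_OH_ISOP = 1.0e-10     # cm^3 molecule^-1 s^-1, assumed rate constant for isoprene + OH
    OH = 1.0e6              # molecule cm^-3, assumed fixed OH concentration
    EMISSION = 5.0e6        # molecule cm^-3 s^-1, assumed canopy-scale source term

    def box_model(t, y):
        isoprene = y[0]
        return [EMISSION - K_OH_ISOP * OH * isoprene]   # d[isoprene]/dt

    sol = solve_ivp(box_model, (0.0, 6 * 3600.0), [0.0], max_step=60.0)
    steady_state = EMISSION / (K_OH_ISOP * OH)          # analytic steady state for comparison
    print(f"after 6 h: {sol.y[0, -1]:.3e}, steady state: {steady_state:.3e} molecule cm^-3")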

  19. TOWARD END-TO-END MODELING FOR NUCLEAR EXPLOSION MONITORING: SIMULATION OF UNDERGROUND NUCLEAR EXPLOSIONS AND EARTHQUAKES USING HYDRODYNAMIC AND ANELASTIC SIMULATIONS, HIGH-PERFORMANCE COMPUTING AND THREE-DIMENSIONAL EARTH MODELS

    Energy Technology Data Exchange (ETDEWEB)

    Rodgers, A; Vorobiev, O; Petersson, A; Sjogreen, B

    2009-07-06

    This paper describes new research being performed to improve understanding of seismic waves generated by underground nuclear explosions (UNE) by using full waveform simulation, high-performance computing and three-dimensional (3D) earth models. The goal of this effort is to develop an end-to-end modeling capability to cover the range of wave propagation required for nuclear explosion monitoring (NEM) from the buried nuclear device to the seismic sensor. The goal of this work is to improve understanding of the physical basis and prediction capabilities of seismic observables for NEM including source and path-propagation effects. We are pursuing research along three main thrusts. Firstly, we are modeling the non-linear hydrodynamic response of geologic materials to underground explosions in order to better understand how source emplacement conditions impact the seismic waves that emerge from the source region and are ultimately observed hundreds or thousands of kilometers away. Empirical evidence shows that the amplitudes and frequency content of seismic waves at all distances are strongly impacted by the physical properties of the source region (e.g. density, strength, porosity). To model the near-source shock-wave motions of an UNE, we use GEODYN, an Eulerian Godunov (finite volume) code incorporating thermodynamically consistent non-linear constitutive relations, including cavity formation, yielding, porous compaction, tensile failure, bulking and damage. In order to propagate motions to seismic distances we are developing a one-way coupling method to pass motions to WPP (a Cartesian anelastic finite difference code). Preliminary investigations of UNE's in canonical materials (granite, tuff and alluvium) confirm that emplacement conditions have a strong effect on seismic amplitudes and the generation of shear waves. Specifically, we find that motions from an explosion in high-strength, low-porosity granite have high compressional wave amplitudes and weak

  20. Acoustic Performance of Novel Fan Noise Reduction Technologies for a High Bypass Model Turbofan at Simulated Flights Conditions

    Science.gov (United States)

    Elliott, David M.; Woodward, Richard P.; Podboy, Gary G.

    2010-01-01

    Two novel fan noise reduction technologies, over the rotor acoustic treatment and soft stator vane technologies, were tested in an ultra-high bypass ratio turbofan model in the NASA Glenn Research Center's 9- by 15-Foot Low-Speed Wind Tunnel. The performance of these technologies was compared to that of the baseline fan configuration, which did not have these technologies. Sideline acoustic data and hot film flow data were acquired and are used to determine the effectiveness of the various treatments. The material used for the over the rotor treatment was foam metal, and two different types were used. The soft stator vanes had several internal cavities tuned to target certain frequencies. In order to accommodate the cavities it was necessary to use a cut-on stator to demonstrate the soft vane concept.

  1. Performance of Irikura's Recipe Rupture Model Generator in Earthquake Ground Motion Simulations as Implemented in the Graves and Pitarka Hybrid Approach.

    Energy Technology Data Exchange (ETDEWEB)

    Pitarka, A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-11-22

    We analyzed the performance of the Irikura and Miyake (2011) (IM2011) asperity-based kinematic rupture model generator, as implemented in the hybrid broadband ground-motion simulation methodology of Graves and Pitarka (2010), for simulating ground motion from crustal earthquakes of intermediate size. The primary objective of our study is to investigate the transportability of IM2011 into the framework used by the Southern California Earthquake Center broadband simulation platform. In our analysis, we performed broadband (0–20 Hz) ground motion simulations for a suite of M6.7 crustal scenario earthquakes in a hard rock seismic velocity structure using rupture models produced with both IM2011 and the rupture generation method of Graves and Pitarka (2016) (GP2016). The level of simulated ground motions for the two approaches compares favorably with median estimates obtained from the 2014 Next Generation Attenuation-West2 Project (NGA-West2) ground-motion prediction equations (GMPEs) over the frequency band 0.1–10 Hz and for distances out to 22 km from the fault. We also found that, compared to GP2016, IM2011 generates ground motion with larger variability, particularly at near-fault distances (<12 km) and at long periods (>1 s). For this specific scenario, the largest systematic difference in ground motion level for the two approaches occurs in the period band 1–3 s, where the IM2011 motions are about 20–30% lower than those for GP2016. We found that increasing the rupture speed by 20% on the asperities in IM2011 produced ground motions in the 1–3 s bandwidth that are in much closer agreement with the GMPE medians and similar to those obtained with GP2016. The potential implications of this modification for other rupture mechanisms and magnitudes are not yet fully understood, and this topic is the subject of ongoing study.

  2. Simulation modeling and analysis with Arena

    CERN Document Server

    Altiok, Tayfur

    2007-01-01

    Simulation Modeling and Analysis with Arena is a highly readable textbook which treats the essentials of the Monte Carlo discrete-event simulation methodology, and does so in the context of the popular Arena simulation environment. It treats simulation modeling as an in-vitro laboratory that facilitates the understanding of complex systems and experimentation with what-if scenarios in order to estimate their performance metrics. The book contains chapters on the simulation modeling methodology and the underpinnings of discrete-event systems, as well as the relevant underlying probability, statistics, stochastic processes, input analysis, model validation and output analysis. All simulation-related concepts are illustrated in numerous Arena examples, encompassing production lines, manufacturing and inventory systems, transportation systems, and computer information systems in networked settings. Introduces the concept of discrete event Monte Carlo simulation, the most commonly used methodology for modeli...

  3. Development of design technology on thermal-hydraulic performance in tight-lattice rod bundle. 4. Large paralleled simulation by the advanced two-fluid model code

    International Nuclear Information System (INIS)

    Misawa, Takeharu; Yoshida, Hiroyuki; Akimoto, Hajime

    2008-01-01

    In Japan Atomic Energy Agency (JAEA), the Innovative Water Reactor for Flexible Fuel Cycle (FLWR) has been developed. For the thermal design of FLWR, it is necessary to develop an analytical method to predict boiling transition in FLWR. JAEA has been developing the three-dimensional two-fluid model analysis code ACE-3D, which adopts a boundary-fitted coordinate system to simulate flow in complex-shaped channels. In this paper, as part of the development of ACE-3D for rod bundle analysis, the introduction of parallelization to ACE-3D and assessments of ACE-3D are presented. In the analysis of a large-scale domain such as a rod bundle, even the two-fluid model requires a large computational cost, and the memory requirement exceeds the capacity of a single CPU. Therefore, parallelization was introduced to ACE-3D to divide the data for a large-scale domain among a large number of CPUs, and it is confirmed that analysis of a large-scale domain such as a rod bundle can be performed while maintaining parallel computation performance even when a large number of CPUs is used. ACE-3D adopts two-phase flow models, some of which depend on channel geometry. Therefore, analyses of domains simulating an individual subchannel and a 37-rod bundle are performed and compared with experiments. It is confirmed that the results obtained by both analyses using ACE-3D show qualitative agreement with past experimental results. (author)

  4. Modeling and performance simulation of 100 MW PTC based solar thermal power plant in Udaipur India

    Directory of Open Access Journals (Sweden)

    Deepak Bishoyi

    2017-09-01

    Full Text Available Solar energy is a key renewable energy source and the most abundant energy source on the globe. Solar energy can be converted into electric energy by using two different processes: by means of photovoltaic (PV) conversion and by thermodynamic cycles. Concentrated solar power (CSP) is viewed as one of the most promising alternatives in the field of solar energy utilization. The lifetime and efficiency of PV systems are lower than those of CSP technology. A 100 MW parabolic trough solar thermal power plant with 6 h of thermal energy storage has been evaluated in terms of design and thermal performance, based on the System Advisor Model (SAM). A location receiving an annual DNI of 2248.17 kW h/m2 in Rajasthan is chosen for the technical feasibility of the hypothetical CSP plant. The plant design consists of 194 solar collector loops, each loop comprising 8 parabolic trough collectors. HITEC solar salt is chosen as the HTF due to its excellent thermodynamic properties. The designed plant can generate 285,288,352 kW h of electricity annually with a plant efficiency of 21%. The proposed design of the PTC-based solar thermal power plant and its performance analysis encourages further innovation and development of solar thermal power plants in India.
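
    A quick consistency check on the reported figures is the implied capacity factor, i.e. annual generation divided by what a 100 MW plant would produce running continuously; this value is derived here from the numbers quoted above and is not stated in the abstract:

    annual_energy_kwh = 285_288_352                  # reported annual generation [kWh]
    nameplate_kw = 100_000                           # 100 MW gross capacity [kW]
    hours_per_year = 8760
    capacity_factor = annual_energy_kwh / (nameplate_kw * hours_per_year)
    print(f"implied capacity factor: {capacity_factor:.1%}")   # about 32.6%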

  5. Aircraft Performance for Open Air Traffic Simulations

    NARCIS (Netherlands)

    Metz, I.C.; Hoekstra, J.M.; Ellerbroek, J.; Kugler, D.

    2016-01-01

    The BlueSky Open Air Traffic Simulator developed by the Control & Simulation section of TU Delft aims at supporting research for analysing Air Traffic Management concepts by providing an open source simulation platform. The goal of this study was to complement BlueSky with aircraft performance

  6. Performance Assessment Tools for Distance Learning and Simulation: Knowledge, Models and Tools to Improve the Effectiveness of Naval Distance Learning

    National Research Council Canada - National Science Library

    Baker, Eva L; Munro, Allen; Pizzini, Quentin A; Brill, David G; Michiuye, Joanne K

    2006-01-01

    ... by (a) extending CRESST's knowledge mapping tool's authoring and scoring functionality and (b) providing the capability to embed a knowledge mapping assessment in simulation-based training developed by BTL...

  7. Development and Integration of an Advanced Stirling Convertor Linear Alternator Model for a Tool Simulating Convertor Performance and Creating Phasor Diagrams

    Science.gov (United States)

    Metscher, Jonathan F.; Lewandowski, Edward J.

    2013-01-01

    A simple model of the Advanced Stirling Convertor's (ASC) linear alternator and an AC bus controller has been developed and combined with a previously developed thermodynamic model of the convertor for a more complete simulation and analysis of the system performance. The model was developed using Sage, a 1-D thermodynamic modeling program that now includes electro-magnetic components. The convertor, consisting of a free-piston Stirling engine combined with a linear alternator, has sufficiently sinusoidal steady-state behavior to allow for phasor analysis of the forces and voltages acting in the system. A MATLAB graphical user interface (GUI) has been developed to interface with the Sage software for simplified use of the ASC model, calculation of forces, and automated creation of phasor diagrams. The GUI allows the user to vary convertor parameters while fixing different input or output parameters and observe the effect on the phasor diagrams or system performance. The new ASC model and GUI help create a better understanding of the relationship between the electrical component voltages and mechanical forces. This allows better insight into the overall convertor dynamics and performance.
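
    The phasor bookkeeping itself is straightforward once steady-state operation is assumed sinusoidal: displacement, current and voltage are represented as complex amplitudes and the winding impedance is applied. A minimal generic sketch, with illustrative parameter values and simplified alternator relations that are assumptions rather than the Sage ASC model:

    import numpy as np

    # Illustrative parameters only; not the Sage ASC model.
    f = 80.0                                       # operating frequency [Hz]
    omega = 2.0 * np.pi * f
    R, L, alpha = 2.0, 0.02, 40.0                  # coil resistance [ohm], inductance [H], motor constant [N/A]
    x = 1.2e-3                                     # piston displacement phasor [m], zero-phase reference
    v_piston = 1j * omega * x                      # velocity phasor leads displacement by 90 degrees
    emf = alpha * v_piston                         # back-EMF phasor [V]
    current = 0.8 * np.exp(-1j * np.deg2rad(15))   # assumed alternator current phasor [A]
    v_term = emf - (R + 1j * omega * L) * current  # terminal voltage phasor [V]
    force = alpha * current                        # alternator force phasor on the piston [N]
    for name, ph in [("EMF", emf), ("terminal voltage", v_term), ("alternator force", force)]:
        print(f"{name}: magnitude {abs(ph):.2f}, phase {np.degrees(np.angle(ph)):.1f} deg")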

  8. Predicting the performance of a power amplifier using large-signal circuit simulations of an AlGaN/GaN HFET model

    Science.gov (United States)

    Bilbro, Griff L.; Hou, Danqiong; Yin, Hong; Trew, Robert J.

    2009-02-01

    We have quantitatively modeled the conduction current and charge storage of an HFET in terms of its physical dimensions and material properties. For DC or small-signal RF operation, no adjustable parameters are necessary to predict the terminal characteristics of the device. Linear performance measures such as small-signal gain and input admittance can be predicted directly from the geometric structure and material properties assumed for the device design. We have validated our model at low frequency against experimental I-V measurements and against two-dimensional device simulations. We discuss our recent extension of our model to include a larger class of electron velocity-field curves. We also discuss the recent reformulation of our model to facilitate its implementation in commercial large-signal high-frequency circuit simulators. Large signal RF operation is more complex. First, the highest CW microwave power is fundamentally bounded by a brief, reversible channel breakdown in each RF cycle. Second, the highest experimental measurements of efficiency, power, or linearity always require harmonic load pull and possibly also harmonic source pull. Presently, our model accounts for these facts with an adjustable breakdown voltage and with adjustable load impedances and source impedances for the fundamental frequency and its harmonics. This has allowed us to validate our model for large signal RF conditions by simultaneously fitting experimental measurements of output power, gain, and power added efficiency of real devices. We show that the resulting model can be used to compare alternative device designs in terms of their large signal performance, such as their output power at 1 dB gain compression or their third order intercept points. In addition, the model provides insight into new device physics features enabled by the unprecedented current and voltage levels of AlGaN/GaN HFETs, including non-ohmic resistance in the source access regions and partial depletion of

  9. Simulation and performance of brushless DC motor actuators

    OpenAIRE

    Gerba, Alex

    1985-01-01

    The simulation model for a Brushless D.C. Motor and the associated commutation power conditioner transistor model are presented. The necessary conditions for maximum power output while operating at steady-state speed and sinusoidally distributed air-gap flux are developed. Comparisons of the simulated model with the measured performance of a typical motor are made both on time-response waveforms and on average performance characteristics. These preliminary results indicate good ...

  10. MODELLING, SIMULATING AND OPTIMIZING BOILERS

    DEFF Research Database (Denmark)

    Sørensen, Kim; Condra, Thomas Joseph; Houbak, Niels

    2004-01-01

    In the present work a framework for optimizing the design of boilers for dynamic operation has been developed. A cost function to be minimized during the optimization has been formulated, and for the present, design variables related to the Boiler Volume and the Boiler Load Gradient (i.e. firing rate ... on the boiler) have been defined. Furthermore, a number of constraints related to minimum and maximum boiler load gradient, minimum boiler size, Shrinking and Swelling and Steam Space Load have been defined. For defining the constraints related to the required boiler volume, a dynamic model for simulating the boiler ... performance has been developed. Outputs from the simulations are shrinking and swelling of the water level in the drum during, for example, a start-up of the boiler; these figures combined with the requirements with respect to allowable water level fluctuations in the drum define the requirements with respect to drum...
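
    Once the dynamic boiler model supplies the constraint values, the design optimization can be set up with a general-purpose constrained optimizer. A minimal sketch with a made-up cost function and constraint limits (all numbers are placeholders, not values from the framework):

    from scipy.optimize import minimize

    # Illustrative stand-in: in the real framework the constraint values (shrinking and
    # swelling, steam space load, required volume) are supplied by the dynamic boiler model.
    required_gradient = 4.0                                   # required load gradient, made up

    def cost(x):
        volume, gradient = x
        return 10.0 * volume + 2.0 * (8.0 - gradient)         # capital cost plus slow-ramping penalty

    constraints = [
        {"type": "ineq", "fun": lambda x: x[1] - required_gradient},  # meet the required load gradient
        {"type": "ineq", "fun": lambda x: 0.8 * x[0] - x[1]},         # toy "volume supports gradient" limit
        {"type": "ineq", "fun": lambda x: x[0] - 3.0},                # minimum boiler volume
    ]
    result = minimize(cost, x0=[10.0, 2.0], constraints=constraints)
    print(result.x)                                           # optimal (volume, load gradient) for the toy problem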

  11. Operations planning simulation: Model study

    Science.gov (United States)

    1974-01-01

    The use of simulation modeling for the identification of system sensitivities to internal and external forces and variables is discussed. The technique provides a means of exploring alternate system procedures and processes, so that these alternatives may be considered on a mutually comparative basis permitting the selection of a mode or modes of operation which have potential advantages to the system user and the operator. These advantages are measured in terms of system efficiency: (1) the ability to meet specific schedules for operations, mission or mission readiness requirements or performance standards, and (2) the ability to accomplish the objectives within cost-effective limits.

  12. Notes on modeling and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Redondo, Antonio [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-03-10

    These notes present a high-level overview of how modeling and simulation are carried out by practitioners. The discussion is of a general nature; no specific techniques are examined but the activities associated with all modeling and simulation approaches are briefly addressed. There is also a discussion of validation and verification and, at the end, a section on why modeling and simulation are useful.

  13. A Simulation Approach for Performance Validation during Embedded Systems Design

    Science.gov (United States)

    Wang, Zhonglei; Haberl, Wolfgang; Herkersdorf, Andreas; Wechs, Martin

    Due to the time-to-market pressure, it is highly desirable to design hardware and software of embedded systems in parallel. However, hardware and software are developed mostly using very different methods, so that performance evaluation and validation of the whole system is not an easy task. In this paper, we propose a simulation approach to bridge the gap between model-driven software development and simulation based hardware design, by merging hardware and software models into a SystemC based simulation environment. An automated procedure has been established to generate software simulation models from formal models, while the hardware design is originally modeled in SystemC. As the simulation models are annotated with timing information, performance issues are tackled in the same pass as system functionality, rather than in a dedicated approach.

  14. Verifying and Validating Simulation Models

    Energy Technology Data Exchange (ETDEWEB)

    Hemez, Francois M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-02-23

    This presentation is a high-level discussion of the Verification and Validation (V&V) of computational models. Definitions of V&V are given to emphasize that “validation” is never performed in a vacuum; it accounts, instead, for the current state-of-knowledge in the discipline considered. In particular, comparisons between physical measurements and numerical predictions should account for their respective sources of uncertainty. The differences between error (bias), aleatoric uncertainty (randomness) and epistemic uncertainty (ignorance, lack of knowledge) are briefly discussed. Four types of uncertainty in physics and engineering are discussed: 1) experimental variability, 2) variability and randomness, 3) numerical uncertainty and 4) model-form uncertainty. Statistical sampling methods are available to propagate, and analyze, variability and randomness. Numerical uncertainty originates from the truncation error introduced by the discretization of partial differential equations in time and space. Model-form uncertainty is introduced by assumptions often formulated to render a complex problem more tractable and amenable to modeling and simulation. The discussion concludes with high-level guidance to assess the “credibility” of numerical simulations, which stems from the level of rigor with which these various sources of uncertainty are assessed and quantified.

  15. Modeling and simulation of blood collection systems.

    Science.gov (United States)

    Alfonso, Edgar; Xie, Xiaolan; Augusto, Vincent; Garraud, Olivier

    2012-03-01

    This paper addresses the modeling and simulation of blood collection systems in France for both fixed-site and mobile blood collection with walk-in whole blood donors and scheduled plasma and platelet donors. Petri net models are first proposed to precisely describe different blood collection processes, donor behaviors, their material/human resource requirements and relevant regulations. Petri net models are then enriched with quantitative modeling of donor arrivals, donor behaviors, activity times and resource capacity. Relevant performance indicators are defined. The resulting simulation models can be straightforwardly implemented with any simulation language. Numerical experiments are performed to show how the simulation models can be used to select, for different walk-in donor arrival patterns, appropriate human resource planning and donor appointment strategies.
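
    The walk-in donor flow described above maps naturally onto a discrete-event simulation. A minimal sketch using the open-source SimPy library; resource capacities, service times and the arrival rate are illustrative placeholders, not figures from the study:

    import random
    import simpy

    waits = []  # registration waiting times, in minutes

    def donor(env, reception, collection):
        # walk-in whole-blood donor: registration, then collection
        arrive = env.now
        with reception.request() as req:
            yield req
            waits.append(env.now - arrive)
            yield env.timeout(random.expovariate(1 / 5.0))    # ~5 min registration
        with collection.request() as req:
            yield req
            yield env.timeout(random.expovariate(1 / 15.0))   # ~15 min collection

    def arrivals(env, reception, collection, mean_interarrival=6.0):
        while True:
            yield env.timeout(random.expovariate(1 / mean_interarrival))
            env.process(donor(env, reception, collection))

    random.seed(1)
    env = simpy.Environment()
    reception = simpy.Resource(env, capacity=1)     # one registration desk
    collection = simpy.Resource(env, capacity=3)    # three collection chairs
    env.process(arrivals(env, reception, collection))
    env.run(until=8 * 60)                           # one 8-hour collection session
    print(len(waits), "donors registered; mean wait", round(sum(waits) / len(waits), 1), "min")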

  16. Used Nuclear Fuel Loading and Structural Performance Under Normal Conditions of Transport - Modeling, Simulation and Experimental Integration RD&D Plan

    Energy Technology Data Exchange (ETDEWEB)

    Adkins, Harold E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2013-04-01

    Under current U.S. Nuclear Regulatory Commission regulation, it is not sufficient for used nuclear fuel (UNF) to simply maintain its integrity during the storage period; it must maintain its integrity in such a way that it can withstand the physical forces of handling and transportation associated with restaging the fuel and moving it to treatment or recycling facilities, or a geologic repository. Hence it is necessary to understand the performance characteristics of aged UNF cladding and ancillary components under loadings stemming from transport initiatives. Researchers would like to demonstrate that enough information, including experimental support and modeling and simulation capabilities, exists to establish a preliminary determination of UNF structural performance under normal conditions of transport (NCT). This research, development and demonstration (RD&D) plan describes a methodology, including development and use of analytical models, to evaluate loading and associated mechanical responses of UNF rods and key structural components. This methodology will be used to provide a preliminary assessment of the performance characteristics of UNF cladding and ancillary components under rail-related NCT loading. The methodology couples modeling and simulation and experimental efforts currently under way within the Used Fuel Disposition Campaign (UFDC). The methodology will involve limited uncertainty quantification in the form of sensitivity evaluations focused around available fuel and ancillary fuel structure properties exclusively. The work includes collecting information via literature review, soliciting input/guidance from subject matter experts, performing computational analyses, planning experimental measurement and possible execution (depending on timing), and preparing a variety of supporting documents that will feed into and provide the basis for future initiatives. The methodology demonstration will focus on structural performance evaluation of

  17. Using discrete event simulation to compare the performance of family health unit and primary health care centre organizational models in Portugal.

    Science.gov (United States)

    Fialho, André S; Oliveira, Mónica D; Sá, Armando B

    2011-10-15

    Recent reforms in Portugal aimed at strengthening the role of the primary care system, in order to improve the quality of the health care system. Since 2006 new policies aiming to change the organization, incentive structures and funding of the primary health care sector were designed, promoting the evolution of traditional primary health care centres (PHCCs) into a new type of organizational unit--family health units (FHUs). This study aimed to compare performances of PHCC and FHU organizational models and to assess the potential gains from converting PHCCs into FHUs. Stochastic discrete event simulation models for the two types of organizational models were designed and implemented using Simul8 software. These models were applied to data from nineteen primary care units in three municipalities of the Greater Lisbon area. The conversion of PHCCs into FHUs seems to have the potential to generate substantial improvements in productivity and accessibility, while not having a significant impact on costs. This conversion might entail a 45% reduction in the average number of days required to obtain a medical appointment and a 7% and 9% increase in the average number of medical and nursing consultations, respectively. Reorganization of PHCC into FHUs might increase accessibility of patients to services and efficiency in the provision of primary care services.

  18. Multiprocessor performance modeling with ADAS

    Science.gov (United States)

    Hayes, Paul J.; Andrews, Asa M.

    1989-01-01

    A graph managing strategy referred to as the Algorithm to Architecture Mapping Model (ATAMM) appears useful for the time-optimized execution of application algorithm graphs in embedded multiprocessors and for the performance prediction of graph designs. This paper reports the modeling of ATAMM in the Architecture Design and Assessment System (ADAS) to make an independent verification of ATAMM's performance prediction capability and to provide a user framework for the evaluation of arbitrary algorithm graphs. Following an overview of ATAMM and its major functional rules are descriptions of the ADAS model of ATAMM, methods to enter an arbitrary graph into the model, and techniques to analyze the simulation results. The performance of a 7-node graph example is evaluated using the ADAS model and verifies the ATAMM concept by substantiating previously published performance results.

  19. Protein Simulation Data in the Relational Model.

    Science.gov (United States)

    Simms, Andrew M; Daggett, Valerie

    2012-10-01

    High-performance computing is leading to unprecedented volumes of data. Relational databases offer a robust and scalable model for storing and analyzing scientific data. However, these features do not come without a cost: significant design effort is required to build a functional and efficient repository. Modeling protein simulation data in a relational database presents several challenges: the data captured from individual simulations are large, multi-dimensional, and must integrate with both simulation software and external data sites. Here we present the dimensional design and relational implementation of a comprehensive data warehouse for storing and analyzing molecular dynamics simulations using SQL Server.

  20. Simulation Model of a Transient

    DEFF Research Database (Denmark)

    Jauch, Clemens; Sørensen, Poul; Bak-Jensen, Birgitte

    2005-01-01

    This paper describes the simulation model of a controller that enables an active-stall wind turbine to ride through transient faults. The simulated wind turbine is connected to a simple model of a power system. Certain fault scenarios are specified and the turbine shall be able to sustain operati...

  1. HVDC System Characteristics and Simulation Models

    Energy Technology Data Exchange (ETDEWEB)

    Moon, S.I.; Han, B.M.; Jang, G.S. [Electric Enginnering and Science Research Institute, Seoul (Korea)

    2001-07-01

    This report deals with the AC-DC power system simulation method by PSS/E and EUROSTAG for the development of a strategy for the reliable operation of the Cheju-Haenam interconnected system. The simulation using both programs is performed to analyze HVDC simulation models. In addition, the control characteristics of the Cheju-Haenam HVDC system as well as Cheju AC system characteristics are described in this work. (author). 104 figs., 8 tabs.

  2. Cognitive models embedded in system simulation models

    International Nuclear Information System (INIS)

    Siegel, A.I.; Wolf, J.J.

    1982-01-01

    If we are to discuss and consider cognitive models, we must first come to grips with two questions: (1) What is cognition? (2) What is a model? Presumably, the answers to these questions can provide a basis for defining a cognitive model. Accordingly, this paper first places these two questions into perspective. Then, cognitive models are set within the context of computer simulation models and a number of computer simulations of cognitive processes are described. Finally, pervasive issues are discussed vis-a-vis cognitive modeling in the computer simulation context.

  3. A Systematic Analysis of the Sensitivity of Plasma Pharmacokinetics to Detect Differences in the Pulmonary Performance of Inhaled Fluticasone Propionate Products Using a Model-Based Simulation Approach.

    Science.gov (United States)

    Weber, Benjamin; Hochhaus, Guenther

    2015-07-01

    The role of plasma pharmacokinetics (PK) for assessing bioequivalence at the target site, the lung, for orally inhaled drugs remains unclear. A validated semi-mechanistic model, considering the presence of mucociliary clearance in central lung regions, was expanded for quantifying the sensitivity of PK studies in detecting differences in the pulmonary performance (total lung deposition, central-to-peripheral lung deposition ratio, and pulmonary dissolution characteristics) between test (T) and reference (R) inhaled fluticasone propionate (FP) products. PK bioequivalence trials for inhaled FP were simulated based on this PK model for a varying number of subjects and T products. The statistical power to conclude bioequivalence when T and R products are identical was demonstrated to be 90% for approximately 50 subjects. Furthermore, the simulations demonstrated that PK metrics (area under the concentration-time curve (AUC) and Cmax) are capable of detecting differences between T and R formulations of inhaled FP products when the products differ by more than 20%, 30%, and 25% for total lung deposition, central-to-peripheral lung deposition ratio, and pulmonary dissolution characteristics, respectively. These results were derived using a rather conservative risk assessment approach with an error rate of <10%. The simulations thus indicated that PK studies might be a viable alternative to clinical studies comparing pulmonary efficacy biomarkers for slowly dissolving inhaled drugs. PK trials for pulmonary efficacy equivalence testing should be complemented by in vitro studies to avoid false positive bioequivalence assessments that are theoretically possible for some specific scenarios. Moreover, a user-friendly web application for simulating such PK equivalence trials with inhaled FP is provided.
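
    The reported trial power can be explored with a much simpler Monte Carlo surrogate: simulate within-subject log(T/R) ratios, form the 90% confidence interval and count how often it falls inside the 0.80-1.25 acceptance limits. A minimal sketch, not the paper's semi-mechanistic PK model; the within-subject CV, true ratio and design simplifications are illustrative assumptions:

    import numpy as np
    from scipy import stats

    def be_power(n, true_ratio=1.0, cv_within=0.30, n_trials=2000, seed=0):
        # Fraction of simulated trials whose 90% CI for the geometric mean ratio
        # falls entirely within the 0.80-1.25 average-bioequivalence limits.
        rng = np.random.default_rng(seed)
        sigma_w = np.sqrt(np.log(1.0 + cv_within**2))      # within-subject SD on the log scale
        passes = 0
        for _ in range(n_trials):
            # per-subject log(T/R) differences; period and sequence effects ignored
            d = rng.normal(np.log(true_ratio), sigma_w * np.sqrt(2.0), size=n)
            se = d.std(ddof=1) / np.sqrt(n)
            t_crit = stats.t.ppf(0.95, df=n - 1)
            lo, hi = d.mean() - t_crit * se, d.mean() + t_crit * se
            if np.log(0.80) <= lo and hi <= np.log(1.25):
                passes += 1
        return passes / n_trials

    for n in (12, 24, 50):
        print(n, "subjects:", be_power(n))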

  4. Key performance indicators for successful simulation projects

    OpenAIRE

    Jahangirian, M; Taylor, SJE; Young, T; Robinson, S

    2016-01-01

    There are many factors that may contribute to the successful delivery of a simulation project. To provide a structured approach to assessing the impact various factors have on project success, we propose a top-down framework whereby 15 Key Performance Indicators (KPI) are developed that represent the level of successfulness of simulation projects from various perspectives. They are linked to a set of Critical Success Factors (CSF) as reported in the simulation literature. A single measure cal...

  5. General introduction to simulation models

    DEFF Research Database (Denmark)

    Hisham Beshara Halasa, Tariq; Boklund, Anette

    2012-01-01

    trials. However, if simulation models are to be used, good-quality input data must be available. To model FMD, several disease spread models are available. For this project, we chose three simulation models: Davis Animal Disease Spread (DADS), which has been upgraded to DTU-DADS, InterSpread Plus (ISP) ...... Monte Carlo simulation can be defined as a representation of real-life systems to gain insight into their functions and to investigate the effects of alternative conditions or actions on the modeled system. Models are a simplification of a system. Most often, it is best to use experiments and field ...... trials to investigate the effect of alternative conditions or actions on a specific system. Nonetheless, field trials are expensive and sometimes not possible to conduct, as in the case of foot-and-mouth disease (FMD). Instead, simulation models can be a good and cheap substitute for experiments and field...

  6. SLC positron source: Simulation and performance

    International Nuclear Information System (INIS)

    Pitthan, R.; Braun, H.; Clendenin, J.E.; Ecklund, S.D.; Helm, R.H.; Kulikov, A.V.; Odian, A.C.; Pei, G.X.; Ross, M.C.; Woodley, M.D.

    1991-06-01

    Performance of the source was found to be in good general agreement with computer simulations with S-band acceleration, and where not, the simulations led to the identification of problems, in particular the underestimated impact of linac misalignments due to the 1989 Loma Prieta Earthquake. 13 refs., 7 figs.

  7. Team Culture and Business Strategy Simulation Performance

    Science.gov (United States)

    Ritchie, William J.; Fornaciari, Charles J.; Drew, Stephen A. W.; Marlin, Dan

    2013-01-01

    Many capstone strategic management courses use computer-based simulations as core pedagogical tools. Simulations are touted as assisting students in developing much-valued skills in strategy formation, implementation, and team management in the pursuit of superior strategic performance. However, despite their rich nature, little is known regarding…

  8. Building performance simulation for sustainable buildings

    NARCIS (Netherlands)

    Hensen, J.L.M.

    2010-01-01

    This paper aims to provide a general view of the background and current state of building performance simulation, which has the potential to deliver, directly or indirectly, substantial benefits to building stakeholders and to the environment. However, the building simulation community faces many

  9. Simulation - modeling - experiment; Simulation - modelisation - experience

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-07-01

    After two workshops held in 2001 on the same topics, and in order to make a status of the advances in the domain of simulation and measurements, the main goals proposed for this workshop are: the presentation of the state-of-the-art of tools, methods and experiments in the domains of interest of the Gedepeon research group, the exchange of information about the possibilities of use of computer codes and facilities, about the understanding of physical and chemical phenomena, and about development and experiment needs. This document gathers 18 presentations (slides) among the 19 given at this workshop and dealing with: the deterministic and stochastic codes in reactor physics (Rimpault G.); MURE: an evolution code coupled with MCNP (Meplan O.); neutronic calculation of future reactors at EdF (Lecarpentier D.); advance status of the MCNP/TRIO-U neutronic/thermal-hydraulics coupling (Nuttin A.); the FLICA4/TRIPOLI4 thermal-hydraulics/neutronics coupling (Aniel S.); methods of disturbances and sensitivity analysis of nuclear data in reactor physics, application to VENUS-2 experimental reactor (Bidaud A.); modeling for the reliability improvement of an ADS accelerator (Biarotte J.L.); residual gas compensation of the space charge of intense beams (Ben Ismail A.); experimental determination and numerical modeling of phase equilibrium diagrams of interest in nuclear applications (Gachon J.C.); modeling of irradiation effects (Barbu A.); elastic limit and irradiation damage in Fe-Cr alloys: simulation and experiment (Pontikis V.); experimental measurements of spallation residues, comparison with Monte-Carlo simulation codes (Fallot M.); the spallation target-reactor coupling (Rimpault G.); tools and data (Grouiller J.P.); models in high energy transport codes: status and perspective (Leray S.); other ways of investigation for spallation (Audoin L.); neutrons and light particles production at intermediate energies (20-200 MeV) with iron, lead and uranium targets (Le Colley F

  10. Simulation and performance of brushless dc motor actuators

    Science.gov (United States)

    Gerba, A., Jr.

    1985-12-01

    The simulation model for a Brushless D.C. Motor and the associated commutation power conditioner transistor model are presented. The necessary conditions for maximum power output while operating at steady-state speed and sinusoidally distributed air-gap flux are developed. Comparisons of the simulated model with the measured performance of a typical motor are made both on time-response waveforms and on average performance characteristics. These preliminary results indicate good agreement. Plans for model improvement and testing of a motor-driven positioning device for model evaluation are outlined.

  11. Real-Gas Effects in ORC Turbine Flow Simulations : Influence of Thermodynamic Models on Flow Fields and Performance Parameters

    NARCIS (Netherlands)

    Colonna, P.; Rebay, S.; Harinck, J.; Guardone, A.

    2006-01-01

    The analysis and design of turbomachinery are usually performed by means of fluid dynamic computations employing ideal gas laws. This can lead to inaccurate predictions for Organic Rankine Cycle (ORC) turbines, which operate partly in the nonideal thermodynamic region. The objective of this work is to

  12. Terrestrial ecosystem model performance in simulating productivity and its vulnerability to climate change in the northern permafrost region

    DEFF Research Database (Denmark)

    Xia, Jianyang; McGuire, A. David; Lawrence, David

    2017-01-01

    productivity (NPP) and responses to historical climate change in permafrost regions in the Northern Hemisphere. In comparison with the satellite estimate from the Moderate Resolution Imaging Spectroradiometer (MODIS; 246 ± 6 g C m−2 yr−1), most models produced higher NPP (309 ± 12 g C m−2 yr−1) over...... and the maximum rate of carboxylation by the enzyme Rubisco at 25°C (Vcmax_25), respectively. The models also varied in their sensitivities of NPP, GPP, and CUE to historical changes in climate and atmospheric CO2 concentration. These results indicate that model predictive ability of the C cycle in permafrost...... regions can be improved by better representation of the processes controlling CUE and GPPmax as well as their sensitivity to climate change....

  13. Software-Engineering Process Simulation (SEPS) model

    Science.gov (United States)

    Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.

    1992-01-01

    The Software Engineering Process Simulation (SEPS) model, developed at JPL, is described. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among various software life cycle development activities and management decision making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and perform post-mortem assessments.

  14. Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) verification and validation plan. version 1.

    Energy Technology Data Exchange (ETDEWEB)

    Bartlett, Roscoe Ainsworth; Arguello, Jose Guadalupe, Jr.; Urbina, Angel; Bouchard, Julie F.; Edwards, Harold Carter; Freeze, Geoffrey A.; Knupp, Patrick Michael; Wang, Yifeng; Schultz, Peter Andrew; Howard, Robert (Oak Ridge National Laboratory, Oak Ridge, TN); McCornack, Marjorie Turner

    2011-01-01

    The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M&S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. To meet this objective, NEAMS Waste IPSC M&S capabilities will be applied to challenging spatial domains, temporal domains, multiphysics couplings, and multiscale couplings. A strategic verification and validation (V&V) goal is to establish evidence-based metrics for the level of confidence in M&S codes and capabilities. Because it is economically impractical to apply the maximum V&V rigor to each and every M&S capability, M&S capabilities will be ranked for their impact on the performance assessments of various components of the repository systems. Those M&S capabilities with greater impact will require a greater level of confidence and a correspondingly greater investment in V&V. This report includes five major components: (1) a background summary of the NEAMS Waste IPSC to emphasize M&S challenges; (2) the conceptual foundation for verification, validation, and confidence assessment of NEAMS Waste IPSC M&S capabilities; (3) specifications for the planned verification, validation, and confidence-assessment practices; (4) specifications for the planned evidence information management system; and (5) a path forward for the incremental implementation of this V&V plan.

  15. Improving the performance of a filling line based on simulation

    Science.gov (United States)

    Jasiulewicz-Kaczmarek, M.; Bartkowiak, T.

    2016-08-01

    The paper describes a method of improving the performance of a filling line based on simulation. This study concerns a production line that is located in a manufacturing centre of an FMCG company. A discrete event simulation model was built using data provided by a maintenance data acquisition system. Two types of failures were identified in the system and were approximated using continuous statistical distributions. The model was validated taking into consideration line performance measures. A brief Pareto analysis of line failures was conducted to identify potential areas of improvement. Two improvement scenarios were proposed and tested via simulation. The outcomes of the simulations were the basis of a financial analysis. NPV and ROI values were calculated taking into account depreciation, profits, losses, the current CIT rate and inflation. A validated simulation model can be a useful tool in the maintenance decision-making process.
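
    The closing financial step, computing NPV and ROI for an improvement scenario from its cash flows, is easy to reproduce. A minimal sketch with purely hypothetical figures; the investment, yearly gain, horizon and discount rate are placeholders, not values from the study:

    def npv(rate, cash_flows):
        # Net present value of yearly cash flows; index 0 is the initial outlay.
        return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

    investment = -120_000                   # hypothetical scenario cost at year 0
    yearly_gain = 35_000                    # hypothetical yearly benefit from higher line performance
    flows = [investment] + [yearly_gain] * 5
    print("NPV @ 8%:", round(npv(0.08, flows), 2))
    roi = (sum(flows[1:]) + investment) / -investment   # simple undiscounted ROI
    print("ROI:", round(roi, 2))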

  16. ECONOMIC MODELING STOCKS CONTROL SYSTEM: SIMULATION MODEL

    OpenAIRE

    Климак, М.С.; Войтко, С.В.

    2016-01-01

    Theoretical and applied aspects of the development of simulation models to predict the optimal development of production systems that create tangible products and services are considered. It is shown that the process of inventory control requires economic and mathematical modeling in view of the complexity of theoretical studies. A simulation model of stock control is proposed that allows management decisions to be made in production logistics

  17. Progress in modeling and simulation.

    Science.gov (United States)

    Kindler, E

    1998-01-01

    For the modeling of systems, computers are increasingly used, while the other "media" (including the human intellect) that carry models are being abandoned. For the modeling of knowledge, i.e. of more or less general concepts (possibly used to model systems composed of instances of such concepts), object-oriented programming is nowadays widely used. For the modeling of processes existing and developing in time, computer simulation is used, the results of which are often presented by means of animation (graphical pictures moving and changing in time). Unfortunately, object-oriented programming tools are commonly not designed to be of great use for simulation, while programming tools for simulation do not enable their users to apply the advantages of object-oriented programming. Nevertheless, there are exceptions that enable general concepts represented on a computer to be used for constructing simulation models and for modifying them easily. They are described in the present paper, together with precise definitions of modeling, simulation and object-oriented programming (including cases that do not satisfy the definitions and risk introducing misunderstanding), and an outline of their applications and further development. Since computing systems are being introduced as control components into a large spectrum of (technological, social and biological) systems, attention is directed to models of systems containing modeling components.

  18. Stochastic modeling analysis and simulation

    CERN Document Server

    Nelson, Barry L

    1995-01-01

    A coherent introduction to the techniques for modeling dynamic stochastic systems, this volume also offers a guide to the mathematical, numerical, and simulation tools of systems analysis. Suitable for advanced undergraduates and graduate-level industrial engineers and management science majors, it proposes modeling systems in terms of their simulation, regardless of whether simulation is employed for analysis. Beginning with a view of the conditions that permit a mathematical-numerical analysis, the text explores Poisson and renewal processes, Markov chains in discrete and continuous time, se

  19. FASTBUS simulation models in VHDL

    International Nuclear Information System (INIS)

    Appelquist, G.

    1992-11-01

    Four hardware simulation models implementing the FASTBUS protocol are described. The models are written in the VHDL hardware description language to obtain portability, i.e. without dependence on any specific simulator. They include two complete FASTBUS devices, a full-duplex segment interconnect and ancillary logic for the segment. In addition, master and slave models using a high-level interface to describe FASTBUS operations are presented. With these models different configurations of FASTBUS systems can be evaluated and the FASTBUS transactions of new devices can be verified. (au)

  20. Model reduction for circuit simulation

    CERN Document Server

    Hinze, Michael; Maten, E Jan W Ter

    2011-01-01

    Simulation based on mathematical models plays a major role in computer aided design of integrated circuits (ICs). Decreasing structure sizes, increasing packing densities and driving frequencies require the use of refined mathematical models that take into account secondary, parasitic effects. This leads to very high dimensional problems which nowadays require simulation times too large for the short time-to-market demands in industry. Modern Model Order Reduction (MOR) techniques present a way out of this dilemma in providing surrogate models which keep the main characteristics of the devi

  1. Well performance model

    International Nuclear Information System (INIS)

    Thomas, L.K.; Evans, C.E.; Pierson, R.G.; Scott, S.L.

    1992-01-01

    This paper describes the development and application of a comprehensive oil or gas well performance model. The model contains six distinct sections: stimulation design, tubing and/or casing flow, reservoir and near-wellbore calculations, production forecasting, wellbore heat transmission, and economics. These calculations may be performed separately or in an integrated fashion with data and results shared among the different sections. The model analysis allows evaluation of all aspects of well completion design, including the effects on future production and overall well economics

  2. Equivalent drawbead performance in deep drawing simulations

    NARCIS (Netherlands)

    Meinders, Vincent T.; Geijselaers, Hubertus J.M.; Huetink, Han

    1999-01-01

    Drawbeads are applied in the deep drawing process to improve the control of the material flow during the forming operation. In simulations of the deep drawing process these drawbeads can be replaced by an equivalent drawbead model. In this paper the usage of an equivalent drawbead model in the

  3. Matlab-Based Modeling and Simulations to Study the Performance of Different MPPT Techniques Used for Photovoltaic Systems under Partially Shaded Conditions

    Directory of Open Access Journals (Sweden)

    Jehun Hahm

    2015-01-01

    Full Text Available A pulse-width-modulator (PWM) based sliding mode controller is developed to study the effects of partial shade, temperature, and insolation on the performance of maximum power point tracking (MPPT) used in photovoltaic (PV) systems. Under partially shaded conditions and varying temperature, PV array characteristics become more complex, with multiple power-voltage maxima. MPPT is an automatic control technique to adjust power interfaces and deliver power for a diverse range of insolation values, temperatures, and partially shaded modules. The PV system is tested using two conventional algorithms: the Perturb and Observe (P&O) algorithm and the Incremental Conductance (IncCond) algorithm, which are simple to implement for a PV array. The proposed method applies a model to simulate the performance of the PV system for solar energy usage, which is compared to the conventional methods under nonuniform insolation, improving PV system utilization efficiency and allowing optimization of the system performance. The PWM-based sliding mode controller successfully overcomes the issues presented by nonuniform conditions and tracks the global MPP. In this paper, the PV system consists of a solar module under shade connected to a boost converter that is controlled by three different algorithms and is modeled using Matlab/Simulink.
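
    Of the two conventional trackers mentioned, Perturb and Observe is the simpler: perturb the operating voltage, observe whether power rose, and keep or reverse the perturbation direction accordingly. A minimal sketch against a made-up single-peak P-V curve; the step size, curve and starting point are illustrative, and a real implementation would act on the converter duty cycle:

    def perturb_and_observe(v_meas, i_meas, state, dv=0.5):
        # One P&O step: return the next reference voltage and the updated state.
        # "state" is a dict carrying the previous voltage and power samples.
        p = v_meas * i_meas
        if state:
            keep_direction = (p - state["p"] > 0) == (v_meas - state["v"] > 0)
            step = dv if keep_direction else -dv   # keep climbing if power rose, otherwise reverse
        else:
            step = dv                              # first call: arbitrary initial perturbation
        state.update(v=v_meas, p=p)
        return v_meas + step, state

    pv_current = lambda v: max(0.0, 8.0 * (1.0 - (v / 36.0) ** 9))   # made-up single-peak module curve
    v_ref, state = 20.0, {}
    for _ in range(60):
        v_ref, state = perturb_and_observe(v_ref, pv_current(v_ref), state)
    print(round(state["v"], 1), "V,", round(state["p"], 1), "W near the maximum power point")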

  4. Greenhouse simulation models.

    NARCIS (Netherlands)

    Bot, G.P.A.

    1989-01-01

    A model is a representation of a real system to describe some properties, i.e. internal factors of that system (outputs), as a function of some external factors (inputs). It is impossible to describe the relation between all internal factors (if even all internal factors could be defined) and all

  5. Landscape Modelling and Simulation Using Spatial Data

    Directory of Open Access Journals (Sweden)

    Amjed Naser Mohsin AL-Hameedawi

    2017-08-01

    Full Text Available In this paper a procedure is presented for generating a spatial model of a landscape suited to realistic simulation. The procedure is based on combining spatial data and field measurements with computer graphics reproduced using Blender software. It is then possible to form a 3D simulation based on VIS ALL packages. The objective was to build a model utilising GIS, including inputs to the feature attribute data. These efforts concentrated on assembling an adequate spatial prototype, defining a facilitation scheme and outlining the intended framework; the eventual result was then used in simulation form. The procedure covers not only data gathering, fieldwork and model provision, but also supplies a new method for producing the corresponding 3D simulation mapping, which allows decision makers as well as investors to adopt an independent navigation system for Geoscience applications.

  6. Sensitivity Analysis of Simulation Models

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2009-01-01

    This contribution presents an overview of sensitivity analysis of simulation models, including the estimation of gradients. It covers classic designs and their corresponding (meta)models; namely, resolution-III designs including fractional-factorial two-level designs for first-order polynomial

  7. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    Computer Based Modelling and Simulation - Modelling Deterministic Systems. N K Srinivasan. General Article, Resonance – Journal of Science Education, Volume 6, Issue 3, March 2001, pp. 46-54.

  8. Performance Optimization of the ATLAS Detector Simulation

    CERN Document Server

    AUTHOR|(CDS)2091018

    In the thesis at hand the current performance of the ATLAS detector simulation, part of the Athena framework, is analyzed and possible optimizations are examined. For this purpose the event based sampling profiler VTune Amplifier by Intel is utilized. As the most important metric to measure improvements, the total execution time of the simulation of $t\\bar{t}$ events is also considered. All efforts are focused on structural changes, which do not influence the simulation output and can be attributed to CPU specific issues, especially front end stalls and vectorization. The most promising change is the activation of profile guided optimization for Geant4, which is a critical external dependency of the simulation. Profile guided optimization gives an average improvement of $8.9\\%$ and $10.0\\%$ for the two considered cases at the cost of one additional compilation (instrumented binaries) and execution (training to obtain profiling data) at build time.

  9. Ion thruster performance model

    International Nuclear Information System (INIS)

    Brophy, J.R.

    1984-01-01

    A model of ion thruster performance is developed for high flux density cusped magnetic field thruster designs. This model is formulated in terms of the average energy required to produce an ion in the discharge chamber plasma and the fraction of these ions that are extracted to form the beam. The direct loss of high energy (primary) electrons from the plasma to the anode is shown to have a major effect on thruster performance. The model provides simple algebraic equations enabling one to calculate the beam ion energy cost, the average discharge chamber plasma ion energy cost, the primary electron density, the primary-to-Maxwellian electron density ratio and the Maxwellian electron temperature. Experiments indicate that the model correctly predicts the variation in plasma ion energy cost for changes in propellant gas (Ar, Kr, and Xe), grid transparency to neutral atoms, beam extraction area, discharge voltage, and discharge chamber wall temperature
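
    The formulation described above implies a simple relation for the cost of a beam ion: the average energy spent per discharge chamber plasma ion divided by the fraction of those ions extracted into the beam. A minimal sketch with illustrative numbers (placeholders, not values from the paper):

    def beam_ion_energy_cost(plasma_ion_cost_ev, extracted_fraction):
        # Energy cost per beam ion [eV/ion]: plasma ion cost divided by the extracted fraction.
        return plasma_ion_cost_ev / extracted_fraction

    print(beam_ion_energy_cost(plasma_ion_cost_ev=150.0, extracted_fraction=0.5))   # 300.0 eV per beam ion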

  10. Base Station Performance Model

    OpenAIRE

    Walsh, Barbara; Farrell, Ronan

    2005-01-01

    At present the testing of power amplifiers within base station transmitters is limited to testing at component level as opposed to testing at the system level. While the detection of catastrophic failure is possible, that of performance degradation is not. This paper proposes a base station model with respect to transmitter output power with the aim of introducing system level monitoring of the power amplifier behaviour within the base station. Our model reflects the expe...

  11. Approaching Sentient Building Performance Simulation Systems

    DEFF Research Database (Denmark)

    Negendahl, Kristoffer; Perkov, Thomas; Heller, Alfred

    2014-01-01

    Sentient BPS systems can combine one or more high-precision BPS and provide near-instantaneous performance feedback directly in the design tool, thus providing speed and precision of building performance in the early design stages. Sentient BPS systems essentially combine: 1) design tools, 2) parametric tools, 3) BPS tools, 4) dynamic databases, 5) interpolation techniques and 6) prediction techniques as a fast and valid simulation system in the early design stage....

  12. Modeling and simulation of the shading effect on the performance of a photovoltaic module in the presence of the bypass diode.

    Directory of Open Access Journals (Sweden)

    Zebiri Mohamed

    2018-01-01

    Full Text Available In photovoltaic renewable energy production systems, where production is dependent on weather conditions, maintaining production at a suitable level is more than essential. The shading effect in photovoltaic panels affects the production of electrical energy by reducing it or even causing the destruction of some or all of the panels. To circumvent this problem, among the solutions proposed in the literature we find the use of bypass diodes and anti-return (blocking) diodes to minimize these consequences. In this paper we present a simulation under Matlab-Simulink of the shading effect and we compare the current-voltage (I-V) and power-voltage (P-V) characteristics of a photovoltaic system for different irradiations in the presence and absence of the bypass diode. For the modeling, we use the diode model and the Lambert W-function to solve the implicit equation of the output current. This method allows the performance of a panel to be analyzed at different shading levels.

  13. Modeling and simulation of the shading effect on the performance of a photovoltaic module in the presence of the bypass diode.

    Science.gov (United States)

    Zebiri, Mohamed; Mediouni, Mohamed; Idadoub, Hicham

    2018-05-01

    In photovoltaic renewable energy production systems, where production is dependent on weather conditions, maintaining production at a suitable level is more than essential. The shading effect in photovoltaic panels affects the production of electrical energy by reducing it or even causing the destruction of some or all of the panels. To circumvent this problem, among the solutions proposed in the literature we find the use of bypass diodes and anti-return (blocking) diodes to minimize these consequences. In this paper we present a simulation under Matlab-Simulink of the shading effect and we compare the current-voltage (I-V) and power-voltage (P-V) characteristics of a photovoltaic system for different irradiations in the presence and absence of the bypass diode. For the modeling, we use the diode model and the Lambert W-function to solve the implicit equation of the output current. This method allows the performance of a panel to be analyzed at different shading levels.
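
    The Lambert W route mentioned above turns the implicit single-diode equation into an explicit expression for the output current. A minimal sketch using the standard closed-form solution of the single-diode model; all parameter values are illustrative and not taken from the paper:

    import numpy as np
    from scipy.special import lambertw

    def module_current(v, i_ph=8.0, i_0=1e-7, n=1.3, rs=0.2, rsh=300.0, t=298.15, ns=36):
        # Explicit single-diode output current via the Lambert W function.
        k, q = 1.380649e-23, 1.602176634e-19
        a = n * ns * k * t / q                                    # modified ideality factor for ns cells in series
        arg = (rs * i_0 * rsh) / (a * (rs + rsh)) * np.exp(
            rsh * (rs * i_ph + rs * i_0 + v) / (a * (rs + rsh)))
        return (rsh * (i_ph + i_0) - v) / (rs + rsh) - (a / rs) * lambertw(arg).real

    v = np.linspace(0.0, 25.0, 500)
    i = np.array([module_current(vk) for vk in v])
    p = v * i
    print(f"open-circuit voltage ~ {v[i > 0][-1]:.1f} V, maximum power ~ {p.max():.0f} W")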

  14. Simulation Modeling of Software Development Processes

    Science.gov (United States)

    Calavaro, G. F.; Basili, V. R.; Iazeolla, G.

    1996-01-01

    A simulation modeling approach is proposed for the prediction of software process productivity indices, such as cost and time-to-market, and the sensitivity analysis of such indices to changes in the organization parameters and user requirements. The approach uses a timed Petri Net and Object Oriented top-down model specification. Results demonstrate the model representativeness, and its usefulness in verifying process conformance to expectations, and in performing continuous process improvement and optimization.

  15. Vehicle dynamics modeling and simulation

    CERN Document Server

    Schramm, Dieter; Bardini, Roberto

    2014-01-01

    The authors examine in detail the fundamentals and mathematical descriptions of the dynamics of automobiles. In this context different levels of complexity will be presented, starting with basic single-track models up to complex three-dimensional multi-body models. A particular focus is on the process of establishing mathematical models on the basis of real cars and the validation of simulation results. The methods presented are explained in detail by means of selected application scenarios.

  16. Performance modeling of Beamlet

    International Nuclear Information System (INIS)

    Auerbach, J.M.; Lawson, J.K.; Rotter, M.D.; Sacks, R.A.; Van Wonterghem, B.W.; Williams, W.H.

    1995-01-01

    Detailed modeling of beam propagation in Beamlet has been made to predict system performance. New software allows extensive use of optical component characteristics. This inclusion of real optical component characteristics has resulted in close agreement between calculated and measured beam distributions

  17. Stochastic models: theory and simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Field, Richard V., Jr.

    2008-03-01

    Many problems in applied science and engineering involve physical phenomena that behave randomly in time and/or space. Examples are diverse and include turbulent flow over an aircraft wing, Earth climatology, material microstructure, and the financial markets. Mathematical models for these random phenomena are referred to as stochastic processes and/or random fields, and Monte Carlo simulation is the only general-purpose tool for solving problems of this type. The use of Monte Carlo simulation requires methods and algorithms to generate samples of the appropriate stochastic model; these samples then become inputs and/or boundary conditions to established deterministic simulation codes. While numerous algorithms and tools currently exist to generate samples of simple random variables and vectors, no cohesive simulation tool yet exists for generating samples of stochastic processes and/or random fields. There are two objectives of this report. First, we provide some theoretical background on stochastic processes and random fields that can be used to model phenomena that are random in space and/or time. Second, we provide simple algorithms that can be used to generate independent samples of general stochastic models. The theory and simulation of random variables and vectors is also reviewed for completeness.
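    A minimal example of the kind of sample-generation algorithm discussed in the report is the factorization of a covariance matrix to draw realizations of a stationary Gaussian process on a grid. The covariance function and its parameters below are assumptions chosen only for illustration.

    ```python
    # Hedged illustration: drawing independent samples of a stationary Gaussian process
    # on a 1-D grid by factorizing its covariance matrix. The squared-exponential
    # covariance and its parameters are assumptions, not taken from the report.
    import numpy as np

    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 10.0, 200)                                    # space/time grid
    cov = np.exp(-0.5 * (t[:, None] - t[None, :]) ** 2 / 1.0 ** 2)     # squared-exponential covariance
    L = np.linalg.cholesky(cov + 1e-10 * np.eye(t.size))               # small jitter for stability

    samples = L @ rng.standard_normal((t.size, 5))                     # five independent sample paths
    print("sample variance at the mid-grid point:", round(samples[t.size // 2].var(), 3))
    ```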

  18. Advanced training simulator models. Implementation and validation

    International Nuclear Information System (INIS)

    Borkowsky, Jeffrey; Judd, Jerry; Belblidia, Lotfi; O'farrell, David; Andersen, Peter

    2008-01-01

    Modern training simulators are required to replicate plant data for both thermal-hydraulic and neutronic response. Replication is required such that reactivity manipulation on the simulator properly trains the operator for reactivity manipulation at the plant. This paper discusses advanced models which perform this function in real time using the coupled code system THOR/S3R. This code system models all fluid systems in detail using an advanced two-phase thermal-hydraulic model. The nuclear core is modeled using an advanced three-dimensional nodal method and cycle-specific nuclear data. These models are configured to run interactively from a graphical instructor station or hardware operation panels. The simulator models are theoretically rigorous and are expected to replicate the physics of the plant. However, to verify replication, the models must be independently assessed. Plant data is the preferred validation method, but plant data is often not available for many important training scenarios. In the absence of data, validation may be obtained by slower-than-real-time transient analysis. This analysis can be performed by coupling a safety analysis code and a core design code. Such a coupling exists between the codes RELAP5 and SIMULATE-3K (S3K). RELAP5/S3K is used to validate the real-time model for several postulated plant events. (author)

  19. Modeling and simulation goals and accomplishments

    International Nuclear Information System (INIS)

    Turinsky, P.

    2013-01-01

    The CASL (Consortium for Advanced Simulation of Light Water Reactors) mission is to develop and apply the Virtual Reactor simulator (VERA) to optimise nuclear power in terms of capital and operating costs, nuclear waste production, and nuclear safety. An efficient and reliable virtual reactor simulator relies on 3-dimensional calculations, accurate physics models and code coupling. Advances in computer hardware, along with comparable advances in numerical solvers, make the VERA project achievable. This series of slides details the VERA project, presents the specific features and performance of the codes involved in the project, and ends by listing the computing needs

  20. Simulation-based evaluation of the performance of the F test in a linear multilevel model setting with sparseness at the level of the primary unit.

    Science.gov (United States)

    Bruyndonckx, Robin; Aerts, Marc; Hens, Niel

    2016-09-01

    In a linear multilevel model, significance of all fixed effects can be determined using F tests under maximum likelihood (ML) or restricted maximum likelihood (REML). In this paper, we demonstrate that in the presence of primary unit sparseness, the performance of the F test under both REML and ML is rather poor. Using simulations based on the structure of a data example on ceftriaxone consumption in hospitalized children, we studied variability, type I error rate and power in scenarios with a varying number of secondary units within the primary units. In general, the variability in the estimates for the effect of the primary unit decreased as the number of secondary units increased. In the presence of singletons (i.e., only one secondary unit within a primary unit), REML consistently outperformed ML, although even under REML the performance of the F test was found inadequate. When modeling the primary unit as a random effect, the power was lower while the type I error rate was unstable. The options of dropping, regrouping, or splitting the singletons could solve either the problem of a high type I error rate or a low power, while worsening the other. The permutation test appeared to be a valid alternative as it outperformed the F test, especially under REML. We conclude that in the presence of singletons, one should be careful in using the F test to determine the significance of the fixed effects, and propose the permutation test (under REML) as an alternative. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
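    The permutation test recommended above can be illustrated with a small sketch for a primary-unit-level fixed effect under sparseness (many singleton clusters). This is a generic illustration, not the authors' simulation study; the data-generating values and the cluster-mean test statistic are assumptions.

    ```python
    # Hedged sketch: permutation test for a primary-unit-level effect when many
    # primary units are singletons. All data-generating values are assumptions.
    import numpy as np

    rng = np.random.default_rng(1)

    # 30 primary units, half "exposed"; most primary units contain a single secondary unit
    n_units, exposed = 30, np.repeat([0, 1], 15)
    sizes = rng.choice([1, 1, 1, 2, 5], size=n_units)          # sparseness: mostly singletons
    u = rng.normal(0.0, 0.5, n_units)                           # random primary-unit effects
    unit_means = np.array([rng.normal(0.3 * exposed[i] + u[i], 1.0, sizes[i]).mean()
                           for i in range(n_units)])

    def stat(labels):
        """Difference in mean outcome between exposed and unexposed primary units."""
        return unit_means[labels == 1].mean() - unit_means[labels == 0].mean()

    obs = stat(exposed)
    perm = np.array([stat(rng.permutation(exposed)) for _ in range(5000)])
    print("observed difference:", round(obs, 3),
          "| permutation p-value:", np.mean(np.abs(perm) >= abs(obs)))
    ```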

  1. Plasma modelling and numerical simulation

    International Nuclear Information System (INIS)

    Van Dijk, J; Kroesen, G M W; Bogaerts, A

    2009-01-01

    Plasma modelling is an exciting subject in which virtually all physical disciplines are represented. Plasma models combine the electromagnetic, statistical and fluid dynamical theories that have their roots in the 19th century with the modern insights concerning the structure of matter that were developed throughout the 20th century. The present cluster issue consists of 20 invited contributions, which are representative of the state of the art in plasma modelling and numerical simulation. These contributions provide an in-depth discussion of the major theories and modelling and simulation strategies, and their applications to contemporary plasma-based technologies. In this editorial review, we introduce and complement those papers by providing a bird's eye perspective on plasma modelling and discussing the historical context in which it has surfaced. (editorial review)

  2. High performance real-time flight simulation at NASA Langley

    Science.gov (United States)

    Cleveland, Jeff I., II

    1994-01-01

    In order to meet the stringent time-critical requirements for real-time man-in-the-loop flight simulation, computer processing operations must be deterministic and be completed in as short a time as possible. This includes simulation mathematical model computation and data input/output to the simulators. In 1986, in response to increased demands for flight simulation performance, personnel at NASA's Langley Research Center (LaRC), working with the contractor, developed extensions to a standard input/output system to provide for high-bandwidth, low-latency data acquisition and distribution. The Computer Automated Measurement and Control technology (IEEE standard 595) was extended to meet the performance requirements for real-time simulation. This technology extension increased the effective bandwidth by a factor of ten and increased the performance of modules necessary for simulator communications. This technology is being used by more than 80 leading technological developers in the United States, Canada, and Europe. Included among the commercial applications of this technology are nuclear process control, power grid analysis, process monitoring, real-time simulation, and radar data acquisition. Personnel at LaRC have completed the development of the use of supercomputers for simulation mathematical model computation to support real-time flight simulation. This includes the development of a real-time operating system and the development of specialized software and hardware for the CAMAC simulator network. This work, coupled with the use of an open systems software architecture, has advanced the state of the art in real-time flight simulation. The data acquisition technology innovation and experience with recent developments in this technology are described.

  3. Performance comparison of low and high temperature polymer electrolyte membrane fuel cells. Experimental examinations, modelling and numerical simulation; Leistungsvergleich von Nieder- und Hochtemperatur-Polymerelektrolytmembran-Brennstoffzellen. Experimentelle Untersuchungen, Modellierung und numerische Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Loehn, Helmut

    2010-11-03

    danger of washing out of the phosphoric acid. In an additional test series the Celtec-P-1000 HT-MEA was subjected to temperature change cycles (40 - 160 C), which led to irreversible voltage losses. In a final test series performance tests were carried out with a HT-PEM fuel cell stack (16 cells / 1 kW), developed in the fuel cell research centre of Volkswagen with a special gas diffusion electrode, which should avoid the degradation at low temperatures. In these examinations no irreversible voltage losses could be detected, but the tests had to be aborted because of leakage problems. The insight gained from the experimental examinations into the superior operating behaviour and the further advantages of the HT-PEMFC in comparison to the LT-PEMFC was crucial for the construction, in the theoretical part of this thesis, of a simulation model for a single HT-PEM fuel cell that should also be suitable as a process simulation model for the computer-based development of a virtual fuel cell within the interdisciplinary project ''Virtual Fuel Cell'' at the TU Darmstadt. The model is a numerical 2D ''along the channel'' model that was constructed with the finite element software COMSOL Multiphysics (version 3.5a). The stationary, single-phase model comprises altogether ten dependent variables in seven application modules in a highly complex, coupled nonlinear system of equations with 33713 degrees of freedom (1675 rectangular elements with 1768 nodes). The simulation model describes the mass transport processes and the electro-chemical reactions in a HT-PEM fuel cell with good accuracy; the model was validated by comparing the model results with experimental data. The 2D model is therefore basically suitable as a process simulation model for the planning of a virtual HT-PEM fuel cell. (orig.)

  4. Model for Simulation Atmospheric Turbulence

    DEFF Research Database (Denmark)

    Lundtang Petersen, Erik

    1976-01-01

    A method that produces realistic simulations of atmospheric turbulence is developed and analyzed. The procedure makes use of a generalized spectral analysis, often called a proper orthogonal decomposition or the Karhunen-Loève expansion. A set of criteria, emphasizing a realistic appearance...... eigenfunctions and estimates of the distributions of the corresponding expansion coefficients. The simulation method utilizes the eigenfunction expansion procedure to produce preliminary time histories of the three velocity components simultaneously. As a final step, a spectral shaping procedure is then applied....... The method is unique in modeling the three velocity components simultaneously, and it is found that important cross-statistical features are reasonably well-behaved. It is concluded that the model provides a practical, operational simulator of atmospheric turbulence....
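    The expansion step described above can be illustrated with a generic Karhunen-Loève (proper orthogonal decomposition) sample generator; the exponential covariance and truncation level used here are assumptions, not the turbulence statistics of the paper.

    ```python
    # Hedged sketch of a Karhunen-Loeve (proper orthogonal decomposition) simulator:
    # eigen-decompose an assumed covariance matrix, then synthesize a realization from
    # random expansion coefficients. Covariance and truncation level are assumptions.
    import numpy as np

    rng = np.random.default_rng(2)
    t = np.linspace(0.0, 60.0, 300)
    cov = np.exp(-np.abs(t[:, None] - t[None, :]) / 10.0)        # assumed exponential covariance

    eigval, eigvec = np.linalg.eigh(cov)                          # KL eigenfunctions in the columns
    order = np.argsort(eigval)[::-1]
    eigval, eigvec = eigval[order], eigvec[:, order]

    k = 40                                                        # keep the leading modes only
    coeff = rng.standard_normal(k) * np.sqrt(np.clip(eigval[:k], 0.0, None))
    sample = eigvec[:, :k] @ coeff                                # one simulated time history
    print("variance fraction captured by", k, "modes:", round(eigval[:k].sum() / eigval.sum(), 3))
    ```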

  5. Predictors of laparoscopic simulation performance among practicing obstetrician gynecologists.

    Science.gov (United States)

    Mathews, Shyama; Brodman, Michael; D'Angelo, Debra; Chudnoff, Scott; McGovern, Peter; Kolev, Tamara; Bensinger, Giti; Mudiraj, Santosh; Nemes, Andreea; Feldman, David; Kischak, Patricia; Ascher-Walsh, Charles

    2017-11-01

    While simulation training has been established as an effective method for improving laparoscopic surgical performance in surgical residents, few studies have focused on its use for attending surgeons, particularly in obstetrics and gynecology. Surgical simulation may have a role in improving and maintaining proficiency in the operating room for practicing obstetrician gynecologists. We sought to determine if parameters of performance for validated laparoscopic virtual simulation tasks correlate with surgical volume and characteristics of practicing obstetricians and gynecologists. All gynecologists with laparoscopic privileges (n = 347) from 5 academic medical centers in New York City were required to complete a laparoscopic surgery simulation assessment. The physicians took a presimulation survey gathering physician self-reported characteristics and then performed 3 basic skills tasks (enforced peg transfer, lifting/grasping, and cutting) on the LapSim virtual reality laparoscopic simulator (Surgical Science Ltd, Gothenburg, Sweden). The association between simulation outcome scores (time, efficiency, and errors) and self-rated clinical skills measures (self-rated laparoscopic skill score or surgical volume category) were examined with regression models. The average number of laparoscopic procedures per month was a significant predictor of total time on all 3 tasks (P = .001 for peg transfer; P = .041 for lifting and grasping; P simulation performance as it correlates to active physician practice, further studies may help assess skill and individualize training to maintain skill levels as case volumes fluctuate. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. New exploration on TMSR: modelling and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Si, S.; Chen, Q.; Bei, H.; Zhao, J., E-mail: ssy@snerdi.com.cn [Shanghai Nuclear Engineering Research & Design Inst., Shanghai (China)

    2015-07-01

    A tightly coupled multi-physics model for the MSR (Molten Salt Reactor) system, involving the reactor core and the rest of the primary loop, has been developed and employed in the in-house computer code TANG-MSR. In this paper, the computer code is used to simulate steady-state operation and transient behavior for our redesigned TMSR. The presented simulation results demonstrate that the models employed in TANG-MSR can capture the major physics phenomena in an MSR and that the redesigned TMSR has excellent safety and sustainability performance. (author)

  7. Simulating and stimulating performance: Introducing distributed simulation to enhance musical learning and performance

    Directory of Open Access Journals (Sweden)

    Aaron eWilliamon

    2014-02-01

    Full Text Available Musicians typically rehearse far away from their audiences and in practice rooms that differ significantly from the concert venues in which they aspire to perform. Due to the high costs and inaccessibility of such venues, much current international music training lacks repeated exposure to realistic performance situations, with students learning all too late (or not at all) how to manage performance stress and the demands of their audiences. Virtual environments have been shown to be an effective training tool in the fields of medicine and sport, offering practitioners access to real-life performance scenarios but with lower risk of negative evaluation and outcomes. The aim of this research was to design and test the efficacy of simulated performance environments in which conditions of real performance could be recreated. Advanced violin students (n=11) were recruited to perform in two simulations: a solo recital with a small virtual audience and an audition situation with three expert virtual judges. Each simulation contained back-stage and on-stage areas, life-sized interactive virtual observers, and pre- and post-performance protocols designed to match those found at leading international performance venues. Participants completed a questionnaire on their experiences of using the simulations. Results show that both simulated environments offered realistic experience of performance contexts and were rated particularly useful for developing performance skills. For a subset of 7 violinists, state anxiety and electrocardiographic data were collected during the simulated audition and an actual audition with real judges. Results display comparable levels of reported state anxiety and patterns of heart rate variability in both situations, suggesting that responses to the simulated audition closely approximate those of a real audition. The findings are discussed in relation to their implications, both generalizable and individual-specific, for

  8. Simulating and stimulating performance: introducing distributed simulation to enhance musical learning and performance.

    Science.gov (United States)

    Williamon, Aaron; Aufegger, Lisa; Eiholzer, Hubert

    2014-01-01

    Musicians typically rehearse far away from their audiences and in practice rooms that differ significantly from the concert venues in which they aspire to perform. Due to the high costs and inaccessibility of such venues, much current international music training lacks repeated exposure to realistic performance situations, with students learning all too late (or not at all) how to manage performance stress and the demands of their audiences. Virtual environments have been shown to be an effective training tool in the fields of medicine and sport, offering practitioners access to real-life performance scenarios but with lower risk of negative evaluation and outcomes. The aim of this research was to design and test the efficacy of simulated performance environments in which conditions of "real" performance could be recreated. Advanced violin students (n = 11) were recruited to perform in two simulations: a solo recital with a small virtual audience and an audition situation with three "expert" virtual judges. Each simulation contained back-stage and on-stage areas, life-sized interactive virtual observers, and pre- and post-performance protocols designed to match those found at leading international performance venues. Participants completed a questionnaire on their experiences of using the simulations. Results show that both simulated environments offered realistic experience of performance contexts and were rated particularly useful for developing performance skills. For a subset of 7 violinists, state anxiety and electrocardiographic data were collected during the simulated audition and an actual audition with real judges. Results display comparable levels of reported state anxiety and patterns of heart rate variability in both situations, suggesting that responses to the simulated audition closely approximate those of a real audition. The findings are discussed in relation to their implications, both generalizable and individual-specific, for performance training.

  9. Simulations of Technology-Induced and Crisis-Led Stochastic and Chaotic Fluctuations in Higher Education Processes: A Model and a Case Study for Performance and Expected Employment

    Science.gov (United States)

    Ahmet, Kara

    2015-01-01

    This paper presents a simple model of the provision of higher educational services that considers and exemplifies nonlinear, stochastic, and potentially chaotic processes. I use the methods of system dynamics to simulate these processes in the context of a particular sociologically interesting case, namely that of the Turkish higher education…
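    A generic illustration of the kind of nonlinear, stochastic, potentially chaotic recursion referred to above (not the paper's system dynamics model) is a logistic-type update with small random shocks:

    ```python
    # Generic illustration only (not the paper's model): a nonlinear, stochastic
    # recursion whose deterministic part is in the chaotic regime, producing
    # irregular fluctuations in a normalized performance index. Values are assumed.
    import numpy as np

    rng = np.random.default_rng(3)
    r, x = 3.9, 0.4                         # logistic growth parameter and initial index
    trajectory = []
    for _ in range(100):
        x = r * x * (1.0 - x) + rng.normal(0.0, 0.005)    # nonlinear update plus small shocks
        x = min(max(x, 0.0), 1.0)                          # keep the index in [0, 1]
        trajectory.append(x)
    print("last five values:", np.round(trajectory[-5:], 3))
    ```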

  10. Evaluation of outbreak detection performance using multi-stream syndromic surveillance for influenza-like illness in rural Hubei Province, China: a temporal simulation model based on healthcare-seeking behaviors.

    Directory of Open Access Journals (Sweden)

    Yunzhou Fan

    Full Text Available BACKGROUND: Syndromic surveillance promotes the early detection of disease outbreaks. Although syndromic surveillance has increased in developing countries, performance on outbreak detection, particularly in cases of multi-stream surveillance, has scarcely been evaluated in rural areas. OBJECTIVE: This study introduces a temporal simulation model based on healthcare-seeking behaviors to evaluate the performance of multi-stream syndromic surveillance for influenza-like illness. METHODS: Data were obtained in six towns of rural Hubei Province, China, from April 2012 to June 2013. A Susceptible-Exposed-Infectious-Recovered model generated 27 scenarios of simulated influenza A (H1N1) outbreaks, which were converted into corresponding simulated syndromic datasets through the healthcare-seeking behaviors model. We then superimposed the converted syndromic datasets onto the baselines obtained to create the testing datasets. Outbreak detection performance of single-stream surveillance of clinic visits, frequency of over-the-counter drug purchases, and school absenteeism, and of multi-stream surveillance of their combinations, was evaluated using receiver operating characteristic curves and activity monitoring operation curves. RESULTS: In the six towns examined, clinic visit surveillance and school absenteeism surveillance exhibited better outbreak detection performance than over-the-counter drug purchase frequency surveillance; the performance of multi-stream surveillance was preferable to single-stream surveillance, particularly at low specificity (Sp < 90%). CONCLUSIONS: The temporal simulation model based on healthcare-seeking behaviors offers an accessible method for evaluating the performance of multi-stream surveillance.
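    The outbreak-generation step can be illustrated with a minimal deterministic SEIR recursion of the kind named above. All rate parameters and the population size are assumptions for illustration, not the values used in the study.

    ```python
    # Hedged sketch: a discrete-time SEIR recursion of the kind used to generate the
    # simulated outbreaks above. Population size and rates are assumptions.
    import numpy as np

    def seir_outbreak(beta=0.6, sigma=1 / 2.0, gamma=1 / 3.0, n=50000, i0=5, days=120):
        s, e, i, r = n - i0, 0.0, float(i0), 0.0
        daily_onsets = []
        for _ in range(days):
            infections = beta * s * i / n            # S -> E
            onsets = sigma * e                       # E -> I (new symptomatic cases)
            recoveries = gamma * i                   # I -> R
            s, e = s - infections, e + infections - onsets
            i, r = i + onsets - recoveries, r + recoveries
            daily_onsets.append(onsets)
        return np.array(daily_onsets)

    cases = seir_outbreak()
    print("peak day:", int(cases.argmax()), "| peak daily onsets:", int(cases.max()))
    ```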

  11. Validation process of simulation model

    International Nuclear Information System (INIS)

    San Isidro, M. J.

    1998-01-01

    A methodology for the empirical validation of any detailed simulation model is presented. This kind of validation is always related to an experimental case. The empirical validation has a residual sense, because the conclusions are based on comparisons between simulated outputs and experimental measurements. This methodology guides us in detecting the failures of the simulation model. Furthermore, it can be used as a guide in the design of subsequent experiments. Three steps can be clearly differentiated: Sensitivity analysis, which can be made with DSA, differential sensitivity analysis, and with MCSA, Monte-Carlo sensitivity analysis. Searching for the optimal domains of the input parameters; a procedure based on Monte-Carlo methods and cluster techniques has been developed to find the optimal domains of these parameters. Residual analysis, made in the time domain and in the frequency domain, using correlation analysis and spectral analysis. As an application of this methodology, the validation carried out on a thermal simulation model of buildings, Esp., is presented, studying the behavior of building components in a Test Cell of LECE of CIEMAT. (Author) 17 refs.
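    The first step of the methodology, Monte-Carlo sensitivity analysis, can be sketched generically as follows; the stand-in "simulation model" and its input ranges are invented for illustration and are not the building model validated in the paper.

    ```python
    # Hedged sketch of a Monte-Carlo sensitivity analysis: draw random inputs, run a
    # stand-in model, and rank inputs by their correlation with the output. The toy
    # "heating load" model and the parameter ranges are invented for illustration.
    import numpy as np

    rng = np.random.default_rng(4)
    n = 2000
    inputs = {"u_wall": rng.uniform(0.2, 0.6, n),      # assumed wall U-value range
              "u_window": rng.uniform(1.0, 3.0, n),    # assumed window U-value range
              "ach": rng.uniform(0.2, 1.5, n)}         # assumed air changes per hour

    # Stand-in simulation model (arbitrary linear response plus noise)
    output = (120.0 * inputs["u_wall"] + 40.0 * inputs["u_window"]
              + 15.0 * inputs["ach"] + rng.normal(0.0, 1.0, n))

    for name, values in inputs.items():
        r = np.corrcoef(values, output)[0, 1]
        print(f"{name:9s} correlation with simulated output: {r:+.2f}")
    ```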

  12. Modeling and Simulation of Nanoindentation

    Science.gov (United States)

    Huang, Sixie; Zhou, Caizhi

    2017-11-01

    Nanoindentation is a hardness test method applied to small volumes of material which can provide some unique effects and spark many related research activities. To fully understand the phenomena observed during nanoindentation tests, modeling and simulation methods have been developed to predict the mechanical response of materials during nanoindentation. However, challenges remain with those computational approaches, because of their length scale, predictive capability, and accuracy. This article reviews recent progress and challenges for modeling and simulation of nanoindentation, including an overview of molecular dynamics, the quasicontinuum method, discrete dislocation dynamics, and the crystal plasticity finite element method, and discusses how to integrate multiscale modeling approaches seamlessly with experimental studies to understand the length-scale effects and microstructure evolution during nanoindentation tests, creating a unique opportunity to establish new calibration procedures for the nanoindentation technique.

  13. Computer Simulation Performed for Columbia Project Cooling System

    Science.gov (United States)

    Ahmad, Jasim

    2005-01-01

    This demo shows a high-fidelity simulation of the air flow in the main computer room housing the Columbia (10,024 Intel Itanium processors) system. The simulation assesses the performance of the cooling system, identifies deficiencies, and recommends modifications to eliminate them. It used two in-house software packages on NAS supercomputers: Chimera Grid Tools to generate a geometric model of the computer room, and the OVERFLOW-2 code for fluid and thermal simulation. This state-of-the-art technology can be easily extended to provide a general capability for air flow analyses of any modern computer room.

  14. Analyzing Strategic Business Rules through Simulation Modeling

    Science.gov (United States)

    Orta, Elena; Ruiz, Mercedes; Toro, Miguel

    Service Oriented Architecture (SOA) holds promise for business agility since it allows business processes to change to meet new customer demands or market needs without causing a cascade effect of changes in the underlying IT systems. Business rules are the instrument chosen to help business and IT collaborate. In this paper, we propose the utilization of simulation models to model and simulate strategic business rules that are then disaggregated at different levels of an SOA architecture. Our proposal is aimed at helping to find a good configuration for strategic business objectives and IT parameters. The paper includes a case study where a simulation model is built to help business decision-making in a context where finding a good configuration for different business parameters and performance is too complex to analyze by trial and error.

  15. Nonlinear distortion in wireless systems modeling and simulation with Matlab

    CERN Document Server

    Gharaibeh, Khaled M

    2011-01-01

    This book covers the principles of modeling and simulation of nonlinear distortion in wireless communication systems, with MATLAB simulations and techniques. In this book, the author describes the principles of modeling and simulation of nonlinear distortion in single and multichannel wireless communication systems using both deterministic and stochastic signals. Models and simulation methods of nonlinear amplifiers explain in detail how to analyze and evaluate the performance of data communication links under nonlinear amplification. The book addresses the analysis of nonlinear systems
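    As an illustration of the kind of nonlinear-amplifier simulation the book treats, the following sketch passes a two-tone complex baseband signal through a memoryless Saleh-type AM/AM and AM/PM nonlinearity, a commonly used amplifier model; the coefficients are typical literature values and are assumptions here, not taken from the book.

    ```python
    # Hedged sketch: two-tone test through a memoryless Saleh-type AM/AM + AM/PM
    # nonlinearity, a commonly used amplifier model. Coefficients are typical
    # literature values and are assumptions here, not taken from the book.
    import numpy as np

    def saleh(x, aa=2.1587, ba=1.1517, ap=4.0033, bp=9.1040):
        """Apply AM/AM and AM/PM distortion to a complex baseband signal."""
        r, phi = np.abs(x), np.angle(x)
        gain = aa * r / (1.0 + ba * r ** 2)
        phase = ap * r ** 2 / (1.0 + bp * r ** 2)
        return gain * np.exp(1j * (phi + phase))

    fs, n = 40960.0, 4096                          # 10 Hz bin spacing keeps the tones on FFT bins
    t = np.arange(n) / fs
    x = 0.4 * (np.exp(2j * np.pi * 5000.0 * t) + np.exp(2j * np.pi * 6000.0 * t))
    y = saleh(x)

    spectrum = np.abs(np.fft.fft(y))
    freqs = np.fft.fftfreq(n, 1.0 / fs)
    strongest = np.sort(np.round(freqs[np.argsort(spectrum)[-6:]]))
    print("strongest spectral lines (Hz):", strongest)   # intermodulation appears around 4 and 7 kHz
    ```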

  16. Assessment of Molecular Modeling & Simulation

    Energy Technology Data Exchange (ETDEWEB)

    None

    2002-01-03

    This report reviews the development and applications of molecular and materials modeling in Europe and Japan in comparison to those in the United States. Topics covered include computational quantum chemistry, molecular simulations by molecular dynamics and Monte Carlo methods, mesoscale modeling of material domains, molecular-structure/macroscale property correlations like QSARs and QSPRs, and related information technologies like informatics and special-purpose molecular-modeling computers. The panel's findings include the following: The United States leads this field in many scientific areas. However, Canada has particular strengths in DFT methods and homogeneous catalysis; Europe in heterogeneous catalysis, mesoscale, and materials modeling; and Japan in materials modeling and special-purpose computing. Major government-industry initiatives are underway in Europe and Japan, notably in multi-scale materials modeling and in development of chemistry-capable ab-initio molecular dynamics codes.

  17. Comparison of turbulence measurements from DIII-D low-mode and high-performance plasmas to turbulence simulations and models

    International Nuclear Information System (INIS)

    Rhodes, T.L.; Leboeuf, J.-N.; Sydora, R.D.; Groebner, R.J.; Doyle, E.J.; McKee, G.R.; Peebles, W.A.; Rettig, C.L.; Zeng, L.; Wang, G.

    2002-01-01

    Measured turbulence characteristics (correlation lengths, spectra, etc.) in low-confinement (L-mode) and high-performance plasmas in the DIII-D tokamak [Luxon et al., Proceedings Plasma Physics and Controlled Nuclear Fusion Research 1986 (International Atomic Energy Agency, Vienna, 1987), Vol. I, p. 159] show many similarities with the characteristics determined from turbulence simulations. Radial correlation lengths Δr of density fluctuations from L-mode discharges are found to be numerically similar to the ion poloidal gyroradius ρ_θ,s, or 5-10 times the ion gyroradius ρ_s, over the radial region 0.2 … . To determine whether Δr scales as ρ_θ,s or as 5-10 times ρ_s, an experiment was performed which modified ρ_θ,s while keeping other plasma parameters approximately fixed. It was found that the experimental Δr did not scale as ρ_θ,s, which was similar to low-resolution UCAN simulations. Finally, both experimental measurements and gyrokinetic simulations indicate a significant reduction in the radial correlation length in high-performance quiescent double barrier discharges, as compared to normal L-mode, consistent with reduced transport in these high-performance plasmas

  18. Regional model simulations of New Zealand climate

    Science.gov (United States)

    Renwick, James A.; Katzfey, Jack J.; Nguyen, Kim C.; McGregor, John L.

    1998-03-01

    Simulation of New Zealand climate is examined through the use of a regional climate model nested within the output of the Commonwealth Scientific and Industrial Research Organisation nine-level general circulation model (GCM). R21 resolution GCM output is used to drive a regional model run at 125 km grid spacing over the Australasian region. The 125 km run is used in turn to drive a simulation at 50 km resolution over New Zealand. Simulations with a full seasonal cycle are performed for 10 model years. The focus is on the quality of the simulation of present-day climate, but results of a doubled-CO2 run are discussed briefly. Spatial patterns of mean simulated precipitation and surface temperatures improve markedly as horizontal resolution is increased, through the better resolution of the country's orography. However, increased horizontal resolution leads to a positive bias in precipitation. At 50 km resolution, simulated frequency distributions of daily maximum/minimum temperatures are statistically similar to those of observations at many stations, while frequency distributions of daily precipitation appear to be statistically different to those of observations at most stations. Modeled daily precipitation variability at 125 km resolution is considerably less than observed, but is comparable to, or exceeds, observed variability at 50 km resolution. The sensitivity of the simulated climate to changes in the specification of the land surface is discussed briefly. Spatial patterns of the frequency of extreme temperatures and precipitation are generally well modeled. Under a doubling of CO2, the frequency of precipitation extremes changes only slightly at most locations, while air frosts become virtually unknown except at high-elevation sites.

  19. Standard for Models and Simulations

    Science.gov (United States)

    Steele, Martin J.

    2016-01-01

    This NASA Technical Standard establishes uniform practices in modeling and simulation to ensure essential requirements are applied to the design, development, and use of models and simulations (M&S), while ensuring acceptance criteria are defined by the program/project and approved by the responsible Technical Authority. It also provides an approved set of requirements, recommendations, and criteria with which M&S may be developed, accepted, and used in support of NASA activities. As the M&S disciplines employed and application areas involved are broad, the common aspects of M&S across all NASA activities are addressed. The discipline-specific details of a given M&S should be obtained from relevant recommended practices. The primary purpose is to reduce the risks associated with M&S-influenced decisions by ensuring the complete communication of the credibility of M&S results.

  20. Numerical Simulation and Performance Analysis of Twin Screw Air Compressors

    Directory of Open Access Journals (Sweden)

    W. S. Lee

    2001-01-01

    Full Text Available A theoretical model is proposed in this paper in order to study the performance of oil-less and oil-injected twin screw air compressors. Based on this model, a computer simulation program is developed and the effects of different design parameters including rotor profile, geometric clearance, oil-injected angle, oil temperature, oil flow rate, built-in volume ratio and other operation conditions on the performance of twin screw air compressors are investigated. The simulation program gives us output variables such as specific power, compression ratio, compression efficiency, volumetric efficiency, and discharge temperature. Some of the above results are then compared with experimentally measured data and good agreement is found between the simulation results and the measured data.
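    Two of the output variables listed above, discharge temperature and specific power, can be estimated from textbook ideal-gas relations, as in the short worked example below; this is generic thermodynamics, not the authors' screw-compressor model, and the efficiency value is an assumption.

    ```python
    # Worked example with generic ideal-gas relations (not the authors' screw-compressor
    # model): discharge temperature and specific power from the pressure ratio.
    # The efficiency value is an assumption.
    cp, gamma = 1005.0, 1.4         # air: specific heat J/(kg K) and ratio of specific heats
    T1, pr = 293.15, 8.0            # inlet temperature (K) and pressure ratio
    eta_c = 0.72                    # assumed overall compression efficiency

    T2s = T1 * pr ** ((gamma - 1.0) / gamma)     # isentropic discharge temperature
    w = cp * (T2s - T1) / eta_c                  # specific work, J per kg of delivered air
    T2 = T1 + (T2s - T1) / eta_c                 # actual (adiabatic) discharge temperature

    print(f"isentropic discharge temperature: {T2s - 273.15:6.1f} C")
    print(f"actual discharge temperature:     {T2 - 273.15:6.1f} C")
    print(f"specific power:                   {w / 1000.0:6.1f} kJ/kg")
    ```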

  1. Facility/equipment performance evaluation using microcomputer simulation analysis

    International Nuclear Information System (INIS)

    Chockie, A.D.; Hostick, C.J.

    1985-08-01

    A computer simulation analysis model was developed at the Pacific Northwest Laboratory to assist in assuring the adequacy of the Monitored Retrievable Storage facility design to meet the specified spent nuclear fuel throughput requirements. The microcomputer-based model was applied to the analysis of material flow, equipment capability and facility layout. The simulation analysis evaluated uncertainties concerning both facility throughput requirements and process duration times as part of the development of a comprehensive estimate of facility performance. The evaluations provided feedback into the design review task to identify areas where design modifications should be considered

  2. Visualization and Analysis of Climate Simulation Performance Data

    Science.gov (United States)

    Röber, Niklas; Adamidis, Panagiotis; Behrens, Jörg

    2015-04-01

    Visualization is the key process of transforming abstract (scientific) data into a graphical representation, to aid in the understanding of the information hidden within the data. Climate simulation data sets are typically quite large, time varying, and consist of many different variables sampled on an underlying grid. A large variety of climate models - and sub-models - exist to simulate various aspects of the climate system. Generally, one is mainly interested in the physical variables produced by the simulation runs, but model developers are also interested in performance data measured along with these simulations. Climate simulation models are carefully developed complex software systems, designed to run in parallel on large HPC systems. An important goal thereby is to utilize the entire hardware as efficiently as possible, that is, to distribute the workload as evenly as possible among the individual components. This is a very challenging task, and detailed performance data, such as timings, cache misses etc., have to be used to locate and understand performance problems in order to optimize the model implementation. Furthermore, the correlation of performance data to the processes of the application and the sub-domains of the decomposed underlying grid is vital when addressing communication and load imbalance issues. High-resolution climate simulations are carried out on tens to hundreds of thousands of cores, thus yielding a vast amount of profiling data, which cannot be analyzed without appropriate visualization techniques. This PICO presentation displays and discusses the ICON simulation model, which is jointly developed by the Max Planck Institute for Meteorology and the German Weather Service, in partnership with DKRZ. The visualization and analysis of the model's performance data allow us to optimize and fine-tune the model, as well as to understand its execution on the HPC system. We show and discuss our workflow, as well as present new ideas and

  3. Simulation modeling and analysis in safety. II

    International Nuclear Information System (INIS)

    Ayoub, M.A.

    1981-01-01

    The paper introduces and illustrates simulation modeling as a viable approach for dealing with complex issues and decisions in safety and health. The author details two studies: evaluation of employee exposure to airborne radioactive materials and effectiveness of the safety organization. The first study seeks to define a policy to manage a facility used in testing employees for radiation contamination. An acceptable policy is one that would permit the testing of all employees as defined under regulatory requirements, while not exceeding available resources. The second study evaluates the relationship between safety performance and the characteristics of the organization, its management, its policy, and communication patterns among various functions and levels. Both studies use models where decisions are reached based on the prevailing conditions and occurrence of key events within the simulation environment. Finally, several problem areas suitable for simulation studies are highlighted. (Auth.)

  4. Cognitive environment simulation: An artificial intelligence system for human performance assessment: Modeling human intention formation: [Technical report, May 1986-June 1987

    International Nuclear Information System (INIS)

    Woods, D.D.; Roth, E.M.; Pople, H. Jr.

    1987-11-01

    This report documents the results of Phase II of a three phase research program to develop and validate improved methods to model the cognitive behavior of nuclear power plant (NPP) personnel. In Phase II a dynamic simulation capability for modeling how people form intentions to act in NPP emergency situations was developed based on techniques from artificial intelligence. This modeling tool, Cognitive Environment Simulation or CES, simulates the cognitive processes that determine situation assessment and intention formation. It can be used to investigate analytically what situations and factors lead to intention failures, what actions follow from intention failures (e.g., errors of omission, errors of commission, common mode errors), the ability to recover from errors or additional machine failures, and the effects of changes in the NPP person-machine system. The Cognitive Reliability Assessment Technique (or CREATE) was also developed in Phase II to specify how CES can be used to enhance the measurement of the human contribution to risk in probabilistic risk assessment (PRA) studies. 43 refs., 20 figs., 1 tab

  5. Analysing the temporal dynamics of model performance for hydrological models

    NARCIS (Netherlands)

    Reusser, D.E.; Blume, T.; Schaefli, B.; Zehe, E.

    2009-01-01

    The temporal dynamics of hydrological model performance gives insights into errors that cannot be obtained from global performance measures assigning a single number to the fit of a simulated time series to an observed reference series. These errors can include errors in data, model parameters, or

  6. Mesoscopic modelling and simulation of soft matter.

    Science.gov (United States)

    Schiller, Ulf D; Krüger, Timm; Henrich, Oliver

    2017-12-20

    The deformability of soft condensed matter often requires modelling of hydrodynamical aspects to gain quantitative understanding. This, however, requires specialised methods that can resolve the multiscale nature of soft matter systems. We review a number of the most popular simulation methods that have emerged, such as Langevin dynamics, dissipative particle dynamics, multi-particle collision dynamics, sometimes also referred to as stochastic rotation dynamics, and the lattice-Boltzmann method. We conclude this review with a short glance at current compute architectures for high-performance computing and community codes for soft matter simulation.
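    A minimal sketch of the first method listed above, Langevin dynamics, is given below for a single particle in a harmonic potential; the parameters are assumed, and the printed variance can be checked against the equipartition value kT/k.

    ```python
    # Hedged sketch: overdamped Langevin dynamics (Euler-Maruyama) for one particle in a
    # harmonic trap; parameters are assumed. The measured variance can be compared with
    # the equilibrium value kT / k_spring.
    import numpy as np

    rng = np.random.default_rng(5)
    kT, gamma, k_spring, dt = 1.0, 1.0, 2.0, 1e-3
    x, samples = np.zeros(3), []

    for step in range(100000):
        force = -k_spring * x                                    # conservative force
        noise = np.sqrt(2.0 * kT * dt / gamma) * rng.standard_normal(3)
        x = x + force / gamma * dt + noise                       # Euler-Maruyama update
        if step > 20000:                                         # discard equilibration steps
            samples.append(x.copy())

    var = np.var(np.array(samples), axis=0).mean()
    print("measured <x^2> per axis:", round(var, 3), "| theory kT/k =", kT / k_spring)
    ```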

  7. Vermont Yankee simulator BOP model upgrade

    International Nuclear Information System (INIS)

    Alejandro, R.; Udbinac, M.J.

    2006-01-01

    The Vermont Yankee simulator has undergone significant changes in the 20 years since the original order was placed. After the move from the original Unix to MS Windows environment, and upgrade to the latest version of SimPort, now called MASTER, the platform was set for an overhaul and replacement of major plant system models. Over a period of a few months, the VY simulator team, in partnership with WSC engineers, replaced outdated legacy models of the main steam, condenser, condensate, circulating water, feedwater and feedwater heaters, and main turbine and auxiliaries. The timing was ideal, as the plant was undergoing a power up-rate, so the opportunity was taken to replace the legacy models with industry-leading, true on-line object oriented graphical models. Due to the efficiency of design and ease of use of the MASTER tools, VY staff performed the majority of the modeling work themselves with great success, with only occasional assistance from WSC, in a relatively short time-period, despite having to maintain all of their 'regular' simulator maintenance responsibilities. This paper will provide a more detailed view of the VY simulator, including how it is used and how it has benefited from the enhancements and upgrades implemented during the project. (author)

  8. Advances in Intelligent Modelling and Simulation Simulation Tools and Applications

    CERN Document Server

    Oplatková, Zuzana; Carvalho, Marco; Kisiel-Dorohinicki, Marek

    2012-01-01

    The human capacity to abstract complex systems and phenomena into simplified models has played a critical role in the rapid evolution of our modern industrial processes and scientific research. As a science and an art, Modelling and Simulation have been one of the core enablers of this remarkable human trace, and have become a topic of great importance for researchers and practitioners. This book was created to compile some of the most recent concepts, advances, challenges and ideas associated with Intelligent Modelling and Simulation frameworks, tools and applications. The first chapter discusses the important aspects of a human interaction and the correct interpretation of results during simulations. The second chapter gets to the heart of the analysis of entrepreneurship by means of agent-based modelling and simulations. The following three chapters bring together the central theme of simulation frameworks, first describing an agent-based simulation framework, then a simulator for electrical machines, and...

  9. Cognitive load predicts point-of-care ultrasound simulator performance.

    Science.gov (United States)

    Aldekhyl, Sara; Cavalcanti, Rodrigo B; Naismith, Laura M

    2018-02-01

    The ability to maintain good performance with low cognitive load is an important marker of expertise. Incorporating cognitive load measurements in the context of simulation training may help to inform judgements of competence. This exploratory study investigated relationships between demographic markers of expertise, cognitive load measures, and simulator performance in the context of point-of-care ultrasonography. Twenty-nine medical trainees and clinicians at the University of Toronto with a range of clinical ultrasound experience were recruited. Participants answered a demographic questionnaire then used an ultrasound simulator to perform targeted scanning tasks based on clinical vignettes. Participants were scored on their ability to both acquire and interpret ultrasound images. Cognitive load measures included participant self-report, eye-based physiological indices, and behavioural measures. Data were analyzed using a multilevel linear modelling approach, wherein observations were clustered by participants. Experienced participants outperformed novice participants on ultrasound image acquisition. Ultrasound image interpretation was comparable between the two groups. Ultrasound image acquisition performance was predicted by level of training, prior ultrasound training, and cognitive load. There was significant convergence between cognitive load measurement techniques. A marginal model of ultrasound image acquisition performance including prior ultrasound training and cognitive load as fixed effects provided the best overall fit for the observed data. In this proof-of-principle study, the combination of demographic and cognitive load measures provided more sensitive metrics to predict ultrasound simulator performance. Performance assessments which include cognitive load can help differentiate between levels of expertise in simulation environments, and may serve as better predictors of skill transfer to clinical practice.

  10. Performance of the WRF model to simulate the seasonal and interannual variability of hydrometeorological variables in East Africa: a case study for the Tana River basin in Kenya

    Science.gov (United States)

    Kerandi, Noah Misati; Laux, Patrick; Arnault, Joel; Kunstmann, Harald

    2017-10-01

    This study investigates the ability of the regional climate model Weather Research and Forecasting (WRF) to simulate the seasonal and interannual variability of hydrometeorological variables in the Tana River basin (TRB) in Kenya, East Africa. The impact of two different land use classifications, i.e., the Moderate Resolution Imaging Spectroradiometer (MODIS) and the US Geological Survey (USGS), at two horizontal resolutions (50 and 25 km) is investigated. Simulated precipitation and temperature for the period 2011-2014 are compared with Tropical Rainfall Measuring Mission (TRMM), Climate Research Unit (CRU), and station data. The ability of the TRMM and CRU data to reproduce in situ observations in the TRB is analyzed. All considered WRF simulations capture well the annual as well as the interannual and spatial distribution of precipitation in the TRB according to station data and the TRMM estimates. Our results demonstrate that the increase of horizontal resolution from 50 to 25 km, together with the use of the MODIS land use classification, significantly improves the precipitation results. In the case of temperature, spatial patterns and the seasonal cycle are well reproduced, although there is a systematic cold bias with respect to both station and CRU data. Our results contribute to the identification of suitable and regionally adapted regional climate models (RCMs) for East Africa.

  11. Cost and Performance Model for Photovoltaic Systems

    Science.gov (United States)

    Borden, C. S.; Smith, J. H.; Davisson, M. C.; Reiter, L. J.

    1986-01-01

    Lifetime cost and performance (LCP) model assists in assessment of design options for photovoltaic systems. LCP is simulation of performance, cost, and revenue streams associated with photovoltaic power systems connected to electric-utility grid. LCP provides user with substantial flexibility in specifying technical and economic environment of application.

  12. Evaluation of outbreak detection performance using multi-stream syndromic surveillance for influenza-like illness in rural Hubei Province, China: a temporal simulation model based on healthcare-seeking behaviors.

    Science.gov (United States)

    Fan, Yunzhou; Wang, Ying; Jiang, Hongbo; Yang, Wenwen; Yu, Miao; Yan, Weirong; Diwan, Vinod K; Xu, Biao; Dong, Hengjin; Palm, Lars; Nie, Shaofa

    2014-01-01

    Syndromic surveillance promotes the early detection of disease outbreaks. Although syndromic surveillance has increased in developing countries, performance on outbreak detection, particularly in cases of multi-stream surveillance, has scarcely been evaluated in rural areas. This study introduces a temporal simulation model based on healthcare-seeking behaviors to evaluate the performance of multi-stream syndromic surveillance for influenza-like illness. Data were obtained in six towns of rural Hubei Province, China, from April 2012 to June 2013. A Susceptible-Exposed-Infectious-Recovered model generated 27 scenarios of simulated influenza A (H1N1) outbreaks, which were converted into corresponding simulated syndromic datasets through the healthcare-seeking behaviors model. We then superimposed the converted syndromic datasets onto the baselines obtained to create the testing datasets. Outbreak detection performance of single-stream surveillance of clinic visits, frequency of over-the-counter drug purchases, and school absenteeism, and of multi-stream surveillance of their combinations, was evaluated using receiver operating characteristic curves and activity monitoring operation curves. In the six towns examined, clinic visit surveillance and school absenteeism surveillance exhibited better outbreak detection performance than over-the-counter drug purchase frequency surveillance; the performance of multi-stream surveillance was preferable to single-stream surveillance, particularly at low specificity (Sp < 90%). The temporal simulation model based on healthcare-seeking behaviors offers an accessible method for evaluating the performance of multi-stream surveillance.

  13. Simulated annealing model of acupuncture

    Science.gov (United States)

    Shang, Charles; Szu, Harold

    2015-05-01

    The growth control singularity model suggests that acupuncture points (acupoints) originate from organizers in embryogenesis. Organizers are singular points in growth control. Acupuncture can cause perturbation of a system with effects similar to simulated annealing. In a clinical trial, the goal of a treatment is to relieve a certain disorder, which corresponds to reaching a certain local optimum in simulated annealing. The self-organizing effect of the system is limited and related to the person's general health and age. Perturbation at acupoints can lead to a stronger local excitation (analogous to a higher annealing temperature) compared to perturbation at non-singular points (placebo control points). Such difference diminishes as the number of perturbed points increases, due to the wider distribution of the limited self-organizing activity. This model explains the following facts from systematic reviews of acupuncture trials: 1. Properly chosen single-acupoint treatment for certain disorders can lead to highly repeatable efficacy above placebo. 2. When multiple acupoints are used, the results can be highly repeatable if the patients are relatively healthy and young, but are usually mixed if the patients are old, frail and have multiple disorders at the same time, as the number of local optima or comorbidities increases. 3. As the number of acupoints used increases, the efficacy difference between sham and real acupuncture often diminishes. The model predicts that the efficacy of acupuncture is negatively correlated with disease chronicity, severity and the patient's age. This is the first biological-physical model of acupuncture which can predict and guide clinical acupuncture research.
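    For readers unfamiliar with the optimization analogy invoked above, a bare-bones simulated annealing loop looks like the following sketch; the toy energy landscape and cooling schedule are assumed for illustration and are not part of the paper.

    ```python
    # Hedged sketch (not from the paper): a bare-bones simulated annealing loop on a toy
    # one-dimensional energy landscape with several local minima.
    import math
    import random

    random.seed(0)

    def energy(x):
        """Assumed toy landscape: a broad parabola with superimposed wells."""
        return 0.1 * x ** 2 + math.sin(3.0 * x)

    x, t = 4.0, 2.0                         # initial state and initial "temperature"
    best_x, best_e = x, energy(x)
    while t > 1e-3:
        candidate = x + random.gauss(0.0, 0.5)            # local perturbation
        delta = energy(candidate) - energy(x)
        if delta < 0 or random.random() < math.exp(-delta / t):
            x = candidate                                 # Metropolis acceptance
        if energy(x) < best_e:
            best_x, best_e = x, energy(x)
        t *= 0.999                                        # geometric cooling schedule

    print(f"best state found: x = {best_x:.3f}, energy = {best_e:.3f}")
    ```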

  14. Performance engineering in the community atmosphere model

    International Nuclear Information System (INIS)

    Worley, P; Mirin, A; Drake, J; Sawyer, W

    2006-01-01

    The Community Atmosphere Model (CAM) is the atmospheric component of the Community Climate System Model (CCSM) and is the primary consumer of computer resources in typical CCSM simulations. Performance engineering has been an important aspect of CAM development throughout its existence. This paper briefly summarizes these efforts and their impacts over the past five years

  15. Nuclear reactor core modelling in multifunctional simulators

    International Nuclear Information System (INIS)

    Puska, E.K.

    1999-01-01

    studied to assess the possibilities for using three-dimensional cores in training simulators. The core model results have been compared with the Loviisa WWER-type plant measurement data in steady state and in some transients. Hypothetical control rod withdrawal, ejection and boron dilution transients have been calculated with various three-dimensional core models for the Loviisa WWER-440 core. Several ATWS analyses for the WWER-1000/91 plant have been performed using the three-dimensional core model. In this context, the results of APROS have been compared in detail with the results of the HEXTRAN code. The three-dimensional Olkiluoto BWR-type core model has been used for transient calculation and for severe accident re-criticality studies. The one-dimensional core model is at present used in several plant analyser and training simulator applications and it has been used extensively for safety analyses in the Loviisa WWER-440 plant modernisation project. (orig.)

  16. Nuclear reactor core modelling in multifunctional simulators

    Energy Technology Data Exchange (ETDEWEB)

    Puska, E.K. [VTT Energy, Nuclear Energy, Espoo (Finland)

    1999-06-01

    studied to assess the possibilities for using three-dimensional cores in training simulators. The core model results have been compared with the Loviisa WWER-type plant measurement data in steady state and in some transients. Hypothetical control rod withdrawal, ejection and boron dilution transients have been calculated with various three-dimensional core models for the Loviisa WWER-440 core. Several ATWS analyses for the WWER-1000/91 plant have been performed using the three-dimensional core model. In this context, the results of APROS have been compared in detail with the results of the HEXTRAN code. The three-dimensional Olkiluoto BWR-type core model has been used for transient calculation and for severe accident re-criticality studies. The one-dimensional core model is at present used in several plant analyser and training simulator applications and it has been used extensively for safety analyses in the Loviisa WWER-440 plant modernisation project. (orig.) 75 refs. The thesis includes also eight previous publications by author

  17. Simulating the Performance of Ground-Based Optical Asteroid Surveys

    Science.gov (United States)

    Christensen, Eric J.; Shelly, Frank C.; Gibbs, Alex R.; Grauer, Albert D.; Hill, Richard E.; Johnson, Jess A.; Kowalski, Richard A.; Larson, Stephen M.

    2014-11-01

    We are developing a set of asteroid survey simulation tools in order to estimate the capability of existing and planned ground-based optical surveys, and to test a variety of possible survey cadences and strategies. The survey simulator is composed of several layers, including a model population of solar system objects and an orbital integrator, a site-specific atmospheric model (including inputs for seeing, haze and seasonal cloud cover), a model telescope (with a complete optical path to estimate throughput), a model camera (including FOV, pixel scale, and focal plane fill factor) and model source extraction and moving object detection layers with tunable detection requirements. We have also developed a flexible survey cadence planning tool to automatically generate nightly survey plans. Inputs to the cadence planner include camera properties (FOV, readout time), telescope limits (horizon, declination, hour angle, lunar and zenithal avoidance), preferred and restricted survey regions in RA/Dec, ecliptic, and Galactic coordinate systems, and recent coverage by other asteroid surveys. Simulated surveys are created for a subset of current and previous NEO surveys (LINEAR, Pan-STARRS and the three Catalina Sky Survey telescopes), and compared against the actual performance of these surveys in order to validate the model’s performance. The simulator tracks objects within the FOV of any pointing that were not discovered (e.g. too few observations, too trailed, focal plane array gaps, too fast or slow), thus dividing the population into “discoverable” and “discovered” subsets, to inform possible survey design changes. Ongoing and future work includes generating a realistic “known” subset of the model NEO population, running multiple independent simulated surveys in coordinated and uncoordinated modes, and testing various cadences to find optimal strategies for detecting NEO sub-populations. These tools can also assist in quantifying the efficiency of novel

  18. Mathematical models and numerical simulation in electromagnetism

    CERN Document Server

    Bermúdez, Alfredo; Salgado, Pilar

    2014-01-01

    The book represents a basic support for a master course in electromagnetism oriented to numerical simulation. The main goal of the book is that the reader knows the boundary-value problems of partial differential equations that should be solved in order to perform computer simulation of electromagnetic processes. Moreover it includes a part devoted to electric circuit theory  based on ordinary differential equations. The book is mainly oriented to electric engineering applications, going from the general to the specific, namely, from the full Maxwell’s equations to the particular cases of electrostatics, direct current, magnetostatics and eddy currents models. Apart from standard exercises related to analytical calculus, the book includes some others oriented to real-life applications solved with MaxFEM free simulation software.

  19. Impulse pumping modelling and simulation

    International Nuclear Information System (INIS)

    Pierre, B; Gudmundsson, J S

    2010-01-01

    Impulse pumping is a new pumping method based on the propagation of pressure waves. Of particular interest is the application of impulse pumping to artificial lift situations, where fluid is transported from the wellbore to the wellhead using pressure waves generated at the wellhead. The motor-driven element of an impulse pumping apparatus is therefore located at the wellhead and can be separated from the flowline. Thus operation and maintenance of an impulse pump are facilitated. The paper describes the different elements of an impulse pumping apparatus, reviews the physical principles and details the modelling of the novel pumping method. Results from numerical simulations of pressure wave propagation in water-filled pipelines are then presented to illustrate the physical principles of impulse pumping and to validate the described modelling against experimental data.

  20. Simulation model of a PWR power plant

    International Nuclear Information System (INIS)

    Larsen, N.

    1987-03-01

    A simulation model of a hypothetical PWR power plant is described. A large number of disturbances and failures in plant function can be simulated. The model is written as seven modules to the modular simulation system for continuous processes DYSIM and serves also as a user example of this system. The model runs in Fortran 77 on the IBM-PC-AT. (author)

  1. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    Science.gov (United States)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
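
    The record above describes the approach only at a high level; as a rough illustration of the kind of discrete event simulation involved, the sketch below (standard-library Python, with made-up arrival and service rates and a fixed server pool standing in for the enterprise resource constraints) queues stochastic service requests against limited capacity. It is not the authors' model.

```python
import heapq
import random

def simulate(servers=4, arrival_rate=0.5, service_rate=0.2, horizon=10_000.0, seed=1):
    """Toy discrete event simulation: service requests arrive at random and
    compete for a fixed pool of servers (illustrative assumptions only)."""
    rng = random.Random(seed)
    events = [(rng.expovariate(arrival_rate), "arrival")]  # (time, kind) heap
    waiting = []            # arrival times of queued requests
    busy, completed, total_wait = 0, 0, 0.0
    while events:
        now, kind = heapq.heappop(events)
        if now > horizon:
            break
        if kind == "arrival":
            heapq.heappush(events, (now + rng.expovariate(arrival_rate), "arrival"))
            if busy < servers:
                busy += 1
                heapq.heappush(events, (now + rng.expovariate(service_rate), "departure"))
            else:
                waiting.append(now)
        else:  # a server finishes a request
            completed += 1
            if waiting:
                total_wait += now - waiting.pop(0)
                heapq.heappush(events, (now + rng.expovariate(service_rate), "departure"))
            else:
                busy -= 1
    return completed, total_wait / max(completed, 1)

print(simulate())
```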

  2. Simulation models generator. Applications in scheduling

    Directory of Open Access Journals (Sweden)

    Omar Danilo Castrillón

    2013-08-01

    Rev. Mate. Teor. Aplic. (ISSN 1409-2433), Vol. 20(2): 231–241, July 2013. A simulation model generator was developed to provide an approximation to reality for evaluating decisions so that they can be taken more assertively. To test the prototype, a production system with 9 machines and 5 jobs in a job shop configuration was used as the modelling example, with stochastic processing times and machine stoppages, measuring machine utilization rates and the average time of jobs in the system as measures of system performance. This test shows the usefulness of the prototype, which saves the user the work of building the simulation model.

  3. Nonlinear friction model for servo press simulation

    Science.gov (United States)

    Ma, Ninshu; Sugitomo, Nobuhiko; Kyuno, Takunori; Tamura, Shintaro; Naka, Tetsuo

    2013-12-01

    The friction coefficient was measured under an idealized condition for a pulse servo motion. The measured friction coefficient and its variation with both sliding distance and the pulse motion showed that the friction resistance can be reduced by re-lubrication during the unloading process of the pulse servo motion. Based on the measured friction coefficient and its changes with sliding distance and oil re-lubrication, a nonlinear friction model was developed. Using the newly developed nonlinear friction model, a deep drawing simulation was performed and the formability was evaluated. The results were compared with experimental ones and the effectiveness of the model was verified.
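
    The abstract gives neither the functional form nor the coefficients of the model; purely as an illustration of a friction coefficient that decays with sliding distance and partially recovers through re-lubrication during the unloading phase, one might sketch (all parameters hypothetical):

```python
import math

def friction_coefficient(slide_mm, mu0=0.15, mu_inf=0.05, decay=0.01):
    """Illustrative friction coefficient decaying with accumulated sliding
    distance (mm); mu0, mu_inf and decay are hypothetical, not measured values."""
    return mu_inf + (mu0 - mu_inf) * math.exp(-decay * slide_mm)

slide_since_lubrication = 0.0
for step, unloading in enumerate([False, False, True] * 5):
    if unloading:
        # re-lubrication during the unloading phase of the pulse servo motion
        slide_since_lubrication *= 0.3   # assumed partial recovery of the oil film
    else:
        slide_since_lubrication += 5.0   # assumed sliding increment per step, mm
    print(f"step {step}: mu = {friction_coefficient(slide_since_lubrication):.3f}")
```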

  4. Human Performance in Simulated Reduced Gravity Environments

    Science.gov (United States)

    Cowley, Matthew; Harvill, Lauren; Rajulu, Sudhakar

    2014-01-01

    NASA is currently designing a new space suit capable of working in deep space and on Mars. Designing a suit is very difficult and often requires trade-offs between performance, cost, mass, and system complexity. Our current understanding of human performance in reduced gravity in a planetary environment (the moon or Mars) is limited to lunar observations, studies from the Apollo program, and recent suit tests conducted at JSC using reduced gravity simulators. This study will look at our most recent reduced gravity simulations performed on the new Active Response Gravity Offload System (ARGOS) compared to the C-9 reduced gravity plane. Methods: Subjects ambulated in reduced gravity analogs to obtain a baseline for human performance. Subjects were tested in lunar gravity (1.6 m/s²) and Earth gravity (9.8 m/s²) in shirt-sleeves. Subjects ambulated over ground at prescribed speeds on the ARGOS, but ambulated at a self-selected speed on the C-9 due to time limitations. Subjects on the ARGOS were given over 3 minutes to acclimate to the different conditions before data was collected. Nine healthy subjects were tested in the ARGOS (6 males, 3 females, 79.5 +/- 15.7 kg), while six subjects were tested on the C-9 (6 males, 78.8 +/- 11.2 kg). Data was collected with an optical motion capture system (Vicon, Oxford, UK) and was analyzed using customized analysis scripts in BodyBuilder (Vicon, Oxford, UK) and MATLAB (MathWorks, Natick, MA, USA). Results: In all offloaded conditions, variation between subjects increased compared to 1-g. Kinematics in the ARGOS at lunar gravity resembled earth gravity ambulation more closely than the C-9 ambulation. Toe-off occurred 10% earlier in both reduced gravity environments compared to earth gravity, shortening the stance phase. Likewise, ankle, knee, and hip angles remained consistently flexed and had reduced peaks compared to earth gravity. Ground reaction forces in lunar gravity (normalized to Earth body weight) were 0.4 +/- 0.2 on

  5. On the performance simulation of inter-stage turbine reheat

    International Nuclear Information System (INIS)

    Pellegrini, Alvise; Nikolaidis, Theoklis; Pachidis, Vassilios; Köhler, Stephan

    2017-01-01

    Highlights: • An innovative gas turbine performance simulation methodology is proposed. • It allows DP and OD performance calculations to be performed for complex engine layouts. • It is essential for inter-turbine reheat (ITR) engine performance calculation. • A detailed description is provided for fast and flexible implementation. • The methodology is successfully verified against a commercial closed-source code. - Abstract: Several authors have suggested the implementation of reheat in high By-Pass Ratio (BPR) aero engines, to improve engine performance. In contrast to military afterburning, civil aero engines would aim at reducing Specific Fuel Consumption (SFC) by introducing ‘Inter-stage Turbine Reheat’ (ITR). To maximise benefits, the second combustor should be placed at an early stage of the expansion process, e.g. between the first and second High-Pressure Turbine (HPT) stages. The aforementioned cycle design requires the accurate simulation of two or more turbine stages on the same shaft. The Design Point (DP) performance can be easily evaluated by defining a Turbine Work Split (TWS) ratio between the turbine stages. However, the performance simulation of Off-Design (OD) operating points requires the calculation of the TWS parameter for every OD step, taking into account the thermodynamic behaviour of each turbine stage, represented by their respective maps. No analytical solution of the aforementioned problem is currently available in the public domain. This paper presents an analytical methodology by which ITR can be simulated at DP and OD. Results show excellent agreement with a commercial, closed-source performance code; discrepancies range from 0% to 3.48%, and are ascribed to the different gas models implemented in the codes.
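
    For orientation only, the design-point bookkeeping implied by a Turbine Work Split ratio can be sketched as below; the TWS definition (first-stage over second-stage work) and the numbers are assumptions, and the off-design iteration against stage maps described in the paper is not reproduced.

```python
CP_GAS = 1148.0     # J/(kg K), assumed specific heat of the combustion gas

def stage_works(total_work, tws):
    """Split the total specific turbine work between two HPT stages for a given
    Turbine Work Split ratio, taken here as TWS = W1 / W2 (an assumption)."""
    w2 = total_work / (1.0 + tws)
    w1 = total_work - w2
    return w1, w2

w_total = 400e3                      # J/kg, hypothetical combined HPT specific work
w1, w2 = stage_works(w_total, tws=1.2)
dT1, dT2 = w1 / CP_GAS, w2 / CP_GAS  # corresponding total-temperature drops, K
print(w1, w2, dT1, dT2)
```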

  6. Modeling and simulation of pressurized water reactor power plant

    International Nuclear Information System (INIS)

    Wang, S.J.

    1983-01-01

    Two kinds of balance of plant (BOP) models of a pressurized water reactor (PWR) system are developed in this work - the detailed BOP model and the simple BOP model. The detailed model is used to simulate the normal operational performance of a whole BOP system. The simple model is used to combine with the NSSS model for a whole plant simulation. The trends of the steady state values of the detailed model are correct and the dynamic responses are reasonable. The simple BOP model approach starts the modelling work from the overall point of view. The response of the normalized turbine power and the feedwater inlet temperature to the steam generator of the simple model are compared with those of the detailed model. Both the steady state values and the dynamic responses are close to those of the detailed model. The simple BOP model is found adequate to represent the main performance of the BOP system. The simple balance of plant model was coupled with a NSSS model for a whole plant simulation. The NSSS model consists of the reactor core model, the steam generator model, and the coolant temperature control system. A closed loop whole plant simulation for an electric load perturbation was performed. The results are plausible. The coupling effect between the NSSS system and the BOP system was analyzed. The feedback of the BOP system has little effect on the steam generator performance, while the performance of the BOP system is strongly affected by the steam flow rate from the NSSS

  7. Galaxy Alignments: Theory, Modelling & Simulations

    Science.gov (United States)

    Kiessling, Alina; Cacciato, Marcello; Joachimi, Benjamin; Kirk, Donnacha; Kitching, Thomas D.; Leonard, Adrienne; Mandelbaum, Rachel; Schäfer, Björn Malte; Sifón, Cristóbal; Brown, Michael L.; Rassat, Anais

    2015-11-01

    The shapes of galaxies are not randomly oriented on the sky. During the galaxy formation and evolution process, environment has a strong influence, as tidal gravitational fields in the large-scale structure tend to align nearby galaxies. Additionally, events such as galaxy mergers affect the relative alignments of both the shapes and angular momenta of galaxies throughout their history. These "intrinsic galaxy alignments" are known to exist, but are still poorly understood. This review will offer a pedagogical introduction to the current theories that describe intrinsic galaxy alignments, including the apparent difference in intrinsic alignment between early- and late-type galaxies and the latest efforts to model them analytically. It will then describe the ongoing efforts to simulate intrinsic alignments using both N-body and hydrodynamic simulations. Due to the relative youth of this field, there is still much to be done to understand intrinsic galaxy alignments and this review summarises the current state of the field, providing a solid basis for future work.

  8. High performance ultrasonic field simulation on complex geometries

    Science.gov (United States)

    Chouh, H.; Rougeron, G.; Chatillon, S.; Iehl, J. C.; Farrugia, J. P.; Ostromoukhov, V.

    2016-02-01

    Ultrasonic field simulation is a key ingredient for the design of new testing methods as well as a crucial step for NDT inspection simulation. As presented in a previous paper [1], CEA-LIST has worked on the acceleration of these simulations focusing on simple geometries (planar interfaces, isotropic materials). In this context, significant accelerations were achieved on multicore processors and GPUs (Graphics Processing Units), bringing the execution time of realistic computations in the 0.1 s range. In this paper, we present recent works that aim at similar performances on a wider range of configurations. We adapted the physical model used by the CIVA platform to design and implement a new algorithm providing a fast ultrasonic field simulation that yields nearly interactive results for complex cases. The improvements over the CIVA pencil-tracing method include adaptive strategies for pencil subdivisions to achieve a good refinement of the sensor geometry while keeping a reasonable number of ray-tracing operations. Also, interpolation of the times of flight was used to avoid time consuming computations in the impulse response reconstruction stage. To achieve the best performance, our algorithm runs on multi-core superscalar CPUs and uses high performance specialized libraries such as Intel Embree for ray-tracing, Intel MKL for signal processing and Intel TBB for parallelization. We validated the simulation results by comparing them to the ones produced by CIVA on identical test configurations including mono-element and multiple-element transducers, homogeneous, meshed 3D CAD specimens, isotropic and anisotropic materials and wave paths that can involve several interactions with interfaces. We show performance results on complete simulations that achieve computation times in the 1s range.

  9. Management of Industrial Performance Indicators: Regression Analysis and Simulation

    Directory of Open Access Journals (Sweden)

    Walter Roberto Hernandez Vergara

    2017-11-01

    Full Text Available Stochastic methods can be used in problem solving and explanation of natural phenomena through the application of statistical procedures. The article aims to associate the regression analysis and systems simulation, in order to facilitate the practical understanding of data analysis. The algorithms were developed in Microsoft Office Excel software, using statistical techniques such as regression theory, ANOVA and Cholesky Factorization, which made it possible to create models of single and multiple systems with up to five independent variables. For the analysis of these models, the Monte Carlo simulation and analysis of industrial performance indicators were used, resulting in numerical indices that aim to improve the goals’ management for compliance indicators, by identifying systems’ instability, correlation and anomalies. The analytical models presented in the survey indicated satisfactory results with numerous possibilities for industrial and academic applications, as well as the potential for deployment in new analytical techniques.
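
    The combination of Cholesky factorization and Monte Carlo sampling mentioned above can be illustrated with a short NumPy sketch that draws correlated indicator values from an assumed covariance matrix; the matrix, compliance targets and sample size are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

# Assumed covariance between two industrial performance indicators
cov = np.array([[4.0, 2.4],
                [2.4, 9.0]])
mean = np.array([50.0, 120.0])

L = np.linalg.cholesky(cov)          # Cholesky factor, cov = L @ L.T
z = rng.standard_normal((10_000, 2)) # independent standard normal draws
samples = mean + z @ L.T             # correlated Monte Carlo samples

# Fraction of simulated periods meeting both (hypothetical) compliance targets
compliant = np.mean((samples[:, 0] >= 48.0) & (samples[:, 1] >= 115.0))
print(np.corrcoef(samples.T)[0, 1], compliant)
```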

  10. Simulations of KSTAR high performance steady state operation scenarios

    International Nuclear Information System (INIS)

    Na, Yong-Su; Kessel, C.E.; Park, J.M.; Yi, Sumin; Kim, J.Y.; Becoulet, A.; Sips, A.C.C.

    2009-01-01

    We report the results of predictive modelling of high performance steady state operation scenarios in KSTAR. Firstly, the capabilities of steady state operation are investigated with time-dependent simulations using a free-boundary plasma equilibrium evolution code coupled with transport calculations. Secondly, the reproducibility of high performance steady state operation scenarios developed in the DIII-D tokamak, of similar size to that of KSTAR, is investigated using the experimental data taken from DIII-D. Finally, the capability of ITER-relevant steady state operation is investigated in KSTAR. It is found that KSTAR is able to establish high performance steady state operation scenarios: β_N above 3, H_98(y,2) up to 2.0, f_BS up to 0.76 and f_NI equal to 1.0. In this work, a realistic density profile is newly introduced for predictive simulations by employing the scaling law of a density peaking factor. The influence of the current ramp-up scenario and the transport model is discussed with respect to the fusion performance and non-inductive current drive fraction in the transport simulations. As observed in the experiments, both the heating and the plasma current waveforms in the current ramp-up phase produce a strong effect on the q-profile, the fusion performance and also on the non-inductive current drive fraction in the current flattop phase. A criterion in terms of q_min is found to establish ITER-relevant steady state operation scenarios. This will provide a guideline for designing the current ramp-up phase in KSTAR. It is observed that the transport model also affects the predictive values of fusion performance as well as the non-inductive current drive fraction. The Weiland transport model predicts the highest fusion performance as well as non-inductive current drive fraction in KSTAR. In contrast, the GLF23 model exhibits the lowest ones. ITER-relevant advanced scenarios cannot be obtained with the GLF23 model in the conditions given in this work.

  11. Performance evaluation of sea surface simulation methods for target detection

    Science.gov (United States)

    Xia, Renjie; Wu, Xin; Yang, Chen; Han, Yiping; Zhang, Jianqi

    2017-11-01

    With the fast development of sea surface target detection by optoelectronic sensors, machine learning has been adopted to improve the detection performance. Many features can be learned from training images by machines automatically. However, field images of sea surface target are not sufficient as training data. 3D scene simulation is a promising method to address this problem. For ocean scene simulation, sea surface height field generation is the key point to achieve high fidelity. In this paper, two spectra-based height field generation methods are evaluated. Comparison between the linear superposition and linear filter method is made quantitatively with a statistical model. 3D ocean scene simulating results show the different features between the methods, which can give reference for synthesizing sea surface target images with different ocean conditions.
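
    A one-dimensional sketch of the linear superposition method is given below: harmonic components with amplitudes set by a wave spectrum (here a Pierson-Moskowitz form, chosen as an assumption since the paper does not state which spectrum it uses) and random phases are summed to obtain a height field.

```python
import numpy as np

def pierson_moskowitz(omega, wind_speed=10.0, g=9.81):
    """Pierson-Moskowitz spectrum S(omega) in its classic one-parameter form."""
    alpha, beta = 8.1e-3, 0.74
    return alpha * g**2 / omega**5 * np.exp(-beta * (g / (wind_speed * omega))**4)

rng = np.random.default_rng(0)
omega = np.linspace(0.3, 3.0, 200)           # angular frequencies, rad/s
d_omega = omega[1] - omega[0]
amplitude = np.sqrt(2.0 * pierson_moskowitz(omega) * d_omega)
phase = rng.uniform(0.0, 2.0 * np.pi, omega.size)
k = omega**2 / 9.81                          # deep-water dispersion relation

x = np.linspace(0.0, 500.0, 1000)            # metres along one horizontal axis
t = 0.0
height = np.sum(amplitude[:, None] * np.cos(k[:, None] * x[None, :]
                                            - omega[:, None] * t + phase[:, None]),
                axis=0)
print(height.std())                          # spread of the simulated surface elevation
```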

  12. THE MARK I BUSINESS SYSTEM SIMULATION MODEL

    Science.gov (United States)

    of a large-scale business simulation model as a vehicle for doing research in management controls. The major results of the program were the development of the Mark I business simulation model and the Simulation Package (SIMPAC). SIMPAC is a method and set of programs facilitating the construction of large simulation models. The object of this document is to describe the Mark I Corporation model, state why parts of the business were modeled as they were, and indicate the research applications of the model. (Author)

  13. Artificial neural network simulation of battery performance

    Energy Technology Data Exchange (ETDEWEB)

    O'Gorman, C.C.; Ingersoll, D.; Jungst, R.G.; Paez, T.L.

    1998-12-31

    Although they appear deceptively simple, batteries embody a complex set of interacting physical and chemical processes. While the discrete engineering characteristics of a battery, such as the physical dimensions of the individual components, are relatively straightforward to define explicitly, their myriad chemical and physical processes, including interactions, are much more difficult to accurately represent. Within this category are the diffusive and solubility characteristics of individual species, reaction kinetics and mechanisms of primary chemical species as well as intermediates, and growth and morphology characteristics of reaction products as influenced by environmental and operational use profiles. For this reason, development of analytical models that can consistently predict the performance of a battery has only been partially successful, even though significant resources have been applied to this problem. As an alternative approach, the authors have begun development of a non-phenomenological model for battery systems based on artificial neural networks. Both recurrent and non-recurrent forms of these networks have been successfully used to develop accurate representations of battery behavior. The connectionist normalized linear spline (CNLS) network has been implemented with a self-organizing layer to model a battery system with the generalized radial basis function net. Concurrently, efforts are under way to use the feedforward back propagation network to map the "state" of a battery system. Because of the complexity of battery systems, accurate representation of the input and output parameters has proven to be very important. This paper describes these initial feasibility studies as well as the current models and makes comparisons between predicted and actual performance.
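
    The abstract does not give the network details; the following generic radial basis function regression only illustrates the general idea of fitting battery behaviour with localized basis functions. The data, centres and widths are fabricated, and this is not the CNLS implementation.

```python
import numpy as np

def rbf_design(x, centres, width):
    """Gaussian radial basis functions evaluated at the inputs x."""
    return np.exp(-((x[:, None] - centres[None, :]) ** 2) / (2.0 * width ** 2))

rng = np.random.default_rng(3)
# Fabricated "discharge" data: cell voltage versus depth of discharge
dod = np.linspace(0.0, 1.0, 60)
voltage = 2.1 - 0.3 * dod - 0.4 * dod ** 6 + 0.01 * rng.standard_normal(dod.size)

centres = np.linspace(0.0, 1.0, 12)
Phi = rbf_design(dod, centres, width=0.08)
weights, *_ = np.linalg.lstsq(Phi, voltage, rcond=None)   # least-squares fit

predicted = rbf_design(np.array([0.25, 0.75]), centres, width=0.08) @ weights
print(predicted)
```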

  14. Modeling and simulation of gamma camera

    International Nuclear Information System (INIS)

    Singh, B.; Kataria, S.K.; Samuel, A.M.

    2002-08-01

    Simulation techniques play a vital role in the design of sophisticated instruments and also in the training of operating and maintenance staff. Gamma camera systems have been used for functional imaging in nuclear medicine. Functional images are derived from the external counting of a gamma emitting radioactive tracer that, after introduction into the body, mimics the behavior of the native biochemical compound. The position sensitive detector yields the coordinates of the gamma ray interaction with the detector, which are used to estimate the point of gamma ray emission within the tracer distribution space. This advanced imaging device is thus dependent on the performance of algorithms for coordinate computation, estimation of the point of emission, generation of the image and display of the image data. Contemporary systems also have protocols for quality control and clinical evaluation of imaging studies. Simulation of this processing leads to an understanding of the basic camera design problems. This report describes a PC based package for the design and simulation of a gamma camera, along with options for simulating data acquisition and quality control of imaging studies. Image display and data processing, the other options implemented in SIMCAM, will be described in separate reports (under preparation). Gamma camera modeling and simulation in SIMCAM has preset configurations of the design parameters for various sizes of crystal detector, with the option to pack the PMTs on a hexagonal or square lattice. Different algorithms for computation of coordinates and spatial distortion removal are allowed, in addition to simulation of the energy correction circuit. The user can simulate different static, dynamic, MUGA and SPECT studies. The acquired/simulated data are processed for quality control and clinical evaluation of the imaging studies. Results show that the program can be used to assess these performances. Also the variations in performance parameters can be assessed due to the induced
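
    The classical coordinate-computing step in a gamma camera is Anger logic, in which the interaction position is estimated as the signal-weighted centroid of the photomultiplier tube outputs. A minimal sketch with a fabricated PMT layout is shown below; SIMCAM's actual algorithms may differ.

```python
import numpy as np

def anger_position(pmt_xy, pmt_signals):
    """Estimate the gamma interaction position as the signal-weighted
    centroid of the PMT outputs (basic Anger logic, no distortion correction)."""
    total = pmt_signals.sum()
    return (pmt_xy * pmt_signals[:, None]).sum(axis=0) / total

# Fabricated hexagonal-like PMT positions (cm) and signals for one event
pmt_xy = np.array([[0.0, 0.0], [5.0, 0.0], [-5.0, 0.0],
                   [2.5, 4.3], [-2.5, 4.3], [2.5, -4.3], [-2.5, -4.3]])
signals = np.array([120.0, 40.0, 20.0, 35.0, 15.0, 30.0, 10.0])
print(anger_position(pmt_xy, signals))
```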

  15. A New Model for Simulating TSS Washoff in Urban Areas

    Directory of Open Access Journals (Sweden)

    E. Crobeddu

    2011-01-01

    Full Text Available This paper presents the formulation and validation of the conceptual Runoff Quality Simulation Model (RQSM that was developed to simulate the erosion and transport of solid particles in urban areas. The RQSM assumes that solid particle accumulation on pervious and impervious areas is infinite. The RQSM simulates soil erosion using rainfall kinetic energy and solid particle transport with linear system theory. A sensitivity analysis was conducted on the RQSM to show the influence of each parameter on the simulated load. Total suspended solid (TSS loads monitored at the outlet of the borough of Verdun in Canada and at three catchment outlets of the City of Champaign in the United States were used to validate the RQSM. TSS loads simulated by the RQSM were compared to measured loads and to loads simulated by the Rating Curve model and the Exponential model of the SWMM software. The simulation performance of the RQSM was comparable to the Exponential and Rating Curve models.

  16. Modeling and Simulation of a 12 MW Wind Farm

    Directory of Open Access Journals (Sweden)

    GROZA, V.

    2010-05-01

    Full Text Available The installation of wind turbines in power systems has developed rapidly through the last 20 years. In this paper a complete simulation model of a wind farm consisting of 6 x 2 MW wind turbines is presented, using data from a wind farm installed in Denmark. A model of the wind turbine with a cage-rotor induction generator is presented in detail. A set of simulations is performed, showing that it is possible to simulate a complete wind farm from the wind to the grid. The simulation tool can also be used to simulate bigger wind farms connected to the grid.
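
    For orientation only, the steady-state output of one such 2 MW turbine can be approximated with the standard power-curve relation P = 0.5 rho A Cp v^3 capped at rated power; the rotor size, Cp and cut-in/cut-out speeds below are assumptions, not data from the Danish wind farm.

```python
import numpy as np

def turbine_power(v, rotor_diameter=80.0, cp=0.45, rated_kw=2000.0,
                  cut_in=3.5, cut_out=25.0, rho=1.225):
    """Approximate steady-state power (kW) of one turbine from wind speed v (m/s)."""
    area = np.pi * (rotor_diameter / 2.0) ** 2
    p_kw = 0.5 * rho * area * cp * v ** 3 / 1000.0
    return np.where((v < cut_in) | (v > cut_out), 0.0, np.minimum(p_kw, rated_kw))

wind = np.array([4.0, 8.0, 12.0, 16.0])
farm_output = 6 * turbine_power(wind)      # six identical 2 MW turbines
print(farm_output)
```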

  17. Simulation modeling for quality and productivity in steel cord manufacturing

    OpenAIRE

    Türkseven, Can Hulusi; Turkseven, Can Hulusi; Ertek, Gürdal; Ertek, Gurdal

    2003-01-01

    We describe the application of simulation modeling to estimate and improve quality and productivity performance of a steel cord manufacturing system. We describe the typical steel cord manufacturing plant, emphasize its distinguishing characteristics, identify various production settings and discuss applicability of simulation as a management decision support tool. Besides presenting the general structure of the developed simulation model, we focus on wire fractures, which can be an important...

  18. Distributed simulation a model driven engineering approach

    CERN Document Server

    Topçu, Okan; Oğuztüzün, Halit; Yilmaz, Levent

    2016-01-01

    Backed by substantive case studies, the novel approach to software engineering for distributed simulation outlined in this text demonstrates the potent synergies between model-driven techniques, simulation, intelligent agents, and computer systems development.

  19. Benchmark simulation models, quo vadis?

    Science.gov (United States)

    Jeppsson, U; Alex, J; Batstone, D J; Benedetti, L; Comas, J; Copp, J B; Corominas, L; Flores-Alsina, X; Gernaey, K V; Nopens, I; Pons, M-N; Rodríguez-Roda, I; Rosen, C; Steyer, J-P; Vanrolleghem, P A; Volcke, E I P; Vrecko, D

    2013-01-01

    As the work of the IWA Task Group on Benchmarking of Control Strategies for wastewater treatment plants (WWTPs) is coming to an end, it is essential to disseminate the knowledge gained. For this reason, all authors of the IWA Scientific and Technical Report on benchmarking have come together to provide their insights, highlighting areas where knowledge may still be deficient and where new opportunities are emerging, and to propose potential avenues for future development and application of the general benchmarking framework and its associated tools. The paper focuses on the topics of temporal and spatial extension, process modifications within the WWTP, the realism of models, control strategy extensions and the potential for new evaluation tools within the existing benchmark system. We find that there are major opportunities for application within all of these areas, either from existing work already being done within the context of the benchmarking simulation models (BSMs) or applicable work in the wider literature. Of key importance is increasing capability, usability and transparency of the BSM package while avoiding unnecessary complexity.

  20. Fully Coupled Simulation of Lithium Ion Battery Cell Performance

    Energy Technology Data Exchange (ETDEWEB)

    Trembacki, Bradley L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Murthy, Jayathi Y. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Roberts, Scott Alan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    Lithium-ion battery particle-scale (non-porous electrode) simulations applied to resolved electrode geometries predict localized phenomena and can lead to better informed decisions on electrode design and manufacturing. This work develops and implements a fully-coupled finite volume methodology for the simulation of the electrochemical equations in a lithium-ion battery cell. The model implementation is used to investigate 3D battery electrode architectures that offer potential energy density and power density improvements over traditional layer-by-layer particle bed battery geometries. Advancement of micro-scale additive manufacturing techniques has made it possible to fabricate these 3D electrode microarchitectures. A variety of 3D battery electrode geometries are simulated and compared across various battery discharge rates and length scales in order to quantify performance trends and investigate geometrical factors that improve battery performance. The energy density and power density of the 3D battery microstructures are compared in several ways, including a uniform surface area to volume ratio comparison as well as a comparison requiring a minimum manufacturable feature size. Significant performance improvements over traditional particle bed electrode designs are observed, and electrode microarchitectures derived from minimal surfaces are shown to be superior. A reduced-order volume-averaged porous electrode theory formulation for these unique 3D batteries is also developed, allowing simulations on the full-battery scale. Electrode concentration gradients are modeled using the diffusion length method, and results for plate and cylinder electrode geometries are compared to particle-scale simulation results. Additionally, effective diffusion lengths that minimize error with respect to particle-scale results for gyroid and Schwarz P electrode microstructures are determined.

  1. Animal performance simulated by the Pampa Corte model compared with experimental records

    Directory of Open Access Journals (Sweden)

    Naíme de Barcellos Trevisan

    2009-02-01

    Full Text Available This study had the objective of evaluating the reliability of the Pampa Corte model in predicting beef cattle performance in grazing systems. For this purpose, the values predicted by the model were compared with data available in the literature. Correlation coefficients above 90% were obtained between real and simulated data in all tested alternatives. The model's database needs to be enlarged with forage productivity data for different climatic conditions. The qualitative parameters crude protein degradability and neutral detergent fibre of the black oat and Italian ryegrass mixture still need to be investigated, as does animal performance on pure pastures of oat or Italian ryegrass.

  2. Simulation modelling of fynbos ecosystems: Systems analysis and conceptual models

    CSIR Research Space (South Africa)

    Kruger, FJ

    1985-03-01

    Full Text Available -animal interactions. An additional two models, which expand aspects of the FYNBOS model, are described: a model for simulating canopy processes; and a Fire Recovery Simulator. The canopy process model will simulate ecophysiological processes in more detail than FYNBOS...

  3. Crystal and molecular simulation of high-performance polymers.

    Science.gov (United States)

    Colquhoun, H M; Williams, D J

    2000-03-01

    Single-crystal X-ray analyses of oligomeric models for high-performance aromatic polymers, interfaced to computer-based molecular modeling and diffraction simulation, have enabled the determination of a range of previously unknown polymer crystal structures from X-ray powder data. Materials which have been successfully analyzed using this approach include aromatic polyesters, polyetherketones, polythioetherketones, polyphenylenes, and polycarboranes. Pure macrocyclic homologues of noncrystalline polyethersulfones afford high-quality single crystals-even at very large ring sizes-and have provided the first examples of a "protein crystallographic" approach to the structures of conventionally amorphous synthetic polymers.

  4. The new rosetta targets observations, simulations and instrument performances

    CERN Document Server

    Epifani, Elena; Palumbo, Pasquale

    2004-01-01

    The Rosetta mission was successfully launched on March 2nd, 2004 for a rendezvous with the short-period comet 67P/Churyumov-Gerasimenko in 2014. The new baseline mission also foresees a double fly-by with asteroids 21 Lutetia and 2867 Steins, on the way towards the primary target. This volume collects papers presented at the workshop on "The NEW Rosetta targets: Observations, simulations and instrument performances", held in Capri on October 13-15, 2003. The papers cover the fields of observations of the new Rosetta targets, laboratory experiments and theoretical simulation of cometary processes, and the expected performances of the Rosetta experiments. Until real operations around 67P/Churyumov-Gerasimenko start in 10 years from now, new astronomical observations, laboratory experiments and theoretical models are required. The goals are to increase knowledge about the physics and chemistry of comets and to prepare to exploit the Rosetta data to the fullest.

  5. Numerical simulation investigation on centrifugal compressor performance of turbocharger

    International Nuclear Information System (INIS)

    Li, Jie; Yin, Yuting; Li, Shuqi; Zhang, Jizhong

    2013-01-01

    In this paper, the mathematical model of the flow field in the centrifugal compressor of a turbocharger was studied. Based on the theory of computational fluid dynamics (CFD), performance curves and parameter distributions of the compressor were obtained from 3-D numerical simulation using CFX. Meanwhile, the influences of grid number and distribution on compressor performance were investigated, and the numerical calculation method was analyzed and validated by comparison with test data. The results obtained show that increasing the grid number has little influence on compressor performance once the single-passage grid number is above 300,000. The results also show that the numerically calculated mass flow rate at the compressor choke condition is in good agreement with the test results, and that the maximum difference in diffuser exit pressure between simulation and experiment decreases to 3.5% with the assumption of a 6 kPa additional total pressure loss at the compressor inlet. The numerical simulation method in this paper can be used to predict compressor performance: the difference in total pressure ratio between calculation and test is less than 7%, and the total-to-total efficiency is also in good agreement with the test.

  6. Numerical simulation investigation on centrifugal compressor performance of turbocharger

    Energy Technology Data Exchange (ETDEWEB)

    Li, Jie [China Iron and Steel Research Institute Group, Beijing (China); Yin, Yuting [China North Engine Research Institute, Datong (China); Li, Shuqi; Zhang, Jizhong [Science and Technology Diesel Engine Turbocharging Laboratory, Datong (China)

    2013-06-15

    In this paper, the mathematical model of the flow field in the centrifugal compressor of a turbocharger was studied. Based on the theory of computational fluid dynamics (CFD), performance curves and parameter distributions of the compressor were obtained from 3-D numerical simulation using CFX. Meanwhile, the influences of grid number and distribution on compressor performance were investigated, and the numerical calculation method was analyzed and validated by comparison with test data. The results obtained show that increasing the grid number has little influence on compressor performance once the single-passage grid number is above 300,000. The results also show that the numerically calculated mass flow rate at the compressor choke condition is in good agreement with the test results, and that the maximum difference in diffuser exit pressure between simulation and experiment decreases to 3.5% with the assumption of a 6 kPa additional total pressure loss at the compressor inlet. The numerical simulation method in this paper can be used to predict compressor performance: the difference in total pressure ratio between calculation and test is less than 7%, and the total-to-total efficiency is also in good agreement with the test.

  7. MODELLING, SIMULATING AND OPTIMIZING BOILERS

    DEFF Research Database (Denmark)

    Sørensen, K.; Condra, T.; Houbak, Niels

    2003-01-01

    , and the total stress level (i.e. stresses introduced due to internal pressure plus stresses introduced due to temperature gradients) must always be kept below the allowable stress level. In this way, the increased water-/steam space that should allow for better dynamic performance in the end causes limited freedom with respect to dynamic operation of the plant. By means of an objective function including both the price of the plant and a quantification of the value of dynamic operation of the plant, an optimization is carried out. The dynamic model of the boiler plant is applied to define parts

  8. An introduction to enterprise modeling and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Ostic, J.K.; Cannon, C.E. [Los Alamos National Lab., NM (United States). Technology Modeling and Analysis Group

    1996-09-01

    As part of an ongoing effort to continuously improve productivity, quality, and efficiency of both industry and Department of Energy enterprises, Los Alamos National Laboratory is investigating various manufacturing and business enterprise simulation methods. A number of enterprise simulation software models are being developed to enable engineering analysis of enterprise activities. In this document the authors define the scope of enterprise modeling and simulation efforts, and review recent work in enterprise simulation at Los Alamos National Laboratory as well as at other industrial, academic, and research institutions. References of enterprise modeling and simulation methods and a glossary of enterprise-related terms are provided.

  9. Simulation and Modeling Methodologies, Technologies and Applications

    CERN Document Server

    Filipe, Joaquim; Kacprzyk, Janusz; Pina, Nuno

    2014-01-01

    This book includes extended and revised versions of a set of selected papers from the 2012 International Conference on Simulation and Modeling Methodologies, Technologies and Applications (SIMULTECH 2012) which was sponsored by the Institute for Systems and Technologies of Information, Control and Communication (INSTICC) and held in Rome, Italy. SIMULTECH 2012 was technically co-sponsored by the Society for Modeling & Simulation International (SCS), GDR I3, Lionphant Simulation, Simulation Team and IFIP and held in cooperation with AIS Special Interest Group of Modeling and Simulation (AIS SIGMAS) and the Movimento Italiano Modellazione e Simulazione (MIMOS).

  10. Neurocognitive Correlates of Young Drivers' Performance in a Driving Simulator.

    Science.gov (United States)

    Guinosso, Stephanie A; Johnson, Sara B; Schultheis, Maria T; Graefe, Anna C; Bishai, David M

    2016-04-01

    Differences in neurocognitive functioning may contribute to driving performance among young drivers. However, few studies have examined this relation. This pilot study investigated whether common neurocognitive measures were associated with driving performance among young drivers in a driving simulator. Young drivers (mean age 19.8 years, standard deviation [SD] = 1.9; N = 74) participated in a battery of neurocognitive assessments measuring general intellectual capacity (Full-Scale Intelligence Quotient, FSIQ) and executive functioning, including the Stroop Color-Word Test (cognitive inhibition), Wisconsin Card Sort Test-64 (cognitive flexibility), and Attention Network Task (alerting, orienting, and executive attention). Participants then drove in a simulated vehicle under two conditions: a baseline and a driving challenge. During the driving challenge, participants completed a verbal working memory task to increase demand on executive attention. Multiple regression models were used to evaluate the relations between the neurocognitive measures and driving performance under the two conditions. FSIQ, cognitive inhibition, and alerting were associated with better driving performance at baseline. FSIQ and cognitive inhibition were also associated with better driving performance during the verbal challenge. Measures of cognitive flexibility, orienting, and conflict executive control were not associated with driving performance under either condition. FSIQ and, to some extent, measures of executive function are associated with driving performance in a driving simulator. Further research is needed to determine if executive function is associated with more advanced driving performance under conditions that demand greater cognitive load. Copyright © 2016 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.
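
    The statistical analysis described above, regressing a driving performance score on neurocognitive predictors, can be reproduced in miniature with ordinary least squares; the data below are fabricated purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 74
fsiq = rng.normal(105, 12, n)                 # fabricated predictor scores
stroop = rng.normal(50, 9, n)
driving = 0.04 * fsiq + 0.02 * stroop + rng.normal(0, 1, n)   # fabricated outcome

X = np.column_stack([np.ones(n), fsiq, stroop])   # intercept + predictors
beta, *_ = np.linalg.lstsq(X, driving, rcond=None)
residuals = driving - X @ beta
r_squared = 1.0 - residuals.var() / driving.var()
print(beta, r_squared)
```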

  11. Principles of Sonar Performance Modeling

    NARCIS (Netherlands)

    Ainslie, M.A.

    2010-01-01

    Sonar performance modelling (SPM) is concerned with the prediction of quantitative measures of sonar performance, such as probability of detection. It is a multidisciplinary subject, requiring knowledge and expertise in the disparate fields of underwater acoustics, acoustical oceanography, sonar

  12. A physiological production model for cacao : results of model simulations

    NARCIS (Netherlands)

    Zuidema, P.A.; Leffelaar, P.A.

    2002-01-01

    CASE2 is a physiological model for cocoa (Theobroma cacao L.) growth and yield. This report introduces the CAcao Simulation Engine for water-limited production in a non-technical way and presents simulation results obtained with the model.

  13. A Simulation Model Articulation of the REA Ontology

    Science.gov (United States)

    Laurier, Wim; Poels, Geert

    This paper demonstrates how the REA enterprise ontology can be used to construct simulation models for business processes, value chains and collaboration spaces in supply chains. These models support various high-level and operational management simulation applications, e.g. the analysis of enterprise sustainability and day-to-day planning. First, the basic constructs of the REA ontology and the ExSpect modelling language for simulation are introduced. Second, collaboration space, value chain and business process models and their conceptual dependencies are shown, using the ExSpect language. Third, an exhibit demonstrates the use of value chain models in predicting the financial performance of an enterprise.

  14. Performance Analysis of Wavelet Channel Coding in COST207-based Channel Models on Simulated Radio-over-Fiber Systems at the W-Band

    DEFF Research Database (Denmark)

    Cavalcante, Lucas Costa Pereira; Silveira, Luiz F. Q.; Rommel, Simon

    2016-01-01

    Millimeter wave communications based on photonic technologies have gained increased attention as a way to provide optic fiber-like capacity in wireless environments. However, the new hybrid fiber-wireless channel presents new challenges in terms of signal transmission performance analysis. Traditionally, such systems use diversity schemes in combination with digital signal processing (DSP) techniques to overcome effects such as fading and inter-symbol interference (ISI). Wavelet Channel Coding (WCC) has emerged as a technique to minimize the fading effects of wireless channels, which are a major challenge in systems operating in the millimeter wave regime. This work takes WCC one step further by evaluating its performance in terms of bit error probability over time-varying, frequency-selective multipath Rayleigh fading channels. The adopted propagation model follows the COST207 norm, the main international
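
    The evaluation metric used in the paper, bit error probability over Rayleigh fading, can be illustrated without the wavelet code itself: the Monte Carlo estimate below is for plain uncoded BPSK over flat Rayleigh fading and serves only as the kind of baseline a coding scheme would be compared against (parameters are arbitrary).

```python
import numpy as np

rng = np.random.default_rng(0)

def bpsk_rayleigh_ber(snr_db, n_bits=200_000):
    """Monte Carlo BER of uncoded BPSK over flat Rayleigh fading with
    perfect channel knowledge (coherent detection)."""
    snr = 10.0 ** (snr_db / 10.0)
    bits = rng.integers(0, 2, n_bits)
    symbols = 2.0 * bits - 1.0
    h = (rng.standard_normal(n_bits) + 1j * rng.standard_normal(n_bits)) / np.sqrt(2.0)
    noise = (rng.standard_normal(n_bits) + 1j * rng.standard_normal(n_bits)) / np.sqrt(2.0 * snr)
    received = h * symbols + noise
    detected = (np.real(np.conj(h) * received) > 0).astype(int)
    return np.mean(detected != bits)

for snr_db in (0, 10, 20):
    gamma = 10.0 ** (snr_db / 10.0)
    analytic = 0.5 * (1.0 - np.sqrt(gamma / (1.0 + gamma)))   # textbook Rayleigh BER
    print(snr_db, bpsk_rayleigh_ber(snr_db), analytic)
```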

  15. Architectural and growth traits differ in effects on performance of clonal plants: an analysis using a field-parameterized simulation model

    Czech Academy of Sciences Publication Activity Database

    Wildová, Radka; Gough, L.; Herben, Tomáš; Hershock, Ch.; Goldberg, D. E.

    2007-01-01

    Roč. 116, č. 5 (2007), s. 836-852 ISSN 0030-1299 R&D Projects: GA ČR(CZ) GA206/02/0953; GA ČR(CZ) GA206/02/0578 Grant - others:NSF(US) DEB99-74296; NSF(US) DEB99-74284 Institutional research plan: CEZ:AV0Z60050516 Keywords : individual-based model * performance * plant architecture * competitive response * resource allocation Subject RIV: EF - Botanics Impact factor: 3.136, year: 2007

  16. Simulations, evaluations and models. Vol. 1

    International Nuclear Information System (INIS)

    Brehmer, B.; Leplat, J.

    1992-01-01

    Papers presented at the Fourth MOHAWC (Models of Human Activities in Work Context) workshop. The general theme was simulations, evaluations and models. The emphasis was on time in relation to the modelling of human activities in modern, high-tech work. Such work often requires people to control dynamic systems, and the behaviour and misbehaviour of these systems over time is a principal focus of work in, for example, a modern process plant. The papers report on microworlds and on their innovative uses, both in experiments and in a new form of use: testing a program which performs diagnostic reasoning. They present new perspectives on the problem of time in process control, showing the importance of considering the time scales of dynamic tasks, both in individual decision making and in distributed decision making, and providing new formalisms, both for the representation of time and for reasoning about time in diagnosis. (AB)

  17. Plasma disruption modeling and simulation

    International Nuclear Information System (INIS)

    Hassanein, A.

    1994-01-01

    Disruptions in tokamak reactors are considered a limiting factor to successful operation and reliable design. The behavior of plasma-facing components during a disruption is critical to the overall integrity of the reactor. Erosion of plasma-facing material (PFM) surfaces due to the thermal energy dump during the disruption can severely limit the lifetime of these components and thus diminish the economic feasibility of the reactor. A comprehensive understanding of the interplay of various physical processes during a disruption is essential for determining component lifetime and potentially improving the performance of such components. There are three principal stages in modeling the behavior of PFM during a disruption. Initially, the incident plasma particles will deposit their energy directly on the PFM surface, heating it to a very high temperature where ablation occurs. Models for plasma-material interactions have been developed and used to predict material thermal evolution during the disruption. Within a few microseconds after the start of the disruption, enough material is vaporized to intercept most of the incoming plasma particles. Models for plasma-vapor interactions are necessary to predict vapor cloud expansion and hydrodynamics. Continuous heating of the vapor cloud above the material surface by the incident plasma particles will excite, ionize, and cause vapor atoms to emit thermal radiation. Accurate models for radiation transport in the vapor are essential for calculating the net radiated flux to the material surface, which determines the final erosion thickness and consequently component lifetime. A comprehensive model that takes into account the various stages of plasma-material interaction has been developed and used to predict erosion rates during reactor disruptions, as well as during induced disruptions in laboratory experiments

  18. Architecture oriented modeling and simulation method for combat mission profile

    Directory of Open Access Journals (Sweden)

    CHEN Xia

    2017-05-01

    Full Text Available In order to effectively analyze the system behavior and system performance of a combat mission profile, an architecture-oriented modeling and simulation method is proposed. Starting from architecture modeling, this paper describes the mission profile based on the definition from the National Military Standard of China and the US Department of Defense Architecture Framework (DoDAF) model, and constructs the architecture model of the mission profile. Then the transformation relationship between the architecture model and the agent simulation model is proposed to form the mission profile executable model. At last, taking the air-defense mission profile as an example, the agent simulation model is established based on the architecture model, and the input and output relations of the simulation model are analyzed. It provides method guidance for combat mission profile design.

  19. Indium (In) Effects to The Efficiency Performance of Ga1-XInxP/GaAs Based Solar Cell Using Silvaco Software Modelling & Simulation

    Science.gov (United States)

    Norizan, M. N.; Zahari, S. M.; Mohamad, I. S.; Osman, R. A. M.; Shahimin, M. M.; Murad, S. A. Z.

    2017-06-01

    The Ga1-xInxP composition has been applied to the top cell of multi-junction GaInP/GaAs based solar cells, which have currently achieved a conversion efficiency of more than 46%; however, its capability is not yet fully understood. We performed an analysis using the Silvaco simulation method to evaluate the effect of In, substituting it into Ga1-xInxP over the range of x from 0 to 1. We found that the highest efficiency recorded was 17.66%, obtained when the Indium composition was x=1. The efficiency increased by about 11.71% from x=0 to x=1 In content. As the In composition was raised, the efficiency and the short-circuit current density Jsc also became higher (13.60 mA/cm2), owing to greater photon absorption associated with the band gap energy. In addition, Voc, Pmax, Vmax, Imax and the fill factor were measured to be 2.15 V, 2.44 mW/cm2, 2.0 V, 1.22 mA/cm2 and 83.34, respectively. In conclusion, this study confirms that the presence of In in Ga1-xInxP improves the solar cell efficiency by attaining a higher energy gap and producing more electrons, for the best performance in multilayer solar cell applications.
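
    The quoted figures of merit follow from the standard relations FF = (Vmax x Jmax)/(Voc x Jsc) and efficiency = (Voc x Jsc x FF)/Pin; a small helper with placeholder numbers (not the paper's exact values) is sketched below.

```python
def solar_cell_metrics(voc, jsc, vmax, jmax, p_in=100.0):
    """Fill factor and efficiency from the standard one-sun relations.
    Voltages in V, current densities in mA/cm2, p_in in mW/cm2 (AM1.5G ~ 100)."""
    p_max = vmax * jmax                    # maximum power point, mW/cm2
    ff = p_max / (voc * jsc)
    efficiency = 100.0 * p_max / p_in      # percent
    return ff, efficiency

# Placeholder operating point for illustration only
print(solar_cell_metrics(voc=1.40, jsc=14.0, vmax=1.20, jmax=13.0))
```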

  20. On Improving 4-km Mesoscale Model Simulations

    Science.gov (United States)

    Deng, Aijun; Stauffer, David R.

    2006-03-01

    A previous study showed that use of analysis-nudging four-dimensional data assimilation (FDDA) and improved physics in the fifth-generation Pennsylvania State University National Center for Atmospheric Research Mesoscale Model (MM5) produced the best overall performance on a 12-km-domain simulation, based on the 18 19 September 1983 Cross-Appalachian Tracer Experiment (CAPTEX) case. However, reducing the simulated grid length to 4 km had detrimental effects. The primary cause was likely the explicit representation of convection accompanying a cold-frontal system. Because no convective parameterization scheme (CPS) was used, the convective updrafts were forced on coarser-than-realistic scales, and the rainfall and the atmospheric response to the convection were too strong. The evaporative cooling and downdrafts were too vigorous, causing widespread disruption of the low-level winds and spurious advection of the simulated tracer. In this study, a series of experiments was designed to address this general problem involving 4-km model precipitation and gridpoint storms and associated model sensitivities to the use of FDDA, planetary boundary layer (PBL) turbulence physics, grid-explicit microphysics, a CPS, and enhanced horizontal diffusion. Some of the conclusions include the following: 1) Enhanced parameterized vertical mixing in the turbulent kinetic energy (TKE) turbulence scheme has shown marked improvements in the simulated fields. 2) Use of a CPS on the 4-km grid improved the precipitation and low-level wind results. 3) Use of the Hong and Pan Medium-Range Forecast PBL scheme showed larger model errors within the PBL and a clear tendency to predict much deeper PBL heights than the TKE scheme. 4) Combining observation-nudging FDDA with a CPS produced the best overall simulations. 5) Finer horizontal resolution does not always produce better simulations, especially in convectively unstable environments, and a new CPS suitable for 4-km resolution is needed. 6

  1. Network Modeling and Simulation A Practical Perspective

    CERN Document Server

    Guizani, Mohsen; Khan, Bilal

    2010-01-01

    Network Modeling and Simulation is a practical guide to using modeling and simulation to solve real-life problems. The authors give a comprehensive exposition of the core concepts in modeling and simulation, and then systematically address the many practical considerations faced by developers in modeling complex large-scale systems. The authors provide examples from computer and telecommunication networks and use these to illustrate the process of mapping generic simulation concepts to domain-specific problems in different industries and disciplines. Key features: Provides the tools and strate

  2. Modelling, simulating and optimizing boiler heating surfaces and evaporator circuits

    DEFF Research Database (Denmark)

    Sørensen, K.; Condra, T.; Houbak, Niels

    2003-01-01

    A model for optimizing the dynamic performance of a boiler has been developed. Design variables related to the size of the boiler and its dynamic performance have been defined. The objective function to be optimized takes the weight of the boiler and its dynamic capability into account. As constraints for the optimization, a dynamic model for the boiler is applied. Furthermore, a function for the value of the dynamic performance is included in the model. The dynamic models for simulating boiler performance consist of a model for the flue gas side, a model for the evaporator circuit and a model for the drum. The dynamic model has been developed for the purpose of determining boiler material temperatures and heat transfer from the flue gas side to the water-/steam side, in order to simulate the circulation in the evaporator circuit and hereby the water level fluctuations in the drum. The dynamic model has been

  3. Systematic simulations of modified gravity: chameleon models

    International Nuclear Information System (INIS)

    Brax, Philippe; Davis, Anne-Christine; Li, Baojiu; Winther, Hans A.; Zhao, Gong-Bo

    2013-01-01

    In this work we systematically study the linear and nonlinear structure formation in chameleon theories of modified gravity, using a generic parameterisation which describes a large class of models using only 4 parameters. For this we have modified the N-body simulation code ecosmog to perform a total of 65 simulations for different models and parameter values, including the default ΛCDM. These simulations enable us to explore a significant portion of the parameter space. We have studied the effects of modified gravity on the matter power spectrum and mass function, and found a rich and interesting phenomenology where the difference with the ΛCDM paradigm cannot be reproduced by a linear analysis even on scales as large as k ∼ 0.05 hMpc⁻¹, since the latter incorrectly assumes that the modification of gravity depends only on the background matter density. Our results show that the chameleon screening mechanism is significantly more efficient than other mechanisms such as the dilaton and symmetron, especially in high-density regions and at early times, and can serve as a guidance to determine the parts of the chameleon parameter space which are cosmologically interesting and thus merit further studies in the future

  4. Systematic simulations of modified gravity: chameleon models

    Energy Technology Data Exchange (ETDEWEB)

    Brax, Philippe [Institut de Physique Theorique, CEA, IPhT, CNRS, URA 2306, F-91191Gif/Yvette Cedex (France); Davis, Anne-Christine [DAMTP, Centre for Mathematical Sciences, University of Cambridge, Wilberforce Road, Cambridge CB3 0WA (United Kingdom); Li, Baojiu [Institute for Computational Cosmology, Department of Physics, Durham University, Durham DH1 3LE (United Kingdom); Winther, Hans A. [Institute of Theoretical Astrophysics, University of Oslo, 0315 Oslo (Norway); Zhao, Gong-Bo, E-mail: philippe.brax@cea.fr, E-mail: a.c.davis@damtp.cam.ac.uk, E-mail: baojiu.li@durham.ac.uk, E-mail: h.a.winther@astro.uio.no, E-mail: gong-bo.zhao@port.ac.uk [Institute of Cosmology and Gravitation, University of Portsmouth, Portsmouth PO1 3FX (United Kingdom)

    2013-04-01

    In this work we systematically study the linear and nonlinear structure formation in chameleon theories of modified gravity, using a generic parameterisation which describes a large class of models using only 4 parameters. For this we have modified the N-body simulation code ecosmog to perform a total of 65 simulations for different models and parameter values, including the default ΛCDM. These simulations enable us to explore a significant portion of the parameter space. We have studied the effects of modified gravity on the matter power spectrum and mass function, and found a rich and interesting phenomenology where the difference with the ΛCDM paradigm cannot be reproduced by a linear analysis even on scales as large as k ∼ 0.05 h Mpc⁻¹, since the latter incorrectly assumes that the modification of gravity depends only on the background matter density. Our results show that the chameleon screening mechanism is significantly more efficient than other mechanisms such as the dilaton and symmetron, especially in high-density regions and at early times, and can serve as guidance to determine the parts of the chameleon parameter space which are cosmologically interesting and thus merit further study in the future.

  5. High-Performance Beam Simulator for the LANSCE Linac

    International Nuclear Information System (INIS)

    Pang, Xiaoying; Rybarcyk, Lawrence J.; Baily, Scott A.

    2012-01-01

    A high performance multiparticle tracking simulator is currently under development at Los Alamos. The heart of the simulator is based upon the beam dynamics simulation algorithms of the PARMILA code, but implemented in C++ on Graphics Processing Unit (GPU) hardware using NVIDIA's CUDA platform. Linac operating set points are provided to the simulator via the EPICS control system so that changes of the real time linac parameters are tracked and the simulation results updated automatically. This simulator will provide valuable insight into the beam dynamics along a linac in pseudo real-time, especially where direct measurements of the beam properties do not exist. Details regarding the approach, benefits and performance are presented.
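
    The record above describes GPU-accelerated multiparticle tracking. As a rough illustration of the kind of per-particle update that vectorises well (and therefore maps naturally onto GPUs), the following NumPy sketch applies a simple drift-kick map to a large particle ensemble; it is not the LANSCE/PARMILA algorithm, and all parameters are invented for the example.

    ```python
    # Minimal sketch (not the LANSCE simulator): a vectorised drift-kick tracker
    # in NumPy illustrating the kind of per-particle update that maps well to GPUs.
    import numpy as np

    def track(z, dpp, n_cells, cell_len=0.05, kick=1e-4, rng=None):
        """Propagate longitudinal coordinates (z [m], dp/p) through n_cells
        identical cells: a drift followed by a small phase-dependent kick plus noise."""
        rng = rng or np.random.default_rng(0)
        for _ in range(n_cells):
            z = z + cell_len * dpp                  # drift: position slips with momentum error
            dpp = dpp + kick * np.sin(2 * np.pi * z) + 1e-6 * rng.standard_normal(z.size)
        return z, dpp

    # 100k macro-particles with a Gaussian initial distribution (illustrative numbers)
    z0 = np.random.default_rng(1).normal(0.0, 1e-3, 100_000)
    d0 = np.random.default_rng(2).normal(0.0, 1e-4, 100_000)
    zf, df = track(z0, d0, n_cells=100)
    print("final rms z [mm]:", 1e3 * zf.std())
    ```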

  6. A study for production simulation model generation system based on data model at a shipyard

    Directory of Open Access Journals (Sweden)

    Myung-Gi Back

    2016-09-01

    Full Text Available Simulation technology is a type of shipbuilding product lifecycle management solution used to support production planning or decision-making. Normally, most shipbuilding processes consist of job shop production, and the modeling and simulation require professional skills and experience in shipbuilding. For these reasons, many shipbuilding companies have difficulties adopting simulation systems, regardless of the necessity for the technology. In this paper, the data model for shipyard production simulation model generation was defined by analyzing the iterative simulation modeling procedure. The shipyard production simulation data model defined in this study contains the information necessary for the conventional simulation modeling procedure and can serve as a basis for simulation model generation. The efficacy of the developed system was validated by applying it to the simulation model generation of a panel block production line. By implementing the initial simulation model generation process, which in the past had to be performed by a simulation modeler, the proposed system substantially reduced the modeling time. In addition, by reducing the difficulties posed by different modeler-dependent generation methods, the proposed system makes standardization of simulation model quality possible.

  7. Far-Field Acoustic Power Level and Performance Analyses of F31/A31 Open Rotor Model at Simulated Scaled Takeoff, Nominal Takeoff, and Approach Conditions: Technical Report I

    Science.gov (United States)

    Sree, Dave

    2015-01-01

    Far-field acoustic power level and performance analyses of open rotor model F31/A31 have been performed to determine its noise characteristics at simulated scaled takeoff, nominal takeoff, and approach flight conditions. The nonproprietary parts of the data obtained from experiments in the 9- by 15-Foot Low-Speed Wind Tunnel (9×15 LSWT) tests were provided by NASA Glenn Research Center to perform the analyses. The tone and broadband noise components have been separated from raw test data by using a new data analysis tool. Results in terms of sound pressure levels, acoustic power levels, and their variations with rotor speed, angle of attack, thrust, and input shaft power have been presented and discussed. The effect of an upstream pylon on the noise levels of the model has been addressed. Empirical equations relating the model's acoustic power level, thrust, and input shaft power have been developed. The far-field acoustic efficiency of the model is also determined for various simulated flight conditions. It is intended that the results presented in this work will serve as a database for comparison and improvement of other open rotor blade designs and also for validating open rotor noise prediction codes.

  8. Predictive neuromechanical simulations indicate why walking performance declines with ageing.

    Science.gov (United States)

    Song, Seungmoon; Geyer, Hartmut

    2018-04-01

    Although the natural decline in walking performance with ageing affects the quality of life of a growing elderly population, its physiological origins remain unknown. By using predictive neuromechanical simulations of human walking with age-related neuro-musculo-skeletal changes, we find evidence that the loss of muscle strength and muscle contraction speed dominantly contribute to the reduced walking economy and speed. The findings imply that focusing on recovering these muscular changes may be the only effective way to improve performance in elderly walking. More generally, the work is of interest for investigating the physiological causes of altered gait due to age, injury and disorders. Healthy elderly people walk slower and energetically less efficiently than young adults. This decline in walking performance lowers the quality of life for a growing ageing population, and understanding its physiological origin is critical for devising interventions that can delay or revert it. However, the origin of the decline in walking performance remains unknown, as ageing produces a range of physiological changes whose individual effects on gait are difficult to separate in experiments with human subjects. Here we use a predictive neuromechanical model to separately address the effects of common age-related changes to the skeletal, muscular and nervous systems. We find in computer simulations of this model that the combined changes produce gait consistent with elderly walking and that mainly the loss of muscle strength and mass reduces energy efficiency. In addition, we find that the slower preferred walking speed of elderly people emerges in the simulations when adapting to muscle fatigue, again mainly caused by muscle-related changes. The results suggest that a focus on recovering these muscular changes may be the only effective way to improve performance in elderly walking. © 2018 The Authors. The Journal of Physiology © 2018 The Physiological Society.

  9. Modelling and simulation of a heat exchanger

    Science.gov (United States)

    Xia, Lei; Deabreu-Garcia, J. Alex; Hartley, Tom T.

    1991-01-01

    Two models for two different control systems are developed for a parallel heat exchanger. First, a good approximate model of high system order is produced by spatially lumping a heat exchanger model. Model reduction techniques are then applied to obtain low-order models that are suitable for dynamic analysis and control design. The simulation method is discussed to ensure valid simulation results.

  10. Performance analyses of naval ships based on engineering level of simulation at the initial design stage

    Directory of Open Access Journals (Sweden)

    Dong-Hoon Jeong

    2017-07-01

    Full Text Available Naval ships are assigned many and varied missions. Their performance is critical for mission success, and depends on the specifications of the components. This is why performance analyses of naval ships are required at the initial design stage. Since the design and construction of naval ships take a very long time and incur a huge cost, Modeling and Simulation (M & S) is an effective method for performance analyses. Thus in this study, a simulation core is proposed to analyze the performance of naval ships considering their specifications. This simulation core can perform engineering-level simulations, considering mathematical models for naval ships such as maneuvering equations and passive sonar equations. Also, the simulation models of the simulation core follow the Discrete EVent system Specification (DEVS) and Discrete Time System Specification (DTSS) formalisms, so that simulations can progress over discrete events and discrete times. In addition, applying the DEVS and DTSS formalisms makes the structure of the simulation models flexible and reusable. To verify the applicability of this simulation core, it was applied to simulations for the performance analyses of a submarine in an Anti-SUrface Warfare (ASUW) mission. These simulations were composed of two scenarios. The first scenario, submarine diving, addressed maneuvering performance by analyzing the pitch angle variation and depth variation of the submarine over time. The second scenario, submarine detection, addressed detection performance by analyzing how well the sonar of the submarine resolves adjacent targets. The results of these simulations ensure that the simulation core of this study could be applied to the performance analyses of naval ships considering their specifications.
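
    The abstract above relies on the DEVS formalism. The following plain-Python sketch illustrates the DEVS atomic-model structure (time advance, internal/external transitions, output function) with a toy sonar example; all names and numbers are illustrative assumptions, not the paper's simulation core.

    ```python
    # Hedged sketch of a DEVS-style atomic model in plain Python (all names are
    # illustrative): a sonar that pings at a fixed interval and reports a
    # detection when the stored contact lies within range.
    import math

    class SonarDEVS:
        def __init__(self, ping_interval=10.0, detect_range=5000.0):
            self.ping_interval = ping_interval
            self.detect_range = detect_range
            self.contact = None          # (x, y) of the latest contact, if any

        def time_advance(self):          # ta(): time until the next internal event
            return self.ping_interval

        def external_transition(self, contact_xy):   # delta_ext: receive a contact update
            self.contact = contact_xy

        def internal_transition(self):   # delta_int: the ping fires; state unchanged here
            pass

        def output(self, own_xy):        # lambda: emitted at each internal event
            if self.contact is None:
                return None
            dist = math.dist(own_xy, self.contact)
            return {"detected": dist <= self.detect_range, "range_m": dist}

    sonar = SonarDEVS()
    sonar.external_transition((3000.0, 1000.0))
    print(sonar.output((0.0, 0.0)))   # detection report at the next ping
    ```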

  11. Modeling and simulation of large HVDC systems

    Energy Technology Data Exchange (ETDEWEB)

    Jin, H.; Sood, V.K.

    1993-01-01

    This paper addresses the complexity and amount of work involved in preparing simulation data and implementing various converter control schemes, and the excessive simulation time, in the modelling and simulation of large HVDC systems. The Power Electronic Circuit Analysis program (PECAN) is used to address these problems, and a large HVDC system with two dc links is simulated using PECAN. A benchmark HVDC system is studied to compare the simulation results with those from other packages. The simulation time and results are provided in the paper.

  12. Characterising performance of environmental models

    NARCIS (Netherlands)

    Bennett, N.D.; Croke, B.F.W.; Guariso, G.; Guillaume, J.H.A.; Hamilton, S.H.; Jakeman, A.J.; Marsili-Libelli, S.; Newham, L.T.H.; Norton, J.; Perrin, C.; Pierce, S.; Robson, B.; Seppelt, R.; Voinov, A.; Fath, B.D.; Andreassian, V.

    2013-01-01

    In order to use environmental models effectively for management and decision-making, it is vital to establish an appropriate level of confidence in their performance. This paper reviews techniques available across various fields for characterising the performance of environmental models with focus

  13. Modeling lift operations with SAS® Simulation Studio

    Science.gov (United States)

    Kar, Leow Soo

    2016-10-01

    Lifts or elevators are an essential part of multistorey buildings, providing vertical transportation for their occupants. In large high-rise apartment buildings the occupants are permanent, while in buildings such as hospitals or office blocks the occupants are temporary users who come in to work or to visit, so the populations of such buildings are much higher than those of residential apartments. It is common these days for large office blocks or hospitals to have at least 8 to 10 lifts serving their populations. In order to optimize the level of service performance, different transportation schemes are devised to control the lift operations. For example, one lift may be assigned to solely service the even floors and another solely the odd floors, etc. In this paper, a basic lift system is modelled using SAS Simulation Studio to study the effect of factors such as the number of floors, the capacity of the lift car, the arrival and exit rates of passengers at each floor, and peak and off-peak periods on the system performance. The simulation is applied to a real lift operation in Sunway College's North Building to validate the model.
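
    As a concrete illustration of the kind of discrete-event lift model described above, the sketch below simulates a single lift serving Poisson arrivals at the ground floor and estimates the mean passenger wait; it is a simplified stand-in, not the SAS Simulation Studio model, and all rates and capacities are assumed values.

    ```python
    # Minimal illustrative sketch: a single lift with a fixed round-trip time
    # picks up waiting passengers (up to capacity) each time it returns.
    import random

    def simulate_lift(sim_time=3600.0, arrival_rate=0.2, capacity=10,
                      round_trip=60.0, seed=42):
        rng = random.Random(seed)
        t, next_arrival = 0.0, rng.expovariate(arrival_rate)
        queue, waits = [], []           # arrival times of waiting passengers
        while t < sim_time:
            # collect passengers arriving before the lift next returns
            while next_arrival < t + round_trip and next_arrival < sim_time:
                queue.append(next_arrival)
                next_arrival += rng.expovariate(arrival_rate)
            t += round_trip             # lift returns to the ground floor
            boarding, queue = queue[:capacity], queue[capacity:]
            waits.extend(t - a for a in boarding)
        return sum(waits) / len(waits) if waits else 0.0

    print("mean wait (s):", round(simulate_lift(), 1))
    ```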

  14. Optical modeling and simulation of thin-film photovoltaic devices

    CERN Document Server

    Krc, Janez

    2013-01-01

    In wafer-based and thin-film photovoltaic (PV) devices, the management of light is a crucial aspect of optimization since trapping sunlight in active parts of PV devices is essential for efficient energy conversions. Optical modeling and simulation enable efficient analysis and optimization of the optical situation in optoelectronic and PV devices. Optical Modeling and Simulation of Thin-Film Photovoltaic Devices provides readers with a thorough guide to performing optical modeling and simulations of thin-film solar cells and PV modules. It offers insight on examples of existing optical models

  15. Solar power plant performance evaluation: simulation and experimental validation

    International Nuclear Information System (INIS)

    Natsheh, E M; Albarbar, A

    2012-01-01

    In this work the performance of a solar power plant is evaluated based on a developed model comprising a photovoltaic array, battery storage, a controller and converters. The model is implemented using the MATLAB/SIMULINK software package. A perturb and observe (P&O) algorithm is used for maximizing the generated power based on a maximum power point tracker (MPPT) implementation. The outcomes of the developed model are validated and supported by a case study carried out using an operational 28.8 kW grid-connected solar power plant located in central Manchester. Measurements were taken over a 21-month period, using hourly average irradiance and cell temperature. It was found that system degradation could be clearly monitored by determining the residual (the difference) between the output power predicted by the model and the actual measured power. The residual exceeded the healthy threshold of 1.7 kW due to heavy snow in Manchester last winter. More importantly, the developed performance evaluation technique could be adopted to detect other causes of degraded PV panel performance, such as shading and dirt. Repeatability and reliability of the developed system were validated during this period. Good agreement was achieved between the theoretical simulation and the real-time measurements taken from the online grid-connected solar power plant.
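
    The perturb-and-observe (P&O) MPPT rule mentioned in the abstract can be sketched in a few lines. The P-V curve below is a toy stand-in for a real array model (not the authors' MATLAB/SIMULINK implementation), so all numbers are purely illustrative.

    ```python
    # Hedged sketch of the perturb-and-observe (P&O) hill-climbing rule:
    # perturb the operating voltage, and reverse direction if power drops.
    def pv_power(v):                      # toy P-V curve with a maximum near 27 V
        i = max(0.0, 8.0 * (1.0 - (v / 36.0) ** 7))
        return v * i

    def perturb_and_observe(v0=20.0, step=0.5, iters=60):
        v, p = v0, pv_power(v0)
        direction = +1.0
        for _ in range(iters):
            v_new = v + direction * step
            p_new = pv_power(v_new)
            if p_new < p:                 # power dropped: reverse the perturbation
                direction = -direction
            v, p = v_new, p_new
        return v, p

    v_mpp, p_mpp = perturb_and_observe()
    print(f"operating point ~{v_mpp:.1f} V, {p_mpp:.1f} W")
    ```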

  16. Solar power plant performance evaluation: simulation and experimental validation

    Science.gov (United States)

    Natsheh, E. M.; Albarbar, A.

    2012-05-01

    In this work the performance of a solar power plant is evaluated based on a developed model comprising a photovoltaic array, battery storage, a controller and converters. The model is implemented using the MATLAB/SIMULINK software package. A perturb and observe (P&O) algorithm is used for maximizing the generated power based on a maximum power point tracker (MPPT) implementation. The outcomes of the developed model are validated and supported by a case study carried out using an operational 28.8 kW grid-connected solar power plant located in central Manchester. Measurements were taken over a 21-month period, using hourly average irradiance and cell temperature. It was found that system degradation could be clearly monitored by determining the residual (the difference) between the output power predicted by the model and the actual measured power. The residual exceeded the healthy threshold of 1.7 kW due to heavy snow in Manchester last winter. More importantly, the developed performance evaluation technique could be adopted to detect other causes of degraded PV panel performance, such as shading and dirt. Repeatability and reliability of the developed system were validated during this period. Good agreement was achieved between the theoretical simulation and the real-time measurements taken from the online grid-connected solar power plant.
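
    The residual-based degradation check described above reduces to comparing model-predicted and measured power against a threshold (the abstract quotes 1.7 kW as the healthy threshold). The sketch below shows that check on made-up data; it is an illustration, not the authors' code.

    ```python
    # Flag degradation when model-predicted power exceeds measured power by more
    # than the threshold. The sample data below are hypothetical.
    THRESHOLD_KW = 1.7

    def residual_alarms(predicted_kw, measured_kw, threshold=THRESHOLD_KW):
        """Return the indices of samples whose residual exceeds the threshold."""
        return [k for k, (p, m) in enumerate(zip(predicted_kw, measured_kw))
                if (p - m) > threshold]

    predicted = [21.4, 22.0, 20.8, 19.5, 18.9]   # hypothetical model output
    measured  = [21.0, 21.6, 18.7, 19.2, 16.8]   # hypothetical plant measurement
    print("alarm at samples:", residual_alarms(predicted, measured))
    ```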

  17. Assessing performance and validating finite element simulations using probabilistic knowledge

    Energy Technology Data Exchange (ETDEWEB)

    Dolin, Ronald M.; Rodriguez, E. A. (Edward A.)

    2002-01-01

    Two probabilistic approaches for assessing performance are presented. The first approach assesses the probability of failure by simultaneously modeling all likely events. The probability that each event causes failure, along with the event's likelihood of occurrence, contributes to the overall probability of failure. The second assessment method is based on stochastic sampling using an influence diagram. Latin-hypercube sampling is used to stochastically assess events. The overall probability of failure is taken as the maximum probability of failure of all the events. The Likelihood of Occurrence simulation suggests failure does not occur, while the Stochastic Sampling approach predicts failure. The Likelihood of Occurrence results are used to validate finite element predictions.
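
    Latin-hypercube sampling, used in the second assessment method above, stratifies each input variable into equal-probability bins and permutes the bins independently. A minimal pure-Python sampler (illustrative only, not the authors' tooling) follows.

    ```python
    # Minimal Latin-hypercube sampler: each of d variables is split into n
    # equal-probability bins, the bins are shuffled, and one point is drawn per bin.
    import random

    def latin_hypercube(n, d, seed=0):
        rng = random.Random(seed)
        cols = []
        for _ in range(d):
            bins = list(range(n))
            rng.shuffle(bins)
            cols.append([(b + rng.random()) / n for b in bins])   # stratified U(0,1)
        return [[cols[j][i] for j in range(d)] for i in range(n)]

    for row in latin_hypercube(n=5, d=3):
        print([round(x, 3) for x in row])
    ```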

  18. Modeling and Simulation of Low Voltage Arcs

    NARCIS (Netherlands)

    Ghezzi, L.; Balestrero, A.

    2010-01-01

    Modeling and Simulation of Low Voltage Arcs is an attempt to improve the physical understanding, mathematical modeling and numerical simulation of the electric arcs that are found during current interruptions in low voltage circuit breakers. An empirical description is gained by refined electrical

  19. Critical review of glass performance modeling

    International Nuclear Information System (INIS)

    Bourcier, W.L.

    1994-07-01

    Borosilicate glass is to be used for permanent disposal of high-level nuclear waste in a geologic repository. Mechanistic chemical models are used to predict the rate at which radionuclides will be released from the glass under repository conditions. The most successful and useful of these models link reaction path geochemical modeling programs with a glass dissolution rate law that is consistent with transition state theory. These models have been used to simulate several types of short-term laboratory tests of glass dissolution and to predict the long-term performance of the glass in a repository. Although mechanistically based, the current models are limited by a lack of unambiguous experimental support for some of their assumptions. The most severe problem of this type is the lack of an existing validated mechanism that controls long-term glass dissolution rates. Current models can be improved by performing carefully designed experiments and using the experimental results to validate the rate-controlling mechanisms implicit in the models. These models should be supported with long-term experiments to be used for model validation. The mechanistic basis of the models should be explored by using modern molecular simulations such as molecular orbital and molecular dynamics to investigate both the glass structure and its dissolution process
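
    The transition-state-theory-consistent rate laws referred to above are typically written as a kinetic prefactor multiplied by a chemical-affinity term that vanishes at saturation. The sketch below shows a generic rate law of that form; the functional form is a common choice in the literature, and every constant is a placeholder rather than a value from this review.

    ```python
    # Hedged sketch of a generic TST-consistent (affinity-based) dissolution rate law:
    # rate = k0 * exp(-Ea/(R*T)) * a_H**n * (1 - Q/K). All constants are placeholders.
    import math

    R = 8.314  # J/(mol K)

    def dissolution_rate(T_K, pH, Q_over_K, k0=1.0e4, Ea=80e3, n=-0.4):
        """Glass dissolution rate (arbitrary units per m^2 per s)."""
        a_H = 10.0 ** (-pH)                        # hydrogen-ion activity
        affinity_term = max(0.0, 1.0 - Q_over_K)   # -> 0 as the solution saturates
        return k0 * math.exp(-Ea / (R * T_K)) * a_H ** n * affinity_term

    print(dissolution_rate(T_K=363.15, pH=9.0, Q_over_K=0.2))
    ```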

  20. Critical review of glass performance modeling

    Energy Technology Data Exchange (ETDEWEB)

    Bourcier, W.L. [Lawrence Livermore National Lab., CA (United States)

    1994-07-01

    Borosilicate glass is to be used for permanent disposal of high-level nuclear waste in a geologic repository. Mechanistic chemical models are used to predict the rate at which radionuclides will be released from the glass under repository conditions. The most successful and useful of these models link reaction path geochemical modeling programs with a glass dissolution rate law that is consistent with transition state theory. These models have been used to simulate several types of short-term laboratory tests of glass dissolution and to predict the long-term performance of the glass in a repository. Although mechanistically based, the current models are limited by a lack of unambiguous experimental support for some of their assumptions. The most severe problem of this type is the lack of an existing validated mechanism that controls long-term glass dissolution rates. Current models can be improved by performing carefully designed experiments and using the experimental results to validate the rate-controlling mechanisms implicit in the models. These models should be supported with long-term experiments to be used for model validation. The mechanistic basis of the models should be explored by using modern molecular simulations such as molecular orbital and molecular dynamics to investigate both the glass structure and its dissolution process.

  1. Photovoltaic performance models - A report card

    Science.gov (United States)

    Smith, J. H.; Reiter, L. R.

    1985-01-01

    Models for the analysis of photovoltaic (PV) system designs, implementation policies, and economic performance have proliferated while keeping pace with rapid changes in basic PV technology and with the extensive empirical data compiled for such systems' performance. Attention is presently given to the results of a comparative assessment of ten well documented and widely used models, which range in complexity from first-order approximations of PV system performance to in-depth, circuit-level characterizations. The comparisons were made on the basis of the performance of their subsystem as well as system elements. The models fall into three categories in light of their degree of aggregation into subsystems: (1) simplified models for first-order calculation of system performance, with easily met input requirements but limited capability to address more than a small variety of design considerations; (2) models simulating PV systems in greater detail, encompassing types primarily intended for either concentrator-incorporating or flat-plate collector PV systems; and (3) models not specifically designed for PV system performance modeling, but applicable to aspects of electrical system design. Models ignoring subsystem failure or degradation are noted to exclude operating and maintenance characteristics as well.
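
    The first category above (simplified, first-order system performance models) amounts to little more than an energy-yield product. A toy example of such a calculation follows; the insolation and performance-ratio values are illustrative assumptions, not figures from the assessment.

    ```python
    # Toy first-order PV yield estimate of the kind a "simplified" model performs.
    def annual_energy_kwh(array_kw, insolation_kwh_per_m2_yr, performance_ratio=0.78):
        """Energy = rated power x (plane-of-array insolation / 1 kW/m2) x PR."""
        return array_kw * insolation_kwh_per_m2_yr * performance_ratio

    print(annual_energy_kwh(array_kw=5.0, insolation_kwh_per_m2_yr=1800.0))  # ~7020 kWh
    ```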

  2. Numerical simulation of hydrodynamic performance of ship under oblique conditions

    Directory of Open Access Journals (Sweden)

    CHEN Zhiming

    2018-02-01

    Full Text Available [Objectives] This paper is intended to study the viscous flow field around a ship under oblique conditions and provide a research basis for ship maneuverability. [Methods] Using the commercial software STAR-CCM+, the SST k-ω turbulence model is selected to predict the hydrodynamic performance of the KVLCC2 model at different drift angles and to predict the hull flow field. The pressure distribution of the ship model at different drift angles is observed, as are the vortex shedding from the ship's hull and the constrained streamlines on the hull surface. [Results] The results show that numerical simulation can satisfy the demands of engineering application in the prediction of the lateral force, yaw moment and hull surface pressure distribution of a ship. [Conclusions] The research results of this paper can provide valuable references for the study of the flow separation phenomenon under oblique conditions.

  3. Model improvements to simulate charging in SEM

    Science.gov (United States)

    Arat, K. T.; Klimpel, T.; Hagen, C. W.

    2018-03-01

    Charging of insulators is a complex phenomenon to simulate since the accuracy of the simulations is very sensitive to the interaction of electrons with matter and electric fields. In this study, we report model improvements for a previously developed Monte-Carlo simulator to more accurately simulate samples that charge. The improvements include both modelling of low energy electron scattering and charging of insulators. The new first-principle scattering models provide a more realistic charge distribution cloud in the material, and a better match between non-charging simulations and experimental results. Improvements on charging models mainly focus on redistribution of the charge carriers in the material with an induced conductivity (EBIC) and a breakdown model, leading to a smoother distribution of the charges. Combined with a more accurate tracing of low energy electrons in the electric field, we managed to reproduce the dynamically changing charging contrast due to an induced positive surface potential.

  4. Simulation model for port shunting yards

    Science.gov (United States)

    Rusca, A.; Popa, M.; Rosca, E.; Rosca, M.; Dragu, V.; Rusca, F.

    2016-08-01

    Sea ports are important nodes in the supply chain, joining two high-capacity transport modes: rail and maritime transport. The huge cargo flows transiting a port require high-capacity infrastructure such as berths, large-capacity cranes and shunting yards. However, the specificity of port shunting yards raises several problems: limited access, since these are terminus stations of the rail network; the input and output of large transit cargo flows relative to the infrequent departure and arrival of ships; and limited land availability for implementing solutions to serve these flows. It is necessary to identify technological solutions that lead to an answer to these problems. The paper proposes a simulation model, developed with the ARENA computer simulation software, suitable for shunting yards which serve sea ports with access to the rail network. It investigates the principal aspects of shunting yards and adequate measures to increase their transit capacity. The operating capacity of the shunting-yard sub-system is assessed taking into consideration the required operating standards, and measures of performance of the railway station (e.g. waiting time for freight wagons, number of railway lines in the station, storage area) are computed. The conclusions and results drawn from the simulation help transport and logistics specialists to test proposals for improving port management.

  5. Uncertainty and sensitivity analysis in building performance simulation for decision support and design optimization

    NARCIS (Netherlands)

    Hopfe, C.J.

    2009-01-01

    Building performance simulation (BPS) uses computer-based models that cover performance aspects such as energy consumption and thermal comfort in buildings. The uptake of BPS in current building design projects is limited. Although there is a large number of building simulation tools available, the

  6. Whole-building Hygrothermal Simulation Model

    DEFF Research Database (Denmark)

    Rode, Carsten; Grau, Karl

    2003-01-01

    An existing integrated simulation tool for dynamic thermal simulation of building was extended with a transient model for moisture release and uptake in building materials. Validation of the new model was begun with comparison against measurements in an outdoor test cell furnished with single...... materials. Almost quasi-steady, cyclic experiments were used to compare the indoor humidity variation and the numerical results of the integrated simulation tool with the new moisture model. Except for the case with chipboard as furnishing, the predictions of indoor humidity with the detailed model were...

  7. A Network Contention Model for the Extreme-scale Simulator

    Energy Technology Data Exchange (ETDEWEB)

    Engelmann, Christian [ORNL; Naughton III, Thomas J [ORNL

    2015-01-01

    The Extreme-scale Simulator (xSim) is a performance investigation toolkit for high-performance computing (HPC) hardware/software co-design. It permits running an HPC application with millions of concurrent execution threads, while observing its performance in a simulated extreme-scale system. This paper details a newly developed network modeling feature for xSim, eliminating the shortcomings of the existing network modeling capabilities. The approach takes a different path for implementing network contention and bandwidth capacity modeling, using a less synchronous but sufficiently accurate model design. With the new network modeling feature, xSim is able to simulate on-chip and on-node networks with reasonable accuracy and overheads.

  8. Modeling VOC transport in simulated waste drums

    International Nuclear Information System (INIS)

    Liekhus, K.J.; Gresham, G.L.; Peterson, E.S.; Rae, C.; Hotz, N.J.; Connolly, M.J.

    1993-06-01

    A volatile organic compound (VOC) transport model has been developed to describe unsteady-state VOC permeation and diffusion within a waste drum. Model equations account for three primary mechanisms for VOC transport from a void volume within the drum. These mechanisms are VOC permeation across a polymer boundary, VOC diffusion across an opening in a volume boundary, and VOC solubilization in a polymer boundary. A series of lab-scale experiments was performed in which the VOC concentration was measured in simulated waste drums under different conditions. A lab-scale simulated waste drum consisted of a sized-down 55-gal metal drum containing a modified rigid polyethylene drum liner. Four polyethylene bags were sealed inside a large polyethylene bag, supported by a wire cage, and placed inside the drum liner. The small bags were filled with a VOC-air gas mixture and the VOC concentration was measured throughout the drum over a period of time. Test variables included the type of VOC-air gas mixture introduced into the small bags, the small bag closure type, and the presence or absence of a variable external heat source. Model results were calculated for those trials where the VOC permeability had been measured. Permeabilities for five VOCs [methylene chloride, 1,1,2-trichloro-1,2,2-trifluoroethane (Freon-113), 1,1,1-trichloroethane, carbon tetrachloride, and trichloroethylene] were measured across a polyethylene bag. Comparison of model and experimental results for VOC concentration as a function of time indicates that the model accurately accounts for the significant VOC transport mechanisms in a lab-scale waste drum.
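
    The first transport mechanism listed above, permeation across a polymer boundary, can be illustrated with a first-order exchange between an inner and an outer volume. The sketch below integrates that exchange with explicit Euler; the geometry and permeability values are made up for the example and are not the report's measured data.

    ```python
    # Illustrative two-volume permeation model: VOC leaks from an inner bag into
    # an outer volume at a rate proportional to the concentration difference.
    def simulate_permeation(c_in=1000.0, c_out=0.0, perm=1e-10, area=0.5,
                            thickness=1e-4, v_in=0.01, v_out=0.2,
                            dt=60.0, t_end=7 * 24 * 3600.0):
        """Euler integration of dC/dt for the inner and outer volumes
        (concentrations in ppm; perm*area/thickness gives an effective flow)."""
        t = 0.0
        while t <= t_end:
            flow = perm * area / thickness * (c_in - c_out)
            c_in -= flow / v_in * dt
            c_out += flow / v_out * dt
            t += dt
        return t, c_in, c_out

    final_t, final_in, final_out = simulate_permeation()
    print(f"after {final_t/86400:.0f} days: inner {final_in:.0f} ppm, outer {final_out:.1f} ppm")
    ```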

  9. Simulation modeling for the health care manager.

    Science.gov (United States)

    Kennedy, Michael H

    2009-01-01

    This article addresses the use of simulation software to solve administrative problems faced by health care managers. Spreadsheet add-ins, process simulation software, and discrete event simulation software are available at a range of costs and complexity. All use the Monte Carlo method to realistically integrate probability distributions into models of the health care environment. Problems typically addressed by health care simulation modeling are facility planning, resource allocation, staffing, patient flow and wait time, routing and transportation, supply chain management, and process improvement.

  10. Simulating Radar Signals for Detection Performance Evaluation.

    Science.gov (United States)

    1981-02-01

    ...incurring the computation costs usually associated with such simulations. With importance sampling one can modify the probability distribution of the... [remainder of the abstract is garbled in the source]
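
    The surviving fragment refers to importance sampling as a way to avoid brute-force Monte Carlo when estimating small detection probabilities. The sketch below illustrates the idea on a toy problem (estimating a small Gaussian exceedance probability by sampling from a shifted proposal and reweighting); it is not the report's radar model.

    ```python
    # Importance-sampling estimate of P(X > threshold) for X ~ N(0,1), using a
    # proposal N(threshold, 1) and reweighting by the density ratio.
    import math, random

    def p_exceed_is(threshold=4.5, n=20_000, seed=0):
        rng = random.Random(seed)
        shift = threshold                      # centre the proposal on the rare region
        total = 0.0
        for _ in range(n):
            x = rng.gauss(shift, 1.0)
            if x > threshold:
                # weight = target density / proposal density for N(0,1) vs N(shift,1)
                total += math.exp(-0.5 * x * x + 0.5 * (x - shift) ** 2)
        return total / n

    exact = 0.5 * math.erfc(4.5 / math.sqrt(2.0))
    print(f"IS estimate: {p_exceed_is():.3e}   exact: {exact:.3e}")
    ```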

  11. Modeling and Simulation of Matrix Converter

    DEFF Research Database (Denmark)

    Liu, Fu-rong; Klumpner, Christian; Blaabjerg, Frede

    2005-01-01

    This paper discusses the modeling and simulation of the matrix converter. Two models of the matrix converter are presented: one is based on indirect space vector modulation and the other is based on the power balance equation. The basis of these two models is given and the process of modeling is introduced...

  12. Improving the Performance of the Extreme-scale Simulator

    Energy Technology Data Exchange (ETDEWEB)

    Engelmann, Christian [ORNL; Naughton III, Thomas J [ORNL

    2014-01-01

    Investigating the performance of parallel applications at scale on future high-performance computing (HPC) architectures and the performance impact of different architecture choices is an important component of HPC hardware/software co-design. The Extreme-scale Simulator (xSim) is a simulation-based toolkit for investigating the performance of parallel applications at scale. xSim scales to millions of simulated Message Passing Interface (MPI) processes. The overhead introduced by a simulation tool is an important performance and productivity aspect. This paper documents two improvements to xSim: (1) a new deadlock resolution protocol to reduce the parallel discrete event simulation management overhead and (2) a new simulated MPI message matching algorithm to reduce the oversubscription management overhead. The results clearly show a significant performance improvement, such as reducing the simulation overhead for running the NAS Parallel Benchmark suite inside the simulator from 1,020% to 238% for the conjugate gradient (CG) benchmark and from 102% to 0% for the embarrassingly parallel (EP) benchmark, as well as from 37,511% to 13,808% for CG and from 3,332% to 204% for EP with accurate process failure simulation.

  13. Simulation models for tokamak plasmas

    International Nuclear Information System (INIS)

    Dimits, A.M.; Cohen, B.I.

    1992-01-01

    Two developments in the nonlinear simulation of tokamak plasmas are described: (A) Simulation algorithms that use quasiballooning coordinates have been implemented in a 3D fluid code and a 3D partially linearized (Δf) particle code. In quasiballooning coordinates, one of the coordinate directions is closely aligned with that of the magnetic field, allowing both optimal use of the grid resolution for structures highly elongated along the magnetic field and implementation of the correct periodicity conditions with no discontinuities in the toroidal direction. (B) Progress on the implementation of a like-particle collision operator suitable for use in partially linearized particle codes is reported. The binary collision approach is shown to be unusable for this purpose. The algorithm under development is a complete version of the test-particle plus source-field approach that was suggested and partially implemented by Xu and Rosenbluth.

  14. MMSNF 2005. Materials models and simulations for nuclear fuels

    Energy Technology Data Exchange (ETDEWEB)

    Freyss, M.; Durinck, J.; Carlot, G.; Sabathier, C.; Martin, P.; Garcia, P.; Ripert, M.; Blanpain, P.; Lippens, M.; Schut, H.; Federov, A.V.; Bakker, K.; Osaka, M.; Miwa, S.; Sato, I.; Tanaka, K.; Kurosaki, K.; Uno, M.; Yamanaka, S.; Govers, K.; Verwerft, M.; Hou, M.; Lemehov, S.E.; Terentyev, D.; Govers, K.; Kotomin, E.A.; Ashley, N.J.; Grimes, R.W.; Van Uffelen, P.; Mastrikov, Y.; Zhukovskii, Y.; Rondinella, V.V.; Kurosaki, K.; Uno, M.; Yamanaka, S.; Minato, K.; Phillpot, S.; Watanabe, T.; Shukla, P.; Sinnott, S.; Nino, J.; Grimes, R.; Staicu, D.; Hiernaut, J.P.; Wiss, T.; Rondinella, V.V.; Ronchi, C.; Yakub, E.; Kaye, M.H.; Morrison, C.; Higgs, J.D.; Akbari, F.; Lewis, B.J.; Thompson, W.T.; Gueneau, C.; Gosse, S.; Chatain, S.; Dumas, J.C.; Sundman, B.; Dupin, N.; Konings, R.; Noel, H.; Veshchunov, M.; Dubourg, R.; Ozrin, C.V.; Veshchunov, M.S.; Welland, M.T.; Blanc, V.; Michel, B.; Ricaud, J.M.; Calabrese, R.; Vettraino, F.; Tverberg, T.; Kissane, M.; Tulenko, J.; Stan, M.; Ramirez, J.C.; Cristea, P.; Rachid, J.; Kotomin, E.; Ciriello, A.; Rondinella, V.V.; Staicu, D.; Wiss, T.; Konings, R.; Somers, J.; Killeen, J

    2006-07-01

    The MMSNF Workshop series aims at stimulating research and discussions on models and simulations of nuclear fuels and coupling the results into fuel performance codes. This edition was focused on materials science and engineering for fuel performance codes. The presentations were grouped in three technical sessions: fundamental modelling of fuel properties; integral fuel performance codes and their validation; collaborations and integration of activities. (A.L.B.)

  15. MMSNF 2005. Materials models and simulations for nuclear fuels

    International Nuclear Information System (INIS)

    Freyss, M.; Durinck, J.; Carlot, G.; Sabathier, C.; Martin, P.; Garcia, P.; Ripert, M.; Blanpain, P.; Lippens, M.; Schut, H.; Federov, A.V.; Bakker, K.; Osaka, M.; Miwa, S.; Sato, I.; Tanaka, K.; Kurosaki, K.; Uno, M.; Yamanaka, S.; Govers, K.; Verwerft, M.; Hou, M.; Lemehov, S.E.; Terentyev, D.; Govers, K.; Kotomin, E.A.; Ashley, N.J.; Grimes, R.W.; Van Uffelen, P.; Mastrikov, Y.; Zhukovskii, Y.; Rondinella, V.V.; Kurosaki, K.; Uno, M.; Yamanaka, S.; Minato, K.; Phillpot, S.; Watanabe, T.; Shukla, P.; Sinnott, S.; Nino, J.; Grimes, R.; Staicu, D.; Hiernaut, J.P.; Wiss, T.; Rondinella, V.V.; Ronchi, C.; Yakub, E.; Kaye, M.H.; Morrison, C.; Higgs, J.D.; Akbari, F.; Lewis, B.J.; Thompson, W.T.; Gueneau, C.; Gosse, S.; Chatain, S.; Dumas, J.C.; Sundman, B.; Dupin, N.; Konings, R.; Noel, H.; Veshchunov, M.; Dubourg, R.; Ozrin, C.V.; Veshchunov, M.S.; Welland, M.T.; Blanc, V.; Michel, B.; Ricaud, J.M.; Calabrese, R.; Vettraino, F.; Tverberg, T.; Kissane, M.; Tulenko, J.; Stan, M.; Ramirez, J.C.; Cristea, P.; Rachid, J.; Kotomin, E.; Ciriello, A.; Rondinella, V.V.; Staicu, D.; Wiss, T.; Konings, R.; Somers, J.; Killeen, J.

    2006-01-01

    The MMSNF Workshop series aims at stimulating research and discussions on models and simulations of nuclear fuels and coupling the results into fuel performance codes. This edition was focused on materials science and engineering for fuel performance codes. The presentations were grouped in three technical sessions: fundamental modelling of fuel properties; integral fuel performance codes and their validation; collaborations and integration of activities. (A.L.B.)

  16. Numerical simulations for active tectonic processes: increasing interoperability and performance

    Science.gov (United States)

    Donnellan, A.; Fox, G.; Rundle, J.; McLeod, D.; Tullis, T.; Grant, L.

    2002-01-01

    The objective of this project is to produce a system to fully model earthquake-related data. This task develops simulation and analysis tools to study the physics of earthquakes using state-of-the-art modeling.

  17. Generating performance portable geoscientific simulation code with Firedrake (Invited)

    Science.gov (United States)

    Ham, D. A.; Bercea, G.; Cotter, C. J.; Kelly, P. H.; Loriant, N.; Luporini, F.; McRae, A. T.; Mitchell, L.; Rathgeber, F.

    2013-12-01

    This presentation will demonstrate how a change in simulation programming paradigm can be exploited to deliver sophisticated simulation capability which is far easier to programme than are conventional models, is capable of exploiting different emerging parallel hardware, and is tailored to the specific needs of geoscientific simulation. Geoscientific simulation represents a grand challenge computational task: many of the largest computers in the world are tasked with this field, and the requirements of resolution and complexity of scientists in this field are far from being sated. However, single thread performance has stalled, even sometimes decreased, over the last decade, and has been replaced by ever more parallel systems: both as conventional multicore CPUs and in the emerging world of accelerators. At the same time, the needs of scientists to couple ever-more complex dynamics and parametrisations into their models makes the model development task vastly more complex. The conventional approach of writing code in low level languages such as Fortran or C/C++ and then hand-coding parallelism for different platforms by adding library calls and directives forces the intermingling of the numerical code with its implementation. This results in an almost impossible set of skill requirements for developers, who must simultaneously be domain science experts, numericists, software engineers and parallelisation specialists. Even more critically, it requires code to be essentially rewritten for each emerging hardware platform. Since new platforms are emerging constantly, and since code owners do not usually control the procurement of the supercomputers on which they must run, this represents an unsustainable development load. The Firedrake system, conversely, offers the developer the opportunity to write PDE discretisations in the high-level mathematical language UFL from the FEniCS project (http://fenicsproject.org). Non-PDE model components, such as parametrisations
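
    As a flavour of the programming model described above, the sketch below writes a Poisson problem at the level of its weak form using Firedrake's UFL interface. It assumes the standard public Firedrake API (UnitSquareMesh, FunctionSpace, DirichletBC, solve, and so on) and is not code from this presentation.

    ```python
    # Minimal Firedrake/UFL sketch (assumes the standard Firedrake API):
    # solve -div(grad u) = 1 on the unit square with homogeneous Dirichlet BCs.
    from firedrake import (UnitSquareMesh, FunctionSpace, TrialFunction,
                           TestFunction, Function, DirichletBC, Constant,
                           dot, grad, dx, solve)

    mesh = UnitSquareMesh(32, 32)
    V = FunctionSpace(mesh, "CG", 1)
    u, v = TrialFunction(V), TestFunction(V)

    a = dot(grad(u), grad(v)) * dx          # bilinear form of the weak problem
    L = Constant(1.0) * v * dx              # right-hand side
    bc = DirichletBC(V, 0.0, "on_boundary")

    uh = Function(V, name="u")
    solve(a == L, uh, bcs=[bc])
    print("max u:", uh.dat.data.max())
    ```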

  18. The Effects of Self-Discharge on the Performance of Symmetric Electric Double-Layer Capacitors and Active Electrolyte-Enhanced Supercapacitors: Insights from Modeling and Simulation

    Science.gov (United States)

    Ike, Innocent S.; Sigalas, Iakovos; Iyuke, Sunny E.

    2017-02-01

    The effects of self-discharge on the performance of symmetric electric double-layer capacitors (EDLCs) and active electrolyte-enhanced supercapacitors were examined by incorporating self-discharge into electrochemical capacitor models during charging and discharging. The sources of self-discharge in the capacitors were side reactions or redox reactions involving impurities, and electric double-layer (EDL) instability. The effect of self-discharge during capacitor storage was negligible, since it took a fully charged capacitor a minimum of 14.0 days to be entirely discharged by self-discharge in all conditions studied; hence self-discharge in the storage condition can be ignored. The first and second charge-discharge cycle energy efficiencies η_E1 and η_E2 of a capacitor with electrode effective conductivity α₁ = 0.05 S/cm and only EDL-instability self-discharge, with current density J_VR = 1.25 × 10⁻³ A/cm², were 72.33% and 72.34%, respectively. Also, the energy efficiencies η_E1 and η_E2 of a similar capacitor with both side-reaction/redox and EDL-instability self-discharges, with current densities J_VR = 0.00125 A/cm² and J_VR1 = 0.0032 A/cm², were 38.13% and 38.14% respectively, compared with 84.24% and 84.25% for a similar capacitor without self-discharge. A capacitor with only EDL-instability self-discharge and one with both side-reaction/redox and EDL-instability self-discharge lost 9.73 Wh and 28.38 Wh of energy, respectively, through self-discharge during charging and discharging. Hence, EDLC charging and discharging times are significantly dependent on the self-discharge rates, which are too large to be ignored.

  19. A model management system for combat simulation

    OpenAIRE

    Dolk, Daniel R.

    1986-01-01

    The design and implementation of a model management system to support combat modeling is discussed. Structured modeling is introduced as a formalism for representing mathematical models. A relational information resource dictionary system is developed which can accommodate structured models. An implementation is described. Structured modeling is then compared to Jackson System Development (JSD) as a methodology for facilitating discrete event simulation. JSD is currently better at representin...

  20. Assessing Ecosystem Model Performance in Semiarid Systems

    Science.gov (United States)

    Thomas, A.; Dietze, M.; Scott, R. L.; Biederman, J. A.

    2017-12-01

    In ecosystem process modelling, comparing outputs to benchmark datasets observed in the field is an important way to validate models, allowing the modelling community to track model performance over time and compare models at specific sites. Multi-model comparison projects as well as models themselves have largely been focused on temperate forests and similar biomes. Semiarid regions, on the other hand, are underrepresented in land surface and ecosystem modelling efforts, and yet will be disproportionately impacted by disturbances such as climate change due to their sensitivity to changes in the water balance. Benchmarking models at semiarid sites is an important step in assessing and improving models' suitability for predicting the impact of disturbance on semiarid ecosystems. In this study, several ecosystem models were compared at a semiarid grassland in southwestern Arizona using PEcAn, or the Predictive Ecosystem Analyzer, an open-source eco-informatics toolbox ideal for creating the repeatable model workflows necessary for benchmarking. Models included SIPNET, DALEC, JULES, ED2, GDAY, LPJ-GUESS, MAESPA, CLM, CABLE, and FATES. Comparison between model output and benchmarks such as net ecosystem exchange (NEE) tended to produce high root mean square error and low correlation coefficients, reflecting poor simulation of seasonality and the tendency for models to create much higher carbon sources than observed. These results indicate that ecosystem models do not currently adequately represent semiarid ecosystem processes.

  1. Physically realistic modeling of maritime training simulation

    OpenAIRE

    Cieutat , Jean-Marc

    2003-01-01

    Maritime training simulation is an important part of maritime teaching, and it requires a great deal of scientific and technical skill. In this framework, where the real-time constraint has to be maintained, not all physical phenomena can be studied; only the most visible physical phenomena, relating to the natural elements and the ship's behaviour, are reproduced. Our swell model, based on a surface wave simulation approach, makes it possible to simulate the shape and the propagation of a regular train of waves f...

  2. Systematic modelling and simulation of refrigeration systems

    DEFF Research Database (Denmark)

    Rasmussen, Bjarne D.; Jakobsen, Arne

    1998-01-01

    The task of developing a simulation model of a refrigeration system can be very difficult and time consuming. In order for this process to be effective, a systematic method for developing the system model is required. This method should aim at guiding the developer to clarify the purpose...... of the simulation, to select appropriate component models and to set up the equations in a well-arranged way. In this paper the outline of such a method is proposed and examples showing the use of this method for simulation of refrigeration systems are given....

  3. Modeling & Simulation Education for the Acquisition and T&E Workforce: FY07 Deliverable Package

    National Research Council Canada - National Science Library

    Olwell, David H; Johnson, Jean; Few, Stephanie; Didoszak, Jarema M

    2007-01-01

    This technical report presents the deliverables for calendar year 2007 for the "Educating the Modeling and Simulation Workforce" project performed for the DoD Modeling and Simulation Steering Committee...

  4. Computer simulation of steady-state performance of air-to-air heat pumps

    Energy Technology Data Exchange (ETDEWEB)

    Ellison, R D; Creswick, F A

    1978-03-01

    A computer model by which the performance of air-to-air heat pumps can be simulated is described. The intended use of the model is to evaluate analytically the improvements in performance that can be effected by various component improvements. The model is based on a trio of independent simulation programs originated at the Massachusetts Institute of Technology Heat Transfer Laboratory. The three programs have been combined so that user intervention and decision making between major steps of the simulation are unnecessary. The program was further modified by substituting a new compressor model and adding a capillary tube model, both of which are described. Performance predicted by the computer model is shown to be in reasonable agreement with performance data observed in our laboratory. Planned modifications by which the utility of the computer model can be enhanced in the future are described. User instructions and a FORTRAN listing of the program are included.

  5. Simulation Model of Mobile Detection Systems

    International Nuclear Information System (INIS)

    Edmunds, T.; Faissol, D.; Yao, Y.

    2009-01-01

    In this paper, we consider a mobile source that we attempt to detect with man-portable, vehicle-mounted or boat-mounted radiation detectors. The source is assumed to transit an area populated with these mobile detectors, and the objective is to detect the source before it reaches a perimeter. We describe a simulation model developed to estimate the probability that one of the mobile detectors will come into close proximity to the moving source and detect it. We illustrate with a maritime simulation example. Our simulation takes place in a 10 km by 5 km rectangular bay patrolled by boats equipped with 2-inch x 4-inch x 16-inch NaI detectors. Boats to be inspected enter the bay and randomly proceed to one of seven harbors on the shore. A source-bearing boat enters the mouth of the bay and proceeds to a pier on the opposite side. We wish to determine the probability that the source is detected and its range from the target when detected. Patrol boats select the nearest in-bound boat for inspection and initiate an intercept course. Once within the operational range of the detection system, a detection algorithm is started. If the patrol boat confirms the source is not present, it selects the next nearest boat for inspection. Each run of the simulation ends either when a patrol successfully detects a source or when the source reaches its target. Several statistical detection algorithms have been implemented in the simulation model. First, a simple k-sigma algorithm, which alarms when the counts in a time window exceed the mean background plus k times the standard deviation of the background, is available to the user. The time window used is optimized with respect to the signal-to-background ratio for that range and relative speed. Second, a sequential probability ratio test [Wald 1947] is available, configured in this simulation with a target false positive probability of 0.001 and false negative probability of 0.1. This test is utilized when the mobile detector maintains
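
    The k-sigma rule described above can be written in one line once a counting window is fixed. The sketch below uses a Poisson background so that sigma is the square root of the mean; the window length, background rate and k value are illustrative choices, not the study's tuned settings.

    ```python
    # Sketch of the k-sigma alarm rule: alarm if gross counts in the window exceed
    # mean background + k * sqrt(mean background), treating background as Poisson.
    import math

    def k_sigma_alarm(window_counts, bkg_mean_per_window, k=3.0):
        return window_counts > bkg_mean_per_window + k * math.sqrt(bkg_mean_per_window)

    # e.g. a 2 s window with a 150 cps background -> 300 expected counts per window
    print(k_sigma_alarm(window_counts=365, bkg_mean_per_window=300))   # True (alarm)
    print(k_sigma_alarm(window_counts=320, bkg_mean_per_window=300))   # False
    ```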

  6. Simulation Model of Mobile Detection Systems

    Energy Technology Data Exchange (ETDEWEB)

    Edmunds, T; Faissol, D; Yao, Y

    2009-01-27

    In this paper, we consider a mobile source that we attempt to detect with man-portable, vehicle-mounted or boat-mounted radiation detectors. The source is assumed to transit an area populated with these mobile detectors, and the objective is to detect the source before it reaches a perimeter. We describe a simulation model developed to estimate the probability that one of the mobile detectors will come into close proximity to the moving source and detect it. We illustrate with a maritime simulation example. Our simulation takes place in a 10 km by 5 km rectangular bay patrolled by boats equipped with 2-inch x 4-inch x 16-inch NaI detectors. Boats to be inspected enter the bay and randomly proceed to one of seven harbors on the shore. A source-bearing boat enters the mouth of the bay and proceeds to a pier on the opposite side. We wish to determine the probability that the source is detected and its range from the target when detected. Patrol boats select the nearest in-bound boat for inspection and initiate an intercept course. Once within the operational range of the detection system, a detection algorithm is started. If the patrol boat confirms the source is not present, it selects the next nearest boat for inspection. Each run of the simulation ends either when a patrol successfully detects a source or when the source reaches its target. Several statistical detection algorithms have been implemented in the simulation model. First, a simple k-sigma algorithm, which alarms when the counts in a time window exceed the mean background plus k times the standard deviation of the background, is available to the user. The time window used is optimized with respect to the signal-to-background ratio for that range and relative speed. Second, a sequential probability ratio test [Wald 1947] is available, configured in this simulation with a target false positive probability of 0.001 and false negative probability of 0.1. This test is utilized when the mobile detector maintains
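
    The sequential probability ratio test mentioned above accumulates a log-likelihood ratio sample by sample and stops when it crosses Wald's thresholds. The sketch below applies that test to Poisson count samples using the error targets quoted in the abstract (0.001 false positive, 0.1 false negative); the count rates themselves are invented for the example.

    ```python
    # Wald SPRT on Poisson counts: decide between background-only (mean bkg) and
    # source-present (mean bkg + sig) hypotheses. Rates below are illustrative.
    import math

    def sprt_poisson(counts, bkg, sig, alpha=0.001, beta=0.1):
        upper = math.log((1.0 - beta) / alpha)      # decide "source present"
        lower = math.log(beta / (1.0 - alpha))      # decide "background only"
        llr = 0.0
        for n in counts:
            # log likelihood ratio of Poisson(bkg+sig) vs Poisson(bkg) for n counts
            llr += n * math.log((bkg + sig) / bkg) - sig
            if llr >= upper:
                return "source"
            if llr <= lower:
                return "no source"
        return "continue"

    print(sprt_poisson([340, 350, 355, 362, 371], bkg=300.0, sig=60.0))  # "source"
    ```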

  7. A View on Future Building System Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Wetter, Michael

    2011-04-01

    This chapter presents what a future environment for building system modeling and simulation may look like. As buildings continue to require increased performance and better comfort, their energy and control systems are becoming more integrated and complex. We therefore focus in this chapter on the modeling, simulation and analysis of building energy and control systems. Such systems can be classified as heterogeneous systems because they involve multiple domains, such as thermodynamics, fluid dynamics, heat and mass transfer, electrical systems, control systems and communication systems. Also, they typically involve multiple temporal and spatial scales, and their evolution can be described by coupled differential equations, discrete equations and events. Modeling and simulating such systems requires a higher level of abstraction and modularisation to manage the increased complexity compared to what is used in today's building simulation programs. Therefore, the trend towards more integrated building systems is likely to be a driving force for changing the status quo of today's building simulation programs. This chapter discusses evolving modeling requirements and outlines a path toward a future environment for modeling and simulation of heterogeneous building systems. A range of topics that would require many additional pages of discussion has been omitted. Examples include computational fluid dynamics for air and particle flow in and around buildings, people movement, daylight simulation, uncertainty propagation and optimisation methods for building design and controls. For different discussions and perspectives on the future of building modeling and simulation, we refer to Sahlin (2000), Augenbroe (2001) and Malkawi and Augenbroe (2004).

  8. Deriving simulators for hybrid Chi models

    NARCIS (Netherlands)

    Beek, van D.A.; Man, K.L.; Reniers, M.A.; Rooda, J.E.; Schiffelers, R.R.H.

    2006-01-01

    The hybrid Chi language is a formalism for modeling, simulation and verification of hybrid systems. The formal semantics of hybrid Chi allows the definition of provably correct implementations for simulation, verification and real-time control. This paper discusses the principles of deriving an

  9. Modeling and simulation for RF system design

    CERN Document Server

    Frevert, Ronny; Jancke, Roland; Knöchel, Uwe; Schwarz, Peter; Kakerow, Ralf; Darianian, Mohsen

    2005-01-01

    Focusing on RF-specific modeling and simulation methods, and system- and circuit-level descriptions, this work contains application-oriented training material. Accompanied by a CD-ROM, it combines the presentation of a mixed-signal design flow, an introduction to VHDL-AMS and Verilog-A, and the application of commercially available simulators.

  10. Development of a Simulation Model for Swimming with Diving Fins

    Directory of Open Access Journals (Sweden)

    Motomu Nakashima

    2018-02-01

    Full Text Available The simulation model to assess the performance of a diving fin was developed by extending the swimming human simulation model SWUM. A diving fin was modeled as a series of five rigid plates connected to the human model by springs and dampers. These plates were connected to each other by virtual springs and dampers, and the fin's bending property was represented by springs and dampers as well. An actual diver's swimming motion with fins was acquired by a motion capture experiment. In order to determine the bending property of the fin, two bending tests on land were conducted. In addition, an experiment was conducted in order to determine the fluid force coefficients in the fluid force model for the fin. Finally, using all of the measured and identified information, a simulation reproducing the experimental situation was carried out. It was confirmed that the diver in the simulation propelled forward in the water successfully.
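
    The plate-spring-damper idealisation described above can be illustrated with a small chain model. The sketch below integrates hinged rigid plates coupled by torsional springs and dampers using semi-implicit Euler; it is not SWUM, and the stiffness, damping and inertia values are assumptions.

    ```python
    # Illustrative chain of hinged plates with torsional springs and dampers,
    # released with the first joint deflected; integrated with semi-implicit Euler.
    def simulate_fin(n_plates=5, k=4.0, c=0.15, inertia=0.01,
                     theta0=0.6, dt=1e-3, t_end=2.0):
        theta = [theta0] + [0.0] * (n_plates - 1)     # joint angles (rad)
        omega = [0.0] * n_plates                      # joint angular velocities
        for _ in range(int(t_end / dt)):
            for j in range(n_plates):
                # spring/damper torque from neighbouring joints (root clamped, tip free)
                left = theta[j - 1] if j > 0 else 0.0
                right = theta[j + 1] if j < n_plates - 1 else theta[j]
                torque = -k * (2 * theta[j] - left - right) - c * omega[j]
                omega[j] += torque / inertia * dt
            for j in range(n_plates):
                theta[j] += omega[j] * dt
        return [round(t, 3) for t in theta]

    print(simulate_fin())
    ```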

  11. SEAscan 3.5: A simulator performance analyzer

    International Nuclear Information System (INIS)

    Dennis, T.; Eisenmann, S.

    1990-01-01

    SEAscan 3.5 is a personal computer based tool developed to analyze the dynamic performance of nuclear power plant training simulators. The system has integrated features to provide its own human featured performance. In this paper, the program is described as a tool for the analysis of training simulator performance. The structure and operating characteristics of SEAscan 3.5 are described. The hardcopy documents are shown to aid in verification of conformance to ANSI/ANS-3.5-1985

  12. Magnetosphere Modeling: From Cartoons to Simulations

    Science.gov (United States)

    Gombosi, T. I.

    2017-12-01

    Over the last half century, physics-based global computer simulations have become a bridge between experiment and basic theory, and they now represent the "third pillar" of geospace research. Today, many of our scientific publications utilize large-scale simulations to interpret observations, test new ideas, plan campaigns, or design new instruments. Realistic simulations of the complex Sun-Earth system have been made possible by the dramatically increased power of both computing hardware and numerical algorithms. Early magnetosphere models were based on simple E&M concepts (like the Chapman-Ferraro cavity) and hydrodynamic analogies (bow shock). At the beginning of the space age, current system models were developed, culminating in the sophisticated Tsyganenko-type description of the magnetic configuration. The first 3D MHD simulations of the magnetosphere were published in the early 1980s. A decade later there were several competing global models that were able to reproduce many fundamental properties of the magnetosphere. The leading models included the impact of the ionosphere by using a height-integrated electric potential description. Dynamic coupling of global and regional models started in the early 2000s by integrating a ring current and a global magnetosphere model. It has been recognized for quite some time that plasma kinetic effects play an important role. Presently, global hybrid simulations of the dynamic magnetosphere are expected to become possible on exascale supercomputers, while fully kinetic simulations with realistic mass ratios are still decades away. In the 2010s several groups started to experiment with PIC simulations embedded in large-scale 3D MHD models. Presently this integrated MHD-PIC approach is at the forefront of magnetosphere simulations, and this technique is expected to lead to some important advances in our understanding of magnetospheric physics. This talk will review the evolution of magnetosphere modeling from cartoons to current systems

  13. Modeling and simulation of complex systems a framework for efficient agent-based modeling and simulation

    CERN Document Server

    Siegfried, Robert

    2014-01-01

    Robert Siegfried presents a framework for efficient agent-based modeling and simulation of complex systems. He compares different approaches for describing the structure and dynamics of agent-based models in detail. Based on this evaluation the author introduces the "General Reference Model for Agent-based Modeling and Simulation" (GRAMS). Furthermore he presents parallel and distributed simulation approaches for the execution of agent-based models, from small scale to very large scale. The author shows how agent-based models may be executed by different simulation engines that utilize underlying hard

  14. Simulation-Based Internal Models for Safer Robots

    Directory of Open Access Journals (Sweden)

    Christian Blum

    2018-01-01

    Full Text Available In this paper, we explore the potential of mobile robots with simulation-based internal models for safety in highly dynamic environments. We propose a robot with a simulation of itself, other dynamic actors and its environment, inside itself. Operating in real time, this simulation-based internal model is able to look ahead and predict the consequences of both the robot’s own actions and those of the other dynamic actors in its vicinity. Hence, the robot continuously modifies its own actions in order to actively maintain its own safety while also achieving its goal. Inspired by the problem of how mobile robots could move quickly and safely through crowds of moving humans, we present experimental results which compare the performance of our internal simulation-based controller with a purely reactive approach as a proof-of-concept study for the practical use of simulation-based internal models.

  15. Computer simulation of fuel element performance

    Energy Technology Data Exchange (ETDEWEB)

    Sukhanov, G I

    1979-01-01

    The review presents reports made at the Conference on the Behaviour and Production of Fuel for Water Reactors held on March 13-17, 1979. Discussed at the Conference are the most developed and tested calculation models specially developed to predict the behaviour of fuel elements of water reactors. The following five main aspects of the problem are discussed: general conceptions and programs; mechanical mock-ups and their applications; gas release, gap conductivity and fuel thermal conductivity; analysis of nonstationary processes; and models of specific phenomena. The review briefly describes the physical principles of the following models and programs: the RESTR program, which calculates the radii of the zones of columnar and equiaxial grains as well as the radius of the internal cavity of the fuel core; programs for the calculation of fuel-can interaction based on the finite element method; and a model predicting the behaviour of CANDU-PHW fuel elements in transient conditions. General results are presented of investigations of heat transfer through the can-fuel gap and of the thermal conductivity of UO/sub 2/, taking into account cracking and gas release of the fuel. Many programs already meet the accepted standards and are being intensively tested at present.

  16. The simulation research for the dynamic performance of integrated PWR

    International Nuclear Information System (INIS)

    Yuan Jiandong; Xia Guoqing; Fu Mingyu

    2005-01-01

    The mathematical model of the reactor core of an integrated PWR has been studied and simplified appropriately. Using the lumped parameter method, the authors have established the mathematical model of the reactor core, including the neutron dynamics equation, the feedback reactivity models and the thermo-hydraulic model of the reactor. Based on these equations and models, the incremental transfer functions of the reactor core model have been built. Through simulation experiments, the authors have compared the dynamic characteristics of the integrated PWR with those of the traditional dispersed PWR. The simulation results show that the mathematical models and equations are correct. (authors)

  17. Performance Evaluation Model for Application Layer Firewalls.

    Science.gov (United States)

    Xuan, Shichang; Yang, Wu; Dong, Hui; Zhang, Jiangchuan

    2016-01-01

    Application layer firewalls protect the trusted area network against information security risks. However, firewall performance may affect user experience. Therefore, performance analysis plays a significant role in the evaluation of application layer firewalls. This paper presents an analytic model of the application layer firewall, based on a system analysis to evaluate the capability of the firewall. In order to enable users to improve the performance of the application layer firewall with limited resources, resource allocation was evaluated to obtain the optimal resource allocation scheme in terms of throughput, delay, and packet loss rate. The proposed model employs the Erlangian queuing model to analyze the performance parameters of the system with regard to the three layers (network, transport, and application layers). Then, the analysis results of all the layers are combined to obtain the overall system performance indicators. A discrete event simulation method was used to evaluate the proposed model. Finally, limited service desk resources were allocated to obtain the values of the performance indicators under different resource allocation scenarios in order to determine the optimal allocation scheme. Under limited resource allocation, this scheme enables users to maximize the performance of the application layer firewall.

  18. Performance Evaluation Model for Application Layer Firewalls.

    Directory of Open Access Journals (Sweden)

    Shichang Xuan

    Full Text Available Application layer firewalls protect the trusted area network against information security risks. However, firewall performance may affect user experience. Therefore, performance analysis plays a significant role in the evaluation of application layer firewalls. This paper presents an analytic model of the application layer firewall, based on a system analysis to evaluate the capability of the firewall. In order to enable users to improve the performance of the application layer firewall with limited resources, resource allocation was evaluated to obtain the optimal resource allocation scheme in terms of throughput, delay, and packet loss rate. The proposed model employs the Erlangian queuing model to analyze the performance parameters of the system with regard to the three layers (network, transport, and application layers). Then, the analysis results of all the layers are combined to obtain the overall system performance indicators. A discrete event simulation method was used to evaluate the proposed model. Finally, limited service desk resources were allocated to obtain the values of the performance indicators under different resource allocation scenarios in order to determine the optimal allocation scheme. Under limited resource allocation, this scheme enables users to maximize the performance of the application layer firewall.
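
    As a rough illustration of the kind of queuing analysis described in these two records, the sketch below computes standard M/M/c waiting-probability and delay figures for one hypothetical service stage; the arrival rate, service rate and number of service desks are assumed values, not figures from the paper.

        import math

        def mmc_metrics(lam, mu, c):
            """Erlang-C waiting probability and mean delays for an M/M/c queue.

            lam: arrival rate [req/s], mu: service rate per desk [req/s],
            c: number of service desks; requires lam < c * mu for stability.
            """
            rho = lam / (c * mu)
            if rho >= 1.0:
                raise ValueError("unstable: offered load exceeds capacity")
            a = lam / mu  # offered traffic in Erlangs
            tail = a**c / (math.factorial(c) * (1.0 - rho))
            p_wait = tail / (sum(a**k / math.factorial(k) for k in range(c)) + tail)
            wq = p_wait / (c * mu - lam)   # mean waiting time in queue [s]
            return p_wait, wq, wq + 1.0 / mu

        # Hypothetical stage: 800 req/s offered to 10 service desks of 100 req/s each.
        print(mmc_metrics(lam=800.0, mu=100.0, c=10))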

  19. NUMERICAL SIMULATION AND MODELING OF UNSTEADY FLOW ...

    African Journals Online (AJOL)

    2014-06-30

    The objective of this study is to control the simulation of unsteady flows around structures. ... Aerospace, our results were in good agreement with experimental ... Two-Equation Eddy-Viscosity Turbulence Models for Engineering.

  20. SEIR model simulation for Hepatitis B

    Science.gov (United States)

    Side, Syafruddin; Irwan, Mulbar, Usman; Sanusi, Wahidah

    2017-09-01

    Mathematical modelling and simulation of Hepatitis B are discussed in this paper. The population is divided into four compartments, namely Susceptible, Exposed, Infected and Recovered (SEIR). Several factors affect the population in this model: vaccination, immigration and emigration. The SEIR model yields a four-dimensional system of non-linear ordinary differential equations (ODEs), which is then reduced to three dimensions. The SEIR model simulation is undertaken to predict the number of Hepatitis B cases. The results of the simulation indicate that the number of Hepatitis B cases will increase and then decrease over several months. The simulation using case numbers from Makassar also found a basic reproduction number less than one, which means that the city of Makassar is not an endemic area for Hepatitis B.
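
    A minimal SEIR integration of the kind described above is sketched below in Python (SciPy); it omits the vaccination and migration terms of the paper, and all parameter values and initial conditions are illustrative assumptions rather than the Makassar data.

        import numpy as np
        from scipy.integrate import odeint

        def seir_rhs(y, t, beta, sigma, gamma):
            # Standard SEIR equations with a constant population; the paper's
            # vaccination, immigration and emigration terms are omitted here.
            s, e, i, r = y
            n = s + e + i + r
            return [-beta * s * i / n,
                    beta * s * i / n - sigma * e,
                    sigma * e - gamma * i,
                    gamma * i]

        beta, sigma, gamma = 0.4, 1.0 / 7.0, 1.0 / 10.0   # assumed rates [1/day]
        y0 = [990.0, 5.0, 5.0, 0.0]                        # assumed initial S, E, I, R
        t = np.linspace(0.0, 365.0, 366)
        sol = odeint(seir_rhs, y0, t, args=(beta, sigma, gamma))
        print("peak number of infected:", round(sol[:, 2].max(), 1))
        print("basic reproduction number R0 = beta/gamma =", beta / gamma)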

  1. Switching performance of OBS network model under prefetched real traffic

    Science.gov (United States)

    Huang, Zhenhua; Xu, Du; Lei, Wen

    2005-11-01

    Optical Burst Switching (OBS) [1] is now widely considered an efficient switching technique for building the next-generation optical Internet, so it is important to evaluate the performance of the OBS network model precisely. The performance of an OBS network model varies under different conditions, but the most important question is how it performs under real traffic load. In traditional simulation models, uniform traffic is usually generated by simulation software to imitate the data source of the edge node in the OBS network model, and the performance of the OBS network is evaluated on that basis. Unfortunately, without being driven by real traffic, traditional simulation models have several problems and their results are questionable. To deal with this problem, we present a new simulation model for the analysis and performance evaluation of the OBS network, which uses prefetched IP traffic as the data source of the OBS network model. The prefetched IP traffic can be considered a real IP source for the OBS edge node, and the OBS network model has the same clock rate as a real OBS system. It is therefore easy to conclude that this model is closer to the real OBS system than the traditional ones. The simulation results also indicate that this model is more accurate for evaluating the performance of the OBS network system and that its results are closer to the actual situation.

  2. Computer simulations of the random barrier model

    DEFF Research Database (Denmark)

    Schrøder, Thomas; Dyre, Jeppe

    2002-01-01

    A brief review of experimental facts regarding ac electronic and ionic conduction in disordered solids is given followed by a discussion of what is perhaps the simplest realistic model, the random barrier model (symmetric hopping model). Results from large scale computer simulations are presented...

  3. Turbine modelling for real time simulators

    International Nuclear Information System (INIS)

    Oliveira Barroso, A.C. de; Araujo Filho, F. de

    1992-01-01

    A model of steam turbines and their peripherals has been developed. All the important variables have been included, and emphasis has been placed on computational efficiency to obtain a model able to simulate all the modeled equipment. (A.C.A.S.)

  4. Modelling and Simulation for Requirements Engineering and Options Analysis

    Science.gov (United States)

    2010-05-01

    should be performed to work successfully in the domain; and process-based techniques model the processes that occur in the work domain. There is a crisp ... Can the current technique for developing simulation models for assessments

  5. A fire management simulation model using stochastic arrival times

    Science.gov (United States)

    Eric L. Smith

    1987-01-01

    Fire management simulation models are used to predict the impact of changes in the fire management program on fire outcomes. As with all models, the goal is to abstract reality without seriously distorting relationships between variables of interest. One important variable of fire organization performance is the length of time it takes to get suppression units to the...

  6. Theory, modeling, and simulation annual report, 1992

    Energy Technology Data Exchange (ETDEWEB)

    1993-05-01

    This report briefly discusses research on the following topics: development of electronic structure methods; modeling molecular processes in clusters; modeling molecular processes in solution; modeling molecular processes in separations chemistry; modeling interfacial molecular processes; modeling molecular processes in the atmosphere; methods for periodic calculations on solids; chemistry and physics of minerals; graphical user interfaces for computational chemistry codes; visualization and analysis of molecular simulations; integrated computational chemistry environment; and benchmark computations.

  7. Modeling and simulation with operator scaling

    OpenAIRE

    Cohen, Serge; Meerschaert, Mark M.; Rosiński, Jan

    2010-01-01

    Self-similar processes are useful in modeling diverse phenomena that exhibit scaling properties. Operator scaling allows a different scale factor in each coordinate. This paper develops practical methods for modeling and simulating stochastic processes with operator scaling. A simulation method for operator stable Levy processes is developed, based on a series representation, along with a Gaussian approximation of the small jumps. Several examples are given to illustrate practical application...

  8. Modeling of magnetic particle suspensions for simulations

    CERN Document Server

    Satoh, Akira

    2017-01-01

    The main objective of the book is to highlight the modeling of magnetic particles with different shapes and magnetic properties, and to provide graduate students and young researchers with information on the theoretical aspects and actual techniques for the treatment of magnetic particles in particle-based simulations. On the simulation side, we focus on the Monte Carlo, molecular dynamics, Brownian dynamics, lattice Boltzmann and stochastic rotation dynamics (multi-particle collision dynamics) methods. The latter two simulation methods can simulate both the particle motion and the ambient flow field simultaneously. In general, specialized knowledge can only be obtained in an effective manner under the supervision of an expert. The present book is written to play such a role for readers who wish to develop the skill of modeling magnetic particles and to develop a computer simulation program using their own ability. This book is therefore a self-learning book for graduate students and young researchers. Armed with this knowledge,...

  9. Dynamic modeling and simulation of power transformer maintenance costs

    Directory of Open Access Journals (Sweden)

    Ristić Olga

    2016-01-01

    Full Text Available The paper presents a dynamic model of the maintenance costs of the power transformer functional components. Reliability is modeled by combining the exponential and Weibull distributions. The simulation was performed for corrective maintenance and for the installation of a continuous monitoring system on the most critical components. The System Dynamics Simulation (SDS) method and the VENSIM PLE software were used to simulate the costs. In this way, significant savings in maintenance costs can be achieved with a small initial investment. [Project of the Ministry of Science of the Republic of Serbia, no. III 41025 and no. OI 171007]

  10. Modelling and Simulation of Wave Loads

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Thoft-Christensen, Palle

    A simple model of the wave load on slender members of offshore structures is described. The wave elevation of the sea state is modelled by a stationary Gaussian process. A new procedure to simulate realizations of the wave loads is developed. The simulation method assumes that the wave particle velocity can be approximated by a Gaussian Markov process. Known approximate results for the first-passage density or, equivalently, the distribution of the extremes of wave loads are presented and compared with rather precise simulation results. It is demonstrated that the approximate results ...

  11. Modelling and Simulation of Wave Loads

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Thoft-Christensen, Palle

    1985-01-01

    A simple model of the wave load on slender members of offshore structures is described. The wave elevation of the sea state is modelled by a stationary Gaussian process. A new procedure to simulate realizations of the wave loads is developed. The simulation method assumes that the wave particle velocity can be approximated by a Gaussian Markov process. Known approximate results for the first-passage density or, equivalently, the distribution of the extremes of wave loads are presented and compared with rather precise simulation results. It is demonstrated that the approximate results ...

  12. Modeling and simulation of discrete event systems

    CERN Document Server

    Choi, Byoung Kyu

    2013-01-01

    Computer modeling and simulation (M&S) allows engineers to study and analyze complex systems. Discrete-event system (DES)-M&S is used in modern management, industrial engineering, computer science, and the military. As computer speeds and memory capacity increase, so DES-M&S tools become more powerful and more widely used in solving real-life problems. Based on over 20 years of evolution within a classroom environment, as well as on decades-long experience in developing simulation-based solutions for high-tech industries, Modeling and Simulation of Discrete-Event Systems is the only book on

  13. Minimum-complexity helicopter simulation math model

    Science.gov (United States)

    Heffley, Robert K.; Mnich, Marc A.

    1988-01-01

    An example of a minimum-complexity simulation helicopter math model is presented. Motivating factors are the computational delays, cost, and inflexibility of the very sophisticated math models now in common use. A helicopter model form is given which addresses each of these factors and provides better engineering understanding of the specific handling-qualities features which are apparent to the simulator pilot. The technical approach begins with specification of the features which are to be modeled, followed by a build-up of individual vehicle components and definition of equations. Model matching and estimation procedures are given which enable the modeling of specific helicopters from basic data sources such as flight manuals. Checkout procedures are given which provide for total model validation. A number of possible model extensions and refinements are discussed. Math model computer programs are defined and listed.

  14. Modelling and evaluation of surgical performance using hidden Markov models.

    Science.gov (United States)

    Megali, Giuseppe; Sinigaglia, Stefano; Tonet, Oliver; Dario, Paolo

    2006-10-01

    Minimally invasive surgery has become very widespread in the last ten years. Since surgeons experience difficulties in learning and mastering minimally invasive techniques, the development of training methods is of great importance. While the introduction of virtual reality-based simulators has introduced a new paradigm in surgical training, skill evaluation methods are far from being objective. This paper proposes a method for defining a model of surgical expertise and an objective metric to evaluate performance in laparoscopic surgery. Our approach is based on the processing of kinematic data describing movements of surgical instruments. We use hidden Markov model theory to define an expert model that describes expert surgical gesture. The model is trained on kinematic data related to exercises performed on a surgical simulator by experienced surgeons. Subsequently, we use this expert model as a reference model in the definition of an objective metric to evaluate performance of surgeons with different abilities. Preliminary results show that, using different topologies for the expert model, the method can be efficiently used both for the discrimination between experienced and novice surgeons, and for the quantitative assessment of surgical ability.
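
    The general recipe described above, i.e. training a hidden Markov model on expert kinematic traces and scoring new traces by their log-likelihood, can be sketched roughly as follows; the sketch assumes the third-party hmmlearn package and synthetic stand-in data, and is not the authors' implementation.

        import numpy as np
        from hmmlearn import hmm  # third-party package, assumed to be installed

        rng = np.random.default_rng(0)

        # Synthetic stand-ins for instrument kinematics (e.g. x, y, z, speed) per trial.
        expert_trials = [rng.normal(0.0, 1.0, size=(200, 4)) for _ in range(10)]
        novice_trial = rng.normal(0.5, 1.5, size=(200, 4))

        # Train the "expert model" on the concatenated expert trials.
        X = np.vstack(expert_trials)
        lengths = [len(trial) for trial in expert_trials]
        expert_model = hmm.GaussianHMM(n_components=5, covariance_type="diag", n_iter=50)
        expert_model.fit(X, lengths)

        def skill_score(trial):
            # Higher per-sample log-likelihood means the gesture is closer to the expert model.
            return expert_model.score(trial) / len(trial)

        print("expert-like trial score:", round(skill_score(expert_trials[0]), 2))
        print("novice-like trial score:", round(skill_score(novice_trial), 2))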

  15. MATLAB Simulation of Photovoltaic and Photovoltaic/Thermal Systems Performance

    Science.gov (United States)

    Nasir, Farah H. M.; Husaini, Yusnira

    2018-03-01

    The efficiency of a photovoltaic module decreases as the photovoltaic cell temperature increases under solar irradiance. One solution is to add a cooling system to the photovoltaic system; this combination forms the photovoltaic-thermal (PV/T) system, which generates both electricity and heat at the same time. The aim of this research is to model and simulate the electrical performance of photovoltaic (PV) and photovoltaic-thermal (PV/T) systems using the single-diode equivalent circuit model. Both PV and PV/T models are developed in Matlab/Simulink. By providing a cooling system in the PV/T configuration, the efficiency of the system can be increased by decreasing the PV cell temperature. The maximum thermal, electrical and total efficiency values of the PV/T system in the present research are 35.18%, 15.56% and 50.74%, respectively, obtained at a solar irradiance of 400 W/m2, a mass flow rate of 0.05 kg s-1 and an inlet temperature of 25 °C. The photovoltaic-thermal system thus shows higher efficiency compared to the photovoltaic system.
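
    A minimal sketch of the single-diode equivalent circuit calculation is given below; the module parameters are illustrative assumptions (not the module studied in the paper), and the implicit diode equation is solved with a bracketed root finder rather than in Matlab/Simulink.

        import numpy as np
        from scipy.optimize import brentq

        K_B, Q_E = 1.380649e-23, 1.602176634e-19

        def single_diode_current(v, i_ph=8.2, i_0=6e-8, r_s=0.3, r_sh=300.0,
                                 n=1.3, n_cells=60, t_cell=298.15):
            """Module current at voltage v from the single-diode equation
            I = Iph - I0*(exp((V + I*Rs)/(n*Ns*Vt)) - 1) - (V + I*Rs)/Rsh.
            All parameter values here are illustrative assumptions."""
            vt = K_B * t_cell / Q_E
            def residual(i):
                return (i_ph - i_0 * (np.exp((v + i * r_s) / (n * n_cells * vt)) - 1.0)
                        - (v + i * r_s) / r_sh - i)
            return brentq(residual, -100.0, i_ph + 1.0)

        voltages = np.linspace(0.0, 38.0, 200)
        powers = np.array([v * single_diode_current(v) for v in voltages])
        k = powers.argmax()
        print("approximate maximum power point: %.1f W at %.1f V" % (powers[k], voltages[k]))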

  16. Performance Test of Core Protection and Monitoring Algorithm with DLL for SMART Simulator Implementation

    International Nuclear Information System (INIS)

    Koo, Bonseung; Hwang, Daehyun; Kim, Keungkoo

    2014-01-01

    A multi-purpose best-estimate simulator for SMART is being established, intended to be used as a tool to evaluate the impact of design changes on safety performance and to improve and/or optimize the operating procedures of SMART. In keeping with these intentions, a real-time model of the digital core protection and monitoring systems was developed, and the real-time performance of the models was verified for various simulation scenarios. In this paper, a performance test of the core protection and monitoring algorithm with a DLL file for the SMART simulator implementation was performed. A DLL file of the simulator application code was made, and several real-time evaluation tests were conducted for steady-state and transient conditions with simulated system variables. A performance test of the core protection and monitoring algorithms for the SMART simulator was thus carried out: a DLL file of the simulator version of the code was made, and several real-time evaluation tests were conducted for various scenarios with the DLL file and simulated system variables. The results of all test cases showed good agreement with the reference results, and some features caused by algorithm changes were properly reflected in the DLL results. Therefore, it was concluded that the SCOPS_SIM and SCOMS_SIM algorithms and calculational capabilities are appropriate for the core protection and monitoring program in the SMART simulator

  17. Predictive Capability Maturity Model for computational modeling and simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy; Pilch, Martin M.

    2007-10-01

    The Predictive Capability Maturity Model (PCMM) is a new model that can be used to assess the level of maturity of computational modeling and simulation (M&S) efforts. The development of the model is based on both the authors' experience and their analysis of similar investigations in the past. The perspective taken in this report is one of judging the usefulness of a predictive capability that relies on the numerical solution of partial differential equations to better inform and improve decision making. The review of past investigations, such as the Software Engineering Institute's Capability Maturity Model Integration and the National Aeronautics and Space Administration and Department of Defense Technology Readiness Levels, indicates that a more restricted, more interpretable method is needed to assess the maturity of an M&S effort. The PCMM addresses six contributing elements to M&S: (1) representation and geometric fidelity, (2) physics and material model fidelity, (3) code verification, (4) solution verification, (5) model validation, and (6) uncertainty quantification and sensitivity analysis. For each of these elements, attributes are identified that characterize four increasing levels of maturity. Importantly, the PCMM is a structured method for assessing the maturity of an M&S effort that is directed toward an engineering application of interest. The PCMM does not assess whether the M&S effort, the accuracy of the predictions, or the performance of the engineering system satisfies or does not satisfy specified application requirements.

  18. High performance computer code for molecular dynamics simulations

    International Nuclear Information System (INIS)

    Levay, I.; Toekesi, K.

    2007-01-01

    Complete text of publication follows. Molecular dynamics (MD) simulation is a widely used technique for modeling complicated physical phenomena. Since 2005 we have been developing an MD simulation code for PC computers. The computer code is written in the C++ object-oriented programming language. The aim of our work is twofold: a) to develop a fast computer code for the study of the random walk of guest atoms in a Be crystal, and b) to provide 3-dimensional (3D) visualization of the particle motion. In this case we mimic the motion of the guest atoms in the crystal (diffusion-type motion) and the motion of atoms in the crystal lattice (crystal deformation). Nowadays it is common to use graphics devices for computationally intensive problems. There are several ways to exploit this extreme processing performance, but never before has it been so easy to program these devices. The CUDA (Compute Unified Device Architecture) introduced by the nVidia Corporation in 2007 is very useful for every processor-hungry application. A unified-architecture GPU includes 96-128 or more stream processors, so the raw calculation performance is 576(!) GFLOPS, ten times faster than the fastest dual-core CPU [Fig. 1]. Our improved MD simulation software uses this new technology, which speeds it up so that the code runs 10 times faster in the critical calculation segment. Although the GPU is a very powerful tool, it has a strongly parallel structure. This means that we have to create an algorithm that works on several processors without deadlock. Our code currently uses 256 threads and shared and constant on-chip memory instead of global memory, which is 100 times slower. It is possible to implement the whole algorithm on the GPU, so we do not need to download and upload the data in every iteration. To achieve maximal throughput, every thread runs with the same instructions

  19. Building Performance Simulation for Sustainable Energy Use in Buildings

    NARCIS (Netherlands)

    Hensen, J.L.M.

    2010-01-01

    This paper aims to provide a general view of the background and current state of building performance simulation, which has the potential to deliver, directly or indirectly, substantial benefits to building stakeholders and to the environment. However the building simulation community faces many

  20. Building performance simulation for sustainable building design and operation

    NARCIS (Netherlands)

    Hensen, J.L.M.

    2011-01-01

    This paper aims to provide a general view of the background and current state of building performance simulation, which has the potential to deliver, directly or indirectly, substantial benefits to building stakeholders and to the environment. However the building simulation community faces many

  1. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    Computer Based ... universities, and later did system analysis, ... personal computers (PC) and low-cost software packages and tools. They can serve as a useful learning experience through student projects. Models are ... Let us consider a numerical example: to calculate the velocity of a trainer aircraft ...

  2. Thermal unit availability modeling in a regional simulation model

    International Nuclear Information System (INIS)

    Yamayee, Z.A.; Port, J.; Robinett, W.

    1983-01-01

    The System Analysis Model (SAM) developed under the umbrella of PNUCC's System Analysis Committee is capable of simulating the operation of a given load/resource scenario. This model employs a Monte-Carlo simulation to incorporate uncertainties. Among the uncertainties modeled is thermal unit availability, both for energy simulations (seasonal) and capacity simulations (hourly). This paper presents the availability modeling in the capacity and energy models. The use of regional and national data in deriving the two availability models, the interaction between the two, and the modifications made to the capacity model in order to reflect regional practices are presented. A sample problem is presented to show the modification process. Results for modeling a nuclear unit using NERC-GADS are presented
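
    A very small sketch of Monte-Carlo availability sampling of this kind is shown below; the two-state up/down model and the MTBF and MTTR figures are illustrative assumptions, not SAM's actual availability model.

        import random

        def simulate_unit_hours(hours=8760, mtbf=1000.0, mttr=50.0, seed=1):
            """Hourly up/down trace for one thermal unit using a simple two-state
            Markov model; hourly failure and repair probabilities are derived from
            assumed MTBF and MTTR values given in hours."""
            random.seed(seed)
            p_fail, p_repair = 1.0 / mtbf, 1.0 / mttr
            up, trace = True, []
            for _ in range(hours):
                if up and random.random() < p_fail:
                    up = False
                elif not up and random.random() < p_repair:
                    up = True
                trace.append(up)
            return trace

        trace = simulate_unit_hours()
        print("simulated availability: %.3f" % (sum(trace) / len(trace)))
        # Expected availability for these assumed rates: MTBF / (MTBF + MTTR) ~ 0.952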

  3. Alcohol consumption for simulated driving performance: A systematic review

    Directory of Open Access Journals (Sweden)

    Mohammad Saeid Rezaee-Zavareh

    2017-06-01

    Conclusion: Alcohol consumption may decrease simulated driving performance in people who have consumed alcohol compared with those who have not, via changes in SDSD, LPSD, speed, MLPD, LC and NA. More well-designed randomized controlled clinical trials are recommended.

  4. 20th Joint Workshop on Sustained Simulation Performance

    CERN Document Server

    Bez, Wolfgang; Focht, Erich; Patel, Nisarg; Kobayashi, Hiroaki

    2016-01-01

    The book presents the state of the art in high-performance computing and simulation on modern supercomputer architectures. It explores general trends in hardware and software development, and then focuses specifically on the future of high-performance systems and heterogeneous architectures. It also covers applications such as computational fluid dynamics, material science, medical applications and climate research and discusses innovative fields like coupled multi-physics or multi-scale simulations. The papers included were selected from the presentations given at the 20th Workshop on Sustained Simulation Performance at the HLRS, University of Stuttgart, Germany in December 2015, and the subsequent Workshop on Sustained Simulation Performance at Tohoku University in February 2016.

  5. A virtual laboratory notebook for simulation models.

    Science.gov (United States)

    Winfield, A J

    1998-01-01

    In this paper we describe how we have adopted the laboratory notebook as a metaphor for interacting with computer simulation models. This 'virtual' notebook stores the simulation output and meta-data (which is used to record the scientist's interactions with the simulation). The meta-data stored consists of annotations (equivalent to marginal notes in a laboratory notebook), a history tree and a log of user interactions. The history tree structure records when in 'simulation' time, and from what starting point in the tree changes are made to the parameters by the user. Typically these changes define a new run of the simulation model (which is represented as a new branch of the history tree). The tree shows the structure of the changes made to the simulation and the log is required to keep the order in which the changes occurred. Together they form a record which you would normally find in a laboratory notebook. The history tree is plotted in simulation parameter space. This shows the scientist's interactions with the simulation visually and allows direct manipulation of the parameter information presented, which in turn is used to control directly the state of the simulation. The interactions with the system are graphical and usually involve directly selecting or dragging data markers and other graphical control devices around in parameter space. If the graphical manipulators do not provide precise enough control then textual manipulation is still available which allows numerical values to be entered by hand. The Virtual Laboratory Notebook, by providing interesting interactions with the visual view of the history tree, provides a mechanism for giving the user complex and novel ways of interacting with biological computer simulation models.
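
    A minimal data-structure sketch of the history-tree idea is given below; all class and attribute names are hypothetical, and the graphical interaction layer described in the paper is ignored.

        import time

        class HistoryNode:
            """One point in the history tree: a parameter set plus an annotation."""
            def __init__(self, params, parent=None, note=""):
                self.params = dict(params)
                self.parent = parent
                self.children = []
                self.note = note            # marginal annotation
                self.created = time.time()  # timestamp recorded in the interaction log

        class Notebook:
            """Minimal virtual-notebook metaphor: a branching record of simulation runs."""
            def __init__(self, initial_params):
                self.root = HistoryNode(initial_params, note="initial run")
                self.log = [("create", self.root.created, dict(initial_params))]

            def branch(self, node, changed_params, note=""):
                # A parameter change defines a new run, i.e. a new branch of the tree.
                child = HistoryNode({**node.params, **changed_params}, parent=node, note=note)
                node.children.append(child)
                self.log.append(("branch", child.created, dict(changed_params)))
                return child

        nb = Notebook({"growth_rate": 0.1, "carrying_capacity": 100})
        run2 = nb.branch(nb.root, {"growth_rate": 0.2}, note="doubled growth rate")
        print(len(nb.root.children), "branch(es),", len(nb.log), "log entries")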

  6. Advances in Electrochemical Models for Predicting the Cycling Performance of Traction Batteries: Experimental Study on Ni-MH and Simulation Développement de modèles électrochimiques de batteries de traction pour la prédiction de performances : étude expérimentale de batteries NiMH et simulations

    Directory of Open Access Journals (Sweden)

    Bernard J.

    2009-11-01

    Full Text Available Rigorous electrochemical models to simulate the cycling performance of batteries have been successfully developed and reported in the literature. They constitute a very promising approach for State-of-Charge (SoC) estimation based on the physics of the cell, compared with other methods, since SoC is an internal parameter of these physical models. However, the computational time needed to solve electrochemical battery models for online applications requires the development of a simplified physics-based battery model. In this work, our goal is to present and validate an advanced 0D electrochemical model of a Ni-MH cell, as an example. This lumped-parameter model will be used to design an extended Kalman filter to predict the SoC of a Ni-MH pack. The model is presented, followed by an extensive experimental study conducted on Ni-MH cells to better understand the mechanisms of the physico-chemical phenomena occurring at both electrodes and to support the model development. The last part of the paper focuses on the evaluation of the model with regard to experimental results obtained on Ni-MH sealed cells and also on the related commercial HEV battery pack.

  7. Aircraft vulnerability analysis by modeling and simulation

    Science.gov (United States)

    Willers, Cornelius J.; Willers, Maria S.; de Waal, Alta

    2014-10-01

    guidance acceleration and seeker sensitivity. For the purpose of this investigation the aircraft is equipped with conventional pyrotechnic decoy flares and the missile has no counter-countermeasure means (security restrictions on open publication). This complete simulation is used to calculate the missile miss distance when the missile is launched from different locations around the aircraft. The miss distance data are then graphically presented, showing miss distance (aircraft vulnerability) as a function of launch direction and range. The aircraft vulnerability graph accounts for aircraft and missile characteristics, but does not account for missile deployment doctrine. A Bayesian network is constructed to fuse the doctrinal rules with the aircraft vulnerability data. The Bayesian network then provides the capability to evaluate the combined risk of missile launch and aircraft vulnerability. It is shown in this paper that it is indeed possible to predict the aircraft vulnerability to missile attack through a comprehensive modelling and holistic process. By using the appropriate real-world models, this approach is used to evaluate the effectiveness of specific countermeasure techniques against specific missile threats. The use of a Bayesian network provides the means to fuse simulated performance data with more abstract doctrinal rules to provide a realistic assessment of the aircraft vulnerability.

  8. Long-term simulation of temporal change of soil organic carbon in Denmark: comparison of three model performances under climate change

    DEFF Research Database (Denmark)

    Öztürk, Isik; Sharif, Behzad; Santhome, Sanmohan

    2018-01-01

    The temporal change in soil organic carbon (SOC) was analysed over an 80-year period based on climate change predictions of four regional circulation models under the International Panel on Climate Change (IPCC) A1B emission scenario in the 21st century. A 20-year (1991–2010) set of observed ... to the total variance of random variation was quantified. Statistical analysis showed that the crop-soil models are the main source of uncertainty in analysing soil C and N responses to climate change.

  9. Bridging experiments, models and simulations

    DEFF Research Database (Denmark)

    Carusi, Annamaria; Burrage, Kevin; Rodríguez, Blanca

    2012-01-01

    Computational models in physiology often integrate functional and structural information from a large range of spatiotemporal scales, from the ionic to the whole-organ level. Their sophistication raises both expectations and skepticism concerning how computational methods can improve our understanding of living organisms and also how they can reduce, replace, and refine animal experiments. A fundamental requirement to fulfill these expectations and achieve the full potential of computational physiology is a clear understanding of what models represent and how they can be validated. The present ... that contributes to defining the specific aspects of cardiac electrophysiology the MSE system targets, rather than being only an external test, and that this is driven by advances in experimental and computational methods and the combination of both.

  10. 24th & 25th Joint Workshop on Sustained Simulation Performance

    CERN Document Server

    Bez, Wolfgang; Focht, Erich; Gienger, Michael; Kobayashi, Hiroaki

    2017-01-01

    This book presents the state of the art in High Performance Computing on modern supercomputer architectures. It addresses trends in hardware and software development in general, as well as the future of High Performance Computing systems and heterogeneous architectures. The contributions cover a broad range of topics, from improved system management to Computational Fluid Dynamics, High Performance Data Analytics, and novel mathematical approaches for large-scale systems. In addition, they explore innovative fields like coupled multi-physics and multi-scale simulations. All contributions are based on selected papers presented at the 24th Workshop on Sustained Simulation Performance, held at the University of Stuttgart’s High Performance Computing Center in Stuttgart, Germany in December 2016 and the subsequent Workshop on Sustained Simulation Performance, held at the Cyberscience Center, Tohoku University, Japan in March 2017.

  11. Performability Modelling Tools, Evaluation Techniques and Applications

    NARCIS (Netherlands)

    Haverkort, Boudewijn R.H.M.

    1990-01-01

    This thesis deals with three aspects of quantitative evaluation of fault-tolerant and distributed computer and communication systems: performability evaluation techniques, performability modelling tools, and performability modelling applications. Performability modelling is a relatively new

  12. Optimization Model for Web Based Multimodal Interactive Simulations.

    Science.gov (United States)

    Halic, Tansel; Ahn, Woojin; De, Suvranu

    2015-07-15

    This paper presents a technique for optimizing the performance of web based multimodal interactive simulations. For such applications, where visual quality and the performance of simulations directly influence user experience, overloading of hardware resources may result in an unsatisfactory reduction in the quality of the simulation and in user satisfaction. However, optimization of simulation performance on individual hardware platforms is not practical. Hence, we present a mixed integer programming model to optimize the performance of graphical rendering and simulation while satisfying application-specific constraints. Our approach includes three distinct phases: identification, optimization and update. In the identification phase, the computing and rendering capabilities of the client device are evaluated using an exploratory proxy code. This data is utilized in conjunction with user-specified design requirements in the optimization phase to ensure the best possible computational resource allocation. The optimum solution is used for rendering (e.g. texture size, canvas resolution) and simulation parameters (e.g. simulation domain) in the update phase. Test results are presented on multiple hardware platforms with diverse computing and graphics capabilities to demonstrate the effectiveness of our approach.
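
    The flavour of the optimization phase can be illustrated with the small sketch below, which exhaustively searches two discrete quality knobs under an assumed frame-time budget; a real implementation would hand the same structure to a mixed integer programming solver, and every name, weight and limit here is an illustrative assumption rather than the paper's model.

        from itertools import product

        # Candidate discrete settings (all values are illustrative assumptions).
        texture_sizes = [256, 512, 1024, 2048]   # rendering quality knob
        mesh_nodes = [500, 1000, 2000, 4000]     # simulation-domain resolution knob

        def quality(tex, nodes):
            # Assumed quality score favouring higher texture and mesh resolution.
            return tex / 2048.0 + 2.0 * nodes / 4000.0

        def cost_ms(tex, nodes):
            # Assumed per-frame cost on the client profiled in the identification phase.
            return 0.004 * tex + 0.006 * nodes

        budget_ms = 25.0  # assumed frame-time budget constraint
        best = max((c for c in product(texture_sizes, mesh_nodes) if cost_ms(*c) <= budget_ms),
                   key=lambda c: quality(*c))
        print("chosen (texture size, mesh nodes):", best, "cost:", cost_ms(*best), "ms")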

  13. A satellite simulator for TRMM PR applied to climate model simulations

    Science.gov (United States)

    Spangehl, T.; Schroeder, M.; Bodas-Salcedo, A.; Hollmann, R.; Riley Dellaripa, E. M.; Schumacher, C.

    2017-12-01

    Climate model simulations have to be compared against observation based datasets in order to assess their skill in representing precipitation characteristics. Here we use a satellite simulator for TRMM PR in order to evaluate simulations performed with MPI-ESM (Earth system model of the Max Planck Institute for Meteorology in Hamburg, Germany) performed within the MiKlip project (https://www.fona-miklip.de/, funded by Federal Ministry of Education and Research in Germany). While classical evaluation methods focus on geophysical parameters such as precipitation amounts, the application of the satellite simulator enables an evaluation in the instrument's parameter space thereby reducing uncertainties on the reference side. The CFMIP Observation Simulator Package (COSP) provides a framework for the application of satellite simulators to climate model simulations. The approach requires the introduction of sub-grid cloud and precipitation variability. Radar reflectivities are obtained by applying Mie theory, with the microphysical assumptions being chosen to match the atmosphere component of MPI-ESM (ECHAM6). The results are found to be sensitive to the methods used to distribute the convective precipitation over the sub-grid boxes. Simple parameterization methods are used to introduce sub-grid variability of convective clouds and precipitation. In order to constrain uncertainties a comprehensive comparison with sub-grid scale convective precipitation variability which is deduced from TRMM PR observations is carried out.

  14. submitter Simulation-Based Performance Analysis of the ALICE Mass Storage System

    CERN Document Server

    Vickovic, L; Celar, S

    2016-01-01

    CERN, the European Organization for Nuclear Research, is today, in the era of big data, one of the biggest data generators in the world. Especially interesting is the transient data storage system in the ALICE experiment. With the goal of optimizing its performance, this paper discusses a dynamic, discrete event simulation model of a disk-based Storage Area Network (SAN) and its usage for performance analyses. The storage system model is based on a modular, bottom-up approach, and the differences between measured and simulated values vary between 1.5 % and 4 % depending on the simulated component. Once finished, the simulation model was used for detailed performance analyses. Among other findings it showed that system performance can be seriously affected if the array stripe size is larger than the size of the cache on the individual disks in the array, which so far has been completely ignored in the literature.

  15. Simulation

    DEFF Research Database (Denmark)

    Gould, Derek A; Chalmers, Nicholas; Johnson, Sheena J

    2012-01-01

    Recognition of the many limitations of traditional apprenticeship training is driving new approaches to learning medical procedural skills. Among the simulation technologies and methods available today, computer-based systems are topical and bring the benefits of automated, repeatable, and reliable performance assessments. Human factors research is central to simulator model development that is relevant to real-world imaging-guided interventional tasks and to the credentialing programs in which it would be used.

  16. HPC Performance Analysis of a Distributed Information Enterprise Simulation

    National Research Council Canada - National Science Library

    Hanna, James P; Walter, Martin J; Hillman, Robert G

    2004-01-01

    ... The analysis identified several performance limitations and bottlenecks. One critical limitation addressed and eliminated was simultaneously mixing a periodic process model with an event-driven model, causing rollbacks...

  17. Contribution to the Development of Simulation Model of Ship Turbine

    Directory of Open Access Journals (Sweden)

    Božić Ratko

    2015-01-01

    Full Text Available Simulation modelling, performed with the System Dynamics modelling approach and intensive use of computers, is one of the most convenient and most successful scientific methods for analysing the performance dynamics of nonlinear and very complex natural, technical and organizational systems [1]. The purpose of this work is to demonstrate the successful application of system dynamics simulation modelling to analysing the performance dynamics of a complex ship propulsion system. A gas turbine is a complex non-linear system which needs to be systematically investigated as a unit consisting of a number of subsystems and elements that are linked by cause-effect feedback loops, both within the propulsion system and with the relevant surroundings. In this paper the authors present an efficient application of the scientific method for the study of complex dynamic systems known as qualitative and quantitative simulation within the System Dynamics methodology. The gas turbine is represented by a set of non-linear differential equations, after which mental-verbal structural models and flowcharts in System Dynamics symbols are produced, and the performance dynamics under load conditions are simulated in the POWERSIM simulation language.

  18. Proceedings of eSim 2006 : IBPSA-Canada's 4. biennial building performance simulation conference

    International Nuclear Information System (INIS)

    Kesik, T.

    2006-01-01

    This conference was attended by professionals, academics and students interested in promoting the science of building performance simulation in order to optimize design, construction, operation and maintenance of new and existing buildings around the world. This biennial conference and exhibition covered all topics related to computerized simulation of a building's energy performance and energy efficiency. Computerized simulation is widely used to predict the environmental performance of buildings during all stages of a building's life cycle, from the design, commissioning, construction, occupancy and management stages. Newly developed simulation methods for optimal comfort in new and existing buildings were evaluated. The themes of the conference were: recent developments for modelling the physical processes relevant to buildings; algorithms for modelling conventional and innovative HVAC systems; methods for modelling whole-building performance; building simulation software development; the use of building simulation tools in code compliance; moving simulation into practice; validation of building simulation software; architectural design; and optimization approaches in building design. The conference also covered the modeling of energy supply systems with reference to renewable energy sources such as ground source heat pumps or hybrid systems incorporating solar energy. The conference featured 32 presentations, of which 28 have been catalogued separately for inclusion in this database. refs., tabs., figs

  19. Regularization modeling for large-eddy simulation

    NARCIS (Netherlands)

    Geurts, Bernardus J.; Holm, D.D.

    2003-01-01

    A new modeling approach for large-eddy simulation (LES) is obtained by combining a "regularization principle" with an explicit filter and its inversion. This regularization approach allows a systematic derivation of the implied subgrid model, which resolves the closure problem. The central role of

  20. Analytical system dynamics modeling and simulation

    CERN Document Server

    Fabien, Brian C

    2008-01-01

    This book offering a modeling technique based on Lagrange's energy method includes 125 worked examples. Using this technique enables one to model and simulate systems as diverse as a six-link, closed-loop mechanism or a transistor power amplifier.

  1. Relating Standardized Visual Perception Measures to Simulator Visual System Performance

    Science.gov (United States)

    Kaiser, Mary K.; Sweet, Barbara T.

    2013-01-01

    Human vision is quantified through the use of standardized clinical vision measurements. These measurements typically include visual acuity (near and far), contrast sensitivity, color vision, stereopsis (a.k.a. stereo acuity), and visual field periphery. Simulator visual system performance is specified in terms such as brightness, contrast, color depth, color gamut, gamma, resolution, and field-of-view. How do these simulator performance characteristics relate to the perceptual experience of the pilot in the simulator? In this paper, visual acuity and contrast sensitivity will be related to simulator visual system resolution, contrast, and dynamic range; similarly, color vision will be related to color depth/color gamut. Finally, we will consider how some characteristics of human vision not typically included in current clinical assessments could be used to better inform simulator requirements (e.g., relating dynamic characteristics of human vision to update rate and other temporal display characteristics).
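
    One simple way to connect the two vocabularies is to convert display resolution and field of view into arcminutes per pixel and compare that with the roughly one arcminute of detail resolved by 20/20 vision; the short sketch below does this for an assumed display channel (the numbers are illustrative, not taken from the paper).

        def arcmin_per_pixel(fov_deg, pixels):
            """Angular size of one display pixel, in arcminutes, across a given field of view."""
            return fov_deg * 60.0 / pixels

        def snellen_denominator(arcmin):
            """Rough Snellen equivalent: 20/20 vision resolves about 1 arcminute of detail."""
            return 20.0 * arcmin

        # Assumed channel: 3840 pixels spanning a 60-degree horizontal field of view.
        a = arcmin_per_pixel(fov_deg=60.0, pixels=3840)
        print("%.2f arcmin/pixel, roughly Snellen 20/%.0f" % (a, snellen_denominator(a)))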

  2. Hybrid simulation models of production networks

    CERN Document Server

    Kouikoglou, Vassilis S

    2001-01-01

    This book is concerned with a most important area of industrial production, that of analysis and optimization of production lines and networks using discrete-event models and simulation. The book introduces a novel approach that combines analytic models and discrete-event simulation. Unlike conventional piece-by-piece simulation, this method observes a reduced number of events between which the evolution of the system is tracked analytically. Using this hybrid approach, several models are developed for the analysis of production lines and networks. The hybrid approach combines speed and accuracy for exceptional analysis of most practical situations. A number of optimization problems, involving buffer design, workforce planning, and production control, are solved through the use of hybrid models.

  3. Dynamic modeling and simulation of wind turbines

    International Nuclear Information System (INIS)

    Ghafari Seadat, M.H.; Kheradmand Keysami, M.; Lari, H.R.

    2002-01-01

    Using wind energy to generate electricity in wind turbines is a good way of using renewable energy, and it can also help to protect the environment. The main objective of this paper is the dynamic modeling of a wind turbine by the energy method and its computer-aided simulation. In this paper, the equations of motion are extracted for simulating the wind turbine system, and the behavior of the system is then obtained by solving the equations. For the simulation, the turbine is considered to have a three-blade rotor facing the wind, an induction generator connected to the network, and constant rotational speed. Every part of the wind turbine must be simulated to simulate the turbine as a whole; the main parts are the blades, the gearbox, the shafts and the generator

  4. Digital Quantum Simulation of Spin Models with Circuit Quantum Electrodynamics

    OpenAIRE

    Salathé, Y.; Mondal, M.; Oppliger, M.; Heinsoo, J.; Kurpiers, P.; Potočnik, A.; Mezzacapo, Antonio; Las Heras García, Urtzi; Lamata Manuel, Lucas; Solano Villanueva, Enrique Leónidas; Filipp, S.; Wallraff, A.

    2015-01-01

    Systems of interacting quantum spins show a rich spectrum of quantum phases and display interesting many-body dynamics. Computing characteristics of even small systems on conventional computers poses significant challenges. A quantum simulator has the potential to outperform standard computers in calculating the evolution of complex quantum systems. Here, we perform a digital quantum simulation of the paradigmatic Heisenberg and Ising interacting spin models using a two transmon-qubit circuit...

  5. Equipment and performance upgrade of compact nuclear simulator

    International Nuclear Information System (INIS)

    Park, J. C.; Kwon, K. C.; Lee, D. Y.; Hwang, I. K.; Park, W. M.; Cha, K. H.; Song, S. J.; Lee, J. W.; Kim, B. G.; Kim, H. J.

    1999-01-01

    The simulator at the Nuclear Training Center in KAERI has become old and has not been used effectively for nuclear-related training and research due to problems such as aging equipment, the difficulty and high cost of obtaining consumables, and the decreasing number of personnel able to handle the old equipment. To solve these problems, this study was performed to recover the functions of the simulator through the technical design and replacement of components with new ones. As a result of this study, our tests after the replacement showed the same simulation status as before, and the new graphic displays added to the simulator were effective for training and easy to maintain. This study is meaningful in demonstrating a way of upgrading nuclear training simulators that have lost their functionality due to the obsolescence of the simulators and the unavailability of components

  6. A statistical model for predicting muscle performance

    Science.gov (United States)

    Byerly, Diane Leslie De Caix

    The objective of these studies was to develop a capability for predicting muscle performance and fatigue to be utilized for both space- and ground-based applications. To develop this predictive model, healthy test subjects performed a defined, repetitive dynamic exercise to failure using a Lordex spinal machine. Throughout the exercise, surface electromyography (SEMG) data were collected from the erector spinae using a Mega Electronics ME3000 muscle tester and surface electrodes placed on both sides of the back muscle. These data were analyzed using a 5th order Autoregressive (AR) model and statistical regression analysis. It was determined that an AR derived parameter, the mean average magnitude of AR poles, significantly correlated with the maximum number of repetitions (designated Rmax) that a test subject was able to perform. Using the mean average magnitude of AR poles, a test subject's performance to failure could be predicted as early as the sixth repetition of the exercise. This predictive model has the potential to provide a basis for improving post-space flight recovery, monitoring muscle atrophy in astronauts and assessing the effectiveness of countermeasures, monitoring astronaut performance and fatigue during Extravehicular Activity (EVA) operations, providing pre-flight assessment of the ability of an EVA crewmember to perform a given task, improving the design of training protocols and simulations for strenuous International Space Station assembly EVA, and enabling EVA work task sequences to be planned enhancing astronaut performance and safety. Potential ground-based, medical applications of the predictive model include monitoring muscle deterioration and performance resulting from illness, establishing safety guidelines in the industry for repetitive tasks, monitoring the stages of rehabilitation for muscle-related injuries sustained in sports and accidents, and enhancing athletic performance through improved training protocols while reducing
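
    The core computation, fitting an autoregressive model to an SEMG window and taking the mean magnitude of the AR poles, can be sketched with plain least squares as follows; the order-5 model matches the paper, but the synthetic data and the implementation details are assumptions.

        import numpy as np

        def fit_ar(signal, order=5):
            """Least-squares fit of coefficients a so that x[t] ~ sum_k a[k] * x[t-k-1]."""
            x = np.asarray(signal, dtype=float)
            rows = np.column_stack([x[order - k - 1:len(x) - k - 1] for k in range(order)])
            coeffs, *_ = np.linalg.lstsq(rows, x[order:], rcond=None)
            return coeffs

        def mean_pole_magnitude(coeffs):
            """Mean magnitude of the roots of z^p - a1*z^(p-1) - ... - ap = 0."""
            return np.abs(np.roots(np.concatenate(([1.0], -coeffs)))).mean()

        # Synthetic stand-in for a window of SEMG samples with mild autocorrelation.
        rng = np.random.default_rng(42)
        emg = rng.normal(size=2000)
        for t in range(2, len(emg)):
            emg[t] += 0.5 * emg[t - 1] - 0.2 * emg[t - 2]

        a = fit_ar(emg, order=5)
        print("AR(5) coefficients:", np.round(a, 3))
        print("mean AR pole magnitude:", round(mean_pole_magnitude(a), 3))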

  7. Quantitative interface models for simulating microstructure evolution

    International Nuclear Information System (INIS)

    Zhu, J.Z.; Wang, T.; Zhou, S.H.; Liu, Z.K.; Chen, L.Q.

    2004-01-01

    To quantitatively simulate microstructural evolution in real systems, we investigated three different interface models: a sharp-interface model implemented by the software DICTRA and two diffuse-interface models which use either physical order parameters or artificial order parameters. A particular example is considered, the diffusion-controlled growth of a γ ' precipitate in a supersaturated γ matrix in Ni-Al binary alloys. All three models use the thermodynamic and kinetic parameters from the same databases. The temporal evolution profiles of composition from different models are shown to agree with each other. The focus is on examining the advantages and disadvantages of each model as applied to microstructure evolution in alloys

  8. A queuing model for road traffic simulation

    International Nuclear Information System (INIS)

    Guerrouahane, N.; Aissani, D.; Bouallouche-Medjkoune, L.; Farhi, N.

    2015-01-01

    We present in this article a stochastic queuing model for road traffic. The model is based on the M/G/c/c state-dependent queuing model and is inspired by the deterministic Godunov scheme for road traffic simulation. We first propose a variant of the M/G/c/c state-dependent model that works with density-flow fundamental diagrams rather than density-speed relationships. We then extend this model in order to consider upstream traffic demand as well as downstream traffic supply. Finally, we show how to model a whole road by concatenating road sections as in the deterministic Godunov scheme
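
    For orientation, a compact sketch of the deterministic Godunov-type demand/supply coupling that the stochastic M/G/c/c model is said to be inspired by, using a triangular density-flow fundamental diagram; the parameter values and the two-section example are illustrative assumptions, not taken from the article.

      # Triangular density-flow fundamental diagram (illustrative parameters)
      V_FREE, W_CONG, K_JAM = 25.0, 6.0, 0.2        # free speed [m/s], wave speed [m/s], jam density [veh/m]
      K_CRIT = W_CONG * K_JAM / (V_FREE + W_CONG)    # critical density at capacity

      def demand(k):
          """Sending flow of an upstream section."""
          return V_FREE * min(k, K_CRIT)

      def supply(k):
          """Receiving flow of a downstream section."""
          return W_CONG * (K_JAM - max(k, K_CRIT))

      # Two concatenated road sections, Godunov-style explicit update of densities
      dx, dt = 100.0, 2.0                            # section length [m], time step [s]
      k = [0.05, 0.18]                               # initial densities: light / congested
      inflow_demand = 0.4                            # upstream traffic demand [veh/s]
      for _ in range(300):
          q_in = min(inflow_demand, supply(k[0]))    # interface flow = min(demand, supply)
          q_mid = min(demand(k[0]), supply(k[1]))
          q_out = demand(k[1])                       # unrestricted downstream supply
          k[0] += dt / dx * (q_in - q_mid)
          k[1] += dt / dx * (q_mid - q_out)
      print(k)   # densities relax toward a steady state consistent with the demand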

  9. Clock error models for simulation and estimation

    International Nuclear Information System (INIS)

    Meditch, J.S.

    1981-10-01

    Mathematical models for the simulation and estimation of errors in precision oscillators used as time references in satellite navigation systems are developed. The results, based on all currently known oscillator error sources, are directly implementable on a digital computer. The simulation formulation is sufficiently flexible to allow for the inclusion or exclusion of individual error sources as desired. The estimation algorithms, following from Kalman filter theory, provide directly for the error analysis of clock errors in both filtering and prediction
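
    As a hedged illustration of the kind of clock model described, the sketch below propagates a two-state oscillator (phase and fractional frequency) driven by white and random-walk frequency noise plus deterministic aging; individual error sources can be switched off by zeroing their intensities, mirroring the inclusion/exclusion flexibility mentioned above. The noise intensities are placeholders, not values from the report, and the Kalman filtering step is not shown.

      import numpy as np

      def simulate_clock(n_steps, dt, q_wfm=1e-22, q_rwfm=1e-32, aging=1e-19, seed=0):
          """Propagate clock phase x [s] and fractional frequency y through time.

          q_wfm  : white frequency-noise intensity (drives the phase)
          q_rwfm : random-walk frequency-noise intensity (drives the frequency)
          aging  : deterministic linear frequency drift per second
          """
          rng = np.random.default_rng(seed)
          x = np.zeros(n_steps)      # time (phase) error
          y = np.zeros(n_steps)      # fractional frequency error
          for k in range(1, n_steps):
              y[k] = y[k - 1] + aging * dt + rng.normal(0.0, np.sqrt(q_rwfm * dt))
              x[k] = x[k - 1] + y[k - 1] * dt + rng.normal(0.0, np.sqrt(q_wfm * dt))
          return x, y

      x, y = simulate_clock(n_steps=86_400, dt=1.0)
      print(f"phase error after one day: {x[-1]:.3e} s")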

  10. Comments on ''Use of conditional simulation in nuclear waste site performance assessment'' by Carol Gotway

    International Nuclear Information System (INIS)

    Downing, D.J.

    1993-01-01

    This paper discusses Carol Gotway's paper, ''The Use of Conditional Simulation in Nuclear Waste Site Performance Assessment.'' The paper centers on the use of conditional simulation and the use of geostatistical methods to simulate an entire field of values for subsequent use in a complex computer model. The issues of sampling designs for geostatistics, semivariogram estimation and anisotropy, the turning bands method for random field generation, and estimation of the cumulative distribution function are brought out

  11. Diagnostic and model dependent uncertainty of simulated Tibetan permafrost area

    Science.gov (United States)

    Wang, W.; Rinke, A.; Moore, J. C.; Cui, X.; Ji, D.; Li, Q.; Zhang, N.; Wang, C.; Zhang, S.; Lawrence, D. M.; McGuire, A. D.; Zhang, W.; Delire, C.; Koven, C.; Saito, K.; MacDougall, A.; Burke, E.; Decharme, B.

    2016-02-01

    We perform a land-surface model intercomparison to investigate how the simulation of permafrost area on the Tibetan Plateau (TP) varies among six modern stand-alone land-surface models (CLM4.5, CoLM, ISBA, JULES, LPJ-GUESS, UVic). We also examine the variability in simulated permafrost area and distribution introduced by five different methods of diagnosing permafrost (from modeled monthly ground temperature, mean annual ground and air temperatures, and air and surface frost indexes). There is good agreement (99 to 135 × 10⁴ km²) between the two diagnostic methods based on air temperature, which are also consistent with the observation-based estimate of actual permafrost area (101 × 10⁴ km²). However, the uncertainty (1 to 128 × 10⁴ km²) using the three methods that require simulation of ground temperature is much greater. Moreover, simulated permafrost distribution on the TP is generally only fair to poor for these three methods (diagnosis of permafrost from monthly and mean annual ground temperature, and surface frost index), while permafrost distribution using air-temperature-based methods is generally good. Model evaluation at field sites highlights specific problems in process simulations likely related to soil texture specification, vegetation types and snow cover. Models are particularly poor at simulating permafrost distribution using the definition that soil temperature remains at or below 0 °C for 24 consecutive months, which requires reliable simulation of both mean annual ground temperature and its seasonal cycle, and hence is relatively demanding. Although models can produce better permafrost maps using mean annual ground temperature and surface frost index, analysis of simulated soil temperature profiles reveals substantial biases. The current generation of land-surface models needs to reduce biases in simulated soil temperature profiles before reliable contemporary permafrost maps and predictions of changes in future permafrost distribution can be made for
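
    A brief sketch of the strictest diagnostic named above - a cell counts as permafrost when simulated soil temperature stays at or below 0 °C for 24 consecutive months; the array layout and the synthetic monthly temperatures are illustrative only.

      import numpy as np

      def permafrost_mask(soil_temp_monthly):
          """soil_temp_monthly: array (n_months, n_cells) of ground temperature in deg C.

          Returns a boolean mask marking cells with >= 24 consecutive months <= 0 deg C.
          """
          frozen = soil_temp_monthly <= 0.0
          n_months, n_cells = frozen.shape
          run = np.zeros(n_cells, dtype=int)
          longest = np.zeros(n_cells, dtype=int)
          for m in range(n_months):
              run = np.where(frozen[m], run + 1, 0)   # extend or reset the consecutive run
              longest = np.maximum(longest, run)
          return longest >= 24

      # Illustrative data: 10 years of monthly temperatures for 3 cells (cold, marginal, warm)
      rng = np.random.default_rng(1)
      months = np.arange(120)
      seasonal = 8.0 * np.cos(2 * np.pi * months / 12.0)[:, None]
      mean_t = np.array([-10.0, -1.0, 3.0])
      temps = mean_t + seasonal + rng.normal(0.0, 0.5, (120, 3))
      print(permafrost_mask(temps))   # with these settings only the cold cell qualifies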

  12. Constrained optimization via simulation models for new product innovation

    Science.gov (United States)

    Pujowidianto, Nugroho A.

    2017-11-01

    We consider the problem of constrained optimization where the decision makers aim to optimize the primary performance measure while constraining the secondary performance measures. This paper provides a brief overview of stochastically constrained optimization via discrete event simulation. Most review papers tend to be methodology-based. This review attempts to be problem-based, as decision makers may have already decided on the problem formulation. We consider constrained optimization models because there are usually constraints on secondary performance measures as trade-offs in new product development. The paper starts by laying out the different possible methods and the reasons for using constrained optimization via simulation models. It is then followed by a review of the different simulation optimization approaches to constrained optimization, depending on the number of decision variables, the type of constraints, and the risk preferences of the decision makers in handling uncertainties.

  13. Uterus models for use in virtual reality hysteroscopy simulators.

    Science.gov (United States)

    Niederer, Peter; Weiss, Stephan; Caduff, Rosmarie; Bajka, Michael; Szekély, Gabor; Harders, Matthias

    2009-05-01

    Virtual reality models of human organs are needed in surgery simulators which are developed for educational and training purposes. A simulation can only be useful, however, if the mechanical performance of the system in terms of force-feedback for the user as well as the visual representation is realistic. We therefore aim at developing a mechanical computer model of the organ in question which yields realistic force-deformation behavior under virtual instrument-tissue interactions and which, in particular, runs in real time. The modeling of the human uterus is described as it is to be implemented in a simulator for minimally invasive gynecological procedures. To this end, anatomical information which was obtained from specially designed computed tomography and magnetic resonance imaging procedures as well as constitutive tissue properties recorded from mechanical testing were used. In order to achieve real-time performance, the combination of mechanically realistic numerical uterus models of various levels of complexity with a statistical deformation approach is suggested. In view of mechanical accuracy of such models, anatomical characteristics including the fiber architecture along with the mechanical deformation properties are outlined. In addition, an approach to make this numerical representation potentially usable in an interactive simulation is discussed. The numerical simulation of hydrometra is shown in this communication. The results were validated experimentally. In order to meet the real-time requirements and to accommodate the large biological variability associated with the uterus, a statistical modeling approach is demonstrated to be useful.

  14. Fracture network modeling and GoldSim simulation support

    International Nuclear Information System (INIS)

    Sugita, Kenichirou; Dershowitz, W.

    2005-01-01

    During Heisei-16, Golder Associates provided support for JNC Tokai through discrete fracture network data analysis and simulation of the Mizunami Underground Research Laboratory (MIU), participation in Task 6 of the AEspoe Task Force on Modeling of Groundwater Flow and Transport, and development of methodologies for analysis of repository site characterization strategies and safety assessment. MIU support during H-16 involved updating the H-15 FracMan discrete fracture network (DFN) models for the MIU shaft region, and developing improved simulation procedures. Updates to the conceptual model included incorporation of 'Step2' (2004) versions of the deterministic structures, and revision of background fractures to be consistent with conductive structure data from the DH-2 borehole. Golder developed improved simulation procedures for these models through the use of hybrid discrete fracture network (DFN), equivalent porous medium (EPM), and nested DFN/EPM approaches. For each of these models, procedures were documented for the entire modeling process including model implementation, MMP simulation, and shaft grouting simulation. Golder supported JNC participation in Tasks 6AB, 6D and 6E of the AEspoe Task Force on Modeling of Groundwater Flow and Transport during H-16. For Task 6AB, Golder developed a new technique to evaluate the role of grout in performance assessment time-scale transport. For Task 6D, Golder submitted a report of H-15 simulations to SKB. For Task 6E, Golder carried out safety assessment time-scale simulations at the block scale, using the Laplace Transform Galerkin method. During H-16, Golder supported JNC's Total System Performance Assessment (TSPA) strategy by developing technologies for the analysis of the use of site characterization data in safety assessment. This approach will aid in the understanding of how site characterization can progressively reduce site characterization uncertainty. (author)

  15. Data management system performance modeling

    Science.gov (United States)

    Kiser, Larry M.

    1993-01-01

    This paper discusses analytical techniques that have been used to gain a better understanding of the Space Station Freedom's (SSF's) Data Management System (DMS). The DMS is a complex, distributed, real-time computer system that has been redesigned numerous times. The implications of these redesigns have not been fully analyzed. This paper discusses the advantages and disadvantages of static analytical techniques such as Rate Monotonic Analysis (RMA) and also provides a rationale for dynamic modeling. Factors such as system architecture, processor utilization, bus architecture, queuing, etc. are well suited for analysis with a dynamic model. The significance of performance measures for a real-time system is discussed.

  16. Virtual reality simulation training of mastoidectomy - studies on novice performance.

    Science.gov (United States)

    Andersen, Steven Arild Wuyts

    2016-08-01

    Virtual reality (VR) simulation-based training is increasingly used in surgical technical skills training including in temporal bone surgery. The potential of VR simulation in enabling high-quality surgical training is great and VR simulation allows high-stakes and complex procedures such as mastoidectomy to be trained repeatedly, independent of patients and surgical tutors, outside traditional learning environments such as the OR or the temporal bone lab, and with fewer of the constraints of traditional training. This thesis aims to increase the evidence-base of VR simulation training of mastoidectomy and, by studying the final-product performances of novices, investigates the transfer of skills to the current gold-standard training modality of cadaveric dissection, the effect of different practice conditions and simulator-integrated tutoring on performance and retention of skills, and the role of directed, self-regulated learning. Technical skills in mastoidectomy were transferable from the VR simulation environment to cadaveric dissection with significant improvement in performance after directed, self-regulated training in the VR temporal bone simulator. Distributed practice led to a better learning outcome and more consolidated skills than massed practice and also resulted in a more consistent performance after three months of non-practice. Simulator-integrated tutoring accelerated the initial learning curve but also caused over-reliance on tutoring, which resulted in a drop in performance when the simulator-integrated tutor-function was discontinued. The learning curves were highly individual but often plateaued early and at an inadequate level, which related to issues concerning both the procedure and the VR simulator, over-reliance on the tutor function and poor self-assessment skills. Future simulator-integrated automated assessment could potentially resolve some of these issues and provide trainees with both feedback during the procedure and immediate

  17. Validation of the simulator neutronics model

    International Nuclear Information System (INIS)

    Gregory, M.V.

    1984-01-01

    The neutronics model in the SRP reactor training simulator computes the variation with time of the neutron population in the reactor core. The power output of a reactor is directly proportional to the neutron population, thus in a very real sense the neutronics model determines the response of the simulator. The geometrical complexity of the reactor control system in SRP reactors requires the neutronics model to provide a detailed, 3D representation of the reactor core. Existing simulator technology does not allow such a detailed representation to run in real-time in a minicomputer environment, thus an entirely different approach to the problem was required. A prompt jump method has been developed in answer to this need

  18. Conceptual Modeling Framework for E-Area PA HELP Infiltration Model Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Dyer, J. A. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-11-30

    A conceptual modeling framework based on the proposed E-Area Low-Level Waste Facility (LLWF) closure cap design is presented for conducting Hydrologic Evaluation of Landfill Performance (HELP) model simulations of intact and subsided cap infiltration scenarios for the next E-Area Performance Assessment (PA).

  19. Simulation platform to model, optimize and design wind turbines

    Energy Technology Data Exchange (ETDEWEB)

    Iov, F.; Hansen, A.D.; Soerensen, P.; Blaabjerg, F.

    2004-03-01

    farms. The performance of these models is proven and they can be directly implemented in different simulation tools. Then, the general conclusions regarding the achieved results during the project are summarized and some guidelines for future work are given. A general conclusion is that the main goals of the project have been achieved. Finally, the papers and reports published during the project are presented. (au)

  20. Computational Fluid Dynamics and Building Energy Performance Simulation

    DEFF Research Database (Denmark)

    Nielsen, Peter V.; Tryggvason, Tryggvi

    An interconnection between a building energy performance simulation program and a Computational Fluid Dynamics program (CFD) for room air distribution will be introduced for improvement of the predictions of both the energy consumption and the indoor environment. The building energy performance...

  1. USING COPULAS TO MODEL DEPENDENCE IN SIMULATION RISK ASSESSMENT

    Energy Technology Data Exchange (ETDEWEB)

    Dana L. Kelly

    2007-11-01

    Typical engineering systems in applications with high failure consequences such as nuclear reactor plants often employ redundancy and diversity of equipment in an effort to lower the probability of failure and therefore risk. However, it has long been recognized that dependencies exist in these redundant and diverse systems. Some dependencies, such as common sources of electrical power, are typically captured in the logic structure of the risk model. Others, usually referred to as intercomponent dependencies, are treated implicitly by introducing one or more statistical parameters into the model. Such common-cause failure models have limitations in a simulation environment. In addition, substantial subjectivity is associated with parameter estimation for these models. This paper describes an approach in which system performance is simulated by drawing samples from the joint distributions of dependent variables. The approach relies on the notion of a copula distribution, a notion which has been employed by the actuarial community for ten years or more, but which has seen only limited application in technological risk assessment. The paper also illustrates how equipment failure data can be used in a Bayesian framework to estimate the parameter values in the copula model. This approach avoids much of the subjectivity required to estimate parameters in traditional common-cause failure models. Simulation examples are presented for failures in time. The open-source software package R is used to perform the simulations. The open-source software package WinBUGS is used to perform the Bayesian inference via Markov chain Monte Carlo sampling.
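
    A minimal sketch of drawing dependent failure times through a Gaussian copula, in the spirit described above; the correlation value, the exponential marginals and the component count are assumptions for illustration, and the parameter estimation step (done with Bayesian inference in WinBUGS in the paper) is not shown.

      import numpy as np
      from scipy import stats

      def sample_dependent_failure_times(n, rates, rho, seed=0):
          """Draw n joint failure-time samples for components with exponential marginals.

          rates : per-component failure rates (1/h)
          rho   : pairwise correlation of the underlying Gaussian copula
          """
          d = len(rates)
          corr = np.full((d, d), rho)
          np.fill_diagonal(corr, 1.0)
          rng = np.random.default_rng(seed)
          z = rng.multivariate_normal(np.zeros(d), corr, size=n)    # correlated normals
          u = stats.norm.cdf(z)                                     # copula: dependent uniforms
          return stats.expon.ppf(u, scale=1.0 / np.asarray(rates))  # back to exponential marginals

      # Two redundant pumps with correlated failure behaviour (illustrative numbers)
      t = sample_dependent_failure_times(100_000, rates=[1e-4, 1e-4], rho=0.8)
      both_fail_within_year = np.mean(t.max(axis=1) < 8760.0)
      print(both_fail_within_year)

    With rho = 0, the joint probability would simply be the product of the marginal probabilities; the copula raises it, which is the dependence effect the paper targets.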

  2. The Five Key Questions of Human Performance Modeling.

    Science.gov (United States)

    Wu, Changxu

    2018-01-01

    Via building computational (typically mathematical and computer simulation) models, human performance modeling (HPM) quantifies, predicts, and maximizes human performance, human-machine system productivity and safety. This paper describes and summarizes the five key questions of human performance modeling: 1) Why we build models of human performance; 2) What the expectations of a good human performance model are; 3) What the procedures and requirements in building and verifying a human performance model are; 4) How we integrate a human performance model with system design; and 5) What the possible future directions of human performance modeling research are. Recent and classic HPM findings are addressed in the five questions to provide new thinking in HPM's motivations, expectations, procedures, system integration and future directions.

  3. Tsunami simulation using submarine displacement calculated from simulation of ground motion due to seismic source model

    Science.gov (United States)

    Akiyama, S.; Kawaji, K.; Fujihara, S.

    2013-12-01

    Since fault fracturing due to an earthquake can simultaneously cause ground motion and a tsunami, it is appropriate to evaluate the ground motion and the tsunami with a single fault model. However, separate source models are often used independently in ground motion simulation and in tsunami simulation, because of the difficulty of evaluating both phenomena simultaneously. Many source models for the 2011 off the Pacific coast of Tohoku Earthquake have been proposed from inversion analyses of seismic observations or of tsunami observations. Most of these models show similar features, in which a large amount of slip is located at the shallower part of the fault area near the Japan Trench. This indicates that the ground motion and the tsunami can be evaluated with a single source model. Therefore, we examine the possibility of tsunami prediction using a fault model estimated from seismic observation records. In this study, we carry out a tsunami simulation using the displacement field of oceanic crustal movements calculated from a ground motion simulation of the 2011 off the Pacific coast of Tohoku Earthquake. We use two fault models by Yoshida et al. (2011), which are based on the teleseismic body wave and on the strong ground motion records, respectively. Although the fault models share a common feature, the amount of slip near the Japan Trench is larger in the fault model from the strong ground motion records than in that from the teleseismic body wave. First, large-scale ground motion simulations applying those fault models are performed with a voxel-type finite element method for the whole of eastern Japan. The synthetic waveforms computed from the simulations are generally consistent with the observation records of the K-NET (Kinoshita (1998)) and KiK-net stations (Aoi et al. (2000)), deployed by the National Research Institute for Earth Science and Disaster Prevention (NIED). Next, the tsunami simulations are performed by the finite

  4. High performance stream computing for particle beam transport simulations

    International Nuclear Information System (INIS)

    Appleby, R; Bailey, D; Higham, J; Salt, M

    2008-01-01

    Understanding modern particle accelerators requires simulating charged particle transport through the machine elements. These simulations can be very time consuming due to the large number of particles and the need to consider many turns of a circular machine. Stream computing offers an attractive way to dramatically improve the performance of such simulations by calculating the simultaneous transport of many particles using dedicated hardware. Modern Graphics Processing Units (GPUs) are powerful and affordable stream computing devices. The results of simulations of particle transport through the booster-to-storage-ring transfer line of the DIAMOND synchrotron light source using an NVidia GeForce 7900 GPU are compared to the standard transport code MAD. It is found that particle transport calculations are suitable for stream processing and large performance increases are possible. The accuracy and potential speed gains are compared and the prospects for future work in the area are discussed
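
    A compact sketch of the kind of data-parallel transport that stream computing exploits: every particle is a phase-space column and each beamline element is a linear map applied to all particles at once; the drift and thin-lens quadrupole matrices and the beam parameters here are textbook assumptions, not the DIAMOND lattice or the GPU code from the paper.

      import numpy as np

      def drift(L):
          """2x2 transfer matrix of a drift of length L (one transverse plane, linear optics)."""
          return np.array([[1.0, L], [0.0, 1.0]])

      def thin_quad(f):
          """2x2 thin-lens quadrupole of focal length f (f < 0 defocuses)."""
          return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

      # A toy FODO-like line; in a GPU/stream version each particle maps to one work item
      lattice = [drift(1.0), thin_quad(2.0), drift(2.0), thin_quad(-2.0), drift(1.0)]

      rng = np.random.default_rng(0)
      n_particles = 1_000_000
      beam = rng.normal(0.0, [[1e-3], [1e-4]], size=(2, n_particles))   # rows: x [m], x' [rad]

      for element in lattice:
          beam = element @ beam          # one matrix applied to all particles simultaneously

      print(beam.std(axis=1))            # r.m.s. beam size and divergence at the line exit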

  5. Distributed dynamic simulations of networked control and building performance applications.

    Science.gov (United States)

    Yahiaoui, Azzedine

    2018-02-01

    The use of computer-based automation and control systems for smart sustainable buildings, often so-called Automated Buildings (ABs), has become an effective way to automatically control, optimize, and supervise a wide range of building performance applications over a network while achieving the minimum energy consumption possible, and in doing so generally refers to Building Automation and Control Systems (BACS) architecture. Instead of costly and time-consuming experiments, this paper focuses on using distributed dynamic simulations to analyze the real-time performance of network-based building control systems in ABs and improve the functions of the BACS technology. The paper also presents the development and design of a distributed dynamic simulation environment with the capability of representing the BACS architecture in simulation by run-time coupling two or more different software tools over a network. The application and capability of this new dynamic simulation environment are demonstrated by an experimental design in this paper.

  6. Kanban simulation model for production process optimization

    Directory of Open Access Journals (Sweden)

    Golchev Riste

    2015-01-01

    A long time has passed since the KANBAN system was established as an efficient method for coping with excessive inventory. Still, the possibilities for its improvement through integration with other approaches should be investigated further. The basic research challenge of this paper is to present the benefits of KANBAN implementation supported by Discrete Event Simulation (DES). In that direction, the basics of the KANBAN system are first presented, with emphasis on the information and material flow, together with a methodology for implementation of the KANBAN system. An analysis of combining simulation with this methodology is presented. The paper concludes with a practical example which shows that, through understanding the philosophy of the KANBAN implementation methodology and the simulation methodology, a simulation model can be created which can serve as a basis for a variety of experiments that can be conducted within a short period of time, resulting in production process optimization.
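
    As a hedged illustration of the kind of discrete event simulation involved, the sketch below models a single kanban loop in which production of a container is authorised only while a kanban card is free, and reports fill rate and average inventory for different card counts; the single-stage layout, exponential times and rate values are invented for illustration and are not the authors' model.

      import heapq
      import random

      def simulate_kanban(n_kanbans, t_end=10_000.0, prod_rate=1.1, demand_rate=1.0, seed=0):
          """Single-stage pull loop; returns (fill rate, time-averaged finished-goods inventory)."""
          rng = random.Random(seed)
          inventory, served, demands = n_kanbans, 0, 0
          producing = False
          events = [(rng.expovariate(demand_rate), "demand")]
          now, t_prev, inv_area = 0.0, 0.0, 0.0
          while events:
              now, kind = heapq.heappop(events)
              if now > t_end:
                  break
              inv_area += inventory * (now - t_prev)
              t_prev = now
              if kind == "demand":
                  demands += 1
                  if inventory > 0:
                      inventory -= 1            # consume a container, freeing its kanban card
                      served += 1
                  heapq.heappush(events, (now + rng.expovariate(demand_rate), "demand"))
              else:                             # production finished, container joins the store
                  inventory += 1
                  producing = False
              if not producing and inventory < n_kanbans:
                  producing = True              # a free kanban authorises the next production order
                  heapq.heappush(events, (now + rng.expovariate(prod_rate), "produced"))
          return served / max(demands, 1), inv_area / max(t_prev, 1e-9)

      for k in (2, 4, 8):
          fill, avg_inv = simulate_kanban(k)
          print(f"{k} kanbans: fill rate {fill:.3f}, average inventory {avg_inv:.2f}")

    Sweeping the card count in this way is the sort of experiment the abstract suggests the simulation model makes cheap to run.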

  7. Modelling, simulation and validation of the industrial robot

    Directory of Open Access Journals (Sweden)

    Aleksandrov Slobodan Č.

    2014-01-01

    In this paper, a DH model of an industrial robot with anthropomorphic configuration and five degrees of freedom - the Mitsubishi RV2AJ - is developed. The model is verified on the example of the Mitsubishi RV2AJ robot. The paper presents the complete mathematical model of the robot and the programming parameters in detail. On the basis of this model, simulation of robot motion from point to point is performed, as well as continuous movement along a pre-defined path. Programming of the industrial robot identical to the simulation programs is also carried out, and a comparative analysis of the real and simulated experiments is shown. In the final section, a detailed analysis of the robot motion is given.

  8. Fracture Network Modeling and GoldSim Simulation Support

    OpenAIRE

    杉田 健一郎; Dershowitz, W.

    2003-01-01

    During Heisei-14, Golder Associates provided support for JNC Tokai through data analysis and simulation of the MIU Underground Rock Laboratory, participation in Task 6 of the Aspo Task Force on Modelling of Groundwater Flow and Transport, and analysis of repository safety assessment technologies including cell networks for evaluation of the disturbed rock zone (DRZ) and total systems performance assessment (TSPA).

  9. Electromagnetic simulations of simple models of ferrite loaded kickers

    CERN Document Server

    Zannini, Carlo; Salvant, B; Metral, E; Rumolo, G

    2010-01-01

    The kickers are major contributors to the CERN SPS beam coupling impedance. As such, they may represent a limitation to increasing the SPS bunch current in the frame of an intensity upgrade of the LHC. In this paper, CST Particle Studio time domain electromagnetic simulations are performed to obtain the longitudinal and transverse impedances/wake potentials of simplified models of ferrite loaded kickers. The simulation results have been successfully compared with some existing analytical expressions. In the transverse plane, the dipolar and quadrupolar contributions to the wake potentials have been estimated from the results of these simulations. For some cases, simulations have also been benchmarked against measurements on PS kickers. It turns out that the large simulated quadrupolar contributions of these kickers could explain both the negative total (dipolar+quadrupolar) horizontal impedance observed in bench measurements and the positive horizontal tune shift measured with the SPS beam.

  10. Generic simplified simulation model for DFIG with active crowbar

    Energy Technology Data Exchange (ETDEWEB)

    Buendia, Francisco Jimenez [Gamesa Innovation and Technology, Sarriguren, Navarra (Spain). Technology Dept.; Barrasa Gordo, Borja [Assystem Iberia, Bilbao, Vizcaya (Spain)

    2012-07-01

    Simplified models for transient stability studies are a general requirement of transmission system operators towards wind turbine (WTG) manufacturers. Those models must represent the performance of the WTGs in transient stability studies, mainly voltage dips caused by short circuits in the electrical network. The models are implemented in simulation software such as PSS/E, DigSilent or PSLF. These software platforms allow simulation of transients in large electrical networks with thousands of buses, generators and loads. The high complexity of the grid requires that the models inserted into it be simplified so that the simulations can be executed as fast as possible. Developing a model which is simple enough to be integrated into those complex grids and still represents the performance of the WTG is a challenge. The IEC TC88 working group has developed generic models for different types of generators, among others for WTGs using doubly fed induction generators (DFIG). This paper focuses on an extension of the DFIG WTG models developed in IEC in order to represent a simplified model of a DFIG with an active crowbar, which is required to withstand voltage dips without disconnecting from the grid. The paper improves the current generic Type 3 model for the DFIG by adding a simplified version of the generator including crowbar functionality and a simplified version of the crowbar firing. In addition, this simplified model is validated by correlation with voltage dip field tests from a real wind turbine. (orig.)

  11. Ravenscar Computational Model compliant AADL Simulation on LEON2

    Directory of Open Access Journals (Sweden)

    Roberto Varona-Gómez

    2013-02-01

    AADL has been proposed for designing and analyzing SW and HW architectures for real-time mission-critical embedded systems. Although the Behavioral Annex improves its simulation semantics, AADL is a language for analyzing architectures and not for simulating them. AADS-T is an AADL simulation tool that supports the performance analysis of the AADL specification throughout the refinement process from the initial system architecture until the complete, detailed application and execution platform are developed. In this way, AADS-T enables the verification of the initial timing constraints during the complete design process. In this paper we focus on the compatibility of AADS-T with the Ravenscar Computational Model (RCM) as part of the TASTE toolset. Its flexibility enables AADS-T to support different processors. In this work we have focused on performing the simulation on a LEON2 processor.

  12. Review of Methods Related to Assessing Human Performance in Nuclear Power Plant Control Room Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Katya L Le Blanc; Ronald L Boring; David I Gertman

    2001-11-01

    With the increased use of digital systems in Nuclear Power Plant (NPP) control rooms comes a need to thoroughly understand the human performance issues associated with digital systems. A common way to evaluate human performance is to test operators and crews in NPP control room simulators. However, it is often challenging to characterize human performance in meaningful ways when measuring performance in NPP control room simulations. A review of the literature in NPP simulator studies reveals a variety of ways to measure human performance in NPP control room simulations including direct observation, automated computer logging, recordings from physiological equipment, self-report techniques, protocol analysis and structured debriefs, and application of model-based evaluation. These methods and the particular measures used are summarized and evaluated.

  13. Modeling salmonella Dublin into the dairy herd simulation model Simherd

    DEFF Research Database (Denmark)

    Kudahl, Anne Braad

    2010-01-01

    Infection with Salmonella Dublin in the dairy herd and the effects of the infection and relevant control measures are currently being modeled into the dairy herd simulation model called Simherd. The aim is to compare the effects of different control strategies against Salmonella Dublin on both within-herd prevalence and economy by simulations. The project is a part of a larger national project, "Salmonella 2007 - 2011", with the main objective to reduce the prevalence of Salmonella Dublin in Danish dairy herds. Results of the simulations will therefore be used for decision support in the national surveillance and eradication program against Salmonella Dublin. Basic structures of the model are programmed and will be presented at the workshop. The model is in a phase of face-validation by a group of Salmonella ...

  14. Simulator experiments: effects of NPP operator experience on performance

    International Nuclear Information System (INIS)

    Beare, A.N.; Gray, L.H.

    1984-01-01

    During the FY83 research, a simulator experiment was conducted at the control room simulator for a GE Boiling Water Reactor (BWR) NPP. The research subjects were licensed operators undergoing requalification training and shift technical advisors (STAs). This experiment was designed to investigate the effects of senior reactor operator (SRO) experience, operating crew augmentation with an STA and practice, as a crew, upon crew and individual operator performance, in response to anticipated plant transients. Sixteen two-man crews of licensed operators were employed in a 2 x 2 factorial design. The SROs leading the crews were split into high and low experience groups on the basis of their years of experience as an SRO. One half of the high- and low-SRO experience groups were assisted by an STA. The crews responded to four simulated plant casualties. A five-variable set of content-referenced performance measures was derived from task analyses of the procedurally correct responses to the four casualties. System parameters and control manipulations were recorded by the computer controlling the simulator. Data on communications and procedure use were obtained from analysis of videotapes of the exercises. Questionnaires were used to collect subject biographical information and data on subjective workload during each simulated casualty. For four of the five performance measures, no significant differences were found between groups led by high (25 to 114 months) and low (1 to 17 months as an SRO) experience SROs. However, crews led by low experience SROs tended to have significantly shorter task performance times than crews led by high experience SROs. The presence of the STA had no significant effect on overall team performance in responding to the four simulated casualties. The FY84 experiments are a partial replication and extension of the FY83 experiment, but with PWR operators and simulator

  15. Characterization uncertainty and its effects on models and performance

    International Nuclear Information System (INIS)

    Rautman, C.A.; Treadway, A.H.

    1991-01-01

    Geostatistical simulation is being used to develop multiple geologic models of rock properties at the proposed Yucca Mountain repository site. Because each replicate model contains the same known information, and is thus essentially indistinguishable statistically from others, the differences between models may be thought of as representing the uncertainty in the site description. The variability among performance measures, such as ground water travel time, calculated using these replicate models therefore quantifies the uncertainty in performance that arises from uncertainty in site characterization
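
    A toy sketch of the idea: several statistically equivalent realisations of a spatially correlated property (here a 1-D log-conductivity profile with an exponential covariance, generated by Cholesky factorisation) are produced, and the spread of a crude travel-time proxy computed from them quantifies how characterisation uncertainty propagates into a performance measure. None of the numbers, grid choices or the 1-D simplification relate to the Yucca Mountain models.

      import numpy as np

      rng = np.random.default_rng(7)
      n_cells, dx, corr_len = 200, 5.0, 50.0             # 1-D column, metres
      x = np.arange(n_cells) * dx
      cov = np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)   # exponential covariance
      L = np.linalg.cholesky(cov + 1e-10 * np.eye(n_cells))

      travel_times = []
      for _ in range(50):                                # 50 statistically equivalent realisations
          log_k = -5.0 + 1.0 * (L @ rng.standard_normal(n_cells))  # log10 hydraulic conductivity
          k = 10.0 ** log_k                              # m/s
          velocity = k * 0.01 / 0.3                      # Darcy velocity / porosity, unit-gradient assumption
          travel_times.append(np.sum(dx / velocity))     # crude 1-D advective travel time
      travel_times = np.array(travel_times) / 3.15e7     # seconds -> years

      print(f"travel time: median {np.median(travel_times):.0f} yr, "
            f"5-95% range {np.percentile(travel_times, 5):.0f}-{np.percentile(travel_times, 95):.0f} yr")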

  16. Digital Quantum Simulation of Spin Models with Circuit Quantum Electrodynamics

    Directory of Open Access Journals (Sweden)

    Y. Salathé

    2015-06-01

    Systems of interacting quantum spins show a rich spectrum of quantum phases and display interesting many-body dynamics. Computing characteristics of even small systems on conventional computers poses significant challenges. A quantum simulator has the potential to outperform standard computers in calculating the evolution of complex quantum systems. Here, we perform a digital quantum simulation of the paradigmatic Heisenberg and Ising interacting spin models using a two transmon-qubit circuit quantum electrodynamics setup. We make use of the exchange interaction naturally present in the simulator to construct a digital decomposition of the model-specific evolution and extract its full dynamics. This approach is universal and efficient, employing only resources that are polynomial in the number of spins, and indicates a path towards the controlled simulation of general spin dynamics in superconducting qubit platforms.

  17. Wave and Wind Model Performance Metrics Tools

    Science.gov (United States)

    Choi, J. K.; Wang, D. W.

    2016-02-01

    Continual improvements and upgrades of Navy ocean wave and wind models are essential to the assurance of battlespace environment predictability of ocean surface wave and surf conditions in support of Naval global operations. Thus, constant verification and validation of model performance is equally essential to assure the progress of model developments and maintain confidence in the predictions. Global and regional scale model evaluations may require large areas and long periods of time. For observational data to compare against, altimeter winds and waves along the tracks from past and current operational satellites, as well as moored/drifting buoys, can be used for global and regional coverage. Using data and model runs from previous trials such as the planned experiment, the Dynamics of the Adriatic in Real Time (DART), we demonstrated the use of accumulated altimeter wind and wave data over several years to obtain an objective evaluation of the performance of the SWAN (Simulating Waves Nearshore) model running in the Adriatic Sea. The assessment provided a detailed picture of wind and wave model performance by using maps of cell-averaged statistical variables, with spatial statistics including slope, correlation, and scatter index to summarize model performance. Such a methodology is easily generalized to other regions and to global scales. Operational technology currently used by subject matter experts evaluating the Navy Coastal Ocean Model and the Hybrid Coordinate Ocean Model can be expanded to evaluate wave and wind models using tools developed for ArcMAP, a GIS application developed by ESRI. Recent inclusion of altimeter and buoy data into a common format through the Naval Oceanographic Office's (NAVOCEANO) quality control system and the netCDF standards applicable to all model output makes it possible to fuse these data and perform direct model verification. Also, procedures were developed for the accumulation of match-ups of modelled and observed parameters to form a data base
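
    A short sketch of the point-wise statistics named above (bias, RMSE, correlation, scatter index and regression slope) for collocated model and altimeter/buoy values; the sample arrays are placeholders, and the cell-averaged mapping in ArcMAP is not reproduced.

      import numpy as np

      def performance_metrics(model, obs):
          """Standard wave/wind verification statistics for collocated model-observation pairs."""
          model, obs = np.asarray(model, float), np.asarray(obs, float)
          bias = np.mean(model - obs)
          rmse = np.sqrt(np.mean((model - obs) ** 2))
          corr = np.corrcoef(model, obs)[0, 1]
          # Scatter index: bias-removed RMSE normalised by the observed mean
          si = np.sqrt(np.mean(((model - model.mean()) - (obs - obs.mean())) ** 2)) / obs.mean()
          slope = np.sum(model * obs) / np.sum(obs ** 2)   # regression through the origin
          return dict(bias=bias, rmse=rmse, corr=corr, si=si, slope=slope)

      # Placeholder collocations of significant wave height [m]
      obs = np.array([1.2, 0.8, 2.5, 3.1, 1.9, 2.2])
      model = np.array([1.1, 0.9, 2.9, 3.4, 1.7, 2.0])
      print(performance_metrics(model, obs))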

  18. 18th and 19th Workshop on Sustained Simulation Performance

    CERN Document Server

    Bez, Wolfgang; Focht, Erich; Kobayashi, Hiroaki; Patel, Nisarg

    2015-01-01

    This book presents the state of the art in high-performance computing and simulation on modern supercomputer architectures. It covers trends in hardware and software development in general and the future of high-performance systems and heterogeneous architectures in particular. The application-related contributions cover computational fluid dynamics, material science, medical applications and climate research; innovative fields such as coupled multi-physics and multi-scale simulations are highlighted. All papers were chosen from presentations given at the 18th Workshop on Sustained Simulation Performance held at the HLRS, University of Stuttgart, Germany in October 2013 and subsequent Workshop of the same name held at Tohoku University in March 2014.  

  19. A universal simulator for ecological models

    DEFF Research Database (Denmark)

    Holst, Niels

    2013-01-01

    Software design is an often neglected issue in ecological models, even though bad software design often becomes a hindrance for re-using, sharing and even grasping an ecological model. In this paper, the methodology of agile software design was applied to the domain of ecological models. Thus the principles for a universal design of ecological models were arrived at. To exemplify this design, the open-source software Universal Simulator was constructed using C++ and XML and is provided as a resource for inspiration.

  20. Biological transportation networks: Modeling and simulation

    KAUST Repository

    Albi, Giacomo

    2015-09-15

    We present a model for biological network formation originally introduced by Cai and Hu [Adaptation and optimization of biological transport networks, Phys. Rev. Lett. 111 (2013) 138701]. The modeling of fluid transportation (e.g., leaf venation and angiogenesis) and ion transportation networks (e.g., neural networks) is explained in detail and basic analytical features like the gradient flow structure of the fluid transportation network model and the impact of the model parameters on the geometry and topology of network formation are analyzed. We also present a numerical finite-element based discretization scheme and discuss sample cases of network formation simulations.

  1. Reproducibility in Computational Neuroscience Models and Simulations

    Science.gov (United States)

    McDougal, Robert A.; Bulanova, Anna S.; Lytton, William W.

    2016-01-01

    Objective: Like all scientific research, computational neuroscience research must be reproducible. Big data science, including simulation research, cannot depend exclusively on journal articles as the method to provide the sharing and transparency required for reproducibility. Methods: Ensuring model reproducibility requires the use of multiple standard software practices and tools, including version control, strong commenting and documentation, and code modularity. Results: Building on these standard practices, model sharing sites and tools have been developed that fit into several categories: 1. standardized neural simulators, 2. shared computational resources, 3. declarative model descriptors, ontologies and standardized annotations; 4. model sharing repositories and sharing standards. Conclusion: A number of complementary innovations have been proposed to enhance sharing, transparency and reproducibility. The individual user can be encouraged to make use of version control, commenting, documentation and modularity in development of models. The community can help by requiring model sharing as a condition of publication and funding. Significance: Model management will become increasingly important as multiscale models become larger, more detailed and correspondingly more difficult to manage by any single investigator or single laboratory. Additional big data management complexity will come as the models become more useful in interpreting experiments, thus increasing the need to ensure clear alignment between modeling data, both parameters and results, and experiment. PMID:27046845

  2. A SIMULATION MODEL OF THE GAS COMPLEX

    Directory of Open Access Journals (Sweden)

    Sokolova G. E.

    2016-06-01

    The article considers the dynamics of gas production in Russia, the structure of sales in the different market segments, and the comparative dynamics of selling prices in these segments. It addresses the problem of designing a gas complex with the help of a simulation model that makes it possible to estimate the efficiency of the project and to determine the stability region of the obtained solutions. The presented model takes into account repayment of the loan, making it possible to determine from the first year of simulation whether the loan can be repaid. The modeled object is a group of gas fields, for which the minimum flow rate above which the project is cost-effective is determined. In determining the minimum flow rate, the discount rate is taken as a weighted average of the cost of debt and equity, taking risk premiums into account. It also serves as the lower barrier for the internal rate of return, below which the project is rejected as ineffective. Analysis of the dynamics and expert evaluation methods allow the intervals of variation of the simulated parameters to be determined, such as the gas price and the time for the gas complex to reach its projected capacity. Values of the simulated parameters computed with the Monte Carlo method for each random realization of the model make it possible to obtain the optimal minimum well flow rate for each realization, and also to determine the stability region of the solution.
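
    A hedged sketch of the Monte Carlo step described above: gas price and realised capacity are drawn at random, a discounted cash flow is evaluated against a weighted-average cost of capital, and candidate minimum flow rates are scored by the share of draws in which the project stays viable. Every number, rate and cost formula here is invented for illustration and does not come from the article.

      import numpy as np

      RNG = np.random.default_rng(42)
      WACC = 0.14                  # discount rate: weighted cost of debt and equity plus risk premium
      YEARS = np.arange(1, 16)

      def npv_one_draw(flow_rate):
          """Discounted cash flow (thousand $) for one random realisation of price and capacity."""
          price = RNG.uniform(60.0, 120.0)            # gas price, $ per 1000 m^3
          capacity = RNG.uniform(0.85, 1.0)           # fraction of design output actually reached
          revenue = flow_rate * capacity * price * 0.365   # flow in 1000 m^3/day -> annual revenue
          opex = 0.35 * revenue
          capex = 8000.0 + 12.0 * flow_rate           # includes the debt to be repaid
          cash = (revenue - opex) / (1.0 + WACC) ** YEARS
          return cash.sum() - capex

      def viability(flow_rate, n=5000):
          """Share of Monte Carlo draws in which the project has positive NPV."""
          return np.mean([npv_one_draw(flow_rate) > 0.0 for _ in range(n)])

      for q in (50, 100, 150, 200):                   # candidate minimum flow rates
          print(q, round(viability(q), 3))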

  3. Object Oriented Modelling and Dynamical Simulation

    DEFF Research Database (Denmark)

    Wagner, Falko Jens; Poulsen, Mikael Zebbelin

    1998-01-01

    This report, with appendix, describes the work done in a master project at DTU. The goal of the project was to develop a concept for simulation of dynamical systems based on object-oriented methods. The result was a library of C++ classes, for use both when building component-based models and when...

  4. Advanced feeder control using fast simulation models

    NARCIS (Netherlands)

    Verheijen, O.S.; Op den Camp, O.M.G.C.; Beerkens, R.G.C.; Backx, A.C.P.M.; Huisman, L.; Drummond, C.H.

    2005-01-01

    For the automatic control of glass quality in glass production, the relation between process variable and product or glass quality and process conditions/process input parameters must be known in detail. So far, detailed 3-D glass melting simulation models were used to predict the effect of process

  5. Modeling and Simulating Virtual Anatomical Humans

    NARCIS (Netherlands)

    Madehkhaksar, Forough; Luo, Zhiping; Pronost, Nicolas; Egges, Arjan

    2014-01-01

    This chapter presents human musculoskeletal modeling and simulation as a challenging field that lies between biomechanics and computer animation. One of the main goals of computer animation research is to develop algorithms and systems that produce plausible motion. On the other hand, the main

  6. Agent Based Modelling for Social Simulation

    NARCIS (Netherlands)

    Smit, S.K.; Ubink, E.M.; Vecht, B. van der; Langley, D.J.

    2013-01-01

    This document is the result of an exploratory project looking into the status of, and opportunities for Agent Based Modelling (ABM) at TNO. The project focussed on ABM applications containing social interactions and human factors, which we termed ABM for social simulation (ABM4SS). During the course

  7. 3D numerical simulation on heat transfer performance of a cylindrical liquid immersion solar receiver

    International Nuclear Information System (INIS)

    Xiang Haijun; Wang Yiping; Zhu Li; Han Xinyue; Sun Yong; Zhao Zhengjian

    2012-01-01

    Highlights: ► Establishment of a three-dimensional numerical simulation model of a cylindrical liquid immersion solar receiver. ► Determination of model parameters and validation of the model using real collected data. ► Optimization of liquid flow rate and fin structure for better heat transfer performance. - Abstract: Liquid immersion cooling for a cylindrical solar receiver in a dish concentrator photovoltaic system has been experimentally verified to be a promising method of removing surplus heat from densely packed solar cells. In the present study, a three-dimensional (3D) numerical simulation model of the prototype was established for better understanding of the mechanism of the direct-contact heat transfer process. With the selection of the standard k–ε turbulence model, detailed simulation results for the velocity field and temperature characteristics were obtained. The heat transfer performance of two structural modules (a bare module and a finned module) under actual weather conditions was simulated. It was found that the predicted temperature distribution of the two structural modules in the axial and lateral directions was in good agreement with the experimental data. Based on the validated simulation model, the influence of liquid flow rate and module geometric parameters on the cell temperature was then investigated. The simulated results indicated that the cell module with a fin height of 4 mm and a fin count of 11 has the best heat transfer performance and will be used in further work.

  8. Thermohydraulic modeling and simulation of breeder reactors

    International Nuclear Information System (INIS)

    Agrawal, A.K.; Khatib-Rahbar, M.; Curtis, R.T.; Hetrick, D.L.; Girijashankar, P.V.

    1982-01-01

    This paper deals with the modeling and simulation of system-wide transients in LMFBRs. Unprotected events (i.e., the presumption of failure of the plant protection system) leading to core-melt are not considered in this paper. The existing computational capabilities in the area of protected transients in the US are noted. Various physical and numerical approximations that are made in these codes are discussed. Finally, the future direction in the area of model verification and improvements is discussed

  9. Optical Imaging and Radiometric Modeling and Simulation

    Science.gov (United States)

    Ha, Kong Q.; Fitzmaurice, Michael W.; Moiser, Gary E.; Howard, Joseph M.; Le, Chi M.

    2010-01-01

    OPTOOL software is a general-purpose optical systems analysis tool that was developed to offer a solution to problems associated with computational programs written for the James Webb Space Telescope optical system. It integrates existing routines into coherent processes, and provides a structure with reusable capabilities that allow additional processes to be quickly developed and integrated. It has an extensive graphical user interface, which makes the tool more intuitive and friendly. OPTOOL is implemented using MATLAB with a Fourier optics-based approach for point spread function (PSF) calculations. It features parametric and Monte Carlo simulation capabilities, and uses a direct integration calculation to permit high spatial sampling of the PSF. Exit pupil optical path difference (OPD) maps can be generated using combinations of Zernike polynomials or shaped power spectral densities. The graphical user interface allows rapid creation of arbitrary pupil geometries, and entry of all other modeling parameters to support basic imaging and radiometric analyses. OPTOOL provides the capability to generate wavefront-error (WFE) maps for arbitrary grid sizes. These maps are 2D arrays containing digital sampled versions of functions ranging from Zernike polynomials to combination of sinusoidal wave functions in 2D, to functions generated from a spatial frequency power spectral distribution (PSD). It also can generate optical transfer functions (OTFs), which are incorporated into the PSF calculation. The user can specify radiometrics for the target and sky background, and key performance parameters for the instrument's focal plane array (FPA). This radiometric and detector model setup is fairly extensive, and includes parameters such as zodiacal background, thermal emission noise, read noise, and dark current. The setup also includes target spectral energy distribution as a function of wavelength for polychromatic sources, detector pixel size, and the FPA's charge
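
    A condensed sketch of the Fourier-optics PSF step such a tool is built around: a sampled pupil is combined with a wavefront-error map and the far-field PSF is obtained from the squared modulus of the Fourier transform of the complex pupil. Grid size, wavelength and the defocus-like OPD are illustrative assumptions, and this is not OPTOOL code.

      import numpy as np

      N = 256                                            # pupil grid samples
      wavelength = 2.0e-6                                # m, illustrative
      y, x = np.mgrid[-1:1:N * 1j, -1:1:N * 1j]
      r2 = x**2 + y**2
      pupil = (r2 <= 1.0).astype(float)                  # unobscured circular aperture

      opd = 100e-9 * (2.0 * r2 - 1.0) * pupil            # defocus-like OPD map, 100 nm coefficient
      field = pupil * np.exp(2j * np.pi * opd / wavelength)

      def psf_from(complex_pupil, pad=4):
          """Far-field PSF = |FFT of the zero-padded complex pupil|^2, normalised to unit energy."""
          n = complex_pupil.shape[0]
          padded = np.zeros((pad * n, pad * n), dtype=complex)
          padded[:n, :n] = complex_pupil
          p = np.abs(np.fft.fftshift(np.fft.fft2(padded))) ** 2
          return p / p.sum()

      psf = psf_from(field)
      strehl = psf.max() / psf_from(pupil.astype(complex)).max()
      print(f"Strehl ratio of the aberrated system: {strehl:.3f}")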

  10. ATES/heat pump simulations performed with ATESSS code

    Science.gov (United States)

    Vail, L. W.

    1989-01-01

    Modifications to the Aquifer Thermal Energy Storage System Simulator (ATESSS) allow simulation of aquifer thermal energy storage (ATES)/heat pump systems. The heat pump algorithm requires a coefficient of performance (COP) relationship of the form COP = COP_base + α·(T_ref − T_base). Initial applications of the modified ATESSS code to synthetic building load data for two sizes of buildings in two U.S. cities showed an insignificant performance advantage of a series ATES heat pump system over a conventional groundwater heat pump system. The addition of algorithms for a cooling tower and solar array improved performance slightly. Small values of α in the COP relationship are the principal reason for the limited improvement in system performance. Future studies at Pacific Northwest Laboratory (PNL) are planned to investigate methods to increase system performance using alternative system configurations and operations scenarios.

  11. Off gas condenser performance modelling

    International Nuclear Information System (INIS)

    Cains, P.W.; Hills, K.M.; Waring, S.; Pratchett, A.G.

    1989-12-01

    A suite of three programmes has been developed to model the ruthenium decontamination performance of a vitrification plant off-gas condenser. The stages of the model are: condensation of water vapour, NOx absorption in the condensate, and RuO4 absorption in the condensate. Juxtaposition of these stages gives a package that may be run on an IBM-compatible desktop PC. Experimental work indicates that the criterion [HNO2] > 10·[RuO4], used to determine RuO4 destruction in solution, is probably realistic under condenser conditions. Vapour pressures of RuO4 over aqueous solutions at 70-90 °C are slightly lower than the values given by extrapolating the ln K_p vs. 1/T relation derived from lower-temperature data. (author)

  12. Data harmonization and model performance

    Science.gov (United States)

    The Joint Committee on Urban Storm Drainage of the International Association for Hydraulic Research (IAHR) and International Association on Water Pollution Research and Control (IAWPRC) was formed in 1982. The current committee members are (no more than two from a country): B. C. Yen, Chairman (USA); P. Harremoes, Vice Chairman (Denmark); R. K. Price, Secretary (UK); P. J. Colyer (UK), M. Desbordes (France), W. C. Huber (USA), K. Krauth (FRG), A. Sjoberg (Sweden), and T. Sueishi (Japan).The IAHR/IAWPRC Joint Committee is forming a Task Group on Data Harmonization and Model Performance. One objective is to promote international urban drainage data harmonization for easy data and information exchange. Another objective is to publicize available models and data internationally. Comments and suggestions concerning the formation and charge of the Task Group are welcome and should be sent to: B. C. Yen, Dept. of Civil Engineering, Univ. of Illinois, 208 N. Romine St., Urbana, IL 61801.

  13. Modeling Supermassive Black Holes in Cosmological Simulations

    Science.gov (United States)

    Tremmel, Michael

    My thesis work has focused on improving the implementation of supermassive black hole (SMBH) physics in cosmological hydrodynamic simulations. SMBHs are ubiquitous in massive galaxies, as well as bulge-less galaxies and dwarfs, and are thought to be a critical component to massive galaxy evolution. Still, much is unknown about how SMBHs form, grow, and affect their host galaxies. Cosmological simulations are an invaluable tool for understanding the formation of galaxies, self-consistently tracking their evolution with realistic merger and gas accretion histories. SMBHs are often modeled in these simulations (generally as a necessity to produce realistic massive galaxies), but their implementations are commonly simplified in ways that can limit what can be learned. Current and future observations are opening new windows into the lifecycle of SMBHs and their host galaxies, but require more detailed, physically motivated simulations. Within the novel framework I have developed, SMBHs 1) are seeded at early times without a priori assumptions of galaxy occupation, 2) grow in a way that accounts for the angular momentum of gas, and 3) experience realistic orbital evolution. I show how this model, properly tuned with a novel parameter optimization technique, results in realistic galaxies and SMBHs. Utilizing the unique ability of these simulations to capture the dynamical evolution of SMBHs, I present the first self-consistent prediction for the formation timescales of close SMBH pairs, precursors to SMBH binaries and merger events potentially detected by future gravitational wave experiments.

  14. Dual Arm Work Package performance estimates and telerobot task network simulation

    International Nuclear Information System (INIS)

    Draper, J.V.

    1997-01-01

    This paper describes the methodology and results of a network simulation study of the Dual Arm Work Package (DAWP), to be employed for dismantling the Argonne National Laboratory CP-5 reactor. The development of the simulation model was based upon the results of a task analysis for the same system. This study was performed by the Oak Ridge National Laboratory (ORNL), in the Robotics and Process Systems Division. Funding was provided by the US Department of Energy's Office of Technology Development, Robotics Technology Development Program (RTDP). The RTDP is developing methods of computer simulation to estimate telerobotic system performance. Data were collected to provide point estimates to be used in a task network simulation model. Three skilled operators performed six repetitions of a pipe cutting task representative of typical teleoperation cutting operations

  15. Optical ensemble analysis of intraocular lens performance through a simulated clinical trial with ZEMAX.

    Science.gov (United States)

    Zhao, Huawei

    2009-01-01

    A ZEMAX model was constructed to simulate a clinical trial of intraocular lenses (IOLs) based on a clinically oriented Monte Carlo ensemble analysis using postoperative ocular parameters. The purpose of this model is to test the feasibility of streamlining and optimizing both the design process and the clinical testing of IOLs. This optical ensemble analysis (OEA) is also validated. Simulated pseudophakic eyes were generated by using the tolerancing and programming features of ZEMAX optical design software. OEA methodology was verified by demonstrating that the results of clinical performance simulations were consistent with previously published clinical performance data using the same types of IOLs. From these results we conclude that the OEA method can objectively simulate the potential clinical trial performance of IOLs.
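
    The ensemble idea, drawing many virtual pseudophakic eyes from distributions of postoperative ocular parameters and evaluating each one, can be illustrated outside ZEMAX with a much simpler merit function. The sketch below uses hypothetical parameter distributions and the classic SRK regression as a stand-in for the full ray trace; it only shows the Monte Carlo structure, not the published OEA model.

        # Minimal sketch of a Monte Carlo "virtual clinical trial" for an IOL design.
        # Parameter distributions, the implanted power, and the A-constant are
        # hypothetical stand-ins for the ZEMAX tolerancing model described above.
        import random
        import statistics

        N_EYES = 5000
        IOL_POWER = 21.0      # implanted power in dioptres (hypothetical)
        A_CONSTANT = 118.4    # illustrative SRK A-constant

        def virtual_eye():
            """Sample postoperative biometry for one simulated pseudophakic eye."""
            axial_length = random.gauss(23.5, 0.9)    # mm
            corneal_power = random.gauss(43.5, 1.4)   # D
            return axial_length, corneal_power

        def residual_refraction(axial_length, corneal_power):
            """SRK regression: power needed for emmetropia, minus what was implanted."""
            required = A_CONSTANT - 2.5 * axial_length - 0.9 * corneal_power
            return (required - IOL_POWER) / 1.5       # crude IOL-to-spectacle plane scaling

        errors = [residual_refraction(*virtual_eye()) for _ in range(N_EYES)]
        print(f"mean residual refraction : {statistics.mean(errors):+.2f} D")
        print(f"std. deviation           : {statistics.stdev(errors):.2f} D")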

  16. Advances in NLTE Modeling for Integrated Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Scott, H A; Hansen, S B

    2009-07-08

    The last few years have seen significant progress in constructing the atomic models required for non-local thermodynamic equilibrium (NLTE) simulations. Along with this has come an increased understanding of the requirements for accurately modeling the ionization balance, energy content and radiative properties of different elements for a wide range of densities and temperatures. Much of this progress is the result of a series of workshops dedicated to comparing the results from different codes and computational approaches applied to a series of test problems. The results of these workshops emphasized the importance of atomic model completeness, especially of doubly excited states and autoionization transitions, for calculating the ionization balance, and the importance of accurate, detailed atomic data for producing reliable spectra. We describe a simple screened-hydrogenic model that calculates NLTE ionization balance with surprising accuracy, at a low enough computational cost for routine use in radiation-hydrodynamics codes. The model incorporates term splitting, Δn = 0 transitions, and approximate UTA widths for spectral calculations, with results comparable to those of much more detailed codes. Simulations done with this model have been increasingly successful at matching experimental data for laser-driven systems and hohlraums. Accurate and efficient atomic models are just one requirement for integrated NLTE simulations. Coupling the atomic kinetics to hydrodynamics and radiation transport constrains both discretizations and algorithms to retain energy conservation, accuracy and stability. In particular, the strong coupling between radiation and populations can require either very short timesteps or significantly modified radiation transport algorithms to account for the NLTE material response. Considerations such as these continue to provide challenges for NLTE simulations.
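
    As a rough illustration of the screened-hydrogenic idea, the energy of a shell with principal quantum number n can be estimated from an effective charge obtained by subtracting a screening contribution from the other occupied shells. The sketch below uses a deliberately crude screening rule and ignores term splitting, Δn = 0 transitions, and UTA widths; it shows only the skeleton of such a model and does not reproduce the code described above.

        # Crude screened-hydrogenic level energies (illustrative only).
        # Real NLTE models use n- and l-dependent screening constants fitted to data.
        RYDBERG_EV = 13.606

        def screened_energy(z_nuclear, occupations, n):
            """Energy (eV, negative = bound) of shell n given shell occupations.

            occupations: dict {shell m: electrons in shell m}; inner and same-shell
            electrons screen the nuclear charge seen by shell n.
            """
            screening = 0.0
            for m, pop in occupations.items():
                if m < n:
                    screening += pop                       # inner electrons screen fully (crude)
                elif m == n:
                    screening += 0.5 * max(pop - 1, 0)     # partial self-screening
            q_eff = max(z_nuclear - screening, 0.0)
            return -RYDBERG_EV * (q_eff / n) ** 2

        # Example: Be-like carbon (1s2 2s2)
        occ = {1: 2, 2: 2}
        for n in (1, 2, 3):
            print(f"n={n}: E = {screened_energy(6, occ, n):8.2f} eV")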

  17. Stochastic Processes and Queueing Theory used in Cloud Computer Performance Simulations

    Directory of Open Access Journals (Sweden)

    Florin-Catalin ENACHE

    2015-10-01

    The cloud business has grown exponentially over the last five years. Capacity managers need a practical way to simulate the random demands a cloud infrastructure could face, even though there are not many mathematical tools for simulating such demands. This paper presents an introduction to the most important stochastic processes and queueing theory concepts used for modeling computer performance. It also shows where such concepts are applicable and where they are not, using clear programming examples of how to simulate a queue, and how to use and validate a simulation when there are no mathematical concepts to back it up.
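
    A typical instance of the programming examples the paper refers to is a discrete-event simulation of a single-server M/M/1 queue, which can be validated against the analytical mean time in system, W = 1/(mu - lambda). The sketch below is a generic illustration of that exercise, not code taken from the paper.

        # Discrete-event simulation of an M/M/1 queue, validated against W = 1/(mu - lambda).
        import random

        def simulate_mm1(arrival_rate, service_rate, n_customers=100_000):
            """Mean time a customer spends in the system (waiting plus service)."""
            clock = 0.0               # arrival time of the current customer
            server_free_at = 0.0      # time the server finishes the previous customer
            total_time_in_system = 0.0
            for _ in range(n_customers):
                clock += random.expovariate(arrival_rate)   # next arrival
                start_service = max(clock, server_free_at)
                service_time = random.expovariate(service_rate)
                server_free_at = start_service + service_time
                total_time_in_system += server_free_at - clock
            return total_time_in_system / n_customers

        lam, mu = 8.0, 10.0
        print(f"simulated mean time in system : {simulate_mm1(lam, mu):.3f}")
        print(f"analytical 1/(mu - lambda)    : {1.0 / (mu - lam):.3f}")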

  18. Performance evaluation by simulation and analysis with applications to computer networks

    CERN Document Server

    Chen, Ken

    2015-01-01

    This book is devoted to the most widely used methodologies for performance evaluation: simulation using specialized software and mathematical modeling. An important part is dedicated to simulation, particularly its theoretical framework and the precautions to be taken in implementing the experimental procedure. These principles are illustrated by concrete examples achieved through operational simulation languages (OMNeT++, OPNET). Presented as a complementary approach, the mathematical method is essential to the simulation. Both methodologies are based largely on the theory of

  19. Particle tracking in sophisticated CAD models for simulation purposes

    International Nuclear Information System (INIS)

    Sulkimo, J.; Vuoskoski, J.

    1995-01-01

    The transfer of physics detector models from computer aided design systems to physics simulation packages like GEANT suffers from certain limitations. In addition, GEANT is not able to perform particle tracking in CAD models. We describe an application which is able to perform particle tracking in boundary models constructed in CAD systems. The transfer file format used is the new international standard, STEP. The design and implementation of the application was carried out using object-oriented techniques. It will be integrated in the future object-oriented version of GEANT. (orig.)
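
    Tracking in a boundary representation reduces, at its core, to repeatedly finding the nearest intersection of the particle's current ray with the model's surface facets. The sketch below shows that geometric step for a triangulated boundary using the Moller-Trumbore ray-triangle test; the actual application works on STEP boundary models and is integrated with GEANT, which this illustration does not attempt to reproduce.

        # Core step of particle tracking in a triangulated boundary model:
        # find the nearest ray-triangle intersection (Moller-Trumbore algorithm).
        import numpy as np

        def ray_triangle(origin, direction, v0, v1, v2, eps=1e-9):
            """Distance along the ray to the triangle, or None if it misses."""
            e1, e2 = v1 - v0, v2 - v0
            p = np.cross(direction, e2)
            det = np.dot(e1, p)
            if abs(det) < eps:                  # ray parallel to the triangle plane
                return None
            inv_det = 1.0 / det
            s = origin - v0
            u = np.dot(s, p) * inv_det
            if u < 0.0 or u > 1.0:
                return None
            q = np.cross(s, e1)
            v = np.dot(direction, q) * inv_det
            if v < 0.0 or u + v > 1.0:
                return None
            t = np.dot(e2, q) * inv_det
            return t if t > eps else None

        def next_boundary_crossing(origin, direction, triangles):
            """Nearest surface hit along a straight particle step, or None."""
            hits = (ray_triangle(origin, direction, *tri) for tri in triangles)
            hits = [t for t in hits if t is not None]
            return min(hits) if hits else None

        # Example: one triangle in the x-y plane, particle moving along -z
        tri = (np.array([0., 0., 0.]), np.array([1., 0., 0.]), np.array([0., 1., 0.]))
        print(next_boundary_crossing(np.array([0.2, 0.2, 5.0]), np.array([0., 0., -1.]), [tri]))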

  20. Particle tracking in sophisticated CAD models for simulation purposes

    Science.gov (United States)

    Sulkimo, J.; Vuoskoski, J.

    1996-02-01

    The transfer of physics detector models from computer aided design systems to physics simulation packages like GEANT suffers from certain limitations. In addition, GEANT is not able to perform particle tracking in CAD models. We describe an application which is able to perform particle tracking in boundary models constructed in CAD systems. The transfer file format used is the new international standard, STEP. The design and implementation of the application was carried out using object-oriented techniques. It will be integrated in the future object-oriented version of GEANT.