Sample records for simulation modelling methodology

  1. Simulation and Modeling Methodologies, Technologies and Applications

    CERN Document Server

    Filipe, Joaquim; Kacprzyk, Janusz; Pina, Nuno


    This book includes extended and revised versions of a set of selected papers from the 2012 International Conference on Simulation and Modeling Methodologies, Technologies and Applications (SIMULTECH 2012) which was sponsored by the Institute for Systems and Technologies of Information, Control and Communication (INSTICC) and held in Rome, Italy. SIMULTECH 2012 was technically co-sponsored by the Society for Modeling & Simulation International (SCS), GDR I3, Liophant Simulation, Simulation Team and IFIP and held in cooperation with AIS Special Interest Group of Modeling and Simulation (AIS SIGMAS) and the Movimento Italiano Modellazione e Simulazione (MIMOS).

  2. The Conical Methodology: A Framework for Simulation Model Development


    Richard E. Nance


    The Conical Methodology, intended for large discrete event simulation modeling, is reviewed from two perspectives. The designer perspective begins with the question: What is a methodology? From the answer to that question, an inquiry is framed based on an objective/principles/attributes linkage that has proved useful in evaluating software development methodologies. The user perspective addresses the role of a methodology vis-à-vis the software utilities (the tools) that comprise the environme...

  3. Methodology for characterizing modeling and discretization uncertainties in computational simulation

    Energy Technology Data Exchange (ETDEWEB)



    This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed towards developing methodologies which treat model form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for sources of uncertainty and error in the modeling and simulation process which impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.
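
    The Bayesian treatment of model-parameter uncertainty described above can be illustrated with a minimal, self-contained sketch (the toy model y = a·x, the grid prior, and all numbers are hypothetical, not taken from this record):

```python
import math

# Hypothetical setup: a simple model y = a * x with uncertain parameter `a`.
# A discrete prior over candidate values of `a` is updated with noisy
# observations via Bayes' rule, and the posterior is propagated to a
# prediction, combining parameter (epistemic) and noise (aleatory) spread.

a_grid = [0.5 + 0.1 * i for i in range(11)]        # candidate parameter values
prior = [1.0 / len(a_grid)] * len(a_grid)          # uniform prior
sigma = 0.2                                        # assumed observation noise std

observations = [(1.0, 1.05), (2.0, 1.98), (3.0, 3.10)]  # (x, y) pairs

def gaussian_likelihood(y, mean, sd):
    return math.exp(-0.5 * ((y - mean) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

posterior = prior[:]
for x, y in observations:
    posterior = [p * gaussian_likelihood(y, a * x, sigma)
                 for p, a in zip(posterior, a_grid)]
    total = sum(posterior)
    posterior = [p / total for p in posterior]     # renormalize after each update

# Propagate the posterior to a prediction at x_new: parameter-induced variance
# plus observation-noise variance gives a total prediction uncertainty.
x_new = 4.0
mean_pred = sum(p * a * x_new for p, a in zip(posterior, a_grid))
var_param = sum(p * (a * x_new - mean_pred) ** 2 for p, a in zip(posterior, a_grid))
total_std = math.sqrt(var_param + sigma ** 2)
print(round(mean_pred, 2), round(total_std, 2))
```

    The posterior concentrates on parameter values consistent with the data, and the predictive spread is never smaller than the assumed noise floor.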

  4. Effects of simulation language and modeling methodology on simulation modeling performance

    Energy Technology Data Exchange (ETDEWEB)

    Wang, T.J.


    Research in simulation modeling has made little advance over the past two decades. Many simulation languages and modeling methodologies were designed but not evaluated. Model developers were given no criteria for selecting from among these modeling tools. A framework of research in simulation modeling was developed to identify factors that might most affect simulation modeling performance. First, two simulation languages (MAGIE and GPSS) that differ greatly in complexity were compared. Both languages are similar in their design philosophy. However, MAGIE is a small simulation language with ten model building blocks while GPSS is a large simulation language with fifty-six model building blocks. Secondly, two modeling methodologies, namely the top-down and the bottom-up approaches, were compared. This research shows that it is feasible to apply the user-based empirical research methodology to study simulation modeling. It is also concluded that modeling with a large simulation language does not necessarily yield better results than modeling with a small simulation language. Furthermore, it was found that using the top-down modeling approach does not necessarily yield better results than using the bottom-up modeling approach.

  5. Validating agent oriented methodology (AOM) for netlogo modelling and simulation (United States)

    WaiShiang, Cheah; Nissom, Shane; YeeWai, Sim; Sharbini, Hamizan


    AOM (Agent Oriented Modeling) is a comprehensive and unified agent methodology for agent oriented software development. The AOM methodology was proposed to aid developers by introducing techniques, terminology, notation and guidelines during agent systems development. Although the AOM methodology is claimed to be capable of developing complex real-world systems, its potential is yet to be realized and recognized by the mainstream software community, and the adoption of AOM is still in its infancy. Among the reasons is that there are few case studies or success stories for AOM. This paper presents two case studies on the adoption of AOM for individual-based modelling and simulation. It demonstrates how AOM is useful for epidemiology and ecology studies, and hence further validates AOM in a qualitative manner.

  6. Modelling and simulation of information systems on computer: methodological advantages. (United States)

    Huet, B; Martin, J


    Modelling and simulation of information systems by means of computer miniatures aim at two general objectives: (a) to aid the design and realization of information systems; and (b) to provide a tool that improves the dialogue between the designer and the users. An operational information system has two components bound by a dynamic relationship: an information system and a behavioural system. Thanks to the behavioural system, modelling and simulation allow the designer to integrate into the project a large proportion of the system's implicit specification. The advantages of modelling for the information system relate to: (a) The conceptual phase: initial objectives are compared with the results of simulation and sometimes modified. (b) The external specifications: simulation is particularly useful for personalising man-machine relationships in each application. (c) The internal specifications: if the miniatures are built on the concept of process, the global design and the software are tested, and the simulation also refines the configuration and directs the choice of hardware. (d) The implementation: simulation reduces costs and time and allows testing. Progress in modelling techniques will undoubtedly lead to better information systems.

  7. An improved methodology for dynamic modelling and simulation of ...

    Indian Academy of Sciences (India)

    This presents a real challenge for engineers who want to design and implement such systems with high performance, efficiency and reliability. For this purpose, engineers need a tool capable of modelling and/or simulating components of diverse nature within the ECDS. However, a majority of the available tools are limited ...

  8. A Dynamic Defense Modeling and Simulation Methodology using Semantic Web Services

    Directory of Open Access Journals (Sweden)

    Kangsun Lee


    Defense modeling and simulations require interoperable and autonomous federates in order to fully simulate the complex behavior of war-fighters and to dynamically adapt to various war-game events, commands and controls. In this paper, we propose a semantic web service based methodology to develop war-game simulations. Our methodology encapsulates war-game logic into a set of web services with additional semantic information in WSDL (Web Service Description Language) and OWL (Web Ontology Language). By utilizing the dynamic discovery and binding power of semantic web services, we are able to dynamically reconfigure federates according to various simulation events. An ASuW (Anti-Surface Warfare) simulator is constructed to demonstrate the methodology and successfully shows that the level of interoperability and autonomy can be greatly improved.

  9. 3rd International Conference on Simulation and Modeling Methodologies, Technologies and Applications

    CERN Document Server

    Koziel, Slawomir; Kacprzyk, Janusz; Leifsson, Leifur; Ören, Tuncer


    This book includes extended and revised versions of a set of selected papers from the 3rd International Conference on Simulation and Modeling Methodologies, Technologies and Applications (SIMULTECH 2013) which was co-organized by the Reykjavik University (RU) and sponsored by the Institute for Systems and Technologies of Information, Control and Communication (INSTICC). SIMULTECH 2013 was held in cooperation with the ACM SIGSIM - Special Interest Group (SIG) on Simulation and Modeling (SIM), Movimento Italiano Modellazione e Simulazione (MIMOS) and AIS Special Interest Group on Modeling and Simulation (AIS SIGMAS) and technically co-sponsored by the Society for Modeling & Simulation International (SCS), Liophant Simulation, Simulation Team and International Federation for Information Processing (IFIP). These proceedings bring together researchers, engineers, applied mathematicians and practitioners working on advances and applications in the field of system simulation.

  10. Methodologies for the modeling and simulation of biochemical networks, illustrated for signal transduction pathways: a primer. (United States)

    ElKalaawy, Nesma; Wassal, Amr


    Biochemical networks depict the chemical interactions that take place among elements of living cells. They aim to elucidate how cellular behavior and functional properties of the cell emerge from the relationships between its components, i.e. molecules. Biochemical networks are largely characterized by dynamic behavior, and exhibit high degrees of complexity. Hence, the interest in such networks is growing and they have been the target of several recent modeling efforts. Signal transduction pathways (STPs) constitute a class of biochemical networks that receive, process, and respond to stimuli from the environment, as well as stimuli that are internal to the organism. An STP consists of a chain of intracellular signaling processes that ultimately result in generating different cellular responses. This primer presents the methodologies used for the modeling and simulation of biochemical networks, illustrated for STPs. These methodologies range from qualitative to quantitative, and include structural as well as dynamic analysis techniques. We describe the different methodologies, outline their underlying assumptions, and provide an assessment of their advantages and disadvantages. Moreover, publicly and/or commercially available implementations of these methodologies are listed as appropriate. In particular, this primer aims to provide a clear introduction and comprehensive coverage of biochemical modeling and simulation methodologies for the non-expert, with specific focus on relevant literature of STPs. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  11. Simulating large-scale pedestrian movement using CA and event driven model: Methodology and case study (United States)

    Li, Jun; Fu, Siyao; He, Haibo; Jia, Hongfei; Li, Yanzhong; Guo, Yi


    Large-scale regional evacuation is an important part of national security emergency response plans. Large commercial shopping areas are typical service systems, and their emergency evacuation is a topical research subject. A systematic methodology based on Cellular Automata with a Dynamic Floor Field and an event driven model has been proposed, and the methodology has been examined within the context of a case study involving evacuation from a commercial shopping mall. In this paper, the event driven model is adopted to simulate pedestrian movement patterns, and the simulation process is divided into a normal situation and emergency evacuation. The model is composed of four layers: an environment layer, a customer layer, a clerk layer and a trajectory layer. For the simulation of pedestrian movement routes, the model takes into account the purchase intention of customers and the density of pedestrians. Based on the evacuation model combining Cellular Automata with a Dynamic Floor Field and the event driven model, we can reflect the behavioral characteristics of customers and clerks in both normal and emergency evacuation situations. The distribution of individual evacuation times as a function of initial positions and the dynamics of the evacuation process are studied. Our results indicate that the evacuation model using the combination of Cellular Automata with a Dynamic Floor Field and event driven scheduling can be used to simulate the evacuation of pedestrian flows in indoor areas with complicated surroundings and to investigate the layout of shopping malls.
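
    The core of a floor-field cellular automaton of this kind can be sketched in a few lines (illustrative only; the paper's dynamic floor field and event-driven layers are omitted, and the room geometry and occupant positions below are hypothetical). Each pedestrian hops to the free neighbouring cell with the lowest static floor field, i.e. the shortest distance to the exit:

```python
from collections import deque

W, H = 6, 4                 # room size in cells (assumed)
exit_cell = (5, 0)
walls = set()               # no internal obstacles in this toy room

# Static floor field: breadth-first-search distance from the exit.
field = {exit_cell: 0}
queue = deque([exit_cell])
while queue:
    x, y = queue.popleft()
    for c in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
        nx, ny = c
        if 0 <= nx < W and 0 <= ny < H and c not in walls and c not in field:
            field[c] = field[(x, y)] + 1
            queue.append(c)

occupied = {(0, 0), (0, 3), (2, 2)}   # initial pedestrian positions (assumed)
t = 0
while occupied:
    t += 1
    for p in sorted(occupied, key=lambda c: field[c]):  # nearest to exit moves first
        x, y = p
        options = [c for c in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1))
                   if c in field and c not in occupied and field[c] < field[p]]
        if options:
            occupied.discard(p)
            target = min(options, key=lambda c: field[c])
            if target != exit_cell:        # stepping onto the exit = leaving the room
                occupied.add(target)
print("evacuated in", t, "time steps")
```

    Sequential, distance-ordered updates guarantee that the pedestrian closest to the exit always finds a free downhill cell, so the loop terminates.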

  12. Concepts and methodologies for modeling and simulation a tribute to Tuncer Oren

    CERN Document Server

    Yilmaz, Levent


    This comprehensive text/reference presents cutting-edge advances in the theory and methodology of modeling and simulation (M&S), and reveals how this work has been influenced by the fundamental contributions of Professor Tuncer Ören to this field. Exploring the synergies among the domains of M&S and systems engineering (SE), the book describes how M&S and SE can help to address the complex problems identified as "Grand Challenges" more effectively under a model-driven and simulation-directed systems engineering framework. Topics and features: examines frameworks for the development of advan

  13. 2014 International Conference on Simulation and Modeling Methodologies, Technologies and Applications

    CERN Document Server

    Ören, Tuncer; Kacprzyk, Janusz; Filipe, Joaquim


    The present book includes a set of selected extended papers from the 4th International Conference on Simulation and Modeling Methodologies, Technologies and Applications (SIMULTECH 2014), held in Vienna, Austria, from 28 to 30 August 2014. The conference brought together researchers, engineers and practitioners interested in methodologies and applications of modeling and simulation. New and innovative solutions are reported in this book. SIMULTECH 2014 received 167 submissions, from 45 countries, in all continents. After a double blind paper review performed by the Program Committee, 23% were accepted as full papers and thus selected for oral presentation. Additional papers were accepted as short papers and posters. A further selection was made after the Conference, based also on the assessment of presentation quality and audience interest, so that this book includes the extended and revised versions of the very best papers of SIMULTECH 2014. Commitment to high quality standards is a major concern of SIMULTEC...

  14. 5th International Conference on Simulation and Modeling Methodologies, Technologies and Applications

    CERN Document Server

    Kacprzyk, Janusz; Ören, Tuncer; Filipe, Joaquim


    The present book includes a set of selected extended papers from the 5th International Conference on Simulation and Modeling Methodologies, Technologies and Applications (SIMULTECH 2015), held in Colmar, France, from 21 to 23 July 2015. The conference brought together researchers, engineers and practitioners interested in methodologies and applications of modeling and simulation. New and innovative solutions are reported in this book. SIMULTECH 2015 received 102 submissions, from 36 countries, in all continents. After a double blind paper review performed by the Program Committee, 19% were accepted as full papers and thus selected for oral presentation. Additional papers were accepted as short papers and posters. A further selection was made after the Conference, based also on the assessment of presentation quality and audience interest, so that this book includes the extended and revised versions of the very best papers of SIMULTECH 2015. Commitment to high quality standards is a major concern of SIMULTECH t...

  15. Mathematical model of marine diesel engine simulator for a new methodology of self propulsion tests

    Energy Technology Data Exchange (ETDEWEB)

    Izzuddin, Nur; Sunarsih,; Priyanto, Agoes [Faculty of Mechanical Engineering, Universiti Teknologi Malaysia, 81310 Skudai, Johor (Malaysia)


    As a vessel operates in the open seas, a marine diesel engine simulator whose engine rotation is controlled and transmitted through the propeller shaft offers a new methodology for self propulsion tests to track fuel saving in real time. Accordingly, this paper presents a real-time marine diesel engine simulator system to track the real performance of a ship through a computer-simulated model. A mathematical model of the marine diesel engine and the propeller is used in the simulation to estimate fuel rate, engine rotating speed, and the thrust and torque of the propeller, and thus achieve the target vessel's speed. The input and output form a real-time control system of fuel saving rate and propeller rotating speed representing the marine diesel engine characteristics. Self-propulsion tests in calm water were conducted using a vessel model to validate the marine diesel engine simulator. The simulator was then used to evaluate fuel saving by employing a new mathematical model of turbochargers for the marine diesel engine simulator. The control system developed will help users analyze different vessel speed conditions to obtain better characteristics and hence optimize the fuel saving rate.
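
    The propeller side of such a simulator is typically built on the standard open-water relations T = K_T·ρ·n²·D⁴ and Q = K_Q·ρ·n²·D⁵ with the coefficients expressed as functions of the advance ratio J = Va/(nD). A minimal sketch follows; the K_T/K_Q linear fits, diameter and operating point are hypothetical placeholders, not the paper's identified values:

```python
# Illustrative open-water propeller model of the kind used inside an
# engine-propeller simulator. Coefficients below are hypothetical.

RHO = 1025.0     # sea water density, kg/m^3
D = 0.25         # propeller diameter, m (model scale, assumed)

def advance_ratio(va, n):
    """J = Va / (n * D), with Va the advance speed (m/s), n in revs per second."""
    return va / (n * D)

def kt(j):                       # thrust coefficient, hypothetical linear fit
    return 0.45 - 0.35 * j

def kq(j):                       # torque coefficient, hypothetical linear fit
    return 0.06 - 0.05 * j

def thrust(va, n):               # T = K_T * rho * n^2 * D^4
    return kt(advance_ratio(va, n)) * RHO * n**2 * D**4

def torque(va, n):               # Q = K_Q * rho * n^2 * D^5
    return kq(advance_ratio(va, n)) * RHO * n**2 * D**5

n = 15.0                          # shaft speed, rev/s (assumed operating point)
va = 1.2                          # advance speed, m/s
j = advance_ratio(va, n)
print(f"J={j:.3f}  T={thrust(va, n):.1f} N  Q={torque(va, n):.2f} N.m")
```

    In a closed-loop simulator, the torque demand Q feeds back into the engine model while T drives the hull resistance balance that sets the vessel's speed.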

  17. Methodology of the Access to Care and Timing Simulation Model for Traumatic Spinal Cord Injury Care. (United States)

    Santos, Argelio; Fallah, Nader; Lewis, Rachel; Dvorak, Marcel F; Fehlings, Michael G; Burns, Anthony Scott; Noonan, Vanessa K; Cheng, Christiana L; Chan, Elaine; Singh, Anoushka; Belanger, Lise M; Atkins, Derek


    Despite the relatively low incidence, the management and care of persons with traumatic spinal cord injury (tSCI) can be resource intensive and complex, spanning multiple phases of care and disciplines. Using a simulation model built with a system level view of the healthcare system allows for prediction of the impact of interventions on patient and system outcomes from injury through to community reintegration after tSCI. The Access to Care and Timing (ACT) project developed a simulation model for tSCI care using techniques from operations research and its development has been described previously. The objective of this article is to briefly describe the methodology and the application of the ACT Model as it was used in several of the articles in this focus issue. The approaches employed in this model provide a framework to look into the complexity of interactions both within and among the different SCI programs, sites and phases of care.

  18. Methodology to evaluate the performance of simulation models for alternative compiler and operating system configurations (United States)

    Simulation modelers increasingly require greater flexibility for model implementation on diverse operating systems, and they demand high computational speed for efficient iterative simulations. Additionally, model users may differ in preference for proprietary versus open-source software environment...

  19. A systematic methodology to extend the applicability of a bioconversion model for the simulation of various co-digestion scenarios

    DEFF Research Database (Denmark)

    Kovalovszki, Adam; Alvarado-Morales, Merlin; Fotidis, Ioannis


    Detailed simulation of anaerobic digestion (AD) requires complex mathematical models and the optimization of numerous model parameters. By performing a systematic methodology and identifying parameters with the highest impact on process variables in a well-established AD model, its applicability ...

  20. Experimental Study of the Methodology for the Modelling and Simulation of Mobile Manipulators

    Directory of Open Access Journals (Sweden)

    Luis Adrian Zuñiga Aviles


    This paper describes an experimental study of a novel methodology for the positioning of a multi-articulated wheeled mobile manipulator with 12 degrees of freedom used for handling tasks involving explosive devices. The approach is based on an extension of a homogeneous transformation graph (HTG), which is adapted for use in the kinematic modelling of manipulators as well as mobile manipulators. The positioning of a mobile manipulator is desirable when: (1) the manipulation task requires the orientation of the whole system towards the objective; (2) tracking trajectories are performed upon approaching the explosive device's location on horizontal and inclined planes; (3) the application requires the manipulation of the explosive device; (4) the system requires the extension of its vertical scope; and (5) the system is required to climb stairs using its front arms. All of the aforementioned desirable features are analysed using the HTG, which establishes the appropriate transformations and interaction parameters of the coupled system. The methodology is tested with simulations and real experiments of the system, where the average RMS error of the positioning task is 7.91 mm, an acceptable figure for the performance of the mobile manipulator.

  1. A Radiative Transfer Modeling Methodology in Gas-Liquid Multiphase Flow Simulations

    Directory of Open Access Journals (Sweden)

    Gautham Krishnamoorthy


    A methodology for performing radiative transfer calculations in computational fluid dynamic simulations of gas-liquid multiphase flows is presented. By considering an externally irradiated bubble column photoreactor as our model system, the bubble scattering coefficients were determined through add-on functions by employing as inputs the bubble volume fractions, number densities, and the fractional contribution of each bubble size to the bubble volume from four different multiphase modeling options. The scattering coefficient profiles resulting from the models were significantly different from one another and aligned closely with their predicted gas-phase volume fraction distributions. The impacts of the multiphase modeling option, initial bubble diameter, and gas flow rates on the radiation distribution patterns within the reactor were also examined. An increase in air inlet velocities resulted in an increase in the fraction of larger sized bubbles and their contribution to the scattering coefficient. However, the initial bubble sizes were found to have the strongest impact on the radiation field.
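
    A cell's scattering coefficient can be assembled from the bubble volume fractions and diameters a multiphase solver reports. For spherical bubbles of diameter d and volume fraction α, one size class contributes (3·α·Q_sca)/(2·d), i.e. number density times geometric cross-section times scattering efficiency. The cell composition below is hypothetical:

```python
# Sketch of the per-cell scattering-coefficient assembly for bubbly flow.

def scattering_coefficient(size_classes, q_sca=2.0):
    """size_classes: list of (volume_fraction, diameter_m) per bubble size.
    q_sca: scattering efficiency (~2 in the geometric-optics limit; assumed)."""
    return sum(3.0 * alpha * q_sca / (2.0 * d) for alpha, d in size_classes)

# Hypothetical cell: 3% gas hold-up split over two bubble sizes.
cell = [(0.02, 0.004), (0.01, 0.008)]   # (alpha, diameter in metres)
beta = scattering_coefficient(cell)
print(f"scattering coefficient: {beta:.2f} 1/m")
```

    Note how the smaller bubble class dominates: at fixed volume fraction, the coefficient scales with 1/d.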

  2. Methodology Development for Passive Component Reliability Modeling in a Multi-Physics Simulation Environment

    Energy Technology Data Exchange (ETDEWEB)

    Aldemir, Tunc [The Ohio State Univ., Columbus, OH (United States); Denning, Richard [The Ohio State Univ., Columbus, OH (United States); Catalyurek, Umit [The Ohio State Univ., Columbus, OH (United States); Unwin, Stephen [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)


    Reduction in safety margin can be expected as passive structures and components undergo degradation with time. Limitations in the traditional probabilistic risk assessment (PRA) methodology constrain its value as an effective tool to address the impact of aging effects on risk and for quantifying the impact of aging management strategies in maintaining safety margins. A methodology has been developed to address multiple aging mechanisms involving large numbers of components (with possibly statistically dependent failures) within the PRA framework in a computationally feasible manner when the sequencing of events is conditioned on the physical conditions predicted in a simulation environment, such as the New Generation System Code (NGSC) concept. Both epistemic and aleatory uncertainties can be accounted for within the same phenomenological framework and maintenance can be accounted for in a coherent fashion. The framework accommodates the prospective impacts of various intervention strategies such as testing, maintenance, and refurbishment. The methodology is illustrated with several examples.

  3. On a reusable and multilevel methodology for modeling and simulation of pharmacokinetic-physiological systems: a preliminary study. (United States)

    Rodrigues Matos, Tomé; Prado-Velasco, Manuel; Navarro, Juan Martín; Vallez, Cristina


    The intrinsic characteristics of physiological systems demand two critical requirements at the time of mathematical modeling: multilevel description and reusability. These features are not properly satisfied by current methodologies. In this paper the design of a multilevel and reusable methodology for modeling pharmacokinetic-physiological systems is presented. It has been implemented under a compliant modeling language to validate its reliability, obtaining a simulation components library, LibPK. A 3-pool urea kinetic model, whose vascular pool includes red blood cells, was built by means of LibPK. This model successfully confirmed the ability of this technology and the underlying methodology for addressing multilevel and reusability features, surpassing other physiologically based pharmacokinetic modeling technologies. Copyright © 2013 Elsevier Ltd. All rights reserved.

  4. Development of a cross-section methodology and a real-time core model for VVER-1000 simulator application

    Energy Technology Data Exchange (ETDEWEB)

    Georgieva, Emiliya Lyudmilova


    The novel academic contributions are summarized as follows. A) A cross-section modelling methodology and a cycle-specific cross-section update procedure are developed to meet fidelity requirements applicable to a cycle-specific reactor core simulation, as well as particular customer needs and practices supporting VVER-1000 operation and safety. B) A real-time version of the Nodal Expansion Method code is developed and implemented into Kozloduy 6 full-scope replica control room simulator.

  5. Multiphysics Simulation of Welding-Arc and Nozzle-Arc System: Mathematical-Model, Solution-Methodology and Validation (United States)

    Pawar, Sumedh; Sharma, Atul


    This work presents a mathematical model and solution methodology for a multiphysics engineering problem on arc formation during welding and inside a nozzle. A general-purpose commercial CFD solver, ANSYS FLUENT 13.0.0, is used in this work. Arc formation involves strongly coupled gas dynamics and electrodynamics, simulated by solution of the coupled Navier-Stokes equations, Maxwell's equations and the radiation heat-transfer equation. Validation of the present numerical methodology is demonstrated with an excellent agreement with the published results. The developed mathematical model and the user defined functions (UDFs) are independent of the geometry and are applicable to any system that involves arc formation, in a 2D axisymmetric coordinate system. The high-pressure flow of SF6 gas in the nozzle-arc system resembles the arc chamber of an SF6 gas circuit breaker; thus, this methodology can be extended to simulate the arcing phenomenon during current interruption.

  6. Methodology of Transport Scheme Selection for Metro Trains Using a Combined Simulation-Optimization Model

    Directory of Open Access Journals (Sweden)

    Svetla Dimitrova Stoilova


    A major problem in planning the organization of trains in metros is the optimization of the scheme of movement, which determines the routing and the number of trains. In this paper, a combined simulation-optimization model comprising four steps is proposed. In the first step, train movement is simulated in order to study the interval between trains according to the incoming passenger flows at the stations. The simulation model was elaborated using the ARENA software, and the results were validated through experimental observations. In the second step, using the results obtained from the simulations, the correlation between the observed parameters (the incoming passengers and the interval between trains) is studied; a non-linear relationship between the interval of movement, incoming passengers at the station and passengers on the platform is established. The third step defines the variant transportation schemes. The fourth step makes the optimal choice of train transportation scheme for the metro based on a linear optimization model that uses the regression obtained in the second step. The practicability of the combined simulation-optimization model is demonstrated through a case study of Sofia's metro in two peak periods, morning and evening, and the model results have been compared with the real situation. The model results are similar to the real data for the morning peak period, but for the evening peak period it is necessary to increase the number of trains.
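
    The last two steps can be caricatured in a few lines: use a fitted non-linear relation between headway and platform load to pick the smallest fleet that keeps the platform below a limit. The cycle time, the fitted relation and all numbers below are hypothetical, not Sofia metro data:

```python
# Toy fleet-sizing step of a simulation-optimization loop.

CYCLE_TIME_MIN = 60.0        # round-trip time of one train, minutes (assumed)

def headway(num_trains):
    """Evenly spaced trains: headway is cycle time divided by fleet size."""
    return CYCLE_TIME_MIN / num_trains

def platform_load(arrival_rate, h):
    # Hypothetical fitted relation: platform load grows faster than
    # linearly with the headway, mimicking the paper's non-linear regression.
    return arrival_rate * h + 0.05 * (arrival_rate * h) ** 1.5

def min_trains(arrival_rate, load_limit, max_trains=40):
    """Smallest fleet keeping platform load under the limit (None if impossible)."""
    for k in range(1, max_trains + 1):
        if platform_load(arrival_rate, headway(k)) <= load_limit:
            return k
    return None

print(min_trains(arrival_rate=20.0, load_limit=160.0))   # morning-peak style demand
```

    Because the load is monotone decreasing in fleet size, a simple linear scan suffices here; the paper's linear program handles the richer multi-route case.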

  7. Analysis of Wind Turbine Simulation Models: Assessment of Simplified versus Complete Methodologies: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Honrubia-Escribano, A.; Jimenez-Buendia, F.; Molina-Garcia, A.; Fuentes-Moreno, J. A.; Muljadi, Eduard; Gomez-Lazaro, E.


    This paper presents the current status of simplified wind turbine models used for power system stability analysis. This work is based on the ongoing work being developed in IEC 61400-27. This international standard, for which a technical committee was convened in October 2009, is focused on defining generic (also known as simplified) simulation models for both wind turbines and wind power plants. The results of the paper provide an improved understanding of the usability of generic models to conduct power system simulations.

  8. General Methodology for Metabolic Pathways Modeling and Simulation using Discrete Event Approach. Example on glycolysis of Yeast.


    Antoine-Santoni, Thierry; Bernardi, Fabrice; Giamarchi, François


    In the bioinformatics research domain, the systemic approach (considering a cell as a system) is growing very quickly. Most significant current research focuses on mathematical formalisms. However, the discrete event simulation domain is so mature nowadays that it can be considered the counterpart of differential equations in the continuous models domain. In this paper, we introduce a general methodology using the DEVS formalism in the bioinformatic...
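
    The discrete event view of a pathway treats each reaction firing as an event. A minimal Gillespie-style sketch of a toy two-step chain, Glucose → G6P → F6P, illustrates the idea (this is stochastic simulation in the same event-driven spirit, not the paper's DEVS models; rate constants and counts are hypothetical):

```python
import random

# Discrete-event (Gillespie-style) simulation of a toy two-step pathway.
random.seed(1)
state = {"Glucose": 100, "G6P": 0, "F6P": 0}
k1, k2 = 1.0, 0.5                     # per-molecule reaction rates (assumed)
t, t_end = 0.0, 50.0

while t < t_end:
    a1 = k1 * state["Glucose"]        # propensity of Glucose -> G6P
    a2 = k2 * state["G6P"]            # propensity of G6P -> F6P
    a0 = a1 + a2
    if a0 == 0:
        break                         # no reaction can fire any more
    t += random.expovariate(a0)       # exponential waiting time to next event
    if random.random() * a0 < a1:     # pick which reaction fires
        state["Glucose"] -= 1
        state["G6P"] += 1
    else:
        state["G6P"] -= 1
        state["F6P"] += 1

print(state)
```

    Molecule counts are conserved along the chain, and the event queue empties once every molecule has reached F6P.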

  9. A Methodology for Model Comparison Using the Theater Simulation of Airbase Resources and All Mobile Tactical Air Force Models (United States)


    ability to simulate the capability of an airbase to generate and sustain sorties under wartime conditions, but the estimates produced by the TSAR model...research are the common airbase simulation or sortie generation features for which the two models are purportedly similar. The TSAR model is accepted...titled "Comparison of the TSAR Model to the LCOM Model," attempted to compare two models. Noble’s effort consisted of a statistical analysis of the two

  10. HRP's Healthcare Spin-Offs Through Computational Modeling and Simulation Practice Methodologies (United States)

    Mulugeta, Lealem; Walton, Marlei; Nelson, Emily; Peng, Grace; Morrison, Tina; Erdemir, Ahmet; Myers, Jerry


    Spaceflight missions expose astronauts to novel operational and environmental conditions that pose health risks that are currently not well understood, and perhaps unanticipated. Furthermore, given the limited number of humans who have flown in long duration missions and beyond low Earth orbit, the research and clinical data necessary to predict and mitigate these health and performance risks are limited. Consequently, NASA's Human Research Program (HRP) conducts research and develops advanced methods and tools to predict, assess, and mitigate potential hazards to the health of astronauts. In this light, NASA has explored the possibility of leveraging computational modeling since the 1970s as a means to elucidate the physiologic risks of spaceflight and develop countermeasures. Since that time, substantial progress has been realized in this arena through a number of HRP-funded activities such as the Digital Astronaut Project (DAP) and the Integrated Medical Model (IMM). Much of this success can be attributed to HRP's endeavor to establish rigorous verification, validation, and credibility (VV&C) processes that ensure computational models and simulations (M&S) are sufficiently credible to address issues within their intended scope. This presentation summarizes HRP's activities in credibility of modeling and simulation, in particular through its outreach to the community of modeling and simulation practitioners. METHODS: The HRP requires that all M&S that can have moderate to high impact on crew health or mission success be vetted in accordance with the NASA Standard for Models and Simulations, NASA-STD-7009 (7009) [5]. As this standard mostly focuses on engineering systems, the IMM and DAP have invested substantial efforts to adapt the processes established in this standard for application to biological M&S, which are more prevalent in human health and performance (HHP) and space biomedical research and operations [6,7]. These methods have also generated

  11. An Eulerian two-phase model for steady sheet flow using large-eddy simulation methodology (United States)

    Cheng, Zhen; Hsu, Tian-Jian; Chauchat, Julien


    A three-dimensional Eulerian two-phase flow model for sediment transport in sheet flow conditions is presented. To resolve turbulence and turbulence-sediment interactions, the large-eddy simulation approach is adopted. Specifically, a dynamic Smagorinsky closure is used for the subgrid fluid and sediment stresses, while the subgrid contribution to the drag force is included using a drift velocity model with a similar dynamic procedure. The contribution of sediment stresses due to intergranular interactions is modeled by the kinetic theory of granular flow at low to intermediate sediment concentration, while at high sediment concentration of enduring contact, a phenomenological closure for particle pressure and frictional viscosity is used. The model is validated with a comprehensive high-resolution dataset of unidirectional steady sheet flow (Revil-Baudard et al., 2015, Journal of Fluid Mechanics, 767, 1-30). At a particle Stokes number of about 10, simulation results indicate a reduced von Kármán coefficient of κ ≈ 0.215 obtained from the fluid velocity profile. A fluid turbulence kinetic energy budget analysis further indicates that the drag-induced turbulence dissipation rate is significant in the sheet flow layer, while in the dilute transport layer, the pressure work plays a similar role as the buoyancy dissipation, which is typically used in the single-phase stratified flow formulation. The present model also reproduces the sheet layer thickness and mobile bed roughness similar to measured data. However, the resulting mobile bed roughness is more than two times larger than that predicted by the empirical formulae. Further analysis suggests that through intermittent turbulent motions near the bed, the resolved sediment Reynolds stress plays a major role in the enhancement of mobile bed roughness. Our analysis on near-bed intermittency also suggests that the turbulent ejection motions are highly correlated with the upward sediment suspension flux, while
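    A von Kármán coefficient such as the κ ≈ 0.215 reported above is typically extracted by fitting the logarithmic law of the wall to the simulated velocity profile. The sketch below illustrates such a fit; the function, variable names, and the synthetic profile are illustrative assumptions, not the authors' code:

```python
import math

def fit_von_karman(z, u, u_star):
    """Least-squares fit of the log law u/u* = (1/kappa)*ln(z) + B;
    the inverse slope of u/u* versus ln(z) gives the von Karman coefficient."""
    x = [math.log(zi) for zi in z]
    y = [ui / u_star for ui in u]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) \
        / sum((xi - mx) ** 2 for xi in x)
    return 1.0 / slope

# A synthetic log-layer profile generated with kappa = 0.215 is recovered by the fit:
z = [0.01 * i for i in range(1, 50)]
u_star = 0.05
u = [u_star * (math.log(zi) / 0.215 + 5.0) for zi in z]
kappa = fit_von_karman(z, u, u_star)
```

    In practice the fit window must be restricted to the logarithmic region of the profile; applying it across the sheet layer would bias the recovered coefficient.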

  12. Computing elastic‐rebound‐motivated earthquake probabilities in unsegmented fault models: a new methodology supported by physics‐based simulators (United States)

    Field, Edward H.


    A methodology is presented for computing elastic‐rebound‐based probabilities in an unsegmented fault or fault system, which involves computing along‐fault averages of renewal‐model parameters. The approach is less biased and more self‐consistent than a logical extension of that applied most recently for multisegment ruptures in California. It also enables the application of magnitude‐dependent aperiodicity values, which the previous approach does not. Monte Carlo simulations are used to analyze long‐term system behavior, which is generally found to be consistent with that of physics‐based earthquake simulators. Results cast doubt that recurrence‐interval distributions at points on faults look anything like traditionally applied renewal models, a fact that should be considered when interpreting paleoseismic data. We avoid such assumptions by changing the "probability of what" question (from offset at a point to the occurrence of a rupture, assuming it is the next event to occur). The new methodology is simple, although not perfect in terms of recovering long‐term rates in Monte Carlo simulations. It represents a reasonable, improved way to represent first‐order elastic‐rebound predictability, assuming it is there in the first place, and for a system that clearly exhibits other unmodeled complexities, such as aftershock triggering.
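    The conditional-probability calculation at the core of such renewal models can be illustrated in a few lines. The sketch below uses a lognormal recurrence-interval distribution parameterized by mean recurrence and aperiodicity; the distribution choice and the numerical values are illustrative assumptions, not the paper's model:

```python
import math

def lognormal_cdf(t, mu, sigma):
    """CDF of a lognormal recurrence-interval distribution."""
    if t <= 0:
        return 0.0
    return 0.5 * (1.0 + math.erf((math.log(t) - mu) / (sigma * math.sqrt(2.0))))

def conditional_rupture_prob(t_elapsed, dt, mean_ri, aperiodicity):
    """P(rupture in [t, t+dt] | no rupture through t) under a renewal model.
    mean_ri and aperiodicity (coefficient of variation) set the lognormal shape."""
    sigma = math.sqrt(math.log(1.0 + aperiodicity ** 2))
    mu = math.log(mean_ri) - 0.5 * sigma ** 2
    F = lambda t: lognormal_cdf(t, mu, sigma)
    return (F(t_elapsed + dt) - F(t_elapsed)) / (1.0 - F(t_elapsed))

# Hypothetical fault: 150-yr mean recurrence, aperiodicity 0.5, 30-yr forecast window.
p_now = conditional_rupture_prob(100.0, 30.0, 150.0, 0.5)
p_later = conditional_rupture_prob(200.0, 30.0, 150.0, 0.5)
```

    The probability grows with the open interval, which is the elastic-rebound signature the methodology seeks to capture; the paper's along-fault averaging of such parameters is not shown here.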

  13. Archetype Modeling Methodology. (United States)

    Moner, David; Alberto Maldonado, José; Robles, Montserrat


    Clinical Information Models (CIMs) expressed as archetypes play an essential role in the design and development of current Electronic Health Record (EHR) information structures. Although the literature reports many experiences of using archetypes, a comprehensive and formal methodology for archetype modeling does not exist. Having a modeling methodology is essential to develop quality archetypes, in order to guide the development of EHR systems and to allow the semantic interoperability of health data. In this work, an archetype modeling methodology is proposed. This paper describes its phases, the inputs and outputs of each phase, and the involved participants and tools. It also includes the description of the possible strategies to organize the modeling process. The proposed methodology is inspired by existing best practices of CIMs, software and ontology development. The methodology has been applied and evaluated in regional and national EHR projects. The application of the methodology provided useful feedback and improvements, and confirmed its advantages. The conclusion of this work is that having a formal methodology for archetype development facilitates the definition and adoption of interoperable archetypes, improves their quality, and facilitates their reuse among different information systems and EHR projects. Moreover, the proposed methodology can also serve as a reference for CIMs development using any other formalism. Copyright © 2018 Elsevier Inc. All rights reserved.

  14. Unique Methodologies for Nano/Micro Manufacturing Job Training Via Desktop Supercomputer Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Kimball, Clyde [Northern Illinois Univ., DeKalb, IL (United States); Karonis, Nicholas [Northern Illinois Univ., DeKalb, IL (United States); Lurio, Laurence [Northern Illinois Univ., DeKalb, IL (United States); Piot, Philippe [Northern Illinois Univ., DeKalb, IL (United States); Xiao, Zhili [Northern Illinois Univ., DeKalb, IL (United States); Glatz, Andreas [Northern Illinois Univ., DeKalb, IL (United States); Pohlman, Nicholas [Northern Illinois Univ., DeKalb, IL (United States); Hou, Minmei [Northern Illinois Univ., DeKalb, IL (United States); Demir, Veysel [Northern Illinois Univ., DeKalb, IL (United States); Song, Jie [Northern Illinois Univ., DeKalb, IL (United States); Duffin, Kirk [Northern Illinois Univ., DeKalb, IL (United States); Johns, Mitrick [Northern Illinois Univ., DeKalb, IL (United States); Sims, Thomas [Northern Illinois Univ., DeKalb, IL (United States); Yin, Yanbin [Northern Illinois Univ., DeKalb, IL (United States)


    This project establishes an initiative in high speed (Teraflop)/large-memory desktop supercomputing for modeling and simulation of dynamic processes important for energy and industrial applications. It provides a training ground for employment of current students in an emerging field with skills necessary to access the large supercomputing systems now present at DOE laboratories. It also provides a foundation for NIU faculty to make a quantum leap beyond their current small cluster facilities. The funding extends faculty and student capability to a new level of analytic skills with concomitant publication avenues. The components of the Hewlett Packard computer obtained by the DOE funds create a hybrid combination of a Graphics Processing System (12 GPU/Teraflops) and a Beowulf CPU system (144 CPU), the first expandable via the NIU GAEA system to ~60 Teraflops integrated with a 720 CPU Beowulf system. The software is based on access to the NVIDIA/CUDA library and the ability through MATLAB multiple licenses to create additional local programs. A number of existing programs are being transferred to the CPU Beowulf Cluster. Since the expertise necessary to create the parallel processing applications has only recently been obtained at NIU, this effort for software development is in an early stage. The educational program has been initiated via formal tutorials and classroom curricula designed for the coming year. Specifically, the cost focus was on hardware acquisitions and appointment of graduate students for a wide range of applications in engineering, physics and computer science.

  15. Methodology for Developing a Diesel Exhaust After Treatment Simulation Tool

    DEFF Research Database (Denmark)

    Christiansen, Tine; Jensen, Johanne; Åberg, Andreas


    A methodology for the development of catalyst models is presented. Also, a methodology of the implementation of such models into a modular simulation tool, which simulates the units in succession, is presented. A case study is presented illustrating how suitable models can be found and used for s...

  16. Simulation Enabled Safeguards Assessment Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Robert Bean; Trond Bjornard; Thomas Larson


    It is expected that nuclear energy will be a significant component of future supplies. New facilities, operating under a strengthened international nonproliferation regime will be needed. There is good reason to believe virtual engineering applied to the facility design, as well as to the safeguards system design will reduce total project cost and improve efficiency in the design cycle. Simulation Enabled Safeguards Assessment MEthodology (SESAME) has been developed as a software package to provide this capability for nuclear reprocessing facilities. The software architecture is specifically designed for distributed computing, collaborative design efforts, and modular construction to allow step improvements in functionality. Drag and drop wireframe construction allows the user to select the desired components from a component warehouse, render the system for 3D visualization, and, linked to a set of physics libraries and/or computational codes, conduct process evaluations of the system they have designed.

  17. Modelling and Simulation Methodology for Dynamic Resources Assignment System in Container Terminal

    Directory of Open Access Journals (Sweden)

    Lu Bo


    Full Text Available As the competition among international container terminals has become increasingly fierce, every port is striving to maintain its competitive edge and provide satisfactory services to port users. By virtue of information technology enhancement, many efforts to raise port competitiveness through an advanced operation system are actively being made, and judging from the viewpoint of investment effect, these efforts are preferable to infrastructure expansion and additional equipment acquisition. Based on simulation, this study has tried to prove that RFID-based real-time location system (RTLS) data collection and dynamic operation of transfer equipment have a positive effect on productivity improvement and resource utilization enhancement. Moreover, a study of the demand for real-time data in container terminal operation has been made, operation processes have been redesigned along with the collection of related data, and based on them, simulations have been conducted. As a result, considerably higher productivity improvement could be expected.

  18. Numerical investigation of Marine Hydrokinetic Turbines: methodology development for single turbine and small array simulation, and application to flume and full-scale reference models (United States)

    Javaherchi Mozafari, Amir Teymour

    A hierarchy of numerical models, Single Rotating Reference Frame (SRF) and Blade Element Model (BEM), were used for numerical investigation of horizontal axis Marine Hydrokinetic (MHK) Turbines. In the initial stage the SRF and BEM were used to simulate the performance and turbulent wake of a flume- and a full-scale MHK turbine reference model. A significant level of understanding and confidence was developed in the implementation of numerical models for simulation of a MHK turbine. This was achieved by simulation of the flume-scale turbine experiments and comparison between numerical and experimental results. Then the developed numerical methodology was applied to simulate the performance and wake of the full-scale MHK reference model (DOE Reference Model 1). In the second stage the BEM was used to simulate the experimental study of two different MHK turbine array configurations (i.e. two and three coaxial turbines). After developing a numerical methodology using the experimental comparison to simulate the flow field of a turbine array, this methodology was applied toward an array optimization study of a full-scale model with the goal of proposing an optimized MHK turbine configuration with minimal computational cost and time. In the last stage the BEM was used to investigate one of the potential environmental effects of MHK turbines. A general methodological approach was developed and experimentally validated to investigate the effect of the MHK turbine wake on the sedimentation process of suspended particles in a tidal channel.

  19. Models and Methodologies for Realistic Propagation Simulation for Urban Mesh Networks (United States)


    accurate, then there are well known formulas that describe the behavior (e.g., see [10]). In general, the reflection can be written in terms of six ...and the 75th percentile. [Figure 4.7: Relationship between sigma and gamma at high and low building density.] ...Furthermore, accurate mobility simulation requires knowledge of details such as the types of establishments within each building (e.g., restaurant

  20. A New Methodology for Modeling National Command Level Decisionmaking in War Games and Simulations. (United States)


    Smalltalk (Goldberg and Kay, 1976) and ROSS (McArthur and Klahr, 1982), but the relationships are subtle and the mechanisms for achieving the discussion of strategic issues such as escalation in crisis is much less complex and sophisticated than is often assumed (see also Davis and Stan...Workshop on C3 Systems, Massachusetts Institute of Technology, Cambridge, 1983.) Davis, Paul K., and Peter J. E. Stan, Concepts and Models of Escala

  1. New Methodology for Simulating Fragmentation Munitions

    National Research Council Canada - National Science Library

    Gold, V


    This document on New Methodology for Simulating Fragmentation Munitions consists of the 17 presentation slides that were presented at the Proceedings of the 36th Annual Gun and Ammunition Symposium...

  2. Methodology for Validating Building Energy Analysis Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Judkoff, R.; Wortman, D.; O'Doherty, B.; Burch, J.


    The objective of this report was to develop a validation methodology for building energy analysis simulations, collect high-quality, unambiguous empirical data for validation, and apply the validation methodology to the DOE-2.1, BLAST-2MRT, BLAST-3.0, DEROB-3, DEROB-4, and SUNCAT 2.4 computer programs. This report covers background information, literature survey, validation methodology, comparative studies, analytical verification, empirical validation, comparative evaluation of codes, and conclusions.

  3. Quantum dynamical simulation of non-adiabatic proton transfers in aqueous solution methodology, molecular models and applications

    CERN Document Server

    Billeter, S R


    This thesis describes the methodology of quantum dynamical (QD) simulation of proton transfers in aqueous solutions, its implementation in the simulation program QDGROMOS and its application to protonated water and aqueous solutions of acetic acid. QDGROMOS is based on the GROMOS96 molecular dynamics (MD) program package. Many of the solutions to partial problems such as the representation of the quantum state, the solution of the time-dependent Schrodinger equation, the forces from the quantum subsystem, the time-ordering of the propagations and the correlations between the subsystems, are complementary. In chapter 1, various numerical propagation algorithms for solving the time-dependent Schrodinger equation under the influence of a constant Hamilton operator are compared against each other, mainly in one dimension. A Chebysheff series expansion and the expansion in terms of eigenstates of the Hamilton operator were found to be most stable. Chapter 2 describes the theory, the methods and the algorithms of Q...

  4. Methodological issues in lipid bilayer simulations

    NARCIS (Netherlands)

    Anezo, C; de Vries, AH; Holtje, HD; Tieleman, DP; Marrink, SJ


    Methodological issues in molecular dynamics (MD) simulations, such as the treatment of long-range electrostatic interactions or the type of pressure coupling, have important consequences for the equilibrium properties observed. We report a series of long (up to 150 ns) MD simulations of

  5. Methodology and Toolset for Model Verification, Hardware/Software co-simulation, Performance Optimisation and Customisable Source-code generation

    DEFF Research Database (Denmark)

    Berger, Michael Stübert; Soler, José; Yu, Hao


    The MODUS project aims to provide a pragmatic and viable solution that will allow SMEs to substantially improve their positioning in the embedded-systems development market. The MODUS tool will provide a model verification and Hardware/Software co-simulation tool (TRIAL) and a performance...... of system properties, and producing inputs to be fed into these engines, interfacing with standard (SystemC) simulation platforms for HW/SW co-simulation, customisable source-code generation towards respecting coding standards and conventions and software performance-tuning optimisation through automated...

  6. Simulation statistical foundations and methodology

    CERN Document Server

    Mihram, G Arthur


    In this book, we study theoretical and practical aspects of computing methods for mathematical modelling of nonlinear systems. A number of computing techniques are considered, such as methods of operator approximation with any given accuracy; operator interpolation techniques including a non-Lagrange interpolation; methods of system representation subject to constraints associated with concepts of causality, memory and stationarity; methods of system representation with an accuracy that is the best within a given class of models; methods of covariance matrix estimation;methods for low-rank mat


    Directory of Open Access Journals (Sweden)

    Marko Hadjina


    Full Text Available In this research a shipbuilding production process design methodology, using computer simulation, is suggested. The suggested methodology is expected to provide a better and more efficient tool for the design of complex shipbuilding production processes. In the first part of this research, existing practice for production process design in shipbuilding is discussed and its shortcomings and problems are emphasized. Next, the discrete event simulation modelling method, as the basis of the suggested methodology, is investigated and described with regard to its special characteristics, advantages and reasons for application, especially in shipbuilding production processes. Furthermore, simulation modelling basics are described, as well as the suggested methodology for the production process design procedure. A case study applying the suggested methodology to the design of a robotized profile fabrication production line is demonstrated. The selected design solution, acquired with the suggested methodology, was evaluated through comparison with a robotized profile cutting line installed in a specific shipyard production process. Based on data obtained from real production, the simulation model was further enhanced. Finally, on the grounds of this research, its results and drawn conclusions, directions for further research are suggested.
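    The discrete event simulation approach underlying this kind of production-line study can be sketched with a simple event queue. The station, timing values, and makespan metric below are hypothetical illustrations, not the shipyard case study's data:

```python
import heapq

def simulate_line(n_profiles, arrival_gap, cut_time):
    """Minimal event-driven model of one profile-cutting station:
    profiles arrive at fixed intervals and queue for a single machine.
    Returns the makespan (finish time of the last profile)."""
    events = [(i * arrival_gap, 'arrive', i) for i in range(n_profiles)]
    heapq.heapify(events)
    free_at = 0.0   # time at which the station becomes idle
    finish = {}
    while events:
        t, kind, i = heapq.heappop(events)
        if kind == 'arrive':
            start = max(t, free_at)          # wait if the station is busy
            free_at = start + cut_time
            heapq.heappush(events, (free_at, 'done', i))
        else:
            finish[i] = t
    return max(finish.values())

# Station-limited scenario: cutting (5) is slower than arrivals (2),
# so 10 profiles finish at t = 50.
makespan_busy = simulate_line(10, 2, 5)
# Arrival-limited scenario: the last profile arrives at t = 90 and finishes at 95.
makespan_idle = simulate_line(10, 10, 5)
```

    Real studies of this kind add stochastic processing times and multiple stations, then compare design alternatives on throughput and utilization statistics.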

  8. Linear Model Methodology

    CERN Document Server

    Khuri, Andre I


    Supported by a large number of examples, this book provides a foundation in the theory of linear models and explores the developments in data analysis. It encompasses a wide variety of topics in linear models that incorporate the classical approach and other trends and modeling techniques.

  9. A Model-Based Systems Engineering Methodology for Employing Architecture In System Analysis: Developing Simulation Models Using Systems Modeling Language Products to Link Architecture and Analysis (United States)


    development and structured exploration of external models and simulations is a valuable approach within the system architecture domain and system analysis...consistency within the architecture. Recall that each SysML Diagram required a "connected" structure. That is, any elements created within a diagram needed...of this type of structure, the creation of an executable architecture that checks for logical consistency does not support those types of

  10. Model continuity in discrete event simulation: A framework for model-driven development of simulation models

    NARCIS (Netherlands)

    Cetinkaya, D; Verbraeck, A.; Seck, MD


    Most of the well-known modeling and simulation (M&S) methodologies state the importance of conceptual modeling in simulation studies, and they suggest the use of conceptual models during the simulation model development process. However, only a limited number of methodologies refers to how to

  11. Teaching and assessing procedural skills using simulation: metrics and methodology. (United States)

    Lammers, Richard L; Davenport, Moira; Korley, Frederick; Griswold-Theodorson, Sharon; Fitch, Michael T; Narang, Aneesh T; Evans, Leigh V; Gross, Amy; Rodriguez, Elliot; Dodge, Kelly L; Hamann, Cara J; Robey, Walter C


    Simulation allows educators to develop learner-focused training and outcomes-based assessments. However, the effectiveness and validity of simulation-based training in emergency medicine (EM) requires further investigation. Teaching and testing technical skills require methods and assessment instruments that are somewhat different than those used for cognitive or team skills. Drawing from work published by other medical disciplines as well as educational, behavioral, and human factors research, the authors developed six research themes: measurement of procedural skills; development of performance standards; assessment and validation of training methods, simulator models, and assessment tools; optimization of training methods; transfer of skills learned on simulator models to patients; and prevention of skill decay over time. The article reviews relevant and established educational research methodologies and identifies gaps in our knowledge of how physicians learn procedures. The authors present questions requiring further research that, once answered, will advance understanding of simulation-based procedural training and assessment in EM.

  12. Wake modeling and simulation

    DEFF Research Database (Denmark)

    Larsen, Gunner Chr.; Madsen Aagaard, Helge; Larsen, Torben J.

    We present a consistent, physically based theory for the wake meandering phenomenon, which we consider of crucial importance for the overall description of wind turbine loadings in wind farms. In its present version the model is confined to single wake situations. The model philosophy does, howev...... methodology has been implemented in the aeroelastic code HAWC2, and example simulations of wake situations, from the small Tjæreborg wind farm, have been performed showing satisfactory agreement between predictions and measurements...

  13. Methodology for the LABIHS PWR simulator modernization

    Energy Technology Data Exchange (ETDEWEB)

    Jaime, Guilherme D.G.; Oliveira, Mauro V., E-mail:, E-mail: [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil)


    The Human-System Interface Laboratory (LABIHS) simulator is composed of a set of advanced hardware and software components whose goal is to simulate the main characteristics of a Pressurized Water Reactor (PWR). This simulator serves a set of purposes, such as: control room modernization projects; design of operator aiding systems; providing technological expertise for graphical user interface (GUI) design; control room and interface evaluations considering both ergonomics and human factors aspects; interaction analysis between operators and the various systems operated by them; and human reliability analysis in scenarios considering simulated accidents and normal operation. The simulator runs on a PA-RISC architecture server (HPC3700), developed around 2000, using the HP-UX operating system. All mathematical modeling components were written in the HP Fortran-77 programming language, with a shared memory used to exchange data among all simulator modules. Although this hardware/software framework was discontinued in 2008, with customer support ceasing in 2013, it is still used to run and operate the simulator. Because the simulator is based on an obsolete and proprietary appliance, the laboratory is subject to efficiency and availability issues, such as: downtime caused by hardware failures; inability to run experiments on modern and well-known architectures; and inability to run multiple simulation instances simultaneously. Hence, solutions must be proposed and implemented so that: the simulator can be ported to the Linux operating system, running on the x86 instruction set architecture (i.e. personal computers); multiple instances of the simulator can run simultaneously; and the operator terminals can run remotely. This paper deals with the design stage of the simulator modernization, in which a thorough inspection of the hardware and software currently in operation is performed. Our goal is to

  14. VERA Core Simulator Methodology for PWR Cycle Depletion

    Energy Technology Data Exchange (ETDEWEB)

    Kochunas, Brendan [University of Michigan; Collins, Benjamin S [ORNL; Jabaay, Daniel [University of Michigan; Kim, Kang Seog [ORNL; Graham, Aaron [University of Michigan; Stimpson, Shane [University of Michigan; Wieselquist, William A [ORNL; Clarno, Kevin T [ORNL; Palmtag, Scott [Core Physics, Inc.; Downar, Thomas [University of Michigan; Gehin, Jess C [ORNL


    This paper describes the methodology developed and implemented in MPACT for performing high-fidelity pressurized water reactor (PWR) multi-cycle core physics calculations. MPACT is being developed primarily for application within the Consortium for the Advanced Simulation of Light Water Reactors (CASL) as one of the main components of the VERA Core Simulator, the others being COBRA-TF and ORIGEN. The methods summarized in this paper include a methodology for performing resonance self-shielding and computing macroscopic cross sections, 2-D/1-D transport, nuclide depletion, thermal-hydraulic feedback, and other supporting methods. These methods represent a minimal set needed to simulate high-fidelity models of a realistic nuclear reactor. Results demonstrating this are presented from the simulation of a realistic model of the first cycle of Watts Bar Unit 1. The simulation, which approximates the cycle operation, is observed to be within 50 ppm boron (ppmB) reactivity for all simulated points in the cycle and approximately 15 ppmB for a consistent statepoint. The verification and validation of the PWR cycle depletion capability in MPACT is the focus of two companion papers.

  15. Robust Optimization in Simulation : Taguchi and Response Surface Methodology

    NARCIS (Netherlands)

    Dellino, G.; Kleijnen, J.P.C.; Meloni, C.


    Optimization of simulated systems is tackled by many methods, but most methods assume known environments. This article, however, develops a 'robust' methodology for uncertain environments. This methodology uses Taguchi's view of the uncertain world, but replaces his statistical techniques by
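    The mean-variance trade-off at the heart of such robust approaches can be sketched in a few lines: evaluate each candidate decision against sampled environmental noise, then score it by mean plus weighted dispersion. The objective function, noise model, and weighting below are invented for illustration and do not come from the article:

```python
import random
import statistics

def robust_optimize(objective, candidates, noise_samples, weight=1.0):
    """Pick the candidate minimizing mean + weight * std over environmental noise,
    i.e. a simple robust counterpart of nominal optimization."""
    best, best_score = None, float('inf')
    for x in candidates:
        ys = [objective(x, e) for e in noise_samples]
        score = statistics.mean(ys) + weight * statistics.pstdev(ys)
        if score < best_score:
            best, best_score = x, score
    return best

# Hypothetical objective: nominal optimum at x = 2, but larger x is more
# sensitive to the environmental factor e.
f = lambda x, e: (x - 2.0) ** 2 + e * x
rng = random.Random(0)
noise = [rng.uniform(-1.0, 1.0) for _ in range(200)]
x_robust = robust_optimize(f, [i * 0.1 for i in range(41)], noise, weight=1.0)
```

    The robust choice lands below the nominal optimum of 2.0, trading a little expected performance for reduced sensitivity to the environment; the article's methodology replaces this brute-force scan with response surface metamodels.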

  16. Quantitative Analysis of Variability and Uncertainty in Environmental Data and Models. Volume 1. Theory and Methodology Based Upon Bootstrap Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Frey, H. Christopher [North Carolina State University, Raleigh, NC (United States); Rhodes, David S. [North Carolina State University, Raleigh, NC (United States)


    This is Volume 1 of a two-volume set of reports describing work conducted at North Carolina State University sponsored by Grant Number DE-FG05-95ER30250 by the U.S. Department of Energy. The title of the project is “Quantitative Analysis of Variability and Uncertainty in Acid Rain Assessments.” The work conducted under sponsorship of this grant pertains primarily to two main topics: (1) development of new methods for quantitative analysis of variability and uncertainty applicable to any type of model; and (2) analysis of variability and uncertainty in the performance, emissions, and cost of electric power plant combustion-based NOx control technologies. These two main topics are reported separately in Volumes 1 and 2.
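    The basic bootstrap construction on which such methodology builds can be illustrated with a percentile confidence interval; this sketch is a generic textbook construction, not the report's two-dimensional variability/uncertainty method:

```python
import random
import statistics

def bootstrap_ci(data, stat=statistics.mean, n_boot=2000, alpha=0.05, seed=1):
    """Percentile bootstrap confidence interval for an arbitrary statistic:
    resample the data with replacement, recompute the statistic, and take
    quantiles of the resampled values."""
    rng = random.Random(seed)
    reps = sorted(stat([rng.choice(data) for _ in data]) for _ in range(n_boot))
    lo = reps[int((alpha / 2) * n_boot)]
    hi = reps[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

# Hypothetical sample: the interval should bracket the sample mean of 10.5.
data = list(range(1, 21))
lo, hi = bootstrap_ci(data)
```

    Separating variability (the spread of the data) from uncertainty (how well statistics of that spread are known), as the report does, requires a nested, two-dimensional version of this resampling loop.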

  17. Weighted Ensemble Simulation: Review of Methodology, Applications, and Software. (United States)

    Zuckerman, Daniel M; Chong, Lillian T


    The weighted ensemble (WE) methodology orchestrates quasi-independent parallel simulations run with intermittent communication that can enhance sampling of rare events such as protein conformational changes, folding, and binding. The WE strategy can achieve superlinear scaling: the unbiased estimation of key observables such as rate constants and equilibrium state populations to greater precision than would be possible with ordinary parallel simulation. WE software can be used to control any dynamics engine, such as standard molecular dynamics and cell-modeling packages. This article reviews the theoretical basis of WE and goes on to describe successful applications to a number of complex biological processes: protein conformational transitions, (un)binding, and assembly processes, as well as cell-scale processes in systems biology. We furthermore discuss the challenges that need to be overcome in the next phase of WE methodological development. Overall, the combined advances in WE methodology and software have enabled the simulation of long-timescale processes that would otherwise not be practical on typical computing resources using standard simulation.
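    The split/merge bookkeeping that keeps WE statistically unbiased can be sketched for a single bin as follows; the data layout and target walker count are illustrative assumptions rather than any particular WE package's API:

```python
import random

def resample_bin(walkers, target, seed=0):
    """WE-style resampling within one bin: split the heaviest walkers and
    merge the lightest until exactly `target` remain, conserving total weight.
    `walkers` is a list of (state, weight) pairs."""
    rng = random.Random(seed)
    walkers = list(walkers)
    while len(walkers) < target:           # split: heaviest walker -> two halves
        s, w = max(walkers, key=lambda p: p[1])
        walkers.remove((s, w))
        walkers += [(s, w / 2.0), (s, w / 2.0)]
    while len(walkers) > target:           # merge: combine the two lightest
        walkers.sort(key=lambda p: p[1])
        (s1, w1), (s2, w2) = walkers[0], walkers[1]
        keep = s1 if rng.random() < w1 / (w1 + w2) else s2
        walkers = [(keep, w1 + w2)] + walkers[2:]
    return walkers

# Three walkers resampled to five: weight is conserved exactly.
out = resample_bin([('a', 0.5), ('b', 0.25), ('c', 0.25)], 5)
```

    Because merges select a survivor with probability proportional to weight, ensemble averages over walkers remain unbiased, which is what lets WE recover rate constants directly.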

  18. Adaptive LES Methodology for Turbulent Flow Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Oleg V. Vasilyev


    turbulence have recently been completed at the Japanese Earth Simulator (Yokokawa et al. 2002, Kaneda et al. 2003) using a resolution of 4096{sup 3} (approximately 10{sup 11}) grid points with a Taylor-scale Reynolds number of 1217 (Re {approx} 10{sup 6}). Impressive as these calculations are, performed on one of the world's fastest supercomputers, far more brute computational power would be needed to simulate the flow over the fuselage of a commercial aircraft at cruising speed. Such a calculation would require on the order of 10{sup 16} grid points at a Reynolds number in the range of 10{sup 8}, and would take several thousand years to simulate one minute of flight time on today's fastest supercomputers (Moin & Kim 1997). Even using state-of-the-art zonal approaches, which allow DNS calculations that resolve the necessary range of scales within predefined 'zones' in the flow domain, this calculation would take far too long for the result to be of engineering interest when it is finally obtained. Since computing power, memory, and time are all scarce resources, the problem of simulating turbulent flows becomes one of abstracting or simplifying the complexity of the physics represented in the full Navier-Stokes (NS) equations in such a way that the 'important' physics of the problem is captured at a lower cost. To do this, a portion of the modes of the turbulent flow field needs to be approximated by a low-order model that is cheaper than the full NS calculation. This model can then be used along with a numerical simulation of the 'important' modes of the problem that cannot be well represented by the model. The decision of what part of the physics to model, and what kind of model to use, has to be based on which physical properties are considered 'important' for the problem. It should be noted that 'nothing is free', so any use of a low-order model will by definition lose some information about the

  19. Axiomatic safety-critical assessment process (ASCAP) simulation methodology

    Energy Technology Data Exchange (ETDEWEB)

    Kaufman, L.M.; Giras, T.C. [Virginia Univ., Charlottesville, VA (United States). Center for Safety-Critical Systems


    The ASCAP simulation methodology models continuous train movement using a hybrid simulation consisting of both time and event driven portions. The actual train movement is predicated upon the solution of a series of discrete equations to determine its velocity, acceleration and braking profile at a specific time in tandem with the behavior of the train crew and the dispatcher agents in relationship to the behavioral states of the various stationary and mobile objects that a given train encounters. From this hybrid simulation, it is possible to generate accident scenarios. By performing multiple experiments to collect data for various accident scenarios, it will be possible to statistically quantify the risk associated with the various accident scenarios that are identified. (orig.)
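
    The hybrid time/event-driven idea can be illustrated with a toy model (not the ASCAP code): motion is advanced by discrete time steps, while a discrete state change (braking when a red signal is encountered) is triggered by the train's position. All parameter names and values here are invented for the example.

    ```python
    def simulate_train(v0, a_max, brake_decel, signal_pos, dt=0.1, t_max=600.0):
        """Toy hybrid simulation: time-driven integration of train motion
        plus one event-driven state transition (braking at a red signal)."""
        x, v, t = 0.0, v0, 0.0
        braking = False
        while t < t_max:
            # Event-driven part: behavioral state changes when the train
            # reaches the signal.
            if not braking and x >= signal_pos:
                braking = True
            # Time-driven part: discrete update of velocity and position.
            a = -brake_decel if braking else a_max
            v = max(0.0, v + a * dt)
            x += v * dt
            t += dt
            if braking and v == 0.0:
                break
        return x, v, t
    ```

    Running many such simulations with randomized crew and dispatcher behavior is what would allow accident scenarios to be quantified statistically, as the abstract describes.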

  20. Simulation Methodology in Nursing Education and Adult Learning Theory (United States)

    Rutherford-Hemming, Tonya


    Simulation is often used in nursing education as a teaching methodology. Simulation is rooted in adult learning theory. Three learning theories, cognitive, social, and constructivist, explain how learners gain knowledge with simulation experiences. This article takes an in-depth look at each of these three theories as each relates to simulation.…

  1. The SIMRAND methodology - Simulation of Research and Development Projects (United States)

    Miles, R. F., Jr.


    In research and development projects, a commonly occurring management decision is concerned with the optimum allocation of resources to achieve the project goals. Because of resource constraints, management has to make a decision regarding the set of proposed systems or tasks which should be undertaken. SIMRAND (Simulation of Research and Development Projects) is a methodology which was developed for aiding management in this decision. Attention is given to a problem description, aspects of model formulation, the reduction phase of the model solution, the simulation phase, and the evaluation phase. The implementation of the considered approach is illustrated with the aid of an example which involves a simplified network of the type used to determine the price of silicon solar cells.
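
    The simulation phase of this kind of decision analysis can be caricatured in a few lines: sample the uncertain outcome of each candidate set of tasks many times, then rank the alternatives by their estimated expectation. The task data and function name below are hypothetical; SIMRAND itself worked with utility functions over networks of alternatives.

    ```python
    import random

    def simrand_choose(alternatives, n_trials=10000, seed=1):
        """SIMRAND-style evaluation sketch: each alternative is a list of
        tasks, each task a (low, high) range of uncertain cost. Monte Carlo
        sampling estimates the expected total cost of each alternative and
        the cheapest one on average is selected."""
        rng = random.Random(seed)
        expected = {}
        for name, tasks in alternatives.items():
            total = 0.0
            for _ in range(n_trials):
                total += sum(rng.uniform(lo, hi) for lo, hi in tasks)
            expected[name] = total / n_trials
        return min(expected, key=expected.get), expected
    ```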

  2. Evolutionary-Simulative Methodology in the Management of Social and Economic Systems

    Directory of Open Access Journals (Sweden)

    Konyavskiy V.A.


    Full Text Available The article outlines the main provisions of the evolutionary-simulative methodology (ESM), which is a methodology for the mathematical modeling of equilibrium random processes (CPR) widely used in the economy. It discusses the main directions for applying ESM to problems of managing social and economic systems.

  3. A computer simulator for development of engineering system design methodologies (United States)

    Padula, S. L.; Sobieszczanski-Sobieski, J.


    A computer program designed to simulate and improve engineering system design methodology is described. The simulator mimics the qualitative behavior and data couplings occurring among the subsystems of a complex engineering system. It eliminates the engineering analyses in the subsystems by replacing them with judiciously chosen analytical functions. With the cost of analysis eliminated, the simulator is used for experimentation with a large variety of candidate algorithms for multilevel design optimization to choose the best ones for the actual application. Thus, the simulator serves as a development tool for multilevel design optimization strategy. The simulator concept, implementation, and status are described and illustrated with examples.

  4. A computer simulation methodology of metal perforation

    NARCIS (Netherlands)

    Roos, A; de Hosson, J.T.M.; Cleveringa, H.H.M.; van der Giessen, E.; Huetink, J; Baaijens, FPT


    In this paper, shear deformation at high strain rates is modelled with the framework of discrete dislocation plasticity. The total stress state is split into the stress state due to the dislocations in an infinite medium, and the complementary image stress, which is enforced by a finite element

  5. EMC Simulation and Modeling (United States)

    Takahashi, Takehiro; Schibuya, Noboru

    EMC simulation is now widely used in the design stage of electronic equipment to reduce electromagnetic noise. As the electromagnetic behavior calculated by an EMC simulator depends on the EMC model of the equipment given as input, the modeling technique is important for obtaining useful results. In this paper, a brief outline of the EMC simulator and the EMC model is given. Some modeling techniques for EMC simulation are also described, with the example of an EMC model of a shielded box with an aperture.

  6. Simulation modeling and analysis with Arena

    CERN Document Server

    Altiok, Tayfur


    Simulation Modeling and Analysis with Arena is a highly readable textbook which treats the essentials of the Monte Carlo discrete-event simulation methodology, and does so in the context of the popular Arena simulation environment. It treats simulation modeling as an in-vitro laboratory that facilitates the understanding of complex systems and experimentation with what-if scenarios in order to estimate their performance metrics. The book contains chapters on the simulation modeling methodology and the underpinnings of discrete-event systems, as well as the relevant underlying probability, statistics, stochastic processes, input analysis, model validation and output analysis. All simulation-related concepts are illustrated in numerous Arena examples, encompassing production lines, manufacturing and inventory systems, transportation systems, and computer information systems in networked settings. Introduces the concept of discrete-event Monte Carlo simulation, the most commonly used methodology for modeli...

  7. Wind Farm LES Simulations Using an Overset Methodology (United States)

    Ananthan, Shreyas; Yellapantula, Shashank


    Accurate simulation of wind farm wakes under realistic atmospheric inflow conditions and complex terrain requires modeling a wide range of length and time scales. The computational domain can span several kilometers while requiring mesh resolutions of O(10^-6) to adequately resolve the boundary layer on the blade surface. Overset mesh methodology offers an attractive option to address this disparate range of length scales; it allows embedding body-conforming meshes around turbine geometries within nested wake-capturing meshes of the varying resolutions necessary to accurately model the inflow turbulence and the resulting wake structures. Dynamic overset hole-cutting algorithms permit the relative mesh motion that allows this nested mesh structure to track unsteady inflow direction changes, turbine control changes (yaw and pitch), and wake propagation. An LES model with overset mesh for localized mesh refinement is used to analyze wind farm wakes and performance, and is compared with local mesh refinement using non-conformal (hanging-node) unstructured meshes. Turbine structures will be modeled using both actuator-line approaches and fully resolved structures to test the efficacy of overset methods for wind farm applications. Exascale Computing Project (ECP), Project Number: 17-SC-20-SC, a collaborative effort of two DOE organizations - the Office of Science and the National Nuclear Security Administration.

  8. Knowledge Support of Simulation Model Reuse

    Directory of Open Access Journals (Sweden)

    M. Valášek


    Full Text Available This paper describes the knowledge support for engineering design based on virtual modelling and simulation, presenting results of the EC Clockwork project. A typical and important step in the development of a simulation model is the phase of reuse: virtual modelling and simulation often use components of previous models. The usual problem is that the only remaining part of a previous simulation model is the model itself, although a large amount of knowledge and many intermediate models have been used, developed and then lost. A special methodology and special tools have therefore been developed in support of storing, retrieving and reusing the knowledge from previous simulation models. The knowledge support includes informal knowledge, formal knowledge and intermediate engineering models. This paper describes the overall methodology and tools, using the example of developing a simulation model of Trijoint, a new machine tool.

  9. Methodology for simulation of geomagnetically induced currents in power systems

    Directory of Open Access Journals (Sweden)

    Boteler David


    Full Text Available To assess the geomagnetic hazard to power systems it is useful to be able to simulate the geomagnetically induced currents (GIC that are produced during major geomagnetic disturbances. This paper examines the methodology used in power system analysis and shows how it can be applied to modelling GIC. Electric fields in the area of the power network are used to determine the voltage sources or equivalent current sources in the transmission lines. The power network can be described by a mesh impedance matrix which is combined with the voltage sources to calculate the GIC in each loop. Alternatively the power network can be described by a nodal admittance matrix which is combined with the sum of current sources into each node to calculate the nodal voltages which are then used to calculate the GIC in the transmission lines and GIC flowing to ground at each substation. Practical calculations can be made by superposition of results calculated separately for northward and eastward electric fields. This can be done using magnetic data from a single observatory to calculate an electric field that is a uniform approximation of the field over the area of the power system. It is also shown how the superposition of results can be extended to use data from two observatories: approximating the electric field by a linear variation between the two observatory locations. These calculations provide an efficient method for simulating the GIC that would be produced by historically significant geomagnetic storm events.
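
    The nodal admittance formulation described above can be sketched in a few lines of Python/NumPy. This is an illustrative toy network, not the paper's code; the function name, data layout, and the uniform-field assumption are choices made for the example.

    ```python
    import numpy as np

    def gic_nodal(n_nodes, lines, ground_R, E):
        """Nodal-admittance GIC sketch for a small illustrative network.

        lines:    (from_node, to_node, resistance_ohm, L_north_km, L_east_km)
        ground_R: substation grounding resistance at each node (ohm)
        E:        (E_north, E_east) uniform geoelectric field (V/km)

        Each transmission line carries a series voltage source E . L;
        converting the sources to equivalent current injections and solving
        Y v = j gives nodal voltages, line GIC, and GIC to ground.
        """
        Y = np.zeros((n_nodes, n_nodes))
        j = np.zeros(n_nodes)
        for a, b, R, Ln, Le in lines:
            emf = E[0] * Ln + E[1] * Le        # induced series voltage (V)
            g = 1.0 / R
            Y[a, a] += g; Y[b, b] += g         # network Laplacian
            Y[a, b] -= g; Y[b, a] -= g
            j[a] -= emf * g                    # equivalent current sources
            j[b] += emf * g
        Y += np.diag(1.0 / np.asarray(ground_R))  # grounding admittances
        v = np.linalg.solve(Y, j)                 # nodal voltages
        gic_lines = [(v[a] - v[b] + E[0] * Ln + E[1] * Le) / R
                     for a, b, R, Ln, Le in lines]
        gic_ground = v / np.asarray(ground_R)     # current into earth
        return gic_lines, gic_ground
    ```

    The superposition described in the abstract follows directly: solving once for E = (1, 0) and once for E = (0, 1) and scaling by the measured field components gives the GIC for any uniform field, since the system is linear in E.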

  10. Energy Performance Contracting Methodology Based upon Simulation and Measurement


    Ligier, Simon; Robillart, Maxime; Schalbart, Patrick; Peuportier, Bruno


    International audience; Discrepancies between ex-ante energy performance assessment and actual consumption of buildings hinder the development of energy performance contracting (EPC). To address this issue, uncertainty integration in simulation as well as measurement and verification (M&V) strategies have been studied. In this article, we propose a methodology, combining detailed energy performance simulation and M&V anticipation. Statistical studies using Monte-Carlo analysis allow a guarant...

  11. Multiscale modelling methodology for virtual prototyping of effervescent tablets. (United States)

    Stepanek, Frantisek; Loo, Adeline; Lim, Tiong Shing


    A computational methodology for virtual tablet prototyping has been developed. The methodology consists of two steps: construction of virtual particle compacts of varying composition using the discrete element method (DEM), and simulation of their dissolution by eroding individual components from the tablet microstructure according to their intrinsic dissolution rates, using the volume-of-fluid (VOF) method. The effective erosion rate obtained from simulations at the particle-assembly length scale is used for the calculation of dissolution time at the tablet length scale. The methodology is demonstrated on a simple effervescent formulation consisting of a sodium bicarbonate-acetic acid effervescent system, sodium chloride as a model active substance, and a lactose filler. The experimentally measured dependence of tablet dissolution time on composition and compaction force was found to be in very good agreement with outputs from computer simulations.

  12. Subspace electrode selection methodology for the reduction of the effect of uncertain conductivity values in the EEG dipole localization: a simulation study using a patient-specific head model (United States)

    Crevecoeur, G.; Montes Restrepo, V.; Staelens, S.


    The simulation of the electroencephalogram (EEG) using a realistic head model needs the correct conductivity values of several tissues. However, these values are not precisely known and have an influence on the accuracy of the EEG source analysis problem. This paper presents a novel numerical methodology for the increase of accuracy of the EEG dipole source localization problem. The presented subspace electrode selection (SES) methodology is able to limit the effect of uncertain conductivity values on the solution of the EEG inverse problem, yielding improved source localization accuracy. We redefine the traditional cost function associated with the EEG inverse problem and introduce a selection procedure of EEG potentials. In each iteration of the presented EEG cost function minimization procedure, potentials are selected that are affected as little as possible by the uncertain conductivity value. Using simulation data, the proposed SES methodology is able to enhance the neural source localization accuracy dependent on the dipole location, the assumed versus actual conductivity and the possible noise in measurements.

  13. Simulation modeling and arena

    CERN Document Server

    Rossetti, Manuel D


    Emphasizes a hands-on approach to learning statistical analysis and model building through the use of comprehensive examples, problems sets, and software applications With a unique blend of theory and applications, Simulation Modeling and Arena®, Second Edition integrates coverage of statistical analysis and model building to emphasize the importance of both topics in simulation. Featuring introductory coverage on how simulation works and why it matters, the Second Edition expands coverage on static simulation and the applications of spreadsheets to perform simulation. The new edition als

  14. Theories, Models and Methodology in Writing Research

    NARCIS (Netherlands)

    Rijlaarsdam, Gert; Bergh, van den Huub; Couzijn, Michel


    Theories, Models and Methodology in Writing Research describes the current state of the art in research on written text production. The chapters in the first part offer contributions to the creation of new theories and models for writing processes. The second part examines specific elements of the

  15. An automated methodology development. [software design for combat simulation (United States)

    Hawley, L. R.


    The design methodology employed in testing the applicability of Ada in large-scale combat simulations is described. Ada was considered as a substitute for FORTRAN to lower life cycle costs and ease the program development efforts. An object-oriented approach was taken, which featured definitions of military targets, the capability of manipulating their condition in real-time, and one-to-one correlation between the object states and real world states. The simulation design process was automated by the problem statement language (PSL)/problem statement analyzer (PSA). The PSL/PSA system accessed the problem data base directly to enhance the code efficiency by, e.g., eliminating non-used subroutines, and provided for automated report generation, besides allowing for functional and interface descriptions. The ways in which the methodology satisfied the responsiveness, reliability, transportability, modifiability, timeliness and efficiency goals are discussed.

  16. Validation of simulation models

    DEFF Research Database (Denmark)

    Rehman, Muniza; Pedersen, Stig Andur


    In philosophy of science, the interest for computational models and simulations has increased heavily during the past decades. Different positions regarding the validity of models have emerged but the views have not succeeded in capturing the diversity of validation methods. The wide variety...... of models has been somewhat narrow-minded reducing the notion of validation to establishment of truth. This article puts forward the diversity in applications of simulation models that demands a corresponding diversity in the notion of validation....

  17. A methodology for the rigorous verification of Particle-in-Cell simulations (United States)

    Riva, Fabio; Beadle, Carrie F.; Ricci, Paolo


    A methodology to perform a rigorous verification of Particle-in-Cell (PIC) simulations is presented, both for assessing the correct implementation of the model equations (code verification) and for evaluating the numerical uncertainty affecting the simulation results (solution verification). The proposed code verification methodology is a generalization of the procedure developed for plasma simulation codes based on finite difference schemes that was described by Riva et al. [Phys. Plasmas 21, 062301 (2014)] and consists of an order-of-accuracy test using the method of manufactured solutions. The generalization of the methodology for PIC codes consists of accounting for numerical schemes intrinsically affected by statistical noise and providing a suitable measure of the distance between continuous, analytical distribution functions and finite samples of computational particles. The solution verification consists of quantifying both the statistical and discretization uncertainties. The statistical uncertainty is estimated by repeating the simulation with different pseudorandom number generator seeds. For the discretization uncertainty, the Richardson extrapolation is used to provide an approximation of the analytical solution and the grid convergence index is used as an estimate of the relative discretization uncertainty. The code verification methodology is successfully applied to a PIC code that numerically solves the one-dimensional, electrostatic, collisionless Vlasov-Poisson system. The solution verification methodology is applied to quantify the numerical uncertainty affecting the two-stream instability growth rate, which is numerically evaluated thanks to a PIC simulation.
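
    The solution-verification recipe (Richardson extrapolation plus grid convergence index) can be sketched as follows, assuming three solutions on grids refined by a constant ratio r. The helper name is invented, and the safety factor Fs = 1.25 is the value commonly quoted for three-grid studies.

    ```python
    import math

    def richardson_gci(f_coarse, f_medium, f_fine, r, p=None, Fs=1.25):
        """Richardson extrapolation and grid convergence index (GCI) from
        three solutions on systematically refined grids (a sketch of the
        standard procedure, not the paper's code)."""
        if p is None:
            # Observed order of accuracy from the three solutions.
            p = math.log(abs((f_coarse - f_medium) / (f_medium - f_fine))) / math.log(r)
        # Richardson estimate of the zero-spacing ("exact") solution.
        f_exact = f_fine + (f_fine - f_medium) / (r**p - 1)
        # Relative discretization uncertainty of the fine-grid solution.
        gci_fine = Fs * abs((f_fine - f_medium) / f_fine) / (r**p - 1)
        return f_exact, p, gci_fine
    ```

    For a PIC code the statistical uncertainty would be estimated separately, e.g. from repeated runs with different pseudorandom seeds, and combined with the discretization estimate above.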

  18. Axi-symmetric simulation of a two phase vertical thermosyphon using Eulerian two-fluid methodology (United States)

    Kafeel, Khurram; Turan, Ali


    Numerical simulation of the steady-state operation of a vertical two-phase closed thermosyphon is performed using the two-fluid methodology within an Eulerian multiphase domain. A full-scale axi-symmetric model is developed for computational fluid dynamics simulation of the thermosyphon using ANSYS/FLUENT 13.0. The effects of evaporation, condensation and interfacial heat and mass transfer are taken into account within the whole domain. A cooling water jacket is also modelled along with the wall of the thermosyphon to simulate the effect of conjugate heat transfer between the wall and the fluid phase. The results obtained are presented and compared with available experimental investigations for a similar thermosyphon. It is established that the two-fluid methodology can be used effectively for simulating a two-phase system such as a typical thermosyphon.

  19. Technical progress report for application of numerical simulation methodology to automotive combustion

    Energy Technology Data Exchange (ETDEWEB)



    The third quarterly technical progress report is presented for DOE Contract No. DE-AC-03-79-ET15397.001 entitled, Application of Numerical Simulation Methodology to Automotive Combustion. Work during the period has concentrated on completing the model development and validation for in-cylinder fluid dynamics via: simulation (and data comparison) for piston induced vortex roll-up at high Reynolds number; simulation (and data comparison) for the decay of swirl in the Sandia DISC engine; and definition of compression cycle parametric cycle simulations. The results of these studies are described.

  20. A methodology for 2D cutting process simulation of solid end mill (United States)

    Skrypka, Kateryna; Pittala, Gaetano


    The FEM simulation of end milling is complex, due to the three-dimensional tool geometry. In this paper, 2D FEM simulation, performed with Thirdwave AdvantEdge v7.0, was used to set up the material model. A machining test was designed to compensate for the helix angle of the tool, so that the measured cutting forces are effectively two-dimensional. A mechanistic model was developed to simulate, together with the 2D FEM, the cutting forces in the milling operation. A comparison with experimental cutting forces has been performed in order to validate the methodology.

  1. Simulation in Complex Modelling

    DEFF Research Database (Denmark)

    Nicholas, Paul; Ramsgaard Thomsen, Mette; Tamke, Martin


    This paper will discuss the role of simulation in extended architectural design modelling. As a framing paper, the aim is to present and discuss the role of integrated design simulation and feedback between design and simulation in a series of projects under the Complex Modelling framework. Complex...... performance, engage with high degrees of interdependency and allow the emergence of design agency and feedback between the multiple scales of architectural construction. This paper presents examples for integrated design simulation from a series of projects including Lace Wall, A Bridge Too Far and Inflated...... Restraint developed for the research exhibition Complex Modelling, Meldahls Smedie Gallery, Copenhagen in 2016. Where the direct project aims and outcomes have been reported elsewhere, the aim for this paper is to discuss overarching strategies for working with design integrated simulation....

  2. Scientific Modeling and simulations

    CERN Document Server

    Diaz de la Rubia, Tomás


    Showcases the conceptual advantages of modeling which, coupled with the unprecedented computing power available through simulations, allows scientists to tackle the formidable problems of our society, such as the search for hydrocarbons, understanding the structure of a virus, or the intersection between simulations and real data in extreme environments.

  3. Electric and plug-in hybrid vehicles advanced simulation methodologies

    CERN Document Server

    Varga, Bogdan Ovidiu; Moldovanu, Dan; Iclodean, Calin


    This book is designed as an interdisciplinary platform for specialists working in electric and plug-in hybrid electric vehicles powertrain design and development, and for scientists who want to get access to information related to electric and hybrid vehicle energy management, efficiency and control. The book presents the methodology of simulation that allows the specialist to evaluate electric and hybrid vehicle powertrain energy flow, efficiency, range and consumption. The mathematics behind each electric and hybrid vehicle component is explained and for each specific vehicle the powertrain

  4. Computer Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Pronskikh, V. S. [Fermilab


    Verification and validation of computer codes and models used in simulation are two aspects of scientific practice of high importance that have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model's relation to the real world and its intended use. It has been argued that, because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them) needs to be made. Holding on to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviate the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.

  5. Automated Simulation Model Generation

    NARCIS (Netherlands)

    Huang, Y.


    One of today's challenges in the field of modeling and simulation is to model increasingly larger and more complex systems. Complex models take long to develop and incur high costs. With the advances in data collection technologies and more popular use of computer-aided systems, more data has become

  6. BSM2 Plant-Wide Model construction and comparative analysis with other methodologies for integrated modelling. (United States)

    Grau, P; Vanrolleghem, P; Ayesa, E


    In this paper, a new methodology for integrated modelling of the WWTP has been used for the construction of the Benchmark Simulation Model No. 2 (BSM2). The transformations approach proposed in this methodology does not require the development of specific transformers to interface unit-process models, and it allows the construction of tailored models for a particular WWTP while guaranteeing mass and charge continuity for the whole model. The BSM2 PWM, constructed as a case study, is evaluated by means of simulations under different scenarios, and its validity in reproducing the water and sludge lines of a WWTP is demonstrated. Furthermore, the advantages that this methodology presents compared to other approaches for integrated modelling are verified in terms of flexibility and coherence.

  7. Methodology for digital radiography simulation using the Monte Carlo code MCNPX for industrial applications. (United States)

    Souza, E M; Correa, S C A; Silva, A X; Lopes, R T; Oliveira, D F


    This work presents a methodology for digital radiography simulation for industrial applications using the MCNPX radiography tally. In order to perform the simulation, the energy-dependent response of a BaFBr imaging plate detector was modeled and introduced in the MCNPX radiography tally input. In addition, a post-processing program was used to convert the MCNPX radiography tally output into 16-bit digital images. Simulated and experimental images of a steel pipe containing corrosion alveoli and stress corrosion cracking were compared, and the results showed good agreement between both images.
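
    The post-processing step (converting tally output into a 16-bit digital image) amounts to a linear rescale. A minimal sketch, assuming a simple min-max normalization rather than the authors' actual program:

    ```python
    def to_uint16(tally, out_min=0, out_max=65535):
        """Linearly rescale a 2-D array of tally values (detector response)
        into the 16-bit grayscale range used for digital radiographs."""
        lo = min(min(row) for row in tally)
        hi = max(max(row) for row in tally)
        scale = (out_max - out_min) / (hi - lo) if hi > lo else 0.0
        return [[int(round(out_min + (v - lo) * scale)) for v in row]
                for row in tally]
    ```

    In practice the mapping would also fold in the energy-dependent imaging-plate response modeled in the tally, but the rescaling to 16-bit values follows this pattern.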

  8. Monte Carlo simulation for a new high resolution elemental analysis methodology

    Energy Technology Data Exchange (ETDEWEB)

    Figueroa S, Rodolfo; Brusa, Daniel; Riveros, Alberto [Universidad de La Frontera, Temuco (Chile). Facultad de Ingenieria y Administracion


    Full text. Spectra generated by binary, ternary and multielement matrices when irradiated by a variable-energy photon beam are simulated by means of a Monte Carlo code. Significant jumps in the counting rate appear when the photon energy is just above the absorption edge associated with each element, because of the emission of characteristic X rays. For a given edge energy, the net height of these jumps depends mainly on the concentration and on the sample absorption coefficient. The spectra were obtained by a monochromatic energy scan considering all the radiation emitted by the sample into a 2{pi} solid angle, associating a single multichannel spectrometer channel with each incident energy (Multichannel Scaling (MCS) mode). The simulated spectra were produced with an adaptation of the Monte Carlo simulation package PENELOPE (Penetration and Energy Loss of Positrons and Electrons in matter). The results show that it is possible to implement a new high-resolution spectroscopy methodology, for which a synchrotron would be an ideal source, due to its high intensity and the ability to control the energy of the incident beam. The energy resolution would be determined by the monochromating system rather than by the detection system, which would basically be a photon counter. (author)

  9. A novel methodology for model-based OPC verification (United States)

    Huang, Tengyen; Liao, ChunCheng; Chou, Ryan; Liao, Hung-Yueh; Schacht, Jochen


    Model-based optical proximity correction (OPC) is an indispensable production tool enabling the successful extension of photolithography down to the sub-80nm regime. Commercial OPC software has established clear procedures to produce accurate OPC models at the best-focus condition. However, OPC models calibrated at best focus sometimes fail to prevent catastrophic circuit failure due to patterning shorts & opens caused by accidental shifts of dose/focus within the corners of the allowed process window. A novel model-based OPC verification methodology is presented in this work, which precisely pinpoints post-OPC photolithography failures in VLSI circuits through the entire lithographic process window. By applying a critical photolithography process-window model in OPC verification software, we successfully uncovered all weak points of a design prior to tape-out, eliminating the high risk of circuit opens & shorts at the extreme corners of the lithographic process window in any complex circuit layout environment. The process window-related information is usually not taken into consideration when running OPC verification procedures with models calibrated at the nominal process condition. Intensive review of the critical dimension (CD) data and top-view SEM micrographs from the weak points indicates matching between post-OPC simulation and measurements. Using a single highly accurate process-window resist model provides a reliable OPC verification methodology when used in a field- or grid-based simulation engine, ensuring manufacturability within the largest possible process window for any modern critical design.

  10. In Vivo Predictive Dissolution (IPD) and Biopharmaceutical Modeling and Simulation: Future Use of Modern Approaches and Methodologies in a Regulatory Context. (United States)

    Lennernäs, H; Lindahl, A; Van Peer, A; Ollier, C; Flanagan, T; Lionberger, R; Nordmark, A; Yamashita, S; Yu, L; Amidon, G L; Fischer, V; Sjögren, E; Zane, P; McAllister, M; Abrahamsson, B


    The overall objective of OrBiTo, a project within the Innovative Medicines Initiative (IMI), is to streamline and optimize the development of orally administered drug products through the creation and efficient application of biopharmaceutics tools. This toolkit will include both experimental and computational models developed from an improved understanding of the highly dynamic gastrointestinal (GI) physiology relevant to the GI absorption of drug products in both fasted and fed states. A part of the annual OrBiTo meeting in 2015 was dedicated to presenting the most recent progress in the regulatory use of PBPK in silico modeling and in vivo predictive dissolution (IPD) tests, and their application to biowaivers. There are still several areas for improvement of in vitro dissolution testing in terms of generating results relevant to the intraluminal conditions in the GI tract. The major opportunity probably lies in combining IPD testing with physiologically based in silico models, where the in vitro data provide input to the absorption predictions. The OrBiTo project and other current research projects include the definition of test media representative of the more distal parts of the GI tract; models capturing supersaturation and precipitation phenomena; the influence of motility waves on shear and other forces of hydrodynamic origin; the interindividual variability in composition and characteristics of GI fluids; food effects; the definition of biorelevant buffer systems; and intestinal water volumes. In conclusion, there is currently a mismatch between the extensive industrial usage of modern in vivo predictive tools and the very limited inclusion of such data in regulatory files. However, there is great interest among all stakeholders in introducing recent progress in the prediction of in vivo GI drug absorption into a regulatory context.

  11. Simulation Environment Based on the Universal Verification Methodology

    CERN Document Server



    Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The CDV flow differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan by setting the verification goals. Those goals are then targeted by a developed testbench, which generates legal stimuli and sends them to a device under test (DUT). Progress is measured by coverage monitors added to the simulation environment; in this way, non-exercised functionality can be identified. Moreover, additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC desi...

  12. Quantifying Roughness Coefficient Uncertainty in Urban Flooding Simulations through a Simplified Methodology

    Directory of Open Access Journals (Sweden)

    Vasilis Bellos


    Full Text Available A methodology is presented which can be used in the evaluation of parametric uncertainty in urban flooding simulation. Because such simulations are time consuming, the following methodology is proposed: (a) simplification of the description of the physical process; (b) derivation of a training data set; (c) development of a data-driven surrogate model; (d) use of a forward uncertainty propagation scheme. The simplification comprises the following steps: (a) unit hydrograph derivation using a 2D hydrodynamic model; (b) calculation of the losses in order to determine the effective rainfall depth; (c) flood event simulation using the principle of proportionality and superposition. The above methodology was implemented in an urban catchment located in the city of Athens, Greece. The model used for the first step of the simplification was FLOW-R2D, whereas the well-known SWMM software (US Environmental Protection Agency, Washington, DC, USA) was used for the second step. For the training data set derivation, an ensemble of 100 unit hydrographs was derived with the FLOW-R2D model. The parameters modified to produce this ensemble were the Manning coefficients in the two friction zones (residential and urban open space areas). The surrogate model used to replicate the unit hydrograph derivation, with the Manning coefficients as input, was based on the Polynomial Chaos Expansion technique. It was found that, although the uncertainties in the derived results have to be taken into account, the proposed methodology can be a fast and efficient way to cope with dynamic flood simulation in an urban catchment.
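Step (c) of the simplification above, combining proportionality and superposition of a unit hydrograph, amounts to a discrete convolution of the unit hydrograph with the effective rainfall series. A minimal sketch (the triangular unit hydrograph and rainfall values are hypothetical, not from the study):

```python
import numpy as np

def flood_hydrograph(unit_hydrograph, effective_rain):
    """Compose a flood hydrograph by proportionality and superposition:
    scale the unit hydrograph by each time step's effective rainfall depth,
    shift it in time, and sum the contributions -- i.e. a discrete
    convolution of the rainfall series with the unit hydrograph."""
    return np.convolve(effective_rain, unit_hydrograph)

# Hypothetical triangular unit hydrograph (discharge per unit rainfall depth)
uh = np.array([0.0, 1.0, 2.0, 1.0, 0.0])
# Hypothetical two-step effective rainfall series
rain = np.array([2.0, 1.0])
q = flood_hydrograph(uh, rain)   # flood hydrograph ordinates
```

Because the surrogate only has to reproduce the unit hydrograph, any rainfall event can then be simulated at negligible cost by this convolution.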

  13. Simulation model and methodology for calculating the damage by internal radiation in a PWR reactor; Modelo de simulacion y metodologia para el calculo del dano por irradiacion en los internos de un reactor PWR

    Energy Technology Data Exchange (ETDEWEB)

    Cadenas Mendicoa, A. M.; Benito Hernandez, M.; Barreira Pereira, P.


    This study involves the development of the methodology and three-dimensional models to estimate the damage to the vessel internals of a commercial PWR reactor from irradiation history of operating cycles.

  14. Analyzing the uncertainty of simulation results in accident reconstruction with Response Surface Methodology. (United States)

    Zou, Tiefang; Cai, Ming; Du, Ronghua; Liu, Jike


    This paper focuses on the uncertainty of simulation results in accident reconstruction. The Upper and Lower Bound Method (ULM) and the Finite Difference Method (FDM), which can be easily applied in this field, are introduced first; the Response Surface Methodology (RSM) is then introduced into this field as an alternative. In RSM, a sample set is first generated via uniform design; second, experiments are conducted according to the sample set with the help of simulation methods; third, a response surface model is determined through regression analysis; finally, the uncertainty of the simulation results can be analyzed using a combination of the response surface model and existing uncertainty analysis methods. How to generate a sample set, how to calculate the range of simulation results, and how to analyze parameter sensitivity in RSM are then discussed in detail. Finally, the feasibility of RSM is validated by five cases, and the applicability of RSM, ULM and FDM in analyzing the uncertainty of simulation results is studied; in the latter two cases, ULM and FDM can hardly work while RSM still can. After an analysis of these five cases and the number of simulation runs required for each method, both advantages and disadvantages of these uncertainty analysis methods are indicated. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
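The RSM workflow described above can be sketched end to end: design a sample set, run the (here toy) simulator on it, fit a response surface by regression, and propagate input uncertainty through the cheap surface. The simulator function, parameter names and ranges below are illustrative assumptions, not the paper's actual accident model, and a uniform grid stands in for the paper's uniform design:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for an expensive accident-reconstruction simulation:
# a skid-to-stop style speed estimate from two uncertain inputs
# (friction coefficient mu and skid distance d) -- purely illustrative.
def simulator(mu, d):
    return 11.0 * np.sqrt(mu * d)

# 1) Sample set over the parameter ranges (a grid stands in for the
#    paper's uniform design).
mu_s, d_s = np.linspace(0.6, 0.8, 5), np.linspace(18.0, 22.0, 5)
M, D = np.meshgrid(mu_s, d_s)
X = np.column_stack([M.ravel(), D.ravel()])
y = simulator(X[:, 0], X[:, 1])

# 2) Regression: fit a quadratic response surface y ~ basis(x) @ coef.
def basis(x):
    mu, d = x[:, 0], x[:, 1]
    return np.column_stack([np.ones_like(mu), mu, d, mu * d, mu**2, d**2])

coef, *_ = np.linalg.lstsq(basis(X), y, rcond=None)

# 3) Uncertainty propagation: Monte Carlo through the cheap surface
#    instead of re-running the simulator thousands of times.
inputs = np.column_stack([rng.normal(0.7, 0.03, 20000),
                          rng.normal(20.0, 0.5, 20000)])
pred = basis(inputs) @ coef
lo, hi = np.percentile(pred, [2.5, 97.5])   # range of the simulated output
```

The design choice RSM exploits is that the regression surface replaces the simulator in step 3, so the number of expensive runs is fixed by the sample set, not by the Monte Carlo sample size.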

  15. PSH Transient Simulation Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Muljadi, Eduard [National Renewable Energy Laboratory (NREL), Golden, CO (United States)


    PSH Transient Simulation Modeling presentation from the WPTO FY14 - FY16 Peer Review. Transient effects are an important consideration when designing a PSH system, yet numerical techniques for hydraulic transient analysis still need improvements for adjustable-speed (AS) reversible pump-turbine applications.

  16. Climate Model Ensemble Methodology: Rationale and Challenges (United States)

    Vezer, M. A.; Myrvold, W.


    A tractable model of the Earth's atmosphere, or, indeed, any large, complex system, is inevitably unrealistic in a variety of ways. This will have an effect on the model's output. Nonetheless, we want to be able to rely on certain features of the model's output in studies aiming to detect, attribute, and project climate change. For this, we need assurance that these features reflect the target system, and are not artifacts of the unrealistic assumptions that go into the model. One technique for overcoming these limitations is to study ensembles of models which employ different simplifying assumptions and different methods of modelling. One then either takes as reliable certain outputs on which models in the ensemble agree, or takes the average of these outputs as the best estimate. Since the Intergovernmental Panel on Climate Change's Fourth Assessment Report (IPCC AR4), modellers have aimed to improve ensemble analysis by developing techniques to account for dependencies among models, and to ascribe unequal weights to models according to their performance. The goal of this paper is to present as clearly and cogently as possible the rationale for climate model ensemble methodology, the motivation of modellers to account for model dependencies, and their efforts to ascribe unequal weights to models. The method of our analysis is as follows. We consider a simpler, well-understood case: taking the mean of a number of measurements of some quantity. Contrary to what is sometimes said, it is not a requirement of this practice that the errors of the component measurements be independent; one must, however, compensate for any lack of independence. We also extend the usual accounts to include cases of unknown systematic error. We draw parallels between this simpler illustration and the more complex example of climate model ensembles, detailing how ensembles can provide more useful information than any of their constituent models. This account emphasizes the
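The measurement analogy above, compensating for non-independent errors when averaging, can be made concrete with best-linear-unbiased (inverse-covariance) weights; the covariance values are a made-up illustration, not climate data:

```python
import numpy as np

def blue_weights(cov):
    """Best linear unbiased weights for averaging estimates whose errors
    have covariance matrix `cov`: w proportional to inv(cov) @ 1,
    normalized to sum to one. With independent, equal-variance errors this
    reduces to the plain mean; correlated members get down-weighted."""
    inv = np.linalg.inv(cov)
    w = inv @ np.ones(len(cov))
    return w / w.sum()

# Three hypothetical estimates of one quantity; members 0 and 1 share
# simplifying assumptions, so their errors are strongly correlated.
cov = np.array([[1.0, 0.8, 0.0],
                [0.8, 1.0, 0.0],
                [0.0, 0.0, 1.0]])
w = blue_weights(cov)   # the independent member 2 receives the most weight
```

The two dependent members jointly carry little more information than one member, which is exactly the compensation for lack of independence the abstract argues for.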

  17. A methodology for acquiring qualitative knowledge for probabilistic graphical models

    DEFF Research Database (Denmark)

    Kjærulff, Uffe Bro; Madsen, Anders L.


    We present a practical and general methodology that simplifies the task of acquiring and formulating qualitative knowledge for constructing probabilistic graphical models (PGMs). The methodology efficiently captures and communicates expert knowledge, and has significantly eased the model development process.

  18. A methodology for modeling regional terrorism risk. (United States)

    Chatterjee, Samrat; Abkowitz, Mark D


    Over the past decade, terrorism risk has become a prominent consideration in protecting the well-being of individuals and organizations. More recently, there has been interest in not only quantifying terrorism risk, but also placing it in the context of an all-hazards environment in which consideration is given to accidents and natural hazards, as well as intentional acts. This article discusses the development of a regional terrorism risk assessment model designed for this purpose. The approach taken is to model terrorism risk as a dependent variable, expressed in expected annual monetary terms, as a function of attributes of population concentration and critical infrastructure. This allows for an assessment of regional terrorism risk in and of itself, as well as in relation to man-made accident and natural hazard risks, so that mitigation resources can be allocated in an effective manner. The adopted methodology incorporates elements of two terrorism risk modeling approaches (event-based models and risk indicators), producing results that can be utilized at various jurisdictional levels. The validity, strengths, and limitations of the model are discussed in the context of a case study application within the United States. © 2011 Society for Risk Analysis.

  19. A methodology for modeling barrier island storm-impact scenarios (United States)

    Mickey, Rangley C.; Long, Joseph W.; Plant, Nathaniel G.; Thompson, David M.; Dalyander, P. Soupy


    A methodology for developing a representative set of storm scenarios based on historical wave buoy and tide gauge data for a region at the Chandeleur Islands, Louisiana, was developed by the U.S. Geological Survey. The total water level was calculated for a 10-year period and analyzed against existing topographic data to identify when storm-induced wave action would affect island morphology. These events were categorized on the basis of the threshold of total water level and duration to create a set of storm scenarios that were simulated, using a high-fidelity, process-based, morphologic evolution model, on an idealized digital elevation model of the Chandeleur Islands. The simulated morphological changes resulting from these scenarios provide a range of impacts that can help coastal managers determine resiliency of proposed or existing coastal structures and identify vulnerable areas within those structures.

  20. Methodology, models and algorithms in thermographic diagnostics

    CERN Document Server

    Živčák, Jozef; Madarász, Ladislav; Rudas, Imre J


    This book presents the methodology and techniques of thermographic applications, focusing primarily on medical thermography implemented for parametrizing the diagnostics of the human body. The first part of the book describes the basics of infrared thermography, the possibilities of thermographic diagnostics and the physical nature of thermography. The second half includes tools of intelligent engineering applied to the solving of selected applications and projects. Thermographic diagnostics was applied to the problems of paraplegia, tetraplegia and carpal tunnel syndrome (CTS). The results of the research activities were created with the cooperation of four projects within the Ministry of Education, Science, Research and Sport of the Slovak Republic entitled Digital control of complex systems with two degrees of freedom, Progressive methods of education in the area of control and modeling of complex object oriented systems on aircraft turbocompressor engines, Center for research of control of te...

  1. Methodological Developments in Geophysical Assimilation Modeling (United States)

    Christakos, George


    This work presents recent methodological developments in geophysical assimilation research. We revisit the meaning of the term "solution" of a mathematical model representing a geophysical system, and we examine its operational formulations. We argue that an assimilation solution based on epistemic cognition (which assumes that the model describes incomplete knowledge about nature and focuses on conceptual mechanisms of scientific thinking) could lead to more realistic representations of the geophysical situation than a conventional ontologic assimilation solution (which assumes that the model describes nature as is and focuses on form manipulations). Conceptually, the two approaches are fundamentally different. Unlike the reasoning structure of conventional assimilation modeling that is based mainly on ad hoc technical schemes, the epistemic cognition approach is based on teleologic criteria and stochastic adaptation principles. In this way some key ideas are introduced that could open new areas of geophysical assimilation to detailed understanding in an integrated manner. A knowledge synthesis framework can provide the rational means for assimilating a variety of knowledge bases (general and site specific) that are relevant to the geophysical system of interest. Epistemic cognition-based assimilation techniques can produce a realistic representation of the geophysical system, provide a rigorous assessment of the uncertainty sources, and generate informative predictions across space-time. The mathematics of epistemic assimilation involves a powerful and versatile spatiotemporal random field theory that imposes no restriction on the shape of the probability distributions or the form of the predictors (non-Gaussian distributions, multiple-point statistics, and nonlinear models are automatically incorporated) and accounts rigorously for the uncertainty features of the geophysical system. In the epistemic cognition context the assimilation concept may be used to

  2. Model evaluation methodology applicable to environmental assessment models

    Energy Technology Data Exchange (ETDEWEB)

    Shaeffer, D.L.


    A model evaluation methodology is presented to provide a systematic framework within which the adequacy of environmental assessment models might be examined. The necessity for such a tool is motivated by the widespread use of models for predicting the environmental consequences of various human activities and by the reliance on these model predictions for deciding whether a particular activity requires the deployment of costly control measures. Consequently, the uncertainty associated with prediction must be established for the use of such models. The methodology presented here consists of six major tasks: model examination, algorithm examination, data evaluation, sensitivity analyses, validation studies, and code comparison. This methodology is presented in the form of a flowchart to show the logical interrelatedness of the various tasks. Emphasis has been placed on identifying those parameters which are most important in determining the predictive outputs of a model. Importance has been attached to the process of collecting quality data. A method has been developed for analyzing multiplicative chain models when the input parameters are statistically independent and lognormally distributed. Latin hypercube sampling has been offered as a promising candidate for doing sensitivity analyses. Several different ways of viewing the validity of a model have been presented. Criteria are presented for selecting models for environmental assessment purposes.
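The analysis of multiplicative chain models with independent lognormal inputs mentioned above has a closed form: the logarithm of the product is a sum of independent normals, so the output is again lognormal. A small sketch (the three-factor parameters are hypothetical, not from the report):

```python
import math

def chain_lognormal(ln_params):
    """Distribution of Y = X1*X2*...*Xn for independent lognormal inputs:
    ln(Y) is normal with the summed means and variances of the ln(Xi),
    so Y is again lognormal and no sampling is needed."""
    mu = sum(m for m, _ in ln_params)
    sigma = math.sqrt(sum(s * s for _, s in ln_params))
    return mu, sigma

# Hypothetical three-factor transfer chain: (mean, sd) of ln(Xi)
mu, sigma = chain_lognormal([(0.0, 0.3), (1.0, 0.4), (-0.5, 0.2)])
median = math.exp(mu)       # geometric mean of Y
gsd = math.exp(sigma)       # geometric standard deviation of Y
```

For model forms that do not reduce analytically like this, the sensitivity-analysis route via Latin hypercube sampling mentioned in the abstract is the usual fallback.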

  3. Modeling, methodologies and tools for molecular and nano-scale communications

    CERN Document Server

    Nakano, Tadashi; Moore, Michael


    (Preliminary) The book presents the state of the art in the emerging field of molecular and nanoscale communication. It gives special attention to fundamental models, and advanced methodologies and tools used in the field. It covers a wide range of applications, e.g. nanomedicine, nanorobot communication, bioremediation and environmental management. It addresses advanced graduate students, academics and professionals working at the forefront of their fields and at the interfaces between different areas of research, such as engineering, computer science, biology and nanotechnology.

  4. A generalized methodology to characterize composite materials for pyrolysis models (United States)

    McKinnon, Mark B.

    The predictive capabilities of computational fire models have improved in recent years such that models have become an integral part of many research efforts. Models improve the understanding of the fire risk of materials and may decrease the number of expensive experiments required to assess the fire hazard of a specific material or designed space. A critical component of a predictive fire model is the pyrolysis sub-model that provides a mathematical representation of the rate of gaseous fuel production from condensed phase fuels given a heat flux incident to the material surface. The modern, comprehensive pyrolysis sub-models that are common today require the definition of many model parameters to accurately represent the physical description of materials that are ubiquitous in the built environment. Coupled with the increase in the number of parameters required to accurately represent the pyrolysis of materials is the increasing prevalence in the built environment of engineered composite materials that have never been measured or modeled. The motivation behind this project is to develop a systematic, generalized methodology to determine the requisite parameters to generate pyrolysis models with predictive capabilities for layered composite materials that are common in industrial and commercial applications. This methodology has been applied to four common composites in this work that exhibit a range of material structures and component materials. The methodology utilizes a multi-scale experimental approach in which each test is designed to isolate and determine a specific subset of the parameters required to define a material in the model. Data collected in simultaneous thermogravimetry and differential scanning calorimetry experiments were analyzed to determine the reaction kinetics, thermodynamic properties, and energetics of decomposition for each component of the composite. Data collected in microscale combustion calorimetry experiments were analyzed to

  5. Methodology for Modeling and Analysis of Business Processes (MMABP)

    Directory of Open Access Journals (Sweden)

    Vaclav Repa


    Full Text Available This paper introduces a methodology for modeling business processes. The creation of the methodology is described in terms of the Design Science Method. First, the gap in contemporary business process modeling approaches is identified, and general modeling principles which can fill the gap are discussed. The way in which these principles have been implemented in the main features of the created methodology is then described. The most critical identified points of business process modeling are process states, process hierarchy and the granularity of process description. The methodology has been evaluated by use in a real project. Using examples from this project, the main methodology features are explained together with the significant problems met during the project. Drawing on these problems together with the results of the methodology evaluation, the needed future development of the methodology is outlined.

  6. Quantifying roughness coefficient uncertainty in urban flooding simulations through a simplified methodology

    NARCIS (Netherlands)

    Bellos, Vasilis; Kourtis, Ioannis M.; Moreno Rodenas, A.M.; Tsihrintzis, Vassilios A.


    © 2017 by the authors. A methodology is presented which can be used in the evaluation of parametric uncertainty in urban flooding simulation. Due to the fact that such simulations are time consuming, the following methodology is proposed: (a) simplification of the description of the physical

  7. A framework for using simulation methodology in ergonomics interventions in design projects

    DEFF Research Database (Denmark)

    Broberg, Ole; Duarte, Francisco; Andersen, Simone Nyholm


    The aim of this paper is to outline a framework of simulation methodology in design processes from an ergonomics perspective.

  8. Adaptive System Modeling for Spacecraft Simulation (United States)

    Thomas, Justin


    This invention introduces a methodology and associated software tools for automatically learning spacecraft system models without any assumptions regarding system behavior. Data stream mining techniques were used to learn models for critical portions of the International Space Station (ISS) Electrical Power System (EPS). Evaluation on historical ISS telemetry data shows that adaptive system modeling reduces simulation error anywhere from 50 to 90 percent over existing approaches. The purpose of the methodology is to outline how accurate system models can be created from sensor (telemetry) data; the purpose of the software is to support the methodology. The software provides analysis tools to design the adaptive models, as well as the algorithms to initially build system models and continuously update them from the latest streaming sensor data. The main strengths are as follows: it creates accurate spacecraft system models without in-depth system knowledge or any assumptions about system behavior; it automatically updates/calibrates system models using the latest streaming sensor data; it creates device-specific models that capture the exact behavior of devices of the same type; it adapts to evolving systems; and it can reduce computational complexity (faster simulations).
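One hedged way to picture "continuously update system models from streaming telemetry" is recursive least squares with exponential forgetting; this is only an illustrative stand-in, since the invention itself uses data stream mining techniques that are not specified here, and the device model below is invented:

```python
import numpy as np

class StreamingLinearModel:
    """Adaptive linear device model y ~ w.x whose weights are refined from
    each new telemetry sample by recursive least squares with exponential
    forgetting, so the model tracks a slowly evolving system online."""

    def __init__(self, n_inputs, forgetting=0.99):
        self.w = np.zeros(n_inputs)            # current model weights
        self.P = np.eye(n_inputs) * 1e3        # inverse covariance estimate
        self.lam = forgetting                  # 1.0 = never forget old data

    def update(self, x, y):
        Px = self.P @ x
        gain = Px / (self.lam + x @ Px)
        self.w = self.w + gain * (y - self.w @ x)   # correct toward sample
        self.P = (self.P - np.outer(gain, Px)) / self.lam

    def predict(self, x):
        return self.w @ x

# Hypothetical telemetry stream from a device obeying y = 2*x0 - x1
rng = np.random.default_rng(0)
model = StreamingLinearModel(2)
for _ in range(500):
    x = rng.normal(size=2)
    model.update(x, 2.0 * x[0] - 1.0 * x[1])
```

The forgetting factor is what lets such a model "adapt to evolving systems": older telemetry is discounted geometrically rather than weighted equally forever.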

  9. Threat model framework and methodology for personal networks (PNs)

    DEFF Research Database (Denmark)

    Prasad, Neeli R.


    is to give a structured, convenient approach for building threat models. A framework for the threat model is presented with a list of requirements for methodology. The methodology will be applied to build a threat model for Personal Networks. Practical tools like UML sequence diagrams and attack trees have...

  10. Simulation as a surgical teaching model. (United States)

    Ruiz-Gómez, José Luis; Martín-Parra, José Ignacio; González-Noriega, Mónica; Redondo-Figuero, Carlos Godofredo; Manuel-Palazuelos, José Carlos


    The teaching of surgery has been affected by many factors in recent years, such as the reduction of working hours, the optimization of operating room use, and patient safety. Traditional teaching methodology fails to reduce the impact of these factors on the surgeon's training. Simulation as a teaching model minimizes such impact, and is more effective than traditional teaching methods for integrating knowledge and clinical-surgical skills. Simulation complements clinical assistance with training, creating a safe learning environment where patient safety is not affected and ethical or legal conflicts are avoided. Simulation uses learning methodologies that allow the individualization of teaching, adapting it to the learning needs of each student. It also allows training of all kinds of technical, cognitive or behavioural skills. Copyright © 2017 AEC. Publicado por Elsevier España, S.L.U. All rights reserved.

  11. A methodology for determining the dynamic exchange of resources in nuclear fuel cycle simulation

    Energy Technology Data Exchange (ETDEWEB)

    Gidden, Matthew J., E-mail: [International Institute for Applied Systems Analysis, Schlossplatz 1, A-2361 Laxenburg (Austria); University of Wisconsin – Madison, Department of Nuclear Engineering and Engineering Physics, Madison, WI 53706 (United States); Wilson, Paul P.H. [University of Wisconsin – Madison, Department of Nuclear Engineering and Engineering Physics, Madison, WI 53706 (United States)


    Highlights: • A novel fuel cycle simulation entity interaction mechanism is proposed. • A framework and implementation of the mechanism is described. • New facility outage and regional interaction scenario studies are described and analyzed. - Abstract: Simulation of the nuclear fuel cycle can be performed using a wide range of techniques and methodologies. Past efforts have focused on specific fuel cycles or reactor technologies. The CYCLUS fuel cycle simulator seeks to separate the design of the simulation from the fuel cycle or technologies of interest. In order to support this separation, a robust supply-demand communication and solution framework is required. Accordingly, an agent-based supply-chain framework, the Dynamic Resource Exchange (DRE), has been designed and implemented in CYCLUS. It supports the communication of complex resources, namely isotopic compositions of nuclear fuel, between fuel cycle facilities and their managers (e.g., institutions and regions). Instances of supply and demand are defined as an optimization problem and solved at each timestep. Importantly, the DRE allows each agent in the simulation to independently indicate preferences for specific trading options in order to meet physics requirements and to satisfy constraints imposed by potential socio-political models. To display the variety of simulations that the DRE enables, example scenarios are formulated and described. Important features include key fuel-cycle facility outages, the introduction of external recycled fuel sources (similar to the current mixed oxide (MOX) fuel fabrication facility in the United States), and nontrivial interactions between fuel cycles existing in different regions.
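To illustrate how preference-driven supply-demand matching per timestep might look, here is a deliberately simplified greedy sketch; the real DRE formulates and solves an optimization problem, and all facility names and quantities below are invented:

```python
def match_requests(requests, bids, prefs):
    """Greedy per-timestep resource matching: each requester is served by
    the bidding supplier it prefers most, subject to remaining supplier
    capacity. (Illustration only -- the actual DRE poses this as an
    optimization problem over all preferences and constraints.)"""
    capacity = dict(bids)                  # supplier -> remaining quantity
    trades = []
    for requester, qty in requests:
        # candidate suppliers, highest preference first
        for supplier in sorted(prefs[requester],
                               key=prefs[requester].get, reverse=True):
            take = min(qty, capacity.get(supplier, 0.0))
            if take > 0.0:
                trades.append((supplier, requester, take))
                capacity[supplier] -= take
                qty -= take
            if qty <= 0.0:
                break
    return trades

# Invented scenario: two reactors each request 2 units of fuel; both prefer
# the MOX fabricator, which can only cover 3 units in total.
requests = [("rx1", 2.0), ("rx2", 2.0)]
bids = [("mox_fab", 3.0), ("uox_fab", 3.0)]
prefs = {"rx1": {"mox_fab": 2.0, "uox_fab": 1.0},
         "rx2": {"mox_fab": 2.0, "uox_fab": 1.0}}
trades = match_requests(requests, bids, prefs)
```

Even this toy version shows the key behaviour the abstract describes: preferences steer trades toward favoured partners, and the second requester spills over to its fallback supplier once capacity runs out.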

  12. Validation of multi-body modelling methodology for reconfigurable underwater robots

    DEFF Research Database (Denmark)

    Nielsen, M.C.; Eidsvik, O. A.; Blanke, Mogens


    This paper investigates the problem of employing reconfigurable robots in an underwater setting. The main result presented is the experimental validation of a modelling methodology for a system consisting of N dynamically connected robots with heterogeneous dynamics. Two distinct types of experiments are performed: a series of hydrostatic free-decay tests and a series of open-loop trajectory tests. The results are compared to a simulation based on the modelling methodology. The modelling methodology shows promising results for use with systems composed of reconfigurable underwater modules.

  13. Integrated methodology for constructing a quantified hydrodynamic model for application to clastic petroleum reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Honarpour, M. M.; Schatzinger, R. A.; Szpakiewicz, M. J.; Jackson, S. R.; Sharma, B.; Tomutsa, L.; Chang, M. M.


    A comprehensive, multidisciplinary, stepwise methodology is developed for constructing and integrating geological and engineering information for predicting petroleum reservoir performance. This methodology is based on our experience in characterizing shallow marine reservoirs, but it should also apply to other deposystems. The methodology is presented as Part 1 of this report. Three major tasks that must be studied to facilitate a systematic approach for constructing a predictive hydrodynamic model for petroleum reservoirs are addressed: (1) data collection, organization, evaluation, and integration; (2) hydrodynamic model construction and verification; and (3) prediction and ranking of reservoir parameters by numerical simulation using data derived from the model. 39 refs., 62 figs., 13 tabs.

  14. Powertrain modeling and simulation for off-road vehicles

    Energy Technology Data Exchange (ETDEWEB)

    Ouellette, S. [McGill Univ., Montreal, PQ (Canada)


    Standard forward-facing automotive powertrain modeling and simulation methodology did not perform equally well across vehicles and applications as varied as the 2010 Winter Olympics, the 2009 World Alpine Ski Championships, Summit Station in Greenland, the McGill Formula Hybrid, the Unicell QuickSider, and lunar mobility. This presentation provided a standard automotive powertrain modeling and simulation flow chart along with an example, as well as a flow chart for location-based powertrain modeling and simulation, and discussed its implementation. It was found that in certain applications, vehicle-environment interactions cannot be neglected if the model is to have good fidelity; powertrain modeling and simulation of off-road vehicles therefore demands a new approach. It was concluded that the proposed location-based methodology could improve the results for off-road vehicles. tabs., figs.

  15. A universal simulator for ecological models

    DEFF Research Database (Denmark)

    Holst, Niels


    Software design is an often neglected issue in ecological models, even though bad software design often becomes a hindrance to re-using, sharing and even grasping an ecological model. In this paper, the methodology of agile software design was applied to the domain of ecological models, and the principles for a universal design of ecological models were arrived at. To exemplify this design, the open-source software Universal Simulator was constructed using C++ and XML and is provided as a resource for inspiration.

  16. Notes on modeling and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Redondo, Antonio [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)


    These notes present a high-level overview of how modeling and simulation are carried out by practitioners. The discussion is of a general nature; no specific techniques are examined but the activities associated with all modeling and simulation approaches are briefly addressed. There is also a discussion of validation and verification and, at the end, a section on why modeling and simulation are useful.

  17. A new methodology to simulate subglacial deformation of water saturated granular material

    DEFF Research Database (Denmark)

    Damsgaard, Anders; Egholm, David Lundbek; Piotrowski, Jan A.


    The dynamics of glaciers are to a large degree governed by processes operating at the ice-bed interface, and one of the primary mechanisms of glacier flow over soft unconsolidated sediments is subglacial deformation. However, it has proven difficult to constrain the mechanical response ...... of subglacial sediment to the shear stress of an overriding glacier. In this study, we present a new methodology designed to simulate subglacial deformation using a coupled numerical model for computational experiments on grain-fluid mixtures. The granular phase is simulated on a per-grain basis by the discrete ...... can cause variations in the pore-fluid pressure. The pressure variations weaken or strengthen the granular phase, and in turn influence the distribution of shear strain with depth. In permeable sediments the strain distribution is governed by the grain-size distribution and effective normal stress......

  18. Methodology for Simulation and Analysis of Complex Adaptive Supply Network Structure and Dynamics Using Information Theory

    Directory of Open Access Journals (Sweden)

    Joshua Rodewald


    Full Text Available Supply networks existing today in many industries can behave as complex adaptive systems, making them more difficult to analyze and assess. Being able to fully understand both the static and dynamic structures of a complex adaptive supply network (CASN) is key to making more informed management decisions and prioritizing resources and production throughout the network. Previous efforts to model and analyze CASNs have been impeded by the complex, dynamic nature of the systems. However, drawing from other complex adaptive systems sciences, information theory provides a model-free methodology that removes many of those barriers, especially concerning complex network structure and dynamics. With minimal information about the network nodes, transfer entropy can be used to reverse engineer the network structure, while local transfer entropy can be used to analyze the network structure’s dynamics. Both simulated and real-world networks were analyzed using this methodology. Applying the methodology to CASNs allows the practitioner to capitalize on observations from the highly multidisciplinary field of information theory, which provides insights into a CASN’s self-organization, emergence, stability/instability, and distributed computation. This not only provides managers with a more thorough understanding of a system’s structure and dynamics for management purposes, but also opens up research opportunities into eventual strategies to monitor and manage emergence and adaptation within the environment.
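
    The transfer-entropy idea described above can be illustrated with a minimal plug-in (histogram) estimator. The coupled binary series, history length of one, and 10% noise level below are illustrative assumptions for the sketch, not details from the record:

```python
import numpy as np

def transfer_entropy(x, y):
    """Estimate transfer entropy TE(X -> Y) in bits for binary series,
    using plug-in probabilities with history length 1."""
    x0, y0, y1 = x[:-1], y[:-1], y[1:]
    te = 0.0
    for a in (0, 1):          # y_{t+1}
        for b in (0, 1):      # y_t
            for c in (0, 1):  # x_t
                p_abc = np.mean((y1 == a) & (y0 == b) & (x0 == c))
                if p_abc == 0:
                    continue
                p_bc = np.mean((y0 == b) & (x0 == c))
                p_b = np.mean(y0 == b)
                p_ab = np.mean((y1 == a) & (y0 == b))
                # p(y1|y0,x0) / p(y1|y0)
                te += p_abc * np.log2((p_abc / p_bc) / (p_ab / p_b))
    return te

# Toy coupled system: y follows x with a one-step lag and 10% flip noise,
# so information should flow from x to y but not the reverse.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, 5000)
noise = rng.random(5000) < 0.1
y = np.empty(5000, dtype=int)
y[0] = 0
y[1:] = np.where(noise[1:], 1 - x[:-1], x[:-1])

print(transfer_entropy(x, y) > transfer_entropy(y, x))  # driving direction dominates
```

In a network setting, computing this pairwise over all node pairs recovers candidate directed links, which is the "reverse engineering" step the abstract refers to.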

  19. The SIMRAND methodology: Theory and application for the simulation of research and development projects (United States)

    Miles, R. F., Jr.


    A research and development (R&D) project often involves a number of decisions that must be made concerning which subset of systems or tasks are to be undertaken to achieve the goal of the R&D project. To help in this decision making, SIMRAND (SIMulation of Research ANd Development Projects) is a methodology for the selection of the optimal subset of systems or tasks to be undertaken on an R&D project. Using alternative networks, the SIMRAND methodology models the alternative subsets of systems or tasks under consideration. Each path through an alternative network represents one way of satisfying the project goals. Equations are developed that relate the system or task variables to the measure of preference. Uncertainty is incorporated by treating the variables of the equations probabilistically as random variables, with cumulative distribution functions assessed by technical experts. Analytical techniques of probability theory are used to reduce the complexity of the alternative networks. Cardinal utility functions over the measure of preference are assessed for the decision makers. A run of the SIMRAND I computer program combines, in a Monte Carlo simulation model, the network structure, the equations, the cumulative distribution functions, and the utility functions.
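
    As a rough sketch of the Monte Carlo core of such a methodology, the following compares two hypothetical alternative task networks by expected utility. The triangular cost distributions, the exponential (risk-averse) utility and the path names are invented for illustration and are not taken from SIMRAND itself:

```python
import math
import random

random.seed(42)

# Each alternative path is a list of per-task (low, mode, high) cost
# parameters for a triangular distribution; lower total cost is preferred.
alternatives = {
    "path_A": [(2.0, 5.0, 9.0), (1.0, 2.0, 4.0)],   # cheap on average but risky
    "path_B": [(3.0, 4.0, 5.0), (2.5, 3.0, 3.5)],   # slightly dearer but predictable
}

def utility(cost, risk_aversion=0.15):
    """Concave (risk-averse) cardinal utility over total cost."""
    return -math.exp(risk_aversion * cost)

def expected_utility(tasks, n_trials=20000):
    """Monte Carlo estimate of expected utility for one alternative path."""
    total = 0.0
    for _ in range(n_trials):
        cost = sum(random.triangular(lo, hi, mode) for lo, mode, hi in tasks)
        total += utility(cost)
    return total / n_trials

scores = {name: expected_utility(tasks) for name, tasks in alternatives.items()}
best = max(scores, key=scores.get)
print(best)
```

With a risk-averse utility the lower-variance path wins even though the mean costs are similar, which is exactly the kind of ranking the full methodology produces over its alternative networks.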

  20. Goal Model to Business Process Model: A Methodology for Enterprise Government Tourism System Development

    National Research Council Canada - National Science Library

    Ahmad Nurul Fajar; Imam Marzuki Shofi


    .... However, the goal model cannot be used directly to make a business process model. In order to solve this problem, this paper presents and proposes a methodology to extract a goal model into a business process model, called the GBPM Methodology...

  1. Modeling Methodologies For Microvibration-Related Analyses Of Spacecraft Structures (United States)

    Remedia, M.; Aglietti, G. S.; Zhang, Z.; Le Page, B.; Richardson, G.


    Driven by the increasingly stringent stability requirements of some modern payloads (e.g. the new generations of optical instruments), the issue of accurate spacecraft micro-vibration modeling has grown increasingly important. In this context, micro-vibrations are low-level mechanical disturbances occurring at frequencies from a few Hertz up to 1000 Hz. As the frequency content of these phenomena extends beyond the first few modal frequencies, FEA predictions become less accurate and alternative methods have to be considered. Other modeling and analysis techniques have been investigated and applied to vibration problems (the Stochastic Finite Element Method (e.g. Monte Carlo simulation), Statistical Energy Analysis (a well-established method for high frequency ranges) and the Hybrid FE-SEA method), with the aim of investigating medium- and high-frequency behavior. This work is part of a project whose aim is to establish appropriate procedures for the modeling and analysis of micro-vibration and to validate these procedures against experimental data. All the methods cited above are implemented in this study and compared with experimental results, in order to assess the performance of the various methodologies for micro-vibration problems, covering the whole frequency range up to 1000 Hz. Some comparisons between experimental and computational results are performed using the MAC (Modal Assurance Criterion). Other analyses, such as linearity, reciprocity and the effect of the harness, are also described. The benchmark model that provided the experimental data is the satellite platform SSTL 300, and this paper outlines the related test campaigns.

  2. Global Methodology to Integrate Innovative Models for Electric Motors in Complete Vehicle Simulators Méthodologie générale d’intégration de modèles innovants de moteurs électriques dans des simulateurs véhicules complets

    Directory of Open Access Journals (Sweden)

    Abdelli A.


    Full Text Available How can the greenhouse gas emissions of passenger cars be reduced to 120 g/km in 2012 and 95 g/km in 2020, as stated by the European Commission and the automotive manufacturers? This question, which has multiple answers, currently preoccupies the whole automotive world. One of the most promising solutions receiving attention is the electrification of the vehicle. It is this idea that has prompted automobile manufacturers to envisage increasingly innovative hybrid vehicles. However, this theoretically interesting solution makes the powertrain more complex, which requires the use of simulation tools in order to reduce the cost and time of system development. System simulation, which is already a crucial tool in the design process of internal combustion engines, becomes indispensable in the development of the Hybrid Electric Vehicle (HEV). To study the complex structures of HEVs, following the example of the physical models developed for the internal combustion engine, system simulation has to provide equally predictive models for electric machines. By specification, these models have to take into account the strict constraint on simulation time. This constraint guarantees the wide use of simulators, notably to help the development and validation of control strategies. This paper presents a global methodology for developing innovative models of electrical machines. The final objective of these models is to be integrated into a global vehicle simulator. The methodology includes several types of models and tools, such as Finite Element Models (FEM), characterization and simulation models. It was applied successfully to model an internal permanent magnet synchronous motor. At the end of the modelling process, the electric motor was integrated into a complete hybrid vehicle simulator, guaranteeing its good operation and integration in the global design process of a new vehicle concept.


    Directory of Open Access Journals (Sweden)



    Full Text Available Nowadays, thanks to the possibilities that electronic computers offer, namely large memory capacity and high computing speed, modeling methods are continually improving, with complex system modeling using simulation techniques playing an important role. These o

  4. Simulation Model of a Transient

    DEFF Research Database (Denmark)

    Jauch, Clemens; Sørensen, Poul; Bak-Jensen, Birgitte


    This paper describes the simulation model of a controller that enables an active-stall wind turbine to ride through transient faults. The simulated wind turbine is connected to a simple model of a power system. Certain fault scenarios are specified and the turbine shall be able to sustain operation...... in case of such faults. The design of the controller is described and its performance assessed by simulations. The control strategies are explained and the behaviour of the turbine discussed....

  5. Bridging experiments, models and simulations

    DEFF Research Database (Denmark)

    Carusi, Annamaria; Burrage, Kevin; Rodríguez, Blanca


    electrophysiology. Our analysis reveals that models, simulations, and experiments are intertwined, in an assemblage that is a system itself, namely the model-simulation-experiment (MSE) system. We argue that validation is part of the whole MSE system and is contingent upon 1) understanding and coping with sources...

  6. Defense Modeling and Simulation Initiative (United States)


    an artificial battlefield created by computer-based simulation software. The most important constraint associated with this type of simulator is the...techniques for improving on this situation, which draw on artificial intelligence, mathematical programming, and simpler operations research methods...algorithms, data structures for real-time representation and modeling • Develop a global hierarchy of interoperable environmental models • Develop intelligent

  7. Simulation - modeling - experiment; Simulation - modelisation - experience

    Energy Technology Data Exchange (ETDEWEB)



    After two workshops held in 2001 on the same topics, and in order to take stock of the advances in the domain of simulation and measurements, the main goals proposed for this workshop are: the presentation of the state-of-the-art of tools, methods and experiments in the domains of interest of the Gedepeon research group, and the exchange of information about the possibilities of use of computer codes and facilities, about the understanding of physical and chemical phenomena, and about development and experiment needs. This document gathers 18 presentations (slides) among the 19 given at this workshop, dealing with: the deterministic and stochastic codes in reactor physics (Rimpault G.); MURE: an evolution code coupled with MCNP (Meplan O.); neutronic calculation of future reactors at EdF (Lecarpentier D.); advance status of the MCNP/TRIO-U neutronic/thermal-hydraulics coupling (Nuttin A.); the FLICA4/TRIPOLI4 thermal-hydraulics/neutronics coupling (Aniel S.); methods of disturbances and sensitivity analysis of nuclear data in reactor physics, application to the VENUS-2 experimental reactor (Bidaud A.); modeling for the reliability improvement of an ADS accelerator (Biarotte J.L.); residual gas compensation of the space charge of intense beams (Ben Ismail A.); experimental determination and numerical modeling of phase equilibrium diagrams of interest in nuclear applications (Gachon J.C.); modeling of irradiation effects (Barbu A.); elastic limit and irradiation damage in Fe-Cr alloys: simulation and experiment (Pontikis V.); experimental measurements of spallation residues, comparison with Monte-Carlo simulation codes (Fallot M.); the spallation target-reactor coupling (Rimpault G.); tools and data (Grouiller J.P.); models in high energy transport codes: status and perspective (Leray S.); other ways of investigation for spallation (Audoin L.); neutrons and light particles production at intermediate energies (20-200 MeV) with iron, lead and uranium targets (Lecolley F.

  8. Multidisciplinary Engineering Models: Methodology and Case Study in Spreadsheet Analytics


    Birch, D.; Liang, H.; Ko, J.; Kelly, P.; Field, A.; Mullineux, G.; Simondetti, A.


    This paper demonstrates a methodology to help practitioners maximise the utility of complex multidisciplinary engineering models implemented as spreadsheets, an area presenting unique challenges. As motivation we investigate the expanding use of Integrated Resource Management(IRM) models which assess the sustainability of urban masterplan designs. IRM models reflect the inherent complexity of multidisciplinary sustainability analysis by integrating models from many disciplines. This complexit...

  9. SR-Site groundwater flow modelling methodology, setup and results

    Energy Technology Data Exchange (ETDEWEB)

    Selroos, Jan-Olof (Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden)); Follin, Sven (SF GeoLogic AB, Taeby (Sweden))


    As a part of the license application for a final repository for spent nuclear fuel at Forsmark, the Swedish Nuclear Fuel and Waste Management Company (SKB) has undertaken three groundwater flow modelling studies. These are performed within the SR-Site project and represent time periods with different climate conditions. The simulations carried out contribute to the overall evaluation of the repository design and long-term radiological safety. Three time periods are addressed; the Excavation and operational phases, the Initial period of temperate climate after closure, and the Remaining part of the reference glacial cycle. The present report is a synthesis of the background reports describing the modelling methodology, setup, and results. It is the primary reference for the conclusions drawn in a SR-Site specific context concerning groundwater flow during the three climate periods. These conclusions are not necessarily provided explicitly in the background reports, but are based on the results provided in these reports. The main results and comparisons presented in the present report are summarised in the SR-Site Main report.

  10. Model-driven software migration a methodology

    CERN Document Server

    Wagner, Christian


    Today, reliable software systems are the basis of any business or company. The continuous further development of those systems is the central component of software evolution. It requires a huge amount of time and manpower as well as financial resources. The challenges are the size, age and heterogeneity of those software systems. Christian Wagner addresses software evolution: the inherent problems and uncertainties in the process. He presents a model-driven method which leads to a synchronization between source code and design. As a result the model layer will be the central part in further e

  11. Overview of Computer Simulation Modeling Approaches and Methods (United States)

    Robert E. Manning; Robert M. Itami; David N. Cole; Randy Gimblett


    The field of simulation modeling has grown greatly with recent advances in computer hardware and software. Much of this work has involved large scientific and industrial applications for which substantial financial resources are available. However, advances in object-oriented programming and simulation methodology, concurrent with dramatic increases in computer...

  12. Teaching Behavioral Modeling and Simulation Techniques for Power Electronics Courses (United States)

    Abramovitz, A.


    This paper suggests a pedagogical approach to teaching the subject of behavioral modeling of switch-mode power electronics systems through simulation by general-purpose electronic circuit simulators. The methodology is oriented toward electrical engineering (EE) students at the undergraduate level, enrolled in courses such as "Power…

  13. Model Calibration for Ship Simulations

    NARCIS (Netherlands)

    E.F.G. van Daalen (Ed); J. Fehribach; T. van Leeuwen (Tristan); C. Reinhardt; N. Schenkels; R. Sheombarsing


    Model calibration is an important aspect in ship simulation. Here, ship motion is described by an ODE which includes tuning parameters that capture complex physical processes such as friction of the hull. In order for the simulations to be realistic for a wide range of

  14. Model Calibration for Ship Simulations

    NARCIS (Netherlands)

    van Daalen, Ed; Fehribach, Joseph; van Leeuwen, Tristan; Reinhardt, Christian; Schenkels, Nick; Sheombarsing, Ray


    Model calibration is an important aspect in ship simulation. Here, ship motion is described by an ODE which includes tuning parameters that capture complex physical processes such as friction of the hull. In order for the simulations to be realistic for a wide range of scenarios these tuning
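
    A minimal sketch of this kind of calibration: fit an ODE tuning parameter to observed motion data by least squares. The quadratic-drag model, the synthetic "measured" run and the grid-search fit are all illustrative assumptions, not the record's actual method:

```python
import numpy as np

def simulate_speed(k, v0=5.0, dt=0.1, steps=100):
    """Forward-Euler integration of dv/dt = -k * v**2, a toy hull-drag model
    with a single tuning parameter k."""
    v = np.empty(steps + 1)
    v[0] = v0
    for i in range(steps):
        v[i + 1] = v[i] - dt * k * v[i] ** 2
    return v

# Synthetic "measured" deceleration run generated with a known coefficient
k_true = 0.08
observed = simulate_speed(k_true)

# Calibrate by least squares over a coarse grid of candidate coefficients
candidates = np.linspace(0.01, 0.2, 191)
errors = [np.sum((simulate_speed(k) - observed) ** 2) for k in candidates]
k_fit = candidates[int(np.argmin(errors))]
print(k_fit)
```

In practice one would use a proper optimizer and real trial data, but the structure (simulate, compare to measurements, adjust the tuning parameter) is the same.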

  15. Multilevel Modeling: A Review of Methodological Issues and Applications (United States)

    Dedrick, Robert F.; Ferron, John M.; Hess, Melinda R.; Hogarty, Kristine Y.; Kromrey, Jeffrey D.; Lang, Thomas R.; Niles, John D.; Lee, Reginald S.


    This study analyzed the reporting of multilevel modeling applications of a sample of 99 articles from 13 peer-reviewed journals in education and the social sciences. A checklist, derived from the methodological literature on multilevel modeling and focusing on the issues of model development and specification, data considerations, estimation, and…

  16. Efficient Modelling Methodology for Reconfigurable Underwater Robots

    DEFF Research Database (Denmark)

    Nielsen, Mikkel Cornelius; Blanke, Mogens; Schjølberg, Ingrid


    This paper considers the challenge of applying reconfigurable robots in an underwater environment. The main result presented is the development of a model for a system comprised of N, possibly heterogeneous, robots dynamically connected to each other and moving with 6 Degrees of Freedom (DOF)...... To avoid singularities in the orientation and, thereby, allow the robots to undertake any relative configuration, the attitude is represented in Euler parameters.

  17. Chapter three: methodology of exposure modeling

    CSIR Research Space (South Africa)

    Moschandreas, DJ


    Full Text Available and Turner, 1996). Steady-state Gaussian plume dispersion models require hourly single-point meteorological data at the surface and an upper air station to estimate the mixing height. More... require core parameters (Table 1). These include surface wind direction, wind speed, air temperature and cloud data. In addition, upper air data, typically measured twice per day, determines wind, temperature, and humidity changes with height (Schulze...
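
    The steady-state Gaussian plume formula underlying such models can be sketched as follows; the emission rate, wind speed and dispersion parameters below are illustrative values, not data from the chapter:

```python
import math

def gaussian_plume(Q, u, sigma_y, sigma_z, y, z, H):
    """Steady-state Gaussian plume concentration (g/m^3) at crosswind offset y
    and height z, for emission rate Q (g/s), wind speed u (m/s), dispersion
    parameters sigma_y, sigma_z (m) and effective stack height H (m).
    Includes the standard ground-reflection (image-source) term."""
    lateral = math.exp(-y**2 / (2 * sigma_y**2))
    vertical = (math.exp(-(z - H)**2 / (2 * sigma_z**2))
                + math.exp(-(z + H)**2 / (2 * sigma_z**2)))
    return Q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Ground-level centreline concentration downwind of a 50 m stack;
# sigma_y and sigma_z would normally come from a stability-class fit.
c = gaussian_plume(Q=100.0, u=5.0, sigma_y=80.0, sigma_z=40.0, y=0.0, z=0.0, H=50.0)
print(f"{c:.2e} g/m^3")
```

The meteorological inputs the chapter lists (wind direction and speed, mixing height, stability) enter through u and through the stability-dependent growth of sigma_y and sigma_z with downwind distance.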

  18. Progress in modeling and simulation. (United States)

    Kindler, E


    For the modeling of systems, computers are used more and more, while the other "media" (including the human intellect) that carry the models are abandoned. For the modeling of knowledge, i.e. of more or less general concepts (possibly used to model systems composed of instances of such concepts), object-oriented programming is nowadays widely used. For the modeling of processes that exist and develop in time, computer simulation is used, the results of which are often presented by means of animation (graphical pictures moving and changing in time). Unfortunately, object-oriented programming tools are commonly not designed to be of great use for simulation, while programming tools for simulation do not enable their users to apply the advantages of object-oriented programming. Nevertheless, there are exceptions that enable general concepts represented on a computer to be used for constructing simulation models and for their easy modification. They are described in the present paper, together with precise definitions of modeling, simulation and object-oriented programming (including cases that do not satisfy the definitions but are liable to introduce misunderstanding), an outline of their applications, and a view of their further development. Given that computing systems are being introduced as control components into a large spectrum of (technological, social and biological) systems, attention is directed to models of systems containing modeling components.
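
    A minimal sketch of the object-oriented simulation style discussed here: model concepts are classes, system components are their instances, and a simulation steps those instances through time. All names and numbers are hypothetical:

```python
class Machine:
    """A model concept represented as a class; each instance is one
    component of the simulated system."""
    def __init__(self, name, rate):
        self.name, self.rate, self.output = name, rate, 0.0

    def step(self, dt):
        # advance this component's state by one time increment
        self.output += self.rate * dt

class Simulation:
    """Drives a collection of model instances through simulated time."""
    def __init__(self, components):
        self.components, self.time = components, 0.0

    def run(self, dt, steps):
        for _ in range(steps):
            for c in self.components:
                c.step(dt)
            self.time += dt

sim = Simulation([Machine("lathe", 2.0), Machine("mill", 3.0)])
sim.run(dt=0.5, steps=10)
print(sim.time, [c.output for c in sim.components])
```

The point of the abstract is precisely this combination: general concepts (classes) are reused and specialized to build simulation models, and modifying a model means subclassing or swapping instances rather than rewriting the simulation driver.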

  19. Development of a numerical methodology for flowforming process simulation of complex geometry tubes (United States)

    Varela, Sonia; Santos, Maite; Arroyo, Amaia; Pérez, Iñaki; Puigjaner, Joan Francesc; Puigjaner, Blanca


    Nowadays, the incremental flowforming process is widely explored because the usage of complex tubular products is increasing due to the light-weighting trend and the use of expensive materials. The enhanced mechanical properties of finished parts, combined with the process efficiency in terms of raw material and energy consumption, are the key factors for its competitiveness and sustainability, which is consistent with EU industry policy. As a promising technology, additional steps for extending the existing flowforming limits in the production of tubular products are required. The objective of the present research is to further expand the current state of the art regarding limitations on tube thickness and diameter, exploring the feasibility of flowforming complex geometries such as tubes with thicknesses of up to 60 mm. In this study, the analysis of the backward flowforming process of a 7075 aluminum tubular preform is carried out as a demonstration case to define the optimum process parameters, machine requirements and tooling geometry. Numerical simulation studies on flowforming of thin-walled tubular components have been considered to increase the knowledge of the technology. The calculation of the rotational movement of the mesh preform, the high thickness/length ratio and the thermomechanical conditions significantly increase the computation time of the numerical simulation model. This means that efficient and reliable tools able to predict the forming loads and the quality of flowformed thick tubes are not available. This paper aims to overcome this situation by developing a simulation methodology based on an FEM simulation code that includes new strategies. Material characterization has also been performed through tensile tests in order to design the process. Finally, to check the reliability of the model, flowforming tests have been carried out in an industrial environment.

  20. Methodological advances: using greenhouses to simulate climate change scenarios. (United States)

    Morales, F; Pascual, I; Sánchez-Díaz, M; Aguirreolea, J; Irigoyen, J J; Goicoechea, N; Antolín, M C; Oyarzun, M; Urdiain, A


    Human activities are increasing atmospheric CO2 concentration and temperature. Related to this global warming, periods of low water availability are also expected to increase. Thus, CO2 concentration, temperature and water availability are three of the main factors related to climate change that may potentially influence crops and ecosystems. In this report, we describe the use of growth chamber greenhouses (GCG) and temperature gradient greenhouses (TGG) to simulate climate change scenarios and to investigate possible plant responses. In the GCG, CO2 concentration, temperature and water availability are set to act simultaneously, enabling comparison of a current situation with a future one. Other characteristics of the GCG are a relatively large working space, fine control of the relative humidity, plant fertirrigation and the possibility of light supplementation, within the photosynthetically active radiation (PAR) region and/or with ultraviolet-B (UV-B) light. In the TGG, the three above-mentioned factors can act independently or in interaction, enabling more mechanistic studies aimed at elucidating the limiting factor(s) responsible for a given plant response. Examples of experiments using GCG and TGG are reported, including some aimed at studying photosynthetic acclimation, a phenomenon that leads to decreased photosynthetic capacity under long-term exposure to elevated CO2. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  1. Simulation of Detonation Problems with MLS Grid Free Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Yao, J; Gunger, M E; Matuska, D A


    The MLS grid-free rezone method, a simple, flexible finite difference method to solve general mechanics problems, especially detonation problems, is proposed in this paper. The spatial points that carry time-dependent data are distributed in space in such a way as to provide nearly uniform spacing of points, accurate representation of boundaries, easy variation of resolution and arbitrary deletion of irrelevant regions. Local finite difference operators are obtained with simple MLS differentiation. There is no specific topological or geometrical restriction on the distribution of data points. Therefore this method avoids many drawbacks of traditional CFD methods. Because of its flexibility, it can be used to simulate a wide range of mechanics problems. Because of its simplicity, it has the potential to become a preferred method. Most traditional CFD methods, from an SPH viewpoint, can be considered as special cases of grid-free methods with specific kernel functions. Such a generalization allows the development of a unified grid-free CFD code that can be switched between various CFD methods by switching the kernel functions. Because of its flexibility in management and simplicity of coding, such a unified code is desirable.
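
    The core idea of MLS differentiation, a local weighted least-squares fit whose derivative supplies the finite difference operator at a point, can be sketched on scattered 1-D data. The Gaussian kernel, bandwidth and quadratic basis below are illustrative choices, not the paper's specific formulation:

```python
import numpy as np

def mls_derivative(xs, fs, x0, bandwidth=0.3):
    """Estimate f'(x0) from scattered samples (xs, fs) by a weighted
    least-squares fit of a local quadratic (a moving-least-squares fit)."""
    w = np.exp(-((xs - x0) / bandwidth) ** 2)        # Gaussian kernel weights
    # Local polynomial basis centred at x0: [1, (x - x0), (x - x0)^2]
    A = np.vstack([np.ones_like(xs), xs - x0, (xs - x0) ** 2]).T
    coeffs, *_ = np.linalg.lstsq(A * w[:, None], fs * w, rcond=None)
    return coeffs[1]                                  # slope of the fit at x0

rng = np.random.default_rng(1)
xs = rng.uniform(0.0, 2.0, 200)                       # irregular point cloud, no grid
fs = xs ** 2                                          # sample f(x) = x^2
print(round(mls_derivative(xs, fs, 1.0), 3))          # exact derivative at x=1 is 2
```

Because no grid connectivity is needed, points can be added, deleted or moved freely, which is exactly the property the rezone method exploits.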

  2. Traffic and emission simulation in China based on statistical methodology (United States)

    Liu, Huan; He, Kebin; Barth, Matthew


    To better understand how traffic control can affect vehicle emissions, a novel TRaffic And Vehicle Emission Linkage (TRAVEL) approach was developed based on local traffic activity and emission data. This approach consists of a two-stage mapping from general traffic information to traffic flow patterns, and then to aggregated emission rates. 39 traffic flow patterns and corresponding emission rates for light-duty and heavy-duty vehicles, classified by emission standard, are generated. As a case study, vehicle activity and emissions during the Beijing Olympics were simulated and compared to a business-as-usual (BAU) scenario. Approximately 42-65% of the gaseous pollutants and 24% of the particle pollutants from cars, taxis and buses were reduced. These results are validated by traffic and air quality monitoring data during the Olympics, as well as by other emission inventory studies. This approach improves the ability to rapidly predict emission variations resulting from traffic control measures in several typical Chinese cities. Comments on the application of this approach, covering both its advantages and limitations, are included.
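
    A toy version of the two-stage mapping (traffic information → flow pattern → aggregated emission rate) might look like this. The pattern thresholds and emission rates are invented placeholders, not the study's 39 calibrated patterns:

```python
# Stage 1: classify general traffic information into a flow pattern.
def classify_pattern(avg_speed_kmh):
    if avg_speed_kmh < 20:
        return "congested"
    if avg_speed_kmh < 50:
        return "urban"
    return "freeway"

# Stage 2: hypothetical aggregated emission rates, g/km per vehicle.
rates = {
    ("congested", "light_duty"): 0.95, ("congested", "heavy_duty"): 6.1,
    ("urban", "light_duty"): 0.55,     ("urban", "heavy_duty"): 3.8,
    ("freeway", "light_duty"): 0.40,   ("freeway", "heavy_duty"): 2.9,
}

def link_emissions(links):
    """links: list of (avg_speed_kmh, vkt_light, vkt_heavy) per road link,
    where vkt is vehicle-kilometres travelled. Returns total emissions in g."""
    total = 0.0
    for speed, vkt_l, vkt_h in links:
        p = classify_pattern(speed)
        total += rates[(p, "light_duty")] * vkt_l + rates[(p, "heavy_duty")] * vkt_h
    return total

base = link_emissions([(15, 1e5, 2e4), (60, 3e5, 5e4)])
# Traffic control raises speed and removes some trips on the congested link.
controlled = link_emissions([(35, 0.8e5, 1.5e4), (60, 3e5, 5e4)])
print(controlled < base)
```

The real approach derives the patterns and rates statistically from measured activity and emission data, but the lookup structure of the prediction step is the same.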

  3. TREAT Modeling and Simulation Strategy

    Energy Technology Data Exchange (ETDEWEB)

    DeHart, Mark David [Idaho National Lab. (INL), Idaho Falls, ID (United States)


    This report summarizes a four-phase strategy for developing modeling and simulation software for the Transient Reactor Test Facility. The four phases of this research and development task are (1) full core transient calculations with feedback, (2) experiment modeling, (3) full core plus experiment simulation, and (4) quality assurance. The document describes the four phases, the relationships between them, and anticipated needs within each phase.

  4. Modeling and Simulation at NASA (United States)

    Steele, Martin J.


    This slide presentation is composed of two topics. The first reviews the use of modeling and simulation (M&S), particularly as it relates to the Constellation program and discrete event simulation (DES). DES is defined as a process and system analysis, through time-based and resource-constrained probabilistic simulation models, that provides insight into operational system performance. The DES shows that the cycle for a launch, from manufacturing and assembly to launch and recovery, is about 45 days and that approximately 4 launches per year are practicable. The second topic reviews a NASA standard for modeling and simulation. The Columbia Accident Investigation Board made some recommendations related to models and simulations. Some of the ideas inherent in the new standard are the documentation of M&S activities, an assessment of credibility, and reporting to decision makers, which should include the analysis of the results, a statement as to the uncertainty in the results, and the credibility of the results. There is also discussion about verification and validation (V&V) of models and about the different types of models and simulations.
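
    A minimal discrete event simulation in the spirit described, using a time-ordered event queue. The single-facility launch-processing model and its parameters are illustrative only and do not reproduce the presentation's analysis (which accounts for many more resource constraints):

```python
import heapq

def simulate_launch_flow(n_vehicles, process_days, sim_days):
    """Toy DES of a launch processing flow with one integration facility
    (capacity 1): vehicles queue, occupy the facility for process_days,
    then launch. Returns launches completed within sim_days."""
    events = []                      # min-heap of (time, event_type)
    for _ in range(n_vehicles):
        heapq.heappush(events, (0.0, "arrive"))
    facility_free_at = 0.0
    launches = 0
    while events:
        t, kind = heapq.heappop(events)
        if t > sim_days:             # events are time-ordered, so we can stop
            break
        if kind == "arrive":
            start = max(t, facility_free_at)   # wait for the shared resource
            facility_free_at = start + process_days
            heapq.heappush(events, (facility_free_at, "launch"))
        else:
            launches += 1
    return launches

print(simulate_launch_flow(n_vehicles=12, process_days=45, sim_days=365))
```

With only this one resource a 45-day cycle admits about eight launches a year; the roughly four per year cited in the presentation reflects additional constraints a fuller model would impose.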

  5. Stochastic modeling analysis and simulation

    CERN Document Server

    Nelson, Barry L


    A coherent introduction to the techniques for modeling dynamic stochastic systems, this volume also offers a guide to the mathematical, numerical, and simulation tools of systems analysis. Suitable for advanced undergraduates and graduate-level industrial engineers and management science majors, it proposes modeling systems in terms of their simulation, regardless of whether simulation is employed for analysis. Beginning with a view of the conditions that permit a mathematical-numerical analysis, the text explores Poisson and renewal processes, Markov chains in discrete and continuous time, se
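
    The Markov chain material such a text covers can be illustrated by checking a simulated trajectory against the analytic stationary distribution; the two-state transition matrix below is an arbitrary example:

```python
import numpy as np

# Two-state discrete-time Markov chain (e.g. a machine that is up or down)
P = np.array([[0.9, 0.1],    # transition probabilities from state 0
              [0.5, 0.5]])   # transition probabilities from state 1

# Analytic stationary distribution: left eigenvector of P for eigenvalue 1
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmin(np.abs(evals - 1.0))])
pi /= pi.sum()

# Monte Carlo estimate: occupancy fractions along a long simulated trajectory
rng = np.random.default_rng(7)
state, counts, n = 0, np.zeros(2), 50_000
for _ in range(n):
    state = rng.choice(2, p=P[state])
    counts[state] += 1

print(pi, counts / n)  # the two estimates should agree closely
```

This is exactly the text's theme of modeling a system "in terms of its simulation": the simulation estimate is valid whether or not the analytic solution is available, and here the analysis confirms it.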

  6. Co-simulation Methodologies for Hybrid and Electric Vehicle Dynamics


    Veintimilla Porlán, Julia


    In recent decades, full electric and hybrid electric vehicles have emerged as an alternative to conventional cars due to a range of factors, including environmental and economic aspects. These vehicles are the result of considerable efforts to seek ways of reducing the use of fossil fuel for vehicle propulsion. Sophisticated technologies such as hybrid and electric powertrains require careful study and optimization. Mathematical models play a key role at this point. Currently, many advanced m...

  7. General introduction to simulation models

    DEFF Research Database (Denmark)

    Hisham Beshara Halasa, Tariq; Boklund, Anette


    trials. However, if simulation models are to be used, good-quality input data must be available. To model FMD, several disease spread models are available. For this project, we chose three simulation models: Davis Animal Disease Spread (DADS), which has been upgraded to DTU-DADS, InterSpread Plus (ISP...... ) and the North American Animal Disease Spread Model (NAADSM). The models are rather data intensive, though in varying degrees. They generally demand data at the farm level, including farm location, type, number of animals, and movement and contact frequency to other farms. To be able to generate a useful model...... of FMD spread that can provide useful and trustworthy advice, there are four important issues which the model should represent: 1) the herd structure of the country in question, 2) the dynamics of animal movements and contacts between herds, 3) the biology of the disease, and 4) the regulations...

  8. Greenhouse simulation models.

    NARCIS (Netherlands)

    Bot, G.P.A.


    A model is a representation of a real system to describe some properties i.e. internal factors of that system (out-puts) as function of some external factors (inputs). It is impossible to describe the relation between all internal factors (if even all internal factors could be defined) and all

  9. Complex systems models: engineering simulations


    Polack, Fiona A. C.; Hoverd, Tim; Sampson, Adam T.; Stepney, Susan; Timmis, Jon


    As part of research towards the CoSMoS unified infrastructure for modelling and simulating complex systems, we review uses of definitional and descriptive models in natural science and computing, and existing integrated platforms. From these, we identify requirements for engineering models of complex systems, and consider how some of the requirements could be met, using state-of-the-art model management and a mobile, process-oriented computing paradigm.

  10. Data and models in Action. Methodological Issues in Production Ecology

    NARCIS (Netherlands)

    Stein, A.; Penning, F.W.T. de


    This book addresses methodological issues of production ecology. A central issue is the combination of the agricultural model with reliable data in relation to scale. A model is developed with data from a single point, whereas decisions are to be made for areas of land. Such an approach requires the

  11. Twitter's tweet method modelling and simulation (United States)

    Sarlis, Apostolos S.; Sakas, Damianos P.; Vlachos, D. S.


    This paper seeks to propose the concept of Twitter marketing methods. The tools that Twitter provides are modelled and simulated using iThink in the context of a Twitter media-marketing agency. The paper leverages the system dynamics paradigm to model Twitter marketing tools and methods, using the iThink™ system to implement them, and uses the design science research methodology as proof of concept for the models and modelling processes. The models have been developed for a Twitter marketing agent/company and tested in real circumstances with real numbers, and were finalized through a number of revisions and iterations of design, development, simulation, testing and evaluation. The paper also addresses the methods that best suit organized promotion, through targeting, on the Twitter social media service. The validity and usefulness of these Twitter marketing method models for day-to-day decision making are authenticated by the management of the company. The work implements system dynamics concepts of Twitter marketing method modelling and produces models of various Twitter marketing situations. The Tweet method that Twitter provides can be adjusted, depending on the situation, in order to maximize the profit of the company/agent.
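
    A generic stock-and-flow sketch, in the spirit of the iThink system-dynamics models described above, can illustrate the modelling style: followers are a stock, fed by an inflow driven by tweet activity and drained by churn. All parameter values and the growth structure are invented for illustration; the paper's actual model is not public.

    ```python
    # Minimal system-dynamics sketch: a follower "stock" updated by Euler
    # integration of inflow (tweet-driven gains) minus outflow (churn).
    # The rates below are hypothetical, not taken from the paper.
    def simulate_followers(tweets_per_day, days, dt=0.25):
        followers = 100.0                          # initial stock
        gain_per_tweet, churn = 2.0, 0.01          # hypothetical rates
        steps = int(days / dt)
        for _ in range(steps):
            inflow = gain_per_tweet * tweets_per_day
            outflow = churn * followers
            followers += dt * (inflow - outflow)   # Euler stock update
        return followers

    print(round(simulate_followers(5, 365)))       # approaches inflow/churn = 1000
    ```

    The stock converges toward the equilibrium where inflow equals outflow (here 10 per day gained vs. 1% per day lost, i.e., 1000 followers), which is the kind of steady-state reasoning stock-and-flow tools make explicit.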

  12. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    Computer Based Modelling and Simulation - Modelling Deterministic Systems. N K Srinivasan. General Article, Resonance – Journal of Science Education, Volume 6, Issue 3, March 2001, pp 46-54.

  13. Suitability of Modern Software Development Methodologies for Model Driven Development

    Directory of Open Access Journals (Sweden)

    Ruben Picek


    Full Text Available As an answer to today's growing challenges in the software industry, a wide spectrum of new approaches to software development has emerged. One prominent direction is the currently most promising software development paradigm called Model Driven Development (MDD). Despite considerable skepticism and open problems, the MDD paradigm is being used and improved to realize its many inherent potential benefits. A methodological approach to software development requires some kind of development process. Modern methodologies can be classified into two main categories: formal (heavyweight) and agile (lightweight). But when it comes to MDD and a development process for MDD, currently known methodologies are very poor or, better said, offer no explanation of the MDD process. As the result of this research, the author examines in this paper the possibilities of using existing modern software methodologies in the context of the MDD paradigm.

  14. Dark Energy Survey Year 1 Results: Multi-Probe Methodology and Simulated Likelihood Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Krause, E.; et al.


    We present the methodology for and detail the implementation of the Dark Energy Survey (DES) 3x2pt DES Year 1 (Y1) analysis, which combines configuration-space two-point statistics from three different cosmological probes: cosmic shear, galaxy-galaxy lensing, and galaxy clustering, using data from the first year of DES observations. We have developed two independent modeling pipelines and describe the code validation process. We derive expressions for analytical real-space multi-probe covariances, and describe their validation with numerical simulations. We stress-test the inference pipelines in simulated likelihood analyses that vary 6-7 cosmology parameters plus 20 nuisance parameters and precisely resemble the analysis to be presented in the DES 3x2pt analysis paper, using a variety of simulated input data vectors with varying assumptions. We find that any disagreement between pipelines leads to changes in assigned likelihood $\\Delta \\chi^2 \\le 0.045$ with respect to the statistical error of the DES Y1 data vector. We also find that angular binning and survey mask do not impact our analytic covariance at a significant level. We determine lower bounds on scales used for analysis of galaxy clustering (8 Mpc$~h^{-1}$) and galaxy-galaxy lensing (12 Mpc$~h^{-1}$) such that the impact of modeling uncertainties in the non-linear regime is well below statistical errors, and show that our analysis choices are robust against a variety of systematics. These tests demonstrate that we have a robust analysis pipeline that yields unbiased cosmological parameter inferences for the flagship 3x2pt DES Y1 analysis. We emphasize that the level of independent code development and subsequent code comparison as demonstrated in this paper is necessary to produce credible constraints from increasingly complex multi-probe analyses of current data.
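
    The pipeline-comparison statistic quoted above, a chi-square distance between two theory vectors measured against the data covariance, can be sketched in a few lines. The three-bin data vectors and diagonal covariance below are invented for illustration; a real 3x2pt analysis uses hundreds of bins and a dense analytic covariance.

    ```python
    import numpy as np

    def delta_chi2(model_a, model_b, cov):
        """Chi-square distance between two pipeline predictions,
        measured against the data covariance (illustrative only)."""
        r = np.asarray(model_a) - np.asarray(model_b)
        return float(r @ np.linalg.solve(cov, r))

    # toy 3-bin data vectors with a diagonal covariance
    cov = np.diag([0.04, 0.09, 0.01])
    a = np.array([1.00, 2.00, 3.00])
    b = np.array([1.02, 1.97, 3.01])
    print(delta_chi2(a, b, cov))  # → 0.03
    ```

    A small value relative to the number of data points indicates that the two independently developed pipelines agree to well within the statistical error, which is the criterion the abstract applies with its Δχ² ≤ 0.045 threshold.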

  15. Methodology for Modeling Building Energy Performance across the Commercial Sector

    Energy Technology Data Exchange (ETDEWEB)

    Griffith, B.; Long, N.; Torcellini, P.; Judkoff, R.; Crawley, D.; Ryan, J.


    This report uses EnergyPlus simulations of each building in the 2003 Commercial Buildings Energy Consumption Survey (CBECS) to document and demonstrate bottom-up methods of modeling the entire U.S. commercial buildings sector (EIA 2006). The ability to use a whole-building simulation tool to model the entire sector is of interest because the energy models enable us to answer subsequent 'what-if' questions that involve technologies and practices related to energy. This report documents how the whole-building models were generated from the building characteristics in 2003 CBECS and compares the simulation results to the survey data for energy use.

  16. Validation of response simulation methodology of Albedo dosemeter; Validacao da metodologia de simulacao de resposta de dosimetro de Albedo

    Energy Technology Data Exchange (ETDEWEB)

    Freitas, B.M.; Silva, A.X. da, E-mail: [Coordenacao do Programas de Pos-Graduacao em Engenharia (COPPE/UFRJ), Rio de Janeiro, RJ (Brazil). Programa de Engenharia Nuclear; Mauricio, C.L.P. [Instituto de Radioprotecao e Dosimetria (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil)


    The Instituto de Radioprotecao e Dosimetria developed and runs a neutron TLD albedo individual monitoring service. To optimize the dose calculation algorithm and to infer new calibration factors, the response of this dosemeter was simulated. To validate the employed methodology, it was applied to a problem from the QUADOS (Quality Assurance of Computational Tools for Dosimetry) intercomparison, which was aimed at evaluating dosimetric problems, one of them being to calculate the response of a generic albedo dosemeter. The obtained results were compared with those of other modelling efforts and with the reference one, with good agreement. (author)

  17. Vehicle dynamics modeling and simulation

    CERN Document Server

    Schramm, Dieter; Bardini, Roberto


    The authors examine in detail the fundamentals and mathematical descriptions of the dynamics of automobiles. In this context different levels of complexity will be presented, starting with basic single-track models up to complex three-dimensional multi-body models. A particular focus is on the process of establishing mathematical models on the basis of real cars and the validation of simulation results. The methods presented are explained in detail by means of selected application scenarios.

  18. Stochastic models: theory and simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Field, Richard V., Jr.


    Many problems in applied science and engineering involve physical phenomena that behave randomly in time and/or space. Examples are diverse and include turbulent flow over an aircraft wing, Earth climatology, material microstructure, and the financial markets. Mathematical models for these random phenomena are referred to as stochastic processes and/or random fields, and Monte Carlo simulation is the only general-purpose tool for solving problems of this type. The use of Monte Carlo simulation requires methods and algorithms to generate samples of the appropriate stochastic model; these samples then become inputs and/or boundary conditions to established deterministic simulation codes. While numerous algorithms and tools currently exist to generate samples of simple random variables and vectors, no cohesive simulation tool yet exists for generating samples of stochastic processes and/or random fields. There are two objectives of this report. First, we provide some theoretical background on stochastic processes and random fields that can be used to model phenomena that are random in space and/or time. Second, we provide simple algorithms that can be used to generate independent samples of general stochastic models. The theory and simulation of random variables and vectors is also reviewed for completeness.
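
    One of the simplest sample-generation algorithms of the kind this report discusses is Cholesky factorization of the covariance matrix of a Gaussian process. The squared-exponential covariance and parameter values below are a generic choice for illustration, not taken from the report.

    ```python
    import numpy as np

    def sample_gaussian_process(t, corr_len, n_samples, seed=0):
        """Draw samples of a zero-mean, unit-variance stationary Gaussian
        process with squared-exponential covariance, via the Cholesky
        factor of the covariance matrix: x = L z with z ~ N(0, I)."""
        t = np.asarray(t, dtype=float)
        C = np.exp(-0.5 * ((t[:, None] - t[None, :]) / corr_len) ** 2)
        L = np.linalg.cholesky(C + 1e-8 * np.eye(len(t)))  # jitter for stability
        rng = np.random.default_rng(seed)
        return L @ rng.standard_normal((len(t), n_samples))

    paths = sample_gaussian_process(np.linspace(0, 1, 50), 0.2, 1000)
    print(paths.shape)  # → (50, 1000)
    ```

    Each column is one independent realization of the process; such samples would then serve as random inputs or boundary conditions to a deterministic simulation code, as the report describes.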

  19. Facebook's personal page modelling and simulation (United States)

    Sarlis, Apostolos S.; Sakas, Damianos P.; Vlachos, D. S.


    In this paper we will try to define the utility of Facebook's Personal Page marketing method. This tool that Facebook provides, is modelled and simulated using iThink in the context of a Facebook marketing agency. The paper has leveraged the system's dynamic paradigm to conduct Facebook marketing tools and methods modelling, using iThink™ system to implement them. It uses the design science research methodology for the proof of concept of the models and modelling processes. The following model has been developed for a social media marketing agent/company, Facebook platform oriented and tested in real circumstances. This model is finalized through a number of revisions and iterators of the design, development, simulation, testing and evaluation processes. The validity and usefulness of this Facebook marketing model for the day-to-day decision making are authenticated by the management of the company organization. Facebook's Personal Page method can be adjusted, depending on the situation, in order to maximize the total profit of the company which is to bring new customers, keep the interest of the old customers and deliver traffic to its website.

  20. Modelling, simulating and optimizing Boilers

    DEFF Research Database (Denmark)

    Sørensen, Kim; Condra, Thomas Joseph; Houbak, Niels


    This paper describes the modelling, simulating and optimizing, including experimental verification, being carried out as part of a Ph.D. project written and supervised by the authors. The work covers dynamic performance of both water-tube boilers and fire-tube boilers. A detailed dynamic model...... of the boiler has been developed and simulations carried out by means of the Matlab integration routines. The model is prepared as a dynamic model consisting of both ordinary differential equations and algebraic equations, together formulated as a Differential-Algebraic-Equation system. Being able to operate...... freedom with respect to dynamic operation of the plant. By means of an objective function including both the price of the plant and a quantification of the value of dynamic operation of the plant, an optimization is carried out. The dynamic model of the boiler plant is applied to define parts...



  2. Recursive modular modelling methodology for lumped-parameter dynamic systems (United States)

    Orsino, Renato Maia Matarazzo


    This paper proposes a novel approach to the modelling of lumped-parameter dynamic systems, based on representing them by hierarchies of mathematical models of increasing complexity instead of a single (complex) model. Exploring the multilevel modularity that these systems typically exhibit, a general recursive modelling methodology is proposed, in order to conciliate the use of the already existing modelling techniques. The general algorithm is based on a fundamental theorem that states the conditions for computing projection operators recursively. Three procedures for these computations are discussed: orthonormalization, use of orthogonal complements and use of generalized inverses. The novel methodology is also applied for the development of a recursive algorithm based on the Udwadia-Kalaba equation, which proves to be identical to the one of a Kalman filter for estimating the state of a static process, given a sequence of noiseless measurements representing the constraints that must be satisfied by the system.
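
    The Udwadia-Kalaba equation named above gives the acceleration of a constrained system as q̈ = a + M^(-1/2) (A M^(-1/2))⁺ (b − A a), where a is the unconstrained acceleration and A q̈ = b is the acceleration-level constraint. A minimal sketch, using the Moore-Penrose pseudoinverse (one of the generalized-inverse routes the paper discusses) and assuming a diagonal mass matrix so that M^(1/2) is trivial:

    ```python
    import numpy as np

    def udwadia_kalaba(M, Qu, A, b):
        """Acceleration of a constrained system via the Udwadia-Kalaba
        equation. M: mass matrix (assumed diagonal here), Qu: applied
        force, A qdd = b: the constraint at acceleration level."""
        a = np.linalg.solve(M, Qu)                 # unconstrained acceleration
        Mih = np.diag(1.0 / np.sqrt(np.diag(M)))   # M^(-1/2), diagonal M only
        B = A @ Mih
        return a + Mih @ np.linalg.pinv(B) @ (b - A @ a)

    # unit-mass particle under gravity, constrained to stay on y'' = 0
    M = np.eye(2)
    Qu = np.array([0.0, -9.81])
    A = np.array([[0.0, 1.0]])
    b = np.array([0.0])
    qdd = udwadia_kalaba(M, Qu, A, b)
    print(qdd)  # → [0. 0.]
    ```

    The constraint force exactly cancels gravity in this toy case; the paper's contribution is applying this kind of projection recursively across a hierarchy of subsystem models.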

  3. Methodological approach to simulation and choice of ecologically efficient and energetically economic wind turbines (WT) (United States)

    Bespalov, Vadim; Udina, Natalya; Samarskaya, Natalya


    The use of wind energy is one of the prospective directions among renewable energy sources. The article reviews a methodological approach to the simulation and selection, at the design stage, of ecologically efficient and energetically economic wind turbines, taking into account the characteristics of the natural-territorial complex and the peculiarities of the anthropogenic load in the territory of the WT location.

  4. Methodology of modeling fiber reinforcement in concrete elements

    NARCIS (Netherlands)

    Stroeven, P.


    This paper’s focus is on the modeling methodology of (steel) fiber reinforcement in concrete. The orthogonal values of fiber efficiency are presented. Bulk as well as boundary situations are covered. Fiber structure is assumed due to external compaction by vibration to display a partially linear

  5. Modeling control in manufacturing simulation

    NARCIS (Netherlands)

    Zee, Durk-Jouke van der; Chick, S.; Sánchez, P.J.; Ferrin, D.; Morrice, D.J.


    A significant shortcoming of traditional simulation languages is the lack of attention paid to the modeling of control structures, i.e., the humans or systems responsible for manufacturing planning and control, their activities and the mutual tuning of their activities. Mostly they are hard coded

  6. Methodological assessment of kinetic Monte Carlo simulations of organic photovoltaic devices: the treatment of electrostatic interactions. (United States)

    Casalegno, Mosè; Raos, Guido; Po, Riccardo


    The kinetic Monte Carlo (KMC) method provides a versatile tool to investigate the mechanisms underlying photocurrent generation in nanostructured organic solar cells. Currently available algorithms can already support the development of more cost-efficient photovoltaic devices, but so far no attempt has been made to test the validity of some fundamental model assumptions and their impact on the simulation result. A meaningful example is given by the treatment of the electrostatic interactions. In most KMC models, electrostatic interactions are approximated by means of cutoff based potentials, irrespective of the long-range nature of the Coulomb interaction. In this paper, the reliability of such approximation is tested against the exact Ewald sum. The results under short-circuit and flat-band conditions show that use of cutoff-based potentials tends to underestimate real device performance, in terms of internal quantum efficiency and current density. Together with this important finding, we formalize other methodological aspects which have been scarcely discussed in the literature.
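
    The issue the paper raises, truncating the long-range Coulomb interaction with a cutoff, can be illustrated on a toy charge configuration. This sketch only contrasts truncated versus complete pairwise summation in open boundaries; a real Ewald sum additionally handles periodic images, which is beyond a few lines.

    ```python
    import numpy as np

    def coulomb_energy(pos, q, cutoff=None):
        """Pairwise Coulomb energy (unit prefactor) with an optional
        distance cutoff -- the approximation questioned in the paper."""
        E = 0.0
        n = len(q)
        for i in range(n):
            for j in range(i + 1, n):
                r = np.linalg.norm(pos[i] - pos[j])
                if cutoff is None or r <= cutoff:
                    E += q[i] * q[j] / r
        return E

    # 1D chain of alternating unit charges, spacing 1
    pos = np.array([[float(i), 0.0, 0.0] for i in range(8)])
    q = np.array([(-1.0) ** i for i in range(8)])
    full = coulomb_energy(pos, q)            # complete sum
    short = coulomb_energy(pos, q, cutoff=2.0)
    print(full, short)
    ```

    Even in this tiny system the cutoff changes the binding energy noticeably (the full sum is more negative than the truncated one), which is the qualitative reason cutoff-based KMC potentials can misestimate device performance.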

  7. Agent-based Modeling Methodology for Analyzing Weapons Systems (United States)


    1998) published similarities between ABM and the more traditional equation-based methods of simulation and developed criteria for choosing one... analysis method of the effects of a new weapon on tactics and combat decision making by modeling flexible agent behaviors in a mission-level combat... adaptive system (Bullock, McIntyre, & Hill, 2000). There are several statistical methods available for conducting analysis of simulation models (Law...

  8. [Use of simulation-based methodologies for teaching and learning in Portuguese medical schools]. (United States)

    Reynolds, Ana; Campos, D Ayres de; Bernardes, João


    The main purpose of medical simulation is for students and healthcare professionals to learn, individually or as team members. A questionnaire was developed on the use of medical simulators or simulation-based techniques applied to Medicine and sent to the directors of all medical schools in Portugal (n = 7). The aim was to contribute to a better understanding of teaching through the use of simulation applied to Medicine. In the curricular year of 2006-07, all medical schools used simulators or techniques of medical simulation in their pre-graduate training in Medicine. A small number of other initiatives in pre- and post-graduate medical training were also reported. Despite these activities, there is still a large potential for expansion of simulation-based teaching methodologies in Portuguese medical schools. The growing number of students admitted to medical courses, together with the increase in medico-legal conflicts, leads to a need for curricular developments and adjustments in teaching methodologies.

  9. Effective World Modeling: Multisensor Data Fusion Methodology for Automated Driving

    Directory of Open Access Journals (Sweden)

    Jos Elfring


    Full Text Available The number of perception sensors on automated vehicles increases due to the increasing number of advanced driver assistance system functions and their increasing complexity. Furthermore, fail-safe systems require redundancy, thereby increasing the number of sensors even further. A one-size-fits-all multisensor data fusion architecture is not realistic due to the enormous diversity in vehicles, sensors and applications. As an alternative, this work presents a methodology that can be used to effectively come up with an implementation to build a consistent model of a vehicle’s surroundings. The methodology is accompanied by a software architecture. This combination minimizes the effort required to update the multisensor data fusion system whenever sensors or applications are added or replaced. A series of real-world experiments involving different sensors and algorithms demonstrates the methodology and the software architecture.

  10. Effective World Modeling: Multisensor Data Fusion Methodology for Automated Driving. (United States)

    Elfring, Jos; Appeldoorn, Rein; van den Dries, Sjoerd; Kwakkernaat, Maurice


    The number of perception sensors on automated vehicles increases due to the increasing number of advanced driver assistance system functions and their increasing complexity. Furthermore, fail-safe systems require redundancy, thereby increasing the number of sensors even further. A one-size-fits-all multisensor data fusion architecture is not realistic due to the enormous diversity in vehicles, sensors and applications. As an alternative, this work presents a methodology that can be used to effectively come up with an implementation to build a consistent model of a vehicle's surroundings. The methodology is accompanied by a software architecture. This combination minimizes the effort required to update the multisensor data fusion system whenever sensors or applications are added or replaced. A series of real-world experiments involving different sensors and algorithms demonstrates the methodology and the software architecture.

  11. Viscoelastic flow simulations in model porous media (United States)

    De, S.; Kuipers, J. A. M.; Peters, E. A. J. F.; Padding, J. T.


    We investigate the flow of an unsteady three-dimensional viscoelastic fluid through an array of symmetric and asymmetric sets of cylinders constituting a model porous medium. The simulations are performed using a finite-volume methodology with a staggered grid. The solid-fluid interfaces of the porous structure are modeled using a second-order immersed boundary method [S. De et al., J. Non-Newtonian Fluid Mech. 232, 67 (2016), 10.1016/j.jnnfm.2016.04.002]. A finitely extensible nonlinear elastic constitutive model with Peterlin closure is used to model the viscoelastic part. By means of periodic boundary conditions, we model the flow behavior for a Newtonian as well as a viscoelastic fluid through successive contractions and expansions. We observe the presence of counterrotating vortices in the dead ends of our geometry. The simulations provide detailed insight into how flow structure, viscoelastic stresses, and viscoelastic work change with increasing Deborah number De. We observe completely different flow structures and different distributions of the viscoelastic work at high De in the symmetric and asymmetric configurations, even though they have the exact same porosity. Moreover, we find that even for the symmetric contraction-expansion flow, most energy dissipation is occurring in shear-dominated regions of the flow domain, not in extensional-flow-dominated regions.

  12. A coupled groundwater-flow-modelling and vulnerability-mapping methodology for karstic terrain management (United States)

    Kavouri, Konstantina P.; Karatzas, George P.; Plagnes, Valérie


    A coupled groundwater-flow-modelling and vulnerability-mapping methodology for the management of karst aquifers with spatial variability is developed. The methodology takes into consideration the duality of flow and recharge in karst and introduces a simple method to integrate the effect of temporal storage in the unsaturated zone. In order to investigate the applicability of the developed methodology, simulation results are validated against available field measurement data. The criteria maps from the PaPRIKa vulnerability-mapping method are used to document the groundwater flow model. The FEFLOW model is employed for the simulation of the saturated zone of Palaikastro-Chochlakies karst aquifer, in the island of Crete, Greece, for the hydrological years 2010-2012. The simulated water table reproduces typical karst characteristics, such as steep slopes and preferred drain axes, and is in good agreement with field observations. Selected calculated error indicators—Nash-Sutcliffe efficiency (NSE), root mean squared error (RMSE) and model efficiency (E')—are within acceptable value ranges. Results indicate that different storage processes take place in different parts of the aquifer. The north-central part seems to be more sensitive to diffuse recharge, while the southern part is affected primarily by precipitation events. Sensitivity analysis is performed on the parameters of hydraulic conductivity and specific yield. The methodology is used to estimate the feasibility of artificial aquifer recharge (AAR) at the study area. Based on the developed methodology, guidelines were provided for the selection of the appropriate AAR scenario that has positive impact on the water table.
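
    The error indicators named in the abstract are standard and compact to compute. A sketch with invented water-table levels (the study's actual field data are not reproduced here):

    ```python
    import numpy as np

    def nse(obs, sim):
        """Nash-Sutcliffe efficiency: 1 - SSE / variance-about-mean of the
        observations. 1 is a perfect fit; 0 is no better than the mean."""
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

    def rmse(obs, sim):
        """Root mean squared error, in the units of the observations."""
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        return float(np.sqrt(np.mean((obs - sim) ** 2)))

    heads_obs = [12.1, 11.8, 12.5, 13.0, 12.2]   # illustrative levels, m
    heads_sim = [12.0, 11.9, 12.4, 12.8, 12.3]
    print(nse(heads_obs, heads_sim), rmse(heads_obs, heads_sim))
    ```

    An NSE close to 1 with an RMSE small relative to the observed water-table fluctuations is the kind of "acceptable value range" the validation refers to.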

  13. Modeling and Simulation of Nanoindentation (United States)

    Huang, Sixie; Zhou, Caizhi


    Nanoindentation is a hardness test method applied to small volumes of material which can provide some unique effects and spark many related research activities. To fully understand the phenomena observed during nanoindentation tests, modeling and simulation methods have been developed to predict the mechanical response of materials during nanoindentation. However, challenges remain with those computational approaches, because of their length scale, predictive capability, and accuracy. This article reviews recent progress and challenges for modeling and simulation of nanoindentation, including an overview of molecular dynamics, the quasicontinuum method, discrete dislocation dynamics, and the crystal plasticity finite element method, and discusses how to integrate multiscale modeling approaches seamlessly with experimental studies to understand the length-scale effects and microstructure evolution during nanoindentation tests, creating a unique opportunity to establish new calibration procedures for the nanoindentation technique.

  14. Model for Simulation Atmospheric Turbulence

    DEFF Research Database (Denmark)

    Lundtang Petersen, Erik


    A method that produces realistic simulations of atmospheric turbulence is developed and analyzed. The procedure makes use of a generalized spectral analysis, often called a proper orthogonal decomposition or the Karhunen-Loève expansion. A set of criteria, emphasizing a realistic appearance......, a correct spectral shape, and non-Gaussian statistics, is selected in order to evaluate the model turbulence. An actual turbulence record is analyzed in detail providing both a standard for comparison and input statistics for the generalized spectral analysis, which in turn produces a set of orthonormal....... The method is unique in modeling the three velocity components simultaneously, and it is found that important cross-statistical features are reasonably well-behaved. It is concluded that the model provides a practical, operational simulator of atmospheric turbulence....
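
    The proper orthogonal decomposition (Karhunen-Loève expansion) the abstract names amounts to an SVD of the mean-removed data matrix. The "turbulence record" below is synthetic (a randomly phased travelling wave plus noise), chosen only so the mode structure is easy to verify:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    # synthetic record: 200 snapshots of a 64-point velocity signal
    x = np.linspace(0, 2 * np.pi, 64)
    snapshots = np.array([np.sin(x + p) + 0.1 * rng.standard_normal(64)
                          for p in rng.uniform(0, 2 * np.pi, 200)])

    # POD / Karhunen-Loeve: SVD of the mean-removed data matrix
    fluct = snapshots - snapshots.mean(axis=0)
    U, s, Vt = np.linalg.svd(fluct, full_matrices=False)
    energy = s ** 2 / np.sum(s ** 2)       # fractional energy per mode
    print(energy[:2].sum())                # leading pair captures the wave
    ```

    The rows of `Vt` are the orthonormal modes; resynthesizing fields from the leading modes with random coefficients is the basic mechanism behind simulators of this type.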

  15. Assessment of Molecular Modeling & Simulation

    Energy Technology Data Exchange (ETDEWEB)



    This report reviews the development and applications of molecular and materials modeling in Europe and Japan in comparison to those in the United States. Topics covered include computational quantum chemistry, molecular simulations by molecular dynamics and Monte Carlo methods, mesoscale modeling of material domains, molecular-structure/macroscale property correlations like QSARs and QSPRs, and related information technologies like informatics and special-purpose molecular-modeling computers. The panel's findings include the following: The United States leads this field in many scientific areas. However, Canada has particular strengths in DFT methods and homogeneous catalysis; Europe in heterogeneous catalysis, mesoscale, and materials modeling; and Japan in materials modeling and special-purpose computing. Major government-industry initiatives are underway in Europe and Japan, notably in multi-scale materials modeling and in development of chemistry-capable ab-initio molecular dynamics codes.

  16. Simulation Framework for Teaching in Modeling and Simulation Areas (United States)

    De Giusti, Marisa Raquel; Lira, Ariel Jorge; Villarreal, Gonzalo Lujan


    Simulation is the process of executing a model that describes a system with enough detail; this model has its entities, an internal state, some input and output variables and a list of processes bound to these variables. Teaching a simulation language such as general purpose simulation system (GPSS) is always a challenge, because of the way it…
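
    The classic first GPSS teaching model (GENERATE → SEIZE → ADVANCE → RELEASE) is a single-server queue. A minimal Python sketch of the same system, written as a Lindley-style recurrence rather than a full event list, with invented arrival and service rates:

    ```python
    import random

    def mm1(arrival_rate, service_rate, n_customers, seed=42):
        """Single-server queue with exponential interarrival and service
        times; returns the mean time customers wait before service."""
        rng = random.Random(seed)
        t_arrival, server_free, waits = 0.0, 0.0, []
        for _ in range(n_customers):
            t_arrival += rng.expovariate(arrival_rate)   # GENERATE
            start = max(t_arrival, server_free)          # SEIZE when free
            waits.append(start - t_arrival)
            server_free = start + rng.expovariate(service_rate)  # ADVANCE/RELEASE
        return sum(waits) / len(waits)

    print(mm1(0.5, 1.0, 50_000))  # theory: Wq = 0.5 / (1.0 - 0.5) = 1.0
    ```

    Comparing the simulated mean wait against the analytical M/M/1 result is a standard classroom exercise when teaching simulation languages such as GPSS.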

  17. An experimental methodology for a fuzzy set preference model (United States)

    Turksen, I. B.; Willson, Ian A.


    A flexible fuzzy set preference model first requires approximate methodologies for implementation. Fuzzy sets must be defined for each individual consumer using computer software, requiring a minimum of time and expertise on the part of the consumer. The amount of information needed in defining sets must also be established. The model itself must adapt fully to the subject's choice of attributes (vague or precise), attribute levels, and importance weights. The resulting individual-level model should be fully adapted to each consumer. The methodologies needed to develop this model will be equally useful in a new generation of intelligent systems which interact with ordinary consumers, controlling electronic devices through fuzzy expert systems or making recommendations based on a variety of inputs. The power of personal computers and their acceptance by consumers has yet to be fully utilized to create interactive knowledge systems that fully adapt their function to the user. Understanding individual consumer preferences is critical to the design of new products and the estimation of demand (market share) for existing products, which in turn is an input to management systems concerned with production and distribution. The question of what to make, for whom to make it and how much to make requires an understanding of the customer's preferences and the trade-offs that exist between alternatives. Conjoint analysis is a widely used methodology which decomposes an overall preference for an object into a combination of preferences for its constituent parts (attributes such as taste and price), which are combined using an appropriate combination function. Preferences are often expressed using linguistic terms which cannot be represented in conjoint models. Current models are also not implemented at an individual level, making it difficult to reach meaningful conclusions about the cause of an individual's behavior from an aggregate model. 
The combination of complex aggregate
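
    The conjoint decomposition described above, splitting an overall rating into part-worth utilities of attribute levels, reduces to a small least-squares fit when the combination function is additive. The profiles, ratings, and attribute names below are invented for illustration:

    ```python
    import numpy as np

    # Conjoint analysis sketch: decompose overall ratings of four product
    # profiles into part-worth utilities via dummy-coded least squares.
    profiles = [("good", "low"), ("good", "high"), ("poor", "low"), ("poor", "high")]
    ratings = np.array([9.0, 6.0, 5.0, 2.0])     # hypothetical overall preferences

    # columns: intercept, taste == "good", price == "low"
    X = np.array([[1.0, taste == "good", price == "low"]
                  for taste, price in profiles])
    coef, *_ = np.linalg.lstsq(X, ratings, rcond=None)
    intercept, worth_taste, worth_price = coef
    print(worth_taste, worth_price)  # → 4.0 3.0
    ```

    The fitted part-worths say good taste is worth 4 rating points and a low price 3, recovering the additive structure exactly; the abstract's point is that crisp dummy codes like these cannot represent vague linguistic preference terms, which motivates the fuzzy-set extension.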

  18. A Methodology to Assess Ionospheric Models for GNSS (United States)

    Rovira-Garcia, Adria; Juan, José Miguel; Sanz, Jaume; González-Casado, Guillermo; Ibánez, Deimos


    Testing the accuracy of the ionospheric models used in the Global Navigation Satellite System (GNSS) is a long-standing issue. It is still a challenging problem due to the lack of sufficiently accurate slant ionospheric determinations to be used as a reference. The present study proposes a methodology to assess any ionospheric model used in satellite-based applications and, in particular, GNSS ionospheric models. The methodology complements other analyses comparing the navigation based on different models to correct the code and carrier-phase observations. Specifically, the following ionospheric models are assessed: the operational models broadcast in the Global Positioning System (GPS), Galileo and the European Geostationary Navigation Overlay System (EGNOS), the post-process Global Ionospheric Maps (GIMs) from different analysis centers belonging to the International GNSS Service (IGS) and, finally, a new GIM computed by the gAGE/UPC research group. The methodology is based on the comparison between the predictions of the ionospheric model and actual unambiguous carrier-phase measurements from a global distribution of permanent receivers. The differences are separated into the hardware delays (a receiver constant plus a satellite constant) per data interval, e.g., a day. The condition that these Differential Code Biases (DCBs) are commonly shared throughout the world-wide network of receivers and satellites provides a global character to the assessment. This approach generalizes simple tests based on double differenced Slant Total Electron Contents (STECs) between pairs of satellites and receivers on a much more local scale. The present study has been conducted during the entire year 2014, i.e., the last Solar Maximum. The seasonal and latitudinal structures of the results clearly reflect the different strategies used by the different models. 
On one hand, ionospheric model corrections based on a grid (IGS-GIMs or EGNOS) are shown to be several times better than the models
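The bias-separation step described in this record (splitting model-minus-measurement differences into a per-receiver plus a per-satellite constant over one data interval) can be posed as a small linear least-squares problem. The sketch below is a hedged illustration with synthetic numbers and hypothetical network dimensions, not the study's actual processing:

```python
import numpy as np

# Hypothetical sketch: separate model-minus-measurement ionospheric
# differences into per-receiver and per-satellite constant biases (DCBs)
# for one data interval (e.g. one day), as a linear least-squares problem.
rng = np.random.default_rng(0)
n_rec, n_sat = 4, 6
true_rec = rng.normal(0, 2, n_rec)   # simulated receiver biases (TECU)
true_sat = rng.normal(0, 2, n_sat)   # simulated satellite biases (TECU)

rows, obs = [], []
for r in range(n_rec):
    for s in range(n_sat):
        a = np.zeros(n_rec + n_sat)
        a[r] = 1.0                   # receiver-constant column
        a[n_rec + s] = 1.0           # satellite-constant column
        rows.append(a)
        obs.append(true_rec[r] + true_sat[s] + rng.normal(0, 0.1))
A, y = np.array(rows), np.array(obs)

# The system is rank-deficient (a common offset can move freely between
# the receiver and satellite sets), so fix the datum with an extra row
# constraining the satellite biases to sum to zero.
C = np.zeros(n_rec + n_sat)
C[n_rec:] = 1.0
A_aug = np.vstack([A, C])
y_aug = np.append(y, 0.0)
est, *_ = np.linalg.lstsq(A_aug, y_aug, rcond=None)

resid = y - A @ est
print("rms residual (TECU):", round(float(np.sqrt(np.mean(resid**2))), 3))
```

The small residual left after removing the two estimated constants is what the study uses to judge the ionospheric model itself.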

  19. Organizational information assets classification model and security architecture methodology

    Directory of Open Access Journals (Sweden)

    Mostafa Tamtaji


    Full Text Available Today, organizations are exposed to a huge diversity of information and information assets produced in different systems such as knowledge management systems (KMS), financial and accounting systems, office and industrial automation systems and so on, and protection of this information is necessary. Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released. The several benefits of this model have created a strong trend among organizations toward implementing cloud computing. Maintaining and managing information security is the main challenge in developing and accepting this model. In this paper, following the design science research methodology, and compatible with the design process in information systems research, a complete categorization of organizational assets is first presented, comprising 355 different types of information assets in 7 groups and 3 levels, so that managers can plan the corresponding security controls according to the importance of each group. Then, to direct an organization in architecting its information security in a cloud computing environment, an appropriate methodology is presented. The presented cloud computing security architecture, the resulting proposed methodology, and the presented classification model were discussed and verified using the Delphi method and expert comments.

  20. A methodology to calibrate pedestrian walker models using multiple objectives

    NARCIS (Netherlands)

    Campanella, M.C.; Daamen, W.; Hoogendoorn, S.P.


    The application of walker models to simulate real situations requires accuracy in several traffic situations. One strategy for obtaining a generic model is to calibrate the parameters in several situations using multiple-objective functions in the optimization process. In this paper, we propose a general

  1. Standard for Models and Simulations (United States)

    Steele, Martin J.


    This NASA Technical Standard establishes uniform practices in modeling and simulation to ensure that essential requirements are applied to the design, development, and use of models and simulations (MS), while ensuring that acceptance criteria are defined by the program or project and approved by the responsible Technical Authority. It also provides an approved set of requirements, recommendations, and criteria with which MS may be developed, accepted, and used in support of NASA activities. As the MS disciplines employed and application areas involved are broad, the common aspects of MS across all NASA activities are addressed. The discipline-specific details of a given MS should be obtained from relevant recommended practices. The primary purpose is to reduce the risks associated with MS-influenced decisions by ensuring the complete communication of the credibility of MS results.

  2. Beyond Modeling: All-Atom Olfactory Receptor Model Simulations

    Directory of Open Access Journals (Sweden)

    Peter C Lai


    Full Text Available Olfactory receptors (ORs) are a type of G protein-coupled receptor (GPCR). These receptors are responsible for mediating the sense of smell through their interaction with odorant ligands. OR-odorant interactions mark the first step in the process that leads to olfaction. Computational studies on model OR structures can validate experimental functional studies as well as generate focused and novel hypotheses for further bench investigation by providing a view of these interactions at the molecular level. Here we have shown the specific advantages of simulating the dynamic environment that is associated with OR-odorant interactions. We present a rigorous methodology that ranges from the creation of a computationally derived model of an olfactory receptor to simulating the interactions between an OR and an odorant molecule. Given the ubiquitous occurrence of GPCRs in the membranes of cells, we anticipate that our OR-developed methodology will serve as a model for the computational structural biology of all GPCRs.

  3. A framework for multiscale and multiscience modeling and numerical simulations

    NARCIS (Netherlands)

    Chopard, B.; Falcone, J.-L.; Hoekstra, A.G.; Borgdorff, J.


    The Complex Automata (CxA) methodology offers a new framework to develop multiscale and multiscience numerical simulations. The CxA approach assumes that a multiscale model can be formulated in terms of several coupled single-scale submodels. With concepts such as the scale separation map, the

  4. Telco Clouds: Modelling and Simulation


    Krzywda, Jakub; Tärneberg, William; Östberg, Per-Olov; Kihl, Maria; Elmroth, Erik


    In this paper, we propose a telco cloud meta-model that can be used to simulate different infrastructure configurations and explore their consequences on the system performance and costs. To achieve this, we analyse current telecommunication and data centre infrastructure paradigms, describe the architecture of the telco cloud and detail the benefits of merging both infrastructures in a unified system. Next, we detail the dynamics of the telco cloud and identify the components that are the ...

  5. Towards an in-plane methodology to track breast lesions using mammograms and patient-specific finite-element simulations (United States)

    Lapuebla-Ferri, Andrés; Cegoñino-Banzo, José; Jiménez-Mocholí, Antonio-José; Pérez del Palomar, Amaya


    In breast cancer screening or diagnosis, it is usual to combine different images in order to locate a lesion as accurately as possible. These images are generated using a single imaging technique or several of them. As x-ray-based mammography is widely used, a breast lesion is located in the plane of the image (mammogram), but tracking it across mammograms corresponding to different views is a challenging task for medical physicians. Accordingly, simulation tools and methodologies that use patient-specific numerical models can facilitate the task of fusing information from different images. Additionally, these tools need to be as straightforward as possible to facilitate their translation to the clinical area. This paper presents a patient-specific, finite-element-based and semi-automated simulation methodology to track breast lesions across mammograms. A realistic three-dimensional computer model of a patient’s breast was generated from magnetic resonance imaging to simulate mammographic compressions in cranio-caudal (CC, head-to-toe) and medio-lateral oblique (MLO, shoulder-to-opposite hip) directions. For each compression being simulated, a virtual mammogram was obtained and subsequently superimposed on the corresponding real mammogram, sharing the nipple as a common feature. Two-dimensional rigid-body transformations were applied, and the error distance measured between the centroids of the tumors previously located on each image was 3.84 mm and 2.41 mm for the CC and MLO compressions, respectively. Considering that the scope of this work is to conceive a methodology translatable to clinical practice, the results indicate that it could be helpful in supporting the tracking of breast lesions.

  6. Decision analytic modeling in spinal surgery: a methodologic overview with review of current published literature. (United States)

    McAnany, Steven J; Anwar, Muhammad A F; Qureshi, Sheeraz A


    In recent years, there has been an increase in the number of decision analysis studies in the spine literature. Although there are several published reviews on the different types of decision analysis (cost-effectiveness, cost-benefit, cost-utility), there is limited information in the spine literature regarding the mathematical models used in these studies (decision tree, Markov modeling, Monte Carlo simulation). The purpose of this review was to provide an overview of the types of decision analytic models used in spine surgery. A secondary aim was to provide a systematic overview of the most cited studies in the spine literature. This is a systematic review of the available information from all sources regarding decision analytics and economic modeling in spine surgery. A systematic search of PubMed, Embase, and Cochrane review was performed to identify the most relevant peer-reviewed literature on decision analysis/cost-effectiveness analysis (CEA) models, including decision trees, Markov models, and Monte Carlo simulations. Additionally, CEA models based on investigational device exemption studies were reviewed in particular detail, as these studies are prime candidates for economic modeling. The initial review of the literature resulted in 712 abstracts. After a two-reviewer assessment of abstract relevance and methodologic quality, 19 studies were selected: 12 with decision tree constructs and 7 with Markov models. Each study was assessed for methodologic quality, with a review of the overall results of the model. A generalized overview of the mathematical construction and methodology of each type of model was also performed. Limitations, strengths, and potential applications to spine research were further explored. Decision analytic modeling represents a powerful tool both in the assessment of competing treatment options and potentially in the formulation of policy and reimbursement. Our review provides a generalized overview and a conceptual framework to help
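As a hedged illustration of the Markov cohort machinery this review surveys (not a model from any of the 19 studies), the sketch below compares two hypothetical treatments over yearly cycles; all transition probabilities, utilities, and the discount rate are invented for demonstration:

```python
import numpy as np

# Minimal Markov cohort model with three health states:
# 0 = well, 1 = complication, 2 = dead (absorbing).
P_a = np.array([[0.90, 0.07, 0.03],    # yearly transition matrix, treatment A
                [0.20, 0.70, 0.10],
                [0.00, 0.00, 1.00]])
P_b = np.array([[0.85, 0.10, 0.05],    # yearly transition matrix, treatment B
                [0.25, 0.65, 0.10],
                [0.00, 0.00, 1.00]])
utility = np.array([0.85, 0.55, 0.0])  # assumed QALY weight per state

def expected_qalys(P, cycles=20, discount=0.03):
    """Push the whole cohort through `cycles` yearly transitions,
    accumulating discounted quality-adjusted life years."""
    state = np.array([1.0, 0.0, 0.0])  # everyone starts in 'well'
    total = 0.0
    for t in range(cycles):
        total += (state @ utility) / (1 + discount) ** t
        state = state @ P              # one Markov cycle
    return total

print(f"Treatment A: {expected_qalys(P_a):.2f} QALYs")
print(f"Treatment B: {expected_qalys(P_b):.2f} QALYs")
```

A real CEA would pair each arm with costs and report an incremental cost-effectiveness ratio; the cohort-propagation loop is the part the reviewed Markov studies share.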

  7. Advanced Methodology for Simulation of Complex Flows Using Structured Grid Systems (United States)

    Steinthorsson, Erlendur; Modiano, David


    Detailed simulations of viscous flows in complicated geometries pose a significant challenge to current capabilities of Computational Fluid Dynamics (CFD). To enable routine application of CFD to this class of problems, advanced methodologies are required that employ (a) automated grid generation, (b) adaptivity, (c) accurate discretizations and efficient solvers, and (d) advanced software techniques. Each of these ingredients contributes to increased accuracy, efficiency (in terms of human effort and computer time), and/or reliability of CFD software. In the long run, methodologies employing structured grid systems will remain a viable choice for routine simulation of flows in complex geometries only if genuinely automatic grid generation techniques for structured grids can be developed and if adaptivity is employed more routinely. More research in both these areas is urgently needed.

  8. Mixed-mode modelling mixing methodologies for organisational intervention

    CERN Document Server

    Clarke, Steve; Lehaney, Brian


    The 1980s and 1990s have seen a growing interest in research and practice in the use of methodologies within problem contexts characterised by a primary focus on technology, human issues, or power. During the last five to ten years, this has given rise to challenges regarding the ability of a single methodology to address all such contexts, and the consequent development of approaches which aim to mix methodologies within a single problem situation. This has been particularly so where the situation has called for a mix of technological (the so-called 'hard') and human-centred (so-called 'soft') methods. The approach developed has been termed mixed-mode modelling. The area of mixed-mode modelling is relatively new, with the phrase being coined approximately four years ago by Brian Lehaney in a keynote paper published at the 1996 Annual Conference of the UK Operational Research Society. Mixed-mode modelling, as suggested above, is a new way of considering problem situations faced by organisations. Traditional...

  9. A dynamic hybrid RANS/LES modeling methodology for turbulent/transitional flow field prediction (United States)

    Alam, Mohammad Faridul

    A dynamic hybrid Reynolds-averaged Navier-Stokes (RANS)-Large Eddy Simulation (LES) modeling framework has been investigated and further developed to improve the Computational Fluid Dynamics (CFD) prediction of turbulent flow features along with laminar-to-turbulent transitional phenomena. In recent years, the use of hybrid RANS/LES (HRL) models has become more common in CFD simulations, since HRL models offer more accuracy than RANS in regions of flow separation at a reduced cost relative to LES in attached boundary layers. The first part of this research includes evaluation and validation of a dynamic HRL (DHRL) model that aims to address issues regarding the RANS-to-LES zonal transition and explicit grid dependence, both of which are inherent to most current HRL models. Simulations of two test cases---flow over a backward facing step and flow over a wing with leading-edge ice accretion---were performed to assess the potential of the DHRL model for predicting turbulent features involved in mainly unsteady separated flow. The DHRL simulation results are compared with experimental data, along with the computational results for other HRL and RANS models. In summary, these comparisons demonstrate that the DHRL framework does address many of the weaknesses inherent in most current HRL models. Although HRL models are widely used in turbulent flow simulations, they have limitations for transitional flow predictions. Most HRL models include a fully turbulent RANS component for attached boundary layer regions. The small number of HRL models that do include transition-sensitive RANS models have issues related to the RANS model itself and to the zonal transition between RANS and LES. In order to address those issues, a new transition-sensitive HRL modeling methodology has been developed that includes the DHRL methodology and a physics-based transition-sensitive RANS model. The feasibility of the transition-sensitive dynamic HRL (TDHRL) model has been investigated by

  10. Building Energy Simulation Test for Existing Homes (BESTEST-EX) Methodology: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Judkoff, R.; Polly, B.; Bianchi, M.; Neymark, J.


    The test suite represents a set of cases applying the new Building Energy Simulation Test for Existing Homes (BESTEST-EX) Methodology developed by NREL. (Judkoff et al. 2010a). The NREL team developed the test cases in consultation with the home retrofit industry (BESTEST-EX Working Group 2009), and adjusted the test specifications in accordance with information supplied by a participant with access to large utility bill datasets (Blasnik 2009).

  11. Numerical simulation methodologies for design and development of Diffuser-Augmented Wind Turbines – analysis and comparison

    Directory of Open Access Journals (Sweden)

    Michał Lipian


    Full Text Available Different numerical computation methods used to develop a methodology for fast, efficient, reliable design and comparison of Diffuser-Augmented Wind Turbine (DAWT) geometries are presented. The demand for such methods is evident, given the multitude of geometrical parameters that influence the flow character through ducted turbines. The results of the Actuator Disk Model (ADM) simulations will be confronted with a simulation method of a higher order of accuracy, i.e. the 3D Fully-resolved Rotor Model (FRM), in the rotor design point. Both will be checked for consistency with the experimental results measured in the wind tunnel at the Institute of Turbomachinery (IMP), Lodz University of Technology (TUL). An attempt to find an efficient method (a compromise between accuracy and design time) for the flow analysis pertinent to the DAWT is the novel approach presented in this paper.

  12. Advances in Intelligent Modelling and Simulation Simulation Tools and Applications

    CERN Document Server

    Oplatková, Zuzana; Carvalho, Marco; Kisiel-Dorohinicki, Marek


    The human capacity to abstract complex systems and phenomena into simplified models has played a critical role in the rapid evolution of our modern industrial processes and scientific research. As a science and an art, Modelling and Simulation have been one of the core enablers of this remarkable human trait, and have become a topic of great importance for researchers and practitioners. This book was created to compile some of the most recent concepts, advances, challenges and ideas associated with Intelligent Modelling and Simulation frameworks, tools and applications. The first chapter discusses the important aspects of human interaction and the correct interpretation of results during simulations. The second chapter gets to the heart of the analysis of entrepreneurship by means of agent-based modelling and simulations. The following three chapters bring together the central theme of simulation frameworks, first describing an agent-based simulation framework, then a simulator for electrical machines, and...

  13. Simulating the use of products : Applying the nucleus paradigm to resource-integrated virtual interaction models

    NARCIS (Netherlands)

    Van der Vegte, W.F.; Horváth, I.; Rusák, Z.


    We introduce a methodology for modelling and simulating fully virtual human-artefact systems, aiming to resolve two issues in virtual prototyping: (i) integration of distinct modelling and simulation approaches, and (ii) extending the deployability of simulations towards conceptual design. We are

  14. Verifying and Validating Simulation Models

    Energy Technology Data Exchange (ETDEWEB)

    Hemez, Francois M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)


    This presentation is a high-level discussion of the Verification and Validation (V&V) of computational models. Definitions of V&V are given to emphasize that “validation” is never performed in a vacuum; it accounts, instead, for the current state of knowledge in the discipline considered. In particular, comparisons between physical measurements and numerical predictions should account for their respective sources of uncertainty. The differences between error (bias), aleatoric uncertainty (randomness) and epistemic uncertainty (ignorance, lack-of-knowledge) are briefly discussed. Four types of uncertainty in physics and engineering are discussed: 1) experimental variability, 2) variability and randomness, 3) numerical uncertainty and 4) model-form uncertainty. Statistical sampling methods are available to propagate, and analyze, variability and randomness. Numerical uncertainty originates from the truncation error introduced by the discretization of partial differential equations in time and space. Model-form uncertainty is introduced by assumptions often formulated to render a complex problem more tractable and amenable to modeling and simulation. The discussion concludes with high-level guidance to assess the “credibility” of numerical simulations, which stems from the level of rigor with which these various sources of uncertainty are assessed and quantified.
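The statistical sampling mentioned for propagating variability and randomness can be illustrated with a minimal Monte Carlo sketch. The cantilever-beam deflection model and the input distributions below are hypothetical, chosen only to show the propagation pattern:

```python
import random
import statistics

# Illustrative sketch: propagate input variability through a model by
# Monte Carlo sampling. Hypothetical model: cantilever tip deflection
# d = F * L**3 / (3 * E * I), with scatter assumed on load F and
# Young's modulus E.
random.seed(1)

def deflection(F, L, E, I):
    return F * L**3 / (3 * E * I)

samples = []
for _ in range(10_000):
    F = random.gauss(1000.0, 50.0)    # load in N (assumed 5% scatter)
    E = random.gauss(2.0e11, 1.0e10)  # Young's modulus in Pa
    samples.append(deflection(F, L=2.0, E=E, I=8.0e-6))

mean = statistics.mean(samples)
std = statistics.stdev(samples)
print(f"deflection: {mean * 1000:.2f} mm +/- {std * 1000:.2f} mm")
```

The spread of the output distribution, not just its mean, is what feeds the credibility assessment the presentation describes; epistemic (model-form) uncertainty would require varying the model itself, not only its inputs.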

  15. Generalized equilibrium modeling: the methodology of the SRI-Gulf energy model. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Gazalet, E.G.


    The report provides documentation of the generalized equilibrium modeling methodology underlying the SRI-Gulf Energy Model and focuses entirely on the philosophical, mathematical, and computational aspects of the methodology. The model is a highly detailed regional and dynamic model of the supply and demand for energy in the US. The introduction emphasized the need to focus modeling efforts on decisions and the coordinated decomposition of complex decision problems using iterative methods. The conceptual framework is followed by a description of the structure of the current SRI-Gulf model and a detailed development of the process relations that comprise the model. The network iteration algorithm used to compute a solution to the model is described and the overall methodology is compared with other modeling methodologies. 26 references.


    DEFF Research Database (Denmark)

    Sørensen, Kim; Condra, Thomas Joseph; Houbak, Niels


    In the present work a framework for optimizing the design of boilers for dynamic operation has been developed. A cost function to be minimized during the optimization has been formulated and, for the present, design variables related to the Boiler Volume and the Boiler Load Gradient (i.e. firing rate... on the boiler) have been defined. Furthermore a number of constraints related to: minimum and maximum boiler load gradient, minimum boiler size, Shrinking and Swelling and Steam Space Load have been defined. For defining the constraints related to the required boiler volume a dynamic model for simulating the boiler... performance has been developed. Outputs from the simulations are shrinking and swelling of the water level in the drum during for example a start-up of the boiler; these figures combined with the requirements with respect to allowable water level fluctuations in the drum define the requirements with respect to drum...

  17. Abdominal surgery process modeling framework for simulation using spreadsheets. (United States)

    Boshkoska, Biljana Mileva; Damij, Talib; Jelenc, Franc; Damij, Nadja


    We provide a continuation of the existing Activity Table Modeling methodology with a modular spreadsheet simulation. The simulation model developed comprises 28 modeling elements for the abdominal surgery cycle process. The simulation of a two-week patient flow in an abdominal clinic with 75 beds demonstrates the applicability of the methodology. The simulation does not include macros, so programming experience is not essential for replicating or upgrading the model. Unlike the existing methods, the proposed solution employs a modular approach to modeling the activities that ensures better readability, the possibility of easily upgrading the model with other activities, and easy extension and connectivity with other similar models. We propose a first-in-first-served approach for simulating the servicing of multiple patients. The uncertain durations of the activities are modeled using the function "rand()". The patients' movements from one activity to the next are tracked with nested "if()" functions, allowing easy re-creation of the process without the need for complex programming. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
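The first-in-first-served, rand()-based logic described above can be mirrored outside a spreadsheet in a few lines. The sketch below uses hypothetical activity names, duration ranges, and arrival times, not the paper's 28 modeling elements:

```python
import random

# Illustrative sketch of first-in-first-served servicing: patients pass
# through a fixed sequence of activities; each activity handles one
# patient at a time, and service durations are random, mirroring the
# spreadsheet's use of rand(). All names and numbers are hypothetical.
random.seed(42)
activities = [("admission", 10, 20), ("surgery", 60, 180), ("recovery", 30, 90)]

free_at = {name: 0.0 for name, _, _ in activities}  # when each activity frees up
arrivals = [0, 15, 25, 40, 55]                      # patient arrival times (min)

finish_times = []
for arrival in arrivals:                            # FIFO: process in arrival order
    t = arrival
    for name, lo, hi in activities:
        start = max(t, free_at[name])               # wait until the activity is free
        t = start + random.uniform(lo, hi)          # random service duration
        free_at[name] = t
    finish_times.append(t)

print("finish times (min):", [round(t, 1) for t in finish_times])
```

The `max(t, free_at[name])` step plays the role of the nested "if()" functions: a patient either proceeds immediately or waits for the previous patient to clear the activity.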


    Directory of Open Access Journals (Sweden)

    Gorbenkova Elena Vladimirovna


    Full Text Available Subject: the paper describes the research results on validation of a rural settlement development model. The basic methods and approaches for solving the problem of assessing the efficiency of urban and rural settlement development are considered. Research objectives: determination of methodological approaches to modeling and creating a model for the development of rural settlements. Materials and methods: domestic and foreign experience in modeling the territorial development of urban and rural settlements and settlement structures was generalized. The motivation for using the Pentagon-model for solving similar problems was demonstrated. Based on a systematic analysis of existing development models of urban and rural settlements, as well as the authors' method for assessing the level of agro-town development, the systems/factors that are necessary for the sustainable development of a rural settlement are identified. Results: we created the rural development model, which consists of five major systems that include critical factors essential for achieving sustainable development of a settlement system: the ecological system, economic system, administrative system, anthropogenic (physical) system and social system (supra-structure). The methodological approaches for creating an evaluation model of rural settlement development were revealed; the basic motivating factors that provide interrelations of systems were determined; the critical factors for each subsystem were identified and substantiated. Such an approach was justified by the composition of tasks for territorial planning at the local and state administration levels. The feasibility of applying the basic Pentagon-model, which was successfully used for solving analogous problems of sustainable development, was shown. Conclusions: the resulting model can be used for identifying and substantiating the critical factors for rural sustainable development and also become the basis of

  19. Economic costs of power interruptions: a consistent model and methodology

    Energy Technology Data Exchange (ETDEWEB)

    Ghajar, Raymond F. [School of Engineering and Architecture Lebanese American University P.O. Box 36, Byblos (Lebanon); Billinton, Roy [Power Systems Research Group University of Saskatchewan Saskatoon, Sask., S7N 5A9 (Canada)


    One of the most basic requirements in cost/benefit assessments of generation and transmission systems is the cost incurred by customers due to power interruptions. This paper provides a consistent set of cost-of-interruption data that can be used to assess the reliability worth of a power system. In addition to this basic data, methodologies for calculating the customer damage functions and the interrupted energy assessment rates for individual load points in the system and for the entire service area are also presented. The proposed model and methodology are illustrated by application to the IEEE Reliability Test System (IEEE-RTS) [IEEE Reliability Test System, a report prepared by the Reliability Test System Task Force of the Application of Probability Methods Subcommittee, IEEE Trans. on PAS, Vol. PAS-98, No. 6, pp. 2047-2054, November/December 1979].

  20. The Simulation of Daily Temperature Time Series from GCM Output. Part II: Sensitivity Analysis of an Empirical Transfer Function Methodology. (United States)

    Winkler, Julie A.; Palutikof, Jean P.; Andresen, Jeffrey A.; Goodess, Clare M.


    Empirical transfer functions have been proposed as a means for 'downscaling' simulations from general circulation models (GCMs) to the local scale. However, subjective decisions made during the development of these functions may influence the ensuing climate scenarios. This research evaluated the sensitivity of a selected empirical transfer function methodology to 1) the definition of the seasons for which separate specification equations are derived, 2) adjustments for known departures of the GCM simulations of the predictor variables from observations, 3) the length of the calibration period, 4) the choice of function form, and 5) the choice of predictor variables. A modified version of the Climatological Projection by Model Statistics method was employed to generate control (1 × CO2) and perturbed (2 × CO2) scenarios of daily maximum and minimum temperature for two locations with diverse climates (Alcantarilla, Spain, and Eau Claire, Michigan). The GCM simulations used in the scenario development were from the Canadian Climate Centre second-generation model (CCC GCMII). Variations in the downscaling methodology were found to have a statistically significant impact on the 2 × CO2 climate scenarios, even though the 1 × CO2 scenarios for the different transfer function approaches were often similar. The daily temperature scenarios for Alcantarilla and Eau Claire were most sensitive to the decision to adjust for deficiencies in the GCM simulations, the choice of predictor variables, and the seasonal definitions used to derive the functions (i.e., fixed seasons, floating seasons, or no seasons). The scenarios were less sensitive to the choice of function form (i.e., linear versus nonlinear) and to an increase in the length of the calibration period. The results of Part I, which identified significant departures of the CCC GCMII simulations of two candidate predictor variables from observations, together with those presented here in Part II, 1) illustrate the
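A minimal sketch of the kind of linear transfer function whose design choices the study varies: fit local temperature against GCM-scale predictors over a control period, then apply the fitted function to perturbed-climate predictors. The data below are synthetic stand-ins, not CCC GCMII output or station observations, and the coefficients are invented:

```python
import numpy as np

# Hedged sketch of a regression transfer function for downscaling.
rng = np.random.default_rng(7)
n = 365
# Synthetic "GCM-scale" predictors for a control year:
gcm_temp = 10 + 8 * np.sin(2 * np.pi * np.arange(n) / 365) + rng.normal(0, 1, n)
gcm_circ = rng.normal(0, 1, n)                  # e.g. a circulation index
# Synthetic "observed" local temperature the function is calibrated on:
local_obs = 2.0 + 0.9 * gcm_temp + 0.5 * gcm_circ + rng.normal(0, 0.8, n)

# Calibrate a linear function form by ordinary least squares.
X = np.column_stack([np.ones(n), gcm_temp, gcm_circ])
coef, *_ = np.linalg.lstsq(X, local_obs, rcond=None)

# Apply to a perturbed (e.g. 2xCO2) simulation: predictors shifted warmer.
gcm_temp_2x = gcm_temp + 3.0
X2 = np.column_stack([np.ones(n), gcm_temp_2x, gcm_circ])
scenario = X2 @ coef
print(f"control mean: {local_obs.mean():.1f} C, "
      f"scenario mean: {scenario.mean():.1f} C")
```

Every subjective choice the paper tests (seasonal stratification, predictor selection, linear versus nonlinear form, bias adjustment of the predictors) changes how `X` is built or how `coef` is fit, which is why the perturbed scenarios can diverge even when the control fits look alike.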

  1. Methodology for the 3D modeling and visualization of concurrency networks (United States)

    Dance, Linda K.; Fishwick, Paul A.


    One of the primary formalisms for modeling concurrency and resource contention in systems is Petri nets. The literature on Petri nets is rich with activity on applications as well as extensions. We use the basic Petri net formalism as well as several extensions to demonstrate how metaphor can be applied to yield 3D model worlds. A number of metaphors, including 3D-primitive and landscape are employed within the framework of VRML-enabled simulation. We designed a template for use in creating any Petri net model and then using the template, implemented an example model utilizing metaphors for both the structure and the behaviors of the model. We determined that the result is an effectively and efficiently communicated model with high memory retention. The modeling methodology that we employ was successfully implemented for Petri nets with the ability for model reuse and/or personalization with any aesthetics applied to the desired Petri net.
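The token-game semantics underlying the Petri net formalism used above can be sketched in a few lines: a transition is enabled when every input place holds at least as many tokens as the arc weight, and firing consumes and produces tokens. The net below (a shared resource serving waiting jobs) is hypothetical and unrelated to the paper's VRML worlds:

```python
# Minimal Petri net interpreter; place and transition names are invented.
marking = {"free_resource": 1, "waiting": 2, "in_service": 0, "done": 0}

transitions = {
    # name: (inputs {place: arc weight}, outputs {place: arc weight})
    "start": ({"waiting": 1, "free_resource": 1}, {"in_service": 1}),
    "finish": ({"in_service": 1}, {"done": 1, "free_resource": 1}),
}

def enabled(name):
    ins, _ = transitions[name]
    return all(marking[p] >= w for p, w in ins.items())

def fire(name):
    ins, outs = transitions[name]
    assert enabled(name), f"{name} is not enabled"
    for p, w in ins.items():          # consume tokens from input places
        marking[p] -= w
    for p, w in outs.items():         # produce tokens in output places
        marking[p] += w

# Serve both waiting tokens through the shared resource.
while enabled("start") or enabled("finish"):
    fire("start" if enabled("start") else "finish")
print(marking)
```

A 3D metaphor of the kind the paper describes would render places, tokens, and transitions as visual objects, but the underlying state change is exactly this marking update.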

  2. Simulated annealing model of acupuncture (United States)

    Shang, Charles; Szu, Harold


    The growth control singularity model suggests that acupuncture points (acupoints) originate from organizers in embryogenesis. Organizers are singular points in growth control. Acupuncture can cause perturbation of a system with effects similar to simulated annealing. In a clinical trial, the goal of a treatment is to relieve a certain disorder, which corresponds to reaching a certain local optimum in simulated annealing. The self-organizing effect of the system is limited and related to the person's general health and age. Perturbation at acupoints can lead to a stronger local excitation (analogous to a higher annealing temperature) compared to perturbation at non-singular points (placebo control points). Such difference diminishes as the number of perturbed points increases, due to the wider distribution of the limited self-organizing activity. This model explains the following facts from systematic reviews of acupuncture trials: 1. Properly chosen single-acupoint treatment for a certain disorder can lead to highly repeatable efficacy above placebo. 2. When multiple acupoints are used, the result can be highly repeatable if the patients are relatively healthy and young, but is usually mixed if the patients are old, frail and have multiple disorders at the same time, as the number of local optima or comorbidities increases. 3. As the number of acupoints used increases, the efficacy difference between sham and real acupuncture often diminishes. The model predicts that the efficacy of acupuncture is negatively correlated with disease chronicity, severity and patient age. This is the first biological-physical model of acupuncture which can predict and guide clinical acupuncture research.
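For readers unfamiliar with the optimization analogy the model draws on, a minimal simulated annealing loop looks like this; the objective function, cooling schedule, and step size are hypothetical and carry no biological meaning:

```python
import math
import random

# Illustrative simulated annealing sketch: a higher "temperature" lets
# the search accept uphill moves and escape local optima more easily.
random.seed(3)

def energy(x):
    return x**2 + 10 * math.sin(3 * x)   # a 1D objective with many local minima

x = 4.0          # start in a poor basin
T = 5.0          # initial annealing temperature
while T > 1e-3:
    cand = x + random.uniform(-0.5, 0.5)          # local perturbation
    dE = energy(cand) - energy(x)
    if dE < 0 or random.random() < math.exp(-dE / T):
        x = cand                                  # accept uphill moves at high T
    T *= 0.995                                    # cool slowly

print(f"x = {x:.2f}, energy = {energy(x):.2f}")
```

In the model's analogy, perturbing an acupoint corresponds to a higher effective temperature than perturbing a placebo point, so the system is more likely to escape the "local optimum" of the disorder.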

  3. Modern methodology and applications in spatial-temporal modeling

    CERN Document Server

    Matsui, Tomoko


    This book provides a modern introductory tutorial on specialized methodological and applied aspects of spatial and temporal modeling. The areas covered involve a range of topics which reflect the diversity of this domain of research across a number of quantitative disciplines. For instance, the first chapter deals with non-parametric Bayesian inference via a recently developed framework known as kernel mean embedding which has had a significant influence in machine learning disciplines. The second chapter takes up non-parametric statistical methods for spatial field reconstruction and exceedance probability estimation based on Gaussian process-based models in the context of wireless sensor network data. The third chapter presents signal-processing methods applied to acoustic mood analysis based on music signal analysis. The fourth chapter covers models that are applicable to time series modeling in the domain of speech and language processing. This includes aspects of factor analysis, independent component an...

  4. Multilevel Methodology for Simulation of Spatio-Temporal Systems with Heterogeneous Activity; Application to Spread of Valley Fever Fungus (United States)

    Jammalamadaka, Rajanikanth


    This report consists of a dissertation submitted to the faculty of the Department of Electrical and Computer Engineering, in partial fulfillment of the requirements for the degree of Doctor of Philosophy, Graduate College, The University of Arizona, 2008. Spatio-temporal systems with heterogeneity in their structure and behavior have two major problems associated with them. The first one is that such complex real world systems extend over very large spatial and temporal domains and consume so many computational resources to simulate that they are infeasible to study with current computational platforms. The second one is that the data available for understanding such systems is limited because they are spread over space and time making it hard to obtain micro and macro measurements. This also makes it difficult to get the data for validation of their constituent processes while simultaneously considering their global behavior. For example, the valley fever fungus considered in this dissertation is spread over a large spatial grid in the arid Southwest and typically needs to be simulated over several decades of time to obtain useful information. It is also hard to get the temperature and moisture data (which are two critical factors on which the survival of the valley fever fungus depends) at every grid point of the spatial domain over the region of study. In order to address the first problem, we develop a method based on the discrete event system specification which exploits the heterogeneity in the activity of the spatio-temporal system and which has been shown to be effective in solving relatively simple partial differential equation systems. The benefit of addressing the first problem is that it now makes it feasible to address the second problem. We address the second problem by making use of a multilevel methodology based on modeling and simulation and systems theory. 
This methodology helps us in the construction of models with different resolutions (base and

  5. The simulation of daily temperature time series from GCM output. Part II: Sensitivity analysis of an empirical transfer function methodology

    Energy Technology Data Exchange (ETDEWEB)

    Winkler, J.A.; Andresen, J.A. [Michigan State Univ., East Lansing, MI (United States); Palutikof, J.P.; Goodess, C.M. [Univ. of East Anglia, Norwich (United Kingdom)


    Empirical transfer functions have been proposed as a means for "downscaling" simulations from general circulation models (GCMs) to the local scale. However, subjective decisions made during the development of these functions may influence the ensuing climate scenarios. This research evaluated the sensitivity of a selected empirical transfer function methodology to (1) the definition of the seasons for which separate specification equations are derived, (2) adjustments for known departures of the GCM simulations of the predictor variables from observations, (3) the length of the calibration period, (4) the choice of function form, and (5) the choice of predictor variables. A modified version of the Climatological Projection by Model Statistics method was employed to generate control (1 × CO₂) and perturbed (2 × CO₂) scenarios of daily maximum and minimum temperature for two locations with diverse climates (Alcantarilla, Spain, and Eau Claire, Michigan). The GCM simulations used in the scenario development were from the Canadian Climate Centre second-generation model (CCC GCMII). Variations in the downscaling methodology were found to have a statistically significant impact on the 2 × CO₂ climate scenarios, even though the 1 × CO₂ scenarios for the different transfer function approaches were often similar. The daily temperature scenarios for Alcantarilla and Eau Claire were most sensitive to the decision to adjust for deficiencies in the GCM simulations, the choice of predictor variables, and the seasonal definitions used to derive the functions (i.e., fixed seasons, floating seasons, or no seasons). The scenarios were less sensitive to the choice of function form (i.e., linear versus nonlinear) and to an increase in the length of the calibration period. 44 refs., 5 figs., 11 tabs.

  6. Collecting real-time data with a behavioral simulation: A new methodological trait

    DEFF Research Database (Denmark)

    Jespersen, Kristina Risom

    As the complexity of problems and the speed of changes increase for companies, the accuracy of research relies on the measurement of behavior close to the real world of the respondents. Recent reviews within different research fields of the social sciences have pointed at the need for more...... interactive methods of collecting data [1, 2]. To collect real-time data as opposed to retrospective data, new methodological traits are needed. The paper proposes that a behavioral simulation supported by Web technology is a valid new research strategy to handle the collection of real-time data. Adapting...... to 'survey fatigues' and the lack of interaction [1, 12-16]. The concerns of the paper are the contributions of a simulation, the opportunities and challenges with Web technology, ensuring the validity and reliability of the data collected, and finally, a designed behavioral simulation is presented...

  7. Uterine Contraction Modeling and Simulation (United States)

    Liu, Miao; Belfore, Lee A.; Shen, Yuzhong; Scerbo, Mark W.


    Building a training system for medical personnel to properly interpret fetal heart rate tracing requires developing accurate models that can relate various signal patterns to certain pathologies. In addition to modeling the fetal heart rate signal itself, the change of uterine pressure that bears strong relation to fetal heart rate and provides indications of maternal and fetal status should also be considered. In this work, we have developed a group of parametric models to simulate uterine contractions during labor and delivery. Through analysis of real patient records, we propose to model uterine contraction signals by three major components: regular contractions, impulsive noise caused by fetal movements, and low amplitude noise invoked by maternal breathing and measuring apparatus. The regular contractions are modeled by an asymmetric generalized Gaussian function and least squares estimation is used to compute the parameter values of the asymmetric generalized Gaussian function based on uterine contractions of real patients. Regular contractions are detected based on thresholding and derivative analysis of uterine contractions. Impulsive noise caused by fetal movements and low amplitude noise by maternal breathing and measuring apparatus are modeled by rational polynomial functions and Perlin noise, respectively. Experiment results show the synthesized uterine contractions can mimic the real uterine contractions realistically, demonstrating the effectiveness of the proposed algorithm.
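    The contraction-pulse model described above can be sketched numerically. The parameterization below (separate left/right width parameters `sigma_l`/`sigma_r` with a shared shape exponent `p`) is one plausible form of an asymmetric generalized Gaussian, not necessarily the authors' exact one, and the amplitude fit illustrates the least-squares step in closed form when the shape parameters are held fixed.

```python
import numpy as np

def asym_gen_gaussian(t, amp, t0, sigma_l, sigma_r, p):
    """Asymmetric generalized Gaussian: different widths left/right of the peak t0."""
    sigma = np.where(t < t0, sigma_l, sigma_r)
    return amp * np.exp(-(np.abs(t - t0) / sigma) ** p)

# Synthetic "contraction": a unit-shape pulse scaled by an unknown amplitude.
t = np.linspace(0.0, 120.0, 601)                       # seconds
shape = asym_gen_gaussian(t, 1.0, 50.0, 12.0, 20.0, 1.8)
observed = 40.0 * shape                                # mmHg, noiseless for illustration

# With the shape fixed, the least-squares amplitude has a closed form:
#   amp* = <observed, shape> / <shape, shape>
amp_hat = float(np.dot(observed, shape) / np.dot(shape, shape))
```

    In practice the remaining parameters (peak location, widths, exponent) would be fit by nonlinear least squares against detected contractions from patient records, as the abstract describes.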

  8. A methodological approach for using high-level Petri Nets to model the immune system response. (United States)

    Pennisi, Marzio; Cavalieri, Salvatore; Motta, Santo; Pappalardo, Francesco


    Mathematical and computational models have proven to be very important support tools for understanding the immune system response against pathogens. Models and simulations have allowed researchers to study immune system behavior, to test biological hypotheses about diseases and infection dynamics, and to improve and optimize novel and existing drugs and vaccines. Continuous models, mainly based on differential equations, usually allow qualitative study of the system but lack descriptive detail; conversely, discrete models, such as agent-based models and cellular automata, can describe entity properties in detail at the cost of losing most qualitative analyses. Petri Nets (PN) are a graphical modeling tool developed to model concurrency and synchronization in distributed systems. Their use has grown steadily, thanks in part to the introduction over the years of many features and extensions that led to the birth of "high-level" PN. We propose a novel methodological approach based on high-level PN, and in particular on Colored Petri Nets (CPN), that can be used to model the immune system response at the cellular scale. To demonstrate the potential of the approach we provide a simple model of the humoral immune system response that is able to reproduce some of the best-known complex features of the adaptive response, such as memory and specificity. The presented methodology combines the advantages of the two classical approaches based on continuous and discrete models, since it achieves a good level of granularity in describing cell behavior without losing the possibility of qualitative analysis. Furthermore, the presented methodology based on CPN allows the adoption of the same graphical modeling technique well known to life scientists who use PN for the modeling of signaling pathways. Finally, such an approach may open the floodgates to the realization of multi scale models that integrate both signaling pathways (intra

  9. Methodology and preliminary models for analyzing nuclear safeguards decisions

    Energy Technology Data Exchange (ETDEWEB)


    This report describes a general analytical tool designed to assist the NRC in making nuclear safeguards decisions. The approach is based on decision analysis--a quantitative procedure for making decisions under uncertain conditions. The report: describes illustrative models that quantify the probability and consequences of diverted special nuclear material and the costs of safeguarding the material, demonstrates a methodology for using this information to set safeguards regulations (safeguards criteria), and summarizes insights gained in a very preliminary assessment of a hypothetical reprocessing plant.

  10. Methodology and basic algorithms of the Livermore Economic Modeling System

    Energy Technology Data Exchange (ETDEWEB)

    Bell, R.B.


    The methodology and the basic pricing algorithms used in the Livermore Economic Modeling System (EMS) are described. The report explains the derivations of the EMS equations in detail; however, it could also serve as a general introduction to the modeling system. A brief but comprehensive explanation of what EMS is and does, and how it does it is presented. The second part examines the basic pricing algorithms currently implemented in EMS. Each algorithm's function is analyzed and a detailed derivation of the actual mathematical expressions used to implement the algorithm is presented. EMS is an evolving modeling system; improvements in existing algorithms are constantly under development and new submodels are being introduced. A snapshot of the standard version of EMS is provided and areas currently under study and development are considered briefly.

  11. Integrating FMEA in a Model-Driven Methodology (United States)

    Scippacercola, Fabio; Pietrantuono, Roberto; Russo, Stefano; Esper, Alexandre; Silva, Nuno


    Failure Mode and Effects Analysis (FMEA) is a well-known technique for evaluating the effects of potential failures of components of a system. FMEA demands engineering methods and tools able to support the time-consuming tasks of the analyst. We propose to make FMEA part of the design of a critical system, by integrating it into a model-driven methodology. We show how to conduct the analysis of failure modes, propagation, and effects from SysML design models, by means of custom diagrams, which we name FMEA Diagrams. They offer an additional view of the system, tailored to FMEA goals. The enriched model can then be exploited to automatically generate the FMEA worksheet and to conduct qualitative and quantitative analyses. We present a case study from a real-world project.

  12. Bridging the Gap of Standardized Animals Models for Blast Neurotrauma: Methodology for Appropriate Experimental Testing. (United States)

    VandeVord, Pamela J; Leonardi, Alessandra Dal Cengio; Ritzel, David


    Recent military combat has heightened awareness to the complexity of blast-related traumatic brain injuries (bTBI). Experiments using animal, cadaver, or biofidelic physical models remain the primary measures to investigate injury biomechanics as well as validate computational simulations, medical diagnostics and therapies, or protection technologies. However, blast injury research has seen a range of irregular and inconsistent experimental methods for simulating blast insults generating results which may be misleading, cannot be cross-correlated between laboratories, or referenced to any standard for exposure. Both the US Army Medical Research and Materiel Command and the National Institutes of Health have noted that there is a lack of standardized preclinical models of TBI. It is recommended that the blast injury research community converge on a consistent set of experimental procedures and reporting of blast test conditions. This chapter describes the blast conditions which can be recreated within a laboratory setting and methodology for testing in vivo models within the appropriate environment.

  13. Simulating the Heterogeneity in Braided Channel Belt Deposits: Part 1. A Geometric-Based Methodology and Code

    Energy Technology Data Exchange (ETDEWEB)

    Ramanathan, Ramya; Guin, Arijit; Ritzi, Robert W.; Dominic, David F.; Freedman, Vicky L.; Scheibe, Timothy D.; Lunt, Ian A.


    A geometric-based simulation methodology was developed and incorporated into a computer code to model the hierarchical stratal architecture, and the corresponding spatial distribution of permeability, in braided channel belt deposits. The code creates digital models of these deposits as a three-dimensional cubic lattice, which can be used directly in numerical aquifer or reservoir models for fluid flow. The digital models have stratal units defined from the km scale to the cm scale. These synthetic deposits are intended to be used as high-resolution base cases in various areas of computational research on multiscale flow and transport processes, including the testing of upscaling theories. The input parameters are primarily univariate statistics. These include the mean and variance for characteristic lengths of sedimentary unit types at each hierarchical level, and the mean and variance of log-permeability for unit types defined at only the lowest level (smallest scale) of the hierarchy. The code has been written for both serial and parallel execution. The methodology is described in Part 1 of this series. In Part 2, models generated by the code are presented and evaluated.

  14. A Novel Personalized Diagnosis Methodology Using Numerical Simulation and an Intelligent Method to Detect Faults in a Shaft

    Directory of Open Access Journals (Sweden)

    Jiawei Xiang


    Full Text Available Personalized medicine is a hot topic in the development of medical procedures for healthcare. Motivated by molecular dynamics simulation-based personalized medicine, we propose a novel numerical simulation-based personalized diagnosis methodology and explain its fundamental procedures. As an example, a personalized fault diagnosis method is developed using the finite element method (FEM), wavelet packet transform (WPT) and support vector machine (SVM) to detect faults in a shaft. Shaft unbalance, misalignment, rub-impact and the combination of rub-impact and unbalance are investigated using the present method. The method comprises three steps. In the first step, a Theil's inequality coefficient (TIC)-based FE model updating technique is employed to determine the boundary conditions, and the fault-induced FE model of the faulty shaft is constructed. The vibration signals of the faulty shaft are then obtained by numerical simulation. In the second step, WPT is employed to decompose the vibration signal into several signal components, and specific time-domain feature parameters of all the signal components are calculated to generate the training samples used to train the SVM. Finally, the measured vibration signal and its components decomposed by WPT serve as a test sample for the trained SVM, and the fault type is determined. In the simulation of a simple shaft, the classification accuracy rates for unbalance, misalignment, rub-impact and the combination of rub-impact and unbalance are 93%, 95%, 89% and 91%, respectively, whereas in the experimental investigations these decreased to 82%, 87%, 73% and 79%. To further increase the fault diagnosis precision and general applicability, work is continuing to improve the personalized diagnosis methodology and the corresponding specific methods.
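    As a rough illustration of the WPT-plus-classifier step described in this abstract, the sketch below computes band energies from a hand-rolled Haar wavelet packet transform and classifies with a nearest-centroid rule standing in for the paper's SVM (the FEM simulation step is omitted). The signals, class names, and frequencies are invented for illustration only.

```python
import numpy as np

def haar_packet_energies(x, levels=3):
    """Band energies at the deepest level of a Haar wavelet packet tree."""
    nodes = [np.asarray(x, dtype=float)]
    for _ in range(levels):
        nxt = []
        for n in nodes:
            even, odd = n[0::2], n[1::2]
            nxt.append((even + odd) / np.sqrt(2.0))  # low-pass (approximation) band
            nxt.append((even - odd) / np.sqrt(2.0))  # high-pass (detail) band
        nodes = nxt
    return np.array([float(np.sum(n * n)) for n in nodes])

rng = np.random.default_rng(0)
t = np.arange(1024) / 1024.0

def vibration(cycles):
    """Toy vibration record: a sinusoid plus a little measurement noise."""
    return np.sin(2.0 * np.pi * cycles * t) + 0.05 * rng.standard_normal(t.size)

# Two hypothetical fault classes distinguished by dominant frequency content;
# in the paper, training samples like these would come from the FE simulation.
centroids = {"unbalance": haar_packet_energies(vibration(8)),
             "rub-impact": haar_packet_energies(vibration(200))}

def classify(x):
    """Nearest-centroid stand-in for the trained SVM."""
    feats = haar_packet_energies(x)
    return min(centroids, key=lambda k: float(np.linalg.norm(feats - centroids[k])))

predicted = classify(vibration(200))
```

    The point of the decomposition is that faults with different spectral signatures concentrate energy in different packet bands, giving the classifier a compact, discriminative feature vector.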

  15. Methodology for physical modeling of melter electrode power plug

    Energy Technology Data Exchange (ETDEWEB)

    Heath, W.O.


    A method is presented for building and testing a one-third scale model of an electrode power plug used to supply up to 3000 amperes to a liquid fed ceramic melter. The method describes how a one-third scale model can be used to verify the ampacity of the power plug, the effectiveness of the power plug cooling system and the effect of the high amperage current on eddy current heating of rebar in the cell wall. Scale-up of the test data, including cooling air flow rate and pressure drop, temperature profiles, melter water jacket heat duty and electrical resistance is covered. The materials required to build the scale model are specified as well as scale surface finish and dimensions. The method for designing and testing a model power plug involves developing a way to recreate the thermal conditions including heat sources, sinks and boundary temperatures on a scale basis. The major heat sources are the molten glass in contact with the electrode, joule heat generation within the power plug, and eddy current heating of the wall rebar. The melting cavity heat source is modelled using a plate heater to provide radiant heat transfer to a geometrically similar, one-third scale electrode housed in a scale model of a melting cavity having a thermally and geometrically similar wall and floor. The joule heat generation within the power plug is simulated by passing electricity through the model power plug with geometrically similar rebar positioned to simulate the eddy heating phenomenon. The proposed model also features two forced air cooling circuits similar to those on the full design. The interaction of convective, natural and radiant heat transfer in the wall cooling circuit are considered. The cell environment and a melter water jacket, along with the air cooling circuits, constitute the heat sinks and are also simulated.

  16. Methodology Using MELCOR Code to Model Proposed Hazard Scenario

    Energy Technology Data Exchange (ETDEWEB)

    Gavin Hawkley


    This study demonstrates a methodology for using the MELCOR code to model a proposed hazard scenario within a building containing radioactive powder, and the subsequent evaluation of the leak path factor (LPF), the fraction of respirable material that escapes the facility into the outside environment, implicit in the scenario. The LPF evaluation analyzes the basis and applicability of an assumed standard multiplication of 0.5 × 0.5 (in which each 0.5 represents the fraction of material assumed to leave one area and enter another) for calculating an LPF value. The outside release depends upon the ventilation/filtration system, both filtered and unfiltered, and upon other pathways from the building, such as doorways, both open and closed. This study shows how the multiple LPFs within the building can be evaluated in a combinatory process in which a total LPF is calculated, thus addressing the assumed multiplication and allowing for the designation and assessment of a respirable source term (ST) for later consequence analysis, in which the propagation of material released into the atmosphere can be modeled, the dose received by a receptor placed downwind can be estimated, and the distance adjusted to maintain such exposures as low as reasonably achievable (ALARA). This study also briefly addresses particle characteristics that affect atmospheric particle dispersion, and compares this dispersion with the LPF methodology.
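    The combinatory evaluation the study describes can be pictured as multiplying the transmission fraction of each serial segment of a release pathway. This is only a schematic of the arithmetic behind the assumed 0.5 × 0.5 screening value, not MELCOR's actual calculation; the function and segment names are hypothetical.

```python
def total_leak_path_factor(segment_lpfs):
    """Combine per-segment leak path factors along one serial release pathway.

    Each segment factor is the fraction of respirable material that passes
    through that segment, so factors along a single serial pathway multiply.
    """
    total = 1.0
    for lpf in segment_lpfs:
        if not 0.0 <= lpf <= 1.0:
            raise ValueError("an LPF is a fraction in [0, 1]")
        total *= lpf
    return total

# The assumed 0.5 x 0.5 screening value corresponds to two serial segments
# (e.g., room -> corridor -> environment), each passing half the material.
screening = total_leak_path_factor([0.5, 0.5])
```

    The study's question is precisely whether this fixed two-segment product is defensible, or whether building-specific pathways (filtered ventilation, open and closed doorways) yield a different total.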


    Directory of Open Access Journals (Sweden)

    G. Floros


    Full Text Available The methodologies of 3D modeling techniques have advanced rapidly due to new technologies. Nowadays, 3D modeling software focuses not only on the finest visualization of the models, but also on their semantic features during the modeling procedure. As a result, the models thus generated are both realistic and semantically enriched. Additionally, various extensions of modeling software allow the immediate conversion of the model's format, via semi-automatic procedures suited to the user's scope. The aim of this paper is to investigate the generation of a semantically enriched CityGML building model via two different methodologies. The first methodology includes modeling in Trimble SketchUp and transformation in FME Desktop Manager, while the second includes the model's generation in CityEngine and its transformation into the CityGML format via the 3DCitiesProject extension for ArcGIS. Finally, the two methodologies are compared and specific characteristics are evaluated, in order to infer which methodology is best applied depending on the project's purposes.

  18. Hybrid neuro-heuristic methodology for simulation and control of dynamic systems over time interval. (United States)

    Woźniak, Marcin; Połap, Dawid


    Simulation and positioning are very important aspects of computer-aided engineering. Both can be handled with traditional methods or with intelligent techniques, which differ in the way they process information. In the first case, to simulate an object in a particular state of action we must run the entire process to read the parameter values. This is inconvenient for objects whose simulation takes a long time, i.e., when the mathematical calculations are complicated. In the second case, an intelligent solution can support a dedicated style of simulation, in which the object is simulated only in the situations that the development process actually requires. We present research results on an intelligent simulation and control model of an electric vehicle drive engine. For a dedicated simulation method based on intelligent computation, in which an evolutionary strategy simulates the states of the dynamic model, an intelligent system based on a dedicated neural network is introduced to control co-working modules during motion over a time interval. The experimental results show the implemented solution in a situation where a vehicle transports goods over an area with many obstacles, which provokes sudden changes in stability that may destroy the load. The applied neural network controller prevents the load from destruction by adjusting characteristics such as pressure, acceleration, and stiffness voltage to absorb adverse changes in the ground. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. On model-driven design of robot software using co-simulation

    NARCIS (Netherlands)

    Broenink, Johannes F.; Ni, Yunyun; Groothuis, M.A.; Menegatti, E.


    In this paper we show that robot software design using co-simulation is more efficient than design without it. Using a plotter as an example, we show how co-simulation helps with the design process. We believe that a collaborative methodology based on model-driven design will


    National Research Council Canada - National Science Library

    G Floros; D Solou; I Pispidikis; E Dimopoulou


    ... of the model’s format, via semi-automatic procedures with respect to the user’s scope. The aim of this paper is to investigate the generation of a semantically enriched Citygml building model via two different methodologies...

  1. A Comparison Study of a Generic Coupling Methodology for Modeling Wake Effects of Wave Energy Converter Arrays

    Directory of Open Access Journals (Sweden)

    Tim Verbrugghe


    Full Text Available Wave Energy Converters (WECs) need to be deployed in large numbers in an array layout in order to achieve significant power production. Each WEC has an impact on the incoming wave field by diffracting, reflecting and radiating waves. Simulating the wave transformations within and around a WEC array is complex; it is difficult, or in some cases impossible, to simulate both these near-field and far-field wake effects with a single numerical model in a time- and cost-efficient way. Within this research, a generic coupling methodology is developed to model both near-field and far-field wake effects caused by floating (e.g., WECs, platforms) or fixed offshore structures. The methodology is based on the coupling of a wave-structure interaction solver (Nemoh) and a wave propagation model. In this paper, the methodology is applied to two wave propagation models (OceanWave3D and MILDwave), which are compared to each other in a wide spectrum of tests. Additionally, the Nemoh-OceanWave3D model is validated by comparing it to experimental wave basin data. The methodology proves to be a reliable instrument for modeling the wake effects of WEC arrays; results demonstrate a high degree of agreement between the numerical simulations, with relative errors lower than 5%, and to a lesser extent for the experimental data, where errors range from 4% to 17%.

  2. Distributed simulation a model driven engineering approach

    CERN Document Server

    Topçu, Okan; Oğuztüzün, Halit; Yilmaz, Levent


    Backed by substantive case studies, the novel approach to software engineering for distributed simulation outlined in this text demonstrates the potent synergies between model-driven techniques, simulation, intelligent agents, and computer systems development.

  3. Modeling and simulation of turbulent multiphase flows (United States)

    Li, Zhaorui

    dilute spray simulations, a robust and efficient Eulerian-Lagrangian-Lagrangian mathematical/numerical LES model is employed. This is based on the filtered mass density function (FMDF) methodology and is applicable to two-phase turbulent reacting flows with two-way mass, momentum and energy coupling between phases included. In the LES/FMDF methodology, the "resolved" carrier gas velocity field is obtained by solving the filtered form of the compressible Navier-Stokes equations with a high-order finite difference scheme. The subgrid species, energy and combustion are modeled with the two-phase scalar FMDF transport equation, which is solved by a Lagrangian Monte Carlo method. The liquid droplet/spray is simulated with a non-equilibrium Lagrangian model and stochastic SGS closures. The two-way coupling is implemented through a series of source/sink terms. The two-phase LES/FMDF is employed for systematic analysis of turbulent combustion in the double swirl spray burner and spray-controlled dump combustor for various flow and spray parameters. The effects of fuel type, spray/injection angle, mass loading ratio, droplet size and its distribution, fuel/air composition, wall, and other parameters on the combustion and turbulence are investigated.

  4. Benchmark simulation models, quo vadis? (United States)

    Jeppsson, U; Alex, J; Batstone, D J; Benedetti, L; Comas, J; Copp, J B; Corominas, L; Flores-Alsina, X; Gernaey, K V; Nopens, I; Pons, M-N; Rodríguez-Roda, I; Rosen, C; Steyer, J-P; Vanrolleghem, P A; Volcke, E I P; Vrecko, D


    As the work of the IWA Task Group on Benchmarking of Control Strategies for wastewater treatment plants (WWTPs) is coming to an end, it is essential to disseminate the knowledge gained. For this reason, all authors of the IWA Scientific and Technical Report on benchmarking have come together to provide their insights, highlighting areas where knowledge may still be deficient and where new opportunities are emerging, and to propose potential avenues for future development and application of the general benchmarking framework and its associated tools. The paper focuses on the topics of temporal and spatial extension, process modifications within the WWTP, the realism of models, control strategy extensions and the potential for new evaluation tools within the existing benchmark system. We find that there are major opportunities for application within all of these areas, either from existing work already being done within the context of the benchmarking simulation models (BSMs) or applicable work in the wider literature. Of key importance is increasing capability, usability and transparency of the BSM package while avoiding unnecessary complexity.

  5. A dynamic simulation model of desertification in Egypt

    Directory of Open Access Journals (Sweden)

    M. Rasmy


    Full Text Available This paper presents the development of a system dynamics model to simulate and analyze the potential future state of desertification in Egypt. The presented model enhances the MEDALUS methodology developed by the European Commission. It illustrates the concept of desertification through different equations and simulation output graphs, and is supplemented with a causal loop diagram showing the feedback between different variables. For the purpose of testing and measuring the effect of different policy scenarios on desertification in Egypt, a simulation model using a stock-and-flow diagram was designed. Multi-temporal data were used to capture the dynamic changes in desertification sensitivity related to the dynamic nature of the desert environment. The model was applied to Al Bihira governorate in the western Nile Delta, Egypt, as the study area, and the results showed that urban expansion, salinization, and the absence of policy enforcement are the variables that most strongly provoke desertification.
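    A stock-and-flow model of the kind described can be reduced to a minimal sketch: one stock integrated forward in time from its inflows and outflows. The variables below (a desertification-sensitive area with a constant inflow and a policy-driven fractional outflow) are illustrative assumptions, not the paper's actual equations.

```python
def simulate_stock(initial, inflow, outflow_rate, dt=1.0, steps=50):
    """Forward-Euler integration of a single stock with a constant inflow
    and a first-order outflow (outflow proportional to the current stock)."""
    stock = initial
    history = [stock]
    for _ in range(steps):
        net_flow = inflow - outflow_rate * stock
        stock += net_flow * dt
        history.append(stock)
    return history

# Hypothetical example: sensitive area (km^2) growing through urban expansion
# and salinization (constant inflow) while a reclamation policy removes a
# fixed fraction of the affected area each year (first-order outflow).
trajectory = simulate_stock(initial=100.0, inflow=12.0, outflow_rate=0.08)
```

    With these numbers the stock rises toward the equilibrium inflow/outflow_rate = 150 km²; a full system dynamics model chains many such stocks together through the feedback loops shown in the causal loop diagram.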

  6. Fractional Order Modeling of Atmospheric Turbulence - A More Accurate Modeling Methodology for Aero Vehicles (United States)

    Kopasakis, George


    The presentation covers a recently developed methodology to model atmospheric turbulence as disturbances for aero vehicle gust loads and for controls development, such as flutter and inlet shock position. The approach models atmospheric turbulence in its natural fractional-order form, which provides more accuracy than traditional methods like the Dryden model, especially for high-speed vehicles. The presentation provides a historical background on atmospheric turbulence modeling and the approaches utilized for air vehicles. This is followed by the motivation and the methodology used to develop the fractional-order atmospheric turbulence modeling approach. Some examples covering the application of this method are also provided, followed by concluding remarks.

  7. Comparing a simple methodology to evaluate hydrodynamic parameters with rainfall simulation experiments (United States)

    Di Prima, Simone; Bagarello, Vincenzo; Bautista, Inmaculada; Burguet, Maria; Cerdà, Artemi; Iovino, Massimo; Prosdocimi, Massimo


    Studying soil hydraulic properties is necessary for interpreting and simulating many hydrological processes of environmental and economic importance, such as the partition of rainfall into infiltration and runoff. The saturated hydraulic conductivity, Ks, exerts a dominating influence on the partitioning of rainfall into vertical and lateral flow paths. Therefore, estimates of Ks are essential for describing and modeling hydrological processes (Zimmermann et al., 2013). According to several investigations, Ks data collected by ponded infiltration tests can be expected to be unusable for interpreting field hydrological processes, and particularly infiltration; infiltration measured by ponding gives information about the soil's maximum or potential infiltration rate (Cerdà, 1996). Moreover, especially for the hydrodynamic parameters, many replicated measurements have to be carried out to characterize an area of interest, since these parameters are known to vary widely in both space and time (Logsdon and Jaynes, 1996; Prieksat et al., 1994). Therefore, the technique applied at the near-point scale should be simple and rapid. Bagarello et al. (2014) and Alagna et al. (2015) suggested that the Ks values determined by an infiltration experiment carried out by applying water at a relatively large distance from the soil surface could be more appropriate than those obtained with a low height of water pouring for explaining surface runoff generation during intense rainfall events. These authors used the Beerkan Estimation of Soil Transfer parameters (BEST) procedure for complete soil hydraulic characterization (Lassabatère et al., 2006) to analyze the field infiltration experiment. This methodology, combining low and high heights of water pouring, seems appropriate for testing the effect of intense and prolonged rainfall events on the hydraulic characteristics of the surface soil layer. In fact, an intense and prolonged rainfall event has a perturbing effect on the soil surface

  8. Modeling and simulation technology readiness levels.

    Energy Technology Data Exchange (ETDEWEB)

    Clay, Robert L.; Shneider, Max S.; Marburger, S. J.; Trucano, Timothy Guy


    This report summarizes the results of an effort to establish a framework for assigning and communicating technology readiness levels (TRLs) for the modeling and simulation (ModSim) capabilities at Sandia National Laboratories. This effort was undertaken as a special assignment for the Weapon Simulation and Computing (WSC) program office led by Art Hale, and lasted from January to September 2006. This report summarizes the results, conclusions, and recommendations, and is intended to help guide the program office in their decisions about the future direction of this work. The work was broken out into several distinct phases, starting with establishing the scope and definition of the assignment. These are characterized in a set of key assertions provided in the body of this report. Fundamentally, the assignment involved establishing an intellectual framework for TRL assignments to Sandia's modeling and simulation capabilities, including the development and testing of a process to conduct the assignments. To that end, we proposed a methodology for both assigning and understanding the TRLs, and outlined some of the restrictions that need to be placed on this process and the expected use of the result. One of the first assumptions we overturned was the notion of a "static" TRL; rather, we concluded that problem context is essential in any TRL assignment, which leads to dynamic results (i.e., a ModSim tool's readiness level depends on how it is used, and by whom). While we leveraged the classic TRL results from NASA, DoD, and Sandia's NW program, we came up with a substantially revised version of the TRL definitions, maintaining consistency with the classic level definitions and the Predictive Capability Maturity Model (PCMM) approach. In fact, we substantially leveraged the foundation the PCMM team provided, and augmented it as needed. Given the modeling and simulation TRL definitions and our proposed assignment methodology, we

  9. In silico simulations of experimental protocols for cardiac modeling. (United States)

    Carro, Jesus; Rodriguez, Jose Felix; Pueyo, Esther


    A mathematical model of the AP involves the sum of different transmembrane ionic currents and the balance of intracellular ionic concentrations. To each ionic current corresponds an equation involving several effects. There are a number of model parameters that must be identified using specific experimental protocols in which the effects are considered independent. However, as model complexity grows, the interaction between effects becomes increasingly important. Therefore, model parameters identified by considering the different effects as independent might be misleading. In this work, a novel methodology consisting of performing in silico simulations of the experimental protocol and then comparing experimental and simulated outcomes is proposed for model parameter identification and validation. The potential of the methodology is demonstrated by validating voltage-dependent L-type calcium current (ICaL) inactivation in recently proposed human ventricular AP models with different formulations. Our results show large differences between ICaL inactivation as calculated from the model equation and ICaL inactivation obtained from the in silico simulations, due to the interaction between effects and/or to the experimental protocol. Our results suggest that, when proposing any new model formulation, consistency between that formulation and the corresponding experimental data it aims to reproduce needs to be verified first, considering all involved factors.

  10. Modeling the Capacity and Emissions Impacts of Reduced Electricity Demand. Part 1. Methodology and Preliminary Results

    Energy Technology Data Exchange (ETDEWEB)

    Coughlin, Katie [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Environmental Energy Technologies Division; Shen, Hongxia [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Environmental Energy Technologies Division; Chan, Peter [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Environmental Energy Technologies Division; McDevitt, Brian [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Environmental Energy Technologies Division; Sturges, Andrew [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Environmental Energy Technologies Division


    Policies aimed at energy conservation and efficiency have broad environmental and economic impacts. Even if these impacts are relatively small, they may be significant compared to the cost of implementing the policy. Methodologies that quantify the marginal impacts of reduced demand for energy have an important role to play in developing accurate measures of both the benefits and costs of a given policy choice. This report presents a methodology for estimating the impacts of reduced demand for electricity on the electric power sector as a whole. The approach uses the National Energy Modeling System (NEMS), a mid-range energy forecast model developed and maintained by the U.S. Department of Energy, Energy Information Administration (EIA) (DOE EIA 2013). The report is organized as follows: In the rest of this section the traditional NEMS-BT approach is reviewed and an outline of the new reduced form NEMS methodology is presented. Section 2 provides an overview of how the NEMS model works, and describes the set of NEMS-BT runs that are used as input to the reduced form approach. Section 3 presents our NEMS-BT simulation results and post-processing methods. In Section 4 we show how the NEMS-BT output can be generalized to apply to a broader set of end-uses. In Section 5 we discuss the application of this approach to policy analysis, and summarize some of the issues that will be further investigated in Part 2 of this study.

  11. Estimating the Entropy of Binary Time Series: Methodology, Some Theory and a Simulation Study

    Directory of Open Access Journals (Sweden)

    Elie Bienenstock


    Full Text Available Partly motivated by entropy-estimation problems in neuroscience, we present a detailed and extensive comparison between some of the most popular and effective entropy estimation methods used in practice: The plug-in method, four different estimators based on the Lempel-Ziv (LZ) family of data compression algorithms, an estimator based on the Context-Tree Weighting (CTW) method, and the renewal entropy estimator. METHODOLOGY: Three new entropy estimators are introduced; two new LZ-based estimators, and the “renewal entropy estimator,” which is tailored to data generated by a binary renewal process. For two of the four LZ-based estimators, a bootstrap procedure is described for evaluating their standard error, and a practical rule of thumb is heuristically derived for selecting the values of their parameters in practice. THEORY: We prove that, unlike their earlier versions, the two new LZ-based estimators are universally consistent, that is, they converge to the entropy rate for every finite-valued, stationary and ergodic process. An effective method is derived for the accurate approximation of the entropy rate of a finite-state hidden Markov model (HMM) with known distribution. Heuristic calculations are presented and approximate formulas are derived for evaluating the bias and the standard error of each estimator. SIMULATION: All estimators are applied to a wide range of data generated by numerous different processes with varying degrees of dependence and memory. The main conclusions drawn from these experiments include: (i) For all estimators considered, the main source of error is the bias. (ii) The CTW method is repeatedly and consistently seen to provide the most accurate results. (iii) The performance of the LZ-based estimators is often comparable to that of the plug-in method. (iv) The main drawback of the plug-in method is its computational inefficiency; with small word-lengths it fails to detect longer-range structure in

  12. Estimating the Entropy of Binary Time Series: Methodology, Some Theory and a Simulation Study (United States)

    Gao, Yun; Kontoyiannis, Ioannis; Bienenstock, Elie


    Partly motivated by entropy-estimation problems in neuroscience, we present a detailed and extensive comparison between some of the most popular and effective entropy estimation methods used in practice: The plug-in method, four different estimators based on the Lempel-Ziv (LZ) family of data compression algorithms, an estimator based on the Context-Tree Weighting (CTW) method, and the renewal entropy estimator. METHODOLOGY: Three new entropy estimators are introduced; two new LZ-based estimators, and the “renewal entropy estimator,” which is tailored to data generated by a binary renewal process. For two of the four LZ-based estimators, a bootstrap procedure is described for evaluating their standard error, and a practical rule of thumb is heuristically derived for selecting the values of their parameters in practice. THEORY: We prove that, unlike their earlier versions, the two new LZ-based estimators are universally consistent, that is, they converge to the entropy rate for every finite-valued, stationary and ergodic process. An effective method is derived for the accurate approximation of the entropy rate of a finite-state hidden Markov model (HMM) with known distribution. Heuristic calculations are presented and approximate formulas are derived for evaluating the bias and the standard error of each estimator. SIMULATION: All estimators are applied to a wide range of data generated by numerous different processes with varying degrees of dependence and memory. The main conclusions drawn from these experiments include: (i) For all estimators considered, the main source of error is the bias. (ii) The CTW method is repeatedly and consistently seen to provide the most accurate results. (iii) The performance of the LZ-based estimators is often comparable to that of the plug-in method. (iv) The main drawback of the plug-in method is its computational inefficiency; with small word-lengths it fails to detect longer-range structure in the data, and with longer word
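The plug-in estimator that serves as the baseline in the comparison above is simple to sketch: estimate word probabilities empirically, compute the block entropy, and divide by the word length. The code below is an illustrative reading of that baseline, not the authors' implementation; the word length and the fair-coin test data are arbitrary choices.

```python
import math
import random
from collections import Counter

def plugin_entropy_rate(bits, word_len):
    """Plug-in estimate of the entropy rate (bits/symbol): the empirical
    entropy of overlapping words of length word_len, divided by word_len."""
    words = [tuple(bits[i:i + word_len]) for i in range(len(bits) - word_len + 1)]
    n = len(words)
    counts = Counter(words)
    block_entropy = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return block_entropy / word_len

# A fair coin has entropy rate 1 bit/symbol; the plug-in estimate is
# slightly biased downward, consistent with the abstract's conclusion
# that bias is the main source of error.
rng = random.Random(0)
sample = [rng.randint(0, 1) for _ in range(10000)]
est = plugin_entropy_rate(sample, 4)
```

Pushing `word_len` up to capture longer-range structure requires exponentially more data, which is exactly the drawback of the plug-in method noted in conclusion (iv).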

  13. A physiological production model for cacao : results of model simulations

    NARCIS (Netherlands)

    Zuidema, P.A.; Leffelaar, P.A.


    CASE2 is a physiological model for cocoa (Theobroma cacao L.) growth and yield. This report introduces the CAcao Simulation Engine for water-limited production in a non-technical way and presents simulation results obtained with the model.

  14. Modeling and simulation of spacecraft power systems (United States)

    Lee, J. R.; Cho, B. H.; Kim, S. J.; Lee, F. C.


    EASY5 modeling of a complete spacecraft power processing system is presented. Component models are developed, and several system models including a solar array switching system, a partially-shunted solar array system and COBE system are simulated. The power system's modes of operation, such as shunt mode, battery-charge mode, and battery-discharge mode, are simulated for a complete orbit cycle.

  15. Modeling methodology for supply chain synthesis and disruption analysis (United States)

    Wu, Teresa; Blackhurst, Jennifer


    The concept of an integrated or synthesized supply chain is a strategy for managing today's globalized and customer driven supply chains in order to better meet customer demands. Synthesizing individual entities into an integrated supply chain can be a challenging task due to a variety of factors including conflicting objectives, mismatched incentives and constraints of the individual entities. Furthermore, understanding the effects of disruptions occurring at any point in the system is difficult when working toward synthesizing supply chain operations. Therefore, the goal of this research is to present a modeling methodology to manage the synthesis of a supply chain by linking hierarchical levels of the system and to model and analyze disruptions in the integrated supply chain. The contribution of this research is threefold: (1) supply chain systems can be modeled hierarchically; (2) the performance of the synthesized supply chain system can be evaluated quantitatively; and (3) reachability analysis is used to evaluate the system performance and verify whether a specific state is reachable, allowing the user to understand the extent of the effects of a disruption.
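The reachability analysis in contribution (3) can be illustrated with a minimal Petri-net-style breadth-first search over system states. The two-place toy supply chain and the transition encoding below are assumptions for illustration, not the paper's model.

```python
from collections import deque

def reachable_markings(initial, transitions, limit=10000):
    """Breadth-first reachability over a Petri-net-like model.
    `initial` is a tuple of token counts per place; each transition is a
    (consume, produce) pair of equal-length tuples."""
    seen = {initial}
    queue = deque([initial])
    while queue and len(seen) < limit:
        m = queue.popleft()
        for consume, produce in transitions:
            # A transition fires only if every place holds enough tokens.
            if all(mi >= ci for mi, ci in zip(m, consume)):
                nxt = tuple(mi - ci + pi
                            for mi, ci, pi in zip(m, consume, produce))
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
    return seen

# Toy two-place chain: places (raw, finished); "produce" turns one raw
# item into one finished item. A disruption can be modeled by removing a
# transition and re-checking whether the goal state is still reachable.
produce = ((1, 0), (0, 1))
states = reachable_markings((2, 0), [produce])
```

Removing `produce` from the transition list leaves only the initial marking reachable, which is the kind of disruption question the abstract describes.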

  16. Modeling and Simulation of LDMOS Device


    Sunitha HD; Keshaveni N


    Laterally Diffused MOSFET (LDMOS) are widely used in modern communication industry and other applications. LDMOS offers various advantages over conventional MOSFETs with little process change. In the present paper, an LDMOS device is modeled and simulated in SILVACO device simulator package using the ATHENA and ATLAS modules. The complete fabrication process is modeled and the device performance is simulated. The modeled device gives a 46 V breakdown voltage for a devi...

  17. Collisionless Electrostatic Shock Modeling and Simulation (United States)


    Air Force Research Laboratory briefing by Daniel W. Crews on collisionless electrostatic shock modeling and simulation. Distribution A: approved for public release, distribution unlimited (PA#16490). The recoverable slide outline covers background, the model, simulation results and verification, and future work, including a model problem for simulation code validation.

  18. Baccalaureate nursing students' perspectives of peer tutoring in simulation laboratory, a Q methodology study. (United States)

    Li, Ting; Petrini, Marcia A; Stone, Teresa E


    The study aim was to identify the perceived perspectives of baccalaureate nursing students toward peer tutoring in the simulation laboratory. Insight into the nursing students' experiences and baseline data related to their perception of peer tutoring will assist in improving nursing education. Q methodology was applied to explore the students' perspectives of peer tutoring in the simulation laboratory. A convenience P-sample of 40 baccalaureate nursing students was used. Fifty-eight selected Q statements from each participant were classified into the shape of a normal distribution using an 11-point bipolar scale form with a range from -5 to +5. The PQ Method software analyzed the collected data. Three discrete factors emerged: Factor I ("Facilitate or empower" knowledge acquisition), Factor II ("Safety Net" Support environment), and Factor III ("Mentoring" learn how to learn). The findings of this study indicate that peer tutoring is an effective supplementary strategy for promoting baccalaureate students' knowledge acquisition, establishing a supportive safety net and facilitating their ability to learn in the simulation laboratory. Copyright © 2017 Elsevier Ltd. All rights reserved.
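The mechanics of a forced quasi-normal Q-sort on an 11-point scale can be sketched as follows. The column frequencies (summing to the study's 58 statements) are an illustrative grid, not necessarily the one used here, and the correlation function mirrors only the first step of the factor analysis that PQ Method performs.

```python
import math

# Illustrative forced distribution: statements per scale value, summing to 58.
GRID = {-5: 3, -4: 4, -3: 5, -2: 6, -1: 7, 0: 8, 1: 7, 2: 6, 3: 5, 4: 4, 5: 3}

def forced_sort(ranking):
    """Assign 58 statements, ordered from most-disagree to most-agree,
    to scale values -5..+5 under the forced quasi-normal grid."""
    scores, it = {}, iter(ranking)
    for value in sorted(GRID):
        for _ in range(GRID[value]):
            scores[next(it)] = value
    return scores

def pearson(a, b):
    """Correlation between two Q-sorts: the raw input to Q factor analysis."""
    keys = sorted(a)
    xs, ys = [a[k] for k in keys], [b[k] for k in keys]
    n = len(keys)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Two participants with identical views correlate at +1, opposites at -1;
# factors then emerge from clusters of highly correlated sorts.
s1 = forced_sort(list(range(58)))
s2 = forced_sort(list(range(58)))
s3 = forced_sort(list(reversed(range(58))))
```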

  19. Modelling and simulation of a heat exchanger (United States)

    Xia, Lei; Deabreu-Garcia, J. Alex; Hartley, Tom T.


    Two models for two different control systems are developed for a parallel heat exchanger. First, by spatially lumping a heat exchanger model, a good approximate model of high system order is produced. Model reduction techniques are then applied to obtain low-order models suitable for dynamic analysis and control design. The simulation method is discussed to ensure valid simulation results.

  20. Methodology for modeling the microbial contamination of air filters.

    Directory of Open Access Journals (Sweden)

    Yun Haeng Joe

    Full Text Available In this paper, we propose a theoretical model to simulate microbial growth on contaminated air filters and entrainment of bioaerosols from the filters to an indoor environment. Air filter filtration and antimicrobial efficiencies, and effects of dust particles on these efficiencies, were evaluated. The number of bioaerosols downstream of the filter could be characterized according to three phases: initial, transitional, and stationary. In the initial phase, the number was determined by filtration efficiency, the concentration of dust particles entering the filter, and the flow rate. During the transitional phase, the number of bioaerosols gradually increased up to the stationary phase, at which point no further increase was observed. The antimicrobial efficiency and flow rate were the dominant parameters affecting the number of bioaerosols downstream of the filter in the transitional and stationary phase, respectively. It was found that the nutrient fraction of dust particles entering the filter caused a significant change in the number of bioaerosols in both the transitional and stationary phases. The proposed model would be a solution for predicting the air filter life cycle in terms of microbiological activity by simulating the microbial contamination of the filter.
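The three-phase behaviour described above (initial penetration, transitional growth, stationary plateau) can be caricatured with a logistic-growth toy model. Every parameter value below is a made-up illustration, not taken from the proposed model.

```python
import math

def downstream_count(t, upstream=100.0, filt_eff=0.99, antimicrobial_eff=0.9,
                     growth_rate=0.5, capacity=1e6, seed=1.0, entrain=1e-4):
    """Toy three-phase curve for bioaerosol counts downstream of a filter:
    an initial level set by filtration efficiency, a transitional rise as
    surviving microbes grow logistically on the filter, and a stationary
    plateau at the filter's carrying capacity."""
    penetration = upstream * (1 - filt_eff)        # initial phase: pass-through
    survivors = seed * (1 - antimicrobial_eff)     # colonisers that survive
    microbes = capacity / (1 + (capacity / survivors - 1)
                           * math.exp(-growth_rate * t))
    return penetration + entrain * microbes        # filtered + entrained growth

early, mid_phase, late = (downstream_count(0),
                          downstream_count(30),
                          downstream_count(100))
```

Raising `antimicrobial_eff` delays and lowers the transitional rise, matching the abstract's observation that antimicrobial efficiency dominates that phase.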

  1. A methodology for ecosystem-scale modeling of selenium (United States)

    Presser, T.S.; Luoma, S.N.


    The main route of exposure for selenium (Se) is dietary, yet regulations lack biologically based protocols for evaluations of risk. We propose here an ecosystem-scale model that conceptualizes and quantifies the variables that determine how Se is processed from water through diet to predators. This approach uses biogeochemical and physiological factors from laboratory and field studies and considers loading, speciation, transformation to particulate material, bioavailability, bioaccumulation in invertebrates, and trophic transfer to predators. Validation of the model is through data sets from 29 historic and recent field case studies of Se-exposed sites. The model links Se concentrations across media (water, particulate, tissue of different food web species). It can be used to forecast toxicity under different management or regulatory proposals or as a methodology for translating a fish-tissue (or other predator tissue) Se concentration guideline to a dissolved Se concentration. The model illustrates some critical aspects of implementing a tissue criterion: 1) the choice of fish species determines the food web through which Se should be modeled, 2) the choice of food web is critical because the particulate material to prey kinetics of bioaccumulation differs widely among invertebrates, 3) the characterization of the type and phase of particulate material is important to quantifying Se exposure to prey through the base of the food web, and 4) the metric describing partitioning between particulate material and dissolved Se concentrations allows determination of a site-specific dissolved Se concentration that would be responsible for that fish body burden in the specific environment. The linked approach illustrates that environmentally safe dissolved Se concentrations will differ among ecosystems depending on the ecological pathways and biogeochemical conditions in that system. Uncertainties and model sensitivities can be directly illustrated by varying exposure
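The linked water-to-particulate-to-prey-to-predator chain implies a simple back-calculation from a tissue criterion to a dissolved concentration. The sketch below is a schematic of that linkage only; the Kd and trophic transfer factor values are placeholders, not the site-specific values the model derives.

```python
def dissolved_from_tissue(tissue_ug_g, kd_l_kg, ttfs):
    """Translate a predator-tissue Se guideline (ug/g dry weight) into the
    dissolved concentration (ug/L) that would produce it, by inverting the
    water -> particulate -> prey -> predator chain.
    kd_l_kg: particulate/dissolved partitioning coefficient (L/kg);
    ttfs: trophic transfer factors up the chosen food web (dimensionless)."""
    particulate = tissue_ug_g
    for ttf in ttfs:
        particulate /= ttf                     # undo each trophic step
    return particulate * 1000.0 / kd_l_kg      # ug/g -> ug/L via Kd

# Placeholder numbers: Kd = 1000 L/kg, TTF(invertebrate) = 2.8,
# TTF(fish) = 1.1, tissue criterion = 8 ug/g. Different food webs and Kd
# values yield different safe dissolved concentrations, which is the
# model's central point.
allowed = dissolved_from_tissue(8.0, 1000.0, [2.8, 1.1])
```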

  2. Defining team performance for simulation-based training: methodology, metrics, and opportunities for emergency medicine. (United States)

    Shapiro, Marc J; Gardner, Roxane; Godwin, Steven A; Jay, Gregory D; Lindquist, David G; Salisbury, Mary L; Salas, Eduardo


    Across health care, teamwork is a critical element for effective patient care. Yet, numerous well-intentioned training programs may fail to achieve the desired outcomes in team performance. Hope for the improvement of teamwork in health care is provided by the success of the aviation and military communities in utilizing simulation-based training (SBT) for training and evaluating teams. This consensus paper 1) proposes a scientifically based methodology for SBT design and evaluation, 2) reviews existing team performance metrics in health care along with recommendations, and 3) focuses on leadership as a target for SBT because it has a high likelihood to improve many team processes and ultimately performance. It is hoped that this discussion will assist those in emergency medicine (EM) and the larger health care field in the design and delivery of SBT for training and evaluating teamwork.

  3. Methodological challenges to bridge the gap between regional climate and hydrology models (United States)

    Bozhinova, Denica; José Gómez-Navarro, Juan; Raible, Christoph; Felder, Guido


    The frequency and severity of floods worldwide, together with their impacts, are expected to increase under climate change scenarios. It is therefore very important to gain insight into the physical mechanisms responsible for such events in order to constrain the associated uncertainties. Model simulations of the climate and hydrological processes are important tools that can provide insight into the underlying physical processes and thus enable an accurate assessment of the risks. Coupled together, they can provide a physically consistent picture that makes it possible to assess the phenomenon in a comprehensive way. However, climate and hydrological models work at different temporal and spatial scales, so there are a number of methodological challenges that need to be carefully addressed. An important issue pertains to the presence of biases in the simulation of precipitation. Climate models in general, and Regional Climate Models (RCMs) in particular, are affected by a number of systematic biases that limit their reliability. In many studies, most prominently assessments of changes due to climate change, such biases are minimised by applying the so-called delta approach, which focuses on changes while disregarding absolute values, which are more affected by biases. However, this approach is not suitable in this scenario, as the absolute value of precipitation, rather than the change, is fed into the hydrological model. Therefore, the bias has to be removed beforehand, a complex matter for which various methodologies have been proposed. In this study, we apply and discuss the advantages and caveats of two different methodologies that correct the simulated precipitation to minimise differences with respect to an observational dataset: a linear fit (FIT) of the accumulated distributions and Quantile Mapping (QM). The target region is Switzerland, and therefore the observational dataset is provided by MeteoSwiss. The RCM is the Weather Research and Forecasting model (WRF), driven at the
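Of the two correction methods mentioned, empirical quantile mapping is easy to sketch: each simulated value is mapped through the model's empirical CDF onto the observed distribution's quantile function. This is a minimal illustration with synthetic data, not the study's implementation.

```python
from bisect import bisect_right

def quantile_map(value, simulated, observed):
    """Empirical quantile mapping: locate `value` in the model's empirical
    CDF and return the observed value at the same quantile, correcting
    the model's distributional bias."""
    sim, obs = sorted(simulated), sorted(observed)
    q = (bisect_right(sim, value) - 0.5) / len(sim)    # model empirical CDF
    idx = min(max(int(q * len(obs)), 0), len(obs) - 1)
    return obs[idx]                                     # observed inverse CDF

# Synthetic check: a model that is uniformly 2 units too wet is pulled
# back onto the observed distribution.
obs = [float(i) for i in range(100)]
sim = [o + 2.0 for o in obs]
corrected = quantile_map(52.0, sim, obs)
```

In practice the mapping is fitted per season or per month and handles dry days and distribution tails carefully; those refinements are omitted here.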

  4. Methodology and design of adaptive agent-based simulation architectures for bamboo or visual C


    Mark A. Boyd; Gagnon, Todd A.


    Zero-sum budgeting, downsizing, and increased mission requirements make it more challenging for U.S. Navy leaders to understand the short- and long-term consequences of their decisions. An enterprise model of the Navy could provide decision-makers with a tool to study how their decisions might affect the Navy's ability to conduct worldwide operations. Agent-based simulation technology provides a flexible platform to model the complex relationships between the Navy's many components. Agent-ba...

  5. Integral simulation-supported methodology for the improvement of job-shop production systems: metalworking applications in SMEs

    Directory of Open Access Journals (Sweden)

    Jaime Alberto Giraldo García


    Full Text Available Metalworking companies represent one of the strategic sectors in the regional economy of the Caldas department in Colombia; in fact, this sector accounts for 31% of the department's industrial establishments and 29% of industrial employment according to DANE (Colombian State Statistical Department) statistical data from 2005. The sector also exports to Andean countries. However, preliminary studies conducted with 57% of the entrepreneurs from this sector (excluding micro companies and family businesses) have revealed serious structural (technology, processing, installations and infrastructure) weaknesses (production planning, quality systems) in these organisations' production systems. It is hoped that this paper will disseminate amongst the academic community the results of implementing a comprehensive methodology for improving the production system of a pilot company from this particular sector. An experimental framework for improving the levels reached by the system regarding such priorities is proposed, following universally accepted methodology in discrete simulation studies; it proposes using sequential bifurcation, factorial design and response surface experimentation based on defining and weighting the competing priorities which the company should achieve. The improvements in the pilot company's production system priorities are presented in terms of an effectiveness index (EI), which rose from 1.84 to 2.46 by the end of the study.

  6. An improved methodology for dynamic modelling and simulation of ...

    Indian Academy of Sciences (India)

    Abstract. The complexity of electromechanical coupling drive systems (ECDSs), specifically electrical drive systems, makes studying them in their entirety challenging, since they consist of elements of diverse nature, i.e. electric, electronic and mechanical. This presents a real struggle to the engineers who want to design ...

  7. Using the Simulated Patient Methodology to Assess Paracetamol-Related Counselling for Headache (United States)

    Horvat, Nejc; Koder, Marko; Kos, Mitja


    Objectives Firstly, to assess paracetamol-related counselling. Secondly, to evaluate the patient’s approach as a determinant of counselling and to test the acceptability of the simulated patient method in Slovenian pharmacies. Methods The simulated patient methodology was used in 17 community pharmacies. Three scenarios related to self-medication for headaches were developed and used in all participating pharmacies. Two scenarios were direct product requests: scenario 1: a patient with an uncomplicated short-term headache; scenario 2: a patient with a severe, long-duration headache who takes paracetamol for too long and concurrently drinks alcohol. Scenario 3 was a symptom-based request: a patient asking for medicine for a headache. Pharmacy visits were audio recorded and scored according to predetermined criteria arranged in two categories: counselling content and manner of counselling. The acceptability of the methodology used was evaluated by surveying the participating pharmacists. Results The symptom-based request was scored significantly better (a mean 2.17 out of a possible 4 points) than the direct product requests (means of 1.64 and 0.67 out of a possible 4 points for scenario 1 and 2, respectively). The most common information provided was dosage and adverse effects. Only the symptom-based request stimulated spontaneous counselling. No statistically significant differences in the duration of the consultation between the scenarios were found. There were also no significant differences in the quality of counselling between the Masters of Pharmacy and Pharmacy Technicians. The acceptability of the SP method was not as high as in other countries. Conclusion The assessment of paracetamol-related counselling demonstrates room for practice improvement. PMID:23300691


    Directory of Open Access Journals (Sweden)



    Full Text Available The Bouc-Wen model is a theoretical formulation that reproduces the real hysteresis loop of the modeled object. One such object is a wire rope, as found in the equipment of a crane lifting mechanism. The modified version of the model adopted here has nine parameters, and determining such a number of parameters is a complex and problematic issue. This article presents the identification methodology and sample results of numerical simulations. The results were compared with data obtained from laboratory tests of ropes [3]; on this basis it was found that the results agree and that the model can be applied in dynamic systems containing wire ropes in their structures [4].
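For reference, the classic (unmodified) Bouc-Wen formulation can be integrated in a few lines. The nine-parameter modified version used for wire ropes is not reproduced here, and all parameter values below are illustrative.

```python
import math

def bouc_wen_force(times, disp, A=1.0, beta=0.5, gamma=0.5, n=1,
                   alpha=0.1, k=1.0):
    """Classic Bouc-Wen hysteresis under a prescribed displacement history,
    integrated with explicit Euler:
      F = alpha*k*x + (1 - alpha)*k*z
      dz/dt = A*v - beta*|v|*|z|**(n-1)*z - gamma*v*|z|**n,  v = dx/dt."""
    z, forces = 0.0, []
    for i, x in enumerate(disp):
        if i > 0:
            dt = times[i] - times[i - 1]
            v = (disp[i] - disp[i - 1]) / dt
            dz = (A * v - beta * abs(v) * abs(z) ** (n - 1) * z
                  - gamma * v * abs(z) ** n)
            z += dz * dt
        forces.append(alpha * k * x + (1 - alpha) * k * z)
    return forces

# One sinusoidal displacement cycle traces a hysteresis loop; the
# enclosed area is the energy dissipated per cycle.
N = 2000
times = [i * 2 * math.pi / N for i in range(N + 1)]
disp = [math.sin(t) for t in times]
F = bouc_wen_force(times, disp)
area = sum((F[i] + F[i + 1]) / 2 * (disp[i + 1] - disp[i]) for i in range(N))
```

Parameter identification, the hard problem the abstract describes, amounts to fitting such simulated loops to measured force-displacement data.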

  9. A combination of streamtube and geostatistical simulation methodologies for the study of large oil reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Chakravarty, A.; Emanuel, A.S.; Bernath, J.A. [Chevron Petroleum Technology Company, LaHabra, CA (United States)


    The application of streamtube models for reservoir simulation has an extensive history in the oil industry. Although these models are strictly applicable only to fields under voidage balance, they have proved to be useful in a large number of fields provided that there is no solution gas evolution and production. These models combine the benefit of very fast computational time with the practical ability to model a large reservoir over the course of its history. These models do not, however, directly incorporate the detailed geological information that recent experience has taught is important. This paper presents a technique for mapping the saturation information contained in a history-matched streamtube model onto a detailed geostatistically derived finite difference grid. With this technique, the saturation information in a streamtube model, data that is actually statistical in nature, can be identified with actual physical locations in a field and a picture of the remaining oil saturation can be determined. Alternatively, the streamtube model can be used to simulate the early development history of a field and the saturation data then used to initialize detailed late time finite difference models. The proposed method is presented through an example application to the Ninian reservoir. This reservoir, located in the North Sea (UK), is a heterogeneous sandstone characterized by a line drive waterflood, with about 160 wells, and a 16 year history. The reservoir was satisfactorily history matched and mapped for remaining oil saturation. A comparison to a 3-D seismic survey and recently drilled wells has provided preliminary verification.


    Directory of Open Access Journals (Sweden)

    Petr Dlask


    Full Text Available This paper reports on change as an indicator that can provide more focused goals in studies of development. The paper offers an answer to the question: How might management gain information from a simulation model and thus influence reality through pragmatic changes? We focus on where and when to influence, manage, and control basic technical-economic proposals. These proposals are mostly formed as simulation models. Unfortunately, however, they do not always provide an explanation of formation changes. A wide variety of simulation tools have become available, e.g. Simulink, Wolfram SystemModeler, VisSim, SystemBuild, STELLA, Adams, SIMSCRIPT, COMSOL Multiphysics, etc. However, there is only limited support for the construction of simulation models of a technical-economic nature. Mathematics has developed the concept of differentiation. Economics has developed the concept of marginality. Technical-economic design has yet to develop an equivalent methodology. This paper discusses an alternative approach that uses the phenomenon of change, and provides a way from professional knowledge, which can be seen as a purer kind of information, to a more dynamic computing model (a simulation model) that interprets change as a method. The validation of changes, both as a result for use in managerial decision making and as a condition for it, can thus be improved.

  11. An electromechanical based deformable model for soft tissue simulation. (United States)

    Zhong, Yongmin; Shirinzadeh, Bijan; Smith, Julian; Gu, Chengfan


    Soft tissue deformation is of great importance to surgery simulation. Although a significant amount of research effort has been dedicated to simulating the behaviours of soft tissues, modelling of soft tissue deformation is still a challenging problem. This paper presents a new deformable model for simulation of soft tissue deformation from the electromechanical viewpoint of soft tissues. Soft tissue deformation is formulated as a reaction-diffusion process coupled with a mechanical load. The mechanical load applied to a soft tissue to cause a deformation is incorporated into the reaction-diffusion system, and consequently distributed among mass points of the soft tissue. Reaction-diffusion of mechanical load and non-rigid mechanics of motion are combined to govern the simulation dynamics of soft tissue deformation. An improved reaction-diffusion model is developed to describe the distribution of the mechanical load in soft tissues. A three-layer artificial cellular neural network is constructed to solve the reaction-diffusion model for real-time simulation of soft tissue deformation. A gradient-based method is established to derive internal forces from the distribution of the mechanical load. Integration with a haptic device has also been achieved to simulate soft tissue deformation with haptic feedback. The proposed methodology not only predicts the typical behaviours of living tissues, but also accepts both local and large-range deformations. It also accommodates isotropic, anisotropic and inhomogeneous deformations through simple modification of the diffusion coefficients.
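A one-dimensional caricature shows the core idea of distributing a mechanical load among mass points by reaction-diffusion. The grid size, diffusion and decay coefficients below are arbitrary assumptions; the actual model is three-dimensional and solved with a cellular neural network rather than explicit time stepping.

```python
def diffuse_load(n=50, steps=2000, D=0.2, decay=0.05, load=1.0):
    """1-D sketch: a load held at one mass point spreads to its neighbours
    by diffusion while a reaction term dissipates it, so the
    deformation-driving load falls off with distance from the contact."""
    u = [0.0] * n
    centre = n // 2
    for _ in range(steps):
        u[centre] = load                 # load applied at the contact point
        nxt = u[:]
        for i in range(1, n - 1):        # fixed (zero) boundaries at 0, n-1
            lap = u[i - 1] - 2 * u[i] + u[i + 1]
            nxt[i] = u[i] + D * lap - decay * u[i]
        nxt[centre] = load
        u = nxt
    return u

u = diffuse_load()
centre = len(u) // 2
```

Internal forces in the paper's method are then derived from the gradient of such a load field; varying `D` per region is what would make the deformation anisotropic or inhomogeneous.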

  12. Modelling and Simulation of Wave Loads

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Thoft-Christensen, Palle


    A simple model of the wave load on slender members of offshore structures is described. The wave elevation of the sea state is modelled by a stationary Gaussian process. A new procedure to simulate realizations of the wave loads is developed. The simulation method assumes that the wave particle...
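A realization of a stationary Gaussian sea-surface elevation such as the one described can be simulated with the standard random-phase spectral method: sum harmonic components whose amplitudes follow the wave spectrum and whose phases are independent and uniform. The sketch below is illustrative only; the spectrum shape and all parameter values are assumptions, not the paper's procedure.

```python
import numpy as np

def simulate_wave_elevation(spectrum, omegas, t, seed=None):
    """Realization of a stationary Gaussian process (sea-surface elevation)
    by summing harmonic components with random phases.

    spectrum: one-sided spectral density S(omega) evaluated at `omegas`
    omegas:   equally spaced angular frequencies [rad/s]
    t:        time instants [s]
    """
    rng = np.random.default_rng(seed)
    d_omega = omegas[1] - omegas[0]
    amplitudes = np.sqrt(2.0 * spectrum * d_omega)       # component amplitudes
    phases = rng.uniform(0.0, 2.0 * np.pi, len(omegas))  # independent phases
    # eta(t) = sum_i a_i * cos(omega_i * t + phi_i)
    return (amplitudes[None, :] *
            np.cos(np.outer(t, omegas) + phases[None, :])).sum(axis=1)

# Illustrative Pierson-Moskowitz-like spectrum (placeholder constants)
omegas = np.linspace(0.2, 3.0, 200)
S = 0.78 / omegas**5 * np.exp(-1.25 * (0.7 / omegas)**4)
t = np.linspace(0.0, 600.0, 6000)
eta = simulate_wave_elevation(S, omegas, t, seed=42)
```

By construction the variance of the simulated process equals the integral of the spectrum, which is a convenient consistency check on any implementation.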

  13. Modeling and Simulation of Low Voltage Arcs

    NARCIS (Netherlands)

    Ghezzi, L.; Balestrero, A.


    Modeling and Simulation of Low Voltage Arcs is an attempt to improve the physical understanding, mathematical modeling and numerical simulation of the electric arcs that are found during current interruptions in low voltage circuit breakers. An empirical description is gained by refined electrical

  14. Modelling Geotechnical Heterogeneities Using Geostatistical Simulation and Finite Differences Analysis

    Directory of Open Access Journals (Sweden)

    Marisa Pinheiro


    Full Text Available Modelling a rock mass in an accurate and realistic way allows researchers to reduce the uncertainty associated with its characterisation and reproduce the intrinsic spatial variability and heterogeneities present in the rock mass. However, there is often a lack of a structured methodology to characterise heterogeneous rock masses using geotechnical information available from the prospection phase. This paper presents a characterisation methodology based on the geostatistical simulation of geotechnical variables and the application of a scenario reduction technique aimed at selecting a reduced number of realisations able to statistically represent a large set of realisations obtained by the geostatistical approach. This type of information is useful for further rock mass behaviour analysis. The methodology is applied to a gold deposit with the goal of understanding its main differences from traditional approaches based on deterministic modelling of the rock mass. The obtained results show the suitability of the methodology for characterising heterogeneous rock masses, since there were considerable differences between the results of the proposed methodology, mainly concerning the theoretical tunnel displacements, and the ones obtained with a traditional approach.

  15. SEIR model simulation for Hepatitis B (United States)

    Side, Syafruddin; Irwan, Mulbar, Usman; Sanusi, Wahidah


    Mathematical modelling and simulation of Hepatitis B are discussed in this paper. The population is divided into four compartments, namely Susceptible, Exposed, Infected and Recovered (SEIR). Several factors affecting the population in this model are vaccination, immigration and emigration. The SEIR model yields a non-linear four-dimensional system of ordinary differential equations (ODEs), which is then reduced to a three-dimensional system. The SEIR model simulation is undertaken to predict the number of Hepatitis B cases. The results of the simulation indicate that the number of Hepatitis B cases will increase and then decrease over several months. The simulation using case numbers from Makassar also found a basic reproduction number less than one, which means that Makassar city is not an endemic area for Hepatitis B. With approval from the proceedings editor, article 020185 titled "SEIR model simulation for Hepatitis B" is retracted from the public record, as it is a duplication of article 020198 published in the same volume.
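A generic SEIR system of the kind described can be integrated with a simple explicit scheme. The rates below are illustrative placeholders, not the fitted Makassar parameters; they are chosen so that the basic reproduction number beta/gamma is below one, mirroring the non-endemic finding.

```python
def seir_step(state, beta, sigma, gamma, dt):
    """One forward-Euler step of a basic SEIR model (population fractions)."""
    S, E, I, R = state
    new_infections = beta * S * I   # S -> E
    onsets = sigma * E              # E -> I
    recoveries = gamma * I          # I -> R
    return (S - new_infections * dt,
            E + (new_infections - onsets) * dt,
            I + (onsets - recoveries) * dt,
            R + recoveries * dt)

def simulate_seir(state=(0.99, 0.01, 0.0, 0.0),
                  beta=0.15, sigma=0.1, gamma=0.2, days=365, dt=0.1):
    """Integrate the SEIR system and return the full trajectory."""
    history = [state]
    for _ in range(int(days / dt)):
        state = seir_step(state, beta, sigma, gamma, dt)
        history.append(state)
    return history

# Basic reproduction number for this formulation: R0 = beta / gamma = 0.75.
# R0 < 1 means the infection dies out, i.e. a non-endemic situation.
trajectory = simulate_seir()
```

Because the four rates cancel pairwise, the total population fraction S+E+I+R is conserved at every step, which is a useful sanity check on the implementation.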

  16. Methodologies for modelling energy and amino acid responses in poultry

    Directory of Open Access Journals (Sweden)

    Robert Mervyn Gous


    Full Text Available The objective of this paper is to present some of the issues faced by those whose interest is to predict responses in poultry, concentrating mainly on those related to the prediction of voluntary food intake, as this should be the basis of models designed to optimise both performance and feeding programmes. The value of models designed to predict growth or reproductive performance has been improved inestimably by making food intake an output from, as opposed to an input to, such models. Predicting voluntary food intake requires the potential of the bird to be known, be this the growth of body protein or lipid, the growth of feather protein, or the rate at which yolk and albumen may be deposited daily in the form of an egg, and some of the issues relating to the description of potentials are discussed. This potential defines the nutrients that would be required by the bird on the day, which can be converted to a desired food intake by dividing each requirement by the content of that nutrient in the feed. There will be occasions when the bird will be unable to consume what is required, and predicting the magnitude of these constraints on intake and performance provides the greatest challenge for modellers. This paper concentrates on some issues raised in defining the nutrient requirements of an individual, on constraints on voluntary food intake such as high temperatures and the social and infectious environment, on some recent differences in the response to dietary protein that have been observed between the major broiler strains, and on the methodologies used to deal with populations of birds, and finally with broiler breeder hens, whose food intake is constrained by management, not by the environment. These issues suggest that there are still challenges that lie ahead for those wishing to predict responses to nutrients in poultry. It is imperative, however, that the methods used to measure the numbers that make theories work, and that the

  17. Modeling and inverse simulation of somersaults on the trampoline. (United States)

    Blajer, W; Czaplicki, A


    This paper describes a biomechanical model for numerical simulation of front and back somersaults, without twist, performed on the trampoline. The developed mathematical formulation is used to solve an inverse dynamics problem, in which the moments of muscle forces at the joints that result in a given (measured) motion are determined. The nature of the stunts and the way the human body is maneuvered and controlled can be studied. The calculated torques can then be used as control signals for a dynamic simulation. This provides a way to check the inverse dynamics procedures, and influence of typical control errors on somersault performance can be studied. To achieve these goals, the nonlinear dynamical model of the trampolinist and the interacting trampoline bed has been identified, and a methodology for recording the actual somersault performances was proposed. Some results of numerical simulations are reported.
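The inverse dynamics idea described above (computing the joint moments that must have acted to produce a measured motion) can be illustrated for a single planar rigid link. The central-difference scheme and all parameter values below are an illustrative sketch, not the paper's multibody trampolinist model.

```python
import math

def angular_accelerations(thetas, dt):
    """Central-difference angular accelerations from sampled joint angles."""
    return [(thetas[i - 1] - 2.0 * thetas[i] + thetas[i + 1]) / dt ** 2
            for i in range(1, len(thetas) - 1)]

def joint_torque(theta, theta_ddot, inertia, mass, com_dist, g=9.81):
    """Torque acting at the joint of a single planar link about a fixed axis:
    tau = I * theta_ddot + m * g * d * sin(theta)."""
    return inertia * theta_ddot + mass * g * com_dist * math.sin(theta)

# Recover torques from a synthetic "measured" motion theta(t) = 0.3 sin(2t)
dt = 0.001
thetas = [0.3 * math.sin(2.0 * i * dt) for i in range(1001)]
accs = angular_accelerations(thetas, dt)
torques = [joint_torque(th, acc, inertia=1.2, mass=70.0, com_dist=0.4)
           for th, acc in zip(thetas[1:-1], accs)]
```

The computed torques could then be fed back as control signals into a forward dynamic simulation, which is the consistency check the abstract describes.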

  18. RLV vehicle health management system modeling and simulation (United States)

    Wangu, Srimal


    Sanders, a Lockheed Martin Company, is leading the development and integration of the Vehicle Health Management (VHM) system for Lockheed Martin's VentureStar Reusable Launch Vehicle. The primary objective of this effort is to provide an automated health status and decision-making system for the vehicle. A detailed simulation of the VHM system on the RLV is currently being developed using the Foresight Design and Modeling Tool. The simulation will consist of models of key components of the RLV VHM system. An effective detailed system simulation will allow system and design engineering, as well as program management teams, to accurately and efficiently evaluate system designs, analyze the behavior of current systems, and predict the feasibility of making smooth and cost-efficient transitions from older technologies to newer ones. This methodology will reduce program costs, decrease total program life-cycle time, and ultimately increase mission success.

  19. Methodology for identifying parameters for the TRNSYS model Type 210 - wood pellet stoves and boilers

    Energy Technology Data Exchange (ETDEWEB)

    Persson, Tomas; Fiedler, Frank; Nordlander, Svante


    This report describes a method for performing measurements on boilers and stoves and for identifying, from those measurements, parameters for the boiler/stove model TRNSYS Type 210. The model can be used for detailed annual system simulations in TRNSYS. Experience from measurements on three different pellet stoves and four boilers was used to develop this methodology. Recommendations for the measurement set-up are given, together with the combustion theory required for data evaluation and preparation. The data evaluation showed that the uncertainties are quite large for the measured flue gas flow rate; for boilers and stoves with a high fraction of energy going to the water jacket, the calculated heat rate to the room may also have large uncertainties. A methodology for the parameter identification process is presented, along with identified parameters for two different stoves and three boilers. Finally, the identified models are compared with measured data, showing that the models generally agree well with measurements under both stationary and dynamic conditions.

  20. Model-driven design of simulation support for the TERRA robot software tool suite

    NARCIS (Netherlands)

    Lu, Zhou; Bezemer, M.M.; Broenink, Johannes F.


    Model-Driven Development (MDD) – based on the concepts of model, meta-model and model transformation – is an approach to develop predictable and reliable software for Cyber-Physical Systems (CPS). The work presented here concerns a methodology to design simulation software based on MDD techniques,

  1. Monte-Carlo simulation-based statistical modeling

    CERN Document Server

    Chen, John


    This book brings together expert researchers engaged in Monte-Carlo simulation-based statistical modeling, offering them a forum to present and discuss recent issues in methodological development as well as public health applications. It is divided into three parts, with the first providing an overview of Monte-Carlo techniques, the second focusing on missing data Monte-Carlo methods, and the third addressing Bayesian and general statistical modeling using Monte-Carlo simulations. The data and computer programs used here will also be made publicly available, allowing readers to replicate the model development and data analysis presented in each chapter, and to readily apply them in their own research. Featuring highly topical content, the book has the potential to impact model development and data analyses across a wide spectrum of fields, and to spark further research in this direction.

  2. Dynamic simulation of DH house substations. Simulation models

    Energy Technology Data Exchange (ETDEWEB)

    Thorsen, J.E. [Danfoss A/S, Nordborg (Denmark). Building Control Division


    Danfoss AS proceeds with developing simulation models of HVAC components, including control equipment for district heating systems. The author presents an example of a simulated domestic hot water service station, describes some of the model components and shows the link between the mathematical model and the simulation model. Furthermore, an example of hardware-in-the-loop simulation is presented: a domestic heating system was built up in the laboratory from hardware components connected to real-time simulations. This system forms the basis for testing and evaluating new control strategies. (orig.) [German] Danfoss AS, Nordborg, Denmark, develops simulation models for components in the field of heating, ventilation and air conditioning, including control systems for district heating plants. The author presents the simulation model for a domestic hot water heater. In addition, an example of a simulation incorporating real components is described: a heating system was set up in the laboratory and connected to a real-time simulation program. This system forms the basis for testing and evaluating new control strategies. Over the last 10 years, Danfoss has gained positive experience with the use of dynamic simulation in the development of control systems for district heating plants. It has been shown that simulation can be applied successfully, and not only for testing specific development proposals. Equally important has been gaining information and a better understanding of the interactions between the various parameters that influence the operation of a heating or hot water system. Danfoss is currently setting up a centre for the application of building automation systems. This centre will offer opportunities for training and practical experience in heating, ventilation and air conditioning technology. The simulation programs will...

  3. Developing the Business Model – a Methodology for Virtual Enterprises

    DEFF Research Database (Denmark)

    Tølle, Martin; Vesterager, Johan


    This chapter presents a methodology to develop Virtual Enterprises (VEs). This Virtual Enterprise Methodology (VEM) outlines activities to consider when setting up and managing virtual enterprises. As a methodology, the VEM helps companies to ask the right questions when preparing for, and setting up, an Enterprise Network, which works as a breeding ground for setting up VEs. The VEM applies the Virtual Enterprise Reference Architecture (VERA) as an underlying structure. VERA is a specialisation of GERA, which is a component of the GERAM Framework (Generalised Enterprise Reference Architecture...

  4. Whole-building Hygrothermal Simulation Model

    DEFF Research Database (Denmark)

    Rode, Carsten; Grau, Karl


    An existing integrated simulation tool for dynamic thermal simulation of buildings was extended with a transient model for moisture release and uptake in building materials. Validation of the new model was begun with comparison against measurements in an outdoor test cell furnished with single materials. Almost quasi-steady, cyclic experiments were used to compare the indoor humidity variation and the numerical results of the integrated simulation tool with the new moisture model. Except for the case with chipboard as furnishing, the predictions of indoor humidity with the detailed model were...

  5. Simulation modeling for the health care manager. (United States)

    Kennedy, Michael H


    This article addresses the use of simulation software to solve administrative problems faced by health care managers. Spreadsheet add-ins, process simulation software, and discrete event simulation software are available at a range of costs and complexity. All use the Monte Carlo method to realistically integrate probability distributions into models of the health care environment. Problems typically addressed by health care simulation modeling are facility planning, resource allocation, staffing, patient flow and wait time, routing and transportation, supply chain management, and process improvement.
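Patient-flow and wait-time questions of the kind listed above are often explored with a single-server queue driven by random draws. The sketch below is a minimal Monte Carlo example with assumed exponential arrival and service rates, not a reproduction of any package named in the article.

```python
import random

def simulate_waits(arrival_rate, service_rate, n_patients, seed=1):
    """Estimate patient waiting times in a single-server clinic (M/M/1 style)
    via the Lindley recursion: W[k+1] = max(0, W[k] + S[k] - A[k+1])."""
    rng = random.Random(seed)
    wait, waits = 0.0, []
    for _ in range(n_patients):
        waits.append(wait)
        service = rng.expovariate(service_rate)       # time with the clinician
        interarrival = rng.expovariate(arrival_rate)  # gap to the next arrival
        wait = max(0.0, wait + service - interarrival)
    return waits

# 4 arrivals/hour against 5 patients served/hour: M/M/1 theory predicts a
# mean queueing wait of rho / (mu - lambda) = 0.8 hours.
waits = simulate_waits(arrival_rate=4.0, service_rate=5.0, n_patients=100_000)
mean_wait = sum(waits) / len(waits)
```

Comparing the simulated mean against the closed-form M/M/1 result is a common way to validate such a model before adding clinic-specific detail.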

  6. Techniques and Simulation Models in Risk Management

    Directory of Open Access Journals (Sweden)

    Mirela GHEORGHE


    Full Text Available In the present paper, the scientific approach of the research starts from the theoretical framework of the simulation concept and then moves to the setting of practical reality, providing simulation models for a broad range of inherent risks specific to any organization, and simulating those models using the @Risk software (Palisade). The reason behind this research lies in the need for simulation models that will allow decision makers in risk management to adopt new corporate strategies answering their current needs. The results of the research are represented by two simulation models specific to risk management. The first model simulates net profit, as well as the impact that could be generated by a series of inherent risk factors such as losing some important colleagues, a drop in selling prices, a drop in sales volume, retrofitting, and so on. The second simulation model is associated with the IT field, through the analysis of 10 informatics threats, in order to evaluate the potential financial loss.
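A net-profit simulation of the first kind can be sketched in a few lines of ordinary code rather than a spreadsheet add-in. The distributions and figures below are illustrative assumptions, not those of the cited study.

```python
import random

def simulate_net_profit(n_trials=50_000, seed=7):
    """Monte Carlo net-profit simulation under a few inherent risk factors."""
    rng = random.Random(seed)
    profits = []
    for _ in range(n_trials):
        price = rng.triangular(90.0, 110.0, 100.0)  # selling-price risk
        volume = rng.triangular(800, 1200, 1000)    # sales-volume risk
        unit_cost = rng.uniform(60.0, 80.0)         # input-cost uncertainty
        fixed_costs = 20_000.0
        profits.append((price - unit_cost) * volume - fixed_costs)
    return profits

profits = simulate_net_profit()
mean_profit = sum(profits) / len(profits)
loss_probability = sum(p < 0 for p in profits) / len(profits)
```

The output distribution, rather than a single point estimate, is what lets a risk manager read off quantities such as the probability of a loss.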

  7. A novel methodology improves reservoir characterization models using geologic fuzzy variables

    Energy Technology Data Exchange (ETDEWEB)

    Soto B, Rodolfo [DIGITOIL, Maracaibo (Venezuela); Soto O, David A. [Texas A and M University, College Station, TX (United States)


    One of the research projects carried out in the Cusiana field to explain its rapid decline during recent years was to obtain better permeability models. The reservoir of this field has a complex layered system that is not easy to model using conventional methods. The new technique included the development of porosity and permeability maps from cored wells following the same trend as the sand depositions for each facies or layer, according to the sedimentary facies and depositional system models. Then, we used fuzzy logic to reproduce those maps in three dimensions as geologic fuzzy variables. After multivariate statistical and factor analyses, we found independence and a good correlation coefficient between the geologic fuzzy variables and core permeability and porosity. This means the geologic fuzzy variables could explain the fabric, the grain size and the pore geometry of the reservoir rock throughout the field. Finally, we developed a neural network permeability model using porosity, gamma ray and the geologic fuzzy variable as input variables. This model has a cross-correlation coefficient of 0.873 and an average absolute error of 33%, compared with the existing model's correlation coefficient of 0.511 and absolute error greater than 250%. We tested different methodologies, and this new one proved to be a promising way to obtain better permeability models. The use of these models has had a high impact on the explanation of well performance and workovers, and on reservoir simulation models. (author)

  8. Animated simulation models: Miracle or menace

    Directory of Open Access Journals (Sweden)

    P.S Kruger


    Full Text Available There has been a dramatic increase in the use of computer based simulation modelling over the last decade. A development that has made a significant contribution to the popularity of the simulation approach is the availability of animation facilities. These facilities are usually part of simulation model development software and often do not require very expensive microcomputer equipment. Animation provides some significant advantages during most phases of a simulation modelling effort but also has some inherent dangers and pitfalls. The purpose of this paper is: to identify and discuss some of the more important advantages and disadvantages of animation, and to provide information about some of the available simulation model development software supporting animation capabilities.

  9. High level models and methodologies for information systems

    CERN Document Server

    Isaias, Pedro


    This book introduces methods and methodologies in Information Systems (IS) by presenting, describing, explaining, and illustrating their uses in various contexts, including website development, usability evaluation, quality evaluation, and success assessment.

  10. Modeling and simulation of blood collection systems. (United States)

    Alfonso, Edgar; Xie, Xiaolan; Augusto, Vincent; Garraud, Olivier


    This paper addresses the modeling and simulation of blood collection systems in France for both fixed-site and mobile blood collection, with walk-in whole blood donors and scheduled plasma and platelet donors. Petri net models are first proposed to precisely describe different blood collection processes, donor behaviors, their material/human resource requirements and relevant regulations. The Petri net models are then enriched with quantitative modeling of donor arrivals, donor behaviors, activity times and resource capacity. Relevant performance indicators are defined. The resulting simulation models can be straightforwardly implemented with any simulation language. Numerical experiments are performed to show how the simulation models can be used to select, for different walk-in donor arrival patterns, appropriate human resource planning and donor appointment strategies.

  11. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    This is supposed to recall gambling, and hence the name Monte Carlo simulation. The procedure was developed by Stanislaw Ulam and John von Neumann. They used the simulation method to solve partial differential equations for the diffusion of neutrons! (Box 2). We can illustrate the MC method by a simple example.
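Such a simple example might be the classic Monte Carlo estimate of pi: sample random points in the unit square and count the fraction that fall inside the quarter circle.

```python
import random

def estimate_pi(n_samples, seed=0):
    """Estimate pi by sampling points in the unit square and counting
    how many fall inside the quarter circle of radius 1."""
    rng = random.Random(seed)
    inside = sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0
                 for _ in range(n_samples))
    return 4.0 * inside / n_samples

pi_estimate = estimate_pi(1_000_000)
```

The statistical error shrinks like one over the square root of the sample count, which is the characteristic convergence rate of the Monte Carlo method.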

  12. Modeling and Simulation of Matrix Converter

    DEFF Research Database (Denmark)

    Liu, Fu-rong; Klumpner, Christian; Blaabjerg, Frede


    This paper discusses the modeling and simulation of the matrix converter. Two models of the matrix converter are presented: one is based on indirect space vector modulation and the other is based on the power balance equation. The basis of these two models is given and the modeling process is introduced...

  13. Monte Carlo simulation of model spin systems

    Indian Academy of Sciences (India)

    Three-dimensional Ising models and Heisenberg models are dealt with in some detail. Recent applications of the Monte Carlo method to spin glass systems and to estimating renormalisation group critical exponents are reviewed. Keywords: Monte Carlo simulation; critical phenomena; Ising models; Heisenberg models ...

  14. A novel methodology to model the cooling processes of packed horticultural produce using 3D shape models (United States)

    Gruyters, Willem; Verboven, Pieter; Rogge, Seppe; Vanmaercke, Simon; Ramon, Herman; Nicolai, Bart


    Freshly harvested horticultural produce require a proper temperature management to maintain their high economic value. Towards this end, low temperature storage is of crucial importance to maintain a high product quality. Optimizing both the package design of packed produce and the different steps in the postharvest cold chain can be achieved by numerical modelling of the relevant transport phenomena. This work presents a novel methodology to accurately model both the random filling of produce in a package and the subsequent cooling process. First, a cultivar-specific database of more than 100 realistic CAD models of apple and pear fruit is built with a validated geometrical 3D shape model generator. To have an accurate representation of a realistic picking season, the model generator also takes into account the biological variability of the produce shape. Next, a discrete element model (DEM) randomly chooses surface meshed bodies from the database to simulate the gravitational filling process of produce in a box or bin, using actual mechanical properties of the fruit. A computational fluid dynamics (CFD) model is then developed with the final stacking arrangement of the produce to study the cooling efficiency of packages under several conditions and configurations. Here, a typical precooling operation is simulated to demonstrate the large differences between using actual 3D shapes of the fruit and an equivalent spheres approach that simplifies the problem drastically. From this study, it is concluded that using a simplified representation of the actual fruit shape may lead to a severe overestimation of the cooling behaviour.

  15. An efficient hysteresis modeling methodology and its implementation in field computation applications

    Energy Technology Data Exchange (ETDEWEB)

    Adly, A.A., E-mail: [Electrical Power and Machines Dept., Faculty of Engineering, Cairo University, Giza 12613 (Egypt); Abd-El-Hafiz, S.K. [Engineering Mathematics Department, Faculty of Engineering, Cairo University, Giza 12613 (Egypt)


    Highlights: • An approach to simulate hysteresis while taking shape anisotropy into consideration. • Utilizing the ensemble of triangular sub-region hysteresis models in field computation. • A novel tool capable of carrying out field computation while keeping track of hysteresis losses. • The approach may be extended to 3D tetrahedral sub-volumes. - Abstract: Field computation in media exhibiting hysteresis is crucial to a variety of applications such as magnetic recording processes and accurate determination of core losses in power devices. Recently, Hopfield neural networks (HNN) have been successfully configured to construct scalar and vector hysteresis models. This paper presents an efficient hysteresis modeling methodology and its implementation in field computation applications. The methodology is based on the application of the integral equation approach to discretized triangular magnetic sub-regions. Within every triangular sub-region, hysteresis properties are realized using a 3-node HNN. Details of the approach and sample computation results are given in the paper.

  16. Do Methodological Choices in Environmental Modeling Bias Rebound Effects? A Case Study on Electric Cars. (United States)

    Font Vivanco, David; Tukker, Arnold; Kemp, René


    Improvements in resource efficiency often underperform because of rebound effects. Calculations of the size of rebound effects are subject to various types of bias, among which methodological choices have received particular attention. Modellers have primarily focused on choices related to changes in demand; choices related to modeling the environmental burdens from such changes have received less attention. In this study, we analyze choices in the environmental assessment methods (life cycle assessment (LCA) and hybrid LCA) and environmental input-output databases (E3IOT, Exiobase and WIOD) used as a source of bias. The analysis is done for a case study on battery electric and hydrogen cars in Europe. The results describe moderate rebound effects for both technologies in the short term. Additionally, long-run scenarios are calculated by simulating the total cost of ownership; these describe notable rebound effect sizes (from 26 to 59% and from 18 to 28%, respectively, depending on the methodological choices) under favorable economic conditions. Relevant sources of bias are found to be related to incomplete background systems, technology assumptions and sectorial aggregation. These findings highlight the importance of the method setup and of sensitivity analyses of choices related to environmental modeling in rebound effect assessments.

  17. A new methodology to determine kinetic parameters for one- and two-step chemical models (United States)

    Mantel, T.; Egolfopoulos, F. N.; Bowman, C. T.


    In this paper, a new methodology is presented for determining kinetic parameters for the simple chemical models and simple transport properties classically used in DNS of premixed combustion. First, a one-dimensional code is used to compute steady, unstrained laminar methane-air flames in order to verify intrinsic features of laminar flames such as burning velocity and temperature and concentration profiles. Second, the flame response to steady and unsteady strain in the opposed-jet configuration is numerically investigated. It appears that, for a well-determined set of parameters, one- and two-step mechanisms reproduce the extinction limit of a laminar flame subjected to steady strain. Computations with the GRI-Mech mechanism (177 reactions, 39 species) and multicomponent transport properties are used to validate these simplified models. A sensitivity analysis of the preferential diffusion of heat and reactants when the Lewis number is close to unity indicates that the response of the flame to an oscillating strain is very sensitive to this number. As an application of this methodology, the interaction between a two-dimensional vortex pair and a premixed laminar flame is simulated by Direct Numerical Simulation (DNS) using the one- and two-step mechanisms. Comparison with the experimental results of Samaniego et al. (1994) shows a significant improvement in the description of the interaction when the two-step model is used.
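A one-step global mechanism of the kind discussed reduces the chemistry to a single irreversible reaction whose rate follows an Arrhenius law. The sketch below shows that form only; the pre-exponential factor and activation energy are illustrative placeholders, not the paper's calibrated values.

```python
import math

def one_step_rate(y_fuel, y_ox, temperature, pre_exp, e_act, r_gas=8.314):
    """Reaction rate of a one-step global mechanism, Arrhenius form:
    omega = A * Y_fuel * Y_ox * exp(-Ea / (R * T))."""
    return pre_exp * y_fuel * y_ox * math.exp(-e_act / (r_gas * temperature))

# Illustrative placeholder parameters (not calibrated values)
A, EA = 1.0e9, 1.2e5  # pre-exponential factor; activation energy [J/mol]
rate_cold = one_step_rate(0.05, 0.20, 1000.0, A, EA)
rate_hot = one_step_rate(0.05, 0.20, 2000.0, A, EA)
```

Fitting A and Ea so that such a rate reproduces the measured burning velocity and extinction strain of a detailed mechanism is exactly the kind of calibration the methodology above addresses.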

  18. Process simulation and parametric modeling for strategic project management

    CERN Document Server

    Morales, Peter J


    Process Simulation and Parametric Modeling for Strategic Project Management offers CIOs, CTOs, software development managers and IT graduate students an introduction to a set of technologies that will help them understand how to better plan software development projects, manage risk and have better insight into the complexities of the software development process. A novel methodology will be introduced that allows a software development manager to better plan and assess risks in the early planning of a project. By providing a better model for early software development estimation and softw...

  19. Multiscale methodology for bone remodelling simulation using coupled finite element and neural network computation. (United States)

    Hambli, Ridha; Katerchi, Houda; Benhamou, Claude-Laurent


    The aim of this paper is to develop a multiscale hierarchical hybrid model based on finite element analysis and neural network computation to link the mesoscopic scale (trabecular network level) and the macroscopic scale (whole bone level) in order to simulate the process of bone remodelling. As whole bone simulation, including the 3D reconstruction of trabecular-level bone, is time consuming, finite element calculation is only performed at the macroscopic level, whilst trained neural networks are employed as numerical substitutes for the finite element code needed for the mesoscale prediction. The bone mechanical properties are updated at the macroscopic scale depending on the morphological and mechanical adaptation at the mesoscopic scale computed by the trained neural network. The digital image-based modelling technique using μ-CT and voxel finite element analysis is used to capture volume elements representative of 2 mm³ at the mesoscale level of the femoral head. The input data for the artificial neural network are a set of bone material parameters, boundary conditions and the applied stress. The output data are the updated bone properties and some trabecular bone factors. The current approach is the first model, to our knowledge, that incorporates both finite element analysis and neural network computation to rapidly simulate multilevel bone adaptation.

  20. Methodology for transient simulation of a small heliothermic central station; Metodologia para simulacao transiente de uma pequena central heliotermica

    Energy Technology Data Exchange (ETDEWEB)

    Wendel, Marcelo


    The final steps of generating electricity from concentrated solar power technologies are similar to conventional thermal processes, since steam or gas is also employed for moving turbines or pistons. The fundamental difference lies in the fact that the steam or hot gas is generated by solar radiation instead of fossil fuels or nuclear heat. The cheapest electricity generated from solar energy has been achieved with large-scale power stations based on this concept. Computer simulations represent a low-cost option for the design of thermal systems. The present study aims to develop a methodology for the transient simulation of a micro-scale solar-thermal power plant (120 kWe) that is appropriate in terms of accuracy and computational effort. The facility considered can optionally operate as a cogeneration plant producing electric power as well as chilled water. Solar radiation is collected by parabolic troughs, electricity is generated by an organic Rankine cycle, and chilled water is produced by an absorption cooling cycle. The organic Rankine cycle is of interest because it allows for a plant with a relatively simple structure and automated operation. The simulation methodology proposed in this study is implemented in TRNSYS with new components (TYPEs) developed for the solar field and thermal cycles. The parabolic trough field component is based on an experimental efficiency curve of the solar collector. In the case of the Rankine and absorption cycles, the components are based on performance polynomials generated with EES from detailed thermodynamic models, which are calibrated with performance data from manufacturers. Distinct plant configurations are considered. An optimization algorithm is used to search for the best operating point in each case. Results are presented for the following Brazilian sites: Fortaleza, Petrolina and Bom Jesus da Lapa. The latter offers the highest global plant performance. An analysis about the influence of the thermal storage on
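The parabolic trough component described above is driven by an experimental collector efficiency curve. A minimal sketch of such a component, using the common quadratic efficiency-curve form with purely illustrative coefficients (not taken from the study):

```python
def collector_heat_rate(G_b, T_in, T_amb, area,
                        eta0=0.68, a1=0.35, a2=0.0022):
    """Useful heat rate [W] of a parabolic trough field from an empirical
    efficiency curve (all coefficients are illustrative only):
        eta = eta0 - a1*(dT/G_b) - a2*(dT**2/G_b)
    G_b: beam irradiance [W/m^2]; temperatures in [C]; area in [m^2]."""
    if G_b <= 0:
        return 0.0
    dT = T_in - T_amb
    eta = eta0 - a1 * dT / G_b - a2 * dT**2 / G_b
    return max(eta, 0.0) * G_b * area
```

A transient simulation would call this every timestep with the current irradiance and fluid inlet temperature, e.g. `collector_heat_rate(800.0, 180.0, 25.0, 1500.0)`.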

  1. Natural gas production problems : solutions, methodologies, and modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Rautman, Christopher Arthur; Herrin, James M.; Cooper, Scott Patrick; Basinski, Paul M. (El Paso Production Company, Houston, TX); Olsson, William Arthur; Arnold, Bill Walter; Broadhead, Ronald F. (New Mexico Bureau of Geology and Mineral Resources, Socorro, NM); Knight, Connie D. (Consulting Geologist, Golden, CO); Keefe, Russell G.; McKinney, Curt (Devon Energy Corporation, Oklahoma City, OK); Holm, Gus (Vermejo Park Ranch, Raton, NM); Holland, John F.; Larson, Rich (Vermejo Park Ranch, Raton, NM); Engler, Thomas W. (New Mexico Institute of Mining and Technology, Socorro, NM); Lorenz, John Clay


    Natural gas is a clean fuel that will be the most important domestic energy resource for the first half of the 21st century. Ensuring a stable supply is essential for our national energy security. The research we have undertaken will maximize the extractable volume of gas while minimizing the environmental impact of surface disturbances associated with drilling and production. This report describes a methodology for comprehensive evaluation and modeling of the total gas system within a basin, focusing on problematic horizontal fluid flow variability. This has been accomplished through extensive use of geophysical, core (rock sample) and outcrop data to interpret and predict directional flow and production trends. Side benefits include reduced environmental impact of drilling due to a reduced number of wells required for resource extraction. These results have been accomplished through a cooperative and integrated systems approach involving industry, government, academia and a multi-organizational team within Sandia National Laboratories. Industry has provided essential in-kind support to this project in the forms of extensive core data, production data, maps, seismic data, production analyses, engineering studies, plus equipment and staff for obtaining geophysical data. This approach provides innovative ideas and technologies to bring new resources to market and to reduce the overall environmental impact of drilling. More importantly, the products of this research are not location specific but can be extended to other areas of gas production throughout the Rocky Mountain area. Thus this project is designed to solve problems associated with natural gas production at developing sites, or at old sites under redevelopment.

  2. Contribution to the Development of Simulation Model of Ship Turbine

    Directory of Open Access Journals (Sweden)

    Božić Ratko


    Full Text Available Simulation modelling, performed by the System Dynamics Modelling Approach and intensive use of computers, is one of the most convenient and most successful scientific methods for analyzing the performance dynamics of nonlinear and very complex natural, technical and organizational systems [1]. The purpose of this work is to demonstrate the successful application of system dynamics simulation modelling in analyzing the performance dynamics of a complex ship propulsion system. A gas turbine is a complex non-linear system, which needs to be systematically investigated as a unit consisting of a number of subsystems and elements that are linked by cause-effect (UPV) feedback loops (KPD), both within the propulsion system and with the relevant surroundings. In this paper the authors present an efficient application of a scientific method for the study of complex dynamic systems, the qualitative and quantitative simulation of System Dynamics Methodology. The gas turbine is represented by a set of non-linear differential equations, after which mental-verbal structural models and flowcharts in System Dynamics symbols are produced, and the performance dynamics under load conditions are simulated in the POWERSIM simulation language.
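The abstract describes representing the turbine by non-linear differential equations and simulating load behavior in POWERSIM. As a stand-in illustration of the same stock-and-flow idea, here is a hypothetical first-order rotor model integrated with Euler's method; the model form and every parameter are invented for the sketch, not taken from the paper:

```python
def simulate_rotor(torque_drive, torque_load, inertia=5.0,
                   friction=0.08, dt=0.01, steps=2000, omega0=0.0):
    """Euler ('stock and flow') integration of a first-order rotor model:
        J * domega/dt = T_drive - T_load - k * omega
    Returns the rotor speed history. All parameters are illustrative."""
    omega = omega0
    history = []
    for _ in range(steps):
        domega = (torque_drive - torque_load - friction * omega) / inertia
        omega += domega * dt   # the 'level' (stock) accumulates its net flow
        history.append(omega)
    return history
```

The speed approaches its steady-state value `(T_drive - T_load) / k` asymptotically; a system-dynamics tool draws the same computation as a level fed by rate variables.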

  3. Modified network simulation model with token method of bus access

    Directory of Open Access Journals (Sweden)

    L.V. Stribulevich


    Full Text Available Purpose. To study the characteristics of a local network using the token method of bus access, a modified simulation model of the network was developed. Methodology. The network characteristics are determined with the developed simulation model, which is based on a state diagram of the network station layer with a priority-processing mechanism, both in steady state and during control procedures: initiation of the logical ring, and the entrance to and exit from the logical ring by network stations. Findings. A simulation model was developed from which one can obtain the dependence of the maximum queue waiting time of applications for different access classes, the reaction time, and the usable bandwidth on the data rate, the number of network stations, the application generation rate, the number of frames transmitted per token holding time, and the frame length. Originality. A technique of network simulation was proposed that reflects the network's operation in steady state and during control procedures, together with the mechanism of priority ranking and handling. Practical value. The developed simulation model enables defining network characteristics for real-time systems in railway transport.
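As an illustration of the kind of model described, here is a toy token-passing bus simulation that reports the maximum and mean frame waiting times; the structure (single access class, fixed token-pass time) and all parameters are hypothetical simplifications, not the paper's model:

```python
import random

def simulate_token_bus(n_stations=8, frame_time=1.0, arrival_rate=0.05,
                       frames_per_token=2, sim_time=10_000.0, seed=1):
    """Toy token-passing bus: the token visits stations in a logical ring;
    each visit transmits up to `frames_per_token` queued frames.
    Returns (max, mean) frame waiting time. Parameters are illustrative."""
    rng = random.Random(seed)
    queues = [[] for _ in range(n_stations)]   # frame arrival timestamps
    next_arrival = [rng.expovariate(arrival_rate) for _ in range(n_stations)]
    token_pass_time = 0.1
    t, station, waits = 0.0, 0, []
    while t < sim_time:
        for i in range(n_stations):            # admit arrivals up to now
            while next_arrival[i] <= t:
                queues[i].append(next_arrival[i])
                next_arrival[i] += rng.expovariate(arrival_rate)
        for _ in range(min(frames_per_token, len(queues[station]))):
            waits.append(t - queues[station].pop(0))
            t += frame_time                    # transmission time
        t += token_pass_time                   # pass token to next station
        station = (station + 1) % n_stations
    return max(waits), sum(waits) / len(waits)
```

Sweeping `arrival_rate`, `n_stations`, or `frames_per_token` reproduces the kind of dependency curves the abstract refers to.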

  4. Development of a system dynamics model based on Six Sigma methodology

    Directory of Open Access Journals (Sweden)

    José Jovani Cardiel Ortega


    Full Text Available A dynamic model to analyze the complexity associated with manufacturing systems and to improve process performance through the Six Sigma philosophy is proposed. The research focuses on the implementation of the system dynamics tool to comply with each of the phases of the DMAIC methodology. In the first phase, define, the problem is articulated by collecting data, selecting the variables, and representing them in a mental map that helps build the dynamic hypothesis. In the second phase, measure, the model is formulated, equations are developed, and a Forrester diagram is built to carry out the simulation. In the third phase, analyze, the simulation results are studied. In the fourth phase, improve, the model is validated through a sensitivity analysis. Finally, in the control phase, operation policies are proposed. This paper presents the development of a dynamic model of a knitted textile production system; the implementation was done in a textile company in southern Guanajuato. The results show an improvement in process performance by increasing the sigma level, validating the proposed approach.

  5. Microgrid Modeling and Simulation Study (United States)


    lightning, and other scenarios need to be simulated and hardware tested to characterize system robustness. The reviewed M&S tools were divided into the ...5 capability categories for tactical microgrids: Demand Management, Power Distribution, Source Management, Communications, and Smart Controls. In ...requires a short-term investment to produce results. • Component Metadata is the use of digital information from equipment for microgrid Smart

  6. Methodology for the Incorporation of Passive Component Aging Modeling into the RAVEN/ RELAP-7 Environment

    Energy Technology Data Exchange (ETDEWEB)

    Mandelli, Diego; Rabiti, Cristian; Cogliati, Joshua; Alfonsi, Andrea; Askin Guler; Tunc Aldemir


    Passive systems, structures and components (SSCs) degrade over their operating life, and this degradation may reduce the safety margins of a nuclear power plant. In traditional probabilistic risk assessment (PRA) using the event-tree/fault-tree methodology, passive SSC failure rates are generally based on generic plant failure data, and the true state of a specific plant is not reflected realistically. To address aging effects of passive SSCs in the traditional PRA methodology, [1] does consider physics-based models that account for the operating conditions in the plant; however, [1] does not include the effects of surveillance/inspection. This paper presents an overall methodology for the incorporation of aging modeling of passive components into the RAVEN/RELAP-7 environment, which provides a framework for performing dynamic PRA. Dynamic PRA allows consideration of both epistemic and aleatory uncertainties (including those associated with maintenance activities) in a consistent phenomenological and probabilistic framework and is often needed when there is complex process/hardware/software/firmware/human interaction [2]. Dynamic PRA has gained attention recently due to difficulties in the traditional PRA modeling of aging effects of passive components using physics-based models, and also in the modeling of digital instrumentation and control systems. RAVEN (Reactor Analysis and Virtual control Environment) [3] is a software package under development at the Idaho National Laboratory (INL) as an online control logic driver and post-processing tool. It is coupled to the plant transient code RELAP-7 (Reactor Excursion and Leak Analysis Program), also currently under development at INL [3], as well as RELAP 5 [4]. The overall methodology aims to: • Address multiple aging mechanisms involving a large number of components in a computationally feasible manner where the sequencing of events is conditioned on the physical conditions predicted in a simulation
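The methodology samples component degradation while crediting surveillance/inspection activities. A schematic Monte Carlo in that spirit (this is not RAVEN/RELAP-7 code; the Weibull failure-time model, the detect-and-renew inspection rule, and all numbers are illustrative assumptions):

```python
import random

def failure_probability(mission_time=40.0, inspection_interval=10.0,
                        weibull_shape=2.5, weibull_scale=60.0,
                        detect_prob=0.9, n_samples=20_000, seed=7):
    """Monte Carlo estimate of the probability that a passive component
    fails during the mission, under aging (Weibull failure time) and
    periodic inspections that may detect and renew the degraded component.
    Illustrative sketch only; times are in years."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        t, age = 0.0, 0.0
        fail_age = rng.weibullvariate(weibull_scale, weibull_shape)
        while t < mission_time:
            next_insp = min(t + inspection_interval, mission_time)
            if age + (next_insp - t) >= fail_age:   # fails before inspection
                failures += 1
                break
            age += next_insp - t
            t = next_insp
            if t < mission_time and rng.random() < detect_prob:
                age = 0.0                            # renewal after detection
                fail_age = rng.weibullvariate(weibull_scale, weibull_shape)
    return failures / n_samples
```

With `detect_prob=0.0` the estimate reduces to the plain Weibull failure probability over the mission; effective inspections drive it down, which is the aging/surveillance interplay the abstract highlights.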

  7. Systematic modelling and simulation of refrigeration systems

    DEFF Research Database (Denmark)

    Rasmussen, Bjarne D.; Jakobsen, Arne


    The task of developing a simulation model of a refrigeration system can be very difficult and time consuming. In order for this process to be effective, a systematic method for developing the system model is required. This method should aim at guiding the developer to clarify the purpose of the simulation, to select appropriate component models and to set up the equations in a well-arranged way. In this paper the outline of such a method is proposed and examples showing the use of this method for simulation of refrigeration systems are given.

  8. Reference Management Methodologies for Large Structural Models at Kennedy Space Center (United States)

    Jones, Corey; Bingham, Ryan; Schmidt, Rick


    There have been many challenges associated with modeling some of NASA KSC's largest structures. Given the size of the welded structures here at KSC, it was critically important to properly organize model structure and carefully manage references. Additionally, because of the amount of hardware to be installed on these structures, it was very important to have a means to coordinate between different design teams and organizations, check for interferences, produce consistent drawings, and allow for simple release processes. Facing these challenges, the modeling team developed a unique reference management methodology and model fidelity methodology. This presentation will describe the techniques and methodologies that were developed for these projects. The attendees will learn about KSC's reference management and model fidelity methodologies for large structures. The attendees will understand the goals of these methodologies. The attendees will appreciate the advantages of developing a reference management methodology.

  9. Evaluating Asset Pricing Models in a Simulated Multifactor Approach

    Directory of Open Access Journals (Sweden)

    Wagner Piazza Gaglianone


    Full Text Available In this paper a methodology to compare the performance of different stochastic discount factor (SDF) models is suggested. The starting point is the estimation of several factor models in which the choice of the fundamental factors comes from different procedures. Then, a Monte Carlo simulation is designed in order to simulate a set of gross returns with the objective of mimicking the temporal dependency and the observed covariance across gross returns. Finally, the artificial returns are used to investigate the performance of the competing asset pricing models through the Hansen and Jagannathan (1997) distance and some goodness-of-fit statistics of the pricing error. An empirical application is provided for the U.S. stock market.
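The simulation step described, generating artificial gross returns that mimic the observed covariance and temporal dependence, can be sketched as follows; the AR(1) shock structure and all details are simplifying assumptions for illustration, not the paper's exact design:

```python
import numpy as np

def simulate_gross_returns(returns, n_periods=1000, ar_coef=0.2, seed=0):
    """Generate artificial gross returns whose cross-sectional covariance
    matches the sample covariance (via a Cholesky factor) and whose shocks
    carry simple AR(1) temporal dependence with unit stationary variance."""
    rng = np.random.default_rng(seed)
    mu = returns.mean(axis=0)
    cov = np.cov(returns, rowvar=False)
    L = np.linalg.cholesky(cov + 1e-12 * np.eye(len(mu)))  # jitter for PD
    z_prev = np.zeros(len(mu))
    scale = np.sqrt(1.0 - ar_coef**2)   # keeps innovation variance at one
    out = np.empty((n_periods, len(mu)))
    for t in range(n_periods):
        z = ar_coef * z_prev + scale * rng.standard_normal(len(mu))
        out[t] = mu + L @ z
        z_prev = z
    return out
```

The generated panel can then be fed to each candidate SDF model to compute pricing errors and distance metrics under a controlled data-generating process.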

  10. Study on dynamic team performance evaluation methodology based on team situation awareness model

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Suk Chul


    The purpose of this thesis is to provide a theoretical framework and an evaluation methodology for the dynamic task performance of an operating team at a nuclear power plant under dynamic and tactical conditions such as a radiological accident. This thesis suggests a team dynamic task performance evaluation model, the so-called team crystallization model, stemming from Endsley's situation awareness model and comprising four elements: state, information, organization, and orientation, together with quantification methods using a system dynamics approach and a communication process model based on a receding horizon control approach. The team crystallization model is a holistic approach for evaluating team dynamic task performance in conjunction with team situation awareness, considering physical system dynamics and team behavioral dynamics for tactical and dynamic tasks at a nuclear power plant. This model provides a systematic measure to evaluate time-dependent team effectiveness or performance affected by multiple agents such as plant states, communication quality in terms of transferring situation-specific information and strategies for achieving the team task goal at a given time, and organizational factors. To demonstrate the applicability of the proposed model and its quantification method, a case study was carried out using data obtained from a full-scope power plant simulator for 1,000 MWe pressurized water reactors with four on-the-job operating groups and one expert group who knows the accident sequences. Simulated results for team dynamic task performance, with reference to key plant parameter behavior and the team-specific organizational center of gravity and cue-and-response matrix, showed good agreement with observed values. The team crystallization model will be a useful and effective tool for evaluating team effectiveness when recruiting new operating teams for new plants in a cost-effective manner. Also, this model can be utilized as a systematic analysis tool for

  11. An expert system for national economy model simulations

    Directory of Open Access Journals (Sweden)

    Roljić Lazo


    Full Text Available There are some fundamental economic uncertainties. We cannot forecast economic events with very high scientific precision. It is very clear that there does not exist a unique 'general' model which can yield all answers to a wide range of macroeconomic issues. Therefore, we use several different kinds of models on segments of the macroeconomic problem. Different models can distinguish/solve economy disaggregation, time series analysis and other subfactors involved in macroeconomic problem solving. A major issue becomes finding a meaningful method to link these econometric models. Macroeconomic models were linked through the development of an Expert System for National Economy Model Simulations (ESNEMS). ESNEMS consists of five parts: (1) a small-scale short-term national econometric model, (2) the Methodology of Interactive Nonlinear Goal Programming (MINGP), (3) a database of historical macroeconomic aggregates, (4) a software interface for interactive communication between the model and a decision maker, and (5) software for solving problems. ESNEMS was developed to model the optimum macroeconomic policy of a developing country (SFRY, formerly Yugoslavia). Most econometric models are very complex. Optimizing economic policy is typically defined as a nonlinear goal programming problem. To solve/optimize these models, a new methodology, MINGP, was developed as a part of ESNEMS. MINGP is methodologically based on linear goal programming and the feasible directions method. Using Euler's Homogeneous Function Theorem, MINGP linearizes nonlinear homogeneous functions. The highest priorities in minimizing the objective function are the growth of gross domestic product and the decrease of inflation. At the core of the optimization model, MINGP, there is a small-scale econometric model. This model was designed through analysis of the causal relations in the SFRY's social reproduction process of the past 20 years. The objective of the econometric model is to simulate

  12. Modeling and simulation for RF system design

    CERN Document Server

    Frevert, Ronny; Jancke, Roland; Knöchel, Uwe; Schwarz, Peter; Kakerow, Ralf; Darianian, Mohsen


    Focusing on RF-specific modeling and simulation methods, and system- and circuit-level descriptions, this work contains application-oriented training material. Accompanied by a CD-ROM, it combines the presentation of a mixed-signal design flow, an introduction to VHDL-AMS and Verilog-A, and the application of commercially available simulators.

  13. Construction Safety Risk Modeling and Simulation. (United States)

    Tixier, Antoine J-P; Hallowell, Matthew R; Rajagopalan, Balaji


    By building on a genetic-inspired attribute-based conceptual framework for safety risk analysis, we propose a novel approach to define, model, and simulate univariate and bivariate construction safety risk at the situational level. Our fully data-driven techniques provide construction practitioners and academicians with an easy and automated way of getting valuable empirical insights from attribute-based data extracted from unstructured textual injury reports. By applying our methodology on a data set of 814 injury reports, we first show the frequency-magnitude distribution of construction safety risk to be very similar to that of many natural phenomena such as precipitation or earthquakes. Motivated by this observation, and drawing on state-of-the-art techniques in hydroclimatology and insurance, we then introduce univariate and bivariate nonparametric stochastic safety risk generators based on kernel density estimators and copulas. These generators enable the user to produce large numbers of synthetic safety risk values faithful to the original data, allowing safety-related decision making under uncertainty to be grounded on extensive empirical evidence. One of the implications of our study is that like natural phenomena, construction safety may benefit from being studied quantitatively by leveraging empirical data rather than strictly being approached through a managerial perspective using subjective data, which is the current industry standard. Finally, a side but interesting finding is that in our data set, attributes related to high energy levels (e.g., machinery, hazardous substance) and to human error (e.g., improper security of tools) emerge as strong risk shapers. © 2017 Society for Risk Analysis.
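A univariate version of the nonparametric generator described can be sketched with sampling from a Gaussian kernel density estimate; the bandwidth rule (Silverman's) and the sampling scheme are standard choices assumed for illustration, not necessarily those used in the paper:

```python
import numpy as np

def kde_risk_generator(risk_values, n_synthetic, seed=0):
    """Draw synthetic safety-risk values from a Gaussian kernel density
    estimate of the observed univariate risks: pick an observed value at
    random, then perturb it with Gaussian noise whose standard deviation
    is the KDE bandwidth (Silverman's rule). Illustrative sketch only."""
    rng = np.random.default_rng(seed)
    x = np.asarray(risk_values, dtype=float)
    bw = 1.06 * x.std(ddof=1) * len(x) ** (-1 / 5)   # Silverman's rule
    centers = rng.choice(x, size=n_synthetic, replace=True)
    return centers + rng.normal(0.0, bw, size=n_synthetic)
```

Large synthetic samples produced this way stay faithful to the empirical frequency-magnitude distribution while smoothing over the finite set of observed injury reports; the bivariate case in the paper additionally couples two such margins through a copula.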

  14. Modeling and simulation of complex systems a framework for efficient agent-based modeling and simulation

    CERN Document Server

    Siegfried, Robert


    Robert Siegfried presents a framework for efficient agent-based modeling and simulation of complex systems. He compares different approaches for describing structure and dynamics of agent-based models in detail. Based on this evaluation the author introduces the "General Reference Model for Agent-based Modeling and Simulation" (GRAMS). Furthermore he presents parallel and distributed simulation approaches for execution of agent-based models -from small scale to very large scale. The author shows how agent-based models may be executed by different simulation engines that utilize underlying hard

  15. Computational Modeling of Simulation Tests. (United States)


    cavity was simulated with a nonrigid, partially reflecting heavy gas (the rigid wall of 905.0021 was replaced with additional cells of ideal gas which...the shock tunnel at the 4.14-MPa range found in calculation 906.1081. The driver consisted of 25 cells of burned ammonium nitrate and fuel oil (ANFO). [Figure residue: "Reflected Wave Geometry, Calculation 906.1091", showing the driver region, reaction region boundary, burned ANFO, rigid reflecting wall, and real air; AX = 250 mm]

  16. How Does Environmental Regulation Affect Industrial Transformation? A Study Based on the Methodology of Policy Simulation

    Directory of Open Access Journals (Sweden)

    Wei Liu


    Full Text Available Differences in factor input structure determine different responses to environmental regulation. This paper constructs a theoretical model including environmental regulation, factor input structure, and industrial transformation, and conducts a policy simulation based on the different influencing mechanisms of environmental regulation, considering industrial heterogeneity. The findings show that the impact of environmental regulation on industrial transformation reflects the competition between the resource-allocation distortion effect and the technology effect. Environmental regulation will promote industrial transformation when its technology effect is stronger than its resource-allocation distortion effect. In particular, command-control environmental regulation has a significant incentive effect and a spillover effect of technological innovation on cleaning industries, but these effects do not exist in pollution-intensive industries. Command-control environmental regulation promotes industrial transformation. The simulation results show that market-incentive environmental regulation behaves similarly to command-control regulation.

  17. SEIR model simulation for Hepatitis B (United States)

    Side, Syafruddin; Irwan, Mulbar, Usman; Sanusi, Wahidah


    Mathematical modelling and simulation for Hepatitis B are discussed in this paper. The population is divided into four variables, namely: Susceptible, Exposed, Infected and Recovered (SEIR). Several factors affecting the population in this model are vaccination, immigration and emigration occurring in the population. The SEIR model yields a 4-D non-linear system of Ordinary Differential Equations (ODEs), which is then reduced to 3-D. The SEIR model simulation is undertaken to predict the number of Hepatitis B cases. The results of the simulation indicate that the number of Hepatitis B cases will increase and then decrease over several months. The simulation using the number of cases in Makassar also found the basic reproduction number to be less than one, which means the city of Makassar is not an endemic area of Hepatitis B.
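The 4-D SEIR system described can be sketched as follows; the vaccination term and all parameter values are illustrative, and the paper's immigration/emigration terms are omitted for brevity:

```python
def simulate_seir(beta=0.6, sigma=0.2, gamma=0.1, vacc=0.001,
                  days=365, dt=0.1, s0=0.99, e0=0.0, i0=0.01, r0=0.0):
    """Euler integration of a basic SEIR model (population fractions) with
    a vaccination term moving susceptibles straight to the recovered class.
    Parameter values are illustrative, not fitted to Hepatitis B data."""
    s, e, i, r = s0, e0, i0, r0
    traj = [(s, e, i, r)]
    for _ in range(int(days / dt)):
        ds = -beta * s * i - vacc * s      # new exposures plus vaccination
        de = beta * s * i - sigma * e      # incubation outflow
        di = sigma * e - gamma * i         # symptom onset minus recovery
        dr = gamma * i + vacc * s
        s, e, i, r = s + ds * dt, e + de * dt, i + di * dt, r + dr * dt
        traj.append((s, e, i, r))
    return traj
```

With these illustrative numbers the effective reproduction number exceeds one, so the infected fraction rises and then falls, the qualitative rise-then-decline behavior the abstract reports; fractions sum to one throughout because the right-hand sides cancel.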

  18. A new methodology to test galaxy formation models using the dependence of clustering on stellar mass (United States)

    Campbell, David J. R.; Baugh, Carlton M.; Mitchell, Peter D.; Helly, John C.; Gonzalez-Perez, Violeta; Lacey, Cedric G.; Lagos, Claudia del P.; Simha, Vimal; Farrow, Daniel J.


    We present predictions for the two-point correlation function of galaxy clustering as a function of stellar mass, computed using two new versions of the GALFORM semi-analytic galaxy formation model. These models make use of a high resolution, large volume N-body simulation, set in the 7-year Wilkinson Microwave Anisotropy Probe cosmology. One model uses a universal stellar initial mass function (IMF), while the other assumes different IMFs for quiescent star formation and bursts. Particular consideration is given to how the assumptions required to estimate the stellar masses of observed galaxies (such as the choice of IMF, stellar population synthesis model, and dust extinction) influence the perceived dependence of galaxy clustering on stellar mass. Broad-band spectral energy distribution fitting is carried out to estimate stellar masses for the model galaxies in the same manner as in observational studies. We show clear differences between the clustering signals computed using the true and estimated model stellar masses. As such, we highlight the importance of applying our methodology to compare theoretical models to observations. We introduce an alternative scheme for the calculation of the merger time-scales for satellite galaxies in GALFORM, which takes into account the dark matter subhalo information from the simulation. This reduces the amplitude of small-scale clustering. The new merger scheme offers improved or similar agreement with observational clustering measurements, over the redshift range 0 < z < 0.7. We find reasonable agreement with clustering measurements from the Galaxy and Mass Assembly Survey, but find larger discrepancies for some stellar mass ranges and separation scales with respect to measurements from the Sloan Digital Sky Survey and the VIMOS Public Extragalactic Redshift Survey, depending on the GALFORM model used.
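The clustering statistic at the heart of this study, the two-point correlation function, can be estimated by comparing pair counts in the data with those in a uniform random catalogue. A brute-force sketch using the natural (Peebles-Hauser) estimator, which ignores the edge corrections and more robust estimators (e.g. Landy-Szalay) a production analysis would use:

```python
import numpy as np

def two_point_correlation(data, box_size, r_bins, n_random=None, seed=0):
    """Natural estimator of the two-point correlation function,
    xi(r) = DD(r)/RR(r) - 1, from normalized pair counts in separation
    bins, with a uniform random catalogue in the same box.
    Brute-force O(N^2); suitable for small samples only."""
    rng = np.random.default_rng(seed)
    if n_random is None:
        n_random = 3 * len(data)
    rand = rng.uniform(0.0, box_size, size=(n_random, 3))

    def pair_counts(points):
        d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
        d = d[np.triu_indices(len(points), k=1)]       # unique pairs only
        counts, _ = np.histogram(d, bins=r_bins)
        n = len(points)
        return counts / (n * (n - 1) / 2.0)            # normalized counts

    dd = pair_counts(np.asarray(data, dtype=float))
    rr = pair_counts(rand)
    with np.errstate(divide="ignore", invalid="ignore"):
        return np.where(rr > 0, dd / rr - 1.0, np.nan)
```

For an unclustered (uniform) point set the estimator scatters around zero; clustered galaxy samples give positive xi(r) at small separations, which is the signal whose stellar-mass dependence the paper analyses.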

  19. Application of infinite model predictive control methodology to other advanced controllers. (United States)

    Abu-Ayyad, M; Dubay, R; Hernandez, J M


    This paper presents an application of a recently developed predictive control algorithm, infinite model predictive control (IMPC), to other advanced control schemes. The IMPC strategy was derived for systems with different degrees of nonlinearity in the process gain and time constant. Also, it was shown that the IMPC structure uses nonlinear open-loop modeling which is conducted while closed-loop control is executed every sampling instant. The main objective of this work is to demonstrate that the methodology of IMPC can be applied to other advanced control strategies, making the methodology generic. The IMPC strategy was implemented on several advanced controllers such as a PI controller using a Smith predictor, a Dahlin controller, simplified predictive control (SPC), dynamic matrix control (DMC), and shifted dynamic matrix control (m-DMC). Experimental work using these approaches combined with IMPC was conducted on both single-input-single-output (SISO) and multi-input-multi-output (MIMO) systems and compared with the original forms of these advanced controllers. Computer simulations were performed on nonlinear plants, demonstrating that the IMPC strategy can be readily implemented on other advanced control schemes, providing improved control performance. Practical work included real-time control applications on a DC motor, a plastic injection molding machine and a MIMO three-zone thermal system.

  20. A Methodology for the Design of Application-Specific Cyber-Physical Social Sensing Co-Simulators

    Directory of Open Access Journals (Sweden)

    Borja Bordel Sánchez


    Full Text Available Cyber-Physical Social Sensing (CPSS is a new trend in the context of pervasive sensing. In these new systems, various domains coexist in time, evolve together and influence each other. Thus, application-specific tools are necessary for specifying and validating designs and simulating systems. However, nowadays, different tools are employed to simulate each domain independently. The main cause of the lack of co-simulation instruments able to simulate all domains together is the extreme difficulty of combining and synchronizing various tools. In order to reduce that difficulty, an adequate architecture for the final co-simulator must be selected. Therefore, in this paper the authors investigate and propose a methodology for the design of CPSS co-simulation tools. The paper describes the four steps that software architects should follow in order to design the most adequate co-simulator for a certain application, considering the final users’ needs and requirements and various additional factors such as the development team’s experience. Moreover, the first practical use case of the proposed methodology is provided. An experimental validation is also included in order to evaluate the performance of the proposed co-simulator and to determine the correctness of the proposal.

  1. A Methodology for the Design of Application-Specific Cyber-Physical Social Sensing Co-Simulators (United States)

    Sánchez-Picot, Álvaro


    Cyber-Physical Social Sensing (CPSS) is a new trend in the context of pervasive sensing. In these new systems, various domains coexist in time, evolve together and influence each other. Thus, application-specific tools are necessary for specifying and validating designs and simulating systems. However, nowadays, different tools are employed to simulate each domain independently. The main cause of the lack of co-simulation instruments able to simulate all domains together is the extreme difficulty of combining and synchronizing various tools. In order to reduce that difficulty, an adequate architecture for the final co-simulator must be selected. Therefore, in this paper the authors investigate and propose a methodology for the design of CPSS co-simulation tools. The paper describes the four steps that software architects should follow in order to design the most adequate co-simulator for a certain application, considering the final users’ needs and requirements and various additional factors such as the development team’s experience. Moreover, the first practical use case of the proposed methodology is provided. An experimental validation is also included in order to evaluate the performance of the proposed co-simulator and to determine the correctness of the proposal. PMID:28937610

  2. A Methodology for the Design of Application-Specific Cyber-Physical Social Sensing Co-Simulators. (United States)

    Sánchez, Borja Bordel; Alcarria, Ramón; Sánchez-Picot, Álvaro; Sánchez-de-Rivera, Diego


    Cyber-Physical Social Sensing (CPSS) is a new trend in the context of pervasive sensing. In these new systems, various domains coexist in time, evolve together and influence each other. Thus, application-specific tools are necessary for specifying and validating designs and simulating systems. However, nowadays, different tools are employed to simulate each domain independently. The main cause of the lack of co-simulation instruments able to simulate all domains together is the extreme difficulty of combining and synchronizing various tools. In order to reduce that difficulty, an adequate architecture for the final co-simulator must be selected. Therefore, in this paper the authors investigate and propose a methodology for the design of CPSS co-simulation tools. The paper describes the four steps that software architects should follow in order to design the most adequate co-simulator for a certain application, considering the final users' needs and requirements and various additional factors such as the development team's experience. Moreover, the first practical use case of the proposed methodology is provided. An experimental validation is also included in order to evaluate the performance of the proposed co-simulator and to determine the correctness of the proposal.

  3. A methodology for linking 2D overland flow models with the sewer network model SWMM 5.1 based on dynamic link libraries. (United States)

    Leandro, Jorge; Martins, Ricardo


    Pluvial flooding in urban areas is characterized by a gradually varying inundation process caused by surcharge of the sewer manholes. Therefore urban flood models need to simulate the interaction between the sewer network and the overland flow in order to accurately predict flood inundation extents. In this work we present a methodology for linking 2D overland flow models with the storm sewer model SWMM 5. SWMM 5 is a well-known free open-source code originally developed in 1971. The latest major release saw its structure re-written in C++, allowing it to be compiled as a command-line executable or called through a series of calls made to functions inside a dynamic link library (DLL). The methodology developed herein is written inside the same DLL in C++, and is able to simulate the bi-directional interaction between both models during simulation. Validation is done in a real case study with an existing urban flood coupled model. The novelty herein is that the new methodology can be added to SWMM without the need for editing SWMM's original code. Furthermore, it is directly applicable to other coupled overland flow models aiming to use SWMM 5 as the sewer network model.
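The bi-directional exchange described, manhole surcharge feeding the 2D overland model and ponded water draining back when capacity recovers, can be sketched with stub storage models standing in for SWMM and the 2D solver; every variable and coefficient here is illustrative (the paper performs this exchange each timestep through SWMM 5.1's DLL, not through code like this):

```python
def run_coupled_simulation(steps, rainfall, capacity=10.0, drain_rate=0.8,
                           return_coef=0.5):
    """Schematic bi-directional coupling loop between a sewer model and a
    2D overland-flow model, both reduced to single storage volumes.
    `rainfall(t)` gives the runoff volume entering the network at step t.
    Returns (final sewer volume, final surface volume, peak flood volume)."""
    sewer = surface = peak_flood = 0.0
    for t in range(steps):
        sewer += rainfall(t)                     # runoff enters the network
        sewer = max(sewer - drain_rate, 0.0)     # conveyance to the outfall
        if sewer > capacity:                     # surcharge: sewer -> surface
            surface += sewer - capacity
            sewer = capacity
        elif surface > 0.0:                      # drainage: surface -> sewer
            back = min(return_coef * surface, capacity - sewer)
            surface -= back
            sewer += back
        peak_flood = max(peak_flood, surface)
    return sewer, surface, peak_flood
```

A design storm that overwhelms the network floods the surface store, and the ponded volume returns to the sewer once the storm passes, which is the gradually varying inundation behavior the abstract describes.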

  4. Gas Turbine Plant Modeling for Dynamic Simulation


    Endale Turie, Samson


    Gas turbines have become effective in industrial applications for electric and thermal energy production, partly due to their quick response to load variations. A gas turbine power plant is a complex assembly of a variety of components that are designed on the basis of aero-thermodynamic laws. This thesis work presents model development of a single-shaft gas turbine plant cycle that can operate at a wide range of load settings in a complete dynamic GTP simulator. The modeling and simulation has been...

  5. Theory, modeling, and simulation annual report, 1992

    Energy Technology Data Exchange (ETDEWEB)


    This report briefly discusses research on the following topics: development of electronic structure methods; modeling molecular processes in clusters; modeling molecular processes in solution; modeling molecular processes in separations chemistry; modeling interfacial molecular processes; modeling molecular processes in the atmosphere; methods for periodic calculations on solids; chemistry and physics of minerals; graphical user interfaces for computational chemistry codes; visualization and analysis of molecular simulations; integrated computational chemistry environment; and benchmark computations.

  6. Development and Application of a Clinical Microsystem Simulation Methodology for Human Factors-Based Research of Alarm Fatigue. (United States)

    Kobayashi, Leo; Gosbee, John W; Merck, Derek L


    (1) To develop a clinical microsystem simulation methodology for alarm fatigue research with a human factors engineering (HFE) assessment framework and (2) to explore its application to the comparative examination of different approaches to patient monitoring and provider notification. Problems with the design, implementation, and real-world use of patient monitoring systems result in alarm fatigue. A multidisciplinary team is developing an open-source tool kit to promote bedside informatics research and mitigate alarm fatigue. Simulation, HFE, and computer science experts created a novel simulation methodology to study alarm fatigue. Featuring multiple interconnected simulated patient scenarios with scripted timeline, "distractor" patient care tasks, and triggered true and false alarms, the methodology incorporated objective metrics to assess provider and system performance. Developed materials were implemented during institutional review board-approved study sessions that assessed and compared an experimental multiparametric alerting system with a standard monitor telemetry system for subject response, use characteristics, and end-user feedback. A four-patient simulation setup featuring objective metrics for participant task-related performance and response to alarms was developed along with accompanying structured HFE assessment (questionnaire and interview) for monitor systems use testing. Two pilot and four study sessions with individual nurse subjects elicited true alarm and false alarm responses (including diversion from assigned tasks) as well as nonresponses to true alarms. In-simulation observation and subject questionnaires were used to test the experimental system's approach to suppressing false alarms and alerting providers. A novel investigative methodology applied simulation and HFE techniques to replicate and study alarm fatigue in controlled settings for systems assessment and experimental research purposes.

  7. Modeling of magnetic particle suspensions for simulations

    CERN Document Server

    Satoh, Akira


    The main objective of the book is to highlight the modeling of magnetic particles with different shapes and magnetic properties, to provide graduate students and young researchers information on the theoretical aspects and actual techniques for the treatment of magnetic particles in particle-based simulations. In simulation, we focus on the Monte Carlo, molecular dynamics, Brownian dynamics, lattice Boltzmann and stochastic rotation dynamics (multi-particle collision dynamics) methods. The latter two simulation methods can simulate both the particle motion and the ambient flow field simultaneously. In general, specialized knowledge can only be obtained in an effective manner under the supervision of an expert. The present book is written to play such a role for readers who wish to develop the skill of modeling magnetic particles and develop a computer simulation program using their own ability. This book is therefore a self-learning book for graduate students and young researchers. Armed with this knowledge,...

  8. Modeling and simulation of discrete event systems

    CERN Document Server

    Choi, Byoung Kyu


    Computer modeling and simulation (M&S) allows engineers to study and analyze complex systems. Discrete-event system (DES) M&S is used in modern management, industrial engineering, computer science, and the military. As computer speeds and memory capacity increase, DES-M&S tools become more powerful and more widely used in solving real-life problems. Based on over 20 years of evolution within a classroom environment, as well as on decades-long experience in developing simulation-based solutions for high-tech industries, Modeling and Simulation of Discrete-Event Systems is the only book on...

  9. Modelling and Simulation of Wave Loads

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Thoft-Christensen, Palle

    A simple model of the wave load on slender members of offshore structures is described. The wave elevation of the sea state is modelled by a stationary Gaussian process. A new procedure to simulate realizations of the wave loads is developed. The simulation method assumes that the wave particle velocity can be approximated by a Gaussian Markov process. Known approximate results for the first-passage density or, equivalently, the distribution of the extremes of wave loads are presented and compared with rather precise simulation results. It is demonstrated that the approximate results...
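A standard way to generate realizations of a stationary Gaussian sea-surface elevation is spectral superposition of harmonics with independent random phases. The sketch below uses a Pierson-Moskowitz-type spectrum with illustrative parameter values; it is a generic spectral-method illustration, not the paper's Gaussian Markov procedure:

```python
import numpy as np

def pm_spectrum(omega, hs=2.0, tp=8.0):
    """Pierson-Moskowitz-type one-sided wave spectrum (illustrative hs, tp)."""
    wp = 2.0 * np.pi / tp
    return (5.0 / 16.0) * hs**2 * wp**4 / omega**5 * np.exp(-1.25 * (wp / omega) ** 4)

def simulate_elevation(t, omega, s, rng):
    """One realization of a zero-mean stationary Gaussian process eta(t)
    by superposing spectral components with random phases."""
    d_omega = omega[1] - omega[0]
    amp = np.sqrt(2.0 * s * d_omega)                  # component amplitudes
    phase = rng.uniform(0.0, 2.0 * np.pi, omega.size)  # independent phases
    return (amp[:, None] * np.cos(omega[:, None] * t[None, :] + phase[:, None])).sum(axis=0)
```

The variance of the simulated elevation should approach the integral of the spectrum, which for this spectral form equals hs^2/16.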

  10. Fourier based methodology for simulating 2D-random shapes in heterogeneous materials (United States)

    Mattrand, C.; Béakou, A.; Charlet, K.


    Gaining insights into the effects of microstructural details on materials behavior may be achieved by incorporating their attributes into numerical modeling. This requires considerable effort to characterize the morphology distributions of heterogeneities and their spatial arrangement. This paper focuses on modeling the scatter observed in the geometry of material heterogeneities. The proposed strategy is based on expanding a 1D shape-signature function, representing the 2D section of a given shape, on Fourier basis functions. The Fourier coefficients are then treated as random variables. This methodology has been applied to flax fibers, which are gradually being introduced into composite materials as a potential alternative to synthetic reinforcements. In this contribution, the influence of some underlying assumptions regarding the choice of the 1D shape-signature function, its discretization scheme and truncation level, and the best way of modeling the associated random variables is also investigated. Some configurations arising from the combination of these tuning parameters are found to render the morphometric factors of the observed fibers with satisfactory statistical fidelity.
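The core idea — a 1D radius signature expanded on a Fourier basis with random coefficients — can be sketched as follows. The harmonic count, variance decay, and Gaussian coefficient model below are illustrative assumptions; in the paper the coefficients are fitted to measured fiber cross-sections:

```python
import numpy as np

def random_shape(n_harmonics=8, n_points=200, decay=1.5, scale=0.08, rng=None):
    """Random star-shaped 2D contour from a Fourier radius signature
    r(theta) = 1 + sum_k [a_k cos(k theta) + b_k sin(k theta)],
    with Gaussian coefficients whose variance decays with harmonic order k."""
    rng = rng or np.random.default_rng()
    theta = np.linspace(0.0, 2.0 * np.pi, n_points, endpoint=False)
    r = np.ones_like(theta)
    for k in range(1, n_harmonics + 1):
        sigma = scale / k**decay                 # higher harmonics contribute less
        a, b = rng.normal(0.0, sigma, 2)         # random Fourier coefficients
        r += a * np.cos(k * theta) + b * np.sin(k * theta)
    return r * np.cos(theta), r * np.sin(theta)  # Cartesian contour points
```

Each call produces a different closed contour; tuning `scale` and `decay` controls how irregular the simulated sections are.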

  11. A methodology for the design and testing of atmospheric boundary layer models for wind energy applications

    Directory of Open Access Journals (Sweden)

    J. Sanz Rodrigo


    The GEWEX Atmospheric Boundary Layer Studies (GABLS) 1, 2 and 3 are used to develop a methodology for the design and testing of Reynolds-averaged Navier–Stokes (RANS) atmospheric boundary layer (ABL) models for wind energy applications. The first two GABLS cases are based on idealized boundary conditions and are suitable for verification purposes by comparison with results from higher-fidelity models based on large-eddy simulation. Results from three single-column RANS models, of 1st, 1.5th and 2nd turbulence closure order, show high consistency in predicting the mean flow. The third GABLS case is suitable for studying these ABL models under realistic forcing, such that validation against observations from the Cabauw meteorological tower is possible. The case consists of a diurnal cycle that leads to a nocturnal low-level jet and addresses fundamental questions related to the definition of the large-scale forcing, the interaction of the ABL with the surface and the evaluation of model results with observations. The simulations are evaluated in terms of surface-layer fluxes and wind energy quantities of interest: rotor equivalent wind speed, hub-height wind direction, wind speed shear and wind direction veer. The characterization of mesoscale forcing is based on spatially and temporally averaged momentum budget terms from Weather Research and Forecasting (WRF) simulations. These mesoscale tendencies are used to drive the single-column models, which were verified previously in the first two GABLS cases, to first demonstrate that they can produce wind profile characteristics similar to the WRF simulations even though the physics are more simplified. The added value of incorporating different forcing mechanisms into microscale models is quantified by systematically removing forcing terms in the momentum and heat equations. This mesoscale-to-microscale modeling approach is affected, to a large extent, by the input uncertainties of the mesoscale...

  12. Validation of 3-D Ice Accretion Measurement Methodology for Experimental Aerodynamic Simulation (United States)

    Broeren, Andy P.; Addy, Harold E., Jr.; Lee, Sam; Monastero, Marianne C.


    Determining the adverse aerodynamic effects due to ice accretion often relies on dry-air wind-tunnel testing of artificial, or simulated, ice shapes. Recent developments in ice-accretion documentation methods have yielded a laser-scanning capability that can measure highly three-dimensional (3-D) features of ice accreted in icing wind tunnels. The objective of this paper was to evaluate the aerodynamic accuracy of ice-accretion simulations generated from laser-scan data. Ice-accretion tests were conducted in the NASA Icing Research Tunnel using an 18-in. chord, two-dimensional (2-D) straight wing with NACA 23012 airfoil section. For six ice-accretion cases, a 3-D laser scan was performed to document the ice geometry prior to the molding process. Aerodynamic performance testing was conducted at the University of Illinois low-speed wind tunnel at a Reynolds number of 1.8 × 10^6 and a Mach number of 0.18 with an 18-in. chord NACA 23012 airfoil model that was designed to accommodate the artificial ice shapes. The ice-accretion molds were used to fabricate one set of artificial ice shapes from polyurethane castings. The laser-scan data were used to fabricate another set of artificial ice shapes using rapid-prototype manufacturing such as stereolithography. The iced-airfoil results with both sets of artificial ice shapes were compared to evaluate the aerodynamic simulation accuracy of the laser-scan data. For five of the six ice-accretion cases, there was excellent agreement in the iced-airfoil aerodynamic performance between the casting and laser-scan based simulations. For example, typical differences in iced-airfoil maximum lift coefficient were less than 3 percent, with corresponding differences in stall angle of approximately 1 deg or less. The aerodynamic simulation accuracy reported in this paper has demonstrated the combined accuracy of the laser-scan and rapid-prototype manufacturing approach to simulating ice accretion for a NACA 23012 airfoil. For several...

  13. Avoiding and identifying errors in health technology assessment models: qualitative study and methodological review. (United States)

    Chilcott, J; Tappenden, P; Rawdin, A; Johnson, M; Kaltenthaler, E; Paisley, S; Papaioannou, D; Shippam, A


    Health policy decisions must be relevant, evidence-based and transparent. Decision-analytic modelling supports this process but its role is reliant on its credibility. Errors in mathematical decision models or simulation exercises are unavoidable but little attention has been paid to processes in model development. Numerous error avoidance/identification strategies could be adopted but it is difficult to evaluate the merits of strategies for improving the credibility of models without first developing an understanding of error types and causes. The study aims to describe the current comprehension of errors in the HTA modelling community and generate a taxonomy of model errors. Four primary objectives are to: (1) describe the current understanding of errors in HTA modelling; (2) understand current processes applied by the technology assessment community for avoiding errors in development, debugging and critically appraising models for errors; (3) use HTA modellers' perceptions of model errors with the wider non-HTA literature to develop a taxonomy of model errors; and (4) explore potential methods and procedures to reduce the occurrence of errors in models. It also describes the model development process as perceived by practitioners working within the HTA community. A methodological review was undertaken using an iterative search methodology. Exploratory searches informed the scope of interviews; later searches focused on issues arising from the interviews. Searches were undertaken in February 2008 and January 2009. In-depth qualitative interviews were performed with 12 HTA modellers from academic and commercial modelling sectors. All qualitative data were analysed using the Framework approach. Descriptive and explanatory accounts were used to interrogate the data within and across themes and subthemes: organisation, roles and communication; the model development process; definition of error; types of model error; strategies for avoiding errors; strategies for

  14. Anticipation Models for On-Line Control in Steel Industry: Methodologies and Case Study (United States)

    Briano, Enrico; Caballini, Claudia; Revetria, Roberto; Testa, Alessandro; De Leo, Marco; Belgrano, Franco; Bertolotto, Alessandro


    This paper describes a simulation system designed to improve steelmaking efficiency and to monitor performance by anticipating the next period's workload. Production planning in such plants is usually done with Gantt diagrams, based on the operator's work. This means that if an accident occurs, the operator must re-plan production within a few minutes, with lower performance than the original plan. The first consideration is that the operator's experience alone is not sufficient to re-plan a performing steelmaking chain; hence the need for simulation as a problem-solving technique in this complex situation. A brief introduction identifies the production planning problems common to most plants, which is needed to define the boundary conditions and the framework of the problem. A description of steelmaking processes and the general features of the critical aspects of steelmaking planning (Paragraph 2) is then given in order to understand the constraints, features and criticalities to be analyzed and implemented in the simulation model. Paragraph 3 presents a detailed analysis of the proposed methodology and system architecture, conveying the complexity the authors had to face in modeling the system and the solutions they found, with the approximations, considerations, techniques and algorithms most suitable to this particular situation. A short description of the modeled steelmaking plant and the Verification and Validation (V&V) results are given in Paragraph 4. In such a complex system it is very important to establish the acceptability of results in terms of verification of correctness, validation of the results, and accreditation to the users. This is a generally valid principle in simulation, but even more so when modeling a complex system such as a steelmaking process, where an error can cost millions. Finally, in Paragraph 5...

  15. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    Likewise, ships and buildings are built by naval and civil architects. While these are useful, they are, in most cases, static models. ... The basic theory of transition from one state to another was developed by the Russian mathematician Andrei Markov, hence the name Markov chains. Andrei Markov [1856-1922] ...
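The state-transition idea behind Markov chains mentioned in the excerpt can be illustrated with a minimal two-state simulation; the states and transition probabilities below are invented purely for illustration:

```python
import random

# Hypothetical two-state weather chain: P[state][next_state] sums to 1 per row.
P = {"sunny": {"sunny": 0.8, "rainy": 0.2},
     "rainy": {"sunny": 0.5, "rainy": 0.5}}

def step(state, rng):
    """Draw the next state from the current state's transition row."""
    u, acc = rng.random(), 0.0
    for nxt, p in P[state].items():
        acc += p
        if u < acc:
            return nxt
    return nxt  # guard against floating-point rounding of the row sum

def simulate(n, start="sunny", seed=0):
    """Simulate n transitions of the chain from a given start state."""
    rng = random.Random(seed)
    s, out = start, []
    for _ in range(n):
        s = step(s, rng)
        out.append(s)
    return out
```

For this transition matrix the stationary probability of "sunny" is 0.2/(0.2+0.5) inverted across states, i.e. 5/7 ≈ 0.714, which a long simulated trajectory should approach.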

  16. Network Modeling and Simulation (NEMSE) (United States)


    Excerpt fragments: "Prioritized Packet Fragmentation", IEEE Trans. Multimedia, Oct. 2012; Defense Acquisition Guidebook, Chapter 4 Systems Engineering [13 SYSENG]; 2012 IEEE High Performance Extreme Computing Conference (HPEC) poster session [1 Ross]. Motivation: Air Force Research Lab needs ... virtual ... The eight virtualizations were: System-in-the-Loop (SITL) using OPNET Modeler, COPE, Field Programmable Gate Array (FPGA) physical...

  17. Computer Based Modelling and Simulation-Modelling and ...

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 6; Issue 4. Computer Based Modelling and Simulation-Modelling and Simulation with Probability and Throwing Dice. N K Srinivasan. General Article Volume 6 Issue 4 April 2001 pp 69-77 ...

  18. Land surface modeling in convection permitting simulations (United States)

    van Heerwaarden, Chiel; Benedict, Imme


    The next generation of weather and climate models permits convection, albeit at a grid spacing that is not sufficient to resolve all details of the clouds. Whereas much attention is being devoted to the correct simulation of convective clouds and associated precipitation, the role of the land surface has received far less interest. In our view, convection-permitting simulations pose a set of problems that need to be solved before accurate weather and climate prediction is possible. The heart of the problem lies in direct runoff and in the nonlinearity of the surface stress as a function of soil moisture. In coarse-resolution simulations, where convection is not permitted, precipitation that reaches the land surface is uniformly distributed over the grid cell. Subsequently, a fraction of this precipitation is intercepted by vegetation or leaves the grid cell via direct runoff, whereas the remainder infiltrates into the soil. As soon as we move to convection-permitting simulations, this precipitation often falls locally in large amounts. If the same land-surface model is used as in simulations with parameterized convection, this leads to an increase in direct runoff. Furthermore, spatially non-uniform infiltration leads to a very different surface stress when scaled up to the coarse resolution of simulations without convection. Based on large-eddy simulation of realistic convection events over a large domain, this study presents a quantification of the errors made at the land surface in convection-permitting simulation. It compares the magnitude of these errors to those made in the convection itself due to the coarse resolution of the simulation. We find that convection-permitting simulations have less evaporation than simulations with parameterized convection, resulting in an unrealistic drying of the atmosphere. We present solutions to resolve this problem.

  19. Methodology and application of high performance electrostatic field simulation in the KATRIN experiment (United States)

    Corona, Thomas

    The Karlsruhe Tritium Neutrino (KATRIN) experiment is a tritium beta decay experiment designed to make a direct, model independent measurement of the electron neutrino mass. The experimental apparatus employs strong (O(T)) magnetostatic and (O(10^5 V/m)) electrostatic fields in regions of ultra-high (O(10^-11 mbar)) vacuum in order to obtain precise measurements of the electron energy spectrum near the endpoint of tritium beta-decay. The electrostatic fields in KATRIN are formed by multiscale electrode geometries, necessitating the development of high performance field simulation software. To this end, we present a Boundary Element Method (BEM) with analytic boundary integral terms in conjunction with the Robin Hood linear algebraic solver, a nonstationary successive subspace correction (SSC) method. We describe an implementation of these techniques for high performance computing environments in the software KEMField, along with the geometry modeling and discretization software KGeoBag. We detail the application of KEMField and KGeoBag to KATRIN's spectrometer and detector sections, and demonstrate its use in furthering several of KATRIN's scientific goals. Finally, we present the results of a measurement designed to probe the electrostatic profile of KATRIN's main spectrometer in comparison to simulated results.
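The flavor of a nonstationary iteration like the Robin Hood solver can be caricatured as max-residual coordinate relaxation: at each step, correct the unknown belonging to the equation with the largest residual. This simplified sketch is only loosely inspired by the method and differs from the KEMField implementation:

```python
import numpy as np

def max_residual_relaxation(A, b, tol=1e-10, max_steps=100000):
    """Sketch of a nonstationary single-coordinate iteration: repeatedly
    zero the largest residual component ("take from the richest").
    Converges for symmetric positive definite systems (Gauss-Southwell-style)."""
    x = np.zeros_like(b, dtype=float)
    for _ in range(max_steps):
        r = b - A @ x                      # current residual
        i = int(np.argmax(np.abs(r)))      # equation with largest discrepancy
        if abs(r[i]) < tol:
            break
        x[i] += r[i] / A[i, i]             # solve equation i for x[i]
    return x
```

Unlike stationary methods such as Jacobi, the update order here adapts to the residual at every step, which is the sense in which such solvers are "nonstationary".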

  20. Model Driven Development of Simulation Models : Defining and Transforming Conceptual Models into Simulation Models by Using Metamodels and Model Transformations

    NARCIS (Netherlands)

    Küçükkeçeci Çetinkaya, D.


    Modeling and simulation (M&S) is an effective method for analyzing and designing systems and it is of interest to scientists and engineers from all disciplines. This thesis proposes the application of a model driven software development approach throughout the whole set of M&S activities and it

  1. Simulation and modeling of turbulent flows

    CERN Document Server

    Gatski, Thomas B; Lumley, John L


    This book provides students and researchers in fluid engineering with an up-to-date overview of turbulent flow research in the areas of simulation and modeling. A key element of the book is the systematic, rational development of turbulence closure models and related aspects of modern turbulent flow theory and prediction. Starting with a review of the spectral dynamics of homogeneous and inhomogeneous turbulent flows, succeeding chapters deal with numerical simulation techniques, renormalization group methods and turbulent closure modeling. Each chapter is authored by recognized leaders in their respective fields, and each provides a thorough and cohesive treatment of the subject.

  2. A model for personality and emotion simulation

    NARCIS (Netherlands)

    Egges, A.; Kshirsagar, S.; Magnenat-Thalmann, N.


    This paper describes a generic model for personality, mood and emotion simulation for conversational virtual humans. We present a generic model for describing and updating the parameters related to emotional behaviour. Also, this paper explores how existing theories for appraisal can be integrated

  3. The behaviour of adaptive bone-remodeling simulation models

    NARCIS (Netherlands)

    Weinans, H.; Huiskes, R.; Grootenboer, H.J.


    The process of adaptive bone remodeling can be described mathematically and simulated in a computer model, integrated with the finite element method. In the model discussed here, cortical and trabecular bone are described as continuous materials with variable density. The remodeling rule applied to

  4. Validity of microgravity simulation models on earth

    DEFF Research Database (Denmark)

    Regnard, J; Heer, M; Drummer, C


    Many studies have used water immersion and head-down bed rest as experimental models to simulate responses to microgravity. However, some data collected during space missions are at variance or in contrast with observations collected from experimental models. These discrepancies could reflect inc...

  5. Molecular simulation and modeling of complex I. (United States)

    Hummer, Gerhard; Wikström, Mårten


    Molecular modeling and molecular dynamics simulations play an important role in the functional characterization of complex I. With its large size and complicated function, linking quinone reduction to proton pumping across a membrane, complex I poses unique modeling challenges. Nonetheless, simulations have already helped in the identification of possible proton transfer pathways. Simulations have also shed light on the coupling between electron and proton transfer, thus pointing the way in the search for the mechanistic principles underlying the proton pump. In addition to reviewing what has already been achieved in complex I modeling, we aim here to identify pressing issues and to provide guidance for future research to harness the power of modeling in the functional characterization of complex I. This article is part of a Special Issue entitled Respiratory complex I, edited by Volker Zickermann and Ulrich Brandt. Copyright © 2016 Elsevier B.V. All rights reserved.

  6. Landscape Modelling and Simulation Using Spatial Data

    Directory of Open Access Journals (Sweden)

    Amjed Naser Mohsin AL-Hameedawi


    In this paper a procedure is presented for generating a spatial landscape model suited to realistic simulation. The procedure is based on combining spatial data and field measurements with computer graphics produced using Blender software, after which a 3D simulation can be built with VIS ALL packages. The objective was to build a model using GIS, including inputs to the feature attribute data. These efforts concentrated on assembling a suitable spatial prototype, specifying a facilitation scheme and outlining the intended framework; the eventual result was used in simulation form. The procedure covers not only data gathering, fieldwork and model preparation, but extends to a new method for producing the corresponding 3D simulation mapping, which gives decision makers as well as investors an independent navigation system for geoscience applications.

  7. Three-dimensional finite element stress analysis: the technique and methodology of non-linear property simulation and soft tissue loading behavior for different partial denture designs. (United States)

    Kanbara, Ryo; Nakamura, Yoshinori; Ochiai, Kent T; Kawai, Tatsushi; Tanaka, Yoshinobu


    The purpose of this study was to develop and report a methodology for non-linear 3D finite element analysis evaluating the loading behavior of different partial denture designs. A 3D finite element model using human CT data was constructed. An original material-constant conversion program was implemented to simulate non-linear tissue behavior. The finite element material model of the residual ridge mucosa was found to require seven material constants and six conversion points of stress values; the periodontal tissues required three constants and two conversion points. Three magnetic-attachment partial denture designs with different bracing elements were evaluated. Technical procedures for finite element simulation of non-linear tissue behavior for evaluating the oral behavior of prosthetic device designs are reported for prosthodontic testing. The use of horizontal cross-arch bracing positively impacts the comparative stability of the partial denture designs tested.

  8. Development of NASA's Models and Simulations Standard (United States)

    Bertch, William J.; Zang, Thomas A.; Steele, Martin J.


    From the Space Shuttle Columbia Accident Investigation, there were several NASA-wide actions that were initiated. One of these actions was to develop a standard for development, documentation, and operation of Models and Simulations. Over the course of two-and-a-half years, a team of NASA engineers, representing nine of the ten NASA Centers developed a Models and Simulation Standard to address this action. The standard consists of two parts. The first is the traditional requirements section addressing programmatics, development, documentation, verification, validation, and the reporting of results from both the M&S analysis and the examination of compliance with this standard. The second part is a scale for evaluating the credibility of model and simulation results using levels of merit associated with 8 key factors. This paper provides an historical account of the challenges faced by and the processes used in this committee-based development effort. This account provides insights into how other agencies might approach similar developments. Furthermore, we discuss some specific applications of models and simulations used to assess the impact of this standard on future model and simulation activities.

  9. Fully Adaptive Radar Modeling and Simulation Development (United States)


    We have developed a MATLAB-based modeling and simulation (M&S) architecture for distributed fully adaptive radar (FAR) that will enable algorithm development and testing on simulated, previously collected, and real-time streaming data. The architecture is coded in MATLAB using an object-oriented programming approach. The architecture includes a FAR engine to control the operation of the perception-action cycle and software objects that determine the...

  10. Aqueous Electrolytes: Model Parameters and Process Simulation

    DEFF Research Database (Denmark)

    Thomsen, Kaj

    This thesis deals with aqueous electrolyte mixtures. The Extended UNIQUAC model is being used to describe the excess Gibbs energy of such solutions. Extended UNIQUAC parameters for the twelve ions Na+, K+, NH4+, H+, Cl-, NO3-, SO42-, HSO4-, OH-, CO32-, HCO3-, and S2O82- are estimated. A computer program including a steady state process simulator for the design, simulation, and optimization of fractional crystallization processes is presented.

  11. PBPK modeling and simulation in drug research and development. (United States)

    Zhuang, Xiaomei; Lu, Chuang


    Physiologically based pharmacokinetic (PBPK) modeling and simulation can be used to predict the pharmacokinetic behavior of drugs in humans using preclinical data. It can also explore the effects of various physiological parameters such as age, ethnicity, or disease status on human pharmacokinetics, as well as guide dose and dosing regimen selection and aid drug-drug interaction risk assessment. PBPK modeling has developed rapidly in the last decade in both academia and the pharmaceutical industry, and has become an integral tool in drug discovery and development. In this mini-review, the concept and methodology of PBPK modeling are briefly introduced. Several case studies are discussed on how PBPK modeling and simulation can be utilized through various stages of drug discovery and development. These case studies, drawn from our own work and the literature, illustrate the use of PBPK modeling to better understand the absorption, distribution, metabolism and excretion (ADME) of a drug candidate, to increase efficiency, to reduce the need for animal studies, and perhaps to replace clinical trials. The regulatory acceptance of, and industrial practices around, PBPK modeling and simulation are also discussed.
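The compartmental mass balance at the heart of PBPK models can be sketched with a minimal flow-limited example: blood plus two tissues, with hepatic clearance. The compartments, volumes, flows, partition coefficients, and clearance value below are illustrative assumptions, not parameters of any real compound:

```python
# Minimal flow-limited PBPK sketch: blood + liver + muscle, IV bolus dose.
# All parameter values are illustrative assumptions.

def simulate_pbpk(dose_mg=100.0, t_end_h=24.0, dt_h=0.001):
    v = {"blood": 5.0, "liver": 1.8, "muscle": 29.0}   # volumes (L)
    q = {"liver": 90.0, "muscle": 45.0}                # blood flows (L/h)
    kp = {"liver": 2.0, "muscle": 0.8}                 # tissue:blood partition
    cl_hepatic = 20.0                                  # metabolic clearance (L/h)
    a = {"blood": dose_mg, "liver": 0.0, "muscle": 0.0}  # amounts (mg)
    eliminated, t = 0.0, 0.0
    while t < t_end_h:
        cb = a["blood"] / v["blood"]
        flux = {}
        for tis in ("liver", "muscle"):
            ct = a[tis] / v[tis]
            flux[tis] = q[tis] * (cb - ct / kp[tis])   # flow-limited exchange
        metab = cl_hepatic * (a["liver"] / v["liver"]) / kp["liver"]
        a["blood"] -= dt_h * (flux["liver"] + flux["muscle"])  # explicit Euler
        a["liver"] += dt_h * (flux["liver"] - metab)
        a["muscle"] += dt_h * flux["muscle"]
        eliminated += dt_h * metab
        t += dt_h
    return a, eliminated
```

Because every transfer term appears with opposite signs in exactly two balances, the total drug amount (remaining plus eliminated) is conserved, which is a useful sanity check on any compartmental implementation.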

  12. Benchmark simulation models, quo vadis?

    DEFF Research Database (Denmark)

    Jeppsson, U.; Alex, J; Batstone, D. J.


    As the work of the IWA Task Group on Benchmarking of Control Strategies for wastewater treatment plants (WWTPs) is coming to an end, it is essential to disseminate the knowledge gained. For this reason, all authors of the IWA Scientific and Technical Report on benchmarking have come together to provide their insights, highlighting areas where knowledge may still be deficient and where new opportunities are emerging, and to propose potential avenues for future development and application of the general benchmarking framework and its associated tools. The paper focuses on the topics of temporal and spatial extension, process modifications within the WWTP, the realism of models, control strategy extensions and the potential for new evaluation tools within the existing benchmark system. We find that there are major opportunities for application within all of these areas, either from existing work...

  13. URC Fuzzy Modeling and Simulation of Gene Regulation

    Energy Technology Data Exchange (ETDEWEB)

    Sokhansanj, B A; Fitch, J P


    Recent technological advances in high-throughput data collection give biologists the ability to study increasingly complex systems. A new methodology is needed to develop and test biological models based on experimental observations and to predict the effect of perturbations of the network (e.g. genetic engineering, pharmaceuticals, gene therapy). Diverse modeling approaches have been proposed, in two general categories: modeling a biological pathway as (a) a logical circuit or (b) a chemical reaction network. Boolean logic models cannot represent necessary biological details, while chemical kinetics simulations require large numbers of parameters that are very difficult to measure accurately. Based on the way biologists have traditionally thought about systems, we propose that fuzzy logic is a natural language for modeling biology. The Union Rule Configuration (URC) avoids combinatorial explosion in the fuzzy rule base, allowing complex system models. We demonstrate the fuzzy modeling method on the commonly studied lac operon of E. coli. Our goal is to develop a modeling and simulation approach that can be understood and applied by biologists without the need for experts in other fields or "black-box" software.
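    The flavor of the approach can be shown with a toy fuzzy model of lac operon regulation. The membership functions and rules below are hypothetical, not the authors' actual URC rule base; the point is that each input contributes its own small rule set and the outputs are combined additively, so the number of rules grows linearly with the inputs rather than combinatorially, which is the spirit of the URC.

    ```python
    # Toy fuzzy-logic sketch of lac operon regulation (hypothetical rules).

    def tri(x, a, b, c):
        """Triangular membership function on [a, c] peaking at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def low(x):  return tri(x, -0.5, 0.0, 0.5)
    def high(x): return tri(x, 0.5, 1.0, 1.5)

    def lac_expression(lactose, glucose):
        # Per-input rules; output singletons: 0 = repressed, 1 = induced.
        votes = [
            (high(lactose), 1.0),   # lactose HIGH -> expression HIGH
            (low(lactose), 0.0),    # lactose LOW  -> expression LOW
            (high(glucose), 0.0),   # glucose HIGH -> catabolite repression
            (low(glucose), 1.0),    # glucose LOW  -> permissive
        ]
        total = sum(w for w, _ in votes)
        # Weighted-average defuzzification of the additively combined rules
        return sum(w * out for w, out in votes) / total if total else 0.0

    # Induced only when lactose is present and glucose absent:
    print(lac_expression(1.0, 0.0))  # strongly induced
    print(lac_expression(0.0, 1.0))  # repressed
    ```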

  14. Role of modeling and simulation in pediatric investigation plans. (United States)

    Manolis, Efthymios; Osman, Tariq Eldirdiry; Herold, Ralf; Koenig, Franz; Tomasi, Paolo; Vamvakas, Spiros; Saint Raymond, Agnes


    Ethical and practical constraints encourage the optimal use of resources in pediatric drug development. Modeling and simulation has emerged as a promising methodology acknowledged by industry, academia, and regulators. We previously proposed a paradigm in pediatric drug development whereby modeling and simulation is used as a decision tool, for study optimization, and/or as a data analysis tool. In the three and a half years since the Paediatric Regulation came into force in 2007, the European Medicines Agency has gained substantial experience in the use of modeling and simulation in pediatric drug development. In this review, we present examples of how the proposed paradigm applies in real case scenarios of planned pharmaceutical developments. We also report the results of a pediatric database search to further 'validate' the paradigm. There were 47 of 210 positive pediatric investigation plan (PIP) opinions that made reference to modeling and simulation (data include all positive opinions issued up to January 2010). This reflects a major shift in regulatory thinking. The ratio of PIPs with modeling and simulation rose to two in five based on the summary reports. Population pharmacokinetic (POP-PK) and pharmacodynamic (POP-PD) and physiologically based pharmacokinetic models are widely used by industry and endorsed or even imposed by regulators as a way to circumvent some difficulties in developing medicinal products in children. Knowledge of the effects of age and size on PK is improving, and models are widely employed to make optimal use of this knowledge, but less is known about the effects of size and maturation on PD, disease progression, and safety. Extrapolation of efficacy between age groups is often used in pediatric medicinal development as another means to alleviate the burden of clinical trials in children, and this can be aided by modeling and simulation to supplement clinical data.
The regulatory assessment, however, is ultimately judged on clinical grounds.

  15. Modelling and Simulation of Crude Oil Dispersion

    Directory of Open Access Journals (Sweden)

    Abdulfatai JIMOH


    Full Text Available This research work was carried out to develop a model equation for the dispersion of crude oil in water. Seven different crude oils (Bonny Light, Antan Terminal, Bonny Medium, Qua Iboe Light, Brass Light Mbede, Forcados Blend and Heavy H) were used as the subject crude oils. The model equation developed in this work, which is given as..., was derived starting from the equation for the oil dispersion rate in water, which is given as... The developed equation was then simulated with the aid of MathCAD 2000 Professional software. The experimental and model results obtained from the simulation of the model equation were plotted on the same axis against time of dispersion. The model results revealed close fittings between the experimental and the model results, as the correlation coefficients and the r-square values calculated using a spreadsheet program were both found to be unity (1.00).

  16. Evaluation of methodologies for interpolation of data for hydrological modeling in glacierized basins with limited information (United States)

    Muñoz, Randy; Paredes, Javier; Huggel, Christian; Drenkhan, Fabian; García, Javier


    The availability and consistency of data is a determining factor for the reliability of any hydrological model and its simulated results. Unfortunately, there are many regions worldwide where data are not available in the desired quantity and quality. The Santa River basin (SRB), located within a complex topographic and climatic setting in the tropical Andes of Peru, is a clear example of this challenging situation. A monitoring network of in-situ stations in the SRB recorded series of hydro-meteorological variables which finally ceased to operate in 1999. In the following years, several researchers evaluated and completed many of these series. This database was used by multiple research and policy-oriented projects in the SRB. However, hydroclimatic information remains limited, making it difficult to perform research, especially when dealing with the assessment of current and future water resources. In this context, we present the evaluation of different methodologies to interpolate temperature and precipitation data at a monthly time step, as well as ice volume data, in glacierized basins with limited data. The methodologies were evaluated for the Quillcay River, a tributary of the SRB, where hydro-meteorological data have been available from nearby monitoring stations since 1983. The study period was 1983 - 1999, with a validation period from 1993 to 1999. For the temperature series, the aim was to extend the observed data and interpolate them. NCEP reanalysis data were used to extend the observed series: 1) using a simple correlation with multiple field stations, or 2) applying the altitudinal correction proposed in previous studies. The interpolation was then applied as a function of altitude. Both methodologies provide very similar results; for parsimony, simple correlation is a viable choice. For the precipitation series, the aim was to interpolate observed data.
Two methodologies were evaluated: 1) Inverse Distance Weighting, whose results underestimate the amount...
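    Inverse Distance Weighting, the first interpolation method evaluated above, can be sketched in a few lines. The station coordinates and precipitation values below are hypothetical; as the abstract notes, distance-only weighting tends to underestimate precipitation in complex terrain because it ignores orographic gradients.

    ```python
    # Sketch of Inverse Distance Weighting (IDW) for a monthly precipitation
    # value at an ungauged point. Station data are hypothetical.
    import math

    def idw(target, stations, power=2.0):
        """stations: list of ((x, y), value). Returns the IDW estimate at target."""
        num = den = 0.0
        for (x, y), value in stations:
            d = math.hypot(x - target[0], y - target[1])
            if d == 0.0:
                return value  # target coincides with a station
            w = 1.0 / d ** power
            num += w * value
            den += w
        return num / den

    stations = [((0, 0), 100.0), ((10, 0), 60.0), ((0, 10), 80.0)]
    print(round(idw((2, 2), stations), 1))  # dominated by the nearest station
    ```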

  17. A methodology to urban air quality assessment during large time periods of winter using computational fluid dynamic models (United States)

    Parra, M. A.; Santiago, J. L.; Martín, F.; Martilli, A.; Santamaría, J. M.


    The representativeness of point measurements in urban areas is limited due to the strong heterogeneity of the atmospheric flows in cities. To get information on air quality in the gaps between measurement points, and to obtain a 3D field of pollutant concentration, Computational Fluid Dynamic (CFD) models can be used. However, unsteady simulations over time periods of the order of months, often required for regulatory purposes, are not possible for computational reasons. The main objective of this study is to develop a methodology to evaluate the air quality in a real urban area during large time periods by means of steady CFD simulations. One steady simulation was performed for each inlet wind direction, and factors like the number of cars inside each street, the length of the streets and the wind speed and direction were taken into account to compute the pollutant concentration. This approach is only valid in winter, when pollutant concentrations are less affected by atmospheric chemistry. A model based on the steady-state Reynolds-Averaged Navier-Stokes equations (RANS) with the standard k-ɛ turbulence model was used to simulate a set of 16 different inlet wind directions over a real urban area (downtown Pamplona, Spain). The temporal series of NOx and PM10 and the spatial differences in pollutant concentration of NO2 and BTEX obtained were in agreement with experimental data. Inside the urban canopy, an important influence of urban boundary layer dynamics on the pollutant concentration patterns was observed. Large concentration differences were found between different zones of the same square. This showed that the concentration levels measured by an automatic monitoring station depend on its location in the street or square, and that a modelling methodology like this one is useful to complement the experimental information. This methodology can also be applied to evaluate abatement strategies by redistributing traffic emissions.
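    The scaling idea behind the methodology — precompute one steady CFD field per inlet wind sector, then estimate each hour by rescaling that field with the hour's traffic emissions and wind speed (dilution roughly ~1/U) — can be sketched as follows. The normalized field values, reference conditions and 4-sector rose are hypothetical (the paper uses 16 sectors and full 3D fields).

    ```python
    # Sketch: hourly concentration from precomputed steady per-direction fields.
    # Field values and reference conditions are hypothetical.

    STEADY_FIELDS = {  # normalized concentration at one receptor, per sector
        0: 1.0, 90: 0.4, 180: 0.7, 270: 0.2,
    }
    U_REF, E_REF = 5.0, 1.0  # reference wind speed (m/s) and emission rate

    def hourly_concentration(wind_dir, wind_speed, emission):
        # snap to the nearest precomputed sector (circular distance)
        sector = min(STEADY_FIELDS,
                     key=lambda d: min(abs(d - wind_dir), 360 - abs(d - wind_dir)))
        # rescale: linear in emissions, inverse in wind speed (dilution)
        return STEADY_FIELDS[sector] * (emission / E_REF) * (U_REF / wind_speed)

    print(hourly_concentration(85, 2.5, 1.5))  # light wind + rush-hour emissions
    ```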

  18. A new methodology for dynamic modelling of health risks arising from wastewater influenced urban flooding (United States)

    Jørgensen, Claus; Mark, Ole; Djordjevic, Slobodan; Hammond, Michael; Khan, David M.; Erichsen, Anders; Dorrit Enevoldsen, Ann; Heinicke, Gerald; Helwigh, Birgitte


    flood water, based on either measured wastewater pathogen concentrations or on assumptions regarding the prevalence of infections in the population. The exposure (dose) to pathogens was estimated by multiplying the concentration with literature values for the ingestion of water for different exposure groups (e.g. children, adults). The probability of infection was determined by applying dose-response relations and Monte Carlo simulation. The methodology is demonstrated on two cases, i.e. one case from a developing country with poor sanitation and one case from a developed country, where climate adaptation is the main issue: the risk of cholera in the city of Dhaka, Bangladesh during a flood event in 2004, and the risk of bacterial and viral infections during a flood event in Copenhagen, Denmark in 2011. Results: The historical flood events in Dhaka (2004) and Copenhagen (2011) were successfully modelled, and the urban flood model was successfully coupled to QMRA. An example of the results of the quantitative microbial risk assessment, given as the average estimated risk of cholera infection for children below 5 years living in slum areas in Dhaka, is shown in the figure. Similarly, the risk of infection during the flood event in Copenhagen is presented in the article. Conclusions: We have developed a methodology for the dynamic modelling of the risk of infection during wastewater-influenced urban flooding. The outcome of the modelling exercise indicates that direct contact with polluted flood water is a likely route of transmission of cholera in Dhaka, and of bacterial and viral infectious diseases in Copenhagen. It demonstrates the applicability and the potential of linking urban flood models with QMRA in order to identify interventions to reduce the burden of disease on the populations of Dhaka City and Copenhagen.
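    The QMRA chain described above (concentration × ingested volume → dose → probability of infection via a dose-response relation, averaged by Monte Carlo) can be sketched with an approximate beta-Poisson dose-response model. The parameter values (alpha, N50, the lognormal concentration distribution, ingestion volume) are hypothetical, not the study's.

    ```python
    # Monte Carlo QMRA sketch: dose -> infection risk via approximate
    # beta-Poisson dose-response. All parameter values are hypothetical.
    import math, random

    random.seed(42)

    ALPHA, N50 = 0.25, 243.0  # dose-response parameters (illustrative)

    def p_infection(dose):
        """Approximate beta-Poisson: P = 1 - (1 + (d/N50)(2^(1/a) - 1))^(-a)."""
        return 1.0 - (1.0 + (dose / N50) * (2 ** (1 / ALPHA) - 1)) ** (-ALPHA)

    def mc_risk(n=10_000, ingestion_ml=30.0):
        """Average infection risk over n Monte Carlo exposure events."""
        risks = []
        for _ in range(n):
            # pathogen concentration per mL, lognormally distributed
            conc = math.exp(random.gauss(mu=0.0, sigma=1.5))
            risks.append(p_infection(conc * ingestion_ml))
        return sum(risks) / n

    risk = mc_risk()
    ```

    By construction, `p_infection(N50)` is 0.5: N50 is the dose at which half of the exposed population is infected.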

  19. MEMS 3-DoF gyroscope design, modeling and simulation through equivalent circuit lumped parameter model

    Energy Technology Data Exchange (ETDEWEB)

    Mian, Muhammad Umer; Khir, M. H. Md.; Tang, T. B. [Department of Electrical and Electronic Engineering, Universiti Teknologi PETRONAS, Tronoh, Perak (Malaysia); Dennis, John Ojur [Department of Fundamental & Applied Sciences, Universiti Teknologi PETRONAS, Tronoh, Perak (Malaysia); Riaz, Kashif; Iqbal, Abid [Faculty of Electronics Engineering, GIK Institute of Engineering Sciences and Technology, Topi, Khyber Pakhtunkhaw (Pakistan); Bazaz, Shafaat A. [Department of Computer Science, Center for Advance Studies in Engineering, Islamabad (Pakistan)


    Pre-fabrication behavioural and performance analysis with computer aided design (CAD) tools is a common and cost-effective practice. In light of this, we present a simulation methodology for a dual-mass-oscillator-based 3 Degree of Freedom (3-DoF) MEMS gyroscope. The 3-DoF gyroscope is modeled through lumped-parameter models using equivalent circuit elements. These equivalent circuits consist of elementary components which are the counterparts of the respective mechanical components used to design and fabricate the 3-DoF MEMS gyroscope. The complete design of the equivalent circuit model, the mathematical modeling and the simulation are presented in this paper. The behavior of the equivalent lumped models derived for the proposed device design is simulated in MEMSPRO T-SPICE software. Simulations are carried out with design specifications following the design rules of the MetalMUMPS fabrication process. The drive-mass resonant frequencies simulated by this technique are 1.59 kHz and 2.05 kHz respectively, close to the resonant frequencies found by the analytical formulation of the gyroscope. The lumped equivalent circuit modeling technique proved to be a time-efficient approach for the analysis of complex MEMS devices such as 3-DoF gyroscopes, and offers an alternative to the complex and time-consuming coupled-field Finite Element Analysis (FEA) previously used.
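    The electromechanical equivalence behind the lumped model is that a mass-spring-damper (m, k, b) maps onto a series L-C-R circuit, and either representation resonates at f = (1/2π)·√(k/m). The m and k values below are hypothetical, merely chosen to land near the ~1.59 kHz drive resonance the abstract reports.

    ```python
    # Resonant frequency of a lumped mass-spring element (or its L-C circuit
    # equivalent): f = sqrt(k/m) / (2*pi). Parameter values are hypothetical.
    import math

    def resonant_freq_hz(mass_kg, stiffness_n_per_m):
        return math.sqrt(stiffness_n_per_m / mass_kg) / (2.0 * math.pi)

    m = 2.0e-7   # proof mass, kg (hypothetical)
    k = 20.0     # suspension stiffness, N/m (hypothetical)
    print(round(resonant_freq_hz(m, k)))  # in the ~1.6 kHz range
    ```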

  20. MEMS 3-DoF gyroscope design, modeling and simulation through equivalent circuit lumped parameter model (United States)

    Mian, Muhammad Umer; Dennis, John Ojur; Khir, M. H. Md.; Riaz, Kashif; Iqbal, Abid; Bazaz, Shafaat A.; Tang, T. B.


    Pre-fabrication behavioural and performance analysis with computer aided design (CAD) tools is a common and cost-effective practice. In light of this, we present a simulation methodology for a dual-mass-oscillator-based 3 Degree of Freedom (3-DoF) MEMS gyroscope. The 3-DoF gyroscope is modeled through lumped-parameter models using equivalent circuit elements. These equivalent circuits consist of elementary components which are the counterparts of the respective mechanical components used to design and fabricate the 3-DoF MEMS gyroscope. The complete design of the equivalent circuit model, the mathematical modeling and the simulation are presented in this paper. The behavior of the equivalent lumped models derived for the proposed device design is simulated in MEMSPRO T-SPICE software. Simulations are carried out with design specifications following the design rules of the MetalMUMPS fabrication process. The drive-mass resonant frequencies simulated by this technique are 1.59 kHz and 2.05 kHz respectively, close to the resonant frequencies found by the analytical formulation of the gyroscope. The lumped equivalent circuit modeling technique proved to be a time-efficient approach for the analysis of complex MEMS devices such as 3-DoF gyroscopes, and offers an alternative to the complex and time-consuming coupled-field Finite Element Analysis (FEA) previously used.

  1. Towards a common methodology to simulate tree mortality based on ring-width data (United States)

    Cailleret, Maxime; Bigler, Christof; Bugmann, Harald; Davi, Hendrik; Minunno, Francesco; Peltoniemi, Mikko; Martínez-Vilalta, Jordi


    Individual mortality is a key process of population and community dynamics, especially for long-lived species such as trees. As rates of background vegetation mortality and of massive diebacks have accelerated during the last decades and are expected to continue rising with increasing temperature and drought, there is a growing demand for early-warning signals that indicate a very high likelihood of death. Although physiological indicators have a high potential to predict tree mortality, their development requires intensive tree monitoring, which currently cannot be done on a representative sample of a population or across several species. An easier approach is to use radial growth data such as tree-ring width measurements. During the last decades, an increasing number of studies have aimed to derive such growth-mortality functions. However, because these studies followed different approaches concerning the sampling strategy (number of dead and living trees), the type of growth explanatory variables (growth level, growth trend variables, etc.), and the length of the time window (number of rings before death) used to calculate them, it is difficult to compare results among studies and to draw biological interpretations. We detail a new methodology for assessing reliable tree-ring-based growth-mortality relationships using binomial logistic regression models. As examples, we used published tree-ring datasets from Abies alba growing in 13 different sites, and from Nothofagus dombeyi and Quercus petraea, each located in a single site. Our first approach, based on constant samplings, aims to (1) assess the dependency of growth-mortality relationships on the statistical sampling scheme used; (2) determine the best length of the time window used to calculate each growth variable; and (3) reveal the presence of intra-specific shifts in growth-mortality relationships.
We also followed a Bayesian approach to build the best multi-variable logistic model, considering...
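    The shape of a binomial logistic growth-mortality model of the kind described can be sketched directly: the probability of death is a logistic function of a growth-level variable (e.g. mean ring width over the last N years) and a growth-trend variable. The coefficients below are hypothetical; in the study they would be fitted to the tree-ring data (by maximum likelihood or the Bayesian approach mentioned).

    ```python
    # Sketch of a logistic growth-mortality relationship.
    # Coefficients b0, b1, b2 are hypothetical, not fitted values.
    import math

    def p_death(growth_level_mm, growth_trend):
        # logit(p) = b0 + b1*level + b2*trend
        b0, b1, b2 = 1.0, -2.5, -1.5
        z = b0 + b1 * growth_level_mm + b2 * growth_trend
        return 1.0 / (1.0 + math.exp(-z))

    # A tree with low recent growth and a declining trend is at high risk:
    print(round(p_death(0.2, -0.5), 3))
    # A vigorous tree with a positive trend is at low risk:
    print(round(p_death(2.0, 0.3), 3))
    ```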

  2. Response Surface Methodology and Aspen Plus Integration for the Simulation of the Catalytic Steam Reforming of Ethanol

    Directory of Open Access Journals (Sweden)

    Bernay Cifuentes


    Full Text Available The steam reforming of ethanol (SRE) on a bimetallic RhPt/CeO2 catalyst was evaluated by the integration of Response Surface Methodology (RSM) and Aspen Plus (version 9.0, Aspen Tech, Burlington, MA, USA, 2016). First, the effect of the Rh–Pt weight ratio (1:0, 3:1, 1:1, 1:3, and 0:1) on the performance of SRE over RhPt/CeO2 was assessed between 400 and 700 °C with a stoichiometric steam/ethanol molar ratio of 3. RSM enabled modeling of the system and identification of a maximum of 4.2 mol H2/mol EtOH (at 700 °C) with the Rh0.4Pt0.4/CeO2 catalyst. The mathematical models were integrated into Aspen Plus through Excel in order to simulate a process involving SRE, H2 purification, and electricity production in a fuel cell (FC). An energy sensitivity analysis of the process was performed in Aspen Plus, and the information obtained was used to generate new response surfaces. The response surfaces showed that an increase in H2 production requires more energy consumption in the steam reforming of ethanol; however, increased H2 production also yields more energy in the fuel cell, which raises the overall efficiency of the system. The minimum H2 yield needed to make the system energetically self-sustaining was identified as 1.2 mol H2/mol EtOH. According to the results of the integration of the RSM models into Aspen Plus, the system using Rh0.4Pt0.4/CeO2 can produce a maximum net energy of 742 kJ/mol H2, of which 40% could be converted into electricity in the FC (297 kJ/mol H2 produced). The remaining energy can be recovered as heat.
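    The core RSM step — fitting a low-order polynomial response surface to experimental yields and locating its optimum — can be sketched in one variable. The yield data below are hypothetical, merely shaped like the reported trend (yield increasing toward ~700 °C); the temperatures are centred and scaled before solving the normal equations to keep them well conditioned.

    ```python
    # Sketch of an RSM fit: quadratic response surface for H2 yield vs
    # temperature, solved via 3x3 normal equations. Data are hypothetical.

    def fit_quadratic(xs, ys):
        """Least-squares fit of y = a + b*x + c*x**2."""
        s = [sum(x ** p for x in xs) for p in range(5)]
        t = [sum((x ** p) * y for x, y in zip(xs, ys)) for p in range(3)]
        A = [[s[0], s[1], s[2], t[0]],
             [s[1], s[2], s[3], t[1]],
             [s[2], s[3], s[4], t[2]]]
        for i in range(3):                      # Gauss-Jordan elimination
            A[i] = [v / A[i][i] for v in A[i]]
            for j in range(3):
                if j != i:
                    A[j] = [vj - A[j][i] * vi for vi, vj in zip(A[i], A[j])]
        return A[0][3], A[1][3], A[2][3]

    temps = [400, 475, 550, 625, 700]           # °C
    yields = [1.1, 2.0, 2.9, 3.6, 4.2]          # mol H2/mol EtOH (hypothetical)
    xs = [(T - 550) / 75.0 for T in temps]      # centre/scale for conditioning
    a, b, c = fit_quadratic(xs, yields)
    best_T = max(temps,
                 key=lambda T: a + b * ((T - 550) / 75.0) + c * ((T - 550) / 75.0) ** 2)
    ```

    The fitted curvature `c` is negative (diminishing returns with temperature), while the surface still increases over the studied range, so the optimum sits at the 700 °C boundary, mirroring the trend in the abstract.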

  3. Simulation Models to Size and Retrofit District Heating Systems

    Directory of Open Access Journals (Sweden)

    Kevin Sartor


    Full Text Available District heating networks are considered convenient systems to supply heat to consumers while reducing CO2 emissions and increasing the use of renewable energy. However, to make them as profitable as possible, they have to be developed, operated and sized carefully. To meet these objectives, simulation tools are required to analyze several configuration schemes and control methods. Indeed, the most common concerns are heat losses, electric pump consumption and the peak heat demand, all while ensuring the comfort of the users. In this contribution, a dynamic simulation model of all the components of the network is described, dedicated to assessing energetic, environmental and economic indicators. Finally, the methodology is applied to an existing test case, namely the district heating network of the University of Liège, to study the pump control and minimize the network heat losses.
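    Two of the indicators named above — distribution heat losses and circulation-pump power — reduce to simple back-of-the-envelope formulas that a component model refines. The U-value, pipe length, flow and pressure drop below are hypothetical.

    ```python
    # Back-of-the-envelope district heating indicators (hypothetical values).

    def pipe_heat_loss_kw(length_m, u_w_per_mk, t_supply_c, t_ground_c):
        """Steady heat loss of a buried pipe: U * L * deltaT, in kW."""
        return u_w_per_mk * length_m * (t_supply_c - t_ground_c) / 1000.0

    def pump_power_kw(flow_m3_s, pressure_drop_pa, efficiency=0.7):
        """Hydraulic pump power: Q * dp / eta, in kW."""
        return flow_m3_s * pressure_drop_pa / efficiency / 1000.0

    loss = pipe_heat_loss_kw(length_m=2000, u_w_per_mk=0.35,
                             t_supply_c=90, t_ground_c=10)
    pump = pump_power_kw(flow_m3_s=0.05, pressure_drop_pa=4e5)
    ```

    Lowering the supply temperature cuts the loss term linearly, while lowering the flow (at higher deltaT across consumers) cuts pump power — exactly the trade-offs a dynamic network model lets one explore.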

  4. Planning Model Based on Projection Methodology (PM2)

    National Research Council Canada - National Science Library

    Ellner, Paul M; Hall, J. B


    There are a number of benefits associated with the new planning model. PM2 is unique in comparison to other reliability growth planning models in that it utilizes planning parameters that are directly influenced by program management...

  5. Molecular Simulation towards Efficient and Representative Subsurface Reservoirs Modeling

    KAUST Repository

    Kadoura, Ahmad


    This dissertation focuses on the application of Monte Carlo (MC) molecular simulation and Molecular Dynamics (MD) to modeling the thermodynamics and flow of subsurface reservoir fluids. First, MC molecular simulation is proposed as a promising method to replace correlations and equations of state in subsurface flow simulators. In order to accelerate MC simulations, a set of early rejection schemes (conservative, hybrid, and non-conservative), in addition to extrapolation methods through reweighting and reconstruction of pre-generated MC Markov chains, was developed. Furthermore, an extensive study was conducted to investigate the sorption and transport processes of methane, carbon dioxide, water, and their mixtures in the inorganic part of shale using both MC and MD simulations. These simulations covered a wide range of thermodynamic conditions, pore sizes, and fluid compositions, shedding light on several interesting findings, for example the possibility of adsorbing more carbon dioxide at higher preadsorbed water concentrations in relatively large basal spacings. The dissertation is divided into four chapters. The first chapter is introductory, giving a brief background on molecular simulation and the motivations for the work. The second chapter discusses the theoretical aspects and methodology of the proposed MC speed-up techniques, together with the corresponding results, leading to the successful multi-scale simulation of the compressible single-phase flow scenario. In chapter 3, the results of our extensive study on shale gas at laboratory conditions are reported. The fourth and last chapter closes the dissertation with a few concluding remarks highlighting the key findings and summarizing future directions.

  6. Synthesis of semantic modelling and risk analysis methodology applied to animal welfare

    NARCIS (Netherlands)

    Bracke, M.B.M.; Edwards, S.A.; Metz, J.H.M.; Noordhuizen, J.P.T.M.; Algers, B.


    Decision-making on animal welfare issues requires a synthesis of information. For the assessment of farm animal welfare based on scientific information collected in a database, a methodology called 'semantic modelling' has been developed. To date, however, this methodology has not been generally...

  7. International orientation on methodologies for modelling developments in road safety.

    NARCIS (Netherlands)

    Reurings, M.C.B. & Commandeur, J.J.F.


    This report gives an overview of the models developed in countries other than the Netherlands to evaluate past developments in road traffic safety and to obtain estimates of these developments in the future. These models include classical linear regression and loglinear models as applied in Great...

  8. Nuclear reactor core modelling in multifunctional simulators

    Energy Technology Data Exchange (ETDEWEB)

    Puska, E.K. [VTT Energy, Nuclear Energy, Espoo (Finland)


    The thesis concentrates on the development of nuclear reactor core models for the APROS multifunctional simulation environment and the use of the core models in various kinds of applications. The work was started in 1986 as a part of the development of the entire APROS simulation system. The aim was to create core models that would serve in a reliable manner in an interactive, modular and multifunctional simulator/plant analyser environment. One-dimensional and three-dimensional core neutronics models have been developed. Both models have two energy groups and six delayed neutron groups. The three-dimensional finite difference type core model is able to describe both BWR- and PWR-type cores with quadratic fuel assemblies and VVER-type cores with hexagonal fuel assemblies. The one- and three-dimensional core neutronics models can be connected with the homogeneous, the five-equation or the six-equation thermal hydraulic models of APROS. The key feature of APROS is that the same physical models can be used in various applications. The nuclear reactor core models of APROS have been built in such a manner that the same models can be used in simulator and plant analyser applications, as well as in safety analysis. In the APROS environment the user can select the number of flow channels in the three-dimensional reactor core and either the homogeneous, the five- or the six-equation thermal hydraulic model for these channels. The thermal hydraulic model and the number of flow channels have a decisive effect on the calculation time of the three-dimensional core model and thus, at present, these particular selections make the major difference between a safety analysis core model and a training simulator core model. The emphasis of this thesis is on the three-dimensional core model and its capability to analyse symmetric and asymmetric events in the core.
The factors affecting the calculation times of various three-dimensional BWR, PWR and WWER-type APROS core models have been...

  9. Mathematical and computational modeling and simulation fundamentals and case studies

    CERN Document Server

    Moeller, Dietmar P F


    Mathematical and Computational Modeling and Simulation - a highly multi-disciplinary field with ubiquitous applications in science and engineering - is one of the key enabling technologies of the 21st century. This book introduces the use of Mathematical and Computational Modeling and Simulation to develop an understanding of the solution characteristics of a broad class of real-world problems. The relevant basic and advanced methodologies are explained in detail, with special emphasis on ill-defined problems. Some 15 simulation systems are presented at the language and the logical level. Moreover, the reader can accumulate experience by studying a wide variety of case studies. The latter are briefly described within the book, but their full versions as well as some simulation software demos are available on the Web. The book can be used for university courses at different levels as well as for self-study. Advanced sections are marked and can be skipped in a first reading or in undergraduate courses...


    Directory of Open Access Journals (Sweden)



    Full Text Available Literature analysis concerning the selection or creation of a project management methodology is performed. The creation of a "complete" methodology is proposed, which can be applied to managing projects of any complexity, with various degrees of responsibility for results and different predictability of requirements. For the formation of the "complete" methodology, it is proposed to take the PMBOK standard as the basis, supplemented by processes of the most demanding plan-driven and flexible Agile methodologies. For each knowledge area of the PMBOK standard, the following groups of processes should be provided: initiation, planning, execution, reporting and forecasting, controlling, analysis, decision making, and closing. A method for generating a methodology for a specific project is presented. A multiple-criteria mathematical model and method are developed for the synthesis of the methodology when the initial data about the project and its environment are fuzzy.

  11. A conditional simulation model of intermittent rain fields

    Directory of Open Access Journals (Sweden)

    L. G. Lanza


    Full Text Available The synthetic generation of random fields with specified probability distribution, correlation structure and probability of no-rain areas is used as the basis for the formulation of a stochastic space-time rainfall model conditional on rain gauge observations. A new procedure for conditioning while preserving intermittence is developed to provide constraints to Monte Carlo realisations of possible rainfall scenarios. The method addresses the properties of the convolution operator involved in generating random field realisations and is actually independent of the numerical algorithm used for unconditional simulation. It requires only the solution of a linear system of algebraic equations whose order is given by the number of conditioning nodes. Applications of the methodology are expected in rainfall field reconstruction from sparse rain gauge data and in rainfall downscaling from the large-scale information that may be provided by remote sensing devices or numerical weather prediction models. Keywords: Space-time rainfall; Conditioning; Stochastic models
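    The conditioning step mentioned above — a linear system whose order equals the number of conditioning gauges — can be sketched for a Gaussian field in the classical form: conditional value = unconditional simulation + a kriging-type correction built from the residuals at the gauges. This is the generic conditioning identity, not the paper's intermittence-preserving procedure; the exponential covariance and gauge values are hypothetical, and the 2-gauge system is solved in closed form.

    ```python
    # Sketch of conditioning an unconditional field realisation on 2 gauges.
    # Covariance model and observations are hypothetical.
    import math

    def cov(d, sill=1.0, rng=20.0):
        return sill * math.exp(-d / rng)  # exponential covariance

    gauges = [(5.0, 1.2), (25.0, 0.4)]  # (position, observed rainfall anomaly)

    def condition(x, z_uncond_at_x, z_uncond_at_gauges):
        # Solve the 2x2 system C * lam = c0 for the kriging weights
        c11 = cov(0.0); c12 = cov(abs(gauges[0][0] - gauges[1][0]))
        c1 = cov(abs(x - gauges[0][0])); c2 = cov(abs(x - gauges[1][0]))
        det = c11 * c11 - c12 * c12
        lam1 = (c1 * c11 - c2 * c12) / det
        lam2 = (c2 * c11 - c1 * c12) / det
        resid = [obs - zu for (_, obs), zu in zip(gauges, z_uncond_at_gauges)]
        return z_uncond_at_x + lam1 * resid[0] + lam2 * resid[1]

    # At a gauge location the conditioned field honours the observation exactly:
    print(condition(5.0, 0.9, [0.9, 0.1]))
    ```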

  12. Mathematical modelling methodologies in predictive food microbiology: a SWOT analysis. (United States)

    Ferrer, Jordi; Prats, Clara; López, Daniel; Vives-Rego, Josep


    Predictive microbiology is the area of food microbiology that attempts to forecast the quantitative evolution of microbial populations over time. This is achieved to a great extent through models that include the mechanisms governing population dynamics. Traditionally, the models used in predictive microbiology are whole-system continuous models that describe population dynamics by means of equations applied to extensive or averaged variables of the whole system. Many existing models can be classified by specific criteria. We can distinguish between survival and growth models according to whether they tackle mortality or cell duplication. We can distinguish between empirical (phenomenological) models, which mathematically describe specific behaviour, and theoretical (mechanistic) models with a biological basis, which search for the underlying mechanisms driving already observed phenomena. We can also distinguish between primary, secondary and tertiary models, by examining their treatment of the effects of external factors and constraints on the microbial community. Recently, the use of spatially explicit Individual-based Models (IbMs) has spread through predictive microbiology, due to the current technological capacity for performing measurements on single cells and thanks to the consolidation of computational modelling. Spatially explicit IbMs are bottom-up approaches to microbial communities that build bridges between the description of micro-organisms at the cell level and macroscopic observations at the population level. They provide greater insight into the mesoscale phenomena that link the unicellular and population levels. Every model is built in response to a particular question and with different aims. Even so, in this research we conducted a SWOT (Strengths, Weaknesses, Opportunities and Threats) analysis of the different approaches (population continuous modelling and Individual-based Modelling), which we hope will be helpful for current and future...
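    A classic example of the whole-system continuous, primary model type discussed above is the modified Gompertz growth curve (Zwietering parameterization), giving log10 cell density as a function of time from a lag time `lam`, a maximum specific growth rate `mu`, and an asymptotic increase `A`. The parameter values below are hypothetical.

    ```python
    # Modified Gompertz primary growth model (hypothetical parameters):
    # log10 N(t) = log10 N0 + A * exp(-exp(mu*e/A * (lam - t) + 1))
    import math

    def gompertz_log10N(t, logN0=3.0, A=6.0, mu=0.5, lam=2.0):
        return logN0 + A * math.exp(-math.exp(mu * math.e / A * (lam - t) + 1.0))

    # Lag, exponential, and stationary phases along the sigmoid:
    for t in (0, 2, 10, 40):
        print(round(gompertz_log10N(t), 2))
    ```

    Secondary models would then describe how `mu` and `lam` depend on external factors such as temperature or pH, and tertiary models package both levels into software.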

  13. A simulated annealing methodology to multiproduct capacitated facility location with stochastic demand. (United States)

    Qin, Jin; Xiang, Hui; Ye, Yong; Ni, Linglin


    A stochastic multiproduct capacitated facility location problem involving a single supplier and multiple customers is investigated. Due to the stochastic demands, a reasonable amount of safety stock must be kept in the facilities to achieve suitable service levels, which results in increased inventory cost. Based on the assumption that all stochastic demands are normally distributed, a nonlinear mixed-integer programming model is proposed, whose objective is to minimize the total cost, including transportation cost, inventory cost, operation cost, and setup cost. A combined simulated annealing (CSA) algorithm is presented to solve the model, in which the outer-layer subalgorithm optimizes the facility location decision and the inner-layer subalgorithm optimizes the demand allocation based on the determined facility location decision. The results obtained with this approach show that the CSA is a robust and practical approach for solving the multiple-product problem, generating suboptimal facility location decisions and inventory policies. Meanwhile, we also found that the transportation cost and the demand deviation have the strongest influence on the optimal decision compared to the other factors.
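    The two-layer structure described above can be sketched as follows: an outer simulated-annealing loop perturbs the facility-location decision (opening or closing one facility per move), and an inner routine evaluates that decision by allocating customer demand. The inner layer here is a simple nearest-open-facility allocation, the safety-stock/inventory terms are omitted for brevity, and all costs and coordinates are hypothetical.

    ```python
    # Two-layer simulated-annealing sketch for facility location.
    # Costs, coordinates and cooling schedule are hypothetical.
    import math, random

    random.seed(7)

    # (x, y, setup cost) of candidate facilities; (x, y) of unit-demand customers
    facilities = [(0, 0, 50.0), (8, 1, 40.0), (4, 9, 45.0)]
    customers = [(1, 1), (7, 2), (5, 8), (2, 6)]

    def total_cost(open_set):
        """Inner layer: allocate each customer to its nearest open facility."""
        if not open_set:
            return float("inf")
        cost = sum(facilities[i][2] for i in open_set)          # setup cost
        for cx, cy in customers:                                # transportation cost
            cost += min(math.hypot(cx - facilities[i][0], cy - facilities[i][1])
                        for i in open_set)
        return cost

    def anneal(t0=20.0, cooling=0.98, steps=2000):
        """Outer layer: simulated annealing over the open/closed decision."""
        state = {0}
        best, best_cost = set(state), total_cost(state)
        t = t0
        for _ in range(steps):
            cand = set(state)
            cand ^= {random.randrange(len(facilities))}  # open or close one facility
            delta = total_cost(cand) - total_cost(state)
            if delta < 0 or random.random() < math.exp(-delta / t):
                state = cand
                if total_cost(state) < best_cost:
                    best, best_cost = set(state), total_cost(state)
            t *= cooling
        return best, best_cost

    best, best_cost = anneal()
    ```

    Accepting some uphill moves at high temperature is what lets the outer layer escape locally good but globally poor open/closed configurations.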

  14. A Simulated Annealing Methodology to Multiproduct Capacitated Facility Location with Stochastic Demand

    Directory of Open Access Journals (Sweden)

    Jin Qin


    Full Text Available A stochastic multiproduct capacitated facility location problem involving a single supplier and multiple customers is investigated. Due to the stochastic demands, a reasonable amount of safety stock must be kept in the facilities to achieve suitable service levels, which results in increased inventory cost. Based on the assumption that all stochastic demands are normally distributed, a nonlinear mixed-integer programming model is proposed, whose objective is to minimize the total cost, including transportation cost, inventory cost, operation cost, and setup cost. A combined simulated annealing (CSA) algorithm is presented to solve the model, in which the outer-layer subalgorithm optimizes the facility location decision and the inner-layer subalgorithm optimizes the demand allocation based on the determined facility location decision. The results obtained with this approach show that the CSA is a robust and practical approach for solving a multiple-product problem, generating suboptimal facility location decisions and inventory policies. Meanwhile, we also found that the transportation cost and the demand deviation have the strongest influence on the optimal decision compared to the other factors.

  15. A Response Surface Methodology Approach to Groundwater Model Calibration (United States)


    previously calibrated model to create a target data set had the advantages of eliminating uncertainties due to field measurement errors and providing...exponential, linear, quadratic, and spherical variogram models and the inverse distance method to the first, second, and third powers. The different types... Uncertainty in Numerical Models of Groundwater Flow, 1, Mathematical Development," Water Resour. Res., 17(1): 149-161, (1981). Frind, E.O. and G.F

  16. Proposal for product development model focused on ce certification methodology

    Directory of Open Access Journals (Sweden)

    Nathalia Marcia Goulart Pinheiro


    Full Text Available This paper presents a critical analysis comparing 21 product development models in order to identify whether these structures meet the demands of Product Certification of the European Community (CE). Furthermore, it presents a product development model, comprising the steps in the models analyzed, including improvements in activities for the referred product certification. The proposed improvements are justified by the growing quest for the internationalization of products and processes within companies.

  17. Modeling salmonella Dublin into the dairy herd simulation model Simherd

    DEFF Research Database (Denmark)

    Kudahl, Anne Braad


    Infection with Salmonella Dublin in the dairy herd and effects of the infection and relevant control measures are currently being modeled into the dairy herd simulation model called Simherd. The aim is to compare the effects of different control strategies against Salmonella Dublin on both within-herd prevalence and economy by simulations. The project is a part of a larger national project "Salmonella 2007 - 2011" with the main objective to reduce the prevalence of Salmonella Dublin in Danish dairy herds. Results of the simulations will therefore be used for decision support in the national surveillance and eradication program against Salmonella Dublin. Basic structures of the model are programmed and will be presented at the workshop. The model is in a phase of face-validation by a group of Salmonella......

  18. TRANSFORM - TRANsient Simulation Framework of Reconfigurable Models

    Energy Technology Data Exchange (ETDEWEB)


    Existing development tools for early stage design and scoping of energy systems are often time consuming to use, proprietary, and do not contain the necessary functionality to model complete systems (i.e., controls, primary, and secondary systems) in a common platform. The TRANSFORM tool, based on the Modelica programming language, (1) provides a standardized, common simulation environment for early design of energy systems (i.e., power plants), (2) provides a library of baseline component modules to be assembled into full plant models using available geometry, design, and thermal-hydraulic data, (3) defines modeling conventions for interconnecting component models, and (4) establishes user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.

  19. Robust modelling and simulation integration of SIMIO with coloured petri nets

    CERN Document Server

    De La Mota, Idalia Flores; Mujica Mota, Miguel; Angel Piera, Miquel


    This book presents for the first time a methodology that combines the power of a modelling formalism such as coloured Petri nets with the flexibility of a discrete event program such as SIMIO. Industrial practitioners have seen the growth of simulation as a methodology for tackling problems in which variability is the common denominator. Practically all industrial systems, from manufacturing to aviation, are considered stochastic systems. Different modelling techniques have been developed, as well as mathematical techniques for formalizing the cause-effect relationships in industrial and complex systems. The methodology in this book illustrates how complexity in modelling can be tackled by the use of coloured Petri nets, while at the same time the variability present in systems is integrated in a robust fashion. The book can be used as a concise guide for developing robust models, which are able to efficiently simulate the cause-effect relationships present in complex industrial systems without losing the simulat...

  20. Biological transportation networks: Modeling and simulation

    KAUST Repository

    Albi, Giacomo


    We present a model for biological network formation originally introduced by Cai and Hu [Adaptation and optimization of biological transport networks, Phys. Rev. Lett. 111 (2013) 138701]. The modeling of fluid transportation (e.g., leaf venation and angiogenesis) and ion transportation networks (e.g., neural networks) is explained in detail and basic analytical features like the gradient flow structure of the fluid transportation network model and the impact of the model parameters on the geometry and topology of network formation are analyzed. We also present a numerical finite-element based discretization scheme and discuss sample cases of network formation simulations.


    Directory of Open Access Journals (Sweden)

    Christian Lantuéjoul


    Full Text Available A Boolean model is a union of independent objects (compact random subsets) located at Poisson points. Two algorithms are proposed for simulating a Boolean model in a bounded domain. The first one applies only to stationary models. It generates the objects prior to their Poisson locations. Two examples illustrate its applicability. The second algorithm applies to stationary and non-stationary models. It generates the Poisson points prior to the objects. Its practical difficulties of implementation are discussed. Both algorithms are based on importance sampling techniques, and the generated objects are weighted.
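    A plain (unweighted) stationary Boolean model of random discs can be sampled directly, which helps fix the objects the abstract refers to. The exponential radius law and the dilated-window edge correction below are illustrative choices, not the paper's algorithms:

```python
import math
import random

random.seed(1)

def poisson(lam):
    """Knuth's inversion sampler for a Poisson count (small/moderate lam)."""
    l, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= l:
            return k
        k += 1

def boolean_model(intensity, mean_radius, window=(1.0, 1.0)):
    """Sample a stationary Boolean model of random discs observed in a
    rectangular window: Poisson germ points with i.i.d. radii
    (exponential grains here, an illustrative choice). Germs are drawn
    in a dilated window so discs reaching in from outside the window
    are not missed (edge correction with a crude radius truncation)."""
    w, h = window
    r_max = 6.0 * mean_radius
    area = (w + 2 * r_max) * (h + 2 * r_max)
    discs = []
    for _ in range(poisson(intensity * area)):
        x = random.uniform(-r_max, w + r_max)
        y = random.uniform(-r_max, h + r_max)
        r = min(random.expovariate(1.0 / mean_radius), r_max)
        discs.append((x, y, r))
    return discs

def covered(pt, discs):
    """Is a point covered by the union of grains?"""
    return any((pt[0] - x) ** 2 + (pt[1] - y) ** 2 <= r * r
               for x, y, r in discs)
```

Coverage probabilities of the sampled union can then be checked against the closed-form value 1 - exp(-lambda * E[pi R^2]) for the chosen grain law.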

  2. Modeling methodology of the conducted emission of a DC-DC converter board


    Boyer, Alexandre; Gonzalez Sentis, Manuel; Ghfiri, Chaimae; Durier, André


    This paper proposes a methodology to model an electronic board according to a bottom-up approach. The method is applied to build the model of a synchronous buck DC-DC converter board for conducted emission prediction purposes. The different steps to select the model terminals and the construction of the component and PCB interconnect models are described.

  3. Multiscale modelling and simulation: a position paper

    NARCIS (Netherlands)

    Hoekstra, A.; Chopard, B.; Coveney, P.


    We argue that, despite the fact that the field of multiscale modelling and simulation has enjoyed significant success within the past decade, it still holds many open questions that are deemed important but so far have barely been explored. We believe that this is at least in part due to the fact

  4. preliminary multidomain modelling and simulation study

    African Journals Online (AJOL)



  5. MATLAB Based PCM Modeling and Simulation


    Yongchao Jin; Hong Liang; Weiwei Feng; Qiong Wang


    PCM is a key technology of digital communication and has been widely used in optical fiber, digital microwave, and satellite communication. We model PCM communication systems with the pulse code system by programming, and conduct computer simulations in MATLAB to analyse the performance of linear PCM and logarithmic PCM.
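    The linear-versus-logarithmic comparison the abstract mentions comes down to the companding law applied before uniform quantization. A minimal sketch (in Python rather than the paper's MATLAB, using the standard mu-law with mu = 255 as in ITU-T G.711):

```python
import math

def mu_law_compress(x, mu=255.0):
    """mu-law compander used in logarithmic PCM; x in [-1, 1]."""
    return math.copysign(math.log1p(mu * abs(x)) / math.log1p(mu), x)

def mu_law_expand(y, mu=255.0):
    """Inverse of mu_law_compress."""
    return math.copysign(math.expm1(abs(y) * math.log1p(mu)) / mu, y)

def quantize(x, bits=8):
    """Uniform mid-rise quantizer on [-1, 1]."""
    levels = 2 ** bits
    step = 2.0 / levels
    q = math.floor(x / step) * step + step / 2
    return max(-1 + step / 2, min(1 - step / 2, q))

def pcm_error(x, bits=8, logarithmic=True):
    """Reconstruction error for one sample through the PCM chain."""
    if logarithmic:
        return mu_law_expand(quantize(mu_law_compress(x), bits)) - x
    return quantize(x, bits) - x
```

For small-amplitude samples, the logarithmic chain yields a much smaller reconstruction error than the linear one at the same bit depth, which is exactly why companding is used for speech.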

  6. Agent Based Modelling for Social Simulation

    NARCIS (Netherlands)

    Smit, S.K.; Ubink, E.M.; Vecht, B. van der; Langley, D.J.


    This document is the result of an exploratory project looking into the status of, and opportunities for Agent Based Modelling (ABM) at TNO. The project focussed on ABM applications containing social interactions and human factors, which we termed ABM for social simulation (ABM4SS). During the course

  7. Model based development of fruit simulators (United States)

    Huang, Huijian; Tunnicliffe, Mark; Shim, Young-Min; Bronlund, John E.


    Optimisation of temperature management in postharvest operations, such as precooling, requires extensive experimental measurement. For this purpose, real fruit are used, but due to their relatively high cost and perishable nature, commercial scale trials are not easily conducted. In addition, significant variability between trials exists (Vigneault et al., 2005). Physical fruit analogues or simulators could provide a solution to overcome these issues. To be a solution the fruit simulators must be designed to mimic the relevant heat transfer modes and properties of individual and/or bulk fruit, ideally using an inexpensive and durable material that allows the fruit simulator to be mass produced (Redding et al., 2016). In this paper, we use a mathematical model to characterize the relative importance of the different heat transfer modes occurring during precooling. Based on this model, the modes of heat transfer that must be matched by the fruit simulator are identified. A simplified model is used, representing four fruit stacked on top of each other in a column. The contribution of each heat transfer mode can be evaluated by including or excluding terms in the model.
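    The screening of heat transfer modes described above is often started with a Biot-number check and a lumped-capacitance estimate. The sketch below uses illustrative, water-like property values for a spherical "fruit"; it is not the four-fruit column model of the paper:

```python
import math

def biot(h, d, k):
    """Biot number for a sphere of diameter d (characteristic length d/6):
    the ratio of internal conduction resistance to surface convection
    resistance. Bi >> 0.1 means internal gradients cannot be neglected."""
    return h * (d / 6.0) / k

def lumped_cooling(T0, T_air, h, d, rho, cp, t):
    """Lumped-capacitance temperature of a spherical object after time t
    of convective precooling. Only valid when Bi < 0.1; property values
    passed in are illustrative."""
    area = math.pi * d ** 2
    vol = math.pi * d ** 3 / 6.0
    tau = rho * cp * vol / (h * area)     # thermal time constant [s]
    return T_air + (T0 - T_air) * math.exp(-t / tau)
```

For fruit-sized objects the Biot check typically fails (Bi is well above 0.1 for a 7 cm sphere with moderate airflow), which is precisely why a fruit simulator must reproduce internal conduction and not just surface convection.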

  8. Applying the epidemiologic problem oriented approach (EPOA) methodology in developing a knowledge base for the modeling of HIV/AIDS. (United States)

    Nganwa, David; Habtemariam, Tsegaye; Tameru, Berhanu; Gerbi, Gemechu; Bogale, Asseged; Robnett, Vinaida; Wilson, Wanda


    In the epidemiologic modeling of diseases, the epidemiologic problem oriented approach (EPOA) methodology facilitates the development of systematic and structured knowledge bases, which are crucial for the development of models. A detailed understanding of the epidemiology of a given disease provides the essential framework for model development and enables the laying out of the comprehensive and fundamental structures for the models. The objective of this work was to develop such a knowledge base for HIV/AIDS models. The EPOA methodology was utilized to develop the knowledge base for HIV/AIDS; it is composed of six pillars within two triads: the Problem Identification/Characterization and the Problem Management/Solution/Mitigation Triads, interlinked by the diagnostic procedure. Using information from various sources, the triads are decomposed into their respective pillar variables and parameters. The agent pillar identifies the causative agent (HIV) and its characteristics. The host pillar identifies and characterizes the host (human). The environment pillar characterizes the physical, biological and socioeconomic environments for both the host and agent. The therapeutics/treatment pillar considers the treatment options for HIV/AIDS. The prevention/control pillar considers prevention and control measures. The health maintenance/health promotion pillar considers measures for the health maintenance of the population. Models for HIV/AIDS can be conceptual, in vivo or in vitro, systems analysis, mathematical, or computational, to name a few. The knowledge base developed using the EPOA methodology provides a well-organized, structured source of information, which is used in variable and parameter estimation as well as analysis (biological, mathematical, statistical and computer simulations), which are crucial in epidemiologic modeling of HIV/AIDS. EPOA methodology has become an important tool in the development of models that can enlighten decision making in public health.

  9. Modeling the Cloud: Methodology for Cloud Computing Strategy and Design (United States)


    technology levels, and define parameters (as if to outsource) Example: Logical Data Model and Business Object CRUD Matrix § Integration and...right: § The data model excerpt identifies data entities, attributes and relationships, and § The Business Object CRUD Matrix maps primary functions... Generates detailed implementation roadmap, timeline as well as the impact analysis. • Implementation Governance: • Defines implementation contract

  10. Object-oriented analysis and design: a methodology for modeling the computer-based patient record. (United States)

    Egyhazy, C J; Eyestone, S M; Martino, J; Hodgson, C L


    The article highlights the importance of an object-oriented analysis and design (OOAD) methodology for the computer-based patient record (CPR) in the military environment. Many OOAD methodologies do not adequately scale up, allow for efficient reuse of their products, or accommodate legacy systems. A methodology that addresses these issues is formulated and used to demonstrate its applicability in a large-scale health care service system. During a period of 6 months, a team of object modelers and domain experts formulated an OOAD methodology tailored to the Department of Defense Military Health System and used it to produce components of an object model for simple order processing. This methodology and the lessons learned during its implementation are described. This approach is necessary to achieve broad interoperability among heterogeneous automated information systems.

  11. Modeling and Simulation of Count Data (United States)

    Plan, E L


    Count data, or number of events per time interval, are discrete data arising from repeated time to event observations. Their mean count, or piecewise constant event rate, can be evaluated by discrete probability distributions from the Poisson model family. Clinical trial data characterization often involves population count analysis. This tutorial presents the basics and diagnostics of count modeling and simulation in the context of pharmacometrics. Consideration is given to overdispersion, underdispersion, autocorrelation, and inhomogeneity. PMID:25116273
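    The over- and underdispersion issue raised in the tutorial is commonly handled by moving from a Poisson to a gamma-Poisson (negative binomial) model. A stdlib-only sketch with illustrative parameter values and function names:

```python
import math
import random

random.seed(2)

def sample_poisson(lam):
    """Knuth's inversion sampler for a Poisson count."""
    l, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= l:
            return k
        k += 1

def sample_neg_binomial(mean, overdispersion):
    """Gamma-Poisson mixture: draw a Poisson rate from a Gamma with the
    given mean and shape 1/overdispersion, so Var = mean*(1 + od*mean)."""
    shape = 1.0 / overdispersion
    lam = random.gammavariate(shape, mean / shape)
    return sample_poisson(lam)

def dispersion(xs):
    """Sample variance-to-mean ratio: ~1 for Poisson counts,
    >1 when the data are overdispersed."""
    m = sum(xs) / len(xs)
    v = sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    return v / m
```

Computing the variance-to-mean ratio of observed counts is a quick diagnostic for whether a plain Poisson model is adequate before fitting anything richer.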

  12. Global positioning system: a methodology for modelling the pseudorange measurements (United States)

    Barros, M. S. S.; Rosa, L. C. L.; Walter, F.; Alves, L. H. P. M.


    A model for GPS measurements (pseudorange) based on time-series statistics (ARIMA) is presented. The model is based on data collected by a Trimble differential GPS receiver over a period of 20 minutes, corresponding to more than 700 data points. The best-fitting model, ARIMA(2,2,1), was obtained after testing different models. The final model shows a square-law behavior and a residue with a Gaussian-like shape, a mean close to zero, and a standard deviation of 1.21 m. This result is confirmed by the Kolmogorov-Smirnov test at the 5% significance level. Independence is tested by computing the autocorrelation function, and it is shown that within the confidence interval the independence hypothesis is confirmed (Durbin-Watson test).
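    Two of the diagnostics named in the abstract are easy to state concretely: the "I" in ARIMA(p,d,q) is repeated differencing, and the Durbin-Watson statistic checks first-order residual autocorrelation. A minimal re-implementation (illustrative only, not the paper's data pipeline):

```python
def difference(xs, d=1):
    """Apply d-th order differencing, the 'I' in ARIMA(p,d,q).
    ARIMA(2,2,1) models the twice-differenced series."""
    for _ in range(d):
        xs = [b - a for a, b in zip(xs, xs[1:])]
    return xs

def durbin_watson(res):
    """Durbin-Watson statistic of a residual series: values near 2
    indicate no first-order autocorrelation, near 0 strong positive
    autocorrelation, near 4 strong negative autocorrelation."""
    num = sum((b - a) ** 2 for a, b in zip(res, res[1:]))
    return num / sum(r ** 2 for r in res)
```

Note that second-order differencing removes a quadratic trend exactly, which is consistent with the square-law behavior reported for the fitted ARIMA(2,2,1) model.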

  13. Towards a methodology for educational modelling: a case in educational assessment

    NARCIS (Netherlands)

    Giesbers, Bas; Van Bruggen, Jan; Hermans, Henry; Joosten-ten Brinke, Desirée; Burgers, Jan; Koper, Rob; Latour, Ignace


    Giesbers, B., van Bruggen, J., Hermans, H., Joosten-ten Brinke, D., Burgers, J., Koper, R., & Latour, I. (2007). Towards a methodology for educational modelling: a case in educational assessment. Educational Technology & Society, 10 (1), 237-247.

  14. Terminology and methodology in modelling for water quality management

    DEFF Research Database (Denmark)

    Carstensen, J.; Vanrolleghem, P.; Rauch, W.


    There is a widespread need for a common terminology in modelling for water quality management. This paper points out sources of confusion in the communication between researchers due to misuse of existing terminology or use of unclear terminology. The paper attempts to clarify the context...... of the most widely used terms for characterising models and within the process of model building. It is essential to the ever growing society of researchers within water quality management, that communication is eased by establishing a common terminology. This should not be done by giving broader definitions...

  15. Methodology and models in erosion research: discussion and conclusions

    National Research Council Canada - National Science Library

    Shellis, R P; Ganss, C; Ren, Y; Zero, D T; Lussi, A


    .... The prospects for clinical trials are also discussed. All models in erosion research require a number of choices regarding experimental conditions, study design and measurement techniques, and these general aspects are discussed first...

  16. Box & Jenkins Model Identification:A Comparison of Methodologies

    Directory of Open Access Journals (Sweden)

    Maria Augusta Soares Machado


    Full Text Available This paper presents a comparison of a neuro-fuzzy back-propagation network and automatic Forecast model identification for identifying Box & Jenkins non-seasonal models automatically. Recently, some combinations of neural network and fuzzy logic technologies have been used to deal with uncertain and subjective problems. On the basis of the obtained results, it is concluded that this type of approach is very powerful.

  17. A Comparative Study of Three Methodologies for Modeling Dynamic Stall (United States)

    Sankar, L.; Rhee, M.; Tung, C.; ZibiBailly, J.; LeBalleur, J. C.; Blaise, D.; Rouzaud, O.


    During the past two decades, there has been an increased reliance on the use of computational fluid dynamics methods for modeling rotors in high speed forward flight. Computational methods are being developed for modeling the shock-induced loads on the advancing side, first-principles based modeling of the trailing wake evolution, and for retreating blade stall. The retreating blade dynamic stall problem has received particular attention, because the large variations in lift and pitching moments encountered in dynamic stall can lead to blade vibrations and pitch link fatigue. Restricting attention to aerodynamics, the numerical prediction of dynamic stall is still a complex and challenging CFD problem that, even in two dimensions at low speed, gathers the major difficulties of aerodynamics, such as the grid resolution requirements for viscous phenomena at leading-edge bubbles or in mixing layers and the bias of numerical viscosity, as well as the major difficulties of physical modeling, such as the turbulence and transition models, whose determinant influences, already present in static maximum-lift or stall computations, are emphasized by the dynamic aspect of the phenomena.

  18. Systematic reviews of animal models: methodology versus epistemology. (United States)

    Greek, Ray; Menache, Andre


    Systematic reviews are currently favored methods of evaluating research in order to reach conclusions regarding medical practice. The need for such reviews is necessitated by the fact that no research is perfect and experts are prone to bias. By combining many studies that fulfill specific criteria, one hopes that the strengths can be multiplied and thus reliable conclusions attained. Potential flaws in this process include the assumptions that underlie the research under examination. If the assumptions, or axioms, upon which the research studies are based, are untenable either scientifically or logically, then the results must be highly suspect regardless of the otherwise high quality of the studies or the systematic reviews. We outline recent criticisms of animal-based research, namely that animal models are failing to predict human responses. It is this failure that is purportedly being corrected via systematic reviews. We then examine the assumption that animal models can predict human outcomes to perturbations such as disease or drugs, even under the best of circumstances. We examine the use of animal models in light of empirical evidence comparing human outcomes to those from animal models, complexity theory, and evolutionary biology. We conclude that even if legitimate criticisms of animal models were addressed, through standardization of protocols and systematic reviews, the animal model would still fail as a predictive modality for human response to drugs and disease. Therefore, systematic reviews and meta-analyses of animal-based research are poor tools for attempting to reach conclusions regarding human interventions.

  19. The methodology for the existing complex pneumatic systems efficiency increase with the use of mathematical modeling (United States)

    Danilishin, A. M.; Kartashov, S. V.; Kozhukhov, Y. V.; Kozin, E. G.


    A method for increasing the efficiency of existing complex pneumatic systems has been developed, including survey steps, mathematical modeling of the technological process, optimization of the pneumatic system configuration and its operation modes, and selection of optimal compressor units and additional equipment. Practical application of the methodology is illustrated by the reconstruction of an existing underground depot pneumatic system. The first stage of the methodology is a survey of the existing pneumatic system. The second stage is multivariable mathematical modeling of the pneumatic system operation. The developed methodology is applicable to complex pneumatic systems.

  20. Methodologic model to scheduling on service systems: a software engineering approach

    Directory of Open Access Journals (Sweden)

    Eduyn Ramiro Lopez-Santana


    Full Text Available This paper presents a software engineering approach to a research proposal for an expert system for scheduling in service systems, using methodologies and processes of software development. We use adaptive software development as the methodology for the software architecture, based on its description as a software metaprocess that characterizes the research process. We make UML (Unified Modeling Language) diagrams to provide a visual model that describes the research methodology, in order to identify the actors, elements and interactions in the research process.

  1. Software Abstractions and Methodologies for HPC Simulation Codes on Future Architectures

    Directory of Open Access Journals (Sweden)

    Anshu Dubey


    Full Text Available Simulations with multi-physics modeling have become crucial to many science and engineering fields, and multi-physics capable scientific software is as important to these fields as instruments and facilities are to experimental sciences. The current generation of mature multi-physics codes would have sustainably served their target communities with a modest amount of ongoing investment for enhancing capabilities. However, the revolution occurring in hardware architecture has made it necessary to tackle parallelism and performance management in these codes at multiple levels. The requirements of the various levels are often at cross-purposes with one another, and therefore hugely complicate the software design. All of these considerations make it essential to approach this challenge cooperatively as a community. We conducted a series of workshops under an NSF-SI2 conceptualization grant to get input from various stakeholders, and to identify broad approaches that might lead to a solution. In this position paper we detail the major concerns articulated by the application code developers, and emerging trends in utilization of programming abstractions that we found through these workshops.

  2. Code Blue: methodology for a qualitative study of teamwork during simulated cardiac arrest. (United States)

    Clarke, Samuel; Carolina Apesoa-Varano, Ester; Barton, Joseph


    In-hospital cardiac arrest (IHCA) is a particularly vexing entity from the perspective of preparedness, as it is neither common nor truly rare. Survival from IHCA requires the coordinated efforts of multiple providers with different skill sets who may have little prior experience working together. Survival rates have remained low despite advances in therapy, suggesting that human factors may be at play. This qualitative study uses a quasiethnographic data collection approach combining focus group interviews with providers involved in IHCA resuscitation as well as analysis of video recordings from in situ-simulated cardiac arrest events. Using grounded theory-based analysis, we intend to understand the organisational, interpersonal, cognitive and behavioural dimensions of IHCA resuscitation, and to build a descriptive model of code team functioning. This ongoing study has been approved by the IRB at UC Davis Medical Center. The results will be disseminated in a subsequent manuscript. Published by the BMJ Publishing Group Limited.

  3. Methodology for Measurement the Energy Efficiency Involving Solar Heating Systems Using Stochastic Modelling

    Directory of Open Access Journals (Sweden)

    Bruno G. Menita


    Full Text Available The purpose of the present study is to evaluate gains through a measurement and verification methodology adapted from the International Performance Measurement and Verification Protocol, using case studies involving Energy Efficiency Projects in Goias State, Brazil. This paper also presents stochastic modelling for the generation of future scenarios of electricity savings resulting from these Energy Efficiency Projects. The model is developed using the Geometric Brownian Motion stochastic process with mean reversion, associated with the Monte Carlo simulation technique. Results show that the electricity saved by replacing electric showers with solar water heating systems in homes of low-income families has great potential to bring financial benefits to such families, and that the reduction in peak demand obtained from this Energy Efficiency Action is advantageous to the Brazilian electrical system. Results also contemplate future scenarios of electricity savings and a sensitivity analysis, in order to verify how the values of some parameters influence the results, since no historical data are available for obtaining them.
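    A mean-reverting geometric process can be Monte-Carlo simulated with an Euler scheme on the log of the process, the log reverting to the log of a long-run level. The parameter values and the one-factor form below are illustrative, not calibrated to the paper's data:

```python
import math
import random

random.seed(3)

def simulate_paths(s0, s_bar, eta, sigma, dt, steps, n_paths):
    """Monte Carlo paths of a mean-reverting geometric process:
    x = log(S) follows dx = eta*(log(s_bar) - x)*dt + sigma*dW,
    so S reverts toward the long-run level s_bar at speed eta.
    An illustrative one-factor sketch, not the paper's calibration."""
    paths = []
    for _ in range(n_paths):
        x = math.log(s0)
        path = [s0]
        for _ in range(steps):
            x += (eta * (math.log(s_bar) - x) * dt
                  + sigma * math.sqrt(dt) * random.gauss(0.0, 1.0))
            path.append(math.exp(x))
        paths.append(path)
    return paths
```

After a few relaxation times 1/eta, the ensemble of paths settles around s_bar, which is what makes this process a natural model for consumption savings that fluctuate about a structural level.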

  4. Development of a fast high performance liquid chromatographic screening system for eight antidiabetic drugs by an improved methodology of in-silico robustness simulation. (United States)

    Mokhtar, Hatem I; Abdel-Salam, Randa A; Haddad, Ghada M


    Robustness of RP-HPLC methods is a crucial method quality attribute which has gained increasing interest throughout the efforts to apply quality-by-design concepts in analytical methodology. Improvement of design space modeling approaches to represent method robustness was the goal of many previous works. Modeling of design spaces with regard to method robustness fulfils the quality-by-design essence of ensuring method validity throughout the design space. The current work aimed to describe an improvement to robustness modeling of design spaces in the context of RP-HPLC method development for screening of eight antidiabetic drugs. The described improvement consisted of in-silico simulation of practical robustness testing procedures, and thus had the advantage of modeling design spaces with higher confidence in the estimates of method robustness. The proposed in-silico robustness test was performed as a full factorial design of simulated deliberate shifts in method conditions for each predicted point in the knowledge space, with modeling error propagation. The design space was then calculated as the zones exceeding a threshold probability of passing the simulated robustness testing. Potential design spaces were mapped for three different stationary phases as a function of gradient elution parameters, pH and ternary solvent ratio. A robust and fast separation of the eight compounds within less than 6 min was selected and confirmed through experimental robustness testing. The effectiveness of this approach in defining design spaces with ensured robustness and desired objectives was demonstrated. Copyright © 2015 Elsevier B.V. All rights reserved.
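    The "full factorial design of deliberate shifts" idea can be sketched generically: evaluate a response model at every corner of a factorial design around a candidate point and score the fraction of runs that pass. The parameter names, the toy retention model, and the acceptance criterion below are hypothetical stand-ins for the paper's models:

```python
import itertools

def robustness_pass_probability(nominal, deltas, predict, criterion):
    """In-silico robustness test at one candidate method point: evaluate
    the predicted response at every corner of a 3-level full factorial
    design of deliberate parameter shifts (-delta, 0, +delta) and return
    the fraction of simulated runs meeting the acceptance criterion.
    `predict` and `criterion` stand in for a retention/resolution model
    and a resolution requirement; both are illustrative here."""
    names = list(nominal)
    passed = total = 0
    for signs in itertools.product((-1, 0, 1), repeat=len(names)):
        cond = {n: nominal[n] + s * deltas[n] for n, s in zip(names, signs)}
        total += 1
        passed += criterion(predict(cond))
    return passed / total
```

Mapping this pass probability over the knowledge space and thresholding it yields a design space in the robustness-aware sense the abstract describes.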

  5. Fault diagnosis based on continuous simulation models (United States)

    Feyock, Stefan


    The results are described of an investigation of techniques for using continuous simulation models as a basis for reasoning about physical systems, with emphasis on the diagnosis of system faults. It is assumed that a continuous simulation model of the properly operating system is available. Malfunctions are diagnosed by posing the question: how can we make the model behave like that? The adjustments that must be made to the model to produce the observed behavior usually provide definitive clues to the nature of the malfunction. A novel application of Dijkstra's weakest precondition predicate transformer is used to derive the preconditions for producing the required model behavior. To minimize the size of the search space, an envisionment generator based on interval mathematics was developed. In addition to its intended application, the ability to generate qualitative state spaces automatically from quantitative simulations proved to be a fruitful avenue of investigation in its own right. Implementations of the Dijkstra transform and the envisionment generator are reproduced in the Appendix.
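    The interval mathematics underlying the envisionment generator propagates bounds instead of point values through a model. A minimal sketch (the class and its two operations are illustrative; the report's implementation is in its Appendix):

```python
class Interval:
    """Minimal interval arithmetic: [lo, hi] bounds propagated through
    arithmetic, so a simulation run over intervals bounds all behaviors
    of the underlying quantitative model."""

    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # All four endpoint products are candidates for the new bounds.
        ps = [self.lo * other.lo, self.lo * other.hi,
              self.hi * other.lo, self.hi * other.hi]
        return Interval(min(ps), max(ps))

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"
```

Running a model once on intervals yields a qualitative envelope of states, which is what trims the search space when inverting the model to explain an observed fault.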

  6. Multiphase reacting flows modelling and simulation

    CERN Document Server

    Marchisio, Daniele L


    The papers in this book describe the most widely applicable modeling approaches and are organized in six groups covering from fundamentals to relevant applications. In the first part, some fundamentals of multiphase turbulent reacting flows are covered. In particular the introduction focuses on basic notions of turbulence theory in single-phase and multi-phase systems as well as on the interaction between turbulence and chemistry. In the second part, models for the physical and chemical processes involved are discussed. Among other things, particular emphasis is given to turbulence modeling strategies for multiphase flows based on the kinetic theory for granular flows. Next, the different numerical methods based on Lagrangian and/or Eulerian schemes are presented. In particular the most popular numerical approaches of computational fluid dynamics codes are described (i.e., Direct Numerical Simulation, Large Eddy Simulation, and Reynolds-Averaged Navier-Stokes approach). The book will cover particle-based meth...

  7. A parallel computational model for GATE simulations. (United States)

    Rannou, F R; Vega-Acevedo, N; El Bitar, Z


    GATE/Geant4 Monte Carlo simulations are computationally demanding applications, requiring thousands of processor hours to produce realistic results. The classical strategy of distributing the simulation of individual events does not apply efficiently for Positron Emission Tomography (PET) experiments, because it requires a centralized coincidence processing and large communication overheads. We propose a parallel computational model for GATE that handles event generation and coincidence processing in a simple and efficient way by decentralizing event generation and processing but maintaining a centralized event and time coordinator. The model is implemented with the inclusion of a new set of factory classes that can run the same executable in sequential or parallel mode. A Mann-Whitney test shows that the output produced by this parallel model in terms of number of tallies is equivalent (but not equal) to its sequential counterpart. Computational performance evaluation shows that the software is scalable and well balanced. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
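    The Mann-Whitney check used to compare sequential and parallel tallies can be reproduced with a small rank-count implementation (stdlib only; the normal approximation below assumes moderately large samples and ignores tie correction, which the paper's analysis may handle differently):

```python
import math

def mann_whitney_u(xs, ys):
    """U statistic: number of (x, y) pairs with x > y, counting ties
    as one half. Direct O(n*m) count, fine for small samples."""
    u = 0.0
    for x in xs:
        for y in ys:
            u += 1.0 if x > y else (0.5 if x == y else 0.0)
    return u

def mann_whitney_z(xs, ys):
    """Normal-approximation z-score for U under the null hypothesis of
    identical distributions; |z| < 1.96 means no significant difference
    at the 5% level (no tie correction)."""
    n, m = len(xs), len(ys)
    mu = n * m / 2.0
    sigma = math.sqrt(n * m * (n + m + 1) / 12.0)
    return (mann_whitney_u(xs, ys) - mu) / sigma
```

Equivalence "in terms of number of tallies" then amounts to checking that the z-score for the two output samples stays inside the acceptance band.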

  8. Modeling, simulation and optimization of bipedal walking

    CERN Document Server

    Berns, Karsten


    The model-based investigation of motions of anthropomorphic systems is an important interdisciplinary research topic involving specialists from many fields such as Robotics, Biomechanics, Physiology, Orthopedics, Psychology, Neurosciences, Sports, Computer Graphics and Applied Mathematics. This book presents a study of basic locomotion forms such as walking and running, which are of particular interest due to the high demands they place on dynamic coordination, actuator efficiency and balance control. Mathematical models and numerical simulation and optimization techniques are explained, in combination with experimental data, which can help to better understand the basic mechanisms underlying these motions and to improve them. Example topics treated in this book are: modeling techniques for anthropomorphic bipedal walking systems; optimized walking motions for different objective functions; identification of objective functions from measurements; simulation and optimization approaches for humanoid robots; and biologically inspired con...

  9. Simulating the Sulphur Lamp with PLASIMO, a plasma simulation model. (United States)

    Johnston, C. W.; van der Heijden, H.; van Dijk, Jan; van der Mullen, Joost


    Several electrodeless lamps are currently available on the market, for example the Philips QL, Osram's Endura and GE's Genura. While these lamps make use of induction as a means of power coupling, the source of their light, namely mercury, remains the same as in older lamps. Another electrodeless configuration is the microwave-powered Sulphur Lamp. Sulphur lighting has several advantages over other lamp systems. Firstly, large fluxes (≈100,000 lm) of high-quality light are obtained with circuit efficacies of up to 60 percent. Secondly, unlike fluorescent and HID lamps, there is no decrease in brightness with time, since phosphors and electrodes are not needed. Another significant aspect of the sulphur lamp is that it contains no mercury, lessening the environmental hazards associated with disposal. In order to simulate the operation of this light source, PLASIMO, a plasma modeling tool developed at the Eindhoven University of Technology, was used. Modules were included to describe the transport properties and power in-coupling. Results of the simulations will be shown and compared with experiment.

  10. Three Integrating Service Management Models: A Comparative on Methodological Criteria

    Directory of Open Access Journals (Sweden)

    Jussara Goulart da Silva


    Full Text Available Market-oriented service delivery is a challenge for organizations around the world, whose solution involves the articulation of several disciplines. In this line, three models of service management stand out in the literature by their integrative nature: the Gaps (Hiatus) model, the Stage-Gate in Services, and the Multilevel Service Design. Little is known, however, about the standing of each of them as a theoretical conception. Thus, a preliminary comparative evaluation of the three models was performed according to methodological criteria. The results suggest that all have merit but are at distinct stages of theoretical consolidation. Even the most consolidated, the Gaps model, still calls for complementary work; the Multilevel Service Design is the one most in need of enhancement. Given the stage at which each model stands, some indications for practical application are attached to each. Improving this outlook depends on further research, in Brazil as well.

  11. Dynamics modeling and simulation of flexible airships (United States)

    Li, Yuwen

    The resurgence of airships has created a need for dynamics models and simulation capabilities of these lighter-than-air vehicles. The focus of this thesis is a theoretical framework that integrates the flight dynamics, structural dynamics, aerostatics and aerodynamics of flexible airships. The study begins with a dynamics model based on a rigid-body assumption. A comprehensive computation of aerodynamic effects is presented, where the aerodynamic forces and moments are categorized into various terms based on different physical effects. A series of prediction approaches for different aerodynamic effects are unified and applied to airships. The numerical results of aerodynamic derivatives and the simulated responses to control surface deflection inputs are verified by comparing to existing wind-tunnel and flight test data. With the validated aerodynamics and rigid-body modeling, the equations of motion of an elastic airship are derived by the Lagrangian formulation. The airship is modeled as a free-free Euler-Bernoulli beam and the bending deformations are represented by shape functions chosen as the free-free normal modes. In order to capture the coupling between the aerodynamic forces and the structural elasticity, local velocity on the deformed vehicle is used in the computation of aerodynamic forces. Finally, with the inertial, gravity, aerostatic and control forces incorporated, the dynamics model of a flexible airship is represented by a single set of nonlinear ordinary differential equations. The proposed model is implemented as a dynamics simulation program to analyze the dynamics characteristics of the Skyship-500 airship. Simulation results are presented to demonstrate the influence of structural deformation on the aerodynamic forces and the dynamics behavior of the airship. The nonlinear equations of motion are linearized numerically for the purpose of frequency domain analysis and for aeroelastic stability analysis. The results from the latter for the
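    The free-free normal modes used above as bending shape functions come from the Euler-Bernoulli characteristic equation cos(βL)·cosh(βL) = 1. A minimal stdlib-only sketch of locating its first elastic roots (the bracketing intervals and the bisection helper are illustrative, not taken from the thesis):

```python
import math

def bisect(f, a, b, tol=1e-12):
    """Simple bisection root finder (stdlib stand-in for scipy.optimize.brentq)."""
    fa = f(a)
    while b - a > tol:
        m = 0.5 * (a + b)
        if fa * f(m) <= 0:
            b = m
        else:
            a, fa = m, f(m)
    return 0.5 * (a + b)

# Characteristic equation of a free-free Euler-Bernoulli beam:
# cos(bL) * cosh(bL) = 1  (plus two rigid-body modes at bL = 0).
f = lambda x: math.cos(x) * math.cosh(x) - 1.0

# First three elastic roots lie in these brackets; they are ~4.730, 7.853, 10.996.
roots = [bisect(f, a, b) for a, b in [(4, 5), (7, 8), (10, 11)]]
print(roots)

# Natural frequencies then follow as w_n = (bL)_n**2 * sqrt(E*I/(rho*A)) / L**2
# for hypothetical hull bending stiffness E*I, mass per length rho*A, length L.
```

    The corresponding mode shapes are the free-free beam eigenfunctions used as the shape functions in the Lagrangian formulation.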

  12. Fast Multiscale Reservoir Simulations using POD-DEIM Model Reduction

    KAUST Repository

    Ghasemi, Mohammadreza


    In this paper, we present a global-local model reduction for fast multiscale reservoir simulations in highly heterogeneous porous media, with applications to optimization and history matching. Our proposed approach identifies a low-dimensional structure of the solution space. We introduce an auxiliary variable (the velocity field) in our model reduction that allows achieving a high degree of model reduction; the velocity field is conservative for any low-order reduced model in our framework. By contrast, a typical global model reduction based on POD is a Galerkin projection and thus cannot guarantee local mass conservation, as can be observed in numerical simulations that use finite volume based approaches. The Discrete Empirical Interpolation Method (DEIM) is used to approximate the nonlinear functions of fine-grid quantities in Newton iterations, which makes the computational cost independent of the fine-grid dimension. POD snapshots are inexpensively computed using local model reduction techniques based on the Generalized Multiscale Finite Element Method (GMsFEM), which provides (1) a hierarchical approximation of snapshot vectors, (2) adaptive computations by using coarse grids, and (3) inexpensive global POD operations in small-dimensional spaces on a coarse grid. By balancing the errors of the global and local reduced-order models, our new methodology can provide an error bound for the simulations. Our numerical results, using a two-phase immiscible flow, show a substantial speed-up, and we compare our results to the standard POD-DEIM in a finite volume setup.
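    The global POD step can be sketched in a few lines of NumPy: collect solution snapshots, take a thin SVD, and truncate by an energy criterion. The snapshot field below is manufactured for illustration; it is not a reservoir solve:

```python
import numpy as np

# Snapshot matrix: each column is one solution state (time/parameter sample).
# Manufactured 1-D field built from two spatial patterns, so the true rank is 2.
x = np.linspace(0, 1, 200)
t = np.linspace(0, 1, 40)
S = np.array([np.sin(np.pi * x) * np.exp(-ti) + 0.3 * np.sin(3 * np.pi * x) * ti
              for ti in t]).T                 # shape (200, 40)

# POD: thin SVD of the snapshot matrix; left singular vectors are the modes.
U, s, Vt = np.linalg.svd(S, full_matrices=False)

# Truncate to the smallest r capturing 99.99% of the snapshot energy.
energy = np.cumsum(s**2) / np.sum(s**2)
r = int(np.searchsorted(energy, 0.9999) + 1)
Ur = U[:, :r]                                 # reduced basis

# Galerkin-style projection: each snapshot is represented by r coefficients.
S_hat = Ur @ (Ur.T @ S)
rel_err = np.linalg.norm(S - S_hat) / np.linalg.norm(S)
print(r, rel_err)
```

    Because the manufactured snapshots span a two-dimensional subspace, the energy criterion selects r = 2 and the reconstruction error is at machine precision; for real reservoir snapshots r trades accuracy against speed-up.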

  13. A Simulation Model for Extensor Tendon Repair

    Directory of Open Access Journals (Sweden)

    Elizabeth Aronstam


    Full Text Available Audience: This simulation model is designed for use by emergency medicine residents. Although we have instituted it at the PGY-2 level of our residency curriculum, it is appropriate for any level of emergency medicine residency training. It might also be adapted for a variety of other learners, such as practicing emergency physicians, orthopedic surgery residents, or hand surgery trainees. Introduction: Tendon injuries commonly present to the emergency department, so it is essential that emergency physicians be competent in evaluating such injuries. Indeed, extensor tendon repair is included as an ACGME Emergency Medicine Milestone (Milestone 13, Wound Management, Level 5: "Performs advanced wound repairs, such as tendon repairs…" [1]). However, emergency medicine residents may have limited opportunity to develop these skills due to a lack of patients, competition from other trainees, or preexisting referral patterns. Simulation may provide an alternative means to effectively teach these skills in such settings. Previously described tendon repair simulation models designed for surgical trainees have used rubber worms [4], licorice [5], feeding tubes and catheters [6,7], drinking straws [8], microfoam tape [9], sheep forelimbs [10], and cadavers [11]. These models all suffer from a variety of limitations, including high cost, lack of ready availability, or lack of realism. Objectives: We sought to develop an extensor tendon repair simulation model for emergency medicine residents, designed to meet ACGME Emergency Medicine Milestone 13, Level 5. We wished this model to be simple, inexpensive, and realistic. Methods: The learner-responsible content/educational handout component of our innovation teaches residents about emergency department extensor tendon repair, and includes: (1) relevant anatomy; (2) indications and contraindications for emergency department extensor tendon repair; (3) physical exam findings; (4) tendon suture techniques; and (5) aftercare. During

  14. Modelling and simulation of thermal power plants

    Energy Technology Data Exchange (ETDEWEB)

    Eborn, J.


    Mathematical modelling and simulation are important tools when dealing with engineering systems that today are becoming increasingly complex. Integrated production and recycling of materials are trends that give rise to heterogeneous systems, which are difficult to handle within one area of expertise. Model libraries are an excellent way to package engineering knowledge of systems and units so that it can be reused by those who are not experts in modelling. Many commercial packages provide good model libraries, but they are usually domain-specific and closed. Heterogeneous, multi-domain systems require open model libraries written in general-purpose modelling languages. This thesis describes a model database for thermal power plants written in the object-oriented modelling language OMOLA. The models are based on first principles. Subunits describe volumes with pressure and enthalpy dynamics and flows of heat or different media. The subunits are used to build basic units such as pumps, valves and heat exchangers, which can in turn be used to build system models. Several applications are described: a heat recovery steam generator, equipment for juice blending, steam generation in a sulphuric acid plant and a condensing steam plate heat exchanger. Model libraries for industrial use must be validated against measured data. The thesis describes how parameter estimation methods can be used for model validation. Results from a case study on parameter optimization of a non-linear drum boiler model show how the technique can be used. 32 refs, 21 figs

  15. A methodology for eliciting, representing, and analysing stakeholder knowledge for decision making on complex socio-ecological systems: From cognitive maps to agent-based models

    NARCIS (Netherlands)

    El-Sawah, Sondoss; Guillaume, Joseph H.A.; Filatova, Tatiana; Rook, Josefine; Jakeman, Anthony J.


    This paper aims to contribute to developing better ways for incorporating essential human elements in decision making processes for modelling of complex socio-ecological systems. It presents a step-wise methodology for integrating perceptions of stakeholders (qualitative) into formal simulation

  16. Modeling and simulation of economic processes

    Directory of Open Access Journals (Sweden)

    Bogdan Brumar


    Full Text Available In general, any activity involves a longer course of action that is often characterized by a degree of uncertainty and insecurity about the size of the objective pursued. Because of the complexity of real economic systems and the stochastic dependencies between the variables and parameters considered, not all systems can be adequately represented by a model that can be solved by analytical methods while covering all the issues relevant to management and economic decision analysis over a realistic horizon. In such cases, simulation is often considered the only available alternative. Using simulation techniques to study real-world systems often requires laborious work, and carrying out a simulation experiment is a process that takes place in several stages.

  17. Mathematical models and numerical simulation in electromagnetism

    CERN Document Server

    Bermúdez, Alfredo; Salgado, Pilar


    The book represents a basic support for a master course in electromagnetism oriented to numerical simulation. Its main goal is that the reader learn the boundary-value problems of partial differential equations that must be solved in order to perform computer simulation of electromagnetic processes. Moreover, it includes a part devoted to electric circuit theory based on ordinary differential equations. The book is mainly oriented to electrical engineering applications, going from the general to the specific, namely, from the full Maxwell's equations to the particular cases of electrostatics, direct current, magnetostatics and eddy-current models. Apart from standard exercises related to analytical calculus, the book includes some others oriented to real-life applications solved with the MaxFEM free simulation software.

  18. Optimization Versus Robustness in Simulation : A Practical Methodology, With a Production-Management Case-Study

    NARCIS (Netherlands)

    Kleijnen, J.P.C.; Gaury, E.G.A.


    Whereas Operations Research has always paid much attention to optimization, practitioners judge the robustness of the 'optimum' solution to be of greater importance. Therefore this paper proposes a practical methodology that is a stagewise combination of the following four proven techniques: (1)

  19. Optimisation Strategies for Modelling and Simulation (United States)

    Louchet, Jean


    Progress in computation techniques has dramatically reduced the gap between modeling and simulation. Simulation, as the natural outcome of modeling, is used as a tool to predict the behavior of natural or artificial systems, to validate models, and to build and refine models, in particular to identify their internal parameters. In this paper we concentrate on the latter, model building and identification, using modern optimization techniques, through application examples taken from the digital imaging field. The first example is taken from image processing: retrieval of known patterns in an image. The second example is taken from synthetic image animation: we show how it is possible to learn a model's internal physical parameters from actual trajectory examples, using Darwin-inspired evolutionary algorithms. In the third example, we demonstrate how it is possible, when the problem cannot easily be handled by a reasonably simple optimization technique, to split it into simpler elements which can be efficiently evolved by an evolutionary optimization algorithm, an approach now called "Parisian Evolution". The "Fly algorithm" is a real-time stereovision algorithm which skips the conventional preliminary stages of image processing and is now applied in mobile robotics and medical imaging. The main question that remains is to what degree it is possible to delegate to a computer part of the physicist's role, which is to collect examples and build general laws from them.

  20. A Semantic Web-based Methodology for Building Conceptual Models of Scientific Information (United States)

    Benedict, J. L.; McGuinness, D. L.; Fox, P.


    We have designed, developed, and deployed a number of applications that leverage ontologies in interdisciplinary scientific settings. As our work has evolved, we have distilled a methodology for leveraging semantic technologies to build and deploy semantically-enhanced science community applications. We will present our semantic web methodology, which has been used in science applications covering Solar Terrestrial Physics, Atmospheric Research, Volcanology, and Plate Tectonics, among others. The methodology includes use case generation, carefully chosen mixed skill-set teams, use case analysis, development of a conceptual model, semantic tool usage, expert review and iteration, leveraging of technological infrastructure, rapid prototyping, and open-world evolution, iteration, redesign, and redeployment. In this presentation, we will show how the methodology has been used in our projects and highlight benefits including quick deployment and rapid community buy-in.

  1. A Methodological Framework for Instructional Design Model Development: Critical Dimensions and Synthesized Procedures (United States)

    Lee, Jihyun; Jang, Seonyoung


    Instructional design (ID) models have been developed to promote understandings of ID reality and guide ID performance. As the number and diversity of ID practices grows, implicit doubts regarding the reliability, validity, and usefulness of ID models suggest the need for methodological guidance that would help to generate ID models that are…

  2. Value stream mapping in a computational simulation model

    Directory of Open Access Journals (Sweden)

    Ricardo Becker Mendes de Oliveira


    Full Text Available The decision-making process has been extensively studied by researchers and executives. This paper uses the methodology of Value Stream Mapping (VSM) in an integrated manner with a computer simulation model, in order to expand managers' decision-making vision. The object of study is a production system that involves a process of automatic packaging of products, where it became necessary to implement changes in order to accommodate new products, so that the detection of bottlenecks and the visualization of the impacts generated by future modifications became necessary. The simulation aims to support the manager's decisions, considering that the system involves several variables whose behaviors define the complexity of the process. The main results were a significant reduction in project costs by anticipating system behavior, together with the use of Value Stream Mapping to identify activities that do or do not add value to the process. The validation of the simulation model is performed against the current map of the system and with the inclusion of Kaizen events, so that waste in future maps can be found in a practical and reliable way, supporting decision-making.
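    The bottleneck-detection idea can be sketched with a minimal deterministic simulation of a two-station packaging line; the cycle times and item counts below are hypothetical, not taken from the case study:

```python
def simulate_line(n_items, t1, t2):
    """Minimal discrete-event sketch of a two-station packaging line.

    Items leave station 1 every t1 time units and then queue for
    station 2 (cycle time t2). Returns the makespan and the utilization
    of each station; the higher utilization flags the bottleneck."""
    done1 = [(i + 1) * t1 for i in range(n_items)]  # station-1 completion times
    free2 = 0.0                                     # when station 2 next becomes idle
    for d in done1:
        start = max(d, free2)                       # wait if station 2 is still busy
        free2 = start + t2
    makespan = free2
    util1 = n_items * t1 / makespan
    util2 = n_items * t2 / makespan
    return makespan, util1, util2

makespan, util1, util2 = simulate_line(100, 1.0, 1.5)
print(round(util1, 2), round(util2, 2))             # prints: 0.66 0.99
```

    With the slower second station, its utilization saturates near 1 while station 1 idles, which is exactly the kind of bottleneck a future-state VSM would target with a Kaizen event.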

  3. Theory, modeling and simulation: Annual report 1993

    Energy Technology Data Exchange (ETDEWEB)

    Dunning, T.H. Jr.; Garrett, B.C.


    Developing the knowledge base needed to address the environmental restoration issues of the US Department of Energy requires a fundamental understanding of molecules and their interactions in isolation and in liquids, on surfaces, and at interfaces. To meet these needs, PNL has established the Environmental and Molecular Sciences Laboratory (EMSL) and will soon begin construction of a new, collaborative research facility devoted to advancing the understanding of environmental molecular science. Research in the Theory, Modeling, and Simulation (TMS) program, one of seven research directorates in the EMSL, will play a critical role in understanding molecular processes important in restoring DOE's research, development and production sites, including understanding the migration and reactions of contaminants in soils and groundwater, the development of separation processes for the isolation of pollutants, the development of improved materials for waste storage, understanding the enzymatic reactions involved in the biodegradation of contaminants, and understanding the interaction of hazardous chemicals with living organisms. The research objectives of the TMS program are to apply available techniques to study fundamental molecular processes involved in natural and contaminated systems; to extend current techniques to treat molecular systems of future importance and to develop techniques for addressing problems that are computationally intractable at present; to apply molecular modeling techniques to simulate molecular processes occurring in the multispecies, multiphase systems characteristic of natural and polluted environments; and to extend current molecular modeling techniques to treat complex molecular systems and to improve the reliability and accuracy of such simulations. The program contains three research activities: Molecular Theory/Modeling, Solid State Theory, and Biomolecular Modeling/Simulation. Extended abstracts are presented for 89 studies.

  4. Simulation modelling of fynbos ecosystems: Systems analysis and conceptual models

    CSIR Research Space (South Africa)

    Kruger, FJ


    Full Text Available This report outlines progress with the development of computer based dynamic simulation models for ecosystems in the fynbos biome. The models are planned to run on a portable desktop computer with 500 kbytes of memory, extended BASIC language...

  5. eShopper modeling and simulation (United States)

    Petrushin, Valery A.


    The advent of e-commerce gives an opportunity to shift the paradigm of customer communication into a highly interactive mode. The new generation of commercial Web servers, such as Blue Martini's server, combines the collection of data on customer behavior with real-time processing and dynamic tailoring of a feedback page. New opportunities for direct product marketing and cross-selling are emerging. The key problem is what kind of information we need to achieve these goals, or in other words, how do we model the customer? This paper is devoted to customer modeling and simulation, with a focus on modeling an individual customer. The model is based on the customer's transaction data, click-stream data, and demographics. It includes a hierarchical profile of the customer's preferences for different types of products and brands; consumption models for the different types of products; the current focus, trends, and stochastic models for time intervals between purchases; product affinity models; and some generalized features, such as purchasing power, sensitivity to advertising, and price sensitivity. This type of model is used for predicting the date of the next visit, overall spending, and spending for different types of products and brands. For some types of stores (for example, a supermarket) and stable customers, it is possible to forecast shopping lists rather accurately. The forecasting techniques are discussed. The forecasting results can be used for on-line direct marketing, customer retention, and inventory management. The customer model can also be used as a generative model for simulating the customer's purchasing behavior in different situations and for estimating the customer's features.

  6. Fuel cycle assessment: A compendium of models, methodologies, and approaches

    Energy Technology Data Exchange (ETDEWEB)


    The purpose of this document is to profile analytical tools and methods which could be used in a total fuel cycle analysis. The information in this document provides a significant step towards: (1) Characterizing the stages of the fuel cycle. (2) Identifying relevant impacts which can feasibly be evaluated quantitatively or qualitatively. (3) Identifying and reviewing other activities that have been conducted to perform a fuel cycle assessment or some component thereof. (4) Reviewing the successes/deficiencies and opportunities/constraints of previous activities. (5) Identifying methods and modeling techniques/tools that are available, tested and could be used for a fuel cycle assessment.

  7. Methodological aspects of journaling a dynamic adjusting entry model

    Directory of Open Access Journals (Sweden)

    Vlasta Kašparovská


    Full Text Available This paper expands the discussion of the importance and function of adjusting entries for loan receivables. Discussion of the cyclical development of adjusting entries, their negative impact on the business cycle and potential solutions has intensified during the financial crisis. These discussions are still ongoing and continue to be relevant to members of the professional public, banking regulators and representatives of international accounting institutions. The objective of this paper is to evaluate a method of journaling dynamic adjusting entries under current accounting law. It also expresses the authors’ opinions on the potential for consistently implementing basic accounting principles in journaling adjusting entries for loan receivables under a dynamic model.

  8. Efficient Turbulence Modeling for CFD Wake Simulations

    DEFF Research Database (Denmark)

    van der Laan, Paul

    Wind turbine wakes can cause 10-20% annual energy losses in wind farms, and wake turbulence can decrease the lifetime of wind turbine blades. One way of estimating these effects is the use of computational fluid dynamics (CFD) to simulate wind turbine wakes in the atmospheric boundary layer. Since this flow is in the high Reynolds number regime, it is mainly dictated by turbulence. As a result, the turbulence modeling in CFD dominates the wake characteristics, especially in Reynolds-averaged Navier-Stokes (RANS). The present work is dedicated to studying and developing RANS-based turbulence models... For a simulated wind farm, the results cannot be compared directly with wind farm measurements, which have a high uncertainty in the measured reference wind direction. When this uncertainty is used to post-process the CFD results, a fairer comparison with measurements is achieved.

  9. Forest growth and timber quality: crown models and simulation methods for sustainable forest management (United States)

    Dennis P. Dykstra; Robert A. Monserud


    The purpose of the international conference from which these proceedings are drawn was to explore relationships between forest management activities and timber quality. Sessions were organized to explore models and simulation methodologies that contribute to an understanding of tree development over time and the ways that management and harvesting activities can...

  10. Alternating Renewal Process Models for Behavioral Observation: Simulation Methods, Software, and Validity Illustrations (United States)

    Pustejovsky, James E.; Runyon, Christopher


    Direct observation recording procedures produce reductive summary measurements of an underlying stream of behavior. Previous methodological studies of these recording procedures have employed simulation methods for generating random behavior streams, many of which amount to special cases of a statistical model known as the alternating renewal…
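    An alternating renewal behavior stream and one common recording procedure, momentary time sampling, can be sketched as follows (the exponential duration means and the sampling interval are illustrative assumptions, not the paper's settings):

```python
import random

def simulate_stream(mu_event, mu_interim, total_time, rng):
    """Alternating renewal behavior stream: exponential interim times
    (behavior absent, mean mu_interim) alternate with exponential event
    durations (behavior present, mean mu_event)."""
    t, events = 0.0, []
    while t < total_time:
        t += rng.expovariate(1 / mu_interim)      # behavior absent
        start = t
        t += rng.expovariate(1 / mu_event)        # behavior present
        if start < total_time:
            events.append((start, min(t, total_time)))
    return events

def momentary_time_sampling(events, interval, total_time):
    """Fraction of interval endpoints at which the behavior is occurring."""
    k = int(total_time / interval)
    hits = sum(any(s <= i * interval < e for s, e in events)
               for i in range(1, k + 1))
    return hits / k

rng = random.Random(42)
T = 10_000.0
events = simulate_stream(mu_event=4.0, mu_interim=6.0, total_time=T, rng=rng)
prevalence = sum(e - s for s, e in events) / T    # true fraction of time
mts = momentary_time_sampling(events, interval=15.0, total_time=T)
print(round(prevalence, 2), round(mts, 2))
```

    With these means the long-run prevalence is 4/(4+6) = 0.4, and momentary time sampling estimates it approximately without bias; comparing such recorded summaries against the known generating process is the validity question this kind of simulation is designed to probe.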

  11. Modelling Altitude Information in Two-Dimensional Traffic Networks for Electric Mobility Simulation

    Directory of Open Access Journals (Sweden)

    Diogo Santos


    Full Text Available Elevation data is important for electric vehicle simulation. However, traffic simulators are often two-dimensional and do not offer the capability of modelling urban networks taking elevation into account. Specifically, SUMO - Simulation of Urban Mobility, a popular microscopic traffic simulator, relies on networks previously modelled with elevation data to provide this information during simulations. This work tackles the problem of adding elevation data to urban network models, particularly for the case of the Porto urban network in Portugal. With this goal in mind, a comparison between different altitude information retrieval approaches is made, and a simple tool to annotate network models with altitude data is proposed. The work starts by describing the methodological approach followed during research and development, then describes and analyses its main findings, including an in-depth explanation of the proposed tool. Lastly, related work on the subject is reviewed.

  12. A rigorous methodology for development and uncertainty analysis of group contribution based property models

    DEFF Research Database (Denmark)

    Frutiger, Jerome; Abildskov, Jens; Sin, Gürkan

    (ii) assessment of property model prediction errors; (iii) effect of outliers and data pre-treatment; (iv) formulation of the parameter estimation problem (e.g. weighted least squares, ordinary least squares, robust regression, etc.). In this study a comprehensive methodology is developed to perform a rigorous... and step-by-step assessment and solution of the pitfalls involved in developing models. The methodology takes into account the following steps. 1) Experimental data collection and structural information of molecules. 2) Choice of the regression model: a) ordinary least squares, b) robust, or c)... a) covariance matrix, b) bootstrap method; providing 95%-confidence intervals of parameters and predicted property. 6) Performance statistics analysis and model application. The application of the methodology is shown for a new GC model built to predict the lower flammability limit (LFL) of refrigerants...
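    The least-squares fit and bootstrap confidence-interval steps of such a workflow can be sketched as follows; the group counts, coefficients and noise level are invented illustrative data, not the refrigerant LFL set:

```python
import random

# Hypothetical group-contribution-style linear model: property_i = sum_j n_ij * c_j,
# where n_ij counts occurrences of group j in molecule i (illustrative data only).
rng = random.Random(0)
true_c = [2.0, -1.0, 0.5]                       # assumed "true" group contributions
X = [[rng.randint(0, 4) for _ in true_c] for _ in range(60)]
y = [sum(n * c for n, c in zip(row, true_c)) + rng.gauss(0, 0.1) for row in X]

def ols(X, y):
    """Ordinary least squares via the normal equations (X'X c = X'y)."""
    p = len(X[0])
    A = [[sum(r[j] * r[k] for r in X) for k in range(p)] for j in range(p)]
    b = [sum(r[j] * yi for r, yi in zip(X, y)) for j in range(p)]
    for j in range(p):                          # Gaussian elimination with pivoting
        piv = max(range(j, p), key=lambda i: abs(A[i][j]))
        A[j], A[piv], b[j], b[piv] = A[piv], A[j], b[piv], b[j]
        for i in range(j + 1, p):
            f = A[i][j] / A[j][j]
            A[i] = [a - f * aj for a, aj in zip(A[i], A[j])]
            b[i] -= f * b[j]
    c = [0.0] * p
    for j in reversed(range(p)):                # back substitution
        c[j] = (b[j] - sum(A[j][k] * c[k] for k in range(j + 1, p))) / A[j][j]
    return c

fit = ols(X, y)

# Bootstrap: refit on resampled (X, y) pairs; take empirical 95% intervals.
boots = []
for _ in range(200):
    idx = [rng.randrange(len(X)) for _ in X]
    boots.append(ols([X[i] for i in idx], [y[i] for i in idx]))
ci = []
for bs in zip(*boots):
    s = sorted(bs)
    ci.append((s[4], s[195]))                   # ~2.5% and ~97.5% quantiles
print([round(v, 2) for v in fit], ci)
```

    The bootstrap intervals quantify how much each group contribution would move under resampling of the data, which is the kind of parameter uncertainty the methodology propagates into the predicted property.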

  13. Difficulties with True Interoperability in Modeling & Simulation (United States)


    Standards in M&S cover multiple layers of technical abstraction. There are middleware specifications, such as the High Level Architecture (HLA), defined in IEEE Standard 1516-2010, "IEEE Standard for Modeling and Simulation (M&S) High Level Architecture (HLA) – Framework and Rules" (IEEE Xplore Digital Library, 2010), ...systems using different communication protocols being able to allow da...

  14. Streptococcus mutans, Caries and Simulation Models


    Ouwehand, Arthur C.; Björklund, Marika; Forssten, Sofia D.


    Dental caries and dental plaque are among the most common diseases worldwide, and are caused by a mixture of microorganisms and food debris. Specific types of acid-producing bacteria, especially Streptococcus mutans, colonize the dental surface and cause damage to the hard tooth structure in the presence of fermentable carbohydrates e.g., sucrose and fructose. This paper reviews the link between S. mutans and caries, as well as different simulation models that are available for studying carie...

  15. A Placement Model for Flight Simulators. (United States)


    ...simulator basing strategies. Captains David R. VanDenburg and Jon D. Veith developed a mathematical model to assist in the placement analysis of A-7...

  16. High-Fidelity Roadway Modeling and Simulation (United States)

    Wang, Jie; Papelis, Yiannis; Shen, Yuzhong; Unal, Ozhan; Cetin, Mecit


    Roads are an essential feature in our daily lives. With advances in computing technologies, 2D and 3D road models are employed in many applications, such as computer games and virtual environments. Traditional road models were generated manually by professional artists using modeling software tools such as Maya and 3ds Max. This approach requires both highly specialized, sophisticated skills and massive manual labor. Automatic road generation based on procedural modeling can create road models using specially designed computer algorithms or procedures, dramatically reducing the tedious manual editing needed for road modeling. But most existing procedural modeling methods for road generation put emphasis on the visual effects of the generated roads, not their geometrical and architectural fidelity. This limitation seriously restricts the applicability of the generated road models. To address this problem, this paper proposes a high-fidelity roadway generation method that takes into account road design principles practiced by civil engineering professionals; as a result, the generated roads can support not only general applications such as games and simulations, in which roads are used as 3D assets, but also demanding civil engineering applications, which require accurate geometrical models of roads. The inputs to the proposed method include road specifications, civil engineering road design rules, terrain information, and the surrounding environment. The proposed method then generates, in real time, 3D roads that have both high visual and geometrical fidelity. This paper discusses in detail the procedures that convert 2D roads specified in shape files into 3D roads, and the civil engineering road design principles involved. The proposed method can be used in many applications that have stringent requirements for high-precision 3D models, such as driving simulation and road design prototyping. Preliminary results demonstrate the effectiveness of the proposed method.

  17. Modelling interplanetary CMEs using magnetohydrodynamic simulations

    Directory of Open Access Journals (Sweden)

    P. J. Cargill

    Full Text Available The dynamics of Interplanetary Coronal Mass Ejections (ICMEs) are discussed from the viewpoint of numerical modelling. Hydrodynamic models are shown to give a good zero-order picture of the plasma properties of ICMEs, but they cannot model the important magnetic field effects. Results from MHD simulations are shown for a number of cases of interest. It is demonstrated that the strong interaction of the ICME with the solar wind leads to the ICME and solar wind velocities being close to each other at 1 AU, despite their having very different speeds near the Sun. It is also pointed out that this interaction leads to a distortion of the ICME geometry, making cylindrical symmetry a dubious assumption for the CME field at 1 AU. In the presence of a significant solar wind magnetic field, the magnetic fields of the ICME and solar wind can reconnect with each other, leading to an ICME that has solar wind-like field lines. This effect is especially important when an ICME with the right sense of rotation propagates down the heliospheric current sheet. It is also noted that a lack of knowledge of the coronal magnetic field makes such simulations of little use in space weather forecasts that require knowledge of the ICME magnetic field strength.

    Key words. Interplanetary physics (interplanetary magnetic fields); Solar physics, astrophysics, and astronomy (flares and mass ejections); Space plasma physics (numerical simulation studies)

  18. New temperature model of the Netherlands from new data and novel modelling methodology (United States)

    Bonté, Damien; Struijk, Maartje; Békési, Eszter; Cloetingh, Sierd; van Wees, Jan-Diederik


    Deep geothermal energy has grown in interest in Western Europe in recent decades, for direct use but also, as knowledge of the subsurface improves, for electricity generation. In the Netherlands, where the sector took off with the first system in 2005, geothermal energy is seen as a key player for a sustainable future. Knowledge of the subsurface temperature, together with the available flow from the reservoir, is an important factor that can determine the success of a geothermal energy project. To support the development of deep geothermal energy systems in the Netherlands, we made a first assessment of the subsurface temperature based on thermal data but also on geological elements (Bonté et al., 2012). An outcome of this work was ThermoGIS, which uses the temperature model. The present work is a revision of the model used in ThermoGIS. The improvements on the first model are multiple: we have improved not only the dataset used for the calibration and the structural model, but also the methodology, through improved software (called b3t). The temperature dataset has been updated by integrating temperatures from newly accessible wells. The sedimentary description of the basin has been improved by using an updated and refined structural model and an improved lithological definition. A major improvement comes from the methodology used to perform the modelling: with b3t the calibration is made not only using the lithospheric parameters but also using the thermal conductivity of the sediments. The result is a much more accurate definition of the model parameters and better handling of the calibration process. The result obtained is a precise and improved temperature model of the Netherlands. The thermal conductivity variation in the sediments, associated with the geometry of the layers, is an important driver of temperature variations, and the influence of the Zechstein salt in the north of the country is important. In addition, the radiogenic heat


    Directory of Open Access Journals (Sweden)



    Full Text Available Hydrocracking is used in the petroleum industry to convert low-quality feedstocks into high-value transportation fuels such as gasoline, diesel, and jet fuel. The aim of the present work is to develop a rigorous steady-state two-dimensional mathematical model, comprising conservation equations for mass and energy, to simulate the operation of a hydrocracking unit. Both the catalyst bed and the quench zone are included in this integrated model. The model equations were solved numerically in both the axial and radial directions using Matlab software. The model was tested against real plant data from Egypt. The results indicated very good agreement between the model predictions and industrial values for the temperature profiles, concentration profiles, and conversion in both the radial and axial directions of the hydrocracking unit. Simulation of the conversion and temperature profiles in the quench zone was also included and showed low deviation from the actual values. For the concentration profiles, the percentage deviation was found to be 9.28% in the first reactor and 9.6% in the second reactor. The effects of several parameters, namely the pellet heat transfer coefficient, effective radial thermal conductivity, wall heat transfer coefficient, effective radial diffusivity, and cooling medium (quench zone), are included in this study. Variation of the wall heat transfer coefficient and of the effective radial diffusivity for the near-wall region produced no remarkable changes in the temperature profiles. On the other hand, even small variations of the effective radial thermal conductivity affected the simulated temperature profiles significantly, and this effect could not be compensated by variations of the other model parameters.

  20. Modeling and simulation of coaxial helicopter rotor aerodynamics (United States)

    Gecgel, Murat

    A framework is developed for the computational fluid dynamics (CFD) analysis of a series of helicopter rotor flowfields in hover and in forward flight. The methodology is based on unsteady solutions of the three-dimensional, compressible Navier-Stokes equations recast in a rotating frame of reference. The simulations are carried out by solving the developed mathematical model on hybrid meshes that aim to optimally exploit the benefits of both structured and unstructured grids around complex configurations. The computer code is prepared for parallel processing with distributed memory utilization in order to significantly reduce the computational time and memory requirements. The developed model and simulation methodology are validated for single-rotor-in-hover flowfields by comparing the present results with published experimental data. The predictive merits of different turbulence models for complex helicopter aerodynamics are tested extensively. All but the k-ω and LES results demonstrate acceptable agreement with the experimental data. It was deemed best to use the one-equation Spalart-Allmaras turbulence model for the subsequent rotor flowfield computations. First, the flowfield around a single rotor in forward flight is simulated. These time-accurate computations help to analyze an adverse effect of increasing forward flight speed: a dissymmetry of lift between the advancing and retreating blades is observed for six different advance ratios. Since the coaxial rotor has been proposed to mitigate this dissymmetry, it is selected as the next logical step of the present investigation. Time-accurate simulations are successfully obtained for the flowfields generated by a coaxial rotor, first in hover and then in forward flight. The results for the coaxial rotor in forward flight verify the aerodynamic balance proposed by the previously published advancing blade concept.
The final set of analyses aims to investigate if the gap between the

  1. Computer Models Simulate Fine Particle Dispersion (United States)


    Through a NASA Seed Fund partnership with DEM Solutions Inc., of Lebanon, New Hampshire, scientists at Kennedy Space Center refined existing software to study the electrostatic phenomena of granular and bulk materials as they apply to planetary surfaces. The software, EDEM, allows users to import particles and obtain accurate representations of their shapes for modeling purposes, such as simulating bulk solids behavior, and was enhanced to be able to more accurately model fine, abrasive, cohesive particles. These new EDEM capabilities can be applied in many industries unrelated to space exploration and have been adopted by several prominent U.S. companies, including John Deere, Pfizer, and Procter & Gamble.

  2. Modeling and simulation of reactive flows

    CERN Document Server

    Bortoli, De AL; Pereira, Felipe


    Modelling and Simulation of Reactive Flows presents information on modeling and how to numerically solve reactive flows. The book offers a distinctive approach that combines diffusion flames and geochemical flow problems, providing users with a comprehensive resource that bridges the gap for scientists, engineers, and the industry. Specifically, the book looks at the basic concepts related to reaction rates, chemical kinetics, and the development of reduced kinetic mechanisms. It considers the most common methods used in practical situations, along with equations for reactive flows, and va

  3. Simulation models generator. Applications in scheduling

    Directory of Open Access Journals (Sweden)

    Omar Danilo Castrillón


    Rev. Mate. Teor. Aplic. (ISSN 1409-2433), Vol. 20(2): 231–241, July 2013. A simulation model generator is developed in order to approximate reality and support more assertive decision making. To test the prototype, a production system with 9 machines and 5 jobs in a job-shop configuration was used as the modeling example, with stochastic processing times and machine stoppages, and with machine utilization rates and average job time in the system as measures of system performance. This test shows the usefulness of the prototype, which saves the user the work of building the simulation model.

  4. Models of expected returns on the brazilian market: Empirical tests using predictive methodology

    Directory of Open Access Journals (Sweden)

    Adriano Mussa


    Full Text Available Predictive methodologies for testing expected-return models are widely diffused in the international academic environment. However, these methods have not been used in Brazil in a systematic way. Generally, empirical studies conducted with Brazilian stock market data are concentrated only on the first step of these methodologies. The purpose of this article was to test and compare the CAPM, 3-factor, and 4-factor models using a predictive methodology, considering two steps (time-series and cross-sectional regressions) with standard errors obtained by the techniques of Fama and MacBeth (1973). The results indicated the superiority of the 4-factor model over the 3-factor model, and of the 3-factor model over the CAPM, but none of the tested models was sufficient to explain Brazilian stock returns. Contrary to some empirical evidence that does not use predictive methodology, the size and momentum effects seem not to exist in the Brazilian capital markets, but there is evidence of a value effect and of the relevance of the market factor in explaining expected returns. These findings raise some questions, mainly because of the originality of the methodology in the local market and the fact that this subject is still incipient and polemic in the Brazilian academic environment.
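The two-step Fama-MacBeth (1973) procedure referred to above can be sketched in a minimal form. The variable names and the synthetic single-factor data below are illustrative assumptions; the article itself applies the procedure to the CAPM, 3-factor, and 4-factor models on Brazilian stock data.

```python
import numpy as np

def fama_macbeth(returns, factors):
    """Two-step Fama-MacBeth estimation.
    returns: T x N matrix of asset excess returns.
    factors: T x K matrix of factor realizations.
    Step 1: time-series regression per asset -> factor loadings (betas).
    Step 2: cross-sectional regression of each period's returns on the
    betas -> per-period risk premia; report their mean and the
    Fama-MacBeth standard errors (std of premia / sqrt(T))."""
    T, N = returns.shape
    X = np.column_stack([np.ones(T), factors])              # T x (K+1)
    betas = np.linalg.lstsq(X, returns, rcond=None)[0][1:].T  # N x K
    Xc = np.column_stack([np.ones(N), betas])               # N x (K+1)
    lambdas = np.array([np.linalg.lstsq(Xc, returns[t], rcond=None)[0]
                        for t in range(T)])                 # T x (K+1)
    mean_lambda = lambdas.mean(axis=0)
    se = lambdas.std(axis=0, ddof=1) / np.sqrt(T)
    return mean_lambda, se
```

On synthetic data with a known factor premium, the second entry of `mean_lambda` (the estimated premium on the factor) should recover the true value up to sampling noise.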

  5. A methodology for eliciting, representing, and analysing stakeholder knowledge for decision making on complex socio-ecological systems: from cognitive maps to agent-based models. (United States)

    Elsawah, Sondoss; Guillaume, Joseph H A; Filatova, Tatiana; Rook, Josefine; Jakeman, Anthony J


    This paper aims to contribute to developing better ways of incorporating essential human elements into decision-making processes for the modelling of complex socio-ecological systems. It presents a step-wise methodology for integrating the perceptions of stakeholders (qualitative) into formal simulation models (quantitative), with the ultimate goal of improving understanding and communication about decision making in complex socio-ecological systems. The methodology integrates cognitive mapping and agent-based modelling. It cascades through a sequence of qualitative/soft and numerical methods comprising: (1) interviews to elicit mental models; (2) cognitive maps to represent and analyse individual and group mental models; (3) time-sequence diagrams to chronologically structure the decision-making process; (4) an all-encompassing conceptual model of decision making; and (5) a computational (in this case agent-based) model. We apply the proposed methodology (labelled ICTAM) in a case study of viticulture irrigation in South Australia. Finally, we use strengths-weaknesses-opportunities-threats (SWOT) analysis to reflect on the methodology. Results show that the methodology leverages cognitive mapping to capture the richness of decision making and mental models, and provides a combination of divergent and convergent analysis methods leading to the construction of an agent-based model. Copyright © 2014 Elsevier Ltd. All rights reserved.
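As an illustration of how an elicited cognitive map can be made computable, one common formalization (a fuzzy cognitive map update; the paper does not specify this particular scheme, so it is shown purely as an assumed example) iterates concept activations over a weighted digraph:

```python
import math

def fcm_step(state, weights):
    """One synchronous update of a fuzzy cognitive map: each concept's
    new activation is a logistic squashing of the weighted sum of the
    activations of the concepts pointing at it.
    state: list of n activations in (0, 1).
    weights: n x n matrix, weights[i][j] = influence of concept i on j."""
    n = len(state)
    new = []
    for j in range(n):
        s = sum(state[i] * weights[i][j] for i in range(n))
        new.append(1.0 / (1.0 + math.exp(-s)))   # squash into (0, 1)
    return new
```

Iterating `fcm_step` to a fixed point gives a qualitative "what-if" outcome for a scenario encoded in the initial activations, which is one way such maps feed downstream agent-based models.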

  6. Simulating lightning into the RAMS model: two case studies (United States)

    Federico, Stefano; Avolio, Elenio; Petracca, Marco; Panegrossi, Giulia; Dietrich, Stefano


    In this paper we show the results of the implementation of a tailored version of a methodology, already presented in the literature, for simulating flashes in the Regional Atmospheric Modeling System (RAMS). The method gives the flash rate for each thundercloud, detected by a labelling algorithm applied to the output of RAMS. The flash rate is computed by assuming a plane-capacitor model, which is charged by the non-inductive graupel-ice charge-separation mechanism and discharged by lightning. The method explicitly considers the charging zone and uses the geometry of the graupel field to redistribute the flashes. An important feature of the method is that it gives the position and time of occurrence of each flash, allowing for a detailed and comprehensive display of the lightning activity during the simulation period. The method is applied to two case studies that occurred over the Lazio Region, in central Italy. Simulations are compared with the lightning detected by the LINET network. The cases refer to a thunderstorm characterized by intense lightning activity (up to 2800 flashes per hour over the Lazio Region) and a moderate thunderstorm (up to 1600 flashes per hour over the same domain). The results show that the model is able to capture the main features of both storms and their relative differences. This is promising because the method is computationally fast and gives the forecaster a tool to predict the lightning threat. Nevertheless, there are errors in the timing (O(3 h)) and positioning (O(100 km)) of the convection, which are mirrored in timing and position errors of the lightning distribution. These model shortcomings presently limit the use of the lightning forecast; nevertheless, the method can take advantage of future developments in the model physics, initialization techniques, and ensemble forecasting. A useful application of the method in an ensemble forecast is already suggested.
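The plane-capacitor idea can be sketched minimally: the cloud's electric field is charged at some rate and partially neutralized each time it reaches a breakdown threshold, with each reset counted as one flash. All names and parameter values below are illustrative assumptions; in the actual method the charging rate is derived from the simulated non-inductive graupel-ice charge separation in the RAMS fields.

```python
def capacitor_flash_count(charging_rate, e_break, e_neut, dt, t_end):
    """Plane-capacitor lightning proxy: integrate the field E forward in
    time; whenever E reaches the breakdown threshold e_break, subtract
    the neutralized amount e_neut and record one flash."""
    e = 0.0
    flashes = 0
    times = []
    t = 0.0
    while t < t_end:
        e += charging_rate * dt        # charging by charge separation
        if e >= e_break:
            e -= e_neut                # partial neutralization by a flash
            flashes += 1
            times.append(t)
        t += dt
    return flashes, times
```

With a constant charging rate the flash rate is simply `charging_rate / e_neut`; in the real scheme the rate varies in time and space with the storm's graupel and ice content.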

  7. Modeling and Design Analysis Methodology for Tailoring of Aircraft Structures with Composites (United States)

    Rehfield, Lawrence W.


    Composite materials provide design flexibility in that fiber placement and orientation can be specified and a variety of material forms and manufacturing processes are available. It is possible, therefore, to 'tailor' the structure to a high degree in order to meet specific design requirements in an optimum manner. Common industrial practices, however, have limited the choices designers make. One of the reasons for this is that there is a dearth of conceptual/preliminary design analysis tools specifically devoted to identifying structural concepts for composite airframe structures. Large scale finite element simulations are not suitable for such purposes. The present project has been devoted to creating modeling and design analysis methodology for use in the tailoring process of aircraft structures. Emphasis has been given to creating bend-twist elastic coupling in high aspect ratio wings or other lifting surfaces. The direction of our work was in concert with the overall NASA effort Twenty- First Century Aircraft Technology (TCAT). A multi-disciplinary team was assembled by Dr. Damodar Ambur to work on wing technology, which included our project.

  8. Geographic modelling of jaw fracture rates in Australia: a methodological model for healthcare planning. (United States)

    Kruger, Estie; Heitz-Mayfield, Lisa J A; Perera, Irosha; Tennant, Marc


    While Australians are one of the healthiest populations in the world, inequalities in access to health care and in health outcomes exist for Indigenous Australians and for Australians living in rural versus urban areas of the country. Hence, the purpose of this study was to develop an innovative methodological approach for predicting the incidence rates of jaw fractures and estimating the demand for oral health services within Australia. Population data were obtained from the Australian Bureau of Statistics, divided across Australia by statistical local area, and related to a validated remoteness index. Every episode of discharge from all hospitals in Western Australia for the financial years 1999/2000 to 2004/2005 indicating a jaw fracture as the principal oral condition, as classified by the International Classification of Diseases (ICD-10-AM), met the inclusion criterion for the study. Hospitalization data were obtained from the Western Australian Hospital Morbidity Data System. The model estimated almost 10 times higher jaw fracture rates for Indigenous populations than for their non-Indigenous counterparts. Moreover, the incidence of jaw fractures was higher among Indigenous people living in rural and remote areas compared with their urban and semi-urban counterparts. In contrast, in the non-Indigenous population, higher rates of jaw fractures were estimated for urban and semi-urban inhabitants compared with their rural and remote counterparts. This geographic modelling technique could be improved by methodological refinements and further research. It will be useful in developing strategies for health management and for reducing the burden of jaw fractures and the cost of treatment within Australia. The model also has direct implications for strategic planning of prevention and management policies in Australia aimed at reducing the inequalities gap, both in terms of geography and of Aboriginality.

  9. Non-LTE modeling of the radiative properties of high-Z plasma using linear response methodology (United States)

    Foord, Mark; Harte, Judy; Scott, Howard


    Non-local thermodynamic equilibrium (NLTE) atomic processes play a key role in the radiation flow and energetics of the highly ionized, high-temperature plasmas encountered in inertial confinement fusion (ICF) and astrophysical applications. Modeling complex high-Z atomic systems, such as the gold used in ICF hohlraums, is particularly challenging given the complexity and intractable number of atomic states involved. Practical considerations, i.e. speed and memory, in large radiation-hydrodynamic simulations further limit model complexity. We present here a methodology for utilizing tabulated NLTE radiative and EOS properties in our radiation-hydrodynamic codes. This approach uses tabulated data, previously calculated with complex atomic models, modified to include a general non-Planckian radiation field using a linear response methodology, which extends the near-LTE response method to conditions far from LTE. Comparisons of this tabular method with in-line NLTE simulations of a laser-heated 1-D hohlraum will be presented; these show good agreement in the time evolution of the plasma conditions. This work was performed under the auspices of the U.S. Dept. of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  10. Integrating Visualizations into Modeling NEST Simulations (United States)

    Nowke, Christian; Zielasko, Daniel; Weyers, Benjamin; Peyser, Alexander; Hentschel, Bernd; Kuhlen, Torsten W.


    Modeling large-scale spiking neural networks that show realistic biological behavior in their dynamics is a complex and tedious task. Since these networks consist of millions of interconnected neurons, their simulation produces an immense amount of data. In recent years it has become possible to simulate even larger networks. However, solutions to assist researchers in understanding a simulation's complex emergent behavior by means of visualization are still lacking. While developing tools to partially fill this gap, we encountered the challenge of integrating these tools easily into the neuroscientists' daily workflow. To understand what makes this so challenging, we looked into the workflows of our collaborators and analyzed how they use visualizations to solve their daily problems. We identified two major issues: first, the analysis process can rapidly change focus, which requires switching the visualization tool that assists in the current problem domain. Second, because of the heterogeneous data that result from simulations, researchers want to relate different data modalities in order to investigate them effectively. Since a monolithic application that processes and visualizes all data modalities and reflects all combinations of possible workflows in a holistic way is most likely impossible to develop and maintain, a software architecture that offers specialized visualization tools running simultaneously, which can be linked together to reflect the current workflow, is a more feasible approach. To this end, we have developed a software architecture that allows neuroscientists to integrate visualization tools more closely into their modeling tasks. In addition, it forms the basis for semantic linking of different visualizations to reflect the current workflow. In this paper, we present this architecture and substantiate the usefulness of our approach through common use cases we encountered in our collaborative work. PMID:26733860

  11. mRNA translation and protein synthesis: an analysis of different modelling methodologies and a new PBN based approach. (United States)

    Zhao, Yun-Bo; Krishnan, J


    mRNA translation involves the simultaneous movement of multiple ribosomes on the mRNA and is also subject to regulatory mechanisms at different stages. Translation can be described by various codon-based models, including ODE, TASEP, and Petri net models. Although such models have been extensively used, the overlap and differences between them, and the implications of each model's assumptions, have not been systematically elucidated. The selection of the most appropriate modelling framework, and the most appropriate way to develop coarse-grained or fine-grained models in different contexts, is not clear. We systematically analyze and compare how different modelling methodologies can be used to describe translation. We define various statistically equivalent codon-based simulation algorithms and analyze the importance of the update rule in determining the steady state, an aspect often neglected. A novel probabilistic Boolean network (PBN) model is then proposed for modelling translation, which enjoys an exact numerical solution. This solution matches those of numerical simulations from other methods and acts as a complementary tool to analytical approximations and simulations. The advantages and limitations of the various codon-based models are compared and illustrated by examples with real biological complexities such as slow codons, premature termination, and feedback regulation. Our studies reveal that while different models give broadly similar trends in many cases, important differences also arise and can be clearly seen in the dependence of the translation rate on different parameters. Furthermore, the update rule affects the steady-state solution. The codon-based models are based on different levels of abstraction.
Our analysis suggests that a multiple model approach to understanding translation allows one to ascertain which aspects of the conclusions are robust with respect to the choice of modelling methodology, and when (and why) important differences may
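Of the codon-based frameworks compared in the abstract, TASEP is the most directly simulable. A minimal random-sequential-update TASEP with a one-site ribosome footprint can be sketched as below; note that real ribosomes cover roughly ten codons and the rates here are illustrative, so this is a toy instance of the framework rather than any of the paper's models.

```python
import random

def simulate_tasep(length=100, init_rate=0.3, elong_rate=1.0,
                   term_rate=1.0, steps=200000, seed=0):
    """Totally asymmetric simple exclusion process on an mRNA lattice.
    Ribosomes (1s) enter at the 5' end, hop rightward one codon at a
    time if the next site is empty, and exit at the 3' end.
    Random-sequential update: one randomly chosen site per step.
    Returns (number of completed proteins, final ribosome density)."""
    rng = random.Random(seed)
    lattice = [0] * length
    completed = 0
    for _ in range(steps):
        i = rng.randrange(-1, length)          # -1 means "try initiation"
        if i == -1:
            if lattice[0] == 0 and rng.random() < init_rate:
                lattice[0] = 1                 # initiation
        elif i == length - 1:
            if lattice[i] == 1 and rng.random() < term_rate:
                lattice[i] = 0                 # termination
                completed += 1
        else:
            if lattice[i] == 1 and lattice[i + 1] == 0 \
                    and rng.random() < elong_rate:
                lattice[i], lattice[i + 1] = 0, 1   # elongation hop
    return completed, sum(lattice) / length
```

Making a single codon slow (a site-dependent `elong_rate`) or choosing a synchronous rather than random-sequential update changes the steady-state density profile, which is exactly the kind of update-rule sensitivity the paper analyzes.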

  12. Biomedical Simulation Models of Human Auditory Processes (United States)

    Bicak, Mehmet M. A.


    Detailed acoustic engineering models are developed that explore the noise propagation mechanisms associated with noise attenuation and the transmission paths created when hearing protectors such as earplugs and headsets are used in high-noise environments. Biomedical finite element (FE) models are developed from volume computed tomography scan data, which provide explicit geometry of the external ear, ear canal, middle-ear ossicular bones, and cochlea. Results from these studies have enabled a greater understanding of hearing-protector-to-flesh dynamics, as well as the prioritization of noise propagation mechanisms. Prioritization of noise mechanisms can form an essential framework for exploring new design principles and methods in both earplug and earcup applications. These models are currently being used in the development of a novel hearing protection evaluation system that can provide experimentally correlated psychoacoustic noise attenuation. Moreover, these FE models can be used to simulate the effects of blast-related impulse noise on human auditory mechanisms and brain tissue.

  13. Modeling and simulation of direct contact evaporators

    Directory of Open Access Journals (Sweden)

    Campos F.B.


    Full Text Available A dynamic model of a direct contact evaporator was developed and coupled to a recently developed superheated bubble model. The latter model takes into account heat and mass transfer during the bubble formation and ascension stages and is able to predict gas holdup in nonisothermal systems. The results of the coupled model, which does not have any adjustable parameter, were compared with experimental data. The transient behavior of the liquid-phase temperature and the vaporization rate under quasi-steady-state conditions were in very good agreement with experimental data. The transient behavior of liquid height was only reasonably simulated. In order to explain this partial disagreement, some possible causes were analyzed.

  14. Simulation and Modeling in High Entropy Alloys (United States)

    Toda-Caraballo, I.; Wróbel, J. S.; Nguyen-Manh, D.; Pérez, P.; Rivera-Díaz-del-Castillo, P. E. J.


    High entropy alloys (HEAs) form a fascinating field of research, with an increasing number of new alloys discovered. This would hardly be conceivable without the aid of materials modeling and computational alloy design to investigate the immense compositional space. The simplicity of the microstructures achieved contrasts with the enormous complexity of their compositions, which, in turn, increases the variety of property behavior observed. Simulation and modeling techniques are of paramount importance for understanding such material performance. There are numerous examples of how different models have explained observed experimental results; yet there are theories and approaches developed for conventional alloys, in which one element is predominant, that need to be adapted or re-developed. In this paper, we review the current state of the art of the modeling techniques applied to explain HEA properties, identifying potential new areas of research to improve the predictability of these techniques.

  15. Evaluating Supply Chain Management: A Methodology Based on a Theoretical Model

    Directory of Open Access Journals (Sweden)

    Alexandre Tadeu Simon


    Full Text Available Despite the increasing interest in supply chain management (SCM by researchers and practitioners, there is still a lack of academic literature concerning topics such as methodologies to guide and support SCM evaluation. Most developed methodologies have been provided by consulting companies and are restricted in their publication and use. This article presents a methodology for evaluating companies’ degree of adherence to a SCM conceptual model. The methodology is based on Cooper, Lambert and Pagh’s original contribution and involves analysis of eleven referential axes established from key business processes, horizontal structures, and initiatives & practices. We analyze the applicability of the proposed model based on findings from interviews with experts - academics and practitioners - as well as from case studies of three focal firms and their supply chains. In general terms, the methodology can be considered a diagnostic instrument that allows companies to evaluate their maturity regarding SCM practices. From this diagnosis, firms can identify and implement activities to improve degree of adherence to the reference model and achieve SCM benefits. The methodology aims to contribute to SCM theory development. It is an initial, but structured, reference for translating a theoretical approach into practical aspects.

  16. A methodology and supply chain management inspired reference ontology for modeling healthcare teams. (United States)

    Kuziemsky, Craig E; Yazdi, Sara


    Numerous studies and strategic plans are advocating more team based healthcare delivery that is facilitated by information and communication technologies (ICTs). However before we can design ICTs to support teams we need a solid conceptual model of team processes and a methodology for using such a model in healthcare settings. This paper draws upon success in the supply chain management domain to develop a reference ontology of healthcare teams and a methodology for modeling teams to instantiate the ontology in specific settings. This research can help us understand how teams function and how we can design ICTs to support teams.

  17. A FEM based methodology to simulate multiple crack propagation in friction stir welds

    DEFF Research Database (Denmark)

    Lepore, Marcello; Carlone, Pierpaolo; Berto, Filippo


    In this work a numerical procedure, based on a finite element approach, is proposed to simulate multiple three-dimensional crack propagation in a welded structure. Cracks are introduced in a friction stir welded AA2024-T3 butt joint, affected by a process-induced residual stress scenario....... The residual stress field was inferred by a thermo-mechanical FEM simulation of the process, considering temperature dependent elastic-plastic material properties, material softening and isotropic hardening. Afterwards, cracks introduced in the selected location of FEM computational domain allow stress...... insertion, as well as with respect to crack sizes measured in three different points for each propagation step. This FEM-based approach simulates the fatigue crack propagation by considering accurately the residual stress field generated by plastic deformations imposed on a structural component and has...

  18. Best Practices for Crash Modeling and Simulation (United States)

    Fasanella, Edwin L.; Jackson, Karen E.


    Aviation safety can be greatly enhanced by the expeditious use of computer simulations of crash impact. Unlike automotive impact testing, which is now routine, experimental crash tests of even small aircraft are expensive and complex due to the high cost of the aircraft and the myriad of crash impact conditions that must be considered. Ultimately, the goal is to utilize full-scale crash simulations of aircraft for design evaluation and certification. The objective of this publication is to describe "best practices" for modeling aircraft impact using explicit nonlinear dynamic finite element codes such as LS-DYNA, DYNA3D, and MSC.Dytran. Although "best practices" is somewhat relative, it is hoped that the authors' experience will help others to avoid some of the common pitfalls in modeling that are not documented in one single publication. In addition, a discussion of experimental data analysis, digital filtering, and test-analysis correlation is provided. Finally, some examples of aircraft crash simulations are described in several appendices following the main report.

  19. Systematic simulations of modified gravity: chameleon models

    Energy Technology Data Exchange (ETDEWEB)

    Brax, Philippe [Institut de Physique Theorique, CEA, IPhT, CNRS, URA 2306, F-91191Gif/Yvette Cedex (France); Davis, Anne-Christine [DAMTP, Centre for Mathematical Sciences, University of Cambridge, Wilberforce Road, Cambridge CB3 0WA (United Kingdom); Li, Baojiu [Institute for Computational Cosmology, Department of Physics, Durham University, Durham DH1 3LE (United Kingdom); Winther, Hans A. [Institute of Theoretical Astrophysics, University of Oslo, 0315 Oslo (Norway); Zhao, Gong-Bo [Institute of Cosmology and Gravitation, University of Portsmouth, Portsmouth PO1 3FX (United Kingdom)


    In this work we systematically study the linear and nonlinear structure formation in chameleon theories of modified gravity, using a generic parameterisation which describes a large class of models using only 4 parameters. For this we have modified the N-body simulation code ecosmog to perform a total of 65 simulations for different models and parameter values, including the default ΛCDM. These simulations enable us to explore a significant portion of the parameter space. We have studied the effects of modified gravity on the matter power spectrum and mass function, and found a rich and interesting phenomenology where the difference with the ΛCDM paradigm cannot be reproduced by a linear analysis even on scales as large as k ∼ 0.05 h Mpc⁻¹, since the latter incorrectly assumes that the modification of gravity depends only on the background matter density. Our results show that the chameleon screening mechanism is significantly more efficient than other mechanisms such as the dilaton and symmetron, especially in high-density regions and at early times, and can serve as guidance to determine the parts of the chameleon parameter space which are cosmologically interesting and thus merit further studies in the future.

  20. Application of numerical simulation methodology to automotive combustion. Project status report, October 28-November 24, 1978

    Energy Technology Data Exchange (ETDEWEB)



    Progress in developing mathematical models to describe combustion conditions with and without swirl in automotive engine combustion chambers and calculations performed with these models are discussed. (LCL)

  1. Closed loop models for analyzing engineering requirements for simulators (United States)

    Baron, S.; Muralidharan, R.; Kleinman, D.


    A closed-loop analytic model, incorporating a model of the human pilot (namely, the optimal control model), was developed to allow certain simulation design tradeoffs to be evaluated quantitatively. This model was applied to a realistic flight control problem. The resulting model is used to analyze both overall simulation effects and the effects of individual elements. The results show that, as compared to an ideal continuous simulation, the discrete simulation can result in significant performance and/or workload penalties.

  2. Modeling and Simulation of Amorphous Materials (United States)

    Pandey, Anup

    The general and practical inversion of diffraction data - producing a computer model correctly representing the material explored - is an important unsolved problem for disordered materials. Such modeling should proceed by using our full knowledge base, both from experiment and theory. In this dissertation, we introduce a robust method, Force-Enhanced Atomic Refinement (FEAR), which jointly exploits the power of ab initio atomistic simulation along with the information carried by diffraction data. As a preliminary trial, the method has been implemented using empirical potentials for amorphous silicon (a-Si) and silica (SiO2). The models obtained are comparable to the ones prepared by the conventional approaches as well as the experiments. Using ab initio interactions, the method is applied to two very different systems: amorphous silicon (a-Si) and two compositions of a solid electrolyte memory material, silver-doped GeSe3. It is shown that the method works well for both materials. Furthermore, the technique is easy to implement, is faster, and yields results much improved over conventional simulation methods for the materials explored. It offers a means to add a priori information in first principles modeling of materials, and represents a significant step toward the computational design of non-crystalline materials using accurate interatomic interactions and experimental information. Moreover, the method has also been used to create a computer model of a-Si, using highly precise X-ray diffraction data. The model predicts properties that are close to the continuous random network models but with no a priori assumptions. In addition, using ab initio molecular dynamics (AIMD) simulations, we explored doping and transport in hydrogenated amorphous silicon (a-Si:H) with the most popular impurities: boron and phosphorus. We investigated doping for these impurities and the role of H in the doping process. We revealed the network motion and H hopping induced by …

  3. Implications of Simulation Conceptual Model Development for Simulation Management and Uncertainty Assessment (United States)

    Pace, Dale K.


    A simulation conceptual model is a simulation developer's way of translating modeling requirements (i.e., what is to be represented by the simulation or its modification) into a detailed design framework (i.e., how it is to be done), from which the software, hardware, networks (in the case of distributed simulation), and systems/equipment that will make up the simulation can be built or modified. A conceptual model is the collection of information which describes a simulation developer's concept about the simulation and its pieces. That information consists of assumptions, algorithms, characteristics, relationships, and data. Taken together, these describe how the simulation developer understands what is to be represented by the simulation (entities, actions, tasks, processes, interactions, etc.) and how that representation will satisfy the requirements to which the simulation responds. Thus the conceptual model is the basis for judgment about simulation fidelity and validity for any condition that is not specifically tested. The more perspicuous and precise the conceptual model, the more likely it is that the simulation development will both fully satisfy requirements and allow demonstration that the requirements are satisfied (i.e., validation). Methods used in simulation conceptual model development have significant implications for simulation management and for assessment of simulation uncertainty. This paper suggests how to develop and document a simulation conceptual model so that the simulation fidelity and validity can be most effectively determined. These ideas for conceptual model development apply to all simulation varieties. The paper relates these ideas to uncertainty assessments as they relate to simulation fidelity and validity. The paper also explores implications for simulation management from conceptual model development methods, especially relative to reuse of simulation components.

  4. A Simple Memristor Model for Circuit Simulations (United States)

    Fullerton, Farrah-Amoy; Joe, Aaleyah; Gergel-Hackett, Nadine; Department of Chemistry; Physics Team

    This work describes the development of a model for the memristor, a novel nanoelectronic technology. The model was designed to replicate the real-world electrical characteristics of previously fabricated memristor devices, but was constructed with basic circuit elements using a free, widely available circuit simulator, LTspice. The modeled memristors were then used to construct a circuit that performs material implication. Material implication is a digital logic operation that can be used to perform all of the same basic functions as traditional CMOS gates, but with fewer nanoelectronic devices. This memristor-based digital logic could enable memristors' use in new paradigms of computer architecture with advantages in size, speed, and power over traditional computing circuits. Additionally, the ability to model the real-world electrical characteristics of memristors in a free circuit simulator using its standard library of elements could enable not only the development of memristor material implication, but also the development of a virtually unlimited array of other memristor-based circuits.
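
The abstract does not state which device equations the LTspice model reproduces; as a hedged companion sketch, the widely used HP linear ion-drift memristor equations (an assumption, with illustrative parameter values) can be integrated directly to produce the characteristic pinched current-voltage hysteresis:

```python
# Hypothetical sketch: HP linear ion-drift memristor driven by a sine voltage.
# Parameter values are illustrative, not taken from the cited devices.
import math

R_ON, R_OFF = 100.0, 16e3     # bounding resistances (ohms)
D, MU = 10e-9, 1e-14          # film thickness (m), ion mobility (m^2 s^-1 V^-1)
x = 0.5                       # normalized state: doped-region fraction
dt, f, V0 = 1e-6, 1e3, 1.0    # time step (s), drive frequency (Hz), amplitude (V)

trace = []
for n in range(5000):
    v = V0 * math.sin(2 * math.pi * f * n * dt)
    m = R_ON * x + R_OFF * (1.0 - x)      # instantaneous memristance
    i = v / m
    x += MU * R_ON / D**2 * i * dt        # linear ion-drift state update
    x = min(max(x, 0.0), 1.0)             # hard window: clamp state to [0, 1]
    trace.append((v, i))
```

Plotting `i` against `v` over a few drive periods would show the pinched hysteresis loop that an LTspice behavioral model of the same equations reproduces.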

  5. Spreadsheets Grow Up: Three Spreadsheet Engineering Methodologies for Large Financial Planning Models


    Grossman, Thomas A.; Ozluk, Ozgur


    Many large financial planning models are written in a spreadsheet programming language (usually Microsoft Excel) and deployed as a spreadsheet application. Three groups, FAST Alliance, Operis Group, and BPM Analytics (under the name "Spreadsheet Standards Review Board"), have independently promulgated standardized processes for efficiently building such models. These spreadsheet engineering methodologies provide detailed guidance on design, construction process, and quality control. We summarize…

  6. Methodology to Assess No Touch Audit Software Using Simulated Building Utility Data

    Energy Technology Data Exchange (ETDEWEB)

    Cheung, Howard [Purdue Univ., West Lafayette, IN (United States); Braun, James E. [Purdue Univ., West Lafayette, IN (United States); Langner, M. Rois [National Renewable Energy Lab. (NREL), Golden, CO (United States)


    This report describes a methodology developed for assessing the performance of no touch building audit tools and presents results for an available tool. Building audits are conducted in many commercial buildings to reduce building energy costs and improve building operation. Because the audits typically require significant input obtained by building engineers, they are usually only affordable for larger commercial building owners. In an effort to help small building and business owners gain the benefits of an audit at a lower cost, no touch building audit tools have been developed to remotely analyze a building's energy consumption.

  7. Modeling and Simulation Tools: From Systems Biology to Systems Medicine. (United States)

    Olivier, Brett G; Swat, Maciej J; Moné, Martijn J


    Modeling is an integral component of modern biology. In this chapter we look into the role of the model, as it pertains to Systems Medicine, and the software that is required to instantiate and run it. We do this by comparing the development, implementation, and characteristics of tools that have been developed to work with two divergent methodologies: Systems Biology and Pharmacometrics. From the Systems Biology perspective we consider the concept of "Software as a Medical Device" and what this may imply for the migration of research-oriented, simulation software into the domain of human health. In our second perspective, we see how in practice hundreds of computational tools already accompany drug discovery and development at every stage of the process. Standardized exchange formats are required to streamline the model exchange between tools, which would minimize translation errors and reduce the required time. With the emergence, almost 15 years ago, of the SBML standard, a large part of the domain of interest is already covered and models can be shared and passed from software to software without recoding them. Until recently the last stage of the process, the pharmacometric analysis used in clinical studies carried out on subject populations, lacked such an exchange medium. We describe a new emerging exchange format in Pharmacometrics which covers the non-linear mixed effects models, the standard statistical model type used in this area. By interfacing these two formats the entire domain can be covered by complementary standards and, subsequently, the corresponding tools.

  8. Validation of population-based disease simulation models: a review of concepts and methods

    Directory of Open Access Journals (Sweden)

    Sharif Behnam


    Full Text Available Abstract Background Computer simulation models are used increasingly to support public health research and policy, but questions about their quality persist. The purpose of this article is to review the principles and methods for validation of population-based disease simulation models. Methods We developed a comprehensive framework for validating population-based chronic disease simulation models and used this framework in a review of published model validation guidelines. Based on the review, we formulated a set of recommendations for gathering evidence of model credibility. Results Evidence of model credibility derives from examining: (1) the process of model development, (2) the performance of a model, and (3) the quality of decisions based on the model. Many important issues in model validation are insufficiently addressed by current guidelines. These issues include a detailed evaluation of different data sources, graphical representation of models, computer programming, model calibration, between-model comparisons, sensitivity analysis, and predictive validity. The role of external data in model validation depends on the purpose of the model (e.g., decision analysis versus prediction). More research is needed on the methods of comparing the quality of decisions based on different models. Conclusion As the role of simulation modeling in population health is increasing and models are becoming more complex, there is a need for further improvements in model validation methodology and common standards for evaluating model credibility.

  9. A Monte-Carlo simulation analysis for evaluating the severity distribution functions (SDFs) calibration methodology and determining the minimum sample-size requirements. (United States)

    Shirazi, Mohammadali; Reddy Geedipally, Srinivas; Lord, Dominique


    Severity distribution functions (SDFs) are used in highway safety to estimate the severity of crashes and conduct different types of safety evaluations and analyses. Developing a new SDF is a difficult task and demands significant time and resources. To simplify the process, the Highway Safety Manual (HSM) has started to document SDF models for different types of facilities. As such, SDF models have recently been introduced for freeways and ramps in the HSM addendum. However, since these functions or models are fitted and validated using data from a small number of selected states, they are required to be calibrated to the local conditions when applied to a new jurisdiction. The HSM provides a methodology to calibrate the models through a scalar calibration factor. However, the proposed methodology to calibrate SDFs was never validated through research. Furthermore, there are no concrete guidelines to select a reliable sample size. Using extensive simulation, this paper documents an analysis that examined the bias between the 'true' and 'estimated' calibration factors. The analysis indicated that, as the value of the true calibration factor deviates further away from '1', more bias is observed between the 'true' and 'estimated' calibration factors. In addition, simulation studies were performed to determine the calibration sample size for various conditions. It was found that, as the average of the coefficient of variation (CV) of the 'KAB' and 'C' crashes increases, the analyst needs to collect a larger sample size to calibrate SDF models. Taking this observation into account, sample-size guidelines are proposed based on the average CV of crash severities that are used for the calibration process. Copyright © 2016 Elsevier Ltd. All rights reserved.
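
The simulation design above can be scaffolded in a few lines. The sketch below is our simplification, not the paper's setup: severe-crash counts at each site follow a Poisson stand-in for the SDF, and the HSM-style scalar calibration factor is estimated as observed over predicted crashes. Function names and parameter values are ours.

```python
# Minimal Monte Carlo scaffold for calibration-factor bias (illustrative).
import math
import random

def poisson(rng, lam):
    # Knuth's multiplicative Poisson sampler; adequate for small means
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def mean_bias(c_true, n_sites=50, mean_pred=2.0, reps=2000, seed=1):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(reps):
        predicted = n_sites * mean_pred                     # model-predicted crashes
        observed = sum(poisson(rng, c_true * mean_pred)     # simulated crash counts
                       for _ in range(n_sites))
        total += observed / predicted - c_true              # estimated minus true factor
    return total / reps
```

Under this idealized Poisson stand-in the ratio estimator is unbiased for any `c_true`; the bias the paper reports arises from the multinomial structure of real SDF models, which this sketch deliberately omits.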

  10. Transforming GIS data into functional road models for large-scale traffic simulation. (United States)

    Wilkie, David; Sewall, Jason; Lin, Ming C


    There exists a vast amount of geographic information system (GIS) data that model road networks around the world as polylines with attributes. In this form, the data are insufficient for applications such as simulation and 3D visualization, tools which will grow in power and demand as sensor data become more pervasive and as governments try to optimize their existing physical infrastructure. In this paper, we propose an efficient method for enhancing a road map from a GIS database to create a geometrically and topologically consistent 3D model to be used in real-time traffic simulation, interactive visualization of virtual worlds, and autonomous vehicle navigation. The resulting representation provides important road features for traffic simulations, including ramps, highways, overpasses, legal merge zones, and intersections with arbitrary states, and it is independent of the simulation methodologies. We test the 3D models of road networks generated by our algorithm on real-time traffic simulation using both macroscopic and microscopic techniques.
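
A core preprocessing step implied by the abstract, turning attribute-bearing polylines into a topologically consistent graph, can be sketched by snapping polyline endpoints onto shared intersection nodes. This is a generic sketch under our own assumptions (function names are ours, not the paper's):

```python
# Build a node/edge road graph from GIS polylines by merging endpoints that
# coincide within a tolerance, so shared intersections become single nodes.
def build_road_graph(polylines, tol=1e-6):
    nodes, edges = [], []

    def node_id(pt):
        for i, q in enumerate(nodes):          # linear scan; a grid hash scales better
            if abs(pt[0] - q[0]) <= tol and abs(pt[1] - q[1]) <= tol:
                return i
        nodes.append(pt)
        return len(nodes) - 1

    for line in polylines:
        a, b = node_id(line[0]), node_id(line[-1])
        edges.append((a, b, line))             # edge keeps full geometry for later 3D lofting
    return nodes, edges
```

Two polylines meeting at a common endpoint then share one node, which is the topological consistency a traffic simulator needs before geometry (lanes, ramps, intersections) is generated on top.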

  11. Towards A Model-Based Prognostics Methodology For Electrolytic Capacitors: A Case Study Based On Electrical Overstress Accelerated Aging (United States)

    National Aeronautics and Space Administration — This paper presents a model-driven methodology for predicting the remaining useful life of electrolytic capacitors. This methodology adopts a Kalman filter...

  12. A new methodology for hydro-abrasive erosion tests simulating penstock erosive flow (United States)

    Aumelas, V.; Maj, G.; Le Calvé, P.; Smith, M.; Gambiez, B.; Mourrat, X.


    Hydro-abrasive resistance is an important property requirement for hydroelectric power plant penstock coating systems used by EDF. The selection of durable coating systems requires an experimental characterization of coating performance. This can be achieved by performing accelerated and representative laboratory tests. In the case of severe erosion induced by a penstock flow, there is no suitable method or standard representative of real erosive flow conditions. The presented study aims at developing a new methodology and an associated laboratory experimental device. The objective of the laboratory apparatus is to subject coated test specimens to wear conditions similar to the ones generated at the penstock lower generatrix in actual flow conditions. Thirteen preselected coating solutions were first tested in a 45-hour erosion test. A ranking of the thirteen coating solutions was then determined after characterisation. To complete this first evaluation and to determine the wear kinetics of the four best coating solutions, additional erosion tests were conducted with a longer duration of 216 hours. A comparison of this new method with standardized tests and with real service operating flow conditions is also discussed. To complete the final ranking based on hydro-abrasive erosion tests, some trial tests were carried out on penstock samples to check the application method of selected coating systems. The paper gives some perspectives related to erosion test methodologies for materials and coating solutions for hydraulic applications. The developed test method can also be applied in other fields.

  13. Mixed-realism simulation of adverse event disclosure: an educational methodology and assessment instrument. (United States)

    Matos, Francisco M; Raemer, Daniel B


    Physicians have an ethical duty to disclose adverse events to patients or families. Various strategies have been reported for teaching disclosure, but no instruments have been shown to be reliable for assessing them. The aims of this study were to report a structured method for teaching adverse event disclosure using mixed-realism simulation, develop and begin to validate an instrument for assessing performance, and describe the disclosure practice of anesthesiology trainees. Forty-two anesthesiology trainees participated in a 2-part exercise with mixed-realism simulation. The first part took place using a mannequin patient in a simulated operating room where trainees became enmeshed in a clinical episode that led to an adverse event and the second part in a simulated postoperative care unit where the learner is asked to disclose to a standardized patient who systematically moves through epochs of grief response. Two raters scored subjects using an assessment instrument we developed that combines a 4-element behaviorally anchored rating scale (BARS) and a 5-stage objective rating scale. The performance scores for elements within the BARS and the 5-stage instrument showed excellent interrater reliability (Cohen's κ = 0.7), appropriate range (mean range for BARS, 4.20-4.47; mean range for 5-stage instrument, 3.73-4.46), and high internal consistency (P …). The mixed-realism simulation engages learners in an adverse event and allows them to practice disclosure to a structured range of patient responses. We have developed a reliable 2-part instrument with strong psychometric properties for assessing disclosure performance.
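
The abstract reports inter-rater agreement as Cohen's κ. The standard two-rater computation on categorical scores is short enough to show directly (the implementation below is the textbook formula, not code from the study):

```python
# Cohen's kappa for two raters: observed agreement corrected for the
# agreement expected by chance from each rater's marginal distribution.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    expected = sum(ca[k] * cb[k] for k in ca) / n**2   # chance agreement
    return (observed - expected) / (1.0 - expected)
```

A κ of 0.7, as reported above, conventionally indicates substantial agreement beyond chance.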

  14. Assessment of a generalizable methodology to assess learning from manikin-based simulation technology. (United States)

    Giuliano, Dominic A; McGregor, Marion


    Objective: This study combined a learning outcomes-based checklist and salient characteristics derived from wisdom-of-crowds theory to test whether differing groups of judges (diversity maximized versus expertise maximized) would be able to appropriately assess videotaped, manikin-based simulation scenarios. Methods: Two groups of 3 judges scored 9 videos of interns managing a simulated cardiac event. The first group had a diverse range of knowledge of simulation procedures, while the second group was more homogeneous in their knowledge and had greater simulation expertise. All judges viewed 3 types of videos (predebriefing, postdebriefing, and 6 month follow-up) in a blinded fashion and provided their scores independently. Intraclass correlation coefficients (ICCs) were used to assess the reliability of judges as related to group membership. Scores from each group of judges were averaged to determine the impact of group on scores. Results: Results revealed strong ICCs for both groups of judges (diverse, 0.89; expert, 0.97), with the diverse group of judges having a much wider 95% confidence interval for the ICC. Analysis of variance of the average checklist scores indicated no significant difference between the 2 groups of judges for any of the types of videotapes assessed (F = 0.72, p = .4094). There was, however, a statistically significant difference between the types of videos (F = 14.39, p = .0004), with higher scores at the postdebrief and 6-month follow-up time periods. Conclusions: Results obtained in this study provide optimism for assessment procedures in simulation using learning outcomes-based checklists and a small panel of judges.
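
The reliability statistic used above, the intraclass correlation coefficient, can be computed from a one-way ANOVA decomposition of the scores. The sketch below implements the one-way random-effects ICC(1) (the abstract does not specify which ICC form the authors used, so treat this choice as an assumption):

```python
# One-way random-effects ICC(1): variance between rated targets relative to
# total variance, from between- and within-target mean squares.
def icc_oneway(scores):
    # scores: one list per rated video, one score per judge
    n, k = len(scores), len(scores[0])
    grand = sum(sum(row) for row in scores) / (n * k)
    ms_between = k * sum((sum(row) / k - grand) ** 2 for row in scores) / (n - 1)
    ms_within = sum((x - sum(row) / k) ** 2
                    for row in scores for x in row) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
```

Perfect agreement among judges drives the within-target mean square to zero and the ICC to 1, matching the intuition behind the 0.89 and 0.97 values reported.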

  16. Traffic flow dynamics data, models and simulation

    CERN Document Server

    Treiber, Martin


    This textbook provides a comprehensive and instructive coverage of vehicular traffic flow dynamics and modeling. It makes this fascinating interdisciplinary topic, which to date was only documented in parts by specialized monographs, accessible to a broad readership. Numerous figures and problems with solutions help the reader to quickly understand and practice the presented concepts. This book is targeted at students of physics and traffic engineering and, more generally, also at students and professionals in computer science, mathematics, and interdisciplinary topics. It also offers material for project work in programming and simulation at college and university level. The main part, after presenting different categories of traffic data, is devoted to a mathematical description of the dynamics of traffic flow, covering macroscopic models which describe traffic in terms of density, as well as microscopic many-particle models in which each particle corresponds to a vehicle and its driver. Focus chapters on ...

  17. Biomechanics trends in modeling and simulation

    CERN Document Server

    Ogden, Ray


    The book presents a state-of-the-art overview of biomechanical and mechanobiological modeling and simulation of soft biological tissues. Seven well-known scientists working in that particular field discuss topics such as biomolecules, networks and cells as well as failure, multi-scale, agent-based, bio-chemo-mechanical and finite element models appropriate for computational analysis. Applications include arteries, the heart, vascular stents and valve implants as well as adipose, brain, collagenous and engineered tissues. The mechanics of the whole cell and sub-cellular components as well as the extracellular matrix structure and mechanotransduction are described. In particular, the formation and remodeling of stress fibers, cytoskeletal contractility, cell adhesion and the mechanical regulation of fibroblast migration in healing myocardial infarcts are discussed. The essential ingredients of continuum mechanics are provided. Constitutive models of fiber-reinforced materials with an emphasis on arterial walls ...

  18. De-individualized psychophysiological strain assessment during a flight simulation test—Validation of a space methodology (United States)

    Johannes, Bernd; Salnitski, Vyacheslav; Soll, Henning; Rauch, Melina; Hoermann, Hans-Juergen

    For the evaluation of an operator's skill, reliability indicators of work quality as well as of psychophysiological states during the work have to be considered. The methodology and measurement equipment presented here were developed and tested in numerous terrestrial and space experiments using a simulation of a spacecraft docking on a space station. However, in this study the method was applied to a comparable terrestrial task—the flight simulator test (FST) used in the DLR selection procedure for ab initio pilot applicants for passenger airlines. This provided a large amount of data for a statistical verification of the space methodology. For the evaluation of the strain level of applicants during the FST, psychophysiological measurements were used to construct a "psychophysiological arousal vector" (PAV) which is sensitive to various individual reaction patterns of the autonomic nervous system to mental load. Its changes and increases are interpreted as "strain". In the first evaluation study, 614 subjects were analyzed. The subjects first underwent a calibration procedure for the assessment of their autonomic outlet type (AOT) and on the following day they performed the FST, which included three tasks and was evaluated by instructors applying well-established and standardized rating scales. This new method will possibly promote a wide range of other future applications in aviation and space psychology.

  19. Key-Aspects of Scientific Modeling Exemplified by School Science Models: Some Units for Teaching Contextualized Scientific Methodology (United States)

    Develaki, Maria


    Models and modeling are core elements of scientific methods and consequently also are of key importance for the conception and teaching of scientific methodology. The epistemology of models and its transfer and adaption to nature of science education are not, however, simple themes. We present some conceptual units in which school science models…

  20. An unstructured direct simulation Monte Carlo methodology with Kinetic-Moment inflow and outflow boundary conditions (United States)

    Gatsonis, Nikolaos A.; Chamberlin, Ryan E.; Averkin, Sergey N.


    The mathematical and computational aspects of the direct simulation Monte Carlo on unstructured tetrahedral grids (U3DSMC) with a Kinetic-Moment (KM) boundary conditions method are presented. The algorithms for particle injection, particle loading, particle motion, and particle tracking are presented. The KM method, applicable to a subsonic or supersonic inflow/outflow boundary, couples kinetic (particle) U3DSMC properties with fluid (moment) properties. The KM method obtains the number density, temperature and mean velocity needed to define the equilibrium, drifting Maxwellian distribution at a boundary. The moment component of KM is based on the local one-dimensional inviscid (LODI) boundary conditions method consistent with the 5-moment compressible Euler equations. The kinetic component of KM is based on U3DSMC for interior properties and the equilibrium drifting Maxwellian at the boundary. The KM method is supplemented with a time-averaging procedure, allows for choices in sampling-cell procedures, minimizes fluctuations, and accelerates convergence in subsonic flows. Collision sampling in U3DSMC implements the no-time-counter method and includes elastic and inelastic collisions. The U3DSMC with KM boundary conditions is validated and verified extensively with simulations of subsonic nitrogen flows in a cylindrical tube with imposed inlet pressure and density and imposed outlet pressure. The simulations cover the regime from slip to free-molecular with inlet Knudsen numbers between 0.183 and 18.27 and resulting inlet Mach numbers between 0.037 and 0.027. The pressure and velocity profiles from U3DSMC-KM simulations are compared with analytical solutions obtained from first-order and second-order slip boundary conditions. Mass flow rates from U3DSMC-KM are compared with validated analytical solutions for the entire Knudsen number regime considered. Error and sensitivity analysis is performed and numerical fractional errors are in agreement with theoretical…
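
The kinetic half of a KM-type inflow boundary reduces to drawing particle velocities from the drifting Maxwellian defined by the moment-side number density, temperature and mean velocity. A minimal sketch of that sampling step for nitrogen, with illustrative values and function names of our own (this is not the U3DSMC implementation):

```python
# Sample molecular velocities from a drifting Maxwellian: each Cartesian
# component is Gaussian with std dev sqrt(kB*T/m) about the drift velocity.
import math
import random

K_B = 1.380649e-23          # Boltzmann constant (J/K)
M_N2 = 4.65e-26             # nitrogen molecule mass (kg)

def sample_inflow(n_particles, temperature, drift, seed=0):
    rng = random.Random(seed)
    sigma = math.sqrt(K_B * temperature / M_N2)   # per-axis thermal speed
    return [(rng.gauss(drift[0], sigma),
             rng.gauss(drift[1], sigma),
             rng.gauss(drift[2], sigma)) for _ in range(n_particles)]
```

In a full DSMC inflow routine these velocities would additionally be flux-weighted toward the boundary normal; the plain Maxwellian draw above is the simplest starting point.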

  1. An Agent-Based Monetary Production Simulation Model

    DEFF Research Database (Denmark)

    Bruun, Charlotte


    An Agent-Based Simulation Model Programmed in Objective Borland Pascal. Program and source code are downloadable.

  2. Modelling and simulations of controlled release fertilizer (United States)

    Irfan, Sayed Ameenuddin; Razali, Radzuan; Shaari, Ku Zilati Ku; Mansor, Nurlidia


    Recent advancements in controlled release fertilizers have provided an alternative to conventional urea; controlled release fertilizers offer good plant nutrient uptake and are environmentally friendly. For optimum plant uptake of nutrients from a controlled release fertilizer, it is essential to understand its release characteristics. A mathematical model is developed to predict the release characteristics from a polymer-coated granule. Numerical simulations are performed by varying the granule radius, soil water content, and soil porosity to study their effect on fertilizer release. Understanding these parameters helps improve the design and efficiency of controlled release fertilizers.
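
The effect of granule radius on release can be illustrated with a common closed-form stand-in, Fickian diffusion out of a sphere (this is our assumption for illustration; the abstract does not specify the authors' governing equations or parameter values):

```python
# Fractional nutrient release from a sphere by Fickian diffusion (series
# solution): F(t) = 1 - (6/pi^2) * sum_k exp(-k^2 pi^2 D t / r^2) / k^2.
import math

def release_fraction(t, radius, diff=1e-12, terms=50):
    # t in seconds, radius in metres, diff = diffusivity (m^2/s, illustrative)
    s = sum(math.exp(-k * k * math.pi**2 * diff * t / radius**2) / (k * k)
            for k in range(1, terms + 1))
    return 1.0 - (6.0 / math.pi**2) * s
```

Evaluating the function at a fixed time for two radii shows the qualitative behaviour studied in the paper: doubling the granule radius quarters the diffusion rate constant, so larger granules release more slowly.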

  3. The mathematical model of a LUNG simulator

    Directory of Open Access Journals (Sweden)

    František Šolc


    Full Text Available The paper discusses the design, modelling, implementation and testing of a specific LUNG simulator. The described research was performed as part of the project AlveoPic – Advanced Lung Research for Veterinary Medicine of Particles for Inhalation. The simulator was designed to support a combined study programme comprising Biomedical Engineering Sciences (FEEC BUT) and Healthcare and Rehabilitation Technology (FH Technikum Wien). The simulator is intended as advanced laboratory equipment that should raise the standard of the existing research activities within the above-mentioned study programmes to the required level. Thus, the proposed paper introduces significant technical equipment for the laboratory education of students at both FH Technikum Wien and the Faculty of Electrical Engineering and Communication, Brno University of Technology. The apparatuses described here will also be used to support cooperative research activities. In the given context, the authors specify certain technical solutions and parameters related to artificial lungs, present the electrical equipment of the system, and point out the results of the PC-based measurement and control.

  4. Simulation model for port shunting yards (United States)

    Rusca, A.; Popa, M.; Rosca, E.; Rosca, M.; Dragu, V.; Rusca, F.


    Sea ports are important nodes in the supply chain, joining two high-capacity transport modes: rail and maritime transport. The huge cargo flows transiting a port require high-capacity constructions and installations such as berths, large-capacity cranes and shunting yards. However, port shunting yards raise several specific problems: limited access, since these are terminus stations of the rail network; large transit flows of cargo entering and leaving relative to the scarcity of ship departures and arrivals; and limited land availability for implementing solutions to serve these flows. It is necessary to identify technological solutions that address these problems. The paper proposes a simulation model, developed with the ARENA computer simulation software, suitable for shunting yards which serve sea ports with access to the rail network. It investigates the principal aspects of shunting yards and adequate measures to increase their transit capacity. The operating capacity of the shunting-yard sub-system is assessed taking into consideration the required operating standards, and measures of performance of the railway station (e.g. waiting time for freight wagons, number of railway lines in the station, storage area, etc.) are computed. The conclusions and results drawn from the simulation help transport and logistics specialists to test proposals for improving port management.

  5. Modeling and Simulation of Ultrasound Wave Propagation (United States)

    Isler, Sylvia Kay

    The specific aim of this work is to model diagnostic ultrasound under strong acoustic scattering conditions. This work is divided into three main sub-topics. The first concerns the solution of the Helmholtz integral equation in three dimensions. The Pade approximant method for accelerating the convergence of the Neumann series, first proposed by Chandra and Thompson for two-dimensional acoustic scattering problems, is extended to three dimensions. Secondly, the propagation of acoustic pulses through a medium characterized by spatial variations in compressibility is considered. The medium is excited using an ideal, bandlimited acoustic transducer having a Gaussian radiation profile. The time response is determined by using a spatial Fourier wavenumber decomposition of the incident and scattered pressure fields. Using the Pade approximant method, the pressure is evaluated for each wavenumber at each spatial grid location. By taking the inverse Fourier transform of the result, the temporal and spatial evolution of the pressure field is obtained. The third part examines acoustic wave propagation in simulated soft tissue. Methods for generating spatially correlated random media are discussed and applied to simulating the structure of soft tissue. Simulated sonograms are constructed and the effects of strong scattering are considered.
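    The acceleration idea can be demonstrated on a scalar toy problem. The sketch below (illustrative only, not the 3-D scattering code) applies the Shanks transformation, a close relative of the Pade approximant approach named above, to the slowly converging geometric (Neumann-type) series for (1 - x)^(-1):

```python
# Toy sketch of series acceleration (not the paper's scattering solver):
# the Neumann series for (1 - x)^{-1} is the geometric series sum x^n,
# which converges slowly for x near 1. The Shanks transformation, closely
# related to the Pade approximant idea, accelerates the partial sums.
def partial_sums(x, n):
    s, total = [], 0.0
    for k in range(n):
        total += x**k
        s.append(total)
    return s

def shanks(s):
    """One Shanks transform of a sequence of partial sums."""
    return [(s[i + 1] * s[i - 1] - s[i] ** 2) /
            (s[i + 1] + s[i - 1] - 2 * s[i])
            for i in range(1, len(s) - 1)]

x = 0.9
exact = 1.0 / (1.0 - x)
s = partial_sums(x, 12)
print(abs(s[-1] - exact))          # slow direct convergence
print(abs(shanks(s)[-1] - exact))  # accelerated (exact for a geometric series)
```

For a geometric series the transform is exact, which mirrors why rational (Pade-type) resummation can converge where the raw Neumann series is slow.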

  6. Evaluation of automated decision making methodologies and development of an integrated robotic system simulation: Study results (United States)

    Haley, D. C.; Almand, B. J.; Thomas, M. M.; Krauze, L. D.; Gremban, K. D.; Sanborn, J. C.; Kelley, J. H.; Depkovich, T. M.; Wolfe, W. J.; Nguyen, T.


    The implementation of a generic computer simulation for manipulator systems (ROBSIM) is described. The program is written in FORTRAN and allows the user to: (1) interactively define a manipulator system consisting of multiple arms, load objects, targets, and an environment; (2) request graphic display or replay of manipulator motion; (3) investigate and simulate various control methods, including manual force/torque and active compliance control; and (4) perform kinematic analysis, requirements analysis, and response simulation of manipulator motion. Previous reports have described the algorithms and procedures for using ROBSIM. Those reports are superseded, and the features which were added since are described. They are: (1) the ability to define motion profiles and compute loads on a common base to which manipulator arms are attached; (2) the capability to accept data describing manipulator geometry from a Computer Aided Design database using the Initial Graphics Exchange Specification format; (3) a manipulator control algorithm derived from processing the TV image of known reference points on a target; and (4) a vocabulary of simple high-level task commands which can be used to define task scenarios.

  7. Modeling and Simulation of a Wind Turbine Driven Induction Generator Using Bond Graph

    Directory of Open Access Journals (Sweden)

    Lachouri Abderrazak


    Full Text Available The objective of this paper is to investigate the modelling and simulation of a wind turbine driving an induction generator using the bond graph methodology, a graphical and multi-domain approach. Bond graphs provide a precise and unambiguous modelling tool, which allows for the specification of hierarchical physical structures. The paper begins with an introduction to the bond graph technique, followed by an implementation of the wind turbine model. Simulation results illustrate the simplified system response obtained using the 20-sim software.

  8. Heartbeat Model for Component Failure in Simulation of Plant Behavior

    Energy Technology Data Exchange (ETDEWEB)

    R. W. Youngblood; R. R. Nourgaliev; D. L. Kelly; C. L. Smith; T-N. Dinh


    As part of the Department of Energy’s “Light Water Reactor Sustainability Program” (LWRSP), tools and methodology for risk-informed characterization of safety margin are being developed for use in supporting decision-making on plant life extension after the first license renewal. Beginning with the traditional discussion of “margin” in terms of a “load” (a physical challenge to system or component function) and a “capacity” (the capability of that system or component to accommodate the challenge), we are developing the capability to characterize realistic probabilistic load and capacity spectra, reflecting both aleatory and epistemic uncertainty in system behavior. This way of thinking about margin comports with work done in the last 10 years. However, current capabilities to model in this way are limited: it is currently possible, but difficult, to validly simulate enough time histories to support quantification in realistic problems, and the treatment of environmental influences on reliability is relatively artificial in many existing applications. The INL is working on a next-generation safety analysis capability (widely referred to as “R7”) that will enable a much better integration of reliability-related and phenomenology-related aspects of margin. In this paper, we show how to implement cumulative damage (“heartbeat”) models for component reliability that lend themselves naturally to being included as part of the phenomenology simulation. Implementation of this modeling approach relies on the way in which the phenomenology simulation implements its dynamic time step management. Within this approach, component failures influence the phenomenology, and the phenomenology influences the component failures.
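    A minimal sketch of such a cumulative-damage component model, with hypothetical numbers throughout, might look as follows: at each time step the component accrues wear at a rate driven by the simulated environment, and it fails once the accumulated damage exceeds a sampled capacity, so that phenomenology drives failure and failure in turn would feed back into the phenomenology.

```python
import random

# Minimal sketch (hypothetical numbers, not the R7 implementation) of a
# cumulative-damage "heartbeat" component model inside a time-stepped plant
# simulation: each step, the component accrues damage at a rate driven by
# the simulated environment, and fails when the accumulated damage exceeds
# a sampled capacity.
def simulate_component(temps, rate_per_degC, capacity, dt=1.0):
    damage = 0.0
    for step, temp in enumerate(temps):
        damage += rate_per_degC * max(temp - 300.0, 0.0) * dt  # load-dependent wear
        if damage >= capacity:
            return step            # failure time step
    return None                    # survived the transient

random.seed(1)
temps = [300 + 5 * t for t in range(50)]   # a heating transient [K]
capacity = random.gauss(100.0, 10.0)       # aleatory capacity draw
print(simulate_component(temps, 0.05, capacity))
```

Sampling many capacity draws against many simulated transients is what yields the probabilistic load and capacity spectra discussed above.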

  9. Anatomy and Physiology of Multiscale Modeling and Simulation in Systems Medicine. (United States)

    Mizeranschi, Alexandru; Groen, Derek; Borgdorff, Joris; Hoekstra, Alfons G; Chopard, Bastien; Dubitzky, Werner


    Systems medicine is the application of systems biology concepts, methods, and tools to medical research and practice. It aims to integrate data and knowledge from different disciplines into biomedical models and simulations for the understanding, prevention, cure, and management of complex diseases. Complex diseases arise from the interactions among disease-influencing factors across multiple levels of biological organization from the environment to molecules. To tackle the enormous challenges posed by complex diseases, we need a modeling and simulation framework capable of capturing and integrating information originating from multiple spatiotemporal and organizational scales. Multiscale modeling and simulation in systems medicine is an emerging methodology and discipline that has already demonstrated its potential in becoming this framework. The aim of this chapter is to present some of the main concepts, requirements, and challenges of multiscale modeling and simulation in systems medicine.

  10. A parameter estimation and identifiability analysis methodology applied to a street canyon air pollution model

    DEFF Research Database (Denmark)

    Ottosen, Thor Bjørn; Ketzel, Matthias; Skov, Henrik


    Mathematical models are increasingly used in environmental science thus increasing the importance of uncertainty and sensitivity analyses. In the present study, an iterative parameter estimation and identifiability analysis methodology is applied to an atmospheric model – the Operational Street...... applied for the uncertainty calculations underestimated the parameter uncertainties. The model parameter uncertainty was qualitatively assessed to be significant, and reduction strategies were identified. © 2016 Elsevier Ltd...

  11. The Analysis of Ship Air Defense: The Simulation Model SEAROADS

    NARCIS (Netherlands)

    Dongen, M.P.F.M. van; Kos, J.


    The Simulation, Evaluation, Analysis, and Research On Air Defense Systems model (SEAROADS) is a computer simulation model for evaluating, analyzing, and studying the performance of air defense systems aboard naval frigates. The SEAROADS model simulates an engagement between a given ship

  12. A rainfall simulation model for agricultural development in Bangladesh

    Directory of Open Access Journals (Sweden)

    M. Sayedur Rahman


    Full Text Available A rainfall simulation model based on a first-order Markov chain has been developed to simulate the annual variation in rainfall amount observed in Bangladesh. The model has been tested in the Barind Tract of Bangladesh. Few significant differences were found between actual and simulated seasonal, annual and average monthly rainfall. The distribution of the number of successes is asymptotically normal. When actual and simulated daily rainfall data were used to drive a crop simulation model, there was no significant difference in rice yield response. The results suggest that the rainfall simulation model performs adequately for many applications.
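    A first-order Markov chain daily rainfall generator of the kind described can be sketched as follows (the transition probabilities and mean wet-day depth are illustrative, not the fitted Barind Tract values):

```python
import random

# Sketch of a first-order Markov chain daily rainfall generator: the chance
# of a wet day depends only on whether the previous day was wet. Transition
# probabilities and wet-day depth below are illustrative assumptions.
P_WET_GIVEN_WET = 0.65   # assumed
P_WET_GIVEN_DRY = 0.25   # assumed

def simulate_year(rng, mean_depth=8.0):
    wet, depths = False, []
    for _ in range(365):
        p = P_WET_GIVEN_WET if wet else P_WET_GIVEN_DRY
        wet = rng.random() < p
        # wet-day amounts drawn from an exponential distribution (assumed)
        depths.append(rng.expovariate(1.0 / mean_depth) if wet else 0.0)
    return depths

rng = random.Random(42)
year = simulate_year(rng)
print(sum(year), sum(d > 0 for d in year))   # annual total [mm], wet days
```

Fitting the two transition probabilities month by month is what lets such a generator reproduce the seasonal pattern the abstract tests against observations.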

  13. VISION: Verifiable Fuel Cycle Simulation Model

    Energy Technology Data Exchange (ETDEWEB)

    Jacob J. Jacobson; Abdellatif M. Yacout; Gretchen E. Matthern; Steven J. Piet; David E. Shropshire


    The nuclear fuel cycle is a very complex system that includes considerable dynamic complexity as well as detail complexity. In the nuclear power realm, there are experts and considerable research and development in nuclear fuel development, separations technology, reactor physics and waste management. What is lacking is an overall understanding of the entire nuclear fuel cycle and how the deployment of new fuel cycle technologies affects the overall performance of the fuel cycle. The Advanced Fuel Cycle Initiative’s systems analysis group is developing a dynamic simulation model, VISION, to capture the relationships, timing and delays in and among the fuel cycle components to help develop an understanding of how the overall fuel cycle works and can transition as technologies are changed. This paper is an overview of the philosophy and development strategy behind VISION. The paper includes some descriptions of the model and some examples of how to use VISION.

  14. Assessment of potential improvements on regional air quality modelling related with implementation of a detailed methodology for traffic emission estimation. (United States)

    Coelho, Margarida C; Fontes, Tânia; Bandeira, Jorge M; Pereira, Sérgio R; Tchepel, Oxana; Dias, Daniela; Sá, Elisa; Amorim, Jorge H; Borrego, Carlos


    The accuracy and precision of air quality models are usually associated with the emission inventories. Thus, in order to assess whether there are any improvements in regional air quality simulations when using a detailed methodology of road traffic emission estimation, a regional air quality modelling system was applied. For this purpose, a combination of top-down and bottom-up approaches was used to build an emission inventory. To estimate the road traffic emissions, the bottom-up approach was applied using an instantaneous emission model (the Vehicle Specific Power, VSP, methodology) and an average emission model (the CORINAIR methodology), while for the remaining activity sectors the top-down approach was used. The Weather Research and Forecasting (WRF) and Comprehensive Air quality (CAMx) models were selected to assess two emission scenarios: (i) scenario 1, which includes the emissions from the top-down approach; and (ii) scenario 2, which includes the emissions resulting from the integration of the top-down and bottom-up approaches. The results show higher emission values for PM10, NOx and HC for scenario 1, and the inverse behaviour for CO. The highest differences between these scenarios were observed for PM10 and HC, about 55% and 75% higher (respectively for each pollutant) than the emissions provided by scenario 2. Scenario 2 gives better results for PM10, CO and O3, while for NO2 concentrations better results were obtained with scenario 1. Thus, the results suggest that combining the top-down and bottom-up approaches to emission estimation can achieve several improvements in the air quality results, mainly for PM10, CO and O3. © 2013 Elsevier B.V. All rights reserved.
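    The instantaneous-emission side of the bottom-up approach rests on Vehicle Specific Power. One commonly cited light-duty form of the VSP equation is sketched below (the coefficients are the widely used generic ones and may differ from those applied in the study):

```python
import math

# Sketch of the instantaneous Vehicle Specific Power (VSP) calculation that
# underlies the bottom-up approach; this is a commonly cited light-duty
# form (coefficients may differ from those used in the study).
def vsp(speed_ms, accel_ms2, grade=0.0):
    """VSP in kW/tonne: kinetic + grade + rolling + aerodynamic terms."""
    return speed_ms * (1.1 * accel_ms2 + 9.81 * math.sin(math.atan(grade)) + 0.132) \
           + 0.000302 * speed_ms ** 3

# Steady cruise at 20 m/s on a flat road vs. moderate acceleration:
print(vsp(20.0, 0.0))   # cruise
print(vsp(20.0, 1.0))   # accelerating: higher VSP, hence a higher emission mode
```

Second-by-second VSP values are binned into emission modes, which is how the instantaneous model captures the acceleration effects that an average-speed model such as CORINAIR smooths out.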

  15. The Development of Marine Accidents Human Reliability Assessment Approach: HEART Methodology and MOP Model

    Directory of Open Access Journals (Sweden)

    Ludfi Pratiwi Bowo


    Full Text Available Humans are one of the important factors in the assessment of accidents, particularly marine accidents; hence, studies are conducted to assess the contribution of human factors to accidents. Two generations of Human Reliability Assessment (HRA) methodologies have been developed, classified by their differing viewpoints on problem-solving. Accident analysis can be performed using three techniques: sequential, epidemiological and systemic, with marine accidents falling under the epidemiological technique. This study compares the Human Error Assessment and Reduction Technique (HEART) methodology and the 4M Overturned Pyramid (MOP) model as applied to marine accidents. The MOP model can effectively describe the relationships among the other factors which affect accidents, whereas the HEART methodology focuses only on human factors.
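    The HEART quantification step can be sketched as follows: a generic task's nominal human error probability is scaled by each applicable error-producing condition (EPC), weighted by the assessor's proportion of affect (APOA). The numeric values below are illustrative, not taken from the study.

```python
# Sketch of the standard HEART quantification formula. The nominal HEP and
# the EPC/APOA values below are illustrative, not taken from the study.
def heart_hep(nominal_hep, epcs):
    """epcs: list of (max_multiplier, assessed_proportion_of_affect)."""
    hep = nominal_hep
    for multiplier, apoa in epcs:
        hep *= (multiplier - 1.0) * apoa + 1.0   # assessed effect of each EPC
    return min(hep, 1.0)                         # a probability cannot exceed 1

# e.g. a nominal HEP of 0.003 with hypothetical EPCs "time shortage"
# (x11 maximum, 40% affect) and "distraction" (x4 maximum, 20% affect):
print(heart_hep(0.003, [(11, 0.4), (4, 0.2)]))   # 0.024
```

Each EPC thus contributes a factor between 1 and its tabulated maximum, depending on how strongly the assessor judges it to apply.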

  16. Modeling and numerical simulations of the influenced Sznajd model (United States)

    Karan, Farshad Salimi Naneh; Srinivasan, Aravinda Ramakrishnan; Chakraborty, Subhadeep


    This paper investigates the effects of independent nonconformists or influencers on the behavioral dynamic of a population of agents interacting with each other based on the Sznajd model. The system is modeled on a complete graph using the master equation. The acquired equation has been numerically solved. Accuracy of the mathematical model and its corresponding assumptions have been validated by numerical simulations. Regions of initial magnetization have been found from where the system converges to one of two unique steady-state PDFs, depending on the distribution of influencers. The scaling property and entropy of the stationary system in presence of varying level of influence have been presented and discussed.
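    Alongside the master-equation treatment, the dynamics can be sketched as an agent-based Monte Carlo on a complete graph (all parameters illustrative): an agreeing pair convinces a randomly chosen third agent, while the fixed influencers never change opinion.

```python
import random

# Agent-based sketch of the Sznajd rule on a complete graph with a
# fraction of fixed "influencers" (sizes and counts are illustrative):
# an agreeing pair convinces a randomly chosen third agent; influencers
# never change their opinion.
def sznajd_step(opinions, fixed, rng):
    i, j, k = rng.sample(range(len(opinions)), 3)
    if opinions[i] == opinions[j] and k not in fixed:
        opinions[k] = opinions[i]

rng = random.Random(3)
n = 200
opinions = [1] * 120 + [-1] * 80     # initial magnetization +0.2
fixed = set(range(10))               # ten permanent +1 influencers
for _ in range(200_000):
    sznajd_step(opinions, fixed, rng)
print(sum(opinions) / n)             # magnetization drifts toward +1 consensus
```

With all influencers holding the same opinion, only one consensus state remains absorbing, which is the qualitative effect of influencers on the steady state discussed above.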

  17. Methodology for the calculation of response factors through experimental tests and validation with simulation

    Energy Technology Data Exchange (ETDEWEB)

    Martin, K.; Flores, I.; Escudero, C.; Apaolaza, A. [Construction Quality Control Laboratory of the Basque Goverment, C/Aguirrelanda no 10, 01013 Vitoria-Gasteiz (Spain); Sala, J.M. [Thermal Engineering Department, Basque Country University (UPV/EHU), Alameda Urquijo s/n, 48013 Bilbao (Spain)


    One of the simplest and most intuitive methods employed to characterise a building solution in the transient regime is based on the use of response factors. Obtaining them by calculation is an appropriate approach when the thermo-physical properties of the materials are known. However, for a great number of building products these data are not available, and thus large, unquantifiable errors may be incurred in the calculation. In this work, a dynamic testing method inside a guarded hot-box unit is presented, in which the response factors of a wall can be obtained without requiring the corresponding material properties. The method has been validated by means of a finite-volume simulation code for a wall whose thermal characteristics are perfectly defined. Although the errors committed when adding the response factors and comparing them with the transmittance values are higher in the experiment than in the numerical analysis, there is good agreement between the heat flows obtained experimentally and by simulation. (author)
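    How a response-factor series is used can be sketched briefly: the surface heat flux is a discrete convolution of the factors with the temperature history, and the factors must sum to the steady-state transmittance U, which is the check mentioned in the abstract. The factor values below are purely illustrative.

```python
# Sketch of how a response-factor series is used: the surface heat flux is
# a discrete convolution of the factors with the temperature history, and
# the factors must sum to the steady-state transmittance U. The factor
# values below are purely illustrative, not measured ones.
X = [4.0, -2.5, -0.8, -0.3, -0.1]   # hypothetical response factors [W/m^2K]

def heat_flux(temps, factors):
    """q(t) = sum_j factors[j] * T(t - j*dt), most recent sample first."""
    return sum(f * temp for f, temp in zip(factors, temps))

print(sum(X))                              # the U-value consistency check
step = [10.0, 10.0, 10.0, 10.0, 10.0]      # a 10 K step held long enough
print(heat_flux(step, X))                  # steady flux = U * deltaT
```

Comparing the sum of measured factors with the independently known transmittance is precisely the error check the experiment and the numerical analysis are compared on.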

  18. A Methodology for Defining Electricity Demand in Energy Simulations Referred to the Italian Context

    Directory of Open Access Journals (Sweden)

    Paola Caputo


    Full Text Available Electricity consumption in Europe is constantly increasing, despite the fact that in recent years huge efforts have been made, in terms of programs and regulations, towards energy demand reduction and energy system improvement. Since the electricity demand affects both the operation of supply and distribution plants and the thermal loads of buildings, the importance of a proper definition of demand profiles is evident. The main aim of the paper is to provide a set of standard electricity profiles that can reasonably be adopted as input in energy simulations related to the built environment, with particular regard to the Italian context. The work presented in this paper originated within a wider, long-lasting research effort aimed at developing a platform for buildings’ energy simulations at the district level, with particular reference to Italian conditions. In this context, it was necessary to define hourly profiles of both occupancy and electricity use for lighting and appliances for different building uses and typologies. For this purpose, the main methods and references for defining electricity loads in buildings were evaluated, and average hourly profiles were accordingly developed for residential and commercial buildings. The related internal gains were then determined and compared to the current Italian standards.

  19. Performance assessment in a flight simulator test—Validation of a space psychology methodology (United States)

    Johannes, B.; Salnitski, Vyacheslav; Soll, Henning; Rauch, Melina; Goeters, Klaus-Martin; Maschke, Peter; Stelling, Dirk; Eißfeldt, Hinnerk


    The objective assessment of operator performance in hand-controlled docking of a spacecraft on a space station has a 30-year tradition and is well established. In recent years, the performance assessment was successfully combined with a psycho-physiological approach for the objective assessment of the levels of physiological arousal and psychological load. These methods are based on statistical reference data. To enhance the statistical power of the evaluation methods, both were implemented in a comparable terrestrial task: the DLR flight simulator test used in the selection procedure for ab initio pilot applicants for civil airlines. In the first evaluation study, 134 male subjects were analysed. Subjects underwent a flight simulator test including three tasks, which were evaluated by instructors applying well-established and standardised rating scales. The principles of the performance algorithms of the docking training were adapted for the automated flight performance assessment; they are presented here. The increased incidence of human error under instrument flight conditions without visual feedback required a manoeuvre-recognition algorithm before the deviation of the flown track from the given task elements could be calculated. Each manoeuvre had to be evaluated independently of former failures. The expert-rated performance showed a highly significant correlation with the automatically calculated performance for each of the three tasks: r=.883, r=.874, and r=.872, respectively. An automated algorithm thus successfully assessed the flight performance. This new method may provide a wide range of future applications in aviation and space psychology.

  20. Simulations of an Offshore Wind Farm Using Large-Eddy Simulation and a Torque-Controlled Actuator Disc Model (United States)

    Creech, Angus; Früh, Wolf-Gerrit; Maguire, A. Eoghan


    We present here a computational fluid dynamics (CFD) simulation of Lillgrund offshore wind farm, which is located in the Øresund Strait between Sweden and Denmark. The simulation combines a dynamic representation of wind turbines embedded within a large-eddy simulation CFD solver and uses hr-adaptive meshing to increase or decrease mesh resolution where required. This allows the resolution of both large-scale flow structures around the wind farm, and the local flow conditions at individual turbines; consequently, the response of each turbine to local conditions can be modelled, as well as the resulting evolution of the turbine wakes. This paper provides a detailed description of the turbine model which simulates the interaction between the wind, the turbine rotors, and the turbine generators by calculating the forces on the rotor, the body forces on the air, and instantaneous power output. This model was used to investigate a selection of key wind speeds and directions, investigating cases where a row of turbines would be fully aligned with the wind or at specific angles to the wind. Results shown here include presentations of the spin-up of turbines, the observation of eddies moving through the turbine array, meandering turbine wakes, and an extensive wind farm wake several kilometres in length. The key measurement available for cross-validation with operational wind farm data is the power output from the individual turbines, where the effect of unsteady turbine wakes on the performance of downstream turbines was a main point of interest. The results from the simulations were compared to the performance measurements from the real wind farm to provide a firm quantitative validation of this methodology. Having achieved good agreement between the model results and actual wind farm measurements, the potential of the methodology to provide a tool for further investigations of engineering and atmospheric science problems is outlined.

  1. From LCAs to simplified models: a generic methodology applied to wind power electricity. (United States)

    Padey, Pierryves; Girard, Robin; le Boulch, Denis; Blanc, Isabelle


    This study presents a generic methodology to produce simplified models able to provide a comprehensive life cycle impact assessment of energy pathways. The methodology relies on the application of global sensitivity analysis to identify the key parameters explaining the impact variability of systems over their life cycle. Simplified models are built upon the identification of such key parameters. The methodology is applied to one energy pathway: onshore wind turbines of medium size, considering a large sample of possible configurations representative of European conditions. Among several technological, geographical, and methodological parameters, we identified the turbine load factor and the wind turbine lifetime as the most influential parameters. Greenhouse gas (GHG) performance has been plotted as a function of these identified key parameters. Using these curves, the GHG performance of a specific wind turbine can be estimated, thus avoiding the undertaking of an extensive Life Cycle Assessment (LCA). This methodology should be useful for decision makers, providing them with a robust but simple support tool for assessing the environmental performance of energy systems.

  2. Large clean mesocosms and simulated dust deposition: a new methodology to investigate responses of marine oligotrophic ecosystems to atmospheric inputs

    Directory of Open Access Journals (Sweden)

    C. Guieu


    mesocosms serving as controls (CONTROLS-Meso, no addition) and three mesocosms seeded with the same amount of Saharan dust (DUST-Meso, 10 g m−2 of sprayed dust). A large panel of biogeochemical parameters was measured at 0.1 m, at 5 m and at 10 m in all of the mesocosms, and at a selected site outside the mesocosms, before seeding and at regular intervals afterward. Statistical analyses of the results show that data from the three mesocosms that received the same treatment are highly reproducible (variability < 30%) and that there is no significant difference between data obtained from CONTROLS-Meso and data obtained outside the mesocosms.

    This paper demonstrates that the methodology developed in the DUNE project is suitable for quantifying and parameterizing the impact of atmospheric chemical forcing in a low-nutrient, low-chlorophyll (LNLC) ecosystem. Such large mesocosms can be considered as 1-D ecosystems, so that the parameterizations obtained from these experiments can be integrated into ecosystem models.

  3. Dispersion modeling by kinematic simulation: Cloud dispersion model

    Energy Technology Data Exchange (ETDEWEB)

    Fung, J C H [Department of Mathematics, Hong Kong University of Science and Technology, Clear Water Bay (Hong Kong); Perkins, R J [Laboratoire de Mecanique des Fluides et d' Acoustique, Ecole Centrale de Lyon (France)], E-mail:


    A new technique has been developed to compute mean and fluctuating concentrations in complex turbulent flows (tidal currents near a coast and in the deep ocean). An initial distribution of material is discretized into many small clouds, which are advected by a combination of the mean flow and large-scale turbulence. The turbulence can be simulated either by kinematic simulation (KS) or by direct numerical simulation. The clouds also diffuse relative to their centroids; the statistics for this are obtained from a separate calculation of the growth of individual clouds in small-scale turbulence, generated by KS. The ensemble of discrete clouds is periodically re-discretized, to limit the size of the small clouds and prevent overlapping. The model is illustrated with simulations of dispersion in uniform flow, and the results are compared with analytic, steady-state solutions. The aim of this study is to understand how pollutants disperse in a turbulent flow through a numerical simulation of fluid particle motion in a random flow field generated by Fourier modes. Although this homogeneous turbulence is a rather 'simple' flow, it represents a building block toward understanding pollutant dispersion in more complex flows. The results presented here are preliminary in nature, but we expect that similar qualitative results should be observed in a genuine turbulent flow.
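    A minimal 2-D kinematic simulation field of the kind described can be sketched as a sum of random Fourier modes, each made divergence-free by choosing its amplitude perpendicular to its wavevector (the mode count, spectrum and frozen-in-time simplification below are assumptions for illustration):

```python
import numpy as np

# Sketch of a 2-D kinematic simulation (KS) velocity field: a sum of random
# Fourier modes, each made divergence-free by choosing its amplitude vector
# perpendicular to its wavevector. Mode count, spectrum, and the frozen
# (time-independent) field are simplifying assumptions.
rng = np.random.default_rng(7)
n_modes = 32
k_mag = 2.0 ** np.linspace(0, 4, n_modes)         # wavenumber shells
theta = rng.uniform(0, 2 * np.pi, n_modes)        # random mode directions
k = np.stack([k_mag * np.cos(theta), k_mag * np.sin(theta)], axis=1)
amp = k_mag ** (-5.0 / 6.0)                       # Kolmogorov-like amplitudes
phase = rng.uniform(0, 2 * np.pi, n_modes)
# unit vectors perpendicular to each wavevector -> incompressible field
perp = np.stack([-np.sin(theta), np.cos(theta)], axis=1)

def velocity(x):
    waves = np.cos(k @ x + phase)                 # one cosine per mode
    return (amp[:, None] * perp * waves[:, None]).sum(axis=0)

# Advect one fluid particle with forward Euler (small step, for illustration)
x, dt = np.array([0.0, 0.0]), 1e-3
for _ in range(1000):
    x = x + dt * velocity(x)
print(x)   # particle displaced by the random flow
```

Tracking the centroids of many such particles (or clouds) under this field is the KS ingredient of the cloud dispersion model described above.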

  4. Microfluidic very large scale integration (VLSI) modeling, simulation, testing, compilation and physical synthesis

    CERN Document Server

    Pop, Paul; Madsen, Jan


    This book presents state-of-the-art techniques for the modeling, simulation, testing, compilation and physical synthesis of mVLSI biochips. The authors describe a top-down modeling and synthesis methodology for mVLSI biochips, inspired by microelectronics VLSI methodologies. They introduce a modeling framework for the components and the biochip architecture, and a high-level microfluidic protocol language. Coverage includes a topology graph-based model for the biochip architecture, and a sequencing graph to model the biochemical application, showing how the application model can be obtained from the protocol language. The techniques described facilitate programmability and automation, enabling developers in the emerging, large biochip market. · Presents the current models used for the research on compilation and synthesis techniques of mVLSI biochips in a tutorial fashion; · Includes a set of "benchmarks" that are presented in great detail and includes the source code of several of the techniques p...

  5. Modeling and simulation of cascading contingencies (United States)

    Zhang, Jianfeng

    This dissertation proposes a new approach to model and study cascading contingencies in large power systems. The most important contribution of the work involves the development and validation of a heuristic analytic model to assess the likelihood of cascading contingencies, and the development and validation of a uniform search strategy. We model the probability of cascading contingencies as a function of power flow and power flow changes. Utilizing logistic regression, the proposed model is calibrated using real industry data. This dissertation analyzes random search strategies for Monte Carlo simulations and proposes a new uniform search strategy based on the Metropolis-Hastings Algorithm. The proposed search strategy is capable of selecting the most significant cascading contingencies, and it is capable of constructing an unbiased estimator to provide a measure of system security. This dissertation makes it possible to reasonably quantify system security and justify security operations when economic concerns conflict with reliability concerns in the new competitive power market environment. It can also provide guidance to system operators about actions that may be taken to reduce the risk of major system blackouts. Various applications can be developed to take advantage of the quantitative security measures provided in this dissertation.
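    The two ingredients can be sketched together in a toy setting (all numbers hypothetical): a logistic model mapping line loading to outage probability, and a Metropolis-Hastings walk over contingencies that visits severe ones in proportion to an importance weight.

```python
import math
import random

# Toy sketch (hypothetical numbers, not the dissertation's power-system
# model): a logistic regression maps per-unit line loading to a cascading
# outage probability, and a Metropolis-Hastings walk over candidate
# contingencies visits them in proportion to that importance weight.
def outage_prob(loading, b0=-6.0, b1=5.0):
    """Logistic regression: probability of cascading given per-unit loading."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * loading)))

loadings = [0.4, 0.7, 0.9, 1.1, 1.3]        # per-unit flows on candidate lines
weights = [outage_prob(ld) for ld in loadings]

rng = random.Random(0)
state, visits = 0, [0] * len(loadings)
for _ in range(50_000):
    proposal = rng.randrange(len(loadings))           # uniform symmetric proposal
    if rng.random() < min(1.0, weights[proposal] / weights[state]):
        state = proposal                              # accept the move
    visits[state] += 1
print([v / 50_000 for v in visits])   # concentrates on heavily loaded lines
```

Because the proposal is symmetric, the chain's stationary distribution is proportional to the weights, so the most significant contingencies are sampled most often while the estimator remains unbiased after reweighting.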

  6. Incorporating ecological data and associated uncertainty in bioaccumulation modelling: methodology development and case study

    NARCIS (Netherlands)

    De Laender, F.; Van Oevelen, D.J.; Middelburg, J.J.; Soetaert, K.E.R.


    Bioaccumulation models predict internal concentrations of hydrophobic chemicals by incorporating key gain/loss processes reflecting the ecology of the exposed species and the characteristics of the chemical. Here, we propose a new methodology that uses ecological data and the principle of mass

  7. A Methodological Review of Structural Equation Modelling in Higher Education Research (United States)

    Green, Teegan


    Despite increases in the number of articles published in higher education journals using structural equation modelling (SEM), research addressing their statistical sufficiency, methodological appropriateness and quantitative rigour is sparse. In response, this article provides a census of all covariance-based SEM articles published up until 2013…

  8. Tecnomatix Plant Simulation modeling and programming by means of examples

    CERN Document Server

    Bangsow, Steffen


    This book systematically introduces the development of simulation models as well as the implementation and evaluation of simulation experiments with Tecnomatix Plant Simulation. It is aimed at all users of Plant Simulation who have more complex tasks to handle, while also offering an easy entry into the program. Particular attention is paid to introducing the simulation flow language SimTalk and its use in various areas of simulation. With over 200 examples, the author demonstrates how to combine blocks into simulation models and how to use SimTalk for complex control and analys…

  9. Nonlinear distortion in wireless systems modeling and simulation with Matlab

    CERN Document Server

    Gharaibeh, Khaled M


    This book covers the principles of modeling and simulation of nonlinear distortion in wireless communication systems, with MATLAB simulations and techniques. The author describes the principles of modeling and simulating nonlinear distortion in single- and multichannel wireless communication systems using both deterministic and stochastic signals. Models and simulation methods for nonlinear amplifiers explain in detail how to analyze and evaluate the performance of data communication links under nonlinear amplification. The book also addresses the analysis of nonlinear systems…

  10. Investigation of Radiation Protection Methodologies for Radiation Therapy Shielding Using Monte Carlo Simulation and Measurement (United States)

    Tanny, Sean

    The advent of high-energy linear accelerators for dedicated medical use in the 1950s, by Henry Kaplan and the Stanford University physics department, began a revolution in radiation oncology. Today, linear accelerators are the standard of care for modern radiation therapy and can generate high-energy beams that produce tens of Gy per minute at isocenter. This creates a need for a large amount of shielding material to properly protect members of the public and hospital staff. Standardized vault designs and guidance on the shielding properties of various materials are provided by the National Council on Radiation Protection and Measurements (NCRP) Report 151. However, physicists are seeking ways to minimize the footprint and volume of shielding material, which leads to the use of non-standard vault configurations and less-studied materials, such as high-density concrete. The University of Toledo Dana Cancer Center has used both of these methods to minimize the cost and spatial footprint of the requisite radiation shielding. To ensure a safe work environment, computer simulations were performed to verify the attenuation properties and shielding workloads produced in a variety of situations where standard recommendations and guidance documents were insufficient. This project studies two areas of concern that are not addressed by NCRP 151: the radiation shielding workload for a vault door with a non-standard design, and the attenuation properties of high-density concrete for both photon and neutron radiation. Simulations were performed using the Monte Carlo N-Particle 5 (MCNP5) code produced by Los Alamos National Laboratory (LANL). Measurements were performed using a shielding test port designed into the maze of the Varian Edge treatment vault.
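    As background on the report cited in this record, NCRP 151 sizes barriers from tenth-value layers (TVLs). A minimal sketch of that calculation follows; the workload, distance, and TVL values are illustrative textbook-style numbers, not the vault parameters from this study:

```python
import math

def barrier_thickness_cm(workload_gy_wk, use, occupancy, dist_m,
                         design_limit_sv_wk, tvl1_cm, tvle_cm):
    """Required barrier thickness per the NCRP 151 TVL method."""
    # Required attenuation factor B = P * d^2 / (W * U * T).
    b = design_limit_sv_wk * dist_m ** 2 / (workload_gy_wk * use * occupancy)
    n_tvls = -math.log10(b)  # number of tenth-value layers needed
    # The first TVL differs from the equilibrium TVLs due to spectral hardening.
    return tvl1_cm + (n_tvls - 1) * tvle_cm

# Illustrative numbers: 450 Gy/wk workload, full use and occupancy factors,
# 6 m to the occupied point, 0.1 mSv/wk design limit, concrete TVLs for 6 MV.
t = barrier_thickness_cm(450.0, 1.0, 1.0, 6.0, 1e-4, 37.0, 33.0)
print(round(t, 1))
```

The same routine applies to secondary barriers by substituting the appropriate scattered or leakage workload and TVL values.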

  11. Methodology and Applications in Non-linear Model-based Geostatistics

    DEFF Research Database (Denmark)

    Christensen, Ole Fredslund

    …that are approximately Gaussian. Parameter estimation and prediction for the transformed Gaussian model is studied. In some cases a transformation cannot possibly render the data Gaussian. A methodology for analysing such data was introduced by Diggle, Tawn and Moyeed (1998): the generalised linear spatial model. Conditioned on an underlying and unobserved Gaussian process, the observations at the measured locations follow a generalised linear model. Markov chain Monte Carlo methods are used for inference. The study of these models is the main topic of the thesis. Construction of priors, and the use of flat … contains functions for inference in generalised linear spatial models.

  12. Methodology and applications in non-linear model-based geostatistics

    DEFF Research Database (Denmark)

    Christensen, Ole Fredslund

    …that are approximately Gaussian. Parameter estimation and prediction for the transformed Gaussian model is studied. In some cases a transformation cannot possibly render the data Gaussian. A methodology for analysing such data was introduced by Diggle, Tawn and Moyeed (1998): the generalised linear spatial model. Conditioned on an underlying and unobserved Gaussian process, the observations at the measured locations follow a generalised linear model. Markov chain Monte Carlo methods are used for inference. The study of these models is the main topic of the thesis. Construction of priors, and the use of flat … contains functions for inference in generalised linear spatial models.

  13. Detached eddy simulation and large eddy simulation models for the simulation of gas entrainment

    Energy Technology Data Exchange (ETDEWEB)

    Merzari, E.; Ninokata, H. [Research Laboratory for Nuclear Reactors, Tokyo Institute of Technology, Tokyo (Japan); Baglietto, E. [CD-adapco, New York, NY (United States)


    The entrainment of gas bubbles into the reactor core of a Liquid Metal Fast Breeder Reactor (LMFBR) may cause an effective increase in reactivity, since current LMFBR designs usually have a positive void coefficient. Because this may adversely affect safety and operation, the possibility of gas entrainment needs to be evaluated in the thermal-hydraulic design. Several studies of gas entrainment in LMFBR systems have been conducted over the years, and the most common situations that may lead to gas entrainment have been classified as vortex dimple, concave free surface, and breaking wave. Among these, vortex-induced gas entrainment is considered in the present work because it is more likely to be present under operating or accident conditions. The focus is on turbulence modeling for the simulation of gas-entraining vortices, in particular for the benchmark case of Moriya. We propose two different approaches: a large eddy simulation and a detached eddy simulation. Results are in excellent agreement with the experiment for the radial velocity, even though no surface model was employed. (authors)
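    For context on the LES approach named in this record, subgrid stresses are commonly closed with an eddy-viscosity model. A minimal sketch of the classical Smagorinsky closure (standard textbook form, not the authors' solver or constants) is:

```python
import math

def smagorinsky_nu_t(strain_rate_tensor, delta, cs=0.17):
    """Subgrid eddy viscosity nu_t = (Cs * delta)^2 * |S|,
    with |S| = sqrt(2 * S_ij * S_ij) for the resolved strain-rate tensor S_ij."""
    s_mag = math.sqrt(2.0 * sum(sij * sij
                                for row in strain_rate_tensor
                                for sij in row))
    return (cs * delta) ** 2 * s_mag

# Pure shear example: S_12 = S_21 = 0.5 * du/dy with du/dy = 10 1/s,
# and filter width delta = 0.01 m.
S = [[0.0, 5.0, 0.0],
     [5.0, 0.0, 0.0],
     [0.0, 0.0, 0.0]]
nu_t = smagorinsky_nu_t(S, 0.01)
print(nu_t)
```

Detached eddy simulation instead blends a RANS closure near walls with an LES-like closure away from them by switching on a grid-dependent length scale.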

  14. Cognitive Modeling for Agent-Based Simulation of Child Maltreatment (United States)

    Hu, Xiaolin; Puddy, Richard

    This paper extends previous work on cognitive modeling for agent-based simulation of child maltreatment (CM). The developed model is inspired by parental efficacy, parenting stress, and the theory of planned behavior. It provides an explanatory, process-oriented model of CM that incorporates causal relationships and feedback loops among factors in the social ecology in order to simulate the dynamics of CM. We describe the model and present simulation results that demonstrate its features.
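    The feedback-loop idea described in this record can be illustrated with a toy agent update (all dynamics and parameter values below are hypothetical illustrations, not the authors' calibrated model):

```python
import math

def step(stress, efficacy, support=0.5, k_stress=0.3, k_recover=0.2):
    """One tick of a toy parent-agent model (illustrative dynamics only):
    stress and low efficacy raise incident probability, and incidents
    feed back into higher stress and eroded efficacy."""
    # Logistic link from net strain to incident probability.
    strain = stress - efficacy - support
    p_incident = 1.0 / (1.0 + math.exp(-4.0 * strain))
    # Feedback loop: incidents raise stress; social support aids recovery.
    stress = min(1.0, stress + k_stress * p_incident
                 - k_recover * support * stress)
    efficacy = max(0.0, efficacy - 0.1 * p_incident)
    return stress, efficacy, p_incident

stress, efficacy = 0.6, 0.4
for _ in range(10):
    stress, efficacy, p = step(stress, efficacy)
print(round(p, 3))
```

Varying the `support` parameter across a population of such agents is one simple way to explore how ecological factors shift aggregate incident rates.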

  15. RT 24 - Architecture, Modeling & Simulation, and Software Design (United States)


    …focus on tool extensions (UPDM, SysML, SoaML, BPMN); leverage "best of breed" architecture methodologies; provide tooling to support the methodology (DoDAF capability example: BPMN; DoDAF 2.0 MetaModel / BPMN MetaModel; mapping SysML to DoDAF 2.0; DoDAF V2.0 models such as OV-2 as SysML requirement diagrams)…

  16. Improved Conceptual Models Methodology (ICoMM) for Validation of Non-Observable Systems (United States)


    …multi-objective decision analysis (MODA) techniques. SMEs are still heavily involved in a MODA and have a method of tracing their values to the model…

  17. A robust methodology for kinetic model parameter estimation for biocatalytic reactions

    DEFF Research Database (Denmark)

    Al-Haque, Naweed; Andrade Santacoloma, Paloma de Gracia; Lima Afonso Neto, Watson


    Effective estimation of the parameters in biocatalytic reaction kinetic expressions is very important when building process models to enable evaluation of process technology options and alternative biocatalysts. The kinetic models used to describe enzyme-catalyzed reactions generally include several … lead to globally optimized parameter values. In this article, a robust methodology to estimate parameters for biocatalytic reaction kinetic expressions is proposed. The methodology determines the parameters in a systematic manner by exploiting the best features of several current approaches. The parameter estimation problem is decomposed into five hierarchical steps, where the solution of each step becomes the input to the subsequent step, yielding the final model with the corresponding regressed parameters. The model is further used for validating its performance and determining …
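    The hierarchical decomposition idea, solving an easier subproblem first and feeding its result into the next step, can be illustrated on a single Michaelis-Menten rate law (a generic two-step sketch; the article's five-step procedure and kinetic expressions are more elaborate):

```python
# Step 1: linearize (Lineweaver-Burk) to obtain starting values;
# Step 2: refine by a local least-squares search seeded with step 1.
# Synthetic, noise-free data generated from Vmax = 2.0, Km = 0.5.
s_data = [0.1, 0.25, 0.5, 1.0, 2.0, 5.0]
v_data = [2.0 * s / (0.5 + s) for s in s_data]

def mm(s, vmax, km):
    return vmax * s / (km + s)

# --- Step 1: ordinary least squares on 1/v = (Km/Vmax)(1/s) + 1/Vmax ---
xs = [1.0 / s for s in s_data]
ys = [1.0 / v for v in v_data]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
intercept = my - slope * mx
vmax0, km0 = 1.0 / intercept, slope / intercept

# --- Step 2: coarse grid refinement around the step-1 estimate ---
def sse(vmax, km):
    return sum((v - mm(s, vmax, km)) ** 2 for s, v in zip(s_data, v_data))

best = min(((vmax0 * a, km0 * b) for a in [0.9, 1.0, 1.1]
            for b in [0.9, 1.0, 1.1]), key=lambda p: sse(*p))
print(round(best[0], 3), round(best[1], 3))
```

With noise-free data the linearization is exact, so step 1 already recovers the true parameters; with noisy data it merely supplies a good initial guess for the nonlinear refinement.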

  18. An Approach to a Methodology for the Business Simulator in the Enterprise Management Technologist Teaching-Learning Process

    Directory of Open Access Journals (Sweden)

    Lic. Ángel Gilberto Orellana Carrasco


    Full Text Available The use of simulation in professional training for the administrative and accounting fields is an effective means of teaching and learning, helping students develop the competencies that characterize professional performance. Simulations offer students the opportunity to carry out practice similar to what they will face when interacting with reality in their professional working context. Their use should nonetheless follow a logical sequence within the course calendar, consistent with the needs and requirements of the academic curriculum; however, no proposal for their didactic use was found in the literature consulted. The article offers an approach to a methodology for using the business simulator EMPREWARE in the teaching-learning process for training Business Administration technologists.

  19. Collaborative design for embedded systems co-modelling and co-simulation

    CERN Document Server

    Fitzgerald, John; Verhoef, Marcel


    One of the most significant challenges in the development of embedded and cyber-physical systems is the gap between the disciplines of software and control engineering. In a marketplace where rapid innovation is essential, engineers from both disciplines need to be able to explore system designs collaboratively, allocating responsibilities to software and physical elements and analyzing trade-offs between them. To this end, this book presents a framework that allows very different kinds of design models - discrete-event (DE) models of software and continuous-time (CT) models of the physical environment - to be analyzed and simulated jointly, based on common scenarios. The individual chapters introduce both sides of this co-simulation technology and give a step-by-step guide to the methodology for designing and analyzing co-models. They are grouped into three parts: Part I introduces the technical basis for collaborative modeling and simulation with the Crescendo technology. Part II contin...

  20. Discontinuous Galerkin methodology for Large-Eddy Simulations of wind turbine airfoils

    DEFF Research Database (Denmark)

    Frére, A.; Sørensen, Niels N.; Hillewaert, K.


    …at low and high Reynolds numbers and compares the results to state-of-the-art models used in industry, namely the panel method (XFOIL with boundary-layer modeling) and Reynolds-Averaged Navier-Stokes (RANS). At low Reynolds number (Re = 6 × 10⁴), involving laminar boundary-layer separation and transition in the detached shear layer, the Eppler 387 airfoil is studied at two angles of attack. The LES results agree slightly better with the experimental chordwise pressure distribution than both the XFOIL and RANS results. At high Reynolds number (Re = 1.64 × 10⁶), the NACA 4412 airfoil is studied close to stall conditions…