WorldWideScience

Sample records for modeling simulations based

  1. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    Computer Based Modelling and Simulation – Modelling Deterministic Systems. N K Srinivasan. General Article, Resonance – Journal of Science Education, Volume 6, Issue 3, March 2001, pp. 46-54.

  2. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    Computer Based ... universities, and later did system analysis, ... personal computers (PC) and low-cost software packages and tools. They can serve as useful learning experience through student projects. Models are .... Let us consider a numerical example: to calculate the velocity of a trainer aircraft ...

  3. Agent Based Modelling for Social Simulation

    NARCIS (Netherlands)

    Smit, S.K.; Ubink, E.M.; Vecht, B. van der; Langley, D.J.

    2013-01-01

    This document is the result of an exploratory project looking into the status of, and opportunities for Agent Based Modelling (ABM) at TNO. The project focussed on ABM applications containing social interactions and human factors, which we termed ABM for social simulation (ABM4SS). During the course

  4. Agent Based Modelling for Social Simulation

    OpenAIRE

    Smit, S.K.; Ubink, E.M.; Vecht, B. van der; Langley, D.J.

    2013-01-01

    This document is the result of an exploratory project looking into the status of, and opportunities for Agent Based Modelling (ABM) at TNO. The project focussed on ABM applications containing social interactions and human factors, which we termed ABM for social simulation (ABM4SS). During the course of this project two workshops were organized. At these workshops, a wide range of experts, both ABM experts and domain experts, worked on several potential applications of ABM. The results and ins...

  5. Modeling and simulation of complex systems a framework for efficient agent-based modeling and simulation

    CERN Document Server

    Siegfried, Robert

    2014-01-01

    Robert Siegfried presents a framework for efficient agent-based modeling and simulation of complex systems. He compares different approaches for describing structure and dynamics of agent-based models in detail. Based on this evaluation the author introduces the "General Reference Model for Agent-based Modeling and Simulation" (GRAMS). Furthermore he presents parallel and distributed simulation approaches for execution of agent-based models - from small scale to very large scale. The author shows how agent-based models may be executed by different simulation engines that utilize underlying hardware ...

  6. An Agent-Based Monetary Production Simulation Model

    DEFF Research Database (Denmark)

    Bruun, Charlotte

    2006-01-01

    An Agent-Based Simulation Model Programmed in Objective Borland Pascal. Program and source code are downloadable.

  7. Cognitive Modeling for Agent-Based Simulation of Child Maltreatment

    Science.gov (United States)

    Hu, Xiaolin; Puddy, Richard

    This paper extends previous work to develop cognitive modeling for agent-based simulation of child maltreatment (CM). The developed model is inspired by parental efficacy, parenting stress, and the theory of planned behavior. It provides an explanatory, process-oriented model of CM and incorporates causal relationships and feedback loops among different factors in the social ecology in order to simulate the dynamics of CM. We describe the model and present simulation results to demonstrate the features of this model.

  8. Traffic simulation based ship collision probability modeling

    Energy Technology Data Exchange (ETDEWEB)

    Goerlandt, Floris, E-mail: floris.goerlandt@tkk.f [Aalto University, School of Science and Technology, Department of Applied Mechanics, Marine Technology, P.O. Box 15300, FI-00076 AALTO, Espoo (Finland); Kujala, Pentti [Aalto University, School of Science and Technology, Department of Applied Mechanics, Marine Technology, P.O. Box 15300, FI-00076 AALTO, Espoo (Finland)

    2011-01-15

    Maritime traffic poses various risks in terms of human, environmental and economic loss. In a risk analysis of ship collisions, it is important to get a reasonable estimate for the probability of such accidents and the consequences they lead to. In this paper, a method is proposed to assess the probability of vessels colliding with each other. The method is capable of determining the expected number of accidents, the locations where and the times when they are most likely to occur, while providing input for models concerned with the expected consequences. At the basis of the collision detection algorithm lies an extensive time-domain micro-simulation of vessel traffic in the given area. The Monte Carlo simulation technique is applied to obtain a meaningful prediction of the relevant factors of the collision events. Data obtained through the Automatic Identification System are analyzed in detail to obtain realistic input data for the traffic simulation: traffic routes, the number of vessels on each route, the ship departure times, main dimensions and sailing speed. The results obtained by the proposed method for the studied case of the Gulf of Finland are presented, showing reasonable agreement with registered accident and near-miss data.
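
    To make the approach above concrete, the following is a minimal Monte Carlo sketch of the collision-candidate counting idea: vessels on two crossing routes receive sampled departure times and speeds, positions are stepped through time, and close encounters are counted. The route geometry, traffic volumes and encounter threshold are illustrative assumptions, not values from the study.

```python
# Minimal Monte Carlo sketch of collision-candidate counting on two crossing
# routes; all numbers are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)

def simulate_day(n_a=20, n_b=15, threshold_nm=0.5, dt_h=1/60):
    """Simulate one day of traffic on two crossing 40 nm routes and count
    vessel-pair minutes spent within the encounter threshold (a crude proxy
    for collision candidates)."""
    dep_a = rng.uniform(0, 24, n_a)          # departure times (h)
    dep_b = rng.uniform(0, 24, n_b)
    spd_a = rng.normal(14, 2, n_a)           # sailing speeds (kn)
    spd_b = rng.normal(12, 2, n_b)
    encounter_minutes = 0
    for t in np.arange(0, 28, dt_h):
        # Distance travelled along each route; vessels not yet departed or
        # already arrived are placed outside the area (nan).
        xa = np.where((t > dep_a) & ((t - dep_a) * spd_a < 40),
                      (t - dep_a) * spd_a, np.nan)
        yb = np.where((t > dep_b) & ((t - dep_b) * spd_b < 40),
                      (t - dep_b) * spd_b, np.nan)
        # Route A runs west-east at y = 20 nm, route B south-north at x = 20 nm.
        pa = np.column_stack([xa, np.full(n_a, 20.0)])
        pb = np.column_stack([np.full(n_b, 20.0), yb])
        dist = np.linalg.norm(pa[:, None, :] - pb[None, :, :], axis=-1)
        encounter_minutes += np.count_nonzero(dist < threshold_nm)
    return encounter_minutes

counts = [simulate_day() for _ in range(200)]
print("mean encounter-minutes per day:", np.mean(counts))
```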

  9. Modeling ground-based timber harvesting systems using computer simulation

    Science.gov (United States)

    Jingxin Wang; Chris B. LeDoux

    2001-01-01

    Modeling ground-based timber harvesting systems with an object-oriented methodology was investigated. Object-oriented modeling and design promote a better understanding of requirements, cleaner designs, and better maintainability of the harvesting simulation system. The model developed simulates chainsaw felling, drive-to-tree feller-buncher, swing-to-tree single-grip...

  10. Simulating individual-based models of epidemics in hierarchical networks

    NARCIS (Netherlands)

    Quax, R.; Bader, D.A.; Sloot, P.M.A.

    2009-01-01

    Current mathematical modeling methods for the spreading of infectious diseases are too simplified and do not scale well. We present the Simulator of Epidemic Evolution in Complex Networks (SEECN), an efficient simulator of detailed individual-based models by parameterizing separate dynamics

  11. Simulation-based modeling of building complexes construction management

    Science.gov (United States)

    Shepelev, Aleksandr; Severova, Galina; Potashova, Irina

    2018-03-01

    The study reported here examines the experience in the development and implementation of business simulation games based on network planning and management of high-rise construction. Appropriate network models of different types and levels of detail have been developed; a simulation model including 51 blocks (11 stages combined in 4 units) is proposed.

  12. Simulation-Based Internal Models for Safer Robots

    Directory of Open Access Journals (Sweden)

    Christian Blum

    2018-01-01

    Full Text Available In this paper, we explore the potential of mobile robots with simulation-based internal models for safety in highly dynamic environments. We propose a robot with a simulation of itself, other dynamic actors and its environment, inside itself. Operating in real time, this simulation-based internal model is able to look ahead and predict the consequences of both the robot’s own actions and those of the other dynamic actors in its vicinity. Hence, the robot continuously modifies its own actions in order to actively maintain its own safety while also achieving its goal. Inspired by the problem of how mobile robots could move quickly and safely through crowds of moving humans, we present experimental results which compare the performance of our internal simulation-based controller with a purely reactive approach as a proof-of-concept study for the practical use of simulation-based internal models.

  13. Modeling and simulation for micro DC motor based on simulink

    Science.gov (United States)

    Shen, Hanxin; Lei, Qiao; Chen, Wenxiang

    2017-09-01

    The micro DC motor has a large market demand, but theoretical research on it is lacking. Through detailed analysis of the commutation process of the micro DC motor commutator, and based on the micro DC motor electromagnetic torque equation and mechanical torque equation, a triangle-connection micro DC motor simulation model is established with the help of the Simulink toolkit. Using the model, a sample micro DC motor is simulated, and experimental measurements have been carried out on the same motor. The simulation results are found to be consistent with the theoretical analysis and experimental results.
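
    For readers who want the underlying equations, below is a minimal sketch of the standard brushed DC motor model that such a Simulink model is typically built around, here integrated with SciPy; all parameter values are illustrative assumptions rather than data from the paper.

```python
# Standard brushed DC motor equations (electrical + mechanical), integrated
# with SciPy; parameter values are illustrative assumptions only.
import numpy as np
from scipy.integrate import solve_ivp

R, L = 2.0, 1e-3        # armature resistance (ohm) and inductance (H)
Ke, Kt = 0.01, 0.01     # back-EMF and torque constants
J, B = 1e-6, 1e-7       # rotor inertia (kg*m^2) and viscous friction
V, T_load = 3.0, 0.0    # supply voltage (V) and load torque (N*m)

def dc_motor(t, x):
    i, omega = x
    di = (V - R * i - Ke * omega) / L           # electrical equation
    domega = (Kt * i - B * omega - T_load) / J  # mechanical equation
    return [di, domega]

sol = solve_ivp(dc_motor, (0, 0.5), [0.0, 0.0], max_step=1e-4)
print(f"steady-state speed ~ {sol.y[1, -1]:.0f} rad/s")
```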

  14. Event-based Simulation Model for Quantum Optics Experiments

    NARCIS (Netherlands)

    De Raedt, H.; Michielsen, K.; Jaeger, G; Khrennikov, A; Schlosshauer, M; Weihs, G

    2011-01-01

    We present a corpuscular simulation model of optical phenomena that does not require the knowledge of the solution of a wave equation of the whole system and reproduces the results of Maxwell's theory by generating detection events one-by-one. The event-based corpuscular model gives a unified

  15. Simulating an elastic bipedal robot based on musculoskeletal modeling

    NARCIS (Netherlands)

    Bortoletto, Roberto; Sartori, Massimo; He, Fuben; Pagello, Enrico

    2012-01-01

    Many of the processes involved in the synthesis of human motion have much in common with problems found in robotics research. This paper describes the modeling and the simulation of a novel bipedal robot based on series elastic actuators [1]. The robot model takes inspiration from the human

  16. Modelling and simulation-based acquisition decision support: present & future

    CSIR Research Space (South Africa)

    Naidoo, S

    2009-10-01

    Full Text Available The Ground Based Air Defence System (GBADS) Programme of the South African Army has been applying modelling and simulation (M&S) to provide acquisition decision and doctrine...

  17. An Individual-based Probabilistic Model for Fish Stock Simulation

    Directory of Open Access Journals (Sweden)

    Federico Buti

    2010-08-01

    Full Text Available We define an individual-based probabilistic model of sole (Solea solea) behaviour. The individual model is given in terms of an Extended Probabilistic Discrete Timed Automaton (EPDTA), a new formalism that is introduced in the paper and that is shown to be interpretable as a Markov decision process. A given EPDTA model can be probabilistically model-checked by giving a suitable translation into syntax accepted by existing model-checkers. In order to simulate the dynamics of a given population of soles in different environmental scenarios, an agent-based simulation environment is defined in which each agent implements the behaviour of the given EPDTA model. By varying the probabilities and the characteristic functions embedded in the EPDTA model it is possible to represent different scenarios and to tune the model itself by comparing the results of the simulations with real data about the sole stock in the North Adriatic Sea, available from the recent project SoleMon. The simulator is presented and made available for its adaptation to other species.

  18. An electromechanical based deformable model for soft tissue simulation.

    Science.gov (United States)

    Zhong, Yongmin; Shirinzadeh, Bijan; Smith, Julian; Gu, Chengfan

    2009-11-01

    Soft tissue deformation is of great importance to surgery simulation. Although a significant amount of research effort has been dedicated to simulating the behaviours of soft tissues, modelling of soft tissue deformation is still a challenging problem. This paper presents a new deformable model for simulation of soft tissue deformation from the electromechanical viewpoint of soft tissues. Soft tissue deformation is formulated as a reaction-diffusion process coupled with a mechanical load. The mechanical load applied to a soft tissue to cause a deformation is incorporated into the reaction-diffusion system, and consequently distributed among mass points of the soft tissue. Reaction-diffusion of mechanical load and non-rigid mechanics of motion are combined to govern the simulation dynamics of soft tissue deformation. An improved reaction-diffusion model is developed to describe the distribution of the mechanical load in soft tissues. A three-layer artificial cellular neural network is constructed to solve the reaction-diffusion model for real-time simulation of soft tissue deformation. A gradient-based method is established to derive internal forces from the distribution of the mechanical load. Integration with a haptic device has also been achieved to simulate soft tissue deformation with haptic feedback. The proposed methodology not only predicts the typical behaviours of living tissues, but also accepts both local and large-range deformations. It also accommodates isotropic, anisotropic and inhomogeneous deformations by simple modification of diffusion coefficients.
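
    As a rough illustration of the reaction-diffusion idea, the sketch below diffuses a point "load" over a regular grid of mass points with a simple decay (reaction) term; the grid size, coefficients and decay rate are assumptions chosen only for demonstration and do not reproduce the authors' cellular neural network solver.

```python
# Illustrative 2D explicit reaction-diffusion step: a mechanical "load"
# applied at one point spreads over a grid of mass points and slowly decays.
# Grid size, coefficients and decay rate are assumptions for demonstration.
import numpy as np

n, steps = 64, 500
D, k, dt = 0.2, 0.01, 1.0        # diffusion coefficient, decay rate, time step
u = np.zeros((n, n))
u[n // 2, n // 2] = 100.0        # point load applied at the grid centre

for _ in range(steps):
    lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
           np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4 * u)
    u += dt * (D * lap - k * u)  # diffusion plus a simple reaction (decay) term

print("total remaining load:", round(float(u.sum()), 2))
```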

  19. Cost Effective Community Based Dementia Screening: A Markov Model Simulation

    Directory of Open Access Journals (Sweden)

    Erin Saito

    2014-01-01

    Full Text Available Background. Given the dementia epidemic and the increasing cost of healthcare, there is a need to assess the economic benefit of community based dementia screening programs. Materials and Methods. Markov model simulations were generated using data obtained from a community based dementia screening program over a one-year period. The models simulated yearly costs of caring for patients based on clinical transitions beginning in pre-dementia and extending for 10 years. Results. A total of 93 individuals (74 female, 19 male) were screened for dementia and 12 meeting clinical criteria for either mild cognitive impairment (n=7) or dementia (n=5) were identified. Assuming early therapeutic intervention beginning during the year of dementia detection, Markov model simulations demonstrated a 9.8% reduction in the cost of dementia care over a ten-year simulation period, primarily through increased duration in mild stages and reduced time in more costly moderate and severe stages. Discussion. Community based dementia screening can reduce healthcare costs associated with caring for demented individuals through earlier detection and treatment, resulting in proportionately reduced time in more costly advanced stages.
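
    The following is a minimal sketch of a yearly Markov cohort cost model of the kind described; the states, transition probabilities and annual costs are illustrative assumptions, not the values estimated from the screening program.

```python
# Sketch of a yearly Markov cohort cost model over a 10-year horizon.
# States, transition probabilities and annual costs are illustrative
# assumptions, not the values estimated in the study.
import numpy as np

states = ["MCI", "mild", "moderate", "severe", "dead"]
P = np.array([                      # yearly transition probabilities
    [0.80, 0.15, 0.04, 0.00, 0.01],
    [0.00, 0.70, 0.22, 0.05, 0.03],
    [0.00, 0.00, 0.65, 0.28, 0.07],
    [0.00, 0.00, 0.00, 0.85, 0.15],
    [0.00, 0.00, 0.00, 0.00, 1.00],
])
cost = np.array([5_000, 15_000, 35_000, 60_000, 0])  # annual cost per state (USD)

dist = np.array([1.0, 0.0, 0.0, 0.0, 0.0])  # cohort starts in MCI
total = 0.0
for year in range(10):
    total += dist @ cost   # expected cost accrued this year
    dist = dist @ P        # advance the cohort one year
print(f"expected 10-year cost per patient: ${total:,.0f}")
```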

  20. A pedagogical model for simulation-based learning in healthcare

    Directory of Open Access Journals (Sweden)

    Tuulikki Keskitalo

    2015-11-01

    Full Text Available The aim of this study was to design a pedagogical model for a simulation-based learning environment (SBLE) in healthcare. Currently, simulation and virtual reality are a major focus in healthcare education. However, when and how these learning environments should be applied is not well-known. The present study tries to fill that gap. We pose the following research question: What kind of pedagogical model supports and facilitates students’ meaningful learning in SBLEs? The study used design-based research (DBR) and case study approaches. We report the results from our second case study and how the pedagogical model was developed based on the lessons learned. The study involved nine facilitators and 25 students. Data were collected and analysed using mixed methods. The main result of this study is the refined pedagogical model. The model is based on the socio-cultural theory of learning and characteristics of meaningful learning as well as previous pedagogical models. The model will provide a more holistic and meaningful approach to teaching and learning in SBLEs. However, the model requires evidence and further development.

  1. Optimization Model for Web Based Multimodal Interactive Simulations.

    Science.gov (United States)

    Halic, Tansel; Ahn, Woojin; De, Suvranu

    2015-07-15

    This paper presents a technique for optimizing the performance of web based multimodal interactive simulations. For such applications where visual quality and the performance of simulations directly influence user experience, overloading of hardware resources may result in unsatisfactory reduction in the quality of the simulation and user satisfaction. However, optimization of simulation performance on individual hardware platforms is not practical. Hence, we present a mixed integer programming model to optimize the performance of graphical rendering and simulation performance while satisfying application specific constraints. Our approach includes three distinct phases: identification, optimization and update. In the identification phase, the computing and rendering capabilities of the client device are evaluated using an exploratory proxy code. This data is utilized in conjunction with user specified design requirements in the optimization phase to ensure best possible computational resource allocation. The optimum solution is used for rendering (e.g. texture size, canvas resolution) and simulation parameters (e.g. simulation domain) in the update phase. Test results are presented on multiple hardware platforms with diverse computing and graphics capabilities to demonstrate the effectiveness of our approach.

  2. A particle based simulation model for glacier dynamics

    Directory of Open Access Journals (Sweden)

    J. A. Åström

    2013-10-01

    Full Text Available A particle-based computer simulation model was developed for investigating the dynamics of glaciers. In the model, large ice bodies are made of discrete elastic particles which are bound together by massless elastic beams. These beams can break, which induces brittle behaviour. At loads below fracture, beams may also break and reform with small probabilities to incorporate slowly deforming viscous behaviour in the model. This model has the advantage that it can simulate important physical processes such as ice calving and fracturing in a more realistic way than traditional continuum models. For benchmarking purposes the deformation of an ice block on a slip-free surface was compared to that of a similar block simulated with a Finite Element full-Stokes continuum model. Two simulations were performed: (1) calving of an ice block partially supported in water, similar to a grounded marine glacier terminus, and (2) fracturing of an ice block on an inclined plane of varying basal friction, which could represent transition to fast flow or surging. Despite several approximations, including restriction to two dimensions and simplified water-ice interaction, the model was able to reproduce the size distributions of the debris observed in calving, which may be approximated by universal scaling laws. On a moderate slope, a large ice block was stable and quiescent as long as there was enough friction against the substrate. For a critical length of frictional contact, global sliding began, and the model block disintegrated in a manner suggestive of a surging glacier. In this case the fragment size distribution produced was typical of a grinding process.

  3. Flat Knitting Loop Deformation Simulation Based on Interlacing Point Model

    Directory of Open Access Journals (Sweden)

    Jiang Gaoming

    2017-12-01

    Full Text Available In order to create realistic loop primitives suitable for the faster CAD of flat-knitted fabric, we have performed research on the model of the loop as well as the variation of the loop surface. This paper proposes an interlacing point-based model for the loop center curve, and uses the cubic Bezier curve to fit the central curve of the regular loop, elongated loop, transfer loop, and irregular deformed loop. In this way, a general model for the central curve of the deformed loop is obtained. The obtained model is then utilized to perform texture mapping, texture interpolation, and brightness processing, simulating a clearly structured and lifelike deformed loop. The computer program LOOP is developed by using the algorithm. The deformed loop is simulated with different yarns, and the deformed loop is applied to the design of a cable stitch, demonstrating the feasibility of the proposed algorithm. This paper provides a loop primitive simulation method characterized by lifelikeness, yarn material variability, and deformation flexibility, and facilitates the loop-based fast computer-aided design (CAD) of knitted fabric.

  4. A sEMG model with experimentally based simulation parameters.

    Science.gov (United States)

    Wheeler, Katherine A; Shimada, Hiroshima; Kumar, Dinesh K; Arjunan, Sridhar P

    2010-01-01

    A differential, time-invariant, surface electromyogram (sEMG) model has been implemented. While it is based on existing EMG models, the novelty of this implementation is that it assigns more accurate distributions of variables to create realistic motor unit (MU) characteristics. Variables such as muscle fibre conduction velocity, jitter (the change in the interpulse interval between subsequent action potential firings) and motor unit size have been considered to follow normal distributions about an experimentally obtained mean. In addition, motor unit firing frequencies have been considered to have non-linear and type-based distributions that are in accordance with experimental results. Motor unit recruitment thresholds have been considered to be related to the MU type. The model has been used to simulate single-channel differential sEMG signals from voluntary, isometric contractions of the biceps brachii muscle. The model has been experimentally verified by conducting experiments on three subjects. Comparison between simulated signals and experimental recordings shows that the Root Mean Square (RMS) increases linearly with force in both cases. The simulated signals also show similar values and rates of change of RMS to the experimental signals.
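
    A rough sketch of this kind of motor-unit-based sEMG synthesis is shown below: motor unit sizes, firing rates and jitter are drawn from normal distributions around assumed means, and the resulting spike trains are convolved with a crude action potential shape. The distributions and waveform are placeholders, not the experimentally obtained parameters.

```python
# Sketch of motor-unit-based sEMG synthesis: MU sizes, firing rates and jitter
# are drawn from normal distributions around assumed (not experimental) means.
import numpy as np

rng = np.random.default_rng(0)
fs, T, n_mu = 2048, 2.0, 50                  # sample rate (Hz), duration (s), #MUs
t = np.arange(0, T, 1 / fs)

size = rng.normal(1.0, 0.3, n_mu).clip(0.1)  # MU amplitude scaling
rate = rng.normal(15, 3, n_mu).clip(6)       # mean firing rate (Hz)
jitter = 0.1                                 # fractional inter-pulse-interval jitter

def muap(width=0.005):
    """First derivative of a Gaussian as a crude MU action potential shape."""
    tt = np.arange(-3 * width, 3 * width, 1 / fs)
    return -tt / width**2 * np.exp(-tt**2 / (2 * width**2))

semg = np.zeros_like(t)
kernel = muap()
for m in range(n_mu):
    # Firing times as cumulative sum of jittered inter-pulse intervals.
    firings = np.cumsum(rng.normal(1 / rate[m], jitter / rate[m], int(T * rate[m]) + 5))
    spikes = np.zeros_like(t)
    idx = (firings[firings < T] * fs).astype(int)
    spikes[idx] = size[m]
    semg += np.convolve(spikes, kernel, mode="same")

rms = np.sqrt(np.mean(semg**2))
print(f"simulated sEMG RMS: {rms:.3f} (arbitrary units)")
```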

  5. Biologically based modelling and simulation of carcinogenesis at low doses

    International Nuclear Information System (INIS)

    Ouchi, Noriyuki B.

    2003-01-01

    The process of carcinogenesis is studied by computer simulation. In general, a large number of experimental samples is needed to detect mutations at low doses, but in practice it is difficult to obtain that much data. To satisfy the requirements of the low-dose situation, it is useful to study the process of carcinogenesis using a biologically based mathematical model. Carcinogenesis has mainly been studied using the so-called 'multi-stage model'; this model becomes increasingly complicated as recent findings from molecular biology experiments are adopted. Moreover, because the basic idea of the multi-stage model rests on epidemiologic data showing a log-log variation of cancer incidence with age, it is difficult to compare with experimental data from irradiated cell culture systems, which have been increasing in recent years. Taking the above into consideration, we concluded that a new model should be built with the following features: 1) the unit of the target system is a cell, 2) new information from molecular biology can be easily introduced, and 3) the model has spatial coordinates for checking colony formation or tumorigenesis. In this presentation, we show the details of the model and some simulation results on carcinogenesis. (author)

  6. A Coupled Simulation Architecture for Agent-Based/Geohydrological Modelling

    Science.gov (United States)

    Jaxa-Rozen, M.

    2016-12-01

    The quantitative modelling of social-ecological systems can provide useful insights into the interplay between social and environmental processes, and their impact on emergent system dynamics. However, such models should acknowledge the complexity and uncertainty of both of the underlying subsystems. For instance, the agent-based models which are increasingly popular for groundwater management studies can be made more useful by directly accounting for the hydrological processes which drive environmental outcomes. Conversely, conventional environmental models can benefit from an agent-based depiction of the feedbacks and heuristics which influence the decisions of groundwater users. From this perspective, this work describes a Python-based software architecture which couples the popular NetLogo agent-based platform with the MODFLOW/SEAWAT geohydrological modelling environment. This approach enables users to implement agent-based models in NetLogo's user-friendly platform, while benefiting from the full capabilities of MODFLOW/SEAWAT packages or reusing existing geohydrological models. The software architecture is based on the pyNetLogo connector, which provides an interface between the NetLogo agent-based modelling software and the Python programming language. This functionality is then extended and combined with Python's object-oriented features, to design a simulation architecture which couples NetLogo with MODFLOW/SEAWAT through the FloPy library (Bakker et al., 2016). The Python programming language also provides access to a range of external packages which can be used for testing and analysing the coupled models, which is illustrated for an application of Aquifer Thermal Energy Storage (ATES).
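
    A minimal sketch of the coupling pattern is given below using the pyNetLogo connector mentioned above; the NetLogo model file, commands and reporter names are hypothetical placeholders, and the geohydrological step is reduced to a stub where a FloPy/MODFLOW-SEAWAT model would normally be updated and re-run.

```python
# Minimal sketch of the NetLogo/Python coupling pattern via pyNetLogo.
# The model file, commands and reporters are hypothetical placeholders;
# the geohydrological step is a stub standing in for a FloPy/MODFLOW run.
import pyNetLogo

def run_groundwater_step(pumping_rates):
    """Stub standing in for a FloPy/MODFLOW-SEAWAT update-and-run call."""
    # e.g. update well package stresses from pumping_rates, run the model,
    # and read back simulated heads; here we just fake a head drawdown.
    return {well: -0.01 * rate for well, rate in pumping_rates.items()}

netlogo = pyNetLogo.NetLogoLink(gui=False)
netlogo.load_model('ates_agents.nlogo')   # hypothetical agent-based model
netlogo.command('setup')

for tick in range(52):                    # one year of weekly coupled steps
    netlogo.command('go')
    # Agents report their pumping decisions to the groundwater model ...
    rates = {1: netlogo.report('sum [pumping-rate] of turtles')}
    heads = run_groundwater_step(rates)
    # ... and the resulting heads are fed back to the agents.
    netlogo.command(f'set regional-head {heads[1]}')

netlogo.kill_workspace()
```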

  7. Discrete Event System Based Pyroprocessing Modeling and Simulation: Oxide Reduction

    International Nuclear Information System (INIS)

    Lee, H. J.; Ko, W. I.; Choi, S. Y.; Kim, S. K.; Hur, J. M.; Choi, E. Y.; Im, H. S.; Park, K. I.; Kim, I. T.

    2014-01-01

    Dynamic changes according to batch operation cannot be predicted from an equilibrium material flow. This study therefore began to build a dynamic material balance model based on the previously developed pyroprocessing flowsheet. As mid- and long-term research, an integrated pyroprocessing simulator is being developed at the Korea Atomic Energy Research Institute (KAERI) to cope with a review of technical feasibility, safeguards assessment, conceptual facility design, and economic feasibility evaluation. The most fundamental step in such simulator development is to establish the dynamic material flow framework. This study focused on the operation modeling of pyroprocessing to implement a dynamic material flow. As a case study, oxide reduction was investigated in terms of dynamic material flow. DES-based modeling was applied to build a pyroprocessing operation model. A dynamic material flow as the basic framework for an integrated pyroprocessing was successfully implemented through ExtendSim's internal database and item blocks. Complex operation logic behavior was verified, for example for the oxide reduction process, in terms of dynamic material flow. Compared to the equilibrium material flow, a model-based dynamic material flow provides such detailed information that a careful analysis of every batch is necessary to confirm the dynamic material balance results. With the default oxide reduction scenario, the batch mass balance was verified in comparison with a one-year equilibrium mass balance. This study is still in progress towards a mid- and long-term goal: the development of a multi-purpose pyroprocessing simulator able to cope with safeguards assessment, economic feasibility, technical evaluation, conceptual design, and support of licensing for a future pyroprocessing facility

  8. Validation techniques of agent based modelling for geospatial simulations

    Directory of Open Access Journals (Sweden)

    M. Darvishi

    2014-10-01

    Full Text Available One of the most interesting aspects of modelling and simulation studies is describing real-world phenomena that have specific properties, especially those that occur at large scales and have dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Therefore, miniaturization of world phenomena within the framework of a model, in order to simulate the real phenomena, is a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a new modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI’s ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of the growing interest of users in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, it can be built easily and applied to a wider range of applications than traditional simulation. A key challenge for ABMS, however, is the difficulty of validation and verification. Because of frequently emerging patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS with conventional validation methods. Therefore, an attempt to find appropriate validation techniques for ABM seems necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  9. Validation techniques of agent based modelling for geospatial simulations

    Science.gov (United States)

    Darvishi, M.; Ahmadi, G.

    2014-10-01

    One of the most interesting aspects of modelling and simulation studies is describing real-world phenomena that have specific properties, especially those that occur at large scales and have dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Therefore, miniaturization of world phenomena within the framework of a model, in order to simulate the real phenomena, is a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a new modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of the growing interest of users in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, it can be built easily and applied to a wider range of applications than traditional simulation. A key challenge for ABMS, however, is the difficulty of validation and verification. Because of frequently emerging patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS with conventional validation methods. Therefore, an attempt to find appropriate validation techniques for ABM seems necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  10. Agent-based modeling and simulation Part 3 : desktop ABMS.

    Energy Technology Data Exchange (ETDEWEB)

    Macal, C. M.; North, M. J.; Decision and Information Sciences

    2007-01-01

    Agent-based modeling and simulation (ABMS) is a new approach to modeling systems comprised of autonomous, interacting agents. ABMS promises to have far-reaching effects on the way that businesses use computers to support decision-making and researchers use electronic laboratories to support their research. Some have gone so far as to contend that ABMS 'is a third way of doing science,' in addition to traditional deductive and inductive reasoning (Axelrod 1997b). Computational advances have made possible a growing number of agent-based models across a variety of application domains. Applications range from modeling agent behavior in the stock market, supply chains, and consumer markets, to predicting the spread of epidemics, the threat of bio-warfare, and the factors responsible for the fall of ancient civilizations. This tutorial describes the theoretical and practical foundations of ABMS, identifies toolkits and methods for developing agent models, and illustrates the development of a simple agent-based model of shopper behavior using spreadsheets.
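
    The tutorial's shopper example is built in spreadsheets; a comparably simple agent-based shopper sketch in plain Python is given below, with the behavioural rule (shoppers return only if their last visit was not too crowded) being an illustrative assumption rather than the tutorial's actual model.

```python
# A deliberately simple agent-based shopper sketch in plain Python; the
# behavioural rule (return only after a satisfying, uncrowded visit) is an
# illustrative assumption, not the tutorial's spreadsheet model.
import random

random.seed(1)

class Shopper:
    def __init__(self):
        self.will_return = True

    def visit(self, crowding):
        # Satisfaction drops as the store gets more crowded.
        self.will_return = random.random() > crowding

shoppers = [Shopper() for _ in range(200)]
capacity = 120.0

for day in range(30):
    visitors = [s for s in shoppers if s.will_return]
    crowding = min(1.0, len(visitors) / capacity)
    for s in visitors:
        s.visit(crowding)
    # Non-visitors may drift back with small probability.
    for s in shoppers:
        if not s.will_return and random.random() < 0.1:
            s.will_return = True

print("returning shoppers after 30 days:", sum(s.will_return for s in shoppers))
```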

  11. IR characteristic simulation of city scenes based on radiosity model

    Science.gov (United States)

    Xiong, Xixian; Zhou, Fugen; Bai, Xiangzhi; Yu, Xiyu

    2013-09-01

    Reliable modeling of the thermal infrared (IR) signatures of real-world city scenes is required for signature management of civil and military platforms. Traditional modeling methods generally assume that scene objects are individual entities during the physical processes occurring in the infrared range. In reality, however, the physical scene involves convective and conductive interactions between objects as well as radiative interactions between objects. A method based on a radiosity model, which describes these complex effects, has been developed to enable accurate simulation of the radiance distribution of city scenes. Firstly, the physical processes affecting the IR characteristic of city scenes were described. Secondly, heat balance equations were formed by combining the atmospheric conditions, shadow maps and the geometry of the scene. Finally, a finite difference method was used to calculate the kinetic temperature of object surfaces. A radiosity model was introduced to describe the scattering effect of radiation between surface elements in the scene. By synthesizing the radiance distribution of objects in the infrared range, the IR characteristic of the scene could be obtained. Real infrared images and model predictions were shown and compared. The results demonstrate that this method can realistically simulate the IR characteristic of city scenes. It effectively displays the infrared shadow effects and the radiation interactions between objects in city scenes.

  12. Urban flood simulation based on the SWMM model

    Directory of Open Access Journals (Sweden)

    L. Jiang

    2015-05-01

    Full Text Available China is the nation with the fastest urbanization in the past decades, which has caused serious urban flooding. Flood forecasting is regarded as one of the important flood mitigation methods and is widely used for catchment floods, but it is not yet widely used for urban flooding. This paper, employing the SWMM model, one of the widely used urban flood planning and management models, simulates the urban flooding of Dongguan City in rapidly urbanizing southern China. SWMM is first set up based on the DEM, digital map and underground pipeline network; parameters are then derived from the properties of the subcatchments and the storm sewer conduits, and a parameter sensitivity analysis shows the parameter robustness. The simulated results show that with the 1-year return period precipitation the studied area will have no flooding, but for the 2-, 5-, 10- and 20-year return period precipitation the studied area will be inundated. The results show that the SWMM model is promising for urban flood forecasting, but as it has no surface runoff routing, urban flooding cannot be forecast precisely.
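
    The study drives SWMM directly from its input data; purely as an illustration of how such a model can be stepped programmatically, the sketch below uses the pyswmm wrapper (an assumption, not a tool used by the authors), with a hypothetical input file and node id.

```python
# Hedged sketch of stepping a SWMM model from Python with the pyswmm wrapper.
# The .inp file name and node id are hypothetical placeholders.
from pyswmm import Simulation, Nodes

with Simulation('dongguan_drainage.inp') as sim:   # hypothetical input file
    junction = Nodes(sim)['J1']                    # hypothetical node id
    worst_flooding = 0.0
    for _ in sim:                                  # advance the simulation step by step
        worst_flooding = max(worst_flooding, junction.flooding)
    print('peak flooding rate at node J1:', worst_flooding)
```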

  13. An Agent-Based Simulation Model for Organizational Analysis

    National Research Council Canada - National Science Library

    Ruan, Sui; Gokhale, Swapna S; Pattipati, Krishna R

    2006-01-01

    In many fields, including engineering, management, and organizational science, simulation-based computational organization theory has been used to gain insight into the degree of match ("congruence...

  14. Mesoscale meteorological model based on radioactive explosion cloud simulation

    International Nuclear Information System (INIS)

    Zheng Yi; Zhang Yan; Ying Chuntong

    2008-01-01

    In order to simulate the movement and concentration distribution of radioactive clouds from nuclear explosions and dirty bombs, the mesoscale meteorological model RAMS was used. Particle size, size-activity distribution and gravitational fallout in the cloud were considered. The results show that the model can simulate the 'mushroom' cloud of an explosion. Three-dimensional flow fields and radioactive concentration fields were obtained. (authors)

  15. Design, modeling and simulation of MEMS-based silicon Microneedles

    International Nuclear Information System (INIS)

    Amin, F; Ahmed, S

    2013-01-01

    The advancement in semiconductor process engineering and nano-scale fabrication technology has made it convenient to transport specific biological fluids into or out of human skin with minimum discomfort. Transdermal fluid delivery systems such as microneedle arrays are one such emerging and exciting Micro-Electro-Mechanical System (MEMS) application, which could lead to totally painless fluid delivery into skin with controllability and desirable yield. In this study, we aimed to revisit the problem with modeling, design and simulations carried out for MEMS-based silicon hollow out-of-plane microneedle arrays for biomedical applications, particularly for transdermal drug delivery. Microneedles approximately 200 μm in length with a 40 μm lumen diameter were successfully shown to be formed by isotropic and anisotropic etching techniques using the MEMS Pro design tool. These microneedles are arranged in a 2 × 4 array with a center-to-center spacing of 750 μm. Furthermore, the fluid flow characteristics through these microneedle channels have been modeled with and without the contribution of gravitational forces, using mathematical models derived from the Bernoulli equation. Physical process simulations have also been performed in TCAD SILVACO to optimize the design of these microneedles in line with standard Si fabrication lines.

  16. Design, modeling and simulation of MEMS-based silicon Microneedles

    Science.gov (United States)

    Amin, F.; Ahmed, S.

    2013-06-01

    The advancement in semiconductor process engineering and nano-scale fabrication technology has made it convenient to transport specific biological fluids into or out of human skin with minimum discomfort. Transdermal fluid delivery systems such as microneedle arrays are one such emerging and exciting Micro-Electro-Mechanical System (MEMS) application, which could lead to totally painless fluid delivery into skin with controllability and desirable yield. In this study, we aimed to revisit the problem with modeling, design and simulations carried out for MEMS-based silicon hollow out-of-plane microneedle arrays for biomedical applications, particularly for transdermal drug delivery. Microneedles approximately 200 μm in length with a 40 μm lumen diameter were successfully shown to be formed by isotropic and anisotropic etching techniques using the MEMS Pro design tool. These microneedles are arranged in a 2 × 4 array with a center-to-center spacing of 750 μm. Furthermore, the fluid flow characteristics through these microneedle channels have been modeled with and without the contribution of gravitational forces, using mathematical models derived from the Bernoulli equation. Physical process simulations have also been performed in TCAD SILVACO to optimize the design of these microneedles in line with standard Si fabrication lines.
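
    As a worked example of the Bernoulli-type flow estimate mentioned above, the sketch below combines an assumed driving pressure and water-like fluid properties with the stated needle length (200 μm) and lumen diameter (40 μm) in a Hagen-Poiseuille relation; because the pressure and fluid properties are assumptions, the numbers are only indicative.

```python
# Pressure-driven flow estimate through one microneedle lumen, combining a
# gravity head with the Hagen-Poiseuille relation. Needle length and lumen
# diameter come from the abstract; the driving pressure and fluid properties
# (water-like) are assumptions.
import math

L = 200e-6          # needle length (m)
d = 40e-6           # lumen diameter (m)
mu = 1.0e-3         # dynamic viscosity (Pa*s), water at ~20 C
rho = 1000.0        # density (kg/m^3)
dp = 10e3           # assumed driving pressure difference (Pa)
g, h = 9.81, L      # gravity head taken over the needle length

dp_total = dp + rho * g * h                        # gravity adds only ~2 Pa here
Q = math.pi * d**4 * dp_total / (128 * mu * L)     # Hagen-Poiseuille flow rate
v = Q / (math.pi * (d / 2)**2)                     # mean velocity in the lumen

print(f"flow rate per needle: {Q * 1e9 * 60:.1f} uL/min, mean velocity: {v:.2f} m/s")
```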

  17. A simulation-based analytic model of radio galaxies

    Science.gov (United States)

    Hardcastle, M. J.

    2018-04-01

    I derive and discuss a simple semi-analytical model of the evolution of powerful radio galaxies which is not based on assumptions of self-similar growth, but rather implements some insights about the dynamics and energetics of these systems derived from numerical simulations, and can be applied to arbitrary pressure/density profiles of the host environment. The model can qualitatively and quantitatively reproduce the source dynamics and synchrotron light curves derived from numerical modelling. Approximate corrections for radiative and adiabatic losses allow it to predict the evolution of radio spectral index and of inverse-Compton emission both for active and `remnant' sources after the jet has turned off. Code to implement the model is publicly available. Using a standard model with a light relativistic (electron-positron) jet, subequipartition magnetic fields, and a range of realistic group/cluster environments, I simulate populations of sources and show that the model can reproduce the range of properties of powerful radio sources as well as observed trends in the relationship between jet power and radio luminosity, and predicts their dependence on redshift and environment. I show that the distribution of source lifetimes has a significant effect on both the source length distribution and the fraction of remnant sources expected in observations, and so can in principle be constrained by observations. The remnant fraction is expected to be low even at low redshift and low observing frequency due to the rapid luminosity evolution of remnants, and to tend rapidly to zero at high redshift due to inverse-Compton losses.

  18. A SIMULATION OF CONTRACT FARMING USING AGENT BASED MODELING

    Directory of Open Access Journals (Sweden)

    Yuanita Handayati

    2016-12-01

    Full Text Available This study aims to simulate the effects of contract farming and farmer commitment to contract farming on supply chain performance by using agent based modeling as a methodology. Supply chain performance is represented by profits and service levels. The simulation results indicate that farmers should pay attention to customer requirements and plan their agricultural activities in order to fulfill these requirements. Contract farming helps farmers deal with demand and price uncertainties. We also find that farmer commitment is crucial to fulfilling contract requirements. This study contributes to this field from a conceptual as well as a practical point of view. From the conceptual point of view, our simulation results show that different levels of farmer commitment have an impact on farmer performance when implementing contract farming. From a practical point of view, the uncertainty faced by farmers and the market can be managed by implementing cultivation and harvesting scheduling, information sharing, and collective learning as ways of committing to contract farming.

  19. Conceptual modeling for simulation-based serious gaming

    NARCIS (Netherlands)

    van der Zee, D.J.; Holkenborg, Bart; Robinson, Stewart

    2012-01-01

    In recent years many simulation-based serious games have been developed for supporting (future) managers in operations management decision making. They illustrate the high potential of using discrete event simulation for pedagogical purposes. Unfortunately, this potential does not seem to go

  20. Validation techniques of agent based modelling for geospatial simulations

    OpenAIRE

    Darvishi, M.; Ahmadi, G.

    2014-01-01

    One of the most interesting aspects of modelling and simulation studies is describing real-world phenomena that have specific properties, especially those that occur at large scales and have dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Therefore, miniaturization of world phenomena within the framework of a model, in order to simulate the real phenomena, is a reasonable and scientific approach to understanding the world. Agent...

  1. Accurate lithography simulation model based on convolutional neural networks

    Science.gov (United States)

    Watanabe, Yuki; Kimura, Taiki; Matsunawa, Tetsuaki; Nojima, Shigeki

    2017-07-01

    Lithography simulation is an essential technique for today's semiconductor manufacturing process. In order to calculate an entire chip in realistic time, a compact resist model is commonly used; the model is established for faster calculation. To obtain an accurate compact resist model, it is necessary to fix a complicated non-linear model function. However, it is difficult to decide on an appropriate function manually because there are many options. This paper proposes a new compact resist model using CNN (Convolutional Neural Networks), one of the deep learning techniques. The CNN model makes it possible to determine an appropriate model function and achieve accurate simulation. Experimental results show that the CNN model can reduce CD prediction errors by 70% compared with the conventional model.
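
    A minimal sketch of a CNN-based regression model of this kind is given below in Keras: a small image clip is mapped to a predicted critical dimension. The architecture, input size and synthetic training data are assumptions for illustration and do not reproduce the paper's network.

```python
# Minimal CNN regression sketch: map a 32x32 single-channel clip to one CD
# value. Architecture, shapes and the synthetic data are assumptions only.
import numpy as np
import tensorflow as tf

# Synthetic stand-in data: 256 clips -> one dummy target value each.
x = np.random.rand(256, 32, 32, 1).astype("float32")
y = x.mean(axis=(1, 2, 3)).reshape(-1, 1)   # dummy target so the sketch runs

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, activation="relu", input_shape=(32, 32, 1)),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1),               # predicted critical dimension (CD)
])
model.compile(optimizer="adam", loss="mse")
model.fit(x, y, epochs=3, batch_size=32, verbose=0)
print("CD prediction for one clip:", float(model.predict(x[:1], verbose=0)[0, 0]))
```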

  2. Battery Performance Modelling and Simulation: a Neural Network Based Approach

    Science.gov (United States)

    Ottavianelli, Giuseppe; Donati, Alessandro

    2002-01-01

    This project developed against the background of ongoing research within the Control Technology Unit (TOS-OSC) of the Special Projects Division at the European Space Operations Centre (ESOC) of the European Space Agency. The purpose of this research is to develop and validate an Artificial Neural Network (ANN) tool able to model, simulate and predict the performance degradation of the Cluster II battery system. (The Cluster II mission consists of four spacecraft flying in tetrahedral formation, aimed at observing and studying the interaction between the Sun and the Earth by passing in and out of our planet's magnetic field.) This prototype tool, named BAPER and developed with a commercial neural network toolbox, could be used to support short- and medium-term mission planning in order to improve and maximise the batteries' lifetime, determining the best future charge/discharge cycles for the batteries given their present states, in view of a Cluster II mission extension. This study focuses on the five Silver-Cadmium batteries onboard Tango, the fourth Cluster II satellite, but time restraints have so far allowed an assessment of only the first battery. In their most basic form, ANNs are hyper-dimensional curve fits for non-linear data. With their remarkable ability to derive meaning from complicated or imprecise historical data, ANNs can be used to extract patterns and detect trends that are too complex to be noticed by either humans or other computer techniques. ANNs learn by example, and this is why they can be described as inductive, or data-based, models for the simulation of input/target mappings. A trained ANN can be thought of as an "expert" in the category of information it has been given to analyse, and this expert can then be used, as in this project, to provide projections for new situations of interest and answer "what if" questions. The most appropriate algorithm, in terms of training speed and memory storage requirements, is clearly the Levenberg

  3. Evidence-based quantification of uncertainties induced via simulation-based modeling

    International Nuclear Information System (INIS)

    Riley, Matthew E.

    2015-01-01

    The quantification of uncertainties in simulation-based modeling traditionally focuses upon quantifying uncertainties in the parameters input into the model, referred to as parametric uncertainties. Often neglected in such an approach are the uncertainties induced by the modeling process itself. This deficiency is often due to a lack of information regarding the problem or the models considered, which could theoretically be reduced through the introduction of additional data. Because of the nature of this epistemic uncertainty, traditional probabilistic frameworks utilized for the quantification of uncertainties are not necessarily applicable to quantify the uncertainties induced in the modeling process itself. This work develops and utilizes a methodology – incorporating aspects of Dempster–Shafer Theory and Bayesian model averaging – to quantify uncertainties of all forms for simulation-based modeling problems. The approach expands upon classical parametric uncertainty approaches, allowing for the quantification of modeling-induced uncertainties as well, ultimately providing bounds on classical probability without the loss of epistemic generality. The approach is demonstrated on two different simulation-based modeling problems: the computation of the natural frequency of a simple two degree of freedom non-linear spring mass system and the calculation of the flutter velocity coefficient for the AGARD 445.6 wing given a subset of commercially available modeling choices. - Highlights: • Modeling-induced uncertainties are often mishandled or ignored in the literature. • Modeling-induced uncertainties are epistemic in nature. • Probabilistic representations of modeling-induced uncertainties are restrictive. • Evidence theory and Bayesian model averaging are integrated. • Developed approach is applicable for simulation-based modeling problems

  4. Towards a standard model for research in agent-based modeling and simulation

    Directory of Open Access Journals (Sweden)

    Nuno Fachada

    2015-11-01

    Full Text Available Agent-based modeling (ABM) is a bottom-up modeling approach, where each entity of the system being modeled is uniquely represented as an independent decision-making agent. ABMs are very sensitive to implementation details. Thus, it is very easy to inadvertently introduce changes which modify model dynamics. Such problems usually arise due to the lack of transparency in model descriptions, which constrains how models are assessed, implemented and replicated. In this paper, we present PPHPC, a model which aims to serve as a standard in agent-based modeling research, namely, but not limited to, conceptual model specification, statistical analysis of simulation output, model comparison and parallelization studies. This paper focuses on the first two aspects (conceptual model specification and statistical analysis of simulation output), also providing a canonical implementation of PPHPC. The paper serves as a complete reference to the presented model, and can be used as a tutorial for simulation practitioners who wish to improve the way they communicate their ABMs.

  5. Modeling and simulation of botnet based cyber-threats

    Directory of Open Access Journals (Sweden)

    Kasprzyk Rafał

    2017-01-01

    Full Text Available The paper presents an analysis of cyber-threats, with particular emphasis on threats resulting from botnet activity. Botnets are among the most common types of threats and are often perceived as crucial in terms of national security. Their classification and methods of spreading are the basis for creating a cyberspace model that includes the presence of different types of cyber-threats. A well-designed cyberspace model makes it possible to construct an experimental environment that allows for the analysis of botnet characteristics, testing of its resistance to various events, and simulation of its spread and evolution. For this purpose, dedicated platforms with capabilities and functional characteristics that meet these requirements have been proposed.

  6. Some computer simulations based on the linear relative risk model

    International Nuclear Information System (INIS)

    Gilbert, E.S.

    1991-10-01

    This report presents the results of computer simulations designed to evaluate and compare the performance of the likelihood ratio statistic and the score statistic for making inferences about the linear relative risk model. The work was motivated by data on workers exposed to low doses of radiation, and the report includes illustration of several procedures for obtaining confidence limits for the excess relative risk coefficient based on data from three studies of nuclear workers. The computer simulations indicate that with small sample sizes and highly skewed dose distributions, asymptotic approximations to the score statistic or to the likelihood ratio statistic may not be adequate. For testing the null hypothesis that the excess relative risk is equal to zero, the asymptotic approximation to the likelihood ratio statistic was adequate, but use of the asymptotic approximation to the score statistic rejected the null hypothesis too often. Frequently the likelihood was maximized at the lower constraint, and when this occurred, the asymptotic approximations for the likelihood ratio and score statistics did not perform well in obtaining upper confidence limits. The score statistic and likelihood ratio statistic were found to perform comparably in terms of power and width of the confidence limits. It is recommended that with modest sample sizes, confidence limits be obtained using computer simulations based on the score statistic. Although nuclear worker studies are emphasized in this report, its results are relevant for any study investigating linear dose-response functions with highly skewed exposure distributions. 22 refs., 14 tabs
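
    The sketch below shows one simulation replicate for the linear (excess) relative risk model RR(d) = 1 + βd with Poisson counts, computing the likelihood ratio statistic for the null hypothesis β = 0; the dose distribution, baseline rate and true β are illustrative assumptions, and the baseline rate is treated as known, unlike in the report's analyses.

```python
# One simulation replicate for RR(d) = 1 + beta*d with Poisson counts and a
# likelihood ratio test of beta = 0. Doses, baseline rate and true beta are
# illustrative assumptions; the baseline rate is treated as known.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import chi2

rng = np.random.default_rng(3)
dose = rng.lognormal(mean=-2.0, sigma=1.0, size=500)   # highly skewed doses (Sv)
py, lam0, beta_true = 1000.0, 0.001, 0.5               # person-years, baseline, ERR/Sv
cases = rng.poisson(lam0 * py * (1 + beta_true * dose))

def neg_loglik(beta):
    mu = lam0 * py * (1 + beta * dose)
    return -np.sum(cases * np.log(mu) - mu)

# Maximise over beta subject to the natural constraint 1 + beta*d > 0.
lower = -1.0 / dose.max() + 1e-6
fit = minimize_scalar(neg_loglik, bounds=(lower, 10.0), method="bounded")
lrt = 2.0 * (neg_loglik(0.0) - fit.fun)                # likelihood ratio statistic
print(f"beta_hat = {fit.x:.3f}, LRT = {lrt:.2f}, p = {chi2.sf(lrt, 1):.3f}")
```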

  7. Model-based microwave image reconstruction: simulations and experiments

    International Nuclear Information System (INIS)

    Ciocan, Razvan; Jiang Huabei

    2004-01-01

    We describe an integrated microwave imaging system that can provide spatial maps of the dielectric properties of heterogeneous media from tomographically collected data. The hardware system (800-1200 MHz) was built based on a lock-in amplifier with 16 fixed antennas. The reconstruction algorithm was implemented using a Newton iterative method with combined Marquardt-Tikhonov regularizations. System performance was evaluated using heterogeneous media mimicking human breast tissue. The finite element method coupled with the Bayliss and Turkel radiation boundary conditions was applied to compute the electric field distribution in the heterogeneous media of interest. The results show that inclusions embedded in a 76 mm diameter background medium can be quantitatively reconstructed from both simulated and experimental data. Quantitative analysis of the microwave images obtained suggests that an inclusion of 14 mm in diameter is the smallest object that can be fully characterized presently using experimental data, while objects as small as 10 mm in diameter can be quantitatively resolved with simulated data
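
    The sketch below illustrates the regularized Newton-type update at the core of such reconstructions on a toy linear problem: each iteration solves (JᵀJ + λI)Δx = Jᵀr for the property update. The random linear forward operator stands in for the finite element field solver and is an assumption for illustration only.

```python
# Toy sketch of a Marquardt-Tikhonov-regularized Newton-type update:
# (J^T J + lambda I) dx = J^T (d_meas - d_model). The random linear "forward
# model" stands in for the FEM field solver, purely for illustration.
import numpy as np

rng = np.random.default_rng(7)
n_meas, n_pix = 120, 60
A = rng.normal(size=(n_meas, n_pix))       # stand-in forward operator
x_true = np.zeros(n_pix)
x_true[25:35] = 1.0                        # an "inclusion" in the property map
d_meas = A @ x_true + 0.01 * rng.normal(size=n_meas)

x = np.zeros(n_pix)                        # homogeneous starting guess
lam = 1.0                                  # regularization weight
for it in range(10):
    r = d_meas - A @ x                     # data residual
    J = A                                  # Jacobian (constant for a linear stand-in)
    dx = np.linalg.solve(J.T @ J + lam * np.eye(n_pix), J.T @ r)
    x += dx
    lam *= 0.7                             # relax regularization as iterations proceed

print("relative reconstruction error:",
      float(np.linalg.norm(x - x_true) / np.linalg.norm(x_true)))
```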

  8. Aeroelastic simulation using CFD based reduced order models

    International Nuclear Information System (INIS)

    Zhang, W.; Ye, Z.; Li, H.; Yang, Q.

    2005-01-01

    This paper aims at providing an accurate and efficient method for aeroelastic simulation. System identification is used to obtain reduced order models of the unsteady aerodynamics. Unsteady Euler codes are used to compute the output signals while 3211 multistep input signals are utilized. The LS (least squares) method is used to estimate the coefficients of the input-output difference model. The reduced order models are then used in place of the unsteady CFD code for aeroelastic simulation. The aeroelastic equations are marched by an improved 4th-order Runge-Kutta method that only needs to compute the aerodynamic loads once at every time step. The computed results agree well with those of direct coupling CFD/CSD methods. The computational efficiency is improved by 1-2 orders of magnitude while still retaining high accuracy. A standard aeroelastic computing example (the Isogai wing) with an S-type flutter boundary is computed and analyzed; the S shape is due to the system having more than one neutral point in the Mach range of 0.875-0.9. (author)
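
    A minimal sketch of the system-identification step is given below: a 3211 multistep input excites a toy second-order discrete system (standing in for the unsteady Euler code), and least squares recovers the coefficients of the input-output difference (ARX) model; the plant coefficients are illustrative assumptions.

```python
# Sketch of the identification step: a 3211 multistep input excites a toy
# 2nd-order discrete plant (standing in for the unsteady Euler code), and
# least squares estimates the input-output difference (ARX) coefficients.
import numpy as np

# 3211 multistep input: +1 for 3 units, -1 for 2, +1 for 1, -1 for 1, then zero.
u = np.concatenate([np.ones(30), -np.ones(20), np.ones(10), -np.ones(10), np.zeros(130)])

# Toy plant: y[k] = 1.5 y[k-1] - 0.7 y[k-2] + 0.05 u[k-1] + 0.03 u[k-2]
y = np.zeros_like(u)
for k in range(2, len(u)):
    y[k] = 1.5 * y[k - 1] - 0.7 * y[k - 2] + 0.05 * u[k - 1] + 0.03 * u[k - 2]

# Build the regression matrix for a 2nd-order ARX model and solve by least squares.
K = len(u)
Phi = np.column_stack([y[1:K - 1], y[0:K - 2], u[1:K - 1], u[0:K - 2]])
theta, *_ = np.linalg.lstsq(Phi, y[2:K], rcond=None)
print("identified coefficients:", np.round(theta, 3))   # ~ [1.5, -0.7, 0.05, 0.03]
```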

  9. The knowledge-based economy modeled, measured, simulated

    CERN Document Server

    Leydesdorff, Loet

    2006-01-01

    "Challenging, theoretically rich yet anchored in detailed empirical analysis, Loet Leydesdorff's exploration of the dynamics of the knowledge-economy is a major contribution to the field. Drawing on his expertise in science and technology studies, systems theory, and his internationally respected work on the 'triple helix', the book provides a radically new modelling and simulation of knowledge systems, capturing the articulation of structure, communication, and agency therein. This work will be of immense interest to both theorists of the knowledge-economy and practitioners in science policy." Andrew Webster Science & Technology Studies, University of York, UK

  10. Simulation-Based Dynamic Passenger Flow Assignment Modelling for a Schedule-Based Transit Network

    Directory of Open Access Journals (Sweden)

    Xiangming Yao

    2017-01-01

    Full Text Available The online operation management and offline policy evaluation of complex transit networks require an effective dynamic traffic assignment (DTA) method that can capture the temporal-spatial nature of traffic flows. The objective of this work is to propose a simulation-based dynamic passenger assignment framework and models for such applications in the context of schedule-based rail transit systems. In the simulation framework, travellers are regarded as individual agents who are able to obtain complete information on the current traffic conditions. A combined route selection model integrating pre-trip route selection and en-route route switching is established for achieving the dynamic network flow equilibrium status. The train agents operate strictly according to the timetable and their capacity limitations are considered. A continuous time-driven simulator based on the proposed framework and models is developed, and its performance is illustrated through a large-scale network of the Beijing subway. The results indicate that more than 0.8 million individual passengers and thousands of trains can be simulated simultaneously at a speed ten times faster than real time. This study provides an efficient approach to analyzing the dynamic demand-supply relationship for large schedule-based transit networks.

  11. Model-based analysis and simulation of regenerative heat wheel

    DEFF Research Database (Denmark)

    Wu, Zhuang; Melnik, Roderick V. N.; Borup, F.

    2006-01-01

    The rotary regenerator (also called the heat wheel) is an important component of energy-intensive sectors and is used in many heat recovery systems. In this paper, a model-based analysis of a rotary regenerator is carried out with a major emphasis given to the development and implementation of ...

  12. Web-Based Modelling and Collaborative Simulation of Declarative Processes

    DEFF Research Database (Denmark)

    Slaats, Tijs; Marquard, Morten; Shahzad, Muhammad

    2015-01-01

    As a provider of Electronic Case Management solutions to knowledge-intensive businesses and organizations, the Danish company Exformatics has in recent years identified a need for flexible process support in the tools that we provide to our customers. We have addressed this need by adapting DCR Graphs, a formal declarative workflow notation developed at the IT University of Copenhagen. Through close collaboration with academia we first integrated execution support for the notation into our existing tools, by leveraging a cloud-based process engine implementing the DCR formalism. In earlier work we reported on the integration of DCR Graphs as a workflow execution formalism in the existing Exformatics ECM products. In this paper we report on the advances we have made over the last two years, which support end-user discussions on how knowledge workers really work by enabling collaborative simulation of processes. ...

  13. A study for production simulation model generation system based on data model at a shipyard

    Directory of Open Access Journals (Sweden)

    Myung-Gi Back

    2016-09-01

    Full Text Available Simulation technology is a type of shipbuilding product lifecycle management solution used to support production planning and decision-making. Most shipbuilding processes consist of job-shop production, and the modeling and simulation require professional skills and experience in shipbuilding. For these reasons, many shipbuilding companies have difficulties adopting simulation systems, regardless of the necessity of the technology. In this paper, the data model for shipyard production simulation model generation was defined by analyzing the iterative simulation modeling procedure. The shipyard production simulation data model defined in this study contains the information necessary for the conventional simulation modeling procedure and can serve as a basis for simulation model generation. The efficacy of the developed system was validated by applying it to the simulation model generation of a panel block production line. By automating the initial simulation model generation process, which previously had to be performed by a simulation modeler, the proposed system substantially reduced the modeling time. In addition, by reducing the difficulties posed by different modeler-dependent generation methods, the proposed system makes standardization of simulation model quality possible.

  14. High tech supply chain simulation based on dynamical systems model

    NARCIS (Netherlands)

    Yuan, X.; Ashayeri, J.

    2013-01-01

    During the last 45 years, system dynamics, as a continuous type of simulation, has been used for simulating various problems, ranging from economic to engineering and managerial ones, when limited (historical) information is available. Control theory is another alternative for continuous simulation that ...

  15. A Systematic Review of Agent-Based Modelling and Simulation Applications in the Higher Education Domain

    Science.gov (United States)

    Gu, X.; Blackmore, K. L.

    2015-01-01

    This paper presents the results of a systematic review of agent-based modelling and simulation (ABMS) applications in the higher education (HE) domain. Agent-based modelling is a "bottom-up" modelling paradigm in which system-level behaviour (macro) is modelled through the behaviour of individual local-level agent interactions (micro).…

  16. REVIEW OF FLEXIBLE MANUFACTURING SYSTEM BASED ON MODELING AND SIMULATION

    Directory of Open Access Journals (Sweden)

    SAREN Sanjib Kumar

    2016-05-01

    Full Text Available This paper presents a literature survey of the use of simulation tools and methodologies for flexible manufacturing system design and operation problems, which have been widely applied to manufacturing system design and analysis. During this period, simulation has proved to be an extremely useful analysis and optimization tool, and many articles, papers, and conferences have focused directly on the topic. The paper surveys the use of simulation tools and their methodologies in flexible manufacturing systems over the period from 1982 to 2015.

  17. Improving the Aircraft Design Process Using Web-based Modeling and Simulation

    Science.gov (United States)

    Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.

    2003-01-01

    Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.

  18. Pedestrian simulation model based on principles of bounded rationality: results of validation tests

    NARCIS (Netherlands)

    Zhu, W.; Timmermans, H.J.P.; Lo, H.P.; Leung, Stephen C.H.; Tan, Susanna M.L.

    2009-01-01

    Over the years, different modelling approaches to simulating pedestrian movement have been suggested. The majority of pedestrian decision models are based on the concept of utility maximization. To explore alternatives, we developed the heterogeneous heuristic model (HHM), based on principles of ...

  19. Physics-based simulation models for EBSD: advances and challenges

    Science.gov (United States)

    Winkelmann, A.; Nolze, G.; Vos, M.; Salvat-Pujol, F.; Werner, W. S. M.

    2016-02-01

    EBSD has evolved into an effective tool for microstructure investigations in the scanning electron microscope. The purpose of this contribution is to give an overview of various simulation approaches for EBSD Kikuchi patterns and to discuss some of the underlying physical mechanisms.

  20. An Open-Source Simulation Environment for Model-Based Engineering, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed work is a new spacecraft simulation environment for model-based engineering of flight algorithms and software. The goal is to provide a much faster way...

  1. Risk of portfolio with simulated returns based on copula model

    Science.gov (United States)

    Razak, Ruzanna Ab; Ismail, Noriszura

    2015-02-01

    The commonly used tool for measuring the risk of a portfolio with equally weighted stocks is the variance-covariance method. Under extreme circumstances, this method leads to significant underestimation of the actual risk due to its assumption of multivariate normality for the joint distribution of stocks. The purpose of this research is to compare the actual risk of a portfolio with the simulated risk of a portfolio in which the joint distribution of the two return series is predetermined. The data used are daily stock prices from the ASEAN market for the period January 2000 to December 2012. The copula approach is applied to capture the time-varying dependence among the return series. The results show that the chosen copula families are not suitable to represent the dependence structures of each bivariate return pair, with the exception of the Philippines-Thailand pair, for which the t copula distribution appears to be the appropriate choice to depict its dependence. Assuming that the t copula distribution is the joint distribution of each paired series, simulated returns are generated and value-at-risk (VaR) is then applied to evaluate the risk of each portfolio consisting of two simulated return series. The VaR estimates were found to be symmetrical due to the simulation of returns via the elliptical copula-GARCH approach. By comparison, it is found that the actual risks are underestimated for all pairs of portfolios except Philippines-Thailand. This study shows that disregarding the non-normal dependence structure of two series will result in underestimation of the actual risk of the portfolio.
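    A minimal sketch of the simulation step described above: bivariate returns are drawn from a t copula, pushed through illustrative marginals, and the 95% VaR of an equally weighted portfolio is read off the empirical quantile. The correlation, degrees of freedom and marginal parameters are placeholders, not the values estimated in the study (where the marginals come from GARCH-filtered returns).

    ```python
    import numpy as np
    from scipy import stats

    # Illustrative t-copula parameters (not the fitted values from the study).
    rho, df_copula = 0.4, 5
    corr = np.array([[1.0, rho], [rho, 1.0]])

    # 1. Sample from a bivariate Student-t and map to uniforms via the t CDF (the copula).
    z = stats.multivariate_t(loc=[0, 0], shape=corr, df=df_copula).rvs(size=100_000, random_state=0)
    u = stats.t.cdf(z, df=df_copula)

    # 2. Transform the uniforms through illustrative Student-t marginals.
    r1 = stats.t.ppf(u[:, 0], df=6) * 0.010
    r2 = stats.t.ppf(u[:, 1], df=8) * 0.012

    # 3. Equally weighted portfolio returns and the one-day 95% value-at-risk.
    port = 0.5 * r1 + 0.5 * r2
    var_95 = -np.quantile(port, 0.05)
    print(f"Simulated 95% VaR: {var_95:.4f}")
    ```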

  2. Model and simulation for melt flow in micro-injection molding based on the PTT model

    International Nuclear Information System (INIS)

    Cao, Wei; Kong, Lingchao; Li, Qian; Ying, Jin; Shen, Changyu

    2011-01-01

    Unsteady viscoelastic flows were studied using the finite element method in this work. The Phan-Thien–Tanner (PTT) model was used to represent the rheological behavior of viscoelastic fluids. To effectively describe the microscale effects, the slip boundary condition and surface tension were added to the mathematical model for melt flow in micro-injection molding. The new variational equation of pressure, including the viscoelastic parameters and slip boundary condition, was generalized using integration by parts. A computer code based on the finite element method and the finite difference method was developed to solve the melt flow problem. Numerical simulation revealed that the melt viscoelasticity plays an important role in the prediction of melt pressure, temperature at the gate and the subsequent melt front advancement in the cavity. Using the viscoelastic model, one can also control the rapid increase in simulated pressure and temperature, and reduce the filling differences among cavities. Short-shot experiments on a micro-motor shaft showed that the predicted melt front from the viscoelastic model is in fair agreement with the corresponding experimental results.

  3. The Trick Simulation Toolkit: A NASA/Opensource Framework for Running Time Based Physics Models

    Science.gov (United States)

    Penn, John M.

    2016-01-01

    The Trick Simulation Toolkit is a simulation development environment used to create high fidelity training and engineering simulations at the NASA Johnson Space Center and many other NASA facilities. Its purpose is to generate a simulation executable from a collection of user-supplied models and a simulation definition file. For each Trick-based simulation, Trick automatically provides job scheduling, numerical integration, the ability to write and restore human readable checkpoints, data recording, interactive variable manipulation, a run-time interpreter, and many other commonly needed capabilities. This allows simulation developers to concentrate on their domain expertise and the algorithms and equations of their models. Also included in Trick are tools for plotting recorded data and various other supporting utilities and libraries. Trick is written in C/C++ and Java and supports both Linux and MacOSX computer operating systems. This paper describes Trick's design and use at NASA Johnson Space Center.

  4. An individual-based simulation model for mottled sculpin (Cottus bairdi) in a southern Appalachian stream

    Science.gov (United States)

    Brenda Rashleigh; Gary D. Grossman

    2005-01-01

    We describe and analyze a spatially explicit, individual-based model for the local population dynamics of mottled sculpin (Cottus bairdi). The model simulated daily growth, mortality, movement and spawning of individuals within a reach of stream. Juvenile and adult growth was based on consumption bioenergetics of benthic macroinvertebrate prey;...

  5. Physics-based statistical model and simulation method of RF propagation in urban environments

    Science.gov (United States)

    Pao, Hsueh-Yuan; Dvorak, Steven L.

    2010-09-14

    A physics-based statistical model and simulation/modeling method and system of electromagnetic wave propagation (wireless communication) in urban environments. In particular, the model is a computationally efficient closed-form parametric model of RF propagation in an urban environment which is extracted from a physics-based statistical wireless channel simulation method and system. The simulation divides the complex urban environment into a network of interconnected urban canyon waveguides which can be analyzed individually; calculates spectral coefficients of modal fields in the waveguides excited by the propagation using a database of statistical impedance boundary conditions which incorporates the complexity of building walls in the propagation model; determines statistical parameters of the calculated modal fields; and determines a parametric propagation model based on the statistical parameters of the calculated modal fields from which predictions of communications capability may be made.

  6. Population PK modelling and simulation based on fluoxetine and norfluoxetine concentrations in milk: a milk concentration-based prediction model.

    Science.gov (United States)

    Tanoshima, Reo; Bournissen, Facundo Garcia; Tanigawara, Yusuke; Kristensen, Judith H; Taddio, Anna; Ilett, Kenneth F; Begg, Evan J; Wallach, Izhar; Ito, Shinya

    2014-10-01

    Population pharmacokinetic (pop PK) modelling can be used for PK assessment of drugs in breast milk. However, complex mechanistic modelling of a parent and an active metabolite using both blood and milk samples is challenging. We aimed to develop a simple predictive pop PK model for milk concentration-time profiles of a parent and a metabolite, using data on fluoxetine (FX) and its active metabolite, norfluoxetine (NFX), in milk. Using a previously published data set of drug concentrations in milk from 25 women treated with FX, a pop PK model predictive of milk concentration-time profiles of FX and NFX was developed. Simulation was performed with the model to generate FX and NFX concentration-time profiles in milk of 1000 mothers. This milk concentration-based pop PK model was compared with the previously validated plasma/milk concentration-based pop PK model of FX. Milk FX and NFX concentration-time profiles were described reasonably well by a one compartment model with a FX-to-NFX conversion coefficient. Median values of the simulated relative infant dose on a weight basis (sRID: weight-adjusted daily doses of FX and NFX through breastmilk to the infant, expressed as a fraction of therapeutic FX daily dose per body weight) were 0.028 for FX and 0.029 for NFX. The FX sRID estimates were consistent with those of the plasma/milk-based pop PK model. A predictive pop PK model based on only milk concentrations can be developed for simultaneous estimation of milk concentration-time profiles of a parent (FX) and an active metabolite (NFX). © 2014 The British Pharmacological Society.
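    A minimal sketch of the general model form described above, assuming a one-compartment structure with first-order absorption and an FX-to-NFX conversion coefficient; all rate constants and doses are illustrative placeholders, not the published population estimates.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Illustrative parameters (h^-1, arbitrary dose units) -- not the published pop PK estimates.
    ka, ke_fx, ke_nfx, f_conv = 0.5, 0.05, 0.03, 0.1
    dose, tau, n_doses = 20.0, 24.0, 14  # daily dosing for two weeks

    def rhs(t, y):
        gut, fx, nfx = y
        dgut = -ka * gut
        dfx = ka * gut - ke_fx * fx                 # parent (FX) in the milk compartment
        dnfx = f_conv * ke_fx * fx - ke_nfx * nfx   # metabolite (NFX) formed from FX
        return [dgut, dfx, dnfx]

    # Simulate repeated dosing by restarting the ODE at each dose time.
    y = np.array([0.0, 0.0, 0.0])
    times, fx_conc, nfx_conc = [], [], []
    for d in range(n_doses):
        y[0] += dose
        sol = solve_ivp(rhs, (d * tau, (d + 1) * tau), y, max_step=0.5)
        times.extend(sol.t)
        fx_conc.extend(sol.y[1])
        nfx_conc.extend(sol.y[2])
        y = sol.y[:, -1].copy()
    ```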

  7. Application potential of Agent Based Simulation and Discrete Event Simulation in Enterprise integration modelling concepts

    Directory of Open Access Journals (Sweden)

    Paul-Eric DOSSOU

    2013-07-01

    Full Text Available This paper aims to present the dilemma of simulation tool selection. The authors discuss examples of enterprise architecture methodologies (CIMOSA and GRAI) in which an agent approach is used to solve planning and management problems. Simulation is now widely used and is practically the only tool that enables verification of complex systems. Many companies face the problem of which simulation tool is appropriate to use for verification. Selected tools based on ABS and DES are presented, and some tools combining the DES and ABS approaches are described. The authors give some recommendations on the selection process.
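    To make the ABS/DES distinction discussed above concrete, the sketch below is a bare-bones discrete event simulation of a single-server queue written in plain Python (an event list ordered by time). It is a generic illustration only, not tied to any of the commercial tools the authors compare.

    ```python
    import heapq, random

    random.seed(1)
    events = []                          # priority queue of (time, event_type, job_id)
    heapq.heappush(events, (0.0, "arrival", 0))
    busy, queue, now, done = False, [], 0.0, 0

    while events and done < 20:
        now, kind, job = heapq.heappop(events)
        if kind == "arrival":
            # Schedule the next arrival, then either queue the job or start service.
            heapq.heappush(events, (now + random.expovariate(1.0), "arrival", job + 1))
            if busy:
                queue.append(job)
            else:
                busy = True
                heapq.heappush(events, (now + random.expovariate(1.2), "finish", job))
        else:  # service finished
            done += 1
            if queue:
                heapq.heappush(events, (now + random.expovariate(1.2), "finish", queue.pop(0)))
            else:
                busy = False

    print(f"{done} jobs completed by t = {now:.2f}")
    ```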

  8. Application potential of Agent Based Simulation and Discrete Event Simulation in Enterprise integration modelling concepts

    Directory of Open Access Journals (Sweden)

    Pawel PAWLEWSKI

    2012-07-01

    Full Text Available This paper aims to present the dilemma of simulation tool selection. The authors discuss examples of enterprise architecture methodologies (CIMOSA and GRAI) in which an agent approach is used to solve planning and management problems. Simulation is now widely used and is practically the only tool that enables verification of complex systems. Many companies face the problem of which simulation tool is appropriate to use for verification. Selected tools based on ABS and DES are presented, and some tools combining the DES and ABS approaches are described. The authors give some recommendations on the selection process.

  9. iCrowd: agent-based behavior modeling and crowd simulator

    Science.gov (United States)

    Kountouriotis, Vassilios I.; Paterakis, Manolis; Thomopoulos, Stelios C. A.

    2016-05-01

    Initially designed in the context of the TASS (Total Airport Security System) FP-7 project, the Crowd Simulation platform developed by the Integrated Systems Lab of the Institute of Informatics and Telecommunications at N.C.S.R. Demokritos has evolved into a complete domain-independent agent-based behavior simulator with an emphasis on crowd behavior and building evacuation simulation. Under continuous development, it reflects an effort to implement a modern, multithreaded, data-oriented simulation engine employing the latest state-of-the-art programming technologies and paradigms. It is based on an extensible architecture that separates core services from the individual layers of agent behavior, offering a concrete simulation kernel designed for high performance and stability. Its primary goal is to deliver an abstract platform to facilitate the implementation of several Agent-Based Simulation solutions with applicability in several domains of knowledge, such as: (i) Crowd behavior simulation during indoor/outdoor evacuation. (ii) Non-Player Character AI for Game-oriented applications and Gamification activities. (iii) Vessel traffic modeling and simulation for Maritime Security and Surveillance applications. (iv) Urban and Highway Traffic and Transportation Simulations. (v) Social Behavior Simulation and Modeling.

  10. Fuzzy delay model based fault simulator for crosstalk delay fault test ...

    Indian Academy of Sciences (India)

    In this paper, a fuzzy delay model based crosstalk delay fault simulator is proposed. As design trends move towards nanometer technologies, more new parameters affect the delay of the component. Fuzzy delay models are ideal for modelling the uncertainty found in the design and manufacturing steps.

  11. Multiple Linear Regression Model Based on Neural Network and Its Application in the MBR Simulation

    Directory of Open Access Journals (Sweden)

    Chunqing Li

    2012-01-01

    Full Text Available Computer simulation of the membrane bioreactor (MBR) has become a focus of MBR research. To compensate for drawbacks of physical experiments, such as long test periods, high cost, and sealed (non-observable) equipment, this paper proposes a three-dimensional simulation system for MBR wastewater treatment that is fast, efficient, and well visualized, based on an in-depth study of the mathematical model of the MBR combined with neural network theory. The system is developed with hybrid programming in VC++ and OpenGL, using a neural-network-based multifactor linear regression model of the factors affecting MBR membrane flux, an integer (rather than floating-point) modeling method, and quad-tree recursion. The experiments show that the three-dimensional simulation system, using the above models and methods, provides inspiration and a reference for future research on and application of MBR simulation technology.
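    A minimal sketch of a multifactor linear regression of membrane flux on operating variables, fitted by ordinary least squares; the factor names (transmembrane pressure, MLSS, temperature) and the synthetic data are assumptions for illustration, not the variables or coefficients used in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical operating data: transmembrane pressure (kPa), MLSS (g/L), temperature (C).
    X = np.column_stack([
        rng.uniform(10, 40, 200),   # TMP
        rng.uniform(4, 12, 200),    # MLSS
        rng.uniform(15, 30, 200),   # temperature
    ])
    # Synthetic membrane flux (L/m^2/h), for illustration only.
    flux = 30 + 0.8 * X[:, 0] - 1.5 * X[:, 1] + 0.4 * X[:, 2] + rng.normal(0, 2, 200)

    # Multifactor linear regression: flux = b0 + b1*TMP + b2*MLSS + b3*T
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, flux, rcond=None)
    pred = A @ coef
    r2 = 1 - np.sum((flux - pred) ** 2) / np.sum((flux - flux.mean()) ** 2)
    print("coefficients:", np.round(coef, 3), " R^2 =", round(r2, 3))
    ```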

  12. A web-based, collaborative modeling, simulation, and parallel computing environment for electromechanical systems

    Directory of Open Access Journals (Sweden)

    Xiaoliang Yin

    2015-03-01

    Full Text Available A complex electromechanical system is usually composed of multiple components from different domains, including mechanical, electronic, hydraulic, control, and so on. Modeling and simulation of electromechanical systems on a unified platform is one of the research hotspots in systems engineering at present, and it is also the development trend of the design of complex electromechanical systems. The unified modeling techniques and tools based on the Modelica language provide a satisfactory solution. To meet the requirements of collaborative modeling, simulation, and parallel computing for complex electromechanical systems based on Modelica, a general web-based modeling and simulation prototype environment, namely WebMWorks, is designed and implemented. Based on rich Internet application technologies, an interactive graphical user interface for modeling and post-processing in the web browser was implemented; with the collaborative design module, the environment supports top-down, concurrent modeling and team cooperation; additionally, a service-oriented architecture was applied to supply compiling and solving services that run on cloud-like servers, so the environment can manage and dispatch large-scale simulation tasks in parallel on multiple computing servers simultaneously. An engineering application concerning a pure electric vehicle was tested on WebMWorks. The results of the simulation and a parametric experiment demonstrate that the tested web-based environment can effectively shorten the design cycle of complex electromechanical systems.

  13. Rapid Simulation of Flat Knitting Loops Based On the Yarn Texture and Loop Geometrical Model

    Directory of Open Access Journals (Sweden)

    Lu Zhiwen

    2017-06-01

    Full Text Available In order to create realistic loop primitives suitable for the fast computer-aided design (CAD) of flat knitted fabric, we study the geometric model of the loop as well as the variation of the loop surface. We establish a texture variation model based on the process by which normal yarn is transformed into a loop, which provides the realistic texture of the simulated loop, and we then optimize the simulated loop based on illumination variation. This paper develops a computer program implementing the optimization algorithm and achieves loop simulation for different yarns to verify the feasibility of the proposed algorithm. Our work provides a fast CAD approach for flat knitted fabric with loop simulation that is not only more realistic but also adjustable to the material. It also provides theoretical value for the computer simulation of flat knitted fabric.

  14. Large-watershed flood simulation and forecasting based on different-resolution distributed hydrological model

    Science.gov (United States)

    Li, J.

    2017-12-01

    Large-watershed flood simulation and forecasting is very important for the application of a distributed hydrological model. There are several challenges, including the effect of the model's spatial resolution and the model's performance and accuracy. To address the resolution effect, different model resolutions (1000 m × 1000 m, 600 m × 600 m, 500 m × 500 m, 400 m × 400 m, and 200 m × 200 m) were used to build the distributed hydrological model, the Liuxihe model. The purpose is to find the best resolution for the Liuxihe model in large-watershed flood simulation and forecasting. This study sets up a physically based distributed hydrological model for flood forecasting of the Liujiang River basin in south China. Terrain data (digital elevation model, DEM), soil type and land use type were downloaded freely from the web. The model parameters are optimized using an improved particle swarm optimization (PSO) algorithm; parameter optimization reduces the uncertainty that exists when model parameters are derived physically. Model resolutions from 200 m × 200 m to 1000 m × 1000 m are used for modeling the Liujiang River basin flood with the Liuxihe model in this study. The best spatial resolution for flood simulation and forecasting is 200 m × 200 m, and as the spatial resolution becomes coarser, the model performance and accuracy deteriorate. When the model resolution is 1000 m × 1000 m, the flood simulation and forecasting results are the worst, and the river channel delineated at this resolution differs from the actual one. To keep the model at an acceptable performance, a minimum model spatial resolution is needed. The suggested threshold model spatial resolution for modeling the Liujiang River basin flood is a 500 m × 500 m grid cell, but a model spatial resolution of 200 m × 200 m is recommended in this study to keep the model at its best performance.
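    A minimal sketch of the particle swarm optimization step mentioned above, applied to a toy objective standing in for the flood-simulation error; the swarm settings and bounds are illustrative, not the improved PSO variant used for the Liuxihe model.

    ```python
    import numpy as np

    def pso(objective, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
        """Minimal particle swarm optimizer for box-constrained parameter calibration."""
        rng = np.random.default_rng(seed)
        lo, hi = np.asarray(bounds, dtype=float).T
        x = rng.uniform(lo, hi, size=(n_particles, len(lo)))
        v = np.zeros_like(x)
        pbest, pbest_val = x.copy(), np.array([objective(p) for p in x])
        gbest = pbest[pbest_val.argmin()].copy()
        for _ in range(iters):
            r1, r2 = rng.random(x.shape), rng.random(x.shape)
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)   # velocity update
            x = np.clip(x + v, lo, hi)                                  # keep particles in bounds
            val = np.array([objective(p) for p in x])
            better = val < pbest_val
            pbest[better], pbest_val[better] = x[better], val[better]
            gbest = pbest[pbest_val.argmin()].copy()
        return gbest, pbest_val.min()

    # Toy objective standing in for the simulation error (e.g. one based on Nash-Sutcliffe efficiency).
    best, err = pso(lambda p: np.sum((p - np.array([0.3, 1.2])) ** 2), bounds=[(0, 1), (0, 2)])
    print("best parameters:", np.round(best, 3), " error:", round(err, 6))
    ```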

  15. Nonlinear Model Predictive Control of a Cable-Robot-Based Motion Simulator

    DEFF Research Database (Denmark)

    Katliar, Mikhail; Fischer, Joerg; Frison, Gianluca

    2017-01-01

    In this paper we present the implementation of a model-predictive controller (MPC) for real-time control of a cable-robot-based motion simulator. The controller computes control inputs such that a desired acceleration and angular velocity at a defined point in simulator's cabin are tracked while...... satisfying constraints imposed by working space and allowed cable forces of the robot. In order to fully use the simulator capabilities, we propose an approach that includes the motion platform actuation in the MPC model. The tracking performance and computation time of the algorithm are investigated...

  16. Event-based computer simulation model of aspect-type experiments strictly satisfying Einstein's locality conditions

    NARCIS (Netherlands)

    De Raedt, Hans; De Raedt, Koen; Michielsen, Kristel; Keimpema, Koenraad; Miyashita, Seiji

    2007-01-01

    Inspired by Einstein-Podolsky-Rosen-Bohm experiments with photons, we construct an event-based simulation model in which every essential element in the ideal experiment has a counterpart. The model satisfies Einstein's criterion of local causality and does not rely on concepts of quantum ...

  17. Land-use change arising from rural land exchange : an agent-based simulation model

    NARCIS (Netherlands)

    Bakker, Martha M.; Alam, Shah Jamal; van Dijk, Jerry|info:eu-repo/dai/nl/29612642X; Rounsevell, Mark D. A.

    Land exchange can be a major factor driving land-use change in regions with high pressure on land, but is generally not incorporated in land-use change models. Here we present an agent-based model to simulate land-use change arising from land exchange between multiple agent types representing ...

  18. Specifying and Refining a Measurement Model for a Simulation-Based Assessment. CSE Report 619.

    Science.gov (United States)

    Levy, Roy; Mislevy, Robert J.

    2004-01-01

    The challenges of modeling students' performance in simulation-based assessments include accounting for multiple aspects of knowledge and skill that arise in different situations and the conditional dependencies among multiple aspects of performance in a complex assessment. This paper describes a Bayesian approach to modeling and estimating…

  19. An IBM PC-based math model for space station solar array simulation

    Science.gov (United States)

    Emanuel, E. M.

    1986-01-01

    This report discusses and documents the design, development, and verification of a microcomputer-based solar cell math model for simulating the Space Station's solar array Initial Operational Capability (IOC) reference configuration. The array model is developed utilizing a linear solar cell dc math model requiring only five input parameters: short circuit current, open circuit voltage, maximum power voltage, maximum power current, and orbit inclination. The accuracy of this model is investigated using actual solar array on orbit electrical data derived from the Solar Array Flight Experiment/Dynamic Augmentation Experiment (SAFE/DAE), conducted during the STS-41D mission. This simulator provides real-time simulated performance data during the steady state portion of the Space Station orbit (i.e., array fully exposed to sunlight). Eclipse to sunlight transients and shadowing effects are not included in the analysis, but are discussed briefly. Integrating the Solar Array Simulator (SAS) into the Power Management and Distribution (PMAD) subsystem is also discussed.

  20. Bee waxes: a model of characterization for using as base simulator tissue in teletherapy with photons

    International Nuclear Information System (INIS)

    Silva, Rogerio Matias Vidal da; Souza, Divanizia do Nascimento

    2011-01-01

    This paper presents a model for the characterization and selection of bee waxes which makes it possible to certify the viability of using these waxes as tissue-simulating base material in the manufacture of objects appropriate for external radiotherapy with megavolt photon beams. The work was divided into three stages, in which physical and chemical properties were evaluated, in addition to aspects related to beam attenuation capacity. The entire process was carefully tracked with respect to the origin of the wax, such as the bee species and the flora surrounding the beehives. The chemical composition of the waxes is similar to that of other simulators commonly used in radiotherapy. The behavior of the mass attenuation coefficient in the radiotherapeutic energy range is comparable to that of other simulators, and consequently to soft tissue. The proposed model is efficient and supports the conclusion that certain bee waxes are suitable for use as tissue-simulating base material.

  1. LISEM: a physically based model to simulate runoff and soil erosion in catchments: model structure

    NARCIS (Netherlands)

    Roo, de A.P.J.; Wesseling, C.G.; Cremers, N.H.D.T.; Verzandvoort, M.A.; Ritsema, C.J.; Oostindie, K.

    1996-01-01

    The Limburg Soil Erosion Model (LISEM) is described as a way of simulating hydrological and soil erosion processes during single rainfall events at the catchment scale. Sensitivity analysis of the model shows that the initial matric pressure potential, the hydraulic conductivity of the soil and ...

  2. Design-based research in designing the model for educating simulation facilitators.

    Science.gov (United States)

    Koivisto, Jaana-Maija; Hannula, Leena; Bøje, Rikke Buus; Prescott, Stephen; Bland, Andrew; Rekola, Leena; Haho, Päivi

    2018-03-01

    The purpose of this article is to introduce the concept of design-based research, its appropriateness in creating education-based models, and to describe the process of developing such a model. The model was designed as part of the Nurse Educator Simulation based learning project, funded by the EU's Lifelong Learning program (2013-1-DK1-LEO05-07053). The project partners were VIA University College, Denmark, the University of Huddersfield, UK and Metropolia University of Applied Sciences, Finland. As an outcome of the development process, "the NESTLED model for educating simulation facilitators" (NESTLED model) was generated. This article also illustrates five design principles that could be applied to other pedagogies. Copyright © 2018 Elsevier Ltd. All rights reserved.

  3. Improved social force model based on exit selection for microscopic pedestrian simulation in subway station

    Institute of Scientific and Technical Information of China (English)

    郑勋; 李海鹰; 孟令云; 许心越; 陈旭

    2015-01-01

    An improved social force model based on exit selection is proposed to simulate pedestrians’ microscopic behaviors in a subway station. The modification lies in considering three factors: spatial distance, occupant density and exit width. In addition, the problem of pedestrians switching exits too frequently is addressed as follows: an exit is not changed to another within the affected area of the current exit, a probability of keeping the previously chosen exit is used, and the exit selection function is invoked only after several simulation steps. Pedestrians in a subway station have some special characteristics, such as explicit destinations and different degrees of familiarity with the station. Finally, Beijing Zoo Subway Station is taken as an example and the feasibility of the model results is verified through a comparison of actual data and simulation data. The simulation results show that the improved model can depict the microscopic behaviors of pedestrians in a subway station.
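    A minimal sketch of the exit-selection idea described above: each exit is scored from spatial distance, occupant density and exit width, and the most attractive one is chosen. The weighting scheme and the numbers are illustrative assumptions, not the calibrated model of the paper.

    ```python
    import numpy as np

    def choose_exit(ped_pos, exits, densities, widths, w_dist=0.5, w_dens=0.3, w_width=0.2):
        """Score each exit from spatial distance, occupant density near the exit and exit width
        (weights are illustrative), and return the index of the most attractive exit."""
        exits = np.asarray(exits, dtype=float)
        dist = np.linalg.norm(exits - np.asarray(ped_pos, dtype=float), axis=1)
        # Normalise each factor to [0, 1]; smaller distance/density and larger width are better.
        score = (w_dist * (1 - dist / dist.max())
                 + w_dens * (1 - np.asarray(densities) / max(densities))
                 + w_width * np.asarray(widths) / max(widths))
        return int(score.argmax())

    # Pedestrian at (12, 3) choosing among three exits (positions in metres).
    idx = choose_exit((12, 3), exits=[(0, 0), (25, 0), (12, 20)],
                      densities=[0.8, 2.5, 1.2], widths=[2.0, 3.5, 2.0])
    print("chosen exit:", idx)
    ```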

  4. Using Computer Simulations for Promoting Model-based Reasoning. Epistemological and Educational Dimensions

    Science.gov (United States)

    Develaki, Maria

    2017-11-01

    Scientific reasoning is particularly pertinent to science education since it is closely related to the content and methodologies of science and contributes to scientific literacy. Much of the research in science education investigates the appropriate framework and teaching methods and tools needed to promote students' ability to reason and evaluate in a scientific way. This paper aims (a) to contribute to an extended understanding of the nature and pedagogical importance of model-based reasoning and (b) to exemplify how using computer simulations can support students' model-based reasoning. We provide first a background for both scientific reasoning and computer simulations, based on the relevant philosophical views and the related educational discussion. This background suggests that the model-based framework provides an epistemologically valid and pedagogically appropriate basis for teaching scientific reasoning and for helping students develop sounder reasoning and decision-taking abilities and explains how using computer simulations can foster these abilities. We then provide some examples illustrating the use of computer simulations to support model-based reasoning and evaluation activities in the classroom. The examples reflect the procedure and criteria for evaluating models in science and demonstrate the educational advantages of their application in classroom reasoning activities.

  5. SIMULATING AN EVOLUTIONARY MULTI-AGENT BASED MODEL OF THE STOCK MARKET

    Directory of Open Access Journals (Sweden)

    Diana MARICA

    2015-08-01

    Full Text Available The paper focuses on artificial stock market simulations using a multi-agent model incorporating 2,000 heterogeneous agents interacting on the artificial market. The agents interact through trading activity on the market via a call auction trading mechanism. The multi-agent model uses evolutionary techniques such as genetic programming in order to generate an adaptive and evolving population of agents. Each artificial agent is endowed with wealth and a genetic-programming-induced trading strategy. The trading strategies evolve and adapt to new market conditions through a process called breeding, which implies that at each simulation step new agents with better trading strategies are generated by the model, by recombining the best-performing trading strategies and replacing the agents with the worst-performing trading strategies. The simulation model was built with the simulation software Altreva Adaptive Modeler, which offers a suitable platform for financial market simulations of evolutionary agent-based models; the S&P 500 composite index is used as a benchmark for the simulation results.

  6. Coastal aquifer management under parameter uncertainty: Ensemble surrogate modeling based simulation-optimization

    Science.gov (United States)

    Janardhanan, S.; Datta, B.

    2011-12-01

    Surrogate models are widely used to develop computationally efficient simulation-optimization models to solve complex groundwater management problems. Artificial intelligence based models are most often used for this purpose, trained using predictor-predictand data obtained from a numerical simulation model. Most often this is implemented with the assumption that the parameters and boundary conditions used in the numerical simulation model are perfectly known. However, in most practical situations these values are uncertain, and under such circumstances the applicability of these approximation surrogates becomes limited. In our study we develop a surrogate model based coupled simulation-optimization methodology for determining optimal pumping strategies for coastal aquifers considering parameter uncertainty. An ensemble surrogate modeling approach is used along with multiple-realization optimization. The methodology is used to solve a multi-objective coastal aquifer management problem with two conflicting objectives. Hydraulic conductivity and aquifer recharge are treated as uncertain values. The three-dimensional coupled flow and transport simulation model FEMWATER is used to simulate the aquifer responses for a number of scenarios corresponding to Latin hypercube samples of pumping and uncertain parameters, in order to generate input-output patterns for training the surrogate models. Non-parametric bootstrap sampling of this original data set is used to generate multiple data sets which belong to different regions in the multi-dimensional decision and parameter space. These data sets are used to train and test multiple surrogate models based on genetic programming. The ensemble of surrogate models is then linked to a multi-objective genetic algorithm to solve the pumping optimization problem. Two conflicting objectives, viz. maximizing total pumping from beneficial wells and minimizing the total pumping from barrier wells for hydraulic control of ...
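    A minimal sketch of the Latin hypercube sampling step described above, generating scenarios of pumping rates and uncertain parameters that would be fed to the numerical simulator (FEMWATER in the paper) to build training patterns; the variable ranges below are hypothetical.

    ```python
    import numpy as np
    from scipy.stats import qmc

    # Illustrative decision/parameter ranges: two pumping rates (m^3/day),
    # hydraulic conductivity (m/day) and recharge (mm/yr). Values are hypothetical.
    lower = [0.0,    0.0,    5.0,  50.0]
    upper = [1500.0, 1500.0, 50.0, 300.0]

    sampler = qmc.LatinHypercube(d=4, seed=42)
    unit_samples = sampler.random(n=200)            # 200 scenarios in the unit hypercube
    scenarios = qmc.scale(unit_samples, lower, upper)

    # Each row would be run through the simulation model to generate the
    # input-output patterns used for training the surrogate ensemble.
    print(np.round(scenarios[:3], 1))
    ```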

  7. Simulating train movement in an urban railway based on an improved car-following model

    International Nuclear Information System (INIS)

    Ye Jing-Jing; Jin Xin-Min; Li Ke-Ping

    2013-01-01

    Based on the optimal velocity car-following model, in this paper we propose an improved model for simulating train movement in an urban railway in which the regenerative energy of a train is considered. A new additional term is introduced into the traditional car-following model. Our aim is to analyze and discuss the dynamic characteristics of train movement when the regenerative energy is utilized by the electric locomotive. The simulation results indicate that the improved car-following model is suitable for simulating train movement. Further, some qualitative relationships between regenerative energy and the dynamic characteristics of a train are investigated; for example, the measured regenerative energy follows a power-law distribution. Our results are useful for optimizing the design and planning of urban railway systems. (general)
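    A minimal sketch of a standard optimal velocity car-following update applied to a string of trains; it omits the regenerative-energy term that the improved model adds, and all parameters are illustrative.

    ```python
    import numpy as np

    def ov(headway, v_max=20.0, d_safe=200.0, width=100.0):
        """Optimal velocity function: desired speed (m/s) as a function of headway (m)."""
        return 0.5 * v_max * (np.tanh((headway - d_safe) / width) + 1.0)

    n_trains, a, dt, steps = 5, 0.3, 0.5, 2000
    x = np.arange(n_trains)[::-1] * 400.0      # initial positions, leading train first
    v = np.full(n_trains, 10.0)

    for _ in range(steps):
        headway = np.empty(n_trains)
        headway[1:] = x[:-1] - x[1:]           # distance to the train ahead
        headway[0] = 1e9                       # leading train is unconstrained
        dv = a * (ov(headway) - v)             # relax each speed towards the optimal velocity
        v = np.maximum(v + dv * dt, 0.0)
        x = x + v * dt

    print("final speeds (m/s):", np.round(v, 2))
    ```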

  8. An individual-based probabilistic model for simulating fisheries population dynamics

    Directory of Open Access Journals (Sweden)

    Jie Cao

    2016-12-01

    Full Text Available The purpose of stock assessment is to support managers in making intelligent decisions regarding removals from fish populations. Errors in assessment models may have devastating impacts on population fitness and negative impacts on the economy of the resource users. Thus, accurate estimation of population size and growth rates is critical for success. Evaluating and testing the behavior and performance of stock assessment models, assessing the consequences of model mis-specification, and assessing the impact of management strategies require an operating model that accurately describes the dynamics of the target species and can resolve spatial and seasonal changes. In addition, the most thorough evaluations of assessment models use an operating model that takes a different form than the assessment model. This paper presents an individual-based probabilistic model used to simulate the complex dynamics of populations and their associated fisheries. Various components of population dynamics are expressed as random Bernoulli trials in the model, and detailed life and fishery histories of each individual are tracked over their life span. The simulation model is designed to be flexible so it can be used for different species and fisheries. It can simulate mixing among multiple stocks and link stock-recruit relationships to environmental factors. Furthermore, the model allows for flexibility in sub-models (e.g., growth and recruitment) and model assumptions (e.g., age- or size-dependent selectivity). This model enables the user to conduct various simulation studies, including testing the performance of assessment models under different assumptions, assessing the impacts of model mis-specification, and evaluating management strategies.

  9. Confronting uncertainty in model-based geostatistics using Markov Chain Monte Carlo simulation

    NARCIS (Netherlands)

    Minasny, B.; Vrugt, J.A.; McBratney, A.B.

    2011-01-01

    This paper demonstrates for the first time the use of Markov Chain Monte Carlo (MCMC) simulation for parameter inference in model-based soil geostatistics. We implemented the recently developed DiffeRential Evolution Adaptive Metropolis (DREAM) algorithm to jointly summarize the posterior ...

  10. Stall Recovery in a Centrifuge-Based Flight Simulator With an Extended Aerodynamic Model

    NARCIS (Netherlands)

    Ledegang, W.D.; Groen, E.L.

    2015-01-01

    We investigated the performance of 12 airline pilots in recovering from an asymmetrical stall in a flight simulator featuring an extended aerodynamic model of a transport-category aircraft, and a centrifuge-based motion platform capable of generating enhanced buffet motion and g-cueing. All pilots ...

  11. Fuzzy delay model based fault simulator for crosstalk delay fault test ...

    Indian Academy of Sciences (India)

    In this paper, a fuzzy delay model based crosstalk delay fault simulator is proposed. As design trends move towards nanometer technologies, more new parameters affect the delay of the component. To find the quality of non-robust tests, a fuzzy delay ...

  12. Nonlinear simulation of tearing mode and m=1 kink mode based on kinetic RMHD model

    International Nuclear Information System (INIS)

    Yagi, M.; Yoshida, S.; Itoh, S.-I.; Naitou, H.; Nagahara, H.; Leboeuf, J.-N.; Itoh, K.; Matsumoto, T.; Tokuda, S.; Azumi, M.

    2005-01-01

    In this paper, we investigate the dynamics of sawtooth oscillations and neoclassical tearing modes based on a kinetic RMHD model, putting an emphasis on the interaction with microscopic and transport processes. The simulation results show that the assumption made in the conventional theory of the neoclassical tearing mode is rather crude. (author)

  13. Discrete event model-based simulation for train movement on a single-line railway

    International Nuclear Information System (INIS)

    Xu Xiao-Ming; Li Ke-Ping; Yang Li-Xing

    2014-01-01

    The aim of this paper is to present a discrete event model-based approach to simulate train movement with the energy-saving factor taken into account. We conduct extensive case studies to show the dynamic characteristics of the traffic flow and demonstrate the effectiveness of the proposed approach. The simulation results indicate that the proposed discrete event model-based simulation approach is suitable for characterizing the movements of a group of trains on a single railway line with fewer iterations and less CPU time. Additionally, some other qualitative and quantitative characteristics are investigated. In particular, because of the cumulative influence of the preceding trains, the following trains have to be accelerated or braked frequently to control the headway distance, leading to higher energy consumption. (general)

  14. Impact of Loss Synchronization on Reliable High Speed Networks: A Model Based Simulation

    Directory of Open Access Journals (Sweden)

    Suman Kumar

    2014-01-01

    Full Text Available The contemporary nature of network evolution demands simulation models that are flexible, scalable, and easily implementable. In this paper, we propose a fluid-based model for performance analysis of reliable high speed networks. In particular, this paper aims to study the dynamic relationship between congestion control algorithms and queue management schemes, in order to develop a better understanding of the causal linkages between the two. We propose a loss synchronization module which is user configurable. We validate our model through simulations under controlled settings. Also, we present a performance analysis to provide insights into two important issues concerning 10 Gbps high speed networks: (i) the impact of bottleneck buffer size on the performance of a 10 Gbps high speed network and (ii) the impact of the level of loss synchronization on link utilization-fairness tradeoffs. The practical impact of the proposed work is to provide design guidelines, along with a powerful simulation tool, to protocol designers and network developers.
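    As a rough illustration of the fluid-model idea, the sketch below integrates a generic AIMD (additive increase, multiplicative decrease) fluid approximation of a single TCP flow at a drop-tail bottleneck; it is not the authors' model or their loss synchronization module, and the parameters are placeholders.

    ```python
    # Generic AIMD fluid approximation of one TCP flow sharing a bottleneck link (illustrative only).
    rtt, capacity, buf = 0.1, 1250.0, 300.0     # s, packets/s, packets
    dt, steps = 0.001, 20000
    w, q = 10.0, 0.0                            # congestion window (pkts), queue length (pkts)

    for _ in range(steps):
        rate = w / rtt
        # Drop-tail style loss fraction: only when the buffer is full and the arrival rate exceeds capacity.
        loss = max(rate - capacity, 0.0) / rate if q >= buf else 0.0
        dw = dt * (1.0 / rtt - loss * rate * w / 2.0)   # additive increase, multiplicative decrease
        dq = dt * (rate - capacity)
        w = max(w + dw, 1.0)
        q = min(max(q + dq, 0.0), buf)

    print(f"final window: {w:.1f} pkts, queue: {q:.1f} pkts")
    ```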

  15. Simulation model for centrifugal pump in flow networks based on internal characteristics

    International Nuclear Information System (INIS)

    Sun, Ji-Lin; Xue, Ruo-Jun; Peng, Min-Jun

    2018-01-01

    For the simulation of centrifugal pumps in flow network systems, three approaches are generally used: the fitting model, the numerical method, and the internal characteristics model. The fitting model is simple and fast and thus widely used. The numerical method can provide more detailed information than the fitting model, but increases implementation complexity and computational cost. In real-time simulations of flow networks, conditions outside the rated operating point, especially in volume flow rate, are where the accuracy of the fitting model becomes unreliable; a new method for simulating centrifugal pumps is therefore proposed in this research. The method is based on the theoretical head and the hydraulic losses in centrifugal pumps, and cavitation is also considered. The simulation results are verified against experimental benchmark data from an actual pump. The comparison confirms that the proposed method fits the flow-head curves well, and the responses of the main parameters in dynamic-state operations are consistent with theoretical analyses.
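    A minimal sketch of the internal-characteristics idea described above: the delivered head is taken as the theoretical (Euler) head minus friction and shock (off-design incidence) losses. The coefficients below are illustrative, not fitted to the benchmark pump, and cavitation is not included.

    ```python
    import numpy as np

    def pump_head(q, n, q_design=0.05, n_rated=1450.0,
                  a=35.0, b=120.0, k_fric=2.0e3, k_shock=4.0e3):
        """Head (m) of a centrifugal pump from internal characteristics:
        theoretical (Euler) head minus friction and shock losses.
        q: volume flow rate (m^3/s), n: rotational speed (rpm). Coefficients are illustrative."""
        s = n / n_rated
        h_theory = a * s**2 - b * s * q                 # Euler head, linear in q at fixed speed
        h_friction = k_fric * q**2                      # hydraulic friction loss ~ q^2
        h_shock = k_shock * (q - q_design * s)**2       # shock loss, zero at the scaled design flow
        return h_theory - h_friction - h_shock

    flows = np.linspace(0.0, 0.09, 10)
    print(np.round(pump_head(flows, n=1450.0), 2))      # flow-head curve at rated speed
    ```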

  16. Computer Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Pronskikh, V. S. [Fermilab

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of scientific practice of high importance that have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model's relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them) needs to be made. Holding on to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviate the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.

  17. Simulation of tropical cyclone activity over the western North Pacific based on CMIP5 models

    Science.gov (United States)

    Shen, Haibo; Zhou, Weican; Zhao, Haikun

    2017-09-01

    Based on the Coupled Model Inter-comparison Project 5 (CMIP5) models, tropical cyclone (TC) activity in the summers of 1965-2005 over the western North Pacific (WNP) is simulated by a TC dynamical downscaling system. In consideration of the diversity among climate models, Bayesian model averaging (BMA) and equal-weighted model averaging (EMA) methods are applied to produce the ensemble large-scale environmental factors from the CMIP5 model outputs. The environmental factors generated by the BMA and EMA methods are compared, as are the corresponding TC simulations from the downscaling system. Results indicate that the BMA method shows a significant advantage over the EMA. In addition, the impact of model selection on the BMA method is examined. For each factor, the ten models with better performance are selected from the 30 CMIP5 models and BMA is then conducted on them. The resulting ensemble environmental factors and simulated TC activity are similar to the results from the 30-model BMA, which verifies that the BMA method can assign each model in the ensemble a weight corresponding to its predictive skill. Consequently, the presence of poorly performing models does not particularly affect the BMA effectiveness, and the ensemble outcomes are improved. Finally, based upon the BMA method and the downscaling system, we analyze the sensitivity of TC activity to three important environmental factors, i.e., sea surface temperature (SST), large-scale steering flow, and vertical wind shear. Among the three factors, SST and large-scale steering flow greatly affect TC tracks, while the average intensity distribution is sensitive to all three environmental factors. Moreover, SST and vertical wind shear jointly play a critical role in the inter-annual variability of TC lifetime maximum intensity and the frequency of intense TCs.
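    A simplified illustration of how BMA-style weights differ from equal weighting: each model's weight is approximated from its Gaussian likelihood against observations and used to form a weighted ensemble. The data are synthetic and the weighting is a stand-in for the full BMA training, which typically fits weights and variances jointly (e.g. with an EM algorithm).

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical hindcasts of one environmental factor from 4 models plus observations
    # over 40 summers; purely synthetic, for illustration only.
    obs = rng.normal(0.0, 1.0, 40)
    models = obs + rng.normal(0.0, [[0.3], [0.6], [0.9], [1.5]], size=(4, 40))

    # Simplified BMA-style weights: posterior model probability approximated from each
    # model's Gaussian likelihood against the observations.
    sigma = models.std(axis=1, keepdims=True)
    loglik = -0.5 * np.sum(((models - obs) / sigma) ** 2 + np.log(2 * np.pi * sigma ** 2), axis=1)
    w = np.exp(loglik - loglik.max())
    w /= w.sum()

    bma_ensemble = w @ models           # weighted ensemble mean
    ema_ensemble = models.mean(axis=0)  # equal-weighted ensemble for comparison
    print("BMA weights:", np.round(w, 3))
    ```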

  18. A Simulation Model for Tensile Fracture Procedure Analysis of Graphite Material based on Damage Evolution

    International Nuclear Information System (INIS)

    Zhao Erqiang; Ma Shaopeng; Wang Hongtao

    2014-01-01

    Graphite material is generally easily damaged by widely distributed micro-cracks when subjected to load. For numerical analysis of structures made of graphite, the degradation of the material in damaged areas needs to be considered. In this paper, an axial tension test method is proposed to obtain the dynamic damage evolution rule of the material. Using the degradation rule (the variation of the elastic modulus), a finite element model is then constructed to analyze the tensile fracture process of an L-shaped graphite specimen. An axial tension test of graphite is performed to obtain the stress-strain curve. Based on the variation of the measured curve, the damage evolution rule of the material is fitted. A simulation model based on the above measured results is then constructed in ABAQUS via a user subroutine. Using this simulation model, the tension failure process of an L-shaped graphite specimen with a fillet is simulated. The calculated and experimental results for the fracture load are in good agreement. The damage simulation model based on the stress-strain curve of the axial tensile test can be used in other tensile fracture analyses. (author)

  19. Agent-Based Modeling of Taxi Behavior Simulation with Probe Vehicle Data

    Directory of Open Access Journals (Sweden)

    Saurav Ranjit

    2018-05-01

    Full Text Available Taxi behavior is a spatial–temporal dynamic process involving discrete time-dependent events, such as customer pick-up, customer drop-off, cruising, and parking. Simulation models, which are simplifications of a real-world system, can help in understanding the effects of changes in such dynamic behavior. In this paper, an agent-based modeling and simulation approach is proposed that describes the dynamic actions of an agent, i.e., a taxi, governed by behavior rules and properties that emulate taxi behavior. Taxi behavior simulations are fundamentally performed to optimize the service level for both taxi drivers and passengers. Moreover, such simulation techniques could also be applied in other fields where obtaining real raw data is difficult due to privacy issues, such as human mobility data or call detail record data. This paper describes the development of an agent-based simulation model based on multiple input parameters (taxi stay-point clusters; trip information (origin and destination); taxi demand information; free taxi movement; and network travel time) that were derived from taxi probe GPS data. The agent parameters were mapped onto a grid network and the road network: the grid network was used as a base for query, search, and retrieval of taxi agent parameters, while the actual movement of taxi agents took place on the road network with routing and interpolation. The results obtained from the simulated taxi agent data and real taxi data showed a significant level of similarity for different taxi behaviors, such as trip generation, trip time, trip distance, and trip occupancy, based on their distributions. For efficient data handling, a distributed computing platform for large-scale data was used to extract taxi agent parameters from the probe data, utilizing both spatial and non-spatial indexing techniques.

  20. DAE Tools: equation-based object-oriented modelling, simulation and optimisation software

    Directory of Open Access Journals (Sweden)

    Dragan D. Nikolić

    2016-04-01

    Full Text Available In this work, DAE Tools modelling, simulation and optimisation software, its programming paradigms and main features are presented. The current approaches to mathematical modelling, such as the use of modelling languages and general-purpose programming languages, are analysed. The common set of capabilities required by typical simulation software is discussed, and the shortcomings of the current approaches are recognised. A new hybrid approach is introduced, and the modelling languages and the hybrid approach are compared in terms of the grammar, compiler, parser and interpreter requirements, maintainability and portability. The most important characteristics of the new approach are discussed, such as: (1) support for runtime model generation; (2) support for runtime simulation set-up; (3) support for complex runtime operating procedures; (4) interoperability with third-party software packages (i.e. NumPy/SciPy); (5) suitability for embedding and use as a web application or software as a service; and (6) code-generation, model exchange and co-simulation capabilities. The benefits of an equation-based approach to modelling, implemented in a fourth-generation object-oriented general-purpose programming language such as Python, are discussed. The architecture and the software implementation details, as well as the type of problems that can be solved using DAE Tools software, are described. Finally, some applications of the software at different levels of abstraction are presented, and its embedding capabilities and suitability for use as software as a service are demonstrated.
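    The equation-based style, declaring the model equations and handing them to a general solver, can be illustrated with a plain SciPy integration of a gravity-drained tank. This is a generic sketch under illustrative parameters, not the DAE Tools API.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Equation-based style: state the model equation declaratively and hand it to a solver.
    # Gravity-drained tank: A * dh/dt = q_in - c * sqrt(h). Parameters are illustrative.
    A, c, q_in = 1.5, 0.25, 0.1   # m^2, m^2.5/s, m^3/s

    def equations(t, y):
        h = max(y[0], 0.0)
        return [(q_in - c * np.sqrt(h)) / A]

    sol = solve_ivp(equations, (0.0, 600.0), [1.0], max_step=1.0)
    print(f"level after 600 s: {sol.y[0, -1]:.3f} m (analytic steady state: {(q_in / c) ** 2:.3f} m)")
    ```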

  1. Leukocyte Motility Models Assessed through Simulation and Multi-objective Optimization-Based Model Selection.

    Directory of Open Access Journals (Sweden)

    Mark N Read

    2016-09-01

    Full Text Available The advent of two-photon microscopy now reveals unprecedented, detailed spatio-temporal data on cellular motility and interactions in vivo. Understanding cellular motility patterns is key to gaining insight into the development and possible manipulation of the immune response. Computational simulation has become an established technique for understanding immune processes and evaluating hypotheses in the context of experimental data, and there is clear scope to integrate microscopy-informed motility dynamics. However, determining which motility model best reflects in vivo motility is non-trivial: 3D motility is an intricate process requiring several metrics to characterize. This complicates model selection and parameterization, which must be performed against several metrics simultaneously. Here we evaluate Brownian motion, Lévy walk and several correlated random walks (CRWs) against the motility dynamics of neutrophils and lymph node T cells under inflammatory conditions by simultaneously considering cellular translational and turn speeds, and meandering indices. Both datasets comprise heterogeneous cells exhibiting a continuum of inherent translational speeds and directionalities, a feature that significantly improves capture of in vivo motility when simulated as a CRW. Furthermore, translational and turn speeds are inversely correlated, and the corresponding CRW simulation again improves capture of our in vivo data, albeit to a lesser extent. In contrast, Brownian motion poorly reflects our data. Lévy walk is competitive in capturing some aspects of neutrophil motility, but captures only the directional persistence of T cells, highlighting the importance of evaluating models against several motility metrics simultaneously. This we achieve through novel application of multi-objective optimization, wherein each model is independently implemented and then parameterized to identify optimal trade-offs in performance against each metric. The resultant Pareto
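
    A minimal 2D correlated-random-walk sketch (illustrative parameters, not the fitted values from the study) conveys the combination of persistent directionality, heterogeneous speed and the meandering index used as a metric:

        import numpy as np

        rng = np.random.default_rng(0)

        def correlated_random_walk(n_steps, mean_speed, turn_sd):
            """2D CRW: each step keeps the previous heading plus a Gaussian turn."""
            heading = rng.uniform(0, 2 * np.pi)
            pos = np.zeros((n_steps + 1, 2))
            for i in range(n_steps):
                heading += rng.normal(0.0, turn_sd)                  # small turn_sd -> high persistence
                speed = rng.gamma(shape=2.0, scale=mean_speed / 2.0) # heterogeneous speeds
                pos[i + 1] = pos[i] + speed * np.array([np.cos(heading), np.sin(heading)])
            return pos

        # Meandering index: net displacement over total path length (1 = straight line)
        track = correlated_random_walk(n_steps=200, mean_speed=5.0, turn_sd=0.3)
        path_length = np.sum(np.linalg.norm(np.diff(track, axis=0), axis=1))
        meandering_index = np.linalg.norm(track[-1] - track[0]) / path_length
        print(round(meandering_index, 3))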

  2. Agent-based model of angiogenesis simulates capillary sprout initiation in multicellular networks.

    Science.gov (United States)

    Walpole, J; Chappell, J C; Cluceru, J G; Mac Gabhann, F; Bautch, V L; Peirce, S M

    2015-09-01

    Many biological processes are controlled by both deterministic and stochastic influences. However, efforts to model these systems often rely on either purely stochastic or purely rule-based methods. To better understand the balance between stochasticity and determinism in biological processes a computational approach that incorporates both influences may afford additional insight into underlying biological mechanisms that give rise to emergent system properties. We apply a combined approach to the simulation and study of angiogenesis, the growth of new blood vessels from existing networks. This complex multicellular process begins with selection of an initiating endothelial cell, or tip cell, which sprouts from the parent vessels in response to stimulation by exogenous cues. We have constructed an agent-based model of sprouting angiogenesis to evaluate endothelial cell sprout initiation frequency and location, and we have experimentally validated it using high-resolution time-lapse confocal microscopy. ABM simulations were then compared to a Monte Carlo model, revealing that purely stochastic simulations could not generate sprout locations as accurately as the rule-informed agent-based model. These findings support the use of rule-based approaches for modeling the complex mechanisms underlying sprouting angiogenesis over purely stochastic methods.

  3. SPATKIN: a simulator for rule-based modeling of biomolecular site dynamics on surfaces.

    Science.gov (United States)

    Kochanczyk, Marek; Hlavacek, William S; Lipniacki, Tomasz

    2017-11-15

    Rule-based modeling is a powerful approach for studying biomolecular site dynamics. Here, we present SPATKIN, a general-purpose simulator for rule-based modeling in two spatial dimensions. The simulation algorithm is a lattice-based method that tracks Brownian motion of individual molecules and the stochastic firing of rule-defined reaction events. Because rules are used as event generators, the algorithm is network-free, meaning that it does not require generation of the complete reaction network implied by the rules prior to simulation. In a simulation, each molecule (or complex of molecules) is taken to occupy a single lattice site that cannot be shared with another molecule (or complex). SPATKIN is capable of simulating a wide array of membrane-associated processes, including adsorption, desorption and crowding. Models are specified using an extension of the BioNetGen language, which allows spatial features of the simulated process to be accounted for. The C++ source code for SPATKIN is distributed freely under the terms of the GNU GPLv3 license. The source code can be compiled for execution on popular platforms (Windows, Mac and Linux). An installer for 64-bit Windows and a macOS app are available. The source code and precompiled binaries are available at the SPATKIN Web site (http://pmbm.ippt.pan.pl/software/spatkin). Supplementary data are available at Bioinformatics online.

  4. Model-Based GN and C Simulation and Flight Software Development for Orion Missions beyond LEO

    Science.gov (United States)

    Odegard, Ryan; Milenkovic, Zoran; Henry, Joel; Buttacoli, Michael

    2014-01-01

    For Orion missions beyond low Earth orbit (LEO), the Guidance, Navigation, and Control (GN&C) system is being developed using a model-based approach for simulation and flight software. Lessons learned from the development of GN&C algorithms and flight software for the Orion Exploration Flight Test One (EFT-1) vehicle have been applied to the development of further capabilities for Orion GN&C beyond EFT-1. Continuing the use of a Model-Based Development (MBD) approach with the Matlab®/Simulink® tool suite, the process for GN&C development and analysis has been largely improved. Furthermore, a model-based simulation environment in Simulink, rather than an external C-based simulation, greatly eases the process for development of flight algorithms. The benefits seen by employing lessons learned from EFT-1 are described, as well as the approach for implementing additional MBD techniques. Also detailed are the key enablers for improvements to the MBD process, including enhanced configuration management techniques for model-based software systems, automated code and artifact generation, and automated testing and integration.

  5. GIS based generation of dynamic hydrological and land patch simulation models for rural watershed areas

    Directory of Open Access Journals (Sweden)

    M. Varga

    2016-03-01

    Full Text Available This paper introduces a GIS-based methodology to generate a dynamic process model for the simulation-based analysis of a sensitive rural watershed. The Direct Computer Mapping (DCM) based solution starts from GIS layers and, via the graph interpretation and graphical edition of the process network, the expert interface is able to integrate the field experts' knowledge into the computer-aided generation of the simulation model. The methodology was applied and tested for the southern catchment basin of Lake Balaton, Hungary. In the simplified hydrological model, the GIS descriptions of nine watercourses, 121 water sections, 57 small lakes and 20 Lake Balaton compartments were mapped through the expert interface to the dynamic databases of the DCM model. The hydrological model involved precipitation, evaporation, transpiration, runoff and infiltration. The COoRdination of INformation on the Environment (CORINE) land cover based simplified "land patch" model considered the effect of meteorological and hydrological scenarios on freshwater resources in the land patches, rivers and lakes. The first results show that the applied model generation methodology helps to build complex models, which, after validation, can support the analysis of various land uses with consideration of environmental aspects.

  6. Knowledge representation requirements for model sharing between model-based reasoning and simulation in process flow domains

    Science.gov (United States)

    Throop, David R.

    1992-01-01

    The paper examines the requirements for the reuse of computational models employed in model-based reasoning (MBR) to support automated inference about mechanisms. Areas in which the theory of MBR is not yet completely adequate for using the information that simulations can yield are identified, and recent work in these areas is reviewed. It is argued that using MBR along with simulations forces the use of specific fault models. Fault models are used so that a particular fault can be instantiated into the model and run. This in turn implies that the component specification language needs to be capable of encoding any fault that might need to be sensed or diagnosed. It also means that the simulation code must anticipate all these faults at the component level.

  7. Echo simulation of lunar penetrating radar: based on a model of inhomogeneous multilayer lunar regolith structure

    International Nuclear Information System (INIS)

    Dai Shun; Su Yan; Xiao Yuan; Feng Jian-Qing; Xing Shu-Guo; Ding Chun-Yu

    2014-01-01

    Lunar Penetrating Radar (LPR), based on the time-domain Ultra-Wideband (UWB) technique and onboard China's Chang'e-3 (CE-3) rover, has the goal of investigating the lunar subsurface structure and detecting the depth of the lunar regolith. An inhomogeneous multi-layer microwave transfer inverse model is established. The dielectric constant of the lunar regolith, the velocity of propagation, the reflection, refraction and transmission at interfaces, and the resolution are discussed. The model is further used to numerically simulate and analyze temporal variations in the echo obtained from the LPR attached to CE-3's rover, to reveal the location and structure of the lunar regolith. The thickness of the lunar regolith is calculated by comparing the simulated radar B-scan images based on the model with the detected result taken from the CE-3 lunar mission. The potential scientific return from LPR echoes taken from the landing region is also discussed.
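
    The depth estimate underlying such an echo analysis follows from the two-way travel time in a dielectric layer, thickness = c · delay / (2 · sqrt(eps_r)); a small sketch with assumed values (the actual delays and dielectric constants are those derived in the paper):

        # Two-way travel time in a lossless dielectric layer:
        #   thickness = c * delay / (2 * sqrt(eps_r))
        # Values below are placeholders, not the CE-3 results.
        C = 299_792_458.0          # speed of light in vacuum, m/s

        def layer_thickness(two_way_delay_ns, eps_r):
            delay_s = two_way_delay_ns * 1e-9
            return C * delay_s / (2.0 * eps_r ** 0.5)

        print(layer_thickness(two_way_delay_ns=60.0, eps_r=3.0))  # ~5.2 m for these assumed inputs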

  8. Echo simulation of lunar penetrating radar: based on a model of inhomogeneous multilayer lunar regolith structure

    Science.gov (United States)

    Dai, Shun; Su, Yan; Xiao, Yuan; Feng, Jian-Qing; Xing, Shu-Guo; Ding, Chun-Yu

    2014-12-01

    Lunar Penetrating Radar (LPR), based on the time-domain Ultra-Wideband (UWB) technique and onboard China's Chang'e-3 (CE-3) rover, has the goal of investigating the lunar subsurface structure and detecting the depth of the lunar regolith. An inhomogeneous multi-layer microwave transfer inverse model is established. The dielectric constant of the lunar regolith, the velocity of propagation, the reflection, refraction and transmission at interfaces, and the resolution are discussed. The model is further used to numerically simulate and analyze temporal variations in the echo obtained from the LPR attached to CE-3's rover, to reveal the location and structure of the lunar regolith. The thickness of the lunar regolith is calculated by comparing the simulated radar B-scan images based on the model with the detected result taken from the CE-3 lunar mission. The potential scientific return from LPR echoes taken from the landing region is also discussed.

  9. Metoprolol Dose Equivalence in Adult Men and Women Based on Gender Differences: Pharmacokinetic Modeling and Simulations

    Directory of Open Access Journals (Sweden)

    Andy R. Eugene

    2016-11-01

    Full Text Available Recent meta-analyses and publications over the past 15 years have provided evidence that there are considerable gender differences in the pharmacokinetics of metoprolol. Over this time, no research articles have proposed a gender-stratified dose adjustment resulting in an equivalent total drug exposure. Metoprolol pharmacokinetic data were obtained from a previous publication and modeled with nonlinear mixed-effects modeling in the MONOLIX software package to quantify the metoprolol concentration–time data. Gender-stratified dosing simulations were conducted to identify the dose providing a total drug exposure equivalent to a 100 mg dose in adults. Based on the pharmacokinetic modeling and simulations, a 50 mg dose in adult women provides approximately the same metoprolol drug exposure as a 100 mg dose in adult men.
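
    The dose-equivalence argument rests on total exposure, AUC = F · Dose / CL for a linear model; a hedged sketch with purely illustrative clearance and bioavailability values (not the fitted MONOLIX estimates) shows how a halved dose can offset a halved clearance:

        # Linear one-compartment exposure: AUC = F * Dose / CL.
        # Clearance and bioavailability values are illustrative assumptions only.
        def auc(dose_mg, clearance_l_per_h, bioavailability=0.5):
            return bioavailability * dose_mg / clearance_l_per_h

        auc_men   = auc(dose_mg=100, clearance_l_per_h=60.0)   # assumed higher clearance
        auc_women = auc(dose_mg=50,  clearance_l_per_h=30.0)   # assumed ~half the clearance
        print(auc_men, auc_women)   # equal AUCs -> approximately equivalent exposure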

  10. Simulating Transport and Land Use Interdependencies for Strategic Urban Planning—An Agent Based Modelling Approach

    Directory of Open Access Journals (Sweden)

    Nam Huynh

    2015-10-01

    Full Text Available Agent-based modelling has been widely accepted as a promising tool for urban planning purposes thanks to its capability to provide sophisticated insights into the social behaviours and interdependencies that characterise urban systems. In this paper, we report on an agent-based model, called TransMob, which explicitly simulates the mutual dynamics between demographic evolution, transport demands, housing needs and the eventual change in the average satisfaction of the residents of an urban area. The ability to reproduce such dynamics is a unique feature not found in many comparable agent-based models in the literature. TransMob consists of six major modules: synthetic population, perceived liveability, travel diary assignment, traffic micro-simulator, residential location choice, and travel mode choice. TransMob is used to simulate the dynamics of a metropolitan area in the south-east of Sydney, Australia, in 2006 and 2011, with demographic evolution. The results compare favourably against survey data for the area in 2011, thereby validating the capability of TransMob to reproduce the observed complexity of an urban area. We also report on the application of TransMob to simulate various hypothetical scenarios of urban planning policies. We conclude with a discussion of the current limitations of TransMob, which serve as suggestions for future developments.

  11. Improving Simulations of Extreme Flows by Coupling a Physically-based Hydrologic Model with a Machine Learning Model

    Science.gov (United States)

    Mohammed, K.; Islam, A. S.; Khan, M. J. U.; Das, M. K.

    2017-12-01

    With the large number of hydrologic models presently available along with the global weather and geographic datasets, streamflows of almost any river in the world can be easily modeled. And if a reasonable amount of observed data from that river is available, then simulations of high accuracy can sometimes be performed after calibrating the model parameters against those observed data through inverse modeling. Although such calibrated models can succeed in simulating the general trend or mean of the observed flows very well, more often than not they fail to adequately simulate the extreme flows. This causes difficulty in tasks such as generating reliable projections of future changes in extreme flows due to climate change, which is obviously an important task due to floods and droughts being closely connected to people's lives and livelihoods. We propose an approach where the outputs of a physically-based hydrologic model are used as an input to a machine learning model to try and better simulate the extreme flows. To demonstrate this offline-coupling approach, the Soil and Water Assessment Tool (SWAT) was selected as the physically-based hydrologic model, the Artificial Neural Network (ANN) as the machine learning model and the Ganges-Brahmaputra-Meghna (GBM) river system as the study area. The GBM river system, located in South Asia, is the third largest in the world in terms of freshwater generated and forms the largest delta in the world. The flows of the GBM rivers were simulated separately in order to test the performance of this proposed approach in accurately simulating the extreme flows generated by different basins that vary in size, climate, hydrology and anthropogenic intervention on stream networks. Results show that by post-processing the simulated flows of the SWAT models with ANN models, simulations of extreme flows can be significantly improved. The mean absolute errors in simulating annual maximum/minimum daily flows were minimized from 4967
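
    The offline coupling amounts to regressing observed flows on the hydrologic model's simulated flows; a minimal sketch with scikit-learn, using synthetic arrays in place of SWAT output and gauge observations (the study trains ANN post-processors per basin on real data):

        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.model_selection import train_test_split

        # Synthetic stand-ins: swat_flow plays the role of SWAT-simulated daily flow,
        # obs_flow the observed gauge record with an assumed bias in the extremes.
        rng = np.random.default_rng(1)
        swat_flow = rng.gamma(shape=2.0, scale=500.0, size=2000)
        obs_flow = 1.2 * swat_flow ** 1.05 + rng.normal(0.0, 50.0, size=2000)

        X = swat_flow.reshape(-1, 1)
        X_train, X_test, y_train, y_test = train_test_split(X, obs_flow, test_size=0.3, random_state=0)

        # ANN post-processor mapping simulated flow to corrected flow
        ann = MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=2000, random_state=0)
        ann.fit(X_train, y_train)
        corrected = ann.predict(X_test)
        print("mean absolute error:", np.mean(np.abs(corrected - y_test)))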

  12. Full modelling of the MOSAIC animal PET system based on the GATE Monte Carlo simulation code

    International Nuclear Information System (INIS)

    Merheb, C; Petegnief, Y; Talbot, J N

    2007-01-01

    within 9%. For a 410-665 keV energy window, the measured sensitivity for a centred point source was 1.53% and mouse and rat scatter fractions were respectively 12.0% and 18.3%. The scattered photons produced outside the rat and mouse phantoms contributed to 24% and 36% of total simulated scattered coincidences. Simulated and measured single and prompt count rates agreed well for activities up to the electronic saturation at 110 MBq for the mouse and rat phantoms. Volumetric spatial resolution was 17.6 μL at the centre of the FOV with differences less than 6% between experimental and simulated spatial resolution values. The comprehensive evaluation of the Monte Carlo modelling of the Mosaic(TM) system demonstrates that the GATE package is adequately versatile and appropriate to accurately describe the response of an Anger logic based animal PET system

  13. A geographic information system-based 3D city estate modeling and simulation system

    Science.gov (United States)

    Chong, Xiaoli; Li, Sha

    2015-12-01

    This paper introduces a 3D city simulation system based on a geographic information system (GIS), covering all commercial housing in the city. A regional-scale, GIS-based approach is used to capture, describe, and track the geographical attributes of each house in the city. A sorting algorithm of "Benchmark + Parity Rate" is developed to cluster houses with similar spatial and construction attributes. The system is applicable to digital city modeling, city planning, housing evaluation, housing monitoring, and visualizing housing transactions. Finally, taking the Jingtian area of Shenzhen as an example, each of the 35,997 housing units in the area can be displayed, tagged, and easily tracked by the GIS-based city modeling and simulation system. The model matches real market conditions well and can be provided to house buyers as a reference.

  14. Extending simulation modeling to activity-based costing for clinical procedures.

    Science.gov (United States)

    Glick, N D; Blackmore, C C; Zelman, W N

    2000-04-01

    A simulation model was developed to measure costs in an Emergency Department setting for patients presenting with possible cervical-spine injury who needed radiological imaging. Simulation, a tool widely used to account for process variability but typically focused on utilization and throughput analysis, is being introduced here as a realistic means to perform an activity-based-costing (ABC) analysis, because traditional ABC methods have difficulty coping with process variation in healthcare. Though the study model has a very specific application, it can be generalized to other settings simply by changing the input parameters. In essence, simulation was found to be an accurate and viable means to conduct an ABC analysis; in fact, the output provides more complete information than could be achieved through other conventional analyses, which gives management more leverage with which to negotiate contractual reimbursements.

  15. 3D printed simulation models based on real patient situations for hands-on practice.

    Science.gov (United States)

    Kröger, E; Dekiff, M; Dirksen, D

    2017-11-01

    During the last few years, the curriculum of many dentistry schools in Germany has been reorganised. Two key aspects of the applied changes are the integration of up-to-date teaching methods and the promotion of interdisciplinarity. To support these efforts, an approach to fabricating individualised simulation models for hands-on courses employing 3D printing is presented. The models are based on real patients, thus providing students a more realistic preparation for real clinical situations. As a wide variety of dental procedures can be implemented, the simulation models can also contribute to a more interdisciplinary dental education. The data used for the construction of the models were acquired by 3D surface scanning. The data were further processed with 3D modelling software. Afterwards, the models were fabricated by 3D printing with the PolyJet technique. Three models serve as examples: a prosthodontic model for training veneer preparation, a conservative model for practicing dental bonding and an interdisciplinary model featuring carious teeth and an insufficient crown. The third model was evaluated in a hands-on course with 22 fourth-year dental students. The students answered a questionnaire and gave their personal opinion. Whilst the concept of the model received very positive feedback, some aspects of the implementation were criticised. We discuss these observations and suggest ways for further improvement. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  16. SAR Imagery Simulation of Ship Based on Electromagnetic Calculations and Sea Clutter Modelling for Classification Applications

    International Nuclear Information System (INIS)

    Ji, K F; Zhao, Z; Xing, X W; Zou, H X; Zhou, S L

    2014-01-01

    Ship detection and classification with space-borne SAR has many potential applications in maritime surveillance, fishery activity management, ship traffic monitoring, and military security. While ship detection techniques with SAR imagery are well established, ship classification is still an open issue. One of the main reasons may be ascribed to the difficulty of acquiring the required quantities of real data of vessels under different observation and environmental conditions with precise ground truth. Therefore, simulation of SAR images with high scenario flexibility and reasonable computation costs is essential for the development of ship classification algorithms. However, the simulation of SAR imagery of a ship over the sea surface is challenging. Though great efforts have been devoted to tackling this difficult problem, it is far from being solved. This paper proposes a novel scheme for SAR imagery simulation of a ship over the sea surface. The simulation is implemented based on the high-frequency electromagnetic calculation methods of PO, MEC, PTD and GO. SAR imagery of sea clutter is modelled by the representative K-distribution clutter model. The simulated SAR imagery of the ship can then be produced by inserting the simulated SAR imagery chips of the ship into the SAR imagery of sea clutter. The proposed scheme has been validated with canonical and complex ship targets over a typical sea scene.

  17. A measurement-based generalized source model for Monte Carlo dose simulations of CT scans.

    Science.gov (United States)

    Ming, Xin; Feng, Yuanming; Liu, Ransheng; Yang, Chengwen; Zhou, Li; Zhai, Hezheng; Deng, Jun

    2017-03-07

    The goal of this study is to develop a generalized source model for accurate Monte Carlo dose simulations of CT scans based solely on the measurement data without a priori knowledge of scanner specifications. The proposed generalized source model consists of an extended circular source located at x-ray target level with its energy spectrum, source distribution and fluence distribution derived from a set of measurement data conveniently available in the clinic. Specifically, the central axis percent depth dose (PDD) curves measured in water and the cone output factors measured in air were used to derive the energy spectrum and the source distribution respectively with a Levenberg-Marquardt algorithm. The in-air film measurement of fan-beam dose profiles at fixed gantry was back-projected to generate the fluence distribution of the source model. A benchmarked Monte Carlo user code was used to simulate the dose distributions in water with the developed source model as beam input. The feasibility and accuracy of the proposed source model were tested on GE LightSpeed and Philips Brilliance Big Bore multi-detector CT (MDCT) scanners available in our clinic. In general, the Monte Carlo simulations of the PDDs in water and dose profiles along lateral and longitudinal directions agreed with the measurements within 4%/1 mm for both CT scanners. The absolute dose comparison using two CTDI phantoms (16 cm and 32 cm in diameters) indicated a better than 5% agreement between the Monte Carlo-simulated and the ion chamber-measured doses at a variety of locations for the two scanners. Overall, this study demonstrated that a generalized source model can be constructed based only on a set of measurement data and used for accurate Monte Carlo dose simulations of patients' CT scans, which would facilitate patient-specific CT organ dose estimation and cancer risk management in the diagnostic and therapeutic radiology.
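
    The spectrum/PDD derivation step is a nonlinear least-squares problem; a simplified sketch fits a mono-exponential depth-dose shape with SciPy's Levenberg-Marquardt solver (the depth-dose points and the single-term model are illustrative, whereas the study derives a full energy spectrum):

        import numpy as np
        from scipy.optimize import least_squares

        # Illustrative measured PDD points (depth in cm, relative dose); not the study's data.
        depth = np.array([1.0, 3.0, 5.0, 8.0, 12.0, 16.0, 20.0])
        pdd_measured = np.array([100.0, 81.0, 66.0, 49.0, 33.0, 22.0, 15.0])

        def residuals(params):
            amplitude, mu = params                      # mu ~ effective attenuation coefficient
            pdd_model = amplitude * np.exp(-mu * depth)
            return pdd_model - pdd_measured

        fit = least_squares(residuals, x0=[100.0, 0.1], method="lm")   # Levenberg-Marquardt
        print("fitted attenuation coefficient:", fit.x[1])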

  18. A measurement-based generalized source model for Monte Carlo dose simulations of CT scans

    Science.gov (United States)

    Ming, Xin; Feng, Yuanming; Liu, Ransheng; Yang, Chengwen; Zhou, Li; Zhai, Hezheng; Deng, Jun

    2017-03-01

    The goal of this study is to develop a generalized source model for accurate Monte Carlo dose simulations of CT scans based solely on the measurement data without a priori knowledge of scanner specifications. The proposed generalized source model consists of an extended circular source located at x-ray target level with its energy spectrum, source distribution and fluence distribution derived from a set of measurement data conveniently available in the clinic. Specifically, the central axis percent depth dose (PDD) curves measured in water and the cone output factors measured in air were used to derive the energy spectrum and the source distribution respectively with a Levenberg-Marquardt algorithm. The in-air film measurement of fan-beam dose profiles at fixed gantry was back-projected to generate the fluence distribution of the source model. A benchmarked Monte Carlo user code was used to simulate the dose distributions in water with the developed source model as beam input. The feasibility and accuracy of the proposed source model were tested on GE LightSpeed and Philips Brilliance Big Bore multi-detector CT (MDCT) scanners available in our clinic. In general, the Monte Carlo simulations of the PDDs in water and dose profiles along lateral and longitudinal directions agreed with the measurements within 4%/1 mm for both CT scanners. The absolute dose comparison using two CTDI phantoms (16 cm and 32 cm in diameters) indicated a better than 5% agreement between the Monte Carlo-simulated and the ion chamber-measured doses at a variety of locations for the two scanners. Overall, this study demonstrated that a generalized source model can be constructed based only on a set of measurement data and used for accurate Monte Carlo dose simulations of patients’ CT scans, which would facilitate patient-specific CT organ dose estimation and cancer risk management in the diagnostic and therapeutic radiology.

  19. A Monte Carlo-based model for simulation of digital chest tomo-synthesis

    International Nuclear Information System (INIS)

    Ullman, G.; Dance, D. R.; Sandborg, M.; Carlsson, G. A.; Svalkvist, A.; Baath, M.

    2010-01-01

    The aim of this work was to calculate synthetic digital chest tomo-synthesis projections using a computer simulation model based on the Monte Carlo method. An anthropomorphic chest phantom was scanned in a computed tomography scanner, segmented and included in the computer model to allow for simulation of realistic high-resolution X-ray images. The input parameters to the model were adapted to correspond to the VolumeRAD chest tomo-synthesis system from GE Healthcare. Sixty tomo-synthesis projections were calculated with projection angles ranging from + 15 to -15 deg. The images from primary photons were calculated using an analytical model of the anti-scatter grid and a pre-calculated detector response function. The contributions from scattered photons were calculated using an in-house Monte Carlo-based model employing a number of variance reduction techniques such as the collision density estimator. Tomographic section images were reconstructed by transferring the simulated projections into the VolumeRAD system. The reconstruction was performed for three types of images using: (i) noise-free primary projections, (ii) primary projections including contributions from scattered photons and (iii) projections as in (ii) with added correlated noise. The simulated section images were compared with corresponding section images from projections taken with the real, anthropomorphic phantom from which the digital voxel phantom was originally created. The present article describes a work in progress aiming towards developing a model intended for optimisation of chest tomo-synthesis, allowing for simulation of both existing and future chest tomo-synthesis systems. (authors)

  20. Comparison of nonstationary generalized logistic models based on Monte Carlo simulation

    Directory of Open Access Journals (Sweden)

    S. Kim

    2015-06-01

    Full Text Available Recently, evidence of climate change has been observed in hydrologic data such as rainfall and streamflow records. The time-dependent characteristics of the statistics of hydrologic data are widely referred to as nonstationarity. Accordingly, various nonstationary GEV and generalized Pareto models have been suggested for frequency analysis of nonstationary annual maximum and POT (peak-over-threshold) data, respectively. However, alternative models are required for nonstationary frequency analysis because of the complex characteristics of nonstationary data under climate change. This study proposes a nonstationary generalized logistic model including time-dependent parameters. The parameters of the proposed model are estimated by the method of maximum likelihood based on the Newton-Raphson method. In addition, the proposed model is compared with other models by Monte Carlo simulation to investigate their characteristics and applicability.
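
    The time-dependent parameterization can be illustrated with a simplified nonstationary logistic distribution whose location varies linearly in time, mu(t) = mu0 + mu1·t, fitted by maximum likelihood; the generalized logistic used in the paper adds a shape parameter, and the paper uses a Newton-Raphson scheme rather than the derivative-free solver used here:

        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import logistic

        # Synthetic annual maxima with an assumed linear trend in the location parameter.
        years = np.arange(50)
        data = logistic.rvs(loc=100.0 + 0.8 * years, scale=15.0, random_state=2)

        def negative_log_likelihood(params):
            mu0, mu1, scale = params
            if scale <= 0:
                return np.inf
            return -np.sum(logistic.logpdf(data, loc=mu0 + mu1 * years, scale=scale))

        fit = minimize(negative_log_likelihood, x0=[np.mean(data), 0.0, np.std(data)],
                       method="Nelder-Mead")
        print("estimated trend in location:", fit.x[1])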

  1. Agent-based modeling and simulation of clean heating system adoption in Norway

    Energy Technology Data Exchange (ETDEWEB)

    Sopha, Bertha Maya

    2011-03-15

    A sound climate policy encouraging clean energy investment is important for mitigating global warming. Previous research has demonstrated that consumer choice plays an important role in the adoption of sustainable technologies. This thesis strives to gain a better understanding of consumers' decision-making on heating systems and to explore the potential of agent-based modeling (ABM) for exploring the mechanisms underlying adoption, with heating system adoption by Norwegian households taken up as a case study. An interdisciplinary approach, applying various established theories including those of psychology, is used to create a model of consumer behavior and implement this behavior in an agent-based model to simulate heating technology diffusion. A mail survey, carried out in autumn 2008, was used to collect information for parameterizing the agent-based model, to gain empirical facts, and to validate the developed model at the micro-level. The survey sample consisted of 1500 Norwegian households drawn from the population register and 1500 wood pellet users in Norway. The response rates were 10.3% and 34.6% for the population sample and the wood pellet sample, respectively. The study is divided into two parts: empirical analysis and agent-based simulation. The empirical analysis aims at fully understanding the important aspects of the adoption decision and their implications, in order to inform the simulation. The analysis particularly contributes to the identification of differences and similarities between adopters and non-adopters of wood pellet heating with respect to some key points of adoption derived from different theories, the psychological factors underlying the adoption decision for wood pellet heating, and the rationales underlying Norwegian households' decisions regarding their future heating system. The simulation study aims at exploring the mechanism of heterogeneous household decision-making giving rise to the diffusion of heating systems, and

  2. An Exospheric Temperature Model Based On CHAMP Observations and TIEGCM Simulations

    Science.gov (United States)

    Ruan, Haibing; Lei, Jiuhou; Dou, Xiankang; Liu, Siqing; Aa, Ercha

    2018-02-01

    In this work, thermospheric densities from the accelerometer measurements on board the CHAMP satellite during 2002-2009 and simulations from the National Center for Atmospheric Research Thermosphere Ionosphere Electrodynamics General Circulation Model (NCAR-TIEGCM) are employed to develop an empirical exospheric temperature model (ETM). The two-dimensional basis functions of the ETM are first obtained from a principal component analysis of the TIEGCM simulations. Based on the exospheric temperatures derived from CHAMP thermospheric densities, a global distribution of exospheric temperatures is reconstructed. A parameterization is conducted for each basis function amplitude as a function of solar-geophysical and seasonal conditions. Thus, the ETM can be utilized to model the thermospheric temperature and mass density under specified conditions. Our results show that the average standard deviation of the ETM is generally less than 10%, compared with approximately 30% for the MSIS model. In addition, the ETM reproduces global thermospheric features including the equatorial thermosphere anomaly.

  3. A Simulation-Based Dynamic Stochastic Route Choice Model for Evacuation

    Directory of Open Access Journals (Sweden)

    Xing Zhao

    2012-01-01

    Full Text Available This paper establishes a dynamic stochastic route choice model for evacuation to simulate the propagation process of traffic flow and estimate stochastic route choice under evacuation situations. The model contains a lane-group-based cell transmission model (CTM), which sets different outflow capacities for links with different turning movements in an evacuation situation; an actual impedance model, which obtains the impedance of each route in time units at each time interval; and a stochastic route choice model based on probit-based stochastic user equilibrium. In this model, vehicles loading at each origin at each time interval are assumed to choose an evacuation route under a given road network, signal design, and OD demand. As a case study, the proposed model is validated on the network near the Nanjing Olympic Center after the opening ceremony of the 10th National Games of the People's Republic of China. The traffic volumes and clearing times at five exit points of the evacuation zone are calculated by the model and compared with survey data. The results show that this model can appropriately simulate the dynamic route choice and the evolution process of the traffic flow on the network in an evacuation situation.
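
    The propagation step behind such a model can be sketched with the standard cell transmission sending/receiving rule on a single link (illustrative cell parameters, without the lane-group and turning-movement refinements of the paper):

        import numpy as np

        # Standard CTM update: flow into cell i+1 is limited by what cell i can send
        # and what cell i+1 can receive. All parameters below are illustrative.
        N_CELLS, STEPS = 10, 60
        capacity = 20.0            # max vehicles per time step (Q)
        jam_density = 100.0        # max vehicles per cell (N)
        wave_ratio = 0.5           # backward/forward wave speed ratio (w / v)

        n = np.zeros(N_CELLS)      # vehicles in each cell
        demand_in = 15.0           # vehicles arriving at the first cell per step

        for _ in range(STEPS):
            sending = np.minimum(n, capacity)
            receiving = np.minimum(capacity, wave_ratio * (jam_density - n))
            flow = np.minimum(sending[:-1], receiving[1:])          # inter-cell flows
            inflow = min(demand_in, receiving[0])
            n[0] += inflow - flow[0]
            n[1:-1] += flow[:-1] - flow[1:]
            n[-1] += flow[-1] - sending[-1]                         # free outflow at the exit
        print(np.round(n, 1))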

  4. Simulation of spatiotemporal CT data sets using a 4D MRI-based lung motion model.

    Science.gov (United States)

    Marx, Mirko; Ehrhardt, Jan; Werner, René; Schlemmer, Heinz-Peter; Handels, Heinz

    2014-05-01

    Four-dimensional CT imaging is widely used to account for motion-related effects during radiotherapy planning of lung cancer patients. However, 4D CT often contains motion artifacts, cannot be used to measure motion variability, and leads to higher dose exposure. In this article, we propose using 4D MRI to acquire motion information for the radiotherapy planning process. From the 4D MRI images, we derive a time-continuous model of the average patient-specific respiratory motion, which is then applied to simulate 4D CT data based on a static 3D CT. The idea of the motion model is to represent the average lung motion over a respiratory cycle by cyclic B-spline curves. The model generation consists of motion field estimation in the 4D MRI data by nonlinear registration, assigning respiratory phases to the motion fields, and applying a B-spline approximation on a voxel-by-voxel basis to describe the average voxel motion over a breathing cycle. To simulate a patient-specific 4D CT based on a static CT of the patient, a multi-modal registration strategy is introduced to transfer the motion model from MRI to the static CT coordinates. Differences between model-based estimated and measured motion vectors are on average 1.39 mm for amplitude-based binning of the 4D MRI data of three patients. In addition, the MRI-to-CT registration strategy is shown to be suitable for the model transformation. The application of our 4D MRI-based motion model for simulating 4D CT images provides advantages over standard 4D CT (less motion artifacts, radiation-free). This makes it interesting for radiotherapy planning.
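
    The per-voxel cyclic motion representation can be sketched with a periodic cubic spline over respiratory phase (synthetic displacement samples; the study fits cyclic B-spline curves to registration-derived motion fields for every voxel):

        import numpy as np
        from scipy.interpolate import CubicSpline

        # Displacement of one voxel (mm) sampled at respiratory phases over one cycle.
        # Synthetic values; the periodic boundary condition requires equal first/last samples.
        phase = np.linspace(0.0, 1.0, 9)                      # phase 0 and 1 coincide
        displacement = np.array([0.0, 2.1, 5.0, 7.8, 9.0, 7.5, 4.8, 2.0, 0.0])

        motion_curve = CubicSpline(phase, displacement, bc_type="periodic")

        # Evaluate the time-continuous model at an arbitrary phase, e.g. mid-exhale.
        print(float(motion_curve(0.37)))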

  5. A functional model for simulator based training in the pacific basin

    International Nuclear Information System (INIS)

    Lam, K.; MacBeth, M.J.

    1998-01-01

    According to expert estimates, the nuclear power installed capacity in the Pacific Basin region may reach 20 GWe by the year 2010. Facing phenomenal growth in nuclear power development in the region, the development of high-quality nuclear human resources for 'nuclear power ready' developing countries in the Pacific Basin is an important issue at this time. This paper recommends a timely and cost-effective functional training model to the Pacific Basin countries. The model utilizes high-quality simulation executed on low-cost and readily available PCs to deliver desktop simulator based training programs, as an efficient and economical complement to full scope simulators, which may not be available for initial training until five years after the NPP project has started. The objective is to ensure the goals of self-reliance and the transfer of necessary NPP knowledge at the onset of the project, and to build up a technological infrastructure in areas vital for subsequent technical support of the NPP in design, commissioning, and operator training: comprehension of control systems; familiarization with plant responses to accident conditions; man machine interface (MMI) functions and interactions; early guidance for commissioning and operating procedures; presentations to safety reviewers, etc. An example of this model is demonstrated with the use of (1) the CANDU 9 (CANada Deuterium Uranium 900 MW Pressurized Heavy Water Reactor) desktop nuclear simulator and (2) CASSIM (CASsiopeia SIMulation development system). (author)

  6. An open, object-based modeling approach for simulating subsurface heterogeneity

    Science.gov (United States)

    Bennett, J.; Ross, M.; Haslauer, C. P.; Cirpka, O. A.

    2017-12-01

    Characterization of subsurface heterogeneity with respect to hydraulic and geochemical properties is critical in hydrogeology as their spatial distribution controls groundwater flow and solute transport. Many approaches of characterizing subsurface heterogeneity do not account for well-established geological concepts about the deposition of the aquifer materials; those that do (i.e. process-based methods) often require forcing parameters that are difficult to derive from site observations. We have developed a new method for simulating subsurface heterogeneity that honors concepts of sequence stratigraphy, resolves fine-scale heterogeneity and anisotropy of distributed parameters, and resembles observed sedimentary deposits. The method implements a multi-scale hierarchical facies modeling framework based on architectural element analysis, with larger features composed of smaller sub-units. The Hydrogeological Virtual Reality simulator (HYVR) simulates distributed parameter models using an object-based approach. Input parameters are derived from observations of stratigraphic morphology in sequence type-sections. Simulation outputs can be used for generic simulations of groundwater flow and solute transport, and for the generation of three-dimensional training images needed in applications of multiple-point geostatistics. The HYVR algorithm is flexible and easy to customize. The algorithm was written in the open-source programming language Python, and is intended to form a code base for hydrogeological researchers, as well as a platform that can be further developed to suit investigators' individual needs. This presentation will encompass the conceptual background and computational methods of the HYVR algorithm, the derivation of input parameters from site characterization, and the results of groundwater flow and solute transport simulations in different depositional settings.

  7. PhreeqcRM: A reaction module for transport simulators based on the geochemical model PHREEQC

    Science.gov (United States)

    Parkhurst, David L.; Wissmeier, Laurin

    2015-01-01

    PhreeqcRM is a geochemical reaction module designed specifically to perform equilibrium and kinetic reaction calculations for reactive transport simulators that use an operator-splitting approach. The basic function of the reaction module is to take component concentrations from the model cells of the transport simulator, run geochemical reactions, and return updated component concentrations to the transport simulator. If multicomponent diffusion is modeled (e.g., Nernst–Planck equation), then aqueous species concentrations can be used instead of component concentrations. The reaction capabilities are a complete implementation of the reaction capabilities of PHREEQC. In each cell, the reaction module maintains the composition of all of the reactants, which may include minerals, exchangers, surface complexers, gas phases, solid solutions, and user-defined kinetic reactants. PhreeqcRM assigns initial and boundary conditions for model cells based on standard PHREEQC input definitions (files or strings) of chemical compositions of solutions and reactants. Additional PhreeqcRM capabilities include methods to eliminate reaction calculations for inactive parts of a model domain, transfer concentrations and other model properties, and retrieve selected results. The module demonstrates good scalability for parallel processing by using multiprocessing with MPI (message passing interface) on distributed memory systems, and limited scalability using multithreading with OpenMP on shared memory systems. PhreeqcRM is written in C++, but interfaces allow methods to be called from C or Fortran. By using the PhreeqcRM reaction module, an existing multicomponent transport simulator can be extended to simulate a wide range of geochemical reactions. Results of the implementation of PhreeqcRM as the reaction engine for transport simulators PHAST and FEFLOW are shown by using an analytical solution and the reactive transport benchmark of MoMaS.
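
    The operator-splitting pattern the module serves can be sketched generically: each time step, the transport code moves concentrations, then hands each cell's concentrations to a reaction routine and takes the updated values back. The reaction step below is a placeholder standing in for the reaction module; it is not the PhreeqcRM interface:

        import numpy as np

        # One-dimensional advective transport coupled to a per-cell reaction step by
        # operator splitting. react_cell() is a placeholder, NOT a PhreeqcRM call.
        N_CELLS, STEPS = 50, 200
        courant = 0.5                               # fraction advected per step
        conc = np.zeros(N_CELLS)                    # a single component, for illustration
        inlet_conc = 1.0

        def react_cell(c):
            # placeholder first-order decay "reaction"
            return c * np.exp(-0.01)

        for _ in range(STEPS):
            # transport step (upwind advection)
            conc[1:] = conc[1:] + courant * (conc[:-1] - conc[1:])
            conc[0] = conc[0] + courant * (inlet_conc - conc[0])
            # reaction step: hand each cell's concentration to the reaction routine
            conc = np.array([react_cell(c) for c in conc])

        print(np.round(conc[:10], 3))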

  8. Numerical Simulation of Recycled Concrete Using Convex Aggregate Model and Base Force Element Method

    Directory of Open Access Journals (Sweden)

    Yijiang Peng

    2016-01-01

    Full Text Available By using the Base Force Element Method (BFEM) on the potential energy principle, a new numerical concrete model, the random convex aggregate model, is presented in this paper to simulate the experiment under uniaxial compression for recycled aggregate concrete (RAC), which can also be referred to as recycled concrete. This model is considered as a heterogeneous composite composed of five mediums: natural coarse aggregate, old mortar, new mortar, new interfacial transition zone (ITZ), and old ITZ. In order to simulate the damage processes of RAC, a curve damage model was adopted as the damage constitutive model and the strength theory of maximum tensile strain was used as the failure criterion in the BFEM on mesomechanics. The numerical results obtained in this paper, which include the uniaxial compressive strength, size effects on strength, and damage processes of RAC, are in agreement with experimental observations. This work shows that the random convex aggregate model and the BFEM with the curve damage model can be used to simulate the relationship between the microstructure and the mechanical properties of RAC.

  9. Transport simulations TFTR: Theoretically-based transport models and current scaling

    International Nuclear Information System (INIS)

    Redi, M.H.; Cummings, J.C.; Bush, C.E.; Fredrickson, E.; Grek, B.; Hahm, T.S.; Hill, K.W.; Johnson, D.W.; Mansfield, D.K.; Park, H.; Scott, S.D.; Stratton, B.C.; Synakowski, E.J.; Tang, W.M.; Taylor, G.

    1991-12-01

    In order to study the microscopic physics underlying observed L-mode current scaling, 1-1/2-d BALDUR has been used to simulate density and temperature profiles for high and low current, neutral beam heated discharges on TFTR with several semi-empirical, theoretically-based models previously compared for TFTR, including several versions of trapped electron drift wave driven transport. Experiments at TFTR, JET and D3-D show that I_p scaling of τ_E does not arise from edge modes as previously thought, and is most likely to arise from nonlocal processes or from the I_p-dependence of local plasma core transport. Consistent with this, it is found that strong current scaling does not arise from any of several edge models of resistive ballooning. Simulations with the profile consistent drift wave model and with a new model for toroidal collisionless trapped electron mode core transport in a multimode formalism lead to strong current scaling of τ_E for the L-mode cases on TFTR. None of the theoretically-based models succeeded in simulating the measured temperature and density profiles for both high and low current experiments.

  10. The Simulation of Financial Markets by Agent-Based Mix-Game Models

    OpenAIRE

    Chengling Gou

    2006-01-01

    This paper studies the simulation of financial markets using an agent-based mix-game model which is a variant of the minority game (MG). It specifies the spectra of parameters of mix-game models that fit financial markets by investigating the dynamic behaviors of mix-game models under a wide range of parameters. The main findings are (a) in order to approach efficiency, agents in a real financial market must be heterogeneous, boundedly rational and subject to asymmetric information; (b) an ac...

  11. The Simulation of Financial Markets by an Agent-Based Mix-Game Model

    OpenAIRE

    Chengling Gou

    2006-01-01

    This paper studies the simulation of financial markets using an agent-based mix-game model which is a variant of the minority game (MG). It specifies the spectra of parameters of mix-game models that fit financial markets by investigating the dynamic behaviors of mix-game models under a wide range of parameters. The main findings are (a) in order to approach efficiency, agents in a real financial market must be heterogeneous, boundedly rational and subject to asymmetric information; (b) an ac...

  12. An Agent-Based Model of New Venture Creation: Conceptual Design for Simulating Entrepreneurship

    Science.gov (United States)

    Provance, Mike; Collins, Andrew; Carayannis, Elias

    2012-01-01

    There is a growing debate over the means by which regions can foster the growth of entrepreneurial activity in order to stimulate recovery and growth of their economies. On one side, agglomeration theory suggests that regions grow because of strong clusters that foster knowledge spillover locally; on the other side, the entrepreneurial action camp argues that innovative business models are generated by entrepreneurs with unique market perspectives who draw on knowledge from more distant domains. We present the design of a novel agent-based model of new venture creation that demonstrates the relationship between agglomeration and action. The primary focus of this model is information exchange as the medium for agent interactions. Our modeling and simulation study proposes to reveal interesting relationships between these perspectives, offer a foundation on which these disparate theories from economics and sociology can find common ground, and expand the use of agent-based modeling into entrepreneurship research.

  13. Coupling process-based models and plant architectural models: A key issue for simulating crop production

    NARCIS (Netherlands)

    Reffye, de P.; Heuvelink, E.; Guo, Y.; Hu, B.G.; Zhang, B.G.

    2009-01-01

    Process-Based Models (PBMs) can successfully predict the impact of environmental factors (temperature, light, CO2, water and nutrients) on crop growth and yield. These models are used widely for yield prediction and optimization of water and nutrient supplies. Nevertheless, PBMs do not consider

  14. An Agent-Based Modeling Framework for Simulating Human Exposure to Environmental Stresses in Urban Areas

    Directory of Open Access Journals (Sweden)

    Liang Emlyn Yang

    2018-04-01

    Full Text Available Several approaches have been used to assess potential human exposure to environmental stresses and achieve optimal results under various conditions, such as different scales, groups of people, or points in time. A thorough literature review in this paper identifies the research gap regarding modeling approaches for assessing human exposure to environmental stressors, and it indicates that microsimulation tools, in which each person is simulated individually and continuously, are becoming increasingly important in human exposure assessments of urban environments. The paper further describes an agent-based model (ABM) framework that can dynamically simulate human exposure levels, along with daily activities, in urban areas that are characterized by environmental stresses such as air pollution and heat stress. Within the framework, decision-making processes can be included for each individual based on rule-based behavior in order to achieve goals under changing environmental conditions. The ideas described in this paper are implemented on the free and open-source NetLogo platform. A basic modeling scenario of the ABM framework in Hamburg, Germany, demonstrates its utility in various urban environments and individual activity patterns, as well as its portability to other models, programs, and frameworks. The prototype model can potentially be extended to support environmental incident management by exploring the daily routines of different groups of citizens and comparing the effectiveness of different strategies. Further research is needed to fully develop an operational version of the model.

  15. A New Hybrid Viscoelastic Soft Tissue Model based on Meshless Method for Haptic Surgical Simulation

    Science.gov (United States)

    Bao, Yidong; Wu, Dongmei; Yan, Zhiyuan; Du, Zhijiang

    2013-01-01

    This paper proposes a hybrid soft tissue model, consisting of a multilayer structure and many spheres, for a meshless-method-based haptic surgical simulation system. To improve the accuracy of the model, tension is added to the three-parameter viscoelastic structure that connects the two spheres. Driven through a haptic device, the three-parameter viscoelastic model (TPM) produces accurate deformation and also has better stress-strain, stress relaxation and creep properties. Stress relaxation and creep formulas have been obtained by mathematical derivation. Compared with the experimental results for real pig liver reported by Evren et al. and Amy et al., the stress-strain, stress relaxation and creep curves of the TPM are close to the experimental data of the real liver. Simulation results show that the TPM offers good real-time performance, stability and accuracy. PMID:24339837
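
    One common form of a three-parameter (standard-linear-solid) element relaxes as a single exponential under a step strain, sigma(t) = e0 · (E1 + E2 · exp(-t/tau)) with tau = eta/E2; a small sketch with assumed spring and damper constants (not the fitted liver parameters, and not necessarily the exact element used in the paper):

        import numpy as np

        # Standard-linear-solid stress relaxation under a step strain e0.
        # Constants below are illustrative assumptions only.
        E1, E2, eta = 2.0e3, 5.0e3, 1.0e4     # Pa, Pa, Pa*s
        e0 = 0.1                              # step strain
        tau = eta / E2

        t = np.linspace(0.0, 10.0, 6)
        sigma = e0 * (E1 + E2 * np.exp(-t / tau))
        print(np.round(sigma, 1))             # relaxes from e0*(E1+E2) towards e0*E1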

  16. MULTI AGENT-BASED ENVIRONMENTAL LANDSCAPE (MABEL) - AN ARTIFICIAL INTELLIGENCE SIMULATION MODEL: SOME EARLY ASSESSMENTS

    OpenAIRE

    Alexandridis, Konstantinos T.; Pijanowski, Bryan C.

    2002-01-01

    The Multi Agent-Based Environmental Landscape model (MABEL) introduces a Distributed Artificial Intelligence (DAI) systemic methodology, to simulate land use and transformation changes over time and space. Computational agents represent abstract relations among geographic, environmental, human and socio-economic variables, with respect to land transformation pattern changes. A multi-agent environment is developed providing task-nonspecific problem-solving abilities, flexibility on achieving g...

  17. A model partitioning method based on dynamic decoupling for the efficient simulation of multibody systems

    Energy Technology Data Exchange (ETDEWEB)

    Papadopoulos, Alessandro Vittorio, E-mail: alessandro.papadopoulos@control.lth.se [Lund University, Department of Automatic Control (Sweden); Leva, Alberto, E-mail: alberto.leva@polimi.it [Politecnico di Milano, Dipartimento di Elettronica, Informazione e Bioingegneria (Italy)

    2015-06-15

    The presence of different time scales in a dynamic model significantly hampers the efficiency of its simulation. In multibody systems the fact is particularly relevant, as the mentioned time scales may be very different, due, for example, to the coexistence of mechanical components controlled by electronic drive units, and may also appear in conjunction with significant nonlinearities. This paper proposes a systematic technique, based on the principles of dynamic decoupling, to partition a model based on the time scales that are relevant for the particular simulation studies to be performed and as transparently as possible for the user. In accordance with said purpose, peculiar to the technique is its neat separation into two parts: a structural analysis of the model, which is general with respect to any possible simulation scenario, and a subsequent decoupled integration, which can conversely be (easily) tailored to the study at hand. Also, since the technique does not aim at reducing but rather at partitioning the model, the state space and the physical interpretation of the dynamic variables are inherently preserved. Moreover, the proposed analysis allows us to define some novel indices relative to the separability of the system, thereby extending the idea of “stiffness” in a way that is particularly suited to its use for improving simulation efficiency, whether the envisaged integration scheme is monolithic, parallel, or even based on cosimulation. Finally, thanks to the way the analysis phase is conceived, the technique is naturally applicable to both linear and nonlinear models. The paper contains a methodological presentation of the proposed technique, which is related to alternatives available in the literature so as to highlight the peculiarities just sketched, and some application examples illustrating the achieved advantages and motivating the major design choice from an operational viewpoint.

  18. Simulating GenCo bidding strategies in electricity markets with an agent-based model

    International Nuclear Information System (INIS)

    Botterud, Audun; Thimmapuram, Prakash R.; Yamakado, Malo

    2005-01-01

    In this paper we use an agent-based simulation model, EMCAS, to analyze market power in electricity markets. We focus on the effect of congestion management on the ability of generating companies (GenCos) to raise prices beyond competitive levels. An 11-node test power system is used to compare a market design based on locational marginal pricing with a market design that uses system marginal pricing and congestion management by counter trading. Bidding strategies based on both physical and economic withholding are compared to a base case with production cost bidding. The results show that unilateral market power is exercised under both pricing mechanisms. However, the largest changes in consumer costs and GenCo profits due to strategic bidding occur under the locational marginal pricing scheme. The analysis also illustrates that agent-based modeling can contribute important insights into the complex interactions between the participants in transmission-constrained electricity markets. (Author)

  19. Hedging Rules for Water Supply Reservoir Based on the Model of Simulation and Optimization

    Directory of Open Access Journals (Sweden)

    Yi Ji

    2016-06-01

    Full Text Available This study proposes a hedging rule model which is composed of a two-period reservoir operation model considering the damage depth and a hedging rule parameter optimization model. The former solves the hedging rules based on a given period's water supply weighting factor and carryover storage target, while the latter optimization model is used to optimize the weighting factor and carryover storage target based on the hedging rules. The coupled model gives the optimal period's water supply weighting factor and carryover storage target to guide release. The conclusions from this study are as follows: (1) the water supply weighting factor and carryover storage target have a direct impact on the three elements of the hedging rule; (2) after optimization with the simulation and optimization model, the parameters can guide reservoirs to supply water reasonably; and (3) in order to verify the utility of the hedging rule, the Heiquan reservoir is used as a case study and a particle swarm optimization algorithm with a simulation model is adopted to optimize the parameters. The results show that the proposed hedging rule can improve the operational performance of the water supply reservoir.
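
    The parameter search can be illustrated with a tiny particle swarm optimization over the two decision variables (weighting factor and carryover storage target); the objective below is a stand-in quadratic rather than the actual reservoir simulation model:

        import numpy as np

        rng = np.random.default_rng(3)

        def objective(params):
            # Stand-in for the simulation model's performance metric (to be minimized);
            # a real study would run the two-period reservoir operation model here.
            weighting_factor, carryover_target = params
            return (weighting_factor - 0.6) ** 2 + (carryover_target - 0.4) ** 2

        # Minimal particle swarm optimization over the unit square.
        n_particles, n_iter = 20, 100
        pos = rng.uniform(0.0, 1.0, size=(n_particles, 2))
        vel = np.zeros_like(pos)
        p_best = pos.copy()
        p_best_val = np.array([objective(p) for p in pos])
        g_best = p_best[np.argmin(p_best_val)].copy()

        for _ in range(n_iter):
            r1, r2 = rng.uniform(size=(2, n_particles, 1))
            vel = 0.7 * vel + 1.5 * r1 * (p_best - pos) + 1.5 * r2 * (g_best - pos)
            pos = np.clip(pos + vel, 0.0, 1.0)
            values = np.array([objective(p) for p in pos])
            improved = values < p_best_val
            p_best[improved], p_best_val[improved] = pos[improved], values[improved]
            g_best = p_best[np.argmin(p_best_val)].copy()

        print("optimal weighting factor, carryover target:", np.round(g_best, 3))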

  20. Study of visualized simulation and analysis of nuclear fuel cycle system based on multilevel flow model

    International Nuclear Information System (INIS)

    Liu Jingquan; Yoshikawa, H.; Zhou Yangping

    2005-01-01

    Complex energy and environment systems, especially the nuclear fuel cycle system, have recently raised social concerns about the issues of economic competitiveness, environmental effects and nuclear proliferation. Only when a consensus on those conflicting issues is reached among stakeholders with different knowledge backgrounds can the nuclear power industry continue to develop. In this paper, a new analysis platform has been developed to help stakeholders recognize and analyze various socio-technical issues in the nuclear fuel cycle system based on the functional modeling method named Multilevel Flow Models (MFM), in accordance with human cognition theory. Its distinguishing feature is that MFM models define a set of mass, energy and information flow structures on multiple levels of abstraction to describe the functional structure of a process system, and their graphical symbol representation and means-end and part-whole hierarchical flow structures make the represented process easy to understand. Based upon this methodology, a micro-process and a macro-process of the nuclear fuel cycle system were selected to be simulated, and some analysis processes such as economic analysis, environmental analysis and energy balance analysis related to those flows were also integrated to help stakeholders understand the process of decision-making, with the introduction of some new functions to the improved Multilevel Flow Models Studio. Finally, simple simulations such as the spent fuel management process and the money flow of the nuclear fuel cycle with its levelised cost analysis are presented as feasible examples. (authors)

  1. Monte Carlo simulation as a tool to predict blasting fragmentation based on the Kuz Ram model

    Science.gov (United States)

    Morin, Mario A.; Ficarazzo, Francesco

    2006-04-01

    Rock fragmentation is considered the most important aspect of production blasting because of its direct effects on the costs of drilling and blasting and on the economics of the subsequent operations of loading, hauling and crushing. Over the past three decades, significant progress has been made in the development of new technologies for blasting applications. These technologies include increasingly sophisticated computer models for blast design and blast performance prediction. Rock fragmentation depends on many variables such as rock mass properties, site geology, in situ fracturing and blasting parameters, and as such has no complete theoretical solution for its prediction. However, empirical models for the estimation of the size distribution of rock fragments have been developed. In this study, a Monte Carlo-based blast fragmentation simulator, based on the Kuz-Ram fragmentation model, has been developed to predict the entire fragmentation size distribution, taking into account intact and jointed rock properties, the type and properties of explosives, and the drilling pattern. Results produced by this simulator were quite favorable when compared with real fragmentation data obtained from a quarry blast. It is anticipated that the use of Monte Carlo simulation will increase our understanding of the effects of rock mass and explosive properties on rock fragmentation by blasting, as well as increase our confidence in these empirical models. This understanding will translate into improvements in blasting operations, their corresponding costs and the overall economics of open pit mines and rock quarries.
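
    A minimal sketch of the approach described in the abstract might look as follows: the Kuznetsov equation gives the mean fragment size, a Rosin-Rammler curve gives the full size distribution, and Monte Carlo sampling propagates uncertainty in the blast-design inputs. The input distributions and parameter values are illustrative assumptions, not data from the cited quarry blast.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def kuznetsov_mean_size(A, K, Q, S_anfo=100.0):
        """Kuznetsov mean fragment size X_m [cm]; A: rock factor, K: powder factor
        [kg/m^3], Q: explosive mass per hole [kg], S_anfo: relative weight strength."""
        return A * K**(-0.8) * Q**(1 / 6) * (115.0 / S_anfo)**(19.0 / 20.0)

    def rosin_rammler_passing(x, x_m, n):
        """Fraction passing screen size x for a Rosin-Rammler distribution with
        mean size x_m and uniformity index n (Kuz-Ram convention)."""
        return 1.0 - np.exp(-0.693 * (x / x_m)**n)

    # Monte Carlo over uncertain inputs (illustrative distributions, not measured data)
    n_trials = 10_000
    A = rng.normal(7.0, 1.0, n_trials)        # rock factor
    K = rng.normal(0.55, 0.05, n_trials)      # powder factor [kg/m^3]
    Q = rng.normal(120.0, 10.0, n_trials)     # charge per hole [kg]
    n_uni = rng.normal(1.4, 0.15, n_trials)   # uniformity index

    x_m = kuznetsov_mean_size(A, K, Q)
    sizes = np.array([5.0, 10.0, 20.0, 40.0, 80.0])   # screen sizes [cm]
    passing = np.array([rosin_rammler_passing(sizes, xm, n) for xm, n in zip(x_m, n_uni)])

    print("mean X_m = %.1f cm" % x_m.mean())
    for s, p in zip(sizes, passing.mean(axis=0)):
        print("P(< %4.0f cm) = %.2f" % (s, p))
    ```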

  2. Study of visualized simulation and analysis of nuclear fuel cycle system based on multilevel flow model

    Institute of Scientific and Technical Information of China (English)

    LIU Jing-Quan; YOSHIKAWA Hidekazu; ZHOU Yang-Ping

    2005-01-01

    Complex energy and environment systems, especially the nuclear fuel cycle system, have recently raised social concerns about the issues of economic competitiveness, environmental effects and nuclear proliferation. Only when a consensus on those conflicting issues is reached among stakeholders with different knowledge backgrounds can the nuclear power industry continue to develop. In this paper, a new analysis platform has been developed to help stakeholders recognize and analyze various socio-technical issues in the nuclear fuel cycle system based on the functional modeling method named Multilevel Flow Models (MFM), in accordance with human cognition theory. Its distinguishing feature is that MFM models define a set of mass, energy and information flow structures on multiple levels of abstraction to describe the functional structure of a process system, and their graphical symbol representation and means-end and part-whole hierarchical flow structures make the represented process easy to understand. Based upon this methodology, a micro-process and a macro-process of the nuclear fuel cycle system were selected to be simulated, and some analysis processes such as economic analysis, environmental analysis and energy balance analysis related to those flows were also integrated to help stakeholders understand the process of decision-making, with the introduction of some new functions to the improved Multilevel Flow Models Studio. Finally, simple simulations such as the spent fuel management process and the money flow of the nuclear fuel cycle with its levelised cost analysis are presented as feasible examples.

  3. Modelling of scintillator based flat-panel detectors with Monte-Carlo simulations

    International Nuclear Information System (INIS)

    Reims, N; Sukowski, F; Uhlmann, N

    2011-01-01

    Scintillator based flat panel detectors are state of the art in the field of industrial X-ray imaging applications. Choosing the proper system and setup parameters for the vast range of different applications can be a time consuming task, especially when developing new detector systems. Since the system behaviour cannot always be foreseen easily, Monte-Carlo (MC) simulations are key to gaining further knowledge of system components and their behaviour for different imaging conditions. In this work we used two Monte-Carlo based models to examine an indirect converting flat panel detector, specifically the Hamamatsu C9312SK. We focused on the signal generation in the scintillation layer and its influence on the spatial resolution of the whole system. The models differ significantly in their level of complexity. The first model gives a global description of the detector based on different parameters characterizing the spatial resolution. With relatively small effort a simulation model can be developed which matches the real detector in terms of signal transfer. The second model allows a more detailed insight into the system. It is based on the well established cascade theory, i.e. describing the detector as a cascade of elemental gain and scattering stages, which represent the built-in components and their signal transfer behaviour. In comparison to the first model, the influence of single components, especially the important light spread behaviour in the scintillator, can be analysed in a more differentiated way. Although the implementation of the second model is more time consuming, both models have in common that a relatively small number of system manufacturer parameters are needed. The results of both models were in good agreement with the measured parameters of the real system.

  4. Science-Based Simulation Model of Human Performance for Human Reliability Analysis

    International Nuclear Information System (INIS)

    Kelly, Dana L.; Boring, Ronald L.; Mosleh, Ali; Smidts, Carol

    2011-01-01

    Human reliability analysis (HRA), a component of an integrated probabilistic risk assessment (PRA), is the means by which the human contribution to risk is assessed, both qualitatively and quantitatively. However, among the literally dozens of HRA methods that have been developed, most cannot fully model and quantify the types of errors that occurred at Three Mile Island. Furthermore, all of the methods lack a solid empirical basis, relying heavily on expert judgment or empirical results derived in non-reactor domains. Finally, all of the methods are essentially static, and are thus unable to capture the dynamics of an accident in progress. The objective of this work is to begin exploring a dynamic simulation approach to HRA, one whose models have a basis in psychological theories of human performance, and whose quantitative estimates have an empirical basis. This paper highlights a plan to formalize collaboration among the Idaho National Laboratory (INL), the University of Maryland, and The Ohio State University (OSU) to continue development of a simulation model initially formulated at the University of Maryland. Initial work will focus on enhancing the underlying human performance models with the most recent psychological research, and on planning follow-on studies to establish an empirical basis for the model, based on simulator experiments to be carried out at the INL and at the OSU.

  5. Universal block diagram based modeling and simulation schemes for fractional-order control systems.

    Science.gov (United States)

    Bai, Lu; Xue, Dingyü

    2017-05-08

    Universal block diagram based schemes are proposed for modeling and simulating fractional-order control systems in this paper. A fractional operator block in Simulink is designed to evaluate the fractional-order derivative and integral. Based on the block, fractional-order control systems with zero initial conditions can be modeled conveniently. For modeling a system with nonzero initial conditions, an auxiliary signal is constructed in the compensation scheme. Since the compensation scheme is very complicated, the integrator chain scheme is further proposed to simplify the modeling procedure. The accuracy and effectiveness of the schemes are assessed in the examples; the computational results confirm that the block diagram scheme is efficient for all Caputo fractional-order ordinary differential equations (FODEs) of any complexity, including implicit Caputo FODEs. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
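
    The fractional operator at the heart of such block-diagram schemes has to be evaluated numerically one way or another. The sketch below shows one common discretization, the Grünwald-Letnikov approximation, checked against the known half-order derivative of t; it is a generic illustration, not the Simulink block or the compensation/integrator-chain schemes proposed in the paper.

    ```python
    import numpy as np

    def gl_fractional_derivative(y, alpha, h):
        """Grünwald-Letnikov approximation of the order-alpha derivative of the
        samples y taken with step h (zero history assumed before t = 0)."""
        n = len(y)
        # Recurrence for the binomial weights: w_0 = 1, w_j = w_{j-1} * (1 - (alpha+1)/j)
        w = np.empty(n)
        w[0] = 1.0
        for j in range(1, n):
            w[j] = w[j - 1] * (1.0 - (alpha + 1.0) / j)
        d = np.empty(n)
        for k in range(n):
            d[k] = np.dot(w[:k + 1], y[k::-1]) / h**alpha
        return d

    # Check against a known result: D^0.5 of t equals 2*sqrt(t/pi)
    h = 1e-3
    t = np.arange(0.0, 1.0, h)
    approx = gl_fractional_derivative(t, 0.5, h)
    exact = 2.0 * np.sqrt(t / np.pi)
    print("max abs error:", np.max(np.abs(approx - exact)))
    ```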

  6. Modeling and Simulation of Membrane-Based Dehumidification and Energy Recovery Process

    Energy Technology Data Exchange (ETDEWEB)

    Gao, Zhiming [ORNL; Abdelaziz, Omar [ORNL; Qu, Ming [ORNL

    2017-01-01

    This paper introduces a first-order physics-based model that accounts for the fundamental heat and mass transfer between a humid-air vapor stream on the feed side and another flow stream on the permeate side. The model comprises a few optional submodels for membrane mass transport, and it adopts a segment-by-segment method for discretizing the heat and mass transfer governing equations for the flow streams on the feed and permeate sides. The model is able to simulate both dehumidifiers and energy recovery ventilators in parallel-flow, cross-flow, and counter-flow configurations. The predicted results compare reasonably well with the measurements. The open-source codes are written in C++. The model and open-source codes are expected to become a fundamental tool for the analysis of membrane-based dehumidification in the future.
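
    To illustrate the segment-by-segment idea in the simplest possible setting, the sketch below marches along a parallel-flow membrane exchanger, transferring sensible heat and water vapour between the feed and permeate streams in each segment. All transfer coefficients, flow rates and inlet states are invented placeholders, latent-heat coupling is ignored, and a counter-flow arrangement would additionally require iteration, so this is only a conceptual stand-in for the paper's model.

    ```python
    import numpy as np

    def parallel_flow_membrane(n_seg=50, L=1.0, width=0.5,
                               U=30.0, k_m=0.008,           # heat [W/m^2 K] and moisture [kg/m^2 s] transfer coefficients (assumed)
                               m_feed=0.05, m_perm=0.05,    # dry-air mass flows [kg/s] (assumed)
                               cp=1006.0,
                               T_feed=35.0, w_feed=0.020,   # inlet temperature [C], humidity ratio [kg/kg]
                               T_perm=24.0, w_perm=0.008):
        """March along the exchanger, segment by segment, exchanging sensible heat
        and water vapour across the membrane (parallel-flow, illustrative only)."""
        dA = L * width / n_seg
        Tf, wf, Tp, wp = T_feed, w_feed, T_perm, w_perm
        for _ in range(n_seg):
            q = U * dA * (Tf - Tp)          # sensible heat through the segment [W]
            m_v = k_m * dA * (wf - wp)      # vapour flux through the segment [kg/s]
            Tf -= q / (m_feed * cp)
            Tp += q / (m_perm * cp)
            wf -= m_v / m_feed
            wp += m_v / m_perm
        return Tf, wf, Tp, wp

    print("outlet states (Tf, wf, Tp, wp):", parallel_flow_membrane())
    ```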

  7. Model simulations and proxy-based reconstructions for the European region in the past millennium (Invited)

    Science.gov (United States)

    Zorita, E.

    2009-12-01

    One of the objectives when comparing simulations of past climates to proxy-based climate reconstructions is to assess the skill of climate models in simulating climate change. This comparison may be accomplished at large spatial scales, for instance the evolution of simulated and reconstructed Northern Hemisphere annual temperature, or at regional or point scales. In both approaches a 'fair' comparison has to take into account different aspects that affect the inevitable uncertainties and biases in the simulations and in the reconstructions. These efforts face a trade-off: climate models are believed to be more skillful at large hemispheric scales, but climate reconstructions at these scales are burdened by the spatial distribution of available proxies and by methodological issues surrounding the statistical method used to translate the proxy information into large-scale spatial averages. Furthermore, the internal climatic noise at large hemispheric scales is low, so that the sampling uncertainty tends to be low as well. On the other hand, the skill of climate models at regional scales is limited by the coarse spatial resolution, which hinders a faithful representation of aspects important for the regional climate. At small spatial scales, the reconstruction of past climate probably faces fewer methodological problems if information from different proxies is available. The internal climatic variability at regional scales is, however, high. In this contribution some examples of the different issues faced when comparing simulations and reconstructions at small spatial scales in the past millennium are discussed. These examples comprise reconstructions from dendrochronological data and from historical documentary data in Europe and climate simulations with global and regional models. These examples indicate that centennial climate variations can offer a reasonable target for assessing the skill of global climate models and of proxy-based reconstructions, even at small spatial scales.

  8. Tornado missile simulation and design methodology. Volume 2: model verification and data base updates. Final report

    International Nuclear Information System (INIS)

    Twisdale, L.A.; Dunn, W.L.

    1981-08-01

    A probabilistic methodology has been developed to predict the probabilities of tornado-propelled missiles impacting and damaging nuclear power plant structures. Mathematical models of each event in the tornado missile hazard have been developed and sequenced to form an integrated, time-history simulation methodology. The models are data based where feasible. The data include documented records of tornado occurrence, field observations of missile transport, results of wind tunnel experiments, and missile impact tests. Probabilistic Monte Carlo techniques are used to estimate the risk probabilities. The methodology has been encoded in the TORMIS computer code to facilitate numerical analysis and plant-specific tornado missile probability assessments

  9. Mesoscale modeling and simulation of microstructure evolution during dynamic recrystallization of a Ni-based superalloy

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Fei [University of Nottingham, Department of Mechanical, Materials and Manufacturing Engineering, Nottingham (United Kingdom); Shanghai Jiao Tong University, Institute of Forming Technology and Equipment, Shanghai (China); Cui, Zhenshan [Shanghai Jiao Tong University, Institute of Forming Technology and Equipment, Shanghai (China); Ou, Hengan [University of Nottingham, Department of Mechanical, Materials and Manufacturing Engineering, Nottingham (United Kingdom); Long, Hui [University of Sheffield, Department of Mechanical Engineering, Sheffield (United Kingdom)

    2016-10-15

    Microstructural evolution and plastic flow characteristics of a Ni-based superalloy were investigated using a simulation model that couples the basic metallurgical principles of dynamic recrystallization (DRX) with a two-dimensional (2D) cellular automaton (CA). The variation of dislocation density with the local strain of deformation is considered for accurate determination of the microstructural evolution during DRX. The grain topography, the grain size and the recrystallized fraction can be well predicted by the developed CA model, which enables the establishment of the relationship between the flow stress, dislocation density, recrystallized volume fraction, recrystallized grain size and the thermomechanical parameters. (orig.)
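
    A toy version of the coupling described above can be written in a few lines: the dislocation density in each cell evolves with strain according to a Kocks-Mecking-type law, cells that exceed a critical density may nucleate recrystallized grains, and those grains grow into neighbouring deformed cells on a 2D lattice. All constants and probabilities below are placeholders chosen only to make the toy model run, not the calibrated values of the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    N = 100                      # lattice size
    k1, k2 = 5e8, 20.0           # Kocks-Mecking hardening / recovery constants (assumed)
    rho_c = 5e14                 # critical dislocation density for nucleation [m^-2] (assumed)
    p_nucleate, p_grow = 1e-3, 0.3

    rho = np.full((N, N), 1e12)          # dislocation density per cell
    grain = np.zeros((N, N), dtype=int)  # 0 = deformed matrix, >0 = DRX grain id
    next_id = 1

    for step in range(400):
        d_eps = 1e-3
        # Kocks-Mecking: d(rho)/d(eps) = k1*sqrt(rho) - k2*rho, in unrecrystallized cells only
        rho = np.where(grain == 0, rho + (k1 * np.sqrt(rho) - k2 * rho) * d_eps, rho)
        # Nucleation where the critical dislocation density is exceeded
        nucleate = (grain == 0) & (rho > rho_c) & (rng.random((N, N)) < p_nucleate)
        for i, j in zip(*np.nonzero(nucleate)):
            grain[i, j], rho[i, j] = next_id, 1e12
            next_id += 1
        # Growth: a deformed cell may join a recrystallized von Neumann neighbour
        grown = grain.copy()
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nb = np.roll(grain, (di, dj), axis=(0, 1))
            takeover = (grain == 0) & (nb > 0) & (rng.random((N, N)) < p_grow)
            grown = np.where((grown == 0) & takeover, nb, grown)
        rho = np.where((grain == 0) & (grown > 0), 1e12, rho)   # reset density in newly recrystallized cells
        grain = grown

    print("recrystallized fraction:", np.mean(grain > 0))
    ```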

  10. Performance of process-based models for simulation of grain N in crop rotations across Europe

    DEFF Research Database (Denmark)

    Yin, Xiaogang; Kersebaum, KC; Kollas, C

    2017-01-01

    The accurate estimation of crop grain nitrogen (N; N in grain yield) is crucial for optimizing agricultural N management, especially in crop rotations. In the present study, 12 process-based models were applied to simulate the grain N of i) seven crops in rotations, ii) across various pedo...... (Brassica napus L.). These differences are linked to the intensity of parameterization with better parameterized crops showing lower prediction errors. The model performance was influenced by N fertilization and irrigation treatments, and a majority of the predictions were more accurate under low N...

  11. Impacts of Satellite-Based Snow Albedo Assimilation on Offline and Coupled Land Surface Model Simulations.

    Directory of Open Access Journals (Sweden)

    Tao Wang

    Seasonal snow cover in the Northern Hemisphere is the largest component of the terrestrial cryosphere and plays a major role in the climate system through strong positive feedbacks related to albedo. The snow-albedo feedback is invoked as an important cause of the polar amplification of ongoing and projected climate change, and its parameterization across models is an important source of uncertainty in climate simulations. Here, instead of developing a physical snow albedo scheme, we use a direct insertion approach to assimilate satellite-based surface albedo during the snow season (hereafter snow albedo assimilation) into the land surface model ORCHIDEE (ORganizing Carbon and Hydrology In Dynamic EcosystEms) and assess the influence of such assimilation on offline and coupled simulations. Our results have shown that snow albedo assimilation in both ORCHIDEE and ORCHIDEE-LMDZ (a general circulation model of Laboratoire de Météorologie Dynamique) improves the simulation accuracy of the mean seasonal (October through May) snow water equivalent over the region north of 40 degrees. The sensitivity of snow water equivalent to snow albedo assimilation is more pronounced in the coupled simulation than in the offline simulation, since the feedback of albedo on air temperature is allowed in ORCHIDEE-LMDZ. We have also shown that simulations of air temperature at 2 meters in ORCHIDEE-LMDZ are significantly improved by snow albedo assimilation during the spring, in particular over the eastern Siberia region. This is a result of the fact that high amounts of shortwave radiation during the spring can maximize the snow albedo feedback, which is also supported by the finding that the spatial sensitivity of temperature change to albedo change is much larger during the spring than during the autumn and winter. In addition, the radiative forcing at the top of the atmosphere induced by snow albedo assimilation during the spring is estimated to be -2.50 W m-2, the

  12. A model for self-diffusion of guanidinium-based ionic liquids: a molecular simulation study.

    Science.gov (United States)

    Klähn, Marco; Seduraman, Abirami; Wu, Ping

    2008-11-06

    We propose a novel self-diffusion model for ionic liquids at an atomic level of detail. The model is derived from molecular dynamics simulations of guanidinium-based ionic liquids (GILs) as a model case. The simulations are based on an empirical molecular mechanical force field, which was developed in our preceding work, and it relies on the charge distribution in the actual liquid. The simulated GILs consist of acyclic and cyclic cations that were paired with nitrate and perchlorate anions. Self-diffusion coefficients are calculated at different temperatures, from which diffusive activation energies between 32 and 40 kJ/mol are derived. Vaporization enthalpies between 174 and 212 kJ/mol are calculated, and their strong connection with the diffusive activation energies is demonstrated. An observed formation of cavities in GILs of up to 6.5% of the total volume does not facilitate self-diffusion. Instead, the diffusion of ions is found to be determined primarily by interactions with their immediate environment via electrostatic attraction between cation hydrogen and anion oxygen atoms. The calculated average time between single diffusive transitions varies between 58 and 107 ps and determines the speed of diffusion, in contrast to the diffusive displacement distances, which were found to be similar in all simulated GILs. All simulations indicate that ions diffuse by using a brachiation type of movement: a diffusive transition is initiated by cleaving close contacts to a coordinated counterion, after which the ion diffuses only about 2 Å until new close contacts are formed with another counterion in its vicinity. The proposed diffusion model links all calculated energetic and dynamic properties of GILs consistently and explains their molecular origin. The validity of the model is confirmed by providing an explanation for the variation of measured ratios of self-diffusion coefficients of cations and paired anions over a wide range of values, encompassing various ionic liquid classes.
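
    Self-diffusion coefficients of the kind reported above are usually extracted from an MD trajectory through the Einstein relation, i.e. the long-time slope of the mean-squared displacement. The sketch below shows that bookkeeping on a synthetic random-walk trajectory; it is not the GIL simulation data, and the units are arbitrary.

    ```python
    import numpy as np

    def self_diffusion_coefficient(positions, dt):
        """Estimate D from the Einstein relation MSD(t) ~ 6 D t.
        positions: unwrapped coordinates, shape (n_frames, n_particles, 3)."""
        n_frames = positions.shape[0]
        lags = np.arange(1, n_frames // 2)
        msd = np.array([
            np.mean(np.sum((positions[lag:] - positions[:-lag]) ** 2, axis=-1))
            for lag in lags
        ])
        slope = np.polyfit(lags * dt, msd, 1)[0]   # linear fit of MSD versus time
        return slope / 6.0

    # Synthetic random-walk "trajectory" (arbitrary units) just to exercise the function
    rng = np.random.default_rng(0)
    steps = rng.normal(scale=0.05, size=(1000, 20, 3))
    trajectory = np.cumsum(steps, axis=0)
    print("D ≈", self_diffusion_coefficient(trajectory, dt=1.0))
    ```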

  13. Perception Modelling of Visitors in Vargas Museum Using Agent-Based Simulation and Visibility Analysis

    Science.gov (United States)

    Carcellar, B. G., III

    2017-10-01

    Museum exhibit management is one of the usual undertakings of museum facilitators. Artworks must be strategically placed to achieve maximum viewing by visitors. The positioning of the artworks also highly influences the quality of experience of the visitors. One solution to such problems is to utilize GIS and Agent-Based Modelling (ABM). In ABM, persistent interacting objects are modelled as agents. These agents are given attributes and behaviors that describe their properties as well as their motion. In this study, an ABM approach that incorporates GIS is utilized to perform an analytical assessment of the placement of the artworks in the Vargas Museum. GIS serves as the backbone for the spatial aspect of the simulation, such as the placement of the artwork exhibits, as well as possible obstructions to perception such as the columns, walls, and panel boards. Visibility analysis is also performed on the model in GIS to assess the overall visibility of the artworks. The ABM is done using the initial GIS outputs and GAMA, an open-source ABM software. Visitors are modelled as agents moving inside the museum following a specific decision tree. The simulation is done for three use cases: a 10 %, 20 %, and 30 % chance of having a visitor in the next minute. For the case of the said museum, the 10 % chance is determined to be the closest simulation case to the actual situation, and the recommended minimum time to achieve maximum artwork perception is 1 hour and 40 minutes. Initial assessment of the results shows that even after 3 hours of simulation, small parts of the exhibit receive few viewers, due to their distance from the entrance. A more detailed decision tree for the visitor agents can be incorporated to obtain a more realistic simulation.

  14. Modeling and simulation of grid connected permanent magnet generator based small wind energy conversion systems

    Energy Technology Data Exchange (ETDEWEB)

    Arifujjaman, Md.

    2011-07-01

    In order to recover the maximum energy from small-scale wind turbine systems, many parameters have to be controlled. The aim of this paper is to propose a control strategy for grid-connected PMG-based small wind turbine systems. A mathematical model of the small wind turbine system was developed and the system simulated. The results demonstrate that the control strategy is highly efficient: it reduces the dependence on system variables, diminishes system complexity, provides efficient furling and maximum power point controllers, and ensures stable operation over multiple wind speeds. This study developed a modeling and control strategy whose feasibility was proved by the simulation results.

  15. Accurate Simulation of 802.11 Indoor Links: A "Bursty" Channel Model Based on Real Measurements

    Directory of Open Access Journals (Sweden)

    Agüero Ramón

    2010-01-01

    We propose a novel channel model to be used for simulating indoor wireless propagation environments. An extensive measurement campaign was carried out to assess the performance of different transport protocols over 802.11 links. This enabled us to better adjust our approach, which is based on an autoregressive filter. One of the main advantages of this proposal lies in its ability to reflect the "bursty" behavior which characterizes indoor wireless scenarios and has a great impact on the behavior of upper layer protocols. We compare this channel model, integrated within the Network Simulator (ns-2) platform, with other traditional approaches, showing that it is able to better reflect the real behavior which was empirically assessed.
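
    In the spirit of the autoregressive approach mentioned above, a bursty loss pattern can be generated by thresholding an AR(1) process: the filter's memory produces correlated good and bad periods rather than independent losses. The coefficient and target loss rate below are arbitrary illustrative values, not the ones fitted from the measurement campaign.

    ```python
    import numpy as np

    def bursty_losses(n_frames, phi=0.95, loss_rate=0.1, seed=0):
        """Correlated frame-loss pattern from a thresholded AR(1) process.
        phi controls burstiness; loss_rate sets the marginal loss probability."""
        rng = np.random.default_rng(seed)
        x = np.zeros(n_frames)
        noise = rng.normal(size=n_frames)
        for t in range(1, n_frames):
            x[t] = phi * x[t - 1] + np.sqrt(1 - phi**2) * noise[t]   # unit-variance AR(1)
        threshold = np.quantile(x, 1.0 - loss_rate)
        return x > threshold   # True = frame lost

    losses = bursty_losses(100_000)
    flips = np.flatnonzero(np.diff(np.r_[0, losses.astype(int), 0]))
    burst_lengths = flips[1::2] - flips[0::2]
    print("loss rate:", losses.mean(), "mean burst length:", burst_lengths.mean())
    ```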

  16. Fast simulation approaches for power fluctuation model of wind farm based on frequency domain

    DEFF Research Database (Denmark)

    Lin, Jin; Gao, Wen-zhong; Sun, Yuan-zhang

    2012-01-01

    This paper discusses one model developed by Risø DTU, which is capable of simulating the power fluctuation of large wind farms in the frequency domain. In the original design, the “frequency-time” transformations are time-consuming and might limit the computation speed for a wind farm of large size. Under this background, this paper proposes four efficient approaches to accelerate the simulation speed. Two of them are based on physical model simplifications, and the other two improve the numerical computation. The case study demonstrates the efficiency of these approaches. The acceleration ratio is more than 300 times if all these approaches are adopted, in any low, medium and high wind speed test scenarios.

  17. Wake modeling and simulation

    DEFF Research Database (Denmark)

    Larsen, Gunner Chr.; Madsen Aagaard, Helge; Larsen, Torben J.

    We present a consistent, physically based theory for the wake meandering phenomenon, which we consider of crucial importance for the overall description of wind turbine loadings in wind farms. In its present version the model is confined to single wake situations. The model philosophy does, however, ... The methodology has been implemented in the aeroelastic code HAWC2, and example simulations of wake situations from the small Tjæreborg wind farm have been performed, showing satisfactory agreement between predictions and measurements.

  18. Graceful Failure and Societal Resilience Analysis Via Agent-Based Modeling and Simulation

    Science.gov (United States)

    Schopf, P. S.; Cioffi-Revilla, C.; Rogers, J. D.; Bassett, J.; Hailegiorgis, A. B.

    2014-12-01

    Agent-based social modeling is opening up new methodologies for the study of societal response to weather and climate hazards, and providing measures of resiliency that can be studied in many contexts, particularly in coupled human and natural-technological systems (CHANTS). Since CHANTS are complex adaptive systems, societal resiliency may or may not occur, depending on dynamics that lack closed form solutions. Agent-based modeling has been shown to provide a viable theoretical and methodological approach for analyzing and understanding disasters and societal resiliency in CHANTS. Our approach advances the science of societal resilience through computational modeling and simulation methods that complement earlier statistical and mathematical approaches. We present three case studies of social dynamics modeling that demonstrate the use of these agent-based models. In Central Asia, we examine multiple ensemble simulations with varying climate statistics to see how droughts and zuds affect populations, transmission of wealth across generations, and the overall structure of the social system. In Eastern Africa, we explore how successive episodes of drought events affect the adaptive capacity of rural households. Human displacement, mainly rural-to-urban migration, and livelihood transitions, particularly from pastoralism to farming, are observed as rural households interact dynamically with the biophysical environment and continually adjust their behavior to accommodate changes in climate. In the far-north case we demonstrate one of the first successful attempts to model the complete climate-permafrost-infrastructure-societal interaction network as a complex adaptive system/CHANTS, implemented as a "federated" agent-based model using evolutionary computation. Analysis of population changes resulting from extreme weather across these and other cases provides evidence for the emergence of new steady states and shifting patterns of resilience.

  19. A Novel Haptic Interactive Approach to Simulation of Surgery Cutting Based on Mesh and Meshless Models

    Science.gov (United States)

    Liu, Peter X.; Lai, Pinhua; Xu, Shaoping; Zou, Yanni

    2018-01-01

    To date, the majority of implemented virtual surgery simulation systems have been based on either a mesh or a meshless strategy for soft tissue modelling. To take full advantage of both mesh and meshless models, a novel coupled soft tissue cutting model is proposed in the present work. Specifically, the reconstructed virtual soft tissue consists of two essential components. One is associated with a surface mesh that is convenient for surface rendering, and the other with internal meshless point elements that are used to calculate the force feedback during cutting. To combine the two components in a seamless way, virtual points are introduced. During the simulation of cutting, Bézier curves are used to characterize a smooth and vivid incision on the surface mesh. At the same time, the deformation of internal soft tissue caused by the cutting operation can be treated as displacements of the internal point elements. Furthermore, we discussed and proved the stability and convergence of the proposed approach theoretically. Real biomechanical tests verified the validity of the introduced model, and the simulation experiments show that the proposed approach offers high computational efficiency and good visual effects, enabling cutting of soft tissue with high stability. PMID:29850006

  20. A Dynamic Operation Permission Technique Based on an MFM Model and Numerical Simulation

    International Nuclear Information System (INIS)

    Akio, Gofuku; Masahiro, Yonemura

    2011-01-01

    It is important to support operator activities in an abnormal plant situation where many counter actions are taken in a relatively short time. The authors proposed a technique called dynamic operation permission to decrease human errors, without eliminating the creative ideas of operators to cope with an abnormal plant situation, by checking whether the counter action taken is consistent with the emergency operation procedure. If the counter action is inconsistent, a dynamic operation permission system warns the operators. It also explains how and why the counter action is inconsistent and what influence it will have on the future plant behavior, using a qualitative influence inference technique based on a Multilevel Flow Modeling (MFM) model. However, the previous dynamic operation permission is not able to explain quantitative effects on future plant behavior. Moreover, many possible influence paths are derived, because qualitative reasoning does not give a solution when positive and negative influences are propagated to the same node. This study extends dynamic operation permission by combining qualitative reasoning with a numerical simulation technique. The qualitative reasoning based on an MFM model of the plant derives all possible influence propagation paths. Then, a numerical simulation gives a prediction of future plant behavior in the case of taking a counter action. Influence propagations that do not coincide with the simulation results are excluded from the possible influence paths. The extended technique is implemented in a dynamic operation permission system for an oil refinery plant. An MFM model and a static numerical simulator were developed. The results of dynamic operation permission for some abnormal plant situations show the improvement in the accuracy of dynamic operation permission and in the quality of the explanation of the effects of the counter action taken.

  1. Flash Floods Simulation using a Physical-Based Hydrological Model at Different Hydroclimatic Regions

    Science.gov (United States)

    Saber, Mohamed; Kamil Yilmaz, Koray

    2016-04-01

    Currently, flash floods are increasing markedly and affecting many regions over the world. Therefore, this study focuses on two case studies: Wadi Abu Subeira, Egypt, as an arid environment, and the Karpuz basin, Turkey, as a Mediterranean environment. The main objective of this work is to simulate flash floods in both catchments considering the hydrometeorological differences between them, which in turn affect their flash flood behaviors. An integrated methodology incorporating the Hydrological River Basin Environmental Assessment Model (Hydro-BEAM) and remote sensing observations was devised. Global Satellite Mapping of Precipitation (GSMaP) estimates were compared with the rain gauge networks at the target basins to estimate the bias, in an effort to further use them effectively in the simulation of flash floods. Based on the preliminary results of flash flood simulation in both basins, we found that the runoff behaviors of flash floods are different due to the impacts of the climatological, hydrological and topographical conditions. Also, the simulated surface runoff hydrographs reasonably coincide with the observed ones. Consequently, some mitigation strategies relying on this study could be introduced to help reduce flash flood disasters in different climate regions. This comparison of basins in different climates would also give a reasonable indication of the potential impact of climate change on flash flood frequency and occurrence.

  2. [Simulation and data analysis of stereological modeling based on virtual slices].

    Science.gov (United States)

    Wang, Hao; Shen, Hong; Bai, Xiao-yan

    2008-05-01

    To establish a computer-assisted stereological model for simulating the process of slice sectioning and to evaluate the relationship between the section surface and the estimated three-dimensional structure. The model was designed mathematically and implemented as Win32 software based on MFC, using Microsoft Visual Studio as the IDE, to simulate an unlimited number of sectioning processes and to analyse the data derived from the model. The linearity of the fitting of the model was evaluated by comparison with the traditional formula. The Win32 software based on this algorithm allowed random sectioning of particles distributed randomly in an ideal virtual cube. The stereological parameters showed very high throughput (>94.5% and 92%) in the homogeneity and independence tests. The density, shape and size data of the sections were tested to conform to a normal distribution. The output of the model and that of the image analysis system showed statistical correlation and consistency. The algorithm described here can be used for evaluating the stereological parameters of the structure of tissue slices.

  3. Image-based model of the spectrin cytoskeleton for red blood cell simulation.

    Science.gov (United States)

    Fai, Thomas G; Leo-Macias, Alejandra; Stokes, David L; Peskin, Charles S

    2017-10-01

    We simulate deformable red blood cells in the microcirculation using the immersed boundary method with a cytoskeletal model that incorporates structural details revealed by tomographic images. The elasticity of red blood cells is known to be supplied by both their lipid bilayer membranes, which resist bending and local changes in area, and their cytoskeletons, which resist in-plane shear. The cytoskeleton consists of spectrin tetramers that are tethered to the lipid bilayer by ankyrin and by actin-based junctional complexes. We model the cytoskeleton as a random geometric graph, with nodes corresponding to junctional complexes and with edges corresponding to spectrin tetramers such that the edge lengths are given by the end-to-end distances between nodes. The statistical properties of this graph are based on distributions gathered from three-dimensional tomographic images of the cytoskeleton by a segmentation algorithm. We show that the elastic response of our model cytoskeleton, in which the spectrin polymers are treated as entropic springs, is in good agreement with the experimentally measured shear modulus. By simulating red blood cells in flow with the immersed boundary method, we compare this discrete cytoskeletal model to an existing continuum model and predict the extent to which dynamic spectrin network connectivity can protect against failure in the case of a red cell subjected to an applied strain. The methods presented here could form the basis of disease- and patient-specific computational studies of hereditary diseases affecting the red cell cytoskeleton.
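
    The 'random geometric graph' construction can be sketched directly: junctional-complex nodes are scattered over a patch of membrane and spectrin-tetramer edges connect nodes closer than a cutoff, after which the edge-length (end-to-end distance) and degree statistics can be inspected. The density and cutoff below are placeholders, not the distributions segmented from the tomographic images.

    ```python
    import numpy as np
    from itertools import combinations

    rng = np.random.default_rng(3)

    # Junctional complexes scattered over a 1 x 1 um patch of membrane (assumed density)
    n_nodes = 400
    nodes = rng.uniform(0.0, 1.0, size=(n_nodes, 2))   # positions in um

    # Spectrin tetramers: connect nodes whose separation is below a cutoff
    cutoff = 0.09   # um, placeholder for a tomography-derived distance scale
    edges, lengths = [], []
    for i, j in combinations(range(n_nodes), 2):
        d = np.linalg.norm(nodes[i] - nodes[j])
        if d < cutoff:
            edges.append((i, j))
            lengths.append(d)
    lengths = np.array(lengths)

    degree = np.bincount(np.array(edges).ravel(), minlength=n_nodes)
    print("edges:", len(edges))
    print("mean end-to-end distance: %.3f um" % lengths.mean())
    print("mean node degree: %.2f" % degree.mean())
    ```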

  4. Simulating the elimination of sleeping sickness with an agent-based model

    Directory of Open Access Journals (Sweden)

    Grébaut Pascal

    2016-01-01

    Although Human African Trypanosomiasis is largely considered to be in the process of extinction today, the persistence of human and animal reservoirs, as well as the vector, necessitates a laborious elimination process. In this context, modeling could be an effective tool to evaluate the ability of different public health interventions to control the disease. Using the Cormas® system, we developed HATSim, an agent-based model capable of simulating the possible endemic evolutions of sleeping sickness and the ability of National Control Programs to eliminate the disease. This model takes into account the analysis of epidemiological, entomological, and ecological data from field studies conducted during the last decade, making it possible to predict the evolution of the disease within this area over a 5-year span. In this article, we first present HATSim according to the Overview, Design concepts, and Details (ODD) protocol that is classically used to describe agent-based models; then, in a second part, we present predictive results concerning the evolution of Human African Trypanosomiasis in the village of Lambi (Cameroon), in order to illustrate the usefulness of such a tool. Our results are consistent with what was observed in the field by the Cameroonian National Control Program (CNCP). Our simulations also revealed that regular screening can be sufficient, although vector control applied to all areas with human activities could be significantly more efficient. Our results indicate that the current model can already help decision-makers in planning the elimination of the disease in foci.

  5. A voxel-based multiscale model to simulate the radiation response of hypoxic tumors.

    Science.gov (United States)

    Espinoza, I; Peschke, P; Karger, C P

    2015-01-01

    In radiotherapy, it is important to predict the response of tumors to irradiation prior to the treatment. This is especially important for hypoxic tumors, which are known to be highly radioresistant. Mathematical modeling based on the dose distribution, biological parameters, and medical images may help to improve this prediction and to optimize the treatment plan. A voxel-based multiscale tumor response model for simulating the radiation response of hypoxic tumors was developed. It considers viable and dead tumor cells, capillary and normal cells, as well as the most relevant biological processes such as (i) proliferation of tumor cells, (ii) hypoxia-induced angiogenesis, (iii) spatial exchange of cells leading to tumor growth, (iv) oxygen-dependent cell survival after irradiation, (v) resorption of dead cells, and (vi) spatial exchange of cells leading to tumor shrinkage. Oxygenation is described on a microscopic scale using a previously published tumor oxygenation model, which calculates the oxygen distribution for each voxel using the vascular fraction as the most important input parameter. To demonstrate the capabilities of the model, the dependence of the oxygen distribution on tumor growth and radiation-induced shrinkage is investigated. In addition, the impact of three different reoxygenation processes is compared and tumor control probability (TCP) curves for a squamous cell carcinoma of the head and neck (HNSCC) are simulated under normoxic and hypoxic conditions. The model describes the spatiotemporal behavior of the tumor on three different scales: (i) on the macroscopic scale, it describes tumor growth and shrinkage during radiation treatment, (ii) on a mesoscopic scale, it provides the cell density and vascular fraction for each voxel, and (iii) on the microscopic scale, the oxygen distribution may be obtained in terms of oxygen histograms. With increasing tumor size, the simulated tumors develop a hypoxic core. Within the model, tumor shrinkage was

  6. A voxel-based multiscale model to simulate the radiation response of hypoxic tumors

    International Nuclear Information System (INIS)

    Espinoza, I.; Peschke, P.; Karger, C. P.

    2015-01-01

    Purpose: In radiotherapy, it is important to predict the response of tumors to irradiation prior to the treatment. This is especially important for hypoxic tumors, which are known to be highly radioresistant. Mathematical modeling based on the dose distribution, biological parameters, and medical images may help to improve this prediction and to optimize the treatment plan. Methods: A voxel-based multiscale tumor response model for simulating the radiation response of hypoxic tumors was developed. It considers viable and dead tumor cells, capillary and normal cells, as well as the most relevant biological processes such as (i) proliferation of tumor cells, (ii) hypoxia-induced angiogenesis, (iii) spatial exchange of cells leading to tumor growth, (iv) oxygen-dependent cell survival after irradiation, (v) resorption of dead cells, and (vi) spatial exchange of cells leading to tumor shrinkage. Oxygenation is described on a microscopic scale using a previously published tumor oxygenation model, which calculates the oxygen distribution for each voxel using the vascular fraction as the most important input parameter. To demonstrate the capabilities of the model, the dependence of the oxygen distribution on tumor growth and radiation-induced shrinkage is investigated. In addition, the impact of three different reoxygenation processes is compared and tumor control probability (TCP) curves for a squamous cell carcinoma of the head and neck (HNSCC) are simulated under normoxic and hypoxic conditions. Results: The model describes the spatiotemporal behavior of the tumor on three different scales: (i) on the macroscopic scale, it describes tumor growth and shrinkage during radiation treatment, (ii) on a mesoscopic scale, it provides the cell density and vascular fraction for each voxel, and (iii) on the microscopic scale, the oxygen distribution may be obtained in terms of oxygen histograms. With increasing tumor size, the simulated tumors develop a hypoxic core. Within the
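
    The oxygen-dependent survival ingredient of such models can be illustrated compactly: a linear-quadratic survival fraction per fraction, with the dose effectiveness scaled by an oxygen-dependent modification factor, feeds a Poisson tumor control probability. The parameter values and the toy two-compartment oxygenation below are generic textbook-style assumptions, not the voxel model of the paper.

    ```python
    import numpy as np

    def oxygen_modification(p_o2, m=3.0, K=3.0):
        """Dose-modifying factor relative to full oxygenation (Alper-Howard-Flanders
        form): about 1/m for anoxic cells, 1 for well-oxygenated cells. p_o2 in mmHg."""
        return (m * p_o2 + K) / (m * (p_o2 + K))

    def tcp(dose_per_fraction, n_fractions, p_o2_voxels, clonogens_per_voxel,
            alpha=0.3, beta=0.03):
        """Poisson TCP with linear-quadratic, oxygen-dependent cell survival."""
        d_eff = dose_per_fraction * oxygen_modification(np.asarray(p_o2_voxels))
        sf_per_fraction = np.exp(-(alpha * d_eff + beta * d_eff**2))
        surviving = clonogens_per_voxel * sf_per_fraction**n_fractions
        return np.exp(-surviving.sum())

    # Toy tumor: 1000 voxels, a hypoxic core (5 mmHg) and an oxic rim (40 mmHg)
    p_o2 = np.where(np.arange(1000) < 200, 5.0, 40.0)
    for d in (1.8, 2.0, 2.2):
        print(f"d = {d} Gy/fx, 30 fx -> TCP = {tcp(d, 30, p_o2, 1e4):.3f}")
    ```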

  7. An agent-based simulation model to study accountable care organizations.

    Science.gov (United States)

    Liu, Pai; Wu, Shinyi

    2016-03-01

    Creating accountable care organizations (ACOs) has been widely discussed as a strategy to control rapidly rising healthcare costs and improve quality of care; however, building an effective ACO is a complex process involving multiple stakeholders (payers, providers, patients) with their own interests. Also, implementation of an ACO is costly in terms of time and money. Immature design could cause safety hazards. Therefore, there is a need for analytical model-based decision-support tools that can predict the outcomes of different strategies to facilitate ACO design and implementation. In this study, an agent-based simulation model was developed to study ACOs that considers payers, healthcare providers, and patients as agents under the shared saving payment model of care for congestive heart failure (CHF), one of the most expensive causes of sometimes preventable hospitalizations. The agent-based simulation model has identified the critical determinants for the payment model design that can motivate provider behavior changes to achieve maximum financial and quality outcomes of an ACO. The results show nonlinear provider behavior change patterns corresponding to changes in payment model designs. The outcomes vary by providers with different quality or financial priorities, and are most sensitive to the cost-effectiveness of CHF interventions that an ACO implements. This study demonstrates an increasingly important method to construct a healthcare system analytics model that can help inform health policy and healthcare management decisions. The study also points out that the likely success of an ACO is interdependent with payment model design, provider characteristics, and cost and effectiveness of healthcare interventions.

  8. Safety evaluation model of urban cross-river tunnel based on driving simulation.

    Science.gov (United States)

    Ma, Yingqi; Lu, Linjun; Lu, Jian John

    2017-09-01

    Currently, Shanghai urban cross-river tunnels have three principal characteristics: increased traffic, a high accident rate and rapidly developing construction. Because of their complex geographic and hydrological settings, the alignment conditions in urban cross-river tunnels are more complicated than in highway tunnels, so a safety evaluation of urban cross-river tunnels is necessary to inform follow-up construction and changes in operational management. A driving risk index (DRI) for urban cross-river tunnels was proposed in this study. An index system was also constructed, combining eight factors derived from the output of a driving simulator and covering three aspects of risk: following accidents, lateral accidents and driver workload. Analytic hierarchy process (AHP) methods, expert scoring and normalization processing were applied to construct a mathematical model for the DRI. The driving simulator was used to simulate 12 Shanghai urban cross-river tunnels, and a relationship was obtained between the DRI for the tunnels and the corresponding accident rate (AR) via a regression analysis. The regression analysis results showed that the relationship between the DRI and the AR maps to an exponential function with a high degree of fit. In the absence of detailed accident data, a safety evaluation model based on factors derived from a driving simulation can effectively assess the driving risk in urban cross-river tunnels that are constructed or in design.

  9. A condensed-mass advection based model for the simulation of liquid polar stratospheric clouds

    Directory of Open Access Journals (Sweden)

    D. Lowe

    2003-01-01

    We present a condensed-mass advection based model (MADVEC) designed to simulate the condensation/evaporation of liquid polar stratospheric cloud (PSC) particles. A Eulerian-in-radius discretization scheme is used, making the model suitable for use in global or mesoscale chemistry and transport models (CTMs). The mass advection equations are solved using an adaptation of the weighted average flux (WAF) scheme. We validate the numerical scheme using an analytical solution for multicomponent aerosols. The physics of the model are tested using a test case designed by Meilinger et al. (1995). The results from this test corroborate the composition gradients across the size distribution under rapid cooling conditions that were reported in earlier studies.

  10. A simulation model of hospital management based on cost accounting analysis according to disease.

    Science.gov (United States)

    Tanaka, Koji; Sato, Junzo; Guo, Jinqiu; Takada, Akira; Yoshihara, Hiroyuki

    2004-12-01

    Since a little before 2000, hospital cost accounting has been increasingly performed at Japanese national university hospitals. At Kumamoto University Hospital, for instance, departmental costs have been analyzed since 2000. And, since 2003, the cost balance has been obtained for individual diseases in preparation for Diagnosis-Related Groups and the Prospective Payment System. On the basis of these experiences, we have constructed a simulation model of hospital management. This program has worked correctly in repeated trials and with satisfactory speed. Although there is room for improvement in the detailed accounts and the cost accounting engine, the basic model has proved satisfactory. We have constructed a hospital management model based on the financial data of an existing hospital. We will later improve this program with respect to its construction and by using a wider variety of hospital management data. A prospective outlook may then be obtained for the practical application of this hospital management model.

  11. PEM fuel cell model and simulation in Matlab–Simulink based on physical parameters

    International Nuclear Information System (INIS)

    Abdin, Z.; Webb, C.J.; Gray, E.MacA.

    2016-01-01

    An advanced PEM fuel cell mathematical model is described and realised in four ancillaries in the Matlab–Simulink environment. Where possible, the model is based on parameters with direct physical meaning, with the aim of going beyond empirically describing the characteristics of the fuel cell. The model can therefore be used to predict enhanced performance owing to, for instance, improved electrode materials, and to relate changes in the measured performance to internal changes affecting influential physical parameters. Some simplifying assumptions make the model fairly light in computational demand and therefore amenable to extension to simulate an entire fuel-cell stack as part of an energy system. Despite these assumptions, the model emulates experimental data well, especially at high current density. The influences of pressure, temperature, humidification and reactant partial pressure on cell performance are explored. The dominating effect of membrane hydration is clearly revealed. - Highlights: • Model based on physical parameters where possible. • Effective binary diffusion modelled in detail on an atomistic basis. • The dominating effect of membrane hydration is clearly revealed. • Documented Simulink model so others can use it. • Conceived as a research tool for exploring enhanced fuel cell performance and diagnosing problems.
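
    A typical way such a model produces a polarization curve is to subtract activation, ohmic and concentration overpotentials from the Nernst open-circuit voltage. The sketch below uses generic textbook-style terms and coefficients (not the paper's physically derived parameters) merely to show the shape of the resulting curve.

    ```python
    import numpy as np

    def cell_voltage(i, T=343.0, p_h2=1.5, p_o2=1.0,
                     i0=1e-4, alpha=0.5, r_ohm=0.2, i_lim=1.4, B=0.05):
        """Single-cell PEM voltage [V] at current density i [A/cm^2].
        All coefficients are illustrative placeholders."""
        F, R = 96485.0, 8.314
        # Nernst (reversible) voltage with partial pressures in atm
        e_nernst = 1.229 - 0.85e-3 * (T - 298.15) + (R * T / (2 * F)) * np.log(p_h2 * np.sqrt(p_o2))
        eta_act = (R * T / (2 * alpha * F)) * np.log(i / i0)    # Tafel activation loss
        eta_ohm = r_ohm * i                                     # membrane + contact resistance [ohm cm^2]
        eta_conc = -B * np.log(1.0 - i / i_lim)                 # concentration loss near the limiting current
        return e_nernst - eta_act - eta_ohm - eta_conc

    for i in (0.05, 0.2, 0.6, 1.0, 1.3):
        print(f"i = {i:.2f} A/cm^2 -> V = {cell_voltage(i):.3f} V")
    ```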

  12. Simulation of seagrass bed mapping by satellite images based on the radiative transfer model

    Science.gov (United States)

    Sagawa, Tatsuyuki; Komatsu, Teruhisa

    2015-06-01

    Seagrass and seaweed beds play important roles in coastal marine ecosystems. They are food sources and habitats for many marine organisms, and influence the physical, chemical, and biological environment. They are sensitive to human impacts such as reclamation and pollution. Therefore, their management and preservation are necessary for a healthy coastal environment. Satellite remote sensing is a useful tool for mapping and monitoring seagrass beds. The efficiency of seagrass mapping, seagrass bed classification in particular, has been evaluated by mapping accuracy using an error matrix. However, mapping accuracies are influenced by coastal environments such as seawater transparency, bathymetry, and substrate type. Coastal management requires sufficient accuracy and an understanding of mapping limitations for monitoring coastal habitats including seagrass beds. Previous studies are mainly based on case studies in specific regions and seasons. Extensive data are required to generalise assessments of classification accuracy from case studies, which has proven difficult. This study aims to build a simulator based on a radiative transfer model to produce modelled satellite images and assess the visual detectability of seagrass beds under different transparencies and seagrass coverages, as well as to examine mapping limitations and classification accuracy. Our simulations led to the development of a model of water transparency and the mapping of depth limits and indicated the possibility for seagrass density mapping under certain ideal conditions. The results show that modelling satellite images is useful in evaluating the accuracy of classification and that establishing seagrass bed monitoring by remote sensing is a reliable tool.
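
    A minimal forward model of the kind used in such simulators is the two-flow shallow-water reflectance relation, in which the signal seen above a mixed sand/seagrass bottom decays toward the deep-water reflectance with depth. The attenuation coefficient and end-member reflectances below are illustrative values only; they show why a detectability limit in depth emerges.

    ```python
    import numpy as np

    def shallow_water_reflectance(depth_m, seagrass_cover, kd=0.15,
                                  r_deep=0.02, r_sand=0.30, r_seagrass=0.05):
        """Subsurface reflectance from a two-flow approximation:
        R = R_deep + (R_bottom - R_deep) * exp(-2 * Kd * z).
        The bottom is a linear mixture of sand and seagrass (illustrative values)."""
        r_bottom = seagrass_cover * r_seagrass + (1.0 - seagrass_cover) * r_sand
        return r_deep + (r_bottom - r_deep) * np.exp(-2.0 * kd * depth_m)

    depths = np.array([1.0, 3.0, 6.0, 10.0])
    for cover in (0.0, 0.5, 1.0):
        print(f"cover {cover:.0%}:", np.round(shallow_water_reflectance(depths, cover), 4))
    ```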

  13. Modeling and Simulation of Voids in Composite Tape Winding Process Based on Domain Superposition Technique

    Science.gov (United States)

    Deng, Bo; Shi, Yaoyao

    2017-11-01

    The tape winding technology is an effective way to fabricate rotationally symmetric composite products. Nevertheless, some inevitable defects will seriously influence the performance of winding products. One of the crucial ways to identify the quality of fiber-reinforced composite material products is examining their void content. Significant improvement in a product's mechanical properties can be achieved by minimizing the void defects. Two methods were applied in this study, finite element analysis and experimental testing, respectively, to investigate the mechanism of how voids form in composite tape winding processing. Based on the theories of interlayer intimate contact and the Domain Superposition Technique (DST), a three-dimensional model of prepreg tape voids has been built in SolidWorks. Thereafter, the ABAQUS simulation software was used to simulate the change of void content with pressure and temperature. Finally, a series of experiments were performed to determine the accuracy of the model-based predictions. The results showed that the model is effective for predicting the void content in the composite tape winding process.

  14. Models and Methods for Adaptive Management of Individual and Team-Based Training Using a Simulator

    Science.gov (United States)

    Lisitsyna, L. S.; Smetyuh, N. P.; Golikov, S. P.

    2017-05-01

    A review of research on adaptive individual and team-based training shows that, both in Russia and abroad, individual and team-based training and retraining of AASTM operators usually includes: production training, training in general computer and office equipment skills, simulator training including virtual simulators which use computers to simulate real-world manufacturing situations, and, as a rule, an evaluation of AASTM operators' knowledge determined by the completeness and adequacy of their actions under the simulated conditions. Such an approach to the training and retraining of AASTM operators provides only technical training of operators and tests their knowledge by assessing their actions in a simulated environment.

  15. Simulating the Reproductive Behavior of a Region’s Population with an Agent-Based Model

    Directory of Open Access Journals (Sweden)

    Valeriy Leonidovich Makarov

    2015-09-01

    The research analyses the impact of the unevenness of the demographic transition on the socio-demographic characteristics of a regional population and on the dynamics of these characteristics. The study was conducted with the help of computer-based experiments (simulations) run on an original agent-based model. The model is an artificial society, and the personal characteristics of its members are set so that they represent the age-demographic structure of a simulated region. The agents are divided into two subgroups, which differ in their reproductive strategy. The first group has a traditional strategy with a high birth rate. The second group has the considerably lower birth rate observed in modern developed societies. The model uses stochastic approaches to imitate the principal processes of population growth: mortality and morbidity. Mortality is set according to age-sex-specific mortality coefficients, which do not differ across the population as a whole. New agents (child births) appear as a choice made by agents who are women of reproductive age, and the choice depends on the subgroup. The overall age and social structure of the region is aggregated across the individual agents. A number of experiments have been carried out with the model. This allowed forecasting the size and structure of the population of a given region. The results of the experiments have revealed that, despite its simplicity, the developed agent-based model reproduces the initial conditions in the region well (e.g., its age-demographic and social structure). The model also shows a good fit in terms of estimating the dynamics of the major characteristics of the population.

  16. Volcano Modelling and Simulation gateway (VMSg): A new web-based framework for collaborative research in physical modelling and simulation of volcanic phenomena

    Science.gov (United States)

    Esposti Ongaro, T.; Barsotti, S.; de'Michieli Vitturi, M.; Favalli, M.; Longo, A.; Nannipieri, L.; Neri, A.; Papale, P.; Saccorotti, G.

    2009-12-01

    Physical and numerical modelling is becoming of increasing importance in volcanology and volcanic hazard assessment. However, new interdisciplinary problems arise when dealing with complex mathematical formulations, numerical algorithms and their implementations on modern computer architectures. Therefore, new frameworks are needed for sharing knowledge, software codes, and datasets among scientists. Here we present the Volcano Modelling and Simulation gateway (VMSg, accessible at http://vmsg.pi.ingv.it), a new electronic infrastructure for promoting knowledge growth and transfer in the field of volcanological modelling and numerical simulation. The new web portal, developed in the framework of former and ongoing national and European projects, is based on a dynamic Content Management System (CMS) and was developed to host and present numerical models of the main volcanic processes and relationships, including magma properties, magma chamber dynamics, conduit flow, plume dynamics, pyroclastic flows, lava flows, etc. Model applications, numerical code documentation, simulation datasets as well as model validation and calibration test cases are also part of the gateway material.

  17. Development of a compartment model based on CFD simulations for description of mixing in bioreactors

    Directory of Open Access Journals (Sweden)

    Crine, M.

    2010-01-01

    Full Text Available Understanding and modeling the complex interactions between biological reaction and hydrodynamics is a key problem when dealing with bioprocesses. It is fundamental to be able to accurately predict the hydrodynamic behavior of bioreactors of different sizes and its interaction with the biological reaction. CFD can provide detailed modeling of hydrodynamics and mixing. However, it is computationally intensive, especially when reactions are taken into account. Another way to predict hydrodynamics is the use of "compartment" or "multi-zone" models, which are much less demanding in computation time than CFD. However, the compartments and the fluxes between them are often defined from global quantities that are not representative of the flow. To overcome the limitations of these two methods, a solution is to combine compartment modeling and CFD simulations. Therefore, the aim of this study is to develop a methodology for building a compartment model based on CFD simulations of a bioreactor. The flow rate between two compartments can be easily computed from the velocity fields obtained by CFD. The difficulty lies in defining the zones in such a way that they can be considered perfectly mixed. The creation of the model compartments from CFD cells can be achieved manually or automatically. Manual zoning consists in aggregating CFD cells according to the user's wishes. Automatic zoning defines compartments as regions within which the value of one or several properties is uniform with respect to a given tolerance. Both manual and automatic zoning methods have been developed and compared by simulating the mixing of an inert scalar. For the automatic zoning, several algorithms and different flow properties have been tested as criteria for compartment creation.
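    As a rough illustration of the automatic zoning idea described above (my own sketch, not the authors' code; the cell data structure, tolerance and property values are hypothetical), the following region-growing routine aggregates neighbouring CFD cells into compartments whenever the chosen flow property is uniform within a given tolerance.

    ```python
    from collections import deque

    def automatic_zoning(values, neighbours, tolerance=0.05):
        """Group CFD cells into compartments by region growing.

        values     : dict cell_id -> value of the chosen flow property (e.g. velocity magnitude)
        neighbours : dict cell_id -> list of adjacent cell ids
        A cell joins a compartment if its value stays within `tolerance` of the
        compartment's seed value, so each compartment can be treated as
        (approximately) perfectly mixed.
        """
        compartment = {}
        current = 0
        for seed in values:
            if seed in compartment:
                continue
            current += 1
            compartment[seed] = current
            queue = deque([seed])
            while queue:
                cell = queue.popleft()
                for nb in neighbours[cell]:
                    if nb not in compartment and abs(values[nb] - values[seed]) <= tolerance:
                        compartment[nb] = current
                        queue.append(nb)
        return compartment

    # Tiny 1-D example: 6 cells in a row with a jump in the property between cells 2 and 3.
    vals = {0: 1.00, 1: 1.02, 2: 0.99, 3: 2.00, 4: 2.01, 5: 1.98}
    nbrs = {i: [j for j in (i - 1, i + 1) if 0 <= j <= 5] for i in range(6)}
    print(automatic_zoning(vals, nbrs))   # cells 0-2 and 3-5 end up in two compartments
    ```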

  18. Progress and improvement of KSTAR plasma control using model-based control simulators

    Energy Technology Data Exchange (ETDEWEB)

    Hahn, Sang-hee, E-mail: hahn76@nfri.re.kr [National Fusion Research Institute, 169-148 Gwahak-ro, yuseong-gu, Daejeon (Korea, Republic of); Welander, A.S. [General Atomics, San Diego, CA (United States); Yoon, S.W.; Bak, J.G. [National Fusion Research Institute, 169-148 Gwahak-ro, yuseong-gu, Daejeon (Korea, Republic of); Eidietis, N.W. [General Atomics, San Diego, CA (United States); Han, H.S. [National Fusion Research Institute, 169-148 Gwahak-ro, yuseong-gu, Daejeon (Korea, Republic of); Humphreys, D.A.; Hyatt, A. [General Atomics, San Diego, CA (United States); Jeon, Y.M. [National Fusion Research Institute, 169-148 Gwahak-ro, yuseong-gu, Daejeon (Korea, Republic of); Johnson, R.D. [General Atomics, San Diego, CA (United States); Kim, H.S.; Kim, J. [National Fusion Research Institute, 169-148 Gwahak-ro, yuseong-gu, Daejeon (Korea, Republic of); Kolemen, E.; Mueller, D. [Princeton Plasma Physics Laboratory, Princeton, NJ (United States); Penaflor, B.G.; Piglowski, D.A. [General Atomics, San Diego, CA (United States); Shin, G.W. [University of Science and Technology, Daejeon (Korea, Republic of); Walker, M.L. [General Atomics, San Diego, CA (United States); Woo, M.H. [National Fusion Research Institute, 169-148 Gwahak-ro, yuseong-gu, Daejeon (Korea, Republic of)

    2014-05-15

    Superconducting tokamaks like KSTAR, EAST and ITER need elaborate magnetic controls, mainly due to either the demanding experiment schedule or the tighter hardware limitations imposed by the superconducting coils. In order to reduce the operation runtime requirements, two types of plasma simulators for the KSTAR plasma control system (PCS) have been developed for improving axisymmetric magnetic controls. The first one is an open-loop type, which can reproduce the control done in an old shot by loading the corresponding diagnostics data and PCS setup. The other one, a closed-loop simulator based on a linear nonrigid plasma model, is designed to simulate dynamic responses of the plasma equilibrium and plasma current (I{sub p}) due to changes of the axisymmetric poloidal field (PF) coil currents, poloidal beta, and internal inductance. The closed-loop simulator is the one that can actually test and enable alteration of the feedback control setup for the next shot. The simulators have been used routinely in the 2012 plasma campaign, and the experimental performance of the axisymmetric shape control algorithm has been enhanced. The quality of the real-time EFIT has been improved by utilization of the open-loop type. Using the closed-loop type, the decoupling scheme of the plasma current control and the axisymmetric shape controls was verified through both simulations and experiments. Combined with the relay feedback tuning algorithm, the improved controls helped to maintain a shape suitable for longer H-mode (10–16 s), with the number of required commissioning shots largely reduced.

  19. Impacts of radiation exposure on the experimental microbial ecosystem: a particle-based model simulation approach

    International Nuclear Information System (INIS)

    Doi, M.; Tanaka, N.; Fuma, S.; Kawabata, Z.

    2004-01-01

    A well-designed experimental model ecosystem can serve as a simple reference for the actual environment and for complex ecological systems. For ecological toxicity testing of radiation and other environmental toxicants, we investigated an aquatic microbial ecosystem (closed microcosm) in a test tube with initial substrates: autotrophic flagellate algae (Euglena g.), heterotrophic ciliate protozoa (Tetrahymena t.) and saprotrophic bacteria (E. coli). These species organize themselves into an ecological system that maintains sustainable population dynamics for more than 2 years after inoculation, with only diurnal lighting and temperature control at 25 degrees Celsius. The objective of the study is to develop a particle-based computer simulation by reviewing the interactions among the microbes and their environment, and to analyze the ecological toxicity of radiation on the microcosm by replicating the experimental results in the computer simulation. (Author) 14 refs

  20. Performance evaluation of RANS-based turbulence models in simulating a honeycomb heat sink

    Science.gov (United States)

    Subasi, Abdussamet; Ozsipahi, Mustafa; Sahin, Bayram; Gunes, Hasan

    2017-07-01

    As is well known, there is no universal turbulence model that can be used for all engineering problems. Each turbulence model has specific applications for which it is appropriate, and it is vital to select a turbulence model and wall function combination that matches the physics of the problem considered. Therefore, in this study, the performance of six well-known Reynolds-Averaged Navier-Stokes (RANS) based turbulence models, namely the standard k-ε, the Renormalization Group (RNG) k-ε, the Realizable k-ε, the Reynolds Stress Model, the k-ω and the Shear Stress Transport (SST) k-ω, together with the accompanying wall functions (standard, non-equilibrium and enhanced), is evaluated via 3D simulation of a honeycomb heat sink. The CutCell method is used to generate the grid for the part containing the heat sink, called the test section, while a hexahedral mesh is employed to discretize the inlet and outlet sections. A grid convergence study is conducted for verification, while experimental data and well-known correlations are used to validate the numerical results. Prediction of the pressure drop along the test section, the mean base plate temperature of the heat sink and the temperature at the test section outlet are regarded as measures of the performance of the employed models and wall functions. The results indicate that the selection of turbulence models and wall functions has a great influence on the results and therefore needs to be made carefully. The hydraulic and thermal characteristics of the honeycomb heat sink can be determined with reasonable accuracy using RANS-based turbulence models, provided that a suitable turbulence model and wall function combination is selected.

  1. Modelling and simulation of electrical energy systems through a complex systems approach using agent-based models

    Energy Technology Data Exchange (ETDEWEB)

    Kremers, Enrique

    2013-10-01

    Complexity science aims to better understand the processes of both natural and man-made systems that are composed of many interacting entities at different scales. A disaggregated approach is proposed for simulating electricity systems by using agent-based models coupled to continuous ones. This approach can help in acquiring a better understanding of the operation of the system itself, e.g. of emergent phenomena or scale effects, as well as in the improvement and design of future smart grids.

  2. Diffusion dynamics and concentration of toxic materials from quantum dots-based nanotechnologies: an agent-based modeling simulation framework

    Energy Technology Data Exchange (ETDEWEB)

    Agusdinata, Datu Buyung, E-mail: bagusdinata@niu.edu; Amouie, Mahbod [Northern Illinois University, Department of Industrial & Systems Engineering and Environment, Sustainability, & Energy Institute (United States); Xu, Tao [Northern Illinois University, Department of Chemistry and Biochemistry (United States)

    2015-01-15

    Due to their favorable electrical and optical properties, quantum dot (QD) nanostructures have found numerous applications including nanomedicine and photovoltaic cells. However, increased future production, use, and disposal of engineered QD products also raise concerns about their potential environmental impacts. The objective of this work is to establish a modeling framework for predicting the diffusion dynamics and concentration of toxic materials released from Trioctylphosphine oxide-capped CdSe. To this end, an agent-based model simulation with reaction kinetics and Brownian motion dynamics was developed. Reaction kinetics is used to model the stability of the surface capping agent, particularly its degradation by the oxidation process. The diffusion of toxic Cd{sup 2+} ions in an aquatic environment was simulated using an adapted Brownian motion algorithm. A calibrated parameter reflecting sensitivity to the reaction rate is proposed. The model output demonstrates the stochastic spatial distribution of toxic Cd{sup 2+} ions under different values of proxy environmental factor parameters. With oxidation as the only chemistry considered, the simulation was able to replicate Cd{sup 2+} ion release from thiol-capped QDs in aerated water. The agent-based method is the first to be developed in the QD application domain. It combines the simplicity of modeling the solubility and release rate of Cd{sup 2+} ions with the complexity of tracking individual Cd atoms at the same time.
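    To illustrate the two ingredients the abstract names, reaction kinetics for the release of ions and Brownian motion for their spread, here is a minimal sketch (my own construction, not the authors' framework; the rate constant, diffusion coefficient and particle counts are hypothetical placeholders).

    ```python
    import math
    import random

    # Hypothetical parameters (illustrative only).
    K_OX = 0.05          # first-order oxidation/release rate constant [1/h]
    D = 1.0e-9           # diffusion coefficient of Cd2+ in water [m^2/s]
    DT = 60.0            # time step [s]
    N_QD = 200           # number of quantum-dot agents, each able to release one ion here

    ions = []            # positions (x, y, z) of released Cd2+ "ion agents"
    released = 0
    sigma = math.sqrt(2.0 * D * DT)          # standard deviation of each Brownian step

    for step in range(600):                  # simulate 600 steps (10 h at 60 s per step)
        # Reaction kinetics: each still-capped QD releases an ion with probability k*dt.
        p_release = K_OX * DT / 3600.0
        for _ in range(N_QD - released):
            if random.random() < p_release:
                ions.append([0.0, 0.0, 0.0])  # ion appears at the QD location (origin)
                released += 1
        # Brownian motion: every free ion takes an independent Gaussian step per axis.
        for pos in ions:
            for axis in range(3):
                pos[axis] += random.gauss(0.0, sigma)

    mean_r = sum(math.dist(p, (0, 0, 0)) for p in ions) / max(len(ions), 1)
    print(f"{released} ions released, mean distance from source {mean_r:.2e} m")
    ```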

  3. Diffusion dynamics and concentration of toxic materials from quantum dots-based nanotechnologies: an agent-based modeling simulation framework

    International Nuclear Information System (INIS)

    Agusdinata, Datu Buyung; Amouie, Mahbod; Xu, Tao

    2015-01-01

    Due to their favorable electrical and optical properties, quantum dot (QD) nanostructures have found numerous applications including nanomedicine and photovoltaic cells. However, increased future production, use, and disposal of engineered QD products also raise concerns about their potential environmental impacts. The objective of this work is to establish a modeling framework for predicting the diffusion dynamics and concentration of toxic materials released from Trioctylphosphine oxide-capped CdSe. To this end, an agent-based model simulation with reaction kinetics and Brownian motion dynamics was developed. Reaction kinetics is used to model the stability of the surface capping agent, particularly its degradation by the oxidation process. The diffusion of toxic Cd2+ ions in an aquatic environment was simulated using an adapted Brownian motion algorithm. A calibrated parameter reflecting sensitivity to the reaction rate is proposed. The model output demonstrates the stochastic spatial distribution of toxic Cd2+ ions under different values of proxy environmental factor parameters. With oxidation as the only chemistry considered, the simulation was able to replicate Cd2+ ion release from thiol-capped QDs in aerated water. The agent-based method is the first to be developed in the QD application domain. It combines the simplicity of modeling the solubility and release rate of Cd2+ ions with the complexity of tracking individual Cd atoms at the same time.

  4. Collaborative Management of Complex Major Construction Projects: AnyLogic-Based Simulation Modelling

    Directory of Open Access Journals (Sweden)

    Na Zhao

    2016-01-01

    Full Text Available Collaborative management of the complex supply chain system of a major construction project effectively integrates the different participants in the project. This paper establishes a simulation model based on AnyLogic to reveal the collaborative elements in the complex supply chain management system and their modes of action, as well as the problems in transmitting intent information. This promotes the development of the participants into an organism with coordinated development and coevolution. The study can help improve the efficiency and management of the complex system of major construction projects.

  5. Using an Agent-Based Modeling Simulation and Game to Teach Socio-Scientific Topics

    Directory of Open Access Journals (Sweden)

    Lori L. Scarlatos

    2014-02-01

    Full Text Available In our modern world, where science, technology and society are tightly interwoven, it is essential that all students be able to evaluate scientific evidence and make informed decisions. Energy Choices, an agent-based simulation with a multiplayer game interface, was developed as a learning tool that models the interdependencies between the energy choices that are made, growth in local economies, and climate change on a global scale. This paper presents the results of pilot testing Energy Choices in two different settings, using two different modes of delivery.

  6. Customer social network affects marketing strategy: A simulation analysis based on competitive diffusion model

    Science.gov (United States)

    Hou, Rui; Wu, Jiawen; Du, Helen S.

    2017-03-01

    To explain the competition phenomenon and its outcome between QQ and MSN (China) in the Chinese instant messaging software market, this paper developed a new population competition model based on a customer social network. The simulation results show that the firm whose product has the greater network externality effect will gain more market share than its rival when both use the same marketing strategy. The firm with a time advantage, derived from the initial scale effect, becomes more competitive than its rival when facing a group of common penguin customers within a social network, verifying the winner-take-all phenomenon in this case.
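    The abstract does not give the model equations; as a generic illustration of a two-product competitive diffusion ("population competition") model of the kind described, the following sketch integrates a simple adoption system in which a network-externality term favours the product with the larger installed base. All coefficients are hypothetical.

    ```python
    # Minimal two-product competitive diffusion sketch (hypothetical coefficients).
    # x, y are the installed bases of two competing products in a market of size M.
    M = 1.0                      # market size (normalised)
    p, q = 0.01, 0.35            # innovation and imitation coefficients (same marketing effort)
    ext_x, ext_y = 0.25, 0.10    # network-externality strength of product X and product Y
    dt, steps = 0.1, 1000

    x, y = 0.02, 0.01            # product X also starts with a small initial-scale advantage
    for _ in range(steps):
        free = max(M - x - y, 0.0)          # customers not yet adopting either product
        dx = (p + q * x / M + ext_x * x) * free - 0.02 * x   # adoption minus churn
        dy = (p + q * y / M + ext_y * y) * free - 0.02 * y
        x, y = x + dx * dt, y + dy * dt

    print(f"final shares: X = {x / (x + y):.2f}, Y = {y / (x + y):.2f}")
    ```

    With these illustrative numbers the product with the stronger externality and earlier start captures most of the market, which is the winner-take-all pattern the abstract refers to.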

  7. A CAD based geometry model for simulation and analysis of particle detector data

    Energy Technology Data Exchange (ETDEWEB)

    Milde, Michael; Losekamm, Martin; Poeschl, Thomas; Greenwald, Daniel; Paul, Stephan [Technische Universitaet Muenchen, 85748 Garching (Germany)

    2016-07-01

    The development of a new particle detector requires a good understanding of its setup. A detailed model of the detector's geometry is needed not only during construction, but also for simulation and data analysis. To arrive at a consistent description of the detector geometry, a representation is needed that can be easily implemented in the different software tools used during data analysis. We developed a geometry representation based on CAD files that can be easily used within the Geant4 simulation framework and in analysis tools based on the ROOT framework. This talk presents the structure of the geometry model and shows its implementation using the example of the event reconstruction developed for the Multi-purpose Active-target Particle Telescope (MAPT). The detector consists of scintillating plastic fibers and can be used as a tracking detector and calorimeter with omnidirectional acceptance. To optimize the angular resolution and the energy reconstruction of measured particles, a detailed detector model is needed at all stages of the reconstruction.

  8. Advances in model-based software for simulating ultrasonic immersion inspections of metal components

    Science.gov (United States)

    Chiou, Chien-Ping; Margetan, Frank J.; Taylor, Jared L.; Engle, Brady J.; Roberts, Ronald A.

    2018-04-01

    Under the sponsorship of the National Science Foundation's Industry/University Cooperative Research Center at ISU, an effort was initiated in 2015 to repackage existing research-grade software into user-friendly tools for the rapid estimation of signal-to-noise ratio (SNR) for ultrasonic inspections of metals. The software combines: (1) a Python-based graphical user interface for specifying an inspection scenario and displaying results; and (2) a Fortran-based engine for computing defect signals and backscattered grain noise characteristics. The latter makes use of the Thompson-Gray measurement model for the response from an internal defect, and the Thompson-Margetan independent scatterer model for backscattered grain noise. This paper, the third in the series [1-2], provides an overview of the ongoing modeling effort with emphasis on recent developments. These include the ability to: (1) treat microstructures where grain size, shape and tilt relative to the incident sound direction can all vary with depth; and (2) simulate C-scans of defect signals in the presence of backscattered grain noise. The simulation software can now treat both normal and oblique-incidence immersion inspections of curved metal components. Both longitudinal and shear-wave inspections are treated. The model transducer can be planar, spherically focused, or bi-cylindrically focused. A calibration (or reference) signal is required and is used to deduce the measurement system efficiency function. This can be "invented" by the software using center frequency and bandwidth information specified by the user, or, alternatively, a measured calibration signal can be used. Defect types include flat-bottomed-hole reference reflectors, and spherical pores and inclusions. Simulation outputs include estimated defect signal amplitudes, root-mean-square values of grain noise amplitudes, and SNR as functions of the depth of the defect within the metal component. At any particular depth, the user can view

  9. A Forward GPS Multipath Simulator Based on the Vegetation Radiative Transfer Equation Model.

    Science.gov (United States)

    Wu, Xuerui; Jin, Shuanggen; Xia, Junming

    2017-06-05

    Global Navigation Satellite Systems (GNSS) have been widely used in navigation, positioning and timing. Nowadays, the multipath errors may be re-utilized for the remote sensing of geophysical parameters (soil moisture, vegetation and snow depth), i.e., GPS-Multipath Reflectometry (GPS-MR). However, the bistatic scattering properties and the relation between GPS observables and geophysical parameters are not clear, e.g., for vegetation. In this paper, a new element describing the bistatic scattering properties of vegetation is incorporated into the traditional GPS-MR model. This new element is a first-order radiative transfer equation model. The new forward GPS multipath simulator is able to explicitly link the vegetation parameters with GPS multipath observables (signal-to-noise ratio (SNR), code pseudorange and carrier phase observables). The trunk layer and its corresponding scattering mechanisms are ignored, since GPS-MR is not suitable for high forest monitoring due to the coherence of direct and reflected signals. Based on this new model, the developed simulator can show how the GPS signals (L1 and L2 carrier frequencies; C/A, P(Y) and L2C modulations) are transmitted (scattered and absorbed) through the vegetation medium and received by GPS receivers. Simulation results show that wheat decreases the amplitudes of the GPS multipath observables (SNR, phase and code) as the vegetation moisture content or the scatterer sizes (stem or leaf) increase. Although the Specular-Ground component dominates the total specular scattering, the soil moisture of vegetation-covered ground has almost no effect on the final multipath signatures. Our simulated results are consistent with previous results for environmental parameter detection by GPS-MR.

  10. Improvement of the physically-based groundwater model simulations through complementary correction of its errors

    Directory of Open Access Journals (Sweden)

    Jorge Mauricio Reyes Alcalde

    2017-04-01

    Full Text Available Physically-Based groundwater Models (PBM), such as MODFLOW, are used as groundwater resource evaluation tools under the assumption that the produced differences (residuals or errors) are white noise. In fact, however, these numerical simulations usually show not only random errors but also systematic errors. In this work, a numerical procedure has been developed to deal with PBM systematic errors by studying their structure in order to model their behavior and correct the results by external and complementary means, through a framework called the Complementary Correction Model (CCM). The application of CCM to a PBM shows a decrease in local biases, a better distribution of errors and reductions in their temporal and spatial correlations, with a 73% reduction in global RMSN over the original PBM. This methodology appears to be an interesting way to update a PBM while avoiding the work and costs of interfering with its internal structure.
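    The abstract does not specify the form of the complementary correction; as a generic illustration of correcting a physically-based model's systematic errors with an external model of its residuals, the sketch below fits a simple AR(1)-plus-bias model to past residuals and uses it to correct later simulations. The data and coefficients are hypothetical.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical example: "observed" heads and a PBM output with bias + correlated error.
    t = np.arange(200)
    observed = 50 + 2 * np.sin(2 * np.pi * t / 50)
    error = np.zeros_like(observed)
    for i in range(1, len(t)):                       # AR(1) systematic error
        error[i] = 0.8 * error[i - 1] + rng.normal(0, 0.2)
    simulated = observed + 1.5 + error               # PBM output = truth + bias + error

    # Complementary correction: model residuals r_k = a + b * r_{k-1}, fitted on a calibration period.
    resid = observed - simulated
    calib = slice(0, 150)
    b, a = np.polyfit(resid[calib][:-1], resid[calib][1:], 1)

    # Apply the correction one step ahead over the validation period.
    corrected = simulated.copy()
    for i in range(150, len(t)):
        predicted_resid = a + b * (observed[i - 1] - simulated[i - 1])
        corrected[i] = simulated[i] + predicted_resid

    rmse = lambda x: float(np.sqrt(np.mean((x[150:] - observed[150:]) ** 2)))
    print(f"RMSE raw PBM: {rmse(simulated):.2f}  RMSE corrected: {rmse(corrected):.2f}")
    ```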

  11. A physics based method for combining multiple anatomy models with application to medical simulation.

    Science.gov (United States)

    Zhu, Yanong; Magee, Derek; Ratnalingam, Rishya; Kessel, David

    2009-01-01

    We present a physics-based approach to the construction of anatomy models by combining components from different sources: different image modalities, protocols, and patients. Given an initial anatomy, a mass-spring model is generated which mimics the physical properties of the solid anatomy components. This helps maintain valid spatial relationships between the components, as well as the validity of their shapes. A combination can consist of either replacing or modifying an existing component, or inserting a new one. The external forces that deform the model components to fit the new shape are estimated from Gradient Vector Flow and Distance Transform maps. We demonstrate the applicability and validity of the described approach in the area of medical simulation by showing the processes of non-rigid surface alignment, component replacement, and component insertion.
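    As a generic illustration of the mass-spring machinery mentioned above (not the authors' implementation; the node layout, stiffness, damping and external force are hypothetical), the sketch below relaxes a small mass-spring system under an external force using symplectic Euler integration.

    ```python
    import numpy as np

    # Hypothetical 4-node mass-spring system (a unit square with its two diagonals).
    nodes = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
    springs = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2), (1, 3)]
    rest = {s: np.linalg.norm(nodes[s[0]] - nodes[s[1]]) for s in springs}

    k, damping, mass, dt = 50.0, 2.0, 1.0, 0.01
    vel = np.zeros_like(nodes)
    external = np.zeros_like(nodes)
    external[2] = [0.5, 0.0]          # push node 2 to the right (stand-in for a GVF/DT force)

    for _ in range(2000):             # symplectic Euler relaxation
        force = external.copy()
        for (i, j) in springs:        # linear springs pull nodes toward their rest length
            d = nodes[j] - nodes[i]
            length = np.linalg.norm(d)
            f = k * (length - rest[(i, j)]) * d / length
            force[i] += f
            force[j] -= f
        force -= damping * vel        # viscous damping keeps the relaxation stable
        vel += force / mass * dt
        nodes += vel * dt

    print(np.round(nodes, 3))         # deformed, but still roughly square, configuration
    ```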

  12. Reconstruction of the external dose of evacuees from the contaminated areas based on simulation modelling

    International Nuclear Information System (INIS)

    Meckbach, R.; Chumak, V.V.

    1996-01-01

    Model calculations are being performed for the reconstruction of individual external gamma doses of population evacuated during the Chernobyl accident from the city of Pripyat and other settlements of the 30-km zone. The models are based on sets of dose rate measurements performed during the accident, on individual behavior histories of more than 30000 evacuees obtained by questionnaire survey and on location factors determined for characteristic housing buildings. Location factors were calculated by Monte Carlo simulations of photon transport for a typical housing block and village houses. Stochastic models for individual external dose reconstruction are described. Using Monte Carlo methods, frequency distributions representing the uncertainty of doses are calculated from an assessment of the uncertainty of the data. The determination of dose rate distributions in Pripyat is discussed. Exemplary results for individual external doses are presented
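    As a schematic illustration of the stochastic dose-reconstruction idea (not the authors' model; the dose-rate history, location factor and behaviour hours are invented placeholders), the sketch below propagates the uncertainty of the dose rate and of the location factor through a simple exposure history by Monte Carlo sampling.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    N = 100_000                              # number of Monte Carlo realisations

    # Hypothetical exposure history before evacuation: hours spent outdoors and indoors.
    hours_outdoor, hours_indoor = 10.0, 26.0

    # Hypothetical uncertain inputs (lognormal spreads chosen for illustration only).
    dose_rate = rng.lognormal(mean=np.log(2.0), sigma=0.4, size=N)        # mGy/h outdoors
    location_factor = rng.lognormal(mean=np.log(0.2), sigma=0.3, size=N)  # indoor shielding

    # External dose for each realisation: outdoor exposure + shielded indoor exposure.
    dose = dose_rate * hours_outdoor + dose_rate * location_factor * hours_indoor

    print(f"median dose  : {np.median(dose):6.1f} mGy")
    print(f"95% interval : {np.percentile(dose, 2.5):6.1f} - {np.percentile(dose, 97.5):6.1f} mGy")
    ```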

  13. Numerical simulation of countercurrent flow based on two-fluid model

    Energy Technology Data Exchange (ETDEWEB)

    Chen, H.D. [Institute of Nuclear Engineering and Technology, Sun Yat-sen University, Zhuhai 519082 (China); School of Electric Power, South China University of Technology, Guangzhou 510640 (China); Zhang, X.Y., E-mail: zxiaoying@mail.sysu.edu.cn [Institute of Nuclear Engineering and Technology, Sun Yat-sen University, Zhuhai 519082 (China)

    2017-03-15

    Highlights: • A one-dimensional two-fluid model is used to help understand counter-current two-phase flows. • A surface tension model makes the one-dimensional two-fluid model well-posed. • The governing equations are solved with a modified SIMPLE algorithm. • The code is validated against experimental data and applied to a vertical air/steam countercurrent flow condition. - Abstract: In order to improve the understanding of counter-current two-phase flows, a transient analysis code has been developed based on a one-dimensional two-fluid model. A six-equation model has been established, and a two-phase pressure model with a surface tension term, wall drag forces and interface shear terms has been used. To account for transport phenomena, heat and mass transfer models for the interface were incorporated. Staggered grids have been used in the discretization of the equations. For validation of the model and code, a countercurrent air-water problem from an experimental horizontal stratified flow was considered first. Comparison of the computed results with the experimental ones shows satisfactory agreement. As the full problem of investigation, a vertical pipe with countercurrent flow of steam-water and of air-water at the same boundary conditions was studied. The transient distributions of liquid fraction, liquid velocity and gas velocity at selected positions for the steam-water and air-water problems are presented and discussed. The results show that the two simulations have similar transient behavior, except that the gas velocity distribution for the steam-water problem shows larger oscillations than the one for air-water. The effect of mesh size on the wavy characteristics of the interface surface was also investigated; the mesh size has a significant influence on the simulated results, and with increased refinement the oscillation becomes stronger.

  14. Fission yield calculation using toy model based on Monte Carlo simulation

    International Nuclear Information System (INIS)

    Jubaidah; Kurniadi, Rizal

    2015-01-01

    The toy model is a new approximation for predicting the fission yield distribution. It treats the nucleus as an elastic toy consisting of marbles, where the number of marbles represents the number of nucleons, A. This toy nucleus is able to imitate the real nucleus properties. In this research, the toy nucleons are only influenced by a central force. A heavy toy nucleus induced by a toy nucleon splits into two fragments; these two fission fragments are called the fission yield. Energy entanglement is neglected. The fission process in the toy model is illustrated by two Gaussian curves intersecting each other. Five Gaussian parameters are used in this research: the scission point of the two curves (R_c), the means of the left and right curves (μ_L and μ_R), and the deviations of the left and right curves (σ_L and σ_R). The fission yield distribution is analyzed based on Monte Carlo simulation. The results show that variation in σ or μ can significantly shift the average frequency of asymmetric fission yields and also varies the range of the fission yield distribution probability. In addition, variation in the iteration coefficient only changes the frequency of the fission yields. Monte Carlo simulation of the fission yield calculation using the toy model successfully indicates the same tendency as the experimental results, where the average of the light fission yield is in the range of 90
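    As a rough illustration of sampling a fission-yield distribution built from two intersecting Gaussians, the sketch below draws fragment masses from such a double-Gaussian and summarizes the light-fragment peak; the parameter values are invented and not the paper's.

    ```python
    import random
    from collections import Counter

    # Hypothetical double-Gaussian fission-yield parameters (illustrative only).
    A_FISSIONING = 236          # mass number of the fissioning toy nucleus
    MU_L, SIGMA_L = 96.0, 5.0   # light-fragment peak
    MU_R, SIGMA_R = 140.0, 5.0  # heavy-fragment peak

    def sample_fragment_pair():
        """Draw one fission event: pick a peak, sample one fragment, pair it by mass balance."""
        if random.random() < 0.5:
            a_light = random.gauss(MU_L, SIGMA_L)
        else:
            a_light = A_FISSIONING - random.gauss(MU_R, SIGMA_R)
        a_light = max(1, min(A_FISSIONING - 1, round(a_light)))
        return a_light, A_FISSIONING - a_light   # mass is conserved between the fragments

    counts = Counter()
    for _ in range(100_000):
        light, heavy = sample_fragment_pair()
        counts[light] += 1
        counts[heavy] += 1

    light_peak = {a: n for a, n in counts.items() if a < A_FISSIONING / 2}
    avg_light = sum(a * n for a, n in light_peak.items()) / sum(light_peak.values())
    print(f"average light-fragment mass: {avg_light:.1f}")
    ```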

  15. Fission yield calculation using toy model based on Monte Carlo simulation

    Energy Technology Data Exchange (ETDEWEB)

    Jubaidah, E-mail: jubaidah@student.itb.ac.id [Nuclear Physics and Biophysics Division, Department of Physics, Bandung Institute of Technology. Jl. Ganesa No. 10 Bandung – West Java, Indonesia 40132 (Indonesia); Physics Department, Faculty of Mathematics and Natural Science – State University of Medan. Jl. Willem Iskandar Pasar V Medan Estate – North Sumatera, Indonesia 20221 (Indonesia); Kurniadi, Rizal, E-mail: rijalk@fi.itb.ac.id [Nuclear Physics and Biophysics Division, Department of Physics, Bandung Institute of Technology. Jl. Ganesa No. 10 Bandung – West Java, Indonesia 40132 (Indonesia)

    2015-09-30

    The toy model is a new approximation for predicting the fission yield distribution. It treats the nucleus as an elastic toy consisting of marbles, where the number of marbles represents the number of nucleons, A. This toy nucleus is able to imitate the real nucleus properties. In this research, the toy nucleons are only influenced by a central force. A heavy toy nucleus induced by a toy nucleon splits into two fragments; these two fission fragments are called the fission yield. Energy entanglement is neglected. The fission process in the toy model is illustrated by two Gaussian curves intersecting each other. Five Gaussian parameters are used in this research: the scission point of the two curves (R{sub c}), the means of the left and right curves (μ{sub L} and μ{sub R}), and the deviations of the left and right curves (σ{sub L} and σ{sub R}). The fission yield distribution is analyzed based on Monte Carlo simulation. The results show that variation in σ or μ can significantly shift the average frequency of asymmetric fission yields and also varies the range of the fission yield distribution probability. In addition, variation in the iteration coefficient only changes the frequency of the fission yields. Monte Carlo simulation of the fission yield calculation using the toy model successfully indicates the same tendency as the experimental results, where the average of the light fission yield is in the range of 90

  16. Study on driver model for hybrid truck based on driving simulator experimental results

    Directory of Open Access Journals (Sweden)

    Dam Hoang Phuc

    2018-04-01

    Full Text Available In this paper, a proposed car-following driver model, taking into account features of both the compensatory and anticipatory models of human pedal operation, has been verified by driving simulator experiments with several real drivers. The comparison of computer simulations performed with the identified model parameters against the experimental results confirms the correctness of this mathematical driver model and of the identified parameters. The driver model is then coupled to a hybrid vehicle dynamics model, and moderate car-following maneuver simulations with various driver parameters are conducted to investigate the influence of driver parameters on the vehicle dynamic response and fuel economy. Finally, the major driver parameters involved in drivers' longitudinal control are clarified. Keywords: Driver model, Driver-vehicle closed-loop system, Car following, Driving simulator/hybrid electric vehicle (B1
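    The abstract does not give the model equations; as a generic stand-in for a compensatory car-following law of the kind discussed, the sketch below simulates a follower whose pedal command responds to spacing and relative-speed errors with a reaction delay. The gains, delay and lead-vehicle profile are hypothetical.

    ```python
    import numpy as np

    dt, T = 0.05, 60.0                      # time step [s] and simulated duration [s]
    n = int(T / dt)
    delay_steps = int(0.6 / dt)             # hypothetical driver reaction delay of 0.6 s
    k_gap, k_rel = 0.15, 0.45               # hypothetical compensatory gains
    desired_gap = 20.0                      # desired spacing [m]

    # Lead vehicle: cruise at 15 m/s, then slow to 10 m/s halfway through the run.
    v_lead = np.where(np.arange(n) * dt < 30.0, 15.0, 10.0)
    x_lead = np.cumsum(v_lead) * dt + 50.0

    x_f, v_f = 0.0, 15.0                    # follower initial position and speed
    a_cmd = np.zeros(n)                     # commanded accelerations (pedal operation)

    for i in range(n):
        gap = x_lead[i] - x_f
        rel_speed = v_lead[i] - v_f
        # Compensatory control on spacing and relative-speed errors.
        a_cmd[i] = np.clip(k_gap * (gap - desired_gap) + k_rel * rel_speed, -3.0, 2.0)
        a_applied = a_cmd[max(i - delay_steps, 0)]   # delayed execution by the driver
        v_f = max(v_f + a_applied * dt, 0.0)
        x_f += v_f * dt

    print(f"final gap: {x_lead[-1] - x_f:.1f} m, final speed: {v_f:.1f} m/s")
    ```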

  17. Fluid, solid and fluid-structure interaction simulations on patient-based abdominal aortic aneurysm models.

    Science.gov (United States)

    Kelly, Sinead; O'Rourke, Malachy

    2012-04-01

    This article describes the use of fluid, solid and fluid-structure interaction simulations on three patient-based abdominal aortic aneurysm geometries. All simulations were carried out using OpenFOAM, which uses the finite volume method to solve both fluid and solid equations. Initially a fluid-only simulation was carried out on a single patient-based geometry and results from this simulation were compared with experimental results. There was good qualitative and quantitative agreement between the experimental and numerical results, suggesting that OpenFOAM is capable of predicting the main features of unsteady flow through a complex patient-based abdominal aortic aneurysm geometry. The intraluminal thrombus and arterial wall were then included, and solid stress and fluid-structure interaction simulations were performed on this, and two other patient-based abdominal aortic aneurysm geometries. It was found that the solid stress simulations resulted in an under-estimation of the maximum stress by up to 5.9% when compared with the fluid-structure interaction simulations. In the fluid-structure interaction simulations, flow induced pressure within the aneurysm was found to be up to 4.8% higher than the value of peak systolic pressure imposed in the solid stress simulations, which is likely to be the cause of the variation in the stress results. In comparing the results from the initial fluid-only simulation with results from the fluid-structure interaction simulation on the same patient, it was found that wall shear stress values varied by up to 35% between the two simulation methods. It was concluded that solid stress simulations are adequate to predict the maximum stress in an aneurysm wall, while fluid-structure interaction simulations should be performed if accurate prediction of the fluid wall shear stress is necessary. Therefore, the decision to perform fluid-structure interaction simulations should be based on the particular variables of interest in a given

  18. Simulation-based Education for Endoscopic Third Ventriculostomy : A Comparison Between Virtual and Physical Training Models

    NARCIS (Netherlands)

    Breimer, Gerben E.; Haji, Faizal A.; Bodani, Vivek; Cunningham, Melissa S.; Lopez-Rios, Adriana-Lucia; Okrainec, Allan; Drake, James M.

    BACKGROUND: The relative educational benefits of virtual reality (VR) and physical simulation models for endoscopic third ventriculostomy (ETV) have not been evaluated "head to head." OBJECTIVE: To compare and identify the relative utility of a physical and VR ETV simulation model for use in

  19. Strategic Plan for Nuclear Energy -- Knowledge Base for Advanced Modeling and Simulation (NE-KAMS)

    Energy Technology Data Exchange (ETDEWEB)

    Rich Johnson; Kimberlyn C. Mousseau; Hyung Lee

    2011-09-01

    NE-KAMS knowledge base will assist computational analysts, physics model developers, experimentalists, nuclear reactor designers, and federal regulators by: (1) Establishing accepted standards, requirements and best practices for V&V and UQ of computational models and simulations, (2) Establishing accepted standards and procedures for qualifying and classifying experimental and numerical benchmark data, (3) Providing readily accessible databases for nuclear energy related experimental and numerical benchmark data that can be used in V&V assessments and computational methods development, (4) Providing a searchable knowledge base of information, documents and data on V&V and UQ, and (5) Providing web-enabled applications, tools and utilities for V&V and UQ activities, data assessment and processing, and information and data searches. From its inception, NE-KAMS will directly support nuclear energy research, development and demonstration programs within the U.S. Department of Energy (DOE), including the Consortium for Advanced Simulation of Light Water Reactors (CASL), the Nuclear Energy Advanced Modeling and Simulation (NEAMS), the Light Water Reactor Sustainability (LWRS), the Small Modular Reactors (SMR), and the Next Generation Nuclear Power Plant (NGNP) programs. These programs all involve computational modeling and simulation (M&S) of nuclear reactor systems, components and processes, and it is envisioned that NE-KAMS will help to coordinate and facilitate collaboration and sharing of resources and expertise for V&V and UQ across these programs. In addition, from the outset, NE-KAMS will support the use of computational M&S in the nuclear industry by developing guidelines and recommended practices aimed at quantifying the uncertainty and assessing the applicability of existing analysis models and methods. The NE-KAMS effort will initially focus on supporting the use of computational fluid dynamics (CFD) and thermal hydraulics (T/H) analysis for M&S of nuclear

  20. Combining integrated river modelling and agent based social simulation for river management; The case study of the Grensmaas project

    NARCIS (Netherlands)

    Valkering, P.; Krywkow, Jorg; Rotmans, J.; van der Veen, A.; Douben, N.; van Os, A.G.

    2003-01-01

    In this paper we present a coupled Integrated River Model – Agent Based Social Simulation model (IRM-ABSS) for river management. The models represent the case of the ongoing river engineering project “Grensmaas”. In the ABSS model stakeholders are represented as computer agents negotiating a river

  1. Strategic Plan for Nuclear Energy -- Knowledge Base for Advanced Modeling and Simulation (NE-KAMS)

    Energy Technology Data Exchange (ETDEWEB)

    Kimberlyn C. Mousseau

    2011-10-01

    The Nuclear Energy Computational Fluid Dynamics Advanced Modeling and Simulation (NE-CAMS) system is being developed at the Idaho National Laboratory (INL) in collaboration with Bettis Laboratory, Sandia National Laboratory (SNL), Argonne National Laboratory (ANL), Utah State University (USU), and other interested parties with the objective of developing and implementing a comprehensive and readily accessible data and information management system for computational fluid dynamics (CFD) verification and validation (V&V) in support of nuclear energy systems design and safety analysis. The two key objectives of the NE-CAMS effort are to identify, collect, assess, store and maintain high resolution and high quality experimental data and related expert knowledge (metadata) for use in CFD V&V assessments specific to the nuclear energy field, and to establish a working relationship with the U.S. Nuclear Regulatory Commission (NRC) to develop a CFD V&V database, including benchmark cases, that addresses and supports the associated NRC regulations and policies on the use of CFD analysis. In particular, the NE-CAMS system will support the Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Program, which aims to develop and deploy advanced modeling and simulation methods and computational tools for reliable numerical simulation of nuclear reactor systems for design and safety analysis. Primary NE-CAMS elements: there are four primary elements of the NE-CAMS knowledge base designed to support computer modeling and simulation in the nuclear energy arena, as listed below. Element 1: the database will contain experimental data that can be used for CFD validation and that is relevant to nuclear reactor and plant processes, particularly those important to the nuclear industry and the NRC. Element 2: qualification standards for data evaluation and classification will be incorporated and applied such that validation data sets will result in well

  2. Electromagnetic Computation and Visualization of Transmission Particle Model and Its Simulation Based on GPU

    Directory of Open Access Journals (Sweden)

    Yingnian Wu

    2014-01-01

    Full Text Available Electromagnetic calculation plays an important role in both military and civil fields. Some methods and models proposed for calculating electromagnetic wave propagation over a large range impose a heavy computational burden on the CPU and also require a huge amount of memory. Using the GPU to accelerate computation and visualization can reduce this burden. Based on the forward ray-tracing method, a transmission particle model (TPM) for calculating the electromagnetic field is presented, which combines ray tracing with a particle method. The movement of a particle obeys the principle of electromagnetic wave propagation, so the particle distribution density in space reflects the electromagnetic field distribution. The algorithm, covering particle transmission, movement, reflection, and diffraction, is described in detail. Since the particles in the TPM are completely independent, the model is very well suited to parallel computing on the GPU. A verification of the TPM with an electric dipole antenna as the transmission source is conducted to show that the particle movement itself represents the variation of electromagnetic field intensity caused by diffusion. Finally, simulation comparisons are made against the forward and backward ray-tracing methods. The simulation results verify the effectiveness of the proposed method.

  3. Construction Process Simulation and Safety Analysis Based on Building Information Model and 4D Technology

    Institute of Scientific and Technical Information of China (English)

    HU Zhenzhong; ZHANG Jianping; DENG Ziyin

    2008-01-01

    Time-dependent structural analysis theory has been proved to be more accurate and reliable than commonly used methods during construction. However, so far its applications have been limited to a partial period and part of the structure because of immeasurable artificial intervention. Based on the building information model (BIM) and four-dimensional (4D) technology, this paper proposes an improved structural analysis method, which can generate the structural geometry, resistance model, and loading conditions automatically through a close interlink of the schedule information, architectural model, and material properties. The method was applied to a safety analysis during a continuous and dynamic simulation of the entire construction process. The results show that the organic combination of BIM, 4D technology, construction simulation, and safety analysis of time-dependent structures is feasible and practical. This research also lays a foundation for further research on building lifecycle management by combining architectural design, structural analysis, and construction management.

  4. ML-Space: Hybrid Spatial Gillespie and Particle Simulation of Multi-Level Rule-Based Models in Cell Biology.

    Science.gov (United States)

    Bittig, Arne T; Uhrmacher, Adelinde M

    2017-01-01

    Spatio-temporal dynamics of cellular processes can be simulated at different levels of detail, from (deterministic) partial differential equations via the spatial Stochastic Simulation Algorithm to tracking Brownian trajectories of individual particles. We present a spatial simulation approach for multi-level rule-based models, which includes dynamically and hierarchically nested cellular compartments and entities. Our approach, ML-Space, combines discrete compartmental dynamics, stochastic spatial approaches in discrete space, and particles moving in continuous space. The rule-based specification language of ML-Space supports concise and compact descriptions of models and makes it easy to adapt their spatial resolution.
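    As a small, self-contained illustration of the stochastic simulation layer mentioned above (a plain Gillespie SSA for a single well-mixed compartment, not ML-Space itself; the species, rates and counts are hypothetical), consider the following sketch.

    ```python
    import math
    import random

    # Hypothetical reaction system in one well-mixed compartment:
    #   A + B -> C  (rate k1),   C -> A + B  (rate k2)
    k1, k2 = 0.001, 0.1
    state = {"A": 300, "B": 200, "C": 0}
    t, t_end = 0.0, 50.0

    def propensities(s):
        return [k1 * s["A"] * s["B"],   # binding
                k2 * s["C"]]            # unbinding

    while t < t_end:
        a = propensities(state)
        a_total = sum(a)
        if a_total == 0:
            break
        # Gillespie SSA: exponential waiting time, reaction chosen proportionally to propensity.
        t += -math.log(1.0 - random.random()) / a_total
        if random.random() * a_total < a[0]:
            state["A"] -= 1; state["B"] -= 1; state["C"] += 1
        else:
            state["A"] += 1; state["B"] += 1; state["C"] -= 1

    print(f"t = {t:.1f}, state = {state}")
    ```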

  5. A real option-based simulation model to evaluate investments in pump storage plants

    International Nuclear Information System (INIS)

    Muche, Thomas

    2009-01-01

    Investments in pump storage plants are expected to grow, especially because of their ability to store an excess supply from wind power plants. In order to evaluate these investments correctly, the peculiarities of pump storage plants and the characteristics of liberalized power markets have to be considered. The main characteristics of power markets are strong power price volatility and the occurrence of price spikes. In this article a valuation model is developed that captures these aspects using power price simulation, optimization of unit commitment and capital market theory. This valuation model is able to value future price-based unit commitment planning that corresponds to future scopes of action, also called real options. The resulting real option value for the pump storage plant is compared with the traditional net present value approach. Because that approach is not able to evaluate scopes of action correctly, it results in considerably smaller investment values and leads to wrong investment decisions.

  6. A hierarchical lattice spring model to simulate the mechanics of 2-D materials-based composites

    Directory of Open Access Journals (Sweden)

    Lucas eBrely

    2015-07-01

    Full Text Available In the field of engineering materials, strength and toughness are typically two mutually exclusive properties. Structural biological materials such as bone, tendon or dentin have resolved this conflict and show unprecedented damage tolerance, toughness and strength levels. The common feature of these materials is their hierarchical heterogeneous structure, which contributes to increased energy dissipation before failure, occurring at different scale levels. These structural properties are the key to the exceptional mechanical properties of bioinspired materials, in particular nanocomposites. Here, we develop a numerical model in order to simulate the mechanisms involved in damage progression and energy dissipation at different size scales in nano- and macro-composites, which depend both on the heterogeneity of the material and on the type of hierarchical structure. Both these aspects have been incorporated into a two-dimensional model based on a Lattice Spring Model, accounting for geometrical nonlinearities and including statistically-based fracture phenomena. The model has been validated by comparing numerical results to continuum and fracture mechanics results as well as to finite element simulations, and then employed to study how structural aspects affect hierarchical composite material properties. Results obtained with the numerical code highlight the dependence of stress distributions on matrix properties and on reinforcement dispersion, geometry and properties, and show how the failure of sacrificial elements is directly involved in the damage tolerance of the material. Thanks to the rapidly developing field of nanocomposite manufacture, it is already possible to artificially create materials with multi-scale hierarchical reinforcements. The developed code could be a valuable support in the design and optimization of these advanced materials, drawing inspiration from, and going beyond, biological materials with exceptional mechanical properties.

  7. A simulation model for reliability-based appraisal of an energy policy: The case of Lebanon

    International Nuclear Information System (INIS)

    Hamdan, H.A.; Ghajar, R.F.; Chedid, R.B.

    2012-01-01

    The Lebanese Electric Power System (LEPS) has been suffering from technical and financial deficiencies for decades and mirrors the problems encountered in many developing countries: inadequate or absent power system planning resulting in incomplete and poorly operating infrastructure, the effects of political instability, huge debts, unavailability of financing for desired projects and inefficiency in operation. The upgrade and development of the system necessitate the adoption of a comprehensive energy policy that offers solutions to a diversity of problems addressing the technical, financial, administrative and governance aspects of the system. In this paper, an energy policy for Lebanon is proposed and evaluated based on the integration of energy modeling and financial modeling. The paper utilizes the Load Modification Technique (LMT) as a probabilistic tool to assess the impact of policy implementation on energy production, overall cost, technical/commercial losses and reliability. Scenarios reflecting the implementation of policy projects are assessed and their impacts are compared with business-as-usual scenarios, which assume that no new investment takes place in the sector. Conclusions are drawn on the usefulness of the proposed evaluation methodology and the effectiveness of the adopted energy policy for Lebanon and other developing countries with similar power system problems. - Highlights: ► An evaluation methodology based on a probabilistic simulation tool is proposed. ► A business-as-usual scenario for a given study period of the LEPS was modeled. ► Mitigation scenarios reflecting implementation of the energy policy are modeled. ► The policy is simulated and compared with business-as-usual scenarios of the LEPS. ► Results reflect the usefulness of the proposed methodology and the adopted energy policy.

  8. A Model-Based Framework for Legal Policy Simulation and Compliance Checking

    OpenAIRE

    Soltana, Ghanem

    2017-01-01

    Information systems implementing requirements from laws and regulations, such as taxes and social benefits, need to be thoroughly verified to demonstrate their compliance. Several Verification and Validation (V&V) techniques, such as reliability testing and modeling and simulation, can be used to assess that such systems meet their legal requirements. Typically, one has to model the expected (legal) behavior of the system in a form that can be executed (simulated), subject the resulting models and th...

  9. Sensitivity analysis of an individual-based model for simulation of influenza epidemics.

    Directory of Open Access Journals (Sweden)

    Elaine O Nsoesie

    Full Text Available Individual-based epidemiology models are increasingly used in the study of influenza epidemics. Several studies on influenza dynamics and evaluation of intervention measures have used the same incubation and infectious period distribution parameters based on the natural history of influenza. A sensitivity analysis evaluating the influence of slight changes to these parameters (in addition to the transmissibility) would be useful for future studies and real-time modeling during an influenza pandemic. In this study, we examined individual and joint effects of parameters and ranked parameters based on their influence on the dynamics of simulated epidemics. We also compared the sensitivity of the model across synthetic social networks for Montgomery County in Virginia and New York City (and surrounding metropolitan regions) with demographic and rural-urban differences. In addition, we studied the effects of changing the mean infectious period on age-specific epidemics. The research was performed from a public health standpoint using three relevant measures: time to peak, peak infected proportion and total attack rate. We also used statistical methods in the design and analysis of the experiments. The results showed that: (i) minute changes in the transmissibility and mean infectious period significantly influenced the attack rate; (ii) the mean of the incubation period distribution appeared to be sufficient for determining its effects on the dynamics of epidemics; (iii) the infectious period distribution had the strongest influence on the structure of the epidemic curves; (iv) the sensitivity of the individual-based model was consistent across the social networks investigated in this study and (v) age-specific epidemics were sensitive to changes in the mean infectious period irrespective of the susceptibility of the other age groups. These findings suggest that small changes in some of the disease model parameters can significantly influence the uncertainty
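    To illustrate the kind of parameter sweep such a sensitivity analysis involves (a toy compartmental stand-in, not the authors' individual-based model; all parameter values are hypothetical), the sketch below varies the transmissibility and mean infectious period of a simple SIR model and reports the three outcome measures named in the abstract.

    ```python
    def sir_outcomes(beta, infectious_period, n=100_000, i0=10, days=300):
        """Run a deterministic SIR model and return the three outcome measures."""
        gamma = 1.0 / infectious_period
        s, i, r = float(n - i0), float(i0), 0.0
        peak, peak_day = i, 0
        for day in range(1, days + 1):
            new_inf = beta * s * i / n
            new_rec = gamma * i
            s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
            if i > peak:
                peak, peak_day = i, day
        return peak_day, peak / n, r / n   # time to peak, peak infected proportion, attack rate

    # One-at-a-time sensitivity: vary transmissibility and mean infectious period slightly.
    for beta in (0.28, 0.30, 0.32):
        for period in (3.8, 4.0, 4.2):
            tpk, ppk, ar = sir_outcomes(beta, period)
            print(f"beta={beta:.2f} Tinf={period:.1f} d -> "
                  f"peak day {tpk:3d}, peak {ppk:.3f}, attack rate {ar:.3f}")
    ```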

  10. Modelling, Simulation and Testing of a Reconfigurable Cable-Based Parallel Manipulator as Motion Aiding System

    Directory of Open Access Journals (Sweden)

    Gianni Castelli

    2010-01-01

    Full Text Available This paper presents results on the modelling, simulation and experimental testing of a cable-based parallel manipulator to be used as an aiding or guiding system for people with motion disabilities. There is a high level of motivation for people with a motion disability or the elderly to perform basic daily-living activities independently. Therefore, it is of great interest to design and implement safe and reliable motion assisting and guiding devices that are able to help end-users. In general, a robot for a medical application should be able to interact with a patient under safe conditions, i.e. it must not damage people or surroundings; it must be designed to guarantee high accuracy and low acceleration during operation. Furthermore, it should not be too bulky and it should exert limited wrenches in close interaction with people. It can be advisable to have a portable system which can be easily brought into, and assembled in, a hospital or a domestic environment. Cable-based robotic structures can fulfil these requirements because of their main characteristics, which make them light and intrinsically safe. In this paper, a reconfigurable four-cable-based parallel manipulator is proposed as a motion assisting and guiding device to help people accomplish a number of tasks, such as aiding or guiding the motion of the upper and lower limbs or the whole body. Modelling and simulation are presented in the ADAMS environment. Moreover, experimental tests are reported based on an available laboratory prototype.
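    As a small geometric illustration of how a planar four-cable parallel manipulator is commanded (the inverse kinematics mapping an end-effector position to cable lengths; the frame size and positions are hypothetical, and this is not the authors' controller), consider the following sketch.

    ```python
    import math

    # Hypothetical fixed frame: cable exit points at the corners of a 3 m x 2 m rectangle.
    ANCHORS = [(0.0, 0.0), (3.0, 0.0), (3.0, 2.0), (0.0, 2.0)]

    def cable_lengths(x, y):
        """Inverse kinematics of a planar point-mass 4-cable robot: one length per cable."""
        return [math.hypot(x - ax, y - ay) for ax, ay in ANCHORS]

    def guide_along_line(start, end, steps=5):
        """Cable length set-points for guiding the end-effector along a straight path."""
        (sx, sy), (ex, ey) = start, end
        return [cable_lengths(sx + (ex - sx) * k / steps, sy + (ey - sy) * k / steps)
                for k in range(steps + 1)]

    # Example: guide a limb-support point from the centre toward the upper right.
    for lengths in guide_along_line((1.5, 1.0), (2.5, 1.5)):
        print(["%.3f" % L for L in lengths])
    ```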

  11. Flux based modeling and simulation of dry etching for fabrication of silicon deep trench structures

    Energy Technology Data Exchange (ETDEWEB)

    Malik Rizwan [State Key Laboratory of Digital Manufacturing Equipment and technology, Huazhong University of Science and Technology, 1037 Luoyu road, Wuhan, China 43007 (China); Shi Tielin; Tang Zirong; Liu Shiyuan, E-mail: zirong@mail.hust.edu.cn, E-mail: rizwanmalik@smail.hust.edu.cn [Wuhan National Laboratory for Optoelectronics, Huazhong University of Science and Technology, 1037 Luoyu road Wuhan, 430074 (China)

    2011-02-01

    The deep reactive ion etching (DRIE) process is a key technology for the fabrication of micro-electromechanical system (MEMS) devices. Due to the complexity of this process, including the interaction of the process steps, full analytical modeling is complex. Plasma processes remain poorly understood, partly because it is very easy to measure the results empirically; however, as device parameters shrink, this issue becomes more critical. In this paper, our process was modeled qualitatively based on a 'High Density Plasma Etch Model'. Deep trench solutions for the etch rate based on the continuity equation were successfully generated for the first time through mathematical analysis. It was also shown that the product of the fluorine and gas phase concentrations in SF{sub 6} remains identical during both the deposition and etching stages. The etching process was treated as a combination of isotropic, directional and angle-dependent components. It exploited a synergistic balance of chemical and physical etching to promote silicon trenches and high aspect ratio structures. Simulations were performed for a comprehensive analysis of the fluxes arriving at the surface during the chemical reaction of the gas. It is observed that, near the surface, the arrival flux follows a cosine distribution. Our model makes it feasible to analyze various parameters such as gas delivery, reactor volume and temperature, which helps to assess large-scale effects and to optimize equipment design.

  12. How the ownership structures cause epidemics in financial markets: A network-based simulation model

    Science.gov (United States)

    Dastkhan, Hossein; Gharneh, Naser Shams

    2018-02-01

    Analysis of systemic risk and contagion has been one of the main challenges for policy makers and researchers in recent years. Network theory has been introduced as a main approach to the modeling and simulation of financial and economic systems. In this paper, a simulation model based on the ownership network is introduced to analyze contagion and systemic risk events. For this purpose, different network structures with different parameter values are considered to investigate the stability of the financial system in the presence of different kinds of idiosyncratic and aggregate shocks. The considered network structures include Erdos-Renyi, core-periphery, segregated and power-law networks. Moreover, the results of the proposed model are also calculated for a real ownership network. The results show that the network structure has a significant effect on the probability and the extent of contagion in financial systems. For each network structure, various values of the parameters result in remarkable differences in the systemic risk measures. The results for the real case show that the proposed model is appropriate for the analysis of systemic risk and contagion in financial markets, the identification of systemically important firms and the estimation of market loss when initial failures occur. This paper suggests a new direction in the modeling of contagion in financial markets, in particular one in which the effects of new kinds of financial exposure are clarified. The paper's idea and analytical results may also be useful for financial policy makers, portfolio managers and firms in directing their investments.
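    As a schematic illustration of how contagion spreads over an ownership-style network (a simple threshold cascade on an Erdos-Renyi graph, not the authors' model; the network size, exposure weights and failure threshold are hypothetical), consider the following sketch.

    ```python
    import random

    random.seed(1)

    N, p_edge = 100, 0.05        # Erdos-Renyi ownership network (hypothetical size/density)
    LOSS_THRESHOLD = 0.3         # a firm fails once 30% of its holdings have failed

    # Build a random directed ownership network: edge (i, j) = firm i holds a stake in firm j.
    holdings = {i: [j for j in range(N) if j != i and random.random() < p_edge]
                for i in range(N)}

    def cascade(initial_failures):
        """Propagate failures: a firm fails when too many of the firms it holds have failed."""
        failed = set(initial_failures)
        changed = True
        while changed:
            changed = False
            for i in range(N):
                if i in failed or not holdings[i]:
                    continue
                loss = sum(1 for j in holdings[i] if j in failed) / len(holdings[i])
                if loss >= LOSS_THRESHOLD:
                    failed.add(i)
                    changed = True
        return failed

    shock = random.sample(range(N), 3)               # idiosyncratic shock: 3 firms fail
    print(f"initial failures: {len(shock)}, after contagion: {len(cascade(shock))}")
    ```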

  13. Hybrid model based unified scheme for endoscopic Cerenkov and radio-luminescence tomography: Simulation demonstration

    Science.gov (United States)

    Wang, Lin; Cao, Xin; Ren, Qingyun; Chen, Xueli; He, Xiaowei

    2018-05-01

    Cerenkov luminescence imaging (CLI) is an imaging method that uses an optical imaging scheme to probe a radioactive tracer. The application of CLI with clinically approved radioactive tracers has opened an opportunity for translating optical imaging from preclinical to clinical applications. Such translation was further improved by developing an endoscopic CLI system. However, two-dimensional endoscopic imaging cannot identify accurate depth or obtain quantitative information. Here, we present an imaging scheme to retrieve depth and quantitative information from endoscopic Cerenkov luminescence tomography, which can also be applied to endoscopic radio-luminescence tomography. In the scheme, we first constructed a physical model for image collection, and then a mathematical model for characterizing the propagation of luminescent light from the tracer to the endoscopic detector. The mathematical model is a hybrid light transport model combining the third-order simplified spherical harmonics approximation, diffusion, and radiosity equations to ensure both accuracy and speed. The mathematical model integrates finite element discretization, regularization, and primal-dual interior-point optimization to retrieve the depth and the quantitative information of the tracer. A heterogeneous-geometry-based numerical simulation was used to explore the feasibility of the unified scheme, which demonstrated that it can provide a satisfactory balance between imaging accuracy and computational burden.

  14. Simulation of leaf area index on site scale based on model data fusion

    Science.gov (United States)

    Yang, Y.; Wang, J. B.

    2017-12-01

    The world's grassland area is about 24 × 10^8 hm^2, accounting for about one-fifth of the global land area. Grassland is one of the most widely distributed terrestrial ecosystems on Earth and is currently among the areas most affected by human activity. A considerable portion of global CO2 emissions is fixed by grassland, and the grassland carbon cycle plays an important role in the global carbon cycle (Li Bo, Yongshen Peng, Li Yao, China's Prairie, 1990). In recent years, the carbon cycle of grassland ecosystems and its influencing factors have become a hotspot in ecology, geology, botany and agronomy under the background of global change (Mu Shaojie, 2014), and modeling is now a popular and effective research method. However, there are still some uncertainties in this approach. CEVSA (Carbon Exchange between Vegetation, Soil and Atmosphere) is a biogeochemical cycle model based on physiological and ecological processes that simulates energy exchange and coupled water-carbon-nitrogen cycles in the plant-soil-atmosphere system (Cao et al., 1998a; 1998b; Woodward et al., 1995). In this paper, remote sensing observations of leaf area index are integrated into the model, and the site-scale version of the CEVSA model is optimized by the Markov chain Monte Carlo method in order to increase the accuracy of the model results.
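
    As an illustration of the model-data-fusion step described above, the Python sketch below calibrates a toy logistic leaf-area-index curve (a stand-in for the site-scale process model, which is not reproduced here) against synthetic remote-sensing observations using a random-walk Metropolis (Markov chain Monte Carlo) sampler. All function names, parameters and data are hypothetical.

      import numpy as np

      def lai_model(doy, lai_max, k, t0):
          """Toy logistic seasonal leaf-area-index curve (stand-in for a process model)."""
          return lai_max / (1.0 + np.exp(-k * (doy - t0)))

      def log_likelihood(params, doy, obs, sigma=0.3):
          resid = obs - lai_model(doy, *params)
          return -0.5 * np.sum((resid / sigma) ** 2)

      def metropolis(doy, obs, start, step, n_iter=20_000, rng=None):
          """Random-walk Metropolis sampler for the model parameters."""
          rng = np.random.default_rng() if rng is None else rng
          chain = np.empty((n_iter, len(start)))
          current = np.array(start, dtype=float)
          ll_current = log_likelihood(current, doy, obs)
          for i in range(n_iter):
              proposal = current + rng.normal(0.0, step)
              ll_proposal = log_likelihood(proposal, doy, obs)
              if np.log(rng.random()) < ll_proposal - ll_current:   # Metropolis acceptance
                  current, ll_current = proposal, ll_proposal
              chain[i] = current
          return chain

      if __name__ == "__main__":
          rng = np.random.default_rng(1)
          doy = np.arange(100, 250, 8)                    # synthetic observation dates
          truth = (4.5, 0.08, 160.0)                      # "true" parameters for the test
          obs = lai_model(doy, *truth) + rng.normal(0, 0.3, doy.size)
          chain = metropolis(doy, obs, start=(3.0, 0.05, 150.0),
                             step=(0.1, 0.005, 2.0), rng=rng)
          print("posterior means:", chain[5_000:].mean(axis=0), " true:", truth)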

  15. Multi-Agent Based Microscopic Simulation Modeling for Urban Traffic Flow

    Directory of Open Access Journals (Sweden)

    Xianyan Kuang

    2014-10-01

    Full Text Available Traffic simulation plays an important role in the evaluation of traffic decisions. The movement of vehicles is essentially the operating process of drivers. In order to reproduce urban traffic flow at the microscopic level on a computer, this paper establishes an urban traffic flow microscopic simulation system (UTFSim) based on multiple agents. The system is treated as an intelligent virtual environment system (IVES), and its four-layer structure is built. The road agent, vehicle agent and signal agent are modeled. The concept of the driving trajectory, divided into the LDT (Lane Driving Trajectory) and the VDDT (Vehicle Dynamic Driving Trajectory), is introduced. The “Link-Node” road network model is improved. Driving behaviors including free driving, car following, lane changing, slowing down and stopping are analyzed. The results of the signal control experiments using the UTFSim, developed on the Visual Studio .NET platform, indicate that the system performs well and can be used in the evaluation of traffic management and control.
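
    To make the "following driving" behaviour mentioned above concrete, the Python sketch below advances a line of vehicle agents on a single lane with a simplified Intelligent-Driver-Model-style rule. It is a generic illustration, not the UTFSim implementation, and all parameter values are illustrative.

      import numpy as np

      # Simplified single-lane car-following update (IDM-like); parameters are illustrative:
      # desired speed, time headway, max acceleration, comfortable deceleration, min gap, car length.
      V0, T, A, B, S0, L = 15.0, 1.5, 1.0, 2.0, 2.0, 5.0

      def idm_accel(gap, v, dv):
          s_star = S0 + v * T + v * dv / (2.0 * np.sqrt(A * B))   # desired dynamic gap
          return A * (1.0 - (v / V0) ** 4 - (s_star / gap) ** 2)

      def step(x, v, dt=0.5):
          """Advance all vehicles one time step; the leading vehicle has a free road."""
          gap = np.empty_like(x)
          dv = np.empty_like(x)
          gap[:-1] = x[1:] - x[:-1] - L      # bumper-to-bumper distance to the leader
          dv[:-1] = v[:-1] - v[1:]           # approach rate towards the leader
          gap[-1], dv[-1] = 1e6, 0.0         # leader: effectively no vehicle ahead
          a = idm_accel(gap, v, dv)
          v_new = np.clip(v + a * dt, 0.0, None)
          return x + v_new * dt, v_new

      if __name__ == "__main__":
          n = 10
          x = np.arange(n) * 10.0            # vehicles 10 m apart; index n-1 is the leader
          v = np.full(n, 5.0)
          for _ in range(240):               # simulate two minutes at dt = 0.5 s
              x, v = step(x, v)
          print("final speeds (m/s):", np.round(v, 2))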

  16. Simulation on scattering features of biological tissue based on generated refractive-index model

    International Nuclear Information System (INIS)

    Wang Baoyong; Ding Zhihua

    2011-01-01

    Important information on the morphology of biological tissue can be deduced from elastic scattering spectra, and their analysis is based on a known refractive-index model of the tissue. In this paper, a new numerical refractive-index model is put forward, and its scattering properties are studied intensively. Spectral decomposition [1] is a widely used method for generating random media in geology, but it has not previously been used in biology. Biological tissue differs from geological media as a random medium: the autocorrelation function describes almost all features in geology, but biological tissue is not as random; its structure is regular in the sense of fractal geometry [2], and a fractal dimension can be used to describe this regularity within the randomness. Firstly, the scattering theories of such fractal media are reviewed. Secondly, the detailed generation process of the refractive index is presented. Finally, the scattering features are simulated in the FDTD (Finite Difference Time Domain) Solutions software. From the simulation results, we find that the autocorrelation length and the fractal dimension control the scattering features of biological tissue.
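
    The Python sketch below illustrates spectral-decomposition-style generation of a random refractive-index map: white Gaussian noise is filtered in the Fourier domain with a power-law spectrum whose exponent is set by a Hurst exponent, which in turn fixes a fractal dimension. It is a generic illustration of the idea, not the paper's generation procedure, and all parameter values are hypothetical.

      import numpy as np

      def fractal_index_map(n=256, dx=0.1, hurst=0.7, n_mean=1.38, dn_rms=0.02, seed=0):
          """Generate a 2-D random refractive-index map with a power-law (fractal)
          spectrum by spectral synthesis: filter white Gaussian noise in k-space
          with |k|^-(hurst + 1) and transform back.  The Hurst exponent sets the
          fractal dimension of the field (D = 3 - hurst for a 2-D field)."""
          rng = np.random.default_rng(seed)
          kx = np.fft.fftfreq(n, d=dx)
          ky = np.fft.fftfreq(n, d=dx)
          k = np.sqrt(kx[:, None] ** 2 + ky[None, :] ** 2)
          k[0, 0] = np.min(k[k > 0])                       # avoid division by zero at DC

          noise = rng.normal(size=(n, n))
          spectrum = np.fft.fft2(noise) * k ** (-(hurst + 1.0))
          field = np.real(np.fft.ifft2(spectrum))
          field = (field - field.mean()) / field.std()     # normalise the fluctuations
          return n_mean + dn_rms * field                   # mean index plus small fluctuations

      if __name__ == "__main__":
          n_map = fractal_index_map()
          print("refractive index range:", n_map.min().round(4), "to", n_map.max().round(4))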

  17. Teaching Supply Chain Management Complexities: A SCOR Model Based Classroom Simulation

    Science.gov (United States)

    Webb, G. Scott; Thomas, Stephanie P.; Liao-Troth, Sara

    2014-01-01

    The SCOR (Supply Chain Operations Reference) Model Supply Chain Classroom Simulation is an in-class experiential learning activity that helps students develop a holistic understanding of the processes and challenges of supply chain management. The simulation has broader learning objectives than other supply chain related activities such as the…

  18. Wake modeling and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, G.C.; Aagaard Madsen, H.; Larsen, T.J.; Troldborg, N.

    2008-07-15

    We present a consistent, physically based theory for the wake meandering phenomenon, which we consider of crucial importance for the overall description of wind turbine loadings in wind farms. In its present version the model is confined to single-wake situations, but the model philosophy has the potential to include mutual wake interaction phenomena as well. The basic conjecture behind the dynamic wake meandering (DWM) model is that wake transportation in the atmospheric boundary layer is driven by the large-scale lateral and vertical turbulence components. Based on this conjecture a stochastic model of the downstream wake meandering is formulated. In addition to the kinematic formulation of the dynamics of the 'meandering frame of reference', models characterizing the mean wake deficit as well as the added wake turbulence, described in the meandering frame of reference, are an integrated part of the DWM model complex. For design applications, the computational efficiency of wake deficit prediction is a key issue, and a computationally low-cost model is developed for this purpose. Likewise, the character of the added wake turbulence, generated by the upstream turbine in the form of shed and trailed vorticity, has been approached by a simple semi-empirical model essentially based on an eddy viscosity philosophy. Contrary to previous attempts to model wake loading, the DWM approach allows a unifying description in the sense that turbine power and load aspects can be treated simultaneously. This capability is a direct and attractive consequence of the model being based on the underlying physical process, and it potentially opens for optimization of wind farm topology, wind farm operation as well as control strategies for the individual turbine. To establish an integrated modeling tool, the DWM methodology has been implemented in the aeroelastic code HAWC2, and example simulations of wake situations, from the small Tjaereborg wind farm, have

  19. Technological progress and effects of (supra) regional innovation and production collaboration. An agent-based model simulation study.

    NARCIS (Netherlands)

    Vermeulen, B.; Pyka, A.; Serguieva, A.; Maringer, D.; Palade, V.; Almeida, R.J.

    2014-01-01

    We provide a novel technology development model in which economic agents search for transformations to build artifacts. Using this technology development model, we conduct an agent-based model simulation study on the effect of (supra-)regional collaboration in production and innovation on

  20. Developing and testing transferability and feasibility of a model for educators using simulation-based learning - A European collaboration

    DEFF Research Database (Denmark)

    Bøje, Rikke Buus; Bland, Andrew; Sutton, Andrew

    2017-01-01

    of the study were to develop a model to educate the educators who deliver simulation-based learning and to test to which extent this model could be transferred to education providers in different national settings. METHODS: This model, its transferability and feasibility, was tested across three European...

  1. Simulation modeling and arena

    CERN Document Server

    Rossetti, Manuel D

    2015-01-01

    Emphasizes a hands-on approach to learning statistical analysis and model building through the use of comprehensive examples, problems sets, and software applications With a unique blend of theory and applications, Simulation Modeling and Arena®, Second Edition integrates coverage of statistical analysis and model building to emphasize the importance of both topics in simulation. Featuring introductory coverage on how simulation works and why it matters, the Second Edition expands coverage on static simulation and the applications of spreadsheets to perform simulation. The new edition als

  2. An agent-based simulation model for Clostridium difficile infection control.

    Science.gov (United States)

    Codella, James; Safdar, Nasia; Heffernan, Rick; Alagoz, Oguzhan

    2015-02-01

    Control of Clostridium difficile infection (CDI) is an increasingly difficult problem for health care institutions. There are commonly recommended strategies to combat CDI transmission, such as oral vancomycin for CDI treatment, increased hand hygiene with soap and water for health care workers, daily environmental disinfection of infected patient rooms, and contact isolation of diseased patients. However, the efficacy of these strategies, particularly for endemic CDI, has not been well studied. The objective of this research is to develop a valid, agent-based simulation model (ABM) to study C. difficile transmission and control in a midsized hospital. We develop an ABM of a midsized hospital with agents such as patients, health care workers, and visitors. We model the natural progression of CDI in a patient using a Markov chain and the transmission of CDI through agent and environmental interactions. We derive input parameters from aggregate patient data from the 2007-2010 Wisconsin Hospital Association and published medical literature. We define a calibration process, which we use to estimate transition probabilities of the Markov model by comparing simulation results to benchmark values found in published literature. In a comparison of CDI control strategies implemented individually, routine bleach disinfection of CDI-positive patient rooms provides the largest reduction in nosocomial asymptomatic colonization (21.8%) and nosocomial CDIs (42.8%). Additionally, vancomycin treatment provides the largest reduction in relapse CDIs (41.9%), CDI-related mortalities (68.5%), and total patient length of stay (21.6%). We develop a generalized ABM for CDI control that can be customized and further expanded to specific institutions and/or scenarios. Additionally, we estimate transition probabilities for a Markov model of natural CDI progression in a patient through calibration. © The Author(s) 2014.
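
    To illustrate the kind of Markov-chain natural-history model referred to above, the Python sketch below simulates a single patient's daily transitions through simplified C. difficile states. The states and transition probabilities are illustrative placeholders, not the calibrated values from the study.

      import numpy as np

      # Hypothetical daily transition matrix over simplified patient states; the
      # probabilities are illustrative placeholders, not calibrated values.
      STATES = ["susceptible", "colonized", "diseased", "recovered"]
      P = np.array([
          [0.97, 0.03, 0.00, 0.00],   # susceptible -> colonized through exposure
          [0.00, 0.93, 0.05, 0.02],   # colonized may progress to disease or clear
          [0.00, 0.00, 0.90, 0.10],   # diseased -> recovered (mortality omitted here)
          [0.00, 0.00, 0.02, 0.98],   # recovered may relapse
      ])

      def simulate_patient(days=30, seed=None):
          """Walk one patient through the Markov chain for a given length of stay."""
          rng = np.random.default_rng(seed)
          state = 0                                   # admitted susceptible
          for _ in range(days):
              state = rng.choice(len(STATES), p=P[state])
          return STATES[state]

      if __name__ == "__main__":
          counts = {s: 0 for s in STATES}
          for i in range(5_000):                      # many synthetic patient stays
              counts[simulate_patient(days=30, seed=i)] += 1
          print("state distribution after a 30-day stay:", counts)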

  3. Numeric, Agent-based or System dynamics model? Which modeling approach is the best for vast population simulation?

    Science.gov (United States)

    Cimler, Richard; Tomaskova, Hana; Kuhnova, Jitka; Dolezal, Ondrej; Pscheidl, Pavel; Kuca, Kamil

    2018-02-01

    Alzheimer's disease is one of the most common mental illnesses. It is posited that more than 25% of the population is affected by some mental disease during their lifetime. Treatment of each patient draws resources from the economy concerned; therefore, it is important to quantify the potential economic impact. Agent-based, system dynamics and numerical approaches to dynamic modeling of the population of the European Union and its patients with Alzheimer's disease are presented in this article. Simulations, their characteristics, and the results from the different modeling tools are compared. The results of these approaches are compared with EU population growth predictions from Eurostat, the statistical office of the EU. The methodology for creating the models is described, all three modeling approaches are compared, and the suitability of each approach for population modeling is discussed. In this case study, all three approaches gave results consistent with the EU population prediction. Moreover, we were able to predict the number of patients with AD and, depending on the modeling method, to monitor different characteristics of the population. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  4. Measurement-Based Hybrid Fluid-Flow Models for Fast Multi-Scale Simulation and Control

    National Research Council Canada - National Science Library

    Sohraby, Khosrow

    2004-01-01

    .... We point out that traditional queuing models are intractable or provide poor fit to real-life networks, while discrete-event simulation at the packet level can consume prohibitive amounts of CPU times...

  5. Petri Net and Probabilistic Model Checking Based Approach for the Modelling, Simulation and Verification of Internet Worm Propagation.

    Directory of Open Access Journals (Sweden)

    Misbah Razzaq

    Full Text Available Internet worms are analogous to biological viruses, since they can infect a host and have the ability to propagate through a chosen medium. To prevent the spread of a worm or to grasp how to regulate a prevailing worm, compartmental models are commonly used as a means to examine and understand the patterns and mechanisms of a worm spread. However, one of the greatest challenges is to produce methods to verify and validate the behavioural properties of a compartmental model. This is why, in this study, we suggest a framework based on Petri Nets and Model Checking through which we can meticulously examine and validate these models. We investigate the Susceptible-Exposed-Infectious-Recovered (SEIR) model and propose a new model, Susceptible-Exposed-Infectious-Recovered-Delayed-Quarantined (Susceptible/Recovered), SEIDQR(S/I), along with a hybrid quarantine strategy, which is then constructed and analysed using Stochastic Petri Nets and a Continuous Time Markov Chain. The analysis shows that the hybrid quarantine strategy is extremely effective in reducing the risk of propagating the worm. Through Model Checking, we gained insight into the functionality of compartmental models. The Model Checking results validate the simulation results well, which fully supports the proposed framework.
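
    For reference, the classic SEIR dynamics mentioned above can be written as a small ODE system; the Python sketch below integrates it with a fixed-step fourth-order Runge-Kutta scheme. This is the plain deterministic SEIR model, not the authors' SEIDQR(S/I) stochastic Petri net formulation, and the rate parameters are illustrative.

      import numpy as np

      def seir_derivs(y, beta, sigma, gamma):
          """Classic SEIR compartmental rates (compartments as population fractions)."""
          s, e, i, r = y
          return np.array([
              -beta * s * i,              # susceptible hosts become exposed
              beta * s * i - sigma * e,   # exposed hosts become infectious
              sigma * e - gamma * i,      # infectious hosts recover
              gamma * i,
          ])

      def simulate_seir(beta=0.8, sigma=0.25, gamma=0.1, i0=1e-4, t_end=120.0, dt=0.05):
          steps = int(t_end / dt)
          y = np.array([1.0 - i0, 0.0, i0, 0.0])
          trajectory = np.empty((steps + 1, 4))
          trajectory[0] = y
          for n in range(steps):                      # 4th-order Runge-Kutta step
              k1 = seir_derivs(y, beta, sigma, gamma)
              k2 = seir_derivs(y + 0.5 * dt * k1, beta, sigma, gamma)
              k3 = seir_derivs(y + 0.5 * dt * k2, beta, sigma, gamma)
              k4 = seir_derivs(y + dt * k3, beta, sigma, gamma)
              y = y + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
              trajectory[n + 1] = y
          return trajectory

      if __name__ == "__main__":
          traj = simulate_seir()
          print(f"peak infectious fraction: {traj[:, 2].max():.3f}, "
                f"final recovered fraction: {traj[-1, 3]:.3f}")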

  6. Simulating Serious Games: A Discrete-Time Computational Model Based on Cognitive Flow Theory

    Science.gov (United States)

    Westera, Wim

    2018-01-01

    This paper presents a computational model for simulating how people learn from serious games. While avoiding the combinatorial explosion of a game's micro-states, the model offers a meso-level pathfinding approach, which is guided by cognitive flow theory and various concepts from the learning sciences. It extends a basic, existing model by exposing…

  7. A Bézier-Spline-based Model for the Simulation of Hysteresis in Variably Saturated Soil

    Science.gov (United States)

    Cremer, Clemens; Peche, Aaron; Thiele, Luisa-Bianca; Graf, Thomas; Neuweiler, Insa

    2017-04-01

    Most transient variably saturated flow models neglect hysteresis in the p_c-S relationship (Beven, 2012). Such models tend to inadequately represent the matrix potential and saturation distribution; thereby, when simulating flow and transport processes, fluid and solute fluxes might be overestimated (Russo et al., 1989). In this study, we present a simple, computationally efficient and easily applicable model that makes it possible to adequately describe hysteresis in the p_c-S relationship for variably saturated flow. This model can be seen as an extension of the existing play-type model (Beliaev and Hassanizadeh, 2001), in which scanning curves are simplified as vertical lines between the main imbibition and main drainage curves. In our model, we use continuous linear and Bézier-spline-based functions. We show the successful validation of the model by numerically reproducing a physical experiment by Gillham, Klute and Heermann (1976) describing primary drainage and imbibition in a vertical soil column. With a deviation of 3%, the simple Bézier-spline-based model performs significantly better than the play-type approach, which deviates by 30% from the experimental results. Finally, we discuss the realization of physical experiments in order to extend the model to secondary scanning curves and in order to determine the scanning curve steepness. Literature: Beven, K.J. (2012). Rainfall-Runoff Modelling: The Primer. John Wiley and Sons. Russo, D., Jury, W. A., & Butters, G. L. (1989). Numerical analysis of solute transport during transient irrigation: 1. The effect of hysteresis and profile heterogeneity. Water Resources Research, 25(10), 2109-2118. https://doi.org/10.1029/WR025i010p02109. Beliaev, A.Y. & Hassanizadeh, S.M. (2001). A Theoretical Model of Hysteresis and Dynamic Effects in the Capillary Relation for Two-phase Flow in Porous Media. Transport in Porous Media 43: 487. doi:10.1023/A:1010736108256. Gillham, R., Klute, A., & Heermann, D. (1976). Hydraulic properties of a porous
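
    To clarify the play-type simplification that the abstract contrasts with (vertical scanning lines between the two main curves), the Python sketch below tracks saturation along a capillary-pressure path using van Genuchten main drainage and imbibition curves; saturation is frozen between the curves and clipped onto them when the path reaches one of them. The Bézier-spline scanning curves themselves are not reproduced, and all parameters are illustrative.

      import numpy as np

      def vg_saturation(pc, alpha, n):
          """Van Genuchten effective saturation for capillary pressure pc >= 0."""
          m = 1.0 - 1.0 / n
          return (1.0 + (alpha * np.maximum(pc, 0.0)) ** n) ** (-m)

      # Illustrative main-curve parameters (imbibition typically has the larger alpha).
      ALPHA_DRAIN, ALPHA_IMB, N = 1.5, 3.0, 2.5     # alpha in 1/m, pc in m of water

      def play_type_path(pc_series, s_init):
          """Track saturation along a pc path with play-type hysteresis: between the
          main curves saturation stays constant (vertical scanning line); once the
          state reaches a main curve, it follows that curve."""
          s = s_init
          out = []
          for pc in pc_series:
              s_drain = vg_saturation(pc, ALPHA_DRAIN, N)   # upper admissible saturation
              s_imb = vg_saturation(pc, ALPHA_IMB, N)       # lower admissible saturation
              s = float(np.clip(s, s_imb, s_drain))
              out.append(s)
          return np.array(out)

      if __name__ == "__main__":
          # Drain to pc = 3 m, re-wet to 0.5 m, then drain again.
          pc_path = np.concatenate([np.linspace(0.1, 3.0, 50),
                                    np.linspace(3.0, 0.5, 50),
                                    np.linspace(0.5, 3.0, 50)])
          s_path = play_type_path(pc_path, s_init=1.0)
          print("saturation after primary drainage:", round(s_path[49], 3))
          print("saturation after re-wetting to pc = 0.5 m:", round(s_path[99], 3))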

  8. Agent-based Modeling Simulation Analysis on the Regulation of Institutional Investor's Encroachment Behavior in Stock Market

    Directory of Open Access Journals (Sweden)

    Yang Li

    2014-05-01

    Full Text Available Purpose: This study explores the effective regulation of institutional investors' encroachment behavior in the stock market. Given its theoretical and practical importance, the present study examines the effect of a self-adaptive regulation strategy (adjusting regulation factors such as the punishment and the probability of successful investigation in time) on the protection of small and medium-sized investors. Design/methodology/approach: The study was carried out through game theory and agent-based modeling simulation. Firstly, a dynamic game model was built to identify the core factors of regulation and the equilibrium paths. Secondly, an agent-based simulation model was built in Swarm to extend the game model. Finally, a simulation experiment (using virtual parameter values) was performed to examine the effect of the regulation strategy obtained from the game model. Findings: The results of this study show that the core factors for preventing institutional investors' encroachment behavior are the regulator's punishment and probability of successful investigation. These core factors reflect the self-adaptability and capability of the regulator: if the regulator can adjust the regulation factors in time, illegal behaviors will be avoided effectively. Research limitations/implications: The simulation experiment in this paper was performed with virtual parameter values. Although the results of the experiment show the effect of self-adaptive regulation, there are still some differences between the simulation experiment and the real market situation. Originality/value: The purpose of this study is to investigate an effective regulation strategy for institutional investors' encroachment behavior in the stock market in order to maintain market order and protect the benefits of investors. Based on the game model and simulation model, a simulation experiment was performed and the result showed that the self-adaptive regulation would be effective

  9. Mathematical Modeling and Simulation of SWRO Process Based on Simultaneous Method

    Directory of Open Access Journals (Sweden)

    Aipeng Jiang

    2014-01-01

    Full Text Available The reverse osmosis (RO) technique is one of the most efficient ways of desalinating seawater to address the shortage of freshwater. For prediction and analysis of the performance of the seawater reverse osmosis (SWRO) process, an accurate and detailed model based on solution-diffusion and mass transfer theory is established. Since the accurate formulation of the model includes many differential and strongly nonlinear equations (differential and algebraic equations, DAEs), the simultaneous method, through orthogonal collocation on finite elements and a large-scale solver, was used to obtain the solutions efficiently. The model was fully discretized into an NLP (nonlinear programming) problem with a large number of variables and equations, and the NLP was then solved by the large-scale solver IPOPT. Validation of the formulated model and solution method is provided by a case study on an SWRO plant. Simulation and analysis are then carried out to demonstrate the performance of the reverse osmosis process; operational conditions such as feed pressure, feed flow rate and feed temperature are also analyzed. This work is significant for a detailed understanding of the RO process and for future energy saving through operational optimization.
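
    As a minimal illustration of the solution-diffusion relations that underlie such models, the Python sketch below computes water and salt fluxes across an RO membrane from the net driving pressure and the trans-membrane concentration difference, using a van't Hoff osmotic pressure. The permeability coefficients are illustrative placeholders, not values from the paper, and the full discretized DAE model is not reproduced.

      import numpy as np

      # Illustrative membrane coefficients (placeholders, not the paper's values).
      A_W = 3.0e-12        # water permeability, m/(s Pa)
      B_S = 8.0e-8         # salt permeability, m/s
      R, T = 8.314, 298.15
      VANT_HOFF_I = 2      # NaCl dissociates into two ions

      def osmotic_pressure(c):
          """Van't Hoff osmotic pressure (Pa) for a molar concentration c in mol/m^3."""
          return VANT_HOFF_I * R * T * c

      def sd_fluxes(p_feed, p_perm, c_feed, c_perm):
          """Solution-diffusion model: water flux is driven by the net pressure
          difference, salt flux by the concentration difference across the membrane."""
          jw = A_W * ((p_feed - p_perm) - (osmotic_pressure(c_feed) - osmotic_pressure(c_perm)))
          js = B_S * (c_feed - c_perm)
          return jw, js          # m/s and mol/(m^2 s)

      if __name__ == "__main__":
          c_feed = 35.0 / 0.05844          # 35 kg/m^3 NaCl expressed in mol/m^3
          jw, js = sd_fluxes(p_feed=60e5, p_perm=1e5, c_feed=c_feed, c_perm=5.0)
          permeate_salinity = js / jw * 0.05844 * 1000.0   # approx. mg salt per litre of permeate
          print(f"water flux {jw * 3600 * 1000:.1f} L/(m^2 h), "
                f"permeate salinity ~{permeate_salinity:.0f} mg/L")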

  10. Assessment of adsorbate density models for numerical simulations of zeolite-based heat storage applications

    International Nuclear Information System (INIS)

    Lehmann, Christoph; Beckert, Steffen; Gläser, Roger; Kolditz, Olaf; Nagel, Thomas

    2017-01-01

    Highlights: • Characteristic curves fit for binderless Zeolite 13XBFK. • Detailed comparison of adsorbate density models for Dubinin’s adsorption theory. • Predicted heat storage densities robust against choice of density model. • Use of simple linear density models sufficient. - Abstract: The study of water sorption in microporous materials is of increasing interest, particularly in the context of heat storage applications. The potential-theory of micropore volume filling pioneered by Polanyi and Dubinin is a useful tool for the description of adsorption equilibria. Based on one single characteristic curve, the system can be extensively characterised in terms of isotherms, isobars, isosteres, enthalpies etc. However, the mathematical description of the adsorbate density’s temperature dependence has a significant impact especially on the estimation of the energetically relevant adsorption enthalpies. Here, we evaluate and compare different models existing in the literature and elucidate those leading to realistic predictions of adsorption enthalpies. This is an important prerequisite for accurate simulations of heat and mass transport ranging from the laboratory scale to the reactor level of the heat store.
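
    As an illustration of how an adsorbate density model enters Dubinin-type calculations like those discussed above, the Python sketch below evaluates a Dubinin-Astakhov characteristic curve for water on a zeolite-like adsorbent and converts the filled micropore volume into a loading using either a constant or a simple linear temperature-dependent adsorbate density. All parameter values are illustrative assumptions, not the fitted 13XBFK values from the paper.

      import numpy as np

      R = 8.314                                        # J/(mol K)

      def p_sat(temp):
          """Approximate saturation pressure of water (Magnus-type fit, Pa)."""
          return 610.94 * np.exp(17.625 * (temp - 273.15) / (temp - 30.11))

      def adsorption_potential(temp, p):
          """Polanyi adsorption potential A = R T ln(p_sat / p), in J/mol."""
          return R * temp * np.log(p_sat(temp) / p)

      def da_volume(a_pot, w0=0.30e-3, e0=21600.0, n_exp=1.8):
          """Dubinin-Astakhov filled micropore volume W = W0 exp(-(A/E0)^n), m^3/kg."""
          return w0 * np.exp(-(a_pot / e0) ** n_exp)

      def loading(temp, p, density_model="linear"):
          """Adsorbed water loading (kg/kg) for two simple adsorbate density models."""
          if density_model == "constant":
              rho = 1000.0                             # liquid-like density, kg/m^3
          else:
              rho = 1000.0 - 0.6 * (temp - 293.15)     # simple linear temperature dependence
          return da_volume(adsorption_potential(temp, p)) * rho

      if __name__ == "__main__":
          for t in (313.15, 353.15, 393.15):           # a typical charge/discharge range
              c = loading(t, p=1200.0, density_model="constant")
              l = loading(t, p=1200.0, density_model="linear")
              print(f"T = {t - 273.15:5.1f} C: constant rho {c:.3f} kg/kg, linear rho {l:.3f} kg/kg")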

  11. An interactive system for creating object models from range data based on simulated annealing

    International Nuclear Information System (INIS)

    Hoff, W.A.; Hood, F.W.; King, R.H.

    1997-01-01

    In hazardous applications such as remediation of buried waste and dismantlement of radioactive facilities, robots are an attractive solution. Sensing to recognize and locate objects is a critical need for robotic operations in unstructured environments. An accurate 3-D model of objects in the scene is necessary for efficient high level control of robots. Drawing upon concepts from supervisory control, the authors have developed an interactive system for creating object models from range data, based on simulated annealing. Site modeling is a task that is typically performed using purely manual or autonomous techniques, each of which has inherent strengths and weaknesses. However, an interactive modeling system combines the advantages of both manual and autonomous methods, to create a system that has high operator productivity as well as high flexibility and robustness. The system is unique in that it can work with very sparse range data, tolerate occlusions, and tolerate cluttered scenes. The authors have performed an informal evaluation with four operators on 16 different scenes, and have shown that the interactive system is superior to either manual or automatic methods in terms of task time and accuracy
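
    The Python sketch below illustrates the simulated annealing idea in a toy version of the task above: fitting an object primitive (here a circle) to sparse, noisy range points seen from a partially occluded view. It is a generic sketch, not the authors' interactive system, and all parameter values are illustrative.

      import numpy as np

      def fit_error(params, points):
          """Mean squared distance of range points to a circle (cx, cy, r)."""
          cx, cy, r = params
          d = np.hypot(points[:, 0] - cx, points[:, 1] - cy) - r
          return np.mean(d ** 2)

      def simulated_annealing(points, start, n_iter=20_000, t0=1.0, cooling=0.9995, seed=0):
          rng = np.random.default_rng(seed)
          current = np.array(start, dtype=float)
          e_current = fit_error(current, points)
          best, e_best = current.copy(), e_current
          temp = t0
          for _ in range(n_iter):
              candidate = current + rng.normal(0.0, 0.05, size=3)    # random perturbation
              e_cand = fit_error(candidate, points)
              # accept downhill moves always, uphill moves with Boltzmann probability
              if e_cand < e_current or rng.random() < np.exp((e_current - e_cand) / temp):
                  current, e_current = candidate, e_cand
                  if e_cand < e_best:
                      best, e_best = candidate.copy(), e_cand
              temp *= cooling                                        # geometric cooling schedule
          return best, e_best

      if __name__ == "__main__":
          rng = np.random.default_rng(3)
          angles = rng.uniform(0.0, np.pi, 40)         # sparse points from an occluded half view
          truth = (1.0, -0.5, 2.0)
          pts = np.column_stack([truth[0] + truth[2] * np.cos(angles),
                                 truth[1] + truth[2] * np.sin(angles)])
          pts += rng.normal(0.0, 0.02, pts.shape)
          est, err = simulated_annealing(pts, start=(0.0, 0.0, 1.0))
          print("estimated (cx, cy, r):", np.round(est, 3), " rms error:", round(float(np.sqrt(err)), 4))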

  12. Technology Evaluation of Process Configurations for Second Generation Bioethanol Production using Dynamic Model-based Simulations

    DEFF Research Database (Denmark)

    Morales Rodriguez, Ricardo; Meyer, Anne S.; Gernaey, Krist

    2011-01-01

    An assessment of a number of different process flowsheets for bioethanol production was performed using dynamic model-based simulations. The evaluation employed diverse operational scenarios such as fed-batch, continuous and continuous-with-recycle configurations. Each configuration was evaluated against the following benchmark criteria: yield (kg ethanol/kg dry biomass), final product concentration and number of unit operations required in the different process configurations. The results show that the process configuration for simultaneous saccharification and co-fermentation (SSCF) operating in continuous mode with a recycle of the SSCF reactor effluent results in the best bioethanol productivity among the proposed process configurations, with a yield of 0.18 kg ethanol/kg dry biomass.

  13. Simulation of dynamic response of nuclear power plant based on user-defined model in PSASP

    International Nuclear Information System (INIS)

    Zhao Jie; Liu Dichen; Xiong Li; Chen Qi; Du Zhi; Lei Qingsheng

    2010-01-01

    Based on the energy transformation regularities in the physical processes of pressurized water reactors (PWR), PWR NPP models are established in PSASP (Power System Analysis Software Package) that are applicable to calculating the dynamic processes of a PWR NPP and power system transient stability. The power dynamic characteristics of the PWR NPP are simulated and analyzed, including the PWR self-stability, self-regulation and power step responses under the power regulation system. The results indicate that the PWR NPP can withstand certain external disturbances and a 10% P{sub n} step under negative temperature feedback. The power regulation rate of the PWR can reach 5% P{sub n}/min under the power regulation system, which meets the requirement of peak regulation in the power grid. (authors)

  14. Development of Reactor Core Model based on Optimal Analysis for Shinhanul no. 1, 2 Simulator

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Kyung-min [Korea Hydro Nuclear Power Co., Daejeon (Korea, Republic of)

    2016-10-15

    As one of the outputs of the 'Development of the Shin Hanul Nuclear Plant (SHN) 1,2 Simulator' project being carried out by the KHNP Central Research Institute, the SHN 1,2 simulator is being developed including the KNICS methodology and advanced alarm systems first applied to a nuclear power plant in Korea, and it adopts a virtually stimulated HMI (Human-Machine Interface) for the non-safety MMIS system, whose key programs are identical to those applied to the real SHN 1,2 plants. The purpose of this paper is to develop a localized core model by integrating the simulator system with the simulator core model through a technology agreement with KAERI. To develop the Shin Hanul 1 and 2 reactor core simulator model, KHNP and KAERI created the MASTER-SIM model and carried out validation. Calculations of the MASSIM{sub S}S program for MASTER-SIM validation are within the tolerance range. Testing has not yet been completed, and further verification will be conducted. The MASTER-SIM software is expected to be highly economical and to satisfy international simulator standards.

  15. Improving multiple-point-based a priori models for inverse problems by combining Sequential Simulation with the Frequency Matching Method

    DEFF Research Database (Denmark)

    Cordua, Knud Skou; Hansen, Thomas Mejer; Lange, Katrine

    In order to move beyond simplified covariance based a priori models, which are typically used for inverse problems, more complex multiple-point-based a priori models have to be considered. By means of marginal probability distributions ‘learned’ from a training image, sequential simulation has proven to be an efficient way of obtaining multiple realizations that honor the same multiple-point statistics as the training image. The frequency matching method provides an alternative way of formulating multiple-point-based a priori models. In this strategy the pattern frequency distributions (i.e. marginals) of the training image and a subsurface model are matched in order to obtain a solution with the same multiple-point statistics as the training image. Sequential Gibbs sampling is a simulation strategy that provides an efficient way of applying sequential simulation based algorithms as a priori

  16. Simulation-based Education for Endoscopic Third Ventriculostomy: A Comparison Between Virtual and Physical Training Models.

    Science.gov (United States)

    Breimer, Gerben E; Haji, Faizal A; Bodani, Vivek; Cunningham, Melissa S; Lopez-Rios, Adriana-Lucia; Okrainec, Allan; Drake, James M

    2017-02-01

    The relative educational benefits of virtual reality (VR) and physical simulation models for endoscopic third ventriculostomy (ETV) have not been evaluated "head to head." To compare and identify the relative utility of a physical and VR ETV simulation model for use in neurosurgical training. Twenty-three neurosurgical residents and 3 fellows performed an ETV on both a physical and VR simulation model. Trainees rated the models using 5-point Likert scales evaluating the domains of anatomy, instrument handling, procedural content, and the overall fidelity of the simulation. Paired t tests were performed for each domain's mean overall score and individual items. The VR model has relative benefits compared with the physical model with respect to realistic representation of intraventricular anatomy at the foramen of Monro (4.5, standard deviation [SD] = 0.7 vs 4.1, SD = 0.6; P = .04) and the third ventricle floor (4.4, SD = 0.6 vs 4.0, SD = 0.9; P = .03), although the overall anatomy score was similar (4.2, SD = 0.6 vs 4.0, SD = 0.6; P = .11). For overall instrument handling and procedural content, the physical simulator outperformed the VR model (3.7, SD = 0.8 vs 4.5; SD = 0.5, P educational objectives. Training focused on learning anatomy or decision-making for anatomic cues may be aided with the VR simulation model. A focus on developing manual dexterity and technical skills using endoscopic equipment in the operating room may be better learned on the physical simulation model. Copyright © 2016 by the Congress of Neurological Surgeons

  17. Agent Based Modeling and Simulation of Pedestrian Crowds In Panic Situations

    KAUST Repository

    Alrashed, Mohammed

    2016-11-01

    The increasing occurrence of panic stampedes during mass events has motivated the study of the impact of panic on crowd dynamics and the simulation of pedestrian flows in panic situations. The lack of understanding of panic stampedes still causes hundreds of fatalities each year, and methodical studies of panic behavior capable of envisaging such crowd dynamics remain scarce. Every year thousands of fatalities and twice as many injuries are caused by crowd stampedes worldwide, despite the tremendous efforts of crowd control and the massive numbers of safekeeping forces deployed. Pedestrian crowd dynamics are generally predictable in high-density crowds, where pedestrians cannot move freely, which gives rise to self-propelling interactions between pedestrians. Although every pedestrian has personal preferences, the motion dynamics in such crowds can be modeled through social forces, which are representations of internal preferences and objectives to perform certain actions or movements. The corresponding forces can be controlled for each individual to represent a variety of behaviors associated with panic situations, such as escaping danger, clustering, and pushing. In this thesis, we use an agent-based model of pedestrian behavior in panic situations to predict the collective human behavior in such crowd dynamics. The proposed simulations suggest a practical way to reduce fatalities and minimize evacuation time in panic situations. Moreover, we introduce contagious panic and pushing behavior, resulting in a more realistic crowd dynamics model. The proposed methodology describes the intensity and spread of panic for each individual as a function of the distances between pedestrians.
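
    A minimal Helbing-style social-force sketch in Python is given below: each pedestrian agent is driven towards an exit and repelled exponentially by its neighbours, and raising the desired speed loosely mimics panic. It is a generic illustration under these stated assumptions, not the thesis model, and all parameter values are illustrative.

      import numpy as np

      TAU, A_REP, B_REP, RADIUS, MASS = 0.5, 2000.0, 0.08, 0.3, 80.0   # illustrative values

      def social_forces(pos, vel, exit_point, v_desired):
          """Helbing-style forces: relaxation towards the exit plus pairwise repulsion."""
          to_exit = exit_point - pos
          e = to_exit / np.linalg.norm(to_exit, axis=1, keepdims=True)
          f = MASS * (v_desired * e - vel) / TAU                   # driving (desire) force

          diff = pos[:, None, :] - pos[None, :, :]                 # pairwise offsets
          dist = np.linalg.norm(diff, axis=-1)
          np.fill_diagonal(dist, np.inf)
          normals = diff / dist[..., None]
          rep = A_REP * np.exp((2 * RADIUS - dist) / B_REP)        # exponential repulsion
          f += (rep[..., None] * normals).sum(axis=1)
          return f

      def simulate(n=50, v_desired=1.5, steps=400, dt=0.05, seed=0):
          rng = np.random.default_rng(seed)
          pos = rng.uniform([0.0, 0.0], [10.0, 10.0], size=(n, 2))
          vel = np.zeros((n, 2))
          exit_point = np.array([12.0, 5.0])
          escaped = np.zeros(n, dtype=bool)
          for _ in range(steps):
              f = social_forces(pos, vel, exit_point, v_desired)
              vel += f / MASS * dt
              pos += vel * dt
              escaped |= np.linalg.norm(pos - exit_point, axis=1) < 0.5
          return escaped.sum()

      if __name__ == "__main__":
          # Raising the desired speed mimics panic; congestion near the exit can slow escape.
          for v in (1.0, 1.5, 2.5):
              print(f"desired speed {v} m/s -> agents that reached the exit in 20 s: {simulate(v_desired=v)}")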

  18. Spatial-temporal-covariance-based modeling, analysis, and simulation of aero-optics wavefront aberrations.

    Science.gov (United States)

    Vogel, Curtis R; Tyler, Glenn A; Wittich, Donald J

    2014-07-01

    We introduce a framework for modeling, analysis, and simulation of aero-optics wavefront aberrations that is based on spatial-temporal covariance matrices extracted from wavefront sensor measurements. Within this framework, we present a quasi-homogeneous structure function to analyze nonhomogeneous, mildly anisotropic spatial random processes, and we use this structure function to show that phase aberrations arising in aero-optics are, for an important range of operating parameters, locally Kolmogorov. This strongly suggests that the d^{5/3} power law for adaptive optics (AO) deformable mirror fitting error, where d denotes actuator separation, holds for certain important aero-optics scenarios. This framework also allows us to compute bounds on AO servo lag error and predictive control error. In addition, it provides us with the means to accurately simulate AO systems for the mitigation of aero-effects, and it may provide insight into underlying physical processes associated with turbulent flow. The techniques introduced here are demonstrated using data obtained from the Airborne Aero-Optics Laboratory.

  19. System Dynamics based Dengue modeling environment to simulate evolution of Dengue infection under different climate scenarios

    Science.gov (United States)

    Anwar, R.; Khan, R.; Usmani, M.; Colwell, R. R.; Jutla, A.

    2017-12-01

    Vector-borne infectious diseases such as dengue, Zika and chikungunya remain a public health threat. An estimate of the World Health Organization (WHO) suggests that about 2.5 billion people, representing ca. 40% of the human population, are at increased risk of dengue, with more than 100 million infection cases every year. Vector-borne infections cannot be eradicated, since the disease-causing pathogens survive in the environment. Over the last few decades dengue infection has been reported in more than 100 countries and is expanding geographically. The female Ae. aegypti mosquito, a daytime-active and major vector for dengue virus, is associated with urban population density and regional climatic processes. However, the mathematical quantification of relationships between vector abundance and climatic processes remains a challenge, particularly in regions where such data are not routinely collected. Here, using a system-dynamics-based feedback mechanism, an algorithm integrating knowledge from entomological, meteorological and epidemiological processes is developed that has the potential to provide ensemble simulations of the risk of occurrence of dengue infection in human populations. Using datasets from satellite remote sensing, the algorithm was calibrated and validated against actual dengue case data from Iquitos, Peru. We will show results on the model's capability to capture the initiation and peak of the observed time series. In addition, results from several simulation scenarios under different climatic conditions will be discussed.

  20. A simulation based optimization approach to model and design life support systems for manned space missions

    Science.gov (United States)

    Aydogan, Selen

    This dissertation considers the problem of process synthesis and design of life-support systems for manned space missions. A life-support system is a set of technologies to support human life for short and long-term spaceflights, via providing the basic life-support elements, such as oxygen, potable water, and food. The design of the system needs to meet the crewmember demand for the basic life-support elements (products of the system) and it must process the loads generated by the crewmembers. The system is subject to a myriad of uncertainties because most of the technologies involved are still under development. The result is high levels of uncertainties in the estimates of the model parameters, such as recovery rates or process efficiencies. Moreover, due to the high recycle rates within the system, the uncertainties are amplified and propagated within the system, resulting in a complex problem. In this dissertation, two algorithms have been successfully developed to help making design decisions for life-support systems. The algorithms utilize a simulation-based optimization approach that combines a stochastic discrete-event simulation and a deterministic mathematical programming approach to generate multiple, unique realizations of the controlled evolution of the system. The timelines are analyzed using time series data mining techniques and statistical tools to determine the necessary technologies, their deployment schedules and capacities, and the necessary basic life-support element amounts to support crew life and activities for the mission duration.

  1. Modeling and simulation of virtual human's coordination based on multi-agent systems

    Science.gov (United States)

    Zhang, Mei; Wen, Jing-Hua; Zhang, Zu-Xuan; Zhang, Jian-Qing

    2006-10-01

    The difficulties and hot topics in current virtual geographic environment (VGE) research are shared spaces and multi-user operation, distributed coordination and group decision-making. The theories and technologies of multi-agent systems provide a brand-new environment for the analysis, design and realization of distributed open systems. This paper takes cooperation among virtual humans in a VGE in which multiple users participate as its main research object. First, we describe the theoretical framework of VGE and present a formal description of the multi-agent system (MAS). Then, we analyze in detail an algorithm for learning the collective operating behavior of virtual humans based on an elite-preserving genetic algorithm (GA), and establish a dynamic action model in which multiple agents and objects interact dynamically, together with a colony movement strategy. Finally, we design an example showing how three evolutionary agents cooperate to complete the task of collectively pushing a column box, build a virtual-world prototype of virtual humans pushing a box collectively based on V-Realm Builder 2.0, and perform modeling and dynamic simulation with Simulink 6.

  2. Tidal Simulations of an Incised-Valley Fluvial System with a Physics-Based Geologic Model

    Science.gov (United States)

    Ghayour, K.; Sun, T.

    2012-12-01

    Physics-based geologic modeling approaches use fluid flow in conjunction with sediment transport and deposition models to devise evolutionary geologic models that focus on the underlying physical processes and attempt to resolve them at pertinent spatial and temporal scales. Physics-based models are particularly useful when the evolution of a depositional system is driven by the interplay of autogenic processes and their response to allogenic controls. This interplay can potentially create complex reservoir architectures with high-permeability sedimentary bodies bounded by a hierarchy of shales that can effectively impede flow in the subsurface. The complex stratigraphy of tide-influenced fluvial systems is an example of such co-existing and interacting environments of deposition. The focus of this talk is a novel formulation of boundary conditions for hydrodynamics-driven models of sedimentary systems. In tidal simulations, a time-accurate boundary treatment is essential for proper imposition of tidal forcing and fluvial inlet conditions where the flow may be reversed at times within a tidal cycle. As such, the boundary treatment at the inlet has to accommodate a smooth transition from inflow to outflow and vice versa without creating numerical artifacts. Our numerical experiments showed that boundary-condition treatments based on a local (frozen) one-dimensional approach along the boundary normal, which does not account for the variation of flow quantities in the tangential direction, often lead to unsatisfactory results corrupted by numerical artifacts. In this talk, we propose a new boundary treatment that retains all spatial and temporal terms in the model and as such is capable of accounting for nonlinearities and sharp variations of model variables near boundaries. The proposed approach borrows heavily from the idea set forth by J. Sesterhenn [1] for the compressible Navier-Stokes equations. The methodology is successfully applied to a tide-influenced incised

  3. In silico modelling and molecular dynamics simulation studies of thiazolidine based PTP1B inhibitors.

    Science.gov (United States)

    Mahapatra, Manoj Kumar; Bera, Krishnendu; Singh, Durg Vijay; Kumar, Rajnish; Kumar, Manoj

    2018-04-01

    Protein tyrosine phosphatase 1B (PTP1B) has been identified as a negative regulator of the insulin and leptin signalling pathways; hence, it can be considered a new therapeutic target of intervention for the treatment of type 2 diabetes. Inhibition of this molecular target addresses both diabetes and obesity, i.e. diabesity. In order to obtain more information on the identification and optimization of leads, pharmacophore modelling, atom-based 3D QSAR, docking and molecular dynamics studies were carried out on a set of ligands containing the thiazolidine scaffold. A six-point pharmacophore model consisting of three hydrogen bond acceptors (A), one negative ionic feature (N) and two aromatic rings (R) with discrete geometries as pharmacophoric features was developed for a predictive 3D QSAR model. The probable binding conformation of the ligands within the active site was studied through molecular docking. The molecular interactions and the structural features responsible for PTP1B inhibition and selectivity were further supplemented by a molecular dynamics simulation study over a time scale of 30 ns. The present investigation has identified some of the indispensable structural features of thiazolidine analogues which can be further explored to optimize PTP1B inhibitors.

  4. Modelling and Simulation of SVPWM Based Vector Controlled HVDC Light Systems

    Directory of Open Access Journals (Sweden)

    Ajay Kumar MOODADLA

    2012-11-01

    Full Text Available Recent upgrades in power electronics technology have led to improvements in insulated gate bipolar transistor (IGBT) based voltage source converter high voltage direct current (VSC HVDC) transmission systems. These are also commercially known as HVDC Light systems and are popular in renewable energy, microgrid, and electric power system applications. Among the different pulse width modulation (PWM) schemes, the space vector PWM (SVPWM) control scheme is of growing importance in power system applications because of its better DC bus utilization. In this paper, modelling of the converter is described, and the SVPWM scheme is used to control the HVDC Light system in order to achieve better DC bus utilization, harmonic reduction, and reduced power fluctuations. The simulations are carried out in the MATLAB/SIMULINK environment and the results are provided for steady-state and dynamic conditions. Finally, the performance of the SVPWM-based vector-controlled HVDC Light transmission system is compared with a sinusoidal pulse width modulation (SPWM) based HVDC Light system in terms of output voltage and total harmonic distortion (THD).

  5. Simulation-optimization model of reservoir operation based on target storage curves

    Directory of Open Access Journals (Sweden)

    Hong-bin Fang

    2014-10-01

    Full Text Available This paper proposes a new storage allocation rule based on target storage curves. Joint operating rules are also proposed to solve the operation problems of a multi-reservoir system with joint demands and water transfer-supply projects. The joint operating rules include a water diversion rule to determine the amount of diverted water in a period, a hedging rule based on an aggregated reservoir to determine the total release from the system, and a storage allocation rule to specify the release from each reservoir. A simulation-optimization model was established to optimize the key points of the water diversion curves, the hedging rule curves, and the target storage curves using the improved particle swarm optimization (IPSO) algorithm. A multi-reservoir water supply system located in Liaoning Province, China, including a water transfer-supply project, was employed as a case study to verify the effectiveness of the proposed joint operating rules and target storage curves. The results indicate that the proposed operating rules are suitable for this complex system, and the storage allocation rule based on target storage curves shows an improved performance with regard to the system storage distribution.

  6. Aviation Safety Simulation Model

    Science.gov (United States)

    Houser, Scott; Yackovetsky, Robert (Technical Monitor)

    2001-01-01

    The Aviation Safety Simulation Model is a software tool that enables users to configure a terrain, a flight path, and an aircraft and simulate the aircraft's flight along the path. The simulation monitors the aircraft's proximity to terrain obstructions, and reports when the aircraft violates accepted minimum distances from an obstruction. This model design facilitates future enhancements to address other flight safety issues, particularly air and runway traffic scenarios. This report shows the user how to build a simulation scenario and run it. It also explains the model's output.

  7. A variable hard sphere-based phenomenological inelastic collision model for rarefied gas flow simulations by the direct simulation Monte Carlo method

    Energy Technology Data Exchange (ETDEWEB)

    Prasanth, P S; Kakkassery, Jose K; Vijayakumar, R, E-mail: y3df07@nitc.ac.in, E-mail: josekkakkassery@nitc.ac.in, E-mail: vijay@nitc.ac.in [Department of Mechanical Engineering, National Institute of Technology Calicut, Kozhikode - 673 601, Kerala (India)

    2012-04-01

    A modified phenomenological model is constructed for the simulation of rarefied flows of polyatomic non-polar gas molecules by the direct simulation Monte Carlo (DSMC) method. This variable hard sphere-based model employs a constant rotational collision number, but all its collisions are inelastic in nature and at the same time the correct macroscopic relaxation rate is maintained. In equilibrium conditions, there is equi-partition of energy between the rotational and translational modes and it satisfies the principle of reciprocity or detailed balancing. The present model is applicable for moderate temperatures at which the molecules are in their vibrational ground state. For verification, the model is applied to the DSMC simulations of the translational and rotational energy distributions in nitrogen gas at equilibrium and the results are compared with their corresponding Maxwellian distributions. Next, the Couette flow, the temperature jump and the Rayleigh flow are simulated; the viscosity and thermal conductivity coefficients of nitrogen are numerically estimated and compared with experimentally measured values. The model is further applied to the simulation of the rotational relaxation of nitrogen through low- and high-Mach-number normal shock waves in a novel way. In all cases, the results are found to be in good agreement with theoretically expected and experimentally observed values. It is concluded that the inelastic collision of polyatomic molecules can be predicted well by employing the constructed variable hard sphere (VHS)-based collision model.

  8. Model-based framework for multi-axial real-time hybrid simulation testing

    Science.gov (United States)

    Fermandois, Gaston A.; Spencer, Billie F.

    2017-10-01

    Real-time hybrid simulation is an efficient and cost-effective dynamic testing technique for performance evaluation of structural systems subjected to earthquake loading with rate-dependent behavior. A loading assembly with multiple actuators is required to impose realistic boundary conditions on physical specimens. However, such a testing system is expected to exhibit significant dynamic coupling of the actuators and suffer from time lags that are associated with the dynamics of the servo-hydraulic system, as well as control-structure interaction (CSI). One approach to reducing experimental errors considers a multi-input, multi-output (MIMO) controller design, yielding accurate reference tracking and noise rejection. In this paper, a framework for multi-axial real-time hybrid simulation (maRTHS) testing is presented. The methodology employs a real-time feedback-feedforward controller for multiple actuators commanded in Cartesian coordinates. Kinematic transformations between actuator space and Cartesian space are derived for all six degrees of freedom of the moving platform. Then, a frequency domain identification technique is used to develop an accurate MIMO transfer function of the system. Further, a Cartesian-domain model-based feedforward-feedback controller is implemented for time lag compensation and to increase the robustness of the reference tracking for given model uncertainty. The framework is implemented using the 1/5th-scale Load and Boundary Condition Box (LBCB) located at the University of Illinois at Urbana-Champaign. To demonstrate the efficacy of the proposed methodology, a single-story frame subjected to earthquake loading is tested. One of the columns in the frame is represented physically in the laboratory as a cantilevered steel column. For real-time execution, the numerical substructure, kinematic transformations, and controllers are implemented on a digital signal processor. Results show excellent performance of the maRTHS framework when six

  9. The Simulation and Correction to the Brain Deformation Based on the Linear Elastic Model in IGS

    Institute of Scientific and Technical Information of China (English)

    MU Xiao-lan; SONG Zhi-jian

    2004-01-01

    Brain deformation is a vital factor affecting the precision of IGS, and simulating and correcting it has recently become a research hotspot. The research organizations that first addressed brain deformation with physical models include the Image Processing and Analysis department of Yale University and the Biomedical Modeling Lab of Vanderbilt University; the former uses a linear elastic model, the latter a consolidation model. The linear elastic model only needs to be driven by the surface displacement of the exposed brain cortex, which is more convenient to measure in the clinic.

  10. Simulation-based surgical education.

    Science.gov (United States)

    Evgeniou, Evgenios; Loizou, Peter

    2013-09-01

    The reduction in time for training at the workplace has created a challenge for the traditional apprenticeship model of training. Simulation offers the opportunity for repeated practice in a safe and controlled environment, focusing on trainees and tailored to their needs. Recent technological advances have led to the development of various simulators, which have already been introduced in surgical training. The complexity and fidelity of the available simulators vary; therefore, depending on our resources, we should select the appropriate simulator for the task or skill we want to teach. Educational theory informs us about the importance of context in professional learning. Simulation should therefore recreate the clinical environment and its complexity. Contemporary approaches to simulation have introduced novel ideas for teaching teamwork, communication skills and professionalism. In order for simulation-based training to be successful, simulators have to be validated appropriately and integrated into a training curriculum. Within a surgical curriculum, trainees should have protected time for simulation-based training, under appropriate supervision. Simulation-based surgical education should allow the appropriate practice of technical skills without ignoring the clinical context and must strike an adequate balance between the simulation environment and simulators. © 2012 The Authors. ANZ Journal of Surgery © 2012 Royal Australasian College of Surgeons.

  11. LoRa Scalability: A Simulation Model Based on Interference Measurements

    Directory of Open Access Journals (Sweden)

    Jetmir Haxhibeqiri

    2017-05-01

    Full Text Available LoRa is a long-range, low power, low bit rate and single-hop wireless communication technology. It is intended to be used in Internet of Things (IoT applications involving battery-powered devices with low throughput requirements. A LoRaWAN network consists of multiple end nodes that communicate with one or more gateways. These gateways act like a transparent bridge towards a common network server. The amount of end devices and their throughput requirements will have an impact on the performance of the LoRaWAN network. This study investigates the scalability in terms of the number of end devices per gateway of single-gateway LoRaWAN deployments. First, we determine the intra-technology interference behavior with two physical end nodes, by checking the impact of an interfering node on a transmitting node. Measurements show that even under concurrent transmission, one of the packets can be received under certain conditions. Based on these measurements, we create a simulation model for assessing the scalability of a single gateway LoRaWAN network. We show that when the number of nodes increases up to 1000 per gateway, the losses will be up to 32%. In such a case, pure Aloha will have around 90% losses. However, when the duty cycle of the application layer becomes lower than the allowed radio duty cycle of 1%, losses will be even lower. We also show network scalability simulation results for some IoT use cases based on real data.

  12. LoRa Scalability: A Simulation Model Based on Interference Measurements.

    Science.gov (United States)

    Haxhibeqiri, Jetmir; Van den Abeele, Floris; Moerman, Ingrid; Hoebeke, Jeroen

    2017-05-23

    LoRa is a long-range, low power, low bit rate and single-hop wireless communication technology. It is intended to be used in Internet of Things (IoT) applications involving battery-powered devices with low throughput requirements. A LoRaWAN network consists of multiple end nodes that communicate with one or more gateways. These gateways act like a transparent bridge towards a common network server. The amount of end devices and their throughput requirements will have an impact on the performance of the LoRaWAN network. This study investigates the scalability in terms of the number of end devices per gateway of single-gateway LoRaWAN deployments. First, we determine the intra-technology interference behavior with two physical end nodes, by checking the impact of an interfering node on a transmitting node. Measurements show that even under concurrent transmission, one of the packets can be received under certain conditions. Based on these measurements, we create a simulation model for assessing the scalability of a single gateway LoRaWAN network. We show that when the number of nodes increases up to 1000 per gateway, the losses will be up to 32%. In such a case, pure Aloha will have around 90% losses. However, when the duty cycle of the application layer becomes lower than the allowed radio duty cycle of 1%, losses will be even lower. We also show network scalability simulation results for some IoT use cases based on real data.
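
    For context on the pure-Aloha baseline mentioned above, the Python sketch below estimates packet loss by Monte Carlo for uncoordinated transmissions with a fixed airtime and Poisson packet generation per node, and compares the result with the analytic pure-Aloha expression. It is a generic illustration, not the authors' simulation model: spreading factors, the capture effect and duty-cycle enforcement are deliberately omitted, and all parameter values are assumptions.

      import numpy as np

      def aloha_loss(n_nodes=1000, airtime=0.07, mean_interval=600.0, sim_time=6 * 3600.0, seed=0):
          """Monte Carlo estimate of pure-Aloha packet loss: a packet is lost if any
          other transmission overlaps it in time (no capture effect modelled)."""
          rng = np.random.default_rng(seed)
          starts = []
          for _ in range(n_nodes):
              t = rng.exponential(mean_interval)
              while t < sim_time:                     # Poisson packet arrivals per node
                  starts.append(t)
                  t += rng.exponential(mean_interval)
          starts = np.sort(np.array(starts))
          # a packet collides if the previous or the next packet starts within one airtime
          gap_prev = np.diff(starts, prepend=-np.inf)
          gap_next = np.diff(starts, append=np.inf)
          collided = (gap_prev < airtime) | (gap_next < airtime)
          return collided.mean(), len(starts)

      if __name__ == "__main__":
          loss, n_packets = aloha_loss()
          g = n_packets * 0.07 / (6 * 3600.0)          # offered load G
          print(f"simulated loss: {loss:.2%} over {n_packets} packets "
                f"(analytic pure Aloha ~{1 - np.exp(-2 * g):.2%})")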

  13. A comparison between rate-and-state friction and microphysical models, based on numerical simulations of fault slip

    Science.gov (United States)

    van den Ende, M. P. A.; Chen, J.; Ampuero, J.-P.; Niemeijer, A. R.

    2018-05-01

    Rate-and-state friction (RSF) is commonly used for the characterisation of laboratory friction experiments, such as velocity-step tests. However, the RSF framework provides little physical basis for the extrapolation of these results to the scales and conditions of natural fault systems, and so open questions remain regarding the applicability of the experimentally obtained RSF parameters for predicting seismic cycle transients. As an alternative to classical RSF, microphysics-based models offer means for interpreting laboratory and field observations, but are generally over-simplified with respect to heterogeneous natural systems. In order to bridge the temporal and spatial gap between the laboratory and nature, we have implemented existing microphysical model formulations into an earthquake cycle simulator. Through this numerical framework, we make a direct comparison between simulations exhibiting RSF-controlled fault rheology, and simulations in which the fault rheology is dictated by the microphysical model. Even though the input parameters for the RSF simulation are directly derived from the microphysical model, the microphysics-based simulations produce significantly smaller seismic event sizes than the RSF-based simulation, and suggest a more stable fault slip behaviour. Our results reveal fundamental limitations in using classical rate-and-state friction for the extrapolation of laboratory results. The microphysics-based approach offers a more complete framework in this respect, and may be used for a more detailed study of the seismic cycle in relation to material properties and fault zone pressure-temperature conditions.
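
    For readers unfamiliar with the classical RSF framework used here as the reference case, the sketch below integrates the standard Dieterich (aging-law) formulation through a velocity step, which is the type of laboratory test mentioned above. All parameter values are illustrative laboratory-scale assumptions; this is not the authors' microphysics-based earthquake cycle simulator.

```python
import numpy as np

def rsf_velocity_step(a=0.010, b=0.015, dc=1e-5, mu0=0.6, v0=1e-6,
                      v_step=1e-5, t_end=50.0, dt=1e-3):
    """Friction response to a velocity step under classical rate-and-state
    friction with the Dieterich (aging) state evolution law."""
    t = np.arange(0.0, t_end, dt)
    v = np.where(t < 0.5 * t_end, v0, v_step)   # impose the velocity step
    theta = dc / v0                              # steady-state initial value
    mu = np.empty_like(t)
    for i, vi in enumerate(v):
        mu[i] = mu0 + a * np.log(vi / v0) + b * np.log(v0 * theta / dc)
        theta += dt * (1.0 - vi * theta / dc)    # aging law: dtheta/dt = 1 - V*theta/Dc
    return t, mu

t, mu = rsf_velocity_step()
# The long-term change should approach (a - b) * ln(V/V0), i.e. velocity weakening here.
print(f"steady-state friction change: {mu[-1] - mu[0]:+.4f}")
```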

  14. Future changes in extreme precipitation in the Rhine basin based on global and regional climate model simulations

    NARCIS (Netherlands)

    Pelt, van S.C.; Beersma, J.J.; Buishand, T.A.; Hurk, van den B.J.J.M.; Kabat, P.

    2012-01-01

    Probability estimates of the future change of extreme precipitation events are usually based on a limited number of available global climate model (GCM) or regional climate model (RCM) simulations. Since floods are related to heavy precipitation events, this restricts the assessment of flood risks.

  15. Advances in Intelligent Modelling and Simulation Artificial Intelligence-Based Models and Techniques in Scalable Computing

    CERN Document Server

    Khan, Samee; Burczyński, Tadeusz

    2012-01-01

    One of the most challenging issues in today’s large-scale computational modeling and design is to effectively manage the complex distributed environments, such as computational clouds, grids, ad hoc, and P2P networks operating under various types of users with evolving relationships fraught with uncertainties. In this context, the IT resources and services usually belong to different owners (institutions, enterprises, or individuals) and are managed by different administrators. Moreover, uncertainties are presented to the system at hand in various forms of information that are incomplete, imprecise, fragmentary, or overloading, which hinders the full and precise resolution of the evaluation criteria, sequencing and selection, and the assignment of scores. Intelligent scalable systems enable the flexible routing and charging, advanced user interactions and the aggregation and sharing of geographically-distributed resources in modern large-scale systems. This book presents new ideas, theories, models...

  16. Quality assurance target for community-based breast cancer screening in China: a model simulation.

    Science.gov (United States)

    Yang, Lan; Wang, Jing; Cheng, Juan; Wang, Yuan; Lu, Wenli

    2018-03-07

    We aimed to clarify the feasibility of a community-based screening strategy for breast cancer in Tianjin, China; to identify the factors that most significantly influenced its feasibility; and to identify the reference range for quality control. A state-transition Markov model simulated a hypothetical cohort of 100,000 healthy women; the starting age was set at 35 years and the time horizon was set to 50 years. The primary outcome for the model was the incremental cost-utility ratio (ICUR), defined as the program's cost per quality-adjusted life year (QALY) gained. Three screening strategies provided by community health services for women aged 35 to 69 years were compared with regard to different screening intervals. The probability of the ICUR being below 20,272 USD (i.e., triple the annual gross domestic product [3 GDPs]) per QALY saved was 100% for the annual screening strategy and for screening every three years. Only when the attendance rate was > 50% did the probability of annual screening being cost effective exceed 95%. The probability of the annual screening strategy being cost effective could reach 95% for a willingness-to-pay (WTP) of 2 GDPs when the compliance rate for transfer was > 80%. When 10% of stage I tumors were detected by screening, the probability of the annual screening strategy being cost effective would be up to 95% for a WTP > 3 GDPs. Annual community-based breast cancer screening was cost effective for a WTP of 3 GDPs based on the incidence of breast cancer in Tianjin, China. Measures are needed to keep performance indicators at a desirable level to maintain the cost-effectiveness of breast cancer screening.
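
    A note on the headline metric: the ICUR compares a screening strategy with its comparator as incremental cost divided by incremental QALYs and is judged against a willingness-to-pay threshold (here 3 GDPs per QALY). The sketch below illustrates that calculation only; the cohort totals in the example are hypothetical and not taken from the paper.

```python
def icur(cost_screen, cost_none, qaly_screen, qaly_none):
    """Incremental cost-utility ratio: extra cost per extra QALY gained."""
    return (cost_screen - cost_none) / (qaly_screen - qaly_none)

# Hypothetical cohort totals per 100,000 women (not the paper's numbers).
ratio = icur(cost_screen=9.0e7, cost_none=6.5e7,
             qaly_screen=2.452e6, qaly_none=2.450e6)
wtp_3gdp = 20272  # USD per QALY, the 3-GDP threshold quoted in the abstract
print(f"ICUR = {ratio:,.0f} USD/QALY -> cost effective: {ratio < wtp_3gdp}")
```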

  17. One-dimensional GIS-based model compared with a two-dimensional model in urban floods simulation.

    Science.gov (United States)

    Lhomme, J; Bouvier, C; Mignot, E; Paquier, A

    2006-01-01

    A GIS-based one-dimensional flood simulation model is presented and applied to the centre of the city of Nîmes (Gard, France), for mapping flow depths or velocities in the streets network. The geometry of the one-dimensional elements is derived from the Digital Elevation Model (DEM). The flow is routed from one element to the next using the kinematic wave approximation. At the crossroads, the flows in the downstream branches are computed using a conceptual scheme. This scheme was previously designed to fit Y-shaped pipes junctions, and has been modified here to fit X-shaped crossroads. The results were compared with the results of a two-dimensional hydrodynamic model based on the full shallow water equations. The comparison shows that good agreements can be found in the steepest streets of the study zone, but differences may be important in the other streets. Some reasons that can explain the differences between the two models are given and some research possibilities are proposed.
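
    Under the kinematic wave approximation used above, the friction slope equals the street slope, so the local flow depth follows from Manning's equation at the routed discharge. The sketch below solves that depth for a rectangular street section by bisection; the street width, slope, roughness and discharge are illustrative assumptions, and this is not the authors' model.

```python
def street_flow_depth(q, width, slope, n=0.015, h_max=5.0):
    """Normal flow depth (m) in a rectangular street section from Manning's
    equation, solved by bisection on the depth."""
    def discharge(h):
        area, perimeter = width * h, width + 2.0 * h
        radius = area / perimeter
        return (1.0 / n) * area * radius ** (2.0 / 3.0) * slope ** 0.5
    lo, hi = 1e-6, h_max
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if discharge(mid) < q else (lo, mid)
    return 0.5 * (lo + hi)

# A 5 m wide street on a 1% slope carrying 3 m^3/s (illustrative values):
print(f"flow depth = {street_flow_depth(3.0, width=5.0, slope=0.01):.2f} m")
```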

  18. Simulation of Nitrogen and Phosphorus Removal in Ecological Ditch Based on EFDC Model

    Science.gov (United States)

    Li, S. M.; Wang, X. L.; Zhou, Q. Y.; Han, N. N.

    2018-03-01

    Agricultural non-point source pollution has recently threatened water quality and ecological systems. To control it, the first and most important task is to control the migration and transformation of nitrogen and phosphorus in agricultural ditches. An ecological ditch was designed and, according to the design, a pilot device was built. The mechanism of N and P removal in ditches under the combined action of aquatic organisms and hydraulic power was studied through dynamic and static experiments, in order to identify the specific influences of different environmental factors such as influent concentration, influent flow and water level. The transport and diffusion of N and P in the ditch were simulated by the three-dimensional water quality model EFDC, and the simulation results were compared with the experimental data. The average relative errors of the EFDC-simulated results were all less than 15%, which verified the reliability of the model.

  19. A PC-based discrete event simulation model of the Civilian Radioactive Waste Management System

    International Nuclear Information System (INIS)

    Airth, G.L.; Joy, D.S.; Nehls, J.W.

    1991-01-01

    A System Simulation Model has been developed for the Department of Energy to simulate the movement of individual waste packages (spent fuel assemblies and fuel containers) through the Civilian Radioactive Waste Management System (CRWMS). A discrete event simulation language, GPSS/PC, which runs on an IBM/PC and operates under DOS 5.0, mathematically represents the movement and processing of radioactive waste packages through the CRWMS and the interaction of these packages with the equipment in the various facilities. This model can be used to quantify the impacts of different operating schedules, operational rules, system configurations, and equipment reliability and availability considerations on the performance of processes comprising the CRWMS and how these factors combine to determine overall system performance for the purpose of making system design decisions. The major features of the System Simulation Model are: the ability to reference characteristics of the different types of radioactive waste (age, burnup, etc.) in order to make operational and/or system design decisions, the ability to place stochastic variations on operational parameters such as processing time and equipment outages, and the ability to include a rigorous simulation of the transportation system. Output from the model includes the numbers, types, and characteristics of waste packages at selected points in the CRWMS and the extent to which various resources will be utilized in order to transport, process, and emplace the waste
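
    The GPSS/PC model itself is specific to the study, but the underlying discrete-event idea can be illustrated with a short event-queue loop in which each waste package passes through a few serial stations with stochastic service times. The station names, mean service times and single-server assumption below are hypothetical; this is a minimal analogue, not the CRWMS System Simulation Model.

```python
import heapq
import random

def simulate(num_packages=20, seed=1):
    """Minimal discrete-event loop: each waste package visits three serial
    stations; service times are exponentially distributed (illustrative)."""
    random.seed(seed)
    stations = ("receive", "process", "emplace")
    mean_service = {"receive": 2.0, "process": 6.0, "emplace": 3.0}  # hours
    events = [(0.0, pkg, 0) for pkg in range(num_packages)]  # (time, package, stage)
    heapq.heapify(events)
    free_at = {s: 0.0 for s in stations}   # one server per station
    finish = {}
    while events:
        t, pkg, stage = heapq.heappop(events)
        if stage == len(stations):         # package has left the last station
            finish[pkg] = t
            continue
        station = stations[stage]
        start = max(t, free_at[station])   # wait if the station is busy
        done = start + random.expovariate(1.0 / mean_service[station])
        free_at[station] = done
        heapq.heappush(events, (done, pkg, stage + 1))
    return finish

print(f"last package emplaced at t = {max(simulate().values()):.1f} h")
```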

  20. Numerical simulation of Trichel pulses of negative DC corona discharge based on a plasma chemical model

    Science.gov (United States)

    Chen, Xiaoyue; Lan, Lei; Lu, Hailiang; Wang, Yu; Wen, Xishan; Du, Xinyu; He, Wangling

    2017-10-01

    A numerical simulation method of negative direct current (DC) corona discharge based on a plasma chemical model is presented, and a coaxial cylindrical gap is adopted. In the plasma chemical reactions, 15 particle species, 61 kinds of collision reactions involving electrons, and 22 kinds of reactions between ions are considered. Based on this method, continuous Trichel pulses are calculated on a timescale of about 100 μs, and the microscopic physicochemical processes of negative DC corona discharge in three different periods are discussed. The obtained results show that the amplitude of the Trichel pulses is between 1 and 2 mA, and that the pulse interval is on the order of 10⁻⁵ s. The positive ions produced by avalanche ionization enhance the electric field near the cathode at the beginning of the pulse and then disappear at the cathode surface. The electric field decreases and the pulse ceases to develop. The negative ions produced by attachment slowly move away from the cathode, and the electric field increases gradually until the next pulse begins to develop. The positive and negative ions with the highest density during the corona discharge process are O4+ and O3-, respectively.

  1. Modeling of High-Speed InP DHBTs using Electromagnetic Simulation Based De-embedding

    DEFF Research Database (Denmark)

    Johansen, Tom Keinicke; Krozer, Viktor; Konczykowska, Agnieszka

    2006-01-01

    In this paper an approach for high-speed InP DHBT modeling valid to 110 GHz is reported. Electromagnetic (EM) simulation is applied to predict the embedded network model caused by pad parasitics. The form of the parasitic network calls for a 4-step de-embedding approach. Applying direct parameter extraction on the de-embedded device response leads to an accurate small-signal model description of the InP DHBT. A parameter extraction approach is described for the Agilent HBT model, which assures consistency between large-signal and bias-dependent small-signal modeling.

  2. Skin sensitization: Modeling based on skin metabolism simulation and formation of protein conjugates

    DEFF Research Database (Denmark)

    Dimitrov, Sabcho; Low, Lawrence; Patlewicz, Grace

    2005-01-01

    alerting groups, three-dimensional (3D)-QSARs were developed to describe the multiplicity of physicochemical, steric, and electronic parameters. These 3D-QSARs, so-called pattern recognition-type models, were applied each time a latent alerting group was identified in a parent chemical or its generated...... in the model building. The TIssue MEtabolism Simulator (TIMES) software was used to integrate a skin metabolism simulator and 3D-QSARs to evaluate the reactivity of chemicals thus predicting their likely skin sensitization potency....

  3. A Pore Scale Flow Simulation of Reconstructed Model Based on the Micro Seepage Experiment

    Directory of Open Access Journals (Sweden)

    Jianjun Liu

    2017-01-01

    Full Text Available Research on microscopic seepage mechanisms and fine description of reservoir pore structure plays an important role in the effective development of low and ultralow permeability reservoirs. In this paper, the typical micro pore structure model was established in two ways: the conventional model reconstruction method and the built-in graphics function method of Comsol®. A pore scale flow simulation was conducted on the models reconstructed in the two different ways, using the creeping flow interface and the Brinkman equation interface, respectively. The results showed that the simulations of the two models agreed well in the distribution of velocity, pressure, Reynolds number, and so on. This verified the feasibility of the direct reconstruction method from graphic file to geometric model, which provides a new way to diversify the numerical study of micro seepage mechanisms.

  4. A numerical simulation of wheel spray for simplified vehicle model based on discrete phase method

    Directory of Open Access Journals (Sweden)

    Xingjun Hu

    2015-07-01

    Full Text Available Road spray greatly affects vehicle body soiling and driving safety. The study of road spray has attracted increasing attention. In this article, computational fluid dynamics software with widely used finite volume method code was employed to investigate the numerical simulation of spray induced by a simplified wheel model and a modified square-back model proposed by the Motor Industry Research Association. The shear stress transport k-omega turbulence model, the discrete phase model, and the Eulerian wall-film model were selected. In the simulation process, the phenomena of breakup and coalescence of drops were considered, and the continuous and discrete phases were treated as two-way coupled in momentum and turbulent motion. The relationship between the vehicle external flow structure and body soiling was also discussed.

  5. Conceptual design of a nucleo electric simulator with PBMR reactor based in Reduced order models

    International Nuclear Information System (INIS)

    Valle H, J.; Morales S, J.B.

    2005-01-01

    The purpose of this project is to understand in depth the operation of a PBMR-type (Pebble Bed Modular Reactor) nuclear power plant, which has a reactor of graphite-moderated spheres with uranium dioxide fuel, cooled with helium and operating on a Brayton thermodynamic cycle. The simulator seeks to describe the dynamics of the energy generation process in the nuclear fuel, the transport of heat toward the coolant, and the conversion to mechanical energy in the turbo-generators, as well as in the heat exchangers indispensable for the process. The dynamics of fuel element reloading is not modeled in detail, but its effects are represented in the model parameters. The turbo-compressors of the primary circuit of the working fluid are also modeled. The control of the reactor power is modeled by means of reactivity functions specified in the simulation platform. The proposed mathematical models will be implemented in the Simulink-MATLAB simulation platform. The proposed control panels for this simulator can be designed and implemented using the Simulink toolbox, which facilitates this process. The work presents the most important mathematical models for their future implementation in Simulink. (Author)
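
    Where the abstract mentions driving reactor power with reactivity functions, a reduced-order core model typically reduces to point kinetics. The sketch below is a minimal one-delayed-group point-kinetics integrator driven by an arbitrary reactivity function; all parameter values are illustrative assumptions rather than PBMR data, and this is not the simulator described above.

```python
import numpy as np

def point_kinetics(rho_of_t, beta=0.0065, lam=0.08, gen_time=1e-3,
                   t_end=60.0, dt=1e-3):
    """One-delayed-group point-kinetics model driven by a reactivity
    function rho_of_t(t); explicit Euler integration."""
    p = 1.0                                   # relative power
    c = beta * p / (lam * gen_time)           # equilibrium precursor density
    n_steps = int(t_end / dt)
    hist = np.empty(n_steps)
    for i in range(n_steps):
        rho = rho_of_t(i * dt)
        dp = ((rho - beta) / gen_time) * p + lam * c
        dc = (beta / gen_time) * p - lam * c
        p, c = p + dt * dp, c + dt * dc
        hist[i] = p
    return hist

# Step insertion of +10 cents of reactivity after 10 s.
power = point_kinetics(lambda t: 0.1 * 0.0065 if t > 10.0 else 0.0)
print(f"relative power after 60 s: {power[-1]:.2f}")
```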

  6. A PC-based discrete event simulation model of the civilian radioactive waste management system

    International Nuclear Information System (INIS)

    Airth, G.L.; Joy, D.S.; Nehls, J.W.

    1992-01-01

    This paper discusses a System Simulation Model which has been developed for the Department of Energy to simulate the movement of individual waste packages (spent fuel assemblies and fuel containers) through the Civilian Radioactive Waste Management System (CRWMS). A discrete event simulation language, GPSS/PC, which runs on an IBM/PC and operates under DOS 5.0, mathematically represents the movement and processing of radioactive waste packages through the CRWMS and the interaction of these packages with the equipment in the various facilities. The major features of the System Simulation Model are: the ability to reference characteristics of the different types of radioactive waste (age, burnup, etc.) in order to make operational and/or system design decisions, the ability to place stochastic variations on operational parameters such as processing time and equipment outages, and the ability to include a rigorous simulation of the transportation system. Output from the model includes the numbers, types, and characteristics of waste packages at selected points in the CRWMS and the extent to which various resources will be utilized in order to transport, process, and emplace the waste

  7. A method to solve the aircraft magnetic field model basing on geomagnetic environment simulation

    International Nuclear Information System (INIS)

    Lin, Chunsheng; Zhou, Jian-jun; Yang, Zhen-yu

    2015-01-01

    In aeromagnetic surveys, it is difficult to solve the aircraft magnetic field model by flying for some unmanned or disposable aircraft. Therefore, a model solving method on the ground is proposed. The method simulates the geomagnetic environment in which the aircraft flies and creates background magnetic field samples that are the same as the magnetic field arising from the aircraft's maneuvering. The aircraft magnetic field model can then be solved by collecting the magnetic field samples. The method to simulate the magnetic environment and the method to control the errors are presented as well. Finally, an experiment is done for verification. The results show that the model solving precision and stability achieved by the method are good. The model parameters calculated by the method in one region can be used in other regions worldwide as well. - Highlights: • A method to solve the aircraft magnetic field model on the ground is proposed. • The method solves the model by simulating the dynamic geomagnetic environment as in real flight. • The way to control the error of the method was analyzed. • An experiment is done for verification

  8. General model for Pc-based simulation of PWR and BWR plant components

    Energy Technology Data Exchange (ETDEWEB)

    Ratemi, W M; Abomustafa, A M [Faculty of Engineering, Alfateh University, Tripoli (Libyan Arab Jamahiriya)]

    1995-10-01

    In this paper, we present a basic mathematical model derived from physical principles to suit the simulation of PWR components such as the pressurizer, intact steam generator and ruptured steam generator, and the reactor component of a BWR plant. In our development, we produced an NMMS package for nuclear modular modelling simulation. The package is installed on a personal computer and is designed to be user friendly through a color graphics windows interface. The package works under three environments, namely, pre-processor, simulation, and post-processor. Our analysis of results, using a cross-graphing technique for the steam generator tube rupture (SGTR) accident, yielded a new proposal for on-line monitoring of the control strategy for an SGTR accident in a nuclear or conventional power plant. 4 figs.

  9. Computer simulation of 2D grain growth using a cellular automata model based on the lowest energy principle

    International Nuclear Information System (INIS)

    He Yizhu; Ding Hanlin; Liu Liufa; Shin, Keesam

    2006-01-01

    The morphology, topology and kinetics of normal grain growth in two dimensions were studied by computer simulation using a cellular automata (CA) model based on the lowest energy principle. The thermodynamic energy that follows Maxwell-Boltzmann statistics has been introduced into this model for the calculation of the energy change. The transition that can reduce the system energy to the lowest level is chosen to occur when there is more than one possible transition direction. The simulation results show that the kinetics of normal grain growth follows the Burke equation with the growth exponent m = 2. The analysis of topology further indicates that normal grain growth can be simulated fairly well by the present CA model. The vanishing of grains with different numbers of sides is discussed in the simulation
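
    A minimal Potts-type sketch of the lowest-energy transition rule described above follows; it omits the Maxwell-Boltzmann thermal term and uses an illustrative lattice size and orientation count, so it should be read as an analogue of the approach rather than the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_energy(grid, x, y, state):
    """Boundary energy of cell (x, y): number of unlike nearest neighbours."""
    n, m = grid.shape
    neigh = [grid[(x - 1) % n, y], grid[(x + 1) % n, y],
             grid[x, (y - 1) % m], grid[x, (y + 1) % m]]
    return sum(s != state for s in neigh), neigh

def grain_growth_sweep(grid):
    """One CA sweep: each visited cell adopts the neighbouring orientation
    that gives the lowest local boundary energy (ties broken at random)."""
    n, m = grid.shape
    for _ in range(n * m):
        x, y = rng.integers(n), rng.integers(m)
        e_now, neigh = local_energy(grid, x, y, grid[x, y])
        energies = [local_energy(grid, x, y, s)[0] for s in neigh]
        if min(energies) <= e_now:                    # lowest-energy transition
            best = [s for s, e in zip(neigh, energies) if e == min(energies)]
            grid[x, y] = best[rng.integers(len(best))]

grid = rng.integers(0, 50, size=(64, 64))             # 50 random orientations
for _ in range(20):
    grain_growth_sweep(grid)
print("grain orientations remaining:", len(np.unique(grid)))
```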

  10. Modelling and Simulating Complex Systems in Biology: introducing NetBioDyn : A Pedagogical and Intuitive Agent-Based Software

    OpenAIRE

    Ballet, Pascal; Rivière, Jérémy; Pothet, Alain; Théron, Michaël; Pichavant, Karine; Abautret, Frank; Fronville, Alexandra; Rodin, Vincent

    2017-01-01

    Modelling and teaching complex biological systems is a difficult process. Multi-Agent Based Simulations (MABS) have proved to be an appropriate approach both in research and education when dealing with such systems including emergent, self-organizing phenomena. This chapter presents NetBioDyn, an original software aimed at biologists (students, teachers, researchers) to easily build and simulate complex biological mechanisms observed in multicellular and molecular syst...

  11. Numerical simulation of hot-melt extrusion processes for amorphous solid dispersions using model-based melt viscosity.

    Science.gov (United States)

    Bochmann, Esther S; Steffens, Kristina E; Gryczke, Andreas; Wagner, Karl G

    2018-03-01

    Simulation of HME processes is a valuable tool for increased process understanding and ease of scale-up. However, the experimental determination of all required input parameters is tedious, namely the melt rheology of the amorphous solid dispersion (ASD) in question. Hence, a procedure to simplify the application of hot-melt extrusion (HME) simulation for forming amorphous solid dispersions (ASD) is presented. The commercial 1D simulation software Ludovic® was used to conduct (i) simulations using a full experimental data set of all input variables including melt rheology and (ii) simulations using model-based melt viscosity data based on the ASD's glass transition and the physical properties of the polymeric matrix only. Both types of HME computation were further compared to experimental HME results. Variation in physical properties (e.g. heat capacity, density) and several process characteristics of HME (residence time distribution, energy consumption) among the simulations and experiments were evaluated. The model-based melt viscosity was calculated by using the glass transition temperature (Tg) of the investigated blend and the melt viscosity of the polymeric matrix by means of a Tg-viscosity correlation. The results of measured melt viscosity and model-based melt viscosity were similar with only a few exceptions, leading to similar HME simulation outcomes. In the end, the experimental effort prior to HME simulation could be minimized, and the procedure provides a good starting point for rational development of ASDs by means of HME. As model excipients, vinylpyrrolidone-vinyl acetate copolymer (COP) in combination with various APIs (carbamazepine, dipyridamole, indomethacin, and ibuprofen) or polyethylene glycol (PEG 1500) as plasticizer were used to form the ASDs. Copyright © 2017 Elsevier B.V. All rights reserved.
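
    The Tg-viscosity correlation idea can be illustrated with the WLF equation and its "universal" constants, shifting viscosity relative to the blend's glass transition. The constants, the reference viscosity of 10^12 Pa·s at Tg, and the example temperatures below are generic assumptions; the paper's actual correlation may differ.

```python
def wlf_viscosity(temp, tg, eta_at_tg=1e12, c1=17.44, c2=51.6):
    """Melt viscosity (Pa·s) at temperature temp estimated from the glass
    transition tg via the WLF equation with the 'universal' constants."""
    return eta_at_tg * 10.0 ** (-c1 * (temp - tg) / (c2 + temp - tg))

# Hypothetical case: the blend Tg dropped from 110 to 80 °C after adding a
# plasticizing API; estimate the melt viscosity at typical extrusion temperatures.
for temp_c in (140, 160, 180):
    print(f"{temp_c} °C: {wlf_viscosity(temp_c, tg=80):.2e} Pa·s")
```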

  12. A new approach to model-based simulation of disordered polymer blend solar cells

    Energy Technology Data Exchange (ETDEWEB)

    Stenzel, Ole; Thiedmann, Ralf; Schmidt, Volker [Institute of Stochastics, Ulm University, Ulm, 89069 (Germany); Koster, L.J.A. [Molecular Electronics, Zernike Institute for Advanced Materials, University of Groningen, Groningen, 9747 AG (Netherlands); Oosterhout, Stefan D.; Janssen, Rene A.J. [Chemical Engineering and Chemistry, Molecular Materials and Nanosystems, Eindhoven University of Technology, Eindhoven, 5600 MB (Netherlands)

    2012-03-21

    The 3D nanomorphology of blends of two different (organic and inorganic) solid phases as used in bulk heterojunction solar cells is described by a spatial stochastic model. The model is fitted to 3D image data describing the photoactive layer of poly(3-hexylthiophene)-ZnO (P3HT-ZnO) solar cells fabricated with varying spin-coating velocities. A scenario analysis is performed where 3D morphologies are simulated for different spin-coating velocities to elucidate the correlation between processing conditions, morphology, and efficiency of hybrid P3HT-ZnO solar cells. The simulated morphologies are analyzed quantitatively in terms of structural and physical characteristics. It is found that there is a tendency for the morphology to coarsen with increasing spin-coating velocity, creating larger domains of P3HT and ZnO. The impact of the spin-coating velocity on the connectivity of the morphology and the existence of percolation pathways for charge carriers in the resulting films appears insignificant, but the quality of percolation pathways, considering the charge carrier mobility, strongly varies with the spin-coating velocity, especially in the ZnO phase. Also, the exciton quenching efficiency decreases significantly for films deposited at large spin-coating velocities. The stochastic simulation model investigated is compared to a simulated annealing model and is found to provide a better fit to the experimental data. (Copyright © 2012 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  13. Simulation in Complex Modelling

    DEFF Research Database (Denmark)

    Nicholas, Paul; Ramsgaard Thomsen, Mette; Tamke, Martin

    2017-01-01

    This paper will discuss the role of simulation in extended architectural design modelling. As a framing paper, the aim is to present and discuss the role of integrated design simulation and feedback between design and simulation in a series of projects under the Complex Modelling framework. Complex...... performance, engage with high degrees of interdependency and allow the emergence of design agency and feedback between the multiple scales of architectural construction. This paper presents examples for integrated design simulation from a series of projects including Lace Wall, A Bridge Too Far and Inflated...... Restraint developed for the research exhibition Complex Modelling, Meldahls Smedie Gallery, Copenhagen in 2016. Where the direct project aims and outcomes have been reported elsewhere, the aim for this paper is to discuss overarching strategies for working with design integrated simulation....

  14. Design base transient analysis using the real-time nuclear reactor simulator model

    International Nuclear Information System (INIS)

    Tien, K.K.; Yakura, S.J.; Morin, J.P.; Gregory, M.V.

    1987-01-01

    A real-time simulation model has been developed to describe the dynamic response of all major systems in a nuclear process reactor. The model consists of a detailed representation of all hydraulic components in the external coolant circulating loops consisting of piping, valves, pumps and heat exchangers. The reactor core is described by a three-dimensional neutron kinetics model with detailed representation of assembly coolant and moderator thermal hydraulics. The models have been developed to support a real-time training simulator, therefore, they reproduce system parameters characteristic of steady state normal operation with high precision. The system responses for postulated severe transients such as large pipe breaks, loss of pumping power, piping leaks, malfunctions in control rod insertion, and emergency injection of neutron absorber are calculated to be in good agreement with reference safety analyses. Restrictions were imposed by the requirement that the resulting code be able to run in real-time with sufficient spare time to allow interfacing with secondary systems and simulator hardware. Due to hardware set-up and real plant instrumentation, simplifications due to symmetry were not allowed. The resulting code represents a coarse-node engineering model in which the level of detail has been tailored to the available computing power of a present generation super-minicomputer. Results for several significant transients, as calculated by the real-time model, are compared both to actual plant data and to results generated by fine-mesh analysis codes

  15. Design base transient analysis using the real-time nuclear reactor simulator model

    International Nuclear Information System (INIS)

    Tien, K.K.; Yakura, S.J.; Morin, J.P.; Gregory, M.V.

    1987-01-01

    A real-time simulation model has been developed to describe the dynamic response of all major systems in a nuclear process reactor. The model consists of a detailed representation of all hydraulic components in the external coolant circulating loops consisting of piping, valves, pumps and heat exchangers. The reactor core is described by a three-dimensional neutron kinetics model with detailed representation of assembly coolant and moderator thermal hydraulics. The models have been developed to support a real-time training simulator, therefore, they reproduce system parameters characteristic of steady state normal operation with high precision. The system responses for postulated severe transients such as large pipe breaks, loss of pumping power, piping leaks, malfunctions in control rod insertion, and emergency injection of neutron absorber are calculated to be in good agreement with reference safety analyses. Restrictions were imposed by the requirement that the resulting code be able to run in real-time with sufficient spare time to allow interfacing with secondary systems and simulator hardware. Due to hardware set-up and real plant instrumentation, simplifications due to symmetry were not allowed. The resulting code represents a coarse-node engineering model in which the level of detail has been tailored to the available computing power of a present generation super-minicomputer. Results for several significant transients, as calculated by the real-time model, are compared both to actual plant data and to results generated by fine-mesh analysis codes

  16. Scientific Modeling and simulations

    CERN Document Server

    Diaz de la Rubia, Tomás

    2009-01-01

    Showcases the conceptual advantages of modeling which, coupled with the unprecedented computing power available through simulations, allow scientists to tackle the formidable problems of our society, such as the search for hydrocarbons, understanding the structure of a virus, or the intersection between simulations and real data in extreme environments

  17. Simulation model of ANN based maximum power point tracking controller for solar PV system

    Energy Technology Data Exchange (ETDEWEB)

    Rai, Anil K.; Singh, Bhupal [Department of Electrical and Electronics Engineering, Ajay Kumar Garg Engineering College, Ghaziabad 201009 (India); Kaushika, N.D.; Agarwal, Niti [School of Research and Development, Bharati Vidyapeeth College of Engineering, A-4 Paschim Vihar, New Delhi 110063 (India)

    2011-02-15

    In this paper the simulation model of an artificial neural network (ANN) based maximum power point tracking controller has been developed. The controller consists of an ANN tracker and the optimal control unit. The ANN tracker estimates the voltages and currents corresponding to the maximum power delivered by the solar PV (photovoltaic) array for variable cell temperature and solar radiation. The cell temperature is considered as a function of ambient air temperature, wind speed and solar radiation. The tracker is trained on a set of 124 patterns using the back propagation algorithm. The mean square error between tracker output and target values is set to be of the order of 10⁻⁵, and the learning process converges successfully after 1281 epochs. The accuracy of the ANN tracker has been validated by employing different test data sets. The control unit uses the estimates of the ANN tracker to adjust the duty cycle of the chopper to the optimum value needed for maximum power transfer to the specified load. (author)

  18. Modeling and Simulation of Polarization in Internet Group Opinions Based on Cellular Automata

    Directory of Open Access Journals (Sweden)

    Yaofeng Zhang

    2015-01-01

    Full Text Available Hot events on the Internet always attract many people, who usually form one or several opinion camps through discussion. For the problem of polarization in Internet group opinions, we propose a new model based on Cellular Automata that considers neighbors, opinion leaders, and external influences. Simulation results show the following: (1) It is easy to form polarization for both continuous opinions and discrete opinions when only neighbors' influence is considered, and continuous opinions are more effective in speeding up the polarization of the group. (2) The coevolution mechanism takes more time to make the system stable, and the global coupling mechanism leads the system to consensus. (3) Opinion leaders play an important role in the development of consensus in Internet group opinions. However, both taking the opinion leaders as zealots and taking some randomly selected individuals as zealots are not conducive to consensus. (4) Double opinion leaders with consistent opinions will accelerate the formation of group consensus, but opposite opinions will lead to group polarization. (5) Only small external influences can change the evolutionary direction of Internet group opinions.

  19. Effects of distribution density and cell dimension of 3D vegetation model on canopy NDVI simulation base on DART

    Science.gov (United States)

    Tao, Zhu; Shi, Runhe; Zeng, Yuyan; Gao, Wei

    2017-09-01

    The 3D model is an important part of simulated remote sensing for earth observation. At the small-scale spatial extent handled by the DART software, both the details of the model itself and the number of distributed model instances have an important impact on the scene canopy Normalized Difference Vegetation Index (NDVI). Taking Phragmites australis in the Yangtze Estuary as an example, this paper studied the effect of the P. australis model on the canopy NDVI, based on previous studies of model precision, mainly with respect to the cell dimension of the DART software and the density distribution of the P. australis model in the scene, as well as the choice of model density under the cost of computer running time in the actual simulation. The DART cell dimensions and the density of the scene model were set by using the optimal-precision model from existing research results. The simulated NDVI with different model densities under different cell dimensions was examined by error analysis. By studying the relationship between relative error, absolute error and time costs, we established a density selection method for the P. australis model in the simulation of small-scale scenes. Experiments showed that the number of P. australis plants in the simulated scene need not be the same as in the real environment, due to the difference between the 3D model and the real scenario. The best simulation results, also in terms of visual effects, could be obtained by keeping the density at about 40 plants per square meter.
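
    For reference, the NDVI that the simulated canopy is evaluated on is a simple band ratio; the sketch below computes it from simulated near-infrared and red reflectances. The reflectance values in the example are hypothetical, not DART outputs from this study.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index from near-infrared and red
    reflectances (arrays or scalars)."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red + 1e-12)   # epsilon guards against 0/0

# Hypothetical canopy reflectances for two simulated reed densities.
print(ndvi([0.45, 0.50], [0.06, 0.05]))
```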

  20. Data-driven Travel Demand Modelling and Agent-based Traffic Simulation in Amsterdam Urban Area

    NARCIS (Netherlands)

    Melnikov, V.R.; Krzhizhanovskaya, V.V.; Lees, M.H.; Boukhanovsky, A.V.

    2016-01-01

    The goal of this project is the development of a large-scale agent-based traffic simulation system for Amsterdam urban area, validated on sensor data and adjusted for decision support in critical situations and for policy making in sustainable city development, emission control and electric car

  1. MODELLING TEMPORAL SCHEDULE OF URBAN TRAINS USING AGENT-BASED SIMULATION AND NSGA2-BASED MULTIOBJECTIVE OPTIMIZATION APPROACHES

    Directory of Open Access Journals (Sweden)

    M. Sahelgozin

    2015-12-01

    Full Text Available Increasing distances between locations of residence and services lead to a large number of daily commutes in urban areas. Developing subway systems has been considered by transportation managers as a response to this huge travel demand. In the development of subway infrastructure, representing a temporal schedule for trains is an important task, because an appropriately designed timetable decreases total passenger travel times, total operation costs and energy consumption of trains. Since these variables are not positively correlated, subway scheduling is considered a multi-criteria optimization problem. Therefore, proposing a proper solution for subway scheduling has always been a controversial issue. On the other hand, research on a phenomenon requires a summarized representation of the real world that is known as a model. In this study, it is attempted to model the temporal schedule of urban trains so that it can be applied in Multi-Criteria Subway Schedule Optimization (MCSSO) problems. At first, a conceptual framework is presented for MCSSO. Then, an agent-based simulation environment is implemented to perform a Sensitivity Analysis (SA), which is used to extract the interrelations between the framework components. These interrelations are then taken into account in order to construct the proposed model. In order to evaluate the performance of the model in MCSSO problems, Tehran subway line no. 1 is considered as the case study. Results of the study show that the model was able to generate an acceptable distribution of Pareto-optimal solutions which are applicable in real situations where solving an MCSSO problem is the goal. Also, the accuracy of the model in representing the operation of subway systems was significant.

  2. Use of stratigraphic, petrographic, hydrogeologic and geochemical information for hydrogeologic modelling based on geostatistical simulation

    International Nuclear Information System (INIS)

    Rohlig, K.J.; Fischer, H.; Poltl, B.

    2004-01-01

    This paper describes the stepwise utilization of geologic information from various sources for the construction of hydrogeological models of a sedimentary site by means of geostatistical simulation. It presents a practical application of aquifer characterisation by firstly simulating hydrogeological units and then the hydrogeological parameters. Due to the availability of a large amount of hydrogeological, geophysical and other data and information, the Gorleben site (Northern Germany) has been used for a case study in order to demonstrate the approach. The study, which has not yet been completed, tries to incorporate as much as possible of the available information and to characterise the remaining uncertainties. (author)

  3. ChainMail based neural dynamics modeling of soft tissue deformation for surgical simulation.

    Science.gov (United States)

    Zhang, Jinao; Zhong, Yongmin; Smith, Julian; Gu, Chengfan

    2017-07-20

    Realistic and real-time modeling and simulation of soft tissue deformation is a fundamental research issue in the field of surgical simulation. In this paper, a novel cellular neural network approach is presented for modeling and simulation of soft tissue deformation by combining neural dynamics of cellular neural network with ChainMail mechanism. The proposed method formulates the problem of elastic deformation into cellular neural network activities to avoid the complex computation of elasticity. The local position adjustments of ChainMail are incorporated into the cellular neural network as the local connectivity of cells, through which the dynamic behaviors of soft tissue deformation are transformed into the neural dynamics of cellular neural network. Experiments demonstrate that the proposed neural network approach is capable of modeling the soft tissues' nonlinear deformation and typical mechanical behaviors. The proposed method not only improves ChainMail's linear deformation with the nonlinear characteristics of neural dynamics but also enables the cellular neural network to follow the principle of continuum mechanics to simulate soft tissue deformation.

  4. MATLAB/Simulink Pulse-Echo Ultrasound System Simulator Based on Experimentally Validated Models.

    Science.gov (United States)

    Kim, Taehoon; Shin, Sangmin; Lee, Hyongmin; Lee, Hyunsook; Kim, Heewon; Shin, Eunhee; Kim, Suhwan

    2016-02-01

    A flexible clinical ultrasound system must operate with different transducers, which have characteristic impulse responses and widely varying impedances. The impulse response determines the shape of the high-voltage pulse that is transmitted and the specifications of the front-end electronics that receive the echo; the impedance determines the specification of the matching network through which the transducer is connected. System-level optimization of these subsystems requires accurate modeling of pulse-echo (two-way) response, which in turn demands a unified simulation of the ultrasonics and electronics. In this paper, this is realized by combining MATLAB/Simulink models of the high-voltage transmitter, the transmission interface, the acoustic subsystem which includes wave propagation and reflection, the receiving interface, and the front-end receiver. To demonstrate the effectiveness of our simulator, the models are experimentally validated by comparing the simulation results with the measured data from a commercial ultrasound system. This simulator could be used to quickly provide system-level feedback for an optimized tuning of electronic design parameters.

  5. Automated Simulation Model Generation

    NARCIS (Netherlands)

    Huang, Y.

    2013-01-01

    One of today's challenges in the field of modeling and simulation is to model increasingly larger and more complex systems. Complex models take long to develop and incur high costs. With the advances in data collection technologies and more popular use of computer-aided systems, more data has become

  6. A Cellular Automata-based Model for Simulating Restitution Property in a Single Heart Cell.

    Science.gov (United States)

    Sabzpoushan, Seyed Hojjat; Pourhasanzade, Fateme

    2011-01-01

    Ventricular fibrillation is the cause of most sudden cardiac deaths. Restitution is one of the specific properties of the ventricular cell. Recent findings have clearly demonstrated the correlation between the slope of the restitution curve and ventricular fibrillation. Modeling of cellular restitution therefore gains high importance. A cellular automaton is a powerful tool for simulating complex phenomena in a simple language. A cellular automaton is a lattice of cells where the behavior of each cell is determined by the behavior of its neighboring cells as well as the automaton rule. In this paper, a simple model is presented for the simulation of the property of restitution in a single cardiac cell using cellular automata. First, two state variables, action potential and recovery, are introduced into the automaton model. Second, the automaton rule is determined and the recovery variable is defined in such a way that restitution develops. In order to evaluate the proposed model, the restitution curve generated in our study is compared with the restitution curves from the experimental findings of valid sources. Our findings indicate that the presented model is not only capable of simulating restitution in a cardiac cell, but also possesses the capability of regulating the restitution curve.
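
    The restitution property being modeled is usually summarized by an APD-restitution curve: action potential duration as a function of the preceding diastolic interval, with a slope above 1 at short intervals being the instability criterion linked to fibrillation. The sketch below uses a common exponential form with illustrative parameters; it is not the authors' cellular automaton.

```python
import math

def restitution_apd(di, apd_max=300.0, amp=150.0, tau=60.0):
    """Exponential APD-restitution curve APD(DI) = APD_max - A*exp(-DI/tau),
    a common empirical form; parameters (in ms) are illustrative."""
    return apd_max - amp * math.exp(-di / tau)

def restitution_slope(di, **kw):
    """Numerical slope dAPD/dDI; a slope > 1 at short diastolic intervals is
    the instability criterion mentioned in connection with fibrillation."""
    d = 1e-3
    return (restitution_apd(di + d, **kw) - restitution_apd(di, **kw)) / d

for di in (20, 60, 150):
    print(f"DI={di} ms  APD={restitution_apd(di):.0f} ms  slope={restitution_slope(di):.2f}")
```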

  7. Simulation of recrystallization textures in FCC materials based on a self consistent model

    International Nuclear Information System (INIS)

    Bolmaro, R.E; Roatta, A; Fourty, A.L; Signorelli, J.W; Bertinetti, M.A

    2004-01-01

    The development of recrystallization textures in FCC polycrystalline materials has been a long-standing scientific problem. The appearance of the so-called cube component in high stacking fault energy laminated FCC materials is not an entirely understood phenomenon. This work approaches the problem using a self-consistent homogenization simulation technique. Information on the first preferential neighbors is used in the model to consider grain boundary energies and intragranular misorientations and to treat the growth of grains and the mobility of the grain boundaries. The energies accumulated during deformation are taken as the driving energies for nucleation, and the subsequent growth is statistically governed by the grain boundary energies. The model shows the correct trend for recrystallization textures obtained from previously simulated deformation textures for high and low stacking fault energy FCC materials. The model's topological representation is discussed (CW)

  8. UML Profile for Mining Process: Supporting Modeling and Simulation Based on Metamodels of Activity Diagram

    Directory of Open Access Journals (Sweden)

    Andrea Giubergia

    2014-01-01

    Full Text Available A UML profile describes a lightweight extension mechanism to the UML by defining custom stereotypes, tagged values, and constraints. Profiles are used to adapt the UML metamodel to different platforms and domains. In this paper we present a UML profile for models supporting event-driven simulation. In particular, we use the Arena simulation tool and we focus on the mining process domain. Profiles provide an easy way to obtain well-defined specifications, regulated by the Object Management Group (OMG). They can be used as a presimulation technique to obtain solid models for the mining industry. In this work we present a new profile to extend the UML metamodel; in particular we focus on the activity diagram. This extended model is applied to an industry problem involving loading and transportation of minerals in the mining process domain.

  9. Simulation of optimal arctic routes using a numerical sea ice model based on an ice-coupled ocean circulation method

    Directory of Open Access Journals (Sweden)

    Jong-Ho Nam

    2013-06-01

    Full Text Available Ever since the Arctic region opened its mysterious passage to mankind, continuous attempts have been made to take advantage of the fastest route across the region. The Arctic region is still covered by thick ice, and thus finding a feasible navigation route is essential for an economical voyage. To find the optimal route, it is necessary to establish an efficient transit model that enables us to simulate every possible route in advance. In this work, an enhanced algorithm to determine the optimal route in the Arctic region is introduced. A transit model based on the simulated sea ice and environmental data numerically modeled in the Arctic is developed. By integrating the simulated data into a transit model, further applications such as route simulation, cost estimation or hindcast can be easily performed. An interactive simulation system that determines the optimal Arctic route using the transit model is developed. The simulation of optimal routes is carried out and the validity of the results is discussed.

  10. Numerical simulation on ferrofluid flow in fractured porous media based on discrete-fracture model

    Science.gov (United States)

    Huang, Tao; Yao, Jun; Huang, Zhaoqin; Yin, Xiaolong; Xie, Haojun; Zhang, Jianguang

    2017-06-01

    Water flooding is an efficient approach to maintain reservoir pressure and has been widely used to enhance oil recovery. However, preferential water pathways such as fractures can significantly decrease the sweep efficiency. Therefore, the utilization ratio of injected water is seriously affected. How to develop new flooding technology to further improve the oil recovery in this situation is a pressing problem. In the past few years, controllable ferrofluid has attracted extensive attention in the oil industry as a new functional material. In the presence of a gradient in the magnetic field strength, a magnetic body force is produced on the ferrofluid, so that the attractive magnetic forces allow the ferrofluid to be manipulated to flow in any desired direction through control of the external magnetic field. In view of these properties, the potential application of the ferrofluid as a new kind of displacing fluid for flooding in fractured porous media is studied in this paper for the first time. Considering the physical process of the mobilization of ferrofluid through porous media by the arrangement of strong external magnetic fields, the magnetic body force was introduced into the Darcy equation, and fractures were treated based on the discrete-fracture model. The fully implicit finite volume method is used to solve the mathematical model, and the validity and accuracy of the numerical simulation are demonstrated through an experiment with ferrofluid flowing in a single fractured oil-saturated sand in a 2-D horizontal cell. Finally, water flooding and ferrofluid flooding in a complex fractured porous medium have been studied. The results showed that the ferrofluid can be manipulated to flow in the desired direction through control of the external magnetic field, so that using ferrofluid for flooding can enlarge the sweep of the whole displacement. As a consequence, the oil recovery has been greatly improved in comparison to water flooding. Thus, the ferrofluid

  11. A method to generate equivalent energy spectra and filtration models based on measurement for multidetector CT Monte Carlo dosimetry simulations

    International Nuclear Information System (INIS)

    Turner, Adam C.; Zhang Di; Kim, Hyun J.; DeMarco, John J.; Cagnon, Chris H.; Angel, Erin; Cody, Dianna D.; Stevens, Donna M.; Primak, Andrew N.; McCollough, Cynthia H.; McNitt-Gray, Michael F.

    2009-01-01

    The purpose of this study was to present a method for generating x-ray source models for performing Monte Carlo (MC) radiation dosimetry simulations of multidetector row CT (MDCT) scanners. These so-called "equivalent" source models consist of an energy spectrum and filtration description that are generated based wholly on the measured values and can be used in place of proprietary manufacturer's data for scanner-specific MDCT MC simulations. Required measurements include the half value layers (HVL1 and HVL2) and the bowtie profile (exposure values across the fan beam) for the MDCT scanner of interest. Using these measured values, a method was described (a) to numerically construct a spectrum with the calculated HVLs approximately equal to those measured (equivalent spectrum) and then (b) to determine a filtration scheme (equivalent filter) that attenuates the equivalent spectrum in a similar fashion as the actual filtration attenuates the actual x-ray beam, as measured by the bowtie profile measurements. Using this method, two types of equivalent source models were generated: one using a spectrum based on both HVL1 and HVL2 measurements and its corresponding filtration scheme and the second consisting of a spectrum based only on the measured HVL1 and its corresponding filtration scheme. Finally, a third type of source model was built based on the spectrum and filtration data provided by the scanner's manufacturer. MC simulations using each of these three source model types were evaluated by comparing the accuracy of multiple CT dose index (CTDI) simulations to measured CTDI values for 64-slice scanners from the four major MDCT manufacturers. Comprehensive evaluations were carried out for each scanner using each kVp and bowtie filter combination available. CTDI experiments were performed for both head (16 cm in diameter) and body (32 cm in diameter) CTDI phantoms using both central and peripheral measurement positions. Both equivalent source model types
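
    The half-value layer at the core of this method is the attenuator thickness that halves the air kerma of the polyenergetic beam. The sketch below computes HVL1 for a tabulated spectrum by bisection; the two-bin "spectrum" and coefficients in the example are made up purely to exercise the function, and this is not the authors' spectrum-construction algorithm.

```python
import numpy as np

def hvl_mm_al(energies_kev, fluence, mu_al_per_mm, mu_en_air):
    """First half-value layer (mm Al): the aluminum thickness that halves the
    air kerma of a tabulated spectrum. All inputs are arrays per energy bin."""
    def kerma(t_mm):
        return np.sum(fluence * energies_kev * mu_en_air *
                      np.exp(-mu_al_per_mm * t_mm))
    k0, lo, hi = kerma(0.0), 0.0, 50.0
    for _ in range(60):                        # bisection on transmitted kerma
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if kerma(mid) > 0.5 * k0 else (lo, mid)
    return 0.5 * (lo + hi)

# Toy two-bin "spectrum" with made-up coefficients, only to exercise the code.
hvl = hvl_mm_al(np.array([60.0, 100.0]), np.array([1.0, 0.5]),
                np.array([0.08, 0.05]), np.array([0.003, 0.002]))
print(f"HVL = {hvl:.1f} mm Al")
```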

  12. Modeling and simulation of five-axis virtual machine based on NX

    Science.gov (United States)

    Li, Xiaoda; Zhan, Xianghui

    2018-04-01

    Virtual technology has played a growing role in the machinery manufacturing industry. In this paper, the Siemens NX software is used to model a virtual CNC machine tool, and the parameters of the virtual machine are defined according to the actual parameters of the machine tool so that the virtual simulation can be carried out without loss of simulation accuracy. How to use the machine builder of the CAM module to define the kinematic chain and machine components is described. The simulation of the virtual machine can provide users with alarm information about tool collision and overcutting during machining, and can evaluate and forecast the rationality of the technological process.

  13. [Non-linear System Dynamics Simulation Modeling of Adolescent Obesity: Using Korea Youth Risk Behavior Web-based Survey].

    Science.gov (United States)

    Lee, Hanna; Park, Eun Suk; Yu, Jae Kook; Yun, Eun Kyoung

    2015-10-01

    The purpose of this study was to develop a system dynamics model for adolescent obesity in Korea that could be used for obesity policy analysis. On the basis of the causal loop diagram, a model was developed by converting it to a stock and flow diagram. The Vensim DSS 5.0 program was used in the model development. We used the method of moments to calibrate this model with data from The Korea Youth Risk Behavior Web-based Survey 2005 to 2013. We then ran scenario simulations. This model can be used to understand the current adolescent obesity rate, predict the future obesity rate, and serve as a tool for controlling the risk factors. The results of the model simulation match well with the data. It was confirmed that a proper model, able to predict obesity probability, was established. These results of stock and flow diagram modeling of adolescent obesity can help policy planners and other stakeholders in developing obesity policy and in better anticipating the multiple effects of interventions in both the short and the long term. In the future we suggest the development of an expanded model based on this adolescent obesity model.
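
    Numerically, a stock-and-flow model of this kind reduces to integrating inflows and outflows of the obese-population stock over time. The toy Euler integration below illustrates only that mechanic; the incidence and remission rates are hypothetical, not the survey-calibrated Korean estimates.

```python
def obesity_stock_flow(years=10.0, dt=0.25, population=1_000_000,
                       obese0=100_000, incidence=0.015, recovery=0.05):
    """Toy stock-and-flow model integrated with the Euler method: the stock is
    the number of obese adolescents, the inflow is incidence among the
    non-obese, and the outflow is remission."""
    obese, t = float(obese0), 0.0
    while t < years:
        inflow = incidence * (population - obese)   # new cases per year
        outflow = recovery * obese                  # remission per year
        obese += dt * (inflow - outflow)
        t += dt
    return obese / population

print(f"simulated obesity rate after 10 years: {obesity_stock_flow():.1%}")
```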

  14. SUN-RAH: a nucleoelectric BWR university simulator based in reduced order models

    International Nuclear Information System (INIS)

    Morales S, J.B.; Lopez R, A.; Sanchez B, A.; Sanchez S, R.; Hernandez S, A.

    2003-01-01

    The development of a simulator that represents the dynamics of a nuclear power plant with a BWR-type reactor, using reduced order models, is presented. These models present the characteristics defined by the dominant poles of the system (1), and most early operational transients in a power station can be reproduced with considerable fidelity if the models are identified with plant data or with references from a best-estimate code such as RAMONA, TRAC (2) or RELAP. The simulator models are in-house developments or simplifications derived from physical laws, retaining the main terms. This work describes the objective of the project and the general specifications of the university nuclear power plant simulator with a boiling water reactor (SUN-RAH), as well as the finished parts, which fundamentally are the nuclear reactor, the nuclear steam supply system (NSSS), the balance of plant (BOP), the main plant controllers and the implemented graphic interfaces. The pending goals as well as the future developments and applications of SUN-RAH are described. (Author)

  15. Evolution dynamics modeling and simulation of logistics enterprise's core competence based on service innovation

    Science.gov (United States)

    Yang, Bo; Tong, Yuting

    2017-04-01

    With the rapid development of the economy, logistics enterprises in China are facing a huge challenge: they generally lack core competitiveness, and their awareness of service innovation is not strong. Scholars studying the core competitiveness of logistics enterprises have mainly taken the perspective of static stability rather than exploring it from the perspective of dynamic evolution. Therefore, the author analyzes the influencing factors and the evolution process of the core competence of logistics enterprises, uses the method of system dynamics to study the cause and effect of this evolution, and constructs a system dynamics model of the evolution of logistics enterprises' core competence, which can be simulated with Vensim PLE. The analysis of the effectiveness and sensitivity of the simulation model indicates that the model can fit the evolution process of the core competence of logistics enterprises, reveal the process and mechanism of this evolution, and provide management strategies for improving the core competence of logistics enterprises. The construction and operation of the computer simulation model offers an effective method for studying the evolution of logistics enterprises' core competence.

  16. Modelica-based Modeling and Simulation to Support Research and Development in Building Energy and Control Systems

    Energy Technology Data Exchange (ETDEWEB)

    Wetter, Michael

    2009-02-12

    Traditional building simulation programs possess attributes that make them difficult to use for the design and analysis of building energy and control systems and for the support of model-based research and development of systems that may not already be implemented in these programs. This article presents characteristic features of such applications, and it shows how equation-based object-oriented modelling can meet requirements that arise in such applications. Next, the implementation of an open-source component model library for building energy systems is presented. The library has been developed using the equation-based object-oriented Modelica modelling language. Technical challenges of modelling and simulating such systems are discussed. Research needs are presented to make this technology accessible to user groups that have more stringent requirements with respect to the numerical robustness of simulation than a research community may have. Two examples are presented in which models from the library described here were used. The first example describes the design of a controller for a nonlinear model of a heating coil using model reduction and frequency domain analysis. The second example describes the tuning of control parameters for a static pressure reset controller of a variable air volume flow system. The tuning has been done by solving a non-convex optimization problem that minimizes fan energy subject to state constraints.

  17. On the required complexity of vehicle dynamic models for use in simulation-based highway design.

    Science.gov (United States)

    Brown, Alexander; Brennan, Sean

    2014-06-01

    This paper presents the results of a comprehensive project whose goal is to identify roadway design practices that maximize the margin of safety between the friction supply and friction demand. This study is motivated by the concern for increased accident rates on curves with steep downgrades, geometries that contain features that interact in all three dimensions - planar curves, grade, and superelevation. This complexity makes the prediction of vehicle skidding quite difficult, particularly for simple simulation models that have historically been used for road geometry design guidance. To obtain estimates of friction margin, this study considers a range of vehicle models, including: a point-mass model used by the American Association of State Highway Transportation Officials (AASHTO) design policy, a steady-state "bicycle model" formulation that considers only per-axle forces, a transient formulation of the bicycle model commonly used in vehicle stability control systems, and finally, a full multi-body simulation (CarSim and TruckSim) regularly used in the automotive industry for high-fidelity vehicle behavior prediction. The presence of skidding--the friction demand exceeding supply--was calculated for each model considering a wide range of vehicles and road situations. The results indicate that the most complicated vehicle models are generally unnecessary for predicting skidding events. However, there are specific maneuvers, namely braking events within lane changes and curves, which consistently predict the worst-case friction margins across all models. This suggests that any vehicle model used for roadway safety analysis should include the effects of combined cornering and braking. The point-mass model typically used by highway design professionals may not be appropriate to predict vehicle behavior on high-speed curves during braking in low-friction situations. However, engineers can use the results of this study to help select the appropriate vehicle dynamic

  18. The Effect of Model Fidelity on Learning Outcomes of a Simulation-Based Education Program for Central Venous Catheter Insertion.

    Science.gov (United States)

    Diederich, Emily; Mahnken, Jonathan D; Rigler, Sally K; Williamson, Timothy L; Tarver, Stephen; Sharpe, Matthew R

    2015-12-01

    Simulation-based education for central venous catheter (CVC) insertion has been repeatedly documented to improve performance, but the impact of simulation model fidelity has not been described. The aim of this study was to examine the impact of the physical fidelity of the simulation model on learning outcomes for a simulation-based education program for CVC insertion. Forty consecutive residents rotating through the medical intensive care unit of an academic medical center completed a simulation-based education program for CVC insertion. The curriculum was designed in accordance with the principles of deliberate practice and mastery learning. Each resident underwent baseline skills testing and was then randomized to training on a commercially available CVC model with high physical fidelity (High-Fi group) or a simply constructed model with low physical fidelity (Low-Fi group) in a noninferiority trial. Upon completion of their medical intensive care unit rotation 4 weeks later, residents returned for repeat skills testing on the high-fidelity model using a 26-item checklist. The mean (SD) posttraining score on the 26-item checklist for the Low-Fi group was 23.8 (2.2) (91.5%) and was not inferior to the mean (SD) score for the High-Fi group of 22.5 (2.6) (86.5%). Simulation-based education using equipment with low physical fidelity can achieve learning outcomes comparable with those with high-fidelity equipment, as long as other aspects of fidelity are maintained and robust educational principles are applied during the design of the curriculum.

  19. Simulation of counter-current imbibition in water-wet fractured reservoirs based on discrete-fracture model

    Directory of Open Access Journals (Sweden)

    Wang Yueying

    2017-08-01

    Full Text Available Isolated fractures usually exist in fractured media systems, where the capillary pressure in the fracture is lower than that of the matrix, causing a discrepancy in oil recoveries between fractured and non-fractured porous media. Experiments, analytical solutions and conventional simulation methods based on the continuum model approach are inadequate or insufficient for describing media containing isolated fractures. In this paper, the simulation of the counter-current imbibition in fractured media is based on the discrete-fracture model (DFM). The interlocking or arrangement of the matrix and fracture system within the model resembles the traditional discrete fracture network model, and the hybrid mixed finite element method is employed to solve the associated equations. The simulation solution is validated for consistency against the Behbahani experimental data. The simulation results of the fractured media show that the isolated fractures affect the imbibition in the matrix block. Moreover, the isolated fracture parameters such as fracture length and fracture location influence the trend of the recovery curves. Thus, the counter-current imbibition behavior of media with isolated fractures can be predicted using this method based on the discrete-fracture model.

  20. Simulated precipitation diurnal cycles over East Asia using different CAPE-based convective closure schemes in WRF model

    Science.gov (United States)

    Yang, Ben; Zhou, Yang; Zhang, Yaocun; Huang, Anning; Qian, Yun; Zhang, Lujun

    2018-03-01

    Closure assumption in convection parameterization is critical for reasonably modeling the precipitation diurnal variation in climate models. This study evaluates the precipitation diurnal cycles over East Asia during the summer of 2008 simulated with three convective available potential energy (CAPE) based closure assumptions, i.e. CAPE-relaxing (CR), quasi-equilibrium (QE), and free-troposphere QE (FTQE) and investigates the impacts of planetary boundary layer (PBL) mixing, advection, and radiation on the simulation by using the weather research and forecasting model. The sensitivity of precipitation diurnal cycle to PBL vertical resolution is also examined. Results show that the precipitation diurnal cycles simulated with different closures all exhibit large biases over land and the simulation with FTQE closure agrees best with observation. In the simulation with QE closure, the intensified PBL mixing after sunrise is responsible for the late-morning peak of convective precipitation, while in the simulation with FTQE closure, convective precipitation is mainly controlled by advection cooling. The relative contributions of different processes to precipitation formation are functions of rainfall intensity. In the simulation with CR closure, the dynamical equilibrium in the free troposphere still can be reached, implying the complex cause-effect relationship between atmospheric motion and convection. For simulations in which total CAPE is consumed for the closures, daytime precipitation decreases with increased PBL resolution because thinner model layer produces lower convection starting layer, leading to stronger downdraft cooling and CAPE consumption. The sensitivity of the diurnal peak time of precipitation to closure assumption can also be modulated by changes in PBL vertical resolution. The results of this study help us better understand the impacts of various processes on the precipitation diurnal cycle simulation.

  1. AEGIS geologic simulation model

    International Nuclear Information System (INIS)

    Foley, M.G.

    1982-01-01

    The Geologic Simulation Model (GSM) is used by the AEGIS (Assessment of Effectiveness of Geologic Isolation Systems) program at the Pacific Northwest Laboratory to simulate the dynamic geology and hydrology of a geologic nuclear waste repository site over a million-year period following repository closure. The GSM helps to organize geologic/hydrologic data; to focus attention on active natural processes by requiring their simulation; and, through interactive simulation and calibration, to reduce subjective evaluations of the geologic system. During each computer run, the GSM produces a million-year geologic history that is possible for the region and the repository site. In addition, the GSM records in permanent history files everything that occurred during that time span. Statistical analyses of data in the history files of several hundred simulations are used to classify typical evolutionary paths, to establish the probabilities associated with deviations from the typical paths, and to determine which types of perturbations of the geologic/hydrologic system, if any, are most likely to occur. These simulations will be evaluated by geologists familiar with the repository region to determine the validity of the results. Perturbed systems that are determined to be the most realistic, within whatever probability limits are established, will be used for the analyses that involve radionuclide transport and dose models. The GSM is designed to be continuously refined and updated. Simulation models are site specific, and, although the submodels may have limited general applicability, the input data requirements necessitate detailed characterization of each site before application.

  2. Estimation of Financial Agent-Based Models with Simulated Maximum Likelihood

    Czech Academy of Sciences Publication Activity Database

    Kukačka, Jiří; Baruník, Jozef

    2017-01-01

    Roč. 85, č. 1 (2017), s. 21-45 ISSN 0165-1889 R&D Projects: GA ČR(CZ) GBP402/12/G097 Institutional support: RVO:67985556 Keywords: heterogeneous agent model * simulated maximum likelihood * switching Subject RIV: AH - Economics OBOR OECD: Finance Impact factor: 1.000, year: 2016 http://library.utia.cas.cz/separaty/2017/E/kukacka-0478481.pdf

  3. Developing an Agent-Based Model to Simulate Urban Land-Use Expansion (Case Study: Qazvin)

    OpenAIRE

    F. Nourian; A. A. Alesheikh; F. Hosseinali

    2012-01-01

    Extended abstract. 1. Introduction. Urban land-use expansion is a challenging issue in developing countries. Population growth and immigration from villages to cities are the two major factors behind this phenomenon, and they have reduced the influence of efforts to limit city boundaries. Thus, spatial planners always look for models that simulate the expansion of urban land uses and enable them to prevent unbalanced expansion of cities and guide the...

  4. Development of thermodynamically-based models for simulation of hydrogeochemical processes coupled to channel flow processes in abandoned underground mines

    Energy Technology Data Exchange (ETDEWEB)

    Kruse, N.A., E-mail: natalie.kruse@ncl.ac.uk [Sir Joseph Swan Institute for Energy Research, Newcastle University, Newcastle upon Tyne NE1 7RU (United Kingdom); Younger, P.L. [Sir Joseph Swan Institute for Energy Research, Newcastle University, Newcastle upon Tyne NE1 7RU (United Kingdom)

    2009-07-15

    Accurate modeling of changing geochemistry in mine water can be an important tool in post-mining site management. The Pollutant Sources and Sinks in Underground Mines (POSSUM) model and Pollutant Loadings Above Average Pyrite Influenced Geochemistry POSSUM (PLAYING POSSUM) model were developed using object-oriented programming techniques to simulate changing geochemistry in abandoned underground mines over time. The conceptual model was created to avoid significant simplifying assumptions that decrease the accuracy and defensibility of model solutions. POSSUM and PLAYING POSSUM solve for changes in flow rate and depth of flow using a finite difference hydrodynamics model then, subsequently, solve for geochemical changes at distinct points along the flow path. Geochemical changes are modeled based on a suite of 28 kinetically controlled mineral weathering reactions. Additional geochemical transformations due to reversible sorption, dissolution and precipitation of acid generating salts and mineral precipitation are also simulated using simplified expressions. Contaminant transport is simulated using a novel application of the Random-Walk method. By simulating hydrogeochemical changes with a physically and thermodynamically controlled model, the 'state of the art' in post-mining management can be advanced.

  5. The human body metabolism process mathematical simulation based on Lotka-Volterra model

    Science.gov (United States)

    Oliynyk, Andriy; Oliynyk, Eugene; Pyptiuk, Olexandr; DzierŻak, RóŻa; Szatkowska, Małgorzata; Uvaysova, Svetlana; Kozbekova, Ainur

    2017-08-01

    A mathematical model of the metabolism process in the human organism, based on the Lotka-Volterra model, has been proposed, considering the healing regime, the nutrition system, and the features of the insulin and sugar fragmentation process in the organism. A numerical algorithm for the model using the fourth-order Runge-Kutta method has been implemented. Based on the calculation results, conclusions have been drawn, recommendations on using the modeling results are given, and directions for future research are defined.
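
    For readers who want to reproduce the numerical scheme named above, the sketch below integrates the classical two-species Lotka-Volterra equations with a fourth-order Runge-Kutta step. The parameter values and initial conditions are illustrative only and are not the coefficients (insulin, sugar, nutrition terms) used by the authors.

    ```python
    # Classical Lotka-Volterra system integrated with a fourth-order Runge-Kutta
    # scheme, mirroring the numerical approach described in the abstract.
    # Parameter values and initial conditions are illustrative, not the authors'.
    import numpy as np

    def lotka_volterra(state, alpha=1.0, beta=0.1, delta=0.075, gamma=1.5):
        x, y = state                               # two interacting quantities
        return np.array([alpha * x - beta * x * y,
                         delta * x * y - gamma * y])

    def rk4_step(f, state, dt):
        k1 = f(state)
        k2 = f(state + 0.5 * dt * k1)
        k3 = f(state + 0.5 * dt * k2)
        k4 = f(state + dt * k3)
        return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

    state, dt = np.array([10.0, 5.0]), 0.01
    for _ in range(2000):                          # 20 time units
        state = rk4_step(lotka_volterra, state, dt)
    print(f"final state: x = {state[0]:.2f}, y = {state[1]:.2f}")
    ```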

  6. Validation of simulation models

    DEFF Research Database (Denmark)

    Rehman, Muniza; Pedersen, Stig Andur

    2012-01-01

    In the philosophy of science, interest in computational models and simulations has increased considerably during the past decades. Different positions regarding the validity of models have emerged, but these views have not succeeded in capturing the diversity of validation methods. The wide variety...

  7. Adjoint-Based Climate Model Tuning: Application to the Planet Simulator

    Science.gov (United States)

    Lyu, Guokun; Köhl, Armin; Matei, Ion; Stammer, Detlef

    2018-01-01

    The adjoint method is used to calibrate the medium complexity climate model "Planet Simulator" through parameter estimation. Identical twin experiments demonstrate that this method can retrieve default values of the control parameters when using a long assimilation window of the order of 2 months. Chaos synchronization through nudging, required to overcome limits in the temporal assimilation window in the adjoint method, is employed successfully to reach this assimilation window length. When assimilating ERA-Interim reanalysis data, the observations of air temperature and the radiative fluxes are the most important data for adjusting the control parameters. The global mean net longwave fluxes at the surface and at the top of the atmosphere are significantly improved by tuning two model parameters controlling the absorption of clouds and water vapor. The global mean net shortwave radiation at the surface is improved by optimizing three model parameters controlling cloud optical properties. The optimized parameters improve the free model (without nudging terms) simulation in a way similar to that in the assimilation experiments. Results suggest a promising way for tuning uncertain parameters in nonlinear coupled climate models.

  8. A Bubble-Based Drag Model at the Local-Grid Level for Eulerian Simulation of Bubbling Fluidized Beds

    Directory of Open Access Journals (Sweden)

    Kun Hong

    2016-01-01

    Full Text Available A bubble-based drag model at the local-grid level is proposed to simulate gas-solid flows in bubbling fluidized beds of Geldart A particles. In this model, five balance equations are derived from the mass and the momentum conservation. This set of equations along with necessary correlations for bubble diameter and voidage of emulsion phase is solved to obtain seven local structural parameters (uge, upe, εe, δb, ub, db, and ab) which describe heterogeneous flows of bubbling fluidized beds. The modified drag coefficient obtained from the above-mentioned structural parameters is then incorporated into the two-fluid model to simulate the hydrodynamics of Geldart A particles in a lab-scale bubbling fluidized bed. The comparison between experimental and simulation results for the axial and radial solids concentration profiles is promising.

  9. Simulating Land-Use Change using an Agent-Based Land Transaction Model

    Science.gov (United States)

    Bakker, M. M.; van Dijk, J.; Alam, S. J.

    2013-12-01

    In the densely populated cultural landscapes of Europe, the vast majority of all land is owned by private parties, be it farmers (the majority), nature organizations, property developers, or citizens. Therewith, the vast majority of all land-use change arises from land transactions between different owner types: successful farms expand at the expense of less successful farms, and meanwhile property developers, individual citizens, and nature organizations also actively purchase land. These land transactions are driven by specific properties of the land, by governmental policies, and by the (economic) motives of both buyers and sellers. Climate/global change can affect these drivers at various scales: at the local scale changes in hydrology can make certain land less or more desirable; at the global scale the agricultural markets will affect motives of farmers to buy or sell land; while at intermediate (e.g. provincial) scales property developers and nature conservationists may be encouraged or discouraged to purchase land. The cumulative result of all these transactions becomes manifest in changing land-use patterns, and consequent environmental responses. Within the project Climate Adaptation for Rural Areas an agent-based land-use model was developed that explores the future response of individual land users to climate change, within the context of wider global change (i.e. policy and market change). It simulates the exchange of land among farmers and between farmers and nature organizations and property developers, for a specific case study area in the east of the Netherlands. Results show that local impacts of climate change can result in a relative stagnation in the land market in waterlogged areas. Furthermore, the increase in dairying at the expense of arable cultivation - as has been observed in the area in the past - is slowing down as arable produce shows a favourable trend in the agricultural world market. Furthermore, budgets for nature managers are

  10. Model reduction for circuit simulation

    CERN Document Server

    Hinze, Michael; Maten, E Jan W Ter

    2011-01-01

    Simulation based on mathematical models plays a major role in computer aided design of integrated circuits (ICs). Decreasing structure sizes, increasing packing densities and driving frequencies require the use of refined mathematical models that take secondary, parasitic effects into account. This leads to very high-dimensional problems which nowadays require simulation times too large for the short time-to-market demands in industry. Modern Model Order Reduction (MOR) techniques present a way out of this dilemma by providing surrogate models which keep the main characteristics of the device.

  11. The simulation of surface fire spread based on Rothermel model in windthrow area of Changbai Mountain (Jilin, China)

    Science.gov (United States)

    Yin, Hang; Jin, Hui; Zhao, Ying; Fan, Yuguang; Qin, Liwu; Chen, Qinghong; Huang, Liya; Jia, Xiang; Liu, Lijie; Dai, Yuhong; Xiao, Ying

    2018-03-01

    Forest fires not only bring great losses to natural resources, but also damage the ecosystem and reduce soil fertility, causing natural disasters such as soil erosion and debris flow. However, because the spreading trend of a forest fire is hard to predict during fire fighting, it is difficult to formulate a rational and effective fire-fighting scheme. In the event of a forest fire, an accurate judgment of the fire behavior would greatly improve fire-fighting efficiency and reduce the heavy losses caused by fire. Research on forest fire spread simulation can effectively reduce disaster losses. The present study focused on the simulation of the "29 May 2012" wildfire in the windthrow area of Changbai Mountain. Basic data were retrieved from the "29 May 2012" wildfire and a field survey. A self-developed forest fire behavior simulation program based on the Rothermel model was used in the simulation. The Kappa coefficient and the Sørensen index were employed to evaluate the simulation accuracy. The results showed that the perimeter of the simulated burned area was 4.66 km, the area was 56.47 hm2, the overlapped burned area was 33.68 hm2, and the estimated rate of fire spread was 0.259 m/s. Between the simulated fire and the actual fire, the Kappa coefficient was 0.7398 and the Sørensen coefficient was 0.7419. This proved that applying the Rothermel model to fire behavior simulation in windthrow meadow is feasible, and that the spread behavior in the windthrow area of Changbai Mountain can be forecast. Thus, our self-developed program based on the Rothermel model can provide an effective forecast of fire spread, which will facilitate fire suppression work.
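
    The two accuracy measures used above are easy to compute from a pair of burned/unburned maps. The sketch below derives Cohen's Kappa and the Sørensen index from synthetic simulated and observed rasters; the random maps stand in for the real burn perimeters and are not related to the "29 May 2012" fire data.

    ```python
    # Agreement between a simulated and an observed burned-area raster, expressed
    # with Cohen's kappa and the Sorensen index, as in the accuracy assessment.
    # The two boolean arrays below are synthetic placeholders for real burn maps.
    import numpy as np

    def kappa_and_sorensen(simulated, observed):
        sim, obs = simulated.astype(bool).ravel(), observed.astype(bool).ravel()
        a = np.sum(sim & obs)          # burned in both
        b = np.sum(sim & ~obs)         # burned only in the simulation
        c = np.sum(~sim & obs)         # burned only in the observation
        d = np.sum(~sim & ~obs)        # unburned in both
        n = a + b + c + d
        po = (a + d) / n                                     # observed agreement
        pe = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2  # chance agreement
        kappa = (po - pe) / (1 - pe)
        sorensen = 2 * a / (2 * a + b + c)
        return kappa, sorensen

    rng = np.random.default_rng(0)
    obs = rng.random((100, 100)) < 0.3
    sim = obs ^ (rng.random((100, 100)) < 0.05)   # observed map with 5% disagreement
    print("kappa = %.3f, Sorensen = %.3f" % kappa_and_sorensen(sim, obs))
    ```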

  12. Nuclear fuel cycle system simulation tool based on high-fidelity component modeling

    Energy Technology Data Exchange (ETDEWEB)

    Ames, David E.,

    2014-02-01

    The DOE is currently directing extensive research into developing fuel cycle technologies that will enable the safe, secure, economic, and sustainable expansion of nuclear energy. The task is formidable considering the numerous fuel cycle options, the large dynamic systems that each represent, and the necessity to accurately predict their behavior. The path to successfully develop and implement an advanced fuel cycle is highly dependent on the modeling capabilities and simulation tools available for performing useful relevant analysis to assist stakeholders in decision making. Therefore a high-fidelity fuel cycle simulation tool that performs system analysis, including uncertainty quantification and optimization was developed. The resulting simulator also includes the capability to calculate environmental impact measures for individual components and the system. An integrated system method and analysis approach that provides consistent and comprehensive evaluations of advanced fuel cycles was developed. A general approach was utilized allowing for the system to be modified in order to provide analysis for other systems with similar attributes. By utilizing this approach, the framework for simulating many different fuel cycle options is provided. Two example fuel cycle configurations were developed to take advantage of used fuel recycling and transmutation capabilities in waste management scenarios leading to minimized waste inventories.

  13. [Parameter sensitivity of simulating net primary productivity of Larix olgensis forest based on BIOME-BGC model].

    Science.gov (United States)

    He, Li-hong; Wang, Hai-yan; Lei, Xiang-dong

    2016-02-01

    Models based on vegetation ecophysiological processes contain many parameters, and reasonable parameter values will greatly improve simulation ability. Sensitivity analysis, as an important method to screen out the sensitive parameters, can comprehensively analyze how model parameters affect the simulation results. In this paper, we conducted a parameter sensitivity analysis of the BIOME-BGC model with a case study of simulating the net primary productivity (NPP) of a Larix olgensis forest in Wangqing, Jilin Province. First, with a comparative analysis between field measurement data and the simulation results, we tested the BIOME-BGC model's capability of simulating the NPP of L. olgensis forest. Then, the Morris and EFAST sensitivity methods were used to screen the sensitive parameters that had a strong influence on NPP. On this basis, we also quantitatively estimated the sensitivity of the screened parameters, and calculated the global, first-order and second-order sensitivity indices. The results showed that the BIOME-BGC model could well simulate the NPP of L. olgensis forest in the sample plot. The Morris sensitivity method provided a reliable parameter sensitivity analysis result under the condition of a relatively small sample size. The EFAST sensitivity method could quantitatively measure the impact of a single parameter on the simulation result as well as the interaction between parameters in the BIOME-BGC model. The influential sensitive parameters for L. olgensis forest NPP were the new stem carbon to new leaf carbon allocation and the leaf carbon to nitrogen ratio; the effect of their interaction was significantly greater than the interaction effects of the other parameters.
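
    As a hedged illustration of the screening idea behind the Morris method mentioned above, the sketch below computes one-at-a-time elementary effects for a stand-in model function. The placeholder function and the parameter ranges are invented; BIOME-BGC itself is not called, and this simplified design is not the full Morris or EFAST procedure used in the paper.

    ```python
    # Simplified one-at-a-time elementary-effects screening in the spirit of the
    # Morris method. The "model" is a stand-in function and the parameter ranges
    # are hypothetical; the real NPP model is not run here.
    import numpy as np

    def model(p):                      # placeholder for an NPP simulation
        leaf_cn, stem_leaf_alloc, extra = p
        return 500.0 / leaf_cn + 200.0 * stem_leaf_alloc + 10.0 * extra

    bounds = np.array([[20.0, 60.0],   # leaf C:N ratio (hypothetical range)
                       [0.5, 2.5],     # new stem C : new leaf C allocation
                       [0.1, 0.9]])    # an additional illustrative parameter
    rng = np.random.default_rng(1)
    n_traj, delta = 50, 0.1            # number of base points, relative step size

    effects = [[] for _ in bounds]
    for _ in range(n_traj):
        span = bounds[:, 1] - bounds[:, 0]
        x = rng.uniform(bounds[:, 0], bounds[:, 0] + (1 - delta) * span)
        y0 = model(x)
        for i in range(len(bounds)):
            x_pert = x.copy()
            x_pert[i] += delta * span[i]                 # perturb one parameter
            effects[i].append((model(x_pert) - y0) / delta)

    for i, ee in enumerate(effects):
        ee = np.asarray(ee)
        print(f"param {i}: mu* = {np.mean(np.abs(ee)):8.2f}, sigma = {np.std(ee):8.2f}")
    ```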

  14. An individual-based evolving predator-prey ecosystem simulation using a fuzzy cognitive map as the behavior model

    OpenAIRE

    Gras , Robin; Devaurs , Didier; Wozniak , Adrianna; Aspinall , Adam

    2009-01-01

    International audience; This paper presents an individual-based predator-prey model with, for the first time, each agent behavior being modeled by a Fuzzy Cognitive Map (FCM), allowing the evolution of the agent behavior through the epochs of the simulation. The FCM enables the agent to evaluate its environment (e.g., distance to predator/prey, distance to potential breeding partner, distance to food, energy level), its internal state (e.g., fear, hunger, curiosity) with memory and choosing s...

  15. Integrating the simulation of domestic water demand behaviour to an urban water model using agent based modelling

    Science.gov (United States)

    Koutiva, Ifigeneia; Makropoulos, Christos

    2015-04-01

    The urban water system's sustainable evolution requires tools that can analyse and simulate the complete cycle including both physical and cultural environments. One of the main challenges, in this regard, is the design and development of tools that are able to simulate the society's water demand behaviour and the way policy measures affect it. The effects of these policy measures are a function of personal opinions that subsequently lead to the formation of people's attitudes. These attitudes will eventually form behaviours. This work presents the design of an ABM tool for addressing the social dimension of the urban water system. The created tool, called Urban Water Agents' Behaviour (UWAB) model, was implemented, using the NetLogo agent programming language. The main aim of the UWAB model is to capture the effects of policies and environmental pressures to water conservation behaviour of urban households. The model consists of agents representing urban households that are linked to each other creating a social network that influences the water conservation behaviour of its members. Household agents are influenced as well by policies and environmental pressures, such as drought. The UWAB model simulates behaviour resulting in the evolution of water conservation within an urban population. The final outcome of the model is the evolution of the distribution of different conservation levels (no, low, high) to the selected urban population. In addition, UWAB is implemented in combination with an existing urban water management simulation tool, the Urban Water Optioneering Tool (UWOT) in order to create a modelling platform aiming to facilitate an adaptive approach of water resources management. For the purposes of this proposed modelling platform, UWOT is used in a twofold manner: (1) to simulate domestic water demand evolution and (2) to simulate the response of the water system to the domestic water demand evolution. The main advantage of the UWAB - UWOT model

  16. Simulation of the Beating Heart Based on Physically Modeling a Deformable Balloon

    International Nuclear Information System (INIS)

    Rohmer, Damien; Sitek, Arkadiusz; Gullberg, Grant T.

    2006-01-01

    The motion of the beating heart is complex and creates artifacts in SPECT and x-ray CT images. Phantoms such as the Jaszczak Dynamic Cardiac Phantom are used to simulate cardiac motion for evaluation of acquisition and data processing protocols used for cardiac imaging. Two concentric elastic membranes filled with water are connected to tubing and pump apparatus for creating fluid flow in and out of the inner volume to simulate motion of the heart. In the present report, the movement of two concentric balloons is solved numerically in order to create a computer simulation of the motion of the moving membranes in the Jaszczak Dynamic Cardiac Phantom. A system of differential equations, based on the physical properties, determines the motion. Two methods are tested for solving the system of differential equations. The results of both methods are similar, providing a final shape that does not converge to a trivial circular profile. Finally, a tomographic imaging simulation is performed by acquiring static projections of the moving shape and reconstructing the result to observe motion artifacts. Two cases are taken into account: in one case each projection angle is sampled for a short time interval, and in the other case for a longer time interval. The longer sampling acquisition shows a clear improvement in decreasing the tomographic streaking artifacts.

  17. Using ant-behavior-based simulation model AntWeb to improve website organization

    Science.gov (United States)

    Li, Weigang; Pinheiro Dib, Marcos V.; Teles, Wesley M.; Morais de Andrade, Vlaudemir; Alves de Melo, Alba C. M.; Cariolano, Judas T.

    2002-03-01

    Some web usage mining algorithms have shown the potential to detect differences between a website's organization and the organization expected by its visitors. However, there is still no efficient method or criterion for a web administrator to measure the performance of a modification. In this paper, we developed AntWeb, a model inspired by ants' behavior that simulates the sequence of visits to a website in order to measure the efficiency of the web structure. We implemented a web usage mining algorithm using backtracking on the intranet website of Politec Informatic Ltd., Brazil. We defined throughput (the number of visitors reaching their target pages per time unit, relative to the total number of visitors) as an index to measure the website's performance. We also used the links in a web page to represent the effect of visitors' pheromone trails. For every modification of the website organization, for example putting a link from the expected location to the target object, the simulation reported the value of throughput as a quick answer about this modification. The experiment showed the stability of our simulation model, and a positive modification to the intranet website of Politec.
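
    A toy version of the throughput index described above can be simulated in a few lines: random visitors walk the link graph of a small hypothetical website, and the fraction reaching a target page within a step budget is compared before and after adding a link. The graph and all numbers are invented; this is not the AntWeb implementation.

    ```python
    # Toy throughput simulation: uninformed visitors follow random links on a
    # small, invented site graph; throughput = fraction reaching the target page
    # within a step budget, measured before and after a structural change.
    import random

    site = {                                   # page -> outgoing links
        "home":      ["products", "news", "about"],
        "products":  ["home", "product_a", "product_b"],
        "news":      ["home"],
        "about":     ["home", "contact"],
        "product_a": ["products"],
        "product_b": ["products"],
        "contact":   ["home"],
    }

    def throughput(target, visitors=5000, max_steps=12, seed=0):
        rng = random.Random(seed)
        reached = 0
        for _ in range(visitors):
            page = "home"
            for _ in range(max_steps):
                if page == target:
                    reached += 1
                    break
                page = rng.choice(site[page])  # visitor follows a random link
        return reached / visitors

    print("throughput before adding a link:", throughput("product_b"))
    site["home"].append("product_b")           # modification suggested by usage mining
    print("throughput after adding a link: ", throughput("product_b"))
    ```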

  18. Modelling and simulation of [18F]fluoromisonidazole dynamics based on histology-derived microvessel maps

    Science.gov (United States)

    Mönnich, David; Troost, Esther G. C.; Kaanders, Johannes H. A. M.; Oyen, Wim J. G.; Alber, Markus; Thorwarth, Daniela

    2011-04-01

    Hypoxia can be assessed non-invasively by positron emission tomography (PET) using radiotracers such as [18F]fluoromisonidazole (Fmiso) accumulating in poorly oxygenated cells. Typical features of dynamic Fmiso PET data are high signal variability in the first hour after tracer administration and slow formation of a consistent contrast. The purpose of this study is to investigate whether these characteristics can be explained by the current conception of the underlying microscopic processes and to identify fundamental effects. This is achieved by modelling and simulating tissue oxygenation and tracer dynamics on the microscopic scale. In simulations, vessel structures on histology-derived maps act as sources and sinks for oxygen as well as tracer molecules. Molecular distributions in the extravascular space are determined by reaction-diffusion equations, which are solved numerically using a two-dimensional finite element method. Simulated Fmiso time activity curves (TACs), though not directly comparable to PET TACs, reproduce major characteristics of clinical curves, indicating that the microscopic model and the parameter values are adequate. Evidence for dependence of the early PET signal on the vascular fraction is found. Further, possible effects leading to late contrast formation and potential implications on the quantification of Fmiso PET data are discussed.
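
    As a loose illustration of the vessel-map-driven reaction-diffusion idea described above, the sketch below diffuses a tracer from randomly placed vessel pixels and lets it bind irreversibly, using explicit finite differences rather than the finite element method of the study. The vessel density, diffusion and binding coefficients are invented, and the oxygen dependence of binding is reduced to a constant rate.

    ```python
    # Simplified 2-D reaction-diffusion sketch of tracer accumulation around
    # vessels. Finite differences with periodic boundaries replace the FEM of the
    # study; all coefficients and the vessel map are synthetic placeholders.
    import numpy as np

    n, dx, dt = 80, 1.0, 0.1
    D, k_bind = 1.0, 0.02                      # diffusion and irreversible binding rates
    free = np.zeros((n, n))                    # freely diffusing tracer
    bound = np.zeros((n, n))                   # irreversibly bound tracer
    rng = np.random.default_rng(3)
    vessels = rng.random((n, n)) < 0.01        # sparse stand-in "microvessel map"

    for _ in range(2000):
        free[vessels] = 1.0                    # vessels act as a constant tracer source
        lap = (np.roll(free, 1, 0) + np.roll(free, -1, 0) +
               np.roll(free, 1, 1) + np.roll(free, -1, 1) - 4 * free) / dx**2
        binding = k_bind * free                # would scale with local hypoxia in the real model
        free += dt * (D * lap - binding)
        bound += dt * binding

    print(f"mean bound tracer outside vessels: {bound[~vessels].mean():.4f}")
    ```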

  19. Simulation of Population-Based Commuter Exposure to NO2 Using Different Air Pollution Models

    Directory of Open Access Journals (Sweden)

    Martina S. Ragettli

    2014-05-01

    Full Text Available We simulated commuter routes and long-term exposure to traffic-related air pollution during commute in a representative population sample in Basel (Switzerland), and evaluated three air pollution models with different spatial resolution for estimating commute exposures to nitrogen dioxide (NO2) as a marker of long-term exposure to traffic-related air pollution. Our approach includes spatially and temporally resolved data on actual commuter routes, travel modes and three air pollution models. Annual mean NO2 commuter exposures were similar between models. However, we found more within-city and within-subject variability in annual mean (±SD) NO2 commuter exposure with a high resolution dispersion model (40 ± 7 µg m−3, range: 21–61) than with a dispersion model with a lower resolution (39 ± 5 µg m−3; range: 24–51) and a land use regression model (41 ± 5 µg m−3; range: 24–54). Highest median cumulative exposures were calculated along motorized transport and bicycle routes, and the lowest for walking. For estimating commuter exposure within a city and being interested also in small-scale variability between roads, a model with a high resolution is recommended. For larger scale epidemiological health assessment studies, models with a coarser spatial resolution are likely sufficient, especially when study areas include suburban and rural areas.

  20. Simulation of the catalyst layer in PEMFC based on a novel two-phase lattice model

    Energy Technology Data Exchange (ETDEWEB)

    Zhang Jiejing; Yang Wei; Xu Li [School of Chemical Engineering and Technology, State Key Laboratory of Chemical Engineering, Tianjin Key Laboratory of Membrane Science and Desalination Technology, Tianjin University, Tianjin 300072 (China); Wang Yuxin, E-mail: yxwang@tju.edu.cn [School of Chemical Engineering and Technology, State Key Laboratory of Chemical Engineering, Tianjin Key Laboratory of Membrane Science and Desalination Technology, Tianjin University, Tianjin 300072 (China)

    2011-08-01

    Highlights: > We propose a novel two phase lattice model of catalyst layer in PEMFC. > The model features a catalyst phase and a mixed ionomer and pores phase. > Transport and electrochemical reaction in the lattice are simulated. > The model enables more accurate results than pore-solid two phase model. > Profiles of oxygen level and reaction rate across catalyst layer vary with cell current. - Abstract: A lattice model of catalyst layer in proton exchange membrane fuel cells (PEMFCs), consisting of randomly distributed catalyst phase (C phase) and mixed ionomer-pore phase (IP phase), was established by means of Monte Carlo method. Transport and electrochemical reactions in the model catalyst layer were calculated. The newly proposed C-IP model was compared with previously established pore-solid two phase model. The variation of oxygen level and reaction rate along the thickness of catalyst layer with cell current was discussed. The effect of ionomer distribution across catalyst layer was studied by comparing profiles of oxygen level, reaction rate and overpotential, as well as corresponding polarization curves.

  1. Simulation of Drought-induced Tree Mortality Using a New Individual and Hydraulic Trait-based Model (S-TEDy)

    Science.gov (United States)

    Sinha, T.; Gangodagamage, C.; Ale, S.; Frazier, A. G.; Giambelluca, T. W.; Kumagai, T.; Nakai, T.; Sato, H.

    2017-12-01

    Drought-related tree mortality at a regional scale causes drastic shifts in carbon and water cycling in Southeast Asian tropical rainforests, where severe droughts are projected to occur more frequently, especially under El Niño conditions. To provide a useful tool for projecting the tropical rainforest dynamics under climate change conditions, we developed the Spatially Explicit Individual-Based (SEIB) Dynamic Global Vegetation Model (DGVM) applicable to simulating mechanistic tree mortality induced by the climatic impacts via individual-tree-scale ecophysiology such as hydraulic failure and carbon starvation. In this study, we present the new model, SEIB-originated Terrestrial Ecosystem Dynamics (S-TEDy) model, and the computation results were compared with observations collected at a field site in a Bornean tropical rainforest. Furthermore, after validating the model's performance, numerical experiments addressing a future of the tropical rainforest were conducted using some global climate model (GCM) simulation outputs.

  2. Coupling dynamic modeling and simulation of three-degree-of-freedom micromanipulator based on piezoelectric ceramic of fuzzy PID

    Science.gov (United States)

    Li, Dongjie; Fu, Yu; Yang, Liu

    2017-08-01

    For further research on microparticle trajectories in the process of micromanipulation, this paper models the coupled dynamics of a three-degree-of-freedom micromanipulator based on piezoelectric ceramics. In micromanipulation, a change of movement in one direction generates a corresponding coupled change in the motion of the three-degree-of-freedom micromanipulator; the fuzzy PID method was therefore adopted for the control system in this study, and a modeling analysis was performed on the control system. After completing the above modeling, the simulation model was built in MATLAB/Simulink. The simulation output results are basically in accordance with the actual trajectory, which achieves the research purposes of building a coupled dynamics model for the three-degree-of-freedom micromanipulator and applying the fuzzy PID method.
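
    As a stand-in for the fuzzy PID control described above, the sketch below closes a plain discrete PID loop around a hypothetical second-order plant (the fuzzy gain-scheduling layer is not reproduced). All gains and plant constants are invented, not identified from the micromanipulator.

    ```python
    # Plain discrete PID loop on a hypothetical second-order plant, standing in
    # for the fuzzy-PID positioning control of the piezo-driven micromanipulator.
    # Gains and plant constants are invented for illustration.
    import numpy as np

    def simulate_pid(kp=2.0, ki=1.5, kd=0.05, dt=1e-3, steps=3000, target=1.0):
        wn, zeta = 50.0, 0.3                  # hypothetical plant dynamics
        x, v = 0.0, 0.0                       # plant position and velocity
        integral, prev_err = 0.0, target - x  # initialise to avoid derivative kick
        trajectory = []
        for _ in range(steps):
            err = target - x
            integral += err * dt
            derivative = (err - prev_err) / dt
            u = kp * err + ki * integral + kd * derivative   # PID control law
            prev_err = err
            # second-order plant: x'' + 2*zeta*wn*x' + wn^2*x = wn^2*u
            a = wn**2 * (u - x) - 2 * zeta * wn * v
            v += a * dt
            x += v * dt
            trajectory.append(x)
        return np.array(trajectory)

    traj = simulate_pid()
    print(f"final position {traj[-1]:.4f}, overshoot {traj.max() - 1.0:.4f}")
    ```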

  3. Model extension and improvement for simulator-based software safety analysis

    Energy Technology Data Exchange (ETDEWEB)

    Huang, H.-W. [Department of Engineering and System Science, National Tsing Hua University (NTHU), 101 Section 2 Kuang Fu Road, Hsinchu, Taiwan (China) and Institute of Nuclear Energy Research (INER), No. 1000 Wenhua Road, Chiaan Village, Longtan Township, Taoyuan County 32546, Taiwan (China)]. E-mail: hwhwang@iner.gov.tw; Shih Chunkuan [Department of Engineering and System Science, National Tsing Hua University (NTHU), 101 Section 2 Kuang Fu Road, Hsinchu, Taiwan (China); Yih Swu [Department of Computer Science and Information Engineering, Ching Yun University, 229 Chien-Hsin Road, Jung-Li, Taoyuan County 320, Taiwan (China); Chen, M.-H. [Institute of Nuclear Energy Research (INER), No. 1000Wenhua Road, Chiaan Village, Longtan Township, Taoyuan County 32546, Taiwan (China); Lin, J.-M. [Taiwan Power Company (TPC), 242 Roosevelt Road, Section 3, Taipei 100, Taiwan (China)

    2007-05-15

    One of the major concerns when employing digital I and C systems in nuclear power plants is that a digital system may introduce new failure modes, which differ from those of previous analog I and C systems. Various techniques are under development to analyze the hazards originating from software faults in digital systems. Preliminary hazard analysis, failure modes and effects analysis, and fault tree analysis are the most extensively used techniques. However, these techniques are static analysis methods that cannot capture dynamic behavior or the interactions among systems. This research utilizes the 'simulator/plant model testing' technique classified in (IEEE Std 7-4.3.2-2003, 2003. IEEE Standard for Digital Computers in Safety Systems of Nuclear Power Generating Stations) to identify hazards which might be induced by nuclear I and C software defects. The recirculation flow system, control rod system, feedwater system, steam line model, dynamic power-core flow map, and related control systems of the PCTran-ABWR model were successfully extended and improved. The benchmark against the ABWR SAR proves that this modified model is capable of accomplishing dynamic, system-level software safety analysis and performs better than the static methods. This improved plant simulation can then be further applied to hazard analysis for operator/digital I and C interface interaction failure studies, and to hardware-in-the-loop fault injection studies.

  4. Towards a complex systems approach in sports injury research: simulating running-related injury development with agent-based modelling.

    Science.gov (United States)

    Hulme, Adam; Thompson, Jason; Nielsen, Rasmus Oestergaard; Read, Gemma J M; Salmon, Paul M

    2018-06-18

    There have been recent calls for the application of the complex systems approach in sports injury research. However, beyond theoretical description and static models of complexity, little progress has been made towards formalising this approach in a way that is practical to sports injury scientists and clinicians. Therefore, our objective was to use a computational modelling method and develop a dynamic simulation in sports injury research. Agent-based modelling (ABM) was used to model the occurrence of sports injury in a synthetic athlete population. The ABM was developed based on sports injury causal frameworks and was applied in the context of distance running-related injury (RRI). Using the acute:chronic workload ratio (ACWR), we simulated the dynamic relationship between changes in weekly running distance and RRI through the manipulation of various 'athlete management tools'. The findings confirmed that building weekly running distances over time, even within the reported ACWR 'sweet spot', will eventually result in RRI as athletes reach and surpass their individual physical workload limits. Introducing training-related error into the simulation and the modelling of a 'hard ceiling' dynamic resulted in a higher RRI incidence proportion across the population at higher absolute workloads. The presented simulation offers a practical starting point to further apply more sophisticated computational models that can account for the complex nature of sports injury aetiology. Alongside traditional forms of scientific inquiry, the use of ABM and other simulation-based techniques could be considered as a complementary and alternative methodological approach in sports injury research. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
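
    The acute:chronic workload ratio referred to above is straightforward to compute: the most recent week's load divided by the rolling average of the last four weeks. The weekly distances below are invented, and the 0.8-1.3 band is only the commonly cited 'sweet spot', not a threshold taken from this paper.

    ```python
    # Acute:chronic workload ratio (ACWR): acute week load over the rolling
    # four-week average. The weekly distances are hypothetical.
    def acwr(weekly_loads, chronic_weeks=4):
        """Return ACWR values from week `chronic_weeks` onward."""
        ratios = []
        for week in range(chronic_weeks - 1, len(weekly_loads)):
            chronic = sum(weekly_loads[week - chronic_weeks + 1:week + 1]) / chronic_weeks
            acute = weekly_loads[week]
            ratios.append(acute / chronic if chronic > 0 else float("nan"))
        return ratios

    weekly_km = [20, 22, 25, 28, 35, 45, 30]       # hypothetical training programme
    for week, ratio in enumerate(acwr(weekly_km), start=4):
        flag = "within" if 0.8 <= ratio <= 1.3 else "outside"
        print(f"week {week}: ACWR = {ratio:.2f} ({flag} the commonly cited 0.8-1.3 band)")
    ```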

  5. An Empirical Agent-Based Model to Simulate the Adoption of Water Reuse Using the Social Amplification of Risk Framework.

    Science.gov (United States)

    Kandiah, Venu; Binder, Andrew R; Berglund, Emily Z

    2017-10-01

    Water reuse can serve as a sustainable alternative water source for urban areas. However, the successful implementation of large-scale water reuse projects depends on community acceptance. Because of the negative perceptions that are traditionally associated with reclaimed water, water reuse is often not considered in the development of urban water management plans. This study develops a simulation model for understanding community opinion dynamics surrounding the issue of water reuse, and how individual perceptions evolve within that context, which can help in the planning and decision-making process. Based on the social amplification of risk framework, our agent-based model simulates consumer perceptions, discussion patterns, and their adoption or rejection of water reuse. The model is based on the "risk publics" model, an empirical approach that uses the concept of belief clusters to explain the adoption of new technology. Each household is represented as an agent, and parameters that define their behavior and attributes are defined from survey data. Community-level parameters, including social groups, relationships, and communication variables (also from survey data), are encoded to simulate the social processes that influence community opinion. The model demonstrates its capabilities to simulate opinion dynamics and consumer adoption of water reuse. In addition, based on empirical data, the model is applied to investigate water reuse behavior in different regions of the United States. Importantly, our results reveal that public opinion dynamics emerge differently based on membership in opinion clusters, frequency of discussion, and the structure of social networks. © 2017 Society for Risk Analysis.

  6. POLARIS: Agent-based modeling framework development and implementation for integrated travel demand and network and operations simulations

    Energy Technology Data Exchange (ETDEWEB)

    Auld, Joshua; Hope, Michael; Ley, Hubert; Sokolov, Vadim; Xu, Bo; Zhang, Kuilin

    2016-03-01

    This paper discusses the development of an agent-based modelling software development kit, and the implementation and validation of a model built with it that integrates dynamic simulation of travel demand, network supply and network operations. A description is given of the core utilities in the kit: a parallel discrete event engine, interprocess exchange engine, and memory allocator, as well as a number of ancillary utilities: visualization library, database IO library, and scenario manager. The overall framework emphasizes the design goals of generality, code agility, and high performance. This framework allows the modeling of several aspects of the transportation system that are typically handled with separate stand-alone software applications, in a high-performance and extensible manner. Integrating models such as dynamic traffic assignment and disaggregate demand models has been a long-standing issue for transportation modelers. The integrated approach shows a possible way to resolve this difficulty. The simulation model built from the POLARIS framework is a single, shared-memory process for handling all aspects of the integrated urban simulation. The resulting gains in computational efficiency and performance allow planning models to be extended to include previously separate aspects of the urban system, enhancing the utility of such models from the planning perspective. Initial tests with case studies involving traffic management center impacts on various network events such as accidents, congestion and weather events show the potential of the system.
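
    To make the discrete event engine idea above concrete, the sketch below shows a minimal single-threaded event engine: events sit in a priority queue keyed by simulation time and are dispatched in order. The agent callbacks and times are purely illustrative and bear no relation to the actual POLARIS SDK API.

    ```python
    # Minimal single-threaded discrete event engine: a priority queue keyed by
    # simulation time, with a tie-breaking counter. Callbacks and times are
    # illustrative only, not the POLARIS interfaces.
    import heapq
    import itertools

    class EventEngine:
        def __init__(self):
            self._queue = []
            self._counter = itertools.count()   # tie-breaker for simultaneous events
            self.now = 0.0

        def schedule(self, time, callback, *args):
            heapq.heappush(self._queue, (time, next(self._counter), callback, args))

        def run(self, until=float("inf")):
            while self._queue and self._queue[0][0] <= until:
                self.now, _, callback, args = heapq.heappop(self._queue)
                callback(self, *args)

    def traveler_departs(engine, name, travel_time):
        print(f"t={engine.now:5.1f}: {name} departs")
        engine.schedule(engine.now + travel_time, traveler_arrives, name)

    def traveler_arrives(engine, name):
        print(f"t={engine.now:5.1f}: {name} arrives")

    engine = EventEngine()
    engine.schedule(0.0, traveler_departs, "agent-1", 17.5)
    engine.schedule(5.0, traveler_departs, "agent-2", 9.0)
    engine.run(until=60.0)
    ```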

  7. Modelica-based modeling and simulation of a twin screw compressor for heat pump applications

    International Nuclear Information System (INIS)

    Chamoun, Marwan; Rulliere, Romuald; Haberschill, Philippe; Peureux, Jean-Louis

    2013-01-01

    A new twin screw compressor has been developed by SRM (Svenska Rotor Maskiner) for use in a new high temperature heat pump using water as refrigerant. This article presents a mathematical model of the thermodynamic process of compression in twin screw compressors. Using a special discretization method, a transient twin screw compressor model has been developed using Modelica in order to study the dry compression cycle of this machine at high temperature levels. The pressure and enthalpy evolution in the control volumes of the model are calculated as a function of the rotational angle of the male rotor using energy and continuity equations. In addition, associated processes encountered in real machines such as variable fluid leakages, water injection and heat losses are modeled and implemented in the main compressor model. A comparison is performed using the model developed, demonstrating the behavior of the compressor and the evolution of its different parameters in different configurations with and without water injection. This comparison shows the need for water injection to avoid compressor failure and improve its efficiency. -- Highlights: • Difficulties related to the compressor limit the development of a high temperature heat pump using water as refrigerant. • A new water vapor double screw compressor has been developed to overcome compression problems. • A dynamic model of this compressor has been developed and simulated using Modelica. • The behavior of the compressor has been identified all along the compression cycle and efficiencies have been calculated

  8. Fish Individual-based Numerical Simulator (FINS): A particle-based model of juvenile salmonid movement and dissolved gas exposure history in the Columbia River Basin

    International Nuclear Information System (INIS)

    Scheibe, Timothy D.; Richmond, Marshall C.

    2002-01-01

    This paper describes a numerical model of juvenile salmonid migration in the Columbia and Snake Rivers. The model, called the Fish Individual-based Numerical Simulator or FINS, employs a discrete, particle-based approach to simulate the migration and history of exposure to dissolved gases of individual fish. FINS is linked to a two-dimensional (vertically-averaged) hydrodynamic simulator that quantifies local water velocity, temperature, and dissolved gas levels as a function of river flow rates and dam operations. Simulated gas exposure histories can be input to biological mortality models to predict the effects of various river configurations on fish injury and mortality due to dissolved gas supersaturation. Therefore, FINS serves as a critical linkage between hydrodynamic models of the river system and models of biological impacts. FINS was parameterized and validated based on observations of individual fish movements collected using radiotelemetry methods during 1997 and 1998 . A quasi-inverse approach was used to decouple fish swimming movements from advection with the local water velocity, allowing inference of time series of non-advective displacements of individual fish from the radiotelemetry data. Statistical analyses of these displacements are presented, and confirm that strong temporal correlation of fish swimming behavior persists in some cases over several hours. A correlated random-walk model was employed to simulate the observed migration behavior, and parameters of the model were estimated that lead to close correspondence between predictions and observations
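
    A minimal version of the correlated random walk used for the non-advective movement component can be written as follows: each step's heading is the previous heading plus a small random turning angle, which produces the temporal correlation noted above. The speed and turning-angle spread are illustrative and are not the values fitted to the radiotelemetry data.

    ```python
    # Correlated random walk sketch: heading changes by a small random turning
    # angle each step, so successive displacements are correlated in time.
    # Parameter values are illustrative, not the FINS estimates.
    import numpy as np

    def correlated_random_walk(n_steps=500, speed=0.4, turn_sd=0.3, seed=7):
        rng = np.random.default_rng(seed)
        heading = rng.uniform(0.0, 2.0 * np.pi)
        pos = np.zeros((n_steps + 1, 2))
        for i in range(n_steps):
            heading += rng.normal(0.0, turn_sd)      # small heading change per step
            step = speed * np.array([np.cos(heading), np.sin(heading)])
            pos[i + 1] = pos[i] + step               # water advection would be added here
        return pos

    track = correlated_random_walk()
    path_len = np.sum(np.linalg.norm(np.diff(track, axis=0), axis=1))
    net = np.linalg.norm(track[-1] - track[0])
    print(f"path length = {path_len:.1f} m, net displacement = {net:.1f} m")
    ```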

  9. Lattice Boltzmann based multicomponent reactive transport model coupled with geochemical solver for scale simulations

    NARCIS (Netherlands)

    Patel, R.A.; Perko, J.; Jaques, D.; De Schutter, G.; Ye, G.; Van Breugel, K.

    2013-01-01

    A Lattice Boltzmann (LB) based reactive transport model intended to capture reactions and solid phase changes occurring at the pore scale is presented. The proposed approach uses LB method to compute multi component mass transport. The LB multi-component transport model is then coupled with the

  10. A Bayesian network based approach for integration of condition-based maintenance in strategic offshore wind farm O&M simulation models

    DEFF Research Database (Denmark)

    Nielsen, Jannie Sønderkær; Sørensen, John Dalsgaard; Sperstad, Iver Bakken

    2018-01-01

    In the overall decision problem regarding optimization of operation and maintenance (O&M) for offshore wind farms, there are many approaches for solving parts of the overall decision problem. Simulation-based strategy models accurately capture system effects related to logistics, but model...... to generate failures and CBM tasks. An example considering CBM for wind turbine blades demonstrates the feasibility of the approach....

  11. Simulating demand for innovative radiotherapies: An illustrative model based on carbon ion and proton radiotherapy

    International Nuclear Information System (INIS)

    Pommier, Pascal; Lievens, Yolande; Feschet, Fabien; Borras, Josep M.; Baron, Marie Helene; Shtiliyanova, Anastasiya; Pijls-Johannesma, Madelon

    2010-01-01

    Background and purpose: Innovative therapies are characterized not only by major uncertainties regarding clinical benefit and cost but also by the expected recruitment of patients. An original model was developed to simulate patient recruitment to a costly particle therapy by varying the layout of the facility and the patient referral base (one vs. several countries) and by weighting the treated indication by the expected benefit of particle therapy. Material and methods: A multi-step probabilistic spatial model was used to allocate patients to the optimal treatment strategy and facility, taking into account the estimated therapeutic gain from the new therapy for each tumour type, the geographical accessibility of the facilities and patient preference. Recruitment was simulated under different assumptions relating to demand and supply. Results: Extending the recruitment area, reducing treatment capacity, equipping all treatment rooms with a carbon ion gantry and inclusion of proton protocols in carbon ion facilities led to an increased proportion of indications with the highest expected benefit. Assuming the existence of a competing carbon ion facility, lower values of therapeutic gain, and a greater unwillingness of patients to travel for treatment increased the proportion of indications with low expected benefit. Conclusions: Modelling patient recruitment may aid decision-making when planning new and expensive treatments.

  12. A simulation-based Data Envelopment Analysis (DEA model to evaluate wind plants locations

    Directory of Open Access Journals (Sweden)

    Hossein Sameie

    2015-04-01

    Full Text Available As the world becomes increasingly overpopulated and polluted, humanity is seeking to utilize new sources of energy that are cleaner, cheaper, and more accessible. Wind is one of these clean energy sources and is accessible everywhere on the planet. This source of energy cannot be stored for later use; therefore, the environmental circumstances and geographical location of wind plants are crucial. This study proposes a model to decide on the optimum location for a wind farm within the demand area. To tackle the uncertainty related to the geographical position of each nominated location, such as wind speed, altitude, mean temperature, and humidity, a simulation method is applied to the problem. Other factors, such as the time a plant is out of service and demand fluctuations, are also considered in the simulation phase. Moreover, a probability distribution function is calculated for the turbine power. Data Envelopment Analysis (DEA) then performs the selection among all the nominated locations for the wind farm. The proposed model takes into account several important elements of the problem, such as land cost, the average power received from the wind, and demand point population, at the same time in order to select the optimum location of wind plants. Finally, the model is applied to a real case in order to demonstrate its reliability and applicability.
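
    As a hedged sketch of the DEA step described above, the code below scores a handful of hypothetical candidate locations with an input-oriented CCR envelopment model solved by linear programming. The choice of inputs and outputs, and every number in the data matrices, is invented; the simulation stage that would generate these figures is omitted.

    ```python
    # Input-oriented CCR DEA model solved with linear programming, sketching how
    # candidate wind-farm locations (DMUs) could be ranked. All data are invented.
    import numpy as np
    from scipy.optimize import linprog

    # rows = candidate locations; inputs = [land cost, expected downtime],
    # outputs = [mean wind power, demand point population served]
    inputs = np.array([[100.0, 12.0], [80.0, 20.0], [120.0, 8.0], [90.0, 15.0]])
    outputs = np.array([[55.0, 3.0], [40.0, 5.0], [70.0, 2.5], [50.0, 4.0]])
    n_dmu = inputs.shape[0]

    def ccr_efficiency(o):
        # variables: [theta, lambda_1 ... lambda_n]; minimise theta
        c = np.zeros(n_dmu + 1)
        c[0] = 1.0
        A_ub, b_ub = [], []
        for i in range(inputs.shape[1]):          # sum_j lam_j * x_ij <= theta * x_io
            A_ub.append(np.concatenate(([-inputs[o, i]], inputs[:, i])))
            b_ub.append(0.0)
        for r in range(outputs.shape[1]):         # sum_j lam_j * y_rj >= y_ro
            A_ub.append(np.concatenate(([0.0], -outputs[:, r])))
            b_ub.append(-outputs[o, r])
        res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                      bounds=[(0, None)] * (n_dmu + 1), method="highs")
        return res.x[0]

    for o in range(n_dmu):
        print(f"location {o}: CCR efficiency = {ccr_efficiency(o):.3f}")
    ```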

  13. 3D realistic head model simulation based on transcranial magnetic stimulation.

    Science.gov (United States)

    Yang, Shuo; Xu, Guizhi; Wang, Lei; Chen, Yong; Wu, Huanli; Li, Ying; Yang, Qingxin

    2006-01-01

    Transcranial magnetic stimulation (TMS) is a powerful non-invasive tool for investigating functions in the brain. The target inside the head is stimulated with eddy currents induced in the tissue by the time-varying magnetic field. Precise spatial localization of stimulation sites is the key to efficient functional magnetic stimulation. Many researchers have devoted their work to magnetic field analysis in empty free space. In this paper, a realistic head model for use with the Finite Element Method has been developed. The magnetic field induced in the head by TMS has been analysed. This three-dimensional simulation is useful for spatial localization of stimulation.

  14. Modelling and Simulation Based on Matlab/Simulink: A Press Mechanism

    International Nuclear Information System (INIS)

    Halicioglu, R; Dulger, L C; Bozdana, A T

    2014-01-01

    In this study, the design and kinematic analysis of a crank-slider mechanism for a crank press are presented. The crank-slider mechanism is commonly applied in practice, in both direct and indirect drive alternatives. Since inexpensiveness, flexibility and controllability are becoming more and more important in many industrial applications, especially in the automotive industry, a crank press with a servo actuator (servo crank press) is taken as the application. The design and kinematic analysis of the representative mechanism are presented, with a geometrical analysis of the inverse kinematics of the mechanism based on the desired motion of the slider. The mechanism is modelled on the MATLAB/Simulink platform, and the simulation results are presented herein.
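
    For an in-line crank-slider of the kind analysed above, the forward kinematics reduce to a closed-form expression; a minimal sketch (crank radius, rod length and crank speed are assumed values, not the paper's press dimensions) is:

      import numpy as np

      # Closed-form crank-slider kinematics: slider position and velocity versus crank
      # angle. Crank radius r, rod length l and crank speed omega are assumed values.
      r, l = 0.05, 0.20          # crank radius and connecting-rod length (m), assumed
      omega = 2 * np.pi * 2.0    # crank angular velocity (rad/s), assumed 2 rev/s

      theta = np.linspace(0.0, 2.0 * np.pi, 361)
      x = r * np.cos(theta) + np.sqrt(l**2 - (r * np.sin(theta))**2)   # slider position
      dxdtheta = (-r * np.sin(theta)
                  - (r**2 * np.sin(theta) * np.cos(theta)) / np.sqrt(l**2 - (r * np.sin(theta))**2))
      v = dxdtheta * omega                                              # slider velocity

      print(f"stroke: {x.max() - x.min():.3f} m (expected {2*r:.3f} m)")
      print(f"peak slider speed: {np.abs(v).max():.3f} m/s")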

  15. Modelling and Simulation Based on Matlab/Simulink: A Press Mechanism

    Science.gov (United States)

    Halicioglu, R.; Dulger, L. C.; Bozdana, A. T.

    2014-03-01

    In this study, the design and kinematic analysis of a crank-slider mechanism for a crank press are presented. The crank-slider mechanism is commonly applied in practice, in both direct and indirect drive alternatives. Since inexpensiveness, flexibility and controllability are becoming more and more important in many industrial applications, especially in the automotive industry, a crank press with a servo actuator (servo crank press) is taken as the application. The design and kinematic analysis of the representative mechanism are presented, with a geometrical analysis of the inverse kinematics of the mechanism based on the desired motion of the slider. The mechanism is modelled on the MATLAB/Simulink platform, and the simulation results are presented herein.

  16. In situ measurement and modeling of biomechanical response of human cadaveric soft tissues for physics-based surgical simulation.

    Science.gov (United States)

    Lim, Yi-Je; Deo, Dhanannjay; Singh, Tejinder P; Jones, Daniel B; De, Suvranu

    2009-06-01

    Development of a laparoscopic surgery simulator that delivers high-fidelity visual and haptic (force) feedback, based on the physical models of soft tissues, requires the use of empirical data on the mechanical behavior of intra-abdominal organs under the action of external forces. As experiments on live human patients present significant risks, the use of cadavers presents an alternative. We present techniques of measuring and modeling the mechanical response of human cadaveric tissue for the purpose of developing a realistic model. The major contribution of this paper is the development of physics-based models of soft tissues that range from linear elastic models to nonlinear viscoelastic models, which are efficient for application within the framework of a real-time surgery simulator. To investigate the in situ mechanical, static, and dynamic properties of intra-abdominal organs, we have developed a high-precision instrument by retrofitting a robotic device from Sensable Technologies (position resolution of 0.03 mm) with a six-axis Nano 17 force-torque sensor from ATI Industrial Automation (force resolution of 1/1,280 N along each axis), and used it to apply precise displacement stimuli and record the force response of the liver and stomach of ten fresh human cadavers. The mean elastic modulus of liver and stomach is estimated as 5.9359 kPa and 1.9119 kPa, respectively, over the range of indentation depths tested. We have also obtained the parameters of a quasilinear viscoelastic (QLV) model to represent the nonlinear viscoelastic behavior of the cadaver stomach and liver over a range of indentation depths and speeds. The models are found to have an excellent goodness of fit (with R² > 0.99). The data and models presented in this paper, together with additional ones based on the same principles, would result in realistic physics-based surgical simulators.
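
    As a rough illustration of how an effective elastic modulus can be backed out of quasi-static indentation data, the sketch below fits a contact stiffness and applies the flat-punch relation F = 2·a·E_eff·d. This is not the paper's QLV fitting procedure; the force-depth points and the punch radius are made-up values for illustration.

      import numpy as np

      # Illustrative sketch: estimate an effective elastic modulus from indentation
      # data via the flat-punch relation F = 2 * a * E_eff * d.
      # Data points and punch radius are assumed, not taken from the study.
      depth = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5]) * 1e-3        # indentation depth (m)
      force = np.array([0.0, 0.012, 0.025, 0.036, 0.050, 0.061])     # measured force (N)
      a = 2.0e-3                                                      # punch radius (m), assumed

      stiffness = np.polyfit(depth, force, 1)[0]      # slope of the force-depth curve (N/m)
      E_eff = stiffness / (2.0 * a)                   # effective modulus E/(1 - nu^2), Pa
      print(f"contact stiffness: {stiffness:.1f} N/m, effective modulus: {E_eff/1e3:.2f} kPa")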

  17. Molecular dynamics simulation study of PTP1B with allosteric inhibitor and its application in receptor based pharmacophore modeling

    Science.gov (United States)

    Bharatham, Kavitha; Bharatham, Nagakumar; Kwon, Yong Jung; Lee, Keun Woo

    2008-12-01

    Allosteric inhibition of protein tyrosine phosphatase 1B (PTP1B), has paved a new path to design specific inhibitors for PTP1B, which is an important drug target for the treatment of type II diabetes and obesity. The PTP1B1-282-allosteric inhibitor complex crystal structure lacks α7 (287-298) and moreover there is no available 3D structure of PTP1B1-298 in open form. As the interaction between α7 and α6-α3 helices plays a crucial role in allosteric inhibition, α7 was modeled to the PTP1B1-282 in open form complexed with an allosteric inhibitor (compound-2) and a 5 ns MD simulation was performed to investigate the relative orientation of the α7-α6-α3 helices. The simulation conformational space was statistically sampled by clustering analyses. This approach was helpful to reveal certain clues on PTP1B allosteric inhibition. The simulation was also utilized in the generation of receptor based pharmacophore models to include the conformational flexibility of the protein-inhibitor complex. Three cluster representative structures of the highly populated clusters were selected for pharmacophore model generation. The three pharmacophore models were subsequently utilized for screening databases to retrieve molecules containing the features that complement the allosteric site. The retrieved hits were filtered based on certain drug-like properties and molecular docking simulations were performed in two different conformations of protein. Thus, performing MD simulation with α7 to investigate the changes at the allosteric site, then developing receptor based pharmacophore models and finally docking the retrieved hits into two distinct conformations will be a reliable methodology in identifying PTP1B allosteric inhibitors.

  18. MODELING OF INVESTMENT STRATEGIES IN STOCKS MARKETS: AN APPROACH FROM MULTI AGENT BASED SIMULATION AND FUZZY LOGIC

    Directory of Open Access Journals (Sweden)

    ALEJANDRO ESCOBAR

    2010-01-01

    Full Text Available This paper presents a simulation model of a complex system, in this case a financial market, using a Multi-Agent Based Simulation approach. The model takes into account micro-level aspects such as the Continuous Double Auction mechanism, which is widely used within stock markets, as well as the reasoning of the investor agents who participate looking for profits. To model such reasoning, several variables were considered, including general stock information such as profitability and volatility, but also agent-level aspects such as their risk tendency. All these variables are incorporated through a fuzzy logic approach, trying to represent faithfully the kind of reasoning that non-expert investors have, including a stochastic component in order to model human factors.
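
    A minimal sketch of this kind of fuzzy reasoning is shown below. The membership functions, rule set and variable names are illustrative assumptions, not the authors' actual rules.

      import numpy as np

      # Minimal Mamdani-style fuzzy inference sketch for a "buy inclination" signal,
      # combining stock profitability, volatility, and the agent's risk tendency.
      # Membership functions and rules are illustrative assumptions only.

      def tri(x, a, b, c):
          """Triangular membership function supported on [a, c], peaking at b."""
          return np.maximum(np.minimum((x - a) / (b - a + 1e-12), (c - x) / (c - b + 1e-12)), 0.0)

      def buy_inclination(profitability, volatility, risk_tendency):
          # Fuzzify the inputs (all variables normalised to [0, 1]).
          prof_high = tri(profitability, 0.5, 1.0, 1.5)
          vol_low   = tri(volatility,   -0.5, 0.0, 0.5)
          vol_high  = tri(volatility,    0.5, 1.0, 1.5)
          risk_high = tri(risk_tendency, 0.5, 1.0, 1.5)

          # Rule 1: high profitability AND low volatility -> buy.
          r1 = min(prof_high, vol_low)
          # Rule 2: high profitability AND high volatility AND risk-seeking agent -> buy.
          r2 = min(prof_high, vol_high, risk_high)
          # Aggregate the rules with max (fuzzy OR) to get a degree of buy inclination.
          return max(r1, r2)

      print(buy_inclination(profitability=0.8, volatility=0.2, risk_tendency=0.3))  # cautious buy
      print(buy_inclination(profitability=0.8, volatility=0.9, risk_tendency=0.9))  # risk-seeking buy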

  19. [Hardware Implementation of Numerical Simulation Function of Hodgkin-Huxley Model Neurons Action Potential Based on Field Programmable Gate Array].

    Science.gov (United States)

    Wang, Jinlong; Lu, Mai; Hu, Yanwen; Chen, Xiaoqiang; Pan, Qiangqiang

    2015-12-01

    The neuron is the basic unit of the biological neural system. The Hodgkin-Huxley (HH) model is one of the most realistic neuron models for describing the electrophysiological characteristics of neurons. Hardware implementation of neurons could provide new research ideas for the clinical treatment of spinal cord injury, bionics and artificial intelligence. Based on the HH model neuron and DSP Builder technology, in the present study a single HH model neuron was implemented in hardware on a Field Programmable Gate Array (FPGA). The neuron implemented in the FPGA was stimulated by different types of current, the action potential response characteristics were analyzed, and the correlation coefficient between the numerical simulation result and the hardware implementation result was calculated. The results showed that the neuronal action potential response of the FPGA was highly consistent with the numerical simulation result. This work lays the foundation for hardware implementation of neural networks.
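
    The numerical reference such a hardware implementation is compared against is a direct integration of the HH equations. A minimal forward-Euler sketch with the standard squid-axon parameters follows; the stimulus amplitude and time step are assumed choices, not values from the study.

      import numpy as np

      # Forward-Euler integration of the classical Hodgkin-Huxley equations
      # (standard squid-axon parameters). Stimulus amplitude and step size are assumed.
      C_m = 1.0                                  # membrane capacitance (uF/cm^2)
      g_Na, g_K, g_L = 120.0, 36.0, 0.3          # maximal conductances (mS/cm^2)
      E_Na, E_K, E_L = 50.0, -77.0, -54.387      # reversal potentials (mV)

      a_m = lambda V: 0.1 * (V + 40.0) / (1.0 - np.exp(-(V + 40.0) / 10.0))
      b_m = lambda V: 4.0 * np.exp(-(V + 65.0) / 18.0)
      a_h = lambda V: 0.07 * np.exp(-(V + 65.0) / 20.0)
      b_h = lambda V: 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))
      a_n = lambda V: 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))
      b_n = lambda V: 0.125 * np.exp(-(V + 65.0) / 80.0)

      dt, T = 0.01, 50.0                          # time step and duration (ms)
      steps = int(T / dt)
      V, m, h, n = -65.0, 0.05, 0.6, 0.32         # initial state near rest
      trace = np.empty(steps)
      for i in range(steps):
          I_ext = 10.0 if 5.0 <= i * dt <= 45.0 else 0.0   # step current (uA/cm^2), assumed
          I_ion = (g_Na * m**3 * h * (V - E_Na)
                   + g_K * n**4 * (V - E_K)
                   + g_L * (V - E_L))
          V += dt * (I_ext - I_ion) / C_m
          m += dt * (a_m(V) * (1.0 - m) - b_m(V) * m)
          h += dt * (a_h(V) * (1.0 - h) - b_h(V) * h)
          n += dt * (a_n(V) * (1.0 - n) - b_n(V) * n)
          trace[i] = V

      print(f"peak membrane potential: {trace.max():.1f} mV")   # spikes reach roughly +40 mV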

  20. Graph Cellular Automata with Relation-Based Neighbourhoods of Cells for Complex Systems Modelling: A Case of Traffic Simulation

    Directory of Open Access Journals (Sweden)

    Krzysztof Małecki

    2017-12-01

    Full Text Available A complex system is a set of mutually interacting elements for which it is possible to construct a mathematical model. This article focuses on cellular automata theory and graph theory in order to compare various types of cellular automata and to analyse applications of graph structures together with cellular automata. It proposes a graph cellular automaton with a variable configuration of cells and relation-based neighbourhoods (r–GCA). The developed mechanism enables the modelling of phenomena found in complex systems (e.g., transport networks, urban logistics, social networks), taking into account the interaction between the existing objects. As an implementation example, moving vehicles were modelled, and the r–GCA was compared to other cellular automata models that simulate road traffic and are used in computer simulation.
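
    As a baseline of the "other cellular automata models" such an r–GCA would be compared against, the classical Nagel-Schreckenberg traffic CA on a ring road can be written in a few lines. This sketch is not the graph-based r–GCA itself; road length, car count, maximum speed and braking probability are assumed values.

      import numpy as np

      # Classical Nagel-Schreckenberg cellular automaton on a ring road (baseline
      # traffic CA, not the relation-based r-GCA). Parameters are assumed values.
      rng = np.random.default_rng(1)
      L, N, v_max, p_slow, steps = 200, 60, 5, 0.3, 500   # cells, cars, max speed, braking prob., steps

      pos = np.sort(rng.choice(L, size=N, replace=False))
      vel = np.zeros(N, dtype=int)
      flow = 0
      for _ in range(steps):
          gaps = (np.roll(pos, -1) - pos - 1) % L          # empty cells to the car ahead
          vel = np.minimum(vel + 1, v_max)                 # acceleration
          vel = np.minimum(vel, gaps)                      # braking to avoid collision
          vel = np.where(rng.random(N) < p_slow, np.maximum(vel - 1, 0), vel)  # random slowdown
          pos = (pos + vel) % L                            # movement on the ring
          flow += vel.sum()

      print(f"mean flow: {flow / (steps * L):.3f} vehicles per cell per step")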

  1. Condition-based maintenance effectiveness for series–parallel power generation system—A combined Markovian simulation model

    International Nuclear Information System (INIS)

    Azadeh, A.; Asadzadeh, S.M.; Salehi, N.; Firoozi, M.

    2015-01-01

    Condition-based maintenance (CBM) is an increasingly applicable policy in the competitive marketplace as a means of improving equipment reliability and efficiency. Not only does maintenance have a close relationship with safety, but its costs also make it an even more attractive issue for researchers. This study proposes a model to evaluate the effectiveness of the CBM policy compared to two other maintenance policies: Corrective Maintenance (CM) and Preventive Maintenance (PM). Maintenance policies are compared through two system performance indicators: reliability and cost. To estimate the reliability and costs of the system, the proposed Markovian discrete-event simulation model is developed under each of these policies. The applicability and usefulness of the proposed Markovian simulation model is illustrated for a series–parallel power generation system. The simulated characteristics of the CBM system include its prognostics efficiency in estimating the remaining useful life of the equipment. Results show that with efficient prognostics, the CBM policy is an effective strategy compared to the other maintenance strategies. - Highlights: • A model is developed to evaluate the effectiveness of CBM policy. • Maintenance policies are compared through reliability and cost. • A Markovian simulation model is developed. • A series–parallel power generation system is considered. • CBM is an effective strategy compared to others.
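
    The basic trade-off (planned, prognosis-triggered intervention versus run-to-failure repair) can be illustrated with a toy Monte Carlo renewal comparison. This is a simplification, not the paper's Markovian discrete-event model; all lifetimes, costs and prognostic errors below are assumed values.

      import numpy as np

      # Toy Monte Carlo comparison of corrective maintenance (run to failure) versus
      # condition-based maintenance with an imperfect remaining-useful-life prognosis.
      # All lifetimes, costs and prognostic errors are assumed values for illustration.
      rng = np.random.default_rng(42)
      n_cycles = 100_000
      life = rng.weibull(2.5, n_cycles) * 1000.0        # true time to failure (h), assumed Weibull
      c_failure, c_planned = 50_000.0, 10_000.0         # corrective vs planned repair cost, assumed

      # Corrective maintenance: always pay the failure cost, run each cycle to failure.
      cm_cost_rate = c_failure / life.mean()            # cost per operating hour

      # CBM: a prognostic estimates the RUL with relative error sigma; intervene at 90% of it.
      sigma = 0.10                                      # prognostic relative error, assumed
      predicted = life * (1.0 + sigma * rng.standard_normal(n_cycles))
      intervene = 0.9 * predicted
      missed = intervene >= life                        # prognosis too optimistic -> failure anyway
      cbm_cost = np.where(missed, c_failure, c_planned)
      cbm_uptime = np.where(missed, life, intervene)
      cbm_cost_rate = cbm_cost.sum() / cbm_uptime.sum()

      print(f"corrective maintenance: {cm_cost_rate:.2f} cost units per hour")
      print(f"condition-based maint.: {cbm_cost_rate:.2f} cost units per hour")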

  2. Advanced numerical simulation based on a non-local micromorphic model for metal forming processes

    Directory of Open Access Journals (Sweden)

    Diamantopoulou Evangelia

    2016-01-01

    Full Text Available An advanced numerical methodology is developed for metal forming simulation based on thermodynamically-consistent nonlocal constitutive equations accounting for various fully coupled mechanical phenomena under finite strain in the framework of micromorphic continua. The numerical implementation into ABAQUS/Explicit is made for 2D quadrangular elements via the VUEL user subroutine. Simple examples involving a damaged area are presented in order to show the ability of the proposed methodology to describe the independence of the solution from the space discretization.

  3. A Model-Based Approach for Bridging Virtual and Physical Sensor Nodes in a Hybrid Simulation Framework

    Directory of Open Access Journals (Sweden)

    Mohammad Mozumdar

    2014-06-01

    Full Text Available The Model Based Design (MBD approach is a popular trend to speed up application development of embedded systems, which uses high-level abstractions to capture functional requirements in an executable manner, and which automates implementation code generation. Wireless Sensor Networks (WSNs are an emerging very promising application area for embedded systems. However, there is a lack of tools in this area, which would allow an application developer to model a WSN application by using high level abstractions, simulate it mapped to a multi-node scenario for functional analysis, and finally use the refined model to automatically generate code for different WSN platforms. Motivated by this idea, in this paper we present a hybrid simulation framework that not only follows the MBD approach for WSN application development, but also interconnects a simulated sub-network with a physical sub-network and then allows one to co-simulate them, which is also known as Hardware-In-the-Loop (HIL simulation.

  4. Numerical simulation of interior ballistic process of railgun based on the multi-field coupled model

    Directory of Open Access Journals (Sweden)

    Qinghua Lin

    2016-04-01

    Full Text Available Railgun launcher design relies on appropriate models. A multi-field coupled model of a railgun launcher is presented in this paper. The 3D transient multi-field model is composed of the electromagnetic field, the thermal field and the structural field. The magnetic diffusion equations were solved by a finite-element boundary-element coupling method. The thermal diffusion equations and structural equations were solved by a finite element method. A coupled calculation was achieved by transferring data from the electromagnetic field to the thermal and structural fields. Some characteristics of railgun shots, such as the velocity skin effect, melt-wave erosion and magnetic sawing, which are generated under the conditions of large-current and high-speed sliding electrical contact, were demonstrated by numerical simulation.

  5. Models and simulations

    International Nuclear Information System (INIS)

    Lee, M.J.; Sheppard, J.C.; Sullenberger, M.; Woodley, M.D.

    1983-09-01

    On-line mathematical models have been used successfully for computer controlled operation of SPEAR and PEP. The same model control concept is being implemented for the operation of the LINAC and for the Damping Ring, which will be part of the Stanford Linear Collider (SLC). The purpose of this paper is to describe the general relationships between models, simulations and the control system for any machine at SLAC. The work we have done on the development of the empirical model for the Damping Ring will be presented as an example

  6. PSH Transient Simulation Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Muljadi, Eduard [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-12-21

    PSH Transient Simulation Modeling presentation from the WPTO FY14 - FY16 Peer Review. Transient effects are an important consideration when designing a PSH system, yet numerical techniques for hydraulic transient analysis still need improvements for adjustable-speed (AS) reversible pump-turbine applications.

  7. Modeling and simulation of adaptive Neuro-fuzzy based intelligent system for predictive stabilization in structured overlay networks

    Directory of Open Access Journals (Sweden)

    Ramanpreet Kaur

    2017-02-01

    Full Text Available Intelligent prediction of the dynamism of neighboring nodes (the k well-defined neighbors specified by the DHT protocol) is helpful to improve resilience and can reduce the overhead associated with topology maintenance of structured overlay networks. The dynamic behavior of overlay nodes depends on many factors, such as the underlying user's online behavior, geographical position, time of the day, day of the week, etc., as reported in many applications. We can exploit these characteristics for efficient maintenance of structured overlay networks by implementing an intelligent predictive framework for setting stabilization parameters appropriately. Considering the fact that human-driven behavior usually goes beyond intermittent availability patterns, we use a hybrid Neuro-fuzzy based predictor to enhance the accuracy of the predictions. In this paper, we discuss our predictive stabilization approach, implement Neuro-fuzzy based prediction in a MATLAB simulation and apply this predictive stabilization model in a Chord-based overlay network using OverSim as a simulation tool. The MATLAB simulation results show that the behavior of neighboring nodes is predictable to a large extent, as indicated by the very small RMSE. The OverSim-based simulation results also show significant improvements in the performance of the Chord-based overlay network in terms of lookup success ratio, lookup hop count and maintenance overhead as compared to the periodic stabilization approach.

  8. SU-F-J-178: A Computer Simulation Model Observer for Task-Based Image Quality Assessment in Radiation Therapy

    International Nuclear Information System (INIS)

    Dolly, S; Mutic, S; Anastasio, M; Li, H; Yu, L

    2016-01-01

    Purpose: Traditionally, image quality in radiation therapy is assessed subjectively or by utilizing physically-based metrics. Some model observers exist for task-based medical image quality assessment, but almost exclusively for diagnostic imaging tasks. As opposed to disease diagnosis, the task for image observers in radiation therapy is to utilize the available images to design and deliver a radiation dose which maximizes patient disease control while minimizing normal tissue damage. The purpose of this study was to design and implement a new computer simulation model observer to enable task-based image quality assessment in radiation therapy. Methods: A modular computer simulation framework was developed to resemble the radiotherapy observer by simulating an end-to-end radiation therapy treatment. Given images and the ground-truth organ boundaries from a numerical phantom as inputs, the framework simulates an external beam radiation therapy treatment and quantifies patient treatment outcomes using the previously defined therapeutic operating characteristic (TOC) curve. As a preliminary demonstration, TOC curves were calculated for various CT acquisition and reconstruction parameters, with the goal of assessing and optimizing simulation CT image quality for radiation therapy. Sources of randomness and bias within the system were analyzed. Results: The relationship between CT imaging dose and patient treatment outcome was objectively quantified in terms of a singular value, the area under the TOC (AUTOC) curve. The AUTOC decreases more rapidly for low-dose imaging protocols. AUTOC variation introduced by the dose optimization algorithm was approximately 0.02%, at the 95% confidence interval. Conclusion: A model observer has been developed and implemented to assess image quality based on radiation therapy treatment efficacy. It enables objective determination of appropriate imaging parameter values (e.g. imaging dose). Framework flexibility allows for incorporation

  9. Modelling and Simulating of Risk Behaviours in Virtual Environments Based on Multi-Agent and Fuzzy Logic

    Directory of Open Access Journals (Sweden)

    Linqin Cai

    2013-11-01

    Full Text Available Due to safety and ethical issues, traditional experimental approaches to modelling underground risk behaviours can be costly, dangerous and even impossible to realize. Based on multi-agent technology, a virtual coalmine platform for risk behaviour simulation is presented to model and simulate the human-machine-environment related risk factors in underground coalmines. To reveal mine workers' risk behaviours, a fuzzy emotional behaviour model is proposed to simulate underground miners' responses to potentially hazardous events based on cognitive appraisal theories and fuzzy logic techniques. The proposed emotion model can generate more believable behaviours for virtual miners according to personalized emotion states, internal motivation needs and behaviour selection thresholds. Finally, typical accident cases of underground hazard spotting and locomotive transport were implemented. The behaviour believability of the virtual miners was evaluated with a user assessment method. Experimental results show that the proposed models can create more realistic and reasonable behaviours in virtual coalmine environments, which can improve miners' risk awareness and further train miners' emergency decision-making ability when facing unexpected underground situations.

  10. Modeling and Simulation of Silicon- and Silicon Germanium-Base Bipolar Transistors Operating at a Wide Range of Temperatures.

    Science.gov (United States)

    Shaheed, M. Reaz

    1995-01-01

    Higher speed at lower cost and at low power consumption is a driving force for today's semiconductor technology. Despite a substantial effort toward achieving this goal via alternative technologies such as III-V compounds, silicon technology still dominates mainstream electronics. Progress in silicon technology will continue for some time with continual scaling of device geometry. However, there are foreseeable limits on achievable device performance, reliability and scaling for room temperature technologies. Thus, reduced temperature operation is commonly viewed as a means for continuing the progress towards higher performance. Although silicon CMOS will be the first candidate for low temperature applications, bipolar devices will be used in a hybrid fashion, as line drivers or in limited critical path elements. Silicon-germanium-base bipolar transistors look especially attractive for low-temperature bipolar applications. At low temperatures, various new physical phenomena become important in determining device behavior. Carrier freeze-out effects, which are negligible at room temperature, become of crucial importance for analyzing the low temperature device characteristics. The conventional Pearson-Bardeen model of activation energy, used for calculation of carrier freeze-out, is based on an incomplete picture of the physics that takes place and hence leads to inaccurate results at low temperatures. Plasma-induced bandgap narrowing becomes more pronounced in device characteristics at low temperatures. Even with modern numerical simulators, this effect is not well modeled or simulated. In this dissertation, improved models for such physical phenomena are presented. For accurate simulation of carrier freeze-out, the Pearson-Bardeen model has been extended to include the temperature dependence of the activation energy. The extraction of the model is based on the rigorous, first-principle theoretical calculations available in the literature. The new model is shown

  11. Decoherence and Entanglement Simulation in a Model of Quantum Neural Network Based on Quantum Dots

    Directory of Open Access Journals (Sweden)

    Altaisky Mikhail V.

    2016-01-01

    Full Text Available We present the results of the simulation of a quantum neural network based on quantum dots, using a numerical method of path integral calculation. In the proposed implementation of the quantum neural network using an array of single-electron quantum dots with dipole-dipole interaction, the coherence is shown to survive up to 0.1 nanosecond in time and up to the liquid nitrogen temperature of 77 K. We study the quantum correlations between the quantum dots by means of calculation of the entanglement of formation in a pair of quantum dots on a GaAs-based substrate with dot sizes of the order of 10⁰–10¹ nanometers and interdot distances of the order of 10¹–10² nanometers.

  12. Discrete Event Simulation-Based Resource Modelling in Health Technology Assessment.

    Science.gov (United States)

    Salleh, Syed; Thokala, Praveen; Brennan, Alan; Hughes, Ruby; Dixon, Simon

    2017-10-01

    The objective of this article was to conduct a systematic review of published research on the use of discrete event simulation (DES) for resource modelling (RM) in health technology assessment (HTA). RM is broadly defined as incorporating and measuring the effects of constraints on physical resources (e.g. beds, doctors, nurses) in HTA models. Systematic literature searches were conducted in academic databases (JSTOR, SAGE, SPRINGER, SCOPUS, IEEE, Science Direct, PubMed, EMBASE) and grey literature (Google Scholar, NHS journal library), enhanced by manual searches (i.e. reference list checking, citation searching and hand-searching techniques). The search strategy yielded 4117 potentially relevant citations. Following the screening and manual searches, ten articles were included. Reviewing these articles provided insights into the applications of RM: firstly, different types of economic analyses, model settings, RM and cost-effectiveness analysis (CEA) outcomes were identified. Secondly, variation in the characteristics of the constraints, such as the types and nature of the constraints and the sources of data for the constraints, was identified. Thirdly, it was found that including the effects of constraints caused the CEA results to change in these articles. The review found that DES proved to be an effective technique for RM, but only a small number of studies have applied it in HTA. However, these studies showed the important consequences of modelling physical constraints and point to the need for a framework to be developed to guide future applications of this approach.
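
    A minimal example of the kind of resource-constrained DES described above can be written with the open-source SimPy library (which is not necessarily the tool used in the reviewed studies). Bed capacity, arrival rate and treatment time below are assumed values.

      import simpy
      import random

      # Minimal resource-constrained discrete event simulation: patients arrive,
      # queue for a limited number of beds, and are treated. Arrival rate, treatment
      # time and bed count are assumed values for illustration only.
      random.seed(0)
      WAITS = []

      def patient(env, beds):
          arrival = env.now
          with beds.request() as req:          # queue for a bed (the physical constraint)
              yield req
              WAITS.append(env.now - arrival)  # time spent waiting for the resource
              yield env.timeout(random.expovariate(1 / 3.0))   # treatment time, mean 3 days

      def arrivals(env, beds):
          while True:
              yield env.timeout(random.expovariate(1 / 1.0))   # inter-arrival time, mean 1 day
              env.process(patient(env, beds))

      env = simpy.Environment()
      beds = simpy.Resource(env, capacity=4)   # the constrained physical resource
      env.process(arrivals(env, beds))
      env.run(until=365)                       # simulate one year

      print(f"patients admitted: {len(WAITS)}")
      print(f"mean wait for a bed: {sum(WAITS) / len(WAITS):.2f} days")

    Replacing the bed count or treatment-time distribution and re-running the model is the mechanism by which such constraints feed into cost-effectiveness outcomes.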

  13. A simulation-based interval two-stage stochastic model for agricultural nonpoint source pollution control through land retirement

    International Nuclear Information System (INIS)

    Luo, B.; Li, J.B.; Huang, G.H.; Li, H.L.

    2006-01-01

    This study presents a simulation-based interval two-stage stochastic programming (SITSP) model for agricultural nonpoint source (NPS) pollution control through land retirement under uncertain conditions. The modeling framework was established by the development of an interval two-stage stochastic program, with its random parameters being provided by the statistical analysis of the simulation outcomes of a distributed water quality approach. The developed model can deal with the tradeoff between agricultural revenue and 'off-site' water quality concerns under random effluent discharge for a land retirement scheme by minimizing the expected value of the long-term total economic and environmental cost. In addition, the uncertainties, presented as interval numbers in the agriculture-water system, can be effectively quantified with interval programming. By subdividing the whole agricultural watershed into different zones, the most pollution-sensitive cropland can be identified and an optimal land retirement scheme can be obtained through the modeling approach. The developed method was applied to the Swift Current Creek watershed in Canada for soil erosion control through land retirement. The Hydrological Simulation Program-FORTRAN (HSPF) was used to simulate the sediment information for this case study. The obtained results indicate that the total economic and environmental cost of the entire agriculture-water system can be limited within an interval value for the optimal land retirement schemes. Meanwhile, best and worst land retirement schemes were obtained for the study watershed under various uncertainties.

  14. Modelling and simulating decision processes of linked lives: An approach based on concurrent processes and stochastic race.

    Science.gov (United States)

    Warnke, Tom; Reinhardt, Oliver; Klabunde, Anna; Willekens, Frans; Uhrmacher, Adelinde M

    2017-10-01

    Individuals' decision processes play a central role in understanding modern migration phenomena and other demographic processes. Their integration into agent-based computational demography depends largely on suitable support by a modelling language. We are developing the Modelling Language for Linked Lives (ML3) to describe the diverse decision processes of linked lives succinctly in continuous time. The context of individuals is modelled by networks the individual is part of, such as family ties and other social networks. Central concepts, such as behaviour conditional on agent attributes, age-dependent behaviour, and stochastic waiting times, are tightly integrated in the language. Thereby, alternative decisions are modelled by concurrent processes that compete by stochastic race. Using a migration model, we demonstrate how this allows for compact description of complex decisions, here based on the Theory of Planned Behaviour. We describe the challenges for the simulation algorithm posed by stochastic race between multiple concurrent complex decisions.
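
    The "concurrent processes competing by stochastic race" semantics can be illustrated with a few lines of Python: each alternative behaviour of an agent draws an exponential waiting time from its (possibly attribute-dependent) rate, and the earliest one fires. The rates and behaviour names below are invented for illustration and are not taken from ML3 or the migration model.

      import random

      # Sketch of a stochastic race between concurrent candidate behaviours of one
      # agent. Rates and behaviour names are illustrative assumptions only.
      random.seed(1)

      def race(rates):
          """rates: dict mapping behaviour name -> hazard rate for this agent."""
          draws = {name: random.expovariate(rate) for name, rate in rates.items() if rate > 0}
          winner = min(draws, key=draws.get)       # earliest waiting time wins the race
          return winner, draws[winner]

      agent = {"age": 28, "employed": False, "network_abroad": 2}
      rates = {
          "migrate":  0.02 * (1 + agent["network_abroad"]),   # more ties abroad -> higher rate
          "find_job": 0.10 if not agent["employed"] else 0.0,
          "stay_put": 0.05,
      }
      behaviour, waiting_time = race(rates)
      print(f"agent does '{behaviour}' after {waiting_time:.2f} time units")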

  15. Impact of different policies on unhealthy dietary behaviors in an urban adult population: an agent-based simulation model.

    Science.gov (United States)

    Zhang, Donglan; Giabbanelli, Philippe J; Arah, Onyebuchi A; Zimmerman, Frederick J

    2014-07-01

    Unhealthy eating is a complex-system problem. We used agent-based modeling to examine the effects of different policies on unhealthy eating behaviors. We developed an agent-based simulation model to represent a synthetic population of adults in Pasadena, CA, and how they make dietary decisions. Data from the 2007 Food Attitudes and Behaviors Survey and other empirical studies were used to calibrate the parameters of the model. Simulations were performed to contrast the potential effects of various policies on the evolution of dietary decisions. Our model showed that a 20% increase in taxes on fast foods would lower the probability of fast-food consumption by 3 percentage points, whereas improving the visibility of positive social norms by 10%, either through community-based or mass-media campaigns, could improve the consumption of fruits and vegetables by 7 percentage points and lower fast-food consumption by 6 percentage points. Zoning policies had no significant impact. Interventions emphasizing healthy eating norms may be more effective than directly targeting food prices or regulating local food outlets. Agent-based modeling may be a useful tool for testing the population-level effects of various policies within complex systems.

  16. Enhancing Transportation Education through On-line Simulation using an Agent-Based Demand and Assignment Model

    OpenAIRE

    Shanjiang Zhu; Feng Xie; David Levinson

    2005-01-01

    This research explores the effectiveness of using simulation as a tool for enhancing classroom learning in the Civil Engineering Department of the University of Minnesota at Twin Cities. The authors developed a modern transportation planning software package, Agent-based Demand and Assignment Model (ADAM), that is consistent with our present understanding of travel behavior, that is platform independent, and that is easy to learn and is thus usable by students. An in-class project incorporate...

  17. Numerical simulation of the heat extraction in EGS with thermal-hydraulic-mechanical coupling method based on discrete fractures model

    International Nuclear Information System (INIS)

    Sun, Zhi-xue; Zhang, Xu; Xu, Yi; Yao, Jun; Wang, Hao-xuan; Lv, Shuhuan; Sun, Zhi-lei; Huang, Yong; Cai, Ming-yu; Huang, Xiaoxue

    2017-01-01

    The Enhanced Geothermal System (EGS) creates an artificial geothermal reservoir by hydraulic fracturing, which allows heat transmission through the fractures by the circulating fluids as they extract heat from Hot Dry Rock (HDR). The technique involves a complex thermal–hydraulic–mechanical (THM) coupling process. A numerical approach is presented in this paper to simulate and analyze the heat extraction process in EGS. The reservoir is regarded as fractured porous media consisting of rock matrix blocks and discrete fracture networks. Based on thermal non-equilibrium theory, a mathematical model of the THM coupling process in fractured rock mass is used. The proposed model is validated by comparing it with several analytical solutions. An EGS case from Cooper Basin, Australia is simulated with a 2D stochastically generated fracture model to study the characteristics of fluid flow, heat transfer and mechanical response in the geothermal reservoir. The main parameters controlling the outlet temperature of the EGS are also studied by sensitivity analysis. The results show the significance of taking into account the THM coupling effects when investigating the efficiency and performance of EGS. - Highlights: • EGS reservoir comprising discrete fracture networks and matrix rock is modeled. • A THM coupling model is proposed for simulating the heat extraction in EGS. • The numerical model is validated by comparing with several analytical solutions. • A case study is presented for understanding the main characteristics of EGS. • The THM coupling effects are shown to be significant factors in EGS running performance.

  18. Model validation of solar PV plant with hybrid data dynamic simulation based on fast-responding generator method

    Directory of Open Access Journals (Sweden)

    Zhao Dawei

    2016-01-01

    Full Text Available In recent years, a significant number of large-scale solar photovoltaic (PV) plants have been put into operation or have been under planning around the world. The model accuracy of a solar PV plant is the key factor for investigating the mutual influences between solar PV plants and the power grid. However, this problem has not been well solved, especially with regard to how to apply real measurements to validate the models of solar PV plants. Taking the fast-responding generator method as an example, this paper presents a model validation methodology for solar PV plants via hybrid data dynamic simulation. First, an implementation scheme of hybrid data dynamic simulation suitable for the DIgSILENT PowerFactory software is proposed, and then an analysis model of solar PV plant integration based on the IEEE 9 system is established. Finally, model validation of the solar PV plant is achieved by employing hybrid data dynamic simulation. The results illustrate the effectiveness of the proposed method in solar PV plant model validation.

  19. Advances in Intelligent Modelling and Simulation: Simulation Tools and Applications

    CERN Document Server

    Oplatková, Zuzana; Carvalho, Marco; Kisiel-Dorohinicki, Marek

    2012-01-01

    The human capacity to abstract complex systems and phenomena into simplified models has played a critical role in the rapid evolution of our modern industrial processes and scientific research. As a science and an art, Modelling and Simulation have been one of the core enablers of this remarkable human trait, and have become a topic of great importance for researchers and practitioners. This book was created to compile some of the most recent concepts, advances, challenges and ideas associated with Intelligent Modelling and Simulation frameworks, tools and applications. The first chapter discusses the important aspects of human interaction and the correct interpretation of results during simulations. The second chapter gets to the heart of the analysis of entrepreneurship by means of agent-based modelling and simulations. The following three chapters bring together the central theme of simulation frameworks, first describing an agent-based simulation framework, then a simulator for electrical machines, and...

  20. Fault Diagnosis for a Multistage Planetary Gear Set Using Model-Based Simulation and Experimental Investigation

    Directory of Open Access Journals (Sweden)

    Guoyan Li

    2016-01-01

    Full Text Available Gear damage induces modulation effects in vibration signals. A thorough analysis of the spectral structure of the modulation sidebands is necessary for fault diagnosis of a planetary gear set. However, the spectral characteristics are complicated in practice, especially for a multistage planetary gear set which contains closely spaced frequency components. In this study, a coupled lateral and torsional dynamic model is established to predict the modulation sidebands of a two-stage compound planetary gear set. An improved potential energy method is used to calculate the time-varying mesh stiffness of each gear pair, and the influence of crack propagation on the mesh stiffness is analyzed. The simulated signals of the gear set are obtained by using the Runge-Kutta numerical analysis method. Meanwhile, the sideband characteristics are summarized to exhibit the modulation effects caused by sun gear damage. Finally, the experimental signals collected from an industrial SD16 planetary gearbox are analyzed to verify the theoretical derivations. The experimental results agree well with the simulated analysis.
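
    The basic sideband mechanism discussed above can be reproduced numerically: a gear-mesh tone amplitude-modulated at a fault frequency produces spectral lines at f_mesh ± k·f_fault. The mesh and fault frequencies below are assumed values, not those of the SD16 gearbox.

      import numpy as np

      # Illustration of modulation sidebands: an amplitude-modulated mesh tone shows
      # spectral lines at f_mesh +/- f_fault. Frequencies are assumed values.
      fs, T = 10_000.0, 2.0                      # sampling rate (Hz) and duration (s)
      t = np.arange(0.0, T, 1.0 / fs)
      f_mesh, f_fault = 500.0, 12.5              # mesh and fault frequencies (Hz), assumed

      signal = (1.0 + 0.4 * np.cos(2 * np.pi * f_fault * t)) * np.cos(2 * np.pi * f_mesh * t)
      spectrum = np.abs(np.fft.rfft(signal)) / len(t)
      freqs = np.fft.rfftfreq(len(t), 1.0 / fs)

      # The three largest spectral lines sit at f_mesh and f_mesh +/- f_fault.
      top = freqs[np.argsort(spectrum)[-3:]]
      print(sorted(top))    # -> [487.5, 500.0, 512.5]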

  1. Laboratory-based grain-shape models for simulating dust infrared spectra

    NARCIS (Netherlands)

    Mutschke, H.; Min, M.; Tamanai, A.

    2009-01-01

    Context. Analysis of thermal dust emission spectra for dust mineralogy and physical grain properties depends on comparison spectra, which are either laboratory-measured infrared extinction spectra or calculated extinction cross sections based on certain grain models. Often, the agreement between

  2. Structure Based Modeling of Small Molecules Binding to the TLR7 by Atomistic Level Simulations

    Directory of Open Access Journals (Sweden)

    Francesco Gentile

    2015-05-01

    Full Text Available Toll-Like Receptors (TLRs) are a large family of proteins involved in the immune system response. Both the activation and the inhibition of these receptors can have positive effects on several diseases, including viral pathologies and cancer, therefore prompting the development of new compounds. In order to provide new indications for the design of Toll-Like Receptor 7 (TLR7)-targeting drugs, the mechanism of interaction between TLR7 and two important classes of agonists (imidazoquinoline and adenine derivatives) was investigated through docking and Molecular Dynamics simulations. To perform the computational analysis, a new model for the dimeric form of the receptor was necessary and was therefore created. Qualitative and quantitative differences between agonists and inactive compounds were determined. The in silico results were compared with previous experimental observations and employed to define the ligand binding mechanism of TLR7.

  3. OXlearn: a new MATLAB-based simulation tool for connectionist models.

    Science.gov (United States)

    Ruh, Nicolas; Westermann, Gert

    2009-11-01

    OXlearn is a free, platform-independent MATLAB toolbox in which standard connectionist neural network models can be set up, run, and analyzed by means of a user-friendly graphical interface. Due to its seamless integration with the MATLAB programming environment, the inner workings of the simulation tool can be easily inspected and/or extended using native MATLAB commands or components. This combination of usability, transparency, and extendability makes OXlearn an efficient tool for the implementation of basic research projects or the prototyping of more complex research endeavors, as well as for teaching. Both the MATLAB toolbox and a compiled version that does not require access to MATLAB can be downloaded from http://psych.brookes.ac.uk/oxlearn/.

  4. Dynamic modeling and simulation of sheave damper based on AMESim software

    Directory of Open Access Journals (Sweden)

    BI Ke

    2017-10-01

    Full Text Available [Objectives] Considering the shortcomings of the traditional sheave damper in buffer performance and in the peak value of the greatest cable tension, [Methods] this paper presents a sheave damper whose damping varies with piston displacement as a replacement for the traditional sheave damper, and AMESim software is used for the modeling and simulation. [Results] The results show that the new sheave damper can significantly improve the arresting gear performance indicators and has better adaptability to aircraft impact loads. Compared with the traditional sheave damper, the new method can reduce cable tension by 25% and reduce the maximum deceleration of the aircraft by 23%. [Conclusions] As such, the research in this paper can provide a theoretical reference for improving the performance of aircraft arresting gear.

  5. Simulation-based cutaneous surgical-skill training on a chicken-skin bench model in a medical undergraduate program.

    Science.gov (United States)

    Denadai, Rafael; Saad-Hossne, Rogério; Martinhão Souto, Luís Ricardo

    2013-05-01

    Because of the ethical and medico-legal aspects involved in the training of cutaneous surgical skills on living patients, human cadavers and living animals, the search for alternative and effective forms of training simulation is necessary. To propose and describe an alternative methodology for teaching and learning the principles of cutaneous surgery in a medical undergraduate program by using a chicken-skin bench model. One instructor for every four students, teaching materials on cutaneous surgical skills, chicken trunks, wings, or thighs, a rigid platform support, needled threads, needle holders, surgical blades with scalpel handles, rat-tooth tweezers, scissors, and marking pens were necessary for the training simulation. A proposal for simulation-based training on incision, suture, biopsy, and reconstruction techniques using a chicken-skin bench model, distributed over several sessions and with increasing levels of difficulty, was structured. Both feedback and objective evaluations, always directed to individual students, were also outlined. The teaching of the principles of cutaneous surgery using a chicken-skin bench model that is versatile, portable, easy to assemble, and inexpensive is an alternative and complementary option to the armamentarium of methods based on other bench models described.

  6. Simulation-based cutaneous surgical-skill training on a chicken-skin bench model in a medical undergraduate program

    Directory of Open Access Journals (Sweden)

    Rafael Denadai

    2013-01-01

    Full Text Available Background: Because of the ethical and medico-legal aspects involved in the training of cutaneous surgical skills on living patients, human cadavers and living animals, the search for alternative and effective forms of training simulation is necessary. Aims: To propose and describe an alternative methodology for teaching and learning the principles of cutaneous surgery in a medical undergraduate program by using a chicken-skin bench model. Materials and Methods: One instructor for every four students, teaching materials on cutaneous surgical skills, chicken trunks, wings, or thighs, a rigid platform support, needled threads, needle holders, surgical blades with scalpel handles, rat-tooth tweezers, scissors, and marking pens were necessary for the training simulation. Results: A proposal for simulation-based training on incision, suture, biopsy, and reconstruction techniques using a chicken-skin bench model, distributed over several sessions and with increasing levels of difficulty, was structured. Both feedback and objective evaluations, always directed to individual students, were also outlined. Conclusion: The teaching of the principles of cutaneous surgery using a chicken-skin bench model that is versatile, portable, easy to assemble, and inexpensive is an alternative and complementary option to the armamentarium of methods based on other bench models described.

  7. Modeling of synchrotron-based laboratory simulations of Titan's ionospheric photochemistry

    Science.gov (United States)

    Carrasco, Nathalie; Peng, Zhe; Pernot, Pascal

    2014-11-01

    The APSIS reactor has been designed to simulate in the laboratory with a VUV synchrotron irradiation the photochemistry occurring in planetary upper atmospheres. A N2-CH4 Titan-like gas mixture has been studied, whose photochemistry in Titan's ionospheric irradiation conditions leads to a coupled chemical network involving both radicals and ions. In the present work, an ion-neutral coupled model is developed to interpret the experimental data, taking into account the uncertainties on the kinetic parameters by Monte Carlo sampling. The model predicts species concentrations in agreement with mass spectrometry measurements of the methane consumption and product blocks intensities. Ion chemistry and in particular dissociative recombination are found to be very important through sensitivity analysis. The model is also applied to complementary environmental conditions, corresponding to Titan's ionospheric average conditions and to another existing synchrotron setup. An innovative study of the correlations between species concentrations identifies two main competitive families, leading respectively to saturated and unsaturated species. We find that the unsaturated growth family, driven by C2H2 , is dominant in Titan's upper atmosphere, as observed by the Cassini INMS. But the saturated species are substantially more intense in the measurements of the two synchrotron experimental setups, and likely originate from catalysis by metallic walls of the reactors.

  8. Dual RBFNNs-Based Model-Free Adaptive Control With Aspen HYSYS Simulation.

    Science.gov (United States)

    Zhu, Yuanming; Hou, Zhongsheng; Qian, Feng; Du, Wenli

    2017-03-01

    In this brief, we propose a new data-driven model-free adaptive control (MFAC) method with dual radial basis function neural networks (RBFNNs) for a class of discrete-time nonlinear systems. The main novelty lies in that it provides a systematic design method for controller structure by the direct usage of I/O data, rather than using the first-principle model or offline identified plant model. The controller structure is determined by equivalent-dynamic-linearization representation of the ideal nonlinear controller, and the controller parameters are tuned by the pseudogradient information extracted from the I/O data of the plant, which can deal with the unknown nonlinear system. The stability of the closed-loop control system and the stability of the training process for RBFNNs are guaranteed by rigorous theoretical analysis. Meanwhile, the effectiveness and the applicability of the proposed method are further demonstrated by the numerical example and Aspen HYSYS simulation of distillation column in crude styrene produce process.

  9. Fog Simulations Based on Multi-Model System: A Feasibility Study

    Science.gov (United States)

    Shi, Chune; Wang, Lei; Zhang, Hao; Zhang, Su; Deng, Xueliang; Li, Yaosun; Qiu, Mingyan

    2012-05-01

    Accurate forecasts of fog and visibility are very important to air and highway traffic, and are still a big challenge. A 1D fog model (PAFOG) is coupled to MM5 by obtaining the initial and boundary conditions (IC/BC) and some other necessary input parameters from MM5. Thus, PAFOG can be run for any area of interest. On the other hand, MM5 itself can be used to simulate fog events over a large domain. This paper presents evaluations of the fog predictability of these two systems for December of 2006 and December of 2007, with nine regional fog events observed in a field experiment, as well as over a large domain in eastern China. Among the simulations of the nine fog events by the two systems, two cases were investigated in detail. Daily results of ground-level meteorology were validated against the routine observations of the CMA observational network. Daily fog occurrence for the two study periods was validated in Nanjing. The general performance of the two models for the nine fog cases is presented by comparing with routine and field observational data. The results of MM5 and PAFOG for two typical fog cases are verified in detail against field observations. The verifications demonstrated that all methods tended to overestimate fog occurrence, especially for near-fog cases. In terms of TS/ETS, the LWC-only threshold with MM5 showed the best performance, while PAFOG showed the worst. MM5 performed better for advection-radiation fog than for radiation fog, and PAFOG could be an alternative tool for forecasting radiation fogs. PAFOG did show advantages over MM5 on the fog dissipation time. The performance of PAFOG highly depended on the quality of the MM5 output. The sensitivity runs of PAFOG with different IC/BC showed the capability of using MM5 output to run the 1D model and the high sensitivity of PAFOG to cloud cover. Future works should intensify the study of how to improve the quality of input data (e.g. cloud cover, advection, large scale subsidence) for the 1D

  10. Runoff Simulation in the Upper Reaches of Heihe River Basin Based on the RIEMS–SWAT Model

    Directory of Open Access Journals (Sweden)

    Songbing Zou

    2016-10-01

    Full Text Available For distributed hydrological simulations in complex mountain areas, large amounts of meteorological input parameters with high spatial and temporal resolution are necessary. However, the extreme scarcity and uneven distribution of traditional meteorological observation stations in the cold and arid regions of Northwest China make it very difficult to meet the requirements of hydrological simulations. Alternatively, regional climate models (RCMs), which can provide a variety of distributed meteorological data with high temporal and spatial resolution, have become an effective solution to improve hydrological simulation accuracy and to further study water resource responses to human activities and global climate change. In this study, abundant and evenly distributed virtual weather stations in the upper reaches of the Heihe River Basin (HRB) of Northwest China were built for the optimization of the input data, and thus a regional integrated environmental model system (RIEMS) based on an RCM and the distributed hydrological model Soil and Water Assessment Tool (SWAT) were integrated as a coupled climate–hydrological RIEMS-SWAT model, which was applied to simulate monthly runoff from 1995 to 2010 in the region. Results show that the simulated and observed values are close; the Nash–Sutcliffe efficiency is higher than 0.65; determination coefficient (R²) values are higher than 0.70; percent bias is controlled within ±20%; and the root-mean-square-error-observation standard deviation ratio is less than 0.65. These results indicate that the coupled model can represent basin hydrological processes properly and provide scientific support for the prediction and management of basin water resources.
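
    The goodness-of-fit statistics quoted above follow standard definitions and are straightforward to compute; a short sketch is given below, where the observed and simulated runoff arrays are placeholder values rather than data from the study.

      import numpy as np

      # Standard hydrological goodness-of-fit statistics for observed vs simulated runoff.
      def nse(obs, sim):
          """Nash-Sutcliffe efficiency: 1 - sum((obs-sim)^2) / sum((obs-mean(obs))^2)."""
          obs, sim = np.asarray(obs, float), np.asarray(sim, float)
          return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

      def pbias(obs, sim):
          """Percent bias: 100 * sum(obs - sim) / sum(obs); positive means underestimation."""
          obs, sim = np.asarray(obs, float), np.asarray(sim, float)
          return 100.0 * np.sum(obs - sim) / np.sum(obs)

      def rsr(obs, sim):
          """RMSE-observations standard deviation ratio."""
          obs, sim = np.asarray(obs, float), np.asarray(sim, float)
          return np.sqrt(np.mean((obs - sim) ** 2)) / obs.std()

      obs = [12.0, 30.5, 80.2, 150.0, 95.3, 40.1]      # placeholder observed runoff (m^3/s)
      sim = [10.8, 35.0, 75.0, 140.2, 101.0, 44.0]     # placeholder simulated runoff (m^3/s)
      print(f"NSE = {nse(obs, sim):.2f}, PBIAS = {pbias(obs, sim):.1f} %, RSR = {rsr(obs, sim):.2f}")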

  11. Simulation Package based on Placet

    CERN Document Server

    D'Amico, T E; Leros, Nicolas; Schulte, Daniel

    2001-01-01

    The program PLACET is used to simulate transverse and longitudinal beam effects in the main linac, the drive-beam accelerator and the drive-beam decelerators of CLIC, as well as in the linac of CTF3. It provides different models of accelerating and decelerating structures, linear optics and thin multipoles. Several methods of beam-based alignment, including emittance tuning bumps and feedback, and different failure modes can be simulated. An interface to the beam-beam simulation code GUINEA-PIG exists. Currently, interfaces to MAD and TRANSPORT are under development, and an extension to transfer lines and bunch compressors is also being made. In the future, the simulations will need to be performed by many users, which requires a simplified user interface. The paper describes the status of PLACET and plans for the future.

  12. Land Surface Model and Particle Swarm Optimization Algorithm Based on the Model-Optimization Method for Improving Soil Moisture Simulation in a Semi-Arid Region.

    Science.gov (United States)

    Yang, Qidong; Zuo, Hongchao; Li, Weidong

    2016-01-01

    Improving the capability of land-surface process models to simulate soil moisture assists in better understanding the atmosphere-land interaction. In semi-arid regions, due to limited near-surface observational data and large errors in large-scale parameters obtained by remote sensing methods, there exist uncertainties in land surface parameters, which can cause large offsets between the simulated results of land-surface process models and the observational data for soil moisture. In this study, observational data from the Semi-Arid Climate Observatory and Laboratory (SACOL) station in the semi-arid loess plateau of China were divided into three datasets: summer, autumn, and summer-autumn. By combining the particle swarm optimization (PSO) algorithm and the land-surface process model SHAW (Simultaneous Heat and Water), the soil and vegetation parameters that are related to soil moisture but difficult to obtain by observation were optimized using the three datasets. On this basis, the SHAW model was run with the optimized parameters to simulate the characteristics of the land-surface process in the semi-arid loess plateau. Simultaneously, the default SHAW model was run with the same atmospheric forcing as a comparison test. Simulation results revealed the following: parameters optimized by the particle swarm optimization algorithm improved the simulations of soil moisture and latent heat flux in all simulation tests; differences between simulated results and observational data are clearly reduced, but simulation tests adopting optimized parameters cannot simultaneously improve the simulation results for the net radiation, sensible heat flux, and soil temperature. Optimized soil and vegetation parameters based on different datasets have the same order of magnitude but are not identical; soil parameters vary only to a small degree, but the variation range of vegetation parameters is large.
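
    A minimal PSO sketch of the kind used above to calibrate model parameters is shown below; here it simply minimises a stand-in objective rather than the SHAW-versus-observation error, and the swarm size, inertia and acceleration coefficients are common default choices rather than the authors' settings.

      import numpy as np

      # Minimal particle swarm optimization sketch. The objective is a stand-in for
      # the model-vs-observation error; PSO coefficients are common default choices.
      rng = np.random.default_rng(7)

      def objective(x):
          return np.sum((x - 0.3) ** 2, axis=-1)

      n_particles, n_dims, iters = 30, 4, 200
      lo, hi = 0.0, 1.0                     # parameter bounds (normalised), assumed
      x = rng.uniform(lo, hi, (n_particles, n_dims))
      v = np.zeros_like(x)
      pbest, pbest_val = x.copy(), objective(x)
      gbest = pbest[np.argmin(pbest_val)].copy()

      w, c1, c2 = 0.72, 1.49, 1.49          # inertia and acceleration coefficients
      for _ in range(iters):
          r1, r2 = rng.random(x.shape), rng.random(x.shape)
          v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
          x = np.clip(x + v, lo, hi)
          val = objective(x)
          improved = val < pbest_val
          pbest[improved], pbest_val[improved] = x[improved], val[improved]
          gbest = pbest[np.argmin(pbest_val)].copy()

      print(f"best parameters: {np.round(gbest, 3)}, objective: {pbest_val.min():.2e}")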

  13. Mathematical modeling based evaluation and simulation of boron removal in bioelectrochemical systems

    Energy Technology Data Exchange (ETDEWEB)

    Ping, Qingyun [Department of Civil and Environmental Engineering, Virginia Polytechnic Institute and State University, Blacksburg, VA 24061 (United States); Abu-Reesh, Ibrahim M. [Department of Chemical Engineering, College of Engineering, Qatar University, P.O. Box 2713, Doha (Qatar); He, Zhen, E-mail: zhenhe@vt.edu [Department of Civil and Environmental Engineering, Virginia Polytechnic Institute and State University, Blacksburg, VA 24061 (United States)

    2016-11-01

    Boron removal is an arising issue in desalination plants due to boron's toxicity. As an emerging treatment concept, bioelectrochemical systems (BES) can achieve potentially cost-effective boron removal by taking advantage of cathodically produced alkali. Prior studies have demonstrated successful removal of boron in microbial desalination cells (MDCs) and microbial fuel cells (MFCs), both of which are representative BES. Herein, mathematical models were developed to further evaluate boron removal by different BES and understand the key operating factors. The models delivered very good prediction of the boron concentration in the MDC integrated with a Donnan Dialysis (DD) system, with the lowest relative root-mean-square error (RMSE) of 0.00%; the prediction of the MFC performance generated the highest RMSE of 18.55%. The model results for salt concentration, solution pH, and current generation were well fitted with experimental data, with RMSE values mostly below 10%. The long-term simulation of the MDC-DD system suggests that the accumulation of salt in the catholyte/stripping solution could have a positive impact on the removal of boron due to osmosis-driven convection. The current generation in the MDC may have little influence on boron removal, while in the MFC the current-driven electromigration can contribute up to 40% of boron removal. Osmosis-induced convection transport of boron could be the major driving force for boron removal to a low level < 2 mg L⁻¹. The ratio between the anolyte and the catholyte flow rates should be kept > 22.2 in order to avoid boron accumulation in the anolyte effluent. - Highlights: • Mathematical models are developed to understand boron removal in BES. • Boron removal can be driven by electromigration induced by current generation. • Diffusion induced by a salt concentration gradient also contributes to boron removal. • Osmosis and current driven convection transport play diverse roles in different BES.

  14. Simulating Durum Wheat (Triticum turgidum L.) Response to Root Zone Salinity based on Statistics and Macroscopic Models

    Directory of Open Access Journals (Sweden)

    Vahid Reza Jalali

    2017-10-01

    water was taken from Maharlu Lake, Fars province, Iran. This natural and highly saline water, with an electrical conductivity of 512 dS/m, was diluted with fresh water to obtain the designated saline waters required for the experimental treatments. The experimental treatments consisted of non-saline water and five salinity levels of 2, 4, 6, 8 and 10 dS/m, with three replicates. Three statistics, the modified coefficient of efficiency (E'), the modified index of agreement (d') and the coefficient of residual mass (CRM), were used to compare the models and to assess their performance. Results and discussion Comparing the relative performance of the models based on the statistical indices E' and d' indicated that the nonlinear model of Homaee et al. is the most accurate among the process-physical models and the Modified Gompertz Function is the most accurate among the statistical-experimental models. Comparative assessment of all models based on the statistical indices indicated that the Homaee et al. model was the most accurate for simulating durum wheat yield. Moreover, the parameters of the Homaee et al. equation have a well-defined meaning and are easily measurable, whereas in the statistical-experimental models the parameters have no biophysical meaning and their absolute values do not convey any information about the development status of the plant. The nonlinear model of Homaee et al. was therefore chosen as the optimal model in this research. Conclusion Most plants, such as wheat, are sensitive to salinity, and their sensitivity to salinity decreases with age. Based on the results of this study, by knowing and quantitatively assessing the sensitivity of the dominant cultivars of each region, as well as using appropriate simulation models, one can use brackish or saline waters to partly compensate for fresh water shortage in scientific and extension agricultural programs.
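    For reference, the three goodness-of-fit statistics named above have standard (Legates-and-McCabe-style) definitions; the short sketch below computes them for arbitrary observed/simulated yield vectors. The sample numbers are invented for illustration and are not the experimental data of this study.

```python
import numpy as np

def modified_efficiency(obs, sim):
    # E' (j = 1): 1 - sum|O - P| / sum|O - mean(O)|
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum(np.abs(obs - sim)) / np.sum(np.abs(obs - obs.mean()))

def modified_agreement(obs, sim):
    # d' (j = 1): 1 - sum|O - P| / sum(|P - mean(O)| + |O - mean(O)|)
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    denom = np.sum(np.abs(sim - obs.mean()) + np.abs(obs - obs.mean()))
    return 1.0 - np.sum(np.abs(obs - sim)) / denom

def crm(obs, sim):
    # Coefficient of residual mass: positive values indicate underestimation.
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return (obs.sum() - sim.sum()) / obs.sum()

# Illustrative (invented) relative yields at increasing salinity levels.
observed  = [1.00, 0.95, 0.84, 0.70, 0.55, 0.41]
simulated = [0.98, 0.93, 0.86, 0.68, 0.57, 0.44]
print("E' =", round(modified_efficiency(observed, simulated), 3),
      " d' =", round(modified_agreement(observed, simulated), 3),
      " CRM =", round(crm(observed, simulated), 3))
```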

  15. Profile control simulations and experiments on TCV: a controller test environment and results using a model-based predictive controller

    Science.gov (United States)

    Maljaars, E.; Felici, F.; Blanken, T. C.; Galperti, C.; Sauter, O.; de Baar, M. R.; Carpanese, F.; Goodman, T. P.; Kim, D.; Kim, S. H.; Kong, M.; Mavkov, B.; Merle, A.; Moret, J. M.; Nouailletas, R.; Scheffer, M.; Teplukhina, A. A.; Vu, N. M. T.; The EUROfusion MST1-team; The TCV-team

    2017-12-01

    The successful performance of a model predictive profile controller is demonstrated in simulations and experiments on the TCV tokamak, employing a profile controller test environment. Stable high-performance tokamak operation in hybrid and advanced plasma scenarios requires control over the safety factor profile (q-profile) and kinetic plasma parameters such as the plasma beta. This demands that reliable profile control routines be established in presently operational tokamaks. We present a model predictive profile controller that controls the q-profile and plasma beta using power requests to two clusters of gyrotrons and the plasma current request. The performance of the controller is analyzed in both simulations and TCV L-mode discharges, where successful tracking of the estimated inverse q-profile as well as plasma beta is demonstrated under uncertain plasma conditions and in the presence of disturbances. The controller exploits knowledge of the time-varying actuator limits in the actuator input calculation itself, such that fast transitions between targets are achieved without overshoot. A software environment is employed to prepare and test this and three other profile controllers in parallel in simulations and experiments on TCV. This set of tools includes the rapid plasma transport simulator RAPTOR and various algorithms to reconstruct the plasma equilibrium and plasma profiles by merging the available measurements with model-based predictions. In this work the estimated q-profile is based solely on RAPTOR model predictions due to the absence of internal current density measurements in TCV. These results encourage further exploitation of model predictive profile control in experiments on TCV and other (future) tokamaks.
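    As a toy illustration of the receding-horizon idea described here (actuator limits handled inside the input calculation itself), the sketch below steers a deliberately simplified scalar plant toward a reference by solving a small box-constrained optimization at every step and applying only the first input. The dynamics, horizon, cost weights, and limits are invented for illustration and bear no relation to the RAPTOR-based controller or TCV actuators.

```python
import numpy as np
from scipy.optimize import minimize

# Toy scalar plant x[k+1] = a*x[k] + b*u[k]; all numbers are illustrative only.
a, b = 0.9, 0.5
horizon, u_max, ref = 10, 1.0, 2.0

def predicted_cost(u_seq, x0):
    # Quadratic tracking cost plus a small input penalty over the horizon.
    x, cost = x0, 0.0
    for u in u_seq:
        x = a * x + b * u
        cost += (x - ref) ** 2 + 0.01 * u ** 2
    return cost

def mpc_step(x0):
    # Solve the finite-horizon problem with hard actuator limits, then apply
    # only the first input (receding horizon).
    res = minimize(predicted_cost, np.zeros(horizon), args=(x0,),
                   bounds=[(-u_max, u_max)] * horizon, method="L-BFGS-B")
    return res.x[0]

x = 0.0
for k in range(15):
    u = mpc_step(x)
    x = a * x + b * u
    print(f"step {k:2d}  u = {u:+.3f}  x = {x:.3f}")
```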

  16. Modeling of phosphorus loads in sugarcane in a low-relief landscape using ontology-based simulation.

    Science.gov (United States)

    Kwon, Ho-Young; Grunwald, Sabine; Beck, Howard W; Jung, Yunchul; Daroub, Samira H; Lang, Timothy A; Morgan, Kelly T

    2010-01-01

    Water flow and P dynamics in a low-relief landscape manipulated by extensive canal and ditch drainage systems were modeled utilizing an ontology-based simulation model. In the model, soil water flux and the processes between three soil inorganic P pools (labile, active, and stable) and organic P are represented as database objects, and user-defined relationships among objects are used to automatically generate computer code (Java) for running the simulation of discharge and P loads. Our objectives were to develop ontology-based descriptions of soil P dynamics within sugarcane- (Saccharum officinarum L.) grown farm basins of the Everglades Agricultural Area (EAA) and to calibrate and validate such processes with water quality monitoring data collected at one farm basin (1244 ha). In the calibration phase (water year [WY] 99-00), observed discharge totaled 11,114 m3 ha(-1) and dissolved P 0.23 kg P ha(-1); in the validation phase (WY 02-03), discharge was 10,397 m3 ha(-1) and dissolved P 0.11 kg P ha(-1). During WY 99-00 the root mean square error (RMSE) for monthly discharge was 188 m3 ha(-1) and for monthly dissolved P 0.0077 kg P ha(-1), whereas during WY 02-03 the RMSE for monthly discharge was 195 m3 ha(-1) and for monthly dissolved P 0.0022 kg P ha(-1). These results were confirmed by Nash-Sutcliffe coefficients of 0.69 (calibration) and 0.81 (validation) comparing measured and simulated P loads. The good model performance suggests that our model has promise to simulate P dynamics and may be useful as a management tool to reduce P loads in other similar low-relief areas.
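    The two skill scores quoted in this record are straightforward to reproduce; the sketch below computes the monthly RMSE and the Nash-Sutcliffe efficiency for arbitrary observed and simulated monthly loads. The sample values are illustrative placeholders, not the EAA monitoring data.

```python
import numpy as np

def rmse(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return np.sqrt(np.mean((obs - sim) ** 2))

def nash_sutcliffe(obs, sim):
    # NSE = 1 - sum((O - S)^2) / sum((O - mean(O))^2); 1.0 is a perfect fit.
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Illustrative monthly dissolved-P loads (kg P/ha), not measured values.
obs = [0.002, 0.004, 0.010, 0.030, 0.060, 0.050, 0.030, 0.020, 0.010, 0.005, 0.002, 0.001]
sim = [0.003, 0.005, 0.012, 0.026, 0.055, 0.052, 0.033, 0.018, 0.009, 0.006, 0.003, 0.001]
print("RMSE:", round(rmse(obs, sim), 5), " NSE:", round(nash_sutcliffe(obs, sim), 3))
```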

  17. The Trick Simulation Toolkit: A NASA/Open source Framework for Running Time Based Physics Models

    Science.gov (United States)

    Penn, John M.; Lin, Alexander S.

    2016-01-01

    This paper describes the design and use of the Trick Simulation Toolkit, a simulation development environment for creating high-fidelity training and engineering simulations at the NASA Johnson Space Center and many other NASA facilities. It describes Trick's design goals and how the development environment attempts to achieve those goals. It describes how Trick is used in some of the many training and engineering simulations at NASA. Finally, it describes the Trick NASA/Open source project on Github.

  18. Linking Bayesian and agent-based models to simulate complex social-ecological systems in semi-arid regions

    Directory of Open Access Journals (Sweden)

    Aloah J Pope

    2015-08-01

    Full Text Available Interdependencies of ecologic, hydrologic, and social systems challenge traditional approaches to natural resource management in semi-arid regions. As a complex social-ecological system, water demands in the Sonoran Desert from agricultural and urban users often conflict with water needs for its ecologically significant riparian corridors. To explore this system, we developed an agent-based model to simulate complex feedbacks between human decisions and environmental conditions in the Rio Sonora Watershed. Cognitive mapping in conjunction with stakeholder participation produced a Bayesian model of conditional probabilities of local human decision-making processes resulting in changes in water demand. Probabilities created in the Bayesian model were incorporated into the agent-based model, so that each agent had a unique probability of making a positive decision based on its perceived environment at each point in time and space. By using a Bayesian approach, uncertainty in the human decision-making process could be incorporated. The spatially explicit agent-based model simulated changes in depth-to-groundwater caused by well pumping based on an agent's water demand. Changes in depth-to-groundwater feed back to influence agent behavior, as well as determine unique vegetation classes within the riparian corridor. Each vegetation class then provides varying stakeholder-defined quality values of ecosystem services. Using this modeling approach allowed us to examine effects on both the ecological and social systems of semi-arid riparian corridors under various scenarios. The insight provided by the model contributes to understanding how specific interventions may alter the complex social-ecological system in the future.
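    The coupling described here, in which each agent draws a decision from a conditional probability supplied by a stakeholder-derived Bayesian model and the decision feeds back into groundwater depth, can be sketched very compactly. The probability table, agent class, and groundwater response below are invented placeholders rather than the Rio Sonora model itself.

```python
import random

random.seed(1)

# Hypothetical conditional probabilities of an agent deciding to pump,
# given its perceived environment (stand-in for the Bayesian network).
P_PUMP = {("dry", "shallow"): 0.9, ("dry", "deep"): 0.6,
          ("wet", "shallow"): 0.4, ("wet", "deep"): 0.1}

class FarmerAgent:
    def __init__(self, depth_to_groundwater):
        self.depth = depth_to_groundwater  # metres, illustrative

    def perceive(self, climate):
        depth_class = "shallow" if self.depth < 10 else "deep"
        return climate, depth_class

    def step(self, climate):
        # Draw a pumping decision from the conditional probability table,
        # then feed the decision back into depth-to-groundwater.
        if random.random() < P_PUMP[self.perceive(climate)]:
            self.depth += 0.5   # pumping lowers the water table
        else:
            self.depth -= 0.1   # slight recovery when the agent abstains

agents = [FarmerAgent(random.uniform(5, 15)) for _ in range(100)]
for year in range(20):
    climate = "dry" if year % 3 else "wet"   # toy climate sequence
    for agent in agents:
        agent.step(climate)

mean_depth = sum(a.depth for a in agents) / len(agents)
print(f"mean depth to groundwater after 20 years: {mean_depth:.1f} m")
```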

  19. Autogenerator-based modelling framework for development of strategic games simulations: rational pigs game extended.

    Science.gov (United States)

    Fabac, Robert; Radošević, Danijel; Magdalenić, Ivan

    2014-01-01

    When considering strategic games from a conceptual perspective that focuses on questions of the participants' decision-making rationality, the issues of modelling and simulation are rarely discussed. The well-known Rational Pigs matrix game has been relatively intensively analyzed in terms of reassessing the logic of two players involved in asymmetric situations as gluttons that differ significantly in their attributes. This paper presents a successful attempt at using an autogenerator for creating the framework of the game, including the predefined scenarios and corresponding payoffs. The autogenerator offers flexibility concerning the specification of game parameters, which consist of variations in the number of simultaneous players and their features, game objects and their attributes, as well as some general game characteristics. In the proposed approach the autogenerator model was upgraded so as to enable program specification updates. For the purpose of treating more complex strategic scenarios, we created the Rational Pigs Game Extended (RPGE), in which the introduction of a third glutton entails significant structural changes. In addition, due to the existence of particular attributes of the new player, "the tramp," one equilibrium point from the original game is destabilized, which has an influence on the decision-making of rational players.
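    Although the extended three-player game itself is not reproduced here, the underlying bookkeeping, enumerating strategy profiles and checking which survive as equilibria, is easy to sketch. The two-player payoff matrix below is a generic illustration with invented values, not the Rational Pigs or RPGE payoffs.

```python
import itertools

# Illustrative 2x2 bimatrix game: payoffs[row][col] = (row player, column player).
payoffs = [[(4, 2), (2, 3)],
           [(6, -1), (0, 0)]]

def pure_nash_equilibria(payoffs):
    n_rows, n_cols = len(payoffs), len(payoffs[0])
    equilibria = []
    for r, c in itertools.product(range(n_rows), range(n_cols)):
        row_payoff, col_payoff = payoffs[r][c]
        # A profile is an equilibrium if neither player gains by deviating alone.
        row_ok = all(payoffs[r2][c][0] <= row_payoff for r2 in range(n_rows))
        col_ok = all(payoffs[r][c2][1] <= col_payoff for c2 in range(n_cols))
        if row_ok and col_ok:
            equilibria.append((r, c))
    return equilibria

print("pure-strategy equilibria (row, column):", pure_nash_equilibria(payoffs))
```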

  20. Correspondence model-based 4D VMAT dose simulation for analysis of local metastasis recurrence after extracranial SBRT

    Science.gov (United States)

    Sothmann, T.; Gauer, T.; Wilms, M.; Werner, R.

    2017-12-01

    The purpose of this study is to introduce a novel approach to incorporate patient-specific breathing variability information into 4D dose simulation of volumetric arc therapy (VMAT)-based stereotactic body radiotherapy (SBRT) of extracranial metastases. Feasibility of the approach is illustrated by application to treatment planning and motion data of lung and liver metastasis patients. The novel 4D dose simulation approach makes use of a regression-based correspondence model that allows representing patient motion variability by breathing signal-steered interpolation and extrapolation of deformable image registration motion fields. To predict the internal patient motion during treatment with only external breathing signal measurements being available, the patients’ internal motion information and external breathing signals acquired during 4D CT imaging were correlated. Combining the correspondence model, patient-specific breathing signal measurements during treatment and time-resolved information about dose delivery, reconstruction of a motion variability-affected dose becomes possible. As a proof of concept, the proposed approach is illustrated by a retrospective 4D simulation of VMAT-based SBRT treatment of ten patients with 15 treated lung and liver metastases and known clinical endpoints for the individual metastases (local metastasis recurrence yes/no). Resulting 4D-simulated dose distributions were compared to motion-affected dose distributions estimated by standard 4D CT-only dose accumulation and the originally (i.e. statically) planned dose distributions by means of GTV D98 indices (dose to 98% of the GTV volume). A potential linkage of metastasis-specific endpoints to differences between GTV D98 indices of planned and 4D-simulated dose distributions was analyzed.

  1. System-Level Modelling and Simulation of MEMS-Based Sensors

    DEFF Research Database (Denmark)

    Virk, Kashif M.; Madsen, Jan; Shafique, Mohammad

    2005-01-01

    The growing complexity of MEMS devices and their increased use in embedded systems (e.g., wireless integrated sensor networks) demands a disciplined approach to MEMS design as well as the development of techniques for system-level modeling of these devices, so that a seamless integration with the existing embedded system design methodologies is possible. In this paper, we present a MEMS design methodology that uses a VHDL-AMS based system-level model of a MEMS device as a starting point and combines the top-down and bottom-up design approaches for design, verification, and optimization...

  2. A Satellite-Based Model for Simulating Ecosystem Respiration in the Tibetan and Inner Mongolian Grasslands

    Directory of Open Access Journals (Sweden)

    Rong Ge

    2018-01-01

    Full Text Available It is important to accurately evaluate ecosystem respiration (RE) in the alpine grasslands of the Tibetan Plateau and the temperate grasslands of the Inner Mongolian Plateau, as it serves as a sensitivity indicator of regional and global carbon cycles. Here, we combined flux measurements taken between 2003 and 2013 from 16 grassland sites across northern China and the corresponding MODIS land surface temperature (LST), enhanced vegetation index (EVI), and land surface water index (LSWI) to build a satellite-based model to estimate RE at a regional scale. First, the dependencies of both spatial and temporal variations of RE on these biotic and climatic factors were examined explicitly. We found that plant productivity and moisture, but not temperature, can best explain the spatial pattern of RE in northern China's grasslands, while temperature plays a major role in regulating the temporal variability of RE in the alpine grasslands, and moisture is as important as temperature in the temperate grasslands. However, the moisture effect on RE and the explicit representation of the spatial variation process are often lacking in most of the existing satellite-based RE models. On this basis, we developed a model by comprehensively considering moisture, temperature, and productivity effects on both the temporal and spatial processes of RE, and then we evaluated the model performance. Our results showed that the model explained the observed RE well in both the alpine (R2 = 0.79, RMSE = 0.77 g C m−2 day−1) and temperate grasslands (R2 = 0.75, RMSE = 0.60 g C m−2 day−1). The inclusion of the LSWI as the water-limiting factor substantially improved the model performance in arid and semi-arid ecosystems, and the spatialized basal respiration rate as an indicator for spatial variation largely determined the regional pattern of RE. Finally, the model accurately reproduced the seasonal and inter-annual variations and spatial variability of RE, and it avoided
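    The general structure described here, a basal respiration rate scaled by temperature, moisture, and productivity terms driven by MODIS LST, LSWI, and EVI, can be sketched as below. The functional forms (a Q10 temperature scalar, a linear LSWI scalar) and all coefficients are illustrative assumptions and are not the calibrated model of this study.

```python
import numpy as np

def ecosystem_respiration(lst_c, evi, lswi, r0=1.0, q10=2.0, t_ref=15.0):
    """Toy satellite-driven RE estimate (g C m-2 day-1), illustrative only.

    r0  : assumed basal respiration rate at the reference temperature
    q10 : assumed temperature sensitivity (Q10 formulation)
    """
    f_temp = q10 ** ((lst_c - t_ref) / 10.0)     # temperature scalar
    f_moist = (1.0 + lswi) / 2.0                 # moisture scalar mapped to [0, 1]
    f_prod = np.clip(evi, 0.0, 1.0)              # productivity scalar
    return r0 * f_temp * f_moist * f_prod

# One synthetic growing-season pixel (values invented).
lst = np.array([5.0, 12.0, 18.0, 22.0, 15.0])
evi = np.array([0.15, 0.30, 0.45, 0.40, 0.25])
lswi = np.array([0.05, 0.20, 0.35, 0.25, 0.10])
print(np.round(ecosystem_respiration(lst, evi, lswi), 3))
```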

  3. Plasma modelling and numerical simulation

    International Nuclear Information System (INIS)

    Van Dijk, J; Kroesen, G M W; Bogaerts, A

    2009-01-01

    Plasma modelling is an exciting subject in which virtually all physical disciplines are represented. Plasma models combine the electromagnetic, statistical and fluid dynamical theories that have their roots in the 19th century with the modern insights concerning the structure of matter that were developed throughout the 20th century. The present cluster issue consists of 20 invited contributions, which are representative of the state of the art in plasma modelling and numerical simulation. These contributions provide an in-depth discussion of the major theories and modelling and simulation strategies, and their applications to contemporary plasma-based technologies. In this editorial review, we introduce and complement those papers by providing a bird's eye perspective on plasma modelling and discussing the historical context in which it has surfaced. (editorial review)

  4. Simulation of glacial ocean biogeochemical tracer and isotope distributions based on the PMIP3 suite of climate models

    Science.gov (United States)

    Khatiwala, Samar; Muglia, Juan; Kvale, Karin; Schmittner, Andreas

    2016-04-01

    In the present climate system, buoyancy-forced convection at high latitudes together with internal mixing results in a vigorous overturning circulation whose major component is North Atlantic Deep Water. One of the key questions of climate science is whether this "mode" of circulation persisted during glacial periods, and in particular at the Last Glacial Maximum (LGM; 21000 years before present). Resolving this question is important both for advancing our understanding of the climate system and as a critical test of numerical models' ability to reliably simulate different climates. The observational evidence, based on interpreting geochemical tracers archived in sediments, is conflicting, as are simulations carried out with state-of-the-art climate models (e.g., as part of the PMIP3 suite), which, due to the computational cost involved, do not by and large include biogeochemical and isotope tracers that can be directly compared with proxy data. Here, we apply geochemical observations to evaluate the ability of several realisations of an ocean model driven by atmospheric forcing from the PMIP3 suite of climate models to simulate global ocean circulation during the LGM. This results in a wide range of circulation states that are then used to simulate biogeochemical tracer and isotope (13C, 14C and Pa/Th) distributions using an efficient, "offline" computational scheme known as the transport matrix method (TMM). One of the key advantages of this approach is the use of a uniform set of biogeochemical and isotope parameterizations across all the different circulations based on the PMIP3 models. We compare these simulated distributions to both modern observations and data from LGM ocean sediments to identify similarities and discrepancies between model and data. We find, for example, that when the ocean model is forced with wind stress from the PMIP3 models the radiocarbon age of the deep ocean is systematically younger compared with reconstructions. Changes in
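    The essence of the transport matrix method is that one timestep of offline tracer transport reduces to a sparse matrix-vector product plus local source/sink terms. The minimal sketch below uses a tiny made-up three-box "ocean" as a stand-in for transport matrices extracted from a circulation model; it illustrates only the offline stepping idea, not the biogeochemical or isotope parameterizations of the study.

```python
import numpy as np
from scipy import sparse

# Toy 3-box "ocean": a conservative transport operator (columns sum to 1)
# standing in for matrices extracted from an ocean circulation model.
A = sparse.csr_matrix(np.array([[0.90, 0.05, 0.00],
                                [0.10, 0.90, 0.10],
                                [0.00, 0.05, 0.90]]))

def step(concentration, source, dt=1.0):
    # One offline TMM step: transport (matrix-vector product) + local sources.
    return A @ concentration + dt * source

# Example: a tracer injected continuously into the surface box.
c = np.array([1.0, 0.0, 0.0])
q = np.array([0.01, 0.0, 0.0])
for _ in range(100):
    c = step(c, q)
print("tracer after 100 steps:", np.round(c, 3))
```

    The appeal of the offline scheme is exactly this: once the (sparse) transport operators are stored, many tracer configurations can be integrated cheaply against the same circulation.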

  5. Numerical simulation of transitional flow on a wind turbine airfoil with RANS-based transition model

    Science.gov (United States)

    Zhang, Ye; Sun, Zhengzhong; van Zuijlen, Alexander; van Bussel, Gerard

    2017-09-01

    This paper presents a numerical investigation of transitional flow on the wind turbine airfoil DU91-W2-250 with chord-based Reynolds number Rec = 1.0 × 106. The Reynolds-averaged Navier-Stokes based transition model using laminar kinetic energy concept, namely the k - kL - ω model, is employed to resolve the boundary layer transition. Some ambiguities for this model are discussed and it is further implemented into OpenFOAM-2.1.1. The k - kL - ω model is first validated through the chosen wind turbine airfoil at the angle of attack (AoA) of 6.24° against wind tunnel measurement, where lift and drag coefficients, surface pressure distribution and transition location are compared. In order to reveal the transitional flow on the airfoil, the mean boundary layer profiles in three zones, namely the laminar, transitional and fully turbulent regimes, are investigated. Observation of flow at the transition location identifies the laminar separation bubble. The AoA effect on boundary layer transition over wind turbine airfoil is also studied. Increasing the AoA from -3° to 10°, the laminar separation bubble moves upstream and reduces in size, which is in close agreement with wind tunnel measurement.

  6. Accounting for large deformations in real-time simulations of soft tissues based on reduced-order models.

    Science.gov (United States)

    Niroomandi, S; Alfaro, I; Cueto, E; Chinesta, F

    2012-01-01

    Model reduction techniques have been shown to constitute a valuable tool for real-time simulation in surgical environments and other fields. However, some limitations imposed by real-time constraints have not yet been overcome. One such limitation is the severe time restriction (a resolution frequency of 500 Hz) that precludes the use of Newton-like schemes for solving non-linear models such as those usually employed for modeling biological tissues. In this work we present a technique able to deal with geometrically non-linear models, based on the use of model reduction techniques together with an efficient non-linear solver. Examples of the performance of the technique are given. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
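    While the paper's specific reduction and non-linear solver are not reproduced here, the common starting point of such approaches, building a reduced basis from full-order simulation snapshots by proper orthogonal decomposition, is easy to sketch. The snapshot data below are synthetic, and the POD step shown is a generic ingredient rather than the authors' exact method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic snapshot matrix: each column is a displacement field sampled at
# 1000 nodes; the data are built from 3 underlying modes plus small noise.
n_nodes, n_snapshots, n_true_modes = 1000, 50, 3
modes = rng.standard_normal((n_nodes, n_true_modes))
amplitudes = rng.standard_normal((n_true_modes, n_snapshots))
snapshots = modes @ amplitudes + 1e-3 * rng.standard_normal((n_nodes, n_snapshots))

# Proper orthogonal decomposition: truncated SVD of the snapshot matrix.
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
k = int(np.searchsorted(energy, 0.9999)) + 1   # keep 99.99% of the energy
basis = U[:, :k]                               # reduced basis (n_nodes x k)

# Any full-order state can now be approximated by k generalized coordinates,
# which is what makes 500 Hz update rates plausible at runtime.
state = snapshots[:, 0]
coords = basis.T @ state
err = np.linalg.norm(state - basis @ coords) / np.linalg.norm(state)
print(f"kept {k} modes, relative reconstruction error = {err:.2e}")
```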

  7. The application of dynamic micro-simulation model of urban planning based on multi-agent system

    Science.gov (United States)

    Xu, J.; Shiming, W.

    2012-12-01

    The dynamic micro-simulation model of urban planning based on a multi-agent system is mainly used to measure and predict the impact of policy on urban land use, employment opportunities and real estate prices. The model represents the supply and characteristics of land and of real estate development at a spatial scale. It uses real estate markets as a central organizing focus, with consumer and supplier choices explicitly represented, together with the resulting effects on real estate prices. Tying agents to real estate at specific locations provides a clean accounting of space and its use. Finally, the model produces a map compositing the dynamic demographic distribution and the dynamic employment transfer with geographic spatial data. The data produced by the urban micro-simulation model can provide a useful forecasting reference for scientific urban land use.

  8. Physics-based simulation modeling and optimization of microstructural changes induced by machining and selective laser melting processes in titanium and nickel based alloys

    Science.gov (United States)

    Arisoy, Yigit Muzaffer

    Manufacturing processes may significantly affect the quality of the resultant surfaces and the structural integrity of metal end products. Controlling manufacturing-process-induced changes to a product's surface integrity may improve the fatigue life and overall reliability of the end product. The goal of this study is to model the phenomena that result in microstructural alterations and to improve the surface integrity of manufactured parts by utilizing physics-based process simulations and other computational methods. Two different manufacturing processes, one conventional and one advanced, are studied: machining of titanium and nickel-based alloys, and selective laser melting of nickel-based powder alloys. 3D Finite Element (FE) process simulations are developed, and experimental data that validate these process simulation models are generated to compare against predictions. Computational process modeling and optimization have been performed for machining-induced microstructure, including: i) predicting recrystallization and grain size using FE simulations and the Johnson-Mehl-Avrami-Kolmogorov (JMAK) model, ii) predicting microhardness using non-linear regression models and the Random Forests method, and iii) multi-objective machining optimization for minimizing microstructural changes. Experimental analysis and computational process modeling of selective laser melting have also been conducted, including: i) microstructural analysis of grain sizes and growth directions using SEM imaging and machine learning algorithms, ii) analysis of thermal imaging for spattering, heating/cooling rates and meltpool size, iii) predicting the thermal field, meltpool size, and growth directions via thermal gradients using 3D FE simulations, and iv) predicting localized solidification using the Phase Field method. These computational process models and predictive models, once utilized by industry to optimize process parameters, have the ultimate potential to improve performance of
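    As a point of reference for the recrystallization part of the workflow, the JMAK (Avrami) relation mentioned above has the closed form X = 1 - exp(-k t^n). The sketch below simply evaluates it for invented kinetic constants; the values are not fitted to any alloy in the study.

```python
import numpy as np

def jmak_fraction(t, k, n):
    # Johnson-Mehl-Avrami-Kolmogorov recrystallized volume fraction.
    return 1.0 - np.exp(-k * np.power(t, n))

# Illustrative kinetic constants (not fitted to titanium or nickel alloys here).
k, n = 0.05, 2.0
times = np.linspace(0.0, 15.0, 7)      # arbitrary time units
for t, x in zip(times, jmak_fraction(times, k, n)):
    print(f"t = {t:5.1f}  recrystallized fraction = {x:.3f}")
```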

  9. Microsurgery Workout: A Novel Simulation Training Curriculum Based on Nonliving Models.

    Science.gov (United States)

    Rodriguez, Jose R; Yañez, Ricardo; Cifuentes, Ignacio; Varas, Julian; Dagnino, Bruno

    2016-10-01

    Currently, there are no valid training programs based solely on nonliving models. The authors aimed to develop and validate a microsurgery training program based on nonliving models and assess the transfer of skills to a live rat model. Postgraduate year-3 general surgery residents were assessed in a 17-session program, performing arterial and venous end-to-end anastomosis on ex vivo chicken models. Procedures were recorded and rated by two blinded experts using validated global and specific scales (objective structured assessment of technical skills) and a validated checklist. Operating times and patency rates were assessed. Hand-motion analysis was used to measure economy of movements. After training, residents performed an arterial and venous end-to-end anastomosis on live rats. Results were compared to six experienced surgeons in the same models. Values of p < 0.05 were considered statistically significant. Learning curves were achieved. Ten residents improved their median global and specific objective structured assessment of technical skills scores for artery [10 (range, 8 to 10) versus 28 (range, 27 to 29), p < 0.05; and 8 (range, 7 to 9) versus 28 (range, 27 to 28), p < 0.05] and vein [8 (range, 8 to 11) versus 28 (range, 27 to 28), p < 0.05; and 8 (range, 7 to 9) versus 28 (range, 27 to 29), p < 0.05]. Checklist scores also improved for both procedures (p < 0.05). Trainees were slower and less efficient than experienced surgeons (p < 0.05). In the living rat, patency rates at 30 minutes were 100 percent and 50 percent for artery and vein, respectively. Significant acquisition of microsurgical skills was achieved by trainees to a level similar to that of experienced surgeons. Acquired skills were transferred to a more complex live model.

  10. Mathematical modeling based evaluation and simulation of boron removal in bioelectrochemical systems.

    Science.gov (United States)

    Ping, Qingyun; Abu-Reesh, Ibrahim M; He, Zhen

    2016-11-01

    Boron removal is a growing concern in desalination plants due to boron's toxicity. As an emerging treatment concept, bioelectrochemical systems (BES) can achieve potentially cost-effective boron removal by taking advantage of cathodically produced alkali. Prior studies have demonstrated successful removal of boron in microbial desalination cells (MDCs) and microbial fuel cells (MFCs), both of which are representative BES. Herein, mathematical models were developed to further evaluate boron removal by different BES and to understand the key operating factors. The models delivered very good prediction of the boron concentration in the MDC integrated with a Donnan Dialysis (DD) system, with the lowest relative root-mean-square error (RMSE) of 0.00%; the prediction of the MFC performance generated the highest RMSE of 18.55%. The model results for salt concentration, solution pH, and current generation were well fitted with experimental data, with RMSE values mostly below 10%. The long-term simulation of the MDC-DD system suggests that the accumulation of salt in the catholyte/stripping solution could have a positive impact on the removal of boron due to osmosis-driven convection. The current generation in the MDC may have little influence on the boron removal, while in the MFC the current-driven electromigration can contribute up to 40% of boron removal. Osmosis-induced convection transport of boron could be the major driving force for boron removal to a low level < 2 mg L−1. The ratio between the anolyte and the catholyte flow rates should be kept > 22.2 in order to avoid boron accumulation in the anolyte effluent. Copyright © 2016 Elsevier B.V. All rights reserved.

  11. Designing Citizen Business Loan Model to Reduce Non-Performing Loan: An Agent-based Modeling and Simulation Approach in Regional Development

    Directory of Open Access Journals (Sweden)

    Moses L Singgih

    2015-09-01

    Full Text Available Citizen Business Loan (CBL) constitutes a poverty alleviation program based on the economic empowerment of small and medium enterprises. This study focuses on the implementation of CBL at Regional Development Bank branch X. The problem is the existence of interdependencies between the CBL's implementer (the bank) and the uncertainty of the debtors' capability to repay the credit. The impact of this circumstance is that the non-performing loan (NPL) rate becomes relatively high (22%). The ultimate objective is to minimize NPL by designing an agent-based model that can represent the problem through simulation using agent-based modeling and simulation (ABMS). The model manages the probability that a debtor pays or defaults based on the five C categories inherent to each debtor: character, capacity, capital, condition, and collateral. Two improvement scenarios are proposed in this model. The first scenario involves only the first category of debtors in the simulation and results in an NPL value of 0%. The second scenario includes the first and second debtor categories in the simulation and results in NPL values between 4.6% and 11.4%.
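    A stripped-down version of the simulation logic described here, debtor agents with category-dependent repayment probabilities and NPL computed as the share of loans in arrears, can be sketched as follows. The categories, probabilities, classification rule, and portfolio sizes are placeholders rather than the bank's data or the paper's calibrated model.

```python
import random

random.seed(42)

# Hypothetical monthly repayment probabilities by debtor category
# (stand-ins for the 5C-based assessment described in the paper).
REPAY_PROB = {"first": 0.995, "second": 0.93}

def simulate_npl(categories, months=12):
    missed = [0] * len(categories)
    for _ in range(months):
        for i, cat in enumerate(categories):
            if random.random() >= REPAY_PROB[cat]:   # installment missed
                missed[i] += 1
    # A loan is classified as non-performing here if 3+ installments were missed.
    non_performing = sum(1 for m in missed if m >= 3)
    return non_performing / len(categories)

scenario_1 = ["first"] * 500                      # only first-category debtors
scenario_2 = ["first"] * 300 + ["second"] * 200   # mixed portfolio
print(f"NPL scenario 1: {simulate_npl(scenario_1):.1%}")
print(f"NPL scenario 2: {simulate_npl(scenario_2):.1%}")
```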

  12. SBMLmod: a Python-based web application and web service for efficient data integration and model simulation.

    Science.gov (United States)

    Schäuble, Sascha; Stavrum, Anne-Kristin; Bockwoldt, Mathias; Puntervoll, Pål; Heiland, Ines

    2017-06-24

    Systems Biology Markup Language (SBML) is the standard model representation and description language in systems biology. Enriching and analysing systems biology models by integrating the multitude of available data increases the predictive power of these models. This may be a daunting task, which commonly requires bioinformatic competence and scripting. We present SBMLmod, a Python-based web application and service that automates the integration of high-throughput data into SBML models. Subsequent steady-state analysis is readily accessible via the web service COPASIWS. We illustrate the utility of SBMLmod by integrating gene expression data from different healthy tissues as well as from a cancer dataset into a previously published model of mammalian tryptophan metabolism. SBMLmod is a user-friendly platform for model modification and simulation. The web application is available at http://sbmlmod.uit.no , whereas the WSDL definition file for the web service is accessible via http://sbmlmod.uit.no/SBMLmod.wsdl . Furthermore, the entire package can be downloaded from https://github.com/MolecularBioinformatics/sbml-mod-ws . We envision that SBMLmod will make automated model modification and simulation available to a broader research community.

  13. An ensemble-based dynamic Bayesian averaging approach for discharge simulations using multiple global precipitation products and hydrological models

    Science.gov (United States)

    Qi, Wei; Liu, Junguo; Yang, Hong; Sweetapple, Chris

    2018-03-01

    Global precipitation products are very important datasets in flow simulations, especially in poorly gauged regions. Uncertainties resulting from precipitation products, hydrological models and their combinations vary with time and data magnitude, and undermine their application to flow simulations. However, previous studies have not quantified these uncertainties individually and explicitly. This study developed an ensemble-based dynamic Bayesian averaging approach (e-Bay) for deterministic discharge simulations using multiple global precipitation products and hydrological models. In this approach, the joint probability of precipitation products and hydrological models being correct is quantified based on uncertainties in maximum and mean estimation, posterior probability is quantified as functions of the magnitude and timing of discharges, and the law of total probability is implemented to calculate expected discharges. Six global fine-resolution precipitation products and two hydrological models of different complexities are included in an illustrative application. e-Bay can effectively quantify uncertainties and therefore generate better deterministic discharges than traditional approaches (weighted average methods with equal and varying weights and maximum likelihood approach). The mean Nash-Sutcliffe Efficiency values of e-Bay are up to 0.97 and 0.85 in training and validation periods respectively, which are at least 0.06 and 0.13 higher than traditional approaches. In addition, with increased training data, assessment criteria values of e-Bay show smaller fluctuations than traditional approaches and its performance becomes outstanding. The proposed e-Bay approach bridges the gap between global precipitation products and their pragmatic applications to discharge simulations, and is beneficial to water resources management in ungauged or poorly gauged regions across the world.
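    The core averaging step, weighting each precipitation-product/hydrological-model combination by a posterior probability and taking the expectation, can be sketched in a few lines. The Gaussian likelihood, equal priors, and toy ensemble members below are simplified placeholders for the dynamic, magnitude- and timing-dependent weighting developed in the paper.

```python
import numpy as np

def bayesian_average(member_discharges, observed, prior=None, sigma=1.0):
    """Expected discharge from an ensemble via Bayesian weights.

    member_discharges : (n_members, n_times) simulated discharges
    observed          : (n_times,) training discharges used to update weights
    """
    members = np.asarray(member_discharges, float)
    prior = np.full(len(members), 1.0 / len(members)) if prior is None else prior
    # Gaussian likelihood of each member given the training observations.
    sq_err = np.sum((members - observed) ** 2, axis=1)
    likelihood = np.exp(-0.5 * sq_err / sigma**2)
    posterior = prior * likelihood
    posterior /= posterior.sum()
    return posterior, posterior @ members   # weights, expected discharge series

# Three toy ensemble members (precipitation product x hydrological model).
obs = np.array([10.0, 14.0, 20.0, 16.0, 12.0])
members = np.array([[9.0, 13.5, 21.0, 15.0, 11.0],
                    [12.0, 16.0, 24.0, 19.0, 15.0],
                    [10.5, 14.5, 19.0, 16.5, 12.5]])
weights, expected = bayesian_average(members, obs, sigma=2.0)
print("posterior weights:", np.round(weights, 3))
print("expected discharge:", np.round(expected, 2))
```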

  14. The Central Simulation Committee (CSC): a model for centralization and standardization of simulation-based medical education in the U.S. Army healthcare system.

    Science.gov (United States)

    Deering, Shad; Sawyer, Taylor; Mikita, Jeffrey; Maurer, Douglas; Roth, Bernard J

    2012-07-01

    In this report, we describe the organizational framework, operations and current status of the Central Simulation Committee (CSC). The CSC was established in 2007 with the goals of standardizing simulation-based training in Army graduate medical education programs, assisting in redeployment training of physicians returning from war, and improving patient safety within the Army Medical Department. Presently, the CSC oversees 10 Simulation Centers, controls over 21,000 sq ft of simulation center space, and provides specialty-specific training in 14 medical specialties. In the past 2 years, CSC Simulation Centers have trained over 50,000 Army medical students, residents, physician assistants, nurses, Soldiers and DoD civilian medical personnel. We hope this report provides simulation educators within the military, and our civilian simulation colleagues, with insight into the workings of our organization and provides an example of centralized support and oversight of simulation-based medical education.

  15. Theoretical and Simulations-Based Modeling of Micellization in Linear and Branched Surfactant Systems

    Science.gov (United States)

    Mendenhall, Jonathan D.

    's and other micellization properties for a variety of linear and branched surfactant chemical architectures which are commonly encountered in practice. Single-component surfactant solutions are investigated, in order to clarify the specific contributions of the surfactant head and tail to the free energy of micellization, a quantity which determines the cmc and all other aspects of micellization. First, a molecular-thermodynamic (MT) theory is presented which makes use of bulk-phase thermodynamics and a phenomenological thought process to describe the energetics related to the formation of a micelle from its constituent surfactant monomers. Second, a combined computer-simulation/molecular-thermodynamic (CSMT) framework is discussed which provides a more detailed quantification of the hydrophobic effect using molecular dynamics simulations. A novel computational strategy to identify surfactant head and tail using an iterative dividing surface approach, along with simulated micelle results, is proposed. Force-field development for novel surfactant structures is also discussed. Third, a statistical-thermodynamic, single-chain, mean-field theory for linear and branched tail packing is formulated, which enables quantification of the specific energetic penalties related to confinement and constraint of surfactant tails within micelles. Finally, these theoretical and simulations-based strategies are used to predict the micellization behavior of 55 linear surfactants and 28 branched surfactants. Critical micelle concentration and optimal micelle properties are reported and compared with experiment, demonstrating good agreement across a range of surfactant head and tail types. In particular, the CSMT framework is found to provide improved agreement with experimental cmc's for the branched surfactants considered. (Copies available exclusively from MIT Libraries, libraries.mit.edu/docs - docs mit.edu)

  16. Tyre-road friction coefficient estimation based on tyre sensors and lateral tyre deflection: modelling, simulations and experiments

    Science.gov (United States)

    Hong, Sanghyun; Erdogan, Gurkan; Hedrick, Karl; Borrelli, Francesco

    2013-05-01

    The estimation of the tyre-road friction coefficient is fundamental for vehicle control systems. Tyre sensors enable the friction coefficient estimation based on signals extracted directly from tyres. This paper presents a tyre-road friction coefficient estimation algorithm based on tyre lateral deflection obtained from lateral acceleration. The lateral acceleration is measured by wireless three-dimensional accelerometers embedded inside the tyres. The proposed algorithm first determines the contact patch using a radial acceleration profile. Then, the portion of the lateral acceleration profile, only inside the tyre-road contact patch, is used to estimate the friction coefficient through a tyre brush model and a simple tyre model. The proposed strategy accounts for orientation-variation of accelerometer body frame during tyre rotation. The effectiveness and performance of the algorithm are demonstrated through finite element model simulations and experimental tests with small tyre slip angles on different road surface conditions.

  17. Comparing the Performance of Commonly Available Digital Elevation Models in GIS-based Flood Simulation

    Science.gov (United States)

    Ybanez, R. L.; Lagmay, A. M. A.; David, C. P.

    2016-12-01

    With climatological hazards increasing globally, the Philippines is listed as one of the most vulnerable countries in the world due to its location in the Western Pacific. Flood hazard mapping and modelling is one of the responses by local governments and research institutions to help prepare for and mitigate the effects of floods that constantly threaten towns and cities in floodplains during the six-month rainy season. Available digital elevation models, which serve as the most important dataset in 2D flood modelling, are limited in the Philippines, and testing is needed to determine which of the few would work best for flood hazard mapping and modelling. Two-dimensional GIS-based flood modelling with the flood-routing software FLO-2D was conducted using three available DEMs: the ASTER GDEM, the SRTM GDEM, and the locally available IfSAR DTM. With all other parameters, such as resolution, soil parameters, rainfall amount, and surface roughness, kept uniform, the three models were run over a 129 km2 watershed with only the base map varying. The output flood hazard maps were compared on the basis of their flood distribution, extent, and depth. The ASTER and SRTM GDEMs contained too much error and noise, which manifested as dissipated and dissolved hazard areas in the lower watershed where clearly delineated flood hazards should be present. Noise in the two datasets is clearly visible as erratic mounds in the floodplain. The only dataset that produced a feasible flood hazard map is the IfSAR DTM, which delineates flood hazard areas clearly and properly. Despite the published resolution and accuracy of ASTER and SRTM, their use in GIS-based flood modelling would be unreliable. Although not as accessible, only IfSAR or better datasets should be used for creating secondary products from these base DEMs. For developing countries which are most prone to hazards, but with limited choices for basemaps used in hazards

  18. Simulation - modeling - experiment

    International Nuclear Information System (INIS)

    2004-01-01

    After two workshops held in 2001 on the same topics, and in order to make a status of the advances in the domain of simulation and measurements, the main goals proposed for this workshop are: the presentation of the state-of-the-art of tools, methods and experiments in the domains of interest of the Gedepeon research group, the exchange of information about the possibilities of use of computer codes and facilities, about the understanding of physical and chemical phenomena, and about development and experiment needs. This document gathers 18 presentations (slides) among the 19 given at this workshop and dealing with: the deterministic and stochastic codes in reactor physics (Rimpault G.); MURE: an evolution code coupled with MCNP (Meplan O.); neutronic calculation of future reactors at EdF (Lecarpentier D.); advance status of the MCNP/TRIO-U neutronic/thermal-hydraulics coupling (Nuttin A.); the FLICA4/TRIPOLI4 thermal-hydraulics/neutronics coupling (Aniel S.); methods of disturbances and sensitivity analysis of nuclear data in reactor physics, application to VENUS-2 experimental reactor (Bidaud A.); modeling for the reliability improvement of an ADS accelerator (Biarotte J.L.); residual gas compensation of the space charge of intense beams (Ben Ismail A.); experimental determination and numerical modeling of phase equilibrium diagrams of interest in nuclear applications (Gachon J.C.); modeling of irradiation effects (Barbu A.); elastic limit and irradiation damage in Fe-Cr alloys: simulation and experiment (Pontikis V.); experimental measurements of spallation residues, comparison with Monte-Carlo simulation codes (Fallot M.); the spallation target-reactor coupling (Rimpault G.); tools and data (Grouiller J.P.); models in high energy transport codes: status and perspective (Leray S.); other ways of investigation for spallation (Audoin L.); neutrons and light particles production at intermediate energies (20-200 MeV) with iron, lead and uranium targets (Le Colley F

  19. Model-Based Control of a Nonlinear Aircraft Engine Simulation using an Optimal Tuner Kalman Filter Approach

    Science.gov (United States)

    Connolly, Joseph W.; Csank, Jeffrey Thomas; Chicatelli, Amy; Kilver, Jacob

    2013-01-01

    This paper covers the development of a model-based engine control (MBEC) methodology featuring a self tuning on-board model applied to an aircraft turbofan engine simulation. Here, the Commercial Modular Aero-Propulsion System Simulation 40,000 (CMAPSS40k) serves as the MBEC application engine. CMAPSS40k is capable of modeling realistic engine performance, allowing for a verification of the MBEC over a wide range of operating points. The on-board model is a piece-wise linear model derived from CMAPSS40k and updated using an optimal tuner Kalman Filter (OTKF) estimation routine, which enables the on-board model to self-tune to account for engine performance variations. The focus here is on developing a methodology for MBEC with direct control of estimated parameters of interest such as thrust and stall margins. Investigations using the MBEC to provide a stall margin limit for the controller protection logic are presented that could provide benefits over a simple acceleration schedule that is currently used in traditional engine control architectures.
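    The tuner-estimation step at the heart of such self-tuning on-board models is a Kalman filter. The sketch below shows a generic linear Kalman predict/update cycle for a small made-up two-state system in which only one state is measured and the other (a stand-in "tuner" parameter) must be inferred; it is not the CMAPSS40k model or the optimal tuner selection itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear system standing in for the piece-wise linear on-board model:
# state = [measured output bias, unmeasured tuner parameter].
A = np.array([[1.0, 0.1],
              [0.0, 1.0]])
H = np.array([[1.0, 0.0]])       # only the first state is sensed
Q = np.diag([1e-4, 1e-5])        # process noise covariance
R = np.array([[1e-2]])           # measurement noise covariance

x_true = np.array([0.0, 0.5])    # true (unknown) tuner value is 0.5
x_est = np.zeros(2)
P = np.eye(2)

for _ in range(200):
    x_true = A @ x_true
    z = H @ x_true + rng.normal(0.0, np.sqrt(R[0, 0]), 1)   # noisy sensor

    # Predict
    x_est = A @ x_est
    P = A @ P @ A.T + Q
    # Update
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x_est = x_est + (K @ (z - H @ x_est)).ravel()
    P = (np.eye(2) - K @ H) @ P

print("estimated tuner parameter:", round(x_est[1], 3))
```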

  20. Sensitivity Analysis of an ENteric Immunity SImulator (ENISI)-Based Model of Immune Responses to Helicobacter pylori Infection.

    Science.gov (United States)

    Alam, Maksudul; Deng, Xinwei; Philipson, Casandra; Bassaganya-Riera, Josep; Bisset, Keith; Carbo, Adria; Eubank, Stephen; Hontecillas, Raquel; Hoops, Stefan; Mei, Yongguo; Abedi, Vida; Marathe, Madhav

    2015-01-01

    Agent-based models (ABM) are widely used to study immune systems, providing a procedural and interactive view of the underlying system. The interaction of components and the behavior of individual objects are described procedurally as functions of internal states and local interactions, which are often stochastic in nature. Such models typically have complex structures and consist of a large number of modeling parameters. Determining the key modeling parameters which govern the outcomes of the system is very challenging. Sensitivity analysis plays a vital role in quantifying the impact of modeling parameters in massively interacting systems, including large complex ABM. The high computational cost of executing simulations impedes running experiments with exhaustive parameter settings. Existing techniques for analyzing such a complex system typically focus on local sensitivity analysis, i.e. one parameter at a time, or a close "neighborhood" of particular parameter settings. However, such methods are not adequate to measure the uncertainty and sensitivity of parameters accurately because they overlook the global impacts of parameters on the system. In this article, we develop novel experimental design and analysis techniques to perform both global and local sensitivity analysis of large-scale ABMs. The proposed method can efficiently identify the most significant parameters and quantify their contributions to outcomes of the system. We demonstrate the proposed methodology for ENteric Immune SImulator (ENISI), a large-scale ABM environment, using a computational model of immune responses to Helicobacter pylori colonization of the gastric mucosa.

  1. An automated process for building reliable and optimal in vitro/in vivo correlation models based on Monte Carlo simulations.

    Science.gov (United States)

    Sutton, Steven C; Hu, Mingxiu

    2006-05-05

    Many mathematical models have been proposed for establishing an in vitro/in vivo correlation (IVIVC). The traditional IVIVC model building process consists of five steps: deconvolution, model fitting, convolution, prediction error evaluation, and cross-validation. This is a time-consuming process, and typically only a few models at most are tested for any given data set. The objectives of this work were to (1) propose a statistical tool to screen models for further development of an IVIVC, (2) evaluate the performance of each model under different circumstances, and (3) investigate the effectiveness of common statistical model selection criteria for choosing IVIVC models. A computer program was developed to explore which model(s) would be most likely to work well with a random variation from the original formulation. The process used Monte Carlo simulation techniques to build IVIVC models. Data-based model selection criteria (Akaike Information Criterion [AIC], R2) and the probability of passing the Food and Drug Administration "prediction error" requirement were calculated. Several real data sets representing a broad range of release profiles are used to illustrate the process and to demonstrate the advantages of this automated process over the traditional approach. The Hixson-Crowell and Weibull models were often preferred over the linear model. When evaluating whether a Level A IVIVC model was possible, the AIC criterion generally selected the best model. We believe that the proposed approach may be a rapid tool to determine which IVIVC model (if any) is the most applicable.
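    The model-screening idea can be illustrated with a small snippet that fits two candidate dissolution models (first-order and Weibull) to a noisy release profile and ranks them by AIC. The synthetic data, starting values, and the least-squares AIC form used (n*ln(RSS/n) + 2k) are illustrative, not the authors' program or data sets.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(3)

def first_order(t, k):
    return 100.0 * (1.0 - np.exp(-k * t))

def weibull(t, a, b):
    return 100.0 * (1.0 - np.exp(-(t / a) ** b))

def aic(y, y_hat, n_params):
    # AIC based on residual sum of squares for least-squares fits.
    n = len(y)
    rss = np.sum((y - y_hat) ** 2)
    return n * np.log(rss / n) + 2 * n_params

# Synthetic % released vs. time (h), generated from a Weibull-like profile.
t = np.array([0.5, 1, 2, 4, 6, 8, 12, 24], dtype=float)
y = weibull(t, 3.0, 1.4) + rng.normal(0.0, 2.0, t.size)

p_fo, _ = curve_fit(first_order, t, y, p0=[0.2], bounds=(1e-6, np.inf))
p_wb, _ = curve_fit(weibull, t, y, p0=[3.0, 1.0], bounds=(1e-6, np.inf))

print("first-order AIC:", round(aic(y, first_order(t, *p_fo), 1), 2))
print("Weibull     AIC:", round(aic(y, weibull(t, *p_wb), 2), 2))
```

    In a screening loop of the kind described above, this fit-and-rank step would simply be repeated over Monte Carlo perturbations of the release profile to see which model is selected most robustly.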

  2. Simulation of Ni-63 based nuclear micro battery using Monte Carlo modeling

    International Nuclear Information System (INIS)

    Kim, Tae Ho; Kim, Ji Hyun

    2013-01-01

    Radioisotope batteries have an energy density 100-10000 times greater than chemical batteries. Li-ion batteries also have fundamental problems such as short lifetime and the need for a recharging system. In addition, existing batteries are difficult to operate inside the human body, in national defense equipment, or in space environments. With the development of semiconductor processes and materials technology, microdevices have become much more highly integrated. It is expected that, based on new semiconductor technology, the conversion efficiency of betavoltaic batteries will be greatly increased. Furthermore, beta particles cannot penetrate human skin, so such batteries are safer than Li-ion batteries, which carry a risk of explosion. In other words, interest in radioisotope batteries has increased because they are applicable to artificial internal organ power sources that require no recharging or replacement, microsensors for arctic and other special environments, small military equipment, and the space industry. However, there are not enough data on beta particle fluence from radioisotope sources for nuclear batteries. Beta particle fluence directly influences battery efficiency and is strongly affected by the radioisotope source thickness because of the self-absorption effect. Therefore, in this article, we present a basic design of a Ni-63 nuclear battery and simulation data on beta particle fluence for various radioisotope source thicknesses and battery designs

  3. A Novel PSO Model Based on Simulating Human Social Communication Behavior

    Directory of Open Access Journals (Sweden)

    Yanmin Liu

    2012-01-01

    Full Text Available In order to solve complicated multimodal problems, this paper presents a variant of the particle swarm optimizer (PSO) based on the simulation of human social communication behavior (HSCPSO). In HSCPSO, each particle initially joins a default number of social circles (SC) that consist of some particles, and its learning exemplars include three parts, namely, its own best experience, the experience of the best performing particle in all SCs, and the experiences of the particles of all SCs it is a member of. The learning strategy takes full advantage of the excellent information of each particle to improve the diversity of the swarm and to discourage premature convergence. To weight the effects of the particles on the SCs, the worst performing particles join more SCs to learn from other particles, and the best performing particles leave SCs to reduce their strong influence on other members. Additionally, to ensure the effectiveness of solving multimodal problems, a novel parallel hybrid mutation is proposed to improve the particle's ability to escape from local optima. Experiments were conducted on a set of classical benchmark functions, and the results demonstrate the good performance of HSCPSO in escaping from local optima and solving complex multimodal problems compared with the other PSO variants.

  4. Computing elastic‐rebound‐motivated earthquake probabilities in unsegmented fault models: a new methodology supported by physics‐based simulators

    Science.gov (United States)

    Field, Edward H.

    2015-01-01

    A methodology is presented for computing elastic‐rebound‐based probabilities in an unsegmented fault or fault system, which involves computing along‐fault averages of renewal‐model parameters. The approach is less biased and more self‐consistent than a logical extension of that applied most recently for multisegment ruptures in California. It also enables the application of magnitude‐dependent aperiodicity values, which the previous approach does not. Monte Carlo simulations are used to analyze long‐term system behavior, which is generally found to be consistent with that of physics‐based earthquake simulators. Results cast doubt that recurrence‐interval distributions at points on faults look anything like traditionally applied renewal models, a fact that should be considered when interpreting paleoseismic data. We avoid such assumptions by changing the "probability of what" question (from offset at a point to the occurrence of a rupture, assuming it is the next event to occur). The new methodology is simple, although not perfect in terms of recovering long‐term rates in Monte Carlo simulations. It represents a reasonable, improved way to represent first‐order elastic‐rebound predictability, assuming it is there in the first place, and for a system that clearly exhibits other unmodeled complexities, such as aftershock triggering.
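    For context, the basic renewal-model computation that such methodologies average along a fault is the conditional probability that, given T years since the last event, the next event occurs within the following ΔT years: [F(T+ΔT) - F(T)] / [1 - F(T)]. The sketch below evaluates this for a lognormal recurrence distribution (used here as a stand-in for BPT or other renewal forms) with invented mean recurrence and aperiodicity values.

```python
import numpy as np
from scipy import stats

def conditional_probability(mean_recurrence, aperiodicity, elapsed, horizon):
    """P(event within `horizon` yr | `elapsed` yr since last event),
    using a lognormal renewal model as a stand-in for BPT."""
    sigma = np.sqrt(np.log(1.0 + aperiodicity**2))          # shape parameter
    scale = mean_recurrence / np.sqrt(1.0 + aperiodicity**2)  # exp(mean of ln T)
    F = stats.lognorm(s=sigma, scale=scale).cdf
    return (F(elapsed + horizon) - F(elapsed)) / (1.0 - F(elapsed))

# Illustrative numbers only: 200-yr mean recurrence, aperiodicity 0.5,
# 150 yr elapsed since the last event, 30-yr forecast window.
print(round(conditional_probability(200.0, 0.5, 150.0, 30.0), 3))
```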

  5. Modeling and simulating pedestrian shopping behavior based on principles of bounded rationality

    NARCIS (Netherlands)

    Zhu, W.; Timmermans, H.J.P.; Timmermans, H.J.P.

    2009-01-01

    Modeling pedestrian behavior and decision making has predominantly relied on rational choice models, especially discrete choice models. These models, however, may not be appropriate for modeling the decision processes because they assume unrealistic cognitive and computational abilities of decision

  6. L-py: an L-system simulation framework for modeling plant architecture development based on a dynamic language.

    Science.gov (United States)

    Boudon, Frédéric; Pradal, Christophe; Cokelaer, Thomas; Prusinkiewicz, Przemyslaw; Godin, Christophe

    2012-01-01

    The study of plant development requires increasingly powerful modeling tools to help understand and simulate the growth and functioning of plants. In the last decade, the formalism of L-systems has emerged as a major paradigm for modeling plant development. Previous implementations of this formalism were made based on static languages, i.e., languages that require explicit definition of variable types before using them. These languages are often efficient but involve quite a lot of syntactic overhead, thus restricting the flexibility of use for modelers. In this work, we present an adaptation of L-systems to the Python language, a popular and powerful open-license dynamic language. We show that the use of dynamic language properties makes it possible to enhance the development of plant growth models: (i) by keeping a simple syntax while allowing for high-level programming constructs, (ii) by making code execution easy and avoiding compilation overhead, (iii) by allowing a high level of model reusability and the building of complex modular models, and (iv) by providing powerful solutions to integrate MTG data-structures (that are a common way to represent plants at several scales) into L-systems, thus enabling the use of a wide spectrum of computer tools based on MTGs developed for plant architecture. We then illustrate the use of L-Py in real applications to build complex models or to teach plant modeling in the classroom.
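    For readers unfamiliar with the formalism, the rewriting core of an L-system takes only a few lines of Python. The sketch below applies a classic textbook bracketed production a few times to show how a branching string grows; it uses plain Python, not the L-Py API itself.

```python
# Minimal bracketed L-system rewriting (plain Python, not L-Py-specific).
AXIOM = "X"
RULES = {
    "X": "F[+X][-X]FX",   # classic branching production
    "F": "FF",
}

def rewrite(word, rules):
    # Apply all productions in parallel, copying symbols that have no rule
    # (e.g. the turtle commands +, -, [ and ]).
    return "".join(rules.get(symbol, symbol) for symbol in word)

word = AXIOM
for step in range(3):
    word = rewrite(word, RULES)
    print(f"step {step + 1}: {len(word)} symbols")
print(word[:60] + "...")
```

    A real L-Py model attaches geometry and parameters to these symbols, but the derivation step, parallel string rewriting, is exactly this.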

  7. L-Py: an L-System simulation framework for modeling plant development based on a dynamic language

    Directory of Open Access Journals (Sweden)

    Frederic eBoudon

    2012-05-01

    Full Text Available The study of plant development requires increasingly powerful modeling tools to help understand and simulate the growth and functioning of plants. In the last decade, the formalism of L-systems has emerged as a major paradigm for modeling plant development. Previous implementations of this formalism were made based on static languages, i.e. languages that require explicit definition of variable types before using them. These languages are often efficient but involve quite a lot of syntactic overhead, thus restricting the flexibility of use for modelers. In this work, we present an adaptation of L-systems to the Python language, a popular and powerful open-license dynamic language. We show that the use of dynamic language properties makes it possible to enhance the development of plant growth models: (i) by keeping a simple syntax while allowing for high-level programming constructs, (ii) by making code execution easy and avoiding compilation overhead, (iii) by allowing a high level of model reusability and the building of complex modular models, and (iv) by providing powerful solutions to integrate MTG data-structures (that are a common way to represent plants at several scales) into L-systems, thus enabling the use of a wide spectrum of computer tools based on MTGs developed for plant architecture. We then illustrate the use of L-Py in real applications to build complex models or to teach plant modeling in the classroom.

  8. The modeling and simulation of thermal based modified solid oxide fuel cell (SOFC) for grid-connected systems

    Directory of Open Access Journals (Sweden)

    Ayetül Gelen

    2015-05-01

    Full Text Available This paper presents a thermal based modified dynamic model of a Solid Oxide Fuel Cell (SOFC) for grid-connected systems. The proposed fuel cell model involves ohmic, activation and concentration voltage losses, thermal dynamics, a methanol reformer, a fuel utilization factor and a power limiting module. A power conditioning unit (PCU), which consists of a DC-DC boost converter and a DC-AC voltage-source inverter (VSI), their controller, a transformer and a filter, is designed for grid-connected systems. The voltage-source inverter with six Insulated Gate Bipolar Transistor (IGBT) switches inverts the DC voltage that comes from the converter into a sinusoidal voltage synchronized with the grid. The modeling and simulation of the system are developed in the Matlab/Simulink environment. The performance of the SOFC with the converter is examined under step and random load conditions. The simulation results show that the boost converter designed for the proposed thermal based modified SOFC model follows different DC load variations well. Finally, the AC bus of 400 Volt and 50 Hz is connected to a single-machine infinite bus (SMIB) through a transmission line. The real and reactive power management of the inverter is analyzed with an infinite bus system. Thus, the desired nominal values are properly obtained by means of the inverter controller.

  9. Finite-time adaptive sliding mode force control for electro-hydraulic load simulator based on improved GMS friction model

    Science.gov (United States)

    Kang, Shuo; Yan, Hao; Dong, Lijing; Li, Changchun

    2018-03-01

    This paper addresses the force tracking problem of an electro-hydraulic load simulator under the influence of nonlinear friction and uncertain disturbance. A nonlinear system model combined with the improved generalized Maxwell-slip (GMS) friction model is first derived to describe the characteristics of the load simulator system more accurately. Then, by using a particle swarm optimization (PSO) algorithm combined with an analysis of the system hysteresis characteristic, the GMS friction parameters are identified. To compensate for nonlinear friction and uncertain disturbance, a finite-time adaptive sliding mode control method is proposed based on the accurate system model. This controller ensures that the system state moves along the nonlinear sliding surface to the steady state in a short time and exhibits good dynamic properties under the influence of parametric uncertainties and disturbance, which further improves the force loading accuracy and rapidity. At the end of this work, simulation and experimental results are employed to demonstrate the effectiveness of the proposed sliding mode control strategy.
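
    As a rough illustration of the identification step, the sketch below runs a basic particle swarm optimization over a toy friction model (Coulomb plus viscous terms) fitted to synthetic measurements. The friction model, the data and all PSO settings are assumptions for demonstration; the paper identifies the parameters of the improved GMS model from measured hysteresis data.

```python
import random

# Toy PSO sketch for identifying friction parameters (Coulomb level Fc and
# viscous coefficient sigma) from velocity/force samples.  The "measurements"
# below are synthetic, not data from the load simulator in the paper.

def friction(v, fc, sigma):
    return fc * (1 if v > 0 else -1 if v < 0 else 0) + sigma * v

velocities = [-2 + 0.1 * i for i in range(41)]
true_fc, true_sigma = 5.0, 1.5
measured = [friction(v, true_fc, true_sigma) for v in velocities]

def cost(params):
    fc, sigma = params
    return sum((friction(v, fc, sigma) - m) ** 2 for v, m in zip(velocities, measured))

def pso(n_particles=30, iters=100, w=0.7, c1=1.5, c2=1.5, bounds=(0.0, 10.0)):
    lo, hi = bounds
    pos = [[random.uniform(lo, hi), random.uniform(lo, hi)] for _ in range(n_particles)]
    vel = [[0.0, 0.0] for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_cost = [cost(p) for p in pos]
    gbest = min(zip(pbest_cost, pbest))[1][:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(2):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            c = cost(pos[i])
            if c < pbest_cost[i]:
                pbest_cost[i], pbest[i] = c, pos[i][:]
                if c < cost(gbest):
                    gbest = pos[i][:]
    return gbest

if __name__ == "__main__":
    print("identified [Fc, sigma]:", pso())
```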

  10. The realistic consideration of human factors in model based simulation tools for the air traffic control domain.

    Science.gov (United States)

    Duca, Gabriella; Attaianese, Erminia

    2012-01-01

    Advanced Air Traffic Management (ATM) concepts related to automation, airspace organization and operational procedures are driven by the overall goal to increase ATM system performance. Independently of the nature and/or impact of the envisaged changes (e.g. from a short-term procedure adjustment to a very long-term operational concept or the completion of aid tools), the preliminary assessment of possible gains in airspace/airport capacity, safety and cost-effectiveness is done by running Model Based Simulations (MBSs, also known as Fast Time Simulations - FTS). Because MBS is not a human-in-the-loop technique, the reliability of its results depends on the accuracy and significance of the modeled human factors. Despite that, it can be observed in practice that modeling tools commonly assume a generalized standardization of human behaviors and tasks and consider only a narrow range of work environment factors that, in reality, affect the actual human-system performance. The present paper is aimed at opening a discussion about the possibility of keeping task descriptions and related weights at a high, general level, suitable for an efficient use of MBSs, while at the same time increasing simulation reliability by adopting adjustments derived from additional variables related to the human aspects of controller workload.

  11. Improving ADM1 model to simulate anaerobic digestion start-up with inhibition phase based on cattle slurry

    International Nuclear Information System (INIS)

    Normak, A.; Suurpere, J.; Suitso, I.; Jõgi, E.; Kokin, E.; Pitk, P.

    2015-01-01

    The Anaerobic Digestion Model No. 1 (ADM1) was improved to simulate the anaerobic digestion start-up phase. To improve the ADM1, a combined hydrolysis equation was used, based on the Contois model of bacterial growth and a function describing hydrolysis inhibition by VFA. The start-up with fresh cattle slurry was carried out in a pilot-scale reactor to calibrate the chosen parameters of the ADM1. The important aspects of model calibration were the hydrolysis rate, the number of anaerobic microbes in cattle slurry, and the growth rate of bacteria. Good simulation results were achieved after calibration for the independent start-up test with pre-conditioned cattle slurry. - Highlights: • The improved ADM1 can be used for simulation of reactor start-up with an inhibition phase. • The hydrolysis rate had a decreased value in case of high VFA concentration or a low number of hydrolytic bacteria. • A hydrolysis inhibitory threshold value of 9.85 g L−1 was obtained for VFA. • Start-up with pre-conditioned cattle slurry had a relatively short inhibition phase
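
    A minimal sketch of a Contois-type hydrolysis rate with a VFA inhibition factor is given below. The rate expression and the parameter values are assumptions for illustration only; the 9.85 g L−1 threshold from the highlights is reused here as the inhibition constant, which may not match the exact formulation calibrated in the paper.

```python
# Illustrative Contois-type hydrolysis rate with a VFA inhibition factor.
# Parameter values are assumptions for demonstration, not the calibrated
# values of the improved ADM1 described in the paper.

def hydrolysis_rate(x_substrate, x_bacteria, s_vfa,
                    k_hyd=10.0,      # maximum specific hydrolysis rate, d^-1 (assumed)
                    k_x=0.5,         # Contois half-saturation constant, gCOD/gCOD (assumed)
                    k_i_vfa=9.85):   # VFA inhibition constant, g/L (threshold from the abstract)
    """Return an illustrative hydrolysis rate (gCOD L^-1 d^-1)."""
    contois = x_substrate / (k_x * x_bacteria + x_substrate)
    inhibition = k_i_vfa / (k_i_vfa + s_vfa)      # non-competitive-style inhibition by VFA
    return k_hyd * x_bacteria * contois * inhibition

# Example: low bacterial biomass and high VFA both slow hydrolysis.
print(hydrolysis_rate(x_substrate=20.0, x_bacteria=0.5, s_vfa=2.0))
print(hydrolysis_rate(x_substrate=20.0, x_bacteria=0.5, s_vfa=15.0))
```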

  12. Simulation of the Role of Government in Spatial Agent-Based Model

    Directory of Open Access Journals (Sweden)

    Viktor Ivanovich Suslov

    2016-09-01

    Full Text Available The paper describes the further development of an agent-based multiregional input-output model of the Russian economy. We consider the idea of incorporating the government into the model and analyze the results of experimental calculations for a conditional example of a spatial economy. New agents are included in the model, such as the federal and regional governments, a pension fund and the state enterprises producing public goods at the federal and regional levels. The government sets four types of taxes (personal and business income taxes, VAT and payroll taxes), ensures the provision of public goods and provides social, investment and interbudgetary transfers to households, firms and budgets. Social transfers consist of social assistance and unemployment benefits. The utility functions of households are expanded by terms associated with national and regional public goods. The budget policy is designed in accordance with the maximization of an isoelastic social welfare function that formalizes the choice between different concepts of social justice. The Gini index is used to monitor the inequality of income distribution. The results of experimental calculations show the convergence of the new version of the model to a state of quasi-equilibrium. Special attention is paid to the optimal level of taxation that maximizes the social welfare function. Four variants of the optimal tax rates are defined: for the three major taxes at a fixed proportion of rates, and for each tax separately at zero rates of the two other taxes. Further directions of modelling are identified; they allow investigation of the spatial development of the Russian economy, taking into account decision-making by private agents in response to government policies.
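
    The two distributional quantities named in the abstract can be illustrated with a short sketch; the income figures and the inequality-aversion parameter below are illustrative assumptions, not data from the model.

```python
import math

# Sketch of the two distributional metrics mentioned in the abstract: the Gini
# index of an income distribution and an isoelastic social welfare function
# with inequality-aversion parameter epsilon.  The household incomes below are
# purely illustrative.

def gini(incomes):
    """Gini index from the mean absolute difference of all income pairs."""
    n = len(incomes)
    mean = sum(incomes) / n
    diff_sum = sum(abs(x - y) for x in incomes for y in incomes)
    return diff_sum / (2 * n * n * mean)

def isoelastic_welfare(incomes, epsilon=1.5):
    """Isoelastic (CRRA-type) social welfare over household incomes."""
    if epsilon == 1.0:
        return sum(math.log(x) for x in incomes)
    return sum(x ** (1.0 - epsilon) / (1.0 - epsilon) for x in incomes)

households = [12_000, 18_000, 25_000, 40_000, 90_000]
print("Gini index:", round(gini(households), 3))
print("Welfare (epsilon=1.5):", round(isoelastic_welfare(households), 6))
```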

  13. Biomolecular modelling and simulations

    CERN Document Server

    Karabencheva-Christova, Tatyana

    2014-01-01

    Published continuously since 1944, the Advances in Protein Chemistry and Structural Biology series is the essential resource for protein chemists. Each volume brings forth new information about protocols and analysis of proteins. Each thematically organized volume is guest edited by leading experts in a broad range of protein-related topics. Describes advances in biomolecular modelling and simulations Chapters are written by authorities in their field Targeted to a wide audience of researchers, specialists, and students The information provided in the volume is well supported by a number of high quality illustrations, figures, and tables.

  14. Modeling and simulation of an activated carbon–CO2 four bed based adsorption cooling system

    International Nuclear Information System (INIS)

    Jribi, Skander; Saha, Bidyut Baran; Koyama, Shigeru; Bentaher, Hatem

    2014-01-01

    Highlights: • A transient mathematical model of a 4-bed adsorption chiller is proposed. • The performance of the cyclic-steady-state system is presented for different heating and cooling water inlet temperatures. • The desorption pressure has a strong influence on the performance. • With 80 kg of Maxsorb III, the CO2-based adsorption chiller produces 2 kW of cooling power and presents a COP of 0.1. - Abstract: In this study, a transient mathematical model of a 4-bed adsorption chiller using Maxsorb III as the adsorbent and CO2 as the refrigerant has been analyzed. The performance of the cyclic-steady-state system is presented for different heating and cooling water inlet temperatures. It is found that the desorption pressure has a strong influence on the performance due to the low critical point of CO2 (Tc = 31 °C). With 80 kg of Maxsorb III, the CO2-based adsorption chiller produces 2 kW of cooling power and presents a COP of 0.1, at a driving heat source temperature of 95 °C along with a cooling temperature of 27 °C and at the optimum desorption pressure of 79 bar. The present thermal compression air-conditioning system could be driven with solar energy or waste heat from internal combustion engines and is therefore suitable for both residential and mobile air-conditioning applications

  15. [Simulation of vegetation indices optimizing under retrieval of vegetation biochemical parameters based on PROSPECT + SAIL model].

    Science.gov (United States)

    Wu, Ling; Liu, Xiang-Nan; Zhou, Bo-Tian; Liu, Chuan-Hao; Li, Lu-Feng

    2012-12-01

    This study analyzed the sensitivities of three vegetation biochemical parameters [chlorophyll content (Cab), leaf water content (Cw), and leaf area index (LAI)] to changes in canopy reflectance, taking into account the wavelength regions of canopy reflectance affected by each parameter, and selected three vegetation indices as the optimization comparison targets of the cost function. The Cab, Cw, and LAI were then estimated based on the particle swarm optimization algorithm and the PROSPECT + SAIL model. The results showed that retrieval with vegetation indices as the optimization comparison targets of the cost function was more efficient than retrieval using all spectral reflectance. The correlation coefficients (R2) between the measured and estimated values of Cab, Cw, and LAI were 90.8%, 95.7%, and 99.7%, and the root mean square errors of Cab, Cw, and LAI were 4.73 μg cm−2, 0.001 g cm−2, and 0.08, respectively. It was suggested that adopting vegetation indices as the optimization comparison targets of the cost function could effectively improve the efficiency and precision of the retrieval of biochemical parameters based on the PROSPECT + SAIL model.

  16. Simulating boreal forest carbon dynamics after stand-replacing fire disturbance: insights from a global process-based vegetation model

    Science.gov (United States)

    Yue, C.; Ciais, P.; Luyssaert, S.; Cadule, P.; Harden, J.; Randerson, J.; Bellassen, V.; Wang, T.; Piao, S.L.; Poulter, B.; Viovy, N.

    2013-01-01

    Stand-replacing fires are the dominant fire type in North American boreal forests. They leave a historical legacy of a mosaic landscape of different-aged forest cohorts. These forest age dynamics must be included in vegetation models to accurately quantify the role of fire in the historical and current regional forest carbon balance. The present study adapted the global process-based vegetation model ORCHIDEE to simulate the CO2 emissions from boreal forest fire and the subsequent recovery after a stand-replacing fire; the model represents postfire new cohort establishment, forest stand structure and the self-thinning process. Simulation results are evaluated against observations of three clusters of postfire forest chronosequences in Canada and Alaska. The variables evaluated include: fire carbon emissions, CO2 fluxes (gross primary production, total ecosystem respiration and net ecosystem exchange), leaf area index, and biometric measurements (aboveground biomass carbon, forest floor carbon, woody debris carbon, stand individual density, stand basal area, and mean diameter at breast height). When forced by local climate and the atmospheric CO2 history at each chronosequence site, the model simulations generally match the observed CO2 fluxes and carbon stock data well, with a model-measurement root mean square deviation comparable with the measurement accuracy (for CO2 flux ~100 g C m−2 yr−1, for biomass carbon ~1000 g C m−2 and for soil carbon ~2000 g C m−2). We find that the current postfire forest carbon sink at the evaluation sites, as observed by chronosequence methods, is mainly due to a combination of historical CO2 increase and forest succession. Climate change and variability during this period offset some of these expected carbon gains. The negative impacts of climate were a likely consequence of increasing water stress caused by significant temperature increases that were not matched by concurrent increases in precipitation. Our simulation

  17. Aviation Safety Modeling and Simulation (ASMM) Propulsion Fleet Modeling: A Tool for Semi-Automatic Construction of CORBA-based Applications from Legacy Fortran Programs

    Science.gov (United States)

    Sang, Janche

    2003-01-01

    Within NASA's Aviation Safety Program, NASA GRC participates in the Modeling and Simulation Project called ASMM. NASA GRC's focus is to characterize propulsion system performance from a fleet management and maintenance perspective by modeling and, through simulation, predict the characteristics of two classes of commercial engines (CFM56 and GE90). In prior years, the High Performance Computing and Communication (HPCC) program funded NASA Glenn to develop large-scale, detailed simulations for the analysis and design of aircraft engines, called the Numerical Propulsion System Simulation (NPSS). Three major aspects of this modeling, namely the integration of different engine components, the coupling of multiple disciplines, and engine component zooming at the appropriate level of fidelity, require relatively tight coupling of different analysis codes. Most of these codes in aerodynamics and solid mechanics are written in Fortran. Refitting these legacy Fortran codes with distributed objects can increase their reusability. Aviation Safety's modeling and simulation use in characterizing fleet management has similar needs. The modeling and simulation of these propulsion systems use existing Fortran and C codes that are instrumental in determining the performance of the fleet. The research centers on building a CORBA-based development environment for programmers to easily wrap and couple legacy Fortran codes. This environment consists of a C++ wrapper library to hide the details of CORBA and an efficient remote variable scheme to facilitate data exchange between the client and the server model. Additionally, a Web Service model should also be constructed for evaluation of this technology's use over the next two to three years.

  18. Validation process of simulation model

    International Nuclear Information System (INIS)

    San Isidro, M. J.

    1998-01-01

    A methodology for the empirical validation of any detailed simulation model is presented. This kind of validation is always related to an experimental case. The empirical validation has a residual sense, because the conclusions are based on comparisons between simulated outputs and experimental measurements. This methodology will guide us in detecting the failures of the simulation model. Furthermore, it can be used as a guide in the design of subsequent experiments. Three steps can be well differentiated: Sensitivity analysis. It can be made with a DSA, differential sensitivity analysis, and with a MCSA, Monte-Carlo sensitivity analysis. Searching for the optimal domains of the input parameters. A procedure based on Monte-Carlo methods and cluster techniques has been developed to find the optimal domains of these parameters. Residual analysis. This analysis has been made in the time domain and in the frequency domain, using correlation analysis and spectral analysis. As an application of this methodology, the validation carried out on a thermal simulation model of buildings, Esp., is presented, studying the behavior of building components in a Test Cell of the LECE at CIEMAT. (Author) 17 refs
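
    As an illustration of the Monte-Carlo sensitivity analysis step, the sketch below samples uncertain inputs of a placeholder model and ranks them by their correlation with the output. The model, parameter ranges and sample size are assumptions; they do not reproduce the thermal building model validated in the paper.

```python
import random
import statistics

# Minimal Monte-Carlo sensitivity analysis (MCSA) sketch: sample uncertain
# input parameters, run a placeholder model, and rank inputs by the Pearson
# correlation between each input and the output.

def model(u_wall, solar_gain, infiltration):
    # Placeholder steady-state heating load (arbitrary units), not the real model.
    return 25.0 * u_wall + 10.0 * infiltration - 0.6 * solar_gain

def pearson(xs, ys):
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

n = 2000
samples = {
    "u_wall": [random.uniform(0.2, 1.2) for _ in range(n)],
    "solar_gain": [random.uniform(0.0, 30.0) for _ in range(n)],
    "infiltration": [random.uniform(0.1, 1.0) for _ in range(n)],
}
outputs = [model(samples["u_wall"][i], samples["solar_gain"][i],
                 samples["infiltration"][i]) for i in range(n)]

for name, values in samples.items():
    print(f"{name:12s} correlation with output: {pearson(values, outputs):+.2f}")
```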

  19. An agent-based simulation model of patient choice of health care providers in accountable care organizations.

    Science.gov (United States)

    Alibrahim, Abdullah; Wu, Shinyi

    2018-03-01

    Accountable care organizations (ACOs) in the United States show promise in controlling health care costs while preserving patients' choice of providers. Understanding the effects of patient choice is critical in novel payment and delivery models like ACOs that depend on continuity of care and accountability. The financial, utilization, and behavioral implications associated with a patient's decision to forego local health care providers for more distant ones to access higher quality care remain unknown. To study this question, we used an agent-based simulation model of a health care market composed of providers able to form ACOs serving patients, and embedded in it a conditional logit decision model to examine patients capable of choosing their care providers. This simulation focuses on Medicare beneficiaries and their congestive heart failure (CHF) outcomes. We place the patient agents in an ACO delivery system model in which provider agents decide whether they remain in an ACO and perform a quality-improving CHF disease management intervention. Illustrative results show that allowing patients to choose their providers reduces the yearly payment per CHF patient by $320, reduces mortality rates by 0.12 percentage points and hospitalization rates by 0.44 percentage points, and marginally increases provider participation in ACOs. This study demonstrates a model capable of quantifying the effects of patient choice in a theoretical ACO system and provides a potential tool for policymakers to understand the implications of patient choice and assess potential policy controls.
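
    The patient-choice component can be illustrated with a conditional logit sketch such as the one below; the utility specification, coefficients and provider attributes are illustrative assumptions rather than the calibrated values of the study.

```python
import math

# Sketch of conditional logit choice among providers, in the spirit of the
# patient-choice component described in the abstract.

def choice_probabilities(alternatives, beta_quality=1.0, beta_distance=-0.05):
    """alternatives: list of dicts with 'quality' and 'distance_km' attributes."""
    utilities = [beta_quality * a["quality"] + beta_distance * a["distance_km"]
                 for a in alternatives]
    m = max(utilities)                      # subtract max for numerical stability
    exps = [math.exp(u - m) for u in utilities]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical provider alternatives faced by one patient agent.
providers = [
    {"name": "local clinic",    "quality": 3.0, "distance_km": 5.0},
    {"name": "regional ACO",    "quality": 4.0, "distance_km": 40.0},
    {"name": "academic center", "quality": 4.5, "distance_km": 80.0},
]
for p, prob in zip(providers, choice_probabilities(providers)):
    print(f"{p['name']:15s} choice probability: {prob:.2f}")
```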

  20. Inversion based on computational simulations

    International Nuclear Information System (INIS)

    Hanson, K.M.; Cunningham, G.S.; Saquib, S.S.

    1998-01-01

    A standard approach to solving inversion problems that involve many parameters uses gradient-based optimization to find the parameters that best match the data. The authors discuss enabling techniques that facilitate application of this approach to large-scale computational simulations, which are the only way to investigate many complex physical phenomena. Such simulations may not seem to lend themselves to calculation of the gradient with respect to numerous parameters. However, adjoint differentiation allows one to efficiently compute the gradient of an objective function with respect to all the variables of a simulation. When combined with advanced gradient-based optimization algorithms, adjoint differentiation permits one to solve very large problems of optimization or parameter estimation. These techniques will be illustrated through the simulation of the time-dependent diffusion of infrared light through tissue, which has been used to perform optical tomography. The techniques discussed have a wide range of applicability to modeling including the optimization of models to achieve a desired design goal

  1. Dynamic Flight Simulation Utilizing High Fidelity CFD-Based Nonlinear Reduced Order Model, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The Nonlinear Dynamic Flight Simulation (NL-DFS) system will be developed in the Phase II project by combining the classical nonlinear rigid-body flight dynamics...

  2. Enabling the Analysis of Emergent Behavior in Future Electrical Distribution Systems Using Agent-Based Modeling and Simulation

    Directory of Open Access Journals (Sweden)

    Sonja Kolen

    2018-01-01

    Full Text Available In future electrical distribution systems, component heterogeneity and their cyber-physical interactions through electrical lines and communication lead to emergent system behavior. As the distribution systems represent the largest part of an energy system with respect to the number of nodes and components, large-scale studies of their emergent behavior are vital for the development of decentralized control strategies. This paper presents and evaluates DistAIX, a novel agent-based modeling and simulation tool to conduct such studies. The major novelty is a parallelization of the entire model—including the power system, communication system, control, and all interactions—using processes instead of threads. Thereby, a distribution of the simulation to multiple computing nodes with a distributed memory architecture becomes possible. This makes DistAIX scalable and allows the inclusion of as many processing units in the simulation as desired. The scalability of DistAIX is demonstrated by simulations of large-scale scenarios. Additionally, the capability of observing emergent behavior is demonstrated for an exemplary distribution grid with a large number of interacting components.

  3. Evaluating adaptation options for urban flooding based on new high-end emission scenario regional climate model simulations

    DEFF Research Database (Denmark)

    Arnbjerg-Nielsen, Karsten; Leonardsen, L.; Madsen, Henrik

    2015-01-01

    Climate change adaptation studies on urban flooding are often based on a model chain approach from climate forcing scenarios to analysis of adaptation measures. Previous analyses of climate change impacts in Copenhagen, Denmark, were supplemented by 2 high-end scenario simulations. These include a regional climate model projection forced to a global temperature increase of 6 degrees C in 2100 as well as a projection based on a high radiative forcing scenario (RCP8.5). With these scenarios, projected impacts of extreme precipitation increase significantly. For extreme sea surges, the impacts do ... by almost 4 and 8 times the current EAD for the RCP8.5 and 6 degrees C scenarios, respectively. For both hazards, business-as-usual is not a possible scenario, since even in the absence of policy-driven changes, significant autonomous adaptation is likely to occur. Copenhagen has developed an adaptation plan

  4. Analysis of Interactions of Key Stakeholders on B2C e-Markets - Agent Based Modelling and Simulation Approach

    Directory of Open Access Journals (Sweden)

    Marković Aleksandar

    2016-05-01

    Full Text Available Background/purpose: This paper discusses the application of ABMS (agent-based modelling and simulation) in the analysis of customer behaviour on B2C e-commerce websites, as well as in the analysis of the effects of various business decisions on on-line sales. The continuous development and dynamics in the field of e-commerce require the application of advanced decision-making tools. These tools must be able to process, in a short time period, the large amount of data generated by e-commerce systems and enable the use of the acquired data for making sound business decisions.

  5. CDMetaPOP: An individual-based, eco-evolutionary model for spatially explicit simulation of landscape demogenetics

    Science.gov (United States)

    Landguth, Erin L; Bearlin, Andrew; Day, Casey; Dunham, Jason B.

    2016-01-01

    1. Combining landscape demographic and genetics models offers powerful methods for addressing questions in eco-evolutionary applications. 2. Using two illustrative examples, we present Cost–Distance Meta-POPulation, a program to simulate changes in neutral and/or selection-driven genotypes through time as a function of individual-based movement, complex spatial population dynamics, and multiple and changing landscape drivers. 3. Cost–Distance Meta-POPulation provides a novel tool for questions in landscape genetics by incorporating population viability analysis, while linking directly to conservation applications.

  6. Simulation of Nitrogen and Phosphorus Load Runoff by a GIS-based Distributed Model for Chikugo River Watershed

    Science.gov (United States)

    Iseri, Haruka; Hiramatsu, Kazuaki; Harada, Masayoshi

    A distributed model was developed in order to simulate the process of nitrogen and phosphorus load runoff in the semi-urban watershed of the Chikugo River, Japan. A grid of cells 1 km in size was laid over the study area, and several input variables for each cell, including DEM, land use and statistical data, were extracted by GIS. In the water runoff process, the hydrograph calculated at the Chikugo Barrage was in close agreement with the observed one, achieving a Nash-Sutcliffe coefficient of 0.90. In addition, the model simulated reasonably well the movement of TN and TP at each station. The model was also used to analyze three scenarios based on watershed management: (1) reduction of nutrient loads from livestock farms, (2) improvement of septic tanks' wastewater treatment systems and (3) application of the purification function of paddy fields. As a result, the effectiveness of the management strategy in each scenario depended on land use patterns. The reduction rates of nutrient load effluent in scenarios (1) and (3) were higher than that in scenario (2). The present result suggests that an appropriate management of livestock farms together with the effective use of the paddy environment would have significant effects on the reduction of nutrient loads. A suitable management strategy should be planned based on the land use pattern in the watershed.
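
    The goodness-of-fit measure quoted above, the Nash-Sutcliffe coefficient, can be computed as in the short sketch below; the discharge series are made up for illustration.

```python
# Sketch of the Nash-Sutcliffe efficiency (NSE) used to judge a simulated
# hydrograph against observations; the discharge series below are illustrative.

def nash_sutcliffe(observed, simulated):
    mean_obs = sum(observed) / len(observed)
    num = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    den = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - num / den

obs = [120.0, 135.0, 180.0, 260.0, 210.0, 160.0, 140.0]   # m3/s, made up
sim = [115.0, 140.0, 175.0, 250.0, 220.0, 155.0, 138.0]
print("NSE:", round(nash_sutcliffe(obs, sim), 3))
```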

  7. Towards real-time communication between in vivo neurophysiological data sources and simulator-based brain biomimetic models.

    Science.gov (United States)

    Lee, Giljae; Matsunaga, Andréa; Dura-Bernal, Salvador; Zhang, Wenjie; Lytton, William W; Francis, Joseph T; Fortes, José Ab

    2014-11-01

    Development of more sophisticated implantable brain-machine interfaces (BMIs) will require both interpretation of the neurophysiological data being measured and subsequent determination of the signals to be delivered back to the brain. Computational models are at the heart of BMI and therefore an essential tool in both of these processes. One approach is to utilize brain biomimetic models (BMMs) to develop and instantiate these algorithms. These then must be connected as hybrid systems in order to interface the BMM with in vivo data acquisition devices and prosthetic devices. The combined system then provides a test bed for neuroprosthetic rehabilitative solutions and medical devices for the repair and enhancement of the damaged brain. We propose here a computer network-based design for this purpose, detailing its internal modules and data flows. We describe a prototype implementation of the design, enabling interaction between the Plexon Multichannel Acquisition Processor (MAP) server, a commercial tool to collect signals from microelectrodes implanted in a live subject, and a BMM, a NEURON-based model of sensorimotor cortex capable of controlling a virtual arm. The prototype implementation supports an online mode for real-time simulations, as well as an offline mode for data analysis and simulations without real-time constraints, and provides binning operations to discretize continuous input to the BMM and filtering operations for dealing with noise. Evaluation demonstrated that the implementation successfully delivered monkey spiking activity to the BMM through LAN environments, respecting real-time constraints.
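
    The binning operation mentioned above can be illustrated with a short sketch that counts spike timestamps in fixed-width bins; the timestamps and the bin width are synthetic assumptions, not Plexon MAP output or the prototype's actual interface.

```python
# Illustrative binning of spike timestamps (seconds) into fixed-width bins,
# as a stand-in for the discretisation step mentioned in the abstract.

def bin_spikes(timestamps, bin_width=0.05, duration=1.0):
    """Return per-bin spike counts for one channel."""
    n_bins = int(round(duration / bin_width))
    counts = [0] * n_bins
    for t in timestamps:
        idx = int(t / bin_width)
        if 0 <= idx < n_bins:
            counts[idx] += 1
    return counts

# Synthetic spike times for one hypothetical channel.
channel_a = [0.012, 0.049, 0.051, 0.230, 0.231, 0.610, 0.955]
print(bin_spikes(channel_a, bin_width=0.1, duration=1.0))
# -> [3, 0, 2, 0, 0, 0, 1, 0, 0, 1]
```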

  8. Atomic Force Microscopy Based Nanorobotics Modelling, Simulation, Setup Building and Experiments

    CERN Document Server

    Xie, Hui; Régnier, Stéphane; Sitti, Metin

    2012-01-01

    The atomic force microscope (AFM) has been successfully used to perform nanorobotic manipulation operations on nanoscale entities such as particles, nanotubes, nanowires, nanocrystals, and DNA since the 1990s. There has been much progress on modeling, imaging, teleoperated or automated control, human-machine interfacing, instrumentation, and applications of AFM based nanorobotic manipulation systems in the literature. This book aims to present all of this state-of-the-art progress in an organized, structured, and detailed manner, as a reference book and potentially a textbook in nanorobotics and any other research and education related to nanoscale dynamics, systems and controls. Clearly written and well-organized, this text introduces designs and prototypes of the nanorobotic systems in detail, with innovative principles of three-dimensional manipulation force microscopy and parallel imaging/manipulation force microscopy.

  9. Determining the energy performance of manually controlled solar shades: A stochastic model based co-simulation analysis

    International Nuclear Information System (INIS)

    Yao, Jian

    2014-01-01

    Highlights: • The driving factor for adjustment of manually controlled solar shades was determined. • A stochastic model for manual solar shades was constructed using the Markov method. • Co-simulation with EnergyPlus was carried out in BCVTB. • External shading, even when manually controlled, should be used prior to LOW-E windows. • Previous studies on manual solar shades may overestimate energy savings. - Abstract: Solar shading devices play a significant role in reducing building energy consumption and maintaining a comfortable indoor condition. In this paper, a typical office building with internal roller shades in the hot summer and cold winter zone was selected to determine the driving factor of the control behavior of manual solar shades. Solar radiation was determined to be the major factor driving solar shading adjustment, based on field measurements and logit analysis, and a stochastic model for manually adjusted solar shades was then constructed using the Markov method. This model was used in BCVTB for further co-simulation with EnergyPlus to determine the impact of the control behavior of solar shades on energy performance. The results show that manually adjusted solar shades, whether located inside or outside, have a higher energy saving performance than clear-pane windows, while only external shades perform better than regularly used LOW-E windows. Simulation also indicates that using an ideal assumption of solar shade adjustment, as most studies do in building simulation, may lead to an overestimation of energy savings by about 16–30%. There is a need to improve occupants' actions on shades to more effectively respond to outdoor conditions in order to lower energy consumption, and this improvement can be easily achieved by using simple strategies as a guide to control manual solar shades
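
    A minimal sketch of a Markov-type shade model driven by solar radiation through logit probabilities is shown below; the logit coefficients, time step and radiation series are illustrative assumptions, not the values fitted from the field measurements.

```python
import math
import random

# Two-state (shade up / shade down) stochastic occupant model: the probability
# of lowering or raising the shade depends on solar radiation through a logit
# function, and state changes are sampled at each time step (a Markov chain).

def p_lower(radiation_w_m2, a=-6.0, b=0.012):
    """Probability of lowering the shade when it is up (assumed logit coefficients)."""
    return 1.0 / (1.0 + math.exp(-(a + b * radiation_w_m2)))

def p_raise(radiation_w_m2, a=2.0, b=-0.010):
    """Probability of raising the shade when it is down (assumed logit coefficients)."""
    return 1.0 / (1.0 + math.exp(-(a + b * radiation_w_m2)))

def simulate(radiation_series, state="up"):
    states = []
    for rad in radiation_series:
        if state == "up" and random.random() < p_lower(rad):
            state = "down"
        elif state == "down" and random.random() < p_raise(rad):
            state = "up"
        states.append(state)
    return states

hourly_radiation = [0, 50, 200, 450, 700, 800, 650, 400, 150, 20]  # W/m2, illustrative
print(simulate(hourly_radiation))
```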

  10. Improved Environmental Life Cycle Assessment of Crop Production at the Catchment Scale via a Process-Based Nitrogen Simulation Model.

    Science.gov (United States)

    Liao, Wenjie; van der Werf, Hayo M G; Salmon-Monviola, Jordy

    2015-09-15

    One of the major challenges in environmental life cycle assessment (LCA) of crop production is the nonlinearity between nitrogen (N) fertilizer inputs and on-site N emissions resulting from complex biogeochemical processes. A few studies have addressed this nonlinearity by combining process-based N simulation models with LCA, but none accounted for nitrate (NO3(-)) flows across fields. In this study, we present a new method, TNT2-LCA, that couples the topography-based simulation of nitrogen transfer and transformation (TNT2) model with LCA, and compare the new method with a current LCA method based on a French life cycle inventory database. Application of the two methods to a case study of crop production in a catchment in France showed that, compared to the current method, TNT2-LCA allows delineation of more appropriate temporal limits when developing data for on-site N emissions associated with specific crops in this catchment. It also improves estimates of NO3(-) emissions by better consideration of agricultural practices, soil-climatic conditions, and spatial interactions of NO3(-) flows across fields, and by providing predicted crop yield. The new method presented in this study provides improved LCA of crop production at the catchment scale.

  11. Simulation of debonding in Al/epoxy T-peel joints using a potential-based cohesive zone model

    KAUST Repository

    Alfano, Marco; Furgiuele, Franco; Lubineau, Gilles; Paulino, Glaucio H.

    2011-01-01

    In this work, a cohesive zone model of fracture is employed to study debonding in plastically deforming Al/epoxy T-peel joints. In order to model the adhesion between the bonded metal strips, the Park-Paulino-Roesler (PPR) potential based cohesive model (J Mech Phys Solids, 2009;57:891-908) is employed, and interface elements are implemented in a commercial finite element code. A study on the influence of the cohesive properties (i.e. cohesive strength, fracture energy, shape parameter and slope indicator) on the predicted peel-force versus displacement plots reveals that the numerical results are mostly sensitive to cohesive strength and fracture energy. In turn, these parameters are tuned until a match between experimental and simulated load-displacement curves is achieved.

  12. Simulation of debonding in Al/epoxy T-peel joints using a potential-based cohesive zone model

    KAUST Repository

    Alfano, Marco

    2011-06-10

    In this work, a cohesive zone model of fracture is employed to study debonding in plastically deforming Al/epoxy T-peel joints. In order to model the adhesion between the bonded metal strips, the Park-Paulino-Roesler (PPR) potential based cohesive model (J Mech Phys Solids, 2009;57:891-908) is employed, and interface elements are implemented in a commercial finite element code. A study on the influence of the cohesive properties (i.e. cohesive strength, fracture energy, shape parameter and slope indicator) on the predicted peel-force versus displacement plots reveals that the numerical results are mostly sensitive to cohesive strength and fracture energy. In turn, these parameters are tuned until a match between experimental and simulated load-displacement curves is achieved.

  13. Simulation of the migration in fractured rock by a model based on capillary tubes

    International Nuclear Information System (INIS)

    Dahlbom, P.

    1992-05-01

    In this paper the ability of a model based upon capillary tubes to reproduce the hydrodynamic dispersion associated with the flow of contaminated groundwater in fractured rock is investigated. It is assumed that the cross-sectional areas are circular and that the flow is laminar. Molecular diffusion is neglected, as well as the impact of variations in velocity over the cross-sectional area. It is assumed that the cross-sectional areas in an ensemble of tubes belong to a gamma distribution. The velocity differences between tubes having different cross-sectional areas cause hydrodynamic dispersion. The model is applied to field tracer experiments at two sites. It is shown that the mean size of the cavities is smaller at greater depth and that the distribution is narrower. The parameter in the gamma distribution has to be given different values to reproduce the breakthrough curves at the different sites. It is also pointed out that there is no general relation between the conductivity of a porous medium and its porosity without consideration of the pore size distribution. (au)
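
    A rough sketch of the capillary-tube idea is given below: tube radii are drawn from a gamma distribution, laminar velocities scale with the square of the radius, and the spread of arrival times over a fixed travel length stands in for hydrodynamic dispersion. All parameter values are assumptions for illustration.

```python
import random
import statistics

# Capillary-tube sketch: gamma-distributed radii, Poiseuille-type velocities
# proportional to r**2, and the spread of arrival times standing in for
# hydrodynamic dispersion.  The lumped pressure-gradient factor, gamma
# parameters and travel length are illustrative assumptions.

random.seed(1)
shape, scale = 2.0, 0.5e-4        # gamma parameters for the tube radius (m)
pressure_gradient_term = 1.0e6    # lumped (dP/dx)/(8*mu) factor, 1/(m*s)
length = 10.0                     # travel distance (m)

radii = [random.gammavariate(shape, scale) for _ in range(5000)]
velocities = [pressure_gradient_term * r ** 2 for r in radii]   # mean laminar velocity
arrival_times = [length / v for v in velocities if v > 0]

print("mean arrival time (s):", round(statistics.mean(arrival_times), 1))
print("spread (std dev, s):  ", round(statistics.pstdev(arrival_times), 1))
```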

  14. Derivation Method for the Foundation Boundaries of Hydraulic Numerical Simulation Models Based on the Elastic Boussinesq Solution

    Directory of Open Access Journals (Sweden)

    Jintao Song

    2015-01-01

    Full Text Available The foundation boundaries of numerical simulation models of hydraulic structures dominated by a vertical load are investigated. The method used is based on the stress formula of the fundamental elastic solution for a semi-infinite space under a vertical concentrated force. The limit method is introduced into the original formula, which is then partitioned and analyzed according to the direction of the depth extension of the foundation. The point load is changed to a linear load with a length of 2a. Inverse proportion function assumptions are proposed for the parameter a and the depth l of the calculation points to resolve the singularity of elastic stress in a semi-infinite space near the ground surface. Compared with the original formula, changing the point load to a linear load with a length of 2a is more reasonable. Finally, the boundary depth criterion of a hydraulic numerical simulation model is derived and applied to determine the depth boundary formula for gravity dam numerical simulations.
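
    For reference, the classical Boussinesq expression for the vertical stress under a surface point load, which underlies the derivation, is sketched below together with its decay with depth; the load value and the 1 m reference depth are illustrative choices.

```python
import math

# Boussinesq solution for the vertical stress at depth z and horizontal
# distance r from a vertical point load P on the surface of an elastic
# half-space: sigma_z = 3*P*z**3 / (2*pi*R**5), with R = sqrt(r**2 + z**2).
# The loop illustrates how the stress decays with depth directly under the
# load, which is the kind of decay used to argue for a truncation depth.

def boussinesq_sigma_z(p_load, r, z):
    big_r = math.hypot(r, z)
    return 3.0 * p_load * z ** 3 / (2.0 * math.pi * big_r ** 5)

P = 1.0e6          # N, illustrative vertical load
reference = boussinesq_sigma_z(P, r=0.0, z=1.0)
for depth in (1.0, 2.0, 5.0, 10.0, 20.0, 50.0):
    ratio = boussinesq_sigma_z(P, r=0.0, z=depth) / reference
    print(f"z = {depth:5.1f} m  sigma_z / sigma_z(1 m) = {ratio:.4f}")
```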

  15. Semi-physical Simulation of the Airborne InSAR based on Rigorous Geometric Model and Real Navigation Data

    Science.gov (United States)

    Changyong, Dou; Huadong, Guo; Chunming, Han; yuquan, Liu; Xijuan, Yue; Yinghui, Zhao

    2014-03-01

    Raw signal simulation is a useful tool for the system design, mission planning, processing algorithm testing, and inversion algorithm design of Synthetic Aperture Radar (SAR). Due to the wide and highly frequent variations of the aircraft's trajectory and attitude, and the low accuracy of the Position and Orientation System (POS)'s recorded data, it is difficult to quantitatively study the sensitivity of the key parameters, i.e., the baseline length and inclination, absolute phase, and the orientation of the antennas, of the airborne Interferometric SAR (InSAR) system, resulting in challenges for its applications. Furthermore, the imprecise estimation of the installation offset between the Global Positioning System (GPS), Inertial Measurement Unit (IMU) and the InSAR antennas compounds the issue. An airborne interferometric SAR (InSAR) simulation based on a rigorous geometric model and real navigation data is proposed in this paper, providing a way to quantitatively study the key parameters and to evaluate their effect on the applications of airborne InSAR, such as photogrammetric mapping, high-resolution Digital Elevation Model (DEM) generation, and surface deformation by Differential InSAR technology. The simulation can also provide a reference for the optimal design of the InSAR system and for the improvement of InSAR data processing technologies such as motion compensation, imaging, image co-registration, and application parameter retrieval.

  16. Semi-physical Simulation of the Airborne InSAR based on Rigorous Geometric Model and Real Navigation Data

    International Nuclear Information System (INIS)

    Changyong, Dou; Huadong, Guo; Chunming, Han; Yuquan, Liu; Xijuan, Yue; Yinghui, Zhao

    2014-01-01

    Raw signal simulation is a useful tool for the system design, mission planning, processing algorithm testing, and inversion algorithm design of Synthetic Aperture Radar (SAR). Due to the wide and highly frequent variations of the aircraft's trajectory and attitude, and the low accuracy of the Position and Orientation System (POS)'s recorded data, it is difficult to quantitatively study the sensitivity of the key parameters, i.e., the baseline length and inclination, absolute phase, and the orientation of the antennas, of the airborne Interferometric SAR (InSAR) system, resulting in challenges for its applications. Furthermore, the imprecise estimation of the installation offset between the Global Positioning System (GPS), Inertial Measurement Unit (IMU) and the InSAR antennas compounds the issue. An airborne interferometric SAR (InSAR) simulation based on a rigorous geometric model and real navigation data is proposed in this paper, providing a way to quantitatively study the key parameters and to evaluate their effect on the applications of airborne InSAR, such as photogrammetric mapping, high-resolution Digital Elevation Model (DEM) generation, and surface deformation by Differential InSAR technology. The simulation can also provide a reference for the optimal design of the InSAR system and for the improvement of InSAR data processing technologies such as motion compensation, imaging, image co-registration, and application parameter retrieval.

  17. Spatiotemporal Simulation of Tourist Town Growth Based on the Cellular Automata Model: The Case of Sanpo Town in Hebei Province

    Directory of Open Access Journals (Sweden)

    Jun Yang

    2013-01-01

    Full Text Available Spatiotemporal simulation of tourist town growth is important for research on land use/cover change under the influence of urbanization. Many scholars have shown great interest in the unique pattern of driving urban development with tourism development. Based on the cellular automata (CA) model, we simulated and predicted the spatiotemporal growth of Sanpo town in Hebei Province, using the tourism urbanization growth model. Results showed that (1) the average annual growth rate of the entire region was 1.5 Ha2 per year from 2005 to 2010, 4 Ha2 per year from 2010 to 2015, and 2.5 Ha2 per year from 2015 to 2020; (2) the urban growth rate increased yearly, with regional differences, and had a high degree of correlation with the Euclidean distance to the town center, traffic routes, attractions, and other factors; (3) Gougezhuang, an important village center in the west of the town, has demonstrated traffic advantages and an increased growth rate since 2010; (4) Magezhuang village has the largest population in the region, so economic advantages have driven the development of rural urbanization. The results showed that CA has high reliability in simulating the spatiotemporal evolution of a tourist town, which assists the study of spatiotemporal growth under urbanization and the rational protection of tourism resources.

  18. A simulation-based robust biofuel facility location model for an integrated bio-energy logistics network

    Directory of Open Access Journals (Sweden)

    Jae-Dong Hong

    2014-10-01

    Full Text Available Purpose: The purpose of this paper is to propose a simulation-based robust biofuel facility location model for solving an integrated bio-energy logistics network (IBLN) problem, where biomass yield is often uncertain or difficult to determine. Design/methodology/approach: The IBLN considered in this paper consists of four different facilities: farm or harvest site (HS), collection facility (CF), biorefinery (BR), and blending station (BS). The authors propose a mixed integer quadratic modeling approach to simultaneously determine the optimal CF and BR locations and the corresponding biomass and bio-energy transportation plans. The authors randomly generate the biomass yield of each HS, find the optimal locations of CFs and BRs for each generated biomass yield, and select the robust locations of CFs and BRs to show the effects of biomass yield uncertainty on the optimality of CF and BR locations. Case studies using data from the State of South Carolina in the United States are conducted to demonstrate the developed model's capability to better handle the impact of uncertainty in biomass yield. Findings: The results illustrate that the robust location model for BRs and CFs works very well in terms of total logistics costs. The proposed model would help decision-makers find the most robust locations for biorefineries and collection facilities, which usually require huge investments, and would assist potential investors in identifying the least-cost or important facilities to invest in the biomass and bio-energy industry. Originality/value: An optimal biofuel facility location model is formulated for the case of deterministic biomass yield. To improve the robustness of the model for cases with probabilistic biomass yield, the model is evaluated by a simulation approach using case studies. The proposed model and robustness concept would be a very useful tool to help potential biofuel investors minimize their investment risk.

  19. Numerical simulation of cryogenic cavitating flow by an extended transport-based cavitation model with thermal effects

    Science.gov (United States)

    Zhang, Shaofeng; Li, Xiaojun; Zhu, Zuchao

    2018-06-01

    Thermodynamic effects on cryogenic cavitating flow are important to the accuracy of numerical simulations, mainly because cryogenic fluids are thermo-sensitive and the vapour saturation pressure is strongly dependent on the local temperature. The present study analyses the thermal cavitating flow in liquid nitrogen around a 2D hydrofoil. Thermal effects were considered using the RNG k-ε turbulence model with a modified turbulent eddy viscosity and the mass transfer homogeneous cavitation model coupled with the energy equation. In the cavitation model, the saturated vapour pressure is modified based on the Clausius-Clapeyron equation. A convection heat transfer approach is also considered to extend the Zwart-Gerber-Belamri model. The predicted pressure and temperature inside the cavity under cryogenic conditions show that the modified Zwart-Gerber-Belamri model is in agreement with the experimental data of Hord et al. at NASA, especially in the thermal field. The thermal effect significantly affects the cavitation dynamics during the phase-change process, as it can delay or suppress the occurrence and development of cavitation. Based on the modified Zwart-Gerber-Belamri model proposed in this paper, better prediction of cryogenic cavitation is attainable.
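
    A minimal sketch of a Clausius-Clapeyron-type correction of the saturation pressure around a reference temperature is given below; the liquid-nitrogen property values are rough assumed numbers, and the linearisation only illustrates the kind of local-temperature dependence the modified model accounts for.

```python
# Sketch of a Clausius-Clapeyron correction of the saturated vapour pressure
# around a reference temperature.  Property values for liquid nitrogen below
# are rough illustrative assumptions, not the data used in the paper.

def dp_dT_clapeyron(latent_heat, temp, rho_liquid, rho_vapour):
    """Clapeyron slope dp/dT = L / (T * (1/rho_v - 1/rho_l))."""
    return latent_heat / (temp * (1.0 / rho_vapour - 1.0 / rho_liquid))

def corrected_p_sat(p_sat_ref, temp_ref, temp_local, latent_heat,
                    rho_liquid, rho_vapour):
    """Linearised saturation pressure at the local (cooled) temperature."""
    slope = dp_dT_clapeyron(latent_heat, temp_ref, rho_liquid, rho_vapour)
    return p_sat_ref + slope * (temp_local - temp_ref)

# Rough assumed values for liquid nitrogen near 83 K: L ~ 199 kJ/kg,
# rho_l ~ 800 kg/m3, rho_v ~ 9 kg/m3, p_sat(83 K) ~ 0.20 MPa.
p_local = corrected_p_sat(p_sat_ref=0.20e6, temp_ref=83.0, temp_local=82.0,
                          latent_heat=199e3, rho_liquid=800.0, rho_vapour=9.0)
print(f"corrected p_sat at 82 K: {p_local / 1e6:.3f} MPa")
```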

  20. Modeling and real time simulation of an HVDC inverter feeding a weak AC system based on commutation failure study.

    Science.gov (United States)

    Mankour, Mohamed; Khiat, Mounir; Ghomri, Leila; Chaker, Abdelkader; Bessalah, Mourad

    2018-06-01

    This paper presents the modeling and study of a 12-pulse HVDC (High Voltage Direct Current) link based on real-time simulation, where the HVDC inverter is connected to a weak AC system. To study the dynamic performance of the HVDC link, two severe kinds of disturbance are applied at the HVDC converters: the first is a single-phase-to-ground AC fault and the second is a DC-link-to-ground fault. The study is based on two different modes of analysis: the first tests the performance of the DC control, and the second focuses on the effect of the protection function on the system behavior. The real-time simulation considers the strength of the AC system to which the inverter is connected, relative to the capacity of the DC link. The results obtained are validated by means of the RT-LAB platform using the digital real-time simulator Hypersim (OP-5600). The results show the effect of the DC control and the influence of the protection function in reducing the probability of commutation failures and in helping the inverter recover from commutation failure even when the DC control fails to eliminate it. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.

  1. Cyber-Physical Energy Systems Modeling, Test Specification, and Co-Simulation Based Testing

    DEFF Research Database (Denmark)

    van der Meer, A. A.; Palensky, P.; Heussen, Kai

    2017-01-01

    The gradual deployment of intelligent and coordinated devices in the electrical power system needs careful investigation of the interactions between the various domains involved. Especially due to the coupling between ICT and power systems, a holistic approach for testing and validating is required. Taking existing (quasi-)standardised smart grid system and test specification methods as a starting point, we are developing a holistic testing and validation approach that allows a very flexible way of assessing the system level aspects by various types of experiments (including virtual, real, and mixed lab settings). This paper describes the formal holistic test case specification method and applies it to a particular co-simulation experimental setup. The various building blocks of such a simulation (i.e., FMI, mosaik, domain-specific simulation federates) are covered in more detail.

  2. Component-based framework for subsurface simulations

    International Nuclear Information System (INIS)

    Palmer, B J; Fang, Yilin; Hammond, Glenn; Gurumoorthi, Vidhya

    2007-01-01

    Simulations in the subsurface environment represent a broad range of phenomena covering an equally broad range of scales. Developing modelling capabilities that can integrate models representing different phenomena acting at different scales presents formidable challenges both from the algorithmic and the computer science perspective. This paper describes the development of an integrated framework that will be used to combine different models into a single simulation. Initial work has focused on creating two frameworks, one for performing smooth particle hydrodynamics (SPH) simulations of fluid systems, the other for performing grid-based continuum simulations of reactive subsurface flow. The SPH framework is based on a parallel code developed for pore-scale simulations, while the continuum grid-based framework is based on the STOMP (Subsurface Transport Over Multiple Phases) code developed at PNNL. Future work will focus on combining the frameworks to perform multiscale, multiphysics simulations of reactive subsurface flow.

  3. Evaluation of the interindividual human variation in bioactivation of methyleugenol using physiologically based kinetic modeling and Monte Carlo simulations

    Energy Technology Data Exchange (ETDEWEB)

    Al-Subeihi, Ala' A.A., E-mail: subeihi@yahoo.com [Division of Toxicology, Wageningen University, Tuinlaan 5, 6703 HE Wageningen (Netherlands); BEN-HAYYAN-Aqaba International Laboratories, Aqaba Special Economic Zone Authority (ASEZA), P. O. Box 2565, Aqaba 77110 (Jordan); Alhusainy, Wasma; Kiwamoto, Reiko; Spenkelink, Bert [Division of Toxicology, Wageningen University, Tuinlaan 5, 6703 HE Wageningen (Netherlands); Bladeren, Peter J. van [Division of Toxicology, Wageningen University, Tuinlaan 5, 6703 HE Wageningen (Netherlands); Nestec S.A., Avenue Nestlé 55, 1800 Vevey (Switzerland); Rietjens, Ivonne M.C.M.; Punt, Ans [Division of Toxicology, Wageningen University, Tuinlaan 5, 6703 HE Wageningen (Netherlands)

    2015-03-01

    The present study aims at predicting the level of formation of the ultimate carcinogenic metabolite of methyleugenol, 1′-sulfooxymethyleugenol, in the human population by taking variability in key bioactivation and detoxification reactions into account using Monte Carlo simulations. Depending on the metabolic route, variation was simulated based on kinetic constants obtained from incubations with a range of individual human liver fractions or by combining kinetic constants obtained for specific isoenzymes with literature reported human variation in the activity of these enzymes. The results of the study indicate that formation of 1′-sulfooxymethyleugenol is predominantly affected by variation in i) P450 1A2-catalyzed bioactivation of methyleugenol to 1′-hydroxymethyleugenol, ii) P450 2B6-catalyzed epoxidation of methyleugenol, iii) the apparent kinetic constants for oxidation of 1′-hydroxymethyleugenol, and iv) the apparent kinetic constants for sulfation of 1′-hydroxymethyleugenol. Based on the Monte Carlo simulations a so-called chemical-specific adjustment factor (CSAF) for intraspecies variation could be derived by dividing different percentiles by the 50th percentile of the predicted population distribution for 1′-sulfooxymethyleugenol formation. The obtained CSAF value at the 90th percentile was 3.2, indicating that the default uncertainty factor of 3.16 for human variability in kinetics may adequately cover the variation within 90% of the population. Covering 99% of the population requires a larger uncertainty factor of 6.4. In conclusion, the results showed that adequate predictions on interindividual human variation can be made with Monte Carlo-based PBK modeling. For methyleugenol this variation was observed to be in line with the default variation generally assumed in risk assessment. - Highlights: • Interindividual human differences in methyleugenol bioactivation were simulated. • This was done using in vitro incubations, PBK modeling

  4. Evaluation of the interindividual human variation in bioactivation of methyleugenol using physiologically based kinetic modeling and Monte Carlo simulations

    International Nuclear Information System (INIS)

    Al-Subeihi, Ala' A.A.; Alhusainy, Wasma; Kiwamoto, Reiko; Spenkelink, Bert; Bladeren, Peter J. van; Rietjens, Ivonne M.C.M.; Punt, Ans

    2015-01-01

    The present study aims at predicting the level of formation of the ultimate carcinogenic metabolite of methyleugenol, 1′-sulfooxymethyleugenol, in the human population by taking variability in key bioactivation and detoxification reactions into account using Monte Carlo simulations. Depending on the metabolic route, variation was simulated based on kinetic constants obtained from incubations with a range of individual human liver fractions or by combining kinetic constants obtained for specific isoenzymes with literature reported human variation in the activity of these enzymes. The results of the study indicate that formation of 1′-sulfooxymethyleugenol is predominantly affected by variation in i) P450 1A2-catalyzed bioactivation of methyleugenol to 1′-hydroxymethyleugenol, ii) P450 2B6-catalyzed epoxidation of methyleugenol, iii) the apparent kinetic constants for oxidation of 1′-hydroxymethyleugenol, and iv) the apparent kinetic constants for sulfation of 1′-hydroxymethyleugenol. Based on the Monte Carlo simulations a so-called chemical-specific adjustment factor (CSAF) for intraspecies variation could be derived by dividing different percentiles by the 50th percentile of the predicted population distribution for 1′-sulfooxymethyleugenol formation. The obtained CSAF value at the 90th percentile was 3.2, indicating that the default uncertainty factor of 3.16 for human variability in kinetics may adequately cover the variation within 90% of the population. Covering 99% of the population requires a larger uncertainty factor of 6.4. In conclusion, the results showed that adequate predictions on interindividual human variation can be made with Monte Carlo-based PBK modeling. For methyleugenol this variation was observed to be in line with the default variation generally assumed in risk assessment. - Highlights: • Interindividual human differences in methyleugenol bioactivation were simulated. • This was done using in vitro incubations, PBK modeling
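
    The CSAF calculation described above can be illustrated with a short Monte Carlo sketch: simulate a population distribution of the dose metric and divide upper percentiles by the median. The lognormal toy model below is an assumption standing in for the actual PBK model.

```python
import random

# Sketch of the chemical-specific adjustment factor (CSAF) calculation:
# simulate a population distribution of a dose metric by Monte Carlo sampling
# of kinetic parameters, then divide an upper percentile by the median.
# The lognormal toy model is an assumption, not the methyleugenol PBK model.

random.seed(42)

def simulated_dose_metric():
    # Placeholder: product of two lognormally varying enzyme activities.
    return random.lognormvariate(0.0, 0.4) * random.lognormvariate(0.0, 0.3)

population = sorted(simulated_dose_metric() for _ in range(100_000))

def percentile(sorted_values, q):
    idx = int(q / 100.0 * (len(sorted_values) - 1))
    return sorted_values[idx]

median = percentile(population, 50)
print("CSAF (90th/50th):", round(percentile(population, 90) / median, 2))
print("CSAF (99th/50th):", round(percentile(population, 99) / median, 2))
```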

  5. “Space, the Final Frontier”: How Good are Agent-Based Models at Simulating Individuals and Space in Cities?

    Directory of Open Access Journals (Sweden)

    Alison Heppenstall

    2016-01-01

    Full Text Available Cities are complex systems, comprising many interacting parts. How we simulate and understand causality in urban systems is continually evolving. Over the last decade the agent-based modeling (ABM) paradigm has provided a new lens for understanding the effects of interactions of individuals and how, through such interactions, macro structures emerge, both in the social and physical environment of cities. However, such a paradigm has been hindered by limits on computational power and a lack of large fine-scale datasets. Within the last few years we have witnessed a massive increase in computational processing power and storage, combined with the onset of Big Data. Today geographers find themselves in a data-rich era. We now have access to a variety of data sources (e.g., social media, mobile phone data, etc.) that tell us how, and when, individuals are using urban spaces. These data raise several questions: can we effectively use them to understand and model cities as complex entities? How well have ABM approaches lent themselves to simulating the dynamics of urban processes? What has been, or will be, the influence of Big Data on increasing our ability to understand and simulate cities? What is the appropriate level of spatial analysis and time frame to model urban phenomena? Within this paper we discuss these questions using several examples of ABM applied to urban geography to begin a dialogue about the utility of ABM for urban modeling. The arguments that the paper raises are applicable across the wider research environment where researchers are considering using this approach.

  6. Coupling the WRF model with a temperature index model based on remote sensing for snowmelt simulations in a river basin in the Altay Mountains, northwest China

    Science.gov (United States)

    Wu, X.; Shen, Y.; Wang, N.; Pan, X.; Zhang, W.; He, J.; Wang, G.

    2017-12-01

    Snowmelt water is an important freshwater resource in the Altay Mountains in northwest China, and it is also crucial for the local ecosystem and for sustainable economic and social development; however, a warming climate and rapid spring snowmelt can cause floods that endanger both the environment and public and personal property and safety. This study simulates snowmelt in the Kayiertesi River catchment using a temperature-index model based on remote sensing, coupled with high-resolution meteorological data obtained from NCEP reanalysis fields that were downscaled using the Weather Research and Forecasting (WRF) model and then bias-corrected using a statistical downscaling model. Validation of the forcing data revealed that the high-resolution meteorological fields derived from the downscaled NCEP reanalysis were reliable for driving the snowmelt model. Parameters of the temperature-index model based on remote sensing were calibrated for spring 2014, and model performance was validated using MODIS snow cover and snow observations from spring 2012. The results show that the temperature-index model based on remote sensing performed well, with a simulation mean relative error of 6.7% and a Nash-Sutcliffe efficiency of 0.98 in spring 2012 in the river basin in the Altay Mountains. Based on the reliable distributed snow water equivalent simulation, daily snowmelt runoff was calculated for spring 2012 in the basin. In the study catchment, spring snowmelt runoff accounts for 72% of spring runoff and 21% of annual runoff. Snowmelt is the main source of runoff for the catchment and should be managed and utilized effectively. The results provide a basis for snowmelt runoff predictions, so as to prevent snowmelt-induced floods, and also provide a generalizable approach that can be applied to other remote locations where high-density, long-term observational data are lacking.
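
    For readers unfamiliar with temperature-index melt models, the sketch below shows the basic degree-day bookkeeping such a model performs once forcing temperatures are available; the degree-day factor, threshold temperature and forcing series are illustrative assumptions, not the calibrated values from this study.

```python
# Minimal temperature-index (degree-day) snowmelt sketch of the kind described
# above; the degree-day factor, threshold temperature and forcing are illustrative.
import numpy as np

ddf = 4.0          # degree-day factor, mm w.e. per degree C per day (assumed)
t_threshold = 0.0  # melt threshold temperature, degree C (assumed)

# Hypothetical daily mean air temperature (degree C) and initial snow water equivalent (mm)
temperature = np.array([-3.2, -0.5, 1.4, 3.8, 5.1, 2.0, 6.3])
swe = 120.0

melt_series = []
for t in temperature:
    potential_melt = ddf * max(t - t_threshold, 0.0)  # mm w.e. per day
    melt = min(potential_melt, swe)                   # cannot melt more than is stored
    swe -= melt
    melt_series.append(melt)

print("daily melt (mm):", np.round(melt_series, 1), "remaining SWE (mm):", round(swe, 1))
```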

  7. Research on the Integration of Bionic Geometry Modeling and Simulation of Robot Foot Based on Characteristic Curve

    Science.gov (United States)

    He, G.; Zhu, H.; Xu, J.; Gao, K.; Zhu, D.

    2017-09-01

    Shape bionics is an important aspect of bionic robot research, and it cannot be carried out without shape modeling and numerical simulation of the bionic object, which are tedious and time-consuming tasks. In order to improve the efficiency of shape bionic design, the feet of animals living in soft soil and swamp environments are taken as bionic objects, and characteristic skeleton curves, section curves, joint rotation variables, position and other parameters are used to describe the shape and position information of the bionic object’s sole, toes and flipper. The geometric model of the bionic object is built by parameterizing these characteristic curves and variables. On this basis, an integration framework covering parametric modeling, finite element modeling, dynamic analysis and post-processing of the sinking process in soil is proposed in this paper. Examples of a bionic ostrich foot and a bionic duck foot are also given. The parametric modeling and integration technique enables rapid, improved design based on the bionic object; it can greatly improve the efficiency and quality of robot foot bionic design and is of practical significance for raising the level of bionic design of the robot foot’s shape and structure.
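
    The sketch below illustrates, under assumed control points and radii, how a toe-like shape can be described by a characteristic skeleton curve with swept circular sections, in the spirit of the parametric description above; it is a generic construction, not the authors' modeling code.

```python
# Minimal sketch of describing a toe by a characteristic skeleton curve plus
# swept circular sections. Control points, radii and section count are illustrative.
import numpy as np

def bezier(points, t):
    """Evaluate a cubic Bezier skeleton curve at parameter t in [0, 1]."""
    p0, p1, p2, p3 = points
    return ((1 - t) ** 3 * p0 + 3 * (1 - t) ** 2 * t * p1
            + 3 * (1 - t) * t ** 2 * p2 + t ** 3 * p3)

# Hypothetical skeleton control points of one toe (x, y, z in mm)
skeleton = np.array([[0, 0, 0], [30, 5, 2], [60, 8, 4], [90, 6, 0]], dtype=float)

ts = np.linspace(0.0, 1.0, 20)
centres = np.array([bezier(skeleton, t) for t in ts])
radii = 8.0 * (1.0 - 0.7 * ts)  # toe tapers towards the tip (assumed profile)

# Circular section points around each skeleton point (local y-z plane for simplicity)
angles = np.linspace(0.0, 2 * np.pi, 16, endpoint=False)
sections = np.array([
    c + r * np.stack([np.zeros_like(angles), np.cos(angles), np.sin(angles)], axis=1)
    for c, r in zip(centres, radii)
])
print("section point cloud shape:", sections.shape)  # (20 sections, 16 points, 3 coords)
```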

  8. Towards personalised management of atherosclerosis via computational models in vascular clinics: technology based on patient-specific simulation approach

    Science.gov (United States)

    Di Tomaso, Giulia; Agu, Obiekezie; Pichardo-Almarza, Cesar

    2014-01-01

    The development of a new technology based on patient-specific modelling for personalised healthcare in the case of atherosclerosis is presented. Atherosclerosis is the main cause of death in the world and it has become a burden on clinical services as it manifests itself in many diverse forms, such as coronary artery disease, cerebrovascular disease/stroke and peripheral arterial disease. It is also a multifactorial, chronic and systemic process that lasts for a lifetime, putting enormous financial and clinical pressure on national health systems. In this Letter, the postulate is that new technologies for healthcare using computer simulations can, in the future, be developed into in-silico management and support systems. These new technologies will be based on predictive models (including the integration of observations, theories and predictions across a range of temporal and spatial scales, scientific disciplines, key risk factors and anatomical sub-systems) combined with digital patient data and visualisation tools. Although the problem is extremely complex, a simulation workflow and an exemplar application of this type of technology for clinical use are presented, which are currently being developed by a multidisciplinary team following the requirements and constraints of the Vascular Service Unit at the University College Hospital, London. PMID:26609369

  9. SimVascular 2.0: an Integrated Open Source Pipeline for Image-Based Cardiovascular Modeling and Simulation

    Science.gov (United States)

    Lan, Hongzhi; Merkow, Jameson; Updegrove, Adam; Schiavazzi, Daniele; Wilson, Nathan; Shadden, Shawn; Marsden, Alison

    2015-11-01

    SimVascular (www.simvascular.org) is currently the only fully open source software package that provides a complete pipeline from medical image-based modeling to patient-specific blood flow simulation and analysis. It was initially released in 2007 and has contributed to numerous advances in fundamental hemodynamics research, surgical planning, and medical device design. However, early versions had several major barriers preventing wider adoption by new users, large-scale application in clinical and research studies, and educational access. In recent years, SimVascular 2.0 has made significant progress by integrating open source alternatives for the expensive commercial libraries previously required for anatomic modeling, mesh generation and the linear solver. In addition, it simplified the cross-platform compilation process, improved the graphical user interface and launched a comprehensive documentation website. Many enhancements and new features have been incorporated for the whole pipeline, such as 3-D segmentation, Boolean operations for discrete triangulated surfaces, and multi-scale coupling for closed loop boundary conditions. In this presentation we will briefly overview the modeling/simulation pipeline and the advances of the new SimVascular 2.0.

  10. Automatic generation and simulation of urban building energy models based on city datasets for city-scale building retrofit analysis

    International Nuclear Information System (INIS)

    Chen, Yixing; Hong, Tianzhen; Piette, Mary Ann

    2017-01-01

    Highlights: •Developed methods and used data models to integrate city’s public building records. •Shading from neighborhood buildings strongly influences urban building performance. •A case study demonstrated the workflow, simulation and analysis of building retrofits. •CityBES retrofit analysis feature provides actionable information for decision making. •Discussed significance and challenges of urban building energy modeling. -- Abstract: Buildings in cities consume 30–70% of total primary energy, and improving building energy efficiency is one of the key strategies towards sustainable urbanization. Urban building energy models (UBEM) can support city managers to evaluate and prioritize energy conservation measures (ECMs) for investment and the design of incentive and rebate programs. This paper presents the retrofit analysis feature of City Building Energy Saver (CityBES) to automatically generate and simulate UBEM using EnergyPlus based on cities’ building datasets and user-selected ECMs. CityBES is a new open web-based tool to support city-scale building energy efficiency strategic plans and programs. The technical details of using CityBES for UBEM generation and simulation are introduced, including the workflow, key assumptions, and major databases. Also presented is a case study that analyzes the potential retrofit energy use and energy cost savings of five individual ECMs and two measure packages for 940 office and retail buildings in six city districts in northeast San Francisco, United States. The results show that: (1) all five measures together can save 23–38% of site energy per building; (2) replacing lighting with light-emitting diode lamps and adding air economizers to existing heating, ventilation and air-conditioning (HVAC) systems are most cost-effective with an average payback of 2.0 and 4.3 years, respectively; and (3) it is not economical to upgrade HVAC systems or replace windows in San Francisco due to the city’s mild

  11. Modeling and Simulation of Matrix Converter

    DEFF Research Database (Denmark)

    Liu, Fu-rong; Klumpner, Christian; Blaabjerg, Frede

    2005-01-01

    This paper discusses the modeling and simulation of the matrix converter. Two models of the matrix converter are presented: one is based on indirect space vector modulation and the other is based on the power balance equation. The basis of these two models is given and the modeling process is introduced...

  12. Determining Nurse Aide Staffing Requirements to Provide Care Based on Resident Workload: A Discrete Event Simulation Model.

    Science.gov (United States)

    Schnelle, John F; Schroyer, L Dale; Saraf, Avantika A; Simmons, Sandra F

    2016-11-01

    Nursing aides provide most of the labor-intensive activities of daily living (ADL) care to nursing home (NH) residents. Currently, most NHs do not determine nurse aide staffing requirements based on the time to provide ADL care for their unique resident population. The lack of an objective method to determine nurse aide staffing requirements suggests that many NHs could be understaffed in their capacity to provide consistent ADL care to all residents in need. Discrete event simulation (DES) mathematically models key work parameters (eg, time to provide an episode of care and available staff) to predict the ability of the work setting to provide care over time and offers an objective method to determine nurse aide staffing needs in NHs. This study had 2 primary objectives: (1) to describe the relationship between ADL workload and the level of nurse aide staffing reported by NHs; and, (2) to use a DES model to determine the relationship between ADL workload and nurse aide staffing necessary for consistent, timely ADL care. Minimum Data Set data related to the level of dependency on staff for ADL care for residents in over 13,500 NHs nationwide were converted into 7 workload categories that captured 98% of all residents. In addition, data related to the time to provide care for the ADLs within each workload category was used to calculate a workload score for each facility. The correlation between workload and reported nurse aide staffing levels was calculated to determine the association between staffing reported by NHs and workload. Simulations to project staffing requirements necessary to provide ADL care were then conducted for 65 different workload scenarios, which included 13 different nurse aide staffing levels (ranging from 1.6 to 4.0 total hours per resident day) and 5 different workload percentiles (ranging from the 5th to the 95th percentile). The purpose of the simulation model was to determine the staffing necessary to provide care within each workload
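
    The sketch below is a heavily simplified discrete event simulation of the kind described above: residents request ADL care episodes and a fixed pool of nurse aides serves them, with waiting times recorded. Request times, care durations and staffing levels are illustrative assumptions, not the study's Minimum Data Set-derived workloads.

```python
# Minimal discrete-event-style sketch: residents request ADL care episodes and a
# fixed pool of nurse aides serves them in arrival order. All numbers are assumed.
import heapq
import random

random.seed(1)

N_AIDES = 3
N_RESIDENTS = 40

# Each resident generates one morning ADL care request (minutes after shift start)
requests = sorted((random.uniform(0, 60), i) for i in range(N_RESIDENTS))

def care_time():
    """Duration of one ADL care episode in minutes (assumed range)."""
    return random.uniform(10, 25)

aide_free_at = [0.0] * N_AIDES  # next time each aide becomes available
heapq.heapify(aide_free_at)

waits = []
for request_time, resident in requests:
    free_at = heapq.heappop(aide_free_at)   # earliest-available aide
    start = max(request_time, free_at)      # resident waits if no aide is free yet
    waits.append(start - request_time)
    heapq.heappush(aide_free_at, start + care_time())

print(f"mean wait {sum(waits)/len(waits):.1f} min, "
      f"max wait {max(waits):.1f} min with {N_AIDES} aides")
```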

  13. Particle Swarm Social Adaptive Model for Multi-Agent Based Insurgency Warfare Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Cui, Xiaohui [ORNL; Potok, Thomas E [ORNL

    2009-12-01

    To better understand insurgent activities and asymmetric warfare, a social adaptive model for modeling multiple insurgent groups attacking multiple military and civilian targets is proposed and investigated. This report presents a pilot study that uses particle swarm modeling, a widely used non-linear optimization tool, to model the emergence of an insurgency campaign. The objective of this research is to apply the particle swarm metaphor as a model of insurgent social adaptation to a dynamically changing environment and to provide insight into and understanding of insurgency warfare. Our results show that unified leadership, strategic planning, and effective communication between insurgent groups are not necessary requirements for insurgents to efficiently attain their objective.
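
    For reference, the sketch below shows the generic particle swarm velocity/position update that the metaphor refers to, applied to a toy target-seeking problem; the inertia and acceleration coefficients are standard textbook values, not the report's insurgency-specific adaptation.

```python
# Generic particle swarm update underlying the metaphor above; the inertia and
# acceleration coefficients are standard textbook values, not the report's settings.
import numpy as np

rng = np.random.default_rng(0)

def pso_step(x, v, pbest, gbest, w=0.7, c1=1.5, c2=1.5):
    """One velocity/position update for all particles (rows of x)."""
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    return x + v, v

# Toy objective: particles (e.g. agents of a group) move towards a target location
target = np.array([5.0, -2.0])
x = rng.uniform(-10, 10, size=(20, 2))
v = np.zeros_like(x)
pbest, gbest = x.copy(), x[np.argmin(np.linalg.norm(x - target, axis=1))]

for _ in range(50):
    x, v = pso_step(x, v, pbest, gbest)
    better = np.linalg.norm(x - target, axis=1) < np.linalg.norm(pbest - target, axis=1)
    pbest[better] = x[better]
    gbest = pbest[np.argmin(np.linalg.norm(pbest - target, axis=1))]

print("mean distance to target:", np.linalg.norm(x - target, axis=1).mean().round(3))
```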

  14. Dynamic Traffic Congestion Simulation and Dissipation Control Based on Traffic Flow Theory Model and Neural Network Data Calibration Algorithm

    Directory of Open Access Journals (Sweden)

    Li Wang

    2017-01-01

    Full Text Available Traffic congestion is a common problem in many countries, especially in big cities. At present, urban road traffic accidents in China occur frequently; accidents cause traffic congestion and, conversely, congestion increases the risk of accidents. The occurrence of traffic accidents usually leads to a reduction of road traffic capacity and the formation of traffic bottlenecks, causing traffic congestion. In this paper, the formation and propagation of traffic congestion are simulated using the improved medium traffic model, and the control strategy for congestion dissipation is studied. To quantify traffic congestion, an integrated urban traffic simulation platform is constructed, and a feasible data analysis, learning, and parameter calibration method based on an RBF neural network is proposed and used to support the corresponding decision support system. The simulation results prove that the control strategy proposed in this paper is effective and feasible. The temporal and spatial evolution results show that network performance is improved overall.
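
    The sketch below shows a generic Gaussian RBF network fitted by least squares, the kind of approximator used for the data analysis and parameter calibration step above; the centres, width and toy data are illustrative assumptions, not the paper's calibration pipeline.

```python
# Minimal Gaussian RBF network sketch of the kind usable for data-driven parameter
# calibration; centres, width and training data are illustrative, not the paper's setup.
import numpy as np

rng = np.random.default_rng(3)

def rbf_features(x, centres, sigma):
    """Gaussian radial basis features phi_j(x) = exp(-||x - c_j||^2 / (2 sigma^2))."""
    d2 = ((x[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

# Toy calibration data: inputs could be detector measurements, output a model parameter
X = rng.uniform(0, 1, size=(200, 2))
y = np.sin(2 * np.pi * X[:, 0]) + 0.5 * X[:, 1] + 0.05 * rng.standard_normal(200)

centres = X[rng.choice(len(X), size=20, replace=False)]  # centres picked from the data
Phi = rbf_features(X, centres, sigma=0.2)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)              # output weights by least squares

pred = rbf_features(X, centres, sigma=0.2) @ w
print("training RMSE:", np.sqrt(np.mean((pred - y) ** 2)).round(3))
```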

  15. Warranty optimisation based on the prediction of costs to the manufacturer using neural network model and Monte Carlo simulation

    Science.gov (United States)

    Stamenkovic, Dragan D.; Popovic, Vladimir M.

    2015-02-01

    Warranty is a powerful marketing tool, but it always involves additional costs to the manufacturer. In order to reduce these costs and make use of warranty's marketing potential, the manufacturer needs to master the techniques for warranty cost prediction according to the reliability characteristics of the product. In this paper a combined free-replacement and pro-rata warranty policy is analysed as the warranty model for one type of light bulb. Since operating conditions have a great impact on product reliability, they need to be considered in such an analysis. A neural network model is used to predict light bulb reliability characteristics based on data from tests of light bulbs in various operating conditions. Compared with a linear regression model used in the literature for similar tasks, the neural network model proved to be a more accurate method for such prediction. Reliability parameters obtained in this way are later used in a Monte Carlo simulation to predict the times to failure needed for warranty cost calculation. The results of the analysis make it possible for the manufacturer to choose the optimal warranty policy based on expected product operating conditions. In this way, the manufacturer can lower the costs and increase the profit.
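
    As a hedged illustration of the Monte Carlo step, the sketch below estimates the expected warranty cost per unit under a combined free-replacement/pro-rata policy with Weibull times to failure; the Weibull parameters, prices and warranty limits are assumed placeholders, not the reliability characteristics predicted by the paper's neural network.

```python
# Monte Carlo sketch of expected cost per unit under a combined free-replacement /
# pro-rata warranty. All numeric values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(7)

price, cost = 5.0, 2.0          # sale price and manufacturing cost per bulb (assumed)
w1, w2 = 0.5, 1.0               # free-replacement limit and total warranty length, years (assumed)
shape, scale = 1.8, 1.6         # Weibull reliability parameters (assumed)

t = scale * rng.weibull(shape, size=1_000_000)   # simulated times to first failure

cost_per_unit = np.where(
    t <= w1, cost,                                        # free replacement
    np.where(t <= w2, price * (w2 - t) / (w2 - w1), 0.0)  # pro-rata refund
)
print("expected warranty cost per unit sold:", cost_per_unit.mean().round(3))
```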

  16. Abstraction of an Affective-Cognitive Decision Making Model Based on Simulated Behaviour and Perception Chains

    Science.gov (United States)

    Sharpanskykh, Alexei; Treur, Jan

    Employing rich internal agent models of actors in large-scale socio-technical systems often results in scalability issues. The problem addressed in this paper is how to improve the computational properties of a complex internal agent model while preserving its behavioural properties. The problem is addressed for the case of an existing affective-cognitive decision making model instantiated for an emergency scenario. For this internal decision model an abstracted behavioural agent model is obtained, which ensures a substantial increase in computational efficiency at the cost of approximately 1% behavioural error. The abstraction technique used can be applied to a wide range of internal agent models with loops, for example, those involving mutual affective-cognitive interactions.

  17. Performance of process-based models for simulation of grain N in crop rotations across Europe

    Czech Academy of Sciences Publication Activity Database

    Xiaogang, Y.; Kesebaum, K. C.; Kollas, C.; Manevski, K.; Baby, S.; Beaudoin, N.; Öztürk, I.; Gaiser, T.; Wu, L.; Hoffmann, M.; Charfeddine, M.; Conradt, T.; Constantin, J.; Ewert, F.; de Cortazar-Atauri, I. G.; Giglio, L.; Hlavinka, Petr; Hoffmann, H.; Launay, M.; Louarn, G.; Manderscheid, R.; Mary, B.; Mirschel, W.; Nendel, C.; Pacholski, A.; Palouso, T.; Ripoche-Wachter, D.; Rötter, R. P.; Ruget, F.; Sharif, B.; Trnka, Miroslav; Ventrella, D.; Weigel, H-J.; Olesen, J. E.

    2017-01-01

    Roč. 154, JUN (2017), s. 63-77 ISSN 0308-521X R&D Projects: GA MŠk(CZ) LO1415; GA MZe QJ1310123 Institutional support: RVO:67179843 Keywords : Calibration * Crop model * Crop rotation * Grain N content * Model evaluation * Model initialization Subject RIV: EH - Ecology, Behaviour OBOR OECD: Environmental sciences (social aspects to be 5.7) Impact factor: 2.571, year: 2016

  18. Stochastic Modeling and Simulation of Near-Fault Ground Motions for Performance-Based Earthquake Engineering

    OpenAIRE

    Dabaghi, Mayssa

    2014-01-01

    A comprehensive parameterized stochastic model of near-fault ground motions in two orthogonal horizontal directions is developed. The proposed model uniquely combines several existing and new sub-models to represent major characteristics of recorded near-fault ground motions. These characteristics include near-fault effects of directivity and fling step; temporal and spectral non-stationarity; intensity, duration and frequency content characteristics; directionality of components, as well as ...

  19. Antarctic 20th Century Accumulation Changes Based on Regional Climate Model Simulations

    Directory of Open Access Journals (Sweden)

    Klaus Dethloff

    2010-01-01

    investigated on the basis of ERA-40 data and HIRHAM simulations. It is shown that the regional accumulation changes are largely driven by changes in the transient activity around the Antarctic coasts due to the varying AAO phases. During positive AAO phases, more transient pressure systems travel towards the continent, and Western Antarctica and parts of South-Eastern Antarctica gain more precipitation and mass. Over central Antarctica the prevailing anticyclone causes a strengthening of polar desertification connected with a reduced surface mass balance in the northern part of East Antarctica.

  20. Model-based design of a pilot-scale simulated moving bed for purification of citric acid from fermentation broth.

    Science.gov (United States)

    Wu, Jinglan; Peng, Qijun; Arlt, Wolfgang; Minceva, Mirjana

    2009-12-11

    One of the conventional processes used for the recovery of citric acid from its fermentation broth is environmentally harmful and cost-intensive. In this work an innovative benign process, which comprises simulated moving bed (SMB) technology and use of a tailor-made tertiary poly(4-vinylpyridine) (PVP) resin as a stationary phase, is proposed. This paper focuses on a model-based design of the operating conditions for an existing pilot-scale SMB plant. The SMB unit is modeled on the basis of experimentally determined hydrodynamics, thermodynamics and mass transfer characteristics in a single chromatographic column. Three mathematical models are applied and validated for the prediction of the experimentally attained breakthrough and elution profiles of citric acid and the main impurity component (glucose). The transport dispersive model was selected for the SMB simulation and design studies, since it gives a satisfactory prediction of the elution profiles within acceptable computational time. The equivalent true moving bed (TMB) and SMB models give a good prediction of the experimentally attained SMB separation performances, obtained with a real clarified and concentrated fermentation broth as a feed mixture. The SMB separation requirements are set to at least 99.8% citric acid purity and 90% citric acid recovery in the extract stream. The complete regeneration in sections 1 and 4 is unnecessary. Therefore, the net flow rates in all four SMB sections have been considered in the unit design. The influences of the operating conditions (the flow rate in each section, switching time and unit configuration) on the SMB performances were investigated systematically. The resulting SMB design provides 99.8% citric acid purity and 97.2% citric acid recovery in the extract. In addition, the citric acid concentration in the extract is half of its concentration in the pretreated fermentation broth (feed).
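
    For orientation, the sketch below computes the classical net flow-rate ratios used in SMB design (triangle theory for linear isotherms) and checks them against complete-separation bounds; the column volume, porosity, switching time, Henry constants and section flow rates are illustrative placeholders, not the paper's measured PVP-resin values or its final operating point.

```python
# Sketch of the classical net flow-rate ratios used in SMB design (triangle theory
# for linear isotherms). All numeric values are illustrative placeholders.
V_col = 50.0      # column volume, mL (assumed)
eps = 0.4         # total bed porosity (assumed)
t_switch = 5.0    # switching time, min (assumed)
H_weak, H_strong = 0.3, 1.2   # Henry constants of the less/more retained species (assumed)

def m_ratio(q_section):
    """Net flow-rate ratio m_j = (Q_j * t_s - V * eps) / (V * (1 - eps))."""
    return (q_section * t_switch - V_col * eps) / (V_col * (1.0 - eps))

flows = {"1": 14.0, "2": 8.0, "3": 10.0, "4": 5.0}   # section flow rates, mL/min (assumed)
m = {j: m_ratio(q) for j, q in flows.items()}
print({j: round(v, 2) for j, v in m.items()})

# Complete-separation conditions for linear isotherms:
#   m1 > H_strong, H_weak < m2 < m3 < H_strong, m4 < H_weak
ok = m["1"] > H_strong and H_weak < m["2"] < m["3"] < H_strong and m["4"] < H_weak
print("operating point inside the complete-separation region:", ok)
```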

  1. A theory-based parameterization for heterogeneous ice nucleation and implications for the simulation of ice processes in atmospheric models

    Science.gov (United States)

    Savre, J.; Ekman, A. M. L.

    2015-05-01

    A new parameterization for heterogeneous ice nucleation constrained by laboratory data and based on classical nucleation theory is introduced. Key features of the parameterization include the following: a consistent and modular modeling framework for treating condensation/immersion and deposition freezing, the possibility to consider various potential ice nucleating particle types (e.g., dust, black carbon, and bacteria), and the possibility to account for an aerosol size distribution. The ice nucleating ability of each aerosol type is described using a contact angle (θ) probability density function (PDF). A new modeling strategy is described to allow the θ PDF to evolve in time so that the most efficient ice nuclei (associated with the lowest θ values) are progressively removed as they nucleate ice. A computationally efficient quasi Monte Carlo method is used to integrate the computed ice nucleation rates over both size and contact angle distributions. The parameterization is employed in a parcel model, forced by an ensemble of Lagrangian trajectories extracted from a three-dimensional simulation of a springtime low-level Arctic mixed-phase cloud, in order to evaluate the accuracy and convergence of the method using different settings. The same model setup is then employed to examine the importance of various parameters for the simulated ice production. Modeling the time evolution of the θ PDF is found to be particularly crucial; assuming a time-independent θ PDF significantly overestimates the ice nucleation rates. It is stressed that the capacity of black carbon (BC) to form ice in the condensation/immersion freezing mode is highly uncertain, in particular at temperatures warmer than -20°C. In its current version, the parameterization most likely overestimates ice initiation by BC.
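
    The sketch below illustrates the key feature highlighted above, a contact-angle distribution that evolves as the most efficient ice nuclei freeze first and are removed; the nucleation-rate form and all constants are order-of-magnitude stand-ins, not the laboratory-constrained values of the parameterization.

```python
# Sketch of an evolving contact-angle PDF: the most efficient ice nuclei (lowest
# theta) nucleate first and are removed from the population. The rate expression
# and all constants are illustrative stand-ins.
import numpy as np

theta = np.linspace(np.radians(5), np.radians(175), 200)   # contact-angle grid

# Initial theta PDF of the particle population (assumed Gaussian shape)
pdf = np.exp(-0.5 * ((theta - np.radians(90)) / np.radians(25)) ** 2)
n = 1e3 * pdf / np.trapz(pdf, theta)                        # particles per unit theta

def nucleation_rate(th):
    """CNT-like rate: lower contact angle -> lower barrier -> faster nucleation.
    Prefactor and barrier scale are assumed, order-of-magnitude placeholders."""
    f_geom = (2.0 + np.cos(th)) * (1.0 - np.cos(th)) ** 2 / 4.0
    return 1e2 * np.exp(-20.0 * f_geom)                     # per particle per second

dt, n_steps = 1.0, 600                                      # time step (s) and step count
ice_formed = 0.0
for _ in range(n_steps):
    frozen_fraction = 1.0 - np.exp(-nucleation_rate(theta) * dt)
    ice_formed += np.trapz(n * frozen_fraction, theta)
    n *= (1.0 - frozen_fraction)                            # remove the efficient IN first

print(f"ice crystals formed: {ice_formed:.1f}, particles left: {np.trapz(n, theta):.1f}")
```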

  2. Effect of Inquiry-Based Computer Simulation Modeling on Pre-Service Teachers' Understanding of Homeostasis and Their Perceptions of Design Features

    Science.gov (United States)

    Chabalengula, Vivien; Fateen, Rasheta; Mumba, Frackson; Ochs, Laura Kathryn

    2016-01-01

    This study investigated the effect of an inquiry-based computer simulation modeling (ICoSM) instructional approach on pre-service science teachers' understanding of homeostasis and its related concepts, and their perceived design features of the ICoSM and simulation that enhanced their conceptual understanding of these concepts. Fifty pre-service…

  3. Simulation and Non-Simulation Based Human Reliability Analysis Approaches

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald Laurids [Idaho National Lab. (INL), Idaho Falls, ID (United States); Shirley, Rachel Elizabeth [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey Clark [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2014-12-01

    Part of the U.S. Department of Energy’s Light Water Reactor Sustainability (LWRS) Program, the Risk-Informed Safety Margin Characterization (RISMC) Pathway develops approaches to estimating and managing safety margins. RISMC simulations pair deterministic plant physics models with probabilistic risk models. As human interactions are an essential element of plant risk, it is necessary to integrate human actions into the RISMC risk model. In this report, we review simulation-based and non-simulation-based human reliability assessment (HRA) methods. Chapter 2 surveys non-simulation-based HRA methods. Conventional HRA methods target static Probabilistic Risk Assessments for Level 1 events. These methods would require significant modification for use in dynamic simulation of Level 2 and Level 3 events. Chapter 3 is a review of human performance models. A variety of methods and models simulate dynamic human performance; however, most of these human performance models were developed outside the risk domain and have not been used for HRA. The exception is the ADS-IDAC model, which can be thought of as a virtual operator program. This model is resource-intensive but provides a detailed model of every operator action in a given scenario, along with models of numerous factors that can influence operator performance. Finally, Chapter 4 reviews the treatment of timing of operator actions in HRA methods. This chapter is an example of one of the critical gaps between existing HRA methods and the needs of dynamic HRA. This report summarizes the foundational information needed to develop a feasible approach to modeling human interactions in the RISMC simulations.

  4. Budget calculations for ozone and its precursors: Seasonal and episodic features based on model simulations

    NARCIS (Netherlands)

    Memmesheimer, M.; Ebel, A.; Roemer, M.

    1997-01-01

    Results from two air quality models (LOTOS, EURAD) have been used to analyse the contribution of the different terms in the continuity equation to the budget of ozone, NO(x) and PAN. Both models cover large parts of Europe and describe the processes relevant for tropospheric chemistry and dynamics.

  5. Numerical simulation of the aerodynamic field in complex terrain wind farm based on actuator disk model

    DEFF Research Database (Denmark)

    Xu, Chang; Li, Chen Qi; Han, Xing Xing

    2015-01-01

    Study of the aerodynamic field in complex terrain is significant for wind farm micro-siting and wind power prediction. This paper modeled the wind turbine through an actuator disk model and solved the aerodynamic field by CFD to study the influence of meshing, boundary conditions and turbulence ...

  6. The Basic Immune Simulator: An agent-based model to study the interactions between innate and adaptive immunity

    Directory of Open Access Journals (Sweden)

    Orosz Charles G

    2007-09-01

    Full Text Available Abstract Background We introduce the Basic Immune Simulator (BIS), an agent-based model created to study the interactions between the cells of the innate and adaptive immune system. Innate immunity, the initial host response to a pathogen, generally precedes adaptive immunity, which generates immune memory for an antigen. The BIS simulates basic cell types, mediators and antibodies, and consists of three virtual spaces representing parenchymal tissue, secondary lymphoid tissue and the lymphatic/humoral circulation. The BIS includes a Graphical User Interface (GUI) to facilitate its use as an educational and research tool. Results The BIS was used to qualitatively examine the innate and adaptive interactions of the immune response to a viral infection. Calibration was accomplished via a parameter sweep of initial agent population size, and comparison of simulation patterns to those reported in the basic science literature. The BIS demonstrated that the degree of the initial innate response was a crucial determinant for an appropriate adaptive response. Deficiency or excess in innate immunity resulted in excessive proliferation of adaptive immune cells. Deficiency in any of the immune system components increased the probability of failure to clear the simulated viral infection. Conclusion The behavior of the BIS matches both normal and pathological behavior patterns in a generic viral infection scenario. Thus, the BIS effectively translates mechanistic cellular and molecular knowledge regarding the innate and adaptive immune response and reproduces the immune system's complex behavioral patterns. The BIS can be used both as an educational tool to demonstrate the emergence of these patterns and as a research tool to systematically identify potential targets for more effective treatment strategies for disease processes including hypersensitivity reactions (allergies, asthma), autoimmunity and cancer. We believe that the BIS can be a useful addition to

  7. The SGP – Faulty by Design or Made Faulty by Politicians? An Assessment Based on a Simulation Model

    Directory of Open Access Journals (Sweden)

    Horáková Šárka

    2014-09-01

    Full Text Available By joining the European Monetary Union (the “EMU”), member countries lost the ability to use monetary policy as a tool for macroeconomic regulation. Attention then focused on the regulation of fiscal policy, and the Stability and Growth Pact (the “SGP”) was the instrument agreed upon. The states of the EMU have agreed to meet the 3% of GDP requirement for the maximum annual public budget deficit. Based on the evolution of public debt in member countries, we can say that the SGP has failed as a tool for fiscal discipline. In this paper, we answer the question of whether the failure was due to an incorrect concept of the SGP or whether the development of the debt was affected more by arbitrary disrespect of the agreed rules. The two reasons mentioned above are interdependent. To separate them, we construct a dynamic model of EU countries’ public debt. Using real data, we simulate the potential values of public debt in a situation where the SGP rules had been respected in recent years. Comparing the results for the potential debt given by simulation of the model with the current real values, we are able to quantify the impact of non-compliance for each country. The initial results indicate that there are both EU states where non-compliance led to a negligible increase in public debt - up to 7% of GDP - and other states where this factor caused public debt to grow by more than 30% of GDP.
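
    A minimal version of the counterfactual accounting described above can be sketched with the standard debt-ratio recursion, rolled forward once with actual deficits and once with deficits capped at 3% of GDP; the deficit and growth series below are hypothetical, not the real data used in the paper.

```python
# Sketch of counterfactual debt accounting: roll the debt-to-GDP ratio forward
# under actual deficits and under deficits capped at the SGP's 3% of GDP.
# The deficit and growth series are hypothetical, not real country data.
initial_debt_ratio = 0.60                                 # debt/GDP at the start (assumed)
nominal_growth = [0.03, 0.01, -0.02, 0.02, 0.03, 0.04]    # assumed nominal GDP growth rates
actual_deficit = [0.04, 0.07, 0.09, 0.06, 0.04, 0.03]     # assumed actual deficits as share of GDP
SGP_CAP = 0.03

def roll_forward(deficits, growth, b0):
    """Debt-ratio recursion b_{t+1} = (b_t + deficit_t) / (1 + growth_t)."""
    b = b0
    for d, g in zip(deficits, growth):
        b = (b + d) / (1.0 + g)
    return b

capped = [min(d, SGP_CAP) for d in actual_deficit]
b_actual = roll_forward(actual_deficit, nominal_growth, initial_debt_ratio)
b_capped = roll_forward(capped, nominal_growth, initial_debt_ratio)
print(f"actual path: {b_actual:.1%} of GDP, SGP-compliant path: {b_capped:.1%} of GDP")
print(f"debt increase attributable to non-compliance: {b_actual - b_capped:.1%} of GDP")
```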

  8. Implications of the Abolition of Milk Quota System for Polish Agriculture – Simulation Results Based on the AG MEMOD Model

    Directory of Open Access Journals (Sweden)

    Mariusz Hamulczuk

    2009-09-01

    Full Text Available The objective of the study was to assess the economic effects of the dairy policy reform sanctioned by the CAP Health Check on the agricultural market in Poland. The paper presents a theoretical study of the production control program as well as a model-based quantitative analysis of the implications of the reform for the agricultural markets. The partial equilibrium model AGMEMOD was used for the simulation. The results obtained indicate that the expansion and subsequent elimination of the milk quota system lead to growth of milk production and consumption in Poland, which confirms the hypothesis derived from the theoretical study. As a consequence, growth in the production of most dairy products and a decrease in their prices are expected. As the growth of dairy consumption is smaller than the growth of milk production, an increase in self-sufficiency in the dairy market is predicted. Comparison of the scale of price adjustment resulting from the dairy reform with the market price changes observed recently leads to the conclusion that global market factors will probably be more important for the future development of milk production and prices in Poland than the milk quota abolition. Nevertheless, the reform constitutes a significant change in business conditions for producers and consumers of milk and dairy products. As a consequence, milk production will become more market-based, as far as market prices, production costs and milk yields are concerned. Simulation results from the AGMEMOD model confirm the opinion of other authors that the abolition of milk quotas will lead to a decline in dairy farmer income. The main beneficiaries of the reform would be the consumers, who could take advantage of the decline in prices of dairy products.

  9. A DYNAMIC PHYSIOLOGICALLY-BASED TOXICOKINETIC (DPBTK) MODEL FOR SIMULATION OF COMPLEX TOLUENE EXPOSURE SCENARIOS IN HUMANS

    Science.gov (United States)

    A GENERAL PHYSIOLOGICAL AND TOXICOKINETIC (GPAT) MODEL FOR SIMULATION OF COMPLEX TOLUENE EXPOSURE SCENARIOS IN HUMANS. E M Kenyon1, T Colemen2, C R Eklund1 and V A Benignus3. 1U.S. EPA, ORD, NHEERL, ETD, PKB, RTP, NC, USA; 2Biological Simulators, Inc., Jackson MS, USA, 3U.S. EP...

  10. Agent Based Modeling and Simulation of Pedestrian Crowds In Panic Situations

    KAUST Repository

    Alrashed, Mohammed

    2016-01-01

    to self-propelling interactions between pedestrians. Although every pedestrian has personal preferences, the motion dynamics can be modeled as a social force in such crowds. These forces are representations of internal preferences and objectives to perform
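
    For context, the sketch below implements a bare-bones social-force update (a driving term towards a goal plus exponential pairwise repulsion) of the kind alluded to above; all parameters and the omission of wall and contact forces are simplifying assumptions, not the thesis' full panic model.

```python
# Bare-bones social-force sketch: driving acceleration towards a goal plus
# exponential pairwise repulsion between pedestrians. Parameters are assumed.
import numpy as np

rng = np.random.default_rng(5)

N = 30
pos = rng.uniform(0, 10, size=(N, 2))          # pedestrian positions (m)
vel = np.zeros((N, 2))
exit_point = np.array([12.0, 5.0])             # common goal, e.g. an exit (assumed)

tau, v0 = 0.5, 1.3                             # relaxation time (s), desired speed (m/s)
A, B, radius = 3.0, 0.3, 0.25                  # repulsion strength (m/s^2), range (m), body radius (m)
dt = 0.05

for _ in range(200):
    # Driving term: relax towards the desired velocity pointing at the exit
    direction = exit_point - pos
    e = direction / np.linalg.norm(direction, axis=1, keepdims=True)
    accel = (v0 * e - vel) / tau

    # Pairwise exponential repulsion between pedestrians
    diff = pos[:, None, :] - pos[None, :, :]
    dist = np.linalg.norm(diff, axis=2) + np.eye(N)   # avoid division by zero on the diagonal
    n_ij = diff / dist[:, :, None]                    # unit vectors pointing away from neighbours
    repulsion = A * np.exp((2 * radius - dist) / B)[:, :, None] * n_ij
    repulsion[np.arange(N), np.arange(N)] = 0.0       # no self-interaction
    accel += repulsion.sum(axis=1)

    vel += accel * dt
    pos += vel * dt

print("mean distance to exit after 10 s:",
      np.linalg.norm(exit_point - pos, axis=1).mean().round(2))
```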

  11. A Physiologically Based Pharmacokinetic Model for the Oxime TMB-4: Simulation of Rodent and Human Data

    Science.gov (United States)

    2013-01-13

    later, Garrigue and other colleagues (Maurizis et al. 1992) published an in vitro binding study of TMB-4 with rabbit cartilaginous tissue cultures...as well as fat, kidney, liver, rapidly perfused tissues and slowly perfused tissues. All tissue compartments are diffusion limited. Model...pharmacokinetic data from the literature. The model was parameterized using rat plasma, tissue and urine ti