WorldWideScience

Sample records for models computer simulation

  1. Computer Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Pronskikh, V. S. [Fermilab]

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of scientific practice of high importance that have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model's relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction must be made between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimentation with them). Holding to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement, I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviating the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.

  2. Computational Modeling of Simulation Tests.

    Science.gov (United States)

    1980-06-01

    Computational Modeling of Simulation Tests. G. Leigh, W. Chown, B. Harrison. Eric H. Wang Civil Engineering Research Facility, University of New Mexico, Albuquerque, June 1980. Only bibliographic fragments survive in this record; the cited references include Kinney (1962) and Courant and Friedrichs.

  3. Computer simulations of the random barrier model

    DEFF Research Database (Denmark)

    Schrøder, Thomas; Dyre, Jeppe

    2002-01-01

    A brief review of experimental facts regarding ac electronic and ionic conduction in disordered solids is given, followed by a discussion of what is perhaps the simplest realistic model, the random barrier model (symmetric hopping model). Results from large-scale computer simulations are presented, focusing on universality of the ac response in the extreme disorder limit. Finally, some important unsolved problems relating to hopping models for ac conduction are listed.
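
    The symmetric hopping picture behind the random barrier model is easy to state concretely: a particle hops between neighbouring lattice sites over quenched random energy barriers, with jump rates proportional to exp(-beta*E). Below is a minimal one-dimensional kinetic Monte Carlo sketch of this idea; all parameters are hypothetical, and large beta corresponds to the extreme disorder limit discussed in the record.

    import math
    import random

    def squared_displacement(n_sites=1000, n_steps=200000, beta=4.0, seed=0):
        """One realization of the 1D symmetric hopping (random barrier) model:
        a single particle on a ring hops over quenched random barriers with
        acceptance probability exp(-beta * E_barrier)."""
        rng = random.Random(seed)
        barrier = [rng.random() for _ in range(n_sites)]  # barrier between sites i and i+1
        site, x = 0, 0
        for _ in range(n_steps):
            if rng.random() < 0.5:                        # attempt a hop to the right
                if rng.random() < math.exp(-beta * barrier[site % n_sites]):
                    site += 1; x += 1
            else:                                         # attempt a hop to the left
                if rng.random() < math.exp(-beta * barrier[(site - 1) % n_sites]):
                    site -= 1; x -= 1
        return x * x

    # Crude disorder-averaged mean-squared displacement over 20 barrier samples:
    msd = sum(squared_displacement(seed=s) for s in range(20)) / 20
    print(f"MSD after 2e5 attempted hops: {msd:.0f}")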

  4. Computer Modelling and Simulation for Inventory Control

    Directory of Open Access Journals (Sweden)

    G.K. Adegoke

    2012-07-01

    This study concerns the role of computer simulation as a device for conducting scientific experiments on inventory control. The stores function ties up a large share of a manufacturing firm's physical assets and financial resources, so efficient inventory control is needed: it reduces the cost of production and thereby facilitates the effective and efficient accomplishment of an organization's production objectives. Mathematical and statistical models were used to compute the Economic Order Quantity (EOQ). Test data were obtained from a manufacturing company and simulated. The results generated were used to predict a real-life situation and are presented and discussed. The three models were implemented in Turbo Pascal, chosen for its capability, generality and flexibility as a scientific programming language.
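
    The classical Wilson formula presumably underlying the EOQ computation is q* = sqrt(2DS/H), for annual demand D, fixed cost per order S, and annual holding cost per unit H. A minimal sketch (the study's models were written in Turbo Pascal; Python is used here, and the input figures are hypothetical):

    import math

    def eoq(annual_demand: float, order_cost: float, holding_cost: float) -> float:
        """Wilson EOQ: the order quantity minimizing the sum of annual
        ordering and holding costs."""
        return math.sqrt(2.0 * annual_demand * order_cost / holding_cost)

    # Hypothetical inputs: 12,000 units/year, $50 per order, $2.40/unit/year holding.
    q_star = eoq(12000, 50.0, 2.40)
    print(f"EOQ = {q_star:.0f} units, about {12000 / q_star:.1f} orders per year")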

  5. Evaluation of Marine Corps Manpower Computer Simulation Model

    Science.gov (United States)

    2016-12-01

    Evaluation of Marine Corps Manpower Computer Simulation Model. Master's thesis by Eric S. Anderson, December 2016 (thesis advisor: Arnold Buss; second reader: Neil Rowe). To assist the mission of maintaining overall end strength, an agent-based computer simulation model was developed in the Java computer language.

  6. Application of computer simulated persons in indoor environmental modeling

    DEFF Research Database (Denmark)

    Topp, C.; Nielsen, P. V.; Sørensen, Dan Nørtoft

    2002-01-01

    Computer simulated persons are often applied when the indoor environment is modeled by computational fluid dynamics. The computer simulated persons differ in size, shape, and level of geometrical complexity, ranging from simple box or cylinder shaped heat sources to more humanlike models. Little...

  7. A simulation model of a star computer network

    CERN Document Server

    Gomaa, H

    1979-01-01

    A simulation model of the CERN (European Organization for Nuclear Research) SPS star computer network is described. The model concentrates on simulating the message handling computer, through which all messages in the network pass. The implementation of the model and its calibration are also described. (6 refs).

  8. Flow Through a Laboratory Sediment Sample by Computer Simulation Modeling

    Science.gov (United States)

    2006-09-07

    Flow through a laboratory sediment sample by computer simulation modeling. R. B. Pandey, Allen H. Reed, Edward Braithwaite, Ray Seyfarth, J.F...

  9. Computer Models Simulate Fine Particle Dispersion

    Science.gov (United States)

    2010-01-01

    Through a NASA Seed Fund partnership with DEM Solutions Inc., of Lebanon, New Hampshire, scientists at Kennedy Space Center refined existing software to study the electrostatic phenomena of granular and bulk materials as they apply to planetary surfaces. The software, EDEM, allows users to import particles and obtain accurate representations of their shapes for modeling purposes, such as simulating bulk solids behavior, and was enhanced to be able to more accurately model fine, abrasive, cohesive particles. These new EDEM capabilities can be applied in many industries unrelated to space exploration and have been adopted by several prominent U.S. companies, including John Deere, Pfizer, and Procter & Gamble.

  10. Overview of Computer Simulation Modeling Approaches and Methods

    Science.gov (United States)

    Robert E. Manning; Robert M. Itami; David N. Cole; Randy Gimblett

    2005-01-01

    The field of simulation modeling has grown greatly with recent advances in computer hardware and software. Much of this work has involved large scientific and industrial applications for which substantial financial resources are available. However, advances in object-oriented programming and simulation methodology, concurrent with dramatic increases in computer...

  11. Understanding Emergency Care Delivery through Computer Simulation Modeling.

    Science.gov (United States)

    Laker, Lauren F; Torabi, Elham; France, Daniel J; Froehle, Craig M; Goldlust, Eric J; Hoot, Nathan R; Kasaie, Parastu; Lyons, Michael S; Barg-Walkow, Laura H; Ward, Michael J; Wears, Robert L

    2017-08-10

    In 2017, Academic Emergency Medicine convened a consensus conference entitled "Catalyzing System Change through Health Care Simulation: Systems, Competency, and Outcomes." This manuscript, a product of the breakout session on "understanding complex interactions through systems modeling," explores the role that computer simulation modeling can and should play in research and development of emergency care delivery systems. The manuscript discusses areas central to the use of computer simulation modeling in emergency care research. The four central approaches to computer simulation modeling are described (Monte Carlo simulation, System Dynamics modeling, Discrete-Event Simulation, and Agent-Based Simulation), along with problems amenable to their use and relevant examples from emergency care. Also discussed are an introduction to available software modeling platforms, how to explore their use for research, and a research agenda for computer simulation modeling. Through this manuscript, our goal is to enhance adoption of computer simulation, a set of methods that hold great promise in addressing emergency care organization and design challenges. This article is protected by copyright. All rights reserved.
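
    Of the four approaches named, Discrete-Event Simulation is perhaps the simplest to illustrate. The sketch below is a minimal single-provider emergency department queue with exponential interarrival and treatment times (an M/M/1 model); the rates are hypothetical and not taken from the paper.

    import random

    def mm1_ed_waits(arrival_rate, service_rate, n_patients, seed=1):
        """Single-server discrete-event sketch: patients arrive with exponential
        interarrival times and are treated FIFO by one provider."""
        rng = random.Random(seed)
        t_arrive = 0.0
        server_free_at = 0.0
        waits = []
        for _ in range(n_patients):
            t_arrive += rng.expovariate(arrival_rate)
            start = max(t_arrive, server_free_at)      # wait until the provider is free
            waits.append(start - t_arrive)
            server_free_at = start + rng.expovariate(service_rate)
        return waits

    # Hypothetical rates: 4 arrivals/hour, 5 treatments/hour.
    w = mm1_ed_waits(4.0, 5.0, 100000)
    print(f"mean wait = {sum(w) / len(w):.2f} h")  # M/M/1 theory: 4/(5*(5-4)) = 0.8 h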

  12. A novel computer simulation for modeling grain growth

    Energy Technology Data Exchange (ETDEWEB)

    Chen, L.Q. (Pennsylvania State Univ., University Park, PA (United States). Dept. of Materials Science and Engineering)

    1995-01-01

    In this paper, the author proposes a new computer simulation model for investigating grain growth kinetics, born from recent work on the domain growth kinetics of a quenched system with many non-conserved order parameters. A key new feature of this model for studying grain growth is that the grain boundaries are diffuse, as opposed to previous mean-field and statistical theories and Monte Carlo simulations, which assumed that grain boundaries were sharp. Unlike the Monte Carlo simulations, in which grain boundaries are made up of kinks, grain boundaries in the continuum model are smooth. Below, he describes this model in detail, gives prescriptions for computer simulation, and then presents computer simulation results on a two-dimensional model system.
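
    The diffuse-boundary idea described here underlies what is now usually called the multi-order-parameter phase-field approach. The sketch below shows the essential Allen-Cahn time stepping for q non-conserved order parameters with assumed unit free-energy coefficients; it illustrates the idea rather than reproducing the paper's exact formulation.

    import numpy as np

    def evolve_grains(n=64, q=4, steps=2000, dt=0.1, mobility=1.0, kappa=2.0, seed=0):
        """Phase-field grain growth sketch: q non-conserved order parameters
        eta_i evolve by Allen-Cahn dynamics; grain boundaries are diffuse
        regions where two order parameters overlap."""
        rng = np.random.default_rng(seed)
        eta = 0.01 * rng.standard_normal((q, n, n))
        for _ in range(steps):
            sq_sum = (eta ** 2).sum(axis=0)
            for i in range(q):
                lap = (np.roll(eta[i], 1, 0) + np.roll(eta[i], -1, 0)
                       + np.roll(eta[i], 1, 1) + np.roll(eta[i], -1, 1) - 4 * eta[i])
                # df/deta_i for f = sum_i(-eta_i^2/2 + eta_i^4/4) + sum_{i<j} eta_i^2 eta_j^2
                dfdeta = -eta[i] + eta[i] ** 3 + 2.0 * eta[i] * (sq_sum - eta[i] ** 2)
                eta[i] -= dt * mobility * (dfdeta - kappa * lap)
        return eta

    eta = evolve_grains()
    grain_id = np.argmax(np.abs(eta), axis=0)  # label each site by its dominant order parameter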

  13. Computational challenges in modeling and simulating living matter

    Science.gov (United States)

    Sena, Alexandre C.; Silva, Dilson; Marzulo, Leandro A. J.; de Castro, Maria Clicia Stelling

    2016-12-01

    Computational modeling has been successfully used to help scientists understand physical and biological phenomena. Recent technological advances allow the simulation of larger systems with greater accuracy. However, devising those systems requires new approaches and novel architectures, such as the use of parallel programming, so that the application can run in the new high-performance environments, which are often computer clusters composed of different computation devices, such as traditional CPUs, GPGPUs, Xeon Phis and even FPGAs. It is expected that scientists will take advantage of the increasing computational power to model and simulate more complex structures and even merge different models into larger and more extensive ones. This paper aims at discussing the challenges of using those devices to simulate such complex systems.

  14. Predictive Capability Maturity Model for computational modeling and simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy; Pilch, Martin M.

    2007-10-01

    The Predictive Capability Maturity Model (PCMM) is a new model that can be used to assess the level of maturity of computational modeling and simulation (M&S) efforts. The development of the model is based on both the authors' experience and their analysis of similar investigations in the past. The perspective taken in this report is one of judging the usefulness of a predictive capability that relies on the numerical solution to partial differential equations to better inform and improve decision making. The review of past investigations, such as the Software Engineering Institute's Capability Maturity Model Integration and the National Aeronautics and Space Administration and Department of Defense Technology Readiness Levels, indicates that a more restricted, more interpretable method is needed to assess the maturity of an M&S effort. The PCMM addresses six contributing elements to M&S: (1) representation and geometric fidelity, (2) physics and material model fidelity, (3) code verification, (4) solution verification, (5) model validation, and (6) uncertainty quantification and sensitivity analysis. For each of these elements, attributes are identified that characterize four increasing levels of maturity. Importantly, the PCMM is a structured method for assessing the maturity of an M&S effort that is directed toward an engineering application of interest. The PCMM does not assess whether the M&S effort, the accuracy of the predictions, or the performance of the engineering system satisfies specified application requirements.

  15. Macro-Dataflow Computational Model and Its Simulation

    Institute of Scientific and Technical Information of China (English)

    孙昱东; 谢志良

    1990-01-01

    This paper discusses the relationship between parallelism granularity and system overhead of dataflow computer systems, and indicates that a trade-off between them should be determined to obtain optimal efficiency of the overall system. On the basis of this discussion, a macro-dataflow computational model is established to exploit task-level parallelism. Working as a macro-dataflow computer, an Experimental Distributed Dataflow Simulation System (EDDSS) is developed to examine the effectiveness of the macro-dataflow computational model.

  16. Methodology for characterizing modeling and discretization uncertainties in computational simulation

    Energy Technology Data Exchange (ETDEWEB)

    Alvin, Kenneth F.; Oberkampf, William L.; Rutherford, Brian M.; Diegert, Kathleen V.

    2000-03-01

    This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed towards developing methodologies which treat model form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for sources of uncertainty and error in the modeling and simulation process which impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.
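
    One widely used, concrete instance of discretization-error estimation (offered here as an illustration, not necessarily the method developed in this report) is Richardson extrapolation over a sequence of systematically refined grids:

    import math

    def richardson(f_coarse, f_medium, f_fine, r=2.0):
        """Estimate the observed order of accuracy p and a grid-converged value
        from three solutions on grids refined by a constant ratio r."""
        p = math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)
        extrapolated = f_fine + (f_fine - f_medium) / (r ** p - 1.0)
        return p, extrapolated

    # Hypothetical drag coefficients computed on coarse/medium/fine grids:
    p, f_star = richardson(0.520, 0.505, 0.501)
    print(f"observed order p = {p:.2f}, extrapolated value = {f_star:.4f}")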

  17. Development of computer simulation models for pedestrian subsystem impact tests

    NARCIS (Netherlands)

    Kant, R.; Konosu, A.; Ishikawa, H.

    2000-01-01

    The European Enhanced Vehicle-safety Committee (EEVC/WG10 and WG17) proposed three component subsystem tests for cars to assess pedestrian protection. The objective of this study is to develop computer simulation models of the EEVC pedestrian subsystem tests. These models are available to develop a

  18. Modeling and Computer Simulation of AN Insurance Policy:

    Science.gov (United States)

    Acharyya, Muktish; Acharyya, Ajanta Bhowal

    We have developed a model for a life-insurance policy. In this model, the net gain is calculated by computer simulation for a particular type of lifetime distribution function. We observed that the net gain becomes maximal for a particular value of the upper age for the last premium.
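
    A minimal Monte Carlo sketch of the bookkeeping such a model involves is given below. The lifetime distribution, premium, and benefit figures are hypothetical; the maximum reported by the authors arises from their particular lifetime distribution and payoff structure, which this simplified version does not attempt to reproduce.

    import random

    def mean_net_gain(premium, benefit, start_age, last_premium_age, n, seed=1):
        """Insurer's mean net gain per policy: annual premiums are collected from
        start_age until death or last_premium_age (whichever comes first); a
        fixed benefit is paid at death."""
        rng = random.Random(seed)
        total = 0.0
        for _ in range(n):
            lifetime = rng.gauss(74.0, 9.0)   # hypothetical lifetime distribution
            paying_years = max(0.0, min(lifetime, last_premium_age) - start_age)
            total += premium * paying_years - benefit
        return total / n

    for upper_age in (50, 60, 70, 80, 90):
        g = mean_net_gain(1000.0, 45000.0, 30, upper_age, 20000)
        print(f"last premium at age {upper_age}: mean net gain = {g:9.0f}")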

  19. Blast Load Simulator Experiments for Computational Model Validation: Report 2

    Science.gov (United States)

    2017-02-01

    Blast Load Simulator Experiments for Computational Model Validation: Report 2. U.S. Army Engineer Research and Development Center (ERDC), Geotechnical and Structures Laboratory, Vicksburg, MS. Follows Report 1: O’Daniel, 2016. Blast load simulator experiments for computational model validation - Report 1. ERDC/GSL TR-16-27. Approved for public release; distribution is unlimited.

  20. Computer simulation modeling of abnormal behavior: a program approach.

    Science.gov (United States)

    Reilly, K D; Freese, M R; Rowe, P B

    1984-07-01

    A need for modeling abnormal behavior on a comprehensive, systematic basis exists. Computer modeling and simulation tools offer especially good opportunities to establish such a program of studies. Issues concern deciding which modeling tools to use, how to relate models to behavioral data, what level of modeling to employ, and how to articulate theory to facilitate such modeling. Four levels or types of modeling, two qualitative and two quantitative, are identified. Their properties are examined and interrelated to include illustrative applications to the study of abnormal behavior, with an emphasis on schizophrenia.

  1. Modeling and simulation: the computer science of illusion

    CERN Document Server

    Raczynski, Stanislaw

    2006-01-01

    Simulation is the art of using tools (physical or conceptual models, or computer hardware and software) to attempt to create the illusion of reality. The discipline has in recent years expanded to include the modelling of systems that rely on human factors and therefore possess a large proportion of uncertainty, such as social, economic or commercial systems. These new applications make the discipline of modelling and simulation a field of dynamic growth and new research. Stanislaw Raczynski outlines the considerable and promising research that is being conducted to counter the problems of

  2. Computer modeling of road bridge for simulation moving load

    Directory of Open Access Journals (Sweden)

    Miličić Ilija M.

    2016-01-01

    This paper presents the computational modelling of a single-span road truss bridge with the roadway on the upper chord. The calculation models were treated as planar and spatial girders made up of 1D finite elements, using the CAA applications Tower and Bridge Designer 2016 (2nd Edition). The computer simulations yield, for each model, a comparison of the effects of moving load according to the recommendations of the two standards, SRPS and AASHTO. The variant of the bridge structure modelled in Bridge Designer 2016 (2nd Edition) is identical to the one modelled in the Tower environment. As important information for the selection of a computer application, we note that Bridge Designer 2016 (2nd Edition) is unable to treat the moving-load model required by the national standard V600.

  3. Computer simulations for internal dosimetry using voxel models.

    Science.gov (United States)

    Kinase, Sakae; Mohammadi, Akram; Takahashi, Masa; Saito, Kimiaki; Zankl, Maria; Kramer, Richard

    2011-07-01

    In the Japan Atomic Energy Agency, several studies have been conducted on the use of voxel models for internal dosimetry. Absorbed fractions (AFs) and S values have been evaluated for preclinical assessments of radiopharmaceuticals using human voxel models and a mouse voxel model. Computational calibration of an in vivo measurement system has also been performed using Japanese and Caucasian voxel models. In addition, for radiation protection of the environment, AFs have been evaluated using a frog voxel model. Each study was performed using Monte Carlo simulations. Consequently, it was concluded that the Monte Carlo simulations with voxel models could adequately reproduce measurement results. Voxel models were found to be a significant tool for internal dosimetry since the models are anatomically realistic. This indicates that several studies, on correcting the in vivo measurement efficiency for the variability of human subjects and on interspecies scaling of organ doses, will succeed.
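
    The S values mentioned follow the MIRD formalism, in which Monte Carlo transport through the voxel model supplies the absorbed fractions. A minimal sketch of the bookkeeping (the nuclide energies, absorbed fractions, and organ mass below are hypothetical):

    def s_value(energies_per_decay, absorbed_fractions, target_mass_kg):
        """MIRD-style S value (absorbed dose per decay, Gy) for one source-target
        pair: S = sum_i Delta_i * phi_i / m_target, with Delta_i the mean energy
        emitted per decay (J) and phi_i the Monte-Carlo absorbed fraction."""
        return sum(d * af for d, af in zip(energies_per_decay, absorbed_fractions)) / target_mass_kg

    MEV = 1.602e-13                      # joules per MeV
    deltas = [0.35 * MEV, 0.10 * MEV]    # mean energy per decay for two emissions
    afs = [0.60, 0.05]                   # absorbed fractions from a voxel phantom
    print(f"S = {s_value(deltas, afs, 0.31):.3e} Gy per decay")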

  4. Mathematical Model of Computer Heat Treatment and Its Simulation

    Institute of Scientific and Technical Information of China (English)

    Pan Jiansheng; Zhang Weimin; Tian Dong; Gu Jianfeng; Hu Mingjuan

    2004-01-01

    Computer simulation of heat treatment is the foundation of intelligent heat treatment. The simulation of the temperature field, phase transformation, and stress/strain during complicated quenching operations was realized using a three-dimensional non-linear finite element model together with treatment methods for abruptly changing interface conditions. The simulation results basically fit those measured in experiments. An intelligent sealed multipurpose furnace production line has been developed based on the combination of computer simulation of gaseous carburizing and computer control technology. More than 3000 batches of workpieces have been processed on this production line, and all are up to standard. The application of computer simulation technology can significantly improve the loading ability and reliability of nitrided and carburized workpieces, reduce heat treatment distortion, and shorten carburizing duration. It is recommended that reliable product design without redundancy be performed by combining the CAD of mechanical products, the CAE of materials selection and heat treatment, and the dynamic evaluation technology of product reliability.

  5. Computational electronics semiclassical and quantum device modeling and simulation

    CERN Document Server

    Vasileska, Dragica; Klimeck, Gerhard

    2010-01-01

    Starting with the simplest semiclassical approaches and ending with the description of complex fully quantum-mechanical methods for quantum transport analysis of state-of-the-art devices, Computational Electronics: Semiclassical and Quantum Device Modeling and Simulation provides a comprehensive overview of the essential techniques and methods for effectively analyzing transport in semiconductor devices. With the transistor reaching its limits and new device designs and paradigms of operation being explored, this timely resource delivers the simulation methods needed to properly model state-of

  6. Ravenscar Computational Model compliant AADL Simulation on LEON2

    Directory of Open Access Journals (Sweden)

    Roberto Varona-Gómez

    2013-02-01

    AADL has been proposed for designing and analyzing SW and HW architectures for real-time mission-critical embedded systems. Although the Behavioral Annex improves its simulation semantics, AADL is a language for analyzing architectures and not for simulating them. AADS-T is an AADL simulation tool that supports the performance analysis of the AADL specification throughout the refinement process from the initial system architecture until the complete, detailed application and execution platform are developed. In this way, AADS-T enables the verification of the initial timing constraints during the complete design process. In this paper we focus on the compatibility of AADS-T with the Ravenscar Computational Model (RCM) as part of the TASTE toolset. Its flexibility enables AADS-T to support different processors. In this work we have focused on performing the simulation on a LEON2 processor.

  7. Comprehensive Simulation Lifecycle Management for High Performance Computing Modeling and Simulation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — There are significant logistical barriers to entry-level high performance computing (HPC) modeling and simulation (M&S) users. Performing large-scale, massively...

  8. A framework of modeling detector systems for computed tomography simulations

    Science.gov (United States)

    Youn, H.; Kim, D.; Kim, S. H.; Kam, S.; Jeon, H.; Nam, J.; Kim, H. K.

    2016-01-01

    The ultimate development in computed tomography (CT) technology may be a system that can provide images with excellent lesion conspicuity at a patient dose as low as possible. Imaging simulation tools have been used cost-effectively for these developments and will continue to be. For a more accurate and realistic imaging simulation, the signal and noise propagation through a CT detector system has been modeled in this study using the cascaded linear-systems theory. The simulation results are validated by comparison with results measured using a laboratory flat-panel micro-CT system. Although the image noise obtained from the simulations at higher exposures is slightly smaller than that obtained from the measurements, the difference between them is reasonably acceptable. According to the simulation results for various exposure levels and additive electronic noise levels, x-ray quantum noise is more dominant than the additive electronic noise. The framework for modeling a CT detector system suggested in this study will be helpful for the development of an accurate and realistic projection simulation model.
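
    In the cascaded linear-systems description, signal mean and variance propagate stage by stage; at zero spatial frequency each stochastic gain stage obeys the Burgess variance theorem. A sketch with hypothetical stage parameters, which reproduces the abstract's qualitative conclusion that quantum noise dominates the additive electronic noise:

    def gain_stage(q_mean, q_var, g_mean, g_var):
        """Burgess variance theorem for a stochastic gain stage: each incoming
        quantum yields a random number of output quanta (mean g_mean, variance g_var)."""
        return g_mean * q_mean, g_mean ** 2 * q_var + q_mean * g_var

    n0 = 10000.0                                  # mean incident X-ray quanta per pixel
    q, v = n0, n0                                 # Poisson input: variance equals mean
    eta = 0.7                                     # quantum detection efficiency
    q, v = gain_stage(q, v, eta, eta * (1 - eta)) # binomial selection stage
    q, v = gain_stage(q, v, 50.0, 200.0)          # conversion gain stage
    v += 40.0 ** 2                                # additive electronic noise (sigma = 40)
    print(f"signal = {q:.0f}, noise sd = {v ** 0.5:.0f}, SNR^2 = {q * q / v:.0f}")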

  9. Mathematical and computational modeling and simulation fundamentals and case studies

    CERN Document Server

    Moeller, Dietmar P F

    2004-01-01

    Mathematical and Computational Modeling and Simulation, a highly multi-disciplinary field with ubiquitous applications in science and engineering, is one of the key enabling technologies of the 21st century. This book introduces the use of Mathematical and Computational Modeling and Simulation to develop an understanding of the solution characteristics of a broad class of real-world problems. The relevant basic and advanced methodologies are explained in detail, with special emphasis on ill-defined problems. Some 15 simulation systems are presented at the language and the logical level. Moreover, the reader can accumulate experience by studying a wide variety of case studies. The latter are briefly described within the book but their full versions as well as some simulation software demos are available on the Web. The book can be used for University courses of different level as well as for self-study. Advanced sections are marked and can be skipped in a first reading or in undergraduate courses...

  10. Computational Materials: Modeling and Simulation of Nanostructured Materials and Systems

    Science.gov (United States)

    Gates, Thomas S.; Hinkley, Jeffrey A.

    2003-01-01

    The paper provides details on the structure and implementation of the Computational Materials program at the NASA Langley Research Center. Examples are given that illustrate the suggested approaches to predicting the behavior and influencing the design of nanostructured materials such as high-performance polymers, composites, and nanotube-reinforced polymers. Primary simulation and measurement methods applicable to multi-scale modeling are outlined. Key challenges including verification and validation of models are highlighted and discussed within the context of NASA's broad mission objectives.

  11. Protein adsorption on nanoparticles: model development using computer simulation

    Science.gov (United States)

    Shao, Qing; Hall, Carol K.

    2016-10-01

    The adsorption of proteins on nanoparticles results in the formation of the protein corona, the composition of which determines how nanoparticles influence their biological surroundings. We seek to better understand corona formation by developing models that describe protein adsorption on nanoparticles using computer simulation results as data. Using a coarse-grained protein model, discontinuous molecular dynamics simulations are conducted to investigate the adsorption of two small proteins (Trp-cage and WW domain) on a model nanoparticle of diameter 10.0 nm at protein concentrations ranging from 0.5 to 5 mM. The resulting adsorption isotherms are well described by the Langmuir, Freundlich, Temkin and Kiselev models, but not by the Elovich, Fowler-Guggenheim and Hill-de Boer models. We also try to develop a generalized model that can describe protein adsorption equilibrium on nanoparticles of different diameters in terms of dimensionless size parameters. The simulation results for three proteins (Trp-cage, WW domain, and GB3) on four nanoparticles (diameter = 5.0, 10.0, 15.0, and 20.0 nm) illustrate both the promise and the challenge associated with developing generalized models of protein adsorption on nanoparticles.
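
    As one example of the isotherm models named, the Langmuir form q(c) = q_max*K*c/(1 + K*c) can be fitted directly to simulated adsorption data. The numbers below are hypothetical stand-ins for such data:

    import numpy as np
    from scipy.optimize import curve_fit

    def langmuir(c, q_max, K):
        """Langmuir isotherm: adsorbed amount as a function of bulk concentration."""
        return q_max * K * c / (1.0 + K * c)

    c = np.array([0.5, 1.0, 2.0, 3.0, 4.0, 5.0])        # protein concentration (mM)
    q = np.array([18.0, 30.0, 45.0, 54.0, 59.0, 62.0])  # adsorbed proteins per particle

    (q_max, K), _ = curve_fit(langmuir, c, q, p0=(70.0, 0.5))
    print(f"q_max = {q_max:.1f}, K = {K:.2f} mM^-1")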

  12. Computer simulation of hard-core models for liquid crystals

    NARCIS (Netherlands)

    Frenkel, D.

    1987-01-01

    A review is presented of computer simulations of liquid crystal systems. It will be shown that the shape of hard-core particles is of crucial importance for the stability of the phases. Both static and dynamic properties of the systems are obtained by means of computer simulation.

  13. Strategic Implications of Cloud Computing for Modeling and Simulation (Briefing)

    Science.gov (United States)

    2016-04-01

    Slide briefing by Amy E. Henninger, Institute for Defense Analyses. The recoverable fragments list the promises of cloud computing (cost efficiency, unlimited storage, backup and recovery, automatic software integration, easy access to information) and ask whether, at an abstract level, there is any advantage or disadvantage to M&S employed in a cloud infrastructure that would not be true of any typical application.

  14. A computational model for the numerical simulation of FSW processes

    OpenAIRE

    Agelet de Saracibar Bosch, Carlos; Chiumenti, Michèle; Santiago, Diego de; Cervera Ruiz, Miguel; Dialami, Narges; Lombera, Guillermo

    2010-01-01

    In this paper a computational model for the numerical simulation of Friction Stir Welding (FSW) processes is presented. FSW is a new method of welding in solid state in which a shouldered tool with a profile probe is rotated and slowly plunged into the joint line between two pieces of sheet or plate material which are butted together. Once the probe has been completely inserted, it is moved with a small tilt angle in the welding direction. Here a quasi-static, thermal transient, mixed mult...

  15. Experiments and simulation models of a basic computation element of an autonomous molecular computing system.

    Science.gov (United States)

    Takinoue, Masahiro; Kiga, Daisuke; Shohda, Koh-Ichiroh; Suyama, Akira

    2008-10-01

    Autonomous DNA computers have been attracting much attention because of their ability to integrate into living cells. Autonomous DNA computers can process information through DNA molecules and their molecular reactions. We have previously proposed an autonomous molecular computer with high computational ability, now named the Reverse-transcription-and-TRanscription-based Autonomous Computing System (RTRACS). In this study, we first report an experimental demonstration of a basic computation element of RTRACS and a mathematical modeling method for it. We focus on an AND gate, which produces an output RNA molecule only when two input RNA molecules exist, because it is one of the most basic computation elements in RTRACS. Experimental results demonstrated that the basic computation element worked as designed. In addition, its behaviors were analyzed using a mathematical model describing the molecular reactions of the RTRACS computation elements. A comparison between experiments and simulations confirmed the validity of the mathematical modeling method. This study will accelerate construction of various kinds of computation elements and computational circuits of RTRACS, and thus advance research on autonomous DNA computers.

  16. Modeling and Simulation Reliable Spacecraft On-Board Computing

    Science.gov (United States)

    Park, Nohpill

    1999-01-01

    The proposed project will investigate modeling and simulation-driven testing and fault tolerance schemes for spacecraft on-board computing, thereby achieving reliable spacecraft telecommunication. A spacecraft communication system has inherent capabilities of providing multipoint and broadcast transmission, connectivity between any two distant nodes within a wide-area coverage, quick network configuration/reconfiguration, rapid allocation of space segment capacity, and distance-insensitive cost. To realize the capabilities mentioned above, both the size and cost of the ground-station terminals have to be reduced by using a reliable, high-throughput, fast and cost-effective on-board computing system, which has been known to be a critical contributor to the overall performance of space mission deployment. Controlled vulnerability of mission data (measured in sensitivity), improved performance (measured in throughput and delay) and fault tolerance (measured in reliability) are some of the most important features of these systems. The system should be thoroughly tested and diagnosed before a fault tolerance scheme is employed in it. Testing and fault tolerance strategies should be driven by accurate performance models (i.e., throughput, delay, reliability and sensitivity) to find an optimal solution in terms of reliability and cost. The modeling and simulation tools will be integrated with a system architecture module, a testing module and a module for fault tolerance, all of which interact through a central graphical user interface.

  17. Computer simulation study of water using a fluctuating charge model

    Indian Academy of Sciences (India)

    M Krishnan; A Verma; S Balasubramanian

    2001-10-01

    Hydrogen bonding in small water clusters is studied through computer simulation methods using a sophisticated, empirical model of interaction developed by Rick et al (S W Rick, S J Stuart and B J Berne 1994 J. Chem. Phys. 101 6141) and others. The model allows for the charges on the interacting sites to fluctuate as a function of time, depending on their local environment. The charge flow is driven by the difference in the electronegativity of the atoms within the water molecule, thus effectively mimicking the effects of polarization of the charge density. The potential model is thus transferable across all phases of water. Using this model, we have obtained the minimum energy structures of water clusters up to a size of ten. The cluster structures agree well with experimental data. In addition, we are able to distinctly identify the hydrogens that form hydrogen bonds based on their charges alone, a feature that is not possible in simulations using fixed charge models. We have also studied the structure of liquid water at ambient conditions using this fluctuating charge model.
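
    The heart of a fluctuating-charge model is electronegativity equalization: the site charges minimize a quadratic energy subject to a fixed total charge, which reduces to a small linear solve. A sketch with hypothetical water-like parameters (not the actual values of Rick et al.):

    import numpy as np

    def equalize_charges(chi, J):
        """Minimize E(q) = chi.q + 0.5*q.J.q subject to sum(q) = 0 via the
        KKT linear system with a single Lagrange multiplier."""
        n = len(chi)
        A = np.zeros((n + 1, n + 1))
        A[:n, :n] = J
        A[:n, n] = 1.0
        A[n, :n] = 1.0
        b = np.concatenate([-np.asarray(chi, dtype=float), [0.0]])
        return np.linalg.solve(A, b)[:n]

    chi = [8.5, 4.5, 4.5]                 # site electronegativities: O, H, H (eV)
    J = np.array([[14.0, 5.0, 5.0],       # hardness/Coulomb matrix (eV/e)
                  [5.0, 13.0, 4.0],
                  [5.0, 4.0, 13.0]])
    print(equalize_charges(chi, J))       # O acquires negative charge, H positive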

  18. Application of Computer Simulation Modeling to Medication Administration Process Redesign

    Directory of Open Access Journals (Sweden)

    Nathan Huynh

    2012-01-01

    The medication administration process (MAP) is one of the most high-risk processes in health care. MAP workflow redesign can precipitate both unanticipated and unintended consequences that can lead to new medication safety risks and workflow inefficiencies. Thus, it is necessary to have a tool to evaluate the impact of redesign approaches in advance of their clinical implementation. This paper discusses the development of an agent-based MAP computer simulation model that can be used to assess the impact of MAP workflow redesign on MAP performance. The agent-based approach is adopted in order to capture Registered Nurse medication administration performance. The process of designing, developing, validating, and testing such a model is explained. Work is underway to collect MAP data in a hospital setting to provide more complex MAP observations and to extend development of the model to better represent the complexity of the MAP.

  19. Macromolecular Chain at a Cellular Surface: a Computer Simulation Model

    Science.gov (United States)

    Xie, Jun; Pandey, Ras

    2001-06-01

    Computer simulations are performed to study the conformation and dynamics of a relatively large chain macromolecule at the surface of a model cell membrane, a preliminary step toward an ultimately realistic model of a protein on a cell membrane. We use a discrete lattice of size Lx × L × L. The chain molecule of length Lc is modelled by consecutive nodes connected by bonds on the trail of a random walk with appropriate constraints, such as excluded volume and energy-dependent configurational bias. The Monte Carlo method is used to move chains via segmental dynamics, i.e., end-move, kink-jump, crank-shaft, reptation, etc. The membrane substrate is modelled by an ensemble of short chains on a flat surface. The large chain molecule is then driven toward the membrane by a field. We plan to examine the dynamics of the chain macromolecule, the spread of its density, and its conformation.

  20. Quantum game simulator, using the circuit model of quantum computation

    Science.gov (United States)

    Vlachos, Panagiotis; Karafyllidis, Ioannis G.

    2009-10-01

    We present a general two-player quantum game simulator that can simulate any two-player quantum game described by a 2×2 payoff matrix (two-strategy games). The user can determine the payoff matrices for both players, their strategies and the amount of entanglement between their initial strategies. The outputs of the simulator are the expected payoffs of each player as a function of the other player's strategy parameters and the amount of entanglement. The simulator also produces contour plots that divide the strategy space of the game into regions in which players can get larger payoffs if they choose to use a quantum strategy against any classical one. We also apply the simulator to two well-known quantum games, the Battle of the Sexes and the Chicken game. Program summary: Program title: Quantum Game Simulator (QGS). Catalogue identifier: AEED_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEED_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 3416. No. of bytes in distributed program, including test data, etc.: 583 553. Distribution format: tar.gz. Programming language: Matlab R2008a (C). Computer: Any computer that can sufficiently run Matlab R2008a. Operating system: Any system that can sufficiently run Matlab R2008a. Classification: 4.15. Nature of problem: Simulation of two-player quantum games described by a payoff matrix. Solution method: The program calculates the matrices that comprise the Eisert setup for quantum games based on the quantum circuit model. There are 5 parameters that can be altered. We define 3 of them as constant. We play the quantum game for all possible values of the other 2 parameters and store the results in a matrix. Unusual features: The software provides an easy way of simulating any two-player quantum games. Running time: Approximately
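
    The Eisert circuit-model setup that the simulator implements can be sketched compactly: |00> is entangled by a gate J, each player applies a local unitary, J-dagger disentangles, and payoffs are read from the outcome probabilities. A minimal sketch with Prisoner's Dilemma payoffs (the program itself is Matlab; Python is used here):

    import numpy as np

    D = np.array([[0.0, 1.0], [-1.0, 0.0]])           # "defect" operator, U(pi, 0)

    def U(theta, phi):
        """Two-parameter player strategy used in the Eisert scheme."""
        return np.array([[np.exp(1j * phi) * np.cos(theta / 2), np.sin(theta / 2)],
                         [-np.sin(theta / 2), np.exp(-1j * phi) * np.cos(theta / 2)]])

    def payoffs(Ua, Ub, pay_a, pay_b, gamma):
        """Expected payoffs: |psi> = J^dag (Ua x Ub) J |00>, with entangling
        gate J = cos(gamma/2) I + i sin(gamma/2) D x D."""
        J = np.cos(gamma / 2) * np.eye(4) + 1j * np.sin(gamma / 2) * np.kron(D, D)
        psi0 = np.zeros(4, dtype=complex); psi0[0] = 1.0
        psi = J.conj().T @ np.kron(Ua, Ub) @ J @ psi0
        probs = np.abs(psi) ** 2                      # outcomes CC, CD, DC, DD
        return probs @ pay_a, probs @ pay_b

    pay_a, pay_b = np.array([3, 0, 5, 1]), np.array([3, 5, 0, 1])
    Q = U(0, np.pi / 2)                               # Eisert's "quantum" strategy
    print(payoffs(Q, Q, pay_a, pay_b, gamma=np.pi / 2))  # (3.0, 3.0) at full entanglement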

  1. Computer Simulation Modeling: A Method for Predicting the Utilities of Alternative Computer-Aided Threat Evaluation Algorithms

    Science.gov (United States)

    1990-09-01

    Technical Report 911, September 1990: Computer Simulation Modeling: A Method for Predicting the Utilities of Alternative Computer-Aided Threat Evaluation Algorithms. Only repeated title and report-documentation fragments survive in this record.

  2. Value stream mapping in a computational simulation model

    Directory of Open Access Journals (Sweden)

    Ricardo Becker Mendes de Oliveira

    2014-08-01

    The decision-making process has been extensively studied by researchers and executives. This paper aims to use the methodology of Value Stream Mapping (VSM) in an integrated manner with a computer simulation model, in order to expand managers' decision-making vision. The object of study is a production system that involves a process of automatic packaging of products, where it became necessary to implement changes in order to accommodate new products, making it necessary to detect bottlenecks and to visualize the impacts generated by future modifications. The simulation aims to support the manager's decisions, considering that the system involves several variables whose behaviors define the complexity of the process. The main results were a significant reduction in project costs, achieved by anticipating the system's behavior, together with the use of Value Stream Mapping to identify activities that do or do not add value to the process. The validation of the simulation model will be carried out with the current map of the system and with the inclusion of Kaizen events, so that waste in future maps is found in a practical and reliable way, which can support decision-making.

  3. Modelling of dusty plasma properties by computer simulation methods

    Energy Technology Data Exchange (ETDEWEB)

    Baimbetov, F B [IETP, Al Farabi Kazakh National University, 96a, Tole bi St, Almaty 050012 (Kazakhstan); Ramazanov, T S [IETP, Al Farabi Kazakh National University, 96a, Tole bi St, Almaty 050012 (Kazakhstan); Dzhumagulova, K N [IETP, Al Farabi Kazakh National University, 96a, Tole bi St, Almaty 050012 (Kazakhstan); Kadyrsizov, E R [Institute for High Energy Densities of RAS, Izhorskaya 13/19, Moscow 125412 (Russian Federation); Petrov, O F [IETP, Al Farabi Kazakh National University, 96a, Tole bi St, Almaty 050012 (Kazakhstan); Gavrikov, A V [IETP, Al Farabi Kazakh National University, 96a, Tole bi St, Almaty 050012 (Kazakhstan)

    2006-04-28

    Computer simulation of dusty plasma properties is performed. The radial distribution functions, the diffusion coefficient are calculated on the basis of the Langevin dynamics. A comparison with the experimental data is made.

  4. A Computational Model for the Numerical Simulation of FSW Processes

    Science.gov (United States)

    Agelet de Saracibar, C.; Chiumenti, M.; Santiago, D.; Cervera, M.; Dialami, N.; Lombera, G.

    2010-06-01

    In this paper a computational model for the numerical simulation of Friction Stir Welding (FSW) processes is presented. FSW is a new method of welding in solid state in which a shouldered tool with a profile probe is rotated and slowly plunged into the joint line between two pieces of sheet or plate material which are butted together. Once the probe has been completely inserted, it is moved with a small tilt angle in the welding direction. Here a quasi-static, thermal transient, mixed multiscale stabilized Eulerian formulation is used. Norton-Hoff and Sheppard-Wright rigid thermo-viscoplastic material models have been considered. A staggered solution algorithm is defined such that for any time step, the mechanical problem is solved at constant temperature and then the thermal problem is solved keeping constant the mechanical variables. A pressure multiscale stabilized mixed linear velocity/linear pressure finite element interpolation formulation is used to solve the mechanical problem and a convection multiscale stabilized linear temperature interpolation formulation is used to solve the thermal problem. The model has been implemented into the in-house developed FE code COMET. Results obtained in the simulation of FSW process are compared to other numerical results or experimental results, when available.

  5. Computational model for simulation small testing launcher, technical solution

    Energy Technology Data Exchange (ETDEWEB)

    Chelaru, Teodor-Viorel, E-mail: teodor.chelaru@upb.ro [University POLITEHNICA of Bucharest - Research Center for Aeronautics and Space, Str. Ghe Polizu, nr. 1, Bucharest, Sector 1 (Romania); Cristian, Barbu, E-mail: barbucr@mta.ro [Military Technical Academy, Romania, B-dul. George Coşbuc, nr. 81-83, Bucharest, Sector 5 (Romania); Chelaru, Adrian, E-mail: achelaru@incas.ro [INCAS -National Institute for Aerospace Research Elie Carafoli, B-dul Iuliu Maniu 220, 061126, Bucharest, Sector 6 (Romania)

    2014-12-10

    The purpose of this paper is to present some aspects regarding the computational model and technical solutions for a multistage suborbital launcher for testing (SLT) used to test spatial equipment and scientific measurements. The computational model consists in numerical simulation of SLT evolution for different start conditions. The launcher model presented has six degrees of freedom (6DOF) and variable mass. The results analysed are the flight parameters and ballistic performances. The discussion focuses on the technical possibility of realizing a small multi-stage launcher by recycling military rocket motors. From a technical point of view, the paper is focused on the national project 'Suborbital Launcher for Testing' (SLT), which is based on hybrid propulsion and control systems obtained through an original design. Therefore, while classical suborbital sounding rockets are unguided and use a solid-fuel motor for propulsion in an uncontrolled ballistic flight, the SLT project introduces a different approach by proposing the creation of a guided suborbital launcher, which is basically a satellite launcher at a smaller scale, containing its main subsystems. This is why the project itself can be considered an intermediary step in the development of a wider range of launching systems based on hybrid propulsion technology, which may have a major impact on future European launcher programs. The SLT project, as the title shows, has two major objectives: first, a short-term objective, which consists in obtaining a suborbital launching system able to go into service in a predictable period of time, and a long-term objective that consists in the development and testing of some unconventional subsystems which will be integrated later into the satellite launcher as a part of the European space program. This is why the technical content of the project must be carried out beyond the range of the existing suborbital

  6. Analog models of computations \\& Effective Church Turing Thesis: Efficient simulation of Turing machines by the General Purpose Analog Computer

    CERN Document Server

    Pouly, Amaury; Graça, Daniel S

    2012-01-01

    Are analog models of computation more powerful than classical models of computation? From a series of recent papers, it is now clear that many realistic analog models of computation are provably equivalent to classical digital models of computation from a computability point of view. Take, for example, probably the most realistic model of analog computation, the General Purpose Analog Computer (GPAC) model of Claude Shannon, a model of Differential Analyzers, which are analog machines used from the 1930s to the early 1960s to solve various problems. It is now known that the functions computable by Turing machines are provably exactly those that are computable by the GPAC. This paper is about the next step: understanding whether this equivalence also holds at the complexity level. In this paper we show that realistic models of analog computation, namely the General Purpose Analog Computer (GPAC), can simulate Turing machines in a computationally efficient manner. More concretely we show that, modulo...

  7. COMPUTER SIMULATION OF ANTIFERROMAGNETIC STRUCTURES DESCRIBED BY THE THREE-VERTEX ANTIFERROMAGNETIC POTTS MODEL

    National Research Council Canada - National Science Library

    Yarash K. Abuev; Albert B. Babaev; Pharkhat E. Esetov

    2017-01-01

    Objectives: A computer simulation of the antiferromagnetic structures described by the three-vertex Potts model on a triangular lattice is performed, taking into account the antiferromagnetic exchange...

  8. Simulation of quantum computers

    NARCIS (Netherlands)

    De Raedt, H; Michielsen, K; Hams, AH; Miyashita, S; Saito, K; Landau, DP; Lewis, SP; Schuttler, HB

    2001-01-01

    We describe a simulation approach to study the functioning of Quantum Computer hardware. The latter is modeled by a collection of interacting spin-1/2 objects. The time evolution of this spin system maps one-to-one to a quantum program carried out by the Quantum Computer. Our simulation software con

  9. Simulation model of load balancing in distributed computing systems

    Science.gov (United States)

    Botygin, I. A.; Popov, V. N.; Frolov, S. G.

    2017-02-01

    The availability of high-performance computing, high-speed data transfer over the network, and the widespread use of software for design and pre-production in mechanical engineering mean that, at present, large industrial enterprises and small engineering companies alike implement complex computer systems for efficient solution of production and management tasks. Such computer systems are generally built on the basis of distributed heterogeneous computer systems. The analytical problems solved by such systems are the key models of research, but the system-wide problems of efficiently distributing (balancing) the computational load and of accommodating input, intermediate, and output databases are no less important. The main tasks of such a balancing system are load and condition monitoring of compute nodes and the selection of a node to which a user's request is forwarded, in accordance with a predetermined algorithm. Load balancing is one of the most widely used methods of increasing the productivity of distributed computing systems through the optimal allocation of tasks between the computer system nodes. Therefore, the development of methods and algorithms for computing an optimal schedule in a distributed system that dynamically changes its infrastructure is an important task.
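
    One standard algorithm of the kind such a balancing system might use (named only as an illustration, not as this paper's algorithm) is the "power of two choices" rule: probe a few random nodes and dispatch the task to the least loaded of them. A sketch with hypothetical task costs:

    import random

    def least_loaded_dispatch(task_costs, n_nodes, probe=2, seed=3):
        """Power-of-two-choices sketch: for each task, sample `probe` random
        nodes and assign the task to the one with the smallest current load."""
        rng = random.Random(seed)
        load = [0.0] * n_nodes
        for cost in task_costs:
            candidates = rng.sample(range(n_nodes), probe)
            target = min(candidates, key=lambda node: load[node])
            load[target] += cost
        return load

    rng = random.Random(4)
    tasks = [rng.uniform(1.0, 10.0) for _ in range(5000)]
    loads = least_loaded_dispatch(tasks, n_nodes=16)
    print(f"max/mean load ratio = {max(loads) / (sum(loads) / len(loads)):.2f}")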

  10. Automatic Model Generation Framework for Computational Simulation of Cochlear Implantation

    DEFF Research Database (Denmark)

    Mangado Lopez, Nerea; Ceresa, Mario; Duchateau, Nicolas

    2016-01-01

    Recent developments in computational modeling of cochlear implantation are promising to study in silico the performance of the implant before surgery. However, creating a complete computational model of the patient's anatomy while including an external device geometry remains challenging. To address such a challenge, we propose an automatic framework for the generation of patient-specific meshes for finite element modeling of the implanted cochlea. First, a statistical shape model is constructed from high-resolution anatomical μCT images. Then, by fitting the statistical model to a patient's CT image, an accurate model of the patient-specific cochlea anatomy is obtained. An algorithm based on the parallel transport frame is employed to perform the virtual insertion of the cochlear implant. Our automatic framework also incorporates the surrounding bone and nerve fibers and assigns...

  11. Simulation of Tailrace Hydrodynamics Using Computational Fluid Dynamics Models

    Energy Technology Data Exchange (ETDEWEB)

    Cook, Christopher B.; Richmond, Marshall C.

    2001-05-01

    This report investigates the feasibility of using computational fluid dynamics (CFD) tools to investigate hydrodynamic flow fields surrounding the tailrace zone below large hydraulic structures. Previous and ongoing studies using CFD tools to simulate gradually varied flow with multiple constituents and forebay/intake hydrodynamics have shown that CFD tools can provide valuable information for hydraulic and biological evaluation of fish passage near hydraulic structures. These studies, however, are incapable of simulating the rapidly varying flow fields that involve breakup of the free surface, such as those through and below high-flow outfalls and spillways. Although the use of CFD tools for these types of flow is still an active area of research, initial applications discussed in this report show that these tools are capable of simulating the primary features of these highly transient flow fields.

  12. Computer Modeling and Simulation of Ultrasonic Signal Processing and Measurements

    Directory of Open Access Journals (Sweden)

    Y. B. Gandole

    2012-01-01

    A system for simulation, measurement, and signal processing, implemented with a graphical user interface, is presented. The received signal from the simulation is compared to that of an actual measurement in the time domain. The comparison of simulated and experimental data clearly shows that acoustic wave propagation can be modeled. The feasibility has been demonstrated in an ultrasound transducer setup for material property investigations. The results of the simulation are compared to experimental measurements; the simulation results agree well with the experimental data, which confirms the validity of the model. The simulation tool therefore provides a way to predict the received signal before anything is built. Furthermore, the use of an ultrasonic simulation package allows for the development of the associated electronics to amplify and process the received ultrasonic signals. Such a virtual design and testing procedure not only saves time and money, but also provides a better understanding of design failures and allows designs to be modified more efficiently and economically.

  13. Applying computer simulation models as learning tools in fishery management

    Science.gov (United States)

    Johnson, B.L.

    1995-01-01

    Computer models can be powerful tools for addressing many problems in fishery management, but uncertainty about how to apply models and how they should perform can lead to a cautious approach to modeling. Within this approach, we expect models to make quantitative predictions but only after all model inputs have been estimated from empirical data and after the model has been tested for agreement with an independent data set. I review the limitations to this approach and show how models can be more useful as tools for organizing data and concepts, learning about the system to be managed, and exploring management options. Fishery management requires deciding what actions to pursue to meet management objectives. Models do not make decisions for us but can provide valuable input to the decision-making process. When empirical data are lacking, preliminary modeling with parameters derived from other sources can help determine priorities for data collection. When evaluating models for management applications, we should attempt to define the conditions under which the model is a useful, analytical tool (its domain of applicability) and should focus on the decisions made using modeling results, rather than on quantitative model predictions. I describe an example of modeling used as a learning tool for the yellow perch Perca flavescens fishery in Green Bay, Lake Michigan.

  14. Modeling and computational simulation of the osmotic evaporation process

    Directory of Open Access Journals (Sweden)

    Freddy Forero Longas

    2016-09-01

    Context: Among membrane processing technologies, osmotic evaporation is a promising alternative for the transformation of exotic fruits, generating concentrated products that can be used in the daily diet, are easier to consume, reduce transportation costs, and have an increased shelf life. Method: In this research, a comprehensive strategy was studied and developed for multiphysics modeling and simulation of mass and momentum transfer phenomena in the osmotic evaporation process using Comsol® and Matlab® software. An axisymmetric two-dimensional geometry was used as a simplification of the real module, and the finite element method was used for the numerical solution. The simulations were validated experimentally in a laboratory-scale osmotic evaporation system. Results: The models used and the simulations generated were statistically significant (p < 0.05) in predicting the flux behavior, taking into account the effects of feed flow and temperature together with the brine flow; correlations above 96% between experimental and calculated data were obtained. Conclusions: It was found that, for the conditions studied, the Knudsen diffusion model is the most suitable to describe the transfer of water vapor through the hydrophobic membrane. The simulations developed adequately describe the process of osmotic evaporation, becoming a tool for faster and more economical development of this technology.
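
    For reference, the Knudsen diffusion model found most suitable here gives the vapour flux J = (2*eps*r / (3*tau*delta)) * sqrt(8*M / (pi*R*T)) * dP, for pore radius r, porosity eps, tortuosity tau, membrane thickness delta, molar mass M, and transmembrane vapour-pressure difference dP. A sketch with hypothetical membrane parameters:

    import math

    def knudsen_flux(pore_radius, porosity, tortuosity, thickness, T, dP, M=0.018):
        """Knudsen-regime water-vapour flux through a hydrophobic membrane
        (kg m^-2 s^-1)."""
        R = 8.314  # gas constant, J/(mol K)
        return (2.0 * porosity * pore_radius / (3.0 * tortuosity * thickness)
                * math.sqrt(8.0 * M / (math.pi * R * T)) * dP)

    # Hypothetical PTFE membrane: 0.1 um pores, 75% porosity, tortuosity 2, 100 um thick.
    J = knudsen_flux(0.1e-6, 0.75, 2.0, 100e-6, T=303.0, dP=800.0)
    print(f"flux = {J * 3600:.2f} kg/m2/h")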

  15. The design and calibration of a simulation model of a star computer network

    CERN Document Server

    Gomaa, H

    1982-01-01

    A simulation model of the CERN(European Organization for Nuclear Research) SPS star computer network is described. The model concentrates on simulating the message handling computer, through which all messages in the network pass. The paper describes the main features of the model, the transfer time parameters in the model and how performance measurements were used to assist in the calibration of the model.

  16. A demonstrative model of a lunar base simulation on a personal computer

    Science.gov (United States)

    1985-01-01

    The initial demonstration model of a lunar base simulation is described. This initial model was developed on the personal computer level to demonstrate feasibility and technique before proceeding to a larger computer-based model. Lotus Symphony Version 1.1 software was used to base the demonstration model on a personal computer with an MS-DOS operating system. The personal computer-based model determined the applicability of lunar base modeling techniques developed at an LSPI/NASA workshop. In addition, the personal computer-based demonstration model defined a modeling structure that could be employed on a larger, more comprehensive VAX-based lunar base simulation. Refinement of this personal computer model and the development of a VAX-based model is planned in the near future.

  17. Models for Planning and Simulation in Computer Assisted Orthognathic Surgery

    CERN Document Server

    Chabanas, Matthieu; Marecaux, Christophe; Payan, Yohan; Boutault, Franck

    2002-01-01

    Two aspects required to establish a planning in orthognathic surgery are addressed in this paper. First, a 3D cephalometric analysis, which is clinically essential for the therapeutic decision. Then, an original method to build a biomechanical model of the patient's facial soft tissue, which provides an evaluation of the aesthetic outcomes of an intervention. Both points are developed within a clinical application context for computer-aided maxillofacial surgery.

  18. A transport model for computer simulation of wildfires

    Energy Technology Data Exchange (ETDEWEB)

    Linn, R. [Los Alamos National Lab., NM (United States)

    1997-12-31

    Realistic self-determining simulation of wildfires is a difficult task because of a large variety of important length scales (including scales on the size of twigs or grass and the size of large trees), imperfect data, complex fluid mechanics and heat transfer, and very complicated chemical reactions. The author uses a transport approach to produce a model that exhibits a self-determining propagation rate. The transport approach allows him to represent a large number of environments such as those with nonhomogeneous vegetation and terrain. He accounts for the microscopic details of a fire with macroscopic resolution by dividing quantities into mean and fluctuating parts similar to what is done in traditional turbulence modeling. These divided quantities include fuel, wind, gas concentrations, and temperature. Reaction rates are limited by the mixing process and not the chemical kinetics. The author has developed a model that includes the transport of multiple gas species, such as oxygen and volatile hydrocarbons, and tracks the depletion of various fuels and other stationary solids and liquids. From this model he develops a simplified local burning model with which he performs a number of simulations that demonstrate that he is able to capture the important physics with the transport approach. With this simplified model he is able to pick up the essence of wildfire propagation, including such features as acceleration when transitioning to upsloping terrain, deceleration of fire fronts when they reach downslopes, and crowning in the presence of high winds.

  19. Computer model simulation of null-flux magnetic suspension and guidance

    Energy Technology Data Exchange (ETDEWEB)

    He, Jianliang; Rote, D.M.

    1992-06-01

    This paper discusses the magnetic force computations in a null-flux suspension system using dynamic circuit theory. A computer simulation model that can be used to compute magnetic forces and predict the system performance is developed on the basis of dynamic circuit theory. Numerical examples are presented to demonstrate the application of the model. The performance of the null-flux suspension system is simulated and discussed. 8 refs.

  1. Computer modeling and simulators as part of university training for NPP operating personnel

    Science.gov (United States)

    Volman, M.

    2017-01-01

    This paper considers aspects of a program for training future nuclear power plant personnel developed by the NPP Department of Ivanovo State Power Engineering University. Computer modeling is used for numerical experiments on the kinetics of nuclear reactors in Mathcad. Simulation modeling is carried out on computer-based and full-scale simulators of a water-cooled power reactor to reproduce neutron-physical reactor measurements and the start-up and shutdown processes.

  2. Advances in simulated modeling of vibration systems based on computational intelligence

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Computational intelligence is the computational simulation of the bio-intelligence, which includes artificial neural networks, fuzzy systems and evolutionary computations. This article summarizes the state of the art in the field of simulated modeling of vibration systems using methods of computational intelligence, based on some relevant subjects and the authors' own research work. First, contributions to the applications of computational intelligence to the identification of nonlinear characteristics of packaging are reviewed. Subsequently, applications of the newly developed training algorithms for feedforward neural networks to the identification of restoring forces in multi-degree-of-freedom nonlinear systems are discussed. Finally, the neural-network-based method of model reduction for the dynamic simulation of microelectromechanical systems (MEMS) using generalized Hebbian algorithm (GHA) and robust GHA is outlined. The prospects of the simulated modeling of vibration systems using techniques of computational intelligence are also indicated.

  3. Realistic modeling of clinical laboratory operation by computer simulation.

    Science.gov (United States)

    Vogt, W; Braun, S L; Hanssmann, F; Liebl, F; Berchtold, G; Blaschke, H; Eckert, M; Hoffmann, G E; Klose, S

    1994-06-01

    An important objective of laboratory management is to adjust the laboratory's capability to the needs of patients' care as well as economy. The consequences of management may be changes in laboratory organization, equipment, or personnel planning. At present only one's individual experience can be used for making such decisions. We have investigated whether the techniques of operations research could be transferred to a clinical laboratory and whether an adequate simulation model of the laboratory could be realized. First we listed and documented the system design and the process flow for each single laboratory request. These input data were linked by the simulation model (programming language SIMSCRIPT II.5). The output data (turnaround times, utilization rates, and analysis of queue length) were validated by comparison with the current performance data obtained by tracking specimen flow. Congruence of the data was excellent (within +/- 4%). In planning experiments we could study the consequences of changes in order entry, staffing, and equipment on turnaround times, utilization, and queue lengths. We conclude that simulation can be a valuable tool for better management decisions.
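    The laboratory model above was written in SIMSCRIPT II.5; as a rough, language-agnostic illustration of the discrete-event logic behind turnaround times, queue lengths and utilization, the following minimal Python sketch simulates specimen flow through a single analyzer. The arrival and service parameters are hypothetical, not values from the study.

```python
import random

# Single-server specimen flow: exponential interarrival and service times
# (hypothetical parameters; the real model covers many stations and request
# types). FIFO turnaround time follows from the Lindley recursion.
random.seed(1)
MEAN_INTERARRIVAL = 4.0   # minutes between specimen arrivals (assumed)
MEAN_SERVICE = 3.0        # minutes of analyzer time per specimen (assumed)
N = 10_000

t_arrival, analyzer_free_at = 0.0, 0.0
turnaround = []
for _ in range(N):
    t_arrival += random.expovariate(1.0 / MEAN_INTERARRIVAL)
    start = max(t_arrival, analyzer_free_at)        # wait if analyzer is busy
    finish = start + random.expovariate(1.0 / MEAN_SERVICE)
    analyzer_free_at = finish
    turnaround.append(finish - t_arrival)

print(f"mean turnaround: {sum(turnaround) / N:.1f} min")
print(f"theoretical utilization: {MEAN_SERVICE / MEAN_INTERARRIVAL:.0%}")
```

    Planning experiments of the kind described, such as changes in staffing or equipment, correspond to rerunning such a simulation with altered service capacity and comparing the resulting turnaround and queue statistics.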

  4. A computational fluid dynamics model for wind simulation:model implementation and experimental validation

    Institute of Scientific and Technical Information of China (English)

    Zhuo-dong ZHANG; Ralf WIELAND; Matthias REICHE; Roger FUNK; Carsten HOFFMANN; Yong LI; Michael SOMMER

    2012-01-01

    To provide physically based wind modelling for wind erosion research at regional scale, a 3D computational fluid dynamics (CFD) wind model was developed. The model was programmed in C language based on the Navier-Stokes equations, and it is freely available as open source. Integrated with the spatial analysis and modelling tool (SAMT), the wind model has convenient input preparation and powerful output visualization. To validate the wind model, a series of experiments was conducted in a wind tunnel. A blocking inflow experiment was designed to test the performance of the model on simulation of basic fluid processes. A round obstacle experiment was designed to check if the model could simulate the influences of the obstacle on the wind field. Results show that measured and simulated wind fields have high correlations, and the wind model can simulate both the basic processes of the wind and the influences of the obstacle on the wind field. These results show the high reliability of the wind model. A digital elevation model (DEM) of an area (3800 m long and 1700 m wide) in the Xilingele grassland in Inner Mongolia (autonomous region, China) was applied to the model, and a 3D wind field has been successfully generated. The clear implementation of the model and the adequate validation by wind tunnel experiments laid a solid foundation for the prediction and assessment of wind erosion at regional scale.

  5. Computer Modelling and Simulation of Solar PV Array Characteristics

    Science.gov (United States)

    Gautam, Nalin Kumar

    2003-02-01

    The main objective of my PhD research work was to study the behaviour of inter-connected solar photovoltaic (PV) arrays. The approach involved the construction of mathematical models to investigate different types of research problems related to the energy yield, fault tolerance, efficiency and optimal sizing of inter-connected solar PV array systems. My research work can be divided into four different types of research problems: 1. Modeling of inter-connected solar PV array systems to investigate their electrical behavior, 2. Modeling of different inter-connected solar PV array networks to predict their expected operational lifetimes, 3. Modeling solar radiation estimation and its variability, and 4. Modeling of a coupled system to estimate the size of PV array and battery-bank in the stand-alone inter-connected solar PV system where the solar PV system depends on a system providing solar radiant energy. The successful application of mathematics to the above-mentioned problems entailed three phases: 1. The formulation of the problem in a mathematical form using numerical, optimization, probabilistic and statistical methods / techniques, 2. The translation of mathematical models using C++ to simulate them on a computer, and 3. The interpretation of the results to see how closely they correlated with the real data. Array is the most cost-intensive component of the solar PV system. Since the electrical performances as well as life properties of an array are highly sensitive to field conditions, different characteristics of the arrays, such as energy yield, operational lifetime, collector orientation, and optimal sizing were investigated in order to improve their efficiency, fault-tolerance and reliability. Three solar cell interconnection configurations in the array - series-parallel, total-cross-tied, and bridge-linked, were considered. The electrical characteristics of these configurations were investigated to find out one that is comparatively less susceptible to

  6. Investigating interventions in Alzheimer's disease with computer simulation models.

    Directory of Open Access Journals (Sweden)

    Carole J Proctor

    Full Text Available Progress in the development of therapeutic interventions to treat or slow the progression of Alzheimer's disease has been hampered by lack of efficacy and unforeseen side effects in human clinical trials. This setback highlights the need for new approaches to pre-clinical testing of possible interventions. Systems modelling is becoming increasingly recognised as a valuable tool for investigating molecular and cellular mechanisms involved in ageing and age-related diseases. However, there is still a lack of awareness of modelling approaches in many areas of biomedical research. We previously developed a stochastic computer model to examine some of the key pathways involved in the aggregation of amyloid-beta (Aβ) and the microtubule-binding protein tau. Here we show how we extended this model to include the main processes involved in passive and active immunisation against Aβ and then demonstrate the effects of this intervention on soluble Aβ, plaques, phosphorylated tau and tangles. The model predicts that immunisation leads to clearance of plaques but only results in small reductions in levels of soluble Aβ, phosphorylated tau and tangles. The behaviour of this model is supported by neuropathological observations in Alzheimer patients immunised against Aβ. Since soluble Aβ, phosphorylated tau and tangles correlate more closely with cognitive decline than plaques do, our model suggests that immunotherapy against Aβ may not be effective unless it is performed very early in the disease process or combined with other therapies.

  7. A Computational Model for Simulating Spaceflight Induced Bone Remodeling

    Science.gov (United States)

    Pennline, James A.; Mulugeta, Lealem

    2014-01-01

    An overview of an initial development of a model of bone loss due to skeletal unloading in weight-bearing sites is presented. The skeletal site chosen for the initial application of the model is the femoral neck region, because hip fractures can be debilitating to the overall performance and health of astronauts. The paper begins with the motivation for developing such a model of the time course of change in bone: to understand the mechanism of bone demineralization experienced by astronauts in microgravity, to quantify the health risk, and to establish countermeasures. Following this, a general description of a mathematical formulation of the process of bone remodeling is discussed. Equations governing the rate of change of mineralized bone volume fraction and of active osteoclast and osteoblast populations are illustrated. Some of the physiology of bone remodeling, the theory of how imbalance in remodeling can cause bone loss, and how the model attempts to capture this are discussed. The results of a preliminary validation analysis that was carried out are presented. The analysis compares a set of simulation results against bone loss data from control subjects who participated in two different bed rest studies. Finally, the paper concludes by outlining the current limitations and caveats of the model, and planned future work to enhance the state of the model.
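    As a toy illustration of the kind of governing equations mentioned, the sketch below integrates a made-up three-variable system in which unloading raises osteoclast activity and the bone volume fraction drifts downward; none of the rate constants or functional forms are taken from the paper.

```python
# Toy model (hypothetical, not the paper's equations): bone volume fraction
# BV changes through formation by osteoblasts (OB) and resorption by
# osteoclasts (OC); skeletal unloading is a raised osteoclast target.
def step(BV, OC, OB, dt=1.0, k_form=0.0018, k_res=0.002, unload=1.5):
    dBV = k_form * OB - k_res * OC        # net remodeling balance
    dOC = 0.1 * (unload - OC)             # unloading drives OC activity up
    dOB = 0.1 * (1.0 - OB)                # OB activity stays near baseline
    return BV + dt * dBV, OC + dt * dOC, OB + dt * dOB

BV, OC, OB = 0.25, 1.0, 1.0               # initial state (dimensionless)
for day in range(120):                    # ~4 months of simulated unloading
    BV, OC, OB = step(BV, OC, OB)
print(f"bone volume fraction after 120 days: {BV:.3f}")
```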

  8. Computational modeling, optimization and manufacturing simulation of advanced engineering materials

    CERN Document Server

    2016-01-01

    This volume presents recent research work focused on the development of adequate theoretical and numerical formulations to describe the behavior of advanced engineering materials. Particular emphasis is devoted to applications in the fields of biological tissues, phase-changing and porous materials, polymers, and to micro/nano scale modeling. Sensitivity analysis, gradient-based and non-gradient-based optimization procedures are involved in many of the chapters, aiming at the solution of constitutive inverse problems and parameter identification. All these relevant topics are presented by experienced international and inter-institutional research teams, resulting in a high-level compilation. The book is a valuable research reference for scientists, senior undergraduate and graduate students, as well as for engineers working in the area of computational material modeling.

  9. A Computer Simulation Modeling Approach to Estimating Utility in Several Air Force Specialties

    Science.gov (United States)

    1992-05-01

    AL-TR-1992-0006: A Computer Simulation Modeling Approach to Estimating Utility in Several Air Force Specialties, by Brice M. Stone et al.

  10. Quantifying Uncertainty from Computational Factors in Simulations of a Model Ballistic System

    Science.gov (United States)

    2017-08-01

    ARL-TR-8074, August 2017, US Army Research Laboratory: Quantifying Uncertainty from Computational Factors in Simulations of a Model Ballistic System, by Daniel J Hornbaker, Weapons and Materials Research.

  11. Global economic burden of Chagas disease: a computational simulation model

    Science.gov (United States)

    Lee, Bruce Y; Bacon, Kristina M; Bottazzi, Maria Elena; Hotez, Peter J

    2013-01-01

    Summary Background As Chagas disease continues to expand beyond tropical and subtropical zones, a growing need exists to better understand its resulting economic burden to help guide stakeholders such as policy makers, funders, and product developers. We developed a Markov simulation model to estimate the global and regional health and economic burden of Chagas disease from the societal perspective. Methods Our Markov model structure had a 1 year cycle length and consisted of five states: acute disease, indeterminate disease, cardiomyopathy with or without congestive heart failure, megaviscera, and death. Major model parameter inputs, including the annual probabilities of transitioning from one state to another, and present case estimates for Chagas disease came from various sources, including WHO and other epidemiological and disease-surveillance-based reports. We calculated annual and lifetime health-care costs and disability-adjusted life-years (DALYs) for individuals, countries, and regions. We used a discount rate of 3% to adjust all costs and DALYs to present-day values. Findings On average, an infected individual incurs US$474 in health-care costs and 0·51 DALYs annually. Over his or her lifetime, an infected individual accrues an average net present value of $3456 and 3·57 DALYs. Globally, the annual burden is $627·46 million in health-care costs and 806 170 DALYs. The global net present value of currently infected individuals is $24·73 billion in health-care costs and 29 385 250 DALYs. Conversion of this burden into costs results in annual per-person costs of $4660 and lifetime per-person costs of $27 684. Global costs are $7·19 billion per year and $188·80 billion per lifetime. More than 10% of these costs emanate from the USA and Canada, where Chagas disease has not been traditionally endemic. A substantial proportion of the burden emerges from lost productivity from cardiovascular disease-induced early mortality. Interpretation The economic burden
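    The mechanics of such a Markov cohort model are easy to sketch. The fragment below uses the paper's five states, 1-year cycle and 3% discount rate, but the transition probabilities and annual costs are purely illustrative placeholders, not the calibrated inputs of the study.

```python
import numpy as np

# States: acute, indeterminate, cardiomyopathy, megaviscera, dead.
# Transition probabilities per 1-year cycle (hypothetical values).
P = np.array([
    [0.00, 0.90, 0.02, 0.01, 0.07],   # from acute
    [0.00, 0.95, 0.03, 0.01, 0.01],   # from indeterminate
    [0.00, 0.00, 0.90, 0.00, 0.10],   # from cardiomyopathy
    [0.00, 0.00, 0.00, 0.95, 0.05],   # from megaviscera
    [0.00, 0.00, 0.00, 0.00, 1.00],   # death is absorbing
])
annual_cost = np.array([500.0, 50.0, 2000.0, 1500.0, 0.0])  # US$ (assumed)
discount = 0.03

cohort = np.array([1.0, 0.0, 0.0, 0.0, 0.0])  # everyone starts in acute
lifetime_cost = 0.0
for year in range(60):                         # follow the cohort for 60 cycles
    lifetime_cost += (cohort @ annual_cost) / (1 + discount) ** year
    cohort = cohort @ P                        # advance one 1-year cycle
print(f"discounted lifetime cost per case: ${lifetime_cost:,.0f}")
```

    DALYs accumulate the same way, with a per-state disability weight in place of the cost vector.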

  12. Exploring Students' Computational Thinking Skills in Modeling and Simulation Projects: : A Pilot Study

    NARCIS (Netherlands)

    Grgurina, Natasa; van Veen, Klaas; Barendsen, Erik; Zwaneveld, Bert; Suhre, Cor; Gal-Ezer, Judith; Sentance, Sue; Vahrenhold, Jan

    2015-01-01

    Computational Thinking (CT) is gaining a lot of attention in education. We explored how to discern the occurrences of CT in the projects of 12th grade high school students in the computer science (CS) course. Within the projects, they constructed models and ran simulations of phenomena from other (S

  13. Visualization of simulated small vessels on computed tomography using a model-based iterative reconstruction technique

    Directory of Open Access Journals (Sweden)

    Toru Higaki

    2017-08-01

    Full Text Available This article describes a quantitative evaluation of the visualization of small vessels using several image reconstruction methods in computed tomography. Simulated vessels with diameters of 1–6 mm made by a 3D printer were scanned using 320-row detector computed tomography (CT). Hybrid iterative reconstruction (hybrid IR) and model-based iterative reconstruction (MBIR) were performed for the image reconstruction.

  14. Dynamic Factor Method of Computing Dynamic Mathematical Model for System Simulation

    Institute of Scientific and Technical Information of China (English)

    老大中; 吴娟; 杨策; 蒋滋康

    2003-01-01

    Computational methods for a typical dynamic mathematical model that describes the differential element and the inertial element in system simulation are investigated, along with the stability of the model's numerical solutions. By means of theoretical analysis, the error formulas, the error sign criteria and the error relationship criterion of the implicit Euler method and the trapezoidal method are given, the dynamic factor affecting the computational accuracy is identified, and the formula and methods for computing the dynamic factor are given. The computational accuracy of dynamic mathematical models of this kind can be improved by use of the dynamic factor.
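    The difference between the two integration methods, and the role of the step-size-to-time-constant ratio, can be seen on a first-order inertial element T·dy/dt + y = u with a unit step input, whose exact response is y(t) = 1 − exp(−t/T). The parameter values below are illustrative.

```python
import math

T, h, u = 1.0, 0.2, 1.0          # time constant, step size, step input
r = h / T                        # the ratio that drives the local error

y_ie = y_tr = 0.0
for n in range(1, 26):
    y_ie = (y_ie + r * u) / (1 + r)                    # implicit Euler
    y_tr = ((1 - r / 2) * y_tr + r * u) / (1 + r / 2)  # trapezoidal
    exact = 1 - math.exp(-n * h / T)
    if n % 5 == 0:
        print(f"t={n * h:4.1f}  IE err={y_ie - exact:+.5f}  "
              f"TR err={y_tr - exact:+.5f}")
```

    Running this shows the implicit Euler error keeping one sign (lagging the exact response) while the trapezoidal error is far smaller, consistent with the kind of error sign behavior the paper analyzes.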

  15. Description of a digital computer simulation of an Annular Momentum Control Device (AMCD) laboratory test model

    Science.gov (United States)

    Woolley, C. T.; Groom, N. J.

    1981-01-01

    A description of a digital computer simulation of an Annular Momentum Control Device (AMCD) laboratory model is presented. The AMCD is a momentum exchange device which is under development as an advanced control effector for spacecraft attitude control systems. The digital computer simulation of this device incorporates the following models: six degree of freedom rigid body dynamics; rim warp; controller dynamics; nonlinear distributed element axial bearings; as well as power driver and power supply current limits. An annotated FORTRAN IV source code listing of the computer program is included.

  16. Computer Modeling and Simulation Evaluation of High Power LED Sources for Secondary Optical Design

    Institute of Scientific and Technical Information of China (English)

    SU Hong-dong; WANG Ya-jun; DONG Ji-yang; CHEN Zhong

    2007-01-01

    A novel computer modeling method for high-power light emitting diodes (LEDs) is proposed and demonstrated. It covers the geometrical structure and optical properties of a high-power LED, as well as the definition of the LED die with its spatial and angular distributions. Merits and drawbacks of traditional modeling methods, when applied to high-power LEDs for secondary optical design, are discussed. Two commercial high-power LEDs are simulated using the proposed computer modeling method. The correlation coefficient is proposed to compare and analyze the simulation results and manufacturing specifications. The source model is precisely demonstrated by obtaining correlation coefficients above 99% with different surface incident angle intervals.

  17. REXOR Rotorcraft Simulation Model. Volume 2. Computer Implementation

    Science.gov (United States)

    1976-07-01

    ACCEL is the computation nucleus of REXOR; it gathers the information to form the generalized mass and force matrices and controls the acceleration update sequence.

  18. 9th Annual Conference of the North East Polytechnics Mathematical Modelling & Computer Simulation Group

    CERN Document Server

    Bradley, R

    1987-01-01

    In recent years, mathematical modelling allied to computer simulation has emerged as an effective and invaluable design tool for industry and a discipline in its own right. This has been reflected in the popularity of the growing number of courses and conferences devoted to the area. The North East Polytechnics Mathematical Modelling and Computer Simulation Group has a balanced representation of academics and industrialists and, as a Group, has the objective of promoting a continuing partnership between the Polytechnics in the North East and local industry. Prior to the present conference the Group has organised eight conferences with a variety of themes related to mathematical modelling and computer simulation. The theme chosen for the Polymodel 9 Conference held in Newcastle upon Tyne in May 1986 was Industrial Vibration Modelling, which is particularly appropriate for 'Industry Year' and is an area which continues to present industry and academics with new and challenging problems. The aim of the Conferen...

  19. Computational Modeling and Simulation of Film-Condensation

    Science.gov (United States)

    2013-01-18

    Two different problems are simulated and analyzed, discussed below in separate subsections; the first is a CFD analysis of film condensation on the underside of a surface.

  20. Mathematical and computational modeling simulation of solar drying Systems

    Science.gov (United States)

    Mathematical modeling of solar drying systems has the primary aim of predicting the required drying time for a given commodity, dryer type, and environment. Both fundamental (Fickian diffusion) and semi-empirical drying models have been applied to the solar drying of a variety of agricultural commo...

  1. Virtually expert: modes of environmental computer simulation modeling.

    Science.gov (United States)

    Landström, Catharina; Whatmore, Sarah J

    2014-12-01

    This paper challenges three assumptions common in the literature on expertise: that expertise is linearly derived from scientific knowledge; that experts always align with the established institutional order; and that expertise is a property acquired by individuals. We criticize these ideas by juxtaposing three distinct expert practices involved with flood risk management in England. Virtual engineering is associated with commercial consultancy and relies on standardized software packages to assess local flood inundation. Mathematical experimentation refers to academic scientists creating new digital renderings of the physical dynamics of flooding. Participatory modeling denotes research projects that aim to transform the relationships between experts and local communities. Focusing on different modes of modeling we contribute an analysis of how particular models articulate with specific politics of knowledge as experts form relationships with flood risk management actors. Our empirical study also shows how models can contribute to re-distribution of expertise in local flood risk management.

  2. Quasi-Monte Carlo simulation and variance reduction techniques substantially reduce computational requirements of patient-level simulation models: An application to a discrete event simulation model

    NARCIS (Netherlands)

    Treur, M.; Postma, M.

    2014-01-01

    Objectives: Patient-level simulation models provide increased flexibility to overcome the limitations of cohort-based approaches in health-economic analysis. However, the computational requirements of reaching convergence are a notorious barrier. The objective was to assess the impact of using quasi-mont

  3. Quantification of remodeling parameter sensitivity - assessed by a computer simulation model

    DEFF Research Database (Denmark)

    Thomsen, J.S.; Mosekilde, Li.; Mosekilde, Erik

    1996-01-01

    We have used a computer simulation model to evaluate the effect of several bone remodeling parameters on vertebral cancellous bone. The menopause was chosen as the base case scenario, and the sensitivity of the model to the following parameters was investigated: activation frequency, formation

  4. Mechanical Modeling and Computer Simulation of Protein Folding

    Science.gov (United States)

    Prigozhin, Maxim B.; Scott, Gregory E.; Denos, Sharlene

    2014-01-01

    In this activity, science education and modern technology are bridged to teach students at the high school and undergraduate levels about protein folding and to strengthen their model building skills. Students are guided from a textbook picture of a protein as a rigid crystal structure to a more realistic view: proteins are highly dynamic…

  5. Systematic review of the use of computer simulation modeling of patient flow in surgical care.

    Science.gov (United States)

    Sobolev, Boris G; Sanchez, Victor; Vasilakis, Christos

    2011-02-01

    Computer simulation has been employed to evaluate proposed changes in the delivery of health care. However, little is known about the utility of simulation approaches for analysis of changes in the delivery of surgical care. We searched eight bibliographic databases for this comprehensive review of the literature published over the past five decades, and found 34 publications that reported on simulation models for the flow of surgical patients. The majority of these publications presented a description of the simulation approach: 91% outlined the underlying assumptions for modeling, 88% presented the system requirements, and 91% described the input and output data. However, only half of the publications reported that models were constructed to address the needs of policy-makers, and only 26% reported some involvement of health system managers and policy-makers in the simulation study. In addition, we found a wide variation in the presentation of assumptions, system requirements, input and output data, and results of simulation-based policy analysis.

  6. Computer Modeling and Simulation of Geofluids: A General Review and Sample Description

    Institute of Scientific and Technical Information of China (English)

    胡文XUAN; 段振豪; 等

    1997-01-01

    Thermodynamic properties of fluids are essential for understanding the geochemical behavior of various processes. The paper introduces the most up-to-date computer modeling and simulation methods in the study of the thermodynamics of geofluids, including semi-empirical models (such as equations of state) and molecular dynamics and Monte Carlo simulation. A well-established semi-empirical model can interpolate and extrapolate experimental data and yield much physicochemical information. Computer modeling may produce "experimental data" even under experimentally difficult conditions. These methods provide an important quantitative basis for the study of geological fluid systems.
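    As a minimal example of the semi-empirical, equation-of-state side mentioned above, the fragment below evaluates the van der Waals equation for a CO2-like fluid; real geofluid equations of state are far more elaborate, and the constants are textbook values used only for illustration.

```python
R = 8.314                  # gas constant, J/(mol K)
a, b = 0.3640, 4.267e-5    # van der Waals constants for CO2 (textbook values)

def pressure(T, Vm):
    """van der Waals pressure for temperature T (K) and molar volume Vm (m^3/mol)."""
    return R * T / (Vm - b) - a / Vm**2

print(f"P(400 K, 1e-4 m^3/mol) = {pressure(400.0, 1e-4) / 1e6:.1f} MPa")
```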

  7. Two-dimensional surrogate contact modeling for computationally efficient dynamic simulation of total knee replacements.

    Science.gov (United States)

    Lin, Yi-Chung; Haftka, Raphael T; Queipo, Nestor V; Fregly, Benjamin J

    2009-04-01

    Computational speed is a major limiting factor for performing design sensitivity and optimization studies of total knee replacements. Much of this limitation arises from extensive geometry calculations required by contact analyses. This study presents a novel surrogate contact modeling approach to address this limitation. The approach involves fitting contact forces from a computationally expensive contact model (e.g., a finite element model) as a function of the relative pose between the contacting bodies. Because contact forces are much more sensitive to displacements in some directions than others, standard surrogate sampling and modeling techniques do not work well, necessitating the development of special techniques for contact problems. We present a computational evaluation and practical application of the approach using dynamic wear simulation of a total knee replacement constrained to planar motion in a Stanmore machine. The sample points needed for surrogate model fitting were generated by an elastic foundation (EF) contact model. For the computational evaluation, we performed nine different dynamic wear simulations with both the surrogate contact model and the EF contact model. In all cases, the surrogate contact model accurately reproduced the contact force, motion, and wear volume results from the EF model, with computation time being reduced from 13 min to 13 s. For the practical application, we performed a series of Monte Carlo analyses to determine the sensitivity of predicted wear volume to Stanmore machine setup issues. Wear volume was highly sensitive to small variations in motion and load inputs, especially femoral flexion angle, but not to small variations in component placements. Computational speed was reduced from an estimated 230 h to 4 h per analysis. Surrogate contact modeling can significantly improve the computational speed of dynamic contact and wear simulations of total knee replacements and is appropriate for use in design sensitivity

  8. First steps in computational systems biology: A practical session in metabolic modeling and simulation.

    Science.gov (United States)

    Reyes-Palomares, Armando; Sánchez-Jiménez, Francisca; Medina, Miguel Ángel

    2009-05-01

    A comprehensive understanding of biological functions requires new systemic perspectives, such as those provided by systems biology. Systems biology approaches are hypothesis-driven and involve iterative rounds of model building, prediction, experimentation, model refinement, and development. Developments in computer science are allowing for ever faster numerical simulations of mathematical models. Mathematical modeling plays an essential role in new systems biology approaches. As a complex, integrated system, metabolism is a suitable topic of study for systems biology approaches. However, up until recently, this topic has not been properly covered in biochemistry courses. This communication reports the development and implementation of a practical lesson plan on metabolic modeling and simulation.
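    A typical exercise in such a practical session is to integrate a small kinetic model. The sketch below, with invented rate constants, simulates a two-step pathway S → I → P with Michaelis-Menten kinetics using forward Euler integration.

```python
def v(vmax, km, s):
    """Michaelis-Menten rate law."""
    return vmax * s / (km + s)

S, I, P = 5.0, 0.0, 0.0             # initial concentrations in mM (assumed)
dt = 0.01                           # integration step (assumed)
for _ in range(int(100 / dt)):      # simulate 100 time units
    v1 = v(vmax=1.0, km=0.5, s=S)   # S -> I
    v2 = v(vmax=0.8, km=0.3, s=I)   # I -> P
    S, I, P = S - v1 * dt, I + (v1 - v2) * dt, P + v2 * dt
print(f"S={S:.3f}  I={I:.3f}  P={P:.3f}  (total mass: {S + I + P:.3f})")
```

    Checking that total mass stays constant is a quick sanity test of both the model and the integrator step size.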

  9. Efficient scatter model for simulation of ultrasound images from computed tomography data

    Science.gov (United States)

    D'Amato, J. P.; Lo Vercio, L.; Rubi, P.; Fernandez Vera, E.; Barbuzza, R.; Del Fresno, M.; Larrabide, I.

    2015-12-01

    Background and motivation: Real-time ultrasound simulation refers to the process of computationally creating fully synthetic ultrasound images instantly. Due to the high value of specialized low-cost training for healthcare professionals, there is a growing interest in the use of this technology and in the development of high-fidelity systems that simulate the acquisition of echographic images. The objective is to create an efficient and reproducible simulator that can run either on notebooks or desktops using low-cost devices. Materials and methods: We present an interactive ultrasound simulator based on CT data. This simulator is based on ray-casting and provides real-time interaction capabilities. The simulation of scattering that is coherent with the transducer position in real time is also introduced. Such noise is produced using a simplified model of multiplicative noise and convolution with point spread functions (PSF) tailored for this purpose. Results: The computational efficiency of scattering map generation was revised, with improved performance. This allowed a more efficient simulation of coherent scattering in the synthetic echographic images while providing highly realistic results. We describe some quality and performance metrics to validate these results, with a performance of up to 55 fps achieved. Conclusion: The proposed technique for real-time scattering modeling provides realistic yet computationally efficient scatter distributions. The error between the original image and the simulated scattering image was compared for the proposed method and the state of the art, showing negligible differences in its distribution.
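    The simplified scatter model lends itself to a compact sketch: multiplicative noise applied to an echogenicity map derived from CT, followed by convolution with a PSF. Below, an anisotropic Gaussian stands in for the PSF and a synthetic map replaces the CT data; both are assumptions for illustration.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)
echogenicity = np.full((256, 256), 0.5)             # placeholder tissue map
echogenicity[100:160, 80:180] = 0.9                 # a brighter structure

speckle = echogenicity * rng.rayleigh(1.0, echogenicity.shape)  # multiplicative noise
image = gaussian_filter(speckle, sigma=(1.0, 3.0))  # axial/lateral PSF blur
image /= image.max()                                # normalize for display
```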

  10. Technology, Pedagogy, and Epistemology: Opportunities and Challenges of Using Computer Modeling and Simulation Tools in Elementary Science Methods

    Science.gov (United States)

    Schwarz, Christina V.; Meyer, Jason; Sharma, Ajay

    2007-01-01

    This study infused computer modeling and simulation tools in a 1-semester undergraduate elementary science methods course to advance preservice teachers' understandings of computer software use in science teaching and to help them learn important aspects of pedagogy and epistemology. Preservice teachers used computer modeling and simulation tools…

  11. Simulation of Human Episodic Memory by Using a Computational Model of the Hippocampus

    Directory of Open Access Journals (Sweden)

    Naoyuki Sato

    2010-01-01

    Full Text Available The episodic memory, the memory of personal events and history, is essential for understanding the mechanism of human intelligence. Neuroscience evidence has shown that the hippocampus, a part of the limbic system, plays an important role in the encoding and the retrieval of episodic memory. This paper reviews computational models of the hippocampus and introduces our own computational model of human episodic memory based on neural synchronization. Results from computer simulations demonstrate that our model provides advantages for instantaneous memory formation and selective retrieval enabling memory search. Moreover, this model was found to be able to predict human memory recall by integrating human eye movement data during encoding. The combined approach of computational models and experiments is efficient for theorizing about human episodic memory.

  12. Defining epidemics in computer simulation models: How do definitions influence conclusions?

    Directory of Open Access Journals (Sweden)

    Carolyn Orbann

    2017-06-01

    Full Text Available Computer models have proven to be useful tools in studying epidemic disease in human populations. Such models are being used by a broader base of researchers, and it has become more important to ensure that descriptions of model construction and data analyses are clear and communicate important features of model structure. Papers describing computer models of infectious disease often lack a clear description of how the data are aggregated and whether or not non-epidemic runs are excluded from analyses. Given that there is no concrete quantitative definition of what constitutes an epidemic within the public health literature, each modeler must decide on a strategy for identifying epidemics during simulation runs. Here, an SEIR model was used to test how varying the cutoff for considering a run an epidemic changes potential interpretations of simulation outcomes. Varying the cutoff from 0% to 15% of the model population ever infected with the illness generated significant differences in the number of deaths and in timing variables. These results are important for those who use models to form public health policy, in which questions of timing or implementation of interventions might be answered using findings from computer simulation models.
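    The effect of the cutoff is easy to reproduce with a toy stochastic model. The sketch below uses a chain-binomial SIR variant (simpler than the paper's SEIR model) with assumed parameters, and reports what share of runs would be counted as epidemics under different cutoffs.

```python
import random

def run_sir(n=1000, beta=0.3, gamma=0.1, seed=None):
    """One stochastic run; returns the fraction of the population ever infected."""
    rng = random.Random(seed)
    s, i, r = n - 1, 1, 0
    while i > 0:
        p_inf = 1 - (1 - beta / n) ** i            # per-susceptible infection prob.
        new_i = sum(rng.random() < p_inf for _ in range(s))
        new_r = sum(rng.random() < gamma for _ in range(i))
        s, i, r = s - new_i, i + new_i - new_r, r + new_r
    return r / n

runs = [run_sir(seed=k) for k in range(200)]
for cutoff in (0.0, 0.05, 0.15):
    share = sum(f > cutoff for f in runs) / len(runs)
    print(f"cutoff {cutoff:>4.0%}: {share:.0%} of runs counted as epidemics")
```

    Runs in which the index case recovers before transmitting fall below any nonzero cutoff, so summary statistics such as deaths and epidemic timing shift with the chosen threshold.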

  13. Designing Open Source Computer Models for Physics by Inquiry using Easy Java Simulation

    CERN Document Server

    Wee, Loo Kang

    2012-01-01

    The Open Source Physics community has created hundreds of physics computer models (Wolfgang Christian, Esquembre, & Barbato, 2011; F. K. Hwang & Esquembre, 2003), which are mathematical computational representations of real-life physics phenomena. Since the source codes are available and can be modified for redistribution under licenses such as Creative Commons Attribution or other compatible copyrights like the GNU General Public License (GPL), educators can customize (Wee & Mak, 2009) these models for more targeted productive (Wee, 2012) activities for their classroom teaching and redistribute them to benefit all humankind. In this interactive event, we will share the basics of using the free authoring toolkit called Easy Java Simulation (W. Christian, Esquembre, & Mason, 2010; Esquembre, 2010) so that participants can modify the open source computer models for their own learning and teaching needs. These computer models have the potential to provide the experience and context, essential for deepening students c...

  14. Programming of a flexible computer simulation to visualize pharmacokinetic-pharmacodynamic models.

    Science.gov (United States)

    Lötsch, J; Kobal, G; Geisslinger, G

    2004-01-01

    Teaching pharmacokinetic-pharmacodynamic (PK/PD) models can be made more effective using computer simulations. We propose the programming of educational PK or PK/PD computer simulations as an alternative to the use of pre-built simulation software. This approach has the advantage of adaptability to non-standard or complicated PK or PK/PD models. Simplicity of the programming procedure was achieved by selecting the LabVIEW programming environment. An intuitive user interface to visualize the time courses of drug concentrations or effects can be obtained with pre-built elements. The environment uses a wiring analogy that resembles electrical circuit diagrams rather than abstract programming code. The goal of high interactivity of the simulation was attained by allowing the program to run in continuously repeating loops, which makes the program behave flexibly to user input. The programming is described with the aid of a two-compartment PK simulation. Examples of more sophisticated simulation programs are also given, where the PK/PD simulation shows drug input, concentrations in plasma and at the effect site, and the effects themselves as a function of time. A multi-compartmental model of morphine, including metabolite kinetics and effects, is also included. The programs are available for download from the World Wide Web at http://www.klinik.uni-frankfurt.de/zpharm/klin/PKPDsimulation/content.html. For pharmacokineticists who only program occasionally, there is the possibility of building the computer simulation, together with the flexible interactive simulation algorithm, for clinical pharmacological teaching in the field of PK/PD models.
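    The two-compartment example translates directly into a few lines of any language; here is a plain Python analogue of what such a simulation loop computes, with illustrative rate constants and an IV bolus dose that are not taken from the article.

```python
# Two-compartment PK model after an IV bolus (all parameters assumed):
# central compartment 1 (plasma) and peripheral compartment 2.
k10, k12, k21 = 0.10, 0.05, 0.03   # 1/h: elimination and distribution rates
V1, dose = 10.0, 100.0             # central volume (L) and dose (mg)
A1, A2 = dose, 0.0                 # drug amounts in each compartment
dt, hours = 0.01, 24.0

t = 0.0
while t < hours:
    dA1 = -(k10 + k12) * A1 + k21 * A2
    dA2 = k12 * A1 - k21 * A2
    A1, A2 = A1 + dA1 * dt, A2 + dA2 * dt
    t += dt
print(f"C(0) = {dose / V1:.1f} mg/L, C(24 h) = {A1 / V1:.2f} mg/L")
```

    An interactive simulation of the kind described wraps such an update step in a continuously repeating loop and redraws the concentration curve whenever the user changes a parameter.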

  15. A generic simulation cell method for developing extensible, efficient and readable parallel computational models

    Science.gov (United States)

    Honkonen, I.

    2015-03-01

    I present a method for developing extensible and modular computational models without sacrificing serial or parallel performance or source code readability. By using a generic simulation cell method I show that it is possible to combine several distinct computational models to run in the same computational grid without requiring modification of existing code. This is an advantage for the development and testing of, e.g., geoscientific software as each submodel can be developed and tested independently and subsequently used without modification in a more complex coupled program. An implementation of the generic simulation cell method presented here, generic simulation cell class (gensimcell), also includes support for parallel programming by allowing model developers to select which simulation variables of, e.g., a domain-decomposed model to transfer between processes via a Message Passing Interface (MPI) library. This allows the communication strategy of a program to be formalized by explicitly stating which variables must be transferred between processes for the correct functionality of each submodel and the entire program. The generic simulation cell class requires a C++ compiler that supports a version of the language standardized in 2011 (C++11). The code is available at https://github.com/nasailja/gensimcell for everyone to use, study, modify and redistribute; those who do are kindly requested to acknowledge and cite this work.

  16. The Watts-Strogatz network model developed by including degree distribution: theory and computer simulation

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Y W [Surface Physics Laboratory and Department of Physics, Fudan University, Shanghai 200433 (China); Zhang, L F [Surface Physics Laboratory and Department of Physics, Fudan University, Shanghai 200433 (China); Huang, J P [Surface Physics Laboratory and Department of Physics, Fudan University, Shanghai 200433 (China)

    2007-07-20

    By using theoretical analysis and computer simulations, we develop the Watts-Strogatz network model by including degree distribution, in an attempt to improve the comparison between characteristic path lengths and clustering coefficients predicted by the original Watts-Strogatz network model and those of the real networks with the small-world property. Good agreement between the predictions of the theoretical analysis and those of the computer simulations has been shown. It is found that the developed Watts-Strogatz network model can fit the real small-world networks more satisfactorily. Some other interesting results are also reported by adjusting the parameters in a model degree-distribution function. The developed Watts-Strogatz network model is expected to help in the future analysis of various social problems as well as financial markets with the small-world property.
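    The two statistics being compared are straightforward to compute for the original Watts-Strogatz model; a minimal check with the networkx library is sketched below (the paper's degree-distribution extension would replace the uniform ring degree k, and the sizes here are arbitrary).

```python
import networkx as nx

n, k = 1000, 10                    # nodes and ring degree (illustrative)
for p in (0.0, 0.01, 0.1, 1.0):    # rewiring probability
    G = nx.watts_strogatz_graph(n, k, p, seed=42)
    L = nx.average_shortest_path_length(G)   # characteristic path length
    C = nx.average_clustering(G)             # clustering coefficient
    print(f"p={p:<5}  L={L:6.2f}  C={C:.3f}")
```

    The small-world signature appears at intermediate p, where L has already dropped close to the random-graph value while C remains near the lattice value.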

  17. Computer simulation in materials science

    Energy Technology Data Exchange (ETDEWEB)

    Arsenault, R.J.; Beeler, J.R.; Esterling, D.M.

    1988-01-01

    This book contains papers on the subject of modeling in materials science. Topics include thermodynamics of metallic solids and fluids, grain-boundary modeling, fracture from an atomistic point of view, and computer simulation of dislocations on an atomistic level.

  18. Efficiency using computer simulation of Reverse Threshold Model Theory on assessing a “One Laptop Per Child” computer versus desktop computer

    Directory of Open Access Journals (Sweden)

    Supat Faarungsang

    2017-04-01

    Full Text Available The Reverse Threshold Model Theory (RTMT) model was introduced based on limiting factor concepts, but its efficiency compared to the Conventional Model (CM) has not been published. This investigation assessed the efficiency of RTMT compared to CM using computer simulation on the "One Laptop Per Child" computer and a desktop computer. Based on probability values, it was found that RTMT was more efficient than CM among eight treatment combinations, and an earlier study verified that RTMT gives complete elimination of random error. Furthermore, RTMT has several advantages over CM and is therefore proposed to be applied to most research data.

  19. A web-based, collaborative modeling, simulation, and parallel computing environment for electromechanical systems

    Directory of Open Access Journals (Sweden)

    Xiaoliang Yin

    2015-03-01

    Full Text Available A complex electromechanical system is usually composed of multiple components from different domains, including mechanical, electronic, hydraulic, control, and so on. Modeling and simulation of electromechanical systems on a unified platform is one of the research hotspots in system engineering at present, and it is also the development trend for the design of complex electromechanical systems. Unified modeling techniques and tools based on the Modelica language provide a satisfactory solution. To meet the requirements of collaborative modeling, simulation, and parallel computing for complex electromechanical systems based on Modelica, a general web-based modeling and simulation prototype environment, namely WebMWorks, is designed and implemented. Based on rich Internet application technologies, an interactive graphical user interface for modeling and post-processing on a web browser was implemented; with the collaborative design module, the environment supports top-down, concurrent modeling and team cooperation; additionally, a service-oriented architecture was applied to supply compiling and solving services which run on cloud-like servers, so the environment can manage and dispatch large-scale simulation tasks in parallel on multiple computing servers simultaneously. An engineering application involving a pure electric vehicle was tested on WebMWorks. The results of the simulation and a parametric experiment demonstrate that the tested web-based environment can effectively shorten the design cycle of complex electromechanical systems.

  20. A new switching parameter varying optoelectronic delayed feedback model with computer simulation

    Science.gov (United States)

    Liu, Lingfeng; Miao, Suoxia; Cheng, Mengfan; Gao, Xiaojing

    2016-02-01

    In this paper, a new switching parameter varying optoelectronic delayed feedback model is proposed and analyzed by computer simulation. This model switches between two parameter varying optoelectronic delayed feedback models based on chaotic pseudorandom sequences. Complexity performance results show that this model has a high complexity compared to the original model. Furthermore, this model can conceal the time delay effectively against the auto-correlation function, delayed mutual information and permutation information analysis methods, and can extend the key space, which greatly improves its security.
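    A toy discrete-time analogue conveys the switching idea: an Ikeda-type delayed feedback map whose gain switches between two values according to a pseudorandom bit sequence. All parameters below are invented; the paper's continuous-time optoelectronic model is more elaborate.

```python
import numpy as np

rng = np.random.default_rng(7)
tau, steps = 20, 5000                    # feedback delay and run length
beta = (2.2, 3.8)                        # the two feedback gains (assumed)
bits = rng.integers(0, 2, size=steps)    # stand-in for the chaotic switching sequence

x = list(rng.random(tau))                # random initial history
for n in range(steps):
    x.append(beta[bits[n]] * np.sin(x[-tau] + 0.25) ** 2)
print(f"std of trajectory tail: {np.std(x[-1000:]):.3f}")
```

    Because the effective parameter keeps changing, delay-identification statistics such as the autocorrelation peak at lag tau are smeared out, which is the concealment effect the paper measures.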

  1. Modelling of the diffusion of carbon dioxide in polyimide matrices by computer simulation

    OpenAIRE

    1992-01-01

    Computer-aided molecular modelling is used to visualize the motion of CO2 gas molecules inside a polyimide polymer matrix. The polymers simulated are two 6FDA-based polyimides, 6FDA-4PDA and 6FDA-44ODA. These polymers have also been synthesized in our laboratory, and thus the simulated properties could be compared directly with "real-world" data. The simulation experiments were performed using the GROMOS package. The polymer boxes were created using the soft-core method, with short (11 ...

  2. Development of a Computational Simulation Model for Conflict Management in Team Building

    Directory of Open Access Journals (Sweden)

    W. M. Wang

    2011-05-01

    Full Text Available Conflict management is one of the most important issues in leveraging organizational competitiveness. However, the theories and models that traditional social scientists have built in this area, expressed mostly in words and diagrams, are insufficient. Social science research based on computational modeling and simulation is beginning to augment traditional theory building. Simulation provides a method for people to try their actions out in a way that is cost-effective, fast, appropriate, flexible, and ethical. In this paper, a computational simulation model for conflict management in team building is presented. The model is designed and used to explore the individual performances related to the combination of individuals who have a range of conflict-handling styles, under various types of resources and policies. The model is developed based on the agent-based modeling method. Each of the agents has one of the five conflict-handling styles: accommodation, compromise, competition, contingency, and learning. There are three types of scenarios: normal, convex, and concave. There are two types of policies: no policy, and a reward and punishment policy. Results from running the model are also presented. The simulation has led us to derive two implications concerning conflict management. First, a concave type of resource promotes competition, while a convex type of resource promotes compromise and collaboration. Second, the performance ranking of different styles can be influenced by introducing different policies. On the other hand, it is possible to promote a certain style by introducing different policies.

  3. Simulation step size analysis of a whole-cell computational model of bacteria

    Science.gov (United States)

    Abreu, Raphael; Castro, Maria Clicia S.; Silva, Fabrício Alves B.

    2016-12-01

    Understanding how complex phenotypes arise from individual molecules and their interactions is a major challenge in biology and, to meet this challenge, computational approaches are increasingly employed. As an example, a recent paper [1] proposed a whole-cell model of Mycoplasma genitalium including all cell components and their interactions. Twenty-eight modules representing several cell functions were modeled independently, and then integrated into a single computational model. One assumption considered in the whole-cell model of M. genitalium is that all 28 modules can be modeled independently given the 1 second step size used in simulations. This is a major assumption, since it simplifies the modeling of several cell functions and makes the modeling of the system as a whole feasible. In this paper we investigate the dependency of experimental results on that assumption. We have simulated the M. genitalium cell cycle using several simulation time step sizes and compared the results to the ones obtained with a 1 second simulation time step.
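    The sensitivity being probed is the familiar dependence of a discretized model on its step size. As a self-contained illustration (not the whole-cell model itself), forward Euler integration of dy/dt = −y shows how the error at a fixed horizon grows with the step:

```python
import math

exact = math.exp(-10)                   # exact solution of dy/dt = -y at t = 10
for dt in (1.0, 0.1, 0.01):
    y, t = 1.0, 0.0
    while t < 10 - 1e-9:
        y += dt * (-y)                  # forward Euler update
        t += dt
    print(f"dt={dt:>5}: y(10)={y:.6f}, relative error={(y - exact) / exact:+.2%}")
```

    For a modular whole-cell model the analogous question is whether the submodules, integrated independently over one step, still reproduce the behavior of the fully coupled system as the step grows.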

  4. Simulation of Mental Disorders: II. Computer Models, Purposes and Future Directions.

    Science.gov (United States)

    Gold, Azgad; Dudai, Yadin

    2016-01-01

    The complexity of the human brain and the difficulties in identifying and dissecting the biological, social and contextual underpinnings of mental functions confound the study of the etiology and pathophysiology of mental disorders. Simulating mental disorders in animal models or in computer programs may contribute to the understanding of such disorders. In the companion paper (30), we discussed selected concepts and pragmatics pertaining to mental illness simulation in general, and then focused on issues pertaining to animal models of mental disease. In this paper, we focus on selected aspects of the merits and limitations of the use of large scale computer simulation in investigating mental disorders. We argue that at the current state of knowledge, the biological-phenomenological gap in understanding mental disorders markedly limits the ability to generate high-fidelity computational models of mental illness. We conclude that similarly to the animal model approach, brain simulation focusing on limited realistic objectives, such as mimicking the emergence of selected distinct attributes of specific mental symptoms in a virtual brain or parts thereof, may serve as a useful tool in exploring mental disorders.

  5. The generic simulation cell method for developing extensible, efficient and readable parallel computational models

    Science.gov (United States)

    Honkonen, I.

    2014-07-01

    I present a method for developing extensible and modular computational models without sacrificing serial or parallel performance or source code readability. By using a generic simulation cell method I show that it is possible to combine several distinct computational models to run in the same computational grid without requiring any modification of existing code. This is an advantage for the development and testing of computational modeling software as each submodel can be developed and tested independently and subsequently used without modification in a more complex coupled program. Support for parallel programming is also provided by allowing users to select which simulation variables to transfer between processes via a Message Passing Interface library. This allows the communication strategy of a program to be formalized by explicitly stating which variables must be transferred between processes for the correct functionality of each submodel and the entire program. The generic simulation cell class presented here requires a C++ compiler that supports variadic templates which were standardized in 2011 (C++11). The code is available at: https://github.com/nasailja/gensimcell for everyone to use, study, modify and redistribute; those that do are kindly requested to cite this work.

  7. Computer simulation of quantum effects in Tavis-Cummings model and its applications

    Science.gov (United States)

    Ozhigov, Yuri I.; Skovoroda, Nikita A.; Ladunov, Vitalii Y.

    2016-12-01

    We describe computer methods for simulating Tavis-Cummings-based quantum models and apply those methods to specific tasks: conductivity measurements of atomic excitations in short chains of optical cavities with two-level atoms, the C-Sign optical model, and dark states. For the conductivity measurements, we reproduce the dephasing-assisted transport and quantum bottleneck effects, show their relation, and study the "which way?" problem. For the C-Sign optical model, we find optimal parameters of the system that minimize the error. For dark states, we study their collapse due to dephasing noise.

  8. Computable models

    CERN Document Server

    Turner, Raymond

    2009-01-01

    Computational models can be found everywhere in present-day science and engineering. Raymond Turner provides a logical framework and foundation for the specification and design of specification languages, and uses this framework to introduce and study computable models. In doing so he presents the first systematic attempt to provide computational models with a logical foundation. Computable models have wide-ranging applications, from programming language semantics and specification languages through to knowledge representation languages and formalisms for natural language semantics. They are al

  9. Computational models for simulations of lithium-ion battery cells under constrained compression tests

    Science.gov (United States)

    Ali, Mohammed Yusuf; Lai, Wei-Jen; Pan, Jwo

    2013-11-01

    In this paper, computational models are developed for simulations of representative volume element (RVE) specimens of lithium-ion battery cells under in-plane constrained compression tests. For cell components in the finite element analyses, the effective compressive moduli are obtained from in-plane constrained compressive tests, the Poisson's ratios are based on the rule of mixture, and the stress-plastic strain curves are obtained from the tensile tests and the rule of mixture. The Gurson material model is adopted to account for the effect of porosity in separator and electrode sheets. The computational results show that the models can be used to examine the micro buckling of the component sheets, the macro buckling of the cell RVE specimens, and the formation of the kinks and shear bands observed in experiments, and to simulate the load-displacement curves of the cell RVE specimens. The initial micro buckling mode of the cover sheets in general agrees with that of an approximate elastic buckling solution. Based on the computational models, the effects of friction on the deformation pattern and void compaction are identified. Finally, the effects of the initial clearance and biaxial compression on the deformation patterns of the cell RVE specimens are demonstrated.

  10. Computational Investigations on Polymerase Actions in Gene Transcription and Replication Combining Physical Modeling and Atomistic Simulations

    CERN Document Server

    Yu, Jin

    2015-01-01

    Polymerases are protein enzymes that move along nucleic acid chains and catalyze template-based polymerization reactions during gene transcription and replication. The polymerases also substantially improve transcription or replication fidelity through the non-equilibrium enzymatic cycles. We briefly review computational efforts that have been made toward understanding mechano-chemical coupling and fidelity control mechanisms of the polymerase elongation. The polymerases are regarded as molecular information motors during the elongation process. It requires a full spectrum of computational approaches from multiple time and length scales to understand the full polymerase functional cycle. We keep away from quantum mechanics based approaches to the polymerase catalysis due to abundant former surveys, while addressing only the statistical physics modeling approach and the all-atom molecular dynamics simulation approach. We organize this review around our own modeling and simulation practices on a single-subunit T7 RNA poly...

  11. The Nuclear Energy Advanced Modeling and Simulation Enabling Computational Technologies FY09 Report

    Energy Technology Data Exchange (ETDEWEB)

    Diachin, L F; Garaizar, F X; Henson, V E; Pope, G

    2009-10-12

    In this document we report on the status of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Enabling Computational Technologies (ECT) effort. In particular, we provide the context for ECT in the broader NEAMS program and describe the three pillars of the ECT effort, namely, (1) tools and libraries, (2) software quality assurance, and (3) computational facility (computers, storage, etc.) needs. We report on our FY09 deliverables to determine the needs of the integrated performance and safety codes (IPSCs) in these three areas and lay out the general plan for software quality assurance to meet the requirements of DOE and the DOE Advanced Fuel Cycle Initiative (AFCI). We conclude with a brief description of our interactions with the Idaho National Laboratory computer center to determine what is needed to expand its role as a NEAMS user facility.

  12. Performance of hybrid programming models for multiscale cardiac simulations: preparing for petascale computation.

    Science.gov (United States)

    Pope, Bernard J; Fitch, Blake G; Pitman, Michael C; Rice, John J; Reumann, Matthias

    2011-10-01

    Future multiscale and multiphysics models that support research into human disease, translational medical science, and treatment can utilize the power of high-performance computing (HPC) systems. We anticipate that computationally efficient multiscale models will require the use of sophisticated hybrid programming models, mixing distributed message-passing processes [e.g., the message-passing interface (MPI)] with multithreading (e.g., OpenMP, Pthreads). The objective of this study is to compare the performance of such hybrid programming models when applied to the simulation of a realistic physiological multiscale model of the heart. Our results show that the hybrid models perform favorably when compared to an implementation using only the MPI and, furthermore, that OpenMP in combination with the MPI provides a satisfactory compromise between performance and code complexity. Having the ability to use threads within MPI processes enables the sophisticated use of all processor cores for both computation and communication phases. Considering that HPC systems in 2012 will have two orders of magnitude more cores than what was used in this study, we believe that faster than real-time multiscale cardiac simulations can be achieved on these systems.
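
    As an illustration of the hybrid pattern the study evaluates, here is a minimal MPI+OpenMP skeleton, not the cardiac code itself: each rank updates its local cells in an OpenMP-parallel computation phase and then communicates through MPI from the master thread.

    ```cpp
    // Minimal hybrid MPI+OpenMP sketch (illustrative only).
    // Compile e.g. with: mpicxx -fopenmp hybrid.cpp
    #include <mpi.h>
    #include <vector>
    #include <cstdio>

    int main(int argc, char** argv) {
        int provided = 0;
        // Request a threading level where only the master thread calls MPI.
        MPI_Init_thread(&argc, &argv, MPI_THREAD_FUNNELED, &provided);

        int rank = 0, size = 1;
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        std::vector<double> cells(100000, 1.0);  // this rank's slab of the model

        for (int step = 0; step < 10; ++step) {
            // Computation phase: all cores of this rank update the local slab.
            #pragma omp parallel for
            for (long i = 0; i < (long)cells.size(); ++i)
                cells[i] += 0.01 * cells[i];   // placeholder for the cell model

            // Communication phase (master thread only, as FUNNELED allows).
            double local = cells[0], sum = 0.0;
            MPI_Allreduce(&local, &sum, 1, MPI_DOUBLE, MPI_SUM, MPI_COMM_WORLD);
        }

        if (rank == 0) std::printf("done on %d ranks\n", size);
        MPI_Finalize();
        return 0;
    }
    ```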

  13. Towards the global complexity, topology and chaos in modelling, simulation and computation

    CERN Document Server

    Meyer, D A

    1997-01-01

    Topological effects produce chaos in multiagent simulation and distributed computation. We explain this result by developing three themes concerning complex systems in the natural and social sciences: (i) Pragmatically, a system is complex when it is represented efficiently by different models at different scales. (ii) Nontrivial topology, identifiable as we scale towards the global, induces complexity in this sense. (iii) Complex systems with nontrivial topology are typically chaotic.

  14. Simulation of windblown dust transport from a mine tailings impoundment using a computational fluid dynamics model

    Science.gov (United States)

    Stovern, Michael; Felix, Omar; Csavina, Janae; Rine, Kyle P.; Russell, MacKenzie R.; Jones, Robert M.; King, Matt; Betterton, Eric A.; Sáez, A. Eduardo

    2014-09-01

    Mining operations are potential sources of airborne particulate metal and metalloid contaminants through both direct smelter emissions and wind erosion of mine tailings. The warmer, drier conditions predicted for the Southwestern US by climate models may make contaminated atmospheric dust and aerosols increasingly important, due to potential deleterious effects on human health and ecology. Dust emissions and dispersion of dust and aerosol from the Iron King Mine tailings in Dewey-Humboldt, Arizona, a Superfund site, are currently being investigated through in situ field measurements and computational fluid dynamics modeling. These tailings are heavily contaminated with lead and arsenic. Using a computational fluid dynamics model, we model dust transport from the mine tailings to the surrounding region. The model includes gaseous plume dispersion to simulate the transport of the fine aerosols, while individual particle transport is used to track the trajectories of larger particles and to monitor their deposition locations. In order to improve the accuracy of the dust transport simulations, both regional topographical features and local weather patterns have been incorporated into the model simulations. Results show that local topography and wind velocity profiles are the major factors that control deposition.

  15. Simulation of windblown dust transport from a mine tailings impoundment using a computational fluid dynamics model

    Science.gov (United States)

    Stovern, Michael; Felix, Omar; Csavina, Janae; Rine, Kyle P.; Russell, MacKenzie R.; Jones, Robert M.; King, Matt; Betterton, Eric A.; Sáez, A. Eduardo

    2014-01-01

    Mining operations are potential sources of airborne particulate metal and metalloid contaminants through both direct smelter emissions and wind erosion of mine tailings. The warmer, drier conditions predicted for the Southwestern US by climate models may make contaminated atmospheric dust and aerosols increasingly important, due to potential deleterious effects on human health and ecology. Dust emissions and dispersion of dust and aerosol from the Iron King Mine tailings in Dewey-Humboldt, Arizona, a Superfund site, are currently being investigated through in situ field measurements and computational fluid dynamics modeling. These tailings are heavily contaminated with lead and arsenic. Using a computational fluid dynamics model, we model dust transport from the mine tailings to the surrounding region. The model includes gaseous plume dispersion to simulate the transport of the fine aerosols, while individual particle transport is used to track the trajectories of larger particles and to monitor their deposition locations. In order to improve the accuracy of the dust transport simulations, both regional topographical features and local weather patterns have been incorporated into the model simulations. Results show that local topography and wind velocity profiles are the major factors that control deposition. PMID:25621085
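
    The Lagrangian particle-tracking component of such a model can be sketched as follows; the logarithmic wind profile, Stokes settling law, flat terrain, and all parameter values are assumptions for illustration, not the study's configuration.

    ```cpp
    // Minimal particle-deposition sketch: advect a single dust particle in a
    // log-law wind profile while it settles at its Stokes velocity, and record
    // where it lands. All parameters are illustrative assumptions.
    #include <cmath>
    #include <cstdio>

    int main() {
        const double u_star = 0.4, kappa = 0.41, z0 = 0.01;  // friction velocity [m/s], von Karman const., roughness [m]
        const double d = 50e-6, rho_p = 2650.0, mu = 1.8e-5, g = 9.81;  // 50-um mineral particle
        const double w_s = rho_p * g * d * d / (18.0 * mu);  // Stokes settling velocity [m/s]

        double x = 0.0, z = 10.0;   // released 10 m above the tailings surface
        const double dt = 0.01;     // time step [s]
        while (z > z0) {
            double u = (u_star / kappa) * std::log(z / z0);  // wind speed at height z
            x += u * dt;     // horizontal advection
            z -= w_s * dt;   // gravitational settling
        }
        std::printf("deposited at x = %.1f m downwind (w_s = %.3f m/s)\n", x, w_s);
        return 0;
    }
    ```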

  16. [Economic benefits of overlapping induction: investigation using a computer simulation model].

    Science.gov (United States)

    Hunziker, S; Baumgart, A; Denz, C; Schüpfer, G

    2009-06-01

    The aim of this study was to investigate the potential economic benefit of overlapping anaesthesia induction, given that all patient diagnosis-related groups (AP DRG) are used as the model for hospital reimbursement. A computer simulation model was used for this purpose. Due to the resource-intensive production process, the operating room (OR) environment is the most expensive part of the supply chain for surgical disciplines. The economic benefit of a parallel production process (additional personnel, adaptation of the process) was assessed in comparison with a conventional serial layout. A computer-based simulation method was used with commercially available simulation software. Assumptions for revenues were made on the basis of reimbursement under AP DRG. Based on a system analysis, a model for the computer simulation was designed through a step-by-step abstraction process. In the model, two operating rooms were used for parallel processing and two operating rooms for a serial production process. Six different types of surgical procedures based on historical case durations were investigated. The contribution margin was calculated as the increased revenues minus the cost of the additional anaesthesia personnel. Over a period of 5 weeks, 41 additional surgical cases were performed under the assumption of a surgery duration of 89±4 min (mean±SD). The additional contribution margin was CHF 104,588. In the case of longer surgical procedures of 103±25 min duration (mean±SD), an increase of 36 cases was possible in the same time period and the contribution margin increased by CHF 384,836. When surgical cases with a mean procedural time of 243±55 min were simulated, 15 additional cases were possible. Therefore, the additional contribution margin was CHF 321,278. Although costs increased in this simulation when a serial production process was changed to a parallel system layout due to more personnel, an increase of the contribution margin was possible, especially with
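
    A toy Monte Carlo of the serial-versus-overlapping comparison is sketched below; all durations, turnover times, and the per-case margin are illustrative placeholders, not the study's cost data.

    ```cpp
    // Toy Monte Carlo comparison of serial vs. overlapping OR workflows.
    // All parameters (durations, turnover, margin) are illustrative assumptions.
    #include <cstdio>
    #include <random>

    // Count how many cases fit into one OR day given a non-overlapped gap
    // (induction plus turnover) between cases.
    int cases_per_day(double minutes, double mean, double sd, double gap,
                      std::mt19937& rng) {
        std::normal_distribution<double> surgery(mean, sd);
        double t = 0.0;
        int n = 0;
        while (true) {
            double next = surgery(rng) + gap;
            if (t + next > minutes) break;
            t += next;
            ++n;
        }
        return n;
    }

    int main() {
        std::mt19937 rng(42);
        const double day = 480.0, mean = 89.0, sd = 4.0;
        long serial = 0, parallel = 0;
        for (int d = 0; d < 25; ++d) {   // a 5-week block, 5 OR days per week
            serial   += cases_per_day(day, mean, sd, 45.0, rng);  // induction blocks the OR
            parallel += cases_per_day(day, mean, sd, 10.0, rng);  // overlapping induction
        }
        const double margin = 2500.0;    // assumed contribution margin per case, CHF
        std::printf("extra cases: %ld, extra margin: CHF %.0f\n",
                    parallel - serial, (parallel - serial) * margin);
        return 0;
    }
    ```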

  17. Benchmarking computational fluid dynamics models of lava flow simulation for hazard assessment, forecasting, and risk management

    Science.gov (United States)

    Dietterich, Hannah; Lev, Einat; Chen, Jiangzhi; Richardson, Jacob A.; Cashman, Katharine V.

    2017-01-01

    Numerical simulations of lava flow emplacement are valuable for assessing lava flow hazards, forecasting active flows, designing flow mitigation measures, interpreting past eruptions, and understanding the controls on lava flow behavior. Existing lava flow models vary in simplifying assumptions, physics, dimensionality, and the degree to which they have been validated against analytical solutions, experiments, and natural observations. In order to assess existing models and guide the development of new codes, we conduct a benchmarking study of computational fluid dynamics (CFD) models for lava flow emplacement, including VolcFlow, OpenFOAM, FLOW-3D, COMSOL, and MOLASSES. We model viscous, cooling, and solidifying flows over horizontal planes, sloping surfaces, and into topographic obstacles. We compare model results to physical observations made during well-controlled analogue and molten basalt experiments, and to analytical theory when available. Overall, the models accurately simulate viscous flow with some variability in flow thickness where flows intersect obstacles. OpenFOAM, COMSOL, and FLOW-3D can each reproduce experimental measurements of cooling viscous flows, and OpenFOAM and FLOW-3D simulations with temperature-dependent rheology match results from molten basalt experiments. We assess the goodness-of-fit of the simulation results and the computational cost. Our results guide the selection of numerical simulation codes for different applications, including inferring emplacement conditions of past lava flows, modeling the temporal evolution of ongoing flows during eruption, and probabilistic assessment of lava flow hazard prior to eruption. Finally, we outline potential experiments and desired key observational data from future flows that would extend existing benchmarking data sets.

  18. Hierarchical Acceleration of Multilevel Monte Carlo Methods for Computationally Expensive Simulations in Reservoir Modeling

    Science.gov (United States)

    Zhang, G.; Lu, D.; Webster, C.

    2014-12-01

    The rational management of oil and gas reservoirs requires an understanding of their response to existing and planned schemes of exploitation and operation. Such understanding requires analyzing and quantifying the influence of subsurface uncertainties on predictions of oil and gas production. As the subsurface properties are typically heterogeneous, resulting in a large number of model parameters, the dimension-independent Monte Carlo (MC) method is usually used for uncertainty quantification (UQ). Recently, multilevel Monte Carlo (MLMC) methods were proposed as a variance reduction technique to improve the computational efficiency of MC methods in UQ. In this effort, we propose a new acceleration approach for the MLMC method that further reduces the total computational cost by exploiting model hierarchies. Specifically, for each model simulation on a newly added level of MLMC, we take advantage of an approximation of the model outputs constructed from simulations on previous levels to provide better initial states for the new simulations, which improves efficiency by, e.g., reducing the number of iterations in linear system solving or the number of needed time steps. This is achieved by using mesh-free interpolation methods, such as Shepard interpolation and radial basis approximation. Our approach is applied to a highly heterogeneous reservoir model from the tenth SPE project. The results indicate that the accelerated MLMC can achieve the same accuracy as standard MLMC at a significantly reduced cost.
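
    The underlying MLMC telescoping estimator can be sketched as follows, with a stand-in "solver" in place of a reservoir simulation; in the proposed acceleration, each fine-level run would additionally be warm-started from an interpolant of coarser-level outputs.

    ```cpp
    // Minimal MLMC sketch: estimate E[P_L] via the telescoping sum
    // E[P_0] + sum_l E[P_l - P_{l-1}], with fewer samples on finer levels.
    // solve() is a placeholder whose discretization bias halves per level.
    #include <cmath>
    #include <cstdio>
    #include <random>

    double solve(int level, double theta) {
        // Stand-in model output plus a level-dependent discretization bias.
        return std::exp(theta) + std::pow(2.0, -level) * theta;
    }

    int main() {
        std::mt19937 rng(1);
        std::normal_distribution<double> prior(0.0, 0.25);  // uncertain input
        const int L = 4;
        double estimate = 0.0;
        for (int l = 0; l <= L; ++l) {
            const int n = 4096 >> (2 * l);  // 4096, 1024, 256, 64, 16 samples
            double sum = 0.0;
            for (int i = 0; i < n; ++i) {
                const double theta = prior(rng);
                // Level-l correction term (just P_0 on the coarsest level).
                sum += solve(l, theta) - (l > 0 ? solve(l - 1, theta) : 0.0);
            }
            estimate += sum / n;
        }
        std::printf("MLMC estimate: %.4f\n", estimate);
        return 0;
    }
    ```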

  19. HIGH-FIDELITY SIMULATION-DRIVEN MODEL DEVELOPMENT FOR COARSE-GRAINED COMPUTATIONAL FLUID DYNAMICS

    Energy Technology Data Exchange (ETDEWEB)

    Hanna, Botros N.; Dinh, Nam T.; Bolotnov, Igor A.

    2016-06-01

    Nuclear reactor safety analysis requires identifying various credible accident scenarios and determining their consequences. For full-scale nuclear power plant system behavior, it is impossible to obtain sufficient experimental data for a broad range of risk-significant accident scenarios. In single-phase flow convective problems, Direct Numerical Simulation (DNS) and Large Eddy Simulation (LES) can provide high-fidelity results when physical data are unavailable. However, these methods are computationally expensive and cannot be afforded for simulation of long transient scenarios in nuclear accidents, despite extraordinary advances in high performance scientific computing over the past decades. The major issue is the inability to parallelize the transient computation, which makes the number of time steps required by high-fidelity methods unaffordable for long transients. In this work, we propose to apply a high-fidelity simulation-driven approach to model sub-grid scale (SGS) effects in Coarse Grained Computational Fluid Dynamics (CG-CFD). This approach aims to develop a statistical surrogate model instead of a deterministic SGS model. We chose to start with a turbulent natural convection case with volumetric heating in a horizontal fluid layer with a rigid, insulated lower boundary and an isothermal (cold) upper boundary. This scenario of unstable stratification is relevant to turbulent natural convection in a molten corium pool during a severe nuclear reactor accident, as well as to containment mixing and passive cooling. The presented approach demonstrates how to create a correction to the CG-CFD solution by modifying the energy balance equation. A global correction to the temperature equation achieves a significant improvement in the prediction of the steady-state temperature distribution through the fluid layer.

  20. Benchmarking Computational Fluid Dynamics Models for Application to Lava Flow Simulations and Hazard Assessment

    Science.gov (United States)

    Dietterich, H. R.; Lev, E.; Chen, J.; Cashman, K. V.; Honor, C.

    2015-12-01

    Recent eruptions in Hawai'i, Iceland, and Cape Verde highlight the need for improved lava flow models for forecasting and hazard assessment. Existing models used for lava flow simulation range in assumptions, complexity, and the degree to which they have been validated against analytical solutions, experiments, and natural observations. In order to assess the capabilities of existing models and test the development of new codes, we conduct a benchmarking study of computational fluid dynamics models for lava flows, including VolcFlow, OpenFOAM, Flow3D, and COMSOL. Using new benchmark scenarios defined in Cordonnier et al. (2015) as a guide, we model Newtonian, Herschel-Bulkley and cooling flows over inclined planes, obstacles, and digital elevation models with a wide range of source conditions. Results are compared to analytical theory, analogue and molten basalt experiments, and measurements from natural lava flows. Our study highlights the strengths and weaknesses of each code, including accuracy and computational costs, and provides insights regarding code selection. We apply the best-fit codes to simulate the lava flows in Harrat Rahat, a predominantly mafic volcanic field in Saudi Arabia. Input parameters are assembled from rheology and volume measurements of past flows using geochemistry, crystallinity, and present-day lidar and photogrammetric digital elevation models. With these data, we use our verified models to reconstruct historic and prehistoric events, in order to assess the hazards posed by lava flows for Harrat Rahat.

  1. Computational and Simulation Modeling of Political Attitudes: The 'Tiger' Area of Political Culture Research

    Directory of Open Access Journals (Sweden)

    Voinea, Camelia Florela

    2016-01-01

    Full Text Available In its almost century-long history, political attitudes modeling research has accumulated a critical mass of theory and method. Its characteristics and particularities have often suggested that the political attitude approach to political persuasion modeling reveals a theoretical autonomy of concept strong enough to entitle it to become a separate discipline of research. Though this did not actually happen, political attitudes modeling research has remained the most challenging area – the "tiger" – of political culture modeling research. This paper reviews the research literature on the conceptual, computational and simulation modeling of political attitudes developed from the beginning of the 20th century until the present. Several computational and simulation modeling paradigms have provided support to political attitudes modeling research. These paradigms, and the shifts from one to another, are briefly presented for a period of almost one century. The dominant paradigmatic views are those inspired by Newtonian mechanics, and those based on the principle of methodological individualism and the emergence of macro phenomena from individual interactions at the micro level of a society. This period is divided into eight ages covering the history of ideas in a wide range of political domains, going from political attitudes to polity modeling. Internal and external pressures for paradigmatic change are briefly explained.

  2. Trends in Social Science: The Impact of Computational and Simulative Models

    Science.gov (United States)

    Conte, Rosaria; Paolucci, Mario; Cecconi, Federico

    This paper discusses current progress in the computational social sciences. Specifically, it examines the following questions: Are the computational social sciences exhibiting positive or negative developments? What are the roles of agent-based models and simulation (ABM), network analysis, and other "computational" methods within this dynamic? (Conte, The necessity of intelligent agents in social simulation, Advances in Complex Systems, 3(01n04), 19-38, 2000; Conte 2010; Macy, Annual Review of Sociology, 143-166, 2002). Are there objective indicators of scientific growth that can be applied to different scientific areas, allowing for comparison among them? In this paper, some answers to these questions are presented and discussed. In particular, comparisons among different disciplines in the social and computational sciences are shown, taking into account their respective growth trends in the number of publication citations over the last few decades (culled from Google Scholar). After a short discussion of the methodology adopted, results of keyword-based queries are presented, unveiling some unexpected local impacts of simulation on the takeoff of traditionally poorly productive disciplines.

  3. Multiscale models and stochastic simulation methods for computing rare but key binding events in cell biology

    Energy Technology Data Exchange (ETDEWEB)

    Guerrier, C. [Applied Mathematics and Computational Biology, IBENS, Ecole Normale Supérieure, 46 rue d' Ulm, 75005 Paris (France); Holcman, D., E-mail: david.holcman@ens.fr [Applied Mathematics and Computational Biology, IBENS, Ecole Normale Supérieure, 46 rue d' Ulm, 75005 Paris (France); Mathematical Institute, Oxford OX2 6GG, Newton Institute (United Kingdom)

    2017-07-01

    The main difficulty in simulating diffusion processes at a molecular level in cell microdomains is due to the multiple scales involved, from nano- to micrometers. Few to many particles have to be simulated and simultaneously tracked while they are exploring a large portion of the space in search of small binding targets, such as buffers or active sites. Bridging the small and large spatial scales is achieved through rare events representing Brownian particles finding small targets, characterized by a long-time distribution. These rare events are the bottleneck of numerical simulations. A naive stochastic simulation requires running many Brownian particles together, which is computationally greedy and inefficient. Solving the associated partial differential equations is also difficult due to the time-dependent boundary conditions, narrow passages and mixed boundary conditions at small windows. We present here two reduced modeling approaches for fast computation of diffusing fluxes in microdomains. The first approach is based on Markov mass-action law equations coupled to a Markov chain. The second is a Gillespie method based on the narrow escape theory for coarse-graining the geometry of the domain into Poissonian rates. The main application concerns diffusion in cell biology, where we compute as an example the distribution of arrival times of calcium ions to small hidden targets that trigger vesicular release.

  4. Multiscale models and stochastic simulation methods for computing rare but key binding events in cell biology

    Science.gov (United States)

    Guerrier, C.; Holcman, D.

    2017-07-01

    The main difficulty in simulating diffusion processes at a molecular level in cell microdomains is due to the multiple scales involved, from nano- to micrometers. Few to many particles have to be simulated and simultaneously tracked while they are exploring a large portion of the space in search of small binding targets, such as buffers or active sites. Bridging the small and large spatial scales is achieved through rare events representing Brownian particles finding small targets, characterized by a long-time distribution. These rare events are the bottleneck of numerical simulations. A naive stochastic simulation requires running many Brownian particles together, which is computationally greedy and inefficient. Solving the associated partial differential equations is also difficult due to the time-dependent boundary conditions, narrow passages and mixed boundary conditions at small windows. We present here two reduced modeling approaches for fast computation of diffusing fluxes in microdomains. The first approach is based on Markov mass-action law equations coupled to a Markov chain. The second is a Gillespie method based on the narrow escape theory for coarse-graining the geometry of the domain into Poissonian rates. The main application concerns diffusion in cell biology, where we compute as an example the distribution of arrival times of calcium ions to small hidden targets that trigger vesicular release.
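
    A minimal sketch of the second (Gillespie-type) approach: diffusion to a small target is coarse-grained into one Poissonian rate per free ion, with the narrow-escape mean arrival time entering as a given constant. The rate value and ion count below are illustrative assumptions.

    ```cpp
    // Minimal Gillespie (SSA) sketch of the coarse-grained picture: diffusion
    // to a small target is replaced by a per-ion Poissonian rate 1/tau, where
    // tau is the narrow-escape mean arrival time (taken as a given constant).
    #include <cmath>
    #include <cstdio>
    #include <random>

    int main() {
        std::mt19937 rng(7);
        std::uniform_real_distribution<double> U(0.0, 1.0);

        int free_ions = 50;          // calcium ions diffusing in the microdomain
        const double inv_tau = 0.2;  // per-ion arrival rate at the target [1/s]
        double t = 0.0;

        while (free_ions > 0) {
            double a = free_ions * inv_tau;   // total propensity
            t += -std::log(U(rng)) / a;       // exponential waiting time
            --free_ions;                      // one ion reaches the target
            std::printf("arrival %2d at t = %.3f s\n", 50 - free_ions, t);
        }
        return 0;
    }
    ```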

  5. Building a three-dimensional model of the upper gastrointestinal tract for computer simulations of swallowing.

    Science.gov (United States)

    Gastelum, Alfonso; Mata, Lucely; Brito-de-la-Fuente, Edmundo; Delmas, Patrice; Vicente, William; Salinas-Vázquez, Martín; Ascanio, Gabriel; Marquez, Jorge

    2016-03-01

    We aimed to provide realistic three-dimensional (3D) models to be used in numerical simulations of peristaltic flow in patients exhibiting difficulty in swallowing, also known as dysphagia. To this end, a 3D model of the upper gastrointestinal tract was built from the color cryosection images of the Visible Human Project dataset. Regional color heterogeneities were corrected by centering local histograms of the image difference between slices. A voxel-based model was generated by stacking contours from the color images. A triangle mesh was built, smoothed and simplified. Visualization tools were developed for browsing the model at different stages and for virtual endoscopy navigation. As a result, a computer model of the esophagus and the stomach was obtained, mainly for modeling swallowing disorders. A central-axis curve was also obtained for virtual navigation and to replicate conditions relevant to swallowing disorders modeling. We show renderings of the model and discuss its use for simulating swallowing as a function of bolus rheological properties. The information obtained from simulation studies with our model could be useful for physicians in selecting the correct nutritional emulsions for patients with dysphagia.

  6. A pedagogical walkthrough of computational modeling and simulation of Wnt signaling pathway using static causal models in MATLAB

    OpenAIRE

    Sinha, Shriprakash

    2016-01-01

    Simulation studies in systems biology involving computational experiments on Wnt signaling pathways abound in the literature but often lack a pedagogical perspective that might ease the understanding of beginner students and researchers in transition who intend to work on modeling of the pathway. This paucity might happen due to restrictive business policies which enforce an unwanted embargo on the sharing of important scientific knowledge. A tutorial introduction to computational mo...

  8. Computational Modeling and Simulation of Attitude Change. Part 1, Connectionist Models and Simulations of Cognitive Dissonance: an Overview

    OpenAIRE

    Voinea, Camelia Florela

    2013-01-01

    Cognitive Dissonance Theory is considered part of the cognitive consistency theories in Social Psychology. These theories comprise a class of conceptual models which describe attitude change as a cognitive consistency-seeking issue. As these conceptual models required more complex operational expressions, algebraic, mathematical and, lately, computational modeling approaches to cognitive consistency have been developed. Part 1 of this work provides an overview of the connectionist modeling of cognit...

  10. Investigation of different modeling approaches for computational fluid dynamics simulation of high-pressure rocket combustors

    Science.gov (United States)

    Ivancic, B.; Riedmann, H.; Frey, M.; Knab, O.; Karl, S.; Hannemann, K.

    2016-07-01

    The paper summarizes technical results and first highlights of the cooperation between DLR and Airbus Defence and Space (DS) within the work package "CFD Modeling of Combustion Chamber Processes" conducted in the frame of the Propulsion 2020 Project. Within the addressed work package, DLR Göttingen and Airbus DS Ottobrunn have identified several test cases where adequate test data are available and which can be used for proper validation of the computational fluid dynamics (CFD) tools. In this paper, the first test case, the Penn State chamber (RCM1), is discussed. The simulation results from three different tools show that the test case can be computed properly with steady-state Reynolds-averaged Navier-Stokes (RANS) approaches. The achieved results reproduce the measured wall heat flux, an important validation parameter, very well, but also reveal some inconsistencies in the test data, which are addressed in this paper.

  11. Electromagnetic Computation and Visualization of Transmission Particle Model and Its Simulation Based on GPU

    Directory of Open Access Journals (Sweden)

    Yingnian Wu

    2014-01-01

    Full Text Available Electromagnetic calculation plays an important role in both military and civilian fields. Some methods and models proposed for the calculation of electromagnetic wave propagation over large ranges impose a heavy computational burden on the CPU and also require a huge amount of memory. Using the GPU to accelerate computation and visualization can reduce this burden. Based on the forward ray-tracing method, a transmission particle model (TPM) for calculating electromagnetic fields is presented that incorporates the particle method. The movement of a particle obeys the principles of electromagnetic wave propagation, so the particle distribution density in space reflects the electromagnetic field distribution. The algorithm, with particle transmission, movement, reflection, and diffraction, is described in detail. Since the particles in the TPM are completely independent, the model is very well suited to parallel computing on the GPU. A verification of the TPM with an electric dipole antenna as the transmission source is conducted to show that the particle movement itself represents the variation of electromagnetic field intensity caused by diffusion. Finally, simulation comparisons are made against the forward and backward ray-tracing methods; the results verify the effectiveness of the proposed method.

  12. Computational simulation methodologies for mechanobiological modelling: a cell-centred approach to neointima development in stents.

    Science.gov (United States)

    Boyle, C J; Lennon, A B; Early, M; Kelly, D J; Lally, C; Prendergast, P J

    2010-06-28

    The design of medical devices could be very much improved if robust tools were available for computational simulation of tissue response to the presence of the implant. Such tools require algorithms to simulate the response of tissues to mechanical and chemical stimuli. Available methodologies include those based on the principle of mechanical homeostasis, those which use continuum models to simulate biological constituents, and the cell-centred approach, which models cells as autonomous agents. In the latter approach, cell behaviour is governed by rules based on the state of the local environment around the cell; and informed by experiment. Tissue growth and differentiation requires simulating many of these cells together. In this paper, the methodology and applications of cell-centred techniques--with particular application to mechanobiology--are reviewed, and a cell-centred model of tissue formation in the lumen of an artery in response to the deployment of a stent is presented. The method is capable of capturing some of the most important aspects of restenosis, including nonlinear lesion growth with time. The approach taken in this paper provides a framework for simulating restenosis; the next step will be to couple it with more patient-specific geometries and quantitative parameter data.
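
    The cell-centred rule structure can be sketched as a minimal lattice automaton in which each cell agent divides into free neighbouring sites with some probability; the stent geometry, mechanical stimuli, and rule parameters of the actual model are omitted, and the numbers below are illustrative.

    ```cpp
    // Minimal cell-centred sketch: cells as autonomous agents on a lattice whose
    // behaviour (here, proliferation into free neighbours) follows a local rule.
    #include <cstdio>
    #include <random>
    #include <vector>

    int main() {
        std::mt19937 rng(5);
        std::uniform_real_distribution<double> U(0.0, 1.0);
        const int N = 40;
        std::vector<std::vector<int>> grid(N, std::vector<int>(N, 0));
        grid[N / 2][N / 2] = 1;            // one seeded cell in the lumen

        const double p_divide = 0.1;       // per-step division probability
        for (int step = 0; step < 200; ++step) {
            auto next = grid;              // synchronous update
            for (int i = 1; i < N - 1; ++i)
                for (int j = 1; j < N - 1; ++j)
                    if (grid[i][j] == 1 && U(rng) < p_divide) {
                        // Rule: place a daughter in one random free neighbour.
                        const int dx[4] = {1, -1, 0, 0}, dy[4] = {0, 0, 1, -1};
                        int d = (int)(U(rng) * 4);
                        if (next[i + dx[d]][j + dy[d]] == 0)
                            next[i + dx[d]][j + dy[d]] = 1;
                    }
            grid = next;
        }
        int count = 0;
        for (auto& row : grid) for (int v : row) count += v;
        std::printf("lesion size after 200 steps: %d cells\n", count);
        return 0;
    }
    ```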

  13. Computer Simulation and Modeling of CO2 Removal Systems for Exploration 2013-2014

    Science.gov (United States)

    Coker, R.; Knox, J.; Gomez, C.

    2015-01-01

    The Atmosphere Revitalization Recovery and Environmental Monitoring (ARREM) project was initiated in September of 2011 as part of the Advanced Exploration Systems (AES) program. Under the ARREM project and the follow-on Life Support Systems (LSS) project, testing of sub-scale and full-scale systems has been combined with multiphysics computer simulations for evaluation and optimization of subsystem approaches. In particular, this paper describes the testing and 1-D modeling of the combined water desiccant and carbon dioxide sorbent subsystems of the carbon dioxide removal assembly (CDRA). The goal is a full system predictive model of CDRA to guide system optimization and development.

  14. Simulation Modeling of Lakes in Undergraduate and Graduate Classrooms Increases Comprehension of Climate Change Concepts and Experience with Computational Tools

    Science.gov (United States)

    Carey, Cayelan C.; Gougis, Rebekka Darner

    2017-01-01

    Ecosystem modeling is a critically important tool for environmental scientists, yet is rarely taught in undergraduate and graduate classrooms. To address this gap, we developed a teaching module that exposes students to a suite of modeling skills and tools (including computer programming, numerical simulation modeling, and distributed computing)…

  16. Computer simulations of the restricted primitive model at very low temperature and density.

    Science.gov (United States)

    Valeriani, Chantal; Camp, Philip J; Zwanikken, Jos W; van Roij, René; Dijkstra, Marjolein

    2010-03-17

    The problem of successfully simulating ionic fluids at low temperature and low density states is well known in the simulation literature: using conventional methods, the system is not able to equilibrate rapidly due to the presence of strongly associated cation-anion pairs. In this paper we present a numerical method for speeding up computer simulations of the restricted primitive model (RPM) at low temperatures (around the critical temperature) and at very low densities (down to 10^-10 σ^-3, where σ is the ion diameter). Experimentally, this regime corresponds to typical concentrations of electrolytes in nonaqueous solvents. As far as we are aware, this is the first time that the RPM has been equilibrated at such extremely low concentrations. More generally, this method could be used to equilibrate other systems that form aggregates at low concentrations.

  17. Easy Java Simulation, an innovative tool for teacher as designers of gravity-physics computer models

    CERN Document Server

    Wee, Loo Kang; Lim, Ee-Peow

    2014-01-01

    This paper is on the customization of computer models using the Easy Java Simulation authoring toolkit for the Singapore syllabus, based on real astronomical data and supported by pedagogical features drawn from the research literature. These 4 new computer models serve to support the enactment of scientific work that is inquiry-centric and evidence-based, and more likely to promote enjoyment and inspire imagination in experiencing gravity-physics than traditional pen-and-paper problem solving. Pilot research suggests that student enactment of investigative learning, like scientists, is now possible, where gravity-physics comes alive. Download simulations https://dl.dropboxusercontent.com/u/44365627/lookangEJSworkspace/export/ejs_model_GField_and_Potential_1D_v8wee.jar https://dl.dropboxusercontent.com/u/44365627/lookangEJSworkspace/export/ejs_model_GFieldandPotential1Dv7EarthMoon.jar https://dl.dropboxusercontent.com/u/44365627/lookangEJSworkspace/export/ejs_model_KeplerSystem3rdLaw09.jar https://dl.dropboxusercont...

  18. Computational investigations on polymerase actions in gene transcription and replication: Combining physical modeling and atomistic simulations

    Science.gov (United States)

    Jin, Yu

    2016-01-01

    Polymerases are protein enzymes that move along nucleic acid chains and catalyze template-based polymerization reactions during gene transcription and replication. The polymerases also substantially improve transcription or replication fidelity through the non-equilibrium enzymatic cycles. We briefly review computational efforts that have been made toward understanding mechano-chemical coupling and fidelity control mechanisms of the polymerase elongation. The polymerases are regarded as molecular information motors during the elongation process. It requires a full spectrum of computational approaches from multiple time and length scales to understand the full polymerase functional cycle. We stay away from quantum mechanics based approaches to the polymerase catalysis due to abundant former surveys, while addressing statistical physics modeling approaches along with all-atom molecular dynamics simulation studies. We organize this review around our own modeling and simulation practices on a single subunit T7 RNA polymerase, and summarize commensurate studies on structurally similar DNA polymerases as well. For multi-subunit RNA polymerases that have been actively studied in recent years, we leave systematic reviews of the simulation achievements to the latest computational chemistry surveys, while covering only representative studies published very recently, including our own work modeling the structure-based elongation kinetics of yeast RNA polymerase II. In the end, we briefly go through physical modeling of the elongation pauses and backtracking activities of the multi-subunit RNAPs. We emphasize the fluctuation and control mechanisms of the polymerase actions, highlight the non-equilibrium nature of the operation system, and try to build some perspectives toward understanding the polymerase impacts from the single molecule level to a genome-wide scale. Project supported by the National Natural Science Foundation (Grant No. 11275022).

  19. Computer aided modeling and simulation of hydroforming on tubular engineering products

    Directory of Open Access Journals (Sweden)

    Jeremy (Zheng Li

    2013-01-01

    Full Text Available Hydroforming processes have been widely applied in many different industrial fields, including aerospace, automotive, and modern plastics, for weight reduction and strength enhancement. Hydroforming of tubular products offers advantages over welding tubular assemblies from stampings, including increased strength, less work applied per unit weight, decreased processing and tool costs, improved structural stability, fewer secondary operations, enhanced stiffness, and more uniform product thickness. Although hydroforming has become a popular manufacturing methodology, little research has been done to study the hydroforming mechanism through computational modeling and simulation. This paper focuses on the study of the hydroforming process and mechanism based on computer-aided modeling (FEA) and prototype testing to determine material behavior during the hydroforming process. The objective of this research is to verify the effects of the major manufacturing parameters on hydroforming processes. The computational analysis and prototype testing indicate that factors including the applied internal pressure path and the compressive axial loading play important roles in hydroforming deformation. Computer-aided modeling and the prototyping experiments show close agreement, which verifies the credibility of this research and its analytic methodology.

  20. HRP's Healthcare Spin-Offs Through Computational Modeling and Simulation Practice Methodologies

    Science.gov (United States)

    Mulugeta, Lealem; Walton, Marlei; Nelson, Emily; Peng, Grace; Morrison, Tina; Erdemir, Ahmet; Myers, Jerry

    2014-01-01

    Spaceflight missions expose astronauts to novel operational and environmental conditions that pose health risks that are currently not well understood, and perhaps unanticipated. Furthermore, given the limited number of humans that have flown long-duration missions and beyond low Earth orbit, the research and clinical data necessary to predict and mitigate these health and performance risks are limited. Consequently, NASA's Human Research Program (HRP) conducts research and develops advanced methods and tools to predict, assess, and mitigate potential hazards to the health of astronauts. In this light, NASA has explored the possibility of leveraging computational modeling since the 1970s as a means to elucidate the physiologic risks of spaceflight and develop countermeasures. Since that time, substantial progress has been realized in this arena through a number of HRP-funded activities such as the Digital Astronaut Project (DAP) and the Integrated Medical Model (IMM). Much of this success can be attributed to HRP's endeavor to establish rigorous verification, validation, and credibility (VV&C) processes that ensure computational models and simulations (M&S) are sufficiently credible to address issues within their intended scope. This presentation summarizes HRP's activities in credibility of modeling and simulation, in particular through its outreach to the community of modeling and simulation practitioners. METHODS: The HRP requires that all M&S that can have moderate to high impact on crew health or mission success be vetted in accordance with the NASA Standard for Models and Simulations, NASA-STD-7009 (7009) [5]. As this standard mostly focuses on engineering systems, the IMM and DAP have invested substantial effort in adapting the processes established in this standard for application to biological M&S, which are more prevalent in human health and performance (HHP) and space biomedical research and operations [6,7]. These methods have also generated

  1. Credibility Assessment of Deterministic Computational Models and Simulations for Space Biomedical Research and Operations

    Science.gov (United States)

    Mulugeta, Lealem; Walton, Marlei; Nelson, Emily; Myers, Jerry

    2015-01-01

    Human missions beyond low Earth orbit to destinations such as Mars and asteroids will expose astronauts to novel operational conditions that may pose health risks that are currently not well understood and perhaps unanticipated. In addition, there are limited clinical and research data to inform the development and implementation of health risk countermeasures for these missions. Consequently, NASA's Digital Astronaut Project (DAP) is working to develop and implement computational models and simulations (M&S) to help predict and assess spaceflight health and performance risks, and enhance countermeasure development. In order to effectively accomplish these goals, the DAP evaluates its models and simulations via a rigorous verification, validation and credibility assessment process to ensure that the computational tools are sufficiently reliable both to inform research intended to mitigate potential risk and to guide countermeasure development. In doing so, DAP works closely with end-users, such as space life science researchers, to establish appropriate M&S credibility thresholds. We will present and demonstrate the process the DAP uses to vet computational M&S for space biomedical analysis using real M&S examples. We will also provide recommendations on how the larger space biomedical community can employ these concepts to enhance the credibility of their M&S codes.

  2. Computational Modeling and Simulations of Bioparticle Internalization Through Clathrin-mediated Endocytosis

    Science.gov (United States)

    Deng, Hua; Dutta, Prashanta; Liu, Jin

    2016-11-01

    Clathrin-mediated endocytosis (CME) is one of the most important endocytic pathways for the internalization of bioparticles at the lipid membrane of cells, and it plays crucial roles in the fundamental understanding of viral infections and intracellular/transcellular targeted drug delivery. During CME, the highly dynamic clathrin-coated pit (CCP), formed by the growth of ordered clathrin lattices, is the key scaffolding component that drives the deformation of the plasma membrane. Experimental studies have shown that the CCP alone can provide sufficient membrane curvature to facilitate membrane invagination. However, currently there is no computational model that couples cargo-receptor binding with the membrane invagination process, nor are there simulations of the dynamic growth process of the CCP. We develop a stochastic computational model for clathrin-mediated endocytosis based on Metropolis Monte Carlo simulations. In our model, the energetic costs of bending the membrane and the CCP are linked with antigen-antibody interactions. The assembly of clathrin lattices is a dynamic process that correlates with antigen-antibody bond formation. This model helps study membrane deformation and the effects of the CCP during the internalization of functionalized bioparticles through CME. This work is supported by NSF Grants: CBET-1250107 and CBET-1604211.

  3. COMPUTER SIMULATION OF ANTIFERROMAGNETIC STRUCTURES DESCRIBED BY THE THREE-VERTEX ANTIFERROMAGNETIC POTTS MODEL

    Directory of Open Access Journals (Sweden)

    Yarash K. Abuev

    2017-01-01

    Full Text Available Abstract. Objectives: A computer simulation of the antiferromagnetic structures described by the three-vertex Potts model on a triangular lattice is performed, taking into account the antiferromagnetic exchange interactions between the nearest (J1) and second (J2) neighbours. The main goal of the computer simulation was to elucidate the effects of the ground state and areas of frustration on the thermodynamic and magnetic properties of antiferromagnetic structures described by the low-dimensional Potts model. Method: The computer simulation is based on the Monte Carlo method, implemented using the Metropolis algorithm in combination with the Wolff cluster algorithm. The simulation was carried out for low-dimensional systems with periodic boundary conditions and linear dimensions L = 24–124. Results: On the basis of heat capacity and entropy analysis, phase transitions were observed in the considered model with exchange interaction parameters J1 < 0 and J2 < 0 in the variation intervals 0 ≤ r < 0.2 and r > 1.0 of the simulated system. It is proved that the competition between the exchange parameters of the first and second nearest neighbours in the interval 0.2 ≤ r ≤ 1.0 leads to a degeneracy of the ground state of the examined structure; frustrations are additionally observed in this interval. On the basis of the
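
    A minimal single-spin-flip Metropolis sketch of such a three-state Potts model on a triangular lattice with J1 and J2 couplings is given below; the Wolff cluster updates used in the paper, as well as the heat capacity and entropy measurements, are omitted, and the parameter values are illustrative.

    ```cpp
    // Minimal Metropolis sketch of the 3-state antiferromagnetic Potts model on
    // a triangular lattice with first- (J1) and second-neighbour (J2) couplings
    // and periodic boundaries. Energy convention: E = -J sum delta(s_i, s_j).
    #include <cmath>
    #include <cstdio>
    #include <random>
    #include <vector>

    const int L = 24;
    inline int idx(int x, int y) { return ((x + L) % L) * L + (y + L) % L; }

    int main() {
        std::mt19937 rng(3);
        std::uniform_int_distribution<int> spin3(0, 2), site(0, L - 1);
        std::uniform_real_distribution<double> U(0.0, 1.0);

        std::vector<int> s(L * L);
        for (int& v : s) v = spin3(rng);

        const double J1 = -1.0, J2 = -0.5, T = 0.5;  // antiferromagnetic couplings
        // Triangular-lattice offsets (axial coords) for the two neighbour shells.
        const int n1[6][2] = {{1,0},{-1,0},{0,1},{0,-1},{1,-1},{-1,1}};
        const int n2[6][2] = {{1,1},{-1,-1},{2,-1},{-2,1},{1,-2},{-1,2}};

        auto energy_of = [&](int x, int y, int v) {
            double e = 0.0;
            for (auto& d : n1) e += -J1 * (s[idx(x + d[0], y + d[1])] == v);
            for (auto& d : n2) e += -J2 * (s[idx(x + d[0], y + d[1])] == v);
            return e;
        };

        for (long step = 0; step < 100000; ++step) {  // Metropolis updates
            int x = site(rng), y = site(rng), v = spin3(rng);
            double dE = energy_of(x, y, v) - energy_of(x, y, s[idx(x, y)]);
            if (dE <= 0.0 || U(rng) < std::exp(-dE / T)) s[idx(x, y)] = v;
        }
        std::printf("final spin at origin: %d\n", s[0]);
        return 0;
    }
    ```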

  4. Computer simulation of magnetic resonance angiography imaging: model description and validation.

    Directory of Open Access Journals (Sweden)

    Artur Klepaczko

    Full Text Available With the development of medical imaging modalities and image processing algorithms, there arises a need for methods of their comprehensive quantitative evaluation. In particular, this concerns the algorithms for vessel tracking and segmentation in magnetic resonance angiography images. The problem can be approached by using synthetic images, where the true geometry of vessels is known. This paper presents a framework for computer modeling of MRA imaging and the results of its validation. The new model incorporates blood flow simulation within the MR signal computation kernel. The proposed solution is unique, especially with respect to the interface between flow and image formation processes. Furthermore, it utilizes the concept of particle tracing. The particles reflect the flow of the fluid they are immersed in and are assigned magnetization vectors whose temporal evolution is controlled by MR physics. Such an approach ensures flexibility, as the designed simulator is able to reconstruct flow profiles of any type. The proposed model is validated in a series of experiments with physical and digital flow phantoms. The synthesized 3D images contain various features (including artifacts) characteristic of the time-of-flight protocol and exhibit remarkable correlation with the data acquired in a real MR scanner. The obtained results support the primary goal of the conducted research, i.e. establishing a reference technique for a quantified validation of MR angiography image processing algorithms.
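
    The particle-tracing idea couples each particle's position update with a Bloch-equation relaxation step; the sketch below shows that coupling for a single particle in an assumed plug flow, with relaxation times and flow speed chosen for illustration rather than taken from the paper.

    ```cpp
    // Minimal particle-tracing sketch in the spirit of the described simulator:
    // a particle carries a magnetization vector relaxing toward equilibrium
    // (T1/T2 Bloch relaxation) while being advected by a prescribed flow.
    #include <cmath>
    #include <cstdio>

    struct Particle { double z, Mx, My, Mz; };

    int main() {
        const double T1 = 1.2, T2 = 0.25, M0 = 1.0;  // blood-like relaxation times [s]
        const double v = 0.1, dt = 1e-3;             // plug-flow speed [m/s], step [s]

        Particle p = {0.0, 1.0, 0.0, 0.0};           // just after a 90-degree pulse
        for (int step = 0; step < 1000; ++step) {
            p.z += v * dt;                           // advection by the flow
            double e2 = std::exp(-dt / T2), e1 = std::exp(-dt / T1);
            p.Mx *= e2;                              // transverse decay
            p.My *= e2;
            p.Mz = M0 + (p.Mz - M0) * e1;            // longitudinal recovery
        }
        std::printf("z = %.3f m, |Mxy| = %.3f, Mz = %.3f\n",
                    p.z, std::hypot(p.Mx, p.My), p.Mz);
        return 0;
    }
    ```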

  5. A computational platform for modeling and simulation of pipeline georeferencing systems

    Energy Technology Data Exchange (ETDEWEB)

    Guimaraes, A.G.; Pellanda, P.C.; Gois, J.A. [Instituto Militar de Engenharia (IME), Rio de Janeiro, RJ (Brazil); Roquette, P.; Pinto, M.; Durao, R. [Instituto de Pesquisas da Marinha (IPqM), Rio de Janeiro, RJ (Brazil); Silva, M.S.V.; Martins, W.F.; Camillo, L.M.; Sacsa, R.P.; Madeira, B. [Ministerio de Ciencia e Tecnologia (CT-PETRO2006MCT), Brasilia, DF (Brazil). Financiadora de Estudos e Projetos (FINEP). Plano Nacional de Ciencia e Tecnologia do Setor Petroleo e Gas Natural

    2009-07-01

    This work presents a computational platform for modeling and simulation of pipeline georeferencing systems, developed on the basis of typical pipeline characteristics, the dynamical modeling of the Pipeline Inspection Gauge (PIG), and the analysis and implementation of an inertial navigation algorithm. The software environment for PIG trajectory simulation and navigation allows the user, through a friendly interface, to carry out evaluation tests of the inertial navigation system under different scenarios. Therefore, it is possible to define the required specifications of the pipeline georeferencing system components, such as: the required precision of inertial sensors, the characteristics of the navigation auxiliary system (GPS-surveyed control points, odometers, etc.), the pipeline construction information to be considered in order to improve the trajectory estimation precision, and the signal processing techniques most suitable for the treatment of inertial sensor data. The simulation results are analyzed through the evaluation of several performance metrics usually considered in inertial navigation applications, and 2D and 3D plots of the trajectory estimation error and of the recovered trajectory in the three coordinates are made available to the user. This paper presents the simulation platform and its constituent modules and defines their functional characteristics and interrelationships.

  6. Advances in Intelligent Modelling and Simulation Artificial Intelligence-Based Models and Techniques in Scalable Computing

    CERN Document Server

    Khan, Samee; Burczy´nski, Tadeusz

    2012-01-01

    One of the most challenging issues in today's large-scale computational modeling and design is to effectively manage complex distributed environments, such as computational clouds, grids, ad hoc, and P2P networks operating under various types of users with evolving relationships fraught with uncertainties. In this context, the IT resources and services usually belong to different owners (institutions, enterprises, or individuals) and are managed by different administrators. Moreover, uncertainties are presented to the system at hand in various forms of information that is incomplete, imprecise, fragmentary, or overloading, which hinders the full and precise resolution of the evaluation criteria, the sequencing and selection, and the assignment of scores. Intelligent scalable systems enable flexible routing and charging, advanced user interactions, and the aggregation and sharing of geographically-distributed resources in modern large-scale systems. This book presents new ideas, theories, models...

  7. Space Shuttle Propulsion Systems Plume Modeling and Simulation for the Lift-Off Computational Fluid Dynamics Model

    Science.gov (United States)

    Strutzenberg, L. L.; Dougherty, N. S.; Liever, P. A.; West, J. S.; Smith, S. D.

    2007-01-01

    This paper details advances being made in the development of Reynolds-Averaged Navier-Stokes numerical simulation tools, models, and methods for the integrated Space Shuttle Vehicle at launch. The conceptual model and modeling approach described include the development of multiple computational models to appropriately analyze the potential debris transport for critical debris sources at Lift-Off. The conceptual model described herein involves the integration of propulsion analysis for the nozzle/plume flow with the overall 3D vehicle flowfield at Lift-Off. Debris Transport Analyses are being performed using the Shuttle Lift-Off models to assess the risk to the vehicle from Lift-Off debris and appropriately prioritized mitigation of potential debris sources to continue to reduce vehicle risk. These integrated simulations are being used to evaluate plume-induced debris environments where the multi-plume interactions with the launch facility can potentially accelerate debris particles toward the vehicle.

  8. Spreading of a chain macromolecule onto a cell membrane by a computer simulation model

    Science.gov (United States)

    Xie, Jun; Pandey, Ras

    2002-03-01

    Computer simulations are performed to study the conformation and dynamics of a relatively large chain macromolecule at the surface of a model membrane - a preliminary attempt toward an ultimately realistic model of a protein on a cell membrane. We use a discrete lattice of size Lx × L × L. The chain molecule of length Lc is modeled by consecutive nodes connected by bonds on the trail of a random walk, with appropriate constraints such as excluded volume, energy-dependent configurational bias, etc. A Monte Carlo method is used to move the chains via segmental dynamics, i.e., end-move, kink-jump, crank-shaft, reptation, etc. The membrane substrate is formed by self-assembly of biased short chains on a substrate. The large chain molecule is then driven toward the membrane by a field. We investigate the dynamics of the chain macromolecule, the spread of its density, and its conformation.
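
    The segmental-dynamics Monte Carlo moves listed above are simple lattice updates. A minimal sketch in Python of just the end-move on a cubic lattice with excluded volume (athermal acceptance and periodic boundaries for brevity; the lattice size, chain length and move set are illustrative, not the authors' parameters):

        import random

        L = 20                                  # cubic lattice dimension
        NEIGH = [(1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)]

        def end_move(chain, occupied):
            """Attempt one end-move: relocate a chain end to a free lattice
            site adjacent to its neighbour, preserving connectivity and
            excluded volume."""
            end = random.choice([0, len(chain) - 1])
            anchor = chain[1] if end == 0 else chain[-2]
            dx, dy, dz = random.choice(NEIGH)
            new = ((anchor[0]+dx) % L, (anchor[1]+dy) % L, (anchor[2]+dz) % L)
            if new in occupied:
                return                          # excluded volume: reject
            occupied.discard(chain[end])
            occupied.add(new)
            chain[end] = new

        # straight initial chain of 10 monomers along x
        chain = [(i, 0, 0) for i in range(10)]
        occupied = set(chain)
        for _ in range(10000):
            end_move(chain, occupied)

    Kink-jump, crank-shaft and reptation moves follow the same pattern: propose a connectivity-preserving displacement, then accept or reject against excluded volume (and, in the biased version, a Metropolis energy test).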

  9. Methods, Computational Platform, Verification, and Application of Earthquake-Soil-Structure-Interaction Modeling and Simulation

    Science.gov (United States)

    Tafazzoli, Nima

    Seismic response of soil-structure systems has attracted significant attention for a long time. This is quite understandable given the size and complexity of soil-structure systems. Three important aspects of ESSI modeling are the consistent tracking of input seismic energy and of the various energy dissipation mechanisms within the system, the numerical techniques used to simulate the dynamics of ESSI, and the influence of uncertainty on ESSI simulations. This dissertation is a contribution to the development of one such tool, called the ESSI Simulator, including work on an extensive verification and validation suite for it. Verification and validation are important for high-fidelity numerical predictions of the behavior of complex systems. The simulator uses the finite element method as a numerical tool to obtain solutions for a large class of engineering problems such as liquefaction, earthquake-soil-structure interaction, site effects, piles, pile groups, probabilistic plasticity, stochastic elastic-plastic FEM, and detailed large-scale parallel models. The response of a full three-dimensional soil-structure-interaction simulation of complex structures is evaluated under 3D wave propagation. The Domain-Reduction-Method is used to apply the forces in a two-step procedure for dynamic analysis, with the goal of reducing the large computational domain. The issue of damping of the waves at the boundary of the finite element models is studied using different damping patterns; damping is applied in the layer of elements outside the Domain-Reduction-Method zone in order to absorb the residual waves coming out of the boundary layer due to structural excitation. An extensive parametric study is performed on the dynamic soil-structure interaction of a complex system, and results for different cases of soil strength and foundation embedment are compared. A set of constitutive models that are highly efficient in terms of computational time is developed and implemented in the ESSI Simulator

  10. A hybrid model for the computationally-efficient simulation of the cerebellar granular layer

    Directory of Open Access Journals (Sweden)

    Anna Cattani

    2016-04-01

    The aim of the present paper is to efficiently describe the membrane potential dynamics of neural populations formed by species having a high density difference in specific brain areas. We propose a hybrid model whose main ingredients are a conductance-based model (ODE system) and its continuous counterpart (PDE system), obtained through a limit process in which the number of neurons confined in a bounded region of the brain tissue is sent to infinity. Specifically, in the discrete model each cell is described by a set of time-dependent variables, whereas in the continuum model cells are grouped into populations that are described by a set of continuous variables. Communications between populations, which translate into interactions among the discrete and continuous models, are the essence of the hybrid model we present here. The cerebellum and cerebellum-like structures show in their granular layer a large difference in the relative density of neuronal species, making them a natural testing ground for our hybrid model. By reconstructing the ensemble activity of the cerebellar granular layer network and by comparing our results to a more realistic computational network, we demonstrate that our description of the network activity, even though it is not biophysically detailed, is still capable of reproducing salient features of neural network dynamics. Our modeling approach yields a significant computational cost reduction, increasing the simulation speed at least 270 times. The hybrid model reproduces interesting dynamics such as local microcircuit synchronization, traveling waves, center-surround organization and time-windowing.

  11. Building Model for the University of Mosul Computer Network Using OPNET Simulator

    Directory of Open Access Journals (Sweden)

    Modhar A. Hammoudi

    2013-04-01

    This paper aims at establishing a model in the OPNET (Optimized Network Engineering Tool) simulator for the University of Mosul computer network. The proposed network model was made up of two routers (Cisco 2600), a core switch (Cisco 6509), two servers, an ip32 cloud and 37 VLANs. These VLANs were connected to the core switch using fiber optic cables (1000BaseX). Three applications were added to test the network model: FTP (File Transfer Protocol), HTTP (Hyper Text Transfer Protocol) and VoIP (Voice over Internet Protocol). The results showed that the proposed model was effective for designing and managing the targeted network and can be used to view the data flow in it. The simulation results also showed that the maximum number of VoIP service users could be raised up to 5000 users when working under IP Telephony. This means that the ability to utilize the VoIP service in this network can be maintained and is better when subjected to an IP telephony scheme.

  12. Development of computer program for simulation of an ice bank system operation, Part I: Mathematical modelling

    Energy Technology Data Exchange (ETDEWEB)

    Halasz, Boris; Grozdek, Marino; Soldo, Vladimir [Faculty of Mechanical Engineering and Naval Architecture, University of Zagreb, Ivana Lucica 5, 10 000 Zagreb (Croatia)

    2009-09-15

    Since the use of standard engineering methods in the process of ice bank performance evaluation offers neither adequate flexibility nor accuracy, the aim of this research was to provide a powerful tool for the industrial design of ice storage systems, allowing the various design parameters and system arrangements to be accounted for over a wide range of time-varying operating conditions. In this paper the development of a computer application for the prediction of ice bank system operation is presented. Static, indirect, cool thermal storage systems with external ice-on-coil building/melting were considered. The mathematical model was developed by means of energy and mass balance relations for each component of the system and is basically divided into two parts: the model of the ice storage system and the model of the refrigeration unit. Heat transfer processes in the ice silo were modelled by use of empirical correlations, while the performance of the refrigeration unit components was based on manufacturers' data. Programming and application design were done in the Fortran 95 language standard. Input of data is enabled through drop-down menus and dialog boxes, while the results are presented via figures, diagrams and data (ASCII) files. In addition, to demonstrate the necessity of developing the simulation program, a case study was performed. The simulation results clearly indicate that no simple engineering methods or rule-of-thumb principles can properly validate the performance of an ice bank system. (author)
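
    The component energy balances lend themselves to simple explicit time stepping. A minimal sketch in Python of the ice-silo side only, with a constant overall heat-transfer coefficient (in the real model the coefficient degrades as the ice layer grows; all values below are illustrative, not taken from the paper):

        # Lumped energy balance for ice build-up on a coil (illustrative values)
        UA      = 5000.0      # overall heat-transfer coefficient x area, W/K
        T_brine = -6.0        # coil brine temperature, degC
        T_water = 0.0         # tank water held at the freezing point, degC
        L_f     = 334e3       # latent heat of fusion of ice, J/kg
        dt      = 60.0        # time step, s

        m_ice = 0.0
        for step in range(24 * 60):              # one day, minute steps
            q = UA * (T_water - T_brine)         # heat extracted from tank, W
            m_ice += q * dt / L_f                # mass balance: ice frozen, kg

        print(f"ice built after 24 h: {m_ice/1000:.1f} t")

    The full model couples such balances for every component to the refrigeration-unit performance maps, which is exactly why a rule-of-thumb estimate like this one cannot validate a real ice bank design.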

  13. MISCONCEPTION REMEDIATION OF ATOMIC ORBITAL, MOLECULAR ORBITAL, AND HYBRIDIZATION CONCEPTS BY COMPUTER-ASSISTED INSTRUCTION WITH ANIMATION AND SIMULATION MODEL

    Directory of Open Access Journals (Sweden)

    Sri Mursiti

    2010-06-01

    Computer-Assisted Instruction with animation and simulation was used for misconception remediation of atomic orbital, molecular orbital, and hybridization concepts. The applied instruction model focused on a concept approach using the Macromedia Flash Player and PowerPoint programs. The subjects of this research were second-semester students of the Chemistry Department. The data were collected using true-false pre-tests and post-tests, with each answer followed by its reason. The analysis reveals that Computer-Assisted Instruction with the animation and simulation model increased the understanding of atomic orbital, molecular orbital, and hybridization concepts, i.e. remediated the misconceptions, as shown by the significant score gain between before and after the implementation of Computer-Assisted Instruction with the animation and simulation model. The instruction model also developed the students' generic skills. Keywords: animation simulation, misconception remediation, orbital, hybridization

  14. Computer-Aided Design, Modeling and Simulation of a New Solar Still Design

    Directory of Open Access Journals (Sweden)

    Jeremy (Zheng) Li

    2011-01-01

    Clean and pure drinking water is important in today's life, but current water sources are usually brackish or carry bacteria and cannot be used for drinking. About 78% of the available water is salty sea water, 21% is brackish, and only 1% is fresh. Distillation is one of the feasible processes applied to water purification, and it requires an energy input, such as solar radiation. Water is evaporated in the distillation process, and the water vapor can then be separated and condensed to pure water. With the change from conventional fuels to renewable and environmentally friendly energy sources, modern technology makes it possible to use the abundant energy from the sun. It is better to use solar energy to drive water desalination, since it is more economical than the use of conventional energies. The main focus of this paper is applying computer-aided modeling and simulation to design a less complex solar water distillation system. A prototype of the solar still was also built to verify its feasibility, functionality, and reliability. The computational simulation and prototype testing show the reliability and proper functionality of this solar water distillation system.

  15. Modeling and Simulation of Scalable Cloud Computing Environments and the CloudSim Toolkit: Challenges and Opportunities

    CERN Document Server

    Buyya, Rajkumar; Calheiros, Rodrigo N

    2009-01-01

    Cloud computing aims to power the next generation of data centers and enables application service providers to lease data center capabilities for deploying applications depending on user QoS (Quality of Service) requirements. Cloud applications have different composition, configuration, and deployment requirements. Quantifying the performance of resource allocation policies and application scheduling algorithms at a fine level of detail in Cloud computing environments, for different application and service models under varying load, energy performance (power consumption, heat dissipation), and system size, is a challenging problem to tackle. To simplify this process, in this paper we propose CloudSim: an extensible simulation toolkit that enables modelling and simulation of Cloud computing environments. The CloudSim toolkit supports modelling and creation of one or more virtual machines (VMs) on a simulated node of a Data Center, jobs, and their mapping to suitable VMs. It also allows simulation of multiple Data Centers to...
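
    The abstraction CloudSim models, jobs (cloudlets) mapped onto provisioned VMs, can be conveyed in a few lines of Python (a concept sketch only, not the CloudSim Java API; the class names merely mirror the toolkit's vocabulary, and the earliest-idle, run-to-completion policy is an invented simple example):

        from dataclasses import dataclass

        @dataclass
        class Vm:
            mips: float          # processing capacity
            free_at: float = 0.0 # time the VM next becomes idle

        @dataclass
        class Cloudlet:
            length: float        # work in million instructions

        def schedule(cloudlets, vms):
            """Map each cloudlet to the earliest-idle VM and return
            the resulting completion times."""
            finish = []
            for c in cloudlets:
                vm = min(vms, key=lambda v: v.free_at)
                vm.free_at += c.length / vm.mips   # run to completion
                finish.append(vm.free_at)
            return finish

        vms = [Vm(1000.0), Vm(500.0)]
        print(schedule([Cloudlet(4e4), Cloudlet(2e4), Cloudlet(1e4)], vms))

    A toolkit like CloudSim wraps this kind of mapping in a discrete-event engine so that allocation policies, data center topologies and load models can be swapped in and compared.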

  16. Partitioning and packing mathematical simulation models for calculation on parallel computers

    Science.gov (United States)

    Arpasi, D. J.; Milner, E. J.

    1986-01-01

    The development of multiprocessor simulations from a serial set of ordinary differential equations describing a physical system is described. Degrees of parallelism (i.e., coupling between the equations) and their impact on parallel processing are discussed. The problem of identifying computational parallelism within sets of closely coupled equations that require the exchange of current values of variables is described. A technique is presented for identifying this parallelism and for partitioning the equations for parallel solution on a multiprocessor. An algorithm which packs the equations into a minimum number of processors is also described. The results of the packing algorithm when applied to a turbojet engine model are presented in terms of processor utilization.
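
    The packing step is essentially bin packing of per-equation computation costs onto processors. The paper's own algorithm is not reproduced in this record; a first-fit-decreasing heuristic in Python conveys the idea (the costs and capacity are arbitrary illustrative numbers):

        def pack(costs, capacity):
            """Assign equation groups, each with an estimated per-step
            compute cost, to the fewest processors whose load stays
            within a capacity budget (first-fit decreasing)."""
            processors = []                     # remaining capacity per CPU
            assignment = {}
            for eq, cost in sorted(costs.items(), key=lambda kv: -kv[1]):
                for i, free in enumerate(processors):
                    if cost <= free:
                        processors[i] -= cost
                        assignment[eq] = i
                        break
                else:                           # open a new processor
                    processors.append(capacity - cost)
                    assignment[eq] = len(processors) - 1
            return assignment

        print(pack({"eq1": 7, "eq2": 5, "eq3": 4, "eq4": 3}, capacity=10))

    In the paper's setting the capacity is set by the simulation frame time, and closely coupled equations must additionally be co-located or synchronized, which is what the partitioning analysis determines.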

  17. Numerical computation of the linear stability of the diffusion model for crystal growth simulation

    Energy Technology Data Exchange (ETDEWEB)

    Yang, C.; Sorensen, D.C. [Rice Univ., Houston, TX (United States); Meiron, D.I.; Wedeman, B. [California Institute of Technology, Pasadena, CA (United States)

    1996-12-31

    We consider a computational scheme for determining the linear stability of a diffusion model arising from the simulation of crystal growth. The process of a needle crystal solidifying into some undercooled liquid can be described by the dual diffusion equations with appropriate initial and boundary conditions. Here U_t and U_a denote the temperature of the liquid and solid respectively, and α represents the thermal diffusivity. At the solid-liquid interface, the motion of the interface, denoted by r, and the temperature field are related by a conservation relation, where n is the unit outward-pointing normal to the interface. A basic stationary solution to this free boundary problem can be obtained by writing the equations of motion in a moving frame and transforming the problem to parabolic coordinates. This is known as the Ivantsov parabola solution. Linear stability theory applied to this stationary solution gives rise to an eigenvalue problem of the form.
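
    The equations themselves were dropped from this record. A hedged reconstruction of the generic two-phase (Stefan) form that the description matches, not necessarily the authors' exact notation:

        \frac{\partial U_t}{\partial t} = \alpha \nabla^2 U_t \quad \text{(liquid)}, \qquad \frac{\partial U_a}{\partial t} = \alpha \nabla^2 U_a \quad \text{(solid)},

    with the interface conservation relation linking the normal velocity of the front r to the jump in heat flux across it:

        \frac{\partial r}{\partial t} \cdot \mathbf{n} \;\propto\; \left( \nabla U_a - \nabla U_t \right) \cdot \mathbf{n}.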

  18. A Computational Modeling and Simulation Approach to Investigate Mechanisms of Subcellular cAMP Compartmentation.

    Science.gov (United States)

    Yang, Pei-Chi; Boras, Britton W; Jeng, Mao-Tsuen; Docken, Steffen S; Lewis, Timothy J; McCulloch, Andrew D; Harvey, Robert D; Clancy, Colleen E

    2016-07-01

    Subcellular compartmentation of the ubiquitous second messenger cAMP has been widely proposed as a mechanism to explain unique receptor-dependent functional responses. How exactly compartmentation is achieved, however, has remained a mystery for more than 40 years. In this study, we developed computational and mathematical models to represent a subcellular sarcomeric space in a cardiac myocyte with varying detail. We then used these models to predict the contributions of various mechanisms that establish subcellular cAMP microdomains. We used the models to test the hypothesis that phosphodiesterases act as functional barriers to diffusion, creating discrete cAMP signaling domains. We also used the models to predict the effect of a range of experimentally measured diffusion rates on cAMP compartmentation. Finally, we modeled the anatomical structures in a cardiac myocyte dyad to predict the effects of anatomical diffusion barriers on cAMP compartmentation. When we incorporated experimentally informed model parameters to reconstruct an in silico subcellular sarcomeric space with spatially distinct cAMP production sites linked to caveolar domains, the models predicted that under realistic conditions phosphodiesterases alone were insufficient to generate significant cAMP gradients. This prediction persisted even when combined with slow cAMP diffusion. When we additionally considered the effects of anatomic barriers to diffusion that are expected in the cardiac myocyte dyadic space, cAMP compartmentation did occur, but only when diffusion was slow. Our model simulations suggest that additional mechanisms likely contribute to cAMP gradients occurring in submicroscopic domains. The difference between the physiological and pathological effects resulting from the production of cAMP may be a function of appropriate compartmentation of cAMP signaling. Therefore, understanding the contribution of factors that are responsible for coordinating the spatial and temporal
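
    The central negative result, that degradation alone cannot carve out sub-micron domains when cAMP diffuses fast, follows from the diffusion length constant, which a few lines of Python make concrete (parameter values are illustrative, in the range commonly quoted for freely diffusing cAMP, not the paper's fitted values):

        import numpy as np

        # 1D steady-state cAMP profile with a point source at x = 0 and
        # uniform phosphodiesterase (PDE) degradation:
        #   D c'' - k c = 0  ->  c(x) = c0 * exp(-x / lam),  lam = sqrt(D / k)
        D = 300.0    # cAMP diffusion coefficient, um^2/s (fast, free diffusion)
        k = 10.0     # first-order PDE degradation rate, 1/s (illustrative)
        lam = np.sqrt(D / k)
        print(f"length constant: {lam:.1f} um")  # ~5.5 um >> a ~0.1 um domain

    With fast diffusion the length constant is tens of times larger than a dyadic microdomain, so either diffusion must be slowed or physical barriers added, which is what the full spatial models in the paper explore.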

  19. In-Service Design & Performance Prediction of Advanced Fusion Material Systems by Computational Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    G. R. Odette; G. E. Lucas

    2005-11-15

    This final report on "In-Service Design & Performance Prediction of Advanced Fusion Material Systems by Computational Modeling and Simulation" (DE-FG03-01ER54632) consists of a series of summaries of work that has been published, or presented at meetings, or both. It briefly describes results on the following topics: 1) A Transport and Fate Model for Helium and Helium Management; 2) Atomistic Studies of Point Defect Energetics, Dynamics and Interactions; 3) Multiscale Modeling of Fracture, consisting of: 3a) A Micromechanical Model of the Master Curve (MC) Universal Fracture Toughness-Temperature Relation, K_Jc(T - T_o); 3b) An Embrittlement ΔT_o Prediction Model for the Irradiation-Hardening-Dominated Regime; 3c) Non-hardening Irradiation-Assisted Thermal and Helium Embrittlement of 8Cr Tempered Martensitic Steels: Compilation and Analysis of Existing Data; 3d) A Model for the K_Jc(T) of a High-Strength NFA MA957; 3e) Cracked-Body Size and Geometry Effects on Measured and Effective Fracture Toughness: Model-Based MC and T_o Evaluations of F82H and Eurofer 97; 3f) Size and Geometry Effects on the Effective Toughness of Cracked Fusion Structures; 4) Modeling the Multiscale Mechanics of Flow Localization-Ductility Loss in Irradiation-Damaged BCC Alloys; and 5) A Universal Relation Between Indentation Hardness and True Stress-Strain Constitutive Behavior. Further details can be found in the cited references or presentations, which generally can be accessed on the internet or provided upon request to the authors. Finally, it is noted that this effort was integrated with our base program in fusion materials, also funded by the DOE OFES.

  20. A pedagogical walkthrough of computational modeling and simulation of Wnt signaling pathway using static causal models in MATLAB.

    Science.gov (United States)

    Sinha, Shriprakash

    2016-12-01

    Simulation studies in systems biology involving computational experiments on Wnt signaling pathways abound in the literature but often lack a pedagogical perspective that might ease the understanding of beginner students and researchers in transition who intend to work on modeling the pathway. This paucity might happen due to restrictive business policies which enforce an unwanted embargo on the sharing of important scientific knowledge. A tutorial introduction to computational modeling of the Wnt signaling pathway in a human colorectal cancer dataset using static Bayesian network models is provided. The walkthrough might aid biologists/informaticians in understanding the design of computational experiments and is interleaved with an exposition of the Matlab code and causal models from the Bayes Net Toolbox. The manuscript elucidates the coding contents of the advance article by Sinha (Integr. Biol. 6:1034-1048, 2014) and takes the reader through a step-by-step process of how (a) the collection and transformation of the available biological information from the literature is done, (b) the integration of the heterogeneous data and prior biological knowledge into the network is achieved, (c) the simulation study is designed, (d) a hypothesis regarding a biological phenomenon is translated into the computational framework, and (e) results and inferences drawn using d-connectivity/separability are reported. The manuscript finally ends with a programming assignment to help the readers get hands-on experience of a perturbation project. Descriptions of the Matlab files are made available under the GNU GPL v3 license at the Google Code project on https://code.google.com/p/static-bn-for-wnt-signaling-pathway and https://sites.google.com/site/shriprakashsinha/shriprakashsinha/projects/static-bn-for-wnt-signaling-pathway. Latest updates can be found on the latter website.
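
    The kind of query such a static Bayesian network answers can be shown with a toy three-node network in plain Python (the variables and probabilities are invented for illustration; the tutorial itself uses the MATLAB Bayes Net Toolbox on Wnt pathway variables):

        import itertools

        # Toy static Bayesian network A -> B, A -> C:
        # enumerate the joint and query P(B=1 | C=1).
        P_A = {1: 0.3, 0: 0.7}
        P_B_given_A = {1: {1: 0.9, 0: 0.1}, 0: {1: 0.2, 0: 0.8}}  # P(B|A)
        P_C_given_A = {1: {1: 0.8, 0: 0.2}, 0: {1: 0.1, 0: 0.9}}  # P(C|A)

        def joint(a, b, c):
            return P_A[a] * P_B_given_A[a][b] * P_C_given_A[a][c]

        num = sum(joint(a, 1, 1) for a in (0, 1))
        den = sum(joint(a, b, 1) for a, b in itertools.product((0, 1), repeat=2))
        print(f"P(B=1 | C=1) = {num/den:.3f}")  # B and C are d-connected via A

    Observing C changes the belief about B precisely because B and C are d-connected through their common parent; conditioning on A would d-separate them, which is the reasoning pattern the tutorial's inference steps rely on.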

  1. Computer simulation of active suspension based on the full-vehicle model

    Institute of Scientific and Technical Information of China (English)

    LI Jun; CHEN Shanguo

    2002-01-01

    The problem of active suspension control for a vehicle is commonly treated with a quarter-car or half-car model, but such models are not sufficient for practical applications. In this paper, taking into account the influence of factors such as the seat and passengers, an MDOF (multi-degree-of-freedom) model describing the vehicle motion is set up. The MDOF model, which has 8 DOF covering four independent suspensions and four wheel tracks, is shown to be more applicable by comparing its analysis results with those of some conventional vehicle models. It is therefore more suitable to use the 8-DOF full-car model than a conventional 4-DOF half-car model in active control design for car vibration. Based on the derived 8-DOF model, a controller is designed using LQ (linear quadratic) control theory, and an appropriate control scheme is selected by testing various performance indexes. Computer simulation is carried out for a passenger car running on a road with a step disturbance and with random road disturbance expressed by a Power Spectral Density (PSD). Vibrations corresponding to ride comfort are derived under the foregoing road disturbances. The responses of the uncontrolled and controlled systems are compared: the vehicle vibration response is greatly suppressed and quickly damped, which testifies to the effect of the active suspension. The results achieved with various controllers are compared to investigate the influence of different control schemes on the control effect.
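
    The LQ design step can be reproduced in miniature on a quarter-car model with SciPy (a hedged sketch: parameter values are illustrative and suspension damping is omitted for brevity; the paper's controller is designed on the full 8-DOF model):

        import numpy as np
        from scipy.linalg import solve_continuous_are

        ms, mu = 300.0, 40.0          # sprung / unsprung mass, kg
        ks, kt = 16000.0, 160000.0    # suspension / tyre stiffness, N/m

        # States: [sprung displ., sprung vel., unsprung displ., unsprung vel.]
        A = np.array([[0, 1, 0, 0],
                      [-ks/ms, 0, ks/ms, 0],
                      [0, 0, 0, 1],
                      [ks/mu, 0, -(ks+kt)/mu, 0]])
        B = np.array([[0], [1/ms], [0], [-1/mu]])   # actuator force input

        Q = np.diag([1e4, 1e2, 1e3, 1.0])           # penalise body motion most
        R = np.array([[1e-6]])

        P = solve_continuous_are(A, B, Q, R)        # Riccati solution
        K = np.linalg.solve(R, B.T @ P)             # optimal gain, u = -K x
        print(K)

    Testing different performance indexes, as the paper does, amounts to re-weighting Q and R and comparing the resulting closed-loop ride-comfort responses.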

  2. Using Physical and Computer Simulations of Collective Behaviour as an Introduction to Modelling Concepts for Applied Biologists

    Science.gov (United States)

    Rands, Sean A.

    2012-01-01

    Models are an important tool in science: not only do they act as a convenient device for describing a system or problem, but they also act as a conceptual tool for framing and exploring hypotheses. Models, and in particular computer simulations, are also an important education tool for training scientists, but it is difficult to teach students the…

  3. The evolutionary forces maintaining a wild polymorphism of Littorina saxatilis: model selection by computer simulations.

    Science.gov (United States)

    Pérez-Figueroa, A; Cruz, F; Carvajal-Rodríguez, A; Rolán-Alvarez, E; Caballero, A

    2005-01-01

    Two rocky shore ecotypes of Littorina saxatilis from north-west Spain live at different shore levels and habitats and have developed an incomplete reproductive isolation through size-assortative mating. The system is regarded as an example of sympatric ecological speciation. Several experiments have indicated that different evolutionary forces (migration, assortative mating and habitat-dependent selection) play a role in maintaining the polymorphism. However, an assessment of the combined contributions of these forces in supporting the observed pattern in the wild has been absent. A model selection procedure using computer simulations was used to investigate the contribution of the different evolutionary forces towards the maintenance of the polymorphism. The agreement between alternative models and experimental estimates for a number of parameters was quantified by a least-squares method. The results of the analysis show that the fittest evolutionary model for the observed polymorphism is characterized by high gene flow, intermediate-to-high reproductive isolation between ecotypes, and moderate to strong selection against the non-resident ecotypes on each shore level. In addition, a substantial number of additive loci contributing to the selected trait and a narrow hybrid definition with respect to the phenotype are scenarios that better explain the polymorphism, whereas the ecotype fitnesses at the mid-shore, the level of phenotypic plasticity, and environmental effects are not key parameters.

  4. Computer Models Design for Teaching and Learning using Easy Java Simulation

    CERN Document Server

    Wee, Loo Kang Lawrence; Goh, Khoon Song Aloysius; Lye, Sze Yee; Lee, Tat Leong; Xu, Weiming; Goh, Giam Hwee Jimmy; Ong, Chee Wah; Ng, Soo Kok; Lim, Ee-Peow; Lim, Chew Ling; Yeo, Wee Leng Joshua; Ong, Matthew; Lim, Kenneth Y T

    2012-01-01

    We are teachers who have benefited from the Open Source Physics (Brown, 2012; Christian, 2010; Esquembre, 2012) community's work, and we would like to share some of the computer models and lesson packages that we have designed and implemented in grade 11 to 12 classes in five schools. In a ground-up teacher-leadership (MOE, 2010) approach, we came together to learn, advancing the professionalism (MOE, 2009) of physics educators and improving students' learning experiences through a suitable blend (Jaakkola, 2012) of real equipment and computer models where appropriate. We will share computer models that we have remixed from an existing library of computer models into suitable learning environments for physics inquiry, customized (Wee & Mak, 2009) for the Advanced Level Physics syllabus (SEAB, 2010, 2012). We hope other teachers find these computer models useful, remix them to suit their own contexts, design better learning activities, and share them to benefit all humankind, becoming citizens for the world...

  5. Plasma physics via computer simulation

    CERN Document Server

    Birdsall, CK

    2004-01-01

    PART 1: PRIMER. Why attempting to do plasma physics via computer simulation using particles makes good sense; Overall view of a one-dimensional electrostatic program; A one-dimensional electrostatic program ES1; Introduction to the numerical methods used; Projects for ES1; A 1d electromagnetic program EM1; Projects for EM1. PART 2: THEORY. Effects of the spatial grid; Effects of the finite time step; Energy-conserving simulation models; Multipole models; Kinetic theory for fluctuations and noise; collisions; Kinetic properties: theory, experience and heuristic estimates. PART 3: PRACTICE
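
    The ES1 scheme (particles deposit charge on a grid, a field solve, then a particle push) fits in a page of Python. A minimal sketch of a 1d electrostatic particle code in normalised units with nearest-grid-point weighting (an illustration of the method, not the ES1 code itself):

        import numpy as np

        ng, L, npart, dt = 64, 2*np.pi, 10000, 0.1
        dx = L / ng
        qm = -1.0                                  # electron charge/mass ratio
        q = -L / npart                             # particle charge (n0 = 1)

        rng = np.random.default_rng(0)
        x = rng.uniform(0, L, npart)
        v = np.where(rng.random(npart) < 0.5, 1.0, -1.0)  # two opposing beams

        k = 2*np.pi*np.fft.rfftfreq(ng, d=dx)      # wavenumbers for field solve
        for step in range(200):
            # charge deposition (nearest grid point) plus neutralising ions
            rho = np.bincount((x/dx).astype(int) % ng, minlength=ng)*q/dx + 1.0
            # Poisson: -d^2 phi/dx^2 = rho  ->  phi_k = rho_k / k^2
            rho_k = np.fft.rfft(rho)
            phi_k = np.zeros_like(rho_k)
            phi_k[1:] = rho_k[1:] / k[1:]**2
            E = -np.gradient(np.fft.irfft(phi_k, ng), dx)
            # gather field at particles and leapfrog push
            v += qm * E[(x/dx).astype(int) % ng] * dt
            x = (x + v*dt) % L

    Run as written, the counter-streaming beams develop the classic two-stream instability; the book's Parts 2 and 3 analyse exactly the grid, time-step and noise effects that such a bare-bones code exhibits.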

  6. Mathematical modeling of cancer cell invasion of tissue: biological insight from mathematical analysis and computational simulation.

    Science.gov (United States)

    Andasari, Vivi; Gerisch, Alf; Lolas, Georgios; South, Andrew P; Chaplain, Mark A J

    2011-07-01

    The ability of cancer cells to break out of tissue compartments and invade locally gives solid tumours a defining deadly characteristic. One of the first steps of invasion is the remodelling of the surrounding tissue or extracellular matrix (ECM), and a major part of this process is the over-expression of proteolytic enzymes, such as the urokinase-type plasminogen activator (uPA) and matrix metalloproteinases (MMPs), by the cancer cells to break down ECM proteins. Degradation of the matrix enables the cancer cells to migrate through the tissue and subsequently to spread to secondary sites in the body, a process known as metastasis. In this paper we undertake an analysis of a mathematical model of cancer cell invasion of tissue, or ECM, which focuses on the role of the urokinase plasminogen activation system. The model consists of a system of five reaction-diffusion-taxis partial differential equations describing the interactions between cancer cells, uPA, uPA inhibitors, plasmin and the host tissue. Cancer cells react chemotactically and haptotactically to the spatio-temporal effects of the uPA system. The results obtained from computational simulations carried out on the model equations produce dynamic heterogeneous spatio-temporal solutions, and using linear stability analysis we show that this is caused by a taxis-driven instability of a spatially homogeneous steady state. Finally, we consider the biological implications of the model results, draw parallels with clinical samples and laboratory-based models of cancer cell invasion using three-dimensional invasion assays, and go on to discuss future development of the model.
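
    The five equations are not reproduced in this record; schematically, the cancer-cell equation in uPA-system models of this type has the reaction-diffusion-taxis structure (a generic sketch of the form, not the authors' exact system):

        \frac{\partial c}{\partial t} =
        \underbrace{D_c \nabla^2 c}_{\text{diffusion}}
        - \underbrace{\nabla \cdot \left( \chi_u\, c \nabla u + \chi_p\, c \nabla p \right)}_{\text{chemotaxis}}
        - \underbrace{\nabla \cdot \left( \chi_m\, c \nabla m \right)}_{\text{haptotaxis}}
        + \underbrace{\mu\, c \left( 1 - c - m \right)}_{\text{proliferation}},

    with c the cancer-cell density, m the ECM density, and u and p the uPA and plasmin concentrations. It is the taxis terms, the fluxes up the gradients of u, p and m, that drive the instability of the homogeneous steady state identified by the linear stability analysis.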

  7. An imaging-based computational model for simulating angiogenesis and tumour oxygenation dynamics

    Science.gov (United States)

    Adhikarla, Vikram; Jeraj, Robert

    2016-05-01

    Tumour growth, angiogenesis and oxygenation vary substantially among tumours and significantly impact their treatment outcome. Imaging provides a unique means of investigating these tumour-specific characteristics. Here we propose a computational model to simulate tumour-specific oxygenation changes based on molecular imaging data. Tumour oxygenation in the model is reflected by the perfused vessel density. Tumour growth depends on its doubling time (T_d) and the imaged proliferation. The perfused vessel density recruitment rate depends on the perfused vessel density around the tumour (sMVD_tissue) and the maximum VEGF concentration for complete vessel dysfunctionality (VEGF_max). The model parameters were benchmarked to reproduce the dynamics of tumour oxygenation over its entire lifecycle, which is the most challenging test. Tumour oxygenation dynamics were quantified using the peak pO2 (pO2_peak) and the time to peak pO2 (t_peak). Sensitivity of tumour oxygenation to the model parameters was assessed by changing each parameter by 20%. t_peak was found to be more sensitive to the tumour-cell-line-related doubling time (~30%) than to the tissue vasculature density (~10%). On the other hand, pO2_peak was influenced similarly by the above tumour- and vasculature-associated parameters (~30-40%). Interestingly, both pO2_peak and t_peak were only marginally affected by VEGF_max (~5%). The development of a poorly oxygenated (hypoxic) core with tumour growth increased VEGF accumulation, thus disrupting vessel perfusion as well as further increasing hypoxia with time. The model, with its benchmarked parameters, is applied to hypoxia imaging data obtained using a [64Cu]Cu-ATSM PET scan of a mouse tumour, and the temporal development of the vasculature and hypoxia maps is shown. The work underscores the importance of using tumour-specific input for analysing tumour evolution. An extended model incorporating therapeutic effects can serve as a powerful tool for analysing

  8. Material characterization and computer model simulation of low density polyurethane foam used in a rodent traumatic brain injury model.

    Science.gov (United States)

    Zhang, Liying; Gurao, Manish; Yang, King H; King, Albert I

    2011-05-15

    Computer models of the head can be used to simulate the events associated with traumatic brain injury (TBI) and quantify the biomechanical response within the brain. Marmarou's impact acceleration rodent model is a widely used experimental model of TBI mirroring axonal pathology in humans. The mechanical properties of the low-density polyurethane (PU) foam, an essential piece of energy management used in Marmarou's impact device, had not been fully characterized. The foam used in Marmarou's device was tested at seven strain rates ranging from quasi-static to dynamic (0.014-42.86 s⁻¹) to quantify the stress-strain relationships in compression. The recovery rate of the foam after cyclic compression was also determined over recovery periods of up to three weeks. The experimentally determined stress-strain curves were incorporated into a material model in an explicit Finite Element (FE) solver to validate the strain-rate dependency of the FE foam model. Compression test results show that the foam used in the rodent impact acceleration model is strain-rate dependent. The foam was found to be reusable for multiple impacts; however, the stress resistance of used foam is reduced to 70% of that of new foam. The FU_CHANG_FOAM material model in the FE solver was found to be adequate for simulating this rate-sensitive foam.

  9. Approximate Bayesian Computation by Subset Simulation using hierarchical state-space models

    Science.gov (United States)

    Vakilzadeh, Majid K.; Huang, Yong; Beck, James L.; Abrahamsson, Thomas

    2017-02-01

    A new multi-level Markov Chain Monte Carlo algorithm for Approximate Bayesian Computation, ABC-SubSim, has recently appeared that exploits the Subset Simulation method for efficient rare-event simulation. ABC-SubSim adaptively creates a nested, decreasing sequence of data-approximating regions in the output space that correspond to increasingly close approximations of the observed output vector. At each level, multiple samples of the model parameter vector are generated by a component-wise Metropolis algorithm so that the predicted output corresponding to each parameter value falls in the current data-approximating region. Theoretically, if continued to the limit, the sequence of data-approximating regions would converge onto the observed output vector, and the approximate posterior distributions, which are conditional on the data-approximating region, would become exact, but this is not practically feasible. In this paper we study the performance of the ABC-SubSim algorithm for Bayesian updating of the parameters of dynamical systems using a general hierarchical state-space model. We note that the ABC methodology gives an approximate posterior distribution that actually corresponds to an exact posterior in which a uniformly distributed combined measurement and modeling error is added. We also note that ABC algorithms have a problem with learning the uncertain error variances in a stochastic state-space model, and so we treat them as nuisance parameters and analytically integrate them out of the posterior distribution. In addition, the statistical efficiency of the original ABC-SubSim algorithm is improved by developing a novel strategy to regulate the proposal variance for the component-wise Metropolis algorithm at each level. We demonstrate that self-regulated ABC-SubSim is well suited for Bayesian system identification by first applying it successfully to model updating of a two degree-of-freedom linear structure for three cases: globally
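
    The simplest member of the ABC family that ABC-SubSim accelerates is plain rejection sampling, sketched below in Python (a toy scalar model with invented numbers; ABC-SubSim replaces the single fixed tolerance with an adaptive sequence of nested data-approximating regions explored by component-wise Metropolis sampling):

        import numpy as np

        rng = np.random.default_rng(1)
        y_obs = 2.0
        tol = 0.05

        def simulate(theta):
            """Forward model: y = theta + measurement noise."""
            return theta + rng.normal(0.0, 0.5)

        accepted = []
        while len(accepted) < 1000:
            theta = rng.uniform(-5, 5)                # prior sample
            if abs(simulate(theta) - y_obs) < tol:    # data-approximating region
                accepted.append(theta)

        print(np.mean(accepted), np.std(accepted))    # approximate posterior

    Rejection ABC wastes almost every draw once the tolerance is small, which is exactly the rare-event problem Subset Simulation is designed to solve by shrinking the region level by level.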

  10. The effects of computer simulation models on middle school students' understanding of the anatomy and morphology of the frog

    Science.gov (United States)

    Akpan, Joseph Paul

    Science teachers, school administrators, educators, and the scientific community are faced with ethical controversies over animal dissection in school biology classrooms. Computer simulation has been proposed as a way of dealing with this issue. One intriguing tentative finding in previous research was that use of an interactive videodisc dissection facilitated performance on a subsequent actual dissection. This study was designed to replicate and extend that finding to computer-based dissection. The purpose of this study was twofold: (1) to examine the effectiveness of a computer simulation model of frog dissection in improving students' actual dissection performance and learning of frog anatomy and morphology, and (2) to determine whether the effectiveness of the simulation depends upon the sequence in which the simulation is presented. Class periods were randomly assigned to three experimental conditions: simulation before dissection, dissection before simulation, or dissection only. Results of the study indicated that students in the simulation-before-dissection (SBD) condition performed significantly better than those in the dissection-before-simulation (DBS) and dissection-only (DO) conditions on both the actual dissection and knowledge of the anatomy and morphology. There were no significant differences between the latter two conditions. Students' attitudes toward the use of animals for dissection did not change significantly from pretest to posttest and did not interact with treatment. The genders did not differ in achievement, but males were more favorable toward dissection and computers than were females. Attitudes were not influenced by the experimental treatments.

  11. Computer Simulations on a Multidimensional Continuum:

    DEFF Research Database (Denmark)

    Girault, Isabelle; Pfeffer, Melanie; Chiocarriello, Augusto

    2016-01-01

    Computer simulations exist on a multidimensional continuum with other educational technologies, including static animations, serious games, and virtual worlds. The act of defining simulations is context dependent. In our context of science education, we define simulations as algorithmic, dynamic... with emphasis on simulations' algorithmic, dynamic, and simple features. Defined as models, simulations can be computational or conceptual in nature and may reflect hypothetical or real events; such distinctions are addressed. Examples of programs that demonstrate the features of simulations emphasized in our...

  12. Computationally efficient models for simulation of non-ideal DC–DC converters operating in continuous and discontinuous conduction modes

    Indian Academy of Sciences (India)

    Challa Mohana Krishna; Saritha B; Narayanan G

    2015-10-01

    This paper discusses dynamic modeling of non-isolated DC–DC converters (buck, boost and buck–boost) under continuous and discontinuous modes of operation. Three types of models are presented for each converter, namely a switching model, an average model and a harmonic model. These models include significant non-idealities of the converters. The switching model gives the instantaneous currents and voltages of the converter. The average model provides the ripple-free currents and voltages, averaged over a switching cycle. The harmonic model gives the peak-to-peak ripple values of the currents and voltages. The validity of all these models is established by comparing simulation results with experimental results from laboratory prototypes at different steady-state and transient conditions. Simulation based on a combination of the average and harmonic models is shown to provide all the relevant information obtained from the switching model while consuming less computation time.
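
    For the ideal buck converter in continuous conduction, the average model reduces to two ODEs that integrate quickly. A minimal sketch in Python (ideal elements and illustrative values only; the paper's models additionally include the non-idealities and the discontinuous mode):

        import numpy as np
        from scipy.integrate import solve_ivp

        # Average (ripple-free) buck model in CCM:
        #   L di/dt = D*Vin - v,   C dv/dt = i - v/R
        Vin, D = 24.0, 0.5
        Lf, Cf, R = 100e-6, 220e-6, 10.0

        def avg_model(t, x):
            i, v = x
            return [(D*Vin - v)/Lf, (i - v/R)/Cf]

        sol = solve_ivp(avg_model, [0, 50e-3], [0.0, 0.0], max_step=1e-5)
        print(f"steady state: i = {sol.y[0,-1]:.2f} A, v = {sol.y[1,-1]:.2f} V")
        # expected: v -> D*Vin = 12 V, i -> v/R = 1.2 A

    Because the switching ripple is averaged out, the solver can take steps far longer than a switching period, which is the computational advantage the paper quantifies against the switching model.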

  13. Why applicants should use computer simulation models to comply with the FERC's new merger policy

    Energy Technology Data Exchange (ETDEWEB)

    Frankena, M.W.; Morris, J.R. [Economists Inc., Washington, DC (United States)

    1997-02-01

    Computer models for electric utility use in complying with the US Federal Energy Regulatory Commission policy on mergers are described. Four types of simulation models that are widely used in the electric power industry are considered as tools for analyzing market power issues: dispatch/transportation models, dispatch/unit-commitment models, load-flow models, and load-flow/dispatch models. Basic model capabilities and limitations are outlined. Uses of the models for other purposes are also noted, including regulatory filings, antitrust litigation, and evaluation of pricing strategies.

  14. Combination of inquiry learning model and computer simulation to improve mastery concept and the correlation with critical thinking skills (CTS)

    Science.gov (United States)

    Nugraha, Muhamad Gina; Kaniawati, Ida; Rusdiana, Dadi; Kirana, Kartika Hajar

    2016-02-01

    Among the purposes of physics learning at high school are mastering physics concepts and cultivating a scientific attitude (including a critical attitude), developing inductive and deductive reasoning skills. According to Ennis et al., inductive and deductive reasoning skills are part of critical thinking. Preliminary studies show that both competencies are under-achieved: student learning outcomes are low, and the learning processes are not conducive to cultivating critical thinking (teacher-centered learning). One learning model predicted to increase concept mastery and train CTS is the inquiry learning model aided by computer simulations. In this model, students are given the opportunity to be actively involved in the experiment and also get a good explanation from the computer simulations. From research with a randomized control group pretest-posttest design, we found that the inquiry learning model aided by computer simulations can significantly improve students' concept mastery compared with the conventional (teacher-centered) method. With the inquiry learning model aided by computer simulations, 20% of students showed high CTS, 63.3% medium and 16.7% low. CTS contributes strongly to students' concept mastery, with a correlation coefficient of 0.697, and contributes fairly to the enhancement of concept mastery, with a correlation coefficient of 0.603.

  15. Theoretical modeling and computational simulation of robust control for Mars aircraft

    Science.gov (United States)

    Oh, Seyool

    The focus of this dissertation is the development of control system design algorithms for autonomous operation of an aircraft in the Martian atmosphere. This research presents theoretical modeling and computational simulation of robust control and gain scheduling for a prototype Mars aircraft. A few hundred meters above the surface of Mars, the air density is less than 1% of the density of the Earth's atmosphere at sea level. However, at about 33 km (110,000 ft) above the Earth, the air density is similar to that near the surface of Mars. Marsflyer II was designed to investigate these flight regimes: 33 km above the Earth and the actual Mars environment. The fuselage of the preliminary design was cylindrical with a length of 2.59 m (8.49 ft); the wing span was 3.98 m (13.09 ft). The total weight of the demonstrator aircraft was around 4.54 kg (10.02 lb). Aircraft design tools have been developed based on successful aircraft for the Earth's atmosphere. However, above Mars an airborne robotic explorer would encounter low-Reynolds-number flow phenomena combined with high Mach numbers, a regime that is unknown for normal Earth aerodynamic applications. These flows are more complex than those occurring at high Reynolds numbers. The performance of airfoils at low Reynolds numbers is poorly understood and generally results in unfavorable aerodynamic characteristics. Design and simulation tools for the low-Reynolds-number Martian environment could be used to develop Unmanned Aerial Vehicles (UAVs). In this study, a robust control method is used to analyze a prototype Mars aircraft. The purpose of this aircraft is to demonstrate stability, control, and performance within a simulated Mars environment. Due to uncertainty regarding the actual Martian environment, flexibility in the operation of the aircraft's control system is important for successful performance. The stability and control derivatives of Marsflyer II were obtained by using the Advanced Aircraft Analysis (AAA

  16. Computer simulation of liquid crystals

    Energy Technology Data Exchange (ETDEWEB)

    McBride, C.

    1999-01-01

    Molecular dynamics simulation performed on modern computer workstations provides a powerful tool for the investigation of the static and dynamic characteristics of liquid crystal phases. In this thesis molecular dynamics computer simulations have been performed for two model systems. Simulations of 4,4'-di-n-pentyl-bibicyclo[2.2.2]octane demonstrate the growth of a structurally ordered phase directly from an isotropic fluid. This is the first time that this has been achieved for an atomistic model. The results demonstrate a strong coupling between orientational ordering and molecular shape, but indicate that the coupling between molecular conformational changes and molecular reorientation is relatively weak. Simulations have also been performed for a hybrid Gay-Berne/Lennard-Jones model resulting in thermodynamically stable nematic and smectic phases. Frank elastic constants have been calculated for the nematic phase formed by the hybrid model through analysis of the fluctuations of the nematic director, giving results comparable with those found experimentally. Work presented in this thesis also describes the parameterization of the torsional potential of a fragment of a dimethyl siloxane polymer chain, disiloxane diol (HOMe₂Si)₂O, using ab initio quantum mechanical calculations. (author)

  18. High-resolution computational algorithms for simulating offshore wind turbines and farms: Model development and validation

    Energy Technology Data Exchange (ETDEWEB)

    Calderer, Antoni [Univ. of Minnesota, Minneapolis, MN (United States); Yang, Xiaolei [Stony Brook Univ., NY (United States); Angelidis, Dionysios [Univ. of Minnesota, Minneapolis, MN (United States); Feist, Chris [Univ. of Minnesota, Minneapolis, MN (United States); Guala, Michele [Univ. of Minnesota, Minneapolis, MN (United States); Ruehl, Kelley [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Guo, Xin [Univ. of Minnesota, Minneapolis, MN (United States); Boomsma, Aaron [Univ. of Minnesota, Minneapolis, MN (United States); Shen, Lian [Univ. of Minnesota, Minneapolis, MN (United States); Sotiropoulos, Fotis [Stony Brook Univ., NY (United States)

    2015-10-30

    The present project involves the development of modeling and analysis design tools for assessing offshore wind turbine technologies. The computational tools developed herein are able to resolve the effects of the coupled interaction of atmospheric turbulence and ocean waves on aerodynamic performance and structural stability and reliability of offshore wind turbines and farms. Laboratory scale experiments have been carried out to derive data sets for validating the computational models.

  19. Energy consumption program: A computer model simulating energy loads in buildings

    Science.gov (United States)

    Stoller, F. W.; Lansing, F. L.; Chai, V. W.; Higgins, S.

    1978-01-01

    The JPL energy consumption computer program, developed as a useful tool in the ongoing building modification studies of the DSN energy conservation project, is described. The program simulates building heating and cooling loads and computes thermal and electric energy consumption and cost. The accuracy of the computations is not sacrificed, however, since the results lie within a ±10 percent margin of those read from energy meters. The program is carefully structured to reduce both the user's time and the running cost by asking minimum information of the user and reducing many internal time-consuming computational loops. Many unique features were added to handle two-level electronics control rooms not found in any other program.

  1. Land use--energy simulation model: a computer-based model for exploring land use and energy relationships

    Energy Technology Data Exchange (ETDEWEB)

    Carroll, T.O.; Kydes, A.S.; Sanborn, J.

    1977-06-01

    There is no doubt that major savings in future regional energy expenditure can be achieved through the propitious allocation and configuration of land-use activities. The task of searching for and selecting strategies and measures that will bring about energy conservation vis-a-vis land use becomes that of understanding and defining the relationships between sets of possible land-use activities in a given region and the resultant energy end-use demand. The outcome of the search is the determination of the relative impact of the strategies and measures upon both the regional and the national energy system. The Land Use-Energy Simulation Model, with integrated capability for generating energy demand, is an extension of the classic Lowry model. Such a model framework captures two essential features of the land use-energy utilization interaction: first, the spatial location of land-use activity is implicit, and second, transportation energy demand is determined as an integral part of the spatial configuration. The model is divided both conceptually and computationally into three parts: the land use model; a transportation submodel, which provides the work and shop trip distributions for the spatial allocation of activities within the land use submodel; and an energy submodel, which determines the energy demand from the land use configuration. Two specific applications of the computer model are described. First, the model was used to assess the energy demand of the Long Island region in New York. Second, the model was applied to study the generic relationships between energy utilization and urban form.

  2. Biomass Gasifier for Computer Simulation; Biomassa förgasare för Computer Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Hansson, Jens; Leveau, Andreas; Hulteberg, Christian [Nordlight AB, Limhamn (Sweden)

    2011-08-15

    This report is an effort to summarize the existing data on biomass gasifiers, as the authors have taken part in various projects aiming at computer simulations of systems that include biomass gasification. Reliable input data is paramount for any computer simulation, but so far there is no easily accessible biomass gasifier database available for this purpose. This study aims at benchmarking current and past gasifier systems in order to create a comprehensive database for computer simulation purposes. The result of the investigation is presented in a Microsoft Excel sheet, so that the user can easily implement the data in their specific model. In addition to providing simulation data, the technology is described briefly for every studied gasifier system. The primary pieces of information sought are temperatures, pressures, stream compositions and energy consumption. At present the resulting database contains 17 gasifiers, with one or more gasifiers within each of the gasification technology types normally discussed in this context: (1) fixed bed, (2) fluidised bed, and (3) entrained flow. It also contains gasifiers in the range from 100 kW to 120 MW, with several gasifiers in between these two values. Finally, there are gasifiers representing both direct and indirect heating. This allows for a more qualified and better-informed choice of starting data sets for simulations. In addition, with multiple data sets available for several of the operating modes, sensitivity analysis of various inputs will improve the simulations performed. There were, however, fewer answers to the survey than expected or hoped for, which could have improved the database further; the use of online sources and other public information has to some extent counterbalanced the low response frequency of the survey. The database is intended to be a living document, continuously updated with new gasifiers and improved information on existing gasifiers.

  3. Computer Simulation Study of Human Locomotion with a Three-Dimensional Entire-Body Neuro-Musculo-Skeletal Model

    Science.gov (United States)

    Hase, Kazunori; Obinata, Goro

    It is essential for the biomechanical study of human walking motion to consider not only in vivo mechanical load and energy efficiency but also aspects of motor control such as walking stability. In this study, walking stability was investigated in computer simulation using a three-dimensional entire-body neuro-musculo-skeletal model. In the computational experiments, imaginary constraints, such as the absence of a muscular system, were imposed on the neuro-musculo-skeletal model to investigate their influence on walking stability. The neuronal parameters were adjusted using numerical search techniques in order to adapt the walking patterns to the constraints on the neuro-musculo-skeletal system. Simulation results revealed that the model of the normal neuro-musculo-skeletal system yielded higher stability than the imaginary models. Unstable walking by a model with a time delay in the neuronal system suggested significant unknown stabilizing mechanisms that have been neglected in previous studies.

  4. Massively parallel quantum computer simulator

    NARCIS (Netherlands)

    De Raedt, K.; Michielsen, K.; De Raedt, H.; Trieu, B.; Arnold, G.; Richter, M.; Lippert, Th.; Watanabe, H.; Ito, N.

    2007-01-01

    We describe portable software to simulate universal quantum computers on massively parallel computers. We illustrate the use of the simulation software by running various quantum algorithms on different computer architectures, such as an IBM BlueGene/L, an IBM Regatta p690+, a Hitachi SR11000/J1, and a Cray
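
    At its core, such a simulator stores the 2^n complex amplitudes of the register and applies gates as sparse linear maps; the memory for that vector is what forces massive parallelism. A single-node Python sketch of the idea (an illustration only; the paper's software partitions this state across many processors):

        import numpy as np

        n = 3
        state = np.zeros(2**n, dtype=complex)
        state[0] = 1.0                           # |000>

        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

        def apply_1q(state, gate, target, n):
            """Apply a single-qubit gate to `target` (0 = leftmost qubit)."""
            psi = state.reshape([2] * n)
            psi = np.moveaxis(psi, target, 0)
            psi = np.tensordot(gate, psi, axes=([1], [0]))
            return np.moveaxis(psi, 0, target).reshape(-1)

        for q in range(n):                       # H on every qubit gives the
            state = apply_1q(state, H, q, n)     # uniform superposition
        print(np.round(state, 3))                # all amplitudes 1/sqrt(8)

    Every additional qubit doubles the state vector, so a ~40-qubit register already requires terabytes of memory, which is why architectures like BlueGene/L are the natural target.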

  5. LIAR: A COMPUTER PROGRAM FOR THE SIMULATION AND MODELING OF HIGH PERFORMANCE LINACS

    Energy Technology Data Exchange (ETDEWEB)

    Adolphsen, Chris

    2003-05-01

    The computer program LIAR ("LInear Accelerator Research code") is a numerical simulation and tracking program for linear colliders. The LIAR project was started at SLAC in August 1995 in order to provide a computing and simulation tool that specifically addresses the needs of high-energy linear colliders. LIAR is designed to be used for a variety of different linear accelerators. It has been applied to, and checked against, the existing Stanford Linear Collider (SLC) as well as the linacs of the proposed Next Linear Collider (NLC) and the proposed Linac Coherent Light Source (LCLS). The program includes wakefield effects, a 4D coupled beam description, specific optimization algorithms and other advanced features. We describe the most important concepts and highlights of the program. After having presented the LIAR program at the LINAC96 and PAC97 conferences, we now introduce it to the European particle accelerator community.

  6. Multiscale paradigms in integrated computational materials science and engineering materials theory, modeling, and simulation for predictive design

    CERN Document Server

    Runge, Keith; Muralidharan, Krishna

    2016-01-01

    This book presents cutting-edge concepts, paradigms, and research highlights in the field of computational materials science and engineering, and provides a fresh, up-to-date perspective on solving present and future materials challenges. The chapters are written by not only pioneers in the fields of computational materials chemistry and materials science, but also experts in multi-scale modeling and simulation as applied to materials engineering. Pedagogical introductions to the different topics and continuity between the chapters are provided to ensure the appeal to a broad audience and to address the applicability of integrated computational materials science and engineering for solving real-world problems.

  7. The updated algorithm of the Energy Consumption Program (ECP): A computer model simulating heating and cooling energy loads in buildings

    Science.gov (United States)

    Lansing, F. L.; Strain, D. M.; Chai, V. W.; Higgins, S.

    1979-01-01

    The Energy Consumption Computer Program was developed to simulate building heating and cooling loads and to compute thermal and electric energy consumption and cost. This article reports on the new algorithms and modifications made in an effort to widen the areas of application. The program structure was rewritten accordingly to refine and advance the building model and to further reduce the processing time and cost. The program is noted for its very low cost and ease of use compared with other available codes. The accuracy of the computations is not sacrificed, however, since the results are expected to lie within ±10% of actual energy meter readings.

  8. Blood flow in intracranial aneurysms treated with Pipeline embolization devices: computational simulation and verification with Doppler ultrasonography on phantom models

    Directory of Open Access Journals (Sweden)

    Anderson Chun On Tsang

    2015-04-01

    Purpose: The aim of this study was to validate a computational fluid dynamics (CFD) simulation of flow-diverter treatment through Doppler ultrasonography measurements in patient-specific models of intracranial bifurcation and side-wall aneurysms. Methods: Computational and physical models of patient-specific bifurcation and side-wall aneurysms were constructed from computed tomography angiography with use of stereolithography, a three-dimensional printing technology. Flow dynamics parameters before and after flow-diverter treatment were measured with pulse-wave and color Doppler ultrasonography, and then compared with CFD simulations. Results: CFD simulations showed drastic flow reduction after flow-diverter treatment in both aneurysms. The mean volume flow rate decreased by 90% and 85% for the bifurcation aneurysm and the side-wall aneurysm, respectively. Velocity contour plots from computer simulations before and after flow diversion closely resembled the patterns obtained by color Doppler ultrasonography. Conclusion: The CFD estimation of flow reduction in aneurysms treated with a flow-diverting stent was verified by Doppler ultrasonography in patient-specific phantom models of bifurcation and side-wall aneurysms. The combination of CFD and ultrasonography may constitute a feasible and reliable technique for studying the treatment of intracranial aneurysms with flow-diverting stents.

  9. Simulation of worms transmission in computer network based on SIRS fuzzy epidemic model

    Science.gov (United States)

    Darti, I.; Suryanto, A.; Yustianingsih, M.

    2015-03-01

    In this paper we study numerically the behavior of worms transmission in a computer network. The model of worms transmission is derived by modifying a SIRS epidemic model. In this case, we consider that the transmission rate, the recovery rate and the rate of return to susceptibility after recovery follow fuzzy membership functions rather than being constants. To study the transmission of worms in a computer network, we solve the model using the fourth-order Runge-Kutta method. Our numerical results show that the fuzzy transmission rate and fuzzy recovery rate may change the basic reproduction number, and therefore also the stability properties of the equilibrium points.
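
    As an illustration of the numerical scheme this record describes, the sketch below integrates a deterministic SIRS system with the classical fourth-order Runge-Kutta method. The rate constants are illustrative placeholders, not the paper's fuzzy membership functions, which would make the rates functions of the infection level.

```python
import numpy as np

def sirs_rhs(y, beta, gamma, xi):
    """Right-hand side of a SIRS worm-transmission model:
    S' = -beta*S*I + xi*R,  I' = beta*S*I - gamma*I,  R' = gamma*I - xi*R."""
    S, I, R = y
    return np.array([-beta*S*I + xi*R,
                      beta*S*I - gamma*I,
                      gamma*I - xi*R])

def rk4_step(y, dt, *params):
    """One classical fourth-order Runge-Kutta step."""
    k1 = sirs_rhs(y, *params)
    k2 = sirs_rhs(y + 0.5*dt*k1, *params)
    k3 = sirs_rhs(y + 0.5*dt*k2, *params)
    k4 = sirs_rhs(y + dt*k3, *params)
    return y + dt*(k1 + 2*k2 + 2*k3 + k4)/6.0

# Illustrative (not the paper's) rates; in the fuzzy model beta, gamma and xi
# would be evaluated from membership functions rather than held constant.
beta, gamma, xi = 0.5, 0.2, 0.05      # R0 = beta/gamma = 2.5 > 1: endemic
y = np.array([0.95, 0.05, 0.0])       # initial S, I, R fractions
for _ in range(10000):
    y = rk4_step(y, 0.01, beta, gamma, xi)
print("long-run S, I, R ≈", y)
```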

  10. First Steps in Computational Systems Biology: A Practical Session in Metabolic Modeling and Simulation

    Science.gov (United States)

    Reyes-Palomares, Armando; Sanchez-Jimenez, Francisca; Medina, Miguel Angel

    2009-01-01

    A comprehensive understanding of biological functions requires new systemic perspectives, such as those provided by systems biology. Systems biology approaches are hypothesis-driven and involve iterative rounds of model building, prediction, experimentation, model refinement, and development. Developments in computer science are allowing for ever…

  11. Computational Analysis and Simulation of Empathic Behaviors: a Survey of Empathy Modeling with Behavioral Signal Processing Framework.

    Science.gov (United States)

    Xiao, Bo; Imel, Zac E; Georgiou, Panayiotis; Atkins, David C; Narayanan, Shrikanth S

    2016-05-01

    Empathy is an important psychological process that facilitates human communication and interaction. Enhancement of empathy has profound significance in a range of applications. In this paper, we review emerging directions of research on the computational analysis of empathy expression and perception, as well as empathic interactions, including their simulation. We summarize the work on empathic expression analysis by the targeted signal modalities (e.g., text, audio, and facial expressions). We categorize empathy simulation studies into theory-based emotion space modeling and application-driven user and context modeling. We summarize challenges in the computational study of empathy, including the conceptual framing and understanding of empathy, data availability, the appropriate use and validation of machine learning techniques, and behavioral signal processing. Finally, we propose a unified view of empathy computation and offer a series of open problems for future research.

  12. Computational modeling to predict mechanical function of joints: application to the lower leg with simulation of two cadaver studies.

    Science.gov (United States)

    Liacouras, Peter C; Wayne, Jennifer S

    2007-12-01

    Computational models of musculoskeletal joints and limbs can provide useful information about joint mechanics. Validated models can be used as predictive devices for understanding joint function and serve as clinical tools for predicting the outcome of surgical procedures. A new computational modeling approach was developed for simulating joint kinematics that are dictated by bone/joint anatomy, ligamentous constraints, and applied loading. Three-dimensional computational models of the lower leg were created to illustrate the application of this new approach. Model development began with generating three-dimensional surfaces of each bone from CT images, which were then imported into the three-dimensional solid modeling software SOLIDWORKS and the motion simulation package COSMOSMOTION. Through SOLIDWORKS and COSMOSMOTION, each bone surface file was filled to create a solid object and positioned, the necessary components were added, and the simulations were executed. Three-dimensional contacts were added to inhibit intersection of the bones during motion. Ligaments were represented as linear springs. Model predictions were then validated by comparison to two different cadaver studies: syndesmotic injury and repair, and ankle inversion following ligament transection. The syndesmotic injury model was able to predict tibial rotation, fibular rotation, and anterior/posterior displacement. In the inversion simulation, calcaneofibular ligament extension and angles of inversion compared well. Some experimental data proved harder to simulate accurately, due to certain software limitations and a lack of complete experimental data. Other parameters that could not be easily obtained experimentally can be predicted and analyzed by the computational simulations. In the syndesmotic injury study, the force generated in the tibionavicular and calcaneofibular ligaments reduced with the insertion of the staple, indicating how this repair technique changes joint function. After transection of the calcaneofibular

  13. Theoretic model and computer simulation of separating mixture metal particles from waste printed circuit board by electrostatic separator.

    Science.gov (United States)

    Li, Jia; Xu, Zhenming; Zhou, Yaohe

    2008-05-30

    Traditionally, the mixed metals from waste printed circuit boards (PCBs) were sent to a smelting plant to refine pure copper. Some valuable metals (aluminum, zinc and tin) present at low content in PCBs were lost during smelting. A new method that uses a roll-type electrostatic separator (RES) to recover low-content metals from waste PCBs is presented in this study. The theoretical model, established from a computation of the electric field and an analysis of the forces on the particles, was used to write a program in the MATLAB language. The program was designed to simulate the process of separating mixed metal particles. Electrical, material and mechanical factors were analyzed to optimize the operating parameters of the separator. The experimental results of separating copper and aluminum particles by RES agreed well with the computer simulation results. The model can be used to simulate the separation of other metal (tin, zinc, etc.) particles in the process of recycling waste PCBs by RES.

  14. Inversion based on computational simulations

    Energy Technology Data Exchange (ETDEWEB)

    Hanson, K.M.; Cunningham, G.S.; Saquib, S.S.

    1998-09-01

    A standard approach to solving inversion problems that involve many parameters uses gradient-based optimization to find the parameters that best match the data. The authors discuss enabling techniques that facilitate application of this approach to large-scale computational simulations, which are the only way to investigate many complex physical phenomena. Such simulations may not seem to lend themselves to calculation of the gradient with respect to numerous parameters. However, adjoint differentiation allows one to efficiently compute the gradient of an objective function with respect to all the variables of a simulation. When combined with advanced gradient-based optimization algorithms, adjoint differentiation permits one to solve very large problems of optimization or parameter estimation. These techniques will be illustrated through the simulation of the time-dependent diffusion of infrared light through tissue, which has been used to perform optical tomography. The techniques discussed have a wide range of applicability to modeling including the optimization of models to achieve a desired design goal.
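
    The enabling idea, adjoint differentiation, can be shown on a toy time-stepping simulation: a single reverse sweep yields the gradient of an objective with respect to a simulation parameter at roughly the cost of one extra forward run. This is a minimal sketch under assumed dynamics (explicit Euler for u' = -k*u), not the optical tomography code the authors describe; the gradient is checked against a finite difference.

```python
import numpy as np

def forward(k, u0, dt, nsteps):
    """Explicit-Euler simulation of u' = -k*u; returns the full trajectory."""
    u = np.empty(nsteps + 1)
    u[0] = u0
    for n in range(nsteps):
        u[n+1] = u[n] * (1.0 - k*dt)
    return u

def grad_adjoint(k, u0, dt, nsteps, data):
    """dJ/dk for J = (u_N - data)**2 via a single reverse (adjoint) sweep."""
    u = forward(k, u0, dt, nsteps)
    lam = 2.0 * (u[-1] - data)         # dJ/du_N seeds the adjoint variable
    dJdk = 0.0
    for n in reversed(range(nsteps)):
        dJdk += lam * (-dt * u[n])     # d(u_{n+1})/dk = -dt*u_n
        lam *= (1.0 - k*dt)            # d(u_{n+1})/du_n carries lam backwards
    return dJdk

k, u0, dt, N, data = 0.7, 1.0, 0.01, 100, 0.5
J = lambda kk: (forward(kk, u0, dt, N)[-1] - data)**2
eps = 1e-6
print(grad_adjoint(k, u0, dt, N, data))            # adjoint gradient
print((J(k + eps) - J(k - eps)) / (2.0 * eps))     # finite-difference check
```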

  15. Computer simulation models relevant to ground water contamination from EOR or other fluids - state-of-the-art

    Energy Technology Data Exchange (ETDEWEB)

    Kayser, M.B.; Collins, A.G.

    1986-03-01

    Ground water contamination is a serious national problem. The use of computers to simulate the behavior of fluids in the subsurface has proliferated extensively over the last decade. Numerical models are being used to solve water supply problems, various kinds of energy production problems, and ground water contamination problems. Modeling techniques have progressed to the point that their accuracy is limited only by the modeller's ability to describe the reservoir in question and the heterogeneities therein. Pursuant to the Task and Milestone Update of Project BE3A, this report summarizes the state of the art of computer simulation models relevant to contamination of ground water by enhanced oil recovery (EOR) chemicals and/or waste fluids. 150 refs., 6 tabs.

  16. Event-based computer simulation model of aspect-type experiments strictly satisfying Einstein's locality conditions

    NARCIS (Netherlands)

    De Raedt, Hans; De Raedt, Koen; Michielsen, Kristel; Keimpema, Koenraad; Miyashita, Seiji

    2007-01-01

    Inspired by Einstein-Podolsky-Rosen-Bohm experiments with photons, we construct an event-based simulation model in which every essential element in the ideal experiment has a counterpart. The model satisfies Einstein's criterion of local causality and does not rely on concepts of quantum and probab

  17. Design, simulation, and experimental verification of a computer model and enhanced position estimator for the NPS AUV II

    OpenAIRE

    Warner, David C.

    1991-01-01

    A full six-degree-of-freedom computer model of the Naval Postgraduate School Autonomous Underwater Vehicle (NPS AUV II) is developed. Hydrodynamic Coefficients are determined by geometric similarity with an existing swimmer delivery vehicle and analysis of initial open loop AUV II trials. Comparisons between simulated and experimental results demonstrate the validity of the model and the techniques used. A reduced order observer of lateral velocity was produced to provide an input for an enha...

  18. Modelling and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Casetti, E.; Vogt, W.G.; Mickle, M.H.

    1984-01-01

    This conference includes papers on the uses of supercomputers, multiprocessors, artificial intelligence and expert systems in various energy applications. Topics considered include knowledge-based expert systems for power engineering, a solar air conditioning laboratory computer system, multivariable control systems, the impact of power system disturbances on computer systems, simulating shared-memory parallel computers, real-time image processing with multiprocessors, and network modeling and simulation of greenhouse solar systems.

  19. Speed Spatial Distribution Models for Traffic Accident Section of Freeway Based on Computer Simulation

    Institute of Scientific and Technical Information of China (English)

    Decai Li; Jiangwei Chu; Wenhui Zhang; Xiaojuan Wang; Guosheng Zhang

    2015-01-01

    Simulation models for accident sections on freeways are built in a microscopic traffic flow simulation environment. In these models, involving 2-lane, 3-lane and 4-lane freeways, one detector is set every 10 m to measure section running speed. From the simulation results, speed spatial distribution curves for traffic accident sections on freeways are drawn, which help to determine dangerous sections upstream of the accident section. Furthermore, a speed spatial distribution model is obtained for every speed distribution curve. The results provide a theoretical basis for determining the temporal and spatial influence ranges of a traffic accident, and offer a reference for the formulation of speed-limit schemes and other management measures.

  20. Factors influencing QTL mapping accuracy under complicated genetic models by computer simulation.

    Science.gov (United States)

    Su, C F; Wang, W; Gong, S L; Zuo, J H; Li, S J

    2016-12-19

    The accuracy of quantitative trait loci (QTLs) identified using different sample sizes and marker densities was evaluated in different genetic models. Model I assumed one additive QTL; Model II assumed three additive QTLs plus one pair of epistatic QTLs; and Model III assumed two additive QTLs with opposite genetic effects plus two pairs of epistatic QTLs. Recombinant inbred lines (RILs) (50-1500 samples) were simulated according to the models to study the influence of different sample sizes under different genetic models on QTL mapping accuracy. RILs with 10-100 target chromosome markers were simulated according to Models I and II to evaluate the influence of marker density on QTL mapping accuracy. Different marker densities did not significantly influence the accurate estimation of genetic effects in the simple additive model, but did influence QTL mapping accuracy in the additive and epistatic models. The optimum marker density was approximately 20 markers when the recombination fraction between two adjacent markers was 0.056 in the additive and epistatic models. A sample size of 150 was sufficient for detecting simple additive QTLs, whereas a sample size of approximately 450 is needed to detect QTLs with additive and epistatic models. The sample size must be approximately 750 to detect QTLs with additive, epistatic, and combined effects between QTLs, and should be increased beyond 750 if the genetic models of the data set become more complicated than Model III. Our results provide a theoretical basis for marker-assisted selection breeding and molecular design breeding.

  1. Computational human body models

    NARCIS (Netherlands)

    Wismans, J.S.H.M.; Happee, R.; Dommelen, J.A.W. van

    2005-01-01

    Computational human body models are widely used for automotive crashsafety research and design and as such have significantly contributed to a reduction of traffic injuries and fatalities. Currently crash simulations are mainly performed using models based on crash-dummies. However crash dummies dif

  3. Simple Urban Simulation Atop Complicated Models: Multi-Scale Equation-Free Computing of Sprawl Using Geographic Automata

    Directory of Open Access Journals (Sweden)

    Yu Zou

    2013-07-01

    Reconciling the competing desires to build urban models that can be both simple and complicated is something of a grand challenge for urban simulation. It also prompts difficulties in many urban policy situations, such as urban sprawl, where simple, actionable ideas may need to be considered in the context of the messily complex and complicated urban processes and phenomena that work within cities. In this paper, we present a novel architecture for achieving both simple and complicated realizations of urban sprawl in simulation. Fine-scale simulations of sprawl geography are run using geographic automata to represent the geographical drivers of sprawl in intricate detail and at fine resolutions of space and time. We use Equation-Free computing to deploy population as a coarse observable of sprawl, which can be leveraged to run the automata-based models as short-burst experiments within a meta-simulation framework.
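
    A minimal sketch of the Equation-Free pattern the paper deploys: advance the fine-scale model in short bursts, estimate the time derivative of a coarse observable (here, a population count), then take a large projective step. The noisy logistic micro-model below is only a stand-in for the geographic automata, and all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def micro_burst(pop, t_burst, dt):
    """Toy fine-scale model standing in for the geographic automata:
    noisy logistic growth advanced for a short burst; returns the trajectory."""
    steps = int(t_burst / dt)
    traj = np.empty(steps + 1)
    traj[0] = pop
    for i in range(steps):
        drift = 0.5 * traj[i] * (1.0 - traj[i] / 100.0)
        traj[i+1] = traj[i] + dt * drift + 0.05 * rng.normal()
    return traj

pop, t, t_burst, leap = 5.0, 0.0, 0.5, 2.0
for _ in range(20):
    traj = micro_burst(pop, t_burst, dt=0.01)
    times = np.linspace(0.0, t_burst, traj.size)
    slope = np.polyfit(times, traj, 1)[0]   # coarse derivative from the burst
    pop += leap * slope                     # projective (coarse) time step
    t += t_burst + leap
print(f"coarse population estimate at t = {t:.1f}: {pop:.1f}")  # approaches 100
```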

  4. TCM-1: a nonlinear dynamical computational model to simulate cellular changes in the T cell system; conceptional design and validation.

    Science.gov (United States)

    Krueger, Gerhard R F; Brandt, Michael E; Wang, Guanyu; Buja, L Maximilian

    2003-01-01

    Based upon a previously developed theory of dysregulative lymphoma pathogenesis, a computer model was designed to simulate the cell changes occurring in disturbances of the T cell immune system and in lymphoproliferative diseases. The model is based upon the concept that factors identified as proliferation factors, differentiation factors and inhibition factors exert a network regulation upon the development and function of the T cell system, and that selective disturbances of these factors may lead to hyperplastic, aplastic or neoplastic diseases. The resulting computer model (TCM-1) was validated by comparing simulation results with actual cell data from patients with human diseases such as acute HHV-6 (viral) infection, chronic persistent HHV-6 infection, progressive HIV-1 infection and HTLV-1 infection. All these infections target the same T cell population (i.e. CD4+ T helper cells), yet cause different prototypical reactions (hyperplastic, aplastic, neoplastic). The described computer model, which was successfully used to simulate changes in the benign lymphoproliferative disease Canale-Smith syndrome, will serve as the base model for further supplementation to accommodate identified factorial influences such as those of cytokines, chemokines and others.

  5. A high performance computing framework for physics-based modeling and simulation of military ground vehicles

    Science.gov (United States)

    Negrut, Dan; Lamb, David; Gorsich, David

    2011-06-01

    This paper describes a software infrastructure made up of tools and libraries designed to assist developers in implementing computational dynamics applications running on heterogeneous and distributed computing environments. Together, these tools and libraries compose a so-called Heterogeneous Computing Template (HCT). The heterogeneous and distributed computing hardware infrastructure is assumed herein to be made up of a combination of CPUs and Graphics Processing Units (GPUs). The computational dynamics applications targeted to execute on such a hardware topology include many-body dynamics, smoothed-particle hydrodynamics (SPH) fluid simulation, and fluid-solid interaction analysis. The underlying theme of the solution approach embraced by HCT is that of partitioning the domain of interest into a number of subdomains that are each managed by a separate core/accelerator (CPU/GPU) pair. Several components at the core of HCT enable the envisioned distributed computing approach to large-scale dynamical system simulation: (a) the ability to partition the problem according to the one-to-one mapping, i.e., the spatial subdivision discussed above (pre-processing); (b) a protocol for passing data between any two co-processors; (c) algorithms for element proximity computation; and (d) the ability to carry out post-processing in a distributed fashion. In this contribution, components (a) and (b) of the HCT are demonstrated via the example of the Discrete Element Method (DEM) for rigid body dynamics with friction and contact. The collision detection task required in frictional-contact dynamics (task (c) above) is shown to benefit from a two-order-of-magnitude efficiency gain on the GPU when compared with traditional sequential implementations. Note: Reference herein to any specific commercial products, process, or service by trade name, trademark, manufacturer, or otherwise, does not imply its endorsement, recommendation, or favoring by the United States Army. The views and

  6. Computer Simulation and Computability of Biological Systems

    CERN Document Server

    Baianu, I C

    2004-01-01

    The ability to simulate a biological organism by employing a computer is related to the ability of the computer to calculate the behavior of such a dynamical system, or the "computability" of the system. However, the two questions of computability and simulation are not equivalent. Since the question of computability can be given a precise answer in terms of recursive functions, automata theory and dynamical systems, it will be appropriate to consider it first. The more elusive question of adequate simulation of biological systems by a computer will then be addressed, and a possible connection between the two answers will be considered as follows. A symbolic, algebraic-topological "quantum computer" (as introduced in Baianu, 1971b) is here suggested to provide one such potential means for adequate biological simulations based on QMV Quantum Logic and meta-categorical modeling, as for example in a QMV-based Quantum Topos (Baianu and Glazebrook, 2004).

  7. Reducing the computational requirements for simulating tunnel fires by combining multiscale modelling and multiple processor calculation

    DEFF Research Database (Denmark)

    Vermesi, Izabella; Rein, Guillermo; Colella, Francesco

    2017-01-01

    directly. The feasibility analysis showed a difference of only 2% in temperature results from the published reference work that was performed with Ansys Fluent (Colella et al., 2010). The reduction in simulation time was significantly larger when using multiscale modelling than when performing multiple...

  8. Modeling of tool-tissue interactions for computer-based surgical simulation: a literature review

    NARCIS (Netherlands)

    Misra, Sarthak; Ramesh, K.T.; Okamura, Allison M.

    2008-01-01

    Surgical simulators present a safe and potentially effective method for surgical training, and can also be used in robot-assisted surgery for pre- and intra-operative planning. Accurate modeling of the interaction between surgical instruments and organs has been recognized as a key requirement in th

  9. SCIPR: A Computational Model to Simulate Cultural Identities for Predicting Reactions to Events

    Science.gov (United States)

    2008-06-01

    2004; Hogg, Sherman, Dierselhuis, Maitner, & Moffit, 2007) and by communicating with other people to find out their opinions (Festinger, 1954).

  10. Proactive monitoring and adaptive management of social carrying capacity in Arches National Park: an application of computer simulation modeling.

    Science.gov (United States)

    Lawson, Steven R; Manning, Robert E; Valliere, William A; Wang, Benjamin

    2003-07-01

    Public visits to parks and protected areas continue to increase and may threaten the integrity of natural and cultural resources and the quality of the visitor experience. Scientists and managers have adopted the concept of carrying capacity to address the impacts of visitor use. In the context of outdoor recreation, the social component of carrying capacity refers to the level of visitor use that can be accommodated in parks and protected areas without diminishing the quality of the visitor experience to an unacceptable degree. This study expands and illustrates the use of computer simulation modeling as a tool for proactive monitoring and adaptive management of social carrying capacity at Arches National Park. A travel simulation model of daily visitor use throughout the Park's road and trail network and at selected attraction sites was developed, and simulations were conducted to estimate a daily social carrying capacity for Delicate Arch, an attraction site in Arches National Park, and for the Park as a whole. Further, a series of simulations were conducted to estimate the effect of a mandatory shuttle bus system on daily social carrying capacity of Delicate Arch to illustrate how computer simulation modeling can be used as a tool to facilitate adaptive management of social carrying capacity.

  11. Simulation Modeling of Lakes in Undergraduate and Graduate Classrooms Increases Comprehension of Climate Change Concepts and Experience with Computational Tools

    Science.gov (United States)

    Carey, Cayelan C.; Gougis, Rebekka Darner

    2017-02-01

    Ecosystem modeling is a critically important tool for environmental scientists, yet is rarely taught in undergraduate and graduate classrooms. To address this gap, we developed a teaching module that exposes students to a suite of modeling skills and tools (including computer programming, numerical simulation modeling, and distributed computing) that students apply to study how lakes around the globe are experiencing the effects of climate change. In the module, students develop hypotheses about the effects of different climate scenarios on lakes and then test their hypotheses using hundreds of model simulations. We taught the module in a 4-hour workshop and found that participation in the module significantly increased both undergraduate and graduate students' understanding about climate change effects on lakes. Moreover, participation in the module also significantly increased students' perceived experience level in using different software, technologies, and modeling tools. By embedding modeling in an environmental science context, non-computer science students were able to successfully use and master technologies that they had previously never been exposed to. Overall, our findings suggest that modeling is a powerful tool for catalyzing student learning on the effects of climate change.

  13. Does polar interaction influence medium viscosity? A computer simulation investigation using model liquids

    Indian Academy of Sciences (India)

    Snehasis Daschakraborty; Ranjit Biswas

    2012-07-01

    Molecular dynamics simulations of model liquids interacting via Lennard-Jones (LJ) and Stockmayer (SM) interactions have been carried out to explore the effects of the longer-ranged dipole-dipole interaction on solvent viscosity and diffusion. Switching on the dipolar interaction at a fixed density and temperature has been found to increase the viscosity over that of the LJ liquid, the extent of the increase ranging from a few percent to as much as ∼60% depending on the magnitude of the solvent dipole moment used in the SM potential. The simulated translational and rotational diffusion coefficients show strong dipole moment and temperature dependences, even though the effects of these parameters on the solvent-solvent radial distribution function are moderate. Interestingly, a partial solute-solvent decoupling is observed when the simulated translational and rotational diffusion coefficients are connected to the simulated viscosity coefficients via the Stokes-Einstein (SE) and Stokes-Einstein-Debye (SED) relations. In the limit of large dipole moment, the simulated self-part of the van Hove correlation function at intermediate times reveals a departure from the Gaussian distribution of particle displacements. This suggests that dynamic heterogeneity is one of the reasons for the departure of centre-of-mass diffusion from the SE relation in these model systems.
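
    The SE and SED relations referenced above tie the diffusion coefficients to the shear viscosity. The sketch below computes both hydrodynamic predictions and a coupling ratio; the water-like SI values are placeholders (the paper itself works with model liquids, typically in reduced units), and a ratio well above 1 would signal the partial decoupling the authors report.

```python
import math

kB = 1.380649e-23  # Boltzmann constant, J/K

def stokes_einstein(T, eta, R):
    """Translational diffusion from the SE relation (stick boundary):
    D_t = kB*T / (6*pi*eta*R)."""
    return kB * T / (6.0 * math.pi * eta * R)

def stokes_einstein_debye(T, eta, R):
    """Rotational diffusion from the SED relation:
    D_r = kB*T / (8*pi*eta*R**3)."""
    return kB * T / (8.0 * math.pi * eta * R**3)

# Placeholder, roughly water-like numbers: 300 K, 0.89 mPa*s, 2 Angstrom radius.
T, eta, R = 300.0, 0.89e-3, 2.0e-10
D_se = stokes_einstein(T, eta, R)
D_sim = 2.5e-9          # a hypothetical simulated value, m^2/s
print(f"SE prediction {D_se:.2e} m^2/s, simulated {D_sim:.2e} m^2/s")
print(f"coupling ratio D_sim/D_SE = {D_sim / D_se:.2f}")  # >1 suggests decoupling
```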

  14. Computer Simulation Modeling for Recreation Management: A Study on Carriage Road Use in Acadia National Park, Maine, USA.

    Science.gov (United States)

    WANG; MANNING

    1999-02-01

    The number of visits to outdoor recreation areas has increased dramatically in the last three decades, leading managers and researchers to wonder whether there is a limit to the amount of use a resource such as a park can accommodate. One of the difficulties in addressing this carrying-capacity question has been the complex nature of visitor travel patterns on often extensive networks of roads and trails. Systematic direct observation is often impractical and anecdotal information is usually inadequate. This study explores the utility of computer simulation as a tool for describing visitor travel by building a dynamic model of visitor travel on the carriage roads of Acadia National Park, Maine, USA. The simulation model uses empirical inputs such as travel routes and travel speeds to generate simulated recreation days on the carriage roads. Data on persons-per-viewscape (PPV) conditions were then gathered from multiple model runs and incorporated into the National Park Service's visitor experience and resource protection (VERP) planning process. Results show that PPV conditions under present-day use levels do not violate proposed standards of quality. Results also show likely PPV conditions under scenarios of increasing use and in different areas within the carriage road system. Goodness-of-fit validity tests indicate the model is an accurate representation of the actual system. The findings of this study suggest that computer simulation is useful for estimating current carrying-capacity conditions, predicting future conditions, and guiding related research. KEY WORDS: Computer simulation modeling; Carrying capacity; Recreation management; Acadia National Park

  15. Grid computing and biomolecular simulation.

    Science.gov (United States)

    Woods, Christopher J; Ng, Muan Hong; Johnston, Steven; Murdock, Stuart E; Wu, Bing; Tai, Kaihsu; Fangohr, Hans; Jeffreys, Paul; Cox, Simon; Frey, Jeremy G; Sansom, Mark S P; Essex, Jonathan W

    2005-08-15

    Biomolecular computer simulations are now widely used not only in an academic setting to understand the fundamental role of molecular dynamics on biological function, but also in the industrial context to assist in drug design. In this paper, two applications of Grid computing to this area will be outlined. The first, involving the coupling of distributed computing resources to dedicated Beowulf clusters, is targeted at simulating protein conformational change using the Replica Exchange methodology. In the second, the rationale and design of a database of biomolecular simulation trajectories is described. Both applications illustrate the increasingly important role modern computational methods are playing in the life sciences.
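
    A sketch of the Replica Exchange step mentioned above: two replicas at neighbouring temperatures attempt to swap configurations with the standard Metropolis probability min(1, exp[(1/kT_i - 1/kT_j)(E_i - E_j)]). The energies and temperatures below are hypothetical.

```python
import math
import random

def swap_probability(E_i, E_j, T_i, T_j, kB=1.0):
    """Metropolis acceptance probability for exchanging the configurations
    of two replicas held at temperatures T_i and T_j."""
    delta = (1.0 / (kB * T_i) - 1.0 / (kB * T_j)) * (E_i - E_j)
    return min(1.0, math.exp(delta))

random.seed(1)
E_i, E_j = -105.0, -98.0   # hypothetical instantaneous potential energies
T_i, T_j = 300.0, 320.0    # neighbouring rungs of the temperature ladder
p = swap_probability(E_i, E_j, T_i, T_j)
accepted = random.random() < p
print(f"swap probability {p:.3f}, accepted: {accepted}")
```

    Accepted swaps let configurations trapped in local minima at low temperature escape via the high-temperature replicas, which is why the method helps sample protein conformational change.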

  16. Massive Parallel Quantum Computer Simulator

    CERN Document Server

    De Raedt, K; De Raedt, H; Ito, N; Lippert, T; Michielsen, K; Richter, M; Trieu, B; Watanabe, H; Lippert, Th.

    2006-01-01

    We describe portable software to simulate universal quantum computers on massively parallel computers. We illustrate the use of the simulation software by running various quantum algorithms on different computer architectures, such as an IBM BlueGene/L, an IBM Regatta p690+, a Hitachi SR11000/J1, a Cray X1E, an SGI Altix 3700 and clusters of PCs running Windows XP. We study the performance of the software by simulating quantum computers containing up to 36 qubits, using up to 4096 processors and up to 1 TB of memory. Our results demonstrate that the simulator exhibits nearly ideal scaling as a function of the number of processors and suggest that the simulation software described in this paper may also serve as a benchmark for testing high-end parallel computers.
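
    The core of such a simulator is a state vector of 2^n complex amplitudes (at 16 bytes per amplitude, 36 qubits already require about 1 TB, matching the figure above) together with gate application by tensor contraction. A minimal single-node sketch, without the parallel decomposition the paper's software uses:

```python
import numpy as np

def apply_single_qubit_gate(state, gate, target, n_qubits):
    """Apply a 2x2 gate to one qubit of an n-qubit state vector by reshaping
    the 2**n amplitudes into a rank-n tensor and contracting one axis."""
    psi = state.reshape([2] * n_qubits)
    psi = np.moveaxis(psi, target, 0)
    psi = np.tensordot(gate, psi, axes=(1, 0))
    psi = np.moveaxis(psi, 0, target)
    return psi.reshape(-1)

H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)  # Hadamard gate

n = 3
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0                                   # start in |000>
for q in range(n):                               # Hadamard on every qubit
    state = apply_single_qubit_gate(state, H, q, n)
print(np.round(np.abs(state)**2, 4))             # uniform over all 8 basis states
```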

  17. Applications of soft computing in time series forecasting simulation and modeling techniques

    CERN Document Server

    Singh, Pritpal

    2016-01-01

    This book reports on an in-depth study of fuzzy time series (FTS) modeling. It reviews and summarizes previous research work in FTS modeling and also provides a brief introduction to other soft-computing techniques, such as artificial neural networks (ANNs), rough sets (RS) and evolutionary computing (EC), focusing on how these techniques can be integrated into different phases of the FTS modeling approach. In particular, the book describes novel methods resulting from the hybridization of FTS modeling approaches with neural networks and particle swarm optimization. It also demonstrates how a new ANN-based model can be successfully applied in the context of predicting Indian summer monsoon rainfall. Thanks to its easy-to-read style and the clear explanations of the models, the book can be used as a concise yet comprehensive reference guide to fuzzy time series modeling, and will be valuable not only for graduate students, but also for researchers and professionals working for academic, business and governmen...

  18. Globalization and the Polish economy: stylized facts and simulations using a Computable General Equilibrium Model

    OpenAIRE

    Gradzewicz, Michał; Hagemejer, Jan; Żółkiewski, Zbigniew

    2007-01-01

    The aim of the paper is to quantitatively assess the impact of globalization on the economy of Poland in the medium term. Four channels of the impact of globalization are distinguished: (i) trade openness, (ii) productivity improvement, (iii) labour migrations, (iv) liberalization of the services sector. We employ a computable general equilibrium model with multiple industries and households and imperfect competition features. Our results show positive and quite significant effects of globali...

  19. Modeling and simulation of membrane separation process using computational fluid dynamics

    Directory of Open Access Journals (Sweden)

    Kambiz Tahvildari

    2016-01-01

    Separation of CO2 from air was simulated in this work. The process considered for the removal of CO2 was a hollow-fiber membrane contactor with an aqueous solution of 2-amino-2-methyl-1-propanol (AMP) as the absorbent. The model was developed based on mass transfer as well as the chemical reaction between CO2 and the solvent in the contactor. The model equations were solved using the finite element method. Simulation results were compared with experimental data, and good agreement was observed. The results revealed that increasing the solvent velocity enhances the removal of CO2 in the hollow-fiber membrane contactor. Moreover, it was found that counter-current operation is more favorable for achieving the highest separation efficiency.

  20. Computational studies of biomembrane systems: Theoretical considerations, simulation models, and applications

    CERN Document Server

    Deserno, Markus; Paulsen, Harald; Peter, Christine; Schmid, Friederike

    2014-01-01

    This chapter summarizes several approaches combining theory, simulation and experiment that aim for a better understanding of phenomena in lipid bilayers and membrane protein systems, covering topics such as lipid rafts, membrane mediated interactions, attraction between transmembrane proteins, and aggregation in biomembranes leading to large superstructures such as the light harvesting complex of green plants. After a general overview of theoretical considerations and continuum theory of lipid membranes we introduce different options for simulations of biomembrane systems, addressing questions such as: What can be learned from generic models? When is it expedient to go beyond them? And what are the merits and challenges for systematic coarse graining and quasi-atomistic coarse grained models that ensure a certain chemical specificity?

  1. A computer simulation of the turbocharged turbo compounded diesel engine system: A description of the thermodynamic and heat transfer models

    Science.gov (United States)

    Assanis, D. N.; Ekchian, J. E.; Frank, R. M.; Heywood, J. B.

    1985-01-01

    A computer simulation of the turbocharged turbocompounded direct-injection diesel engine system was developed in order to study the performance characteristics of the total system as major design parameters and materials are varied. Quasi-steady flow models of the compressor, turbines, manifolds, intercooler, and ducting are coupled with a multicylinder reciprocator diesel model, where each cylinder undergoes the same thermodynamic cycle. The master cylinder model describes the reciprocator intake, compression, combustion and exhaust processes in sufficient detail to define the mass and energy transfers in each subsystem of the total engine system. Appropriate thermal loading models relate the heat flow through critical system components to material properties and design details. From this information, the simulation predicts the performance gains, and assesses the system design trade-offs which would result from the introduction of selected heat transfer reduction materials in key system components, over a range of operating conditions.

  2. Modeling and simulation of heat sinks for computer processors in COMSOL Multiphysics

    OpenAIRE

    2012-01-01

    In this study, the heat transfer of three desktop-computer heat sinks was analyzed. The objective of using these heat sinks is to avoid overheating of the computer's processing unit and, in turn, to reduce the corresponding loss in the unit's service time. The heat sinks were modeled using COMSOL Multiphysics with the actual dimensions of the devices, and heat generation was modeled with a point source. In the next step, the heat sink designs were modified to achieve a lower temperature in the hi...

  3. Study of the asynchronous traction drive's operating modes by computer simulation. Part 1: Problem formulation and computer model

    Directory of Open Access Journals (Sweden)

    Pavel KOLPAHCHYAN

    2015-06-01

    In this paper, the problems arising in the design of electric locomotives with asynchronous traction drive (with three-phase AC induction motors), including the debugging of control algorithms, are considered. The electrical circuit provides individual (per-axle) control of the traction motors. This allows the operational disconnection/connection of one or more axles in automatic mode, taking the actual load into account. A further objective is to evaluate the locomotive's energy efficiency under various control algorithms, and to study the dynamic processes in various modes of electric locomotive operation (starting and acceleration, traction, coasting, wheel-slide protection, etc.). To solve these problems, a complex computer model based on the representation of the AC traction drive as a controlled electromechanical system is developed in Part 1. The methods applied in modeling the traction drive elements (traction motors, power converters, control systems), as well as the mechanical part and the wheel-rail contact, are described. The control system provides individual control of the traction motors. Part 2 of the paper focuses on the results of modeling the dynamic processes in various modes of electric locomotive operation.

  4. Biomes computed from simulated climatologies

    National Research Council Canada - National Science Library

    Claussen, M; Esch, M

    1992-01-01

    The biome model of Prentice is used to predict global patterns of potential natural plant formations, or biomes, from climatologies simulated by ECHAM, a model used for climate simulations at the Max...

  5. In pursuit of an accurate spatial and temporal model of biomolecules at the atomistic level: a perspective on computer simulation

    Energy Technology Data Exchange (ETDEWEB)

    Gray, Alan [The University of Edinburgh, Edinburgh EH9 3JZ, Scotland (United Kingdom); Harlen, Oliver G. [University of Leeds, Leeds LS2 9JT (United Kingdom); Harris, Sarah A., E-mail: s.a.harris@leeds.ac.uk [University of Leeds, Leeds LS2 9JT (United Kingdom); Khalid, Syma; Leung, Yuk Ming [University of Southampton, Southampton SO17 1BJ (United Kingdom); Lonsdale, Richard [Max-Planck-Institut für Kohlenforschung, Kaiser-Wilhelm-Platz 1, 45470 Mülheim an der Ruhr (Germany); Philipps-Universität Marburg, Hans-Meerwein Strasse, 35032 Marburg (Germany); Mulholland, Adrian J. [University of Bristol, Bristol BS8 1TS (United Kingdom); Pearson, Arwen R. [University of Leeds, Leeds LS2 9JT (United Kingdom); University of Hamburg, Hamburg (Germany); Read, Daniel J.; Richardson, Robin A. [University of Leeds, Leeds LS2 9JT (United Kingdom); The University of Edinburgh, Edinburgh EH9 3JZ, Scotland (United Kingdom)

    2015-01-01

    The current computational techniques available for biomolecular simulation are described, and the successes and limitations of each are presented with reference to the experimental biophysical methods that they complement. Despite huge advances in the computational techniques available for simulating biomolecules at the quantum-mechanical, atomistic and coarse-grained levels, there is still a widespread perception amongst the experimental community that these calculations are highly specialist and are not generally applicable by researchers outside the theoretical community. In this article, the successes and limitations of biomolecular simulation and the further developments that are likely in the near future are discussed. A brief overview is also provided of the experimental biophysical methods that are commonly used to probe biomolecular structure and dynamics, and the accuracy of the information that can be obtained from each is compared with that from modelling. It is concluded that progress towards an accurate spatial and temporal model of biomacromolecules requires a combination of all of these biophysical techniques, both experimental and computational.

  6. Computational model and simulations of gas-liquid-solid three-phase interactions

    Science.gov (United States)

    Zhang, Lucy; Wang, Chu

    2013-11-01

    A computational technique to model three-phase (gas-liquid-solid) interactions is proposed in this study. The numerical algorithm couples a connectivity-free front-tracking method, which treats the gas-liquid multi-fluid interface, with the immersed finite element method, which treats fully coupled fluid-solid interactions. The numerical framework is based on a non-boundary-fitted meshing technique in which the background grid is fixed, so no mesh updating or re-meshing is required. An indicator function is used to distinguish the gas from the liquid, and the fluid (gas or liquid) from the solid. Several 2D and 3D validation cases are presented to demonstrate the accuracy and robustness of the method. Funding from NRC and the CCNI computational facility at Rensselaer Polytechnic Institute is gratefully acknowledged.

  7. A finite-rate model for oxygen-silica catalysis through computational chemistry simulation

    Science.gov (United States)

    Norman, Paul; Schwartzentruber, Thomas

    2012-11-01

    The goal of this work is to model the heterogeneous recombination of atomic oxygen on silica surfaces, which is of interest for accurately predicting the heating on vehicles traveling at hypersonic velocities. This is accomplished by creating a finite-rate catalytic model, which describes recombination from an atomistic perspective with a set of elementary gas-surface reactions. Fundamental to surface catalytic reactions are the chemical structures on the surface where recombination can occur. Using molecular dynamics simulations with the ReaxFF potential, we find that the chemical sites active in oxygen-atom recombination on silica surfaces consist of a small number of specific defects. The individual reactions in our finite-rate catalytic model are based on the possible outcomes of oxygen interaction with these defects. The parameters of the functional forms of the rates, including activation energies and pre-exponential factors, are found by carrying out molecular dynamics simulations of individual events. We find that the recombination coefficients predicted by the finite-rate catalytic model display an exponential dependence on temperature, in qualitative agreement with experiment between 1000 K and 1500 K. However, the ReaxFF potential requires reparametrization with new quantum chemical calculations specific to the reaction pathways presented in this work.
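
    The exponential temperature dependence reported above is characteristic of an Arrhenius-type rate law. The sketch below shows the form of such a recombination coefficient; gamma0 and E_a are placeholder values, not the paper's fitted parameters.

```python
import math

def recombination_coefficient(T, gamma0, E_a, kB=8.617e-5):
    """Arrhenius-type recombination coefficient
    gamma(T) = gamma0 * exp(-E_a / (kB*T)),
    with the activation energy E_a in eV and kB in eV/K."""
    return gamma0 * math.exp(-E_a / (kB * T))

# Placeholder parameters; print gamma over the temperature range quoted above.
for T in (1000.0, 1250.0, 1500.0):
    gamma = recombination_coefficient(T, gamma0=0.1, E_a=0.3)
    print(f"T = {T:6.0f} K  gamma = {gamma:.4f}")
```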

  8. Computational Model for the Neutronic Simulation of Pebble Bed Reactor’s Core Using MCNPX

    Directory of Open Access Journals (Sweden)

    J. Rosales

    2014-01-01

    Very high temperature reactor (VHTR) designs offer promising performance characteristics; they can provide sustainable energy, improved proliferation resistance, inherent safety, and high-temperature heat supply. These designs also promise operation to high burnup and large margins to fuel failure, with excellent fission product retention via the TRISO fuel design. The pebble bed reactor (PBR) is a gas-cooled high-temperature reactor design and a candidate for Generation IV nuclear energy systems. This paper describes the features of a detailed geometric computational model for PBR whole-core analysis using the MCNPX code. The model was validated using the HTR-10 benchmark. Results were compared with experimental data and with the calculations of other authors. In addition, a sensitivity analysis of several parameters that could have influenced the results and the accuracy of the model was performed.

  9. [Development of a computer program to simulate the predictions of the replaced elements model of Pavlovian conditioning].

    Science.gov (United States)

    Vogel, Edgar H; Díaz, Claudia A; Ramírez, Jorge A; Jarur, Mary C; Pérez-Acosta, Andrés M; Wagner, Allan R

    2007-08-01

    Despite the apparent simplicity of Pavlovian conditioning, research on its mechanisms has caused considerable debate, such as the dispute about whether associated stimuli are coded in an "elementistic" (a compound stimulus is equivalent to the sum of its components) or a "configural" (a compound stimulus is a unique exemplar) fashion. This controversy is evident in the abundant research on the contrasting predictions of elementistic and configural models. Recently, some mixed solutions have been proposed which, although they have the advantages of both approaches, are difficult to evaluate due to their complexity. This paper presents a computer program to conduct simulations of one mixed model (the replaced elements model, or REM). Instructions and examples are provided for using the simulator for research and educational purposes.
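
    For orientation, the elemental baseline that REM generalizes is the Rescorla-Wagner rule, in which every cue present on a trial is updated by a share of a common prediction error. The sketch below implements that elemental rule only, not REM itself (REM additionally replaces some of a cue's elements when the cue appears in compound); parameter values are illustrative.

```python
def rescorla_wagner(trials, alpha=0.3, lam=1.0):
    """Elemental associative learning: each present cue's strength changes by
    alpha * (outcome - summed prediction of all present cues).
    `trials` is a sequence of (set_of_present_cues, reinforced) pairs."""
    V = {}
    for cues, reinforced in trials:
        target = lam if reinforced else 0.0
        prediction = sum(V.get(c, 0.0) for c in cues)
        error = target - prediction
        for c in cues:
            V[c] = V.get(c, 0.0) + alpha * error
    return V

# Summation test: A+ and B+ trained separately, then the compound AB probed.
training = [({"A"}, True), ({"B"}, True)] * 20
V = rescorla_wagner(training)
print(V)
print("compound AB prediction:", V["A"] + V["B"])  # elemental model: ~2*lam
```

    A purely configural model would instead treat AB as a new stimulus with no trained strength; REM-style mixtures fall between these two predictions.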

  10. Simulation of Blast Loading on an Ultrastructurally-based Computational Model of the Ocular Lens

    Science.gov (United States)

    2014-10-01

    year have been the imaging ones (Task 6) and multiscale computational modeling (Tasks 1-3). For Task 6, undergraduate students Sai and Sri Rad ...

  11. Motor Vehicle Emission Modeling and Software Simulation Computing for Roundabout in Urban City

    Directory of Open Access Journals (Sweden)

    Haiwei Wang

    2013-01-01

    In urban road traffic systems, roundabouts are considered core traffic bottlenecks and a major source of vehicle emissions affecting the city environment. In this paper, we propose a traffic control and management method for relieving traffic jams and reducing emissions at roundabouts. A motor-vehicle testing platform and a VSP-based emission model were first established. Using the topology chart of the roundabout and microsimulation software, we calculated the instantaneous emission rates of different vehicles and the total vehicle emissions. We argue that the Integration-Model, combining traffic simulation and vehicle emission, can be used to calculate the instantaneous emission rates of different vehicles and the total vehicle emissions at the roundabout. Contrasting the exhaust emission results with and without signal control in this area at rush hour leads to the conclusion that optimized signal control can effectively reduce regional vehicle emissions. The proposed approach was subjected to a simulation and experiment involving an environmental assessment at Satellite Square, a roundabout in a medium-sized city in China. It was verified that signal control designed with knowledge engineering and the Integration-Model is a practical way to relieve traffic jams and reduce environmental pollution.
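
    VSP (vehicle specific power) estimates the instantaneous engine load per unit vehicle mass from speed, acceleration and road grade, and is then mapped to emission rates. The sketch below uses the widely cited generic light-duty coefficients (after Jimenez-Palacios); the paper's own calibration for its test fleet may differ.

```python
import math

def vsp_light_duty(v, a, grade=0.0):
    """Vehicle specific power in kW/tonne for a light-duty vehicle:
    VSP = v*(1.1*a + 9.81*sin(theta) + 0.132) + 0.000302*v**3,
    with v in m/s, a in m/s^2 and grade given as rise over run."""
    theta = math.atan(grade)
    return v * (1.1 * a + 9.81 * math.sin(theta) + 0.132) + 0.000302 * v**3

# Creeping in a roundabout queue vs. accelerating away from it:
print(f"creeping:     {vsp_light_duty(v=1.0, a=0.0):5.2f} kW/t")
print(f"accelerating: {vsp_light_duty(v=10.0, a=1.5):5.2f} kW/t")
```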

  12. A Gaussian mixture model based adaptive classifier for fNIRS brain-computer interfaces and its testing via simulation

    Science.gov (United States)

    Li, Zheng; Jiang, Yi-han; Duan, Lian; Zhu, Chao-zhe

    2017-08-01

    Objective. Functional near infra-red spectroscopy (fNIRS) is a promising brain imaging technology for brain-computer interfaces (BCI). Future clinical uses of fNIRS will likely require operation over long time spans, during which neural activation patterns may change. However, current decoders for fNIRS signals are not designed to handle changing activation patterns. The objective of this study is to test via simulations a new adaptive decoder for fNIRS signals, the Gaussian mixture model adaptive classifier (GMMAC). Approach. GMMAC can simultaneously classify and track activation pattern changes without the need for ground-truth labels. This adaptive classifier uses computationally efficient variational Bayesian inference to label new data points and update mixture model parameters, using the previous model parameters as priors. We test GMMAC in simulations in which neural activation patterns change over time and compare to static decoders and unsupervised adaptive linear discriminant analysis classifiers. Main results. Our simulation experiments show GMMAC can accurately decode under time-varying activation patterns: shifts of activation region, expansions of activation region, and combined contractions and shifts of activation region. Furthermore, the experiments show the proposed method can track the changing shape of the activation region. Compared to prior work, GMMAC performed significantly better than the other unsupervised adaptive classifiers on a difficult activation pattern change simulation: 99% versus <54% in two-choice classification accuracy. Significance. We believe GMMAC will be useful for clinical fNIRS-based brain-computer interfaces, including neurofeedback training systems, where operation over long time spans is required.
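
    For a flavour of the mixture-model machinery, the sketch below fits an unsupervised variational-Bayes Gaussian mixture to synthetic two-class features using scikit-learn. It is not the authors' GMMAC implementation; GMMAC additionally carries the fitted parameters forward as priors when each new block of unlabeled data arrives.

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
# Synthetic two-class, two-feature "fNIRS-like" data (illustrative only).
X = np.vstack([rng.normal([0.0, 0.0], 0.5, size=(200, 2)),
               rng.normal([2.0, 1.5], 0.5, size=(200, 2))])

# Variational-Bayes mixture: component labels are inferred, not supplied.
gmm = BayesianGaussianMixture(n_components=2, random_state=0).fit(X)
labels = gmm.predict(X)
print("inferred component means:\n", gmm.means_)
print("points assigned to each component:", np.bincount(labels))
# In an adaptive decoder, the means and covariances fitted here would seed
# the priors for the next block of data as activation patterns drift.
```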

  13. Energy loss and coronary flow simulation following hybrid stage I palliation: a hypoplastic left heart computational fluid dynamic model.

    Science.gov (United States)

    Shuhaiber, Jeffrey H; Niehaus, Justin; Gottliebson, William; Abdallah, Shaaban

    2013-08-01

    The theoretical differences in energy losses, as well as in coronary flow, with different band sizes for the branch pulmonary arteries (PA) in hypoplastic left heart syndrome (HLHS) remain unknown. Our objective was to develop a computational fluid dynamics (CFD) model to determine the energy losses and pulmonary-to-systemic flow rates for three different PA band sizes. Three-dimensional computer models of the hybrid procedure were constructed using the standard commercial CFD software packages Fluent and Gambit. The computer models were controlled for bilateral PA reduction to 25% (restrictive), 50% (intermediate) and 75% (loose) of the native branch pulmonary artery diameter. Velocity and pressure data were calculated throughout the heart geometry using the finite volume numerical method. Coronary flow was measured simultaneously with each model. Wall shear stress and the ratio of pulmonary-to-systemic volume flow rates were calculated. Computer simulations were compared at fixed points utilizing echocardiographic and catheter-based metric dimensions. Restricting the PA band to 25% of the diameter produced the greatest energy loss: 16.76% systolic and 24.91% diastolic, versus 7.36% systolic and 17.90% diastolic for loose banding. Restrictive PA bands also gave greater coronary flow than loose PA bands (50.2 vs 41.9 ml/min). Wall shear stress ranged from 3.75 Pascals with restrictive PA banding to 2.84 Pascals with loose banding. Intermediate PA banding at 50% of the diameter achieved a Qp/Qs closest to 1 (1.46 systolic and 0.66 diastolic) compared with loose or restrictive banding, without excess energy loss. CFD provides a unique platform to simulate the pressure, shear stress and energy losses of the hybrid procedure. PA banding at 50% provided a balanced pulmonary and systemic circulation with adequate coronary flow and without extra energy losses.

  14. Computer-Aided Simulation of Mastoidectomy

    Institute of Scientific and Technical Information of China (English)

    CHEN He-xin; MA Zhi-chao; Wang Zhang-feng; GUO Jie-bo; WEN Wei-ping; XU Geng

    2008-01-01

    Objective: To establish a three-dimensional model of the temporal bone using CT scan images for the study of temporal bone structures and the simulation of mastoidectomy procedures. Methods: CT scan images from 6 individuals (12 temporal bones) were used to reconstruct the Fallopian canal, internal auditory canal, cochlea, semicircular canals, sigmoid sinus, posterior fossa floor and jugular bulb on a computer platform. Their anatomical relations within the temporal bone were restored in the computed model. The same model was used to simulate mastoidectomy procedures. Results: The reconstructed computer model provided accurate and clear three-dimensional images of temporal bone structures. Simulation of mastoidectomy using these images provided procedural experience closely mimicking the real surgical procedure. Conclusion: Computer-aided three-dimensional reconstruction of temporal bone structures from CT scan images is a useful tool in surgical simulation and can aid surgical procedure planning.

  15. Computer simulations reveal complex distribution of haemodynamic forces in a mouse retina model of angiogenesis

    CERN Document Server

    Bernabeu, Miguel O; Jones, Martin; Nielsen, Jens H; Krüger, Timm; Nash, Rupert W; Groen, Derek; Hetherington, James; Gerhardt, Holger; Coveney, Peter V

    2013-01-01

    There is currently limited understanding of the role played by haemodynamic forces on the processes governing vascular development. One of many obstacles to be overcome is being able to measure those forces, at the required resolution level, on vessels only a few micrometres thick. In the current paper, we present an in silico method for the computation of the haemodynamic forces experienced by murine retinal vasculature (a widely used vascular development animal model) beyond what is measurable experimentally. Our results show that it is possible to reconstruct high-resolution three-dimensional geometrical models directly from samples of retinal vasculature and that the lattice-Boltzmann algorithm can be used to obtain accurate estimates of the haemodynamics in these domains. Our findings show that the flow patterns recovered are complex, that branches of predominant flow exist from early development stages, and that the pruning process tends to make the wall shear stress experienced by the capillaries incre...

  16. Computational modeling of the pressurization process in a NASP vehicle propellant tank experimental simulation

    Science.gov (United States)

    Sasmal, G. P.; Hochstein, J. I.; Wendl, M. C.; Hardy, T. L.

    1991-01-01

    A multidimensional computational model of the pressurization process in a slush hydrogen propellant storage tank was developed and its accuracy evaluated by comparison to experimental data measured for a 5 ft diameter spherical tank. The fluid mechanic, thermodynamic, and heat transfer processes within the ullage are represented by a finite-volume model. The model was shown to be in reasonable agreement with the experimental data. A parameter study was undertaken to examine the dependence of the pressurization process on initial ullage temperature distribution and pressurant mass flow rate. It is shown that for a given heat flux rate at the ullage boundary, the pressurization process is nearly independent of the initial temperature distribution. Significant differences were identified between the ullage temperature and velocity fields predicted for pressurization of slush and those predicted for pressurization of liquid hydrogen. A simplified model of the pressurization process was constructed in search of a dimensionless characterization of the pressurization process. It is shown that the relationship derived from this simplified model collapses all of the pressure history data generated during this study into a single curve.

  17. Computational study of nonlinear plasma waves: 1: Simulation model and monochromatic wave propagation

    Science.gov (United States)

    Matda, Y.; Crawford, F. W.

    1974-01-01

    An economical low noise plasma simulation model is applied to a series of problems associated with electrostatic wave propagation in a one-dimensional, collisionless, Maxwellian plasma, in the absence of magnetic field. The model is described and tested, first in the absence of an applied signal, and then with a small amplitude perturbation, to establish the low noise features and to verify the theoretical linear dispersion relation at wave energy levels as low as 0.000001 of the plasma thermal energy. The method is then used to study propagation of an essentially monochromatic plane wave. Results on amplitude oscillation and nonlinear frequency shift are compared with available theories. The additional phenomena of sideband instability and satellite growth, stimulated by large amplitude wave propagation and the resulting particle trapping, are described.

  18. Downscaling seasonal to centennial simulations on distributed computing infrastructures using WRF model. The WRF4G project

    Science.gov (United States)

    Cofino, A. S.; Fernández Quiruelas, V.; Blanco Real, J. C.; García Díez, M.; Fernández, J.

    2013-12-01

    Nowadays, Grid Computing is a powerful computational tool that is ready to be used by the scientific community in different areas (such as biomedicine, astrophysics, climate, etc.). However, the use of these distributed computing infrastructures (DCIs) is not yet common practice in climate research, and only a few teams and applications in this area take advantage of them. Thus, the WRF4G project objective is to popularize the use of this technology in the atmospheric sciences area. In order to achieve this objective, one of the most widely used applications has been taken (WRF, a limited-area model and successor of the MM5 model), which has a user community of more than 8000 researchers worldwide. This community develops its research activity in different areas and could benefit from the advantages of Grid resources (case study simulations, regional hind-cast/forecast, sensitivity studies, etc.). The WRF model is used by many groups in the climate research community to carry out downscaling simulations, so this community will also benefit. However, Grid infrastructures have some drawbacks for the execution of applications that make intensive use of CPU and memory for long periods of time. This makes it necessary to develop a specific framework (middleware) that encapsulates the application and provides appropriate services for the monitoring and management of the simulations and the data. Thus, another objective of the WRF4G project is the development of a generic adaptation of WRF to DCIs. It should simplify access to the DCIs for researchers and free them from the technical and computational aspects of using these DCIs. Finally, in order to demonstrate the ability of WRF4G to solve actual scientific challenges of interest and relevance to climate science (implying a high computational cost), we show results from different kinds of downscaling experiments, such as ERA-Interim re-analysis and CMIP5 models

  19. A computer simulation of a potential derived from the Gay-Berne potential for a lattice model

    Directory of Open Access Journals (Sweden)

    Habtamu Zewdie

    2000-06-01

    A lattice model of elongated molecules interacting via a potential derived from the Gay-Berne pair potential is proposed. We made a systematic study of the effect of varying the molecular elongation and the intermolecular-vector orientation dependence of the pair potential on the thermodynamic as well as the structural properties of liquid crystals. Monte Carlo simulations of molecules placed at the sites of a simple cubic lattice and interacting via the modified Gay-Berne potential with their nearest neighbours were performed. The internal energy, heat capacity, angular pair correlation function and scalar order parameter were obtained. The results are compared against predictions of molecular field theory, experimental results and those of other related simulations wherever possible. It is shown that for more elongated molecules the nematic-isotropic transition becomes a stronger first-order transition. For a given molecular elongation, as the intermolecular-vector orientation dependence becomes larger, the nematic-isotropic transition becomes a stronger first-order transition, as measured by the rate of change of the order parameter and the divergence of the heat capacity. Scaling the potential well seems to have a dramatic effect on how the potential well anisotropy influences the trends of the nematic-isotropic transition temperature and the divergence of the heat capacity. It is shown that the behaviour of many nematics can be described by the proposed model with the elongation ratio of molecules and the potential well anisotropy ranging from 3 to 5.
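
    The Monte Carlo procedure is not spelled out in this record; the following is a minimal Metropolis sketch for a closely related nearest-neighbour lattice model (a Lebwohl-Lasher-type P2 coupling rather than the authors' modified Gay-Berne potential), with all parameter values hypothetical.

        import numpy as np

        rng = np.random.default_rng(1)
        L, beta, eps = 8, 1.0, 1.0   # lattice size, inverse temperature, coupling

        # Unit vectors ("molecular axes") on an L x L x L simple cubic lattice.
        u = rng.normal(size=(L, L, L, 3))
        u /= np.linalg.norm(u, axis=-1, keepdims=True)

        def site_energy(u, i, j, k):
            """Lebwohl-Lasher-type energy: -eps * P2(cos theta) over the 6 neighbours."""
            e = 0.0
            for d in ((1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)):
                n = u[(i+d[0]) % L, (j+d[1]) % L, (k+d[2]) % L]
                c = np.dot(u[i, j, k], n)
                e += -eps * 0.5 * (3.0 * c * c - 1.0)
            return e

        for sweep in range(50):
            for _ in range(L**3):
                i, j, k = rng.integers(0, L, size=3)
                old = u[i, j, k].copy()
                e_old = site_energy(u, i, j, k)
                trial = old + 0.3 * rng.normal(size=3)  # small random reorientation
                u[i, j, k] = trial / np.linalg.norm(trial)
                dE = site_energy(u, i, j, k) - e_old
                if dE > 0 and rng.random() >= np.exp(-beta * dE):
                    u[i, j, k] = old                    # reject the move

        # Scalar order parameter S: largest eigenvalue of the nematic Q tensor.
        v = u.reshape(-1, 3)
        Q = 1.5 * (v.T @ v) / len(v) - 0.5 * np.eye(3)
        print("S =", np.linalg.eigvalsh(Q)[-1])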

  20. Computer Simulation Study of Human Locomotion with a Three-Dimensional Entire-Body Neuro-Musculo-Skeletal Model

    Science.gov (United States)

    Hase, Kazunori; Yokoi, Takashi

    In the present study, a computer simulation technique to autonomously generate running motion from walking was developed using a three-dimensional entire-body neuro-musculo-skeletal model. When maximizing locomotive speed was employed as the evaluative criterion, the initial walking pattern could not transition to a valid running motion. When minimizing the period of foot-ground contact was added to this evaluative criterion, the simulation model autonomously produced appropriate three-dimensional running. Changes in the neuronal system showed that the fatigue coefficient of the neural oscillators decreased as locomotion patterns transitioned from walking to running. Then, as running speed increased, the amplitude of the non-specific stimulus from the higher center increased. These two changes indicate that improved responsiveness of the neuronal system is important for the transition from walking to running, and that the overall activation level of the neuronal system is essential in the process of increasing running speed.

  1. Fluid simulation for computer graphics

    CERN Document Server

    Bridson, Robert

    2008-01-01

    Animating fluids like water, smoke, and fire using physics-based simulation is increasingly important in visual effects, in particular in movies, like The Day After Tomorrow, and in computer games. This book provides a practical introduction to fluid simulation for graphics. The focus is on animating fully three-dimensional incompressible flow, from understanding the math and the algorithms to the actual implementation.

  2. Final Technical Report: High-resolution computational algorithms for simulating offshore wind turbines and farms: Model development and validation

    Energy Technology Data Exchange (ETDEWEB)

    Calderer, Antoni [Univ. of Minnesota, Minneapolis, MN (United States); Yang, Xiaolei [Stony Brook Univ., NY (United States); Feist, Christ [Univ. of Minnesota, Minneapolis, MN (United States); Guala, Michele [Univ. of Minnesota, Minneapolis, MN (United States); Angelidis, Dionysios [Univ. of Minnesota, Minneapolis, MN (United States); Ruehl, Kelley [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Guo, Xin [Univ. of Minnesota, Minneapolis, MN (United States); Boomsma, Aaron [Univ. of Minnesota, Minneapolis, MN (United States); Shen, Lian [Univ. of Minnesota, Minneapolis, MN (United States); Sotiropoulos, Fotis [Stony Brook Univ., NY (United States)

    2015-10-30

    The present project involves the development of modeling and analysis design tools for assessing offshore wind turbine technologies. The computational tools developed herein are able to resolve the effects of the coupled interaction of atmospheric turbulence and ocean waves on the aerodynamic performance and structural stability and reliability of offshore wind turbines and farms. Laboratory-scale experiments have been carried out to derive data sets for validating the computational models. Subtask 1.1 Turbine Scale Model: A novel computational framework for simulating the coupled interaction of complex floating structures with large-scale ocean waves and atmospheric turbulent winds has been developed. This framework is based on a domain decomposition approach coupling a large-scale far-field domain, where realistic wind and wave conditions representative of offshore environments are developed, with a near-field domain, where wind-wave-body interactions can be investigated. The method applied in the near-field domain is based on a fluid-structure interaction (FSI) approach combining the curvilinear immersed boundary (CURVIB) method with a two-phase flow level set formulation, and is capable of solving free surface flows interacting non-linearly with floating wind turbines. For coupling the far-field and near-field domains, a wave generation method for incorporating complex wave fields into Navier-Stokes solvers has been proposed. The wave generation method was validated for a variety of wave cases, including a broadband spectrum. The computational framework has been further validated for wave-body interactions by replicating an experiment on a floating wind turbine model subjected to different sinusoidal wave forces (task 3). Finally, the full capabilities of the framework have been demonstrated by carrying out large eddy simulation (LES) of a floating wind turbine interacting with realistic ocean wind and wave conditions. Subtask 1.2 Farm Scale Model: Several actuator

  3. Study Development of the Cardiac Computer Simulations

    Institute of Scientific and Technical Information of China (English)

    HELLEMANNS Volker; ZHANG Hong; SINGARE Sekou; ZHANG Zhen-xi; KONG Xiang-yun

    2004-01-01

    The technique of computer simulation is a very efficient method for investigating the mechanisms of many diseases. This paper reviews how simulations of the human heart started as simple mathematical models in the past and developed to the point where genetic information is needed for tasks such as finding new medications against heart diseases. The influence of future developments in computer performance, as well as data presentation, is also described.

  4. Computational modeling of pitching cylinder-type ocean wave energy converters using 3D MPI-parallel simulations

    Science.gov (United States)

    Freniere, Cole; Pathak, Ashish; Raessi, Mehdi

    2016-11-01

    Ocean Wave Energy Converters (WECs) are devices that convert energy from ocean waves into electricity. To aid in the design of WECs, an advanced computational framework has been developed which has advantages over conventional methods. The computational framework simulates the performance of WECs in a virtual wave tank by solving the full Navier-Stokes equations in 3D, capturing the fluid-structure interaction, nonlinear and viscous effects. In this work, we present simulations of the performance of pitching cylinder-type WECs and compare against experimental data. WECs are simulated at both model and full scales. The results are used to determine the role of the Keulegan-Carpenter (KC) number. The KC number is representative of viscous drag behavior on a bluff body in an oscillating flow, and is considered an important indicator of the dynamics of a WEC. Studying the effects of the KC number is important for determining the validity of the Froude scaling and the inviscid potential flow theory, which are heavily relied on in the conventional approaches to modeling WECs. Support from the National Science Foundation is gratefully acknowledged.
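
    For reference, the Keulegan-Carpenter number mentioned above is conventionally defined as (a standard definition, not quoted from this record):

        \mathrm{KC} = \frac{U_m \, T}{D}

    where U_m is the amplitude of the oscillating flow velocity, T the oscillation period, and D the cylinder diameter; large KC indicates drag-dominated, strongly separated flow around the body.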

  5. Computing elastic‐rebound‐motivated earthquake probabilities in unsegmented fault models: a new methodology supported by physics‐based simulators

    Science.gov (United States)

    Field, Edward H.

    2015-01-01

    A methodology is presented for computing elastic‐rebound‐based probabilities in an unsegmented fault or fault system, which involves computing along‐fault averages of renewal‐model parameters. The approach is less biased and more self‐consistent than a logical extension of that applied most recently for multisegment ruptures in California. It also enables the application of magnitude‐dependent aperiodicity values, which the previous approach does not. Monte Carlo simulations are used to analyze long‐term system behavior, which is generally found to be consistent with that of physics‐based earthquake simulators. Results cast doubt that recurrence‐interval distributions at points on faults look anything like traditionally applied renewal models, a fact that should be considered when interpreting paleoseismic data. We avoid such assumptions by changing the "probability of what" question (from offset at a point to the occurrence of a rupture, assuming it is the next event to occur). The new methodology is simple, although not perfect in terms of recovering long‐term rates in Monte Carlo simulations. It represents a reasonable, improved way to represent first‐order elastic‐rebound predictability, assuming it is there in the first place, and for a system that clearly exhibits other unmodeled complexities, such as aftershock triggering.

  6. Challenges in Computational Social Modeling and Simulation for National Security Decision Making

    Science.gov (United States)

    2011-06-01

    need to be addressed before computational social models are ready for "prime time" application in national security decision-making environments. [The remainder of this record is figure residue from extraction: degree-distribution histograms comparing an observed IED network (http://defense-update.com/newscast/0308/news/news0703_iednetworks.htm) with an Erdos-Renyi random graph having the same number of nodes and density.]
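
    From the residue, the lost figure evidently compared the degree distribution of an observed network with an Erdos-Renyi random graph of matched size and density. A minimal sketch of such a comparison (node and edge counts are hypothetical, not recovered from the record) using networkx:

        import networkx as nx
        from collections import Counter

        # Hypothetical size and density standing in for the observed IED network.
        n_nodes, n_edges = 40, 60
        p = 2.0 * n_edges / (n_nodes * (n_nodes - 1))   # matched edge density

        er = nx.erdos_renyi_graph(n_nodes, p, seed=0)
        degrees = [d for _, d in er.degree()]
        print("ER degree histogram:", sorted(Counter(degrees).items()))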

  7. A Unique Computational Algorithm to Simulate Probabilistic Multi-Factor Interaction Model Complex Material Point Behavior

    Science.gov (United States)

    Chamis, Christos C.; Abumeri, Galib H.

    2010-01-01

    The Multi-Factor Interaction Model (MFIM) is used to evaluate the divot weight (foam weight ejected) from the launch external tanks. The MFIM has sufficient degrees of freedom to evaluate a large number of factors that may contribute to divot ejection, and its product form accommodates all interactions. Each factor has an exponent that satisfies only two points, the initial and final points, and describes a monotonic path from the initial condition to the final one. The exponent values are selected so that the described path makes sense in the absence of experimental data. In the present investigation, the data used were obtained by testing simulated specimens under launch conditions. Results show that the MFIM is an effective method of describing the divot weight ejected under the conditions investigated.

  8. Development of computational models for the simulation of isodose curves on dosimetry films generated by iodine-125 brachytherapy seeds

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Adriano M.; Meira-Belo, Luiz C.; Reis, Sergio C.; Grynberg, Suely E., E-mail: amsantos@cdtn.b [Center for Development of Nuclear Technology (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)

    2011-07-01

    Interstitial brachytherapy is a modality of radiotherapy in which radioactive sources are placed directly in the region to be treated or close to it. The seeds that are used in the treatment of prostate cancer are generally cylindrical radioactive sources, consisting of a ceramic or metal matrix, which acts as the carrier of the radionuclide and as the X-ray marker, encapsulated in a sealed titanium tube. This study aimed to develop a computational model to reproduce the film-seed geometry, in order to obtain the spatial regions of the isodose curves produced by the seed when it is placed on the film surface. The seed modeled in this work was the OncoSeed 6711, a sealed source of iodine-125, whose isodose curves were obtained experimentally in previous work with the use of dosimetric films. For the film modeling, the compositions and densities of two types of dosimetric films were used: the Agfa Personal Monitoring photographic film 2/10, manufactured by Agfa-Gevaert; and the model EBT radiochromic film, by International Specialty Products. The film-seed models were coupled to the Monte Carlo code MCNP5. The results obtained by simulation were shown to be in good agreement with the experimental results of the previous work. This indicates that the computational model can be used in future studies for other seed models. (author)

  9. [Animal experimentation, computer simulation and surgical research].

    Science.gov (United States)

    Carpentier, Alain

    2009-11-01

    We live in a digital world. In medicine, computers are providing new tools for data collection, imaging, and treatment. During research and development of complex technologies and devices such as artificial hearts, computer simulation can provide more reliable information than experimentation on large animals. In these specific settings, animal experimentation should serve more to validate computer models of complex devices than to demonstrate their reliability.

  10. Exploring Shifts in Middle School Learners' Modeling Activity While Generating Drawings, Animations, and Computational Simulations of Molecular Diffusion

    Science.gov (United States)

    Wilkerson-Jerde, Michelle H.; Gravel, Brian E.; Macrander, Christopher A.

    2015-04-01

    Modeling and using technology are two practices of particular interest to K-12 science educators. These practices are inextricably linked among professionals, who engage in modeling activity with and across a variety of representational technologies. In this paper, we explore the practices of five sixth-grade girls as they generated models of smell diffusion using drawing, stop-motion animation, and computational simulation during a multi-day workshop. We analyze video, student discourse, and artifacts to address the questions: In what ways did learners' modeling practices, reasoning about mechanism, and ideas about smell shift as they worked across this variety of representational technologies? And, what supports enabled them to persist and progress in the modeling activity? We found that the girls engaged in two distinct modeling cycles that reflected persistence and deepening engagement in the task. In the first, messing about, they focused on describing and representing many ideas related to the spread of smell at once. In the second, digging in, they focused on testing and revising specific mechanisms that underlie smell diffusion. Upon deeper analysis, we found these cycles were linked to the girls' invention of "oogtom," a representational object that encapsulated many ideas from the first cycle and allowed the girls to restart modeling with the mechanistic focus required to construct simulations. We analyze the role of activity design, facilitation, and technological infrastructure in this pattern of engagement over the course of the workshop and discuss implications for future research, curriculum design, and classroom practice.

  11. Computational modeling of concrete flow

    DEFF Research Database (Denmark)

    Roussel, Nicolas; Geiker, Mette Rica; Dufour, Frederic

    2007-01-01

    This paper provides a general overview of the present status regarding computational modeling of the flow of fresh concrete. The computational modeling techniques that can be found in the literature may be divided into three main families: single fluid simulations, numerical modeling of discrete...

  12. Computer simulation of gear tooth manufacturing processes

    Science.gov (United States)

    Mavriplis, Dimitri; Huston, Ronald L.

    1990-01-01

    The use of computer graphics to simulate gear tooth manufacturing procedures is discussed. An analytical basis for the simulation is established for spur gears. The simulation itself, however, is developed not only for spur gears, but for straight bevel gears as well. The applications of the developed procedure extend from the development of finite element models of heretofore intractable geometrical forms, to exploring the fabrication of nonstandard tooth forms.

  13. A Computer Model for the Simulation of Nonspherical Particle Dynamics in the Human Respiratory Tract

    Directory of Open Access Journals (Sweden)

    Robert Sturm

    2012-01-01

    In the study presented here, the deposition of spheres and nonspherical particles with various aspect ratios (0.01–100) in the human respiratory tract was theoretically modeled. The shape of the nonspherical particles was considered by applying the latest aerodynamic diameter concepts. Particle deposition was predicted by using a stochastic model of the lung geometry and simulating particle transport trajectories according to the random-walk algorithm. Concerning fibers, total deposition is significantly enhanced with respect to that of spheres for μm-sized particles, whereby at normal breathing conditions peripheral lung compartments serve as the primary deposition targets. In the case of oblate disks, total deposition becomes most remarkable for submicron particles, with the bronchioli and alveoli being targeted to a high extent. Increasing the aerodynamic diameter and/or flow rate generally causes a displacement of the deposition maxima from peripheral to more proximal lung regions. From these findings, it can be concluded that these particle classes may represent tremendous occupational hazards, especially if radioactive elements or heavy metals are attached to them.

  14. Ecological impacts of environmental toxicants and radiation on the microbial ecosystem: a model simulation of computational microbiology

    Energy Technology Data Exchange (ETDEWEB)

    Doi, Masahiro; Sakashita, Tetsuya; Ishii, Nobuyoshi; Fuma, Shoichi; Takeda, Hiroshi; Miyamoto, Kiriko; Yanagisawa, K.; Nakamura, Yuji [National Institute of Radiological Sciences, Inage, Chiba (Japan); Kawabata, Zenichiro [Center for Ecological Research, Kyoto Univ., Otsu, Shiga (Japan)

    2000-05-01

    This study explores a microorganic closed ecosystem by computer simulation to illustrate symbiosis among populations in a microcosm that consists of a heterotroph protozoan, Tetrahymena thermophila B, as consumer; an autotroph alga, Euglena gracilis Z, as primary producer; and a saprotroph bacterium, Escherichia coli DH5, as decomposer. The simulation program is written as a procedure in StarLogoT1.5.1, which was developed by the Center for Connected Learning and Computer-Based Modeling, Tufts University. The virtual microcosm is structured and operated by the following rules: (1) The environment is defined as a lattice model consisting of 10,201 square patches, 300 microns wide, 300 microns long and 100 microns high. (2) Each patch has its own attributes: Nutrient, Detritus and absolute coordinates. (3) Components of the species Tetrahymena, Euglena and E. coli are defined as sub-systems, and each sub-system has its own attributes, such as location, heading direction, cell age, structured biomass, reserve energy and demographic parameters (assimilation rate, breeding threshold, growth rate, etc.). (4) Each component of the species Tetrahymena, Euglena and E. coli lives by foraging (Tetrahymena eats E. coli), excreting its metabolic products to the environment (as a substrate for E. coli), breeding, and dying according to its vital condition. (5) Euglena utilizes sunlight energy through photosynthesis and produces organic compounds; E. coli breaks down the organic compounds of dead protoplasm or metabolic wastes (Detritus) and releases inorganic substances, constructing the downstream part of the food cycle. The virtual ecosystem in this study is named SIM-COSM, a parallel computing model for a self-sustaining system of complexity. It was found that SIM-COSM is valuable for illustrating symbiosis among populations in the microcosm, where a feedback mechanism acts in response to disturbances and interactions among species and the environment. In the simulation, microbes increased demographic and environmental

  15. Delay modeling in logic simulation

    Energy Technology Data Exchange (ETDEWEB)

    Acken, J. M.; Goldstein, L. H.

    1980-01-01

    As digital integrated circuit size and complexity increases, the need for accurate and efficient computer simulation increases. Logic simulators such as SALOGS (SAndia LOGic Simulator), which utilize transition states in addition to the normal stable states, provide more accurate analysis than is possible with traditional logic simulators. Furthermore, the computational complexity of this analysis is far lower than that of circuit simulation such as SPICE. An eight-value logic simulation environment allows the use of accurate delay models that incorporate both element response and transition times. Thus, timing simulation with an accuracy approaching that of circuit simulation can be accomplished with an efficiency comparable to that of logic simulation. 4 figures.

  16. Reservoir Thermal Recover Simulation on Parallel Computers

    Science.gov (United States)

    Li, Baoyan; Ma, Yuanle

    The rapid development of parallel computers has provided a hardware foundation for massive, refined reservoir simulation. However, the lack of parallel reservoir simulation software has blocked the application of parallel computers to reservoir simulation. Although a variety of parallel methods have been studied and applied to black oil, compositional, and chemical model numerical simulations, limited parallel software has been available for reservoir simulation. In particular, the parallelization of reservoir thermal recovery simulation has not been fully carried out, because of the complexity of its models and algorithms. The authors make use of the message passing interface (MPI) standard communication library, the domain decomposition method, the block Jacobi iteration algorithm, and the dynamic memory allocation technique to parallelize their serial thermal recovery simulation software NUMSIP, which is being used in the petroleum industry in China. The parallel software PNUMSIP was tested on both IBM SP2 and Dawn 1000A distributed-memory parallel computers. The experimental results show that the parallelization of I/O has great effects on the efficiency of the parallel software PNUMSIP; the data communication bandwidth is also an important factor influencing software efficiency. Keywords: domain decomposition method, block Jacobi iteration algorithm, reservoir thermal recovery simulation, distributed-memory parallel computer
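
    The PNUMSIP code itself is not public; the sketch below only illustrates the block Jacobi iteration over a domain decomposition named above, on a toy 1D diffusion system (NumPy, serial; the block loop is what would be distributed across MPI ranks in a real implementation).

        import numpy as np

        # Toy 1D Poisson system A x = b, partitioned into contiguous blocks;
        # each "process" solves its own diagonal block, then blocks exchange
        # boundary values -- the structure of a block Jacobi iteration.
        n, nblocks = 64, 4
        A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
        b = np.ones(n)
        x = np.zeros(n)
        bs = n // nblocks

        for it in range(500):
            x_new = x.copy()
            for p in range(nblocks):
                s = slice(p * bs, (p + 1) * bs)
                # Right-hand side sees neighbouring blocks at their *old* values.
                r = b[s] - A[s, :] @ x + A[s, s] @ x[s]
                x_new[s] = np.linalg.solve(A[s, s], r)
            if np.linalg.norm(x_new - x) < 1e-10:
                break
            x = x_new
        print("iterations:", it, "residual:", np.linalg.norm(A @ x - b))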

  17. Simulating chemistry using quantum computers

    CERN Document Server

    Kassal, Ivan; Perdomo-Ortiz, Alejandro; Yung, Man-Hong; Aspuru-Guzik, Alán

    2010-01-01

    The difficulty of simulating quantum systems, well-known to quantum chemists, prompted the idea of quantum computation. One can avoid the steep scaling associated with the exact simulation of increasingly large quantum systems on conventional computers, by mapping the quantum system to another, more controllable one. In this review, we discuss to what extent the ideas in quantum computation, now a well-established field, have been applied to chemical problems. We describe algorithms that achieve significant advantages for the electronic-structure problem, the simulation of chemical dynamics, protein folding, and other tasks. Although theory is still ahead of experiment, we outline recent advances that have led to the first chemical calculations on small quantum information processors.

  18. Simulating chemistry using quantum computers.

    Science.gov (United States)

    Kassal, Ivan; Whitfield, James D; Perdomo-Ortiz, Alejandro; Yung, Man-Hong; Aspuru-Guzik, Alán

    2011-01-01

    The difficulty of simulating quantum systems, well known to quantum chemists, prompted the idea of quantum computation. One can avoid the steep scaling associated with the exact simulation of increasingly large quantum systems on conventional computers, by mapping the quantum system to another, more controllable one. In this review, we discuss to what extent the ideas in quantum computation, now a well-established field, have been applied to chemical problems. We describe algorithms that achieve significant advantages for the electronic-structure problem, the simulation of chemical dynamics, protein folding, and other tasks. Although theory is still ahead of experiment, we outline recent advances that have led to the first chemical calculations on small quantum information processors.

  19. Atomistic computer simulations a practical guide

    CERN Document Server

    Brazdova, Veronika

    2013-01-01

    Many books explain the theory of atomistic computer simulations; this book teaches you how to run them. This introductory "how to" title enables readers to understand, plan, run, and analyze their own independent atomistic simulations, and decide which method to use and which questions to ask in their research project. It is written in a clear and precise language, focusing on a thorough understanding of the concepts behind the equations and how these are used in the simulations. As a result, readers will learn how to design the computational model and which parameters o

  20. Computational simulation of thermal hydraulic processes in the model LMFBR fuel assembly

    Science.gov (United States)

    Bayaskhalanov, M. V.; Merinov, I. G.; Korsun, A. S.; Vlasov, M. N.

    2017-01-01

    The aim of this study was to verify the developed software module on an experimental fuel assembly with partial blockage of the flow section. The developed software module for the simulation of thermal hydraulic processes in liquid metal coolant is based on the theory of anisotropic porous media, with a specially developed integral turbulence model for the determination of coefficients. The finite element method is used for the numerical solution. Experimental data for a hexahedral assembly with electrically heated smooth cylindrical rods cooled by liquid sodium are considered. The results of calculations obtained with the developed software module for a case of corner blockade are presented. The calculated distribution of coolant velocities showed the presence of vortex flow behind the blockade. Features of the vortex region are in good quantitative and qualitative agreement with the experimental data, which demonstrates the efficiency of the hydrodynamic part of the developed software module. However, the obtained radial coolant temperature profiles differ significantly from the experimental ones in the vortex flow region. Possible reasons for this discrepancy were analyzed.

  1. Modeling and simulation challenges in Eulerian-Lagrangian computations of multiphase flows

    Science.gov (United States)

    Diggs, Angela; Balachandar, S.

    2017-01-01

    The present work addresses the numerical methods required for particle-gas and particle-particle interactions in Eulerian-Lagrangian simulations of multiphase flow. Local volume fraction as seen by each particle is the quantity of foremost importance in modeling and evaluating such interactions. We consider a general multiphase flow with a distribution of particles inside a fluid flow discretized on an Eulerian grid. Particle volume fraction is needed both as a Lagrangian quantity associated with each particle and also as an Eulerian quantity associated with the flow. In Grid-Based (GB) methods, the volume fraction is first obtained within each cell as an Eulerian quantity and then interpolated to each particle. In Particle-Based (PB) methods, the particle volume fraction is obtained at each particle and then projected onto the Eulerian grid. Traditionally, GB methods are used in multiphase flow, but sub-grid resolution can be obtained through use of PB methods. By evaluating the total error and its components we compare the performance of GB and PB methods. The standard von Neumann error analysis technique has been adapted for rigorous evaluation of rate of convergence. The methods presented can be extended to obtain accurate field representations of other Lagrangian quantities.
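
    Neither method's implementation is given in this record; the following 1D NumPy sketch contrasts the two approaches under stated simplifications (top-hat smoothing kernel, uniform particle volume, nearest-cell assignment; all values hypothetical).

        import numpy as np

        rng = np.random.default_rng(2)

        # Particles with positions in [0, 1) and a common volume (hypothetical).
        npart, ncell = 1000, 10
        xp = rng.random(npart)
        vol_p = 1.0e-4
        dx = 1.0 / ncell
        cells = np.minimum((xp / dx).astype(int), ncell - 1)

        # Grid-Based (GB): accumulate particle volume per cell, then the value
        # seen by a particle is taken back from its cell.
        phi_gb = np.zeros(ncell)
        np.add.at(phi_gb, cells, vol_p / dx)      # Eulerian volume fraction
        phi_at_particle_gb = phi_gb[cells]        # interpolated to particles

        # Particle-Based (PB): compute the fraction at each particle from
        # neighbours within a smoothing radius, then project onto the grid.
        h = 1.5 * dx
        phi_at_particle_pb = np.array(
            [vol_p * np.sum(np.abs(xp - x) < h) / (2 * h) for x in xp])
        phi_pb = np.zeros(ncell)
        np.add.at(phi_pb, cells, phi_at_particle_pb)
        counts = np.bincount(cells, minlength=ncell)
        phi_pb /= np.maximum(counts, 1)           # cell average of particle values

        print("GB:", np.round(phi_gb, 4))
        print("PB:", np.round(phi_pb, 4))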

  2. Macromod: Computer Simulation For Introductory Economics

    Science.gov (United States)

    Ross, Thomas

    1977-01-01

    The Macroeconomic model (Macromod) is a computer assisted instruction simulation model designed for introductory economics courses. An evaluation of its utilization at a community college indicates that it yielded a 10 percent to 13 percent greater economic comprehension than lecture classes and that it met with high student approval. (DC)

  3. Uncertainty and error in computational simulations

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, W.L.; Diegert, K.V.; Alvin, K.F.; Rutherford, B.M.

    1997-10-01

    The present paper addresses the question: "What are the general classes of uncertainty and error sources in complex, computational simulations?" This is the first step of a two-step process to develop a general methodology for quantitatively estimating the global modeling and simulation uncertainty in computational modeling and simulation. The second step is to develop a general mathematical procedure for representing, combining and propagating all of the individual sources through the simulation. The authors develop a comprehensive view of the general phases of modeling and simulation. The phases proposed are: conceptual modeling of the physical system, mathematical modeling of the system, discretization of the mathematical model, computer programming of the discrete model, numerical solution of the model, and interpretation of the results. This new view is built upon combining phases recognized in the disciplines of operations research and numerical solution methods for partial differential equations. The characteristics and activities of each of these phases are discussed in general, with examples given for the fields of computational fluid dynamics and heat transfer. They argue that a clear distinction should be made between the uncertainty and error that can arise in each of these phases. The present definitions of uncertainty and error are inadequate and, therefore, they propose comprehensive definitions for these terms. Specific classes of uncertainty and error sources are then defined that can occur in each phase of modeling and simulation. The numerical sources of error considered apply regardless of whether the discretization procedure is based on finite elements, finite volumes, or finite differences. To better explain the broad types of sources of uncertainty and error, and the utility of their categorization, they discuss a coupled-physics example simulation.

  4. The Application of Computer Musculoskeletal Modeling and Simulation to Investigate Compressive Tibiofemoral Force and Muscle Functions in Obese Children

    Directory of Open Access Journals (Sweden)

    Liang Huang

    2013-01-01

    This study aimed to utilize musculoskeletal modelling and simulation to investigate the compressive tibiofemoral force and individual muscle function in obese children. We generated a 3D muscle-driven simulation of eight obese and eight normal-weight boys walking at their self-selected speed. The compressive tibiofemoral force and individual muscle contributions to the support and progression accelerations of the center of mass (COM) were computed for each participant based on the subject-specific model. The simulated results were verified by comparing them to the experimental kinematics and EMG data. We found a linear relationship between the average self-selected speed and the normalized peak compressive tibiofemoral force (R2 = 0.611). The activity of the quadriceps contributed the most to the peak compressive tibiofemoral force during the stance phase. Obese and nonobese children use similar muscles to support and accelerate the body COM, but nonobese children had significantly greater contributions of individual muscles. Obese children may therefore adopt a compensation strategy to avoid increasing joint loads and muscle requirements during walking. The absolute compressive tibiofemoral force and muscle forces were still greater in obese children. The long-term biomechanical adaptations of the musculoskeletal system to accommodate the excess body weight during walking are a concern.

  5. Integrating computer-aided modeling and micro-simulation in multi-criteria evaluation of service infrastructure assignment approaches

    Directory of Open Access Journals (Sweden)

    Alfonso Duran

    2013-07-01

    Purpose: This paper proposes an integrated, computer-supported, multi-staged approach to the flexible design and multi-criteria evaluation of service infrastructure assignment processes/algorithms. Design/methodology/approach: It involves particularizing a metamodel encompassing the main generic components and relationships into process models and process instances, by incorporating structural data from the real-life system. Existing data on the target user population are fed into a micro-modeling system to generate a matching population of individual "virtual" users, each with its own set of trait values. The micro-simulation of their interaction with the assignment process of both the incumbent and the competitors generates a rich multi-dimensional output, encompassing both "revealed" and non-observable data. This enables a comprehensive multi-criteria evaluation of the foreseeable performance of the designed process/algorithm, and therefore its iterative improvement. Findings: The research project developed a set of methodologies and associated supporting tools encompassing the modeling, micro-simulation and performance assessment of service infrastructure assignment processes. Originality/value: The proposed approach facilitates, in a multi-criteria environment, the flexible modeling/design of situation-specific assignment processes/algorithms and the assessment of their performance when facing their case-specific user population.

  6. Direct Numerical Simulation of Boiling Multiphase Flows: State-of-the-Art, Modeling, Algorithmic and Computer Needs

    Energy Technology Data Exchange (ETDEWEB)

    Nourgaliev R.; Knoll D.; Mousseau V.; Berry R.

    2007-04-01

    The state of the art for Direct Numerical Simulation (DNS) of boiling multiphase flows is reviewed, focusing on the potential of available computational techniques and the level of current success in applying them to model several basic flow regimes (film, pool-nucleate and wall-nucleate boiling: FB, PNB and WNB, respectively). Then, we discuss the multiphysics and multiscale nature of practical boiling flows in LWR reactors, which requires high-fidelity treatment of interfacial dynamics, phase change, hydrodynamics, compressibility, heat transfer, and the non-equilibrium thermodynamics and chemistry of liquid/vapor and fluid/solid-wall interfaces. Finally, we outline the framework for the Fervent code, being developed at INL for DNS of reactor-relevant boiling multiphase flows, with the purpose of gaining insight into the physics of multiphase flow regimes and generating a basis for effective-field modeling in terms of its formulation and closure laws.

  7. Computationally efficient algorithms for Brownian dynamics simulation of long flexible macromolecules modeled as bead-rod chains

    Science.gov (United States)

    Moghani, Mahdy Malekzadeh; Khomami, Bamin

    2017-02-01

    The computational efficiency of Brownian dynamics (BD) simulation of the constrained model of a polymeric chain (bead-rod) with n beads and in the presence of hydrodynamic interaction (HI) is reduced to O(n^2) via an efficient algorithm which utilizes the conjugate-gradient (CG) method within a Picard iteration scheme. Moreover, the utility of the Barnes and Hut (BH) multipole method in BD simulation of polymeric solutions in the presence of HI, with regard to computational cost, scaling, and accuracy, is discussed. Overall, it is determined that this approach leads to a scaling of O(n^1.2). Furthermore, a stress algorithm is developed which accurately captures the transient stress growth in the startup of flow for the bead-rod model with HI and excluded volume (EV) interaction. Rheological properties of chains up to n = 350 in the presence of EV and HI are computed via the former algorithm. The results depict qualitative differences in the shear thinning behavior of the polymeric solutions at intermediate values of the Weissenberg number (10
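
    The paper's constraint solver is not reproduced in this record; the sketch below only mirrors the CG-within-Picard structure it names, on a toy nonlinear symmetric positive-definite system (SciPy; the matrix is a hypothetical stand-in for the configuration-dependent constraint matrix).

        import numpy as np
        from scipy.sparse.linalg import cg

        # Toy stand-in for the constrained bead-rod tension solve: a nonlinear
        # system A(x) x = b handled by Picard iteration (freeze the matrix at
        # the current iterate) with a conjugate-gradient inner solve.
        n = 50
        rng = np.random.default_rng(3)
        b = rng.random(n)

        def A_of(x):
            # Symmetric, diagonally dominant tridiagonal matrix whose diagonal
            # depends weakly on the current iterate (hypothetical nonlinearity).
            return (np.diag(2.0 + 0.1 * np.tanh(x))
                    - 0.5 * np.eye(n, k=1) - 0.5 * np.eye(n, k=-1))

        x = np.zeros(n)
        for picard in range(50):
            A = A_of(x)                      # Picard step: freeze nonlinearity
            x_new, info = cg(A, b, x0=x)     # CG inner solve
            if np.linalg.norm(x_new - x) < 1e-8:
                break
            x = x_new
        print("Picard iterations:", picard,
              "residual:", np.linalg.norm(A_of(x) @ x - b))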

  8. Numerical simulation of ultrasonic enhancement on mass transfer in liquid-solid reaction by a new computational model.

    Science.gov (United States)

    Jiao, Qingbin; Bayanheshig; Tan, Xin; Zhu, Jiwei

    2014-03-01

    The mass transfer coefficient is an important parameter in the mass transfer process. It can reflect the degree of enhancement of the mass transfer process in liquid-solid reactions and in non-reactive systems such as dissolution and leaching, and further verify issues observed experimentally in the reaction process. In the present paper, a new computational model quantitatively solving for the ultrasonic enhancement of the mass transfer coefficient in liquid-solid reactions is established, and the mass transfer coefficient on a silicon surface with a transducer at frequencies of 40 kHz, 60 kHz, 80 kHz and 100 kHz has been numerically simulated. The simulation results indicate that the mass transfer coefficient increases with increasing ultrasound power; at an ultrasound power of 50 W, the maximum value of the mass transfer coefficient is 1.467 × 10^-4 m/s at 60 kHz and the minimum is 1.310 × 10^-4 m/s at 80 kHz (the mass transfer coefficient is 2.384 × 10^-5 m/s without ultrasound). Extrinsic factors such as temperature, transducer diameter and the distance between the reactor and the ultrasound source also influence the mass transfer coefficient on the silicon surface. At the same ultrasonic power and frequency, the mass transfer coefficient increases with increasing temperature, with decreasing distance between the silicon and the central position, with decreasing transducer diameter, and with decreasing distance between the reactor and the ultrasound source. The simulation results indicate that the computational model can quantitatively solve for the ultrasonic enhancement of the mass transfer coefficient.
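
    As a quick check on the reported figures, the peak enhancement implied by the abstract's own numbers can be computed directly:

        # Enhancement factor implied by the reported coefficients: the peak
        # ultrasonic-assisted value at 60 kHz versus the no-ultrasound value.
        k_us = 1.467e-4      # m/s, 60 kHz, 50 W (from the abstract)
        k_silent = 2.384e-5  # m/s, without ultrasound (from the abstract)
        print(f"enhancement ~ {k_us / k_silent:.1f}x")   # ~6.2x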

  9. Using dataflow architecture to solve the transport lag problem when interfacing with an engineering model flight computer in a telemetry simulation

    Science.gov (United States)

    White, Joey

    The applicability of the dataflow architecture to a telemetry simulation is examined with particular reference to the problem of interfacing the simulation with an engineering model flight computer. The discussion covers the transport loop lag problem, simulation moding and control, the dataflow architecture solution, telemetry formatting and serialization, uplink command synchronization and reception, command validation and routing, and on-board computer interface and telemetry data request/response processing. The concepts discussed here have been developed for application on a training simulation for the NASA Orbital Maneuvering Vehicle.

  10. The Use of Model Matching Video Analysis and Computational Simulation to Study the Ankle Sprain Injury Mechanism

    Directory of Open Access Journals (Sweden)

    Daniel Tik-Pui Fong

    2012-10-01

    Lateral ankle sprains continue to be the most common injury sustained by athletes and create an annual healthcare burden of over $4 billion in the U.S. alone. Foot inversion is suspected in these cases, but the mechanism of injury remains unclear. While kinematics and kinetics data are crucial in understanding the injury mechanisms, ligament behaviour measures, such as ligament strains, are viewed as the potential causal factors of ankle sprains. This review article demonstrates a novel methodology that integrates model matching video analyses with computational simulations in order to investigate injury-producing events for a better understanding of such injury mechanisms. In particular, ankle joint kinematics from actual injury incidents were deduced by model matching video analyses and then input into a generic computational model, based on rigid bone surfaces and deformable ligaments of the ankle, so as to investigate the ligament strains that accompany these sprain injuries. These techniques may have the potential for guiding ankle sprain prevention strategies and targeted rehabilitation therapies.

  11. Effect of Inquiry-Based Computer Simulation Modeling on Pre-Service Teachers' Understanding of Homeostasis and Their Perceptions of Design Features

    Science.gov (United States)

    Chabalengula, Vivien; Fateen, Rasheta; Mumba, Frackson; Ochs, Laura Kathryn

    2016-01-01

    This study investigated the effect of an inquiry-based computer simulation modeling (ICoSM) instructional approach on pre-service science teachers' understanding of homeostasis and its related concepts, and their perceived design features of the ICoSM and simulation that enhanced their conceptual understanding of these concepts. Fifty pre-service…

  13. Computer simulation of thermal plant operations

    CERN Document Server

    O'Kelly, Peter

    2012-01-01

    This book describes thermal plant simulation, that is, dynamic simulation of plants which produce, exchange and otherwise utilize heat as their working medium. Directed at chemical, mechanical and control engineers involved with operations, control and optimization and operator training, the book gives the mathematical formulation and use of simulation models of the equipment and systems typically found in these industries. The author has adopted a fundamental approach to the subject. The initial chapters provide an overview of simulation concepts and describe a suitable computer environment.

  14. Computer simulation of model cohesive powders: Plastic consolidation, structural changes, and elasticity under isotropic loads

    Science.gov (United States)

    Gilabert, F. A.; Roux, J.-N.; Castellanos, A.

    2008-09-01

    The quasistatic behavior of a simple two-dimensional model of a cohesive powder under isotropic loads is investigated by discrete element simulations. We ignore contact plasticity and focus on the effect of geometry and collective rearrangements on the material behavior. The loose packing states, as assembled and characterized in a previous numerical study [Gilabert, Roux, and Castellanos, Phys. Rev. E 75, 011303 (2007)], are observed, under growing confining pressure P, to undergo important structural changes, while solid fraction Φ irreversibly increases (typically, from 0.4-0.5 to 0.75-0.8). The system state goes through three stages, with different forms of the plastic consolidation curve, i.e., Φ as a function of the growing reduced pressure P* = Pa/F0, defined with adhesion force F0 and grain diameter a. In the low-confinement regime (I), the system undergoes negligible plastic compaction, and its structure is influenced by the assembling process. In regime II the material state is independent of initial conditions, and the void ratio varies linearly with ln P [i.e., Δ(1/Φ) = λ Δ(ln P*)], as described in the engineering literature. Plasticity index λ is reduced in the presence of a small rolling resistance (RR). In the last stage of compaction (III), Φ approaches an asymptotic, maximum solid fraction Φmax, as a power law Φmax − Φ ∝ (P*)^(−α), with α ≃ 1, and properties of cohesionless granular packs are gradually retrieved. Under consolidation, while the range ξ of fractal density correlations decreases, force patterns reorganize from self-balanced clusters to force chains, with correlative evolutions of force distributions, and elastic moduli increase by a large amount. Plastic deformation events correspond to very small changes in the network topology, while the denser regions tend to move like rigid bodies. Elastic properties are dominated by the bending of thin junctions in loose systems. For growing RR those tend to form particle chains, the

  15. Numerical characteristics of quantum computer simulation

    Science.gov (United States)

    Chernyavskiy, A.; Khamitov, K.; Teplov, A.; Voevodin, V.; Voevodin, Vl.

    2016-12-01

    The simulation of quantum circuits is of significant importance for the implementation of quantum information technologies. The main difficulty of such modeling is the exponential growth of dimensionality, which makes the use of modern high-performance parallel computation relevant. As is well known, arbitrary quantum computation in the circuit model can be carried out with only single- and two-qubit gates, and we analyze the computational structure and properties of the simulation of such gates. Unique properties of quantum mechanics shape the computational properties of the considered algorithms: quantum parallelism makes the simulation of quantum gates highly parallel, while, on the other hand, quantum entanglement leads to a problem of computational locality during simulation. We use the methodology of the AlgoWiki project (algowiki-project.org) to analyze the algorithm. This methodology consists of theoretical (sequential and parallel complexity, macro structure, and visual informational graph) and experimental (locality and memory access, scalability, and more specific dynamic characteristics) parts. The experimental part was carried out on the petascale Lomonosov supercomputer (Moscow State University, Russia). We show that the simulation of quantum gates is a good basis for researching and testing development methods for data-intensive parallel software, and that the considered analysis methodology can be successfully used for the improvement of algorithms in quantum information science.
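
    The simulation kernel analyzed above can be illustrated with a minimal state-vector sketch (NumPy; a generic textbook construction, not the code analyzed in the record):

        import numpy as np

        # Minimal state-vector simulation of single-qubit gates, the kernel
        # whose parallel structure the record analyzes.
        n = 3                                  # qubits
        state = np.zeros(2**n, dtype=complex)
        state[0] = 1.0                         # |000>

        H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

        def apply_1q(state, gate, target, n):
            """Apply a 2x2 gate to one qubit of an n-qubit state vector."""
            psi = state.reshape([2] * n)       # one axis per qubit
            psi = np.moveaxis(psi, target, 0)  # bring target axis to the front
            psi = np.tensordot(gate, psi, axes=([1], [0]))
            return np.moveaxis(psi, 0, target).reshape(-1)

        for q in range(n):                     # Hadamard on every qubit
            state = apply_1q(state, H, q, n)
        print(np.round(np.abs(state)**2, 3))   # uniform over all 8 basis states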

  16. Shadow effects in simulated ultrasound images derived from computed tomography images using a focused beam tracing model

    DEFF Research Database (Denmark)

    Pham, An Hoai; Lundgren, Bo; Stage, Bjarne

    2012-01-01

    Simulation of ultrasound images based on computed tomography (CT) data has previously been performed with different approaches. Shadow effects are normally pronounced in ultrasound images, so they should be included in the simulation. In this study, a method to capture the shadow effects has been...

  17. Three-dimensional computer simulation at vehicle collision using dynamic model. Application to various collision types; Rikigaku model ni yoru jidosha shototsuji no sanjigen kyodo simulation. Shushu no shototsu keitai eno tekiyo

    Energy Technology Data Exchange (ETDEWEB)

    Abe, M.; Morisawa, M. [Musashi Institute of Technology, Tokyo (Japan); Sato, T. [Keio University, Tokyo (Japan); Kobayashi, K. [Molex-Japan Co. Ltd., Tokyo (Japan)

    1997-10-01

    Past studies of vehicle collision safety have paid attention to phenomena within a short time of the start of a collision, and rollover behavior has been studied separately from behavior at collision. Most simulations of traffic accidents are two-dimensional. Therefore, the analysis of three-dimensional, continuous behavior from crash to stop is indispensable for vehicle design. Accordingly, in this study, the three-dimensional behavior of two vehicles at collision was simulated by computer using dynamic models. Then, by comparing the calculated results with collision test data from real vehicles, it was confirmed that the dynamic model of this study is reliable. 10 refs., 6 figs., 3 tabs.

  18. A Markov computer simulation model of the economics of neuromuscular blockade in patients with acute respiratory distress syndrome

    Directory of Open Access Journals (Sweden)

    Chow John L

    2006-03-01

    Abstract. Background: Management of acute respiratory distress syndrome (ARDS) in the intensive care unit (ICU) is clinically challenging and costly. Neuromuscular blocking agents may facilitate mechanical ventilation and improve oxygenation, but may result in prolonged recovery of neuromuscular function and acute quadriplegic myopathy syndrome (AQMS). The goal of this study was to address a hypothetical question via computer modeling: would a reduction in intubation time of 6 hours and/or a reduction in the incidence of AQMS from 25% to 21% provide enough benefit to justify a drug with an additional expenditure of $267 (the difference in acquisition cost between a generic and a brand name neuromuscular blocker)? Methods: The base case was a 55-year-old man in the ICU with ARDS who receives neuromuscular blockade for 3.5 days. A Markov model was designed with hypothetical patients in 1 of 6 mutually exclusive health states: ICU-intubated, ICU-extubated, hospital ward, long-term care, home, or death, over a period of 6 months. The net monetary benefit was computed. Results: Our computer simulation modeling predicted the mean cost for ARDS patients receiving standard care for 6 months to be $62,238 (5%-95% percentiles $42,259-$83,766), with an overall 6-month mortality of 39%. Assuming a ceiling ratio of $35,000, even if a drug that cost $267 more hypothetically reduced AQMS from 25% to 21% and decreased intubation time by 6 hours, the net monetary benefit would only equal $137. Conclusion: ARDS patients receiving a neuromuscular blocker have a high mortality and an unpredictable outcome, which results in large variability in costs per case. If a patient dies, there is no benefit to any drug that reduces ventilation time or AQMS incidence. A prospective, randomized pharmacoeconomic study of neuromuscular blockers in the ICU to assess AQMS or intubation times is impractical because of the highly variable clinical course of patients with ARDS.
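
    The study's actual transition probabilities and costs are not given in this record; the sketch below only shows the mechanics of a six-state Markov cohort model of the kind described, with all numbers being hypothetical placeholders.

        import numpy as np

        # Six-state Markov cohort sketch; transition probabilities and per-cycle
        # costs are HYPOTHETICAL, not the values used in the study.
        states = ["ICU-intubated", "ICU-extubated", "ward",
                  "long-term care", "home", "dead"]
        P = np.array([  # daily transition matrix (each row sums to 1)
            [0.85, 0.10, 0.00, 0.00, 0.00, 0.05],
            [0.05, 0.75, 0.15, 0.00, 0.00, 0.05],
            [0.00, 0.02, 0.80, 0.05, 0.12, 0.01],
            [0.00, 0.00, 0.02, 0.90, 0.07, 0.01],
            [0.00, 0.00, 0.00, 0.00, 1.00, 0.00],
            [0.00, 0.00, 0.00, 0.00, 0.00, 1.00],
        ])
        cost = np.array([3500.0, 2000.0, 900.0, 400.0, 0.0, 0.0])  # $/day, hypothetical

        occupancy = np.array([1.0, 0, 0, 0, 0, 0])  # cohort starts intubated
        total_cost = 0.0
        for day in range(180):                      # 6-month horizon
            total_cost += occupancy @ cost
            occupancy = occupancy @ P
        print(f"expected 6-month cost: ${total_cost:,.0f}")
        print(f"6-month mortality: {occupancy[-1]:.0%}")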

  19. Modeling and Simulation of the Thermal Runaway Behavior of Cylindrical Li-Ion Cells—Computing of Critical Parameters

    Directory of Open Access Journals (Sweden)

    Andreas Melcher

    2016-04-01

    Full Text Available The thermal behavior of Li-ion cells is an important safety issue and has to be known under varying thermal conditions. The main objective of this work is to gain a better understanding of the temperature increase within the cell considering different heat sources under specified working conditions. With respect to the governing physical parameters, the major aim is to find out under which thermal conditions a so-called thermal runaway occurs. Therefore, a mathematical electrochemical-thermal model based on the Newman model has been extended with a simple combustion model from reaction kinetics, including various types of heat sources assumed to follow an Arrhenius law. This model was realized in the COMSOL Multiphysics modeling software. First simulations were performed for a cylindrical 18650 cell with a LiCoO2 cathode to calculate the temperature increase under two simple electric load profiles and to compute critical system parameters. It has been found that the critical cell temperature T_crit, above which a thermal runaway may occur, is approximately 400 K, which is near the starting temperature of the decomposition of the solid-electrolyte interface in the anode at 393.15 K. Furthermore, it has been found that a thermal runaway can be described in three main stages.
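
    The qualitative mechanism - an Arrhenius heat source outrunning convective dissipation above a critical temperature - can be reproduced with a lumped-capacitance toy model. This is only a sketch of the reaction-kinetics idea; every parameter value below is assumed for illustration and is not taken from the paper's electrochemical-thermal model.

    ```python
    import numpy as np

    R_GAS = 8.314    # J/(mol K)
    E_A   = 1.2e5    # activation energy, J/mol (assumed)
    A_PRE = 1.2e17   # pre-exponential heat release, W (assumed)
    H_A   = 0.25     # effective h*A to ambient, W/K (assumed)
    M_CP  = 40.0     # lumped heat capacity m*c_p, J/K (assumed)

    def runs_away(T0, T_amb=298.15, dt=0.05, t_end=1800.0, T_runaway=600.0):
        """Explicit Euler on m*c_p dT/dt = A*exp(-Ea/(R*T)) - h*A*(T - T_amb)."""
        T = T0
        for _ in range(int(t_end / dt)):
            q_gen = A_PRE * np.exp(-E_A / (R_GAS * T))
            T += dt * (q_gen - H_A * (T - T_amb)) / M_CP
            if T >= T_runaway:
                return True, T
        return False, T

    # Sweeping the start temperature brackets the critical value above which
    # heat generation outpaces dissipation and the temperature diverges.
    for T0 in (360.0, 390.0, 420.0):
        ran, T_last = runs_away(T0)
        print(f"T0 = {T0:.0f} K -> {'thermal runaway' if ran else 'stable'} (reached {T_last:.0f} K)")
    ```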

  20. Computational Investigations on Polymerase Actions in Gene Transcription and Replication Combining Physical Modeling and Atomistic Simulations

    OpenAIRE

    Yu, Jin

    2015-01-01

    Polymerases are protein enzymes that move along nucleic acid chains and catalyze template-based polymerization reactions during gene transcription and replication. The polymerases also substantially improve transcription or replication fidelity through the non-equilibrium enzymatic cycles. We briefly review computational efforts that have been made toward understanding mechano-chemical coupling and fidelity control mechanisms of the polymerase elongation. The polymerases are regarded as molec...

  1. Towards personalised management of atherosclerosis via computational models in vascular clinics: technology based on patient-specific simulation approach.

    Science.gov (United States)

    Díaz-Zuccarini, Vanessa; Di Tomaso, Giulia; Agu, Obiekezie; Pichardo-Almarza, Cesar

    2014-01-01

    The development of a new technology based on patient-specific modelling for personalised healthcare in the case of atherosclerosis is presented. Atherosclerosis is the main cause of death in the world and it has become a burden on clinical services as it manifests itself in many diverse forms, such as coronary artery disease, cerebrovascular disease/stroke and peripheral arterial disease. It is also a multifactorial, chronic and systemic process that lasts for a lifetime, putting enormous financial and clinical pressure on national health systems. In this Letter, the postulate is that computer simulations for healthcare can, in the future, be developed into in-silico management and support systems. These new technologies will be based on predictive models (including the integration of observations, theories and predictions across a range of temporal and spatial scales, scientific disciplines, key risk factors and anatomical sub-systems) combined with digital patient data and visualisation tools. Although the problem is extremely complex, a simulation workflow and an exemplar application of this type of technology for clinical use is presented, which is currently being developed by a multidisciplinary team following the requirements and constraints of the Vascular Service Unit at University College Hospital, London.

  2. Dynamic modelling of an adsorption storage tank using a hybrid approach combining computational fluid dynamics and process simulation

    Science.gov (United States)

    Mota, J.P.B.; Esteves, I.A.A.C.; Rostam-Abadi, M.

    2004-01-01

    A computational fluid dynamics (CFD) software package has been coupled with the dynamic process simulator of an adsorption storage tank for methane fuelled vehicles. The two solvers run as independent processes and handle non-overlapping portions of the computational domain. The codes exchange data on the boundary interface of the two domains to ensure continuity of the solution and of its gradient. A software interface was developed to dynamically suspend and activate each process as necessary, and to handle data exchange and process synchronization. This hybrid computational tool has been successfully employed to accurately simulate the discharge of a new tank design and evaluate its performance. The case study presented here shows that CFD and process simulation are highly complementary computational tools, and that there are clear benefits to be gained from a close integration of the two.
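
    The coupling pattern described - two solvers owning non-overlapping domains and exchanging interface data every step - can be illustrated with a toy 1D diffusion problem split between two "codes". Everything here (the split, the grid, the physics) is invented for the sketch; the actual work coupled a CFD package to an adsorption process simulator.

    ```python
    import numpy as np

    N, D, DX = 50, 1.0, 0.01          # per-domain cells, diffusivity, spacing
    DT = 0.4 * DX**2 / D              # stable explicit time step

    class SubdomainSolver:
        """Stand-in for one of the two independent codes."""
        def __init__(self, u0):
            self.u = np.array(u0, float)

        def step(self, ghost_left, ghost_right):
            padded = np.concatenate(([ghost_left], self.u, [ghost_right]))
            self.u += DT * D * (padded[2:] - 2 * padded[1:-1] + padded[:-2]) / DX**2

    left = SubdomainSolver(np.ones(N))     # hot half of the domain
    right = SubdomainSolver(np.zeros(N))   # cold half of the domain

    for _ in range(2000):
        # Role of the "software interface": exchange boundary data, then let
        # each code advance its own non-overlapping portion of the domain.
        g_lr, g_rl = right.u[0], left.u[-1]
        left.step(ghost_left=left.u[0], ghost_right=g_lr)      # insulated outer wall
        right.step(ghost_left=g_rl, ghost_right=right.u[-1])   # insulated outer wall

    print("solution jump at the interface:", abs(left.u[-1] - right.u[0]))
    ```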

  3. Casting directly from a computer model by using advanced simulation software FLOW-3D Cast ®

    Directory of Open Access Journals (Sweden)

    M. Sirviö

    2009-01-01

    Full Text Available ConiferRob - a patternless casting technique, originally conceived at VTT Technical Research Centre of Finland and further developed at its spin-off company, Simtech Systems - offers up to 40% savings in product development costs and up to two months shorter development times compared to conventional techniques. Savings of this order can be very valuable in today's highly competitive markets. Casting simulation is commonly used for designing casting systems. However, most of today's software packages are old-fashioned and predict just shrinkage porosity. Flow Science, VTT and Simtech have developed new software called FLOW-3D Cast ®, which can simulate surface defects, air entrainment, filters, core gas problems and even cavitation.

  4. A Finite-Rate Gas-Surface Interaction Model Informed by Fundamental Computational Chemistry Simulations

    Science.gov (United States)

    2013-03-31

    oxygen interactions with a specific crystalline polymorph of SiO2 (called β-cristobalite). Computer images of this crystal lattice are shown in Fig...2. The choice of β-cristobalite is motivated by experimental studies from Balat-Pichelin et al. [15,22,23] where silicon-carbide (SiC) surfaces were... conclusion that the measured loss rates correspond to β-cristobalite was based on the fact that for polymorph diagrams of SiO2, β-cristobalite is most stable

  5. UTILIZATION OF THE NATIONAL INSTITUTE OF STANDARDS AND TECHNOLOGY FIRE DYNAMICS SIMULATION COMPUTER MODEL

    Energy Technology Data Exchange (ETDEWEB)

    L. BARTLEIN

    2001-05-01

    The objective of this report is to provide a methodology for utilization of the NIST FDS code to evaluate the effects of radiant and convective heating from single and multiple fire sources on heat-sensitive targets such as Special Nuclear Materials (SNM) and High Explosives (HE). The presentation will demonstrate practical applications of the FDS computer program in fire hazards analysis, and illustrate the advantages over hand calculations for radiant and convective heat transfer and fire progression. The "visualization" of radiant and convective heat effects will be demonstrated as a tool for supporting conclusions of fire hazards analysis and TSR development.

  6. Computer aided simulation for developing a simple model to predict cooling of packaged foods

    DEFF Research Database (Denmark)

    Christensen, Martin Gram; Feyissa, Aberham Hailu; Adler-Nissen, Jens

    A new equation to predict equilibrium temperatures for cooling operations of packaged foods has been derived from the traditional 1st order solution to Fourier's heat transfer equations. The equation is analytical in form and only requires measurable parameters, in the form of the area vs. volume ratio (A...... are too laborious or impossible to conduct. The derived equation was tested for irregular geometries, unequal heat transfer and headspace restrictions. The new equation predicted equilibrium temperature curves of the simulated cooling with a low error (1.5°C for Fourier numbers below 0.3) and good...

  7. Computer simulation environment for comparative analysis of models for investment portfolio management

    Science.gov (United States)

    Marchev, Angel, Jr.; Marchev, Angel

    2013-12-01

    Building on the notion of systematically analyzing an investment portfolio as a feedback system, there is a need for an experimentation system. In this paper such a system for experimenting with various traditional, classical, advanced, etc. models for portfolio management is described. The main objective is to have the ability to let the models compete systematically on a unified data track.
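
    A minimal sketch of the unified-data-track idea: every candidate model receives exactly the same return history and is scored by the same backtest loop. The two toy strategies and the synthetic returns below are placeholders for the traditional and advanced models the authors compare.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    returns = rng.normal(0.0004, 0.01, size=(750, 5))  # synthetic track: 750 days, 5 assets

    def equal_weight(history):
        return np.full(history.shape[1], 1.0 / history.shape[1])

    def momentum(history, lookback=60):
        score = np.clip(history[-lookback:].mean(axis=0), 0.0, None)
        return score / score.sum() if score.sum() > 0 else equal_weight(history)

    def backtest(model, data, warmup=60):
        """Same loop for every model: rebalance daily, compound the returns."""
        wealth = 1.0
        for t in range(warmup, len(data)):
            wealth *= 1.0 + model(data[:t]) @ data[t]
        return wealth

    for name, model in (("equal-weight", equal_weight), ("momentum", momentum)):
        print(f"{name:>12}: terminal wealth {backtest(model, returns):.3f}")
    ```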

  8. Computer simulation of temperature-dependent growth of fractal and compact domains in diluted Ising models

    DEFF Research Database (Denmark)

    Sørensen, Erik Schwartz; Fogedby, Hans C.; Mouritsen, Ole G.

    1989-01-01

    A version of the two-dimensional site-diluted spin-1/2 Ising model is proposed as a microscopic interaction model which governs solidification and growth processes controlled by vacancy diffusion. The Ising Hamiltonian describes a solid-fluid phase transition and it permits a thermodynamic......-water interfaces.
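
    For readers unfamiliar with the model class, below is a minimal Metropolis sketch of a two-dimensional site-diluted spin-1/2 Ising lattice. Dilution is quenched here for brevity, whereas the paper's model additionally lets vacancies diffuse - that diffusion is what drives the growth kinetics studied - and all parameter values are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    L, J, T, DILUTION = 32, 1.0, 1.5, 0.1
    s = rng.choice([-1, 1], size=(L, L))
    s[rng.random((L, L)) < DILUTION] = 0        # 0 marks a vacant site

    def neighbour_sum(i, j):
        return s[(i+1) % L, j] + s[(i-1) % L, j] + s[i, (j+1) % L] + s[i, (j-1) % L]

    for _ in range(200_000):
        i, j = rng.integers(L), rng.integers(L)
        if s[i, j] == 0:
            continue                             # vacancies carry no spin
        dE = 2.0 * J * s[i, j] * neighbour_sum(i, j)
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            s[i, j] = -s[i, j]                   # accept the spin flip

    occupied = s != 0
    print("magnetisation per occupied site:", s.sum() / occupied.sum())
    ```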

  9. Model description document for a computer program for the emulation/simulation of a space station environmental control and life support system (ESCM)

    Science.gov (United States)

    Yanosy, James L.

    1988-01-01

    The Emulation/Simulation Computer Model (ESCM) computes the transient performance of a Space Station air revitalization subsystem with carbon dioxide removal provided by a solid amine, water-desorbed subsystem called SAWD. This manual describes the mathematical modeling and equations used in the ESCM. For the system as a whole and for each individual component, the fundamental physical and chemical laws which govern their operation are presented. Assumptions are stated and, where necessary, data are presented to support empirically developed relationships.

  10. Distribution of ethanol in a model membrane: a computer simulation study

    Science.gov (United States)

    Chanda, Jnanojjal; Bandyopadhyay, Sanjoy

    2004-07-01

    Constant temperature and pressure (NPT) atomistic molecular dynamics (MD) simulations have been carried out on a fully hydrated liquid crystalline lamellar phase of a pure dimyristoylphosphatidylcholine (DMPC) lipid bilayer at 30 °C, and on its mixture with a 12.5% mole fraction of ethanol. It has been observed that at this low concentration the ethanol molecules preferentially occupy regions near the bilayer interface, in agreement with NMR data. Small changes in the bilayer structure and in the lipid hydrocarbon chain conformations have been noticed. It is observed that the interaction between the ethanol molecules and the lipid head groups influences the orientation of the P⁻→N⁺ head-group dipole toward the aqueous phase.

  11. Simulation modeling of cloud computing for smart grid using CloudSim

    Directory of Open Access Journals (Sweden)

    Sandeep Mehmi

    2017-05-01

    Full Text Available In this paper a smart grid cloud has been simulated using CloudSim. Various parameters, like the number of virtual machines (VMs), VM image size, VM RAM, VM bandwidth and cloudlet length, and their effect on cost and cloudlet completion time under the time-shared and space-shared resource allocation policies, have been studied. As the number of cloudlets increased from 68 to 178, a greater number of cloudlets completed their execution, with higher cloudlet completion times under the time-shared allocation policy than under the space-shared allocation policy. A similar trend has been observed when the VM bandwidth is increased from 1 Gbps to 10 Gbps and the VM RAM is increased from 512 MB to 5120 MB. The cost of processing increased linearly with respect to the increase in the number of VMs, VM image size and cloudlet length.
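
    The completion-time gap between the two policies has a simple back-of-the-envelope explanation, sketched below for n equal cloudlets on a single processing element. CloudSim itself resolves this with an event-driven engine; the arithmetic here is only illustrative.

    ```python
    def space_shared_finish_times(n, runtime):
        """Cloudlets run one after another: the i-th finishes at (i+1)*runtime."""
        return [(i + 1) * runtime for i in range(n)]

    def time_shared_finish_times(n, runtime):
        """All cloudlets share the PE equally, so each finishes only at n*runtime."""
        return [n * runtime] * n

    n, runtime = 4, 10.0
    print("space-shared:", space_shared_finish_times(n, runtime))  # [10.0, 20.0, 30.0, 40.0]
    print("time-shared :", time_shared_finish_times(n, runtime))   # [40.0, 40.0, 40.0, 40.0]
    ```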

  12. The computer program LIAR for the simulation and modeling of high performance linacs

    Energy Technology Data Exchange (ETDEWEB)

    Assmann, R.; Adolphsen, C.; Bane, K.; Emma, P.; Raubenheimer, T.O.; Siemann, R.; Thompson, K.; Zimmermann, F.

    1997-07-01

    High performance linear accelerators are the central components of the proposed next generation of linear colliders. They must provide acceleration of up to 750 GeV per beam while maintaining small normalized emittances. Standard simulation programs, mainly developed for storage rings, did not meet the specific requirements for high performance linacs with high bunch charges and strong wakefields. The authors present the program LIAR (LInear Accelerator Research code), which includes single and multi-bunch wakefield effects, a 6D coupled beam description, specific optimization algorithms and other advanced features. LIAR has been applied to and checked against the existing Stanford Linear Collider (SLC), the linacs of the proposed Next Linear Collider (NLC) and the proposed Linac Coherent Light Source (LCLS) at SLAC. Its modular structure allows easy extension for different purposes. The program is available for UNIX workstations and Windows PCs.

  13. FPGA-accelerated simulation of computer systems

    CERN Document Server

    Angepat, Hari; Chung, Eric S; Hoe, James C; Chung, Eric S

    2014-01-01

    To date, the most common form of simulators of computer systems are software-based running on standard computers. One promising approach to improve simulation performance is to apply hardware, specifically reconfigurable hardware in the form of field programmable gate arrays (FPGAs). This manuscript describes various approaches of using FPGAs to accelerate software-implemented simulation of computer systems and selected simulators that incorporate those techniques. More precisely, we describe a simulation architecture taxonomy that incorporates a simulation architecture specifically designed f

  14. Vernier Caliper and Micrometer Computer Models Using Easy Java Simulation and Its Pedagogical Design Features--Ideas for Augmenting Learning with Real Instruments

    Science.gov (United States)

    Wee, Loo Kang; Ning, Hwee Tiang

    2014-01-01

    This paper presents the customization of Easy Java Simulation models, used with actual laboratory instruments, to create active experiential learning for measurements. The laboratory instruments are the vernier caliper and the micrometer. Three computer model design ideas that complement real equipment are discussed. These ideas involve (1) a…

  16. MUSIDH, multiple use of simulated demographic histories, a novel method to reduce computation time in microsimulation models of infectious diseases.

    Science.gov (United States)

    Fischer, E A J; De Vlas, S J; Richardus, J H; Habbema, J D F

    2008-09-01

    Microsimulation of infectious diseases requires simulation of many life histories of interacting individuals. In particular, relatively rare infections such as leprosy need to be studied in very large populations. Computation time increases disproportionally with the size of the simulated population. We present a novel method, MUSIDH, an acronym for multiple use of simulated demographic histories, to reduce computation time. Demographic history refers to the processes of birth, death and all other demographic events that should be unrelated to the natural course of an infection, i.e. non-fatal infections. MUSIDH attaches a fixed number of infection histories to each demographic history, and these infection histories interact as if they were the infection histories of separate individuals. With two examples, mumps and leprosy, we show that the method can give a factor-of-50 reduction in computation time at the cost of a small loss in precision. The largest reductions are obtained for rare infections with complex demographic histories.
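
    The core trick is easy to express in code: simulate the expensive demographic history once and layer several cheap infection histories on top of it. The sketch below uses invented toy dynamics (a single infection probability, uniform ages); the real MUSIDH attaches interacting transmission histories.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def simulate_demography():
        """Expensive part: one life history (here just birth year and lifespan)."""
        return {"birth": int(rng.integers(1900, 2000)), "lifespan": int(rng.integers(40, 95))}

    def simulate_infection(demo):
        """Cheap part: one non-fatal infection history layered on a life history."""
        return {"infected": bool(rng.random() < 0.05),
                "age": int(rng.integers(0, demo["lifespan"]))}

    def musidh(n_demographies, histories_per_demo):
        """Reuse each demographic history for several simulated individuals."""
        population = []
        for _ in range(n_demographies):
            demo = simulate_demography()            # simulated once ...
            for _ in range(histories_per_demo):     # ... reused many times
                population.append(simulate_infection(demo))
        return population

    pop = musidh(n_demographies=1_000, histories_per_demo=50)   # 50,000 individuals
    print("prevalence:", sum(p["infected"] for p in pop) / len(pop))
    ```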

  17. Computational modeling of flow and combustion in a couette channel simulating microgravity

    Science.gov (United States)

    Hamdan, Ghaleb

    Theoretically, a Couette flow in a narrow channel can be utilized to simulate the microgravity conditions experienced by a surface flame, owing to the linear velocity profile. Hence, the Couette channel is a potential apparatus for the study of flame spread in an environment that recreates microgravity flow conditions. Simulated microgravity conditions were achieved by limiting the vertical extent over and under the flame to suppress buoyancy. This numerical study was done for a 2-D channel using the Fire Dynamics Simulator (FDS). This thesis is divided into two sections; the first is the study of a non-reacting cold Couette flow in a finite-length channel, a subject with surprisingly little past research, despite the ubiquity of "infinite" Couette channels in textbooks. The channel was placed in a room to allow for a better representation of a realistic channel and to allow the flow and pressure fields to develop without forcing them at the inlet and outlet. The plate velocities, channel gap and channel length were varied, and the resulting u-velocity profile, w-velocity profile and pressure were investigated. The entrance length relationship with Reynolds number for a finite Couette channel was determined for the first time - as far as the author knows - in order to ensure that the flame occurs in a fully developed flow. In contrast to an infinite channel, the u-velocity was found to be nonlinear due to an adverse pressure differential created along the channel, attributed to the pull force along the entrance of the channel created by the top plate as well as the pressure differential created by the flow exiting the channel. The linearity constant was derived for the one-moving-plate case. The domain consisted of a rectangular region with the top plate moving and the bottom plate fixed, except for a few cases in which the bottom plate also moved and which were compared with the one-moving-plate cases. The second section describes the combustion of a thin cellulose sample

  18. Computational Simulation of Complex Structure Fancy Yarns

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    A study is reported of the mathematical modeling and simulation of complex structure fancy yarns. The investigated complex structure fancy yarns have a multithread structure composed of three components - core, effect, and binder yarns. In the current research the precondition was accepted that the cross-sections of both yarns of the effect intermediate product in the complex structure fancy yarn remain circular, and that this shape does not change during manufacturing of the fancy yarn. A mathematical model of the complex structure fancy yarn is established based on the parametric equation of a space helix line, and computer simulation is further carried out using the computational mathematical tool Matlab 6.5. The theoretical structure of the fancy yarn is compared with an experimental sample. The simulation system would help further the design of new assortments of complex structure fancy yarns and the prediction of the visual effects of fancy yarns in end-use fabrics.
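
    The parametric space-helix construction that the model is built on can be sketched in a few lines of Python (the study's own implementation is in Matlab 6.5). The radii, pitch and turn counts below are illustrative, not measured yarn parameters.

    ```python
    import numpy as np

    def helix(radius, pitch, turns, points_per_turn=100):
        """Space helix: x = r*cos(t), y = r*sin(t), z = pitch*t/(2*pi)."""
        t = np.linspace(0.0, 2.0 * np.pi * turns, int(points_per_turn * turns))
        return np.column_stack((radius * np.cos(t),
                                radius * np.sin(t),
                                pitch * t / (2.0 * np.pi)))

    core   = np.column_stack((np.zeros(500), np.zeros(500), np.linspace(0.0, 25.0, 500)))
    effect = helix(radius=1.2, pitch=5.0, turns=5)                         # wraps the core
    binder = helix(radius=1.4, pitch=5.0, turns=5) * np.array([1, -1, 1])  # counter-wound

    for name, yarn in (("core", core), ("effect", effect), ("binder", binder)):
        length = np.linalg.norm(np.diff(yarn, axis=0), axis=1).sum()
        print(f"{name} component length: {length:.1f}")
    ```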

  19. Cosmological Simulations on a Grid of Computers

    CERN Document Server

    Depardon, Benjamin; Desprez, Frédéric; Blaizot, Jérémy; Courtois, Hélène M

    2010-01-01

    The work presented in this paper aims at restricting the input parameter values of the semi-analytical model used in GALICS and MOMAF, so as to derive which parameters influence the results the most, e.g., star formation, feedback and halo recycling efficiencies, etc. Our approach is to proceed empirically: we run many simulations and derive the correct ranges of values. The computation time needed is so large that we need to run on a grid of computers. Hence, we model GALICS and MOMAF execution time and output file size, and run the simulations using a grid middleware: DIET. All the complexity of accessing resources, scheduling simulations and managing data is harnessed by DIET and hidden behind a web portal accessible to the users.

  20. Computer simulation on fatigue behavior of cemented hip prostheses: a physiological model.

    Science.gov (United States)

    Hung, Jui-Pin; Chen, Jian-Horng; Chiang, Hsiu-Lu; Wu, James Shih-Shyn

    2004-11-01

    This paper is concerned with the investigation of the fatigue failure of implant fixation by numerical approaches. A computer algorithm based on finite element analysis and continuum damage mechanics was proposed to quantify the fatigue damage rate of the cement mantle under physiological conditions. In examining the interfacial debonding effect, interface elements were introduced at the cement-stem interfaces and calibrated with the increase of loading cycles. Current results reveal that the major sites for failure initiation are in the proximal anterior-medial regions and at the distal prosthesis tip, which clearly demonstrates the same failure scenario as observed in clinical studies. Such fatigue failures not only result in the corruption of cement-stem interfaces, but also greatly affect the cement stress distribution and the damage rate in subsequent loading cycles. Another significant result is that the predicted damage rate increases steadily with gait cycles. This trend in damage development is consistent with the findings obtained from fatigue tests available in the literature. It is anticipated that the presented methodology can serve as a pre-clinical validation of cemented hip prostheses.

  1. Radar Landmass Simulation Computer Programming (Interim Report).

    Science.gov (United States)

    (*RADAR SCANNING, TERRAIN), (*NAVAL TRAINING, RADAR OPERATORS), (*FLIGHT SIMULATORS, TERRAIN AVOIDANCE), (*COMPUTER PROGRAMMING, INSTRUCTION MANUALS), PLAN POSITION INDICATORS, REAL TIME, DISPLAY SYSTEMS, RADAR IMAGES, SIMULATION

  2. Computer Modeling and Simulation of Bullet Impact to the Human Thorax

    Science.gov (United States)

    2000-06-01

    of an epiphyseal plate. As the plate expands, the cartilaginous growth plate beneath it is subsequently calcified and turned into compact bone. For... important in creating a complete thoracic model. Additionally, the various ligaments, tendons, and minor muscles of the thorax are not included in

  3. Validation of simulation models

    DEFF Research Database (Denmark)

    Rehman, Muniza; Pedersen, Stig Andur

    2012-01-01

    In philosophy of science, the interest for computational models and simulations has increased heavily during the past decades. Different positions regarding the validity of models have emerged but the views have not succeeded in capturing the diversity of validation methods. The wide variety...... of models has been somewhat narrow-minded reducing the notion of validation to establishment of truth. This article puts forward the diversity in applications of simulation models that demands a corresponding diversity in the notion of validation....... of models with regards to their purpose, character, field of application and time dimension inherently calls for a similar diversity in validation approaches. A classification of models in terms of the mentioned elements is presented and used to shed light on possible types of validation leading...

  4. Simulation of windblown dust transport from a mine tailings impoundment using a computational fluid dynamics model

    OpenAIRE

    Stovern, Michael; Felix, Omar; Csavina, Janae; Kyle P. Rine; Russell, MacKenzie R.; Jones, Robert M; King, Matt; Betterton, Eric A.; Sáez, A. Eduardo

    2014-01-01

    Mining operations are potential sources of airborne particulate metal and metalloid contaminants through both direct smelter emissions and wind erosion of mine tailings. The warmer, drier conditions predicted for the Southwestern US by climate models may make contaminated atmospheric dust and aerosols increasingly important, due to potential deleterious effects on human health and ecology. Dust emissions and dispersion of dust and aerosol from the Iron King Mine tailings in Dewey-Humboldt, Ar...

  5. A Computer Simulation of Ultrasound Thermal Bio-Effect in Embryonic Model

    Directory of Open Access Journals (Sweden)

    J. Rozman

    2003-12-01

    Full Text Available At the present time, the usage of ultrasound diagnostic equipment has become an inseparable part of diagnosis for a number of medical investigations. Several scientific studies published in the last years showed that when applying a diagnostic ultrasound system on animals it is possible to create negative changes in tissues. New ultrasound technologies and higher output acoustic powers have brought a possible risk connected to the usage of ultrasound in diagnostics. The knowledge of risk level and exploration of limiting factors is an important point for the assessment of marginal ultrasound exposure values of medical investigation during pregnancy, especially in the first trimester. The contribution presents a MATLAB application for modeling of tissue heating in human embryos at the developmental age of seven and eight weeks. Recent calculations of US fields, which are generated by several types of various unfocused single transducers (rectangular, circular, and annular), represent a maximum temperature elevation of 0.4 °C in embryonic model tissues for an exposure of 1 min. The models of embryonic tissue heating provide comparative studies of possible bio-effects with the purpose to explore limiting factors of ultrasound exposure.

  6. Evolution of blast wave profiles in simulated air blasts: experiment and computational modeling

    Science.gov (United States)

    Chandra, N.; Ganpule, S.; Kleinschmit, N. N.; Feng, R.; Holmberg, A. D.; Sundaramurthy, A.; Selvan, V.; Alai, A.

    2012-09-01

    Shock tubes have been extensively used in the study of blast traumatic brain injury due to the increased incidence of blast-induced neurotrauma in the Iraq and Afghanistan conflicts. One of the important aspects of these studies is how best to replicate field conditions in the laboratory, which relies on reproducing blast wave profiles. The evolution of the blast wave profiles along the length of a compression-driven air shock tube is studied using experiments and numerical simulations, with emphasis on the shape and magnitude of the pressure time profiles. In order to measure the dynamic pressures of the blast, a series of sensors are mounted on a cylindrical specimen normal to the flow direction. Our results indicate that the blast wave loading is significantly different for locations inside and outside of the shock tube. Pressure profiles inside the shock tube follow the Friedlander waveform fairly well. Upon approaching the exit of the shock tube, an expansion wave released from the shock tube edges significantly degrades the pressure profiles. For tests outside the shock tube, the peak pressure and total impulse reduce drastically as we move away from the exit, and the majority of the loading is in the form of a subsonic jet wind. In addition, the planarity of the blast wave degrades as the blast wave evolves three-dimensionally. Numerical results visually and quantitatively confirm the presence of vortices, jet wind and three-dimensional expansion of the planar blast wave near the exit. Pressure profiles at 90° orientation show flow separation. When the cylinder is placed inside the shock tube, this flow separation is not sustained, but when it is placed outside the shock tube the flow separation is sustained, which causes tensile loading on the sides of the cylinder. Friedlander waves formed by field explosives in the intermediate- to far-field ranges are replicated in a narrow test region located deep inside the shock tube.
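
    For reference, the Friedlander waveform mentioned above has the closed form p(t) = p_peak (1 - t/t_d) exp(-b t/t_d); the sketch below evaluates it and integrates the positive phase. Peak overpressure, duration and decay constant are chosen purely for illustration, not taken from the study's measurements.

    ```python
    import numpy as np

    def friedlander(t, p_peak=200.0, t_d=5.0e-3, b=1.0):
        """Overpressure (kPa) at time t (s) for an ideal Friedlander wave."""
        t = np.asarray(t, float)
        return p_peak * (1.0 - t / t_d) * np.exp(-b * t / t_d)

    t = np.linspace(0.0, 20e-3, 2001)                 # 20 ms pressure trace
    p = friedlander(t)                                # goes negative after t_d
    positive_impulse = np.clip(p, 0.0, None).sum() * (t[1] - t[0])
    print(f"peak {p.max():.0f} kPa, positive-phase impulse {positive_impulse * 1e3:.2f} kPa*ms")
    ```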

  7. Simulating Human Cognitive Using Computational Verb Theory

    Institute of Scientific and Technical Information of China (English)

    YANG Tao

    2004-01-01

    Modeling and simulation of a life system is closely connected to the modeling of cognition, especially for advanced life systems. The primary difference between an advanced life system and a digital computer is that the advanced life system consists of a body with a mind, while a digital computer is only a mind in a formal sense. To model an advanced life system one needs to ground symbols into a body in which a digital computer is embedded. In this paper, a computational verb theory is proposed as a new paradigm for grounding symbols into the outputs of sensors. On one hand, a computational verb can preserve the physical "meanings" of the dynamics of sensor data such that a symbolic system can be used to manipulate physical meanings instead of abstract tokens in the digital computer. On the other hand, the physical meanings of an abstract symbol/token, which is usually the output of a reasoning process in the digital computer, can be restored and fed back to the actuators. Therefore, the computational verb theory bridges the gap between symbols and physical reality from the dynamic cognition perspective.

  8. Automatic temperature computation for realistic IR simulation

    Science.gov (United States)

    Le Goff, Alain; Kersaudy, Philippe; Latger, Jean; Cathala, Thierry; Stolte, Nilo; Barillot, Philippe

    2000-07-01

    Polygon temperature computation in 3D virtual scenes is fundamental for IR image simulation. This article describes in detail the temperature calculation software and its current extensions, briefly presented in [1]. This software, called MURET, is used by the simulation workshop CHORALE of the French DGA. MURET is a one-dimensional thermal software package, which accurately takes into account the material thermal attributes of a three-dimensional scene and the variation of the environment characteristics (atmosphere) as a function of time. Concerning the environment, absorbed incident fluxes are computed wavelength by wavelength, every half hour, during the 24 hours preceding the time of the simulation. For each polygon, incident fluxes are composed of direct solar fluxes and sky illumination (including diffuse solar fluxes). Concerning the materials, classical thermal attributes are associated with several layers, such as conductivity, absorption, spectral emissivity, density, specific heat and thickness, and convection coefficients are taken into account. In the future, MURET will be able to simulate permeable natural materials (water influence) and vegetation materials (woods). This model of thermal attributes yields a very accurate polygon temperature computation for the complex 3D databases often found in CHORALE simulations. The kernel of MURET consists of an efficient ray tracer, allowing computation of the history (over 24 hours) of the shadowed parts of the 3D scene, and a library responsible for the thermal computations. The great originality concerns the way the heating fluxes are computed. Using ray tracing, the flux received at each 3D point of the scene accurately takes into account the masking (hidden surfaces) between objects. In this way, this library supplies other thermal modules, such as a thermal shadow computation tool.

  9. Modeling, Simulation and Analysis of Complex Networked Systems: A Program Plan for DOE Office of Advanced Scientific Computing Research

    Energy Technology Data Exchange (ETDEWEB)

    Brown, D L

    2009-05-01

    Many complex systems of importance to the U.S. Department of Energy consist of networks of discrete components. Examples are cyber networks, such as the internet and local area networks over which nearly all DOE scientific, technical and administrative data must travel, the electric power grid, social networks whose behavior can drive energy demand, and biological networks such as genetic regulatory networks and metabolic networks. In spite of the importance of these complex networked systems to all aspects of DOE's operations, the scientific basis for understanding these systems lags seriously behind the strong foundations that exist for the 'physically-based' systems usually associated with DOE research programs that focus on such areas as climate modeling, fusion energy, high-energy and nuclear physics, nano-science, combustion, and astrophysics. DOE has a clear opportunity to develop a similarly strong scientific basis for understanding the structure and dynamics of networked systems by supporting a strong basic research program in this area. Such knowledge will provide a broad basis for, e.g., understanding and quantifying the efficacy of new security approaches for computer networks, improving the design of computer or communication networks to be more robust against failures or attacks, detecting potential catastrophic failure on the power grid and preventing or mitigating its effects, understanding how populations will respond to the availability of new energy sources or changes in energy policy, and detecting subtle vulnerabilities in large software systems to intentional attack. This white paper outlines plans for an aggressive new research program designed to accelerate the advancement of the scientific basis for complex networked systems of importance to the DOE. It will focus principally on four research areas: (1) understanding network structure, (2) understanding network dynamics, (3) predictive modeling and simulation for complex

  10. A computational model for large eddy simulation of dilute bubbly turbulent flows

    Science.gov (United States)

    Hajit, Mohammad; Sotiropoulos, Fotis

    2013-11-01

    A mathematical formulation of filtered equations for two phase bubbly flows based on two-fluid method is presented. To remove high frequencies (noise), we extracted the filtered form of the equations in curvilinear coordinates, converting the microscopic governing equations to macroscopic equations via spatial averaging of solution variables. The set of equations describing the hydrodynamics in a gas-liquid system can be solved effectively if the solution procedure is decoupled so that an efficient iterative scheme can be employed. We propose a formulation for dilute bubbly flows in which the equations are converted to a loosely-coupled form. The resulting mathematical model is based on five distinct sets of equations, namely mixture momentum balance, pressure Poisson equation, Boyle's law and momentum and mass balances of gas phase. This mathematical formulation provides an efficient numerical procedure for two-way coupling of bubbly flows at low gas holdups. The subgrid-scale modeling is based on dynamic procedure of Germano for both phases. The formulation is validated for a fully turbulent bubble column test by comparing to available experimental results. This work is supported by the US department of energy (DE-EE0005416) and the Minnesota supercomputing institute.

  11. Computer simulation of multiple dynamic photorefractive gratings

    DEFF Research Database (Denmark)

    Buchhave, Preben

    1998-01-01

    The benefits of a direct visualization of space-charge grating buildup are described. The visualization is carried out by a simple repetitive computer program, which simulates the basic processes in the band-transport model and displays the result graphically or in the form of numerical data. ... The simulation sheds light on issues that are not amenable to analytical solutions, such as the spectral content of the wave forms, cross talk in three-beam interaction, and the range of applications of the band-transport model. (C) 1998 Optical Society of America.

  12. Computational neurogenetic modeling

    CERN Document Server

    Benuskova, Lubica

    2010-01-01

    Computational Neurogenetic Modeling is a student text, introducing the scope and problems of a new scientific discipline - Computational Neurogenetic Modeling (CNGM). CNGM is concerned with the study and development of dynamic neuronal models for modeling brain functions with respect to genes and dynamic interactions between genes. These include neural network models and their integration with gene network models. This new area brings together knowledge from various scientific disciplines, such as computer and information science, neuroscience and cognitive science, genetics and molecular biol

  13. Computational modelling of six speed hybrid gear box and its simulation using Simulinkas an interactive tool of MATLAB

    Directory of Open Access Journals (Sweden)

    Devesh Ramphal Upadhyay

    2016-02-01

    Full Text Available The paper introduces an idea contributing to the best fuel economy of a passenger car running at high speed on a highway. A six-speed (forward) gear box, controlled both manually and automatically, is addressed in the paper, which introduces an advancement of the manual transmission gear box for passenger cars. A hydraulic circuit is designed from a mechatronics point of view, making the shifting of gears automatic. A computational design of the Hybrid Gear Box (HGB) is made using CATIA P3 V5 as the design software, with a new gear meshing in the 5-speed manual transmission gear box which synchronizes with the output shaft of the transmission automatically after receiving a command from the designed automated system. Parameters are considered on the basis of a practical model, and the system is simulated using SimDriveline as the Simulink tool of MATLAB R2010a. The mechanical properties of the components of the hybrid gear box are calculated on the basis of the functional parameters and with the help of the fundamental and dependent property formulations. The final result is a graphical analysis of the model, showing at least 15% better fuel efficiency than vehicles of the same configuration.

  14. LHCb computing model

    CERN Document Server

    Frank, M; Pacheco, Andreu

    1998-01-01

    This document is a first attempt to describe the LHCb computing model. The CPU power needed to process data for the event filter and reconstruction is estimated to be 2.2 × 10⁶ MIPS. This will be installed at the experiment and will be reused during non-data-taking periods for reprocessing. The maximal I/O of these activities is estimated to be around 40 MB/s. We have studied three basic models concerning the placement of the CPU resources for the other computing activities, Monte Carlo simulation (1.4 × 10⁶ MIPS) and physics analysis (0.5 × 10⁶ MIPS): CPU resources may either be located at the physicist's home lab, national computer centres (Regional Centres) or at CERN. The CPU resources foreseen for analysis are sufficient to allow 100 concurrent analyses. It is assumed that physicists will work in physics groups that produce analysis data at an average rate of 4.2 MB/s or 11 TB per month. However, producing these group analysis data requires reading capabilities of 660 MB/s. It is further assu...

  15. Parallel computing in enterprise modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Goldsby, Michael E.; Armstrong, Robert C.; Shneider, Max S.; Vanderveen, Keith; Ray, Jaideep; Heath, Zach; Allan, Benjamin A.

    2008-08-01

    This report presents the results of our efforts to apply high-performance computing to entity-based simulations with a multi-use plugin for parallel computing. We use the term 'entity-based simulation' to describe a class of simulation which includes both discrete event simulation and agent based simulation. What simulations of this class share, and what differs from more traditional models, is that the result sought is emergent from a large number of contributing entities. Logistic, economic and social simulations are members of this class where things or people are organized or self-organize to produce a solution. Entity-based problems never have an a priori ergodic principle that will greatly simplify calculations. Because the results of entity-based simulations can only be realized at scale, scalable computing is de rigueur for large problems. Having said that, the absence of a spatial organizing principle makes the decomposition of the problem onto processors problematic. In addition, practitioners in this domain commonly use the Java programming language, which presents its own problems in a high-performance setting. The plugin we have developed, called the Parallel Particle Data Model, overcomes both of these obstacles and is now being used by two Sandia frameworks: the Decision Analysis Center, and the Seldon social simulation facility. While the ability to engage U.S.-sized problems is now available to the Decision Analysis Center, this plugin is central to the success of Seldon. Because Seldon relies on computationally intensive cognitive sub-models, this work is necessary to achieve the scale necessary for realistic results. With the recent upheavals in the financial markets, and the inscrutability of terrorist activity, this simulation domain will likely need a capability with ever greater fidelity. High-performance computing will play an important part in enabling that greater fidelity.

  16. Computer simulation to arc spraying

    Institute of Scientific and Technical Information of China (English)

    梁志芳; 李午申; 王迎娜

    2004-01-01

    The arc spraying process is divided into two stages: the first stage is the atomization-spraying stream (ASS) and the second one is spraying deposition (SD). The status of research on the physical models and corresponding governing equations of both stages is then described. Based on this analysis, the following conclusions are drawn. Heat and mass transfer models in two or three dimensions should be established for the ASS stage in order to analyze more deeply the dynamic and thermal behavior of the overheated droplets. The statistical law of overheated droplets should be further studied by connecting simulation with experiments. More appropriate validation experiments should be designed for the flattening simulation in order to modify the models of the SD stage.

  17. Planning intensive care unit design using computer simulation modeling: optimizing integration of clinical, operational, and architectural requirements.

    Science.gov (United States)

    OʼHara, Susan

    2014-01-01

    Nurses have increasingly been regarded as critical members of the planning team as architects recognize their knowledge and value. But the nurses' role as knowledge experts can be expanded to leading efforts to integrate the clinical, operational, and architectural expertise through simulation modeling. Simulation modeling allows for the optimal merge of multifactorial data to understand the current state of the intensive care unit and predict future states. Nurses can champion the simulation modeling process and reap the benefits of a cost-effective way to test new designs, processes, staffing models, and future programming trends prior to implementation. Simulation modeling is an evidence-based planning approach, a standard, for integrating the sciences with real client data, to offer solutions for improving patient care.

  18. Edible oil structures at low and intermediate concentrations. I. Modeling, computer simulation, and predictions for X ray scattering

    Science.gov (United States)

    Pink, David A.; Quinn, Bonnie; Peyronel, Fernanda; Marangoni, Alejandro G.

    2013-12-01

    Triacylglycerols (TAGs) are biologically important molecules which form the recently discovered highly anisotropic crystalline nanoplatelets (CNPs) and, ultimately, the large-scale fat crystal networks in edible oils. Identifying the hierarchies of these networks and how they spontaneously self-assemble is important to understanding their functionality and oil binding capacity. We have modelled CNPs and studied how they aggregate under the assumption that all CNPs are present before aggregation begins and that their solubility in the liquid oil is very low. We represented CNPs as rigid planar arrays of spheres with diameter ≈50 nm and defined the interaction between spheres in terms of a Hamaker coefficient, A, and a binding energy, VB. We studied three cases: weak binding, |VB|/kBT ≪ 1; physically realistic binding, VB = Vd(R, Δ), so that |VB|/kBT ≈ 1; and strong binding, with |VB|/kBT ≫ 1. We divided the concentration of CNPs, ϕ = 10⁻² × (solid fat content), with 0 ≤ ϕ ≤ 1, into two regions: low and intermediate concentrations, 0 < ϕ < 0.25, and high concentrations, 0.25 < ϕ, and considered only the first case. We employed Monte Carlo computer simulation to model CNP aggregation and analyzed them using static structure functions, S(q). We found that strong-binding cases formed aggregates with fractal dimension D, 1.7 ≤ D ≤ 1.8, in accord with diffusion-limited cluster-cluster aggregation (DLCA), and weak binding formed aggregates with D = 3, indicating a random distribution of CNPs. We found that models with physically realistic intermediate binding energies formed linear multilayer stacks of CNPs (TAGwoods) with fractal dimension D = 1 for ϕ = 0.06, 0.13, and 0.22. TAGwood lengths were greater at lower ϕ than at higher ϕ, where some of the aggregates appeared as thick CNPs. We increased the spatial scale and modelled the TAGwoods as rigid linear arrays of spheres of diameter ≈500 nm, interacting via the attractive van der Waals interaction. We

  19. Computer Simulation for Emergency Incident Management

    Energy Technology Data Exchange (ETDEWEB)

    Brown, D L

    2004-12-03

    This report describes the findings and recommendations resulting from the Department of Homeland Security (DHS) Incident Management Simulation Workshop held by the DHS Advanced Scientific Computing Program in May 2004. This workshop brought senior representatives of the emergency response and incident-management communities together with modeling and simulation technologists from Department of Energy laboratories. The workshop provided an opportunity for incident responders to describe the nature and substance of the primary personnel roles in an incident response, to identify current and anticipated roles of modeling and simulation in support of incident response, and to begin a dialog between the incident response and simulation technology communities that will guide and inform planned modeling and simulation development for incident response. This report provides a summary of the discussions at the workshop as well as a summary of simulation capabilities that are relevant to incident-management training, and recommendations for the use of simulation in both incident management and in incident management training, based on the discussions at the workshop. In addition, the report discusses areas where further research and development will be required to support future needs in this area.

  20. Computer simulations in the science classroom

    Science.gov (United States)

    Richards, John; Barowy, William; Levin, Dov

    1992-03-01

    In this paper we describe software for science instruction that is based upon a constructivist epistemology of learning. From a constructivist perspective, the process of learning is viewed as an active construction of knowledge, rather than a passive reception of information. The computer has the potential to provide an environment in which students can explore their understanding and better construct scientific knowledge. The Explorer is an interactive environment that integrates animated computer models with analytic capabilities for learning and teaching science. The system includes graphs, a spreadsheet, scripting, and interactive tools. During formative evaluation of Explorer in the classroom, we have focused on learning the function and effectiveness of computer models in teaching science. Models have helped students relate theory to experiment when used in conjunction with hands-on activities and when the simulation addressed students' naive understanding of the phenomena. Two classroom examples illustrate our findings. The first is based on the dynamics of colliding objects. The second describes a class modeling the function of simple electric circuits. The simulations bridge between phenomena and theory by providing an abstract representation on which students may make measurements. Simulations based on scientific theory help to provide a set of interrelated experiences that challenge students' informal understanding of the science.

  1. Development of response models for the Earth Radiation Budget Experiment (ERBE) sensors. Part 1: Dynamic models and computer simulations for the ERBE nonscanner, scanner and solar monitor sensors

    Science.gov (United States)

    Halyo, Nesim; Choi, Sang H.; Chrisman, Dan A., Jr.; Samms, Richard W.

    1987-01-01

    Dynamic models and computer simulations were developed for the radiometric sensors utilized in the Earth Radiation Budget Experiment (ERBE). The models were developed to understand performance, improve measurement accuracy by updating model parameters and provide the constants needed for the count conversion algorithms. Model simulations were compared with the sensor's actual responses demonstrated in the ground and inflight calibrations. The models consider thermal and radiative exchange effects, surface specularity, spectral dependence of a filter, radiative interactions among an enclosure's nodes, partial specular and diffuse enclosure surface characteristics and steady-state and transient sensor responses. Relatively few sensor nodes were chosen for the models since there is an accuracy tradeoff between increasing the number of nodes and approximating parameters such as the sensor's size, material properties, geometry, and enclosure surface characteristics. Given that the temperature gradients within a node and between nodes are small enough, approximating with only a few nodes does not jeopardize the accuracy required to perform the parameter estimates and error analyses.

  2. Understanding membrane fouling mechanisms through computational simulations

    Science.gov (United States)

    Xiang, Yuan

    This dissertation focuses on a computational simulation study of the organic fouling mechanisms of reverse osmosis and nanofiltration (RO/NF) membranes, which have been widely used in industry for water purification. The research shows that by establishing a realistic computational model based on available experimental data, we are able to develop a deep understanding of the membrane fouling mechanism. This knowledge is critical for providing a strategic plan to the membrane experimental community and the RO/NF industry for further improvements in membrane technology for water treatment. This dissertation focuses on three major research components: (1) development of realistic molecular models that can well represent the membrane surface properties; (2) investigation of the interactions between the membrane surface and foulants by steered molecular dynamics simulations, in order to determine the major factors that contribute to surface fouling; and (3) studies of the interactions with surface-modified membranes (polyethylene glycol) to provide antifouling strategies.

  3. Computer Simulation of Spatial Arrangement and Connectivity of Particles in Three-Dimensional Microstructure: Application to Model Electrical Conductivity of Polymer Matrix Composite

    Science.gov (United States)

    Louis, P.; Gokhale, A. M.

    1996-01-01

    Computer simulation is a powerful tool for analyzing the geometry of three-dimensional microstructure. A computer simulation model is developed to represent the three-dimensional microstructure of a two-phase particulate composite where particles may be in contact with one another but do not overlap significantly. The model is used to quantify the "connectedness" of the particulate phase of a polymer matrix composite containing hollow carbon particles in a dielectric polymer resin matrix. The simulations are utilized to estimate the morphological percolation volume fraction for electrical conduction, and the effective volume fraction of the particles that actually take part in the electrical conduction. The calculated values of the effective volume fraction are used as an input for a self-consistent physical model for electrical conductivity. The predicted values of electrical conductivity are in very good agreement with the corresponding experimental data on a series of specimens having different particulate volume fraction.
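
    A hedged sketch of the connectivity analysis: random spheres in a unit cube are joined into clusters with union-find, and particles belonging to a cluster that touches both z-faces count as the effective conducting fraction. The random placement used here is a simplification; the paper's microstructure model allows contact but forbids significant overlap.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    N, RADIUS = 2600, 0.035                     # particle count and radius (illustrative)
    pts = rng.random((N, 3))

    parent = list(range(N))
    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]       # path halving
            a = parent[a]
        return a
    def union(a, b):
        parent[find(a)] = find(b)

    # join particles whose centres lie within one diameter (touching/overlapping)
    for i in range(N):
        d2 = np.sum((pts - pts[i]) ** 2, axis=1)
        for j in np.nonzero(d2 < (2 * RADIUS) ** 2)[0]:
            if j > i:
                union(i, int(j))

    roots = np.array([find(i) for i in range(N)])
    bottom = set(roots[pts[:, 2] < RADIUS])     # clusters touching the bottom face
    top = set(roots[pts[:, 2] > 1 - RADIUS])    # clusters touching the top face
    spanning = bottom & top
    effective = int(np.isin(roots, list(spanning)).sum()) if spanning else 0
    print(f"spanning cluster: {bool(spanning)}, effective fraction: {effective / N:.2%}")
    ```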

  4. Modeling and Computer Simulation of the Pulsed Powering of Mechanical D.C. Circuit Breakers for the CERN/LHC Superconducting Magnet Energy Extraction System

    CERN Document Server

    Anushat, V; Erokhin, A; Kussul, A; Medvedko, A S

    2000-01-01

    This article presents the results of modeling and computer simulation of non-linear devices such as the Electromagnetic Driver of a D.C. Circuit Breaker. The mechanical and electromagnetic parts of the Driver are represented as equivalent electrical circuits and all basic processes of the Driver's magnetic circuit are calculated.

  5. Calculation of limits for significant unidirectional changes in two or more serial results of a biomarker based on a computer simulation model

    DEFF Research Database (Denmark)

    Lund, Flemming; Petersen, Per Hyltoft; Fraser, Callum G

    2015-01-01

    concept on more than two results will increase the number of false-positive results. Therefore, a simple method is needed to interpret the significance of a difference when all available serial biomarker results are considered. METHODS: A computer simulation model using Excel was developed. Based on 10...
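
    The simulation idea - generate serial results for a biologically stable patient and read significance limits off the empirical distribution - ports directly from Excel to a few lines of Python. The analytical and within-subject coefficients of variation below are assumed for illustration only.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    CV_TOTAL = float(np.hypot(0.05, 0.10))   # analytical 5% + within-subject 10% (assumed)
    N_RESULTS, N_SIM = 4, 200_000

    # serial results of a stable patient: homeostatic set point 1.0 plus noise
    sims = rng.normal(1.0, CV_TOTAL, size=(N_SIM, N_RESULTS))
    total_change = (sims[:, -1] - sims[:, 0]) / sims[:, 0] * 100.0
    monotone_up = np.all(np.diff(sims, axis=1) > 0, axis=1)

    # The 95th percentile of the apparent change in stable patients is the limit a
    # unidirectional increase must exceed to be called significant at p < 0.05.
    print(f"false-positive rate of a strictly rising pattern: {monotone_up.mean():.2%}")
    print(f"significant-increase limit over {N_RESULTS} results: {np.percentile(total_change, 95):.1f}%")
    ```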

  6. Computer simulation model for the striped bass young-of-the-year population in the Hudson River. [Effects of entrainment and impingement at power plants on population dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Eraslan, A.H.; Van Winkle, W.; Sharp, R.D.; Christensen, S.W.; Goodyear, C.P.; Rush, R.M.; Fulkerson, W.

    1975-09-01

    This report presents a daily transient (tidal-averaged), longitudinally one-dimensional (cross-section-averaged) computer simulation model for the assessment of the entrainment and impingement impacts of power plant operations on young-of-the-year populations of the striped bass, Morone saxatilis, in the Hudson River.

  7. Petri nets in Snoopy: a unifying framework for the graphical display, computational modelling, and simulation of bacterial regulatory networks.

    Science.gov (United States)

    Marwan, Wolfgang; Rohr, Christian; Heiner, Monika

    2012-01-01

    Using the example of phosphate regulation in enteric bacteria, we demonstrate the particular suitability of stochastic Petri nets to model biochemical phenomena and their simulative exploration by various features of the software tool Snoopy.

  8. Computer simulation of aeolian bedforms

    Institute of Scientific and Technical Information of China (English)

    苗天德; 慕青松; 武生智

    2001-01-01

    A discrete model is set up using the cellular automaton method and applied to simulate the formation and evolution of aeolian bedforms. The calculated bedforms resemble the actual shapes of natural sand ripples and dunes. This reveals that sand movement is a typical nonlinear dynamical process, and that the nesting configuration of sand ripples, dunes and draas is a self-organized system with a fractal characteristic, which evolves simultaneously at various scales in the sand-airflow.
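
    A minimal Werner-style cellular automaton conveys the flavor of such models: sand slabs are eroded, hop a fixed saltation length downwind, deposit preferentially on sandy sites, and an avalanche rule relaxes over-steep faces. All rules and parameters below are illustrative; the paper's automaton differs in detail.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    N, HOP, STEPS = 200, 5, 200_000
    h = rng.integers(0, 3, size=N)              # surface height in slab units

    for _ in range(STEPS):
        i = int(rng.integers(N))
        if h[i] == 0:
            continue                            # nothing to erode here
        h[i] -= 1                               # pick up one slab
        j = (i + HOP) % N
        while True:
            p_dep = 0.6 if h[j] > 0 else 0.4    # sandy sites trap slabs more easily
            if rng.random() < p_dep:
                h[j] += 1
                break
            j = (j + HOP) % N                   # keep saltating downwind
        k = int(rng.integers(N))                # avalanche: relax an over-steep face
        if h[k] - h[(k + 1) % N] > 2:
            h[k] -= 1
            h[(k + 1) % N] += 1

    print("relief (max - min height in slabs):", int(h.max() - h.min()))
    ```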

  9. Accelerating Climate Simulations Through Hybrid Computing

    Science.gov (United States)

    Zhou, Shujia; Sinno, Scott; Cruz, Carlos; Purcell, Mark

    2009-01-01

    Unconventional multi-core processors (e.g., IBM Cell B/E and NVIDIA GPU) have emerged as accelerators in climate simulation. However, climate models typically run on parallel computers with conventional processors (e.g., Intel and AMD) using MPI. Connecting accelerators to this architecture efficiently and easily becomes a critical issue. When using MPI for connection, we identified two challenges: (1) an identical MPI implementation is required in both systems, and (2) existing MPI code must be modified to accommodate the accelerators. In response, we have extended and deployed IBM Dynamic Application Virtualization (DAV) in a hybrid computing prototype system (one blade with two Intel quad-core processors, two IBM QS22 Cell blades, connected with Infiniband), allowing for seamlessly offloading compute-intensive functions to remote, heterogeneous accelerators in a scalable, load-balanced manner. Currently, a climate solar radiation model running with multiple MPI processes has been offloaded to multiple Cell blades with approximately 10% network overhead.

  10. Computer simulation of spacecraft/environment interaction.

    Science.gov (United States)

    Krupnikov, K K; Makletsov, A A; Mileev, V N; Novikov, L S; Sinolits, V V

    1999-10-01

    This report presents some examples of a computer simulation of spacecraft interaction with the space environment. We analysed a set of data on electron and ion fluxes measured in 1991-1994 on the geostationary satellite GORIZONT-35. The influence of spacecraft eclipse and device eclipse by the solar-cell panel on spacecraft charging was investigated. A simple method was developed for the estimation of spacecraft potentials in LEO. Effects of various particle flux impacts and spacecraft orientation are discussed. A computer engineering model for the calculation of space radiation is presented. This model is used as a client/server model with a WWW interface, including spacecraft model description and results representation based on the Virtual Reality Modeling Language (VRML).

  11. Computer simulation of spacecraft/environment interaction

    CERN Document Server

    Krupnikov, K K; Mileev, V N; Novikov, L S; Sinolits, V V

    1999-01-01

    This report presents some examples of a computer simulation of spacecraft interaction with the space environment. We analysed a set of data on electron and ion fluxes measured in 1991-1994 on the geostationary satellite GORIZONT-35. The influence of spacecraft eclipse and device eclipse by the solar-cell panel on spacecraft charging was investigated. A simple method was developed for the estimation of spacecraft potentials in LEO. Effects of various particle flux impacts and spacecraft orientation are discussed. A computer engineering model for the calculation of space radiation is presented. This model is used as a client/server model with a WWW interface, including spacecraft model description and results representation based on the Virtual Reality Modeling Language (VRML).

  12. Computer Simulation of Radial Immunodiffusion

    Science.gov (United States)

    Trautman, Rodes

    1972-01-01

    Theories of diffusion with chemical reaction are reviewed as to their contributions toward developing an algorithm needed for computer simulation of immunodiffusion. The Spiers-Augustin moving sink and the Engelberg stationary sink theories show how the antibody-antigen reaction can be incorporated into boundary conditions of the free diffusion differential equations. For this, a stoichiometric precipitate was assumed and the location of precipitin lines could be predicted. The Hill simultaneous linear adsorption theory provides a mathematical device for including another special type of antibody-antigen reaction in antigen excess regions of the gel. It permits an explanation for the lowered antigen diffusion coefficient, observed in the Oudin arrangement of single linear diffusion, but does not enable prediction of the location of precipitin lines. The most promising mathematical approach for a general solution is implied in the Augustin alternating cycle theory. This assumes the immunodiffusion process can be evaluated by alternating computation cycles: free diffusion without chemical reaction and chemical reaction without diffusion. The algorithm for the free diffusion update cycle, extended to both linear and radial geometries, is given in detail since it was based on gross flow rather than more conventional expressions in terms of net flow. Limitations on the numerical integration process using this algorithm are illustrated for free diffusion from a cylindrical well. PMID:4629869
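
    The alternating cycle idea translates directly into code. The sketch below is an illustrative reconstruction rather than Trautman's algorithm: each time step first performs free diffusion without chemical reaction (an explicit finite-difference update, in linear geometry for brevity) and then reaction without diffusion (instantaneous stoichiometric precipitation wherever antigen and antibody coexist). All concentrations and coefficients are illustrative.

    ```python
    # Illustrative reconstruction of the alternating-cycle idea (not Trautman's
    # code): diffusion without reaction, then reaction without diffusion.
    import numpy as np

    N, dx, dt = 200, 1.0, 0.2
    D_AG, D_AB = 1.0, 0.5                  # diffusion coefficients (arbitrary units)
    ag = np.zeros(N); ag[:10] = 10.0       # antigen loaded in the "well"
    ab = np.full(N, 1.0); ab[:10] = 0.0    # antibody uniform in the gel
    ppt = np.zeros(N)                      # immobile precipitate

    def diffuse(c, D):
        """One explicit finite-difference step of free diffusion (cycle 1)."""
        lap = np.zeros_like(c)
        lap[1:-1] = (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2
        return c + D * dt * lap

    for _ in range(20_000):
        ag, ab = diffuse(ag, D_AG), diffuse(ab, D_AB)
        reacted = np.minimum(ag, ab)       # cycle 2: stoichiometric precipitation
        ag -= reacted; ab -= reacted; ppt += reacted

    print("precipitin band near cell", int(np.argmax(ppt)))
    ```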

  13. Computational Aerodynamic Simulations of an 840 ft/sec Tip Speed Advanced Ducted Propulsor Fan System Model for Acoustic Methods Assessment and Development

    Science.gov (United States)

    Tweedt, Daniel L.

    2014-01-01

    Computational Aerodynamic simulations of an 840 ft/sec tip speed, Advanced Ducted Propulsor fan system were performed at five different operating points on the fan operating line, in order to provide detailed internal flow field information for use with fan acoustic prediction methods presently being developed, assessed and validated. The fan system is a sub-scale, low-noise research fan/nacelle model that has undergone extensive experimental testing in the 9- by 15-foot Low Speed Wind Tunnel at the NASA Glenn Research Center, resulting in detailed, high-quality aerodynamic and acoustic measurement data. Details of the fan geometry, the computational fluid dynamics methods, the computational grids, and various computational parameters relevant to the numerical simulations are discussed. Flow field results for three of the five operating conditions simulated are presented in order to provide a representative look at the computed solutions. Each of the five fan aerodynamic simulations involved the entire fan system, excluding a long core duct section downstream of the core inlet guide vane. As a result, only fan rotational speed and system bypass ratio, set by specifying static pressure downstream of the core inlet guide vane row, were adjusted in order to set the fan operating point, leading to operating points that lie on a fan operating line and making mass flow rate a fully dependent parameter. The resulting mass flow rates are in good agreement with measurement values. The computed blade row flow fields for all five fan operating points are, in general, aerodynamically healthy. Rotor blade and fan exit guide vane flow characteristics are good, including incidence and deviation angles, chordwise static pressure distributions, blade surface boundary layers, secondary flow structures, and blade wakes. Examination of the computed flow fields reveals no excessive boundary layer separations or related secondary-flow problems. A few spanwise comparisons between...

  14. Computer simulation of solder joint failure

    Energy Technology Data Exchange (ETDEWEB)

    Burchett, S.N.; Frear, D.R. [Sandia National Lab., Albuquerque, NM (United States)]; Rashid, M.M. [Univ. of California, Davis, CA (United States)]

    1997-04-01

    The thermomechanical fatigue failure of solder joints is increasingly becoming an important reliability issue for electronic packages. The purpose of this Laboratory Directed Research and Development (LDRD) project was to develop computational tools for simulating the behavior of solder joints under strain and temperature cycling, taking into account the microstructural heterogeneities that exist in as-solidified near eutectic Sn-Pb joints, as well as subsequent microstructural evolution. The authors present two computational constitutive models, a two-phase model and a single-phase model, that were developed to predict the behavior of near eutectic Sn-Pb solder joints under fatigue conditions. Unique metallurgical tests provide the fundamental input for the constitutive relations. The two-phase model mathematically predicts the heterogeneous coarsening behavior of near eutectic Sn-Pb solder. The finite element simulations with this model agree qualitatively with experimental thermomechanical fatigue tests. The simulations show that the presence of an initial heterogeneity in the solder microstructure could significantly degrade the fatigue lifetime. The single-phase model was developed to predict solder joint behavior using materials data for constitutive relation constants that could be determined through straightforward metallurgical experiments. Special thermomechanical fatigue tests were developed to give fundamental materials input to the models, and an in situ SEM thermomechanical fatigue test system was developed to characterize microstructural evolution and the mechanical behavior of solder joints during the test. A shear/torsion test sample was developed to impose strain in two different orientations. Materials constants were derived from these tests. The simulation results from the two-phase model showed good fit to the experimental test results.

  15. Thread Group Multithreading: Accelerating the Computation of an Agent-Based Power System Modeling and Simulation Tool -- GridLAB-D

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Shuangshuang; Chassin, David P.

    2014-01-06

    GridLAB-D (TM) is an open-source, next-generation, agent-based smart-grid simulator that provides unprecedented capability to model the performance of smart grid technologies. Over the past few years, GridLAB-D has been used to conduct important analyses of smart grid concepts, but it is still quite limited by its computational performance. In order to break through the performance bottleneck and meet the need for large-scale power grid simulations, we developed a thread group mechanism to implement highly granular multithreaded computation in GridLAB-D. We achieve close to linear speedup with the multithreaded version compared against the single-threaded version of the same code running on general-purpose multi-core commodity hardware for a benchmark simple house model. The performance of the multithreaded code shows favorable scalability properties and resource utilization, and much shorter execution times for large-scale power grid simulations.
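
    GridLAB-D itself is written in C/C++; the Python sketch below only illustrates the thread-group pattern described in the abstract, with a hypothetical House class standing in for a simulated object: objects are statically partitioned into groups, and each group is advanced by its own worker. In CPython a real CPU speedup would additionally require GIL-releasing native code, so treat this as a picture of the partitioning, not a benchmark.

    ```python
    # Conceptual sketch of a thread-group pattern; House is a hypothetical
    # stand-in class, and no real speedup is claimed under the GIL.
    from concurrent.futures import ThreadPoolExecutor

    class House:
        def __init__(self):
            self.temp = 20.0
        def sync(self, dt):
            # stand-in for the per-object "sync" computation
            self.temp += 0.01 * (21.0 - self.temp) * dt

    houses = [House() for _ in range(100_000)]
    N_THREADS = 8
    groups = [houses[i::N_THREADS] for i in range(N_THREADS)]  # static partition

    def sync_group(group, dt=1.0):
        for obj in group:
            obj.sync(dt)

    with ThreadPoolExecutor(max_workers=N_THREADS) as pool:
        # one pass of the simulation clock: all groups advance in parallel
        list(pool.map(sync_group, groups))
    ```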

  16. Optically simulated universal quantum computation

    Science.gov (United States)

    Francisco, D.; Ledesma, S.

    2008-04-01

    Recently, classical optics based systems to emulate quantum information processing have been proposed. The analogy is based on the possibility of encoding a quantum state of a system with a 2N-dimensional Hilbert space as an image in the input of an optical system. The probability amplitude of each state of a certain basis is associated with the complex amplitude of the electromagnetic field in a given slice of the laser wavefront. Temporal evolution is represented as the change of the complex amplitude of the field when the wavefront passes through a certain optical arrangement. Different modules that represent universal gates for quantum computation have been implemented. For instance, unitary operations acting on the qubit space (or U(2) gates) are represented by means of two phase plates, two spherical lenses and a phase grating in a typical image processing set up. In this work, we present CNOT gates which are emulated by means of a cube prism that splits a pair of adjacent rays incoming from the input image. As an example of application, we present an optical module that can be used to simulate the quantum teleportation process. We also show experimental results that illustrate the validity of the analogy. Although the experimental results obtained are promising and show the capability of the system to simulate the real quantum process, we must take into account that any classical simulation of quantum phenomena has as a fundamental limitation the impossibility of representing non-local entanglement. In this classical context, quantum teleportation has only an illustrative interpretation.
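
    A purely numerical analogue of what the optical modules emulate fits in a few lines: the 2^N complex amplitudes that the paper encodes in slices of a laser wavefront become a state vector, and the U(2) and CNOT modules become matrix products. This sketches the underlying linear algebra only, not the optical implementation.

    ```python
    # Numerical analogue of the emulated gates: state vector plus matrix products.
    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard, a U(2) gate
    I2 = np.eye(2)
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]], dtype=float)

    state = np.zeros(4); state[0] = 1.0            # |00>
    state = np.kron(H, I2) @ state                 # H on the first qubit
    state = CNOT @ state                           # entangling gate
    print(state)                                   # (|00> + |11>)/sqrt(2)
    ```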

  17. Appendices to the model description document for a computer program for the emulation/simulation of a space station environmental control and life support system

    Science.gov (United States)

    Yanosy, James L.

    1988-01-01

    A Model Description Document for the Emulation/Simulation Computer Model was published previously. The model consisted of a detailed model (emulation) of a SAWD CO2 removal subsystem operating with much less detailed (simulation) models of a cabin, crew, and condensing and sensible heat exchangers. The purpose was to explore the utility of such an emulation/simulation combination in the design, development, and test of a piece of ARS hardware, SAWD. Extensions to this original effort are presented. The first extension is an update of the model to reflect changes in the SAWD control logic which resulted from testing. Slight changes were also made to the SAWD model to permit restarting and to improve the iteration technique. The second extension is the development of simulation models for more pieces of air and water processing equipment. Models are presented for: EDC, Molecular Sieve, Bosch, Sabatier, a new condensing heat exchanger, SPE, SFWES, Catalytic Oxidizer, and multifiltration. The third extension is the creation of two system simulations using these models. The first system consists of one air and one water processing system. The second consists of a potential air revitalization system.

  18. Creating Electronic Books-Chapters for Computers and Tablets Using Easy Java/JavaScript Simulations, EjsS Modeling Tool

    CERN Document Server

    Wee, Loo Kang

    2015-01-01

    This paper shares my journey (tools used, design principles derived and modeling pedagogy implemented) in creating electronic book chapters (epub3 format) for computers and tablets using the Easy Java/JavaScript Simulations (old name EJS, new name EjsS) Modeling Tool. The theory underpinning this work is grounded in learning by doing through dynamic and interactive simulation-models that can be made sense of more easily than static printed materials. I started combining related computer models with supporting texts and illustrations into a coherent chapter, a logical next step towards tighter support for teachers and students, developing prototype electronic chapters on the topics of Simple Harmonic Motion and Gravity customized for the Singapore-Cambridge General Certificate of Education Advanced Level (A-level). I aim to inspire more educators to create interactive and open educational resources for the benefit of all. Prototypes: http://iwant2study.org/ospsg/index.php/interactive-resources/phy...

  19. QCE: A Simulator for Quantum Computer Hardware

    NARCIS (Netherlands)

    Michielsen, Kristel; Raedt, Hans De

    2003-01-01

    The Quantum Computer Emulator (QCE) described in this paper consists of a simulator of a generic, general purpose quantum computer and a graphical user interface. The latter is used to control the simulator, to define the hardware of the quantum computer and to debug and execute quantum algorithms.

  20. Strange attractor simulated on a quantum computer

    OpenAIRE

    2002-01-01

    We show that dissipative classical dynamics converging to a strange attractor can be simulated on a quantum computer. Such quantum computations allow one to investigate efficiently the small-scale structure of strange attractors, yielding new information inaccessible to classical computers. This opens new possibilities for quantum simulations of various dissipative processes in nature.

  1. Computer Simulation Studies in Condensed-Matter Physics XVII

    Science.gov (United States)

    Landau, D. P.; Lewis, S. P.; Schüttler, H.-B.

    This status report features the most recent developments in the field, spanning a wide range of topical areas in the computer simulation of condensed matter/materials physics. Both established and new topics are included, ranging from the statistical mechanics of classical magnetic spin models to electronic structure calculations, quantum simulations, and simulations of soft condensed matter. The book presents new physical results as well as novel methods of simulation and data analysis. Highlights of this volume include various aspects of non-equilibrium statistical mechanics, studies of properties of real materials using both classical model simulations and electronic structure calculations, and the use of computer simulations in teaching.

  2. Documentation of a computer program to simulate lake-aquifer interaction using the MODFLOW ground water flow model and the MOC3D solute-transport model

    Science.gov (United States)

    Merritt, Michael L.; Konikow, Leonard F.

    2000-01-01

    Heads and flow patterns in surficial aquifers can be strongly influenced by the presence of stationary surface-water bodies (lakes) that are in direct contact, vertically and laterally, with the aquifer. Conversely, lake stages can be significantly affected by the volume of water that seeps through the lakebed that separates the lake from the aquifer. For these reasons, a set of computer subroutines called the Lake Package (LAK3) was developed to represent lake/aquifer interaction in numerical simulations using the U.S. Geological Survey three-dimensional, finite-difference, modular ground-water flow model MODFLOW and the U.S. Geological Survey three-dimensional method-of-characteristics solute-transport model MOC3D. In the Lake Package described in this report, a lake is represented as a volume of space within the model grid which consists of inactive cells extending downward from the upper surface of the grid. Active model grid cells bordering this space, representing the adjacent aquifer, exchange water with the lake at a rate determined by the relative heads and by conductances that are based on grid cell dimensions, hydraulic conductivities of the aquifer material, and user-specified leakance distributions that represent the resistance to flow through the material of the lakebed. Parts of the lake may become "dry" as upper layers of the model are dewatered, with a concomitant reduction in lake surface area, and may subsequently rewet when aquifer heads rise. An empirical approximation has been encoded to simulate the rewetting of a lake that becomes completely dry. The variations of lake stages are determined by independent water budgets computed for each lake in the model grid. This lake budget process makes the package a simulator of the response of lake stage to hydraulic stresses applied to the aquifer. Implementation of a lake water budget requires input of parameters including those representing the rate of lake atmospheric recharge and evaporation...
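
    In its simplest form, the lake budget process described above is a head-dependent flux balance. The sketch below is a schematic stand-in for the Lake Package computation, not its actual code: stage responds to recharge, evaporation, and Darcy-type seepage through per-cell lakebed conductances, with all numbers illustrative.

    ```python
    # Schematic lake water-budget update in the spirit of the Lake Package;
    # all values (areas, rates, conductances) are illustrative.
    def update_lake_stage(stage, area, recharge, evap, cells, dt):
        """cells: (aquifer_head, conductance) pairs for cells bordering the lake.
        Positive seepage means flow from the lake into the aquifer."""
        seepage = sum(c * (stage - h) for h, c in cells)       # Darcy-type flux
        dV = (recharge - evap) * area * dt - seepage * dt      # volume change
        return stage + dV / area                               # new stage

    stage = 100.0
    for _ in range(365):  # daily steps for one year
        stage = update_lake_stage(stage, area=1e6, recharge=2e-3, evap=3e-3,
                                  cells=[(99.5, 50.0), (99.8, 80.0)], dt=1.0)
    print(f"end-of-year stage: {stage:.2f}")
    ```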

  3. Alternative energy technologies an introduction with computer simulations

    CERN Document Server

    Buxton, Gavin

    2014-01-01

    Contents include: Introduction to Alternative Energy Sources; Global Warming; Pollution; Solar Cells; Wind Power; Biofuels; Hydrogen Production and Fuel Cells; Introduction to Computer Modeling; Brief History of Computer Simulations; Motivation and Applications of Computer Models; Using Spreadsheets for Simulations; Typing Equations into Spreadsheets; Functions Available in Spreadsheets; Random Numbers; Plotting Data; Macros and Scripts; Interpolation and Extrapolation; Numerical Integration and Diffe...

  4. Computation of a combined spherical-elastic and viscous-half-space earth model for ice sheet simulation

    CERN Document Server

    Bueler, Ed; Lingle, Craig S.; Kallen-Brown, Jed A.

    2006-01-01

    This report starts by describing the continuum model used by Lingle & Clark (1985) to approximate the deformation of the earth under changing ice sheet and ocean loads. That source considers a single ice stream, but we apply their underlying model to continent-scale ice sheet simulation. Their model combines Farrell's (1972) elastic spherical earth with a viscous half-space overlain by an elastic plate lithosphere. The latter half-space model is derivable from calculations by Cathles (1975). For the elastic spherical earth we use Farrell's tabulated Green's function, as do Lingle & Clark. For the half-space model, however, we propose and implement a significantly faster numerical strategy, a spectral collocation method (Trefethen 2000) based directly on the Fast Fourier Transform. To verify this method we compare to an integral formula for a disc load. To compare earth models we build an accumulation history from a growing similarity solution (Bueler et al. 2005) and simulate the coupled (ic...

  5. Computer simulations applied in materials

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2003-07-01

    This workshop takes stock of the simulation methods applied to nuclear materials and discusses the conditions in which these methods can predict physical results when no experimental data are available. The main topic concerns the radiation effects in oxides and includes also the behaviour of fission products in ceramics, the diffusion and segregation phenomena and the thermodynamical properties under irradiation. This document brings together a report of the previous 2002 workshop and the transparencies of 12 presentations among the 15 given at the workshop: accommodation of uranium and plutonium in pyrochlores; radiation effects in La{sub 2}Zr{sub 2}O{sub 7} pyrochlores; first principle calculations of defects formation energies in the Y{sub 2}(Ti,Sn,Zr){sub 2}O{sub 7} pyrochlore system; an approximate approach to predicting radiation tolerant materials; molecular dynamics study of the structural effects of displacement cascades in UO{sub 2}; composition defect maps for A{sup 3+}B{sup 3+}O{sub 3} perovskites; NMR characterization of radiation damaged materials: using simulation to interpret the data; local structure in damaged zircon: a first principle study; simulation studies on SiC; insertion and diffusion of He in 3C-SiC; a review of helium in silica; self-trapped holes in amorphous silicon dioxide: their short-range structure revealed from electron spin resonance and optical measurements and opportunities for inferring intermediate range structure by theoretical modelling. (J.S.)

  7. Comprehensive Memory-Bound Simulations on Single Board Computers

    OpenAIRE

    Himpe, Christian; Leibner, Tobias; Rave, Stephan

    2017-01-01

    Numerical simulations of increasingly complex models demand growing amounts of (main) memory. Typically, large quantities of memory are provided by workstation- and server-type computers, which in turn consume massive amounts of power. Model order reduction can reduce the memory requirements of simulations by constructing reduced order models, yet the assembly of these surrogate models itself often requires memory-rich compute environments. We resolve this deadlock by careful algorithmic desig...

  8. Modeling Trusted Computing

    Institute of Scientific and Technical Information of China (English)

    CHEN Shuyi; WEN Yingyou; ZHAO Hong

    2006-01-01

    In this paper, a formal approach based on predicate logic is proposed for representing and reasoning about trusted computing models. Predicates are defined to represent the characteristics of the objects and the relationships among these objects in a trusted system according to trusted computing specifications. Inference rules for trust relations are given as well. With the semantics proposed, some trusted computing models are formalized and verified, which shows that predicate calculus provides a general and effective method for modeling and reasoning about trusted computing systems.

  9. Computer Simulations of the Fatigue Crack Propagation

    Directory of Open Access Journals (Sweden)

    A. Materna

    2000-01-01

    The following hypothesis for the design of structures based on the damage tolerance philosophy is laid down: the perpendicular fatigue crack growth rate v at a certain point of a curved crack front is given by the local value of the stress intensity factor per unit of nominal stress, K', and the local triaxiality, T, which describes the constraint. The relationship v = f(K', T) is supposed to be typical for a given loading spectrum and material. Such a relationship for a 2024 Al alloy and the flight-simulation spectrum was derived from a fatigue test of a rectangular panel with a central hole and used for three-dimensional simulation of corner fatigue crack propagation in a model of the wing spar flange plate. Finite element and boundary element methods were used for these computations. The results of the simulation are in good agreement with the experiment.
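
    The hypothesis v = f(K', T) suggests a direct numerical scheme: discretize the crack front and advance each point by its local growth rate. The sketch below uses an assumed Paris-type power law modulated by triaxiality as a stand-in for the empirically derived relationship; the constants are illustrative, not the fitted values for the 2024 Al alloy.

    ```python
    # Sketch of advancing a discretized crack front by v = f(K', T); the
    # power-law form and constants below are assumed, not the fitted relation.
    def advance_front(front, d_cycles, C=1e-7, m=3.0, alpha=0.5):
        """front: (K_prime, T) pairs at points along the crack front."""
        # Paris-type law modulated by triaxiality T (illustrative form)
        return [C * (Kp ** m) * (1.0 + alpha * T) * d_cycles for Kp, T in front]

    front = [(12.0, 0.8), (14.0, 1.1), (13.0, 0.9)]
    growth = advance_front(front, d_cycles=1000)
    print(["%.3e" % g for g in growth])            # per-point crack extension
    ```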

  10. Computer Simulation of Developmental Processes and ...

    Science.gov (United States)

    Rationale: Recent progress in systems toxicology and synthetic biology has paved the way to new thinking about in vitro/in silico modeling of developmental processes and toxicities, both for embryological and reproductive impacts. Novel in vitro platforms such as 3D organotypic culture models, engineered microscale tissues and complex microphysiological systems (MPS), together with computational models and computer simulation of tissue dynamics, lend themselves to integrated testing strategies for predictive toxicology. As these emergent methodologies continue to evolve, they must be integrally tied to maternal/fetal physiology and toxicity of the developing individual across early lifestage transitions, from fertilization to birth, through puberty and beyond. Scope: This symposium will focus on how the novel technology platforms can help now and in the future, with in vitro/in silico modeling of complex biological systems for developmental and reproductive toxicity issues, and translating systems models into integrative testing strategies. The symposium is based on three main organizing principles: (1) that novel in vitro platforms with human cells configured in nascent tissue architectures with native microphysiological environments yield mechanistic understanding of developmental and reproductive impacts of drug/chemical exposures; (2) that novel in silico platforms with high-throughput screening (HTS) data, biologically-inspired computational models of...

  11. Computational Aerodynamic Simulations of a 1215 ft/sec Tip Speed Transonic Fan System Model for Acoustic Methods Assessment and Development

    Science.gov (United States)

    Tweedt, Daniel L.

    2014-01-01

    Computational Aerodynamic simulations of a 1215 ft/sec tip speed transonic fan system were performed at five different operating points on the fan operating line, in order to provide detailed internal flow field information for use with fan acoustic prediction methods presently being developed, assessed and validated. The fan system is a sub-scale, low-noise research fan/nacelle model that has undergone extensive experimental testing in the 9- by 15-foot Low Speed Wind Tunnel at the NASA Glenn Research Center. Details of the fan geometry, the computational fluid dynamics methods, the computational grids, and various computational parameters relevant to the numerical simulations are discussed. Flow field results for three of the five operating points simulated are presented in order to provide a representative look at the computed solutions. Each of the five fan aerodynamic simulations involved the entire fan system, which for this model did not include a split flow path with core and bypass ducts. As a result, it was only necessary to adjust fan rotational speed in order to set the fan operating point, leading to operating points that lie on a fan operating line and making mass flow rate a fully dependent parameter. The resulting mass flow rates are in good agreement with measurement values. Computed blade row flow fields at all fan operating points are, in general, aerodynamically healthy. Rotor blade and fan exit guide vane flow characteristics are good, including incidence and deviation angles, chordwise static pressure distributions, blade surface boundary layers, secondary flow structures, and blade wakes. Examination of the flow fields at all operating conditions reveals no excessive boundary layer separations or related secondary-flow problems.

  12. Multidimensional computer simulation of Stirling cycle engines

    Science.gov (United States)

    Hall, C. A.; Porsching, T. A.; Medley, J.; Tew, R. C.

    1990-01-01

    The computer code ALGAE (algorithms for the gas equations) treats incompressible, thermally expandable, or locally compressible flows in complicated two-dimensional flow regions. The solution method, finite differencing schemes, and basic modeling of the field equations in ALGAE are applicable to engineering design settings of the type found in Stirling cycle engines. The use of ALGAE to model multiple components of the space power research engine (SPRE) is reported. Videotape computer simulations of the transient behavior of the working gas (helium) in the heater-regenerator-cooler complex of the SPRE demonstrate the usefulness of such a program in providing information on thermal and hydraulic phenomena in multiple component sections of the SPRE.

  13. Investigation of Carbohydrate Recognition via Computer Simulation

    Directory of Open Access Journals (Sweden)

    Quentin R. Johnson

    2015-04-01

    Carbohydrate recognition by proteins, such as lectins and other (bio)molecules, can be essential for many biological functions. Recently, interest has arisen due to potential protein and drug design and future bioengineering applications. A quantitative measurement of carbohydrate-protein interaction is thus important for the full characterization of sugar recognition. In this review, we focus on utilizing computer simulations and biophysical models to evaluate the strength and specificity of carbohydrate recognition. With increasing computational resources, better algorithms, and refined modeling parameters, using state-of-the-art supercomputers to calculate the strength of the interaction between molecules has become increasingly mainstream. We review the current state of this technique and its successful applications for studying protein-sugar interactions in recent years.

  14. Computer simulation in nuclear science and engineering

    Energy Technology Data Exchange (ETDEWEB)

    Akiyama, Mamoru; Miya, Kenzo; Iwata, Shuichi; Yagawa, Genki; Kondo, Shusuke (Tokyo Univ. (Japan)); Hoshino, Tsutomu; Shimizu, Akinao; Takahashi, Hiroshi; Nakagawa, Masatoshi

    1992-03-01

    The numerical simulation technology used for the design of nuclear reactors spans a wide range of scientific fields and has been cultivated through steady efforts toward high calculation accuracy in safety examinations, reliability verification tests, the assessment of operating results, and so on. With numerical simulation now in practical use across wide fields, the numerical simulation of five basic equations which describe the natural world and the progress of related technologies are reviewed. It is expected that numerical simulation technology contributes not only as a means of design study but also to the progress of science and technology, such as the construction of new innovative concepts and the exploration of new mechanisms and substances for which no models exist in the natural world. The development of atomic energy and the progress of computers, Boltzmann's transport equation and its periphery, the Navier-Stokes equation and its periphery, Maxwell's electromagnetic field equation and its periphery, the Schroedinger wave equation and its periphery, computational solid mechanics and its periphery, and probabilistic risk assessment and its periphery are described. (K.I.).

  15. Computer Code for Nanostructure Simulation

    Science.gov (United States)

    Filikhin, Igor; Vlahovic, Branislav

    2009-01-01

    Due to their small size, nanostructures can have stress and thermal gradients that are larger than any macroscopic analogue. These gradients can lead to specific regions that are susceptible to failure via processes such as plastic deformation by dislocation emission, chemical debonding, and interfacial alloying. A program has been developed that rigorously simulates and predicts optoelectronic properties of nanostructures of virtually any geometrical complexity and material composition. It can be used in simulations of energy level structure, wave functions, density of states of spatially configured phonon-coupled electrons, excitons in quantum dots, quantum rings, quantum ring complexes, and more. The code can be used to calculate stress distributions and thermal transport properties for a variety of nanostructures and interfaces, transport and scattering at nanoscale interfaces and surfaces under various stress states, and alloy compositional gradients. The code allows users to perform modeling of charge transport processes through quantum-dot (QD) arrays as functions of inter-dot distance, array order versus disorder, QD orientation, shape, size, and chemical composition for applications in photovoltaics and physical properties of QD-based biochemical sensors. The code can be used to study the hot exciton formation/relation dynamics in arrays of QDs of different shapes and sizes at different temperatures. It also can be used to understand the relation among the deposition parameters and inherent stresses, strain deformation, heat flow, and failure of nanostructures.

  16. Computational Model of Agent-based Consumer Behavior Simulation

    Institute of Scientific and Technical Information of China (English)

    崔雪彬; 陆云波

    2011-01-01

    Agent-based simulation of consumer markets is a novel and complex research area in which, at present, few Chinese studies exist. To develop this line of research, a computational model of agent-based consumer behavior was constructed by scientifically quantifying the underlying mechanism relating consumer behavior to marketing strategies in a virtual market, working to make it computable; this model lays a solid foundation for the future implementation and development of the simulation model. Against the background of agent-based simulation technology, the model uses utility equations to set micro-level rules for each agent: by measuring the utility of a product's marketing mix to a consumer, it simulates the consumer's product-choice behavior. In future work, the computational model still needs to be implemented on a simulation platform and to undergo a complex and lengthy validation process.

  17. New Computer Simulations of Macular Neural Functioning

    Science.gov (United States)

    Ross, Muriel D.; Doshay, D.; Linton, S.; Parnas, B.; Montgomery, K.; Chimento, T.

    1994-01-01

    We use high performance graphics workstations and supercomputers to study the functional significance of the three-dimensional (3-D) organization of gravity sensors. These sensors have a prototypic architecture foreshadowing more complex systems. Scaled-down simulations run on a Silicon Graphics workstation and scaled-up, 3-D versions run on a Cray Y-MP supercomputer. A semi-automated method of reconstruction of neural tissue from serial sections studied in a transmission electron microscope has been developed to eliminate tedious conventional photography. The reconstructions use a mesh as a step in generating a neural surface for visualization. Two meshes are required to model calyx surfaces. The meshes are connected and the resulting prisms represent the cytoplasm and the bounding membranes. A finite volume analysis method is employed to simulate voltage changes along the calyx in response to synapse activation on the calyx or on calyceal processes. The finite volume method insures that charge is conserved at the calyx-process junction. These and other models indicate that efferent processes act as voltage followers, and that the morphology of some afferent processes affects their functioning. In a final application, morphological information is symbolically represented in three dimensions in a computer. The possible functioning of the connectivities is tested using mathematical interpretations of physiological parameters taken from the literature. Symbolic, 3-D simulations are in progress to probe the functional significance of the connectivities. This research is expected to advance computer-based studies of macular functioning and of synaptic plasticity.

  18. The Guide to Computer Simulations and Games

    CERN Document Server

    Becker, K

    2011-01-01

    The first computer simulation book for anyone designing or building a game. Answering the growing demand for a book catering to those who design, develop, or use simulations and games, this book teaches you exactly what you need to know in order to understand the simulations you build or use, all without having to earn another degree. Organized into three parts, this informative book first defines computer simulations and describes how they are different from live-action and paper-based simulations. The second section builds upon the first, with coverage of the technical details of simulations...

  19. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    Science.gov (United States)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two part model framework characterizes both the demand using a probability distribution for each type of service request as well as enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
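
    The two-part framework, stochastic demand against constrained resources, is the textbook structure of a discrete event simulation. The sketch below is a minimal illustration of that structure rather than the authors' model: exponentially distributed service requests compete for a fixed pool of servers, and mean waiting time is the output metric; all rates and capacities are illustrative.

    ```python
    # Minimal discrete event simulation of the demand-vs-resources structure;
    # distributions, rates and capacities are illustrative assumptions.
    import heapq, random

    def simulate(n_servers=4, arrival_rate=3.0, mean_service=1.0, horizon=1000.0):
        free, queue, waits = n_servers, [], []
        events = [(random.expovariate(arrival_rate), "arrive")]
        while events:
            t, kind = heapq.heappop(events)
            if t > horizon:
                break
            if kind == "arrive":
                heapq.heappush(events, (t + random.expovariate(arrival_rate), "arrive"))
                queue.append(t)
            else:                      # a service completed
                free += 1
            if free and queue:         # dispatch the oldest waiting request
                free -= 1
                waits.append(t - queue.pop(0))
                heapq.heappush(events, (t + random.expovariate(1.0 / mean_service), "done"))
        return sum(waits) / len(waits)

    print(f"mean wait: {simulate():.3f} time units")
    ```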

  20. The possibility of coexistence and co-development in language competition: ecology-society computational model and simulation.

    Science.gov (United States)

    Yun, Jian; Shang, Song-Chao; Wei, Xiao-Dan; Liu, Shuang; Li, Zhi-Jie

    2016-01-01

    Language is characterized by both ecological properties and social properties, and competition is the basic form of language evolution. The rise and decline of one language is a result of competition between languages. Moreover, this rise and decline directly influences the diversity of human culture. Mathematical and computer modeling of language competition has been a popular topic in the fields of linguistics, mathematics, computer science, ecology, and other disciplines. Currently, there are several problems in research on language competition modeling. First, comprehensive mathematical analysis is absent in most studies of language competition models. Next, most language competition models are based on the assumption that one language in the model is stronger than the other; these studies tend to ignore cases where there is a balance of power in the competition. The competition between two well-matched languages is more practical, because it can facilitate the co-development of two languages. A third issue is that many studies reach an evolution result in which the weaker language inevitably goes extinct. From the integrated point of view of ecology and sociology, this paper improves the Lotka-Volterra model and the basic reaction-diffusion model to propose an "ecology-society" computational model for describing language competition. Furthermore, a strict and comprehensive mathematical analysis was made of the stability of the equilibria. Two languages in competition may be either well-matched or greatly different in strength, which was reflected in the experimental design. The results revealed that language coexistence, and even co-development, are likely to occur during language competition.
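
    As a concrete anchor for the dynamics discussed above, the sketch below integrates a classical two-species Lotka-Volterra competition model; the paper's "ecology-society" model adds reaction-diffusion and social terms that are not reproduced here. With both competition coefficients below one, the two well-matched languages settle into stable coexistence instead of extinction.

    ```python
    # Two-language Lotka-Volterra competition; with a12, a21 < 1 the coexistence
    # equilibrium is stable, i.e. a well-matched competition where both survive.
    def step(x, y, dt=0.01, r1=1.0, r2=1.0, K1=1.0, K2=1.0, a12=0.5, a21=0.5):
        dx = r1 * x * (1 - (x + a12 * y) / K1)
        dy = r2 * y * (1 - (y + a21 * x) / K2)
        return x + dx * dt, y + dy * dt

    x, y = 0.9, 0.1                    # initial speaker fractions
    for _ in range(100_000):
        x, y = step(x, y)
    print(f"coexistence equilibrium: x={x:.3f}, y={y:.3f}")  # both persist
    ```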

  1. Connections between simulations and observation in climate computer modeling. Scientist's practices and "bottom-up epistemology" lessons

    Science.gov (United States)

    Guillemot, Hélène

    Climate modeling is closely tied, through its institutions and practices, to observations from satellites and to the field sciences. The validity, quality and scientific credibility of models are based on interaction between models and observational data. In the case of numerical modeling of climate and climate change, validation is not solely a scientific interest: the legitimacy of computer modeling, as a tool of knowledge, has been called into question in order to deny the reality of any anthropogenic climate change; model validations thereby bring political issues into play as well. There is no systematic protocol of validation: one never validates a model in general, but rather the capacity of a model to account for a defined climatic phenomenon or characteristic. Drawing on practices observed in the two research centers developing and using a climate model in France, this paper reviews different ways in which researchers establish links between models and empirical data (which are not reduced to the latter validating the former) and convince themselves that their models are valid. The analysis of validation practices (relating to parametrization, modes of variability, climatic phenomena, etc.) allows us to highlight some elements of the epistemology of modeling.

  2. Computational model for simulation of sequences of helicity and angular momentum transfer in turbid tissue-like scattering medium (Conference Presentation)

    Science.gov (United States)

    Doronin, Alexander; Meglinski, Igor

    2017-02-01

    This report considers the development of a unified Monte Carlo (MC)-based computational model for simulating the propagation of Laguerre-Gaussian (LG) beams in turbid tissue-like scattering media. With the primary goal of proving the concept of using complex light for tissue diagnosis, we explore the propagation of LG beams in comparison with Gaussian beams for both linear and circular polarization. MC simulations of radially and azimuthally polarized LG beams in turbid media have been performed; classic phenomena such as preservation of the orbital angular momentum, optical memory and helicity flip are observed, and a detailed comparison is presented and discussed.
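
    The Monte Carlo core of such a model is a photon random walk. The sketch below shows only that core under simplifying assumptions (isotropic scattering, no polarization); the reported model additionally tracks polarization state, helicity and orbital angular momentum of LG beams, all of which are omitted here, and all optical constants are illustrative.

    ```python
    # Monte Carlo photon random walk in a turbid slab (sketch of the MC core
    # only; polarization, helicity and OAM tracking are omitted).
    import math, random
    from collections import Counter

    MU_S, MU_A, SLAB = 10.0, 0.1, 1.0      # scattering/absorption (1/mm), slab (mm)

    def propagate():
        x = y = z = 0.0
        ux, uy, uz = 0.0, 0.0, 1.0         # photon launched along +z
        while True:
            s = -math.log(1.0 - random.random()) / (MU_S + MU_A)  # free path
            x, y, z = x + ux * s, y + uy * s, z + uz * s
            if z < 0.0:
                return "reflected"
            if z > SLAB:
                return "transmitted"
            if random.random() < MU_A / (MU_S + MU_A):
                return "absorbed"
            # isotropic scattering (a full model uses an anisotropic phase function)
            uz = 2.0 * random.random() - 1.0
            phi = 2.0 * math.pi * random.random()
            sin_t = math.sqrt(1.0 - uz * uz)
            ux, uy = sin_t * math.cos(phi), sin_t * math.sin(phi)

    print(Counter(propagate() for _ in range(100_000)))
    ```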

  3. Computer simulation technology in inertial confinement (ICF)

    Energy Technology Data Exchange (ETDEWEB)

    Yabe, Takashi (Gunma Univ., Kiryu (Japan). Faculty of Engineering)

    1994-12-01

    Recent developments in computational technologies for inertial confinement fusion (ICF) are reviewed, with a special emphasis on hydrodynamic simulations. The CIP method developed for ICF simulations is a typical example that is used in various fields of physics, such as computational fluid dynamics, astrophysics, laser applications, geophysics, and so on. (author).

  4. Framework for utilizing computational devices within simulation

    Directory of Open Access Journals (Sweden)

    Miroslav Mintál

    2013-12-01

    Nowadays several frameworks exist for utilizing the computational power of graphics cards and other computational devices such as FPGAs, ARM and multi-core processors. The best known are either low-level and need a lot of controlling code, or are bound only to specific graphics cards. Furthermore, more specialized frameworks exist, aimed mainly at the mathematical field. The framework described here is tailored for use in multi-agent simulations. It provides an option to accelerate computations when preparing a simulation and, mainly, to accelerate the computation of the simulation itself.

  5. An integrated approach for the knowledge discovery in computer simulation models with a multi-dimensional parameter space

    Science.gov (United States)

    Khawli, Toufik Al; Gebhardt, Sascha; Eppelt, Urs; Hermanns, Torsten; Kuhlen, Torsten; Schulz, Wolfgang

    2016-06-01

    In production industries, parameter identification, sensitivity analysis and multi-dimensional visualization are vital steps in the planning process for achieving optimal designs and gaining valuable information. Sensitivity analysis and visualization can help in identifying the most-influential parameters and quantify their contribution to the model output, reduce the model complexity, and enhance the understanding of the model behavior. Typically, this requires a large number of simulations, which can be both very expensive and time consuming when the simulation models are numerically complex and the number of parameter inputs increases. There are three main constituent parts in this work. The first part is to substitute the numerical, physical model by an accurate surrogate model, the so-called metamodel. The second part includes a multi-dimensional visualization approach for the visual exploration of metamodels. In the third part, the metamodel is used to provide the two global sensitivity measures: i) the Elementary Effect for screening the parameters, and ii) the variance decomposition method for calculating the Sobol indices that quantify both the main and interaction effects. The application of the proposed approach is illustrated with an industrial application with the goal of optimizing a drilling process using a Gaussian laser beam.

  6. An integrated approach for the knowledge discovery in computer simulation models with a multi-dimensional parameter space

    Energy Technology Data Exchange (ETDEWEB)

    Khawli, Toufik Al; Eppelt, Urs; Hermanns, Torsten [RWTH Aachen University, Chair for Nonlinear Dynamics, Steinbachstr. 15, 52047 Aachen (Germany)]; Gebhardt, Sascha [RWTH Aachen University, Virtual Reality Group, IT Center, Seffenter Weg 23, 52074 Aachen (Germany)]; Kuhlen, Torsten [Forschungszentrum Jülich GmbH, Institute for Advanced Simulation (IAS), Jülich Supercomputing Centre (JSC), Wilhelm-Johnen-Straße, 52425 Jülich (Germany)]; Schulz, Wolfgang [Fraunhofer, ILT Laser Technology, Steinbachstr. 15, 52047 Aachen (Germany)]

    2016-06-08

    In production industries, parameter identification, sensitivity analysis and multi-dimensional visualization are vital steps in the planning process for achieving optimal designs and gaining valuable information. Sensitivity analysis and visualization can help in identifying the most-influential parameters and quantify their contribution to the model output, reduce the model complexity, and enhance the understanding of the model behavior. Typically, this requires a large number of simulations, which can be both very expensive and time consuming when the simulation models are numerically complex and the number of parameter inputs increases. There are three main constituent parts in this work. The first part is to substitute the numerical, physical model by an accurate surrogate model, the so-called metamodel. The second part includes a multi-dimensional visualization approach for the visual exploration of metamodels. In the third part, the metamodel is used to provide the two global sensitivity measures: i) the Elementary Effect for screening the parameters, and ii) the variance decomposition method for calculating the Sobol indices that quantify both the main and interaction effects. The application of the proposed approach is illustrated with an industrial application with the goal of optimizing a drilling process using a Gaussian laser beam.
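
    Of the three constituent parts, the Elementary Effect screening step is the easiest to illustrate once a cheap metamodel is available: EE_i = (f(x + delta*e_i) - f(x)) / delta, averaged over random base points. The sketch below uses an arbitrary stand-in surrogate, not the laser-drilling metamodel from the paper.

    ```python
    # Elementary Effect screening on a cheap surrogate; the metamodel below is
    # an arbitrary stand-in, not the laser-drilling model from the paper.
    import random

    def metamodel(x):                   # stand-in surrogate f: [0,1]^3 -> R
        return x[0] + 2 * x[1] ** 2 + 0.1 * x[0] * x[2]

    def elementary_effects(f, dim, delta=0.1, n_base=200):
        """mu* per parameter: mean |f(x + delta*e_i) - f(x)| / delta."""
        mu = [0.0] * dim
        for _ in range(n_base):
            x = [random.uniform(0, 1 - delta) for _ in range(dim)]
            fx = f(x)
            for i in range(dim):
                xi = x.copy()
                xi[i] += delta
                mu[i] += abs(f(xi) - fx) / delta
        return [m / n_base for m in mu]

    print(elementary_effects(metamodel, dim=3))  # ranks roughly x1 > x0 > x2
    ```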

  7. Comparison between Utsu's and Vere-Jones' aftershocks model by means of a computer simulation based on the acceptance-rejection sampling of von Neumann

    Science.gov (United States)

    Reyes, J.; Morales-Esteban, A.; González, E.; Martínez-Álvarez, F.

    2016-07-01

    In this research, a new algorithm for generating a stochastic earthquake catalog is presented. The algorithm is based on the acceptance-rejection sampling of von Neumann. The result is a computer simulation of earthquakes based on the calculated statistical properties of each zone. Vere-Jones states that an earthquake sequence can be modeled as a series of random events; this is the model used in the proposed simulation. Contrariwise, Utsu indicates that mainshocks are special geophysical events. The algorithm has been applied to zones of Chile, China, Spain, Japan, and the USA. This allows the zones to be classified according to Vere-Jones' or Utsu's model. The results have been quantified by relating the mainshock to the largest aftershock within the following 5 days (named the Båth event). The results show that some zones fit Utsu's model and others Vere-Jones'. Finally, the fraction of seismic events that satisfy certain properties of magnitude and occurrence is analyzed.
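
    Von Neumann's acceptance-rejection scheme itself is compact enough to state in code. The sketch below draws magnitudes from a truncated Gutenberg-Richter-type density as an illustrative target distribution; the paper's algorithm builds full stochastic catalogs from the calculated statistical properties of each zone, which this sketch does not attempt.

    ```python
    # Von Neumann acceptance-rejection sampling of earthquake magnitudes from a
    # truncated Gutenberg-Richter-type density (illustrative target and values).
    import math, random

    B, M_MIN, M_MAX = 1.0, 3.0, 8.0
    BETA = B * math.log(10)

    def pdf(m):                          # truncated exponential density
        c = BETA / (1 - math.exp(-BETA * (M_MAX - M_MIN)))
        return c * math.exp(-BETA * (m - M_MIN))

    PDF_MAX = pdf(M_MIN)                 # the density peaks at the lower cutoff

    def sample():
        while True:
            m = random.uniform(M_MIN, M_MAX)          # proposal: uniform envelope
            if random.random() * PDF_MAX <= pdf(m):   # accept with prob pdf/envelope
                return m

    catalog = [sample() for _ in range(10_000)]
    print(f"mean magnitude: {sum(catalog) / len(catalog):.2f}")
    ```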

  8. Modelling and Simulation: An Overview

    NARCIS (Netherlands)

    M.J. McAleer (Michael); F. Chan (Felix); L. Oxley (Les)

    2013-01-01

    The papers in this special issue of Mathematics and Computers in Simulation cover the following topics: improving judgmental adjustment of model-based forecasts, whether forecast updates are progressive, on a constrained mixture vector autoregressive model, whether all estimators are bor...

  9. Computational Aerodynamic Simulations of a 1484 ft/sec Tip Speed Quiet High-Speed Fan System Model for Acoustic Methods Assessment and Development

    Science.gov (United States)

    Tweedt, Daniel L.

    2014-01-01

    Computational Aerodynamic simulations of a 1484 ft/sec tip speed quiet high-speed fan system were performed at five different operating points on the fan operating line, in order to provide detailed internal flow field information for use with fan acoustic prediction methods presently being developed, assessed and validated. The fan system is a sub-scale, low-noise research fan/nacelle model that has undergone experimental testing in the 9- by 15-foot Low Speed Wind Tunnel at the NASA Glenn Research Center. Details of the fan geometry, the computational fluid dynamics methods, the computational grids, and various computational parameters relevant to the numerical simulations are discussed. Flow field results for three of the five operating points simulated are presented in order to provide a representative look at the computed solutions. Each of the five fan aerodynamic simulations involved the entire fan system, which includes a core duct and a bypass duct that merge upstream of the fan system nozzle. As a result, only fan rotational speed and the system bypass ratio, set by means of a translating nozzle plug, were adjusted in order to set the fan operating point, leading to operating points that lie on a fan operating line and making mass flow rate a fully dependent parameter. The resulting mass flow rates are in good agreement with measurement values. Computed blade row flow fields at all fan operating points are, in general, aerodynamically healthy. Rotor blade and fan exit guide vane flow characteristics are good, including incidence and deviation angles, chordwise static pressure distributions, blade surface boundary layers, secondary flow structures, and blade wakes. Examination of the computed flow fields reveals no excessive or critical boundary layer separations or related secondary-flow problems, with the exception of the hub boundary layer at the core duct entrance. At that location a significant flow separation is present. The region of local flow...

  10. Computational models of complex systems

    CERN Document Server

    Dabbaghian, Vahid

    2014-01-01

    Computational and mathematical models provide us with opportunities to investigate the complexities of real-world problems. They allow us to apply our best analytical methods to define problems in a clearly mathematical manner and to test our solutions exhaustively before committing expensive resources. This is made possible by assuming parameters in a bounded environment, allowing for controllable experimentation that is not always possible in live scenarios. For example, simulation of computational models allows the testing of theories in a manner that is both fundamentally deductive and experimental in nature. The main ingredients for such research ideas come from multiple disciplines, and the importance of interdisciplinary research is well recognized by the scientific community. This book provides a window on the research communities' novel endeavours, highlighting the value of computational modelling as a research tool when investigating complex systems. We hope that the reader...

  11. FEL Simulation Using Distributed Computing

    Energy Technology Data Exchange (ETDEWEB)

    Einstein, Joshua [Fermilab]; Bernabeu Altayo, Gerard [Fermilab]; Biedron, Sandra [Ljubljana U.]; Freund, Henry [Colorado State U., Fort Collins]; Milton, Stephen [Colorado State U., Fort Collins]; van der Slot, Peter [Colorado State U., Fort Collins]

    2016-06-01

    While simulation tools are available and have been used regularly for simulating light sources, the increasing availability and lower cost of GPU-based processing open up new opportunities. This poster highlights a method for accelerating and parallelizing code processing through the use of COTS software interfaces.

  12. Space-based Observation System Simulation Experiments for the Global Water Cycle: Information Tradeoffs, Model Diagnostics, and Exascale Computing

    Science.gov (United States)

    Reed, P. M.

    2011-12-01

    Global scale issues such as population growth, changing land-use, and climate change place our natural resources at the center of focus for a broad range of interdependent science, engineering, and policy problems. Our ability to mitigate and adapt to the accelerating rate of environmental change is critically dependent on our ability to observe and predict the natural, built, and social systems that define sustainability at the global scale. Despite the risks and challenges posed by global change, we are faced with critical risks to our ability to maintain and improve long-term space-based observations of these changes. Despite consensus agreement on the critical importance of space-based Earth science, the fundamental challenge remains: How should we manage the severe tradeoffs and design challenges posed by maximizing the value of existing and proposed space-based Earth observation systems? Addressing this question requires transformative innovations in the design and management of space-based Earth observation systems that effectively take advantage of massively parallel computing architectures to enable the discovery and exploitation of critical mission tradeoffs, using high-resolution space-based observation system simulation experiments (OSSEs) that simulate the global water cycle data that would result from sensing innovations and evaluate their merit with carefully constructed prediction and management benchmarks.

  13. Information diffusion, Facebook clusters, and the simplicial model of social aggregation: a computational simulation of simplicial diffusers for community health interventions.

    Science.gov (United States)

    Kee, Kerk F; Sparks, Lisa; Struppa, Daniele C; Mannucci, Mirco A; Damiano, Alberto

    2016-01-01

    By integrating the simplicial model of social aggregation with existing research on opinion leadership and diffusion networks, this article introduces the constructs of simplicial diffusers (mathematically defined as nodes embedded in simplexes; a simplex is a socially bonded cluster) and simplicial diffusing sets (mathematically defined as minimal covers of a simplicial complex; a simplicial complex is a social aggregation in which socially bonded clusters are embedded) to propose a strategic approach for information diffusion of cancer screenings as a health intervention on Facebook for community cancer prevention and control. This approach is novel in its incorporation of interpersonally bonded clusters, culturally distinct subgroups, and different united social entities that coexist within a larger community into a computational simulation to select sets of simplicial diffusers with the highest degree of information diffusion for health intervention dissemination. The unique contributions of the article also include seven propositions and five algorithmic steps for computationally modeling the simplicial model with Facebook data.

  14. Vernier caliper and micrometer computer models using Easy Java Simulation and its pedagogical design features—ideas for augmenting learning with real instruments

    Science.gov (United States)

    Wee, Loo Kang; Tiang Ning, Hwee

    2014-09-01

    This paper presents the customization of Easy Java Simulation models, used with actual laboratory instruments, to create active experiential learning for measurements. The laboratory instruments are the vernier caliper and the micrometer. Three computer model design ideas that complement real equipment are discussed. These ideas involve (1) a simple two-dimensional view for learning from pen and paper questions and the real world; (2) hints, answers, different scale options and the inclusion of zero error; (3) assessment for learning feedback. The initial positive feedback from Singaporean students and educators indicates that these tools could be successfully shared and implemented in learning communities. Educators are encouraged to change the source code for these computer models to suit their own purposes; they have creative commons attribution licenses for the benefit of all.

  15. Distributed simulation a model driven engineering approach

    CERN Document Server

    Topçu, Okan; Oğuztüzün, Halit; Yilmaz, Levent

    2016-01-01

    Backed by substantive case studies, the novel approach to software engineering for distributed simulation outlined in this text demonstrates the potent synergies between model-driven techniques, simulation, intelligent agents, and computer systems development.

  16. Computational simulation of liquid rocket injector anomalies

    Science.gov (United States)

    Przekwas, A. J.; Singhal, A. K.; Tam, L. T.; Davidian, K.

    1986-01-01

    A computer model has been developed to analyze the three-dimensional two-phase reactive flows in liquid fueled rocket combustors. The model is designed to study the influence of liquid propellant injection nonuniformities on the flow pattern, combustion and heat transfer within the combustor. The Eulerian-Lagrangian approach for simulating polydisperse spray flow, evaporation and combustion has been used. Full coupling between the phases is accounted for. A nonorthogonal, body fitted coordinate system along with a conservative control volume formulation is employed. The physical models built into the code include a kappa-epsilon turbulence model, a two-step chemical reaction, and the six-flux radiation model. Semiempirical models are used to describe all interphase coupling terms as well as chemical reaction rates. The purpose of this study was to demonstrate an analytical capability to predict the effects of reactant injection nonuniformities (injection anomalies) on combustion and heat transfer within the rocket combustion chamber. The results show promising application of the model to comprehensive modeling of liquid propellant rocket engines.

  17. IVOA Recommendation: Simulation Data Model

    CERN Document Server

    Lemson, Gerard; Cervino, Miguel; Gheller, Claudio; Gray, Norman; LePetit, Franck; Louys, Mireille; Ooghe, Benjamin; Wagner, Rick; Wozniak, Herve

    2014-01-01

    In this document and the accompanying documents we describe a data model (Simulation Data Model) describing numerical computer simulations of astrophysical systems. The primary goal of this standard is to support discovery of simulations by describing those aspects of them that scientists might wish to query on, i.e. it is a model for meta-data describing simulations. This document does not propose a protocol for using this model. IVOA protocols are being developed and are supposed to use the model, either in its original form or in a form derived from the model proposed here, but more suited to the particular protocol. The SimDM has been developed in the IVOA Theory Interest Group with assistance of representatives of relevant working groups, in particular DM and Semantics.

  18. Computational Intelligence, Cyber Security and Computational Models

    CERN Document Server

    Anitha, R; Lekshmi, R; Kumar, M; Bonato, Anthony; Graña, Manuel

    2014-01-01

    This book contains cutting-edge research material presented by researchers, engineers, developers, and practitioners from academia and industry at the International Conference on Computational Intelligence, Cyber Security and Computational Models (ICC3) organized by PSG College of Technology, Coimbatore, India during December 19–21, 2013. The materials in the book include theory and applications for design, analysis, and modeling of computational intelligence and security. The book will be useful material for students, researchers, professionals, and academicians. It will help in understanding current research trends and findings and future scope of research in computational intelligence, cyber security, and computational models.

  19. Multiscale Computer Simulation of Failure in Aerogels

    Science.gov (United States)

    Good, Brian S.

    2008-01-01

    Aerogels have been of interest to the aerospace community primarily for their thermal properties, notably their low thermal conductivities. While such gels are typically fragile, recent advances in the application of conformal polymer layers to these gels have made them potentially useful as lightweight structural materials as well. We have previously performed computer simulations of aerogel thermal conductivity and tensile and compressive failure, with results that are in qualitative, and sometimes quantitative, agreement with experiment. However, recent experiments in our laboratory suggest that gels having similar densities may exhibit substantially different properties. In this work, we extend our original diffusion limited cluster aggregation (DLCA) model for gel structure to incorporate additional variation in DLCA simulation parameters, with the aim of producing DLCA clusters of similar densities that nevertheless have different fractal dimension and secondary particle coordination. We perform particle statics simulations of gel strain on these clusters, and consider the effects of differing DLCA simulation conditions, and the resultant differences in fractal dimension and coordination, on gel strain properties.

  20. Filtration theory using computer simulations

    Energy Technology Data Exchange (ETDEWEB)

    Bergman, W.; Corey, I. [Lawrence Livermore National Lab., CA (United States)]

    1997-08-01

    We have used commercially available fluid dynamics codes based on Navier-Stokes theory and the Langevin particle equation of motion to compute the particle capture efficiency and pressure drop through selected two- and three-dimensional fiber arrays. The approach we used was to first compute the air velocity vector field throughout a defined region containing the fiber matrix. The particle capture in the fiber matrix is then computed by superimposing the Langevin particle equation of motion over the flow velocity field. Using the Langevin equation combines the particle Brownian motion, inertia and interception mechanisms in a single equation. In contrast, most previous investigations treat the different capture mechanisms separately. We have computed the particle capture efficiency and the pressure drop through one 2-D and two 3-D fiber matrix elements. 5 refs., 11 figs.
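    To make the two-step scheme concrete, the sketch below integrates a Langevin equation of motion over a prescribed velocity field and flags capture by interception. It is a minimal illustration in the spirit of the abstract, not the authors' code: the fluid field, drag and diffusion constants are all invented, and a real calculation would use the computed Navier-Stokes solution in place of the stand-in function.

      import numpy as np

      rng = np.random.default_rng(0)

      TAU = 1e-4      # particle relaxation time, s (inertia/drag) -- illustrative
      D_BR = 1e-9     # Brownian diffusivity, m^2/s -- illustrative
      DT = 1e-5       # time step, s
      FIBERS = np.array([[0.5e-3, 0.0]])   # fiber center(s), m
      R_FIB = 50e-6   # fiber radius, m

      def fluid_velocity(x):
          # Stand-in for the precomputed Navier-Stokes field (uniform flow here)
          return np.array([0.1, 0.0])      # m/s

      def step(x, v):
          # Euler-Maruyama step of the Langevin equation:
          # dv = (u(x) - v)/tau dt + sqrt(2 D dt)/tau dW
          kick = rng.normal(size=2) * np.sqrt(2.0 * D_BR * DT) / TAU
          v = v + (fluid_velocity(x) - v) / TAU * DT + kick
          return x + v * DT, v

      def captured(x):
          # Interception: particle comes within one fiber radius of a fiber axis
          return np.any(np.linalg.norm(FIBERS - x, axis=1) < R_FIB)

      x, v = np.array([0.0, 20e-6]), np.array([0.1, 0.0])
      for _ in range(2000):
          x, v = step(x, v)
          if captured(x):
              print("captured at", x)
              break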

  1. Computer simulation in physics and engineering

    CERN Document Server

    Steinhauser, Martin Oliver

    2013-01-01

    This work is a needed reference for widely used techniques and methods of computer simulation in physics and other disciplines, such as materials science. The work conveys both the theoretical foundations of computer simulation and the applications and "tricks of the trade" that are often scattered across various papers. Thus it will meet a need and fill a gap for every scientist who needs computer simulations for the task at hand. In addition to being a reference, case studies and exercises for use as course reading are included.

  2. Product Costing in FMT: Comparing Deterministic and Stochastic Models Using Computer-Based Simulation for an Actual Case Study

    DEFF Research Database (Denmark)

    Nielsen, Steen

    2000-01-01

    This paper expands the traditional product costing technique by including a stochastic form in a complex production process for product costing. The stochastic phenomenon in flexible manufacturing technologies is seen as an important phenomenon that companies try to decrease or eliminate. DFM has... been used for evaluating the appropriateness of the firm's production capability. In this paper a simulation model is developed to analyze the relevant cost behaviour with respect to DFM and to develop a more streamlined process in the layout of the manufacturing process....

  3. Computer simulations of the mouse spermatogenic cycle

    Directory of Open Access Journals (Sweden)

    Debjit Ray

    2014-12-01

    Full Text Available The spermatogenic cycle describes the periodic development of germ cells in the testicular tissue. The temporal–spatial dynamics of the cycle highlight the unique, complex, and interdependent interaction between germ and somatic cells, and are the key to continual sperm production. Although understanding the spermatogenic cycle has important clinical relevance for male fertility and contraception, there are a number of experimental obstacles. For example, the lengthy process cannot be visualized through dynamic imaging, and the precise action of germ cells that leads to the emergence of testicular morphology remains uncharacterized. Here, we report an agent-based model that simulates the mouse spermatogenic cycle on a cross-section of the seminiferous tubule over a time scale of hours to years, while considering feedback regulation, mitotic and meiotic division, differentiation, apoptosis, and movement. The computer model is able to elaborate the germ cell dynamics in a time-lapse movie format, allowing us to trace individual cells as they change state and location. More importantly, the model provides mechanistic understanding of the fundamentals of male fertility, namely how testicular morphology and sperm production are achieved. By manipulating cellular behaviors either individually or collectively in silico, the model predicts causal events for the altered arrangement of germ cells upon genetic or environmental perturbations. This in silico platform can serve as an interactive tool to perform long-term simulation and to identify optimal approaches for infertility treatment and contraceptive development.

  4. Computationally Modeling Interpersonal Trust

    Directory of Open Access Journals (Sweden)

    Jin Joo Lee

    2013-12-01

    Full Text Available We present a computational model capable of predicting—above human accuracy—the degree of trust a person has toward their novel partner by observing the trust-related nonverbal cues expressed in their social interaction. We summarize our prior work, in which we identify nonverbal cues that signal untrustworthy behavior and also demonstrate the human mind’s readiness to interpret those cues to assess the trustworthiness of a social robot. We demonstrate that domain knowledge gained from our prior work using human-subjects experiments, when incorporated into the feature engineering process, permits a computational model to outperform both human predictions and a baseline model built in naiveté of this domain knowledge. We then present the construction of hidden Markov models to incorporate temporal relationships among the trust-related nonverbal cues. By interpreting the resulting learned structure, we observe that models built to emulate different levels of trust exhibit different sequences of nonverbal cues. From this observation, we derived sequence-based temporal features that further improve the accuracy of our computational model. Our multi-step research process presented in this paper combines the strength of experimental manipulation and machine learning to not only design a computational trust model but also to further our understanding of the dynamics of interpersonal trust.

  5. Computationally modeling interpersonal trust.

    Science.gov (United States)

    Lee, Jin Joo; Knox, W Bradley; Wormwood, Jolie B; Breazeal, Cynthia; Desteno, David

    2013-01-01

    We present a computational model capable of predicting-above human accuracy-the degree of trust a person has toward their novel partner by observing the trust-related nonverbal cues expressed in their social interaction. We summarize our prior work, in which we identify nonverbal cues that signal untrustworthy behavior and also demonstrate the human mind's readiness to interpret those cues to assess the trustworthiness of a social robot. We demonstrate that domain knowledge gained from our prior work using human-subjects experiments, when incorporated into the feature engineering process, permits a computational model to outperform both human predictions and a baseline model built in naiveté of this domain knowledge. We then present the construction of hidden Markov models to investigate temporal relationships among the trust-related nonverbal cues. By interpreting the resulting learned structure, we observe that models built to emulate different levels of trust exhibit different sequences of nonverbal cues. From this observation, we derived sequence-based temporal features that further improve the accuracy of our computational model. Our multi-step research process presented in this paper combines the strength of experimental manipulation and machine learning to not only design a computational trust model but also to further our understanding of the dynamics of interpersonal trust.
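    The HMM classification step described above reduces to a few lines: score an observed cue sequence under competing fitted models and pick the better-scoring one. The sketch below is a generic scaled forward algorithm; the two-state models, cue coding, and all probability values are invented for illustration and are not the authors' feature set or fitted parameters.

      import numpy as np

      def forward_loglik(obs, pi, A, B):
          # Scaled forward algorithm for a discrete-emission HMM.
          # pi: (S,) initial probs; A: (S,S) transitions; B: (S,K) emissions.
          alpha = pi * B[:, obs[0]]
          loglik = np.log(alpha.sum())
          alpha = alpha / alpha.sum()
          for o in obs[1:]:
              alpha = (alpha @ A) * B[:, o]
              loglik += np.log(alpha.sum())
              alpha = alpha / alpha.sum()
          return loglik

      # Hypothetical cue coding: 0=lean back, 1=face touch, 2=arm cross, 3=other
      obs = [3, 1, 1, 2, 0, 3, 1]

      pi = np.array([0.6, 0.4])
      A_hi = np.array([[0.9, 0.1], [0.2, 0.8]])                    # "high trust"
      B_hi = np.array([[0.1, 0.1, 0.1, 0.7], [0.2, 0.3, 0.3, 0.2]])
      A_lo = np.array([[0.5, 0.5], [0.5, 0.5]])                    # "low trust"
      B_lo = np.array([[0.1, 0.4, 0.4, 0.1], [0.3, 0.3, 0.3, 0.1]])

      # Classify the interaction by which model explains the sequence better
      print(forward_loglik(obs, pi, A_hi, B_hi) > forward_loglik(obs, pi, A_lo, B_lo))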

  6. Understanding Islamist political violence through computational social simulation

    Energy Technology Data Exchange (ETDEWEB)

    Watkins, Jennifer H [Los Alamos National Laboratory]; Mackerrow, Edward P [Los Alamos National Laboratory]; Patelli, Paolo G [Los Alamos National Laboratory]; Eberhardt, Ariane [Los Alamos National Laboratory]; Stradling, Seth G [Los Alamos National Laboratory]

    2008-01-01

    Understanding the process that enables political violence is of great value in reducing the future demand for and support of violent opposition groups. Methods are needed that allow alternative scenarios and counterfactuals to be scientifically researched. Computational social simulation shows promise in developing 'computer experiments' that would be unfeasible or unethical in the real world. Additionally, the process of modeling and simulation reveals and challenges assumptions that may not be noted in theories, exposes areas where data is not available, and provides a rigorous, repeatable, and transparent framework for analyzing the complex dynamics of political violence. This paper demonstrates the computational modeling process using two simulation techniques: system dynamics and agent-based modeling. The benefits and drawbacks of both techniques are discussed. In developing these social simulations, we discovered that the social science concepts and theories needed to accurately simulate the associated psychological and social phenomena were lacking.

  7. Computational Simulation of Explosively Generated Pulsed Power Devices

    Science.gov (United States)

    2013-03-21

    physics models for magnetohydrodynamics, and ALEGRA-HEDP, which builds on the ALEGRA-MHD version and adds physics models that allow simulation of high energy...development, there is a genuine need for more theory-based research and an accurate computer modeling capability. One of the programs that has done...developed by Sandia National Laboratories, to develop a computer model that can accurately represent an FEG and that can be verified against existing

  8. Pervasive Computing and Prosopopoietic Modelling

    DEFF Research Database (Denmark)

    Michelsen, Anders Ib

    2011-01-01

    into the other. It also indicates a generative creation that itself points to important issues of ontology with methodological implications for the design of computing. In this article these implications will be conceptualised as prosopopoietic modeling on the basis of Bernward Joerges’ introduction...... of the classical rhetoric term of ’prosopopoeia’ into the debate on large technological systems. First, the paper introduces the paradoxical distinction/complicity by debating Gilbert Simondon’s notion of a ‘margin of indeterminacy’ vis-a-vis computing. Second, it debates the idea of prosopopoietic modeling......, pointing to a principal role of the paradoxical distinction/complicity within the computational heritage in three cases: a. Prosopopoietic aspects of John von Neumann’s First Draft of a Report on the EDVAC from 1945. b. Herbert Simon’s notion of simulation in The Science of the Artificial from the 1970s. c...

  9. Model reduction for circuit simulation

    CERN Document Server

    Hinze, Michael; Maten, E Jan W Ter

    2011-01-01

    Simulation based on mathematical models plays a major role in computer aided design of integrated circuits (ICs). Decreasing structure sizes, increasing packing densities and driving frequencies require the use of refined mathematical models that take into account secondary, parasitic effects. This leads to very high dimensional problems which nowadays require simulation times too large for the short time-to-market demands in industry. Modern Model Order Reduction (MOR) techniques present a way out of this dilemma in providing surrogate models which keep the main characteristics of the device

  10. Computer-simulated bone architecture in a simple bone-remodeling model based on a reaction-diffusion system.

    Science.gov (United States)

    Tezuka, Ken-ichi; Wada, Yoshitaka; Takahashi, Akiyuki; Kikuchi, Masanori

    2005-01-01

    Bone is a complex system with functions including those of adaptation and repair. To understand how bone cells can create a structure adapted to the mechanical environment, we propose a simple bone remodeling model based on a reaction-diffusion system influenced by mechanical stress. Two-dimensional bone models were created and subjected to mechanical loads. The conventional finite element method (FEM) was used to calculate stress distribution. A stress-reactive reaction-diffusion model was constructed and used to simulate bone remodeling under mechanical loads. When an external mechanical stress was applied, stimulated bone formation and subsequent activation of bone resorption produced an efficient adaptation of the internal shape of the model bone to a given stress, and demonstrated major structures of trabecular bone seen in the human femoral neck. The degree of adaptation could be controlled by modulating the diffusion constants of hypothetical local factors. We also tried to demonstrate the deformation of bone structure during osteoporosis by the modulation of a parameter affecting the balance between formation and resorption. This simple model gives us an insight into how bone cells can create an architecture adapted to environmental stress, and will serve as a useful tool to understand both physiological and pathological states of bone based on structural information.
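    As a concrete illustration of the coupling described above, the sketch below updates a two-field reaction-diffusion system on a grid with a stress-dependent source term. The reaction kinetics, the fixed synthetic "stress" field standing in for the FEM output, and all constants are invented, not the authors' model.

      import numpy as np

      N, DT, DA, DB = 64, 0.1, 0.05, 0.2
      a = np.ones((N, N))            # hypothetical bone-forming factor
      b = np.zeros((N, N))           # hypothetical bone-resorbing factor
      y, x = np.mgrid[0:N, 0:N]
      stress = np.exp(-((x - N/2)**2 + (y - N/2)**2) / (N/4)**2)  # synthetic load

      def lap(f):
          # 5-point Laplacian (periodic boundaries for brevity)
          return (np.roll(f, 1, 0) + np.roll(f, -1, 0) +
                  np.roll(f, 1, 1) + np.roll(f, -1, 1) - 4.0*f)

      for _ in range(500):
          ra = a*a / (1.0 + b) - a + 0.5*stress   # production boosted by stress
          rb = a*a - b                            # resorbing factor tracks a
          a += DT * (DA*lap(a) + ra)
          b += DT * (DB*lap(b) + rb)

      density = a - b   # crude proxy for the local formation/resorption balance
      print(float(density.min()), float(density.max()))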

  11. Computer Simulation Study of Human Locomotion with a Three-Dimensional Entire-Body Neuro-Musculo-Skeletal Model

    Science.gov (United States)

    Hase, Kazunori; Yamazaki, Nobutoshi

    A model having a three-dimensional entire-body structure and consisting of both the neuronal system and the musculo-skeletal system was proposed to precisely simulate human walking motion. The dynamics of the human body was represented by a 14-rigid-link system and 60 muscular models. The neuronal system was represented by three sub-systems: the rhythm generator system consisting of 32 neural oscillators, the sensory feedback system, and the peripheral system expressed by static optimization. Unknown neuronal parameters were adjusted by a numerical search method using the evaluative criterion for locomotion that was defined by a hybrid between the locomotive energy efficiency and the smoothness of the muscular tensions. The model could successfully generate continuous and three-dimensional walking patterns and stabilized walking against mechanical perturbation. The walking pattern was more stable than that of the model based on dynamic optimization, and more precise than that of the previous model based on a similar neuronal system.
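    Rhythm-generator layers in models of this family are built from mutually inhibiting neural oscillators. The abstract does not give equations, so the sketch below uses the common Matsuoka oscillator form for a single flexor-extensor pair, with invented parameters, integrated by Euler steps.

      import numpy as np

      TAU_U, TAU_V = 0.05, 0.6    # membrane / adaptation time constants, s
      BETA, W, S = 2.5, 2.0, 1.0  # adaptation gain, mutual inhibition, tonic drive
      DT, STEPS = 0.001, 5000

      u = np.array([0.1, 0.0])    # membrane states of the two neurons
      v = np.zeros(2)             # adaptation (fatigue) states
      out = []
      for _ in range(STEPS):
          y = np.maximum(u, 0.0)                      # firing rates
          du = (-u - BETA*v - W*y[::-1] + S) / TAU_U  # each inhibits the other
          dv = (-v + y) / TAU_V
          u, v = u + DT*du, v + DT*dv
          out.append(y[0] - y[1])                     # alternating net command

      out = np.array(out)
      print("output range:", float(out.min()), float(out.max()))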

  12. Augmented Reality Simulations on Handheld Computers

    Science.gov (United States)

    Squire, Kurt; Klopfer, Eric

    2007-01-01

    Advancements in handheld computing, particularly its portability, social interactivity, context sensitivity, connectivity, and individuality, open new opportunities for immersive learning environments. This article articulates the pedagogical potential of augmented reality simulations in environmental engineering education by immersing students in…

  13. Computer Simulation in Information and Communication Engineering

    CERN Multimedia

    Anton Topurov

    2005-01-01

    CSICE'05 Sofia, Bulgaria 20th - 22nd October, 2005 On behalf of the International Scientific Committee, we would like to invite you all to Sofia, the capital city of Bulgaria, to the International Conference in Computer Simulation in Information and Communication Engineering CSICE'05. The Conference is aimed at facilitating the exchange of experience in the field of computer simulation gained not only in traditional fields (Communications, Electronics, Physics...) but also in the areas of biomedical engineering, environment, industrial design, etc. The objective of the Conference is to bring together lecturers, researchers and practitioners from different countries, working in the fields of computer simulation in information engineering, in order to exchange information and bring new contributions to this important field of engineering design and education. The Conference will bring you the latest ideas and development of the tools for computer simulation directly from their inventors. Contribution describ...

  14. Computational simulation of liquid fuel rocket injectors

    Science.gov (United States)

    Landrum, D. Brian

    1994-01-01

    A major component of any liquid propellant rocket is the propellant injection system. Issues of interest include the degree of liquid vaporization and its impact on the combustion process, the pressure and temperature fields in the combustion chamber, and the cooling of the injector face and chamber walls. The Finite Difference Navier-Stokes (FDNS) code is a primary computational tool used in the MSFC Computational Fluid Dynamics Branch. The branch has dedicated a significant amount of resources to development of this code for prediction of both liquid and solid fuel rocket performance. The FDNS code is currently being upgraded to include the capability to model liquid/gas multi-phase flows for fuel injection simulation. An important aspect of this effort is benchmarking the code capabilities to predict existing experimental injection data. The objective of this MSFC/ASEE Summer Faculty Fellowship term was to evaluate the capabilities of the modified FDNS code to predict flow fields with liquid injection. Comparisons were made between code predictions and existing experimental data. A significant portion of the effort included a search for appropriate validation data. Also, code simulation deficiencies were identified.

  15. Large-scale computing techniques for complex system simulations

    CERN Document Server

    Dubitzky, Werner; Schott, Bernard

    2012-01-01

    Complex systems modeling and simulation approaches are being adopted in a growing number of sectors, including finance, economics, biology, astronomy, and many more. Technologies ranging from distributed computing to specialized hardware are explored and developed to address the computational requirements arising in complex systems simulations. The aim of this book is to present a representative overview of contemporary large-scale computing technologies in the context of complex systems simulations applications. The intention is to identify new research directions in this field and

  16. Simulation modeling and arena

    CERN Document Server

    Rossetti, Manuel D

    2015-01-01

    Emphasizes a hands-on approach to learning statistical analysis and model building through the use of comprehensive examples, problems sets, and software applications With a unique blend of theory and applications, Simulation Modeling and Arena®, Second Edition integrates coverage of statistical analysis and model building to emphasize the importance of both topics in simulation. Featuring introductory coverage on how simulation works and why it matters, the Second Edition expands coverage on static simulation and the applications of spreadsheets to perform simulation. The new edition als

  17. Assessment of Molecular Modeling & Simulation

    Energy Technology Data Exchange (ETDEWEB)

    None

    2002-01-03

    This report reviews the development and applications of molecular and materials modeling in Europe and Japan in comparison to those in the United States. Topics covered include computational quantum chemistry, molecular simulations by molecular dynamics and Monte Carlo methods, mesoscale modeling of material domains, molecular-structure/macroscale property correlations like QSARs and QSPRs, and related information technologies like informatics and special-purpose molecular-modeling computers. The panel's findings include the following: The United States leads this field in many scientific areas. However, Canada has particular strengths in DFT methods and homogeneous catalysis; Europe in heterogeneous catalysis, mesoscale, and materials modeling; and Japan in materials modeling and special-purpose computing. Major government-industry initiatives are underway in Europe and Japan, notably in multi-scale materials modeling and in development of chemistry-capable ab-initio molecular dynamics codes.

  18. Computer simulation and modeling of graded bandgap CuInSe₂/CdS based solar cells

    Energy Technology Data Exchange (ETDEWEB)

    Dhingra, A.; Rothwarf, A. [Drexel Univ., Philadelphia, PA (United States). Dept. of Electrical and Computer Engineering]

    1996-04-01

    This paper proposes the use of a graded bandgap absorber material to improve the low open-circuit voltage (V_oc) seen in CuInSe₂/CdS solar cells without sacrificing the short-circuit current density (J_sc). It also proposes a p-i-n model for the CuInSe₂/CdS solar cell, where the intrinsic region is the graded bandgap CIS. Reflecting surfaces are provided at the p-i and n-i interfaces to trap the light in the narrow intrinsic region for maximum generation of electron-hole pairs (EHPs). This optical confinement results in a 25-40% increase in the number of photons absorbed. An extensive numerical simulator was developed, which provides a 1-D self-consistent solution of Poisson's equation and the two continuity equations for electrons and holes. This simulator was used to generate J-V curves to delineate the effect of different grading profiles on cell performance. The effects of a uniform bandgap, normal grading, reverse grading, and a low-bandgap notch have been considered. Having established the inherent advantages of these grading profiles, an optimal doubly graded structure is proposed. Replacing the thick CdS (2.42 eV) layer assumed in the simulations with a wide-gap semiconductor such as ZnO (3.35 eV) increases all current densities by about 5 mA/cm² and raises the optimal calculated efficiency from 17.9% to roughly 21% for a doubly graded structure with a thickness of 1 µm and bandgaps ranging from 1.3 eV to 1.5 eV.
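    A full drift-diffusion solver is beyond a snippet, but its core building block, the finite-difference Poisson solve for the electrostatic potential that the simulator iterates against the continuity equations, can be sketched as follows. Grid size, space charge, and boundary potentials are illustrative; the relative permittivity is an approximate literature value for CuInSe₂.

      import numpy as np

      def poisson_1d(rho, dx, eps, phi_l, phi_r):
          # Solve d2(phi)/dx2 = -rho/eps with Dirichlet BCs on a uniform grid
          # using the three-point stencil (dense solve for brevity).
          n = len(rho)
          A = (np.diag(-2.0*np.ones(n)) + np.diag(np.ones(n-1), 1)
               + np.diag(np.ones(n-1), -1)) / dx**2
          rhs = -rho / eps
          rhs[0] -= phi_l / dx**2     # fold boundary values into the RHS
          rhs[-1] -= phi_r / dx**2
          return np.linalg.solve(A, rhs)

      n, L = 200, 1e-6                      # 1 um absorber, illustrative
      dx = L / (n + 1)
      rho = np.full(n, 1.6e-19 * 1e21)      # net space charge, C/m^3 (made up)
      eps = 13.6 * 8.85e-12                 # approx permittivity of CuInSe2
      phi = poisson_1d(rho, dx, eps, 0.0, 0.5)
      print("peak potential (V):", float(phi.max()))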

  19. Investigating European genetic history through computer simulations.

    Science.gov (United States)

    Currat, Mathias; Silva, Nuno M

    2013-01-01

    The genetic diversity of Europeans has been shaped by various evolutionary forces including their demographic history. Genetic data can thus be used to draw inferences on the population history of Europe using appropriate statistical methods such as computer simulation, which constitutes a powerful tool to study complex models. Here, we focus on spatially explicit simulation, a method which takes population movements over space and time into account. We present its main principles and then describe a series of studies using this approach that we consider as particularly significant in the context of European prehistory. All simulation studies agree that ancient demographic events played a significant role in the establishment of the European gene pool; but while earlier works support a major genetic input from the Near East during the Neolithic transition, the most recent ones revalue positively the contribution of pre-Neolithic hunter-gatherers and suggest a possible impact of very ancient demographic events. This result of a substantial genetic continuity from pre-Neolithic times to the present challenges some recent studies analyzing ancient DNA. We discuss the possible reasons for this discrepancy and identify future lines of investigation in order to get a better understanding of European evolution.

  20. Salesperson Ethics: An Interactive Computer Simulation

    Science.gov (United States)

    Castleberry, Stephen

    2014-01-01

    A new interactive computer simulation designed to teach sales ethics is described. Simulation learner objectives include gaining a better understanding of legal issues in selling; realizing that ethical dilemmas do arise in selling; realizing the need to be honest when selling; seeing that there are conflicting demands from a salesperson's…

  2. Computer simulation and vehicle front optimisation.

    NARCIS (Netherlands)

    Sluis, J. van der

    1993-01-01

    The influence of the stiffness and shape of a car-front on injuries of bicyclists caused by side collisions was studied by computer simulation. Simulation was a suitable method in this case for two reasons: variation of shape and stiffness is more difficult to achieve in the case of an experiment

  3. Simulations of Probabilities for Quantum Computing

    Science.gov (United States)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  4. Computational fluid dynamics simulations and validations of results

    CSIR Research Space (South Africa)

    Sitek, MA

    2013-09-01

    Full Text Available Wind flow influence on a high-rise building is analyzed. The research covers full-scale tests, wind-tunnel experiments and numerical simulations. In the present paper the computational model used in the simulations is described and the results, which were...

  5. Stochastic model for computer simulation of the number of cancer cells and lymphocytes in homogeneous sections of cancer tumors

    CERN Document Server

    Castellanos-Moreno, Arnulfo; Corella-Madueño, Adalberto; Gutiérrez-López, Sergio; Rosas-Burgos, Rodrigo

    2014-01-01

    We deal with a tumor section small enough to be considered homogeneous, such that the populations of lymphocytes and cancer cells are independent of spatial coordinates. A stochastic model based on one-step processes is developed to take into account natural birth and death rates. Other rates are also introduced to consider medical treatment: the natural birth rate of lymphocytes and cancer cells; an induced death rate of cancer cells due to self-competition, and others caused by the activated lymphocytes acting on cancer cells. Additionally, a death rate of cancer cells due to induced apoptosis is considered. Weakness due to the advance of the sickness is considered by introducing a lymphocyte death rate proportional to the proliferation of cancer cells. The simulation is carried out considering different combinations of the parameters and their values, so that several strategies are taken into account to study the effect of anti-angiogenic drugs as well as the self-competition between cancer cells. Immune response, with the presence ...
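    The one-step-process formulation maps directly onto a Gillespie-type stochastic simulation: each birth or death channel gets a propensity, and events fire one at a time. The sketch below shows the pattern with invented rate constants; the actual rates and channels of the paper's model differ.

      import numpy as np

      rng = np.random.default_rng(1)

      def propensities(C, L):
          # State: C cancer cells, L lymphocytes. All rate constants invented.
          return np.array([
              0.30 * C,        # cancer cell birth
              1e-4 * C * C,    # cancer death by self-competition
              5e-4 * C * L,    # cancer death by activated lymphocytes
              0.05 * C,        # induced apoptosis (treatment)
              0.20 * L,        # lymphocyte birth
              1e-4 * L * C,    # lymphocyte loss as the disease advances
          ])

      CHANGES = [(1, 0), (-1, 0), (-1, 0), (-1, 0), (0, 1), (0, -1)]

      t, C, L = 0.0, 100, 50
      while t < 50.0 and C > 0:
          a = propensities(C, L)
          a0 = a.sum()
          t += rng.exponential(1.0 / a0)       # waiting time to the next event
          ch = rng.choice(len(a), p=a / a0)    # which channel fires
          C += CHANGES[ch][0]
          L += CHANGES[ch][1]
      print(t, C, L)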

  6. Outcomes from monitoring of patients on antiretroviral therapy in resource-limited settings with viral load, CD4 cell count, or clinical observation alone: a computer simulation model

    DEFF Research Database (Denmark)

    Phillips, Andrew N; Pillay, Deenan; Miners, Alec H

    2008-01-01

    of such monitoring strategies, especially in terms of survival and resistance development. METHODS: A validated computer simulation model of HIV infection and the effect of antiretroviral therapy was used to compare survival, use of second-line regimens, and development of resistance that result from different......, the predicted proportion of potential life-years survived was 83% with viral load monitoring (switch when viral load >500 copies per mL), 82% with CD4 cell count monitoring (switch at 50% drop from peak), and 82% with clinical monitoring (switch when two new WHO stage 3 events or a WHO stage 4 event occur...

  7. An Effective Data Representation and Computation Scheme in Computer Simulation for Neural Networks

    Institute of Scientific and Technical Information of China (English)

    CHEN Houjin; YUAN Baozong

    2004-01-01

    A biological neural network (BNN) is composed of a vast number of neurons interconnected by synapses. It has the ability to process information and generate specific patterns of electrical activity. To analyze its interior structure and exterior properties, computational models were combined with experimental data and a computer simulation system was implemented. As a BNN is a complicated nonlinear system and the simulation deals with a great amount of numeric computation, the data representation and computation scheme are critical to the simulation process. In this paper, an object-oriented data representation (OODR) was designed to have sharable and reusable properties, and a novel hybrid computation scheme is presented. With OODR, data sharing and computation sharing are achieved simultaneously. Under the hybrid computation scheme, an individual computation method is applied to each object according to its model characteristics, and the computation efficiency is markedly increased. Both were adopted in a BNN simulation system implemented in the platform-independent language Java. As the simulation system takes advantage of this data representation and computation scheme, its performance is greatly improved, and it has found practical applications in many countries.

  8. A computational model for feature binding

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The "Binding Problem" is an important problem across many disciplines, including psychology, neuroscience, computational modeling, and even philosophy. In this work, we proposed a novel computational model, Bayesian Linking Field Model, for feature binding in visual perception, by combining the idea of noisy neuron model, Bayesian method, Linking Field Network and competitive mechanism. Simulation Experiments demonstrated that our model perfectly fulfilled the task of feature binding in visual perception and provided us some enlightening idea for future research.

  9. A computational model for feature binding

    Institute of Scientific and Technical Information of China (English)

    SHI ZhiWei; SHI ZhongZhi; LIU Xi; SHI ZhiPing

    2008-01-01

    The "Binding Problem" is an important problem across many disciplines, including psychology, neuroscience, computational modeling, and even philosophy. In this work, we proposed a novel computational model, Bayesian Linking Field Model, for feature binding in visual perception, by combining the idea of noisy neuron model, Bayesian method, Linking Field Network and competitive mechanism.Simulation Experiments demonstrated that our model perfectly fulfilled the task of feature binding in visual perception and provided us some enlightening idea for future research.

  10. Computer simulation of two-phase flow in nuclear reactors

    Energy Technology Data Exchange (ETDEWEB)

    Wulff, W.

    1992-09-01

    Two-phase flow models dominate the economic resource requirements for development and use of computer codes for analyzing thermohydraulic transients in nuclear power plants. Six principles are presented on mathematical modeling and selection of numerical methods, along with suggestions on programming and machine selection, all aimed at reducing the cost of analysis. Computer simulation is contrasted with traditional computer calculation. The advantages of run-time interactive access operation in a simulation environment are demonstrated. It is explained that the drift-flux model is better suited for two-phase flow analysis in nuclear reactors than the two-fluid model, because of the latter's closure problem. The advantage of analytical over numerical integration is demonstrated. Modeling and programming techniques are presented which minimize the number of needed arithmetical and logical operations and thereby increase the simulation speed, while decreasing the cost.
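    For reference, the drift-flux model closes the relative motion between the phases algebraically rather than with a second momentum equation: the gas velocity follows from the mixture volumetric flux j as u_g = C0*j + u_gj. The sketch below evaluates that closure; the constants are illustrative bubbly/churn-flow values, not ones taken from the report.

      def gas_velocity(j, c0=1.13, u_gj=0.24):
          # Drift-flux closure u_g = C0*j + u_gj (Zuber-Findlay form):
          # c0 is the distribution parameter, u_gj the drift velocity (m/s).
          return c0 * j + u_gj

      def void_fraction(j_g, j, c0=1.13, u_gj=0.24):
          # Same closure rearranged: alpha = j_g / (C0*j + u_gj)
          return j_g / (c0 * j + u_gj)

      print(gas_velocity(1.5), void_fraction(0.4, 1.5))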

  11. Computer simulation of two-phase flow in nuclear reactors

    Energy Technology Data Exchange (ETDEWEB)

    Wulff, W.

    1992-01-01

    Two-phase flow models dominate the economic resource requirements for development and use of computer codes for analyzing thermohydraulic transients in nuclear power plants. Six principles are presented on mathematical modeling and selection of numerical methods, along with suggestions on programming and machine selection, all aimed at reducing the cost of analysis. Computer simulation is contrasted with traditional computer calculation. The advantages of run-time interactive access operation in a simulation environment are demonstrated. It is explained that the drift-flux model is better suited for two-phase flow analysis in nuclear reactors than the two-fluid model, because of the latter's closure problem. The advantage of analytical over numerical integration is demonstrated. Modeling and programming techniques are presented which minimize the number of needed arithmetical and logical operations and thereby increase the simulation speed, while decreasing the cost.

  12. Pervasive Computing and Prosopopoietic Modelling

    DEFF Research Database (Denmark)

    Michelsen, Anders Ib

    2011-01-01

    that have spread vertiginously since Mark Weiser coined the term ‘pervasive’, e.g., digitalised sensoring, monitoring, effectuation, intelligence, and display. Whereas Weiser’s original perspective may seem fulfilled since computing is everywhere, in his and Seely Brown’s (1997) terms, ‘invisible...... into the other. It also indicates a generative creation that itself points to important issues of ontology with methodological implications for the design of computing. In this article these implications will be conceptualised as prosopopoietic modeling on the basis of Bernward Joerges’ introduction......, pointing to a principal role of the paradoxical distinction/complicity within the computational heritage in three cases: a. Prosopopoietic aspects of John von Neumann’s First Draft of a Report on the EDVAC from 1945. b. Herbert Simon’s notion of simulation in The Science of the Artificial from the 1970s. c...

  13. Polymer Composites Corrosive Degradation: A Computational Simulation

    Science.gov (United States)

    Chamis, Christos C.; Minnetyan, Levon

    2007-01-01

    A computational simulation of polymer composites corrosive durability is presented. The corrosive environment is assumed to manage the polymer composite degradation on a ply-by-ply basis. The degradation is correlated with a measured pH factor and is represented by voids, temperature and moisture which vary parabolically for voids and linearly for temperature and moisture through the laminate thickness. The simulation is performed by a computational composite mechanics computer code which includes micro, macro, combined stress failure and laminate theories. This accounts for starting the simulation from constitutive material properties and up to the laminate scale which exposes the laminate to the corrosive environment. Results obtained for one laminate indicate that the ply-by-ply degradation degrades the laminate down to the last ply or the last several plies. Results also demonstrate that the simulation is applicable to other polymer composite systems as well.
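    The assumed through-thickness profiles are simple enough to state directly: voids parabolic, temperature and moisture linear. Below is a small sketch evaluating such profiles at the ply midplanes; all endpoint values are invented, and the exact profile orientation in the paper may differ.

      import numpy as np

      def degradation_profiles(n_plies, v_face, v_back, t_face, t_back,
                               m_face, m_back):
          # z = 0 at the exposed face, z = 1 at the back face. Voids fall off
          # parabolically from the exposed face; temperature and moisture are
          # linear, as the abstract describes. Endpoint values are invented.
          z = (np.arange(n_plies) + 0.5) / n_plies   # ply midplane coordinates
          voids = v_back + (v_face - v_back) * (1.0 - z)**2
          temp = t_face + (t_back - t_face) * z
          moist = m_face + (m_back - m_face) * z
          return voids, temp, moist

      voids, temp, moist = degradation_profiles(
          16, v_face=0.08, v_back=0.01, t_face=60.0, t_back=25.0,
          m_face=0.015, m_back=0.002)
      print(np.round(voids, 3))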

  14. Computer Simulation of Electron Positron Annihilation Processes

    Energy Technology Data Exchange (ETDEWEB)

    Chen, y

    2003-10-02

    With the launch of the Next Linear Collider coming closer and closer, there is a pressing need for physicists to develop a fully-integrated computer simulation of e⁺e⁻ annihilation processes at a center-of-mass energy of 1 TeV. A simulation program acts as the template for future experiments. Either new physics will be discovered, or current theoretical uncertainties will shrink due to more accurate higher-order radiative correction calculations. The existence of an efficient and accurate simulation will help us understand the new data and validate (or veto) some of the theoretical models developed to explain new physics. It should handle well the interfaces between different sectors of physics, e.g., interactions happening at the parton level well above the QCD scale, which are described by perturbative QCD, and interactions happening at much lower energy scales, which combine partons into hadrons. It should also achieve competitive speed in real time as the complexity of the simulation increases. This thesis contributes some tools that will be useful for the development of such simulation programs. We begin our study with the development of a new Monte Carlo algorithm intended to perform efficiently in selecting weight-1 events when multiple parameter dimensions are strongly correlated. The algorithm first seeks to model the peaks of the distribution by features, adapting these features to the function using the EM algorithm. The representation of the distribution provided by these features is then improved using the VEGAS algorithm for the Monte Carlo integration. The two strategies mesh neatly into an effective multi-channel adaptive representation. We then present a new algorithm for the simulation of parton shower processes in high energy QCD. We want to find an algorithm which is free of negative weights, produces its output as a set of exclusive events, and whose total rate exactly matches the full Feynman amplitude calculation. Our strategy is to create
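    The "weight-1 events" goal is conventionally reached by unweighting: draw candidates from an adaptive proposal density g, then accept each with probability w/w_max, where w = f/g. The sketch below shows only that final step, with the thesis's adaptive feature/VEGAS machinery replaced by a fixed Gaussian proposal and a made-up one-dimensional target.

      import numpy as np

      rng = np.random.default_rng(2)

      def f(x):
          # Made-up unnormalized target: a broad bump plus a narrow resonance
          return np.exp(-0.5*(x/2.0)**2) + 5.0*np.exp(-0.5*((x - 1.0)/0.05)**2)

      MU, SIG = 0.0, 3.0    # fixed Gaussian proposal, wide enough to bound f/g

      def g(x):
          return np.exp(-0.5*((x - MU)/SIG)**2) / (SIG*np.sqrt(2.0*np.pi))

      # Estimate the maximum weight from a pilot sample
      pilot = rng.normal(MU, SIG, 100_000)
      w_max = (f(pilot) / g(pilot)).max()

      events = []
      while len(events) < 1000:
          x = rng.normal(MU, SIG)
          if rng.random() < f(x) / (g(x) * w_max):
              events.append(x)     # every accepted event carries weight 1
      print(len(events), float(np.mean(events)))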

  15. Supramolecular organization of functional organic materials in the bulk and at organic/organic interfaces: a modeling and computer simulation approach.

    Science.gov (United States)

    Muccioli, Luca; D'Avino, Gabriele; Berardi, Roberto; Orlandi, Silvia; Pizzirusso, Antonio; Ricci, Matteo; Roscioni, Otello Maria; Zannoni, Claudio

    2014-01-01

    The molecular organization of functional organic materials is one of the research areas where the combination of theoretical modeling and experimental determinations is most fruitful. Here we present a brief summary of the simulation approaches used to investigate the inner structure of organic materials with semiconducting behavior, paying special attention to applications in organic photovoltaics and clarifying the often obscure jargon hindering the access of newcomers to the literature of the field. Special attention is paid to the choice of the computational "engine" (Monte Carlo or Molecular Dynamics) used to generate equilibrium configurations of the molecular system under investigation and, more importantly, to the choice of the chemical details in describing the molecular interactions. Recent literature dealing with the simulation of organic semiconductors is critically reviewed in order of increasing complexity of the system studied, from low molecular weight molecules to semiflexible polymers, including the challenging problem of determining the morphology of heterojunctions between two different materials.

  16. Theory, modeling, and simulation annual report, 1992

    Energy Technology Data Exchange (ETDEWEB)

    1993-05-01

    This report briefly discusses research on the following topics: development of electronic structure methods; modeling molecular processes in clusters; modeling molecular processes in solution; modeling molecular processes in separations chemistry; modeling interfacial molecular processes; modeling molecular processes in the atmosphere; methods for periodic calculations on solids; chemistry and physics of minerals; graphical user interfaces for computational chemistry codes; visualization and analysis of molecular simulations; integrated computational chemistry environment; and benchmark computations.

  17. Understanding Student Computational Thinking with Computational Modeling

    CERN Document Server

    Aiken, John M; Douglas, Scott S; Burk, John B; Scanlon, Erin M; Thoms, Brian D; Schatz, Michael F

    2012-01-01

    Recently, the National Research Council's framework for next generation science standards highlighted "computational thinking" as one of its "fundamental practices". Students taking a physics course that employed Arizona State University's Modeling Instruction curriculum were taught to construct computational models of physical systems. Student computational thinking was assessed using a proctored programming assignment, written essay, and a series of think-aloud interviews, where the students produced and discussed a computational model of a baseball in motion via a high-level programming environment (VPython). Roughly a third of the students in the study were successful in completing the programming assignment. Student success on this assessment was tied to how students synthesized their knowledge of physics and computation. On the essay and interview assessments, students displayed unique views of the relationship between force and motion; those who spoke of this relationship in causal (rather than obs...

  18. Computer Simulations Hone Leadership Skills

    Science.gov (United States)

    Olson, Lynn

    2007-01-01

    An $11 million executive-training course for principals, modeled after best practices used in the corporate, medical, engineering, and military worlds, is starting to gain traction among states. Developed by the National Institute for School Leadership, or NISL, a for-profit company based in Washington, the program is now used widely in…

  19. Computationally modeling interpersonal trust

    OpenAIRE

    Jin Joo Lee; Brad Knox; Jolie Baumann; Cynthia Breazeal; David DeSteno

    2013-01-01

    We present a computational model capable of predicting—above human accuracy—the degree of trust a person has toward their novel partner by observing the trust-related nonverbal cues expressed in their social interaction. We summarize our prior work, in which we identify nonverbal cues that signal untrustworthy behavior and also demonstrate the human mind’s readiness to interpret those cues to assess the trustworthiness of a social robot. We demonstrate that domain knowledge gained from our pr...

  20. Enabling Computational Technologies for Terascale Scientific Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Ashby, S.F.

    2000-08-24

    We develop scalable algorithms and object-oriented code frameworks for terascale scientific simulations on massively parallel processors (MPPs). Our research in multigrid-based linear solvers and adaptive mesh refinement enables Laboratory programs to use MPPs to explore important physical phenomena. For example, our research aids stockpile stewardship by making practical detailed 3D simulations of radiation transport. The need to solve large linear systems arises in many applications, including radiation transport, structural dynamics, combustion, and flow in porous media. These systems result from discretizations of partial differential equations on computational meshes. Our first research objective is to develop multigrid preconditioned iterative methods for such problems and to demonstrate their scalability on MPPs. Scalability describes how total computational work grows with problem size; it measures how effectively additional resources can help solve increasingly larger problems. Many factors contribute to scalability: computer architecture, parallel implementation, and choice of algorithm. Scalable algorithms have been shown to decrease simulation times by several orders of magnitude.
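    The multigrid idea behind these preconditioners fits in a short sketch: damp high-frequency error with a cheap smoother, transfer the remaining smooth residual to a coarser grid where it is cheap to solve, and correct. Below is a minimal serial two-grid cycle for the 1-D Poisson problem; it has no recursion or parallelism and uses illustrative sizes, so it only demonstrates the principle, not the Laboratory's solvers.

      import numpy as np

      def apply_A(u, h):
          # Matrix-free operator A = -d2/dx2 with zero Dirichlet boundaries
          up = np.pad(u, 1)
          return (2.0*up[1:-1] - up[:-2] - up[2:]) / h**2

      def jacobi(u, f, h, sweeps=3, omega=2.0/3.0):
          # Weighted Jacobi: damps the high-frequency error components
          for _ in range(sweeps):
              u = u + omega * (h**2 / 2.0) * (f - apply_A(u, h))
          return u

      def two_grid(u, f, h):
          u = jacobi(u, f, h)                                  # pre-smooth
          r = f - apply_A(u, h)
          rc = 0.25*r[0:-2:2] + 0.5*r[1:-1:2] + 0.25*r[2::2]   # restrict residual
          nc, hc = len(rc), 2.0*h
          Ac = (2.0*np.eye(nc) - np.eye(nc, k=1) - np.eye(nc, k=-1)) / hc**2
          ec = np.linalg.solve(Ac, rc)                         # coarse solve
          e = np.zeros_like(u)                                 # interpolate back
          e[1::2] = ec
          ece = np.pad(ec, 1)
          e[0::2] = 0.5*(ece[:-1] + ece[1:])
          return jacobi(u + e, f, h)                           # post-smooth

      n = 63
      h = 1.0 / (n + 1)
      f = np.ones(n)
      u = np.zeros(n)
      for it in range(5):
          u = two_grid(u, f, h)
          print(it, float(np.linalg.norm(f - apply_A(u, h))))  # residual drops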

  1. Computer Simulation Instruction: Carrying out Chemical Experiments

    Directory of Open Access Journals (Sweden)

    Ibtesam Al-Mashaqbeh

    2014-05-01

    Full Text Available The purpose of this study was to investigate the effect of computer simulation instruction (CSI) on students' achievement in carrying out chemical experiments to acquire chemical concepts, for eleventh grade students. The subjects of the study were two sections of a girls' high school in Jordan. One section was randomly assigned to the experimental group, in which computer simulation instruction (CSI) was used, and the other section was randomly assigned to the control group, in which students were taught using traditional instruction. The findings indicated progress on the part of the experimental group, which used computer simulation instruction (CSI), and this was reflected positively in the students' achievement in carrying out chemical experiments to acquire chemical concepts.

  2. Computer simulation for centrifugal mold filling of precision titanium castings

    Institute of Scientific and Technical Information of China (English)

    2004-01-01

    Computer simulation codes were developed based on a proposed mathematical model for centrifugal mold filling processes and previous computer software for 3D mold filling and solidification of castings (CASM-3D for Windows). Sample simulations were implemented for mold filling processes of precision titanium castings under gravity and different centrifugal casting techniques. The computation results show that the alloy melt has a much stronger mold filling ability for thin section castings under a centrifugal force field than under gravity alone. A "return back" mold filling manner is shown to be a reasonable technique for centrifugal casting processes, especially for thin section precision castings.
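    The physical reason rotation fills thin sections better is the centrifugal pressure head: melt carried from radius r0 out to r at angular speed omega gains dp = rho*omega^2*(r^2 - r0^2)/2, which at modest speeds already dwarfs the gravity head rho*g*h. A quick comparison with illustrative numbers for a titanium melt:

      import math

      RHO = 4110.0    # liquid titanium density, kg/m^3 (approximate)
      G = 9.81

      def gravity_head(h):
          # Metallostatic pressure of a melt column of height h (Pa)
          return RHO * G * h

      def centrifugal_head(rpm, r0, r):
          # Pressure gained between radii r0 and r at the given spin rate (Pa)
          omega = 2.0 * math.pi * rpm / 60.0
          return RHO * omega**2 * (r**2 - r0**2) / 2.0

      print(gravity_head(0.2))                  # ~8 kPa for a 0.2 m column
      print(centrifugal_head(300, 0.05, 0.25))  # ~120 kPa at only 300 rpm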

  3. Development and validation of a computational finite element model of the rabbit upper airway: simulations of mandibular advancement and tracheal displacement.

    Science.gov (United States)

    Amatoury, Jason; Cheng, Shaokoon; Kairaitis, Kristina; Wheatley, John R; Amis, Terence C; Bilston, Lynne E

    2016-04-01

    The mechanisms leading to upper airway (UA) collapse during sleep are complex and poorly understood. We previously developed an anesthetized rabbit model for studying UA physiology. On the basis of this body of physiological data, we aimed to develop and validate a two-dimensional (2D) computational finite element model (FEM) of the passive rabbit UA and peripharyngeal tissues. Model geometry was reconstructed from a midsagittal computed tomographic image of a representative New Zealand White rabbit, which included major soft (tongue, soft palate, constrictor muscles), cartilaginous (epiglottis, thyroid cartilage), and bony pharyngeal tissues (mandible, hard palate, hyoid bone). Other UA muscles were modeled as linear elastic connections. Initial boundary and contact definitions were defined from anatomy and material properties derived from the literature. Model parameters were optimized to physiological data sets associated with mandibular advancement (MA) and caudal tracheal displacement (TD), including hyoid displacement, which featured with both applied loads. The model was then validated against independent data sets involving combined MA and TD. Model outputs included UA lumen geometry, peripharyngeal tissue displacement, and stress and strain distributions. Simulated MA and TD resulted in UA enlargement and nonuniform increases in tissue displacement, and stress and strain. Model predictions closely agreed with experimental data for individually applied MA, TD, and their combination. We have developed and validated an FEM of the rabbit UA that predicts UA geometry and peripharyngeal tissue mechanical changes associated with interventions known to improve UA patency. The model has the potential to advance our understanding of UA physiology and peripharyngeal tissue mechanics. Copyright © 2016 the American Physiological Society.

  4. Structural Composites Corrosive Management by Computational Simulation

    Science.gov (United States)

    Chamis, Christos C.; Minnetyan, Levon

    2006-01-01

    A simulation of corrosive management on polymer composites durability is presented. The corrosive environment is assumed to manage the polymer composite degradation on a ply-by-ply basis. The degradation is correlated with a measured pH factor and is represented by voids, temperature, and moisture which vary parabolically for voids and linearly for temperature and moisture through the laminate thickness. The simulation is performed by a computational composite mechanics computer code which includes micro, macro, combined stress failure, and laminate theories. This accounts for starting the simulation from constitutive material properties and up to the laminate scale which exposes the laminate to the corrosive environment. Results obtained for one laminate indicate that the ply-by-ply managed degradation degrades the laminate down to the last ply or the last several plies. Results also demonstrate that the simulation is applicable to other polymer composite systems as well.

  5. Electric Propulsion Plume Simulations Using Parallel Computer

    Directory of Open Access Journals (Sweden)

    Joseph Wang

    2007-01-01

    Full Text Available A parallel, three-dimensional electrostatic PIC code is developed for large-scale electric propulsion simulations using parallel supercomputers. This code uses a newly developed immersed-finite-element particle-in-cell (IFE-PIC) algorithm designed to handle complex boundary conditions accurately while maintaining the computational speed of the standard PIC code. Domain decomposition is used in both the field solve and the particle push to divide the computation among processors. Two simulation studies are presented to demonstrate the capability of the code. The first is a full particle simulation of a near-thruster plume using the real ion to electron mass ratio. The second is a high-resolution simulation of multiple ion thruster plume interactions for a realistic spacecraft using a domain enclosing the entire solar array panel. Performance benchmarks show that the IFE-PIC achieves a high parallel efficiency of ≥ 90%.
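    To show the computational pattern the abstract builds on, here is a minimal 1-D periodic electrostatic PIC step: standard PIC with cloud-in-cell weighting and an FFT field solve in normalized units, not the paper's immersed-finite-element variant or its domain decomposition. All sizes and the two-stream initial condition are illustrative.

      import numpy as np

      rng = np.random.default_rng(3)

      NG, NP, L, DT, QM = 64, 10000, 2.0*np.pi, 0.1, -1.0
      dx = L / NG
      x = rng.uniform(0.0, L, NP)
      v = 0.1*rng.normal(size=NP) + np.where(rng.random(NP) < 0.5, 1.0, -1.0)
      q = -L / NP    # electron macro-charge; uniform ion background below

      def field_at_particles(x):
          g = x / dx
          i = np.floor(g).astype(int) % NG
          w = g - np.floor(g)
          # Cloud-in-cell charge deposition onto the grid
          rho = (np.bincount(i, (1.0 - w)*q/dx, NG)
                 + np.bincount((i + 1) % NG, w*q/dx, NG))
          rho += 1.0                       # neutralizing ion background
          # Spectral solve of dE/dx = rho (Gauss's law, eps0 = 1)
          k = 2.0*np.pi*np.fft.fftfreq(NG, dx)
          k[0] = 1.0                       # avoid divide-by-zero for the mean
          Ek = np.fft.fft(rho) / (1j*k)
          Ek[0] = 0.0                      # neutral plasma: zero-mean field
          E = np.real(np.fft.ifft(Ek))
          # Gather the field back to the particles with the same weights
          return (1.0 - w)*E[i] + w*E[(i + 1) % NG]

      for _ in range(100):                 # leapfrog: kick, then drift
          v += QM * field_at_particles(x) * DT
          x = (x + v*DT) % L
      print("mean kinetic energy:", float(0.5*np.mean(v**2)))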

  6. Computer simulation models for teaching and learning / Modelos de simulación en salud: una alternativa para la docencia

    Directory of Open Access Journals (Sweden)

    Juan Gonzálo Restrepo Salazar

    1997-04-01

    Full Text Available Computer programs are being used for the teaching and learning of pharmacology and physiology at the University of Antioquia in Medellín, Colombia. They should be more widely used, since they offer clear advantages over traditional systems of teaching: they allow direct presentation of models in motion as well as a more active, interesting, and flexible way of learning; besides, they can save time and cut costs. Computer simulation models are learning programs for teaching subjects such as pharmacology and physiology to students in the health sciences and basic biomedical sciences. Simulation experiments can be used to support teaching and, in some circumstances, as an alternative to laboratory practicals. The computer technology now available permits the direct presentation of models in motion and makes possible a less passive, more efficient, and more interesting kind of learning. The simulated tissue response is generated either from the results of actual experiments or from predictive models, and is presented on screen with high-resolution graphics comparable to real situations. Students can perform simulated experiments, easily change their parameters, and obtain information just as if they had carried out the experiment in the laboratory.

  7. Time reversibility, computer simulation, and chaos

    CERN Document Server

    Hoover, William Graham

    1999-01-01

    A small army of physicists, chemists, mathematicians, and engineers has joined forces to attack a classic problem, the "reversibility paradox", with modern tools. This book describes their work from the perspective of computer simulation, emphasizing the author's approach to the problem of understanding the compatibility, and even inevitability, of the irreversible second law of thermodynamics with an underlying time-reversible mechanics. Computer simulation has made it possible to probe reversibility from a variety of directions and "chaos theory" or "nonlinear dynamics" has supplied a useful vocabulary and a set of concepts, which allow a fuller explanation of irreversibility than that available to Boltzmann or to Green, Kubo and Onsager.

  8. Computer simulation of Wheeler's delayed-choice experiment with photons

    NARCIS (Netherlands)

    Zhao, S.; Yuan, S.; De Raedt, H.; Michielsen, K.

    2008-01-01

    We present a computer simulation model of Wheeler's delayed-choice experiment that is a one-to-one copy of an experiment reported recently (Jacques V. et al., Science, 315 (2007) 966). The model is solely based on experimental facts, satisfies Einstein's criterion of local causality and does not rely on concepts of quantum theory.

  10. The Antares computing model

    Energy Technology Data Exchange (ETDEWEB)

    Kopper, Claudio, E-mail: claudio.kopper@nikhef.nl [NIKHEF, Science Park 105, 1098 XG Amsterdam (Netherlands)

    2013-10-11

    Completed in 2008, Antares is now the largest water Cherenkov neutrino telescope in the Northern Hemisphere. Its main goal is to detect neutrinos from galactic and extra-galactic sources. Due to the high background rate of atmospheric muons and the high level of bioluminescence, several on-line and off-line filtering algorithms have to be applied to the raw data taken by the instrument. To be able to handle this data stream, a dedicated computing infrastructure has been set up. The paper covers the main aspects of the current official Antares computing model. This includes an overview of on-line and off-line data handling and storage. In addition, the current usage of the “IceTray” software framework for Antares data processing is highlighted. Finally, an overview of the data storage formats used for high-level analysis is given.

  11. GATE Monte Carlo simulation in a cloud computing environment

    Science.gov (United States)

    Rowedder, Blake Austin

    The GEANT4-based GATE is a unique and powerful Monte Carlo (MC) platform, which provides a single code library allowing the simulation of specific medical physics applications, e.g. PET, SPECT, CT, radiotherapy, and hadron therapy. However, this rigorous yet flexible platform is used only sparingly in the clinic due to its lengthy calculation time. By accessing the powerful computational resources of a cloud computing environment, GATE's runtime can be significantly reduced to clinically feasible levels without the sizable investment of a local high-performance cluster. This study investigated a reliable and efficient execution of GATE MC simulations using a commercial cloud computing service. Amazon's Elastic Compute Cloud was used to launch several nodes equipped with GATE. Job data was initially broken up on the local computer, then uploaded to the worker nodes on the cloud. The results were automatically downloaded and aggregated on the local computer for display and analysis. Five simulations were repeated for every cluster size between 1 and 20 nodes. Ultimately, increasing cluster size resulted in a decrease in calculation time that could be expressed with an inverse power model. Comparing the benchmark results to the published values and error margins indicated that the simulation results were not affected by the cluster size and thus that the integrity of a calculation is preserved in a cloud computing environment. The runtime of a 53-minute simulation was decreased to 3.11 minutes when run on a 20-node cluster. The ability to improve the speed of simulation suggests that fast MC simulations are viable for imaging and radiotherapy applications. With high-performance computing continuing to fall in price and become more accessible, implementing Monte Carlo techniques with cloud computing for clinical applications will continue to become more attractive.
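
    The runtime scaling reported above can be captured with a small fit. The sketch below, assuming made-up intermediate timings (only the 1-node and 20-node values come from the abstract), fits the inverse power model T(n) = a * n**b in log-log space; b near -1 indicates near-ideal scaling.

      import numpy as np

      # Hypothetical runtimes (minutes) vs cluster size; only the 1-node and
      # 20-node values are quoted in the abstract, the rest are assumed.
      nodes = np.array([1, 2, 5, 10, 20])
      runtime = np.array([53.0, 27.5, 11.6, 6.0, 3.11])

      # Fit T(n) = a * n**b by linear regression in log-log space.
      b, log_a = np.polyfit(np.log(nodes), np.log(runtime), 1)
      print(f"T(n) ~= {np.exp(log_a):.1f} * n**{b:.2f}")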

  12. Simulation Versus Models: Which One and When?

    Science.gov (United States)

    Dorn, William S.

    1975-01-01

    Describes two types of computer-based experiments: simulation (which assumes no student knowledge of the workings of the computer program) is recommended for experiments aimed at inductive reasoning; and modeling (which assumes student understanding of the computer program) is recommended for deductive processes. (MLH)

  13. The Simulation and Analysis of the Closed Die Hot Forging Process by A Computer Simulation Method

    Directory of Open Access Journals (Sweden)

    Dipakkumar Gohil

    2012-06-01

    Full Text Available The objective of this research work is to study the variation of various parameters such as stress, strain, temperature, force, etc. during the closed die hot forging process. A computer simulation modeling approach has been adopted to transform the theoretical aspects into a computer algorithm which would be used to simulate and analyze the closed die hot forging process. For the purpose of process study, the entire deformation process has been divided into a finite number of steps and the output values have been computed at each deformation step. The results of the simulation have been graphically represented, and suitable corrective measures are also recommended if the simulation results do not agree with the theoretical values. This computer simulation approach would significantly improve the productivity and reduce the energy consumption of the overall process for components manufactured by closed die forging, and contribute towards the efforts to reduce global warming.
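
    As a rough illustration of dividing the deformation into a finite number of steps and computing outputs at each step, the following sketch compresses a billet under an assumed power-law flow stress; the material constants, geometry, and frictionless-force formula are illustrative assumptions, not values from the paper.

      import math

      K, n = 500.0e6, 0.15            # assumed hardening law: sigma = K * eps**n (Pa)
      h0, hf, A0 = 0.05, 0.02, 1e-3   # initial/final height (m), initial area (m^2)
      steps = 10
      V = A0 * h0                     # volume constancy during forging

      for i in range(1, steps + 1):
          h = h0 + (hf - h0) * i / steps
          eps = math.log(h0 / h)      # true compressive strain at this step
          sigma = K * eps ** n        # flow stress
          F = sigma * V / h           # ideal (frictionless) forging force
          print(f"step {i:2d}: h={h*1e3:5.1f} mm  strain={eps:.3f}  F={F/1e6:6.2f} MN")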

  14. Perspective: Computer simulations of long time dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Elber, Ron [Department of Chemistry, The Institute for Computational Engineering and Sciences, University of Texas at Austin, Austin, Texas 78712 (United States)

    2016-02-14

    Atomically detailed computer simulations of complex molecular events have attracted the imagination of many researchers in the field as a source of comprehensive information on chemical, biological, and physical processes. However, one of the greatest limitations of these simulations is that of time scales. The physical time scales accessible to straightforward simulations are too short to address many interesting and important molecular events. In the last decade significant advances were made in different directions (theory, software, and hardware) that significantly expand the capabilities and accuracies of these techniques. This perspective describes and critically examines some of these advances.

  15. Methods and computer executable instructions for rapidly calculating simulated particle transport through geometrically modeled treatment volumes having uniform volume elements for use in radiotherapy

    Science.gov (United States)

    Frandsen, Michael W.; Wessol, Daniel E.; Wheeler, Floyd J.

    2001-01-16

    Methods and computer executable instructions are disclosed for ultimately developing a dosimetry plan for a treatment volume targeted for irradiation during cancer therapy. The dosimetry plan is available in "real-time" which especially enhances clinical use for in vivo applications. The real-time is achieved because of the novel geometric model constructed for the planned treatment volume which, in turn, allows for rapid calculations to be performed for simulated movements of particles along particle tracks there through. The particles are exemplary representations of neutrons emanating from a neutron source during BNCT. In a preferred embodiment, a medical image having a plurality of pixels of information representative of a treatment volume is obtained. The pixels are: (i) converted into a plurality of substantially uniform volume elements having substantially the same shape and volume of the pixels; and (ii) arranged into a geometric model of the treatment volume. An anatomical material associated with each uniform volume element is defined and stored. Thereafter, a movement of a particle along a particle track is defined through the geometric model along a primary direction of movement that begins in a starting element of the uniform volume elements and traverses to a next element of the uniform volume elements. The particle movement along the particle track is effectuated in integer based increments along the primary direction of movement until a position of intersection occurs that represents a condition where the anatomical material of the next element is substantially different from the anatomical material of the starting element. This position of intersection is then useful for indicating whether a neutron has been captured, scattered or exited from the geometric model. From this intersection, a distribution of radiation doses can be computed for use in the cancer therapy. The foregoing represents an advance in computational times by multiple factors of
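
    The integer-increment traversal described in the patent can be sketched as follows; the voxel grid, materials, and track direction are made-up examples, and real particle transport would of course include physics beyond this geometric march.

      import numpy as np

      materials = np.zeros((16, 16, 16), dtype=int)   # toy two-material model
      materials[:, :, 8:] = 1

      def march(start, direction, grid):
          """Step in integer increments along `direction` until the material
          differs from the starting voxel's, returning that position of
          intersection, or None if the track exits the geometric model."""
          ijk = np.array(start, dtype=int)
          step = np.sign(np.array(direction)).astype(int)
          start_mat = grid[tuple(ijk)]
          while True:
              ijk = ijk + step
              if np.any(ijk < 0) or np.any(ijk >= grid.shape):
                  return None                         # particle exited the model
              if grid[tuple(ijk)] != start_mat:
                  return tuple(int(v) for v in ijk)   # position of intersection

      print(march((8, 8, 0), (0, 0, 1), materials))   # -> (8, 8, 8)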

  16. Theory Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Shlachter, Jack [Los Alamos National Laboratory

    2012-08-23

    Los Alamos has a long history in theory, modeling and simulation. We focus on multidisciplinary teams that tackle complex problems. Theory, modeling and simulation are tools to solve problems just like an NMR spectrometer, a gas chromatograph or an electron microscope. Problems should be used to define the theoretical tools needed and not the other way around. Best results occur when theory and experiments are working together in a team.

  17. Model of computation for Fourier optical processors

    Science.gov (United States)

    Naughton, Thomas J.

    2000-05-01

    We present a novel and simple theoretical model of computation that captures what we believe are the most important characteristics of an optical Fourier transform processor. We use this abstract model to reason about the computational properties of the physical systems it describes. We define a grammar for our model's instruction language, and use it to write algorithms for well-known filtering and correlation techniques. We also suggest suitable computational complexity measures that could be used to analyze any coherent optical information processing technique, described with the language, for efficiency. Our choice of instruction language allows us to argue that algorithms describable with this model should have optical implementations that do not require a digital electronic computer to act as a master unit. Through simulation of a well known model of computation from computer theory we investigate the general-purpose capabilities of analog optical processors.

  18. Computer Simulation Study of Bipolaron Formation

    NARCIS (Netherlands)

    Raedt, H. De; Lagendijk, A.

    1986-01-01

    Monte Carlo computer simulation techniques are used to study the formation of bipolarons on a lattice. The transition between the three possible states (extended, two-polaron, and bipolaron) is studied. The phase diagram as a function of the strengths of the electron-phonon coupling and the repulsive interaction is presented.

  19. Computer simulations of phospholipid - membrane thermodynamic fluctuations

    DEFF Research Database (Denmark)

    Pedersen, U.R.; Peters, Günther H.j.; Schröder, T.B.

    2008-01-01

    This paper reports all-atom computer simulations of five phospholipid membranes, DMPC, DPPC, DMPG, DMPS, and DMPSH, with a focus on the thermal equilibrium fluctuations of volume, energy, area, thickness, and order parameter. For the slow fluctuations at constant temperature and pressure (defined...

  20. GENMAP--A Microbial Genetics Computer Simulation.

    Science.gov (United States)

    Day, M. J.; And Others

    1985-01-01

    An interactive computer program in microbial genetics is described. The simulation allows students to work at their own pace and develop understanding of microbial techniques as they choose donor bacterial strains, specify selective media, and interact with demonstration experiments. Sample questions and outputs are included. (DH)

  1. On the computational modeling of FSW processes

    OpenAIRE

    Agelet de Saracibar Bosch, Carlos; Chiumenti, Michèle; Santiago, Diego de; Cervera Ruiz, Miguel; Dialami, Narges; Lombera, Guillermo

    2010-01-01

    This work deals with the computational modeling and numerical simulation of Friction Stir Welding (FSW) processes. Here a quasi-static, transient, mixed stabilized Eulerian formulation is used. Norton-Hoff and Sheppard-Wright rigid thermoplastic material models have been considered. A product formula algorithm, leading to a staggered solution scheme, has been used. The model has been implemented into the in-house developed FE code COMET. Results obtained in the simulation of FSW process are c...

  2. Spiking network simulation code for petascale computers

    Science.gov (United States)

    Kunkel, Susanne; Schmidt, Maximilian; Eppler, Jochen M.; Plesser, Hans E.; Masumoto, Gen; Igarashi, Jun; Ishii, Shin; Fukai, Tomoki; Morrison, Abigail; Diesmann, Markus; Helias, Moritz

    2014-01-01

    Brain-scale networks exhibit a breathtaking heterogeneity in the dynamical properties and parameters of their constituents. At cellular resolution, the entities of theory are neurons and synapses and over the past decade researchers have learned to manage the heterogeneity of neurons and synapses with efficient data structures. Already early parallel simulation codes stored synapses in a distributed fashion such that a synapse solely consumes memory on the compute node harboring the target neuron. As petaflop computers with some 100,000 nodes become increasingly available for neuroscience, new challenges arise for neuronal network simulation software: Each neuron contacts on the order of 10,000 other neurons and thus has targets only on a fraction of all compute nodes; furthermore, for any given source neuron, at most a single synapse is typically created on any compute node. From the viewpoint of an individual compute node, the heterogeneity in the synaptic target lists thus collapses along two dimensions: the dimension of the types of synapses and the dimension of the number of synapses of a given type. Here we present a data structure taking advantage of this double collapse using metaprogramming techniques. After introducing the relevant scaling scenario for brain-scale simulations, we quantitatively discuss the performance on two supercomputers. We show that the novel architecture scales to the largest petascale supercomputers available today. PMID:25346682

  4. Creation of an idealized nasopharynx geometry for accurate computational fluid dynamics simulations of nasal airflow in patient-specific models lacking the nasopharynx anatomy.

    Science.gov (United States)

    A T Borojeni, Azadeh; Frank-Ito, Dennis O; Kimbell, Julia S; Rhee, John S; Garcia, Guilherme J M

    2017-05-01

    Virtual surgery planning based on computational fluid dynamics (CFD) simulations has the potential to improve surgical outcomes for nasal airway obstruction patients, but the benefits of virtual surgery planning must outweigh the risks of radiation exposure. Cone beam computed tomography (CT) scans represent an attractive imaging modality for virtual surgery planning due to lower costs and lower radiation exposures compared with conventional CT scans. However, to minimize the radiation exposure, the cone beam CT sinusitis protocol sometimes images only the nasal cavity, excluding the nasopharynx. The goal of this study was to develop an idealized nasopharynx geometry for accurate representation of outlet boundary conditions when the nasopharynx geometry is unavailable. Anatomically accurate models of the nasopharynx created from 30 CT scans were intersected with planes rotated at different angles to obtain an average geometry. Cross sections of the idealized nasopharynx were approximated as ellipses with cross-sectional areas and aspect ratios equal to the average in the actual patient-specific models. CFD simulations were performed to investigate whether nasal airflow patterns were affected when the CT-based nasopharynx was replaced by the idealized nasopharynx in 10 nasal airway obstruction patients. Despite the simple form of the idealized geometry, all biophysical variables (nasal resistance, airflow rate, and heat fluxes) were very similar in the idealized vs patient-specific models. The results confirmed the expectation that the nasopharynx geometry has a minimal effect on the nasal airflow patterns during inspiration. The idealized nasopharynx geometry will be useful in future CFD studies of nasal airflow based on medical images that exclude the nasopharynx.
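
    For reference, the ellipse construction used for the idealized cross sections follows directly from the average area A and aspect ratio r = a/b, since A = πab. The sketch below uses invented example values, not the averages over the 30 CT scans.

      import math

      def ellipse_semi_axes(area, aspect_ratio):
          """Semi-axes (a, b) of an ellipse with area = pi*a*b and r = a/b."""
          a = math.sqrt(area * aspect_ratio / math.pi)
          b = math.sqrt(area / (math.pi * aspect_ratio))
          return a, b

      a, b = ellipse_semi_axes(area=250.0, aspect_ratio=1.8)   # assumed values
      print(f"a = {a:.1f} mm, b = {b:.1f} mm")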

  5. Simulation modeling of carcinogenesis.

    Science.gov (United States)

    Ellwein, L B; Cohen, S M

    1992-03-01

    A discrete-time simulation model of carcinogenesis is described mathematically using recursive relationships between time-varying model variables. The dynamics of cellular behavior is represented within a biological framework that encompasses two irreversible and heritable genetic changes. Empirical data and biological supposition dealing with both control and experimental animal groups are used together to establish values for model input variables. The estimation of these variables is integral to the simulation process as described in step-by-step detail. Hepatocarcinogenesis in male F344 rats provides the basis for seven modeling scenarios which illustrate the complexity of relationships among cell proliferation, genotoxicity, and tumor risk.
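
    A toy version of such a discrete-time recursion, with two irreversible heritable changes and invented rates (not the paper's fitted values), might look like this:

      # Normal cells (N) acquire a first heritable change at rate mu1 to become
      # initiated cells (I); initiated cells acquire a second change at rate mu2
      # to become transformed cells (T). g is a small proliferation advantage.
      mu1, mu2, g = 1e-6, 1e-5, 1.001      # all rates are illustrative
      N, I, T = 1e8, 0.0, 0.0

      for day in range(1, 731):            # two years of daily time steps
          new_I = mu1 * N
          new_T = mu2 * I
          N -= new_I
          I = g * (I + new_I) - new_T
          T += new_T
          if day % 365 == 0:
              print(f"day {day}: initiated={I:.3g}, transformed={T:.3g}")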

  6. DNA computing models

    CERN Document Server

    Ignatova, Zoya; Zimmermann, Karl-Heinz

    2008-01-01

    In this excellent text, the reader is given a comprehensive introduction to the field of DNA computing. The book emphasizes computational methods to tackle central problems of DNA computing, such as controlling living cells, building patterns, and generating nanomachines.

  7. Computational biomechanics for medicine imaging, modeling and computing

    CERN Document Server

    Doyle, Barry; Wittek, Adam; Nielsen, Poul; Miller, Karol

    2016-01-01

    The Computational Biomechanics for Medicine titles provide an opportunity for specialists in computational biomechanics to present their latest methodologies and advancements. This volume comprises eighteen of the newest approaches and applications of computational biomechanics, from researchers in Australia, New Zealand, USA, UK, Switzerland, Scotland, France and Russia. Some of the interesting topics discussed are: tailored computational models; traumatic brain injury; soft-tissue mechanics; medical image analysis; and clinically-relevant simulations. One of the greatest challenges facing the computational engineering community is to extend the success of computational mechanics to fields outside traditional engineering, in particular to biology, the biomedical sciences, and medicine. We hope the research presented within this book series will contribute to overcoming this grand challenge.

  8. Computational algorithms for simulations in atmospheric optics.

    Science.gov (United States)

    Konyaev, P A; Lukin, V P

    2016-04-20

    A computer simulation technique for atmospheric and adaptive optics based on parallel programming is discussed. A parallel propagation algorithm is designed and a modified spectral-phase method for computer generation of 2D time-variant random fields is developed. Temporal power spectra of Laguerre-Gaussian beam fluctuations are considered as an example to illustrate the applications discussed. Implementation of the proposed algorithms using Intel MKL and IPP libraries and NVIDIA CUDA technology is shown to be very fast and accurate. The hardware system for the computer simulation is an off-the-shelf desktop with an Intel Core i7-4790K CPU operating at a turbo-speed frequency up to 5 GHz and an NVIDIA GeForce GTX-960 graphics accelerator with 1024 1.5 GHz processors.
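
    A common baseline for generating 2D random fields of this kind is to filter complex white noise with the square root of a target power spectrum and inverse-transform; the sketch below uses an assumed power-law exponent and is not the authors' modified spectral-phase method.

      import numpy as np

      n, L = 256, 1.0
      fx = np.fft.fftfreq(n, d=L / n)
      kx, ky = np.meshgrid(2 * np.pi * fx, 2 * np.pi * fx)
      k = np.hypot(kx, ky)
      k[0, 0] = np.inf                      # remove the zero-frequency mode

      spectrum = k ** (-11.0 / 3.0)         # assumed Kolmogorov-like power law
      noise = np.random.randn(n, n) + 1j * np.random.randn(n, n)
      field = np.real(np.fft.ifft2(noise * np.sqrt(spectrum)))
      print(f"field std: {field.std():.3g}")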

  9. Simulation - modeling - experiment; Simulation - modelisation - experience

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-07-01

    Following two workshops held in 2001 on the same topics, and in order to take stock of advances in the domain of simulation and measurements, the main goals proposed for this workshop are: the presentation of the state of the art of tools, methods and experiments in the domains of interest of the Gedepeon research group, and the exchange of information about the possible uses of computer codes and facilities, about the understanding of physical and chemical phenomena, and about development and experiment needs. This document gathers 18 of the 19 presentations (slides) given at this workshop, dealing with: the deterministic and stochastic codes in reactor physics (Rimpault G.); MURE: an evolution code coupled with MCNP (Meplan O.); neutronic calculation of future reactors at EdF (Lecarpentier D.); advance status of the MCNP/TRIO-U neutronic/thermal-hydraulics coupling (Nuttin A.); the FLICA4/TRIPOLI4 thermal-hydraulics/neutronics coupling (Aniel S.); methods of disturbances and sensitivity analysis of nuclear data in reactor physics, application to the VENUS-2 experimental reactor (Bidaud A.); modeling for the reliability improvement of an ADS accelerator (Biarotte J.L.); residual gas compensation of the space charge of intense beams (Ben Ismail A.); experimental determination and numerical modeling of phase equilibrium diagrams of interest in nuclear applications (Gachon J.C.); modeling of irradiation effects (Barbu A.); elastic limit and irradiation damage in Fe-Cr alloys: simulation and experiment (Pontikis V.); experimental measurements of spallation residues, comparison with Monte-Carlo simulation codes (Fallot M.); the spallation target-reactor coupling (Rimpault G.); tools and data (Grouiller J.P.); models in high energy transport codes: status and perspective (Leray S.); other ways of investigation for spallation (Audoin L.); neutrons and light particles production at intermediate energies (20-200 MeV) with iron, lead and uranium targets (Le Colley F

  10. Capability of a regional climate model to simulate climate variables requested for water balance computation: a case study over northeastern France

    Science.gov (United States)

    Boulard, Damien; Castel, Thierry; Camberlin, Pierre; Sergent, Anne-Sophie; Bréda, Nathalie; Badeau, Vincent; Rossi, Aurélien; Pohl, Benjamin

    2016-05-01

    This paper documents the capability of the ARW/WRF regional climate model to regionalize near-surface atmospheric variables at high resolution (8 km) over Burgundy (northeastern France) from daily to interannual timescales. To that purpose, a 20-year continuous simulation (1989-2008) was carried out. The WRF model driven by ERA-Interim reanalyses was compared to in situ observations and a mesoscale atmospheric analysis system (SAFRAN) for five near-surface variables: precipitation, air temperature, wind speed, relative humidity and solar radiation, the last four variables being used for the calculation of potential evapotranspiration (ET0). Results show a significant improvement upon ERA-Interim, owing to the model's good skill in reproducing the spatial distribution of all weather variables, in spite of a slight overestimation of precipitation amounts, mostly during the summer convective season, and of wind speed during winter. As compared to the Météo-France observations, WRF also improves upon the SAFRAN analyses, which partly fail to show realistic spatial distributions for wind speed, relative humidity and solar radiation, the latter being strongly underestimated. The SAFRAN ET0 is thus highly underestimated too. WRF ET0 is in better agreement with observations. In order to evaluate WRF's capability to simulate a reliable ET0, the water balance of thirty Douglas-fir stands was computed using a process-based model. Three soil water deficit indexes, corresponding to the sum of the daily deviations between the relative extractable water and a critical value of 40 % below which low soil water content affects tree growth, were calculated using the nearest weather station, SAFRAN analysis weather data, or merged observation and WRF weather variables. Correlations between Douglas-fir growth and the three estimated soil water deficit indexes show similar results. These results showed through the ET0 estimation and the relation between mean annual SWDI
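
    The soil water deficit index described here reduces to a simple sum of daily shortfalls below the 40 % critical value; the sketch below applies that definition to a synthetic relative-extractable-water series (the data are invented, not from the paper).

      import numpy as np

      def swdi(rew, critical=0.40):
          """Sum of daily deficits of relative extractable water below critical."""
          return float(np.sum(np.clip(critical - np.asarray(rew), 0.0, None)))

      rng = np.random.default_rng(0)        # synthetic growing-season REW series
      days = np.linspace(0.0, 2.0 * np.pi, 180)
      rew = np.clip(0.55 + 0.25 * np.sin(days) + 0.05 * rng.standard_normal(180),
                    0.0, 1.0)
      print(f"SWDI = {swdi(rew):.2f}")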

  11. A computer-based simulator of the atmospheric turbulence

    Science.gov (United States)

    Konyaev, Petr A.

    2015-11-01

    Computer software for modeling the atmospheric turbulence is developed on the basis of a time-varying random medium simulation algorithm and a split-step Fourier transform method for solving a wave propagation equation. A judicious choice of the simulator parameters, like the velocity of the evolution and motion of the medium, turbulence spectrum and scales, enables different effects of a random medium on the optical wavefront to be simulated. The implementation of the simulation software is shown to be simple and efficient due to parallel programming functions from the Intel® Parallel Studio MKL libraries.
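
    The split-step Fourier approach mentioned here alternates thin phase screens (the random medium) with free-space diffraction applied in Fourier space; a minimal paraxial sketch, with all numerical values assumed, follows.

      import numpy as np

      def split_step(field, screen, wavelength, dx, dz):
          """Apply a thin random phase screen, then propagate a distance dz
          with the paraxial (Fresnel) vacuum kernel in Fourier space."""
          n = field.shape[0]
          k0 = 2 * np.pi / wavelength
          fx = np.fft.fftfreq(n, d=dx)
          kx, ky = np.meshgrid(2 * np.pi * fx, 2 * np.pi * fx)
          kernel = np.exp(-1j * (kx**2 + ky**2) * dz / (2 * k0))
          return np.fft.ifft2(np.fft.fft2(field * np.exp(1j * screen)) * kernel)

      n = 128
      field = np.ones((n, n), dtype=complex)            # incident plane wave
      screen = 0.1 * np.random.randn(n, n)              # weak random screen
      field = split_step(field, screen, wavelength=0.5e-6, dx=1e-3, dz=100.0)
      print(f"mean intensity: {np.abs(field).mean():.3f}")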

  12. Computational Fluid Dynamics Modeling of a wood-burning stove-heated sauna using NIST's Fire Dynamics Simulator

    CERN Document Server

    Macqueron, Corentin

    2014-01-01

    The traditional sauna is studied from a thermal and fluid dynamics standpoint using the NIST's Fire Dynamics Simulator (FDS) software. Calculations are performed in order to determine temperature and velocity fields, heat flux, soot and steam cloud transport, etc. Results are discussed in order to assess the reliability of this new kind of utilization of the FDS fire safety engineering software.

  13. Computation simulation of the nonlinear response of suspension bridges

    Energy Technology Data Exchange (ETDEWEB)

    McCallen, D.B.; Astaneh-Asl, A.

    1997-10-01

    Accurate computational simulation of the dynamic response of long-span bridges presents one of the greatest challenges facing the earthquake engineering community. The size of these structures, in terms of physical dimensions and number of main load-bearing members, makes computational simulation of transient response an arduous task. Discretization of a large bridge with general-purpose finite element software often results in a computational model of such size that excessive computational effort is required for three-dimensional nonlinear analyses. The aim of the current study was the development of efficient, computationally based methodologies for the nonlinear analysis of cable-supported bridge systems which would allow accurate characterization of a bridge with a relatively small number of degrees of freedom. This work has led to the development of a special-purpose software program for the nonlinear analysis of cable-supported bridges, and the methodologies and software are described and illustrated in this paper.

  14. TOWARD END-TO-END MODELING FOR NUCLEAR EXPLOSION MONITORING: SIMULATION OF UNDERGROUND NUCLEAR EXPLOSIONS AND EARTHQUAKES USING HYDRODYNAMIC AND ANELASTIC SIMULATIONS, HIGH-PERFORMANCE COMPUTING AND THREE-DIMENSIONAL EARTH MODELS

    Energy Technology Data Exchange (ETDEWEB)

    Rodgers, A; Vorobiev, O; Petersson, A; Sjogreen, B

    2009-07-06

    This paper describes new research being performed to improve understanding of seismic waves generated by underground nuclear explosions (UNEs) by using full waveform simulation, high-performance computing and three-dimensional (3D) earth models. The goal of this effort is to develop an end-to-end modeling capability to cover the range of wave propagation required for nuclear explosion monitoring (NEM) from the buried nuclear device to the seismic sensor, and thereby to improve understanding of the physical basis and prediction capabilities of seismic observables for NEM, including source and path-propagation effects. We are pursuing research along three main thrusts. Firstly, we are modeling the non-linear hydrodynamic response of geologic materials to underground explosions in order to better understand how source emplacement conditions impact the seismic waves that emerge from the source region and are ultimately observed hundreds or thousands of kilometers away. Empirical evidence shows that the amplitudes and frequency content of seismic waves at all distances are strongly impacted by the physical properties of the source region (e.g. density, strength, porosity). To model the near-source shock-wave motions of a UNE, we use GEODYN, an Eulerian Godunov (finite volume) code incorporating thermodynamically consistent non-linear constitutive relations, including cavity formation, yielding, porous compaction, tensile failure, bulking and damage. In order to propagate motions to seismic distances we are developing a one-way coupling method to pass motions to WPP (a Cartesian anelastic finite difference code). Preliminary investigations of UNEs in canonical materials (granite, tuff and alluvium) confirm that emplacement conditions have a strong effect on seismic amplitudes and the generation of shear waves. Specifically, we find that motions from an explosion in high-strength, low-porosity granite have high compressional wave amplitudes and weak

  15. An introduction to computer simulation methods applications to physical systems

    CERN Document Server

    Gould, Harvey; Christian, Wolfgang

    2007-01-01

    Now in its third edition, this book teaches physical concepts using computer simulations. The text incorporates object-oriented programming techniques and encourages readers to develop good programming habits in the context of doing physics. Designed for readers at all levels, An Introduction to Computer Simulation Methods uses Java, currently the most popular programming language. Introduction, Tools for Doing Simulations, Simulating Particle Motion, Oscillatory Systems, Few-Body Problems: The Motion of the Planets, The Chaotic Motion of Dynamical Systems, Random Processes, The Dynamics of Many Particle Systems, Normal Modes and Waves, Electrodynamics, Numerical and Monte Carlo Methods, Percolation, Fractals and Kinetic Growth Models, Complex Systems, Monte Carlo Simulations of Thermal Systems, Quantum Systems, Visualization and Rigid Body Dynamics, Seeing in Special and General Relativity, Epilogue: The Unity of Physics. For all readers interested in developing programming habits in the context of doing phy...

  16. Fluid dynamics theory, computation, and numerical simulation

    CERN Document Server

    Pozrikidis, C

    2001-01-01

    Fluid Dynamics Theory, Computation, and Numerical Simulation is the only available book that extends the classical field of fluid dynamics into the realm of scientific computing in a way that is both comprehensive and accessible to the beginner. The theory of fluid dynamics, and the implementation of solution procedures into numerical algorithms, are discussed hand-in-hand and with reference to computer programming. This book is an accessible introduction to theoretical and computational fluid dynamics (CFD), written from a modern perspective that unifies theory and numerical practice. There are several additions and subject expansions in the Second Edition of Fluid Dynamics, including new Matlab and FORTRAN codes. Two distinguishing features of the discourse are: solution procedures and algorithms are developed immediately after problem formulations are presented, and numerical methods are introduced on a need-to-know basis and in increasing order of difficulty. Matlab codes are presented and discussed for a broad...

  17. Fluid Dynamics Theory, Computation, and Numerical Simulation

    CERN Document Server

    Pozrikidis, Constantine

    2009-01-01

    Fluid Dynamics: Theory, Computation, and Numerical Simulation is the only available book that extends the classical field of fluid dynamics into the realm of scientific computing in a way that is both comprehensive and accessible to the beginner. The theory of fluid dynamics, and the implementation of solution procedures into numerical algorithms, are discussed hand-in-hand and with reference to computer programming. This book is an accessible introduction to theoretical and computational fluid dynamics (CFD), written from a modern perspective that unifies theory and numerical practice. There are several additions and subject expansions in the Second Edition of Fluid Dynamics, including new Matlab and FORTRAN codes. Two distinguishing features of the discourse are: solution procedures and algorithms are developed immediately after problem formulations are presented, and numerical methods are introduced on a need-to-know basis and in increasing order of difficulty. Matlab codes are presented and discussed for ...

  18. Genetic crossing vs cloning by computer simulation

    Energy Technology Data Exchange (ETDEWEB)

    Dasgupta, S. [Cologne Univ., Koeln (Germany)

    1997-06-01

    We perform Monte Carlo simulation using Penna's bit string model, and compare the process of asexual reproduction by cloning with that by genetic crossover. We find them to be comparable as regards survival of a species, and also if a natural disaster is simulated.
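
    For readers unfamiliar with the model, a minimal sketch of the Penna bit-string model in its asexual (cloning) variant is given below; the mutation threshold, reproduction age, mutation rate and Verhulst capacity are illustrative assumptions, not the paper's parameters.

      import random

      BITS, T, BIRTH_AGE, M, NMAX = 32, 3, 8, 1, 10_000

      def mutate(genome):
          for _ in range(M):                  # M new random mutations per birth
              genome |= 1 << random.randrange(BITS)
          return genome

      population = [(0, 0)] * 500             # (genome, age) pairs

      for step in range(200):
          survivors = []
          for genome, age in population:
              if random.random() < len(population) / NMAX:
                  continue                    # death by crowding (Verhulst factor)
              age += 1
              active = bin(genome & ((1 << age) - 1)).count("1")
              if age >= BITS or active >= T:
                  continue                    # death by age limit or mutation load
              survivors.append((genome, age))
              if age >= BIRTH_AGE:            # asexual reproduction by cloning
                  survivors.append((mutate(genome), 0))
          population = survivors

      print(f"{len(population)} individuals after 200 steps")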

  20. Technology computer aided design simulation for VLSI MOSFET

    CERN Document Server

    Sarkar, Chandan Kumar

    2013-01-01

    Responding to recent developments and a growing VLSI circuit manufacturing market, Technology Computer Aided Design: Simulation for VLSI MOSFET examines advanced MOSFET processes and devices through TCAD numerical simulations. The book provides a balanced summary of TCAD and MOSFET basic concepts, equations, physics, and new technologies related to TCAD and MOSFET. A firm grasp of these concepts allows for the design of better models, thus streamlining the design process, saving time and money. This book places emphasis on the importance of modeling and simulations of VLSI MOS transistors and

  1. Modeling and simulation of discrete event systems

    CERN Document Server

    Choi, Byoung Kyu

    2013-01-01

    Computer modeling and simulation (M&S) allows engineers to study and analyze complex systems. Discrete-event system (DES)-M&S is used in modern management, industrial engineering, computer science, and the military. As computer speeds and memory capacity increase, so DES-M&S tools become more powerful and more widely used in solving real-life problems. Based on over 20 years of evolution within a classroom environment, as well as on decades-long experience in developing simulation-based solutions for high-tech industries, Modeling and Simulation of Discrete-Event Systems is the only book on

  2. Computer simulation of FCC riser reactors.

    Energy Technology Data Exchange (ETDEWEB)

    Chang, S. L.; Golchert, B.; Lottes, S. A.; Petrick, M.; Zhou, C. Q.

    1999-04-20

    A three-dimensional computational fluid dynamics (CFD) code, ICRKFLO, was developed to simulate the multiphase reacting flow system in a fluid catalytic cracking (FCC) riser reactor. The code solves flow properties based on fundamental conservation laws of mass, momentum, and energy for gas, liquid, and solid phases. Useful phenomenological models were developed to represent the controlling FCC processes, including droplet dispersion and evaporation, particle-solid interactions, and interfacial heat transfer between gas, droplets, and particles. Techniques were also developed to facilitate numerical calculations. These techniques include a hybrid flow-kinetic treatment to include detailed kinetic calculations, a time-integral approach to overcome numerical stiffness problems of chemical reactions, and a sectional coupling and blocked-cell technique for handling complex geometry. The copyrighted ICRKFLO software has been validated with experimental data from pilot- and commercial-scale FCC units. The code can be used to evaluate the impacts of design and operating conditions on the production of gasoline and other oil products.

  3. Computer simulation of combustion of mine fires

    Institute of Scientific and Technical Information of China (English)

    余明高; 张和平; 范维澄; 王清安

    2002-01-01

    According to control theory, a mine fire can be considered an unsteady process that begins once the normal ventilation system is disturbed. Applying the principles of physical chemistry and thermal fluid mechanics, parameter models of the unsteady system are given for the fuel combustion rate, heat of combustion, concentration, temperature, heat losses, heat resistance, work of expansion and heat pressure difference. The results of the calculation agree approximately with the test results. The computer simulation shows that the main factor producing the throttling effect is the fire rate, followed by the heat resistance and the heat pressure difference. The rate of heat flow passing through the airway wall is at its maximum on the surface and decreases with time; during two hours of combustion, heat transfer progresses only within 0.5 m of the airway wall. The mass flux rate and the percentage concentration of the gas vary along the downstream airway; when the delay time is very small, this variation can be neglected. Viscous resistance is the main part of the heat resistance, followed by the expansion resistance, which is less than tens of pascals when the Mach number is very small. The work of expansion is principally turned into heat losses; only a very small part is consumed by the work against the heat resistance and by inertial acceleration.

  4. Parallelized computation for computer simulation of electrocardiograms using personal computers with multi-core CPU and general-purpose GPU.

    Science.gov (United States)

    Shen, Wenfeng; Wei, Daming; Xu, Weimin; Zhu, Xin; Yuan, Shizhong

    2010-10-01

    Biological computations like electrocardiological modelling and simulation usually require high-performance computing environments. This paper introduces an implementation of parallel computation for computer simulation of electrocardiograms (ECGs) in a personal computer environment with an Intel Core(TM) 2 Quad Q6600 CPU and a GeForce 8800GT GPU, with software support by OpenMP and CUDA. It was tested in three parallelization device setups: (a) a four-core CPU without a general-purpose GPU, (b) a general-purpose GPU plus 1 core of CPU, and (c) a four-core CPU plus a general-purpose GPU. To effectively take advantage of a multi-core CPU and a general-purpose GPU, an algorithm based on load-prediction dynamic scheduling was developed and applied to setting (c). In the simulation with 1600 time steps, the speedup of the parallel computation as compared to the serial computation was 3.9 in setting (a), 16.8 in setting (b), and 20.0 in setting (c). This study demonstrates that a current PC with a multi-core CPU and a general-purpose GPU provides a good environment for parallel computations in biological modelling and simulation studies.
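
    The load-prediction dynamic scheduling idea (splitting each batch of work between CPU and GPU in proportion to their predicted throughput) can be illustrated with a toy simulation; the device speeds and smoothing constants below are invented, and this is a sketch of the general idea, not the paper's actual algorithm.

      SPEEDS = {"cpu": 4.0, "gpu": 17.0}     # simulated work units per second
      throughput = {"cpu": 1.0, "gpu": 1.0}  # scheduler's running predictions
      done = {"cpu": 0.0, "gpu": 0.0}
      remaining, chunk = 1600.0, 100.0       # total time steps, batch size

      while remaining > 0:
          batch = min(remaining, chunk)
          total = sum(throughput.values())
          for dev in throughput:
              work = batch * throughput[dev] / total     # proportional split
              elapsed = work / SPEEDS[dev]               # simulated run time
              observed = work / elapsed                  # measured throughput
              throughput[dev] = 0.7 * throughput[dev] + 0.3 * observed
              done[dev] += work
          remaining -= batch

      print({dev: round(units) for dev, units in done.items()})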

  5. Computational Challenges in Nuclear Weapons Simulation

    Energy Technology Data Exchange (ETDEWEB)

    McMillain, C F; Adams, T F; McCoy, M G; Christensen, R B; Pudliner, B S; Zika, M R; Brantley, P S; Vetter, J S; May, J M

    2003-08-29

    After a decade of experience, the Stockpile Stewardship Program continues to ensure the safety, security and reliability of the nation's nuclear weapons. The Advanced Simulation and Computing (ASCI) program was established to provide leading edge, high-end simulation capabilities needed to meet the program's assessment and certification requirements. The great challenge of this program lies in developing the tools and resources necessary for the complex, highly coupled, multi-physics calculations required to simulate nuclear weapons. This paper describes the hardware and software environment we have applied to fulfill our nuclear weapons responsibilities. It also presents the characteristics of our algorithms and codes, especially as they relate to supercomputing resource capabilities and requirements. It then addresses impediments to the development and application of nuclear weapon simulation software and hardware and concludes with a summary of observations and recommendations on an approach for working with industry and government agencies to address these impediments.

  6. Testing alternative conceptual models of seawater intrusion in a coastal aquifer using computer simulation, southern California, USA

    Science.gov (United States)

    Nishikawa, T.

    1997-01-01

    Two alternative conceptual models of the physical processes controlling seawater intrusion in a coastal basin in California, USA, were tested to identify a likely principal pathway for seawater intrusion. The conceptual models were tested by using a two-dimensional, finite-element groundwater flow and transport model. This pathway was identified by the conceptual model that best replicated the historical data. The numerical model was applied in cross section to a submarine canyon that is a main avenue for seawater to enter the aquifer system underlying the study area. Both models are characterized by a heterogeneous, layered, water-bearing aquifer. However, the first model is characterized by flat-lying aquifer layers and by a high value of hydraulic conductivity in the basal aquifer layer, which is thought to be a principal conduit for seawater intrusion. The second model is characterized by offshore folding, which was modeled as a very nearshore outcrop, thereby providing a shorter path for seawater to intrude. General conclusions are that: 1) the aquifer system is best modeled as a flat, heterogeneous, layered system; 2) relatively thin basal layers with relatively high values of hydraulic conductivity are the principal pathways for seawater intrusion; and 3) continuous clay layers of low hydraulic conductivity play an important role in controlling the movement of seawater.

  7. Plasticity modeling & computation

    CERN Document Server

    Borja, Ronaldo I

    2013-01-01

    There have been many excellent books written on the subject of plastic deformation in solids, but rarely can one find a textbook on this subject. “Plasticity Modeling & Computation” is a textbook written specifically for students who want to learn the theoretical, mathematical, and computational aspects of inelastic deformation in solids. It adopts a simple narrative style that is not mathematically overbearing, and has been written to emulate a professor giving a lecture on this subject inside a classroom. Each section is written to provide a balance between the relevant equations and the explanations behind them. Where relevant, sections end with one or more exercises designed to reinforce the understanding of the “lecture.” Color figures enhance the presentation and make the book very pleasant to read. For professors planning to use this textbook for their classes, the contents are sufficient for Parts A and B that can be taught in sequence over a period of two semesters or quarters.

  8. Computational fluid dynamics for sport simulation

    CERN Document Server

    2009-01-01

    All over the world sport plays a prominent role in society: as a leisure activity for many, as an ingredient of culture, as a business and as a matter of national prestige in such major events as the World Cup in soccer or the Olympic Games. Hence, it is not surprising that science has entered the realm of sports, and, in particular, that computer simulation has become highly relevant in recent years. This is explored in this book by choosing five different sports as examples, demonstrating that computational science and engineering (CSE) can make essential contributions to research on sports topics on both the fundamental level and, eventually, by supporting athletes’ performance.

  9. And So It Grows: Using a Computer-Based Simulation of a Population Growth Model to Integrate Biology & Mathematics

    Science.gov (United States)

    Street, Garrett M.; Laubach, Timothy A.

    2013-01-01

    We provide a 5E structured-inquiry lesson so that students can learn more of the mathematics behind the logistic model of population biology. By using models and mathematics, students understand how population dynamics can be influenced by relatively simple changes in the environment.
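
    The logistic model at the heart of such lessons is dN/dt = rN(1 - N/K); a few lines of Euler integration (with arbitrary classroom values for r and K) are enough for students to watch the population approach the carrying capacity.

      r, K = 0.5, 1000.0       # assumed growth rate and carrying capacity
      N, dt = 10.0, 0.1        # initial population and time step

      for step in range(601):
          if step % 100 == 0:
              print(f"t = {step * dt:5.1f}   N = {N:7.1f}")
          N += r * N * (1.0 - N / K) * dt    # Euler step of dN/dt = rN(1 - N/K)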

  10. Computer Simulation of Convective Plasma Cells

    CERN Document Server

    Carboni, Rodrigo

    2015-01-01

    Computer simulations of plasmas are relevant nowadays because they help us understand physical processes taking place in the Sun and other stellar objects. We developed a program called PCell, intended to display the evolution of the magnetic field in a 2D convective plasma cell with perfectly conducting walls for different stationary plasma velocity fields. Applications of this program are presented. The software works interactively with the mouse, and users can create their own movies in MPEG format. The programs were written in Fortran and C, and come in two versions that use GNUPLOT and OpenGL, respectively, to display the simulation.

  11. Applied modelling and computing in social science

    CERN Document Server

    Povh, Janez

    2015-01-01

    In social science, outstanding results are yielded by advanced simulation methods based on state-of-the-art software technologies and an appropriate combination of qualitative and quantitative methods. This book presents examples of successful applications of modelling and computing in social science: business and logistic process simulation and optimization, deeper knowledge extraction from big data, better understanding and prediction of social behaviour, and modelling of health and environmental changes.

  12. Instructional Advice, Time Advice and Learning Questions in Computer Simulations

    Science.gov (United States)

    Rey, Gunter Daniel

    2010-01-01

    Undergraduate students (N = 97) used an introductory text and a computer simulation to learn fundamental concepts about statistical analyses (e.g., analysis of variance, regression analysis and General Linear Model). Each learner was randomly assigned to one cell of a 2 (with or without instructional advice) x 2 (with or without time advice) x 2…

  13. Time Advice and Learning Questions in Computer Simulations

    Science.gov (United States)

    Rey, Gunter Daniel

    2011-01-01

    Students (N = 101) used an introductory text and a computer simulation to learn fundamental concepts about statistical analyses (e.g., analysis of variance, regression analysis and General Linear Model). Each learner was randomly assigned to one cell of a 2 (with or without time advice) x 3 (with learning questions and corrective feedback, with…

  14. Solving wood chip transport problems with computer simulation.

    Science.gov (United States)

    Dennis P. Bradley; Sharon A. Winsauer

    1976-01-01

    Efficient chip transport operations are difficult to achieve due to frequent and often unpredictable changes in distance to market, chipping rate, time spent at the mill, and equipment costs. This paper describes a computer simulation model that allows a logger to design an efficient transport system in response to these changing factors.
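
    A minimal discrete-event sketch of such a transport model is shown below: trucks cycle between chipper and mill, and the changing factors named in the abstract (chipping rate, haul distance, mill delay) map onto the assumed time distributions, which are invented for illustration.

      import heapq
      import random

      random.seed(1)
      N_TRUCKS, SHIFT = 3, 480.0              # trucks and shift length (minutes)
      events = [(0.0, i, "at_chipper") for i in range(N_TRUCKS)]
      heapq.heapify(events)
      loads = 0

      while events:
          t, truck, kind = heapq.heappop(events)
          if t > SHIFT:
              continue                          # past end of shift: drop event
          if kind == "at_chipper":
              load = random.uniform(20, 35)     # chipping rate
              haul = random.uniform(40, 60)     # distance to market
              heapq.heappush(events, (t + load + haul, truck, "at_mill"))
          else:                                 # truck arrives at the mill
              loads += 1
              unload = random.uniform(10, 30)   # time spent at the mill
              back = random.uniform(40, 60)
              heapq.heappush(events, (t + unload + back, truck, "at_chipper"))

      print(f"{loads} loads delivered in an 8-hour shift")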

  15. Sensitivity analysis of airport noise using computer simulation

    Directory of Open Access Journals (Sweden)

    Flavio Maldonado Bentes

    2011-09-01

    Full Text Available This paper presents the method to analyze the sensitivity of airport noise using computer simulation with the aid of Integrated Noise Model 7.0. The technique serves to support the selection of alternatives to better control aircraft noise, since it helps identify which areas of the noise curves experienced greater variation from changes in aircraft movements at a particular airport.

  16. Computer simulation of cytoskeleton-induced blebbing in lipid membranes

    DEFF Research Database (Denmark)

    Spangler, E. J.; Harvey, C. W.; Revalee, J. D.

    2011-01-01

    Blebs are balloon-shaped membrane protrusions that form during many physiological processes. Using computer simulation of a particle-based model for self-assembled lipid bilayers coupled to an elastic meshwork, we investigated the phase behavior and kinetics of blebbing. We found that blebs form...

  17. A computer code to simulate X-ray imaging techniques

    Energy Technology Data Exchange (ETDEWEB)

    Duvauchelle, Philippe E-mail: philippe.duvauchelle@insa-lyon.fr; Freud, Nicolas; Kaftandjian, Valerie; Babot, Daniel

    2000-09-01

    A computer code was developed to simulate the operation of radiographic, radioscopic or tomographic devices. The simulation is based on ray-tracing techniques and on the X-ray attenuation law. The use of computer-aided drawing (CAD) models enables simulations to be carried out with complex three-dimensional (3D) objects and the geometry of every component of the imaging chain, from the source to the detector, can be defined. Geometric unsharpness, for example, can be easily taken into account, even in complex configurations. Automatic translations or rotations of the object can be performed to simulate radioscopic or tomographic image acquisition. Simulations can be carried out with monochromatic or polychromatic beam spectra. This feature enables, for example, the beam hardening phenomenon to be dealt with or dual energy imaging techniques to be studied. The simulation principle is completely deterministic and consequently the computed images present no photon noise. Nevertheless, the variance of the signal associated with each pixel of the detector can be determined, which enables contrast-to-noise ratio (CNR) maps to be computed, in order to predict quantitatively the detectability of defects in the inspected object. The CNR is a relevant indicator for optimizing the experimental parameters. This paper provides several examples of simulated images that illustrate some of the rich possibilities offered by our software. Depending on the simulation type, the computation time order of magnitude can vary from 0.1 s (simple radiographic projection) up to several hours (3D tomography) on a PC, with a 400 MHz microprocessor. Our simulation tool proves to be useful in developing new specific applications, in choosing the most suitable components when designing a new testing chain, and in saving time by reducing the number of experimental tests.
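
    The attenuation law at the core of such ray-tracing simulators is Beer-Lambert: I = I0 * exp(-sum(mu_i * x_i)) over the material thicknesses crossed by each ray. The coefficients below are rough single-energy illustrative values, not data from the paper.

      import math

      def transmitted_intensity(i0, path):
          """path: list of (mu [1/cm], thickness [cm]) segments along the ray."""
          return i0 * math.exp(-sum(mu * x for mu, x in path))

      ray = [(0.2, 4.0), (4.5, 0.5), (0.2, 3.0)]   # e.g. polymer/steel/polymer
      print(f"I/I0 = {transmitted_intensity(1.0, ray):.4f}")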

  18. Revolutions in energy through modeling and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Tatro, M.; Woodard, J.

    1998-08-01

    The development and application of energy technologies for all aspects from generation to storage have improved dramatically with the advent of advanced computational tools, particularly modeling and simulation. Modeling and simulation are not new to energy technology development, and have been used extensively ever since the first commercial computers were available. However, recent advances in computing power and access have broadened the extent and use, and, through increased fidelity (i.e., accuracy) of the models due to greatly enhanced computing power, the increased reliance on modeling and simulation has shifted the balance point between modeling and experimentation. The complex nature of energy technologies has motivated researchers to use these tools to understand better performance, reliability and cost issues related to energy. The tools originated in sciences such as the strength of materials (nuclear reactor containment vessels); physics, heat transfer and fluid flow (oil production); chemistry, physics, and electronics (photovoltaics); and geosciences and fluid flow (oil exploration and reservoir storage). Other tools include mathematics, such as statistics, for assessing project risks. This paper describes a few advancements made possible by these tools and explores the benefits and costs of their use, particularly as they relate to the acceleration of energy technology development. The computational complexity ranges from basic spreadsheets to complex numerical simulations using hardware ranging from personal computers (PCs) to Cray computers. In all cases, the benefits of using modeling and simulation relate to lower risks, accelerated technology development, or lower cost projects.

  19. Structured building model reduction toward parallel simulation

    Energy Technology Data Exchange (ETDEWEB)

    Dobbs, Justin R. [Cornell University; Hencey, Brondon M. [Cornell University

    2013-08-26

    Building energy model reduction exchanges accuracy for improved simulation speed by reducing the number of dynamical equations. Parallel computing aims to improve simulation times without loss of accuracy but is poorly utilized by contemporary simulators and is inherently limited by inter-processor communication. This paper bridges these disparate techniques to implement efficient parallel building thermal simulation. We begin with a survey of three structured reduction approaches that compares their performance to a leading unstructured method. We then use structured model reduction to find thermal clusters in the building energy model and allocate processing resources. Experimental results demonstrate faster simulation and low error without any interprocessor communication.

  20. Computer Simulations of Lipid Bilayers and Proteins

    DEFF Research Database (Denmark)

    Sonne, Jacob

    2006-01-01

    The importance of computer simulations in lipid bilayer research has become more prominent over the last couple of decades, and as computers get even faster, simulations will play an increasingly important part in understanding the processes that take place in and across cell membranes. This thesis [...] Pressure profile calculations in lipid bilayers: a lipid bilayer is merely ~5 nm thick, but the lateral pressure (parallel to the bilayer plane) varies by several hundred bar across this short distance (normal to the bilayer). These variations in the lateral pressure are commonly referred to as the pressure [profile ...] of neglecting pressure contributions from long-range electrostatic interactions. The first issue is addressed by comparing two methods for calculating pressure profiles; judged by the similar results obtained by these two methods, the pressure profile appears to be well defined for fluid-phase lipid bilayers [...]

  1. Time reversibility, computer simulation, algorithms, chaos

    CERN Document Server

    Hoover, William Graham

    2012-01-01

    A small army of physicists, chemists, mathematicians, and engineers has joined forces to attack a classic problem, the "reversibility paradox", with modern tools. This book describes their work from the perspective of computer simulation, emphasizing the author's approach to the problem of understanding the compatibility, and even inevitability, of the irreversible second law of thermodynamics with an underlying time-reversible mechanics. Computer simulation has made it possible to probe reversibility from a variety of directions and "chaos theory" or "nonlinear dynamics" has supplied a useful vocabulary and a set of concepts, which allow a fuller explanation of irreversibility than that available to Boltzmann or to Green, Kubo and Onsager. Clear illustration of concepts is emphasized throughout, and reinforced with a glossary of technical terms from the specialized fields which have been combined here to focus on a common theme. The book begins with a discussion, contrasting the idealized reversibility of ba...
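
    The time reversibility at the heart of the paradox is easy to probe numerically: a symplectic integrator such as velocity Verlet retraces its trajectory (up to round-off) when the velocities are reversed. A minimal sketch for a harmonic oscillator, with illustrative parameters:

    ```python
    # Reversibility test: integrate forward, flip velocity, integrate back;
    # velocity Verlet is time-reversible up to floating-point round-off.
    def verlet(x, v, dt, steps, force=lambda x: -x):
        for _ in range(steps):
            a = force(x)
            x = x + v * dt + 0.5 * a * dt**2
            v = v + 0.5 * (a + force(x)) * dt
        return x, v

    x0, v0 = 1.0, 0.0
    x, v = verlet(x0, v0, dt=0.01, steps=1000)    # forward in time
    x, v = verlet(x, -v, dt=0.01, steps=1000)     # reversed velocities
    print("return error:", abs(x - x0), abs(-v - v0))
    ```

    In a chaotic many-body system the same test fails in practice, since round-off errors grow exponentially; that sensitivity is one computational handle on reconciling reversible mechanics with irreversible behavior.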

  2. Computer simulation of molecular sorption in zeolites

    CERN Document Server

    Calmiano, M D

    2001-01-01

    The work presented in this thesis encompasses the computer simulation of molecular sorption. In Chapter 1 we outline the aims and objectives of this work. Chapter 2 follows, in which an introduction to sorption in zeolites is presented, with discussion of the structure and properties of the main zeolites studied; Chapter 2 concludes with a description of the principles and theories of adsorption. In Chapter 3 we describe the methodology behind the work carried out in this thesis. In Chapter 4 we present our first computational study, that of the sorption of krypton in silicalite. We describe work carried out to investigate low-energy sorption sites of krypton in silicalite, where we observe krypton to sorb preferentially into the straight and sinusoidal channels over the channel intersections. We simulate single-step type I adsorption isotherms and use molecular dynamics to study the diffusion of krypton, obtaining diffusion coefficients and the activation energy. We compare our results to previous experimental and computat...
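
    The diffusion analysis mentioned at the end of the abstract is conventionally done via the Einstein relation, MSD(t) ≈ 6Dt in three dimensions. A minimal sketch, with a synthetic random walk standing in for the krypton trajectory (step size and units illustrative):

    ```python
    # Diffusion coefficient from mean-squared displacement (Einstein relation).
    import numpy as np

    rng = np.random.default_rng(2)
    steps, dt = 10000, 1.0
    traj = np.cumsum(rng.normal(scale=0.1, size=(steps, 3)), axis=0)

    lags = np.arange(1, 200)
    msd = np.array([np.mean(np.sum((traj[l:] - traj[:-l])**2, axis=1))
                    for l in lags])
    D = np.polyfit(lags * dt, msd, 1)[0] / 6.0      # slope of MSD(t) = 6 D
    print("estimated D: %.4f (expected 0.005 for this walk)" % D)
    ```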

  3. Influence of frustrations on the thermodynamic properties of the low-dimensional Potts model studied by computer simulation

    Science.gov (United States)

    Babaev, A. B.; Murtazaev, A. K.; Suleimanov, E. M.; Rizvanova, T. R.

    2016-10-01

    The influence of disorder in the form of frustration on the thermodynamic behavior of a two-dimensional three-vertex Potts model has been studied by the Monte Carlo method, taking into account nearest and next-nearest neighbors. Systems with linear sizes L × L = N (L = 9-48) on a triangular lattice have been considered. It has been shown that, in the case of J1 > 0 and J2 ... the model undergoes a phase transition outside this region.
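
    The kind of simulation described can be sketched in a few lines: Metropolis Monte Carlo for a three-state Potts model with competing nearest (J1) and next-nearest (J2) couplings. The square-array representation of the triangular lattice, the coupling values, and the temperature below are illustrative choices, not the paper's setup.

    ```python
    # Sketch: Metropolis updates for a 3-state Potts model with competing
    # nearest (J1) and next-nearest (J2) couplings; parameters illustrative.
    import numpy as np

    rng = np.random.default_rng(3)
    L, J1, J2, T = 12, 1.0, -0.5, 1.0
    NN  = [(1, 0), (-1, 0), (0, 1), (0, -1), (1, 1), (-1, -1)]    # nearest
    NNN = [(2, 1), (-2, -1), (1, 2), (-1, -2), (1, -1), (-1, 1)]  # next-nearest
    spins = rng.integers(0, 3, size=(L, L))

    def local_energy(s, i, j, q):
        e = 0.0
        for J, offs in ((J1, NN), (J2, NNN)):
            for di, dj in offs:
                e -= J * (q == s[(i + di) % L, (j + dj) % L])
        return e

    for _ in range(20000):
        i, j = rng.integers(0, L, size=2)
        q_new = rng.integers(0, 3)
        dE = local_energy(spins, i, j, q_new) - local_energy(spins, i, j, spins[i, j])
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            spins[i, j] = q_new

    print("fraction of state 0:", np.mean(spins == 0))
    ```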

  4. Computational modeling of failure in composite laminates

    NARCIS (Netherlands)

    Van der Meer, F.P.

    2010-01-01

    There is no state-of-the-art computational model good enough for predictive simulation of the complete failure process in laminates; controversy exists already at the single-ply level. Much work has been done in recent years on the development of continuum models, but these fail to predict t...

  5. Generating computational models for serious gaming

    NARCIS (Netherlands)

    Westera, Wim

    2014-01-01

    Many serious games include computational models that simulate dynamic systems. These models promote enhanced interaction and responsiveness. Under the social web paradigm, more and more usable game-authoring tools become available that enable prosumers to create their own games, but the inclusion of...

  6. Modeling and Simulation of Nanoindentation

    Science.gov (United States)

    Huang, Sixie; Zhou, Caizhi

    2017-08-01

    Nanoindentation is a hardness test method applied to small volumes of material; it can reveal unique size effects and has sparked many related research activities. To fully understand the phenomena observed during nanoindentation tests, modeling and simulation methods have been developed to predict the mechanical response of materials during nanoindentation. However, challenges remain with those computational approaches in terms of length scale, predictive capability, and accuracy. This article reviews recent progress and challenges in the modeling and simulation of nanoindentation, including an overview of molecular dynamics, the quasicontinuum method, discrete dislocation dynamics, and the crystal plasticity finite element method. It also discusses how to integrate multiscale modeling approaches seamlessly with experimental studies to understand length-scale effects and microstructure evolution during nanoindentation tests, creating a unique opportunity to establish new calibration procedures for the nanoindentation technique.
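
    On the experimental side these models are compared against, the standard reduction of an indentation curve is the Oliver-Pharr analysis: hardness H = P_max / A(h_c), with the contact depth h_c inferred from the unloading stiffness. A minimal sketch, with synthetic unloading-fit parameters that are purely illustrative:

    ```python
    # Minimal Oliver-Pharr sketch: hardness from an unloading-curve fit
    # P = alpha * (h - h_f)^m. All numbers are illustrative placeholders.
    alpha, h_f, m = 1e-4, 80.0, 1.5   # fit parameters (load in mN, depth in nm)
    h_max = 200.0                      # maximum indentation depth, nm

    P_max = alpha * (h_max - h_f) ** m                 # peak load, mN
    S = alpha * m * (h_max - h_f) ** (m - 1)           # stiffness dP/dh at h_max
    h_c = h_max - 0.75 * P_max / S                     # contact depth (eps = 0.75)
    A = 24.5 * h_c ** 2                                # ideal Berkovich area, nm^2
    H = P_max / A * 1e6                                # hardness in GPa (1 mN/nm^2 = 1e6 GPa)
    print("h_c = %.0f nm, H = %.2f GPa" % (h_c, H))
    ```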

  7. Models of optical quantum computing

    Directory of Open Access Journals (Sweden)

    Krovi Hari

    2017-03-01

    I review some work on models of quantum computing, optical implementations of these models, as well as the associated computational power. In particular, we discuss the circuit model and cluster state implementations using quantum optics with various encodings such as dual rail encoding, Gottesman-Kitaev-Preskill encoding, and coherent state encoding. Then we discuss intermediate models of optical computing such as boson sampling and its variants. Finally, we review some recent work in optical implementations of adiabatic quantum computing and analog optical computing. We also provide a brief description of the relevant aspects from complexity theory needed to understand the results surveyed.

  8. Computer Simulation of Convective Plasma Cells

    OpenAIRE

    Carboni, Rodrigo; Frutos-Alfaro, Francisco

    2015-01-01

    Computer simulations of plasmas are relevant nowadays because they help us understand physical processes taking place in the Sun and other stellar objects. We developed a program called PCell, intended for displaying the evolution of the magnetic field in a 2D convective plasma cell with perfectly conducting walls for different stationary plasma velocity fields. Applications of this program are presented. The software works interactively with the mouse, and users can create their ow...

  9. Computer simulation of the micropulse imaging lidar

    Science.gov (United States)

    Dai, Yongjiang; Zhao, Hongwei; Zhao, Yu; Wang, Xiaoou

    2000-10-01

    In this paper a design method for the Micro Pulse Lidar (MPL) is introduced, namely a computer simulation of the MPL. Some MPL parameters related to atmospheric scattering, and their effects on the performance of the lidar, are discussed. The design software for a lidar with a diode-pumped solid-state laser is written in MATLAB. The software consists of six modules: transmitter, atmosphere, target, receiver, processor, and display system. The method can be extended to other kinds of lidar.
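
    The heart of such a simulator (transmitter, atmosphere, receiver) is the single-scattering lidar range equation, P(R) = E0 (c/2) η (A/R²) β exp(-2∫α dr). The abstract names MATLAB; the sketch below uses Python with the same structure, and every parameter value is an illustrative placeholder rather than the paper's design point.

    ```python
    # Sketch: received power versus range from the single-scattering lidar
    # equation, with constant backscatter and extinction. Values illustrative.
    import numpy as np

    E0   = 10e-6             # pulse energy, J (micropulse regime)
    c    = 3e8               # speed of light, m/s
    eta  = 0.4               # overall optical efficiency
    A    = np.pi * 0.1**2    # receiver aperture area, m^2 (20 cm telescope)
    beta  = 1e-6             # backscatter coefficient, 1/(m sr), assumed constant
    alpha = 1e-5             # extinction coefficient, 1/m, assumed constant

    R = np.linspace(100.0, 10000.0, 100)        # range gates, m
    P = E0 * (c / 2) * eta * (A / R**2) * beta * np.exp(-2 * alpha * R)
    print("received power at 1 km: %.3e W" % P[np.argmin(abs(R - 1000))])
    ```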

  10. Computer simulation of complexity in plasmas

    Energy Technology Data Exchange (ETDEWEB)

    Hayashi, Takaya; Sato, Tetsuya [National Inst. for Fusion Science, Toki, Gifu (Japan)

    1998-08-01

    By making a comprehensive comparative study of many self-organizing phenomena occurring in magnetohydrodynamic and kinetic plasmas, we arrived at a hypothetical grand view of self-organization. This assertion is confirmed by a recent computer simulation in a broader field of science, specifically the structure formation of short polymer chains, where the nature of the interaction is completely different from that of plasmas. It is found that the formation of global orientation order proceeds stepwise. (author)

  11. Computer Simulation of Multidimensional Archaeological Artefacts

    Directory of Open Access Journals (Sweden)

    Vera Moitinho de Almeida

    2012-11-01

    Our project focuses on the Neolithic lakeside site of La Draga (Banyoles, Catalonia). In this presentation we will begin by providing a clear overview of the major guidelines used to capture and process 3D digital data of several wooden artefacts. Then we shall present the use of semi-automated extraction of relevant features. Finally, we intend to share preliminary computer simulation issues.

  12. Introducing Seismic Tomography with Computational Modeling

    Science.gov (United States)

    Neves, R.; Neves, M. L.; Teodoro, V.

    2011-12-01

    Learning seismic tomography principles and techniques requires advanced physical and computational knowledge. In-depth learning of such computational skills is a difficult cognitive process that demands a strong background in physics, mathematics, and computer programming. The corresponding learning environments and pedagogic methodologies should therefore involve computational modelling activities with software systems that allow students to improve their mathematical or programming knowledge while focusing on the learning of seismic wave propagation and inverse theory. To reduce the cognitive opacity associated with mathematical or programming knowledge, several computer modelling systems have already been developed (Neves & Teodoro, 2010). Among such systems, Modellus is particularly well suited to this goal because it is a domain-general environment for explorative and expressive modelling with the following main advantages: 1) easy and intuitive creation of mathematical models using standard mathematical notation; 2) simultaneous exploration of images, tables, graphs and object animations; 3) attribution of the mathematical properties expressed in the models to animated objects; and 4) computation and display of mathematical quantities obtained from the analysis of images and graphs. Here we describe virtual simulations and educational exercises that give students an easy grasp of the fundamentals of seismic tomography. The simulations make lectures more interactive and allow students to overcome their lack of advanced mathematical or programming knowledge and focus on learning seismological concepts and processes, taking advantage of basic scientific computation methods and tools.
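
    A minimal numerical illustration of the tomography principle such exercises build on: travel times are linear in the cell slownesses, t = G s, where G holds ray path lengths per cell, and the slowness image is recovered by least squares. The two-ray-family geometry below is purely illustrative, not one of the described exercises.

    ```python
    # Straight-ray travel-time tomography: t = G s, inverted by least squares.
    import numpy as np

    rng = np.random.default_rng(4)
    n = 8                                   # n x n grid of slowness cells
    s_true = np.full(n * n, 1.0)
    s_true[2 * n + 3] = 1.5                 # slow anomaly at row 2, column 3

    G = np.zeros((2 * n, n * n))            # one ray per row and per column
    for r in range(n):
        G[r, r * n:(r + 1) * n] = 1.0       # horizontal ray through row r
        G[n + r, r::n] = 1.0                # vertical ray through column r
    t = G @ s_true + rng.normal(scale=0.01, size=2 * n)   # noisy travel times

    s0 = np.ones(n * n)                     # background model
    ds, *_ = np.linalg.lstsq(G, t - G @ s0, rcond=None)   # minimum-norm update
    s_est = s0 + ds
    # Under-determined: the anomaly is smeared along its row and column,
    # but their intersection stands out.
    print("estimate at anomaly cell: %.3f (true 1.5)" % s_est[2 * n + 3])
    ```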

  13. Simulations of the pipe overpack to compute constitutive model parameters for use in WIPP room closure calculations.

    Energy Technology Data Exchange (ETDEWEB)

    Park, Byoung Yoon; Hansen, Francis D.

    2004-07-01

    The regulatory compliance determination for the Waste Isolation Pilot Plant includes the consideration of room closure. Elements of the geomechanical processes include salt creep, gas generation and mechanical deformation of the waste residing in the rooms. The WIPP was certified as complying with regulatory requirements based in part on the implementation of room closure and material models for the waste. Since the WIPP began receiving waste in 1999, waste packages have been identified that are appreciably more robust than the 55-gallon drums characterized for the initial calculations. The pipe overpack comprises one such waste package. This report develops material model parameters for the pipe overpack containers by using axisymmetrical finite element models. Known material properties and structural dimensions allow well constrained models to be completed for uniaxial, triaxial, and hydrostatic compression of the pipe overpack waste package. These analyses show that the pipe overpack waste package is far more rigid than the originally certified drum. The model parameters developed in this report are used subsequently to evaluate the implications to performance assessment calculations.

  14. QCWAVE, a Mathematica quantum computer simulation update

    CERN Document Server

    Tabakin, Frank

    2011-01-01

    This Mathematica 7.0/8.0 package upgrades and extends the quantum computer simulation code called QDENSITY. Use of the density matrix was emphasized in QDENSITY, although that code was also applicable to a quantum state description. In the present version, the quantum state version is stressed and made amenable to future extensions to parallel computer simulations. The add-on QCWAVE extends QDENSITY in several ways. The first is to describe the action of one-, two- and three-qubit quantum gates as a set of small ($2 \times 2$, $4 \times 4$ or $8 \times 8$) matrices acting on the $2^{n_q}$ amplitudes of a system of $n_q$ qubits. This procedure was described in our parallel computer simulation QCMPI and is reviewed here. The advantage is that smaller storage demands are made, without loss of speed, and that the procedure can take advantage of message passing interface (MPI) techniques, which will hopefully be generally available in future Mathematica versions. Another extension of QDENSITY provided here is a mu...
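
    The amplitude-update idea described, applying a small gate matrix to the 2^n amplitudes without ever forming the full 2^n x 2^n operator, can be sketched as follows. This is a generic reshaping implementation in Python (NumPy), not QCWAVE's Mathematica code; qubit 0 is taken as the most significant bit.

    ```python
    # Sketch: apply a 2x2 one-qubit gate to an n-qubit amplitude vector by
    # reshaping, avoiding construction of the full 2^n x 2^n matrix.
    import numpy as np

    def apply_1q(psi, U, target, n):
        """Apply 2x2 gate U to qubit `target` of an n-qubit state vector."""
        psi = psi.reshape([2] * n)            # one tensor axis per qubit
        psi = np.tensordot(U, psi, axes=([1], [target]))
        psi = np.moveaxis(psi, 0, target)     # restore qubit axis ordering
        return psi.reshape(-1)

    n = 3
    psi = np.zeros(2 ** n, dtype=complex)
    psi[0] = 1.0                              # |000>
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    psi = apply_1q(psi, H, target=0, n=n)     # Hadamard on qubit 0
    print(np.round(psi, 3))                   # (|000> + |100>)/sqrt(2)
    ```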

  15. Computer modelling and numerical simulation of the solid state diode pumped Nd:YAG laser with intracavity saturable absorber

    OpenAIRE

    Yashkir, Yuriy

    2009-01-01

    Stimulated emission in a Nd:YAG laser with a Cr:YAG saturable absorber is modelled as a superposition of interacting optical modes that are stable in a given optical cavity. The interaction of the active laser crystal and the passive Q-switch with the diode pump and the optical cavity modes is modelled, taking into account the transverse two-dimensional variation of the pump field and of all participating optical modes. Each elementary volume of the active laser crystal and the Q-switching crystal in...
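
    A plane-wave (single transverse mode) reduction of such a model is the standard set of coupled rate equations for photon density, gain inversion, and absorber population; the paper's model additionally resolves the transverse structure. The sketch below integrates the reduced equations with illustrative, unfitted parameters.

    ```python
    # Coupled rate equations for passive Q-switching (plane-wave reduction).
    # phi: photon density, n: Nd inversion, ns: Cr4+ ground-state density.
    import numpy as np
    from scipy.integrate import solve_ivp

    sigma, sigma_s = 2.8e-19, 4.0e-18   # gain/absorber cross sections, cm^2
    tau, tau_s = 230e-6, 3.4e-6         # upper-state lifetimes, s
    l_g, l_s, L_cav = 5.0, 0.5, 10.0    # gain, absorber, cavity lengths, cm
    c = 3e10                            # speed of light, cm/s
    t_r = 2.0 * L_cav / c               # cavity round-trip time, s
    loss = 0.10                         # fixed round-trip loss + output coupling
    R_p, ns0 = 1e22, 3e17               # pump rate 1/(cm^3 s), absorber density

    def rhs(t, y):
        phi, n, ns = y
        gain = 2 * sigma * n * l_g - 2 * sigma_s * ns * l_s - loss
        return [phi * gain / t_r + 1e10,             # small spontaneous seed
                R_p - n / tau - c * sigma * n * phi,
                (ns0 - ns) / tau_s - c * sigma_s * ns * phi]

    sol = solve_ivp(rhs, (0.0, 2e-4), [0.0, 0.0, ns0], method="LSODA",
                    rtol=1e-6, atol=1.0)
    print("peak photon density: %.2e cm^-3" % sol.y[0].max())
    ```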

  16. Simulating Boolean circuits on a DNA computer

    Energy Technology Data Exchange (ETDEWEB)

    Ogihara, Mitsunori; Ray, A. [Univ. of Rochester, NY (United States)

    1997-12-01

    We demonstrate that DNA computers can simulate Boolean circuits with a small overhead. Boolean circuits embody the notion of massively parallel signal processing and are frequently encountered in many parallel algorithms. Many important problems such as sorting, integer arithmetic, and matrix multiplication are known to be computable by small size Boolean circuits much faster than by ordinary sequential digital computers. This paper shows that DNA chemistry allows one to simulate large semi-unbounded fan-in Boolean circuits with a logarithmic slowdown in computation time. Also, for the class NC{sup 1}, the slowdown can be reduced to a constant. In this algorithm we have encoded the inputs, the Boolean AND gates, and the OR gates to DNA oligonucleotide sequences. We operate on the gates and the inputs by standard molecular techniques of sequence-specific annealing, ligation, separation by size, amplification, sequence-specific cleavage, and detection by size. Additional steps of amplification are not necessary for NC{sup 1} circuits. Preliminary biochemical experiments on a small test circuit have produced encouraging results. Further confirmatory experiments are in progress. 19 refs., 3 figs., 1 tab.

  17. Validated physical models and parameters of bulk 3C–SiC aiming for credible technology computer aided design (TCAD) simulation

    Science.gov (United States)

    Arvanitopoulos, A.; Lophitis, N.; Gyftakis, K. N.; Perkins, S.; Antoniou, M.

    2017-10-01

    The cubic form of SiC (β- or 3C-SiC), compared to the hexagonal α-SiC polytypes (primarily 4H- and 6H-SiC), has a lower growth cost and can be grown heteroepitaxially on large-area silicon (Si) wafers, which makes it of special interest. This, in conjunction with the recently reported growth of improved-quality 3C-SiC, makes the development of devices an imminent objective. However, the availability of models that accurately predict the material's characteristics, properties, and performance is an imperative requirement for the design and optimization of functional devices. The purpose of this study is to provide and validate a comprehensive set of models, together with their parameters, for bulk 3C-SiC. The validation process revealed that the proposed models are in very good agreement with experimental data, and confidence ranges were identified. This is the first piece of work to achieve that for 3C-SiC; it constitutes the necessary step towards finite element method simulations and technology computer aided design.
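
    TCAD material files typically express, for example, the doping dependence of low-field carrier mobility through a Caughey-Thomas form, mu(N) = mu_min + (mu_max - mu_min) / (1 + (N/N_ref)^alpha). A sketch of such a model follows; the parameter values are placeholders, not the validated 3C-SiC set from the paper.

    ```python
    # Sketch: Caughey-Thomas doping-dependent low-field mobility model.
    # Parameter values are illustrative placeholders only.
    def caughey_thomas(N, mu_min=40.0, mu_max=800.0, N_ref=2e17, alpha=0.7):
        """Electron mobility in cm^2/(V s) versus doping N in cm^-3."""
        return mu_min + (mu_max - mu_min) / (1.0 + (N / N_ref) ** alpha)

    for N in (1e15, 1e16, 1e17, 1e18, 1e19):
        print("N = %.0e cm^-3 -> mu = %.0f cm^2/Vs" % (N, caughey_thomas(N)))
    ```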

  18. Dynamic 99mTc-MAG3 renography: images for quality control obtained by combining pharmacokinetic modelling, an anthropomorphic computer phantom and Monte Carlo simulated scintillation camera imaging

    Science.gov (United States)

    Brolin, Gustav; Sjögreen Gleisner, Katarina; Ljungberg, Michael

    2013-05-01

    In dynamic renal scintigraphy, the main interest is the redistribution of the radiopharmaceutical as a function of time. Quality control (QC) of renal procedures often relies on phantom experiments to compare image-based results with the measurement setup. A phantom with a realistic anatomy and a time-varying activity distribution is therefore desirable. This work describes a pharmacokinetic (PK) compartment model for 99mTc-MAG3, used to define a dynamic whole-body activity distribution within a digital phantom (XCAT) for accurate Monte Carlo (MC)-based QC images. Each phantom structure is assigned a time-activity curve provided by the PK model, employing parameter values consistent with MAG3 pharmacokinetics. This approach ensures that the total amount of tracer in the phantom is preserved between time points, and it allows for modifications of the pharmacokinetics in a controlled fashion. By adjusting parameter values in the PK model, different clinically realistic scenarios can be mimicked regarding, e.g., the relative renal uptake and the renal transit time. Using the MC code SIMIND, a complete set of renography images, including the effects of photon attenuation, scattering, limited spatial resolution and noise, is simulated. The resulting image data can be used to evaluate quantitative techniques and computer software in clinical renography.
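
    The PK backbone of such a phantom can be sketched as a small compartment model whose total tracer is conserved between time points, the property the abstract emphasizes. The compartments and rate constants below are illustrative stand-ins, not the MAG3 parameter values used in the paper.

    ```python
    # Sketch: two-compartment kinetics with explicit urine pool so total
    # tracer is conserved. Rates are illustrative, not MAG3 values.
    import numpy as np
    from scipy.integrate import solve_ivp

    k_bk, k_ku = 0.35, 0.18   # blood->kidney, kidney->urine rates, 1/min

    def rhs(t, y):
        blood, kidney, urine = y
        return [-k_bk * blood,
                k_bk * blood - k_ku * kidney,
                k_ku * kidney]

    sol = solve_ivp(rhs, (0.0, 30.0), [1.0, 0.0, 0.0],
                    t_eval=np.linspace(0.0, 30.0, 7))
    for t, b, k, u in zip(sol.t, *sol.y):
        print("t=%4.1f min  blood %.3f  kidney %.3f  urine %.3f  total %.3f"
              % (t, b, k, u, b + k + u))
    ```

    Each phantom structure would then be assigned one of these curves as its time-activity curve, with the conserved total guaranteeing consistency across time points.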

  19. A Computational Model to Simulate Groundwater Seepage Risk in Support of Geotechnical Investigations of Levee and Dam Projects

    Science.gov (United States)

    2013-03-01

    ...process-imitating rules. The model aggrades an alluvial floodplain, creating floodplain architecture by differentiating between sediment deposited by ... meandering rivers. The results suggest that the channel aggradation rate heavily influenced the relative channel avulsion frequency during floodplain ... composition and organization of the river basin and its floodplain (Schumm 1968). In an actively building (aggrading) floodplain, the river channel is ...

  20. Dynamic computer simulation of the Fort St. Vrain steam turbines

    Energy Technology Data Exchange (ETDEWEB)

    Conklin, J.C.

    1983-01-01

    A computer simulation is described for the dynamic response of the Fort St. Vrain nuclear reactor regenerative intermediate- and low-pressure steam turbines. The fundamental computer-modeling assumptions for the turbines and feedwater heaters are developed. A turbine heat balance specifying steam and feedwater conditions at a given generator load and the volumes of the feedwater heaters are all that are necessary as descriptive input parameters. Actual plant data for a generator load reduction from 100 to 50% power (which occurred as part of a plant transient on November 9, 1981) are compared with computer-generated predictions, with reasonably good agreement.