WorldWideScience

Sample records for models computer simulation

  1. Computer Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Pronskikh, V. S. [Fermilab

    2014-05-09

    Verification and validation of the computer codes and models used in simulation are two highly important aspects of scientific practice that have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model's relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to the practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction must be made between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimentation with them). Holding to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement, I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviating the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.

  2. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    leaving students. It is a probabilistic model. In the next part of this article, two more models - an 'input/output model' used for production systems or economic studies and a 'discrete event simulation model' - are introduced. Aircraft Performance Model.

  3. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    Most systems involve parameters and variables, which are random variables due to uncertainties. Probabilistic methods are powerful in modelling such systems. In this second part, we describe probabilistic models and Monte Carlo simulation, along with 'classical' matrix methods and differential equations, as most real ...

  4. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    A familiar example of a feedback loop is the business model in which part of the output or profit is fed back as input or additional capital - for instance, a company may choose to reinvest 10% of the profit for expansion of the business. Such simple models, like ... would help scientists, engineers and managers towards better ...

  5. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    Modelling Deterministic Systems. N K Srinivasan graduated from the Indian Institute of Science and obtained his doctorate from Columbia University, New York. He has taught in several universities, and later did system analysis, wargaming and simulation for defence. His other areas of interest are reliability engineering ...

  6. A parallel computational model for GATE simulations.

    Science.gov (United States)

    Rannou, F R; Vega-Acevedo, N; El Bitar, Z

    2013-12-01

    GATE/Geant4 Monte Carlo simulations are computationally demanding applications, requiring thousands of processor hours to produce realistic results. The classical strategy of distributing the simulation of individual events does not apply efficiently for Positron Emission Tomography (PET) experiments, because it requires centralized coincidence processing and large communication overheads. We propose a parallel computational model for GATE that handles event generation and coincidence processing in a simple and efficient way by decentralizing event generation and processing but maintaining a centralized event and time coordinator. The model is implemented with the inclusion of a new set of factory classes that can run the same executable in sequential or parallel mode. A Mann-Whitney test shows that the output produced by this parallel model in terms of number of tallies is equivalent (but not equal) to its sequential counterpart. Computational performance evaluation shows that the software is scalable and well balanced.
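
    The decentralization described above can be illustrated with a minimal sketch (not the authors' GATE factory classes): a coordinator splits the simulated time axis across workers, and each worker generates and processes its own events locally, so no central coincidence processor is needed. The event rate and detection probability below are hypothetical.

        import multiprocessing as mp
        import random

        def simulate_window(args):
            """Worker: generate and process events inside one assigned time window."""
            seed, t_start, t_end = args
            rng = random.Random(seed)
            tallies, t = 0, t_start
            while True:
                t += rng.expovariate(1e4)      # hypothetical event rate (events/s)
                if t >= t_end:
                    break
                if rng.random() < 0.05:        # hypothetical detection probability
                    tallies += 1               # "coincidence" processing stays local
            return tallies

        if __name__ == "__main__":
            n_workers, total_time = 4, 1.0
            dt = total_time / n_workers
            # Central time coordinator: distribute time windows, not single events.
            jobs = [(i, i * dt, (i + 1) * dt) for i in range(n_workers)]
            with mp.Pool(n_workers) as pool:
                print("total tallies:", sum(pool.map(simulate_window, jobs)))

    Because each worker owns a disjoint slice of simulated time, merging the outputs reduces to summing per-worker tallies.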

  7. Reproducibility in Computational Neuroscience Models and Simulations

    Science.gov (United States)

    McDougal, Robert A.; Bulanova, Anna S.; Lytton, William W.

    2016-01-01

    Objective Like all scientific research, computational neuroscience research must be reproducible. Big data science, including simulation research, cannot depend exclusively on journal articles as the method to provide the sharing and transparency required for reproducibility. Methods Ensuring model reproducibility requires the use of multiple standard software practices and tools, including version control, strong commenting and documentation, and code modularity. Results Building on these standard practices, model sharing sites and tools have been developed that fit into several categories: (1) standardized neural simulators; (2) shared computational resources; (3) declarative model descriptors, ontologies and standardized annotations; (4) model sharing repositories and sharing standards. Conclusion A number of complementary innovations have been proposed to enhance sharing, transparency and reproducibility. The individual user can be encouraged to make use of version control, commenting, documentation and modularity in development of models. The community can help by requiring model sharing as a condition of publication and funding. Significance Model management will become increasingly important as multiscale models become larger, more detailed and correspondingly more difficult to manage by any single investigator or single laboratory. Additional big data management complexity will come as the models become more useful in interpreting experiments, thus increasing the need to ensure clear alignment between modeling data, both parameters and results, and experiment. PMID:27046845
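
    A minimal sketch of the provenance habit these practices point to, assuming a git-tracked project; run_with_provenance and the record fields are hypothetical names for illustration, not part of any named tool:

        import datetime
        import json
        import subprocess

        def run_with_provenance(params, run_model, out_path="run_record.json"):
            """Run a model and store its parameters plus code version alongside results."""
            try:
                commit = subprocess.check_output(
                    ["git", "rev-parse", "HEAD"], text=True).strip()
            except Exception:
                commit = "unknown"             # not inside a git repository
            results = run_model(**params)
            record = {
                "timestamp": datetime.datetime.now().isoformat(),
                "git_commit": commit,          # version control ties results to code
                "parameters": params,
                "results": results,
            }
            with open(out_path, "w") as f:
                json.dump(record, f, indent=2)
            return results

        # usage: run_with_provenance({"rate": 0.1}, lambda rate: {"mean": 10 * rate})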

  8. Computational Modeling and Simulation of Developmental ...

    Science.gov (United States)

    Standard practice for assessing developmental toxicity is the observation of apical endpoints (intrauterine death, fetal growth retardation, structural malformations) in pregnant rats/rabbits following exposure during organogenesis. EPA's computational toxicology research program (ToxCast) generated vast in vitro cellular and molecular effects data on >1858 chemicals in >600 high-throughput screening (HTS) assays. The diversity of assays has been increased for developmental toxicity with several HTS platforms, including the devTOX-quickPredict assay from Stemina Biomarker Discovery utilizing the human embryonic stem cell line (H9). Translating these HTS data into higher-order predictions of developmental toxicity is a significant challenge. Here, we address the application of computational systems models that recapitulate the kinematics of dynamical cell signaling networks (e.g., SHH, FGF, BMP, retinoids) in a CompuCell3D.org modeling environment. Examples include angiogenesis (angiodysplasia) and dysmorphogenesis. Being numerically responsive to perturbation, these models are amenable to data integration for systems toxicology and Adverse Outcome Pathways (AOPs). The AOP simulation outputs predict potential phenotypes based on the in vitro HTS data from ToxCast. A heuristic computational intelligence framework that recapitulates the kinematics of dynamical cell signaling networks in the embryo, together with the in vitro profiling data, produces quantitative predictions ...

  9. Evaluation of Marine Corps Manpower Computer Simulation Model

    Science.gov (United States)

    2016-12-01

    overall end strength are maintained. To assist their mission, an agent-based computer simulation model was developed in the Java computer language. This thesis investigates that ... a simulation software that models business practices to assist that business in its "ability to analyze and make decisions on how to improve (their ...

  10. Application of computer simulated persons in indoor environmental modeling

    DEFF Research Database (Denmark)

    Topp, C.; Nielsen, P. V.; Sørensen, Dan Nørtoft

    2002-01-01

    Computer simulated persons are often applied when the indoor environment is modeled by computational fluid dynamics. The computer simulated persons differ in size, shape, and level of geometrical complexity, ranging from simple box or cylinder shaped heat sources to more humanlike models. Little...

  11. Using Computational Simulations to Confront Students' Mental Models

    Science.gov (United States)

    Rodrigues, R.; Carvalho, P. Simeão

    2014-01-01

    In this paper we show an example of how to use a computational simulation to obtain visual feedback for students' mental models, and compare their predictions with the simulated system's behaviour. Additionally, we use the computational simulation to incrementally modify the students' mental models in order to accommodate new data,…

  12. Computer Models Simulate Fine Particle Dispersion

    Science.gov (United States)

    2010-01-01

    Through a NASA Seed Fund partnership with DEM Solutions Inc., of Lebanon, New Hampshire, scientists at Kennedy Space Center refined existing software to study the electrostatic phenomena of granular and bulk materials as they apply to planetary surfaces. The software, EDEM, allows users to import particles and obtain accurate representations of their shapes for modeling purposes, such as simulating bulk solids behavior, and was enhanced to be able to more accurately model fine, abrasive, cohesive particles. These new EDEM capabilities can be applied in many industries unrelated to space exploration and have been adopted by several prominent U.S. companies, including John Deere, Pfizer, and Procter & Gamble.

  13. Computer Simulation (Microcultures): An Effective Model for Multicultural Education.

    Science.gov (United States)

    Nelson, Jorge O.

    This paper presents a rationale for using high-fidelity computer simulation in planning for and implementing effective multicultural education strategies. Using computer simulation, educators can begin to understand and plan for the concept of cultural sensitivity in delivering instruction. The model promises to emphasize teachers' understanding…

  14. Overview of Computer Simulation Modeling Approaches and Methods

    Science.gov (United States)

    Robert E. Manning; Robert M. Itami; David N. Cole; Randy Gimblett

    2005-01-01

    The field of simulation modeling has grown greatly with recent advances in computer hardware and software. Much of this work has involved large scientific and industrial applications for which substantial financial resources are available. However, advances in object-oriented programming and simulation methodology, concurrent with dramatic increases in computer...

  15. Methodology of modeling and measuring computer architectures for plasma simulations

    Science.gov (United States)

    Wang, L. P. T.

    1977-01-01

    A brief introduction is given to plasma simulation using computers and to the difficulties encountered on currently available computers. Through the use of an analyzing and measuring methodology - SARA - the control flow and data flow of a particle simulation model, REM2-1/2D, are exemplified. After recursive refinements the total execution time may be greatly shortened and a fully parallel data flow can be obtained. From this data flow, a matched computer architecture or organization could be configured to achieve the computation bound of an application problem. A sequential-type simulation model, an array/pipeline-type simulation model, and a fully parallel simulation model of the code REM2-1/2D are proposed and analyzed. This methodology can be applied to other application problems which have an implicitly parallel nature.

  16. Biocellion: accelerating computer simulation of multicellular biological system models.

    Science.gov (United States)

    Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

    2014-11-01

    Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling the function body of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in soil aggregate as case studies. Biocellion runs on x86 compatible systems with the 64 bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information.
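
    The "fill in the function body" pattern the abstract describes can be sketched as follows; the class and routine names are hypothetical illustrations of the callback style, not Biocellion's actual API:

        # The framework owns the (potentially parallel) loop over agents; the
        # modeler only fills in the bodies of predefined model routines.
        class CellModel:
            def init_cell(self, cell_id):
                """Predefined routine: create one agent's state."""
                return {"id": cell_id, "state": 0}

            def update_cell(self, cell, neighbors):
                """Predefined routine: one agent's update rule per time step."""
                cell["state"] += len(neighbors)   # user-specific biology goes here

        class Framework:
            def __init__(self, model, n_cells):
                self.model = model
                self.cells = [model.init_cell(i) for i in range(n_cells)]

            def step(self):
                for i, cell in enumerate(self.cells):
                    neighbors = self.cells[max(0, i - 1):i] + self.cells[i + 1:i + 2]
                    self.model.update_cell(cell, neighbors)

        fw = Framework(CellModel(), n_cells=10)
        fw.step()
        print(fw.cells[0])   # {'id': 0, 'state': 1}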

  17. Tutorial: Parallel Computing of Simulation Models for Risk Analysis.

    Science.gov (United States)

    Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D

    2016-10-01

    Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix.
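
    The tutorial's examples are in MATLAB and R; the same embarrassingly parallel structure looks like this in Python (the toy loss model below is our assumption for illustration, not from the article):

        from concurrent.futures import ProcessPoolExecutor
        import random

        def one_replication(seed):
            """One independent replication; replications share no state, which
            is what makes the workload embarrassingly parallel."""
            rng = random.Random(seed)
            # hypothetical risk model: loss occurs with prob. 0.01, severity ~ Exp(1)
            return rng.expovariate(1.0) if rng.random() < 0.01 else 0.0

        if __name__ == "__main__":
            with ProcessPoolExecutor() as pool:   # one worker per core by default
                losses = list(pool.map(one_replication, range(100_000),
                                       chunksize=1000))
            print("expected loss:", sum(losses) / len(losses))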

  18. Predictive Capability Maturity Model for computational modeling and simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy; Pilch, Martin M.

    2007-10-01

    The Predictive Capability Maturity Model (PCMM) is a new model that can be used to assess the level of maturity of computational modeling and simulation (M&S) efforts. The development of the model is based on both the authors' experience and their analysis of similar investigations in the past. The perspective taken in this report is one of judging the usefulness of a predictive capability that relies on the numerical solution to partial differential equations to better inform and improve decision making. The review of past investigations, such as the Software Engineering Institute's Capability Maturity Model Integration and the National Aeronautics and Space Administration and Department of Defense Technology Readiness Levels, indicates that a more restricted, more interpretable method is needed to assess the maturity of an M&S effort. The PCMM addresses six contributing elements to M&S: (1) representation and geometric fidelity, (2) physics and material model fidelity, (3) code verification, (4) solution verification, (5) model validation, and (6) uncertainty quantification and sensitivity analysis. For each of these elements, attributes are identified that characterize four increasing levels of maturity. Importantly, the PCMM is a structured method for assessing the maturity of an M&S effort that is directed toward an engineering application of interest. The PCMM does not assess whether the M&S effort, the accuracy of the predictions, or the performance of the engineering system satisfies or does not satisfy specified application requirements.

  19. Methodology for characterizing modeling and discretization uncertainties in computational simulation

    Energy Technology Data Exchange (ETDEWEB)

    ALVIN,KENNETH F.; OBERKAMPF,WILLIAM L.; RUTHERFORD,BRIAN M.; DIEGERT,KATHLEEN V.

    2000-03-01

    This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed towards developing methodologies which treat model form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for sources of uncertainty and error in the modeling and simulation process which impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.

  20. Computer simulation study of water using a fluctuating charge model

    Indian Academy of Sciences (India)

    Unknown

    Hydrogen bonding in small water clusters is studied through computer simulation methods using a sophisticated, empirical model of interaction developed by Rick et al (S W Rick, S J Stuart and B J Berne 1994 J. Chem. Phys. 101 6141) and others. The model allows for the charges on the interacting sites to ...

  1. Computational Modeling and Simulation of Developmental ...

    Science.gov (United States)

    Developmental and Reproductive Toxicity (DART) testing is important for assessing the potential consequences of drug and chemical exposure on human health and well-being. Complexity of pregnancy and the reproductive cycle makes DART testing challenging and costly for traditional (animal-based) methods. A compendium of in vitro data from ToxCast/Tox21 high-throughput screening (HTS) programs is available for predictive toxicology. ‘Predictive DART’ will require an integrative strategy that mobilizes HTS data into in silico models that capture the relevant embryology. This lecture addresses progress on EPA's 'virtual embryo'. The question of how tissues and organs are shaped during development is crucial for understanding (and predicting) human birth defects. While ToxCast HTS data may predict developmental toxicity with reasonable accuracy, mechanistic models are still necessary to capture the relevant biology. Subtle microscopic changes induced chemically may amplify to an adverse outcome but coarse changes may override lesion propagation in any complex adaptive system. Modeling system dynamics in a developing tissue is a multiscale problem that challenges our ability to predict toxicity from in vitro profiling data (ToxCast/Tox21). (DISCLAIMER: The views expressed in this presentation are those of the presenter and do not necessarily reflect the views or policies of the US EPA). This was an invited seminar presentation to the National Institute for Public H

  2. Strategic Implications of Cloud Computing for Modeling and Simulation (Briefing)

    Science.gov (United States)

    2016-04-01

    Promises of cloud computing include cost efficiency, unlimited storage, backup and recovery, automatic software integration, and easy access to information. ... Prepared by Amy E. Henninger, Institute for Defense Analyses, under contract HQ0034-14-D-0001, Project AI-2-3077, "Cloud Computing for Modeling and Simulation," for the Office of the Deputy Assistant Director of ...

  3. Blast Load Simulator Experiments for Computational Model Validation Report 3

    Science.gov (United States)

    2017-07-01

    ... establish confidence in the simulation results specific to their intended use. One method for providing experimental data for computational model ... walls, to higher blast pressures required to evaluate the performance of protective construction methods. [Figure 1: ERDC Blast Load Simulator (BLS).] Instrumentation included 3 pressure gauges mounted on the steel calibration plate, 2 pressure gauges mounted in the wall of the BLS, and 25 pressure gauges ...

  4. Computer simulation study of water using a fluctuating charge model

    Indian Academy of Sciences (India)

    Unknown

    study of water through computer simulation methods has attracted considerable attention. ... water. In particular, the single-particle and collective relaxation times obtained using this model are in rough agreement with experiment. Yet, in all these quantities, the ... The fictitious mass of the charge has to be chosen with care.

  5. Modeling and simulation the computer science of illusion

    CERN Document Server

    Raczynski, Stanislaw

    2006-01-01

    Simulation is the art of using tools - physical or conceptual models, or computer hardware and software - to attempt to create the illusion of reality. The discipline has in recent years expanded to include the modelling of systems that rely on human factors and therefore possess a large proportion of uncertainty, such as social, economic or commercial systems. These new applications make the discipline of modelling and simulation a field of dynamic growth and new research. Stanislaw Raczynski outlines the considerable and promising research that is being conducted to counter the problems of ...

  6. Computer modeling of road bridge for simulation moving load

    Directory of Open Access Journals (Sweden)

    Miličić Ilija M.

    2016-01-01

    Full Text Available This paper presents the computational modelling of a single-span road truss bridge with the roadway on the upper chord. The calculation models were treated as planar and spatial girders composed of 1D finite elements, using the CAA applications Tower and Bridge Designer 2016 (2nd Edition). The computer simulations yield results for comparing the effects of moving loads according to the recommendations of two standards, SRPS and AASHTO. The bridge structure was therefore modeled in Bridge Designer 2016 (2nd Edition) identically to its model in the Tower environment. Important information for the selection of a computer application is that Bridge Designer 2016 (2nd Edition) cannot treat the moving-load model required by the national standard V600.

  7. Computational electronics semiclassical and quantum device modeling and simulation

    CERN Document Server

    Vasileska, Dragica; Klimeck, Gerhard

    2010-01-01

    Starting with the simplest semiclassical approaches and ending with the description of complex fully quantum-mechanical methods for quantum transport analysis of state-of-the-art devices, Computational Electronics: Semiclassical and Quantum Device Modeling and Simulation provides a comprehensive overview of the essential techniques and methods for effectively analyzing transport in semiconductor devices. With the transistor reaching its limits and new device designs and paradigms of operation being explored, this timely resource delivers the simulation methods needed to properly model state-of-the-art devices.

  8. Computer Model and Simulation of a Glove Box Process

    International Nuclear Information System (INIS)

    Foster, C.

    2001-01-01

    The development of facilities to deal with the disposition of nuclear materials at an acceptable level of Occupational Radiation Exposure (ORE) is a significant issue facing the nuclear community. One solution is to minimize the worker's exposure through the use of automated systems. However, the adoption of automated systems for these tasks is hampered by the challenging requirements that these systems must meet in order to be cost-effective solutions in the hazardous nuclear materials processing environment. Retrofitting current glove box technologies with automation systems represents a potential near-term technology that can be applied to reduce worker ORE associated with work in nuclear materials processing facilities. Successful deployment of automation systems for these applications requires the development of testing and deployment strategies to ensure the highest level of safety and effectiveness. Historically, safety tests are conducted with glove box mock-ups around the finished design. This late detection of problems leads to expensive redesigns and costly deployment delays. With the widespread availability of computers and cost-effective simulation software, it is possible to discover and fix problems early in the design stages. Computer simulators can easily create a complete model of the system, allowing a safe medium for testing potential failures and design shortcomings. The majority of design specification is now done on computer, and moving that information to a model is relatively straightforward. With a complete model and results from a Failure Mode Effect Analysis (FMEA), redesigns can be worked early. Additional issues such as user accessibility, component replacement, and alignment problems can be tackled early in the virtual environment provided by computer simulation. In this case, a commercial simulation package is used to simulate a lathe process operation at the Los Alamos National Laboratory (LANL). The lathe process operation is indicative of ...

  9. Comprehensive Simulation Lifecycle Management for High Performance Computing Modeling and Simulation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — There are significant logistical barriers to entry-level high performance computing (HPC) modeling and simulation (M&S). IllinoisRocstar sets up the infrastructure for ...

  10. Ravenscar Computational Model compliant AADL Simulation on LEON2

    Directory of Open Access Journals (Sweden)

    Roberto Varona-Gómez

    2013-02-01

    Full Text Available AADL has been proposed for designing and analyzing SW and HW architectures for real-time mission-critical embedded systems. Although the Behavioral Annex improves its simulation semantics, AADL is a language for analyzing architectures and not for simulating them. AADS-T is an AADL simulation tool that supports the performance analysis of the AADL specification throughout the refinement process from the initial system architecture until the complete, detailed application and execution platform are developed. In this way, AADS-T enables the verification of the initial timing constraints during the complete design process. In this paper we focus on the compatibility of AADS-T with the Ravenscar Computational Model (RCM as part of the TASTE toolset. Its flexibility enables AADS-T to support different processors. In this work we have focused on performing the simulation on a LEON2 processor.

  11. Simulation models for computational plasma physics: Concluding report

    International Nuclear Information System (INIS)

    Hewett, D.W.

    1994-01-01

    In this project, the authors enhanced their ability to numerically simulate bounded plasmas that are dominated by low-frequency electric and magnetic fields. They moved towards this goal in several ways; they are now in a position to play significant roles in the modeling of low-frequency electromagnetic plasmas in several new industrial applications. They have significantly increased their facility with the computational methods invented to solve the low-frequency limit of Maxwell's equations (DiPeso, Hewett, accepted, J. Comp. Phys., 1993). This low-frequency model, called the Streamlined Darwin Field (SDF) model (Hewett, Larson, and Doss, J. Comp. Phys., 1992), has now been implemented in a fully non-neutral SDF code, BEAGLE (Larson, Ph.D. dissertation, 1993), and has been further extended to the quasi-neutral limit (DiPeso, Hewett, Comp. Phys. Comm., 1993). In addition, they have resurrected the quasi-neutral, zero-electron-inertia model (ZMR) and begun the task of incorporating into this model internal boundary conditions that have the flexibility of those in GYMNOS, a magnetostatic code now used in ion source work (Hewett, Chen, ICF Quarterly Report, July--September, 1993). Finally, near the end of this project, they invented a new type of banded matrix solver that can be implemented on a massively parallel computer -- thus opening the door for the use of all their ADI schemes on these new computer architectures (Mattor, Williams, Hewett, submitted to Parallel Computing, 1993).

  12. Mathematical and computational modeling and simulation fundamentals and case studies

    CERN Document Server

    Moeller, Dietmar P F

    2004-01-01

    Mathematical and Computational Modeling and Simulation - a highly multi-disciplinary field with ubiquitous applications in science and engineering - is one of the key enabling technologies of the 21st century. This book introduces the use of Mathematical and Computational Modeling and Simulation in order to develop an understanding of the solution characteristics of a broad class of real-world problems. The relevant basic and advanced methodologies are explained in detail, with special emphasis on ill-defined problems. Some 15 simulation systems are presented on the language and the logical level. Moreover, the reader can accumulate experience by studying a wide variety of case studies. The latter are briefly described within the book but their full versions as well as some simulation software demos are available on the Web. The book can be used for university courses of different levels as well as for self-study. Advanced sections are marked and can be skipped in a first reading or in undergraduate courses ...

  13. Protein adsorption on nanoparticles: model development using computer simulation.

    Science.gov (United States)

    Shao, Qing; Hall, Carol K

    2016-10-19

    The adsorption of proteins on nanoparticles results in the formation of the protein corona, the composition of which determines how nanoparticles influence their biological surroundings. We seek to better understand corona formation by developing models that describe protein adsorption on nanoparticles using computer simulation results as data. Using a coarse-grained protein model, discontinuous molecular dynamics simulations are conducted to investigate the adsorption of two small proteins (Trp-cage and WW domain) on a model nanoparticle of diameter 10.0 nm at protein concentrations ranging from 0.5 to 5 mM. The resulting adsorption isotherms are well described by the Langmuir, Freundlich, Temkin and Kiselev models, but not by the Elovich, Fowler-Guggenheim and Hill-de Boer models. We also try to develop a generalized model that can describe protein adsorption equilibrium on nanoparticles of different diameters in terms of dimensionless size parameters. The simulation results for three proteins (Trp-cage, WW domain, and GB3) on four nanoparticles (diameter = 5.0, 10.0, 15.0, and 20.0 nm) illustrate both the promise and the challenge associated with developing generalized models of protein adsorption on nanoparticles.
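
    As a concrete illustration of the best-fitting model class named above, the Langmuir isotherm q(c) = q_max * K * c / (1 + K * c) can be fitted to an adsorption data set in a few lines; the data points below are invented for illustration, not taken from the paper:

        import numpy as np
        from scipy.optimize import curve_fit

        def langmuir(c, q_max, K):
            """Langmuir isotherm: adsorbed amount q as a function of concentration c."""
            return q_max * K * c / (1.0 + K * c)

        # hypothetical data: protein concentration (mM) vs adsorbed amount (a.u.)
        c = np.array([0.5, 1.0, 2.0, 3.0, 4.0, 5.0])
        q = np.array([12.0, 20.0, 29.0, 34.0, 37.0, 38.5])

        (q_max, K), _ = curve_fit(langmuir, c, q, p0=(40.0, 1.0))
        print(f"q_max = {q_max:.1f}, K = {K:.2f} per mM")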

  14. Application of Computer Simulation Modeling to Medication Administration Process Redesign

    Directory of Open Access Journals (Sweden)

    Nathan Huynh

    2012-01-01

    Full Text Available The medication administration process (MAP is one of the most high-risk processes in health care. MAP workflow redesign can precipitate both unanticipated and unintended consequences that can lead to new medication safety risks and workflow inefficiencies. Thus, it is necessary to have a tool to evaluate the impact of redesign approaches in advance of their clinical implementation. This paper discusses the development of an agent-based MAP computer simulation model that can be used to assess the impact of MAP workflow redesign on MAP performance. The agent-based approach is adopted in order to capture Registered Nurse medication administration performance. The process of designing, developing, validating, and testing such a model is explained. Work is underway to collect MAP data in a hospital setting to provide more complex MAP observations to extend development of the model to better represent the complexity of MAP.

  15. A model ecosystem experiment and its computational simulation studies

    International Nuclear Information System (INIS)

    Doi, M.

    2002-01-01

    A simplified microbial model ecosystem and its computer simulation model are introduced as eco-toxicity tests for the assessment of environmental responses to environmental impacts. To take into account the effects on the interactions between species and the environment, one option is to select a keystone species on the basis of ecological knowledge and put it in a single-species toxicity test. Another option proposed is to frame the eco-toxicity tests as an experimental micro-ecosystem study and a theoretical model ecosystem analysis. With these tests, stressors that are more harmful to ecosystems should be replaced with less harmful ones on the basis of unified measures. Management of radioactive materials, chemicals, hyper-eutrophication, and other artificial disturbances of ecosystems should be discussed consistently from the unified viewpoint of environmental protection. (N.C.)

  16. Value stream mapping in a computational simulation model

    Directory of Open Access Journals (Sweden)

    Ricardo Becker Mendes de Oliveira

    2014-08-01

    Full Text Available The decision-making process has been extensively studied by researchers and executives. This paper aims to use the methodology of Value Stream Mapping (VSM) in an integrated manner with a computer simulation model, in order to expand managers' decision-making vision. The object of study is based on a production system that involves a process of automatic packaging of products, where it became necessary to implement changes in order to accommodate new products, so that the detection of bottlenecks and the visualization of impacts generated by future modifications are necessary. The simulation aims to support managers' decisions, considering that the system involves several variables whose behaviors define the complexity of the process. Significant reduction in project costs by anticipating system behavior, together with the results of the Value Stream Mapping to identify activities that do or do not add value to the process, were the main results. The validation of the simulation model will occur with the current map of the system and with the inclusion of Kaizen events, so that waste in future maps is found in a practical and reliable way, which could support decision-making.

  17. Modelling of dusty plasma properties by computer simulation methods

    Energy Technology Data Exchange (ETDEWEB)

    Baimbetov, F B [IETP, Al Farabi Kazakh National University, 96a, Tole bi St, Almaty 050012 (Kazakhstan); Ramazanov, T S [IETP, Al Farabi Kazakh National University, 96a, Tole bi St, Almaty 050012 (Kazakhstan); Dzhumagulova, K N [IETP, Al Farabi Kazakh National University, 96a, Tole bi St, Almaty 050012 (Kazakhstan); Kadyrsizov, E R [Institute for High Energy Densities of RAS, Izhorskaya 13/19, Moscow 125412 (Russian Federation); Petrov, O F [IETP, Al Farabi Kazakh National University, 96a, Tole bi St, Almaty 050012 (Kazakhstan); Gavrikov, A V [IETP, Al Farabi Kazakh National University, 96a, Tole bi St, Almaty 050012 (Kazakhstan)

    2006-04-28

    Computer simulation of dusty plasma properties is performed. The radial distribution functions and the diffusion coefficient are calculated on the basis of Langevin dynamics. A comparison with the experimental data is made.
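
    A minimal sketch of the Langevin-dynamics route to a transport coefficient (our illustration only; the units are arbitrary and the grains are non-interacting, unlike a real dusty plasma): the diffusion coefficient is recovered from the mean squared displacement via <x^2> = 2Dt.

        import random

        gamma, kT = 1.0, 1.0                 # drag and temperature (arbitrary units)
        D = kT / gamma                       # Einstein relation, overdamped motion
        dt, n_steps, n_particles = 1e-3, 1000, 2000

        rng = random.Random(1)
        xs = [0.0] * n_particles
        sigma = (2.0 * D * dt) ** 0.5        # noise amplitude per step
        for _ in range(n_steps):
            xs = [x + sigma * rng.gauss(0.0, 1.0) for x in xs]

        t = n_steps * dt
        msd = sum(x * x for x in xs) / n_particles   # mean squared displacement
        print("D estimated from MSD:", msd / (2.0 * t), "| input D:", D)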

  18. Simulation-Based Inquiry Learning and Computer Modeling: Pitfalls and Potentials

    NARCIS (Netherlands)

    Mulder, Y.G.; Lazonder, Adrianus W.; de Jong, Anthonius J.M.

    2015-01-01

    Background. Inquiry learning environments increasingly incorporate simulation and modeling facilities. Students acquire knowledge through systematic experimentation with the simulations and express that knowledge in runnable computer models. Aim. As inquiry and modeling activities are new and

  19. Computational model for simulation small testing launcher, technical solution

    Science.gov (United States)

    Chelaru, Teodor-Viorel; Cristian, Barbu; Chelaru, Adrian

    2014-12-01

    The purpose of this paper is to present some aspects regarding the computational model and technical solutions for a multistage suborbital launcher for testing (SLT) used to test spatial equipment and scientific measurements. The computational model consists of the numerical simulation of SLT evolution for different start conditions. The launcher model presented has six degrees of freedom (6DOF) and variable mass. The results analysed are the flight parameters and ballistic performances. The discussion focuses on the technical possibility of realizing a small multi-stage launcher by recycling military rocket motors. From a technical point of view, the paper is focused on the national project "Suborbital Launcher for Testing" (SLT), which is based on hybrid propulsion and control systems obtained through an original design. While classical suborbital sounding rockets are unguided and use a solid-fuel motor for propulsion, following an uncontrolled ballistic flight, the SLT project introduces a different approach by proposing the creation of a guided suborbital launcher, which is basically a satellite launcher at a smaller scale, containing its main subsystems. This is why the project itself can be considered an intermediary step in the development of a wider range of launching systems based on hybrid propulsion technology, which may have a major impact on future European launcher programs. The SLT project, as shown in the title, has two major objectives: first, a short-term objective, which consists in obtaining a suborbital launching system that will be able to go into service in a predictable period of time; and a long-term objective that consists in the development and testing of some unconventional subsystems which will be integrated later into the satellite launcher as part of the European space program. This is why the technical content of the project must be carried out beyond the range of the existing suborbital vehicle ...

  20. Flow Through a Laboratory Sediment Sample by Computer Simulation Modeling

    Science.gov (United States)

    2006-09-07

    Keywords: sands; interacting lattice gas; computer simulation; driven flow.

  1. Simulation of quantum computers

    NARCIS (Netherlands)

    De Raedt, H; Michielsen, K; Hams, AH; Miyashita, S; Saito, K; Landau, DP; Lewis, SP; Schuttler, HB

    2001-01-01

    We describe a simulation approach to study the functioning of Quantum Computer hardware. The latter is modeled by a collection of interacting spin-1/2 objects. The time evolution of this spin system maps one-to-one to a quantum program carried out by the Quantum Computer. Our simulation software
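
    A generic state-vector sketch of the idea (not the authors' software): an n-spin system is a vector of 2^n complex amplitudes, and a quantum program is a sequence of unitaries applied to it. Here a Hadamard gate acts on the first of two spin-1/2 objects.

        import numpy as np

        def apply_1q(state, gate, target, n):
            """Apply a 2x2 unitary to qubit `target` of an n-qubit state vector."""
            state = state.reshape([2] * n)
            state = np.tensordot(gate, state, axes=([1], [target]))
            state = np.moveaxis(state, 0, target)
            return state.reshape(-1)

        n = 2
        state = np.zeros(2 ** n, dtype=complex)
        state[0] = 1.0                                   # start in |00>
        H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
        state = apply_1q(state, H, 0, n)                 # Hadamard on qubit 0
        print(np.round(np.abs(state) ** 2, 3))           # [0.5 0.  0.5 0. ]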

  2. Simulation of quantum computers

    NARCIS (Netherlands)

    Raedt, H. De; Michielsen, K.; Hams, A.H.; Miyashita, S.; Saito, K.

    2000-01-01

    We describe a simulation approach to study the functioning of Quantum Computer hardware. The latter is modeled by a collection of interacting spin-1/2 objects. The time evolution of this spin system maps one-to-one to a quantum program carried out by the Quantum Computer. Our simulation software

  3. Simulation model of load balancing in distributed computing systems

    Science.gov (United States)

    Botygin, I. A.; Popov, V. N.; Frolov, S. G.

    2017-02-01

    The availability of high-performance computing, high-speed data transfer over networks, and the widespread use of software for design and pre-production in mechanical engineering have led to the fact that, at present, large industrial enterprises and small engineering companies implement complex computer systems for the efficient solution of production and management tasks. Such computer systems are generally built on the basis of distributed heterogeneous computer systems. The analytical problems solved by such systems are the key subjects of research, but the system-wide problems of efficiently distributing (balancing) the computational load and accommodating input, intermediate, and output databases are no less important. The main tasks of this balancing system are load and condition monitoring of compute nodes and the selection of a node to receive the user's request in accordance with a predetermined algorithm. Load balancing is one of the most widely used methods of increasing the productivity of distributed computing systems through the optimal allocation of tasks between the computer system nodes. Therefore, the development of methods and algorithms for computing optimal schedules in a distributed system that dynamically changes its infrastructure is an important task.
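
    One common policy matching the description above (monitor node load, pick a node by a predetermined algorithm) is least-loaded selection; the sketch below is a generic illustration with hypothetical node names, and it ignores load decay as tasks complete.

        import heapq

        class LeastLoadedBalancer:
            """Route each new request to the node with the smallest current load."""
            def __init__(self, node_names):
                self.heap = [(0.0, name) for name in node_names]   # (load, node)
                heapq.heapify(self.heap)

            def assign(self, cost):
                load, name = heapq.heappop(self.heap)   # least-loaded node
                heapq.heappush(self.heap, (load + cost, name))
                return name

        lb = LeastLoadedBalancer(["node-a", "node-b", "node-c"])
        print([lb.assign(cost) for cost in [5.0, 1.0, 2.0, 2.0, 1.0]])
        # ['node-a', 'node-b', 'node-c', 'node-b', 'node-c']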

  4. Some computer simulations based on the linear relative risk model

    International Nuclear Information System (INIS)

    Gilbert, E.S.

    1991-10-01

    This report presents the results of computer simulations designed to evaluate and compare the performance of the likelihood ratio statistic and the score statistic for making inferences about the linear relative risk model. The work was motivated by data on workers exposed to low doses of radiation, and the report includes illustrations of several procedures for obtaining confidence limits for the excess relative risk coefficient based on data from three studies of nuclear workers. The computer simulations indicate that with small sample sizes and highly skewed dose distributions, asymptotic approximations to the score statistic or to the likelihood ratio statistic may not be adequate. For testing the null hypothesis that the excess relative risk is equal to zero, the asymptotic approximation to the likelihood ratio statistic was adequate, but use of the asymptotic approximation to the score statistic rejected the null hypothesis too often. Frequently the likelihood was maximized at the lower constraint, and when this occurred, the asymptotic approximations for the likelihood ratio and score statistics did not perform well in obtaining upper confidence limits. The score statistic and likelihood ratio statistic were found to perform comparably in terms of power and width of the confidence limits. It is recommended that with modest sample sizes, confidence limits be obtained using computer simulations based on the score statistic. Although nuclear worker studies are emphasized in this report, its results are relevant for any study investigating linear dose-response functions with highly skewed exposure distributions. 22 refs., 14 tabs
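
    For readers unfamiliar with the setting, a compact sketch (illustrative, not the report's code): binary outcomes follow a linear relative risk p(d) = p0(1 + beta*d) with a skewed dose distribution, and a likelihood-ratio confidence interval for beta is read off a grid. The baseline risk p0 is treated as known to keep the sketch short.

        import math
        import random

        def loglik(beta, p0, doses, cases):
            """Binomial log-likelihood under the linear relative risk model."""
            ll = 0.0
            for d, y in zip(doses, cases):
                p = min(max(p0 * (1.0 + beta * d), 1e-12), 1.0 - 1e-12)
                ll += math.log(p) if y else math.log(1.0 - p)
            return ll

        rng = random.Random(0)
        p0, beta_true = 0.05, 0.4
        doses = [rng.expovariate(1.0) for _ in range(2000)]   # highly skewed doses
        cases = [rng.random() < p0 * (1.0 + beta_true * d) for d in doses]

        grid = [i * 0.01 for i in range(-90, 301)]            # candidate beta values
        lls = [loglik(b, p0, doses, cases) for b in grid]
        ll_max = max(lls)
        # 95% likelihood-ratio interval: keep beta with 2*(max - ll) <= chi2(1) = 3.84
        ci = [b for b, ll in zip(grid, lls) if 2.0 * (ll_max - ll) <= 3.84]
        print("95% CI for beta:", min(ci), "to", max(ci))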

  5. Simulation of Tailrace Hydrodynamics Using Computational Fluid Dynamics Models; FINAL

    International Nuclear Information System (INIS)

    Cook, Chris B; Richmond, Marshall C

    2001-01-01

    This report investigates the feasibility of using computational fluid dynamics (CFD) tools to investigate hydrodynamic flow fields surrounding the tailrace zone below large hydraulic structures. Previous and ongoing studies using CFD tools to simulate gradually varied flow with multiple constituents and forebay/intake hydrodynamics have shown that CFD tools can provide valuable information for hydraulic and biological evaluation of fish passage near hydraulic structures. These studies, however, are incapable of simulating the rapidly varying flow fields that involve breakup of the free surface, such as those through and below high-flow outfalls and spillways. Although the use of CFD tools for these types of flow is still an active area of research, initial applications discussed in this report show that these tools are capable of simulating the primary features of these highly transient flow fields.

  6. Simulation of Tailrace Hydrodynamics Using Computational Fluid Dynamics Models

    Energy Technology Data Exchange (ETDEWEB)

    Cook, Christopher B.; Richmond, Marshall C.

    2001-05-01

    This report investigates the feasibility of using computational fluid dynamics (CFD) tools to investigate hydrodynamic flow fields surrounding the tailrace zone below large hydraulic structures. Previous and ongoing studies using CFD tools to simulate gradually varied flow with multiple constituents and forebay/intake hydrodynamics have shown that CFD tools can provide valuable information for hydraulic and biological evaluation of fish passage near hydraulic structures. These studies, however, are incapable of simulating the rapidly varying flow fields that involve breakup of the free surface, such as those through and below high-flow outfalls and spillways. Although the use of CFD tools for these types of flow is still an active area of research, initial applications discussed in this report show that these tools are capable of simulating the primary features of these highly transient flow fields.

  7. Quantifying Uncertainty from Computational Factors in Simulations of a Model Ballistic System

    Science.gov (United States)

    2017-08-01

    Quantifying Uncertainty from Computational Factors in Simulations of a Model Ballistic System, by Daniel J Hornbaker, Weapons and Materials Research ... November 2016. Approved for public release; distribution is unlimited.

  8. Computer modeling and simulation in inertial confinement fusion

    International Nuclear Information System (INIS)

    McCrory, R.L.; Verdon, C.P.

    1989-03-01

    The complex hydrodynamic and transport processes associated with the implosion of an inertial confinement fusion (ICF) pellet place considerable demands on numerical simulation programs. Processes associated with implosion can usually be described using relatively simple models, but their complex interplay requires that programs model most of the relevant physical phenomena accurately. Most hydrodynamic codes used in ICF incorporate a one-fluid, two-temperature model. Electrons and ions are assumed to flow as one fluid (no charge separation). Due to the relatively weak coupling between the ions and electrons, each species is treated separately in terms of its temperature. In this paper we describe some of the major components associated with an ICF hydrodynamics simulation code. To serve as an example we draw heavily on a two-dimensional Lagrangian hydrodynamic code (ORCHID) written at the University of Rochester's Laboratory for Laser Energetics. 46 refs., 19 figs., 1 tab

  9. The design and calibration of a simulation model of a star computer network

    CERN Document Server

    Gomaa, H

    1982-01-01

    A simulation model of the CERN (European Organization for Nuclear Research) SPS star computer network is described. The model concentrates on simulating the message handling computer, through which all messages in the network pass. The paper describes the main features of the model, the transfer time parameters in the model, and how performance measurements were used to assist in the calibration of the model.

  10. Dilbert-Peter model of organization effectiveness: computer simulations

    OpenAIRE

    Sobkowicz, Pawel

    2010-01-01

    We describe a computer model of general effectiveness of a hierarchical organization depending on two main aspects: effects of promotion to managerial levels and efforts to self-promote of individual employees, reducing their actual productivity. The combination of judgment by appearance in the promotion to higher levels of hierarchy and the Peter Principle (which states that people are promoted to their level of incompetence) results in fast declines in effectiveness of the organization. The...

  11. Computational modeling, optimization and manufacturing simulation of advanced engineering materials

    CERN Document Server

    2016-01-01

    This volume presents recent research work focused on the development of adequate theoretical and numerical formulations to describe the behavior of advanced engineering materials. Particular emphasis is devoted to applications in the fields of biological tissues, phase-changing and porous materials, polymers, and micro/nano-scale modeling. Sensitivity analysis and gradient- and non-gradient-based optimization procedures are involved in many of the chapters, aiming at the solution of constitutive inverse problems and parameter identification. All these relevant topics are presented by experienced international and inter-institutional research teams, resulting in a high-level compilation. The book is a valuable research reference for scientists, senior undergraduate and graduate students, as well as for engineers acting in the area of computational material modeling.

  12. Math modeling and computer mechanization for real time simulation of rotary-wing aircraft

    Science.gov (United States)

    Howe, R. M.

    1979-01-01

    Mathematical modeling and computer mechanization for the real-time simulation of rotary-wing aircraft are discussed. Error analysis in the digital simulation of dynamic systems, such as rotary-wing aircraft, is described. The method for digital simulation of nonlinearities with discontinuities, such as those that exist in typical flight control systems and rotor blade hinges, is discussed.

  13. Modelling physics detectors in a computer aided design system for simulation purposes

    International Nuclear Information System (INIS)

    Ahvenainen, J.; Oksakivi, T.; Vuoskoski, J.

    1995-01-01

    The possibility of transferring physics detector models from computer aided design systems into physics simulation packages like GEANT is receiving increasing attention. The problem of exporting detector models constructed in CAD systems into GEANT is well known. We discuss the problem and describe an application, called DDT, which allows one to design detector models in a CAD system and then transfer the models into GEANT for simulation purposes. (orig.)

  14. Exploring Students' Computational Thinking Skills in Modeling and Simulation Projects: : A Pilot Study

    NARCIS (Netherlands)

    Grgurina, Natasa; van Veen, Klaas; Barendsen, Erik; Zwaneveld, Bert; Suhre, Cor; Gal-Ezer, Judith; Sentance, Sue; Vahrenhold, Jan

    2015-01-01

    Computational Thinking (CT) is gaining a lot of attention in education. We explored how to discern the occurrences of CT in the projects of 12th grade high school students in the computer science (CS) course. Within the projects, they constructed models and ran simulations of phenomena from other

  15. Simulating Serious Games: A Discrete-Time Computational Model Based on Cognitive Flow Theory

    Science.gov (United States)

    Westera, Wim

    2018-01-01

    This paper presents a computational model for simulating how people learn from serious games. While avoiding the combinatorial explosion of a game's micro-states, the model offers a meso-level pathfinding approach, which is guided by cognitive flow theory and various concepts from learning sciences. It extends a basic, existing model by exposing…

  16. Computer simulation modeling of recreation use: Current status, case studies, and future directions

    Science.gov (United States)

    David N. Cole

    2005-01-01

    This report compiles information about recent progress in the application of computer simulation modeling to planning and management of recreation use, particularly in parks and wilderness. Early modeling efforts are described in a chapter that provides an historical perspective. Another chapter provides an overview of modeling options, common data input requirements,...

  17. A Generalized Computer Simulation Language for Naval Systems Modeling.

    Science.gov (United States)

    1981-06-30

    A. B., "GASP", Enclopedia of Computer Science and Technology, J. Belzer, A. G. Holzman , and A. Kent, Editors, Vol. 8, Marcel Dekker, Inc., 1977. 39...Blacksburg, VA 24061 David Taylor Naval Ship Research & Development Center (DTNSRDC) Dr. A. Alan B. Pritsker Carderock Department of Industrial Engineering

  18. Interactive virtual simulation using a 3D computer graphics model for microvascular decompression surgery.

    Science.gov (United States)

    Oishi, Makoto; Fukuda, Masafumi; Hiraishi, Tetsuya; Yajima, Naoki; Sato, Yosuke; Fujii, Yukihiko

    2012-09-01

    The purpose of this paper is to report on the authors' advanced presurgical interactive virtual simulation technique using a 3D computer graphics model for microvascular decompression (MVD) surgery. The authors performed interactive virtual simulation prior to surgery in 26 patients with trigeminal neuralgia or hemifacial spasm. The 3D computer graphics models for interactive virtual simulation were composed of the brainstem, cerebellum, cranial nerves, vessels, and skull, individually created by image analysis, including segmentation, surface rendering, and data fusion, for data collected by 3-T MRI and 64-row multidetector CT systems. Interactive virtual simulation was performed by employing novel computer-aided design software with manipulation of a haptic device to imitate the surgical procedures of bone drilling and retraction of the cerebellum. The findings were compared with intraoperative findings. In all patients, interactive virtual simulation provided detailed and realistic surgical perspectives, of sufficient quality, representing the lateral suboccipital route. The causes of trigeminal neuralgia or hemifacial spasm determined by observing 3D computer graphics models were concordant with those identified intraoperatively in 25 (96%) of 26 patients, which was a significantly higher rate than the 73% concordance rate (concordance in 19 of 26 patients) obtained by review of 2D images only (p ...). The 3D computer graphics model provided a realistic environment for performing virtual simulations prior to MVD surgery and enabled us to ascertain complex microsurgical anatomy.

  19. Blast Load Simulator Experiments for Computational Model Validation: Report 1

    Science.gov (United States)

    2016-08-01

    ... to 2 psi) related to failures of conventional annealed glass and hollow concrete masonry unit walls. It can also simulate higher blast pressures for ... (Army, Air Force, Navy, and Defense Special Weapons Agency 1998; Hyde 2003) calculations were conducted to produce a waveform that matched both peak ... the structures located downstream of the cascade section of the BLS.

  20. 9th Annual Conference of the North East Polytechnics Mathematical Modelling & Computer Simulation Group

    CERN Document Server

    Bradley, R

    1987-01-01

    In recent years, mathematical modelling allied to computer simulation has emerged as an effective and invaluable design tool for industry and a discipline in its own right. This has been reflected in the popularity of the growing number of courses and conferences devoted to the area. The North East Polytechnics Mathematical Modelling and Computer Simulation Group has a balanced representation of academics and industrialists and, as a Group, has the objective of promoting a continuing partnership between the Polytechnics in the North East and local industry. Prior to the present conference the Group has organised eight conferences with a variety of themes related to mathematical modelling and computer simulation. The theme chosen for the Polymodel 9 Conference held in Newcastle upon Tyne in May 1986 was Industrial Vibration Modelling, which is particularly appropriate for 'Industry Year' and is an area which continues to present industry and academics with new and challenging problems. The aim of the Conferen...

  1. Quasi-Monte Carlo simulation and variance reduction techniques substantially reduce computational requirements of patient-level simulation models: An application to a discrete event simulation model

    NARCIS (Netherlands)

    Treur, M.; Postma, M.

    2014-01-01

    Objectives: Patient-level simulation models provide increased flexibility to overcome the limitations of cohort-based approaches in health-economic analysis. However, the computational requirements of reaching convergence are a notorious barrier. The objective was to assess the impact of using
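
    The abstract is truncated before the methods are named, but as a generic illustration of the kind of variance reduction technique the title refers to, the following is a minimal antithetic-variates sketch on an invented patient-level cost model (the cost function, sample size, and figures are all hypothetical, not from the paper):

    ```python
    # Antithetic variates sketch on a toy patient-level cost model.
    # All parameter values are invented for illustration.
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(42)

    def patient_cost(u):
        """Toy cost model (EUR): inverse-transform a uniform draw into a
        lognormal-shaped patient cost."""
        return 10_000 * np.exp(0.5 * norm.ppf(u))

    n = 50_000
    u = np.clip(rng.random(n), 1e-12, 1 - 1e-12)  # avoid ppf(0)/ppf(1)

    mc = patient_cost(u)                                   # plain Monte Carlo
    av = 0.5 * (patient_cost(u) + patient_cost(1.0 - u))   # antithetic pairs
    # Each antithetic pair costs two model evaluations, but the negative
    # correlation within a pair shrinks the variance of the pair average.

    print(f"MC mean {mc.mean():9.2f}, std error {mc.std(ddof=1) / np.sqrt(n):7.2f}")
    print(f"AV mean {av.mean():9.2f}, std error {av.std(ddof=1) / np.sqrt(n):7.2f}")
    ```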

  2. Automatic Model Generation Framework for Computational Simulation of Cochlear Implantation

    DEFF Research Database (Denmark)

    Mangado Lopez, Nerea; Ceresa, Mario; Duchateau, Nicolas

    2016-01-01

    To address such a challenge, we propose an automatic framework for the generation of patient-specific meshes for finite element modeling of the implanted cochlea. First, a statistical shape model is constructed from high-resolution anatomical μCT images. Then, by fitting the statistical model to a patient's CT image, an accurate model of the patient-specific cochlea anatomy is obtained. An algorithm based on the parallel transport frame is employed to perform the virtual insertion of the cochlear implant. Our automatic framework also incorporates the surrounding bone and nerve fibers and assigns...

  3. Mathematical and computational modeling simulation of solar drying Systems

    Science.gov (United States)

    Mathematical modeling of solar drying systems has the primary aim of predicting the required drying time for a given commodity, dryer type, and environment. Both fundamental (Fickian diffusion) and semi-empirical drying models have been applied to the solar drying of a variety of agricultural commo...

  4. Computational modelling and simulation of the elder tumble

    Science.gov (United States)

    Han, Jia; Han, Chunying

    2011-12-01

    The Hanavan fifteen-rigid-body human model was simplified into a six-rigid-body model, and a six-degrees-of-freedom Kane dynamic model was then set up. With the human body parameters and muscle parameters, the six-degrees-of-freedom Kane formula and restriction conditions were used to obtain the tumble movement status of an elderly person whose feet are stopped suddenly. The initial rotation speed of each part of the body can be calculated. After that, the tumble movement was simulated and the impact force from the ground surface was obtained.

  5. Quantification of remodeling parameter sensitivity - assessed by a computer simulation model

    DEFF Research Database (Denmark)

    Thomsen, J.S.; Mosekilde, Li.; Mosekilde, Erik

    1996-01-01

    We have used a computer simulation model to evaluate the effect of several bone remodeling parameters on vertebral cancellous bone. The menopause was chosen as the base case scenario, and the sensitivity of the model to the following parameters was investigated: activation frequency, formation balance... However, the formation balance was responsible for the greater part of total mass loss.

  6. Mechanical Modeling and Computer Simulation of Protein Folding

    Science.gov (United States)

    Prigozhin, Maxim B.; Scott, Gregory E.; Denos, Sharlene

    2014-01-01

    In this activity, science education and modern technology are bridged to teach students at the high school and undergraduate levels about protein folding and to strengthen their model building skills. Students are guided from a textbook picture of a protein as a rigid crystal structure to a more realistic view: proteins are highly dynamic…

  7. Using Computer Simulations for Promoting Model-based Reasoning. Epistemological and Educational Dimensions

    Science.gov (United States)

    Develaki, Maria

    2017-11-01

    Scientific reasoning is particularly pertinent to science education since it is closely related to the content and methodologies of science and contributes to scientific literacy. Much of the research in science education investigates the appropriate framework and teaching methods and tools needed to promote students' ability to reason and evaluate in a scientific way. This paper aims (a) to contribute to an extended understanding of the nature and pedagogical importance of model-based reasoning and (b) to exemplify how using computer simulations can support students' model-based reasoning. We provide first a background for both scientific reasoning and computer simulations, based on the relevant philosophical views and the related educational discussion. This background suggests that the model-based framework provides an epistemologically valid and pedagogically appropriate basis for teaching scientific reasoning and for helping students develop sounder reasoning and decision-taking abilities and explains how using computer simulations can foster these abilities. We then provide some examples illustrating the use of computer simulations to support model-based reasoning and evaluation activities in the classroom. The examples reflect the procedure and criteria for evaluating models in science and demonstrate the educational advantages of their application in classroom reasoning activities.

  8. Comprehensive Modeling and Visualization of Cardiac Anatomy and Physiology from CT Imaging and Computer Simulations.

    Science.gov (United States)

    Xiong, Guanglei; Sun, Peng; Zhou, Haoyin; Ha, Seongmin; Hartaigh, Briain O; Truong, Quynh A; Min, James K

    2017-02-01

    In clinical cardiology, both anatomy and physiology are needed to diagnose cardiac pathologies. CT imaging and computer simulations provide valuable and complementary data for this purpose. However, it remains challenging to gain useful information from the large amount of high-dimensional diverse data. The current tools are not adequately integrated to visualize anatomic and physiologic data from a complete yet focused perspective. We introduce a new computer-aided diagnosis framework, which allows for comprehensive modeling and visualization of cardiac anatomy and physiology from CT imaging data and computer simulations, with a primary focus on ischemic heart disease. The following visual information is presented: (1) Anatomy from CT imaging: geometric modeling and visualization of cardiac anatomy, including four heart chambers, left and right ventricular outflow tracts, and coronary arteries; (2) Function from CT imaging: motion modeling, strain calculation, and visualization of four heart chambers; (3) Physiology from CT imaging: quantification and visualization of myocardial perfusion and contextual integration with coronary artery anatomy; (4) Physiology from computer simulation: computation and visualization of hemodynamics (e.g., coronary blood velocity, pressure, shear stress, and fluid forces on the vessel wall). Feedback from cardiologists has confirmed the practical utility of integrating these features for the purpose of computer-aided diagnosis of ischemic heart disease.

  9. Computational model to simulate the interplay effect in dynamic IMRT delivery

    International Nuclear Information System (INIS)

    Yoganathan, S A; Maria Das, K J; Kumar, Shaleen

    2014-01-01

    The purpose of this study was to develop and experimentally verify a patient-specific model for simulating the interplay effect in a DMLC-based IMRT delivery. A computational model was developed using a MATLAB program to incorporate the interplay effect into a 2D beam's-eye-view fluence of dynamic IMRT fields. To simulate the interplay effect, the model requires two inputs: the IMRT field (DMLC file with dose rate and MU) and the patient-specific respiratory motion. The interplay between the DMLC leaf motion and the target was simulated for three lung patients. The target trajectory data were acquired using the RPM system during the treatment simulation. The model was verified experimentally for the same patients using an Imatrix 2D array device placed on a QUASAR motion platform in a CL2100 linac. The simulated and measured fluences were compared with the TPS-generated static fluence (no motion) using an in-house gamma evaluation program (2%/2mm). The simulated results agreed well with the measurements. Comparison of the simulated and measured fluences with the TPS static fluence resulted in 55.3% and 58.5% of pixels passing the gamma criteria, respectively. A patient-specific model was developed and validated for simulating the interplay effect in dynamic IMRT delivery. This model can be used clinically to quantify the dosimetric uncertainty due to the interplay effect prior to treatment delivery.
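
    As an illustration of the 2%/2mm gamma criterion used above, here is a minimal brute-force 2D gamma-index sketch (this is not the authors' in-house MATLAB program; the grid size and dose values are invented):

    ```python
    # Brute-force 2D gamma index; grid, doses and perturbation values are
    # illustrative, not the study's data.
    import numpy as np

    def gamma_index(ref, evl, spacing_mm=1.0, dd=0.02, dta_mm=2.0):
        """Global gamma: for each reference point, search the evaluated
        distribution for the minimum combined dose/distance penalty."""
        ny, nx = ref.shape
        ys, xs = np.meshgrid(np.arange(ny), np.arange(nx), indexing="ij")
        dmax = ref.max()
        gamma = np.empty_like(ref, dtype=float)
        for iy in range(ny):
            for ix in range(nx):
                dist2 = ((ys - iy)**2 + (xs - ix)**2) * spacing_mm**2
                dose2 = ((evl - ref[iy, ix]) / (dd * dmax))**2
                gamma[iy, ix] = np.sqrt(np.min(dist2 / dta_mm**2 + dose2))
        return gamma

    rng = np.random.default_rng(0)
    ref = np.clip(rng.normal(1.0, 0.1, (40, 40)), 0.0, None)
    evl = ref * rng.normal(1.0, 0.01, ref.shape)   # slightly perturbed copy
    g = gamma_index(ref, evl)
    print(f"2%/2mm pass rate: {100 * (g <= 1).mean():.1f}%")
    ```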

  10. Integrating surrogate models into subsurface simulation framework allows computation of complex reactive transport scenarios

    Science.gov (United States)

    De Lucia, Marco; Kempka, Thomas; Jatnieks, Janis; Kühn, Michael

    2017-04-01

    Reactive transport simulations - where geochemical reactions are coupled with hydrodynamic transport of reactants - are extremely time-consuming and suffer from significant numerical issues. Given the high uncertainties inherently associated with the geochemical models, which also constitute the major computational bottleneck, such requirements may seem inappropriate and probably constitute the main limitation for their wide application. A promising way to ease and speed up such coupled simulations is to employ statistical surrogates instead of "full-physics" geochemical models [1]. Data-driven surrogates are reduced models obtained from a set of pre-calculated "full physics" simulations, capturing their principal features while being extremely fast to compute. Model reduction of course comes at the price of a precision loss; however, this appears justified in the presence of large uncertainties regarding the parametrization of geochemical processes. This contribution illustrates the integration of surrogates into the flexible simulation framework currently being developed by the authors' research group [2]. The high-level language of choice for obtaining and dealing with surrogate models is R, which profits from state-of-the-art methods for statistical analysis of large simulation ensembles. A stand-alone advective mass transport module was furthermore developed in order to add such capability to any multiphase finite volume hydrodynamic simulator within the simulation framework. We present 2D and 3D case studies benchmarking the performance of surrogates and "full physics" chemistry in scenarios pertaining to the assessment of geological subsurface utilization. [1] Jatnieks, J., De Lucia, M., Dransch, D., Sips, M.: "Data-driven surrogate model approach for improving the performance of reactive transport simulations.", Energy Procedia 97, 2016, p. 447-453. [2] Kempka, T., Nakaten, B., De Lucia, M., Nakaten, N., Otto, C., Pohl, M., Chabab [Tillner], E., Kühn, M
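
    The paper builds its surrogates in R; purely as a toy illustration of the idea (train a cheap data-driven stand-in on pre-calculated "full physics" runs, then call it inside the transport loop), here is a Python sketch with an invented kinetics function:

    ```python
    # Data-driven surrogate sketch; the "full physics" function and all
    # parameter ranges are invented stand-ins.
    import numpy as np

    def full_physics_chemistry(c_in, t):
        """Stand-in for an expensive geochemical solver: toy first-order
        consumption with a small nonlinearity."""
        return c_in * np.exp(-0.8 * t) + 0.05 * np.sin(3.0 * c_in)

    # 1) Pre-calculate a "full physics" ensemble over the parameter range.
    rng = np.random.default_rng(1)
    c = rng.uniform(0.0, 1.0, 500)
    t = rng.uniform(0.0, 2.0, 500)
    y = full_physics_chemistry(c, t)

    # 2) Fit a cheap surrogate (least squares on polynomial features).
    X = np.column_stack([np.ones_like(c), c, t, c * t, c**2, t**2])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)

    def surrogate(c_in, t_in):
        x = np.column_stack([np.ones_like(c_in), c_in, t_in,
                             c_in * t_in, c_in**2, t_in**2])
        return x @ coef

    # 3) In the transport loop, call the surrogate instead of the solver.
    ct, tt = rng.uniform(0, 1, 1000), rng.uniform(0, 2, 1000)
    err = surrogate(ct, tt) - full_physics_chemistry(ct, tt)
    print(f"surrogate RMSE: {np.sqrt(np.mean(err**2)):.4f}")
    ```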

  11. Wave propagation simulation in normal and infarcted myocardium: computational and modelling issues

    NARCIS (Netherlands)

    Maglaveras, N.; van Capelle, F. J.; de Bakker, J. M.

    1998-01-01

    Simulation of propagating action potentials (PAP) in normal and abnormal myocardium is used for the understanding of mechanisms responsible for eliciting dangerous arrhythmias. One- and two-dimensional models dealing with PAP properties are reviewed in this paper, viewed both from the computational

  12. Towards an integrative computational model for simulating tumor growth and response to radiation therapy

    Science.gov (United States)

    Marrero, Carlos Sosa; Aubert, Vivien; Ciferri, Nicolas; Hernández, Alfredo; de Crevoisier, Renaud; Acosta, Oscar

    2017-11-01

    Understanding the response to irradiation in cancer radiotherapy (RT) may help devise new strategies with improved tumor local control. Computational models may allow us to unravel the underlying radiosensitive mechanisms intervening in the dose-response relationship. By using extensive simulations, a wide range of parameters may be evaluated, providing insights into tumor response and generating useful data to plan modified treatments. We propose in this paper a computational model of tumor growth and radiation response which allows a whole RT protocol to be simulated. Proliferation of tumor cells, the cell life-cycle, oxygen diffusion, radiosensitivity, RT response, and resorption of killed cells were implemented in a multiscale framework. The model was developed in C++, using the Multi-formalism Modeling and Simulation Library (M2SL). Radiosensitivity parameters extracted from the literature enabled us to simulate a prostate cell tissue in a regular grid (voxel-wise). Histopathological specimens with different aggressiveness levels extracted from patients after prostatectomy were used to initialize the in silico simulations. Results on tumor growth exhibit good agreement with data from in vitro studies. Moreover, a standard fractionation of 2 Gy/fraction with a total dose of 80 Gy, as in a real RT treatment, was applied with varying radiosensitivity and oxygen diffusion parameters. As expected, the high influence of these parameters was observed by measuring the percentage of surviving tumor cells after RT. This work paves the way to further models allowing increased doses in modified hypofractionated schemes to be simulated and new patient-specific combined therapies to be developed.

  13. A generic simulation cell method for developing extensible, efficient and readable parallel computational models

    Science.gov (United States)

    Honkonen, I.

    2015-03-01

    I present a method for developing extensible and modular computational models without sacrificing serial or parallel performance or source code readability. By using a generic simulation cell method I show that it is possible to combine several distinct computational models to run in the same computational grid without requiring modification of existing code. This is an advantage for the development and testing of, e.g., geoscientific software as each submodel can be developed and tested independently and subsequently used without modification in a more complex coupled program. An implementation of the generic simulation cell method presented here, generic simulation cell class (gensimcell), also includes support for parallel programming by allowing model developers to select which simulation variables of, e.g., a domain-decomposed model to transfer between processes via a Message Passing Interface (MPI) library. This allows the communication strategy of a program to be formalized by explicitly stating which variables must be transferred between processes for the correct functionality of each submodel and the entire program. The generic simulation cell class requires a C++ compiler that supports a version of the language standardized in 2011 (C++11). The code is available at https://github.com/nasailja/gensimcell for everyone to use, study, modify and redistribute; those who do are kindly requested to acknowledge and cite this work.

  14. The Watts-Strogatz network model developed by including degree distribution: theory and computer simulation

    International Nuclear Information System (INIS)

    Chen, Y W; Zhang, L F; Huang, J P

    2007-01-01

    By using theoretical analysis and computer simulations, we develop the Watts-Strogatz network model by including degree distribution, in an attempt to improve the comparison between characteristic path lengths and clustering coefficients predicted by the original Watts-Strogatz network model and those of the real networks with the small-world property. Good agreement between the predictions of the theoretical analysis and those of the computer simulations has been shown. It is found that the developed Watts-Strogatz network model can fit the real small-world networks more satisfactorily. Some other interesting results are also reported by adjusting the parameters in a model degree-distribution function. The developed Watts-Strogatz network model is expected to help in the future analysis of various social problems as well as financial markets with the small-world property
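
    For reference, the two quantities the paper compares - characteristic path length and clustering coefficient - can be reproduced for the original Watts-Strogatz model with a few lines of Python (the paper's degree-distribution extension is not implemented here; n, k and the p values are arbitrary choices):

    ```python
    # Path length L and clustering C of the original Watts-Strogatz model.
    import networkx as nx

    n, k = 1000, 10                      # nodes, ring-lattice degree
    for p in (0.0, 0.01, 0.1, 1.0):      # rewiring probability
        G = nx.connected_watts_strogatz_graph(n, k, p, seed=7)
        L = nx.average_shortest_path_length(G)
        C = nx.average_clustering(G)
        print(f"p={p:<5} L={L:6.2f}  C={C:.3f}")
    ```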

  15. Efficiency using computer simulation of Reverse Threshold Model Theory on assessing a “One Laptop Per Child” computer versus desktop computer

    Directory of Open Access Journals (Sweden)

    Supat Faarungsang

    2017-04-01

    The Reverse Threshold Model Theory (RTMT) model was introduced based on limiting factor concepts, but its efficiency compared to the Conventional Model (CM) has not been published. This investigation assessed the efficiency of RTMT compared to CM using computer simulation on a "One Laptop Per Child" computer and a desktop computer. Based on probability values, it was found that RTMT was more efficient than CM among eight treatment combinations, and an earlier study verified that RTMT gives complete elimination of random error. Furthermore, RTMT has several advantages over CM and is therefore proposed to be applied to most research data.

  16. Development of a Computational Simulation Model for Conflict Management in Team Building

    Directory of Open Access Journals (Sweden)

    W. M. Wang

    2011-05-01

    Conflict management is one of the most important issues in leveraging organizational competitiveness. However, the theories and models that traditional social scientists have built in this area are mostly expressed in words and diagrams, which is insufficient. Social science research based on computational modeling and simulation is beginning to augment traditional theory building. Simulation provides a method for people to try their actions out in a way that is cost-effective, fast, appropriate, flexible, and ethical. In this paper, a computational simulation model for conflict management in team building is presented. The model is designed and used to explore the individual performances related to the combination of individuals who have a range of conflict-handling styles, under various types of resources and policies. The model is developed using the agent-based modeling method. Each agent has one of five conflict-handling styles: accommodation, compromise, competition, contingency, and learning. There are three types of scenarios: normal, convex, and concave. There are two types of policies: no policy, and a reward and punishment policy. Results from running the model are also presented. The simulation has led us to derive two implications concerning conflict management. First, a concave type of resource promotes competition, while a convex type of resource promotes compromise and collaboration. Second, the performance ranking of different styles can be influenced by introducing different policies, so it is possible to promote a certain style through policy choice.
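
    A minimal agent-based sketch of this kind of model is shown below. The claim parameters and resource-to-payoff curves are invented, so the rankings this toy produces should not be read as reproducing the paper's findings:

    ```python
    # Toy agent-based conflict-style model; claim values and payoff curves
    # are invented for illustration only.
    import numpy as np

    rng = np.random.default_rng(3)
    STYLES = ["accommodation", "compromise", "competition",
              "contingency", "learning"]
    # How much of a contested resource each style claims (0=yield, 1=all).
    CLAIM = {"accommodation": 0.2, "compromise": 0.5, "competition": 0.9,
             "contingency": 0.6, "learning": 0.4}

    def payoff(share, scenario):
        """Map an obtained share to a payoff under a resource curve."""
        if scenario == "concave":
            return 4 * share * (1 - share)   # rewards moderate shares
        if scenario == "convex":
            return (2 * share - 1) ** 2      # rewards extreme shares
        return share                         # 'normal': linear

    def run(scenario, rounds=20_000):
        agents = rng.choice(STYLES, size=50)
        score = {s: 0.0 for s in STYLES}
        for _ in range(rounds):
            a, b = rng.choice(agents, size=2, replace=False)
            share_a = CLAIM[a] / (CLAIM[a] + CLAIM[b])  # proportional split
            score[a] += payoff(share_a, scenario)
            score[b] += payoff(1 - share_a, scenario)
        return score

    for scenario in ("normal", "convex", "concave"):
        score = run(scenario)
        print(scenario, "->", sorted(score, key=score.get, reverse=True))
    ```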

  17. Scientific computer simulation review

    International Nuclear Information System (INIS)

    Kaizer, Joshua S.; Heller, A. Kevin; Oberkampf, William L.

    2015-01-01

    Before the results of a scientific computer simulation are used for any purpose, it should be determined if those results can be trusted. Answering that question of trust is the domain of scientific computer simulation review. There is limited literature that focuses on simulation review, and most is specific to the review of a particular type of simulation. This work is intended to provide a foundation for a common understanding of simulation review. This is accomplished through three contributions. First, scientific computer simulation review is formally defined. This definition identifies the scope of simulation review and provides the boundaries of the review process. Second, maturity assessment theory is developed. This development clarifies the concepts of maturity criteria, maturity assessment sets, and maturity assessment frameworks, which are essential for performing simulation review. Finally, simulation review is described as the application of a maturity assessment framework. This is illustrated through evaluating a simulation review performed by the U.S. Nuclear Regulatory Commission. In making these contributions, this work provides a means for a more objective assessment of a simulation’s trustworthiness and takes the next step in establishing scientific computer simulation review as its own field.

  18. Combining computational models, semantic annotations and simulation experiments in a graph database

    Science.gov (United States)

    Henkel, Ron; Wolkenhauer, Olaf; Waltemath, Dagmar

    2015-01-01

    Model repositories such as the BioModels Database, the CellML Model Repository or JWS Online are frequently accessed to retrieve computational models of biological systems. However, their storage concepts support only restricted types of queries and not all data inside the repositories can be retrieved. In this article we present a storage concept that meets this challenge. It is grounded in a graph database, reflects the models’ structure, incorporates semantic annotations and simulation descriptions and ultimately connects different types of model-related data. The connections between heterogeneous model-related data and bio-ontologies enable efficient search via biological facts and grant access to new model features. The introduced concept notably improves the access of computational models and associated simulations in a model repository. This has positive effects on tasks such as model search, retrieval, ranking, matching and filtering. Furthermore, our work for the first time enables CellML- and Systems Biology Markup Language-encoded models to be effectively maintained in one database. We show how these models can be linked via annotations and queried. Database URL: https://sems.uni-rostock.de/projects/masymos/ PMID:25754863

  19. Features of development and analysis of the simulation model of a multiprocessor computer system

    Directory of Open Access Journals (Sweden)

    O. M. Brekhov

    2017-01-01

    Over the past decade, multiprocessor systems have become widespread in computer technology. At present, not only supercomputers but also the vast majority of mobile devices are equipped with multi-core processors. This creates the need for students to learn the basic principles of their construction and functioning. One possible method for analyzing the operation of multiprocessor systems is simulation modeling. Its use contributes to a better understanding of the effect of workload and structure parameters on performance. The article considers the features of developing a simulation model for estimating the time characteristics of a multiprocessor computer system, as well as the use of the regenerative method of model analysis. The software implementation of the inverse kinematics solution for a robot is adopted as the workload. The task consists of determining the joint rotations of the manipulator from the known angular and linear position of its gripper. An analytical algorithm, the method of simple kinematic relations, was chosen for solving the problem. The program is characterized by parallel calculations, during which resource conflicts arise between the processor cores involved in simultaneous access to memory via a common bus. Because of the high information connectivity between parallel running programs, it is assumed that all processing cores use shared memory. The simulation model takes into account probabilistic memory accesses and tracks emerging queues to shared resources. The collected statistics reveal the productive and overhead time costs of the program implementation for each processor core involved. The simulation results show the unevenness of core utilization, downtime in queues to shared resources, and time lost while waiting for other cores due to information dependencies. The results of the simulation are estimated by the

  20. The Nuclear Energy Advanced Modeling and Simulation Enabling Computational Technologies FY09 Report

    Energy Technology Data Exchange (ETDEWEB)

    Diachin, L F; Garaizar, F X; Henson, V E; Pope, G

    2009-10-12

    In this document we report on the status of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Enabling Computational Technologies (ECT) effort. In particular, we provide the context for ECT in the broader NEAMS program and describe the three pillars of the ECT effort, namely, (1) tools and libraries, (2) software quality assurance, and (3) computational facility (computers, storage, etc.) needs. We report on our FY09 deliverables to determine the needs of the integrated performance and safety codes (IPSCs) in these three areas and lay out the general plan for software quality assurance to meet the requirements of DOE and the DOE Advanced Fuel Cycle Initiative (AFCI). We conclude with a brief description of our interactions with the Idaho National Laboratory computer center to determine what is needed to expand their role as a NEAMS user facility.

  1. Parallel reservoir simulator computations

    International Nuclear Information System (INIS)

    Hemanth-Kumar, K.; Young, L.C.

    1995-01-01

    The adaptation of a reservoir simulator for parallel computations is described. The simulator was originally designed for vector processors. It performs approximately 99% of its calculations in vector/parallel mode and relative to scalar calculations it achieves speedups of 65 and 81 for black oil and EOS simulations, respectively on the CRAY C-90

  2. Simulation of windblown dust transport from a mine tailings impoundment using a computational fluid dynamics model

    Science.gov (United States)

    Stovern, Michael; Felix, Omar; Csavina, Janae; Rine, Kyle P.; Russell, MacKenzie R.; Jones, Robert M.; King, Matt; Betterton, Eric A.; Sáez, A. Eduardo

    2014-09-01

    Mining operations are potential sources of airborne particulate metal and metalloid contaminants through both direct smelter emissions and wind erosion of mine tailings. The warmer, drier conditions predicted for the Southwestern US by climate models may make contaminated atmospheric dust and aerosols increasingly important, due to potential deleterious effects on human health and ecology. Dust emissions and dispersion of dust and aerosol from the Iron King Mine tailings in Dewey-Humboldt, Arizona, a Superfund site, are currently being investigated through in situ field measurements and computational fluid dynamics modeling. These tailings are heavily contaminated with lead and arsenic. Using a computational fluid dynamics model, we model dust transport from the mine tailings to the surrounding region. The model includes gaseous plume dispersion to simulate the transport of the fine aerosols, while individual particle transport is used to track the trajectories of larger particles and to monitor their deposition locations. In order to improve the accuracy of the dust transport simulations, both regional topographical features and local weather patterns have been incorporated into the model simulations. Results show that local topography and wind velocity profiles are the major factors that control deposition.
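
    The particle-tracking component described above can be caricatured in a few lines: a particle advected by a logarithmic wind profile while settling at its Stokes velocity, deposited where it reaches the ground. All parameter values below are generic assumptions, not the site-specific inputs of the study:

    ```python
    # Toy dust-particle trajectory: log-law wind plus Stokes settling.
    # All constants are generic assumptions, not the study's inputs.
    import numpy as np

    KAPPA, U_STAR, Z0 = 0.4, 0.35, 0.005   # von Karman const., u* (m/s), roughness (m)
    RHO_P, MU, G = 2650.0, 1.8e-5, 9.81    # particle density, air viscosity, gravity

    def settling_velocity(d):
        """Stokes terminal settling velocity for particle diameter d (m)."""
        return RHO_P * G * d**2 / (18 * MU)

    def transport_distance(d, z_release=10.0, dt=0.01):
        """Integrate a trajectory until the particle deposits (z <= z0)."""
        x, z, ws = 0.0, z_release, settling_velocity(d)
        while z > Z0:
            u = (U_STAR / KAPPA) * np.log(z / Z0)   # wind speed at height z
            x += u * dt
            z -= ws * dt
        return x

    for d_um in (10, 30, 100):
        print(f"{d_um:4d} um particle deposits ~"
              f"{transport_distance(d_um * 1e-6):8.0f} m downwind")
    ```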

  3. Benchmarking computational fluid dynamics models of lava flow simulation for hazard assessment, forecasting, and risk management

    Science.gov (United States)

    Dietterich, Hannah; Lev, Einat; Chen, Jiangzhi; Richardson, Jacob A.; Cashman, Katharine V.

    2017-01-01

    Numerical simulations of lava flow emplacement are valuable for assessing lava flow hazards, forecasting active flows, designing flow mitigation measures, interpreting past eruptions, and understanding the controls on lava flow behavior. Existing lava flow models vary in simplifying assumptions, physics, dimensionality, and the degree to which they have been validated against analytical solutions, experiments, and natural observations. In order to assess existing models and guide the development of new codes, we conduct a benchmarking study of computational fluid dynamics (CFD) models for lava flow emplacement, including VolcFlow, OpenFOAM, FLOW-3D, COMSOL, and MOLASSES. We model viscous, cooling, and solidifying flows over horizontal planes, sloping surfaces, and into topographic obstacles. We compare model results to physical observations made during well-controlled analogue and molten basalt experiments, and to analytical theory when available. Overall, the models accurately simulate viscous flow with some variability in flow thickness where flows intersect obstacles. OpenFOAM, COMSOL, and FLOW-3D can each reproduce experimental measurements of cooling viscous flows, and OpenFOAM and FLOW-3D simulations with temperature-dependent rheology match results from molten basalt experiments. We assess the goodness-of-fit of the simulation results and the computational cost. Our results guide the selection of numerical simulation codes for different applications, including inferring emplacement conditions of past lava flows, modeling the temporal evolution of ongoing flows during eruption, and probabilistic assessment of lava flow hazard prior to eruption. Finally, we outline potential experiments and desired key observational data from future flows that would extend existing benchmarking data sets.

  4. A Computation Fluid Dynamic Model for Gas Lift Process Simulation in a Vertical Oil Well

    Directory of Open Access Journals (Sweden)

    Kadivar Arash

    2017-03-01

    Continuous gas-lift in a typical oil well was simulated using the computational fluid dynamic (CFD) technique. A multi-fluid model based on the momentum transfer between liquid and gas bubbles was employed to simulate two-phase flow in a vertical pipe. The accuracy of the model was investigated through comparison of numerical predictions with experimental data. The model was then used to study the dynamic behaviour of the two-phase flow around the injection point in detail. The predictions of the model were compared with empirical correlations as well. To obtain an optimum gas-lift condition, the influence of the effective parameters, including the quantity of injected gas, tubing diameter and bubble size distribution, was investigated. The results revealed that increasing the tubing diameter and the injected gas rate and decreasing the bubble diameter improve gas-lift performance.

  5. A Computation Fluid Dynamic Model for Gas Lift Process Simulation in a Vertical Oil Well

    Science.gov (United States)

    Kadivar, Arash; Lay, Ebrahim Nemati

    2017-03-01

    Continuous gas-lift in a typical oil well was simulated using the computational fluid dynamic (CFD) technique. A multi-fluid model based on the momentum transfer between liquid and gas bubbles was employed to simulate two-phase flow in a vertical pipe. The accuracy of the model was investigated through comparison of numerical predictions with experimental data. The model was then used to study the dynamic behaviour of the two-phase flow around the injection point in detail. The predictions of the model were compared with empirical correlations as well. To obtain an optimum gas-lift condition, the influence of the effective parameters, including the quantity of injected gas, tubing diameter and bubble size distribution, was investigated. The results revealed that increasing the tubing diameter and the injected gas rate and decreasing the bubble diameter improve gas-lift performance.

  6. Computer simulation of Gumboro disease outbreak. II. Results obtained with models G-1 and G-2.

    Science.gov (United States)

    Takizawa, T; Ito, T; Kosuge, M; Tanaka, T; Mizumura, Y

    1978-01-01

    The authors conducted a computer simulation with their Models G-1 and G-2 for Gumboro disease, ten times under each of the following initial conditions: (1) size of population, 50, 100, and 1,000 chickens; (2) age of housing, 1, 7, 14, and 21 days; (3) nine levels of parentally conferred immunity in one-day-old chicks; (4) four levels of virus contamination; and (5) three steps of the coefficient for aggravating morbid status. Every simulation was run up to the age at which all the birds of a flock had become insusceptible, so as to yield the daily numbers of chickens (1) susceptible, (2) diseased, (3) immunized, and (4) removed, and (5) the accumulation of diseased chickens. The innate resistance, parentally conferred immunity, virus contamination, and morbid status were expressed in such values that they could be compared with one another. As a result, Model G-2 produced a more realistic epizootic pattern than Model G-1, but both models concealed the effect of differences in size of population and in age of housing. Notwithstanding the incompleteness of the models, the computer simulation gave valuable information for further advancement in this series of studies.

  7. HIGH-FIDELITY SIMULATION-DRIVEN MODEL DEVELOPMENT FOR COARSE-GRAINED COMPUTATIONAL FLUID DYNAMICS

    Energy Technology Data Exchange (ETDEWEB)

    Hanna, Botros N.; Dinh, Nam T.; Bolotnov, Igor A.

    2016-06-01

    Nuclear reactor safety analysis requires identifying various credible accident scenarios and determining their consequences. For full-scale nuclear power plant system behavior, it is impossible to obtain sufficient experimental data for a broad range of risk-significant accident scenarios. In single-phase flow convective problems, Direct Numerical Simulation (DNS) and Large Eddy Simulation (LES) can provide high-fidelity results when physical data are unavailable. However, these methods are computationally expensive and cannot be afforded for simulation of long transient scenarios in nuclear accidents, despite extraordinary advances in high performance scientific computing over the past decades. The major issue is the inability to parallelize the transient computation, which makes the number of time steps required by high-fidelity methods unaffordable for long transients. In this work, we propose to apply a high-fidelity simulation-driven approach to model sub-grid scale (SGS) effects in Coarse Grained Computational Fluid Dynamics (CG-CFD). This approach aims to develop a statistical surrogate model instead of a deterministic SGS model. We chose to start with a turbulent natural convection case with volumetric heating in a horizontal fluid layer with a rigid, insulated lower boundary and an isothermal (cold) upper boundary. This scenario of unstable stratification is relevant to turbulent natural convection in a molten corium pool during a severe nuclear reactor accident, as well as in containment mixing and passive cooling. The presented approach demonstrates how to create a correction for the CG-CFD solution by modifying the energy balance equation. A global correction for the temperature equation achieves a significant improvement in the prediction of the steady-state temperature distribution through the fluid layer.

  8. Computational and Simulation Modeling of Political Attitudes: The 'Tiger' Area of Political Culture Research

    Directory of Open Access Journals (Sweden)

    Voinea, Camelia Florela

    2016-01-01

    In its almost century-long history, political attitude modeling research has accumulated a critical mass of theory and method. Its characteristics and particularities have often suggested that the political attitude approach to political persuasion modeling reveals a strong theoretical autonomy of concept which entitles it to become a new separate discipline of research. Though this did not actually happen, political attitude modeling research has remained the most challenging area – the "tiger" – of political culture modeling research. This paper reviews the research literature on the conceptual, computational and simulation modeling of political attitudes developed from the beginning of the 20th century until the present times. Several computational and simulation modeling paradigms have provided support to political attitudes modeling research. These paradigms, and the shifts from one to another, are briefly presented for a period of almost one century. The dominant paradigmatic views are those inspired by Newtonian mechanics, and those based on the principle of methodological individualism and the emergence of macro phenomena from individual interactions at the micro level of a society. This period is divided into eight ages covering the history of ideas in a wide range of political domains, going from political attitudes to polity modeling. Internal and external pressures for paradigmatic change are briefly explained.

  9. Trends in Social Science: The Impact of Computational and Simulative Models

    Science.gov (United States)

    Conte, Rosaria; Paolucci, Mario; Cecconi, Federico

    This paper discusses current progress in the computational social sciences. Specifically, it examines the following questions: Are the computational social sciences exhibiting positive or negative developments? What are the roles of agent-based models and simulation (ABM), network analysis, and other "computational" methods within this dynamic? (Conte, The necessity of intelligent agents in social simulation, Advances in Complex Systems, 3(01n04), 19-38, 2000; Conte 2010; Macy, Annual Review of Sociology, 143-166, 2002). Are there objective indicators of scientific growth that can be applied to different scientific areas, allowing for comparison among them? In this paper, some answers to these questions are presented and discussed. In particular, comparisons among different disciplines in the social and computational sciences are shown, taking into account their respective growth trends in the number of publication citations over the last few decades (culled from Google Scholar). After a short discussion of the methodology adopted, results of keyword-based queries are presented, unveiling some unexpected local impacts of simulation on the takeoff of traditionally poorly productive disciplines.

  10. Multiscale models and stochastic simulation methods for computing rare but key binding events in cell biology

    Science.gov (United States)

    Guerrier, C.; Holcman, D.

    2017-07-01

    The main difficulty in simulating diffusion processes at a molecular level in cell microdomains is due to the multiple scales involved, from nano- to micrometers. Few to many particles have to be simulated and simultaneously tracked while they are exploring a large portion of the space in search of small binding targets, such as buffers or active sites. Bridging the small and large spatial scales is achieved by rare events representing Brownian particles finding small targets, characterized by long-time distributions. These rare events are the bottleneck of numerical simulations. A naive stochastic simulation requires running many Brownian particles together, which is computationally greedy and inefficient. Solving the associated partial differential equations is also difficult due to the time-dependent boundary conditions, narrow passages and mixed boundary conditions at small windows. We present here two reduced modeling approaches for fast computation of diffusing fluxes in microdomains. The first approach is based on Markov mass-action law equations coupled to a Markov chain. The second is a Gillespie method based on narrow escape theory for coarse-graining the geometry of the domain into Poissonian rates. The main application concerns diffusion in cellular biology, where we compute as an example the distribution of arrival times of calcium ions to small hidden targets that trigger vesicular release.
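
    To make the coarse-graining idea concrete, the sketch below folds the narrow-escape geometry into a single per-particle binding rate and runs a Gillespie-style simulation of successive binding events with depletion. The rate constant, ion count, and release threshold are invented:

    ```python
    # Gillespie-style simulation of successive binding events; k_bind folds
    # the narrow-escape geometry into one number, and all values are invented.
    import numpy as np

    rng = np.random.default_rng(5)

    def time_to_k_bindings(n_ions=100, k_bind=0.05, k_needed=5):
        """Each free ion binds the target at rate k_bind, so the waiting
        time to the next event is exponential with rate n_free * k_bind."""
        t, n_free = 0.0, n_ions
        for _ in range(k_needed):
            t += rng.exponential(1.0 / (n_free * k_bind))
            n_free -= 1
        return t

    samples = np.array([time_to_k_bindings() for _ in range(5000)])
    # Analytic check: sum of exponentials with depleting rates.
    theory = sum(1.0 / ((100 - i) * 0.05) for i in range(5))
    print(f"simulated mean: {samples.mean():.3f}, analytic: {theory:.3f}")
    ```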

  11. Multiscale models and stochastic simulation methods for computing rare but key binding events in cell biology

    Energy Technology Data Exchange (ETDEWEB)

    Guerrier, C. [Applied Mathematics and Computational Biology, IBENS, Ecole Normale Supérieure, 46 rue d' Ulm, 75005 Paris (France); Holcman, D., E-mail: david.holcman@ens.fr [Applied Mathematics and Computational Biology, IBENS, Ecole Normale Supérieure, 46 rue d' Ulm, 75005 Paris (France); Mathematical Institute, Oxford OX2 6GG, Newton Institute (United Kingdom)

    2017-07-01

    The main difficulty in simulating diffusion processes at a molecular level in cell microdomains is due to the multiple scales involved, from nano- to micrometers. Few to many particles have to be simulated and simultaneously tracked while they are exploring a large portion of the space in search of small binding targets, such as buffers or active sites. Bridging the small and large spatial scales is achieved by rare events representing Brownian particles finding small targets, characterized by long-time distributions. These rare events are the bottleneck of numerical simulations. A naive stochastic simulation requires running many Brownian particles together, which is computationally greedy and inefficient. Solving the associated partial differential equations is also difficult due to the time-dependent boundary conditions, narrow passages and mixed boundary conditions at small windows. We present here two reduced modeling approaches for fast computation of diffusing fluxes in microdomains. The first approach is based on Markov mass-action law equations coupled to a Markov chain. The second is a Gillespie method based on narrow escape theory for coarse-graining the geometry of the domain into Poissonian rates. The main application concerns diffusion in cellular biology, where we compute as an example the distribution of arrival times of calcium ions to small hidden targets that trigger vesicular release.

  12. Multiscale models and stochastic simulation methods for computing rare but key binding events in cell biology

    International Nuclear Information System (INIS)

    Guerrier, C.; Holcman, D.

    2017-01-01

    The main difficulty in simulating diffusion processes at a molecular level in cell microdomains is due to the multiple scales involved, from nano- to micrometers. Few to many particles have to be simulated and simultaneously tracked while they are exploring a large portion of the space in search of small binding targets, such as buffers or active sites. Bridging the small and large spatial scales is achieved by rare events representing Brownian particles finding small targets, characterized by long-time distributions. These rare events are the bottleneck of numerical simulations. A naive stochastic simulation requires running many Brownian particles together, which is computationally greedy and inefficient. Solving the associated partial differential equations is also difficult due to the time-dependent boundary conditions, narrow passages and mixed boundary conditions at small windows. We present here two reduced modeling approaches for fast computation of diffusing fluxes in microdomains. The first approach is based on Markov mass-action law equations coupled to a Markov chain. The second is a Gillespie method based on narrow escape theory for coarse-graining the geometry of the domain into Poissonian rates. The main application concerns diffusion in cellular biology, where we compute as an example the distribution of arrival times of calcium ions to small hidden targets that trigger vesicular release.

  13. Building a three-dimensional model of the upper gastrointestinal tract for computer simulations of swallowing.

    Science.gov (United States)

    Gastelum, Alfonso; Mata, Lucely; Brito-de-la-Fuente, Edmundo; Delmas, Patrice; Vicente, William; Salinas-Vázquez, Martín; Ascanio, Gabriel; Marquez, Jorge

    2016-03-01

    We aimed to provide realistic three-dimensional (3D) models to be used in numerical simulations of peristaltic flow in patients exhibiting difficulty in swallowing, also known as dysphagia. To this end, a 3D model of the upper gastrointestinal tract was built from the color cryosection images of the Visible Human Project dataset. Regional color heterogeneities were corrected by centering local histograms of the image difference between slices. A voxel-based model was generated by stacking contours from the color images. A triangle mesh was built, smoothed and simplified. Visualization tools were developed for browsing the model at different stages and for virtual endoscopy navigation. As a result, a computer model of the esophagus and the stomach was obtained, mainly for modeling swallowing disorders. A central-axis curve was also obtained for virtual navigation and to replicate conditions relevant to swallowing disorders modeling. We show renderings of the model and discuss its use for simulating swallowing as a function of bolus rheological properties. The information obtained from simulation studies with our model could be useful for physicians in selecting the correct nutritional emulsions for patients with dysphagia.

  14. Computational Modeling and Simulation of Attitude Change. Part 1, Connectionist Models and Simulations of Cognitive Dissonance: an Overview

    OpenAIRE

    Voinea, Camelia Florela

    2013-01-01

    Cognitive Dissonance Theory is considered part of the cognitive consistency theories in Social Psychology. They cover a class of conceptual models which describe attitude change as a cognitive consistency-seeking issue. As these conceptual models required more complex operational expression, algebraic, mathematical and, lately, computational modeling approaches to cognitive consistency have been developed. Part 1 of this work provides an overview of the connectionist modeling of cognit...

  15. A pedagogical walkthrough of computational modeling and simulation of Wnt signaling pathway using static causal models in MATLAB

    OpenAIRE

    Sinha, Shriprakash

    2016-01-01

    Simulation studies in systems biology involving computational experiments on Wnt signaling pathways abound in the literature, but often lack a pedagogical perspective that might ease the understanding of beginner students and researchers in transition who intend to work on modeling of the pathway. This paucity might be due to restrictive business policies which enforce an unwanted embargo on the sharing of important scientific knowledge. A tutorial introduction to computational mo...

  16. Electromagnetic Computation and Visualization of Transmission Particle Model and Its Simulation Based on GPU

    Directory of Open Access Journals (Sweden)

    Yingnian Wu

    2014-01-01

    Electromagnetic calculation plays an important role in both military and civilian fields. Some methods and models proposed for the calculation of electromagnetic wave propagation over a large range impose a heavy computational burden on the CPU and also require a huge amount of memory. Using the GPU to accelerate computation and visualization can reduce this burden. Based on the forward ray-tracing method, a transmission particle model (TPM) for calculating the electromagnetic field is presented, combining it with the particle method. The movement of a particle obeys the principle of the propagation of electromagnetic waves, so that the particle distribution density in space reflects the electromagnetic distribution status. The algorithm, with particle transmission, movement, reflection, and diffraction, is described in detail. Since the particles in the TPM are completely independent, it is very suitable for parallel computing based on the GPU. Deductive verification of the TPM, with an electric dipole antenna as the transmission source, is conducted to prove that the particle movement itself represents the variation of electromagnetic field intensity caused by diffusion. Finally, simulation comparisons are made against the forward and backward ray-tracing methods. The simulation results verified the effectiveness of the proposed method.

  17. Computer simulation of Gumboro disease outbreak. III. Construction model G-4.

    Science.gov (United States)

    Takizawa, T; Ito, T; Tanaka, T; Mizumura, Y

    1980-01-01

    Following the simulation model G-3 of Gumboro disease outbreak, Model G-4 was constructed. The algorithm for computer simulation is shown in a flow chart. The postulates added to those for Models G-1 and G-2 are as follows: (1) The source of contamination is the virus remaining in the house, declining gradually in value with the lapse of time. (2) Any diseased bird excretes the virus during a certain period, so that the virus may be added to the source of contamination. (3) The morbid status of a diseased bird worsens over time, but the infection remains subclinical until a threshold value is reached. Beyond this value the bird becomes clinically diseased. In this model, more than 20 parameters are involved, and random numbers are used to express the individual differences in four variables, viz., the level of innate resistance, parentally conferred immunity, virus intake, and the threshold of clinical manifestation.
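
    The postulates above translate naturally into a daily-step stochastic simulation. The following toy rendition is not the authors' G-4 model and every parameter value is invented; it only illustrates how decaying house virus, shedding, waning maternal immunity and a clinical threshold interact:

    ```python
    # Toy daily-step flock simulation loosely following the G-4 postulates;
    # all parameter values are invented.
    import numpy as np

    rng = np.random.default_rng(11)

    def simulate_flock(n_birds=100, p_infect=0.004, virus_decay=0.95,
                       shed=1.0, immunity_loss=0.03, threshold=1.0, days=120):
        immunity = rng.uniform(0.3, 1.0, n_birds)  # parentally conferred immunity
        morbidity = np.zeros(n_birds)              # morbid status per bird
        infected = np.zeros(n_birds, dtype=bool)
        virus = 10.0                               # initial house contamination
        for day in range(days):
            immunity *= (1.0 - immunity_loss)      # maternal antibody decline
            # Susceptibles are exposed in proportion to house contamination.
            p = np.clip(p_infect * virus * (1.0 - immunity), 0.0, 1.0)
            infected |= (~infected) & (rng.uniform(size=n_birds) < p)
            # Morbid status worsens; subclinical until the threshold is crossed.
            morbidity[infected] += rng.uniform(0.05, 0.15, int(infected.sum()))
            # House virus decays but is replenished by shedding birds.
            virus = virus * virus_decay + shed * infected.sum() / n_birds
        return int((morbidity >= threshold).sum())

    print("clinically diseased birds:", simulate_flock())
    ```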

  18. Computer Simulation and Modeling of CO2 Removal Systems for Exploration 2013-2014

    Science.gov (United States)

    Coker, R.; Knox, J.; Gomez, C.

    2015-01-01

    The Atmosphere Revitalization Recovery and Environmental Monitoring (ARREM) project was initiated in September of 2011 as part of the Advanced Exploration Systems (AES) program. Under the ARREM project and the follow-on Life Support Systems (LSS) project, testing of sub-scale and full-scale systems has been combined with multiphysics computer simulations for evaluation and optimization of subsystem approaches. In particular, this paper describes the testing and 1-D modeling of the combined water desiccant and carbon dioxide sorbent subsystems of the carbon dioxide removal assembly (CDRA). The goal is a full-system predictive model of CDRA to guide system optimization and development.

  19. New Algorithms for Computing the Time-to-Collision in Freeway Traffic Simulation Models

    Directory of Open Access Journals (Sweden)

    Jia Hou

    2014-01-01

    Ways to estimate the time-to-collision are explored. In the context of traffic simulation models, classical lane-based notions of vehicle location are relaxed and new, fast, and efficient algorithms are examined. With trajectory conflicts being the main focus, computational procedures are explored which use a two-dimensional coordinate system to track the vehicle trajectories and assess conflicts. Vector-based kinematic variables are used to support the calculations. Algorithms based on boxes, circles, and ellipses are considered. Their performance is evaluated in terms of computational complexity and solution time. Results from these analyses suggest promise for effective and efficient analyses. A combined computation process is found to be very effective.
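
    Of the three bounding shapes considered, the circle-based case reduces to a quadratic in time. A minimal sketch (the vehicle positions, velocities, and radii below are arbitrary example values):

    ```python
    # Circle-based time-to-collision under constant velocity; example
    # vehicle states are arbitrary.
    import numpy as np

    def ttc_circles(p1, v1, r1, p2, v2, r2):
        """Smallest t >= 0 with |(p1 + v1 t) - (p2 + v2 t)| = r1 + r2;
        returns np.inf if the circles never touch."""
        dp = np.asarray(p1, float) - np.asarray(p2, float)
        dv = np.asarray(v1, float) - np.asarray(v2, float)
        r = r1 + r2
        # |dp + dv t|^2 = r^2  ->  a t^2 + b t + c = 0
        a, b, c = dv @ dv, 2.0 * (dp @ dv), dp @ dp - r * r
        if a == 0.0:
            return 0.0 if c <= 0.0 else np.inf   # same velocity
        disc = b * b - 4 * a * c
        if disc < 0.0:
            return np.inf                        # paths never get that close
        t1 = (-b - np.sqrt(disc)) / (2 * a)
        t2 = (-b + np.sqrt(disc)) / (2 * a)
        if t2 < 0.0:
            return np.inf                        # collision lies in the past
        return max(t1, 0.0)

    # Two vehicles on converging paths (positions m, velocities m/s, radii m):
    print(f"TTC = {ttc_circles((0, 0), (20, 0), 1.0, (100, 1.5), (-10, 0), 1.0):.2f} s")
    ```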

  20. Simulation Modeling of Lakes in Undergraduate and Graduate Classrooms Increases Comprehension of Climate Change Concepts and Experience with Computational Tools

    Science.gov (United States)

    Carey, Cayelan C.; Gougis, Rebekka Darner

    2017-01-01

    Ecosystem modeling is a critically important tool for environmental scientists, yet is rarely taught in undergraduate and graduate classrooms. To address this gap, we developed a teaching module that exposes students to a suite of modeling skills and tools (including computer programming, numerical simulation modeling, and distributed computing)…

  1. Supersonic propulsion simulation by incorporating component models in the large perturbation inlet (LAPIN) computer code

    Science.gov (United States)

    Cole, Gary L.; Richard, Jacques C.

    1991-01-01

    An approach to simulating the internal flows of supersonic propulsion systems is presented. The approach is based on a fairly simple modification of the Large Perturbation Inlet (LAPIN) computer code. LAPIN uses a quasi-one dimensional, inviscid, unsteady formulation of the continuity, momentum, and energy equations. The equations are solved using a shock capturing, finite difference algorithm. The original code, developed for simulating supersonic inlets, includes engineering models of unstart/restart, bleed, bypass, and variable duct geometry, by means of source terms in the equations. The source terms also provide a mechanism for incorporating, with the inlet, propulsion system components such as compressor stages, combustors, and turbine stages. This requires each component to be distributed axially over a number of grid points. Because of the distributed nature of such components, this representation should be more accurate than a lumped parameter model. Components can be modeled by performance map(s), which in turn are used to compute the source terms. The general approach is described. Then, simulation of a compressor/fan stage is discussed to show the approach in detail.

  2. [Economic benefits of overlapping induction: investigation using a computer simulation model].

    Science.gov (United States)

    Hunziker, S; Baumgart, A; Denz, C; Schüpfer, G

    2009-06-01

    The aim of this study was to investigate the potential economic benefit of overlapping anaesthesia induction, given that all patient diagnosis-related groups (AP DRG) are used as the model for hospital reimbursement. A computer simulation model was used for this purpose. Due to the resource-intensive production process, the operating room (OR) environment is the most expensive part of the supply chain for surgical disciplines. The economic benefit of a parallel production process (additional personnel, adaptation of the process) as compared to a conventional serial layout was assessed. A computer-based simulation method was used with commercially available simulation software. Revenue assumptions were based on AP DRG reimbursement. Based on a system analysis, a model for the computer simulation was designed through a step-by-step abstraction process. In the model, two operating rooms were used for parallel processing and two operating rooms for a serial production process. Six different types of surgical procedures based on historical case durations were investigated. The contribution margin was calculated as the increased revenues minus the cost of the additional anaesthesia personnel. Over a period of 5 weeks, 41 additional surgical cases could be performed, assuming a duration of surgery of 89+/-4 min (mean+/-SD). The additional contribution margin was CHF 104,588. In the case of longer surgical procedures, with a duration of 103+/-25 min (mean+/-SD), an increase of 36 cases was possible in the same time period and the contribution margin increased by CHF 384,836. When surgical cases with a mean procedural time of 243+/-55 min were simulated, 15 additional cases were possible and the additional contribution margin was CHF 321,278. Although costs increased in this simulation when a serial production process was changed to a parallel system layout, due to more personnel, an increase of the contribution margin was possible, especially with
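
    The reported figures imply roughly the following per-case margins (this back-of-the-envelope division is illustrative only and is not stated in the abstract):

    ```python
    # Per-case contribution margins derived from the figures reported above.
    scenarios = {
        # label: (additional cases over 5 weeks, reported margin in CHF)
        "short cases (~89 min)":   (41, 104_588),
        "medium cases (~103 min)": (36, 384_836),
        "long cases (~243 min)":   (15, 321_278),
    }
    for label, (cases, margin) in scenarios.items():
        print(f"{label:24s} {cases:3d} extra cases -> CHF {margin:8,} "
              f"(~CHF {margin / cases:8,.0f} per case)")
    ```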

  3. HRP's Healthcare Spin-Offs Through Computational Modeling and Simulation Practice Methodologies

    Science.gov (United States)

    Mulugeta, Lealem; Walton, Marlei; Nelson, Emily; Peng, Grace; Morrison, Tina; Erdemir, Ahmet; Myers, Jerry

    2014-01-01

    Spaceflight missions expose astronauts to novel operational and environmental conditions that pose health risks that are currently not well understood, and perhaps unanticipated. Furthermore, given the limited number of humans that have flown long-duration missions beyond low Earth orbit, the amount of research and clinical data necessary to predict and mitigate these health and performance risks is limited. Consequently, NASA's Human Research Program (HRP) conducts research and develops advanced methods and tools to predict, assess, and mitigate potential hazards to the health of astronauts. In this light, NASA has explored the possibility of leveraging computational modeling since the 1970s as a means to elucidate the physiologic risks of spaceflight and develop countermeasures. Since that time, substantial progress has been realized in this arena through a number of HRP-funded activities such as the Digital Astronaut Project (DAP) and the Integrated Medical Model (IMM). Much of this success can be attributed to HRP's endeavor to establish rigorous verification, validation, and credibility (VV&C) processes that ensure computational models and simulations (M&S) are sufficiently credible to address issues within their intended scope. This presentation summarizes HRP's activities in credibility of modeling and simulation, in particular through its outreach to the community of modeling and simulation practitioners. METHODS: The HRP requires that all M&S that can have a moderate to high impact on crew health or mission success be vetted in accordance with the NASA Standard for Models and Simulations, NASA-STD-7009 (7009) [5]. As this standard mostly focuses on engineering systems, the IMM and DAP have invested substantial efforts to adapt the processes established in this standard for application to biological M&S, which are more prevalent in human health and performance (HHP) and space biomedical research and operations [6,7]. These methods have also generated

  4. Credibility Assessment of Deterministic Computational Models and Simulations for Space Biomedical Research and Operations

    Science.gov (United States)

    Mulugeta, Lealem; Walton, Marlei; Nelson, Emily; Myers, Jerry

    2015-01-01

    Human missions beyond low Earth orbit to destinations such as Mars and asteroids will expose astronauts to novel operational conditions that may pose health risks that are currently not well understood and perhaps unanticipated. In addition, there are limited clinical and research data to inform the development and implementation of health risk countermeasures for these missions. Consequently, NASA's Digital Astronaut Project (DAP) is working to develop and implement computational models and simulations (M&S) to help predict and assess spaceflight health and performance risks, and enhance countermeasure development. In order to effectively accomplish these goals, the DAP evaluates its models and simulations via a rigorous verification, validation and credibility assessment process to ensure that the computational tools are sufficiently reliable both to inform research intended to mitigate potential risks and to guide countermeasure development. In doing so, DAP works closely with end-users, such as space life science researchers, to establish appropriate M&S credibility thresholds. We will present and demonstrate the process the DAP uses to vet computational M&S for space biomedical analysis using real M&S examples. We will also provide recommendations on how the larger space biomedical community can employ these concepts to enhance the credibility of their M&S codes.

  5. Use of the FDA nozzle model to illustrate validation techniques in computational fluid dynamics (CFD) simulations.

    Science.gov (United States)

    Hariharan, Prasanna; D'Souza, Gavin A; Horner, Marc; Morrison, Tina M; Malinauskas, Richard A; Myers, Matthew R

    2017-01-01

    A "credible" computational fluid dynamics (CFD) model has the potential to provide a meaningful evaluation of safety in medical devices. One major challenge in establishing "model credibility" is to determine the required degree of similarity between the model and experimental results for the model to be considered sufficiently validated. This study proposes a "threshold-based" validation approach that provides a well-defined acceptance criteria, which is a function of how close the simulation and experimental results are to the safety threshold, for establishing the model validity. The validation criteria developed following the threshold approach is not only a function of Comparison Error, E (which is the difference between experiments and simulations) but also takes in to account the risk to patient safety because of E. The method is applicable for scenarios in which a safety threshold can be clearly defined (e.g., the viscous shear-stress threshold for hemolysis in blood contacting devices). The applicability of the new validation approach was tested on the FDA nozzle geometry. The context of use (COU) was to evaluate if the instantaneous viscous shear stress in the nozzle geometry at Reynolds numbers (Re) of 3500 and 6500 was below the commonly accepted threshold for hemolysis. The CFD results ("S") of velocity and viscous shear stress were compared with inter-laboratory experimental measurements ("D"). The uncertainties in the CFD and experimental results due to input parameter uncertainties were quantified following the ASME V&V 20 standard. The CFD models for both Re = 3500 and 6500 could not be sufficiently validated by performing a direct comparison between CFD and experimental results using the Student's t-test. However, following the threshold-based approach, a Student's t-test comparing |S-D| and |Threshold-S| showed that relative to the threshold, the CFD and experimental datasets for Re = 3500 were statistically similar and the model could be

  6. A new approach to modeling of selected human respiratory system diseases, directed to computer simulations.

    Science.gov (United States)

    Redlarski, Grzegorz; Jaworski, Jacek

    2013-10-01

    This paper presents a new, versatile approach to modelling severe human respiratory system diseases for computer simulation. The proposed approach enables one to predict the time histories of various diseases from information accessible in medical publications. This knowledge is useful to bioengineers involved in the design and construction of medical devices employed for monitoring of respiratory conditions. The approach provides data that are crucial for testing diagnostic systems. This can be achieved without the necessity of probing the physiological details of the respiratory system and without identification of parameters based on measurement data. © 2013 Elsevier Ltd. All rights reserved.

  7. Computer simulation of the time evolution of a quenched model alloy in the nucleation region

    International Nuclear Information System (INIS)

    Marro, J.; Lebowitz, J.L.; Kalos, M.H.

    1979-01-01

    The time evolution of the structure function and of the cluster (or grain) distribution following quenching in a model binary alloy with a small concentration of minority atoms is obtained from computer simulations. The structure function S̄(k,t) obeys a simple scaling relation, S̄(k,t) = K⁻³ F(k/K), with K(t) ∝ t^(-a), a ≈ 0.25, during the latter and larger part of the evolution. During the same period, the mean cluster size grows approximately linearly with time
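
    The reported scaling form can be checked numerically by a data-collapse test: rescaling S̄ by K³ and k by 1/K should superpose the curves for all times. The sketch below synthesizes structure functions that obey the scaling exactly, with an arbitrary master curve F, and verifies the collapse; it uses no simulation data.

```python
# Data-collapse check of S(k,t) = K(t)**-3 * F(k/K(t)) with K(t) ~ t**-0.25.
# F is an arbitrary smooth, peaked master curve chosen for illustration.
import numpy as np

def F(x):
    return x**2 * np.exp(-x**2)

a = 0.25
k = np.linspace(0.02, 5.0, 400)    # wavenumber grid
x = np.linspace(0.2, 3.0, 100)     # common scaled abscissa k/K

rescaled = []
for t in (10.0, 100.0, 1000.0):
    K = t**(-a)                    # characteristic wavenumber at time t
    S = K**(-3) * F(k / K)         # synthetic structure function at time t
    rescaled.append(K**3 * np.interp(x * K, k, S))

spread = np.max(np.std(np.array(rescaled), axis=0))
# ~0 (up to interpolation error) => the curves collapse onto one master curve
print(f"max spread of K^3 * S(k/K) across times: {spread:.2e}")
```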

  8. A computational platform for modeling and simulation of pipeline georeferencing systems

    Energy Technology Data Exchange (ETDEWEB)

    Guimaraes, A.G.; Pellanda, P.C.; Gois, J.A. [Instituto Militar de Engenharia (IME), Rio de Janeiro, RJ (Brazil); Roquette, P.; Pinto, M.; Durao, R. [Instituto de Pesquisas da Marinha (IPqM), Rio de Janeiro, RJ (Brazil); Silva, M.S.V.; Martins, W.F.; Camillo, L.M.; Sacsa, R.P.; Madeira, B. [Ministerio de Ciencia e Tecnologia (CT-PETRO2006MCT), Brasilia, DF (Brazil). Financiadora de Estudos e Projetos (FINEP). Plano Nacional de Ciencia e Tecnologia do Setor Petroleo e Gas Natural

    2009-07-01

    This work presents a computational platform for modeling and simulation of pipeline geo referencing systems, which was developed based on typical pipeline characteristics, on the dynamical modeling of Pipeline Inspection Gauge (PIG) and on the analysis and implementation of an inertial navigation algorithm. The software environment of PIG trajectory simulation and navigation allows the user, through a friendly interface, to carry-out evaluation tests of the inertial navigation system under different scenarios. Therefore, it is possible to define the required specifications of the pipeline geo referencing system components, such as: required precision of inertial sensors, characteristics of the navigation auxiliary system (GPS surveyed control points, odometers etc.), pipeline construction information to be considered in order to improve the trajectory estimation precision, and the signal processing techniques more suitable for the treatment of inertial sensors data. The simulation results are analyzed through the evaluation of several performance metrics usually considered in inertial navigation applications, and 2D and 3D plots of trajectory estimation error and of recovered trajectory in the three coordinates are made available to the user. This paper presents the simulation platform and its constituting modules and defines their functional characteristics and interrelationships.(author)

  9. Using nested discretization for a detailed yet computationally efficient simulation of local hydrology in a distributed hydrologic model.

    Science.gov (United States)

    Wang, Dongdong; Liu, Yanlan; Kumar, Mukesh

    2018-04-10

    Fully distributed hydrologic models are often used to simulate hydrologic states at fine spatio-temporal resolutions. However, simulations based on these models may become computationally expensive, constraining their application to smaller domains. This study demonstrates that a nested-discretization based modeling strategy can be used to improve the efficiency of distributed hydrologic simulations, especially for applications where fine-resolution estimates of hydrologic states are the focus only within part of a watershed. To this end, we consider two applications where the goal is to capture the groundwater dynamics within a defined target area. Our results show that at the target locations, a nested simulation competently replicates the estimates of the groundwater table obtained from the fully fine simulation, while yielding significant computational savings. The results highlight the potential of using nested discretization for a detailed yet computationally efficient estimation of hydrologic states in part of the model domain.
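
    A one-dimensional toy version of the nested-discretization idea is sketched below: solve a coarse problem over the whole domain, then re-solve on a fine grid only inside the target sub-domain, with the coarse solution supplying boundary values. This is a schematic stand-in under simplified assumptions (steady diffusion, Dirichlet data), not the paper's distributed hydrologic formulation.

```python
# Nested discretization, 1-D toy: coarse solve everywhere, fine solve only
# inside the target sub-domain using coarse-grid values as boundary data.
import numpy as np

def solve_diffusion(x, left, right, source=1.0):
    """Steady -u'' = source with Dirichlet BCs, second-order finite differences."""
    n = len(x)
    h = x[1] - x[0]
    A = np.zeros((n, n))
    b = np.full(n, source * h**2)
    A[0, 0] = A[-1, -1] = 1.0
    b[0], b[-1] = left, right
    for i in range(1, n - 1):
        A[i, i - 1:i + 2] = [-1.0, 2.0, -1.0]
    return np.linalg.solve(A, b)

coarse_x = np.linspace(0.0, 1.0, 11)
coarse_u = solve_diffusion(coarse_x, left=0.0, right=0.0)

# Fine grid only over the target sub-domain [0.4, 0.6]:
fine_x = np.linspace(0.4, 0.6, 41)
bc_l, bc_r = np.interp([0.4, 0.6], coarse_x, coarse_u)
fine_u = solve_diffusion(fine_x, left=bc_l, right=bc_r)
print(f"fine-grid solution at x = 0.5: {fine_u[20]:.4f}")  # analytic: 0.1250
```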

  10. Avatars alive! The integration of physiology models and computer generated avatars in a multiplayer online simulation.

    Science.gov (United States)

    Kusumoto, Laura; Heinrichs, Wm Leroy; Dev, Parvati; Youngblood, Patricia

    2007-01-01

    In a mass casualty incident, injured and at-risk patients will pass through a continuum of care from many different providers acting as a team in a clinical environment. As presented at MMVR 14 [Kaufman, et al 2006], formative evaluations have shown that simulation practice is nearly as good as, and in some cases better than, live exercises for stimulating learners to integrate their procedural knowledge in new circumstances through experiential practice. However, to date, multiplayer game technologies have given limited physiological fidelity to their characters, thus limiting the realism and complexity of the scenarios that can be practiced by medical professionals. This paper describes the status of a follow-on program to merge medical and gaming technologies so that computer generated, but human-controlled, avatars used in a simulated, mass casualty training environment will exhibit realistic life signs. This advance introduces a new level of medical fidelity to simulated mass casualty scenarios that can represent thousands of injuries. The program is identifying the critical instructional challenges and related system engineering issues associated with the incorporation of multiple state-of-the-art physiological models into the computer generated synthetic representation of patients. The work is a collaboration between Forterra Systems and the SUMMIT group of Stanford University Medical School, and is sponsored by the US Army Medical Command's Telemedicine and Advanced Technologies Research Center (TATRC).

  11. Advances in Intelligent Modelling and Simulation Artificial Intelligence-Based Models and Techniques in Scalable Computing

    CERN Document Server

    Khan, Samee; Burczy´nski, Tadeusz

    2012-01-01

    One of the most challenging issues in today's large-scale computational modeling and design is to effectively manage the complex distributed environments, such as computational clouds, grids, ad hoc and P2P networks, operated by various types of users whose evolving relationships are fraught with uncertainties. In this context, the IT resources and services usually belong to different owners (institutions, enterprises, or individuals) and are managed by different administrators. Moreover, uncertainty enters the system in various forms of information that is incomplete, imprecise, fragmentary, or overloading, which hinders the full and precise resolution of the evaluation criteria, the sequencing and selection, and the assignment of scores. Intelligent scalable systems enable flexible routing and charging, advanced user interactions, and the aggregation and sharing of geographically-distributed resources in modern large-scale systems.   This book presents new ideas, theories, models...

  12. Computer Simulation Western

    International Nuclear Information System (INIS)

    Rasmussen, H.

    1992-01-01

    Computer Simulation Western is a unit within the Department of Applied Mathematics at the University of Western Ontario. Its purpose is the development of computational and mathematical methods for practical problems in industry and engineering and the application and marketing of such methods. We describe the unit and our efforts at obtaining research and development grants. Some representative projects will be presented and future plans discussed. (author)

  13. Accelerator simulation using computers

    International Nuclear Information System (INIS)

    Lee, M.; Zambre, Y.; Corbett, W.

    1992-01-01

    Every accelerator or storage ring system consists of a charged particle beam propagating through a beam line. Although a number of computer programs exist that simulate the propagation of a beam in a given beam line, only a few provide the capabilities for designing, commissioning and operating the beam line. This paper shows how a ''multi-track'' simulation and analysis code can be used for these applications

  14. Advanced computers and simulation

    International Nuclear Information System (INIS)

    Ryne, R.D.

    1993-01-01

    Accelerator physicists today have access to computers that are far more powerful than those available just 10 years ago. In the early 1980's, desktop workstations performed fewer than one million floating point operations per second (Mflops), and the realized performance of vector supercomputers was at best a few hundred Mflops. Today vector processing is available on the desktop, providing researchers with performance approaching 100 Mflops at a price that is measured in thousands of dollars. Furthermore, advances in Massively Parallel Processors (MPP) have made performance of over 10 gigaflops a reality, and around mid-decade MPPs are expected to be capable of teraflops performance. Along with advances in MPP hardware, researchers have also made significant progress in developing algorithms and software for MPPs. These changes have had, and will continue to have, a significant impact on the work of computational accelerator physicists. Now, instead of running particle simulations with just a few thousand particles, we can perform desktop simulations with tens of thousands of simulation particles, and calculations with well over 1 million particles are being performed on MPPs. In the area of computational electromagnetics, simulations that used to be performed only on vector supercomputers now run in several hours on desktop workstations, and researchers are hoping to perform simulations with over one billion mesh points on future MPPs. In this paper we will discuss the latest advances, and what can be expected in the near future, in hardware, software and applications codes for advanced simulation of particle accelerators

  15. Methods, Computational Platform, Verification, and Application of Earthquake-Soil-Structure-Interaction Modeling and Simulation

    Science.gov (United States)

    Tafazzoli, Nima

    Seismic response of soil-structure systems has attracted significant attention for a long time. This is quite understandable given the size and complexity of soil-structure systems. Three important aspects of ESSI modeling are the consistent tracking of input seismic energy and of the energy dissipation mechanisms within the system, the numerical techniques used to simulate the dynamics of ESSI, and the influence of uncertainty on ESSI simulations. This dissertation is a contribution to the development of one such tool, called the ESSI Simulator. Extensive work is being done on a verified and validated suite for the ESSI Simulator. Verification and validation are important for high-fidelity numerical predictions of the behavior of complex systems. This simulator uses the finite element method as a numerical tool to obtain solutions for a large class of engineering problems such as liquefaction, earthquake-soil-structure interaction, site effects, piles, pile groups, probabilistic plasticity, stochastic elastic-plastic FEM, and detailed large-scale parallel models. The response of a full three-dimensional soil-structure-interaction simulation of complex structures is evaluated under 3D wave propagation. The Domain-Reduction-Method is used for applying the forces as a two-step procedure for dynamic analysis, with the goal of reducing the large computational domain. The issue of damping of the waves at the boundary of the finite element models is studied using different damping patterns. This damping is applied to the layer of elements outside the Domain-Reduction-Method zone in order to absorb the residual waves coming out of the boundary layer due to structural excitation. An extensive parametric study is done on the dynamic soil-structure interaction of a complex system, and results of different cases in terms of soil strength and foundation embedment are compared. A set of constitutive models with high efficiency in terms of computational time is developed and implemented in the ESSI Simulator

  16. Computational growth model of breast microcalcification clusters in simulated mammographic environments.

    Science.gov (United States)

    Plourde, Shayne M; Marin, Zach; Smith, Zachary R; Toner, Brian C; Batchelder, Kendra A; Khalil, Andre

    2016-09-01

    When screening for breast cancer, the radiological interpretation of mammograms is a difficult task, particularly when classifying precancerous growth such as microcalcifications (MCs). Biophysical modeling of benign vs. malignant growth of MCs in simulated mammographic backgrounds may improve characterization of these structures. A mathematical model based on crystal growth rules for calcium oxide (benign) and hydroxyapatite (malignant) was used in conjunction with simulated mammographic backgrounds, which were generated by fractional Brownian motion of varying roughness and quantified by the Hurst exponent to mimic tissue of varying density. Simulated MC clusters were compared by fractal dimension, average circularity of individual MCs, average number of MCs per cluster, and average cluster area. Benign and malignant clusters were distinguishable by average circularity, average number of MCs per cluster, and average cluster area, with statistically significant differences across breast tissue densities, which suggests tissue environment plays a role in regulating MC growth. Benign and malignant MCs are distinguishable in all types of tissue by shape, size, and area, which is consistent with findings in the literature. These results may help to better understand the effects of the tissue environment on tumor progression, and improve classification of MCs in mammograms via computer-aided diagnosis. Copyright © 2016 Elsevier Ltd. All rights reserved.
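
    The roughness-controlled backgrounds can be illustrated with a standard spectral-synthesis construction for fractional-Brownian-motion-like fields, in which the Hurst exponent H sets the spectral slope (power ∝ f^-(2H+2) in two dimensions). The sketch below mimics "simulated mammographic backgrounds" only in spirit; the paper's exact generator and calibration are not reproduced.

```python
# Fourier-filtering synthesis of a 2-D fBm-like surface with Hurst exponent H.
# Amplitude spectrum ~ f**-(H+1), i.e. power spectrum ~ f**-(2H+2) in 2-D.
import numpy as np

def fbm_surface(n=256, H=0.5, seed=0):
    rng = np.random.default_rng(seed)
    fx = np.fft.fftfreq(n)[:, None]
    fy = np.fft.fftfreq(n)[None, :]
    f = np.hypot(fx, fy)
    f[0, 0] = np.inf                       # suppress the DC term
    amplitude = f ** (-(H + 1.0))
    phase = np.exp(2j * np.pi * rng.random((n, n)))
    field = np.fft.ifft2(amplitude * phase).real   # common real-part shortcut
    return (field - field.mean()) / field.std()

smooth = fbm_surface(H=0.7)   # smoother-looking background
rough = fbm_surface(H=0.3)    # rougher background
print(smooth.shape, rough.shape)
```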

  17. A hybrid model for the computationally-efficient simulation of the cerebellar granular layer

    Directory of Open Access Journals (Sweden)

    Anna Cattani

    2016-04-01

    Full Text Available The aim of the present paper is to efficiently describe the membrane potential dynamics of neural populations formed by species having a high density difference in specific brain areas. We propose a hybrid model whose main ingredients are a conductance-based model (ODE system) and its continuous counterpart (PDE system) obtained through a limit process in which the number of neurons confined in a bounded region of the brain tissue is sent to infinity. Specifically, in the discrete model, each cell is described by a set of time-dependent variables, whereas in the continuum model, cells are grouped into populations that are described by a set of continuous variables. Communications between populations, which translate into interactions among the discrete and the continuous models, are the essence of the hybrid model we present here. The cerebellum and cerebellum-like structures show in their granular layer a large difference in the relative density of neuronal species, making them a natural testing ground for our hybrid model. By reconstructing the ensemble activity of the cerebellar granular layer network and by comparing our results to a more realistic computational network, we demonstrate that our description of the network activity, even though it is not biophysically detailed, is still capable of reproducing salient features of neural network dynamics. Our modeling approach yields a significant computational cost reduction by increasing the simulation speed at least 270 times. The hybrid model reproduces interesting dynamics such as local microcircuit synchronization, traveling waves, center-surround and time-windowing.

  18. Development of computer program for simulation of an ice bank system operation, Part I: Mathematical modelling

    Energy Technology Data Exchange (ETDEWEB)

    Halasz, Boris; Grozdek, Marino; Soldo, Vladimir [Faculty of Mechanical Engineering and Naval Architecture, University of Zagreb, Ivana Lucica 5, 10 000 Zagreb (Croatia)

    2009-09-15

    Since the use of standard engineering methods in the process of ice bank performance evaluation offers neither adequate flexibility nor accuracy, the aim of this research was to provide a powerful tool for the industrial design of an ice storage system, allowing one to account for the various design parameters and system arrangements over a wide range of time-varying operating conditions. In this paper the development of a computer application for the prediction of ice bank system operation is presented. Static, indirect, cool thermal storage systems with external ice-on-coil building/melting were considered. The mathematical model was developed by means of energy and mass balance relations for each component of the system and is basically divided into two parts: the model of the ice storage system and the model of the refrigeration unit. Heat transfer processes in the ice silo were modelled by use of empirical correlations, while the performance of the refrigeration unit components was based on manufacturers' data. Programming and application design were done in the Fortran 95 language standard. Input of data is enabled through drop-down menus and dialog boxes, while the results are presented via figures, diagrams and data (ASCII) files. In addition, to demonstrate the necessity for the development of a simulation program, a case study was performed. Simulation results clearly indicate that no simple engineering methods or rule-of-thumb principles could be utilised to properly validate the performance of an ice bank system. (author)
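
    At its core, such a model advances coupled energy and mass balances in time. The fragment below is a deliberately minimal stand-in: a single latent-heat balance for the ice inventory, charged by a refrigeration capacity and discharged by a cooling load, with all coefficients illustrative rather than taken from the paper.

```python
# Toy latent-heat balance for an ice inventory over an 8-hour period.
# d(ice)/dt = (Q_refrig - Q_load) / L_f; all coefficients are illustrative.
L_f = 334e3                  # latent heat of fusion [J/kg]
dt = 60.0                    # time step [s]
ice = 0.0                    # ice inventory [kg]
for minute in range(480):
    q_refrig = 50e3                                  # charging capacity [W]
    q_load = 30e3 if 120 <= minute < 360 else 0.0    # melting load [W]
    ice = max(0.0, ice + (q_refrig - q_load) * dt / L_f)
print(f"ice inventory after 8 h: {ice:.0f} kg")
```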

  19. Building Model for the University of Mosul Computer Network Using OPNET Simulator

    Directory of Open Access Journals (Sweden)

    Modhar A. Hammoudi

    2013-04-01

    Full Text Available This paper aims at establishing a model in the OPNET (Optimized Network Engineering Tool) simulator for the University of Mosul computer network. The proposed network model was made up of two routers (Cisco 2600), a core switch (Cisco 6509), two servers, an ip32 cloud and 37 VLANs. These VLANs were connected to the core switch using fiber optic cables (1000BaseX). Three applications were added to test the network model: FTP (File Transfer Protocol), HTTP (Hyper Text Transfer Protocol) and VoIP (Voice over Internet Protocol). The results showed that the proposed model was effective for designing and managing the targeted network and can be used to view the data flow in it. Also, the simulation results showed that the maximum number of VoIP service users could be raised up to 5000 users when working under IP Telephony. This means that the ability to utilize the VoIP service in this network can be maintained and is better when subjected to an IP telephony scheme.

  20. Computational Model and Numerical Simulation for Submerged Mooring Monitoring Platform’s Dynamical Response

    Directory of Open Access Journals (Sweden)

    He Kongde

    2015-01-01

    Full Text Available A computational model and numerical simulation for a submerged mooring monitoring platform were formulated to study its dynamical response under the action of flow forces. The model is based on Hopkinson impact load theory, takes into account the catenary effect of the mooring cable, and corrects the difference between the tension and the tangential action force by means of an equivalent modulus of elasticity. The equations were solved using hydraulics theory and the structural mechanics theory of ocean engineering, and the response of the buoy to the flow force was studied. The validity of the model was checked and the results were in good agreement; they show that the buoy undergoes sizeable heave and sway displacements, but the sway displacement stabilizes quickly, while the heave displacement exhibits vibration due to the vortex-induced action of the flow.

  1. Computer simulation of ductile fracture

    International Nuclear Information System (INIS)

    Wilkins, M.L.; Streit, R.D.

    1979-01-01

    Finite difference computer simulation programs are capable of very accurate solutions to problems in plasticity with large deformations and rotation. This opens the possibility of developing models of ductile fracture by correlating experiments with equivalent computer simulations. Selected experiments were done to emphasize different aspects of the model. A difficult problem is the establishment of a fracture-size effect. This paper is a study of the strain field around notched tensile specimens of aluminum 6061-T651. A series of geometrically scaled specimens are tested to fracture. The scaled experiments are conducted for different notch radius-to-diameter ratios. The strains at fracture are determined from computer simulations. An estimate is made of the fracture-size effect

  2. Modelling of ballistic low energy ion solid interaction - conventional analytic theories versus computer simulations

    International Nuclear Information System (INIS)

    Littmark, U.

    1994-01-01

    The ''philosophy'' behind, and the ''psychology'' of the development from analytic theory to computer simulations in the field of atomic collisions in solids is discussed and a few examples of achievements and perspectives are given. (orig.)

  3. Computer-Aided Design, Modeling and Simulation of a New Solar Still Design

    Directory of Open Access Journals (Sweden)

    Jeremy (Zheng) Li

    2011-01-01

    Full Text Available Clean and pure drinking water is important in today's life, but current water sources are usually brackish or contain bacteria and so cannot be used for drinking. About 78% of the available water is salty, 21% is brackish, and only 1% is fresh. Distillation is one of the feasible processes applicable to water purification, and it requires an energy input, such as solar radiation. Water is evaporated in this distillation process, and the water vapor can then be separated and condensed to pure water. With the change from conventional fuels to renewable and environmentally friendly fuel sources, modern technology allows use of the abundant energy from the sun. It is better to use solar energy for water desalination since it is more economical than the use of conventional energies. The main focus of this paper is applying computer-aided modeling and simulation to design a less complex solar water distillation system. A prototype of this solar still system was also built to verify its feasibility, functionality, and reliability. The computational simulation and prototype testing show the reliability and proper functionality of this solar water distillation system.

  4. Computer Simulation Model to Train Medical Personnel on Glucose Clamp Procedures.

    Science.gov (United States)

    Maghoul, Pooya; Boulet, Benoit; Tardif, Annie; Haidar, Ahmad

    2017-10-01

    A glucose clamp procedure is the most reliable way to quantify insulin pharmacokinetics and pharmacodynamics, but skilled and trained research personnel are required to frequently adjust the glucose infusion rate. A computer environment that simulates glucose clamp experiments can be used for efficient personnel training and for the development and testing of algorithms for automated glucose clamps. We built 17 virtual healthy subjects (mean age, 25±6 years; mean body mass index, 22.2±3 kg/m²), each comprising a mathematical model of glucose regulation and a unique set of parameters. Each virtual subject simulates plasma glucose and insulin concentrations in response to intravenous insulin and glucose infusions. Each virtual subject provides a unique response, and its parameters were estimated from combined intravenous glucose tolerance test-hyperinsulinemic-euglycemic clamp data using the Bayesian approach. The virtual subjects were validated by comparing their simulated predictions against data from 12 healthy individuals who underwent a hyperglycemic glucose clamp procedure. Plasma glucose and insulin concentrations were predicted by the virtual subjects in response to glucose infusions determined by a trained research staff performing a simulated hyperglycemic clamp experiment. The total amount of glucose infused did not differ between the simulated and the real subjects (85±18 g vs. 83±23 g; p=NS), nor did plasma insulin levels (63±20 mU/L vs. 58±16 mU/L; p=NS). The virtual subjects can reliably predict glucose needs and plasma insulin profiles during hyperglycemic glucose clamp conditions. These virtual subjects can be used to train personnel to make glucose infusion adjustments during clamp experiments. Copyright © 2017 Diabetes Canada. Published by Elsevier Inc. All rights reserved.
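
    A virtual subject of this kind can be sketched with the classic Bergman minimal model in place of the authors' fitted 17-subject model; the parameter values below are order-of-magnitude textbook guesses (assumptions), and the proportional infusion controller stands in for a trainee adjusting the glucose infusion rate.

```python
# Bergman-minimal-model sketch of a clamp "virtual subject". Parameters are
# illustrative assumptions, not fitted estimates from the study.
from scipy.integrate import solve_ivp

p1, p2, p3 = 0.03, 0.02, 1.0e-5   # glucose effectiveness and insulin action rates
Gb, V = 5.0, 0.16                 # basal glucose [mmol/L], glucose volume [L/kg]
Ib, I_clamp = 10.0, 30.0          # basal and clamped plasma insulin [mU/L]
target = 10.0                     # hyperglycemic clamp target [mmol/L]
Kp = 0.05                         # proportional gain [mmol/kg/min per mmol/L]

def rhs(t, y):
    G, X = y
    u = max(0.0, Kp * (target - G))        # glucose infusion rate
    dG = -(p1 + X) * G + p1 * Gb + u / V   # plasma glucose balance
    dX = -p2 * X + p3 * (I_clamp - Ib)     # remote insulin action
    return [dG, dX]

sol = solve_ivp(rhs, (0.0, 180.0), [Gb, 0.0], max_step=1.0)
# A pure proportional controller settles slightly below target (steady-state
# offset) -- one reason infusion adjustment during a clamp takes training.
print(f"glucose after 3 h: {sol.y[0, -1]:.2f} mmol/L (target {target})")
```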

  5. Computer simulation models of pre-diabetes populations: a systematic review protocol.

    Science.gov (United States)

    Leal, Jose; Khurshid, Waqar; Pagano, Eva; Feenstra, Talitha

    2017-10-05

    Diabetes is a major public health problem and prediabetes (intermediate hyperglycaemia) is associated with a high risk of developing diabetes. With evidence supporting the use of preventive interventions for prediabetes populations and the discovery of novel biomarkers stratifying the risk of progression, there is a need to evaluate their cost-effectiveness across jurisdictions. In diabetes and prediabetes, it is relevant to inform cost-effectiveness analysis using decision models due to their ability to forecast long-term health outcomes and costs beyond the time frame of clinical trials. To support good implementation and reimbursement decisions of interventions in these populations, models should be clinically credible, based on best available evidence, reproducible and validated against clinical data. Our aim is to identify recent studies on computer simulation models and model-based economic evaluations of populations of individuals with prediabetes, qualify them and discuss the knowledge gaps, challenges and opportunities that need to be addressed for future evaluations. A systematic review will be conducted in MEDLINE, Embase, EconLit and National Health Service Economic Evaluation Database. We will extract peer-reviewed studies published between 2000 and 2016 that describe computer simulation models of the natural history of individuals with prediabetes and/or decision models to evaluate the impact of interventions, risk stratification and/or screening on these populations. Two reviewers will independently assess each study for inclusion. Data will be extracted using a predefined pro forma developed using best practice. Study quality will be assessed using a modelling checklist. A narrative synthesis of all studies will be presented, focussing on model structure, quality of models and input data, and validation status. This systematic review is exempt from ethics approval because the work is carried out on published documents. The findings of the review

  6. Moving on to the modeling and simulation using computational fluid dynamics

    International Nuclear Information System (INIS)

    Norasalwa Zakaria; Rohyiza Baan; Muhd Noor Muhd Yunus

    2006-01-01

    The heat is on, but not at the co-combustor plant. Using Computational Fluid Dynamics (CFD), modeling and simulation of an incinerator has been made easy and possible from the comfort of a cozy room. CFD has become an important design tool in nearly every industrial field because it provides understanding of flow patterns. CFD provides values for fluid velocity, fluid temperature, pressure and species concentrations throughout a flow domain. MINT has recently acquired a complete CFD software suite, consisting of GAMBIT, which is used to build geometry and meshing, and FLUENT as the processor or solver. This paper discusses several trial runs that were carried out on several parts of the co-combustor plant, namely the under-fire section and the mixing chamber section

  7. Numerical computation of the linear stability of the diffusion model for crystal growth simulation

    Energy Technology Data Exchange (ETDEWEB)

    Yang, C.; Sorensen, D.C. [Rice Univ., Houston, TX (United States); Meiron, D.I.; Wedeman, B. [California Institute of Technology, Pasadena, CA (United States)

    1996-12-31

    We consider a computational scheme for determining the linear stability of a diffusion model arising from the simulation of crystal growth. The process of a needle crystal solidifying into an undercooled liquid can be described by dual diffusion equations with appropriate initial and boundary conditions, in which U_t and U_a denote the temperature of the liquid and solid, respectively, and α represents the thermal diffusivity. At the solid-liquid interface, the motion of the interface, denoted by r, and the temperature field are related by a conservation relation in which n is the unit outward-pointing normal to the interface. A basic stationary solution to this free boundary problem can be obtained by writing the equations of motion in a moving frame and transforming the problem to parabolic coordinates. This is known as the Ivantsov parabola solution. Linear stability theory applied to this stationary solution gives rise to an eigenvalue problem.

  8. Computer security simulation

    International Nuclear Information System (INIS)

    Schelonka, E.P.

    1979-01-01

    Development and application of a series of simulation codes used for computer security analysis and design are described. Boolean relationships for arrays of barriers within functional modules are used to generate composite effectiveness indices. The general case of multiple layers of protection with any specified barrier survival criteria is given. Generalized reduction algorithms provide numerical security indices in selected subcategories and for the system as a whole. 9 figures, 11 tables
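
    The Boolean reduction to a composite effectiveness index can be illustrated for the simple case of independent barriers: in this sketch a layer stops the adversary if any of its barriers does, and the system stops the adversary if any layer does. The actual codes' survival criteria are not described in the abstract; the series/parallel structure and the probabilities below are assumptions for illustration.

```python
# Composite effectiveness index for independent barriers arranged in layers.
# Structure and numbers are illustrative assumptions.
from functools import reduce

def layer_effectiveness(barrier_effs):
    """A layer fails only if every barrier in it fails."""
    fail = reduce(lambda acc, e: acc * (1.0 - e), barrier_effs, 1.0)
    return 1.0 - fail

def system_effectiveness(layers):
    """The adversary succeeds only by defeating every layer in series."""
    leak = reduce(lambda acc, le: acc * (1.0 - le),
                  map(layer_effectiveness, layers), 1.0)
    return 1.0 - leak

layers = [[0.9, 0.7], [0.8], [0.6, 0.5, 0.4]]   # per-barrier stop probabilities
print(f"composite effectiveness index: {system_effectiveness(layers):.4f}")
```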

  9. A pedagogical walkthrough of computational modeling and simulation of Wnt signaling pathway using static causal models in MATLAB.

    Science.gov (United States)

    Sinha, Shriprakash

    2016-12-01

    Simulation studies in systems biology involving computational experiments dealing with Wnt signaling pathways abound in the literature but often lack a pedagogical perspective that might ease the understanding of beginner students and researchers in transition who intend to work on the modeling of the pathway. This paucity might happen due to restrictive business policies which enforce an unwanted embargo on the sharing of important scientific knowledge. A tutorial introduction to computational modeling of the Wnt signaling pathway in a human colorectal cancer dataset using static Bayesian network models is provided. The walkthrough might aid biologists/informaticians in understanding the design of computational experiments that is interleaved with exposition of the Matlab code and causal models from the Bayesian network toolbox. The manuscript elucidates the coding contents of the advance article by Sinha (Integr. Biol. 6:1034-1048, 2014) and takes the reader in a step-by-step process of how (a) the collection and the transformation of the available biological information from literature is done, (b) the integration of the heterogeneous data and prior biological knowledge in the network is achieved, (c) the simulation study is designed, (d) the hypothesis regarding a biological phenomenon is transformed into a computational framework, and (e) results and inferences drawn using d-connectivity/separability are reported. The manuscript finally ends with a programming assignment to help the readers get hands-on experience of a perturbation project. Description of Matlab files is made available under GNU GPL v3 license at the Google code project on https://code.google.com/p/static-bn-for-wnt-signaling-pathway and https://sites.google.com/site/shriprakashsinha/shriprakashsinha/projects/static-bn-for-wnt-signaling-pathway. Latest updates can be found in the latter website.
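
    In the same pedagogical spirit, the core computation behind a static Bayesian network can be shown without any toolbox: exact inference by enumeration on a toy three-node chain. The tutorial itself uses the MATLAB Bayes Net Toolbox; this Python fragment only illustrates the underlying mechanics, with made-up CPT values.

```python
# Exact inference by enumeration on a toy chain A -> B -> C with binary nodes.
# All conditional probability table (CPT) values are invented for illustration.
P_A = {0: 0.6, 1: 0.4}
P_B_given_A = {0: {0: 0.8, 1: 0.2}, 1: {0: 0.3, 1: 0.7}}   # P(B | A)
P_C_given_B = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.2, 1: 0.8}}   # P(C | B)

def joint(a, b, c):
    return P_A[a] * P_B_given_A[a][b] * P_C_given_B[b][c]

def posterior_C_given_A(a_obs):
    # marginalize out B, then normalize over C
    num = {c: sum(joint(a_obs, b, c) for b in (0, 1)) for c in (0, 1)}
    z = sum(num.values())
    return {c: v / z for c, v in num.items()}

print(posterior_C_given_A(1))   # P(C | A = 1)
```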

  10. [Comparison between the Range of Movement Canine Real Cervical Spine and Numerical Simulation - Computer Model Validation].

    Science.gov (United States)

    Srnec, R; Horák, Z; Sedláček, R; Sedlinská, M; Krbec, M; Nečas, A

    2017-01-01

    PURPOSE OF THE STUDY In developing new surgical treatment methods for spine conditions, or modifying existing ones, an integral part of ex vivo experiments is the assessment of the mechanical, kinematic and dynamic properties of the created constructs. The aim of the study is to create an appropriately validated numerical model of the canine cervical spine in order to obtain a tool for basic research to be applied in cervical spine surgeries. For this purpose, the dog is a suitable model due to the occurrence of similar cervical spine conditions in some breeds of dogs and in humans. The obtained model can also be used in research and in clinical veterinary practice. MATERIAL AND METHODS In order to create a 3D spine model, LightSpeed 16 (GE, Milwaukee, USA) multidetector computed tomography was used to scan the cervical spine of a Doberman Pinscher. The data were transferred to Mimics 12 software (Materialise HQ, Belgium), in which the individual vertebrae were segmented on CT scans by thresholding. The vertebral geometry was exported to Rhinoceros software (McNeel North America, USA) for modelling, and subsequently the specialised software Abaqus (Dassault Systemes, France) was used to analyse the response of the physiological spine model to external load by the finite element method (FEM). All the FEM-based numerical simulations were treated as nonlinear contact static tasks. In the FEM analyses, angles between individual spinal segments were monitored as functions of ventroflexion/dorsiflexion. The data were validated using latero-lateral radiographs of the cervical spine of large-breed dogs with no evident clinical signs of cervical spine conditions. The radiographs within the cervical spine range of motion were taken at three different positions: in neutral position, in maximal ventroflexion and in maximal dorsiflexion. On X-rays, vertebral inclination angles in the monitored spine positions were measured and compared with the results obtained from the FEM analyses of the

  11. Wave propagation simulation in normal and infarcted myocardium: computational and modelling issues.

    Science.gov (United States)

    Maglaveras, N; Van Capelle, F J; De Bakker, J M

    1998-01-01

    Simulation of propagating action potentials (PAP) in normal and abnormal myocardium is used for understanding the mechanisms responsible for eliciting dangerous arrhythmias. One- and two-dimensional models dealing with PAP properties are reviewed in this paper from both the computational and the mathematical aspects. These models are used for linking theoretical and experimental results. The discontinuous nature of the PAP is demonstrated through the combination of experimental and theoretically derived results. In particular it can be shown that for increased intracellular coupling resistance the PAP upstroke phase properties (V_max, dV/dt_max and τ_foot) change considerably, and in some cases non-monotonically, with increased coupling resistance. It is shown that τ_foot is a parameter that is very sensitive to the cell's distance from the stimulus site, the stimulus strength and the coupling resistance. In particular it can be shown that in a one-dimensional structure the τ_foot value can increase dramatically for lower coupling resistance values near the stimulus site and subsequently be reduced as we move to distances larger than five resting length constants from the stimulus site. The τ_foot variability is reduced with increased coupling resistance, rendering the lower coupling resistance structures, under abnormal excitation sequences, more vulnerable to conduction block and arrhythmias. Using the theory of discontinuous propagation of the PAP in the myocardium it is demonstrated that for specific abnormal situations in the myocardium, such as infarcted tissue, one- and two-dimensional models can reliably simulate propagation characteristics and explain complex phenomena such as propagation at bifurcation sites and mechanisms of block and re-entry. In conclusion it is shown that applied mathematics and informatics can help in elucidating electrophysiologically complex mechanisms such as arrhythmias and conduction disturbances in the myocardium.
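
    A minimal discrete-cable sketch of the coupling-resistance effect discussed above: a chain of FitzHugh-Nagumo "cells" joined by a coupling conductance g (the inverse of the coupling resistance). Parameters are generic textbook values, not fitted to myocardium; lowering g slows conduction and, taken far enough, produces block.

```python
# 1-D chain of FitzHugh-Nagumo cells with diffusive (gap-junction-like)
# coupling g; smaller g ~ larger coupling resistance ~ slower conduction.
import numpy as np

def propagate(g, n=100, steps=20000, dt=0.01):
    v = np.full(n, -1.2)          # membrane variable at rest (approx.)
    w = np.full(n, -0.6)          # recovery variable at rest (approx.)
    v[:5] = 2.0                   # suprathreshold stimulus at the left end
    arrival = np.full(n, np.nan)  # first time each cell crosses v > 1
    for s in range(steps):
        coup = g * (np.roll(v, 1) + np.roll(v, -1) - 2 * v)
        coup[0] = g * (v[1] - v[0])          # no-flux ends
        coup[-1] = g * (v[-2] - v[-1])
        dv = v - v**3 / 3 - w + coup
        dw = 0.08 * (v + 0.7 - 0.8 * w)
        v += dt * dv
        w += dt * dw
        newly = np.isnan(arrival) & (v > 1.0)
        arrival[newly] = s * dt
    return arrival

for g in (0.5, 2.0):
    t_arr = propagate(g)
    if np.isnan(t_arr[80]):
        print(f"g={g}: conduction blocked before cell 80")
    else:
        print(f"g={g}: wavefront reaches cell 80 at t = {t_arr[80]:.1f}")
```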

  12. Computational simulator of robotic manipulators

    International Nuclear Information System (INIS)

    Leal, Alexandre S.; Campos, Tarcisio P.R.

    1995-01-01

    Robotic application for industrial plants is discussed and a computational model for a mechanical manipulator of three links is presented. A neural network feed-forward type has been used to model the dynamic control of the manipulator. A graphic interface was developed in C programming language as a virtual world in order to visualize and simulate the arm movements handling radioactive waste environment. (author). 7 refs, 5 figs

  13. Material characterization and computer model simulation of low density polyurethane foam used in a rodent traumatic brain injury model.

    Science.gov (United States)

    Zhang, Liying; Gurao, Manish; Yang, King H; King, Albert I

    2011-05-15

    Computer models of the head can be used to simulate the events associated with traumatic brain injury (TBI) and quantify biomechanical response within the brain. Marmarou's impact acceleration rodent model is a widely used experimental model of TBI mirroring axonal pathology in humans. The mechanical properties of the low-density polyurethane (PU) foam, an essential piece of energy management used in Marmarou's impact device, have not been fully characterized. The foam used in Marmarou's device was tested at seven strain rates ranging from quasi-static to dynamic (0.014-42.86 s⁻¹) to quantify the stress-strain relationships in compression. The recovery rate of the foam after cyclic compression was also determined through recovery periods of up to three weeks. The experimentally determined stress-strain curves were incorporated into a material model in an explicit Finite Element (FE) solver to validate the strain rate dependency of the FE foam model. Compression test results have shown that the foam used in the rodent impact acceleration model is strain-rate dependent. The foam has been found to be reusable for multiple impacts; however, the stress resistance of used foam is reduced to 70% of that of new foam. The FU_CHANG_FOAM material model in an FE solver has been found to be adequate to simulate this rate-sensitive foam. Copyright © 2011 Elsevier B.V. All rights reserved.

  14. An imaging-based computational model for simulating angiogenesis and tumour oxygenation dynamics

    Science.gov (United States)

    Adhikarla, Vikram; Jeraj, Robert

    2016-05-01

    Tumour growth, angiogenesis and oxygenation vary substantially among tumours and significantly impact their treatment outcome. Imaging provides a unique means of investigating these tumour-specific characteristics. Here we propose a computational model to simulate tumour-specific oxygenation changes based on the molecular imaging data. Tumour oxygenation in the model is reflected by the perfused vessel density. Tumour growth depends on its doubling time (T_d) and the imaged proliferation. Perfused vessel density recruitment rate depends on the perfused vessel density around the tumour (sMVD_tissue) and the maximum VEGF concentration for complete vessel dysfunctionality (VEGF_max). The model parameters were benchmarked to reproduce the dynamics of tumour oxygenation over its entire lifecycle, which is the most challenging test. Tumour oxygenation dynamics were quantified using the peak pO2 (pO2_peak) and the time to peak pO2 (t_peak). Sensitivity of tumour oxygenation to model parameters was assessed by changing each parameter by 20%. t_peak was found to be more sensitive to tumour cell line related doubling time (~30%) as compared to tissue vasculature density (~10%). On the other hand, pO2_peak was found to be similarly influenced by the above tumour- and vasculature-associated parameters (~30-40%). Interestingly, both pO2_peak and t_peak were only marginally affected by VEGF_max (~5%). The development of a poorly oxygenated (hypoxic) core with tumour growth increased VEGF accumulation, thus disrupting the vessel perfusion as well as further increasing hypoxia with time. The model, with its benchmarked parameters, is applied to hypoxia imaging data obtained using a [64Cu]Cu-ATSM PET scan of a mouse tumour, and the temporal development of the vasculature and hypoxia maps are shown. The work underscores the importance of using tumour-specific input for analysing tumour evolution. An extended model incorporating therapeutic effects can serve as a powerful tool for analysing

  15. Predicting knee replacement damage in a simulator machine using a computational model with a consistent wear factor.

    Science.gov (United States)

    Zhao, Dong; Sakoda, Hideyuki; Sawyer, W Gregory; Banks, Scott A; Fregly, Benjamin J

    2008-02-01

    Wear of ultrahigh molecular weight polyethylene remains a primary factor limiting the longevity of total knee replacements (TKRs). However, wear testing on a simulator machine is time consuming and expensive, making it impractical for iterative design purposes. The objectives of this paper were first, to evaluate whether a computational model using a wear factor consistent with the TKR material pair can predict accurate TKR damage measured in a simulator machine, and second, to investigate how choice of surface evolution method (fixed or variable step) and material model (linear or nonlinear) affect the prediction. An iterative computational damage model was constructed for a commercial knee implant in an AMTI simulator machine. The damage model combined a dynamic contact model with a surface evolution model to predict how wear plus creep progressively alter tibial insert geometry over multiple simulations. The computational framework was validated by predicting wear in a cylinder-on-plate system for which an analytical solution was derived. The implant damage model was evaluated for 5 million cycles of simulated gait using damage measurements made on the same implant in an AMTI machine. Using a pin-on-plate wear factor for the same material pair as the implant, the model predicted tibial insert wear volume to within 2% error and damage depths and areas to within 18% and 10% error, respectively. Choice of material model had little influence, while inclusion of surface evolution affected damage depth and area but not wear volume predictions. Surface evolution method was important only during the initial cycles, where variable step was needed to capture rapid geometry changes due to the creep. Overall, our results indicate that accurate TKR damage predictions can be made with a computational model using a constant wear factor obtained from pin-on-plate tests for the same material pair, and furthermore, that surface evolution method matters only during the initial
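
    The damage-update loop can be caricatured with an Archard-type law, where each batch of cycles adds wear depth k · p · s and the evolving surface feeds back into the contact pressures. The wear factor and loading numbers below are placeholders, and the pressure relaxation is a crude stand-in for the paper's contact-model update.

```python
# Archard-type wear update over batches of gait cycles; depth = k * p * s.
# All numbers are placeholders, not the paper's measured or fitted values.
import numpy as np

k_wear = 1.0e-10          # wear factor [mm^2/N] -- placeholder
cycles_per_update = 0.5e6
n_updates = 10            # 5 million cycles total

contact_pressure = np.array([8.0, 12.0, 6.0])   # [MPa] at 3 surface patches
sliding_per_cycle = np.array([4.0, 6.0, 3.0])   # [mm] per gait cycle
depth = np.zeros(3)

for _ in range(n_updates):
    # wear depth increment over this batch of cycles
    depth += k_wear * contact_pressure * sliding_per_cycle * cycles_per_update
    # surface evolution: the recessed geometry would be fed back into the
    # contact model here; as a crude stand-in, relax the pressures slightly
    contact_pressure *= 0.97

print("wear depth per patch [mm]:", np.round(depth, 4))
```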

  16. Four-bar linkage modelling in teleost pharyngeal jaws: computer simulations of bite kinetics

    Science.gov (United States)

    Grubich, Justin R; Westneat, Mark W

    2006-01-01

    The pharyngeal arches of the red drum (Sciaenops ocellatus) possess large toothplates and a complex musculoskeletal design for biting and crushing hard prey. The morphology of the pharyngeal apparatus is described from dissections of six specimens, with a focus on the geometric conformation of contractile and rotational elements. Four major muscles operate the rotational 4th epibranchial (EB4) and 3rd pharyngobranchial (PB3) elements to create pharyngeal bite force, including the levator posterior (LP), levator externus 3/4 (LE), obliquus posterior (OP) and 3rd obliquus dorsalis (OD). A biomechanical model of upper pharyngeal jaw biting is developed using lever mechanics and four-bar linkage theory from mechanical engineering. A pharyngeal four-bar linkage is proposed that involves the posterior skull as the fixed link, the LP muscle as input link, the epibranchial bone as coupler link and the toothed pharyngobranchial as output link. We used a computer model to simulate contraction of the four major muscles, with the LP as the dominant muscle, the length of which determined the position of the linkage. When modelling lever mechanics, we found that the effective mechanical advantages of the pharyngeal elements were low, resulting in little resultant bite force. By contrast, the force advantage of the four-bar linkage was relatively high, transmitting approximately 50% of the total muscle force to the bite between the toothplates. Pharyngeal linkage modelling enables quantitative functional morphometry of a key component of the fish feeding system, and the model is now available for ontogenetic and comparative analyses of fishes with pharyngeal linkage mechanisms. PMID:16822272
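
    The position analysis behind such a four-bar model reduces to a circle-intersection (vector-closure) problem: given the ground, input, coupler and output link lengths plus the input angle, the output-link angle follows directly. The sketch below uses arbitrary demo lengths, not the red drum's measured anatomy.

```python
# Planar four-bar position analysis by circle intersection.
# d: ground, a: input, b: coupler, c: output link lengths; theta: input angle.
import numpy as np

def fourbar_output_angle(d, a, b, c, theta):
    A = np.array([a * np.cos(theta), a * np.sin(theta)])   # end of input link
    O4 = np.array([d, 0.0])                                # output-link pivot
    r = A - O4
    L = np.linalg.norm(r)
    if not abs(c - b) <= L <= c + b:
        raise ValueError("linkage cannot close at this input angle")
    # intersect circle(O4, radius c) with circle(A, radius b)
    x = (c**2 - b**2 + L**2) / (2 * L)
    h = np.sqrt(max(c**2 - x**2, 0.0))
    base = O4 + x * r / L
    perp = np.array([-r[1], r[0]]) / L
    B = base + h * perp            # choose the "open" assembly branch
    return np.arctan2(B[1] - O4[1], B[0] - O4[0])

psi = fourbar_output_angle(d=2.0, a=1.0, b=2.2, c=1.5, theta=np.radians(70))
print(f"output link angle: {np.degrees(psi):.1f} deg")
```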

  17. Combination of inquiry learning model and computer simulation to improve mastery concept and the correlation with critical thinking skills (CTS)

    Science.gov (United States)

    Nugraha, Muhamad Gina; Kaniawati, Ida; Rusdiana, Dadi; Kirana, Kartika Hajar

    2016-02-01

    Among the purposes of physics learning at high school are mastering physics concepts, cultivating a scientific attitude (including a critical attitude), and developing inductive and deductive reasoning skills. According to Ennis et al., inductive and deductive reasoning skills are part of critical thinking. Preliminary studies show that both competencies are under-achieved, as seen from low student learning outcomes and from learning processes that are not conducive to cultivating critical thinking (teacher-centered learning). One learning model predicted to increase concept mastery and train CTS is the inquiry learning model aided by computer simulations. In this model, students are given the opportunity to be actively involved in the experiment and also to get a good explanation through the computer simulations. From research with a randomized control group pretest-posttest design, we found that the inquiry learning model aided by computer simulations can improve students' concept mastery significantly more than the conventional (teacher-centered) method. With the inquiry learning model aided by computer simulations, 20% of students had high CTS, 63.3% medium and 16.7% low. CTS contributes strongly to students' concept mastery, with a correlation coefficient of 0.697, and moderately to the enhancement of concept mastery, with a correlation coefficient of 0.603.

  18. Theoretical modeling and computational simulation of robust control for Mars aircraft

    Science.gov (United States)

    Oh, Seyool

    The focus of this dissertation is the development of control system design algorithms for autonomous operation of an aircraft in the Martian atmosphere. This research shows theoretical modeling and computational simulation of robust control and gain scheduling for a prototype Mars aircraft. A few hundred meters above the surface of Mars, the air density is less than 1% of the density of the Earth's atmosphere at sea level. However, at about 33 km (110,000 ft) above the Earth, the air density is similar to that near the surface of Mars. Marsflyer II was designed to investigate these flight regimes: 33 km above the Earth and the actual Mars environment. The fuselage of the preliminary design was cylindrical with a length of 2.59 m (8.49 ft); the wing span was 3.98 m (13.09 ft). The total weight of the demonstrator aircraft was around 4.54 kg (10.02 lb). Aircraft design tools have been developed based on successful aircraft for the Earth's atmosphere. However, above Mars an airborne robotic explorer would encounter low Reynolds number flow phenomena combined with high Mach numbers, a regime that is unknown for normal Earth aerodynamic applications. These flows are more complex than those occurring at high Reynolds numbers. The performance of airfoils at low Reynolds numbers is poorly understood and generally results in unfavorable aerodynamic characteristics. Design and simulation tools for the low Reynolds number Martian environment could be used to develop Unmanned Aerial Vehicles (UAVs). In this study, a robust control method is used to analyze a prototype Mars aircraft. The purpose of this aircraft is to demonstrate stability, control, and performance within a simulated Mars environment. Due to uncertainty regarding the actual Martian environment, flexibility in the operation of the aircraft's control system is important for successful performance. The stability and control derivatives of Marsflyer II were obtained by using the Advanced Aircraft Analysis (AAA

  19. LIAR -- A computer program for the modeling and simulation of high performance linacs

    Energy Technology Data Exchange (ETDEWEB)

    Assmann, R.; Adolphsen, C.; Bane, K.; Emma, P.; Raubenheimer, T.; Siemann, R.; Thompson, K.; Zimmermann, F.

    1997-04-01

    The computer program LIAR (LInear Accelerator Research Code) is a numerical modeling and simulation tool for high performance linacs. Amongst others, it addresses the needs of state-of-the-art linear colliders where low emittance, high-intensity beams must be accelerated to energies in the 0.05-1 TeV range. LIAR is designed to be used for a variety of different projects. LIAR allows the study of single- and multi-particle beam dynamics in linear accelerators. It calculates emittance dilutions due to wakefield deflections, linear and non-linear dispersion and chromatic effects in the presence of multiple accelerator imperfections. Both single-bunch and multi-bunch beams can be simulated. Several basic and advanced optimization schemes are implemented. Present limitations arise from the incomplete treatment of bending magnets and sextupoles. A major objective of the LIAR project is to provide an open programming platform for the accelerator physics community. Due to its design, LIAR allows straight-forward access to its internal FORTRAN data structures. The program can easily be extended and its interactive command language ensures maximum ease of use. Presently, versions of LIAR are compiled for UNIX and MS Windows operating systems. An interface for the graphical visualization of results is provided. Scientific graphs can be saved in the PS and EPS file formats. In addition a Mathematica interface has been developed. LIAR now contains more than 40,000 lines of source code in more than 130 subroutines. This report describes the theoretical basis of the program, provides a reference for existing features and explains how to add further commands. The LIAR home page and the ONLINE version of this manual can be accessed under: http://www.slac.stanford.edu/grp/arb/rwa/liar.htm.

  20. PLYMAP : a computer simulation model of the rotary peeled softwood plywood manufacturing process

    Science.gov (United States)

    Henry Spelter

    1990-01-01

    This report documents a simulation model of the plywood manufacturing process. Its purpose is to enable a user to make quick estimates of the economic impact of a particular process change within a mill. The program was designed to simulate the processing of plywood within a relatively simplified mill design. Within that limitation, however, it allows a wide range of...

  1. Computer simulation of liquid crystals

    International Nuclear Information System (INIS)

    McBride, C.

    1999-01-01

    Molecular dynamics simulation performed on modern computer workstations provides a powerful tool for the investigation of the static and dynamic characteristics of liquid crystal phases. In this thesis molecular dynamics computer simulations have been performed for two model systems. Simulations of 4,4'-di-n-pentyl-bibicyclo[2.2.2]octane demonstrate the growth of a structurally ordered phase directly from an isotropic fluid. This is the first time that this has been achieved for an atomistic model. The results demonstrate a strong coupling between orientational ordering and molecular shape, but indicate that the coupling between molecular conformational changes and molecular reorientation is relatively weak. Simulations have also been performed for a hybrid Gay-Berne/Lennard-Jones model resulting in thermodynamically stable nematic and smectic phases. Frank elastic constants have been calculated for the nematic phase formed by the hybrid model through analysis of the fluctuations of the nematic director, giving results comparable with those found experimentally. Work presented in this thesis also describes the parameterization of the torsional potential of a fragment of a dimethyl siloxane polymer chain, disiloxane diol (HOMe 2 Si) 2 O, using ab initio quantum mechanical calculations. (author)

  2. Computer simulation of electron beams

    Energy Technology Data Exchange (ETDEWEB)

    Sabchevski, S.P.; Mladenov, G.M. (Bylgarska Akademiya na Naukite, Sofia (Bulgaria). Inst. po Elektronika)

    1994-04-14

    Self-fields and forces as well as the local degree of space-charge neutralization in overcompensated electron beams are considered. The radial variation of the local degree of space-charge neutralization is analysed. A novel model which describes the equilibrium potential distribution in overcompensated beams is proposed and a method for computer simulation of the beam propagation is described. Results from numerical experiments which illustrate the propagation of finite emittance overneutralized beams are presented. (Author).

  3. Shear-induced aggregation or disaggregation in edible oils: Models, computer simulation, and USAXS measurements

    Science.gov (United States)

    Townsend, B.; Peyronel, F.; Callaghan-Patrachar, N.; Quinn, B.; Marangoni, A. G.; Pink, D. A.

    2017-12-01

    The effects of shear upon the aggregation of solid objects formed from solid triacylglycerols (TAGs) immersed in liquid TAG oils were modeled using Dissipative Particle Dynamics (DPD) and the predictions compared to experimental data using Ultra-Small Angle X-ray Scattering (USAXS). The solid components were represented by spheres interacting via attractive van der Waals forces and short-range repulsive forces. A velocity was applied to the liquid particles nearest the boundary, and Lees-Edwards boundary conditions were used to transmit this motion to non-boundary layers via dissipative interactions. The shear was created through the dissipative forces acting between liquid particles. Translational diffusion was simulated, and the Stokes-Einstein equation was used to relate DPD length and time scales to SI units for comparison with USAXS results. The SI values depended on how large the spherical particles were (250 nm vs. 25 nm). Aggregation was studied by (a) computing the structure function and (b) quantifying the number of pairs of solid spheres formed. Solid aggregation was found to be enhanced by low shear rates. As the shear rate was increased, a transition shear region was manifested in which aggregation was inhibited and shear banding was observed. Aggregation was inhibited, and eventually eliminated, by further increases in the shear rate. The magnitude of the transition-region shear, γ̇_t, depended on the size of the solid particles, which was confirmed experimentally.
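
    The Stokes-Einstein mapping used above can be sketched in a few lines. The sketch below is illustrative, not the authors' code: the oil viscosity, temperature and measured DPD diffusion coefficient are assumed values, chosen only to show why the SI scales differ for 250 nm and 25 nm spheres.

    ```python
    # Hedged sketch: map DPD time units to SI via the Stokes-Einstein relation
    # D = k_B * T / (6 * pi * eta * r). Viscosity, temperature and d_dpd are
    # illustrative assumptions, not values from the paper.
    import math

    K_B = 1.380649e-23  # Boltzmann constant, J/K

    def stokes_einstein_diffusion(radius_m, viscosity_pa_s, temperature_k):
        """Translational diffusion coefficient of a sphere in a liquid (m^2/s)."""
        return K_B * temperature_k / (6.0 * math.pi * viscosity_pa_s * radius_m)

    def dpd_time_unit(radius_m, viscosity_pa_s, temperature_k, d_dpd, r_dpd=1.0):
        """Seconds per DPD time unit, found by matching diffusion coefficients."""
        d_si = stokes_einstein_diffusion(radius_m, viscosity_pa_s, temperature_k)
        length_scale = radius_m / r_dpd          # metres per DPD length unit
        return d_dpd * length_scale ** 2 / d_si  # D has units length^2 / time

    # 250 nm vs. 25 nm solid spheres in an oil of assumed viscosity 0.05 Pa s:
    for r in (250e-9, 25e-9):
        tau = dpd_time_unit(r, 0.05, 298.0, d_dpd=0.3)
        print(f"radius {r * 1e9:.0f} nm -> one DPD time unit ~ {tau:.3e} s")
    ```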

  4. Computer simulation of martensitic transformations

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Ping [Univ. of California, Berkeley, CA (United States)

    1993-11-01

    The characteristics of martensitic transformations in solids are largely determined by the elastic strain that develops as martensite particles grow and interact. To study the development of microstructure, a finite-element computer simulation model was constructed to mimic the transformation process. The transformation is athermal and simulated at each incremental step by transforming the cell which maximizes the decrease in the free energy. To determine the free energy change, the elastic energy developed during martensite growth is calculated from the theory of linear elasticity for elastically homogeneous media, and updated as the transformation proceeds.
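
    The incremental scheme described above lends itself to a compact greedy loop. A minimal sketch under stated assumptions follows; `free_energy_change` stands in for the paper's linear-elasticity energy evaluation, and the toy version used here (a driving force opposed by a penalty growing with transformed fraction) is purely illustrative.

    ```python
    # Greedy athermal transformation: at each step transform the cell that
    # maximizes the decrease in free energy; stop when no cell lowers it.
    def simulate_transformation(cells, free_energy_change):
        state = {c: "austenite" for c in cells}
        while True:
            untransformed = [c for c in state if state[c] == "austenite"]
            if not untransformed:
                break
            best = min(untransformed, key=lambda c: free_energy_change(state, c))
            if free_energy_change(state, best) >= 0.0:
                break                    # athermal stop: no favourable cell left
            state[best] = "martensite"
        return state

    # Toy energy change: constant chemical driving force plus a penalty that
    # grows with the transformed fraction (a stand-in for elastic interaction).
    def toy_energy_change(state, cell):
        fraction = sum(v == "martensite" for v in state.values()) / len(state)
        return -1.0 + 2.5 * fraction

    final = simulate_transformation(range(100), toy_energy_change)
    print(sum(v == "martensite" for v in final.values()), "cells transformed")
    ```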

  5. Parallel Computing for Brain Simulation.

    Science.gov (United States)

    Pastur-Romay, L A; Porto-Pazos, A B; Cedron, F; Pazos, A

    2017-01-01

    The human brain is the most complex system in the known universe and therefore one of its greatest mysteries. It provides human beings with extraordinary abilities, yet how and why most of these abilities are produced is not yet understood. For decades, researchers have been trying to make computers reproduce these abilities, focusing both on understanding the nervous system and on processing data more efficiently than before. Their aim is to make computers process information similarly to the brain. Important technological developments and vast multidisciplinary projects have made it possible to create the first simulations with a number of neurons similar to that of a human brain. This paper presents an up-to-date review of the main research projects that are trying to simulate and/or emulate the human brain. They employ different types of computational models using parallel computing: digital, analog and hybrid models. This review includes the current applications of these works, as well as future trends. It focuses on works that seek advanced progress in neuroscience and on others that seek new discoveries in computer science (neuromorphic hardware, machine learning techniques). Their most outstanding characteristics are summarized and the latest advances and future plans are presented. In addition, this review points out the importance of considering not only neurons: computational models of the brain should also include glial cells, given the proven importance of astrocytes in information processing. Copyright © Bentham Science Publishers; for any queries, please email epub@benthamscience.org.

  6. High-resolution computational algorithms for simulating offshore wind turbines and farms: Model development and validation

    Energy Technology Data Exchange (ETDEWEB)

    Calderer, Antoni [Univ. of Minnesota, Minneapolis, MN (United States); Yang, Xiaolei [Stony Brook Univ., NY (United States); Angelidis, Dionysios [Univ. of Minnesota, Minneapolis, MN (United States); Feist, Chris [Univ. of Minnesota, Minneapolis, MN (United States); Guala, Michele [Univ. of Minnesota, Minneapolis, MN (United States); Ruehl, Kelley [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Guo, Xin [Univ. of Minnesota, Minneapolis, MN (United States); Boomsma, Aaron [Univ. of Minnesota, Minneapolis, MN (United States); Shen, Lian [Univ. of Minnesota, Minneapolis, MN (United States); Sotiropoulos, Fotis [Stony Brook Univ., NY (United States)

    2015-10-30

    The present project involves the development of modeling and analysis design tools for assessing offshore wind turbine technologies. The computational tools developed herein are able to resolve the effects of the coupled interaction of atmospheric turbulence and ocean waves on aerodynamic performance and structural stability and reliability of offshore wind turbines and farms. Laboratory scale experiments have been carried out to derive data sets for validating the computational models.

  7. Computer simulation model for the striped bass young-of-the-year population in the Hudson River

    International Nuclear Information System (INIS)

    Eraslan, A.H.; Van Winkle, W.; Sharp, R.D.; Christensen, S.W.; Goodyear, C.P.; Rush, R.M.; Fulkerson, W.

    1975-09-01

    This report presents a daily transient (tidal-averaged), longitudinally one-dimensional (cross-section-averaged) computer simulation model for assessing the entrainment and impingement impacts of power plant operations on young-of-the-year populations of the striped bass, Morone saxatilis, in the Hudson River.

  8. World Wide Web Access to Fluid Inclusion Data for Computational Modelling and Simulation

    Science.gov (United States)

    Mernagh, T. P.; Bastrakov, E.; Percival, D.; Girvan, S.; Wyborn, L. A.

    2007-12-01

    Accurate constraints on the chemistry of hydrothermal fluids are critical to our capacity to computationally model and simulate how ore deposits form. To maximize results and subsequent interpretation, fluid inclusion populations should be fully characterized using standardised observational and processing techniques. A Virtual Centre for Geofluids and Thermodynamic Data, which includes the fluid inclusion (FIncs) system, has been established to achieve this. The FIncs system is designed to pull together fluid inclusion data from many individual, often disparate studies. The FIncs database and web applications allow researchers to search and retrieve fluid inclusion data and images via a web browser interface. The database will help standardise the way fluid inclusion data and associated metadata are stored. Furthermore, it follows the principles outlined by the Open Geospatial Consortium (OGC) for the Observation and Measurement application schema, and is tightly coupled to it, formalising the observations and measurements made on fluid inclusions and standardising how these measurements are processed to achieve consistent constraints for geochemical models. FIncs uses both domain factual knowledge and problem-solving knowledge by providing a choice of models (equations of state) for obtaining additional fluid properties via a web-based calculator, which allows researchers to calculate isochoric T&P values and other chemical and physical properties (e.g. salinity and density). This method has the benefit of ensuring that all derived data are produced and standardised by a selected set of routines. It also enables data from multiple sources to be quickly reprocessed by new routines as they become available and are added to the database toolkit. The database is being developed as an "open" project, which intends to bring together researchers interested in the properties of geological fluids or fluid inclusions. The ultimate goal of the Virtual Centre

  9. Biomass Gasifier for Computer Simulation; Biomassaförgasare för Computer Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Hansson, Jens; Leveau, Andreas; Hulteberg, Christian [Nordlight AB, Limhamn (Sweden)

    2011-08-15

    This report is an effort to summarize the existing data on biomass gasifiers, as the authors have taken part in various projects aiming at computer simulations of systems that include biomass gasification. Reliable input data is paramount for any computer simulation, but so far no easily accessible biomass gasifier database has been available for this purpose. This study aims at benchmarking current and past gasifier systems in order to create a comprehensive database for computer simulation purposes. The result of the investigation is presented in a Microsoft Excel sheet, so that the user can easily implement the data in their specific model. In addition to providing simulation data, the technology is described briefly for every studied gasifier system. The primary pieces of information sought are temperatures, pressures, stream compositions and energy consumption. At present the database contains 17 gasifiers, with one or more gasifiers within each of the gasification technology types normally discussed in this context: 1. fixed bed, 2. fluidised bed, 3. entrained flow. It also contains gasifiers in the range from 100 kW to 120 MW, with several gasifiers in between these two values. Finally, there are gasifiers representing both direct and indirect heating. This allows for a more qualified and better available choice of starting data sets for simulations. In addition, with multiple data sets available for several of the operating modes, sensitivity analysis of various inputs will improve the simulations performed. There were fewer answers to the survey than hoped for, which could otherwise have improved the database further; however, the use of online sources and other public information has to some extent counterbalanced the low response rate. The database is intended to be a living document, continuously updated with new gasifiers and improved information on existing gasifiers.

  10. LIAR: A COMPUTER PROGRAM FOR THE SIMULATION AND MODELING OF HIGH PERFORMANCE LINACS

    International Nuclear Information System (INIS)

    Adolphsen, Chris

    2003-01-01

    The computer program LIAR ('LInear Accelerator Research code') is a numerical simulation and tracking program for linear colliders. The LIAR project was started at SLAC in August 1995 in order to provide a computing and simulation tool that specifically addresses the needs of high-energy linear colliders. LIAR is designed to be used for a variety of different linear accelerators. It has been applied to, and checked against, the existing Stanford Linear Collider (SLC) as well as the linacs of the proposed Next Linear Collider (NLC) and the proposed Linac Coherent Light Source (LCLS). The program includes wakefield effects, a 4D coupled beam description, specific optimization algorithms and other advanced features. We describe the most important concepts and highlights of the program. After having presented the LIAR program at the LINAC96 and PAC97 conferences, we now introduce it to the European particle accelerator community.

  11. Blood flow in intracranial aneurysms treated with Pipeline embolization devices: computational simulation and verification with Doppler ultrasonography on phantom models

    Directory of Open Access Journals (Sweden)

    Anderson Chun On Tsang

    2015-04-01

    Purpose: The aim of this study was to validate a computational fluid dynamics (CFD) simulation of flow-diverter treatment through Doppler ultrasonography measurements in patient-specific models of intracranial bifurcation and side-wall aneurysms. Methods: Computational and physical models of patient-specific bifurcation and side-wall aneurysms were constructed from computed tomography angiography with use of stereolithography, a three-dimensional printing technology. Flow dynamics parameters before and after flow-diverter treatment were measured with pulse-wave and color Doppler ultrasonography, and then compared with CFD simulations. Results: CFD simulations showed drastic flow reduction after flow-diverter treatment in both aneurysms. The mean volume flow rate decreased by 90% and 85% for the bifurcation aneurysm and the side-wall aneurysm, respectively. Velocity contour plots from computer simulations before and after flow diversion closely resembled the patterns obtained by color Doppler ultrasonography. Conclusion: The CFD estimation of flow reduction in aneurysms treated with a flow-diverting stent was verified by Doppler ultrasonography in patient-specific phantom models of bifurcation and side-wall aneurysms. The combination of CFD and ultrasonography may constitute a feasible and reliable technique for studying the treatment of intracranial aneurysms with flow-diverting stents.

  12. Multiscale paradigms in integrated computational materials science and engineering materials theory, modeling, and simulation for predictive design

    CERN Document Server

    Runge, Keith; Muralidharan, Krishna

    2016-01-01

    This book presents cutting-edge concepts, paradigms, and research highlights in the field of computational materials science and engineering, and provides a fresh, up-to-date perspective on solving present and future materials challenges. The chapters are written by not only pioneers in the fields of computational materials chemistry and materials science, but also experts in multi-scale modeling and simulation as applied to materials engineering. Pedagogical introductions to the different topics and continuity between the chapters are provided to ensure the appeal to a broad audience and to address the applicability of integrated computational materials science and engineering for solving real-world problems.

  13. Massively parallel quantum computer simulator

    NARCIS (Netherlands)

    De Raedt, K.; Michielsen, K.; De Raedt, H.; Trieu, B.; Arnold, G.; Richter, M.; Lippert, Th.; Watanabe, H.; Ito, N.

    2007-01-01

    We describe portable software to simulate universal quantum computers on massively parallel computers. We illustrate the use of the simulation software by running various quantum algorithms on different computer architectures, such as an IBM BlueGene/L, an IBM Regatta p690+, a Hitachi SR11000/J1, and a Cray
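
    For intuition about what such simulators compute, here is a toy single-process state-vector simulator (the parallel codes distribute the 2^n amplitudes over many nodes; this sketch keeps them in one NumPy array and is not the software described above).

    ```python
    # Toy state-vector quantum simulator: store all 2^n amplitudes and apply
    # single-qubit gates by reshaping the state into an n-way tensor.
    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

    def apply_gate(state, gate, target, n_qubits):
        """Apply a 2x2 gate to qubit `target` of an n-qubit state vector."""
        psi = state.reshape([2] * n_qubits)
        psi = np.tensordot(gate, psi, axes=([1], [target]))
        psi = np.moveaxis(psi, 0, target)   # restore qubit ordering
        return psi.reshape(-1)

    n = 3
    state = np.zeros(2 ** n, dtype=complex)
    state[0] = 1.0                          # start in |000>
    for q in range(n):                      # Hadamard on every qubit
        state = apply_gate(state, H, q, n)
    print(np.round(np.abs(state) ** 2, 3))  # eight equal probabilities of 0.125
    ```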

  14. Computer simulation of 2D grain growth using a cellular automata model based on the lowest energy principle

    International Nuclear Information System (INIS)

    He Yizhu; Ding Hanlin; Liu Liufa; Shin, Keesam

    2006-01-01

    The morphology, topology and kinetics of normal grain growth in two dimensions were studied by computer simulation using a cellular automata (CA) model based on the lowest energy principle. The thermodynamic energy that follows Maxwell-Boltzmann statistics has been introduced into this model for the calculation of energy change. The transition that can reduce the system energy to the lowest level is chosen to occur when there is more than one possible transition direction. The simulation results show that the kinetics of normal grain growth follows the Burke equation with the growth exponent m = 2. The analysis of topology further indicates that normal grain growth can be simulated fairly well by the present CA model. The vanishing of grains with different numbers of sides is discussed in the simulation
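
    A minimal sketch of a lowest-energy CA update of this kind is shown below (Potts-style orientations on a periodic grid; the grid size, orientation count and tie-breaking rule are assumptions, not the authors' exact model).

    ```python
    # Lowest-energy cellular automaton step for 2D grain growth: a randomly
    # chosen cell adopts the neighbouring orientation that minimises its
    # grain-boundary energy; only energy-reducing (or tied) moves occur.
    import random

    def neighbours(i, j, n):
        return [((i - 1) % n, j), ((i + 1) % n, j),
                (i, (j - 1) % n), (i, (j + 1) % n)]

    def boundary_energy(grid, i, j, orientation):
        """Number of unlike nearest neighbours (unit energy per segment)."""
        n = len(grid)
        return sum(grid[a][b] != orientation for a, b in neighbours(i, j, n))

    def ca_step(grid):
        n = len(grid)
        i, j = random.randrange(n), random.randrange(n)
        current = boundary_energy(grid, i, j, grid[i][j])
        candidates = {grid[a][b] for a, b in neighbours(i, j, n)}
        best = min(candidates, key=lambda o: boundary_energy(grid, i, j, o))
        delta = boundary_energy(grid, i, j, best) - current
        if delta < 0 or (delta == 0 and random.random() < 0.5):
            grid[i][j] = best   # take the transition with the lowest energy

    N, Q = 64, 32               # assumed grid size and initial orientation count
    grid = [[random.randrange(Q) for _ in range(N)] for _ in range(N)]
    for _ in range(200_000):
        ca_step(grid)
    print(len({o for row in grid for o in row}), "orientations remain")
    ```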

  15. Computational Analysis and Simulation of Empathic Behaviors: a Survey of Empathy Modeling with Behavioral Signal Processing Framework.

    Science.gov (United States)

    Xiao, Bo; Imel, Zac E; Georgiou, Panayiotis; Atkins, David C; Narayanan, Shrikanth S

    2016-05-01

    Empathy is an important psychological process that facilitates human communication and interaction. Enhancement of empathy has profound significance in a range of applications. In this paper, we review emerging directions of research on computational analysis of empathy expression and perception, as well as empathic interactions, including their simulation. We summarize the work on empathic expression analysis by the targeted signal modalities (e.g., text, audio, and facial expressions). We categorize empathy simulation studies into theory-based emotion space modeling and application-driven user and context modeling. We summarize challenges in the computational study of empathy, including conceptual framing and understanding of empathy, data availability, appropriate use and validation of machine learning techniques, and behavioral signal processing. Finally, we propose a unified view of empathy computation and offer a series of open problems for future research.

  16. First Steps in Computational Systems Biology: A Practical Session in Metabolic Modeling and Simulation

    Science.gov (United States)

    Reyes-Palomares, Armando; Sanchez-Jimenez, Francisca; Medina, Miguel Angel

    2009-01-01

    A comprehensive understanding of biological functions requires new systemic perspectives, such as those provided by systems biology. Systems biology approaches are hypothesis-driven and involve iterative rounds of model building, prediction, experimentation, model refinement, and development. Developments in computer science are allowing for ever…

  17. A multiprocessor computer simulation model employing a feedback scheduler/allocator for memory space and bandwidth matching and TMR processing

    Science.gov (United States)

    Bradley, D. B.; Irwin, J. D.

    1974-01-01

    A computer simulation model for a multiprocessor computer is developed that is useful for studying the problem of matching a multiprocessor's memory space, memory bandwidth, and numbers and speeds of processors with aggregate job set characteristics. The model assumes an input workload consisting of a set of recurrent jobs. It includes a feedback scheduler/allocator which attempts to improve system performance, through higher memory bandwidth utilization, by matching individual job requirements for space and bandwidth with space availability and estimates of bandwidth availability at the times of memory allocation. The simulation model includes provisions for specifying precedence relations among the jobs in a job set, and for specifying precedence execution of TMR (Triple Modular Redundant) and SIMPLEX (non-redundant) jobs.
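
    The space-and-bandwidth matching idea can be illustrated with a greedy admission sketch. Everything below (field names, capacities, the density heuristic) is hypothetical, offered only to make the matching problem concrete; it is not the 1974 model.

    ```python
    # Greedy allocator: admit jobs while both memory space and estimated
    # memory bandwidth remain, preferring bandwidth-dense jobs.
    from dataclasses import dataclass

    @dataclass
    class Job:
        name: str
        space: int        # memory words required
        bandwidth: float  # estimated bandwidth demand (words/cycle)

    def schedule(jobs, total_space, total_bandwidth):
        running, space_left, bw_left = [], total_space, total_bandwidth
        # Feedback-style heuristic: favour jobs using bandwidth densely per word.
        for job in sorted(jobs, key=lambda j: j.bandwidth / j.space, reverse=True):
            if job.space <= space_left and job.bandwidth <= bw_left:
                running.append(job.name)
                space_left -= job.space
                bw_left -= job.bandwidth
        return running

    jobs = [Job("A", 4000, 0.30), Job("B", 2500, 0.25), Job("C", 6000, 0.20)]
    print(schedule(jobs, total_space=10000, total_bandwidth=0.6))  # ['B', 'A']
    ```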

  18. Validation of the AVM Blast Computational Modeling and Simulation Tool Set

    Science.gov (United States)

    2015-08-04

    benches leverage Modelica to compute results, while others must rely on more specific software applications such as Abaqus, OpenFOAM, and LS-DYNA. In... semi-empirical model [4-8]. By using extensive landmine live-fire experimental data, the SFTC computes specific impulse as a function of standoff... through the right-hand or corkscrew rules, the normal of the segment. Finally, LoadDyna reads the geometrical mesh (mesh.k) data, specifically only

  19. Computer simulation of superionic fluorides

    CERN Document Server

    Castiglione, M

    2000-01-01

    In this thesis the nature of ion mobility in cryolite and lead fluoride based compounds is investigated by computer simulation. The phase transition of cryolite is characterised in terms of rotation of AlF₆ octahedra, and the conductive properties are shown to result from diffusion of the sodium ions. The two processes appear to be unrelated. Very good agreement with NMR experimental results is found. The Pb²⁺ ion has a very high polarisability, yet treatment of this property in previous simulations has been problematic. In this thesis a mor... experimentally gives an indication of the correlations between nearby defects is well-reproduced. The most stringent test of simulation model transferability is presented in the studies of lead tin fluoride, in which significant 'covalent' effects are apparent. Other similarly-structured compounds are also investigated, and the reasons behind the adoption of such an unusual layered structure, and the mobility and site occupation of the anions, are quantified.

  20. Computer simulation of oxides

    International Nuclear Information System (INIS)

    Rowley, A.

    1998-01-01

    An ionic interaction model is developed which accounts for the effects of the ionic environment upon the electron densities of both cations and anions, through changes in their size and shape, and is transferable between materials. These variations are represented by additional dynamical variables which are handled within the model using the techniques of the Car-Parrinello method. The model parameters are determined as far as possible by input from external ab initio electronic structure calculations directed at examining the individual effects of the ionic environment upon the ions, particularly the oxide ion. Techniques for the evaluation of dipolar and quadrupolar Ewald sums in non-cubic simulation cells and the calculation of the pressure due to the terms in the potential are presented. This model is applied to the description of the perfect crystal properties and phonon dispersion curves of MgO. Consideration of the high-symmetry phonon modes allows parameterization of the remaining model parameters in an unambiguous fashion. The same procedure is used to obtain parameters for CaO. These two parameter sets are examined to determine how they may be used to generate the parameters for SrO, and simple scaling relationships based on ionic radii and polarizabilities are formulated. The transferability of the model to Cr₂O₃ is investigated using parameters generated from the alkaline earth oxides. The importance of lower-symmetry model terms, particularly quadrupolar interactions, at the low-symmetry ion sites in the crystal structure is demonstrated. The correct ground-state crystal structure is predicted and the calculated surface energies and relaxation phenomena are found to agree well with previous ab initio studies. The model is applied to GeO₂ as a strong test of its applicability to ion environments far different from those encountered in MgO. A good description of the crystal structures is obtained and the interplay of dipolar and quadrupolar effects is

  1. Simple Urban Simulation Atop Complicated Models: Multi-Scale Equation-Free Computing of Sprawl Using Geographic Automata

    Directory of Open Access Journals (Sweden)

    Yu Zou

    2013-07-01

    Reconciling competing desires to build urban models that can be simple and complicated is something of a grand challenge for urban simulation. It also prompts difficulties in many urban policy situations, such as urban sprawl, where simple, actionable ideas may need to be considered in the context of the messily complex and complicated urban processes and phenomena that work within cities. In this paper, we present a novel architecture for achieving both simple and complicated realizations of urban sprawl in simulation. Fine-scale simulations of sprawl geography are run using geographic automata to represent the geographical drivers of sprawl in intricate detail and over fine resolutions of space and time. We use Equation-Free computing to deploy population as a coarse observable of sprawl, which can be leveraged to run the automata-based models as short-burst experiments within a meta-simulation framework.
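
    The Equation-Free step the authors describe — short bursts of the fine-scale model followed by a projective leap of the coarse observable — can be sketched as follows. The noisy logistic `fine_step` is a stand-in toy, not the paper's geographic automata, and the burst and leap lengths are assumptions.

    ```python
    # Coarse projective integration: simulate the expensive fine-scale model
    # briefly, estimate d(population)/dt, extrapolate forward, re-initialise.
    import random

    def fine_step(pop):
        """Toy fine-scale model: noisy logistic growth of sprawl population."""
        return pop + 0.02 * pop * (1.0 - pop / 1e6) + random.gauss(0.0, 50.0)

    def coarse_projective_integration(pop, burst=10, leap=40, horizon=400):
        t, history = 0, [(0, pop)]
        while t < horizon:
            start = pop
            for _ in range(burst):            # 1) short fine-scale burst
                pop = fine_step(pop)
            dpop_dt = (pop - start) / burst   # 2) coarse time derivative
            pop += dpop_dt * leap             # 3) projective leap
            t += burst + leap
            history.append((t, pop))
        return history

    for t, p in coarse_projective_integration(2e5):
        print(f"t={t:4d}  population={p:,.0f}")
    ```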

  2. 20170312 - Computer Simulation of Developmental ...

    Science.gov (United States)

    Rationale: Recent progress in systems toxicology and synthetic biology has paved the way to new thinking about in vitro/in silico modeling of developmental processes and toxicities, for both embryological and reproductive impacts. Novel in vitro platforms such as 3D organotypic culture models, engineered microscale tissues and complex microphysiological systems (MPS), together with computational models and computer simulation of tissue dynamics, lend themselves to integrated testing strategies for predictive toxicology. As these emergent methodologies continue to evolve, they must be integrally tied to maternal/fetal physiology and toxicity of the developing individual across early lifestage transitions, from fertilization to birth, through puberty and beyond. Scope: This symposium will focus on how the novel technology platforms can help, now and in the future, with in vitro/in silico modeling of complex biological systems for developmental and reproductive toxicity issues, and with translating systems models into integrative testing strategies. The symposium is based on three main organizing principles: (1) that novel in vitro platforms with human cells configured in nascent tissue architectures with native microphysiological environments yield mechanistic understanding of developmental and reproductive impacts of drug/chemical exposures; (2) that novel in silico platforms with high-throughput screening (HTS) data, biologically-inspired computational models of

  3. A Gaussian mixture model based adaptive classifier for fNIRS brain-computer interfaces and its testing via simulation

    Science.gov (United States)

    Li, Zheng; Jiang, Yi-han; Duan, Lian; Zhu, Chao-zhe

    2017-08-01

    Objective. Functional near infra-red spectroscopy (fNIRS) is a promising brain imaging technology for brain-computer interfaces (BCI). Future clinical uses of fNIRS will likely require operation over long time spans, during which neural activation patterns may change. However, current decoders for fNIRS signals are not designed to handle changing activation patterns. The objective of this study is to test via simulations a new adaptive decoder for fNIRS signals, the Gaussian mixture model adaptive classifier (GMMAC). Approach. GMMAC can simultaneously classify and track activation pattern changes without the need for ground-truth labels. This adaptive classifier uses computationally efficient variational Bayesian inference to label new data points and update mixture model parameters, using the previous model parameters as priors. We test GMMAC in simulations in which neural activation patterns change over time and compare to static decoders and unsupervised adaptive linear discriminant analysis classifiers. Main results. Our simulation experiments show GMMAC can accurately decode under time-varying activation patterns: shifts of activation region, expansions of activation region, and combined contractions and shifts of activation region. Furthermore, the experiments show the proposed method can track the changing shape of the activation region. Compared to prior work, GMMAC performed significantly better than the other unsupervised adaptive classifiers on a difficult activation pattern change simulation: 99% versus […] brain-computer interfaces, including neurofeedback training systems, where operation over long time spans is required.
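
    The adaptive idea — classify each new point, then let the winning component's parameters drift toward it with the previous parameters acting as the prior — can be illustrated with a deliberately simplified stand-in. The paper uses variational Bayesian inference; the sketch below uses a plain responsibility-weighted online update, and all names and numbers are illustrative.

    ```python
    # Simplified online Gaussian-mixture classifier that tracks slow drift in
    # activation patterns without ground-truth labels (NOT the paper's
    # variational Bayes method; a didactic MAP-style stand-in).
    import numpy as np

    class OnlineGMMClassifier:
        def __init__(self, means, var=1.0, lr=0.05):
            self.means = np.asarray(means, dtype=float)  # one mean per class
            self.var = var    # shared isotropic variance
            self.lr = lr      # update strength

        def responsibilities(self, x):
            d2 = np.sum((self.means - x) ** 2, axis=1)
            w = np.exp(-0.5 * d2 / self.var)
            return w / w.sum()

        def classify_and_adapt(self, x):
            x = np.asarray(x, dtype=float)
            r = self.responsibilities(x)
            # Previous means act as the prior; each mean moves in proportion
            # to its responsibility, tracking activation-pattern drift.
            self.means += self.lr * r[:, None] * (x - self.means)
            return int(np.argmax(r))

    clf = OnlineGMMClassifier(means=[[0.0, 0.0], [3.0, 3.0]])
    drifting = [(0.1 * k, 0.1 * k) for k in range(20)]  # class-0 pattern drifts
    print([clf.classify_and_adapt(p) for p in drifting])
    ```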

  4. Patient flow within UK emergency departments: a systematic review of the use of computer simulation modelling methods

    Science.gov (United States)

    Mohiuddin, Syed; Busby, John; Savović, Jelena; Richards, Alison; Northstone, Kate; Hollingworth, William; Donovan, Jenny L; Vasilakis, Christos

    2017-05-09

    Overcrowding in the emergency department (ED) is common in the UK as in other countries worldwide. Computer simulation is one approach used for understanding the causes of ED overcrowding and assessing the likely impact of changes to the delivery of emergency care. However, little is known about the usefulness of computer simulation for analysis of ED patient flow. We undertook a systematic review to investigate the different computer simulation methods and their contribution for analysis of patient flow within EDs in the UK. We searched eight bibliographic databases (MEDLINE, EMBASE, COCHRANE, WEB OF SCIENCE, CINAHL, INSPEC, MATHSCINET and ACM DIGITAL LIBRARY) from date of inception until 31 March 2016. Studies were included if they used a computer simulation method to capture patient progression within the ED of an established UK National Health Service hospital. Studies were summarised in terms of simulation method, key assumptions, input and output data, conclusions drawn and implementation of results. Twenty-one studies met the inclusion criteria. Of these, 19 used discrete event simulation and 2 used system dynamics models. The purpose of many of these studies (n=16; 76%) centred on service redesign. Seven studies (33%) provided no details about the ED being investigated. Most studies (n=18; 86%) used specific hospital models of ED patient flow. Overall, the reporting of underlying modelling assumptions was poor. Nineteen studies (90%) considered patient waiting or throughput times as the key outcome measure. Twelve studies (57%) reported some involvement of stakeholders in the simulation study. However, only three studies (14%) reported on the implementation of changes supported by the simulation. We found that computer simulation can provide a means to pretest changes to ED care delivery before implementation in a safe and efficient manner. However, the evidence base is small and poorly developed. There are some methodological, data, stakeholder
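
    Since 19 of the 21 included studies used discrete event simulation, a minimal self-contained DES of ED patient flow may help fix ideas. The arrival rate, treatment rate and cubicle count below are illustrative assumptions, not figures from any reviewed study.

    ```python
    # Minimal discrete event simulation of an ED: Poisson arrivals, a fixed
    # number of treatment cubicles, exponential treatment times, FIFO queue.
    import heapq
    import random

    def simulate_ed(n_patients=1000, cubicles=5,
                    arrival_rate=0.2, service_rate=0.08):  # per-minute rates
        t, free, queue, waits, events = 0.0, cubicles, [], [], []
        for _ in range(n_patients):               # pre-generate arrivals
            t += random.expovariate(arrival_rate)
            heapq.heappush(events, (t, "arrive", t))
        while events:
            now, kind, stamp = heapq.heappop(events)
            if kind == "arrive":
                queue.append(stamp)               # join the FIFO queue
            else:
                free += 1                         # treatment done, cubicle freed
            while free and queue:                 # start treating waiting patients
                arrived = queue.pop(0)
                waits.append(now - arrived)
                free -= 1
                heapq.heappush(events, (now + random.expovariate(service_rate),
                                        "depart", arrived))
        return sum(waits) / len(waits)

    print(f"mean wait: {simulate_ed():.1f} minutes")
    ```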

  5. A high performance computing framework for physics-based modeling and simulation of military ground vehicles

    Science.gov (United States)

    Negrut, Dan; Lamb, David; Gorsich, David

    2011-06-01

    This paper describes a software infrastructure made up of tools and libraries designed to assist developers in implementing computational dynamics applications running on heterogeneous and distributed computing environments. Together, these tools and libraries compose a so-called Heterogeneous Computing Template (HCT). The heterogeneous and distributed computing hardware infrastructure is assumed herein to be made up of a combination of CPUs and Graphics Processing Units (GPUs). The computational dynamics applications targeted to execute on such a hardware topology include many-body dynamics, smoothed-particle hydrodynamics (SPH) fluid simulation, and fluid-solid interaction analysis. The underlying theme of the solution approach embraced by HCT is that of partitioning the domain of interest into a number of subdomains that are each managed by a separate core/accelerator (CPU/GPU) pair. Five components at the core of HCT enable the envisioned distributed computing approach to large-scale dynamical system simulation: (a) the ability to partition the problem according to the one-to-one mapping, i.e., spatial subdivision, discussed above (pre-processing); (b) a protocol for passing data between any two co-processors; (c) algorithms for element proximity computation; and (d) the ability to carry out post-processing in a distributed fashion. In this contribution the components (a) and (b) of the HCT are demonstrated via the example of the Discrete Element Method (DEM) for rigid body dynamics with friction and contact. The collision detection task required in frictional-contact dynamics (task (c) above) is shown to benefit on the GPU from a two-order-of-magnitude gain in efficiency when compared to traditional sequential implementations. Note: Reference herein to any specific commercial products, process, or service by trade name, trademark, manufacturer, or otherwise, does not imply its endorsement, recommendation, or favoring by the United States Army. The views and

  6. Efficient computation of electrograms and ECGs in human whole heart simulations using a reaction-eikonal model

    Science.gov (United States)

    Neic, Aurel; Campos, Fernando O.; Prassl, Anton J.; Niederer, Steven A.; Bishop, Martin J.; Vigmond, Edward J.; Plank, Gernot

    2017-10-01

    Anatomically accurate and biophysically detailed bidomain models of the human heart have proven a powerful tool for gaining quantitative insight into the links between electrical sources in the myocardium and the concomitant current flow in the surrounding medium, as they represent their relationship mechanistically based on first principles. Such models are increasingly considered as a clinical research tool with the perspective of being used, ultimately, as a complementary diagnostic modality. An important prerequisite in many clinical modeling applications is the ability of models to faithfully replicate potential maps and electrograms recorded from a given patient. However, while the personalization of electrophysiology models based on the gold-standard bidomain formulation is in principle feasible, the associated computational expenses are significant, rendering their use incompatible with clinical time frames. In this study we report on the development of a novel computationally efficient reaction-eikonal (R-E) model for modeling extracellular potential maps and electrograms. Using a biventricular human electrophysiology model, which incorporates a topologically realistic His-Purkinje system (HPS), we demonstrate by comparing against a high-resolution reaction-diffusion (R-D) bidomain model that the R-E model predicts extracellular potential fields, electrograms as well as ECGs at the body surface with high fidelity, and offers computational savings of more than three orders of magnitude. Due to their efficiency, R-E models are ideally suited for forward simulations in clinical modeling studies which attempt to personalize electrophysiological model features.

  7. Simulating Idiopathic Parkinson’s Disease by In Vitro and Computational Models

    NARCIS (Netherlands)

    Heida, Tjitske; Stegenga, Jan; Lourens, Marcel; Meijer, Hil; van Gils, Stephan; Lazarov, Nikolai; Marani, Enrico; Naik, Ganesh R.

    2012-01-01

    In general there is a wide gap between experimental animal results, especially with respect to neuroanatomical data, and computational modeling. In order to be able to investigate the anatomical and functional properties of afferent and efferent connections between the different nuclei of the basal

  8. SEISMIC SIMULATIONS USING PARALLEL COMPUTING AND THREE-DIMENSIONAL EARTH MODELS TO IMPROVE NUCLEAR EXPLOSION PHENOMENOLOGY AND MONITORING

    Energy Technology Data Exchange (ETDEWEB)

    Rodgers, A; Matzel, E; Pasyanos, M; Petersson, A; Sjogreen, B; Bono, C; Vorobiev, O; Antoun, T; Walter, W; Myers, S; Lomov, I

    2008-07-07

    The development of accurate numerical methods to simulate wave propagation in three-dimensional (3D) earth models and advances in computational power offer exciting possibilities for modeling the motions excited by underground nuclear explosions. This presentation will describe recent work to use new numerical techniques and parallel computing to model earthquakes and underground explosions to improve understanding of the wave excitation at the source and path-propagation effects. Firstly, we are using the spectral element method (SEM, SPECFEM3D code of Komatitsch and Tromp, 2002) to model earthquakes and explosions at regional distances using available 3D models. SPECFEM3D simulates anelastic wave propagation in fully 3D earth models in spherical geometry with the ability to account for free surface topography, anisotropy, ellipticity, rotation and gravity. Results show in many cases that 3D models are able to reproduce features of the observed seismograms that arise from path-propagation effects (e.g. enhanced surface wave dispersion, refraction, amplitude variations from focusing and defocusing, tangential component energy from isotropic sources). We are currently investigating the ability of different 3D models to predict path-specific seismograms as a function of frequency. A number of models developed using a variety of methodologies are available for testing. These include the WENA/Unified model of Eurasia (e.g. Pasyanos et al 2004), the global CUB 2.0 model (Shapiro and Ritzwoller, 2002), the partitioned waveform model for the Mediterranean (van der Lee et al., 2007) and stochastic models of the Yellow Sea Korean Peninsula region (Pasyanos et al., 2006). Secondly, we are extending our Cartesian anelastic finite difference code (WPP of Nilsson et al., 2007) to model the effects of free-surface topography. WPP models anelastic wave propagation in fully 3D earth models using mesh refinement to increase computational speed and improve memory efficiency. Thirdly

  9. Reducing the computational requirements for simulating tunnel fires by combining multiscale modelling and multiple processor calculation

    DEFF Research Database (Denmark)

    Vermesi, Izabella; Rein, Guillermo; Colella, Francesco

    2017-01-01

    in FDS version 6.0, a widely used fire-specific, open-source CFD software. Furthermore, it compares the reduction in simulation time given by multiscale modelling with that given by the use of multiple-processor calculation. This was done using a 1200 m long tunnel with a rectangular cross... ...processor calculation (97% faster when using a single mesh and multiscale modelling; only 46% faster when using the full tunnel and multiple meshes). In summary, it was found that multiscale modelling with FDS v.6.0 is feasible, and the combination of multiple meshes and multiscale modelling was established...

  10. Modeling of tool-tissue interactions for computer-based surgical simulation: a literature review

    NARCIS (Netherlands)

    Misra, Sarthak; Ramesh, K.T.; Okamura, Allison M.

    2008-01-01

    Surgical simulators present a safe and potentially effective method for surgical training, and can also be used in robot-assisted surgery for pre- and intra-operative planning. Accurate modeling of the interaction between surgical instruments and organs has been recognized as a key requirement in

  11. A computer simulation model for the practical planning of cervical cancer screening programmes.

    OpenAIRE

    Parkin, D. M.

    1985-01-01

    There is ample evidence of the efficacy of cytological screening in the prevention of cervical cancer but disagreement on the form which screening programmes should take. Simulation models have been used as a convenient and rapid method of exploring the outcome of different screening policies and of demonstrating the importance and interrelationships of the variables concerned. However, most such models are either too abstract or too simplistic to be of practical value in planning screening p...

  12. Simulation Modeling of Lakes in Undergraduate and Graduate Classrooms Increases Comprehension of Climate Change Concepts and Experience with Computational Tools

    Science.gov (United States)

    Carey, Cayelan C.; Gougis, Rebekka Darner

    2017-02-01

    Ecosystem modeling is a critically important tool for environmental scientists, yet is rarely taught in undergraduate and graduate classrooms. To address this gap, we developed a teaching module that exposes students to a suite of modeling skills and tools (including computer programming, numerical simulation modeling, and distributed computing) that students apply to study how lakes around the globe are experiencing the effects of climate change. In the module, students develop hypotheses about the effects of different climate scenarios on lakes and then test their hypotheses using hundreds of model simulations. We taught the module in a 4-hour workshop and found that participation in the module significantly increased both undergraduate and graduate students' understanding about climate change effects on lakes. Moreover, participation in the module also significantly increased students' perceived experience level in using different software, technologies, and modeling tools. By embedding modeling in an environmental science context, non-computer science students were able to successfully use and master technologies that they had previously never been exposed to. Overall, our findings suggest that modeling is a powerful tool for catalyzing student learning on the effects of climate change.

  13. Finite element analysis of TAVI: Impact of native aortic root computational modeling strategies on simulation outcomes.

    Science.gov (United States)

    Finotello, Alice; Morganti, Simone; Auricchio, Ferdinando

    2017-09-01

    In the last few years, several studies, each with a different aim and level of modeling detail, have been proposed to investigate transcatheter aortic valve implantation (TAVI) with finite elements. The present work focuses on the patient-specific finite element modeling of the aortic valve complex. In particular, we aim at investigating how different modeling strategies, in terms of material models/properties and discretization procedures, can impact analysis results. Four different choices both for the mesh size (from ∼20k elements to ∼200k elements) and for the material model (from rigid to hyperelastic anisotropic) are considered. Different approaches for modeling calcifications are also taken into account. Post-operative CT data of the real implant are used as the reference solution, with the aim of outlining a trade-off between computational model complexity and reliability of the results. Copyright © 2017 IPEM. Published by Elsevier Ltd. All rights reserved.

  14. Computational human body models

    NARCIS (Netherlands)

    Wismans, J.S.H.M.; Happee, R.; Dommelen, J.A.W. van

    2005-01-01

    Computational human body models are widely used for automotive crash-safety research and design, and as such have significantly contributed to a reduction of traffic injuries and fatalities. Currently, crash simulations are mainly performed using models based on crash dummies. However, crash dummies

  15. Accounting Principles are Simulated on Quantum Computers

    OpenAIRE

    Diep, Do Ngoc; Giang, Do Hoang

    2005-01-01

    The paper is devoted to a new idea of simulating accounting by quantum computing. We expose the actual accounting principles in a pure mathematics language. After that, we simulate the accounting principles on quantum computers. We show that all arbitrary accounting actions are exhausted by the described basic actions. The main problems of accounting are reduced to a system of linear equations in the economic model of Leontief. In this simulation we use our constructed quantum Gauß-Jor...

  16. Grid computing and biomolecular simulation.

    Science.gov (United States)

    Woods, Christopher J; Ng, Muan Hong; Johnston, Steven; Murdock, Stuart E; Wu, Bing; Tai, Kaihsu; Fangohr, Hans; Jeffreys, Paul; Cox, Simon; Frey, Jeremy G; Sansom, Mark S P; Essex, Jonathan W

    2005-08-15

    Biomolecular computer simulations are now widely used not only in an academic setting to understand the fundamental role of molecular dynamics on biological function, but also in the industrial context to assist in drug design. In this paper, two applications of Grid computing to this area will be outlined. The first, involving the coupling of distributed computing resources to dedicated Beowulf clusters, is targeted at simulating protein conformational change using the Replica Exchange methodology. In the second, the rationale and design of a database of biomolecular simulation trajectories is described. Both applications illustrate the increasingly important role modern computational methods are playing in the life sciences.
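
    The Replica Exchange methodology mentioned above periodically attempts to swap configurations between replicas held at different temperatures. A sketch of the standard Metropolis acceptance test follows, with illustrative energies and temperatures (not values from the paper).

    ```python
    # Replica exchange swap test: accept a swap with probability
    # min(1, exp((beta_i - beta_j) * (E_i - E_j))).
    import math
    import random

    def attempt_swap(beta_i, e_i, beta_j, e_j):
        """Metropolis criterion for exchanging two replica configurations."""
        delta = (beta_i - beta_j) * (e_i - e_j)
        return delta >= 0 or random.random() < math.exp(delta)

    k_b = 0.0019872                 # Boltzmann constant, kcal/(mol K)
    beta_cold = 1.0 / (k_b * 300.0)
    beta_hot = 1.0 / (k_b * 400.0)
    # A hot replica holding a lower-energy state is likely to hand it down:
    print(attempt_swap(beta_cold, -120.0, beta_hot, -135.0))
    ```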

  17. Scientific Modeling and simulations

    CERN Document Server

    Diaz de la Rubia, Tomás

    2009-01-01

    Showcases the conceptual advantages of modeling which, coupled with the unprecedented computing power available through simulations, allows scientists to tackle the formidable problems of our society, such as the search for hydrocarbons, understanding the structure of a virus, or the intersection between simulations and real data in extreme environments

  18. A two-state stochastic model for nanoparticle self-assembly: theory, computer simulations and applications

    International Nuclear Information System (INIS)

    Schwen, E M; Mazilu, I; Mazilu, D A

    2015-01-01

    We introduce a stochastic cooperative model for particle deposition and evaporation relevant to ionic self-assembly of nanoparticles with applications in surface fabrication and nanomedicine, and present a method for mapping our model onto the Ising model. The mapping process allows us to use the established results for the Ising model to describe the steady-state properties of our system. After completing the mapping process, we investigate the time dependence of particle density using the mean field approximation. We complement this theoretical analysis with Monte Carlo simulations that support our model. These techniques, which can be used separately or in combination, are useful as pedagogical tools because they are tractable mathematically and they apply equally well to many other physical systems with nearest-neighbour interactions including voter and epidemic models. (paper)
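
    A Monte Carlo sketch of a cooperative deposition-evaporation lattice model of this general type is given below (1D ring, illustrative rates; occupied/empty sites play the role of up/down Ising spins in the mapping the abstract describes). It is a didactic stand-in, not the authors' exact rates.

    ```python
    # Two-state lattice: particles deposit onto empty sites and evaporate from
    # occupied ones, with occupied neighbours suppressing evaporation
    # (nearest-neighbour cooperativity).
    import random

    def mc_sweep(lattice, p_dep=0.5, p_evap=0.8, coupling=0.4):
        n = len(lattice)
        for _ in range(n):
            i = random.randrange(n)
            occ = lattice[(i - 1) % n] + lattice[(i + 1) % n]
            if lattice[i] == 0:
                if random.random() < p_dep:
                    lattice[i] = 1                           # deposition
            elif random.random() < p_evap * (1 - coupling) ** occ:
                lattice[i] = 0                               # evaporation

    lattice = [0] * 200
    for _ in range(500):
        mc_sweep(lattice)
    print("steady-state density ~", sum(lattice) / len(lattice))
    ```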

  19. Reduced-order modeling (ROM) for simulation and optimization powerful algorithms as key enablers for scientific computing

    CERN Document Server

    Milde, Anja; Volkwein, Stefan

    2018-01-01

    This edited monograph collects research contributions addressing the advancement of efficient numerical procedures in the area of model order reduction (MOR) for simulation, optimization and control. The topical scope includes, but is not limited to, new out-of-the-box algorithmic solutions for scientific computing, e.g. reduced basis methods for industrial problems and MOR approaches for electrochemical processes. The target audience comprises research experts and practitioners in the field of simulation, optimization and control, but the book may also benefit graduate students.

  20. Applications of soft computing in time series forecasting simulation and modeling techniques

    CERN Document Server

    Singh, Pritpal

    2016-01-01

    This book reports on an in-depth study of fuzzy time series (FTS) modeling. It reviews and summarizes previous research work in FTS modeling and also provides a brief introduction to other soft-computing techniques, such as artificial neural networks (ANNs), rough sets (RS) and evolutionary computing (EC), focusing on how these techniques can be integrated into different phases of the FTS modeling approach. In particular, the book describes novel methods resulting from the hybridization of FTS modeling approaches with neural networks and particle swarm optimization. It also demonstrates how a new ANN-based model can be successfully applied in the context of predicting Indian summer monsoon rainfall. Thanks to its easy-to-read style and the clear explanations of the models, the book can be used as a concise yet comprehensive reference guide to fuzzy time series modeling, and will be valuable not only for graduate students, but also for researchers and professionals working for academic, business and governmen...

  1. Modeling and simulation of membrane separation process using computational fluid dynamics

    Directory of Open Access Journals (Sweden)

    Kambiz Tahvildari

    2016-01-01

    Separation of CO2 from air was simulated in this work. The process considered for removal of CO2 was a hollow-fiber membrane contactor with an aqueous solution of 2-amino-2-methyl-1-propanol (AMP) as absorbent. The model was developed based on mass transfer as well as chemical reaction for CO2 and solvent in the contactor. The model equations were solved using the finite element method. Simulation results were compared with experimental data, and good agreement was observed. The results revealed that increasing solvent velocity enhances removal of CO2 in the hollow-fiber membrane contactor. Moreover, it was found that counter-current operation is more favorable for achieving the highest separation efficiency.

  2. Automated Simulation Model Generation

    NARCIS (Netherlands)

    Huang, Y.

    2013-01-01

    One of today's challenges in the field of modeling and simulation is to model increasingly larger and more complex systems. Complex models take long to develop and incur high costs. With the advances in data collection technologies and more popular use of computer-aided systems, more data has become

  3. Development of a computer model (REASON) for the simulation of behavioural decisions on the basis of inference and valuation processes

    International Nuclear Information System (INIS)

    Engemann, A.; Radtke, M.; Sachs, S.

    1981-07-01

    A computer model for the simulation of behavioural decisions and the preceding inference and valuation processes is under development under the sponsorship of the 'Stiftung Volkswagenwerk'. The present paper describes the basic ideas of the model from both the psychological and the mathematical point of view. The interdisciplinary character of the project is demonstrated quite clearly. In a semantic network, which contains knowledge, values and standards related to the field under consideration, feasible actions and their consequences are evaluated. According to the behavioural model of Ajzen and Fishbein, valuations of the consequences are multiplied by the expectations and added up. The command language for the program allows an algebraic definition of the Fishbein formula. The concept, consisting of object structures, predicates and implications, can be described in nearly natural language. In addition to this 'expected utility model', the program provides the possibility of excluding options by thresholds, for the simulation of simplistic heuristics. (orig.)
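
    The Ajzen-Fishbein rule the abstract refers to is an expectancy-value sum: the attitude toward an action is the sum over its consequences of expectation times valuation. A minimal sketch follows, with hypothetical actions and a threshold-based exclusion step standing in for the model's 'simplistic heuristics'.

    ```python
    # Expectancy-value (Fishbein) scoring plus a veto threshold that excludes
    # any action with a catastrophic consequence before maximising.
    def fishbein_score(consequences):
        """consequences: list of (expectation, valuation) pairs."""
        return sum(e * v for e, v in consequences)

    def choose(actions, veto_threshold=-5.0):
        admissible = {
            name: cons for name, cons in actions.items()
            if all(v > veto_threshold for _, v in cons)
        }
        return max(admissible, key=lambda name: fishbein_score(admissible[name]))

    actions = {
        "drive": [(0.9, 3.0), (0.1, -2.0)],
        "fly":   [(0.7, 5.0), (0.01, -10.0)],  # vetoed: a valuation below -5
        "train": [(0.8, 4.0), (0.2, -1.0)],
    }
    print(choose(actions))  # -> 'train'
    ```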

  4. A computer simulation of the turbocharged turbo compounded diesel engine system: A description of the thermodynamic and heat transfer models

    Science.gov (United States)

    Assanis, D. N.; Ekchian, J. E.; Frank, R. M.; Heywood, J. B.

    1985-01-01

    A computer simulation of the turbocharged turbocompounded direct-injection diesel engine system was developed in order to study the performance characteristics of the total system as major design parameters and materials are varied. Quasi-steady flow models of the compressor, turbines, manifolds, intercooler, and ducting are coupled with a multicylinder reciprocator diesel model, where each cylinder undergoes the same thermodynamic cycle. The master cylinder model describes the reciprocator intake, compression, combustion and exhaust processes in sufficient detail to define the mass and energy transfers in each subsystem of the total engine system. Appropriate thermal loading models relate the heat flow through critical system components to material properties and design details. From this information, the simulation predicts the performance gains, and assesses the system design trade-offs which would result from the introduction of selected heat transfer reduction materials in key system components, over a range of operating conditions.

  5. Challenges in Computational Social Modeling and Simulation for National Security Decision Making

    Science.gov (United States)

    2011-06-01

    may literally be dealing with life-and-death challenges. Finally, no model or simulation is inherently a “decision support tool;” instead, tool... semiotics. In her paper, Resnyansky argues that the rapid rise of information and communications technologies has raised significant ethical... Dynamics in Human and Primate Societies, T. Koehler and G. Gumerman, Eds. New York: Oxford University Press, 2000. [20] P. Ormerod, The Death of

  6. In pursuit of an accurate spatial and temporal model of biomolecules at the atomistic level: a perspective on computer simulation

    International Nuclear Information System (INIS)

    Gray, Alan; Harlen, Oliver G.; Harris, Sarah A.; Khalid, Syma; Leung, Yuk Ming; Lonsdale, Richard; Mulholland, Adrian J.; Pearson, Arwen R.; Read, Daniel J.; Richardson, Robin A.

    2015-01-01

    The current computational techniques available for biomolecular simulation are described, and the successes and limitations of each with reference to the experimental biophysical methods that they complement are presented. Despite huge advances in the computational techniques available for simulating biomolecules at the quantum-mechanical, atomistic and coarse-grained levels, there is still a widespread perception amongst the experimental community that these calculations are highly specialist and are not generally applicable by researchers outside the theoretical community. In this article, the successes and limitations of biomolecular simulation and the further developments that are likely in the near future are discussed. A brief overview is also provided of the experimental biophysical methods that are commonly used to probe biomolecular structure and dynamics, and the accuracy of the information that can be obtained from each is compared with that from modelling. It is concluded that progress towards an accurate spatial and temporal model of biomacromolecules requires a combination of all of these biophysical techniques, both experimental and computational

  7. In pursuit of an accurate spatial and temporal model of biomolecules at the atomistic level: a perspective on computer simulation

    Energy Technology Data Exchange (ETDEWEB)

    Gray, Alan [The University of Edinburgh, Edinburgh EH9 3JZ, Scotland (United Kingdom); Harlen, Oliver G. [University of Leeds, Leeds LS2 9JT (United Kingdom); Harris, Sarah A., E-mail: s.a.harris@leeds.ac.uk [University of Leeds, Leeds LS2 9JT (United Kingdom); University of Leeds, Leeds LS2 9JT (United Kingdom); Khalid, Syma; Leung, Yuk Ming [University of Southampton, Southampton SO17 1BJ (United Kingdom); Lonsdale, Richard [Max-Planck-Institut für Kohlenforschung, Kaiser-Wilhelm-Platz 1, 45470 Mülheim an der Ruhr (Germany); Philipps-Universität Marburg, Hans-Meerwein Strasse, 35032 Marburg (Germany); Mulholland, Adrian J. [University of Bristol, Bristol BS8 1TS (United Kingdom); Pearson, Arwen R. [University of Leeds, Leeds LS2 9JT (United Kingdom); University of Hamburg, Hamburg (Germany); Read, Daniel J.; Richardson, Robin A. [University of Leeds, Leeds LS2 9JT (United Kingdom); The University of Edinburgh, Edinburgh EH9 3JZ, Scotland (United Kingdom)

    2015-01-01

    The current computational techniques available for biomolecular simulation are described, and the successes and limitations of each with reference to the experimental biophysical methods that they complement are presented. Despite huge advances in the computational techniques available for simulating biomolecules at the quantum-mechanical, atomistic and coarse-grained levels, there is still a widespread perception amongst the experimental community that these calculations are highly specialist and are not generally applicable by researchers outside the theoretical community. In this article, the successes and limitations of biomolecular simulation and the further developments that are likely in the near future are discussed. A brief overview is also provided of the experimental biophysical methods that are commonly used to probe biomolecular structure and dynamics, and the accuracy of the information that can be obtained from each is compared with that from modelling. It is concluded that progress towards an accurate spatial and temporal model of biomacromolecules requires a combination of all of these biophysical techniques, both experimental and computational.

  8. Micro-computer simulation software: A review

    Directory of Open Access Journals (Sweden)

    P.S. Kruger

    2003-12-01

Full Text Available Simulation modelling has proved to be one of the most powerful tools available to the Operations Research analyst. The development of micro-computer technology has reached a state of maturity where the micro-computer can provide the necessary computing power, and consequently various powerful and inexpensive simulation languages for micro-computers have become available. This paper attempts to provide an introduction to the general philosophy and characteristics of some of the available micro-computer simulation languages. The emphasis is on the characteristics of the specific micro-computer implementation rather than on a comparison of the modelling features of the various languages. Such comparisons may be found elsewhere.

  9. Computer Simulation Study of Human Locomotion with a Three-Dimensional Entire-Body Neuro-Musculo-Skeletal Model

    Science.gov (United States)

    Hase, Kazunori; Obuchi, Shuichi

The three-dimensional entire-body neuro-musculo-skeletal model generating normal walking motion was modified to synthesize pathological walking including asymmetrical compensatory motions. In addition to the neuronal parameters, musculo-skeletal parameters were employed as search parameters to represent affected musculo-skeletal systems. This model successfully generated pathological walking patterns, such as walking by a person with one lower extremity shorter than the other and walking by a person with an affected gluteus medius muscle. The simulated walking patterns were of the entire body, three-dimensional, continuous and asymmetrical, and demonstrated the characteristics of actual pathological walking. The walking model with an artificial foot also predicted not only the walking pattern adapted to the artificial foot but also the design parameters of the artificial foot adapted to the effective walking pattern simultaneously. Such simulation methods will establish a novel methodology that we call computational rehabilitation engineering.

  10. Computational Model for the Neutronic Simulation of Pebble Bed Reactor’s Core Using MCNPX

    Directory of Open Access Journals (Sweden)

    J. Rosales

    2014-01-01

Full Text Available Very high temperature reactor (VHTR) designs offer promising performance characteristics; they can provide sustainable energy, improved proliferation resistance, inherent safety, and high temperature heat supply. These designs also promise operation to high burnup and large margins to fuel failure, with excellent fission product retention via the TRISO fuel design. The pebble bed reactor (PBR) is a design of gas-cooled high temperature reactor and a candidate for Generation IV of Nuclear Energy Systems. This paper describes the features of a detailed geometric computational model for PBR whole-core analysis using the MCNPX code. The validation of the model was carried out using the HTR-10 benchmark. Results were compared with experimental data and with the calculations of other authors. In addition, a sensitivity analysis of several parameters that could have influenced the results and the accuracy of the model was performed.

  11. Patient flow within UK emergency departments: a systematic review of the use of computer simulation modelling methods

    OpenAIRE

    Mohiuddin, Syed; Busby, John; Savovi?, Jelena; Richards, Alison; Northstone, Kate; Hollingworth, William; Donovan, Jenny L; Vasilakis, Christos

    2017-01-01

    Objectives Overcrowding in the emergency department (ED) is common in the UK as in other countries worldwide. Computer simulation is one approach used for understanding the causes of ED overcrowding and assessing the likely impact of changes to the delivery of emergency care. However, little is known about the usefulness of computer simulation for analysis of ED patient flow. We undertook a systematic review to investigate the different computer simulation methods and their contribution for a...

  12. Motor Vehicle Emission Modeling and Software Simulation Computing for Roundabout in Urban City

    Directory of Open Access Journals (Sweden)

    Haiwei Wang

    2013-01-01

Full Text Available In urban road traffic systems, roundabouts are considered core traffic bottlenecks and are also a major source of vehicle emissions affecting the city environment. In this paper, we propose a transport control and management method for relieving traffic jams and reducing emissions at roundabouts. A motor vehicle testing platform and a VSP-based emission model were established first. Using the topology chart of the roundabout and micro-simulation software, we calculated the instantaneous emission rates of different vehicles and the total vehicle emissions. We argue that an Integration-Model, combining traffic simulation and vehicle emission modelling, can be used to calculate the instantaneous emission rates of different vehicles and the total vehicle emissions at the roundabout. Comparing the exhaust emission results with and without signal control in this area at rush hour leads to the conclusion that optimized signal control can effectively reduce regional vehicle emissions. The proposed approach was subjected to a simulation and experiment involving an environmental assessment at Satellite Square, a roundabout in a medium-sized city in China. It was verified that signal control designed with knowledge engineering and the Integration-Model is a practical way to relieve traffic jams and reduce environmental pollution.
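
    The abstract's emission side rests on vehicle specific power (VSP). As a hedged sketch (the paper's exact model is not reproduced here), the snippet below uses the widely cited Jimenez-Palacios light-duty VSP approximation together with a purely hypothetical per-bin emission-rate lookup:

```python
# Hedged sketch of a VSP-based instantaneous emission estimate, using the
# Jimenez-Palacios light-duty approximation
#   VSP [kW/t] = v*(1.1*a + 9.81*grade + 0.132) + 0.000302*v**3  (v in m/s)
# and placeholder emission rates; real models calibrate rates per VSP bin.

def vsp(v: float, a: float, grade: float = 0.0) -> float:
    """Vehicle specific power in kW/tonne."""
    return v * (1.1 * a + 9.81 * grade + 0.132) + 0.000302 * v ** 3

RATE_G_PER_S = [0.8, 1.6, 2.9, 4.5]  # hypothetical rates for 5 kW/t bins

def emission_rate(v: float, a: float) -> float:
    bin_idx = min(int(max(vsp(v, a), 0.0) // 5), len(RATE_G_PER_S) - 1)
    return RATE_G_PER_S[bin_idx]

# Second-by-second (1 Hz) simulated trajectory: (speed m/s, accel m/s^2).
trajectory = [(10.0, 0.2), (11.0, 0.5), (12.5, -0.3), (11.8, 0.0)]
total = sum(emission_rate(v, a) for v, a in trajectory)
print(f"total over {len(trajectory)} s: {total:.1f} g")
```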

  13. Computer simulated modeling of healthy and diseased right ventricular and pulmonary circulation.

    Science.gov (United States)

    Chou, Jody; Rinehart, Joseph B

    2018-01-12

    We have previously developed a simulated cardiovascular physiology model for in-silico testing and validation of novel closed-loop controllers. To date, a detailed model of the right heart and pulmonary circulation was not needed, as previous controllers were not intended for use in patients with cardiac or pulmonary pathology. With new development of controllers for vasopressors, and looking forward, for combined vasopressor-fluid controllers, modeling of right-sided and pulmonary pathology is now relevant to further in-silico validation, so we aimed to expand our existing simulation platform to include these elements. Our hypothesis was that the completed platform could be tuned and stabilized such that the distributions of a randomized sample of simulated patients' baseline characteristics would be similar to reported population values. Our secondary outcomes were to further test the system in representing acute right heart failure and pulmonary artery hypertension. After development and tuning of the right-sided circulation, the model was validated against clinical data from multiple previously published articles. The model was considered 'tuned' when 100% of generated randomized patients converged to stability (steady, physiologically-plausible compartmental volumes, flows, and pressures) and 'valid' when the means for the model data in each health condition were contained within the standard deviations for the published data for the condition. A fully described right heart and pulmonary circulation model including non-linear pressure/volume relationships and pressure dependent flows was created over a 6-month span. The model was successfully tuned such that 100% of simulated patients converged into a steady state within 30 s. Simulation results in the healthy state for central venous volume (3350 ± 132 ml) pulmonary blood volume (405 ± 39 ml), pulmonary artery pressures (systolic 20.8 ± 4.1 mmHg and diastolic 9.4 ± 1.8 mmHg), left

  14. Validation of simulation models

    DEFF Research Database (Denmark)

    Rehman, Muniza; Pedersen, Stig Andur

    2012-01-01

In the philosophy of science, interest in computational models and simulations has increased greatly during the past decades. Different positions regarding the validity of models have emerged, but these views have not succeeded in capturing the diversity of validation methods. The wide variety

  15. SU-F-J-178: A Computer Simulation Model Observer for Task-Based Image Quality Assessment in Radiation Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Dolly, S; Mutic, S; Anastasio, M; Li, H [Washington University School of Medicine, Saint Louis, MO (United States); Yu, L [Mayo Clinic, Rochester, MN (United States)

    2016-06-15

    Purpose: Traditionally, image quality in radiation therapy is assessed subjectively or by utilizing physically-based metrics. Some model observers exist for task-based medical image quality assessment, but almost exclusively for diagnostic imaging tasks. As opposed to disease diagnosis, the task for image observers in radiation therapy is to utilize the available images to design and deliver a radiation dose which maximizes patient disease control while minimizing normal tissue damage. The purpose of this study was to design and implement a new computer simulation model observer to enable task-based image quality assessment in radiation therapy. Methods: A modular computer simulation framework was developed to resemble the radiotherapy observer by simulating an end-to-end radiation therapy treatment. Given images and the ground-truth organ boundaries from a numerical phantom as inputs, the framework simulates an external beam radiation therapy treatment and quantifies patient treatment outcomes using the previously defined therapeutic operating characteristic (TOC) curve. As a preliminary demonstration, TOC curves were calculated for various CT acquisition and reconstruction parameters, with the goal of assessing and optimizing simulation CT image quality for radiation therapy. Sources of randomness and bias within the system were analyzed. Results: The relationship between CT imaging dose and patient treatment outcome was objectively quantified in terms of a singular value, the area under the TOC (AUTOC) curve. The AUTOC decreases more rapidly for low-dose imaging protocols. AUTOC variation introduced by the dose optimization algorithm was approximately 0.02%, at the 95% confidence interval. Conclusion: A model observer has been developed and implemented to assess image quality based on radiation therapy treatment efficacy. It enables objective determination of appropriate imaging parameter values (e.g. imaging dose). Framework flexibility allows for incorporation
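
    The summary statistic named in the abstract is the area under the TOC curve (AUTOC). As a toy illustration of that reduction, the sketch below integrates a hypothetical TOC curve with the trapezoidal rule; the axis choice and all values are assumptions, not data from the study:

```python
import numpy as np

# Illustrative AUTOC computation: x = normal-tissue complication
# probability (NTCP), y = tumour control probability (TCP), with the
# curve traced out by sweeping the prescribed dose (hypothetical values).
ntcp = np.array([0.0, 0.05, 0.12, 0.25, 0.45, 1.0])
tcp  = np.array([0.0, 0.40, 0.65, 0.82, 0.93, 1.0])

# Trapezoidal rule, written out to avoid NumPy version differences.
autoc = float(np.sum(0.5 * (tcp[1:] + tcp[:-1]) * np.diff(ntcp)))
print(f"AUTOC = {autoc:.3f}")  # larger area = better achievable trade-off
```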

  16. Downscaling seasonal to centennial simulations on distributed computing infrastructures using WRF model. The WRF4G project

    Science.gov (United States)

    Cofino, A. S.; Fernández Quiruelas, V.; Blanco Real, J. C.; García Díez, M.; Fernández, J.

    2013-12-01

Nowadays Grid computing is a powerful computational tool that is ready to be used by the scientific community in different areas (such as biomedicine, astrophysics, climate, etc.). However, the use of these distributed computing infrastructures (DCIs) is not yet common practice in climate research, and only a few teams and applications in this area take advantage of them. The WRF4G project objective is thus to popularize the use of this technology in the atmospheric sciences area. In order to achieve this objective, one of the most widely used applications has been taken (WRF, a limited-area model and successor of the MM5 model), which has a user community of more than 8000 researchers worldwide. This community develops its research activity in different areas and could benefit from the advantages of Grid resources (case-study simulations, regional hindcast/forecast, sensitivity studies, etc.). The WRF model is used by many groups in the climate research community to carry out downscaling simulations, so this community will also benefit. However, Grid infrastructures have some drawbacks for the execution of applications that make intensive use of CPU and memory for long periods of time, which makes it necessary to develop a specific framework (middleware). This middleware encapsulates the application and provides appropriate services for the monitoring and management of the simulations and the data. Thus, another objective of the WRF4G project is the development of a generic adaptation of WRF to DCIs, which should simplify access to DCIs for researchers and free them from the technical and computational aspects of using these infrastructures. Finally, in order to demonstrate the ability of WRF4G to solve actual scientific challenges of interest and relevance to climate science (implying a high computational cost), we show results from different kinds of downscaling experiments, such as ERA-Interim re-analysis and CMIP5 models

  17. Fel simulations using distributed computing

    NARCIS (Netherlands)

    Einstein, J.; Biedron, S.G.; Freund, H.P.; Milton, S.V.; Van Der Slot, P. J M; Bernabeu, G.

    2016-01-01

While simulation tools are available and have been used regularly for simulating light sources, including free-electron lasers, the increasing availability and lower cost of accelerated computing open up new opportunities. This paper highlights a method for accelerating and parallelizing code

  18. Fluid simulation for computer graphics

    CERN Document Server

    Bridson, Robert

    2008-01-01

    Animating fluids like water, smoke, and fire using physics-based simulation is increasingly important in visual effects, in particular in movies, like The Day After Tomorrow, and in computer games. This book provides a practical introduction to fluid simulation for graphics. The focus is on animating fully three-dimensional incompressible flow, from understanding the math and the algorithms to the actual implementation.

  19. Computing elastic‐rebound‐motivated earthquake probabilities in unsegmented fault models: a new methodology supported by physics‐based simulators

    Science.gov (United States)

    Field, Edward H.

    2015-01-01

    A methodology is presented for computing elastic‐rebound‐based probabilities in an unsegmented fault or fault system, which involves computing along‐fault averages of renewal‐model parameters. The approach is less biased and more self‐consistent than a logical extension of that applied most recently for multisegment ruptures in California. It also enables the application of magnitude‐dependent aperiodicity values, which the previous approach does not. Monte Carlo simulations are used to analyze long‐term system behavior, which is generally found to be consistent with that of physics‐based earthquake simulators. Results cast doubt that recurrence‐interval distributions at points on faults look anything like traditionally applied renewal models, a fact that should be considered when interpreting paleoseismic data. We avoid such assumptions by changing the "probability of what" question (from offset at a point to the occurrence of a rupture, assuming it is the next event to occur). The new methodology is simple, although not perfect in terms of recovering long‐term rates in Monte Carlo simulations. It represents a reasonable, improved way to represent first‐order elastic‐rebound predictability, assuming it is there in the first place, and for a system that clearly exhibits other unmodeled complexities, such as aftershock triggering.
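
    Where the abstract speaks of computing elastic-rebound (renewal) probabilities with aperiodicity parameters, the hedged sketch below shows the basic conditional-probability calculation for a single fault using a Brownian Passage Time (inverse-Gaussian) renewal model, a common choice in such forecasts; the recurrence interval, aperiodicity and forecast window are invented for illustration:

```python
from scipy import stats

# BPT renewal model: P(event in [t, t+dt] | no event by t)
#   = (F(t+dt) - F(t)) / (1 - F(t)), with F the BPT CDF.
mean_ri = 200.0   # mean recurrence interval, years (hypothetical)
alpha   = 0.5     # aperiodicity (coefficient of variation, hypothetical)

# scipy's invgauss(mu, scale) has mean mu*scale and CV^2 = mu, so
# mu = alpha**2 and scale = mean_ri/alpha**2 reproduce the BPT model.
bpt = stats.invgauss(mu=alpha**2, scale=mean_ri / alpha**2)

t_elapsed, horizon = 150.0, 30.0   # years open interval, forecast window
p_cond = (bpt.cdf(t_elapsed + horizon) - bpt.cdf(t_elapsed)) / bpt.sf(t_elapsed)
print(f"P(rupture in next {horizon:.0f} yr) = {p_cond:.3f}")
```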

  20. Computational fluid dynamic simulations of image-based stented coronary bifurcation models

    Science.gov (United States)

    Chiastra, Claudio; Morlacchi, Stefano; Gallo, Diego; Morbiducci, Umberto; Cárdenes, Rubén; Larrabide, Ignacio; Migliavacca, Francesco

    2013-01-01

One of the relevant phenomena associated with in-stent restenosis in coronary arteries is altered haemodynamics in the stented region. Computational fluid dynamics (CFD) offers the possibility to investigate the haemodynamics at a level of detail not always accessible with experimental techniques. CFD can quantify and correlate the local haemodynamic structures which might lead to in-stent restenosis. The aim of this work is to study the fluid dynamics of realistic stented coronary artery models which replicate the complete clinical procedure of stent implantation. Two cases of pathologic left anterior descending coronary arteries with their bifurcations are reconstructed from computed tomography angiography and conventional coronary angiography images. Results of wall shear stress and relative residence time show that the wall regions most prone to the risk of restenosis are located next to the stent struts, the bifurcations and the stent overlapping zone in both investigated cases. Considering a bulk flow analysis, helical flow structures are generated by the curvature of the zone upstream of the stent and by the bifurcation regions. Helical recirculating microstructures are also visible downstream of the stent struts. This study demonstrates the feasibility of virtually investigating the haemodynamics of patient-specific coronary bifurcation geometries. PMID:23676893
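
    The key quantity here is wall shear stress (WSS). As a sanity check on the order of magnitude a CFD solver should return far from the struts, one can use the analytic Poiseuille value for a straight tube; the viscosity, flow rate and radius below are typical coronary figures, not values from the paper:

```python
import math

# For steady Poiseuille flow in a straight tube, WSS = 4*mu*Q/(pi*R^3).
mu = 3.5e-3      # blood dynamic viscosity, Pa*s (typical assumption)
Q  = 1.0e-6      # flow rate, m^3/s (= 60 ml/min)
R  = 1.5e-3      # vessel radius, m

wss = 4 * mu * Q / (math.pi * R ** 3)
print(f"Poiseuille WSS = {wss:.2f} Pa")   # ~1.3 Pa, physiological range
```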

  1. 3D phase contrast MRI in models of human airways: Validation of computational fluid dynamics simulations of steady inspiratory flow.

    Science.gov (United States)

    Collier, Guilhem J; Kim, Minsuok; Chung, Yongmann; Wild, Jim M

    2018-04-06

Knowledge of airflow patterns in the large airways is of interest in obstructive airways disease and in the development of inhaled therapies. Computational fluid dynamics (CFD) simulations are used to study airflow in realistic airway models but usually need experimental validation. To develop MRI-based methods to study airway flow in realistic 3D-printed models. Case control. Two 3D-printed lung models. 1.5-3T, flow MRI. Two human airway models, respectively including and excluding the oral cavity and upper airways, derived from MR and CT imaging, were 3D-printed. 3D flow MRI was performed at different flow conditions corresponding to slow and steady airflow inhalation rates. Water was used as the working fluid to mimic airflow. Dynamic acquisition of 1D velocity profiles was also performed at different locations in the trachea to observe variability during non-steady conditions. Linear regression analysis was used to compare both flow velocity fields and local flow rates from CFD simulations with experimental measurements from flow MRI. Good agreement was obtained between the 3D velocity maps measured with flow MRI and those predicted by CFD simulations, with linear regression R-squared values ranging from 0.39 to 0.94 when performing a pixel-by-pixel comparison of each velocity component. The flow distribution inside the lung models was also similar, with average slope and R-squared values of 0.96 and 0.99, respectively, when comparing local flow rates assessed at different branching locations. In the model including the upper airways, a turbulent laryngeal jet flow was observed with both methods and remarkably affected the velocity profiles in the trachea. We propose flow MRI, using water as a surrogate fluid for air, as a validation tool for CFD simulations of airflow in geometrically realistic models of the human airways. Level of Evidence: 3. Technical Efficacy: Stage 2. J. Magn. Reson. Imaging 2018. © 2018 International Society for Magnetic Resonance in Medicine.
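
    The validation statistic reported (pixel-by-pixel linear regression of CFD against flow-MRI velocities, summarised by slope and R-squared) is easy to reproduce on synthetic data; the arrays below are randomly generated stand-ins, not the study's measurements:

```python
import numpy as np

# Pixel-by-pixel regression of predicted vs measured velocity components.
rng = np.random.default_rng(0)
v_cfd = rng.normal(0.0, 0.3, size=5000)             # one velocity component
v_mri = 0.95 * v_cfd + rng.normal(0.0, 0.05, 5000)  # noisy "measurement"

slope, intercept = np.polyfit(v_cfd, v_mri, 1)
r = np.corrcoef(v_cfd, v_mri)[0, 1]
print(f"slope = {slope:.2f}, R^2 = {r**2:.2f}")
```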

  2. Computational Modeling Using OpenSim to Simulate a Squat Exercise Motion

    Science.gov (United States)

    Gallo, C. A.; Thompson, W. K.; Lewandowski, B. E.; Humphreys, B. T.; Funk, J. H.; Funk, N. H.; Weaver, A. S.; Perusek, G. P.; Sheehan, C. C.; Mulugeta, L.

    2015-01-01

    Long duration space travel to destinations such as Mars or an asteroid will expose astronauts to extended periods of reduced gravity. Astronauts will use an exercise regime for the duration of the space flight to minimize the loss of bone density, muscle mass and aerobic capacity that occurs during exposure to a reduced gravity environment. Since the area available in the spacecraft for an exercise device is limited and gravity is not present to aid loading, compact resistance exercise device prototypes are being developed. Since it is difficult to rigorously test these proposed devices in space flight, computational modeling provides an estimation of the muscle forces, joint torques and joint loads during exercise to gain insight on the efficacy to protect the musculoskeletal health of astronauts.

  3. Thermodynamic properties of diamond and wurtzite model fluids from computer simulation and thermodynamic perturbation theory

    Science.gov (United States)

    Zhou, S.; Solana, J. R.

    2018-03-01

Monte Carlo NVT simulations have been performed to obtain the thermodynamic and structural properties, and the perturbation coefficients up to third order in the inverse temperature expansion of the Helmholtz free energy, of fluids with potential models proposed in the literature for diamond and wurtzite lattices. These data are used to analyze the performance of a coupling parameter series expansion (CPSE). The main findings are summarized as follows. (1) The CPSE provides accurate predictions of the first three coefficients in the inverse temperature expansion of the Helmholtz free energy for the potential models considered, and the thermodynamic properties of these fluids are predicted more accurately when the CPSE is truncated at second or third order. (2) The Barker-Henderson (BH) recipe is appropriate for determining the effective hard-sphere diameter for strongly repulsive potential cores, but its performance worsens with increasing softness of the potential core. (3) For some thermodynamic properties the first-order CPSE works better for the diamond potential, whose tail is dominated by repulsive interactions, than for the wurtzite potential, whose tail is dominated by attractive interactions. However, the first-order CPSE provides unsatisfactory results for the excess internal energy and the constant-volume excess heat capacity for the two potential models.
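
    Finding (2) refers to the Barker-Henderson effective hard-sphere diameter, d = ∫₀^σ [1 − exp(−u(r)/kT)] dr. The sketch below evaluates it for a Lennard-Jones core as a stand-in, since the paper's diamond and wurtzite potentials are not reproduced here; the state point is arbitrary:

```python
import numpy as np
from scipy.integrate import quad

eps, sigma, kT = 1.0, 1.0, 1.5   # reduced units (hypothetical state point)

def u_lj(r):
    """Lennard-Jones potential used here purely as an illustrative core."""
    return 4 * eps * ((sigma / r) ** 12 - (sigma / r) ** 6)

# Barker-Henderson integrand: 1 - exp(-beta*u(r)) on (0, sigma).
integrand = lambda r: 1.0 - np.exp(-u_lj(r) / kT)
d_bh, _ = quad(integrand, 1e-6, sigma)
print(f"Barker-Henderson diameter: {d_bh:.4f} sigma")
```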

  4. An evaluation of screening policies for cervical cancer in England and Wales using a computer simulation model.

    OpenAIRE

    Parkin, D M; Moss, S M

    1986-01-01

    Several screening policies have been recommended for implementation in England and Wales in the last 20 years, although no evidence as to their relative effectiveness or efficiency has been provided. Using a computer simulation model, the outcomes expected from those policies had they been implemented over a 30 year period (1961-90) have been examined. The original policies based on five-yearly testing of women aged over 35 appear to be the most cost-effective, and extension of screening to y...

  5. Modeling and computational simulation and the potential of virtual and augmented reality associated to the teaching of nanoscience and nanotechnology

    Science.gov (United States)

    Ribeiro, Allan; Santos, Helen

With the advent of the new information and communication technologies (ICTs), communicative interaction is changing the way people act and relate to one another, and at the same time changing the nature of work activities related to education. Among the possibilities provided by the advancement of computational resources, virtual reality (VR) and augmented reality (AR) stand out as new forms of information visualization in computer applications. While VR allows user interaction with a virtual environment generated entirely by computer, in AR virtual images are inserted into the real environment; both create new opportunities to support teaching and learning in formal and informal contexts. Such technologies are able to express representations of reality or of the imagination, such as systems at the nanoscale and of low dimensionality, making it imperative to explore, in the most diverse areas of knowledge, the potential offered by ICTs and emerging technologies. In this sense, this work presents virtual and augmented reality computer applications, developed with the use of computational modeling and simulation, for topics related to nanoscience and nanotechnology, articulated with innovative pedagogical practices.

  6. Development of computational models for the simulation of isodose curves on dosimetry films generated by iodine-125 brachytherapy seeds

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Adriano M.; Meira-Belo, Luiz C.; Reis, Sergio C.; Grynberg, Suely E., E-mail: amsantos@cdtn.b [Center for Development of Nuclear Technology (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)

    2011-07-01

Interstitial brachytherapy is a modality of radiotherapy in which radioactive sources are placed directly in the region to be treated or close to it. The seeds used in the treatment of prostate cancer are generally cylindrical radioactive sources, consisting of a ceramic or metal matrix, which acts as the carrier of the radionuclide and as the X-ray marker, encapsulated in a sealed titanium tube. This study aimed to develop a computational model that reproduces the film-seed geometry, in order to obtain the spatial regions of the isodose curves produced by the seed when it is placed on the film surface. The seed modeled in this work was the OncoSeed 6711, a sealed source of iodine-125, whose isodose curves were obtained experimentally in previous work with the use of dosimetric films. For the film modeling, the compositions and densities of two types of dosimetric film were used: the Agfa Personal Monitoring photographic film 2/10, manufactured by Agfa-Gevaert; and the model EBT radiochromic film, by International Specialty Products. The film-seed models were coupled to the Monte Carlo code MCNP5. The results obtained from the simulations were in good agreement with the experimental results of the previous work. This indicates that the computational model can be used in future studies of other seed models. (author)
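
    The experimental output being reproduced is a set of isodose curves on the film plane. As a rough stand-in for the MCNP5 transport calculation, the sketch below builds a dose map from a bare point kernel (inverse-square falloff with exponential attenuation) and reads off isodose radii; the attenuation coefficient and grid are assumptions:

```python
import numpy as np

# Toy point-kernel dose map D(r) ~ exp(-mu*r)/r^2 on the film plane.
mu = 0.1  # effective attenuation coefficient, 1/mm (hypothetical)
x = y = np.linspace(-20, 20, 401)           # film plane coordinates, mm
X, Y = np.meshgrid(x, y)
r = np.maximum(np.hypot(X, Y), 0.25)        # clip the singularity at the seed
dose = np.exp(-mu * r) / r**2
dose /= dose.max()                          # normalise to 100%

for level in (0.5, 0.1, 0.01):              # 50%, 10%, 1% isodose levels
    radius = r[dose >= level].max()
    print(f"{level:>5.0%} isodose radius ≈ {radius:.1f} mm")
```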

  7. Computer Simulations, Disclosure and Duty of Care

    Directory of Open Access Journals (Sweden)

    John Barlow

    2006-05-01

Full Text Available Computer simulations provide cost-effective methods for manipulating and modeling 'reality'. However, they are not real. They are imitations of a system or event, real or fabricated, and as such mimic, duplicate or represent that system or event. The degree to which a computer simulation aligns with and reproduces the 'reality' of the system or event it attempts to mimic or duplicate depends upon many factors, including the efficiency of the simulation algorithm, the processing power of the computer hardware used to run the simulation model, and the expertise, assumptions and prejudices of those concerned with designing, implementing and interpreting the simulation output. Computer simulations in particular are increasingly replacing physical experimentation in many disciplines and, as a consequence, are used to underpin quite significant decision-making which may impact on 'innocent' third parties. In this context, this paper examines two interrelated issues. Firstly, how much and what kind of information should a simulation builder be required to disclose to potential users of the simulation? Secondly, what are the implications for a decision-maker who acts on the basis of their interpretation of a simulation output without any reference to its veracity, which may in turn compromise the safety of other parties?

  8. Exploring Shifts in Middle School Learners' Modeling Activity While Generating Drawings, Animations, and Computational Simulations of Molecular Diffusion

    Science.gov (United States)

    Wilkerson-Jerde, Michelle H.; Gravel, Brian E.; Macrander, Christopher A.

    2015-04-01

    Modeling and using technology are two practices of particular interest to K-12 science educators. These practices are inextricably linked among professionals, who engage in modeling activity with and across a variety of representational technologies. In this paper, we explore the practices of five sixth-grade girls as they generated models of smell diffusion using drawing, stop-motion animation, and computational simulation during a multi-day workshop. We analyze video, student discourse, and artifacts to address the questions: In what ways did learners' modeling practices, reasoning about mechanism, and ideas about smell shift as they worked across this variety of representational technologies? And, what supports enabled them to persist and progress in the modeling activity? We found that the girls engaged in two distinct modeling cycles that reflected persistence and deepening engagement in the task. In the first, messing about, they focused on describing and representing many ideas related to the spread of smell at once. In the second, digging in, they focused on testing and revising specific mechanisms that underlie smell diffusion. Upon deeper analysis, we found these cycles were linked to the girls' invention of "oogtom," a representational object that encapsulated many ideas from the first cycle and allowed the girls to restart modeling with the mechanistic focus required to construct simulations. We analyze the role of activity design, facilitation, and technological infrastructure in this pattern of engagement over the course of the workshop and discuss implications for future research, curriculum design, and classroom practice.

  9. Advanced Physical Models and Numerical Algorithms to Enable High-Fidelity Aerothermodynamic Simulations of Planetary Entry Vehicles on Emerging Distributed Heterogeneous Computing Architectures

    Data.gov (United States)

    National Aeronautics and Space Administration — The design and qualification of entry systems for planetary exploration largely rely on computational simulations. However, state-of-the-art modeling capabilities...

  10. Documentation and user guides for SPBLOB: a computer simulation model of the joint population dynamics for loblolly pine and the southern pine beetle

    Science.gov (United States)

    John Bishir; James Roberds; Brian Strom; Xiaohai Wan

    2009-01-01

SPBLOB is a computer simulation model of the interaction between loblolly pine (Pinus taeda L.), the economically most important forest crop in the United States, and the southern pine beetle (SPB: Dendroctonus frontalis Zimm.), the major insect pest for this species. The model simulates loblolly pine stands from time of planting

  11. [Animal experimentation, computer simulation and surgical research].

    Science.gov (United States)

    Carpentier, Alain

    2009-11-01

We live in a digital world. In medicine, computers are providing new tools for data collection, imaging and treatment. During research and development of complex technologies and devices such as artificial hearts, computer simulation can provide more reliable information than experimentation on large animals. In these specific settings, animal experimentation should serve more to validate computer models of complex devices than to demonstrate their reliability.

  12. Computer simulation of gear tooth manufacturing processes

    Science.gov (United States)

    Mavriplis, Dimitri; Huston, Ronald L.

    1990-01-01

    The use of computer graphics to simulate gear tooth manufacturing procedures is discussed. An analytical basis for the simulation is established for spur gears. The simulation itself, however, is developed not only for spur gears, but for straight bevel gears as well. The applications of the developed procedure extend from the development of finite element models of heretofore intractable geometrical forms, to exploring the fabrication of nonstandard tooth forms.

  13. Ecological impacts of environmental toxicants and radiation on the microbial ecosystem: a model simulation of computational microbiology

    International Nuclear Information System (INIS)

    Doi, Masahiro; Sakashita, Tetsuya; Ishii, Nobuyoshi; Fuma, Shoichi; Takeda, Hiroshi; Miyamoto, Kiriko; Yanagisawa, K.; Nakamura, Yuji; Kawabata, Zenichiro

    2000-01-01

This study explores a micro-organic closed ecosystem by computer simulation to illustrate symbiosis among populations in a microcosm that consists of the heterotrophic protozoan Tetrahymena thermophila B as a consumer, the autotrophic alga Euglena gracilis Z as a primary producer, and the saprotrophic bacterium Escherichia coli DH5 as a decomposer. The simulation program is written as a procedure of StarLogoT 1.5.1, which is developed by the Center for Connected Learning and Computer-Based Modeling, Tufts University. The virtual microcosm is structured and operated by the following rules: 1) the environment is defined as a lattice model, which consists of 10,201 square patches, 300 microns wide, 300 microns long and 100 microns high; 2) each patch has its own attributes: nutrient, detritus and absolute coordinates; 3) the components of the species Tetrahymena, Euglena and E. coli are defined as sub-systems, and each sub-system has its own attributes, such as location, heading direction, cell age, structured biomass, reserve energy and demographic parameters (assimilation rate, breeding threshold, growth rate, etc.); 4) each individual of the species Tetrahymena, Euglena and E. coli lives by foraging (Tetrahymena eats E. coli), excreting its metabolic products to the environment (as a substrate for E. coli), breeding, and dying according to its vital condition; 5) Euglena utilizes sunlight energy by photosynthesis and produces organic compounds, while E. coli breaks down the organic compounds of dead protoplasm or metabolic wastes (detritus) and releases inorganic substances to complete the downstream part of the food cycle. The virtual ecosystem in this study is named SIM-COSM, a parallel computing model for a self-sustaining system of complexity. It was found that SIM-COSM is valuable for illustrating symbiosis among populations in the microcosm, where feedback mechanisms act in response to disturbances and to interactions among species and the environment. In the simulation, microbes increased demographic and environmental
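
    To make the rule list above concrete, here is a deliberately tiny population-level sketch of the SIM-COSM food cycle (producer, decomposer, consumer, detritus). All rates, thresholds and initial counts are invented placeholders; the actual model is an agent-based StarLogoT procedure with spatial patches, not this aggregate toy:

```python
import random

random.seed(1)
state = {"euglena": 50, "ecoli": 200, "tetrahymena": 20, "detritus": 100.0}

def step(s):
    # Producer: photosynthesis adds biomass; deaths feed the detritus pool.
    born_e = sum(random.random() < 0.05 for _ in range(s["euglena"]))
    died_e = sum(random.random() < 0.02 for _ in range(s["euglena"]))
    # Decomposer: grows by consuming detritus.
    growth = min(int(s["detritus"] // 2), s["ecoli"])
    s["detritus"] -= 0.5 * growth
    # Consumer: each Tetrahymena grazes up to three E. coli per step.
    eaten = min(s["ecoli"], 3 * s["tetrahymena"])
    s.update(
        euglena=s["euglena"] + born_e - died_e,
        ecoli=s["ecoli"] + growth - eaten,
        tetrahymena=s["tetrahymena"] + eaten // 20 - 1,
        detritus=s["detritus"] + 1.0 * died_e,
    )

for t in range(5):
    step(state)
    print(t, state)
```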

  14. COMPUTATIONAL MODELING AND SIMULATION IN BIOLOGY TEACHING: A MINIMALLY EXPLORED FIELD OF STUDY WITH A LOT OF POTENTIAL

    Directory of Open Access Journals (Sweden)

    Sonia López

    2016-09-01

Full Text Available This study is part of a research project that aims to characterize the epistemological, psychological and didactic presuppositions of science teachers (Biology, Physics, Chemistry) who implement Computational Modeling and Simulation (CMS) activities as part of their teaching practice. We present here a synthesis of a literature review on the subject, showing how in the last two decades this form of computer usage for science teaching has boomed in disciplines such as Physics and Chemistry, but to a lesser degree in Biology. Additionally, in the works that dwell on the use of CMS in Biology, we identified a lack of theoretical bases supporting their epistemological, psychological and/or didactic postures. This raises significant considerations for the fields of research and teacher education in Science Education.

  15. Modelling and Simulation: An Overview

    OpenAIRE

    McAleer, Michael; Chan, Felix; Oxley, Les

    2013-01-01

    This discussion paper resulted in a publication in 'Selected Papers of the MSSANZ 19th Biennial Conference on Modelling and Simulation Mathematics and Computers in Simulation', 2013, pp. viii. The papers in this special issue of Mathematics and Computers in Simulation cover the following topics: improving judgmental adjustment of model-based forecasts, whether forecast updates are progressive, on a constrained mixture vector autoregressive model, whether all estimators are born equal: the emp...

  16. Simulating chemistry using quantum computers.

    Science.gov (United States)

    Kassal, Ivan; Whitfield, James D; Perdomo-Ortiz, Alejandro; Yung, Man-Hong; Aspuru-Guzik, Alán

    2011-01-01

    The difficulty of simulating quantum systems, well known to quantum chemists, prompted the idea of quantum computation. One can avoid the steep scaling associated with the exact simulation of increasingly large quantum systems on conventional computers, by mapping the quantum system to another, more controllable one. In this review, we discuss to what extent the ideas in quantum computation, now a well-established field, have been applied to chemical problems. We describe algorithms that achieve significant advantages for the electronic-structure problem, the simulation of chemical dynamics, protein folding, and other tasks. Although theory is still ahead of experiment, we outline recent advances that have led to the first chemical calculations on small quantum information processors.
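
    The core idea mentioned here (mapping a quantum system onto a controllable one and approximating its time evolution) can be demonstrated classically on a single qubit. The sketch below compares exact evolution under H = X + Z with a first-order Trotter product; it is a pedagogical toy, not one of the quantum-chemistry algorithms the review describes:

```python
import numpy as np
from scipy.linalg import expm

# First-order Trotterization: exp(-iHt) ≈ [exp(-iXt/n) exp(-iZt/n)]^n.
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H, t, n = X + Z, 1.0, 100

exact = expm(-1j * H * t)
step = expm(-1j * X * t / n) @ expm(-1j * Z * t / n)
trotter = np.linalg.matrix_power(step, n)

err = np.linalg.norm(exact - trotter, 2)
print(f"Trotter error with n={n}: {err:.2e}")   # shrinks as O(t^2/n)
```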

  17. Simulation of Blast Loading on an Ultrastructurally-based Computational Model of the Ocular Lens

    Science.gov (United States)

    2016-12-01

[Abstract unavailable; only fragments of the thesis front matter were retrieved. The work used the simulation packages Abaqus (simulia.com) and Tahoe (tahoe.sourceforge.net); listed figures include an overlay of the Abaqus simulation with an image of the compressed lens, and a graph comparing data produced by the Tahoe simulation with fitted-parameter theoretical data from the Abaqus simulation.]

  18. Computational modeling of concrete flow

    DEFF Research Database (Denmark)

    Roussel, Nicolas; Geiker, Mette Rica; Dufour, Frederic

    2007-01-01

    This paper provides a general overview of the present status regarding computational modeling of the flow of fresh concrete. The computational modeling techniques that can be found in the literature may be divided into three main families: single fluid simulations, numerical modeling of discrete...

  19. Atomistic computer simulations a practical guide

    CERN Document Server

    Brazdova, Veronika

    2013-01-01

Many books explain the theory of atomistic computer simulations; this book teaches you how to run them. This introductory "how to" title enables readers to understand, plan, run, and analyze their own independent atomistic simulations, and to decide which method to use and which questions to ask in their research project. It is written in a clear and precise language, focusing on a thorough understanding of the concepts behind the equations and how these are used in the simulations. As a result, readers will learn how to design the computational model and which parameters o

  20. Computationally efficient models for simulation of non-ideal DC–DC ...

    Indian Academy of Sciences (India)

This paper discusses dynamic modeling of non-isolated DC–DC converters (buck, boost and buck–boost) under continuous and discontinuous modes of operation. Three types of models are presented for each converter, namely, switching model, average model and harmonic model. These models include significant ...

  1. Integrating computer-aided modeling and micro-simulation in multi-criteria evaluation of service infrastructure assignment approaches

    Directory of Open Access Journals (Sweden)

    Alfonso Duran

    2013-07-01

Full Text Available Purpose: This paper proposes an integrated computer-supported multi-stage approach to the flexible design and multi-criteria evaluation of service infrastructure assignment processes/algorithms. Design/methodology/approach: It involves particularizing a metamodel encompassing the main generic components and relationships into process models and process instances, by incorporating structural data from the real-life system. Existing data on the target user population are fed into a micro-modeling system to generate a matching population of individual "virtual" users, each with its own set of trait values. The micro-simulation of their interaction with the assignment process of both the incumbent and the competitors generates a rich multi-dimensional output, encompassing both "revealed" and non-observable data. This enables a comprehensive multi-criteria evaluation of the foreseeable performance of the designed process/algorithm, and therefore its iterative improvement. Findings: The research project developed a set of methodologies and associated supporting tools encompassing the modeling, micro-simulation and performance assessment of service infrastructure assignment processes. Originality/value: The proposed approach facilitates, in a multi-criteria environment, the flexible modeling/design of situation-specific assignment processes/algorithms and their performance assessment when facing their case-specific user population.

  2. Interaction of the cardiovascular system with an implanted rotary assist device: simulation study with a refined computer model.

    Science.gov (United States)

    Vollkron, Michael; Schima, Heinrich; Huber, Leopold; Wieselthaler, Georg

    2002-04-01

    In recent years, implanted rotary pumps have achieved the level of extended clinical application including complete mobilization and physical exercise of the recipients. A computer model was developed to study the interaction between a continuous-flow pump and the recovering cardiovascular system, the effects of changing pre- and afterloads, and the possibilities for indirect estimation of hemodynamic parameters and pump control. A numerical model of the cardiovascular system using Matlab Simulink simulation software was established. Data of circulatory system modules were derived from patients, our own in vitro and in vivo experiments, and the literature. Special care was taken to simulate properly the dynamic pressure-volume characteristics of both left and right ventricle, the Frank-Starling behavior, and the impedance of the proximal vessels. Excellent correlation with measured data was achieved including pressure and flow patterns within the time domain, response to varying loads, and effects of previously observed pressure-flow hysteresis in rotary pumps. Potential energy, external work, pressure-volume area, and other derived heart work parameters could be calculated. The model offers the possibility to perform parameter variations to study the effects of changing patient condition and therapy and to display them with three-dimensional graphics (demonstrated with the effects on right ventricular work and efficiency). The presented model gives an improved understanding of the interaction between the pump and both ventricles. It can be used for the investigation of various clinical and control questions in normal and pathological conditions of the left ventricular assist device recipient.
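
    The paper's refined model is far richer than can be reproduced here, but the basic lumped-parameter idea behind such cardiovascular simulations fits in a few lines. The sketch below integrates a two-element Windkessel driven by a pulsatile inflow; all parameter values and the flow waveform are illustrative assumptions:

```python
import math

# Two-element Windkessel: C*dP/dt = Q(t) - P/R.
R, C = 1.0, 1.5          # resistance (mmHg*s/ml), compliance (ml/mmHg)
dt, T = 1e-3, 0.8        # time step and cardiac period, s

def q_in(t):             # crude systolic ejection waveform, ml/s
    phase = t % T
    return 300.0 * math.sin(math.pi * phase / 0.3) if phase < 0.3 else 0.0

P, trace = 80.0, []      # initial pressure, mmHg
for k in range(int(5 * T / dt)):            # run 5 beats toward steady state
    P += dt * (q_in(k * dt) - P / R) / C    # forward Euler
    trace.append(P)

last_beat = trace[-int(T / dt):]
print(f"systolic ≈ {max(last_beat):.0f} mmHg, "
      f"diastolic ≈ {min(last_beat):.0f} mmHg")
```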

  3. Uncertainty and error in computational simulations

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, W.L.; Diegert, K.V.; Alvin, K.F.; Rutherford, B.M.

    1997-10-01

The present paper addresses the question: "What are the general classes of uncertainty and error sources in complex, computational simulations?" This is the first step of a two-step process to develop a general methodology for quantitatively estimating the global modeling and simulation uncertainty in computational modeling and simulation. The second step is to develop a general mathematical procedure for representing, combining and propagating all of the individual sources through the simulation. The authors develop a comprehensive view of the general phases of modeling and simulation. The phases proposed are: conceptual modeling of the physical system, mathematical modeling of the system, discretization of the mathematical model, computer programming of the discrete model, numerical solution of the model, and interpretation of the results. This new view is built upon combining phases recognized in the disciplines of operations research and numerical solution methods for partial differential equations. The characteristics and activities of each of these phases are discussed in general, but examples are given for the fields of computational fluid dynamics and heat transfer. They argue that a clear distinction should be made between uncertainty and error that can arise in each of these phases. The present definitions of uncertainty and error are inadequate and, therefore, they propose comprehensive definitions for these terms. Specific classes of uncertainty and error sources are then defined that can occur in each phase of modeling and simulation. The numerical sources of error considered apply regardless of whether the discretization procedure is based on finite elements, finite volumes, or finite differences. To better explain the broad types of sources of uncertainty and error, and the utility of their categorization, they discuss a coupled-physics example simulation.

  4. HTTR plant dynamic simulation using a hybrid computer

    International Nuclear Information System (INIS)

    Shimazaki, Junya; Suzuki, Katsuo; Nabeshima, Kunihiko; Watanabe, Koichi; Shinohara, Yoshikuni; Nakagawa, Shigeaki.

    1990-01-01

A plant dynamic simulation of the High-Temperature Engineering Test Reactor (HTTR) has been made using a new type of hybrid computer. This report describes a dynamic simulation model of the HTTR, a hybrid simulation method for SIMSTAR, and some results obtained from the dynamics analysis of the HTTR simulation. It concludes that hybrid plant simulation is useful for on-line simulation on account of its capability for high-speed computation compared with all-digital computer simulation. With sufficient accuracy, computation 40 times faster than real time was reached simply by changing the analog time scale of the HTTR simulation. (author)

  5. Cognitive models embedded in system simulation models

    International Nuclear Information System (INIS)

    Siegel, A.I.; Wolf, J.J.

    1982-01-01

If we are to discuss and consider cognitive models, we must first come to grips with two questions: (1) What is cognition? (2) What is a model? Presumably, the answers to these questions can provide a basis for defining a cognitive model. Accordingly, this paper first places these two questions into perspective. Then, cognitive models are set within the context of computer simulation models and a number of computer simulations of cognitive processes are described. Finally, pervasive issues are discussed vis-à-vis cognitive modeling in the computer simulation context

  6. Cluster computing software for GATE simulations

    International Nuclear Information System (INIS)

    Beenhouwer, Jan de; Staelens, Steven; Kruecker, Dirk; Ferrer, Ludovic; D'Asseler, Yves; Lemahieu, Ignace; Rannou, Fernando R.

    2007-01-01

    Geometry and tracking (GEANT4) is a Monte Carlo package designed for high energy physics experiments. It is used as the basis layer for Monte Carlo simulations of nuclear medicine acquisition systems in GEANT4 Application for Tomographic Emission (GATE). GATE allows the user to realistically model experiments using accurate physics models and time synchronization for detector movement through a script language contained in a macro file. The downside of this high accuracy is long computation time. This paper describes a platform independent computing approach for running GATE simulations on a cluster of computers in order to reduce the overall simulation time. Our software automatically creates fully resolved, nonparametrized macros accompanied with an on-the-fly generated cluster specific submit file used to launch the simulations. The scalability of GATE simulations on a cluster is investigated for two imaging modalities, positron emission tomography (PET) and single photon emission computed tomography (SPECT). Due to a higher sensitivity, PET simulations are characterized by relatively high data output rates that create rather large output files. SPECT simulations, on the other hand, have lower data output rates but require a long collimator setup time. Both of these characteristics hamper scalability as a function of the number of CPUs. The scalability of PET simulations is improved here by the development of a fast output merger. The scalability of SPECT simulations is improved by greatly reducing the collimator setup time. Accordingly, these two new developments result in higher scalability for both PET and SPECT simulations and reduce the computation time to more practical values
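
    The key mechanism described (automatically creating fully resolved macros plus cluster-specific submit files) boils down to splitting the workload and seeding each job independently. The sketch below shows that splitting logic in generic form; the macro template and file names are hypothetical and deliberately do not mimic GATE's actual macro syntax:

```python
from pathlib import Path

# Hypothetical per-job macro template, NOT real GATE macro commands.
TEMPLATE = "seed {seed}\nprimaries {n}\noutput job{idx:03d}.root\n"

def split_jobs(total_primaries: int, n_jobs: int, outdir: str = "jobs"):
    """Divide the primaries over n_jobs and write one resolved macro each."""
    Path(outdir).mkdir(exist_ok=True)
    base, rem = divmod(total_primaries, n_jobs)
    for idx in range(n_jobs):
        n = base + (1 if idx < rem else 0)   # spread the remainder evenly
        macro = TEMPLATE.format(seed=12345 + idx, n=n, idx=idx)
        Path(outdir, f"job{idx:03d}.mac").write_text(macro)

split_jobs(total_primaries=10_000_000, n_jobs=64)
```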

  7. Computationally efficient models for simulation of non-ideal DC–DC ...

    Indian Academy of Sciences (India)

Abstract. This paper discusses dynamic modeling of non-isolated DC–DC converters (buck, boost and buck–boost) under continuous and discontinuous modes of operation. Three types of models are presented for each converter, namely, switching model, average model and harmonic model. These models include ...
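
    As an illustration of the "average model" type listed above, the sketch below integrates the state-space averaged equations of an ideal buck converter in continuous conduction mode; the component values and duty cycle are arbitrary examples, and parasitics (the non-idealities the paper addresses) are omitted:

```python
# Averaged buck-converter model in CCM:
#   L*di/dt = d*Vin - v,   C*dv/dt = i - v/R
Vin, L, C, R, d = 12.0, 100e-6, 220e-6, 5.0, 0.5   # example values
dt, t_end = 1e-6, 20e-3

i, v = 0.0, 0.0
for _ in range(int(t_end / dt)):       # forward Euler integration
    di = (d * Vin - v) / L
    dv = (i - v / R) / C
    i, v = i + dt * di, v + dt * dv

print(f"steady state: v ≈ {v:.2f} V (ideal d*Vin = {d * Vin:.2f} V), "
      f"i ≈ {i:.2f} A")
```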

  8. Effect of Inquiry-Based Computer Simulation Modeling on Pre-Service Teachers' Understanding of Homeostasis and Their Perceptions of Design Features

    Science.gov (United States)

    Chabalengula, Vivien; Fateen, Rasheta; Mumba, Frackson; Ochs, Laura Kathryn

    2016-01-01

    This study investigated the effect of an inquiry-based computer simulation modeling (ICoSM) instructional approach on pre-service science teachers' understanding of homeostasis and its related concepts, and their perceived design features of the ICoSM and simulation that enhanced their conceptual understanding of these concepts. Fifty pre-service…

  9. The Use of Model Matching Video Analysis and Computational Simulation to Study the Ankle Sprain Injury Mechanism

    Directory of Open Access Journals (Sweden)

    Daniel Tik-Pui Fong

    2012-10-01

    Full Text Available Lateral ankle sprains continue to be the most common injury sustained by athletes and create an annual healthcare burden of over $4 billion in the U.S. alone. Foot inversion is suspected in these cases, but the mechanism of injury remains unclear. While kinematics and kinetics data are crucial in understanding the injury mechanisms, ligament behaviour measures – such as ligament strains – are viewed as the potential causal factors of ankle sprains. This review article demonstrates a novel methodology that integrates model matching video analyses with computational simulations in order to investigate injury-producing events for a better understanding of such injury mechanisms. In particular, ankle joint kinematics from actual injury incidents were deduced by model matching video analyses and then input into a generic computational model based on rigid bone surfaces and deformable ligaments of the ankle so as to investigate the ligament strains that accompany these sprain injuries. These techniques may have the potential for guiding ankle sprain prevention strategies and targeted rehabilitation therapies.

  10. Computer simulation of thermal plant operations

    CERN Document Server

    O'Kelly, Peter

    2012-01-01

    This book describes thermal plant simulation, that is, dynamic simulation of plants which produce, exchange and otherwise utilize heat as their working medium. Directed at chemical, mechanical and control engineers involved with operations, control and optimization and operator training, the book gives the mathematical formulation and use of simulation models of the equipment and systems typically found in these industries. The author has adopted a fundamental approach to the subject. The initial chapters provide an overview of simulation concepts and describe a suitable computer environment.

  11. Self-assembly of peptide scaffolds in biosilica formation: computer simulations of a coarse-grained model.

    Science.gov (United States)

    Lenoci, Leonardo; Camp, Philip J

    2006-08-09

    The self-assembly of model peptides is studied using Brownian dynamics computer simulations. A coarse-grained, bead-spring model is designed to mimic silaffins, small peptides implicated in the biomineralization of certain silica diatom skeletons and observed to promote the formation of amorphous silica nanospheres in vitro. The primary characteristics of the silaffin are a 15 amino acid hydrophilic backbone and two modified lysine residues near the ends of the backbone carrying long polyamine chains. In the simulations, the model peptides self-assemble to form spherical clusters, networks of strands, or bicontinuous structures, depending on the peptide concentration and effective temperature. The results indicate that over a broad range of volume fractions (0.05-25%) the characteristic structural lengthscales fall in the range 12-45 nm. On this basis, we suggest that self-assembled structures act as either nucleation points or scaffolds for the deposition of 10-100 nm silica-peptide building blocks from which diatom skeletons and synthetic nanospheres are constructed.
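
    The method itself (Brownian dynamics of a coarse-grained bead-spring chain) reduces to a simple overdamped Langevin update. Here is a minimal sketch with harmonic bonds only; the bead count, force constants and units are placeholders, and the silaffin model's polyamine side chains and non-bonded interactions are omitted:

```python
import numpy as np

# Overdamped Langevin step: x += F*dt/gamma + sqrt(2*kT*dt/gamma)*N(0,1).
rng = np.random.default_rng(42)
n_beads, k_bond, b0 = 17, 100.0, 1.0   # beads, spring constant, bond length
kT, gamma, dt = 1.0, 1.0, 1e-4         # reduced units (assumed)

# Start from a straight chain along x.
x = np.outer(np.arange(n_beads, dtype=float), [b0, 0.0, 0.0])

def bond_forces(x):
    d = x[1:] - x[:-1]                            # bond vectors
    r = np.linalg.norm(d, axis=1, keepdims=True)
    tension = k_bond * (r - b0) * (d / r)         # harmonic bond tension
    f = np.zeros_like(x)
    f[:-1] += tension                             # pulls left bead forward
    f[1:] -= tension                              # pulls right bead back
    return f

for _ in range(10_000):
    x += (bond_forces(x) / gamma) * dt \
         + np.sqrt(2 * kT * dt / gamma) * rng.normal(size=x.shape)

rg = np.sqrt(((x - x.mean(axis=0)) ** 2).sum(axis=1).mean())
print(f"radius of gyration after relaxation: {rg:.2f} b0")
```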

  12. Principle for the Validation of a Driving Support using a Computer Vision-Based Driver Modelization on a Simulator

    Directory of Open Access Journals (Sweden)

    Baptiste Rouzier

    2015-07-01

Full Text Available This paper presents a new structure for a driving support designed to compensate for problems caused by the behaviour of the driver without causing a feeling of unease. This assistance is based on shared control between the human and an automatic support that computes and applies an assisting torque on the steering wheel. This torque is computed from a representation of the hazards encountered on the road by virtual potentials. However, the equilibrium between the relative influences of the human and the support on the steering wheel is difficult to find and depends upon the situation. This is why this driving support includes a model of the driver based on an analysis of several face features using a computer vision algorithm. The goal is to determine whether the driver is drowsy, or whether he is paying attention to some specific points, in order to adapt the strength of the support. The accuracy of the measurements made on the face features is estimated, and the interest of the proposal as well as the concepts raised by such assistance are studied through simulations.
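
    The control law described (an assisting torque derived from virtual hazard potentials, modulated by the vision-based driver state) can be caricatured in a few lines. Everything below, including the Gaussian potential and the drowsiness gain, is a hypothetical stand-in for the paper's actual formulation:

```python
def hazard_potential(y, y_hazard=1.5, width=0.5):
    """Repulsive Gaussian potential around a hazard at lateral y_hazard (m)."""
    from math import exp
    return exp(-((y - y_hazard) ** 2) / (2 * width ** 2))

def assist_torque(y, drowsiness, k_base=2.0, dy=1e-3):
    """Torque steering down the potential; stronger when the driver is drowsy."""
    grad = (hazard_potential(y + dy) - hazard_potential(y - dy)) / (2 * dy)
    gain = k_base * (1.0 + drowsiness)   # drowsiness estimate in [0, 1]
    return -gain * grad                  # push away from the hazard

for drowsy in (0.0, 0.8):
    print(f"drowsiness={drowsy:.1f}: torque at y=1.0 m -> "
          f"{assist_torque(1.0, drowsy):+.2f} N*m")
```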

  13. CPV modelling with Solcore: An extensible modelling framework for the rapid computational simulation and evaluation of solar cell designs and concepts

    Science.gov (United States)

    Führer, Markus; Farrell, Daniel; Ekins-Daukes, Nicholas

    2013-09-01

Computer modelling can reduce the costs of CPV solar cell development by allowing the evaluation of designs without physical device growth. We present solcore, a powerful multi-tier modelling framework for the simulation of nano-structured solar cells, written in the open-source, popular, and approachable programming language Python. Capabilities include modules for materials (parameterisation, database), a 1D arbitrary-potential Schrödinger equation solver and absorption calculator, a k·p band structure solver, a spectral irradiance model and database, and multijunction quantum efficiency and IV calculators.
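
    One of the listed capabilities, a 1D arbitrary-potential Schrödinger solver, has a compact generic counterpart: diagonalise the finite-difference Hamiltonian on a grid. The sketch below does this for a GaAs-like square well; it illustrates the technique only, does not use solcore's actual API, and the well width, barrier height and effective mass are assumed round numbers:

```python
import numpy as np

# Bound states of a 100 Angstrom well with 0.3 eV barriers, m* = 0.067 m0.
hbar2_2m = 3.81 / 0.067          # hbar^2/(2 m*), eV*Angstrom^2
z = np.linspace(-200, 200, 801)  # position grid, Angstrom
dz = z[1] - z[0]
V = np.where(np.abs(z) < 50, 0.0, 0.3)   # square-well potential, eV

# Tridiagonal Hamiltonian from the 3-point second-difference stencil.
main = hbar2_2m * 2 / dz**2 + V
off = -hbar2_2m / dz**2 * np.ones(len(z) - 1)
H = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

E = np.linalg.eigvalsh(H)
bound = E[E < 0.3][:3]           # states confined below the barrier
print("lowest bound-state energies (eV):", np.round(bound, 4))
```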

  14. Geochemical modelling. Column 2: a computer program for simulation of migration

    International Nuclear Information System (INIS)

    Nielsen, O.J.; Carlsen, L.; Bo, P.

    1985-01-01

COLUMN2 is a 1D FORTRAN77 computer program designed for studies of the effects of various physicochemical processes on migration. It solves the solute transport equation and can take into account dispersion, sorption, ion exchange, and first- and second-order homogeneous chemical reactions. Spatial variations of input pulses and retention factors are possible. The method of solution is based on a finite difference discretization followed by the application of the method of characteristics and two separate grid systems. This report explains the mathematical and numerical methods used, describes the necessary input, contains a number of test examples, provides a listing of the program, and explains how to acquire the program, adapt it to other computers and run it. This report serves as a manual for the program
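
    A much-simplified sketch of the class of problem COLUMN2 solves is given below: 1D advection-dispersion with a constant retardation factor, R dc/dt = D d²c/dx² − v dc/dx, discretised here with upwind advection and explicit time stepping. COLUMN2 itself uses the method of characteristics; all values below are invented:

```python
import numpy as np

L_col, nx = 1.0, 201                       # column length (m), grid points
dx = L_col / (nx - 1)
v, D, R = 1e-5, 1e-7, 2.0                  # velocity, dispersion, retardation
dt = 0.4 * min(dx / v, dx**2 / (2 * D))    # stability-limited time step

c = np.zeros(nx)
c[0] = 1.0                                 # constant-concentration inlet
for _ in range(2_000):
    adv = -v * (c[1:-1] - c[:-2]) / dx                  # upwind advection
    disp = D * (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2   # central dispersion
    c[1:-1] += dt * (adv + disp) / R
    c[-1] = c[-2]                          # zero-gradient outlet

front = np.argmin(np.abs(c - 0.5)) * dx
print(f"c = 0.5 front after {2_000 * dt:.0f} s at x ≈ {front:.2f} m "
      f"(retarded velocity v/R = {v / R:.1e} m/s)")
```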

  15. Dynamic modelling of an adsorption storage tank using a hybrid approach combining computational fluid dynamics and process simulation

    Science.gov (United States)

    Mota, J.P.B.; Esteves, I.A.A.C.; Rostam-Abadi, M.

    2004-01-01

A computational fluid dynamics (CFD) software package has been coupled with the dynamic process simulator of an adsorption storage tank for methane fuelled vehicles. The two solvers run as independent processes and handle non-overlapping portions of the computational domain. The codes exchange data on the boundary interface of the two domains to ensure continuity of the solution and of its gradient. A software interface was developed to dynamically suspend and activate each process as necessary, and be responsible for data exchange and process synchronization. This hybrid computational tool has been successfully employed to accurately simulate the discharge of a new tank design and evaluate its performance. The case study presented here shows that CFD and process simulation are highly complementary computational tools, and that there are clear benefits to be gained from a close integration of the two. © 2004 Elsevier Ltd. All rights reserved.

  16. Casting directly from a computer model by using advanced simulation software FLOW-3D Cast®

    Directory of Open Access Journals (Sweden)

    M. Sirviö

    2009-01-01

    Full Text Available ConiferRob, a patternless casting technique originally conceived at VTT Technical Research Centre of Finland and further developed at its spin-off company, Simtech Systems, offers up to 40% savings in product development costs and up to two months shorter development times compared with conventional techniques. Savings of this order can be very valuable in today's highly competitive markets. Casting simulation is commonly used for designing casting systems. However, most of the software available today is old-fashioned and predicts only shrinkage porosity. Flow Science, VTT and Simtech have developed new software, called FLOW-3D Cast®, which can simulate surface defects, air entrainment, filters, core gas problems and even cavitation.

  17. Computer simulation model of the structure of ion implanted impurities in semiconductors

    International Nuclear Information System (INIS)

    Roman, E.; Majlis, N.

    1983-02-01

    A system of ion implanted impurities in a semiconductor is described by a Monte Carlo simulation of a non-equilibrium system of randomly distributed hard spheres. The radial distribution function of this system is found, and a comparison is made with the fluid hard-sphere case. It is also assumed that neither annealing nor diffusion of the impurities occurs after the implantation process. (author)
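
    A compact sketch of how the radial distribution function of such a configuration is estimated (generic Python assuming a cubic box with periodic boundaries; not the authors' code):

        import numpy as np

        def radial_distribution(pos, box, n_bins=100):
            """g(r) histogram for N particles in a cubic box of side `box`."""
            n, r_max = len(pos), box / 2.0
            hist = np.zeros(n_bins)
            for i in range(n - 1):
                d = pos[i + 1:] - pos[i]
                d -= box * np.round(d / box)          # minimum-image convention
                r = np.sqrt((d * d).sum(axis=1))
                hist += np.histogram(r, bins=n_bins, range=(0.0, r_max))[0]
            edges = np.linspace(0.0, r_max, n_bins + 1)
            shell = 4.0 / 3.0 * np.pi * (edges[1:]**3 - edges[:-1]**3)
            ideal = 0.5 * n * (n / box**3) * shell    # expected ideal-gas pair counts
            return 0.5 * (edges[1:] + edges[:-1]), hist / ideal

        pos = np.random.rand(500, 3) * 10.0           # placeholder configuration
        r, g = radial_distribution(pos, box=10.0)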

  18. Casting directly from a computer model by using advanced simulation software FLOW-3D Cast®

    OpenAIRE

    M. Sirviö; M. Woś

    2009-01-01

    ConiferRob, a patternless casting technique originally conceived at VTT Technical Research Centre of Finland and further developed at its spin-off company, Simtech Systems, offers up to 40% savings in product development costs and up to two months shorter development times compared with conventional techniques. Savings of this order can be very valuable in today's highly competitive markets. Casting simulation is commonly used for designing casting systems. However, most of the software are ...

  19. AEGIS geologic simulation model

    International Nuclear Information System (INIS)

    Foley, M.G.

    1982-01-01

    The Geologic Simulation Model (GSM) is used by the AEGIS (Assessment of Effectiveness of Geologic Isolation Systems) program at the Pacific Northwest Laboratory to simulate the dynamic geology and hydrology of a geologic nuclear waste repository site over a million-year period following repository closure. The GSM helps to organize geologic/hydrologic data; to focus attention on active natural processes by requiring their simulation; and, through interactive simulation and calibration, to reduce subjective evaluations of the geologic system. During each computer run, the GSM produces a million-year geologic history that is possible for the region and the repository site. In addition, the GSM records in permanent history files everything that occurred during that time span. Statistical analyses of data in the history files of several hundred simulations are used to classify typical evolutionary paths, to establish the probabilities associated with deviations from the typical paths, and to determine which types of perturbations of the geologic/hydrologic system, if any, are most likely to occur. These simulations will be evaluated by geologists familiar with the repository region to determine validity of the results. Perturbed systems that are determined to be the most realistic, within whatever probability limits are established, will be used for the analyses that involve radionuclide transport and dose models. The GSM is designed to be continuously refined and updated. Simulation models are site specific, and, although the submodels may have limited general applicability, the input data requirements necessitate detailed characterization of each site before application

  20. Designing Online Scaffolds for Interactive Computer Simulation

    Science.gov (United States)

    Chen, Ching-Huei; Wu, I-Chia; Jen, Fen-Lan

    2013-01-01

    The purpose of this study was to examine the effectiveness of online scaffolds in computer simulation to facilitate students' science learning. We first introduced online scaffolds to assist and model students' science learning and to demonstrate how a system embedded with online scaffolds can be designed and implemented to help high school…

  1. Numerical Implementation and Computer Simulation of Tracer ...

    African Journals Online (AJOL)

    Numerical Implementation and Computer Simulation of Tracer Experiments in a Physical Aquifer Model. ... African Research Review ... A sensitivity analysis showed that the time required for complete source depletion, was most dependent on the source definition and the hydraulic conductivity K of the porous medium.

  2. Computer Modeling and Simulation of Bullet Impact to the Human Thorax

    National Research Council Canada - National Science Library

    Jolly, Johannes

    2000-01-01

    .... The objective of the study was to create a viable finite element model of the human thorax. The model was validated by comparing the results of tests of body armor systems conducted on cadavers to results obtained from finite element analysis...

  3. Computer simulation of nonequilibrium processes

    Energy Technology Data Exchange (ETDEWEB)

    Wallace, D.C.

    1985-07-01

    The underlying concepts of nonequilibrium statistical mechanics and of irreversible thermodynamics will be described. The question at hand is, then, how these concepts are to be realized in computer simulations of many-particle systems. The answer will be given for dissipative deformation processes in solids, on three hierarchical levels: heterogeneous plastic flow, dislocation dynamics, and molecular dynamics. Application to the shock process will be discussed.

  4. Memory interface simulator: A computer design aid

    Science.gov (United States)

    Taylor, D. S.; Williams, T.; Weatherbee, J. E.

    1972-01-01

    Results are presented of a study conducted with a digital simulation model being used in the design of the Automatically Reconfigurable Modular Multiprocessor System (ARMMS), a candidate computer system for future manned and unmanned space missions. The model simulates the activity involved as instructions are fetched from random access memory for execution in one of the system central processing units. A series of model runs measured instruction execution time under various assumptions pertaining to the CPU's and the interface between the CPU's and RAM. Design tradeoffs are presented in the following areas: Bus widths, CPU microprogram read only memory cycle time, multiple instruction fetch, and instruction mix.

  5. A Computational Model and Multi-Agent Simulation for Information Assurance

    National Research Council Canada - National Science Library

    VanPutte, Michael

    2002-01-01

    The field of information assurance (IA) is too complex for current modeling tools. While security analysts may understand individual mechanisms at a particular moment, the interactions among the mechanisms, combined with evolving nature...

  6. Improvements in Thermal Performance of Mango Hot-water Treatment Equipments: Data Analysis, Mathematical Modelling and Numerical-computational Simulation

    Directory of Open Access Journals (Sweden)

    Elder M. Mendoza Orbegoso

    2017-06-01

    Full Text Available Mango is one of the most popular and best-paid tropical fruits in worldwide markets; its exportation is regulated by a phytosanitary quality control for killing the “fruit fly”. Thus, mangoes must be subjected to a hot-water treatment process that involves their immersion in hot water over a period of time. In this work, field measurements and analytical and simulation studies are carried out on available hot-water treatment equipment, called “Original”, that complies only with United States phytosanitary protocols. These approaches are used to characterize the fluid-dynamic and thermal behaviour that occurs during the mangoes' hot-water treatment process. An analytical model and computational fluid dynamics simulations are then developed for designing new hot-water treatment equipment, called “Hybrid”, that simultaneously meets both United States and Japan phytosanitary certifications. Comparisons of the analytical results with the field measurements demonstrate that the “Hybrid” equipment offers better fluid-dynamic and thermal performance than the “Original” equipment.

  7. FPGA-accelerated simulation of computer systems

    CERN Document Server

    Angepat, Hari; Chung, Eric S; Hoe, James C; Chung, Eric S

    2014-01-01

    To date, the most common simulators of computer systems are software-based, running on standard computers. One promising approach to improving simulation performance is to apply hardware, specifically reconfigurable hardware in the form of field programmable gate arrays (FPGAs). This manuscript describes various approaches to using FPGAs to accelerate software-implemented simulation of computer systems and selected simulators that incorporate those techniques. More precisely, we describe a simulation architecture taxonomy that incorporates a simulation architecture specifically designed f

  8. Vernier Caliper and Micrometer Computer Models Using Easy Java Simulation and Its Pedagogical Design Features--Ideas for Augmenting Learning with Real Instruments

    Science.gov (United States)

    Wee, Loo Kang; Ning, Hwee Tiang

    2014-01-01

    This paper presents the customization of Easy Java Simulation models, used with actual laboratory instruments, to create active experiential learning for measurements. The laboratory instruments are the vernier caliper and the micrometer. Three computer model design ideas that complement real equipment are discussed. These ideas involve (1) a…

  9. Computer simulation of radiation damage in NaCl using a kinetic rate reaction model

    International Nuclear Information System (INIS)

    Soppe, W.J.

    1993-01-01

    Sodium chloride and other alkali halides are known to be very susceptible to radiation damage in the halogen sublattice when exposed to ionizing radiation. The formation of radiation damage in NaCl has generated interest because of the relevance of this damage to the disposal of radioactive waste in rock salt formations. In order to estimate the long-term behaviour of a rock salt repository, an accurate theory describing the major processes of radiation damage in NaCl is required. The model presented in this paper is an extended version of the Jain-Lidiard model; its extensions comprise the effects of impurities and of the colloid nucleation stage on the formation of radiation damage. The new model has been tested against various experimental data from the literature and accounts for several well-known aspects of radiation damage in alkali halides that were not covered by the original Jain-Lidiard model. The new model may thus be expected to provide more reliable predictions for the build-up of radiation damage in a rock salt nuclear waste repository. (Author)

  10. Plasticity: modeling & computation

    National Research Council Canada - National Science Library

    Borja, Ronaldo Israel

    2013-01-01

    .... "Plasticity Modeling & Computation" is a textbook written specifically for students who want to learn the theoretical, mathematical, and computational aspects of inelastic deformation in solids...

  11. Cell-Free Transmission of Human Adenovirus by Passive Mass Transfer in Cell Culture Simulated in a Computer Model

    Science.gov (United States)

    Yakimovich, Artur; Gumpert, Heidi; Burckhardt, Christoph J.; Lütschg, Verena A.; Jurgeit, Andreas; Sbalzarini, Ivo F.

    2012-01-01

    Viruses spread between cells, tissues, and organisms by cell-free and cell-cell transmissions. Both mechanisms enhance disease development, but it is difficult to distinguish between them. Here, we analyzed the transmission mode of human adenovirus (HAdV) in monolayers of epithelial cells by wet laboratory experimentation and a computer simulation. Using live-cell fluorescence microscopy and replication-competent HAdV2 expressing green fluorescent protein, we found that the spread of infection invariably occurred after cell lysis. It was affected by convection and blocked by neutralizing antibodies but was independent of second-round infections. If cells were overlaid with agarose, convection was blocked and round plaques developed around lytic infected cells. Infected cells that did not lyse did not give rise to plaques, highlighting the importance of cell-free transmission. Key parameters for cell-free virus transmission were the time from infection to lysis, the dose of free viruses determining infection probability, and the diffusion of single HAdV particles in aqueous medium. With these parameters, we developed an in silico model using multiscale hybrid dynamics, cellular automata, and particle strength exchange. This so-called white box model is based on experimentally determined parameters and reproduces viral infection spreading as a function of the local concentration of free viruses. These analyses imply that the extent of lytic infections can be determined by either direct plaque assays or can be predicted by calculations of virus diffusion constants and modeling. PMID:22787215
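
    One of the key parameters named above, the diffusion constant of a single free virion, can be estimated from the Stokes-Einstein relation; the ~90 nm particle diameter and water-like medium viscosity used here are illustrative assumptions, not the paper's fitted values:

        import math

        def stokes_einstein(radius, T=310.0, eta=6.9e-4):
            """Diffusion coefficient (m^2/s) of a sphere in a viscous medium."""
            kB = 1.380649e-23                  # Boltzmann constant, J/K
            return kB * T / (6.0 * math.pi * eta * radius)

        D = stokes_einstein(45e-9)             # ~7e-12 m^2/s for a ~90 nm virion at 37 C
        print(D)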

  12. Computer Simulation for Dispersion of Air Pollution Released from a Line Source According to Gaussian Model

    International Nuclear Information System (INIS)

    Emad, A.A.; El Shazly, S.M.; Kassem, Kh.O.

    2010-01-01

    A line-source model, developed in the laboratory of environmental physics, Faculty of Science at Qena, Egypt, is proposed to describe the downwind dispersion of pollutants near roadways in different cities in Egypt. The model is based on the Gaussian plume methodology and is used to predict air pollutant concentrations near roadways. To this end, simple software developed by the authors is presented in this paper, fully adopting a graphical user interface (GUI) for operation on various Windows-based microcomputers. The software interface and code were designed in Microsoft Visual Basic 6.0, based on the Gaussian diffusion equation. This software is developed to predict concentrations of air pollutants (e.g. CO, SO₂, NO₂ and particulates) at a user-specified receptor grid
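
    A minimal sketch of the Gaussian line-source formula that underlies such software (generic Python; the sigma-z power law and its coefficients are placeholder stability-class values, not the paper's calibration):

        import math

        def line_source_conc(q, u, x, z=1.5, H=0.5, a=0.20, b=0.76):
            """Ground-reflected concentration (g/m^3) downwind of an infinite
            crosswind line source; q in g/(m s), wind speed u in m/s."""
            sigma_z = a * x**b                 # vertical dispersion length, m
            norm = q / (math.sqrt(2.0 * math.pi) * sigma_z * u)
            refl = (math.exp(-(z - H)**2 / (2.0 * sigma_z**2))
                    + math.exp(-(z + H)**2 / (2.0 * sigma_z**2)))
            return norm * refl

        print(line_source_conc(q=1.0e-3, u=2.0, x=100.0))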

  13. Development of simulators algorithms of planar radioactive sources for use in computer models of exposure

    International Nuclear Information System (INIS)

    Vieira, Jose Wilson; Leal Neto, Viriato; Lima Filho, Jose de Melo; Lima, Fernando Roberto de Andrade

    2013-01-01

    This paper presents an algorithm for a planar, isotropic radioactive source, obtained by subjecting the standard Gaussian probability density function (PDF) to a translation and rotation that displace its maximum across the field, change its intensity, and make the dispersion around the mean right-asymmetric. The algorithm was used to generate samples of photons emerging from a plane and reaching a semicircle surrounding a voxel phantom. The PDF describing this problem is already known, but the random-number generating function (FRN) associated with it cannot be deduced by direct MC techniques. This is a significant problem because the source can be adapted to simulations involving natural terrestrial radiation or accidents in medical establishments or industries where radioactive material spreads over a plane. Some attempts to obtain an FRN for the PDF of the problem have already been implemented by the Research Group in Numerical Dosimetry (GND) from Recife-PE, Brazil, always using the MC rejection-sampling technique. This article follows the methodology of previous work, except on one point: the PDF of the problem was replaced by a translated normal PDF. To perform dosimetric comparisons, two exposure computational models (MCEs) were used: MSTA (“MASH standing”, composed of the adult male voxel phantom MASH (male mesh) in the orthostatic position, available from the Department of Nuclear Energy (DEN) of the Federal University of Pernambuco (UFPE), coupled to the EGSnrc MC code, with the GND planar source based on the rejection technique) and MSTA_NT. The two MCEs are similar in all respects except the FRN used in the planar source. The results presented and discussed in this paper establish the new algorithm for a planar source to be used by GND
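
    The rejection-sampling technique referred to above can be sketched generically (the skewed density below is an illustrative two-piece Gaussian stand-in, not the GND group's actual PDF):

        import math
        import random

        def f(x, mu=0.3, s_left=0.15, s_right=0.35):
            """Right-asymmetric density: narrower left of the mode, wider right."""
            s = s_left if x < mu else s_right
            return math.exp(-(x - mu) ** 2 / (2.0 * s * s))   # max value 1 at x = mu

        def rejection_sample(n, f_max=1.0):
            out = []
            while len(out) < n:
                x = random.random()                # candidate position on the plane
                if random.random() * f_max <= f(x):
                    out.append(x)                  # accept with probability f(x)/f_max
            return out

        samples = rejection_sample(10000)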

  14. Modeling, Simulation and Analysis of Complex Networked Systems: A Program Plan for DOE Office of Advanced Scientific Computing Research

    Energy Technology Data Exchange (ETDEWEB)

    Brown, D L

    2009-05-01

    Many complex systems of importance to the U.S. Department of Energy consist of networks of discrete components. Examples are cyber networks, such as the internet and local area networks over which nearly all DOE scientific, technical and administrative data must travel, the electric power grid, social networks whose behavior can drive energy demand, and biological networks such as genetic regulatory networks and metabolic networks. In spite of the importance of these complex networked systems to all aspects of DOE's operations, the scientific basis for understanding these systems lags seriously behind the strong foundations that exist for the 'physically-based' systems usually associated with DOE research programs that focus on such areas as climate modeling, fusion energy, high-energy and nuclear physics, nano-science, combustion, and astrophysics. DOE has a clear opportunity to develop a similarly strong scientific basis for understanding the structure and dynamics of networked systems by supporting a strong basic research program in this area. Such knowledge will provide a broad basis for, e.g., understanding and quantifying the efficacy of new security approaches for computer networks, improving the design of computer or communication networks to be more robust against failures or attacks, detecting potential catastrophic failure on the power grid and preventing or mitigating its effects, understanding how populations will respond to the availability of new energy sources or changes in energy policy, and detecting subtle vulnerabilities in large software systems to intentional attack. This white paper outlines plans for an aggressive new research program designed to accelerate the advancement of the scientific basis for complex networked systems of importance to the DOE. It will focus principally on four research areas: (1) understanding network structure, (2) understanding network dynamics, (3) predictive modeling and simulation for complex

  15. Modeling, Simulation and Analysis of Complex Networked Systems: A Program Plan for DOE Office of Advanced Scientific Computing Research

    International Nuclear Information System (INIS)

    Brown, D.L.

    2009-01-01

    Many complex systems of importance to the U.S. Department of Energy consist of networks of discrete components. Examples are cyber networks, such as the internet and local area networks over which nearly all DOE scientific, technical and administrative data must travel, the electric power grid, social networks whose behavior can drive energy demand, and biological networks such as genetic regulatory networks and metabolic networks. In spite of the importance of these complex networked systems to all aspects of DOE's operations, the scientific basis for understanding these systems lags seriously behind the strong foundations that exist for the 'physically-based' systems usually associated with DOE research programs that focus on such areas as climate modeling, fusion energy, high-energy and nuclear physics, nano-science, combustion, and astrophysics. DOE has a clear opportunity to develop a similarly strong scientific basis for understanding the structure and dynamics of networked systems by supporting a strong basic research program in this area. Such knowledge will provide a broad basis for, e.g., understanding and quantifying the efficacy of new security approaches for computer networks, improving the design of computer or communication networks to be more robust against failures or attacks, detecting potential catastrophic failure on the power grid and preventing or mitigating its effects, understanding how populations will respond to the availability of new energy sources or changes in energy policy, and detecting subtle vulnerabilities in large software systems to intentional attack. This white paper outlines plans for an aggressive new research program designed to accelerate the advancement of the scientific basis for complex networked systems of importance to the DOE. It will focus principally on four research areas: (1) understanding network structure, (2) understanding network dynamics, (3) predictive modeling and simulation for complex networked systems

  16. Computer simulation of temperature-dependent growth of fractal and compact domains in diluted Ising models

    DEFF Research Database (Denmark)

    Sørensen, Erik Schwartz; Fogedby, Hans C.; Mouritsen, Ole G.

    1989-01-01

    temperature are studied as functions of temperature, time, and concentration. At zero temperature and high dilution, the growing solid is found to have a fractal morphology and the effective fractal exponent D varies with concentration and ratio of time scales of the two dynamical processes. The mechanism...... responsible for forming the fractal solid is shown to be a buildup of a locally high vacancy concentration in the active growth zone. The growth-probability measure of the fractals is analyzed in terms of multifractality by calculating the f(α) spectrum. It is shown that the basic ideas of relating...... probability measures of static fractal objects to the growth-probability distribution during formation of the fractal apply to the present model. The f(α) spectrum is found to be in the universality class of diffusion-limited aggregation. At finite temperatures, the fractal solid domains become metastable...

  17. Planning intensive care unit design using computer simulation modeling: optimizing integration of clinical, operational, and architectural requirements.

    Science.gov (United States)

    OʼHara, Susan

    2014-01-01

    Nurses have increasingly been regarded as critical members of the planning team as architects recognize their knowledge and value. But the nurses' role as knowledge experts can be expanded to leading efforts to integrate clinical, operational, and architectural expertise through simulation modeling. Simulation modeling allows for the optimal merging of multifactorial data to understand the current state of the intensive care unit and predict future states. Nurses can champion the simulation modeling process and reap the benefits of a cost-effective way to test new designs, processes, staffing models, and future programming trends prior to implementation. Simulation modeling is an evidence-based planning approach and a standard for integrating the sciences with real client data to offer solutions for improving patient care.

  18. Progress in Computational Simulation of Earthquakes

    Science.gov (United States)

    Donnellan, Andrea; Parker, Jay; Lyzenga, Gregory; Judd, Michele; Li, P. Peggy; Norton, Charles; Tisdale, Edwin; Granat, Robert

    2006-01-01

    GeoFEST(P) is a computer program written for use in the QuakeSim project, which is devoted to the development and improvement of means of computational simulation of earthquakes. GeoFEST(P) models interacting earthquake fault systems from the fault-nucleation to the tectonic scale. The development of GeoFEST(P) has involved the coupling of two programs: GeoFEST and the Pyramid Adaptive Mesh Refinement Library. GeoFEST is a message-passing-interface-parallel code that utilizes a finite-element technique to simulate the evolution of stress, fault slip, and plastic/elastic deformation in realistic materials like those of faulted regions of the crust of the Earth. The products of such simulations are synthetic observable time-dependent surface deformations on time scales from days to decades. The Pyramid Adaptive Mesh Refinement Library is a software library that facilitates the generation of computational meshes for solving physical problems. In an application of GeoFEST(P), a computational grid can be dynamically adapted as stress grows on a fault. Simulations that once used a few tens of thousands of stress and displacement finite elements on workstations can now be expanded to multiple millions of elements with greater than 98-percent scaled efficiency on many hundreds of parallel processors.

  19. Experiential Learning through Computer-Based Simulations.

    Science.gov (United States)

    Maynes, Bill; And Others

    1992-01-01

    Describes experiential learning instructional model and simulation for student principals. Describes interactive laser videodisc simulation. Reports preliminary findings about student principal learning from simulation. Examines learning approaches by unsuccessful and successful students and learning levels of model learners. Simulation's success…

  20. Computational neurogenetic modeling

    CERN Document Server

    Benuskova, Lubica

    2010-01-01

    Computational Neurogenetic Modeling is a student text, introducing the scope and problems of a new scientific discipline - Computational Neurogenetic Modeling (CNGM). CNGM is concerned with the study and development of dynamic neuronal models for modeling brain functions with respect to genes and dynamic interactions between genes. These include neural network models and their integration with gene network models. This new area brings together knowledge from various scientific disciplines, such as computer and information science, neuroscience and cognitive science, genetics and molecular biol

  1. Computer simulation on molten ionic salts

    International Nuclear Information System (INIS)

    Kawamura, K.; Okada, I.

    1978-01-01

    The extensive advances in computer technology have made it possible to apply computer simulation to the evaluation of the macroscopic and microscopic properties of molten salts. The evaluation of the potential energy in molten salt systems is complicated by the presence of long-range energy, i.e. Coulomb energy, in contrast to simple liquids where the potential energy is easily evaluated. It has been shown, however, that no difficulties are encountered when the Ewald method is applied to the evaluation of Coulomb energy. After a number of attempts had been made to approximate the pair potential, the Huggins-Mayer potential based on ionic crystals became the most often employed. Since it is thought that the only appreciable contribution to the many-body potential not included in the Huggins-Mayer potential arises from the internal electrostatic polarization of ions in molten ionic salts, computer simulation with a provision for ion polarization has been tried recently. The computations, which are employed mainly for molten alkali halides, can provide: (1) thermodynamic data such as internal energy, internal pressure and isothermal compressibility; (2) microscopic configurational data such as radial distribution functions; (3) transport data such as the diffusion coefficient and electrical conductivity; and (4) spectroscopic data such as the intensity of inelastic scattering and the stretching frequency of simple molecules. The computed results seem to agree well with the measured results. Computer simulation can also be used to test the effectiveness of a proposed pair potential and the adequacy of postulated models of molten salts, and to obtain experimentally inaccessible data. A further application of MD computation, employing the pair potential based on an ionic model, to BeF₂, ZnCl₂ and SiO₂ shows the possibility of quantitative interpretation of structures and glass transformation phenomena
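
    For concreteness, a sketch of the Huggins-Mayer (Born-Mayer-Huggins) pair potential named above; the parameters are left symbolic because the fitted constants differ per salt, and in a periodic simulation the Coulomb term is summed with the Ewald method rather than evaluated pairwise as here:

        import math

        def huggins_mayer(r, zi, zj, B, sigma_sum, rho, C, D):
            """Pair energy (J) at separation r (m) for ions with charges zi, zj."""
            e2_4pie0 = 2.307e-28                      # e^2/(4 pi eps0), J m
            coulomb = zi * zj * e2_4pie0 / r          # long-range Coulomb term
            repulsion = B * math.exp((sigma_sum - r) / rho)   # Born repulsion
            dispersion = -C / r**6 - D / r**8         # dipole-dipole, dipole-quadrupole
            return coulomb + repulsion + dispersion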

  2. Computer simulation model for the striped bass young-of-the-year population in the Hudson River. [Effects of entrainment and impingement at power plants on population dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Eraslan, A.H.; Van Winkle, W.; Sharp, R.D.; Christensen, S.W.; Goodyear, C.P.; Rush, R.M.; Fulkerson, W.

    1975-09-01

    This report presents a daily transient (tidal-averaged), longitudinally one-dimensional (cross-section-averaged) computer simulation model for the assessment of the entrainment and impingement impacts of power plant operations on young-of-the-year populations of the striped bass, Morone saxatilis, in the Hudson River.

  3. LHCb computing model

    CERN Document Server

    Frank, M; Pacheco, Andreu

    1998-01-01

    This document is a first attempt to describe the LHCb computing model. The CPU power needed to process data for the event filter and reconstruction is estimated to be 2.2 × 10⁶ MIPS. This will be installed at the experiment and will be reused during non-data-taking periods for reprocessing. The maximal I/O of these activities is estimated to be around 40 MB/s. We have studied three basic models concerning the placement of the CPU resources for the other computing activities, Monte Carlo simulation (1.4 × 10⁶ MIPS) and physics analysis (0.5 × 10⁶ MIPS): CPU resources may either be located at the physicist's home lab, national computer centres (Regional Centres) or at CERN. The CPU resources foreseen for analysis are sufficient to allow 100 concurrent analyses. It is assumed that physicists will work in physics groups that produce analysis data at an average rate of 4.2 MB/s or 11 TB per month. However, producing these group analysis data requires reading capabilities of 660 MB/s. It is further assu...

  4. The role of computer simulation in nuclear technologies development

    International Nuclear Information System (INIS)

    Tikhonchev, M.Yu.; Shimansky, G.A.; Lebedeva, E.E.; Lichadeev, V. V.; Ryazanov, D.K.; Tellin, A.I.

    2001-01-01

    In the report the role and purposes of computer simulation in nuclear technology development are discussed. The authors consider such applications of computer simulation as nuclear safety research, optimization of technical and economic parameters of operating nuclear plants, planning and support of reactor experiments, research and design of new devices and technologies, and design and development of 'simulators' for training operating personnel. For these applications, the following aspects of computer simulation are discussed in the report: neutron-physical, thermal and hydrodynamic models; simulation of isotope composition changes and damage-dose accumulation in materials under irradiation; and simulation of reactor control structures. (authors)

  5. Parallel computing in enterprise modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Goldsby, Michael E.; Armstrong, Robert C.; Shneider, Max S.; Vanderveen, Keith; Ray, Jaideep; Heath, Zach; Allan, Benjamin A.

    2008-08-01

    This report presents the results of our efforts to apply high-performance computing to entity-based simulations with a multi-use plugin for parallel computing. We use the term 'entity-based simulation' to describe a class of simulation which includes both discrete event simulation and agent-based simulation. What simulations of this class share, and what differs from more traditional models, is that the result sought is emergent from a large number of contributing entities. Logistic, economic and social simulations are members of this class, where things or people are organized or self-organize to produce a solution. Entity-based problems never have an a priori ergodic principle that will greatly simplify calculations. Because the results of entity-based simulations can only be realized at scale, scalable computing is de rigueur for large problems. Having said that, the absence of a spatial organizing principle makes the decomposition of the problem onto processors problematic. In addition, practitioners in this domain commonly use the Java programming language, which presents its own problems in a high-performance setting. The plugin we have developed, called the Parallel Particle Data Model, overcomes both of these obstacles and is now being used by two Sandia frameworks: the Decision Analysis Center, and the Seldon social simulation facility. While the ability to engage U.S.-sized problems is now available to the Decision Analysis Center, this plugin is central to the success of Seldon. Because Seldon relies on computationally intensive cognitive sub-models, this work is necessary to achieve the scale necessary for realistic results. With the recent upheavals in the financial markets, and the inscrutability of terrorist activity, this simulation domain will likely need a capability with ever greater fidelity. High-performance computing will play an important part in enabling that greater fidelity.

  6. Shadow effects in simulated ultrasound images derived from computed tomography images using a focused beam tracing model

    DEFF Research Database (Denmark)

    Pham, An Hoai; Lundgren, Bo; Stage, Bjarne

    2012-01-01

    Simulation of ultrasound images based on computed tomography (CT) data has previously been performed with different approaches. Shadow effects are normally pronounced in ultrasound images, so they should be included in the simulation. In this study, a method to capture the shadow effects has been......Focus ultrasound scanner (BK Medical, Herlev, Denmark) equipped with a dedicated research interface giving access to beamformed radio frequency data. CT images were obtained with an Aquilion ONE Toshiba CT scanner (Toshiba Medical Systems Corp., Tochigi, Japan). CT data were mapped from Hounsfield units...

  7. Assessing the relationship between computational speed and precision: a case study comparing an interpreted versus compiled programming language using a stochastic simulation model in diabetes care.

    Science.gov (United States)

    McEwan, Phil; Bergenheim, Klas; Yuan, Yong; Tetlow, Anthony P; Gordon, Jason P

    2010-01-01

    Simulation techniques are well suited to modelling diseases yet can be computationally intensive. This study explores the relationship between modelled effect size, statistical precision, and efficiency gains achieved using variance reduction and an executable programming language. A published simulation model designed to model a population with type 2 diabetes mellitus based on the UKPDS 68 outcomes equations was coded in both Visual Basic for Applications (VBA) and C++. Efficiency gains due to the programming language were evaluated, as was the impact of antithetic variates to reduce variance, using predicted QALYs over a 40-year time horizon. The use of C++ provided a 75- and 90-fold reduction in simulation run time when using mean and sampled input values, respectively. For a series of 50 one-way sensitivity analyses, this would yield a total run time of 2 minutes when using C++, compared with 155 minutes for VBA when using mean input values. The use of antithetic variates typically resulted in a 53% reduction in the number of simulation replications and run time required. When drawing all input values to the model from distributions, the use of C++ and variance reduction resulted in a 246-fold improvement in computation time compared with VBA - for which the evaluation of 50 scenarios would correspondingly require 3.8 hours (C++) and approximately 14.5 days (VBA). The choice of programming language used in an economic model, as well as the methods for improving precision of model output can have profound effects on computation time. When constructing complex models, more computationally efficient approaches such as C++ and variance reduction should be considered; concerns regarding model transparency using compiled languages are best addressed via thorough documentation and model validation.
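
    The antithetic-variates device credited above with roughly halving the number of replications can be sketched in a few lines (Python here for brevity; the payoff function is an arbitrary stand-in for one simulated patient's QALYs):

        import random
        import statistics

        def payoff(u):
            return u ** 2                      # placeholder outcome model

        def antithetic_mean(n_pairs):
            vals = []
            for _ in range(n_pairs):
                u = random.random()
                # pairing u with 1 - u gives negatively correlated estimates,
                # so their average has lower variance than two independent draws
                vals.append(0.5 * (payoff(u) + payoff(1.0 - u)))
            return statistics.fmean(vals)

        print(antithetic_mean(10000))          # estimates E[payoff] = 1/3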

  8. Accelerating Climate Simulations Through Hybrid Computing

    Science.gov (United States)

    Zhou, Shujia; Sinno, Scott; Cruz, Carlos; Purcell, Mark

    2009-01-01

    Unconventional multi-core processors (e.g., IBM Cell B/E and NVIDIA GPU) have emerged as accelerators in climate simulation. However, climate models typically run on parallel computers with conventional processors (e.g., Intel and AMD) using MPI. Connecting accelerators to this architecture efficiently and easily becomes a critical issue. When using MPI for connection, we identified two challenges: (1) identical MPI implementation is required in both systems, and (2) existing MPI code must be modified to accommodate the accelerators. In response, we have extended and deployed IBM Dynamic Application Virtualization (DAV) in a hybrid computing prototype system (one blade with two Intel quad-core processors, two IBM QS22 Cell blades, connected with InfiniBand), allowing for seamlessly offloading compute-intensive functions to remote, heterogeneous accelerators in a scalable, load-balanced manner. Currently, a climate solar radiation model running with multiple MPI processes has been offloaded to multiple Cell blades with approximately 10% network overhead.

  9. Multilaboratory particle image velocimetry analysis of the FDA benchmark nozzle model to support validation of computational fluid dynamics simulations.

    Science.gov (United States)

    Hariharan, Prasanna; Giarra, Matthew; Reddy, Varun; Day, Steven W; Manning, Keefe B; Deutsch, Steven; Stewart, Sandy F C; Myers, Matthew R; Berman, Michael R; Burgreen, Greg W; Paterson, Eric G; Malinauskas, Richard A

    2011-04-01

    This study is part of an FDA-sponsored project to evaluate the use and limitations of computational fluid dynamics (CFD) in assessing blood flow parameters related to medical device safety. In an interlaboratory study, fluid velocities and pressures were measured in a nozzle model to provide experimental validation for a companion round-robin CFD study. The simple benchmark nozzle model, which mimicked the flow fields in several medical devices, consisted of a gradual flow constriction, a narrow throat region, and a sudden expansion region where a fluid jet exited the center of the nozzle with recirculation zones near the model walls. Measurements of mean velocity and turbulent flow quantities were made in the benchmark device at three independent laboratories using particle image velocimetry (PIV). Flow measurements were performed over a range of nozzle throat Reynolds numbers (Re(throat)) from 500 to 6500, covering the laminar, transitional, and turbulent flow regimes. A standard operating procedure was developed for performing experiments under controlled temperature and flow conditions and for minimizing systematic errors during PIV image acquisition and processing. For laminar (Re(throat)=500) and turbulent flow conditions (Re(throat)≥3500), the velocities measured by the three laboratories were similar with an interlaboratory uncertainty of ∼10% at most of the locations. However, for the transitional flow case (Re(throat)=2000), the uncertainty in the size and the velocity of the jet at the nozzle exit increased to ∼60% and was very sensitive to the flow conditions. An error analysis showed that by minimizing the variability in the experimental parameters such as flow rate and fluid viscosity to less than 5% and by matching the inlet turbulence level between the laboratories, the uncertainties in the velocities of the transitional flow case could be reduced to ∼15%. The experimental procedure and flow results from this interlaboratory study (available

  10. Purex optimization by computer simulation

    International Nuclear Information System (INIS)

    Campbell, T.G.; McKibben, J.M.

    1980-08-01

    For the past 2 years computer simulation has been used to study the performance of several solvent extraction banks in the Purex facility at the Savannah River Plant in Aiken, South Carolina. Individual process parameters were varied about their normal base case values to determine their individual effects on concentration profiles and end-stream compositions. The data are presented in graphical form to show the extent to which product losses, decontamination factors, solvent extraction bank inventories of fissile materials, and other key properties are affected by process changes. Presented in this way, the data are useful for adapting flowsheet conditions to a particular feed material or product specification, and for evaluating nuclear safety as related to bank inventories

  11. Computer simulation of spacecraft/environment interaction

    CERN Document Server

    Krupnikov, K K; Mileev, V N; Novikov, L S; Sinolits, V V

    1999-01-01

    This report presents some examples of a computer simulation of spacecraft interaction with the space environment. We analysed a set of data on electron and ion fluxes measured in 1991-1994 on the geostationary satellite GORIZONT-35. The influence of spacecraft eclipse and device eclipse by the solar-cell panel on spacecraft charging was investigated. A simple method was developed for estimating spacecraft potentials in LEO. Effects of various particle flux impacts and spacecraft orientation are discussed. A computer engineering model for the calculation of space radiation is presented. This model is used as a client/server model with a WWW interface, including spacecraft model description and results representation based on the virtual reality markup language.

  12. Computer simulation of spacecraft/environment interaction

    International Nuclear Information System (INIS)

    Krupnikov, K.K.; Makletsov, A.A.; Mileev, V.N.; Novikov, L.S.; Sinolits, V.V.

    1999-01-01

    This report presents some examples of a computer simulation of spacecraft interaction with the space environment. We analysed a set of data on electron and ion fluxes measured in 1991-1994 on the geostationary satellite GORIZONT-35. The influence of spacecraft eclipse and device eclipse by the solar-cell panel on spacecraft charging was investigated. A simple method was developed for estimating spacecraft potentials in LEO. Effects of various particle flux impacts and spacecraft orientation are discussed. A computer engineering model for the calculation of space radiation is presented. This model is used as a client/server model with a WWW interface, including spacecraft model description and results representation based on the virtual reality markup language

  13. Computational Aerodynamic Simulations of an 840 ft/sec Tip Speed Advanced Ducted Propulsor Fan System Model for Acoustic Methods Assessment and Development

    Science.gov (United States)

    Tweedt, Daniel L.

    2014-01-01

    Computational aerodynamic simulations of an 840 ft/sec tip speed, Advanced Ducted Propulsor fan system were performed at five different operating points on the fan operating line, in order to provide detailed internal flow field information for use with fan acoustic prediction methods presently being developed, assessed and validated. The fan system is a sub-scale, low-noise research fan/nacelle model that has undergone extensive experimental testing in the 9- by 15-foot Low Speed Wind Tunnel at the NASA Glenn Research Center, resulting in quality, detailed aerodynamic and acoustic measurement data. Details of the fan geometry, the computational fluid dynamics methods, the computational grids, and various computational parameters relevant to the numerical simulations are discussed. Flow field results for three of the five operating conditions simulated are presented in order to provide a representative look at the computed solutions. Each of the five fan aerodynamic simulations involved the entire fan system, excluding a long core duct section downstream of the core inlet guide vane. As a result, only fan rotational speed and system bypass ratio, set by specifying static pressure downstream of the core inlet guide vane row, were adjusted in order to set the fan operating point, leading to operating points that lie on a fan operating line and making mass flow rate a fully dependent parameter. The resulting mass flow rates are in good agreement with measurement values. The computed blade row flow fields for all five fan operating points are, in general, aerodynamically healthy. Rotor blade and fan exit guide vane flow characteristics are good, including incidence and deviation angles, chordwise static pressure distributions, blade surface boundary layers, secondary flow structures, and blade wakes. Examination of the computed flow fields reveals no excessive boundary layer separations or related secondary-flow problems. A few spanwise comparisons between

  14. Accelerating population balance-Monte Carlo simulation for coagulation dynamics from the Markov jump model, stochastic algorithm and GPU parallel computing

    International Nuclear Information System (INIS)

    Xu, Zuwei; Zhao, Haibo; Zheng, Chuguang

    2015-01-01

    This paper proposes a comprehensive framework for accelerating population balance-Monte Carlo (PBMC) simulation of particle coagulation dynamics. By combining a Markov jump model, a weighted majorant kernel and GPU (graphics processing unit) parallel computing, a significant gain in computational efficiency is achieved. The Markov jump model constructs a coagulation-rule matrix of differentially-weighted simulation particles, so as to capture the time evolution of the particle size distribution with low statistical noise over the full size range and, as far as possible, to reduce the number of time loopings. Here three coagulation rules are highlighted and it is found that constructing an appropriate coagulation rule provides a route to attain a compromise between the accuracy and cost of PBMC methods. Further, in order to avoid double looping over all simulation particles when considering two-particle events (typically, particle coagulation), the weighted majorant kernel is introduced to estimate the maximum coagulation rates used in acceptance-rejection processes by single-looping over all particles, and meanwhile the mean time-step of a coagulation event is estimated by summing the coagulation kernels of rejected and accepted particle pairs. The computational load of these fast differentially-weighted PBMC simulations (based on the Markov jump model) is greatly reduced, becoming proportional to the number of simulation particles in a zero-dimensional system (single cell). Finally, for a spatially inhomogeneous multi-dimensional (multi-cell) simulation, the proposed fast PBMC is performed in each cell, and multiple cells are processed in parallel by the multiple cores of a GPU, which can execute massively threaded data-parallel tasks to obtain a remarkable speedup (compared with CPU computation, the speedup ratio of GPU parallel computing is as high as 200 in a case of 100 cells with 10,000 simulation particles per cell). These accelerating approaches of PBMC are
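
    The single-loop acceptance-rejection step built on the majorant kernel can be sketched as follows (illustrative Python with a simple sum kernel, not the paper's weighted scheme or GPU code):

        import random

        def kernel(v1, v2):
            return v1 + v2                        # illustrative coagulation kernel

        def coagulation_step(volumes):
            """Pick one coagulating pair under a majorant bound and merge it."""
            v_max = max(volumes)
            k_hat = kernel(v_max, v_max)          # majorant: bounds every pair rate
            while True:
                i, j = random.sample(range(len(volumes)), 2)
                if random.random() * k_hat <= kernel(volumes[i], volumes[j]):
                    break                         # accept with probability K/K_hat
            volumes[i] += volumes[j]              # coagulate: merge particle j into i
            del volumes[j]
            return volumes

        particles = coagulation_step([1.0] * 100)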

  15. Computer-Graphical Simulation Of Robotic Welding

    Science.gov (United States)

    Fernandez, Ken; Cook, George

    1988-01-01

    Computer program ROBOSIM, developed to simulate operations of robots, applied to preliminary design of robotic arc-welding operation. Limitations on equipment investigated in advance to prevent expensive mistakes. Computer makes drawing of robotic welder and workpiece on positioning table. Such numerical simulation used to perform rapid, safe experiments in computer-aided design or manufacturing.

  16. QCE : A Simulator for Quantum Computer Hardware

    NARCIS (Netherlands)

    Michielsen, Kristel; Raedt, Hans De

    2003-01-01

    The Quantum Computer Emulator (QCE) described in this paper consists of a simulator of a generic, general purpose quantum computer and a graphical user interface. The latter is used to control the simulator, to define the hardware of the quantum computer and to debug and execute quantum algorithms.
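
    What such an emulator computes internally can be illustrated by a few lines of state-vector arithmetic (a generic sketch, not the QCE interface): a Hadamard followed by a CNOT turns |00> into a Bell state:

        import numpy as np

        H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)
        I = np.eye(2)
        CNOT = np.array([[1, 0, 0, 0],
                         [0, 1, 0, 0],
                         [0, 0, 0, 1],
                         [0, 0, 1, 0]], dtype=float)

        state = np.zeros(4)
        state[0] = 1.0                        # |00>
        state = np.kron(H, I) @ state         # Hadamard on the first qubit
        state = CNOT @ state                  # entangle the two qubits
        print(np.abs(state) ** 2)             # [0.5, 0, 0, 0.5]: a Bell state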

  17. MO-G-17A-04: Internal Dosimetric Calculations for Pediatric Nuclear Imaging Applications, Using Monte Carlo Simulations and High-Resolution Pediatric Computational Models

    Energy Technology Data Exchange (ETDEWEB)

    Papadimitroulas, P; Kagadis, GC [University of Patras, Rion, Ahaia (Greece); Loudos, G [Technical Educational Institute of Athens, Aigaleo, Attiki (Greece)

    2014-06-15

    Purpose: Our purpose is to evaluate the absorbed dose administered in pediatric nuclear imaging studies. Monte Carlo simulations with the incorporation of pediatric computational models can serve as reference for the accurate determination of absorbed dose. The procedure for calculating the dosimetric factors is described, and a dataset of reference doses is created. Methods: Realistic simulations were executed using the GATE toolkit and a series of pediatric computational models developed by the “IT'IS Foundation”. The series of phantoms used in our work includes 6 models in the range of 5–14 years old (3 boys and 3 girls). Pre-processing techniques were applied to the images to incorporate the phantoms in GATE simulations. The resolution of the phantoms was set to 2 mm³. The most important organ densities were simulated according to the GATE “Materials Database”. Several radiopharmaceuticals used in SPECT and PET applications are being tested, following the EANM pediatric dosage protocol. The biodistributions of the several isotopes, used as activity maps in the simulations, were derived from the literature. Results: Initial results of absorbed dose per organ (mGy) are presented for a 5-year-old girl from whole-body exposure to 99mTc-SestaMIBI, 30 minutes after administration. Heart, kidney, liver, ovary, pancreas and brain are the most critical organs, for which the S-factors are calculated. The statistical uncertainty in the simulation procedure was kept lower than 5%. The S-factors for each target organ are calculated in Gy/(MBq·s), with the highest dose being absorbed in kidneys and pancreas (9.29 × 10⁻¹⁰ and 0.15 × 10⁻¹⁰, respectively). Conclusion: An approach for accurate dosimetry on pediatric models is presented, creating a reference dosage dataset for several radionuclides in children's computational models with the advantages of MC techniques. Our study is ongoing, extending our investigation to other reference models and

  18. EMC Simulation and Modeling

    Science.gov (United States)

    Takahashi, Takehiro; Schibuya, Noboru

    The EMC simulation is now widely used in the design stage of electronic equipment to reduce electromagnetic noise. Because the electromagnetic behaviour calculated by an EMC simulator depends on the EMC model of the equipment supplied as input, the modeling technique is important for obtaining useful results. In this paper, a brief outline of the EMC simulator and the EMC model is given. Some modeling techniques for EMC simulation are also described, with an example of an EMC model of a shielded box with an aperture.

  19. Alternative energy technologies an introduction with computer simulations

    CERN Document Server

    Buxton, Gavin

    2014-01-01

    Introduction to Alternative Energy Sources; Global Warming; Pollution; Solar Cells; Wind Power; Biofuels; Hydrogen Production and Fuel Cells; Introduction to Computer Modeling; Brief History of Computer Simulations; Motivation and Applications of Computer Models; Using Spreadsheets for Simulations; Typing Equations into Spreadsheets; Functions Available in Spreadsheets; Random Numbers; Plotting Data; Macros and Scripts; Interpolation and Extrapolation; Numerical Integration and Diffe

  20. Computer simulation of bubble formation

    International Nuclear Information System (INIS)

    Insepov, Z.; Bazhirov, T.; Norman, G.; Stegailov, V.

    2007-01-01

    Properties of liquid metals (Li, Pb, Na) containing nano-scale cavities were studied by atomistic molecular dynamics (MD). Two atomistic models of cavity simulation were developed that cover a wide area of the phase diagram with negative pressure. In the first model, the thermodynamics of cavity formation and stability and the dynamics of cavity evolution in bulk liquid metals were studied. Radial densities, pressures, surface tensions, and work functions of nano-scale cavities of various radii were calculated for liquid Li, Na, and Pb at various temperatures and densities and at small negative pressures near the liquid-gas spinodal; the work functions for cavity formation in liquid Li were compared with the available experimental data. The cavitation rate can further be obtained by using classical nucleation theory (CNT). The second model is based on the stability study and on the kinetics of cavitation of stretched liquid metals. The MD method was used to simulate cavitation in metastable Pb and Li melts and determine the stability limits. States at temperatures below critical (T < 0.5 Tc) and large negative pressures were considered. The kinetic boundary of liquid-phase stability was shown to be different from the spinodal. The kinetics and dynamics of cavitation were studied, and the pressure dependences of cavitation frequencies were obtained for several temperatures. The results of the MD calculations were compared with estimates based on classical nucleation theory. (authors)
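
    The classical nucleation theory estimate invoked above relates the cavitation rate to a free-energy barrier; in the standard form for a stretched liquid (our notation: gamma is the surface tension, Delta p the magnitude of the negative pressure, J_0 a kinetic prefactor):

        J = J_0 \exp\!\left( -\frac{\Delta G^{*}}{k_B T} \right),
        \qquad
        \Delta G^{*} = \frac{16 \pi \gamma^{3}}{3 \,(\Delta p)^{2}}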

  1. Welding simulation of large-diameter thick-walled stainless steel pipe joints. Fast computation of residual stress and influence of heat source model

    International Nuclear Information System (INIS)

    Maekawa, Akira; Serizawa, Hisashi; Nakacho, Keiji; Murakawa, Hidekazu

    2011-01-01

    There are many weld zones in the apparatus and piping installed in nuclear power plants, and the residual stress generated in these zones by the welding process is the most important factor influencing structural integrity. Though weld residual stress is frequently evaluated using numerical simulation, fast simulation techniques have been in demand because of the enormous calculation times required. Recently, fast weld residual stress evaluation based on three-dimensional accurate analysis became available through development of the Iterative Substructure Method (ISM). In this study, the computational performance of the welding simulation code using the ISM was improved to achieve faster computation and more accurate welding simulation. By adding functions such as parallel processing, the computation speed became much faster than that of the conventional finite element method code. Furthermore, the accuracy of the improved code was validated by measurements. The influence of two different weld heat source models on the simulation results was also investigated, and it was found that the moving heat source was effective in achieving accurate weld simulation for multi-pass welds. (author)

  2. Computer simulations applied in materials

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2003-07-01

    This workshop takes stock of the simulation methods applied to nuclear materials and discusses the conditions in which these methods can predict physical results when no experimental data are available. The main topic concerns the radiation effects in oxides and includes also the behaviour of fission products in ceramics, the diffusion and segregation phenomena and the thermodynamical properties under irradiation. This document brings together a report of the previous 2002 workshop and the transparencies of 12 presentations among the 15 given at the workshop: accommodation of uranium and plutonium in pyrochlores; radiation effects in La₂Zr₂O₇ pyrochlores; first principle calculations of defects formation energies in the Y₂(Ti,Sn,Zr)₂O₇ pyrochlore system; an approximate approach to predicting radiation tolerant materials; molecular dynamics study of the structural effects of displacement cascades in UO₂; composition defect maps for A³⁺B³⁺O₃ perovskites; NMR characterization of radiation damaged materials: using simulation to interpret the data; local structure in damaged zircon: a first principle study; simulation studies on SiC; insertion and diffusion of He in 3C-SiC; a review of helium in silica; self-trapped holes in amorphous silicon dioxide: their short-range structure revealed from electron spin resonance and optical measurements and opportunities for inferring intermediate range structure by theoretical modelling. (J.S.)

  3. Computer simulations applied in materials

    International Nuclear Information System (INIS)

    2003-01-01

    This workshop takes stock of the simulation methods applied to nuclear materials and discusses the conditions in which these methods can predict physical results when no experimental data are available. The main topic concerns the radiation effects in oxides and includes also the behaviour of fission products in ceramics, the diffusion and segregation phenomena and the thermodynamical properties under irradiation. This document brings together a report of the previous 2002 workshop and the transparencies of 12 presentations among the 15 given at the workshop: accommodation of uranium and plutonium in pyrochlores; radiation effects in La₂Zr₂O₇ pyrochlores; first principle calculations of defects formation energies in the Y₂(Ti,Sn,Zr)₂O₇ pyrochlore system; an approximate approach to predicting radiation tolerant materials; molecular dynamics study of the structural effects of displacement cascades in UO₂; composition defect maps for A³⁺B³⁺O₃ perovskites; NMR characterization of radiation damaged materials: using simulation to interpret the data; local structure in damaged zircon: a first principle study; simulation studies on SiC; insertion and diffusion of He in 3C-SiC; a review of helium in silica; self-trapped holes in amorphous silicon dioxide: their short-range structure revealed from electron spin resonance and optical measurements and opportunities for inferring intermediate range structure by theoretical modelling. (J.S.)

  4. Cliff retreat and talus development at the caldera wall of Mount St. Helens: Computer simulation using a mathematical model

    OpenAIRE

    Obanawa, Hiroyuki; Matsukura, Yukinori

    2008-01-01

    To simulate the landform evolution at the caldera wall of Mount St. Helens, USA, a mathematical model for talus development was applied to model the topographic change during the 11 years from the volcanic eruption, i.e., from formation of the cliff. Simulated results show that the topographic change is predicted to be large for about 10 years after the eruption and to decline thereafter. If snow accumulation in the talus slope deposits is negligible, the talus top will not reach the cliff top wit...

  5. Computer simulation of language competition by physicists

    OpenAIRE

    Schulze, Christian; Stauffer, Dietrich

    2006-01-01

    Computer simulation of languages is an old subject, but since the paper of Abrams and Strogatz (2003) several physics groups independently took up this field. We shortly review their work and bring more details on our own simulations.

  6. Computer programs for developing source terms for a UF₆ dispersion model to simulate postulated UF₆ releases from buildings

    Energy Technology Data Exchange (ETDEWEB)

    Williams, W.R.

    1985-03-01

    Calculational methods and computer programs for the analysis of source terms for postulated releases of UF₆ are presented. Required thermophysical properties of UF₆, HF, and H₂O are described in detail. UF₆ reacts with moisture in the ambient environment to form HF and H₂O. The coexistence of HF and H₂O significantly alters their pure-component properties, and HF vapor polymerizes. Transient compartment models for simulating UF₆ releases inside gaseous diffusion plant feed and withdrawal buildings and cascade buildings are also described. The basic compartment-model mass and energy balances are supported by simple heat transfer, ventilation system, and deposition models. A model that can simulate either a closed compartment or a steady-state ventilation system is also discussed. The transient compartment models provide output that serves as input to an atmospheric dispersion model.
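
    The report's transient compartment models are not reproduced here; as a minimal sketch of the underlying idea, the code below integrates the mass balance of a single well-mixed, ventilated compartment with a time-limited release. The function name and every numeric value are hypothetical.

        import numpy as np

        def compartment_concentration(release_rate, vent_flow, volume,
                                      t_end=600.0, dt=1.0):
            """Explicit-Euler mass balance for one well-mixed compartment:
            V dC/dt = S(t) - Q*C, with release rate S (kg/s) and ventilation
            flow Q (m^3/s). Returns times (s) and concentrations (kg/m^3)."""
            n = int(t_end / dt)
            t = np.linspace(0.0, t_end, n + 1)
            c = np.zeros(n + 1)
            for i in range(n):
                source = release_rate if t[i] < 120.0 else 0.0   # 2-minute release
                c[i + 1] = c[i] + dt * (source - vent_flow * c[i]) / volume
            return t, c

        t, c = compartment_concentration(release_rate=0.5, vent_flow=2.0, volume=500.0)
        print("peak concentration:", c.max())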

  7. Properties of a soft-core model of methanol: An integral equation theory and computer simulation study

    Science.gov (United States)

    Huš, Matej; Munaò, Gianmarco; Urbic, Tomaz

    2014-01-01

    Thermodynamic and structural properties of a coarse-grained model of methanol are examined by Monte Carlo simulations and reference interaction site model (RISM) integral equation theory. Methanol particles are described as dimers formed from an apolar Lennard-Jones sphere, mimicking the methyl group, and a sphere with a core-softened potential as the hydroxyl group. Different closure approximations of the RISM theory are compared and discussed. The liquid structure of methanol is investigated by calculating site-site radial distribution functions and static structure factors for a wide range of temperatures and densities. The results show good agreement between RISM theory and Monte Carlo simulations. The phase behavior of methanol is investigated by employing different thermodynamic routes for the calculation of the RISM free energy, drawing gas-liquid coexistence curves that match the simulation data. Preliminary indications of a putative second critical point between two different liquid phases of methanol are also discussed. PMID:25362323
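
    To make the two site types concrete, the sketch below evaluates a standard 12-6 Lennard-Jones potential alongside a generic core-softened (shouldered) potential; the exact functional form and parameters used in the paper are not reproduced here, so core_softened is only a stand-in with made-up constants.

        import numpy as np

        def lennard_jones(r, eps=1.0, sigma=1.0):
            """Standard 12-6 Lennard-Jones pair potential (apolar methyl site)."""
            sr6 = (sigma / r) ** 6
            return 4.0 * eps * (sr6 ** 2 - sr6)

        def core_softened(r, eps=1.0, sigma=1.0, lam=0.5, r0=1.5, w=0.3):
            """Generic core-softened potential: LJ plus a Gaussian shoulder,
            standing in for the hydroxyl-site potential of the paper."""
            return lennard_jones(r, eps, sigma) + lam * eps * np.exp(-((r - r0) / w) ** 2)

        r = np.linspace(0.9, 3.0, 5)
        print(core_softened(r))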

  8. Computer Simulation of Developmental Processes and ...

    Science.gov (United States)

    Rationale: Recent progress in systems toxicology and synthetic biology has paved the way to new thinking about in vitro/in silico modeling of developmental processes and toxicities, both for embryological and reproductive impacts. Novel in vitro platforms such as 3D organotypic culture models, engineered microscale tissues and complex microphysiological systems (MPS), together with computational models and computer simulation of tissue dynamics, lend themselves to integrated testing strategies for predictive toxicology. As these emergent methodologies continue to evolve, they must be integrally tied to maternal/fetal physiology and toxicity of the developing individual across early lifestage transitions, from fertilization to birth, through puberty and beyond. Scope: This symposium will focus on how the novel technology platforms can help, now and in the future, with in vitro/in silico modeling of complex biological systems for developmental and reproductive toxicity issues, and with translating systems models into integrative testing strategies. The symposium is based on three main organizing principles: (1) that novel in vitro platforms with human cells configured in nascent tissue architectures with native microphysiological environments yield mechanistic understanding of developmental and reproductive impacts of drug/chemical exposures; (2) that novel in silico platforms with high-throughput screening (HTS) data, biologically-inspired computational models of

  9. Computational modelling of polymers

    Science.gov (United States)

    Celarier, Edward A.

    1991-01-01

    Polymeric materials and polymer/graphite composites show a very diverse range of material properties, many of which make them attractive candidates for a variety of high-performance engineering applications. Their properties are ultimately determined largely by their chemical structure and the conditions under which they are processed. It is the aim of computational chemistry to be able to simulate candidate polymers on a computer and determine what their likely material properties will be. A number of commercially available software packages purport to predict the material properties of samples, given the chemical structures of their constituent molecules. One such system, Cerius, has been in use at LaRC. It comprises a number of modules, each of which performs a different kind of calculation on a molecule in the program's workspace. Of particular interest is evaluating the suitability of this program for aiding the study of microcrystalline polymeric materials. One of the first model systems examined was benzophenone. The results of this investigation are discussed.

  10. Computer simulation in nuclear science and engineering

    International Nuclear Information System (INIS)

    Akiyama, Mamoru; Miya, Kenzo; Iwata, Shuichi; Yagawa, Genki; Kondo, Shusuke; Hoshino, Tsutomu; Shimizu, Akinao; Takahashi, Hiroshi; Nakagawa, Masatoshi.

    1992-01-01

    The numerical simulation technology used for the design of nuclear reactors spans a wide range of scientific fields and has been cultivated through steady efforts toward high calculation accuracy in safety examinations, reliability verification tests, the assessment of operating experience, and so on. Taking the opportunity of the practical application of numerical simulation in many fields, this paper reviews the numerical simulation of five basic equations that describe the natural world and the progress of the related technologies. Numerical simulation technology is expected to contribute not only as a means of design study but also to the progress of science and technology, such as the construction of innovative new concepts and the exploration of new mechanisms and substances for which no models exist in the natural world. The development of atomic energy and the progress of computers, Boltzmann's transport equation, the Navier-Stokes equation, Maxwell's electromagnetic field equations, the Schroedinger wave equation, computational solid mechanics, and probabilistic risk assessment, each with its related fields, are described. (K.I.)

  11. Investigation of Carbohydrate Recognition via Computer Simulation

    Directory of Open Access Journals (Sweden)

    Quentin R. Johnson

    2015-04-01

    Carbohydrate recognition by proteins, such as lectins and other (bio)molecules, can be essential for many biological functions. Interest in it has recently grown due to potential applications in protein and drug design and future bioengineering. A quantitative measurement of carbohydrate-protein interaction is thus important for the full characterization of sugar recognition. In this review, we focus on the use of computer simulations and biophysical models to evaluate the strength and specificity of carbohydrate recognition. With increasing computational resources, better algorithms, and refined modeling parameters, using state-of-the-art supercomputers to calculate the strength of intermolecular interactions has become increasingly mainstream. We review the current state of this technique and its successful applications to protein-sugar interactions in recent years.

  12. Modelling and Simulation: An Overview

    NARCIS (Netherlands)

    M.J. McAleer (Michael); F. Chan (Felix); L. Oxley (Les)

    2013-01-01

    The papers in this special issue of Mathematics and Computers in Simulation cover the following topics: improving judgmental adjustment of model-based forecasts, whether forecast updates are progressive, on a constrained mixture vector autoregressive model, and whether all estimators are born equal.

  13. Computer Code for Nanostructure Simulation

    Science.gov (United States)

    Filikhin, Igor; Vlahovic, Branislav

    2009-01-01

    Due to their small size, nanostructures can have stress and thermal gradients that are larger than any macroscopic analogue. These gradients can lead to specific regions that are susceptible to failure via processes such as plastic deformation by dislocation emission, chemical debonding, and interfacial alloying. A program has been developed that rigorously simulates and predicts optoelectronic properties of nanostructures of virtually any geometrical complexity and material composition. It can be used in simulations of energy level structure, wave functions, density of states of spatially configured phonon-coupled electrons, excitons in quantum dots, quantum rings, quantum ring complexes, and more. The code can be used to calculate stress distributions and thermal transport properties for a variety of nanostructures and interfaces, transport and scattering at nanoscale interfaces and surfaces under various stress states, and alloy compositional gradients. The code allows users to perform modeling of charge transport processes through quantum-dot (QD) arrays as functions of inter-dot distance, array order versus disorder, QD orientation, shape, size, and chemical composition for applications in photovoltaics and physical properties of QD-based biochemical sensors. The code can be used to study the hot exciton formation/relaxation dynamics in arrays of QDs of different shapes and sizes at different temperatures. It also can be used to understand the relation among the deposition parameters and inherent stresses, strain deformation, heat flow, and failure of nanostructures.

  14. New Computer Simulations of Macular Neural Functioning

    Science.gov (United States)

    Ross, Muriel D.; Doshay, D.; Linton, S.; Parnas, B.; Montgomery, K.; Chimento, T.

    1994-01-01

    We use high performance graphics workstations and supercomputers to study the functional significance of the three-dimensional (3-D) organization of gravity sensors. These sensors have a prototypic architecture foreshadowing more complex systems. Scaled-down simulations run on a Silicon Graphics workstation and scaled-up, 3-D versions run on a Cray Y-MP supercomputer. A semi-automated method of reconstruction of neural tissue from serial sections studied in a transmission electron microscope has been developed to eliminate tedious conventional photography. The reconstructions use a mesh as a step in generating a neural surface for visualization. Two meshes are required to model calyx surfaces. The meshes are connected and the resulting prisms represent the cytoplasm and the bounding membranes. A finite volume analysis method is employed to simulate voltage changes along the calyx in response to synapse activation on the calyx or on calyceal processes. The finite volume method ensures that charge is conserved at the calyx-process junction. These and other models indicate that efferent processes act as voltage followers, and that the morphology of some afferent processes affects their functioning. In a final application, morphological information is symbolically represented in three dimensions in a computer. The possible functioning of the connectivities is tested using mathematical interpretations of physiological parameters taken from the literature. Symbolic, 3-D simulations are in progress to probe the functional significance of the connectivities. This research is expected to advance computer-based studies of macular functioning and of synaptic plasticity.

  15. Computer simulation of sputtering: A review

    International Nuclear Information System (INIS)

    Robinson, M.T.; Hou, M.

    1992-08-01

    In 1986, H. H. Andersen reviewed attempts to understand sputtering by computer simulation and identified several areas where further research was needed: potential energy functions for molecular dynamics (MD) modelling; the role of inelastic effects on sputtering, especially near the target surface; the modelling of surface binding in models based on the binary collision approximation (BCA); aspects of cluster emission in MD models; and angular distributions of sputtered particles. To these may be added kinetic energy distributions of sputtered particles and the relationships between MD and BCA models, as well as the development of intermediate models. Many of these topics are discussed. Recent advances in BCA modelling include the explicit evaluation of the time in strict BCA codes and the development of intermediate codes able to simulate certain many-particle problems realistically. Developments in MD modelling include the widespread use of many-body potentials in sputtering calculations, inclusion of realistic electron excitation and electron-phonon interactions, and several studies of cluster ion impacts on solid surfaces

  16. The possibility of coexistence and co-development in language competition: ecology-society computational model and simulation.

    Science.gov (United States)

    Yun, Jian; Shang, Song-Chao; Wei, Xiao-Dan; Liu, Shuang; Li, Zhi-Jie

    2016-01-01

    Language is characterized by both ecological and social properties, and competition is the basic form of language evolution. The rise and decline of a language is a result of competition between languages, and this rise and decline directly influences the diversity of human culture. Mathematical and computational modeling of language competition has been a popular topic in linguistics, mathematics, computer science, ecology, and other disciplines. Currently, there are several problems in research on language competition modeling. First, comprehensive mathematical analysis is absent in most studies of language competition models. Next, most language competition models are based on the assumption that one language in the model is stronger than the other; these studies tend to ignore cases where there is a balance of power in the competition. The competition between two well-matched languages is more practical, because it can facilitate the co-development of the two languages. A third issue is that in many models the evolution inevitably ends with the weaker language going extinct. From the integrated point of view of ecology and sociology, this paper improves the Lotka-Volterra model and the basic reaction-diffusion model to propose an "ecology-society" computational model for describing language competition. Furthermore, a strict and comprehensive mathematical analysis was made of the stability of the equilibria. Two languages in competition may be either well-matched or greatly different in strength, which was reflected in the experimental design. The results revealed that language coexistence, and even co-development, are likely to occur during language competition.
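
    As a minimal sketch of the Lotka-Volterra starting point that the paper builds on (its "ecology-society" model adds social and diffusion terms), the code below integrates the classical two-competitor system; when the competition coefficients satisfy a12*a21 < 1, the two "languages" settle into stable coexistence. All rates are illustrative.

        def lv_competition(x0, y0, r1=0.04, r2=0.04, K1=1.0, K2=1.0,
                           a12=0.6, a21=0.6, dt=0.1, steps=5000):
            """Explicit-Euler integration of
            dx/dt = r1*x*(1 - (x + a12*y)/K1),
            dy/dt = r2*y*(1 - (y + a21*x)/K2)."""
            x, y = x0, y0
            for _ in range(steps):
                dx = r1 * x * (1.0 - (x + a12 * y) / K1)
                dy = r2 * y * (1.0 - (y + a21 * x) / K2)
                x, y = x + dt * dx, y + dt * dy
            return x, y

        print(lv_competition(0.2, 0.1))   # both speaker populations persist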

  17. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    Science.gov (United States)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, compared with traditional hosting, cloud computing is at a disadvantage in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized, network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing for various applications.
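
    As a minimal illustration of the discrete event simulation technique named here (not the authors' actual two-part model), the sketch below simulates a single virtualized server receiving Poisson request arrivals with exponential service times and reports the mean response time.

        import random

        def single_server_sim(arrival_rate=8.0, service_rate=10.0,
                              n_jobs=10000, seed=1):
            """Event-by-event M/M/1 simulation: each request arrives, waits if
            the server is busy, is served, and its response time is recorded."""
            rng = random.Random(seed)
            clock = server_free_at = total_response = 0.0
            for _ in range(n_jobs):
                clock += rng.expovariate(arrival_rate)     # next request arrives
                start = max(clock, server_free_at)         # queue if server busy
                server_free_at = start + rng.expovariate(service_rate)
                total_response += server_free_at - clock
            return total_response / n_jobs

        # M/M/1 theory predicts a mean response time of 1/(mu - lambda) = 0.5
        print("mean response time:", single_server_sim())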

  18. A mathematical model of neuromuscular adaptation to resistance training and its application in a computer simulation of accommodating loads.

    Science.gov (United States)

    Arandjelović, Ognjen

    2010-10-01

    A large corpus of data obtained by means of empirical study of neuromuscular adaptation is currently of limited use to athletes and their coaches. One of the reasons lies in the unclear direct practical utility of many individual trials. This paper introduces a mathematical model of adaptation to resistance training, which derives its elements from physiological fundamentals on the one side and empirical findings on the other. The key element of the proposed model is what is here termed the athlete's capability profile. This is a generalization of the length- and velocity-dependent force production characteristics of individual muscles to an exercise with arbitrary biomechanics. The capability profile, a two-dimensional function over the capability plane, plays the central role in the proposed model of the training-adaptation feedback loop. Together with a dynamic model of resistance, the capability profile is used in the model's predictive stage, when exercise performance is simulated using a numerical approximation of the differential equations of motion. Simulation results are used to infer the adaptational stimulus, which manifests itself through a fed-back modification of the capability profile. It is shown how empirical evidence of exercise specificity can be formulated mathematically and integrated in this framework. A detailed description of the proposed model is followed by examples of its application: new insights into the effects of accommodating loading for powerlifting are demonstrated. This is followed by a discussion of the limitations of the proposed model and an overview of avenues for future work.
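
    As a hedged sketch of the model's predictive stage, the code below integrates a one-dimensional lift against gravity plus an elastic-band (accommodating) load; the velocity-dependent capability function is entirely made up and merely stands in for the paper's capability profile.

        def simulate_lift(capability, band_k=0.0, mass=100.0, rom=0.5,
                          g=9.81, dt=1e-3):
            """Integrate the equation of motion of a 1-D lift: net force is the
            athlete's position/velocity-dependent force capability minus the
            weight and the band tension. Returns time to lockout, or None."""
            x = v = t = 0.0
            while x < rom and t < 5.0:
                F = capability(x, v) - mass * g - band_k * x
                v += dt * F / mass
                x += dt * max(v, 0.0)
                t += dt
            return t if x >= rom else None

        cap = lambda x, v: 1400.0 * (1.0 - 0.3 * v)   # weaker at higher velocity
        print("time to lockout:", simulate_lift(cap, band_k=600.0), "s")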

  19. Simulation - modeling - experiment

    International Nuclear Information System (INIS)

    2004-01-01

    After two workshops held in 2001 on the same topics, and in order to take stock of the advances in the domain of simulation and measurements, the main goals proposed for this workshop are: the presentation of the state of the art of tools, methods and experiments in the domains of interest of the Gedepeon research group, and the exchange of information about the possibilities of use of computer codes and facilities, about the understanding of physical and chemical phenomena, and about development and experiment needs. This document gathers 18 presentations (slides) among the 19 given at this workshop, dealing with: deterministic and stochastic codes in reactor physics (Rimpault G.); MURE: an evolution code coupled with MCNP (Meplan O.); neutronic calculation of future reactors at EdF (Lecarpentier D.); advance status of the MCNP/TRIO-U neutronic/thermal-hydraulics coupling (Nuttin A.); the FLICA4/TRIPOLI4 thermal-hydraulics/neutronics coupling (Aniel S.); perturbation methods and sensitivity analysis of nuclear data in reactor physics, with application to the VENUS-2 experimental reactor (Bidaud A.); modeling for the reliability improvement of an ADS accelerator (Biarotte J.L.); residual gas compensation of the space charge of intense beams (Ben Ismail A.); experimental determination and numerical modeling of phase equilibrium diagrams of interest in nuclear applications (Gachon J.C.); modeling of irradiation effects (Barbu A.); elastic limit and irradiation damage in Fe-Cr alloys: simulation and experiment (Pontikis V.); experimental measurements of spallation residues, comparison with Monte Carlo simulation codes (Fallot M.); the spallation target-reactor coupling (Rimpault G.); tools and data (Grouiller J.P.); models in high-energy transport codes: status and perspective (Leray S.); other ways of investigation for spallation (Audoin L.); neutron and light-particle production at intermediate energies (20-200 MeV) with iron, lead and uranium targets (Le Colley F

  20. Distributed simulation a model driven engineering approach

    CERN Document Server

    Topçu, Okan; Oğuztüzün, Halit; Yilmaz, Levent

    2016-01-01

    Backed by substantive case studies, the novel approach to software engineering for distributed simulation outlined in this text demonstrates the potent synergies between model-driven techniques, simulation, intelligent agents, and computer systems development.

  1. Framework for utilizing computational devices within simulation

    Directory of Open Access Journals (Sweden)

    Miroslav Mintál

    2013-12-01

    Nowadays there exist several frameworks for utilizing the computational power of graphics cards and other computational devices such as FPGAs, ARM and multi-core processors. The best-known ones are either low-level, requiring a lot of controlling code, or are bound to specific graphics cards. Furthermore, more specialized frameworks exist, mainly aimed at the mathematical field. The framework described here is tailored for use in multi-agent simulations, where it provides an option to accelerate computations when preparing a simulation and, mainly, to accelerate the computation of the simulation itself.

  2. An integrated approach for the knowledge discovery in computer simulation models with a multi-dimensional parameter space

    Science.gov (United States)

    Khawli, Toufik Al; Gebhardt, Sascha; Eppelt, Urs; Hermanns, Torsten; Kuhlen, Torsten; Schulz, Wolfgang

    2016-06-01

    In production industries, parameter identification, sensitivity analysis and multi-dimensional visualization are vital steps in the planning process for achieving optimal designs and gaining valuable information. Sensitivity analysis and visualization can help in identifying the most influential parameters, quantifying their contribution to the model output, reducing the model complexity, and enhancing the understanding of the model behavior. Typically, this requires a large number of simulations, which can be both very expensive and time consuming when the simulation models are numerically complex and the number of parameter inputs increases. There are three main constituent parts in this work. The first part is to substitute the numerical, physical model by an accurate surrogate model, the so-called metamodel. The second part includes a multi-dimensional visualization approach for the visual exploration of metamodels. In the third part, the metamodel is used to provide two global sensitivity measures: i) the Elementary Effect for screening the parameters, and ii) the variance decomposition method for calculating the Sobol indices that quantify both the main and interaction effects. The application of the proposed approach is illustrated with an industrial application whose goal is to optimize a drilling process using a Gaussian laser beam.
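
    The two sensitivity measures named in this record are standard; as a sketch of the first, the code below computes Morris-style elementary effects for a stand-in surrogate function. Both the surrogate and all settings are hypothetical, not the fitted metamodel of the drilling application.

        import numpy as np

        def metamodel(x):
            """Stand-in surrogate; the real metamodel is fitted to simulations."""
            return np.sin(x[0]) + 7.0 * np.sin(x[1]) ** 2 + 0.1 * x[2] ** 4 * np.sin(x[0])

        def elementary_effects(f, dim=3, samples=200, delta=0.1, seed=0):
            """Morris screening: mean absolute elementary effect per parameter
            over random base points in the unit cube; larger values flag
            more influential inputs."""
            rng = np.random.default_rng(seed)
            mu_star = np.zeros(dim)
            for _ in range(samples):
                x = rng.uniform(0.0, 1.0 - delta, dim)
                fx = f(x)
                for i in range(dim):
                    xp = x.copy()
                    xp[i] += delta
                    mu_star[i] += abs((f(xp) - fx) / delta)
            return mu_star / samples

        print(elementary_effects(metamodel))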

  3. An integrated approach for the knowledge discovery in computer simulation models with a multi-dimensional parameter space

    Energy Technology Data Exchange (ETDEWEB)

    Khawli, Toufik Al; Eppelt, Urs; Hermanns, Torsten [RWTH Aachen University, Chair for Nonlinear Dynamics, Steinbachstr. 15, 52047 Aachen (Germany); Gebhardt, Sascha [RWTH Aachen University, Virtual Reality Group, IT Center, Seffenter Weg 23, 52074 Aachen (Germany); Kuhlen, Torsten [Forschungszentrum Jülich GmbH, Institute for Advanced Simulation (IAS), Jülich Supercomputing Centre (JSC), Wilhelm-Johnen-Straße, 52425 Jülich (Germany); Schulz, Wolfgang [Fraunhofer, ILT Laser Technology, Steinbachstr. 15, 52047 Aachen (Germany)

    2016-06-08

    In production industries, parameter identification, sensitivity analysis and multi-dimensional visualization are vital steps in the planning process for achieving optimal designs and gaining valuable information. Sensitivity analysis and visualization can help in identifying the most influential parameters, quantifying their contribution to the model output, reducing the model complexity, and enhancing the understanding of the model behavior. Typically, this requires a large number of simulations, which can be both very expensive and time consuming when the simulation models are numerically complex and the number of parameter inputs increases. There are three main constituent parts in this work. The first part is to substitute the numerical, physical model by an accurate surrogate model, the so-called metamodel. The second part includes a multi-dimensional visualization approach for the visual exploration of metamodels. In the third part, the metamodel is used to provide two global sensitivity measures: i) the Elementary Effect for screening the parameters, and ii) the variance decomposition method for calculating the Sobol indices that quantify both the main and interaction effects. The application of the proposed approach is illustrated with an industrial application whose goal is to optimize a drilling process using a Gaussian laser beam.

  4. A Computational Framework for Bioimaging Simulation

    Science.gov (United States)

    Watabe, Masaki; Arjunan, Satya N. V.; Fukushima, Seiya; Iwamoto, Kazunari; Kozuka, Jun; Matsuoka, Satomi; Shindo, Yuki; Ueda, Masahiro; Takahashi, Koichi

    2015-01-01

    Using bioimaging technology, biologists have attempted to identify and document analytical interpretations that underlie biological phenomena in biological cells. Theoretical biology aims at distilling those interpretations into knowledge in the mathematical form of biochemical reaction networks and understanding how higher-level functions emerge from the combined action of biomolecules. However, there still remain formidable challenges in bridging the gap between bioimaging and mathematical modeling. Generally, measurements using fluorescence microscopy systems are influenced by systematic effects that arise from the stochastic nature of biological cells, the imaging apparatus, and optical physics. Such systematic effects are always present in all bioimaging systems and hinder quantitative comparison between the cell model and bioimages. Computational tools for such a comparison are still unavailable. Thus, in this work, we present a computational framework for handling the parameters of the cell models and the optical physics governing bioimaging systems. Simulation using this framework can generate digital images of cell simulation results after accounting for the systematic effects. We then demonstrate that such a framework enables comparison at the level of photon-counting units. PMID:26147508
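
    As a minimal sketch of the kind of systematic effect such a framework must account for, the code below turns an "ideal" photon-flux map from a cell model into a noisy digital image via Poisson shot noise and a simple gain/offset/read-noise camera model; the published framework also models the optics, and every name and value here is illustrative.

        import numpy as np

        def camera_image(photon_flux, exposure=0.03, gain=2.0,
                         offset=100.0, read_noise=1.5, seed=0):
            """Photon-counting image formation: Poisson shot noise on the
            expected photon count, then detector gain, offset and Gaussian
            readout noise (a deliberately simplified camera model)."""
            rng = np.random.default_rng(seed)
            photons = rng.poisson(photon_flux * exposure)       # shot noise
            return gain * photons + offset + rng.normal(0.0, read_noise,
                                                        photon_flux.shape)

        flux = np.full((4, 4), 500.0)   # photons/s per pixel from the cell model
        print(camera_image(flux).round(1))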

  5. A Computational Framework for Bioimaging Simulation.

    Directory of Open Access Journals (Sweden)

    Masaki Watabe

    Using bioimaging technology, biologists have attempted to identify and document analytical interpretations that underlie biological phenomena in biological cells. Theoretical biology aims at distilling those interpretations into knowledge in the mathematical form of biochemical reaction networks and understanding how higher-level functions emerge from the combined action of biomolecules. However, there still remain formidable challenges in bridging the gap between bioimaging and mathematical modeling. Generally, measurements using fluorescence microscopy systems are influenced by systematic effects that arise from the stochastic nature of biological cells, the imaging apparatus, and optical physics. Such systematic effects are always present in all bioimaging systems and hinder quantitative comparison between the cell model and bioimages. Computational tools for such a comparison are still unavailable. Thus, in this work, we present a computational framework for handling the parameters of the cell models and the optical physics governing bioimaging systems. Simulation using this framework can generate digital images of cell simulation results after accounting for the systematic effects. We then demonstrate that such a framework enables comparison at the level of photon-counting units.

  6. Efficient SDH Computation In Molecular Simulations Data.

    Science.gov (United States)

    Tu, Yi-Cheng; Chen, Shaoping; Pandit, Sagar; Kumar, Anand; Grupcev, Vladimir

    2012-10-01

    Analysis of large particle or molecular simulation data is an integral part of the basic-science research community. It often involves computing functions such as point-to-point interactions of particles. The spatial distance histogram (SDH) is one such vital computation in scientific discovery. SDH is frequently used to compute the radial distribution function (RDF), and it takes quadratic time to compute with the naive approach. Naive SDH computation is even more expensive when it is computed continuously over a period of time to analyze simulation systems. Tree-based SDH computation is a popular approach. In this paper we look at different tree-based SDH computation techniques and briefly discuss their performance. We present different strategies to improve the performance of these techniques. Specifically, we study the density map (DM) based SDH computation techniques. A DM is essentially a grid dividing the simulated space into cells (3D cubes) of equal size (volume), which can be easily implemented by augmenting a Quad-tree (or Oct-tree) index. DMs are used in various configurations to compute SDH continuously over snapshots of the simulation system. The performance improvements achieved with some of these configurations are presented in this paper. We also present the effect of utilizing the computational power of Graphics Processing Units (GPUs) in computing SDH.
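
    For reference, the naive quadratic-time SDH that the tree- and density-map-based methods improve upon fits in a few lines; the sketch below is that baseline, not the authors' DM algorithm.

        import numpy as np

        def sdh_naive(points, n_buckets=50, r_max=20.0):
            """O(n^2) spatial distance histogram: bucket every pairwise
            distance into equal-width bins over [0, r_max]."""
            n = len(points)
            d = np.sqrt(((points[:, None, :] - points[None, :, :]) ** 2).sum(-1))
            pair_d = d[np.triu_indices(n, k=1)]        # each pair counted once
            return np.histogram(pair_d, bins=n_buckets, range=(0.0, r_max))

        pts = np.random.default_rng(2).uniform(0.0, 10.0, size=(500, 3))
        hist, edges = sdh_naive(pts, r_max=10.0 * np.sqrt(3.0))
        print(hist.sum(), "pairs =", 500 * 499 // 2)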

  7. Dosimetric reconstruction of a radiological accident by numerical simulation, associating an anthropomorphic model and a Monte Carlo computation code

    International Nuclear Information System (INIS)

    Courageot, Estelle

    2010-01-01

    After a description of the context of radiological accidents (definition, history, context, exposure types, associated clinical symptoms of irradiation and contamination, medical treatment, operational feedback) and a presentation of dose assessment in the case of external exposure (clinical, biological and physical dosimetry), this research thesis describes the principles of numerical reconstruction of a radiological accident, presents some computation codes (the Monte Carlo method, the MCNPX code) and the SESAME tool, and reports an application to an actual case (an accident which occurred in Ecuador in April 2009). The next part reports the developments performed to modify the posture of voxelized phantoms, and their experimental and numerical validations. The last part reports a feasibility study of the reconstruction of radiological accidents occurring in external radiotherapy. This work is based on a Monte Carlo simulation of a linear accelerator, with the aim of identifying the most relevant parameters to be implemented in SESAME for the case of external radiotherapy

  8. Development of a Computer Simulation for a Car Deceleration ...

    African Journals Online (AJOL)

    This is a very practical, technical event, and it happens every day. In this paper, we study the factors responsible for it. Using a computer simulation based on a mathematical model, we implemented a simulation of a car braking model and showed how long it takes a car to come to rest while considering certain ...

  9. The effects of simulating a realistic eye model on the eye dose of an adult male undergoing head computed tomography.

    Science.gov (United States)

    Akhlaghi, Parisa; Ebrahimi-Khankook, Atiyeh; Vejdani-Noghreiyan, Alireza

    2017-05-01

    In head computed tomography, radiation incident on the eye lens (an organ of high radiosensitivity) may cause lenticular opacity and cataracts. Therefore, quantitative dose assessment for the exposure of the eye lens and surrounding tissue is a matter of concern. For this purpose, an accurate eye model with realistic geometry and shape, in which the different eye substructures are considered, is needed. To calculate the absorbed radiation dose of the visual organs during head computed tomography scans, in this study an existing sophisticated eye model was inserted at the corresponding location in the head of the reference adult male phantom recommended by the International Commission on Radiological Protection (ICRP). Absorbed doses and distributions of energy deposition in different parts of this eye model were then calculated and compared with those based on a previous simple eye model. All calculations were done using the Monte Carlo code MCNP4C for tube voltages of 80, 100, 120 and 140 kVp. Despite the similarity of the total dose to the eye lens for both eye models, the dose delivered to the sensitive zone, which plays an important role in the induction of cataracts, was on average 3% higher for the sophisticated model than for the simple model. With increasing tube voltage, the difference in the total dose to the eye lens between the two phantoms decreases to 1%. Given this level of agreement, use of the sophisticated eye model for patient dosimetry is not strictly necessary; it remains helpful, however, for estimating the doses received by the different eye substructures separately.

  10. Computer Simulations of Molecular Propellers

    National Research Council Canada - National Science Library

    Vacek, Jaroslav

    1999-01-01

    ...). The new program will be used to explore computationally a variety of possible structures for the synthesis of new materials capable of acquiring significant internal mechanical angular momentum along...

  11. Simulation modeling and arena

    CERN Document Server

    Rossetti, Manuel D

    2015-01-01

    Emphasizes a hands-on approach to learning statistical analysis and model building through the use of comprehensive examples, problems sets, and software applications With a unique blend of theory and applications, Simulation Modeling and Arena®, Second Edition integrates coverage of statistical analysis and model building to emphasize the importance of both topics in simulation. Featuring introductory coverage on how simulation works and why it matters, the Second Edition expands coverage on static simulation and the applications of spreadsheets to perform simulation. The new edition als

  12. Computer simulation of surface and film processes

    Science.gov (United States)

    Tiller, W. A.; Halicioglu, M. T.

    1984-01-01

    All of the investigations performed employed, in one way or another, a computer simulation technique based on atomistic-level considerations. In general, three types of simulation methods were used for modeling systems of discrete particles that interact via well-defined potential functions: molecular dynamics (a general method for solving the classical equations of motion of a model system); Monte Carlo (the use of a Markov-chain ensemble-averaging technique to model equilibrium properties of a system); and molecular statics (which provides properties of a system at T = 0 K). The effects of three-body forces on the vibrational frequencies of a triatomic cluster were investigated. The multilayer relaxation phenomena for low-index planes of an fcc crystal were also analyzed as a function of the three-body interactions. Various surface properties for the Si and SiC systems were calculated. Results obtained from static simulation calculations of slip formation were presented. The more elaborate molecular dynamics calculations on the propagation of cracks in two-dimensional systems were outlined.
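
    As a minimal sketch of the first of the three listed methods, the code below advances a particle with the standard velocity Verlet molecular dynamics integrator; the harmonic force is a stand-in for a real interatomic potential.

        import numpy as np

        def velocity_verlet(pos, vel, force_fn, mass=1.0, dt=1e-3, steps=1000):
            """Velocity Verlet: the classic time-reversible integrator for the
            classical equations of motion of a model system."""
            f = force_fn(pos)
            for _ in range(steps):
                vel = vel + 0.5 * dt * f / mass
                pos = pos + dt * vel
                f = force_fn(pos)
                vel = vel + 0.5 * dt * f / mass
            return pos, vel

        # Harmonic oscillator, F = -k x with k = 4: exact solution is cos(2t)
        pos, vel = velocity_verlet(np.array([1.0]), np.array([0.0]),
                                   lambda x: -4.0 * x)
        print(pos, vel)   # close to cos(2) and -2*sin(2) at t = 1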

  13. Computer simulation of nonequilibrium processes

    International Nuclear Information System (INIS)

    Hoover, W.G.; Moran, B.; Holian, B.L.; Posch, H.A.; Bestiale, S.

    1987-01-01

    Recent atomistic simulations of irreversible macroscopic hydrodynamic flows are illustrated. An extension of Nosé's reversible atomistic mechanics makes it possible to simulate such non-equilibrium systems with completely reversible equations of motion. The new techniques show that macroscopic irreversibility is a natural, inevitable consequence of time-reversible, Lyapunov-unstable microscopic equations of motion

  14. Vernier caliper and micrometer computer models using Easy Java Simulation and its pedagogical design features—ideas for augmenting learning with real instruments

    Science.gov (United States)

    Wee, Loo Kang; Tiang Ning, Hwee

    2014-09-01

    This paper presents the customization of Easy Java Simulation models, used with actual laboratory instruments, to create active experiential learning for measurements. The laboratory instruments are the vernier caliper and the micrometer. Three computer model design ideas that complement real equipment are discussed. These ideas involve (1) a simple two-dimensional view for learning from pen and paper questions and the real world; (2) hints, answers, different scale options and the inclusion of zero error; (3) assessment for learning feedback. The initial positive feedback from Singaporean students and educators indicates that these tools could be successfully shared and implemented in learning communities. Educators are encouraged to change the source code for these computer models to suit their own purposes; they have creative commons attribution licenses for the benefit of all.
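
    The scoring logic such a simulator needs is compact; a minimal sketch, assuming a metric vernier caliper with a 0.02 mm least count and a zero error expressed in vernier divisions (all values illustrative):

        def vernier_reading(main_mm, vernier_div, zero_error_div=0,
                            least_count=0.02):
            """Main-scale value plus the coinciding vernier division times the
            least count, corrected for zero error (positive error subtracts)."""
            raw = main_mm + vernier_div * least_count
            return raw - zero_error_div * least_count

        # Main scale at 23 mm, 7th vernier line coincides, +2-division zero error
        print(round(vernier_reading(23.0, 7, zero_error_div=2), 2), "mm")   # 23.1 mm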

  15. Information diffusion, Facebook clusters, and the simplicial model of social aggregation: a computational simulation of simplicial diffusers for community health interventions.

    Science.gov (United States)

    Kee, Kerk F; Sparks, Lisa; Struppa, Daniele C; Mannucci, Mirco A; Damiano, Alberto

    2016-01-01

    By integrating the simplicial model of social aggregation with existing research on opinion leadership and diffusion networks, this article introduces the constructs of simplicial diffusers (mathematically defined as nodes embedded in simplexes; a simplex is a socially bonded cluster) and simplicial diffusing sets (mathematically defined as minimal covers of a simplicial complex; a simplicial complex is a social aggregation in which socially bonded clusters are embedded) to propose a strategic approach for information diffusion of cancer screenings as a health intervention on Facebook for community cancer prevention and control. This approach is novel in its incorporation of interpersonally bonded clusters, culturally distinct subgroups, and different united social entities that coexist within a larger community into a computational simulation to select sets of simplicial diffusers with the highest degree of information diffusion for health intervention dissemination. The unique contributions of the article also include seven propositions and five algorithmic steps for computationally modeling the simplicial model with Facebook data.
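
    The simplicial diffusing sets are defined as minimal covers of a simplicial complex; as a greedy stand-in for the authors' five algorithmic steps, the sketch below covers the maximal cliques (the socially bonded clusters) of a small graph, picking at each step the node that covers the most still-uncovered cliques.

        import networkx as nx

        def greedy_diffusing_set(G):
            """Greedy approximation of a diffusing set: treat each maximal
            clique with >= 3 members as a bonded simplex and cover them all."""
            simplexes = [set(c) for c in nx.find_cliques(G) if len(c) >= 3]
            diffusers = []
            while simplexes:
                best = max(G.nodes, key=lambda v: sum(v in s for s in simplexes))
                diffusers.append(best)
                simplexes = [s for s in simplexes if best not in s]
            return diffusers

        G = nx.Graph([(1, 2), (2, 3), (1, 3), (3, 4), (4, 5), (5, 6), (4, 6), (6, 7)])
        print(greedy_diffusing_set(G))   # one member from each triangle cluster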

  16. Validation of a Monte Carlo model used for simulating tube current modulation in computed tomography over a wide range of phantom conditions/challenges

    Energy Technology Data Exchange (ETDEWEB)

    Bostani, Maryam, E-mail: mbostani@mednet.ucla.edu; McMillan, Kyle; Cagnon, Chris H.; McNitt-Gray, Michael F. [Departments of Biomedical Physics and Radiology, David Geffen School of Medicine, University of California, Los Angeles, Los Angeles, California 90024 (United States); DeMarco, John J. [Department of Radiation Oncology, University of California, Los Angeles, Los Angeles, California 90095 (United States)

    2014-11-01

    Purpose: Monte Carlo (MC) simulation methods have been widely used in patient dosimetry in computed tomography (CT), including estimating patient organ doses. However, most simulation methods have undergone a limited set of validations, often using homogeneous phantoms with simple geometries. As clinical scanning has become more complex and the use of tube current modulation (TCM) has become pervasive in the clinic, MC simulations should include these techniques in their methodologies and therefore should also be validated using a variety of phantoms with different shapes and material compositions to result in a variety of differently modulated tube current profiles. The purpose of this work is to perform the measurements and simulations to validate a Monte Carlo model under a variety of test conditions where fixed tube current (FTC) and TCM were used. Methods: A previously developed MC model for estimating dose from CT scans that models TCM, built using the platform of MCNPX, was used for CT dose quantification. In order to validate the suitability of this model to accurately simulate patient dose from FTC and TCM CT scans, measurements and simulations were compared over a wide range of conditions. Phantoms used for testing range from simple geometries with homogeneous composition (16 and 32 cm computed tomography dose index phantoms) to more complex phantoms including a rectangular homogeneous water equivalent phantom, an elliptical shaped phantom with three sections (where each section was a homogeneous, but different material), and a heterogeneous, complex geometry anthropomorphic phantom. Each phantom requires varying levels of x-, y- and z-modulation. Each phantom was scanned on a multidetector row CT (Sensation 64) scanner under the conditions of both FTC and TCM. Dose measurements were made at various surface and depth positions within each phantom. Simulations using each phantom were performed for FTC, detailed x–y–z TCM, and z-axis-only TCM to obtain dose estimates.

  17. Validation of a Monte Carlo model used for simulating tube current modulation in computed tomography over a wide range of phantom conditions/challenges.

    Science.gov (United States)

    Bostani, Maryam; McMillan, Kyle; DeMarco, John J; Cagnon, Chris H; McNitt-Gray, Michael F

    2014-11-01

    Monte Carlo (MC) simulation methods have been widely used in patient dosimetry in computed tomography (CT), including estimating patient organ doses. However, most simulation methods have undergone a limited set of validations, often using homogeneous phantoms with simple geometries. As clinical scanning has become more complex and the use of tube current modulation (TCM) has become pervasive in the clinic, MC simulations should include these techniques in their methodologies and therefore should also be validated using a variety of phantoms with different shapes and material compositions to result in a variety of differently modulated tube current profiles. The purpose of this work is to perform the measurements and simulations to validate a Monte Carlo model under a variety of test conditions where fixed tube current (FTC) and TCM were used. A previously developed MC model for estimating dose from CT scans that models TCM, built using the platform of mcnpx, was used for CT dose quantification. In order to validate the suitability of this model to accurately simulate patient dose from FTC and TCM CT scan, measurements and simulations were compared over a wide range of conditions. Phantoms used for testing range from simple geometries with homogeneous composition (16 and 32 cm computed tomography dose index phantoms) to more complex phantoms including a rectangular homogeneous water equivalent phantom, an elliptical shaped phantom with three sections (where each section was a homogeneous, but different material), and a heterogeneous, complex geometry anthropomorphic phantom. Each phantom requires varying levels of x-, y- and z-modulation. Each phantom was scanned on a multidetector row CT (Sensation 64) scanner under the conditions of both FTC and TCM. Dose measurements were made at various surface and depth positions within each phantom. Simulations using each phantom were performed for FTC, detailed x-y-z TCM, and z-axis-only TCM to obtain dose estimates. This

  18. Validation of a Monte Carlo model used for simulating tube current modulation in computed tomography over a wide range of phantom conditions/challenges

    International Nuclear Information System (INIS)

    Bostani, Maryam; McMillan, Kyle; Cagnon, Chris H.; McNitt-Gray, Michael F.; DeMarco, John J.

    2014-01-01

    Purpose: Monte Carlo (MC) simulation methods have been widely used in patient dosimetry in computed tomography (CT), including estimating patient organ doses. However, most simulation methods have undergone a limited set of validations, often using homogeneous phantoms with simple geometries. As clinical scanning has become more complex and the use of tube current modulation (TCM) has become pervasive in the clinic, MC simulations should include these techniques in their methodologies and therefore should also be validated using a variety of phantoms with different shapes and material compositions to result in a variety of differently modulated tube current profiles. The purpose of this work is to perform the measurements and simulations to validate a Monte Carlo model under a variety of test conditions where fixed tube current (FTC) and TCM were used. Methods: A previously developed MC model for estimating dose from CT scans that models TCM, built using the platform of MCNPX, was used for CT dose quantification. In order to validate the suitability of this model to accurately simulate patient dose from FTC and TCM CT scans, measurements and simulations were compared over a wide range of conditions. Phantoms used for testing range from simple geometries with homogeneous composition (16 and 32 cm computed tomography dose index phantoms) to more complex phantoms including a rectangular homogeneous water equivalent phantom, an elliptical shaped phantom with three sections (where each section was a homogeneous, but different material), and a heterogeneous, complex geometry anthropomorphic phantom. Each phantom requires varying levels of x-, y- and z-modulation. Each phantom was scanned on a multidetector row CT (Sensation 64) scanner under the conditions of both FTC and TCM. Dose measurements were made at various surface and depth positions within each phantom. Simulations using each phantom were performed for FTC, detailed x–y–z TCM, and z-axis-only TCM to obtain dose estimates.

  19. Isobaric-isothermal molecular dynamics computer simulations of the properties of water-1,2-dimethoxyethane model mixtures

    Directory of Open Access Journals (Sweden)

    J. Gujt

    2017-12-01

    Isothermal-isobaric molecular dynamics simulations have been performed to examine a broad set of properties of the model water-1,2-dimethoxyethane (DME) mixture as a function of composition. The SPC-E and TIP4P-Ew water models and the modified TraPPE model for DME were applied. Our principal focus was to explore the trends in the behaviour of structural properties in terms of the radial distribution functions, coordination numbers and number of hydrogen bonds between molecules of different species, and of the conformations of DME molecules. Thermodynamic properties, such as density, molar volume, enthalpy of mixing and heat capacity at constant pressure, have been examined. Finally, the self-diffusion coefficients of the species and the dielectric constant of the system were calculated and analyzed.

  20. Modeling and Simulation: An Overview

    OpenAIRE

    Michael McAleer; Felix Chan; Les Oxley

    2013-01-01

    The papers in this special issue of Mathematics and Computers in Simulation cover the following topics: improving judgmental adjustment of model-based forecasts, whether forecast updates are progressive, on a constrained mixture vector autoregressive model, whether all estimators are born equal, the empirical properties of some estimators of long memory, characterising trader manipulation in a limit-order driven market, measuring bias in a term-structure model of commodity prices through the c...

  1. Computer-program documentation of an interactive-accounting model to simulate streamflow, water quality, and water-supply operations in a river basin

    Science.gov (United States)

    Burns, A.W.

    1988-01-01

    This report describes an interactive-accounting model used to simulate streamflow, chemical-constituent concentrations and loads, and water-supply operations in a river basin. The model uses regression equations to compute flow from incremental (internode) drainage areas. Conservative chemical constituents (typically dissolved solids) also are computed from regression equations. Both flows and water-quality loads are accumulated downstream. Optionally, the model simulates the water use and the simplified groundwater systems of a basin. Water users include agricultural, municipal, industrial, and in-stream users, and reservoir operators. Water users list their potential water sources, including direct diversions, groundwater pumpage, interbasin imports, or reservoir releases, in the order in which they will be used. Direct diversions conform to basinwide water-law priorities. The model is interactive, and although the input data exist in files, the user can modify them interactively. A major feature of the model is its color-graphic-output options. This report includes a description of the model, organizational charts of subroutines, and examples of the graphics. Detailed format instructions for the input data, example files of input data, definitions of program variables, and a listing of the FORTRAN source code are attachments to the report. (USGS)
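
    As a toy illustration of the downstream accounting idea, with regression-derived incremental flows simply assumed as inputs and the water-use and priority logic omitted, consider:

        def accumulate_downstream(network, incremental_flow):
            """Accumulate flow along a node list ordered upstream-to-downstream;
            `network` maps each node to its downstream neighbour (and omits the
            basin outlet)."""
            total = dict(incremental_flow)
            for node in incremental_flow:   # keys in upstream-to-downstream order
                down = network.get(node)
                if down is not None:
                    total[down] += total[node]
            return total

        # Tributary 'B' joins the main stem 'A' at 'C', which drains to outlet 'D'
        network = {"A": "C", "B": "C", "C": "D"}
        inc = {"A": 3.0, "B": 1.5, "C": 0.5, "D": 0.2}
        print(accumulate_downstream(network, inc))   # outlet 'D' carries 5.2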

  2. Computer simulation in physics and engineering

    CERN Document Server

    Steinhauser, Martin Oliver

    2013-01-01

    This work is a needed reference for widely used techniques and methods of computer simulation in physics and other disciplines, such as materials science. The work conveys both the theoretical foundations of computer simulation and applications and "tricks of the trade", which are often scattered across various papers. Thus it will meet a need and fill a gap for every scientist who needs computer simulations for the task at hand. In addition to being a reference, case studies and exercises for use as course reading are included.

  3. Product Costing in FMT: Comparing Deterministic and Stochastic Models Using Computer-Based Simulation for an Actual Case Study

    DEFF Research Database (Denmark)

    Nielsen, Steen

    2000-01-01

    This paper expands the traditional product costing technique by including a stochastic form in a complex production process for product costing. The stochastic phenomenon in flexible manufacturing technologies is seen as an important phenomenon that companies try to decrease or eliminate. DFM has ... been used for evaluating the appropriateness of the firm's production capability. In this paper a simulation model is developed to analyze the relevant cost behaviour with respect to DFM and to develop a more streamlined process in the layout of the manufacturing process....

  4. Computer simulation of bounded plasmas

    International Nuclear Information System (INIS)

    Lawson, W.S.

    1987-01-01

    The problems of simulating a one-dimensional bounded plasma system using particles in a gridded space are systematically explored and solutions to them are given. Such problems include the injection of particles at the boundaries, the solution of Poisson's equation, and the inclusion of an external circuit between the confining boundaries. A recently discovered artificial cooling effect is explained as being a side-effect of quiet injection, and its potential for causing serious but subtle errors in bounded simulation is noted. The methods described in the first part of the thesis are then applied to the simulation of an extension of the Pierce diode problem, specifically a Pierce diode modified by an external circuit between the electrodes. The results of these simulations agree to high accuracy with theory when a theory exists, and also show some interesting chaotic behavior in certain parameter regimes. The chaotic behavior is described in detail
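
    Of the problems listed, the field solve is the most self-contained; a minimal sketch of the one-dimensional finite-difference Poisson solve with fixed electrode potentials (grid size, spacing and charge density all illustrative) follows.

        import numpy as np

        def poisson_1d(rho, dx=1.0, phi_left=0.0, phi_right=0.0):
            """Solve phi'' = -rho on interior grid nodes with Dirichlet
            (electrode) boundary values, via the standard tridiagonal system."""
            n = len(rho)
            A = (np.diag(np.full(n, -2.0)) + np.diag(np.ones(n - 1), 1)
                 + np.diag(np.ones(n - 1), -1))
            b = -rho * dx ** 2
            b[0] -= phi_left
            b[-1] -= phi_right
            return np.linalg.solve(A, b)

        rho = np.ones(9)                          # uniform charge between the plates
        print(poisson_1d(rho, dx=0.1).round(4))   # parabolic potential profile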

  5. Assessment of Molecular Modeling & Simulation

    Energy Technology Data Exchange (ETDEWEB)

    None

    2002-01-03

    This report reviews the development and applications of molecular and materials modeling in Europe and Japan in comparison to those in the United States. Topics covered include computational quantum chemistry, molecular simulations by molecular dynamics and Monte Carlo methods, mesoscale modeling of material domains, molecular-structure/macroscale property correlations like QSARs and QSPRs, and related information technologies like informatics and special-purpose molecular-modeling computers. The panel's findings include the following: The United States leads this field in many scientific areas. However, Canada has particular strengths in DFT methods and homogeneous catalysis; Europe in heterogeneous catalysis, mesoscale, and materials modeling; and Japan in materials modeling and special-purpose computing. Major government-industry initiatives are underway in Europe and Japan, notably in multi-scale materials modeling and in development of chemistry-capable ab-initio molecular dynamics codes.

  6. Understanding Islamist political violence through computational social simulation

    Energy Technology Data Exchange (ETDEWEB)

    Watkins, Jennifer H [Los Alamos National Laboratory; Mackerrow, Edward P [Los Alamos National Laboratory; Patelli, Paolo G [Los Alamos National Laboratory; Eberhardt, Ariane [Los Alamos National Laboratory; Stradling, Seth G [Los Alamos National Laboratory

    2008-01-01

    Understanding the process that enables political violence is of great value in reducing the future demand for and support of violent opposition groups. Methods are needed that allow alternative scenarios and counterfactuals to be scientifically researched. Computational social simulation shows promise in developing 'computer experiments' that would be unfeasible or unethical in the real world. Additionally, the process of modeling and simulation reveals and challenges assumptions that may not be noted in theories, exposes areas where data is not available, and provides a rigorous, repeatable, and transparent framework for analyzing the complex dynamics of political violence. This paper demonstrates the computational modeling process using two simulation techniques: system dynamics and agent-based modeling. The benefits and drawbacks of both techniques are discussed. In developing these social simulations, we discovered that the social science concepts and theories needed to accurately simulate the associated psychological and social phenomena were lacking.

  7. Computational models of complex systems

    CERN Document Server

    Dabbaghian, Vahid

    2014-01-01

    Computational and mathematical models provide us with the opportunity to investigate the complexities of real-world problems. They allow us to apply our best analytical methods to define problems in a clearly mathematical manner and exhaustively test our solutions before committing expensive resources. This is made possible by assuming parameters in a bounded environment, allowing for controllable experimentation that is not always possible in live scenarios. For example, simulation of computational models allows the testing of theories in a manner that is both fundamentally deductive and experimental in nature. The main ingredients for such research ideas come from multiple disciplines, and the importance of interdisciplinary research is well recognized by the scientific community. This book provides a window on the novel endeavours of the research communities to present their works by highlighting the value of computational modelling as a research tool when investigating complex systems. We hope that the reader...

  8. Computer simulations of the mouse spermatogenic cycle

    Directory of Open Access Journals (Sweden)

    Debjit Ray

    2014-12-01

    Full Text Available The spermatogenic cycle describes the periodic development of germ cells in the testicular tissue. The temporal–spatial dynamics of the cycle highlight the unique, complex, and interdependent interaction between germ and somatic cells, and are the key to continual sperm production. Although understanding the spermatogenic cycle has important clinical relevance for male fertility and contraception, there are a number of experimental obstacles. For example, the lengthy process cannot be visualized through dynamic imaging, and the precise action of germ cells that leads to the emergence of testicular morphology remains uncharacterized. Here, we report an agent-based model that simulates the mouse spermatogenic cycle on a cross-section of the seminiferous tubule over a time scale of hours to years, while considering feedback regulation, mitotic and meiotic division, differentiation, apoptosis, and movement. The computer model is able to elaborate the germ cell dynamics in a time-lapse movie format, allowing us to trace individual cells as they change state and location. More importantly, the model provides mechanistic understanding of the fundamentals of male fertility, namely how testicular morphology and sperm production are achieved. By manipulating cellular behaviors either individually or collectively in silico, the model predicts causal events for the altered arrangement of germ cells upon genetic or environmental perturbations. This in silico platform can serve as an interactive tool to perform long-term simulation and to identify optimal approaches for infertility treatment and contraceptive development.

  9. Evaluation of Visual Computer Simulator for Computer Architecture Education

    Science.gov (United States)

    Imai, Yoshiro; Imai, Masatoshi; Moritoh, Yoshio

    2013-01-01

    This paper presents a trial evaluation, conducted in 2009-2011, of a visual computer simulator that has been developed to play the roles of both an instruction facility and a learning tool simultaneously. It also illustrates an example of Computer Architecture education for university students and the use of an e-Learning tool for Assembly Programming in order to…

  10. Simulation in Complex Modelling

    DEFF Research Database (Denmark)

    Nicholas, Paul; Ramsgaard Thomsen, Mette; Tamke, Martin

    2017-01-01

    This paper will discuss the role of simulation in extended architectural design modelling. As a framing paper, the aim is to present and discuss the role of integrated design simulation and feedback between design and simulation in a series of projects under the Complex Modelling framework. Complex...... Restraint developed for the research exhibition Complex Modelling, Meldahls Smedie Gallery, Copenhagen in 2016. Where the direct project aims and outcomes have been reported elsewhere, the aim for this paper is to discuss overarching strategies for working with design integrated simulation....

  11. Computer Simulation Performed for Columbia Project Cooling System

    Science.gov (United States)

    Ahmad, Jasim

    2005-01-01

    This demo shows a high-fidelity simulation of the air flow in the main computer room housing the Columbia (10,240 Intel Itanium 2 processors) system. The simulation assesses the performance of the cooling system, identifies deficiencies, and recommends modifications to eliminate them. It used two in-house software packages on NAS supercomputers: Chimera Grid Tools to generate a geometric model of the computer room, and the OVERFLOW-2 code for fluid and thermal simulation. This state-of-the-art technology can easily be extended to provide a general capability for air flow analyses in any modern computer room.

  12. Large-scale computing techniques for complex system simulations

    CERN Document Server

    Dubitzky, Werner; Schott, Bernard

    2012-01-01

    Complex systems modeling and simulation approaches are being adopted in a growing number of sectors, including finance, economics, biology, astronomy, and many more. Technologies ranging from distributed computing to specialized hardware are explored and developed to address the computational requirements arising in complex systems simulations. The aim of this book is to present a representative overview of contemporary large-scale computing technologies in the context of complex systems simulations applications. The intention is to identify new research directions in this field and

  13. Augmented Reality Simulations on Handheld Computers

    Science.gov (United States)

    Squire, Kurt; Klopfer, Eric

    2007-01-01

    Advancements in handheld computing, particularly its portability, social interactivity, context sensitivity, connectivity, and individuality, open new opportunities for immersive learning environments. This article articulates the pedagogical potential of augmented reality simulations in environmental engineering education by immersing students in…

  14. Computer Simulation in Information and Communication Engineering

    CERN Multimedia

    Anton Topurov

    2005-01-01

    CSICE'05 Sofia, Bulgaria 20th - 22nd October, 2005 On behalf of the International Scientific Committee, we would like to invite you all to Sofia, the capital city of Bulgaria, to the International Conference in Computer Simulation in Information and Communication Engineering CSICE'05. The Conference is aimed at facilitating the exchange of experience in the field of computer simulation gained not only in traditional fields (Communications, Electronics, Physics...) but also in the areas of biomedical engineering, environment, industrial design, etc. The objective of the Conference is to bring together lecturers, researchers and practitioners from different countries, working in the fields of computer simulation in information engineering, in order to exchange information and bring new contributions to this important field of engineering design and education. The Conference will bring you the latest ideas and developments of the tools for computer simulation directly from their inventors. Contribution describ...

  15. Computer simulations for the nano-scale

    International Nuclear Information System (INIS)

    Stich, I.

    2007-01-01

    A review of methods for computations for the nano-scale is presented. The paper should provide a convenient starting point into computations for the nano-scale as well as a more in-depth presentation for those already working in the field of atomic/molecular-scale modeling. The argument is divided into chapters covering the methods for description of the (i) electrons, (ii) ions, and (iii) techniques for efficient solving of the underlying equations. A fairly broad view is taken covering the Hartree-Fock approximation, density functional techniques and quantum Monte-Carlo techniques for electrons. The customary quantum chemistry methods, such as post Hartree-Fock techniques, are only briefly mentioned. Description of both classical and quantum ions is presented. The techniques cover Ehrenfest, Born-Oppenheimer, and Car-Parrinello dynamics. The strong and weak points of both principal and technical nature are analyzed. In the second part we introduce a number of applications to demonstrate the different approximations and techniques introduced in the first part. They cover a wide range of applications such as non-simple liquids, surfaces, molecule-surface interactions, applications in nano technology, etc. These more in-depth presentations, while certainly not exhaustive, should provide information on technical aspects of the simulations, typical parameters used, and ways of analysis of the huge amounts of data generated in these large-scale supercomputer simulations. (author)

  16. Conformational Properties of a Polymer in an Ionic Liquid: Computer Simulations and Integral Equation Theory of a Coarse-Grained Model.

    Science.gov (United States)

    Choi, Eunsong; Yethiraj, Arun

    2015-07-23

    We study the conformational properties of polymers in room temperature ionic liquids using theory and simulations of a coarse-grained model. Atomistic simulations have shown that single poly(ethylene oxide) (PEO) molecules in the ionic liquid 1-butyl 3-methyl imidazolium tetrafluoroborate ([BMIM][BF4]) are expanded at room temperature, i.e., the radius of gyration, Rg, scales with molecular weight, Mw, as Rg ∼ Mw^0.9, instead of showing the expected self-avoiding-walk behavior. The simulations were restricted to fairly short chains, however, which might not be in the true scaling regime. In this work, we investigate a coarse-grained model for the behavior of PEO in [BMIM][BF4]. We use existing force fields for PEO and [BMIM][BF4] and Lorentz–Berthelot mixing rules for the cross interactions. The coarse-grained model predicts that PEO collapses in the ionic liquid. We also present an integral equation theory for the structure of the ionic liquid and the conformational properties of the polymer. The theory is in excellent agreement with the simulation results. We conclude that the properties of polymers in ionic liquids are unusually sensitive to the details of the intermolecular interactions. The integral equation theory is sufficiently accurate to be a useful guide to computational work.
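    As an illustration of the scaling analysis behind the Rg ∼ Mw^0.9 claim, the following sketch estimates the exponent from (Mw, Rg) pairs; the chains here are synthetic random walks, not the paper's atomistic or coarse-grained data:

```python
import numpy as np

# Hedged sketch: fit nu in Rg ~ Mw^nu on a log-log scale. The "chains" are
# ideal random walks, so the fitted exponent should come out near 0.5.
def radius_of_gyration(coords):
    """Rg of one conformation; coords is an (N, 3) array of bead positions."""
    centered = coords - coords.mean(axis=0)
    return np.sqrt((centered ** 2).sum(axis=1).mean())

rng = np.random.default_rng(0)
mw = np.array([20, 40, 80, 160, 320])                  # chain lengths (beads)
rg = np.array([radius_of_gyration(rng.normal(size=(n, 3)).cumsum(axis=0))
               for n in mw])                           # random-walk chains

nu, log_prefactor = np.polyfit(np.log(mw), np.log(rg), 1)
print(f"fitted exponent nu = {nu:.2f}  (0.5 ideal, ~0.59 SAW, ~1/3 collapsed)")
```

    The same fit applied to simulation output distinguishes the expanded (nu near 0.9), self-avoiding (nu ≈ 0.59), and collapsed (nu ≈ 1/3) regimes discussed in the abstract.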

  17. Computer Systems/Database Simulation.

    Science.gov (United States)

    1978-10-15

    simulator, he runs the risk of running a simulation he does not understand. Technical documentation of CASE internals is virtually non-existent; the...few users outside the domain of the team of researchers who worked to make the IPSS design methodology a reality. It was demonstrated effectively in

  18. A computationally efficient framework for the simulation of cardiac perfusion using a multi-compartment Darcy porous-media flow model.

    Science.gov (United States)

    Michler, C; Cookson, A N; Chabiniok, R; Hyde, E; Lee, J; Sinclair, M; Sochi, T; Goyal, A; Vigueras, G; Nordsletten, D A; Smith, N P

    2013-02-01

    We present a method to efficiently simulate coronary perfusion in subject-specific models of the heart within clinically relevant time frames. Perfusion is modelled as a Darcy porous-media flow, where the permeability tensor is derived from homogenization of an explicit anatomical representation of the vasculature. To account for the disparity in length scales present in the vascular network, in this study, this approach is further refined through the implementation of a multi-compartment medium where each compartment encapsulates the spatial scales in a certain range by using an effective permeability tensor. Neighbouring compartments then communicate through distributed sources and sinks, acting as volume fluxes. Although elegant from a modelling perspective, the full multi-compartment Darcy system is computationally expensive to solve. We therefore enhance computational efficiency of this model by reducing the N-compartment system of Darcy equations to N pressure equations, and N subsequent projection problems to recover the Darcy velocity. The resulting 'reduced' Darcy formulation leads to a dramatic reduction in algebraic-system size and is therefore computationally cheaper to solve than the full multi-compartment Darcy system. A comparison of the reduced and the full formulation in terms of solution time and memory usage clearly highlights the superior performance of the reduced formulation. Moreover, the implementation of flux and, specifically, impermeable boundary conditions on arbitrarily curved boundaries such as epicardium and endocardium is straightforward in contrast to the full Darcy formulation. Finally, to demonstrate the applicability of our methodology to a personalized model and its solvability in clinically relevant time frames, we simulate perfusion in a subject-specific model of the left ventricle. Copyright © 2012 John Wiley & Sons, Ltd.
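    A hedged sketch of the governing equations described above; the notation (K_i, p_i, β_ik, s_i) is illustrative rather than necessarily the authors':

```latex
% Multi-compartment Darcy flow, sketched: K_i permeability tensors,
% p_i compartment pressures, beta_{ik} exchange coefficients, s_i sources.
\begin{align*}
  \mathbf{w}_i &= -K_i \nabla p_i, && i = 1,\dots,N
      \quad\text{(Darcy velocity in compartment } i\text{)}\\
  \nabla\cdot\mathbf{w}_i &= \sum_{k \neq i} \beta_{ik}\,(p_k - p_i) + s_i
      && \text{(mass balance with inter-compartment fluxes)}
\end{align*}
```

    Substituting the first relation into the second leaves N coupled equations in the pressures alone, -∇·(K_i ∇p_i) = Σ_{k≠i} β_ik (p_k − p_i) + s_i; each Darcy velocity is then recovered by a separate projection, which is what shrinks the algebraic system relative to the full mixed formulation.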

  19. Computational Intelligence, Cyber Security and Computational Models

    CERN Document Server

    Anitha, R; Lekshmi, R; Kumar, M; Bonato, Anthony; Graña, Manuel

    2014-01-01

    This book contains cutting-edge research material presented by researchers, engineers, developers, and practitioners from academia and industry at the International Conference on Computational Intelligence, Cyber Security and Computational Models (ICC3) organized by PSG College of Technology, Coimbatore, India during December 19–21, 2013. The materials in the book include theory and applications for design, analysis, and modeling of computational intelligence and security. The book will be useful material for students, researchers, professionals, and academicians. It will help in understanding current research trends and findings and future scope of research in computational intelligence, cyber security, and computational models.

  20. CMS computing model evolution

    International Nuclear Information System (INIS)

    Grandi, C; Bonacorsi, D; Colling, D; Fisk, I; Girone, M

    2014-01-01

    The CMS Computing Model was developed and documented in 2004. Since then the model has evolved to be more flexible and to take advantage of new techniques, but many of the original concepts remain and are in active use. In this presentation we will discuss the changes planned for the restart of the LHC program in 2015. We will discuss the changes planned in the use and definition of the computing tiers that were defined with the MONARC project. We will present how we intend to use new services and infrastructure to provide more efficient and transparent access to the data. We will discuss the computing plans to make better use of the computing capacity by scheduling more of the processor nodes, making better use of the disk storage, and making more intelligent use of the networking.

  1. Discrete Event Simulation Computers can be used to simulate the ...

    Indian Academy of Sciences (India)

    IAS Admin

    Computers can be used to simulate the operation of complex systems and thereby study their performance. This article introduces you to the technique of discrete event simulation through a simple example.
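    In the spirit of the article's simple example, a minimal single-server-queue sketch of discrete event simulation; the arrival and service times below are made up purely for illustration:

```python
import heapq

# Hedged sketch: a single-server queue driven by a time-ordered event list,
# the core mechanism of discrete event simulation.
events = [(t, "arrival") for t in (1.0, 2.0, 2.5, 6.0)]  # illustrative arrivals
heapq.heapify(events)

server_free_at = 0.0
service_time = 1.5
while events:
    t, kind = heapq.heappop(events)            # advance to the next event
    if kind == "arrival":
        start = max(t, server_free_at)         # wait if the server is busy
        server_free_at = start + service_time
        heapq.heappush(events, (server_free_at, "departure"))
        print(f"t={t:.1f} arrival, service starts at {start:.1f}")
    else:
        print(f"t={t:.1f} departure")
```

    The defining feature is that simulated time jumps from event to event rather than ticking in fixed increments, which is what makes the technique efficient for queueing and logistics systems.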

  2. REACTOR: a computer simulation for schools

    International Nuclear Information System (INIS)

    Squires, D.

    1985-01-01

    The paper concerns computer simulation of the operation of a nuclear reactor, for use in schools. The project was commissioned by UKAEA, and carried out by the Computers in the Curriculum Project, Chelsea College. The program, for an advanced gas cooled reactor, is briefly described. (U.K.)

  3. Computationally Modeling Interpersonal Trust

    Directory of Open Access Journals (Sweden)

    Jin Joo eLee

    2013-12-01

    Full Text Available We present a computational model capable of predicting—above human accuracy—the degree of trust a person has toward their novel partner by observing the trust-related nonverbal cues expressed in their social interaction. We summarize our prior work, in which we identify nonverbal cues that signal untrustworthy behavior and also demonstrate the human mind’s readiness to interpret those cues to assess the trustworthiness of a social robot. We demonstrate that domain knowledge gained from our prior work using human-subjects experiments, when incorporated into the feature engineering process, permits a computational model to outperform both human predictions and a baseline model built in naiveté of this domain knowledge. We then present the construction of hidden Markov models to incorporate temporal relationships among the trust-related nonverbal cues. By interpreting the resulting learned structure, we observe that models built to emulate different levels of trust exhibit different sequences of nonverbal cues. From this observation, we derived sequence-based temporal features that further improve the accuracy of our computational model. Our multi-step research process presented in this paper combines the strength of experimental manipulation and machine learning to not only design a computational trust model but also to further our understanding of the dynamics of interpersonal trust.
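    To illustrate the HMM comparison described above, a hedged sketch: one observed cue sequence is scored under two competing models with the scaled forward algorithm. The cue coding and all matrices are invented for illustration, not taken from the paper:

```python
import numpy as np

def log_likelihood(obs, start, trans, emit):
    """Scaled forward algorithm; obs is a sequence of discrete cue indices."""
    alpha = start * emit[:, obs[0]]
    loglik = np.log(alpha.sum())
    alpha = alpha / alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ trans) * emit[:, o]   # propagate, then weight by emission
        loglik += np.log(alpha.sum())
        alpha = alpha / alpha.sum()            # rescale to avoid underflow
    return loglik

# Hypothetical cue coding, e.g. 0=lean-forward, 1=face-touch, 2=hand-touch.
cues = [0, 2, 2, 1, 2]
high_trust = (np.array([0.6, 0.4]),                       # start probabilities
              np.array([[0.8, 0.2], [0.3, 0.7]]),         # transitions
              np.array([[0.7, 0.2, 0.1], [0.2, 0.3, 0.5]]))  # emissions
low_trust = (np.array([0.4, 0.6]),
             np.array([[0.6, 0.4], [0.2, 0.8]]),
             np.array([[0.2, 0.3, 0.5], [0.1, 0.4, 0.5]]))

for name, model in (("high-trust", high_trust), ("low-trust", low_trust)):
    print(f"log P(cues | {name} model) = {log_likelihood(cues, *model):.3f}")
```

    Classifying a new interaction then amounts to asking which trained model assigns the observed cue sequence the higher likelihood.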

  4. Teaching Macroeconomics with a Computer Simulation. Final Report.

    Science.gov (United States)

    Dolbear, F. Trenery, Jr.

    The study of macroeconomics--the determination and control of aggregative variables such as gross national product, unemployment and inflation--may be facilitated by the use of a computer simulation policy game. An aggregative model of the economy was constructed and programed for a computer and (hypothetical) historical data were generated. The…

  5. Simulations of Probabilities for Quantum Computing

    Science.gov (United States)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular, a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  6. Salesperson Ethics: An Interactive Computer Simulation

    Science.gov (United States)

    Castleberry, Stephen

    2014-01-01

    A new interactive computer simulation designed to teach sales ethics is described. Simulation learner objectives include gaining a better understanding of legal issues in selling; realizing that ethical dilemmas do arise in selling; realizing the need to be honest when selling; seeing that there are conflicting demands from a salesperson's…

  7. Learning and instruction with computer simulations

    NARCIS (Netherlands)

    de Jong, Anthonius J.M.

    1991-01-01

    The present volume presents the results of an inventory of elements of such a computer learning environment. This inventory was conducted within a DELTA project called SIMULATE. In the project a learning environment that provides intelligent support to learners and that has a simulation as its

  8. Computational fluid dynamics simulations and validations of results

    CSIR Research Space (South Africa)

    Sitek, MA

    2013-09-01

    Full Text Available Wind flow influence on a high-rise building is analyzed. The research covers full-scale tests, wind-tunnel experiments and numerical simulations. In the present paper the computational model used in the simulations is described and the results, which were...

  9. Computational fluid dynamics (CFD) simulation of hot air flow ...

    African Journals Online (AJOL)

    Computational Fluid Dynamics simulation of air flow distribution, air velocity and pressure field pattern, as they will affect moisture transients in a cabinet tray dryer, is performed using the SolidWorks Flow Simulation (SWFS) 2014 SP 4.0 program. The model used for the drying process in this experiment was designed with Solid ...

  10. Wealth distribution, Pareto law, and stretched exponential decay of money: Computer simulations analysis of agent-based models

    Science.gov (United States)

    Aydiner, Ekrem; Cherstvy, Andrey G.; Metzler, Ralf

    2018-01-01

    We study by Monte Carlo simulations a kinetic exchange trading model for both fixed and distributed saving propensities of the agents and rationalize the person and wealth distributions. We show that the newly introduced wealth distribution - that may be more amenable in certain situations - features a different power-law exponent, particularly for distributed saving propensities of the agents. For open agent-based systems, we analyze the person and wealth distributions and find that the presence of trap agents alters their amplitude, leaving however the scaling exponents nearly unaffected. For an open system, we show that the total wealth - for different trap agent densities and saving propensities of the agents - decreases in time according to the classical Kohlrausch-Williams-Watts stretched exponential law. Interestingly, this decay does not depend on the trap agent density, but rather on saving propensities. The system relaxation for fixed and distributed saving schemes are found to be different.
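    A minimal sketch of a kinetic exchange trading model with a fixed saving propensity, in the spirit of the model class studied above; trap agents and distributed savings are omitted, and all parameters are illustrative:

```python
import numpy as np

# Hedged sketch: pairwise random exchange with saving propensity lam.
# Each trade conserves money: the pair keeps a fraction lam of its holdings
# and randomly splits the rest.
rng = np.random.default_rng(42)
N, steps, lam = 1_000, 200_000, 0.5
money = np.ones(N)                              # everyone starts equal

for _ in range(steps):
    i, j = rng.integers(0, N, size=2)
    if i == j:
        continue
    eps = rng.random()
    pool = (1.0 - lam) * (money[i] + money[j])  # amount put on the table
    money[i] = lam * money[i] + eps * pool
    money[j] = lam * money[j] + (1.0 - eps) * pool

print(f"mean wealth {money.mean():.3f} (conserved), max {money.max():.2f}")
```

    With lam = 0 the stationary distribution is exponential (Boltzmann-Gibbs-like); a fixed nonzero saving propensity produces a gamma-like peak near the mean, and distributed propensities are what generate Pareto power-law tails in this model class.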

  11. Theory, modeling, and simulation annual report, 1992

    Energy Technology Data Exchange (ETDEWEB)

    1993-05-01

    This report briefly discusses research on the following topics: development of electronic structure methods; modeling molecular processes in clusters; modeling molecular processes in solution; modeling molecular processes in separations chemistry; modeling interfacial molecular processes; modeling molecular processes in the atmosphere; methods for periodic calculations on solids; chemistry and physics of minerals; graphical user interfaces for computational chemistry codes; visualization and analysis of molecular simulations; integrated computational chemistry environment; and benchmark computations.

  12. Computer graphics in heat-transfer simulations

    International Nuclear Information System (INIS)

    Hamlin, G.A. Jr.

    1980-01-01

    Computer graphics can be very useful in the setup of heat transfer simulations and in the display of the results of such simulations. The potential use of recently available low-cost graphics devices in the setup of such simulations has not been fully exploited. Several types of graphics devices and their potential usefulness are discussed, and some configurations of graphics equipment are presented in the low-, medium-, and high-price ranges

  13. Fast computation of the inverse CMH model

    Science.gov (United States)

    Patel, Umesh D.; Della Torre, Edward

    2001-12-01

    A fast computational method, based on a differential-equation approach for the inverse Della Torre, Oti, Kádár (DOK) model, has been extended to the inverse Complete Moving Hysteresis (CMH) model. A cobweb technique for calculating the inverse CMH model is also presented. The two techniques differ from the point of view of flexibility, accuracy, and computation time. Simulation results of the inverse computation for both methods are presented.

  14. Computer Simulation of Electron Positron Annihilation Processes

    Energy Technology Data Exchange (ETDEWEB)

    Chen, y

    2003-10-02

    With the launching of the Next Linear Collider coming closer and closer, there is a pressing need for physicists to develop a fully-integrated computer simulation of e⁺e⁻ annihilation processes at a center-of-mass energy of 1 TeV. A simulation program acts as the template for future experiments. Either new physics will be discovered, or current theoretical uncertainties will shrink due to more accurate higher-order radiative correction calculations. The existence of an efficient and accurate simulation will help us understand the new data and validate (or veto) some of the theoretical models developed to explain new physics. It should handle well interfaces between different sectors of physics, e.g., interactions happening at parton levels well above the QCD scale which are described by perturbative QCD, and interactions happening at much lower energy scale, which combine partons into hadrons. Also it should achieve competitive speed in real time when the complexity of the simulation increases. This thesis contributes some tools that will be useful for the development of such simulation programs. We begin our study by the development of a new Monte Carlo algorithm intended to perform efficiently in selecting weight-1 events when multiple parameter dimensions are strongly correlated. The algorithm first seeks to model the peaks of the distribution by features, adapting these features to the function using the EM algorithm. The representation of the distribution provided by these features is then improved using the VEGAS algorithm for the Monte Carlo integration. The two strategies mesh neatly into an effective multi-channel adaptive representation. We then present a new algorithm for the simulation of parton shower processes in high energy QCD. We want to find an algorithm which is free of negative weights, produces its output as a set of exclusive events, and whose total rate exactly matches the full Feynman amplitude calculation. Our strategy is to create

  15. Polymer Composites Corrosive Degradation: A Computational Simulation

    Science.gov (United States)

    Chamis, Christos C.; Minnetyan, Levon

    2007-01-01

    A computational simulation of polymer composites corrosive durability is presented. The corrosive environment is assumed to manage the polymer composite degradation on a ply-by-ply basis. The degradation is correlated with a measured pH factor and is represented by voids, temperature and moisture, which vary parabolically for voids and linearly for temperature and moisture through the laminate thickness. The simulation is performed by a computational composite mechanics computer code which includes micro, macro, combined stress failure and laminate theories. The simulation thus starts from constitutive material properties and builds up to the laminate scale, at which the laminate is exposed to the corrosive environment. Results obtained for one laminate indicate that the ply-by-ply degradation degrades the laminate down to the last ply or the last several plies. Results also demonstrate that the simulation is applicable to other polymer composite systems as well.
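    A hedged sketch of the assumed through-thickness profiles — parabolic for voids, linear for temperature and moisture; the ply count and the surface/back values are illustrative only:

```python
import numpy as np

# Hedged sketch: generate the degradation profiles described above across a
# normalized laminate thickness (z = 0 exposed surface, z = 1 back face).
def profiles(n_plies, v_surf, v_back, t_surf, t_back, m_surf, m_back):
    z = np.linspace(0.0, 1.0, n_plies)                    # ply midplanes
    voids = v_back + (v_surf - v_back) * (1.0 - z) ** 2   # parabolic decay
    temperature = t_surf + (t_back - t_surf) * z          # linear gradient
    moisture = m_surf + (m_back - m_surf) * z             # linear gradient
    return voids, temperature, moisture

v, t, m = profiles(8, v_surf=0.05, v_back=0.0,            # invented magnitudes
                   t_surf=350.0, t_back=300.0, m_surf=0.02, m_back=0.0)
for ply, (vi, ti, mi) in enumerate(zip(v, t, m), start=1):
    print(f"ply {ply}: voids={vi:.3f}, T={ti:.0f} K, moisture={mi:.3f}")
```

    Feeding such ply-wise void, temperature, and moisture values into a ply-level failure analysis is the mechanism by which the exposed plies degrade first, consistent with the ply-by-ply results reported above.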

  16. Insertion of control systems models in the Almod 3 computer code for the simulation of Angra I reactor start-up tests

    International Nuclear Information System (INIS)

    Camargo, C.T.M.

    1981-09-01

    The Almod 3 computer code was modified, aiming at the simulation of Angra I nuclear power plant behavior during some reactor start-up tests. The results obtained with the modified computer code (Almod 3W) are compared with those obtained with the Retran computer code. (E.G.) [pt

  17. Creating science simulations through Computational Thinking Patterns

    Science.gov (United States)

    Basawapatna, Ashok Ram

    Computational thinking aims to outline fundamental skills from computer science that everyone should learn. As currently defined, with help from the National Science Foundation (NSF), these skills include problem formulation, logically organizing data, automating solutions through algorithmic thinking, and representing data through abstraction. One aim of the NSF is to integrate these and other computational thinking concepts into the classroom. End-user programming tools offer a unique opportunity to accomplish this goal. An end-user programming tool that allows students with little or no prior experience the ability to create simulations based on phenomena they see in-class could be a first step towards meeting most, if not all, of the above computational thinking goals. This thesis describes the creation, implementation and initial testing of a programming tool, called the Simulation Creation Toolkit, with which users apply high-level agent interactions called Computational Thinking Patterns (CTPs) to create simulations. Employing Computational Thinking Patterns obviates lower behavior-level programming and allows users to directly create agent interactions in a simulation by making an analogy with real world phenomena they are trying to represent. Data collected from 21 sixth grade students with no prior programming experience and 45 seventh grade students with minimal programming experience indicates that this is an effective first step towards enabling students to create simulations in the classroom environment. Furthermore, an analogical reasoning study that looked at how users might apply patterns to create simulations from high- level descriptions with little guidance shows promising results. These initial results indicate that the high level strategy employed by the Simulation Creation Toolkit is a promising strategy towards incorporating Computational Thinking concepts in the classroom environment.

  18. Simulation in Complex Modelling

    DEFF Research Database (Denmark)

    Nicholas, Paul; Ramsgaard Thomsen, Mette; Tamke, Martin

    2017-01-01

    This paper will discuss the role of simulation in extended architectural design modelling. As a framing paper, the aim is to present and discuss the role of integrated design simulation and feedback between design and simulation in a series of projects under the Complex Modelling framework. Complex...... performance, engage with high degrees of interdependency and allow the emergence of design agency and feedback between the multiple scales of architectural construction. This paper presents examples for integrated design simulation from a series of projects including Lace Wall, A Bridge Too Far and Inflated...... Restraint developed for the research exhibition Complex Modelling, Meldahls Smedie Gallery, Copenhagen in 2016. Where the direct project aims and outcomes have been reported elsewhere, the aim for this paper is to discuss overarching strategies for working with design integrated simulation....

  19. Computer Simulation of Noise Effects of the Neighborhood of Stimulus Threshold for a Mathematical Model of Homeostatic Regulation of Sleep-Wake Cycles

    Directory of Open Access Journals (Sweden)

    Wuyin Jin

    2017-01-01

    Full Text Available We study noise effects near the stimulus threshold in a neuronal mathematical model of the homeostatic regulation of sleep-wake cycles, governed by the spatiotemporal behavior of hypocretin/orexin neurons and local glutamate interneurons. Noise added to the stimulus, to the conductance, and to the activation variable of the modulation function is investigated in turn, on top of a skewed sinusoidal circadian input. The computer simulation results suggest that increasing the amplitude of the external current input advances the waking time while the sleep-onset time remains the same; for larger conductance and modulation noise, the regulatory mechanism of the model sometimes collapses and the two coupled neurons of the model show very irregular activity, with the transitions to sleep or wakefulness occurring at indeterminate times.
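    To illustrate the general idea of noise added to a near-threshold stimulus, a hedged one-variable sketch using an Euler–Maruyama step and a skewed sinusoidal drive; this toy dynamics is not the paper's two-neuron model:

```python
import numpy as np

# Hedged sketch: a leaky variable driven toward a circadian input, with
# Gaussian white noise, integrated by Euler-Maruyama. Near the threshold,
# noise alone decides when crossings (state switches) occur.
rng = np.random.default_rng(3)
dt, hours = 0.001, 24.0
t = np.arange(0.0, hours, dt)
drive = 0.5 + 0.5 * np.sin(2.0 * np.pi * t / 24.0) ** 3   # illustrative skewed input

x, sigma, threshold = 0.0, 0.05, 0.8
crossings, above = 0, False
for k in range(t.size):
    x += (-x + drive[k]) * dt + sigma * np.sqrt(dt) * rng.normal()
    if x > threshold and not above:      # count upward threshold crossings
        crossings += 1
    above = x > threshold
print(f"upward threshold crossings in 24 h: {crossings}")
```

    Because the deterministic drive only grazes the threshold, the crossing count varies from run to run — a simple analogue of the indeterminate switching times reported above.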

  20. Sonification of simulations in computational physics

    International Nuclear Information System (INIS)

    Vogt, K.

    2010-01-01

    Sonification is the translation of information for auditory perception, excluding speech itself. The cognitive performance of pattern recognition is striking for sound, and has too long been disregarded by the scientific mainstream. Examples of 'spontaneous sonification' and systematic research for about 20 years have proven that sonification provides a valuable tool for the exploration of scientific data. The data in this thesis stem from computational physics, where numerical simulations are applied to problems in physics. Prominent examples are spin models and lattice quantum field theories. The corresponding data lend themselves very well to innovative display methods: they are structured on discrete lattices, often stochastic, high-dimensional and abstract, and they provide huge amounts of data. Furthermore, they have no inherently perceptual dimension. When designing the sonification of simulation data, one has to make decisions on three levels, both for the data and the sound model: the level of meaning (phenomenological; metaphoric); of structure (in time and space), and of elements ('display units' vs. 'gestalt units'). The design usually proceeds as a bottom-up or top-down process. This thesis provides a 'toolbox' for helping in these decisions. It describes tools that have proven particularly useful in the context of simulation data. An explicit method of top-down sonification design is the metaphoric sonification method, which is based on expert interviews. Furthermore, qualitative and quantitative evaluation methods are presented, on the basis of which a set of evaluation criteria is proposed. The translation between a scientific and the sound synthesis domain is elucidated by a sonification operator. For this formalization, a collection of notation modules is provided. Showcases are discussed in detail that have been developed in the interdisciplinary research projects SonEnvir and QCD-audio, during the second Science By Ear workshop and during a

  1. Simulation modelling in agriculture: General considerations. | R.I. ...

    African Journals Online (AJOL)

    The computer does all the necessary arithmetic when the hypothesis is invoked to predict the future behaviour of the simulated system under given conditions.A general ... in the advisory service. Keywords: agriculture; botany; computer simulation; modelling; simulation model; simulation modelling; south africa; techniques ...

  2. Computer simulations and modeling-assisted ToxR screening in deciphering 3D structures of transmembrane α-helical dimers: ephrin receptor A1

    International Nuclear Information System (INIS)

    Volynsky, P E; Mineeva, E A; Goncharuk, M V; Ermolyuk, Ya S; Arseniev, A S; Efremov, R G

    2010-01-01

    Membrane-spanning segments of numerous proteins (e.g. receptor tyrosine kinases) represent a novel class of pharmacologically important targets, whose activity can be modulated by specially designed artificial peptides, the so-called interceptors. Rational construction of such peptides requires understanding of the main factors driving peptide–peptide association in lipid membranes. Here we present a new method for rapid prediction of the spatial structure of transmembrane (TM) helix–helix complexes. It is based on computer simulations in membrane-like media and subsequent refinement/validation of the results using experimental studies of TM helix dimerization in a bacterial membrane by means of the ToxR system. The approach was applied to TM fragments of the ephrin receptor A1 (EphA1). A set of spatial structures of the dimer was proposed based on Monte Carlo simulations in an implicit membrane followed by molecular dynamics relaxation in an explicit lipid bilayer. The resulting models were employed for rational design of wild-type and mutant genetic constructions for ToxR assays. The computational and the experimental data are self-consistent and provide an unambiguous spatial model of the TM dimer of EphA1. The results of this work can be further used to develop new biologically active 'peptide interceptors' specifically targeting membrane domains of proteins

  3. Computer Simulations of Lipid Nanoparticles

    Directory of Open Access Journals (Sweden)

    Xavier F. Fernandez-Luengo

    2017-12-01

    Full Text Available Lipid nanoparticles (LNPs) are promising soft matter nanomaterials for drug delivery applications. In spite of their interest, little is known about the supramolecular organization of the components of these self-assembled nanoparticles. Here, we present a molecular dynamics simulation study, employing the Martini coarse-grain forcefield, of self-assembled LNPs made of tripalmitin lipid in water. We also study the adsorption of Tween 20 surfactant as a protective layer on top of the LNP. We show that, at 310 K (the temperature of interest in biological applications), the structure of the lipid nanoparticles is similar to that of a liquid droplet, in which the lipids show no nanostructuration and have high mobility. We show that, for large enough nanoparticles, the hydrophilic headgroups develop an interior surface in the NP core that stores liquid water. The surfactant is shown to organize in an inhomogeneous way at the LNP surface, with patches with high surfactant concentrations and surface patches not covered by surfactant.

  4. Simulating spin models on GPU

    Science.gov (United States)

    Weigel, Martin

    2011-09-01

    Over the last couple of years it has been realized that the vast computational power of graphics processing units (GPUs) could be harvested for purposes other than the video game industry. This power, which at least nominally exceeds that of current CPUs by large factors, results from the relative simplicity of the GPU architectures as compared to CPUs, combined with a large number of parallel processing units on a single chip. To benefit from this setup for general computing purposes, the problems at hand need to be prepared in a way to profit from the inherent parallelism and hierarchical structure of memory accesses. In this contribution I discuss the performance potential for simulating spin models, such as the Ising model, on GPU as compared to conventional simulations on CPU.
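    A CPU-side sketch of the checkerboard (red-black) Metropolis update that makes the Ising model GPU-friendly: all sites of one parity can be updated simultaneously because their neighbours all have the other parity. Lattice size and coupling below are illustrative:

```python
import numpy as np

# Hedged sketch of a checkerboard Metropolis sweep for the 2D Ising model.
# On a GPU each parity class maps to one fully parallel kernel launch.
rng = np.random.default_rng(0)
L, beta = 64, 0.44                               # near the 2D critical coupling
spins = rng.choice((-1, 1), size=(L, L))
parity = (np.add.outer(np.arange(L), np.arange(L)) % 2).astype(bool)

def half_sweep(mask):
    nn = (np.roll(spins, 1, 0) + np.roll(spins, -1, 0)
          + np.roll(spins, 1, 1) + np.roll(spins, -1, 1))
    dE = 2.0 * spins * nn                        # energy cost of flipping each spin
    flip = mask & (rng.random((L, L)) < np.exp(-beta * dE))  # Metropolis rule
    spins[flip] *= -1

for _ in range(500):
    half_sweep(parity)                           # update "black" sites in parallel
    half_sweep(~parity)                          # then "white" sites
print(f"magnetization per spin: {spins.mean():+.3f}")
```

    The key design point is data independence within each half sweep: no updated site reads a neighbour updated in the same step, so the update order inside a parity class does not matter — exactly the property a massively parallel device needs.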

  5. Evaluation of vertical and lateral flow through agricultural loessial hillslopes using a two-dimensional computer simulation model

    NARCIS (Netherlands)

    Ritsema, C.J.; Oostindie, K.; Stolte, J.

    1996-01-01

    On four hill-slopes in the loess region of the Netherlands pressure heads were monitored during rain events with time intervals of five minutes. Water flow through these hill-slopes during erosive rain events in summer and winter was simulated two-dimensionally. These simulations showed that

  6. Simulation - modeling - experiment; Simulation - modelisation - experience

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-07-01

    After two workshops held in 2001 on the same topics, and in order to make a status of the advances in the domain of simulation and measurements, the main goals proposed for this workshop are: the presentation of the state-of-the-art of tools, methods and experiments in the domains of interest of the Gedepeon research group, the exchange of information about the possibilities of use of computer codes and facilities, about the understanding of physical and chemical phenomena, and about development and experiment needs. This document gathers 18 presentations (slides) among the 19 given at this workshop and dealing with: the deterministic and stochastic codes in reactor physics (Rimpault G.); MURE: an evolution code coupled with MCNP (Meplan O.); neutronic calculation of future reactors at EdF (Lecarpentier D.); advance status of the MCNP/TRIO-U neutronic/thermal-hydraulics coupling (Nuttin A.); the FLICA4/TRIPOLI4 thermal-hydraulics/neutronics coupling (Aniel S.); methods of disturbances and sensitivity analysis of nuclear data in reactor physics, application to VENUS-2 experimental reactor (Bidaud A.); modeling for the reliability improvement of an ADS accelerator (Biarotte J.L.); residual gas compensation of the space charge of intense beams (Ben Ismail A.); experimental determination and numerical modeling of phase equilibrium diagrams of interest in nuclear applications (Gachon J.C.); modeling of irradiation effects (Barbu A.); elastic limit and irradiation damage in Fe-Cr alloys: simulation and experiment (Pontikis V.); experimental measurements of spallation residues, comparison with Monte-Carlo simulation codes (Fallot M.); the spallation target-reactor coupling (Rimpault G.); tools and data (Grouiller J.P.); models in high energy transport codes: status and perspective (Leray S.); other ways of investigation for spallation (Audoin L.); neutrons and light particles production at intermediate energies (20-200 MeV) with iron, lead and uranium targets (Le Colley F

  7. Assessment of CREAMS [Chemicals, Runoff, and Erosion from Agricultural Management Systems] and ERHYM-II [Ekalaka Rangeland Hydrology and Yield Model] computer models for simulating soil water movement on the Idaho National Engineering Laboratory

    International Nuclear Information System (INIS)

    Laundre, J.W.

    1990-05-01

    The major goal of radioactive waste management is long-term containment of radioactive waste. Long-term containment is dependent on understanding water movement on, into, and through trench caps. Several computer simulation models are available for predicting water movement. Of the several computer models available, CREAMS (Chemicals, Runoff, and Erosion from Agricultural Management Systems) and ERHYM-II (Ekalaka Rangeland Hydrology and Yield Model) were tested for use on the Idaho National Engineering Laboratory (INEL). The models were calibrated, tested for sensitivity, and used to evaluate some basic trench cap designs. Each model was used to postdict soil moisture, evapotranspiration, and runoff of two watersheds for which such data were already available. Sensitivity of the models was tested by adjusting various input parameters from high to low values and then comparing model outputs to those generated from average values. Ten input parameters of the CREAMS model were tested for sensitivity. 17 refs., 23 figs., 20 tabs

  8. Moment aberrations in magneto-electrostatic plasma lenses (computer simulation)

    CERN Document Server

    Butenko, V I

    2001-01-01

    In this work, moment aberrations in plasma magneto-electrostatic lenses are considered in more detail with the use of computer modeling. To solve the problem we have developed a special computer code - a model of a plasma-optical focusing device - that displays the main parameters and operation of an experimental sample of a lens, simulates the moment and geometrical aberrations, and gives recommendations on their elimination.

  9. Computer simulation of grain growth in HAZ

    Science.gov (United States)

    Gao, Jinhua

    Two different models for Monte Carlo simulation of normal grain growth in metals and alloys were developed. Each simulation model was based on a different approach to couple the Monte Carlo simulation time to real time-temperature. These models demonstrated the applicability of Monte Carlo simulation to grain growth in materials processing. A grain boundary migration (GBM) model coupled the Monte Carlo simulation to a first principle grain boundary migration model. The simulation results, by applying this model to isothermal grain growth in zone-refined tin, showed good agreement with experimental results. An experimental data based (EDB) model coupled the Monte Carlo simulation with grain growth kinetics obtained from the experiment. The results of the application of the EDB model to the grain growth during continuous heating of a beta titanium alloy correlated well with experimental data. In order to acquire the grain growth kinetics from the experiment, a new mathematical method was developed and utilized to analyze the experimental data on isothermal grain growth. Grain growth in the HAZ of 0.2% Cu-Al alloy was successfully simulated using the EDB model combined with grain growth kinetics obtained from the experiment and measured thermal cycles from the welding process. The simulated grain size distribution in the HAZ was in good agreement with experimental results. The pinning effect of second phase particles on grain growth was also simulated in this work. The simulation results confirmed that by introducing the variable R, the degree of contact between grain boundaries and second phase particles, the Zener pinning model can be modified as D/r = K/(Rf), where D is the pinned grain size, r the mean size of the second phase particles, K a constant, and f the area fraction (or the volume fraction in 3-D) of the second phase.

  10. Structural Composites Corrosive Management by Computational Simulation

    Science.gov (United States)

    Chamis, Christos C.; Minnetyan, Levon

    2006-01-01

    A simulation of corrosive management on polymer composites durability is presented. The corrosive environment is assumed to manage the polymer composite degradation on a ply-by-ply basis. The degradation is correlated with a measured pH factor and is represented by voids, temperature, and moisture which vary parabolically for voids and linearly for temperature and moisture through the laminate thickness. The simulation is performed by a computational composite mechanics computer code which includes micro, macro, combined stress failure, and laminate theories. This accounts for starting the simulation from constitutive material properties and up to the laminate scale which exposes the laminate to the corrosive environment. Results obtained for one laminate indicate that the ply-by-ply managed degradation degrades the laminate down to the last ply or the last several plies. Results also demonstrate that the simulation is applicable to other polymer composite systems as well.

  11. Computer simulation models for teaching and learning Modelos de simulación en salud : una alternativa para la docencia

    Directory of Open Access Journals (Sweden)

    Juan Gonzálo Restrepo Salazar

    1997-04-01

    Full Text Available Computer programs are being used for teaching and learning of pharmacology and physiology at the University of Antioquia, in Medellín, Colombia. They should be more widely used since they offer clear advantages over traditional systems of teaching; they allow direct presentations of models in motion, as well as a more active, interesting and flexible way of learning; besides, they can save time and cut costs. Computer simulation models are learning programs for teaching subjects such as pharmacology and physiology to students in the health sciences and the basic biomedical sciences. Simulation experiments can be used to support teaching and, in some circumstances, as an alternative to laboratory practicals. The computer technology now available permits the direct presentation of models in motion and makes possible a form of learning that is less passive, more efficient, and more interesting. The simulated tissue response is generated either from the results of actual experiments or from predictive models, and is presented on screen with high-resolution graphics comparable to real situations. Students can perform simulated experiments, easily change their parameters, and obtain information just as if they had carried out the experiment in the laboratory.

  12. Time reversibility, computer simulation, and chaos

    CERN Document Server

    Hoover, William Graham

    1999-01-01

    A small army of physicists, chemists, mathematicians, and engineers has joined forces to attack a classic problem, the "reversibility paradox", with modern tools. This book describes their work from the perspective of computer simulation, emphasizing the author's approach to the problem of understanding the compatibility, and even inevitability, of the irreversible second law of thermodynamics with an underlying time-reversible mechanics. Computer simulation has made it possible to probe reversibility from a variety of directions and "chaos theory" or "nonlinear dynamics" has supplied a useful
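    The reversibility theme can be demonstrated in a few lines: a time-reversible integrator run forward, with momenta then negated, retraces its trajectory back to the initial state up to round-off. A minimal sketch with a harmonic oscillator and velocity Verlet (parameters illustrative):

```python
import numpy as np

# Hedged sketch: velocity Verlet is time-reversible, so negating the
# velocity and re-integrating recovers the starting state to round-off.
def verlet(x, v, dt, n, force=lambda x: -x):   # unit-mass harmonic oscillator
    for _ in range(n):
        a = force(x)
        x = x + v * dt + 0.5 * a * dt * dt
        v = v + 0.5 * (a + force(x)) * dt
    return x, v

x0, v0, dt, n = 1.0, 0.0, 0.01, 10_000
x1, v1 = verlet(x0, v0, dt, n)                 # forward trajectory
x2, v2 = verlet(x1, -v1, dt, n)                # reverse the momenta, run again
print(f"returned to x={x2:.12f}, v={-v2:.12f} (started at x={x0}, v={v0})")
```

    The paradox the book addresses is that for chaotic many-body systems this reversal fails in practice — tiny round-off errors grow exponentially — which is one computational route to reconciling reversible mechanics with the second law.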

  13. Noise simulation in cone beam CT imaging with parallel computing

    International Nuclear Information System (INIS)

    Tu, S.-J.; Shaw, Chris C; Chen, Lingyun

    2006-01-01

    We developed a computer noise simulation model for cone beam computed tomography imaging using a general purpose PC cluster. This model uses a mono-energetic x-ray approximation and allows us to investigate three primary performance components, specifically quantum noise, detector blurring and additive system noise. A parallel random number generator based on the Weyl sequence was implemented in the noise simulation and a visualization technique was accordingly developed to validate the quality of the parallel random number generator. In our computer simulation model, three-dimensional (3D) phantoms were mathematically modelled and used to create 450 analytical projections, which were then sampled into digital image data. Quantum noise was simulated and added to the analytical projection image data, which were then filtered to incorporate flat panel detector blurring. Additive system noise was generated and added to form the final projection images. The Feldkamp algorithm was implemented and used to reconstruct the 3D images of the phantoms. A 24 dual-Xeon PC cluster was used to compute the projections and reconstructed images in parallel with each CPU processing 10 projection views for a total of 450 views. Based on this computer simulation system, simulated cone beam CT images were generated for various phantoms and technique settings. Noise power spectra for the flat panel x-ray detector and reconstructed images were then computed to characterize the noise properties. As an example among the potential applications of our noise simulation model, we showed that images of low contrast objects can be produced and used for image quality evaluation
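    A hedged sketch of the quantum-noise step described above: treat each pixel of an analytical projection as a Poisson count and add Gaussian system noise; detector blurring, scan geometry, and reconstruction are omitted, and all magnitudes are illustrative:

```python
import numpy as np

# Hedged sketch of simulating projection noise for a mono-energetic beam:
# Beer-Lambert attenuation -> Poisson quantum noise -> additive system noise.
rng = np.random.default_rng(7)
I0 = 1.0e4                                    # unattenuated photons per pixel
line_integrals = rng.uniform(0.0, 3.0, size=(256, 256))   # stand-in phantom data

expected = I0 * np.exp(-line_integrals)       # mean detected counts
noisy = rng.poisson(expected).astype(float)   # quantum (Poisson) noise
noisy += rng.normal(0.0, 5.0, noisy.shape)    # additive system noise

projection = -np.log(np.clip(noisy, 1.0, None) / I0)      # back to line integrals
print(f"rms error: {np.sqrt(np.mean((projection - line_integrals)**2)):.4f}")
```

    Repeating this per view is trivially parallel — each of the 450 projections is independent — which is why the workload maps so naturally onto a PC cluster, provided the parallel random number streams are independent (the role of the Weyl-sequence generator above).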

  14. Autonomous Micro-Modular Mobile Data Center Cloud Computing Study for Modeling, Simulation, Information Processing and Cyber-Security Viability

    Data.gov (United States)

    National Aeronautics and Space Administration — Cloud computing environments offer opportunities for malicious users to penetrate security layers and damage, destroy or steal data. This ability can be exploited to...

  15. Computer modeling of human decision making

    Science.gov (United States)

    Gevarter, William B.

    1991-01-01

    Models of human decision making are reviewed. Models which treat just the cognitive aspects of human behavior are included as well as models which include motivation. Both models which have associated computer programs, and those that do not, are considered. Since flow diagrams, that assist in constructing computer simulation of such models, were not generally available, such diagrams were constructed and are presented. The result provides a rich source of information, which can aid in construction of more realistic future simulations of human decision making.

  16. Modeling and Simulation for Safeguards

    Energy Technology Data Exchange (ETDEWEB)

    Swinhoe, Martyn T. [Los Alamos National Laboratory

    2012-07-26

    The purpose of this talk is to give an overview of the role of modeling and simulation in Safeguards R&D and to introduce you to (some of) the tools used. Some definitions are: (1) Modeling - the representation, often mathematical, of a process, concept, or operation of a system, often implemented by a computer program; (2) Simulation - the representation of the behavior or characteristics of one system through the use of another system, especially a computer program designed for the purpose; and (3) Safeguards - the timely detection of diversion of significant quantities of nuclear material. The roles of modeling and simulation are: (1) calculating amounts of material (plant modeling); (2) calculating signatures of nuclear material, etc. (source terms); and (3) assessing detector performance (radiation transport and detection). Plant modeling software (e.g., FACSIM) gives the flows and amounts of material stored at all parts of the process. In safeguards this allows us to calculate the expected uncertainty of the mass and evaluate the expected MUF (material unaccounted for). We can determine the measurement accuracy required to achieve a certain performance.

  17. Perspective: Computer simulations of long time dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Elber, Ron [Department of Chemistry, The Institute for Computational Engineering and Sciences, University of Texas at Austin, Austin, Texas 78712 (United States)

    2016-02-14

    Atomically detailed computer simulations of complex molecular events have attracted the imagination of many researchers in the field as providing comprehensive information on chemical, biological, and physical processes. However, one of the greatest limitations of these simulations is that of time scales. The physical time scales accessible to straightforward simulations are too short to address many interesting and important molecular events. In the last decade significant advances were made in different directions (theory, software, and hardware) that significantly expand the capabilities and accuracies of these techniques. This perspective describes and critically examines some of these advances.

  18. Challenges & Roadmap for Beyond CMOS Computing Simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Rodrigues, Arun F. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Frank, Michael P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-12-01

    Simulating HPC systems is a difficult task and the emergence of “Beyond CMOS” architectures and execution models will increase that difficulty. This document presents a “tutorial” on some of the simulation challenges faced by conventional and non-conventional architectures (Section 1) and goals and requirements for simulating Beyond CMOS systems (Section 2). These provide background for proposed short- and long-term roadmaps for simulation efforts at Sandia (Sections 3 and 4). Additionally, a brief explanation of a proof-of-concept integration of a Beyond CMOS architectural simulator is presented (Section 2.3).

  19. Neuromusculoskeletal computer modeling and simulation of upright, straight-legged, bipedal locomotion of Australopithecus afarensis (A.L. 288-1).

    Science.gov (United States)

    Nagano, Akinori; Umberger, Brian R; Marzke, Mary W; Gerritsen, Karin G M

    2005-01-01

    The skeleton of Australopithecus afarensis (A.L. 288-1, better known as "Lucy") is by far the most complete record of locomotor morphology of early hominids currently available. Even though researchers agree that the postcranial skeleton of Lucy shows morphological features indicative of bipedality, only a few studies have investigated Lucy's bipedal locomotion itself. Lucy's energy expenditure during locomotion has been the topic of much speculation, but has not been investigated, except for several estimates derived from experimental data collected on other animals. To gain further insights into how Lucy may have walked, we generated a full three-dimensional (3D) reconstruction and forward-dynamic simulation of upright bipedal locomotion of this ancient human ancestor. Laser-scanned 3D bone geometries were combined with state-of-the-art neuromusculoskeletal modeling and simulation techniques from computational biomechanics. A detailed full 3D neuromusculoskeletal model was developed that encompassed all major bones, joints (10), and muscles (52) of the lower extremity. A model of muscle force and heat production was used to actuate the musculoskeletal system, and to estimate total energy expenditure during locomotion. Neural activation profiles for each of the 52 muscles that produced a single step of locomotion, while at the same time minimizing the energy consumed per meter traveled, were searched through numerical optimization. The numerical optimization resulted in smooth locomotor kinematics, and the predicted energy expenditure was appropriate for upright bipedal walking in an individual of Lucy's body size. (c) 2004 Wiley-Liss, Inc.

  20. Modeling and simulation of discrete event systems

    CERN Document Server

    Choi, Byoung Kyu

    2013-01-01

    Computer modeling and simulation (M&S) allows engineers to study and analyze complex systems. Discrete-event system (DES)-M&S is used in modern management, industrial engineering, computer science, and the military. As computer speeds and memory capacity increase, so DES-M&S tools become more powerful and more widely used in solving real-life problems. Based on over 20 years of evolution within a classroom environment, as well as on decades-long experience in developing simulation-based solutions for high-tech industries, Modeling and Simulation of Discrete-Event Systems is the only book on

  1. All Roads Lead to Computing: Making, Participatory Simulations, and Social Computing as Pathways to Computer Science

    Science.gov (United States)

    Brady, Corey; Orton, Kai; Weintrop, David; Anton, Gabriella; Rodriguez, Sebastian; Wilensky, Uri

    2017-01-01

    Computer science (CS) is becoming an increasingly diverse domain. This paper reports on an initiative designed to introduce underrepresented populations to computing using an eclectic, multifaceted approach. As part of a yearlong computing course, students engage in Maker activities, participatory simulations, and computing projects that…

  2. Discrete Event Simulation Computers can be used to simulate the ...

    Indian Academy of Sciences (India)

    IAS Admin

    systems and thereby study their performance. This article introduces you to the technique of discrete event simulation through a simple example. Introduction. Computers are playing an increasingly integral part in our daily lives. Communication, entertainment, finance, education, governance, health care … the list of areas ...

  3. Creation of an idealized nasopharynx geometry for accurate computational fluid dynamics simulations of nasal airflow in patient-specific models lacking the nasopharynx anatomy.

    Science.gov (United States)

    A T Borojeni, Azadeh; Frank-Ito, Dennis O; Kimbell, Julia S; Rhee, John S; Garcia, Guilherme J M

    2017-05-01

    Virtual surgery planning based on computational fluid dynamics (CFD) simulations has the potential to improve surgical outcomes for nasal airway obstruction patients, but the benefits of virtual surgery planning must outweigh the risks of radiation exposure. Cone beam computed tomography (CT) scans represent an attractive imaging modality for virtual surgery planning due to lower costs and lower radiation exposures compared with conventional CT scans. However, to minimize the radiation exposure, the cone beam CT sinusitis protocol sometimes images only the nasal cavity, excluding the nasopharynx. The goal of this study was to develop an idealized nasopharynx geometry for accurate representation of outlet boundary conditions when the nasopharynx geometry is unavailable. Anatomically accurate models of the nasopharynx created from 30 CT scans were intersected with planes rotated at different angles to obtain an average geometry. Cross sections of the idealized nasopharynx were approximated as ellipses with cross-sectional areas and aspect ratios equal to the average in the actual patient-specific models. CFD simulations were performed to investigate whether nasal airflow patterns were affected when the CT-based nasopharynx was replaced by the idealized nasopharynx in 10 nasal airway obstruction patients. Despite the simple form of the idealized geometry, all biophysical variables (nasal resistance, airflow rate, and heat fluxes) were very similar in the idealized vs patient-specific models. The results confirmed the expectation that the nasopharynx geometry has a minimal effect on the nasal airflow patterns during inspiration. The idealized nasopharynx geometry will be useful in future CFD studies of nasal airflow based on medical images that exclude the nasopharynx. Copyright © 2016 John Wiley & Sons, Ltd.
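
    The elliptical cross sections described above follow directly from the two preserved quantities. A minimal sketch with made-up numbers (not values from the study): given a cross-sectional area A and an aspect ratio r = a/b, the semi-axes satisfy A = pi*a*b, so b = sqrt(A/(pi*r)) and a = r*b.

    ```python
    import math

    def ellipse_semi_axes(area, aspect_ratio):
        """Semi-axes (a, b) of an ellipse with given area and a/b ratio."""
        b = math.sqrt(area / (math.pi * aspect_ratio))
        return aspect_ratio * b, b

    # Hypothetical averaged cross section: 200 mm^2 area, aspect ratio 1.5.
    a, b = ellipse_semi_axes(200.0, 1.5)
    print(f"a = {a:.2f} mm, b = {b:.2f} mm, area check = {math.pi * a * b:.1f}")
    ```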

  4. Computer simulations of phospholipid - membrane thermodynamic fluctuations

    DEFF Research Database (Denmark)

    Pedersen, U.R.; Peters, Günther H.j.; Schröder, T.B.

    2008-01-01

    This paper reports all-atom computer simulations of five phospholipid membranes, DMPC, DPPC, DMPG, DMPS, and DMPSH, with a focus on the thermal equilibrium fluctuations of volume, energy, area, thickness, and order parameter. For the slow fluctuations at constant temperature and pressure (defined...

  5. TOWARD END-TO-END MODELING FOR NUCLEAR EXPLOSION MONITORING: SIMULATION OF UNDERGROUND NUCLEAR EXPLOSIONS AND EARTHQUAKES USING HYDRODYNAMIC AND ANELASTIC SIMULATIONS, HIGH-PERFORMANCE COMPUTING AND THREE-DIMENSIONAL EARTH MODELS

    Energy Technology Data Exchange (ETDEWEB)

    Rodgers, A; Vorobiev, O; Petersson, A; Sjogreen, B

    2009-07-06

    This paper describes new research being performed to improve understanding of seismic waves generated by underground nuclear explosions (UNE) by using full waveform simulation, high-performance computing and three-dimensional (3D) earth models. The goal of this effort is to develop an end-to-end modeling capability to cover the range of wave propagation required for nuclear explosion monitoring (NEM) from the buried nuclear device to the seismic sensor. This work aims to improve understanding of the physical basis and prediction capabilities of seismic observables for NEM, including source and path-propagation effects. We are pursuing research along three main thrusts. Firstly, we are modeling the non-linear hydrodynamic response of geologic materials to underground explosions in order to better understand how source emplacement conditions impact the seismic waves that emerge from the source region and are ultimately observed hundreds or thousands of kilometers away. Empirical evidence shows that the amplitudes and frequency content of seismic waves at all distances are strongly impacted by the physical properties of the source region (e.g. density, strength, porosity). To model the near-source shock-wave motions of an UNE, we use GEODYN, an Eulerian Godunov (finite volume) code incorporating thermodynamically consistent non-linear constitutive relations, including cavity formation, yielding, porous compaction, tensile failure, bulking and damage. In order to propagate motions to seismic distances we are developing a one-way coupling method to pass motions to WPP (a Cartesian anelastic finite difference code). Preliminary investigations of UNEs in canonical materials (granite, tuff and alluvium) confirm that emplacement conditions have a strong effect on seismic amplitudes and the generation of shear waves. Specifically, we find that motions from an explosion in high-strength, low-porosity granite have high compressional wave amplitudes and weak

  6. Computer simulation as an operational and training aid

    International Nuclear Information System (INIS)

    Lee, D.J.; Tottman-Trayner, E.

    1995-01-01

    The paper describes how the rapid development of desktop computing power, the associated fall in prices, and the advancement of computer graphics technology driven by the entertainment industry have enabled the nuclear industry to achieve improvements in operation and training through the use of computer simulation. Applications are focused on the fuel handling operations at Torness Power Station, where visualization through computer modelling is being used to enhance operator awareness and to assist in a number of operational scenarios. It is concluded that there are significant benefits to be gained from the introduction of the facility at Torness as well as at other locations. (author)

  7. A computer-simulation study on the effects of MRI voxel dimensions on carotid plaque lipid-core and fibrous cap segmentation and stress modeling.

    Directory of Open Access Journals (Sweden)

    Harm A Nieuwstadt

    The benefits of a decreased slice thickness and/or in-plane voxel size in carotid MRI for atherosclerotic plaque component quantification accuracy and biomechanical peak cap stress analysis have not yet been investigated in detail because of practical limitations. In order to provide a methodology that allows such an investigation in detail, numerical simulations of a T1-weighted, contrast-enhanced, 2D MRI sequence were employed. Both the slice thickness (2 mm, 1 mm, and 0.5 mm) and the in-plane acquired voxel size (0.62x0.62 mm2 and 0.31x0.31 mm2) were varied. This virtual MRI approach was applied to 8 histology-based 3D patient carotid atherosclerotic plaque models. A decreased slice thickness did not result in major improvements in lumen, vessel wall, and lipid-rich necrotic core size measurements. At 0.62x0.62 mm2 in-plane, only a 0.5 mm slice thickness resulted in improved minimum fibrous cap thickness measurements (a 2-3 fold reduction in measurement error) and only marginally improved peak cap stress computations. Acquiring voxels of 0.31x0.31 mm2 in-plane, however, led to either similar or significantly larger improvements in plaque component quantification and computed peak cap stress. This study provides evidence that for currently-used 2D carotid MRI protocols, a decreased slice thickness might not be more beneficial for plaque measurement accuracy than a decreased in-plane voxel size. The MRI simulations performed indicate that not a reduced slice thickness (i.e., more isotropic imaging), but the acquisition of anisotropic voxels with a relatively smaller in-plane voxel size, could improve carotid plaque quantification and computed peak cap stress accuracy.

  8. External validation of type 2 diabetes computer simulation models: definitions, approaches, implications and room for improvement-a protocol for a systematic review.

    Science.gov (United States)

    Ogurtsova, Katherine; Heise, Thomas L; Linnenkamp, Ute; Dintsios, Charalabos-Markos; Lhachimi, Stefan K; Icks, Andrea

    2017-12-29

    Type 2 diabetes mellitus (T2DM), a highly prevalent chronic disease, puts a large burden on individual health and health care systems. Computer simulation models, used to evaluate the clinical and economic effectiveness of various interventions to handle T2DM, have become a well-established tool in diabetes research. Despite the broad consensus about the general importance of validation, especially external validation, as a crucial instrument of assessing and controlling for the quality of these models, there are no systematic reviews comparing such validation of diabetes models. As a result, the main objectives of this systematic review are to identify and appraise the different approaches used for the external validation of existing models covering the development and progression of T2DM. We will perform adapted searches by applying respective search strategies to identify suitable studies from 14 electronic databases. Retrieved study records will be included or excluded based on predefined eligibility criteria as defined in this protocol. Among others, a publication filter will exclude studies published before 1995. We will run abstract and full-text screenings and then extract data from all selected studies by filling in a predefined data extraction spreadsheet. We will undertake a descriptive, narrative synthesis of findings to address the study objectives. We will pay special attention to aspects of quality of these models in regard to the external validation based upon ISPOR and ADA recommendations as well as Mount Hood Challenge reports. All critical stages within the screening, data extraction and synthesis processes will be conducted by at least two authors. This protocol adheres to PRISMA and PRISMA-P standards. The proposed systematic review will provide a broad overview of the current practice in the external validation of models with respect to T2DM incidence and progression in humans built on simulation techniques. PROSPERO CRD42017069983.

  9. Computational fluid dynamic simulation of axial and radial flow membrane chromatography: mechanisms of non-ideality and validation of the zonal rate model.

    Science.gov (United States)

    Ghosh, Pranay; Vahedipour, Kaveh; Lin, Min; Vogel, Jens H; Haynes, Charles; von Lieres, Eric

    2013-08-30

    Membrane chromatography (MC) is increasingly being used as a purification platform for large biomolecules due to higher operational flow rates. The zonal rate model (ZRM) has previously been applied to accurately characterize the hydrodynamic behavior in commercial MC capsules at different configurations and scales. Explorations of capsule size, geometry and operating conditions using the model and experiment were used to identify possible causes of inhomogeneous flow and their contributions to band broadening. In the present study, the hydrodynamics within membrane chromatography capsules are more rigorously investigated by computational fluid dynamics (CFD). The CFD models are defined according to precisely measured capsule geometries in order to avoid the estimation of geometry-related model parameters. In addition to validating the assumptions and hypotheses regarding non-ideal flow mechanisms encoded in the ZRM, we show that CFD simulations can be used to mechanistically understand and predict non-binding breakthrough curves without the need to estimate any parameters. When applied to small-scale axial flow MC capsules, CFD simulations identify non-ideal flows in the distribution (hold-up) volumes upstream and downstream of the membrane stack as the major source of band broadening. For the large-scale radial flow capsule, the CFD model quantitatively predicts breakthrough data using binding parameters independently determined using the small-scale axial flow capsule, identifying structural irregularities within the membrane pleats as an important source of band broadening. The modeling and parameter determination scheme described here therefore facilitates a holistic mechanism-based method for model-based scale-up, obviating the need to perform expensive large-scale experiments under binding conditions. As the CFD model described provides a rich mechanistic analysis of membrane chromatography systems and the ability to explore operational space, but

  10. Structured building model reduction toward parallel simulation

    Energy Technology Data Exchange (ETDEWEB)

    Dobbs, Justin R. [Cornell University; Hencey, Brondon M. [Cornell University

    2013-08-26

    Building energy model reduction exchanges accuracy for improved simulation speed by reducing the number of dynamical equations. Parallel computing aims to improve simulation times without loss of accuracy but is poorly utilized by contemporary simulators and is inherently limited by inter-processor communication. This paper bridges these disparate techniques to implement efficient parallel building thermal simulation. We begin with a survey of three structured reduction approaches that compares their performance to a leading unstructured method. We then use structured model reduction to find thermal clusters in the building energy model and allocate processing resources. Experimental results demonstrate faster simulation and low error without any inter-processor communication.

  11. The Antares computing model

    Energy Technology Data Exchange (ETDEWEB)

    Kopper, Claudio, E-mail: claudio.kopper@nikhef.nl [NIKHEF, Science Park 105, 1098 XG Amsterdam (Netherlands)

    2013-10-11

    Completed in 2008, Antares is now the largest water Cherenkov neutrino telescope in the Northern Hemisphere. Its main goal is to detect neutrinos from galactic and extra-galactic sources. Due to the high background rate of atmospheric muons and the high level of bioluminescence, several on-line and off-line filtering algorithms have to be applied to the raw data taken by the instrument. To be able to handle this data stream, a dedicated computing infrastructure has been set up. The paper covers the main aspects of the current official Antares computing model. This includes an overview of on-line and off-line data handling and storage. In addition, the current usage of the “IceTray” software framework for Antares data processing is highlighted. Finally, an overview of the data storage formats used for high-level analysis is given.

  12. Interoceanic canal excavation scheduling via computer simulation

    International Nuclear Information System (INIS)

    Baldonado, Orlino C.

    1970-01-01

    The computer simulation language GPSS/360 was used to simulate the schedule of several nuclear detonation programs for the interoceanic canal project. The effects of using different weather restriction categories due to air blast and fallout were investigated. The effect of increasing the number of emplacement and stemming crews and the effect of varying the reentry period after detonating a row charge or salvo were also studied. Detonation programs were simulated for the proposed Routes 17A and 25E. The study demonstrates the method of using computer simulation so that a schedule and its associated constraints can be assessed for feasibility. Since many simulation runs can be made for a given set of detonation program constraints, one readily obtains an average schedule for a range of conditions. This provides a method for analyzing time-sensitive operations so that time and cost-effective operational schedules can be established. A comparison of the simulated schedules with those that were published shows them to be similar. (author)
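
    GPSS/360 itself is rarely available today, but the scheduling logic described can be reproduced in a few lines of any language. The sketch below is a toy Monte Carlo version in Python with entirely hypothetical parameters (salvo count, weather probability, crew and reentry times); it illustrates how many runs under one set of constraints yield an average schedule.

    ```python
    import random

    random.seed(7)

    def schedule_length(n_salvos, p_good_weather, prep_days, reentry_days):
        """One Monte Carlo realization of a detonation program, in days."""
        day = 0
        for _ in range(n_salvos):
            day += prep_days                  # emplacement and stemming work
            while random.random() > p_good_weather:
                day += 1                      # hold for air-blast/fallout weather
            day += reentry_days               # stand-off before crews re-enter
        return day

    runs = [schedule_length(25, 0.3, 5, 2) for _ in range(10000)]
    print(f"mean schedule length: {sum(runs) / len(runs):.1f} days")
    ```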

  13. Computational Intelligence for Medical Imaging Simulations.

    Science.gov (United States)

    Chang, Victor

    2017-11-25

    This paper describes how medical imaging can be simulated by computational intelligence to explore areas that cannot easily be reached by traditional methods, including gene and protein simulations related to cancer development and immunity. The paper presents simulations and virtual inspections of BIRC3, BIRC6, CCL4, KLKB1 and CYP2A6 with their outputs and explanations, as well as brain segment intensity due to dancing. Our proposed MapReduce framework with the fusion algorithm can simulate medical imaging. The concept is very similar to digital surface theories that simulate how biological units assemble into bigger units, up to the formation of the entire biological subject. The M-Fusion and M-Update functions of the fusion algorithm achieve good performance, processing and visualizing up to 40 GB of data within 600 s. We conclude that computational intelligence can provide effective and efficient healthcare research through simulation and visualization.

  14. An introduction to computer simulation methods applications to physical systems

    CERN Document Server

    Gould, Harvey; Christian, Wolfgang

    2007-01-01

    Now in its third edition, this book teaches physical concepts using computer simulations. The text incorporates object-oriented programming techniques and encourages readers to develop good programming habits in the context of doing physics. Designed for readers at all levels, An Introduction to Computer Simulation Methods uses Java, currently the most popular programming language. Introduction, Tools for Doing Simulations, Simulating Particle Motion, Oscillatory Systems, Few-Body Problems: The Motion of the Planets, The Chaotic Motion of Dynamical Systems, Random Processes, The Dynamics of Many Particle Systems, Normal Modes and Waves, Electrodynamics, Numerical and Monte Carlo Methods, Percolation, Fractals and Kinetic Growth Models, Complex Systems, Monte Carlo Simulations of Thermal Systems, Quantum Systems, Visualization and Rigid Body Dynamics, Seeing in Special and General Relativity, Epilogue: The Unity of Physics. For all readers interested in developing programming habits in the context of doing phy...

  15. Computer simulation of proton channelling in silicon

    Indian Academy of Sciences (India)

    2000-06-12

    This paper reports an indigenously developed computer code for channelling of fast ions in crystals using the Vineyard model and a screened binary Coulombic potential. For one system considered, an analytical estimate of 325 Å is in fair agreement with the computed value.

  16. Technology computer aided design simulation for VLSI MOSFET

    CERN Document Server

    Sarkar, Chandan Kumar

    2013-01-01

    Responding to recent developments and a growing VLSI circuit manufacturing market, Technology Computer Aided Design: Simulation for VLSI MOSFET examines advanced MOSFET processes and devices through TCAD numerical simulations. The book provides a balanced summary of TCAD and MOSFET basic concepts, equations, physics, and new technologies related to TCAD and MOSFET. A firm grasp of these concepts allows for the design of better models, thus streamlining the design process, saving time and money. This book places emphasis on the importance of modeling and simulations of VLSI MOS transistors and

  17. Performance predictions for solar-chemical convertors by computer simulation

    Energy Technology Data Exchange (ETDEWEB)

    Luttmer, J.D.; Trachtenberg, I.

    1985-08-01

    A computer model which simulates the operation of Texas Instruments solar-chemical convertor (SCC) was developed. The model allows optimization of SCC processes, materials, and configuration by facilitating decisions on tradeoffs among ease of manufacturing, power conversion efficiency, and cost effectiveness. The model includes various algorithms which define the electrical, electrochemical, and resistance parameters and which describe the operation of the discrete components of the SCC. Results of the model which depict the effect of material and geometric changes on various parameters are presented. The computer-calculated operation is compared with experimentally observed hydrobromic acid electrolysis rates.

  18. Highway traffic simulation on multi-processor computers

    Energy Technology Data Exchange (ETDEWEB)

    Hanebutte, U.R.; Doss, E.; Tentner, A.M.

    1997-04-01

    A computer model has been developed to simulate highway traffic for various degrees of automation with a high level of fidelity in regard to driver control and vehicle characteristics. The model simulates vehicle maneuvering in a multi-lane highway traffic system and allows for the use of Intelligent Transportation System (ITS) technologies such as an Automated Intelligent Cruise Control (AICC). The structure of the computer model facilitates the use of parallel computers for the highway traffic simulation, since domain decomposition techniques can be applied in a straightforward fashion. In this model, the highway system (i.e. a network of road links) is divided into multiple regions; each region is controlled by a separate link manager residing on an individual processor. A graphical user interface augments the computer model by allowing for real-time interactive simulation control and interaction with each individual vehicle and road side infrastructure element on each link. Average speed and traffic volume data are collected at user-specified loop detector locations. Further, as a measure of safety, the so-called Time To Collision (TTC) parameter is being recorded.
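
    The Time To Collision measure recorded by the model has a simple closed form; the sketch below is our illustration, not code from the paper. TTC is the current gap divided by the closing speed and is taken as infinite when the follower is not closing on the leader.

    ```python
    def time_to_collision(gap_m, v_follower, v_leader):
        """TTC in seconds; infinite when the gap is not closing."""
        closing_speed = v_follower - v_leader              # m/s
        return gap_m / closing_speed if closing_speed > 0 else float("inf")

    # Follower at 30 m/s, leader at 25 m/s, 40 m apart -> 8 s to collision.
    print(time_to_collision(40.0, 30.0, 25.0))
    ```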

  19. Fluid dynamics theory, computation, and numerical simulation

    CERN Document Server

    Pozrikidis, C

    2001-01-01

    Fluid Dynamics: Theory, Computation, and Numerical Simulation is the only available book that extends the classical field of fluid dynamics into the realm of scientific computing in a way that is both comprehensive and accessible to the beginner. The theory of fluid dynamics, and the implementation of solution procedures into numerical algorithms, are discussed hand-in-hand and with reference to computer programming. This book is an accessible introduction to theoretical and computational fluid dynamics (CFD), written from a modern perspective that unifies theory and numerical practice. There are several additions and subject expansions in the Second Edition of Fluid Dynamics, including new Matlab and FORTRAN codes. Two distinguishing features of the discourse are that solution procedures and algorithms are developed immediately after problem formulations are presented, and that numerical methods are introduced on a need-to-know basis and in increasing order of difficulty. Matlab codes are presented and discussed for a broad...

  20. PSH Transient Simulation Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Muljadi, Eduard [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-12-21

    PSH Transient Simulation Modeling presentation from the WPTO FY14 - FY16 Peer Review. Transient effects are an important consideration when designing a PSH system, yet numerical techniques for hydraulic transient analysis still need improvements for adjustable-speed (AS) reversible pump-turbine applications.

  1. Wake modeling and simulation

    DEFF Research Database (Denmark)

    Larsen, Gunner Chr.; Madsen Aagaard, Helge; Larsen, Torben J.

    We present a consistent, physically based theory for the wake meandering phenomenon, which we consider of crucial importance for the overall description of wind turbine loadings in wind farms. In its present version the model is confined to single wake situations. The model philosophy does, howev...... methodology has been implemented in the aeroelastic code HAWC2, and example simulations of wake situations, from the small Tjæreborg wind farm, have been performed showing satisfactory agreement between predictions and measurements...

  2. Computer Simulation of Sexual Selection on Age-Structured Populations

    Science.gov (United States)

    Martins, S. G. F.; Penna, T. J. P.

    Using computer simulations of a bit-string model for age-structured populations, we found that sexual selection of older males is advantageous, from an evolutionary point of view. These results are in opposition to a recent proposal of females choosing younger males. Our simulations are based on findings from recent studies of polygynous bird species. Since secondary sex characters are found mostly in males, we could make use of asexual populations that can be implemented in a fast and efficient way.
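
    The bit-string model referred to here is commonly known as the Penna model: bit i of an individual's genome, if set, switches on a deleterious mutation at age i, and the individual dies once the number of active mutations reaches a threshold. Below is a minimal asexual sketch in Python; the threshold, reproduction age, and carrying capacity are our assumptions, and the published study layers sexual selection on top of machinery of this kind.

    ```python
    import random

    random.seed(0)
    B = 32           # genome length in bits; bit i = mutation striking at age i
    T = 3            # death threshold on accumulated mutations (assumed)
    R = 8            # minimum reproduction age (assumed)
    N_MAX = 10000    # carrying capacity for the Verhulst factor (assumed)

    pop = [(0, 0) for _ in range(1000)]       # (age, genome), mutation-free

    for step in range(200):
        verhulst = 1.0 - len(pop) / N_MAX     # crowding survival probability
        new_pop = []
        for age, genome in pop:
            age += 1
            if age >= B:
                continue                      # maximum age reached
            active = bin(genome & ((1 << age) - 1)).count("1")
            if active >= T or random.random() > verhulst:
                continue                      # genetic death or crowding death
            new_pop.append((age, genome))
            if age >= R:                      # asexual birth, one new mutation
                new_pop.append((0, genome | (1 << random.randrange(B))))
        pop = new_pop

    print(f"population after 200 steps: {len(pop)}")
    ```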

  3. Computer simulation and image guidance for individualised dynamic spinal stabilization.

    Science.gov (United States)

    Kantelhardt, S R; Hausen, U; Kosterhon, M; Amr, A N; Gruber, K; Giese, A

    2015-08-01

    Dynamic implants for the human spine are used to re-establish regular segmental motion. However, the results have often been unsatisfactory and complications such as screw loosening are common. Individualisation of appliances and precision implantation are needed to improve the outcome of this procedure. Computer simulation, virtual implant optimisation and image guidance were used to improve the technique. A human lumbar spine computer model was developed using multi-body simulation software. The model simulates spinal motion under load and degenerative changes. After virtual degeneration of a L4/5 segment, virtual pedicle screw-based implants were introduced. The implants' positions and properties were iteratively optimised. The resulting implant positions were used as operative plan for image guidance and finally implemented in a physical spine model. In the simulation, the introduction and optimisation of virtually designed dynamic implants could partly compensate for the effects of virtual lumbar segment degeneration. The optimised operative plan was exported to two different image-guidance systems for transfer to a physical spine model. Three-dimensional computer graphic simulation is a feasible means to develop operative plans for dynamic spinal stabilization. These operative plans can be transferred to commercially available image-guidance systems for use in implantation of physical implants in a spine model. This concept has important potential in the design of operative plans and implants for individualised dynamic spine stabilization surgery.

  4. Computational biomechanics for medicine imaging, modeling and computing

    CERN Document Server

    Doyle, Barry; Wittek, Adam; Nielsen, Poul; Miller, Karol

    2016-01-01

    The Computational Biomechanics for Medicine titles provide an opportunity for specialists in computational biomechanics to present their latest methodologies and advancements. This volume comprises eighteen of the newest approaches and applications of computational biomechanics, from researchers in Australia, New Zealand, USA, UK, Switzerland, Scotland, France and Russia. Some of the interesting topics discussed are: tailored computational models; traumatic brain injury; soft-tissue mechanics; medical image analysis; and clinically-relevant simulations. One of the greatest challenges facing the computational engineering community is to extend the success of computational mechanics to fields outside traditional engineering, in particular to biology, the biomedical sciences, and medicine. We hope the research presented within this book series will contribute to overcoming this grand challenge.

  5. Field, model, and computer simulation study of some aspects of the origin and distribution of Colorado Plateau-type uranium deposits

    Science.gov (United States)

    Ethridge, F.G.; Sunada, D.K.; Tyler, Noel; Andrews, Sarah

    1982-01-01

    Numerous hypotheses have been proposed to account for the nature and distribution of tabular uranium and vanadium-uranium deposits of the Colorado Plateau. In one of these hypotheses it is suggested that the deposits resulted from geochemical reactions at the interface between a relatively stagnant groundwater solution and a dynamic, ore-carrying groundwater solution which permeated the host sandstones (Shawe, 1956; Granger, et al., 1961; Granger, 1968, 1976; and Granger and Warren, 1979). The study described here was designed to investigate some aspects of this hypothesis, particularly the nature of fluid flow in sands and sandstones, the nature and distribution of deposits, and the relations between the deposits and the host sandstones. The investigation, which was divided into three phases, involved physical model, field, and computer simulation studies. During the initial phase of the investigation, physical model studies were conducted in porous-media flumes. These studies verified that humic acid precipitates could form at the interface between a humic acid solution and a potassium aluminum sulfate solution and that the nature and distribution of these precipitates were related to flow phenomena and to the nature and distribution of the host porous media. During the second phase of the investigation, field studies of permeability and porosity patterns in Holocene stream deposits were conducted, and the data obtained were used to design more realistic porous-media models. These model studies, which simulated actual stream deposits, demonstrated that precipitates possess many characteristics, in terms of their nature and relation to host sandstones, that are similar to ore deposits of the Colorado Plateau. The final phase of the investigation involved field studies of actual deposits, additional model studies in a large indoor flume, and computer simulation studies. The field investigations provided an up-to-date interpretation of the depositional

  6. DNA computing models

    CERN Document Server

    Ignatova, Zoya; Zimmermann, Karl-Heinz

    2008-01-01

    In this excellent text, the reader is given a comprehensive introduction to the field of DNA computing. The book emphasizes computational methods to tackle central problems of DNA computing, such as controlling living cells, building patterns, and generating nanomachines.

  7. And So It Grows: Using a Computer-Based Simulation of a Population Growth Model to Integrate Biology & Mathematics

    Science.gov (United States)

    Street, Garrett M.; Laubach, Timothy A.

    2013-01-01

    We provide a 5E structured-inquiry lesson so that students can learn more of the mathematics behind the logistic model of population biology. By using models and mathematics, students understand how population dynamics can be influenced by relatively simple changes in the environment.
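
    For instructors who want a text-based counterpart to the simulation, the logistic model itself is only a few lines of Python. This is a generic sketch with made-up parameter values, not the lesson's code: it integrates dN/dt = rN(1 - N/K) with Euler steps and shows the population levelling off at the carrying capacity K.

    ```python
    def logistic_growth(n0, r, k, dt, steps):
        """Euler integration of the logistic model dN/dt = r*N*(1 - N/K)."""
        n, out = n0, [n0]
        for _ in range(steps):
            n += r * n * (1.0 - n / k) * dt
            out.append(n)
        return out

    # Illustrative parameters: 50 individuals, r = 0.4 per year, K = 1000.
    trajectory = logistic_growth(50.0, 0.4, 1000.0, 0.1, 200)
    print(f"N after 20 years ~ {trajectory[-1]:.0f}")   # levels off near K
    ```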

  8. Transient Modeling and Simulation of Compact Photobioreactors

    OpenAIRE

    Ribeiro, Robert Luis Lara; Mariano, André Bellin; Souza, Jeferson Avila; Vargas, Jose Viriato Coelho

    2008-01-01

    In this paper, a mathematical model is developed to make possible the simulation of microalgae growth and its dependency on medium temperature and light intensity. The model is utilized to simulate a compact photobioreactor response in time with physicochemical parameters of the microalgae Phaeodactylum tricornutum. The model allows for the prediction of the transient and local evolution of the biomass concentration in the photobioreactor with low computational time. As a result, the model is...

  9. Parallelized computation for computer simulation of electrocardiograms using personal computers with multi-core CPU and general-purpose GPU.

    Science.gov (United States)

    Shen, Wenfeng; Wei, Daming; Xu, Weimin; Zhu, Xin; Yuan, Shizhong

    2010-10-01

    Biological computations like electrocardiological modelling and simulation usually require high-performance computing environments. This paper introduces an implementation of parallel computation for computer simulation of electrocardiograms (ECGs) in a personal computer environment with an Intel CPU of Core (TM) 2 Quad Q6600 and a GPU of Geforce 8800GT, with software support by OpenMP and CUDA. It was tested in three parallelization device setups: (a) a four-core CPU without a general-purpose GPU, (b) a general-purpose GPU plus 1 core of CPU, and (c) a four-core CPU plus a general-purpose GPU. To effectively take advantage of a multi-core CPU and a general-purpose GPU, an algorithm based on load-prediction dynamic scheduling was developed and applied to setting (c). In the simulation with 1600 time steps, the speedup of the parallel computation as compared to the serial computation was 3.9 in setting (a), 16.8 in setting (b), and 20.0 in setting (c). This study demonstrates that a current PC with a multi-core CPU and a general-purpose GPU provides a good environment for parallel computations in biological modelling and simulation studies. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.
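
    The abstract does not spell out the load-prediction dynamic scheduling algorithm, so the Python sketch below only illustrates the general idea under our own assumptions: measure each device's recent throughput, keep a smoothed prediction, and hand the faster device a proportionally larger share of the next chunk of time steps. A real implementation would run the CPU and GPU parts concurrently (e.g. OpenMP threads plus CUDA streams) rather than sequentially as here.

    ```python
    import time

    def schedule(workers, total_steps, chunk=200):
        """Split each chunk among devices in proportion to predicted speed."""
        rates = {name: 1.0 for name in workers}   # steps/s, refined on the fly
        done = 0
        while done < total_steps:
            n = min(chunk, total_steps - done)
            total_rate = sum(rates.values())
            for name, run in workers.items():     # shares are approximate
                share = max(1, round(n * rates[name] / total_rate))
                t0 = time.perf_counter()
                run(share)                        # stand-in for the kernel
                dt = time.perf_counter() - t0
                rates[name] = 0.5 * rates[name] + 0.5 * share / dt
            done += n
        return rates

    # Hypothetical per-step costs: the "GPU" is five times faster than the "CPU".
    workers = {"cpu": lambda n: time.sleep(n * 1e-4),
               "gpu": lambda n: time.sleep(n * 2e-5)}
    print(schedule(workers, 2000))
    ```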

  10. Computer simulations improve university instructional laboratories.

    Science.gov (United States)

    Gibbons, Nicola J; Evans, Chris; Payne, Annette; Shah, Kavita; Griffin, Darren K

    2004-01-01

    Laboratory classes are commonplace and essential in biology departments but can sometimes be cumbersome, unreliable, and a drain on time and resources. As university intakes increase, pressure on budgets and staff time can often lead to reduction in practical class provision. Frequently, the ability to use laboratory equipment, mix solutions, and manipulate test animals are essential learning outcomes, and "wet" laboratory classes are thus appropriate. In others, however, interpretation and manipulation of the data are the primary learning outcomes, and here, computer-based simulations can provide a cheaper, easier, and less time- and labor-intensive alternative. We report the evaluation of two computer-based simulations of practical exercises: the first in chromosome analysis, the second in bioinformatics. Simulations can provide significant time savings to students (by a factor of four in our first case study) without affecting learning, as measured by performance in assessment. Moreover, under certain circumstances, performance can be improved by the use of simulations (by 7% in our second case study). We concluded that the introduction of these simulations can significantly enhance student learning where consideration of the learning outcomes indicates that it might be appropriate. In addition, they can offer significant benefits to teaching staff.

  12. Computer modeling of commercial refrigerated warehouse facilities

    International Nuclear Information System (INIS)

    Nicoulin, C.V.; Jacobs, P.C.; Tory, S.

    1997-01-01

    The use of computer models to simulate the energy performance of large commercial refrigeration systems typically found in food processing facilities is an area of engineering practice that has seen little development to date. Current techniques employed in predicting energy consumption by such systems have focused on temperature bin methods of analysis. Existing simulation tools such as DOE2 are designed to model commercial buildings and grocery store refrigeration systems. The HVAC and refrigeration system performance models in these simulation tools model equipment common to commercial buildings and groceries, and respond to energy-efficiency measures likely to be applied to these building types. The applicability of traditional building energy simulation tools to model refrigerated warehouse performance and analyze energy-saving options is limited. The paper will present the results of modeling work undertaken to evaluate energy savings resulting from incentives offered by a California utility to its Refrigerated Warehouse Program participants. The TRNSYS general-purpose transient simulation model was used to predict facility performance and estimate program savings. Custom TRNSYS components were developed to address modeling issues specific to refrigerated warehouse systems, including warehouse loading door infiltration calculations, an evaporator model, single-stage and multi-stage compressor models, evaporative condenser models, and defrost energy requirements. The main focus of the paper will be on the modeling approach. The results from the computer simulations, along with overall program impact evaluation results, will also be presented.

  13. Computational Models of Rock Failure

    Science.gov (United States)

    May, Dave A.; Spiegelman, Marc

    2017-04-01

    Practitioners in computational geodynamics, as in many other branches of applied science, typically do not analyse the underlying PDEs being solved in order to establish the existence or uniqueness of solutions. Rather, such proofs are left to the mathematicians, and all too frequently these results lag far behind (in time) the applied research being conducted, are often unintelligible to the non-specialist, are buried in journals applied scientists simply do not read, or simply have not been proven. As practitioners, we are by definition pragmatic. Thus, rather than first analysing our PDEs, we first attempt to find approximate solutions by throwing all our computational methods and machinery at the given problem and hoping for the best. Typically this approach leads to a satisfactory outcome. Usually it is only if the numerical solutions "look odd" that we start delving deeper into the math. In this presentation I summarise our findings in relation to using pressure-dependent (Drucker-Prager type) flow laws in a simplified model of continental extension in which the material is assumed to be an incompressible, highly viscous fluid. Such assumptions represent the current mainstream adopted in computational studies of mantle and lithosphere deformation within our community. In short, we conclude that for the parameter range of cohesion and friction angle relevant to studying rocks, the incompressibility constraint combined with a Drucker-Prager flow law can result in problems which have no solution. This is proven by a 1D analytic model and convincingly demonstrated by 2D numerical simulations. To date, we do not have a robust "fix" for this fundamental problem. The intent of this submission is to highlight the importance of simple analytic models, highlight some of the dangers/risks of interpreting numerical solutions without understanding the properties of the PDE we solved, and lastly to stimulate discussions to develop an improved computational model of
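
    For readers unfamiliar with the flow law at issue: one common form of the Drucker-Prager yield function is f = sqrt(J2) - (A - B*I1), where I1 is the trace of the stress tensor and J2 the second invariant of its deviator (sign conventions and parameter definitions vary between communities). The sketch below evaluates it for an assumed stress state; the constants are illustrative, not values from the presentation.

    ```python
    import math

    def drucker_prager_yield(sigma, A, B):
        """f = sqrt(J2) - (A - B*I1); compression negative, one common form."""
        i1 = sigma[0][0] + sigma[1][1] + sigma[2][2]
        mean = i1 / 3.0
        s = [[sigma[i][j] - (mean if i == j else 0.0) for j in range(3)]
             for i in range(3)]
        j2 = 0.5 * sum(s[i][j] * s[i][j] for i in range(3) for j in range(3))
        return math.sqrt(j2) - (A - B * i1)

    # Hypothetical uniaxial compression of 100 MPa; A and B stand in for
    # cohesion- and friction-angle-derived constants.
    stress = [[-100e6, 0.0, 0.0], [0.0, 0.0, 0.0], [0.0, 0.0, 0.0]]
    print(drucker_prager_yield(stress, A=10e6, B=0.3))   # f >= 0: yielding
    ```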

  14. Computational fluid dynamics for sport simulation

    CERN Document Server

    2009-01-01

    All over the world sport plays a prominent role in society: as a leisure activity for many, as an ingredient of culture, as a business and as a matter of national prestige in such major events as the World Cup in soccer or the Olympic Games. Hence, it is not surprising that science has entered the realm of sports, and, in particular, that computer simulation has become highly relevant in recent years. This is explored in this book by choosing five different sports as examples, demonstrating that computational science and engineering (CSE) can make essential contributions to research on sports topics on both the fundamental level and, eventually, by supporting athletes’ performance.

  15. A computer code to simulate X-ray imaging techniques

    Energy Technology Data Exchange (ETDEWEB)

    Duvauchelle, Philippe E-mail: philippe.duvauchelle@insa-lyon.fr; Freud, Nicolas; Kaftandjian, Valerie; Babot, Daniel

    2000-09-01

    A computer code was developed to simulate the operation of radiographic, radioscopic or tomographic devices. The simulation is based on ray-tracing techniques and on the X-ray attenuation law. The use of computer-aided drawing (CAD) models enables simulations to be carried out with complex three-dimensional (3D) objects and the geometry of every component of the imaging chain, from the source to the detector, can be defined. Geometric unsharpness, for example, can be easily taken into account, even in complex configurations. Automatic translations or rotations of the object can be performed to simulate radioscopic or tomographic image acquisition. Simulations can be carried out with monochromatic or polychromatic beam spectra. This feature enables, for example, the beam hardening phenomenon to be dealt with or dual energy imaging techniques to be studied. The simulation principle is completely deterministic and consequently the computed images present no photon noise. Nevertheless, the variance of the signal associated with each pixel of the detector can be determined, which enables contrast-to-noise ratio (CNR) maps to be computed, in order to predict quantitatively the detectability of defects in the inspected object. The CNR is a relevant indicator for optimizing the experimental parameters. This paper provides several examples of simulated images that illustrate some of the rich possibilities offered by our software. Depending on the simulation type, the computation time order of magnitude can vary from 0.1 s (simple radiographic projection) up to several hours (3D tomography) on a PC, with a 400 MHz microprocessor. Our simulation tool proves to be useful in developing new specific applications, in choosing the most suitable components when designing a new testing chain, and in saving time by reducing the number of experimental tests.
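
    The core of such a simulator is the attenuation law itself, and a CNR estimate follows once a detector noise model is chosen. A minimal monochromatic sketch, in which the material values and the Poisson noise assumption are ours rather than the authors':

    ```python
    import math

    def transmitted_intensity(i0, path):
        """Beer-Lambert attenuation of one ray.

        path lists (mu, thickness) pairs, mu in 1/cm and thickness in cm.
        """
        return i0 * math.exp(-sum(mu * t for mu, t in path))

    # Hypothetical 5 cm aluminium part (mu ~ 0.5 /cm) containing a 2 mm void:
    i_bulk = transmitted_intensity(1.0e6, [(0.5, 5.0)])
    i_defect = transmitted_intensity(1.0e6, [(0.5, 4.8)])
    noise = math.sqrt(i_bulk)          # Poisson noise estimate at the detector
    print(f"CNR = {(i_defect - i_bulk) / noise:.1f}")
    ```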

  16. A computer code to simulate X-ray imaging techniques

    International Nuclear Information System (INIS)

    Duvauchelle, Philippe; Freud, Nicolas; Kaftandjian, Valerie; Babot, Daniel

    2000-01-01

    A computer code was developed to simulate the operation of radiographic, radioscopic or tomographic devices. The simulation is based on ray-tracing techniques and on the X-ray attenuation law. The use of computer-aided drawing (CAD) models enables simulations to be carried out with complex three-dimensional (3D) objects and the geometry of every component of the imaging chain, from the source to the detector, can be defined. Geometric unsharpness, for example, can be easily taken into account, even in complex configurations. Automatic translations or rotations of the object can be performed to simulate radioscopic or tomographic image acquisition. Simulations can be carried out with monochromatic or polychromatic beam spectra. This feature enables, for example, the beam hardening phenomenon to be dealt with or dual energy imaging techniques to be studied. The simulation principle is completely deterministic and consequently the computed images present no photon noise. Nevertheless, the variance of the signal associated with each pixel of the detector can be determined, which enables contrast-to-noise ratio (CNR) maps to be computed, in order to predict quantitatively the detectability of defects in the inspected object. The CNR is a relevant indicator for optimizing the experimental parameters. This paper provides several examples of simulated images that illustrate some of the rich possibilities offered by our software. Depending on the simulation type, the computation time order of magnitude can vary from 0.1 s (simple radiographic projection) up to several hours (3D tomography) on a PC, with a 400 MHz microprocessor. Our simulation tool proves to be useful in developing new specific applications, in choosing the most suitable components when designing a new testing chain, and in saving time by reducing the number of experimental tests

  17. Bridging experiments, models and simulations

    DEFF Research Database (Denmark)

    Carusi, Annamaria; Burrage, Kevin; Rodríguez, Blanca

    2012-01-01

    understanding of living organisms and also how they can reduce, replace, and refine animal experiments. A fundamental requirement to fulfill these expectations and achieve the full potential of computational physiology is a clear understanding of what models represent and how they can be validated. The present...... of biovariability; 2) testing and developing robust techniques and tools as a prerequisite to conducting physiological investigations; 3) defining and adopting standards to facilitate the interoperability of experiments, models, and simulations; 4) and understanding physiological validation as an iterative process...... that contributes to defining the specific aspects of cardiac electrophysiology the MSE system targets, rather than being only an external test, and that this is driven by advances in experimental and computational methods and the combination of both....

  18. Computer simulation for integrated pest management of spruce budworms

    Science.gov (United States)

    Carroll B. Williams; Patrick J. Shea

    1982-01-01

    Some field studies of the effects of various insecticides on the spruce budworm (Choristoneura sp.) and their parasites have shown severe suppression of host (budworm) populations and increased parasitism after treatment. Computer simulation using hypothetical models of spruce budworm-parasite systems based on these field data revealed that (1)...

  19. Solving wood chip transport problems with computer simulation.

    Science.gov (United States)

    Dennis P. Bradley; Sharon A. Winsauer

    1976-01-01

    Efficient chip transport operations are difficult to achieve due to frequent and often unpredictable changes in distance to market, chipping rate, time spent at the mill, and equipment costs. This paper describes a computer simulation model that allows a logger to design an efficient transport system in response to these changing factors.

  20. The tension of framed membranes from computer simulations

    DEFF Research Database (Denmark)

    Hamkens, Daniel; Jeppesen, Claus; Ipsen, John H.

    2018-01-01

    We have analyzed the behavior of a randomly triangulated, self-avoiding surface model of a flexible, fluid membrane subject to a circular boundary by Wang-Landau Monte Carlo computer simulation techniques. The dependence of the canonical free energy and frame tension on the frame area...
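
    Wang-Landau sampling estimates the density of states g(E) with a random walk in energy, biased by the running estimate of g and refined whenever the energy histogram becomes flat. The paper's triangulated membrane is far too elaborate for a snippet, so the sketch below applies the same algorithm to a small 2D Ising lattice as a stand-in; the lattice size, flatness criterion, and stopping threshold are our own choices.

    ```python
    import math
    import random

    random.seed(3)
    L = 4                              # small 2D Ising lattice as a stand-in
    spins = [[1] * L for _ in range(L)]

    def bond_energy(i, j):
        """Energy of the four bonds attached to site (i, j)."""
        s = spins[i][j]
        return -s * (spins[(i + 1) % L][j] + spins[(i - 1) % L][j]
                     + spins[i][(j + 1) % L] + spins[i][(j - 1) % L])

    E = sum(bond_energy(i, j) for i in range(L) for j in range(L)) // 2
    lng, hist = {}, {}                 # running ln g(E) and visit histogram
    f = 1.0                            # ln of the modification factor

    while f > 1e-4:
        for _ in range(5000 * L * L):
            i, j = random.randrange(L), random.randrange(L)
            E_new = E - 2 * bond_energy(i, j)      # energy if (i, j) flips
            dlng = lng.get(E, 0.0) - lng.get(E_new, 0.0)
            if dlng >= 0.0 or random.random() < math.exp(dlng):
                spins[i][j] *= -1                  # accept with min(1, g/g')
                E = E_new
            lng[E] = lng.get(E, 0.0) + f
            hist[E] = hist.get(E, 0) + 1
        if min(hist.values()) > 0.8 * sum(hist.values()) / len(hist):
            f, hist = f / 2.0, {}                  # flat histogram: refine f

    g0 = min(lng.values())
    print(sorted((e, round(g - g0, 2)) for e, g in lng.items()))
    ```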

  1. Electromagnetic Modeling of the Propagation Characteristics of Satellite Communications Through Composite Precipitation Layers, Part II: Results of Computer Simulations

    Directory of Open Access Journals (Sweden)

    H.M. Al-Rizzo

    2000-12-01

    A versatile Propagation Simulation Program (PSP) is developed to assess the degrading effects caused by the concurrent occurrence of an arbitrary mixture of ice plates and needles, melting snow and raindrops which may impede the reliability of dual-polarized satellite communications systems carrying independent channels on a single radio path. Specifically, results are presented for the Cross Polarization Discrimination (XPD) due to ice and rain, the differential attenuation, ΔA, and differential phase shift, Δφ, due to rain, and the average specific attenuation, α, and phase shift, φ, due to the melting layer at hitherto unconsidered frequencies. The inclusion of an ice-cloud medium is found to have significant effects on rain-induced XPD even for low ice concentrations, particularly at low fade levels. The relative contribution of the melting layer to rain-induced attenuation is extensively studied for frequencies from 1 to 100 GHz and rain rates below 20 mm/h.

  2. Time reversibility, computer simulation, algorithms, chaos

    CERN Document Server

    Hoover, William Graham

    2012-01-01

    A small army of physicists, chemists, mathematicians, and engineers has joined forces to attack a classic problem, the "reversibility paradox", with modern tools. This book describes their work from the perspective of computer simulation, emphasizing the author's approach to the problem of understanding the compatibility, and even inevitability, of the irreversible second law of thermodynamics with an underlying time-reversible mechanics. Computer simulation has made it possible to probe reversibility from a variety of directions and "chaos theory" or "nonlinear dynamics" has supplied a useful vocabulary and a set of concepts, which allow a fuller explanation of irreversibility than that available to Boltzmann or to Green, Kubo and Onsager. Clear illustration of concepts is emphasized throughout, and reinforced with a glossary of technical terms from the specialized fields which have been combined here to focus on a common theme. The book begins with a discussion, contrasting the idealized reversibility of ba...

  3. Computational plasticity algorithm for particle dynamics simulations

    Science.gov (United States)

    Krabbenhoft, K.; Lyamin, A. V.; Vignes, C.

    2018-01-01

    The problem of particle dynamics simulation is interpreted in the framework of computational plasticity leading to an algorithm which is mathematically indistinguishable from the common implicit scheme widely used in the finite element analysis of elastoplastic boundary value problems. This algorithm provides somewhat of a unification of two particle methods, the discrete element method and the contact dynamics method, which usually are thought of as being quite disparate. In particular, it is shown that the former appears as the special case where the time stepping is explicit while the use of implicit time stepping leads to the kind of schemes usually labelled contact dynamics methods. The framing of particle dynamics simulation within computational plasticity paves the way for new approaches similar (or identical) to those frequently employed in nonlinear finite element analysis. These include mixed implicit-explicit time stepping, dynamic relaxation and domain decomposition schemes.
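
    The explicit special case mentioned above, the discrete element method, advances contacts with a penalty-style force evaluated at the current configuration. A minimal linear spring-dashpot sketch for the normal force between two discs (the stiffness and damping values are placeholders, not from the paper):

    ```python
    import math

    def normal_contact_force(x1, x2, r1, r2, v1, v2, k=1e5, c=50.0):
        """Force (fx, fy) on particle 1; zero when the discs do not overlap."""
        dx, dy = x2[0] - x1[0], x2[1] - x1[1]
        dist = math.hypot(dx, dy)
        overlap = r1 + r2 - dist
        if overlap <= 0.0:
            return 0.0, 0.0
        nx, ny = dx / dist, dy / dist                       # unit normal 1 -> 2
        vn = (v1[0] - v2[0]) * nx + (v1[1] - v2[1]) * ny    # closing speed
        fn = k * overlap + c * vn                           # spring + dashpot
        return -fn * nx, -fn * ny                           # pushes 1 away

    print(normal_contact_force((0, 0), (1.8, 0), 1.0, 1.0, (1.0, 0), (0, 0)))
    ```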

  4. Computer simulation of two-phase flow in nuclear reactors

    International Nuclear Information System (INIS)

    Wulff, W.

    1993-01-01

    Two-phase flow models dominate the economic resources required for the development and use of computer codes which serve to analyze thermohydraulic transients in nuclear power plants. An attempt is made to reduce the effort of analyzing reactor transients by combining purpose-oriented modelling with advanced computing techniques. Six principles are presented on mathematical modeling and the selection of numerical methods, along with suggestions on programming and machine selection, all aimed at reducing the cost of analysis. Computer simulation is contrasted with traditional computer calculation. The advantages of run-time interactive access in a simulation environment are demonstrated. It is explained that the drift-flux model is better suited than the two-fluid model for the analysis of two-phase flow in nuclear reactors, because of the latter's closure problems. The advantage of analytical over numerical integration is demonstrated. Modeling and programming techniques are presented which minimize the number of needed arithmetical and logical operations and thereby increase the simulation speed, while decreasing the cost. (orig.)
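
    The drift-flux model favoured here closes the two-phase problem with an algebraic slip relation instead of a second momentum equation. In the Zuber-Findlay form the void fraction follows from the superficial velocities as alpha = j_g / (C0*(j_g + j_l) + Vgj); the sketch below uses illustrative parameter values, not ones from this paper.

    ```python
    def void_fraction(j_g, j_l, c0=1.13, v_gj=0.24):
        """Zuber-Findlay drift-flux relation: alpha = j_g / (C0*j + Vgj).

        j_g, j_l are gas and liquid superficial velocities in m/s; C0 and
        Vgj are assumed distribution and drift parameters.
        """
        return j_g / (c0 * (j_g + j_l) + v_gj)

    # Hypothetical flow: 0.5 m/s vapour and 1.5 m/s liquid superficial velocity.
    print(f"void fraction ~ {void_fraction(0.5, 1.5):.2f}")
    ```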

  5. Computer simulation of complexity in plasmas

    International Nuclear Information System (INIS)

    Hayashi, Takaya; Sato, Tetsuya

    1998-01-01

    By making a comprehensive comparative study of many self-organizing phenomena occurring in magnetohydrodynamics and kinetic plasmas, we came up with a hypothetical grand view of self-organization. This assertion is confirmed by a recent computer simulation for a broader science field, specifically, the structure formation of short polymer chains, where the nature of the interaction is completely different from that of plasmas. It is found that the formation of the global orientation order proceeds stepwise. (author)

  6. Computer simulation of the aluminium extrusion process

    Directory of Open Access Journals (Sweden)

    A. Śliwa

    2017-01-01

    The purpose of this work is computer simulation of the aluminium extrusion process using the finite element method (FEM). The impact of the speed of the punch descending on the material in the aluminium extrusion process was investigated. It was found that if the punch descends too fast, high stresses are created, leading to material destruction. Applying FEM significantly shortens the design cycle in many industrial applications, which enhances productivity and profits.

  7. Computer simulation of displacement cascades in copper

    International Nuclear Information System (INIS)

    Heinisch, H.L.

    1983-06-01

    More than 500 displacement cascades in copper have been generated with the computer simulation code MARLOWE over an energy range pertinent to both fission and fusion neutron spectra. Three-dimensional graphical depictions of selected cascades, as well as quantitative analysis of cascade shapes and sizes and defect densities, illustrate cascade behavior as a function of energy. With increasing energy, the transition from production of single compact damage regions to widely spaced multiple damage regions is clearly demonstrated

  8. Computer simulation of a 3-phase induction motor

    International Nuclear Information System (INIS)

    Memon, N.A.; Unsworth, P.J.

    2004-01-01

    Computer simulation of a 3-phase squirrel-cage induction motor is presented in Microsoft QBASIC for understanding trends and various operational modes of an induction motor. A thyristor-fed, phase-controlled induction motor (three-wire) model has been simulated, in which voltage is applied to the motor stator winding through back-to-back connected thyristors acting as controlled switches in series with the stator. The simulated induction motor system opens up a wide range of investigation/analysis options for research and development work in the field. Key features of the simulation performed are highlighted to develop a better understanding of the work done. A complete study of an induction motor is presented, covering starting modes in terms of the voltage/current and torque/speed characteristics and their graphical representation. Ideal agreement of the simulation results with the notional outcome encourages users to go ahead with various hardware development projects based on the study through the simulation. (author)

  9. Validated physical models and parameters of bulk 3C–SiC aiming for credible technology computer aided design (TCAD) simulation

    Science.gov (United States)

    Arvanitopoulos, A.; Lophitis, N.; Gyftakis, K. N.; Perkins, S.; Antoniou, M.

    2017-10-01

    The cubic form of SiC (β- or 3C-), compared to the hexagonal α-SiC polytypes (primarily 4H- and 6H–SiC), has a lower growth cost and can be grown heteroepitaxially on large-area silicon (Si) wafers, which makes it of special interest. This, in conjunction with the recently reported growth of improved-quality 3C–SiC, makes the development of devices an imminent objective. However, the availability of models that accurately predict the material characteristics, properties and performance is an imperative requirement for the design and optimization of functional devices. The purpose of this study is to provide and validate a comprehensive set of models, alongside their parameters, for bulk 3C–SiC. The validation process revealed that the proposed models are in very good agreement with experimental data, and confidence ranges were identified. This is the first piece of work achieving that for 3C–SiC. Significantly, it constitutes the necessary step for finite element method simulations and technology computer aided design.

  10. Applied modelling and computing in social science

    CERN Document Server

    Povh, Janez

    2015-01-01

    In social science, outstanding results are yielded by advanced simulation methods based on state-of-the-art software technologies and an appropriate combination of qualitative and quantitative methods. This book presents examples of successful applications of modelling and computing in social science: business and logistic process simulation and optimization, deeper knowledge extraction from big data, better understanding and prediction of social behaviour, and modelling of health and environmental changes.

  11. A compositional reservoir simulator on distributed memory parallel computers

    International Nuclear Information System (INIS)

    Rame, M.; Delshad, M.

    1995-01-01

    This paper presents the application of distributed memory parallel computers to field-scale reservoir simulations using a parallel version of UTCHEM, The University of Texas Chemical Flooding Simulator. The model is a general-purpose, highly vectorized chemical compositional simulator that can simulate a wide range of displacement processes at both field and laboratory scales. The original simulator was modified to run on both distributed memory parallel machines (Intel iPSC/860 and Delta, Connection Machine 5, Kendall Square 1 and 2, and CRAY T3D) and a cluster of workstations. A domain decomposition approach has been taken towards parallelization of the code. A portion of the discrete reservoir model is assigned to each processor by a set-up routine that attempts a data layout as even as possible from the load-balance standpoint. Each of these subdomains is extended so that data can be shared between adjacent processors for stencil computation. The added routines that make parallel execution possible are written in a modular fashion that makes porting to new parallel platforms straightforward. Results of the distributed memory computing performance of the parallel simulator are presented for field-scale applications such as tracer floods and polymer floods. A comparison of the wall-clock times for the same problems on a vector supercomputer is also presented
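
    The ghost-cell extension of each subdomain described above can be illustrated with a short sketch (not from UTCHEM; a serial numpy stand-in for message passing, with a toy 3-point smoothing stencil):

        # Sketch: 1D domain decomposition with ghost (halo) cells. Each
        # subdomain is extended by one cell on each side so that a stencil
        # can be applied without reaching into a neighbor's memory.
        import numpy as np

        def split_with_ghosts(field, nparts):
            """Split a 1D field into subdomains, each padded with ghost cells."""
            chunks = np.array_split(field, nparts)
            padded = []
            for i, c in enumerate(chunks):
                left = chunks[i - 1][-1] if i > 0 else c[0]           # halo copy
                right = chunks[i + 1][0] if i < nparts - 1 else c[-1]
                padded.append(np.concatenate(([left], c, [right])))
            return padded

        def smooth_step(sub):
            """3-point stencil on the interior; ghost cells supply neighbors."""
            return 0.25 * sub[:-2] + 0.5 * sub[1:-1] + 0.25 * sub[2:]

        field = np.linspace(0.0, 1.0, 16) ** 2
        parts = split_with_ghosts(field, nparts=4)
        new_field = np.concatenate([smooth_step(p) for p in parts])
        print(new_field.shape)   # (16,): same grid after one smoothing sweep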

  12. Simulation modeling and analysis with Arena

    CERN Document Server

    Altiok, Tayfur

    2007-01-01

    Simulation Modeling and Analysis with Arena is a highly readable textbook which treats the essentials of the Monte Carlo discrete-event simulation methodology, and does so in the context of the popular Arena simulation environment. It treats simulation modeling as an in-vitro laboratory that facilitates the understanding of complex systems and experimentation with what-if scenarios in order to estimate their performance metrics. The book contains chapters on the simulation modeling methodology and the underpinnings of discrete-event systems, as well as the relevant underlying probability, statistics, stochastic processes, input analysis, model validation and output analysis. All simulation-related concepts are illustrated in numerous Arena examples, encompassing production lines, manufacturing and inventory systems, transportation systems, and computer information systems in networked settings. It introduces the concept of discrete-event Monte Carlo simulation, the most commonly used methodology for modeli...

  13. A Computational Model to Simulate Groundwater Seepage Risk in Support of Geotechnical Investigations of Levee and Dam Projects

    Science.gov (United States)

    2013-03-01

    they often have spatial (e.g., small map scale) or temporal (e.g., imagery from few time periods) resolutions that hinder ...

  14. Plasticity modeling & computation

    CERN Document Server

    Borja, Ronaldo I

    2013-01-01

    There have been many excellent books written on the subject of plastic deformation in solids, but rarely can one find a textbook on this subject. “Plasticity Modeling & Computation” is a textbook written specifically for students who want to learn the theoretical, mathematical, and computational aspects of inelastic deformation in solids. It adopts a simple narrative style that is not mathematically overbearing, and has been written to emulate a professor giving a lecture on this subject inside a classroom. Each section is written to provide a balance between the relevant equations and the explanations behind them. Where relevant, sections end with one or more exercises designed to reinforce the understanding of the “lecture.” Color figures enhance the presentation and make the book very pleasant to read. For professors planning to use this textbook for their classes, the contents are sufficient for Parts A and B that can be taught in sequence over a period of two semesters or quarters.

  15. An introduction to statistical computing a simulation-based approach

    CERN Document Server

    Voss, Jochen

    2014-01-01

    A comprehensive introduction to sampling-based methods in statistical computing. The use of computers in mathematics and statistics has opened up a wide range of techniques for studying otherwise intractable problems. Sampling-based simulation techniques are now an invaluable tool for exploring statistical models. This book gives a comprehensive introduction to the exciting area of sampling-based methods. An Introduction to Statistical Computing introduces the classical topics of random number generation and Monte Carlo methods. It also includes some advanced methods.
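
    The flavor of these sampling-based methods can be conveyed by a plain Monte Carlo estimator (an illustrative sketch, not an example taken from the book):

        # Sketch: Monte Carlo estimate of E[g(X)] with a standard-error estimate.
        import math
        import random

        def monte_carlo_mean(g, sampler, n=100_000, seed=42):
            """Estimate E[g(X)] and its standard error from n i.i.d. samples."""
            rng = random.Random(seed)
            total = total_sq = 0.0
            for _ in range(n):
                y = g(sampler(rng))
                total += y
                total_sq += y * y
            mean = total / n
            var = total_sq / n - mean * mean
            return mean, math.sqrt(var / n)

        # Example: E[exp(U)] for U ~ Uniform(0, 1); the exact value is e - 1.
        est, se = monte_carlo_mean(math.exp, lambda rng: rng.random())
        print(f"estimate = {est:.4f} +/- {se:.4f} (exact {math.e - 1:.4f})")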

  16. Dynamic computer simulation of the Fort St. Vrain steam turbines

    International Nuclear Information System (INIS)

    Conklin, J.C.

    1983-01-01

    A computer simulation is described for the dynamic response of the Fort St. Vrain nuclear reactor regenerative intermediate- and low-pressure steam turbines. The fundamental computer-modeling assumptions for the turbines and feedwater heaters are developed. A turbine heat balance specifying steam and feedwater conditions at a given generator load and the volumes of the feedwater heaters are all that are necessary as descriptive input parameters. Actual plant data for a generator load reduction from 100 to 50% power (which occurred as part of a plant transient on November 9, 1981) are compared with computer-generated predictions, with reasonably good agreement

  17. Integrating environmental taxes on local air pollutants with fiscal reform in Hungary: simulations with a computable general equilibrium model

    International Nuclear Information System (INIS)

    Morris, Glenn E.; Revesz, Tamas; Zalai, Ernoe; Fucsko, Jozsef

    1999-01-01

    This paper describes the Fiscal Environmental Integration Model (FEIM) and its use to examine the merits of introducing a set of air pollutant emission taxes and stringent abatement requirements based on best commonly available control technology. These environmental protection strategies are examined both independently and in combination. In addition, Hungary has very high VAT, employment, and income tax rates and therefore might receive more than the usual advantage from using environmental tax revenues to reduce other taxes. We therefore also examine the economic and environmental implications of different uses of the revenues generated by the air pollutant emission taxes. FEIM is a CGE model of the Hungarian economy that includes sectoral air pollution control cost functions and execution options that allow examination of the key policy choices involved. We developed and ran a baseline and seven scenarios with FEIM. The scenarios centered on introduction of environmental load fees (ELF) on emissions of SO2, NOx, and particulates, and emission abatement requirements (EAR) for these pollutants. (Author)

  18. Models of optical quantum computing

    Directory of Open Access Journals (Sweden)

    Krovi Hari

    2017-03-01

    I review some work on models of quantum computing, optical implementations of these models, as well as the associated computational power. In particular, we discuss the circuit model and cluster state implementations using quantum optics with various encodings such as dual rail encoding, Gottesman-Kitaev-Preskill encoding, and coherent state encoding. Then we discuss intermediate models of optical computing such as boson sampling and its variants. Finally, we review some recent work in optical implementations of adiabatic quantum computing and analog optical computing. We also provide a brief description of the relevant aspects from complexity theory needed to understand the results surveyed.

  19. Integrated Computational Tools for Identification of CCR5 Antagonists as Potential HIV-1 Entry Inhibitors: Homology Modeling, Virtual Screening, Molecular Dynamics Simulations and 3D QSAR Analysis

    Directory of Open Access Journals (Sweden)

    Suri Moonsamy

    2014-04-01

    Using integrated in silico computational techniques, including homology modeling, structure-based and pharmacophore-based virtual screening, molecular dynamics simulations, per-residue energy decomposition analysis and atom-based 3D-QSAR analysis, we proposed ten novel compounds as potential CCR5-dependent HIV-1 entry inhibitors. Via validated docking calculations, binding free energies revealed that the novel leads demonstrated better binding affinities with CCR5 compared to maraviroc, an FDA-approved HIV-1 entry inhibitor in clinical use. Per-residue interaction energy decomposition analysis on the averaged MD structure showed that the hydrophobic active-site residues Trp86, Tyr89 and Tyr108 contributed the most to inhibitor binding. The validated 3D-QSAR model showed a high cross-validated correlation coefficient (r^2_cv) of 0.84 using three principal components and a non-cross-validated r^2 value of 0.941. It was also revealed that almost all compounds in the test and training sets yielded good predicted values. Information gained from this study could shed light on the activity of a new series of lead compounds as potential HIV entry inhibitors and serve as a powerful tool in the drug design and development machinery.

  20. Simulation of a welding process in polyduct pipelines resolved with a finite elements computational model. Comparison with analytical solutions and tests with thermocouples

    International Nuclear Information System (INIS)

    Sanzi, H; Elvira, G; Kloster, M; Asta, E; Zalazar, M

    2006-01-01

    All welding processes induce distortions and thermal stresses, which must be evaluated correctly since they can influence a component's structural integrity. This work determines the distribution of temperatures that develops during a manual welding process with shielded electrodes (SMAW) on the circumferential seam of a pipe for use in polyducts. A simplified finite element (FEA) model using three-dimensional solids is proposed for the study. The analysis considers that while the welding process is underway no heat is lost to the environment, that is, adiabatic conditions are assumed, and that the transformations produced in the material due to phase changes do not modify the properties of the supporting or base materials. The results of the simulation are compared with those obtained in recent analytical studies developed by different investigators, such as Nguyen, Ohta, Matsuoka, Suzuki and Taeda, where a continuously moving three-dimensional double-ellipsoidal source was used. The results are then compared with experimental measurements taken with thermocouples. This study reveals the sensitivity and validity of the proposed computer model and, in a second stage, optimizes the engineering time needed to solve a problem like the one presented in order to design the corresponding welding procedure (CW)

  1. DEVELOPMENT OF BIOSURFACTANT-MEDIATED OIL RECOVERY IN MODEL POROUS SYSTEMS AND COMPUTER SIMULATIONS OF BIOSURFACTANT-MEDIATED OIL RECOVERY

    Energy Technology Data Exchange (ETDEWEB)

    M.J. McInerney; S.K. Maudgalya; R. Knapp; M. Folmsbee

    2004-05-31

    Current technology recovers only one-third to one-half of the oil that is originally present in an oil reservoir. Entrapment of petroleum hydrocarbons by capillary forces is a major factor that limits oil recovery (1, 3, 4). Hydrocarbon displacement can occur if interfacial tension (IFT) between the hydrocarbon and aqueous phases is reduced by several orders of magnitude. Microbially-produced biosurfactants may be an economical method to recover residual hydrocarbons since they are effective at low concentrations. Previously, we showed that substantial mobilization of residual hydrocarbon from a model porous system occurs at biosurfactant concentrations made naturally by B. mojavensis strain JF-1 if a polymer and 2,3-butanediol were present (2). In this report, we include data on oil recovery from Berea sandstone experiments along with our previous data from sand pack columns in order to relate biosurfactant concentration to the fraction of oil recovered. We also investigate the effect that the JF-2 biosurfactant has on interfacial tension (IFT). The presence of a co-surfactant, 2,3-butanediol, was shown to improve oil recoveries possibly by changing the optimal salinity concentration of the formulation. The JF-2 biosurfactant lowered IFT by nearly 2 orders of magnitude compared to typical values of 28-29 mN/m. Increasing the salinity increased the IFT with or without 2,3-butanediol present. The lowest interfacial tension observed was 0.1 mN/m. Tertiary oil recovery experiments showed that biosurfactant solutions with concentrations ranging from 10 to 60 mg/l in the presence of 0.1 mM 2,3-butanediol and 1 g/l of partially hydrolyzed polyacrylamide (PHPA) recovered 10-40% of the residual oil present in Berea sandstone cores. When PHPA was used alone, about 10% of the residual oil was recovered. Thus, about 10% of the residual oil recovered in these experiments was due to the increase in viscosity of the displacing fluid. Little or no oil was recovered at

  2. COMPUTER SIMULATION OF THE MECHANICAL MOVEMENT OF A BODY BY MEANS OF MATHCAD

    Directory of Open Access Journals (Sweden)

    Leonid Flehantov

    2017-03-01

    This paper considers the technique of using the computer mathematics system MathCAD for the computer implementation of a mathematical model of the mechanical motion of a physical body thrown at an angle to the horizon, and its use in educational computer simulation experiments when teaching the fundamentals of mathematical modeling. The advantages of MathCAD as an environment for implementing mathematical models at the second stage of higher education are noted. The paper describes the creation of a computer simulation model that allows comprehensive analysis of the mechanical movement of the body as the input parameters of the model are varied: the acceleration of gravity, the initial and final position of the body, the initial velocity and launch angle, and the geometric dimensions of the body and target. The technique is aimed at effective assimilation of basic knowledge and skills on the fundamentals of mathematical modeling: it helps students better master the basic theoretical principles of mathematical modeling and related disciplines, promotes the development of logical thinking, strengthens motivation to learn the discipline, improves cognitive interest, and forms research skills, thereby creating conditions for the effective formation of the professional competence of future specialists.
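
    The model the record describes is the classic projectile problem; as an illustration (not taken from the paper, which uses MathCAD), the same model can be time-stepped in a few lines of Python with assumed parameter values:

        # Sketch: projectile thrown at an angle to the horizon, integrated
        # with simple explicit time stepping; no air resistance (assumed).
        import math

        G = 9.81                       # acceleration of gravity, m/s^2
        V0 = 20.0                      # initial speed, m/s (assumed)
        ANGLE = math.radians(45.0)     # launch angle (assumed)

        def simulate(dt=1e-3):
            """Integrate x'' = 0, y'' = -G until the body returns to y = 0."""
            x = y = t = 0.0
            vx, vy = V0 * math.cos(ANGLE), V0 * math.sin(ANGLE)
            while y >= 0.0:
                x += vx * dt
                vy -= G * dt
                y += vy * dt
                t += dt
            return t, x

        t_flight, x_range = simulate()
        # Closed-form check: range = V0**2 * sin(2*ANGLE) / G
        print(f"flight time {t_flight:.2f} s, range {x_range:.1f} m "
              f"(analytic {V0**2 * math.sin(2 * ANGLE) / G:.1f} m)")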

  3. Advanced Simulation and Computing FY17 Implementation Plan, Version 0

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, Michel [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Archer, Bill [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hendrickson, Bruce [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wade, Doug [National Nuclear Security Administration (NNSA), Washington, DC (United States). Office of Advanced Simulation and Computing and Institutional Research and Development; Hoang, Thuc [National Nuclear Security Administration (NNSA), Washington, DC (United States). Computational Systems and Software Environment

    2016-08-29

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, studies of advanced nuclear weapons design and manufacturing processes, and analyses of accident scenarios and weapons aging, and that provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), and quantifying critical margins and uncertainties. Resolving each issue requires increasingly difficult analyses because the aging process has progressively moved the stockpile further away from the original test base. Where possible, the program also enables the use of high performance computing (HPC) and simulation tools to address broader national security needs, such as foreign nuclear weapon assessments and counter nuclear terrorism.

  4. Numerical simulation of dynamics of brushless dc motors for aerospace and other applications. Volume 2: User's guide to computer EMA model

    Science.gov (United States)

    Demerdash, N. A. O.; Nehl, T. W.

    1979-01-01

    A description and user's guide of the computer program developed to simulate the dynamics of an electromechanical actuator for aerospace applications are presented. The effects of the stator phase currents on the permanent magnets of the rotor are examined. The voltage and current waveforms present in the power conditioner network during the motoring, regenerative braking, and plugging modes of operation are presented and discussed.

  5. Simulation and computation in health physics training

    International Nuclear Information System (INIS)

    Lakey, S.R.A.; Gibbs, D.C.C.; Marchant, C.P.

    1980-01-01

    The Royal Naval College has devised a number of computer aided learning programmes applicable to health physics which include radiation shield design and optimisation, environmental impact of a reactor accident, exposure levels produced by an inert radioactive gas cloud, and the prediction of radiation detector response in various radiation field conditions. Analogue computers are used on reduced or fast time scales because time-dependent phenomena are not always easily assimilated in real time. The build-up and decay of fission products, the dynamics of intake of radioactive material and reactor accident dynamics can be effectively simulated. It is essential to relate these simulations to real time, and the College applies a research reactor and analytical phantom to this end. A special feature of the reactor is a chamber which can be supplied with Argon-41 from reactor exhaust gases to create a realistic gaseous contamination environment. Reactor accident situations are also taught by using role playing sequences carried out in real time in the emergency facilities associated with the research reactor. These facilities are outlined and the training technique illustrated with examples of the calculations and simulations. The training needs of the future are discussed, with emphasis on optimisation and cost-benefit analysis. (H.K.)

  6. An advanced coarse-grained nucleosome core particle model for computer simulations of nucleosome-nucleosome interactions under varying ionic conditions.

    Directory of Open Access Journals (Sweden)

    Yanping Fan

    In the eukaryotic cell nucleus, DNA exists as chromatin, a compact but dynamic complex with histone proteins. The first level of DNA organization is the linear array of nucleosome core particles (NCPs). The NCP is a well-defined complex of 147 bp DNA with an octamer of histones. Interactions between NCPs are of paramount importance for higher levels of chromatin compaction. The polyelectrolyte nature of the NCP implies that nucleosome-nucleosome interactions are greatly influenced both by the ionic environment and by the positively charged and highly flexible N-terminal histone tails protruding out from the NCP. The large size of the system precludes a modelling analysis of chromatin at an all-atom level and calls for coarse-grained approximations. Here, a model of the NCP is proposed that includes the globular histone core and the flexible histone tails, described by one particle per amino acid with its net charge taken into account. DNA wrapped around the histone core was approximated at the level of two base pairs represented by one bead (bases and sugars) plus four beads of charged phosphate groups. Computer simulations, using a Langevin thermostat, in a dielectric continuum with explicit monovalent (K+), divalent (Mg2+) or trivalent (Co(NH3)6^3+) cations were performed for systems with one or ten NCPs. Increase of the counterion charge results in a switch from repulsive NCP-NCP interaction in the presence of K+, to partial aggregation with Mg2+, and to strong mutual attraction of all 10 NCPs in the presence of CoHex^3+. The new model reproduced experimental results, and the structure of the NCP-NCP contacts is in agreement with available data. Cation screening, ion-ion correlations and tail bridging contribute to the NCP-NCP attraction, and the new NCP model accounts for these interactions.
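
    The Langevin-thermostat propagation named above can be illustrated with a generic overdamped update (a stand-in sketch: the bead-spring force field, parameters and reduced units are placeholders, not the paper's coarse-grained model):

        # Sketch: Euler-Maruyama step for overdamped Langevin dynamics of a
        # coarse-grained bead chain: drift from forces plus thermal noise.
        import numpy as np

        KT = 1.0      # thermal energy (reduced units)
        GAMMA = 1.0   # friction coefficient
        DT = 1e-3     # time step

        def chain_forces(x, k=100.0, r0=1.0):
            """Bonded forces for a harmonic bead-spring chain (toy force field)."""
            f = np.zeros_like(x)
            stretch = (x[1:] - x[:-1]) - r0
            f[1:] += -k * stretch
            f[:-1] += k * stretch
            return f

        def langevin_step(x, rng):
            noise = rng.normal(0.0, np.sqrt(2 * KT * DT / GAMMA), size=x.shape)
            return x + chain_forces(x) * DT / GAMMA + noise

        rng = np.random.default_rng(0)
        x = np.arange(10, dtype=float)          # 10 beads on a line
        for _ in range(10_000):
            x = langevin_step(x, rng)
        print("mean bond length:", round(float(np.mean(np.diff(x))), 3))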

  7. Computer simulation studies in condensed-matter physics 5. Proceedings

    International Nuclear Information System (INIS)

    Landau, D.P.; Mon, K.K.; Schuettler, H.B.

    1993-01-01

    As the role of computer simulations began to increase in importance, we sensed a need for a "meeting place" for both experienced simulators and neophytes to discuss new techniques and results in an environment which promotes extended discussion. As a consequence of these concerns, The Center for Simulational Physics established an annual workshop on Recent Developments in Computer Simulation Studies in Condensed-Matter Physics. This year's workshop was the fifth in this series, and the interest which the scientific community has shown demonstrates quite clearly the useful purpose which the series has served. The workshop was held at the University of Georgia, February 17-21, 1992, and these proceedings form a record of the workshop which is published with the goal of timely dissemination of the papers to a wider audience. The proceedings are divided into four parts. The first part contains invited papers which deal with simulational studies of classical systems and includes an introduction to some new simulation techniques and special purpose computers as well. A separate section of the proceedings is devoted to invited papers on quantum systems including new results for strongly correlated electron and quantum spin models. The third section is comprised of a single, invited description of a newly developed software shell designed for running parallel programs. The contributed presentations comprise the final chapter. (orig.). 79 figs

  8. SPINET: A Parallel Computing Approach to Spine Simulations

    Directory of Open Access Journals (Sweden)

    Peter G. Kropf

    1996-01-01

    Research in scientific programming enables us to realize more and more complex applications, while application-driven demands on computing methods and power are continuously growing. Therefore, interdisciplinary approaches are becoming more widely used. The interdisciplinary SPINET project presented in this article applies modern scientific computing tools to biomechanical simulations: parallel computing and symbolic and modern functional programming. The target application is the human spine. Simulations of the spine help us to investigate and better understand the mechanisms of back pain and spinal injury. Two approaches have been used: the first uses the finite element method for high-performance simulations of static biomechanical models, and the second generates a simulation development tool for experimenting with different dynamic models. A finite element program for static analysis has been parallelized for the MUSIC machine. To solve the sparse system of linear equations, a conjugate gradient solver (iterative method) and a frontal solver (direct method) have been implemented. The preprocessor required for the frontal solver is written in the modern functional programming language SML, the solver itself in C, thus exploiting the characteristic advantages of both functional and imperative programming. The speedup analysis of both solvers shows very satisfactory results for this irregular problem. A mixed symbolic-numeric environment for rigid body system simulations is presented. It automatically generates C code from a problem specification expressed by the Lagrange formalism using Maple.
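
    The conjugate gradient iteration mentioned in the record is a textbook algorithm; a minimal sketch for a symmetric positive-definite system Ax = b (generic CG, not the SPINET implementation) looks like this:

        # Sketch: conjugate gradient solver for SPD systems.
        import numpy as np

        def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
            x = np.zeros_like(b)
            r = b - A @ x              # residual
            p = r.copy()               # search direction
            rs = r @ r
            for _ in range(max_iter):
                Ap = A @ p
                alpha = rs / (p @ Ap)
                x += alpha * p
                r -= alpha * Ap
                rs_new = r @ r
                if np.sqrt(rs_new) < tol:
                    break
                p = r + (rs_new / rs) * p
                rs = rs_new
            return x

        A = np.array([[4.0, 1.0], [1.0, 3.0]])   # small SPD test matrix
        b = np.array([1.0, 2.0])
        x = conjugate_gradient(A, b)
        print(x, "residual:", np.linalg.norm(A @ x - b))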

  9. A computer model for a theory of evolution.

    Science.gov (United States)

    Bocci, Cristiano; Freguglia, Paolo; Rogora, Enrico

    2010-01-01

    Computer models and computer simulations are crucial for understanding complex phenomena because they compel the explicit enumeration of all variables and the exact specification of all relations between them. In this paper we discuss a computer model for a phenotypical theory of evolution which, in our opinion, is well suited to simulate the complex dependence of speciation on both internal and external factors, through their influences on the fertility factor. Some of these dependences are investigated through simulations.

  10. Towards The Deep Model : Understanding Visual Recognition Through Computational Models

    OpenAIRE

    Wang, Panqu

    2017-01-01

    Understanding how visual recognition is achieved in the human brain is one of the most fundamental questions in vision research. In this thesis I seek to tackle this problem from a neurocomputational modeling perspective. More specifically, I build machine learning-based models to simulate and explain cognitive phenomena related to human visual recognition, and I improve computational models using brain-inspired principles to excel at computer vision tasks. I first describe how a neurocomputat...

  11. Parallel Proximity Detection for Computer Simulation

    Science.gov (United States)

    Steinman, Jeffrey S. (Inventor); Wieland, Frederick P. (Inventor)

    1997-01-01

    The present invention discloses a system for performing proximity detection in computer simulations on parallel processing architectures utilizing a distribution list which includes movers and sensor coverages which check in and out of grids. Each mover maintains a list of sensors that detect the mover's motion as the mover and sensor coverages check in and out of the grids. Fuzzy grids are included by fuzzy resolution parameters to allow movers and sensor coverages to check in and out of grids without computing exact grid crossings. The movers check in and out of grids while moving sensors periodically inform the grids of their coverage. In addition, a lookahead function is also included for providing a generalized capability without making any limiting assumptions about the particular application to which it is applied. The lookahead function is initiated so that risk-free synchronization strategies never roll back grid events. The lookahead function adds fixed delays as events are scheduled for objects on other nodes.

  12. Parallel Proximity Detection for Computer Simulations

    Science.gov (United States)

    Steinman, Jeffrey S. (Inventor); Wieland, Frederick P. (Inventor)

    1998-01-01

    The present invention discloses a system for performing proximity detection in computer simulations on parallel processing architectures utilizing a distribution list which includes movers and sensor coverages which check in and out of grids. Each mover maintains a list of sensors that detect the mover's motion as the mover and sensor coverages check in and out of the grids. Fuzzy grids are included by fuzzy resolution parameters to allow movers and sensor coverages to check in and out of grids without computing exact grid crossings. The movers check in and out of grids while moving sensors periodically inform the grids of their coverage. In addition, a lookahead function is also included for providing a generalized capability without making any limiting assumptions about the particular application to which it is applied. The lookahead function is initiated so that risk-free synchronization strategies never roll back grid events. The lookahead function adds fixed delays as events are scheduled for objects on other nodes.
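
    The check-in scheme described in these two records can be sketched with a hash grid (an illustrative reconstruction of the general technique, not the patented implementation; cell size, counts and coordinates are arbitrary):

        # Sketch: grid-bucket proximity detection. Movers register ("check in")
        # to their cell and its neighbors; only co-located pairs are tested.
        from collections import defaultdict
        from itertools import combinations
        import math
        import random

        CELL = 1.0   # grid spacing; should cover the largest sensor range

        def cell_of(pos):
            return (math.floor(pos[0] / CELL), math.floor(pos[1] / CELL))

        def proximity_pairs(movers, radius=0.5):
            """Return id pairs closer than radius, found via grid buckets."""
            grid = defaultdict(list)
            for mid, pos in movers.items():
                cx, cy = cell_of(pos)
                for dx in (-1, 0, 1):            # home cell plus 8 neighbors
                    for dy in (-1, 0, 1):
                        grid[(cx + dx, cy + dy)].append(mid)
            hits = set()
            for members in grid.values():
                for a, b in combinations(sorted(members), 2):
                    (ax, ay), (bx, by) = movers[a], movers[b]
                    if (ax - bx) ** 2 + (ay - by) ** 2 < radius ** 2:
                        hits.add((a, b))
            return hits

        random.seed(1)
        movers = {i: (random.uniform(0, 5), random.uniform(0, 5)) for i in range(20)}
        print(len(proximity_pairs(movers)), "pairs in proximity")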

  13. Computer Simulations of Lipid Bilayers and Proteins

    DEFF Research Database (Denmark)

    Sonne, Jacob

    2006-01-01

    This thesis, entitled Computer simulations of lipid bilayers and proteins, describes two molecular dynamics (MD) simulation studies of pure lipid bilayers as well as a study of a transmembrane protein embedded in a lipid bilayer matrix. Below follows a brief overview of the thesis. Chapter 1: This chapter is a short introduction, where I briefly describe the basic biological background for the systems studied in Chapters 3, 4 and 5. This is done in a non-technical way to allow the generally interested reader to get an impression of the work. Chapter 2, Methods: In this chapter the background for the methods used is described. Chapter 3, Pressure profile calculations in lipid bilayers: A lipid bilayer is merely ~5 nm thick, but the lateral pressure (parallel to the bilayer plane) varies by several hundred bar across this short distance (normal to the bilayer). These variations in the lateral pressure are commonly referred to as the pressure...

  14. Computer Simulations of Intrinsically Disordered Proteins

    Science.gov (United States)

    Chong, Song-Ho; Chatterjee, Prathit; Ham, Sihyun

    2017-05-01

    The investigation of intrinsically disordered proteins (IDPs) is a new frontier in structural and molecular biology that requires a new paradigm to connect structural disorder to function. Molecular dynamics simulations and statistical thermodynamics potentially offer ideal tools for atomic-level characterizations and thermodynamic descriptions of this fascinating class of proteins that will complement experimental studies. However, IDPs display sensitivity to inaccuracies in the underlying molecular mechanics force fields. Thus, achieving an accurate structural characterization of IDPs via simulations is a challenge. It is also daunting to perform a configuration-space integration over heterogeneous structural ensembles sampled by IDPs to extract, in particular, protein configurational entropy. In this review, we summarize recent efforts devoted to the development of force fields and the critical evaluations of their performance when applied to IDPs. We also survey recent advances in computational methods for protein configurational entropy that aim to provide a thermodynamic link between structural disorder and protein activity.

  15. The learning effects of computer simulations in science education

    NARCIS (Netherlands)

    Rutten, N.P.G.; van Joolingen, Wouter; van der Veen, Jan T.

    2012-01-01

    This article reviews the (quasi)experimental research of the past decade on the learning effects of computer simulations in science education. The focus is on two questions: how use of computer simulations can enhance traditional education, and how computer simulations are best used in order to

  16. A Computational Approach for Probabilistic Analysis of Water Impact Simulations

    Science.gov (United States)

    Horta, Lucas G.; Mason, Brian H.; Lyle, Karen H.

    2009-01-01

    NASA's development of new concepts for the Crew Exploration Vehicle Orion presents many similar challenges to those worked in the sixties during the Apollo program. However, with improved modeling capabilities, new challenges arise. For example, the use of the commercial code LS-DYNA, although widely used and accepted in the technical community, often involves high-dimensional, time consuming, and computationally intensive simulations. The challenge is to capture what is learned from a limited number of LS-DYNA simulations to develop models that allow users to conduct interpolation of solutions at a fraction of the computational time. This paper presents a description of the LS-DYNA model, a brief summary of the response surface techniques, the analysis of variance approach used in the sensitivity studies, equations used to estimate impact parameters, results showing conditions that might cause injuries, and concluding remarks.
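
    The response-surface idea, fitting a cheap surrogate to a handful of expensive runs and interpolating between them, can be sketched as follows (synthetic data and a generic quadratic model, not the paper's LS-DYNA results):

        # Sketch: quadratic response surface fitted by least squares.
        import numpy as np

        # Pretend each row is (impact velocity, pitch angle) from a costly
        # run and y is a scalar response, e.g. peak acceleration (synthetic).
        X = np.array([[8, 0], [8, 10], [10, 0], [10, 10], [9, 5], [11, 5]], float)
        y = np.array([52.0, 47.0, 68.0, 60.0, 55.5, 74.0])

        def design_matrix(X):
            """Full quadratic model in two factors: 1, x1, x2, x1^2, x2^2, x1*x2."""
            x1, x2 = X[:, 0], X[:, 1]
            return np.column_stack([np.ones(len(X)), x1, x2, x1**2, x2**2, x1 * x2])

        coef, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)

        def predict(v, a):
            return (design_matrix(np.array([[v, a]], float)) @ coef)[0]

        print("predicted response at (9.5, 7):", round(predict(9.5, 7), 2))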

  17. Development of computer simulations for landfill methane recovery

    Energy Technology Data Exchange (ETDEWEB)

    Massmann, J.W.; Moore, C.A.; Sykes, R.M.

    1981-12-01

    Two- and three-dimensional finite-difference computer programs simulating methane recovery systems in landfills have been developed. These computer programs model multicomponent combined pressure and diffusional flow in porous media. Each program and the processes it models are described in this report. Examples of the capabilities of each program are also presented. The two-dimensional program was used to simulate methane recovery systems in a cylindrically shaped landfill. The effects of various pump locations, geometries, and extraction rates were determined. The three-dimensional program was used to model the Puente Hills landfill, a field test site in southern California. The biochemical and microbiological details of methane generation in landfills are also given. Effects of environmental factors, such as moisture, oxygen, temperature, and nutrients on methane generation are discussed and an analytical representation of the gas generation rate is developed.
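
    The finite-difference building block of such programs is easy to show in one dimension (an illustrative sketch with assumed coefficients, not code from the report):

        # Sketch: explicit 1D finite-difference step for gas diffusion.
        import numpy as np

        D = 1e-5                 # effective diffusion coefficient, m^2/s (assumed)
        DX = 0.1                 # grid spacing, m
        DT = 0.2 * DX**2 / D     # explicit stability requires the factor < 0.5

        def diffuse_step(c):
            """c[i] += D*DT/DX^2 * (c[i-1] - 2*c[i] + c[i+1]) on interior nodes."""
            new = c.copy()
            new[1:-1] += D * DT / DX**2 * (c[:-2] - 2 * c[1:-1] + c[2:])
            return new

        c = np.zeros(51)
        c[25] = 1.0              # unit pulse of gas concentration
        for _ in range(200):
            c = diffuse_step(c)
        print("total mass:", round(c.sum(), 6), "peak:", round(c.max(), 4))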

  18. Computer generated timing diagrams to supplement simulation

    CERN Document Server

    Booth, A W

    1981-01-01

    The ISPS computer description language has been used in a simulation study to specify the components of a high speed data acquisition system and its protocols. A facility has been developed for automatically generating timing diagrams from the specification of the data acquisition system written in the ISPS description language. Diagrams can be generated for both normal and abnormal working modes of the system. They are particularly useful for design and debugging in the prototyping stage of a project and can be later used for reference by maintenance engineers. (11 refs).

  19. Computer simulation of replacement sequences in copper

    International Nuclear Information System (INIS)

    Schiffgens, J.O.; Schwartz, D.W.; Ariyasu, R.G.; Cascadden, S.E.

    1978-01-01

    Results of computer simulations of <100>, <110>, and <111> replacement sequences in copper are presented, including displacement thresholds, focusing energies, energy losses per replacement, and replacement sequence lengths. These parameters are tabulated for six interatomic potentials and shown to vary in a systematic way with potential stiffness and range. Comparisons of results from calculations made with ADDES, a quasi-dynamical code, and COMENT, a dynamical code, show excellent agreement, demonstrating that the former can be calibrated and used satisfactorily in the analysis of low energy displacement cascades. Upper limits on <100>, <110>, and <111> replacement sequences were found to be approximately 10, approximately 30, and approximately 14 replacements, respectively. (author)

  20. The advanced computational testing and simulation toolkit (ACTS)

    International Nuclear Information System (INIS)

    Drummond, L.A.; Marques, O.

    2002-01-01

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  1. Computer Modeling of Direct Metal Laser Sintering

    Science.gov (United States)

    Cross, Matthew

    2014-01-01

    A computational approach to modeling direct metal laser sintering (DMLS) additive manufacturing process is presented. The primary application of the model is for determining the temperature history of parts fabricated using DMLS to evaluate residual stresses found in finished pieces and to assess manufacturing process strategies to reduce part slumping. The model utilizes MSC SINDA as a heat transfer solver with imbedded FORTRAN computer code to direct laser motion, apply laser heating as a boundary condition, and simulate the addition of metal powder layers during part fabrication. Model results are compared to available data collected during in situ DMLS part manufacture.

  2. Bibliography for Verification and Validation in Computational Simulation

    International Nuclear Information System (INIS)

    Oberkampf, W.L.

    1998-01-01

    A bibliography has been compiled dealing with the verification and validation of computational simulations. The references listed in this bibliography are concentrated in the field of computational fluid dynamics (CFD). However, references from the following fields are also included: operations research, heat transfer, solid dynamics, software quality assurance, software accreditation, military systems, and nuclear reactor safety. This bibliography, containing 221 references, is not meant to be comprehensive. It was compiled during the last ten years in response to the author's interest and research in the methodology for verification and validation. The emphasis in the bibliography is in the following areas: philosophy of science underpinnings, development of terminology and methodology, high accuracy solutions for CFD verification, experimental datasets for CFD validation, and the statistical quantification of model validation. This bibliography should provide a starting point for individual researchers in many fields of computational simulation in science and engineering

  3. Bibliography for Verification and Validation in Computational Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, W.L.

    1998-10-01

    A bibliography has been compiled dealing with the verification and validation of computational simulations. The references listed in this bibliography are concentrated in the field of computational fluid dynamics (CFD). However, references from the following fields are also included: operations research, heat transfer, solid dynamics, software quality assurance, software accreditation, military systems, and nuclear reactor safety. This bibliography, containing 221 references, is not meant to be comprehensive. It was compiled during the last ten years in response to the author's interest and research in the methodology for verification and validation. The emphasis in the bibliography is in the following areas: philosophy of science underpinnings, development of terminology and methodology, high accuracy solutions for CFD verification, experimental datasets for CFD validation, and the statistical quantification of model validation. This bibliography should provide a starting point for individual researchers in many fields of computational simulation in science and engineering.

  4. Chaos Modelling with Computers

    Indian Academy of Sciences (India)

    The computer is to chaos what cloud chambers and particle accelerators are to particle physics. Numbers and functions are chaos' mesons and quarks. In this article we provide an introduction to chaos and the role that computers play in this field. Chaos and Dynamical Systems. The laws of science aim at relating cause ...

  5. EWE: A computer model for ultrasonic inspection

    Science.gov (United States)

    Douglas, S. R.; Chaplin, K. R.

    1991-11-01

    The computer program EWE simulates the propagation of elastic waves in solids and liquids. It was applied to ultrasonic testing to study the echoes generated by cracks and other types of defects. A discussion of the elastic wave equations is given, including the first-order formulation, shear and compression waves, surface waves and boundaries, numerical method of solution, models for cracks and slot defects, input wave generation, returning echo construction, and general computer issues.

  6. Ewe: a computer model for ultrasonic inspection

    International Nuclear Information System (INIS)

    Douglas, S.R.; Chaplin, K.R.

    1991-11-01

    The computer program EWE simulates the propagation of elastic waves in solids and liquids. It has been applied to ultrasonic testing to study the echoes generated by cracks and other types of defects. A discussion of the elastic wave equations is given, including the first-order formulation, shear and compression waves, surface waves and boundaries, numerical method of solution, models for cracks and slot defects, input wave generation, returning echo construction, and general computer issues

  7. Verifying and Validating Simulation Models

    Energy Technology Data Exchange (ETDEWEB)

    Hemez, Francois M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-02-23

    This presentation is a high-level discussion of the Verification and Validation (V&V) of computational models. Definitions of V&V are given to emphasize that “validation” is never performed in a vacuum; it accounts, instead, for the current state-of-knowledge in the discipline considered. In particular, comparisons between physical measurements and numerical predictions should account for their respective sources of uncertainty. The differences between error (bias), aleatoric uncertainty (randomness) and epistemic uncertainty (ignorance, lack of knowledge) are briefly discussed. Four types of uncertainty in physics and engineering are discussed: 1) experimental variability, 2) variability and randomness, 3) numerical uncertainty and 4) model-form uncertainty. Statistical sampling methods are available to propagate, and analyze, variability and randomness. Numerical uncertainty originates from the truncation error introduced by the discretization of partial differential equations in time and space. Model-form uncertainty is introduced by assumptions often formulated to render a complex problem more tractable and amenable to modeling and simulation. The discussion concludes with high-level guidance to assess the “credibility” of numerical simulations, which stems from the level of rigor with which these various sources of uncertainty are assessed and quantified.

  8. A Computational Framework for Realistic Retina Modeling.

    Science.gov (United States)

    Martínez-Cañada, Pablo; Morillas, Christian; Pino, Begoña; Ros, Eduardo; Pelayo, Francisco

    2016-11-01

    Computational simulations of the retina have led to valuable insights about the biophysics of its neuronal activity and processing principles. A great number of retina models have been proposed to reproduce the behavioral diversity of the different visual processing pathways. While many of these models share common computational stages, previous efforts have been more focused on fitting specific retina functions rather than generalizing them beyond a particular model. Here, we define a set of computational retinal microcircuits that can be used as basic building blocks for the modeling of different retina mechanisms. To validate the hypothesis that similar processing structures may be repeatedly found in different retina functions, we implemented a series of retina models simply by combining these computational retinal microcircuits. Accuracy of the retina models for capturing neural behavior was assessed by fitting published electrophysiological recordings that characterize some of the best-known phenomena observed in the retina: adaptation to the mean light intensity and temporal contrast, and differential motion sensitivity. The retinal microcircuits are part of a new software platform for efficient computational retina modeling from single-cell to large-scale levels. It includes an interface with spiking neural networks that allows simulation of the spiking response of ganglion cells and integration with models of higher visual areas.

  9. The behaviour of adaptive bone-remodeling simulation models

    NARCIS (Netherlands)

    Weinans, H.; Huiskes, R.; Grootenboer, H.J.

    1992-01-01

    The process of adaptive bone remodeling can be described mathematically and simulated in a computer model, integrated with the finite element method. In the model discussed here, cortical and trabecular bone are described as continuous materials with variable density. The remodeling rule applied to

  10. A physicist's model of computation

    International Nuclear Information System (INIS)

    Fredkin, E.

    1991-01-01

    An attempt is presented to make a statement about what a computer is and how it works from the perspective of physics. The single observation that computation can be a reversible process allows for the same kind of insight into computing as was obtained by Carnot's discovery that heat engines could be modelled as reversible processes. It allows us to bring computation into the realm of physics, where the power of physics allows us to ask and answer questions that seemed intractable from the viewpoint of computer science. Strangely enough, this effort makes it clear why computers get cheaper every year. (author) 14 refs., 4 figs

  11. Minimum-complexity helicopter simulation math model

    Science.gov (United States)

    Heffley, Robert K.; Mnich, Marc A.

    1988-01-01

    An example of a minimum-complexity helicopter simulation math model is presented. Motivating factors are the computational delays, cost, and inflexibility of the very sophisticated math models now in common use. A helicopter model form is given which addresses each of these factors and provides better engineering understanding of the specific handling-qualities features which are apparent to the simulator pilot. The technical approach begins with specification of the features which are to be modeled, followed by a build-up of individual vehicle components and definition of equations. Model matching and estimation procedures are given which enable the modeling of specific helicopters from basic data sources such as flight manuals. Checkout procedures are given which provide for total model validation. A number of possible model extensions and refinements are discussed. Math model computer programs are defined and listed.

  12. Computational modeling in biomechanics

    CERN Document Server

    Mofrad, Mohammad

    2010-01-01

    This book provides a glimpse of the diverse and important roles that modern computational technology is playing in various areas of biomechanics. It includes unique chapters on ab initio quantum mechanical, molecular dynamics, and scale-coupling methods.

  13. A Stochastic Dynamic Model of Computer Viruses

    Directory of Open Access Journals (Sweden)

    Chunming Zhang

    2012-01-01

    A stochastic computer virus spread model is proposed and its dynamic behavior is fully investigated. Specifically, we prove the existence and uniqueness of positive solutions, and the stability of the virus-free equilibrium and the viral equilibrium, by constructing Lyapunov functions and applying Itô's formula. Some numerical simulations are finally given to illustrate our main results.
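
    A model of this kind can be integrated numerically with the Euler-Maruyama scheme; the sketch below uses a generic stochastic SIS-type infection equation with assumed coefficients (an illustration of the method, not the paper's exact system):

        # Sketch: Euler-Maruyama simulation of a stochastic virus-spread model,
        #   dI = [beta*I*(1 - I) - delta*I] dt + sigma*I*(1 - I) dW
        import math
        import random

        BETA, DELTA, SIGMA = 0.8, 0.3, 0.2   # infection, cure, noise (assumed)
        DT = 1e-3

        def simulate(i0=0.1, steps=20_000, seed=7):
            rng = random.Random(seed)
            i = i0
            for _ in range(steps):
                drift = BETA * i * (1 - i) - DELTA * i
                noise = SIGMA * i * (1 - i) * rng.gauss(0.0, math.sqrt(DT))
                i = min(max(i + drift * DT + noise, 0.0), 1.0)
            return i

        # Deterministic equilibrium is 1 - DELTA/BETA = 0.625 for these values.
        print("infected fraction after 20 s:", round(simulate(), 3))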

  14. Computational modelling for dry-powder inhalers

    NARCIS (Netherlands)

    Kröger, Ralf; Woolhouse, Robert; Becker, Michael; Wachtel, Herbert; de Boer, Anne; Horner, Marc

    2012-01-01

    Computational fluid dynamics (CFD) is a simulation tool used for modelling powder flow through inhalers to allow optimisation both of device design and drug powder. Here, Ralf Kröger, Consulting Senior CFD Engineer, ANSYS Germany GmbH; Marc Horner, Lead Technical Services Engineer, Healthcare,

  15. Computer Simulation of the UMER Gridded Gun

    CERN Document Server

    Haber, Irving; Friedman, Alex; Grote, D P; Kishek, Rami A; Reiser, Martin; Vay, Jean-Luc; Zou, Yun

    2005-01-01

    The electron source in the University of Maryland Electron Ring (UMER) injector employs a grid 0.15 mm from the cathode to control the current waveform. Under nominal operating conditions, the grid voltage during the current pulse is sufficiently positive relative to the cathode potential to form a virtual cathode downstream of the grid. Three-dimensional computer simulations have been performed that use the mesh refinement capability of the WARP particle-in-cell code to examine a small region near the beam center in order to illustrate some of the complexity that can result from such a gridded structure. These simulations have been found to reproduce the hollowed velocity space that is observed experimentally. The simulations also predict a complicated time-dependent response to the waveform applied to the grid during the current turn-on. This complex temporal behavior appears to result directly from the dynamics of the virtual cathode formation and may therefore be representative of the expected behavior in...

  16. A computer program for scanning transmission ion microscopy simulation

    International Nuclear Information System (INIS)

    Wu, R.; Shen, H.; Mi, Y.; Sun, M.D.; Yang, M.J.

    2005-01-01

    With the installation of the Scanning Proton Microprobe system at Fudan University, we are in the process of developing a three-dimensional reconstruction technique based on scanning transmission ion microscopy-computed tomography (STIM-CT). As the first step, a computer program for STIM simulation has been established. This program is written in Visual C++ using object-oriented programming (OOP) and is a standard multiple-document Windows program; it can be run under all MS Windows operating systems. The operating mode is menu-driven, using a multiple-process technique. The stopping power theory is based on the Bethe-Bloch formula. In order to simplify the calculation, an improved cylindrical coordinate model was introduced in the program instead of the usual spherical or cylindrical coordinate model. The simulated results of a sample at several rotation angles are presented.
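
    For reference, the stopping-power relation the program builds on is the Bethe formula, quoted here in its common textbook form (symbol conventions are ours, not reproduced from the paper):

        \[
          -\frac{dE}{dx}
            = \frac{4\pi}{m_e c^2}\,\frac{n z^2}{\beta^2}
              \left(\frac{e^2}{4\pi\varepsilon_0}\right)^{2}
              \left[\ln\!\left(\frac{2 m_e c^2 \beta^2}{I\,(1-\beta^2)}\right) - \beta^2\right]
        \]

    where z and beta = v/c describe the projectile, n is the electron number density of the target, m_e is the electron mass, and I is the target's mean excitation potential.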

  17. Transparency of Environmental Computer Models

    NARCIS (Netherlands)

    Vos, de M.G.; Top, J.L.; van Hage, W.R.; Schreiber, A.Th.

    2013-01-01

    Environmental computer models are considered essential tools in supporting environmental decision making, but their main value is that they allow a better understanding of our complex environment. Despite numerous attempts to promote good modelling practice, transparency of current environmental

  18. Simulating soil melting with CFD [computational fluid dynamics

    International Nuclear Information System (INIS)

    Hawkes, G.L.

    1997-01-01

    Computational fluid dynamics (CFD) is being used to validate the use of thermal plasma arc vitrification for treatment of contaminated soil. Soil melting is modelled by a CFD calculation code which links electrical fields, heat transport, and natural convection. The developers believe it is the first successful CFD analysis to incorporate a simulated PID (proportional-integral-derivative) controller, which plays a vital role by following the specified electrical power curve. (Author)
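
    The controller's role, driving the simulated power toward a specified power curve, is the textbook PID loop; a minimal sketch with toy gains and a toy first-order plant (assumed values, not the article's model):

        # Sketch: discrete PID controller tracking a specified power curve.
        class PID:
            def __init__(self, kp, ki, kd, dt):
                self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
                self.integral = 0.0
                self.prev_err = 0.0

            def update(self, setpoint, measured):
                err = setpoint - measured
                self.integral += err * self.dt
                deriv = (err - self.prev_err) / self.dt
                self.prev_err = err
                return self.kp * err + self.ki * self.integral + self.kd * deriv

        dt = 0.1
        pid = PID(kp=2.0, ki=1.0, kd=0.05, dt=dt)
        power = 0.0                               # plant state: delivered power (kW)
        for step in range(200):
            target = min(50.0, 10.0 * step * dt)  # ramp to 50 kW, then hold
            drive = pid.update(target, power)
            power += (drive - 0.2 * power) * dt   # toy first-order plant response
        print("target 50.0 kW, delivered:", round(power, 2))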

  19. Computational Simulations and the Scientific Method

    Science.gov (United States)

    Kleb, Bil; Wood, Bill

    2005-01-01

    As scientific simulation software becomes more complicated, the scientific-software implementor's need for component tests from new model developers becomes more crucial. The community's ability to follow the basic premise of the Scientific Method requires independently repeatable experiments, and model innovators are in the best position to create these test fixtures. Scientific software developers also need to quickly judge the value of the new model, i.e., its cost-to-benefit ratio in terms of gains provided by the new model and implementation risks such as cost, time, and quality. This paper asks two questions. The first is whether other scientific software developers would find published component tests useful, and the second is whether model innovators think publishing test fixtures is a feasible approach.

  20. How Many Times Should One Run a Computational Simulation?

    DEFF Research Database (Denmark)

    Seri, Raffaello; Secchi, Davide

    2017-01-01

    This chapter is an attempt to answer the question “how many runs of a computational simulation should one do,” and it gives an answer by means of statistical analysis. After defining the nature of the problem and which types of simulation are mostly affected by it, the article introduces statistical power analysis as a way to determine the appropriate number of runs. Two examples are then produced using results from an agent-based model. The reader is then guided through the application of this statistical technique and exposed to its limits and potentials.
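
    The power-analysis logic can be shown with the standard normal approximation for comparing two means (a generic textbook computation, not the chapter's own code):

        # Sketch: runs needed per condition so a two-sided z-test at level
        # alpha detects a standardized effect d with the requested power.
        import math
        from statistics import NormalDist

        def runs_needed(effect_size, alpha=0.05, power=0.80):
            z = NormalDist().inv_cdf
            n = 2 * ((z(1 - alpha / 2) + z(power)) / effect_size) ** 2
            return math.ceil(n)

        for d in (0.2, 0.5, 0.8):   # Cohen's small, medium, large effects
            print(f"d = {d}: about {runs_needed(d)} runs per condition")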

  1. COMPUTATIONAL SIMULATION OF FIRE DEVELOPMENT INSIDE A TRADE CENTRE

    Directory of Open Access Journals (Sweden)

    Constantin LUPU

    2015-07-01

    Real-scale fire experiments involve considerable costs compared to computational mathematical modelling. This paper is the result of such a virtual simulation of a fire occurring in a hypothetical wholesale warehouse comprising a large number of trade stands. The analysis starts from the ignition source located inside one trade stand and follows the fire's expansion over three groups of compartments, highlighting heat transfer both in confined spaces and over large distances. In order to confirm the accuracy of the simulation, the obtained values are compared to values from the specialized literature.

  2. The null-event method in computer simulation

    International Nuclear Information System (INIS)

    Lin, S.L.

    1978-01-01

    The simulation of collisions of ions moving under the influence of an external field through a neutral gas at non-zero temperatures is discussed as an example of computer models of processes in which a probe particle undergoes a series of interactions with an ensemble of other particles, such that the frequency and outcome of the events depend on internal properties of the second particles. The introduction of null events removes the need for much complicated algebra, leads to a more efficient simulation, and reduces the likelihood of logical error. (Auth.)
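
    A minimal sketch of the null-event idea, under invented dynamics: tentative events are sampled at a constant majorant rate, and only a state-dependent fraction of them are accepted as real collisions. The field strength, rate model, and post-collision velocity distribution are placeholders, not the paper's.

    import random

    NU_MAX = 5.0                          # majorant collision frequency

    def nu(v):
        """State-dependent collision frequency, kept below NU_MAX."""
        return min(NU_MAX, 1.0 + 0.5 * abs(v))

    v, t, E_FIELD = 0.0, 0.0, 2.0
    while t < 100.0:
        dt = random.expovariate(NU_MAX)   # time to next tentative event
        v += E_FIELD * dt                 # free flight in the external field
        t += dt
        if random.random() < nu(v) / NU_MAX:
            v = random.gauss(0.0, 1.0)    # real collision: resample from gas
        # else: null event, nothing happens and no special algebra is needed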

  3. Mathematical Modeling and Computational Thinking

    Science.gov (United States)

    Sanford, John F.; Naidu, Jaideep T.

    2017-01-01

    The paper argues that mathematical modeling is the essence of computational thinking. Learning a computer language is a valuable aid to learning logical thinking but of less assistance when learning problem-solving skills. The paper is the third in a series and presents some examples of mathematical modeling using spreadsheets at an advanced…

  4. Computer simulation of human motion in sports biomechanics.

    Science.gov (United States)

    Vaughan, C L

    1984-01-01

    This chapter has covered some important aspects of the computer simulation of human motion in sports biomechanics. First, the definition and the advantages and limitations of computer simulation were discussed; second, research on various sporting activities was reviewed. These activities included basic movements, aquatic sports, track and field athletics, winter sports, gymnastics, and striking sports. This list was not exhaustive and certain material has, of necessity, been omitted. However, it was felt that a sufficiently broad and interesting range of activities was chosen to illustrate both the advantages and the pitfalls of simulation. It is almost a decade since Miller [53] wrote a review chapter similar to this one. One might be tempted to say that things have changed radically since then--that computer simulation is now a widely accepted and readily applied research tool in sports biomechanics. This is simply not true, however. Biomechanics researchers still tend to emphasize the descriptive type of study, often, unfortunately, when a little theoretical explanation would have been more helpful [29]. What will the next decade bring? Of one thing we can be certain: the power of computers, particularly the readily accessible and portable microcomputer, will expand beyond all recognition. Memory and storage capacities will increase dramatically on the hardware side, and on the software side the trend will be toward "user-friendliness." It is likely that a number of software simulation packages designed specifically for studying human motion [31, 96] will be extensively tested and could gain wide acceptance in the biomechanics research community. Nevertheless, a familiarity with Newtonian and Lagrangian mechanics, optimization theory, and computers in general, as well as practical biomechanical insight, will still be a prerequisite for successful simulation models of human motion. Above all, the biomechanics researcher will still have to bear in mind that

  5. Event Based Simulator for Parallel Computing over the Wide Area Network for Real Time Visualization

    Science.gov (United States)

    Sundararajan, Elankovan; Harwood, Aaron; Kotagiri, Ramamohanarao; Satria Prabuwono, Anton

    As the computational requirements of applications in computational science continue to grow tremendously, the use of computational resources distributed across the Wide Area Network (WAN) becomes advantageous. However, not all applications can be executed over the WAN due to communication overhead that can drastically slow down the computation. In this paper, we introduce an event based simulator to investigate the performance of parallel algorithms executed over the WAN. The event based simulator, known as SIMPAR (SIMulator for PARallel computation), simulates the actual computations and communications involved in parallel computation over the WAN using time stamps. Visualization of real time applications requires a steady stream of processed data. Hence, SIMPAR may prove to be a valuable tool for investigating which types of applications and computing resource requirements can provide an uninterrupted flow of processed data for real time visualization purposes. The results obtained from the simulation agree with the performance expected under the L-BSP model.
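
    The cost accounting behind such a simulation can be illustrated with the BSP superstep formula T = max_i(w_i) + h*g + L. The sketch below applies it with invented WAN parameters; it is a caricature of the approach, not SIMPAR's actual event engine.

    def superstep_time(work, h, g, L):
        """work: per-processor compute times; h: max messages per processor;
        g: communication cost per message; L: barrier synchronization cost."""
        return max(work) + h * g + L

    # Two supersteps on four processors; g and L are illustrative WAN values.
    clock = 0.0
    for work, h in [([1.2, 0.9, 1.4, 1.0], 50), ([0.7, 0.8, 0.6, 0.9], 120)]:
        clock += superstep_time(work, h, g=0.01, L=0.5)
    print(f"simulated completion time: {clock:.2f} s")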

  6. Structural models of zebrafish (Danio rerio) NOD1 and NOD2 NACHT domains suggest differential ATP binding orientations: insights from computational modeling, docking and molecular dynamics simulations.

    Directory of Open Access Journals (Sweden)

    Jitendra Maharana

    Full Text Available Nucleotide-binding oligomerization domain-containing protein 1 (NOD1) and NOD2 are cytosolic pattern recognition receptors playing pivotal roles in innate immune signaling. NOD1 and NOD2 recognize the bacterial peptidoglycan derivatives iE-DAP and MDP, respectively, and undergo conformational alteration and ATP-dependent self-oligomerization of the NACHT domain, followed by downstream signaling. The lack of structural data for the NACHT domain limits our understanding of the NOD-mediated signaling mechanism. Here, we predicted the structure of the NACHT domain of both NOD1 and NOD2 from the model organism zebrafish (Danio rerio) using computational methods. Our study highlighted the differential ATP binding modes in NOD1 and NOD2. In NOD1, the γ-phosphate of ATP faced toward the central nucleotide binding cavity, as in NLRC4, whereas in NOD2 the cavity was occupied by the adenine moiety. The conserved lysine at Walker A formed hydrogen bonds (H-bonds), and the aspartic acid at Walker B formed electrostatic interactions with ATP. At Sensor 1, Arg328 of NOD1 exhibited an H-bond with ATP, whereas the corresponding Arg404 of NOD2 did not. The proline of the GxP motif (Pro386 of NOD1 and Pro464 of NOD2) interacted with the adenine moiety, and His511 at Sensor 2 of NOD1 interacted with the γ-phosphate group of ATP. In contrast, His579 of NOD2 interacted with the adenine moiety in a relatively inverted orientation. Our findings are well supplemented by the molecular interaction of ATP with NLRC4 and are consistent with mutagenesis data reported for humans, which indicates an evolutionarily shared NOD signaling mechanism. Together, this study provides novel insights into the ATP binding mechanism and highlights the differential ATP binding modes in zebrafish NOD1 and NOD2.

  7. Adaptive scapula bone remodeling computational simulation: Relevance to regenerative medicine

    Science.gov (United States)

    Sharma, Gulshan B.; Robertson, Douglas D.

    2013-07-01

    Shoulder arthroplasty success has been attributed to many factors including bone quality, soft tissue balancing, surgeon experience, and implant design. Improved long-term success is primarily limited by glenoid implant loosening. Prosthesis design examines materials and shape and determines whether the design will withstand a lifetime of use. Finite element (FE) analyses have been extensively used to study the stresses and strains produced in implants and bone. However, these static analyses only measure a moment in time and not the adaptive response to the altered environment produced by the therapeutic intervention. Computational analyses that integrate remodeling rules predict how bone will respond over time. Recent work has shown that subject-specific two- and three-dimensional adaptive bone remodeling models are feasible and valid. Feasibility and validation were achieved computationally by simulating bone remodeling using an intact human scapula: the scapular bone material properties were initially reset to be uniform, sequential loading was numerically simulated, and the bone remodeling simulation results were compared to the actual scapula's material properties. A three-dimensional scapula FE bone model was created using volumetric computed tomography images. Muscle and joint loads and boundary conditions were applied based on values reported in the literature. Internal bone remodeling was based on element strain-energy density. Initially, all bone elements were assigned a homogeneous density. All loads were applied for 10 iterations. After every iteration, each bone element's remodeling stimulus was compared to its corresponding reference stimulus and its material properties were modified. The simulation achieved convergence. At the end of the simulation the predicted and actual specimen bone apparent densities were plotted and compared. The location of high and low predicted bone density was comparable to the actual specimen. High predicted bone density was greater than actual
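
    The remodeling loop itself is compact once the finite element solve is treated as a black box. The sketch below shows the iterate, compare, and update structure with a stubbed FE response; the stimulus rule, dead zone, and constants are illustrative stand-ins rather than the authors' values.

    import numpy as np

    RHO_MIN, RHO_MAX = 0.05, 1.8     # apparent density bounds, g/cm^3
    S_REF, LAZY = 0.004, 0.25        # reference stimulus and dead zone

    def fe_strain_energy_density(rho):
        """Placeholder for the FE solve: per-element strain energy density
        under the applied muscle and joint loads (toy response only)."""
        return 0.004 * rho / rho.mean()

    rho = np.full(1000, 0.5)         # start from homogeneous density
    for iteration in range(10):      # loads applied for 10 iterations
        stimulus = fe_strain_energy_density(rho) / rho
        ratio = stimulus / S_REF
        rho = np.where(ratio > 1 + LAZY, rho * 1.05, rho)  # apposition
        rho = np.where(ratio < 1 - LAZY, rho * 0.95, rho)  # resorption
        rho = np.clip(rho, RHO_MIN, RHO_MAX)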

  8. Adaptive scapula bone remodeling computational simulation: Relevance to regenerative medicine

    Energy Technology Data Exchange (ETDEWEB)

    Sharma, Gulshan B., E-mail: gbsharma@ucalgary.ca [Emory University, Department of Radiology and Imaging Sciences, Spine and Orthopaedic Center, Atlanta, Georgia 30329 (United States); University of Pittsburgh, Swanson School of Engineering, Department of Bioengineering, Pittsburgh, Pennsylvania 15213 (United States); University of Calgary, Schulich School of Engineering, Department of Mechanical and Manufacturing Engineering, Calgary, Alberta T2N 1N4 (Canada); Robertson, Douglas D., E-mail: douglas.d.robertson@emory.edu [Emory University, Department of Radiology and Imaging Sciences, Spine and Orthopaedic Center, Atlanta, Georgia 30329 (United States); University of Pittsburgh, Swanson School of Engineering, Department of Bioengineering, Pittsburgh, Pennsylvania 15213 (United States)

    2013-07-01

    Shoulder arthroplasty success has been attributed to many factors including bone quality, soft tissue balancing, surgeon experience, and implant design. Improved long-term success is primarily limited by glenoid implant loosening. Prosthesis design examines materials and shape and determines whether the design will withstand a lifetime of use. Finite element (FE) analyses have been extensively used to study the stresses and strains produced in implants and bone. However, these static analyses only measure a moment in time and not the adaptive response to the altered environment produced by the therapeutic intervention. Computational analyses that integrate remodeling rules predict how bone will respond over time. Recent work has shown that subject-specific two- and three-dimensional adaptive bone remodeling models are feasible and valid. Feasibility and validation were achieved computationally by simulating bone remodeling using an intact human scapula: the scapular bone material properties were initially reset to be uniform, sequential loading was numerically simulated, and the bone remodeling simulation results were compared to the actual scapula’s material properties. A three-dimensional scapula FE bone model was created using volumetric computed tomography images. Muscle and joint loads and boundary conditions were applied based on values reported in the literature. Internal bone remodeling was based on element strain-energy density. Initially, all bone elements were assigned a homogeneous density. All loads were applied for 10 iterations. After every iteration, each bone element’s remodeling stimulus was compared to its corresponding reference stimulus and its material properties were modified. The simulation achieved convergence. At the end of the simulation the predicted and actual specimen bone apparent densities were plotted and compared. The location of high and low predicted bone density was comparable to the actual specimen. High predicted bone density was greater than

  9. Real time simulation of large systems on mini-computer

    International Nuclear Information System (INIS)

    Nakhle, Michel; Roux, Pierre.

    1979-01-01

    Most simulation languages accept only an explicit formulation of differential equations, and logical variables hold no special status in them. The integration step of the usual methods is limited by the smallest time constant of the submitted model. The NEPTUNIX 2 simulation software has a language that accepts implicit equations and an integration method whose variable step is not limited by the time constants of the model. This, together with strong optimization of the execution time and memory use of the generated code, makes NEPTUNIX 2 a basic tool for simulation on mini-computers. Since the logical variables are specific entities under centralized control, correct processing of discontinuities and synchronization with a real process are feasible. NEPTUNIX 2 is the industrial version of NEPTUNIX 1 [fr
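
    The point about integration pace can be demonstrated with any stiff system. In the sketch below, an implicit variable-step method (BDF, via SciPy) integrates a system containing a 1e-4 s fast mode over 10 s without being forced down to 1e-4 s steps; this only illustrates the idea and does not reproduce NEPTUNIX's implicit-equation language.

    import numpy as np
    from scipy.integrate import solve_ivp

    def rhs(t, y):
        # Two time constants: 1e-4 s (fast) and 1 s (slow).
        return [-1e4 * (y[0] - np.cos(t)), -(y[1] - y[0])]

    sol = solve_ivp(rhs, (0.0, 10.0), [0.0, 0.0], method="BDF", rtol=1e-6)
    print(f"accepted steps: {sol.t.size}")   # far fewer than 10 / 1e-4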

  10. Computational simulation methods for composite fracture mechanics

    Science.gov (United States)

    Murthy, Pappu L. N.

    1988-01-01

    Structural integrity, durability, and damage tolerance of advanced composites are assessed, quantitatively and qualitatively, by studying damage initiation at various scales (micro, macro, and global) and its accumulation and growth leading to global failure. In addition, various fracture toughness parameters associated with a typical damage state and its growth must be determined. Computational structural analysis codes were developed to aid the composite design engineer in performing these tasks. CODSTRAN (COmposite Durability STRuctural ANalysis) is used to qualitatively and quantitatively assess the progressive damage occurring in composite structures due to mechanical and environmental loads. Next, methods are covered that are currently being developed and used at Lewis to predict interlaminar fracture toughness and related parameters of fiber composites given a prescribed damage. The general purpose finite element code MSC/NASTRAN was used to simulate interlaminar fracture and the associated individual as well as mixed-mode strain energy release rates in fiber composites.
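
    One common way to extract mode-separated strain energy release rates from such finite element models is the virtual crack closure technique (VCCT). The sketch below uses the standard VCCT formulas with made-up nodal values; the record does not state which extraction method was used at Lewis.

    def vcct_rates(Fx, Fy, du, dv, da, b):
        """Mode I/II strain energy release rates from crack-tip nodal forces
        (Fx, Fy), relative openings behind the tip (du, dv), crack extension
        da, and specimen width b."""
        G_I = Fy * dv / (2.0 * da * b)
        G_II = Fx * du / (2.0 * da * b)
        return G_I, G_II

    # Illustrative numbers only (N, m); mode mixity feeds failure criteria.
    G_I, G_II = vcct_rates(Fx=12.0, Fy=85.0, du=2e-6, dv=1.5e-5, da=5e-4, b=0.025)
    mode_mix = G_II / (G_I + G_II)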

  11. Amorphous nanoparticles — Experiments and computer simulations

    International Nuclear Information System (INIS)

    Hoang, Vo Van; Ganguli, Dibyendu

    2012-01-01

    Data obtained over recent decades by both experiment and computer simulation on amorphous nanoparticles are reviewed, including methods of synthesis, characterization, structural properties, the atomic mechanism of glass formation in nanoparticles, crystallization of amorphous nanoparticles, physico-chemical properties (i.e. catalytic, optical, thermodynamic, magnetic, bioactivity and other properties), and various applications in science and technology. Amorphous nanoparticles coated with different surfactants are also reviewed as an extension in this direction. Much attention is paid to the pressure-induced polyamorphism of amorphous nanoparticles and to the amorphization of their nanocrystalline counterparts. We also introduce nanocomposites and nanofluids containing amorphous nanoparticles. Overall, amorphous nanoparticles exhibit a disordered structure different from that of the corresponding bulk materials and from that of their nanocrystalline counterparts. Amorphous nanoparticles can therefore have unique physico-chemical properties, differing from those of their crystalline counterparts, which lead to potential applications in science and technology.

  12. Advanced Simulation and Computing Business Plan

    Energy Technology Data Exchange (ETDEWEB)

    Rummel, E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-07-09

    To maintain a credible nuclear weapons program, the National Nuclear Security Administration’s (NNSA’s) Office of Defense Programs (DP) needs to make certain that the capabilities, tools, and expert staff are in place and are able to deliver validated assessments. This requires a complete and robust simulation environment backed by an experimental program to test ASC Program models. This ASC Business Plan document encapsulates a complex set of elements, each of which is essential to the success of the simulation component of the Nuclear Security Enterprise. The ASC Business Plan addresses the hiring, mentoring, and retaining of programmatic technical staff responsible for building the simulation tools of the nuclear security complex. The ASC Business Plan describes how the ASC Program engages with industry partners—partners upon whom the ASC Program relies for today’s and tomorrow’s high performance architectures. Each piece in this chain is essential to assure policymakers, who must make decisions based on the results of simulations, that they are receiving all the actionable information they need.

  13. A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION

    Energy Technology Data Exchange (ETDEWEB)

    Mike Bockelie; Dave Swensen; Martin Denison; Connie Senior; Adel Sarofim; Bene Risio

    2002-07-28

    This is the seventh Quarterly Technical Report for DOE Cooperative Agreement No. DE-FC26-00NT41047. The goal of the project is to develop and demonstrate a computational workbench for simulating the performance of Vision 21 Power Plant Systems. Within the last quarter, good progress has been made on the development of the IGCC workbench. A series of parametric CFD simulations for single-stage and two-stage generic gasifier configurations has been performed. An advanced flowing slag model has been implemented into the CFD-based gasifier model. A literature review has been performed on published gasification kinetics. Reactor models have been developed and implemented into the workbench for the majority of the heat exchangers, the gas clean-up system, and the power generation system for the Vision 21 reference configuration. Modifications to the software infrastructure of the workbench have begun to allow interfacing to the workbench reactor models that utilize the CAPE-Open software interface protocol.

  14. Computer simulation, nuclear techniques and surface analysis

    Directory of Open Access Journals (Sweden)

    Reis, A. D.

    2010-02-01

    Full Text Available This article is about computer simulation and surface analysis by nuclear techniques, which are non-destructive. The “energy method of analysis” for nuclear reactions is used. Energy spectra are computer simulated and compared with experimental data, giving target composition and concentration profile information. Details of the prediction stages are given for thick flat target yields. Predictions are made for non-flat targets having asymmetric triangular surface contours. The method is successfully applied to depth profiling of 12C and 18O nuclei in thick targets, by deuteron (d,p) and proton (p,α) induced reactions, respectively.


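    The core of the energy method, mapping reaction depth to detected particle energy through inward and outward energy losses, can be caricatured in a few lines. The constant stopping powers and Q-value below are illustrative numbers, not values from the article.

    def detected_energy(depth_um, E0=1.0, S_in=0.05, S_out=0.08, Q=2.7):
        """E0: beam energy (MeV); S_in, S_out: stopping powers (MeV/um) on
        the inward and outward paths; Q: reaction energy release (MeV)."""
        E_at_depth = E0 - S_in * depth_um     # beam slows on the way in
        E_product = E_at_depth + Q            # reaction occurs at this depth
        return E_product - S_out * depth_um   # product slows on the way out

    # Deeper reactions emerge at lower energy, so the spectrum encodes depth.
    for d in (0.0, 2.0, 4.0):
        print(f"depth {d:3.1f} um -> detected {detected_energy(d):.2f} MeV")
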
  15. Description of mathematical models and computer programs

    International Nuclear Information System (INIS)

    1977-01-01

    The paper gives a description of mathematical models and computer programs for analysing possible strategies for spent fuel management, with emphasis on economic analysis. The computer programs developed describe the material flows, facility construction schedules, capital investment schedules and operating costs for the facilities used in managing the spent fuel. The computer programs use a combination of simulation and optimization procedures for the economic analyses. Many of the fuel cycle steps (such as spent fuel discharges, storage at the reactor, and transport to the RFCC) are described in physical and economic terms through simulation modeling, while others (such as reprocessing plant size and commissioning schedules, interim storage facility commissioning schedules, etc.) are subjected to economic optimization procedures to determine the approximate lowest-cost plans from among the available feasible alternatives
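
    The combination of simulation and optimization described here can be miniaturized as follows: simulate the material flows for each candidate design, then keep the lowest-cost feasible plan. All capacities, flows, and costs below are invented for illustration.

    def plan_cost(capacity, discharges, capital_per_t=3.0, backlog_fee=0.5):
        """Simulate year-by-year spent fuel flows for one plant capacity and
        return total cost: capital plus penalties for the stored backlog."""
        backlog, cost = 0.0, capital_per_t * capacity
        for discharged in discharges:
            backlog = max(0.0, backlog + discharged - capacity)
            cost += backlog_fee * backlog
        return cost

    discharges = [80, 95, 110, 120, 120, 130]   # tHM/year, illustrative
    best = min(range(60, 160, 10), key=lambda c: plan_cost(c, discharges))
    print(f"approximate lowest-cost capacity: {best} tHM/year")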

  16. Computer-Based Simulation Games in Public Administration Education

    OpenAIRE

    Kutergina Evgeniia

    2017-01-01

    Computer simulation, an active learning technique, is now one of the advanced pedagogical technologies. The use of simulation games in the educational process allows students to gain a firsthand understanding of the processes of real life. Public-administration, public-policy and political-science courses increasingly adopt simulation games in universities worldwide. Besides person-to-person simulation games, there are computer-based simulations in public-administration education. Currently...

  17. Introducing Computational Physics in Introductory Physics using Intentionally Incorrect Simulations

    Science.gov (United States)

    Cox, Anne

    2011-03-01

    Students in physics courses routinely use and trust computer simulations. Finding errors in intentionally incorrect simulations can help students learn physics, become more skeptical of simulations, and gain an initial introduction to computational physics. This talk will provide examples of electrostatics simulations, housed in the Open Source Physics Collection on ComPADRE (http://www.compadre.org/osp), that students can correct using Easy Java Simulations. Partial support through the Open Source Physics Project, NSF DUE-0442581.
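
    A Python caricature of one such exercise is given below (the actual materials use Easy Java Simulations): the field law carries a deliberate error, and a simple scaling check exposes it.

    K = 8.99e9    # Coulomb constant, N m^2 / C^2

    def field_magnitude(q, r):
        return K * q / r    # BUG (deliberate): should be K * q / r**2

    # Check students can run: doubling r should quarter the field.
    E1, E2 = field_magnitude(1e-9, 0.1), field_magnitude(1e-9, 0.2)
    print(f"E(r)/E(2r) = {E1 / E2:.1f}  (correct physics gives 4.0)")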

  18. Seventeenth Workshop on Computer Simulation Studies in Condensed-Matter Physics

    CERN Document Server

    Landau, David P; Schüttler, Heinz-Bernd; Computer Simulation Studies in Condensed-Matter Physics XVI

    2006-01-01

    This status report features the most recent developments in the field, spanning a wide range of topical areas in the computer simulation of condensed matter/materials physics. Both established and new topics are included, ranging from the statistical mechanics of classical magnetic spin models to electronic structure calculations, quantum simulations, and simulations of soft condensed matter. The book presents new physical results as well as novel methods of simulation and data analysis. Highlights of this volume include various aspects of non-equilibrium statistical mechanics, studies of properties of real materials using both classical model simulations and electronic structure calculations, and the use of computer simulations in teaching.

  19. Associative Memory Computing Power and Its Simulation

    CERN Document Server

    Volpi, G; The ATLAS collaboration

    2014-01-01

    The associative memory (AM) system is a computing device made of hundreds of AM ASIC chips designed to perform “pattern matching” at very high speed. Since each AM chip stores a database of 130,000 pre-calculated patterns and large numbers of chips can easily be assembled together, it is possible to produce huge AM banks. Speed and size of the system are crucial for real-time High Energy Physics applications, such as the ATLAS Fast TracKer (FTK) Processor. Using 80 million channels of the ATLAS tracker, FTK finds tracks within 100 microseconds. The simulation of such a parallelized system is an extremely complex task if executed on commercial computers based on normal CPUs. The algorithm's performance is limited due to the lack of parallelism, and in addition the memory requirement is very large. In fact, the AM chip uses a content addressable memory (CAM) architecture. Any data inquiry is broadcast to all memory elements simultaneously, thus data retrieval time is independent of the database size. The gr...
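
    The size-independence claim is easy to visualize: a CAM compares the query against every stored pattern in one broadcast step. The sketch below emulates that broadcast with a single vectorized compare over an invented 130,000-pattern bank; a serial loop over the bank is precisely what makes CPU simulation of the AM system slow.

    import numpy as np

    rng = np.random.default_rng(0)
    bank = rng.integers(0, 256, size=(130_000, 8), dtype=np.uint8)  # pattern bank
    query = bank[42_000].copy()      # a hit combination known to be stored

    # One "broadcast": the query is compared to all patterns at once.
    matches = np.where((bank == query).all(axis=1))[0]
    print(f"matching pattern ids: {matches}")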

  20. Associative Memory computing power and its simulation

    CERN Document Server

    Ancu, L S; The ATLAS collaboration; Britzger, D; Giannetti, P; Howarth, J W; Luongo, C; Pandini, C; Schmitt, S; Volpi, G

    2014-01-01

    The associative memory (AM) system is a computing device made of hundreds of AM ASIC chips designed to perform “pattern matching” at very high speed. Since each AM chip stores a database of 130,000 pre-calculated patterns and large numbers of chips can easily be assembled together, it is possible to produce huge AM banks. Speed and size of the system are crucial for real-time High Energy Physics applications, such as the ATLAS Fast TracKer (FTK) Processor. Using 80 million channels of the ATLAS tracker, FTK finds tracks within 100 microseconds. The simulation of such a parallelized system is an extremely complex task if executed on commercial computers based on normal CPUs. The algorithm's performance is limited due to the lack of parallelism, and in addition the memory requirement is very large. In fact, the AM chip uses a content addressable memory (CAM) architecture. Any data inquiry is broadcast to all memory elements simultaneously, thus data retrieval time is independent of the database size. The gr...