WorldWideScience

Sample records for advanced computational simulation

  1. Advanced Simulation and Computing Business Plan

    Energy Technology Data Exchange (ETDEWEB)

    Rummel, E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)]

    2015-07-09

    To maintain a credible nuclear weapons program, the National Nuclear Security Administration’s (NNSA’s) Office of Defense Programs (DP) needs to make certain that the capabilities, tools, and expert staff are in place and are able to deliver validated assessments. This requires a complete and robust simulation environment backed by an experimental program to test ASC Program models. This ASC Business Plan document encapsulates a complex set of elements, each of which is essential to the success of the simulation component of the Nuclear Security Enterprise. The ASC Business Plan addresses the hiring, mentoring, and retaining of programmatic technical staff responsible for building the simulation tools of the nuclear security complex. The ASC Business Plan describes how the ASC Program engages with industry partners—partners upon whom the ASC Program relies for today’s and tomorrow’s high performance architectures. Each piece in this chain is essential to assure policymakers, who must make decisions based on the results of simulations, that they are receiving all the actionable information they need.

  2. Advanced Simulation and Computing FY17 Implementation Plan, Version 0

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, Michel [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)]; Archer, Bill [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)]; Hendrickson, Bruce [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Wade, Doug [National Nuclear Security Administration (NNSA), Washington, DC (United States). Office of Advanced Simulation and Computing and Institutional Research and Development]; Hoang, Thuc [National Nuclear Security Administration (NNSA), Washington, DC (United States). Computational Systems and Software Environment]

    2016-08-29

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), and quantifying critical margins and uncertainties. Resolving each issue requires increasingly difficult analyses because the aging process has progressively moved the stockpile further away from the original test base. Where possible, the program also enables the use of high performance computing (HPC) and simulation tools to address broader national security needs, such as foreign nuclear weapon assessments and counter nuclear terrorism.

  3. The advanced computational testing and simulation toolkit (ACTS)

    Energy Technology Data Exchange (ETDEWEB)

    Drummond, L.A.; Marques, O.

    2002-05-21

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  4. The advanced computational testing and simulation toolkit (ACTS)

    International Nuclear Information System (INIS)

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  5. The use of advanced computer simulation in structural design

    Energy Technology Data Exchange (ETDEWEB)

    Field, C.J.; Mole, A. [Arup, San Francisco, CA (United States)]; Arkinstall, M. [Arup, Sydney (Australia)]

    2005-07-01

    The benefits that can be gained from the application of advanced numerical simulation in building design were discussed. A review of current practices in structural engineering was presented, along with a range of international project case studies. Structural engineers use analytical methods to evaluate both static and dynamic loads. Structural design is prescribed by a range of building codes, depending on location, building type and loading, but buildings often do not fit well within the codes, particularly if one wants to take advantage of new technologies and developments in design that are not covered by the code. Advanced simulation refers to the application of mathematical modeling to complex problems, allowing a wider range of building types and conditions to be designed reliably than standard practice permits. Advanced simulation is used for virtual testing and prototyping, verifying innovative design ideas, forensic engineering, and design optimization. The benefits of advanced simulation include enhanced creativity, improved performance, cost savings, risk management, sustainable design solutions, and better communication. The following 5 case studies illustrated the value gained by using advanced simulation as an integral part of the design process: the earthquake-resistant Maison Hermes in Tokyo; the seismic-resistant brace known as the Unbonded Brace for use in the United States; a simulation of the existing Disney Museum to evaluate its capacity to resist earthquakes; a simulation of the MIT Brain and Cognitive Science Project to evaluate the effect of different foundation types on the vibration entering the building; and the Beijing Aquatic Center, whose design was streamlined by optimized structural analysis. It was suggested that industry should encourage the transfer of technology from other professions and should collaborate towards a global building model to construct buildings in a more efficient manner. 7 refs

  6. Advances in Computational Social Science and Social Simulation

    OpenAIRE

    Miguel Quesada, Francisco J.; Amblard, Frédéric; Juan A. Barceló; Madella, Marco; Aguirre, Cristián; Ahrweiler, Petra; Aldred, Rachel; Ali Abbas, Syed Muhammad; Lopez Rojas, Edgar Alonso; Alonso Betanzos, Amparo; Alvarez Galvez, Javier; Andrighetto, Giulia; Antunes, Luis; Araghi, Yashar; Asatani, Kimitaka

    2014-01-01

    This conference is the joint celebration of the 10th Artificial Economics Conference (AE), the 10th Conference of the European Social Simulation Association (ESSA), and the 1st Simulating the Past to Understand Human History (SPUHH) conference. It was organized by the Laboratory for Socio-Historical Dynamics Simulation (LSDS-UAB) of the Universitat Autònoma de Barcelona. Readers will find results of recent research on computational social science and social simulation economics, management, so...

  7. Advanced Simulation and Computing Co-Design Strategy

    Energy Technology Data Exchange (ETDEWEB)

    Ang, James A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Hoang, Thuc T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Kelly, Suzanne M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; McPherson, Allen [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Neely, Rob [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]

    2015-11-01

    This ASC Co-design Strategy lays out the full continuum and components of the co-design process, based on what we have experienced thus far and what we wish to do more of in the future to meet the program’s mission of providing high performance computing (HPC) and simulation capabilities for NNSA to carry out its stockpile stewardship responsibility.

  8. Computational modeling, optimization and manufacturing simulation of advanced engineering materials

    CERN Document Server

    2016-01-01

    This volume presents recent research work focused on the development of adequate theoretical and numerical formulations to describe the behavior of advanced engineering materials. Particular emphasis is devoted to applications in the fields of biological tissues, phase-changing and porous materials, polymers, and micro/nano-scale modeling. Sensitivity analysis and gradient- and non-gradient-based optimization procedures are involved in many of the chapters, aiming at the solution of constitutive inverse problems and parameter identification. All of these relevant topics are presented by experienced international, inter-institutional research teams, resulting in a high-level compilation. The book is a valuable research reference for scientists, senior undergraduate and graduate students, as well as for engineers working in the area of computational material modeling.

  9. Advances in multi-physics and high performance computing in support of nuclear reactor power systems modeling and simulation

    International Nuclear Information System (INIS)

    Significant advances in computational performance have occurred over the past two decades, achieved not only by the introduction of more powerful processors but also by the incorporation of parallelism in computer hardware at all levels. Simultaneous with these hardware and associated system software advances have been advances in modeling physical phenomena and in the numerical algorithms that allow their use in simulation. This paper presents a review of the advances in computer performance, discusses the modeling and simulation capabilities required to address the multi-physics and multi-scale phenomena applicable to a nuclear reactor core simulator, and presents examples of the performance of relevant physics simulation codes on high performance computers.

  10. NATO Advanced Study Institute on Advances in the Computer Simulations of Liquid Crystals

    CERN Document Server

    Zannoni, Claudio

    2000-01-01

    Computer simulations provide an essential set of tools for understanding the macroscopic properties of liquid crystals and of their phase transitions in terms of molecular models. While simulations of liquid crystals are based on the same general Monte Carlo and molecular dynamics techniques as are used for other fluids, they present a number of specific problems and peculiarities connected to the intrinsic properties of these mesophases. The field of computer simulations of anisotropic fluids is interdisciplinary and is evolving very rapidly. The present volume covers a variety of techniques and model systems, from lattices to hard particle and Gay-Berne to atomistic, for thermotropics, lyotropics, and some biologically interesting liquid crystals. Contributions are written by an excellent panel of international lecturers and provide a timely account of the techniques and problems in the field.
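
    As a concrete illustration of the lattice end of the model spectrum covered by the volume, the sketch below runs Metropolis Monte Carlo sweeps of the Lebwohl-Lasher lattice model, the simplest lattice model of a nematic liquid crystal. This is a minimal illustrative sketch in Python, not code from the volume; the grid size, temperature, and move size are assumptions chosen for demonstration.

        import numpy as np

        def p2(c):
            """Second Legendre polynomial P2(cos theta)."""
            return 1.5 * c * c - 0.5

        def site_energy(spins, i, j):
            """Nematic coupling of site (i, j) to its four neighbours (periodic); eps = 1."""
            n = spins.shape[0]
            e = 0.0
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                e -= p2(spins[i, j] @ spins[(i + di) % n, (j + dj) % n])
            return e

        def metropolis_sweep(spins, T, rng, delta=0.3):
            """One Metropolis sweep: randomly reorient directors, accept with exp(-dE/T)."""
            n = spins.shape[0]
            for _ in range(n * n):
                i, j = rng.integers(n, size=2)
                old, e_old = spins[i, j].copy(), site_energy(spins, i, j)
                trial = old + delta * rng.normal(size=3)   # small random reorientation
                spins[i, j] = trial / np.linalg.norm(trial)
                d_e = site_energy(spins, i, j) - e_old
                if d_e > 0 and rng.random() >= np.exp(-d_e / T):
                    spins[i, j] = old                      # reject the move

        rng = np.random.default_rng(0)
        n = 16
        spins = rng.normal(size=(n, n, 3))                 # random initial directors
        spins /= np.linalg.norm(spins, axis=-1, keepdims=True)
        for _ in range(100):
            metropolis_sweep(spins, T=0.8, rng=rng)        # T in units of eps/k_B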

  11. FY05-FY06 Advanced Simulation and Computing Implementation Plan, Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    Baron, A L

    2004-07-19

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program will require the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapon design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile life extension programs and the resolution of significant finding investigations (SFIs). This requires a balanced system of technical staff, hardware, simulation software, and computer science solutions.

  12. Advanced manned space flight simulation and training: An investigation of simulation host computer system concepts

    Science.gov (United States)

    Montag, Bruce C.; Bishop, Alfred M.; Redfield, Joe B.

    1989-01-01

    The findings of a preliminary investigation by Southwest Research Institute (SwRI) into simulation host computer concepts are presented. The study is designed to aid NASA in evaluating simulation technologies for use in spaceflight training. The focus of the investigation is on the next generation of space simulation systems that will be utilized in training personnel for Space Station Freedom operations. SwRI concludes that NASA should pursue a distributed simulation host computer system architecture for the Space Station Training Facility (SSTF) rather than a centralized mainframe-based arrangement. A distributed system offers many advantages and is seen by SwRI as the only architecture that will allow NASA to achieve established functional goals and operational objectives over the life of the Space Station Freedom program. Several distributed, parallel computing systems are available today that offer real-time capabilities for time-critical, man-in-the-loop simulation. These systems are flexible in terms of connectivity and configurability, and are easily scaled to meet increasing demands for more computing power.

  13. The Nuclear Energy Advanced Modeling and Simulation Enabling Computational Technologies FY09 Report

    Energy Technology Data Exchange (ETDEWEB)

    Diachin, L F; Garaizar, F X; Henson, V E; Pope, G

    2009-10-12

    In this document we report on the status of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Enabling Computational Technologies (ECT) effort. In particular, we provide the context for ECT in the broader NEAMS program and describe the three pillars of the ECT effort, namely, (1) tools and libraries, (2) software quality assurance, and (3) computational facility (computers, storage, etc.) needs. We report on our FY09 deliverables to determine the needs of the integrated performance and safety codes (IPSCs) in these three areas and lay out the general plan for software quality assurance to meet the requirements of DOE and the DOE Advanced Fuel Cycle Initiative (AFCI). We conclude with a brief description of our interactions with the Idaho National Laboratory computer center to determine what is needed to expand its role as a NEAMS user facility.

  14. Advanced Simulation and Computing FY10-11 Implementation Plan Volume 2, Rev. 0

    Energy Technology Data Exchange (ETDEWEB)

    Carnes, B

    2009-06-08

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one that

  15. Advanced Simulation and Computing FY09-FY10 Implementation Plan, Volume 2, Revision 0.5

    Energy Technology Data Exchange (ETDEWEB)

    Meisner, R; Hopson, J; Peery, J; McCoy, M

    2008-10-07

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one

  16. Advanced Simulation and Computing FY09-FY10 Implementation Plan Volume 2, Rev. 1

    Energy Technology Data Exchange (ETDEWEB)

    Kissel, L

    2009-04-01

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one that

  17. Advanced Simulation and Computing FY10-FY11 Implementation Plan Volume 2, Rev. 0.5

    Energy Technology Data Exchange (ETDEWEB)

    Meisner, R; Peery, J; McCoy, M; Hopson, J

    2009-09-08

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering (D&E) programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model

  18. Advanced Simulation and Computing FY08-09 Implementation Plan, Volume 2, Revision 0.5

    Energy Technology Data Exchange (ETDEWEB)

    Kusnezov, D; Bickel, T; McCoy, M; Hopson, J

    2007-09-13

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from

  19. Advanced Simulation and Computing FY08-09 Implementation Plan Volume 2 Revision 0

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, M; Kusnezov, D; Bickel, T; Hopson, J

    2007-04-25

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one

  20. Advanced Simulation and Computing FY07-08 Implementation Plan Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    Kusnezov, D; Hale, A; McCoy, M; Hopson, J

    2006-06-22

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program will require the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from

  21. Advanced Simulation and Computing Fiscal Year 2016 Implementation Plan, Version 0

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)]; Archer, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)]; Hendrickson, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)]

    2015-08-27

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The purpose of this Implementation Plan (IP) is to outline key work requirements to be performed and to control individual work activities within the scope of work. Contractors may not deviate from this plan without a revised work authorization (WA) or subsequent IP.

  22. Current Advances in the Computational Simulation of the Formation of Low-Mass Stars

    Energy Technology Data Exchange (ETDEWEB)

    Klein, R I; Inutsuka, S; Padoan, P; Tomisaka, K

    2005-10-24

    Developing a theory of low-mass star formation (≈0.1 to 3 M⊙) remains one of the most elusive and important goals of theoretical astrophysics. The star-formation process is the outcome of the complex dynamics of interstellar gas involving non-linear interactions of turbulence, gravity, magnetic field and radiation. The evolution of protostellar condensations, from the moment they are assembled by turbulent flows to the time they reach stellar densities, spans an enormous range of scales, resulting in a major computational challenge for simulations. Since the previous Protostars and Planets conference, dramatic advances in numerical algorithmic techniques have been made and successfully implemented on large-scale parallel supercomputers. Among such techniques, Adaptive Mesh Refinement and Smoothed Particle Hydrodynamics have provided frameworks to simulate the process of low-mass star formation with a very large dynamic range. It is now feasible to explore the turbulent fragmentation of molecular clouds and the gravitational collapse of cores into stars self-consistently within the same calculation. The increased sophistication of these powerful methods comes with substantial caveats associated with the use of the techniques and the interpretation of the numerical results. In this review, we examine what has been accomplished in the field and present a critique of both numerical methods and scientific results. We stress that computational simulations should obey the available observational constraints and demonstrate numerical convergence. Failing this, results of large scale simulations do not advance our understanding of low-mass star formation.

  23. An expanded framework for the advanced computational testing and simulation toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Marques, Osni A.; Drummond, Leroy A.

    2003-11-09

    The Advanced Computational Testing and Simulation (ACTS) Toolkit is a set of computational tools developed primarily at DOE laboratories and is aimed at simplifying the solution of common and important computational problems. The use of the tools reduces the development time for new codes, and the tools provide functionality that might not otherwise be available. This document outlines an agenda for expanding the scope of the ACTS Project based on lessons learned from current activities. Highlights of this agenda include peer-reviewed certification of new tools; finding tools to solve problems that are not currently addressed by the Toolkit; working in collaboration with other software initiatives and DOE computer facilities; expanding outreach efforts; promoting interoperability and further development of the tools; and improving functionality of the ACTS Information Center, among other tasks. The ultimate goal is to make the ACTS tools more widely used and more effective in solving DOE's and the nation's scientific problems through the creation of a reliable software infrastructure for scientific computing.

  24. Advanced Simulation and Computing: A Summary Report to the Director's Review

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, M G; Peck, T

    2003-06-01

    It has now been three years since the Advanced Simulation and Computing Program (ASCI), as managed by the Defense and Nuclear Technologies (DNT) Directorate, was last reviewed by this Director's Review Committee (DRC). Since that time, there has been considerable progress for all components of the ASCI Program, and these developments will be highlighted in this document and in the presentations planned for June 9 and 10, 2003. There have also been some name changes. Today, the Program is called "Advanced Simulation and Computing." Although it retains the familiar acronym ASCI, the initiative nature of the effort has given way to sustained services as an integral part of the Stockpile Stewardship Program (SSP). All computing efforts at LLNL and the other two Defense Program (DP) laboratories are funded and managed under ASCI. This includes the so-called legacy codes, which remain essential tools in stockpile stewardship. The contract between the Department of Energy (DOE) and the University of California (UC) specifies an independent appraisal of Directorate technical work and programmatic management; that appraisal is the work of this DNT Review Committee. Beginning this year, the Laboratory is implementing a new review system. This process was negotiated between UC, the National Nuclear Security Administration (NNSA), and the Laboratory Directors. Central to this approach are eight performance objectives that focus on key programmatic and administrative goals. Associated with each of these objectives are a number of performance measures to more clearly characterize the attainment of the objectives. Each performance measure has a lead directorate and one or more contributing directorates. Each measure has an evaluation plan and has identified expected documentation to be included in the "Assessment File".

  25. Computer-based simulations

    OpenAIRE

    Antonoaie, C.; Antonoaie, N.

    2010-01-01

    A computer-based simulation replicates an environment through a computer program designed to consider multiple variables, interactions, and system constraints. Computer-based simulation is used in organization studies to model human social systems to better understand the dynamics between individual and group behaviours. These methods advance organization studies research in many ways. They can be used for extrapolating theory, validating hypotheses, or revealing emergent behaviour. Simulation...

  26. Advances in simulated modeling of vibration systems based on computational intelligence

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Computational intelligence is the computational simulation of bio-intelligence, which includes artificial neural networks, fuzzy systems, and evolutionary computation. This article summarizes the state of the art in the field of simulated modeling of vibration systems using methods of computational intelligence, based on some relevant subjects and the authors' own research work. First, contributions to the applications of computational intelligence to the identification of nonlinear characteristics of packaging are reviewed. Subsequently, applications of the newly developed training algorithms for feedforward neural networks to the identification of restoring forces in multi-degree-of-freedom nonlinear systems are discussed. Finally, the neural-network-based method of model reduction for the dynamic simulation of microelectromechanical systems (MEMS) using the generalized Hebbian algorithm (GHA) and robust GHA is outlined. The prospects of the simulated modeling of vibration systems using techniques of computational intelligence are also indicated.
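
    The generalized Hebbian algorithm mentioned above (also known as Sanger's rule) extracts the leading principal components of a signal stream with a simple online update, which is what makes it attractive for model reduction. The sketch below is a minimal NumPy illustration on assumed synthetic data; it is not the authors' MEMS code.

        import numpy as np

        def gha_update(W, x, eta=1e-3):
            """Generalized Hebbian (Sanger's rule) step:
            dW = eta * (y x^T - lower_triangular(y y^T) W), with y = W x."""
            y = W @ x
            W += eta * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)

        rng = np.random.default_rng(1)
        basis = rng.normal(size=(10, 2))          # two dominant latent modes (synthetic)
        W = 0.01 * rng.normal(size=(2, 10))       # two components to extract
        for _ in range(5000):
            x = basis @ rng.normal(size=2) + 0.01 * rng.normal(size=10)
            gha_update(W, x)
        # The rows of W now approximately span the dominant two-dimensional subspace,
        # i.e. the reduced basis that a model-reduction step would simulate with.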

  27. Computer simulation of homogenization of boric acid in a pressurizer of the advanced nuclear reactor

    Energy Technology Data Exchange (ETDEWEB)

    Rosa, Jose E.P. da; Moreira, Maria de L., E-mail: jeduird@hotmail.com, E-mail: malu@ien.gov.br [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil)]; Oliveira, Andre F. de, E-mail: eafoliveira@ien.gov.br [Coordenacao dos Programas de Pos-Graduacao em Engenharia (COPPE/UFRJ), Rio de Janeiro, RJ (Brazil). Programa de Engenharia Nuclear]

    2013-07-01

    The reactivity of a water-cooled reactor is controlled using control rods or boron dilution in the primary circuit water. Boron-10 (¹⁰B) is an efficient neutron absorber, especially of thermal neutrons. Transient studies involving deficiencies in the homogenization of boron in PWR reactors are important because the boric acid solution is added to the primary circuit coolant in order to help control the fission rate in the reactor core. After reactor shutdown, the boron present in the coolant has the function of maintaining reactor subcriticality. If solution with a low boron concentration enters the primary circuit, it becomes necessary to inject boron and to ensure that the coolant is well homogenized in order to increase the concentration, thus preventing water with small amounts of boron from reaching the core. The aim of this study is to simulate boron homogenization in the pressurizer of an advanced nuclear reactor. A test section representing one quarter of a modular nuclear reactor pressurizer is used. Using the CFX code, a computer program that allows thermal-hydraulic analysis of different types of flow, three examples were simulated under different operating conditions. From the results, the parameters that could influence this homogenization were analyzed. Case studies varying the dimensions of the water inlet and outlet tubes, the flow rate, and the positioning of inlets and outlets were performed with the goal of finding parameters that could help optimize the homogenization of boron. The results confirm that the parameters analyzed can be adjusted in the design in order to obtain the best operating condition. (author)

  28. Computer programs for capital cost estimation, lifetime economic performance simulation, and computation of cost indexes for laser fusion and other advanced technology facilities

    International Nuclear Information System (INIS)

    Three FORTRAN programs, CAPITAL, VENTURE, and INDEXER, have been developed to automate computations used in assessing the economic viability of proposed or conceptual laser fusion and other advanced-technology facilities, as well as conventional projects. The types of calculations performed by these programs are, respectively, capital cost estimation, lifetime economic performance simulation, and computation of cost indexes. The codes permit these three topics to be addressed with considerable sophistication commensurate with user requirements and available data
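
    The record does not reproduce the FORTRAN sources, but the flavor of the lifetime economic computation can be conveyed with a short levelized-cost sketch in Python. All names and figures below are hypothetical illustrations, not values or routines from CAPITAL, VENTURE, or INDEXER.

        def levelized_cost(capital, annual_om, annual_output, rate, years):
            """Annualize capital with a capital recovery factor and spread the
            total annual cost over annual output (cost per unit of output)."""
            crf = rate * (1 + rate) ** years / ((1 + rate) ** years - 1)
            return (capital * crf + annual_om) / annual_output

        # Hypothetical facility: $2.5B capital, $80M/yr O&M, 7e9 kWh/yr, 30 yr at 8%.
        print(levelized_cost(2.5e9, 8.0e7, 7.0e9, 0.08, 30))  # roughly 0.04 $/kWh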

  29. Recent advances in computational methodology for simulation of mechanical circulatory assist devices

    OpenAIRE

    Marsden, Alison L.; Bazilevs, Yuri; Long, Christopher C.; Behr, Marek

    2014-01-01

    Ventricular assist devices (VADs) provide mechanical circulatory support to offload the work of one or both ventricles during heart failure. They are used in the clinical setting as destination therapy, as bridge to transplant, or more recently as bridge to recovery to allow for myocardial remodeling. Recent developments in computational simulation allow for detailed assessment of VAD hemodynamics for device design and optimization for both children and adults. Here, we provide a focused revi...

  30. Development of Computational Approaches for Simulation and Advanced Controls for Hybrid Combustion-Gasification Chemical Looping

    Energy Technology Data Exchange (ETDEWEB)

    Joshi, Abhinaya; Lou, Xinsheng; Neuschaefer, Carl; Chaudry, Majid; Quinn, Joseph

    2012-07-31

    This document provides the results of the project through September 2009. The Phase I project has recently been extended from September 2009 to March 2011. The project extension will begin work on Chemical Looping (CL) Prototype modeling and advanced control design exploration in preparation for a scale-up phase. The results to date include: successful development of dual loop chemical looping process models and dynamic simulation software tools; development and testing of several advanced control concepts and applications for Chemical Looping transport control; and investigation of several sensor concepts, establishing two feasible sensor candidates recommended for further prototype development and controls integration. There are three sections in this summary and conclusions. Section 1 presents the project scope and objectives. Section 2 highlights the detailed accomplishments by project task area. Section 3 provides conclusions to date and recommendations for future work.

  31. Advances in Intelligent Modelling and Simulation Artificial Intelligence-Based Models and Techniques in Scalable Computing

    CERN Document Server

    Khan, Samee; Burczyński, Tadeusz

    2012-01-01

    One of the most challenging issues in today’s large-scale computational modeling and design is to effectively manage the complex distributed environments, such as computational clouds, grids, ad hoc and P2P networks, operating under various types of users with evolving relationships fraught with uncertainties. In this context, the IT resources and services usually belong to different owners (institutions, enterprises, or individuals) and are managed by different administrators. Moreover, uncertainties are presented to the system at hand in various forms of information that is incomplete, imprecise, fragmentary, or overloading, which hinders the full and precise resolution of the evaluation criteria, sequencing and selection, and the assignment of scores. Intelligent scalable systems enable flexible routing and charging, advanced user interactions, and the aggregation and sharing of geographically distributed resources in modern large-scale systems. This book presents new ideas, theories, models...

  32. Development of advanced computational fluid dynamics tools and their application to simulation of internal turbulent flows

    Science.gov (United States)

    Emelyanov, V. N.; Karpenko, A. G.; Volkov, K. N.

    2015-06-01

    Modern graphics processing units (GPUs) provide architectures and new programming models that make it possible to harness their large processing power and to design computational fluid dynamics (CFD) simulations at both high performance and low cost. Possibilities for the use of GPUs in the simulation of internal fluid flows are discussed. The finite volume method is applied to solve the three-dimensional (3D) unsteady compressible Euler and Navier-Stokes equations on unstructured meshes. Compute Unified Device Architecture (CUDA) technology is used for the implementation of parallel computational algorithms. Solutions of some fluid dynamics problems on GPUs are presented, and approaches to optimizing the CFD code related to the use of different types of memory are discussed. The speedup of the solution on GPUs with respect to the solution on the central processing unit (CPU) is compared using different meshes and different methods of distributing input data into blocks. Performance measurements show that the numerical schemes developed achieve a 20 to 50 times speedup on GPU hardware compared to the CPU reference implementation. The results obtained provide a promising perspective for designing a GPU-based software framework for CFD applications.
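
    The CPU-versus-GPU speedup measurement described above can be reproduced in miniature. The sketch below times the same flux-difference stencil on NumPy (CPU) and CuPy (GPU); it assumes a CUDA-capable device and the cupy package, and it stands in for the paper's native CUDA implementation rather than reproducing it.

        import time
        import numpy as np
        import cupy as cp

        def stencil(u):
            """A simple 1D flux-difference update; works on NumPy or CuPy arrays."""
            return u[1:-1] - 0.1 * (u[2:] - u[:-2])

        n = 10_000_000
        u_cpu = np.linspace(0.0, 1.0, n)
        u_gpu = cp.asarray(u_cpu)

        t0 = time.perf_counter()
        for _ in range(100):
            u_cpu[1:-1] = stencil(u_cpu)
        t_cpu = time.perf_counter() - t0

        cp.cuda.Device().synchronize()             # finish pending transfers first
        t0 = time.perf_counter()
        for _ in range(100):
            u_gpu[1:-1] = stencil(u_gpu)
        cp.cuda.Device().synchronize()             # wait for GPU kernels to finish
        t_gpu = time.perf_counter() - t0

        print(f"GPU speedup: {t_cpu / t_gpu:.1f}x")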

  33. Predicting Earthquake Occurrence at Subduction-Zone Plate Boundaries Through Advanced Computer Simulation

    Science.gov (United States)

    Matsu'Ura, M.; Hashimoto, C.; Fukuyama, E.

    2004-12-01

    In general, predicting the occurrence of earthquakes is very difficult, because of the complexity of actual faults and the nonlinear interaction between them. From the standpoint of earthquake prediction, however, our target is limited to the large events that completely break down a seismogenic zone. To such large events we may apply the concept of the earthquake cycle. The entire process of earthquake generation cycles generally consists of tectonic loading due to relative plate motion, quasi-static rupture nucleation, dynamic rupture propagation and arrest, and restoration of fault strength. This process can be completely described by a coupled nonlinear system, which consists of an elastic/viscoelastic slip-response function that relates fault slip to shear stress change and a fault constitutive law that prescribes the change in shear strength with fault slip and contact time. The shear stress and the shear strength are related to each other through boundary conditions on the fault. The driving force of this system is the observed relative plate motion. The system describing the earthquake generation cycle is conceptually quite simple; the complexity in practical modeling mainly comes from the complexity in the structure of the real earth. Recently, we have developed a physics-based, predictive simulation system for earthquake generation at plate boundaries in and around Japan, where the Pacific, North American, Philippine Sea, and Eurasian plates interact with each other. The simulation system consists of a crust-mantle structure model, a quasi-static tectonic loading model, and a dynamic rupture propagation model. First, we constructed a realistic 3D model of plate interfaces in and around Japan by applying an inversion technique to ISC hypocenter data, and computed viscoelastic slip-response functions for this structure model. Second, we introduced the slip- and time-dependent fault constitutive law with an inherent strength-restoration mechanism as a basic
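
    The coupled system sketched in this abstract can be written schematically as follows; the kernel K, the strength function F, and the symbols are generic placeholders consistent with the description above, not the authors' exact notation:

        \tau(\mathbf{x},t) = \tau_{\mathrm{load}}(\mathbf{x},t)
          + \int_0^{t}\!\int_{\Sigma} K(\mathbf{x},\boldsymbol{\xi};\,t-t')\,
            \dot{w}(\boldsymbol{\xi},t')\,d\Sigma\,dt'
        \quad \text{(elastic/viscoelastic slip response: stress from loading and slip rate } \dot{w}\text{)}

        \tau_{s}(\mathbf{x},t) = F\bigl(w(\mathbf{x},t),\,t_{c}\bigr)
        \quad \text{(constitutive law: strength from cumulative slip } w \text{ and contact time } t_{c}\text{)}

        \tau = \tau_{s} \ \text{on slipping portions of the plate interface } \Sigma

    The first relation carries the tectonic loading and slip response, the second the strength restoration, and the third is the boundary condition that couples stress to strength on the fault.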

  34. Advanced computer simulation and modelling for solving single phase hydraulic problems

    International Nuclear Information System (INIS)

    This paper discusses methods for performing single-phase hydraulic calculations for complex piping networks and applications that require a high degree of accuracy. Two separate computer programs are utilized for the simulation and modeling of the networks. Equivalent lengths of piping and the corresponding flows and pressures are calculated using the Overthruster and Kypipe computer programs, respectively. The Overthruster program is designed to perform standardized in-plant L/D hydraulic calculations; this program contains certain empirical equations and data. The Kypipe program is designed specifically to simulate steady-state pressure and flow calculations in piping distribution systems transporting fluids. Fluor Daniel completed the modification design, and Southern California Edison installed the modification and performed start-up testing of the system. The actual test results, pressures and flows, correlated to within 2 percent of the values predicted by analytical methods. This unique example demonstrates the analytical capabilities and the level of accuracy achieved using this method, versus conventional methods with typical inaccuracies of 10 to 15 percent
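
    The equivalent-length (L/D) approach referred to above folds fitting losses into a single Darcy-Weisbach expression. In standard textbook form (this is the generic relation, not code from Overthruster or Kypipe):

        \Delta p \;=\; f\,\frac{L + \sum_i L_{\mathrm{eq},i}}{D}\,\frac{\rho v^{2}}{2},
        \qquad L_{\mathrm{eq},i} \;=\; D \left(\frac{L}{D}\right)_{i}

    where f is the friction factor, L the physical pipe length, D the diameter, and each fitting i contributes its tabulated L/D equivalent length.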

  35. Advanced adaptive computational methods for Navier-Stokes simulations in rotorcraft aerodynamics

    Science.gov (United States)

    Stowers, S. T.; Bass, J. M.; Oden, J. T.

    1993-01-01

    A Phase 2 research and development effort was conducted in the area of transonic, compressible, inviscid flows, with the ultimate goal of numerically modeling the complex flows inherent in advanced helicopter blade designs. The algorithms and methodologies are classified as adaptive methods: error estimation techniques approximate the local numerical error, and the mesh is automatically refined or unrefined so as to deliver a given level of accuracy. The result is a scheme that attempts to produce the best possible results with the least number of grid points, degrees of freedom, and operations. These types of schemes automatically locate and resolve shocks, shear layers, and other flow details to an accuracy level specified by the user of the code. The Phase 1 work involved a feasibility study of h-adaptive methods for steady viscous flows, with emphasis on accurate simulation of vortex initiation, migration, and interaction. The Phase 2 effort focused on extending these algorithms and methodologies to a three-dimensional topology.
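
    The estimate-refine-unrefine cycle that defines an h-adaptive scheme follows a simple control loop. The sketch below is a generic Python outline; the solver, error estimator, and mesh API are placeholders for illustration, not the Phase 2 code.

        def h_adaptive_solve(mesh, solve, estimate_error, tol, max_cycles=10):
            """Generic h-adaptation loop: solve, estimate per-cell error,
            refine cells above tolerance, coarsen cells well below it."""
            for _ in range(max_cycles):
                solution = solve(mesh)
                errors = estimate_error(mesh, solution)     # dict: cell -> local error
                to_refine = [c for c, e in errors.items() if e > tol]
                to_coarsen = [c for c, e in errors.items() if e < 0.1 * tol]
                if not to_refine:
                    return mesh, solution                   # accuracy target met
                mesh = mesh.refine(to_refine).coarsen(to_coarsen)
            return mesh, solve(mesh)

    Shocks and shear layers show up as cells whose estimated error stays above tolerance, so refinement concentrates there automatically.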

  36. Advanced computations in plasma physics

    International Nuclear Information System (INIS)

    Scientific simulation, in tandem with theory and experiment, is an essential tool for understanding complex plasma behavior. In this paper we review recent progress and future directions for advanced simulations in magnetically confined plasmas, with illustrative examples chosen from magnetic confinement research areas such as microturbulence, magnetohydrodynamics, magnetic reconnection, and others. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales, together with access to powerful new computational resources. In particular, the fusion energy science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPPs). A good example is the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPPs to produce three-dimensional, general geometry, nonlinear particle simulations that have accelerated progress in understanding the nature of turbulence self-regulation by zonal flows. It should be emphasized that these calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present-generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. The associated scientific excitement should serve to

  17. Recent Advances in Computational Simulation of Macro-, Meso-, and Micro-Scale Biomimetics Related Fluid Flow Problems

    Institute of Scientific and Technical Information of China (English)

    Y. Y. Yan

    2007-01-01

    Over the last decade, computational methods have been intensively applied to a variety of scientific research problems and engineering designs. Although the computational fluid dynamics (CFD) method has played a dominant role in studying and simulating transport phenomena involving fluid flow and heat and mass transfer, in recent years other numerical methods for simulations at meso- and micro-scales have also been actively applied to solve the physics of complex flow and fluid-interface interactions. This paper presents a review of recent advances in multi-scale computational simulation of biomimetics-related fluid flow problems. State-of-the-art numerical techniques, such as the lattice Boltzmann method (LBM), molecular dynamics (MD), and conventional CFD, are introduced, together with their application to problems such as fish flow, the electro-osmosis effect of earthworm motion, and self-cleaning hydrophobic surfaces. The new challenge of modelling biomimetic problems, namely developing the physical conditions of self-cleaning hydrophobic surfaces, is discussed.
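
    Of the techniques surveyed, the lattice Boltzmann method lends itself to a compact demonstration. The following minimal D2Q9 BGK sketch (periodic boundaries, a decaying shear wave as the initial condition) is a generic illustration rather than code from the review; grid size, relaxation time and velocity amplitude are arbitrary demo values.

```python
# Minimal D2Q9 lattice Boltzmann (BGK) sketch: stream, take moments, collide.
import numpy as np

nx, ny, tau = 64, 64, 0.8                 # lattice size, relaxation time (demo)
c = np.array([(0, 0), (1, 0), (0, 1), (-1, 0), (0, -1),
              (1, 1), (-1, 1), (-1, -1), (1, -1)])        # D2Q9 velocities
w = np.array([4/9] + [1/9] * 4 + [1/36] * 4)              # lattice weights

def equilibrium(rho, ux, uy):
    cu = 3.0 * (c[:, 0, None, None] * ux + c[:, 1, None, None] * uy)
    usq = 1.5 * (ux**2 + uy**2)
    return rho * w[:, None, None] * (1 + cu + 0.5 * cu**2 - usq)

y = np.arange(ny)
rho = np.ones((nx, ny))
ux = 0.05 * np.sin(2 * np.pi * y / ny)[None, :] * np.ones((nx, ny))  # shear wave
uy = np.zeros((nx, ny))
fdist = equilibrium(rho, ux, uy)

for step in range(200):
    for i in range(9):                     # streaming along lattice velocities
        fdist[i] = np.roll(np.roll(fdist[i], c[i, 0], axis=0), c[i, 1], axis=1)
    rho = fdist.sum(axis=0)                # macroscopic moments
    ux = (fdist * c[:, 0, None, None]).sum(axis=0) / rho
    uy = (fdist * c[:, 1, None, None]).sum(axis=0) / rho
    fdist -= (1.0 / tau) * (fdist - equilibrium(rho, ux, uy))   # BGK collision

print("shear amplitude after 200 steps:", np.abs(ux).max())  # viscous decay
```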

  18. Advances in computers

    CERN Document Server

    Memon, Atif

    2012-01-01

    Since its first volume in 1960, Advances in Computers has presented detailed coverage of innovations in computer hardware, software, theory, design, and applications. It has also provided contributors with a medium in which they can explore their subjects in greater depth and breadth than journal articles usually allow. As a result, many articles have become standard references that continue to be of significant, lasting value in this rapidly expanding field. In-depth surveys and tutorials on new computer technology; well-known authors and researchers in the field; extensive bibliographies with m

  19. Advances in Computers

    CERN Document Server

    Zelkowitz, Marvin

    2010-01-01

    This is volume 79 of Advances in Computers. This series, which began publication in 1960, is the oldest continuously published anthology that chronicles the ever-changing information technology field. In these volumes we publish from 5 to 7 chapters, three times per year, that cover the latest changes to the design, development, use and implications of computer technology on society today. Covers the full breadth of innovations in hardware, software, theory, design, and applications. Many of the in-depth reviews have become standard references that co

  20. Community Petascale Project for Accelerator Science and Simulation: Advancing Computational Science for Future Accelerators and Accelerator Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Spentzouris, P.; /Fermilab; Cary, J.; /Tech-X, Boulder; McInnes, L.C.; /Argonne; Mori, W.; /UCLA; Ng, C.; /SLAC; Ng, E.; Ryne, R.; /LBL, Berkeley

    2011-11-14

    for software development and applications accounts for the natural domain areas (beam dynamics, electromagnetics, and advanced acceleration), and all areas depend on the enabling technologies activities, such as solvers and component technology, to deliver the desired performance and integrated simulation environment. The ComPASS applications focus on computationally challenging problems important for the design or performance optimization of all major HEP, NP, and BES accelerator facilities. With the cost and complexity of particle accelerators rising, the use of computation to optimize their designs and find improved operating regimes becomes essential, potentially leading to significant cost savings with modest investment.

  1. New Science Gateways for Advanced Computing Simulations and Visualization Using Vine Toolkit in PL-Grid

    OpenAIRE

    Piotr Dziubecki; Piotr Grabowski; Michał Krysinski; Tomasz Kuczynski; Krzysztof Kurowski; Tomasz Piontek; Dawid Szejnfeld

    2013-01-01

    A Science Gateway is a connection between scientists and their computational tools in the form of a web portal. It creates a space for communities, collaboration, data sharing and visualization in a comprehensive and efficient manner. The main purpose of such a solution is to allow users to access computational resources, process and analyze their data, and get the results in a uniform and user-friendly way. In this paper we propose a complex solution based on the Rich Internet Applicatio...

  2. Towards advanced code simulators

    International Nuclear Information System (INIS)

    The Central Electricity Generating Board (CEGB) uses advanced thermohydraulic codes extensively to support PWR safety analyses. A system has been developed to allow fully interactive execution of any code, with graphical simulation of the operator desk and mimic display. The system operates in a virtual machine environment, with the thermohydraulic code executing in one virtual machine and communicating via interrupts with any number of other virtual machines, each running other programs and graphics drivers. The driver code itself does not have to be modified from its normal batch form. Shortly following the release of RELAP5 MOD1 in IBM-compatible form in 1983, this code was used as the driver for this system. When RELAP5 MOD2 became available, it was adopted with no changes needed in the basic system. Overall, the system has been used for some 5 years for the analysis of LOBI tests, full-scale plant studies and simple what-if studies. For gaining rapid understanding of system dependencies it has proved invaluable. The graphical mimic system, being independent of the driver code, has also been used with other codes to study core rewetting, to replay results obtained from batch jobs on a CRAY2 computer system, and to display suitably processed experimental results from the LOBI facility to aid interpretation. For the above work real-time execution was not necessary. Current work centers on implementing the RELAP5 code on a true parallel-architecture machine. Marconi Simulation have been contracted to investigate the feasibility of using upwards of 100 processors, each capable of a peak of 30 MIPS, to run a highly detailed RELAP5 model in real time, complete with specially written 3D core neutronics and balance-of-plant models. This paper describes the experience of using RELAP5 as an analyzer/simulator, and outlines the proposed methods and problems associated with parallel execution of RELAP5.

  3. Exploring Interactive and Dynamic Simulations Using a Computer Algebra System in an Advanced Placement Chemistry Course

    Science.gov (United States)

    Matsumoto, Paul S.

    2014-01-01

    The article describes the use of Mathematica, a computer algebra system (CAS), in a high school chemistry course. Mathematica was used to generate a graph, where a slider controls the value of parameter(s) in the equation; thus, students can visualize the effect of the parameter(s) on the behavior of the system. Also, Mathematica can show the…
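
    A rough open-source analogue of the slider-controlled graphs described above can be built with matplotlib's Slider widget. The first-order decay curve and the rate constant k below are hypothetical stand-ins; the article's own Mathematica examples are not reproduced here.

```python
# Interactive parameter slider: redraw exp(-k*x) as the user drags k.
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.widgets import Slider

x = np.linspace(0, 14, 500)
fig, ax = plt.subplots()
fig.subplots_adjust(bottom=0.25)                 # leave room for the slider
line, = ax.plot(x, np.exp(-1.0 * x))             # e.g. first-order kinetics
ax.set_xlabel("time"); ax.set_ylabel("concentration")

slider_ax = fig.add_axes([0.15, 0.1, 0.7, 0.03])
k_slider = Slider(slider_ax, "k", 0.1, 3.0, valinit=1.0)   # rate constant

def update(val):
    line.set_ydata(np.exp(-k_slider.val * x))    # recompute with new parameter
    fig.canvas.draw_idle()

k_slider.on_changed(update)
plt.show()
```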

  4. Recent advances in renal hypoxia: insights from bench experiments and computer simulations.

    Science.gov (United States)

    Layton, Anita T

    2016-07-01

    The availability of oxygen in renal tissue is determined by the complex interactions among a host of processes, including renal blood flow, glomerular filtration, arterial-to-venous oxygen shunting, medullary architecture, Na(+) transport, and oxygen consumption. When this delicate balance is disrupted, the kidney may become susceptible to hypoxic injury. Indeed, renal hypoxia has been implicated as one of the major causes of acute kidney injury and chronic kidney diseases. This review highlights recent advances in our understanding of renal hypoxia; some of these studies were published in response to a recent Call for Papers of this journal: Renal Hypoxia. PMID:27147670

  5. Casting directly from a computer model by using advanced simulation software FLOW-3D Cast ®

    Directory of Open Access Journals (Sweden)

    M. Sirviö

    2009-01-01

    ConiferRob, a patternless casting technique originally conceived at VTT Technical Research Centre of Finland and further developed at its spin-off company, Simtech Systems, offers up to 40% savings in product development costs and up to two months shorter development times compared to conventional techniques. Savings of this order can be very valuable in today's highly competitive markets. Casting simulation is commonly used for the design of casting systems. However, most of the software available today is old-fashioned, predicting just shrinkage porosity. Flow Science, VTT and Simtech have developed new software called FLOW-3D Cast ®, which can simulate surface defects, air entrainment, filters, core gas problems and even cavitation.

  6. Casting directly from a computer model by using advanced simulation software FLOW-3D Cast ®

    OpenAIRE

    M. Sirviö; M. Woś

    2009-01-01

    ConiferRob - A patternless casting technique, originally conceived at VTT Technical Research Centre of Finland and further developed at its spin-off company, Simtech Systems, offers up to 40% savings in product development costs, and up to two months shorter development times compared to conventional techniques. Savings of this order can be very valuable on today's highly competitive markets. Casting simulation is commonly used for designing of casting systems. However, most of the software are ...

  7. Editorial, Workshop on New Directions for Advanced Computer Simulations and Experiments in Fusion-Related Plasma-Surface Interactions

    International Nuclear Information System (INIS)

    Because plasma-boundary physics encompasses some of the most important unresolved issues for both the International Thermonuclear Experimental Reactor (ITER) project and future fusion power reactors, there is a strong interest in the fusion community for better understanding and characterization of plasma-wall interactions. Chemical and physical sputtering cause the erosion of the limiters/divertor plates and vacuum vessel walls (made of C, Be and W, for example) and degrade fusion performance by diluting the fusion fuel and excessively cooling the core, while carbon redeposition could produce long-term in-vessel tritium retention, degrading the superior thermo-mechanical properties of the carbon materials. Mixed plasma-facing materials are proposed, requiring optimization for different power and particle flux characteristics. Knowledge of material properties as well as characteristics of the plasma-material interaction are prerequisites for such optimizations. Computational power will soon reach hundreds of teraflops, so that theoretical and plasma science expertise can be matched with new experimental capabilities in order to mount a strong response to these challenges. To begin to address such questions, a Workshop on New Directions for Advanced Computer Simulations and Experiments in Fusion-Related Plasma-Surface Interactions for Fusion (PSIF) was held at the Oak Ridge National Laboratory from 21 to 23 March, 2005. The purpose of the workshop was to bring together researchers in fusion related plasma-wall interactions in order to address these topics and to identify the most needed and promising directions for study, to exchange opinions on the present depth of knowledge of surface properties for the main fusion-related materials, e.g., C, Be and W, especially for sputtering, reflection, and deuterium (tritium) retention properties. The goal was to suggest the most important next steps needed for such basic computational and experimental work to be facilitated

  8. Compute Canada: Advancing Computational Research

    International Nuclear Information System (INIS)

    High Performance Computing (HPC) is redefining the way that research is done. Compute Canada's HPC infrastructure provides a national platform that enables Canadian researchers to compete on an international scale, attracts top talent to Canadian universities and broadens the scope of research.

  9. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan : ASC software quality engineering practices Version 3.0.

    Energy Technology Data Exchange (ETDEWEB)

    Turgeon, Jennifer L.; Minana, Molly A.; Hackney, Patricia; Pilch, Martin M.

    2009-01-01

    The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in the US Department of Energy/National Nuclear Security Administration (DOE/NNSA) Quality Criteria, Revision 10 (QC-1) as 'conformance to customer requirements and expectations'. This quality plan defines the SNL ASC Program software quality engineering (SQE) practices and provides a mapping of these practices to the SNL Corporate Process Requirement (CPR) 001.3.6, 'Corporate Software Engineering Excellence'. This plan also identifies ASC management's and the software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals. This SNL ASC Software Quality Plan establishes the signatories' commitments to improving software products by applying cost-effective SQE practices. This plan enumerates the SQE practices that comprise the development of SNL ASC's software products and explains the project teams' opportunities for tailoring and implementing the practices.

  10. Scientific computer simulation review

    International Nuclear Information System (INIS)

    Before the results of a scientific computer simulation are used for any purpose, it should be determined if those results can be trusted. Answering that question of trust is the domain of scientific computer simulation review. There is limited literature that focuses on simulation review, and most is specific to the review of a particular type of simulation. This work is intended to provide a foundation for a common understanding of simulation review. This is accomplished through three contributions. First, scientific computer simulation review is formally defined. This definition identifies the scope of simulation review and provides the boundaries of the review process. Second, maturity assessment theory is developed. This development clarifies the concepts of maturity criteria, maturity assessment sets, and maturity assessment frameworks, which are essential for performing simulation review. Finally, simulation review is described as the application of a maturity assessment framework. This is illustrated through evaluating a simulation review performed by the U.S. Nuclear Regulatory Commission. In making these contributions, this work provides a means for a more objective assessment of a simulation’s trustworthiness and takes the next step in establishing scientific computer simulation review as its own field. Highlights:
    • We define scientific computer simulation review.
    • We develop maturity assessment theory.
    • We formally define a maturity assessment framework.
    • We describe simulation review as the application of a maturity framework.
    • We provide an example of a simulation review using a maturity framework.
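
    The concepts of maturity criteria, assessment sets, and frameworks can be made concrete as a small data structure plus an evaluation routine. The criteria names and levels below are invented for illustration; they are not the framework defined in the paper.

```python
# Toy maturity-assessment framework: criteria with ordered maturity levels,
# and a review that compares assessed levels against required levels.
from dataclasses import dataclass

@dataclass
class Criterion:
    name: str
    levels: tuple     # level descriptions, ordered from least to most mature

FRAMEWORK = [
    Criterion("code verification", ("none", "spot checks", "systematic tests")),
    Criterion("validation evidence", ("none", "separate effects", "integral effects")),
    Criterion("uncertainty quantification", ("none", "sensitivity study", "full UQ")),
]

def review(assessed, required):
    """Apply the framework: report each criterion's maturity vs. requirement."""
    for crit in FRAMEWORK:
        have, need = assessed[crit.name], required[crit.name]
        verdict = "adequate" if have >= need else "NEEDS WORK"
        print(f"{crit.name:28s} level {have} ({crit.levels[have]}), "
              f"required {need}: {verdict}")

review(assessed={"code verification": 2, "validation evidence": 1,
                 "uncertainty quantification": 1},
       required={"code verification": 2, "validation evidence": 2,
                 "uncertainty quantification": 1})
```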

  11. Modeling, Simulation and Analysis of Complex Networked Systems: A Program Plan for DOE Office of Advanced Scientific Computing Research

    Energy Technology Data Exchange (ETDEWEB)

    Brown, D L

    2009-05-01

    Many complex systems of importance to the U.S. Department of Energy consist of networks of discrete components. Examples are cyber networks, such as the internet and local area networks over which nearly all DOE scientific, technical and administrative data must travel, the electric power grid, social networks whose behavior can drive energy demand, and biological networks such as genetic regulatory networks and metabolic networks. In spite of the importance of these complex networked systems to all aspects of DOE's operations, the scientific basis for understanding these systems lags seriously behind the strong foundations that exist for the 'physically-based' systems usually associated with DOE research programs that focus on such areas as climate modeling, fusion energy, high-energy and nuclear physics, nano-science, combustion, and astrophysics. DOE has a clear opportunity to develop a similarly strong scientific basis for understanding the structure and dynamics of networked systems by supporting a strong basic research program in this area. Such knowledge will provide a broad basis for, e.g., understanding and quantifying the efficacy of new security approaches for computer networks, improving the design of computer or communication networks to be more robust against failures or attacks, detecting potential catastrophic failure on the power grid and preventing or mitigating its effects, understanding how populations will respond to the availability of new energy sources or changes in energy policy, and detecting subtle vulnerabilities in large software systems to intentional attack. This white paper outlines plans for an aggressive new research program designed to accelerate the advancement of the scientific basis for complex networked systems of importance to the DOE. It will focus principally on four research areas: (1) understanding network structure, (2) understanding network dynamics, (3) predictive modeling and simulation for complex

  12. Modeling, Simulation and Analysis of Complex Networked Systems: A Program Plan for DOE Office of Advanced Scientific Computing Research

    International Nuclear Information System (INIS)

    Many complex systems of importance to the U.S. Department of Energy consist of networks of discrete components. Examples are cyber networks, such as the internet and local area networks over which nearly all DOE scientific, technical and administrative data must travel, the electric power grid, social networks whose behavior can drive energy demand, and biological networks such as genetic regulatory networks and metabolic networks. In spite of the importance of these complex networked systems to all aspects of DOE's operations, the scientific basis for understanding these systems lags seriously behind the strong foundations that exist for the 'physically-based' systems usually associated with DOE research programs that focus on such areas as climate modeling, fusion energy, high-energy and nuclear physics, nano-science, combustion, and astrophysics. DOE has a clear opportunity to develop a similarly strong scientific basis for understanding the structure and dynamics of networked systems by supporting a strong basic research program in this area. Such knowledge will provide a broad basis for, e.g., understanding and quantifying the efficacy of new security approaches for computer networks, improving the design of computer or communication networks to be more robust against failures or attacks, detecting potential catastrophic failure on the power grid and preventing or mitigating its effects, understanding how populations will respond to the availability of new energy sources or changes in energy policy, and detecting subtle vulnerabilities in large software systems to intentional attack. This white paper outlines plans for an aggressive new research program designed to accelerate the advancement of the scientific basis for complex networked systems of importance to the DOE. It will focus principally on four research areas: (1) understanding network structure, (2) understanding network dynamics, (3) predictive modeling and simulation for complex networked systems

  13. Simulation of quantum computers

    NARCIS (Netherlands)

    De Raedt, H; Michielsen, K; Hams, AH; Miyashita, S; Saito, K; Landau, DP; Lewis, SP; Schuttler, HB

    2001-01-01

    We describe a simulation approach to study the functioning of Quantum Computer hardware. The latter is modeled by a collection of interacting spin-1/2 objects. The time evolution of this spin system maps one-to-one to a quantum program carried out by the Quantum Computer. Our simulation software con
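
    The simulation approach described, holding the full quantum state and applying gates to it, can be sketched in a few lines of NumPy: an n-qubit state is a vector of 2^n complex amplitudes, and each gate is a small unitary contracted against the target axes. This is a generic state-vector simulator, not the authors' software; the exponential memory cost is what pushes such simulations onto large parallel machines.

```python
# Minimal state-vector quantum computer simulation: prepare a Bell state.
import numpy as np

def apply_gate(state, gate, targets, n):
    """Apply a k-qubit unitary (2**k x 2**k) to chosen qubits of an n-qubit state."""
    k = len(targets)
    psi = state.reshape([2] * n)
    psi = np.moveaxis(psi, targets, list(range(k)))      # targets to the front
    psi = np.tensordot(gate.reshape([2] * (2 * k)), psi,
                       axes=(list(range(k, 2 * k)), list(range(k))))
    psi = np.moveaxis(psi, list(range(k)), targets)      # restore axis order
    return psi.reshape(-1)

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)             # Hadamard
CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0],
                 [0, 0, 0, 1], [0, 0, 1, 0]], dtype=float)

n = 2
state = np.zeros(2 ** n, dtype=complex); state[0] = 1.0  # |00>
state = apply_gate(state, H, [0], n)                     # superpose qubit 0
state = apply_gate(state, CNOT, [0, 1], n)               # entangle
print(np.round(state, 3))                                # [0.707 0 0 0.707]
```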

  14. Simulation of quantum computers

    NARCIS (Netherlands)

    Raedt, H. De; Michielsen, K.; Hams, A.H.; Miyashita, S.; Saito, K.

    2000-01-01

    We describe a simulation approach to study the functioning of Quantum Computer hardware. The latter is modeled by a collection of interacting spin-1/2 objects. The time evolution of this spin system maps one-to-one to a quantum program carried out by the Quantum Computer. Our simulation software con

  15. In-Service Design and Performance Prediction of Advanced Fusion Material Systems by Computational Modeling and Simulation

    International Nuclear Information System (INIS)

    This final report on ''In-Service Design and Performance Prediction of Advanced Fusion Material Systems by Computational Modeling and Simulation'' (DE-FG03-01ER54632) consists of a series of summaries of work that has been published, or presented at meetings, or both. It briefly describes results on the following topics: (1) A Transport and Fate Model for Helium and Helium Management; (2) Atomistic Studies of Point Defect Energetics, Dynamics and Interactions; (3) Multiscale Modeling of Fracture consisting of: (3a) A Micromechanical Model of the Master Curve (MC) Universal Fracture Toughness-Temperature Curve Relation, KJc(T - To), (3b) An Embrittlement DTo Prediction Model for the Irradiation Hardening Dominated Regime, (3c) Non-hardening Irradiation Assisted Thermal and Helium Embrittlement of 8Cr Tempered Martensitic Steels: Compilation and Analysis of Existing Data, (3d) A Model for the KJc(T) of a High Strength NFA MA957, (3e) Cracked Body Size and Geometry Effects of Measured and Effective Fracture Toughness-Model Based MC and To Evaluations of F82H and Eurofer 97, (3f) Size and Geometry Effects on the Effective Toughness of Cracked Fusion Structures; (4) Modeling the Multiscale Mechanics of Flow Localization-Ductility Loss in Irradiation Damaged BCC Alloys; and (5) A Universal Relation Between Indentation Hardness and True Stress-Strain Constitutive Behavior. Further details can be found in the cited references or presentations that generally can be accessed on the internet, or provided upon request to the authors. Finally, it is noted that this effort was integrated with our base program in fusion materials, also funded by the DOE OFES

  16. Parallel reservoir simulator computations

    International Nuclear Information System (INIS)

    The adaptation of a reservoir simulator for parallel computations is described. The simulator was originally designed for vector processors. It performs approximately 99% of its calculations in vector/parallel mode, and relative to scalar calculations it achieves speedups of 65 and 81 for black oil and EOS simulations, respectively, on the CRAY C-90.
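
    The quoted figures are mutually consistent under Amdahl's law: if a fraction p of the work is accelerated by a factor s, the overall speedup is 1/((1 - p) + p/s), bounded above by 1/(1 - p). With p = 0.99 the ceiling is 100, and the observed 65 and 81 sit below it. The per-operation acceleration factors in the sketch below are assumptions for illustration.

```python
# Amdahl's law check for the 99% vector/parallel fraction quoted above.
def amdahl(p, s):
    """Overall speedup when fraction p of the work is accelerated by factor s."""
    return 1.0 / ((1.0 - p) + p / s)

p = 0.99
print(f"ceiling: {amdahl(p, float('inf')):.0f}x")   # 100x, however fast the 99% runs
for s in (200, 500, 2000):                          # assumed acceleration factors
    print(f"s = {s:4d} -> overall speedup {amdahl(p, s):5.1f}x")
# the reported speedups of 65 and 81 correspond to effective per-operation
# acceleration factors of roughly 185 and 420
```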

  17. Advances and Challenges in Computational Plasma Science

    Energy Technology Data Exchange (ETDEWEB)

    W.M. Tang; V.S. Chan

    2005-01-03

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behavior. Recent advances in simulations of magnetically-confined plasmas are reviewed in this paper with illustrative examples chosen from associated research areas such as microturbulence, magnetohydrodynamics, and other topics. Progress has been stimulated in particular by the exponential growth of computer speed along with significant improvements in computer technology.

  18. Advances and Challenges in Computational Plasma Science

    International Nuclear Information System (INIS)

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behavior. Recent advances in simulations of magnetically-confined plasmas are reviewed in this paper with illustrative examples chosen from associated research areas such as microturbulence, magnetohydrodynamics, and other topics. Progress has been stimulated in particular by the exponential growth of computer speed along with significant improvements in computer technology

  19. Advances in physiological computing

    CERN Document Server

    Fairclough, Stephen H

    2014-01-01

    This edited collection will provide an overview of the field of physiological computing, i.e. the use of physiological signals as input for computer control. It will cover a breadth of current research, from brain-computer interfaces to telemedicine.

  20. Hybrid deterministic and stochastic x-ray transport simulation for transmission computed tomography with advanced detector noise model

    Science.gov (United States)

    Popescu, Lucretiu M.

    2016-03-01

    We present a model for simulation of noisy X-ray computed tomography data sets. The model is made of two main components: a photon transport simulation component that generates the noiseless photon field incident on the detector, and a detector response model that takes as input the incident photon-field parameters and, given the X-ray source intensity and exposure time, can generate noisy data sets accordingly. The photon transport simulation component combines direct ray-tracing of polychromatic X-rays for calculation of the transmitted data with Monte Carlo simulation for calculation of the scattered-photon data. The Monte Carlo scatter simulation is accelerated by implementing particle splitting and importance sampling variance reduction techniques. The detector-incident photon field data are stored as energy expansion coefficients on a refined grid that covers the detector area. From these data the detector response model is able to generate noisy detector data realizations by reconstituting the main parameters that describe each detector element's response in statistical terms, including spatial correlations. The model is able to generate CT data sets very fast, on the fly, corresponding to different radiation doses as well as detector response characteristics, facilitating data management in extensive optimization studies by reducing the computation time and storage space demands.
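
    The variance-reduction ideas named here can be illustrated on a toy problem: estimating a small transmission probability exp(-mu*L) through a thick slab. Analog sampling scores only rarely; sampling path lengths from a stretched exponential and carrying statistical weights recovers the answer with far fewer histories. The 1D setup below is a generic illustration, not the paper's detector model.

```python
# Importance sampling for a rare transmission probability.
import numpy as np

rng = np.random.default_rng(1)
mu, L, n = 1.0, 10.0, 100_000        # attenuation coeff., slab depth, histories
exact = np.exp(-mu * L)              # ~4.5e-5

# analog Monte Carlo: score 1 when the sampled free path exceeds the slab
x = rng.exponential(1.0 / mu, n)
analog = np.mean(x > L)

# biased sampling from a stretched exponential (mu_b < mu), each history
# weighted by the ratio of true to biased probability densities
mu_b = 0.1
xb = rng.exponential(1.0 / mu_b, n)
w = (mu / mu_b) * np.exp(-(mu - mu_b) * xb)
biased = np.mean((xb > L) * w)

print(f"exact {exact:.2e}   analog {analog:.2e}   importance-sampled {biased:.2e}")
# the analog estimate fluctuates strongly (only a handful of histories score);
# the weighted estimate is close to the exact value with far smaller variance
```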

  1. Advances and challenges in computational plasma science

    International Nuclear Information System (INIS)

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behaviour. Recent advances in simulations of magnetically confined plasmas are reviewed in this paper, with illustrative examples, chosen from associated research areas such as microturbulence, magnetohydrodynamics and other topics. Progress has been stimulated, in particular, by the exponential growth of computer speed along with significant improvements in computer technology. The advances in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics have produced increasingly good agreement between experimental observations and computational modelling. This was enabled by two key factors: (a) innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales and (b) access to powerful new computational resources. Excellent progress has been made in developing codes for which computer run-time and problem-size scale well with the number of processors on massively parallel processors (MPPs). Examples include the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPPs to produce three-dimensional, general geometry, nonlinear particle simulations that have accelerated advances in understanding the nature of turbulence self-regulation by zonal flows. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In looking towards the future, the current results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. This

  2. Computationally efficient multibody simulations

    Science.gov (United States)

    Ramakrishnan, Jayant; Kumar, Manoj

    1994-01-01

    Computationally efficient approaches to the solution of the dynamics of multibody systems are presented in this work. The computational efficiency is derived from both the algorithmic and implementational standpoint. Order(n) approaches provide a new formulation of the equations of motion eliminating the assembly and numerical inversion of a system mass matrix as required by conventional algorithms. Computational efficiency is also gained in the implementation phase by the symbolic processing and parallel implementation of these equations. Comparison of this algorithm with existing multibody simulation programs illustrates the increased computational efficiency.

  3. Computer Simulation Western

    International Nuclear Information System (INIS)

    Computer Simulation Western is a unit within the Department of Applied Mathematics at the University of Western Ontario. Its purpose is the development of computational and mathematical methods for practical problems in industry and engineering and the application and marketing of such methods. We describe the unit and our efforts at obtaining research and development grants. Some representative projects will be presented and future plans discussed. (author)

  4. Computer Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Pronskikh, V. S. [Fermilab

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of scientific practice of high importance that have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model's relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them) needs to be made. Holding to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviating the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.

  5. Augmented Reality Simulations on Handheld Computers

    Science.gov (United States)

    Squire, Kurt; Klopfer, Eric

    2007-01-01

    Advancements in handheld computing, particularly its portability, social interactivity, context sensitivity, connectivity, and individuality, open new opportunities for immersive learning environments. This article articulates the pedagogical potential of augmented reality simulations in environmental engineering education by immersing students in…

  6. Computer-simulated phacoemulsification

    Science.gov (United States)

    Laurell, Carl-Gustaf; Nordh, Leif; Skarman, Eva; Andersson, Mats; Nordqvist, Per

    2001-06-01

    Phacoemulsification makes the cataract operation easier for the patient but involves a demanding technique for the surgeon. It is therefore important to increase the quality of surgical training in order to shorten the learning period for the beginner. This should diminish the risks to the patient. We are developing a computer-based simulator for training in phacoemulsification. The simulator is built on a platform that can be used as a basis for several different training simulators. A prototype has been made and has been partly tested by experienced surgeons.

  7. Commnity Petascale Project for Accelerator Science And Simulation: Advancing Computational Science for Future Accelerators And Accelerator Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Spentzouris, Panagiotis; /Fermilab; Cary, John; /Tech-X, Boulder; Mcinnes, Lois Curfman; /Argonne; Mori, Warren; /UCLA; Ng, Cho; /SLAC; Ng, Esmond; Ryne, Robert; /LBL, Berkeley

    2011-10-21

    The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modelling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single physics process modeling (covered under SciDAC1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors.

  8. Community petascale project for accelerator science and simulation: advancing computational science for future accelerators and accelerator technologies

    International Nuclear Information System (INIS)

    The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modelling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single physics process modeling (covered under SciDAC1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R and D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors

  9. Commnity Petascale Project for Accelerator Science and Simulation: Advancing Computational Science for Future Accelerators and Accelerator Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Spentzouris, Panagiotis; /Fermilab; Cary, John; /Tech-X, Boulder; Mcinnes, Lois Curfman; /Argonne; Mori, Warren; /UCLA; Ng, Cho; /SLAC; Ng, Esmond; Ryne, Robert; /LBL, Berkeley

    2008-07-01

    The design and performance optimization of particle accelerators is essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC1 Accelerator Science and Technology project, the SciDAC2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modeling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single physics process modeling (covered under SciDAC1) to high-fidelity multi-physics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors.

  10. Recent Advances in Evolutionary Computation

    Institute of Scientific and Technical Information of China (English)

    Xin Yao; Yong Xu

    2006-01-01

    Evolutionary computation has experienced tremendous growth in the last decade in both theoretical analyses and industrial applications. Its scope has evolved beyond its original meaning of "biological evolution" toward a wide variety of nature-inspired computational algorithms and techniques, including evolutionary, neural, ecological, social and economical computation, etc., in a unified framework. Many research topics in evolutionary computation nowadays are not necessarily "evolutionary". This paper provides an overview of some recent advances in evolutionary computation that have been made at CERCIA at the University of Birmingham, UK. It covers a wide range of topics in optimization, learning and design using evolutionary approaches and techniques, together with theoretical results on the computational time complexity of evolutionary algorithms. Some issues related to the future development of evolutionary computation are also discussed.
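
    As a reminder of what the "evolutionary" core of these techniques looks like, here is a minimal (mu + lambda) evolutionary algorithm on a toy objective. It is a textbook sketch with arbitrarily chosen parameters, not an algorithm from the paper.

```python
# Minimal (mu + lambda) evolutionary algorithm minimizing the sphere function.
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):                        # toy objective: minimum 0 at the origin
    return float(np.sum(x * x))

dim, mu, lam, sigma = 10, 5, 20, 0.3  # demo parameter choices
pop = [rng.normal(0, 2, dim) for _ in range(mu)]       # initial parents

for gen in range(200):
    # variation: each offspring is a Gaussian mutation of a random parent
    offspring = [pop[rng.integers(mu)] + sigma * rng.normal(0, 1, dim)
                 for _ in range(lam)]
    # elitist selection: keep the best mu of parents plus offspring
    pop = sorted(pop + offspring, key=sphere)[:mu]

print("best fitness after 200 generations:", sphere(pop[0]))
```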

  11. Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan. Part 2, Mappings for the ASC software quality engineering practices. Version 1.0.

    Energy Technology Data Exchange (ETDEWEB)

    Ellis, Molly A.; Heaphy, Robert; Sturtevant, Judith E.; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Forsythe, Christi A.; Schofield, Joseph Richard, Jr.; Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2005-01-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR 1.3.2 and 1.3.6 and to a Department of Energy document, 'ASCI Software Quality Engineering: Goals, Principles, and Guidelines'. This document also identifies ASC management's and software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  12. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan. Part 1: ASC software quality engineering practices, Version 2.0.

    Energy Technology Data Exchange (ETDEWEB)

    Sturtevant, Judith E.; Heaphy, Robert; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Minana, Molly A.; Hackney, Patricia; Forsythe, Christi A.; Schofield, Joseph Richard, Jr. (,; .); Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2006-09-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR 1.3.2 and 1.3.6 and to a Department of Energy document, ASCI Software Quality Engineering: Goals, Principles, and Guidelines. This document also identifies ASC management's and software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  13. Computational photography: advances and challenges

    OpenAIRE

    Lam, EYM

    2011-01-01

    In the mid-1990s when digital photography began to enter the consumer market, Professor Joseph Goodman and I set out to explore how computation would impact the imaging system design. The field of study has since grown to be known as computational photography. In this paper I'll describe some of its recent advances and challenges, and discuss what the future holds. © 2011 Copyright Society of Photo-Optical Instrumentation Engineers (SPIE).

  14. MO-E-18C-04: Advanced Computer Simulation and Visualization Tools for Enhanced Understanding of Core Medical Physics Concepts

    Energy Technology Data Exchange (ETDEWEB)

    Naqvi, S [Saint Agnes Cancer Institute, Department of Radiation Oncology, Baltimore, MD (United States)

    2014-06-15

    Purpose: Most medical physics programs emphasize proficiency in routine clinical calculations and QA. The formulaic aspect of these calculations and the prescriptive nature of measurement protocols obviate the need to frequently apply basic physical principles, which therefore gradually decay away from memory. For example, few students appreciate the role of electron transport in photon dose, making it difficult to understand key concepts such as dose buildup, electronic disequilibrium effects and Bragg-Gray theory. These conceptual deficiencies manifest when the physicist encounters a new system, requiring knowledge beyond routine activities. Methods: Two interactive computer simulation tools are developed to facilitate deeper learning of physical principles. One is a Monte Carlo code written with a strong educational aspect. The code can “label” regions and interactions to highlight specific aspects of the physics, e.g., certain regions can be designated as “starters” or “crossers,” and any interaction type can be turned on and off. Full 3D tracks with specific portions highlighted further enhance the visualization of radiation transport problems. The second code calculates and displays trajectories of a collection of electrons under an arbitrary space/time-dependent Lorentz force using relativistic kinematics. Results: Using the Monte Carlo code, the student can interactively study photon and electron transport through visualization of dose components, particle tracks, and interaction types. The code can, for instance, be used to study the kerma-dose relationship, explore electronic disequilibrium near interfaces, or visualize kernels by using interaction forcing. The electromagnetic simulator enables the student to explore accelerating mechanisms and particle optics in devices such as cyclotrons and linacs. Conclusion: The proposed tools are designed to enhance understanding of abstract concepts by highlighting various aspects of the physics. The simulations serve as
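
    The trajectory code described in Methods is not public, but the standard way to advance a charged particle under an arbitrary Lorentz force with relativistic kinematics is the Boris push, sketched below. This is the textbook algorithm rather than the authors' implementation, and the fields and numbers are demo values.

```python
# Relativistic Boris push: half electric kick, magnetic rotation, half kick.
import numpy as np

q, m, c = -1.602e-19, 9.109e-31, 2.998e8       # electron, SI units

def boris_step(x, u, dt, E, B):
    """Advance position x and u = gamma*v by one time step in fields E(x), B(x)."""
    eps = (q * dt / (2 * m)) * E(x)
    u_minus = u + eps                           # first half electric kick
    gamma = np.sqrt(1 + np.dot(u_minus, u_minus) / c**2)
    t = (q * dt / (2 * m * gamma)) * B(x)       # rotation vector
    u_prime = u_minus + np.cross(u_minus, t)
    s = 2 * t / (1 + np.dot(t, t))
    u_plus = u_minus + np.cross(u_prime, s)     # full magnetic rotation
    u_new = u_plus + eps                        # second half electric kick
    gamma = np.sqrt(1 + np.dot(u_new, u_new) / c**2)
    return x + (u_new / gamma) * dt, u_new

E = lambda x: np.zeros(3)                       # demo: no electric field
B = lambda x: np.array([0.0, 0.0, 0.01])        # uniform 0.01 T along z

x, u = np.zeros(3), np.array([1e7, 0.0, 0.0])   # ~0.03c electron
for _ in range(1000):
    x, u = boris_step(x, u, 1e-11, E, B)
print(x)   # traces a circular gyro-orbit in the x-y plane
```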

  15. MO-E-18C-04: Advanced Computer Simulation and Visualization Tools for Enhanced Understanding of Core Medical Physics Concepts

    International Nuclear Information System (INIS)

    Purpose: Most medical physics programs emphasize proficiency in routine clinical calculations and QA. The formulaic aspect of these calculations and the prescriptive nature of measurement protocols obviate the need to frequently apply basic physical principles, which therefore gradually decay away from memory. For example, few students appreciate the role of electron transport in photon dose, making it difficult to understand key concepts such as dose buildup, electronic disequilibrium effects and Bragg-Gray theory. These conceptual deficiencies manifest when the physicist encounters a new system, requiring knowledge beyond routine activities. Methods: Two interactive computer simulation tools are developed to facilitate deeper learning of physical principles. One is a Monte Carlo code written with a strong educational aspect. The code can “label” regions and interactions to highlight specific aspects of the physics, e.g., certain regions can be designated as “starters” or “crossers,” and any interaction type can be turned on and off. Full 3D tracks with specific portions highlighted further enhance the visualization of radiation transport problems. The second code calculates and displays trajectories of a collection of electrons under an arbitrary space/time-dependent Lorentz force using relativistic kinematics. Results: Using the Monte Carlo code, the student can interactively study photon and electron transport through visualization of dose components, particle tracks, and interaction types. The code can, for instance, be used to study the kerma-dose relationship, explore electronic disequilibrium near interfaces, or visualize kernels by using interaction forcing. The electromagnetic simulator enables the student to explore accelerating mechanisms and particle optics in devices such as cyclotrons and linacs. Conclusion: The proposed tools are designed to enhance understanding of abstract concepts by highlighting various aspects of the physics. The simulations serve as

  16. Electron injector computer simulations

    International Nuclear Information System (INIS)

    The authors present contributions to electron injector computation and design, describing a simple but complete simulation code implemented on a personal computer and giving the main design choices taken for the BCMN and LEP high-intensity injectors and for the ORION self-focussing injector. Electron dynamics are characterized by the predominant effect of the first ''accelerating'' cell, in contrast with proton dynamics. In this region, shorter than an RF half-wavelength, the non-linear bunching and acceleration can only be simulated by a step-by-step procedure. An analytical ''adiabatic'' approach cannot help the designer, but he can take advantage of non-repetitive features to obtain radial RF self-focussing together with longitudinal bunching.
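
    The step-by-step procedure can be illustrated with a heavily simplified longitudinal model: integrate the electron's energy and phase through the first cell, ignoring radial motion and assuming a constant field amplitude. Every number below (field strength, frequency, injection phase, injection energy) is an illustrative assumption, not a value from the paper.

```python
# Step-by-step longitudinal tracking of an electron through one RF cell.
import numpy as np

c, mc2 = 2.998e8, 0.511e6                 # m/s; electron rest energy in eV
E0, freq, phi0 = 8e6, 3e9, np.pi / 3      # V/m, Hz, injection phase (all demo)
omega = 2 * np.pi * freq

gamma = 1.1                               # ~50 keV injected electron
t, z, dz = 0.0, 0.0, 1e-5                 # step size: 10 micrometres
cell_len = c / (2 * freq)                 # half an RF wavelength

while z < cell_len:
    beta = np.sqrt(1 - 1 / gamma**2)
    t += dz / (beta * c)                  # the slow electron slips in phase
    gamma += (E0 / mc2) * np.sin(omega * t + phi0) * dz   # local energy gain
    z += dz

print(f"exit kinetic energy: {(gamma - 1) * mc2 / 1e6:.3f} MeV")
# whether the electron is captured and bunched depends strongly on phi0,
# which is exactly why this region must be treated step by step
```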

  17. Computer security simulation

    International Nuclear Information System (INIS)

    Development and application of a series of simulation codes used for computer security analysis and design are described. Boolean relationships for arrays of barriers within functional modules are used to generate composite effectiveness indices. The general case of multiple layers of protection with any specified barrier survival criteria is given. Generalized reduction algorithms provide numerical security indices in selected subcategories and for the system as a whole. 9 figures, 11 tables
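
    The abstract does not spell out the Boolean reduction itself, but its flavor can be illustrated: treating each barrier's probability of being defeated as independent, barriers combine like reliability networks, multiplying "in series" (all must be defeated) and complement-multiplying "in parallel" (any route suffices). The module structure and probabilities below are invented for demonstration.

```python
# Toy composite effectiveness index from Boolean barrier combinations.

def series(ps):       # an adversary must defeat every barrier on the path
    out = 1.0
    for p in ps:
        out *= p
    return out

def parallel(ps):     # an adversary succeeds via any one of several routes
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

# module with two attack routes: (password AND file permissions) OR physical
p_defeat = parallel([series([0.10, 0.30]), 0.02])
print(f"composite effectiveness index: {1.0 - p_defeat:.4f}")   # ~0.9506
```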

  18. Computer Simulator: An Educational Tool for Computer Architecture

    OpenAIRE

    Mihyar Hesson

    2006-01-01

    The great advancement in computer architecture and cache memory design and technology had a considerable influence on the way computer architecture was taught in universities. This requires students to be able to visualize the detailed activities that take place within a computer processor and its interaction with memory system. Computer simulators could effectively be used to enhance the understanding and comprehension of cache memory operation. The main objective of this project was to desi...

  19. Computational Methods for Simulating Quantum Computers

    NARCIS (Netherlands)

    Raedt, H. De; Michielsen, K.

    2006-01-01

    This review gives a survey of numerical algorithms and software to simulate quantum computers. It covers the basic concepts of quantum computation and quantum algorithms and includes a few examples that illustrate the use of simulation software for ideal and physical models of quantum computers.

  20. Recent advances in computational optimization

    CERN Document Server

    2013-01-01

    Optimization is part of our everyday life. We try to organize our work in a better way, and optimization occurs in minimizing time and cost or maximizing profit, quality and efficiency. Many real-world problems arising in engineering, economics, medicine and other domains can also be formulated as optimization tasks. This volume is a comprehensive collection of extended contributions from the Workshop on Computational Optimization, presenting recent advances in computational optimization. The volume includes important real-world problems such as parameter settings for controlling processes in a bioreactor, robot skin wiring, strip packing, project scheduling, and tuning of PID controllers. Some of them can be solved by applying traditional numerical methods, but others need a huge amount of computational resources. For these it is shown that it is appropriate to develop algorithms based on metaheuristic methods like evolutionary computation, ant colony optimization, constraint programming etc...

  1. Inversion based on computational simulations

    International Nuclear Information System (INIS)

    A standard approach to solving inversion problems that involve many parameters uses gradient-based optimization to find the parameters that best match the data. The authors discuss enabling techniques that facilitate application of this approach to large-scale computational simulations, which are the only way to investigate many complex physical phenomena. Such simulations may not seem to lend themselves to calculation of the gradient with respect to numerous parameters. However, adjoint differentiation allows one to efficiently compute the gradient of an objective function with respect to all the variables of a simulation. When combined with advanced gradient-based optimization algorithms, adjoint differentiation permits one to solve very large problems of optimization or parameter estimation. These techniques are illustrated through the simulation of the time-dependent diffusion of infrared light through tissue, which has been used to perform optical tomography. The techniques discussed have a wide range of applicability to modeling, including the optimization of models to achieve a desired design goal.
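
    The key point, one forward sweep plus one backward sweep yielding the gradient with respect to every parameter at once, can be shown on a toy time-stepping model. The sketch below differentiates a data-misfit objective through an explicit 1D diffusion simulation (a stand-in for the paper's infrared-light diffusion model, which is not reproduced here) and spot-checks the adjoint gradient against a finite difference.

```python
# Adjoint differentiation of a time-stepping simulation.
import numpy as np

n, steps, coef = 50, 100, 0.2            # grid points, time steps, dt*D/dx^2

def step(u):                             # one explicit diffusion step, u <- A u
    return u + coef * (np.roll(u, 1) - 2 * u + np.roll(u, -1))

def forward(u0):
    u = u0.copy()
    for _ in range(steps):
        u = step(u)
    return u

rng = np.random.default_rng(0)
data = rng.normal(size=n)                # synthetic "measured" field
u0 = np.zeros(n)                         # parameters: the initial condition

residual = forward(u0) - data            # J(u0) = sum(residual**2)

# adjoint sweep: A here is symmetric, so its transpose is the same step;
# in general one applies the transpose Jacobian of each step in reverse
lam = 2.0 * residual
for _ in range(steps):
    lam = step(lam)
grad = lam                               # dJ/du0: all n components at once

eps, i = 1e-6, 7                         # finite-difference spot check
e = np.zeros(n); e[i] = eps
fd = (np.sum((forward(u0 + e) - data) ** 2)
      - np.sum((forward(u0 - e) - data) ** 2)) / (2 * eps)
print(grad[i], "vs finite difference", fd)   # the two agree closely
```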

  2. Computational Models of Human Performance: Validation of Memory and Procedural Representation in Advanced Air/Ground Simulation

    Science.gov (United States)

    Corker, Kevin M.; Labacqz, J. Victor (Technical Monitor)

    1997-01-01

    The Man-Machine Interaction Design and Analysis System (MIDAS), developed under a joint U.S. Army and NASA cooperative agreement, is intended to assist designers of complex human/automation systems in successfully incorporating human performance capabilities and limitations into decision and action support systems. MIDAS is a computational representation of multiple human operators, selected perceptual, cognitive, and physical functions of those operators, and the physical/functional representation of the equipment with which they operate. MIDAS has been used as an integrated predictive framework for the investigation of human/machine systems, particularly in situations with high demands on the operators. We have extended the human performance models to include representation of both human operators and intelligent aiding systems in flight management and air traffic service. The focus of this development is to predict human performance in response to aiding systems developed to identify aircraft conflicts and to assist in the shared authority for resolution. The demands of this application require representation of many intelligent agents sharing world-models, coordinating action/intention, and cooperatively scheduling goals and actions in a somewhat unpredictable world of operations. In recent applications to airborne systems development, MIDAS has demonstrated an ability to predict flight crew decision-making and procedural behavior when interacting with automated flight management systems and Air Traffic Control. In this paper, we describe two enhancements to MIDAS. The first involves the addition of working memory in the form of an articulatory buffer for verbal communication protocols and a visuo-spatial buffer for communications via digital datalink. The second enhancement is a representation of multiple operators working as a team. This enhanced model was used to predict the performance of human flight crews and their level of compliance with commercial aviation communication

  3. International Conference on Advanced Computing

    CERN Document Server

    Patnaik, Srikanta

    2014-01-01

    This book is composed of the Proceedings of the International Conference on Advanced Computing, Networking, and Informatics (ICACNI 2013), held at Central Institute of Technology, Raipur, Chhattisgarh, India during June 14–16, 2013. The book records current research articles in the domain of computing, networking, and informatics. The book presents original research articles, case-studies, as well as review articles in the said field of study with emphasis on their implementation and practical application. Researchers, academicians, practitioners, and industry policy makers around the globe have contributed towards formation of this book with their valuable research submissions.

  4. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan part 2 mappings for the ASC software quality engineering practices, version 2.0.

    Energy Technology Data Exchange (ETDEWEB)

    Heaphy, Robert; Sturtevant, Judith E.; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Minana, Molly A.; Hackney, Patricia; Forsythe, Christi A.; Schofield, Joseph Richard, Jr. (,; .); Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2006-09-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR001.3.2 and CPR001.3.6 and to a Department of Energy document, ''ASCI Software Quality Engineering: Goals, Principles, and Guidelines''. This document also identifies ASC management and software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  5. Advances in embedded computer vision

    CERN Document Server

    Kisacanin, Branislav

    2014-01-01

    This illuminating collection offers a fresh look at the very latest advances in the field of embedded computer vision. Emerging areas covered by this comprehensive text/reference include the embedded realization of 3D vision technologies for a variety of applications, such as stereo cameras on mobile devices. Recent trends towards the development of small unmanned aerial vehicles (UAVs) with embedded image and video processing algorithms are also examined. The authoritative insights range from historical perspectives to future developments, reviewing embedded implementation, tools, technolog

  6. r.avaflow: An advanced open source computational framework for the GIS-based simulation of two-phase mass flows and process chains

    Science.gov (United States)

    Mergili, Martin; Fischer, Jan-Thomas; Fellin, Wolfgang; Ostermann, Alexander; Pudasaini, Shiva P.

    2015-04-01

    Geophysical mass flows stand for a broad range of processes and process chains such as flows and avalanches of snow, soil, debris or rock, and their interactions with water bodies resulting in flood waves. Despite considerable efforts put into model development, the simulation, and therefore the appropriate prediction, of these types of events still remains a major challenge in terms of the complex material behaviour, strong phase interactions, process transformations and the complex mountain topography. Sophisticated theories exist, but they have hardly been brought into practice yet. We fill this gap by developing a novel and unified high-resolution computational tool, r.avaflow, representing a comprehensive and advanced open source GIS simulation environment for geophysical mass flows. Based on the latest and most advanced two-phase physical-mathematical models, r.avaflow includes the following features: (i) it is suitable for a broad spectrum of mass flows such as rock, rock-ice and snow avalanches, glacial lake outburst floods, debris and hyperconcentrated flows, and even landslide-induced tsunamis and submarine landslides, as well as process chains involving more than one of these phenomena; (ii) it accounts for the real two-phase nature of many flow types: viscous fluids and solid particles are considered separately with advanced mechanics and strong phase interactions; (iii) it is freely available and adoptable along with the GRASS GIS software. In the future, it will include the intrinsic topographic influences on the flow dynamics and morphology as well as an advanced approach to simulate the entrainment and deposition of solid and fluid material. As input, r.avaflow needs information on (a) the mountain topography, (b) the material properties and (c) the spatial distribution of the solid and fluid release masses, or one or more hydrographs of fluid and solid material. We demonstrate the functionalities and performance of r.avaflow by using some generic and real

  7. Massively parallel quantum computer simulator

    NARCIS (Netherlands)

    De Raedt, K.; Michielsen, K.; De Raedt, H.; Trieu, B.; Arnold, G.; Richter, M.; Lippert, Th.; Watanabe, H.; Ito, N.

    2007-01-01

    We describe portable software to simulate universal quantum computers on massively parallel computers. We illustrate the use of the simulation software by running various quantum algorithms on different computer architectures, such as an IBM BlueGene/L, an IBM Regatta p690+, a Hitachi SR11000/J1, and a Cray...
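
    At the heart of any state-vector simulator of this kind is the repeated application of small unitary matrices to a vector of 2^n amplitudes. The sketch below shows that core operation for a single-qubit gate; it is a minimal illustration of the idea, not the portable software described in the record.

```python
import numpy as np

def apply_single_qubit_gate(state, gate, target, n_qubits):
    """Apply a 2x2 unitary to one qubit of an n-qubit state vector."""
    # Reshape the 2^n amplitudes so the target qubit becomes its own axis,
    # contract with the gate, then restore the flat layout.
    psi = state.reshape([2] * n_qubits)
    psi = np.tensordot(gate, psi, axes=([1], [target]))
    psi = np.moveaxis(psi, 0, target)
    return psi.reshape(-1)

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)    # Hadamard gate
n = 3
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0                                   # start in |000>
state = apply_single_qubit_gate(state, H, 0, n)
print(np.round(state, 3))                        # 0.707 on |000> and |100>
```

    The same reshape-contract-restore pattern extends to two-qubit gates; distributing the 2^n amplitudes across nodes is what makes such simulators candidates for massively parallel machines.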

  8. Education and Training of Future Nuclear Engineers at DIN: From Advanced Computer Codes to an Interactive Plant Simulator.

    OpenAIRE

    Cabellos de Francisco, Oscar Luis; Ahnert Iglesias, Carolina; Cuervo Gómez, Diana; García Herranz, Nuria; Gallego Díaz, Eduardo F.; Mínguez Torres, Emilio; Aragonés Beltrán, José María; Lorente Fillol, Alfredo; Piedra, David

    2010-01-01

    This paper summarizes the work being performed at the Department of Nuclear Engineering (www.din.upm.es) of the Universidad Politécnica de Madrid to improve the education and training of future Spanish nuclear engineers according to the Bologna rules. We present two main efforts introduced in our programme: i) the understanding of the current computational methodologies/codes starting from the nuclear data processing, then the lattice and core calculations codes, and finally the power plant ...

  9. Perspective: Computer simulations of long time dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Elber, Ron [Department of Chemistry, The Institute for Computational Engineering and Sciences, University of Texas at Austin, Austin, Texas 78712 (United States)

    2016-02-14

    Atomically detailed computer simulations of complex molecular events have attracted the imagination of many researchers in the field as providing comprehensive information on chemical, biological, and physical processes. However, one of the greatest limitations of these simulations is that of time scales. The physical time scales accessible to straightforward simulations are too short to address many interesting and important molecular events. In the last decade, substantial advances were made in different directions (theory, software, and hardware) that significantly expand the capabilities and accuracies of these techniques. This perspective describes and critically examines some of these advances.
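
    The time-scale gap the perspective describes is easy to quantify: with a femtosecond-scale integration step, a millisecond-scale molecular event needs on the order of 10^11 force evaluations. A back-of-envelope check, using typical textbook values (not numbers from the paper):

```python
# The time-scale gap in plain molecular dynamics, in one line of arithmetic:
# a ~millisecond conformational event divided by a ~2 fs integration step.
dt_fs = 2.0                                # integration step [fs]
event_ms = 1.0                             # slow molecular event [ms]
steps = event_ms * 1e-3 / (dt_fs * 1e-15)  # steps needed
print(f"{steps:.0e} force evaluations")    # ~5e+11
```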

  10. Perspective: Computer simulations of long time dynamics

    International Nuclear Information System (INIS)

    Atomically detailed computer simulations of complex molecular events have attracted the imagination of many researchers in the field as providing comprehensive information on chemical, biological, and physical processes. However, one of the greatest limitations of these simulations is that of time scales. The physical time scales accessible to straightforward simulations are too short to address many interesting and important molecular events. In the last decade, substantial advances were made in different directions (theory, software, and hardware) that significantly expand the capabilities and accuracies of these techniques. This perspective describes and critically examines some of these advances

  11. Perspective: Computer simulations of long time dynamics.

    Science.gov (United States)

    Elber, Ron

    2016-02-14

    Atomically detailed computer simulations of complex molecular events have attracted the imagination of many researchers in the field as providing comprehensive information on chemical, biological, and physical processes. However, one of the greatest limitations of these simulations is that of time scales. The physical time scales accessible to straightforward simulations are too short to address many interesting and important molecular events. In the last decade, substantial advances were made in different directions (theory, software, and hardware) that significantly expand the capabilities and accuracies of these techniques. This perspective describes and critically examines some of these advances. PMID:26874473

  12. Computational Design of Advanced Nuclear Fuels

    Energy Technology Data Exchange (ETDEWEB)

    Savrasov, Sergey [Univ. of California, Davis, CA (United States); Kotliar, Gabriel [Rutgers Univ., Piscataway, NJ (United States); Haule, Kristjan [Rutgers Univ., Piscataway, NJ (United States)

    2014-06-03

    The objective of the project was to develop a method for theoretical understanding of nuclear fuel materials whose physical and thermophysical properties can be predicted from first principles using a novel dynamical mean field method for electronic structure calculations. We concentrated our study on uranium, plutonium, their oxides, nitrides, carbides, as well as some rare earth materials whose 4f electrons provide a simplified framework for understanding the complex behavior of the f electrons. We addressed the issues connected to the electronic structure, lattice instabilities, phonon and magnon dynamics as well as thermal conductivity. This allowed us to evaluate characteristics of advanced nuclear fuel systems using computer based simulations and avoid costly experiments.

  13. Research Institute for Advanced Computer Science

    Science.gov (United States)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2000-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center. It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. Ames has been designated NASA's Center of Excellence in Information Technology. In this capacity, Ames is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA Ames and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth; (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities; (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply information technology research to a...

  14. Simulating Chemistry Using Quantum Computers

    OpenAIRE

    Kassal, Ivan; Whitfield, James D.; Perdomo-Ortiz, Alejandro; Yung, Man-Hong; Aspuru-Guzik, Alan

    2011-01-01

    The difficulty of simulating quantum systems, well-known to quantum chemists, prompted the idea of quantum computation. One can avoid the steep scaling associated with the exact simulation of increasingly large quantum systems on conventional computers, by mapping the quantum system to another, more controllable one. In this review, we discuss to what extent the ideas in quantum computation, now a well-established field, have been applied to chemical problems. We describe algorithms that achi...

  15. Simulating Human Cognition Using Computational Verb Theory

    Institute of Scientific and Technical Information of China (English)

    Yang, Tao

    2004-01-01

    Modeling and simulation of a life system is closely connected to the modeling of cognition, especially for advanced life systems. The primary difference between an advanced life system and a digital computer is that the advanced life system consists of a body with a mind, while a digital computer is only a mind in a formal sense. To model an advanced life system one needs to ground symbols into a body in which a digital computer is embedded. In this paper, a computational verb theory is proposed as a new paradigm for grounding symbols into the outputs of sensors. On one hand, a computational verb can preserve the physical "meanings" of the dynamics of sensor data, such that a symbolic system can be used to manipulate physical meanings instead of abstract tokens in the digital computer. On the other hand, the physical meanings of an abstract symbol/token, which is usually the output of a reasoning process in the digital computer, can be restored and fed back to the actuators. Therefore, the computational verb theory bridges the gap between symbols and physical reality from the dynamic cognition perspective.

  16. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan. Part 1 : ASC software quality engineering practices version 1.0.

    Energy Technology Data Exchange (ETDEWEB)

    Minana, Molly A.; Sturtevant, Judith E.; Heaphy, Robert; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Forsythe, Christi A.; Schofield, Joseph Richard, Jr.; Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2005-01-01

    The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in DOE/AL Quality Criteria (QC-1) as conformance to customer requirements and expectations. This quality plan defines the ASC program software quality practices and provides mappings of these practices to the SNL Corporate Process Requirements (CPR 1.3.2 and CPR 1.3.6) and the Department of Energy (DOE) document, ASCI Software Quality Engineering: Goals, Principles, and Guidelines (GP&G). This quality plan identifies ASC management and software project teams' responsibilities for cost-effective software engineering quality practices. The SNL ASC Software Quality Plan establishes the signatories' commitment to improving software products by applying cost-effective software engineering quality practices. This document explains the project teams' opportunities for tailoring and implementing the practices; enumerates the practices that compose the development of SNL ASC's software products; and includes a sample assessment checklist that was developed based upon the practices in this document.

  17. Advances in Computer Science and Engineering

    CERN Document Server

    Second International Conference on Advances in Computer Science and Engineering (CES 2012)

    2012-01-01

    This book comprises the proceedings of the Second International Conference on Advances in Computer Science and Engineering (CES 2012), which was held during January 13-14, 2012 in Sanya, China. The papers in these proceedings focus on researchers' advanced work in the fields of Computer Science and Engineering, organized under four topics: (1) Software Engineering, (2) Intelligent Computing, (3) Computer Networks, and (4) Artificial Intelligence Software.

  18. Advanced numerical simulations of selected metallurgical units

    Directory of Open Access Journals (Sweden)

    G. Kokot

    2012-12-01

    Full Text Available Purpose: The purpose of this paper is to present numerical simulations of large structures in the metallurgical industry, with some examples of finite element analysis. The calculations were performed to determine the stress effort of metallurgical units, mainly the blast furnace, throat gas pipelines, hot blast stoves, etc., under working conditions and for repair purposes. Design/methodology/approach: The simulations and analyses were conducted with the finite element method coupled with an optimization process. Findings: On the basis of the numerical analyses, changes to the structural designs were applied, which strongly influenced the stress state and the durability of the considered structures. Research limitations/implications: Further development of the presented approach lies in solving coupled field and CFD problems and in applying parallel computing and domain decomposition methods to large-structure simulations. Practical implications: The presented results show the possibility of applying advanced computational methods in computer-aided engineering processes for designing and analysing large structures such as metallurgical units. These methods can strongly influence the assessment of stress states and help in monitoring, overhaul and redesign processes, giving global, very precise information which cannot be obtained in other ways (analytical solutions, experimental methods). Originality/value: The paper presents original research results from complex numerical simulations of the main metallurgical units in the blast furnace train. The original value of the paper is the introduction of advanced finite element simulation to the design and development of structures in the iron and steel industry.

  19. Fluid simulation for computer graphics

    CERN Document Server

    Bridson, Robert

    2008-01-01

    Animating fluids like water, smoke, and fire using physics-based simulation is increasingly important in visual effects, in particular in movies, like The Day After Tomorrow, and in computer games. This book provides a practical introduction to fluid simulation for graphics. The focus is on animating fully three-dimensional incompressible flow, from understanding the math and the algorithms to the actual implementation.
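
    A core building block in this style of graphics fluid solver is semi-Lagrangian advection, which stays stable at large time steps by tracing grid points backward through the velocity field and interpolating. The sketch below is a minimal 1D version with assumed parameters, in the spirit of the book's approach rather than a transcription of its code.

```python
import numpy as np

def semi_lagrangian_advect(q, u, dt, dx):
    """One unconditionally stable advection step (Stam-style semi-Lagrangian).

    Trace each grid point backward along the velocity field and linearly
    interpolate the advected quantity at the departure point.
    """
    n = len(q)
    x = np.arange(n) * dx
    x_back = np.clip(x - u * dt, 0.0, (n - 1) * dx)   # departure points
    i = np.clip(x_back / dx, 0, n - 2).astype(int)    # lower cell index
    frac = x_back / dx - i                            # interpolation weight
    return (1 - frac) * q[i] + frac * q[i + 1]

# Example: a smoke density bump carried to the right at 1 m/s.
dx, dt = 0.1, 0.05
q = np.exp(-((np.arange(100) * dx - 2.0) ** 2) / 0.1)  # initial density
u = np.ones(100)                                        # uniform velocity field
for _ in range(40):
    q = semi_lagrangian_advect(q, u, dt, dx)
print(f"density peak now at x = {np.argmax(q) * dx:.2f} m")  # ~4.0 m
```

    The same backtrace-and-interpolate idea generalizes directly to 2D and 3D grids.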

  20. Computational Seebeck Coefficient Measurement Simulations

    OpenAIRE

    Martin, Joshua

    2012-01-01

    We have employed finite element analysis to develop computational Seebeck coefficient metrology simulations. This approach enables a unique exploration of multiple probe arrangements and measurement techniques within the same temporal domain. To demonstrate the usefulness of this approach, we have performed these Seebeck coefficient measurement simulations to quantitatively explore perturbations to voltage and temperature correspondence, by comparing simultaneous and staggered data acquisitio...
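
    Whatever the probe arrangement, the Seebeck coefficient ultimately comes from the proportionality between measured voltage and temperature difference. A minimal sketch of that extraction step, using synthetic data in place of the finite-element probe values (the "true" S and the noise level here are invented):

```python
import numpy as np

# Seebeck coefficient extraction: S is (minus) the slope of measured voltage
# against applied temperature difference.
rng = np.random.default_rng(0)
dT = np.array([1.0, 2.0, 3.0, 4.0, 5.0])            # temperature differences [K]
S_true = 200e-6                                      # assumed 200 uV/K material
V = -S_true * dT + 1e-6 * rng.normal(size=dT.size)   # thermoelectric voltage [V]

S = -np.polyfit(dT, V, 1)[0]                         # least-squares slope
print(f"recovered S = {S * 1e6:.1f} uV/K")
```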

  1. Advanced simulation for fast reactor design

    International Nuclear Information System (INIS)

    Full text: This talk broadly reviews recent research aimed at applying advanced simulation techniques specifically to fast neutron reactors. By advanced simulation we generally refer to attempts to do more science-based simulation - that is, to numerically solve the three-dimensional governing physical equations on fine scales and to observe and study the holistic phenomena that emerge. In this way simulation is treated more akin to a traditional physical experiment, and can be used both separately and in conjunction with physical experiments to develop more accurate predictive theories of reactor behavior. Many existing fast reactor modeling tools were developed for last generation's computational resources. They were built by engineers and physicists with deep physical insight - insight that both shaped and was informed by existing theory, and was underpinned by a vast repository of experimental data. Their general approach was to develop models that were tailored to varying degrees to the details of the reactor design, using free model parameters that were subsequently calibrated to match existing experimental data. The resulting codes were thus extremely useful for their specific purpose but highly limited in their predictive capability (neutronics to a lesser degree). They tended to represent more the state-of-the-art in our understanding rather than tools of exploration and innovation. Recently, a number of researchers have attempted to study the feasibility of solving more fundamental governing equations on realistic, three-dimensional geometries for different fast reactor sub-domains. This includes solving the Navier-Stokes equations for single-phase sodium flow (Direct Numerical Simulation, Large Eddy Simulation, and Reynolds Averaged Navier-Stokes Equations) in the core, upper plenum, primary and intermediate loop, etc.; the non-homogenized transport equations at very fine group, angle, and energy discretization; and thermo-mechanical feedback based on...

  2. Assessment of training simulators with advanced models

    International Nuclear Information System (INIS)

    The quality of training received by nuclear power plant operators is related to the degree of reliability reached by the models that constitute the calculation basis. In the mid-1980s, TECNATOM began upgrading its PWR and BWR training simulators to reproduce all types of transients, with long-term operation and a very high degree of reliability. As a result, the Simulation Advanced Models Project (MAS) has been developed for both the PWR and BWR simulators. The simulator software is the TRAC code running in real time on a CRAY X-MP 14 vector computer. The validation methodology followed in the MAS Project is based on that of EPRI. The main goal is the detailed analysis of the variables and physical phenomena to be validated ('dynamic modes') included in the validation transients matrix. The reference results are supplied by plant data or best-estimate codes: TRAC-PF1/MOD1 and TRACG for the PWR and BWR training simulators, respectively. This paper shows the main results of the validation transients and the main conclusions: improvement of simulation scope and reliability, simulation of EOP scenarios with long-term recovery, and analysis of physical phenomena comparable to best-estimate codes. (orig.) (13 refs., 17 figs., 4 tabs.)

  3. Advances in Computer Science and its Applications

    CERN Document Server

    Yen, Neil; Park, James; CSA 2013

    2014-01-01

    The theme of CSA is the various aspects of computer science and its applications; the conference provides an opportunity for academic and industry professionals to discuss the latest issues and progress in the area. Accordingly, this book includes various theories and practical applications in computer science and its applications.

  4. International Conference on Advanced Computing for Innovation

    CERN Document Server

    Angelova, Galia; Agre, Gennady

    2016-01-01

    This volume is a selected collection of papers presented and discussed at the International Conference “Advanced Computing for Innovation (AComIn 2015)”. The Conference was held on 10-11 November 2015 in Sofia, Bulgaria, and was aimed at providing a forum for international scientific exchange between Central/Eastern Europe and the rest of the world on several fundamental topics of computational intelligence. The papers report innovative approaches and solutions in hot topics of computational intelligence – advanced computing, language and semantic technologies, signal and image processing, as well as optimization and intelligent control.

  5. Bringing Advanced Computational Techniques to Energy Research

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, Julie C

    2012-11-17

    Please find attached our final technical report for the BACTER Institute award. BACTER was created as a graduate and postdoctoral training program for the advancement of computational biology applied to questions of relevance to bioenergy research.

  6. Soft computing in advanced robotics

    CERN Document Server

    Kobayashi, Ichiro; Kim, Euntai

    2014-01-01

    Intelligent systems and robotics are inevitably bound up: intelligent robots embody system integration by using intelligent systems. One can think of intelligent systems as cell units and of intelligent robots as body components; the two technologies have progressed in synchrony. Leveraging robotics and intelligent systems, applications range from our daily life to the space station: manufacturing, healthcare, environment, energy, education, personal assistance, logistics. This book aims at presenting research results relevant to intelligent robotics technology. We propose to researchers and practitioners some methods to advance intelligent systems and to apply them to advanced robotics technology. This book consists of 10 contributions that feature mobile robots, robot emotion, electric power steering, multi-agent systems, fuzzy visual navigation, adaptive network-based fuzzy inference systems, swarm EKF localization and inspection robots. Th...

  7. Computer Simulation of Diffraction Patterns.

    Science.gov (United States)

    Dodd, N. A.

    1983-01-01

    Describes an Apple computer program (listing available from author) which simulates Fraunhofer and Fresnel diffraction using vector addition techniques (vector chaining) and allows the user to experiment with different shaped multiple apertures. Graphics output includes vector resultants, phase difference, diffraction patterns, and the Cornu spiral…
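
    The vector-chaining idea is simply phasor addition: divide the aperture into strips, assign each a unit phasor whose phase is set by its path difference, and sum. The sketch below is a Python re-creation of that idea for a single slit in the Fraunhofer regime (the original Apple listing is not available, and the geometry here is an arbitrary example):

```python
import numpy as np

# Single-slit Fraunhofer diffraction by phasor addition ("vector chaining").
wavelength = 500e-9                                # m
slit_width = 50e-6                                 # m
N = 200                                            # strips across the slit
y = (np.arange(N) / (N - 1) - 0.5) * slit_width    # strip positions

theta = np.linspace(1e-6, 0.03, 3000)              # screen angles [rad]
phase = 2 * np.pi / wavelength * np.outer(np.sin(theta), y)
intensity = np.abs(np.exp(1j * phase).sum(axis=1)) ** 2 / N**2

mask = theta < 0.015                               # bracket the first minimum
first_min = theta[mask][np.argmin(intensity[mask])]
print(f"first minimum near {first_min:.4f} rad; "
      f"theory predicts {wavelength / slit_width:.4f} rad")
```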

  8. Computer networks and advanced communications

    International Nuclear Information System (INIS)

    One of the major methods for getting the most productivity and benefits from computer usage is networking. However, for those who are contemplating a change from stand-alone computers to a network system, the investigation of actual networks in use presents a paradox: network systems can be highly productive and beneficial; at the same time, these networks can create many complex, frustrating problems. The issue becomes a question of whether the benefits of networking are worth the extra effort and cost. In response to this issue, the authors review in this paper the implementation and management of an actual network in the LSU Petroleum Engineering Department. The network, which has been in operation for four years, is large and diverse (50 computers, 2 sites, PC's, UNIX RISC workstations, etc.). The benefits, costs, and method of operation of this network will be described, and an effort will be made to objectively weigh these elements from the point of view of the average computer user

  9. Software Framework for Advanced Power Plant Simulations

    Energy Technology Data Exchange (ETDEWEB)

    John Widmann; Sorin Munteanu; Aseem Jain; Pankaj Gupta; Mark Moales; Erik Ferguson; Lewis Collins; David Sloan; Woodrow Fiveland; Yi-dong Lang; Larry Biegler; Michael Locke; Simon Lingard; Jay Yun

    2010-08-01

    This report summarizes the work accomplished during the Phase II development effort of the Advanced Process Engineering Co-Simulator (APECS). The objective of the project is to develop the tools to efficiently combine high-fidelity computational fluid dynamics (CFD) models with process modeling software. During the course of the project, a robust integration controller was developed that can be used in any CAPE-OPEN compliant process modeling environment. The controller mediates the exchange of information between the process modeling software and the CFD software. Several approaches to reducing the time disparity between CFD simulations and process modeling have been investigated and implemented. These include enabling the CFD models to be run on a remote cluster and enabling multiple CFD models to be run simultaneously. Furthermore, computationally fast reduced-order models (ROMs) have been developed that can be 'trained' using the results from CFD simulations and then used directly within flowsheets. Unit operation models (both CFD and ROMs) can be uploaded to a model database and shared between multiple users.
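
    The ROM idea described above can be illustrated in a few lines: run a handful of expensive CFD cases offline, fit a cheap response surface, and let the flowsheet query the surrogate. The sketch below uses a quadratic polynomial as one possible ROM form and entirely synthetic "CFD" samples; APECS's actual ROM machinery is more sophisticated.

```python
import numpy as np

# "Train" a cheap surrogate on a handful of expensive CFD runs, then evaluate
# it inside a flowsheet loop. The CFD samples here are synthetic placeholders.
cfd_inlet_temp = np.array([300.0, 350.0, 400.0, 450.0, 500.0])   # K
cfd_outlet_conv = np.array([0.42, 0.55, 0.66, 0.74, 0.80])       # conversion

# Fit a quadratic response surface (one possible ROM form).
rom = np.poly1d(np.polyfit(cfd_inlet_temp, cfd_outlet_conv, 2))

# The flowsheet can now query the ROM in microseconds instead of re-running CFD.
print(f"predicted conversion at 425 K: {rom(425.0):.3f}")
```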

  10. Biomass Gasifier for Computer Simulation; Biomassa foergasare foer Computer Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Hansson, Jens; Leveau, Andreas; Hulteberg, Christian [Nordlight AB, Limhamn (Sweden)

    2011-08-15

    This report is an effort to summarize the existing data on biomass gasifiers, as the authors have taken part in various projects aiming at computer simulations of systems that include biomass gasification. Reliable input data is paramount for any computer simulation, but so far there is no easily accessible biomass gasifier database available for this purpose. This study aims at benchmarking current and past gasifier systems in order to create a comprehensive database for computer simulation purposes. The result of the investigation is presented in a Microsoft Excel sheet, so that the user can easily implement the data in their specific model. In addition to providing simulation data, the technology is described briefly for every studied gasifier system. The primary pieces of information sought are temperatures, pressures, stream compositions and energy consumption. At present the resulting database contains 17 gasifiers, with one or more gasifiers within each of the gasification technology types normally discussed in this context: 1. Fixed bed 2. Fluidised bed 3. Entrained flow. It also contains gasifiers in the range from 100 kW to 120 MW, with several gasifiers in between these two values. Finally, there are gasifiers representing both direct and indirect heating. This allows for a more qualified and better available choice of starting data sets for simulations. In addition, with multiple data sets available for several of the operating modes, sensitivity analysis of various inputs will improve the simulations performed. There have been fewer answers to the survey than expected or hoped for, which could have improved the database further; however, the use of online sources and other public information has to some extent counterbalanced the low response frequency of the survey. In addition, the database is intended to be a living document, continuously updated with new gasifiers and improved information on existing gasifiers.

  11. Computational Challenges in Nuclear Weapons Simulation

    Energy Technology Data Exchange (ETDEWEB)

    McMillain, C F; Adams, T F; McCoy, M G; Christensen, R B; Pudliner, B S; Zika, M R; Brantley, P S; Vetter, J S; May, J M

    2003-08-29

    After a decade of experience, the Stockpile Stewardship Program continues to ensure the safety, security and reliability of the nation's nuclear weapons. The Advanced Simulation and Computing (ASCI) program was established to provide leading edge, high-end simulation capabilities needed to meet the program's assessment and certification requirements. The great challenge of this program lies in developing the tools and resources necessary for the complex, highly coupled, multi-physics calculations required to simulate nuclear weapons. This paper describes the hardware and software environment we have applied to fulfill our nuclear weapons responsibilities. It also presents the characteristics of our algorithms and codes, especially as they relate to supercomputing resource capabilities and requirements. It then addresses impediments to the development and application of nuclear weapon simulation software and hardware and concludes with a summary of observations and recommendations on an approach for working with industry and government agencies to address these impediments.

  12. Extending the horizons advances in computing, optimization, and decision technologies

    CERN Document Server

    Joseph, Anito; Mehrotra, Anuj; Trick, Michael

    2007-01-01

    Computer Science and Operations Research continue to have a synergistic relationship and this book represents the results of cross-fertilization between OR/MS and CS/AI. It is this interface of OR/CS that makes possible advances that could not have been achieved in isolation. Taken collectively, these articles are indicative of the state-of-the-art in the interface between OR/MS and CS/AI and of the high caliber of research being conducted by members of the INFORMS Computing Society. EXTENDING THE HORIZONS: Advances in Computing, Optimization, and Decision Technologies is a volume that presents the latest, leading research in the design and analysis of algorithms, computational optimization, heuristic search and learning, modeling languages, parallel and distributed computing, simulation, computational logic and visualization. This volume also emphasizes a variety of novel applications in the interface of CS, AI, and OR/MS.

  13. Advanced laptop and small personal computer technology

    Science.gov (United States)

    Johnson, Roger L.

    1991-01-01

    Advanced laptop and small personal computer technology is presented in the form of viewgraphs. The following areas of hand-carried computers and mobile workstation technology are covered: background, applications, high end products, technology trends, requirements for the Control Center application, and recommendations for the future.

  14. Advanced Biomedical Computing Center (ABCC) | DSITP

    Science.gov (United States)

    The Advanced Biomedical Computing Center (ABCC), located in Frederick Maryland (MD), provides HPC resources for both NIH/NCI intramural scientists and the extramural biomedical research community. Its mission is to provide HPC support, to provide collaborative research, and to conduct in-house research in various areas of computational biology and biomedical research.

  15. Advance Trends in Soft Computing

    CERN Document Server

    Kreinovich, Vladik; Kacprzyk, Janusz; WCSC 2013

    2014-01-01

    This book is the proceedings of the 3rd World Conference on Soft Computing (WCSC), which was held in San Antonio, TX, USA, on December 16-18, 2013. It presents state-of-the-art theory and applications of soft computing together with an in-depth discussion of current and future challenges in the field, providing readers with a 360 degree view on soft computing. Topics range from fuzzy sets, to fuzzy logic, fuzzy mathematics, neuro-fuzzy systems, fuzzy control, decision making in fuzzy environments, image processing and many more. The book is dedicated to Lotfi A. Zadeh, a renowned specialist in signal analysis and control systems research who proposed the idea of fuzzy sets, in which an element may have a partial membership, in the early 1960s, followed by the idea of fuzzy logic, in which a statement can be true only to a certain degree, with degrees described by numbers in the interval [0,1]. The performance of fuzzy systems can often be improved with the help of optimization techniques, e.g. evolutionary co...

  16. Evolutionary Games and Computer Simulations

    CERN Document Server

    Huberman, B A; Huberman, Bernardo A.; Glance, Natalie S.

    1993-01-01

    Abstract: The prisoner's dilemma has long been considered the paradigm for studying the emergence of cooperation among selfish individuals. Because of its importance, it has been studied through computer experiments as well as in the laboratory and by analytical means. However, there are important differences between the way a system composed of many interacting elements is simulated by a digital machine and the manner in which it behaves when studied in real experiments. In some instances, these disparities can be marked enough so as to cast doubt on the implications of cellular automata type simulations for the study of cooperation in social systems. In particular, if such a simulation imposes space-time granularity, then its ability to describe the real world may be compromised. Indeed, we show that the results of digital simulations regarding territoriality and cooperation differ greatly when time is discrete as opposed to continuous.
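
    The granularity issue the authors raise is easiest to see in the discrete-time case: a synchronous cellular-automaton update of a spatial prisoner's dilemma, where every site imitates its best-scoring neighbour in lockstep. The sketch below is one common variant of such a simulation; payoff values and the imitation rule are illustrative, not the paper's exact setup.

```python
import numpy as np

# Synchronous spatial prisoner's dilemma: weak dilemma with T > R > P = S,
# von Neumann neighbourhood, "imitate the best neighbour (or yourself)" rule.
rng = np.random.default_rng(0)
L = 50
coop = rng.random((L, L)) < 0.5            # True = cooperator
T, R, P, S = 1.4, 1.0, 0.0, 0.0            # temptation, reward, punishment, sucker

for step in range(100):
    n_coop = sum(np.roll(coop, s, axis=a)  # cooperating neighbours per site
                 for a in (0, 1) for s in (1, -1))
    payoff = np.where(coop, R * n_coop + S * (4 - n_coop),
                            T * n_coop + P * (4 - n_coop))
    best, best_pay = coop.copy(), payoff.copy()
    for a in (0, 1):
        for s in (1, -1):
            neigh_pay = np.roll(payoff, s, axis=a)
            neigh_coop = np.roll(coop, s, axis=a)
            better = neigh_pay > best_pay
            best_pay = np.where(better, neigh_pay, best_pay)
            best = np.where(better, neigh_coop, best)
    coop = best                            # all sites update in lockstep

print(f"cooperator fraction after 100 steps: {coop.mean():.2f}")
```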

  17. Advanced Computer Algebra for Determinants

    CERN Document Server

    Koutschan, Christoph

    2011-01-01

    We prove three conjectures concerning the evaluation of determinants, which are related to the counting of plane partitions and rhombus tilings. One of them has been posed by George Andrews in 1980, the other two are by Guoce Xin and Christian Krattenthaler. Our proofs employ computer algebra methods, namely the holonomic ansatz proposed by Doron Zeilberger and variations thereof. These variations make Zeilberger's original approach even more powerful and allow for addressing a wider variety of determinants. Finally we present, as a challenge problem, a conjecture about a closed form evaluation of Andrews's determinant.

  18. Advances in randomized parallel computing

    CERN Document Server

    Rajasekaran, Sanguthevar

    1999-01-01

    The technique of randomization has been employed to solve numerous problems of computing both sequentially and in parallel. Examples of randomized algorithms that are asymptotically better than their deterministic counterparts in solving various fundamental problems abound. Randomized algorithms have the advantages of simplicity and better performance both in theory and often in practice. This book is a collection of articles written by renowned experts in the area of randomized parallel computing. A brief introduction to randomized algorithms: In the analysis of algorithms, at least three different measures of performance can be used: the best case, the worst case, and the average case. Often, the average case run time of an algorithm is much smaller than the worst case. For instance, the worst case run time of Hoare's quicksort is O(n²), whereas its average case run time is only O(n log n). The average case analysis is conducted with an assumption on the input space. The assumption made to arrive at t...
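
    Randomized quicksort is the canonical example from this passage: choosing the pivot at random makes the expected run time O(n log n) on every input, rather than only on average over inputs. A minimal sketch:

```python
import random

def randomized_quicksort(a):
    """Quicksort with a random pivot: expected O(n log n) on any input,
    sidestepping the O(n^2) worst case that a fixed-pivot quicksort hits
    on adversarial (e.g., already sorted) inputs."""
    if len(a) <= 1:
        return a
    pivot = random.choice(a)
    less = [x for x in a if x < pivot]
    equal = [x for x in a if x == pivot]
    greater = [x for x in a if x > pivot]
    return randomized_quicksort(less) + equal + randomized_quicksort(greater)

print(randomized_quicksort([5, 3, 8, 1, 9, 2]))   # [1, 2, 3, 5, 8, 9]
```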

  19. Computer-simulated schlieren optics

    International Nuclear Information System (INIS)

    Computer-simulated schlieren pictures are used to interpret and quantitatively analyze schlieren pictures taken from a fast varying plasma with axial symmetry. Structured angular distributions of deflected rays are obtained from a ray tracing simulation, the characteristics of which are related to the density, density gradients, and dimensions of the plasma. Angular distributions are transformed into intensity distributions using the optical data of a typical schlieren system. Ranges of the plasma parameters density, density gradients, and dimensions are given. As an example, the method is applied to the compression phase of a fast high voltage plasma focus

  20. Computer simulation of martensitic transformations

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Ping

    1993-11-01

    The characteristics of martensitic transformations in solids are largely determined by the elastic strain that develops as martensite particles grow and interact. To study the development of microstructure, a finite-element computer simulation model was constructed to mimic the transformation process. The transformation is athermal and simulated at each incremental step by transforming the cell which maximizes the decrease in the free energy. To determine the free energy change, the elastic energy developed during martensite growth is calculated from the theory of linear elasticity for elastically homogeneous media, and updated as the transformation proceeds.
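
    The transformation rule described (transform, at each step, the cell that maximizes the free-energy decrease) is a greedy loop. The sketch below mimics that control flow with a placeholder energy function; the actual work computes the elastic energy from linear elasticity for homogeneous media, which is not reproduced here.

```python
import numpy as np

# Greedy athermal transformation loop: at each step, transform the
# untransformed cell with the largest free-energy decrease. The interaction
# matrix is a random stand-in for the elastic energy calculation.
rng = np.random.default_rng(1)
n_cells = 100
transformed = np.zeros(n_cells, dtype=bool)
chemical_gain = 1.0                                   # driving force per cell
pair_strain = rng.random((n_cells, n_cells)) * 0.05   # placeholder interaction

while True:
    candidates = np.flatnonzero(~transformed)
    if candidates.size == 0:
        break
    # elastic penalty grows with already-transformed cells (placeholder)
    penalty = pair_strain[candidates][:, transformed].sum(axis=1)
    delta_F = -chemical_gain + penalty                # energy change per cell
    best = np.argmin(delta_F)
    if delta_F[best] >= 0.0:                          # nothing lowers F: stop
        break
    transformed[candidates[best]] = True

print(f"transformed fraction: {transformed.mean():.2f}")
```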

  1. Plasma physics via computer simulation

    CERN Document Server

    Birdsall, CK

    2004-01-01

    PART 1: PRIMER Why attempting to do plasma physics via computer simulation using particles makes good sense; Overall view of a one-dimensional electrostatic program; A one-dimensional electrostatic program ES1; Introduction to the numerical methods used; Projects for ES1; A 1d electromagnetic program EM1; Projects for EM1. PART 2: THEORY Effects of the spatial grid; Effects of the finite time step; Energy-conserving simulation models; Multipole models; Kinetic theory for fluctuations and noise; collisions; Kinetic properties: theory, experience and heuristic estimates. PART 3: PRACTICE...

  2. Interactive Simulations and advanced Visualization with Modelica

    OpenAIRE

    Bellmann, Tobias

    2009-01-01

    In this paper a Modelica library for interactive simulation and advanced visualization called ExternalDevices is introduced and presented. Providing support for standard input devices like keyboard and joystick as well as for communication via UDP and shared memory, this library allows the user to interact with a running simulation and process the output data of the simulation in other processes capable of UDP connections. An advanced visualization system replaces t...

  3. Computer simulation of superionic fluorides

    CERN Document Server

    Castiglione, M

    2000-01-01

    In this thesis the nature of ion mobility in cryolite and lead fluoride based compounds is investigated by computer simulation. The phase transition of cryolite is characterised in terms of rotation of AlF6 octahedra, and the conductive properties are shown to result from diffusion of the sodium ions. The two processes appear to be unrelated. Very good agreement with NMR experimental results is found. The Pb2+ ion has a very high polarisability, yet treatment of this property in previous simulations has been problematic. In this thesis a mor... experimentally gives an indication of the correlations between nearby defects is well-reproduced. The most stringent test of simulation model transferability is presented in the studies of lead tin fluoride, in which significant 'covalent' effects are apparent. Other similarly-structured compounds are also investigated, and the reasons behind the adoption of such an unusual layered structure, and the mobility and site occupation of the anions, are quantified.

  4. Advanced circuit simulation using Multisim workbench

    CERN Document Server

    Báez-López, David; Cervantes-Villagómez, Ofelia Delfina

    2012-01-01

    Multisim is now the de facto standard for circuit simulation. It is a SPICE-based circuit simulator which combines analog, discrete-time, and mixed-mode circuits. In addition, it is the only simulator which incorporates microcontroller simulation in the same environment. It also includes a tool for printed circuit board design. Advanced Circuit Simulation Using Multisim Workbench is a companion book to Circuit Analysis Using Multisim, published by Morgan & Claypool in 2011. This new book covers advanced analyses and the creation of models and subcircuits. It also includes coverage of transmissi...

  5. FPGA-accelerated simulation of computer systems

    CERN Document Server

    Angepat, Hari; Chung, Eric S; Hoe, James C

    2014-01-01

    To date, the most common simulators of computer systems are software-based, running on standard computers. One promising approach to improving simulation performance is to apply hardware, specifically reconfigurable hardware in the form of field programmable gate arrays (FPGAs). This manuscript describes various approaches to using FPGAs to accelerate software-implemented simulation of computer systems, and selected simulators that incorporate those techniques. More precisely, we describe a simulation architecture taxonomy that incorporates a simulation architecture specifically designed f...

  6. Computer Simulation for Emergency Incident Management

    Energy Technology Data Exchange (ETDEWEB)

    Brown, D L

    2004-12-03

    This report describes the findings and recommendations resulting from the Department of Homeland Security (DHS) Incident Management Simulation Workshop held by the DHS Advanced Scientific Computing Program in May 2004. This workshop brought senior representatives of the emergency response and incident-management communities together with modeling and simulation technologists from Department of Energy laboratories. The workshop provided an opportunity for incident responders to describe the nature and substance of the primary personnel roles in an incident response, to identify current and anticipated roles of modeling and simulation in support of incident response, and to begin a dialog between the incident response and simulation technology communities that will guide and inform planned modeling and simulation development for incident response. This report provides a summary of the discussions at the workshop as well as a summary of simulation capabilities that are relevant to incident-management training, and recommendations for the use of simulation in both incident management and in incident management training, based on the discussions at the workshop. In addition, the report discusses areas where further research and development will be required to support future needs in this area.

  7. Foreword: Advanced Science Letters (ASL), Special Issue on Computational Astrophysics

    CERN Document Server

    2009-01-01

    Computational astrophysics has undergone unprecedented development over the last decade, becoming a field of its own. The challenge ahead of us will involve increasingly complex multi-scale simulations. These will bridge the gap between areas of astrophysics such as star and planet formation, or star formation and galaxy formation, that have evolved separately until today. A global knowledge of the physics and modeling techniques of astrophysical simulations is thus an important asset for the next generation of modelers. With the aim at fostering such a global approach, we present the Special Issue on Computational Astrophysics for the Advanced Science Letters (http://www.aspbs.com/science.htm). The Advanced Science Letters (ASL) is a new multi-disciplinary scientific journal which will cover extensively computational astrophysics and cosmology, and will act as a forum for the presentation and discussion of novel work attempting to connect different research areas. This Special Issue collects 9 reviews on 9 k...

  8. Computational electromagnetics recent advances and engineering applications

    CERN Document Server

    2014-01-01

    Emerging Topics in Computational Electromagnetics presents advances in computational electromagnetics (CEM). This book is designed to fill the existing gap in the current CEM literature, which covers only the conventional numerical techniques for solving traditional EM problems. The book examines new algorithms, and applications of these algorithms for solving problems of current interest that are not readily amenable to efficient treatment by using the existing techniques. The authors discuss solution techniques for problems arising in nanotechnology, bioEM, metamaterials, as well as multiscale problems. They present techniques that utilize recent advances in computer technology, such as parallel architectures, and the increasing need to solve large and complex problems in a time efficient manner by using highly scalable algorithms.

  9. OPENING REMARKS: Scientific Discovery through Advanced Computing

    Science.gov (United States)

    Strayer, Michael

    2006-01-01

    Good morning. Welcome to SciDAC 2006 and Denver. I share greetings from the new Undersecretary for Energy, Ray Orbach. Five years ago SciDAC was launched as an experiment in computational science. The goal was to form partnerships among science applications, computer scientists, and applied mathematicians to take advantage of the potential of emerging terascale computers. This experiment has been a resounding success. SciDAC has emerged as a powerful concept for addressing some of the biggest challenges facing our world. As significant as these successes were, I believe there is also significance in the teams that achieved them. In addition to their scientific aims these teams have advanced the overall field of computational science and set the stage for even larger accomplishments as we look ahead to SciDAC-2. I am sure that many of you are expecting to hear about the results of our current solicitation for SciDAC-2. I’m afraid we are not quite ready to make that announcement. Decisions are still being made and we will announce the results later this summer. Nearly 250 unique proposals were received and evaluated, involving literally thousands of researchers, postdocs, and students. These collectively requested more than five times our expected budget. This response is a testament to the success of SciDAC in the community. In SciDAC-2 our budget has been increased to about $70 million for FY 2007 and our partnerships have expanded to include the Environment and National Security missions of the Department. The National Science Foundation has also joined as a partner. These new partnerships are expected to expand the application space of SciDAC, and broaden the impact and visibility of the program. We have, with our recent solicitation, expanded to turbulence, computational biology, and groundwater reactive modeling and simulation. We are currently talking with the Department’s applied energy programs about risk assessment, optimization of complex systems - such...

  10. Computational Intelligence Paradigms in Advanced Pattern Classification

    CERN Document Server

    Jain, Lakhmi

    2012-01-01

    This monograph presents selected areas of application of pattern recognition and classification approaches, including handwriting recognition, medical image analysis and interpretation, development of cognitive systems for image computer understanding, moving object detection, advanced image filtration, and intelligent multi-object labelling and classification. It is directed to scientists, application engineers, professors and students, who will find this book useful.

  11. Computational Biology, Advanced Scientific Computing, and Emerging Computational Architectures

    Energy Technology Data Exchange (ETDEWEB)

    None

    2007-06-27

    This CRADA was established at the start of FY02 with $200 K from IBM and matching funds from DOE to support post-doctoral fellows in collaborative research between International Business Machines and Oak Ridge National Laboratory to explore effective use of emerging petascale computational architectures for the solution of computational biology problems. 'No cost' extensions of the CRADA were negotiated with IBM for FY03 and FY04.

  12. Computer Simulations of Cosmic Reionization

    CERN Document Server

    Trac, Hy

    2009-01-01

    The cosmic reionization of hydrogen was the last major phase transition in the evolution of the universe, which drastically changed the ionization and thermal conditions in the cosmic gas. To the best of our knowledge today, this process was driven by the ultra-violet radiation from young, star-forming galaxies and from first quasars. We review the current observational constraints on cosmic reionization, as well as the dominant physical effects that control the ionization of intergalactic gas. We then focus on numerical modeling of this process with computer simulations. Over the past decade, significant progress has been made in solving the radiative transfer of ionizing photons from many sources through the highly inhomogeneous distribution of cosmic gas in the expanding universe. With modern simulations, we have finally converged on a general picture for the reionization process, but many unsolved problems still remain in this young and exciting field of numerical cosmology.

  13. Advances in computers improving the web

    CERN Document Server

    Zelkowitz, Marvin

    2010-01-01

    This is volume 78 of Advances in Computers. This series, which began publication in 1960, is the oldest continuously published anthology that chronicles the ever-changing information technology field. In these volumes we publish from 5 to 7 chapters, three times per year, that cover the latest changes to the design, development, use and implications of computer technology on society today. Covers the full breadth of innovations in hardware, software, theory, design, and applications. Many of the in-depth reviews have become standard references that continue to be of significant, lasting value i...

  14. Advances in simulation of PCR

    International Nuclear Information System (INIS)

    Polymerase chain reaction (PCR) is an important diagnostic tool in molecular biology, a field that has been greatly improved by PCR. However, optimizing the experimental conditions is still a problem for PCR. Computational biology can be a solution to this problem. In this paper, developments of the mathematical models for PCR are reviewed. It is believed that this kind of research effort shall be helpful for optimizing the experimental conditions, providing guidance for biologists, and understanding the mechanism of PCR. (authors)

  15. Advanced computational approaches to biomedical engineering

    CERN Document Server

    Saha, Punam K; Basu, Subhadip

    2014-01-01

    There has been rapid growth in biomedical engineering in recent decades, given advancements in medical imaging and physiological modelling and sensing systems, coupled with immense growth in computational and network technology, analytic approaches, visualization and virtual-reality, man-machine interaction and automation. Biomedical engineering involves applying engineering principles to the medical and biological sciences and it comprises several topics including biomedicine, medical imaging, physiological modelling and sensing, instrumentation, real-time systems, automation and control, sig...

  16. Multiscale Computer Simulation of Failure in Aerogels

    Science.gov (United States)

    Good, Brian S.

    2008-01-01

    Aerogels have been of interest to the aerospace community primarily for their thermal properties, notably their low thermal conductivities. While such gels are typically fragile, recent advances in the application of conformal polymer layers to these gels have made them potentially useful as lightweight structural materials as well. We have previously performed computer simulations of aerogel thermal conductivity and tensile and compressive failure, with results that are in qualitative, and sometimes quantitative, agreement with experiment. However, recent experiments in our laboratory suggest that gels having similar densities may exhibit substantially different properties. In this work, we extend our original diffusion limited cluster aggregation (DLCA) model for gel structure to incorporate additional variation in DLCA simulation parameters, with the aim of producing DLCA clusters of similar densities that nevertheless have different fractal dimension and secondary particle coordination. We perform particle statics simulations of gel strain on these clusters, and consider the effects of differing DLCA simulation conditions, and the resultant differences in fractal dimension and coordination, on gel strain properties.
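
    For flavour, here is a minimal on-lattice diffusion-limited aggregation loop, the single-seed relative of the DLCA process used above to build model gel structures (DLCA proper lets many clusters diffuse and merge; lattice size and particle count here are arbitrary):

```python
import numpy as np

# Minimal diffusion-limited aggregation: random walkers stick on touching the
# growing cluster, producing a fractal aggregate. Purely illustrative.
rng = np.random.default_rng(2)
L = 61
grid = np.zeros((L, L), dtype=bool)
grid[L // 2, L // 2] = True                      # seed site

for _ in range(200):                             # walkers to attach
    x, y = rng.integers(0, L, size=2)            # random release point
    while True:
        dx, dy = ((1, 0), (-1, 0), (0, 1), (0, -1))[rng.integers(4)]
        x, y = (x + dx) % L, (y + dy) % L        # periodic random walk
        if (grid[(x + 1) % L, y] or grid[(x - 1) % L, y]
                or grid[x, (y + 1) % L] or grid[x, (y - 1) % L]):
            grid[x, y] = True                    # stick next to the cluster
            break

print(f"cluster occupies {grid.sum()} of {L * L} sites")
```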

  17. Strange attractor simulated on a quantum computer

    OpenAIRE

    M. Terraneo; Georgeot, B.; D.L. Shepelyansky

    2002-01-01

    We show that dissipative classical dynamics converging to a strange attractor can be simulated on a quantum computer. Such quantum computations allow to investigate efficiently the small scale structure of strange attractors, yielding new information inaccessible to classical computers. This opens new possibilities for quantum simulations of various dissipative processes in nature.

  18. QCE : A Simulator for Quantum Computer Hardware

    NARCIS (Netherlands)

    Michielsen, Kristel; Raedt, Hans De

    2003-01-01

    The Quantum Computer Emulator (QCE) described in this paper consists of a simulator of a generic, general purpose quantum computer and a graphical user interface. The latter is used to control the simulator, to define the hardware of the quantum computer and to debug and execute quantum algorithms.

  19. Computer simulation of liquid crystals

    International Nuclear Information System (INIS)

    In this thesis computer simulations of liquid crystal systems were performed, focusing on the isotropic-nematic interface and on the effects of confinement. A range of idealised and atomistic models were employed, using both Monte-Carlo and Molecular Dynamics techniques. The structure of the planar isotropic-nematic (I-N) interface was investigated using constant volume Monte-Carlo simulation of systems of hard ellipsoids confined between parallel hard walls. The nematic phase is observed to wet the hard walls, establishing two planar I-N interfaces per simulation box. The microscopic pressure tensor in the vicinity of the interface was calculated for both planar and normal alignment of the nematic. The surface tension, calculated directly from the pressure tensor, is lowest in the case of planar alignment, indicating that this is the preferred alignment at the interface. The form of the transverse pressure across the interface is dramatically different for the two orientations. For the case of planar alignment we observe a large tension (low transverse pressure) on the nematic side and a small compressive region (high transverse pressure) on the isotropic side. For the case of normal alignment we see a large compression on the nematic side followed by tension on the isotropic side. Comparisons are made with the results of Onsager theory, showing excellent agreement. Gay-Berne particles confined to very thin films are investigated using constant pressure Monte-Carlo simulation. Orientational wetting is again observed at the liquid-wall interface. For homeotropic alignment at the wall we observe strong layering of particles into planes parallel to the walls and long-range orientational order which persists well beyond the range of the fluid-wall interaction. The average system order parameter is strongly dependent on the film thickness, the order being highest when the film thickness is commensurate with the formation of an integral number of molecular layers. An...

  20. [Activities of Research Institute for Advanced Computer Science

    Science.gov (United States)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2001-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administrations missions. RIACS is located at the NASA Ames Research Center, Moffett Field, California. RIACS research focuses on the three cornerstones of IT research necessary to meet the future challenges of NASA missions: 1. Automated Reasoning for Autonomous Systems Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. 2. Human-Centered Computing Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities. 3. High Performance Computing and Networking Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to analysis of large scientific datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply IT research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, visiting scientist programs and student summer programs, designed to encourage and facilitate collaboration between the university and NASA IT research communities.

  1. An advanced coarse-grained nucleosome core particle model for computer simulations of nucleosome-nucleosome interactions under varying ionic conditions.

    Directory of Open Access Journals (Sweden)

    Yanping Fan

    Full Text Available In the eukaryotic cell nucleus, DNA exists as chromatin, a compact but dynamic complex with histone proteins. The first level of DNA organization is the linear array of nucleosome core particles (NCPs). The NCP is a well-defined complex of 147 bp DNA with an octamer of histones. Interactions between NCPs are of paramount importance for higher levels of chromatin compaction. The polyelectrolyte nature of the NCP implies that nucleosome-nucleosome interactions are strongly influenced both by the ionic environment and by the positively charged and highly flexible N-terminal histone tails protruding out from the NCP. The large size of the system precludes a modelling analysis of chromatin at an all-atom level and calls for coarse-grained approximations. Here, a model of the NCP that includes the globular histone core and the flexible histone tails, described by one particle per amino acid and taking into account their net charge, is proposed. DNA wrapped around the histone core was approximated at the level of two base pairs represented by one bead (bases and sugars) plus four beads of charged phosphate groups. Computer simulations, using a Langevin thermostat, in a dielectric continuum with explicit monovalent (K+), divalent (Mg2+) or trivalent (Co(NH3)6(3+)) cations were performed for systems with one or ten NCPs. Increasing the counterion charge results in a switch from repulsive NCP-NCP interaction in the presence of K+, to partial aggregation with Mg2+, and to strong mutual attraction of all 10 NCPs in the presence of CoHex(3+). The new model reproduced experimental results, and the structure of the NCP-NCP contacts is in agreement with available data. Cation screening, ion-ion correlations and tail bridging contribute to the NCP-NCP attraction, and the new NCP model accounts for these interactions.
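
    The Langevin-thermostat simulations mentioned integrate Newtonian dynamics with friction and matched thermal noise. A one-bead sketch in a harmonic trap shows the scheme and its check against equipartition; all parameters are illustrative, not the NCP model's:

```python
import numpy as np

# One-bead Langevin dynamics in a harmonic trap: Euler-Maruyama integration of
# dv = (f/m - gamma*v) dt + sqrt(2*gamma*kT/m) dW.
rng = np.random.default_rng(3)
kT, gamma, m, k, dt = 1.0, 1.0, 1.0, 1.0, 0.01
x, v = 0.0, 0.0
x2 = []
for step in range(200_000):
    f = -k * x                                   # conservative force
    v += (f / m - gamma * v) * dt + np.sqrt(2 * gamma * kT / m * dt) * rng.normal()
    x += v * dt
    x2.append(x * x)

# Equipartition check: <x^2> should approach kT/k = 1.0 after equilibration.
print(f"<x^2> = {np.mean(x2[20_000:]):.3f}")
```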

  2. Advanced Vadose Zone Simulations Using TOUGH

    Energy Technology Data Exchange (ETDEWEB)

    Finsterle, S.; Doughty, C.; Kowalsky, M.B.; Moridis, G.J.; Pan,L.; Xu, T.; Zhang, Y.; Pruess, K.

    2007-02-01

    The vadose zone can be characterized as a complex subsurface system in which intricate physical and biogeochemical processes occur in response to a variety of natural forcings and human activities. This makes it difficult to describe, understand, and predict the behavior of this specific subsurface system. The TOUGH nonisothermal multiphase flow simulators are well-suited to perform advanced vadose zone studies. The conceptual models underlying the TOUGH simulators are capable of representing features specific to the vadose zone, and of addressing a variety of coupled phenomena. Moreover, the simulators are integrated into software tools that enable advanced data analysis, optimization, and system-level modeling. We discuss fundamental and computational challenges in simulating vadose zone processes, review recent advances in modeling such systems, and demonstrate some capabilities of the TOUGH suite of codes using illustrative examples.

  3. Computational simulation methods for composite fracture mechanics

    Science.gov (United States)

    Murthy, Pappu L. N.

    1988-01-01

    Structural integrity, durability, and damage tolerance of advanced composites are assessed, quantitatively and qualitatively, by studying damage initiation at various scales (micro, macro, and global) and its accumulation and growth leading to global failure. In addition, various fracture toughness parameters associated with typical damage and its growth must be determined. Computational structural analysis codes were developed to aid the composite design engineer in performing these tasks. CODSTRAN (COmposite Durability STRuctural ANalysis) is used to qualitatively and quantitatively assess the progressive damage occurring in composite structures due to mechanical and environmental loads. Next, methods are covered that are currently being developed and used at Lewis to predict interlaminar fracture toughness and related parameters of fiber composites given prescribed damage. The general purpose finite element code MSC/NASTRAN was used to simulate the interlaminar fracture and the associated individual as well as mixed-mode strain energy release rates in fiber composites.

  4. Efficient SDH Computation In Molecular Simulations Data

    OpenAIRE

    Tu, Yi-Cheng; Chen, Shaoping; Pandit, Sagar; Kumar, Anand; Grupcev, Vladimir

    2012-01-01

    Analysis of large particle or molecular simulation data is an integral part of basic-science research. It often involves computing functions such as point-to-point interactions of particles. The spatial distance histogram (SDH) is one such vital computation in scientific discovery. SDH is frequently used to compute the radial distribution function (RDF), and it takes quadratic time to compute using the naive approach. Naive SDH computation is even more expensive as it is computed continuously ...
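
    To make the quadratic cost concrete, here is a minimal sketch of the naive SDH baseline the paper improves on, written in Python/NumPy; the bucket width and point count are arbitrary illustrations, not values from the paper.

        import numpy as np

        def naive_sdh(points, bucket_width, num_buckets):
            # Naive O(n^2) spatial distance histogram: bin every pairwise
            # distance into fixed-width buckets, counting each pair once.
            n = len(points)
            hist = np.zeros(num_buckets, dtype=np.int64)
            for i in range(n):
                d = np.linalg.norm(points[i + 1:] - points[i], axis=1)
                idx = np.minimum((d / bucket_width).astype(int), num_buckets - 1)
                np.add.at(hist, idx, 1)
            return hist

        rng = np.random.default_rng(0)
        atoms = rng.random((2000, 3))        # toy coordinates in a unit box
        h = naive_sdh(atoms, bucket_width=0.05, num_buckets=35)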

  5. Advances in Computer, Communication, Control and Automation

    CERN Document Server

    2011 International Conference on Computer, Communication, Control and Automation

    2012-01-01

    The volume includes a set of selected papers extended and revised from the 2011 International Conference on Computer, Communication, Control and Automation (3CA 2011), held in Zhuhai, China, November 19-20, 2011. Topics covered in this volume include signal and image processing, speech and audio processing, video processing and analysis, artificial intelligence, computing and intelligent systems, machine learning, sensor and neural networks, knowledge discovery and data mining, fuzzy mathematics and applications, knowledge-based systems, hybrid systems modeling and design, risk analysis and management, and system modeling and simulation. We hope that researchers, graduate students and other interested readers benefit scientifically from the proceedings and also find them stimulating.

  6. Hybrid and Electric Advanced Vehicle Systems Simulation

    Science.gov (United States)

    Beach, R. F.; Hammond, R. A.; Mcgehee, R. K.

    1985-01-01

    Predefined components connected to represent wide variety of propulsion systems. Hybrid and Electric Advanced Vehicle System (HEAVY) computer program is flexible tool for evaluating performance and cost of electric and hybrid vehicle propulsion systems. Allows designer to quickly, conveniently, and economically predict performance of proposed drive train.

  7. Advances of evolutionary computation methods and operators

    CERN Document Server

    Cuevas, Erik; Oliva Navarro, Diego Alberto

    2016-01-01

    The goal of this book is to present advances that discuss alternative Evolutionary Computation (EC) developments and non-conventional operators which have proved to be effective in the solution of several complex problems. The book has been structured so that each chapter can be read independently from the others. The book contains nine chapters with the following themes: 1) Introduction, 2) the Social Spider Optimization (SSO), 3) the States of Matter Search (SMS), 4) the collective animal behavior (CAB) algorithm, 5) the Allostatic Optimization (AO) method, 6) the Locust Search (LS) algorithm, 7) the Adaptive Population with Reduced Evaluations (APRE) method, 8) the multimodal CAB, 9) the constrained SSO method.

  8. Advances in Intelligent Modelling and Simulation: Simulation Tools and Applications

    CERN Document Server

    Oplatková, Zuzana; Carvalho, Marco; Kisiel-Dorohinicki, Marek

    2012-01-01

    The human capacity to abstract complex systems and phenomena into simplified models has played a critical role in the rapid evolution of our modern industrial processes and scientific research. As a science and an art, modelling and simulation have been one of the core enablers of this remarkable human trait, and have become a topic of great importance for researchers and practitioners. This book was created to compile some of the most recent concepts, advances, challenges and ideas associated with Intelligent Modelling and Simulation frameworks, tools and applications. The first chapter discusses the important aspects of human interaction and the correct interpretation of results during simulations. The second chapter gets to the heart of the analysis of entrepreneurship by means of agent-based modelling and simulations. The following three chapters bring together the central theme of simulation frameworks, first describing an agent-based simulation framework, then a simulator for electrical machines, and...

  9. Equipping simulators with an advanced thermal hydraulics model: EDF's experience

    International Nuclear Information System (INIS)

    The development of an accelerated version of the advanced CATHARE-1 thermal hydraulics code designed for EDF training simulators (CATHARE-SIMU) was successfully completed as early as 1991. Its successful integration as the principal model of the SIPA post-accident simulator meant that its use could be extended to full-scale simulators as part of the renovation of the stock of existing simulators. In order to further extend the field of application to accidents occurring in shutdown states requiring action, and to catch up with developments of the CATHARE code, EDF initiated the SCAR project, designed to adapt CATHARE-2 to simulator requirements (acceleration, parallelization of the computation and extension of the simulation range). Moreover, the installation of SIPA on workstations means that the authors can envisage the application of this remarkable training facility to the understanding of thermal hydraulics accident phenomena.

  10. International Conference on Computers and Advanced Technology in Education

    CERN Document Server

    Advanced Information Technology in Education

    2012-01-01

    The volume includes a set of selected papers extended and revised from the 2011 International Conference on Computers and Advanced Technology in Education. With the development of computers and advanced technology, human social activities are changing fundamentally. Education, especially education reform in different countries, has benefited greatly from computers and advanced technology. Generally speaking, education is a field which needs ever more information, while computers, advanced technology and the internet are good information providers. Also, with the aid of computers and advanced technology, education can combine the two effectively. Therefore, computers and advanced technology should be regarded as an important medium in modern education. The volume Advanced Information Technology in Education is to provide a forum for researchers, educators, engineers, and government officials involved in the general areas of computers and advanced technology in education to d...

  11. The Guide to Computer Simulations and Games

    CERN Document Server

    Becker, K

    2011-01-01

    The first computer simulation book for anyone designing or building a game. Answering the growing demand for a book catering to those who design, develop, or use simulations and games, this book teaches you exactly what you need to know in order to understand the simulations you build or use, all without having to earn another degree. Organized into three parts, this informative book first defines computer simulations and describes how they are different from live-action and paper-based simulations. The second section builds upon the first, with coverage of the technical details of simulations

  12. Computer Simulation in Chemical Kinetics

    Science.gov (United States)

    Anderson, Jay Martin

    1976-01-01

    Discusses the use of the System Dynamics technique in simulating a chemical reaction for kinetic analysis. Also discusses the use of simulation modelling in biology, ecology, and the social sciences, where experimentation may be impractical or impossible. (MLH)

  13. Framework for utilizing computational devices within simulation

    Directory of Open Access Journals (Sweden)

    Miroslav Mintál

    2013-12-01

    Full Text Available Nowadays several frameworks exist for utilizing the computational power of graphics cards and other devices such as FPGAs, ARM and multi-core processors. The best known are either low-level, requiring a lot of controlling code, or are bound to particular graphics cards. Furthermore, more specialized frameworks exist, aimed mainly at mathematical applications. The framework described here is tailored to use in multi-agent simulations: it provides an option to accelerate computations when preparing a simulation and, mainly, to accelerate the computation of the simulation itself.

  14. VLSI circuit simulation using a vector computer

    Science.gov (United States)

    Mcgrogan, S. K.

    1984-01-01

    Simulation of circuits having more than 2000 active devices requires the largest, fastest computers available. A vector computer, such as the CYBER 205, can yield great speed and cost advantages if efforts are made to adapt the simulation program to the strengths of the computer. ASPEC and SPICE (1), two widely used circuit simulation programs, are discussed. ASPECV and VAMOS (5) are respectively vector adaptations of these two simulators. They demonstrate the substantial performance enhancements possible for this class of algorithm on the CYBER 205.
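
    The kind of adaptation referred to can be illustrated with a toy example (this is not ASPEC/SPICE code): a circuit simulator spends much of its time evaluating device model equations, and a vector machine rewards evaluating all devices with one array operation instead of a scalar loop. A hedged NumPy analogy, with an illustrative diode model:

        import numpy as np

        # Toy illustration: the ideal-diode equation evaluated for many devices.
        v = np.random.default_rng(1).uniform(0.4, 0.7, 100_000)   # device voltages
        i_sat, v_t = 1e-14, 0.02585                               # sat. current, thermal voltage

        def currents_scalar(v):
            # one device at a time, as on a scalar machine
            return np.array([i_sat * (np.exp(x / v_t) - 1.0) for x in v])

        def currents_vector(v):
            # one vector operation over all devices, the pattern a
            # vector computer (or SIMD unit) executes efficiently
            return i_sat * (np.exp(v / v_t) - 1.0)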

  15. QCE: A Simulator for Quantum Computer Hardware

    OpenAIRE

    Michielsen, Kristel; De Raedt, Hans

    2003-01-01

    The Quantum Computer Emulator (QCE) described in this paper consists of a simulator of a generic, general purpose quantum computer and a graphical user interface. The latter is used to control the simulator, to define the hardware of the quantum computer and to debug and execute quantum algorithms. QCE runs in a Windows 98/NT/2000/ME/XP environment. It can be used to validate designs of physically realizable quantum processors and as an interactive educational tool to learn about qu...
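
    The core of any such emulator is linear algebra on the 2^n-dimensional state vector. A minimal sketch of the idea in Python/NumPy (not QCE's actual implementation), applying a Hadamard gate to one qubit of a three-qubit register:

        import numpy as np

        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate

        def apply_1q(state, gate, target, n):
            # Apply a 2x2 gate to qubit `target` of an n-qubit state vector
            # by reshaping into a rank-n tensor and contracting one axis.
            psi = state.reshape([2] * n)
            psi = np.tensordot(gate, psi, axes=([1], [target]))
            psi = np.moveaxis(psi, 0, target)
            return psi.reshape(-1)

        n = 3
        state = np.zeros(2 ** n, dtype=complex)
        state[0] = 1.0                                  # |000>
        state = apply_1q(state, H, target=0, n=n)       # (|000> + |100>)/sqrt(2)
        probs = np.abs(state) ** 2                      # measurement statistics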

  16. Stochastic Simulations on the Cellular Wave Computers

    OpenAIRE

    Ercsey-Ravasz, M.; Roska, T.; Néda, Z.

    2006-01-01

    The computational paradigm represented by Cellular Neural/nonlinear Networks (CNN) and the CNN Universal Machine (CNN-UM) as a Cellular Wave Computer, gives new perspectives for computational physics. Many numerical problems and simulations can be elegantly addressed on this fully parallelized and analogic architecture. Here we study the possibility of performing stochastic simulations on this chip. First a realistic random number generator is implemented on the CNN-UM, and then as an example...

  17. Dynamic Simulations of Advanced Fuel Cycles

    Energy Technology Data Exchange (ETDEWEB)

    Steven J. Piet; Brent W. Dixon; Jacob J. Jacobson; Gretchen E. Matthern; David E. Shropshire

    2011-03-01

    Years of performing dynamic simulations of advanced nuclear fuel cycle options provide insights into how they could work and how one might transition from the current once-through fuel cycle. This paper summarizes those insights from the context of the 2005 objectives and goals of the U.S. Advanced Fuel Cycle Initiative (AFCI). Our intent is not to compare options, assess options versus those objectives and goals, nor recommend changes to those objectives and goals. Rather, we organize what we have learned from dynamic simulations in the context of the AFCI objectives for waste management, proliferation resistance, uranium utilization, and economics. Thus, we do not merely describe “lessons learned” from dynamic simulations but attempt to answer the “so what” question by using this context. The analyses have been performed using the Verifiable Fuel Cycle Simulation of Nuclear Fuel Cycle Dynamics (VISION). We observe that the 2005 objectives and goals do not address many of the inherently dynamic discriminators among advanced fuel cycle options and transitions thereof.

  18. Analyzing Robotic Kinematics Via Computed Simulations

    Science.gov (United States)

    Carnahan, Timothy M.

    1992-01-01

    Computing system assists in evaluation of kinematics of conceptual robot. Displays positions and motions of robotic manipulator within work cell. Also displays interactions between robotic manipulator and other objects. Results of simulation displayed on graphical computer workstation. System includes both off-the-shelf software originally developed for automotive industry and specially developed software. Simulation system also used to design human-equivalent hand, to model optical train in infrared system, and to develop graphical interface for teleoperator simulation system.

  19. Accounting Principles are Simulated on Quantum Computers

    OpenAIRE

    Diep, Do Ngoc; Giang, Do Hoang

    2005-01-01

    The paper is devoted to a new idea of simulating accounting by quantum computing. We express the actual accounting principles in a purely mathematical language. After that, we simulate the accounting principles on quantum computers. We show that all arbitrary accounting actions are exhausted by the described basic actions. The main problems of accounting are reduced to a system of linear equations in the economic model of Leontief. In this simulation we use our constructed quantum Gauß-Jor...

  20. Computer simulation of sputtering: A review

    International Nuclear Information System (INIS)

    In 1986, H. H. Andersen reviewed attempts to understand sputtering by computer simulation and identified several areas where further research was needed: potential energy functions for molecular dynamics (MD) modelling; the role of inelastic effects on sputtering, especially near the target surface; the modelling of surface binding in models based on the binary collision approximation (BCA); aspects of cluster emission in MD models; and angular distributions of sputtered particles. To these may be added kinetic energy distributions of sputtered particles and the relationships between MD and BCA models, as well as the development of intermediate models. Many of these topics are discussed. Recent advances in BCA modelling include the explicit evaluation of the time in strict BCA codes and the development of intermediate codes able to simulate certain many-particle problems realistically. Developments in MD modelling include the widespread use of many-body potentials in sputtering calculations, the inclusion of realistic electron excitation and electron-phonon interactions, and several studies of cluster ion impacts on solid surfaces

  1. 14 CFR Appendix H to Part 121 - Advanced Simulation

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 3 2010-01-01 2010-01-01 false Advanced Simulation H Appendix H to Part... Simulation This appendix provides guidelines and a means for achieving flightcrew training in advanced... simulator, as appropriate. Advanced Simulation Training Program For an operator to conduct Level C or...

  2. Hybrid and electric advanced vehicle systems (heavy) simulation

    Science.gov (United States)

    Hammond, R. A.; Mcgehee, R. K.

    1981-01-01

    A computer program to simulate hybrid and electric advanced vehicle systems (HEAVY) is described. It is intended for use early in the design process: concept evaluation, alternative comparison, preliminary design, control and management strategy development, component sizing, and sensitivity studies. It allows the designer to quickly, conveniently, and economically predict the performance of a proposed drive train. The user defines the system to be simulated using a library of predefined component models that may be connected to represent a wide variety of propulsion systems. The development of three example models is discussed.

  3. Transport modeling and advanced computer techniques

    International Nuclear Information System (INIS)

    A workshop was held at the University of Texas in June 1988 to consider the current state of transport codes and whether improved user interfaces would make the codes more usable and accessible to the fusion community. Also considered was the possibility that a software standard could be devised to ease the exchange of routines between groups. It was noted that two of the major obstacles to exchanging routines now are the variety of geometrical representations and the choices of units. While the workshop formulated no standards, it was generally agreed that good software engineering would aid in the exchange of routines, and that a continued exchange of ideas between groups would be worthwhile. It seems that before we begin to discuss software standards we should review the current state of computer technology, both hardware and software, to see what influence recent advances might have on our software goals. This is done in this paper.

  4. Advanced Scientific Computing Research Network Requirements

    Energy Technology Data Exchange (ETDEWEB)

    Bacon, Charles; Bell, Greg; Canon, Shane; Dart, Eli; Dattoria, Vince; Goodwin, Dave; Lee, Jason; Hicks, Susan; Holohan, Ed; Klasky, Scott; Lauzon, Carolyn; Rogers, Jim; Shipman, Galen; Skinner, David; Tierney, Brian

    2013-03-08

    The Energy Sciences Network (ESnet) is the primary provider of network connectivity for the U.S. Department of Energy (DOE) Office of Science (SC), the single largest supporter of basic research in the physical sciences in the United States. In support of SC programs, ESnet regularly updates and refreshes its understanding of the networking requirements of the instruments, facilities, scientists, and science programs that it serves. This focus has helped ESnet to be a highly successful enabler of scientific discovery for over 25 years. In October 2012, ESnet and the Office of Advanced Scientific Computing Research (ASCR) of the DOE SC organized a review to characterize the networking requirements of the programs funded by the ASCR program office. The requirements identified at the review are summarized in the Findings section, and are described in more detail in the body of the report.

  5. Advanced proton imaging in computed tomography

    CERN Document Server

    Mattiazzo, S; Giubilato, P; Pantano, D; Pozzobon, N; Snoeys, W; Wyss, J

    2015-01-01

    In recent years the use of hadrons for cancer radiation treatment has grown in importance, and many facilities are currently operational or under construction worldwide. To fully exploit the therapeutic advantages offered by hadron therapy, precise body imaging for accurate beam delivery is decisive. Proton computed tomography (pCT) scanners, currently in their R&D phase, provide the ultimate 3D imaging for hadron treatment guidance. A key component of a pCT scanner is the detector used to track the protons, which has a great impact on the scanner's performance and ultimately limits its maximum speed. In this article, a novel proton-tracking detector is presented that would offer higher scanning speed, better spatial resolution and lower material budget with respect to present state-of-the-art detectors, leading to enhanced performance. This advancement in performance is achieved by employing the very latest developments in monolithic active pixel detectors (to build high granularity, low material budget, ...

  6. Filtration theory using computer simulations

    Energy Technology Data Exchange (ETDEWEB)

    Bergman, W.; Corey, I. [Lawrence Livermore National Lab., CA (United States)

    1997-08-01

    We have used commercially available fluid dynamics codes based on Navier-Stokes theory and the Langevin particle equation of motion to compute the particle capture efficiency and pressure drop through selected two- and three-dimensional fiber arrays. The approach we used was to first compute the air velocity vector field throughout a defined region containing the fiber matrix. The particle capture in the fiber matrix is then computed by superimposing the Langevin particle equation of motion over the flow velocity field. Using the Langevin equation combines the particle Brownian motion, inertia and interception mechanisms in a single equation. In contrast, most previous investigations treat the different capture mechanisms separately. We have computed the particle capture efficiency and the pressure drop through one 2-D and two 3-D fiber matrix elements. 5 refs., 11 figs.
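
    A hedged sketch of the superposition idea described above, with made-up parameter names: the particle's equation of motion combines Stokes drag toward the precomputed air velocity with Brownian kicks, and capture would be tested against the fiber surfaces at each step. The noise amplitude below is schematic rather than the paper's exact formulation.

        import numpy as np

        def track_particle(x0, v0, fluid_velocity, tau, q, dt, steps, rng):
            # tau: particle relaxation time (inertia); q: schematic Brownian
            # velocity-kick amplitude. fluid_velocity(x) stands in for the
            # precomputed Navier-Stokes air-velocity field.
            x, v = np.array(x0, float), np.array(v0, float)
            for _ in range(steps):
                u = fluid_velocity(x)                          # local air velocity
                v += dt * (u - v) / tau                        # Stokes drag toward the flow
                v += q * np.sqrt(dt) * rng.standard_normal(3)  # Brownian kicks
                x += v * dt
                # a capture test against fiber surfaces (interception) goes here
            return x, v

        rng = np.random.default_rng(1)
        x, v = track_particle([0, 0, 0], [0, 0, 0.1],
                              lambda x: np.array([0.0, 0.0, 0.1]),  # uniform-flow stand-in
                              tau=1e-3, q=0.05, dt=1e-4, steps=1000, rng=rng)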

  7. Computational tools for simulation of phase transformations

    OpenAIRE

    Schalin, Mikael

    1999-01-01

    A new software package, Thermite, for thermodynamic calculations and process simulation is developed around the Thermo-Calc databank. Thermite is a computational toolbox for equilibrium calculations and simulation of phase transformations. It provides graphic visualisation and allows manipulation of the presented data. Two types of phase transformations have been implemented in the software. First, it was used to simulate solidification of alloys using the Gulliver-Scheil model. Simulations were made...

  8. Computer simulation in physics and engineering

    CERN Document Server

    Steinhauser, Martin Oliver

    2013-01-01

    This work is a needed reference for widely used techniques and methods of computer simulation in physics and other disciplines, such as materials science. It conveys both the theoretical foundations of computer simulation and applications and "tricks of the trade" that are often scattered across various papers. Thus it will meet a need and fill a gap for every scientist who needs computer simulations for the task at hand. In addition to being a reference, case studies and exercises for use as course reading are included.

  9. Advanced ST plasma scenario simulations for NSTX

    International Nuclear Information System (INIS)

    Integrated scenario simulations are done for NSTX that address four primary milestones for developing advanced ST configurations: high β and high βN inductive discharges to study all aspects of ST physics in the high-beta regime; non-inductively sustained discharges for flattop times greater than the skin time to study the various current drive techniques; non-inductively sustained discharges at high β for flattop times much greater than a skin time, which provide the integrated advanced ST target for NSTX; and non-solenoidal startup and plasma current rampup. The simulations done here use the Tokamak Simulation Code (TSC) and are based on discharge 109070. TRANSP analysis of the discharge provided the thermal diffusivities for electrons and ions, the neutral beam (NB) deposition profile and other characteristics. CURRAY is used to calculate the High Harmonic Fast Wave (HHFW) heating depositions and current drive. GENRAY/CQL3D is used to establish the heating and CD deposition profiles for electron Bernstein waves (EBW). Analysis of the ideal MHD stability is done with JSOLVER, BALMSC, and PEST2. The simulations indicate that the integrated advanced ST plasma is reachable, obtaining stable plasmas with β ∼ 40% at βN of 7.7-9, IP = 1.0 MA and BT = 0.35 T. The plasma is 100% non-inductive and has a flattop of 4 skin times. The resulting global energy confinement corresponds to a multiplier of H98(y,2) = 1.5. The simulations have demonstrated the importance of HHFW heating and CD, EBW off-axis CD, strong plasma shaping, density control, and early heating/H-mode transition for producing and optimizing these plasma configurations (author)

  10. Evaluation of Visual Computer Simulator for Computer Architecture Education

    Science.gov (United States)

    Imai, Yoshiro; Imai, Masatoshi; Moritoh, Yoshio

    2013-01-01

    This paper presents a trial evaluation of a visual computer simulator in 2009-2011, which has been developed to play the roles of both an instruction facility and a learning tool simultaneously. It also illustrates an example of computer architecture education for university students and the usage of an e-Learning tool for assembly programming in order to…

  11. Making Advanced Computer Science Topics More Accessible through Interactive Technologies

    Science.gov (United States)

    Shao, Kun; Maher, Peter

    2012-01-01

    Purpose: Teaching advanced technical concepts in a computer science program to students of different technical backgrounds presents many challenges. The purpose of this paper is to present a detailed experimental pedagogy in teaching advanced computer science topics, such as computer networking, telecommunications and data structures using…

  12. Quantum physics, simulation, and computation

    International Nuclear Information System (INIS)

    Full text: The ultimate scope and power of computers will be determined by the laws of physics. Quantum computers exploit the rules of quantum mechanics, using quantum coherence and entanglement for new ways of information processing. To date, the realization of these systems requires extremely precise control of matter on the atomic scale and a nearly perfect isolation from the environment. The question of to what extent quantum information processing can also be exploited in 'natural' and less controlled systems, including biological ones, is exciting but still open. In this talk, I will present some of our recent work on (quantum) physically and biologically motivated models of information processing. (author)

  13. An Introduction to Parallel Cluster Computing Using PVM for Computer Modeling and Simulation of Engineering Problems

    International Nuclear Information System (INIS)

    An investigation has been conducted regarding the ability of clustered personal computers to improve the performance of executing software simulations for solving engineering problems. The power and utility of personal computers continue to grow exponentially through advances in computing capabilities such as newer microprocessors, advances in microchip technologies, electronic packaging, and cost-effective gigabyte-size hard drive capacity. Many engineering problems require significant computing power. Therefore, the computation has to be done by high-performance computer systems that cost millions of dollars and need gigabytes of memory to complete the task. Alternatively, adequate computing can feasibly be provided in the form of clustered personal computers. This method cuts the cost and size by linking (clustering) personal computers together across a network. Clusters also have the advantage that they can be used as stand-alone computers when they are not operating as a parallel computer. Parallel computing software to exploit clusters is available for computer operating systems like Unix, Windows NT, or Linux. This project concentrates on the use of Windows NT and the Parallel Virtual Machine (PVM) system to solve an engineering dynamics problem in Fortran.
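
    PVM itself is a C/Fortran message-passing library, so as a hedged modern analogue of the same master-worker clustering idea (not the project's actual code), Python's standard multiprocessing pool can farm independent simulation cases out to worker processes:

        import multiprocessing as mp

        def run_case(params):
            # Stand-in for one engineering-dynamics simulation (hypothetical):
            # explicit integration of the oscillator m*x'' = -k*x.
            k, m, steps = params
            x, v, dt = 1.0, 0.0, 1e-3
            for _ in range(steps):
                v -= dt * k / m * x
                x += dt * v
            return x

        if __name__ == "__main__":
            cases = [(k, 1.0, 100_000) for k in (1.0, 2.0, 4.0, 8.0)]
            with mp.Pool(processes=4) as pool:
                results = pool.map(run_case, cases)   # one case per worker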

  14. Computer Simulation in Information and Communication Engineering

    CERN Multimedia

    Anton Topurov

    2005-01-01

    CSICE'05 Sofia, Bulgaria 20th - 22nd October, 2005 On behalf of the International Scientific Committee, we would like to invite you all to Sofia, the capital city of Bulgaria, to the International Conference on Computer Simulation in Information and Communication Engineering CSICE'05. The Conference is aimed at facilitating the exchange of experience in the field of computer simulation gained not only in traditional fields (Communications, Electronics, Physics...) but also in the areas of biomedical engineering, environment, industrial design, etc. The objective of the Conference is to bring together lecturers, researchers and practitioners from different countries, working in the fields of computer simulation in information engineering, in order to exchange information and bring new contributions to this important field of engineering design and education. The Conference will bring you the latest ideas and developments of the tools for computer simulation directly from their inventors. Contribution describ...

  15. Computational simulation in architectural and environmental acoustics methods and applications of wave-based computation

    CERN Document Server

    Sakamoto, Shinichi; Otsuru, Toru

    2014-01-01

    This book reviews a variety of methods for wave-based acoustic simulation and recent applications to architectural and environmental acoustic problems. Following an introduction providing an overview of computational simulation of sound environment, the book is in two parts: four chapters on methods and four chapters on applications. The first part explains the fundamentals and advanced techniques for three popular methods, namely, the finite-difference time-domain method, the finite element method, and the boundary element method, as well as alternative time-domain methods. The second part demonstrates various applications to room acoustics simulation, noise propagation simulation, acoustic property simulation for building components, and auralization. This book is a valuable reference that covers the state of the art in computational simulation for architectural and environmental acoustics.  
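
    Of the wave-based methods listed, the finite-difference time-domain method is the easiest to sketch. Below is a minimal 1-D illustration in Python/NumPy, with leapfrog updates of pressure and particle velocity on staggered grids; all constants are illustrative, not taken from the book.

        import numpy as np

        c, rho = 343.0, 1.21            # sound speed (m/s), air density (kg/m^3)
        dx = 0.01                       # grid spacing (m)
        dt = 0.5 * dx / c               # satisfies the 1-D CFL stability limit
        n = 400
        p = np.zeros(n)                 # pressure at cell centers
        u = np.zeros(n + 1)             # particle velocity at cell faces

        for step in range(1000):
            u[1:-1] -= dt / (rho * dx) * (p[1:] - p[:-1])      # momentum equation
            p -= dt * rho * c**2 / dx * (u[1:] - u[:-1])       # continuity equation
            p[n // 2] += np.exp(-((step - 60) / 20.0) ** 2)    # soft Gaussian source
        # u[0] = u[-1] = 0 throughout: rigid (perfectly reflecting) boundaries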

  16. Computer Simulations, Disclosure and Duty of Care

    Directory of Open Access Journals (Sweden)

    John Barlow

    2006-05-01

    Full Text Available Computer simulations provide cost effective methods for manipulating and modeling 'reality'. However they are not real. They are imitations of a system or event, real or fabricated, and as such mimic, duplicate or represent that system or event. The degree to which a computer simulation aligns with and reproduces the 'reality' of the system or event it attempts to mimic or duplicate depends upon many factors including the efficiency of the simulation algorithm, the processing power of the computer hardware used to run the simulation model, and the expertise, assumptions and prejudices of those concerned with designing, implementing and interpreting the simulation output. Computer simulations in particular are increasingly replacing physical experimentation in many disciplines, and as a consequence are used to underpin quite significant decision-making which may impact on 'innocent' third parties. In this context, this paper examines two interrelated issues: Firstly, how much and what kind of information should a simulation builder be required to disclose to potential users of the simulation? Secondly, what are the implications for a decision-maker who acts on the basis of their interpretation of a simulation output without any reference to its veracity, which may in turn compromise the safety of other parties?

  17. Stochastic Simulations on the Cellular Wave Computers

    CERN Document Server

    Ercsey-Ravasz, M; Neda, Z

    2006-01-01

    The computational paradigm represented by Cellular Neural/nonlinear Networks (CNN) and the CNN Universal Machine (CNN-UM) as a Cellular Wave Computer gives new perspectives for computational physics. Many numerical problems and simulations can be elegantly addressed on this fully parallelized and analogic architecture. Here we study the possibility of performing stochastic simulations on this chip. First a realistic random number generator is implemented on the CNN-UM, and then as an example the two-dimensional Ising model is studied by Monte Carlo type simulations. The results obtained on an experimental version of the CNN-UM with 128 x 128 cells are in good agreement with the results obtained on digital computers. Computational time measurements suggest that the developing trend of the CNN-UM chips (increasing the lattice size and the number of local logic memories) will assure an important advantage for the CNN-UM in the near future.
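
    As a digital-computer reference point for the experiment described, here is a minimal Metropolis Monte Carlo sketch of the 2-D Ising model on a 128 x 128 lattice (J = 1, periodic boundaries); the temperature and sweep counts are illustrative, not the paper's.

        import numpy as np

        rng = np.random.default_rng(42)
        L, beta = 128, 0.44                      # near the 2-D critical point
        spins = rng.choice([-1, 1], size=(L, L))

        def sweep(spins, beta, rng):
            for _ in range(spins.size):          # one attempted flip per site
                i, j = rng.integers(0, L, size=2)
                nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
                      + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
                dE = 2.0 * spins[i, j] * nb      # energy cost of flipping (i, j)
                if dE <= 0 or rng.random() < np.exp(-beta * dE):
                    spins[i, j] *= -1

        for _ in range(100):
            sweep(spins, beta, rng)
        magnetization = abs(spins.mean())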

  18. An introduction to statistical computing a simulation-based approach

    CERN Document Server

    Voss, Jochen

    2014-01-01

    A comprehensive introduction to sampling-based methods in statistical computing. The use of computers in mathematics and statistics has opened up a wide range of techniques for studying otherwise intractable problems. Sampling-based simulation techniques are now an invaluable tool for exploring statistical models. This book gives a comprehensive introduction to the exciting area of sampling-based methods. An Introduction to Statistical Computing introduces the classical topics of random number generation and Monte Carlo methods. It also includes some advanced met
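
    As a taste of the classical topics mentioned, here is a minimal sketch of inverse-transform sampling, one of the standard random number generation techniques such a book covers (the details below are assumed, not quoted from the book): for Exp(lambda), F(x) = 1 - exp(-lambda*x), so x = -log(1 - u)/lambda maps a uniform draw u to an exponential sample.

        import numpy as np

        rng = np.random.default_rng(7)
        lam = 2.0
        u = rng.random(100_000)           # U(0, 1) draws
        x = -np.log(1.0 - u) / lam        # inverse-transform Exp(lam) samples
        print(x.mean())                   # should be close to 1/lam = 0.5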

  19. Advances in Computational Techniques to Study GPCR-Ligand Recognition.

    Science.gov (United States)

    Ciancetta, Antonella; Sabbadin, Davide; Federico, Stephanie; Spalluto, Giampiero; Moro, Stefano

    2015-12-01

    G-protein-coupled receptors (GPCRs) are among the most intensely investigated drug targets. The recent revolutions in protein engineering and molecular modeling algorithms have overturned the research paradigm in the GPCR field. While the numerous ligand-bound X-ray structures determined have provided invaluable insights into GPCR structure and function, the development of algorithms exploiting graphics processing units (GPUs) has made the simulation of GPCRs in explicit lipid-water environments feasible within reasonable computation times. In this review we present a survey of the recent advances in structure-based drug design approaches with a particular emphasis on the elucidation of the ligand recognition process in class A GPCRs by means of membrane molecular dynamics (MD) simulations. PMID:26538318

  20. Advanced studies on Simulation Methodologies for very Complicated Fracture Phenomena

    International Nuclear Information System (INIS)

    Although computational techniques are nowadays well developed, extremely complicated fracture phenomena are still very difficult for general engineers and researchers to simulate. To overcome the many difficulties in such simulations, we have developed not only simulation methodologies but also the underlying theoretical basis and concepts. Extremely complicated fracture patterns are often observed, especially in dynamic fracture phenomena such as dynamic crack branching, kinking and curving. For example, although humankind, from primitive men to modern scientists such as Albert Einstein, had watched the post-mortem patterns of dynamic crack branching, the governing condition for the onset of the phenomenon remained unsolved until our experimental study. From these studies, we found the governing condition of dynamic crack bifurcation to be as follows: when the total energy flux per unit time into a propagating crack tip reaches the material crack resistance, the crack branches into two cracks [total energy flux criterion]. The crack branches repeatedly whenever the criterion is satisfied. Furthermore, complexities also arise due to time dependence and/or deformation dependence. In order to make it possible to simulate such extremely complicated fracture phenomena, we developed many original advanced computational methods and technologies: (i) a moving finite element method based on Delaunay automatic triangulation (MFEMBOAT); (ii) a path-independent, equivalent domain integral expression of the dynamic J integral associated with a continuous auxiliary function; (iii) mixed-phase path-prediction mode simulation; (iv) an implicit path prediction criterion. In this paper, these advanced computational methods are thoroughly explained together with successful comparisons with experimental results. Multiple dynamic crack branching phenomena may be the most complicated fractures due to their complicated fracture paths and time dependence (transient
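
    A compact way to write the total energy flux criterion stated above, with notation chosen here for illustration rather than taken from the paper (G is the dynamic energy release rate and v the crack-tip speed, so their product is the energy flux per unit time into the tip; \Phi_c denotes the material crack resistance expressed as a critical flux):

        \Phi(t) \;=\; G(t)\, v(t) \;\ge\; \Phi_c

    Each time the inequality is satisfied, the propagating crack branches again.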

  1. Technology computer aided design simulation for VLSI MOSFET

    CERN Document Server

    Sarkar, Chandan Kumar

    2013-01-01

    Responding to recent developments and a growing VLSI circuit manufacturing market, Technology Computer Aided Design: Simulation for VLSI MOSFET examines advanced MOSFET processes and devices through TCAD numerical simulations. The book provides a balanced summary of TCAD and MOSFET basic concepts, equations, physics, and new technologies related to TCAD and MOSFET. A firm grasp of these concepts allows for the design of better models, thus streamlining the design process, saving time and money. This book places emphasis on the importance of modeling and simulations of VLSI MOS transistors and

  2. Computer simulation to arc spraying

    Institute of Scientific and Technical Information of China (English)

    梁志芳; 李午申; 王迎娜

    2004-01-01

    The arc spraying process is divided into two stages: the first stage is the atomization-spraying stream (ASS) and the second is spraying deposition (SD). The state of research on the physical models and corresponding governing equations of both stages is then described. Based on this analysis, the following conclusions are drawn. Heat and mass transfer models in two or three dimensions should be established for the ASS stage to analyze the dynamic and thermal behavior of the overheated droplets in greater depth. The statistical behavior of overheated droplets should be studied further by connecting simulation with experiments. More appropriate validation experiments should be designed for flattening simulation to refine the models of the SD stage.

  3. Computer-simulated phacoemulsification improvements

    Science.gov (United States)

    Soederberg, Per G.; Laurell, Carl-Gustaf; Artzen, D.; Nordh, Leif; Skarman, Eva; Nordqvist, P.; Andersson, Mats

    2002-06-01

    A simulator for phacoemulsification cataract extraction is developed. A three-dimensional visual interface and foot pedals for phacoemulsification power, x-y positioning, zoom and focus were established. An algorithm that allows real-time visual feedback of the surgical field was developed. Cataract surgery is the most common surgical procedure. The operation requires input from both feet and both hands and provides visual feedback through the operation microscope, essentially without tactile feedback. Experience demonstrates that the number of complications for an experienced surgeon learning phacoemulsification decreases exponentially, reaching close to the asymptote after the first 500 procedures, despite initial wet-lab training on animal eyes. Simulator training is anticipated to decrease training time, decrease the complication rate for the beginner, and reduce expensive supervision by a high-volume surgeon.

  4. Computer simulation of aeolian bedforms

    Institute of Scientific and Technical Information of China (English)

    苗天德; 慕青松; 武生智

    2001-01-01

    A discrete model is set up using the cellular automaton method and applied to simulate the formation and evolution of aeolian bedforms. The calculated bedforms resemble the actual shapes of natural sand ripples and dunes. This reveals that sand movement is a typical nonlinear dynamical process, and that the nested configuration of sand ripples, dunes and draas is a self-organized system with a fractal characteristic, evolving simultaneously at various scales in the sand-airflow.
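
    The abstract does not give the model's exact rules, but a Werner-type cellular automaton is the classic discrete model of this kind; here is a minimal hedged sketch (omitting the angle-of-repose avalanche rule for brevity), in which sand slabs hop a fixed distance downwind and deposit more readily on sandy cells, which is enough for stripe-like bedforms to self-organize:

        import numpy as np

        rng = np.random.default_rng(3)
        L, hop = 128, 5
        h = rng.integers(0, 4, size=(L, L))          # slab heights on the grid

        for _ in range(50 * L * L):                  # many erosion events
            i, j = rng.integers(0, L, size=2)
            if h[i, j] == 0:
                continue                             # nothing to erode here
            h[i, j] -= 1                             # pick up one slab
            jj = j
            while True:
                jj = (jj + hop) % L                  # transport downwind (periodic)
                p_dep = 0.6 if h[i, jj] > 0 else 0.4 # stickier landing on sand
                if rng.random() < p_dep:
                    h[i, jj] += 1                    # deposit and stop
                    break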

  5. Advanced numerical techniques in core simulations

    International Nuclear Information System (INIS)

    Whole-core simulations are among the most CPU-intensive calculations in reactor physics design and analyses. For a designer it is imperative to perform these calculations with good accuracy and in the least time possible, in order to try out various options. It is important for code developers to use techniques involving minimal approximations and to apply the most recent numerical methods in tandem with the huge computing power available today. In the presented paper, some of these methods are discussed. (author)

  6. Computer simulations applied in materials

    International Nuclear Information System (INIS)

    This workshop takes stock of the simulation methods applied to nuclear materials and discusses the conditions in which these methods can predict physical results when no experimental data are available. The main topic concerns the radiation effects in oxides and includes also the behaviour of fission products in ceramics, the diffusion and segregation phenomena and the thermodynamical properties under irradiation. This document brings together a report of the previous 2002 workshop and the transparencies of 12 presentations among the 15 given at the workshop: accommodation of uranium and plutonium in pyrochlores; radiation effects in La2Zr2O7 pyrochlores; first principle calculations of defects formation energies in the Y2(Ti,Sn,Zr)2O7 pyrochlore system; an approximate approach to predicting radiation tolerant materials; molecular dynamics study of the structural effects of displacement cascades in UO2; composition defect maps for A3+B3+O3 perovskites; NMR characterization of radiation damaged materials: using simulation to interpret the data; local structure in damaged zircon: a first principle study; simulation studies on SiC; insertion and diffusion of He in 3C-SiC; a review of helium in silica; self-trapped holes in amorphous silicon dioxide: their short-range structure revealed from electron spin resonance and optical measurements and opportunities for inferring intermediate range structure by theoretical modelling. (J.S.)

  7. Computer simulation of geologic systems

    International Nuclear Information System (INIS)

    The Geologic Simulation Model (GSM) developed under the Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) project at the Pacific Northwest Laboratory for the Department of Energy is a quasi-deterministic process-response model which simulates the development of the geologic and hydrologic systems of a groundwater basin for a million years into the future. Effects of natural processes on the groundwater hydrologic system are modeled principally by rate equations. The combined effects and synergistic interactions of different processes are approximated by linear superposition of their effects during discrete time intervals in a stepwise-integration approach. The results of the GSM simulations are not yet defensible. They are promising, and the general behavior of the GSM over the near-term (20,000 years) and long-term (million years) is plausible. Thus, in terms of a demonstration of the GSM technology alone, the results indicate that the development effort was a success, and this report indicates what additional effort is required to make the GSM defensible. However, the GSM is a part of a coordinated performance analysis which involves other models as well, and is intended as a primary guide to analyses to be performed in addition to that of the present system. The usefulness of the GSM results to the demonstration of a coordinated performance analysis technology must be determined by considering the validity of the results and how they may be applied realistically (unmodified) to guiding more detailed analyses

  8. Computer simulations applied in materials

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2003-07-01

    This workshop takes stock of the simulation methods applied to nuclear materials and discusses the conditions in which these methods can predict physical results when no experimental data are available. The main topic concerns the radiation effects in oxides and includes also the behaviour of fission products in ceramics, the diffusion and segregation phenomena and the thermodynamical properties under irradiation. This document brings together a report of the previous 2002 workshop and the transparencies of 12 presentations among the 15 given at the workshop: accommodation of uranium and plutonium in pyrochlores; radiation effects in La2Zr2O7 pyrochlores; first principle calculations of defects formation energies in the Y2(Ti,Sn,Zr)2O7 pyrochlore system; an approximate approach to predicting radiation tolerant materials; molecular dynamics study of the structural effects of displacement cascades in UO2; composition defect maps for A3+B3+O3 perovskites; NMR characterization of radiation damaged materials: using simulation to interpret the data; local structure in damaged zircon: a first principle study; simulation studies on SiC; insertion and diffusion of He in 3C-SiC; a review of helium in silica; self-trapped holes in amorphous silicon dioxide: their short-range structure revealed from electron spin resonance and optical measurements and opportunities for inferring intermediate range structure by theoretical modelling. (J.S.)

  9. Computer simulation boosts automation in the stockyard

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-04-01

    Today's desktop computers and advanced software keep pace with handling equipment, reaching new heights of sophistication with graphic simulation able to show precisely what is happening, and what could happen, in the coal terminal's stockyard. The article describes an innovative coal terminal nearing completion on the Pacific coast at Lazaro Cardenas in Mexico, the Petracalco terminal, where coal is unloaded, stored and fed to the nearby power plant of Pdte Plutarco Elias Calles. The R&D department of the Italian company Techint, Italimpianti has developed MHATIS, a sophisticated software system for marine terminal management here, allowing analysis of performance with the use of graphical animation. Strategies can be tested before being put into practice, and likely power station demand can be predicted. The design and operation of the MHATIS system are explained. Other integrated coal handling plants described in the article are the one developed by the then PWH (renamed Krupp Foerdertechnik) of Germany for the Israel Electric Corporation and the installation by the same company of a further bucketwheel for a redesigned coal stockyard at the Port of Hamburg operated by Hansaport. 1 fig., 4 photos.

  10. Simulations of Probabilities for Quantum Computing

    Science.gov (United States)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities, and their link to quantum computations, are discussed.

  11. Salesperson Ethics: An Interactive Computer Simulation

    Science.gov (United States)

    Castleberry, Stephen

    2014-01-01

    A new interactive computer simulation designed to teach sales ethics is described. Simulation learner objectives include gaining a better understanding of legal issues in selling; realizing that ethical dilemmas do arise in selling; realizing the need to be honest when selling; seeing that there are conflicting demands from a salesperson's…

  12. New Developments in the Simulation of Advanced Accelerator Concepts

    International Nuclear Information System (INIS)

    Improved computational methods are essential to the diverse and rapidly developing field of advanced accelerator concepts. We present an overview of some computational algorithms for laser-plasma concepts and high-brightness photocathode electron sources. In particular, we discuss algorithms for reduced laser-plasma models that can be orders of magnitude faster than their higher-fidelity counterparts, as well as important on-going efforts to include relevant additional physics that has been previously neglected. As an example of the former, we present 2D laser wakefield accelerator simulations in an optimal Lorentz frame, demonstrating >10 GeV energy gain of externally injected electrons over a 2 m interaction length, showing good agreement with predictions from scaled simulations and theory, with a speedup factor of ∼2,000 as compared to standard particle-in-cell.

  13. Computer simulation of gear tooth manufacturing processes

    Science.gov (United States)

    Mavriplis, Dimitri; Huston, Ronald L.

    1990-01-01

    The use of computer graphics to simulate gear tooth manufacturing procedures is discussed. An analytical basis for the simulation is established for spur gears. The simulation itself, however, is developed not only for spur gears, but for straight bevel gears as well. The applications of the developed procedure extend from the development of finite element models of heretofore intractable geometrical forms, to exploring the fabrication of nonstandard tooth forms.

  14. The Consortium for Advanced Simulation of Light Water Reactors

    International Nuclear Information System (INIS)

    The Consortium for Advanced Simulation of Light Water Reactors (CASL) is a DOE Energy Innovation Hub for modeling and simulation of nuclear reactors. It brings together an exceptionally capable team from national labs, industry and academia that will apply existing modeling and simulation capabilities and develop advanced capabilities to create a usable environment for predictive simulation of light water reactors (LWRs). This environment, designated as the Virtual Environment for Reactor Applications (VERA), will incorporate science-based models, state-of-the-art numerical methods, modern computational science and engineering practices, and uncertainty quantification (UQ) and validation against data from operating pressurized water reactors (PWRs). It will couple state-of-the-art fuel performance, neutronics, thermal-hydraulics (T-H), and structural models with existing tools for systems and safety analysis and will be designed for implementation on both today's leadership-class computers and the advanced architecture platforms now under development by the DOE. CASL focuses on a set of challenge problems such as CRUD induced power shift and localized corrosion, grid-to-rod fretting fuel failures, pellet clad interaction, fuel assembly distortion, etc. that encompass the key phenomena limiting the performance of PWRs. It is expected that much of the capability developed will be applicable to other types of reactors. CASL's mission is to develop and apply modeling and simulation capabilities to address three critical areas of performance for nuclear power plants: (1) reduce capital and operating costs per unit energy by enabling power uprates and plant lifetime extension, (2) reduce nuclear waste volume generated by enabling higher fuel burnup, and (3) enhance nuclear safety by enabling high-fidelity predictive capability for component performance.

  15. The Consortium for Advanced Simulation of Light Water Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Ronaldo Szilard; Hongbin Zhang; Doug Kothe; Paul Turinsky

    2011-10-01

    The Consortium for Advanced Simulation of Light Water Reactors (CASL) is a DOE Energy Innovation Hub for modeling and simulation of nuclear reactors. It brings together an exceptionally capable team from national labs, industry and academia that will apply existing modeling and simulation capabilities and develop advanced capabilities to create a usable environment for predictive simulation of light water reactors (LWRs). This environment, designated as the Virtual Environment for Reactor Applications (VERA), will incorporate science-based models, state-of-the-art numerical methods, modern computational science and engineering practices, and uncertainty quantification (UQ) and validation against data from operating pressurized water reactors (PWRs). It will couple state-of-the-art fuel performance, neutronics, thermal-hydraulics (T-H), and structural models with existing tools for systems and safety analysis and will be designed for implementation on both today's leadership-class computers and the advanced architecture platforms now under development by the DOE. CASL focuses on a set of challenge problems such as CRUD induced power shift and localized corrosion, grid-to-rod fretting fuel failures, pellet clad interaction, fuel assembly distortion, etc. that encompass the key phenomena limiting the performance of PWRs. It is expected that much of the capability developed will be applicable to other types of reactors. CASL's mission is to develop and apply modeling and simulation capabilities to address three critical areas of performance for nuclear power plants: (1) reduce capital and operating costs per unit energy by enabling power uprates and plant lifetime extension, (2) reduce nuclear waste volume generated by enabling higher fuel burnup, and (3) enhance nuclear safety by enabling high-fidelity predictive capability for component performance.

  16. Motion control in advanced driving simulators

    OpenAIRE

    Elloumi, Hatem

    2006-01-01

    Driving simulators are advanced devices composed of four components: a virtual scene projected on a wide screen to imitate the road and the traffic, an audio system to play the driving sounds (horn, squeal of brakes, etc.), a car cockpit (including a real dashboard, the pedals and the seat of the driver) to copy the body position and the interaction of the driver with a real vehicle and finally a robot carrying the car cockpit to provide its motion. While the first three components could be c...

  17. Computer Code for Nanostructure Simulation

    Science.gov (United States)

    Filikhin, Igor; Vlahovic, Branislav

    2009-01-01

    Due to their small size, nanostructures can have stress and thermal gradients that are larger than any macroscopic analogue. These gradients can lead to specific regions that are susceptible to failure via processes such as plastic deformation by dislocation emission, chemical debonding, and interfacial alloying. A program has been developed that rigorously simulates and predicts optoelectronic properties of nanostructures of virtually any geometrical complexity and material composition. It can be used in simulations of energy level structure, wave functions, density of states of spatially configured phonon-coupled electrons, excitons in quantum dots, quantum rings, quantum ring complexes, and more. The code can be used to calculate stress distributions and thermal transport properties for a variety of nanostructures and interfaces, transport and scattering at nanoscale interfaces and surfaces under various stress states, and alloy compositional gradients. The code allows users to perform modeling of charge transport processes through quantum-dot (QD) arrays as functions of inter-dot distance, array order versus disorder, QD orientation, shape, size, and chemical composition for applications in photovoltaics and physical properties of QD-based biochemical sensors. The code can be used to study the hot exciton formation/relaxation dynamics in arrays of QDs of different shapes and sizes at different temperatures. It also can be used to understand the relation among the deposition parameters and inherent stresses, strain deformation, heat flow, and failure of nanostructures.

  18. Atomistic computer simulations a practical guide

    CERN Document Server

    Brazdova, Veronika

    2013-01-01

    Many books explain the theory of atomistic computer simulations; this book teaches you how to run them. This introductory "how to" title enables readers to understand, plan, run, and analyze their own independent atomistic simulations, and decide which method to use and which questions to ask in their research project. It is written in a clear and precise language, focusing on a thorough understanding of the concepts behind the equations and how these are used in the simulations. As a result, readers will learn how to design the computational model and which parameters o...

  19. Cluster computing software for GATE simulations.

    Science.gov (United States)

    De Beenhouwer, Jan; Staelens, Steven; Kruecker, Dirk; Ferrer, Ludovic; D'Asseler, Yves; Lemahieu, Ignace; Rannou, Fernando R

    2007-06-01

    Geometry and tracking (GEANT4) is a Monte Carlo package designed for high energy physics experiments. It is used as the basis layer for Monte Carlo simulations of nuclear medicine acquisition systems in GEANT4 Application for Tomographic Emission (GATE). GATE allows the user to realistically model experiments using accurate physics models and time synchronization for detector movement through a script language contained in a macro file. The downside of this high accuracy is long computation time. This paper describes a platform-independent computing approach for running GATE simulations on a cluster of computers in order to reduce the overall simulation time. Our software automatically creates fully resolved, nonparametrized macros accompanied by an on-the-fly generated, cluster-specific submit file used to launch the simulations. The scalability of GATE simulations on a cluster is investigated for two imaging modalities, positron emission tomography (PET) and single photon emission computed tomography (SPECT). Due to a higher sensitivity, PET simulations are characterized by relatively high data output rates that create rather large output files. SPECT simulations, on the other hand, have lower data output rates but require a long collimator setup time. Both of these characteristics hamper scalability as a function of the number of CPUs. The scalability of PET simulations is improved here by the development of a fast output merger. The scalability of SPECT simulations is improved by greatly reducing the collimator setup time. Accordingly, these two new developments result in higher scalability for both PET and SPECT simulations and reduce the computation time to more practical values. PMID:17654895
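
    The macro-splitting approach described above can be sketched compactly. The following Python fragment is a sketch of the idea only, not GATE's actual jobsplitter; the two macro commands in the template are indicative, and the seed scheme is an assumption. It divides the requested number of primaries among cluster jobs and gives each job a distinct random seed so the per-job outputs can later be merged:

      # Split one GATE run into independent cluster jobs (illustrative sketch).
      def split_macro(template, total_primaries, n_jobs):
          macros = []
          per_job = total_primaries // n_jobs
          for job in range(n_jobs):
              # the last job absorbs the remainder so the counts sum exactly
              n = per_job + (total_primaries % n_jobs if job == n_jobs - 1 else 0)
              macros.append(template.format(seed=1000 + job, primaries=n))
          return macros

      template = ("/gate/random/setEngineSeed {seed}\n"
                  "/gate/application/setTotalNumberOfPrimaries {primaries}\n")
      for i, m in enumerate(split_macro(template, 10_000_000, 4)):
          print(f"--- job {i} ---\n{m}")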

  20. Computer Simulations of Lipid Bilayers and Proteins

    DEFF Research Database (Denmark)

    Sonne, Jacob

    2006-01-01

    The importance of computer simulations in lipid bilayer research has become more prominent over the last couple of decades, and as computers get even faster, simulations will play an increasingly important part in understanding the processes that take place in and across cell membranes. This thesis, entitled Computer simulations of lipid bilayers and proteins, describes two molecular dynamics (MD) simulation studies of pure lipid bilayers as well as a study of a transmembrane protein embedded in a lipid bilayer matrix. Below follows a brief overview of the thesis. Chapter 1. This chapter is a short… Pressure profile calculations in lipid bilayers: A lipid bilayer is merely ~5 nm thick, but the lateral pressure (parallel to the bilayer plane) varies by several hundred bar over this short distance (normal to the bilayer). These variations in the lateral pressure are commonly referred to as the pressure...

  1. Computer-Aided Simulation of Mastoidectomy

    Institute of Scientific and Technical Information of China (English)

    CHEN He-xin; MA Zhi-chao; Wang Zhang-feng; GUO Jie-bo; WEN Wei-ping; XU Geng

    2008-01-01

    Objective To establish a three-dimensional model of the temporal bone using CT scan images for study of temporal bone structures and simulation of mastoidectomy procedures. Methods CT scan images from 6 individuals (12 temporal bones) were used to reconstruct the Fallopian canal, internal auditory canal, cochlea, semicircular canals, sigmoid sinus, posterior fossa floor and jugular bulb on a computer platform. Their anatomical relations within the temporal bone were restored in the computed model. The same model was used to simulate mastoidectomy procedures. Results The reconstructed computer model provided accurate and clear three-dimensional images of temporal bone structures. Simulation of mastoidectomy using these images provided procedural experiences closely mimicking the real surgical procedure. Conclusion Computer-aided three-dimensional reconstruction of temporal bone structures using CT scan images is a useful tool in surgical simulation and can aid surgical procedure planning.

  2. Creating science simulations through Computational Thinking Patterns

    Science.gov (United States)

    Basawapatna, Ashok Ram

    Computational thinking aims to outline fundamental skills from computer science that everyone should learn. As currently defined, with help from the National Science Foundation (NSF), these skills include problem formulation, logically organizing data, automating solutions through algorithmic thinking, and representing data through abstraction. One aim of the NSF is to integrate these and other computational thinking concepts into the classroom. End-user programming tools offer a unique opportunity to accomplish this goal. An end-user programming tool that allows students with little or no prior experience the ability to create simulations based on phenomena they see in-class could be a first step towards meeting most, if not all, of the above computational thinking goals. This thesis describes the creation, implementation and initial testing of a programming tool, called the Simulation Creation Toolkit, with which users apply high-level agent interactions called Computational Thinking Patterns (CTPs) to create simulations. Employing Computational Thinking Patterns obviates lower behavior-level programming and allows users to directly create agent interactions in a simulation by making an analogy with real world phenomena they are trying to represent. Data collected from 21 sixth grade students with no prior programming experience and 45 seventh grade students with minimal programming experience indicates that this is an effective first step towards enabling students to create simulations in the classroom environment. Furthermore, an analogical reasoning study that looked at how users might apply patterns to create simulations from high-level descriptions with little guidance shows promising results. These initial results indicate that the high-level strategy employed by the Simulation Creation Toolkit is a promising strategy towards incorporating Computational Thinking concepts in the classroom environment.

  3. Recovery Act: Advanced Interaction, Computation, and Visualization Tools for Sustainable Building Design

    Energy Technology Data Exchange (ETDEWEB)

    Greenberg, Donald P. [Cornell Univ., Ithaca, NY (United States); Hencey, Brandon M. [Cornell Univ., Ithaca, NY (United States)

    2013-08-20

    Current building energy simulation technology requires excessive labor, time and expertise to create building energy models, excessive computational time for accurate simulations and difficulties with the interpretation of the results. These deficiencies can be ameliorated using modern graphical user interfaces and algorithms which take advantage of modern computer architectures and display capabilities. To prove this hypothesis, we developed an experimental test bed for building energy simulation. This novel test bed environment offers an easy-to-use interactive graphical interface, provides access to innovative simulation modules that run at accelerated computational speeds, and presents new graphics visualization methods to interpret simulation results. Our system offers the promise of dramatic ease of use in comparison with currently available building energy simulation tools. Its modular structure makes it suitable for early stage building design, as a research platform for the investigation of new simulation methods, and as a tool for teaching concepts of sustainable design. Improvements in the accuracy and execution speed of many of the simulation modules are based on the modification of advanced computer graphics rendering algorithms. Significant performance improvements are demonstrated in several computationally expensive energy simulation modules. The incorporation of these modern graphical techniques should advance the state of the art in the domain of whole building energy analysis and building performance simulation, particularly at the conceptual design stage when decisions have the greatest impact. More importantly, these better simulation tools will enable the transition from prescriptive to performative energy codes, resulting in better, more efficient designs for our future built environment.

  4. Modeling and Computer Simulation: Molecular Dynamics and Kinetic Monte Carlo

    Energy Technology Data Exchange (ETDEWEB)

    Wirth, B.D.; Caturla, M.J.; Diaz de la Rubia, T.

    2000-10-10

    Recent years have witnessed tremendous advances in the realistic multiscale simulation of complex physical phenomena, such as irradiation and aging effects of materials, made possible by the enormous progress achieved in computational physics for calculating reliable, yet tractable interatomic potentials and the vast improvements in computational power and parallel computing. As a result, computational materials science is emerging as an important complement to theory and experiment to provide fundamental materials science insight. This article describes the atomistic modeling techniques of molecular dynamics (MD) and kinetic Monte Carlo (KMC), and an example of their application to radiation damage production and accumulation in metals. It is important to note at the outset that the primary objective of atomistic computer simulation should be obtaining physical insight into atomic-level processes. Classical molecular dynamics is a powerful method for obtaining insight about the dynamics of physical processes that occur on relatively short time scales. Current computational capability allows treatment of atomic systems containing as many as 10^9 atoms for times on the order of 100 ns (10^-7 s). The main limitation of classical MD simulation is the relatively short times accessible. Kinetic Monte Carlo provides the ability to reach macroscopic times by modeling diffusional processes and time-scales rather than individual atomic vibrations. Coupling MD and KMC has developed into a powerful, multiscale tool for the simulation of radiation damage in metals.
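
    As a concrete illustration of the KMC idea summarized above, the following minimal Python sketch implements the residence-time (BKL) algorithm: the clock advances by event rates rather than by individual atomic vibrations. The rates below are illustrative placeholders, not values from the article:

      import math, random

      def kmc_step(rates, t, rng=random):
          """One residence-time KMC step over a fixed event catalog."""
          total = sum(rates)
          r = rng.random() * total
          cum = 0.0
          for i, rate in enumerate(rates):   # pick event i with probability rate/total
              cum += rate
              if r < cum:
                  break
          t += -math.log(rng.random()) / total   # exponentially distributed waiting time
          return i, t

      t = 0.0
      rates = [1e6, 5e4, 2e3]   # e.g., vacancy hop, cluster hop, detrapping (illustrative)
      for _ in range(5):
          event, t = kmc_step(rates, t)
          print(f"event {event} at t = {t:.3e} s")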

  5. Application of Computer Simulation Modeling to Medication Administration Process Redesign

    OpenAIRE

    Huynh, Nathan; Snyder, Rita; Vidal, Jose M.; Tavakoli, Abbas S.; Cai, Bo

    2012-01-01

    The medication administration process (MAP) is one of the most high-risk processes in health care. MAP workflow redesign can precipitate both unanticipated and unintended consequences that can lead to new medication safety risks and workflow inefficiencies. Thus, it is necessary to have a tool to evaluate the impact of redesign approaches in advance of their clinical implementation. This paper discusses the development of an agent-based MAP computer simulation model that can be used to assess...

  6. CONSTRUCTION COST INTEGRATED CONTROL BASED ON COMPUTER SIMULATION

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Construction cost control is a complex systems engineering problem. The traditional control method cannot dynamically control construction cost in advance because of its hysteresis. This paper proposes a computer-simulation-based construction cost integrated control method, which systematically combines cost with PERT, so that the construction cost can be predicted and optimized effectively. The new method overcomes the hysteresis of the traditional systems, and is a distinct improvement over them in effect and practicality.
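
    The coupling of cost to PERT can be illustrated with a small Monte Carlo sketch in Python. This is not the paper's model: the activity durations, cost figures, and the triangular stand-in for the PERT beta distribution are all illustrative assumptions:

      import random

      activities = [  # (optimistic, most likely, pessimistic) durations in days -- made up
          (8, 10, 14),
          (4, 6, 10),
          (10, 15, 22),
      ]
      DAILY_INDIRECT_COST = 2500.0    # illustrative
      FIXED_DIRECT_COST = 180000.0    # illustrative

      def sample_duration(o, m, p):
          # crude stand-in for the PERT beta distribution
          return random.triangular(o, p, m)

      costs = []
      for _ in range(10000):
          duration = sum(sample_duration(*a) for a in activities)  # activities in series
          costs.append(FIXED_DIRECT_COST + DAILY_INDIRECT_COST * duration)
      print(f"mean predicted cost: {sum(costs) / len(costs):,.0f}")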

  7. Computer simulation of thermal plant operations

    CERN Document Server

    O'Kelly, Peter

    2012-01-01

    This book describes thermal plant simulation, that is, dynamic simulation of plants which produce, exchange and otherwise utilize heat as their working medium. Directed at chemical, mechanical and control engineers involved with operations, control and optimization, and operator training, the book gives the mathematical formulation and use of simulation models of the equipment and systems typically found in these industries. The author has adopted a fundamental approach to the subject. The initial chapters provide an overview of simulation concepts and describe a suitable computer environment.

  8. Advances in NLTE Modeling for Integrated Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Scott, H A; Hansen, S B

    2009-07-08

    The last few years have seen significant progress in constructing the atomic models required for non-local thermodynamic equilibrium (NLTE) simulations. Along with this has come an increased understanding of the requirements for accurately modeling the ionization balance, energy content and radiative properties of different elements for a wide range of densities and temperatures. Much of this progress is the result of a series of workshops dedicated to comparing the results from different codes and computational approaches applied to a series of test problems. The results of these workshops emphasized the importance of atomic model completeness, especially in doubly excited states and autoionization transitions, to calculating ionization balance, and the importance of accurate, detailed atomic data to producing reliable spectra. We describe a simple screened-hydrogenic model that calculates NLTE ionization balance with surprising accuracy, at a low enough computational cost for routine use in radiation-hydrodynamics codes. The model incorporates term splitting, Δn = 0 transitions, and approximate UTA widths for spectral calculations, with results comparable to those of much more detailed codes. Simulations done with this model have been increasingly successful at matching experimental data for laser-driven systems and hohlraums. Accurate and efficient atomic models are just one requirement for integrated NLTE simulations. Coupling the atomic kinetics to hydrodynamics and radiation transport constrains both discretizations and algorithms to retain energy conservation, accuracy and stability. In particular, the strong coupling between radiation and populations can require either very short timesteps or significantly modified radiation transport algorithms to account for NLTE material response. Considerations such as these continue to provide challenges for NLTE simulations.

  9. Enabling Computational Technologies for Terascale Scientific Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Ashby, S.F.

    2000-08-24

    We develop scalable algorithms and object-oriented code frameworks for terascale scientific simulations on massively parallel processors (MPPs). Our research in multigrid-based linear solvers and adaptive mesh refinement enables Laboratory programs to use MPPs to explore important physical phenomena. For example, our research aids stockpile stewardship by making practical detailed 3D simulations of radiation transport. The need to solve large linear systems arises in many applications, including radiation transport, structural dynamics, combustion, and flow in porous media. These systems result from discretizations of partial differential equations on computational meshes. Our first research objective is to develop multigrid preconditioned iterative methods for such problems and to demonstrate their scalability on MPPs. Scalability describes how total computational work grows with problem size; it measures how effectively additional resources can help solve increasingly larger problems. Many factors contribute to scalability: computer architecture, parallel implementation, and choice of algorithm. Scalable algorithms have been shown to decrease simulation times by several orders of magnitude.
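
    The flavor of such preconditioned iterative methods can be conveyed in a short Python sketch. Here a simple Jacobi (diagonal) preconditioner stands in for multigrid, and the 1D Poisson matrix is an illustrative stand-in for the discretized systems named above:

      import numpy as np

      def pcg(A, b, M_inv_diag, tol=1e-8, max_iter=500):
          """Preconditioned conjugate gradients with a diagonal preconditioner."""
          x = np.zeros_like(b)
          r = b - A @ x
          z = M_inv_diag * r          # apply the preconditioner
          p = z.copy()
          rz = r @ z
          for _ in range(max_iter):
              Ap = A @ p
              alpha = rz / (p @ Ap)
              x += alpha * p
              r -= alpha * Ap
              if np.linalg.norm(r) < tol:
                  break
              z = M_inv_diag * r
              rz_new = r @ z
              p = z + (rz_new / rz) * p
              rz = rz_new
          return x

      n = 100
      A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)   # 1D Poisson stencil
      b = np.ones(n)
      x = pcg(A, b, 1.0 / np.diag(A))
      print("residual norm:", np.linalg.norm(b - A @ x))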

  10. The ACP (Advanced Computer Program) multiprocessor system at Fermilab

    Energy Technology Data Exchange (ETDEWEB)

    Nash, T.; Areti, H.; Atac, R.; Biel, J.; Case, G.; Cook, A.; Fischler, M.; Gaines, I.; Hance, R.; Husby, D.

    1986-09-01

    The Advanced Computer Program at Fermilab has developed a multiprocessor system which is easy to use and uniquely cost effective for many high energy physics problems. The system is based on single board computers which cost under $2000 each to build, including 2 Mbytes of on-board memory. These standard VME modules each run experiment reconstruction code in Fortran at speeds approaching that of a VAX 11/780. Two versions have been developed: one uses Motorola's 68020 32-bit microprocessor, the other runs with AT&T's 32100. Both include the corresponding floating point coprocessor chip. The first system, when fully configured, uses 70 each of the two types of processors. A 53-processor system has been operated for several months with essentially no down time by computer operators in the Fermilab Computer Center, performing at nearly the capacity of 6 CDC Cyber 175 mainframe computers. The VME crates in which the processing "nodes" sit are connected via a high speed "Branch Bus" to one or more MicroVAX computers which act as hosts handling system resource management and all I/O in offline applications. An interface from Fastbus to the Branch Bus has been developed for online use which has been tested error free at 20 Mbytes/sec for 48 hours. ACP hardware modules are now available commercially. A major package of software, including a simulator that runs on any VAX, has been developed. It allows easy migration of existing programs to this multiprocessor environment. This paper describes the ACP Multiprocessor System and early experience with it at Fermilab and elsewhere.

  11. The ACP [Advanced Computer Program] multiprocessor system at Fermilab

    International Nuclear Information System (INIS)

    The Advanced Computer Program at Fermilab has developed a multiprocessor system which is easy to use and uniquely cost effective for many high energy physics problems. The system is based on single board computers which cost under $2000 each to build, including 2 Mbytes of on-board memory. These standard VME modules each run experiment reconstruction code in Fortran at speeds approaching that of a VAX 11/780. Two versions have been developed: one uses Motorola's 68020 32-bit microprocessor, the other runs with AT&T's 32100. Both include the corresponding floating point coprocessor chip. The first system, when fully configured, uses 70 each of the two types of processors. A 53-processor system has been operated for several months with essentially no down time by computer operators in the Fermilab Computer Center, performing at nearly the capacity of 6 CDC Cyber 175 mainframe computers. The VME crates in which the processing "nodes" sit are connected via a high speed "Branch Bus" to one or more MicroVAX computers which act as hosts handling system resource management and all I/O in offline applications. An interface from Fastbus to the Branch Bus has been developed for online use which has been tested error free at 20 Mbytes/sec for 48 hours. ACP hardware modules are now available commercially. A major package of software, including a simulator that runs on any VAX, has been developed. It allows easy migration of existing programs to this multiprocessor environment. This paper describes the ACP Multiprocessor System and early experience with it at Fermilab and elsewhere.

  12. Electric Propulsion Plume Simulations Using Parallel Computer

    Directory of Open Access Journals (Sweden)

    Joseph Wang

    2007-01-01

    A parallel, three-dimensional electrostatic PIC code is developed for large-scale electric propulsion simulations using parallel supercomputers. This code uses a newly developed immersed-finite-element particle-in-cell (IFE-PIC) algorithm designed to handle complex boundary conditions accurately while maintaining the computational speed of the standard PIC code. Domain decomposition is used in both field solve and particle push to divide the computation among processors. Two simulation studies are presented to demonstrate the capability of the code. The first is a full particle simulation of a near-thruster plume using the real ion-to-electron mass ratio. The second is a high-resolution simulation of multiple ion thruster plume interactions for a realistic spacecraft using a domain enclosing the entire solar array panel. Performance benchmarks show that the IFE-PIC achieves a high parallel efficiency of ≥ 90%.
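
    The particle push at the heart of an electrostatic PIC code of this kind can be sketched in a few lines of Python. This 1D, normalized-units fragment is a didactic reduction only; it omits the immersed-finite-element field solve and the domain decomposition described above:

      import numpy as np

      def push_particles(x, v, E_grid, dx, dt, qm=1.0):
          """Leapfrog push on a periodic 1D grid with linear field gather."""
          n = len(E_grid)
          idx = np.floor(x / dx).astype(int) % n
          frac = x / dx - np.floor(x / dx)
          E = (1 - frac) * E_grid[idx] + frac * E_grid[(idx + 1) % n]  # gather
          v = v + qm * E * dt                  # accelerate (v staggered half a step)
          x = (x + v * dt) % (n * dx)          # advance and wrap (periodic domain)
          return x, v

      rng = np.random.default_rng(0)
      x = rng.uniform(0.0, 64.0, 10000)
      v = rng.normal(0.0, 1.0, 10000)
      E_grid = 0.01 * np.sin(2 * np.pi * np.arange(64) / 64)   # illustrative field
      for _ in range(100):
          x, v = push_particles(x, v, E_grid, dx=1.0, dt=0.1)
      print("mean kinetic energy:", 0.5 * np.mean(v**2))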

  13. Simulating Factorization with a Quantum Computer

    OpenAIRE

    Rosales, Jose Luis

    2015-01-01

    Modern cryptography is largely based on complexity assumptions, for example, the ubiquitous RSA is based on the supposed complexity of the prime factorization problem. Thus, it is of fundamental importance to understand how a quantum computer would eventually weaken these algorithms. In this paper, one follows Feynman's prescription for a computer to simulate the physics corresponding to the algorithm of factoring a large number $N$ into primes. Using Dirac-Jordan transformation theory one tr...

  14. General Agreement on Tariff and Trade Negotiations: A Computer-Based Simulation.

    Science.gov (United States)

    Manrique, Gabriel G.

    This paper recommends the use of a computer simulation about trade and tariff negotiations to reinforce and apply principles learned in undergraduate international trade courses and to provide students with an opportunity to use the advanced features of Symphony, a computer spreadsheet. This simulation is a game in which both the class and…

  15. Computational intelligence for big data analysis frontier advances and applications

    CERN Document Server

    Dehuri, Satchidananda; Sanyal, Sugata

    2015-01-01

    The work presented in this book is a combination of theoretical advancements of big data analysis, cloud computing, and their potential applications in scientific computing. The theoretical advancements are supported with illustrative examples and their applications in handling real-life problems. The applications are mostly drawn from real-life situations. The book discusses major issues pertaining to big data analysis using computational intelligence techniques and some issues of cloud computing. An elaborate bibliography is provided at the end of each chapter. The material in this book includes concepts, figures, graphs, and tables to guide researchers in the area of big data analysis and cloud computing.

  16. Large Atmospheric Computation on the Earth Simulator: The LACES Project

    Directory of Open Access Journals (Sweden)

    Michel Desgagné

    2006-01-01

    The Large Atmospheric Computation on the Earth Simulator (LACES) project is a joint initiative between Canadian and Japanese meteorological services and academic institutions that focuses on the high resolution simulation of Hurricane Earl (1998). The unique aspect of this effort is the extent of the computational domain, which covers all of North America and Europe with a grid spacing of 1 km. The Canadian Mesoscale Compressible Community (MC2) model is shown to parallelize effectively on the Japanese Earth Simulator (ES) supercomputer; however, even using the extensive computing resources of the ES Center (ESC), the full simulation for the majority of Hurricane Earl's lifecycle takes over eight days to perform and produces over 5.2 TB of raw data. Preliminary diagnostics show that the results of the LACES simulation for the tropical stage of Hurricane Earl's lifecycle compare well with available observations for the storm. Further studies involving advanced diagnostics have commenced, taking advantage of the uniquely large spatial extent of the high resolution LACES simulation to investigate multiscale interactions in the hurricane and its environment. It is hoped that these studies will enhance our understanding of processes occurring within the hurricane and between the hurricane and its planetary-scale environment.

  17. Advanced Technologies, Embedded and Multimedia for Human-Centric Computing

    CERN Document Server

    Chao, Han-Chieh; Deng, Der-Jiunn; Park, James; HumanCom and EMC 2013

    2014-01-01

    The theme of HumanCom and EMC is focused on the various aspects of human-centric computing for advances in computer science and its applications, embedded and multimedia computing, and provides an opportunity for academic and industry professionals to discuss the latest issues and progress in the area of human-centric computing. The theme of EMC (Advances in Embedded and Multimedia Computing) is focused on the various aspects of embedded systems, smart grid, cloud and multimedia computing, and provides an opportunity for academic and industry professionals to discuss the latest issues and progress in the area of embedded and multimedia computing. Therefore this book includes the various theories and practical applications in human-centric computing and embedded and multimedia computing.

  18. Computer simulation of tritium removal facility design

    International Nuclear Information System (INIS)

    In this study, a computer simulation of tritium diffusion out of molten salt is performed using COMSOL Multiphysics. The purpose of the simulation is to investigate the efficiency of the permeation-window-type tritium removal facility, which is proposed for tritium control in FHRs. The result of the simulation suggests that a large surface area is one of the key issues in the design of the tritium removal facility, and the simple tube bundle concept is insufficient to provide the surface area needed for an efficient tritium removal process. (author)
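
    The surface-area sensitivity noted above follows directly from steady-state Fick's law, under which the removal rate scales linearly with membrane area. A back-of-the-envelope Python sketch makes the point; every parameter value below is an illustrative placeholder, not an FHR design value:

      # Steady-state diffusive removal rate through a permeation window:
      # rate = J * A = D * A * dC / L  (Fick's first law, planar wall)
      D = 1.0e-9          # diffusivity in the window material, m^2/s (illustrative)
      thickness = 1.0e-4  # window wall thickness, m (illustrative)
      delta_c = 1.0e-3    # concentration difference across the wall, mol/m^3 (illustrative)

      def removal_rate(area_m2):
          return D * area_m2 * delta_c / thickness

      for area in (1.0, 10.0, 100.0):
          print(f"area {area:6.1f} m^2 -> removal rate {removal_rate(area):.2e} mol/s")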

  19. International conference on Advances in Intelligent Control and Innovative Computing

    CERN Document Server

    Castillo, Oscar; Huang, Xu; Intelligent Control and Innovative Computing

    2012-01-01

    In the lightning-fast world of intelligent control and cutting-edge computing, it is vitally important to stay abreast of developments that seem to follow each other without pause. This publication features the very latest and some of the very best current research in the field, with 32 revised and extended research articles written by prominent researchers in the field. Culled from contributions to the key 2011 conference Advances in Intelligent Control and Innovative Computing, held in Hong Kong, the articles deal with a wealth of relevant topics, from the most recent work in artificial intelligence and decision-supporting systems, to automated planning, modelling and simulation, signal processing, and industrial applications. Not only does this work communicate the current state of the art in intelligent control and innovative computing, it is also an illuminating guide to up-to-date topics for researchers and graduate students in the field. The quality of the contents is absolutely assured by the high pro...

  20. Simulating physical phenomena with a quantum computer

    Science.gov (United States)

    Ortiz, Gerardo

    2003-03-01

    In a keynote speech at MIT in 1981 Richard Feynman raised some provocative questions in connection to the exact simulation of physical systems using a special device named a "quantum computer" (QC). At the time it was known that deterministic simulations of quantum phenomena in classical computers required a number of resources that scaled exponentially with the number of degrees of freedom, and also that the probabilistic simulation of certain quantum problems was limited by the so-called sign or phase problem, a problem believed to be of exponential complexity. Such a QC was intended to mimic physical processes exactly the same as Nature. Certainly, remarks coming from such an influential figure generated widespread interest in these ideas, and today after 21 years there are still some open questions. What kind of physical phenomena can be simulated with a QC? How? And what are its limitations? Addressing and attempting to answer these questions is what this talk is about. Definitively, the goal of physics simulation using controllable quantum systems ("physics imitation") is to exploit quantum laws to advantage, and thus accomplish efficient imitation. Fundamental is the connection between a quantum computational model and a physical system by transformations of operator algebras. This concept is a necessary one because in Quantum Mechanics each physical system is naturally associated with a language of operators and thus can be considered as a possible model of quantum computation. The remarkable result is that an arbitrary physical system is naturally simulatable by another physical system (or QC) whenever a "dictionary" between the two operator algebras exists. I will explain these concepts and address some of Feynman's concerns regarding the simulation of fermionic systems. Finally, I will illustrate the main ideas by imitating simple physical phenomena borrowed from condensed matter physics using quantum algorithms, and present experimental

  1. Progress in Computational Simulation of Earthquakes

    Science.gov (United States)

    Donnellan, Andrea; Parker, Jay; Lyzenga, Gregory; Judd, Michele; Li, P. Peggy; Norton, Charles; Tisdale, Edwin; Granat, Robert

    2006-01-01

    GeoFEST(P) is a computer program written for use in the QuakeSim project, which is devoted to development and improvement of means of computational simulation of earthquakes. GeoFEST(P) models interacting earthquake fault systems from the fault-nucleation to the tectonic scale. The development of GeoFEST(P) has involved coupling of two programs: GeoFEST and the Pyramid Adaptive Mesh Refinement Library. GeoFEST is a message-passing-interface-parallel code that utilizes a finite-element technique to simulate evolution of stress, fault slip, and plastic/elastic deformation in realistic materials like those of faulted regions of the crust of the Earth. The products of such simulations are synthetic observable time-dependent surface deformations on time scales from days to decades. Pyramid Adaptive Mesh Refinement Library is a software library that facilitates the generation of computational meshes for solving physical problems. In an application of GeoFEST(P), a computational grid can be dynamically adapted as stress grows on a fault. Simulations on workstations using a few tens of thousands of stress and displacement finite elements can now be expanded to multiple millions of elements with greater than 98-percent scaled efficiency on many hundreds of parallel processors.

  2. Macromod: Computer Simulation For Introductory Economics

    Science.gov (United States)

    Ross, Thomas

    1977-01-01

    The Macroeconomic model (Macromod) is a computer assisted instruction simulation model designed for introductory economics courses. An evaluation of its utilization at a community college indicates that it yielded a 10 percent to 13 percent greater economic comprehension than lecture classes and that it met with high student approval. (DC)

  3. Computer simulations of the random barrier model

    DEFF Research Database (Denmark)

    Schrøder, Thomas; Dyre, Jeppe

    2002-01-01

    A brief review of experimental facts regarding ac electronic and ionic conduction in disordered solids is given, followed by a discussion of what is perhaps the simplest realistic model, the random barrier model (symmetric hopping model). Results from large-scale computer simulations are presented...
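
    A minimal version of the model can be simulated in a few lines. The following Python sketch (a 1D ring with quenched random barriers and Arrhenius jump rates, illustrative inverse temperature, arbitrary units) is a didactic reduction of the large-scale simulations referenced above:

      import math, random

      N = 1000
      random.seed(1)
      barriers = [random.random() for _ in range(N)]   # barrier on bond i <-> i+1
      beta = 5.0                                       # inverse temperature (illustrative)
      rate = lambda dE: math.exp(-beta * dE)           # Arrhenius jump rate

      pos, t = 0, 0.0
      for _ in range(100000):
          r_right = rate(barriers[pos % N])
          r_left = rate(barriers[(pos - 1) % N])
          total = r_right + r_left
          pos += 1 if random.random() * total < r_right else -1
          t += -math.log(random.random()) / total      # residence-time clock
      print(f"displacement {pos} after t = {t:.2f} (arbitrary units)")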

  4. Eliminating Computational Instability In Multibody Simulations

    Science.gov (United States)

    Watts, Gaines L.

    1994-01-01

    TWOBODY implements an improved version of the Lagrange multiplier method. The program utilizes a programming technique that eliminates computational instability in multibody simulations in which Lagrange multipliers are used. In this technique, one uses constraint equations, instead of integration, to determine coordinates that are not independent. To illustrate the technique, the program includes a simple mathematical model of a solid rocket booster and parachute connected by a frictionless swivel. Written in FORTRAN 77.
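
    The constraint-equation idea can be shown with a toy problem, here in Python rather than the original FORTRAN 77, and with a planar pendulum standing in for the booster-parachute model: only the independent coordinate is integrated, and the dependent coordinate is recovered from the constraint each step, so it cannot drift:

      import math

      L, g, dt = 1.0, 9.81, 1e-3
      theta, omega = 0.3, 0.0            # single independent degree of freedom

      for step in range(5000):
          omega += -(g / L) * math.sin(theta) * dt
          theta += omega * dt
      x = L * math.sin(theta)
      y = -math.sqrt(L**2 - x**2)        # dependent coordinate from x^2 + y^2 = L^2
      print(f"x = {x:+.4f}, y = {y:+.4f}, constraint error = {x*x + y*y - L*L:.1e}")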

  5. Computer simulations of phospholipid - membrane thermodynamic fluctuations

    DEFF Research Database (Denmark)

    Pedersen, U.R.; Peters, Günther H.J.; Schröder, T.B.

    2008-01-01

    This paper reports all-atom computer simulations of five phospholipid membranes, DMPC, DPPC, DMPG, DMPS, and DMPSH, with a focus on the thermal equilibrium fluctuations of volume, energy, area, thickness, and order parameter. For the slow fluctuations at constant temperature and pressure (defined...

  6. Computational physics simulation of classical and quantum systems

    CERN Document Server

    Scherer, Philipp O J

    2013-01-01

    This textbook presents basic and advanced computational physics in a very didactic style. It contains very well presented and simple mathematical descriptions of many of the most important algorithms used in computational physics. The first part of the book discusses the basic numerical methods. A large number of exercises and computer experiments allows the reader to study the properties of these methods. The second part concentrates on simulation of classical and quantum systems. It uses a rather general concept for the equation of motion which can be applied to ordinary and partial differential equations. Several classes of integration methods are discussed including not only the standard Euler and Runge Kutta method but also multistep methods and the class of Verlet methods which is introduced by studying the motion in Liouville space. Besides the classical methods, inverse interpolation is discussed, together with the p...

  7. Transonic wing analysis using advanced computational methods

    Science.gov (United States)

    Henne, P. A.; Hicks, R. M.

    1978-01-01

    This paper discusses the application of three-dimensional computational transonic flow methods to several different types of transport wing designs. The purpose of these applications is to evaluate the basic accuracy and limitations associated with such numerical methods. The use of such computational methods for practical engineering problems can only be justified after favorable evaluations are completed. The paper summarizes a study of both the small-disturbance and the full potential technique for computing three-dimensional transonic flows. Computed three-dimensional results are compared to both experimental measurements and theoretical results. Comparisons are made not only of pressure distributions but also of lift and drag forces. Transonic drag rise characteristics are compared. Three-dimensional pressure distributions and aerodynamic forces, computed from the full potential solution, compare reasonably well with experimental results for a wide range of configurations and flow conditions.

  8. Cosmological Simulations on a Grid of Computers

    CERN Document Server

    Depardon, Benjamin; Desprez, Frédéric; Blaizot, Jérémy; Courtois, Hélène M

    2010-01-01

    The work presented in this paper aims at restricting the input parameter values of the semi-analytical model used in GALICS and MOMAF, so as to derive which parameters influence the results the most, e.g., star formation, feedback and halo recycling efficiencies, etc. Our approach is to proceed empirically: we run many simulations and derive the correct ranges of values. The computation time needed is so large that we need to run on a grid of computers. Hence, we model GALICS and MOMAF execution time and output file sizes, and run the simulations using a grid middleware: DIET. All the complexity of accessing resources, scheduling simulations and managing data is harnessed by DIET and hidden behind a web portal accessible to the users.

  9. Strange attractor simulated on a quantum computer

    CERN Document Server

    Terraneo, M; Shepelyansky, D L

    2003-01-01

    Starting from the work of Lorenz, it has been realized that the dynamics of many different dissipative systems converges to so-called strange attractors. These objects are characterized by fractal dimensions and chaotic, unstable dynamics of individual trajectories. They appear in nature in very different contexts, including applications to turbulence and weather forecasting, molecular dynamics, chaotic chemical reactions, multimode solid state lasers and complex dynamics in ecological systems and physiology. The efficient numerical simulation of such dissipative systems can therefore lead to many important practical applications. Here we study a simple deterministic model where dynamics converges to a strange attractor, and show that it can be efficiently simulated on a quantum computer. Even if the dynamics on the attractor is unstable, dissipative and irreversible, a realistic quantum computer can simulate it in a reversible way, and, already with 70 qubits, will provide access to new information inaccessible f...
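
    For reference, the classical dynamics in question is easy to reproduce on an ordinary computer. The Python sketch below integrates the original Lorenz system with fixed-step RK4, converging onto its strange attractor; the paper's specific model and its quantum algorithm are not reproduced here:

      import numpy as np

      def lorenz(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
          x, y, z = s
          return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

      def rk4_step(f, s, dt):
          k1 = f(s)
          k2 = f(s + 0.5 * dt * k1)
          k3 = f(s + 0.5 * dt * k2)
          k4 = f(s + dt * k3)
          return s + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

      s = np.array([1.0, 1.0, 1.0])
      for _ in range(10000):
          s = rk4_step(lorenz, s, 1e-3)
      print("point on the attractor:", s)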

  10. Second International Conference on Advanced Computing, Networking and Informatics

    CERN Document Server

    Mohapatra, Durga; Konar, Amit; Chakraborty, Aruna

    2014-01-01

    Advanced Computing, Networking and Informatics are three distinct and mutually exclusive disciplines of knowledge with no apparent sharing/overlap among them. However, their convergence is observed in many real world applications, including cyber-security, internet banking, healthcare, sensor networks, cognitive radio, pervasive computing amidst many others. This two-volume proceedings explore the combined use of Advanced Computing and Informatics in the next generation wireless networks and security, signal and image processing, ontology and human-computer interfaces (HCI). The two volumes together include 148 scholarly papers, which have been accepted for presentation from over 640 submissions in the second International Conference on Advanced Computing, Networking and Informatics, 2014, held in Kolkata, India during June 24-26, 2014. The first volume includes innovative computing techniques and relevant research results in informatics with selective applications in pattern recognition, signal/image process...

  11. Advances in Future Computer and Control Systems v.2

    CERN Document Server

    Lin, Sally; 2012 International Conference on Future Computer and Control Systems (FCCS2012)

    2012-01-01

    FCCS2012 is an integrated conference concentrating its focus on Future Computer and Control Systems. “Advances in Future Computer and Control Systems” presents the proceedings of the 2012 International Conference on Future Computer and Control Systems (FCCS2012), held April 21-22, 2012, in Changsha, China, including recent research results on Future Computer and Control Systems from researchers all around the world.

  12. Advances in Future Computer and Control Systems v.1

    CERN Document Server

    Lin, Sally; 2012 International Conference on Future Computer and Control Systems (FCCS2012)

    2012-01-01

    FCCS2012 is an integrated conference concentrating its focus on Future Computer and Control Systems. “Advances in Future Computer and Control Systems” presents the proceedings of the 2012 International Conference on Future Computer and Control Systems (FCCS2012), held April 21-22, 2012, in Changsha, China, including recent research results on Future Computer and Control Systems from researchers all around the world.

  13. Computing Algorithms for Nuffield Advanced Physics.

    Science.gov (United States)

    Summers, M. K.

    1978-01-01

    Defines all recurrence relations used in the Nuffield course, to solve first- and second-order differential equations, and describes a typical algorithm for computer generation of solutions. (Author/GA)
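
    A typical recurrence of the kind described, applied to simple harmonic motion d2x/dt2 = -(k/m)x, looks as follows in Python (the parameters are illustrative; the course's own algorithms are not reproduced):

      k_over_m, dt = 4.0, 0.001          # omega = 2 rad/s, so the period is pi seconds
      x, v = 1.0, 0.0
      for step in range(3142):           # roughly one period at dt = 0.001
          a = -k_over_m * x              # acceleration from the current displacement
          v = v + a * dt                 # recurrence 1: update velocity
          x = x + v * dt                 # recurrence 2: update position
      print(f"after ~one period: x = {x:.4f} (started at 1.0)")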

  14. Aerodynamic optimization studies on advanced architecture computers

    Science.gov (United States)

    Chawla, Kalpana

    1995-01-01

    The approach to carrying out multi-discipline aerospace design studies in the future, especially in massively parallel computing environments, comprises choosing (1) suitable solvers to compute solutions to equations characterizing a discipline, and (2) efficient optimization methods. In addition, for aerodynamic optimization problems, (3) smart methodologies must be selected to modify the surface shape. In this research effort, a 'direct' optimization method is implemented on the Cray C-90 to improve aerodynamic design. It is coupled with an existing implicit Navier-Stokes solver, OVERFLOW, to compute flow solutions. The optimization method is chosen such that it can accommodate multi-discipline optimization in future computations. In this work, however, only single-discipline aerodynamic optimization will be included.

  15. Fluid Dynamics Theory, Computation, and Numerical Simulation

    CERN Document Server

    Pozrikidis, Constantine

    2009-01-01

    Fluid Dynamics: Theory, Computation, and Numerical Simulation is the only available book that extends the classical field of fluid dynamics into the realm of scientific computing in a way that is both comprehensive and accessible to the beginner. The theory of fluid dynamics, and the implementation of solution procedures into numerical algorithms, are discussed hand-in-hand and with reference to computer programming. This book is an accessible introduction to theoretical and computational fluid dynamics (CFD), written from a modern perspective that unifies theory and numerical practice. There are several additions and subject expansions in the Second Edition of Fluid Dynamics, including new Matlab and FORTRAN codes. Two distinguishing features of the discourse are: solution procedures and algorithms are developed immediately after problem formulations are presented, and numerical methods are introduced on a need-to-know basis and in increasing order of difficulty. Matlab codes are presented and discussed for ...

  16. Fluid dynamics theory, computation, and numerical simulation

    CERN Document Server

    Pozrikidis, C

    2001-01-01

    Fluid Dynamics: Theory, Computation, and Numerical Simulation is the only available book that extends the classical field of fluid dynamics into the realm of scientific computing in a way that is both comprehensive and accessible to the beginner. The theory of fluid dynamics, and the implementation of solution procedures into numerical algorithms, are discussed hand-in-hand and with reference to computer programming. This book is an accessible introduction to theoretical and computational fluid dynamics (CFD), written from a modern perspective that unifies theory and numerical practice. There are several additions and subject expansions in the Second Edition of Fluid Dynamics, including new Matlab and FORTRAN codes. Two distinguishing features of the discourse are: solution procedures and algorithms are developed immediately after problem formulations are presented, and numerical methods are introduced on a need-to-know basis and in increasing order of difficulty. Matlab codes are presented and discussed for a broad...

  17. Power-efficient computer architectures recent advances

    CERN Document Server

    Själander, Magnus; Kaxiras, Stefanos

    2014-01-01

    As Moore's Law and Dennard scaling trends have slowed, the challenges of building high-performance computer architectures while maintaining acceptable power efficiency levels have heightened. Over the past ten years, architecture techniques for power efficiency have shifted from primarily focusing on module-level efficiencies, toward more holistic design styles based on parallelism and heterogeneity. This work highlights and synthesizes recent techniques and trends in power-efficient computer architecture.Table of Contents: Introduction / Voltage and Frequency Management / Heterogeneity and Sp

  18. Advanced Metamorphic Techniques in Computer Viruses

    OpenAIRE

    Beaucamps, Philippe

    2007-01-01

    Nowadays viruses use polymorphic techniques to mutate their code on each replication, thus evading detection by antiviruses. However, detection by emulation can defeat simple polymorphism; thus, metamorphic techniques are used which thoroughly change the viral code, even after decryption. We briefly detail this evolution of virus protection techniques against detection and then study the MetaPHOR virus, today's most advanced metamorphic virus.

  19. Advances in computing, and their impact on scientific computing.

    Science.gov (United States)

    Giles, Mike

    2002-01-01

    This paper begins by discussing the developments and trends in computer hardware, starting with the basic components (microprocessors, memory, disks, system interconnect, networking and visualization) before looking at complete systems (death of vector supercomputing, slow demise of large shared-memory systems, rapid growth in very large clusters of PCs). It then considers the software side, the relative maturity of shared-memory (OpenMP) and distributed-memory (MPI) programming environments, and new developments in 'grid computing'. Finally, it touches on the increasing importance of software packages in scientific computing, and the increased importance and difficulty of introducing good software engineering practices into very large academic software development projects. PMID:12539947

  20. Computer simulation of underwater nuclear events

    International Nuclear Information System (INIS)

    This report describes the computer simulation of two underwater nuclear explosions, Operation Wigwam and a modern hypothetical explosion of greater yield. The computer simulations were done in spherical geometry with the LASNEX computer code. Comparison of the LASNEX calculation with Snay's analytical results and the Wigwam measurements shows that agreement in the shock pressure versus range in water is better than 5%. The results of the calculations are also consistent with the cube root scaling law for an underwater blast wave. The time constant of the wave front was determined from the wave profiles taken at several points. The LASNEX time-constant calculation and Snay's theoretical results agree to within 20%. A time-constant-versus-range relation empirically fitted by Snay is valid only within a limited range at low pressures, whereas a time-constant formula based on Sedov's similarity solution holds at very high pressures. This leaves the intermediate pressure range with neither an empirical nor a theoretical formula for the time constant. These one-dimensional simulations demonstrate applicability of the computer code to investigations of this nature, and justify the use of this technique for more complex two-dimensional problems, namely, surface effects on underwater nuclear explosions. 16 refs., 8 figs., 2 tabs
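
    The cube-root scaling law mentioned above has a simple worked form: two bursts of yields W1 and W2 produce equal peak pressures at equal scaled ranges R/W^(1/3), so a pressure-range curve measured at one yield maps directly to another. The short Python sketch below uses illustrative yields, not the Wigwam parameters:

      def scale_range(r1, w1, w2):
          """Range at yield w2 with the same scaled range as r1 at yield w1."""
          return r1 * (w2 / w1) ** (1.0 / 3.0)

      w1, w2 = 30.0, 240.0                  # kilotons (illustrative); a yield ratio of 8 doubles the range
      for r1 in (500.0, 1000.0, 2000.0):    # meters
          print(f"{r1:7.1f} m at {w1} kt  <->  {scale_range(r1, w1, w2):7.1f} m at {w2} kt")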

  1. Preface: Special issue: ten years of advances in computer entertainment

    NARCIS (Netherlands)

    Katayose, Haruhiro; Reidsma, Dennis; Rauterberg, M

    2014-01-01

    This special issue celebrates the 10th edition of the International Conference on Advances in Computer Entertainment (ACE) by collecting six selected and revised papers from among this year’s accepted contributions.

  2. Mathematical and computational modeling and simulation fundamentals and case studies

    CERN Document Server

    Moeller, Dietmar P F

    2004-01-01

    Mathematical and Computational Modeling and Simulation - a highly multi-disciplinary field with ubiquitous applications in science and engineering - is one of the key enabling technologies of the 21st century. This book introduces the use of Mathematical and Computational Modeling and Simulation in order to develop an understanding of the solution characteristics of a broad class of real-world problems. The relevant basic and advanced methodologies are explained in detail, with special emphasis on ill-defined problems. Some 15 simulation systems are presented on the language and the logical level. Moreover, the reader can accumulate experience by studying a wide variety of case studies. The latter are briefly described within the book but their full versions as well as some simulation software demos are available on the Web. The book can be used for University courses of different level as well as for self-study. Advanced sections are marked and can be skipped in a first reading or in undergraduate courses...

  3. Cloud Computing Simulation Using CloudSim

    OpenAIRE

    Kumar, Ranjan; Sahoo, G.

    2014-01-01

    As we know, Cloud Computing is a new paradigm in IT. It has many advantages and disadvantages, but in the future it will spread across the whole world. Much research is under way on securing cloud services. Simulation is the act of imitating or pretending: a situation in which a particular set of conditions is created artificially in order to study something that could exist in reality. We need only a simple Operating System with some memory to start up our Computer. All our resources will be ava...

  4. Computational fluid dynamics for sport simulation

    CERN Document Server

    2009-01-01

    All over the world sport plays a prominent role in society: as a leisure activity for many, as an ingredient of culture, as a business and as a matter of national prestige in such major events as the World Cup in soccer or the Olympic Games. Hence, it is not surprising that science has entered the realm of sports, and, in particular, that computer simulation has become highly relevant in recent years. This is explored in this book by choosing five different sports as examples, demonstrating that computational science and engineering (CSE) can make essential contributions to research on sports topics on both the fundamental level and, eventually, by supporting athletes’ performance.

  5. 3rd International Conference on Advanced Computing, Networking and Informatics

    CERN Document Server

    Mohapatra, Durga; Chaki, Nabendu

    2016-01-01

    Advanced Computing, Networking and Informatics are three distinct and mutually exclusive disciplines of knowledge with no apparent sharing/overlap among them. However, their convergence is observed in many real world applications, including cyber-security, internet banking, healthcare, sensor networks, cognitive radio, pervasive computing amidst many others. This two volume proceedings explore the combined use of Advanced Computing and Informatics in the next generation wireless networks and security, signal and image processing, ontology and human-computer interfaces (HCI). The two volumes together include 132 scholarly articles, which have been accepted for presentation from over 550 submissions in the Third International Conference on Advanced Computing, Networking and Informatics, 2015, held in Bhubaneswar, India during June 23–25, 2015.

  6. Computer Simulations on a Multidimensional Continuum:

    DEFF Research Database (Denmark)

    Pfeffer, Melanie; Otrel-Cass, Kathrin; Girault, Isabelle

    2016-01-01

    …, often simplified models of real-world or hypothetical phenomena that contain features that not only allow but promote the exploration of ideas, manipulation of parameters, observation of events, and testing of questions. The origin and components of this definition are described in further detail… with emphasis on simulations' algorithmic, dynamic, and simple features. Defined as models, simulations can be computational or conceptual in nature and may reflect hypothetical or real events; such distinctions are addressed. Examples of programs that demonstrate the features of simulations emphasized in our… definition are introduced throughout the current chapter...

  7. Monte Carlo Simulation of Quantum Computation

    OpenAIRE

    Cerf, N. J.; Koonin, S. E.

    1997-01-01

    The many-body dynamics of a quantum computer can be reduced to the time evolution of non-interacting quantum bits in auxiliary fields by use of the Hubbard-Stratonovich representation of two-bit quantum gates in terms of one-bit gates. This makes it possible to perform the stochastic simulation of a quantum algorithm, based on the Monte Carlo evaluation of an integral of dimension polynomial in the number of quantum bits. As an example, the simulation of the quantum circuit for the Fast Fouri...

  8. Computer Simulations of Lipid Bilayers and Proteins

    OpenAIRE

    Sonne, Jacob

    2006-01-01

    Molecular dynamics (MD) computer simulations are today used extensively to study various types of systems on submicroscopic length scales. This thesis, entitled Computer simulations of lipid bilayers and proteins, describes MD simulations of biological membranes and proteins. In an MD simulation, Newton's equation of motion is solved numerically for a collection of up to a few hundred thousand atoms. The main result of such a simu...

  9. Computer Simulation of Convective Plasma Cells

    CERN Document Server

    Carboni, Rodrigo

    2015-01-01

    Computer simulations of plasmas are relevant nowadays because they help us understand physical processes taking place in the sun and other stellar objects. We developed a program called PCell which is intended for displaying the evolution of the magnetic field in a 2D convective plasma cell with perfectly conducting walls for different stationary plasma velocity fields. Applications of this program are presented. This software works interactively with the mouse, and users can create their own movies in MPEG format. The programs were written in Fortran and C. There are two versions of the program (GNUPLOT and OpenGL); GNUPLOT and OpenGL are used to display the simulation.

  10. Advanced modeling and simulation to design and manufacture high performance and reliable advanced microelectronics and microsystems.

    Energy Technology Data Exchange (ETDEWEB)

    Nettleship, Ian (University of Pittsburgh, Pittsburgh, PA); Hinklin, Thomas; Holcomb, David Joseph; Tandon, Rajan; Arguello, Jose Guadalupe, Jr.; Dempsey, James Franklin; Ewsuk, Kevin Gregory; Neilsen, Michael K.; Lanagan, Michael (Pennsylvania State University, University Park, PA)

    2007-07-01

    An interdisciplinary team of scientists and engineers having broad expertise in materials processing and properties, materials characterization, and computational mechanics was assembled to develop science-based modeling/simulation technology to design and reproducibly manufacture high performance and reliable, complex microelectronics and microsystems. The team's efforts focused on defining and developing a science-based infrastructure to enable predictive compaction, sintering, stress, and thermomechanical modeling in "real systems", including: (1) developing techniques to determine the materials properties and constitutive behavior required for modeling; (2) developing new, improved/updated models and modeling capabilities; (3) ensuring that models are representative of the physical phenomena being simulated; and (4) assessing existing modeling capabilities to identify advances necessary to facilitate the practical application of Sandia's predictive modeling technology.

  11. Advance simulation capability for environmental management (ASCEM) - 59065

    International Nuclear Information System (INIS)

    The United States Department of Energy (DOE) Office of Environmental Management (EM) determined that uniform application of advanced modeling in the subsurface could help reduce the cost and risks associated with its environmental cleanup mission. In response to this determination, the EM Office of Technology Innovation and Development (OTID), Groundwater and Soil Remediation (GW and S), began the program Advanced Simulation Capability for Environmental Management (ASCEM). ASCEM is a state-of-the-art scientific tool and approach for integrating data and scientific understanding to enable prediction of contaminant fate and transport in natural and engineered systems. This initiative supports the reduction of uncertainties and risks associated with EM's environmental cleanup and closure programs through better understanding and quantification of the subsurface flow and contaminant transport behavior in complex geological systems. This involves the long-term performance of engineered components, including cementitious materials in nuclear waste disposal facilities that may be sources for future contamination of the subsurface. This paper describes the ASCEM tools and approach and the ASCEM programmatic accomplishments completed in 2010, including recent advances and technology transfer. The US Department of Energy Office of Environmental Management has begun development of an Advanced Simulation Capability for Environmental Management (ASCEM). This program will provide predictions of the end states of contaminated areas, allowing for cost and risk reduction of EM remedial activities. ASCEM will provide the tools and approaches necessary to standardize risk and performance assessments across the DOE complex. Through its Phase One demonstration, the ASCEM team has shown value to the EM community in the areas of High Performance Computing, Data Management, Visualization, and Uncertainty Quantification. In 2012, ASCEM will provide an initial limited release of a community code for

  12. Time reversibility, computer simulation, algorithms, chaos

    CERN Document Server

    Hoover, William Graham

    2012-01-01

    A small army of physicists, chemists, mathematicians, and engineers has joined forces to attack a classic problem, the "reversibility paradox", with modern tools. This book describes their work from the perspective of computer simulation, emphasizing the author's approach to the problem of understanding the compatibility, and even inevitability, of the irreversible second law of thermodynamics with an underlying time-reversible mechanics. Computer simulation has made it possible to probe reversibility from a variety of directions and "chaos theory" or "nonlinear dynamics" has supplied a useful vocabulary and a set of concepts, which allow a fuller explanation of irreversibility than that available to Boltzmann or to Green, Kubo and Onsager. Clear illustration of concepts is emphasized throughout, and reinforced with a glossary of technical terms from the specialized fields which have been combined here to focus on a common theme. The book begins with a discussion, contrasting the idealized reversibility of ba...

  13. Computer simulation of complexity in plasmas

    International Nuclear Information System (INIS)

    By making a comprehensive comparative study of many self-organizing phenomena occurring in magnetohydrodynamics and kinetic plasmas, we came up with a hypothetical grand view of self-organization. This assertion is confirmed by a recent computer simulation for a broader science field, specifically, the structure formation of short polymer chains, where the nature of the interaction is completely different from that of plasmas. It is found that the formation of the global orientation order proceeds stepwise. (author)

  14. Computer Simulation of Multidimensional Archaeological Artefacts

    Directory of Open Access Journals (Sweden)

    Vera Moitinho de Almeida

    2012-11-01

    Our project focuses on the Neolithic lakeside site of La Draga (Banyoles, Catalonia). In this presentation we will begin by providing a clear overview of the major guidelines used to capture and process 3D digital data of several wooden artefacts. Then, we shall present the use of semi-automated extraction of relevant features. Finally, we intend to share preliminary computer simulation issues.

  15. Computer simulation of complexity in plasmas

    Energy Technology Data Exchange (ETDEWEB)

    Hayashi, Takaya; Sato, Tetsuya [National Inst. for Fusion Science, Toki, Gifu (Japan)

    1998-08-01

    By making a comprehensive comparative study of many self-organizing phenomena occurring in magnetohydrodynamics and kinetic plasmas, we came up with a hypothetical grand view of self-organization. This assertion is confirmed by a recent computer simulation for a broader science field, specifically, the structure formation of short polymer chains, where the nature of the interaction is completely different from that of plasmas. It is found that the formation of the global orientation order proceeds stepwise. (author)

  16. Real-time simulation of an automotive gas turbine using the hybrid computer

    Science.gov (United States)

    Costakis, W.; Merrill, W. C.

    1984-01-01

    A hybrid computer simulation of an Advanced Automotive Gas Turbine Powertrain System is reported. The system consists of a gas turbine engine, an automotive drivetrain with a four-speed automatic transmission, and a control system. Generally, dynamic performance is simulated on the analog portion of the hybrid computer, while most of the steady-state performance characteristics are calculated on the digital portion. The simulation runs faster than real time, which makes it a useful tool for a variety of analytical studies.

  17. Computer simulations in the science classroom

    Science.gov (United States)

    Richards, John; Barowy, William; Levin, Dov

    1992-03-01

    In this paper we describe software for science instruction that is based upon a constructivist epistemology of learning. From a constructivist perspective, the process of learning is viewed as an active construction of knowledge, rather than a passive reception of information. The computer has the potential to provide an environment in which students can explore their understanding and better construct scientific knowledge. The Explorer is an interactive environment that integrates animated computer models with analytic capabilities for learning and teaching science. The system includes graphs, a spreadsheet, scripting, and interactive tools. During formative evaluation of Explorer in the classroom, we focused on the function and effectiveness of computer models in teaching science. Models have helped students relate theory to experiment when used in conjunction with hands-on activities and when the simulation addressed students' naive understanding of the phenomena. Two classroom examples illustrate our findings. The first is based on the dynamics of colliding objects. The second describes a class modeling the function of simple electric circuits. The simulations bridge between phenomena and theory by providing an abstract representation on which students may make measurements. Simulations based on scientific theory help to provide a set of interrelated experiences that challenge students' informal understanding of the science.

  18. QCWAVE, a Mathematica quantum computer simulation update

    CERN Document Server

    Tabakin, Frank

    2011-01-01

    This Mathematica 7.0/8.0 package upgrades and extends the quantum computer simulation code called QDENSITY. Use of the density matrix was emphasized in QDENSITY, although that code was also applicable to a quantum state description. In the present version, the quantum state version is stressed and made amenable to future extensions to parallel computer simulations. The add-on QCWAVE extends QDENSITY in several ways. The first way is to describe the action of one-, two- and three-qubit quantum gates as a set of small ($2\times2$, $4\times4$ or $8\times8$) matrices acting on the $2^{n_q}$ amplitudes for a system of $n_q$ qubits. This procedure was described in our parallel computer simulation QCMPI and is reviewed here. The advantage is that smaller storage demands are made, without loss of speed, and that the procedure can take advantage of message passing interface (MPI) techniques, which will hopefully be generally available in future Mathematica versions. Another extension of QDENSITY provided here is a mu...
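
    As a hedged illustration of the storage-saving scheme described above (a minimal sketch, not code from the QCWAVE package; the function name and example are hypothetical), a one-qubit gate can be applied to the $2^{n_q}$ amplitudes as a small $2\times2$ matrix, without ever forming the full $2^{n_q}\times2^{n_q}$ operator:

        import numpy as np

        def apply_one_qubit_gate(state, gate, target, n_qubits):
            """Apply a 2x2 gate to the `target` qubit of an n-qubit state vector.

            The 2**n amplitude vector is reshaped so the target qubit has its
            own axis, which is then contracted with the small gate matrix; no
            2**n x 2**n operator is ever built.
            """
            psi = state.reshape([2] * n_qubits)            # one axis per qubit
            psi = np.tensordot(gate, psi, axes=([1], [target]))
            psi = np.moveaxis(psi, 0, target)              # restore axis order
            return psi.reshape(-1)

        # Example: Hadamard on qubit 0 of |00> gives (|00> + |10>)/sqrt(2)
        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
        state = np.zeros(4, dtype=complex)
        state[0] = 1.0
        print(apply_one_qubit_gate(state, H, target=0, n_qubits=2))

    Two- and three-qubit gates follow the same pattern, with $4\times4$ and $8\times8$ matrices contracted over two or three axes.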

  19. Computer simulation of two-phase flow in nuclear reactors

    International Nuclear Information System (INIS)

    Two-phase flow models dominate the requirements of economic resources for the development and use of computer codes which serve to analyze thermohydraulic transients in nuclear power plants. An attempt is made to reduce the effort of analyzing reactor transients by combining purpose-oriented modeling with advanced computing techniques. Six principles are presented on mathematical modeling and the selection of numerical methods, along with suggestions on programming and machine selection, all aimed at reducing the cost of analysis. Computer simulation is contrasted with traditional computer calculation. The advantages of run-time interactive access operation in a simulation environment are demonstrated. It is explained that the drift-flux model is better suited than the two-fluid model for the analysis of two-phase flow in nuclear reactors, because of the latter's closure problems. The advantage of analytical over numerical integration is demonstrated. Modeling and programming techniques are presented which minimize the number of needed arithmetical and logical operations and thereby increase the simulation speed, while decreasing the cost. (orig.)

  20. Advanced Computing Tools and Models for Accelerator Physics

    Energy Technology Data Exchange (ETDEWEB)

    Ryne, Robert D.

    2008-06-11

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics.

  1. Accelerating Climate Simulations Through Hybrid Computing

    Science.gov (United States)

    Zhou, Shujia; Sinno, Scott; Cruz, Carlos; Purcell, Mark

    2009-01-01

    Unconventional multi-core processors (e.g., IBM Cell B/E and NVIDIA GPUs) have emerged as accelerators in climate simulation. However, climate models typically run on parallel computers with conventional processors (e.g., Intel and AMD) using MPI. Connecting accelerators to this architecture efficiently and easily becomes a critical issue. When using MPI for the connection, we identified two challenges: (1) an identical MPI implementation is required on both systems; and (2) existing MPI code must be modified to accommodate the accelerators. In response, we have extended and deployed IBM Dynamic Application Virtualization (DAV) in a hybrid computing prototype system (one blade with two Intel quad-core processors, two IBM QS22 Cell blades, connected with InfiniBand), allowing compute-intensive functions to be seamlessly offloaded to remote, heterogeneous accelerators in a scalable, load-balanced manner. Currently, a climate solar radiation model running with multiple MPI processes has been offloaded to multiple Cell blades with approximately 10% network overhead.

  2. Advances in Computing and Information Technology : Proceedings of the Second International Conference on Advances in Computing and Information Technology

    CERN Document Server

    Nagamalai, Dhinaharan; Chaki, Nabendu

    2013-01-01

    The international conference on Advances in Computing and Information Technology (ACITY 2012) provides an excellent international forum for both academics and professionals for sharing knowledge and results in theory, methodology and applications of Computer Science and Information Technology. The Second International Conference on Advances in Computing and Information Technology (ACITY 2012), held in Chennai, India, during July 13-15, 2012, covered a number of topics in all major fields of Computer Science and Information Technology including: networking and communications, network security and applications, web and internet computing, ubiquitous computing, algorithms, bioinformatics, digital image processing and pattern recognition, artificial intelligence, soft computing and applications. Following a rigorous review process, a number of high-quality papers, presenting not only innovative ideas but also well-founded evaluations and strong argumentation, were selected and collected in the present proceedings, ...

  3. TNO-ADVANCE: a modular powertrain simulation and design tool

    NARCIS (Netherlands)

    Venne, J.W.C. van de; Smokers, R.T.M.

    2000-01-01

    To support its activities in the field of conventional and hybrid vehicles, TNO has developed ADVANCE, a modular simulation tool for the design and evaluation of advanced powertrains. In this paper the various features and the potential of ADVANCE are described and illustrated by means of three case studies.

  4. Advances in Computer Science and Education

    CERN Document Server

    Huang, Xiong

    2012-01-01

    CSE2011 is an integrated conference concentrating on computer science and education. In these proceedings, readers can learn about research in computer science and education from around the world. The main role of the proceedings is to serve as an exchange platform for researchers working in the mentioned fields. In order to meet the high quality standards of the Springer AISC series, the organizing committee made efforts to do the following things. Firstly, poor-quality papers were rejected after review by anonymous expert referees. Secondly, review meetings were held with the reviewers about five times to exchange reviewing suggestions. Finally, the conference organizers held several preliminary sessions before the conference. Through the efforts of different people and departments, the conference will be successful and fruitful.

  5. Computer simulation for sodium-concrete reactions

    International Nuclear Information System (INIS)

    In liquid metal cooled fast breeder reactors (LMFBRs), direct contact between sodium and concrete is unavoidable. Because of sodium's high chemical reactivity, it would react violently with concrete, releasing large amounts of hydrogen gas and heat and thereby threatening the integrity of the containment. This paper developed a program to simulate sodium-concrete reactions comprehensively. It can give the reaction zone temperature, pool temperature, penetration depth, penetration rate, hydrogen flux, reaction heat, and so on. Concrete is considered to be composed of silica and water only in this paper. A variable, the quotient of sodium hydroxide, was introduced into the continuity equation to simulate the chemical reactions more realistically. The product of the net gas flux and the boundary depth was suitably transformed into that of the penetration rate and the boundary depth. The complex chemical kinetics equations were simplified under certain hypotheses. All the techniques applied above simplified the computer simulation considerably; in other words, they made the computer simulation feasible. The theoretical models applied in the program and the calculation procedure are expounded in detail. Good agreement with the overall transient behavior was obtained in analyses of a series of sodium-concrete reaction experiments. The comparison between the analytical and experimental results showed that the program presented in this paper is credible and reasonable for simulating sodium-concrete reactions. This program can be used for nuclear safety judgements. (authors)

  6. Advances in Computer-Based Autoantibodies Analysis

    Science.gov (United States)

    Soda, Paolo; Iannello, Giulio

    Indirect Immunofluorescence (IIF) imaging is the recommended method to detect autoantibodies in patient serum, whose common markers are antinuclear autoantibodies (ANA) and autoantibodies directed against double-strand DNA (anti-dsDNA). Since the availability of accurately performed and correctly reported laboratory determinations is crucial for clinicians, an evident medical demand is the development of Computer Aided Diagnosis (CAD) tools supporting physicians' decisions.

  7. Advances in Neurotechnology for Brain Computer Interfaces

    OpenAIRE

    Fazli, Siamac

    2011-01-01

    Brain-computer interfaces have generated enormous scientific interest over the last 10 years. On closer inspection, however, this exciting technology still reveals a number of hurdles that have so far prevented the development of applications suitable for the mass market: among others, the long preparation time of a BCI system, the lack of control ability for some users, and the non-stationarities within a recording session. This dissertation introduces a re...

  8. Recent Advances in Computer Engineering and Applications

    OpenAIRE

    Jha, Manoj; Lagakos, Stephen; Perlovsky, Leonid; Covaci, Brindusa; Mastorakis, Nikos; Zaharim, Azami

    2010-01-01

    This year the 4th WSEAS International Conference on COMPUTER ENGINEERING and APPLICATIONS (CEA '10) was held at Harvard University, Cambridge, USA, January 27-29, 2010. The conference remains faithful to its original idea of providing a platform to discuss network architecture, network design software, mobile networks and mobile services, digital broadcasting, e-commerce, optical networks, hacking, Trojan horses, viruses, worms, spam, information security, standards of information security: ...

  9. Proceedings of International Conference on Advances in Computing

    CERN Document Server

    R, Selvarani; Kumar, T

    2012-01-01

    This is the first International Conference on Advances in Computing (ICAdC-2012). The scope of the conference includes all the areas of Theoretical Computer Science, Systems and Software, and Intelligent Systems. The conference proceedings are a culmination of research results, papers, and theory related to all three major areas of computing, i.e., Theoretical Computer Science, Systems and Software, and Intelligent Systems. They help budding researchers and graduates in the areas of Computer Science, Information Science, Electronics, Telecommunication, Instrumentation, and Networking to take their research work forward, based on the reviewed results in the papers and through mutual interaction via the e-mail contacts given in the proceedings.

  10. Advanced TCAD Simulations and Characterization of Semiconductor Devices

    OpenAIRE

    Ewert, Tony

    2006-01-01

    Today, micro- and nano-electronic devices are becoming more complex and advanced as the dimensions are shrinking. It is therefore a very challenging task to develop new device technologies with performance that can be predicted. This thesis focuses on advanced measurement techniques and TCAD simulations in order to characterize and understand the device physics of advanced semiconductor devices. TCAD simulations were made on a novel MOSFET device with asymmetric source and drain structures. ...

  11. Precision Casting via Advanced Simulation and Manufacturing

    Science.gov (United States)

    1997-01-01

    A two-year program was conducted to develop and commercially implement selected casting manufacturing technologies to enable significant reductions in the costs of castings, increase the complexity and dimensional accuracy of castings, and reduce the development times for delivery of high quality castings. The industry-led R&D project was cost shared with NASA's Aerospace Industry Technology Program (AITP). The Rocketdyne Division of Boeing North American, Inc. served as the team lead with participation from Lockheed Martin, Ford Motor Company, Howmet Corporation, PCC Airfoils, General Electric, UES, Inc., University of Alabama, Auburn University, Robinson, Inc., Aracor, and NASA-LeRC. The technical effort was organized into four distinct tasks, whose accomplishments are reported herein. Task 1.0 developed advanced simulation technology for core molding. Ford headed up this task. On this program, a specialized core machine was designed and built. Task 2.0 focused on intelligent process control for precision core molding. Howmet led this effort. The primary focus of these experimental efforts was to characterize the process parameters that have a strong impact on dimensional control issues of injection molded cores during their fabrication. Task 3.0 developed and applied rapid prototyping to produce near net shape castings. Rocketdyne was responsible for this task. CAD files were generated using reverse engineering, rapid prototype patterns were fabricated using SLS and SLA, and castings produced and evaluated. Task 4.0 was aimed at developing technology transfer. Rocketdyne coordinated this task. Casting related technology, explored and evaluated in the first three tasks of this program, was implemented into manufacturing processes.

  12. Advances in computational fluid dynamics solvers for modern computing environments

    Science.gov (United States)

    Hertenstein, Daniel; Humphrey, John R.; Paolini, Aaron L.; Kelmelis, Eric J.

    2013-05-01

    EM Photonics has been investigating the application of massively multicore processors to a key problem area: Computational Fluid Dynamics (CFD). While the capabilities of CFD solvers have continually increased and improved to support features such as moving bodies and adjoint-based mesh adaptation, the software architecture has often lagged behind. This has led to poor scaling as core counts reach the tens of thousands. In the modern High Performance Computing (HPC) world, clusters with hundreds of thousands of cores are becoming the standard. In addition, accelerator devices such as NVIDIA GPUs and Intel Xeon Phi are being installed in many new systems. It is important for CFD solvers to take advantage of the new hardware as the computations involved are well suited for the massively multicore architecture. In our work, we demonstrate that new features in NVIDIA GPUs are able to empower existing CFD solvers by example using AVUS, a CFD solver developed by the Air Force Research Laboratory (AFRL) and the Volcanic Ash Advisory Center (VAAC). The effort has resulted in increased performance and scalability without sacrificing accuracy. There are many well-known codes in the CFD space that can benefit from this work, such as FUN3D, OVERFLOW, and TetrUSS. Such codes are widely used in the commercial, government, and defense sectors.

  13. Advances in the development and validation of CFD-BWR, a two-phase computational fluid dynamics model for the simulation of flow and heat transfer in boiling water reactors

    International Nuclear Information System (INIS)

    This paper presents recent advances in the validation of an advanced Computational Fluid Dynamics (CFD) computer code (CFD-BWR) that allows the detailed analysis of two-phase flow and heat transfer phenomena in Boiling Water Reactor (BWR) fuel bundles. The CFD-BWR code is being developed as a customized module built on the foundation of the commercial CFD-code STAR-CD which provides general two-phase flow modeling capabilities. We have described the model development strategy that has been adopted by the development team for the prediction of boiling flow regimes in a BWR fuel bundle. This strategy includes the use of local flow topology maps and flow topology specific phenomenological models. The paper reviews the key boiling phenomenological models and focuses on recent results of experiment analyses for the validation of two-phase BWR phenomena models including cladding-to-coolant heat transfer and Critical Heat Flux experiments and the BWR Full-size Assembly Boiling Test (BFBT). The two-phase flow models implemented in the CFD-BWR code can be grouped into three broad categories: models describing the vapor generation at the heated cladding surface, models describing the interactions between the vapor and the liquid coolant, and models describing the heat transfer between the fuel pin and the two-phase coolant. These models have been described and will be briefly reviewed. The boiling model used in the second generation of the CFD-BWR code includes a local flow topology map which allows the cell-by-cell selection of the local flow topology. Local flow topologies can range from a bubbly flow topology where the continuous phase is liquid, to a transition flow topology, to a droplet flow topology where the continuous phase is vapor, depending primarily on the local void fraction. The models describing the cladding-to-coolant heat transfer and the interplay between these models and the local flow topology are important in Critical Heat Flux (CHF) analyses, and will

  14. Intelligent Software Tools for Advanced Computing

    Energy Technology Data Exchange (ETDEWEB)

    Baumgart, C.W.

    2001-04-03

    Feature extraction and evaluation are two procedures common to the development of any pattern recognition application. These features are the primary pieces of information which are used to train the pattern recognition tool, whether that tool is a neural network, a fuzzy logic rulebase, or a genetic algorithm. Careful selection of the features to be used by the pattern recognition tool can significantly streamline the overall development and training of the solution for the pattern recognition application. This report summarizes the development of an integrated, computer-based software package called the Feature Extraction Toolbox (FET), which can be used for the development and deployment of solutions to generic pattern recognition problems. This toolbox integrates a number of software techniques for signal processing, feature extraction and evaluation, and pattern recognition, all under a single, user-friendly development environment. The toolbox has been developed to run on a laptop computer, so that it may be taken to a site and used to develop pattern recognition applications in the field. A prototype version of this toolbox has been completed and is currently being used for applications development on several projects in support of the Department of Energy.

  15. A computer simulation of chromosomal instability

    Science.gov (United States)

    Goodwin, E.; Cornforth, M.

    The transformation of a normal cell into a cancerous growth can be described as a process of mutation and selection occurring within the context of clonal expansion. Radiation, in addition to initial DNA damage, induces a persistent and still poorly understood genomic instability process that contributes to the mutational burden. It will be essential to include a quantitative description of this phenomenon in any attempt at science-based risk assessment. Monte Carlo computer simulations are a relatively simple way to model processes that are characterized by an element of randomness. A properly constructed simulation can capture the essence of a phenomenon that, as is often the case in biology, can be extraordinarily complex, and can do so even though the phenomenon itself is incompletely understood. A simple computer simulation of one manifestation of genomic instability known as chromosomal instability will be presented. The model simulates clonal expansion of a single chromosomally unstable cell into a colony. Instability is characterized by a single parameter, the rate of chromosomal rearrangement. With each new chromosome aberration, a unique subclone arises (subclones are defined as having a unique karyotype). The subclone initially has just one cell, but it can expand with cell division if the aberration is not lethal. The computer program automatically keeps track of the number of subclones within the expanding colony, and the number of cells within each subclone. Because chromosome aberrations kill some cells during colony growth, colonies arising from unstable cells tend to be smaller than those arising from stable cells. For any chosen level of instability, the computer program calculates the mean number of cells per colony averaged over many runs. These outputs should prove useful for investigating how such radiobiological phenomena as slow growth colonies, increased doubling time, and delayed cell death depend on chromosomal instability. Also of
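
    The clonal-expansion model described above is straightforward to prototype. The following is a hedged sketch of the general scheme (not the authors' program; all names and parameter values are hypothetical): instability is a single parameter, the per-division aberration rate, each viable aberration founds a new subclone, and lethal aberrations remove cells:

        import random

        def grow_colony(generations, r_aberration=0.05, p_lethal=0.5, seed=1):
            """Clonal expansion of a single chromosomally unstable cell.

            Every cell divides once per generation. With probability
            `r_aberration` the daughter carries a new rearrangement, founding
            a new subclone (unique karyotype) unless the aberration is lethal
            (probability `p_lethal`). Returns cells per subclone.
            """
            random.seed(seed)
            subclones = {0: 1}                     # subclone id -> cell count
            next_id = 1
            for _ in range(generations):
                for sc, n in list(subclones.items()):
                    faithful = 0
                    for _ in range(n):
                        if random.random() < r_aberration:
                            if random.random() >= p_lethal:   # viable new karyotype
                                subclones[next_id] = 1
                                next_id += 1
                        else:
                            faithful += 1                     # normal division
                    subclones[sc] += faithful
            return subclones

        colony = grow_colony(generations=10)
        print(len(colony), "subclones,", sum(colony.values()), "cells")

    Averaging colony size over many runs, as the abstract describes, then only requires wrapping grow_colony in a loop over seeds.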

  16. Scientific and computational challenges of the fusion simulation project (FSP)

    International Nuclear Information System (INIS)

    This paper highlights the scientific and computational challenges facing the Fusion Simulation Project (FSP). The primary objective is to develop advanced software designed to use leadership-class computers for carrying out multiscale physics simulations to provide information vital to delivering a realistic integrated fusion simulation model with unprecedented physics fidelity. This multiphysics capability will be unprecedented in that in the current FES applications domain, the largest-scale codes are used to carry out first-principles simulations of mostly individual phenomena in realistic 3D geometry while the integrated models are much smaller-scale, lower-dimensionality codes with significant empirical elements used for modeling and designing experiments. The FSP is expected to be the most up-to-date embodiment of the theoretical and experimental understanding of magnetically confined thermonuclear plasmas and to provide a living framework for the simulation of such plasmas as the associated physics understanding continues to advance over the next several decades. Substantive progress on answering the outstanding scientific questions in the field will drive the FSP toward its ultimate goal of developing a reliable ability to predict the behavior of plasma discharges in toroidal magnetic fusion devices on all relevant time and space scales. From a computational perspective, the fusion energy science application goal to produce high-fidelity, whole-device modeling capabilities will demand computing resources in the petascale range and beyond, together with the associated multicore algorithmic formulation needed to address burning plasma issues relevant to ITER - a multibillion dollar collaborative device involving seven international partners representing over half the world's population. Even more powerful exascale platforms will be needed to meet the future challenges of designing a demonstration fusion reactor (DEMO). Analogous to other major applied physics

  17. Scientific and computational challenges of the fusion simulation project (FSP)

    Science.gov (United States)

    Tang, W. M.

    2008-07-01

    This paper highlights the scientific and computational challenges facing the Fusion Simulation Project (FSP). The primary objective is to develop advanced software designed to use leadership-class computers for carrying out multiscale physics simulations to provide information vital to delivering a realistic integrated fusion simulation model with unprecedented physics fidelity. This multiphysics capability will be unprecedented in that in the current FES applications domain, the largest-scale codes are used to carry out first-principles simulations of mostly individual phenomena in realistic 3D geometry while the integrated models are much smaller-scale, lower-dimensionality codes with significant empirical elements used for modeling and designing experiments. The FSP is expected to be the most up-to-date embodiment of the theoretical and experimental understanding of magnetically confined thermonuclear plasmas and to provide a living framework for the simulation of such plasmas as the associated physics understanding continues to advance over the next several decades. Substantive progress on answering the outstanding scientific questions in the field will drive the FSP toward its ultimate goal of developing a reliable ability to predict the behavior of plasma discharges in toroidal magnetic fusion devices on all relevant time and space scales. From a computational perspective, the fusion energy science application goal to produce high-fidelity, whole-device modeling capabilities will demand computing resources in the petascale range and beyond, together with the associated multicore algorithmic formulation needed to address burning plasma issues relevant to ITER — a multibillion dollar collaborative device involving seven international partners representing over half the world's population. Even more powerful exascale platforms will be needed to meet the future challenges of designing a demonstration fusion reactor (DEMO). Analogous to other major applied

  18. Computational Methods for Jet Noise Simulation

    Science.gov (United States)

    Goodrich, John W. (Technical Monitor); Hagstrom, Thomas

    2003-01-01

    The purpose of our project is to develop, analyze, and test novel numerical technologies central to the long term goal of direct simulations of subsonic jet noise. Our current focus is on two issues: accurate, near-field domain truncations and high-order, single-step discretizations of the governing equations. The Direct Numerical Simulation (DNS) of jet noise poses a number of extreme challenges to computational technique. In particular, the problem involves multiple temporal and spatial scales as well as flow instabilities and is posed on an unbounded spatial domain. Moreover, the basic phenomenon of interest, the radiation of acoustic waves to the far field, involves only a minuscule fraction of the total energy. The best current simulations of jet noise are at low Reynolds number. It is likely that an increase of one to two orders of magnitude will be necessary to reach a regime where the separation between the energy-containing and dissipation scales is sufficient to make the radiated noise essentially independent of the Reynolds number. Such an increase in resolution cannot be obtained in the near future solely through increases in computing power. Therefore, new numerical methodologies of maximal efficiency and accuracy are required.

  19. Simulation and computation in health physics training

    International Nuclear Information System (INIS)

    The Royal Naval College has devised a number of computer aided learning programmes applicable to health physics, which include radiation shield design and optimisation, the environmental impact of a reactor accident, exposure levels produced by an inert radioactive gas cloud, and the prediction of radiation detector response in various radiation field conditions. Analogue computers are used on reduced or fast time scales because time-dependent phenomena are not always easily assimilated in real time. The build-up and decay of fission products, the dynamics of intake of radioactive material, and reactor accident dynamics can be effectively simulated. It is essential to relate these simulations to real time, and the College applies a research reactor and an analytical phantom to this end. A special feature of the reactor is a chamber which can be supplied with Argon-41 from reactor exhaust gases to create a realistic gaseous contamination environment. Reactor accident situations are also taught by using role playing sequences carried out in real time in the emergency facilities associated with the research reactor. These facilities are outlined and the training technique illustrated with examples of the calculations and simulations. The training needs of the future are discussed, with emphasis on optimisation and cost-benefit analysis. (H.K.)
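
    The build-up and decay of fission products mentioned above is a classic decay-chain (Bateman) calculation. As a hedged digital sketch of what such analogue programmes simulate (not the College's software; names and rate constants are hypothetical), a parent-daughter chain fed by a constant production rate can be integrated as follows:

        def decay_chain(lambda1, lambda2, production, t_end, steps=1000):
            """Build-up and decay of a chain: source -> N1 -> N2 -> (stable).

            Explicit Euler integration of dN1/dt = P - l1*N1 and
            dN2/dt = l1*N1 - l2*N2; adequate for a training illustration
            provided the time step stays small against both half-lives.
            """
            dt = t_end / steps
            n1 = n2 = 0.0
            history = []
            for i in range(steps):
                n1 += (production - lambda1 * n1) * dt
                n2 += (lambda1 * n1 - lambda2 * n2) * dt
                history.append(((i + 1) * dt, n1, n2))
            return history

        # Constant production; both nuclides approach saturation levels
        for t, n1, n2 in decay_chain(1e-2, 1e-1, production=1.0, t_end=300.0)[::250]:
            print(f"t={t:6.1f}  N1={n1:8.2f}  N2={n2:7.2f}")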

  20. Computational neuroscience for advancing artificial intelligence

    Directory of Open Access Journals (Sweden)

    Fernando P. Ponce

    2011-07-01

    Full Text Available. Summary of the book by Alonso, E. and Mondragón, E. (2011), Hershey, NY: Medical Information Science Reference. Neuroscience as a discipline pursues an understanding of the brain and its relation to the functioning of the mind through analysis of the interaction of diverse physical, chemical, and biological processes (Bassett & Gazzaniga, 2011). At the same time, numerous disciplines have progressively made significant contributions to this endeavour, such as mathematics, psychology, and philosophy, among others. As a product of this effort, complementary disciplines such as cognitive neuroscience, neuropsychology, and computational neuroscience have appeared alongside traditional neuroscience (Bengio, 2007; Dayan & Abbott, 2005). In the context of computational neuroscience as a discipline complementary to traditional neuroscience, Alonso and Mondragón (2011) have edited the book Computational Neuroscience for Advancing Artificial Intelligence: Models, Methods and Applications.

  1. A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION

    Energy Technology Data Exchange (ETDEWEB)

    Mike Bockelie; Dave Swensen; Martin Denison; Adel Sarofim; Connie Senior

    2004-12-22

    , immersive environment. The Virtual Engineering Framework (VEF), in effect a prototype framework, was developed through close collaboration with NETL supported research teams from Iowa State University Virtual Reality Applications Center (ISU-VRAC) and Carnegie Mellon University (CMU). The VEF is open source, compatible across systems ranging from inexpensive desktop PCs to large-scale, immersive facilities and provides support for heterogeneous distributed computing of plant simulations. The ability to compute plant economics through an interface that coupled the CMU IECM tool to the VEF was demonstrated, and the ability to couple the VEF to Aspen Plus, a commercial flowsheet modeling tool, was demonstrated. Models were interfaced to the framework using VES-Open. Tests were performed for interfacing CAPE-Open-compliant models to the framework. Where available, the developed models and plant simulations have been benchmarked against data from the open literature. The VEF has been installed at NETL. The VEF provides simulation capabilities not available in commercial simulation tools. It provides DOE engineers, scientists, and decision makers with a flexible and extensible simulation system that can be used to reduce the time, technical risk, and cost to develop the next generation of advanced, coal-fired power systems that will have low emissions and high efficiency. Furthermore, the VEF provides a common simulation system that NETL can use to help manage Advanced Power Systems Research projects, including both combustion- and gasification-based technologies.

  2. Computer simulation of spacecraft/environment interaction

    CERN Document Server

    Krupnikov, K K; Mileev, V N; Novikov, L S; Sinolits, V V

    1999-01-01

    This report presents some examples of computer simulation of spacecraft interaction with the space environment. We analysed a set of data on electron and ion fluxes measured in 1991-1994 on the geostationary satellite GORIZONT-35. The influence of spacecraft eclipse and device eclipse by the solar-cell panel on spacecraft charging was investigated. A simple method was developed for the estimation of spacecraft potentials in LEO. Effects of various particle flux impacts and spacecraft orientation are discussed. A computer engineering model for the calculation of space radiation is presented. This model is used as a client/server model with a WWW interface, including spacecraft model description and results representation based on the virtual reality markup language.

  3. Computer simulation of spacecraft/environment interaction

    International Nuclear Information System (INIS)

    This report presents some examples of computer simulation of spacecraft interaction with the space environment. We analysed a set of data on electron and ion fluxes measured in 1991-1994 on the geostationary satellite GORIZONT-35. The influence of spacecraft eclipse and device eclipse by the solar-cell panel on spacecraft charging was investigated. A simple method was developed for the estimation of spacecraft potentials in LEO. Effects of various particle flux impacts and spacecraft orientation are discussed. A computer engineering model for the calculation of space radiation is presented. This model is used as a client/server model with a WWW interface, including spacecraft model description and results representation based on the virtual reality markup language

  4. Fast computation algorithms for speckle pattern simulation

    Energy Technology Data Exchange (ETDEWEB)

    Nascov, Victor; Samoilă, Cornel; Ursuţiu, Doru [Transylvania University of Braşov (Romania)]

    2013-11-13

    We present our development of a series of efficient computation algorithms, generally usable to calculate light diffraction and particularly for speckle pattern simulation. We use mainly the scalar diffraction theory in the form of the Rayleigh-Sommerfeld diffraction formula and its Fresnel approximation. Our algorithms are based on a special form of the convolution theorem and the Fast Fourier Transform. They are able to evaluate the diffraction formula much faster than by direct computation, and we have circumvented the restrictions regarding the relative sizes of the input and output domains encountered in commonly used procedures. Moreover, the input and output planes can be tilted with respect to each other, and the output domain can be off-axis shifted.
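
    As a hedged sketch of the general convolution-theorem approach (not the authors' algorithms, which additionally remove the sampling restrictions and handle tilted, off-axis output planes; the function and example values are hypothetical), the Fresnel approximation can be evaluated with two FFTs and a transfer-function multiplication:

        import numpy as np

        def fresnel_propagate(field, wavelength, z, dx):
            """Propagate a sampled complex field a distance z (Fresnel regime).

            Convolution-theorem form: FFT the field, multiply by the
            quadratic-phase transfer function, inverse FFT -- far cheaper
            than direct evaluation of the diffraction integral.
            """
            n = field.shape[0]
            fx = np.fft.fftfreq(n, d=dx)                  # spatial frequencies
            FX, FY = np.meshgrid(fx, fx)
            H = np.exp(-1j * np.pi * wavelength * z * (FX**2 + FY**2))
            return np.fft.ifft2(np.fft.fft2(field) * H)

        # Example: 1 mm square aperture, 633 nm light, 0.5 m propagation
        n, dx = 512, 10e-6
        x = (np.arange(n) - n // 2) * dx
        X, Y = np.meshgrid(x, x)
        aperture = ((np.abs(X) < 0.5e-3) & (np.abs(Y) < 0.5e-3)).astype(complex)
        intensity = np.abs(fresnel_propagate(aperture, 633e-9, 0.5, dx))**2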

  5. Fast computation algorithms for speckle pattern simulation

    International Nuclear Information System (INIS)

    We present our development of a series of efficient computation algorithms, generally usable to calculate light diffraction and particularly for speckle pattern simulation. We use mainly the scalar diffraction theory in the form of the Rayleigh-Sommerfeld diffraction formula and its Fresnel approximation. Our algorithms are based on a special form of the convolution theorem and the Fast Fourier Transform. They are able to evaluate the diffraction formula much faster than by direct computation, and we have circumvented the restrictions regarding the relative sizes of the input and output domains encountered in commonly used procedures. Moreover, the input and output planes can be tilted with respect to each other, and the output domain can be off-axis shifted

  6. Advancements in Afterbody Radiative Heating Simulations for Earth Entry

    Science.gov (United States)

    Johnston, Christopher O.; Panesi, Marco; Brandis, Aaron M.

    2016-01-01

    Four advancements to the simulation of backshell radiative heating for Earth entry are presented. The first of these is the development of a flow field model that treats electronic levels of the dominant backshell radiator, N, as individual species. This is shown to allow improvements in the modeling of electron-ion recombination and two-temperature modeling, which are shown to increase backshell radiative heating by 10 to 40%. By computing the electronic state populations of N within the flow field solver, instead of through the quasi-steady state approximation in the radiation code, the coupling of radiative transition rates to the species continuity equations for the levels of N, including the impact of non-local absorption, becomes feasible. Implementation of this additional level of coupling between the flow field and radiation codes represents the second advancement presented in this work, which is shown to increase the backshell radiation by another 10 to 50%. The impact of radiative transition rates due to non-local absorption indicates the importance of accurate radiation transport in the relatively complex flow geometry of the backshell. This motivates the third advancement, which is the development of a ray-tracing radiation transport approach to compute the radiative transition rates and divergence of the radiative flux at every point for coupling to the flow field, therefore allowing the accuracy of the commonly applied tangent-slab approximation to be assessed for radiative source terms. For the sphere considered at lunar-return conditions, the tangent-slab approximation is shown to provide a sufficient level of accuracy for the radiative source terms, even for backshell cases. This is in contrast to the radiative flux to the surface, for which the two approaches differ by up to 40%. The final advancement presented is the development of a nonequilibrium model for NO radiation, which provides significant backshell

  7. Simulation of advanced ultrasound systems using Field II

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt

    2004-01-01

    impulse responses is explained. A simulation example for a synthetic aperture spread spectrum flow system is described. It is shown how the advanced coded excitation can be set up, and how the simulation can be parallelized to reduce the simulation time from 17 months to 391 hours using a 32 CPU Linux...

  8. Molecular physiology of rhodopsin: Computer simulation

    Science.gov (United States)

    Fel'Dman, T. B.; Kholmurodov, Kh. T.; Ostrovsky, M. A.

    2008-03-01

    Computer simulation is used for comparative investigation of the molecular dynamics of rhodopsin containing the chromophore group (11-cis-retinal) and free opsin. Molecular dynamics is traced within a time interval of 3000 ps; 3 × 10^6 discrete conformational states of rhodopsin and opsin are obtained and analyzed. It is demonstrated that the presence of the chromophore group in the chromophore center of opsin influences considerably the nearest protein environment of 11-cis-retinal, both in the region of the β-ionone ring and in the region of the protonated Schiff base bond. Based on simulation results, a possible intramolecular mechanism of keeping rhodopsin as a G-protein-coupled receptor in the inactive state, i.e., the chromophore function as an efficient ligand antagonist, is discussed.

  9. COMPUTER SIMULATION OF POLYMER SOLUTION THERMODYNAMICS

    Institute of Scientific and Technical Information of China (English)

    1998-01-01

    The statistical counting method for the computer simulation of the thermodynamic quantities of polymer solutions is reviewed. The calculated results for a single athermal chain confirm renormalization group theory. The results for the athermal solution are consistent with the scaling law of the osmotic pressure with the exponent 2.25. The results for a single chain with segmental interactions are in good agreement with the exact results obtained by the direct counting method. The results for the polymer solution show that the Flory-Huggins parameter is strongly dependent on both the polymer concentration and the interaction energy between segments.
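
    As a hedged illustration of the direct counting method against which the statistical counting results are compared (a toy enumeration, not the reviewed method itself), self-avoiding chain conformations on a square lattice can be counted exactly for short chains:

        def count_saws(n_steps):
            """Count self-avoiding walks of n_steps on the square lattice.

            Recursive enumeration of chain conformations; feasible only for
            short chains, which is why statistical (sampling-based) counting
            methods are needed for longer ones.
            """
            moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]

            def extend(path, visited):
                if len(path) == n_steps + 1:
                    return 1
                x, y = path[-1]
                total = 0
                for dx, dy in moves:
                    nxt = (x + dx, y + dy)
                    if nxt not in visited:
                        visited.add(nxt)
                        path.append(nxt)
                        total += extend(path, visited)
                        path.pop()
                        visited.remove(nxt)
                return total

            return extend([(0, 0)], {(0, 0)})

        for n in range(1, 8):
            print(n, count_saws(n))   # 4, 12, 36, 100, 284, 780, 2172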

  10. Computer simulation of contaminated soil bioremediation

    International Nuclear Information System (INIS)

    A mathematical model has been developed and simulated to describe contaminated soil bioremediation. The model equations consist of a system of three nonlinear partial differential equations. Dimensional analysis of the model equations has been performed, and solution of these equations has been conducted by an implicit finite difference method. A computer program was run to solve the model equations, and by using this program the influence of the principal parameters (porosity, soil aggregate radius, and partition coefficient of the substrate) on the fate of chemicals has been studied. The rates of substrate and oxygen diffusion and the biodegradation rate have been found to be the controlling mechanisms for remediation in the aggregates
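
    As a hedged sketch of the implicit finite difference method mentioned above (a single linear diffusion-degradation equation rather than the authors' coupled system of three nonlinear PDEs; all names and values are hypothetical), one backward-Euler step reduces to a tridiagonal linear solve:

        import numpy as np

        def implicit_step(c, D, k, dx, dt):
            """One backward-Euler step of dc/dt = D*d2c/dx2 - k*c on a 1-D grid.

            Assembles (I + dt*(k*I - D*L)) c_new = c_old, where L is the
            discrete Laplacian, and solves it; the implicit scheme stays
            stable even when degradation makes the system stiff.
            """
            n = len(c)
            r = D * dt / dx**2
            A = np.zeros((n, n))
            for i in range(n):
                A[i, i] = 1 + 2 * r + k * dt
                if i > 0:
                    A[i, i - 1] = -r
                if i < n - 1:
                    A[i, i + 1] = -r
            A[0, 0] = A[-1, -1] = 1 + r + k * dt      # zero-flux boundaries
            return np.linalg.solve(A, c)

        # A substrate plume decays while it spreads through an aggregate
        c = np.zeros(50)
        c[20:30] = 1.0
        for _ in range(100):
            c = implicit_step(c, D=1e-2, k=0.05, dx=0.1, dt=0.1)
        print(round(float(c.max()), 4))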

  11. Computer Simulation Studies of Gramicidin Channel

    Science.gov (United States)

    Song, Hyundeok; Beck, Thomas

    2009-04-01

    Ion channels are large membrane proteins, and their function is to facilitate the passage of ions across biological membranes. Recently, Dr. John Cuppoletti's group at UC showed that the gramicidin channel could function at high temperatures (360 -- 390K) with significant currents. This finding may have large implications for fuel cell technology. In this project, we will examine the experimental system by computer simulation. We will investigate how the temperature affects the current and differences in magnitude of the currents between two forms of Gramicidin, A and D. This research will help to elucidate the underlying molecular mechanism in this promising new technology.

  12. Computational model for protein unfolding simulation

    Science.gov (United States)

    Tian, Xu-Hong; Zheng, Ye-Han; Jiao, Xiong; Liu, Cai-Xing; Chang, Shan

    2011-06-01

    The protein folding problem is one of the fundamental and important questions in molecular biology. However, all-atom molecular dynamics studies of protein folding and unfolding are still computationally expensive and severely limited by the time scale of simulation. In this paper, a simple and fast protein unfolding method is proposed based on conformational stability analyses and structure modeling. In this method, two structure-based conditions are considered to identify the unstable regions of proteins during the unfolding processes. The protein unfolding trajectories are mimicked through iterative structure modeling according to conformational stability analyses. Two proteins, chymotrypsin inhibitor 2 (CI2) and the α-spectrin SH3 domain (SH3), were simulated by this method. Their unfolding pathways are consistent with previous molecular dynamics simulations. Furthermore, the transition states of the two proteins were identified in the unfolding processes, and the theoretical Φ values of these transition states showed significant correlations with the experimental data (the correlation coefficients are >0.8). The results indicate that this method is effective in studying protein unfolding. Moreover, we analyzed and discussed the influence of parameters on the unfolding simulation. This simple coarse-grained model may provide a general and fast approach for mechanism studies of protein folding.

  13. Computer simulation of metal-organic materials

    Science.gov (United States)

    Stern, Abraham C.

    Computer simulations of metal-organic frameworks are conducted to both investigate the mechanism of hydrogen sorption and to elucidate a detailed, molecular-level understanding of the physical interactions that can lead to successful material design strategies. To this end, important intermolecular interactions are identified and individually parameterized to yield a highly accurate representation of the potential energy landscape. Polarization, one such interaction found to play a significant role in H2 sorption, is included explicitly for the first time in simulations of metal-organic frameworks. Permanent electrostatics are usually accounted for by means of an approximate fit to model compounds. The application of this method to simulations involving metal-organic frameworks introduces several substantial problems that are characterized in this work. To circumvent this, a method is developed and tested in which atomic point partial charges are computed more directly, fit to the fully periodic electrostatic potential. In this manner, long-range electrostatics are explicitly accounted for via Ewald summation. Grand canonical Monte Carlo simulations are conducted employing the force field parameterization developed here. Several of the major findings of this work are: Polarization is found to play a critical role in determining the overall structure of H2 sorbed in metal-organic frameworks, although not always the determining factor in uptake. The parameterization of atomic point charges by means of a fit to the periodic electrostatic potential is a robust, efficient method and consistently results in a reliable description of Coulombic interactions without introducing ambiguity associated with other procedures. After careful development of both hydrogen and framework potential energy functions, quantitatively accurate results have been obtained. Such predictive accuracy will aid greatly in the rational, iterative design cycle between experimental and theoretical
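
    As a hedged, minimal illustration of the grand canonical Monte Carlo moves on which such sorption simulations rest (a toy lattice-gas model, not the polarizable force field described above; all names and values are hypothetical), insertions and deletions are accepted with the standard muVT criteria:

        import math
        import random

        def gcmc_lattice_gas(n_sites, mu, eps, beta, n_steps, seed=7):
            """Toy grand canonical MC for adsorption on independent sites.

            Insertion at an empty site is accepted with probability
            min(1, exp(beta*(mu - eps))); deletion with the reciprocal
            factor. `eps` is the adsorption energy of an occupied site.
            """
            random.seed(seed)
            occupied = set()
            for _ in range(n_steps):
                if random.random() < 0.5:                   # attempt insertion
                    site = random.randrange(n_sites)
                    if site not in occupied and \
                            random.random() < min(1.0, math.exp(beta * (mu - eps))):
                        occupied.add(site)
                elif occupied:                              # attempt deletion
                    site = random.choice(tuple(occupied))
                    if random.random() < min(1.0, math.exp(-beta * (mu - eps))):
                        occupied.remove(site)
            return len(occupied)

        # Loading rises with chemical potential: a qualitative uptake isotherm
        for mu in (-2.0, -1.0, 0.0):
            print(mu, gcmc_lattice_gas(1000, mu, eps=-1.0, beta=1.0, n_steps=20000))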

  14. Advances in FDTD computational electrodynamics photonics and nanotechnology

    CERN Document Server

    Oskooi, Ardavan; Johnson, Steven G

    2013-01-01

    Advances in photonics and nanotechnology have the potential to revolutionize humanity's ability to communicate and compute. To pursue these advances, it is mandatory to understand and properly model interactions of light with materials such as silicon and gold at the nanoscale, i.e., the span of a few tens of atoms laid side by side. These interactions are governed by the fundamental Maxwell's equations of classical electrodynamics, supplemented by quantum electrodynamics. This book presents the current state-of-the-art in formulating and implementing computational models of these interactions. Maxwell's equations are solved using the finite-difference time-domain (FDTD) technique, pioneered by the senior editor, whose prior Artech books in this area are among the top ten most-cited in the history of engineering. You discover the most important advances in all areas of FDTD and PSTD computational modeling of electromagnetic wave interactions. This cutting-edge resource helps you understand the latest develo...

  15. COMPUTER SIMULATIONS IN SCIENCE EDUCATION: Implications for Distance Education

    OpenAIRE

    SAHIN, Sami

    2006-01-01

    This paper is a review of the literature about the use of computer simulations in science education. This review examines the types and good examples of computer simulations. The literature review indicated that although computer simulations cannot replace science classroom and laboratory activities completely, they offer various advantages both for classroom and distance education. This paper consists of four parts. The first part describes computer simulations; the second part reviews the ...

  16. Reliability of an Interactive Computer Program for Advance Care Planning

    OpenAIRE

    Schubart, Jane R.; Levi, Benjamin H.; Camacho, Fabian; Whitehead, Megan; Farace, Elana; Green, Michael J.

    2012-01-01

    Despite widespread efforts to promote advance directives (ADs), completion rates remain low. Making Your Wishes Known: Planning Your Medical Future (MYWK) is an interactive computer program that guides individuals through the process of advance care planning, explains health conditions and interventions that commonly involve life-or-death decisions, helps them articulate their values/goals, and translates users' preferences into a detailed AD document. The purpose of this study was to demon...

  17. The Advanced Computational Methods Center, University of Georgia

    OpenAIRE

    Nute, Donald; Covington, Michael; Rankin, Terry

    1986-01-01

    The Advanced Computational Methods Center (ACMC) established at the University of Georgia in 1984, supports several research projects in artificial intelligence. The primary goal of AI research at ACMC is the design and installation of a logic-programming environment with advanced natural language processing and knowledge-acquisition capabilities on the university's highly parallel CYBERPLUS system from Control Data Corporation. This article briefly describes current research projects in arti...

  18. Recent advances in nuclear power plant simulation

    International Nuclear Information System (INIS)

    The field of industrial simulation has experienced very significant progress in recent years, and power plant simulation in particular has been an extremely active area. Improvements may be recorded in practically all simulator subsystems. In Europe, the construction of new full- or optimized-scope nuclear power plant simulators during the mid-1990s has been remarkably intense. In fact, it is possible to identify a distinct simulator generation, which constitutes a new de facto simulation standard. Thomson Training and Simulation has taken part in these developments by designing, building, and validating several of these new simulators for Dutch, German and French nuclear power plants. Their characteristics are discussed in this paper. The following main trends may be identified: Process modeling is clearly evolving towards obtaining engineering-grade performance, even under the added constraints of real-time operation and a very wide range of operating conditions to be covered; Massive use of modern graphic user interfaces (GUI) ensures an unprecedented flexibility and user-friendliness for the Instructor Station; The massive use of GUIs also allows the development of Trainee Stations (TS), which significantly enhance the in-depth training value of the simulators; The development of powerful Software Development Environments (SDE) enables the simulator maintenance teams to keep abreast of modifications carried out in the reference plants; Finally, simulator maintenance and its compliance with simulator fidelity requirements are greatly enhanced by integrated Configuration Management Systems (CMS). In conclusion, the power plant simulation field has attained a strong level of maturity, which befits its approximately forty years of service to the power generation industry. (author)

  19. Scientific and computational challenges of the fusion simulation program (FSP)

    International Nuclear Information System (INIS)

    This paper highlights the scientific and computational challenges facing the Fusion Simulation Program (FSP) - a major national initiative in the United States with the primary objective being to enable scientific discovery of important new plasma phenomena with associated understanding that emerges only upon integration. This requires developing a predictive integrated simulation capability for magnetically-confined fusion plasmas that are properly validated against experiments in regimes relevant for producing practical fusion energy. It is expected to provide a suite of advanced modeling tools for reliably predicting fusion device behavior with comprehensive and targeted science-based simulations of nonlinearly-coupled phenomena in the core plasma, edge plasma, and wall region on time and space scales required for fusion energy production. As such, it will strive to embody the most current theoretical and experimental understanding of magnetic fusion plasmas and to provide a living framework for the simulation of such plasmas as the associated physics understanding continues to advance over the next several decades. Substantive progress on answering the outstanding scientific questions in the field will drive the FSP toward its ultimate goal of developing the ability to predict the behavior of plasma discharges in toroidal magnetic fusion devices with high physics fidelity on all relevant time and space scales. From a computational perspective, this will demand computing resources in the petascale range and beyond together with the associated multi-core algorithmic formulation needed to address burning plasma issues relevant to ITER - a multibillion dollar collaborative experiment involving seven international partners representing over half the world's population. Even more powerful exascale platforms will be needed to meet the future challenges of designing a demonstration fusion reactor (DEMO). Analogous to other major applied physics modeling projects (e

  1. Advanced Simulation Capability for Environmental Management: Development and Demonstrations - 12532

    International Nuclear Information System (INIS)

    The U.S. Department of Energy Office of Environmental Management (EM), Technology Innovation and Development is supporting development of the Advanced Simulation Capability for Environmental Management (ASCEM). ASCEM is a state-of-the-art scientific tool and approach for understanding and predicting contaminant fate and transport in natural and engineered systems. The modular and open-source high-performance computing tool facilitates integrated approaches to modeling and site characterization that enable robust and standardized assessments of performance and risk for EM cleanup and closure activities. The ASCEM project continues to make significant progress in the development of capabilities, which are organized into Platform and Integrated Tool-sets and a High-Performance Computing Multi-process Simulator. The Platform capabilities target a level of functionality that allows end-to-end model development, starting with definition of the conceptual model and management of data for model input. The High-Performance Computing capabilities target increased functionality of process model representations, tool-sets for interaction with the Platform, and verification and model confidence testing. The new capabilities are demonstrated through working groups, including one focused on the Hanford Site Deep Vadose Zone. The ASCEM program focused on planning during the first year and on executing a prototype tool-set for an early demonstration of individual components. Subsequently, ASCEM has focused on developing and demonstrating an integrated set of capabilities, making progress toward a version of the capabilities that can be used to engage end users. Demonstration of capabilities continues to be implemented through working groups. Three different working groups, one focused on EM problems in the deep vadose zone, another investigating attenuation mechanisms for metals and radionuclides, and a third focusing on waste tank performance assessment, continue to make progress.

  2. COMPUTER MODEL AND SIMULATION OF A GLOVE BOX PROCESS

    Energy Technology Data Exchange (ETDEWEB)

    C. FOSTER; ET AL

    2001-01-01

    indicative of most glove box operations and demonstrates the ability and advantages of advanced computer-based modeling. The three-dimensional model also enables better comprehension of problems by non-technical staff. There are many barriers to the seamless integration between the initial design specifications and a computer simulation. Problems include the lack of a standard model and inexact manufacturing of components used in the glove box. The benefits and drawbacks are discussed; however, the results are useful.

  3. 9th International Conference on Advanced Computing & Communication Technologies

    CERN Document Server

    Mandal, Jyotsna; Auluck, Nitin; Nagarajaram, H

    2016-01-01

    This book highlights a collection of high-quality peer-reviewed research papers presented at the Ninth International Conference on Advanced Computing & Communication Technologies (ICACCT-2015) held at Asia Pacific Institute of Information Technology, Panipat, India during 27–29 November 2015. The book discusses a wide variety of industrial, engineering and scientific applications of the emerging techniques. Researchers from academia and industry present their original work and exchange ideas, information, techniques and applications in the field of Advanced Computing and Communication Technology.

  4. Computer simulations for the nano-scale

    International Nuclear Information System (INIS)

    A review of methods for computations for the nano-scale is presented. The paper should provide a convenient starting point into computations for the nano-scale as well as a more in-depth presentation for those already working in the field of atomic/molecular-scale modeling. The argument is divided into chapters covering the methods for description of the (i) electrons, (ii) ions, and (iii) techniques for efficient solving of the underlying equations. A fairly broad view is taken, covering the Hartree-Fock approximation, density functional techniques and quantum Monte Carlo techniques for electrons. The customary quantum chemistry methods, such as post-Hartree-Fock techniques, are only briefly mentioned. Description of both classical and quantum ions is presented. The techniques cover Ehrenfest, Born-Oppenheimer, and Car-Parrinello dynamics. The strong and weak points of both principal and technical nature are analyzed. In the second part we introduce a number of applications to demonstrate the different approximations and techniques introduced in the first part. They cover a wide range of applications such as non-simple liquids, surfaces, molecule-surface interactions, applications in nanotechnology, etc. These more in-depth presentations, while certainly not exhaustive, should provide information on technical aspects of the simulations, typical parameters used, and ways of analysis of the huge amounts of data generated in these large-scale supercomputer simulations. (author)
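
    As a concrete illustration of the classical-ion techniques surveyed in this record (molecular dynamics with a pair potential, integrated by velocity Verlet), the following minimal sketch evolves a small Lennard-Jones cluster. It is not code from the review; the potential parameters, time step, and cluster geometry are invented for illustration.

```python
import numpy as np

def lj_forces(pos, eps=1.0, sig=1.0):
    """Pairwise Lennard-Jones forces for a small cluster (O(N^2), no cutoff)."""
    f = np.zeros_like(pos)
    n = len(pos)
    for i in range(n):
        for j in range(i + 1, n):
            r = pos[i] - pos[j]
            d2 = r @ r
            inv6 = (sig * sig / d2) ** 3
            # derivative of V(r) = 4*eps*((sig/r)^12 - (sig/r)^6), per unit vector
            fac = 24.0 * eps * (2.0 * inv6 * inv6 - inv6) / d2
            f[i] += fac * r
            f[j] -= fac * r
    return f

def velocity_verlet(pos, vel, dt=2e-3, steps=2000, mass=1.0):
    """Integrate Newton's equations of motion with the velocity Verlet scheme."""
    f = lj_forces(pos)
    for _ in range(steps):
        vel += 0.5 * dt * f / mass
        pos += dt * vel
        f = lj_forces(pos)
        vel += 0.5 * dt * f / mass
    return pos, vel

# 8 atoms on a slightly stretched cube, started at rest
pos = np.array([[i, j, k] for i in range(2) for j in range(2) for k in range(2)],
               dtype=float) * 1.12
vel = np.zeros_like(pos)
pos, vel = velocity_verlet(pos, vel)
```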

  6. A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION

    Energy Technology Data Exchange (ETDEWEB)

    Mike Bockelie; Dave Swensen; Martin Denison; Connie Senior; Adel Sarofim; Bene Risio

    2002-07-28

    This is the seventh Quarterly Technical Report for DOE Cooperative Agreement No.: DE-FC26-00NT41047. The goal of the project is to develop and demonstrate a computational workbench for simulating the performance of Vision 21 Power Plant Systems. Within the last quarter, good progress has been made on the development of the IGCC workbench. A series of parametric CFD simulations for single-stage and two-stage generic gasifier configurations have been performed. An advanced flowing slag model has been implemented into the CFD-based gasifier model. A literature review has been performed on published gasification kinetics. Reactor models have been developed and implemented into the workbench for the majority of the heat exchangers, gas clean-up system and power generation system for the Vision 21 reference configuration. Modifications to the software infrastructure of the workbench have commenced to allow interfacing to the workbench reactor models that utilize the CAPE-Open software interface protocol.

  7. Computational simulation for concurrent engineering of aerospace propulsion systems

    Science.gov (United States)

    Chamis, C. C.; Singhal, S. N.

    1993-01-01

    Results are summarized for an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulation methods for concurrent engineering is timely. Extensive infrastructure, in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties, fundamental to developing such methods, is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering of propulsion systems and systems in general. Benefits and issues needing early attention in the development are outlined.

  8. Computational simulation of concurrent engineering for aerospace propulsion systems

    Science.gov (United States)

    Chamis, C. C.; Singhal, S. N.

    1992-01-01

    Results are summarized of an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulation methods for concurrent engineering is timely. Extensive infrastructure, in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties, fundamental in developing such methods, is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering for propulsion systems and systems in general. Benefits and facets needing early attention in the development are outlined.

  9. Computer simulations of phase separation in quenched polymer solutions

    OpenAIRE

    Geeter, de, Bastiaan Alexander

    2002-01-01

    This thesis describes the computational study of phase separation in polymer solutions after a temperature quench. For the simulation of this process, Cahn-Hilliard simulations and Bond-Fluctuation Monte Carlo simulations were performed.
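
    For readers unfamiliar with the first of the two methods mentioned, a Cahn-Hilliard simulation can be written in a few lines on a periodic 1D grid. The sketch below uses an explicit finite-difference scheme with the standard double-well bulk free energy, so the chemical potential is mu = c^3 - c - kappa*c''; all parameters are illustrative rather than taken from the thesis.

```python
import numpy as np

def laplacian(u, dx):
    """Second derivative on a periodic grid."""
    return (np.roll(u, 1) + np.roll(u, -1) - 2.0 * u) / dx**2

def cahn_hilliard(c, dt=1e-3, dx=1.0, M=1.0, kappa=1.0, steps=50_000):
    """Explicit stepping of dc/dt = M * Laplacian(c^3 - c - kappa*Laplacian(c))."""
    for _ in range(steps):
        mu = c**3 - c - kappa * laplacian(c, dx)   # chemical potential
        c = c + dt * M * laplacian(mu, dx)
    return c

rng = np.random.default_rng(1)
c0 = 0.01 * rng.standard_normal(256)   # small fluctuations around c = 0 (deep quench)
c = cahn_hilliard(c0)                  # coarsened domains of c ~ +1 and c ~ -1 emerge
```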

  10. Virtual Environments for Advanced Trainers and Simulators

    NARCIS (Netherlands)

    Jense, G.J.; Kuijper, F.

    1993-01-01

    Virtual environment technology is expected to make a big impact on future training and simulation systems. Direct stimulation of human senses (eyesight, auditory, tactile) and new paradigms for user input will improve the realism of simulations and thereby the effectiveness of training systems.

  11. Computational simulation of compact toroidal plasma formation

    International Nuclear Information System (INIS)

    The following computational efforts are part of the MARAUDER (magnetically accelerated rings to achieve ultra-high directed energy and radiation) research program at the High Energy Plasma Division of the Weapons Laboratory. The program is investigating plasma toroids with magnetic fields similar to those of tokamaks. These fields confine the plasma between a pair of cylindrical conductors. The objective of the research is to first form such toroids and then compress and accelerate them. A 500 kJ capacitor bank will be used for the formation, and the 9 MJ Shiva Star will be used for acceleration. The first set of experiments and current computational work consider only the formation process. The computer program used for these simulations is MACH2. It is a two-dimensional MHD code and was originally developed by Mission Research Corporation under a Weapons Laboratory contract to support z-pinch research. MACH2 is an Arbitrary Lagrangian-Eulerian code with an adaptive mesh capability. Its diffusion routines use a multigrid technique to accelerate convergence. Recently, a second-order advection scheme has been added

  12. Computer Simulation of the UMER Gridded Gun

    CERN Document Server

    Haber, Irving; Friedman, Alex; Grote, D P; Kishek, Rami A; Reiser, Martin; Vay, Jean-Luc; Zou, Yun

    2005-01-01

    The electron source in the University of Maryland Electron Ring (UMER) injector employs a grid 0.15 mm from the cathode to control the current waveform. Under nominal operating conditions, the grid voltage during the current pulse is sufficiently positive relative to the cathode potential to form a virtual cathode downstream of the grid. Three-dimensional computer simulations have been performed that use the mesh refinement capability of the WARP particle-in-cell code to examine a small region near the beam center in order to illustrate some of the complexity that can result from such a gridded structure. These simulations have been found to reproduce the hollowed velocity space that is observed experimentally. The simulations also predict a complicated time-dependent response to the waveform applied to the grid during the current turn-on. This complex temporal behavior appears to result directly from the dynamics of the virtual cathode formation and may therefore be representative of the expected behavior in...

  13. Computer simulation of surface and film processes

    Science.gov (United States)

    Tiller, W. A.; Halicioglu, M. T.

    1984-01-01

    All the investigations which were performed employed in one way or another a computer simulation technique based on atomistic-level considerations. In general, three types of simulation methods were used for modeling systems with discrete particles that interact via well-defined potential functions: molecular dynamics (a general method for solving the classical equations of motion of a model system); Monte Carlo (the use of a Markov chain ensemble-averaging technique to model equilibrium properties of a system); and molecular statics (which provides properties of a system at T = 0 K). The effects of three-body forces on the vibrational frequencies of a triatomic cluster were investigated. The multilayer relaxation phenomena for low-index planes of an fcc crystal were analyzed, also as a function of the three-body interactions. Various surface properties for the Si and SiC systems were calculated. Results obtained from static simulation calculations for slip formation were presented. The more elaborate molecular dynamics calculations on the propagation of cracks in two-dimensional systems were outlined.
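
    The Monte Carlo method mentioned above reduces, in its simplest Metropolis form, to random-walk sampling of the Boltzmann distribution. The sketch below applies it to a small cluster with a Lennard-Jones pair potential; a three-body term of the kind studied here would simply be added to the energy function. The potential, temperature, and step size are illustrative, not the study's values.

```python
import numpy as np

def cluster_energy(pos, eps=1.0, sig=1.0):
    """Total pairwise Lennard-Jones energy; a three-body term could be added here."""
    e = 0.0
    n = len(pos)
    for i in range(n):
        for j in range(i + 1, n):
            r2 = np.sum((pos[i] - pos[j]) ** 2)
            inv6 = (sig * sig / r2) ** 3
            e += 4.0 * eps * (inv6 * inv6 - inv6)
    return e

def metropolis(pos, beta=2.0, delta=0.1, sweeps=5000, seed=0):
    """Metropolis sampling of cluster configurations at inverse temperature beta."""
    rng = np.random.default_rng(seed)
    e = cluster_energy(pos)
    for _ in range(sweeps):
        i = rng.integers(len(pos))
        trial = pos.copy()
        trial[i] += rng.uniform(-delta, delta, size=3)   # displace one atom
        e_trial = cluster_energy(trial)
        if rng.random() < np.exp(min(0.0, -beta * (e_trial - e))):
            pos, e = trial, e_trial                      # accept the move
    return pos, e

pos0 = np.array([[0, 0, 0], [1.1, 0, 0], [0.55, 0.95, 0]], dtype=float)  # trimer
pos, e = metropolis(pos0)
```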

  14. Advanced computational methods for the assessment of reactor core behaviour during reactivity initiated accidents. Final report

    International Nuclear Information System (INIS)

    The document at hand serves as the final report for the reactor safety research project RS1183, 'Advanced Computational Methods for the Assessment of Reactor Core Behavior During Reactivity-Initiated Accidents'. The work performed in the framework of this project was dedicated to the development, validation and application of advanced computational methods for the simulation of transients and accidents at nuclear installations. These simulation tools describe in particular the behavior of the reactor core (with respect to neutronics, thermal-hydraulics and thermal mechanics) at a very high level of detail. The overall goal of this project was the deployment of a modern nuclear computational chain which provides, besides advanced 3D tools for coupled neutronics/thermal-hydraulics full-core calculations, also appropriate tools for the generation of multi-group cross sections and Monte Carlo models for the verification of the individual calculational steps. This computational chain is primarily to be deployed for light water reactors (LWRs), but should also be applicable to innovative reactor concepts. Thus, validation on computational benchmarks and critical experiments was of paramount importance. Finally, appropriate methods for uncertainty and sensitivity analysis were to be integrated into the computational framework, in order to assess and quantify the uncertainties due to insufficient knowledge of data, as well as due to methodological aspects.

  15. Simulation of chemical reaction dynamics on an NMR quantum computer

    CERN Document Server

    Lu, Dawei; Xu, Ruixue; Chen, Hongwei; Gong, Jiangbin; Peng, Xinhua; Du, Jiangfeng

    2011-01-01

    Quantum simulation can beat current classical computers with as few as a few tens of qubits and will likely become the first practical use of a quantum computer. One promising application of quantum simulation is to attack challenging quantum chemistry problems. Here we report an experimental demonstration that a small nuclear-magnetic-resonance (NMR) quantum computer is already able to simulate the dynamics of a prototype chemical reaction. The experimental results agree well with classical simulations. We conclude that the quantum simulation of chemical reaction dynamics not computable on current classical computers is feasible in the near future.
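
    For scale, the classical reference calculations that such an experiment is checked against can, for a prototype system, be as small as exponentiating a two-level Hamiltonian. The sketch below propagates population transfer between a "reactant" and a "product" state; the Hamiltonian and all parameters are invented for illustration and are not those of the reported experiment.

```python
import numpy as np
from scipy.linalg import expm

# Pauli matrices
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

# Toy "reaction" Hamiltonian (hbar = 1): an energy bias between reactant and
# product plus a tunnelling coupling that drives the transition.
H = 0.5 * sz + 0.25 * sx

dt, steps = 0.05, 400
U = expm(-1j * H * dt)                        # exact short-time propagator
psi = np.array([1.0, 0.0], dtype=complex)     # start fully in the "reactant" state
product_population = []
for _ in range(steps):
    psi = U @ psi
    product_population.append(abs(psi[1]) ** 2)   # Rabi-like oscillation in time
```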

  16. Design and verification of the integration of simulation environments, models of a nuclear power plant and advanced computation languages in the creation of multimedia applications for training and teaching

    International Nuclear Information System (INIS)

    The design process of a reliable and stable system for integrating the models that represent elements of a nuclear power plant with advanced programming environments on the Windows platform is presented. In particular, the integration of the model corresponding to the feedwater system and its associated controller into a graphics and control structure with graphic capabilities superior to those of existing desktop simulators is analyzed, mainly because it gives direct access to the drawing area and maximum execution speed. In turn, the capacity of the models to behave according to the expected response for that type of system is verified, and the responses found directly in the models are compared with those shown graphically. The characteristics that enable real-time execution are described, together with an overview of the diverse possibilities for representing the graphical interface. The capabilities of the simulation and programming environments used are also analyzed, highlighting the advantages and disadvantages that led to the chosen solution, considering the objective of supporting training and teaching. The design proposes a reliable methodology that can be used in the development of simulators, in the graphic demonstration of concepts, and in prototypes, among other applications. (Author)

  17. ADVANCE, a modular vehicle simulation environment in MATLAB/SIMULINK

    NARCIS (Netherlands)

    Eelkema, J.; Vink, W.; Tillaart, E. van den

    2002-01-01

    This paper presents the development of a hybrid electric powertrain test platform. In the development process, use has been made of ADVANCE, a modular vehicle simulation environment in MATLAB/Simulink. The background, philosophy, and concept of the ADVANCE tool are discussed, and a brief introduction is given.

  18. Computer-Assisted Foreign Language Teaching and Learning: Technological Advances

    Science.gov (United States)

    Zou, Bin; Xing, Minjie; Wang, Yuping; Sun, Mingyu; Xiang, Catherine H.

    2013-01-01

    Computer-Assisted Foreign Language Teaching and Learning: Technological Advances highlights new research and an original framework that brings together foreign language teaching, experiments and testing practices that utilize the most recent and widely used e-learning resources. This comprehensive collection of research will offer linguistic…

  19. Cluster computing for lattice QCD simulations

    International Nuclear Information System (INIS)

    Full text: Simulations of lattice quantum chromodynamics (QCD) require enormous amounts of compute power. In the past, this has usually involved sharing time on large, expensive machines at supercomputing centres. Over the past few years, clusters of networked computers have become very popular as a low-cost alternative to traditional supercomputers. The dramatic improvements in performance (and more importantly, the ratio of price/performance) of commodity PCs, workstations, and networks have made clusters of off-the-shelf computers an attractive option for low-cost, high-performance computing. A major advantage of clusters is that since they can have any number of processors, they can be purchased using any sized budget, allowing research groups to install a cluster for their own dedicated use, and to scale up to more processors if additional funds become available. Clusters are now being built for high-energy physics simulations. Wuppertal has recently installed ALiCE, a cluster of 128 Alpha workstations running Linux, with a peak performance of 158 Gflops. The Jefferson Laboratory in the US has a 16-node Alpha cluster and plans to upgrade to a 256-processor machine. In Australia, several large clusters have recently been installed. Swinburne University of Technology has a cluster of 64 Compaq Alpha workstations used for astrophysics simulations. Early this year our DHPC group constructed a cluster of 116 dual Pentium PCs (i.e. 232 processors) connected by a Fast Ethernet network, which is used by chemists at Adelaide University and Flinders University to run computational chemistry codes. The Australian National University has recently installed a similar PC cluster with 192 processors. The Centre for the Subatomic Structure of Matter (CSSM) undertakes large-scale high-energy physics calculations, mainly lattice QCD simulations. The choice of the computer and network hardware for a cluster depends on the particular applications to be run on the machine.
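
    To make the commodity-cluster idea concrete, the sketch below shows the basic communication pattern such lattice codes rely on: the lattice is split across MPI ranks, and each rank exchanges one-site "halos" with its neighbours every iteration. The use of mpi4py is an assumption for illustration (the clusters described above ran compiled codes over MPI), and the relaxation update is a stand-in for a real lattice QCD kernel.

```python
# Run with e.g.: mpirun -np 4 python halo_demo.py
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

nlocal = 64
u = np.random.default_rng(rank).random(nlocal + 2)  # interior sites + 2 ghost sites
left, right = (rank - 1) % size, (rank + 1) % size   # periodic ring of ranks

for _ in range(100):
    # exchange ghost cells with both neighbours (deadlock-free paired send/recv)
    comm.Sendrecv(u[1:2], dest=left, recvbuf=u[-1:], source=right)
    comm.Sendrecv(u[-2:-1], dest=right, recvbuf=u[0:1], source=left)
    # simple nearest-neighbour relaxation on the interior sites
    u[1:-1] = 0.5 * u[1:-1] + 0.25 * (u[:-2] + u[2:])
```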

  20. Innovations and Advances in Computer, Information, Systems Sciences, and Engineering

    CERN Document Server

    Sobh, Tarek

    2013-01-01

    Innovations and Advances in Computer, Information, Systems Sciences, and Engineering includes the proceedings of the International Joint Conferences on Computer, Information, and Systems Sciences, and Engineering (CISSE 2011). The contents of this book are a set of rigorously reviewed, world-class manuscripts addressing and detailing state-of-the-art research projects in the areas of  Industrial Electronics, Technology and Automation, Telecommunications and Networking, Systems, Computing Sciences and Software Engineering, Engineering Education, Instructional Technology, Assessment, and E-learning.

  1. Advances in computers dependable and secure systems engineering

    CERN Document Server

    Hurson, Ali

    2012-01-01

    Since its first volume in 1960, Advances in Computers has presented detailed coverage of innovations in computer hardware, software, theory, design, and applications. It has also provided contributors with a medium in which they can explore their subjects in greater depth and breadth than journal articles usually allow. As a result, many articles have become standard references that continue to be of significant, lasting value in this rapidly expanding field. Features include in-depth surveys and tutorials on new computer technology, well-known authors and researchers in the field, and extensive bibliographies.

  2. Advanced computational tools for 3-D seismic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Barhen, J.; Glover, C.W.; Protopopescu, V.A. [Oak Ridge National Lab., TN (United States)] [and others]

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis, and test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 1993-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the innovative computational techniques that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  3. Computational simulation of hot composite structures

    Science.gov (United States)

    Chamis, C. C.; Murthy, P. L. N.; Singhal, S. N.

    1991-01-01

    Three different computer codes developed in-house are described for application to hot composite structures. These codes include capabilities for: (1) laminate behavior (METCAN); (2) thermal/structural analysis of hot structures made from high temperature metal matrix composites (HITCAN); and (3) laminate tailoring (MMLT). Results for select sample cases are described to demonstrate the versatility as well as the application of these codes to specific situations. The sample case results show that METCAN can be used to simulate cyclic life in high temperature metal matrix composites; HITCAN can be used to evaluate the structural performance of curved panels as well as respective sensitivities of various nonlinearities, and MMLT can be used to tailor the fabrication process in order to reduce residual stresses in the matrix upon cool-down.

  5. Computer simulation of a magnetohydrodynamic dynamo II

    International Nuclear Information System (INIS)

    We performed a computer simulation of a magnetohydrodynamic dynamo in a rapidly rotating spherical shell. Extensive parameter runs are carried out changing the electrical resistivity. It is found that the total magnetic energy can grow more than ten times larger than the total kinetic energy of the convection motion when the resistivity is sufficiently small. When the resistivity is relatively large and the magnetic energy is comparable or smaller than the kinetic energy, the convection motion maintains its well-organized structure. However, when the resistivity is small and the magnetic energy becomes larger than the kinetic energy, the well-organized convection motion is highly disturbed. The generated magnetic field is organized as a set of flux tubes which can be divided into two categories. The magnetic field component parallel to the rotation axis tends to be confined inside the anticyclonic columnar convection cells. On the other hand, the component perpendicular to the rotation axis is confined outside the convection cells. (author)
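
    The central diagnostic in this study, the ratio of total magnetic to kinetic energy, is straightforward to compute from gridded snapshots of the velocity and magnetic fields. A minimal sketch follows, assuming uniform density, a uniform grid, and units with the magnetic permeability set to one (none of which is stated in the record):

```python
import numpy as np

def energy_ratio(v, B, rho=1.0, dV=1.0):
    """Magnetic-to-kinetic energy ratio for fields of shape (3, nx, ny, nz)."""
    e_kin = 0.5 * rho * np.sum(v * v) * dV   # sum over components and grid points
    e_mag = 0.5 * np.sum(B * B) * dV
    return e_mag / e_kin

# e.g. a synthetic snapshot where the dynamo has amplified B well above equipartition
rng = np.random.default_rng(0)
v = rng.standard_normal((3, 32, 32, 32))
B = 4.0 * rng.standard_normal((3, 32, 32, 32))
print(energy_ratio(v, B))   # ~16: magnetic energy dominates the kinetic energy
```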

  6. 2014 National Workshop on Advances in Communication and Computing

    CERN Document Server

    Prasanna, S; Sarma, Kandarpa; Saikia, Navajit

    2015-01-01

    The present volume is a compilation of research work in computation, communication, vision sciences, device design, fabrication, upcoming materials and related process design, etc. It is derived from selected manuscripts submitted to the 2014 National Workshop on Advances in Communication and Computing (WACC 2014), Assam Engineering College, Guwahati, Assam, India, which is emerging as a premier platform for discussion and dissemination of know-how in this part of the world. The papers included in the volume are indicative of the recent thrust in computation, communications and emerging technologies. Certain recent advances in ZnO nanostructures for alternate energy generation provide emerging insights into an area that has promise for the energy sector, including conservation and green technology. Similarly, scholarly contributions have focused on malware detection and related issues. Several contributions have focused on biomedical aspects, including contributions related to cancer detection using act...

  7. Associative Memory computing power and its simulation

    CERN Document Server

    Ancu, L S; The ATLAS collaboration; Britzger, D; Giannetti, P; Howarth, J W; Luongo, C; Pandini, C; Schmitt, S; Volpi, G

    2014-01-01

    The associative memory (AM) system is a computing device made of hundreds of AM ASIC chips designed to perform “pattern matching” at very high speed. Since each AM chip stores a database of 130,000 pre-calculated patterns and large numbers of chips can be easily assembled together, it is possible to produce huge AM banks. Speed and size of the system are crucial for real-time High Energy Physics applications, such as the ATLAS Fast TracKer (FTK) Processor. Using 80 million channels of the ATLAS tracker, FTK finds tracks within 100 microseconds. The simulation of such a parallelized system is an extremely complex task if executed on commercial computers based on normal CPUs. The algorithm performance is limited, due to the lack of parallelism, and in addition the memory requirement is very large. In fact, the AM chip uses a content-addressable memory (CAM) architecture. Any data inquiry is broadcast to all memory elements simultaneously, thus data retrieval time is independent of the database size. The gr...
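
    The CAM behaviour described here, one inquiry broadcast to every stored pattern at once, is easy to mimic in software with array broadcasting, which is also roughly how a bank-level simulation can batch the comparisons. A minimal sketch follows; the pattern width, layer count, and exact-match rule are simplifications (real AM chips support per-layer wildcard bits):

```python
import numpy as np

# Pattern bank: each row is a stored pattern of coarse "hit" words per detector
# layer, mimicking the 130,000 pre-calculated patterns held in one AM chip.
rng = np.random.default_rng(0)
bank = rng.integers(0, 256, size=(130_000, 6), dtype=np.uint16)

def cam_lookup(bank, event):
    """Broadcast the event to every stored pattern at once, CAM-style."""
    hits = bank == event                 # one comparison per pattern per layer
    return np.flatnonzero(hits.all(axis=1))

event = bank[42].copy()                  # plant a known pattern as the "event"
print(cam_lookup(bank, event))           # -> [42]
```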

  9. Computer simulations of the mouse spermatogenic cycle

    Directory of Open Access Journals (Sweden)

    Debjit Ray

    2014-12-01

    Full Text Available The spermatogenic cycle describes the periodic development of germ cells in the testicular tissue. The temporal–spatial dynamics of the cycle highlight the unique, complex, and interdependent interaction between germ and somatic cells, and are the key to continual sperm production. Although understanding the spermatogenic cycle has important clinical relevance for male fertility and contraception, there are a number of experimental obstacles. For example, the lengthy process cannot be visualized through dynamic imaging, and the precise action of germ cells that leads to the emergence of testicular morphology remains uncharacterized. Here, we report an agent-based model that simulates the mouse spermatogenic cycle on a cross-section of the seminiferous tubule over a time scale of hours to years, while considering feedback regulation, mitotic and meiotic division, differentiation, apoptosis, and movement. The computer model is able to elaborate the germ cell dynamics in a time-lapse movie format, allowing us to trace individual cells as they change state and location. More importantly, the model provides mechanistic understanding of the fundamentals of male fertility, namely how testicular morphology and sperm production are achieved. By manipulating cellular behaviors either individually or collectively in silico, the model predicts causal events for the altered arrangement of germ cells upon genetic or environmental perturbations. This in silico platform can serve as an interactive tool to perform long-term simulation and to identify optimal approaches for infertility treatment and contraceptive development.
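
    While the published model tracks many cell behaviours on a seminiferous tubule cross-section, the core agent-based loop can be caricatured in a few lines: each germ-cell agent carries a state and a clock, and at each time step it may die, advance its clock, or differentiate. The states, durations, and rates below are placeholders, not the paper's values.

```python
import numpy as np

# A drastically simplified germ-cell lineage with illustrative stage durations (hours)
STATES = ["spermatogonium", "spermatocyte", "spermatid", "sperm"]
DURATION = {"spermatogonium": 200, "spermatocyte": 300, "spermatid": 240}

rng = np.random.default_rng(0)
cells = [{"state": "spermatogonium", "clock": 0} for _ in range(100)]

def step(cells, p_apoptosis=1e-4):
    survivors = []
    for c in cells:
        if rng.random() < p_apoptosis:                 # stochastic cell death
            continue
        c["clock"] += 1
        if c["state"] != "sperm" and c["clock"] >= DURATION[c["state"]]:
            c["state"] = STATES[STATES.index(c["state"]) + 1]   # differentiate
            c["clock"] = 0
        survivors.append(c)
    return survivors

for hour in range(24 * 365):                           # one simulated year, hourly steps
    cells = step(cells)
```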

  10. Computer simulation of heterogeneous polymer photovoltaic devices

    International Nuclear Information System (INIS)

    Polymer-based photovoltaic devices have the potential for widespread usage due to their low cost per watt and mechanical flexibility. Efficiencies close to 9.0% have been achieved recently in conjugated polymer based organic solar cells (OSCs). These devices were fabricated using solvent-based processing of electron-donating and electron-accepting materials into the so-called bulk heterojunction (BHJ) architecture. Experimental evidence suggests that a key property determining the power-conversion efficiency of such devices is the final morphological distribution of the donor and acceptor constituents. In order to understand the role of morphology on device performance, we develop a scalable computational framework that efficiently interrogates OSCs to investigate relationships between the morphology at the nano-scale with the device performance. In this work, we extend the Buxton and Clarke model (2007 Modelling Simul. Mater. Sci. Eng. 15 13–26) to simulate realistic devices with complex active layer morphologies using a dimensionally independent, scalable, finite-element method. We incorporate all stages involved in current generation, namely (1) exciton generation and diffusion, (2) charge generation and (3) charge transport in a modular fashion. The numerical challenges encountered during interrogation of realistic microstructures are detailed. We compare each stage of the photovoltaic process for two microstructures: a BHJ morphology and an idealized sawtooth morphology. The results are presented for both two- and three-dimensional structures. (paper)
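
    Stage (1) of that pipeline, exciton generation and diffusion, reduces in steady state to a linear reaction-diffusion problem. The sketch below solves D c'' - c/tau + G = 0 on a 1D donor layer with perfectly quenching interfaces (c = 0 at both ends) by assembling the finite-difference system directly; the constants are loosely representative of organic semiconductors, not taken from the paper.

```python
import numpy as np

def exciton_profile(L=100e-9, n=200, D=1e-7, tau=1e-9, G=1e27):
    """Steady-state exciton density c(x) from D c'' - c/tau + G = 0, c(0)=c(L)=0."""
    dx = L / (n + 1)
    diag = -2.0 * D / dx**2 - 1.0 / tau
    off = D / dx**2
    A = (np.diag(np.full(n, diag))
         + np.diag(np.full(n - 1, off), 1)
         + np.diag(np.full(n - 1, off), -1))
    return np.linalg.solve(A, np.full(n, -G))   # interior grid values of c

c = exciton_profile()
# Excitons generated farther than the diffusion length sqrt(D*tau) ~ 10 nm from
# an interface mostly decay before they can dissociate into charges.
```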

  11. Engineering Fracking Fluids with Computer Simulation

    Science.gov (United States)

    Shaqfeh, Eric

    2015-11-01

    There are no comprehensive simulation-based tools for engineering the flows of viscoelastic fluid-particle suspensions in fully three-dimensional geometries. On the other hand, the need for such a tool in engineering applications is immense. Suspensions of rigid particles in viscoelastic fluids play key roles in many energy applications. For example, in oil drilling the ``drilling mud'' is a very viscous, viscoelastic fluid designed to shear-thin during drilling, but thicken at stoppage so that the ``cuttings'' can remain suspended. In a related application known as hydraulic fracturing suspensions of solids called ``proppant'' are used to prop open the fracture by pumping them into the well. It is well-known that particle flow and settling in a viscoelastic fluid can be quite different from that which is observed in Newtonian fluids. First, it is now well known that the ``fluid particle split'' at bifurcation cracks is controlled by fluid rheology in a manner that is not understood. Second, in Newtonian fluids, the presence of an imposed shear flow in the direction perpendicular to gravity (which we term a cross or orthogonal shear flow) has no effect on the settling of a spherical particle in Stokes flow (i.e. at vanishingly small Reynolds number). By contrast, in a non-Newtonian liquid, the complex rheological properties induce a nonlinear coupling between the sedimentation and shear flow. Recent experimental data have shown both the shear thinning and the elasticity of the suspending polymeric solutions significantly affects the fluid-particle split at bifurcations, as well as the settling rate of the solids. In the present work, we use the Immersed Boundary Method to develop computer simulations of viscoelastic flow in suspensions of spheres to study these problems. These simulations allow us to understand the detailed physical mechanisms for the remarkable physical behavior seen in practice, and actually suggest design rules for creating new fluid recipes.

  12. Sonification of simulations in computational physics

    International Nuclear Information System (INIS)

    Sonification is the translation of information for auditory perception, excluding speech itself. The cognitive performance of pattern recognition is striking for sound, and has too long been disregarded by the scientific mainstream. Examples of 'spontaneous sonification' and systematic research for about 20 years have proven that sonification provides a valuable tool for the exploration of scientific data. The data in this thesis stem from computational physics, where numerical simulations are applied to problems in physics. Prominent examples are spin models and lattice quantum field theories. The corresponding data lend themselves very well to innovative display methods: they are structured on discrete lattices, often stochastic, high-dimensional and abstract, and they provide huge amounts of data. Furthermore, they have no inherently perceptual dimension. When designing the sonification of simulation data, one has to make decisions on three levels, both for the data and the sound model: the level of meaning (phenomenological; metaphoric); of structure (in time and space); and of elements ('display units' vs. 'gestalt units'). The design usually proceeds as a bottom-up or top-down process. This thesis provides a 'toolbox' for helping in these decisions. It describes tools that have proven particularly useful in the context of simulation data. An explicit method of top-down sonification design is the metaphoric sonification method, which is based on expert interviews. Furthermore, qualitative and quantitative evaluation methods are presented, on the basis of which a set of evaluation criteria is proposed. The translation between a scientific and the sound synthesis domain is elucidated by a sonification operator. For this formalization, a collection of notation modules is provided. Showcases are discussed in detail that have been developed in the interdisciplinary research projects SonEnvir and QCD-audio, during the second Science By Ear workshop and during a
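
    A toy example of the bottom-up design path discussed here: map a one-dimensional observable (say, the energy trace of a spin-model run) onto pitch and render it as audio. The exponential frequency mapping and all constants are arbitrary choices, not the thesis' sonification operator; only the Python standard library's wave module and numpy are used.

```python
import numpy as np
import wave

def sonify(data, outfile="run.wav", rate=44100, note_s=0.05,
           fmin=220.0, fmax=880.0):
    """Each data sample becomes a short sine tone whose pitch encodes its value."""
    lo, hi = float(np.min(data)), float(np.max(data))
    norm = (np.asarray(data, float) - lo) / (hi - lo + 1e-12)  # min-max normalise
    freqs = fmin * (fmax / fmin) ** norm                       # exponential pitch map
    t = np.arange(int(rate * note_s)) / rate
    signal = np.concatenate([np.sin(2 * np.pi * f * t) for f in freqs])
    pcm = (32767 * signal / np.max(np.abs(signal))).astype(np.int16)
    with wave.open(outfile, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)       # 16-bit samples
        w.setframerate(rate)
        w.writeframes(pcm.tobytes())

# e.g. listen to a random-walk "energy trace" standing in for a Monte Carlo run
sonify(np.cumsum(np.random.default_rng(0).standard_normal(200)))
```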

  13. Magnetic Resonance Imaging (MRI) Simulation on a Grid Computing Architecture

    OpenAIRE

    Benoit-Cattin, Hugues; Bellet, Fabrice; Montagnat, Johan; Odet, Christophe

    2010-01-01

    In this paper, we present the implementation of a Magnetic Resonance Imaging (MRI) simulator on a grid computing architecture. The simulation process is based on the resolution of the Bloch equation [1] in a 3D space. The computation kernel of the simulator is distributed to the grid nodes using MPICH-G2 [2]. The results presented show that simulation of 3D MRI data is achieved at a reasonable cost, which opens new perspectives for the use of MRI simulation.
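
    The computational kernel referred to here is, per voxel, the integration of the Bloch equation. A minimal sketch of one such integrator follows, using plain explicit Euler, a static field, and made-up relaxation times; a production simulator would use analytic rotations or a higher-order scheme.

```python
import numpy as np

GAMMA = 2.675e8   # proton gyromagnetic ratio, rad s^-1 T^-1

def bloch_step(M, B, dt, T1=1.0, T2=0.1, M0=1.0):
    """One Euler step of dM/dt = gamma*M x B - (Mx,My,0)/T2 - (0,0,Mz-M0)/T1."""
    relax = np.array([M[0] / T2, M[1] / T2, (M[2] - M0) / T1])
    return M + dt * (GAMMA * np.cross(M, B) - relax)

M = np.array([1.0, 0.0, 0.0])   # transverse magnetisation just after a 90-degree pulse
B = np.array([0.0, 0.0, 3.0])   # 3 T static field along z
dt = 1e-10                      # small enough to resolve the ~128 MHz precession
signal = []
for _ in range(20_000):
    M = bloch_step(M, B, dt)
    signal.append(M[0] + 1j * M[1])   # the precessing transverse component (FID)
```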

  14. Towards A Novel Environment For Simulation Of Quantum Computing

    Directory of Open Access Journals (Sweden)

    Joanna Patrzyk

    2015-01-01

    Full Text Available In this paper we analyze existing quantum computer simulation techniques and their realizations to minimize the impact of the exponential complexity of simulated quantum computations. As a result of this investigation, we propose a quantum computer simulator with an integrated development environment - QuIDE - supporting development of algorithms for future quantum computers. The simulator simplifies building and testing quantum circuits and helps to understand quantum algorithms in an efficient way. The development environment provides flexibility of source code editing and ease of graphical building of circuit diagrams. We also describe and analyze the complexity of algorithms used for simulation, and present performance results of the simulator as well as results of its deployment during university classes.
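
    The exponential cost being minimized here comes from storing 2^n amplitudes. The standard way a state-vector simulator applies a one-qubit gate without ever forming a 2^n x 2^n matrix is to reshape the state and contract a single axis; whether QuIDE uses exactly this scheme internally is an assumption. A minimal sketch of the technique (the qubit-ordering convention is an arbitrary choice):

```python
import numpy as np

def apply_gate(state, gate, target, n):
    """Apply a 2x2 gate to qubit `target` of an n-qubit state vector."""
    psi = state.reshape([2] * n)                          # one axis per qubit
    psi = np.tensordot(gate, psi, axes=([1], [target]))   # contract the target axis
    psi = np.moveaxis(psi, 0, target)                     # restore the axis order
    return psi.reshape(-1)

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
n = 3
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0                                  # |000>
for q in range(n):                              # uniform superposition over 8 states
    state = apply_gate(state, H, q, n)
print(np.round(state, 3))                       # each amplitude = 1/sqrt(8)
```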

  15. Computational Fluid Dynamic Simulation of Square Array Rod Subchannel

    International Nuclear Information System (INIS)

    The results are compared against the data available in the literature to check the simulation accuracy. In the simulation of the subchannel geometry, appropriate selection of the boundary conditions in the gap region is important in obtaining good predictions. The Reynolds stress turbulence model shows improved calculation accuracy over the eddy diffusivity model by accounting for the effect of anisotropy in turbulence. The multiple shear stress peaks differ between the calculation and the test results, calling for further investigation. However, the lack of other turbulence-related data in the 45° to 90° sector in the same literature may necessitate a dedicated new set of experiments. Crucial safety concerns are imposed upon the fuel region where fission reactions take place. The rod array configuration is commonly utilized to maintain good heat transfer characteristics and structural integrity. Design of the fuel assembly requires various engineering considerations such as pressure drop, manufacturability, and generation of turbulence. The best way to estimate performance of the fuel assembly is to build a mockup and measure its physical parameters. However, the required cost and time render it rather difficult to perform experiments for all designs. Instead, with advances in numerical simulation techniques, computational fluid dynamics methods are extensively utilized to assist the experiments by reducing the trial and error in the mockup. A numerical simulation is carried out on the bare square array rod geometry as a preliminary exercise for the grid spacer design.

  16. Modeling and simulation challenges pursued by the Consortium for Advanced Simulation of Light Water Reactors (CASL)

    Science.gov (United States)

    Turinsky, Paul J.; Kothe, Douglas B.

    2016-05-01

    The Consortium for the Advanced Simulation of Light Water Reactors (CASL), the first Energy Innovation Hub of the Department of Energy, was established in 2010 with the goal of providing modeling and simulation (M&S) capabilities that support and accelerate the improvement of nuclear energy's economic competitiveness and the reduction of spent nuclear fuel volume per unit energy, all while assuring nuclear safety. To accomplish this requires advances in M&S capabilities in radiation transport, thermal-hydraulics, fuel performance and corrosion chemistry. To focus CASL's R&D, industry challenge problems have been defined, which equate with long-standing issues of the nuclear power industry that M&S can assist in addressing. To date CASL has developed a multi-physics "core simulator" based upon pin-resolved radiation transport and subchannel (within fuel assembly) thermal-hydraulics, capitalizing on the capabilities of high performance computing. CASL's fuel performance M&S capability can also be optionally integrated into the core simulator, yielding a coupled multi-physics capability with untapped predictive potential. Material models have been developed to enhance predictive capabilities of fuel clad creep and growth, along with deeper understanding of zirconium alloy clad oxidation and hydrogen pickup. Understanding of corrosion chemistry (e.g., CRUD formation) has evolved at all scales: micro, meso and macro. CFD R&D has focused on improvement in closure models for subcooled boiling and bubbly flow, and the formulation of robust numerical solution algorithms. For multiphysics integration, several iterative acceleration methods have been assessed, illuminating areas where further research is needed. Finally, uncertainty quantification and data assimilation techniques, based upon sampling approaches, have been made more feasible for practicing nuclear engineers via R&D on dimensional reduction and biased sampling. Industry adoption of CASL's evolving M
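
    Among the multiphysics integration methods assessed, the simplest baseline is Picard (fixed-point) iteration between single-physics solvers. The sketch below couples a "neutronics" solve (power with negative temperature feedback) to a "thermal" solve (temperature rising with power); both feedback laws are entirely made up for illustration and are not CASL models.

```python
def picard_coupling(T, solve_power, solve_temperature, tol=1e-8, max_iter=50):
    """Fixed-point (Picard) iteration between two single-physics solvers."""
    for k in range(max_iter):
        P = solve_power(T)            # neutronics: power given temperature
        T_new = solve_temperature(P)  # thermal-hydraulics: temperature given power
        if abs(T_new - T) < tol:
            return T_new, P, k + 1
        T = T_new
    raise RuntimeError("Picard iteration did not converge")

# Illustrative scalar "physics": Doppler-like negative feedback on power,
# temperature rising linearly with power (both entirely invented).
power = lambda T: 100.0 / (1.0 + 0.002 * (T - 300.0))
temperature = lambda P: 300.0 + 1.5 * P

T, P, iters = picard_coupling(350.0, power, temperature)
print(f"converged to T = {T:.2f} K, P = {P:.2f} (arb.) in {iters} iterations")
```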

  17. Foreword: Advanced Science Letters (ASL), Special Issue on Computational Astrophysics

    OpenAIRE

    Mayer, Lucio

    2009-01-01

    Computational astrophysics has undergone unprecedented development over the last decade, becoming a field of its own. The challenge ahead of us will involve increasingly complex multi-scale simulations. These will bridge the gap between areas of astrophysics such as star and planet formation, or star formation and galaxy formation, that have evolved separately until today. A global knowledge of the physics and modeling techniques of astrophysical simulations is thus an important asset for the...

  18. Computer-aided simulation study of photomultiplier tubes

    Science.gov (United States)

    Zaghloul, Mona E.; Rhee, Do Jun

    1989-01-01

    A computer model that simulates the response of photomultiplier tubes (PMTs) and the associated voltage divider circuit is developed. An equivalent circuit that approximates the operation of the device is derived and then used to develop a computer simulation of the PMT. Simulation results are presented and discussed.

  19. Microbial Enhanced Oil Recovery - Advanced Reservoir Simulation

    DEFF Research Database (Denmark)

    Nielsen, Sidsel Marie

    to formation of biofilm. The construction of a one-dimensional simulator enables us to investigate how the different mechanisms and the combination of these influence the displacement processes, the saturation profiles and thus the oil recovery curves. The reactive transport model describes ... of the relative permeabilities. Overall, these methods produce similar results. Separate investigations of the surfactant effect have been performed through exemplifying simulation cases, where no biofilm is formed. The water phase saturation profiles are found to contain a waterfront initially ... investigated. A super-efficient surfactant produces an incremental recovery of around 40% OOIP over that of waterflooding. Application of the less efficient, and probably more realistic, surfactant results in an incremental oil recovery of 9% OOIP, but it is still considered significant...

  20. Microbial Enhanced Oil Recovery - Advanced Reservoir Simulation

    OpenAIRE

    Nielsen, Sidsel Marie; Shapiro, Alexander; Stenby, Erling Halfdan; Michelsen, Michael Locht

    2010-01-01

    In this project, a generic model has been set up to include the two main mechanisms in the microbial enhanced oil recovery (MEOR) process; reduction of the interfacial tension (IFT) due to surfactant production, and microscopic fluid diversion as a part of the overall fluid diversion mechanism due to formation of biofilm. The construction of a one-dimensional simulator enables us to investigate how the different mechanisms and the combination of these influence the displacement processes, the...

  1. The role of computer simulation in nuclear technology development

    International Nuclear Information System (INIS)

    In the report, the role and purpose of computer simulation in nuclear technology development is discussed. The authors consider such applications of computer simulation as: (a) nuclear safety research; (b) optimization of technical and economic parameters of acting nuclear plants; (c) planning and support of reactor experiments; (d) research and design of new devices and technologies; (e) design and development of 'simulators' for operating personnel training. Among the marked applications, the following aspects of computer simulation are discussed in the report: (f) neutron-physical, thermal and hydrodynamics models; (g) simulation of isotope structure change and damage dose accumulation for materials under irradiation; (h) simulation of reactor control structures. (authors)

  2. The role of computer simulation in nuclear technologies development

    International Nuclear Information System (INIS)

    In the report, the role and purposes of computer simulation in nuclear technologies development are discussed. The authors consider such applications of computer simulation as nuclear safety research, optimization of technical and economic parameters of acting nuclear plants, planning and support of reactor experiments, research and design of new devices and technologies, and design and development of 'simulators' for operating personnel training. Among the marked applications, the following aspects of computer simulation are discussed in the report: neutron-physical, thermal and hydrodynamics models; simulation of isotope structure change and damage dose accumulation for materials under irradiation; and simulation of reactor control structures. (authors)

  3. Advanced computer graphics techniques as applied to the nuclear industry

    International Nuclear Information System (INIS)

    Computer graphics is a rapidly advancing technological area in computer science. This is being motivated by increased hardware capability coupled with reduced hardware costs. This paper will cover six topics in computer graphics, with examples forecasting how each of these capabilities could be used in the nuclear industry. These topics are: (1) Image Realism with Surfaces and Transparency; (2) Computer Graphics Motion; (3) Graphics Resolution Issues and Examples; (4) Iconic Interaction; (5) Graphic Workstations; and (6) Data Fusion - illustrating data coming from numerous sources, for display through high dimensional, greater than 3-D, graphics. All topics will be discussed using extensive examples with slides, video tapes, and movies. Illustrations have been omitted from the paper due to the complexity of color reproduction. 11 refs., 2 figs., 3 tabs

  4. Computer code qualification program for the Advanced CANDU Reactor

    International Nuclear Information System (INIS)

    Atomic Energy of Canada Ltd (AECL) has developed and implemented a Software Quality Assurance program (SQA) to ensure that its analytical, scientific and design computer codes meet the required standards for software used in safety analyses. This paper provides an overview of the computer programs used in Advanced CANDU Reactor (ACR) safety analysis, and assessment of their applicability in the safety analyses of the ACR design. An outline of the incremental validation program, and an overview of the experimental program in support of the code validation are also presented. An outline of the SQA program used to qualify these computer codes is also briefly presented. To provide context to the differences in the SQA with respect to current CANDUs, the paper also provides an overview of the ACR design features that have an impact on the computer code qualification. (author)

  5. Process simulation for advanced composites production

    Energy Technology Data Exchange (ETDEWEB)

    Allendorf, M.D.; Ferko, S.M.; Griffiths, S. [Sandia National Labs., Livermore, CA (United States)] [and others

    1997-04-01

    The objective of this project is to improve the efficiency and lower the cost of chemical vapor deposition (CVD) processes used to manufacture advanced ceramics by providing the physical and chemical understanding necessary to optimize and control these processes. Project deliverables include: numerical process models; databases of thermodynamic and kinetic information related to the deposition process; and process sensors and software algorithms that can be used for process control. Target manufacturing techniques include CVD fiber coating technologies (used to deposit interfacial coatings on continuous fiber ceramic preforms), chemical vapor infiltration, thin-film deposition processes used in the glass industry, and coating techniques used to deposit wear-, abrasion-, and corrosion-resistant coatings for use in the pulp and paper, metals processing, and aluminum industries.

  6. Interoperable Technologies for Advanced Petascale Simulations (ITAPS)

    Energy Technology Data Exchange (ETDEWEB)

    Shephard, Mark S

    2010-02-05

    Efforts during the past year have contributed to the continued development of the ITAPS interfaces and services as well as specific efforts to support ITAPS applications. The ITAPS interface efforts have two components. The first is working with the ITAPS team on improving the ITAPS software infrastructure and the level of compliance of our implementations of the ITAPS interfaces (iMesh, iMeshP, iRel and iGeom). The second is involvement in the discussions on the design of the iField fields interface. Efforts to move the ITAPS technologies to petascale computers have identified a number of key technical developments that are required to effectively execute the ITAPS interfaces and services. Research to address these parallel method developments has been a major emphasis of the RPI team's efforts over the past year. The development of parallel unstructured mesh methods has considered the need to scale unstructured mesh solves to massively parallel computers. These efforts, summarized in section 2.1, show that with the addition of the ITAPS procedures described in sections 2.2 and 2.3 we are able to obtain excellent strong scaling with our unstructured mesh CFD code on up to 294,912 cores of IBM Blue Gene/P, which is the highest core count machine available. The ITAPS developments that have contributed to the scaling and performance of PHASTA include an iterative migration algorithm to improve the combined region and vertex balance of the mesh partition, which increases scalability, and mesh data reordering, which improves computational performance. The other developments are associated with the further development of the ITAPS parallel unstructured mesh

  7. Activities of the Research Institute for Advanced Computer Science

    Science.gov (United States)

    Oliger, Joseph

    1994-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under contract with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. Research at RIACS is currently being done in the following areas: (1) parallel computing; (2) advanced methods for scientific computing; (3) high performance networks; and (4) learning systems. RIACS technical reports are usually preprints of manuscripts that have been submitted to research journals or conference proceedings. A list of these reports for the period January 1, 1994 through December 31, 1994 is in the Reports and Abstracts section of this report.

  8. Advanced computer architecture specification for automated weld systems

    Science.gov (United States)

    Katsinis, Constantine

    1994-01-01

    This report describes the requirements for an advanced automated weld system and the associated computer architecture, and defines the overall system specification from a broad perspective. According to the requirements of welding procedures as they relate to an integrated multiaxis motion control and sensor architecture, the computer system requirements are developed based on a proven multiple-processor architecture with an expandable, distributed-memory, single global bus architecture, containing individual processors which are assigned to specific tasks that support sensor or control processes. The specified architecture is sufficiently flexible to integrate previously developed equipment, be upgradable and allow on-site modifications.

  9. Advances in the Simulation-Based Analysis of Attitude Change

    OpenAIRE

    Voinea, Camelia Florela

    2012-01-01

    In this paper we provide an overview of the most relevant research work on the simulation of attitudes which evolved in the late 90’s and mainly after the year 2000. The general framework for the modeling, simulation and computational research on attitudes integrates research approaches (both fundamental and applicative) which combine theories from sociology, social psychology, social economics, political science, conflict theories, human-computer interaction areas with complexity theory, com...

  10. Soft computing in design and manufacturing of advanced materials

    Science.gov (United States)

    Cios, Krzysztof J.; Baaklini, George Y.; Vary, Alex

    1993-01-01

    The potential of fuzzy sets and neural networks, often referred to as soft computing, for aiding in all aspects of manufacturing of advanced materials like ceramics is addressed. In design and manufacturing of advanced materials, it is desirable to find which of the many processing variables contribute most to the desired properties of the material. There is also interest in real time quality control of parameters that govern material properties during processing stages. The concepts of fuzzy sets and neural networks are briefly introduced and it is shown how they can be used in the design and manufacturing processes. These two computational methods are alternatives to other methods such as the Taguchi method. The two methods are demonstrated by using data collected at NASA Lewis Research Center. Future research directions are also discussed.

  11. NATO Advanced Study Institute on Methods in Computational Molecular Physics

    CERN Document Server

    Diercksen, Geerd

    1992-01-01

    This volume records the lectures given at a NATO Advanced Study Institute on Methods in Computational Molecular Physics held in Bad Windsheim, Germany, from 22nd July until 2nd August, 1991. This NATO Advanced Study Institute sought to bridge the quite considerable gap which exists between the presentation of molecular electronic structure theory found in contemporary monographs such as, for example, McWeeny's Methods of Molecular Quantum Mechanics (Academic Press, London, 1989) or Wilson's Electron Correlation in Molecules (Clarendon Press, Oxford, 1984) and the realization of the sophisticated computational algorithms required for their practical application. It sought to underline the relation between the electronic structure problem and the study of nuclear motion. Software for performing molecular electronic structure calculations is now being applied in an increasingly wide range of fields in both the academic and the commercial sectors. Numerous applications are reported in areas as diverse as catalysi...

  12. Interoperable Technologies for Advanced Petascale Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Li, Xiaolin [SUNY at Stony Brook]

    2013-01-14

    Our final report on the accomplishments of ITAPS at Stony Brook during the period covered by the research award includes component services, interface services and applications. On the component service, we have designed and implemented robust functionality for the Lagrangian tracking of dynamic interfaces. We have migrated the hyperbolic, parabolic and elliptic solvers from stage-wise second order toward globally second order schemes. We have implemented high order coupling between interface propagation and interior PDE solvers. On the interface service, we have constructed the FronTier application programmer's interface (API) and its manual page using doxygen. We installed the FronTier functional interface to conform with the ITAPS specifications, especially the iMesh and iMeshP interfaces. On applications, we have implemented deposition and dissolution models with flow, and implemented the two-reactant model for a more realistic precipitation at the pore level and its coupling with the Darcy level model. We have continued our support of the study of fluid mixing problems in inertial confinement fusion. We have continued our support of the MHD model and its application to plasma liner implosion in fusion confinement. We have simulated a step in the reprocessing and separation of spent fuels from nuclear power plant fuel rods. We have implemented fluid-structure interaction for 3D windmill and parachute simulations. We have continued our collaboration with PNNL, BNL, LANL, ORNL, and other SciDAC institutions.

  13. Computer simulation and the features of novel empirical data.

    Science.gov (United States)

    Lusk, Greg

    2016-04-01

    In an attempt to determine the epistemic status of computer simulation results, philosophers of science have recently explored the similarities and differences between computer simulations and experiments. One question that arises is whether and, if so, when, simulation results constitute novel empirical data. It is often supposed that computer simulation results could never be empirical or novel because simulations never interact with their targets, and cannot go beyond their programming. This paper argues against this position by examining whether, and under what conditions, the features of empiricality and novelty could be displayed by computer simulation data. I show that, to the extent that certain familiar measurement results have these features, so can some computer simulation results. PMID:27083094

  14. Projected role of advanced computational aerodynamic methods at the Lockheed-Georgia company

    Science.gov (United States)

    Lores, M. E.

    1978-01-01

    Experience with advanced computational methods being used at the Lockheed-Georgia Company to aid in the evaluation and design of new and modified aircraft indicates that large and specialized computers will be needed to make advanced three-dimensional viscous aerodynamic computations practical. The Numerical Aerodynamic Simulation Facility should be used to provide a tool for designing better aerospace vehicles while at the same time reducing development costs by performing computations using Navier-Stokes equations solution algorithms and permitting less sophisticated but nevertheless complex calculations to be made efficiently. Configuration definition procedures and data output formats can probably best be defined in cooperation with industry, therefore, the computer should handle many remote terminals efficiently. The capability of transferring data to and from other computers needs to be provided. Because of the significant amount of input and output associated with 3-D viscous flow calculations and because of the exceedingly fast computation speed envisioned for the computer, special attention should be paid to providing rapid, diversified, and efficient input and output.

  15. Application of Computer Simulation Modeling to Medication Administration Process Redesign

    Directory of Open Access Journals (Sweden)

    Nathan Huynh

    2012-01-01

    The medication administration process (MAP) is one of the most high-risk processes in health care. MAP workflow redesign can precipitate both unanticipated and unintended consequences that can lead to new medication safety risks and workflow inefficiencies. Thus, it is necessary to have a tool to evaluate the impact of redesign approaches in advance of their clinical implementation. This paper discusses the development of an agent-based MAP computer simulation model that can be used to assess the impact of MAP workflow redesign on MAP performance. The agent-based approach is adopted in order to capture Registered Nurse medication administration performance. The process of designing, developing, validating, and testing such a model is explained. Work is underway to collect MAP data in a hospital setting to provide more complex MAP observations to extend development of the model to better represent the complexity of MAP.
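
    A minimal sketch of the agent-based idea described above, reduced to a single nurse-agent whose dose administrations can be interrupted — a branched choice-consequence model in miniature. All parameters (interruption rate, error probabilities, dose counts) are illustrative assumptions, not values from the study.

        # Toy agent-based model of the medication administration process (MAP).
        # Interruption and error probabilities are assumed, not from the paper.
        import random

        random.seed(42)

        def administer_dose(interrupt_prob=0.30, base_error=0.01, interrupted_error=0.05):
            """One nurse-agent administers one dose; interruptions raise error risk."""
            interrupted = random.random() < interrupt_prob
            error_prob = interrupted_error if interrupted else base_error
            return random.random() < error_prob  # True -> administration error

        def simulate_shift(n_doses=200, **kwargs):
            errors = sum(administer_dose(**kwargs) for _ in range(n_doses))
            return errors / n_doses

        # Compare a baseline workflow with a redesign that halves interruptions.
        print("baseline error rate:", simulate_shift(interrupt_prob=0.30))
        print("redesign error rate:", simulate_shift(interrupt_prob=0.15))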

  16. Computer Simulation of IT-diagrams of Steel

    Institute of Scientific and Technical Information of China (English)

    B. Smoljan

    2004-01-01

    Computer simulation of austenite decomposition has been investigated. An inversion method for predicting the phase portions in steel, based on the hardenability curve of the Jominy specimen, has been established. The designed method of predicting austenite decomposition has been used in computer simulation of the isothermal transformation (IT) diagram of low-alloyed steel. IT-diagrams of low-alloyed steel can be successfully predicted by the proposed method of computer simulation.

  17. Faster Quantum Chemistry Simulation on Fault-Tolerant Quantum Computers

    OpenAIRE

    Jones, N. Cody; Whitfield, James D.; McMahon, Peter L.; Yung, Man-Hong; Van Meter, Rodney; Aspuru-Guzik, Alan; Yamamoto, Yoshihisa

    2012-01-01

    Quantum computers can in principle simulate quantum physics exponentially faster than their classical counterparts, but some technical hurdles remain. We propose methods which substantially improve the performance of a particular form of simulation, ab initio quantum chemistry, on fault-tolerant quantum computers; these methods generalize readily to other quantum simulation problems. Quantum teleportation plays a key role in these improvements and is used extensively as a computing resource...

  18. Simulation of chemical reaction dynamics on an NMR quantum computer

    OpenAIRE

    Lu, Dawei; Xu, Nanyang; Xu, Ruixue; Chen, Hongwei; Gong, Jiangbin; Peng, Xinhua; Du, Jiangfeng

    2011-01-01

    Quantum simulation can beat current classical computers with as few as a few tens of qubits and will likely become the first practical use of a quantum computer. One promising application of quantum simulation is to attack challenging quantum chemistry problems. Here we report an experimental demonstration that a small nuclear-magnetic-resonance (NMR) quantum computer is already able to simulate the dynamics of a prototype chemical reaction. The experimental results agree well with classical ...

  19. Three-dimensional computer simulation of grain coarsening during sintering

    OpenAIRE

    Nikolic Zoran S.

    2012-01-01

    This paper presents a computational study of the three-dimensional computer simulation of grain coarsening using a sintering model based on a sintering law (a rate law of inter-grain distance reduction) that describes the evolution of neck geometry.

  20. Condition Monitoring Through Advanced Sensor and Computational Technology

    International Nuclear Information System (INIS)

    The overall goal of this joint research project was to develop and demonstrate advanced sensors and computational technology for continuous monitoring of the condition of components, structures, and systems in advanced and next-generation nuclear power plants (NPPs). This project included investigating and adapting several advanced sensor technologies from Korean and US national laboratory research communities, some of which were developed and applied in non-nuclear industries. The project team investigated and developed sophisticated signal processing, noise reduction, and pattern recognition techniques and algorithms. The researchers installed sensors and conducted condition monitoring tests on two test loops, a check valve (an active component) and a piping elbow (a passive component), to demonstrate the feasibility of using advanced sensors and computational technology to achieve the project goal. Acoustic emission (AE) devices, optical fiber sensors, accelerometers, and ultrasonic transducers (UTs) were used to detect the mechanical vibratory response of the check valve and piping elbow in normal and degraded configurations. Chemical sensors were also installed to monitor the water chemistry in the piping elbow test loop. Analysis results of processed sensor data indicate that it is feasible to differentiate between the normal and degraded (with selected degradation mechanisms) configurations of these two components from the acquired sensor signals, but it is questionable whether these methods can reliably identify the level and type of degradation. Additional research and development efforts are needed to refine the differentiation techniques and to reduce the level of uncertainties.

  1. Factors promoting engaged exploration with computer simulations

    Directory of Open Access Journals (Sweden)

    Noah S. Podolefsky

    2010-10-01

    This paper extends prior research on student use of computer simulations (sims) to engage with and explore science topics, in this case wave interference. We describe engaged exploration: a process that involves students actively interacting with educational materials, sense making, and exploring primarily via their own questioning. We analyze interviews with college students using PhET sims in order to demonstrate engaged exploration, and to identify factors that can promote this type of inquiry. With minimal explicit guidance, students explore the topic of wave interference in ways that bear similarity to how scientists explore phenomena. PhET sims are flexible tools which allow students to choose their own learning path, but also provide constraints such that students’ choices are generally productive. This type of inquiry is supported by sim features such as concrete connections to the real world, representations that are not available in the real world, analogies to help students make meaning of and connect across multiple representations and phenomena, and a high level of interactivity with real-time, dynamic feedback from the sim. These features of PhET sims enable students to pose questions and answer them in ways that may not be supported by more traditional educational materials.

  2. COMPUTER SIMULATIONS IN SCIENCE EDUCATION: Implications for Distance Education

    Directory of Open Access Journals (Sweden)

    Sami SAHIN

    2006-10-01

    This paper is a review of the literature on the use of computer simulations in science education. The review examines the types and good examples of computer simulations. The literature indicates that although computer simulations cannot completely replace science classroom and laboratory activities, they offer various advantages both for classroom and distance education. The paper consists of four parts: the first part describes computer simulations; the second part reviews their benefits in science education; the third part looks at the relation with science process skills; and the last part makes connections with distance education. The literature suggests that the success of computer simulations in science education depends on how they are incorporated into the curriculum and how teachers use them. The most appropriate use of computer simulations seems to be as supplementary tools for classroom instruction and laboratory work. Multimedia-supported, highly interactive, collaborative computer simulations are attracting growing interest because of their potential to supplement constructivist learning. They offer inquiry environments and cognitive tools to scaffold learning and apply problem-solving skills. Computer simulations are good tools to improve students’ hypothesis construction, graphic interpretation and prediction skills. The literature review also implied that computer simulations have potential for distance education laboratories. Yet this area is elusive and needs to be researched further.

  3. Advances in computational design and analysis of airbreathing propulsion systems

    Science.gov (United States)

    Klineberg, John M.

    1989-01-01

    The development of commercial and military aircraft depends, to a large extent, on engine manufacturers being able to achieve significant increases in propulsion capability through improved component aerodynamics, materials, and structures. The recent history of propulsion has been marked by efforts to develop computational techniques that can speed up the propulsion design process and produce superior designs. The availability of powerful supercomputers, such as the NASA Numerical Aerodynamic Simulator, and the potential for even higher performance offered by parallel computer architectures, have opened the door to the use of multi-dimensional simulations to study complex physical phenomena in propulsion systems that have previously defied analysis or experimental observation. An overview of several NASA Lewis research efforts is provided that are contributing toward the long-range goal of a numerical test-cell for the integrated, multidisciplinary design, analysis, and optimization of propulsion systems. Specific examples in Internal Computational Fluid Mechanics, Computational Structural Mechanics, Computational Materials Science, and High Performance Computing are cited and described in terms of current capabilities, technical challenges, and future research directions.

  4. Modeling cost/performance of a parallel computer simulator

    OpenAIRE

    Falsafi, Babak; Wood, David A.

    1997-01-01

    This article examines the cost/performance of simulating a hypothetical target parallel computer using a commercial host parallel computer. We address the question of whether parallel simulation is simply faster than sequential simulation, or if it is also more cost-effective. To answer this, we develop a performance model of the Wisconsin Wind Tunnel (WWT), a system that simulates cache-coherent shared-memory machines on a message-passing Thinking Machines CM-5. The performance model uses Kr...

  5. Cost/performance of a parallel computer simulator

    OpenAIRE

    Falsafi, Babak; Wood, David A.

    1994-01-01

    This paper examines the cost/performance of simulating a hypothetical target parallel computer using a commercial host parallel computer. We address the question of whether parallel simulation is simply faster than sequential simulation, and whether it is also more cost-effective. To answer this, we develop a performance model of the Wisconsin Wind Tunnel (WWT), a system that simulates cache-coherent shared-memory machines on a message-passing Thinking Machines CM-5. The performance mod...

  6. Computer Simulations of Quantum Theory of Hydrogen Atom for Natural Science Education Students in a Virtual Lab

    Science.gov (United States)

    Singh, Gurmukh

    2012-01-01

    The present article is primarily targeted at advanced college/university undergraduate students of chemistry/physics education, computational physics/chemistry, and computer science. The most recent software system, MS Visual Studio .NET version 2010, is employed to perform computer simulations for modeling Bohr's quantum theory of…

  7. Recent advances in computational methods and clinical applications for spine imaging

    CERN Document Server

    Glocker, Ben; Klinder, Tobias; Li, Shuo

    2015-01-01

    This book contains the full papers presented at the MICCAI 2014 workshop on Computational Methods and Clinical Applications for Spine Imaging. The workshop brought together scientists and clinicians in the field of computational spine imaging. The chapters included in this book present and discuss the new advances and challenges in these fields, using several methods and techniques in order to address more efficiently different and timely applications involving signal and image acquisition, image processing and analysis, image segmentation, image registration and fusion, computer simulation, image-based modeling, simulation and surgical planning, image-guided robot-assisted surgery, and image-based diagnosis. The book also includes papers and reports from the first challenge on vertebra segmentation held at the workshop.

  8. Advance in research on aerosol deposition simulation methods

    International Nuclear Information System (INIS)

    A comprehensive analysis of the health effects of inhaled toxic aerosols requires exact data on airway deposition. Knowledge of the effect of inhaled drugs is essential to the optimization of aerosol drug delivery. Sophisticated analytical deposition models can be used for the computation of total, regional and generation-specific deposition efficiencies. Continuously improving computers allow us to study particle transport and deposition in more and more realistic airway geometries with the help of computational fluid dynamics (CFD) simulation methods. In this article, the trends in aerosol deposition models and lung models are reviewed, together with the methods for achieving deposition simulations. (authors)

  9. Computational experiment approach to advanced secondary mathematics curriculum

    CERN Document Server

    Abramovich, Sergei

    2014-01-01

    This book promotes the experimental mathematics approach in the context of secondary mathematics curriculum by exploring mathematical models depending on parameters that were typically considered advanced in the pre-digital education era. This approach, by drawing on the power of computers to perform numerical computations and graphical constructions, stimulates formal learning of mathematics through making sense of a computational experiment. It allows one (in the spirit of Freudenthal) to bridge serious mathematical content and contemporary teaching practice. In other words, the notion of teaching experiment can be extended to include a true mathematical experiment. When used appropriately, the approach creates conditions for collateral learning (in the spirit of Dewey) to occur including the development of skills important for engineering applications of mathematics. In the context of a mathematics teacher education program, this book addresses a call for the preparation of teachers capable of utilizing mo...

  10. Computational simulations of vorticity enhanced diffusion

    Science.gov (United States)

    Vold, Erik L.

    1999-11-01

    Computer simulations are used to investigate a phenomenon of vorticity enhanced diffusion (VED), a net transport and mixing of a passive scalar across a prescribed vortex flow field driven by a background gradient in the scalar quantity. The central issue under study here is the increase in scalar flux down the gradient and across the vortex field. The numerical scheme uses cylindrical coordinates centered with the vortex flow which allows an exact advective solution and 1D or 2D diffusion using simple numerical methods. In the results, the ratio of transport across a localized vortex region in the presence of the vortex flow over that expected for diffusion alone is evaluated as a measure of VED. This ratio is seen to increase dramatically while the absolute flux across the vortex decreases slowly as the diffusion coefficient is decreased. Similar results are found and compared for varying diffusion coefficient, D, or vortex rotation time, τv, for a constant background gradient in the transported scalar vs an interface in the transported quantity, and for vortex flow fields constant in time vs flow which evolves in time from an initial state and with a Schmidt number of order unity. A simple analysis shows that for a small diffusion coefficient, the flux ratio measure of VED scales as the vortex radius over the thickness for mass diffusion in a viscous shear layer within the vortex characterized by (Dτv)^(1/2). The phenomenon is linear as investigated here and suggests that a significant enhancement of mixing in fluids may be a relatively simple linear process. Discussion touches on how this vorticity enhanced diffusion may be related to mixing in nonlinear turbulent flows.
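
    A back-of-the-envelope check of the scaling just quoted: the flux-ratio measure of VED goes as the vortex radius over the diffusive shear-layer thickness (Dτv)^(1/2), so it grows as D shrinks. The sketch below evaluates this estimate; the radius, rotation time, and diffusion coefficients are assumed values for illustration only.

        # Evaluate the quoted VED scaling: flux ratio ~ R / sqrt(D * tau_v).
        # All numerical inputs are illustrative assumptions.
        import math

        def ved_flux_ratio(R, D, tau_v):
            """Estimated flux enhancement across a vortex of radius R."""
            return R / math.sqrt(D * tau_v)

        R = 1.0e-2    # vortex radius [m] (assumed)
        tau_v = 1.0   # vortex rotation time [s] (assumed)
        for D in (1e-5, 1e-7, 1e-9):  # decreasing diffusion coefficient [m^2/s]
            print(f"D = {D:.0e} -> flux ratio ~ {ved_flux_ratio(R, D, tau_v):.0f}")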

  11. Recent advances in computer modelling of granular systems

    OpenAIRE

    Jullien, R.; Meakin, P.; Pavlovitch, A.

    1993-01-01

    We present simple computer algorithms able to build random packings of spheres using the ballistic deposition model, and we show how they can be used to investigate several size-segregation phenomena occurring in granular systems: 1) penetration of a small sphere in a packing of large ones, 2) size segregation in the formation of a heap or when pouring a silo, 3) size segregation by shaking. In the last case, the computer simulation provides a very simple geometrical explanation of the phenome...

  12. Safety Assessment of Advanced Imaging Sequences II: Simulations

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt

    2016-01-01

    An automatic approach for simulating the emitted pressure, intensity, and MI of advanced ultrasound imaging sequences is presented. It is based on a linear simulation of pressure fields using Field II, and it is hypothesized that linear simulation can attain the needed accuracy for predicting Mechanical Index (MI) and Ispta.3 as required by FDA. The method is performed on four different imaging schemes and compared to measurements conducted using the SARUS experimental scanner. The sequences include focused emissions with an F-number of 2 with 64 elements that generate highly non-linear fields. The simulation time is between 0.67 ms and 2.8 ms per emission and imaging point, making it possible to simulate even complex emission sequences in less than 1 s for a single spatial position. The linear simulations yield a relative accuracy on MI between -12.1% and 52.3% and for Ispta.3 between -38...
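
    The MI and Ispta.3 figures above follow the standard FDA derating convention: the simulated rarefactional pressure is derated by 0.3 dB/(cm·MHz), and MI = p_r.3/√f_c. The sketch below illustrates that convention only; it is not the paper's Field II pipeline, and the pressure, frequency, and depth values are invented.

        # Standard FDA-style derating behind MI: a 0.3 dB/(cm MHz) amplitude
        # derating of the rarefactional pressure, then MI = p_r.3 / sqrt(f_c).
        # Input values are illustrative, not from the paper.
        import math

        def derate(p_mpa, f_mhz, depth_cm, alpha=0.3):
            """Apply the 0.3 dB/(cm MHz) amplitude derating at a given depth."""
            atten_db = alpha * f_mhz * depth_cm
            return p_mpa * 10 ** (-atten_db / 20.0)

        def mechanical_index(p_rarefaction_mpa, f_mhz, depth_cm):
            p_r3 = derate(p_rarefaction_mpa, f_mhz, depth_cm)
            return p_r3 / math.sqrt(f_mhz)

        # e.g. 2.5 MPa peak rarefactional pressure at 3 cm depth, 5 MHz:
        print(f"MI ~ {mechanical_index(2.5, 5.0, 3.0):.2f}")  # FDA limit is 1.9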

  13. ADVANCED TECHNIQUES FOR RESERVOIR SIMULATION AND MODELING OF NONCONVENTIONAL WELLS

    Energy Technology Data Exchange (ETDEWEB)

    Louis J. Durlofsky; Khalid Aziz

    2004-08-20

    Nonconventional wells, which include horizontal, deviated, multilateral and ''smart'' wells, offer great potential for the efficient management of oil and gas reservoirs. These wells are able to contact larger regions of the reservoir than conventional wells and can also be used to target isolated hydrocarbon accumulations. The use of nonconventional wells instrumented with downhole inflow control devices allows for even greater flexibility in production. Because nonconventional wells can be very expensive to drill, complete and instrument, it is important to be able to optimize their deployment, which requires the accurate prediction of their performance. However, predictions of nonconventional well performance are often inaccurate. This is likely due to inadequacies in some of the reservoir engineering and reservoir simulation tools used to model and optimize nonconventional well performance. A number of new issues arise in the modeling and optimization of nonconventional wells. For example, the optimal use of downhole inflow control devices has not been addressed for practical problems. In addition, the impact of geological and engineering uncertainty (e.g., valve reliability) has not been previously considered. In order to model and optimize nonconventional wells in different settings, it is essential that the tools be implemented into a general reservoir simulator. This simulator must be sufficiently general and robust and must in addition be linked to a sophisticated well model. Our research under this five year project addressed all of the key areas indicated above. The overall project was divided into three main categories: (1) advanced reservoir simulation techniques for modeling nonconventional wells; (2) improved techniques for computing well productivity (for use in reservoir engineering calculations) and for coupling the well to the simulator (which includes the accurate calculation of well index and the modeling of multiphase flow
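
    The abstract emphasizes accurate computation of the well index for coupling wells to the simulator. A common textbook baseline for a vertical well in a Cartesian grid block is Peaceman's model, sketched below; this is the standard reference formula, not the improved techniques developed in the project, and all inputs are illustrative.

        # Peaceman well index for a vertical well in an anisotropic grid block.
        # This is the textbook baseline; inputs below are illustrative only.
        import math

        def peaceman_well_index(kx, ky, dx, dy, h, rw, skin=0.0):
            """Well index WI = 2*pi*k_eff*h / (ln(r0/rw) + skin)."""
            # Peaceman's equivalent radius of the well block (anisotropic form).
            num = math.sqrt(math.sqrt(ky / kx) * dx**2 + math.sqrt(kx / ky) * dy**2)
            den = (ky / kx) ** 0.25 + (kx / ky) ** 0.25
            r0 = 0.28 * num / den
            k_eff = math.sqrt(kx * ky)
            return 2.0 * math.pi * k_eff * h / (math.log(r0 / rw) + skin)

        # Isotropic 100 m x 100 m x 10 m cell, 0.1 m wellbore radius, ~100 mD:
        print(peaceman_well_index(kx=1e-13, ky=1e-13, dx=100.0, dy=100.0, h=10.0, rw=0.1))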

  14. Teaching Computer Organization and Architecture Using Simulation and FPGA Applications

    OpenAIRE

    D. K.M. Al-Aubidy

    2007-01-01

    This paper presents the design concepts and realization of incorporating micro-operation simulation and FPGA implementation into a teaching tool for computer organization and architecture. This teaching tool helps computer engineering and computer science students become familiar, in a practical way, with computer organization and architecture through the development of their own instruction sets, computer programming and interfacing experiments. A two-pass assembler has been designed and implemente...

  15. Building an advanced climate model: Program plan for the CHAMMP (Computer Hardware, Advanced Mathematics, and Model Physics) Climate Modeling Program

    Energy Technology Data Exchange (ETDEWEB)

    1990-12-01

    The issue of global warming and related climatic changes from increasing concentrations of greenhouse gases in the atmosphere has received prominent attention during the past few years. The Computer Hardware, Advanced Mathematics, and Model Physics (CHAMMP) Climate Modeling Program is designed to contribute directly to this rapid improvement. The goal of the CHAMMP Climate Modeling Program is to develop, verify, and apply a new generation of climate models within a coordinated framework that incorporates the best available scientific and numerical approaches to represent physical, biogeochemical, and ecological processes, that fully utilizes the hardware and software capabilities of new computer architectures, that probes the limits of climate predictability, and finally that can be used to address the challenging problem of understanding the greenhouse climate issue through the ability of the models to simulate time-dependent climatic changes over extended times and with regional resolution.

  16. Evaluation of Computer Simulations for Teaching Apparel Merchandising Concepts.

    Science.gov (United States)

    Jolly, Laura D.; Sisler, Grovalynn

    1988-01-01

    The study developed and evaluated computer simulations for teaching apparel merchandising concepts. Evaluation results indicated that teaching method (computer simulation versus case study) does not significantly affect cognitive learning. Student attitudes varied, however, according to topic (profitable merchandising analysis versus retailing…

  17. Computer Simulation Models of Economic Systems in Higher Education.

    Science.gov (United States)

    Smith, Lester Sanford

    The increasing complexity of educational operations makes analytical tools, such as computer simulation models, especially desirable for educational administrators. This MA thesis examined the feasibility of developing computer simulation models of economic systems in higher education to assist decision makers in allocating resources. The report…

  18. Explore Effective Use of Computer Simulations for Physics Education

    Science.gov (United States)

    Lee, Yu-Fen; Guo, Yuying

    2008-01-01

    The dual purpose of this article is to provide a synthesis of the findings related to the use of computer simulations in physics education and to present implications for teachers and researchers in science education. We try to establish a conceptual framework for the utilization of computer simulations as a tool for learning and instruction in…

  19. The visual simulators for architecture and computer organization learning

    OpenAIRE

    Nikolić Boško; Grbanović Nenad; Đorđević Jovan

    2009-01-01

    The paper proposes a method for effective distance learning of computer architecture and organization. The proposed method is based on a software system that can be applied in any course in this field. Within this system, students can observe simulations of already created computer systems. The system also provides for the creation and simulation of switch systems.

  20. Computer Simulation of Technetium Scrubbing Section of Purex Ⅰ: Computer Simulation and Technical Parameter Analysis

    Institute of Scientific and Technical Information of China (English)

    CHEN Yan-xin; HE Hui; ZHANG Chun-long; CHANG Li; LI Rui-xue; TANG Hong-bin; YU Ting

    2012-01-01

    A computer program was developed to simulate the technetium scrubbing section (TcS) in Purex based on the theory of cascade extraction. The program can simulate the steady-state behavior of HNO3, U, Pu and Tc in TcS. The reliability of the program was verified by a cascade extraction experiment; the relative error between calculated and experimental values is roughly 10% except at a few points. The comparison between experimental and calculated results is illustrated in Fig. 1. The technical parameters of TcS were analyzed with this program, and it was found that the decontamination factor (DFTc/U) in TcS is remarkably affected by the overall consumption (molarity multiplied by volume flux) of HNO3; DFTc/U is

  1. Numerical Simulation of Multi-phase Flow in Porous Media on Parallel Computers

    CERN Document Server

    Liu, Hui; Chen, Zhangxin; Luo, Jia; Deng, Hui; He, Yanfeng

    2016-01-01

    This paper is concerned with developing parallel computational methods for two-phase flow on distributed parallel computers; techniques for linear solvers and nonlinear methods are studied, and the standard and inexact Newton methods are investigated. A multi-stage preconditioner for two-phase flow is proposed and advanced matrix processing strategies are implemented. Numerical experiments show that these computational methods are scalable and efficient, and are capable of simulating large-scale problems with tens of millions of grid blocks using thousands of CPU cores on parallel computers. The nonlinear techniques, preconditioner and matrix processing strategies can also be applied to three-phase black oil, compositional and thermal models.
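
    The inexact Newton idea mentioned above solves the Jacobian system only approximately, to a tolerance tied to the current nonlinear residual, before each update. The sketch below shows the skeleton on a tiny 2x2 system; the residual F and the crude iterative linear solver are stand-ins for the discretized flow equations and preconditioned solvers studied in the paper.

        # Skeleton of an inexact Newton method: solve J(x) dx = -F(x) only to
        # a tolerance eta*||F(x)||. The 2x2 residual below is a stand-in for
        # discretized flow equations.
        import numpy as np

        def F(x):
            return np.array([x[0]**2 + x[1] - 3.0,
                             x[0] + x[1]**2 - 5.0])

        def J(x):  # analytic Jacobian of F
            return np.array([[2.0 * x[0], 1.0],
                             [1.0, 2.0 * x[1]]])

        def inexact_solve(A, b, tol, max_iter=200):
            """Crude gradient-descent linear solve, stopped once ||b - Ax|| <= tol."""
            x = np.zeros_like(b)
            w = 1.0 / np.linalg.norm(A.T @ A)
            for _ in range(max_iter):
                r = b - A @ x
                if np.linalg.norm(r) <= tol:
                    break
                x = x + w * (A.T @ r)
            return x

        def inexact_newton(x, eta=0.1, tol=1e-8, max_iter=50):
            for _ in range(max_iter):
                r = F(x)
                if np.linalg.norm(r) < tol:
                    break
                x = x + inexact_solve(J(x), -r, eta * np.linalg.norm(r))
            return x

        print(inexact_newton(np.array([1.0, 1.0])))  # converges to ~ (1, 2)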

  2. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    Science.gov (United States)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two part model framework characterizes both the demand using a probability distribution for each type of service request as well as enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
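
    A minimal discrete-event sketch in the spirit of the two-part framework above: stochastically arriving service requests compete for a fixed pool of virtual servers, and mean waiting time is measured as the resource constraint is varied. The arrival/service rates and server counts are invented for illustration.

        # Toy discrete-event simulation: Poisson request arrivals served by a
        # pool of virtual servers. All rates and counts are assumed values.
        import heapq
        import random

        random.seed(1)

        def simulate(n_servers, arrival_rate=3.0, service_rate=1.0, n_requests=20000):
            free, waiting, wait_sum = n_servers, [], 0.0
            events = [(random.expovariate(arrival_rate), 'arrive')]
            arrivals = 1
            while events:
                clock, kind = heapq.heappop(events)
                if kind == 'arrive':
                    waiting.append(clock)
                    if arrivals < n_requests:
                        heapq.heappush(events, (clock + random.expovariate(arrival_rate), 'arrive'))
                        arrivals += 1
                else:
                    free += 1  # a server finished its request
                while free and waiting:  # dispatch queued requests to free servers
                    wait_sum += clock - waiting.pop(0)
                    free -= 1
                    heapq.heappush(events, (clock + random.expovariate(service_rate), 'depart'))
            return wait_sum / n_requests

        for c in (4, 5, 8):  # provisioning levels (offered load is 3 Erlangs)
            print(f"{c} servers -> mean wait {simulate(c):.3f} time units")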

  3. Planning of development strategy for establishment of advanced simulation of nuclear system

    International Nuclear Information System (INIS)

    In this report, a long-term development plan for each technical area has been proposed, together with a plan for a coupled code system. A consolidated code system for safety analysis is proposed for future needs, and the computing hardware needed for advanced simulation is also proposed. The best approach for future safety analysis simulation capabilities may be a dual-path program, i.e., development programs for an integrated analysis tool and for multi-scale/multi-physics analysis tools, where the former aims at reducing uncertainty and the latter at enhancing accuracy. Integrated analysis tool with risk-informed safety margin quantification: this requires a significant extension of the phenomenological and geometric capabilities of existing reactor safety analysis software, capable of detailed simulations that reduce the uncertainties. Multi-scale, multi-physics analysis tools: simplifications of complex phenomenological models and dependencies have been made in current safety analyses to accommodate computer hardware limitations. With the advent of modern computer hardware, these limitations may be removed to permit greater accuracy in the representation of the physical behavior of materials in design-basis and beyond-design-basis conditions, and hence more accurate assessment of the true safety margins based on first-principles methodology. These proposals can be utilized to develop the advanced simulation project and to formulate the organization and establishment of a high performance computing system at KAERI.

  4. Free-boundary simulations of ITER advanced scenarios

    International Nuclear Information System (INIS)

    The successful operation of ITER advanced scenarios is likely to be a major step forward in the development of controlled fusion as a power production source. ITER advanced scenarios raise specific challenges that are not encountered in presently-operated tokamaks. In this thesis, it is argued that ITER advanced operation may benefit from optimal control techniques. Optimal control ensures high performance operation while guaranteeing tokamak integrity. The application of optimal control techniques for ITER operation is assessed and it is concluded that robust optimisation is appropriate for ITER operation of advanced scenarios. Real-time optimisation schemes are discussed and it is concluded that the necessary conditions of optimality tracking approach may potentially be appropriate for ITER operation, thus offering a viable closed-loop optimal control approach. Simulations of ITER advanced operation are necessary in order to assess the present ITER design and uncover the main difficulties that may be encountered during advanced operation. The DINA-CH and CRONOS full tokamak simulator is used to simulate the operation of the ITER hybrid and steady-state scenarios. It is concluded that the present ITER design is appropriate for performing a hybrid scenario pulse lasting more than 1000 sec, with a flat-top plasma current of 12 MA, and a fusion gain of Q ≅ 8. Similarly, a steady-state scenario without internal transport barrier, with a flat-top plasma current of 10 MA, and with a fusion gain of Q ≅ 5 can be realised using the present ITER design. The sensitivity of the advanced scenarios with respect to transport models and physical assumption is assessed using CRONOS. It is concluded that the hybrid scenario and the steady-state scenario are highly sensitive to the L-H transition timing, to the value of the confinement enhancement factor, to the heating and current drive scenario during ramp-up, and, to a lesser extent, to the density peaking and pedestal

  5. Alternative energy technologies an introduction with computer simulations

    CERN Document Server

    Buxton, Gavin

    2014-01-01

    Contents: Introduction to Alternative Energy Sources; Global Warming; Pollution; Solar Cells; Wind Power; Biofuels; Hydrogen Production and Fuel Cells; Introduction to Computer Modeling; Brief History of Computer Simulations; Motivation and Applications of Computer Models; Using Spreadsheets for Simulations; Typing Equations into Spreadsheets; Functions Available in Spreadsheets; Random Numbers; Plotting Data; Macros and Scripts; Interpolation and Extrapolation; Numerical Integration and Diffe...

  6. GPU-accelerated micromagnetic simulations using cloud computing

    Science.gov (United States)

    Jermain, C. L.; Rowlands, G. E.; Buhrman, R. A.; Ralph, D. C.

    2016-03-01

    Highly parallel graphics processing units (GPUs) can improve the speed of micromagnetic simulations significantly as compared to conventional computing using central processing units (CPUs). We present a strategy for performing GPU-accelerated micromagnetic simulations by utilizing cost-effective GPU access offered by cloud computing services with an open-source Python-based program for running the MuMax3 micromagnetics code remotely. We analyze the scaling and cost benefits of using cloud computing for micromagnetics.

  7. CPU SIM: A Computer Simulator for Use in an Introductory Computer Organization-Architecture Class.

    Science.gov (United States)

    Skrein, Dale

    1994-01-01

    CPU SIM, an interactive low-level computer simulation package that runs on the Macintosh computer, is described. The program is designed for instructional use in the first or second year of undergraduate computer science, to teach various features of typical computer organization through hands-on exercises. (MSE)

  8. Prediction and Calibration Using Outputs from Multiple Computer Simulators

    OpenAIRE

    Goh, Joslin Tze Ching

    2014-01-01

    Computer simulators are widely used to describe and explore physical processes. In some cases, several simulators, which can be of different or similar fidelities, are available for this task. A big part of this thesis focuses on combining observations and model runs from multiple computer simulators to build a predictive model for the real process. The resulting models can be used to perform sensitivity analysis for the system, solve inverse problems and make predictions. The approaches ...

  9. Computational Efforts in Support of Advanced Coal Research

    Energy Technology Data Exchange (ETDEWEB)

    Suljo Linic

    2006-08-17

    The focus in this project was to employ first-principles computational methods to study the underlying molecular elementary processes that govern hydrogen diffusion through Pd membranes as well as the elementary processes that govern the CO- and S-poisoning of these membranes. Our computational methodology integrated a multiscale hierarchical modeling approach, wherein a molecular understanding of the interactions between various species is gained from ab initio quantum chemical Density Functional Theory (DFT) calculations, while a mesoscopic statistical mechanical model like kinetic Monte Carlo is employed to predict key macroscopic membrane properties such as permeability. The key developments are: (1) We have systematically coupled the ab initio calculations with Kinetic Monte Carlo (KMC) simulations to model hydrogen diffusion through the Pd-based membranes. The predicted tracer diffusivity of hydrogen atoms through the bulk of the Pd lattice from KMC simulations is in excellent agreement with experiments. (2) The KMC simulations of dissociative adsorption of H2 over the Pd(111) surface indicate that for thin membranes (less than 10 µm thick), the diffusion of hydrogen from the surface to the first subsurface layer is rate limiting. (3) Sulfur poisons the Pd surface by altering the electronic structure of the Pd atoms in the vicinity of the S atom. The KMC simulations indicate that increasing sulfur coverage drastically reduces the hydrogen coverage on the Pd surface and hence the driving force for diffusion through the membrane.
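
    The coupling described above feeds ab initio energy barriers into a kinetic Monte Carlo walk whose mean-square displacement yields a tracer diffusivity. The sketch below shows that KMC step on a generic cubic lattice of interstitial sites; the barrier, attempt frequency, and jump distance are placeholders, not the paper's DFT values for Pd.

        # Toy kinetic Monte Carlo estimate of a tracer diffusivity from an
        # Arrhenius hopping rate. Parameters are generic placeholders.
        import math
        import random

        random.seed(0)

        KB = 8.617e-5    # Boltzmann constant [eV/K]
        NU0 = 1.0e13     # attempt frequency [1/s] (assumed)
        EA = 0.15        # hopping barrier [eV] (assumed)
        A = 2.75e-10     # jump distance between sites [m] (assumed)

        def tracer_diffusivity(T, n_hops=200000):
            rate = NU0 * math.exp(-EA / (KB * T))  # single-jump Arrhenius rate
            moves = [(1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)]
            x = y = z = 0
            t = 0.0
            for _ in range(n_hops):
                t += random.expovariate(6 * rate)  # 6 neighbors -> total rate
                dx, dy, dz = random.choice(moves)
                x, y, z = x + dx, y + dy, z + dz
            msd = (x*x + y*y + z*z) * A * A
            return msd / (6.0 * t)  # Einstein relation in 3D

        print(f"D(600 K) ~ {tracer_diffusivity(600.0):.2e} m^2/s")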

  10. Computer simulated plant design for waste minimization/pollution prevention

    International Nuclear Information System (INIS)

    The book discusses several paths to pollution prevention and waste minimization using computer simulation programs. It explains new computer technologies used in the field of pollution prevention and waste management; provides information on overcoming technical, economic, and environmental barriers to waste reduction; gives case studies from industry; and covers computer-aided flow sheet design and analysis for nuclear fuel reprocessing.

  11. Advanced computational modelling for drying processes – A review

    International Nuclear Information System (INIS)

    Highlights:
    • Understanding the product dehydration process is a key aspect in drying technology.
    • Advanced modelling thereof plays an increasingly important role for developing next-generation drying technology.
    • Dehydration modelling should be more energy-oriented.
    • An integrated “nexus” modelling approach is needed to produce more energy-smart products.
    • Multi-objective process optimisation requires development of more complete multiphysics models.
    Abstract: Drying is one of the most complex and energy-consuming chemical unit operations. R and D efforts in drying technology have skyrocketed in the past decades, as new drivers emerged in this industry next to procuring prime product quality and high throughput, namely reduction of energy consumption and carbon footprint as well as improving food safety and security. Solutions are sought in optimising existing technologies or developing new ones which increase energy and resource efficiency, use renewable energy, recuperate waste heat and reduce product loss, thus also the embodied energy therein. Novel tools are required to push such technological innovations and their subsequent implementation. Particularly computer-aided drying process engineering has a large potential to develop next-generation drying technology, including more energy-smart and environmentally-friendly products and dryer systems. This review paper deals with rapidly emerging advanced computational methods for modelling dehydration of porous materials, particularly for foods. Drying is approached as a combined multiphysics, multiscale and multiphase problem. These advanced methods include computational fluid dynamics, several multiphysics modelling methods (e.g. conjugate modelling), multiscale modelling and modelling of material properties and the associated propagation of material property variability. Apart from the current challenges for each of these, future perspectives should be directed towards material property

  12. Creating Science Simulations through Computational Thinking Patterns

    Science.gov (United States)

    Basawapatna, Ashok Ram

    2012-01-01

    Computational thinking aims to outline fundamental skills from computer science that everyone should learn. As currently defined, with help from the National Science Foundation (NSF), these skills include problem formulation, logically organizing data, automating solutions through algorithmic thinking, and representing data through abstraction.…

  13. Advanced intelligent computational technologies and decision support systems

    CERN Document Server

    Kountchev, Roumen

    2014-01-01

    This book offers a state-of-the-art collection covering themes related to Advanced Intelligent Computational Technologies and Decision Support Systems which can be applied to fields like healthcare, assisting humans in solving problems. The book brings forward a wealth of ideas, algorithms and case studies in themes like: intelligent predictive diagnosis; intelligent analysis of medical images; a new format for coding single and sequences of medical images; Medical Decision Support Systems; diagnosis of Down’s syndrome; computational perspectives for electronic fetal monitoring; efficient compression of CT images; adaptive interpolation and halftoning for medical images; applications of artificial neural networks for real-life problem solving; present and perspectives for Electronic Healthcare Record Systems; adaptive approaches for noise reduction in sequences of CT images etc.

  14. Computer Simulations as an Integral Part of Intermediate Macroeconomics.

    Science.gov (United States)

    Millerd, Frank W.; Robertson, Alastair R.

    1987-01-01

    Describes the development of two interactive computer simulations which were fully integrated with other course materials. The simulations illustrate the effects of various real and monetary "demand shocks" on aggregate income, interest rates, and components of spending and economic output. Includes an evaluation of the simulations' effects on…

  15. On architectural acoustic design using computer simulation

    DEFF Research Database (Denmark)

    Schmidt, Anne Marie Due; Kirkegaard, Poul Henning

    2004-01-01

    The paper discusses the advantages and disadvantages of the programme in each phase compared to the works of architects not using acoustic simulation programmes. The conclusion of the paper is that the application of acoustic simulation programs is most beneficial in the last of three phases but an application of the...

  16. Humans, computers and wizards human (simulated) computer interaction

    CERN Document Server

    Fraser, Norman; McGlashan, Scott; Wooffitt, Robin

    2013-01-01

    Using data taken from a major European Union funded project on speech understanding, the SunDial project, this book considers current perspectives on human computer interaction and argues for the value of an approach taken from sociology which is based on conversation analysis.

  17. Development and implementation of advanced control methods for hybrid simulation

    OpenAIRE

    Kim, Hong

    2011-01-01

    Hybrid simulation is an effective way of testing structures that combines the benefits of computational analysis and experimental testing techniques. Innovative structures consist of state-of-the-art components and assemblages whose function as a system needs to be tested experimentally. Oftentimes, these components and assemblages push the controller and other testing equipment to their limits. Performing hybrid simulation with the controller in displacement control mode does not always suf...

  18. Advance Reservation based DAG Application Scheduling Simulator for Grid Environment

    OpenAIRE

    Prajapati, Harshad B.; Shah, Vipul A.

    2012-01-01

    In the last decade, scheduling of Directed Acyclic Graph (DAG) applications in the context of Grid environments has attracted the attention of many researchers. However, deployment of a Grid environment requires skills, effort, budget, and time. Although various simulation toolkits or frameworks are available for simulating Grid environments, they either support different possible studies in the Grid computing area or take a lot of effort to mold them into a form suitable for scheduling of DAG applic...
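
    The sketch below conveys the core scheduling problem such a simulator targets: placing DAG tasks on resources so that both precedence constraints and pre-booked advance-reservation windows are honored. The task graph, runtimes, and reserved windows are invented for illustration, and the greedy earliest-slot policy is just one simple choice.

        # Toy list scheduler for a DAG with advance reservations: each task is
        # placed on the earliest slot that respects precedence and avoids
        # pre-booked windows. All data below is invented.
        from collections import defaultdict

        tasks = {'A': 2, 'B': 3, 'C': 1, 'D': 2}           # task -> runtime
        deps = {'B': ['A'], 'C': ['A'], 'D': ['B', 'C']}   # task -> predecessors
        reserved = {0: [(4, 6)], 1: []}                    # resource -> busy windows

        def earliest_start(res, ready, runtime, busy):
            t = max(ready, busy[-1][1] if busy else 0)
            for (s, e) in sorted(reserved[res]):           # bump past reservations
                if t < e and t + runtime > s:
                    t = e
            return t

        def schedule():
            finish, busy = {}, defaultdict(list)
            for task in ['A', 'B', 'C', 'D']:              # a topological order
                ready = max((finish[p] for p in deps.get(task, [])), default=0)
                start, res = min((earliest_start(r, ready, tasks[task], busy[r]), r)
                                 for r in reserved)
                busy[res].append((start, start + tasks[task]))
                finish[task] = start + tasks[task]
                print(f"{task}: resource {res}, start {start}, finish {finish[task]}")

        schedule()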

  19. Classical simulation of restricted quantum computations

    OpenAIRE

    Nebhwani, Mrityunjaya

    2013-01-01

    Official master's thesis carried out in collaboration with Universitat Autònoma de Barcelona (UAB), Universitat de Barcelona (UB) and Institut de Ciències Fotòniques (ICFO). [ENGLISH] We study restricted models of measurement-based quantum computation, and we investigate whether their output probability distributions can be sampled from efficiently on a classical computer. We find that even for non-adaptive models of MBQC, if this task were feasible then a major conjecture of computational compl...

  20. Simulating advanced life support systems to test integrated control approaches

    Science.gov (United States)

    Kortenkamp, D.; Bell, S.

    Simulations allow for testing of life support control approaches before hardware is designed and built. Simulations also allow for the safe exploration of alternative control strategies during life support operation. As such, they are an important component of any life support research program and testbed. This paper describes a specific advanced life support simulation being created at NASA Johnson Space Center. It is a discrete-event simulation that is dynamic and stochastic. It simulates all major components of an advanced life support system, including crew (with variable ages, weights and genders), biomass production (with scalable plantings of ten different crops), water recovery, air revitalization, food processing, solid waste recycling and energy production. Each component is modeled as a producer of certain resources and a consumer of certain resources. The control system must monitor (via sensors) and control (via actuators) the flow of resources throughout the system to provide life support functionality. The simulation is written in an object-oriented paradigm that makes it portable, extensible and reconfigurable.
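
    A minimal sketch of the producer/consumer resource accounting described above: each component draws its inputs from a shared store and deposits its outputs every simulated step. The components and rates are invented placeholders, not values from the JSC simulation.

        # Toy producer/consumer resource-flow model of a life support system.
        # Components and rates are invented placeholders.
        store = {'water': 100.0, 'o2': 50.0, 'co2': 5.0, 'biomass': 0.0}

        components = [
            # (name, consumes {resource: rate}, produces {resource: rate})
            ('crew',           {'o2': 0.84, 'water': 3.0}, {'co2': 1.0}),
            ('plants',         {'co2': 1.0, 'water': 2.0}, {'o2': 1.0, 'biomass': 0.5}),
            ('water_recovery', {},                         {'water': 4.5}),
        ]

        def step(store):
            for name, consumes, produces in components:
                # A component runs only if all of its inputs are available.
                if all(store[r] >= rate for r, rate in consumes.items()):
                    for r, rate in consumes.items():
                        store[r] -= rate
                    for r, rate in produces.items():
                        store[r] += rate

        for day in range(1, 4):
            step(store)
            print(day, {k: round(v, 2) for k, v in store.items()})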

  1. Patient Simulation Software to Augment an Advanced Pharmaceutics Course

    Science.gov (United States)

    Schonder, Kristine

    2011-01-01

    Objective To implement and assess the effectiveness of adding a pharmaceutical care simulation program to an advanced therapeutics course. Design PharmaCAL (University of Pittsburgh), a software program that uses a branched-outcome decision making model, was used to create patient simulations to augment lectures given in the course. In each simulation, students were presented with a challenge, given choices, and then provided with consequences specific to their choices. Assessments A survey was administered at the end of the course and students indicated the simulations were enjoyable (92%), easy to use (90%), stimulated interest in critically ill patients (82%), and allowed for application of lecture material (91%). A 5-item presimulation and postsimulation test on the anemia simulation was administered to assess learning. Students answered significantly more questions correctly on the postsimulation test than on the presimulation test (p < 0.001). Seventy-eight percent of students answered the same 5 questions correctly on the final examination. Conclusion Patient simulation software that used a branched-outcome decision model was an effective supplement to class lectures in an advanced pharmaceutics course and was well-received by pharmacy students. PMID:21519411

  2. Integration of adaptive process control with computational simulation for spin-forming

    International Nuclear Information System (INIS)

    Improvements in spin-forming capabilities through upgrades to a metrology and machine control system and advances in numerical simulation techniques were studied in a two year project funded by Laboratory Directed Research and Development (LDRD) at Lawrence Livermore National Laboratory. Numerical analyses were benchmarked with spin-forming experiments and computational speeds increased sufficiently to now permit actual part forming simulations. Extensive modeling activities examined the simulation speeds and capabilities of several metal forming computer codes for modeling flat plate and cylindrical spin-forming geometries. Shape memory research created the first numerical model to describe this highly unusual deformation behavior in Uranium alloys. A spin-forming metrology assessment led to sensor and data acquisition improvements that will facilitate future process accuracy enhancements, such as a metrology frame. Finally, software improvements (SmartCAM) to the manufacturing process numerically integrate the part models to the spin-forming process and to computational simulations

  3. Application of computer simulated persons in indoor environmental modeling

    DEFF Research Database (Denmark)

    Topp, C.; Nielsen, P. V.; Sørensen, Dan Nørtoft

    2002-01-01

    Computer simulated persons are often applied when the indoor environment is modeled by computational fluid dynamics. The computer simulated persons differ in size, shape, and level of geometrical complexity, ranging from simple box or cylinder shaped heat sources to more humanlike models. Little effort, however, has been focused on the influence of the geometry. This work provides an investigation of geometrically different computer simulated persons with respect to both local and global airflow distribution. The results show that a simple geometry is sufficient when the global airflow of a ventilated enclosure is considered, as little or no influence of geometry was observed at some distance from the computer simulated person. For local flow conditions, though, a more detailed geometry should be applied in order to assess thermal and atmospheric comfort.

  4. Innovation of the computer system for the WWER-440 simulator

    International Nuclear Information System (INIS)

    The configuration of the WWER-440 simulator computer system consists of four SMEP computers. The basic data processing unit consists of two interlinked SM 52/11.M1 computers with 1 MB of main memory. This part of the computer system of the simulator controls the operation of the entire simulator, processes the programs of technology behavior simulation, of the unit information system and of other special systems, guarantees program support and the operation of the instructor's console. An SM 52/11 computer with 256 kB of main memory is connected to each unit. It is used as a communication unit for data transmission using the DASIO 600 interface. Semigraphic color displays are based on the microprocessor modules of the SM 50/40 and SM 53/10 kit supplemented with a modified TESLA COLOR 110 ST TV receiver. (J.B.). 1 fig

  5. Application of computer simulated persons in indoor environmental modeling

    DEFF Research Database (Denmark)

    Topp, C.; Nielsen, P. V.; Sørensen, Dan Nørtoft

    2002-01-01

    Computer simulated persons are often applied when the indoor environment is modeled by computational fluid dynamics. The computer simulated persons differ in size, shape, and level of geometrical complexity, ranging from simple box or cylinder shaped heat sources to more humanlike models. Little effort, however, has been focused on the influence of the geometry. This work provides an investigation of geometrically different computer simulated persons with respect to both local and global airflow distribution. The results show that a simple geometry is sufficient when the global airflow of a ventilated enclosure is considered, as little or no influence of geometry was observed at some distance from the computer simulated person. For local flow conditions, though, a more detailed geometry should be applied in order to assess thermal and atmospheric comfort.

  6. Methodology of modeling and measuring computer architectures for plasma simulations

    Science.gov (United States)

    Wang, L. P. T.

    1977-01-01

    A brief introduction is given to plasma simulation using computers and to the difficulties encountered on currently available computers. Through the use of an analyzing and measuring methodology - SARA - the control flow and data flow of a particle simulation model REM2-1/2D are exemplified. After recursive refinements the total execution time may be greatly shortened and a fully parallel data flow can be obtained. From this data flow, a matched computer architecture or organization could be configured to achieve the computation bound of an application problem. A sequential type simulation model, an array/pipeline type simulation model, and a fully parallel simulation model of the code REM2-1/2D are proposed and analyzed. This methodology can be applied to other application problems which have an implicitly parallel nature.

  7. Radiotherapy Monte Carlo simulation using cloud computing technology

    International Nuclear Information System (INIS)

    Cloud computing allows for vast computational resources to be leveraged quickly and easily in bursts as and when required. Here we describe a technique that allows for Monte Carlo radiotherapy dose calculations to be performed using GEANT4 and executed in the cloud, with relative simulation cost and completion time evaluated as a function of machine count. As expected, simulation completion time decreases as 1/n for n parallel machines, and relative simulation cost is found to be optimal where n is a factor of the total simulation time in hours. Using the technique, we demonstrate the potential usefulness of cloud computing as a solution for rapid Monte Carlo simulation for radiotherapy dose calculation without the need for dedicated local computer hardware, as a proof of principle.
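
    The cost and completion-time scaling reported above is easy to reproduce arithmetically. Assuming per-machine-hour billing (an assumption; the record does not state the billing model), n machines finish a T-hour job in T/n hours, but each machine is billed for whole hours, so the total cost n*ceil(T/n) is minimal exactly when n divides T:

      import math

      T = 12       # serial simulation time in hours (illustrative value)
      rate = 0.10  # assumed price per machine-hour

      for n in (4, 5, 6, 10, 12):
          wall = T / n                          # completion time falls as 1/n
          cost = n * math.ceil(T / n) * rate    # whole-hour billing per machine
          print(f"n={n:3d}  wall={wall:5.2f} h  cost=${cost:.2f}")
      # Factors of 12 (4, 6, 12) hit the minimum cost; n=5 and n=10 pay for idle time.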

  8. SiMR: A simulator for learning computer architecture

    OpenAIRE

    Sánchez Carracedo, Fermín; Megías Jiménez, David (Coord.); Prieto Blázquez, Josep

    2011-01-01

    This paper presents SiMR, a simulator of the Rudimentary Machine designed to be used in a first course of computer architecture of Software Engineering and Computer Engineering programmes. The Rudimentary Machine contains all the basic elements in a RISC computer, and SiMR allows editing, assembling and executing programmes for this processor. SiMR is used at the Universitat Oberta de Catalunya as one of the most important resources in the Virtual Computing Architecture and Organisation Labor...

  9. Lessons Learned From Dynamic Simulations of Advanced Fuel Cycles

    Energy Technology Data Exchange (ETDEWEB)

    Steven J. Piet; Brent W. Dixon; Jacob J. Jacobson; Gretchen E. Matthern; David E. Shropshire

    2009-04-01

    Years of performing dynamic simulations of advanced nuclear fuel cycle options provide insights into how they could work and how one might transition from the current once-through fuel cycle. This paper summarizes those insights from the context of the 2005 objectives and goals of the Advanced Fuel Cycle Initiative (AFCI). Our intent is not to compare options, assess options versus those objectives and goals, nor recommend changes to those objectives and goals. Rather, we organize what we have learned from dynamic simulations in the context of the AFCI objectives for waste management, proliferation resistance, uranium utilization, and economics. Thus, we do not merely describe “lessons learned” from dynamic simulations but attempt to answer the “so what” question by using this context. The analyses have been performed using the Verifiable Fuel Cycle Simulation of Nuclear Fuel Cycle Dynamics (VISION). We observe that the 2005 objectives and goals do not address many of the inherently dynamic discriminators among advanced fuel cycle options and transitions thereof.

  10. Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure

    Science.gov (United States)

    Wang, Henry; Ma, Yunzhi; Pratx, Guillem; Xing, Lei

    2011-09-01

    Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47× speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed. This work was presented in part at the 2010 Annual Meeting of the American Association of Physicists in Medicine (AAPM), Philadelphia, PA.
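
    The master/worker pattern described above - partition the histories, run them independently with distinct seeds, aggregate the tallies - can be sketched locally with Python's multiprocessing standing in for the virtual cluster and MPI; the worker below is a trivial stand-in for EGS5, not the real transport code.

      import random
      from multiprocessing import Pool

      def worker(args):
          """Stand-in for one worker node: run n histories with its own seed."""
          n_histories, seed = args
          rng = random.Random(seed)
          # Toy 'deposited energy' tally instead of real particle transport.
          return sum(rng.random() for _ in range(n_histories))

      def run_parallel_mc(total_histories=1_000_000, n_nodes=8):
          per_node = total_histories // n_nodes
          jobs = [(per_node, seed) for seed in range(n_nodes)]  # distinct seeds
          with Pool(n_nodes) as pool:
              tallies = pool.map(worker, jobs)  # master scatters, workers run
          return sum(tallies)                   # master aggregates the results

      if __name__ == "__main__":
          print(f"aggregated tally: {run_parallel_mc():.1f}")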

  11. Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure

    International Nuclear Information System (INIS)

    Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47x speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed. (note)

  12. Computer simulation of gas flow in the flash smelting furnace

    Energy Technology Data Exchange (ETDEWEB)

    Jokilaakso, A.; Yang, Yongxiang; Teppo, O.; Ahokainen, T.; Haenninen, A.

    1993-12-31

    This report presents the detailed results of computer simulation of flow in the Outokumpu flash smelting furnace. The work belongs to the project SULA-10: 'Reaction and flow dynamic modelling of suspension smelting technology', and was carried out in the Laboratory of Materials Processing and Powder Metallurgy at Helsinki University of Technology. The geometry of the flash smelting furnace was simulated with laboratory-scale water and gas models. The same geometry was created as computational grids, in which the flow situation was analysed. The results from the laboratory models and computational grids were found to be in good agreement. The computer simulation was then extended to an industrial-scale flash smelting furnace. Cold gas flow was analysed in order to obtain a general flow pattern and a working computational grid for future work with heat transfer, two-phase flow and chemical reactions. The computation was carried out with the PHOENICS-based software EasyFlow and CFD2000. (orig.)

  13. Computer simulations: tools for population and evolutionary genetics

    OpenAIRE

    Hoban, Sean; Bertorelle, Giorgio; Gaggiotti, Oscar E

    2012-01-01

    Computer simulations are excellent tools for understanding the evolutionary and genetic consequences of complex processes whose interactions cannot be analytically predicted. Simulations have traditionally been used in population genetics by a fairly small community with programming expertise, but the recent availability of dozens of sophisticated, customizable software packages for simulation now makes simulation an accessible option for researchers in many fields. The in silico genetic data...

  14. Computer Simulation of Fire Dynamics in Industrial Hall

    International Nuclear Information System (INIS)

    In this paper, computer simulation of smoke spread dynamics in an industrial hall is investigated. A set of simulations of fire in three industrial halls with the same geometry, varying in ceiling height, is realized using the FDS fire simulator, version 6. The obtained simulation results are described with a focus on the impact of ceiling height and fire barriers on the fire course and smoke spread dynamics.

  15. Computed radiography simulation using the Monte Carlo code MCNPX

    International Nuclear Information System (INIS)

    Simulating x-ray images has been of great interest in recent years as it makes possible an analysis of how x-ray images are affected by relevant operating parameters. In this paper, a procedure for simulating computed radiographic images using the Monte Carlo code MCNPX is proposed. The sensitivity curve of the BaFBr image plate detector as well as the characteristic noise of a 16-bit computed radiography system were considered during the methodology's development. The results obtained confirm that the proposed procedure for simulating computed radiographic images is satisfactory, as it allows obtaining results comparable with experimental data. (author)
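
    The post-processing step such a methodology implies can be sketched as follows: map a simulated fluence tally through an image-plate sensitivity curve, add detector noise, and digitize to 16 bits. The logarithmic response, the noise level, and the mock tally below are illustrative assumptions, not the fitted curves of the paper.

      import numpy as np

      def to_cr_image(fluence, gain=5000.0, sigma=20.0):
          """Convert a 2-D fluence map to a 16-bit CR image (illustrative model)."""
          rng = np.random.default_rng(0)
          signal = gain * np.log1p(fluence / fluence.mean())      # assumed log response
          noisy = signal + rng.normal(0.0, sigma, fluence.shape)  # assumed Gaussian noise
          return np.clip(noisy, 0, 65535).astype(np.uint16)       # 16-bit digitization

      fluence = np.random.default_rng(1).exponential(1.0, (64, 64))  # mock MC tally
      image = to_cr_image(fluence)
      print(image.dtype, image.min(), image.max())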

  16. Large-scale computing techniques for complex system simulations

    CERN Document Server

    Dubitzky, Werner; Schott, Bernard

    2012-01-01

    Complex systems modeling and simulation approaches are being adopted in a growing number of sectors, including finance, economics, biology, astronomy, and many more. Technologies ranging from distributed computing to specialized hardware are explored and developed to address the computational requirements arising in complex systems simulations. The aim of this book is to present a representative overview of contemporary large-scale computing technologies in the context of complex systems simulations applications. The intention is to identify new research directions in this field and

  17. Computer Simulation Performed for Columbia Project Cooling System

    Science.gov (United States)

    Ahmad, Jasim

    2005-01-01

    This demo shows a high-fidelity simulation of the air flow in the main computer room housing the Columbia (10,024 Intel Itanium processors) system. The simulation assesses the performance of the cooling system, identifies deficiencies, and recommends modifications to eliminate them. It used two in-house software packages on NAS supercomputers: Chimera Grid Tools to generate a geometric model of the computer room and the OVERFLOW-2 code for fluid and thermal simulation. This state-of-the-art technology can be easily extended to provide a general capability for air flow analyses in any modern computer room.

  18. DOE Advanced Scientific Computing Advisory Subcommittee (ASCAC) Report: Top Ten Exascale Research Challenges

    Energy Technology Data Exchange (ETDEWEB)

    Lucas, Robert [University of Southern California, Information Sciences Institute]; Ang, James [Sandia National Laboratories]; Bergman, Keren [Columbia University]; Borkar, Shekhar [Intel]; Carlson, William [Institute for Defense Analyses]; Carrington, Laura [University of California, San Diego]; Chiu, George [IBM]; Colwell, Robert [DARPA]; Dally, William [NVIDIA]; Dongarra, Jack [University of Tennessee]; Geist, Al [Oak Ridge National Laboratory]; Haring, Rud [IBM]; Hittinger, Jeffrey [Lawrence Livermore National Laboratory]; Hoisie, Adolfy [Pacific Northwest National Laboratory]; Klein, Dean [Micron]; Kogge, Peter [University of Notre Dame]; Lethin, Richard [Reservoir Labs]; Sarkar, Vivek [Rice University]; Schreiber, Robert [Hewlett Packard]; Shalf, John [Lawrence Berkeley National Laboratory]; Sterling, Thomas [Indiana University]; Stevens, Rick [Argonne National Laboratory]; Bashor, Jon [Lawrence Berkeley National Laboratory]; Brightwell, Ron [Sandia National Laboratories]; Coteus, Paul [IBM]; Debenedictus, Erik [Sandia National Laboratories]; Hiller, Jon [Science and Technology Associates]; Kim, K. H. [IBM]; Langston, Harper [Reservoir Labs]; Murphy, Richard [Micron]; Webster, Clayton [Oak Ridge National Laboratory]; Wild, Stefan [Argonne National Laboratory]; Grider, Gary [Los Alamos National Laboratory]; Ross, Rob [Argonne National Laboratory]; Leyffer, Sven [Argonne National Laboratory]; Laros III, James [Sandia National Laboratories]

    2014-02-10

    Exascale computing systems are essential for the scientific fields that will transform the 21st century global economy, including energy, biotechnology, nanotechnology, and materials science. Progress in these fields is predicated on the ability to perform advanced scientific and engineering simulations, and analyze the deluge of data. On July 29, 2013, ASCAC was charged by Patricia Dehmer, the Acting Director of the Office of Science, to assemble a subcommittee to provide advice on exascale computing. This subcommittee was directed to return a list of no more than ten technical approaches (hardware and software) that will enable the development of a system that achieves the Department's goals for exascale computing. Numerous reports over the past few years have documented the technical challenges and the non-viability of simply scaling existing computer designs to reach exascale. The technical challenges revolve around energy consumption, memory performance, resilience, extreme concurrency, and big data. Drawing from these reports and more recent experience, this ASCAC subcommittee has identified the top ten computing technology advancements that are critical to making a capable, economically viable, exascale system.

  19. DOE Advanced Scientific Computing Advisory Committee (ASCAC) Report: Exascale Computing Initiative Review

    Energy Technology Data Exchange (ETDEWEB)

    Reed, Daniel [University of Iowa]; Berzins, Martin [University of Utah]; Pennington, Robert; Sarkar, Vivek [Rice University]; Taylor, Valerie [Texas A&M University]

    2015-08-01

    On November 19, 2014, the Advanced Scientific Computing Advisory Committee (ASCAC) was charged with reviewing the Department of Energy’s conceptual design for the Exascale Computing Initiative (ECI). In particular, this included assessing whether there are significant gaps in the ECI plan or areas that need to be given priority or extra management attention. Given the breadth and depth of previous reviews of the technical challenges inherent in exascale system design and deployment, the subcommittee focused its assessment on organizational and management issues, considering technical issues only as they informed organizational or management priorities and structures. This report presents the observations and recommendations of the subcommittee.

  20. Integration of Advanced Simulation and Visualization for Manufacturing Process Optimization

    Science.gov (United States)

    Zhou, Chenn; Wang, Jichao; Tang, Guangwu; Moreland, John; Fu, Dong; Wu, Bin

    2016-05-01

    The integration of simulation and visualization can provide a cost-effective tool for process optimization, design, scale-up and troubleshooting. The Center for Innovation through Visualization and Simulation (CIVS) at Purdue University Northwest has developed methodologies for such integration with applications in various manufacturing processes. The methodologies have proven to be useful for virtual design and virtual training to provide solutions addressing issues on energy, environment, productivity, safety, and quality in steel and other industries. In collaboration with its industrial partners, CIVS has provided solutions to companies, saving over US$38 million. CIVS is currently working with the steel industry to establish an industry-led Steel Manufacturing Simulation and Visualization Consortium through the support of a National Institute of Standards and Technology AMTech Planning Grant. The consortium focuses on supporting development and implementation of simulation and visualization technologies to advance steel manufacturing across the value chain.

  1. 3rd International Workshop on Advances in Simulation-Driven Optimization and Modeling

    CERN Document Server

    Leifsson, Leifur; Yang, Xin-She

    2016-01-01

    This edited volume is devoted to the now-ubiquitous use of computational models across most disciplines of engineering and science, led by a trio of world-renowned researchers in the field. Focused on recent advances of modeling and optimization techniques aimed at handling computationally-expensive engineering problems involving simulation models, this book will be an invaluable resource for specialists (engineers, researchers, graduate students) working in areas as diverse as electrical engineering, mechanical and structural engineering, civil engineering, industrial engineering, hydrodynamics, aerospace engineering, microwave and antenna engineering, ocean science and climate modeling, and the automotive industry, where design processes are heavily based on CPU-heavy computer simulations. Various techniques, such as knowledge-based optimization, adjoint sensitivity techniques, and fast replacement models (to name just a few) are explored in-depth along with an array of the latest techniques to optimize the...

  2. Implementation of a blade element UH-60 helicopter simulation on a parallel computer architecture in real-time

    Science.gov (United States)

    Moxon, Bruce C.; Green, John A.

    1990-01-01

    A high-performance platform for development of real-time helicopter flight simulations based on a simulation development and analysis platform combining a parallel simulation development and analysis environment with a scalable multiprocessor computer system is described. Simulation functional decomposition is covered, including the sequencing and data dependency of simulation modules and simulation functional mapping to multiple processors. The multiprocessor-based implementation of a blade-element simulation of the UH-60 helicopter is presented, and a prototype developed for a TC2000 computer is generalized in order to arrive at a portable multiprocessor software architecture. It is pointed out that the proposed approach coupled with a pilot's station creates a setting in which simulation engineers, computer scientists, and pilots can work together in the design and evaluation of advanced real-time helicopter simulations.

  3. Computational simulation of the extensive air shower

    International Nuclear Information System (INIS)

    Full text: Cosmic Rays (CRs) are defined as particles of cosmic origin reaching the top of Earth's atmosphere. Two central questions guide the research of CRs. The detection of CRs relies on indirect experimental techniques, owing to the very low flux of primaries. The primary particle interacts with the atmosphere and generates secondary particles in a large cascade called an Extensive Air Shower (EAS). The code CORSIKA (COsmic Ray SImulations for KAscade) is a detailed Monte Carlo program whose aim is to simulate the evolution of EASs through the atmosphere. With this project we aim to produce an extensive library of showers, using several interaction models and different energies (from very-high to ultra-high energies), generating important data to be studied by the cosmic rays group of UFABC, and to understand the influence of the models of high-energy hadronic interactions on particle production and on the physical parameters of the showers. After performing the simulations, we will study the elongation rate for different primary compositions and determine the rate of event formation and the energy threshold of the main current events. Finally, we will propose processes not contemplated in the simulation or possible improvements in existing models. (author)

  4. Advanced Simulation Capability for Environmental Management (ASCEM) Phase II Demonstration

    Energy Technology Data Exchange (ETDEWEB)

    Freshley, M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)]; Hubbard, S. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)]; Flach, G. [Savannah River National Lab. (SRNL), Aiken, SC (United States)]; Freedman, V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)]; Agarwal, D. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)]; Andre, B. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)]; Bott, Y. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)]; Chen, X. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)]; Davis, J. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)]; Faybishenko, B. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)]; Gorton, I. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)]; Murray, C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)]; Moulton, D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)]; Meyer, J. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)]; Rockhold, M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)]; Shoshani, A. [LBNL]; Steefel, C. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)]; Wainwright, H. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)]; Waichler, S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)]

    2012-09-28

    In 2009, the National Academies of Science (NAS) reviewed and validated the U.S. Department of Energy Office of Environmental Management (EM) Technology Program in its publication, Advice on the Department of Energy’s Cleanup Technology Roadmap: Gaps and Bridges. The NAS report outlined prioritization needs for the Groundwater and Soil Remediation Roadmap, concluded that contaminant behavior in the subsurface is poorly understood, and recommended further research in this area as a high priority. To address this NAS concern, the EM Office of Site Restoration began supporting the development of the Advanced Simulation Capability for Environmental Management (ASCEM). ASCEM is a state-of-the-art scientific approach that uses an integration of toolsets for understanding and predicting contaminant fate and transport in natural and engineered systems. The ASCEM modeling toolset is modular and open source. It is divided into three thrust areas: Multi-Process High Performance Computing (HPC), Platform and Integrated Toolsets, and Site Applications. The ASCEM toolsets will facilitate integrated approaches to modeling and site characterization that enable robust and standardized assessments of performance and risk for EM cleanup and closure activities. During fiscal year 2012, the ASCEM project continued to make significant progress in capabilities development. Capability development occurred in both the Platform and Integrated Toolsets and Multi-Process HPC Simulator areas. The new Platform and Integrated Toolsets capabilities provide the user an interface and the tools necessary for end-to-end model development that includes conceptual model definition, data management for model input, model calibration and uncertainty analysis, and model output processing including visualization. The new HPC Simulator capabilities target increased functionality of process model representations, toolsets for interaction with the Platform, and model confidence testing and verification for

  5. Requirements for advanced simulation of nuclear reactor and chemical separation plants.

    Energy Technology Data Exchange (ETDEWEB)

    Palmiotti, G.; Cahalan, J.; Pfeiffer, P.; Sofu, T.; Taiwo, T.; Wei, T.; Yacout, A.; Yang, W.; Siegel, A.; Insepov, Z.; Anitescu, M.; Hovland, P.; Pereira, C.; Regalbuto, M.; Copple, J.; Willamson, M.

    2006-12-11

    This report presents requirements for advanced simulation of nuclear reactor and chemical processing plants that are of interest to the Global Nuclear Energy Partnership (GNEP) initiative. Justification for advanced simulation and some examples of grand challenges that will benefit from it are provided. An integrated software tool whose main components are, whenever possible, based on first principles is proposed as a possible future approach for dealing with the complex problems linked to the simulation of nuclear reactor and chemical processing plants. The main benefits associated with better integrated simulation have been identified as: a reduction of design margins, a decrease in the number of experiments in support of the design process, a shortening of the developmental design cycle, and a better understanding of the physical phenomena and the related underlying fundamental processes. For each component of the proposed integrated software tool, background information, functional requirements, current tools and approaches, and proposed future approaches are provided. Whenever possible, current uncertainties have been quoted and existing limitations have been presented. Desired target accuracies, with associated benefits to the different aspects of the nuclear reactor and chemical processing plants, are also given. In many cases the possible gains associated with better simulation have been identified, quantified, and translated into economic benefits.

  6. Two inviscid computational simulations of separated flow about airfoils

    Science.gov (United States)

    Barnwell, R. W.

    1976-01-01

    Two inviscid computational simulations of separated flow about airfoils are described. The basic computational method is the line relaxation finite-difference method. Viscous separation is approximated with inviscid free-streamline separation. The point of separation is specified, and the pressure in the separation region is calculated. In the first simulation, the empiricism of constant pressure in the separation region is employed. This empiricism is easier to implement with the present method than with singularity methods. In the second simulation, acoustic theory is used to determine the pressure in the separation region. The results of both simulations are compared with experiment.
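
    Line relaxation, the finite-difference method named above, updates one grid line at a time by solving a tridiagonal system along it. A minimal sketch for Laplace's equation follows (a generic illustration of the method, not the paper's airfoil formulation):

      import numpy as np

      def thomas(a, b, c, d):
          """Solve a tridiagonal system: a sub-, b main-, c super-diagonal, d RHS."""
          n = len(d)
          cp, dp = np.empty(n), np.empty(n)
          cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
          for i in range(1, n):
              m = b[i] - a[i] * cp[i - 1]
              cp[i] = c[i] / m
              dp[i] = (d[i] - a[i] * dp[i - 1]) / m
          x = np.empty(n)
          x[-1] = dp[-1]
          for i in range(n - 2, -1, -1):
              x[i] = dp[i] - cp[i] * x[i + 1]
          return x

      def line_relax(u, sweeps=500):
          """Row-by-row line relaxation for Laplace's equation, Dirichlet borders."""
          ny, nx = u.shape
          n = nx - 2
          a = np.full(n, -1.0); b = np.full(n, 4.0); c = np.full(n, -1.0)
          for _ in range(sweeps):
              for j in range(1, ny - 1):
                  d = u[j - 1, 1:-1] + u[j + 1, 1:-1]  # coupling to adjacent rows
                  d[0] += u[j, 0]; d[-1] += u[j, -1]   # fixed boundary columns
                  u[j, 1:-1] = thomas(a, b, c, d)
          return u

      u = np.zeros((32, 32)); u[0, :] = 1.0  # hot top wall, cold elsewhere
      print(f"centre value: {line_relax(u)[16, 16]:.3f}")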

  7. Advances in Simulation of Wave Interaction with Extended MHD Phenomena

    International Nuclear Information System (INIS)

    The Integrated Plasma Simulator (IPS) provides a framework within which some of the most advanced, massively-parallel fusion modeling codes can be interoperated to provide a detailed picture of the multi-physics processes involved in fusion experiments. The presentation will cover four topics: (1) recent improvements to the IPS, (2) application of the IPS for very high resolution simulations of ITER scenarios, (3) studies of resistive and ideal MHD stability in tokamak discharges using IPS facilities, and (4) the application of RF power in the electron cyclotron range of frequencies to control slowly growing MHD modes in tokamaks and initial evaluations of optimized locations for RF power deposition.

  8. Advanced simulation of windmills in electric power supply

    DEFF Research Database (Denmark)

    Akhmatov, Vladislav; Knudsen, Hans; Nielsen, Arne Hejde

    2000-01-01

    An advanced model of a grid-connected windmill is set up where the windmill is a complex electro-mechanical system. The windmill model is implemented as a standardised component in the dynamic simulation tool, PSS/E, which makes it possible to investigate the dynamic behaviour of grid-connected windmills as a part of realistic electrical grid models. That means an arbitrary number of wind farms or single windmills within an arbitrary network configuration. The windmill model may be applied to studies of electric power system stability and of power quality as well. It is found that a grid...

  9. Computer simulation of confined liquid crystal dynamics

    CERN Document Server

    Webster, R E

    2001-01-01

    Results are presented from a series of simulations undertaken to determine whether dynamic processes observed in device-scale liquid crystal cells confined between aligning substrates can be simulated in a molecular system using parallel molecular dynamics of the Gay-Berne model. In a nematic cell, on removal of an aligning field, initial near-surface director relaxation can induce flow, termed 'backflow', in the liquid. This, in turn, can cause director rotation, termed 'orientational kickback', in the centre of the cell. Simulations are performed of the relaxation in nematic systems confined between substrates with a common alignment on removal of an aligning field. Further simulations are performed of the formation of structures in confined smectic systems where layer tilt is induced by an imposed surface pretilt. Results show that bookshelf, chevron and tilted layer structures are observable in a confined Gay-Berne system. The formation and stability of the chevron structure are shown to be influenced by surface slip. Results...

  10. COMPUTER SIMULATION SYSTEM OF STRETCH REDUCING MILL

    Institute of Scientific and Technical Information of China (English)

    B.Y. Sun; S.J. Yuan

    2007-01-01

    The principle of the stretch reducing process is analyzed and three models of pass design are established. The simulations cover variables such as stress, strain, the stretches between the stands, the size parameters of the steel tube, and the roll force parameters. According to its product catalogs the system can automatically divide the pass series, formulate the rolling table, and simulate the basic technological parameters in the stretch reducing process. All modules are integrated based on the developing environment of VB6. The system can draw simulation curves and pass pictures. Three kinds of database, including the material database, pass design database, and product database, are devised using Microsoft Access; these can be directly edited, corrected, and searched.

  11. Advanced 3D Photocathode Modeling and Simulations Final Report

    International Nuclear Information System (INIS)

    High brightness electron beams required by the proposed Next Linear Collider demand strong advances in photocathode electron gun performance. Significant improvement in the production of such beams with rf photocathode electron guns is hampered by the lack of high-fidelity simulations. The critical missing piece in existing gun codes is a physics-based, detailed treatment of the very complex and highly nonlinear photoemission process.

  12. Computer Simulation of Electric Field Lines.

    Science.gov (United States)

    Kirkup, L.

    1985-01-01

    Describes a computer program which plots electric field line plots. Includes program listing, sample diagrams produced on a BBC model B microcomputer (which could be produced on other microcomputers by modifying the program), and a discussion of the properties of field lines. (JN)

  13. Polyelectrolytes in Solution - Recent Computer Simulations

    OpenAIRE

    Holm, Christian; Kremer, Kurt

    1998-01-01

    We present a short overview of recent MD simulations of systems of fully flexible polyelectrolyte chains with explicitly treated counterions using the full Coulomb potential. The main emphasis is on the conformational properties of the polymers, with a short discussion of counterion condensation.

  14. On Architectural Acoustics Design using Computer Simulation

    DEFF Research Database (Denmark)

    Schmidt, Anne Marie Due; Kirkegaard, Poul Henning

    2004-01-01

    architect without this information is discussed. The conclusion of the paper is that the application of acoustical simulation programs is most beneficial in the last of three phases, but that an application of the program to the first two phases would be preferable and possible with an improvement of the...

  15. Evaluating fluid behavior in advanced reactor systems using coupled computational fluid dynamics and systems analysis tools

    International Nuclear Information System (INIS)

    Simulation of some fluid phenomena associated with Generation IV reactors requires the capability of modeling mixing in two- or three-dimensional flow. At the same time, the flow condition of interest is often transient and depends upon boundary conditions dictated by the system behavior as a whole. Computational Fluid Dynamics (CFD) is an ideal tool for simulating mixing and three-dimensional flow in system components, whereas a system analysis tool is ideal for modeling the entire system. This paper presents the reasoning which has led to coupled CFD and systems analysis software for analyzing the behavior of advanced reactor fluid systems. In addition, the kinds of scenarios where this capability is important are identified. The important role of a coupled CFD/systems analysis code tool in the overall calculation scheme for a Very High Temperature Reactor is described. The manner in which coupled systems analysis and CFD codes will be used to evaluate the mixing behavior in a plenum for transient boundary conditions is described. The calculation methodology forms the basis for future coupled calculations that will examine the behavior of such systems at a spectrum of conditions, including transient accident conditions, that define the operational and accident envelope of the subject system. The methodology and analysis techniques demonstrated herein are a key technology that in part forms the backbone of the advanced techniques employed in the evaluation of advanced designs and their operational characteristics for the Generation IV advanced reactor systems. (authors)
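
    The coupling strategy argued for here is, at its core, an exchange of boundary conditions every time step: the systems code supplies inlet conditions to the CFD domain, and the CFD result is fed back as the component state. A toy sketch of that handshake, with both codes reduced to one-line surrogate models (all values invented):

      # Toy explicit coupling loop: systems code <-> CFD plenum model.
      # Both "codes" are one-line surrogates; only the exchange pattern matters.

      def system_step(t):
          """Systems-analysis surrogate: inlet temperature ramps during a transient."""
          return 600.0 + 2.0 * t                 # K, assumed boundary condition

      def cfd_step(t_in, t_plenum, dt=1.0, tau=20.0):
          """CFD surrogate: perfectly mixed plenum relaxing toward the inlet state."""
          return t_plenum + dt * (t_in - t_plenum) / tau

      t_plenum = 600.0
      for step in range(100):
          t_in = system_step(step)               # systems code -> CFD boundary condition
          t_plenum = cfd_step(t_in, t_plenum)    # CFD -> updated component state
      print(f"plenum temperature after the transient: {t_plenum:.1f} K")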

  16. Process Training Derived from a Computer Simulation Theory

    Science.gov (United States)

    Holzman, Thomas G.; And Others

    1976-01-01

    Discusses a study which investigated whether a computer simulation model could suggest subroutines that were instructable and whether instruction on these subroutines could facilitate subjects' solutions to the problem task. (JM)

  17. MINEXP, A Computer-Simulated Mineral Exploration Program

    Science.gov (United States)

    Smith, Michael J.; And Others

    1978-01-01

    This computer simulation is designed to put students into a realistic decision making situation in mineral exploration. This program can be used with different exploration situations such as ore deposits, petroleum, ground water, etc. (MR)

  18. Computer simulation of sulfhydryl collectors and their derivatives

    International Nuclear Information System (INIS)

    The present work is devoted to computer simulation of sulfhydryl collectors and their derivatives. Short-chain carboxylic acids modified by dithio fragments are synthesized, as are modified sulfhydryl collectors, and the properties of the reagents are studied.

  19. Computer simulation of confined liquid crystal dynamics

    International Nuclear Information System (INIS)

    Results are presented from a series of simulations undertaken to determine whether dynamic processes observed in device-scale liquid crystal cells confined between aligning substrates can be simulated in a molecular system using parallel molecular dynamics of the Gay-Berne model. In a nematic cell, on removal of an aligning field, initial near-surface director relaxation can induce flow, termed 'backflow' in the liquid. This, in turn, can cause director rotation, termed 'orientational kickback', in the centre of the cell. Simulations are performed of the relaxation in nematic systems confined between substrates with a common alignment on removal of an aligning field. Results show that relaxation timescales of medium sized systems are accessible. Following this, simulations are performed of relaxation in hybrid aligned nematic systems, where each surface induces a different alignment. Flow patterns associated with director reorientation are observed. The damped oscillatory nature of the relaxation process suggests that the behaviour of these systems is dominated by orientational elastic forces and that the observed director motion and flow do not correspond to the macroscopic processes of backflow and kickback. Chevron structures can occur in confined smectic cells which develop two domains of equal and opposite layer tilt on cooling. Layer tilting is thought to be caused by a need to reconcile a mismatch between bulk and surface smectic layer spacing. Here, simulations are performed of the formation of structures in confined smectic systems where layer tilt is induced by an imposed surface pretilt. Results show that bookshelf, chevron and tilted layer structures are observable in a confined Gay-Berne system. The formation and stability of the chevron structure are shown to be influenced by surface slip. (author)

  20. Decontamination planning based on computer simulation code CDE

    International Nuclear Information System (INIS)

    Decontamination planning based on the computer simulation code CDE is discussed in this paper. A large amount of radionuclides was discharged to the environment in the accident at the Tokyo Electric Power Company Fukushima Dai-ichi Nuclear Power Plant. CDE has been developed to support planning of the decontamination. The present study validates that computer simulation is very useful for predicting the effect of a decontamination scenario before action is taken, and for planning the decontamination. (J.P.N.)

  1. Computer Simulations of Soft Matter: Linking the Scales

    OpenAIRE

    Raffaello Potestio; Christine Peter; Kurt Kremer

    2014-01-01

    In the last few decades, computer simulations have become a fundamental tool in the field of soft matter science, allowing researchers to investigate the properties of a large variety of systems. Nonetheless, even the most powerful computational resources presently available are, in general, sufficient to simulate complex biomolecules only for a few nanoseconds. This limitation is often circumvented by using coarse-grained models, in which only a subset of the system’s degrees of freedom is r...

  2. Computer simulation, rhetoric, and the scientific imagination how virtual evidence shapes science in the making and in the news

    CERN Document Server

    Roundtree, Aimee Kendall

    2013-01-01

    Computer simulations help advance climatology, astrophysics, and other scientific disciplines. They are also at the crux of several high-profile cases of science in the news. How do simulation scientists, with little or no direct observations, make decisions about what to represent? What is the nature of simulated evidence, and how do we evaluate its strength? Aimee Kendall Roundtree suggests answers in Computer Simulation, Rhetoric, and the Scientific Imagination. She interprets simulations in the sciences by uncovering the argumentative strategies that underpin the production and dissemination...

  3. Understanding Islamist political violence through computational social simulation

    Energy Technology Data Exchange (ETDEWEB)

    Watkins, Jennifer H [Los Alamos National Laboratory; Mackerrow, Edward P [Los Alamos National Laboratory; Patelli, Paolo G [Los Alamos National Laboratory; Eberhardt, Ariane [Los Alamos National Laboratory; Stradling, Seth G [Los Alamos National Laboratory

    2008-01-01

    Understanding the process that enables political violence is of great value in reducing the future demand for and support of violent opposition groups. Methods are needed that allow alternative scenarios and counterfactuals to be scientifically researched. Computational social simulation shows promise in developing 'computer experiments' that would be unfeasible or unethical in the real world. Additionally, the process of modeling and simulation reveals and challenges assumptions that may not be noted in theories, exposes areas where data is not available, and provides a rigorous, repeatable, and transparent framework for analyzing the complex dynamics of political violence. This paper demonstrates the computational modeling process using two simulation techniques: system dynamics and agent-based modeling. The benefits and drawbacks of both techniques are discussed. In developing these social simulations, we discovered that the social science concepts and theories needed to accurately simulate the associated psychological and social phenomena were lacking.
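
    Of the two techniques compared, system dynamics is the quicker to sketch: stocks and flows integrated forward in time. A deliberately simple stock-flow fragment follows; the variables and rates are invented for illustration and are not taken from the Los Alamos models.

      # Minimal system-dynamics fragment: one stock (supporters of a violent
      # group) with a grievance-driven inflow and an attrition outflow.

      dt, t_end = 0.1, 50.0
      support = 100.0        # stock: current number of supporters (invented)
      grievance_rate = 5.0   # inflow: new supporters per unit time (invented)
      attrition = 0.04       # outflow: fractional loss per unit time (invented)

      t = 0.0
      while t < t_end:
          inflow = grievance_rate
          outflow = attrition * support
          support += dt * (inflow - outflow)  # Euler integration of the stock
          t += dt

      print(f"supporters at t = {t_end}: {support:.0f}")  # tends toward 5/0.04 = 125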

  4. QCE: An intelligent aid towards self-explanatory computer simulation

    International Nuclear Information System (INIS)

    The paper proposes an approach towards self-explanatory computer simulation. Qualitative Constraint-based Explanation (QCE) is a technique for generating explanations of the results of numerical computer simulation by qualitative reasoning based on the system equations used in the simulation, which include natural laws, mathematical formulae and other domain-specific knowledge. The QCE system does this work by propagating qualitative constraints among the equations. In the paper, the concept of self-explanatory simulation is discussed, and then the details of the methodology used in QCE as well as some examples of its application are presented to show the prospect of self-explanatory computer simulation. (orig./DG)
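
    Constraint propagation of the kind QCE performs can be illustrated with a sign algebra: each quantity carries a qualitative value in {+, -, 0, ?}, and constraints derived from the system equations propagate those values. A minimal sketch (a generic sign algebra, not QCE's actual rule base):

      # Sign algebra over the qualitative values '+', '-', '0', '?' (unknown).

      def q_add(x, y):
          if x == '0': return y
          if y == '0': return x
          return x if x == y else '?'       # opposite signs: ambiguous

      def q_mul(x, y):
          if '0' in (x, y): return '0'
          if '?' in (x, y): return '?'
          return '+' if x == y else '-'

      # Example constraints: dT = q_in - q_out  and  power = m_dot * dT.
      # Suppose the simulation shows q_in rising (+), q_out constant (0),
      # and a positive flow rate m_dot.
      dq_in, dq_out, m_dot = '+', '0', '+'
      q_neg = {'+': '-', '-': '+', '0': '0', '?': '?'}
      d_dT = q_add(dq_in, q_neg[dq_out])    # propagated sign of the change in dT
      d_power = q_mul(m_dot, d_dT)          # propagated sign of the change in power
      print(f"dT trend: {d_dT}, power trend: {d_power}")  # both '+'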

  5. Computer simulation of reconnection in planetary magnetospheres

    International Nuclear Information System (INIS)

    The earth's magnetosphere provides an ideal opportunity to model reconnection in well known geometries that are close enough to the idealized analytic models to make a comparison of the computer models with analytic theory meaningful. In addition more detailed, even three-dimensional, models can be used for a comparison with extended data from in situ observations. The computer studies have basically confirmed the reconnection picture that was based on two-dimensional steady state models and linear analytic theory. The three-dimensional models in particular have also added a lot more information on the reconnection process and the structure of flow, magnetic fields, and currents including many features that are consistent with observations and empirical models of geomagnetic substorms

  6. Advanced information processing system: Inter-computer communication services

    Science.gov (United States)

    Burkhardt, Laura; Masotto, Tom; Sims, J. Terry; Whittredge, Roy; Alger, Linda S.

    1991-01-01

    The purpose is to document the functional requirements and detailed specifications for the Inter-Computer Communications Services (ICCS) of the Advanced Information Processing System (AIPS). An introductory section is provided to outline the overall architecture and functional requirements of the AIPS and to present an overview of the ICCS. An overview of the AIPS architecture as well as a brief description of the AIPS software is given. The guarantees of the ICCS are provided, and the ICCS is described as a seven-layered International Standards Organization (ISO) Model. The ICCS functional requirements, functional design, and detailed specifications as well as each layer of the ICCS are also described. A summary of results and suggestions for future work are presented.

  7. Computer Simulation of a Plasma Vibrator Antenna

    OpenAIRE

    Nikolay N. Bogachev; Irina L. Bogdankevich; Namik G. Gusein-zade; Vladimir P. Tarakanov

    2013-01-01

    The use of new plasma technologies in antenna technology is widely discussed nowadays. The plasma antenna must receive and transmit signals in the frequency range of a transceiver. Many experiments have been carried out with plasma antennas to transmit and receive signals. Due to lack of experimental data and because experiments are difficult to carry out, there is a need for computer (numerical) modeling to calculate the parameters and characteristics of antennas, and to verify the parameter...

  8. Computer simulation of electronic excitations in beryllium

    CERN Document Server

    Popov, A V

    2016-01-01

    An effective method for the quantitative description of the electronic excited states of polyatomic systems is developed using computer technology. The proposed method allows calculating various properties of matter at the atomic level within a uniform scheme. Special attention is paid to the description of the interactions of beryllium atoms with external fields comparable in strength to the fields in atoms, molecules and clusters.

  9. High performance parallel computers for science: New developments at the Fermilab advanced computer program

    International Nuclear Information System (INIS)

    Fermilab's Advanced Computer Program (ACP) has been developing highly cost effective, yet practical, parallel computers for high energy physics since 1984. The ACP's latest developments are proceeding in two directions. A Second Generation ACP Multiprocessor System for experiments will include $3500 RISC processors each with performance over 15 VAX MIPS. To support such high performance, the new system allows parallel I/O, parallel interprocess communication, and parallel host processes. The ACP Multi-Array Processor has been developed for theoretical physics. Each $4000 node is a FORTRAN- or C-programmable pipelined 20 MFlops (peak), 10 MByte single board computer. These are plugged into a 16 port crossbar switch crate which handles both inter- and intra-crate communication. The crates are connected in a hypercube. Site-oriented applications like lattice gauge theory are supported by system software called CANOPY, which makes the hardware virtually transparent to users. A 256 node, 5 GFlop system is under construction. 10 refs., 7 figs

  10. Launch Site Computer Simulation and its Application to Processes

    Science.gov (United States)

    Sham, Michael D.

    1995-01-01

    This paper provides an overview of computer simulation, the Lockheed-developed STS Processing Model, and the application of computer simulation to a wide range of processes. The STS Processing Model is an icon-driven model that uses commercial off-the-shelf software and a Macintosh personal computer. While it usually takes one year to process and launch 8 space shuttles, with the STS Processing Model this process is computer simulated in about 5 minutes. Facilities, orbiters, or ground support equipment can be added or deleted and the impact on launch rate, facility utilization, or other factors measured as desired. This same computer simulation technology can be used to simulate manufacturing, engineering, commercial, or business processes. The technology does not require an 'army' of software engineers to develop and operate, but instead can be used by the layman with only a minimal amount of training. Instead of making changes to a process and realizing the results after the fact, with computer simulation changes can be made and processes perfected before they are implemented.
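
    Underneath an icon-driven process model of this kind is a discrete-event simulation: entities (orbiters) queue for resources (facilities), and an event list advances the clock. A bare-bones sketch using only the Python standard library; the facility names and durations are invented, not Lockheed's.

      import heapq

      # Bare-bones discrete-event model: orbiters flow through two facilities
      # in series; each facility processes one orbiter at a time.
      DURATIONS = {"processing_bay": 90.0, "launch_pad": 30.0}  # days, invented

      def simulate(n_orbiters=8):
          free_at = {fac: 0.0 for fac in DURATIONS}  # when each facility frees up
          events = [(0.0, i, "processing_bay") for i in range(n_orbiters)]
          heapq.heapify(events)
          launches = []
          while events:
              t, orb, fac = heapq.heappop(events)   # next event on the clock
              start = max(t, free_at[fac])          # queue if the facility is busy
              done = start + DURATIONS[fac]
              free_at[fac] = done
              if fac == "processing_bay":
                  heapq.heappush(events, (done, orb, "launch_pad"))
              else:
                  launches.append(done)
          return launches

      print("launch days:", simulate())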

  11. High-Performance Computing in Astrophysical Simulations

    Science.gov (United States)

    Protasov, Viktor; Serenko, Alexander; Nenashev, Vladislav; Kulikov, Igor; Chernykh, Igor

    2016-02-01

    The authors' approach to simulating multiscale astrophysical objects using supercomputers is described in the paper. Astrophysical objects consist of several components of different natures and, as a result, are described by different mathematical models. This leads to the need to formulate a mathematical model and numerical method for each component. A two-phase model (gas + particles) was used for the simulation of protoplanetary disks. The numerical method and details of the parallel implementation for that model are disclosed. A mathematical model for galactic objects, describing the stellar component and dark matter, based on the first moments of the Boltzmann equation, was built. Such an approach allows us to use a unified numerical method to describe the collisionless and gas components of galaxies.

  12. Computer simulation of proton channelling in silicon

    Indian Academy of Sciences (India)

    N K Deepak; K Rajasekharan; K Neelakandan

    2000-06-01

    The channelling of 3 MeV protons in the $\\langle 110\\rangle$ direction of silicon has been simulated using the Vineyard model, taking into account thermally vibrating nuclei and energy loss due to ion-electron interactions. A beam made up of constant-energy particles but with spatial divergence has been simulated for the purpose. The values of the minimum scattering yield and the half width of the channelling dip are shown to be depth sensitive and agree well with the measured values. The dependence of yield on the angle of incidence has been found to give information on all three types of channelling. The critical angles for the three types of channelling and the wavelength of planar oscillations are consistent with previous calculations.

  13. Computer Simulation of Turbulent Reactive Gas Dynamics

    Directory of Open Access Journals (Sweden)

    Bjørn H. Hjertager

    1984-10-01

    A simulation procedure capable of handling transient compressible flows involving combustion is presented. The method uses the velocity components and pressure as primary flow variables. The differential equations governing the flow are discretized by integration over control volumes. The integration is performed by application of up-wind differencing in a staggered grid system. The solution procedure is an extension of the SIMPLE-algorithm accounting for compressibility effects.
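
    Up-wind differencing, the discretization named above, biases the convective term toward the upstream side for stability. A one-dimensional advection sketch shows just that step (the full staggered-grid, SIMPLE-based machinery is beyond a few lines):

      import numpy as np

      # First-order upwind update for 1-D advection u_t + a u_x = 0 with a > 0:
      # the spatial difference uses the upstream (left) neighbour for stability.
      nx, a = 100, 1.0
      dx = 1.0 / nx
      dt = 0.5 * dx / a                  # CFL number 0.5
      u = np.exp(-200 * (np.linspace(0.0, 1.0, nx) - 0.3) ** 2)  # Gaussian pulse

      for _ in range(100):
          u[1:] -= a * dt / dx * (u[1:] - u[:-1])  # upwind difference
          u[0] = 0.0                               # inflow boundary value

      print(f"pulse peak now near x = {np.argmax(u) * dx:.2f}")  # drifted from 0.30 to ~0.80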

  14. An Effective Data Representation and Computation Scheme in Computer Simulation for Neural Networks

    Institute of Scientific and Technical Information of China (English)

    CHEN Houjin; YUAN Baozong

    2004-01-01

    A Biological neural network (BNN) is composed of a vast number of neurons interconnected by synapses. It has the ability to process information and generate a specific pattern of electrical activity. To analyze its interior structure and exterior properties, computational models were combined with experimental data and a computer simulation system was implemented. As a BNN is a complicated nonlinear system and the simulation deals with a great amount of numeric computation, the data representation and the computation scheme are critical to the simulation process. In this paper, Object-oriented data representation (OODR) was designed to have sharable and reusable properties, and a novel hybrid computation scheme is presented. With OODR, data sharing and computation sharing were simultaneously achieved. According to the hybrid computation scheme, an individual computation method was applied to each object based on its model characteristics, and the computation efficiency was obviously increased. Both are now adopted in a BNN simulation system implemented in the platform-independent language JAVA. As the simulation system takes advantage of the data representation and the computation scheme, its performance is greatly improved, and it has found practical application in many countries.
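
    The hybrid computation scheme - an integration method chosen per object according to its model characteristics - can be sketched as follows, here in Python rather than the system's Java; the class layout and the example dynamics are invented for illustration.

      # Each model object carries the integrator suited to its dynamics:
      # cheap Euler for a slow synapse model, RK4 for a fast membrane model.

      def euler(f, y, dt):
          return y + dt * f(y)

      def rk4(f, y, dt):
          k1 = f(y); k2 = f(y + 0.5 * dt * k1)
          k3 = f(y + 0.5 * dt * k2); k4 = f(y + dt * k3)
          return y + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

      class Unit:
          def __init__(self, rhs, y0, stepper):
              self.rhs, self.y, self.stepper = rhs, y0, stepper
          def step(self, dt):
              self.y = self.stepper(self.rhs, self.y, dt)

      units = [Unit(lambda v: -0.1 * v, 1.0, euler),  # slow decay: Euler suffices
               Unit(lambda v: -5.0 * v, 1.0, rk4)]    # fast decay: RK4 for accuracy
      for _ in range(100):                            # 100 steps of dt = 0.01
          for u in units:
              u.step(0.01)
      print([round(u.y, 4) for u in units])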

  15. Reliability of an interactive computer program for advance care planning.

    Science.gov (United States)

    Schubart, Jane R; Levi, Benjamin H; Camacho, Fabian; Whitehead, Megan; Farace, Elana; Green, Michael J

    2012-06-01

    Despite widespread efforts to promote advance directives (ADs), completion rates remain low. Making Your Wishes Known: Planning Your Medical Future (MYWK) is an interactive computer program that guides individuals through the process of advance care planning, explaining health conditions and interventions that commonly involve life or death decisions, helps them articulate their values/goals, and translates users' preferences into a detailed AD document. The purpose of this study was to demonstrate that (in the absence of major life changes) the AD generated by MYWK reliably reflects an individual's values/preferences. English speakers ≥30 years old completed MYWK twice, 4 to 6 weeks apart. Reliability indices were assessed for three AD components: General Wishes; Specific Wishes for treatment; and Quality-of-Life values (QoL). Twenty-four participants completed the study. Both the Specific Wishes and QoL scales had high internal consistency in both time periods (Kuder-Richardson formula 20 [KR-20]=0.83-0.95, and 0.86-0.89). Test-retest reliability was perfect for General Wishes (κ=1), high for QoL (Pearson's correlation coefficient=0.83), but lower for Specific Wishes (Pearson's correlation coefficient=0.57). MYWK generates an AD where General Wishes and QoL (but not Specific Wishes) statements remain consistent over time. PMID:22512830
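
    The KR-20 internal-consistency index reported above is straightforward to compute from a 0/1 item-response matrix. A sketch with invented data (sample variance of the total scores is assumed):

      import numpy as np

      def kr20(responses):
          """Kuder-Richardson formula 20 for a 0/1 item-response matrix
          (rows = respondents, columns = items)."""
          k = responses.shape[1]
          p = responses.mean(axis=0)                     # proportion correct per item
          q = 1.0 - p
          var_total = responses.sum(axis=1).var(ddof=1)  # sample variance of totals
          return (k / (k - 1)) * (1.0 - (p * q).sum() / var_total)

      rng = np.random.default_rng(0)
      ability = rng.random((24, 1))                      # invented respondent ability
      data = (rng.random((24, 10)) < 0.3 + 0.6 * ability).astype(int)
      print(f"KR-20 = {kr20(data):.2f}")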

  17. Recent advances in computational structural reliability analysis methods

    Science.gov (United States)

    Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.

    1993-01-01

    The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given that many different modes of failure are usually possible, achieving this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis, in that the level of reliability is never known, and it usually results in overly conservative designs because of compounding conservatisms. Furthermore, the problem parameters that control the reliability are not identified, nor is their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity in the research and development community was seen recently, much of it directed towards the prediction of failure probabilities for single-mode failures. The focus here is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss), or structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonant vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.
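
    The simplest quantitative illustration of the probabilistic viewpoint described above is crude Monte Carlo sampling of a limit-state function, the brute-force baseline that advanced reliability methods are designed to accelerate. A minimal sketch, with an invented load-resistance limit state:

        import numpy as np

        rng = np.random.default_rng(0)
        n = 200_000
        R = rng.normal(300.0, 30.0, n)     # resistance (hypothetical units)
        S = rng.normal(200.0, 40.0, n)     # load effect
        g = R - S                          # limit state: failure when g < 0
        pf = np.mean(g < 0.0)
        print(f"estimated failure probability: {pf:.4e}")   # ~2.3e-2 here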

  18. A computer simulator for development of engineering system design methodologies

    Science.gov (United States)

    Padula, S. L.; Sobieszczanski-Sobieski, J.

    1987-01-01

    A computer program designed to simulate and improve engineering system design methodology is described. The simulator mimics the qualitative behavior and data couplings occurring among the subsystems of a complex engineering system. It eliminates the engineering analyses in the subsystems by replacing them with judiciously chosen analytical functions. With the cost of analysis eliminated, the simulator is used for experimentation with a large variety of candidate algorithms for multilevel design optimization to choose the best ones for the actual application. Thus, the simulator serves as a development tool for multilevel design optimization strategy. The simulator concept, implementation, and status are described and illustrated with examples.

  19. Computer simulation of martensitic transformations in idealized systems

    International Nuclear Information System (INIS)

    Very little theoretical work exists on the development of the martensitic transformation and the characteristics of the resulting microstructure. This thesis advances the theory of the martensite transformation by constructing a computer model of a martensitic transformation in an idealized system. The model has its source in the general observation that the characteristics of martensitic transformations in solids are largely determined by the need to accommodate the strain associated with the martensitic distortion of the crystal lattice. A review and adaptation of prior theoretical work leads to the development of a theory which allows the straightforward computation of the elastic energy associated with an arbitrary distribution of defects in an elastically anisotropic body, under the assumptions that the body has uniform elastic constants and that anharmonic effects may be neglected. Equations are cast in which the energy is written as a simple sum of binary interactions, in which the defects influence one another according to an elastic potential whose form can be calculated. When the energetic equations take this simple form, the kinematics of the process of forming elastic inclusions are also simple. The martensitic transformation is modeled as one that occurs through the sequential formation of individual martensitic elements, each of which carries the elementary transformation strain. Statistical equations are developed that govern the selection of the transformation path, i.e., the sequence in which elementary martensite particles appear in the model, and specify the kinetics of the transformation. A useful representative path is defined as the minimum energy path. The model is used for the detailed simulation of a martensitic transformation in a pseudo two-dimensional system. Virtually all interesting qualitative aspects of the developing martensitic transformation are shown to be inherently present within it.
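
    The transformation-path idea lends itself to a toy demonstration. The sketch below (Python; the lattice and the anisotropic pair potential are invented, not taken from the thesis) grows a cluster one site at a time, always transforming the site that minimizes the total pairwise interaction energy, which is a greedy analogue of the minimum energy path:

        import numpy as np

        rng = np.random.default_rng(2)
        L = 12
        sites = np.array([(i, j) for i in range(L) for j in range(L)], dtype=float)

        def pair_energy(r):
            # toy anisotropic elastic potential: favourable along x, costly along y
            dx, dy = r
            return (dy**2 - dx**2) / (dx**2 + dy**2) ** 1.5

        transformed = [int(rng.integers(len(sites)))]        # seed particle
        for _ in range(30):                                  # grow the plate
            remaining = [k for k in range(len(sites)) if k not in transformed]
            cost = [sum(pair_energy(sites[k] - sites[t]) for t in transformed)
                    for k in remaining]
            transformed.append(remaining[int(np.argmin(cost))])
        xs = sites[transformed]
        print(f"cluster extent x vs y: {np.ptp(xs[:, 0]):.0f} vs {np.ptp(xs[:, 1]):.0f}")

    Even this caricature reproduces the qualitative point: an anisotropic interaction drives plate-like, oriented growth.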

  20. Preliminary Evaluation of a Computer Simulation of Long Cane Use.

    Science.gov (United States)

    Chubon, Robert A.; Keith, Ashley D.

    1989-01-01

    Developed and evaluated long cane mobility computer simulation as visual rehabilitation training device and research tool in graduate students assigned to basic instruction (BI) (N=10) or enhanced instruction (EI) (N=9). Found higher percentage of EI students completed simulation task. Concluded that students registered positive understanding changes,…

  1. Evaluating changes of writhe in computer simulations of supercoiled DNA

    NARCIS (Netherlands)

    Vries, de R.J.

    2005-01-01

    We compute changes in the writhe of a polygonal space curve when one of the vertices is displaced. The resulting expressions can be used in simulations of supercoiled DNA. For Brownian dynamics simulations, the expressions can be used to eliminate the explicit twisting degree of freedom. For Monte C
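
    The writhe itself can be evaluated by direct quadrature of the Gauss double integral over segment pairs; the incremental expressions of the paper avoid recomputing this sum when a single vertex moves. A crude sketch of the full sum (illustrative only, accurate for short segments):

        import numpy as np

        def writhe(points):
            """Crude quadrature of the Gauss writhe integral for a closed
            polygonal curve given as an (n, 3) array of vertices."""
            seg = np.roll(points, -1, axis=0) - points   # segment vectors dr_i
            mid = points + 0.5 * seg                     # segment midpoints
            w, n = 0.0, len(points)
            for i in range(n):
                for j in range(i + 1, n):
                    r = mid[i] - mid[j]
                    w += np.dot(r, np.cross(seg[i], seg[j])) / np.linalg.norm(r) ** 3
            return w / (2.0 * np.pi)   # each unordered pair stands for two ordered ones

        t = np.linspace(0.0, 2.0 * np.pi, 60, endpoint=False)
        circle = np.stack([np.cos(t), np.sin(t), np.zeros_like(t)], axis=1)
        print(round(writhe(circle), 6))   # a flat circle has writhe 0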

  2. Computer simulation program is adaptable to industrial processes

    Science.gov (United States)

    Schultz, F. E.

    1966-01-01

    The reaction kinetics ablation program (REKAP), developed to simulate ablation of various materials, provides mathematical formulations for computer programs which can simulate certain industrial processes. The programs are based on the use of nonsymmetrical difference equations that are employed to solve complex partial differential equation systems.
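
    As a flavour of what nonsymmetrical difference equations mean in practice, the sketch below (invented material constants; a generic conduction problem, not REKAP itself) marches 1-D transient heat conduction with a one-sided difference enforcing a surface heat flux, the kind of boundary treatment an ablation code needs:

        import numpy as np

        nx, L, alpha = 51, 0.01, 1.0e-5     # nodes, slab thickness (m), diffusivity
        k, q = 1.0, 5.0e4                   # conductivity (W/m K), surface flux (W/m^2)
        dx = L / (nx - 1)
        dt = 0.4 * dx**2 / alpha            # stable explicit time step
        T = np.full(nx, 300.0)              # initial temperature (K)

        for _ in range(2000):
            Tn = T.copy()
            T[1:-1] = Tn[1:-1] + alpha * dt / dx**2 * (Tn[2:] - 2*Tn[1:-1] + Tn[:-2])
            T[0] = T[1] + q * dx / k        # one-sided (nonsymmetric) flux boundary
            T[-1] = 300.0                   # back face held at ambient
        print(f"surface temperature after {2000 * dt:.2f} s: {T[0]:.0f} K")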

  3. A Computer Aided System for Simulating Weld Metal Solidification Crack

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    A computer-aided system for simulating weld solidification cracking has been developed, with which a welding engineer can carry out welding solidification crack simulation on the basis of a commercial finite element analysis software package. Its main functions include calculating the heat generation of the moving arc, generating the mesh, and calculating stress-strain distributions with the element rebirth technique.

  4. Simulation of quantum computation : A deterministic event-based approach

    NARCIS (Netherlands)

    Michielsen, K; De Raedt, K; De Raedt, H

    2005-01-01

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and

  5. Effectiveness of an Endodontic Diagnosis Computer Simulation Program.

    Science.gov (United States)

    Fouad, Ashraf F.; Burleson, Joseph A.

    1997-01-01

    Effectiveness of a computer simulation to teach endodontic diagnosis was assessed using three groups (n=34,32,24) of dental students. All were lectured on diagnosis, pathology, and radiographic interpretation. One group then used the simulation, another had a seminar on the same material, and the third group had no further instruction. Results…

  6. Teaching Objectives of a Simulation Game for Computer Security

    OpenAIRE

    Irvine, Cynthia E.; Thompson, Michael

    2003-01-01

    This paper describes a computer simulation game being developed to teach computer security principles. The player of the game constructs computer networks and makes choices affecting the ability of these networks and the game's virtual users to protect valuable assets from attack by both vandals and well-motivated professionals. The game introduces the player to the need for well-formed information security policies, allowing the player to deploy a variety of means to enforce security policies...

  7. Research Institute for Advanced Computer Science: Annual Report October 1998 through September 1999

    Science.gov (United States)

    Leiner, Barry M.; Gross, Anthony R. (Technical Monitor)

    1999-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center (ARC). It currently operates under a multiple-year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. ARC has been designated NASA's Center of Excellence in Information Technology. In this capacity, ARC is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA ARC and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities. (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply information technology research to

  8. Computer Models Simulate Fine Particle Dispersion

    Science.gov (United States)

    2010-01-01

    Through a NASA Seed Fund partnership with DEM Solutions Inc., of Lebanon, New Hampshire, scientists at Kennedy Space Center refined existing software to study the electrostatic phenomena of granular and bulk materials as they apply to planetary surfaces. The software, EDEM, allows users to import particles and obtain accurate representations of their shapes for modeling purposes, such as simulating bulk solids behavior, and was enhanced to be able to more accurately model fine, abrasive, cohesive particles. These new EDEM capabilities can be applied in many industries unrelated to space exploration and have been adopted by several prominent U.S. companies, including John Deere, Pfizer, and Procter & Gamble.

  9. Computer simulator for training operators of thermal cameras

    Science.gov (United States)

    Chrzanowski, Krzysztof; Krupski, Marcin

    2004-08-01

    A PC-based image generator, SIMTERM, developed for training operators of non-airborne military thermal imaging systems is presented in this paper. SIMTERM allows its users to generate images closely resembling the thermal images of many military-type targets in different scenarios, as obtained with the simulated thermal camera. High fidelity of simulation was achieved due to the use of measurable parameters of the thermal camera as input data. Two modified versions of this computer simulator, developed for designers and test teams, are presented as well.

  10. Flexing Computational Muscle: Modeling and Simulation of Musculotendon Dynamics

    OpenAIRE

    Millard, Matthew; Uchida, Thomas; Seth, Ajay; Delp, Scott L.

    2013-01-01

    Muscle-driven simulations of human and animal motion are widely used to complement physical experiments for studying movement dynamics. Musculotendon models are an essential component of muscle-driven simulations, yet neither the computational speed nor the biological accuracy of the simulated forces has been adequately evaluated. Here we compare the speed and accuracy of three musculotendon models: two with an elastic tendon (an equilibrium model and a damped equilibrium model) and one with ...

  11. Digital control computer upgrade at the Cernavoda NPP simulator

    International Nuclear Information System (INIS)

    The plant process computer equips some nuclear power plants, like CANDU-600, with centralized control performed by an assembly of two computers known as Digital Control Computers (DCC), working in parallel to drive the plant safely at steady state and during normal maneuvers, as well as during abnormal transients, when the plant is automatically steered to a safe state. The centralized control comprises both hardware and software whose presence is obligatory in the full scope simulator, and whose configuration is subject to change, following specific requirements, during the life of the plant and the simulator; these aspects are covered by this subsection.

  12. Advances in computer technology: impact on the practice of medicine.

    Science.gov (United States)

    Groth-Vasselli, B; Singh, K; Farnsworth, P N

    1995-01-01

    Advances in computer technology provide a wide range of applications which are revolutionizing the practice of medicine. The development of new software for the office creates a web of communication among physicians, staff members, health care facilities and associated agencies. This provides the physician with the prospect of a paperless office. At the other end of the spectrum, the development of 3D workstations and software based on computational chemistry permits visualization of protein molecules involved in disease. Computer-assisted molecular modeling has been used to construct working 3D models of lens alpha-crystallin. The 3D structure of alpha-crystallin is basic to our understanding of the molecular mechanisms involved in lens fiber cell maturation, stabilization of the inner nuclear region, the maintenance of lens transparency and cataractogenesis. The major component of the high-molecular-weight aggregates that occur during cataractogenesis is alpha-crystallin subunits. Subunits of alpha-crystallin occur in other tissues of the body. In the central nervous system, accumulation of these subunits in the form of dense inclusion bodies occurs in pathological conditions such as Alzheimer's disease, Huntington's disease, multiple sclerosis and toxoplasmosis (Iwaki, Wisniewski et al., 1992), as well as neoplasms of astrocyte origin (Iwaki, Iwaki, et al., 1991). Cardiac ischemia is also associated with increased alpha B-crystallin synthesis (Chiesi, Longoni et al., 1990). On a more global level, the molecular structure of alpha-crystallin may provide information pertaining to the function of small heat shock proteins, hsp, in maintaining cell stability under the stress of disease. PMID:8721907

  13. Computer Simulation of an Armoured Battalion Swarming

    Directory of Open Access Journals (Sweden)

    Radomir Jankovic

    2011-01-01

    Full Text Available Swarming is a tactical approach considered in the conceptualisation of modern armies' combat activities. More intensive research on military applications of swarming began after 2000, mostly in the areas of unmanned air, underwater, and ground vehicles, as well as in air force, navy and some special ground force units. In spite of the relative unsuitability of contemporary main battle tanks to act as swarmers, some initial results of research on swarming by armoured mobile platforms are presented. The motivation for the research is that adapting contemporary tanks for swarming could prolong their working life until the new generation arrives, and could be the best investment in the modernisation of medium-sized and smaller countries' armies. The paper gives a brief survey of research to date, a description of the simulation model, and the results of experiments simulating swarming by a battalion-sized group of armed mobile platforms defending territory against a superior adversary unit. Defence Science Journal, 2011, 61(1), pp. 36-43, DOI: http://dx.doi.org/10.14429/dsj.61.49

  14. Computer based training simulator for Hunterston Nuclear Power Station

    International Nuclear Information System (INIS)

    For reasons which are stated, the Hunterston-B nuclear power station automatic control system includes a manual over-ride facility. It is therefore essential for the station engineers to be trained to recognise and control all feasible modes of plant and logic malfunction. A training simulator has been built which consists of a replica of the shutdown monitoring panel in the Central Control Room and is controlled by a mini-computer. This paper highlights the computer aspects of the simulator and relevant derived experience, under the following headings: engineering background; shutdown sequence equipment; simulator equipment; features; software; testing; maintenance. (U.K.)

  15. The Osseus platform: a prototype for advanced web-based distributed simulation

    Science.gov (United States)

    Franceschini, Derrick; Riecken, Mark

    2016-05-01

    Recent technological advances in web-based distributed computing and database technology have made possible a deeper and more transparent integration of some modeling and simulation applications. Despite these advances towards true integration of capabilities, disparate systems, architectures, and protocols will remain in the inventory for some time to come. These disparities present interoperability challenges for distributed modeling and simulation whether the application is training, experimentation, or analysis. Traditional approaches call for building gateways to bridge between disparate protocols and retaining interoperability specialists. Challenges in reconciling data models also persist. These challenges and their traditional mitigation approaches directly contribute to higher costs, schedule delays, and frustration for the end users. Osseus is a prototype software platform originally funded as a research project by the Defense Modeling & Simulation Coordination Office (DMSCO) to examine interoperability alternatives using modern, web-based technology and taking inspiration from the commercial sector. Osseus provides tools and services for nonexpert users to connect simulations, targeting the time and skillset needed to successfully connect disparate systems. The Osseus platform presents a web services interface to allow simulation applications to exchange data using modern techniques efficiently over Local or Wide Area Networks. Further, it provides Service Oriented Architecture capabilities such that finer granularity components such as individual models can contribute to simulation with minimal effort.

  16. Computer Simulation of a Plasma Vibrator Antenna

    Directory of Open Access Journals (Sweden)

    Nikolay N. Bogachev

    2013-01-01

    Full Text Available The use of new plasma technologies in antenna technology is widely discussed nowadays. A plasma antenna must receive and transmit signals in the frequency range of a transceiver. Many experiments have been carried out with plasma antennas to transmit and receive signals. Due to the lack of experimental data, and because the experiments are difficult to carry out, there is a need for computer (numerical) modeling to calculate the parameters and characteristics of antennas, and to verify the parameters for future studies. Our study has modeled plasma vibrator (dipole) antennas (PDAs) and metal vibrator (dipole) antennas (MDAs), and has calculated the characteristics of PDAs and MDAs with the fully electromagnetic code KARAT. The correctness of the modeling has been tested by calculating a metal antenna using the MMANA program.

  17. Computer Simulations in Mechanics at the Secondary School

    Directory of Open Access Journals (Sweden)

    Stanislav HOLEC

    2004-10-01

    Full Text Available Computer simulations seem to be one of the most effective ways to use computers in physics education. They encourage students to carry out the processes used in physics research: to question, predict, hypothesise, observe, interpret results etc. Their effective use requires the availability of appropriate teaching resources fitting secondary school curricula. This paper presents a set of computer simulations that cover the curriculum area of mechanics and are designed to fit directly the curricula and textbooks used at Slovak grammar schools. All simulations are accompanied by brief instructions for teachers, including suggestions for learning activities and problem tasks for students. Some of them are designed as virtual laboratories. The developed simulations were tested with a group of secondary school students and evaluated also by groups of future and practising physics teachers. The paper presents and discusses findings and conclusions from both runs of the testing.

  18. Computational methods for coupling microstructural and micromechanical materials response simulations

    Energy Technology Data Exchange (ETDEWEB)

    HOLM, ELIZABETH A.; BATTAILE, CORBETT C.; BUCHHEIT, THOMAS E.; FANG, HUEI ELIOT; RINTOUL, MARK DANIEL; VEDULA, VENKATA R.; GLASS, S. JILL; KNOROVSKY, GERALD A.; NEILSEN, MICHAEL K.; WELLMAN, GERALD W.; SULSKY, DEBORAH; SHEN, YU-LIN; SCHREYER, H. BUCK

    2000-04-01

    Computational materials simulations have traditionally focused on individual phenomena: grain growth, crack propagation, plastic flow, etc. However, real materials behavior results from a complex interplay between phenomena. In this project, the authors explored methods for coupling mesoscale simulations of microstructural evolution and micromechanical response. In one case, massively parallel (MP) simulations for grain evolution and microcracking in alumina stronglink materials were dynamically coupled. In the other, codes for domain coarsening and plastic deformation in CuSi braze alloys were iteratively linked. This program provided the first comparison of two promising ways to integrate mesoscale computer codes. Coupled microstructural/micromechanical codes were applied to experimentally observed microstructures for the first time. In addition to the coupled codes, this project developed a suite of new computational capabilities (PARGRAIN, GLAD, OOF, MPM, polycrystal plasticity, front tracking). The problem of plasticity length scale in continuum calculations was recognized and a solution strategy was developed. The simulations were experimentally validated on stockpile materials.

  19. Advanced Dynamically Adaptive Algorithms for Stochastic Simulations on Extreme Scales

    Energy Technology Data Exchange (ETDEWEB)

    Xiu, Dongbin [Purdue Univ., West Lafayette, IN (United States)

    2016-06-21

    The focus of the project is the development of mathematical methods and high-performance computational tools for stochastic simulations, with a particular emphasis on computations at extreme scales. The core of the project revolves around the design of highly efficient and scalable numerical algorithms that can adaptively and accurately, in high dimensional spaces, resolve stochastic problems with limited smoothness, even containing discontinuities.

  20. Computer simulations of adsorbed liquid crystal films

    Science.gov (United States)

    Wall, Greg D.; Cleaver, Douglas J.

    2003-01-01

    The structures adopted by adsorbed thin films of Gay-Berne particles in the presence of a coexisting vapour phase are investigated by molecular dynamics simulation. The films are adsorbed at a flat substrate which favours planar anchoring, whereas the nematic-vapour interface favours normal alignment. On cooling, a system with a high molecule-substrate interaction strength exhibits substrate-induced planar orientational ordering and considerable stratification is observed in the density profiles. In contrast, a system with weak molecule-substrate coupling adopts a director orientation orthogonal to the substrate plane, owing to the increased influence of the nematic-vapour interface. There are significant differences between the structures adopted at the two interfaces, in contrast with the predictions of density functional treatments of such systems.

  1. Teaching Computer Organization and Architecture Using Simulation and FPGA Applications

    Directory of Open Access Journals (Sweden)

    D. K.M. Al-Aubidy

    2007-01-01

    Full Text Available This paper presents the design concepts and realization of incorporating micro-operation simulation and FPGA implementation into a teaching tool for computer organization and architecture. This teaching tool helps computer engineering and computer science students gain practical familiarity with computer organization and architecture through the development of their own instruction set, computer programming and interfacing experiments. A two-pass assembler has been designed and implemented to write assembly programs in this teaching tool. In addition to the micro-operation simulation, the complete configuration can be run on a Xilinx Spartan-3 FPGA board. Such an implementation offers good code density, easy customization, easily developed software, small area, and high performance at low cost.
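
    The two-pass assembly mentioned above is easy to illustrate. In the sketch below (Python; the three-instruction ISA and its encoding are invented for illustration, not the tool's actual instruction set), pass 1 assigns addresses to labels and pass 2 emits machine words with the labels resolved:

        source = ["start: LOAD x", "       JMP start", "x:     DATA 7"]
        OPCODES = {"LOAD": 0x1, "JMP": 0x2, "DATA": 0x0}

        # pass 1: build the symbol table
        symbols, addr = {}, 0
        for line in source:
            head = line.split()[0]
            if head.endswith(":"):
                symbols[head[:-1]] = addr
            addr += 1                        # one word per statement

        # pass 2: generate machine words with labels resolved
        program = []
        for line in source:
            tokens = line.split()
            if tokens[0].endswith(":"):
                tokens = tokens[1:]
            op, operand = tokens
            value = symbols[operand] if operand in symbols else int(operand)
            program.append((OPCODES[op] << 12) | value)
        print([hex(w) for w in program])     # ['0x1002', '0x2000', '0x7']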

  2. A computer code to simulate X-ray imaging techniques

    International Nuclear Information System (INIS)

    A computer code was developed to simulate the operation of radiographic, radioscopic or tomographic devices. The simulation is based on ray-tracing techniques and on the X-ray attenuation law. The use of computer-aided drawing (CAD) models enables simulations to be carried out with complex three-dimensional (3D) objects and the geometry of every component of the imaging chain, from the source to the detector, can be defined. Geometric unsharpness, for example, can be easily taken into account, even in complex configurations. Automatic translations or rotations of the object can be performed to simulate radioscopic or tomographic image acquisition. Simulations can be carried out with monochromatic or polychromatic beam spectra. This feature enables, for example, the beam hardening phenomenon to be dealt with or dual energy imaging techniques to be studied. The simulation principle is completely deterministic and consequently the computed images present no photon noise. Nevertheless, the variance of the signal associated with each pixel of the detector can be determined, which enables contrast-to-noise ratio (CNR) maps to be computed, in order to predict quantitatively the detectability of defects in the inspected object. The CNR is a relevant indicator for optimizing the experimental parameters. This paper provides several examples of simulated images that illustrate some of the rich possibilities offered by our software. Depending on the simulation type, the computation time order of magnitude can vary from 0.1 s (simple radiographic projection) up to several hours (3D tomography) on a PC, with a 400 MHz microprocessor. Our simulation tool proves to be useful in developing new specific applications, in choosing the most suitable components when designing a new testing chain, and in saving time by reducing the number of experimental tests

  3. A computer code to simulate X-ray imaging techniques

    Energy Technology Data Exchange (ETDEWEB)

    Duvauchelle, Philippe E-mail: philippe.duvauchelle@insa-lyon.fr; Freud, Nicolas; Kaftandjian, Valerie; Babot, Daniel

    2000-09-01

    A computer code was developed to simulate the operation of radiographic, radioscopic or tomographic devices. The simulation is based on ray-tracing techniques and on the X-ray attenuation law. The use of computer-aided drawing (CAD) models enables simulations to be carried out with complex three-dimensional (3D) objects and the geometry of every component of the imaging chain, from the source to the detector, can be defined. Geometric unsharpness, for example, can be easily taken into account, even in complex configurations. Automatic translations or rotations of the object can be performed to simulate radioscopic or tomographic image acquisition. Simulations can be carried out with monochromatic or polychromatic beam spectra. This feature enables, for example, the beam hardening phenomenon to be dealt with or dual energy imaging techniques to be studied. The simulation principle is completely deterministic and consequently the computed images present no photon noise. Nevertheless, the variance of the signal associated with each pixel of the detector can be determined, which enables contrast-to-noise ratio (CNR) maps to be computed, in order to predict quantitatively the detectability of defects in the inspected object. The CNR is a relevant indicator for optimizing the experimental parameters. This paper provides several examples of simulated images that illustrate some of the rich possibilities offered by our software. Depending on the simulation type, the computation time order of magnitude can vary from 0.1 s (simple radiographic projection) up to several hours (3D tomography) on a PC, with a 400 MHz microprocessor. Our simulation tool proves to be useful in developing new specific applications, in choosing the most suitable components when designing a new testing chain, and in saving time by reducing the number of experimental tests.
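
    At the heart of such a ray tracer is the X-ray attenuation law the abstract mentions: each ray's transmitted intensity is I = I0 * exp(-sum(mu_i * l_i)) over the materials it crosses. The sketch below (invented coefficients and geometry, not the authors' code) attenuates one ray and then estimates a CNR for a small void, assuming Poisson photon statistics for the noise term, in the spirit of the variance maps the code produces:

        import numpy as np

        mu = np.array([0.5, 4.2, 0.5])        # attenuation coefficients (1/cm)
        path = np.array([2.0, 0.4, 1.5])      # path length in each material (cm)
        I0 = 1.0e6                            # incident photon count (monochromatic)

        I = I0 * np.exp(-(mu * path).sum())   # Beer-Lambert law along the ray
        I_void = I0 * np.exp(-(mu * path).sum() + mu[1] * 0.05)  # 0.05 cm void
        cnr = abs(I_void - I) / np.sqrt(I_void)                  # Poisson noise model
        print(f"I = {I:.0f}, I_void = {I_void:.0f}, CNR = {cnr:.1f}")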

  4. An introduction to computer simulation methods: applications to physical systems

    CERN Document Server

    Gould, Harvey; Christian, Wolfgang

    2007-01-01

    Now in its third edition, this book teaches physical concepts using computer simulations. The text incorporates object-oriented programming techniques and encourages readers to develop good programming habits in the context of doing physics. Designed for readers at all levels, An Introduction to Computer Simulation Methods uses Java, currently the most popular programming language. Introduction, Tools for Doing Simulations, Simulating Particle Motion, Oscillatory Systems, Few-Body Problems: The Motion of the Planets, The Chaotic Motion of Dynamical Systems, Random Processes, The Dynamics of Many Particle Systems, Normal Modes and Waves, Electrodynamics, Numerical and Monte Carlo Methods, Percolation, Fractals and Kinetic Growth Models, Complex Systems, Monte Carlo Simulations of Thermal Systems, Quantum Systems, Visualization and Rigid Body Dynamics, Seeing in Special and General Relativity, Epilogue: The Unity of Physics. For all readers interested in developing programming habits in the context of doing phy...

  5. Further development of the Dynamic Control Assemblies Worth Measurement Method for Advanced Reactivity Computers

    International Nuclear Information System (INIS)

    The dynamic control assemblies worth measurement technique is a quick method for validation of predicted control assemblies worth. The dynamic control assemblies worth measurement utilizes space-time corrections, calculated by the DYN3D computer code, for the measured out-of-core ionization chamber readings. The space-time correction arising from the prompt neutron density redistribution in the measured ionization chamber reading can be applied directly in the advanced reactivity computer. The second correction, concerning the difference in the spatial distribution of delayed neutrons, can be calculated by simulating the measurement procedure with the dynamic version of the DYN3D code. In the paper, some results of dynamic control assemblies worth measurements applied at NPP Mochovce are presented (Authors)

  6. Simulated herbivory advances autumn phenology in Acer rubrum

    Science.gov (United States)

    Forkner, Rebecca E.

    2014-05-01

    To determine the degree to which herbivory contributes to phenotypic variation in autumn phenology for deciduous trees, red maple (Acer rubrum) branches were subjected to low and high levels of simulated herbivory and surveyed at the end of the season to assess abscission and degree of autumn coloration. Overall, branches with simulated herbivory abscised ˜7 % more leaves at each autumn survey date than did control branches within trees. While branches subjected to high levels of damage showed advanced phenology, abscission rates did not differ from those of undamaged branches within trees because heavy damage induced earlier leaf loss on adjacent branch nodes in this treatment. Damaged branches had greater proportions of leaf area colored than undamaged branches within trees, having twice the amount of leaf area colored at the onset of autumn and having ˜16 % greater leaf area colored in late October when nearly all leaves were colored. When senescence was scored as the percent of all leaves abscised and/or colored, branches in both treatments reached peak senescence earlier than did control branches within trees: dates of 50 % senescence occurred 2.5 days earlier for low herbivory branches and 9.7 days earlier for branches with high levels of simulated damage. These advanced rates are of the same time length as reported delays in autumn senescence and advances in spring onset due to climate warming. Thus, results suggest that should insect damage increase as a consequence of climate change, it may offset a lengthening of leaf life spans in some tree species.

  7. Transition of Monju simulator training owing to Monju accident and upgrade of Monju advanced reactor simulator (MARS)

    International Nuclear Information System (INIS)

    The Monju advanced reactor simulator (MARS) has been operated for the training of Monju operators and for verification of the appropriateness of the Monju operating manual since 1991, for over 11 years. This report covers the transition of the Monju training system and the modification of MARS following the Monju accident, as operating experience of MARS from 1994 to 2001. The principal points are as follows: (1) Improvement of the Monju training system following the Monju accident: 1) reinforcement of sodium handling and sodium fire-fighting exercises; 2) improvement of the training system and revision of the training frequency; 3) introduction of an evaluation and analysis system for training results; 4) provision of a training guideline; 5) strengthening of fundamental education by introducing CAI (Computer Assisted Instruction System). (2) Upgrade of MARS for the restart of Monju: 1) reflection of the real plant data obtained from the Monju performance test; 2) addition of malfunction items; 3) development of simulation software and addition of a simulation panel for the reinforced sodium leakage training; 4) improvement of simulation capability and remodeling of the calculation models through renewal of the computer system; 5) planned future upgrades. (author)

  8. Associative Memory computing power and its simulation.

    CERN Document Server

    Volpi, G; The ATLAS collaboration

    2014-01-01

    The associative memory (AM) chip is an ASIC device specifically designed to perform "pattern matching" at very high speed and with parallel access to memory locations. The most extensive use of such a device will be the ATLAS Fast Tracker (FTK) processor, where more than 8000 chips will be installed in 128 VME boards specifically designed for high throughput in order to exploit the chip's features. Each AM chip will store a database of about 130000 pre-calculated patterns, allowing FTK to use about 1 billion patterns for the whole system, with any data inquiry broadcast to all memory elements simultaneously within the same clock cycle (10 ns); data retrieval time is thus independent of the database size. Speed and size of the system are crucial for real-time high energy physics applications, such as the ATLAS FTK processor. Using 80 million channels of the ATLAS tracker, FTK finds tracks within 100 μs. The simulation of such a parallelized system is an extremely complex task when executed in comm...
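
    Functionally, the AM lookup behaves like the sketch below (Python; the three-layer patterns are invented, and real chips store more layers): the inquiry is compared against every stored pattern "simultaneously", optionally accepting majority matches, and the result does not depend on how many patterns are stored:

        patterns = {                          # hypothetical per-layer hit patterns
            0: (12, 47, 88),
            1: (12, 47, 91),
            2: (30, 55, 88),
        }

        def broadcast(hits, min_match=3):
            """IDs of stored patterns matching the inquiry on >= min_match layers;
            in hardware every comparison happens within one clock cycle."""
            return [pid for pid, pat in patterns.items()
                    if sum(h == p for h, p in zip(hits, pat)) >= min_match]

        print(broadcast((12, 47, 88)))        # exact match    -> [0]
        print(broadcast((12, 47, 99), 2))     # majority match -> [0, 1]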

  9. Large Scale Computer Simulation of Erythrocyte Membranes

    Science.gov (United States)

    Harvey, Cameron; Revalee, Joel; Laradji, Mohamed

    2007-11-01

    The cell membrane is crucial to the life of the cell. Apart from partitioning the inner and outer environments of the cell, it also acts as a support for complex and specialized molecular machinery, important for both the mechanical integrity of the cell and its multitude of physiological functions. Due to its relative simplicity, the red blood cell has been a favorite experimental prototype for investigations of the structural and functional properties of the cell membrane. The erythrocyte membrane is a composite quasi-two-dimensional structure composed essentially of a self-assembled fluid lipid bilayer and a polymerized protein meshwork, referred to as the cytoskeleton or membrane skeleton. In the case of the erythrocyte, the polymer meshwork is mainly composed of spectrin, anchored to the bilayer through specialized proteins. Using a coarse-grained model of self-assembled lipid membranes, recently developed by us, with implicit solvent and soft-core potentials, we simulated large-scale red-blood-cell bilayers with dimensions ˜ 10-1 μm^2, with explicit cytoskeleton. Our aim is to investigate the renormalization of the elastic properties of the bilayer due to the underlying spectrin meshwork.

  10. Phase 1b MARAUDER computer simulations

    International Nuclear Information System (INIS)

    The MARAUDER (magnetically accelerated rings to achieve ultrahigh directed energy and radiation) program at the Air Force Weapons Laboratory is a study of magnetically-confined plasma toroids that will convert stored electrostatic energy into plasma kinetic energy. This energy may then be converted into other forms, such as radiation. Using the 9.4 MJ SHIVA STAR fast capacitor bank to accelerate the toroids, the authors expect to achieve kinetic energies greater than 1 MJ. The first phase of the experimentation only forms the toroids and does not attempt to compress and accelerate them. This paper presents MHD calculations that have been performed with the 2-1/2 dimensional code MACH2 in support of this phase. The simulations demonstrate the formation of the torus, including reconnection of the poloidal magnetic field components. Number densities in the toroids are on the order of 10^16 per cm^3, and the magnetic induction is on the order of 1 tesla. A series of calculations shows that only a limited range of discharge energy will produce toroids. Too little energy will not push the plasma through the initial, injected poloidal field; too much will not allow a good reconnection. These results are compared with experimental measurements.

  11. An advanced leakage scheme for neutrino treatment in astrophysical simulations

    CERN Document Server

    Perego, Albino; Käppeli, Roger

    2015-01-01

    We present an Advanced Spectral Leakage (ASL) scheme to model neutrinos in the context of core-collapse supernovae and compact binary mergers. Based on previous gray leakage schemes, the ASL scheme computes the neutrino cooling rates by interpolating local production and diffusion rates (relevant in the optically thin and thick regimes, respectively), separately for discretized values of the neutrino energy. Trapped neutrino components are also modeled, based on equilibrium and timescale arguments. The better accuracy achieved by the spectral treatment allows a more reliable computation of neutrino heating rates in optically thin conditions. The scheme has been calibrated and tested against Boltzmann transport in the context of Newtonian spherically symmetric models of core-collapse supernovae. ASL shows very good qualitative and partial quantitative agreement for key quantities from collapse to a few hundred milliseconds after core bounce. We have proved the adaptability and flexibility of our ASL schem...
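
    The interpolation between regimes can be caricatured in a few lines. In the sketch below (notation and numbers invented here, not taken from the paper), each energy bin combines a production rate, which dominates where the matter is optically thin, with a diffusion rate, which dominates where it is optically thick; a harmonic combination recovers the smaller of the two, i.e. the bottleneck, in both limits:

        import numpy as np

        energies = np.array([5.0, 15.0, 45.0])        # MeV bins (illustrative)
        r_prod = np.array([1.0e3, 5.0e4, 2.0e6])      # production rates (1/s)
        r_diff = np.array([1.0e5, 1.0e4, 1.0e2])      # diffusion rates (1/s)
        r_eff = r_prod * r_diff / (r_prod + r_diff)   # smooth two-regime interpolation
        print(r_eff)    # ~min(r_prod, r_diff) in both limiting regimes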

  12. Computer-intensive simulation of solid-state NMR experiments using SIMPSON

    Science.gov (United States)

    Tošner, Zdeněk; Andersen, Rasmus; Stevensson, Baltzar; Edén, Mattias; Nielsen, Niels Chr.; Vosegaard, Thomas

    2014-09-01

    Conducting large-scale solid-state NMR simulations requires fast computer software, potentially in combination with efficient computational resources, to complete within a reasonable time frame. Such simulations may involve large spin systems, multiple-parameter fitting of experimental spectra, or multiple-pulse experiment design using parameter scans, non-linear optimization, or optimal control procedures. To efficiently accommodate such simulations, we here present an improved version of the widely distributed open-source SIMPSON NMR simulation software package, adapted to contemporary high performance hardware setups. The software is optimized for fast performance on standard stand-alone computers, multi-core processors, and large clusters of identical nodes. We describe the novel features for fast computation, including internal matrix manipulations, propagator setups and acquisition strategies. For efficient calculation of powder averages, we implemented the interpolation method of Alderman, Solum, and Grant, as well as the recently introduced fast Wigner transform interpolation technique. The potential of the optimal control toolbox is greatly enhanced by higher precision gradients in combination with the efficient optimization algorithm known as limited-memory Broyden-Fletcher-Goldfarb-Shanno. In addition, advanced parallelization can be used in all types of calculations, providing significant time reductions. SIMPSON thus reflects current knowledge in the field of numerical simulations of solid-state NMR experiments. The efficiency and novel features are demonstrated on representative simulations.
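
    The bookkeeping at the core of such simulations is the accumulation of propagators over piecewise-constant Hamiltonians. A minimal single-spin sketch (Python with NumPy/SciPy; hbar = 1 and all values invented, far simpler than SIMPSON's spin systems) nutates z magnetization under an rf field and detects it step by step:

        import numpy as np
        from scipy.linalg import expm

        Ix = 0.5 * np.array([[0, 1], [1, 0]], dtype=complex)
        Iy = 0.5 * np.array([[0, -1j], [1j, 0]], dtype=complex)
        Iz = 0.5 * np.array([[1, 0], [0, -1]], dtype=complex)

        dt, steps = 1.0e-6, 200                   # 1 us per step
        H = 2.0 * np.pi * 25.0e3 * Ix             # 25 kHz rf field along x
        U = expm(-1j * H * dt)                    # constant here; per step under MAS

        rho, signal = Iz.copy(), []
        for _ in range(steps):
            rho = U @ rho @ U.conj().T            # rho(t+dt) = U rho U^dagger
            signal.append(np.trace(Iy @ rho).real)
        print(f"max |<Iy>| ~ {max(abs(s) for s in signal):.3f}")   # ~0.5 at full nutation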

  13. Arrhythmic risk biomarkers for the assessment of drug cardiotoxicity: from experiments to computer simulations

    OpenAIRE

    Corrias, A.; Jie, X.; Romero, L.; Bishop, M. J.; Bernabeu, M.; Pueyo, E.; Rodriguez, B.

    2010-01-01

    In this paper, we illustrate how advanced computational modelling and simulation can be used to investigate drug-induced effects on cardiac electrophysiology and on specific biomarkers of pro-arrhythmic risk. To do so, we first perform a thorough literature review of proposed arrhythmic risk biomarkers from the ionic to the electrocardiogram levels. The review highlights the variety of proposed biomarkers, the complexity of the mechanisms of drug-induced pro-arrhythmia and the existence of si...

  14. Computer simulations in teaching physics: Development and implementation of a hypermedia system for high school teachers

    Science.gov (United States)

    da Silva, A. M. R.; de Macêdo, J. A.

    2016-06-01

    Given the technological advances available to secondary education and the difficulty students have in learning physics, this article describes the process of elaboration and implementation of a hypermedia system for high school teachers, involving computer simulations for teaching basic concepts of electromagnetism, built with free tools. With the completion and publication of the project, students and teachers will have a new possibility for interacting with technology in the classroom and in labs.

  15. Advances in numerical simulation of nonlinear water waves

    CERN Document Server

    Ma, Qingwei

    2014-01-01

    Most of the Earth's surface is covered by water. Our everyday lives and activities are affected by water waves in oceans, such as the tsunami that occurred in the Indian Ocean on 26 December 2004. This indicates how important it is for us to fully understand water waves, in particular the very large ones. One way to do so is to perform numerical simulation based on the nonlinear theory. Considerable research advances have been made in this area over the past decade by developing various numerical methods and applying them to emerging problems; however, until now there has been no comprehensive

  16. A computer-based simulator of the atmospheric turbulence

    Science.gov (United States)

    Konyaev, Petr A.

    2015-11-01

    Computer software for modeling atmospheric turbulence is developed on the basis of a time-varying random medium simulation algorithm and a split-step Fourier transform method for solving the wave propagation equation. A judicious choice of the simulator parameters, such as the velocity of the evolution and motion of the medium and the turbulence spectrum and scales, enables different effects of a random medium on the optical wavefront to be simulated. The implementation of the simulation software is shown to be simple and efficient due to parallel programming functions from the Intel® Parallel Studio MKL libraries.
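
    The split-step idea is compact enough to sketch. Below (Python with NumPy; the grid, wavelength and screen strength are invented, and uncorrelated white-noise screens stand in for the properly coloured turbulence spectrum a real simulator would use), each step propagates the field through vacuum in the spectral domain and then applies a random phase screen representing one slab of atmosphere:

        import numpy as np

        n, width, wavelength, dz = 256, 0.1, 1.55e-6, 50.0   # grid, m, m, m
        dx = width / n
        fx = np.fft.fftfreq(n, d=dx)
        FX, FY = np.meshgrid(fx, fx)
        Hfree = np.exp(-1j * np.pi * wavelength * dz * (FX**2 + FY**2))  # paraxial

        x = (np.arange(n) - n / 2) * dx
        X, Y = np.meshgrid(x, x)
        field = np.exp(-(X**2 + Y**2) / 0.01**2).astype(complex)  # Gaussian beam

        rng = np.random.default_rng(1)
        for _ in range(10):                                  # ten 50 m slabs
            field = np.fft.ifft2(np.fft.fft2(field) * Hfree)  # vacuum half of step
            screen = 0.3 * rng.standard_normal((n, n))        # toy phase screen (rad)
            field *= np.exp(1j * screen)                      # medium half of step
        print(f"on-axis intensity: {abs(field[n // 2, n // 2])**2:.3f}")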

  17. Computer simulation tests of optimized neutron powder diffractometer configurations

    Science.gov (United States)

    Cussen, L. D.; Lieutenant, K.

    2016-06-01

    Recent work has developed a new mathematical approach to optimally choose beam elements for constant wavelength neutron powder diffractometers. This article compares Monte Carlo computer simulations of existing instruments with simulations of instruments whose configurations were chosen with the new approach. The simulations show that large performance improvements over current best practice are possible. The tests here are limited to instruments optimized for samples with a cubic structure, which differs from the optimization for triclinic structure samples. A novel primary spectrometer design is discussed, and simulation tests show that it performs as expected and allows a single instrument to operate flexibly over a wide range of measurement resolution.

  18. Advanced Simulations of Optical Transition and Diffraction Radiation

    CERN Document Server

    AUTHOR|(CDS)2078350; Bobb, Lorraine Marie; Bolzon, B; Bravin, Enrico; Karataev, Pavel; Kruchinin, Konstantin; Lefevre, Thibaut; Mazzoni, Stefano

    2015-01-01

    Charged particle beam diagnostics is a key task in modern and future accelerator installations. The diagnostic tools are practically the “eyes” of the operators. The precision and resolution of the diagnostic equipment are crucial to define the performance of the accelerator. Transition and diffraction radiation (TR and DR) are widely used for electron beam parameter monitoring. However, the precision and resolution of those devices are determined by how well the production, transport and detection of these radiation types are understood. This paper reports on simulations of TR and DR spatial-spectral characteristics using the physical optics propagation (POP) mode of the Zemax advanced optics simulation software. A good consistency with theory is demonstrated. Also, realistic optical system alignment issues are discussed.

  19. The Simulation and Analysis of the Closed Die Hot Forging Process by A Computer Simulation Method

    Directory of Open Access Journals (Sweden)

    Dipakkumar Gohil

    2012-06-01

    Full Text Available The objective of this research work is to study the variation of various parameters such as stress, strain, temperature, force, etc. during the closed die hot forging process. A computer simulation modeling approach has been adopted to transform the theoretical aspects into a computer algorithm which can be used to simulate and analyze the closed die hot forging process. For the purpose of process study, the entire deformation process has been divided into a finite number of steps, and the output values have been computed at each deformation step. The results of the simulation have been graphically represented, and suitable corrective measures are also recommended if the simulation results do not agree with the theoretical values. This computer simulation approach would significantly improve the productivity and reduce the energy consumption of the overall process for components manufactured by the closed die forging process, and would contribute towards efforts to reduce global warming.
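
    The stepwise bookkeeping the abstract describes can be illustrated with a frictionless upsetting calculation (Python; material constants are invented and a power-law flow stress is assumed, not the paper's model): divide the total height reduction into steps and evaluate strain, flow stress and die force at each one:

        import numpy as np

        h0, d0 = 0.060, 0.040            # initial height and diameter (m)
        K, n_exp = 600.0e6, 0.15         # strength coefficient (Pa), hardening exponent
        V = np.pi * d0**2 / 4 * h0       # billet volume, conserved during forging

        for step in range(1, 11):        # ten equal height reductions, down to 50%
            h = h0 * (1.0 - 0.05 * step)
            strain = np.log(h0 / h)              # true compressive strain
            area = V / h                         # current die contact area
            sigma = K * strain**n_exp            # flow stress at this step
            force = sigma * area                 # ideal (frictionless) die force
            print(f"step {step:2d}: h = {h * 1000:5.1f} mm, force = {force / 1e6:5.2f} MN")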

  20. Using EDUCache Simulator for the Computer Architecture and Organization Course

    Directory of Open Access Journals (Sweden)

    Sasko Ristov

    2013-07-01

    Full Text Available The computer architecture and organization course is essential in all computer science and engineering programs, and is the most selected and liked elective course for related engineering disciplines. However, this attractiveness brings a new challenge: it requires a lot of effort by the instructor to explain rather complicated concepts to beginners or to those who study related disciplines. The usage of visual simulators can improve both the teaching and learning processes. The overall goal is twofold: 1) to provide a visual environment for explaining the basic concepts and 2) to increase the students' willingness and ability to learn the material. A lot of visual simulators have been used for the computer architecture and organization course. However, due to the lack of visual simulators for simulating cache memory concepts, we have developed a new visual simulator, the EDUCache simulator. In this paper we show that it can be used effectively and efficiently as a supporting tool in the learning process of modern multi-layer, multi-cache and multi-core multi-processors. EDUCache's features enable an environment for performance evaluation and engineering of software systems, i.e. the students will also understand the importance of computer architecture building parts and, hopefully, will increase their curiosity for hardware courses in general.
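
    The kind of model such a simulator animates fits in a few lines. The sketch below (Python; the sizes and the address trace are invented) is a direct-mapped cache that classifies each access in a trace as a hit or a miss, and already exhibits conflict misses between addresses that share an index:

        LINE_SIZE = 16                   # bytes per cache line
        NUM_LINES = 64                   # number of lines (direct-mapped)

        tags = [None] * NUM_LINES
        hits = misses = 0

        trace = [0x0000, 0x0004, 0x0040, 0x1000, 0x0000, 0x1004]  # byte addresses
        for addr in trace:
            block = addr // LINE_SIZE    # line-sized block number
            index = block % NUM_LINES    # which cache line it maps to
            tag = block // NUM_LINES     # what must match for a hit
            if tags[index] == tag:
                hits += 1
            else:
                misses += 1              # miss: fill (or replace) the line
                tags[index] = tag
        print(f"hits = {hits}, misses = {misses}")   # 1 hit, 5 misses here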